Science.gov

Sample records for radiation code cduct-larc

  1. FLYCHK Collisional-Radiative Code

    National Institute of Standards and Technology Data Gateway

    SRD 160 FLYCHK Collisional-Radiative Code (Web, free access)   FLYCHK provides a capability to generate atomic level populations and charge state distributions for low-Z to mid-Z elements under NLTE conditions.

  2. Testing Impact's Radiation Code

    SciTech Connect

    Edis, T; Cameron-Smith, P; Grant, K E; Bergmann, D; Chuang, C C

    2004-07-12

    This is a summary of work done over an 8-week period from May to July 2004, which concerned testing the longwave and shortwave radiation packages in Impact. The radiation code was initially developed primarily by Keith Grant in the context of LLNL's 2D model, and was added to Impact over the last few summers. While the radiation code had been tested and also used in some aerosol-related calculations, its 3D form in Impact had not been validated with comparisons to satellite data. Along with such comparisons, our work described here was also motivated by the need to validate the radiation code for use in the SciDAC consortium project. This involved getting the radiation code working with CAM/WACCM met data, and setting the stage for comparing CAM/WACCM radiation output with Impact results.

  3. MACRAD: A mass analysis code for radiators

    SciTech Connect

    Gallup, D.R.

    1988-01-01

    A computer code to estimate and optimize the mass of heat pipe radiators (MACRAD) is currently under development. A parametric approach is used in MACRAD, which allows the user to optimize radiator mass based on heat pipe length, length-to-diameter ratio, vapor-to-wick radius, radiator redundancy, etc. Full consideration of the heat pipe operating parameters, material properties, and shielding requirements is included in the code. Preliminary results obtained with MACRAD are discussed.
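The parametric sizing idea described above can be sketched from first principles: for a given heat-rejection load, the radiating area follows from the Stefan-Boltzmann law, and a first-order mass estimate scales that area by an areal density. This is an illustrative sketch only, not MACRAD itself; the function names, emissivity, and areal density are assumed values.

```python
# Minimal parametric radiator-sizing sketch (illustrative; MACRAD's actual
# models are not described in the abstract). All parameter values are assumed.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiator_area(q_reject, t_surface, t_sink, emissivity=0.85):
    """Radiating area (m^2) needed to reject q_reject watts."""
    return q_reject / (emissivity * SIGMA * (t_surface**4 - t_sink**4))

def radiator_mass(q_reject, t_surface, t_sink, areal_density=3.0):
    """First-order mass estimate: radiating area times areal density (kg/m^2)."""
    return radiator_area(q_reject, t_surface, t_sink) * areal_density

# Example: reject 100 kW at an 800 K surface temperature to a 250 K sink
area = radiator_area(100e3, 800.0, 250.0)   # ~5 m^2
mass = radiator_mass(100e3, 800.0, 250.0)
```

A real code like MACRAD then optimizes over the heat pipe geometry and redundancy parameters listed in the abstract; this sketch only shows the area/mass core of such a trade study.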

  4. TORUS: Radiation transport and hydrodynamics code

    NASA Astrophysics Data System (ADS)

    Harries, Tim

    2014-04-01

    TORUS is a flexible radiation transfer and radiation-hydrodynamics code. The code has a basic infrastructure that includes an AMR mesh scheme used by several physics modules, including atomic line transfer in a moving medium, molecular line transfer, photoionization, radiation hydrodynamics, and radiative equilibrium. TORUS is useful for a variety of problems, including magnetospheric accretion onto T Tauri stars, spiral nebulae around Wolf-Rayet stars, discs around Herbig AeBe stars, structured winds of O supergiants and Raman-scattered line formation in symbiotic binaries, and dust emission and molecular line formation in star forming clusters. The code is written in Fortran 2003 and is compiled using a standard GNU makefile. The code is parallelized using both MPI and OpenMP, and can use these parallel sections either separately or in a hybrid mode.

  5. Tests of Exoplanet Atmospheric Radiative Transfer Codes

    NASA Astrophysics Data System (ADS)

    Harrington, Joseph; Challener, Ryan; DeLarme, Emerson; Cubillos, Patricio; Blecic, Jasmina; Foster, Austin; Garland, Justin

    2016-10-01

    Atmospheric radiative transfer codes are used both to predict planetary spectra and in retrieval algorithms to interpret data. Observational plans, theoretical models, and scientific results thus depend on the correctness of these calculations. Yet, the calculations are complex and the codes implementing them are often written without modern software-verification techniques. In the process of writing our own code, we became aware of several others with artifacts of unknown origin and even outright errors in their spectra. We present a series of tests to verify atmospheric radiative-transfer codes. These include: simple, single-line line lists that, when combined with delta-function abundance profiles, should produce a broadened line that can be verified easily; isothermal atmospheres that should produce analytically-verifiable blackbody spectra at the input temperatures; and model atmospheres with a range of complexities that can be compared to the output of other codes. We apply the tests to our own code, Bayesian Atmospheric Radiative Transfer (BART) and to several other codes. The test suite is open-source software. We propose this test suite as a standard for verifying current and future radiative transfer codes, analogous to the Held-Suarez test for general circulation models. This work was supported by NASA Planetary Atmospheres grant NX12AI69G and NASA Astrophysics Data Analysis Program grant NNX13AF38G.
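The isothermal-atmosphere test described above is easy to reproduce in miniature: an optically thick isothermal slab must emit the Planck function at the input temperature, so the formal solution I = ∫ B e^(-τ) dτ can be checked analytically. The sketch below is a toy emission integral under that assumption, not BART; the quadrature settings are arbitrary.

```python
import numpy as np

# Physical constants (SI)
h, c, kB = 6.62607015e-34, 2.99792458e8, 1.380649e-23

def planck(wavelength, T):
    """Planck spectral radiance B_lambda(T) in W sr^-1 m^-3."""
    x = h * c / (wavelength * kB * T)
    return 2 * h * c**2 / wavelength**5 / np.expm1(x)

def emergent_intensity(wavelength, T, tau_max=50.0, n=2000):
    """Formal solution I = integral of B(T) e^{-tau} dtau for an isothermal
    slab: the source function is constant, so I -> B as tau_max grows."""
    tau = np.linspace(0.0, tau_max, n)
    integrand = planck(wavelength, T) * np.exp(-tau)
    # trapezoidal quadrature over optical depth
    return np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(tau))

wavelengths = np.linspace(0.5e-6, 20e-6, 50)
T = 1500.0
intensity = np.array([emergent_intensity(w, T) for w in wavelengths])
# intensity should match planck(wavelengths, T) to within quadrature error
```

A radiative-transfer code that fails this comparison at the input temperature has a bug in its source function or its optical-depth integration, which is exactly the kind of artifact the proposed test suite is meant to expose.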

  6. Airborne antenna radiation pattern code user's manual

    NASA Technical Reports Server (NTRS)

    Burnside, Walter D.; Kim, Jacob J.; Grandchamp, Brett; Rojas, Roberto G.; Law, Philip

    1985-01-01

    The use of a newly developed computer code to analyze the radiation patterns of antennas mounted on an ellipsoid and in the presence of a set of finite flat plates is described. It is shown how the code allows the user to simulate a wide variety of complex electromagnetic radiation problems using the ellipsoid/plates model. The code has the capability of calculating radiation patterns around an arbitrary conical cut specified by the user. The organization of the code, definition of input and output data, and numerous practical examples are also presented. The analysis is based on the Uniform Geometrical Theory of Diffraction (UTD), and most of the computed patterns are compared with experimental results to show the accuracy of this solution.

  7. NASA Space Radiation Transport Code Development Consortium.

    PubMed

    Townsend, Lawrence W

    2005-01-01

    Recently, NASA established a consortium involving the University of Tennessee (lead institution), the University of Houston, Roanoke College and various government and national laboratories, to accelerate the development of a standard set of radiation transport computer codes for NASA human exploration applications. This effort involves further improvements of the Monte Carlo codes HETC and FLUKA and the deterministic code HZETRN, including developing nuclear reaction databases necessary to extend the Monte Carlo codes to carry out heavy ion transport, and extending HZETRN to three dimensions. The improved codes will be validated by comparing predictions with measured laboratory transport data, provided by an experimental measurements consortium, and measurements in the upper atmosphere on the balloon-borne Deep Space Test Bed (DSTB). In this paper, we present an overview of the consortium members and the current status and future plans of consortium efforts to meet the research goals and objectives of this extensive undertaking.

  8. An integrated radiation physics computer code system.

    NASA Technical Reports Server (NTRS)

    Steyn, J. J.; Harris, D. W.

    1972-01-01

    An integrated computer code system for the semi-automatic and rapid analysis of experimental and analytic problems in gamma photon and fast neutron radiation physics is presented. Such problems as the design of optimum radiation shields and radioisotope power source configurations may be studied. The system codes allow for the unfolding of complex neutron and gamma photon experimental spectra. Monte Carlo and analytic techniques are used for the theoretical prediction of radiation transport. The system includes a multichannel pulse-height analyzer scintillation and semiconductor spectrometer coupled to an on-line digital computer with appropriate peripheral equipment. The system is geometry generalized as well as self-contained with respect to material nuclear cross sections and the determination of the spectrometer response functions. Input data may be either analytic or experimental.

  9. The Helicopter Antenna Radiation Prediction Code (HARP)

    NASA Technical Reports Server (NTRS)

    Klevenow, F. T.; Lynch, B. G.; Newman, E. H.; Rojas, R. G.; Scheick, J. T.; Shamansky, H. T.; Sze, K. Y.

    1990-01-01

    The first nine months' effort in the development of a user-oriented computer code, referred to as the HARP code, for analyzing the radiation from helicopter antennas is described. The HARP code uses modern computer graphics to aid in the description and display of the helicopter geometry. At low frequencies the helicopter is modeled by polygonal plates, and the method of moments is used to compute the desired patterns. At high frequencies the helicopter is modeled by a composite ellipsoid and flat plates, and computations are made using the geometrical theory of diffraction. The HARP code will provide a user-friendly interface, employing modern computer graphics, to help the user describe the helicopter geometry, select the method of computation, construct the desired high- or low-frequency model, and display the results.

  10. Validation of comprehensive space radiation transport code

    SciTech Connect

    Shinn, J.L.; Simonsen, L.C.; Cucinotta, F.A.

    1998-12-01

    The HZETRN code has been developed over the past decade to evaluate the local radiation fields within sensitive materials on spacecraft in the space environment. Most of the more important nuclear and atomic processes are now modeled, and evaluation within a complex spacecraft geometry with differing material components, including transition effects across boundaries of dissimilar materials, is included. The atomic/nuclear database and transport procedures have received limited validation in laboratory testing with high energy ion beams. The codes have been applied in design of the SAGE-III instrument, resulting in material changes to control injurious neutron production; in the study of Space Shuttle single event upsets; and in validation with space measurements (particle telescopes, tissue equivalent proportional counters, CR-39) on Shuttle and Mir. The present paper reviews the code development and presents recent results in laboratory and space flight validation.

  11. THE MCNPX MONTE CARLO RADIATION TRANSPORT CODE

    SciTech Connect

    WATERS, LAURIE S.; MCKINNEY, GREGG W.; DURKEE, JOE W.; FENSIN, MICHAEL L.; JAMES, MICHAEL R.; JOHNS, RUSSELL C.; PELOWITZ, DENISE B.

    2007-01-10

    MCNPX (Monte Carlo N-Particle eXtended) is a general-purpose Monte Carlo radiation transport code with three-dimensional geometry and continuous-energy transport of 34 particles and light ions. It contains flexible source and tally options, interactive graphics, and support for both sequential and multi-processing computer platforms. MCNPX is based on MCNP4B, and has been upgraded to most MCNP5 capabilities. MCNP is a highly stable code that tracks neutrons, photons, and electrons, using evaluated nuclear data libraries for low-energy interaction probabilities. MCNPX has extended this base to a comprehensive set of particles and light ions, with heavy ion transport in development. Models have been included to calculate interaction probabilities when libraries are not available. Recent additions focus on the time evolution of residual nuclei decay, allowing calculation of transmutation and delayed particle emission. MCNPX is now a code of great dynamic range, and its excellent neutronics capabilities open new opportunities to simulate devices of interest to experimental particle physics, particularly calorimetry. This paper describes the capabilities of the current MCNPX version 2.6.C and discusses ongoing code development.

  12. LPGS. Code System for Calculating Radiation Exposure

    SciTech Connect

    White, J.E.; Eckerman, K.F.

    1983-01-01

    LPGS was developed to calculate the radiological impacts resulting from radioactive releases to the hydrosphere. The name LPGS was derived from the Liquid Pathway Generic Study for which the original code was used primarily as an analytic tool in the assessment process. The hydrosphere is represented by the following types of water bodies: estuary, small river, well, lake, and one-dimensional (1-d) river. LPGS is designed to calculate radiation dose (individual and population) to body organs as a function of time for the various exposure pathways. The radiological consequences to the aquatic biota are estimated. Several simplified radionuclide transport models are employed with built-in formulations to describe the release rate of the radionuclides. A tabulated user-supplied release model can be input, if desired. Printer plots of dose versus time for the various exposure pathways are provided.

  13. Advances in space radiation shielding codes

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Tripathi, Ram K.; Qualls, Garry D.; Cucinotta, Francis A.; Prael, Richard E.; Norbury, John W.; Heinbockel, John H.; Tweed, John; De Angelis, Giovanni

    2002-01-01

    Early space radiation shield code development relied on Monte Carlo methods and made important contributions to the space program. Monte Carlo methods were, however, restricted to one-dimensional problems, leading to imperfect representation of appropriate boundary conditions. Even so, intensive computational requirements resulted, and shield evaluation was made near the end of the design process. Resolving shielding issues usually had a negative impact on the design. Improved spacecraft shield design requires early entry of radiation constraints into the design process to maximize performance and minimize costs. As a result, we have been investigating high-speed computational procedures to allow shield analysis from the preliminary concept to the final design. For the last few decades, we have pursued deterministic solutions of the Boltzmann equation, allowing field mapping within the International Space Station (ISS) in tens of minutes using standard Finite Element Method (FEM) geometry common to engineering design methods. A single ray trace in such geometry requires 14 milliseconds, which limits the application of Monte Carlo methods to such engineering models. A potential means of improving Monte Carlo efficiency in coupling to spacecraft geometry is given.

  14. Space Radiation Transport Code Development: 3DHZETRN

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Slaba, Tony C.; Badavi, Francis F.; Reddell, Brandon D.; Bahadori, Amir A.

    2015-01-01

    The space radiation transport code, HZETRN, has been used extensively for research, vehicle design optimization, risk analysis, and related applications. One of the simplifying features of the HZETRN transport formalism is the straight-ahead approximation, wherein all particles are assumed to travel along a common axis. This reduces the governing equation to one spatial dimension allowing enormous simplification and highly efficient computational procedures to be implemented. Despite the physical simplifications, the HZETRN code is widely used for space applications and has been found to agree well with fully 3D Monte Carlo simulations in many circumstances. Recent work has focused on the development of 3D transport corrections for neutrons and light ions (Z < 2) for which the straight-ahead approximation is known to be less accurate. Within the development of 3D corrections, well-defined convergence criteria have been considered, allowing approximation errors at each stage in model development to be quantified. The present level of development assumes the neutron cross sections have an isotropic component treated within N explicit angular directions and a forward component represented by the straight-ahead approximation. The N = 1 solution refers to the straight-ahead treatment, while N = 2 represents the bi-directional model in current use for engineering design. The figure below shows neutrons, protons, and alphas for various values of N at locations in an aluminum sphere exposed to a solar particle event (SPE) spectrum. The neutron fluence converges quickly in simple geometry with N > 14 directions. The improved code, 3DHZETRN, transports neutrons, light ions, and heavy ions under space-like boundary conditions through general geometry while maintaining a high degree of computational efficiency. A brief overview of the 3D transport formalism for neutrons and light ions is given, and extensive benchmarking results with the Monte Carlo codes Geant4, FLUKA, and
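The role of the N explicit angular directions can be illustrated with a toy angular quadrature. This is not the 3DHZETRN formalism; the forward-peaked distribution and the Gauss-Legendre rule below are assumptions for illustration. The point is only that an angular average converges quickly as the number of explicit directions grows, loosely analogous to the reported convergence beyond N > 14 directions.

```python
import numpy as np

def angular_average(f, N):
    """Approximate (1/2) * integral_{-1}^{1} f(mu) d(mu) with an N-point
    Gauss-Legendre rule, i.e., N explicit angular directions."""
    mu, w = np.polynomial.legendre.leggauss(N)
    return 0.5 * np.sum(w * f(mu))

# A forward-peaked toy angular flux (mu = 1 is the straight-ahead direction)
f = lambda mu: np.exp(-2.0 * (1.0 - mu))
exact = (1.0 - np.exp(-4.0)) / 4.0  # analytic angular average of f

# N = 1 mimics a single-direction treatment; larger N adds explicit directions
errors = [abs(angular_average(f, N) - exact) for N in (1, 2, 4, 8)]
```

The error drops by orders of magnitude from N = 1 to N = 8 for this smooth toy distribution; real transport convergence depends on the cross-section anisotropy and geometry, as the abstract notes.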

  15. Description of Transport Codes for Space Radiation Shielding

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Wilson, John W.; Cucinotta, Francis A.

    2011-01-01

    This slide presentation describes transport codes and their use for studying and designing space radiation shielding. When combined with risk projection models, radiation transport codes serve as the main tool for studying radiation and designing shielding. There are three criteria for assessing the accuracy of transport codes: (1) ground-based studies with defined beams and material layouts, (2) inter-comparison of transport code results for matched boundary conditions, and (3) comparisons to flight measurements. NASA's HZETRN/QMSFRG meets these three criteria to a very high degree.

  16. Radiation flux tables for ICRCCM using the GLA GCM radiation codes

    NASA Technical Reports Server (NTRS)

    HARSHVARDHAN

    1986-01-01

    Tabulated values of longwave and shortwave radiation fluxes, as well as cooling and heating rates, in the atmosphere for standard atmospheric profiles are presented. The radiation codes used in the Goddard general circulation model were employed for the computations. These results were obtained for an international intercomparison project called the Intercomparison of Radiation Codes in Climate Models (ICRCCM).

  17. Recent developments in the Los Alamos radiation transport code system

    SciTech Connect

    Forster, R.A.; Parsons, K.

    1997-06-01

    A brief progress report on updates to the Los Alamos Radiation Transport Code System (LARTCS) for solving criticality and fixed-source problems is provided. LARTCS integrates the Diffusion Accelerated Neutral Transport (DANT) discrete ordinates codes with the Monte Carlo N-Particle (MCNP) code. The LARTCS code is being developed with a graphical user interface for problem setup and analysis. Progress in the DANT system for criticality applications includes a two-dimensional module which can be linked to a mesh-generation code, and a faster iteration scheme. Updates to MCNP Version 4A allow statistical checks of calculated Monte Carlo results.

  18. Overview of HZETRN and BRNTRN Space Radiation Shielding Codes

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Cucinotta, F. A.; Shinn, J. L.; Simonsen, L. C.; Badavi, F. F.

    1997-01-01

    The NASA Radiation Health Program has supported basic research in radiation physics over the last decade to develop ionizing radiation transport codes and corresponding databases for the protection of astronauts from galactic and solar cosmic rays on future deep space missions. The codes describe the interactions of the incident radiations with shield materials, whose content is modified by atomic and nuclear reactions: high-energy heavy ions are fragmented into less massive reaction products, and reaction products are produced by direct knockout of shield constituents or as de-excitation products of the reactions. This defines the radiation fields to which specific devices are subjected onboard a spacecraft. Similar reactions occur in the device itself, and these are the initiating events for the device response. An overview of the computational procedures and database, with some applications to photonic and data processing devices, will be given.

  19. The Continual Intercomparison of Radiation Codes: Results from Phase I

    NASA Technical Reports Server (NTRS)

    Oreopoulos, Lazaros; Mlawer, Eli; Delamere, Jennifer; Shippert, Timothy; Cole, Jason; Iacono, Michael; Jin, Zhonghai; Li, Jiangnan; Manners, James; Raisanen, Petri; Rose, Fred; Zhang, Yuanchong; Wilson Michael J.; Rossow, William

    2011-01-01

    The computer codes that calculate the energy budget of solar and thermal radiation in Global Climate Models (GCMs), our most advanced tools for predicting climate change, have to be computationally efficient in order not to impose undue computational burden on climate simulations. By using approximations to gain execution speed, these codes sacrifice accuracy compared to more accurate, but also much slower, alternatives. International efforts to evaluate the approximate schemes have taken place in the past, but they have suffered from the drawback that the accurate standards were not themselves validated for performance. This manuscript summarizes the main results of the first phase of an effort called the "Continual Intercomparison of Radiation Codes" (CIRC), in which the cases chosen to evaluate the approximate models are based on observations and the accurate models have been verified to perform well when compared to solar and thermal radiation measurements. The effort is endorsed by international organizations such as the GEWEX Radiation Panel and the International Radiation Commission and has a dedicated website (http://circ.gsfc.nasa.gov) where interested scientists can freely download data and obtain more information about the effort's modus operandi and objectives. A paper published in the March 2010 issue of the Bulletin of the American Meteorological Society provided only a brief overview of CIRC with some sample results; in this paper, the analysis of submissions of 11 solar and 13 thermal infrared codes relative to accurate reference calculations obtained with so-called "line-by-line" radiation codes is much more detailed. We demonstrate that, while the performance of the approximate codes continues to improve, significant issues remain to be addressed before their performance within GCMs is satisfactory. We hope that by identifying and quantifying shortcomings, the paper will help establish performance standards for objectively assessing radiation code quality.
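The kind of scoring such an intercomparison performs can be sketched as a comparison of broadband fluxes from approximate codes against line-by-line reference calculations. The flux values and code names below are invented purely for illustration; CIRC's actual cases and metrics are described on its website.

```python
import numpy as np

# Hypothetical line-by-line (LBL) reference broadband fluxes for four cases
lbl = np.array([345.2, 288.9, 412.7, 301.5])  # W/m^2

# Hypothetical submissions from two approximate radiation codes
approx = {
    "code_A": np.array([344.1, 290.2, 410.9, 303.0]),
    "code_B": np.array([349.8, 284.0, 418.3, 296.1]),
}

def scores(submission, reference):
    """Mean bias and RMSE of a submission against the reference fluxes."""
    err = submission - reference
    return {"bias": float(err.mean()),
            "rmse": float(np.sqrt((err ** 2).mean()))}

results = {name: scores(sub, lbl) for name, sub in approx.items()}
```

Note that bias and RMSE answer different questions: two codes can share the same small mean bias while differing greatly in case-by-case scatter, which is why intercomparisons typically report both.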

  20. Stratospheric Relaxation in IMPACT's Radiation Code

    SciTech Connect

    Edis, T; Grant, K; Cameron-Smith, P

    2006-11-13

    While Impact incorporates diagnostic radiation routines from our work in previous years, it has not previously included the stratospheric relaxation required for forcing calculations. We have now implemented the necessary changes for stratospheric relaxation, tested its stability, and compared the results with stratospheric temperatures obtained from CAM3 met data. The relaxation results in stable temperature profiles in the stratosphere, which is encouraging for use in forcing calculations. It does, however, produce a cooling bias when compared to CAM3, which appears to be due to differences in radiation calculations rather than the interactive treatment of ozone. The cause of this bias is as yet unclear, but it seems to be systematic and hence cancels out when differences are taken relative to a control simulation.
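Stratospheric relaxation of this general kind is commonly implemented as Newtonian relaxation of temperature toward an equilibrium profile. The sketch below shows one explicit time step of that scheme; it is a generic illustration with assumed values (timescale, temperatures, step size), not the actual Impact implementation.

```python
# Newtonian relaxation sketch: dT/dt = -(T - T_eq) / tau, stepped explicitly.
# All numerical values are assumed for illustration.
def relax_temperature(T, T_eq, dt, tau):
    """One explicit Euler step of Newtonian temperature relaxation."""
    return T + dt * (T_eq - T) / tau

T = 230.0             # current stratospheric temperature, K
T_eq = 225.0          # radiative-equilibrium target temperature, K
tau = 40 * 86400.0    # relaxation timescale, ~40 days in seconds
dt = 1800.0           # 30-minute model time step

for _ in range(48):   # integrate one model day
    T = relax_temperature(T, T_eq, dt, tau)
# T decays exponentially toward T_eq with e-folding time tau
```

Because the scheme only damps departures from the target profile, it yields the stable stratospheric temperatures the abstract reports; any bias then reflects the radiation calculation feeding the equilibrium state, not the relaxation itself.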

  1. Acceleration of a Monte Carlo radiation transport code

    SciTech Connect

    Hochstedler, R.D.; Smith, L.M.

    1996-03-01

    Execution time for the Integrated TIGER Series (ITS) Monte Carlo radiation transport code has been reduced by careful re-coding of computationally intensive subroutines. Three test cases for the TIGER (1-D slab geometry), CYLTRAN (2-D cylindrical geometry), and ACCEPT (3-D arbitrary geometry) codes were identified and used to benchmark and profile program execution. Based upon these results, sixteen top time-consuming subroutines were examined and nine of them modified to accelerate computations with equivalent numerical output to the original. The results obtained via this study indicate that speedup factors of 1.90 for the TIGER code, 1.67 for the CYLTRAN code, and 1.11 for the ACCEPT code are achievable. © 1996 American Institute of Physics.
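The profile-then-recode workflow the abstract describes can be illustrated with Python's built-in profiler: benchmark a representative case, rank routines by cumulative time, and target the top consumers for optimization. The workload functions below are stand-ins, not ITS subroutines.

```python
import cProfile
import io
import pstats

# Stand-in workload: one deliberately expensive routine and one cheap one,
# playing the roles of the "top time-consuming subroutines" in the abstract.
def hot_kernel(n):
    return sum(i * i for i in range(n))

def cold_path(n):
    return sum(range(n))

def benchmark():
    for _ in range(50):
        hot_kernel(20000)
    cold_path(20000)

prof = cProfile.Profile()
prof.enable()
benchmark()
prof.disable()

buf = io.StringIO()
pstats.Stats(prof, stream=buf).sort_stats("cumulative").print_stats(10)
report = buf.getvalue()  # ranked list of time consumers, ready for triage
```

After the hot routines are rewritten, re-running the same benchmark gives the speedup factor directly as old time divided by new time, which is how figures like the 1.90x TIGER result are obtained.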

  2. A Radiation Shielding Code for Spacecraft and Its Validation

    NASA Technical Reports Server (NTRS)

    Shinn, J. L.; Cucinotta, F. A.; Singleterry, R. C.; Wilson, J. W.; Badavi, F. F.; Badhwar, G. D.; Miller, J.; Zeitlin, C.; Heilbronn, L.; Tripathi, R. K.

    2000-01-01

    The HZETRN code, which uses a deterministic approach pioneered at NASA Langley Research Center, has been developed over the past decade to evaluate the local radiation fields within sensitive materials (electronic devices and human tissue) on spacecraft in the space environment. The code describes the interactions of shield materials with the incident galactic cosmic rays, trapped protons, or energetic protons from solar particle events in free space and low Earth orbit. The content of incident radiations is modified by atomic and nuclear reactions with the spacecraft and radiation shield materials. High-energy heavy ions are fragmented into less massive reaction products, and reaction products are produced by direct knockout of shield constituents or from de-excitation products. An overview of the computational procedures and database which describe these interactions is given. Validation of the code with recent Monte Carlo benchmarks, and laboratory and flight measurements, is also included.

  3. Comparison of Space Radiation Calculations from Deterministic and Monte Carlo Transport Codes

    NASA Technical Reports Server (NTRS)

    Adams, J. H.; Lin, Z. W.; Nasser, A. F.; Randeniya, S.; Tripathi, r. K.; Watts, J. W.; Yepes, P.

    2010-01-01

    The presentation outline includes motivation, the radiation transport codes being considered (HZETRN, UPROP, FLUKA, and GEANT4), the space radiation cases being considered (solar particle events and galactic cosmic rays), results for slab geometry, results for spherical geometry, and a summary.

  4. Code for Analyzing and Designing Spacecraft Power System Radiators

    NASA Technical Reports Server (NTRS)

    Juhasz, Albert

    2005-01-01

    GPHRAD is a computer code for analysis and design of disk or circular-sector heat-rejecting radiators for spacecraft power systems. A specific application is for Stirling-cycle/linear-alternator electric-power systems coupled to radioisotope general-purpose heat sources. GPHRAD affords capabilities and options to account for thermophysical properties (thermal conductivity, density) of either metal-alloy or composite radiator materials.

  5. Prototype demonstration of radiation therapy planning code system

    SciTech Connect

    Little, R.C.; Adams, K.J.; Estes, G.P.; Hughes, L.S. III; Waters, L.S.

    1996-09-01

    This is the final report of a one-year, Laboratory-Directed Research and Development project at the Los Alamos National Laboratory (LANL). Radiation therapy planning is the process by which a radiation oncologist plans a treatment protocol for a patient preparing to undergo radiation therapy. The objective is to develop a protocol that delivers sufficient radiation dose to the entire tumor volume, while minimizing dose to healthy tissue. Radiation therapy planning, as currently practiced in the field, suffers from inaccuracies made in modeling patient anatomy and radiation transport. This project investigated the ability to automatically model patient-specific, three-dimensional (3-D) geometries in advanced Los Alamos radiation transport codes (such as MCNP), and to efficiently generate accurate radiation dose profiles in these geometries via sophisticated physics modeling. Modern scientific visualization techniques were utilized. The long-term goal is that such a system could be used by a non-expert in a distributed computing environment to help plan the treatment protocol for any candidate radiation source. The improved accuracy offered by such a system promises increased efficacy and reduced costs for this important aspect of health care.

  6. NERO- a post-maximum supernova radiation transport code

    NASA Astrophysics Data System (ADS)

    Maurer, I.; Jerkstrand, A.; Mazzali, P. A.; Taubenberger, S.; Hachinger, S.; Kromer, M.; Sim, S.; Hillebrandt, W.

    2011-12-01

    The interpretation of supernova (SN) spectra is essential for deriving SN ejecta properties such as density and composition, which in turn can tell us about their progenitors and the explosion mechanism. A very large number of atomic processes are important for spectrum formation. Several tools for calculating SN spectra exist, but they mainly focus on the very early or late epochs. The intermediate phase, which requires a non-local thermodynamic equilibrium (NLTE) treatment of radiation transport, has rarely been studied. In this paper, we present a new SN radiation transport code, NERO, which can handle those epochs. All the atomic processes are treated in full NLTE, under a steady-state assumption. This is a valid approach between roughly 50 and 500 days after the explosion, depending on SN type. This covers the post-maximum photospheric phase and the early and intermediate nebular phases. As a test, we compare NERO to the radiation transport code of Jerkstrand, Fransson & Kozma and to the nebular code of Mazzali et al. All three codes have been developed independently, and a comparison provides a valuable opportunity to investigate their reliability. Currently, NERO is one-dimensional and can be used for predicting spectra of synthetic explosion models or for deriving SN properties by spectral modelling. To demonstrate this, we study the spectra of the 'normal' Type Ia supernova (SN Ia) 2005cf between 50 and 350 days after the explosion and identify most of the common SN Ia line features at post-maximum epochs.
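The steady-state NLTE assumption reduces, level by level, to a linear problem: the net rate into every atomic level is zero, closed by a normalization on the total population. A minimal three-level sketch with invented rates (not NERO's atomic data or its full process set) looks like this:

```python
import numpy as np

# Toy steady-state level-population solve. rates[i, j] is the total transition
# rate from level j into level i (s^-1); the values are invented for
# illustration and stand in for the many NLTE processes a real code sums.
rates = np.array([
    [0.0, 2.0, 0.5],   # into level 0 from levels 1 and 2
    [1.0, 0.0, 1.5],   # into level 1
    [0.3, 0.4, 0.0],   # into level 2
])

# Build the rate matrix R for the balance equations R @ n = 0:
# the diagonal holds minus the total rate out of each level.
R = rates.copy()
np.fill_diagonal(R, -rates.sum(axis=0))

# R is singular (its columns sum to zero), so replace one balance equation
# with the normalization condition sum(n) = 1 to close the system.
A = R.copy()
A[-1, :] = 1.0
b = np.zeros(3)
b[-1] = 1.0
n = np.linalg.solve(A, b)  # steady-state level populations, summing to 1
```

A full code solves this kind of system for many levels and species in every zone, with the rates themselves depending on the radiation field, so the linear solve sits inside an outer iteration.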

  7. A Radiation Solver for the National Combustion Code

    NASA Technical Reports Server (NTRS)

    Sockol, Peter M.

    2015-01-01

    A methodology is given that converts an existing finite volume radiative transfer method that requires input of local absorption coefficients to one that can treat a mixture of combustion gases and compute the coefficients on the fly from the local mixture properties. The full-spectrum k-distribution method is used to transform the radiative transfer equation (RTE) to an alternate wavenumber variable, g. The coefficients in the transformed equation are calculated at discrete temperatures and participating species mole fractions that span the values of the problem for each value of g. These results are stored in a table, and interpolation is used to find the coefficients at every cell in the field. Finally, the transformed RTE is solved for each g, and Gaussian quadrature is used to find the radiant heat flux throughout the field. The present implementation is in an existing Cartesian/cylindrical grid radiative transfer code, and the local mixture properties are given by a solution of the National Combustion Code (NCC) on the same grid. Based on this work, the intention is to apply this method to an existing unstructured grid radiation code which can then be coupled directly to NCC.
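The final quadrature step described above can be sketched directly: once the RTE is transformed to the g variable, the total flux is a weighted sum of per-g solutions over g in [0, 1]. In this sketch, flux_of_g is a hypothetical smooth stand-in for the per-g RTE solve, and the 8-point rule is an arbitrary choice.

```python
import numpy as np

def flux_of_g(g):
    """Hypothetical per-g flux profile (W/m^2), standing in for the solution
    of the transformed RTE at one value of g."""
    return 1000.0 * (1.0 - np.exp(-5.0 * g))

# Gauss-Legendre nodes/weights on [-1, 1], mapped to the g interval [0, 1]
nodes, weights = np.polynomial.legendre.leggauss(8)
g = 0.5 * (nodes + 1.0)
w = 0.5 * weights

# Total radiant flux: Gaussian-quadrature sum of the per-g solutions
total_flux = float(np.sum(w * flux_of_g(g)))
```

The appeal of the k-distribution approach is visible here: a handful of per-g solves replaces a line-by-line integration over thousands of spectral points, because the reordered absorption spectrum is smooth in g.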

  8. A more accurate nonequilibrium air radiation code - NEQAIR second generation

    NASA Technical Reports Server (NTRS)

    Moreau, Stephane; Laux, Christophe O.; Chapman, Dean R.; Maccormack, Robert W.

    1992-01-01

    Two experiments, one an equilibrium flow in a plasma torch at Stanford, the other a nonequilibrium flow in an SDIO/IST Bow-Shock-Ultra-Violet missile flight, have provided the basis for modifying, enhancing, and testing the well-known radiation code, NEQAIR. The original code, herein termed NEQAIR1, lacked computational efficiency, accurate data for some species, and the flexibility to handle a variety of species. The modified code, herein termed NEQAIR2, incorporates recent findings in the spectroscopic and radiation models. It can handle any number of species and radiative bands in a gas whose thermodynamic state can be described by up to four temperatures. It provides a new capability of computing very fine spectra in a reasonable CPU time, while including transport phenomena along the line of sight and the characteristics of instruments that were used in the measurements. Such a new tool should allow more accurate testing and diagnosis of the different physical models used in numerical simulations of radiating, low density, high energy flows.

  9. Towards a 3D Space Radiation Transport Code

    NASA Technical Reports Server (NTRS)

    Wilson, J. W.; Tripathi, R. K.; Cucinotta, F. A.; Heinbockel, J. H.; Tweed, J.

    2002-01-01

    High-speed computational procedures for space radiation shielding have relied on asymptotic expansions in terms of the off-axis scatter and on replacement of the general geometry problem by a collection of flat plates. This type of solution was derived for application to human-rated systems, in which the radius of the shielded volume is large compared to the off-axis diffusion that limits leakage at lateral boundaries. Over the decades these computational codes have become relatively complete, and lateral diffusion effects are now being added. The analysis for developing a practical full 3D space shielding code is presented.

  10. Validation of a comprehensive space radiation transport code.

    PubMed

    Shinn, J L; Cucinotta, F A; Simonsen, L C; Wilson, J W; Badavi, F F; Badhwar, G D; Miller, J; Zeitlin, C; Heilbronn, L; Tripathi, R K; Clowdsley, M S; Heinbockel, J H; Xapsos, M A

    1998-12-01

    The HZETRN code has been developed over the past decade to evaluate the local radiation fields within sensitive materials on spacecraft in the space environment. Most of the more important nuclear and atomic processes are now modeled, and evaluation within a complex spacecraft geometry with differing material components, including transition effects across boundaries of dissimilar materials, is included. The atomic/nuclear database and transport procedures have received limited validation in laboratory testing with high-energy ion beams. The codes have been applied in the design of the SAGE-III instrument, resulting in material changes to control injurious neutron production; in the study of Space Shuttle single-event upsets; and in validation against space measurements (particle telescopes, tissue-equivalent proportional counters, CR-39) on Shuttle and Mir. The present paper reviews the code development and presents recent results in laboratory and space flight validation.

  11. New Parallel computing framework for radiation transport codes

    SciTech Connect

    Kostin, M.A.; Mokhov, N.V.; Niita, K.

    2010-09-01

    A new parallel computing framework has been developed for use with general-purpose radiation transport codes. The framework was implemented as a C++ module that uses MPI for message passing. The module is largely independent of the radiation transport codes it is used with, and is connected to them by means of a number of interface functions. The framework has been integrated with the MARS15 code, and an effort is under way to deploy it in PHITS. Besides the parallel computing functionality, the framework offers a checkpoint facility that allows calculations to be restarted from a saved checkpoint file. The checkpoint facility can be used in single-process calculations as well as in the parallel regime. Several checkpoint files can be merged into one, thus combining the results of several calculations. The framework also corrects some of the known scheduling and load-balancing problems found in the original implementations of the parallel computing functionality in MARS15 and PHITS. The framework can be used efficiently on homogeneous systems and on networks of workstations, where interference from other users is possible.
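For a simple mean-type tally, the checkpoint-merging idea described above reduces to summing history counts and running sums so that the merged file behaves like one longer run. A minimal sketch with an assumed tally layout, not the framework's actual file format:

```python
import math

# Each "checkpoint" here is a dict carrying the history count and the
# running sums needed to reconstruct a tally mean and its statistical
# error. The field names are assumptions for illustration.
def merge_checkpoints(checkpoints):
    """Combine per-run tallies into one, as if from a single longer run."""
    total = {"n": 0, "sum": 0.0, "sum_sq": 0.0}
    for cp in checkpoints:
        total["n"] += cp["n"]
        total["sum"] += cp["sum"]
        total["sum_sq"] += cp["sum_sq"]
    return total

def mean_and_stderr(t):
    """Sample mean and standard error of the mean from the merged sums."""
    mean = t["sum"] / t["n"]
    var = max(t["sum_sq"] / t["n"] - mean * mean, 0.0)
    return mean, math.sqrt(var / t["n"])
```

Because the merge is just addition of sufficient statistics, it is associative: checkpoints can be combined in any order, which is what makes merging several independent runs into one result safe.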

  12. A model code for the radiative theta pinch

    SciTech Connect

    Lee, S.; Saw, S. H.; Lee, P. C. K.; Akel, M.; Damideh, V.; Khattak, N. A. D.; Mongkolnavin, R.; Paosawatyanyong, B.

    2014-07-15

    A model for the theta pinch is presented with three modelled phases: a radial inward shock phase, a reflected shock phase, and a final pinch phase. The governing equations for the phases are derived incorporating thermodynamics, radiation, and radiation-coupled dynamics in the pinch phase. A code is written incorporating corrections for the effects of transit delay of small disturbing speeds and for the effects of plasma self-absorption on the radiation. Two model parameters are incorporated into the model: the coupling coefficient f between the primary loop current and the induced plasma current, and the mass swept-up factor f_m. These values are taken from experiments carried out in the Chulalongkorn theta pinch.

  13. Evaluation of coded aperture radiation detectors using a Bayesian approach

    NASA Astrophysics Data System (ADS)

    Miller, Kyle; Huggins, Peter; Labov, Simon; Nelson, Karl; Dubrawski, Artur

    2016-12-01

    We investigate tradeoffs arising from the use of coded-aperture gamma-ray spectrometry to detect and localize sources of harmful radiation in the presence of noisy background. Using an example application scenario of area monitoring and search, we empirically evaluate weakly supervised spectral, spatial, and hybrid spatio-spectral algorithms for scoring individual observations, and two alternative methods of fusing evidence obtained from multiple observations. Results of our experiments confirm the intuition that directional information provided by spectrometers masked with a coded aperture enables gains in source localization accuracy, but at the expense of reduced probability of detection. Losses in detection performance can, however, be reclaimed to a substantial extent by using our new spatial and spatio-spectral scoring methods, which rely on realistic assumptions regarding masking and its impact on measured photon distributions.
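The directional information a coded aperture provides comes from correlation decoding of the mask's shadow on the detector. A toy one-dimensional, noise-free sketch (the mask pattern and cyclic geometry are illustrative assumptions, not the instrument in the abstract):

```python
# Illustrative 1-D coded-aperture decode: the detector records the source
# pattern shifted by the mask; cross-correlating the counts with the mask
# recovers the source position as the correlation peak.
N = 7
MASK = [1 if i in (1, 2, 4) else 0 for i in range(N)]  # URA-like pattern

def shadowgram(source_pos):
    """Counts seen by detector pixel i from a point source (no noise)."""
    return [MASK[(i + source_pos) % N] for i in range(N)]

def decode(counts):
    """Cross-correlate counts with the mask; the peak marks the source."""
    corr = [sum(counts[i] * MASK[(i + j) % N] for i in range(N))
            for j in range(N)]
    return max(range(N), key=corr.__getitem__)
```

The mask positions {1, 2, 4} mod 7 form a difference set, so the cyclic autocorrelation has a single sharp peak and the decode is unambiguous; with noise added, the same correlation step is what trades detection sensitivity for localization, as the abstract discusses.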

  14. Recent radiation damage studies and developments of the Marlowe code

    NASA Astrophysics Data System (ADS)

    Ortiz, C. J.; Souidi, A.; Becquart, C. S.; Domain, C.; Hou, M.

    2014-07-01

    Radiation damage in materials relevant to applications evolves over time scales spanning from the femtosecond (the characteristic time for an atomic collision) to decades (the aging time expected for nuclear materials). The relevant kinetic energies of atoms span from thermal motion to the MeV range. The question motivating this contribution is to identify the relationship between elementary atomic displacements triggered by irradiation and the subsequent microstructural evolution of metals in the long term. The Marlowe code, based on the binary collision approximation (BCA), is used to simulate the sequences of atomic displacements generated by energetic primary recoils, and the Object Kinetic Monte Carlo code LAKIMOCA, parameterized on a range of ab initio calculations, is used to predict the subsequent long-term evolution of point defects and clusters thereof. In agreement with full Molecular Dynamics, BCA displacement cascades in body-centered cubic (BCC) Fe and a face-centered cubic (FCC) Fe-Ni-Cr alloy display recursive properties that are found useful for predictions in the long term. The case of defect evolution in W due to external irradiation with energetic H and He is also discussed. To this purpose, it was useful to extend the inelastic energy loss model available in Marlowe up to the Bethe regime. The last version of the Marlowe code (version 15) was delivered before message-passing software libraries (such as MPI) were available, but the structure of the code was designed in such a way as to permit parallel execution within a distributed-memory environment. This makes it possible to obtain N different cascades simultaneously using N independent nodes without any communication between processors. The parallelization of the code using MPI was recently achieved by one author of this report (C.J.O.). Typically, the parallelized version of Marlowe allows simulating millions of displacement cascades using a limited number of processors (<64) within only

  15. 3D unstructured-mesh radiation transport codes

    SciTech Connect

    Morel, J.

    1997-12-31

    Three unstructured-mesh radiation transport codes are currently being developed at Los Alamos National Laboratory. The first code is ATTILA, which uses an unstructured tetrahedral mesh in conjunction with standard $S_n$ (discrete-ordinates) angular discretization, standard multigroup energy discretization, and linear-discontinuous spatial differencing. ATTILA solves the standard first-order form of the transport equation using source iteration in conjunction with diffusion-synthetic acceleration of the within-group source iterations. ATTILA is designed to run primarily on workstations. The second code is DANTE, which uses a hybrid finite-element mesh consisting of arbitrary combinations of hexahedra, wedges, pyramids, and tetrahedra. DANTE solves several second-order self-adjoint forms of the transport equation, including the even-parity equation, the odd-parity equation, and a new equation called the self-adjoint angular flux equation. DANTE also offers three angular discretization options: $S_n$ (discrete-ordinates), $P_n$ (spherical harmonics), and $SP_n$ (simplified spherical harmonics). DANTE is designed to run primarily on massively parallel message-passing machines, such as the ASCI-Blue machines at LANL and LLNL. The third code is PERICLES, which uses the same hybrid finite-element mesh as DANTE, but solves the standard first-order form of the transport equation rather than a second-order self-adjoint form. PERICLES uses a standard $S_n$ discretization in angle in conjunction with trilinear-discontinuous spatial differencing, and diffusion-synthetic acceleration of the within-group source iterations. PERICLES was initially designed to run on workstations, but a version for massively parallel message-passing machines will be built. The three codes will be described in detail and computational results will be presented.

  16. A dual-sided coded-aperture radiation detection system

    NASA Astrophysics Data System (ADS)

    Penny, R. D.; Hood, W. E.; Polichar, R. M.; Cardone, F. H.; Chavez, L. G.; Grubbs, S. G.; Huntley, B. P.; Kuharski, R. A.; Shyffer, R. T.; Fabris, L.; Ziock, K. P.; Labov, S. E.; Nelson, K.

    2011-10-01

    We report the development of a large-area, mobile, coded-aperture radiation imaging system for localizing compact radioactive sources in three dimensions while rejecting distributed background. The 3D Stand-Off Radiation Detection System (SORDS-3D) has been tested at speeds up to 95 km/h and has detected and located sources in the millicurie range at distances of over 100 m. Radiation data are imaged to a geospatially mapped world grid with a nominal 1.25- to 2.5-m pixel pitch at distances out to 120 m on either side of the platform. Source elevation is also extracted. Imaged radiation alarms are superimposed on a side-facing video log that can be played back for direct localization of sources in buildings in urban environments. The system utilizes a 37-element array of 5×5×50 cm³ cesium-iodide (sodium) detectors. Scintillation light is collected by a pair of photomultiplier tubes placed at either end of each detector, with the detectors achieving an energy resolution of 6.15% FWHM (662 keV) and a position resolution along their length of 5 cm FWHM. The imaging system generates a dual-sided two-dimensional image, allowing users to efficiently survey a large area. Imaged radiation data and raw spectra are forwarded to the RadioNuclide Analysis Kit (RNAK), developed by our collaborators, for isotope identification. An intuitive real-time display aids users in performing searches. Detector calibration is dynamically maintained by monitoring the potassium-40 peak and digitally adjusting individual detector gains. We have recently realized improvements, both in isotope identification and in distinguishing compact sources from background, through the installation of optimal-filter reconstruction kernels.
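The dynamic calibration scheme mentioned above (tracking the potassium-40 background line and rescaling each detector's gain) amounts to a simple multiplicative correction; a hedged sketch with an assumed interface, not the SORDS-3D software:

```python
K40_ENERGY_KEV = 1460.8  # known energy of the natural K-40 background line

def gain_correction(measured_peak_kev, current_gain=1.0):
    """Rescale a detector's digital gain so that the K-40 background
    peak, currently reconstructed at measured_peak_kev, lands back at
    its known energy. Peak-finding itself is assumed done elsewhere."""
    return current_gain * K40_ENERGY_KEV / measured_peak_kev
```

In practice the peak position would be fitted from a background spectrum accumulated over time, and the correction applied gradually to avoid chasing statistical noise.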

  17. Operation of the helicopter antenna radiation prediction code

    NASA Technical Reports Server (NTRS)

    Braeden, E. W.; Klevenow, F. T.; Newman, E. H.; Rojas, R. G.; Sampath, K. S.; Scheik, J. T.; Shamansky, H. T.

    1993-01-01

    HARP is a front end as well as a back end for the AMC and NEWAIR computer codes. These codes use the Method of Moments (MM) and the Uniform Geometrical Theory of Diffraction (UTD), respectively, to calculate the electromagnetic radiation patterns for antennas on aircraft. The major difficulty in using these codes is in the creation of proper input files for particular aircraft and in verifying that these files are, in fact, what is intended. HARP creates these input files in a consistent manner and allows the user to verify them for correctness using sophisticated 2- and 3-D graphics. After antenna field patterns are calculated using either MM or UTD, HARP can display the results on the user's screen or provide hardcopy output. Because the process of collecting data, building the 3D models, and obtaining the calculated field patterns has been completely automated by HARP, the researcher's productivity can be many times what it could be if these operations had to be done by hand. A complete, step-by-step guide is provided so that the researcher can quickly learn to make use of all the capabilities of HARP.

  18. VISRAD, 3-D Target Design and Radiation Simulation Code

    NASA Astrophysics Data System (ADS)

    Golovkin, Igor; Macfarlane, Joseph; Golovkina, Viktoriya

    2016-10-01

    The 3-D view factor code VISRAD is widely used in designing HEDP experiments at major laser and pulsed-power facilities, including NIF, OMEGA, OMEGA-EP, ORION, LMJ, Z, and PLX. It simulates target designs by generating a 3-D grid of surface elements, utilizing a variety of 3-D primitives and surface removal algorithms, and can be used to compute the radiation flux throughout the surface element grid by computing element-to-element view factors and solving power balance equations. Target set-up and beam pointing are facilitated by allowing users to specify positions and angular orientations using a variety of coordinate systems (e.g., that of any laser beam, target component, or diagnostic port). Analytic modeling of laser beam spatial profiles for OMEGA DPPs and NIF CPPs is used to compute laser intensity profiles throughout the grid of surface elements. We will discuss recent improvements to the software package and plans for future developments.
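The element-to-element power balance such view-factor codes solve is, in its simplest form, the radiosity system B_i = E_i + rho_i * sum_j F_ij B_j. A minimal fixed-point sketch under assumed notation (not VISRAD's internals):

```python
# Radiosity power balance for an enclosure of n surface elements:
#   B[i] = E[i] + rho[i] * sum_j F[i][j] * B[j]
# where E is emitted flux, rho is reflectivity, and F[i][j] is the
# view factor from element i to element j. Solved by simple iteration,
# which converges because rho[i] < 1 and each row of F sums to <= 1.
def solve_radiosity(E, rho, F, iters=200):
    n = len(E)
    B = list(E)                       # initial guess: emission only
    for _ in range(iters):
        B = [E[i] + rho[i] * sum(F[i][j] * B[j] for j in range(n))
             for i in range(n)]
    return B
```

For two large parallel plates (F12 = F21 = 1) with E = [1, 0] and rho = [0.5, 0.5], the iteration converges to B = [4/3, 2/3], which matches the closed-form solution of the two-equation system.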

  19. Development, Verification and Validation of Enclosure Radiation Capabilities in the CHarring Ablator Response (CHAR) Code

    NASA Technical Reports Server (NTRS)

    Salazar, Giovanni; Droba, Justin C.; Oliver, Brandon; Amar, Adam J.

    2016-01-01

    With the recent development of multi-dimensional thermal protection system (TPS) material response codes, the capability to account for radiative heating is a requirement. This paper presents the recent efforts to implement such capabilities in the CHarring Ablator Response (CHAR) code developed at NASA's Johnson Space Center. This work also describes the different numerical methods implemented in the code to compute view factors for radiation problems involving multiple surfaces. Furthermore, verification and validation of the code's radiation capabilities are demonstrated by comparing solutions to analytical results, to other codes, and to radiant test data.

  20. HELIOS: A new open-source radiative transfer code

    NASA Astrophysics Data System (ADS)

    Malik, Matej; Grosheintz, Luc; Grimm, Simon Lukas; Mendonça, João; Kitzmann, Daniel; Heng, Kevin

    2015-12-01

    I present the new open-source code HELIOS, developed to accurately describe radiative transfer in a wide variety of irradiated atmospheres. We employ a one-dimensional multi-wavelength two-stream approach with scattering. Written in CUDA C++, HELIOS exploits the GPU's potential for massive parallelization and is able to compute the temperature-pressure (TP) profile of an atmosphere in radiative equilibrium and the subsequent emission spectrum in a few minutes on a single computer (for 60 layers and 1000 wavelength bins). The required molecular opacities are obtained with the recently published code HELIOS-K [1], which calculates the line shapes from an input line list and resamples the numerous line-by-line data into a manageable k-distribution format. Based on simple equilibrium chemistry theory [2] we combine the k-distribution functions of the molecules H2O, CO2, CO & CH4 to generate a k-table, which we then employ in HELIOS. I present our results of the following: (i) various numerical tests, e.g. isothermal vs. non-isothermal treatment of layers; (ii) comparison of iteratively determined TP-profiles with their analytical parametric prescriptions [3] and of the corresponding spectra; (iii) benchmarks of TP-profiles & spectra for various elemental abundances; (iv) benchmarks of averaged TP-profiles & spectra for the exoplanets GJ1214b, HD189733b & HD209458b; (v) comparison with secondary eclipse data for HD189733b, XO-1b & CoRoT-2b. HELIOS is being developed, together with the dynamical core THOR and the chemistry solver VULCAN, in the group of Kevin Heng at the University of Bern as part of the Exoclimes Simulation Platform (ESP) [4], an open-source project aimed at providing community tools to model exoplanetary atmospheres. [1] Grimm & Heng 2015, arXiv:1503.03806. [2] Heng, Lyons & Tsai, arXiv:1506.05501; Heng & Lyons, arXiv:1507.01944. [3] e.g. Heng, Mendonca & Lee 2014, ApJS, 215, 4. [4] exoclime.net

  1. CODE's new solar radiation pressure model for GNSS orbit determination

    NASA Astrophysics Data System (ADS)

    Arnold, D.; Meindl, M.; Beutler, G.; Dach, R.; Schaer, S.; Lutz, S.; Prange, L.; Sośnica, K.; Mervart, L.; Jäggi, A.

    2015-08-01

    The Empirical CODE Orbit Model (ECOM) of the Center for Orbit Determination in Europe (CODE), which was developed in the early 1990s, is widely used in the International GNSS Service (IGS) community. For a rather long time, spurious spectral lines have been known to exist in geophysical parameters, in particular in the Earth Rotation Parameters (ERPs) and in the estimated geocenter coordinates, and these could recently be attributed to the ECOM. These effects grew gradually with the increasing influence of the GLONASS system in recent years in the CODE analysis, which has been based on a rigorous combination of GPS and GLONASS since May 2003. In a first step we show that the problems associated with the ECOM are to the largest extent caused by GLONASS, which reached full deployment by the end of 2011. GPS-only, GLONASS-only, and combined GPS/GLONASS solutions using the observations in the years 2009-2011 of a global network of 92 combined GPS/GLONASS receivers were analyzed for this purpose. In a second step we review direct solar radiation pressure (SRP) models for GNSS satellites. We demonstrate that only even-order short-period harmonic perturbations occur along the Sun-satellite direction for GPS and GLONASS satellites, and only odd-order perturbations along the direction perpendicular to both the Sun-satellite vector and the spacecraft's solar panel axis. Based on this insight we assess in the third step the performance of four candidate orbit models for the future ECOM. The geocenter coordinates, the ERP differences w.r.t. the IERS 08 C04 series of ERPs, the misclosures for the midnight epochs of the daily orbital arcs, and scale parameters of Helmert transformations for station coordinates serve as quality criteria. The old and updated ECOM are validated in addition with satellite laser ranging (SLR) observations and by comparing the orbits to those of the IGS and other analysis centers. Based on all tests, we present a new extended ECOM which

  2. Modeling Planet-Building Stellar Disks with Radiative Transfer Code

    NASA Astrophysics Data System (ADS)

    Swearingen, Jeremy R.; Sitko, Michael L.; Whitney, Barbara; Grady, Carol A.; Wagner, Kevin Robert; Champney, Elizabeth H.; Johnson, Alexa N.; Warren, Chelsea C.; Russell, Ray W.; Hammel, Heidi B.; Lisse, Casey M.; Cure, Michel; Kraus, Stefan; Fukagawa, Misato; Calvet, Nuria; Espaillat, Catherine; Monnier, John D.; Millan-Gabet, Rafael; Wilner, David J.

    2015-01-01

    Understanding the nature of the many planetary systems found outside of our own solar system cannot be complete without knowledge of the beginnings of these systems. By detecting planets in very young systems and modeling the disks of material around the stars from which they form, we can gain a better understanding of planetary origin and evolution. The efforts presented here have been in modeling two pre-transitional disk systems using a radiative transfer code. With the first of these systems, V1247 Ori, a model that fits the spectral energy distribution (SED) well and whose parameters are consistent with existing interferometry data (Kraus et al. 2013) has been achieved. The second of these two systems, SAO 206462, has presented a different set of challenges, but encouraging SED agreement between the model and known data gives hope that the model can produce images that can be used in future interferometry work. This work was supported by NASA ADAP grant NNX09AC73G, and the IR&D program at The Aerospace Corporation.

  3. Development and Verification of Enclosure Radiation Capabilities in the CHarring Ablator Response (CHAR) Code

    NASA Technical Reports Server (NTRS)

    Salazar, Giovanni; Droba, Justin C.; Oliver, Brandon; Amar, Adam J.

    2016-01-01

    With the recent development of multi-dimensional thermal protection system (TPS) material response codes, the capability to account for surface-to-surface radiation exchange in complex geometries is critical. This paper presents recent efforts to implement such capabilities in the CHarring Ablator Response (CHAR) code developed at NASA's Johnson Space Center. This work also describes the different numerical methods implemented in the code to compute geometric view factors for radiation problems involving multiple surfaces. Verification of the code's radiation capabilities and results of a code-to-code comparison are presented. Finally, a demonstration case of a two-dimensional ablating cavity with enclosure radiation accounting for a changing geometry is shown.
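A standard building block for the geometric view-factor computations such a code needs is the closed-form factor between two coaxial parallel disks. The following sketch shows the textbook formula only to illustrate what a view-factor routine must produce; it is not taken from the CHAR code itself:

```python
import math

# Analytic view factor from a disk of radius r1 to a coaxial parallel
# disk of radius r2, separated by distance h (standard textbook result):
#   R_i = r_i / h,  S = 1 + (1 + R2^2) / R1^2,
#   F12 = (1/2) * [S - sqrt(S^2 - 4 (r2/r1)^2)]
def disk_to_disk_view_factor(r1, r2, h):
    R1, R2 = r1 / h, r2 / h
    S = 1.0 + (1.0 + R2 * R2) / (R1 * R1)
    return 0.5 * (S - math.sqrt(S * S - 4.0 * (r2 / r1) ** 2))
```

Sanity checks follow the physics: for equal disks one radius apart the factor is about 0.382, and as the separation shrinks the factor approaches 1 (each disk sees almost nothing but the other). General geometries, as in the abstract, require numerical integration or ray-based methods instead of closed forms.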

  4. Development of A Monte Carlo Radiation Transport Code System For HEDS: Status Update

    NASA Technical Reports Server (NTRS)

    Townsend, Lawrence W.; Gabriel, Tony A.; Miller, Thomas M.

    2003-01-01

    Modifications of the Monte Carlo radiation transport code HETC are underway to extend the code to include transport of energetic heavy ions, such as are found in the galactic cosmic ray spectrum in space. The new HETC code will be available for use in radiation shielding applications associated with missions, such as the proposed manned mission to Mars. In this work the current status of code modification is described. Methods used to develop the required nuclear reaction models, including total, elastic and nuclear breakup processes, and their associated databases are also presented. Finally, plans for future work on the extended HETC code system and for its validation are described.

  5. Radiation transport phenomena and modeling. Part A: Codes; Part B: Applications with examples

    SciTech Connect

    Lorence, L.J. Jr.; Beutler, D.E.

    1997-09-01

    This report contains the notes from the second session of the 1997 IEEE Nuclear and Space Radiation Effects Conference Short Course on Applying Computer Simulation Tools to Radiation Effects Problems. Part A discusses the physical phenomena modeled in radiation transport codes and various types of algorithmic implementations. Part B gives examples of how these codes can be used to design experiments whose results can be easily analyzed and describes how to calculate quantities of interest for electronic devices.

  6. Code system to compute radiation dose in human phantoms

    SciTech Connect

    Ryman, J.C.; Cristy, M.; Eckerman, K.F.; Davis, J.L.; Tang, J.S.; Kerr, G.D.

    1986-01-01

    A Monte Carlo photon transport code and a code using Monte Carlo integration of a point kernel have been revised to incorporate human phantom models for an adult female, juveniles of various ages, and a pregnant female at the end of the first trimester of pregnancy, in addition to the adult male used earlier. An analysis code has been developed for deriving recommended values of specific absorbed fractions of photon energy. The computer code system and calculational method are described, emphasizing recent improvements in methods.

  7. Comparison of codes assessing galactic cosmic radiation exposure of aircraft crew.

    PubMed

    Bottollier-Depois, J F; Beck, P; Bennett, B; Bennett, L; Bütikofer, R; Clairand, I; Desorgher, L; Dyer, C; Felsberger, E; Flückiger, E; Hands, A; Kindl, P; Latocha, M; Lewis, B; Leuthold, G; Maczka, T; Mares, V; McCall, M J; O'Brien, K; Rollet, S; Rühm, W; Wissmann, F

    2009-10-01

    The assessment of the exposure to cosmic radiation onboard aircraft is one of the preoccupations of bodies responsible for radiation protection. The cosmic particle flux is significantly higher onboard aircraft than at ground level, and its intensity depends on the solar activity. The dose is usually estimated using codes validated against experimental data. In this paper, a comparison of various codes is presented, some of them used routinely, to assess the dose received by aircraft crew from galactic cosmic radiation. Results are provided for periods close to solar maximum and minimum and for selected flights covering major commercial routes in the world. The overall agreement between the codes, particularly for those routinely used for aircraft crew dosimetry, was better than ±20% from the median in all but two cases. The agreement among the codes is considered to be fully satisfactory for radiation protection purposes.

  8. Accelerating execution of the integrated TIGER series Monte Carlo radiation transport codes

    SciTech Connect

    Smith, L.M.; Hochstedler, R.D.

    1997-02-01

    Execution of the integrated TIGER series (ITS) of coupled electron/photon Monte Carlo radiation transport codes has been accelerated by modifying the FORTRAN source code for more efficient computation. Each member code of ITS was benchmarked and profiled with a specific test case that directed the acceleration effort toward the most computationally intensive subroutines. Techniques for accelerating these subroutines included replacing linear search algorithms with binary versions, replacing the pseudo-random number generator, reducing program memory allocation, and proofing the input files for geometrical redundancies. All techniques produced identical or statistically similar results to the original code. Final benchmark timing of the accelerated code resulted in speed-up factors of 2.00 for TIGER (the one-dimensional slab geometry code), 1.74 for CYLTRAN (the two-dimensional cylindrical geometry code), and 1.90 for ACCEPT (the arbitrary three-dimensional geometry code).
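Of the acceleration techniques listed above, replacing a linear search with a binary version is the most self-contained. A sketch of the idea on a sorted cumulative-distribution table (the table values are illustrative, not from ITS); both routines return the same bin, but the binary version does O(log n) comparisons instead of O(n):

```python
import bisect

CDF = [0.1, 0.3, 0.6, 0.85, 1.0]   # cumulative probabilities per bin

def sample_bin_linear(u):
    """Original-style linear scan: first bin whose CDF value covers u."""
    for i, c in enumerate(CDF):
        if u <= c:
            return i
    return len(CDF) - 1

def sample_bin_binary(u):
    """Binary-search replacement with identical results on u in [0, 1]."""
    return bisect.bisect_left(CDF, u)
```

This pattern shows up constantly in Monte Carlo transport (sampling energy groups, interaction types, angular bins), which is why the abstract reports factor-of-two speedups from this and the other listed changes combined.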

  9. A Monte Carlo Code for Relativistic Radiation Transport Around Kerr Black Holes

    NASA Technical Reports Server (NTRS)

    Schnittman, Jeremy David; Krolik, Julian H.

    2013-01-01

    We present a new code for radiation transport around Kerr black holes, including arbitrary emission and absorption mechanisms, as well as electron scattering and polarization. The code is particularly useful for analyzing accretion flows made up of optically thick disks and optically thin coronae. We give a detailed description of the methods employed in the code and also present results from a number of numerical tests to assess its accuracy and convergence.

  10. A MONTE CARLO CODE FOR RELATIVISTIC RADIATION TRANSPORT AROUND KERR BLACK HOLES

    SciTech Connect

    Schnittman, Jeremy D.; Krolik, Julian H. E-mail: jhk@pha.jhu.edu

    2013-11-01

    We present a new code for radiation transport around Kerr black holes, including arbitrary emission and absorption mechanisms, as well as electron scattering and polarization. The code is particularly useful for analyzing accretion flows made up of optically thick disks and optically thin coronae. We give a detailed description of the methods employed in the code and also present results from a number of numerical tests to assess its accuracy and convergence.

  11. Development and application of a reverse Monte Carlo radiative transfer code for rocket plume base heating

    NASA Technical Reports Server (NTRS)

    Everson, John; Nelson, H. F.

    1993-01-01

    A reverse Monte Carlo radiative transfer code to predict rocket plume base heating is presented. In this technique rays representing the radiation propagation are traced backwards in time from the receiving surface to the point of emission in the plume. This increases the computational efficiency relative to the forward Monte Carlo technique when calculating the radiation reaching a specific point, as only the rays that strike the receiving point are considered.
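The backward-marching accumulation that makes the reverse method efficient can be sketched for a single line of sight; the cell data and names below are illustrative, not taken from the code in the abstract:

```python
import math

# Backward (receiver-to-source) line-of-sight integration, the core idea
# of a reverse Monte Carlo ray: march from the receiving surface into the
# plume and add each cell's emission, attenuated by the optical depth
# already traversed between that cell and the receiver.
def backward_ray_intensity(cells, ds):
    """cells: list of (k_abs [1/m], blackbody_intensity) along the ray,
    ordered starting at the receiver; ds: cell path length [m]."""
    intensity, tau = 0.0, 0.0
    for k, i_b in cells:
        intensity += i_b * (1.0 - math.exp(-k * ds)) * math.exp(-tau)
        tau += k * ds
    return intensity
```

The efficiency gain of the reverse formulation is exactly that only rays terminating at the receiving point are ever traced; a forward Monte Carlo calculation would waste most of its samples on rays that never reach it.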

  12. Flexible Radiation Codes for Numerical Weather Prediction Across Space and Time Scales

    DTIC Science & Technology

    2013-09-30

    time and space scales, especially from regional models to global models. OBJECTIVES We are adapting radiation codes developed for climate ...PSrad is now complete, thoroughly tested and debugged, and is functioning as the radiation scheme in the climate model ECHAM 6.2 developed at the Max Planck...statistically significant change at most stations, indicating that errors in most places are not primarily driven by radiation errors. We are working

  13. TRAP/SEE Code Users Manual for Predicting Trapped Radiation Environments

    NASA Astrophysics Data System (ADS)

    Armstrong, T. W.; Colborn, B. L.

    2000-01-01

    TRAP/SEE is a PC-based computer code with a user-friendly interface which predicts the ionizing radiation exposure of spacecraft having orbits in the Earth's trapped radiation belts. The code incorporates the standard AP8 and AE8 trapped proton and electron models but also allows application of an improved database interpolation method. The code treats low-Earth as well as highly-elliptical Earth orbits, taking into account trajectory perturbations due to gravitational forces from the Moon and Sun, atmospheric drag, and solar radiation pressure. Orbit-average spectra, peak spectra per orbit, and instantaneous spectra at points along the orbit trajectory are calculated. Described in this report are the features, models, model limitations and uncertainties, input and output descriptions, and example calculations and applications for the TRAP/SEE code.

  14. ICRCCM (InterComparison of Radiation Codes used in Climate Models) Phase 2: Verification and calibration of radiation codes in climate models

    SciTech Connect

    Ellingson, R.G.; Wiscombe, W.J.; Murcray, D.; Smith, W.; Strauch, R.

    1990-01-01

    Following the finding by the InterComparison of Radiation Codes used in Climate Models (ICRCCM) of large differences among fluxes predicted by sophisticated radiation models, differences that could not be sorted out for lack of a set of accurate atmospheric spectral radiation data measured simultaneously with the important radiative properties of the atmosphere, our team of scientists proposed to remedy the situation by carrying out a comprehensive program of measurement and analysis called SPECTRE (Spectral Radiance Experiment). SPECTRE will establish an absolute standard against which to compare models, and will aim to remove the "hidden variables" (unknown humidities, aerosols, etc.) which radiation modelers have invoked to excuse disagreements with observation. The data collected during SPECTRE will form the test bed for the second phase of ICRCCM, namely verification and calibration of the radiation codes used in climate models. This should lead to more accurate radiation models for use in parameterizing climate models, which in turn play a key role in the prediction of trace-gas greenhouse effects. Overall, the project is proceeding much as anticipated in the original proposal. The most significant accomplishments to date include the completion of the analysis of the original ICRCCM calculations, the completion of the initial sensitivity analysis of the radiation calculations with respect to uncertainties in the measurement of water vapor and temperature, and the acquisition and testing of the inexpensive spectrometers for use in the field experiment. The sensitivity analysis and the spectrometer tests have given us much more confidence that the field experiment will yield data of the quality necessary to make significant tests of, and improvements to, the radiative transfer models used in climate studies.

  15. Protection of the genome and central protein-coding sequences by non-coding DNA against DNA damage from radiation.

    PubMed

    Qiu, Guo-Hua

    2015-01-01

    Non-coding DNA comprises a very large proportion of the total genomic content in higher organisms, but its function remains largely unclear. Non-coding DNA sequences constitute the majority of peripheral heterochromatin, which has been hypothesized to be the genome's 'bodyguard' against DNA damage from chemicals and radiation for almost four decades. The bodyguard protective function of peripheral heterochromatin in genome defense has been strengthened by the results from numerous recent studies, which are summarized in this review. These data have suggested that cells and/or organisms with a higher level of heterochromatin and more non-coding DNA sequences, including longer telomeric DNA and rDNAs, exhibit a lower frequency of DNA damage, higher radioresistance and longer lifespan after IR exposure. In addition, the majority of heterochromatin is peripherally located in the three-dimensional structure of genome organization. Therefore, the peripheral heterochromatin with non-coding DNA could play a protective role in genome defense against DNA damage from ionizing radiation by both absorbing the radicals from water radiolysis in the cytosol and reducing the energy of IR. However, the bodyguard protection by heterochromatin has been challenged by the observation that DNA damage is less frequently detected in peripheral heterochromatin than in euchromatin, which is inconsistent with the expectation and simulation results. Previous studies have also shown that the DNA damage in peripheral heterochromatin is rarely repaired and moves more quickly, broadly and outwardly to approach the nuclear pore complex (NPC). Additionally, it has been shown that extrachromosomal circular DNAs (eccDNAs) are formed in the nucleus, highly detectable in the cytoplasm (particularly under stress conditions) and shuttle between the nucleus and the cytoplasm. Based on these studies, this review speculates that the sites of DNA damage in peripheral heterochromatin could occur more

  16. CRASH: A BLOCK-ADAPTIVE-MESH CODE FOR RADIATIVE SHOCK HYDRODYNAMICS-IMPLEMENTATION AND VERIFICATION

    SciTech Connect

    Van der Holst, B.; Toth, G.; Sokolov, I. V.; Myra, E. S.; Fryxell, B.; Drake, R. P.; Powell, K. G.; Holloway, J. P.; Stout, Q.; Adams, M. L.; Morel, J. E.; Karni, S.

    2011-06-01

    We describe the Center for Radiative Shock Hydrodynamics (CRASH) code, a block-adaptive-mesh code for multi-material radiation hydrodynamics. The implementation solves the radiation diffusion model with a gray or multi-group method and uses a flux-limited diffusion approximation to recover the free-streaming limit. Electrons and ions are allowed to have different temperatures and we include flux-limited electron heat conduction. The radiation hydrodynamic equations are solved in the Eulerian frame by means of a conservative finite-volume discretization in either one-, two-, or three-dimensional slab geometry or in two-dimensional cylindrical symmetry. An operator-split method is used to solve these equations in three substeps: (1) an explicit step of a shock-capturing hydrodynamic solver; (2) a linear advection of the radiation in frequency-logarithm space; and (3) an implicit solution of the stiff radiation diffusion, heat conduction, and energy exchange. We present a suite of verification test problems to demonstrate the accuracy and performance of the algorithms. The applications are for astrophysics and laboratory astrophysics. The CRASH code is an extension of the Block-Adaptive Tree Solarwind Roe Upwind Scheme (BATS-R-US) code with a new radiation transfer and heat conduction library and equation-of-state and multi-group opacity solvers. Both CRASH and BATS-R-US are part of the publicly available Space Weather Modeling Framework.
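
    The flux-limited diffusion approximation described above interpolates between optically thick diffusion and the free-streaming limit. A minimal sketch of the idea, using the Levermore-Pomraning limiter as one illustrative choice (CRASH's actual limiter and units are not specified in this abstract):

```python
import math

def levermore_pomraning_limiter(R):
    """Levermore-Pomraning flux limiter lambda(R), where
    R = |grad E| / (kappa * E) compares the radiation-energy
    gradient scale with the photon mean free path."""
    if R < 1e-4:
        return 1.0 / 3.0  # optically thick limit: classical diffusion
    return (1.0 / math.tanh(R) - 1.0 / R) / R

def radiative_flux(E, grad_E, kappa, c=2.998e10):
    """Flux-limited diffusion flux F = -(c * lambda / kappa) * grad E.
    As R -> infinity, |F| -> c * E: radiation never outruns light."""
    R = abs(grad_E) / (kappa * E)
    return -c * levermore_pomraning_limiter(R) / kappa * grad_E
```

    In the optically thick regime the limiter returns 1/3, recovering ordinary radiation diffusion; in steep gradients it throttles the flux toward the causal bound c*E.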

  17. The Performance of Current Atmospheric Radiation Codes in Phase I of CIRC

    NASA Technical Reports Server (NTRS)

    Oreopoulos, L.; Mlawer, E.; Shippert, T.; Cole, J.; Fomin, B.; Iacono, M.; Jin, Z.; Li, J.; Manners, J.; Raisanen, P.; Rose, F.; Zhang, Y.; Wilson, M.; Rossow, W.

    2012-01-01

    The Continual Intercomparison of Radiation Codes (CIRC) is intended as an evolving and regularly updated reference source for evaluation of radiative transfer (RT) codes used in Global Climate Models and other atmospheric applications. In our presentation we will discuss our evaluation of the performance of 13 shortwave and 11 longwave RT codes that participated in Phase I of CIRC. CIRC differs from previous intercomparisons in that it relies on an observationally validated catalogue of cases. The seven CIRC Phase I baseline cases, five cloud-free, and two with overcast liquid clouds, are built around observations by the Atmospheric Radiation Measurement (ARM) program that satisfy the goals of Phase I, namely to examine RT model performance in realistic, yet not overly complex, atmospheric conditions. Besides the seven baseline cases, additional idealized "subcases" are also examined to facilitate interpretation of model errors. We will quantify individual model performance with respect to reference line-by-line calculations, and will also highlight RT code behavior for conditions of doubled CO2, aspects of utilizing a spectral specification of surface albedo, and the impact of the inclusion of scattering in the thermal infrared. Our analysis suggests that RT codes should work towards improving their calculation of diffuse shortwave flux, shortwave absorption, treatment of spectral surface albedo, and shortwave CO2 forcing. Despite practical difficulties in comparing our results to previous results by the Intercomparison of Radiation Codes in Climate Models (ICRCCM) conducted about 20 years ago, it appears that the current generation of RT codes does indeed perform better than the codes of the ICRCCM era. By enhancing the range of conditions under which participating codes are tested, future CIRC phases will hopefully allow even more rigorous examination of RT code performance.

  18. General relativistic radiative transfer code in rotating black hole space-time: ARTIST

    NASA Astrophysics Data System (ADS)

    Takahashi, Rohta; Umemura, Masayuki

    2017-02-01

    We present a general relativistic radiative transfer code, ARTIST (Authentic Radiative Transfer In Space-Time), that is a perfectly causal scheme to pursue the propagation of radiation with absorption and scattering around a Kerr black hole. The code explicitly solves the invariant radiation intensity along null geodesics in the Kerr-Schild coordinates, and therefore properly includes light bending, Doppler boosting, frame dragging, and gravitational redshifts. The notable aspect of ARTIST is that it conserves the radiative energy with high accuracy, and is not subject to the numerical diffusion, since the transfer is solved on long characteristics along null geodesics. We first solve the wavefront propagation around a Kerr black hole that was originally explored by Hanni. This demonstrates repeated wavefront collisions, light bending, and causal propagation of radiation with the speed of light. We show that the decay rate of the total energy of wavefronts near a black hole is determined solely by the black hole spin in late phases, in agreement with analytic expectations. As a result, the ARTIST turns out to correctly solve the general relativistic radiation fields until late phases as t ˜ 90 M. We also explore the effects of absorption and scattering, and apply this code for a photon wall problem and an orbiting hotspot problem. All the simulations in this study are performed in the equatorial plane around a Kerr black hole. The ARTIST is the first step to realize the general relativistic radiation hydrodynamics.

  19. Evaluation of the ECHAM family radiation codes performance in the representation of the solar signal

    NASA Astrophysics Data System (ADS)

    Sukhodolov, T.; Rozanov, E.; Shapiro, A. I.; Anet, J.; Cagnazzo, C.; Peter, T.; Schmutz, W.

    2014-12-01

    Solar radiation is the main source of energy for the Earth's atmosphere and in many respects defines its composition, photochemistry, temperature profile and dynamics. The magnitude of the solar irradiance variability strongly depends on the wavelength, making its representation in climate models difficult. Due to some deficiencies in the applied radiation codes, several models fail to show a clear response in middle stratospheric heating rates to solar spectral irradiance variability; therefore, it is important to evaluate model performance in this respect before doing multiple runs. In this work we evaluate the performance of three generations of ECHAM (4, 5 and 6) solar radiation schemes by a comparison with the reference high-resolution libRadtran code. We found that all original ECHAM radiation codes miss almost all solar signals in the heating rates in the mesosphere. In the stratosphere the two-band ECHAM4 code (E4) has an almost negligible radiative response to solar irradiance changes and the six-band ECHAM5 code (E5c) reproduces only about half of the reference signal, while representation in the ECHAM6 code (E6) is better; it misses at most about 15% in the upper stratosphere. On the basis of the comparison results we suggest necessary improvements to the ECHAM family codes by the inclusion of available parameterizations of the heating rate due to absorption by oxygen (O2) and ozone (O3). Improvement is presented for E5c and E6, and both codes, with the introduced parameterizations, represent the heating rate response to the spectral solar irradiance variability simulated with libRadtran much better without a substantial increase in computer time. The suggested parameterizations are recommended for application in the middle-atmosphere version of the ECHAM-5 and 6 models for the study of the solar irradiance influence on climate.
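
    The heating rates compared in this study derive from the vertical divergence of the net radiative flux, Q = (g/cp) dF/dp for F the net upward flux on a pressure grid. A minimal sketch of that relation (notation and grid are assumptions, not taken from the ECHAM source):

```python
import numpy as np

G = 9.81      # gravitational acceleration, m s^-2
CP = 1004.0   # specific heat of dry air, J kg^-1 K^-1

def heating_rate(net_up_flux, pressure):
    """Radiative heating rate Q (K/s) from net upward flux F (W m^-2)
    on a pressure grid p (Pa): Q = (g/cp) * dF/dp.  With p increasing
    downward, flux convergence with height gives positive Q (heating)."""
    return (G / CP) * np.gradient(net_up_flux, pressure)
```

    A linear flux profile therefore yields a constant heating rate, a handy sanity check when comparing schemes.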

  20. Fan Noise Prediction System Development: Source/Radiation Field Coupling and Workstation Conversion for the Acoustic Radiation Code

    NASA Technical Reports Server (NTRS)

    Meyer, H. D.

    1993-01-01

    The Acoustic Radiation Code (ARC) is a finite element program used on the IBM mainframe to predict far-field acoustic radiation from a turbofan engine inlet. In this report, requirements for developers of internal aerodynamic codes regarding use of their program output as input for the ARC are discussed. More specifically, the particular input needed from the Bolt, Beranek and Newman/Pratt and Whitney (turbofan source noise generation) Code (BBN/PWC) is described. In a separate analysis, a method of coupling the source and radiation models, one that recognizes waves crossing the interface in both directions, has been derived. A preliminary version of the coupled code has been developed and used for initial evaluation of coupling issues. Results thus far have shown that reflection from the inlet is sufficient to indicate that full coupling of the source and radiation fields is needed for accurate noise predictions. Also, for this contract, the ARC has been modified for use on the Sun and Silicon Graphics Iris UNIX workstations. Changes and additions involved in this effort are described in an appendix.

  1. Radiative transfer codes for atmospheric correction and aerosol retrieval: intercomparison study.

    PubMed

    Kotchenova, Svetlana Y; Vermote, Eric F; Levy, Robert; Lyapustin, Alexei

    2008-05-01

    Results are summarized for a scientific project devoted to the comparison of four atmospheric radiative transfer codes incorporated into different satellite data processing algorithms, namely, 6SV1.1 (second simulation of a satellite signal in the solar spectrum, vector, version 1.1), RT3 (radiative transfer), MODTRAN (moderate resolution atmospheric transmittance and radiance code), and SHARM (spherical harmonics). The performance of the codes is tested against well-known benchmarks, such as Coulson's tabulated values and a Monte Carlo code. The influence of revealed differences on aerosol optical thickness and surface reflectance retrieval is estimated theoretically by using a simple mathematical approach. All information about the project can be found at http://rtcodes.ltdri.org.

  2. On the Development of a Deterministic Three-Dimensional Radiation Transport Code

    NASA Technical Reports Server (NTRS)

    Rockell, Candice; Tweed, John

    2011-01-01

    Since astronauts on future deep space missions will be exposed to dangerous radiation, there is a need to accurately model the transport of radiation through shielding materials and to estimate the received radiation dose. In response to this need, a three-dimensional deterministic code for space radiation transport is now under development. The new code GRNTRN is based on a Green's function solution of the Boltzmann transport equation that is constructed in the form of a Neumann series. Analytical approximations will be obtained for the first three terms of the Neumann series and the remainder will be estimated by a non-perturbative technique. This work discusses progress made to date and exhibits some computations based on the first two Neumann series terms.
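
    The Neumann-series construction mentioned above can be illustrated on a toy linear problem phi = S + K phi, whose solution is the sum of terms S, KS, K^2 S, and so on. The following schematic sketch is not the GRNTRN formulation, just the series idea on a small matrix system:

```python
import numpy as np

def neumann_series_solve(K, S, terms=20):
    """Solve phi = S + K @ phi by summing the Neumann series
    phi = S + K S + K^2 S + ...  (converges when ||K|| < 1)."""
    phi = np.zeros_like(S)
    term = S.copy()
    for _ in range(terms):
        phi += term        # accumulate K^n S
        term = K @ term    # next term K^(n+1) S
    return phi
```

    In transport applications each power of the kernel corresponds to one more collision, so truncating the series (as GRNTRN does after three analytic terms) amounts to treating low-order scattering exactly and estimating the rest.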

  3. Method for calculating internal radiation and ventilation with the ADINAT heat-flow code

    SciTech Connect

    Butkovich, T.R.; Montan, D.N.

    1980-04-01

    One objective of the spent fuel test in Climax Stock granite (SFTC) is to correctly model the thermal transport, and the changes in the stress field and accompanying displacements from the application of the thermal loads. We have chosen the ADINA and ADINAT finite element codes to do these calculations. ADINAT is a heat transfer code compatible with the ADINA displacement and stress analysis code. The heat flow problem encountered at SFTC requires a code with conduction, radiation, and ventilation capabilities, which the present version of ADINAT does not have. We have devised a method for calculating internal radiation and ventilation with the ADINAT code. This method effectively reproduces the results from the TRUMP multi-dimensional finite difference code, which correctly models radiative heat transport between drift surfaces, conductive and convective thermal transport to and through air in the drifts, and mass flow of air in the drifts. The temperature histories for each node in the finite element mesh calculated with ADINAT using this method can be used directly in the ADINA thermal-mechanical calculation.

  4. A multigroup radiation diffusion test problem: Comparison of code results with analytic solution

    SciTech Connect

    Shestakov, A I; Harte, J A; Bolstad, J H; Offner, S R

    2006-12-21

    We consider a 1D, slab-symmetric test problem for the multigroup radiation diffusion and matter energy balance equations. The test simulates diffusion of energy from a hot central region. Opacities vary with the cube of the frequency and radiation emission is given by a Wien spectrum. We compare results from two LLNL codes, Raptor and Lasnex, with tabular data that define the analytic solution.
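
    The test problem's two spectral ingredients, a Wien emission spectrum and an opacity varying with the cube of the frequency, can be sketched as follows. Constants and normalization are illustrative, and the abstract's wording is taken at face value (many comparable test problems use an inverse-cube opacity instead):

```python
import numpy as np

H_OVER_K = 4.7992e-11   # Planck constant over Boltzmann constant, s*K

def wien_spectrum(nu, T):
    """Wien approximation to thermal emission,
    B(nu, T) ~ nu^3 * exp(-h*nu / (k*T))  (unnormalized shape)."""
    return nu**3 * np.exp(-H_OVER_K * nu / T)

def opacity(nu, kappa0=1.0, nu0=1.0e14):
    """Opacity varying with the cube of the frequency, as stated in
    the abstract: kappa(nu) = kappa0 * (nu / nu0)^3."""
    return kappa0 * (nu / nu0) ** 3
```

    The Wien shape peaks at nu = 3 k T / h, which provides a quick check on the constants used.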

  5. RRTMGP: A High-Performance Broadband Radiation Code for the Next Decade

    DTIC Science & Technology

    2015-09-30

    radiation calculations needed for climate simulations require many independent and complicated calculations, and are therefore an inviting target for new...parallel’), a modern version of the radiation code (RRTMG) used by many climate models, directed at the current generation of vector- and cache-based...represent the interests of the Navy GCM (NAVGEM). Due to the expected wide impact of this development effort on climate and weather modeling

  6. RRTMGP: A fast and accurate radiation code for the next decade

    NASA Astrophysics Data System (ADS)

    Mlawer, E. J.; Pincus, R.; Wehe, A.; Delamere, J.

    2015-12-01

    Atmospheric radiative processes are key drivers of the Earth's climate and must be accurately represented in global circulation models (GCMs) to allow faithful simulations of the planet's past, present, and future. The radiation code RRTMG is widely utilized by global modeling centers for both climate and weather predictions, but it has become increasingly out-of-date. The code's structure is not well suited for the current generation of computer architectures and its stored absorption coefficients are not consistent with the most recent spectroscopic information. We are developing a new broadband radiation code for the current generation of computational architectures. This code, called RRTMGP, will be a completely restructured and modern version of RRTMG. The new code preserves the strengths of the existing RRTMG parameterization, especially the high accuracy of the k-distribution treatment of absorption by gases, but the entire code is being rewritten to provide highly efficient computation across a range of architectures. Our redesign includes refactoring the code into discrete kernels corresponding to fundamental computational elements (e.g. gas optics), optimizing the code for operating on multiple columns in parallel, simplifying the subroutine interface, revisiting the existing gas optics interpolation scheme to reduce branching, and adding flexibility with respect to run-time choices of streams, need for consideration of scattering, aerosol and cloud optics, etc. The result of the proposed development will be a single, well-supported and well-validated code amenable to optimization across a wide range of platforms. Our main emphasis is on highly-parallel platforms including Graphical Processing Units (GPUs) and Many-Integrated-Core processors (MICs), which experience shows can accelerate broadband radiation calculations by as much as a factor of fifty. RRTMGP will provide highly efficient and accurate radiative flux calculations for coupled global
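
    The k-distribution treatment whose accuracy RRTMGP preserves replaces a line-by-line spectral integral with a short quadrature over sorted absorption coefficients. A toy comparison of the two on a synthetic spectrum (illustrative only, not the RRTMG tables or interpolation scheme):

```python
import numpy as np

def band_transmittance_lbl(k_spectrum, u):
    """Reference 'line-by-line' band-mean transmittance for
    absorber amount u: average of exp(-k*u) over every spectral point."""
    return float(np.mean(np.exp(-k_spectrum * u)))

def band_transmittance_kdist(k_spectrum, u, n_g=16):
    """k-distribution shortcut: sort k into its cumulative distribution
    g(k), then quadrature over a few g-intervals instead of the full
    spectrum.  Accurate because exp(-k*u) varies smoothly in g-space."""
    k_sorted = np.sort(k_spectrum)
    groups = np.array_split(k_sorted, n_g)
    weights = np.array([len(g) for g in groups], dtype=float)
    t = np.array([np.exp(-g.mean() * u) for g in groups])
    return float(np.sum(weights * t) / weights.sum())
```

    A 16-point g-quadrature typically reproduces the many-thousand-point spectral mean to well under a percent, which is the efficiency win the abstract refers to.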

  7. Evaluation of the ECHAM family radiation codes performance in the representation of the solar signal

    NASA Astrophysics Data System (ADS)

    Sukhodolov, T.; Rozanov, E.; Shapiro, A. I.; Anet, J.; Cagnazzo, C.; Peter, T.; Schmutz, W.

    2014-02-01

    Solar radiation is the main source of energy for the Earth's atmosphere and in many respects defines its composition, photochemistry, temperature profile and dynamics. The magnitude of the solar irradiance variability strongly depends on the wavelength, making its representation in climate models difficult. Due to some deficiencies of the applied radiation codes, several models fail to show a clear response in middle stratospheric heating rates to solar spectral irradiance variability; therefore, it is important to demonstrate reasonable model performance in this respect before doing multiple model runs. In this work we evaluate the performance of three generations of ECHAM (4, 5 and 6) radiation schemes by comparison with the reference high-resolution libRadtran code. We found that both original ECHAM5 and 6 solar radiation codes miss almost all solar signal in the heating rates in the mesosphere. In the stratosphere the ECHAM5 code reproduces only about half of the reference signal, while the representation of the ECHAM6 code is better; it misses at most about 17% in the upper stratosphere. On the basis of the comparison results we suggest necessary improvements to the ECHAM family codes by inclusion of available parameterizations of the heating rate due to absorption by oxygen (O2) and ozone (O3). Both codes with the introduced parameterizations represent the heating rate response to the spectral solar irradiance variability simulated with libRadtran much better, without a substantial increase in computer time. The suggested parameterizations are recommended for application in the middle-atmosphere version of the ECHAM-5 and 6 models for the study of the solar irradiance influence on climate.

  8. Benchmarking Space Radiation Transport Codes Using Measured LET Spectra from the Crater Instrument on LRO

    NASA Astrophysics Data System (ADS)

    Townsend, L. W.; Porter, J.; Spence, H. E.; Golightly, M. J.; Smith, S. S.; Schwadron, N.; Kasper, J. C.; Case, A. W.; Blake, J. B.; Mazur, J. E.; Looper, M. D.; Zeitlin, C. J.

    2014-12-01

    The Cosmic Ray Telescope for the Effects of Radiation (CRaTER) instrument on the Lunar Reconnaissance Orbiter (LRO) spacecraft measures the energy depositions by solar and galactic cosmic radiation in its silicon detectors. These energy depositions are converted to linear energy transfer (LET) spectra, which can contribute to benchmarking space radiation transport codes and can also be used to estimate doses for the lunar environment. In this work the Monte Carlo transport code HETC-HEDS (High Energy Transport Code - Human Exploration and Development in Space) and the deterministic NASA space radiation transport code HZETRN2010 are used to estimate LET and dose contributions from the incident primary ions and their charged secondaries produced in nuclear collisions within the components of the CRaTER instrument. Comparisons of the calculated LET spectra with measurements of LET from the CRaTER instrument are made and clearly show the importance of including corrections to the calculated average energy deposition spectra in the silicon detectors using a Vavilov distribution function.
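
    Converting a measured energy deposition to LET amounts to dividing the deposited energy by the silicon path length, optionally mass-normalizing by the detector density. A simplified sketch with assumed units (not the CRaTER processing chain, which also handles geometry and detector response):

```python
def let_kev_per_micron(energy_dep_mev, path_length_um):
    """LET (keV/um) from energy deposited (MeV) along a path (um)."""
    return energy_dep_mev * 1000.0 / path_length_um

def let_mev_cm2_per_mg(let_kev_um, density_g_cm3=2.33):
    """Convert LET in keV/um to mass-normalized MeV cm^2/mg using the
    detector density (2.33 g/cm^3 for silicon).
    1 keV/um = 10 MeV/cm; divide by density in mg/cm^3."""
    return let_kev_um * 10.0 / (density_g_cm3 * 1000.0)
```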

  9. Creation and utilization of a World Wide Web based space radiation effects code: SIREST

    NASA Technical Reports Server (NTRS)

    Singleterry, R. C. Jr; Wilson, J. W.; Shinn, J. L.; Tripathi, R. K.; Thibeault, S. A.; Noor, A. K.; Cucinotta, F. A.; Badavi, F. F.; Chang, C. K.; Qualls, G. D.; Clowdsley, M. S.; Kim, M. H.; Heinbockel, J. H.; Norbury, J.; Blattning, S. R.; Miller, J.; Zeitlin, C.; Heilbronn, L. H.

    2001-01-01

    In order for humans and electronics to fully and safely operate in the space environment, codes like HZETRN (High Charge and Energy Transport) must be included in any designer's toolbox for design evaluation with respect to radiation damage. Currently, spacecraft designers do not have easy access to accurate radiation codes like HZETRN to evaluate their design for radiation effects on humans and electronics. Today, the World Wide Web is sophisticated enough to support the entire HZETRN code and all of the associated pre- and post-processing tools. This package is called SIREST (Space Ionizing Radiation Effects and Shielding Tools). There are many advantages to SIREST. The most important advantage is the instant update capability of the web. Another major advantage is the modularity that the web imposes on the code. Right now, the major disadvantage of SIREST will be its modularity inside the designer's system. This stems mostly from the fact that a consistent interface between the designer and the computer system used to evaluate the design is incomplete. This, however, is to be solved in the Intelligent Synthesis Environment (ISE) program currently being funded by NASA.

  10. TAU: A 1D radiative transfer code for transmission spectroscopy of extrasolar planet atmospheres

    NASA Astrophysics Data System (ADS)

    Hollis, M. D. J.; Tessenyi, M.; Tinetti, G.

    2013-10-01

    The TAU code is a 1D line-by-line radiative transfer code, which is generally applicable for modelling transmission spectra of close-in extrasolar planets. The inputs are the assumed pressure-temperature profile of the planetary atmosphere, the continuum absorption coefficients and the absorption cross-sections for the trace molecular absorbers present in the model, as well as the fundamental system parameters taken from the published literature. The program then calculates the optical path through the planetary atmosphere of the radiation from the host star, and quantifies the absorption due to the modelled composition in a transmission spectrum of transit depth as a function of wavelength. The code is written in C++, parallelised using OpenMP, and is available for public download and use from http://www.ucl.ac.uk/exoplanets/. Running time: from 0.5 to 500 s, depending on run parameters
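
    The transit-depth calculation TAU performs can be caricatured for an isothermal atmosphere, where each grazing chord through the atmosphere contributes a partially opaque annulus to the stellar disk coverage. All parameters and the exponential-atmosphere chord approximation below are illustrative assumptions, not TAU inputs or its actual algorithm:

```python
import numpy as np

def transit_depth(sigma, n0, H, Rp, Rs, n_layers=200, z_top=10.0):
    """Very simplified transit depth (effective Rp/Rs)^2 at one wavelength.
    sigma : absorption cross-section (cm^2) at this wavelength
    n0, H : base number density (cm^-3) and scale height (cm)
    Rp, Rs: planet and star radii (cm)."""
    z = np.linspace(0.0, z_top * H, n_layers)   # impact-parameter altitudes
    b = Rp + z
    # chord column density through an exponential atmosphere
    N = n0 * np.exp(-z / H) * np.sqrt(2.0 * np.pi * H * b)
    tau = sigma * N
    # extra effective area blocked by the partially opaque annuli
    dA = 2.0 * np.pi * b * np.gradient(z) * (1.0 - np.exp(-tau))
    return (np.pi * Rp**2 + dA.sum()) / (np.pi * Rs**2)
```

    With sigma = 0 this reduces to the bare geometric depth (Rp/Rs)^2; a nonzero cross-section raises the effective radius by a few scale heights, which is the wavelength dependence a transmission spectrum encodes.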

  11. TAU: A 1D radiative transfer code for transmission spectroscopy of extrasolar planet atmospheres

    NASA Astrophysics Data System (ADS)

    Hollis, M. D. J.; Tessenyi, M.; Tinetti, G.

    2014-02-01

    The TAU code is a 1D line-by-line radiative transfer code, which is generally applicable for modeling transmission spectra of close-in extrasolar planets. The inputs are the assumed temperature-pressure profile of the planetary atmosphere, the continuum absorption coefficients and the absorption cross-sections for the trace molecular absorbers present in the model, as well as the fundamental system parameters taken from the published literature. The program then calculates the optical path through the planetary atmosphere of the radiation from the host star, and quantifies the absorption due to the modeled composition in a transmission spectrum of transit depth as a function of wavelength. The code is written in C++, parallelized using OpenMP, and is available for public download and use from http://www.ucl.ac.uk/exoplanets/.

  12. User's manual for the Heat Pipe Space Radiator design and analysis Code (HEPSPARC)

    NASA Technical Reports Server (NTRS)

    Hainley, Donald C.

    1991-01-01

    A heat pipe space radiator code (HEPSPARC) was written for the NASA Lewis Research Center and is used for the design and analysis of a radiator that is constructed from a pumped fluid loop that transfers heat to the evaporator section of heat pipes. This manual is designed to familiarize the user with this new code and to serve as a reference for its use. This manual documents the completed work and is intended to be the first step towards verification of the HEPSPARC code. Details are furnished to provide a description of all the requirements and variables used in the design and analysis of a combined pumped loop/heat pipe radiator system. A description of the subroutines used in the program is furnished for those interested in understanding its detailed workings.

  13. Development of a coupling code for PWR reactor cavity radiation streaming calculation

    SciTech Connect

    Zheng, Z.; Wu, H.; Cao, L.; Zheng, Y.; Zhang, H.; Wang, M.

    2012-07-01

    PWR reactor cavity radiation streaming is important for the safety of personnel and equipment; thus, calculations have to be performed to evaluate the neutron flux distribution around the reactor. For this calculation, the deterministic codes have difficulties in fine geometrical modeling and need huge computer resources, and the Monte Carlo codes require very long sampling times to obtain results with acceptable precision. Therefore, a coupling method has been developed to eliminate these two problems in each code. In this study, we develop a coupling code named DORT2MCNP to link the Sn code DORT and the Monte Carlo code MCNP. DORT2MCNP is used to produce a combined surface source containing top, bottom and side surfaces simultaneously. Because the SDEF card is unsuitable for the combined surface source, we modify the SOURCE subroutine of MCNP and compile MCNP for this application. Numerical results demonstrate the correctness of the coupling code DORT2MCNP and show reasonable agreement between the coupling method and the other two codes (DORT and MCNP). (authors)
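
    A combined surface source of the kind DORT2MCNP writes must be sampled in proportion to the outgoing partial current on each surface (top, bottom, side). A schematic sketch of just that selection step (not the modified MCNP SOURCE subroutine, which also samples position, energy, and angle):

```python
import random

def sample_surface(currents, rng=None):
    """Choose a source surface with probability proportional to its
    outgoing partial current, given e.g.
    {"top": J_top, "bottom": J_bot, "side": J_side}."""
    rng = rng or random
    total = sum(currents.values())
    r = rng.uniform(0.0, total)
    acc = 0.0
    for name, current in currents.items():
        acc += current
        if r <= acc:
            return name
    return name  # guard against floating-point edge cases
```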

  14. FURN3D: A computer code for radiative heat transfer in pulverized coal furnaces

    SciTech Connect

    Ahluwalia, R.K.; Im, K.H.

    1992-08-01

    A computer code FURN3D has been developed for assessing the impact of burning different coals on heat absorption pattern in pulverized coal furnaces. The code is unique in its ability to conduct detailed spectral calculations of radiation transport in furnaces fully accounting for the size distributions of char, soot and ash particles, ash content, and ash composition. The code uses a hybrid technique of solving the three-dimensional radiation transport equation for absorbing, emitting and anisotropically scattering media. The technique achieves an optimal mix of computational speed and accuracy by combining the discrete ordinate method (S[sub 4]), modified differential approximation (MDA) and P[sub 1] approximation in different ranges of optical thickness. The code uses spectroscopic data for estimating the absorption coefficients of participating gases CO[sub 2], H[sub 2]O and CO. It invokes Mie theory for determining the extinction and scattering coefficients of combustion particulates. The optical constants of char, soot and ash are obtained from dispersion relations derived from reflectivity, transmissivity and extinction measurements. A control-volume formulation is adopted for determining the temperature field inside the furnace. A simple char burnout model is employed for estimating heat release and evolution of particle size distribution. The code is written in Fortran 77, has modular form, and is machine-independent. The computer memory required by the code depends upon the number of grid points specified and whether the transport calculations are performed on spectral or gray basis.
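
    When a radiative transfer code of this kind runs on a gray rather than spectral basis, the spectral absorption coefficients must be collapsed to a single gray value; a standard choice is the Planck mean, weighting by the blackbody spectrum at the local temperature. A sketch of that reduction (constants and grids illustrative, not FURN3D's data):

```python
import numpy as np

H_OVER_K = 4.7992e-11   # Planck constant over Boltzmann constant, s*K

def planck_mean_kappa(nu, kappa_nu, T):
    """Gray (Planck-mean) absorption coefficient:
    kappa_P = int(kappa_nu * B_nu dnu) / int(B_nu dnu),
    with B_nu the Planck spectrum at temperature T."""
    B = nu**3 / np.expm1(H_OVER_K * nu / T)   # Planck shape, unnormalized
    w = np.gradient(nu)                       # simple quadrature weights
    return float(np.sum(kappa_nu * B * w) / np.sum(B * w))
```

    A spectrally constant coefficient is returned unchanged, and a frequency-dependent one is averaged toward its values near the Planck peak, which is why gray calculations are cheapest but least faithful in strongly non-gray gases.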

  16. Three-dimensional radiation dose mapping with the TORT computer code

    SciTech Connect

    Slater, C.O.; Pace, J.V. III; Childs, R.L.; Haire, M.J. ); Koyama, T. )

    1991-01-01

    The Consolidated Fuel Reprocessing Program (CFRP) at Oak Ridge National Laboratory (ORNL) has performed radiation shielding studies in support of various facility designs for many years. Computer codes employing the point-kernel method have been used, and the accuracy of these codes is within acceptable limits. However, to further improve the accuracy and to calculate dose at a larger number of locations, a higher order method is desired, even for analyses performed in the early stages of facility design. Consequently, the three-dimensional discrete ordinates transport code TORT, developed at ORNL in the mid-1980s, was selected to examine in detail the dose received at equipment locations. The capabilities of the code have been previously reported. Recently, the Power Reactor and Nuclear Fuel Development Corporation in Japan and the US Department of Energy have used the TORT code as part of a collaborative agreement to jointly develop breeder reactor fuel reprocessing technology. In particular, CFRP used the TORT code to estimate radiation dose levels within the main process cell for a conceptual plant design and to establish process equipment lifetimes. The results reported in this paper are for a conceptual plant design that included the mechanical head end (i.e., the disassembly and shear machines), solvent extraction equipment, and miscellaneous process support equipment.
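
    The point-kernel method that TORT supersedes in these studies estimates the dose from each source point as inverse-square attenuation with a buildup factor. A textbook-style sketch of the kernel for a single point source (all factors illustrative; a real code sums over many kernels and uses tabulated buildup data):

```python
import math

def point_kernel_dose_rate(S, mu, r, buildup=1.0, k=1.0):
    """Point-kernel estimate: dose = k * S * B * exp(-mu*r) / (4*pi*r^2).
    S: source strength (particles/s), mu: linear attenuation
    coefficient (1/cm), r: source-to-detector distance (cm),
    buildup: buildup factor B accounting for scattered radiation,
    k: flux-to-dose conversion factor."""
    return k * S * buildup * math.exp(-mu * r) / (4.0 * math.pi * r * r)
```

    The method is fast but only approximate in complex geometries, which is the motivation the abstract gives for moving to a discrete ordinates code.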

  17. HADES code for numerical simulations of high-mach number astrophysical radiative flows

    NASA Astrophysics Data System (ADS)

    Michaut, C.; Di Menza, L.; Nguyen, H. C.; Bouquet, S. E.; Mancini, M.

    2017-03-01

    The understanding of astrophysical phenomena requires robust numerical tools in order to handle realistic scales in terms of energy, characteristic lengths and Mach number that cannot be easily reproduced by means of laboratory experiments. In this paper, we present the 2D numerical code HADES for the simulation of realistic astrophysical phenomena in various contexts, first taking into account radiative losses. The version of HADES including a multigroup modeling of radiative transfer will be presented in a forthcoming study. Validation of HADES is performed using several benchmark tests and some realistic applications are discussed. Optically thin radiative loss is modeled by a cooling function in the conservation law of energy. Numerical methods involve the MUSCL-Hancock finite volume scheme as well as HLLC and HLLE Riemann solvers, coupled with a second-order ODE solver by means of a Strang splitting algorithm that handles source terms arising from geometrical or radiative contributions, for cartesian or axisymmetric configurations. Good agreement has been observed for all benchmark tests, in both hydrodynamic and radiative cases. Furthermore, an overview of the main astrophysical studies driven with this code is proposed. First, simulations of radiative shocks in accretion columns and of supernova remnant dynamics at large timescales, including the Vishniac instability, have improved the understanding of these phenomena. Finally, astrophysical jets are investigated and the influence of the cooling effect on the jet morphology is numerically demonstrated. It is also found that a periodic source recovers pulsating jets that mimic the structure of Herbig-Haro objects. The HADES code has revealed its robustness, especially for the wall-shock test and for the so-called implosion test, which turns out to be a severe one since the hydrodynamic variables are self-similar and become infinite at finite time.
The simulations have proved the efficiency of the code.
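The splitting scheme described in the abstract, a finite-volume transport step coupled to an ODE sub-step for the radiative source term, can be sketched in 1D. Everything below is a minimal stand-in (first-order upwind advection and a simple exponential cooling law), not HADES's actual implementation:

```python
import numpy as np

def advect_step(u, dt, c, dx):
    """Transport sub-step stand-in: first-order upwind advection (c > 0)."""
    return u - c * dt / dx * (u - np.roll(u, 1))

def cool_step(e, dt, lam=0.1):
    """Source sub-step: optically thin radiative loss de/dt = -lam*e,
    integrated exactly over dt (stand-in for the second-order ODE solver)."""
    return e * np.exp(-lam * dt)

def strang_step(e, dt, c, dx):
    """Strang splitting: half source step, full transport, half source step.
    The symmetric arrangement keeps the scheme second order in time."""
    e = cool_step(e, 0.5 * dt)
    e = advect_step(e, dt, c, dx)
    return cool_step(e, 0.5 * dt)

# Advect and cool a Gaussian energy pulse on a periodic grid.
x = np.linspace(0.0, 1.0, 200, endpoint=False)
e0 = np.exp(-((x - 0.3) / 0.05) ** 2)
dx, c = x[1] - x[0], 1.0
dt = 0.4 * dx
e = e0.copy()
for _ in range(100):
    e = strang_step(e, dt, c, dx)
```

Since the periodic advection step conserves total energy exactly, the final total energy equals the initial total times the analytic cooling factor, which makes the splitting easy to verify.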

  18. Development of a GPU Compatible Version of the Fast Radiation Code RRTMG

    NASA Astrophysics Data System (ADS)

    Iacono, M. J.; Mlawer, E. J.; Berthiaume, D.; Cady-Pereira, K. E.; Suarez, M.; Oreopoulos, L.; Lee, D.

    2012-12-01

    The absorption of solar radiation and emission/absorption of thermal radiation are crucial components of the physics that drive Earth's climate and weather. Therefore, accurate radiative transfer calculations are necessary for realistic climate and weather simulations. Efficient radiation codes have been developed for this purpose, but their accuracy requirements still necessitate that as much as 30% of the computational time of a GCM is spent computing radiative fluxes and heating rates. The overall computational expense constitutes a limitation on a GCM's predictive ability if it becomes an impediment to adding new physics to or increasing the spatial and/or vertical resolution of the model. The emergence of Graphics Processing Unit (GPU) technology, which will allow the parallel computation of multiple independent radiative calculations in a GCM, will lead to a fundamental change in the competition between accuracy and speed. Processing time previously consumed by radiative transfer will now be available for the modeling of other processes, such as physics parameterizations, without any sacrifice in the accuracy of the radiative transfer. Furthermore, fast radiation calculations can be performed much more frequently and will allow the modeling of radiative effects of rapid changes in the atmosphere. The fast radiation code RRTMG, developed at Atmospheric and Environmental Research (AER), is utilized operationally in many dynamical models throughout the world. We will present the results from the first stage of an effort to create a version of the RRTMG radiation code designed to run efficiently in a GPU environment. This effort will focus on the RRTMG implementation in GEOS-5. 
RRTMG has an internal pseudo-spectral vector of length of order 100 that, when combined with the much greater length of the global horizontal grid vector from which the radiation code is called in GEOS-5, makes RRTMG/GEOS-5 particularly suited to achieving a significant speed improvement.

  19. Monte Carlo Code System for High-Energy Radiation Transport Calculations.

    SciTech Connect

    FILGES, DETLEF

    2000-02-16

    Version 00 HERMES-KFA consists of a set of Monte Carlo codes used to simulate particle radiation and interaction with matter. The main codes are HETC, MORSE, and EGS. They are supported by a common geometry package, common random routines, a command interpreter, and auxiliary codes like NDEM that is used to generate a gamma-ray source from nuclear de-excitation after spallation processes. The codes have been modified so that any particle history falling outside the domain of the physical theory of one program can be submitted to another program in the suite to complete the work. Also, response data can be submitted by each program, to be collected and combined by a statistics package included within the command interpreter.

  20. A public code for general relativistic, polarised radiative transfer around spinning black holes

    NASA Astrophysics Data System (ADS)

    Dexter, Jason

    2016-10-01

    Ray tracing radiative transfer is a powerful method for comparing theoretical models of black hole accretion flows and jets with observations. We present a public code, GRTRANS, for carrying out such calculations in the Kerr metric, including the full treatment of polarised radiative transfer and parallel transport along geodesics. The code is written in FORTRAN 90 and efficiently parallelises with OPENMP, and the full code and several components have PYTHON interfaces. We describe several tests which are used for verifying the code, and we compare the results for polarised thin accretion disc and semi-analytic jet problems with those from the literature as examples of its use. Along the way, we provide accurate fitting functions for polarised synchrotron emission and transfer coefficients from thermal and power-law distribution functions, and compare results from numerical integration and quadrature solutions of the polarised radiative transfer equations. We also show that all transfer coefficients can play an important role in predicted images and polarisation maps of the Galactic centre black hole, Sgr A*, at submillimetre wavelengths.

  1. European Code against Cancer 4th Edition: Ionising and non-ionising radiation and cancer.

    PubMed

    McColl, Neil; Auvinen, Anssi; Kesminiene, Ausrele; Espina, Carolina; Erdmann, Friederike; de Vries, Esther; Greinert, Rüdiger; Harrison, John; Schüz, Joachim

    2015-12-01

    Ionising radiation can transfer sufficient energy to ionise molecules, and this can lead to chemical changes, including DNA damage in cells. Key evidence for the carcinogenicity of ionising radiation comes from: follow-up studies of the survivors of the atomic bombings in Japan; other epidemiological studies of groups that have been exposed to radiation from medical, occupational or environmental sources; experimental animal studies; and studies of cellular responses to radiation. Considering exposure to environmental ionising radiation, inhalation of naturally occurring radon is the major source of radiation in the population - in doses orders of magnitude higher than those from nuclear power production or nuclear fallout. Indoor exposure to radon and its decay products is an important cause of lung cancer; radon may cause approximately one in ten lung cancers in Europe. Exposures to radon in buildings can be reduced via a three-step process of identifying those with potentially elevated radon levels, measuring radon levels, and reducing exposure by installation of remediation systems. In the 4th Edition of the European Code against Cancer it is therefore recommended to: "Find out if you are exposed to radiation from naturally high radon levels in your home. Take action to reduce high radon levels". Non-ionising types of radiation (those with insufficient energy to ionise molecules) - including extremely low-frequency electric and magnetic fields as well as radiofrequency electromagnetic fields - are not an established cause of cancer and are therefore not addressed in the recommendations to reduce cancer risk.

  2. A Radiation Chemistry Code Based on the Green's Function of the Diffusion Equation

    NASA Technical Reports Server (NTRS)

    Plante, Ianik; Wu, Honglu

    2014-01-01

    Stochastic radiation track structure codes are of great interest for space radiation studies and hadron therapy in medicine. These codes are used for many purposes, notably for microdosimetry and DNA damage studies. In the last two decades, they were also used with the Independent Reaction Times (IRT) method in the simulation of chemical reactions, to calculate the yield of various radiolytic species produced during the radiolysis of water and in chemical dosimeters. Recently, we have developed a Green's function based code to simulate reversible chemical reactions with an intermediate state, which yielded results in excellent agreement with those obtained by using the IRT method. This code was also used to simulate the interaction of particles with membrane receptors. We are in the process of including this program for use with the Monte-Carlo track structure code Relativistic Ion Tracks (RITRACKS). This recent addition should greatly expand the capabilities of RITRACKS, notably to simulate DNA damage by both the direct and indirect effect.
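Propagating species by sampling the free-space Green's function of the diffusion equation, as such codes do between reaction checks, amounts to drawing Gaussian displacements with variance 2*D*dt per axis. The sketch below uses illustrative values for the diffusion coefficient and time step:

```python
import numpy as np

def propagate(positions, D, dt, rng):
    """Advance diffusing species by sampling the free-space Green's
    function of the diffusion equation: a Gaussian with variance
    2*D*dt along each coordinate axis."""
    sigma = np.sqrt(2.0 * D * dt)
    return positions + rng.normal(0.0, sigma, positions.shape)

rng = np.random.default_rng(42)
D, dt = 2.8e-9, 1.0e-12      # illustrative diffusion coefficient (m^2/s), step (s)
p0 = np.zeros((200_000, 3))  # all species start at the origin
p1 = propagate(p0, D, dt, rng)
# In 3D the mean squared displacement after one step should approach 6*D*dt.
msd = np.mean(np.sum(p1 ** 2, axis=1))
```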

  3. A unified radiative magnetohydrodynamics code for lightning-like discharge simulations

    SciTech Connect

    Chen, Qiang; Chen, Bin; Xiong, Run; Cai, Zhaoyang; Chen, P. F.

    2014-03-15

    A two-dimensional Eulerian finite difference code is developed for solving the non-ideal magnetohydrodynamic (MHD) equations including the effects of self-consistent magnetic field, thermal conduction, resistivity, gravity, and radiation transfer, which, when combined with specified pulse current models and plasma equations of state, can be used as a unified lightning return stroke solver. The differential equations are written in covariant form in cylindrical geometry and kept in conservative form, which enables high-accuracy shock-capturing schemes to be applied to the lightning channel configuration naturally. In this code, the fifth-order weighted essentially non-oscillatory scheme combined with the Lax-Friedrichs flux splitting method is introduced for computing the convection terms of the MHD equations. A third-order total variation diminishing Runge-Kutta integrator is used to maintain consistent time-space accuracy. The numerical algorithms for non-ideal terms, e.g., artificial viscosity, resistivity, and thermal conduction, are introduced in the code via an operator splitting method. The code assumes the radiation is in local thermodynamic equilibrium with the plasma components, and the flux-limited diffusion algorithm with grey opacities is implemented for computing the radiation transfer. The transport coefficients and equation of state in this code are obtained from detailed particle population distribution calculations, which makes the numerical model self-consistent. The code is systematically validated via the Sedov blast solutions and then used for lightning return stroke simulations with peak currents of 20 kA, 30 kA, and 40 kA. The results show that this numerical model is consistent with observations and previous numerical results. The population distribution evolution and energy conservation problems are also discussed.
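The time integrator named in the abstract, the third-order TVD (Shu-Osher) Runge-Kutta scheme combined with Lax-Friedrichs flux splitting, can be sketched for a scalar advection law. First-order upwinding of the split fluxes stands in here for the fifth-order WENO reconstruction, and the test problem is ours, not the paper's:

```python
import numpy as np

def lf_split_rhs(u, c, dx):
    """Semi-discrete RHS for u_t + c*u_x = 0 using Lax-Friedrichs flux
    splitting f = f+ + f-, each part upwinded first-order (the paper's
    fifth-order WENO reconstruction is omitted for brevity)."""
    alpha = abs(c)                  # maximum wave speed
    fp = 0.5 * (c * u + alpha * u)  # f+: right-going, differenced from the left
    fm = 0.5 * (c * u - alpha * u)  # f-: left-going, differenced from the right
    dfp = fp - np.roll(fp, 1)
    dfm = np.roll(fm, -1) - fm
    return -(dfp + dfm) / dx

def tvd_rk3_step(u, dt, rhs):
    """Shu-Osher third-order TVD Runge-Kutta step (convex combination of
    forward Euler stages, hence total variation diminishing)."""
    u1 = u + dt * rhs(u)
    u2 = 0.75 * u + 0.25 * (u1 + dt * rhs(u1))
    return u / 3.0 + 2.0 / 3.0 * (u2 + dt * rhs(u2))

x = np.linspace(0.0, 1.0, 200, endpoint=False)
u = np.where(np.abs(x - 0.5) < 0.1, 1.0, 0.0)  # square pulse
dx, c = x[1] - x[0], 1.0
dt = 0.4 * dx                                  # CFL = 0.4
u0_sum = u.sum()
for _ in range(50):
    u = tvd_rk3_step(u, dt, lambda v: lf_split_rhs(v, c, dx))
```

The conservative differencing keeps the discrete total exactly constant on the periodic grid, and the TVD property keeps the solution inside its initial bounds, both easy to check.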

  4. Simulation of high-energy radiation belt electron fluxes using NARMAX-VERB coupled codes.

    PubMed

    Pakhotin, I P; Drozdov, A Y; Shprits, Y Y; Boynton, R J; Subbotin, D A; Balikhin, M A

    2014-10-01

    This study presents a fusion of data-driven and physics-driven methodologies of energetic electron flux forecasting in the outer radiation belt. Data-driven NARMAX (Nonlinear AutoRegressive Moving Averages with eXogenous inputs) model predictions for geosynchronous orbit fluxes have been used as an outer boundary condition to drive the physics-based Versatile Electron Radiation Belt (VERB) code, to simulate energetic electron fluxes in the outer radiation belt environment. The coupled system has been tested for three extended time periods totalling several weeks of observations. The time periods involved periods of quiet, moderate, and strong geomagnetic activity and captured a range of dynamics typical of the radiation belts. The model has successfully simulated energetic electron fluxes for various magnetospheric conditions. Physical mechanisms that may be responsible for the discrepancies between the model results and observations are discussed.

  5. Simulation of high-energy radiation belt electron fluxes using NARMAX-VERB coupled codes

    PubMed Central

    Pakhotin, I P; Drozdov, A Y; Shprits, Y Y; Boynton, R J; Subbotin, D A; Balikhin, M A

    2014-01-01

    This study presents a fusion of data-driven and physics-driven methodologies of energetic electron flux forecasting in the outer radiation belt. Data-driven NARMAX (Nonlinear AutoRegressive Moving Averages with eXogenous inputs) model predictions for geosynchronous orbit fluxes have been used as an outer boundary condition to drive the physics-based Versatile Electron Radiation Belt (VERB) code, to simulate energetic electron fluxes in the outer radiation belt environment. The coupled system has been tested for three extended time periods totalling several weeks of observations. The time periods involved periods of quiet, moderate, and strong geomagnetic activity and captured a range of dynamics typical of the radiation belts. The model has successfully simulated energetic electron fluxes for various magnetospheric conditions. Physical mechanisms that may be responsible for the discrepancies between the model results and observations are discussed. PMID:26167432

  6. Milagro Version 2 An Implicit Monte Carlo Code for Thermal Radiative Transfer: Capabilities, Development, and Usage

    SciTech Connect

    T.J. Urbatsch; T.M. Evans

    2006-02-15

    We have released Version 2 of Milagro, an object-oriented, C++ code that performs radiative transfer using Fleck and Cummings' Implicit Monte Carlo method. Milagro, a part of the Jayenne program, is a stand-alone driver code used as a methods research vehicle and to verify its underlying classes. These underlying classes are used to construct Implicit Monte Carlo packages for external customers. Milagro-2 represents a design overhaul that allows better parallelism and extensibility. New features in Milagro-2 include verified momentum deposition, restart capability, graphics capability, exact energy conservation, and improved load balancing and parallel efficiency. A users' guide also describes how to configure, make, and run Milagro-2.

  7. DOPEX-1D2C: A one-dimensional, two-constraint radiation shield optimization code

    NASA Technical Reports Server (NTRS)

    Lahti, G. P.

    1973-01-01

    A one-dimensional, two-constraint radiation shield weight optimization procedure and a computer program, DOPEX-1D2C, are described. DOPEX-1D2C uses the steepest descent method to alter a set of initial (input) thicknesses of a spherical shield configuration to achieve a minimum weight while simultaneously satisfying two dose-rate constraints. The code assumes an exponential dose-shield thickness relation with parameters specified by the user. Code input instructions, a FORTRAN-4 listing, and a sample problem are given. Typical computer time required to optimize a seven-layer shield is less than 1/2 minute on an IBM 7094.
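The steepest-descent idea can be illustrated with a single dose-rate constraint and a slab stand-in for the spherical shells. All material properties, the source dose, and the penalty parameters below are illustrative assumptions, not DOPEX defaults:

```python
import numpy as np

def dose(t, mu, d0=100.0):
    """Exponential dose/thickness relation of the kind DOPEX assumes
    (d0 and mu here are illustrative, user-style parameters)."""
    return d0 * np.exp(-np.dot(mu, t))

def weight(t, rho):
    """Areal weight of the layer stack (slab stand-in for spherical shells)."""
    return np.dot(rho, t)

def optimize(t, rho, mu, d0=100.0, d_limit=1.0, lr=2e-4, k=1000.0, iters=50000):
    """Steepest descent on weight plus a quadratic penalty on the log of
    the dose-rate violation; thicknesses projected to stay non-negative."""
    for _ in range(iters):
        g = np.log(d0 / d_limit) - np.dot(mu, t)   # log(dose / limit)
        grad = rho - 2.0 * k * max(g, 0.0) * mu    # grad of weight + k*g_+^2
        t = np.maximum(t - lr * grad, 0.0)
    return t

rho = np.array([19.3, 0.93])  # illustrative layer densities (g/cm^3)
mu = np.array([1.5, 0.25])    # illustrative attenuation coefficients (1/cm)
t_opt = optimize(np.array([1.0, 10.0]), rho, mu)
```

The descent drives the stack toward the layer with the better density-to-attenuation ratio while riding the dose constraint; DOPEX-1D2C does the analogous thing in spherical geometry with two simultaneous constraints.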

  8. Evaluation of Error-Correcting Codes for Radiation-Tolerant Memory

    NASA Astrophysics Data System (ADS)

    Jeon, S.; Vijaya Kumar, B. V. K.; Hwang, E.; Cheng, M. K.

    2010-05-01

    In space, radiation particles can introduce temporary or permanent errors in memory systems. To protect against potential memory faults, either thick shielding or error-correcting codes (ECC) are used by memory modules. Thick shielding translates into increased mass, and conventional ECCs designed for memories are typically capable of correcting only a single error and detecting a double error. Decoding is usually performed through hard decisions where bits are treated as either correct or flipped in polarity. We demonstrate that low-density parity-check (LDPC) codes that are already prevalent in many communication applications can also be used to protect memories in space. Because the achievable code rate monotonically decreases with time due to the accumulation of permanent errors, the achievable rate serves as a useful metric in designing an appropriate ECC. We describe how to compute soft symbol reliabilities on our channel and compare the performance of soft-decision decoding LDPC codes against conventional hard-decision decoding of Reed-Solomon (RS) codes and Bose-Chaudhuri-Hocquenghem (BCH) codes for a specific memory structure.
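For contrast with the paper's soft-decision LDPC approach, a conventional single-error-correcting memory ECC of the kind it compares against can be as small as a Hamming(7,4) code with hard-decision syndrome decoding:

```python
import numpy as np

# Systematic Hamming(7,4): 4 data bits plus 3 parity bits, correcting any
# single bit flip (the paper's LDPC/RS/BCH codes are far more powerful).
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def encode(data):
    """Map 4 data bits to a 7-bit codeword."""
    return data @ G % 2

def correct(word):
    """Hard-decision decoding: a nonzero syndrome equals the column of H
    at the position of the flipped bit."""
    syndrome = H @ word % 2
    if syndrome.any():
        col = np.argmax((H.T == syndrome).all(axis=1))
        word = word.copy()
        word[col] ^= 1
    return word

data = np.array([1, 0, 1, 1])
cw = encode(data)
hit = cw.copy()
hit[2] ^= 1              # a single radiation-induced bit flip
decoded = correct(hit)   # syndrome decoding restores the codeword
```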

  9. Intercomparison of three microwave/infrared high resolution line-by-line radiative transfer codes

    NASA Astrophysics Data System (ADS)

    Schreier, F.; Garcia, S. Gimeno; Milz, M.; Kottayil, A.; Höpfner, M.; von Clarmann, T.; Stiller, G.

    2013-05-01

    An intercomparison of three line-by-line (lbl) codes developed independently for atmospheric sounding - ARTS, GARLIC, and KOPRA - has been performed for a thermal infrared nadir sounding application assuming a HIRS-like (High resolution Infrared Radiation Sounder) setup. Radiances for the HIRS infrared channels and a set of 42 atmospheric profiles from the "Garand dataset" have been computed. Results of this intercomparison and a discussion of reasons of the observed differences are presented.

  10. HT-FRTC: a fast radiative transfer code using kernel regression

    NASA Astrophysics Data System (ADS)

    Thelen, Jean-Claude; Havemann, Stephan; Lewis, Warren

    2016-09-01

    The HT-FRTC is a principal component based fast radiative transfer code that can be used across the electromagnetic spectrum from the microwave through to the ultraviolet to calculate transmittance, radiance and flux spectra. The principal components cover the spectrum at a very high spectral resolution, which allows very fast line-by-line, hyperspectral and broadband simulations for satellite-based, airborne and ground-based sensors. The principal components are derived during a code training phase from line-by-line simulations for a diverse set of atmosphere and surface conditions. The derived principal components are sensor independent, i.e. no extra training is required to include additional sensors. During the training phase we also derive the predictors which are required by the fast radiative transfer code to determine the principal component scores from the monochromatic radiances (or fluxes, transmittances). These predictors are calculated for each training profile at a small number of frequencies, which are selected by a k-means cluster algorithm during the training phase. Until recently the predictors were calculated using a linear regression. However, during a recent rewrite of the code the linear regression was replaced by a Gaussian Process (GP) regression which resulted in a significant increase in accuracy when compared to the linear regression. The HT-FRTC has been trained with a large variety of gases, surface properties and scatterers. Rayleigh scattering as well as scattering by frozen/liquid clouds, hydrometeors and aerosols have all been included. The scattering phase function can be fully accounted for by an integrated line-by-line version of the Edwards-Slingo spherical harmonics radiation code or approximately by a modification to the extinction (Chou scaling).
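The training-then-prediction structure described above can be sketched in miniature: derive principal components from a training set, fit predictors at a small number of channels, then reconstruct full spectra from those channels alone. Synthetic data, evenly spaced predictor channels, and plain linear regression stand in for the code's line-by-line training set, k-means channel selection, and Gaussian-process fit:

```python
import numpy as np

rng = np.random.default_rng(0)
n_train, n_chan = 300, 500
grid = np.linspace(0.0, 1.0, n_chan)

def spectrum(p):
    """Synthetic 'line-by-line' spectrum driven by two atmospheric-like
    parameters (a stand-in for real training simulations)."""
    return 0.5 + p[0] * np.exp(-3.0 * grid) + p[1] * np.sin(2.0 * np.pi * grid)

params = rng.uniform(0.0, 1.0, (n_train, 2))
train = np.array([spectrum(p) for p in params])

# Training phase: principal components from an SVD of the centred spectra.
mean = train.mean(axis=0)
_, _, Vt = np.linalg.svd(train - mean, full_matrices=False)
pcs = Vt[:4]                       # leading principal components
scores = (train - mean) @ pcs.T    # PC scores of the training set

# Predictors: a handful of channels, fitted by linear least squares.
pred_idx = np.linspace(0, n_chan - 1, 20).astype(int)
A = np.hstack([train[:, pred_idx], np.ones((n_train, 1))])
coef, *_ = np.linalg.lstsq(A, scores, rcond=None)

def fast_spectrum(mono_values):
    """Fast phase: full spectrum from monochromatic values at the
    predictor channels only, via predicted PC scores."""
    z = np.append(mono_values, 1.0) @ coef
    return mean + z @ pcs

truth = spectrum(np.array([0.4, 0.7]))
approx = fast_spectrum(truth[pred_idx])
```

Because the synthetic family here is exactly low-dimensional, the reconstruction is essentially exact; on real spectra the accuracy rests on how well the components and predictors span the training variability.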

  11. New Particle-in-Cell Code for Numerical Simulation of Coherent Synchrotron Radiation

    SciTech Connect

    Terzic, Balsa; Li, Rui

    2010-05-01

    We present a first look at the new code for self-consistent, 2D simulations of beam dynamics affected by the coherent synchrotron radiation. The code is of the particle-in-cell variety: the beam bunch is sampled by point-charge particles, which are deposited on the grid; the corresponding forces on the grid are then computed using retarded potentials according to causality, and interpolated so as to advance the particles in time. The retarded potentials are evaluated by integrating over the 2D path history of the bunch, with the charge and current density at the retarded time obtained from interpolation of the particle distributions recorded at discrete timesteps. The code is benchmarked against analytical results obtained for a rigid-line bunch. We also outline the features and applications which are currently being developed.
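The deposit and interpolate steps of the particle-in-cell cycle can be sketched in 1D with linear (cloud-in-cell) weights; the retarded-potential field solve, which is the heart of the CSR code, is omitted:

```python
import numpy as np

def deposit(positions, charges, n_cells, dx):
    """Cloud-in-cell deposition: each point charge is shared linearly
    between its two neighbouring grid points (1D periodic sketch of the
    2D deposition step described above)."""
    rho = np.zeros(n_cells)
    cell = np.floor(positions / dx).astype(int)
    frac = positions / dx - cell
    np.add.at(rho, cell % n_cells, charges * (1.0 - frac))
    np.add.at(rho, (cell + 1) % n_cells, charges * frac)
    return rho / dx   # charge density

def gather(field, positions, dx):
    """Interpolate a grid field back to the particles with the same linear
    weights, as used to advance the particles in time."""
    cell = np.floor(positions / dx).astype(int)
    frac = positions / dx - cell
    n = len(field)
    return field[cell % n] * (1.0 - frac) + field[(cell + 1) % n] * frac

pos = np.array([0.25, 0.55, 0.90])
q = np.array([1.0, 1.0, -1.0])
rho = deposit(pos, q, n_cells=10, dx=0.1)
```

Using the same weights for deposit and gather keeps the scheme momentum-conserving, which is why PIC codes pair them.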

  12. Performance of the dot product function in radiative transfer code SORD

    NASA Astrophysics Data System (ADS)

    Korkin, Sergey; Lyapustin, Alexei; Sinyuk, Aliaksandr; Holben, Brent

    2016-10-01

    The successive orders of scattering radiative transfer (RT) codes frequently call the scalar (dot) product function. In this paper, we study the performance of several implementations of the dot product in the RT code SORD, using 50 scenarios for light scattering in the atmosphere-surface system. In the dot product function, we use the unrolled-loop technique with different unrolling factors. We also consider the intrinsic Fortran functions. We show results for two machines: the ifort compiler under Windows, and pgf90 under Linux. The intrinsic DOT_PRODUCT function showed the best performance for ifort. For pgf90, the dot product implemented with unrolling factor 4 was the fastest. The RT code SORD, together with the interface that runs all the mentioned tests, is publicly available from ftp://maiac.gsfc.nasa.gov/pub/skorkin/SORD_IP_16B (current release) or by email request from the corresponding (first) author.
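The unrolling technique being benchmarked looks like this (shown in Python only for illustration; the gains apply to compiled Fortran, where the independent partial sums reduce loop overhead and expose instruction-level parallelism to the compiler):

```python
def dot_unroll4(a, b):
    """Dot product with the loop unrolled by a factor of 4."""
    n = len(a)
    s0 = s1 = s2 = s3 = 0.0
    i = 0
    # Main unrolled loop: four independent partial sums per iteration.
    while i + 3 < n:
        s0 += a[i] * b[i]
        s1 += a[i + 1] * b[i + 1]
        s2 += a[i + 2] * b[i + 2]
        s3 += a[i + 3] * b[i + 3]
        i += 4
    # Remainder loop for lengths not divisible by 4.
    while i < n:
        s0 += a[i] * b[i]
        i += 1
    return s0 + s1 + s2 + s3
```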

  13. Comparison of radiation spectra from selected source-term computer codes

    SciTech Connect

    Brady, M.C.; Hermann, O.W.; Wilson, W.B.

    1989-04-01

    This report compares the radiation spectra and intensities predicted by three radionuclide inventory/depletion codes, ORIGEN2, ORIGEN-S, and CINDER-2. The comparisons were made for a series of light-water reactor models (including three pressurized-water reactors (PWR) and two boiling-water reactors (BWR)) at cooling times ranging from 30 d to 100 years. The work presented here complements the results described in an earlier report that discusses in detail the three depletion codes, the various reactor models, and the comparison by nuclide of the inventories, activities, and decay heat predictions by nuclide for the three codes. In this report, the photon production rates from fission product nuclides and actinides were compared as well as the total photon production rates and energy spectra. Very good agreement was observed in the photon source terms predicted by ORIGEN2 and ORIGEN-S. The absence of bremsstrahlung radiation in the CINDER-2 calculations resulted in large differences in both the production rates and spectra in comparison with the ORIGEN2 and ORIGEN-S results. A comparison of the CINDER-2 photon production rates with an ORIGEN-S calculation neglecting bremsstrahlung radiation showed good agreement. An additional discrepancy was observed in the photon spectra predicted from the CINDER-2 calculations and has been attributed to the absence of spectral data for {sup 144}Pr in those calculations. 12 refs., 26 figs., 36 tabs.

  14. SKIRT: An advanced dust radiative transfer code with a user-friendly architecture

    NASA Astrophysics Data System (ADS)

    Camps, P.; Baes, M.

    2015-03-01

    We discuss the architecture and design principles that underpin the latest version of SKIRT, a state-of-the-art open source code for simulating continuum radiation transfer in dusty astrophysical systems, such as spiral galaxies and accretion disks. SKIRT employs the Monte Carlo technique to emulate the relevant physical processes including scattering, absorption and emission by the dust. The code features a wealth of built-in geometries, radiation source spectra, dust characterizations, dust grids, and detectors, in addition to various mechanisms for importing snapshots generated by hydrodynamical simulations. The configuration for a particular simulation is defined at run-time through a user-friendly interface suitable for both occasional and power users. These capabilities are enabled by careful C++ code design. The programming interfaces between components are well defined and narrow. Adding a new feature is usually as simple as adding another class; the user interface automatically adjusts to allow configuring the new options. We argue that many scientific codes, like SKIRT, can benefit from careful object-oriented design and from a friendly user interface, even if it is not a graphical user interface.
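The extensibility pattern the authors describe, where adding a feature is adding a class and the configuration interface discovers it automatically, can be sketched with a registry. This is a Python stand-in for SKIRT's C++ class hierarchy, and the component names are invented:

```python
import math

# Components register themselves here; the configuration layer only ever
# consults this mapping, so new classes need no other wiring.
REGISTRY = {}

def register(cls):
    """Class decorator: adding a feature is just adding a decorated class."""
    REGISTRY[cls.__name__] = cls
    return cls

class Geometry:
    """Abstract base for density geometries."""
    def density(self, r):
        raise NotImplementedError

@register
class ExponentialDisc(Geometry):
    def __init__(self, scale=1.0):
        self.scale = scale
    def density(self, r):
        return math.exp(-r / self.scale)

@register
class Shell(Geometry):
    def __init__(self, r_in=1.0, r_out=2.0):
        self.r_in, self.r_out = r_in, r_out
    def density(self, r):
        return 1.0 if self.r_in <= r <= self.r_out else 0.0

def configure(name, **options):
    """The 'user interface': build any registered component by name,
    passing its run-time options through."""
    return REGISTRY[name](**options)

disc = configure("ExponentialDisc", scale=2.0)
```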

  15. Predicting radiative heat transfer in thermochemical nonequilibrium flow fields. Theory and user's manual for the LORAN code

    NASA Technical Reports Server (NTRS)

    Chambers, Lin Hartung

    1994-01-01

    The theory for radiation emission, absorption, and transfer in a thermochemical nonequilibrium flow is presented. The expressions developed reduce correctly to the limit at equilibrium. To implement the theory in a practical computer code, some approximations are used, particularly the smearing of molecular radiation. Details of these approximations are presented and helpful information is included concerning the use of the computer code. This user's manual should benefit both occasional users of the Langley Optimized Radiative Nonequilibrium (LORAN) code and those who wish to use it to experiment with improved models or properties.

  16. Predicting radiative heat transfer in thermochemical nonequilibrium flow fields. Theory and user's manual for the LORAN code

    NASA Astrophysics Data System (ADS)

    Chambers, Lin Hartung

    1994-09-01

    The theory for radiation emission, absorption, and transfer in a thermochemical nonequilibrium flow is presented. The expressions developed reduce correctly to the limit at equilibrium. To implement the theory in a practical computer code, some approximations are used, particularly the smearing of molecular radiation. Details of these approximations are presented and helpful information is included concerning the use of the computer code. This user's manual should benefit both occasional users of the Langley Optimized Radiative Nonequilibrium (LORAN) code and those who wish to use it to experiment with improved models or properties.

  17. Reanalysis and forecasting killer electrons in Earth's radiation belts using the VERB code

    NASA Astrophysics Data System (ADS)

    Kellerman, Adam; Kondrashov, Dmitri; Shprits, Yuri; Podladchikova, Tatiana; Drozdov, Alexander

    2016-07-01

    The Van Allen radiation belts are torus-shaped regions of trapped energetic particles which in recent years have become a principal focus for satellite operators and engineers. During geomagnetic storms, electrons can be accelerated up to relativistic energies, where they may penetrate spacecraft shielding and damage electrical systems, causing permanent damage or loss of spacecraft. Data assimilation provides an optimal way to combine observations of the radiation belts with a physics-based model in order to more accurately specify the global state of the Earth's radiation belts. We present recent advances to the data-assimilative version of the Versatile Electron Radiation Belt (VERB) code, including more sophisticated error analysis, and incorporation of realistic field models to more accurately specify fluxes at a given MLT or along a spacecraft trajectory. The effect of recent stream-interaction-region (SIR) driven enhancements is investigated using the improved model. We also present a real-time forecast model based on the data-assimilative VERB code, and discuss the forecast performance over the past 12 months.
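The data-assimilation step at the core of such reanalysis can be illustrated by a single scalar Kalman analysis, blending a model forecast with an observation according to their error covariances. This is a toy stand-in for the VERB scheme, which operates on a gridded phase-space density:

```python
def assimilate(x_model, p_model, y_obs, r_obs):
    """One Kalman analysis step: weight the physics-model forecast and the
    observation by their respective error covariances.

    x_model, p_model: forecast state and its error variance
    y_obs,  r_obs  : observation and its error variance
    """
    k = p_model / (p_model + r_obs)              # Kalman gain
    x_analysis = x_model + k * (y_obs - x_model) # pulled toward the data
    p_analysis = (1.0 - k) * p_model             # analysis is more certain
    return x_analysis, p_analysis

# Forecast log-flux of 2.0 with large model error, observation of 3.0 with
# small instrument error: the analysis should land close to the observation.
x_a, p_a = assimilate(x_model=2.0, p_model=0.9, y_obs=3.0, r_obs=0.1)
```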

  18. Radiation Coupling with the FUN3D Unstructured-Grid CFD Code

    NASA Technical Reports Server (NTRS)

    Wood, William A.

    2012-01-01

    The HARA radiation code is fully coupled to the FUN3D unstructured-grid CFD code for the purpose of simulating high-energy hypersonic flows. The radiation energy source terms and surface heat transfer, under the tangent slab approximation, are included within the fluid dynamic flow solver. The Fire II flight test, at the Mach-31, 1643-second trajectory point, is used as a demonstration case. Comparisons are made with an existing structured-grid capability, the LAURA/HARA coupling. The radiative surface heat transfer rates from the present approach match the benchmark values within 6%. Although radiation coupling is the focus of the present work, convective surface heat transfer rates are also reported, and are seen to vary depending upon the choice of mesh connectivity and FUN3D flux reconstruction algorithm. On a tetrahedral-element mesh the convective heating matches the benchmark at the stagnation point, but under-predicts by 15% on the Fire II shoulder. Conversely, on a mixed-element mesh the convective heating over-predicts at the stagnation point by 20%, but matches the benchmark away from the stagnation region.

  19. TIME-DEPENDENT MULTI-GROUP MULTI-DIMENSIONAL RELATIVISTIC RADIATIVE TRANSFER CODE BASED ON SPHERICAL HARMONIC DISCRETE ORDINATE METHOD

    SciTech Connect

    Tominaga, Nozomu; Shibata, Sanshiro; Blinnikov, Sergei I.

    2015-08-15

    We develop a time-dependent, multi-group, multi-dimensional relativistic radiative transfer code, which is required to numerically investigate radiation from relativistic fluids that are involved in, e.g., gamma-ray bursts and active galactic nuclei. The code is based on the spherical harmonic discrete ordinate method (SHDOM) which evaluates a source function including anisotropic scattering in spherical harmonics and implicitly solves the static radiative transfer equation with ray tracing in discrete ordinates. We implement treatments of time dependence, multi-frequency bins, Lorentz transformation, and elastic Thomson and inelastic Compton scattering to the publicly available SHDOM code. Our code adopts a mixed-frame approach; the source function is evaluated in the comoving frame, whereas the radiative transfer equation is solved in the laboratory frame. This implementation is validated using various test problems and comparisons with the results from a relativistic Monte Carlo code. These validations confirm that the code correctly calculates the intensity and its evolution in the computational domain. The code enables us to obtain an Eddington tensor that relates the first and third moments of intensity (energy density and radiation pressure) and is frequently used as a closure relation in radiation hydrodynamics calculations.
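The mixed-frame bookkeeping rests on the Lorentz invariance of I_nu/nu^3, which can be sketched directly. The Doppler-factor convention below is the standard one (mu is the lab-frame direction cosine); the numbers are illustrative:

```python
import math

def doppler_factor(beta, mu):
    """Relativistic Doppler factor for a fluid moving at speed beta (in
    units of c) and a photon with lab-frame direction cosine mu."""
    gamma = 1.0 / math.sqrt(1.0 - beta * beta)
    return 1.0 / (gamma * (1.0 - beta * mu))

def comoving_to_lab_intensity(i_com, nu_com, beta, mu):
    """Mixed-frame step: since I_nu / nu^3 is Lorentz invariant, the
    lab-frame specific intensity follows from the comoving one by a
    factor of the cubed frequency ratio."""
    d = doppler_factor(beta, mu)
    nu_lab = d * nu_com
    i_lab = i_com * (nu_lab / nu_com) ** 3   # = i_com * d**3
    return i_lab, nu_lab

# A photon emitted along the flow (mu = 1) from a fluid at beta = 0.5 is
# boosted by D = sqrt((1 + beta) / (1 - beta)) = sqrt(3).
i_lab, nu_lab = comoving_to_lab_intensity(i_com=1.0, nu_com=1.0, beta=0.5, mu=1.0)
```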

  20. Spectral and Structure Modeling of Low and High Mass Young Stars Using a Radiative Transfer Code

    NASA Astrophysics Data System (ADS)

    Robson Rocha, Will; Pilling, Sergio

    Spectroscopic data from space telescopes (ISO, Spitzer, Herschel) show that, in addition to dust grains (e.g., silicates), frozen molecular species (astrophysical ices such as H_{2}O, CO, CO_{2}, CH_{3}OH) are present in circumstellar environments. In this work we present a study of the modeling of low- and high-mass young stellar objects (YSOs), in which we highlight the importance of using astrophysical ices processed by radiation (UV, cosmic rays) from stars in the formation process. This is important for characterizing the physicochemical evolution of the ices distributed through the protostellar disk and its envelope in some situations. To perform this analysis, we gathered (i) observational data from the Infrared Space Observatory (ISO) related to the low-mass protostar Elias29 and the high-mass protostar W33A, (ii) absorbance experimental data in the infrared spectral range used to determine the optical constants of the materials observed around these objects, and (iii) a powerful radiative transfer code to simulate the astrophysical environment (RADMC-3D, Dullemond et al. 2012). Briefly, the radiative transfer calculation of the YSOs was done employing the RADMC-3D code. The model outputs were the spectral energy distribution and theoretical images at different wavelengths of the studied objects. The functionality of this code is based on the Monte Carlo methodology in addition to Mie theory for the interaction between radiation and matter. The observational data from different space telescopes were used as a reference for comparison with the modeled data. The optical constants in the infrared, used as input in the models, were calculated directly from absorbance data obtained in the laboratory for both unprocessed and processed simulated interstellar samples using the NKABS code (Rocha & Pilling 2014). We show from this study that some absorption bands in the infrared, observed in the spectra of Elias29 and W33A, can arise after the ices

  1. Comparison of Radiation Transport Codes, HZETRN, HETC and FLUKA, Using the 1956 Webber SPE Spectrum

    NASA Technical Reports Server (NTRS)

    Heinbockel, John H.; Slaba, Tony C.; Blattnig, Steve R.; Tripathi, Ram K.; Townsend, Lawrence W.; Handler, Thomas; Gabriel, Tony A.; Pinsky, Lawrence S.; Reddell, Brandon; Clowdsley, Martha S.; Singleterry, Robert C.; Norbury, John W.; Badavi, Francis F.; Aghara, Sukesh K.

    2009-01-01

    Protection of astronauts and instrumentation from galactic cosmic rays (GCR) and solar particle events (SPE) in the harsh environment of space is of prime importance in the design of personal shielding, spacecraft, and mission planning. Early entry of radiation constraints into the design process enables optimal shielding strategies, but demands efficient and accurate tools that can be used by design engineers in every phase of an evolving space project. The radiation transport code HZETRN is an efficient tool for analyzing the shielding effectiveness of materials exposed to space radiation. In this paper, HZETRN is compared to the Monte Carlo codes HETC-HEDS and FLUKA for a shield/target configuration comprised of a 20 g/cm^2 aluminum slab in front of a 30 g/cm^2 slab of water exposed to the February 1956 SPE, as modeled by the Webber spectrum. Neutron and proton fluence spectra, as well as dose and dose equivalent values, are compared at various depths in the water target. This study shows that there are many regions where HZETRN agrees with both HETC-HEDS and FLUKA for this shield/target configuration and SPE environment. However, there are also regions with appreciable differences between the three computer codes.

  2. The Development of the Ducted Fan Noise Propagation and Radiation Code CDUCT-LaRC

    NASA Technical Reports Server (NTRS)

    Nark, Douglas M.; Farassat, F.; Pope, D. Stuart; Vatsa, Veer

    2003-01-01

    The development of the ducted fan noise propagation and radiation code CDUCT-LaRC at NASA Langley Research Center is described. This code calculates the propagation and radiation of given acoustic modes ahead of the fan face or aft of the exhaust guide vanes in the inlet or exhaust ducts, respectively. This paper gives a description of the modules comprising CDUCT-LaRC. The grid generation module provides automatic creation of numerical grids for complex (non-axisymmetric) geometries that include single or multiple pylons. Files for performing automatic inviscid mean flow calculations are also generated within this module. The duct propagation module is based on the parabolic approximation theory of R. P. Dougherty, which allows the handling of complex internal geometries and the study of the effects of non-uniform (i.e. circumferentially and axially segmented) liners. Finally, the duct radiation module is based on the Ffowcs Williams-Hawkings (FW-H) equation with a penetrable data surface. Refraction of sound through the shear layer between the external flow and the bypass duct flow is included. Results for benchmark annular ducts, as well as other geometries with pylons, are presented and compared with available analytical data.

  3. GRAVE: An Interactive Geometry Construction and Visualization Software System for the TORT Nuclear Radiation Transport Code

    SciTech Connect

    Blakeman, E.D.

    2000-05-07

    A software system, GRAVE (Geometry Rendering and Visual Editor), has been developed at the Oak Ridge National Laboratory (ORNL) to perform interactive visualization and development of models used as input to the TORT three-dimensional discrete ordinates radiation transport code. Three-dimensional and two-dimensional visualization displays are included. Display capabilities include image rotation, zoom, translation, wire-frame and translucent display, geometry cuts and slices, and display of individual component bodies and material zones. The geometry can be interactively edited and saved in TORT input file format. This system is an advancement over the current, non-interactive, two-dimensional display software. GRAVE is programmed in the Java programming language and can be implemented on a variety of computer platforms. Three-dimensional visualization is enabled through the Visualization Toolkit (VTK), a freeware C++ software library developed for geometric and data visual display. Future plans include an extension of the system to read inputs using binary zone maps and combinatorial geometry models containing curved surfaces, such as those used for Monte Carlo code inputs. Also, GRAVE will be extended to geometry visualization/editing for the DORT two-dimensional transport code and will be integrated into a single GUI-based system for all of the ORNL discrete ordinates transport codes.

  4. Abstracts of digital computer code packages assembled by the Radiation Shielding Information Center

    SciTech Connect

    Carter, B.J.; Maskewitz, B.F.

    1985-04-01

    This publication, ORNL/RSIC-13, Volumes I to III Revised, has resulted from an internal audit of the first 168 packages of computing technology in the Computer Codes Collection (CCC) of the Radiation Shielding Information Center (RSIC). It replaces the earlier three documents published as single volumes between 1966 and 1972. A significant number of the early code packages were considered to be obsolete and were removed from the collection in the audit process, and their CCC numbers were not reassigned. Others not currently being used by the nuclear R and D community were retained in the collection to preserve technology not replaced by newer methods, or were considered of potential value for reference purposes. Much of the early technology, however, has improved through developer/RSIC/user interaction and continues at the forefront of the advancing state-of-the-art.

  5. Implementation, capabilities, and benchmarking of Shift, a massively parallel Monte Carlo radiation transport code

    DOE PAGES

    Pandya, Tara M.; Johnson, Seth R.; Evans, Thomas M.; ...

    2015-12-21

    This paper discusses the implementation, capabilities, and validation of Shift, a massively parallel Monte Carlo radiation transport package developed and maintained at Oak Ridge National Laboratory. It has been developed to scale well from laptop to small computing clusters to advanced supercomputers. Special features of Shift include hybrid capabilities for variance reduction such as CADIS and FW-CADIS, and advanced parallel decomposition and tally methods optimized for scalability on supercomputing architectures. Shift has been validated and verified against various reactor physics benchmarks and compares well to other state-of-the-art Monte Carlo radiation transport codes such as MCNP5, CE KENO-VI, and OpenMC. Some specific benchmarks used for verification and validation include the CASL VERA criticality test suite and several Westinghouse AP1000® problems. These benchmark and scaling studies show promising results.

  6. Implementation, capabilities, and benchmarking of Shift, a massively parallel Monte Carlo radiation transport code

    SciTech Connect

    Pandya, Tara M.; Johnson, Seth R.; Evans, Thomas M.; Davidson, Gregory G.; Hamilton, Steven P.; Godfrey, Andrew T.

    2015-12-21

    This paper discusses the implementation, capabilities, and validation of Shift, a massively parallel Monte Carlo radiation transport package developed and maintained at Oak Ridge National Laboratory. It has been developed to scale well from laptop to small computing clusters to advanced supercomputers. Special features of Shift include hybrid capabilities for variance reduction such as CADIS and FW-CADIS, and advanced parallel decomposition and tally methods optimized for scalability on supercomputing architectures. Shift has been validated and verified against various reactor physics benchmarks and compares well to other state-of-the-art Monte Carlo radiation transport codes such as MCNP5, CE KENO-VI, and OpenMC. Some specific benchmarks used for verification and validation include the CASL VERA criticality test suite and several Westinghouse AP1000® problems. These benchmark and scaling studies show promising results.

  7. A novel method involving Matlab coding to determine the distribution of a collimated ionizing radiation beam

    NASA Astrophysics Data System (ADS)

    Ioan, M.-R.

    2016-08-01

    In experiments involving ionizing radiation, precise knowledge of the relevant parameters is a very important task. Some of these experiments involve electromagnetic ionizing radiation such as gamma rays and X-rays; others make use of energetic charged or uncharged particles such as protons, electrons, and neutrons, and in other cases larger accelerated particles such as helium or deuterium nuclei are used. In all these cases, the beam used to hit an exposed target must first be collimated and precisely characterized. In this paper, a novel method involving Matlab coding is proposed to determine the distribution of the collimated beam. The method was implemented by placing Pyrex glass test samples in the beam whose distribution and dimensions are to be determined, taking high-quality pictures of them, and then digitally processing the resulting images. This method also yields information about the doses absorbed in the volume of the exposed samples.
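    The digital image-processing step described in this abstract can be sketched briefly. The following fragment uses Python/NumPy as a stand-in for the paper's Matlab code, with a synthetic Gaussian spot in place of a photograph and an assumed darkening threshold rather than the author's calibration:

```python
import numpy as np

def beam_profile(image, threshold=0.5):
    """Estimate the extent and intensity profile of a collimated beam from a
    grayscale image of an exposed glass sample (0 = clear, 1 = fully darkened).
    The 0.5 threshold is an illustrative assumption, not a calibrated value."""
    mask = image >= threshold        # pixels considered darkened by the beam
    profile_x = image.sum(axis=0)    # column-wise integrated darkening
    profile_y = image.sum(axis=1)    # row-wise integrated darkening
    return mask, profile_x, profile_y

# Synthetic stand-in for a photograph: a Gaussian beam spot on a 64x64 sample
y, x = np.mgrid[0:64, 0:64]
img = np.exp(-((x - 32) ** 2 + (y - 32) ** 2) / (2 * 8.0 ** 2))
mask, px, py = beam_profile(img)
diameter_px = mask.any(axis=0).sum()   # beam width, in pixels, at threshold
```

    The beam center falls where the integrated profiles peak, and the thresholded mask gives the beam's spatial extent in pixels, which a calibration (pixels per mm) would convert to physical dimensions.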

  8. HELIOS-CR: A 1-D radiation-magnetohydrodynamics code with inline atomic kinetics modeling

    NASA Astrophysics Data System (ADS)

    Macfarlane, J. J.; Golovkin, I. E.; Woodruff, P. R.

    2006-05-01

    HELIOS-CR is a user-oriented 1D radiation-magnetohydrodynamics code to simulate the dynamic evolution of laser-produced plasmas and z-pinch plasmas. It includes an in-line collisional-radiative (CR) model for computing non-LTE atomic level populations at each time step of the hydrodynamics simulation. HELIOS-CR has been designed for ease of use, and is well-suited for experimentalists, as well as graduate and undergraduate student researchers. The energy equations employed include models for laser energy deposition, radiation from external sources, and high-current discharges. Radiative transport can be calculated using either a multi-frequency flux-limited diffusion model, or a multi-frequency, multi-angle short characteristics model. HELIOS-CR supports the use of SESAME equation of state (EOS) tables, PROPACEOS EOS/multi-group opacity data tables, and non-LTE plasma properties computed using the inline CR modeling. Time-, space-, and frequency-dependent results from HELIOS-CR calculations are readily displayed with the HydroPLOT graphics tool. In addition, the results of HELIOS simulations can be post-processed using the SPECT3D Imaging and Spectral Analysis Suite to generate images and spectra that can be directly compared with experimental measurements. The HELIOS-CR package runs on Windows, Linux, and Mac OSX platforms, and includes online documentation. We will discuss the major features of HELIOS-CR, and present example results from simulations.
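    The multi-frequency flux-limited diffusion model mentioned above relies on a flux limiter that bridges the diffusion and free-streaming regimes. As an illustration only (the Levermore-Pomraning limiter form is assumed here; the abstract does not specify which limiter HELIOS-CR uses):

```python
import numpy as np

def limited_flux(e_rad, grad_e, opacity, c=3e10):
    """Radiative flux under flux-limited diffusion (1D, cgs units).  The
    limiter lambda(R) interpolates between the diffusion limit
    (lambda -> 1/3) and free streaming (|F| -> c*E).  The
    Levermore-Pomraning form is assumed here for illustration."""
    r = abs(grad_e) / (opacity * e_rad)       # dimensionless gradient strength
    lam = (1.0 / np.tanh(r) - 1.0 / r) / r    # Levermore-Pomraning limiter
    return -c * lam / opacity * grad_e

# Diffusion limit (small gradient): F ~ -(c / (3 * kappa)) * grad E
f_diff = limited_flux(1.0, 1e-6, 1.0)
# Streaming limit (huge gradient): |F| saturates near c * E
f_stream = abs(limited_flux(1.0, 1e6, 1.0))
```

    The limiter prevents the unphysical situation in which ordinary diffusion would transport radiation faster than the speed of light in optically thin regions.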

  9. Extension of the MURaM Radiative MHD Code for Coronal Simulations

    NASA Astrophysics Data System (ADS)

    Rempel, M.

    2017-01-01

    We present a new version of the MURaM radiative magnetohydrodynamics (MHD) code that allows for simulations spanning from the upper convection zone into the solar corona. We implement the relevant coronal physics in terms of optically thin radiative loss, field aligned heat conduction, and an equilibrium ionization equation of state. We artificially limit the coronal Alfvén and heat conduction speeds to computationally manageable values using an approximation to semi-relativistic MHD with an artificially reduced speed of light (Boris correction). We present example solutions ranging from quiet to active Sun in order to verify the validity of our approach. We quantify the role of numerical diffusivity for the effective coronal heating. We find that the (numerical) magnetic Prandtl number determines the ratio of resistive to viscous heating and that owing to the very large magnetic Prandtl number of the solar corona, heating is expected to happen predominantly through viscous dissipation. We find that reasonable solutions can be obtained with values of the reduced speed of light just marginally larger than the maximum sound speed. Overall this leads to a fully explicit code that can compute the time evolution of the solar corona in response to photospheric driving using numerical time steps not much smaller than 0.1 s. Numerical simulations of the coronal response to flux emergence covering a time span of a few days are well within reach using this approach.

  10. Specification and Prediction of the Radiation Environment Using Data Assimilative VERB code

    NASA Astrophysics Data System (ADS)

    Shprits, Yuri; Kellerman, Adam

    2016-07-01

    We discuss how data assimilation can be used for the reconstruction of long-term evolution and benchmarking of the physics-based codes, and how it can improve nowcasting and forecasting of the radiation belts and ring current. We also discuss advanced data assimilation methods such as parameter estimation and smoothing. We present a number of data assimilation applications using the VERB 3D code. The 3D data assimilative VERB allows us to blend together data from GOES, RBSP A, and RBSP B. 1) The model with data assimilation allows us to propagate data to different pitch angles, energies, and L-shells and blend them together with the physics-based VERB code in an optimal way. We illustrate how to use this capability for the analysis of previous events and for obtaining a global and statistical view of the system. 2) Model predictions strongly depend on the initial conditions: the model is only as good as the initial conditions it uses. To produce the best possible initial conditions, data from different sources (GOES, RBSP A and B, and our empirical model predictions based on ACE) are blended together in an optimal way by means of data assimilation, as described above. The resulting initial conditions have no gaps, which allows us to make more accurate predictions. A real-time prediction framework operating on our website, based on GOES, RBSP A and B, and ACE data and the 3D VERB code, is presented and discussed.
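    The "optimal blending" of model forecasts with multi-satellite observations described above is the standard Kalman analysis step. A minimal scalar sketch, with illustrative numbers rather than the operational 3D VERB scheme:

```python
def kalman_update(x_model, p_model, obs, r_obs):
    """One scalar Kalman analysis step: optimally blend a model forecast
    (mean x_model, variance p_model) with an observation (value obs,
    error variance r_obs)."""
    gain = p_model / (p_model + r_obs)        # weight given to the observation
    x_analysis = x_model + gain * (obs - x_model)
    p_analysis = (1.0 - gain) * p_model       # analysis uncertainty shrinks
    return x_analysis, p_analysis

# Blend a phase-space-density forecast with two satellite measurements
# (all numbers are made up for illustration)
x, p = 1.0, 0.5                               # model forecast and its variance
for obs, r in [(1.4, 0.25), (1.2, 0.25)]:     # e.g. RBSP A, then RBSP B
    x, p = kalman_update(x, p, obs, r)
# x is pulled toward the observations; p drops below both the forecast
# variance and each observation's variance
```

    Each successive observation reduces the analysis variance, which is why assimilating several satellites yields gap-free initial conditions with lower uncertainty than any single data source.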

  11. The FLUKA code: new developments and applications for radiation protection in deep space

    NASA Astrophysics Data System (ADS)

    Ferrari, A.; Fluka Collaboration

    The FLUKA code is used for dosimetry, radioprotection and physics simulations in several fields, ranging from accelerators to commercial flight dosimetry and space radiation. It is the code used for all radioprotection and dosimetry applications at CERN, in particular for the Large Hadron Collider project, which when operational in 2008 will accelerate protons up to 7 TeV and Pb ions up to 2.7 TeV/n. In recent years, the code underwent significant improvements in the treatment of heavy ion beams in an energy range covering therapy, space dosimetry and fundamental physics related to galactic cosmic rays and future LHC beams. The talk will present the latest developments in the modelling of nucleus-nucleus interactions, including the implementation of a model for ion electromagnetic dissociation. The talk will also include an application of FLUKA models to an Fe ion beam of interest for dosimetry and radiobiological applications and experiments. Various results obtained with the models, as well as several issues related to hadron beam dosimetry, will be presented and discussed.

  12. A Random Walk on WASP-12b with the Bayesian Atmospheric Radiative Transfer (BART) Code

    NASA Astrophysics Data System (ADS)

    Harrington, Joseph; Cubillos, Patricio; Blecic, Jasmina; Challener, Ryan; Rojo, Patricio; Lust, Nathaniel B.; Bowman, Oliver; Blumenthal, Sarah D.; Foster, Andrew S. D.; Foster, Austin James; Stemm, Madison; Bruce, Dylan

    2016-01-01

    We present the Bayesian Atmospheric Radiative Transfer (BART) code for atmospheric property retrievals from transit and eclipse spectra, and apply it to WASP-12b, a hot (~3000 K) exoplanet with a high eclipse signal-to-noise ratio. WASP-12b has been controversial. We (Madhusudhan et al. 2011, Nature) claimed it was the first planet with a high C/O abundance ratio. Line et al. (2014, ApJ) suggested a high CO2 abundance to explain the data. Stevenson et al. (2014, ApJ, atmospheric model by Madhusudhan) add additional data and reaffirm the original result, stating that C2H2 and HCN, not included in the Line et al. models, explain the data. We explore several modeling configurations and include Hubble, Spitzer, and ground-based eclipse data. BART consists of a differential-evolution Markov-Chain Monte Carlo sampler that drives a line-by-line radiative transfer code through the phase space of thermal- and abundance-profile parameters. BART is written in Python and C. Python modules generate atmospheric profiles from sets of MCMC parameters and integrate the resulting spectra over observational bandpasses, allowing high flexibility in modeling the planet without interacting with the fast, C portions that calculate the spectra. BART's shared memory and optimized opacity calculation allow it to run on a laptop, enabling classroom use. Runs can scale constant abundance profiles, profiles of thermochemical equilibrium abundances (TEA) calculated by the included TEA code, or arbitrary curves. Several thermal profile parameterizations are available. BART is an open-source, reproducible-research code. Users must release any code or data modifications if they publish results from it, and we encourage the community to use it and to participate in its development via http://github.com/ExOSPORTS/BART. This work was supported by NASA Planetary Atmospheres grant NNX12AI69G and NASA Astrophysics Data Analysis Program grant NNX13AF38G. J. Blecic holds a NASA Earth and Space Science
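    The differential-evolution MCMC sampler at the heart of BART can be illustrated on a toy two-parameter Gaussian posterior. This is a minimal sketch of the DE-MC algorithm (ter Braak 2006), not BART's actual implementation, and the posterior here replaces the expensive radiative transfer likelihood:

```python
import numpy as np

def log_post(theta):
    """Toy log-posterior: independent Gaussians with means (1, -2), sigma 0.5."""
    return -float(np.sum((theta - np.array([1.0, -2.0])) ** 2)) / (2 * 0.5 ** 2)

def demc(n_chains=10, n_steps=2000, n_dim=2, seed=0):
    """Differential-evolution MCMC: each chain proposes a jump along the
    difference of two other randomly chosen chains (ter Braak 2006)."""
    rng = np.random.default_rng(seed)
    gamma = 2.38 / np.sqrt(2 * n_dim)            # standard DE-MC step scale
    chains = rng.normal(size=(n_chains, n_dim))  # initial chain positions
    lp = np.array([log_post(c) for c in chains])
    samples = []
    for _ in range(n_steps):
        for i in range(n_chains):
            a, b = rng.choice([j for j in range(n_chains) if j != i],
                              size=2, replace=False)
            prop = chains[i] + gamma * (chains[a] - chains[b]) \
                 + rng.normal(scale=1e-4, size=n_dim)   # small jitter
            lp_prop = log_post(prop)
            if np.log(rng.random()) < lp_prop - lp[i]:  # Metropolis accept
                chains[i], lp[i] = prop, lp_prop
        samples.append(chains.copy())
    return np.concatenate(samples[n_steps // 2:])       # discard burn-in

posterior = demc()
means = posterior.mean(axis=0)    # should approach the true means (1, -2)
```

    The appeal of the DE approach is that the proposal scale and orientation adapt automatically to the posterior's covariance, which matters when thermal- and abundance-profile parameters are strongly correlated.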

  13. WASP-12b According to the Bayesian Atmospheric Radiative Transfer (BART) Code

    NASA Astrophysics Data System (ADS)

    Harrington, Joseph; Cubillos, Patricio E.; Blecic, Jasmina; Challener, Ryan C.; Rojo, Patricio M.; Lust, Nate B.; Bowman, M. Oliver; Blumenthal, Sarah D.; Foster, Andrew SD; Foster, A. J.

    2015-11-01

    We present the Bayesian Atmospheric Radiative Transfer (BART) code for atmospheric property retrievals from transit and eclipse spectra, and apply it to WASP-12b, a hot (~3000 K) exoplanet with a high eclipse signal-to-noise ratio. WASP-12b has been controversial. We (Madhusudhan et al. 2011, Nature) claimed it was the first planet with a high C/O abundance ratio. Line et al. (2014, ApJ) suggested a high CO2 abundance to explain the data. Stevenson et al. (2014, ApJ, atmospheric model by Madhusudhan) add additional data and reaffirm the original result, stating that C2H2 and HCN, not included in the Line et al. models, explain the data. We explore several modeling configurations and include Hubble, Spitzer, and ground-based eclipse data. BART consists of a differential-evolution Markov-Chain Monte Carlo sampler that drives a line-by-line radiative transfer code through the phase space of thermal- and abundance-profile parameters. BART is written in Python and C. Python modules generate atmospheric profiles from sets of MCMC parameters and integrate the resulting spectra over observational bandpasses, allowing high flexibility in modeling the planet without interacting with the fast, C portions that calculate the spectra. BART's shared memory and optimized opacity calculation allow it to run on a laptop, enabling classroom use. Runs can scale constant abundance profiles, profiles of thermochemical equilibrium abundances (TEA) calculated by the included TEA code, or arbitrary curves. Several thermal profile parameterizations are available. BART is an open-source, reproducible-research code. Users must release any code or data modifications if they publish results from it, and we encourage the community to use it and to participate in its development via http://github.com/ExOSPORTS/BART. This work was supported by NASA Planetary Atmospheres grant NNX12AI69G and NASA Astrophysics Data Analysis Program grant NNX13AF38G. J. Blecic holds a NASA Earth and Space Science

  14. A Radiation Chemistry Code Based on the Greens Functions of the Diffusion Equation

    NASA Technical Reports Server (NTRS)

    Plante, Ianik; Wu, Honglu

    2014-01-01

    Ionizing radiation produces several radiolytic species, such as •OH, e-aq, and H•, when interacting with biological matter. Following their creation, radiolytic species diffuse and chemically react with biological molecules such as DNA. Despite years of research, many questions on DNA damage by ionizing radiation remain, notably on the indirect effect, i.e. the damage resulting from the reactions of the radiolytic species with DNA. To simulate DNA damage by ionizing radiation, we are developing a step-by-step radiation chemistry code based on the Green's functions of the diffusion equation (GFDE), which is able to follow the trajectories of all particles and their reactions over time. In recent years, simulations based on the GFDE have been used extensively in biochemistry, notably to simulate biochemical networks in time and space, and are often used as the "gold standard" to validate diffusion-reaction theories. The exact GFDE for partially diffusion-controlled reactions is difficult to use because of its complex form; therefore, the radial Green's function, which is much simpler, is often used instead. Hence, much effort has been devoted to the sampling of the radial Green's function, for which we have developed a sampling algorithm. This algorithm only yields the length of the inter-particle distance vector after a time step; the sampling of the deviation angle of the inter-particle vector is not taken into consideration. In this work, we show that the radial distribution is predicted by the exact radial Green's function. We also use a technique developed by Clifford et al. to generate the inter-particle vector deviation angles, knowing the inter-particle vector length before and after a time step. The results are compared with those predicted by the exact GFDE and by the analytical angular functions for free diffusion.
This first step in the creation of the radiation chemistry code should help the understanding of the contribution of the indirect effect in the
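    For the free-diffusion case mentioned in the abstract, the Green's function of the diffusion equation reduces to a 3D Gaussian, so the inter-particle vector after a time step can be sampled directly. A minimal sketch with illustrative parameters (the partially diffusion-controlled case handled by the authors' algorithm is not treated here):

```python
import numpy as np

def step_pair(r_vec, d_rel, dt, rng):
    """Propagate the inter-particle vector of a freely diffusing pair over
    one time step: for free diffusion the Green's function of the diffusion
    equation is Gaussian, with variance 2*d_rel*dt per component."""
    return r_vec + rng.normal(scale=np.sqrt(2.0 * d_rel * dt), size=3)

rng = np.random.default_rng(1)
d_rel, dt = 5e-9, 1e-12            # relative diffusion coeff (m^2/s), step (s)
r0 = np.array([1e-9, 0.0, 0.0])    # initial 1 nm separation along x
r1 = np.array([step_pair(r0, d_rel, dt, rng) for _ in range(100000)])
msd = float(np.mean(np.sum(r1 ** 2, axis=1)))   # mean squared separation
# Analytic check for free diffusion: <|r1|^2> = |r0|^2 + 6*d_rel*dt
```

    Sampling the full displacement vector automatically produces both the new inter-particle distance and the deviation angle, which is why free diffusion serves as the reference against which radial-plus-angular sampling schemes are validated.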

  15. Review of Fast Monte Carlo Codes for Dose Calculation in Radiation Therapy Treatment Planning

    PubMed Central

    Jabbari, Keyvan

    2011-01-01

    An important requirement in radiation therapy is a fast and accurate treatment planning system. This system, using computed tomography (CT) data and the direction and characteristics of the beam, calculates the dose at all points of the patient's volume. The two main factors in a treatment planning system are accuracy and speed, and successive generations of treatment planning systems have been developed according to these factors. This article is a review of fast Monte Carlo treatment planning algorithms, which are accurate and fast at the same time. Monte Carlo techniques are based on the transport of each individual particle (e.g., photon or electron) in the tissue, using the physics of the interaction of the particles with matter; other techniques transport the particles as a group. For a typical dose calculation in radiation therapy the code has to transport several million particles, which takes a few hours; therefore, Monte Carlo techniques are accurate but slow for clinical use. In recent years, with the development of 'fast' Monte Carlo systems, one is able to perform dose calculation in a reasonable time for clinical use; the acceptable time for dose calculation is in the range of one minute. There is currently a growing interest in fast Monte Carlo treatment planning systems, and there are many commercial treatment planning systems that perform dose calculation in radiation therapy based on the Monte Carlo technique. PMID:22606661

  16. A study of the earth radiation budget using a 3D Monte-Carlo radiative transfer code

    NASA Astrophysics Data System (ADS)

    Okata, M.; Nakajima, T.; Sato, Y.; Inoue, T.; Donovan, D. P.

    2013-12-01

    The purpose of this study is to evaluate the earth's radiation budget when data are available from the satellite-borne active sensors, i.e. the cloud profiling radar (CPR) and lidar, and the multi-spectral imager (MSI) of the Earth Explorer/EarthCARE mission. For this purpose, we first developed forward and backward 3D Monte Carlo radiative transfer codes that can treat broadband solar flux calculations, including thermal infrared emission, using the k-distribution parameters of Sekiguchi and Nakajima (2008). To construct the 3D cloud field, we tried the following three methods: 1) stochastic clouds generated from a randomized optical-thickness distribution in each layer, together with regularly distributed tilted clouds; 2) numerical simulations with a non-hydrostatic model with a bin cloud microphysics model; and 3) the Minimum cloud Information Deviation Profiling Method (MIDPM), explained later. For method 2 (numerical modeling), we employed simulation results for Californian summer stratus clouds from a non-hydrostatic atmospheric model with a bin-type cloud microphysics model based on the JMA NHM model (Iguchi et al., 2008; Sato et al., 2009, 2012), with horizontal (vertical) grid spacings of 100 m (20 m) and 300 m (20 m) in a domain of 30 km (x) by 30 km (y) by 1.5 km (z) and a horizontally periodic lateral boundary condition. Two different cell systems were simulated depending on the cloud condensation nuclei (CCN) concentration. In the case of 100 m horizontal resolution, the regionally averaged cloud optical thickness (COT) and the standard deviation of COT were 3.0 and 4.3 for the pristine case and 8.5 and 7.4 for the polluted case, respectively. In the MIDPM method, we first construct a library of pairs of observed vertical profiles from the active sensors and collocated imager products at the nadir footprint, i.e. spectral imager radiances, cloud optical thickness (COT), effective particle radius (RE), and cloud top temperature (Tc). We then select a
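    Method 1 above (stochastic clouds built from randomized per-layer optical thickness) can be paired with a deliberately simplified Monte Carlo estimate of direct-beam transmittance. The following sketch ignores scattering and uses made-up layer statistics, so it illustrates only the sampling idea, not the full 3D broadband code:

```python
import numpy as np

def mc_transmittance(tau_layers, n_photons, rng):
    """Direct-beam transmittance through a layered cloud by Monte Carlo:
    each photon draws an exponential optical path -ln(U) and survives if it
    exceeds the total column optical thickness (pure extinction, no
    scattering -- a deliberately simplified toy)."""
    tau_total = float(np.sum(tau_layers))
    paths = -np.log(rng.random(n_photons))   # exponentially distributed paths
    return float(np.mean(paths > tau_total))

rng = np.random.default_rng(7)
# Randomized optical thickness per layer (made-up gamma statistics)
tau_layers = rng.gamma(shape=2.0, scale=0.1, size=10)
t_mc = mc_transmittance(tau_layers, 100000, rng)
t_exact = float(np.exp(-np.sum(tau_layers)))   # Beer-Lambert reference
```

    In this scattering-free limit the Monte Carlo estimate converges to the Beer-Lambert transmittance, which provides a basic sanity check before scattering, 3D geometry, and thermal emission are added.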

  17. CEM2k and LAQGSM Codes as Event-Generators for Space Radiation Shield and Cosmic Rays Propagation Applications

    NASA Technical Reports Server (NTRS)

    Mashnik, S. G.; Gudima, K. K.; Sierk, A. J.; Moskalenko, I. V.

    2002-01-01

    Space radiation shield applications and studies of cosmic ray propagation in the Galaxy require reliable cross sections to calculate spectra of secondary particles and yields of the isotopes produced in nuclear reactions induced both by particles and nuclei at energies from threshold to hundreds of GeV per nucleon. Since the data often exist only in a very limited energy range, or sometimes not at all, the only way to obtain an estimate of the production cross sections is to use theoretical models and codes. Recently, we have developed improved versions of the Cascade-Exciton Model (CEM) of nuclear reactions: the codes CEM97 and CEM2k, for the description of particle-nucleus reactions at energies up to about 5 GeV. In addition, we have developed a LANL version of the Quark-Gluon String Model (LAQGSM) to describe reactions induced both by particles and nuclei at energies up to hundreds of GeV/nucleon. We have tested and benchmarked the CEM and LAQGSM codes against a large variety of experimental data and have compared their results with predictions by other currently available models and codes. Our benchmarks show that the CEM and LAQGSM codes have predictive power no worse than other currently used codes and describe many reactions better; therefore both our codes can be used as reliable event-generators for space radiation shield and cosmic ray propagation applications. The CEM2k code is being incorporated into the transport code MCNPX (and several other transport codes), and we plan to incorporate LAQGSM into MCNPX in the near future. Here, we present the current status of the CEM2k and LAQGSM codes, and show results and applications to studies of cosmic ray propagation in the Galaxy.

  18. Updating the Tools Used to Estimate Space Radiation Exposures for Operations: Codes, Models and Interfaces

    NASA Astrophysics Data System (ADS)

    Zapp, E.; Shelfer, T.; Semones, E.; Johnson, A.; Weyland, M.; Golightly, M.; Smith, G.; Dardano, C.

    increase in speed due to "in-lining" calculations and restructuring the algorithms in a manner that calls for fewer elemental calculations, as well as time saved through better interfaces with geometry models and code input routines. The overall result is to enhance the radiation protection capabilities available for manned space flight.

  19. Code System for Calculating Radiation Exposure Resulting from Accidental Radioactive Releases to the Hydrosphere.

    SciTech Connect

    1982-11-18

    Version 00 LPGS was developed to calculate the radiological impacts resulting from radioactive releases to the hydrosphere. The name LPGS was derived from the Liquid Pathway Generic Study for which the original code was used primarily as an analytic tool in the assessment process. The hydrosphere is represented by the following types of water bodies: estuary, small river, well, lake, and one-dimensional (1-D) river. LPGS is designed to calculate radiation dose (individual and population) to body organs as a function of time for the various exposure pathways. The radiological consequences to the aquatic biota are estimated. Several simplified radionuclide transport models are employed with built-in formulations to describe the release rate of the radionuclides. A tabulated user-supplied release model can be input, if desired. Printer plots of dose versus time for the various exposure pathways are provided.

  20. Development and validation of a GEANT4 radiation transport code for CT dosimetry.

    PubMed

    Carver, D E; Kost, S D; Fernald, M J; Lewis, K G; Fraser, N D; Pickens, D R; Price, R R; Stabin, M G

    2015-04-01

    The authors have created a radiation transport code using the GEANT4 Monte Carlo toolkit to simulate pediatric patients undergoing CT examinations. The focus of this paper is to validate their simulation with real-world physical dosimetry measurements using two independent techniques. Exposure measurements were made with a standard 100-mm CT pencil ionization chamber, and absorbed doses were also measured using optically stimulated luminescent (OSL) dosimeters. Measurements were made in air with a standard 16-cm acrylic head phantom and with a standard 32-cm acrylic body phantom. Physical dose measurements determined from the ionization chamber in air for 100 and 120 kVp beam energies were used to derive photon-fluence calibration factors. Both ion chamber and OSL measurement results provide useful comparisons in the validation of the Monte Carlo simulations. It was found that simulated and measured CTDI values were within an overall average of 6% of each other.

  1. Implementation, capabilities, and benchmarking of Shift, a massively parallel Monte Carlo radiation transport code

    NASA Astrophysics Data System (ADS)

    Pandya, Tara M.; Johnson, Seth R.; Evans, Thomas M.; Davidson, Gregory G.; Hamilton, Steven P.; Godfrey, Andrew T.

    2016-03-01

    This work discusses the implementation, capabilities, and validation of Shift, a massively parallel Monte Carlo radiation transport package authored at Oak Ridge National Laboratory. Shift has been developed to scale well from laptops to small computing clusters to advanced supercomputers and includes features such as support for multiple geometry and physics engines, hybrid capabilities for variance reduction methods such as the Consistent Adjoint-Driven Importance Sampling methodology, advanced parallel decompositions, and tally methods optimized for scalability on supercomputing architectures. The scaling studies presented in this paper demonstrate good weak and strong scaling behavior for the implemented algorithms. Shift has also been validated and verified against various reactor physics benchmarks, including the Consortium for Advanced Simulation of Light Water Reactors' Virtual Environment for Reactor Analysis criticality test suite and several Westinghouse AP1000® problems presented in this paper. These benchmark results compare well to those from other contemporary Monte Carlo codes such as MCNP5 and KENO.

  2. The Monte Carlo code MCPTV--Monte Carlo dose calculation in radiation therapy with carbon ions.

    PubMed

    Karg, Juergen; Speer, Stefan; Schmidt, Manfred; Mueller, Reinhold

    2010-07-07

    The Monte Carlo code MCPTV is presented. MCPTV is designed for dose calculation in treatment planning in radiation therapy with particles and especially carbon ions. MCPTV has a voxel-based concept and can perform a fast calculation of the dose distribution on patient CT data. Material and density information from CT are taken into account. Electromagnetic and nuclear interactions are implemented. Furthermore, the algorithm gives information about the particle spectra and the energy deposition in each voxel. This can be used to calculate the relative biological effectiveness (RBE) for each voxel. Depth dose distributions are compared to experimental data, showing good agreement. A clinical example is shown to demonstrate the capabilities of the MCPTV dose calculation.
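The voxel-based concept amounts to binning energy depositions into a 3-D grid and dividing by voxel mass to obtain dose. A minimal sketch of such a tally (illustrative only, not MCPTV's implementation; the names and units are assumptions):

```python
import numpy as np

def tally_dose(positions, edep, grid_shape, voxel_size, density):
    """Accumulate energy depositions (MeV) at 3-D positions (cm) into a
    voxel grid and convert to dose per voxel (MeV/g), assuming a uniform
    material density (g/cm^3)."""
    dose = np.zeros(grid_shape)
    idx = np.floor(positions / voxel_size).astype(int)
    for (i, j, k), e in zip(idx, edep):
        if all(0 <= n < s for n, s in zip((i, j, k), grid_shape)):
            dose[i, j, k] += e
    voxel_mass = density * voxel_size ** 3  # grams
    return dose / voxel_mass

positions = np.array([[0.1, 0.1, 0.1], [1.5, 0.2, 0.2]])  # cm
edep = np.array([2.0, 4.0])                                # MeV
dose = tally_dose(positions, edep, (4, 4, 4), 1.0, 1.0)
print(dose[0, 0, 0], dose[1, 0, 0])  # 2.0 4.0
```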

  3. An object-oriented implementation of a parallel Monte Carlo code for radiation transport

    NASA Astrophysics Data System (ADS)

    Santos, Pedro Duarte; Lani, Andrea

    2016-05-01

    This paper describes the main features of a state-of-the-art Monte Carlo solver for radiation transport which has been implemented within COOLFluiD, a world-class open source object-oriented platform for scientific simulations. The Monte Carlo code makes use of efficient ray tracing algorithms (for 2D, axisymmetric and 3D arbitrary unstructured meshes) which are described in detail. The solver accuracy is first verified on test cases for which analytical solutions are available, then validated against a space re-entry flight experiment (FIRE II) for which comparisons against both experiments and reference numerical solutions are provided. Through the flexible design of the physical models, ray tracing and parallelization strategy (fully reusing the mesh decomposition inherited from the fluid simulator), the implementation was made efficient and reusable.

  4. Comparison of spent-fuel cask radiation doses calculated by one- and two-dimensional shielding codes

    SciTech Connect

    Carbajo, J.J. )

    1992-01-01

    Spent-fuel cask shield design and calculation of radiation doses are major parts of the overall cask design. This paper compares radiation doses calculated by one-dimensional and by two- or three-dimensional shielding codes. The paper also investigates the appropriateness of using one-dimensional codes for two-dimensional geometries. From these results, it can be concluded that the one-dimensional XSDRNPM/XSDOSE codes are adequate for both radial and axial shielding calculations if appropriate bucklings are used. For radial calculations, either no buckling or a buckling equal to the length of the fuel is appropriate. For axial calculations, a buckling at least equal to the diameter of the cask must be used for neutron doses. For gamma axial doses, a buckling around the diameter of the fuel region is adequate. More complicated two- or three-dimensional codes are not needed for these types of problems.

  5. Acoustic radiation force impulse (ARFI) imaging of zebrafish embryo by high-frequency coded excitation sequence.

    PubMed

    Park, Jinhyoung; Lee, Jungwoo; Lau, Sien Ting; Lee, Changyang; Huang, Ying; Lien, Ching-Ling; Kirk Shung, K

    2012-04-01

    Acoustic radiation force impulse (ARFI) imaging has been developed as a non-invasive method for quantitative imaging of tissue stiffness or displacement. Conventional ARFI imaging (2-10 MHz) has been implemented in commercial scanners for depicting the elastic properties of several organs. The image resolution, however, is too coarse to study the mechanical properties of micro-sized objects such as cells. This article therefore presents a high-frequency coded excitation ARFI technique, with the ultimate goal of displaying the elastic characteristics of cellular structures. Tissue-mimicking phantoms and zebrafish embryos are imaged with a 100-MHz lithium niobate (LiNbO₃) transducer, by cross-correlating tracked RF echoes with the reference. The phantom results show that the contrast of the coded-excitation ARFI image (14 dB) is better than that of the conventional ARFI image (9 dB); the depths of penetration are 2.6 and 2.2 mm, respectively. The stiffness data for the zebrafish demonstrate that the envelope is harder than the embryo region; the peak displacements at the embryo and the chorion are as large as 36 and 3.6 μm, respectively. Consequently, this high-frequency ARFI approach may serve as a remote palpation imaging tool that reveals the viscoelastic properties of small biological samples.
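Displacement tracking by cross-correlating tracked RF echoes with a reference can be sketched in one dimension as follows (synthetic signals and integer-sample lags only; this is not the authors' processing chain):

```python
import numpy as np

def estimate_shift(reference, tracked):
    """Estimate the integer-sample displacement between two RF lines
    from the peak of their cross-correlation."""
    xcorr = np.correlate(tracked, reference, mode="full")
    return int(np.argmax(xcorr) - (len(reference) - 1))

# Synthetic RF echo: a Gaussian-windowed tone, delayed by 5 samples.
n = np.arange(256)
ref = np.exp(-((n - 100) / 10.0) ** 2) * np.cos(2 * np.pi * 0.2 * n)
trk = np.roll(ref, 5)
print(estimate_shift(ref, trk))  # 5
```

In practice sub-sample displacements are obtained by interpolating around the correlation peak; the integer-lag version above shows only the core idea.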

  6. Development of Parallel Computing Framework to Enhance Radiation Transport Code Capabilities for Rare Isotope Beam Facility Design

    SciTech Connect

    Kostin, Mikhail; Mokhov, Nikolai; Niita, Koji

    2013-09-25

    A parallel computing framework has been developed for use with general-purpose radiation transport codes. The framework was implemented as a C++ module that uses MPI for message passing. It is intended to be used with older radiation transport codes implemented in Fortran 77, Fortran 90 or C. The module is largely independent of the radiation transport code it is coupled to, and is connected to the codes by means of a number of interface functions. The framework was developed and tested in conjunction with the MARS15 code. It can be used with other codes such as PHITS, FLUKA and MCNP after certain adjustments. Besides the parallel computing functionality, the framework offers a checkpoint facility that allows calculations to be restarted from a saved checkpoint file. The checkpoint facility can be used in single-process calculations as well as in the parallel regime. The framework corrects some of the known scheduling and load-balancing problems found in the original implementations of the parallel computing functionality in MARS15 and PHITS. The framework can be used efficiently on homogeneous systems and on networks of workstations, where interference from other users is possible.
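A checkpoint facility of the kind described can be as simple as periodically serializing the tally and random-number state so that an interrupted run resumes where it left off. A single-process sketch (a hypothetical design, not the framework's actual C++/MPI interface):

```python
import os
import pickle
import random

def run_histories(n_total, checkpoint="ckpt.pkl", batch=1000):
    """Run n_total particle histories, writing a checkpoint after every
    batch so the calculation can resume from the saved state."""
    if os.path.exists(checkpoint):
        with open(checkpoint, "rb") as f:
            done, tally, rng_state = pickle.load(f)
        random.setstate(rng_state)  # resume the random-number stream
    else:
        done, tally = 0, 0.0
    while done < n_total:
        for _ in range(min(batch, n_total - done)):
            tally += random.random()  # stand-in for one particle history
            done += 1
        with open(checkpoint, "wb") as f:
            pickle.dump((done, tally, random.getstate()), f)
    return tally / n_total

random.seed(42)
print(run_histories(5000))  # mean of uniform samples, close to 0.5
```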

  7. ODYSSEY: A PUBLIC GPU-BASED CODE FOR GENERAL RELATIVISTIC RADIATIVE TRANSFER IN KERR SPACETIME

    SciTech Connect

    Pu, Hung-Yi; Younsi, Ziri

    2016-04-01

    General relativistic radiative transfer calculations coupled with the calculation of geodesics in the Kerr spacetime are an essential tool for determining the images, spectra, and light curves from matter in the vicinity of black holes. Such studies are especially important for ongoing and upcoming millimeter/submillimeter very long baseline interferometry observations of the supermassive black holes at the Galactic Center (Sgr A*) and in M87. To this end we introduce Odyssey, a graphics processing unit (GPU) based code for ray tracing and radiative transfer in the Kerr spacetime. On a single GPU, the performance of Odyssey can exceed 1 ns per photon, per Runge–Kutta integration step. Odyssey is publicly available, fast, accurate, and flexible enough to be modified to suit the specific needs of new users. Along with a Graphical User Interface powered by a video-accelerated display architecture, we also present an educational software tool, Odyssey_Edu, for showing in real time how null geodesics around a Kerr black hole vary as a function of black hole spin and angle of incidence onto the black hole.
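The per-step cost quoted above refers to fixed-step Runge-Kutta integration of photon trajectories. A generic classical RK4 step, exercised here on a harmonic-oscillator test problem rather than the Kerr geodesic equations, can be sketched as:

```python
import numpy as np

def rk4_step(f, y, t, h):
    """One classical fourth-order Runge-Kutta step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# Test problem: harmonic oscillator (x, v). The exact orbit is a circle
# in phase space, so x^2 + v^2 should stay very close to 1.
f = lambda t, y: np.array([y[1], -y[0]])
y = np.array([1.0, 0.0])
for i in range(1000):
    y = rk4_step(f, y, i * 0.01, 0.01)
print(y[0] ** 2 + y[1] ** 2)  # ≈ 1
```

In a geodesic integrator, f would evaluate the right-hand side of the geodesic equations in Kerr coordinates instead of the oscillator force.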

  8. European Code against Cancer 4th Edition: Ultraviolet radiation and cancer.

    PubMed

    Greinert, Rüdiger; de Vries, Esther; Erdmann, Friederike; Espina, Carolina; Auvinen, Anssi; Kesminiene, Ausrele; Schüz, Joachim

    2015-12-01

    Ultraviolet radiation (UVR) is part of the electromagnetic spectrum emitted naturally from the sun or from artificial sources such as tanning devices. Acute skin reactions induced by UVR exposure are erythema (skin reddening), or sunburn, and the acquisition of a suntan triggered by UVR-induced DNA damage. UVR exposure is the main cause of skin cancer, including cutaneous malignant melanoma, basal-cell carcinoma, and squamous-cell carcinoma. Skin cancer is the most common cancer in fair-skinned populations, and its incidence has increased steeply over recent decades. According to estimates for 2012, about 100,000 new cases of cutaneous melanoma and about 22,000 deaths from it occurred in Europe. The main mechanisms by which UVR causes cancer are well understood. Exposure during childhood appears to be particularly harmful. Exposure to UVR is a risk factor modifiable by individuals' behaviour. Excessive exposure from natural sources can be avoided by seeking shade when the sun is strongest, by wearing appropriate clothing, and by appropriately applying sunscreens if direct sunlight is unavoidable. Exposure from artificial sources can be completely avoided by not using sunbeds. Beneficial effects of sun or UVR exposure, such as for vitamin D production, can be fully achieved while still avoiding too much sun exposure and the use of sunbeds. Taking all the scientific evidence together, the recommendation of the 4th edition of the European Code Against Cancer for ultraviolet radiation is: "Avoid too much sun, especially for children. Use sun protection. Do not use sunbeds."

  9. Odyssey: A Public GPU-based Code for General Relativistic Radiative Transfer in Kerr Spacetime

    NASA Astrophysics Data System (ADS)

    Pu, Hung-Yi; Yun, Kiyun; Younsi, Ziri; Yoon, Suk-Jin

    2016-04-01

    General relativistic radiative transfer calculations coupled with the calculation of geodesics in the Kerr spacetime are an essential tool for determining the images, spectra, and light curves from matter in the vicinity of black holes. Such studies are especially important for ongoing and upcoming millimeter/submillimeter very long baseline interferometry observations of the supermassive black holes at the Galactic Center (Sgr A*) and in M87. To this end we introduce Odyssey, a graphics processing unit (GPU) based code for ray tracing and radiative transfer in the Kerr spacetime. On a single GPU, the performance of Odyssey can exceed 1 ns per photon, per Runge-Kutta integration step. Odyssey is publicly available, fast, accurate, and flexible enough to be modified to suit the specific needs of new users. Along with a Graphical User Interface powered by a video-accelerated display architecture, we also present an educational software tool, Odyssey_Edu, for showing in real time how null geodesics around a Kerr black hole vary as a function of black hole spin and angle of incidence onto the black hole.

  10. Implementation of tetrahedral-mesh geometry in Monte Carlo radiation transport code PHITS.

    PubMed

    Furuta, Takuya; Sato, Tatsuhiko; Han, Min; Yeom, Yeon; Kim, Chan; Brown, Justin; Bolch, Wesley

    2017-04-04

    A new function to treat tetrahedral-mesh geometry was implemented in the Particle and Heavy Ion Transport code System (PHITS). To accelerate the computational speed of the transport process, an original algorithm was introduced that initially prepares decomposition maps for the container box of the tetrahedral-mesh geometry. The computational performance was tested by conducting radiation transport simulations of 100 MeV protons and 1 MeV photons in a water phantom represented by a tetrahedral mesh. The simulation was repeated with a varying number of mesh elements, and the required computational times were then compared with those of the conventional voxel representation. Our results show that the computational costs for each boundary crossing of the region mesh are essentially equivalent for the two representations. This study suggests that the tetrahedral-mesh representation offers not only a flexible description of the transport geometry but also an improvement in computational efficiency for radiation transport. Because tetrahedrons are adaptable in both size and shape, dosimetrically equivalent objects can be represented with far fewer mesh elements than in the corresponding voxelized representation. Our study additionally included dosimetric calculations using a computational human phantom. A significant acceleration of the computational speed, by about a factor of 4, was confirmed by the adoption of a tetrahedral mesh over the traditional voxel mesh geometry.
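Transporting particles through a tetrahedral mesh requires, among other things, deciding which tetrahedron contains a given point. A barycentric-coordinate sketch of that containment test (illustrative only; PHITS's decomposition-map algorithm is more elaborate):

```python
import numpy as np

def in_tetrahedron(p, verts):
    """True if point p lies inside (or on the boundary of) the tetrahedron
    with the given 4 vertices, via barycentric coordinates."""
    v0, v1, v2, v3 = (np.asarray(v, dtype=float) for v in verts)
    T = np.column_stack((v1 - v0, v2 - v0, v3 - v0))
    b = np.linalg.solve(T, np.asarray(p, dtype=float) - v0)
    bary = np.append(b, 1.0 - b.sum())  # the four barycentric coordinates
    return bool(np.all(bary >= 0.0))    # they sum to 1, so each is also <= 1

tet = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
print(in_tetrahedron((0.2, 0.2, 0.2), tet))  # True
print(in_tetrahedron((1, 1, 1), tet))        # False
```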

  11. Radiation environment at the Moon: Comparisons of transport code modeling and measurements from the CRaTER instrument

    NASA Astrophysics Data System (ADS)

    Porter, Jamie A.; Townsend, Lawrence W.; Spence, Harlan; Golightly, Michael; Schwadron, Nathan; Kasper, Justin; Case, Anthony W.; Blake, John B.; Zeitlin, Cary

    2014-06-01

    The Cosmic Ray Telescope for the Effects of Radiation (CRaTER), an instrument carried on the Lunar Reconnaissance Orbiter spacecraft, directly measures the energy depositions by solar and galactic cosmic radiations in its silicon wafer detectors. These energy depositions are converted to linear energy transfer (LET) spectra. High LET particles, which are mainly high-energy heavy ions found in the incident cosmic ray spectrum, or target fragments and recoils produced by protons and heavier ions, are of particular importance because of their potential to cause significant damage to human tissue and electronic components. Aside from providing LET data useful for space radiation risk analyses for lunar missions, the observed LET spectra can also be used to help validate space radiation transport codes, used for shielding design and risk assessment applications, which is a major thrust of this work. In this work the Monte Carlo transport code HETC-HEDS (High-Energy Transport Code-Human Exploration and Development in Space) is used to estimate LET contributions from the incident primary ions and their charged secondaries produced by nuclear collisions as they pass through the three pairs of silicon detectors. Also in this work, the contributions to the LET of the primary ions and their charged secondaries are analyzed and compared with estimates obtained using the deterministic space radiation code HZETRN 2010, developed at NASA Langley Research Center. LET estimates obtained from the two transport codes are compared with measurements of LET from the CRaTER instrument during the mission. Overall, a comparison of the LET predictions of the HETC-HEDS code to the predictions of the HZETRN code displays good agreement. The code predictions are also in good agreement with the CRaTER LET measurements above 15 keV/µm but differ from the measurements for smaller values of LET. 
A possible reason for this disagreement between measured and calculated spectra below 15 keV/µm is an
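The LET values discussed above are energy depositions per unit path length in the silicon detectors. The unit conversion can be sketched as follows (illustrative numbers, not CRaTER data):

```python
def let_kev_per_um(energy_dep_mev, path_length_cm):
    """Linear energy transfer from an energy deposition (MeV) over a
    path length (cm), returned in keV/um."""
    kev = energy_dep_mev * 1000.0
    um = path_length_cm * 1.0e4
    return kev / um

# A particle depositing 0.4 MeV while crossing a 1-mm silicon wafer:
print(let_kev_per_um(0.4, 0.1))  # 0.4 keV/um
```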

  12. Development of a Space Radiation Monte-Carlo Computer Simulation Based on the FLUKA and ROOT Codes

    NASA Technical Reports Server (NTRS)

    Pinsky, L. S.; Wilson, T. L.; Ferrari, A.; Sala, Paola; Carminati, F.; Brun, R.

    2001-01-01

    The radiation environment in space is a complex problem to model. Trying to extrapolate the projections of that environment into all areas of the internal spacecraft geometry is even more daunting. With the support of our CERN colleagues, our research group in Houston is embarking on a project to develop a radiation transport tool that is tailored to the problem of taking the external radiation flux incident on any particular spacecraft and simulating the evolution of that flux through a geometrically accurate model of the spacecraft material. The output will be a prediction of the detailed nature of the resulting internal radiation environment within the spacecraft as well as its secondary albedo. Beyond doing the physics transport of the incident flux, the software tool we are developing will provide a self-contained stand-alone object-oriented analysis and visualization infrastructure. It will also include a graphical user interface and a set of input tools to facilitate the simulation of space missions in terms of nominal radiation models and mission trajectory profiles. The goal of this project is to produce a code that is considerably more accurate and user-friendly than existing Monte-Carlo-based tools for the evaluation of the space radiation environment. Furthermore, the code will be an essential complement to the currently existing analytic codes in the BRYNTRN/HZETRN family for the evaluation of radiation shielding. The code will be directly applicable to the simulation of environments in low earth orbit, on the lunar surface, on planetary surfaces (including the Earth) and in the interplanetary medium such as on a transit to Mars (and even in the interstellar medium). The software will include modules whose underlying physics base can continue to be enhanced and updated for physics content, as future data become available beyond the timeframe of the initial development now foreseen. 
This future maintenance will be available from the authors of FLUKA as

  13. Radiation transport codes for potential applications related to radiobiology and radiotherapy using protons, neutrons, and negatively charged pions

    NASA Technical Reports Server (NTRS)

    Armstrong, T. W.

    1972-01-01

    Several Monte Carlo radiation transport computer codes are used to predict quantities of interest in the fields of radiotherapy and radiobiology. The calculational methods are described and comparisons of calculated and experimental results are presented for dose distributions produced by protons, neutrons, and negatively charged pions. Comparisons of calculated and experimental cell survival probabilities are also presented.

  14. FACET: a radiation view factor computer code for axisymmetric, 2D planar, and 3D geometries with shadowing

    SciTech Connect

    Shapiro, A.B.

    1983-08-01

    The computer code FACET calculates the radiation geometric view factor (alternatively called shape factor, angle factor, or configuration factor) between surfaces for axisymmetric, two-dimensional planar and three-dimensional geometries with interposed third surface obstructions. FACET was developed to calculate view factors for input to finite-element heat-transfer analysis codes. The first section of this report is a brief review of previous radiation-view-factor computer codes. The second section presents the defining integral equation for the geometric view factor between two surfaces and the assumptions made in its derivation. Also in this section are the numerical algorithms used to integrate this equation for the various geometries. The third section presents the algorithms used to detect self-shadowing and third-surface shadowing between the two surfaces for which a view factor is being calculated. The fourth section provides a user's input guide followed by several example problems.
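The defining double integral for the view factor, F12 = (1/A1) ∫∫ cosθ1 cosθ2 / (π r²) dA2 dA1, can also be estimated by Monte Carlo sampling rather than FACET's deterministic quadrature. A sketch for the unobstructed case of two directly opposed parallel unit squares (an assumption made for simplicity; shadowing is not handled here):

```python
import numpy as np

def view_factor_parallel_squares(h, n=200_000, seed=1):
    """Monte Carlo estimate of the view factor between two directly
    opposed parallel unit squares separated by distance h, sampling
    the kernel cos(t1)*cos(t2)/(pi r^2) with uniform points on both."""
    rng = np.random.default_rng(seed)
    p1 = rng.random((n, 2))  # points on square 1 (plane z = 0)
    p2 = rng.random((n, 2))  # points on square 2 (plane z = h)
    r2 = np.sum((p1 - p2) ** 2, axis=1) + h * h
    # For parallel surfaces cos(t1) = cos(t2) = h / r, so the kernel
    # reduces to h^2 / (pi * r^4); A1 = A2 = 1.
    return np.mean(h * h / (np.pi * r2 ** 2))

print(view_factor_parallel_squares(1.0))  # ≈ 0.1998 (the analytic value)
```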

  15. Radiation Protection Studies for Medical Particle Accelerators using Fluka Monte Carlo Code.

    PubMed

    Infantino, Angelo; Cicoria, Gianfranco; Lucconi, Giulia; Pancaldi, Davide; Vichi, Sara; Zagni, Federico; Mostacci, Domiziano; Marengo, Mario

    2016-11-24

    Radiation protection (RP) in the use of medical cyclotrons involves many aspects both in the routine use and for the decommissioning of a site. Guidelines for site planning and installation, as well as for RP assessment, are given in international documents; however, the latter typically offer analytic methods of calculation of shielding and materials activation, in approximate or idealised geometry set-ups. The availability of Monte Carlo (MC) codes with accurate up-to-date libraries for transport and interaction of neutrons and charged particles at energies below 250 MeV, together with the continuously increasing power of modern computers, makes the systematic use of simulations with realistic geometries possible, yielding equipment and site-specific evaluation of the source terms, shielding requirements and all quantities relevant to RP at the same time. In this work, the well-known FLUKA MC code was used to simulate different aspects of RP in the use of biomedical accelerators, particularly for the production of medical radioisotopes. In the context of the Young Professionals Award, held at the IRPA 14 conference, only a part of the complete work is presented. In particular, simulation of the GE PETtrace cyclotron (16.5 MeV) installed at the S. Orsola-Malpighi University Hospital was used to evaluate the effective dose distribution around the equipment; the effective number of neutrons produced per incident proton and their spectral distribution; the activation of the structure of the cyclotron and the vault walls; and the activation of the ambient air, in particular the production of ⁴¹Ar. The simulations were validated, in terms of physical and transport parameters to be used at the energy range of interest, through an extensive measurement campaign of the neutron environmental dose equivalent using a rem-counter and TLD dosemeters. The validated model was then used in the design and the licensing request of a new Positron Emission Tomography facility.

  16. The Havemann-Taylor Fast Radiative Transfer Code (HT-FRTC) and its applications

    NASA Astrophysics Data System (ADS)

    Thelen, Jean-Claude; Havemann, Stephan; Lewis, Warren

    2015-09-01

    The Havemann-Taylor Fast Radiative Transfer Code (HT-FRTC) is a component of the Met Office NEON Tactical Decision Aid (TDA). Within NEON, the HT-FRTC has for a number of years been used to predict the IR apparent thermal contrasts between different surface types as observed by an airborne sensor. To achieve this, the HT-FRTC is supplied with the inherent temperatures and spectral properties of these surfaces (i.e. ground target(s) and background). A key strength of the HT-FRTC is its ability to take into account the detailed properties of the atmosphere, which in the context of NEON tend to be provided by a Numerical Weather Prediction (NWP) forecast model. While water vapour and ozone are generally the most important gases, additional trace gases are now being incorporated into the HT-FRTC. The HT-FRTC also includes an exact treatment of atmospheric scattering based on spherical harmonics. This allows the treatment of several different aerosol species and of liquid and ice clouds. Recent developments can even account for rain and falling snow. The HT-FRTC works in Principal Component (PC) space and is trained on a wide variety of atmospheric and surface conditions, which significantly reduces the computational requirements regarding memory and time. One clear-sky simulation takes approximately one millisecond. Recent developments allow the training to be completely general and sensor independent. This is significant as the user of the code can add new sensors and new surfaces/targets by simply supplying extra files which contain their (possibly classified) spectral properties. The HT-FRTC has been extended to cover the spectral range of Photopic and NVG sensors. One aim here is to give guidance on the expected, directionally resolved sky brightness, especially at night, again taking the actual or forecast atmospheric conditions into account. Recent developments include light level predictions during the period of twilight.
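Working in principal component space means projecting training spectra onto a small set of leading components and operating on the scores rather than the full spectral vectors. A compact SVD-based sketch with synthetic spectra (not the HT-FRTC training procedure; sizes and noise levels are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic "training spectra": 500 spectra of 1000 channels that really
# live on 3 underlying degrees of freedom, plus small noise.
basis = rng.normal(size=(3, 1000))
weights = rng.normal(size=(500, 3))
spectra = weights @ basis + 0.01 * rng.normal(size=(500, 1000))

mean = spectra.mean(axis=0)
U, s, Vt = np.linalg.svd(spectra - mean, full_matrices=False)
pcs = Vt[:5]                          # keep 5 principal components
scores = (spectra - mean) @ pcs.T     # compressed representation (500 x 5)
recon = scores @ pcs + mean           # back to full spectral space

err = np.abs(recon - spectra).max()
print(err)  # small: 5 PCs capture nearly all of the variance
```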

  17. Particle-In-Cell (PIC) code simulation results and comparison with theory scaling laws for photoelectron-generated radiation

    SciTech Connect

    Dipp, T.M. |

    1993-12-01

    The generation of radiation by photoelectrons emitted from a conducting surface was explored using Particle-In-Cell (PIC) code computer simulations. Using the MAGIC PIC code, the simulations were performed in one dimension to handle the diverse scale lengths of the particles and fields in the problem. The simulations involved monoenergetic, nonrelativistic photoelectrons emitted normal to the illuminated conducting surface. A sinusoidal, 100% modulated, 6.3263 ns pulse train, as well as unmodulated emission, was used to explore the behavior of the particles, fields, and generated radiation. A special postprocessor was written to convert the PIC-code-simulated electron sheath into far-field radiation parameters by means of rigorous retarded-time calculations. The results of the small-spot PIC simulations were used to generate various graphs of resonance and nonresonance radiation quantities such as radiated lobe patterns, frequency, and power. A database of PIC simulation results was created and, using a nonlinear curve-fitting program, compared with theoretical scaling laws. Overall, the small-spot behavior predicted by the theoretical scaling laws was generally observed in the PIC simulation data, providing confidence in both the theoretical scaling laws and the PIC simulations.
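Comparing a database of simulation results against theoretical scaling laws typically means fitting power laws to the data. A log-log least-squares sketch with synthetic data (not the original curve-fitting program; the x^2 scaling is an invented example):

```python
import numpy as np

def fit_power_law(x, y):
    """Fit y = c * x**a by linear least squares in log-log space.
    Returns (c, a)."""
    a, logc = np.polyfit(np.log(x), np.log(y), 1)
    return np.exp(logc), a

# Synthetic "simulation database": a quantity scaling exactly as 3 * x^2.
x = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
y = 3.0 * x ** 2
c, a = fit_power_law(x, y)
print(round(c, 6), round(a, 6))  # 3.0 2.0
```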

  18. Pymiedap: a versatile radiative transfer code with polarization for terrestrial (exo)planets.

    NASA Astrophysics Data System (ADS)

    Rossi, Loïc; Stam, Daphne; Hogenboom, Michael

    2016-04-01

    Polarimetry promises to be an important method for detecting exoplanets: the light of a star is usually unpolarized (Kemp et al. 1987), while scattering by gas and clouds in an atmosphere can generate high levels of polarization. Furthermore, the polarization of scattered light contains information about the properties of the atmosphere and surface of a planet, enabling their characterization (Stam 2008), a method already validated in the solar system with Venus (Hansen & Hovenier 1974; Rossi et al. 2015). We present here Pymiedap (Python Mie Doubling-Adding Program): a set of Python objects interfaced with Fortran radiative transfer codes that allows the user to define a planetary atmosphere and compute the flux and polarization of the scattered light. Several properties of the planet can be set interactively through the Python interface, such as gravity, distance to the star, surface properties, atmospheric layers, and gaseous and aerosol composition. The radiative transfer calculations are then performed following the doubling-adding method (de Haan et al. 1987). We present some results of the code and show its possible use for different planetary atmospheres for both resolved and disk-integrated measurements. We investigate the effect of gas, cloud and aerosol composition and of surface properties for horizontally homogeneous and inhomogeneous planets, in the case of Earth-like planets. We also study the effect of gaseous absorption on the flux and polarization as a marker for gaseous abundance and cloud-top altitude. References: Kemp et al. The optical polarization of the sun measured at a sensitivity of parts in ten million. Nature, 1987, 326, 270-273. Stam, D. M. Spectropolarimetric signatures of Earth-like extrasolar planets. A&A, 2008, 482, 989-1007. Hansen, J. E. & Hovenier, J. W. Interpretation of the polarization of Venus. Journal of Atmospheric Sciences, 1974, 31, 1137-1160. Rossi et al.
Preliminary study of Venus cloud layers
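The doubling-adding method referenced above builds a thick layer's reflectance and transmittance by repeatedly combining two identical thinner layers, summing all interreflections between them. A scalar (unpolarized) sketch of the doubling step, not Pymiedap's polarized matrix version:

```python
def double_layer(r, t, n_doublings):
    """Scalar doubling: combine two identical layers with reflectance r and
    transmittance t, summing all interreflections between them:
        R' = r + t*r*t / (1 - r*r),   T' = t*t / (1 - r*r)
    and repeat n_doublings times (thickness grows as 2**n)."""
    for _ in range(n_doublings):
        denom = 1.0 - r * r
        r, t = r + t * r * t / denom, t * t / denom
    return r, t

# Start from an optically thin, conservatively scattering layer (r + t = 1).
R, T = double_layer(r=0.001, t=0.999, n_doublings=10)
print(R + T)  # ≈ 1: energy is conserved for a non-absorbing layer
```

For a conservative layer the doubling step preserves r + t = 1 exactly, which makes energy conservation a convenient sanity check on the implementation.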

  19. Accurate dose assessment system for an exposed person utilising radiation transport calculation codes in emergency response to a radiological accident.

    PubMed

    Takahashi, F; Shigemori, Y; Seki, A

    2009-01-01

    A system has been developed to assess the radiation dose distribution inside the body of a person exposed in a radiological accident, utilising the radiation transport calculation codes MCNP and MCNPX. The system consists mainly of two parts, a pre-processor and a post-processor for the radiation transport calculation. Programs in the pre-processor are used to set up a 'problem-dependent' input file, which defines the accident condition and the dosimetric quantities to be estimated. The program developed for the post-processor part can effectively display dose information based upon the output file of the code. All of the programs in the dosimetry system can be executed on a commonly used personal computer and accurately give the dose profile of an exposed person in a radiological accident without complicated procedures. An experiment using a physical phantom was carried out to verify the applicability of the dosimetry system with the developed programs in a gamma-ray irradiation field.

  20. Coronal extension of the MURaM radiative MHD code: From quiet sun to flare simulations

    NASA Astrophysics Data System (ADS)

    Rempel, Matthias D.; Cheung, Mark

    2016-05-01

    We present a new version of the MURaM radiative MHD code, which includes a treatment of the solar corona in terms of MHD, optically thin radiative loss and field-aligned heat conduction. In order to relax the severe time-step constraints imposed by large Alfven velocities and heat conduction, we use a combination of semi-relativistic MHD with a reduced speed of light ("Boris correction") and a hyperbolic formulation of heat conduction. We apply the numerical scheme to four setups, including a mixed-polarity quiet Sun, an open-flux region, an arcade solution and an active region, and find in all cases an amount of coronal heating sufficient to maintain a corona with temperatures from 1 MK (quiet Sun) to 2 MK (active region, arcade). In all our setups the Poynting flux is self-consistently created by photospheric and sub-photospheric magneto-convection in the lower part of the simulation domain. Varying the maximum allowed Alfven velocity (the "reduced speed of light") leads to only minor changes in the coronal structure as long as the limited Alfven velocity remains larger than the speed of sound and about 1.5-3 times larger than the peak advection velocity. We also found that varying the details of the numerical diffusivities that govern the resistive and viscous energy dissipation does not strongly affect the overall coronal heating, but the ratio of resistive to viscous energy dissipation depends strongly on the effective numerical magnetic Prandtl number. We use our active-region setup to simulate a flare triggered by the emergence of a twisted flux rope into a pre-existing bipolar active region. Our simulation yields a series of flares, with the strongest one reaching GOES M1 class. The simulation reproduces many observed properties of eruptions such as flare ribbons, post-flare loops and a sunquake.
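The hyperbolic formulation of heat conduction replaces the parabolic equation with a Maxwell-Cattaneo-type system whose signal speed is finite, relaxing the time-step constraint from dt ∝ dx² toward dt ∝ dx. A 1-D explicit sketch with periodic boundaries (illustrative parameters; this is not MURaM's scheme):

```python
import numpy as np

# Hyperbolic (Maxwell-Cattaneo-type) heat conduction in 1D:
#   dT/dt = -dq/dx
#   tau dq/dt = -(q + kappa dT/dx)
# The flux q relaxes toward the Fourier value -kappa dT/dx, and signals
# propagate at the finite speed sqrt(kappa / tau).
nx, dx, dt = 200, 1.0 / 200, 1.0e-3
kappa, tau = 0.01, 0.01
x = (np.arange(nx) + 0.5) * dx
T = np.exp(-((x - 0.5) / 0.1) ** 2)  # initial hot spot
q = np.zeros(nx)

for _ in range(500):
    dTdx = (np.roll(T, -1) - np.roll(T, 1)) / (2 * dx)  # periodic
    dqdx = (np.roll(q, -1) - np.roll(q, 1)) / (2 * dx)
    T = T - dt * dqdx
    q = q + dt * (-(q + kappa * dTdx) / tau)

print(T.sum() * dx)  # total heat is conserved under periodic boundaries
```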

  1. XTAT: A New Multilevel-Multiline Polarized Radiative Transfer Code with PRD

    NASA Astrophysics Data System (ADS)

    Bommier, V.

    2014-10-01

    This work is aimed at the interpretation of the so-called "Second Solar Spectrum" (Stenflo 1996), which is the spectrum of the linear polarization formed by scattering and observed close to the solar internal limb. The lines are also optically thick, and the problem is to solve, in a coherent manner, the statistical equilibrium of the atomic density matrix and the polarized radiative transfer in the atmosphere. According to Belluzzi & Landi Degl'Innocenti (2009), 30% of the linear polarization profiles of solar visible lines display the M-type shape typical of coherent scattering effects in the far wings. A new theory including both coherent (Rayleigh) and resonant scattering was developed by Bommier (1997a,b). Raman scattering was later added (Bommier 1999, SPW2). In this theory, which is derived directly from the Schrödinger equation for the atomic density matrix, the radiative line broadening appears as a non-Markovian process of atom-photon interaction. The collisional broadening is included. The Rayleigh (Raman) scattering appears as an additional term in the emissivity from the fourth order of the perturbation development of the atom-photon interaction. The development is pursued and finally summed up, leading to a non-perturbative final result. In this formalism, the use of redistribution functions is avoided. The published formalism was limited to the two-level atom without lower-level alignment, but most solar lines are more complex. We will present how the theory has to be complemented for multi-level atom modeling, including lower-level alignment. The role of collisions in balancing coherent and resonant scattering is fully taken into account. A progress report will be given on the development of a new code for the numerical iterative solution of the statistical equilibrium and polarized radiative transfer equations, for multi-level atoms and their multi-line spectrum. Fine and hyperfine structures, and Hanle, Kemp (Kemp et al. 1984), Zeeman

  2. C5 Benchmark Problem with Discrete Ordinate Radiation Transport Code DENOVO

    SciTech Connect

    Yesilyurt, Gokhan; Clarno, Kevin T; Evans, Thomas M; Davidson, Gregory G; Fox, Patricia B

    2011-01-01

    The C5 benchmark problem proposed by the Organisation for Economic Co-operation and Development/Nuclear Energy Agency was modeled to examine the capabilities of Denovo, a three-dimensional (3-D) parallel discrete ordinates (S{sub N}) radiation transport code, for problems with no spatial homogenization. Denovo uses state-of-the-art numerical methods to obtain accurate solutions to the Boltzmann transport equation. Problems were run in parallel on Jaguar, a high-performance supercomputer located at Oak Ridge National Laboratory. Both the two-dimensional (2-D) and 3-D configurations were analyzed, and the results were compared with the reference MCNP Monte Carlo calculations. For an additional comparison, SCALE/KENO-V.a Monte Carlo solutions were also included. In addition, a sensitivity analysis was performed for the optimal angular quadrature and mesh resolution for both the 2-D and 3-D infinite lattices of UO{sub 2} fuel pin cells. Denovo was verified with the C5 problem. The effective multiplication factors, pin powers, and assembly powers were found to be in good agreement with the reference MCNP and SCALE/KENO-V.a Monte Carlo calculations.
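
    As a toy illustration of the discrete ordinates (S_N) approach that Denovo applies at scale, the sketch below solves a one-group, one-dimensional slab problem by source iteration with an S4 quadrature. The cross sections, fixed source, step differencing, and vacuum boundaries are illustrative assumptions, not Denovo's actual numerics:

```python
def sn_slab(sigma_t=1.0, sigma_s=0.5, q=1.0, width=4.0, nx=40, n_iter=500):
    """One-group 1-D slab, S4 source iteration, vacuum boundaries (toy)."""
    # S4 Gauss-Legendre ordinates and weights on [-1, 1]
    mus = (-0.8611363116, -0.3399810436, 0.3399810436, 0.8611363116)
    wts = (0.3478548451, 0.6521451549, 0.6521451549, 0.3478548451)
    dx = width / nx
    phi = [0.0] * nx                              # scalar flux per cell
    for _ in range(n_iter):
        # isotropic source: scattering off the previous flux + fixed source
        src = [0.5 * (sigma_s * p + q) for p in phi]
        phi_new = [0.0] * nx
        for mu, w in zip(mus, wts):
            a = abs(mu) / dx
            psi_in = 0.0                          # vacuum inflow
            cells = range(nx) if mu > 0 else range(nx - 1, -1, -1)
            for i in cells:                       # transport sweep (step scheme)
                psi = (src[i] + a * psi_in) / (sigma_t + a)
                phi_new[i] += w * psi
                psi_in = psi
        phi = phi_new
    return phi

flux = sn_slab()
```

    Source iteration converges here because the scattering ratio is 0.5; production codes such as Denovo add acceleration schemes that become necessary as that ratio approaches 1.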

  3. Exo-Transmit: Radiative transfer code for calculating exoplanet transmission spectra

    NASA Astrophysics Data System (ADS)

    Kempton, Eliza M.-R.; Lupu, Roxana E.; Owusu-Asare, Albert; Slough, Patrick; Cale, Bryson

    2016-11-01

    Exo-Transmit calculates the transmission spectrum of an exoplanet atmosphere given specified input information about the planetary and stellar radii, the planet's surface gravity, the atmospheric temperature-pressure (T-P) profile, the location (in terms of pressure) of any cloud layers, the composition of the atmosphere, and opacity data for the atoms and molecules that make up the atmosphere. The code solves the equation of radiative transfer for absorption of starlight passing through the planet's atmosphere as it transits, accounting for the oblique path of light through the planetary atmosphere along an Earth-bound observer's line of sight. The fraction of light absorbed (or blocked) by the planet plus its atmosphere is calculated as a function of wavelength to produce the wavelength-dependent transmission spectrum. Functionality is provided to simulate the presence of atmospheric aerosols in two ways: an optically thick (gray) cloud deck can be generated at a user-specified height in the atmosphere, and the nominal Rayleigh scattering can be increased by a specified factor.
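
    The transit geometry described above can be sketched in a few lines: for each wavelength, integrate the blocked area over impact parameter using slant-path optical depths through an isothermal atmosphere. The planet and star parameters, the single gray absorber, and the chord-integral approximation below are illustrative assumptions, not Exo-Transmit's actual inputs:

```python
import math

def transit_depth(sigma, r_p=7.1e7, r_s=7.0e8, H=5.0e5, n0=1.0e25):
    """Fractional dimming (R_eff/R_s)^2 for a gray cross section sigma [m^2].

    Chord optical depth for an isothermal atmosphere of scale height H:
        tau(b) ~ sigma * n0 * sqrt(2*pi*r_p*H) * exp(-(b - r_p)/H)
    """
    pref = n0 * math.sqrt(2.0 * math.pi * r_p * H)
    db = H / 20.0
    blocked = r_p ** 2                     # opaque solid disk
    b = r_p
    while b < r_p + 30.0 * H:              # annuli through the atmosphere
        tau = sigma * pref * math.exp(-(b - r_p) / H)
        blocked += 2.0 * b * (1.0 - math.exp(-tau)) * db
        b += db
    return blocked / r_s ** 2

depth_clear = transit_depth(0.0)           # no atmospheric absorption
depth_hazy = transit_depth(1.0e-28)        # strong gray absorber
```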

  4. Recent Developments in the VISRAD 3-D Target Design and Radiation Simulation Code

    NASA Astrophysics Data System (ADS)

    Macfarlane, Joseph; Woodruff, P.; Golovkin, I.

    2011-10-01

    The 3-D view factor code VISRAD is widely used in designing HEDP experiments at major laser and pulsed-power facilities, including NIF, OMEGA, OMEGA-EP, ORION, Z, and PLX. It simulates target designs by generating a 3-D grid of surface elements, utilizing a variety of 3-D primitives and surface removal algorithms, and can be used to compute the radiation flux throughout the surface element grid by computing element-to-element view factors and solving power balance equations. Target set-up and beam pointing are facilitated by allowing users to specify positions and angular orientations using a variety of coordinate systems (e.g., that of any laser beam, target component, or diagnostic port). Analytic modeling for laser beam spatial profiles for OMEGA DPPs and NIF CPPs is used to compute laser intensity profiles throughout the grid of surface elements. VISRAD includes a variety of user-friendly graphics for setting up targets and displaying results, can readily display views from any point in space, and can be used to generate image sequences for animations. We will discuss recent improvements to the software package and plans for future developments.
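
    The power-balance step described above amounts to solving a linear radiosity system over the surface elements. A minimal sketch with a hand-written fixed-point iteration follows; the two-element enclosure, view factors, and albedos are hypothetical, not a VISRAD case:

```python
def solve_radiosity(emitted, albedo, view, n_iter=200):
    """Fixed-point solve of B_i = E_i + albedo_i * sum_j F_ij * B_j (toy)."""
    n = len(emitted)
    B = list(emitted)
    for _ in range(n_iter):
        B = [emitted[i] + albedo[i] * sum(view[i][j] * B[j] for j in range(n))
             for i in range(n)]
    return B

# Two facing plates that see only each other: F12 = F21 = 1
B = solve_radiosity(emitted=[1.0, 0.0], albedo=[0.5, 0.5],
                    view=[[0.0, 1.0], [1.0, 0.0]])
```

    In VISRAD the view-factor matrix couples many thousands of elements; the two-plate enclosure here is only the smallest closed example (fixed point B1 = 4/3, B2 = 2/3).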

  5. Description of the strategic high-altitude atmospheric radiation code (SHARC). Scientific report, Jan 89-Oct 90

    SciTech Connect

    Duff, J.W.; Sundberg, R.L.; Gruninger, J.H.; Bernstein, L.S.; Robertson, D.C.

    1990-11-27

    The report describes an upgraded version of the strategic high-altitude radiance code, SHARC-2. SHARC calculates atmospheric radiance and transmittance over the 2-40 micrometer spectral region for arbitrary paths between 50 and 300 km altitude, including space viewing. It models radiation due to NLTE (Non-Local Thermodynamic Equilibrium) molecular emissions, which are the dominant sources at these altitudes. This new version, which is now ready for distribution, has been upgraded to include a fully integrated auroral model with time-dependent chemistry, extension down to 50 km altitude, and radiation from the minor isotopes of CO2. In addition, there have been numerous internal upgrades to the various modules. These include a Voigt lineshape for the radiative excitation module; embedding of the auroral region into a quiescent atmosphere; and improvements in the radiation transport algorithms.

  6. Fast aerosol optical thickness retrieval from MERIS data with the use of fast radiative transfer code and analytical radiative transfer solutions

    NASA Astrophysics Data System (ADS)

    Kokhanovsky, Alexander; Katsev, Iosif; Prikhach, Alexander; Zege, Eleonora

    We present the new fast aerosol retrieval technique (FAR) to retrieve the aerosol optical thickness (AOT), Angstrom parameter, and land reflectance from spectral satellite data. The most important difference between the proposed technique and NASA/MODIS, ESA/MERIS, and other well-known AOT retrieval codes is that our retrievals do not use the look-up table (LUT) technique; instead, they are based on our previously developed, extremely fast code RAY for radiative transfer (RT) computations and include analytical solutions of radiative transfer. The previous version of the retrieval code (ART) was based entirely on RT computations. The FAR technique is about 100 times faster than ART because it combines RAY computations with analytical solutions of radiative transfer theory. The accuracy of these approximate solutions has been thoroughly checked. Using RT computations in the course of the AOT retrieval allows one to include any available local models of the molecular atmosphere and of aerosol in the upper and middle atmospheric layers for the treated area. Any set of wavelengths from any satellite optical instrument can be processed. Moreover, we use the method of least squares in the retrieval of aerosol optical parameters because the RAY code provides the derivatives of the radiation characteristics with respect to the parameters in question. This allows the optimal use of multi-spectral information. The retrieval methods are flexible and can be used in synergetic algorithms that couple data from two or more satellite receivers. These features may be considered definite merits in comparison with the LUT technique. A successful comparison of FAR-retrieved data with the results of some other algorithms and with AERONET measurements will be demonstrated.
    Besides, two important problems, namely the effect of the a priori choice of aerosol model on the retrieved AOT accuracy and the effect of adjacent pixels containing clouds or snow spots, is
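
    As a toy analog of the least-squares retrieval with model-supplied derivatives, the sketch below fits an AOT at a reference wavelength and an Angstrom exponent to synthetic reflectances by Gauss-Newton, using analytic Jacobians of a deliberately simplified forward model. The forward model, channel wavelengths, and coefficients are invented for illustration and are unrelated to the actual RAY code:

```python
import math

LAMS = [0.44, 0.55, 0.67, 0.87]   # hypothetical channel wavelengths [um]

def forward(tau0, alpha, lam):
    """Toy TOA reflectance: attenuated surface term + aerosol path term."""
    tau = tau0 * (lam / 0.55) ** (-alpha)
    return 0.10 * math.exp(-2.0 * tau) + 0.30 * (1.0 - math.exp(-tau))

def jacobian(tau0, alpha, lam):
    """Analytic derivatives of the toy forward model w.r.t. (tau0, alpha)."""
    tau = tau0 * (lam / 0.55) ** (-alpha)
    dR_dtau = -0.20 * math.exp(-2.0 * tau) + 0.30 * math.exp(-tau)
    return (dR_dtau * (lam / 0.55) ** (-alpha),
            dR_dtau * (-tau * math.log(lam / 0.55)))

def retrieve(obs, tau0=0.1, alpha=1.0, n_iter=100):
    """Gauss-Newton on the 2x2 normal equations."""
    for _ in range(n_iter):
        a11 = a12 = a22 = b1 = b2 = 0.0
        for lam, y in zip(LAMS, obs):
            r = y - forward(tau0, alpha, lam)
            j1, j2 = jacobian(tau0, alpha, lam)
            a11 += j1 * j1; a12 += j1 * j2; a22 += j2 * j2
            b1 += j1 * r;  b2 += j2 * r
        det = a11 * a22 - a12 * a12
        tau0 += (a22 * b1 - a12 * b2) / det
        alpha += (a11 * b2 - a12 * b1) / det
    return tau0, alpha

obs = [forward(0.2, 1.3, lam) for lam in LAMS]   # noise-free synthetic data
tau0_fit, alpha_fit = retrieve(obs)
```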

  7. Improvements of the Radiation Code "MstrnX" in AORI/NIES/JAMSTEC Models

    NASA Astrophysics Data System (ADS)

    Sekiguchi, M.; Suzuki, K.; Takemura, T.; Watanabe, M.; Ogura, T.

    2015-12-01

    There is a large demand for an accurate yet rapid radiative transfer scheme for general climate models. The broadband radiative transfer code "mstrnX" was developed by the Atmosphere and Ocean Research Institute (AORI) and has been implemented in several global and regional climate models developed cooperatively in the Japanese research community, for example MIROC (the Model for Interdisciplinary Research on Climate) [Watanabe et al., 2010], NICAM (Non-hydrostatic Icosahedral Atmospheric Model) [Satoh et al., 2008], and CReSS (Cloud Resolving Storm Simulator) [Tsuboki and Sakakibara, 2002]. In this study, we improve the gas absorption process and the scattering process of ice particles. To update the gas absorption process, the absorption line database is replaced by the latest version from the Harvard-Smithsonian Center, HITRAN2012. An optimization method is adopted in mstrnX to decrease the number of integration points for the wavenumber integration using the correlated k-distribution method and to increase the computational efficiency in each band. The integration points and weights of the correlated k-distribution are optimized for accurate calculation of the heating rate up to an altitude of 70 km. For this purpose we adopted a new non-linear optimization method for the correlated k-distribution and studied an optimal initial condition and cost function for the non-linear optimization. It is known that mstrnX has a considerable bias in the case of quadrupled carbon dioxide concentrations [Pincus et al., 2015]; however, this bias is decreased by the improvement. To update the scattering process of ice particles, we adopt a solid column as the ice crystal habit [Yang et al., 2013]. The single scattering properties are calculated and tabulated in advance. The size parameter in this table ranges from 0.1 to 1000 in mstrnX; we expand the maximum to 50000 in order to handle large particles, like fog and rain drops. These updates will be introduced to
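
    The correlated k-distribution idea above can be sketched as follows: replace the wavenumber integration with a few quadrature points over the sorted (cumulative) distribution of absorption coefficients. The synthetic lognormal-like coefficients and bin counts are invented for illustration:

```python
import math, random

random.seed(7)
# synthetic spectral absorption coefficients across one band
KS = [math.exp(random.uniform(-4.0, 2.0)) for _ in range(2000)]

def trans_lbl(u):
    """'Line-by-line' band transmission: average over all spectral points."""
    return sum(math.exp(-k * u) for k in KS) / len(KS)

def trans_ckd(u, ng=16):
    """Correlated-k: sort k, split the cumulative distribution g into ng
    equal-weight bins, and use one representative k per bin."""
    ks = sorted(KS)
    n = len(ks)
    total = 0.0
    for b in range(ng):
        seg = ks[b * n // ng:(b + 1) * n // ng]
        k_rep = sum(seg) / len(seg)        # bin-mean representative coefficient
        total += math.exp(-k_rep * u) / ng
    return total
```

    The 16-point quadrature reproduces the 2000-point band average to a few tenths of a percent here, which is the efficiency gain the correlated-k method trades for exactness.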

  8. Multi-Code Ab Initio Calculation of Ionization Distributions and Radiation Losses for Tungsten in Tokamak Plasmas

    SciTech Connect

    Ralchenko, Yu.; Abdallah, J. Jr.; Colgan, J.; Fontes, C. J.; Foster, M.; Zhang, H. L.; Bar-Shalom, A.; Oreg, J.; Bauche, J.; Bauche-Arnoult, C.; Bowen, C.; Faussurier, G.; Chung, H.-K.; Hansen, S. B.; Lee, R. W.; Scott, H.; Gaufridy de Dortan, F. de; Poirier, M.; Golovkin, I.; Novikov, V.

    2009-09-10

    We present calculations of ionization balance and radiative power losses for tungsten in magnetic fusion plasmas. The simulations were performed within the framework of the Non-Local Thermodynamic Equilibrium (NLTE) Code Comparison Workshops utilizing several independent collisional-radiative models. The calculations generally agree with each other; however, a clear disagreement with experimental ionization distributions is found at low temperatures (below about 2 keV).

  9. All-sky radiative transfer calculations for IASI and IASI-NG: The σ-IASI-as code

    NASA Astrophysics Data System (ADS)

    Liuzzi, G.; Blasi, M. G.; Masiello, G.; Serio, C.; Venafra, S.

    2017-02-01

    In the context of the development by EUMETSAT of a new generation of meteorological satellites, we have built the new σ-IASI-as (where "as" stands for "all sky") radiative transfer code. Unlike its predecessor σ-IASI, the code is able to calculate both clear- and cloudy-sky radiances, as well as their Jacobians with respect to any desired geophysical parameter. In addition, σ-IASI-as can perform calculations to simulate the extinction effect of the most common types of atmospheric aerosols and of clouds via ab-initio Mie calculations. We briefly describe the analytical scheme on which the model is based and illustrate its capabilities with some sample calculations. Overall, the new model is a complete and fast radiative transfer tool for IASI, and is already available for IASI-NG and MTG-IRS.
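
    A minimal sketch of an analytic radiance Jacobian of the kind such codes provide, for a single homogeneous layer in the pure-absorption limit, cross-checked against a central finite difference (the radiances and absorption coefficient are invented numbers, not σ-IASI-as quantities):

```python
import math

B_SURF, B_AIR = 100.0, 60.0   # hypothetical Planck radiances [arb. units]
K = 0.8                       # hypothetical absorption coefficient

def radiance(u):
    """Single-layer Schwarzschild solution: surface term + layer emission."""
    t = math.exp(-K * u)
    return B_SURF * t + B_AIR * (1.0 - t)

def jacobian(u):
    """Analytic dR/du for the layer absorber amount u."""
    return K * math.exp(-K * u) * (B_AIR - B_SURF)

u0, h = 1.2, 1e-6
fd = (radiance(u0 + h) - radiance(u0 - h)) / (2.0 * h)   # finite-difference check
```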

  10. GENII (Generation II): The Hanford Environmental Radiation Dosimetry Software System: Volume 3, Code maintenance manual: Hanford Environmental Dosimetry Upgrade Project

    SciTech Connect

    Napier, B.A.; Peloquin, R.A.; Strenge, D.L.; Ramsdell, J.V.

    1988-09-01

    The Hanford Environmental Dosimetry Upgrade Project was undertaken to incorporate the internal dosimetry models recommended by the International Commission on Radiological Protection (ICRP) in updated versions of the environmental pathway analysis models used at Hanford. The resulting second generation of Hanford environmental dosimetry computer codes is compiled in the Hanford Environmental Dosimetry System (Generation II, or GENII). This coupled system of computer codes is intended for analysis of environmental contamination resulting from acute or chronic releases to, or initial contamination of, air, water, or soil, on through the calculation of radiation doses to individuals or populations. GENII is described in three volumes of documentation. This volume is a Code Maintenance Manual for the serious user, including code logic diagrams, global dictionary, worksheets to assist with hand calculations, and listings of the code and its associated data libraries. The first volume describes the theoretical considerations of the system. The second volume is a Users' Manual, providing code structure, users' instructions, required system configurations, and QA-related topics. 7 figs., 5 tabs.

  11. Activities of the Radiation Shielding Information Center and a report on codes/data for high energy radiation transport

    SciTech Connect

    Roussin, R.W.

    1993-01-01

    From the very early days in its history Radiation Shielding Information Center (RSIC) has been involved with high energy radiation transport. The National Aeronautics and Space Administration was an early sponsor of RSIC until the completion of the Apollo Moon Exploration Program. In addition, the intranuclear cascade work of Bertini at Oak Ridge National Laboratory provided valuable resources which were made available through RSIC. Over the years, RSIC has had interactions with many of the developers of high energy radiation transport computing technology and data libraries and has been able to collect and disseminate this technology. The current status of this technology will be reviewed and prospects for new advancements will be examined.

  13. Experiences in the Performance Analysis and Optimization of a Deterministic Radiation Transport Code on the Cray SV1

    SciTech Connect

    Peter Cebull

    2004-05-01

    The Attila radiation transport code, which solves the Boltzmann neutron transport equation on three-dimensional unstructured tetrahedral meshes, was ported to a Cray SV1. Cray's performance analysis tools pointed to two subroutines that together accounted for 80%-90% of the total CPU time. Source code modifications were performed to enable vectorization of the most significant loops, to correct unfavorable strides through memory, and to replace a conjugate gradient solver subroutine with a call to the Cray Scientific Library. These optimizations resulted in a speedup of 7.79 for the INEEL's largest ATR model. Parallel scalability of the OpenMP version of the code is also discussed, and timing results are given for other non-vector platforms.
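
    The stride-correction part of such an optimization is essentially loop interchange so that the inner loop walks memory contiguously. The toy below only shows that the transformation is semantics-preserving; Python does not model the cache behavior that makes unit stride fast in compiled Fortran or C:

```python
def sweep_strided(a):
    """Column-first traversal of a row-major nested list: long stride per step."""
    total = 0
    for j in range(len(a[0])):
        for i in range(len(a)):
            total += a[i][j]
    return total

def sweep_unit_stride(a):
    """Interchanged loops: the inner loop is contiguous, same arithmetic."""
    total = 0
    for row in a:
        for x in row:
            total += x
    return total

grid = [[3 * i + j for j in range(3)] for i in range(4)]
```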

  14. Bayesian Atmospheric Radiative Transfer (BART) Thermochemical Equilibrium Abundance (TEA) Code and Application to WASP-43b

    NASA Astrophysics Data System (ADS)

    Blecic, Jasmina; Harrington, Joseph; Bowman, Matthew O.; Cubillos, Patricio E.; Stemm, Madison; Foster, Andrew

    2014-11-01

    We present a new, open-source, Thermochemical Equilibrium Abundances (TEA) code that calculates the abundances of gaseous molecular species. TEA uses the Gibbs-free-energy minimization method with an iterative Lagrangian optimization scheme. It initializes the radiative-transfer calculation in our Bayesian Atmospheric Radiative Transfer (BART) code. Given elemental abundances, TEA calculates molecular abundances for a particular temperature and pressure or a list of temperature-pressure pairs. The code is tested against the original method developed by White et al. (1958), the analytic method developed by Burrows and Sharp (1999), and the Newton-Raphson method implemented in the open-source Chemical Equilibrium with Applications (CEA) code. TEA is written in Python and is available to the community via the open-source development site GitHub.com. We also present BART applied to eclipse depths of the exoplanet WASP-43b, constraining atmospheric thermal and chemical parameters. This work was supported by NASA Planetary Atmospheres grant NNX12AI69G and NASA Astrophysics Data Analysis Program grant NNX13AF38G. JB holds a NASA Earth and Space Science Fellowship.
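
    As a toy analog of the equilibrium problem such codes solve: for a single dissociation A2 <-> 2A, Gibbs minimization reduces to an equilibrium-constant equation plus element conservation, solvable with a few Newton steps. The species and the value of K are invented; TEA's Lagrangian scheme handles many species and elements simultaneously:

```python
def dissociation_equilibrium(K=0.5, n_elem=1.0, n_iter=60):
    """Solve n_A**2 / n_A2 = K under element conservation n_A + 2*n_A2 = n_elem.

    Eliminating n_A2 gives f(n_A) = n_A**2 + (K/2)*n_A - (K/2)*n_elem = 0,
    which is solved by Newton's method.
    """
    nA = 0.5 * n_elem                       # initial guess
    for _ in range(n_iter):
        f = nA * nA + 0.5 * K * nA - 0.5 * K * n_elem
        df = 2.0 * nA + 0.5 * K
        nA -= f / df
    nA2 = 0.5 * (n_elem - nA)               # conservation fixes n_A2
    return nA, nA2

nA, nA2 = dissociation_equilibrium()
```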

  15. HZETRN: A heavy ion/nucleon transport code for space radiations

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Chun, Sang Y.; Badavi, Forooz F.; Townsend, Lawrence W.; Lamkin, Stanley L.

    1991-01-01

    The galactic heavy ion transport code (GCRTRN) and the nucleon transport code (BRYNTRN) are integrated into a code package (HZETRN). The code package is computer efficient and capable of operating in an engineering design environment for manned deep space mission studies. The nuclear data set used by the code is discussed including current limitations. Although the heavy ion nuclear cross sections are assumed constant, the nucleon-nuclear cross sections of BRYNTRN with full energy dependence are used. The relation of the final code to the Boltzmann equation is discussed in the context of simplifying assumptions. Error generation and propagation is discussed, and comparison is made with simplified analytic solutions to test numerical accuracy of the final results. A brief discussion of biological issues and their impact on fundamental developments in shielding technology is given.
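
    The straight-ahead, depth-marching character of such transport solvers can be sketched with a two-component toy: primaries attenuate exponentially with depth and feed one generation of secondaries. The cross sections and secondary multiplicity are invented; HZETRN's actual equations carry full energy dependence:

```python
import math

def march(sig1=0.5, sig2=0.3, mult=1.5, depth=10.0, nx=10000):
    """Euler march of  d(phi1)/dx = -sig1*phi1,
                       d(phi2)/dx = -sig2*phi2 + mult*sig1*phi1,
    with phi1(0) = 1 (primary beam) and phi2(0) = 0 (secondaries)."""
    dx = depth / nx
    phi1, phi2 = 1.0, 0.0
    for _ in range(nx):
        d1 = -sig1 * phi1
        d2 = -sig2 * phi2 + mult * sig1 * phi1
        phi1 += dx * d1
        phi2 += dx * d2
    return phi1, phi2

phi1, phi2 = march()
```

    The primary flux can be checked against the analytic exp(-sig1*depth), and the secondary against the standard two-exponential buildup solution.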

  16. Up-regulation of BRAF activated non-coding RNA is associated with radiation therapy for lung cancer.

    PubMed

    Chen, Jian-xiang; Chen, Ming; Zheng, Yuan-da; Wang, Sheng-ye; Shen, Zhu-ping

    2015-04-01

    Radiation therapy has become more effective in treating primary tumors, such as lung cancer. Recent evidence suggests that BRAF-activated non-coding RNA (BANCR) plays a critical role in cellular processes and is dysregulated in a variety of cancers. The clinical significance of BANCR in radiation therapy, and its molecular mechanisms controlling tumor growth, are unclear. In the present study, C57BL/6 mice were inoculated with Lewis lung cancer cells and exposed to radiation therapy, and BANCR expression was then analyzed using qPCR. Chromatin immunoprecipitation and western blot were performed to measure the enrichment of histone acetylation and HDAC3 protein levels in Lewis lung cancer cells, respectively. An MTT assay was used to evaluate the effects of BANCR on Lewis lung cancer cell viability. We found that BANCR expression was significantly increased (P<0.05) in C57BL/6 mice receiving radiation therapy compared with the control group. Additionally, knockdown of BANCR expression was associated with larger tumor size in C57BL/6 mice inoculated with Lewis lung cancer cells. Histone deacetylation was observed to be involved in the regulation of BANCR in Lewis lung cancer cells. Moreover, overexpression of HDAC3 reversed the effect of irradiation on BANCR expression. The MTT assay showed that knockdown of BANCR expression promoted the viability of cells surviving radiation. In conclusion, these findings indicate that radiation therapy is an effective treatment for lung cancer and may exert its function through up-regulation of BANCR expression.

  17. Application of 3-dimensional radiation transport codes to the analysis of the CRBR prototypic coolant pipe chaseway neutron streaming experiment

    SciTech Connect

    Chatani, K. )

    1992-08-01

    This report summarizes the calculational results from analyses of a Clinch River Breeder Reactor (CRBR) prototypic coolant pipe chaseway neutron streaming experiment. Comparisons of calculated and measured results are presented, with major emphasis placed on results at bends in the chaseway. Calculations were performed with three three-dimensional radiation transport codes: the discrete ordinates code TORT and the Monte Carlo code MORSE, both developed by the Oak Ridge National Laboratory (ORNL), and the discrete ordinates code ENSEMBLE, developed in Japan. The calculated results from the three codes are compared (1) with previously calculated DOT3.5 two-dimensional results, (2) among themselves, and (3) with measured results. Calculations with TORT used both the weighted-difference and nodal methods. Only the weighted-difference method was used in ENSEMBLE. When the calculated results were compared to measured results, it was found that calculation-to-experiment (C/E) ratios were good in the regions of the chaseway where two-dimensional modeling might be difficult and where there were no significant discrete ordinates ray effects. Excellent agreement was observed for responses dominated by thermal neutron contributions. MORSE-calculated results and comparisons are also described, and detailed results are presented in an appendix.

  18. Combining node-centered parallel radiation transport and higher-order multi-material cell-centered hydrodynamics methods in three-temperature radiation hydrodynamics code TRHD

    NASA Astrophysics Data System (ADS)

    Sijoy, C. D.; Chaturvedi, S.

    2016-06-01

    Higher-order cell-centered multi-material hydrodynamics (HD) and parallel node-centered radiation transport (RT) schemes are combined self-consistently in the three-temperature (3T) radiation hydrodynamics (RHD) code TRHD (Sijoy and Chaturvedi, 2015), developed for the simulation of intense thermal radiation or high-power laser driven RHD. For RT, a node-centered gray model implemented in the popular RHD code MULTI2D (Ramis et al., 2009) is used. This scheme, in principle, can handle RT in both optically thick and thin materials. The RT module has been parallelized using the message passing interface (MPI) for parallel computation. Presently, for multi-material HD, we have used a simple and robust closure model in which common strain rates are assumed for all materials in a mixed cell. The closure model has been further generalized to allow different temperatures for the electrons and ions. In addition, the electron and radiation temperatures are assumed to be in non-equilibrium. Therefore, the thermal relaxation between the electrons and ions and the coupling between the radiation and matter energies must be computed self-consistently. This has been achieved by using a node-centered symmetric-semi-implicit (SSI) integration scheme. The electron thermal conduction is calculated using a cell-centered, monotonic, non-linear finite volume scheme (NLFV) suitable for unstructured meshes. In this paper, we describe the details of the 2D, 3T, non-equilibrium, multi-material RHD code, with special attention to the coupling of the various cell-centered and node-centered formulations, along with a suite of validation test problems that demonstrate the accuracy and performance of the algorithms. We also report the parallel performance of the RT module. Finally, in order to demonstrate the full capability of the code implementation, we present the simulation of laser-driven shock propagation in a layered thin foil.
    The simulation results are found to be in good
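
    The semi-implicit idea can be illustrated on bare electron-ion temperature relaxation: treating the exchange term implicitly keeps the update stable and energy-conserving even for time steps much longer than the relaxation time. The relaxation rate, time step, and equal-heat-capacity assumption are illustrative, not TRHD's actual coupling:

```python
def ssi_relax(te=100.0, ti=10.0, nu=50.0, dt=0.1, nsteps=20):
    """Implicit update of dTe/dt = -nu*(Te - Ti), dTi/dt = +nu*(Te - Ti).

    Solving both equations implicitly gives the exchanged heat
        x = dt*nu*(Te - Ti) / (1 + 2*dt*nu),
    which conserves Te + Ti exactly and is stable for any dt*nu.
    """
    hist = [(te, ti)]
    for _ in range(nsteps):
        x = dt * nu * (te - ti) / (1.0 + 2.0 * dt * nu)
        te, ti = te - x, ti + x
        hist.append((te, ti))
    return hist

hist = ssi_relax()   # dt*nu = 5: explicit Euler would blow up here
```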

  20. A NEW SEMI-EMPIRICAL AMBIENT TO EFFECTIVE DOSE CONVERSION MODEL FOR THE PREDICTIVE CODE FOR AIRCREW RADIATION EXPOSURE (PCAIRE).

    PubMed

    Dumouchel, T; McCall, M; Lemay, F; Bennett, L; Lewis, B; Bean, M

    2016-12-01

    The Predictive Code for Aircrew Radiation Exposure (PCAIRE) is a semi-empirical code that estimates both ambient dose equivalent, based on years of on-board measurements, and effective dose to aircrew. Currently, PCAIRE estimates effective dose by converting the ambient dose equivalent to effective dose (E/H) using a model that is based on radiation transport calculations and on the radiation weighting factors recommended in International Commission on Radiological Protection (ICRP) 60. In this study, a new semi-empirical E/H model is proposed to replace the existing transport calculation models. The new model is based on flight data measured using a tissue-equivalent proportional counter (TEPC). The measured flight TEPC data are separated into a low- and a high-lineal-energy spectrum using an amplitude-weighted (137)Cs TEPC spectrum. The high-lineal-energy spectrum is determined by subtracting the low-lineal-energy spectrum from the measured flight TEPC spectrum. With knowledge of E/H for the low- and high-lineal-energy spectra, the total E/H is estimated for a given flight altitude and geographic location. The semi-empirical E/H model also uses new radiation weighting factors to align the model with the most recent ICRP 103 recommendations. The ICRP 103-based semi-empirical effective dose model predicts that there is a ∼30 % reduction in dose in comparison with the ICRP 60-based model. Furthermore, the ambient dose equivalent is now a more conservative dose estimate for jet aircraft altitudes in the range of 7-13 km (FL230-430). This new semi-empirical E/H model is validated against E/H predicted from a Monte Carlo N-Particle transport code simulation of cosmic ray propagation through the Earth's atmosphere. Its implementation allows PCAIRE to provide an accurate semi-empirical estimate of the effective dose.
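
    The spectrum-separation step described above can be sketched directly: scale the (137)Cs reference spectrum so it just fits under the flight TEPC spectrum, subtract to obtain the high-lineal-energy part, then dose-weight the two component E/H values. All spectra and the two E/H numbers below are invented toys, not PCAIRE data:

```python
# toy lineal-energy spectra on a common set of bins (arbitrary dose units)
flight = [4.0, 6.0, 5.0, 2.5, 1.5, 1.0]
cs137 = [2.0, 3.0, 2.0, 0.5, 0.0, 0.0]   # low-lineal-energy reference shape

EH_LOW, EH_HIGH = 0.6, 1.4               # hypothetical component E/H ratios

def split_and_convert(flight, ref, eh_low, eh_high):
    # amplitude weight: largest multiple of ref staying under the flight spectrum
    a = min(f / r for f, r in zip(flight, ref) if r > 0.0)
    low = [a * r for r in ref]
    high = [f - l for f, l in zip(flight, low)]      # residual high-y component
    d_low, d_high = sum(low), sum(high)
    eh_total = (eh_low * d_low + eh_high * d_high) / (d_low + d_high)
    return eh_total, high

eh_total, high = split_and_convert(flight, cs137, EH_LOW, EH_HIGH)
```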

  1. The FLUKA radiation transport code and its use for space problems.

    PubMed

    Ferrari, A; Ranft, J; Sala, P R

    2001-01-01

    FLUKA is a multiparticle transport code capable of handling hadronic and electromagnetic showers up to very high energies (100 TeV), widely used for radioprotection and detector simulation studies. The physical models embedded in FLUKA are briefly described and their capabilities demonstrated against available experimental data. The complete modelling of cosmic ray showers in the Earth's atmosphere with FLUKA is also described, and its relevance for benchmarking the code for space-like environments is discussed. Finally, the ongoing developments of the physical models of the code are presented and discussed.

  2. GARLIC - A general purpose atmospheric radiative transfer line-by-line infrared-microwave code: Implementation and evaluation

    NASA Astrophysics Data System (ADS)

    Schreier, Franz; Gimeno García, Sebastián; Hedelt, Pascal; Hess, Michael; Mendrok, Jana; Vasquez, Mayte; Xu, Jian

    2014-04-01

    A suite of programs for high resolution infrared-microwave atmospheric radiative transfer modeling has been developed with emphasis on efficient and reliable numerical algorithms and a modular approach appropriate for simulation and/or retrieval in a variety of applications. The Generic Atmospheric Radiation Line-by-line Infrared Code - GARLIC - is suitable for arbitrary observation geometry, instrumental field-of-view, and line shape. The core of GARLIC's subroutines constitutes the basis of forward models used to implement inversion codes to retrieve atmospheric state parameters from limb and nadir sounding instruments. This paper briefly introduces the physical and mathematical basics of GARLIC and its descendants and continues with an in-depth presentation of various implementation aspects: An optimized Voigt function algorithm combined with a two-grid approach is used to accelerate the line-by-line modeling of molecular cross sections; various quadrature methods are implemented to evaluate the Schwarzschild and Beer integrals; and Jacobians, i.e. derivatives with respect to the unknowns of the atmospheric inverse problem, are implemented by means of automatic differentiation. For an assessment of GARLIC's performance, a comparison of the quadrature methods for solution of the path integral is provided. Verification and validation are demonstrated using intercomparisons with other line-by-line codes and comparisons of synthetic spectra with spectra observed on Earth and from Venus.
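
    The two-grid acceleration can be sketched with a single Lorentzian line: evaluate the sharply varying line core on the fine grid and interpolate the smooth wing from a coarse grid. GARLIC uses Voigt profiles and optimized grids; the profile, spacings, and core width here are illustrative only:

```python
import math
from bisect import bisect_right

GAMMA = 0.05   # Lorentz half-width, illustrative

def lorentz(x):
    return GAMMA / (math.pi * (x * x + GAMMA * GAMMA))

fine = [-2.0 + 0.001 * i for i in range(4001)]   # fine wavenumber grid
coarse = fine[::25]                              # every 25th point
coarse_vals = [lorentz(x) for x in coarse]

def interp_coarse(x):
    """Linear interpolation of the coarse-grid profile."""
    j = bisect_right(coarse, x) - 1
    j = max(0, min(j, len(coarse) - 2))
    t = (x - coarse[j]) / (coarse[j + 1] - coarse[j])
    return (1.0 - t) * coarse_vals[j] + t * coarse_vals[j + 1]

def twogrid(x, core=0.5):
    # exact evaluation in the line core, interpolated wing elsewhere
    return lorentz(x) if abs(x) < core else interp_coarse(x)

max_rel_err = max(abs(twogrid(x) - lorentz(x)) / lorentz(x) for x in fine)
```

    The wing interpolation stays well under 1% relative error here because the Lorentzian wing is smooth on the coarse spacing; the fine grid is only needed where the profile curvature is large.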

  3. HELIOS: An Open-source, GPU-accelerated Radiative Transfer Code for Self-consistent Exoplanetary Atmospheres

    NASA Astrophysics Data System (ADS)

    Malik, Matej; Grosheintz, Luc; Mendonça, João M.; Grimm, Simon L.; Lavie, Baptiste; Kitzmann, Daniel; Tsai, Shang-Min; Burrows, Adam; Kreidberg, Laura; Bedell, Megan; Bean, Jacob L.; Stevenson, Kevin B.; Heng, Kevin

    2017-02-01

    We present the open-source radiative transfer code named HELIOS, which is constructed for studying exoplanetary atmospheres. In its initial version, the model atmospheres of HELIOS are one-dimensional and plane-parallel, and the equation of radiative transfer is solved in the two-stream approximation with nonisotropic scattering. A small set of the main infrared absorbers is employed, computed with the opacity calculator HELIOS-K and combined using a correlated-k approximation. The molecular abundances originate from validated analytical formulae for equilibrium chemistry. We compare HELIOS with the work of Miller-Ricci & Fortney using a model of GJ 1214b, and perform several tests, where we find: model atmospheres with single-temperature layers struggle to converge to radiative equilibrium; k-distribution tables constructed with ≳ 0.01 cm^-1 resolution in the opacity function (≲ 10^3 points per wavenumber bin) may result in errors ≳ 1%-10% in the synthetic spectra; and a diffusivity factor of 2 approximates well the exact radiative transfer solution in the limit of pure absorption. We construct "null-hypothesis" models (chemical equilibrium, radiative equilibrium, and solar elemental abundances) for six hot Jupiters. We find that the dayside emission spectra of HD 189733b and WASP-43b are consistent with the null hypothesis, while the latter consistently underpredicts the observed fluxes of WASP-8b, WASP-12b, WASP-14b, and WASP-33b. We demonstrate that our results are somewhat insensitive to the choice of stellar models (blackbody, Kurucz, or PHOENIX) and metallicity, but are strongly affected by higher carbon-to-oxygen ratios. The code is publicly available as part of the Exoclimes Simulation Platform (exoclime.net).
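
    The diffusivity-factor statement can be checked numerically: for pure absorption the exact angle-integrated flux transmission is T(tau) = 2 * integral_0^1 mu * exp(-tau/mu) d(mu), which a diffusivity factor D approximates as exp(-D*tau). The quadrature below is a sketch of that comparison (the midpoint rule and point count are arbitrary choices):

```python
import math

def flux_transmission(tau, n=4000):
    """Midpoint quadrature of T(tau) = 2 * int_0^1 mu * exp(-tau/mu) d(mu)."""
    h = 1.0 / n
    total = 0.0
    for i in range(n):
        mu = (i + 0.5) * h
        total += mu * math.exp(-tau / mu)
    return 2.0 * total * h

T_small = flux_transmission(0.05)   # optically thin case, near exp(-2*tau)
```

    The exact result always lies between the D=2 and D=1 envelopes, and approaches exp(-2*tau) as tau goes to zero, consistent with the pure-absorption finding above.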

  4. Code of Practice for the Use of Ionizing Radiations in Secondary Schools.

    ERIC Educational Resources Information Center

    National Health and Medical Research Council, Canberra (Australia).

    The appreciation of the potential hazard of ionizing radiation led to the setting up of national, and later, international commissions for the defining of standards of protection for the occupationally exposed worker in the use of ionizing radiation. However, in the last twenty years, with the large scale development of nuclear energy, the need…

  5. A fast Monte Carlo code for proton transport in radiation therapy based on MCNPX.

    PubMed

    Jabbari, Keyvan; Seuntjens, Jan

    2014-07-01

An important requirement for proton therapy is software for dose calculation. Monte Carlo (MC) is the most accurate method for dose calculation, but it is very slow. In this work, a method is developed to improve the speed of dose calculation, based on pre-generated tracks for particle transport. The MCNPX code was used to generate the tracks: a data set including the particle track was produced in each material of interest (water, air, lung tissue, bone, and soft tissue). The code can transport protons over a wide range of energies (up to 200 MeV). The validity of the fast MC code was evaluated against MCNPX as the reference code. While an analytical pencil-beam algorithm shows large errors (up to 10%) near small high-density heterogeneities, our dose calculations and isodose distributions deviated from the MCNPX results by less than 2%. In terms of speed, the code runs 200 times faster than MCNPX: the fast MC code developed in this work calculates the dose for 10⁶ particles in less than 2 minutes on an Intel Core 2 Duo 2.66 GHz desktop computer.
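The track-repeating idea behind such fast MC codes can be sketched generically: a track pre-generated in water is replayed through the patient geometry, with each geometric step rescaled by the local relative stopping power. Everything below (the track format, the RSP values, the scaling rule) is an illustrative assumption, not the authors' implementation:

```python
import random

# Illustrative "track repeating": a particle track pre-generated in water is
# replayed in a heterogeneous voxel row, scaling each geometric step by the
# local relative stopping power (RSP).  All values below are made up.
RSP = {"water": 1.0, "lung": 0.3, "bone": 1.6}
VOXEL_CM = 0.5

def fake_water_track(n_steps=60, e0=100.0, seed=0):
    """Stand-in for a track produced by a full MC engine (MCNPX in the paper)."""
    rng = random.Random(seed)
    track, e = [], e0
    for _ in range(n_steps):
        dep = min(e, e0 / n_steps * rng.uniform(0.8, 1.2))
        track.append((0.2, dep))      # (step length in water [cm], deposit [MeV])
        e -= dep
    return track

def replay(track, materials):
    """Replay the water track through `materials`, one dose bin per voxel."""
    dose = [0.0] * len(materials)
    depth = 0.0
    for step_water, dep in track:
        voxel = int(depth / VOXEL_CM)
        if voxel >= len(materials):
            break
        dose[voxel] += dep
        # Low-density media lengthen the geometric step, dense media shorten it.
        depth += step_water / RSP[materials[voxel]]
    return dose

row = ["water"] * 4 + ["lung"] * 4 + ["bone"] * 4
print(replay(fake_water_track(), row))
```

The speedup comes from replacing per-step physics sampling with a table replay; only the step-length rescaling depends on the local material.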

  6. A fast Monte Carlo code for proton transport in radiation therapy based on MCNPX

    PubMed Central

    Jabbari, Keyvan; Seuntjens, Jan

    2014-01-01

An important requirement for proton therapy is software for dose calculation. Monte Carlo (MC) is the most accurate method for dose calculation, but it is very slow. In this work, a method is developed to improve the speed of dose calculation, based on pre-generated tracks for particle transport. The MCNPX code was used to generate the tracks: a data set including the particle track was produced in each material of interest (water, air, lung tissue, bone, and soft tissue). The code can transport protons over a wide range of energies (up to 200 MeV). The validity of the fast MC code was evaluated against MCNPX as the reference code. While an analytical pencil-beam algorithm shows large errors (up to 10%) near small high-density heterogeneities, our dose calculations and isodose distributions deviated from the MCNPX results by less than 2%. In terms of speed, the code runs 200 times faster than MCNPX: the fast MC code developed in this work calculates the dose for 10⁶ particles in less than 2 minutes on an Intel Core 2 Duo 2.66 GHz desktop computer. PMID:25190994

  7. Automatic commissioning of a GPU-based Monte Carlo radiation dose calculation code for photon radiotherapy

    NASA Astrophysics Data System (ADS)

    Tian, Zhen; Jiang Graves, Yan; Jia, Xun; Jiang, Steve B.

    2014-10-01

Monte Carlo (MC) simulation is commonly considered the most accurate method for radiation dose calculations. Commissioning a beam model in the MC code against a clinical linear accelerator beam is of crucial importance for its clinical implementation. In this paper, we propose an automatic commissioning method for our GPU-based MC dose engine, gDPM. gDPM utilizes a beam model based on the concept of a phase-space-let (PSL). A PSL contains a group of particles that are of the same type and close in space and energy. A set of generic PSLs was generated by splitting a reference phase-space file. Each PSL was associated with a weighting factor, and in dose calculations each particle carried the weight of the PSL it came from. The dose in water for each PSL was pre-computed, so the dose in water for a whole beam under a given set of PSL weighting factors was the weighted sum of the PSL doses. At the commissioning stage, an optimization problem was solved to adjust the PSL weights so as to minimize the difference between the calculated and measured doses. Symmetry and smoothness regularizations were utilized to uniquely determine the solution, and an augmented Lagrangian method was employed to solve the optimization problem. To validate our method, a phase-space file of a Varian TrueBeam 6 MV beam was used to generate the PSLs for 6 MV beams. In a simulation study, we commissioned a Siemens 6 MV beam for which a set of field-dependent phase-space files was available; the dose data of this desired beam for different open fields and a small off-axis open field were obtained by calculating doses using these phase-space files. The 3D γ-index test passing rate within the regions receiving above 10% of the dmax dose improved on average from 70.56% to 99.36% for the 2%/2 mm criteria and from 32.22% to 89.65% for the 1%/1 mm criteria. We also tested our commissioning method on a six-field head-and-neck cancer IMRT plan. The
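The commissioning step described here is, at its core, a regularized least-squares fit: find nonnegative PSL weights w such that the weighted sum of pre-computed per-PSL doses matches measurement. A toy version with a smoothness penalty (the paper's augmented-Lagrangian solver and symmetry term are replaced by plain projected gradient descent; all numbers are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
n_psl, n_pts = 8, 40
D = rng.random((n_pts, n_psl))          # pre-computed dose of each PSL at each point
w_true = np.linspace(1.0, 2.0, n_psl)   # "measured" beam = smooth PSL weights
d_meas = D @ w_true

L = np.diff(np.eye(n_psl), axis=0)      # first-difference (smoothness) operator

def objective(w, lam=0.1):
    return np.sum((D @ w - d_meas) ** 2) + lam * np.sum((L @ w) ** 2)

def commission(lam=0.1, lr=1e-3, iters=20000):
    """Projected gradient descent on the regularized least-squares objective."""
    w = np.ones(n_psl)
    for _ in range(iters):
        grad = 2 * D.T @ (D @ w - d_meas) + 2 * lam * L.T @ (L @ w)
        w = np.clip(w - lr * grad, 0.0, None)   # keep weights nonnegative
    return w

w_fit = commission()
print(np.round(w_fit, 3))
```

The smoothness term plays the same role as in the paper: it removes the degeneracy among weight sets that reproduce the measured dose equally well.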

  8. MESTRN: A Deterministic Meson-Muon Transport Code for Space Radiation

    NASA Technical Reports Server (NTRS)

    Blattnig, Steve R.; Norbury, John W.; Norman, Ryan B.; Wilson, John W.; Singleterry, Robert C., Jr.; Tripathi, Ram K.

    2004-01-01

A safe and efficient exploration of space requires an understanding of space radiation, so that human life and sensitive equipment can be protected. On the way to such sensitive sites, the radiation fields are modified in both quality and quantity. Many of these modifications are thought to be due to the production of pions and muons in the interactions between the radiation and intervening matter. A method to predict the effect of these particles on the transport of radiation through materials is developed. This method was then used to develop software that calculates the fluxes of pions and muons after the transport of a cosmic-ray spectrum through aluminum and water. Software descriptions are given in the appendices.

  9. ZEUS-2D: A radiation magnetohydrodynamics code for astrophysical flows in two space dimensions. I - The hydrodynamic algorithms and tests.

    NASA Astrophysics Data System (ADS)

    Stone, James M.; Norman, Michael L.

    1992-06-01

    A detailed description of ZEUS-2D, a numerical code for the simulation of fluid dynamical flows including a self-consistent treatment of the effects of magnetic fields and radiation transfer is presented. Attention is given to the hydrodynamic (HD) algorithms which form the foundation for the more complex MHD and radiation HD algorithms. The effect of self-gravity on the flow dynamics is accounted for by an iterative solution of the sparse-banded matrix resulting from discretizing the Poisson equation in multidimensions. The results of an extensive series of HD test problems are presented. A detailed description of the MHD algorithms in ZEUS-2D is presented. A new method of computing the electromotive force is developed using the method of characteristics (MOC). It is demonstrated through the results of an extensive series of MHD test problems that the resulting hybrid MOC-constrained transport method provides for the accurate evolution of all modes of MHD wave families.

  10. The Strategic High-Altitude Atmospheric Radiation Code (SHARC) User Instructions.

    DTIC Science & Technology

    1989-02-03

    40 spectral region. It models radiation due to NLTE (Non-Local Thermodynamic Equilibrium)-molecular emissions which are the dominant sources at these...radiative decay. This leads to a condition of Non-Local Thermodynamic Equilibrium (NLTE) where the various degrees of vibrational, rotational, and...The SHARC INTERPRETER is a modified Sandia interpreter from which information on elements in the periodic table, the thermodynamic data base

  11. Retrieving the Molecular Composition of Planet-Forming Material: An Accurate Non-LTE Radiative Transfer Code for JWST

    NASA Astrophysics Data System (ADS)

    Pontoppidan, Klaus

    Based on the observed distributions of exoplanets and dynamical models of their evolution, the primary planet-forming regions of protoplanetary disks are thought to span distances of 1-20 AU from typical stars. A key observational challenge of the next decade will be to understand the links between the formation of planets in protoplanetary disks and the chemical composition of exoplanets. Potentially habitable planets in particular are likely formed by solids growing within radii of a few AU, augmented by unknown contributions from volatiles formed at larger radii of 10-50 AU. The basic chemical composition of these inner disk regions is characterized by near- to far-infrared (2-200 micron) emission lines from molecular gas at temperatures of 50-1500 K. A critical step toward measuring the chemical composition of planet-forming regions is therefore to convert observed infrared molecular line fluxes, profiles and images to gas temperatures, densities and molecular abundances. However, current techniques typically employ approximate radiative transfer methods and assumptions of local thermodynamic equilibrium (LTE) to retrieve abundances, leading to uncertainties of orders of magnitude and inconclusive comparisons to chemical models. Ultimately, the scientific impact of the high quality spectroscopic data expected from the James Webb Space Telescope (JWST) will be limited by the availability of radiative transfer tools for infrared molecular lines. We propose to develop a numerically accurate, non-LTE 3D line radiative transfer code, needed to interpret mid-infrared molecular line observations of protoplanetary and debris disks in preparation for the James Webb Space Telescope (JWST). This will be accomplished by adding critical functionality to the existing Monte Carlo code LIME, which was originally developed to support (sub)millimeter interferometric observations. 
In contrast to existing infrared codes, LIME calculates the exact statistical balance of arbitrary
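The "statistical balance" such a non-LTE code computes is, per grid cell, a linear system for the level populations: radiative plus collisional rates into each level equal the rates out, closed with a particle-conservation constraint. A minimal two-level-atom sketch (the rate values and mean intensity are illustrative, not taken from LIME):

```python
import numpy as np

# Two-level atom in statistical equilibrium:
#   n1 * (B12*Jbar + C12) = n2 * (A21 + B21*Jbar + C21)
A21, B21, B12 = 1.0e-4, 2.0e6, 6.0e6   # Einstein coefficients (illustrative)
C12, C21 = 3.0e-6, 1.0e-5              # collisional excitation/de-excitation rates
Jbar = 1.0e-11                         # mean radiation intensity in the line

M = np.array([
    [-(B12 * Jbar + C12), A21 + B21 * Jbar + C21],  # rate balance for level 1
    [1.0,                 1.0],                     # conservation: n1 + n2 = 1
])
rhs = np.array([0.0, 1.0])
n1, n2 = np.linalg.solve(M, rhs)
print(n1, n2)
```

In a full non-LTE solver this system is solved for every cell and iterated against the radiation field, since Jbar itself depends on the populations everywhere along the rays.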

  12. Voxel2MCNP: a framework for modeling, simulation and evaluation of radiation transport scenarios for Monte Carlo codes.

    PubMed

    Pölz, Stefan; Laubersheimer, Sven; Eberhardt, Jakob S; Harrendorf, Marco A; Keck, Thomas; Benzler, Andreas; Breustedt, Bastian

    2013-08-21

The basic idea of Voxel2MCNP is to provide a framework supporting users in modeling radiation transport scenarios using voxel phantoms and other geometric models, generating corresponding input for the Monte Carlo code MCNPX, and evaluating simulation output. Applications at Karlsruhe Institute of Technology are primarily whole- and partial-body counter calibration and calculation of dose conversion coefficients. A new generic data model describing data related to radiation transport, including phantom and detector geometries and their properties, sources, tallies, and materials, has been developed. It is modular and generally independent of the targeted Monte Carlo code. The data model has been implemented as an XML-based file format to facilitate data exchange, and integrated with Voxel2MCNP to provide a common interface for modeling, visualization, and evaluation of data. Extensions have also been added for compatibility with several file formats, such as ENSDF for nuclear structure properties and radioactive decay data, SimpleGeo for solid geometry modeling, ImageJ for voxel lattices, and MCNPX's MCTAL for simulation results. The framework is presented and discussed in this paper, and example workflows for body counter calibration and calculation of dose conversion coefficients are given to illustrate its application.

  13. Massive Neutrino Decay Driven Radiative Instabilities, Sub-Structure Survival in Galaxy Clusters and a Nested - Particle-Mesh Code

    NASA Astrophysics Data System (ADS)

    Splinter, Randall John

    1995-01-01

I have performed a series of studies concerning the clustering of mass on large scales in the universe, with the goal of an increased understanding of the role of various processes in the formation of structure. One of the first dark matter candidates was the massive neutrino. In this section I investigate the role of a radiative decay mode for a massive neutrino species and its impact on structure formation. By reviving a concept known as "mock gravity," I attempt to provide seed masses for eventual galaxy formation in a Hot Dark Matter universe. I show that mock gravity is ineffective at generating seed masses for galaxy formation: the ionization rate is too large, and the universe becomes fully ionized well before the radiation pressure can have any effect on the clumping of matter. The final section of this thesis presents a series of N-body experiments aimed at understanding the theoretical sources of substructure in galaxy clusters. I perform a series of simulations using a variety of power-law initial conditions to generate our cluster data sets. From there I use the statistical methods developed by Bird to analyze the subsequent survival of the sub-structures. I find that, for a high-omega universe, a significant number of clusters should exhibit sub-structure for very long periods of time after their formation. To test whether the sub-structure results depend on the resolution of the N-body code, I develop a nested-grid code for performing high-resolution studies of gravitational instability. In the next section I present an N-body code featuring this nested-grid technology, which allows me to extend both the force and mass resolution of a traditional particle-mesh code. The code will prove extremely useful for studying problems in large-scale structure formation where one focuses on highly non-linear objects, and hence force and mass resolution are at a premium. 
In this chapter I

  14. Bayesian Atmospheric Radiative Transfer (BART) Code and Application to WASP-43b

    NASA Astrophysics Data System (ADS)

    Blecic, Jasmina; Harrington, Joseph; Cubillos, Patricio; Bowman, Oliver; Rojo, Patricio; Stemm, Madison; Lust, Nathaniel B.; Challener, Ryan; Foster, Austin James; Foster, Andrew S.; Blumenthal, Sarah D.; Bruce, Dylan

    2016-01-01

We present a new open-source Bayesian radiative-transfer framework, Bayesian Atmospheric Radiative Transfer (BART, https://github.com/exosports/BART), and its application to WASP-43b. BART initializes a model for the atmospheric retrieval calculation, generates thousands of theoretical model spectra using parametrized pressure and temperature profiles and line-by-line radiative-transfer calculations, and employs a statistical package to compare the models with the observations. It consists of three self-sufficient modules available to the community under the reproducible-research license: the Thermochemical Equilibrium Abundances module (TEA, https://github.com/dzesmin/TEA, Blecic et al. 2015), the radiative-transfer module (Transit, https://github.com/exosports/transit), and the Multi-core Markov-chain Monte Carlo statistical module (MCcubed, https://github.com/pcubillos/MCcubed, Cubillos et al. 2015). We applied BART to all available WASP-43b secondary-eclipse data from space- and ground-based observations, constraining the temperature-pressure profile and molecular abundances of the dayside atmosphere of WASP-43b. This work was supported by NASA Planetary Atmospheres grant NNX12AI69G and NASA Astrophysics Data Analysis Program grant NNX13AF38G. JB holds a NASA Earth and Space Science Fellowship.
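The retrieval loop BART implements (propose atmospheric parameters, run the forward model, compare with data) can be caricatured with a plain Metropolis sampler and a trivial forward model. The real forward model is line-by-line radiative transfer; the two-parameter function below is a made-up stand-in, as are the data and step sizes:

```python
import math, random

random.seed(42)

def forward(params):
    """Stand-in for the radiative-transfer model: a toy 'spectrum'."""
    t, log_h2o = params
    return [t * 1e-3 + log_h2o * 0.1 * math.sin(i) for i in range(10)]

true_params = (1400.0, -3.0)          # temperature [K], log water abundance
data = forward(true_params)
sigma = 0.05                          # per-point measurement uncertainty

def log_like(params):
    model = forward(params)
    return -0.5 * sum((m - d) ** 2 / sigma**2 for m, d in zip(model, data))

def metropolis(n_steps=20000, step=(20.0, 0.1)):
    x = (1000.0, -5.0)                # deliberately bad starting point
    lp = log_like(x)
    chain = []
    for _ in range(n_steps):
        prop = tuple(xi + random.gauss(0, s) for xi, s in zip(x, step))
        lp_prop = log_like(prop)
        if math.log(random.random()) < lp_prop - lp:   # Metropolis acceptance
            x, lp = prop, lp_prop
        chain.append(x)
    return chain

chain = metropolis()
t_mean = sum(c[0] for c in chain[5000:]) / len(chain[5000:])
print(t_mean)
```

MCcubed adds multi-core chains, convergence diagnostics, and more capable proposal schemes, but the accept/reject core is the same idea.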

  15. The Air Transport of Radiation (ATR) Code: Development and Testing of ATR5

    DTIC Science & Technology

    1990-01-02

Code for Computing Fission Product Gamma Dose and Dose Rates," RRA-N7236 (October 1972). 42. F.R. Mynatt, et al., "Calculations of the Penetration of...penetration of missile silos. Works such as that by Mynatt, et al. incorporated a method for estimating the contribution of debris radiation to the total.

  16. Quantitative comparisons between experimentally measured 2-D carbon radiation and Monte Carlo impurity (MCI) code simulations

    SciTech Connect

    Evans, T.E.; Leonard, A.W.; West, W.P.; Finkenthal, D.F.; Fenstermacher, M.E.; Porter, G.D.

    1998-08-01

    Experimentally measured carbon line emissions and total radiated power distributions from the DIII-D divertor and Scrape-Off Layer (SOL) are compared to those calculated with the Monte Carlo Impurity (MCI) model. A UEDGE background plasma is used in MCI with the Roth and Garcia-Rosales (RG-R) chemical sputtering model and/or one of six physical sputtering models. While results from these simulations do not reproduce all of the features seen in the experimentally measured radiation patterns, the total radiated power calculated in MCI is in relatively good agreement with that measured by the DIII-D bolometric system when the Smith78 physical sputtering model is coupled to RG-R chemical sputtering in an unaltered UEDGE plasma. Alternatively, MCI simulations done with UEDGE background ion temperatures along the divertor target plates adjusted to better match those measured in the experiment resulted in three physical sputtering models which when coupled to the RG-R model gave a total radiated power that was within 10% of measured value.

  17. General circulation and thermal structure simulated by a Venus AGCM with a two-stream radiative code

    NASA Astrophysics Data System (ADS)

    Yamamoto, Masaru; Ikeda, Kohei; Takahashi, Masaaki

    2016-10-01

An atmospheric general circulation model (AGCM) is expected to be a powerful tool for understanding Venus climate and atmospheric dynamics. At the present stage, however, the full-physics model is under development. Ikeda (2011) developed a two-stream radiative transfer code covering the solar to infrared radiative processes due to gases and aerosol particles. The radiative code was applied to the Venus AGCM (T21L52) at the Atmosphere and Ocean Research Institute, Univ. of Tokyo. We analyzed the results of a simulation spanning a few Venus days, restarted after nudging the zonal wind to a super-rotating state until equilibrium. The simulated thermal structure has a low-stability layer around 10⁵ Pa at low latitudes, and the neutral stability extends from ~10⁵ Pa to the lower atmosphere at high latitudes. At the equatorial cloud top, the temperature is lower in the region between noon and the evening terminator. For zonal and meridional winds, we see differences between the zonal and day-side means. As indicated in previous works, the day-side mean meridional wind speed mostly corresponds to the poleward component of the thermal tide and is much higher than the zonal mean. Toward understanding the dynamical roles of waves in UV cloud tracking and brightness, we calculated the eddy heat and momentum fluxes averaged over the day-side hemisphere. The eddy heat and momentum fluxes are poleward on the poleward flank of the jet. In contrast, the fluxes are relatively weak and equatorward at low latitudes. The eddy momentum flux becomes equatorward when the simulated equatorial wind is weaker than the midlatitude jet. The sensitivity to the zonal flow used for the nudging will also be discussed in the model validation.

  18. AN OPEN-SOURCE NEUTRINO RADIATION HYDRODYNAMICS CODE FOR CORE-COLLAPSE SUPERNOVAE

    SciTech Connect

    O’Connor, Evan

    2015-08-15

    We present an open-source update to the spherically symmetric, general-relativistic hydrodynamics, core-collapse supernova (CCSN) code GR1D. The source code is available at http://www.GR1Dcode.org. We extend its capabilities to include a general-relativistic treatment of neutrino transport based on the moment formalisms of Shibata et al. and Cardall et al. We pay special attention to implementing and testing numerical methods and approximations that lessen the computational demand of the transport scheme by removing the need to invert large matrices. This is especially important for the implementation and development of moment-like transport methods in two and three dimensions. A critical component of neutrino transport calculations is the neutrino–matter interaction coefficients that describe the production, absorption, scattering, and annihilation of neutrinos. In this article we also describe our open-source neutrino interaction library NuLib (available at http://www.nulib.org). We believe that an open-source approach to describing these interactions is one of the major steps needed to progress toward robust models of CCSNe and robust predictions of the neutrino signal. We show, via comparisons to full Boltzmann neutrino-transport simulations of CCSNe, that our neutrino transport code performs remarkably well. Furthermore, we show that the methods and approximations we employ to increase efficiency do not decrease the fidelity of our results. We also test the ability of our general-relativistic transport code to model failed CCSNe by evolving a 40-solar-mass progenitor to the onset of collapse to a black hole.

  19. An Open-source Neutrino Radiation Hydrodynamics Code for Core-collapse Supernovae

    NASA Astrophysics Data System (ADS)

    O'Connor, Evan

    2015-08-01

    We present an open-source update to the spherically symmetric, general-relativistic hydrodynamics, core-collapse supernova (CCSN) code GR1D. The source code is available at http://www.GR1Dcode.org. We extend its capabilities to include a general-relativistic treatment of neutrino transport based on the moment formalisms of Shibata et al. and Cardall et al. We pay special attention to implementing and testing numerical methods and approximations that lessen the computational demand of the transport scheme by removing the need to invert large matrices. This is especially important for the implementation and development of moment-like transport methods in two and three dimensions. A critical component of neutrino transport calculations is the neutrino-matter interaction coefficients that describe the production, absorption, scattering, and annihilation of neutrinos. In this article we also describe our open-source neutrino interaction library NuLib (available at http://www.nulib.org). We believe that an open-source approach to describing these interactions is one of the major steps needed to progress toward robust models of CCSNe and robust predictions of the neutrino signal. We show, via comparisons to full Boltzmann neutrino-transport simulations of CCSNe, that our neutrino transport code performs remarkably well. Furthermore, we show that the methods and approximations we employ to increase efficiency do not decrease the fidelity of our results. We also test the ability of our general-relativistic transport code to model failed CCSNe by evolving a 40-solar-mass progenitor to the onset of collapse to a black hole.

  20. On the Green's function of the partially diffusion-controlled reversible ABCD reaction for radiation chemistry codes

    NASA Astrophysics Data System (ADS)

    Plante, Ianik; Devroye, Luc

    2015-09-01

Several computer codes simulating chemical reactions in particle systems are based on the Green's functions of the diffusion equation (GFDE). Indeed, many types of chemical systems have been simulated using the exact GFDE, which has also become the gold standard for validating other theoretical models. In this work, a simulation algorithm is presented to sample the interparticle distance for the partially diffusion-controlled reversible ABCD reaction. This algorithm is considered exact for two-particle systems, is faster than conventional look-up tables, and uses only a few kilobytes of memory. The simulation results obtained with this method are compared with those obtained with the independent reaction times (IRT) method. This work is part of our effort to develop models for understanding the role of chemical reactions in radiation effects on cells and tissues, and it may eventually be included in event-based models of space radiation risks. Moreover, since many reactions in biological systems are of this type, this algorithm might play a pivotal role in future simulation programs not only in radiation chemistry but also in the simulation of biochemical networks in time and space.
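The computational trick highlighted here, drawing distances directly from a distribution instead of storing look-up tables, can be illustrated with generic inversion sampling: tabulate nothing, just root-find on the CDF. The Gaussian "Green's function" below is a simple placeholder, not the actual ABCD Green's function, which is considerably more involved:

```python
import math, random

def cdf(r, r0=1.0, sigma=0.5):
    """Placeholder CDF: free 1D diffusion from r0 (NOT the ABCD Green's function)."""
    return 0.5 * (1.0 + math.erf((r - r0) / (sigma * math.sqrt(2.0))))

def sample(u, lo=-10.0, hi=10.0, tol=1e-10):
    """Invert the CDF by bisection: find r with cdf(r) = u, for u in (0, 1)."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if cdf(mid) < u:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

random.seed(3)
draws = [sample(random.random()) for _ in range(2000)]
mean = sum(draws) / len(draws)
print(mean)
```

Replacing the placeholder CDF with the exact Green's-function CDF gives an exact sampler whose memory footprint is just the code itself, which is the advantage over pre-tabulated look-up tables.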

  1. On the Green's function of the partially diffusion-controlled reversible ABCD reaction for radiation chemistry codes

    SciTech Connect

    Plante, Ianik; Devroye, Luc

    2015-09-15

Several computer codes simulating chemical reactions in particle systems are based on the Green's functions of the diffusion equation (GFDE). Indeed, many types of chemical systems have been simulated using the exact GFDE, which has also become the gold standard for validating other theoretical models. In this work, a simulation algorithm is presented to sample the interparticle distance for the partially diffusion-controlled reversible ABCD reaction. This algorithm is considered exact for two-particle systems, is faster than conventional look-up tables, and uses only a few kilobytes of memory. The simulation results obtained with this method are compared with those obtained with the independent reaction times (IRT) method. This work is part of our effort to develop models for understanding the role of chemical reactions in radiation effects on cells and tissues, and it may eventually be included in event-based models of space radiation risks. Moreover, since many reactions in biological systems are of this type, this algorithm might play a pivotal role in future simulation programs not only in radiation chemistry but also in the simulation of biochemical networks in time and space.

  2. FY05 LDRD Final Report Molecular Radiation Biodosimetry LDRD Project Tracking Code: 04-ERD-076

    SciTech Connect

    Jones, I M; A.Coleman, M; Lehmann, J; Manohar, C F; Marchetti, F; Mariella, R; Miles, R; Nelson, D O; Wyrobek, A J

    2006-02-03

In the event of a nuclear or radiological accident or terrorist event, it is important to identify individuals who can benefit from prompt medical care and to reassure those who do not need it. Achieving these goals will maximize the ability to manage the medical consequences of radiation exposure, which unfold over a period of hours, days, weeks, or years, depending on dose. Medical interventions that reduce near-term morbidity and mortality from high but non-lethal exposures require advanced medical support and must be focused on those in need as soon as possible. There are two traditional approaches to radiation dosimetry, physical and biological, and each as currently practiced has strengths and limitations. Physical dosimetry for radiation exposure is routine for selected sites and for individual nuclear workers in certain industries, medical centers, and research institutions; no monitoring of individuals in the general population is currently performed. When physical dosimetry is available at the time of an accident/event or soon thereafter, it can provide valuable information in support of accident/event triage. Lack of data for most individuals is a major limitation, as differences in exposure can be significant due to shielding, atmospherics, etc. A smaller issue in terms of the number of people affected is that the same dose may have more or less biological effect on subsets of the population. Biological dosimetry is the estimation of exposure based on physiological or cellular alterations induced in an individual by radiation. The best-established and most precise biodosimetric methods are measurement of the decline of blood cells over time and measurement of the frequency of chromosome aberrations. In accidents or events affecting small numbers of people, it is practical to allocate the resources and time (days of clinical follow-up or specialists' laboratory time) to conduct these studies. However, if large numbers of people have been exposed, or fear they may have

  3. MagRad: A code to optimize the operation of superconducting magnets in a radiation environment

    SciTech Connect

    Yeaw, Christopher T.

    1995-01-01

    A powerful computational tool, called MagRad, has been developed which optimizes magnet design for operation in radiation fields. Specifically, MagRad has been used for the analysis and design modification of the cable-in-conduit conductors of the TF magnet systems in fusion reactor designs. Since the TF magnets must operate in a radiation environment which damages the material components of the conductor and degrades their performance, the optimization of conductor design must account not only for start-up magnet performance, but also shut-down performance. The degradation in performance consists primarily of three effects: reduced stability margin of the conductor; a transition out of the well-cooled operating regime; and an increased maximum quench temperature attained in the conductor. Full analysis of the magnet performance over the lifetime of the reactor includes: radiation damage to the conductor, stability, protection, steady state heat removal, shielding effectiveness, optimal annealing schedules, and finally costing of the magnet and reactor. Free variables include primary and secondary conductor geometric and compositional parameters, as well as fusion reactor parameters. A means of dealing with the radiation damage to the conductor, namely high temperature superconductor anneals, is proposed, examined, and demonstrated to be both technically feasible and cost effective. Additionally, two relevant reactor designs (ITER CDA and ARIES-II/IV) have been analyzed. Upon addition of pure copper strands to the cable, the ITER CDA TF magnet design was found to be marginally acceptable, although much room for both performance improvement and cost reduction exists. A cost reduction of 10-15% of the capital cost of the reactor can be achieved by adopting a suitable superconductor annealing schedule. In both of these reactor analyses, the performance predictive capability of MagRad and its associated costing techniques have been demonstrated.

  4. Code System for Calculating Radiation Exposure to Man from Routine Release of Nuclear Reactor Liquid Effluents.

    SciTech Connect

    1980-02-29

Version 00 LADTAP II calculates the radiation exposure to man from potable water, aquatic foods, shoreline deposits, swimming, boating, and irrigated foods. Doses are calculated for both the maximum individual and the population and are summarized for each pathway by age group and organ. The code also calculates doses to representative biota other than man in the aquatic environment, such as fish, invertebrates, algae, muskrats, raccoons, herons, and ducks, using models presented in WASH-1258.
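Pathway codes of this kind reduce, for each nuclide and pathway, to concentration × usage × dose-conversion factor, summed over pathways. A schematic of that bookkeeping (the nuclide concentrations, usage rates, and dose factors below are invented placeholders, not LADTAP II or WASH-1258 values):

```python
# Schematic ingestion-pathway dose sum in the spirit of LADTAP-type codes.
# All numbers are illustrative placeholders.
water_conc = {"Cs-137": 2.0e-9, "Sr-90": 5.0e-10}   # Ci per liter (made up)

pathways = {
    # pathway: (annual intake [L/yr or kg/yr], transfer/bioaccumulation factor)
    "potable_water": (730.0, 1.0),
    "aquatic_food":  (18.0, 50.0),
}

# Ingestion dose conversion factors, mrem per Ci ingested (made up).
dcf = {"Cs-137": 5.0e7, "Sr-90": 1.4e8}

def annual_dose_mrem():
    """Sum dose contributions over nuclides and pathways."""
    total = 0.0
    for nuclide, conc in water_conc.items():
        for intake, transfer in pathways.values():
            total += conc * transfer * intake * dcf[nuclide]
    return total

print(round(annual_dose_mrem(), 3))
```

The real code carries the same structure per age group and organ, which is why its output is naturally summarized "for each pathway by age group and organ."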

  5. PORTA: A Massively Parallel Code for 3D Non-LTE Polarized Radiative Transfer

    NASA Astrophysics Data System (ADS)

    Štěpán, J.

    2014-10-01

The interpretation of the Stokes profiles of solar (stellar) spectral line radiation requires solving a non-LTE radiative transfer problem that can be very complex, especially when the main interest lies in modeling the linear polarization signals produced by scattering processes and their modification by the Hanle effect. One of the main difficulties is that the plasma of a stellar atmosphere can be highly inhomogeneous and dynamic, which implies the need to solve the non-equilibrium problem of generation and transfer of polarized radiation in realistic three-dimensional (3D) stellar atmospheric models. Here we present PORTA, a computer program we have developed for solving, in 3D models of stellar atmospheres, the problem of the generation and transfer of spectral line polarization, taking into account anisotropic radiation pumping and the Hanle and Zeeman effects in multilevel atoms. The numerical method of solution is based on a highly convergent iterative algorithm, whose convergence rate is insensitive to the grid size, and on an accurate short-characteristics formal solver of the Stokes-vector transfer equation that uses monotonic Bézier interpolation. In addition to the iterative method and the 3D formal solver, another important feature of PORTA is a novel parallelization strategy suitable for taking advantage of massively parallel computers. Linear scaling of the solution with the number of processors allows the solution time to be reduced by several orders of magnitude. We present useful benchmarks and a few illustrations of applications using a 3D model of the solar chromosphere resulting from MHD simulations. Finally, we present our conclusions with a view toward future research. For more details see Štěpán & Trujillo Bueno (2013).
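The "monotonic Bézier interpolation" used in short-characteristics solvers can be illustrated in one dimension: a quadratic Bézier segment between two samples, with the control value clamped so the interpolant cannot overshoot the data (overshoot produces spurious extrema in the source function). The derivative estimate and clamping rule below follow a common monotonicity-limiting recipe, not necessarily PORTA's exact choice:

```python
def bezier_quad(y0, y1, yc, t):
    """Quadratic Bezier: endpoints y0, y1, control value yc, t in [0, 1]."""
    return (1 - t) ** 2 * y0 + 2 * t * (1 - t) * yc + t ** 2 * y1

def control_value(y0, y1, d0):
    """Control point from the left derivative d0 (unit-length segment),
    clamped to [min(y0, y1), max(y0, y1)] so the segment stays monotonic."""
    yc = y0 + 0.5 * d0
    return max(min(yc, max(y0, y1)), min(y0, y1))

# Interpolate between samples 1.0 and 2.0 with a steep incoming slope:
yc = control_value(1.0, 2.0, 5.0)   # unclamped control would be 3.5 -> overshoot
vals = [bezier_quad(1.0, 2.0, yc, t / 10) for t in range(11)]
print(yc, vals[5])
```

With the control value inside the endpoint interval, the quadratic segment is guaranteed monotone between the two samples, which is exactly the property a formal solver needs along each ray.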

  6. NUSTART: A PC code for NUclear STructure And Radiative Transition analysis and supplementation

    SciTech Connect

    Larsen, G.L.; Gardner, D.G.; Gardner, M.A.

    1990-10-01

    NUSTART is a computer program for the IBM PC/AT. It is designed for use with the nuclear reaction cross-section code STAPLUS, which is a STAPRE-based CRAY computer code that is being developed at Lawrence Livermore National Laboratory. The NUSTART code was developed to handle large sets of discrete nuclear levels and the multipole transitions among these levels; it operates in three modes. The Data File Error Analysis mode analyzes an existing STAPLUS input file containing the levels and their multipole transition branches for a number of physics and/or typographical errors. The Interactive Data File Generation mode allows the user to create input files of discrete levels and their branching fractions in the format required by STAPLUS, even though the user enters the information in the (different) format used by many people in the nuclear structure field. In the Branching Fractions Calculations mode, the discrete nuclear level set is read, and the multipole transitions among the levels are computed under one of two possible assumptions: (1) the levels have no collective character, or (2) the levels are all rotational band heads. Only E1, M1, and E2 transitions are considered, and the respective strength functions may be constant or, in the case of E1 transitions, energy dependent. The first option is used for nuclei near closed shells; the bandhead option may be used to vary the E1, M1, and E2 strengths for interband transitions. K-quantum number selection rules may be invoked if desired. 19 refs.

  7. Monitoring Cosmic Radiation Risk: Comparisons between Observations and Predictive Codes for Naval Aviation

    DTIC Science & Technology

    2009-01-01

    various materials can be found by use of the SRIM code [17]. F. Pions/Muons The pion, originally referred to as the π meson, was one of the earliest... These are the lightest mesons and have a very short half-life. In atmospheric interactions, they help produce muons and neutrinos [17]. The π+ is... [table residue: properties of pions (mesons π0, π+, π-) and muons (leptons μ-, μ+), listing charge, linear energy transfer (keV/micron), and lifetimes (π+,-: 2.6×10⁻⁸ s; π0: 0.84×10⁻¹⁶ s)]

  8. Monitoring Cosmic Radiation Risk: Comparisons Between Observations and Predictive Codes for Naval Aviation

    DTIC Science & Technology

    2009-07-05

    materials can be found by use of the SRIM code [17]. F. Pions/Muons The pion, originally referred to as the π meson, was one of the earliest... These are the lightest mesons and have a very short half-life. In atmospheric interactions, they help produce muons and neutrinos [17]. The π+ is... [table residue: properties of pions (mesons π0, π+, π-) and muons (leptons μ-, μ+), listing charge, linear energy transfer (keV/micron), and lifetimes (π+,-: 2.6×10⁻⁸ s; π0: 0.84×10⁻¹⁶ s)]

  9. FESTR: Finite-Element Spectral Transfer of Radiation spectroscopic modeling and analysis code

    DOE PAGES

    Hakel, Peter

    2016-10-01

    Here we report on the development of a new spectral postprocessor for hydrodynamic simulations of hot, dense plasmas. Given time histories of one-, two-, and three-dimensional spatial distributions of materials and their local temperature and density conditions, the code computes spectroscopically resolved signals. The effects of radiation emission and absorption by the plasma on the emergent spectra are taken into account simultaneously. The program can also be used independently of hydrodynamic calculations to analyze available experimental data with the goal of inferring plasma conditions.

  10. Impact of differences in the solar irradiance spectrum on surface reflectance retrieval with different radiative transfer codes

    NASA Technical Reports Server (NTRS)

    Staenz, K.; Williams, D. J.; Fedosejevs, G.; Teillet, P. M.

    1995-01-01

    Surface reflectance retrieval from imaging spectrometer data as acquired with the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) has become important for quantitative analysis. In order to calculate surface reflectance from remotely measured radiance, radiative transfer codes such as 5S and MODTRAN2 play an increasing role in removing the scattering and absorption effects of the atmosphere. Accurate knowledge of the exo-atmospheric solar irradiance (E(sub 0)) spectrum at the spectral resolution of the sensor is important for this purpose. The present study investigates the impact of differences in the solar irradiance function, as implemented in a modified version of 5S (M5S), 6S, and MODTRAN2, and as proposed by Green and Gao, on the surface reflectance retrieved from AVIRIS data. Reflectance measured in situ is used as a basis of comparison.

  11. INTDOS: a computer code for estimating internal radiation dose using recommendations of the International Commission on Radiological Protection

    SciTech Connect

    Ryan, M.T.

    1981-09-01

    INTDOS is a user-oriented computer code designed to calculate estimates of internal radiation dose commitment resulting from the acute inhalation intake of various radionuclides. It is designed so that users unfamiliar with the details of such calculations can obtain results by answering a few questions about the exposure case. The user must identify the radionuclide name, solubility class, particle size, time since exposure, and the measured lung burden. INTDOS calculates the fraction of the lung burden remaining at time t postexposure, considering the solubility class and particle size information. From the fraction remaining in the lung at time t, the quantity inhaled is estimated. Radioactive decay is accounted for in the estimate. Finally, committed effective dose equivalents to various organs and tissues of the body are calculated using inhalation committed dose factors presented by the International Commission on Radiological Protection (ICRP). The code is written in Fortran IV for execution on a Digital Equipment Corporation PDP-10 computer. A flow chart and example calculations are discussed in detail to aid the user who is unfamiliar with computer operations.
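The back-calculation INTDOS performs, from measured lung burden to intake and then to committed dose, can be sketched as follows; the retention fraction, half-life, and dose coefficient below are illustrative placeholders, not values from the code's data library:

```python
import math

def estimate_intake(lung_burden_bq, t_days, half_life_days, lung_frac_retained):
    """Back-calculate the inhaled quantity from a measured lung burden.

    lung_burden_bq     -- measured lung activity at time t (Bq)
    t_days             -- days since the acute inhalation
    half_life_days     -- radioactive half-life of the nuclide
    lung_frac_retained -- biokinetic fraction of the intake still in the
                          lung at time t (from the solubility-class model)
    """
    # Correct the retained fraction for radioactive decay, then invert.
    decay = math.exp(-math.log(2.0) * t_days / half_life_days)
    return lung_burden_bq / (lung_frac_retained * decay)

def committed_dose(intake_bq, dose_coeff_sv_per_bq):
    """Committed dose = intake x inhalation dose coefficient (ICRP-style)."""
    return intake_bq * dose_coeff_sv_per_bq

# Illustrative numbers only: 500 Bq measured 30 d after intake, 10%
# biokinetic retention, 8 d half-life, 7.3e-9 Sv/Bq dose coefficient.
intake = estimate_intake(500.0, 30.0, 8.0, 0.10)
dose = committed_dose(intake, 7.3e-9)
```

The decay correction matters whenever the half-life is comparable to the time since exposure, as in this hypothetical 8-day nuclide.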

  12. Development of parallel monte carlo electron and photon transport (PMCEPT) code III: Applications to medical radiation physics

    NASA Astrophysics Data System (ADS)

    Kum, Oyeon; Han, Youngyih; Jeong, Hae Sun

    2012-05-01

    Minimizing the differences between dose distributions calculated at the treatment planning stage and those delivered to the patient is an essential requirement for successful radiotherapy. Accurate calculation of dose distributions in the treatment planning process is important and can be done only by using a Monte Carlo calculation of particle transport. In this paper, we perform a further validation of our previously developed parallel Monte Carlo electron and photon transport (PMCEPT) code [Kum and Lee, J. Korean Phys. Soc. 47, 716 (2005) and Kim and Kum, J. Korean Phys. Soc. 49, 1640 (2006)] for applications to clinical radiation problems. A linear accelerator, Siemens' Primus 6 MV, was modeled and commissioned. A thorough validation includes both small fields, closely related to intensity modulated radiation treatment (IMRT), and large fields. Two-dimensional comparisons with film measurements were also performed. The PMCEPT results, in general, agreed well with the measured data within a maximum error of about 2%. Considering the experimental errors, the PMCEPT results can thus serve as a gold standard for radiotherapy dose distributions. The computing time was also much shorter than that needed for experiments, although it remains a bottleneck for direct application to the daily routine treatment planning procedure.

  13. Development of radiative transfer code for JUICE/SWI mission toward the atmosphere of icy moons of Jupiter

    NASA Astrophysics Data System (ADS)

    Yamada, Takayoshi; Kasai, Yasuko; Yoshida, Naohiro

    2016-07-01

    The Submillimeter Wave Instrument (SWI) is one of the scientific instruments on the JUpiter ICy moons Explorer (JUICE). We plan to observe atmospheric compositions, including water vapor and its isotopomers, of the Galilean moons (Io, Europa, Ganymede, and Callisto). The frequency windows of SWI are 530 to 625 GHz and 1080 to 1275 GHz with 100 kHz spectral resolution. We are developing a radiative transfer code in Japan that uses a line-by-line method for the Ganymede atmosphere in the THz region (0 - 3 THz). Molecular line parameters (line intensity and partition function) were taken from the JPL (Jet Propulsion Laboratory) catalogue. A pencil beam was assumed to calculate spectra of H2O and CO rotational transitions in the THz region. We performed comparisons between our model and ARTS (Atmospheric Radiative Transfer Simulator). The differences were less than 10% and 5% for H2O and CO, respectively, under the condition of local thermodynamic equilibrium (LTE). Comparisons with several models under a non-LTE assumption will be presented.
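As a rough illustration of the line-by-line approach, a pencil-beam transmission spectrum through a single homogeneous layer can be built by summing line profiles and applying Beer-Lambert attenuation. The line center, strength, and width below are made-up numbers, not JPL catalogue values:

```python
import numpy as np

def lorentz_absorption(freq_ghz, line_centers, strengths, hwhm_ghz):
    """Sum Lorentzian absorption coefficients over a toy line list."""
    k = np.zeros_like(freq_ghz)
    for f0, s in zip(line_centers, strengths):
        k += s * (hwhm_ghz / np.pi) / ((freq_ghz - f0) ** 2 + hwhm_ghz ** 2)
    return k

# Frequency grid around a hypothetical water line near 557 GHz.
freq = np.linspace(550.0, 570.0, 2001)  # GHz

# Absorption coefficient and Beer-Lambert transmission along the pencil beam.
k = lorentz_absorption(freq, line_centers=[556.9], strengths=[5.0],
                       hwhm_ghz=0.5)
path_length = 1.0  # arbitrary units, so k*path is the optical depth
transmission = np.exp(-k * path_length)
```

A real line-by-line code would sum many catalogued lines per species and integrate the absorption along an inhomogeneous atmospheric path; the sketch only shows the per-frequency bookkeeping.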

  14. Coupling External Radiation Transport Code Results to the GADRAS Detector Response Function

    SciTech Connect

    Mitchell, Dean J.; Thoreson, Gregory G.; Horne, Steven M.

    2014-01-01

    Simulating gamma spectra is useful for analyzing special nuclear materials. Gamma spectra are influenced not only by the source and the detector, but also by the external, and potentially complex, scattering environment, which can make accurate representations of gamma spectra difficult to obtain. By coupling the Monte Carlo N-Particle (MCNP) code with the Gamma Detector Response and Analysis Software (GADRAS) detector response function, gamma spectrum simulations can be computed with a high degree of fidelity even in the presence of a complex scattering environment. Traditionally, GADRAS represents the external scattering environment with empirically derived scattering parameters; here, the environment is instead modeled in MCNP and the results are used as input for the GADRAS detector response function. This method was verified with experimental data obtained in an environment with a significant amount of scattering material. The experiment used both gamma-emitting sources and moderated and bare neutron-emitting sources. The sources were modeled using GADRAS and MCNP in the presence of the external scattering environment, producing accurate representations of the experimental data.

  15. Parameterization of the level-resolved radiative recombination rate coefficients for the SPEX code

    NASA Astrophysics Data System (ADS)

    Mao, Junjie; Kaastra, Jelle

    2016-03-01

    The level-resolved radiative recombination (RR) rate coefficients for H-like to Na-like ions from H (Z = 1) up to and including Zn (Z = 30) are studied here. For H-like ions, the exact quantum-mechanical photoionization cross sections for nonrelativistic hydrogenic systems are used to calculate the RR rate coefficients under the principle of detailed balance, while for He-like to Na-like ions, the archival data on ADAS are adopted. Parameterizations are made for the direct capture rates over a wide temperature range. The fitting accuracies are better than 5% for about 99% of the ~3 × 10⁴ levels considered here. The ~1% exceptions are levels from low-charged many-electron ions and/or high-shell (n ≳ 4) levels, which are less important in terms of interpreting X-ray emitting astrophysical plasmas. The RR data will be incorporated into the high-resolution spectral analysis package SPEX. Results of the parameterizations are only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/587/A84
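The parameterization-and-verification step can be illustrated with a toy fit of tabulated rate coefficients against temperature. The power-law rate below is a synthetic stand-in, not the actual SPEX fitting formula or the ADAS data:

```python
import numpy as np

# Synthetic "tabulated" RR rate coefficients: hydrogenic direct-capture
# rates fall roughly as a power of T at low temperature (illustrative
# form and numbers only).
T = np.logspace(4, 8, 40)                  # temperature grid, K
alpha_tab = 2.0e-13 * (T / 1.0e4) ** -0.7  # rate coefficient, cm^3 s^-1

# Parameterize as a power law by a linear fit in log-log space, then
# check the relative fitting accuracy over the whole temperature range.
slope, log_a = np.polyfit(np.log10(T), np.log10(alpha_tab), 1)
alpha_fit = 10.0 ** log_a * (10.0 ** np.log10(T)) ** slope
rel_err = np.abs(alpha_fit / alpha_tab - 1.0)
```

For real level-resolved data the rate is not a pure power law, so the production fit uses a more flexible functional form; the accuracy check against the tabulated values is the same idea.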

  16. Radiation Transport Calculations of a Simple Structure Using the Vehicle Code System with 69-Group Cross Sections and the Monte-Carlo Neutron and Photon Code

    DTIC Science & Technology

    1989-08-01

    Code System (VCS) User’s Manual, Oak Ridge National Laboratory, ORNL-TM-4648 (1974). (UNCLASSIFIED) 3. F.R. Mynatt, F.J. Muckenthaler and P.N…

  17. US medical researchers, the Nuremberg Doctors Trial, and the Nuremberg Code. A review of findings of the Advisory Committee on Human Radiation Experiments.

    PubMed

    Faden, R R; Lederer, S E; Moreno, J D

    1996-11-27

    The Advisory Committee on Human Radiation Experiments (ACHRE), established to review allegations of abuses of human subjects in federally sponsored radiation research, was charged with identifying appropriate standards to evaluate the ethics of cold war radiation experiments. One central question for ACHRE was to determine what role, if any, the Nuremberg Code played in the norms and practices of US medical researchers. Based on the evidence from ACHRE's Ethics Oral History Project and extensive archival research, we conclude that the Code, at the time it was promulgated, had little effect on mainstream medical researchers engaged in human subjects research. Although some clinical investigators raised questions about the conduct of research involving human beings, the medical profession did not pursue this issue until the 1960s.

  18. A Multigroup diffusion solver using pseudo transient continuation for a radiation-hydrodynamic code with patch-based AMR

    SciTech Connect

    Shestakov, A I; Offner, S R

    2006-09-21

    We present a scheme to solve the nonlinear multigroup radiation diffusion (MGD) equations. The method is incorporated into a massively parallel, multidimensional, Eulerian radiation-hydrodynamic code with adaptive mesh refinement (AMR). The patch-based AMR algorithm refines in both space and time, creating a hierarchy of levels, coarsest to finest. The physics modules are time-advanced using operator splitting. On each level, separate 'level-solve' packages advance the modules. Our multigroup level-solve adapts an implicit procedure which leads to a two-step iterative scheme that alternates between elliptic solves for each group and intra-cell group coupling. For robustness, we introduce pseudo transient continuation (Ψtc). We analyze the magnitude of the Ψtc parameter to ensure positivity of the resulting linear system, diagonal dominance, and convergence of the two-step scheme. For AMR, a level defines a subdomain for refinement. For diffusive processes such as MGD, the refined level uses Dirichlet boundary data at the coarse-fine interface, derived from the coarse-level solution. After advancing on the fine level, an additional procedure, the sync-solve (SS), is required in order to enforce conservation. The MGD SS reduces to an elliptic solve on a combined grid for a system of G equations, where G is the number of groups. We adapt the 'partial temperature' scheme for the SS; hence, we reuse the infrastructure developed for scalar equations. Results are presented. We consider a multigroup test problem with a known analytic solution. We demonstrate the utility of Ψtc by running with increasingly larger timesteps. Lastly, we simulate the sudden release of energy Y inside an Al sphere (r = 15 cm) suspended in air at STP. For Y = 11 kT, we find that gray radiation diffusion and MGD produce similar results. However, if Y = 1 MT, the two packages yield different results. Our large Y simulation contradicts a long-standing theory and demonstrates
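A minimal sketch of pseudo transient continuation on a model problem: a stiff nonlinear system with a quartic term standing in for the radiation-matter coupling. This is an illustrative toy, not the AMR code's actual equations or solver:

```python
import numpy as np

def psi_tc_solve(n=20, dtau0=0.1, tol=1e-10, max_iter=200):
    """Pseudo transient continuation (Psi-tc) for the toy nonlinear
    system A u + u^4 = b (illustrative model problem only)."""
    # SPD tridiagonal "diffusion" operator and a unit source.
    A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    b = np.ones(n)
    u = np.zeros(n)
    dtau = dtau0
    F = A @ u + u ** 4 - b
    for _ in range(max_iter):
        # The 1/dtau shift on the diagonal strengthens diagonal dominance
        # early on; as dtau grows, the update approaches a pure Newton step.
        J = A + np.diag(4.0 * u ** 3)
        u = u + np.linalg.solve(np.eye(n) / dtau + J, -F)
        F_new = A @ u + u ** 4 - b
        if np.linalg.norm(F_new) < tol:
            F = F_new
            break
        # Switched evolution relaxation: grow dtau as the residual shrinks.
        dtau *= max(np.linalg.norm(F) / np.linalg.norm(F_new), 1.0)
        F = F_new
    return u, np.linalg.norm(F)

u, residual = psi_tc_solve()
```

The point of the Ψtc parameter is visible in the diagonal shift: a small initial Δτ keeps the linear systems well conditioned when the iterate is far from the solution, and letting Δτ grow recovers Newton's fast local convergence.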

  19. Filling-In, Spatial Summation, and Radiation of Pain: Evidence for a Neural Population Code in the Nociceptive System

    PubMed Central

    Quevedo, Alexandre S.

    2009-01-01

    The receptive field organization of nociceptive neurons suggests that noxious information may be encoded by population-based mechanisms. Electrophysiological evidence of population coding mechanisms has remained limited. However, psychophysical studies examining interactions between multiple noxious stimuli can provide indirect evidence that neuron population recruitment can contribute to both spatial and intensity-related percepts of pain. In the present study, pairs of thermal stimuli (35°C/49°C or 49°C/49°C) were delivered at different distances on the leg (0, 5, 10, 20, 40 cm) and abdomen (within and across dermatomes) and subjects evaluated pain intensity and perceived spatial attributes of stimuli. Reports of perceived pain spreading to involve areas that were not stimulated (radiation of pain) were most frequent at 5- and 10-cm distances (χ2 = 34.107, P < 0.0001). Perceived connectivity between two noxious stimuli (filling-in) was influenced by the distance between stimuli (χ2 = 16.756, P < 0.01), with the greatest connectivity reported at 5- and 10-cm separation distances. Spatial summation of pain occurred over probe separation distances as large as 40 cm and six dermatomes (P < 0.05), but was maximal at 5- and 10-cm separation distances. Taken together, all three of these phenomena suggest that interactions between recruited populations of neurons may support both spatial and intensity-related dimensions of the pain experience. PMID:19759320

  20. ZEUS-2D: A Radiation Magnetohydrodynamics Code for Astrophysical Flows in Two Space Dimensions. II. The Magnetohydrodynamic Algorithms and Tests

    NASA Astrophysics Data System (ADS)

    Stone, James M.; Norman, Michael L.

    1992-06-01

    In this, the second of a series of three papers, we continue a detailed description of ZEUS-2D, a numerical code for the simulation of fluid dynamical flows in astrophysics including a self-consistent treatment of the effects of magnetic fields and radiation transfer. In this paper, we give a detailed description of the magnetohydrodynamical (MHD) algorithms in ZEUS-2D. The recently developed constrained transport (CT) algorithm is implemented for the numerical evolution of the components of the magnetic field for MHD simulations. This formalism guarantees the numerically evolved field components will satisfy the divergence-free constraint at all times. We find, however, that the method used to compute the electromotive forces must be chosen carefully to propagate accurately all modes of MHD wave families (in particular shear Alfvén waves). A new method of computing the electromotive force is developed using the method of characteristics (MOC). It is demonstrated through the results of an extensive series of MHD test problems that the resulting hybrid MOC-CT method provides for the accurate evolution of all modes of MHD wave families.

  1. Using the FLUKA Monte Carlo Code to Simulate the Interactions of Ionizing Radiation with Matter to Assist and Aid Our Understanding of Ground Based Accelerator Testing, Space Hardware Design, and Secondary Space Radiation Environments

    NASA Technical Reports Server (NTRS)

    Reddell, Brandon

    2015-01-01

    Designing hardware to operate in the space radiation environment is a very difficult and costly activity. Ground-based particle accelerators can be used to test for exposure to the radiation environment, one species at a time; however, the actual space environment cannot be duplicated because of the range of energies and the isotropic nature of space radiation. The FLUKA Monte Carlo code is an integrated physics package based at CERN that has been under development for the last 40+ years and includes the most up-to-date fundamental physics theory and particle physics data. This work presents an overview of FLUKA and how it has been used in conjunction with ground-based radiation testing for NASA to improve our understanding of secondary particle environments resulting from the interaction of space radiation with matter.

  2. TRHD: Three-temperature radiation-hydrodynamics code with an implicit non-equilibrium radiation transport using a cell-centered monotonic finite volume scheme on unstructured-grids

    NASA Astrophysics Data System (ADS)

    Sijoy, C. D.; Chaturvedi, S.

    2015-05-01

    A three-temperature (3T), unstructured-mesh, non-equilibrium radiation hydrodynamics (RHD) code has been developed for the simulation of intense thermal radiation or high-power laser driven radiative shock hydrodynamics in two-dimensional (2D) axisymmetric geometries. The governing hydrodynamics equations are solved using a compatible unstructured Lagrangian method based on a control volume differencing (CVD) scheme. A second-order predictor-corrector (PC) integration scheme is used for the temporal discretization of the hydrodynamics equations. For the radiation energy transport, a frequency-averaged gray model is used in which the flux-limited diffusion (FLD) approximation recovers the free-streaming limit of radiation propagation in optically thin regions. The proposed RHD model allows the electrons and ions to have different temperatures. In addition, the electron and thermal radiation temperatures are assumed to be in non-equilibrium. Therefore, the thermal relaxation between the electrons and ions and the coupling between the radiation and matter energies must be computed self-consistently. For this, the coupled flux-limited electron heat conduction and non-equilibrium radiation diffusion equations are solved simultaneously using an implicit, axisymmetric, cell-centered, monotonic, nonlinear finite volume (NLFV) scheme. In this paper, we describe the details of the 2D, 3T, non-equilibrium RHD code along with a suite of validation test problems that demonstrate the accuracy and performance of the algorithms. We have also conducted a performance analysis with the different linearity-preserving interpolation schemes used for the evaluation of nodal values in the NLFV scheme. Finally, to demonstrate the full capability of the code, we present the simulation of laser-driven thin aluminum (Al) foil acceleration. The simulation results are found to be in good agreement.

  3. The Havemann-Taylor Fast Radiative Transfer Code (HT-FRTC) and its application within Tactical Decision Aids (TDAs)

    NASA Astrophysics Data System (ADS)

    Thelen, Jean-Claude; Havemann, Stephan; Wong, Gerald

    2015-10-01

    The Havemann-Taylor Fast Radiative Transfer Code (HT-FRTC) is a core component of the Met Office NEON Tactical Decision Aid (TDA). Within NEON, the HT-FRTC has for a number of years been used to predict the infrared apparent thermal contrasts between different surface types as observed by an airborne sensor. To achieve this, the HT-FRTC is supplied with the inherent temperatures and spectral properties of these surfaces (i.e. ground target(s) and backgrounds). A key strength of the HT-FRTC is its ability to take into account the detailed properties of the atmosphere, which in the context of NEON tend to be provided by a Numerical Weather Prediction (NWP) forecast model. While water vapour and ozone are generally the most important gases, additional trace gases are now being incorporated into the HT-FRTC. The HT-FRTC also includes an exact treatment of atmospheric scattering based on spherical harmonics. This allows for the treatment of several different aerosol species and of liquid and ice clouds. Recent developments can even account for rain and falling snow. The HT-FRTC works in Principal Component (PC) space and is trained on a wide variety of atmospheric and surface conditions, which significantly reduces the computational requirements regarding memory and processing time. One clear-sky simulation takes approximately one millisecond at the time of writing. Recent developments allow the training of HT-FRTC to be both completely generalised and sensor independent. This is significant as the user of the code can add new sensors and new surfaces/targets by supplying extra files which contain their (possibly classified) spectral properties. The HT-FRTC has been extended to cover the spectral range of photopic and NVG sensors. One aim here is to give guidance on the expected, directionally resolved sky brightness, especially at night, again taking the actual or forecast atmospheric conditions into account. Recent developments include light level predictions during

  4. Comparison of the 3D VERB Code Simulations of the Dynamic Evolution of the Outer and Inner Radiation Belts With the Reanalysis Obtained from Observations on Multiple Spacecraft

    NASA Astrophysics Data System (ADS)

    Shprits, Y.; Subbotin, D.; Ni, B.; Daae, M.; Kondrashov, D. A.; Hartinger, M.; Kim, K.; Orlova, K.; Nagai, T.; Friedel, R. H.; Chen, Y.

    2010-12-01

    In this study we present simulations of the inner and outer radiation belts using the Versatile Electron Radiation Belt (VERB) code, accounting for radial, pitch-angle, energy, and mixed diffusion. Quasi-linear diffusion coefficients are computed using the Full Diffusion Code (FDC) for day-side and night-side chorus waves, magnetosonic waves, plasmaspheric hiss waves, EMIC and hiss waves in the regions of plumes, lightning-generated whistlers, and anthropogenic whistlers. Sensitivity simulations show that knowledge of the wave spectral properties and the spatial distribution of the waves is crucially important for reproducing long-term observations. The 3D VERB code simulations are compared to a 3D reanalysis of the radiation belt fluxes obtained by blending the predictive model with observations from LANL GEO, CRRES, Akebono, and GPS. We also discuss the initial results of coupled RCM-VERB simulations. Finally, we present a statistical analysis of radiation belt phase space density obtained from the reanalysis to explore sudden dropouts of the radiation belt fluxes and the location of peaks in phase space density. The application of the developed tools to future measurements on board RBSP is discussed.

  5. Spectral longwave emission in the tropics: FTIR measurements at the sea surface and comparison with fast radiation codes

    SciTech Connect

    Lubin, D.; Cutchin, D.; Conant, W.; Grassl, H.; Schmid, U.; Biselli, W.

    1995-02-01

    Longwave emission by the tropical western Pacific atmosphere has been measured at the ocean surface by a Fourier Transform Infrared (FTIR) spectroradiometer deployed aboard the research vessel John Vickers as part of the Central Equatorial Pacific Experiment. The instrument operated throughout a Pacific Ocean crossing, beginning on 7 March 1993 in Honiara, Solomon Islands, and ending on 29 March 1993 in Los Angeles, and recorded longwave emission spectra under atmospheres associated with sea surface temperatures ranging from 291.0 to 302.8 K. Precipitable water vapor abundances ranged from 1.9 to 5.5 column centimeters. Measured emission spectra (downwelling zenith radiance) covered the middle infrared (5-20 μm) with one inverse centimeter spectral resolution. FTIR measurements made under an entirely clear field of view are compared with spectra generated by LOWTRAN 7 and MODTRAN 2, as well as downwelling flux calculated by the NCAR Community Climate Model (CCM-2) radiation code, using radiosonde profiles as input data for these calculations. In the spectral interval 800-1000 cm⁻¹, these comparisons show a discrepancy between FTIR data and MODTRAN 2 having an overall variability of 6-7 mW m⁻² sr⁻¹ cm and a concave shape that may be related to the representation of water vapor continuum emission in MODTRAN 2. Another discrepancy appears in the spectral interval 1200-1300 cm⁻¹, where MODTRAN 2 appears to overestimate zenith radiance by 5 mW m⁻² sr⁻¹ cm. These discrepancies appear consistently; however, they become only slightly larger at the highest water vapor abundances. Because these radiance discrepancies correspond to broadband (500-2000 cm⁻¹) flux uncertainties of around 3 W m⁻², there appear to be no serious inadequacies in the performance of MODTRAN 2 or LOWTRAN 7 at high atmospheric temperatures and water vapor abundances. 23 refs., 10 figs.

  6. Three dimensional data-assimilative VERB-code simulations of the Earth's radiation belts: Reanalysis during the Van Allen Probe era, and operational forecasting

    NASA Astrophysics Data System (ADS)

    Kellerman, Adam; Shprits, Yuri; Podladchikova, Tatiana; Kondrashov, Dmitri

    2016-04-01

    The Versatile Electron Radiation Belt (VERB) code 2.0 models the dynamics of radiation-belt electron phase space density (PSD) in Earth's magnetosphere. Recently, a data-assimilative version of this code has been developed, which utilizes a split-operator Kalman-filtering approach to solve for electron PSD in terms of adiabatic invariants. A new dataset based on the TS07d magnetic field model is presented, which may be utilized for the analysis of past geomagnetic storms and for initial and boundary conditions in simulations. Further, a data-assimilative forecast model is introduced, which can forecast electron PSD several days into the future, given a forecast Kp index; it assimilates output from an empirical model that forecasts conditions at geosynchronous orbit. The model currently runs in real time, and a forecast is available online at http://rbm.epss.ucla.edu.
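The analysis step of a Kalman filter of this kind can be sketched in a few lines. The dimensions and covariances below are toy values; the operational scheme applies a split-operator filter over the full 3D grid of adiabatic invariants:

```python
import numpy as np

def kalman_update(x_fcst, P_fcst, y_obs, H, R):
    """One Kalman analysis step: blend a model forecast of electron PSD
    with spacecraft observations.

    x_fcst -- forecast state (PSD on a small grid); P_fcst -- its covariance
    y_obs  -- observations; H -- observation operator; R -- obs covariance
    """
    S = H @ P_fcst @ H.T + R                    # innovation covariance
    K = P_fcst @ H.T @ np.linalg.inv(S)         # Kalman gain
    x_anal = x_fcst + K @ (y_obs - H @ x_fcst)  # analysis state
    P_anal = (np.eye(len(x_fcst)) - K @ H) @ P_fcst
    return x_anal, P_anal

# A spacecraft samples only the middle point of a 3-point grid; with an
# accurate observation (small R), the analysis is pulled strongly toward
# the observed value there.
x, P = kalman_update(np.zeros(3), np.eye(3),
                     np.array([1.0]),
                     np.array([[0.0, 1.0, 0.0]]),
                     np.array([[1e-6]]))
```

Because the toy forecast covariance is diagonal, only the observed grid point is corrected; in the real system, spatial correlations in the forecast covariance spread the correction to unobserved invariant-space bins.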

  7. Assessment of shielding analysis methods, codes, and data for spent fuel transport/storage applications. [Radiation dose rates from shielded spent fuels and high-level radioactive waste

    SciTech Connect

    Parks, C.V.; Broadhead, B.L.; Hermann, O.W.; Tang, J.S.; Cramer, S.N.; Gauthey, J.C.; Kirk, B.L.; Roussin, R.W.

    1988-07-01

    This report provides a preliminary assessment of the computational tools and existing methods used to obtain radiation dose rates from shielded spent nuclear fuel and high-level radioactive waste (HLW). Particular emphasis is placed on analysis tools and techniques applicable to facilities/equipment designed for the transport or storage of spent nuclear fuel or HLW. Applications to cask transport, storage, and facility handling are considered. The report reviews the analytic techniques for generating appropriate radiation sources, evaluating the radiation transport through the shield, and calculating the dose at a desired point or surface exterior to the shield. Discrete ordinates, Monte Carlo, and point kernel methods for evaluating radiation transport are reviewed, along with existing codes and data that utilize these methods. A literature survey was employed to select a cadre of codes and data libraries to be reviewed. The selection process was based on specific criteria presented in the report. Separate summaries were written for several codes (or family of codes) that provided information on the method of solution, limitations and advantages, availability, data access, ease of use, and known accuracy. For each data library, the summary covers the source of the data, applicability of these data, and known verification efforts. Finally, the report discusses the overall status of spent fuel shielding analysis techniques and attempts to illustrate areas where inaccuracy and/or uncertainty exist. The report notes the advantages and limitations of several analysis procedures and illustrates the importance of using adequate cross-section data sets. Additional work is recommended to enable final selection/validation of analysis tools that will best meet the US Department of Energy's requirements for use in developing a viable HLW management system. 188 refs., 16 figs., 27 tabs.

  8. Extension of radiative transfer code MOMO, matrix-operator model to the thermal infrared - Clear air validation by comparison to RTTOV and application to CALIPSO-IIR

    NASA Astrophysics Data System (ADS)

    Doppler, Lionel; Carbajal-Henken, Cintia; Pelon, Jacques; Ravetta, François; Fischer, Jürgen

    2014-09-01

    1-D radiative transfer code Matrix-Operator Model (MOMO) has been extended from the [0.2-3.65 μm] band to the full [0.2-100 μm] spectrum. MOMO can now be used for the computation of a full range of radiation budgets (shortwave and longwave). This extension to the longwave part of the electromagnetic spectrum required the treatment of radiative transfer processes specific to the thermal infrared: the spectroscopy of the water vapor self- and foreign-continuum absorption at 12 μm and the emission of radiation by gases, aerosol, clouds and the surface. MOMO's spectroscopy module, Coefficient of Gas Absorption (CGASA), has been developed for the computation of gas extinction coefficients, considering both continua and spectral line absorption. The spectral dependences of gas emission/absorption coefficients and of Planck's function are treated using a k-distribution. The emission of radiation is implemented in the adding-doubling process of the matrix operator method using Schwarzschild's approach to the radiative transfer equation (a purely absorbing/emitting medium, i.e. without scattering). Within each layer, the Planck function is assumed to depend exponentially on optical depth. In this paper, validation tests are presented for clear air case studies: comparisons to the analytical solution of a monochromatic Schwarzschild case without scattering show an error of less than 0.07% for a realistic atmosphere whose optical depth and blackbody temperature decrease linearly with altitude. Comparisons to the radiative transfer code RTTOV are presented for simulations of top-of-atmosphere brightness temperature for channels of the space-borne instrument MODIS. Results show an agreement varying from 0.1 K to less than 1 K depending on the channel. Finally, MOMO results are compared to CALIPSO Infrared Imager Radiometer (IIR) measurements for clear air cases. A good agreement was found between computed and observed radiances: biases are smaller than 0.5 K.
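
    The Schwarzschild layer treatment described above, with a Planck source that varies exponentially with optical depth, admits a closed-form layer solution. The sketch below is illustrative only (not MOMO's actual implementation; all numerical values are made up) and checks the analytic expression against brute-force quadrature:

```python
import numpy as np

def layer_radiance_exp_planck(I0, tau_star, B0, a):
    """Outgoing radiance from a purely absorbing/emitting layer with
    Planck source B(tau) = B0 * exp(a * tau) (Schwarzschild, no
    scattering). Closed-form solution of dI/dtau = B(tau) - I,
    valid for a != -1."""
    return I0 * np.exp(-tau_star) + B0 * np.exp(-tau_star) * (
        np.expm1((a + 1.0) * tau_star) / (a + 1.0)
    )

def layer_radiance_numeric(I0, tau_star, B0, a, n=20001):
    """Brute-force check: I = I0 e^-tau* + integral of B(t) e^-(tau*-t) dt."""
    t = np.linspace(0.0, tau_star, n)
    f = B0 * np.exp(a * t) * np.exp(-(tau_star - t))
    integral = 0.5 * ((f[1:] + f[:-1]) * np.diff(t)).sum()  # trapezoid rule
    return I0 * np.exp(-tau_star) + integral

I_exact = layer_radiance_exp_planck(1.0, 2.0, 0.8, -0.5)
I_num = layer_radiance_numeric(1.0, 2.0, 0.8, -0.5)
```

    The two agree to quadrature accuracy, which is the kind of consistency check the paper's monochromatic clear-air validation performs.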

  9. PORTA: A three-dimensional multilevel radiative transfer code for modeling the intensity and polarization of spectral lines with massively parallel computers

    NASA Astrophysics Data System (ADS)

    Štěpán, Jiří; Trujillo Bueno, Javier

    2013-09-01

    The interpretation of the intensity and polarization of the spectral line radiation produced in the atmosphere of the Sun and of other stars requires solving a radiative transfer problem that can be very complex, especially when the main interest lies in modeling the spectral line polarization produced by scattering processes and the Hanle and Zeeman effects. One of the difficulties is that the plasma of a stellar atmosphere can be highly inhomogeneous and dynamic, which implies the need to solve the non-equilibrium problem of the generation and transfer of polarized radiation in realistic three-dimensional (3D) stellar atmospheric models. Here we present PORTA, an efficient multilevel radiative transfer code we have developed for the simulation of the spectral line polarization caused by scattering processes and the Hanle and Zeeman effects in 3D models of stellar atmospheres. The numerical method of solution is based on the non-linear multigrid iterative method and on a novel short-characteristics formal solver of the Stokes-vector transfer equation which uses monotonic Bézier interpolation. Therefore, with PORTA the computing time needed to obtain at each spatial grid point the self-consistent values of the atomic density matrix (which quantifies the excitation state of the atomic system) scales linearly with the total number of grid points. Another crucial feature of PORTA is its parallelization strategy, which allows us to speed up the numerical solution of complicated 3D problems by several orders of magnitude with respect to sequential radiative transfer approaches, given its excellent linear scaling with the number of available processors. The PORTA code can also be conveniently applied to solve the simpler 3D radiative transfer problem of unpolarized radiation in multilevel systems.
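
    The "monotonic Bézier interpolation" used in short-characteristics formal solvers can be illustrated with a minimal 1-D sketch. The clamping rule below is one common choice and is not necessarily PORTA's exact scheme; all values are illustrative:

```python
import numpy as np

def monotonic_bezier_quadratic(x0, x1, y0, y1, dy0):
    """Quadratic Bezier interpolant on [x0, x1] whose control value is
    clamped to [min(y0, y1), max(y0, y1)], so no overshoot is introduced
    between the endpoints. Returns a callable of t in [0, 1]."""
    h = x1 - x0
    yc = y0 + 0.5 * h * dy0            # control value from upwind derivative
    lo, hi = min(y0, y1), max(y0, y1)
    yc = min(max(yc, lo), hi)          # monotonicity clamp
    def curve(t):
        return (1 - t) ** 2 * y0 + 2 * t * (1 - t) * yc + t ** 2 * y1
    return curve

# A steep upwind derivative would overshoot; the clamp prevents it.
f = monotonic_bezier_quadratic(0.0, 1.0, 0.0, 1.0, dy0=5.0)
vals = [f(t) for t in np.linspace(0.0, 1.0, 11)]
```

    Overshoot-free interpolation matters in formal solvers because a spurious negative or super-maximal source function along a short characteristic can destabilize the iteration.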

  10. Mechanisms of the alternative activation of macrophages and non-coding RNAs in the development of radiation-induced lung fibrosis

    PubMed Central

    Duru, Nadire; Wolfson, Benjamin; Zhou, Qun

    2016-01-01

    Radiation-induced lung fibrosis (RILF) is a common side effect of thoracic irradiation therapy and leads to high mortality rates after cancer treatment. Radiation injury induces inflammatory M1 macrophage polarization leading to radiation pneumonitis, the first stage of RILF progression. Fibrosis occurs due to the transition of M1 macrophages to the anti-inflammatory pro-fibrotic M2 phenotype, and the resulting imbalance of macrophage regulated inflammatory signaling. Non-coding RNA signaling has been shown to play a large role in the regulation of the M2 mediated signaling pathways that are associated with the development and progression of fibrosis. While many studies show the link between M2 macrophages and fibrosis, there are only a few that explore their distinct role and the regulation of their signaling by non-coding RNA in RILF. In this review we summarize the current body of knowledge describing the roles of M2 macrophages in RILF, with an emphasis on the expression and functions of non-coding RNAs. PMID:27957248

  11. Intercomparison of Monte Carlo Radiation Transport Codes MCNPX, GEANT4, and FLUKA for Simulating Proton Radiotherapy of the Eye

    PubMed Central

    Randeniya, S. D.; Taddei, P. J.; Newhauser, W. D.; Yepes, P.

    2010-01-01

    Monte Carlo simulations of an ocular treatment beam-line consisting of a nozzle and a water phantom were carried out using MCNPX, GEANT4, and FLUKA to compare the dosimetric accuracy and the simulation efficiency of the codes. Simulated central-axis percent depth-dose profiles and cross-field dose profiles were compared with experimentally measured data. Simulation speed was evaluated by comparing the number of proton histories simulated per second using each code. The results indicate that all three Monte Carlo transport codes calculate sufficiently accurate proton dose distributions in the eye and that the FLUKA transport code has the highest simulation efficiency. PMID:20865141

  12. RH 1.5D: a massively parallel code for multi-level radiative transfer with partial frequency redistribution and Zeeman polarisation

    NASA Astrophysics Data System (ADS)

    Pereira, Tiago M. D.; Uitenbroek, Han

    2015-02-01

    The emergence of three-dimensional magneto-hydrodynamic simulations of stellar atmospheres has sparked a need for efficient radiative transfer codes to calculate detailed synthetic spectra. We present RH 1.5D, a massively parallel code based on the RH code and capable of performing Zeeman polarised multi-level non-local thermodynamical equilibrium calculations with partial frequency redistribution for an arbitrary number of chemical species. The code calculates spectra from 3D, 2D or 1D atmospheric models on a column-by-column basis (or 1.5D). While the 1.5D approximation breaks down in the cores of very strong lines in an inhomogeneous environment, it is nevertheless suitable for a large range of scenarios and allows for faster convergence with finer control over the iteration of each simulation column. The code scales well to at least tens of thousands of CPU cores, and is publicly available. In the present work we briefly describe its inner workings, strategies for convergence optimisation, its parallelism, and some possible applications.

  13. Development of rotating shadowband spectral radiometers and GCM radiation code test data sets in support of ARM. Technical progress report, September 15, 1990--September 14, 1991

    SciTech Connect

    Harrison, L.; Michalsky, J.

    1991-03-13

    Three separate tasks are included in the first year of the project. Two involve assembling data sets useful for testing radiation models in global climate modeling (GCM) codes, and the third is concerned with the development of advanced instrumentation for performing accurate spectral radiation measurements. Task 1: Three existing data sets have been merged for two locations, one in the wet northeastern US and a second in the dry western US. The data sets are meteorological data from the WBAN network, upper air data from the NCDC, and high quality solar radiation measurements from Albany, New York and Golden, Colorado. These represent test data sets for those modelers developing radiation codes for the GCM models. Task 2: Existing data are not quite adequate from a modeler's perspective without downwelling infrared data and surface albedo, or reflectance, data. Before the deployment of the first CART site in ARM, the authors are establishing this more complete set of radiation measurements at the Albany site, to be operational only until CART is operational. The authors will have the site running by April 1991, which will provide about one year's data from this location. They will coordinate their measurements with satellite overpasses and, to the extent possible, with radiosonde releases, so that the data sets are coincident in time. Task 3: Work has concentrated on the multiple filter instrument. The mechanical, optical, and software engineering for this instrument is complete, and the first field prototype is running at the Rattlesnake Mountain Observatory (RMO) test site. This instrument is performing well, and is already delivering reliable and useful information.

  14. A Study of Longwave Radiation Codes for Climate Studies: Validation with ARM Observations and Tests in General Circulation Models

    SciTech Connect

    Robert G. Ellingson

    2004-09-28

    One specific goal of the Atmospheric Radiation Measurements (ARM) program is to improve the treatment of radiative transfer in General Circulation Models (GCMs) under clear-sky, general overcast and broken cloud conditions. Our project was geared to contribute to this goal by attacking major problems associated with one of the dominant radiation components of the problem: longwave radiation. The primary long-term project objectives were to: (1) develop an optimum longwave radiation model for use in GCMs that has been calibrated with state-of-the-art observations for clear and cloudy conditions, and (2) determine the relative contribution of longwave radiative forcing with an improved algorithm in a GCM when compared to shortwave radiative forcing, sensible heating, thermal advection and convection. The approach has been to build upon existing models in an iterative, predictive fashion. We focused on comparing calculations from a set of models with operationally observed data for clear, overcast and broken cloud conditions. The differences found through the comparisons, together with physical insights, have been used to develop new models, most of which have been tested with new data. Our initial GCM studies used existing GCMs to study the climate model-radiation sensitivity problem. Although this portion of our initial plans was curtailed midway through the project, we anticipate that the eventual outcome of this approach will be both a better longwave radiative forcing algorithm and, through an improved understanding of how longwave radiative forcing influences the model equilibrium climate, guidance on how improvements in climate prediction can be achieved with this algorithm.

  15. CSDUST3 - A radiation transport code for a dusty medium with 1-D planar, spherical or cylindrical geometry

    NASA Technical Reports Server (NTRS)

    Egan, Michael P.; Leung, Chun Ming; Spagna, George F., Jr.

    1988-01-01

    The program solves the radiation transport problem in a dusty medium with one-dimensional planar, spherical or cylindrical geometry. It determines self-consistently the effects of multiple scattering, absorption, and re-emission of photons on the temperature of dust grains and the characteristics of the internal radiation field. The program can treat radiation field anisotropy, linear anisotropic scattering, and multi-grain components. The program output consists of the dust-temperature distribution, flux spectrum, surface brightness at each frequency and the observed intensities (involving a convolution with a telescope beam pattern).

  16. TH-A-19A-11: Validation of GPU-Based Monte Carlo Code (gPMC) Versus Fully Implemented Monte Carlo Code (TOPAS) for Proton Radiation Therapy: Clinical Cases Study

    SciTech Connect

    Giantsoudi, D; Schuemann, J; Dowdell, S; Paganetti, H; Jia, X; Jiang, S

    2014-06-15

    Purpose: For proton radiation therapy, Monte Carlo simulation (MCS) methods are recognized as the gold-standard dose calculation approach. Although previously unrealistic due to limitations in available computing power, GPU-based applications allow MCS of proton treatment fields to be performed in routine clinical use, on time scales comparable to those of conventional pencil-beam algorithms. This study focuses on validating the results of our GPU-based code (gPMC) against a fully implemented proton therapy Monte Carlo code (TOPAS) for clinical patient cases. Methods: Two treatment sites were selected to provide clinical cases for this study: head-and-neck cases, due to anatomical geometric complexity (air cavities and density heterogeneities) that makes dose calculation very challenging, and prostate cases, due to the higher proton energies used and the close proximity of the treatment target to sensitive organs at risk. Both gPMC and TOPAS were used to calculate 3-dimensional dose distributions for all patients in this study. Comparisons were performed based on target coverage indices (mean dose, V90 and D90) and gamma index distributions with criteria of 2% of the prescription dose and 2 mm. Results: For seven out of eight studied cases, mean target dose, V90 and D90 differed by less than 2% between TOPAS and gPMC dose distributions. Gamma index analysis for all prostate patients resulted in a passing rate of more than 99% of voxels in the target. Four out of five head-and-neck cases showed a gamma index passing rate for the target of more than 99%, with the fifth having a passing rate of 93%. Conclusion: Our current work showed excellent agreement between our GPU-based MCS code and a fully implemented proton therapy Monte Carlo code for a group of dosimetrically challenging patient cases.
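
    The gamma-index comparison used in studies like this one can be sketched in one dimension. This is a simple global-gamma implementation for illustration only (clinical tools operate on 3-D dose grids with interpolation); the profiles are synthetic:

```python
import numpy as np

def gamma_index_1d(x, dose_ref, dose_eval, dose_crit=0.02, dist_crit=2.0):
    """1-D global gamma index: for each reference point, the minimum over
    evaluated points of sqrt((dD/dose_crit)^2 + (dx/dist_crit)^2), with
    dose_crit a fraction of the reference maximum and dist_crit in mm."""
    dmax = dose_ref.max()
    gam = np.empty_like(dose_ref)
    for i, (xi, di) in enumerate(zip(x, dose_ref)):
        dd = (dose_eval - di) / (dose_crit * dmax)   # dose difference term
        dx = (x - xi) / dist_crit                    # distance term
        gam[i] = np.sqrt(dd ** 2 + dx ** 2).min()
    return gam

x = np.linspace(0.0, 100.0, 201)          # positions in mm
ref = np.exp(-((x - 50.0) / 15.0) ** 2)   # synthetic dose profile
ev = ref * 1.01                           # evaluated profile: 1% global error
gamma = gamma_index_1d(x, ref, ev)
passing = (gamma <= 1.0).mean() * 100.0   # percent of points with gamma <= 1
```

    A uniform 1% dose error passes a 2%/2 mm test everywhere; shrinking the criteria (e.g. to 1%/1 mm, as in the gPMC validation below at entry 3) makes the same discrepancy start to fail.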

  17. SU-F-18C-09: Assessment of OSL Dosimeter Technology in the Validation of a Monte Carlo Radiation Transport Code for CT Dosimetry

    SciTech Connect

    Carver, D; Kost, S; Pickens, D; Price, R; Stabin, M

    2014-06-15

    Purpose: To assess the utility of optically stimulated luminescent (OSL) dosimeter technology in calibrating and validating a Monte Carlo radiation transport code for computed tomography (CT). Methods: Exposure data were taken using both a standard 100-mm CT pencil ionization chamber and a series of 150-mm OSL CT dosimeters. Measurements were made at system isocenter in air as well as in standard 16-cm (head) and 32-cm (body) CTDI phantoms at isocenter and at the 12 o'clock positions. Scans were performed on a Philips Brilliance 64 CT scanner at 100 and 120 kVp and 300 mAs with a nominal beam width of 40 mm. A radiation transport code to simulate the CT scanner conditions was developed using the GEANT4 physics toolkit. The imaging geometry and associated parameters were simulated for each ionization chamber and phantom combination. Simulated absorbed doses were compared to both CTDI100 values determined from the ion chamber and CTDI100 values reported from the OSLs. The dose profiles from each simulation were also compared to the physical OSL dose profiles. Results: CTDI100 values reported by the ion chamber and OSLs are generally in good agreement (average percent difference of 9%), and provide a suitable way to calibrate doses obtained from simulation to real absorbed doses. Simulated and real CTDI100 values agree to within 10% or less, and the simulated dose profiles also predict the physical profiles reported by the OSLs. Conclusion: Ionization chambers are generally considered the standard for absolute dose measurements. However, OSL dosimeters may also serve as a useful tool with the significant benefit of also assessing the radiation dose profile. This may offer an advantage to those developing simulations for assessing radiation dosimetry, such as verification of the spatial dose distribution and beam width.
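
    The CTDI100 metric compared above has a simple definition: the dose profile integrated over the central ±50 mm, divided by the total nominal beam width N·T. A minimal sketch with an entirely synthetic profile (the numbers are illustrative, not from this study):

```python
import numpy as np

def ctdi100(z_mm, dose_mGy, n_slices, slice_thickness_mm):
    """CTDI100: dose profile integrated over +/-50 mm about the beam
    center, divided by the total nominal beam width N*T."""
    m = np.abs(z_mm) <= 50.0
    z, d = z_mm[m], dose_mGy[m]
    integral = 0.5 * ((d[1:] + d[:-1]) * np.diff(z)).sum()  # trapezoid rule
    return integral / (n_slices * slice_thickness_mm)

def ctdi_weighted(ctdi_center, ctdi_periphery):
    """CTDIw: weighted combination of center and peripheral measurements."""
    return ctdi_center / 3.0 + 2.0 * ctdi_periphery / 3.0

# Synthetic rectangular 40 mm beam profile of 10 mGy (illustrative only)
z = np.linspace(-75.0, 75.0, 301)
profile = np.where(np.abs(z) <= 20.0, 10.0, 0.0)
c100 = ctdi100(z, profile, n_slices=1, slice_thickness_mm=40.0)
cw = ctdi_weighted(c100, 1.3 * c100)
```

    The point the abstract makes is visible here: a pencil chamber reports only the integral (c100), while a long OSL strip also records the shape of `profile`, including scatter tails outside the nominal beam.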

  18. IM3D: A parallel Monte Carlo code for efficient simulations of primary radiation displacements and damage in 3D geometry

    PubMed Central

    Li, Yong Gang; Yang, Yang; Short, Michael P.; Ding, Ze Jun; Zeng, Zhi; Li, Ju

    2015-01-01

    SRIM-like codes have limitations in describing general 3D geometries when modeling radiation displacements and damage in nanostructured materials. A universal, computationally efficient and massively parallel 3D Monte Carlo code, IM3D, has been developed with excellent parallel scaling performance. IM3D is based on fast indexing of scattering integrals and the SRIM stopping power database, and allows the user a choice of Constructive Solid Geometry (CSG) or Finite Element Triangle Mesh (FETM) methods for constructing 3D shapes and microstructures. For 2D films and multilayers, IM3D perfectly reproduces SRIM results, and can be ∼10² times faster in serial execution and >10⁴ times faster using parallel computation. For 3D problems, it provides a fast approach for analyzing the spatial distributions of primary displacements and defect generation under ion irradiation. Herein we also provide a detailed discussion of our open-source collision cascade physics engine, revealing the true meaning and limitations of the “Quick Kinchin-Pease” and “Full Cascades” options. The issues of femtosecond-to-picosecond timescales in defining displacement versus damage, and the limitations of the displacements-per-atom (DPA) unit in quantifying radiation damage (such as its inadequacy in quantifying the degree of chemical mixing), are discussed. PMID:26658477
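
    The "Quick Kinchin-Pease" option mentioned above refers to the modified Kinchin-Pease (NRT) estimate of stable displacements per primary knock-on atom. A minimal sketch; the 40 eV threshold is a common default for metals, and the actual value is material specific:

```python
def nrt_displacements(damage_energy_eV, threshold_eV=40.0):
    """Modified Kinchin-Pease (NRT) estimate of displacements produced
    by a PKA with the given damage energy T:
      0            for T < Ed,
      1            for Ed <= T < 2.5*Ed,
      0.8*T/(2*Ed) for T >= 2.5*Ed."""
    Ed = threshold_eV
    T = damage_energy_eV
    if T < Ed:
        return 0.0
    if T < 2.5 * Ed:
        return 1.0
    return 0.8 * T / (2.0 * Ed)
```

    For example, a 1 keV damage energy with a 40 eV threshold yields 10 displacements; summing this over all PKAs and dividing by the number of atoms gives the DPA figure whose limitations the paper discusses.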

  19. Scaling and performance of a 3-D radiation hydrodynamics code on message-passing parallel computers: final report

    SciTech Connect

    Hayes, J C; Norman, M

    1999-10-28

    This report details an investigation into the efficacy of two approaches to solving the radiation diffusion equation within a radiation hydrodynamic simulation. Because leading-edge scientific computing platforms have evolved from large single-node vector processors to parallel aggregates containing tens to thousands of individual CPU's, the ability of an algorithm to maintain high compute efficiency when distributed over a large array of nodes is critically important. The viability of an algorithm thus hinges upon the tripartite question of numerical accuracy, total time to solution, and parallel efficiency.

  20. Development and Implementation of Photonuclear Cross-Section Data for Mutually Coupled Neutron-Photon Transport Calculations in the Monte Carlo N-Particle (MCNP) Radiation Transport Code

    SciTech Connect

    White, Morgan C.

    2000-07-01

    The fundamental motivation for the research presented in this dissertation was the need to develop a more accurate prediction method for characterization of mixed radiation fields around medical electron accelerators (MEAs). Specifically, a model is developed for simulation of neutron and other particle production from photonuclear reactions and incorporated in the Monte Carlo N-Particle (MCNP) radiation transport code. This extension of the capability within the MCNP code provides for the more accurate assessment of the mixed radiation fields. The Nuclear Theory and Applications group of the Los Alamos National Laboratory has recently provided first-of-a-kind evaluated photonuclear data for a select group of isotopes. These data provide the reaction probabilities as functions of incident photon energy, with angular and energy distribution information for all reaction products. The availability of these data is the cornerstone of the new methodology for state-of-the-art mutually coupled photon-neutron transport simulations. The dissertation includes details of the model development and implementation necessary to use the new photonuclear data within MCNP simulations. A new data format has been developed to include tabular photonuclear data. Data are processed from the Evaluated Nuclear Data Format (ENDF) to the new class "u" A Compact ENDF (ACE) format using a standalone processing code. MCNP modifications have been completed to enable Monte Carlo sampling of photonuclear reactions. Note that both neutron and gamma production are included in the present model. The new capability has been subjected to extensive verification and validation (V&V) testing. Verification testing has established the expected basic functionality. Two validation projects were undertaken. First, comparisons were made to benchmark data from the literature. These calculations demonstrate the accuracy of the new data and transport routines to better than 25 percent. Second, the ability to

  1. Radiation

    NASA Video Gallery

    Outside the protective cocoon of Earth's atmosphere, the universe is full of harmful radiation. Astronauts who live and work in space are exposed not only to ultraviolet rays but also to space radi...

  2. Emission from Very Small Grains and PAH Molecules in Monte Carlo Radiation Transfer Codes: Application to the Edge-On Disk of Gomez's Hamburger

    NASA Astrophysics Data System (ADS)

    Wood, Kenneth; Whitney, Barbara A.; Robitaille, Thomas; Draine, Bruce T.

    2008-12-01

    We have modeled optical to far-infrared images, photometry, and spectroscopy of the object known as Gomez's Hamburger. We reproduce the images and spectrum with an edge-on disk of mass 0.3 M⊙ and radius 1600 AU, surrounding an A0 III star at a distance of 280 pc. Our mass estimate is in excellent agreement with recent CO observations. However, our distance determination is more than an order of magnitude smaller than previous analyses, which inaccurately interpreted the optical spectrum. To accurately model the infrared spectrum we have extended our Monte Carlo radiation transfer codes to include emission from polycyclic aromatic hydrocarbon (PAH) molecules and very small grains (VSG). We do this using precomputed PAH/VSG emissivity files for a wide range of values of the mean intensity of the exciting radiation field. When Monte Carlo energy packets are absorbed by PAHs/VSGs, we reprocess them to other wavelengths by sampling from the emissivity files, thus simulating the absorption and reemission process without reproducing lengthy computations of statistical equilibrium, excitation, and de-excitation in the complex many-level molecules. Using emissivity lookup tables in our Monte Carlo codes gives us the flexibility to use the latest grain physics calculations of PAH/VSG emissivity and opacity that are being continually updated in the light of higher resolution infrared spectra. We find our approach gives a good representation of the observed PAH spectrum from the disk of Gomez's Hamburger. Our models also indicate that the PAHs/VSGs in the disk have a larger scale height than larger radiative equilibrium grains, providing evidence for dust coagulation and settling to the midplane.
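
    The table-lookup reemission strategy described above amounts to inverse-CDF sampling of a tabulated emissivity. The sketch below uses a fake Gaussian "7.7 μm feature" as the table (a real PAH emissivity spans many bands and depends on the local radiation field); only the sampling mechanics are the point:

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_reemission_wavelength(wavelengths, emissivity, n, rng=rng):
    """Draw re-emission wavelengths from a tabulated emissivity, as a
    Monte Carlo code might after a PAH/VSG absorption event: build the
    cumulative distribution over the table and invert it by interpolation."""
    cdf = np.cumsum(emissivity)
    cdf = cdf / cdf[-1]                    # normalize to a proper CDF
    u = rng.random(n)                      # uniform deviates in [0, 1)
    return np.interp(u, cdf, wavelengths)  # inverse-CDF lookup

wav = np.linspace(3.0, 20.0, 200)                 # microns
emis = np.exp(-0.5 * ((wav - 7.7) / 0.8) ** 2)    # placeholder emissivity
samples = sample_reemission_wavelength(wav, emis, 10000)
```

    Because the expensive statistical-equilibrium calculation is baked into the precomputed table, each absorbed energy packet is redistributed with a single table lookup, which is what keeps the Monte Carlo step cheap.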

  3. Validation of a GPU-based Monte Carlo code (gPMC) for proton radiation therapy: clinical cases study

    NASA Astrophysics Data System (ADS)

    Giantsoudi, Drosoula; Schuemann, Jan; Jia, Xun; Dowdell, Stephen; Jiang, Steve; Paganetti, Harald

    2015-03-01

    Monte Carlo (MC) methods are recognized as the gold standard for dose calculation; however, they have not yet replaced analytical methods due to their lengthy calculation times. GPU-based applications allow MC dose calculations to be performed on time scales comparable to conventional analytical algorithms. This study focuses on validating our GPU-based MC code for proton dose calculation (gPMC) against an experimentally validated multi-purpose MC code (TOPAS) and comparing their performance for clinical patient cases. Clinical cases from five treatment sites were selected, covering the full range from very homogeneous patient geometries (liver) to patients with high geometric complexity (air cavities and density heterogeneities in head-and-neck and lung patients), and from short beam range (breast) to large beam range (prostate). Both gPMC and TOPAS were used to calculate 3D dose distributions for all patients. Comparisons were performed based on target coverage indices (mean dose, V95, D98, D50, D02) and gamma index distributions. Dosimetric indices differed by less than 2% between TOPAS and gPMC dose distributions for most cases. Gamma index analysis with a 1%/1 mm criterion resulted in a passing rate of more than 94% of all patient voxels receiving more than 10% of the mean target dose, for all patients except the prostate cases. Although clinically insignificant, gPMC systematically underestimated target dose for prostate cases by 1-2% compared to TOPAS. Correspondingly, the gamma index analysis with the 1%/1 mm criterion failed for most beams for this site, while for a 2%/1 mm criterion passing rates of more than 94.6% of all patient voxels were observed. For the same initial number of simulated particles, calculation time for a single beam for a typical head and neck patient plan decreased from 4 CPU hours per million particles (2.8-2.9 GHz Intel X5600) for TOPAS to 2.4 s per million particles (NVIDIA TESLA C2075) for gPMC. Excellent agreement was

  4. A study of longwave radiation codes for climate studies: Validation with ARM observations and tests in general circulation models. Technical report, 15 September 1990--25 April 1993

    SciTech Connect

    Ellingson, R.G.; Baer, F.

    1993-12-31

    This report summarizes the activities of our group to meet our stated objectives. The report is divided into sections entitled: Radiation Model Testing Activities, General Circulation Model Testing Activities, Science Team Activities, and Publications, Presentations and Meetings. The section on Science Team Activities summarizes our participation with the science team to further advance the observation and modeling programs. Appendix A lists graduate students supported, and post-doctoral appointments, during the project. Reports on the activities during each of the first two years are included as Appendix B. Significant progress has been made in: determining the ability of line-by-line radiation models to calculate the downward longwave flux at the surface; determining the uncertainties in calculating the downwelling radiance and flux at the surface associated with the use of different proposed profiling techniques; intercomparing clear-sky radiance and flux observations with calculations from radiation codes from different climate models; determining the uncertainties associated with estimating N* from surface longwave flux observations; and determining the sensitivity of model calculations to different formulations of the effects of finite-sized clouds.

  5. Monte Carlo Simulation of a 6 MV X-Ray Beam for Open and Wedge Radiation Fields, Using GATE Code.

    PubMed

    Bahreyni-Toosi, Mohammad-Taghi; Nasseri, Shahrokh; Momennezhad, Mahdi; Hasanabadi, Fatemeh; Gholamhosseinian, Hamid

    2014-10-01

    The aim of this study is to provide a control software system, based on Monte Carlo simulation, and calculations of dosimetric parameters of standard and wedge radiation fields, using a Monte Carlo method. GATE version 6.1 (OpenGATE Collaboration) was used to simulate a compact 6 MV linear accelerator system. In order to accelerate the calculations, the phase-space technique and cluster computing (Condor version 7.2.4, Condor Team, University of Wisconsin-Madison) were used. Dosimetric parameters used in treatment planning systems for the standard and wedge radiation fields (10 cm × 10 cm to 30 cm × 30 cm and a 60° wedge), including the percentage depth dose and dose profiles, were determined by both computational and experimental methods. The gamma index was applied to compare calculated and measured results with 3%/3 mm criteria. Almost all calculated data points satisfied the gamma index criteria. Based on the good agreement between calculated and measured results obtained for various radiation fields in this study, GATE may be used as a useful tool for quality control or pretreatment verification procedures in radiotherapy.

  6. Monte Carlo Simulation of a 6 MV X-Ray Beam for Open and Wedge Radiation Fields, Using GATE Code

    PubMed Central

    Bahreyni-Toosi, Mohammad-Taghi; Nasseri, Shahrokh; Momennezhad, Mahdi; Hasanabadi, Fatemeh; Gholamhosseinian, Hamid

    2014-01-01

    The aim of this study is to provide a control software system, based on Monte Carlo simulation, and calculations of dosimetric parameters of standard and wedge radiation fields, using a Monte Carlo method. GATE version 6.1 (OpenGATE Collaboration) was used to simulate a compact 6 MV linear accelerator system. In order to accelerate the calculations, the phase-space technique and cluster computing (Condor version 7.2.4, Condor Team, University of Wisconsin–Madison) were used. Dosimetric parameters used in treatment planning systems for the standard and wedge radiation fields (10 cm × 10 cm to 30 cm × 30 cm and a 60° wedge), including the percentage depth dose and dose profiles, were determined by both computational and experimental methods. The gamma index was applied to compare calculated and measured results with 3%/3 mm criteria. Almost all calculated data points satisfied the gamma index criteria. Based on the good agreement between calculated and measured results obtained for various radiation fields in this study, GATE may be used as a useful tool for quality control or pretreatment verification procedures in radiotherapy. PMID:25426430

  7. Reanalysis of Radiation Belt Electron Phase Space Density using the UCLA 1-D VERB code and Kalman filtering: Correlation between the inner edge of the outer radiation belt phase space density and the plasmapause location

    NASA Astrophysics Data System (ADS)

    Espy, P. J.; Daae, M.; Shprits, Y.

    2010-12-01

    The correlation between the inner edge of the outer radiation belt phase space density (PSD) and the plasmapause location (Lpp) using reanalysis is investigated. A large data set is applied for the statistical analysis, using data from 1990-1991 from the CRRES satellite, GEO 1989, GPS-ns18 and Akebono. These data are incorporated into the reanalysis by means of a Kalman filter with the UCLA 1-D VERB code. The result is a continuous radial and temporal distribution of the PSD from L*=3 to L*=7. The innovation vector of the reconstructed PSD can give us information about regions where local loss or source processes dominate. We analyze both the PSD and the innovation vector by binning them into slots of Dst and Kp values. This was done by finding the times when Dst (Kp) fell within bins of width 20 nT (1) spanning 10 nT to -130 nT (1 to 8). The PSD and innovation vector were then averaged over each of those times. The result shows a good correlation between the location of the inner edge of the outer radiation belt in the PSD and the location of the plasmapause, which is consistent with previous observations. The boundary between the inner edge of the radiation belt and the Lpp becomes sharper, and the radiation belt becomes thinner, during times of high geomagnetic activity. The innovation vector shows that the inner edge of the source region also lines up well with the Lpp, further indicating a competition between loss and source processes during active times. This study also illustrates how data assimilation in the radiation belts can be used to understand the underlying processes of acceleration and loss in the inner magnetosphere.
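
    The reanalysis step described above, blending a model forecast of PSD with sparse satellite observations and inspecting the innovation, is a standard linear Kalman update. The sketch below is illustrative only (toy matrix sizes and values, not the VERB code's internals):

```python
import numpy as np

def kalman_update(x_model, P_model, y_obs, R_obs, H):
    """One Kalman analysis step: blend a model forecast x_model (with
    covariance P_model) with observations y_obs (covariance R_obs)
    mapped through observation operator H."""
    innovation = y_obs - H @ x_model          # obs minus model in obs space
    S = H @ P_model @ H.T + R_obs             # innovation covariance
    K = P_model @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_analysis = x_model + K @ innovation
    P_analysis = (np.eye(len(x_model)) - K @ H) @ P_model
    return x_analysis, P_analysis, innovation

# Model PSD on five L* grid points; one satellite samples grid point 2.
x = np.array([1.0, 2.0, 3.0, 2.5, 1.5])
P = np.eye(5) * 0.5
H = np.zeros((1, 5)); H[0, 2] = 1.0
xa, Pa, innov = kalman_update(x, P, np.array([3.6]), np.eye(1) * 0.1, H)
```

    A persistently positive innovation at a grid point flags a local source the model is missing, while a persistently negative one flags unmodeled loss, which is exactly how the study localizes source and loss regions relative to the plasmapause.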

  8. The PHARO Code.

    DTIC Science & Technology

    1981-11-24

    Keywords: visible radiation; infrared radiation; sensors; line and band transitions; isophots; high-altitude nuclear data...radiation (watts/sr) in arbitrary wavelength intervals is determined. The results are a series of "isophot" plots for arbitrarily placed cameras or sensors...Section II. The output of the PHARO code consists of contour plots of radiative intensity (watts/cm² sr), or "isophot" plots, for arbitrarily placed sensors

  9. The Microwave Applications Theory Program at NRL and Some Chemistry Code Applications to Ionospheric Heating by Microwave Radiation.

    DTIC Science & Technology

    1980-08-26

    RADIATION 1. INTRODUCTION The advent of high-power pulsed microwave devices, the magnetrons, at NRL, which currently generate ~1 GWatt at λ ~10 cm, and a...separation needed to sustain such a plasma. (g) relaxation of the disturbed air and the impact of the late-time air chemistry on multi-pulse breakdown...and the first negative bands of N2+. These two band systems correspond to the N2+(B²Σ - X²Σ) and N2(C³Πu - B³Πg) transitions, respectively. The

  10. The LENS Facilities and Experimental Studies to Evaluate the Modeling of Boundary Layer Transition, Shock/Boundary Layer Interaction, Real Gas, Radiation and Plasma Phenomena in Contemporary CFD Codes

    DTIC Science & Technology

    2010-04-01

    [Figure labels: Center Body; Mylar Diaphragms; Nozzle Throat; Evacuated Test Section; Test Model; Tunnel Loaded, Ready to Fire; To Vacuum Tank; Tunnel Started by Pressurizing...] ...Layer Interaction, Real Gas, Radiation and Plasma Phenomena in Contemporary CFD Codes. Michael S. Holden, PhD, CUBRC, Inc., 4455 Genesee Street, Buffalo...

  11. An Intercomparison of Radiation Codes for Retrieving Upper Tropospheric Humidity in the 6.3-micron Band: A Report from the 1st GVaP Workshop

    NASA Technical Reports Server (NTRS)

    Soden, B.; Tjemkes, S.; Schmetz, J.; Saunders, R.; Bates, J.; Ellingson, B.; Engelen, R.; Garand, L.; Jackson, D.; Jedlovec, G.

    1999-01-01

    An intercomparison of radiation codes used in retrieving upper tropospheric humidity (UTH) from observations in the ν2 (6.3 μm) water vapor absorption band was performed. This intercomparison is one part of a coordinated effort within the GEWEX Water Vapor Project (GVaP) to assess our ability to monitor the distribution and variations of upper tropospheric moisture from space-borne sensors. A total of 23 different codes, ranging from detailed line-by-line (LBL) models, to coarser-resolution narrow-band (NB) models, to highly parameterized single-band (SB) models participated in the study. Forward calculations were performed using a carefully selected set of temperature and moisture profiles chosen to be representative of a wide range of atmospheric conditions. The LBL model calculations exhibited the greatest consistency with each other, typically agreeing to within 0.5 K in terms of the equivalent blackbody brightness temperature (Tb). The majority of NB and SB models agreed to within ±1 K of the LBL models, although a few older models exhibited systematic Tb biases in excess of 2 K. A discussion of the discrepancies between various models, their association with differences in model physics (e.g. continuum absorption), and their implications for UTH retrieval and radiance assimilation is presented.

  12. A review of the use and potential of the GATE Monte Carlo simulation code for radiation therapy and dosimetry applications.

    PubMed

    Sarrut, David; Bardiès, Manuel; Boussion, Nicolas; Freud, Nicolas; Jan, Sébastien; Létang, Jean-Michel; Loudos, George; Maigne, Lydia; Marcatili, Sara; Mauxion, Thibault; Papadimitroulas, Panagiotis; Perrot, Yann; Pietrzyk, Uwe; Robert, Charlotte; Schaart, Dennis R; Visvikis, Dimitris; Buvat, Irène

    2014-06-01

    In this paper, the authors review the applicability of the open-source GATE Monte Carlo simulation platform based on the GEANT4 toolkit for radiation therapy and dosimetry applications. The many applications of GATE for state-of-the-art radiotherapy simulations are described, including external beam radiotherapy, brachytherapy, intraoperative radiotherapy, hadrontherapy, molecular radiotherapy, and in vivo dose monitoring. Investigations that have been performed using GEANT4 only are also mentioned to illustrate the potential of GATE. The very practical feature of GATE, making it easy to model both a treatment and an imaging acquisition within the same framework, is emphasized. The computational times associated with several applications are provided to illustrate the practical feasibility of the simulations using current computing facilities.

  13. A review of the use and potential of the GATE Monte Carlo simulation code for radiation therapy and dosimetry applications

    SciTech Connect

    Sarrut, David; Bardiès, Manuel; Marcatili, Sara; Mauxion, Thibault; Boussion, Nicolas; Freud, Nicolas; Létang, Jean-Michel; Jan, Sébastien; Maigne, Lydia; Perrot, Yann; Pietrzyk, Uwe; Robert, Charlotte; and others

    2014-06-15

    In this paper, the authors review the applicability of the open-source GATE Monte Carlo simulation platform based on the GEANT4 toolkit for radiation therapy and dosimetry applications. The many applications of GATE for state-of-the-art radiotherapy simulations are described, including external beam radiotherapy, brachytherapy, intraoperative radiotherapy, hadrontherapy, molecular radiotherapy, and in vivo dose monitoring. Investigations that have been performed using GEANT4 only are also mentioned to illustrate the potential of GATE. The very practical feature of GATE, making it easy to model both a treatment and an imaging acquisition within the same framework, is emphasized. The computational times associated with several applications are provided to illustrate the practical feasibility of the simulations using current computing facilities.

  14. Compressible Astrophysics Simulation Code

    SciTech Connect

    Howell, L.; Singer, M.

    2007-07-18

    This is an astrophysics simulation code involving a radiation diffusion module developed at LLNL coupled to compressible hydrodynamics and adaptive mesh infrastructure developed at LBNL. One intended application is to neutrino diffusion in core collapse supernovae.

  15. Montecarlo simulation code in optimisation of the IntraOperative Radiation Therapy treatment with mobile dedicated accelerator

    NASA Astrophysics Data System (ADS)

    Catalano, M.; Agosteo, S.; Moretti, R.; Andreoli, S.

    2007-06-01

    The principle of optimisation of the EURATOM 97/43 directive foresees that for all medical exposure of individuals for radiotherapeutic purposes, exposures of target volumes shall be individually planned, taking into account that doses to non-target volumes and tissues shall be as low as reasonably achievable and consistent with the intended radiotherapeutic purpose of the exposure. Treatment optimisation has to be carried out especially in non-conventional radiotherapeutic procedures, such as Intra Operative Radiation Therapy (IORT) with a mobile dedicated LINear ACcelerator (LINAC), which does not make use of a Treatment Planning System. IORT is carried out with electron beams and refers to the application of radiation during a surgical intervention, after the removal of a neoplastic mass; it can also be used as a one-time/stand-alone treatment in initial cancer of small volume. IORT foresees a single session and a single beam only; therefore it is necessary to use protection systems (disks) temporarily positioned between the target volume and the underlying tissues, along the beam axis. A single high-Z shielding disk is used to stop the electrons of the beam at a certain depth and protect the tissues located below. Electron backscatter produces an enhancement in the dose above the disk, which can be reduced if a second low-Z disk is placed above the first. Therefore two protection disks are used in clinical application. On the other hand, the dose enhancement at the interface of the high-Z disk and the target, due to backscattered radiation, can usefully be exploited to improve the uniformity in treatment of thicker target volumes. Furthermore, the dose above disks of different Z material has to be evaluated in order to study the optimal combination of shielding disks that both protects the underlying tissues and yields the most uniform dose distribution in target volumes of different thicknesses. The dose enhancement can be evaluated using the electron

  16. Total effective dose equivalent assessment after exposure to high-level natural radiation using the RESRAD code.

    PubMed

    Ziajahromi, Shima; Khanizadeh, Meysam; Nejadkoorki, Farhad

    2014-03-01

    The current work reports the activity concentrations of several natural radionuclides ((226)Ra, (232)Th, and (40)K) in the Khak-Sefid area of Ramsar, Iran. An evaluation of the total effective dose equivalent (TEDE) from exposure to high-level natural radiation is also presented. Soil samples were analyzed using a high-purity germanium detector with 80 % relative efficiency. The TEDE was calculated for a land area of 40,000 m(2) with a 1.5-m-thick contaminated zone for members of three critical groups (farmer, construction worker, and resident) using the Residual Radioactive Material Guidelines (RESRAD) modeling program. It was found that the mean activity concentrations (in Bq/kg) were 23,118 ± 468, 25.8 ± 2.3, and 402.6 ± 16.5 for (226)Ra, (232)Th, and (40)K, respectively. The maximum calculated TEDE during 1,000 years was 107.1 mSv/year at year 90, 92.42 mSv/year at year 88, and 22.09 mSv/year at year 46 for the farmer, resident, and construction worker scenarios, respectively. The maximum TEDE in the farmer scenario can be reduced below the dose limit of 1 mSv/year, which is safe for public health, by using a soil cover with a thickness of 50 cm or more on the contaminated zone. According to the RESRAD prediction, the TEDE received by individuals in all exposure scenarios considerably exceeds the set dose limit, mainly due to (226)Ra.

  17. Polar Codes

    DTIC Science & Technology

    2014-12-01

    density parity check (LDPC) code, a Reed–Solomon code, and three convolutional codes. ...the most common. Many civilian systems use low density parity check (LDPC) FEC codes, and the Navy is planning to use LDPC for some future systems...other forward error correction methods: a turbo code, a low density parity check (LDPC) code, a Reed–Solomon code, and three convolutional codes

  18. Spacecraft Solar Particle Event (SPE) Shielding: Shielding Effectiveness as a Function of SPE model as Determined with the FLUKA Radiation Transport Code

    NASA Technical Reports Server (NTRS)

    Koontz, Steve; Atwell, William; Reddell, Brandon; Rojdev, Kristina

    2010-01-01

    Analysis of both satellite and surface neutron monitor data demonstrates that the widely utilized Exponential model of solar particle event (SPE) proton kinetic energy spectra can seriously underestimate SPE proton flux, especially at the highest kinetic energies. The more recently developed Band model produces better agreement with neutron monitor data for ground level events (GLEs) and is believed to be considerably more accurate at high kinetic energies. Here, we report the results of modeling and simulation studies in which the radiation transport code FLUKA (FLUktuierende KAskade) is used to determine the changes in total ionizing dose (TID) and single-event environments (SEE) behind aluminum, polyethylene, carbon, and titanium shielding masses when the assumed form (i.e., Band or Exponential) of the solar particle event (SPE) kinetic energy spectra is changed. The FLUKA simulations are fully three-dimensional, with an isotropic particle flux incident on a concentric spherical-shell shielding mass and detector structure. The effects are reported for both energetic primary protons penetrating the shield mass and secondary particle showers caused by energetic primary protons colliding with shielding mass nuclei. Our results, in agreement with previous studies, show that use of the Exponential form of the event

  19. Radiation Therapy: Additional Treatment Options

    MedlinePlus

    ... infections. This is referred to as immunotherapy. Intraoperative Radiation Therapy: Radiation therapy given during surgery is called ...

  20. Clinical coding. Code breakers.

    PubMed

    Mathieson, Steve

    2005-02-24

    --The advent of payment by results has seen the role of the clinical coder pushed to the fore in England. --Examinations for a clinical coding qualification began in 1999. In 2004, approximately 200 people took the qualification. --Trusts are attracting people to the role by offering training from scratch or through modern apprenticeships.

  1. MO-G-BRE-05: Clinical Process Improvement and Billing in Radiation Oncology: A Case Study of Applying FMEA for CPT Code 77336 (continuing Medical Physics Consultation)

    SciTech Connect

    Spirydovich, S; Huq, M

    2014-06-15

    Purpose: The improvement of quality in healthcare can be assessed by Failure Mode and Effects Analysis (FMEA). In radiation oncology, FMEA, as applied to the billing CPT code 77336, can improve both charge capture and, most importantly, the quality of the performed services. Methods: We created an FMEA table for the process performed under CPT code 77336. For a given process step, each member of the assembled team (physicist, dosimetrist, and therapist) independently assigned numerical values for: probability of occurrence (O, 1–10), severity (S, 1–10), and probability of detection (D, 1–10) for every failure mode cause and effect combination. The risk priority number, RPN, was then calculated as the product of O, S and D, from which an average RPN was calculated for each combination mentioned above. A fault tree diagram, with each process sorted into 6 categories, was created with linked RPNs. For processes with high RPN, recommended actions were assigned. Two separate record-and-verify systems (Lantis and EMR-based ARIA) were considered. Results: We identified 9 potential failure modes and 19 corresponding potential causes, all resulting in an unjustified 77336 charge and compromised quality of care. In Lantis, RPN values ranged from 24.5 to 110.8, and S values from 2 to 10. The highest-ranking RPN of 110.8 came from the failure mode described as "end-of-treatment check not done before the completion of treatment", and the highest S value of 10 (RPN=105) from "overrides not checked". For the same failure modes, within the ARIA electronic environment with its additional controls, RPN values were significantly lower (44.3 for the missing end-of-treatment check and 20.0 for overrides not checked). Conclusion: Our work has shown that where charge capture was missed, some services had also not been performed. Absence of such necessary services may result in sub-optimal quality of care rendered to patients.
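    The RPN bookkeeping this abstract describes (each scorer independently assigns O, S, D; RPN = O × S × D; scores are averaged across the team) can be sketched as follows. The scores below are invented for illustration, not the study's values:

    ```python
    def average_rpn(scores):
        """Average risk priority number over independent scorers.
        scores: list of (O, S, D) tuples, one per team member,
        each value on the 1-10 scale used in FMEA."""
        rpns = [o * s * d for o, s, d in scores]
        return sum(rpns) / len(rpns)

    # hypothetical physicist, dosimetrist, therapist scores for one
    # failure-mode cause/effect combination
    scores = [(4, 9, 3), (5, 10, 2), (4, 8, 3)]   # RPNs: 108, 100, 96
    rpn = average_rpn(scores)                      # -> 101.33...
    ```

    Combinations whose averaged RPN ranks highest (like the 110.8 case above) are the ones assigned recommended actions.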

  2. Radiator technology

    NASA Technical Reports Server (NTRS)

    Juhasz, Albert J.

    1993-01-01

    Radiator technology is discussed in the context of the Civilian Space Technology Initiative's (CSTI's) high capacity power-thermal management project. The CSTI project is a subset of a project to develop a piloted Mars nuclear electric propulsion (NEP) vehicle. The following topics are presented in vugraph form: advanced radiator concepts; heat pipe codes and testing; composite materials; radiator design and integration; and surface morphology.

  3. Combustion chamber analysis code

    NASA Technical Reports Server (NTRS)

    Przekwas, A. J.; Lai, Y. G.; Krishnan, A.; Avva, R. K.; Giridharan, M. G.

    1993-01-01

    A three-dimensional, time dependent, Favre averaged, finite volume Navier-Stokes code has been developed to model compressible and incompressible flows (with and without chemical reactions) in liquid rocket engines. The code has a non-staggered formulation with generalized body-fitted-coordinates (BFC) capability. Higher order differencing methodologies such as MUSCL and Osher-Chakravarthy schemes are available. Turbulent flows can be modeled using any of the five turbulent models present in the code. A two-phase, two-liquid, Lagrangian spray model has been incorporated into the code. Chemical equilibrium and finite rate reaction models are available to model chemically reacting flows. The discrete ordinate method is used to model effects of thermal radiation. The code has been validated extensively against benchmark experimental data and has been applied to model flows in several propulsion system components of the SSME and the STME.

  4. Electrical Circuit Simulation Code

    SciTech Connect

    Wix, Steven D.; Waters, Arlon J.; Shirley, David

    2001-08-09

    Massively-Parallel Electrical Circuit Simulation Code. CHILESPICE is a massively-parallel distributed-memory electrical circuit simulation tool that contains many enhanced radiation, time-based, and thermal features and models. Large-scale electronic circuit simulation. Shared memory, parallel processing, enhanced convergence. Sandia-specific device models.

  5. Ethical coding.

    PubMed

    Resnik, Barry I

    2009-01-01

    It is ethical, legal, and proper for a dermatologist to maximize income through proper coding of patient encounters and procedures. The overzealous physician can misinterpret reimbursement requirements or receive bad advice from other physicians and cross the line from aggressive coding to coding fraud. Several of the more common problem areas are discussed.

  6. Comparison of measured responses in two spectrally-sensitive X-ray detectors to predictions obtained using the its radiation transport code

    SciTech Connect

    Carlson, G.A.; Beutler, D.E.; Seager, K.D.; Knott, D.P.

    1988-12-01

    Responses of a Ge detector and a filtered TLD array detector have been measured at a steady-state bremsstrahlung source (the Pelletron), at endpoint energies from 150 to 900 keV. Predictions of detector response using Monte Carlo ITS codes are found to be in excellent agreement with measured responses for both detectors. These results extend the range of validity of the ITS codes. With calibration provided by these experiments and by ITS predictions, dose-depth data from the TLD arrays can be used to estimate flash X-ray source endpoint energies.

  7. Comparison of measured responses in two spectrally-sensitive x-ray detectors to predictions obtained using the ITS (Integrated Tiger Series) radiation transport code

    SciTech Connect

    Carlson, G.A.; Beutler, D.E.; Seager, K.D.; Knott, D.P.

    1988-01-01

    Responses of a Ge detector and a filtered TLD array detector have been measured at a steady-state bremsstrahlung source (the Pelletron), at endpoint energies from 150 to 900 keV. Predictions of detector response using Monte Carlo ITS codes are found to be in excellent agreement with measured response for both detectors. These results extend the range of validity of the ITS codes. With calibration provided by these experiments and by ITS predictions, dose-depth data from the TLD arrays can be used to estimate flash x-ray source endpoint energies.

  8. Sharing code.

    PubMed

    Kubilius, Jonas

    2014-01-01

    Sharing code is becoming increasingly important in the wake of Open Science. In this review I describe and compare two popular code-sharing utilities, GitHub and Open Science Framework (OSF). GitHub is a mature, industry-standard tool but lacks focus towards researchers. In comparison, OSF offers a one-stop solution for researchers but a lot of functionality is still under development. I conclude by listing alternative lesser-known tools for code and materials sharing.

  9. Development of rotating shadowband spectral radiometers and GCM radiation code test data sets in support of ARM. Technical progress report, September 15, 1992--October 31, 1993

    SciTech Connect

    Michalsky, J.; Harrison, L.

    1993-04-30

    The ARM goal is to help improve both longwave and shortwave models by providing improved radiometric shortwave data. These data can be used directly to test shortwave model predictions. As described below, they can also provide inferred values for aerosol and cloud properties that are useful for longwave modeling efforts as well. The current ARM research program includes three tasks, all related to the study of shortwave radiation transfer through clouds and aerosol. Two of the tasks involve the assembly of archived and new radiation and meteorological data sets; the third and dominant task has been the development and use of new shortwave radiometric sensors. Archived data from Golden, Colorado, and Albany, New York, were combined with National Weather Service ground and upper-air data for testing radiation models for the era when the Earth Radiation Budget Experiment (ERBE) was operational. These data do not include optimum surface radiation measurements; consequently we are acquiring downwelling shortwave, including direct and diffuse irradiance, plus downwelling longwave, upwelling shortwave, and aerosol optical depth, at our own institution, as an additional data set for ARM modelers.

  10. Intercomparison of Monte Carlo radiation transport codes to model TEPC response in low-energy neutron and gamma-ray fields.

    PubMed

    Ali, F; Waker, A J; Waller, E J

    2014-10-01

    Tissue-equivalent proportional counters (TEPC) can potentially be used as a portable and personal dosemeter in mixed neutron and gamma-ray fields, but what hinders this use is their typically large physical size. To formulate compact TEPC designs, the use of a Monte Carlo transport code is necessary to predict the performance of compact designs in these fields. To perform this modelling, three candidate codes were assessed: MCNPX 2.7.E, FLUKA 2011.2 and PHITS 2.24. In each code, benchmark simulations were performed involving the irradiation of a 5-in. TEPC with monoenergetic neutron fields and a 4-in. wall-less TEPC with monoenergetic gamma-ray fields. The frequency and dose mean lineal energies and dose distributions calculated from each code were compared with experimentally determined data. For the neutron benchmark simulations, PHITS produces data closest to the experimental values and for the gamma-ray benchmark simulations, FLUKA yields data closest to the experimentally determined quantities.
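    The frequency and dose mean lineal energies compared in this benchmark are the standard microdosimetric moments of the event distribution: the dose mean weights each event by its energy deposit, so yD = ⟨y²⟩/⟨y⟩. A sketch with synthetic event data (not the paper's TEPC measurements):

    ```python
    import numpy as np

    def lineal_energy_means(y):
        """Frequency-mean and dose-mean lineal energy from a sample of
        single-event lineal energies y (e.g. keV/um)."""
        y = np.asarray(y, dtype=float)
        y_f = y.mean()                     # frequency-mean lineal energy
        y_d = (y ** 2).mean() / y.mean()   # dose-mean: energy-weighted mean
        return y_f, y_d

    # synthetic event sample, units keV/um
    events = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
    y_f, y_d = lineal_energy_means(events)   # y_f = 3.1, y_d = 5.5
    ```

    Comparing these two moments between simulated and measured event spectra is exactly the kind of check the intercomparison performs for each transport code.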

  11. The FLUKA Code: an Overview

    SciTech Connect

    Ballarini, F.; Battistoni, G.; Campanella, M.; Carboni, M.; Cerutti, F.; Empl, A.; Fasso, A.; Ferrari, A.; Gadioli, E.; Garzelli, M.V.; Lantz, M.; Liotta, M.; Mairani, A.; Mostacci, A.; Muraro, S.; Ottolenghi, A.; Pelliccioni, M.; Pinsky, L.; Ranft, J.; Roesler, S.; Sala, P.R.; /Milan U. /INFN, Milan /Pavia U. /INFN, Pavia /CERN /Siegen U. /Houston U. /SLAC /Frascati /NASA, Houston /ENEA, Frascati

    2005-11-09

    FLUKA is a multipurpose Monte Carlo code which can transport a variety of particles over a wide energy range in complex geometries. The code is a joint project of INFN and CERN: part of its development is also supported by the University of Houston and NASA. FLUKA is successfully applied in several fields, including but not only, particle physics, cosmic ray physics, dosimetry, radioprotection, hadron therapy, space radiation, accelerator design and neutronics. The code is the standard tool used at CERN for dosimetry, radioprotection and beam-machine interaction studies. Here we give a glimpse into the code physics models with a particular emphasis to the hadronic and nuclear sector.

  12. The FLUKA Code: An Overview

    NASA Technical Reports Server (NTRS)

    Ballarini, F.; Battistoni, G.; Campanella, M.; Carboni, M.; Cerutti, F.; Empl, A.; Fasso, A.; Ferrari, A.; Gadioli, E.; Garzelli, M. V.; Lantz, M.; Liotta, M.; Mairani, A.; Mostacci, A.; Muraro, S.; Ottolenghi, A.; Pelliccioni, M.; Pinsky, L.; Ranft, J.; Roesler, S.; Sala, P. R.; Scannicchio, D.; Trovati, S.; Villari, R.; Wilson, T.

    2006-01-01

    FLUKA is a multipurpose Monte Carlo code which can transport a variety of particles over a wide energy range in complex geometries. The code is a joint project of INFN and CERN: part of its development is also supported by the University of Houston and NASA. FLUKA is successfully applied in several fields, including but not only, particle physics, cosmic ray physics, dosimetry, radioprotection, hadron therapy, space radiation, accelerator design and neutronics. The code is the standard tool used at CERN for dosimetry, radioprotection and beam-machine interaction studies. Here we give a glimpse into the code physics models with a particular emphasis to the hadronic and nuclear sector.

  13. Confocal coded aperture imaging

    DOEpatents

    Tobin, Jr., Kenneth William; Thomas, Jr., Clarence E.

    2001-01-01

    A method for imaging a target volume comprises the steps of: radiating a small bandwidth of energy toward the target volume; focusing the small bandwidth of energy into a beam; moving the target volume through a plurality of positions within the focused beam; collecting a beam of energy scattered from the target volume with a non-diffractive confocal coded aperture; generating a shadow image of said aperture from every point source of radiation in the target volume; and, reconstructing the shadow image into a 3-dimensional image of the every point source by mathematically correlating the shadow image with a digital or analog version of the coded aperture. The method can comprise the step of collecting the beam of energy scattered from the target volume with a Fresnel zone plate.
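    The decoding step in this claim, correlating the recorded shadow image with a copy of the coded aperture to recover each point source, can be illustrated with a toy binary aperture. The pattern, detector size, and source position below are hypothetical, not the patent's:

    ```python
    import numpy as np

    def correlate2d_valid(image, kernel):
        """Plain 2-D cross-correlation ('valid' region), written out
        explicitly to keep the sketch dependency-free."""
        ih, iw = image.shape
        kh, kw = kernel.shape
        out = np.zeros((ih - kh + 1, iw - kw + 1))
        for r in range(out.shape[0]):
            for c in range(out.shape[1]):
                out[r, c] = np.sum(image[r:r + kh, c:c + kw] * kernel)
        return out

    # toy coded aperture (binary open/closed pattern)
    aperture = np.array([[1, 0, 1],
                         [0, 1, 0],
                         [1, 0, 1]], dtype=float)

    # a single point source at (2, 3) casts a shifted copy of the
    # aperture pattern onto an 8x8 detector
    shadow = np.zeros((8, 8))
    shadow[2:5, 3:6] = aperture

    decoded = correlate2d_valid(shadow, aperture)
    peak = np.unravel_index(np.argmax(decoded), decoded.shape)  # (2, 3)
    ```

    The correlation peak lands at the source position because the aperture's autocorrelation peaks sharply at zero lag; real coded-aperture designs choose patterns that keep the off-peak sidelobes low.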

  14. The Phantom SPH code

    NASA Astrophysics Data System (ADS)

    Price, Daniel; Wurster, James; Nixon, Chris

    2016-05-01

    I will present the capabilities of the Phantom SPH code for global simulations of dust and gas in protoplanetary discs. I will present our new algorithms for simulating both small and large grains in discs, as well as our progress towards simulating evolving grain populations and coupling with radiation. Finally, I will discuss our recent applications to HL Tau and the physics of dust gap opening.

  15. Inner Radiation Belt Representation of the Energetic Electron Environment: Model and Data Synthesis Using the Salammbo Radiation Belt Transport Code and Los Alamos Geosynchronous and GPS Energetic Particle Data

    NASA Technical Reports Server (NTRS)

    Friedel, R. H. W.; Bourdarie, S.; Fennell, J.; Kanekal, S.; Cayton, T. E.

    2004-01-01

    The highly energetic electron environment in the inner magnetosphere (GEO inward) has received a lot of research attention in recent years, as the dynamics of relativistic electron acceleration and transport are not yet fully understood. These electrons can cause deep dielectric charging in any space hardware in the MEO to GEO region. We use a new and novel approach to obtain a global representation of the inner magnetospheric energetic electron environment, which can reproduce the absolute environment (flux) for any spacecraft orbit in that region to within a factor of 2 for the energy range of 100 KeV to 5 MeV electrons, for any level of magnetospheric activity. We combine the extensive set of inner magnetospheric energetic electron observations available at Los Alamos with the physics-based Salammbo transport code, using the data assimilation technique of "nudging". This in effect inputs in-situ data into the code and allows the diffusion mechanisms in the code to interpolate the data into regions and times with no data availability. We present here details of the methods used, both in the data assimilation process and in the necessary inter-calibration of the input data. We will present sample runs of the model/data code and compare the results to test spacecraft data not used in the data assimilation process.

  16. Sharing code

    PubMed Central

    Kubilius, Jonas

    2014-01-01

    Sharing code is becoming increasingly important in the wake of Open Science. In this review I describe and compare two popular code-sharing utilities, GitHub and Open Science Framework (OSF). GitHub is a mature, industry-standard tool but lacks focus towards researchers. In comparison, OSF offers a one-stop solution for researchers but a lot of functionality is still under development. I conclude by listing alternative lesser-known tools for code and materials sharing. PMID:25165519

  17. Improved outer boundary conditions for outer radiation belt data assimilation using THEMIS-SST data and the Salammbo-EnKF code

    NASA Astrophysics Data System (ADS)

    Maget, V.; Sicard-Piet, A.; Bourdarie, S.; Lazaro, D.; Turner, D. L.; Daglis, I. A.; Sandberg, I.

    2015-07-01

    Over the last decade, efforts have been made in the radiation belt community to develop data assimilation tools in order to improve the accuracy of radiation belt models. In this paper we present a new method to correctly take into account the outer boundary conditions at L* = 8 in such an enhanced model of the radiation belts. To do that we based our work on the Time History of Events and Macroscale Interactions during Substorms/Solid State Telescope data set. Statistics are developed to define a consistent electron distribution at L* = 8 (in both equatorial pitch angle and energy), and a variance-covariance matrix is estimated in order to more realistically drive the Monte Carlo sampling required by the Ensemble Kalman Filter (EnKF). Data processing is first described, along with the caveats avoided, and then the use of this information in machinery such as the EnKF is described. It is shown that the way the Monte Carlo simulations are performed is of great importance to realistically reproduce the outer boundary distribution needed by the physics-based Salammbô model. Finally, EnKF simulations are performed and compared for September 2011 in order to analyze the improvements gained using this new method of defining outer boundary conditions. In particular, we highlight in this study that such a method provides great improvement in the reconstruction of the dynamics observed at geosynchronous orbit, both during quiet and active magnetic conditions.
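    The perturbed-observation EnKF machinery this abstract relies on, Monte Carlo sampling observations from a mean and variance-covariance matrix and nudging each ensemble member toward its own draw, can be sketched in a few lines. Dimensions and numbers are toy values, not Salammbô's:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def enkf_update(X, y, R, H):
        """Perturbed-observation EnKF analysis step.
        X: (n_state, n_ens) forecast ensemble; y: observation mean;
        R: observation error covariance; H: observation operator."""
        n_ens = X.shape[1]
        A = X - X.mean(axis=1, keepdims=True)        # ensemble anomalies
        Pf = A @ A.T / (n_ens - 1)                   # forecast covariance
        K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)   # Kalman gain
        # Monte Carlo sampling of the observation: one draw per member,
        # using the observation error covariance R
        Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, n_ens).T
        return X + K @ (Y - H @ X)                   # analysis ensemble

    n_state, n_ens = 4, 50
    X = rng.normal(1.0, 0.5, (n_state, n_ens))       # forecast ensemble
    H = np.zeros((1, n_state))
    H[0, -1] = 1.0                                   # observe the boundary cell
    y = np.array([2.0])                              # "observed" boundary value
    R = np.array([[0.01]])                           # observation variance
    Xa = enkf_update(X, y, R, H)
    ```

    With a tight observation variance, the analysed boundary cell is pulled close to the observed value while the unobserved cells move only through their sampled covariance with it, which is the behaviour the paper's boundary statistics are designed to make realistic.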

  18. Investigation of the Performance of Various CVD Diamond Crystal Qualities for the Measurement of Radiation Doses from a Low Energy Mammography X-Ray Beam, Compared with MC Code (PENELOPE) Calculations

    NASA Astrophysics Data System (ADS)

    Zakari, Y. I.; Mavunda, R. D.; Nam, T. L.; Keddy, R. J.

    The tissue equivalence of diamond allows for accurate radiation dose determination without large corrections for different attenuation values in biological tissue, but its low Z value limits this advantage to lower-energy photons, such as those in mammography X-ray beams. This paper assays the performance of nine chemical vapour deposition (CVD) diamonds for use as radiation-sensing material. The specimens, fabricated in wafer form, are classified as detector grade, optical grade, and single crystals. It is well known that the presence of defects in diamonds, including CVD specimens, not only dictates but also affects the response of diamond to radiation in different ways. In this investigation, tools such as electron spin resonance (ESR), thermoluminescence (TL), Raman spectroscopy and ultraviolet (UV) spectroscopy were used to probe each of the samples. The linearity, sensitivity and other characteristics of the detector response to photon interaction were analyzed, together with the I-V characteristics. The diamonds, categorized into four each of the so-called detector and optical grades plus a single-crystal CVD specimen, were exposed to a low X-ray peak-voltage range (22 to 27 kVp) with trans-crystal polarizing fields of 0.4 kV.cm-1, 0.66 kV.cm-1 and 0.8 kV.cm-1. The presentation discusses the presence of defects identifiable by the techniques used and correlates the radiation performance of the three types of crystals to their presence. The choice of each wafer as either a spectrometer or an X-ray dosimeter within the selected energy range was made. The analyses were validated with the Monte Carlo code PENELOPE.

  19. Alpha particles at energies of 10 MeV to 1 TeV: conversion coefficients for fluence-to-absorbed dose, effective dose, and gray equivalent, calculated using Monte Carlo radiation transport code MCNPX 2.7.A.

    PubMed

    Copeland, Kyle; Parker, Donald E; Friedberg, Wallace

    2010-03-01

    Conversion coefficients have been calculated for fluence to absorbed dose, fluence to effective dose and fluence to gray equivalent, for isotropic exposure to alpha particles in the energy range of 10 MeV to 1 TeV (0.01-1000 GeV). The coefficients were calculated using Monte Carlo transport code MCNPX 2.7.A and BodyBuilder 1.3 anthropomorphic phantoms modified to allow calculation of effective dose to a Reference Person using tissues and tissue weighting factors from 1990 and 2007 recommendations of the International Commission on Radiological Protection (ICRP) and gray equivalent to selected tissues as recommended by the National Council on Radiation Protection and Measurements. Coefficients for effective dose are within 30 % of those calculated using ICRP 1990 recommendations.
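
    Applying such coefficients is a simple multiplication: dose equals fluence times the conversion coefficient, and effective dose itself is defined as a tissue-weighted sum of equivalent doses. A hedged sketch with hypothetical numbers (not the paper's tabulated coefficients, and only a truncated subset of tissue weights):

```python
# Effective dose from fluence via a conversion coefficient, and the
# tissue-weighted definition E = sum_T w_T * H_T. All numbers hypothetical.

def effective_dose_from_fluence(fluence_cm2, coeff_pSv_cm2):
    """E [Sv] = fluence [cm^-2] * coefficient [pSv cm^2]."""
    return fluence_cm2 * coeff_pSv_cm2 * 1e-12

coeff = 2.0e3              # hypothetical coefficient for ~100 MeV alphas, pSv cm^2
fluence = 1.0e6            # particles per cm^2
E = effective_dose_from_fluence(fluence, coeff)
print(f"effective dose = {E:.2e} Sv")   # 2.00e-03 Sv

# Tissue-weighted combination (a subset of ICRP 103-style weights; the full
# set must sum to 1 over all tissues, truncated here for illustration)
w_T = {"lung": 0.12, "stomach": 0.12, "colon": 0.12}
H_T = {"lung": 1.1e-3, "stomach": 0.9e-3, "colon": 1.0e-3}   # Sv, hypothetical
E_partial = sum(w_T[t] * H_T[t] for t in w_T)
print(f"partial effective dose = {E_partial:.2e} Sv")
```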

  20. Fluence to absorbed dose, effective dose and gray equivalent conversion coefficients for iron nuclei from 10 MeV to 1 TeV, calculated using Monte Carlo radiation transport code MCNPX 2.7.A.

    PubMed

    Copeland, Kyle; Parker, Donald E; Friedberg, Wallace

    2010-03-01

    Conversion coefficients have been calculated for fluence-to-absorbed dose, fluence-to-effective dose and fluence-to-gray equivalent for isotropic exposure of an adult male and an adult female to (56)Fe(26+) in the energy range of 10 MeV to 1 TeV (0.01-1000 GeV). The coefficients were calculated using Monte Carlo transport code MCNPX 2.7.A and BodyBuilder 1.3 anthropomorphic phantoms modified to allow calculation of effective dose using tissues and tissue weighting factors from either the 1990 or 2007 recommendations of the International Commission on Radiological Protection (ICRP) and gray equivalent to selected tissues as recommended by the National Council on Radiation Protection and Measurements. Calculations using ICRP 2007 recommendations result in fluence-to-effective dose conversion coefficients that are almost identical at most energies to those calculated using ICRP 1990 recommendations.

  1. Speech coding

    SciTech Connect

    Ravishankar, C., Hughes Network Systems, Germantown, MD

    1998-05-08

    Speech is the predominant means of communication between human beings, and since the invention of the telephone by Alexander Graham Bell in 1876, speech services have remained the core service of almost all telecommunication systems. Original analog methods of telephony had the disadvantage of the speech signal being corrupted by noise, cross-talk and distortion. Long-haul transmissions, which use repeaters to compensate for the loss in signal strength on transmission links, also increase the associated noise and distortion. Digital transmission, on the other hand, is relatively immune to noise, cross-talk and distortion, primarily because of the capability to faithfully regenerate the digital signal at each repeater purely on the basis of a binary decision. Hence the end-to-end performance of a digital link is essentially independent of the length and operating frequency bands of the link. From a transmission point of view, digital transmission has therefore been the preferred approach due to its higher immunity to noise. The need to carry digital speech also became extremely important from a service-provision point of view. Modern requirements have introduced the need for robust, flexible and secure services that can carry a multitude of signal types (such as voice, data and video) without a fundamental change in infrastructure. Such a requirement could not have been easily met without the advent of digital transmission systems, thereby requiring speech to be coded digitally. The term speech coding often refers to techniques that represent or code speech signals either directly as a waveform or as a set of parameters obtained by analyzing the speech signal. In either case, the codes are transmitted to the distant end, where speech is reconstructed or synthesized using the received set of codes. A more generic term applicable to these techniques, often used interchangeably with speech coding, is voice coding. This term is more generic in the sense that the
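
    As a concrete example of waveform coding, the μ-law companding used in G.711 telephony compresses the signal's dynamic range logarithmically before 8-bit quantization. The sketch below implements the continuous μ-law formula, not the piecewise-linear segment tables of the actual standard:

```python
import math

MU = 255.0  # mu-law parameter used in North American G.711

def mulaw_compress(x):
    """Map x in [-1, 1] to [-1, 1] with logarithmic companding."""
    return math.copysign(math.log1p(MU * abs(x)) / math.log1p(MU), x)

def mulaw_expand(y):
    """Inverse of mulaw_compress."""
    return math.copysign(math.expm1(abs(y) * math.log1p(MU)) / MU, y)

def encode_8bit(x):
    """Compress, then quantize to an 8-bit code (0..255)."""
    y = mulaw_compress(x)
    return int(round((y + 1.0) * 127.5))

def decode_8bit(code):
    y = code / 127.5 - 1.0
    return mulaw_expand(y)

sample = 0.1
code = encode_8bit(sample)
recon = decode_8bit(code)
print(code, round(recon, 4))
```

    The logarithmic mapping spends quantizer levels where speech amplitudes are most probable (near zero), which is why an 8-bit companded code achieves quality comparable to roughly 12-bit uniform quantization.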

  2. Deuterons at energies of 10 MeV to 1 TeV: conversion coefficients for fluence-to-absorbed dose, equivalent dose, effective dose and gray equivalent, calculated using Monte Carlo radiation transport code MCNPX 2.7.C.

    PubMed

    Copeland, Kyle; Parker, Donald E; Friedberg, Wallace

    2011-01-01

    Conversion coefficients were calculated for fluence-to-absorbed dose, fluence-to-equivalent dose, fluence-to-effective dose and fluence-to-gray equivalent for isotropic exposure of an adult female and an adult male to deuterons ((2)H(+)) in the energy range 10 MeV-1 TeV (0.01-1000 GeV). Coefficients were calculated using the Monte Carlo transport code MCNPX 2.7.C and BodyBuilder™ 1.3 anthropomorphic phantoms. Phantoms were modified to allow calculation of the effective dose to a Reference Person using tissues and tissue weighting factors from 1990 and 2007 recommendations of the International Commission on Radiological Protection (ICRP) and gray equivalent to selected tissues as recommended by the National Council on Radiation Protection and Measurements. Coefficients for the equivalent and effective dose incorporated a radiation weighting factor of 2. At 15 of 19 energies for which coefficients for the effective dose were calculated, coefficients based on ICRP 1990 and 2007 recommendations differed by <3%. The greatest difference, 47%, occurred at 30 MeV.

  3. An Overview of the Monte Carlo Methods, Codes, & Applications Group

    SciTech Connect

    Trahan, Travis John

    2016-08-30

    This report sketches the work of the Group to deliver first-principle Monte Carlo methods, production quality codes, and radiation transport-based computational and experimental assessments using the codes MCNP and MCATK for such applications as criticality safety, non-proliferation, nuclear energy, nuclear threat reduction and response, radiation detection and measurement, radiation health protection, and stockpile stewardship.

  4. Radiation transport calculations for cosmic radiation.

    PubMed

    Endo, A; Sato, T

    2012-01-01

    The radiation environment inside and near spacecraft consists of various components of primary radiation in space and secondary radiation produced by the interaction of the primary radiation with the walls and equipment of the spacecraft. Radiation fields inside astronauts are different from those outside them, because of the body's self-shielding as well as the nuclear fragmentation reactions occurring in the human body. Several computer codes have been developed to simulate the physical processes of the coupled transport of protons, high-charge and high-energy nuclei, and the secondary radiation produced in atomic and nuclear collision processes in matter. These computer codes have been used in various space radiation protection applications: shielding design for spacecraft and planetary habitats, simulation of instrument and detector responses, analysis of absorbed doses and quality factors in organs and tissues, and study of biological effects. This paper focuses on the methods and computer codes used for radiation transport calculations on cosmic radiation, and their application to the analysis of radiation fields inside spacecraft, evaluation of organ doses in the human body, and calculation of dose conversion coefficients using the reference phantoms defined in ICRP Publication 110.
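
    At the core of such Monte Carlo transport codes is the sampling of free path lengths from an exponential distribution. A deliberately minimal sketch, ignoring scattering and secondary production, that checks the sampled transmission through a slab against the analytic exp(−μt); the attenuation coefficient and thickness are made up:

```python
import math
import random

random.seed(42)

def transmitted_fraction(mu, thickness, n_particles=20000):
    """Fraction of photons crossing a slab without interacting.

    mu: attenuation coefficient (1/cm); the path to first interaction
    is sampled as s = -ln(U)/mu. Scattering is ignored, so the
    analytic answer is exp(-mu * thickness).
    """
    passed = 0
    for _ in range(n_particles):
        s = -math.log(random.random()) / mu
        if s > thickness:
            passed += 1
    return passed / n_particles

mu, t = 0.2, 5.0           # hypothetical coefficient and shield thickness, cm
mc = transmitted_fraction(mu, t)
analytic = math.exp(-mu * t)
print(f"MC {mc:.3f} vs analytic {analytic:.3f}")
```

    Production codes add nuclear fragmentation, angular scattering and secondary-particle banking on top of this same sampling loop.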

  5. QR Codes

    ERIC Educational Resources Information Center

    Lai, Hsin-Chih; Chang, Chun-Yen; Li, Wen-Shiane; Fan, Yu-Lin; Wu, Ying-Tien

    2013-01-01

    This study presents an m-learning method that incorporates Integrated Quick Response (QR) codes. This learning method not only achieves the objectives of outdoor education, but it also increases applications of Cognitive Theory of Multimedia Learning (CTML) (Mayer, 2001) in m-learning for practical use in a diverse range of outdoor locations. When…

  6. Development of the Code RITRACKS

    NASA Technical Reports Server (NTRS)

    Plante, Ianik; Cucinotta, Francis A.

    2013-01-01

    A document discusses the code RITRACKS (Relativistic Ion Tracks), which was developed to simulate heavy ion track structure at the microscopic and nanoscopic scales. It is a Monte-Carlo code that simulates the production of radiolytic species in water, event-by-event, and which may be used to simulate tracks and also to calculate dose in targets and voxels of different sizes. The dose deposited by the radiation can be calculated in nanovolumes (voxels). RITRACKS allows simulation of radiation tracks without the need of extensive knowledge of computer programming or Monte-Carlo simulations. It is installed as a regular application on Windows systems. The main input parameters entered by the user are the type and energy of the ion, the length and size of the irradiated volume, the number of ions impacting the volume, and the number of histories. The simulation can be started after the input parameters are entered in the GUI. The number of each kind of interactions for each track is shown in the result details window. The tracks can be visualized in 3D after the simulation is complete. It is also possible to see the time evolution of the tracks and zoom on specific parts of the tracks. The software RITRACKS can be very useful for radiation scientists to investigate various problems in the fields of radiation physics, radiation chemistry, and radiation biology. For example, it can be used to simulate electron ejection experiments (radiation physics).

  7. Studies of acute and chronic radiation injury at the Biological and Medical Research Division, Argonne National Laboratory, 1953-1970: Description of individual studies, data files, codes, and summaries of significant findings

    SciTech Connect

    Grahn, D.; Fox, C.; Wright, B.J.; Carnes, B.A.

    1994-05-01

    Between 1953 and 1970, studies on the long-term effects of external X-ray and γ irradiation on inbred and hybrid mouse stocks were carried out at the Biological and Medical Research Division, Argonne National Laboratory. The results of these studies, plus the mating, litter, and pre-experimental stock records, were routinely coded on IBM cards for statistical analysis and record maintenance. Also retained were the survival data from studies performed in the period 1943-1953 at the National Cancer Institute, National Institutes of Health, Bethesda, Maryland. The card-image data files have been corrected where necessary and refiled on hard disks for long-term storage and ease of accessibility. In this report, the individual studies and data files are described, and pertinent factors regarding caging, husbandry, radiation procedures, choice of animals, and other logistical details are summarized. Some of the findings are also presented. Descriptions of the different mouse stocks and hybrids are included in an appendix; more than three dozen stocks were involved in these studies. Two other appendices detail the data files in their original card-image format and the numerical codes used to describe the animal's exit from an experiment and, for some studies, any associated pathologic findings. Tabular summaries of sample sizes, dose levels, and other variables are also given to assist investigators in their selection of data for analysis. The archive is open to any investigator with legitimate interests and a willingness to collaborate and acknowledge the source of the data and to recognize appropriate conditions or caveats.

  8. Spaceflight Validation of Hzetrn Code

    NASA Technical Reports Server (NTRS)

    Wilson, J. W.; Shinn, J. L.; Singleterry, R. C.; Badavi, F. F.; Badhwar, G. D.; Reitz, G.; Beaujean, R.; Cucinotta, F. A.

    1999-01-01

    HZETRN is being developed as a fast deterministic radiation transport code applicable to neutrons, protons, and multiply charged ions in the space environment. It was recently applied to 50 hours of IMP8 data measured during the August 4, 1972 solar event to map the hourly exposures within the human body under several shield configurations. This calculation required only 18 hours on a VAX 4000 machine; a similar calculation using the Monte Carlo method would have required two years of dedicated computer time. The code has been benchmarked against well-documented and tested Monte Carlo proton transport codes with good success. The code will allow important trade studies to be made with relative ease due to its computational speed, and will be useful in assessing design alternatives in an integrated system software environment. Since there are no well-tested Monte Carlo codes for HZE particles, we have been engaged in flight validation of the HZETRN results. To date we have made comparisons with TEPC, CR-39, charged-particle telescopes, and Bonner spheres. This broad range of detectors allows us to test a number of functions related to the differing physical processes that contribute to the complicated radiation fields within a spacecraft or the human body; these functions can be calculated by the HZETRN code system. In the present report we review these results.

  9. FAST GYROSYNCHROTRON CODES

    SciTech Connect

    Fleishman, Gregory D.; Kuznetsov, Alexey A.

    2010-10-01

    Radiation produced by charged particles gyrating in a magnetic field is highly significant in the astrophysics context. Persistently increasing resolution of astrophysical observations calls for corresponding three-dimensional modeling of the radiation. However, the available exact equations are prohibitively slow for computing the comprehensive table of high-resolution models required for many practical applications. To remedy this situation, we develop approximate gyrosynchrotron (GS) codes capable of quickly calculating the GS emission (in the non-quantum regime) from both isotropic and anisotropic electron distributions in the non-relativistic, mildly relativistic, and ultrarelativistic energy domains, applicable throughout a broad range of source parameters including dense or tenuous plasmas and weak or strong magnetic fields. The computation time is reduced by several orders of magnitude compared with the exact GS algorithm. The new algorithm's performance can be adjusted gradually to the user's needs, depending on whether precision or computation speed is to be optimized for a given model. The codes are made available to users as a supplement to this paper.

  10. Tritons at energies of 10 MeV to 1 TeV: conversion coefficients for fluence-to-absorbed dose, equivalent dose, effective dose and gray equivalent, calculated using Monte Carlo radiation transport code MCNPX 2.7.C.

    PubMed

    Copeland, Kyle; Parker, Donald E; Friedberg, Wallace

    2010-12-01

    Conversion coefficients were calculated for fluence-to-absorbed dose, fluence-to-equivalent dose, fluence-to-effective dose and fluence-to-gray equivalent for isotropic exposure of an adult female and an adult male to tritons ((3)H(+)) in the energy range of 10 MeV to 1 TeV (0.01-1000 GeV). Coefficients were calculated using Monte Carlo transport code MCNPX 2.7.C and BodyBuilder™ 1.3 anthropomorphic phantoms. Phantoms were modified to allow calculation of effective dose to a Reference Person using tissues and tissue weighting factors from 1990 and 2007 recommendations of the International Commission on Radiological Protection (ICRP) and calculation of gray equivalent to selected tissues as recommended by the National Council on Radiation Protection and Measurements. At 15 of the 19 energies for which coefficients for effective dose were calculated, coefficients based on ICRP 2007 and 1990 recommendations differed by less than 3%. The greatest difference, 43%, occurred at 30 MeV.

  11. Helions at energies of 10 MeV to 1 TeV: conversion coefficients for fluence-to-absorbed dose, equivalent dose, effective dose and gray equivalent, calculated using Monte Carlo radiation transport code MCNPX 2.7.C.

    PubMed

    Copeland, Kyle; Parker, Donald E; Friedberg, Wallace

    2010-12-01

    Conversion coefficients were calculated for fluence-to-absorbed dose, fluence-to-equivalent dose, fluence-to-effective dose and fluence-to-gray equivalent, for isotropic exposure of an adult male and an adult female to helions ((3)He(2+)) in the energy range of 10 MeV to 1 TeV (0.01-1000 GeV). Calculations were performed using Monte Carlo transport code MCNPX 2.7.C and BodyBuilder™ 1.3 anthropomorphic phantoms modified to allow calculation of effective dose using tissues and tissue weighting factors from either the 1990 or 2007 recommendations of the International Commission on Radiological Protection (ICRP), and gray equivalent to selected tissues as recommended by the National Council on Radiation Protection and Measurements. At 15 of the 19 energies for which coefficients for effective dose were calculated, coefficients based on ICRP 2007 and 1990 recommendations differed by less than 2%. The greatest difference, 62%, occurred at 100 MeV.

  12. The Integrated TIGER Series Codes

    SciTech Connect

    Kensek, Ronald P.; Franke, Brian C.; Laub, Thomas W.

    2006-01-15

    ITS is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. Our goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The ease with which the makefile system is applied combines with an input scheme based on order-independent descriptive keywords, making maximum use of defaults and internal error checking, to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend the capabilities of the codes to more complex applications. Version 5.0, the latest version of ITS, contains (1) improvements to the ITS 3.0 continuous-energy codes, (2) multigroup codes with adjoint transport capabilities, (3) parallel implementations of all ITS codes, (4) a general-purpose geometry engine for linking with CAD or other geometry formats, and (5) the Cholla facet geometry library. Moreover, the general user-friendliness of the software has been enhanced through increased internal error checking and improved code portability.

  13. Error-correction coding

    NASA Technical Reports Server (NTRS)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.
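
    To make the inner/outer split concrete, the sketch below uses a Hamming (7,4) block code as a stand-in inner code that corrects single bit errors before the outer decoder sees the data. The actual study concerns modulation block codes paired with the interleaved RS (255,223) outer code, which are not reproduced here:

```python
import numpy as np

# Generator and parity-check matrices for the systematic Hamming (7,4) code
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def encode(bits4):
    return (np.array(bits4) @ G) % 2

def decode(word7):
    word7 = np.array(word7).copy()
    syndrome = (H @ word7) % 2
    if syndrome.any():
        # Each single-bit error yields a unique syndrome: flip that bit
        for i in range(7):
            if np.array_equal(H[:, i], syndrome):
                word7[i] ^= 1
                break
    return word7[:4]

msg = [1, 0, 1, 1]
cw = encode(msg)
cw_err = cw.copy()
cw_err[2] ^= 1                     # one channel error
print(decode(cw_err))              # recovers [1 0 1 1]
```

    In a concatenated system, residual bursts that slip past the inner decoder are then handled by the interleaved Reed-Solomon outer code.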

  14. TU-EF-304-10: Efficient Multiscale Simulation of the Proton Relative Biological Effectiveness (RBE) for DNA Double Strand Break (DSB) Induction and Bio-Effective Dose in the FLUKA Monte Carlo Radiation Transport Code

    SciTech Connect

    Moskvin, V; Tsiamas, P; Axente, M; Farr, J; Stewart, R

    2015-06-15

    Purpose: One of the more critical initiating events for reproductive cell death is the creation of a DNA double strand break (DSB). In this study, we present a computationally efficient way to determine spatial variations in the relative biological effectiveness (RBE) of proton therapy beams within the FLUKA Monte Carlo (MC) code. Methods: We used the independently tested Monte Carlo Damage Simulation (MCDS) developed by Stewart and colleagues (Radiat. Res. 176, 587-602, 2011) to estimate the RBE for DSB induction of monoenergetic protons, tritium, deuterium, helium-3, helium-4 ions and delta electrons. The dose-weighted RBE coefficients were incorporated into FLUKA to determine the equivalent 60Co γ-ray dose for representative proton beams incident on cells in an aerobic and anoxic environment. Results: We found that the proton beam RBE for DSB induction at the tip of the Bragg peak, including primary and secondary particles, is close to 1.2. Furthermore, the RBE increases laterally from the beam axis in the area of the Bragg peak. At the distal edge, the RBE is in the range of 1.3-1.4 for cells irradiated under aerobic conditions and may be as large as 1.5-1.8 for cells irradiated under anoxic conditions. Across the plateau region, the recorded RBE for DSB induction is 1.02 for aerobic cells and 1.05 for cells irradiated under anoxic conditions. The contribution to total effective dose from secondary heavy ions decreases with depth and is higher at shallow depths (e.g., at the surface of the skin). Conclusion: Multiscale simulation of the RBE for DSB induction provides useful insights into spatial variations in proton RBE within pristine Bragg peaks. This methodology is potentially useful for the biological optimization of proton therapy for the treatment of cancer. The study highlights the need to incorporate spatial variations in proton RBE into proton therapy treatment plans.
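
    The dose-weighted combination referred to above is a weighted average, RBE_w = Σ RBE_i·d_i / Σ d_i, from which a bio-effective (60Co-equivalent) dose follows as RBE_w·D. A sketch with invented dose fractions and RBE values, not numbers from the study:

```python
# Dose-weighted RBE over mixed-field components, then the bio-effective dose
# D_bio = RBE_w * D_phys. All numbers below are hypothetical.

components = [
    # (absorbed dose contribution in Gy, RBE for DSB induction)
    (1.60, 1.05),   # primary protons, plateau-like value
    (0.30, 1.25),   # protons near the Bragg peak
    (0.10, 1.60),   # secondary heavy ions
]

total_dose = sum(d for d, _ in components)
rbe_weighted = sum(d * rbe for d, rbe in components) / total_dose
print(f"dose-weighted RBE = {rbe_weighted:.4f}")
print(f"equivalent 60Co dose = {rbe_weighted * total_dose:.3f} Gy")
```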

  15. Electromagnetic particle simulation codes

    NASA Technical Reports Server (NTRS)

    Pritchett, P. L.

    1985-01-01

    Electromagnetic particle simulations solve the full set of Maxwell's equations. They thus include the effects of self-consistent electric and magnetic fields, magnetic induction, and electromagnetic radiation. The algorithms for an electromagnetic code which works directly with the electric and magnetic fields are described. The fields and current are separated into transverse and longitudinal components. The transverse E and B fields are integrated in time using a leapfrog scheme applied to the Fourier components. The particle pushing is performed via the relativistic Lorentz force equation for the particle momentum. As an example, simulation results are presented for the electron cyclotron maser instability which illustrate the importance of relativistic effects on the wave-particle resonance condition and on wave dispersion.
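
    The relativistic momentum push described above is commonly implemented with the Boris scheme, a leapfrog-compatible split of the Lorentz force into two half electric kicks around a magnetic rotation. The abstract does not name the exact integrator, so the sketch below is illustrative:

```python
import numpy as np

def boris_push(p, E, B, q, m, dt, c=1.0):
    """Advance relativistic momentum p by one Boris step (a standard
    leapfrog-compatible integrator for the Lorentz force equation)."""
    p_minus = p + q * E * dt / 2.0                 # first half electric kick
    gamma = np.sqrt(1.0 + np.dot(p_minus, p_minus) / (m * c) ** 2)
    t = q * B * dt / (2.0 * gamma * m)             # rotation half-angle vector
    s = 2.0 * t / (1.0 + np.dot(t, t))
    p_prime = p_minus + np.cross(p_minus, t)       # magnetic rotation
    p_plus = p_minus + np.cross(p_prime, s)
    return p_plus + q * E * dt / 2.0               # second half electric kick

# Pure magnetic field: the momentum magnitude must be conserved exactly
p = np.array([1.0, 0.0, 0.0])
B = np.array([0.0, 0.0, 1.0])
E = np.zeros(3)
for _ in range(1000):
    p = boris_push(p, E, B, q=1.0, m=1.0, dt=0.05)
print(np.linalg.norm(p))   # stays 1.0 to machine precision
```

    Energy conservation in a pure magnetic field is the property that makes this rotation-based splitting preferable to a naive forward-Euler force evaluation.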

  16. The stellar atmosphere simulation code Bifrost. Code description and validation

    NASA Astrophysics Data System (ADS)

    Gudiksen, B. V.; Carlsson, M.; Hansteen, V. H.; Hayek, W.; Leenaarts, J.; Martínez-Sykora, J.

    2011-07-01

    Context. Numerical simulations of stellar convection and photospheres have been developed to the point where detailed shapes of observed spectral lines can be explained. Stellar atmospheres are very complex, and very different physical regimes are present in the convection zone, photosphere, chromosphere, transition region and corona. To understand the details of the atmosphere it is necessary to simulate the whole atmosphere, since the different layers interact strongly. These physical regimes are very diverse, and it takes a highly efficient massively parallel numerical code to solve the associated equations. Aims: The design, implementation and validation of the massively parallel numerical code Bifrost for simulating stellar atmospheres from the convection zone to the corona. Methods: The code is subjected to a number of validation tests, among them the Sod shock tube test, the Orszag-Tang colliding shock test, boundary condition tests, and tests of how the code treats magnetic field advection, chromospheric radiation, radiative transfer in an isothermal scattering atmosphere, hydrogen ionization and thermal conduction. Results: Bifrost completes the tests with good results and shows near-linear efficiency scaling to thousands of computing cores.

  17. Radiation load to the SNAP CCD

    SciTech Connect

    N. V. Mokhov, I. L. Rakhno and S. I. Striganov

    2003-08-14

    Results of an express Monte Carlo analysis with the MARS14 code of the radiation load to the CCD optical detectors in the Supernova Acceleration Project (SNAP) mission are presented for a realistic radiation environment over the satellite orbit.

  18. Diagnostic Coding for Epilepsy.

    PubMed

    Williams, Korwyn; Nuwer, Marc R; Buchhalter, Jeffrey R

    2016-02-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  19. Model Children's Code.

    ERIC Educational Resources Information Center

    New Mexico Univ., Albuquerque. American Indian Law Center.

    The Model Children's Code was developed to provide a legally correct model code that American Indian tribes can use to enact children's codes that fulfill their legal, cultural and economic needs. Code sections cover the court system, jurisdiction, juvenile offender procedures, minor-in-need-of-care, and termination. Almost every Code section is…

  20. How Should I Care for Myself During Radiation Therapy?

    MedlinePlus

    What is Radiation Therapy? ... How Should I Care for Myself During Radiation Therapy? Get plenty of rest. Many patients experience ...

  1. Phylogeny of genetic codes and punctuation codes within genetic codes.

    PubMed

    Seligmann, Hervé

    2015-03-01

    Punctuation codons (starts, stops) delimit genes and reflect translation apparatus properties. Most codon reassignments involve punctuation. Here two complementary approaches classify natural genetic codes: (A) by properties of the amino acids assigned to codons (classical phylogeny), coding stops as X (A1; antitermination/suppressor tRNAs insert unknown residues) or as gaps (A2; no translation, classical stop); and (B) by considering only punctuation status, with start, stop and other codons coded as -1, 0 and 1 (B1); 0, -1 and 1 (B2, reflecting ribosomal translational dynamics); or 1, -1 and 0 (B3, starts/stops as opposites). All methods separate most mitochondrial codes from most nuclear codes; Gracilibacteria consistently cluster with metazoan mitochondria; mitochondria co-hosted with chloroplasts cluster with nuclear codes. Method A1 clusters the euplotid nuclear code with metazoan mitochondria; A2 separates euplotids from mitochondria. Firmicute bacteria Mycoplasma/Spiroplasma and protozoan (and lower metazoan) mitochondria share codon-amino acid assignments. A1 clusters them with mitochondria; they cluster with the standard genetic code under A2: constraints on amino acid ambiguity versus punctuation signaling produced the mitochondrial versus bacterial versions of this genetic code. Punctuation analysis B2 converges best with classical phylogenetic analyses, stressing the need for a unified theory of genetic code punctuation accounting for ribosomal constraints.
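
    Method B1's punctuation-only coding can be sketched directly: each codon is mapped to -1 (start), 0 (stop) or 1 (other), and two codes are compared by counting positions whose status differs. The fragment below uses a six-codon toy table (the real analysis covers all 64 codons):

```python
# Punctuation-status vectors (B1 scheme: start = -1, stop = 0, other = 1)
# for tiny illustrative fragments of two real codon tables.

CODONS = ["ATG", "TGA", "TAA", "TAG", "AGA", "AGG"]

standard = {"ATG": -1, "TGA": 0, "TAA": 0, "TAG": 0, "AGA": 1, "AGG": 1}
# Vertebrate mitochondrial code: AGA/AGG reassigned to stops, TGA to Trp
vert_mito = {"ATG": -1, "TGA": 1, "TAA": 0, "TAG": 0, "AGA": 0, "AGG": 0}

def vector(code):
    return [code[c] for c in CODONS]

def distance(a, b):
    """Hamming-style distance between punctuation-status vectors."""
    return sum(x != y for x, y in zip(vector(a), vector(b)))

print(vector(standard))               # [-1, 0, 0, 0, 1, 1]
print(distance(standard, vert_mito))  # 3 codons differ in punctuation status
```

    Feeding such distance matrices to a standard clustering routine is one way to reproduce the kind of code groupings the abstract describes.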

  2. International assessment of PCA codes

    SciTech Connect

    Neymotin, L.; Lui, C.; Glynn, J.; Archarya, S.

    1993-11-01

    Over the past three years (1991-1993), an extensive international exercise for intercomparison of a group of six Probabilistic Consequence Assessment (PCA) codes was undertaken. The exercise was jointly sponsored by the Commission of European Communities (CEC) and OECD Nuclear Energy Agency. This exercise was a logical continuation of a similar effort undertaken by OECD/NEA/CSNI in 1979-1981. The PCA codes are currently used by different countries for predicting radiological health and economic consequences of severe accidents at nuclear power plants (and certain types of non-reactor nuclear facilities) resulting in releases of radioactive materials into the atmosphere. The codes participating in the exercise were: ARANO (Finland), CONDOR (UK), COSYMA (CEC), LENA (Sweden), MACCS (USA), and OSCAAR (Japan). In parallel with this inter-code comparison effort, two separate groups performed a similar set of calculations using two of the participating codes, MACCS and COSYMA. Results of the intercode and inter-MACCS comparisons are presented in this paper. The MACCS group included four participants: GREECE: Institute of Nuclear Technology and Radiation Protection, NCSR Demokritos; ITALY: ENEL, ENEA/DISP, and ENEA/NUC-RIN; SPAIN: Universidad Politecnica de Madrid (UPM) and Consejo de Seguridad Nuclear; USA: Brookhaven National Laboratory, US NRC and DOE.

  3. Power System Optimization Codes Modified

    NASA Technical Reports Server (NTRS)

    Juhasz, Albert J.

    1999-01-01

    A major modification of and addition to existing Closed Brayton Cycle (CBC) space power system optimization codes was completed. These modifications relate to the global minimum-mass search driver programs, which contain nested iteration loops: an iteration on cycle temperature ratio and three separate pressure-ratio iteration loops--one loop for maximizing thermodynamic efficiency, one for minimizing radiator area, and a final loop for minimizing overall power system mass. Using the method of steepest ascent, the code sweeps through the pressure-ratio space repeatedly, each time with smaller iteration step sizes, so that the three optimum pressure ratios can be obtained to any desired accuracy for each of the objective functions referred to above (i.e., maximum thermodynamic efficiency, minimum radiator area, and minimum system mass). Two separate options for the power system heat source are available: (1) A nuclear fission reactor can be used. It is provided with a radiation shield (composed of a lithium hydride (LiH) neutron shield and a tungsten (W) gamma shield). Suboptions can be used to select the type of reactor (i.e., fast-spectrum liquid-metal-cooled or epithermal high-temperature gas reactor (HTGR)). (2) A solar heat source can be used. This option includes a parabolic concentrator and heat receiver for raising the temperature of the recirculating working fluid. A useful feature of the code modifications is that key cycle parameters are displayed, including the overall system specific mass in kilograms per kilowatt and the system specific power in watts per kilogram, as the results for each temperature ratio are computed. When the minimum-mass temperature ratio is encountered, a message is printed out. Several levels of detailed information on cycle state points, subsystem mass results, and radiator temperature profiles are stored for this temperature-ratio condition and can be displayed or printed by users.
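
    The repeated sweep with shrinking step size can be sketched as a coarse-to-fine search over the pressure ratio. The mass function below is an invented convex stand-in, not the report's CBC model:

```python
# Coarse-to-fine sweep over pressure ratio, in the spirit of the code's
# nested iteration loops with progressively smaller step sizes.

def system_mass(r):
    """Hypothetical specific mass (kg/kW) vs compressor pressure ratio;
    analytic minimum at r = sqrt(5 / 0.8) = 2.5."""
    return 5.0 / r + 0.8 * r

def sweep_minimize(f, lo, hi, passes=4, points=21):
    best = None
    for _ in range(passes):
        step = (hi - lo) / (points - 1)
        xs = [lo + i * step for i in range(points)]
        best = min(xs, key=f)
        # Shrink the search window around the current best and re-sweep
        lo, hi = max(lo, best - step), min(hi, best + step)
    return best

r_opt = sweep_minimize(system_mass, 1.0, 10.0)
print(round(r_opt, 3))   # converges close to 2.5
```

    In the actual code, three such loops run with different objective functions (efficiency, radiator area, total mass), nested inside the temperature-ratio iteration.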

  4. Accumulate repeat accumulate codes

    NASA Technical Reports Server (NTRS)

    Abbasfar, Aliazam; Divsalar, Dariush; Yao, Kung

    2004-01-01

    In this paper we propose an innovative channel coding scheme called 'Accumulate Repeat Accumulate codes' (ARA). This class of codes can be viewed as serial turbo-like codes, or as a subclass of Low Density Parity Check (LDPC) codes, so belief propagation can be used for iterative decoding of ARA codes on a graph. The encoder structure for this class can be viewed as a precoded Repeat Accumulate (RA) code or as a precoded Irregular Repeat Accumulate (IRA) code, where an accumulator is simply chosen as the precoder. Thus ARA codes have a simple and very fast encoder structure when representing LDPC codes. Based on density evolution for LDPC codes, we show through some examples of ARA codes that for a maximum variable node degree of 5, a minimum bit SNR as low as 0.08 dB from channel capacity for rate 1/2 can be achieved as the block size goes to infinity. Thus, for a fixed low maximum variable node degree, its threshold outperforms not only the RA and IRA codes but also the best known LDPC codes with the same maximum node degree. Furthermore, by puncturing the accumulators, any desired high-rate codes close to rate 1 can be obtained, with thresholds that stay uniformly close to the channel capacity thresholds. Iterative decoding simulation results are provided. The ARA codes also have a projected graph or protograph representation that allows for high-speed decoder implementation.
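
    The encoder chain described above, minus the precoding accumulator that distinguishes ARA from plain RA codes, can be sketched as repeat, interleave, then accumulate (a running XOR):

```python
import random

random.seed(1)

def ra_encode(info_bits, repeat=3, perm=None):
    """Repeat-Accumulate encoding: repeat each bit, interleave, then pass
    through an accumulator (running XOR). An ARA encoder would add a
    precoding accumulator in front of this chain; only the plain RA
    stage is sketched here."""
    repeated = [b for b in info_bits for _ in range(repeat)]
    if perm is None:
        perm = list(range(len(repeated)))
        random.shuffle(perm)
    interleaved = [repeated[i] for i in perm]
    out, acc = [], 0
    for b in interleaved:
        acc ^= b            # accumulator: y_k = y_{k-1} XOR x_k
        out.append(acc)
    return out

codeword = ra_encode([1, 0, 1, 1])
print(len(codeword), codeword)
```

    Because the accumulator is a rate-1 two-state convolutional code, encoding costs one XOR per output bit, which is the "simple and very fast encoder" property the paper exploits.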

  5. Concatenated Coding Using Trellis-Coded Modulation

    NASA Technical Reports Server (NTRS)

    Thompson, Michael W.

    1997-01-01

    In the late seventies and early eighties, a technique known as Trellis Coded Modulation (TCM) was developed for providing spectrally efficient error correction coding. Instead of adding redundant information in the form of parity bits, redundancy is added at the modulation stage, thereby increasing bandwidth efficiency. A digital communications system can be designed to use bandwidth-efficient multilevel/phase modulation such as Amplitude Shift Keying (ASK), Phase Shift Keying (PSK), Differential Phase Shift Keying (DPSK), or Quadrature Amplitude Modulation (QAM). Performance gain can be achieved by increasing the number of signals over the corresponding uncoded system to compensate for the redundancy introduced by the code. A considerable amount of research and development has been devoted to developing good TCM codes for severely bandlimited applications. More recently, the use of TCM for satellite and deep space communications applications has received increased attention. This report describes the general approach of using a concatenated coding scheme that features TCM and Reed-Solomon (RS) coding. Results have indicated that substantial (6-10 dB) performance gains can be achieved with this approach with comparatively little bandwidth expansion. Since all of the bandwidth expansion is due to the RS code, TCM-based concatenated coding results in roughly 10-50% bandwidth expansion, compared to 70-150% expansion for similar concatenated schemes that use convolutional codes. We stress that combined coding and modulation optimization is important for achieving performance gains while maintaining spectral efficiency.
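
    The bandwidth-expansion ranges quoted above follow from simple rate arithmetic: with TCM the modulation absorbs the inner redundancy, so only the outer RS rate expands bandwidth, whereas a convolutional inner code contributes its own rate. The RS(255,223) rate below is an assumed example (a commonly used RS code), not a parameter stated in the abstract.

```python
# Back-of-the-envelope check of the bandwidth-expansion claims.
def expansion(outer_rate, inner_rate=1.0):
    """Fractional bandwidth expansion relative to uncoded transmission."""
    return 1.0 / (outer_rate * inner_rate) - 1.0

rs_rate = 223 / 255                      # assumed RS(255,223) outer code
tcm_concat = expansion(rs_rate)          # TCM inner code: ~14% expansion
conv_concat = expansion(rs_rate, 1 / 2)  # rate-1/2 convolutional: ~129%
```

    Both results land inside the 10-50% and 70-150% ranges quoted in the report.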

  6. Coset Codes Viewed as Terminated Convolutional Codes

    NASA Technical Reports Server (NTRS)

    Fossorier, Marc P. C.; Lin, Shu

    1996-01-01

    In this paper, coset codes are considered as terminated convolutional codes. Based on this approach, three new general results are presented. First, it is shown that the iterative squaring construction can equivalently be defined from a convolutional code whose trellis terminates. This convolutional code determines a simple encoder for the coset code considered, and the state and branch labelings of the associated trellis diagram become straightforward. Also, from the generator matrix of the code in its convolutional code form, much information about the trade-off between the state connectivity and complexity at each section, and the parallel structure of the trellis, is directly available. Based on this generator matrix, it is shown that the parallel branches in the trellis diagram of the convolutional code represent the same coset code C(sub 1), of smaller dimension and shorter length. Utilizing this fact, a two-stage optimum trellis decoding method is devised. The first stage decodes C(sub 1), while the second stage decodes the associated convolutional code, using the branch metrics delivered by stage 1. Finally, a bidirectional decoding of each received block starting at both ends is presented. If about the same number of computations is required, this approach remains very attractive from a practical point of view as it roughly doubles the decoding speed. This fact is particularly interesting whenever the second half of the trellis is the mirror image of the first half, since the same decoder can be implemented for both parts.

  7. Code Verification Results of an LLNL ASC Code on Some Tri-Lab Verification Test Suite Problems

    SciTech Connect

    Anderson, S R; Bihari, B L; Salari, K; Woodward, C S

    2006-12-29

    As scientific codes become more complex and involve larger numbers of developers and algorithms, chances for algorithmic implementation mistakes increase. In this environment, code verification becomes essential to building confidence in the code implementation. This paper will present first results of a new code verification effort within LLNL's B Division. In particular, we will show results of code verification of the LLNL ASC ARES code on the test problems: Su Olson non-equilibrium radiation diffusion, Sod shock tube, Sedov point blast modeled with shock hydrodynamics, and Noh implosion.

  8. Discussion on LDPC Codes and Uplink Coding

    NASA Technical Reports Server (NTRS)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the progress of the workgroup on Low-Density Parity-Check (LDPC) codes for space link coding. The workgroup is tasked with developing and recommending new error correcting codes for near-Earth, Lunar, and deep space applications. Included in the presentation is a summary of the technical progress of the workgroup. Charts showing the LDPC decoder sensitivity to symbol scaling errors are reviewed, as well as a chart comparing the performance of several frame synchronizer algorithms with that of some good codes, and LDPC decoder tests at ESTL. Also reviewed are a study on Coding, Modulation, and Link Protocol (CMLP) and the recommended codes. A design for the Pseudo-Randomizer with LDPC Decoder and CRC is also reviewed. A chart that summarizes the three proposed coding systems is also presented.

  9. Bar Codes for Libraries.

    ERIC Educational Resources Information Center

    Rahn, Erwin

    1984-01-01

    Discusses the evolution of standards for bar codes (series of printed lines and spaces that represent numbers, symbols, and/or letters of alphabet) and describes the two types most frequently adopted by libraries--Code-A-Bar and CODE 39. Format of the codes is illustrated. Six references and definitions of terminology are appended. (EJS)

  10. Manually operated coded switch

    DOEpatents

    Barnette, Jon H.

    1978-01-01

    The disclosure relates to a manually operated recodable coded switch in which a code may be inserted, tried and used to actuate a lever controlling an external device. After attempting a code, the switch's code wheels must be returned to their zero positions before another try is made.

  11. QR Codes 101

    ERIC Educational Resources Information Center

    Crompton, Helen; LaFrance, Jason; van 't Hooft, Mark

    2012-01-01

    A QR (quick-response) code is a two-dimensional scannable code, similar in function to a traditional bar code that one might find on a product at the supermarket. The main difference between the two is that, while a traditional bar code can hold a maximum of only 20 digits, a QR code can hold up to 7,089 characters, so it can contain much more…

  12. ARA type protograph codes

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush (Inventor); Abbasfar, Aliazam (Inventor); Jones, Christopher R. (Inventor); Dolinar, Samuel J. (Inventor); Thorpe, Jeremy C. (Inventor); Andrews, Kenneth S. (Inventor); Yao, Kung (Inventor)

    2008-01-01

    An apparatus and method for encoding low-density parity check codes. Together with a repeater, an interleaver and an accumulator, the apparatus comprises a precoder, thus forming accumulate-repeat-accumulate (ARA) codes. Protographs representing various types of ARA codes, including AR3A, AR4A and ARJA codes, are described. High performance is obtained when compared to the performance of current repeat-accumulate (RA) or irregular-repeat-accumulate (IRA) codes.

  13. Comparison of the Gauss-Seidel spherical polarized radiative transfer code with other radiative transfer codes.

    PubMed

    Herman, B M; Caudill, T R; Flittner, D E; Thome, K J; Ben-David, A

    1995-07-20

    Calculations that use the Gauss-Seidel method are presented of the diffusely scattered light in a spherical atmosphere with polarization fully included. Comparisons are made between this method and the Monte Carlo calculations of other researchers for spherical geometry in a pure Rayleigh atmosphere. Comparisons with plane-parallel atmospheres are also presented. Single-scatter intensity comparisons with spherical geometry show excellent agreement. When all orders of scattering are included, comparisons of polarization parameters I, Q and U as well as the plane of polarization show good agreement when allowances are made for the statistical variability inherent in the Monte Carlo method.
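
    The Gauss-Seidel method named in the title is, in its algebraic form, a stationary iteration that sweeps through the unknowns using updated values as soon as they are available. A generic sketch on a small diagonally dominant linear system (illustrative only; the radiative-transfer application iterates on scattering orders rather than a matrix equation):

```python
def gauss_seidel(A, b, iters=50):
    """Solve A x = b by Gauss-Seidel sweeps (A diagonally dominant)."""
    n = len(b)
    x = [0.0] * n
    for _ in range(iters):
        for i in range(n):
            # The sum uses already-updated components x[j] for j < i.
            s = sum(A[i][j] * x[j] for j in range(n) if j != i)
            x[i] = (b[i] - s) / A[i][i]
    return x

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = gauss_seidel(A, b)   # exact solution: x = [1/11, 7/11]
```

    Reusing fresh values within a sweep is what typically makes Gauss-Seidel converge faster than a Jacobi-style iteration on the same system.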

  14. Efficient entropy coding for scalable video coding

    NASA Astrophysics Data System (ADS)

    Choi, Woong Il; Yang, Jungyoup; Jeon, Byeungwoo

    2005-10-01

    The standardization of the scalable extension of H.264 has called for additional functionality, based on the H.264 standard, to support combined spatio-temporal and SNR scalability. For the entropy coding of the H.264 scalable extension, the Context-based Adaptive Binary Arithmetic Coding (CABAC) scheme has been considered so far. In this paper, we present a new context modeling scheme that uses inter-layer correlation between syntax elements. As a result, it improves the coding efficiency of entropy coding in the H.264 scalable extension. Simulation results of applying the proposed scheme to encoding the syntax element mb_type show that the improvement in coding efficiency of the proposed method is up to 16% in terms of bit saving, owing to the estimation of a more adequate probability model.

  15. The program RADLST (Radiation Listing)

    SciTech Connect

    Burrows, T.W.

    1988-02-29

    The program RADLST (Radiation Listing) is designed to calculate the nuclear and atomic radiations associated with the radioactive decay of nuclei. It uses as its primary input nuclear decay data in the Evaluated Nuclear Structure Data File (ENSDF) format. The code is written in FORTRAN 77 and, with a few exceptions, is consistent with the ANSI standard. 65 refs.

  16. Effects of Nuclear Interactions in Space Radiation Transport

    NASA Technical Reports Server (NTRS)

    Lin, Zi-Wei; Barghouty, A. F.

    2004-01-01

    Space radiation transport codes have been developed to calculate radiation effects behind materials in human missions to the Moon, Mars or beyond. We study how nuclear fragmentation processes affect predictions from such radiation transport codes. In particular, we investigate the effects of fragmentation cross sections at different energies on fluxes, dose and dose-equivalent from galactic cosmic rays behind typical shielding materials.
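
    The sensitivity studied here can be illustrated with the simplest possible transport model: exponential attenuation of the primary flux, where the fragmentation cross section enters through the interaction mean free path. This toy calculation (assumed aluminum shield properties and invented cross-section values) is not the transport codes discussed in the abstract.

```python
import math

# Primaries attenuate as exp(-x / lambda), with mean free path
# lambda = A / (N_A * rho * sigma) for mass number A, density rho,
# and fragmentation cross section sigma.
N_A = 6.022e23  # Avogadro's number [1/mol]

def primary_flux_fraction(depth_cm, sigma_cm2, rho=2.7, A=27.0):
    """Surviving primary fraction behind an aluminum slab (assumed values)."""
    mfp = A / (N_A * rho * sigma_cm2)   # mean free path [cm]
    return math.exp(-depth_cm / mfp)

# A 10% change in an (invented) cross section shifts the prediction
# behind 10 cm of shielding:
f1 = primary_flux_fraction(10.0, 1.0e-24)
f2 = primary_flux_fraction(10.0, 1.1e-24)
```

    Even this crude model shows why cross-section uncertainties propagate directly into dose predictions behind shielding.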

  17. Effects of Nuclear Interactions in Space Radiation Transport

    NASA Technical Reports Server (NTRS)

    Lin, Zi-Wei; Barghouty, A. F.

    2005-01-01

    Space radiation transport codes have been developed to calculate radiation effects behind materials in human missions to the Moon, Mars or beyond. We study how nuclear fragmentation processes affect predictions from such radiation transport codes. In particular, we investigate the effects of fragmentation cross sections at different energies on fluxes, dose and dose-equivalent from galactic cosmic rays behind typical shielding materials.

  18. Honesty and Honor Codes.

    ERIC Educational Resources Information Center

    McCabe, Donald; Trevino, Linda Klebe

    2002-01-01

    Explores the rise in student cheating and evidence that students cheat less often at schools with an honor code. Discusses effective use of such codes and creation of a peer culture that condemns dishonesty. (EV)

  19. Cellulases and coding sequences

    DOEpatents

    Li, Xin-Liang; Ljungdahl, Lars G.; Chen, Huizhong

    2001-02-20

    The present invention provides three fungal cellulases, their coding sequences, recombinant DNA molecules comprising the cellulase coding sequences, recombinant host cells and methods for producing same. The present cellulases are from Orpinomyces PC-2.

  20. Cellulases and coding sequences

    DOEpatents

    Li, Xin-Liang; Ljungdahl, Lars G.; Chen, Huizhong

    2001-01-01

    The present invention provides three fungal cellulases, their coding sequences, recombinant DNA molecules comprising the cellulase coding sequences, recombinant host cells and methods for producing same. The present cellulases are from Orpinomyces PC-2.

  1. QR Code Mania!

    ERIC Educational Resources Information Center

    Shumack, Kellie A.; Reilly, Erin; Chamberlain, Nik

    2013-01-01

    space, has error-correction capacity, and can be read from any direction. These codes are used in manufacturing, shipping, and marketing, as well as in education. QR codes can be created to produce…

  2. DIANE multiparticle transport code

    NASA Astrophysics Data System (ADS)

    Caillaud, M.; Lemaire, S.; Ménard, S.; Rathouit, P.; Ribes, J. C.; Riz, D.

    2014-06-01

    DIANE is the general Monte Carlo code developed at CEA-DAM. DIANE is a 3D multiparticle multigroup code. DIANE includes automated biasing techniques and is optimized for massive parallel calculations.

  3. Radiation dosimetry.

    PubMed Central

    Cameron, J

    1991-01-01

    This article summarizes the basic facts about the measurement of ionizing radiation, usually referred to as radiation dosimetry. The article defines the common radiation quantities and units; gives typical levels of natural radiation and medical exposures; and describes the most important biological effects of radiation and the methods used to measure radiation. Finally, a proposal is made for a new radiation risk unit to make radiation risks more understandable to nonspecialists. PMID:2040250

  4. EMF wire code research

    SciTech Connect

    Jones, T.

    1993-11-01

    This paper examines the results of previous wire code research to determine the relationship among wire codes, electromagnetic fields, and childhood cancer. The paper suggests that, in the original Savitz study, biases toward producing a false positive association between high wire codes and childhood cancer were created by the selection procedure.

  5. Universal Noiseless Coding Subroutines

    NASA Technical Reports Server (NTRS)

    Schlutsmeyer, A. P.; Rice, R. F.

    1986-01-01

    The software package consists of FORTRAN subroutines that perform universal noiseless coding and decoding of integer and binary data strings. The purpose of this type of coding is to achieve data compression in the sense that the coded data represent the original data perfectly (noiselessly) while taking fewer bits to do so. The routines are universal because they apply to virtually any "real-world" data source.
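
    The Rice family of universal codes behind subroutines of this kind encodes a non-negative integer n with parameter k as a unary quotient, a stop bit, and k binary remainder bits. A minimal sketch (parameter selection, which the real routines adapt to the data, is omitted):

```python
def rice_encode(n, k):
    """Encode non-negative n: unary quotient, '0' stop bit, k remainder bits."""
    q, r = n >> k, n & ((1 << k) - 1)
    return '1' * q + '0' + format(r, f'0{k}b')

def rice_decode(bits, k):
    q = bits.index('0')                       # unary part ends at first '0'
    return (q << k) | int(bits[q + 1:q + 1 + k], 2)

word = rice_encode(9, 2)   # 9 = 2*4 + 1 -> '11' + '0' + '01' = '11001'
```

    Small values get short codewords, which is what makes the scheme effective on the low-entropy residuals typical of real data sources.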

  6. Mapping Local Codes to Read Codes.

    PubMed

    Bonney, Wilfred; Galloway, James; Hall, Christopher; Ghattas, Mikhail; Tramma, Leandro; Nind, Thomas; Donnelly, Louise; Jefferson, Emily; Doney, Alexander

    2017-01-01

    Background & Objectives: Legacy laboratory test codes make it difficult to use clinical datasets for meaningful translational research, where populations are followed for disease risk and outcomes over many years. The Health Informatics Centre (HIC) at the University of Dundee hosts continuous biochemistry data from the clinical laboratories in Tayside and Fife dating back as far as 1987. However, the HIC-managed biochemistry dataset is coupled with incoherent sample types and unstandardised legacy local test codes, which increases the complexity of using the dataset for reasonable population health outcomes. The objective of this study was to map the legacy local test codes to the Scottish 5-byte Version 2 Read Codes using biochemistry data extracted from the repository of the Scottish Care Information (SCI) Store.

  7. Software Certification - Coding, Code, and Coders

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.

  8. Verification of the Calore thermal analysis code.

    SciTech Connect

    Dowding, Kevin J.; Blackwell, Bennie Francis

    2004-07-01

    Calore is the ASC code developed to model steady and transient thermal diffusion with chemistry and dynamic enclosure radiation. An integral part of the software development process is code verification, which addresses the question, 'Are we correctly solving the model equations?' This process aids the developers in that it identifies potential software bugs, and it gives the thermal analyst confidence that a properly prepared input will produce satisfactory output. Grid refinement studies have been performed on problems for which we have analytical solutions. In this talk, the code verification process is overviewed and recent results are presented. Recent verification studies have focused on transient nonlinear heat conduction and on verifying algorithms associated with (tied) contact and adaptive mesh refinement. In addition, an approach to measure the coverage of the verification test suite relative to intended code applications is discussed.
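
    Grid refinement studies of the kind mentioned above typically report an observed order of accuracy: given errors against the analytic solution on two grids refined by ratio r, the order is p = log(e_coarse / e_fine) / log(r). A generic sketch of that calculation (illustrative numbers, not Calore results):

```python
import math

# Observed order of accuracy from a two-grid refinement study.
def observed_order(err_coarse, err_fine, ratio=2.0):
    return math.log(err_coarse / err_fine) / math.log(ratio)

# A second-order scheme should quarter its error when h is halved:
p = observed_order(4.0e-3, 1.0e-3, ratio=2.0)
```

    Agreement between the observed p and the scheme's formal order is the standard pass criterion in a verification study.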

  9. XSOR codes users manual

    SciTech Connect

    Jow, Hong-Nian; Murfin, W.B.; Johnson, J.D.

    1993-11-01

    This report describes the source term estimation codes, XSORs. The codes are written for three pressurized water reactors (Surry, Sequoyah, and Zion) and two boiling water reactors (Peach Bottom and Grand Gulf). The ensemble of codes has been named "XSOR". The purpose of the XSOR codes is to estimate the source terms which would be released to the atmosphere in severe accidents. A source term includes the release fractions of several radionuclide groups, the timing and duration of releases, the rates of energy release, and the elevation of releases. The codes have been developed by Sandia National Laboratories for the US Nuclear Regulatory Commission (NRC) in support of the NUREG-1150 program. The XSOR codes are fast-running parametric codes and are used as surrogates for detailed mechanistic codes. The XSOR codes also provide the capability to explore phenomena, and their uncertainty, which are not currently modeled by the mechanistic codes. The uncertainty distributions of input parameters may be used by an XSOR code to estimate the uncertainty of source terms.

  10. DLLExternalCode

    SciTech Connect

    Greg Flach, Frank Smith

    2014-05-14

    DLLExternalCode is a general dynamic-link library (DLL) interface for linking GoldSim (www.goldsim.com) with external codes. The overall concept is to use GoldSim as top level modeling software with interfaces to external codes for specific calculations. The DLLExternalCode DLL that performs the linking function is designed to take a list of code inputs from GoldSim, create an input file for the external application, run the external code, and return a list of outputs, read from files created by the external application, back to GoldSim. Instructions for creating the input file, running the external code, and reading the output are contained in an instructions file that is read and interpreted by the DLL.
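
    The write-inputs / run / read-outputs pattern described above is common to most external-code couplers and can be sketched generically. This is a hypothetical illustration (with `cat` standing in for a real external executable), not the actual DLL.

```python
import os
import subprocess
import tempfile

def run_external(inputs, command=("cat",)):
    """Write inputs to a file, run the external code on it, return its output."""
    with tempfile.NamedTemporaryFile("w", suffix=".inp", delete=False) as f:
        f.write("\n".join(f"{k} = {v}" for k, v in inputs.items()))
        path = f.name
    try:
        # The external code reads the input file and writes results to stdout
        # (a real coupler would instead parse files the code produces).
        result = subprocess.run([*command, path], capture_output=True,
                                text=True, check=True)
        return result.stdout
    finally:
        os.remove(path)

out = run_external({"flow_rate": 1.5, "temperature": 300.0})
```

    The instructions-file mechanism in the real DLL generalizes exactly the parts hard-coded here: how to format the input file, which command to run, and how to parse the outputs.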

  11. Defeating the coding monsters.

    PubMed

    Colt, Ross

    2007-02-01

    Accuracy in coding is rapidly becoming a required skill for military health care providers. Clinic staffing, equipment purchase decisions, and even reimbursement will soon be based on the coding data that we provide. Learning the complicated myriad of rules to code accurately can seem overwhelming. However, the majority of clinic visits in a typical outpatient clinic generally fall into two major evaluation and management codes, 99213 and 99214. If health care providers can learn the rules required to code a 99214 visit, then this will provide a 90% solution that can enable them to accurately code the majority of their clinic visits. This article demonstrates a step-by-step method to code a 99214 visit, by viewing each of the three requirements as a monster to be defeated.

  12. Nonequilibrium radiation and chemistry models for aerocapture vehicle flowfields

    NASA Technical Reports Server (NTRS)

    Carlson, Leland A.

    1990-01-01

    The continued development and improvement of the viscous shock layer (VSL) nonequilibrium chemistry blunt body engineering code, the incorporation in a coupled manner of radiation models into the VSL code, and the initial development of appropriate precursor models are presented.

  13. Radiation from advanced solid rocket motor plumes

    NASA Astrophysics Data System (ADS)

    Farmer, Richard C.; Smith, Sheldon D.; Myruski, Brian L.

    1994-12-01

    The overall objective of this study was to develop an understanding of solid rocket motor (SRM) plumes in sufficient detail to accurately explain the majority of plume radiation test data. Improved flowfield and radiation analysis codes were developed to accurately and efficiently account for all the factors which affect radiation heating from rocket plumes. These codes were verified by comparing predicted plume behavior with measured NASA/MSFC ASRM test data. Upon conducting a thorough review of the current state-of-the-art of SRM plume flowfield and radiation prediction methodology and the pertinent data base, the following analyses were developed for future design use. The NOZZRAD code was developed for preliminary base heating design and Al2O3 particle optical property data evaluation using a generalized two-flux solution to the radiative transfer equation. The IDARAD code was developed for rapid evaluation of plume radiation effects using the spherical harmonics method of differential approximation to the radiative transfer equation. The FDNS CFD code with fully coupled Euler-Lagrange particle tracking was validated by comparison to predictions made with the industry standard RAMP code for SRM nozzle flowfield analysis. The FDNS code provides the ability to analyze not only rocket nozzle flow, but also axisymmetric and three-dimensional plume flowfields with state-of-the-art CFD methodology. Procedures for conducting meaningful thermo-vision camera studies were developed.

  14. Radiation from advanced solid rocket motor plumes

    NASA Technical Reports Server (NTRS)

    Farmer, Richard C.; Smith, Sheldon D.; Myruski, Brian L.

    1994-01-01

    The overall objective of this study was to develop an understanding of solid rocket motor (SRM) plumes in sufficient detail to accurately explain the majority of plume radiation test data. Improved flowfield and radiation analysis codes were developed to accurately and efficiently account for all the factors which affect radiation heating from rocket plumes. These codes were verified by comparing predicted plume behavior with measured NASA/MSFC ASRM test data. Upon conducting a thorough review of the current state-of-the-art of SRM plume flowfield and radiation prediction methodology and the pertinent data base, the following analyses were developed for future design use. The NOZZRAD code was developed for preliminary base heating design and Al2O3 particle optical property data evaluation using a generalized two-flux solution to the radiative transfer equation. The IDARAD code was developed for rapid evaluation of plume radiation effects using the spherical harmonics method of differential approximation to the radiative transfer equation. The FDNS CFD code with fully coupled Euler-Lagrange particle tracking was validated by comparison to predictions made with the industry standard RAMP code for SRM nozzle flowfield analysis. The FDNS code provides the ability to analyze not only rocket nozzle flow, but also axisymmetric and three-dimensional plume flowfields with state-of-the-art CFD methodology. Procedures for conducting meaningful thermo-vision camera studies were developed.

  15. Relativistic radiation damping for simulation

    NASA Astrophysics Data System (ADS)

    Chotia, Amodsen

    2005-10-01

    The aim of this work is to implement radiation braking in a simulation code. The radiation physics of accelerated charges is not new: it dates from the end of the 19th century, from Maxwell's theory and the work of Larmor, Poynting, Thomson, Poincare, Lorentz, von Laue, Abraham, Schott, Planck, Landau, Einstein, Dirac, and Wheeler and Feynman (among many others). Its consequences range from the lifetimes of excited atomic levels and the behavior of antennas to the production of bremsstrahlung radiation in particle accelerators, as well as space and stellar astrophysics. In this work we start from the Landau-Lifshitz equation to express the four-acceleration in terms of the fields. Using a result from Pomeranchuk, we deduce the energy lost by radiation. We perform an instantaneous collinear projection of the velocity vector in order to subtract the kinetic energy lost to radiation. The equation of motion is then solved with the Boris algorithm. The code is tested on a few examples.
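
    The Boris algorithm named above is the standard particle pusher in plasma simulation: a half electric kick, a magnetic rotation, and a second half kick. A minimal nonrelativistic sketch (the radiation-damping correction described in the abstract is omitted here):

```python
def cross(a, b):
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def boris_push(v, E, B, q_over_m, dt):
    """One Boris velocity update: half E kick, B rotation, half E kick."""
    half = [vi + 0.5 * dt * q_over_m * Ei for vi, Ei in zip(v, E)]
    t = [0.5 * dt * q_over_m * Bi for Bi in B]               # rotation vector
    s = [2.0 * ti / (1.0 + sum(tj * tj for tj in t)) for ti in t]
    vp = [hi + ci for hi, ci in zip(half, cross(half, t))]
    rot = [hi + ci for hi, ci in zip(half, cross(vp, s))]    # rotated velocity
    return [ri + 0.5 * dt * q_over_m * Ei for ri, Ei in zip(rot, E)]

# With E = 0 the rotation conserves speed exactly, which is why Boris is the
# natural base scheme onto which an energy-loss correction can be grafted.
v1 = boris_push([1.0, 0.0, 0.0], [0.0, 0.0, 0.0], [0.0, 0.0, 1.0], 1.0, 0.1)
```

    The collinear-projection step in the paper would then rescale the velocity after each push to remove the radiated kinetic energy.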

  16. Radiation transport Part B: Applications with examples

    SciTech Connect

    Beutler, D.E.

    1997-06-01

    In the previous sections, Len Lorence described the need for, theory of, and types of radiation codes that can be applied to model the results of radiation effects tests or working environments for electronics. For the rest of this segment, the author will concentrate on the specific ways the codes can be used to predict device response or analyze radiation test results. Regardless of whether one is predicting responses in a working or test environment, the procedures are virtually the same. The same can be said for the use of 1-, 2-, or 3-dimensional codes and Monte Carlo or discrete ordinates codes. No attempt is made to instruct the student on the specifics of the code; for example, the author will not discuss details such as the number of meshes, energy groups, etc. that are appropriate for a discrete ordinates code. For the sake of simplicity, he will restrict himself to the 1-dimensional code CEPXS/ONELD. This code, along with a wide variety of other radiation codes, can be obtained from the Radiation Safety Information Computational Center (RSICC) for a nominal handling fee.

  17. Benchmarking the Multidimensional Stellar Implicit Code MUSIC

    NASA Astrophysics Data System (ADS)

    Goffrey, T.; Pratt, J.; Viallet, M.; Baraffe, I.; Popov, M. V.; Walder, R.; Folini, D.; Geroux, C.; Constantino, T.

    2017-03-01

    We present the results of a numerical benchmark study for the MUltidimensional Stellar Implicit Code (MUSIC) based on widely applicable two- and three-dimensional compressible hydrodynamics problems relevant to stellar interiors. MUSIC is an implicit large eddy simulation code that uses implicit time integration, implemented as a Jacobian-free Newton Krylov method. A physics based preconditioning technique which can be adjusted to target varying physics is used to improve the performance of the solver. The problems used for this benchmark study include the Rayleigh-Taylor and Kelvin-Helmholtz instabilities, and the decay of the Taylor-Green vortex. Additionally we show a test of hydrostatic equilibrium, in a stellar environment which is dominated by radiative effects. In this setting the flexibility of the preconditioning technique is demonstrated. This work aims to bridge the gap between the hydrodynamic test problems typically used during development of numerical methods and the complex flows of stellar interiors. A series of multidimensional tests were performed and analysed. Each of these test cases was analysed with a simple, scalar diagnostic, with the aim of enabling direct code comparisons. As the tests performed do not have analytic solutions, we verify MUSIC by comparing it to established codes including ATHENA and the PENCIL code. MUSIC is able to both reproduce behaviour from established and widely-used codes as well as results expected from theoretical predictions. This benchmarking study concludes a series of papers describing the development of the MUSIC code and provides confidence in future applications.

  18. DOE 2011 occupational radiation exposure

    SciTech Connect

    none,

    2012-12-01

    The U.S. Department of Energy (DOE) Office of Analysis within the Office of Health, Safety and Security (HSS) publishes the annual DOE Occupational Radiation Exposure Report to provide an overview of the status of radiation protection practices at DOE (including the National Nuclear Security Administration [NNSA]). The DOE 2011 Occupational Radiation Exposure Report provides an evaluation of DOE-wide performance regarding compliance with Title 10, Code of Federal Regulations (C.F.R.), Part 835, Occupational Radiation Protection dose limits and as low as reasonably achievable (ALARA) process requirements. In addition, the report provides data to DOE organizations responsible for developing policies for protection of individuals from the adverse health effects of radiation. The report provides a summary and an analysis of occupational radiation exposure information from the monitoring of individuals involved in DOE activities. The occupational radiation exposure information is analyzed in terms of aggregate data, dose to individuals, and dose by site over the past five years.

  19. DOE 2012 occupational radiation exposure

    SciTech Connect

    none,

    2013-10-01

    The U.S. Department of Energy (DOE) Office of Analysis within the Office of Health, Safety and Security (HSS) publishes the annual DOE Occupational Radiation Exposure Report to provide an overview of the status of radiation protection practices at DOE (including the National Nuclear Security Administration [NNSA]). The DOE 2012 Occupational Radiation Exposure Report provides an evaluation of DOE-wide performance regarding compliance with Title 10, Code of Federal Regulations (C.F.R.), Part 835, Occupational Radiation Protection dose limits and as low as reasonably achievable (ALARA) process requirements. In addition, the report provides data to DOE organizations responsible for developing policies for protection of individuals from the adverse health effects of radiation. The report provides a summary and an analysis of occupational radiation exposure information from the monitoring of individuals involved in DOE activities. Over the past 5-year period, the occupational radiation exposure information is analyzed in terms of aggregate data, dose to individuals, and dose by site.

  20. Implict Monte Carlo Radiation Transport Simulations of Four Test Problems

    SciTech Connect

    Gentile, N

    2007-08-01

    Radiation transport codes, like almost all codes, are difficult to develop and debug. It is helpful to have small, easy to run test problems with known answers to use in development and debugging. It is also prudent to re-run test problems periodically during development to ensure that previous code capabilities have not been lost. We describe four radiation transport test problems with analytic or approximate analytic answers. These test problems are suitable for use in debugging and testing radiation transport codes. We also give results of simulations of these test problems performed with an Implicit Monte Carlo photonics code.

  1. Mechanical code comparator

    DOEpatents

    Peter, Frank J.; Dalton, Larry J.; Plummer, David W.

    2002-01-01

    A new class of mechanical code comparators is described which have broad potential for application in safety, surety, and security applications. These devices can be implemented as micro-scale electromechanical systems that isolate a secure or otherwise controlled device until an access code is entered. This access code is converted into a series of mechanical inputs to the mechanical code comparator, which compares the access code to a pre-input combination, entered previously into the mechanical code comparator by an operator at the system security control point. These devices provide extremely high levels of robust security. Being totally mechanical in operation, an access control system properly based on such devices cannot be circumvented by software attack alone.
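
    The comparison logic described above can be sketched in software; this is a toy analog only (the actual device is a micro-scale electromechanical system, and its security comes precisely from not being software):

```python
class CodeComparator:
    """Software analog of the mechanical code comparator: a combination
    pre-input by an operator is compared, element by element, against a
    series of inputs; access is granted only on a full match."""

    def __init__(self, combination):
        self._combination = tuple(combination)   # pre-input combination

    def try_access(self, access_code):
        inputs = tuple(access_code)              # series of inputs
        return inputs == self._combination       # release only on exact match

lock = CodeComparator([3, 1, 4, 1, 5])
assert lock.try_access([3, 1, 4, 1, 5])
assert not lock.try_access([3, 1, 4, 1, 6])
```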

  2. More box codes

    NASA Technical Reports Server (NTRS)

    Solomon, G.

    1992-01-01

    A new investigation shows that, starting from the BCH (21,15;3) code represented as a 7 x 3 matrix and adding a row and column for even parity, one obtains an 8 x 4 matrix (32,15;8) code. An additional dimension is obtained by specifying odd parity on the rows and even parity on the columns, i.e., by adjoining to the 8 x 4 matrix a matrix that is zero except for the fourth column (of all ones). Furthermore, any seven rows and three columns will form the BCH (21,15;3) code. This box code has the same weight structure as the quadratic residue and BCH codes of the same dimensions. Whether there exists an algebraic isomorphism to either code is as yet unknown.
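
    The parity-extension step of the construction (7 x 3 array to 8 x 4 array) can be sketched directly; the example codeword below is arbitrary, chosen only to exercise the construction:

```python
def extend_with_even_parity(block):
    """Append an even-parity bit to each row, then an even-parity row,
    turning a 7x3 binary array into an 8x4 array as in the box-code
    construction described above (entries are 0/1)."""
    rows = [r + [sum(r) % 2] for r in block]             # 7x4, rows even
    parity_row = [sum(col) % 2 for col in zip(*rows)]    # makes columns even
    return rows + [parity_row]                           # 8x4

word = [[1, 0, 1], [0, 1, 1], [1, 1, 0], [0, 0, 0],
        [1, 0, 0], [0, 1, 0], [1, 1, 1]]
ext = extend_with_even_parity(word)
assert len(ext) == 8 and all(len(r) == 4 for r in ext)
assert all(sum(r) % 2 == 0 for r in ext)                 # every row even
assert all(sum(c) % 2 == 0 for c in zip(*ext))           # every column even
```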

  3. Generating code adapted for interlinking legacy scalar code and extended vector code

    DOEpatents

    Gschwind, Michael K

    2013-06-04

    Mechanisms for intermixing code are provided. Source code is received for compilation using an extended Application Binary Interface (ABI) that extends a legacy ABI and uses a different register configuration than the legacy ABI. First compiled code is generated based on the source code, the first compiled code comprising code for accommodating the difference in register configurations used by the extended ABI and the legacy ABI. The first compiled code and second compiled code are intermixed to generate intermixed code, the second compiled code being compiled code that uses the legacy ABI. The intermixed code comprises at least one call instruction that is one of a call from the first compiled code to the second compiled code or a call from the second compiled code to the first compiled code. The code for accommodating the difference in register configurations is associated with the at least one call instruction.

  4. Non-Ionizing Radiation Used in Microwave Ovens

    MedlinePlus

    ... that emit radiation. The standards ensure that radiation emissions do not pose a hazard to public health. These standards can be viewed on FDA's Code of Federal Regulations on Microwave Ovens. FDA establishes performance standards for ...

  5. Industrial Computer Codes

    NASA Technical Reports Server (NTRS)

    Shapiro, Wilbur

    1996-01-01

    This is an overview of new and updated industrial codes for seal design and testing. GCYLT (gas cylindrical seals -- turbulent), SPIRALI (spiral-groove seals -- incompressible), KTK (knife-to-knife) labyrinth seal code, and DYSEAL (dynamic seal analysis) are covered. GCYLT uses G-factors for Poiseuille and Couette turbulence coefficients. SPIRALI is updated to include turbulence and inertia, but maintains the narrow-groove theory. The KTK labyrinth seal code handles straight or stepped seals. DYSEAL provides dynamics for the seal geometry.

  6. Phonological coding during reading

    PubMed Central

    Leinenger, Mallorie

    2014-01-01

    The exact role that phonological coding (the recoding of written, orthographic information into a sound-based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early (pre-lexical) or that phonological codes come online late (post-lexical)) are discussed. Next, the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eyetracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model (Van Orden, 1987), dual-route model (e.g., Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001), parallel distributed processing model (Seidenberg & McClelland, 1989)) are discussed. PMID:25150679

  7. GALPROP: New Developments in CR Propagation Code

    NASA Technical Reports Server (NTRS)

    Moskalenko, I. V.; Jones, F. C.; Mashnik, S. G.; Strong, A. W.; Ptuskin, V. S.

    2003-01-01

    The numerical Galactic CR propagation code GALPROP has been shown to reproduce simultaneously observational data of many kinds related to CR origin and propagation. It has been validated on direct measurements of nuclei, antiprotons, electrons, positrons as well as on astronomical measurements of gamma rays and synchrotron radiation. Such data provide many independent constraints on model parameters while revealing some contradictions in the conventional view of Galactic CR propagation. Using a new version of GALPROP we study new effects such as processes of wave-particle interactions in the interstellar medium. We also report about other developments in the CR propagation code.

  8. A MULTIPURPOSE COHERENT INSTABILITY SIMULATION CODE

    SciTech Connect

    Blaskiewicz, M.

    2007-06-25

    A multipurpose coherent instability simulation code has been written, documented, and released for use. TRANFT (tran-eff-tee) uses fast Fourier transforms to model transverse wakefields, transverse detuning wakes and longitudinal wakefields in a computationally efficient way. Dual harmonic RF allows for the study of enhanced synchrotron frequency spread. When coupled with chromaticity, the theoretically challenging but highly practical post head-tail regime is open to study. Detuning wakes allow for transverse space charge forces in low energy hadron beams, and a switch allowing for radiation damping makes the code useful for electrons.
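
    The computational pattern the abstract attributes to TRANFT, evaluating a wakefield's effect on a bunch as a convolution done efficiently with FFTs, can be sketched as follows. The Gaussian bunch profile and the damped-resonator wake function here are made up for illustration; none of the parameters come from TRANFT itself:

```python
import numpy as np

n = 1024
dz = 0.01                                        # slice spacing (arbitrary units)
z = np.arange(n) * dz
profile = np.exp(-0.5 * ((z - 3.0) / 0.5) ** 2)           # Gaussian bunch profile
wake = np.where(z > 0, np.exp(-z) * np.sin(5 * z), 0.0)   # causal toy wake function

# Zero-pad to 2n to avoid circular wrap-around, multiply spectra, and
# invert: an O(n log n) linear convolution of profile and wake.
kick = np.fft.irfft(np.fft.rfft(profile, 2 * n) *
                    np.fft.rfft(wake, 2 * n))[:n] * dz

direct = np.convolve(profile, wake)[:n] * dz     # O(n^2) reference computation
assert np.allclose(kick, direct)
```

    The FFT route and the direct sum agree to rounding error, which is why FFT-based wake evaluation is the standard efficiency trick in instability simulation.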

  9. Tokamak Systems Code

    SciTech Connect

    Reid, R.L.; Barrett, R.J.; Brown, T.G.; Gorker, G.E.; Hooper, R.J.; Kalsi, S.S.; Metzler, D.H.; Peng, Y.K.M.; Roth, K.E.; Spampinato, P.T.

    1985-03-01

    The FEDC Tokamak Systems Code calculates tokamak performance, cost, and configuration as a function of plasma engineering parameters. This version of the code models experimental tokamaks. It does not currently consider tokamak configurations that generate electrical power or incorporate breeding blankets. The code has a modular (or subroutine) structure to allow independent modeling for each major tokamak component or system. A primary benefit of modularization is that a component module may be updated without disturbing the remainder of the systems code as long as the input to or output from the module remains unchanged.

  10. Topological subsystem codes

    SciTech Connect

    Bombin, H.

    2010-03-15

    We introduce a family of two-dimensional (2D) topological subsystem quantum error-correcting codes. The gauge group is generated by two-local Pauli operators, so that two-local measurements are enough to recover the error syndrome. We study the computational power of code deformation in these codes and show that boundaries cannot be introduced in the usual way. In addition, we give a general mapping connecting suitable classical statistical mechanical models to optimal error correction in subsystem stabilizer codes that suffer from depolarizing noise.

  11. FAA Smoke Transport Code

    SciTech Connect

    Domino, Stefan; Luketa-Hanlin, Anay; Gallegos, Carlos

    2006-10-27

    FAA Smoke Transport Code, a physics-based Computational Fluid Dynamics tool, which couples heat, mass, and momentum transfer, has been developed to provide information on smoke transport in cargo compartments with various geometries and flight conditions. The software package contains a graphical user interface for specification of geometry and boundary conditions, analysis module for solving the governing equations, and a post-processing tool. The current code was produced by making substantial improvements and additions to a code obtained from a university. The original code was able to compute steady, uniform, isothermal turbulent pressurization. In addition, a preprocessor and postprocessor were added to arrive at the current software package.

  12. Transonic airfoil codes

    NASA Technical Reports Server (NTRS)

    Garabedian, P. R.

    1979-01-01

    Computer codes for the design and analysis of transonic airfoils are considered. The design code relies on the method of complex characteristics in the hodograph plane to construct shockless airfoil. The analysis code uses artificial viscosity to calculate flows with weak shock waves at off-design conditions. Comparisons with experiments show that an excellent simulation of two dimensional wind tunnel tests is obtained. The codes have been widely adopted by the aircraft industry as a tool for the development of supercritical wing technology.

  13. Hybrid Compton camera/coded aperture imaging system

    DOEpatents

    Mihailescu, Lucian [Livermore, CA]; Vetter, Kai M [Alameda, CA]

    2012-04-10

    A system in one embodiment includes an array of radiation detectors; and an array of imagers positioned behind the array of detectors relative to an expected trajectory of incoming radiation. A method in another embodiment includes detecting incoming radiation with an array of radiation detectors; detecting the incoming radiation with an array of imagers positioned behind the array of detectors relative to a trajectory of the incoming radiation; and performing at least one of Compton imaging using at least the imagers and coded aperture imaging using at least the imagers. A method in yet another embodiment includes detecting incoming radiation with an array of imagers positioned behind an array of detectors relative to a trajectory of the incoming radiation; and performing Compton imaging using at least the imagers.

  14. Radiation Therapy

    MedlinePlus

    ... how to cope with side effects. What Is Radiation Therapy? Cancer is a disease that causes cells ...

  15. CosmoRec: Cosmological Recombination code

    NASA Astrophysics Data System (ADS)

    Chluba, Jens; Thomas, Rajat Mani

    2013-04-01

    CosmoRec solves the recombination problem including recombinations to highly excited states, corrections to the 2s-1s two-photon channel, HI Lyn-feedback, n>2 two-photon profile corrections, and n≥2 Raman-processes. The code can solve the radiative transfer equation of the Lyman-series photon field to obtain the required modifications to the rate equations of the resolved levels, and handles electron scattering, the effect of HeI intercombination transitions, and absorption of helium photons by hydrogen. It also allows accounting for dark matter annihilation and optionally includes detailed helium radiative transfer effects.

  16. Radiation Protection

    MedlinePlus

    United States Environmental Protection Agency (US EPA) Radiation Protection Document Library ...

  17. Atmospheric radiation

    SciTech Connect

    Harshvardhan, M.R.

    1991-01-01

    Studies of atmospheric radiative processes are summarized for the period 1987-1990. Topics discussed include radiation modeling; clouds and radiation; radiative effects in dynamics and climate; radiation budget and aerosol effects; and gaseous absorption, particulate scattering and surface reflection. It is concluded that the key developments of the period are a defining of the radiative forcing to the climate system by trace gases and clouds, the recognition that cloud microphysics and morphology need to be incorporated not only into radiation models but also climate models, and the isolation of a few important unsolved theoretical problems in atmospheric radiation.

  18. Fast Coding Unit Encoding Mechanism for Low Complexity Video Coding

    PubMed Central

    Wu, Yueying; Jia, Kebin; Gao, Guandong

    2016-01-01

    In high efficiency video coding (HEVC), the coding tree contributes to excellent compression performance. However, the coding tree brings extremely high computational complexity. Innovative work on improving the coding tree to further reduce encoding time is presented in this paper. A novel low-complexity coding tree mechanism is proposed for HEVC fast coding unit (CU) encoding. Firstly, this paper makes an in-depth study of the relationship among CU distribution, quantization parameter (QP), and content change (CC). Secondly, a CU coding tree probability model is proposed for modeling and predicting the CU distribution. Eventually, a CU coding tree probability update is proposed, aiming to address the probabilistic model distortion caused by CC. Experimental results show that the proposed low-complexity CU coding tree mechanism significantly reduces encoding time by 27% for lossy coding and 42% for visually lossless coding and lossless coding. The proposed low-complexity CU coding tree mechanism is devoted to improving coding performance under various application conditions. PMID:26999741
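
    The core idea, maintaining a probability model of CU split decisions and updating it online so it tracks content change, can be illustrated with a deliberately simplified sketch. The exponential update rule and the confidence threshold below are illustrative stand-ins, not the paper's actual model:

```python
class CUSplitModel:
    """Toy version of the paper's idea: track a running probability that a
    CU at some depth is split, update it online as content changes, and
    skip the exhaustive rate-distortion search when the model is confident."""

    def __init__(self, p_split=0.5, learning_rate=0.05):
        self.p_split = p_split
        self.lr = learning_rate

    def update(self, was_split):
        # Exponential update keeps the estimate adapting to content change.
        self.p_split += self.lr * ((1.0 if was_split else 0.0) - self.p_split)

    def can_skip_rd_search(self, threshold=0.9):
        # Confident either way -> predict the decision instead of searching.
        return self.p_split > threshold or self.p_split < 1 - threshold

m = CUSplitModel()
for _ in range(100):
    m.update(True)              # a run of frames where this depth always splits
assert m.p_split > 0.9 and m.can_skip_rd_search()
```

    The probability update is what counters the "model distortion" problem: without it, a model fitted to earlier content would mispredict after a scene change.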

  19. Fast Coding Unit Encoding Mechanism for Low Complexity Video Coding.

    PubMed

    Gao, Yuan; Liu, Pengyu; Wu, Yueying; Jia, Kebin; Gao, Guandong

    2016-01-01

    In high efficiency video coding (HEVC), the coding tree contributes to excellent compression performance. However, the coding tree brings extremely high computational complexity. Innovative work on improving the coding tree to further reduce encoding time is presented in this paper. A novel low-complexity coding tree mechanism is proposed for HEVC fast coding unit (CU) encoding. Firstly, this paper makes an in-depth study of the relationship among CU distribution, quantization parameter (QP), and content change (CC). Secondly, a CU coding tree probability model is proposed for modeling and predicting the CU distribution. Eventually, a CU coding tree probability update is proposed, aiming to address the probabilistic model distortion caused by CC. Experimental results show that the proposed low-complexity CU coding tree mechanism significantly reduces encoding time by 27% for lossy coding and 42% for visually lossless coding and lossless coding. The proposed low-complexity CU coding tree mechanism is devoted to improving coding performance under various application conditions.

  20. Dress Codes for Teachers?

    ERIC Educational Resources Information Center

    Million, June

    2004-01-01

    In this article, the author discusses an e-mail survey of principals from across the country regarding whether or not their school had a formal staff dress code. The results indicate that most did not have a formal dress code, but agreed that professional dress for teachers was not only necessary, but showed respect for the school and had a…

  1. Lichenase and coding sequences

    DOEpatents

    Li, Xin-Liang; Ljungdahl, Lars G.; Chen, Huizhong

    2000-08-15

    The present invention provides a fungal lichenase, i.e., an endo-1,3-1,4-.beta.-D-glucanohydrolase, its coding sequence, recombinant DNA molecules comprising the lichenase coding sequences, recombinant host cells and methods for producing same. The present lichenase is from Orpinomyces PC-2.

  2. Legacy Code Modernization

    NASA Technical Reports Server (NTRS)

    Hribar, Michelle R.; Frumkin, Michael; Jin, Haoqiang; Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    Over the past decade, high performance computing has evolved rapidly; systems based on commodity microprocessors have been introduced in quick succession from at least seven vendors/families. Porting codes to every new architecture is a difficult problem; in particular, here at NASA, there are many large CFD applications that are very costly to port to new machines by hand. The LCM ("Legacy Code Modernization") Project is the development of an integrated parallelization environment (IPE) which performs the automated mapping of legacy CFD (Fortran) applications to state-of-the-art high performance computers. While most projects to port codes focus on the parallelization of the code, we consider porting to be an iterative process consisting of several steps: 1) code cleanup, 2) serial optimization,3) parallelization, 4) performance monitoring and visualization, 5) intelligent tools for automated tuning using performance prediction and 6) machine specific optimization. The approach for building this parallelization environment is to build the components for each of the steps simultaneously and then integrate them together. The demonstration will exhibit our latest research in building this environment: 1. Parallelizing tools and compiler evaluation. 2. Code cleanup and serial optimization using automated scripts 3. Development of a code generator for performance prediction 4. Automated partitioning 5. Automated insertion of directives. These demonstrations will exhibit the effectiveness of an automated approach for all the steps involved with porting and tuning a legacy code application for a new architecture.

  3. Synthesizing Certified Code

    NASA Technical Reports Server (NTRS)

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach to demonstrate software quality on a formal level. Its basic idea is to require producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates which can be checked independently. Since code certification uses the same underlying technology as program verification, it also requires many detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding these annotations to the code is time-consuming and error-prone. We address this problem by combining code certification with automatic program synthesis. We propose an approach to generate simultaneously, from a high-level specification, code and all annotations required to certify the generated code. Here, we describe a certification extension of AUTOBAYES, a synthesis tool which automatically generates complex data analysis programs from compact specifications. AUTOBAYES contains sufficient high-level domain knowledge to generate detailed annotations. This allows us to use a general-purpose verification condition generator to produce a set of proof obligations in first-order logic. The obligations are then discharged using the automated theorem prover E-SETHEO. We demonstrate our approach by certifying operator safety for a generated iterative data classification program without manual annotation of the code.

  4. Coding Acoustic Metasurfaces.

    PubMed

    Xie, Boyang; Tang, Kun; Cheng, Hua; Liu, Zhengyou; Chen, Shuqi; Tian, Jianguo

    2017-02-01

    Coding acoustic metasurfaces can combine simple logical bits to acquire sophisticated functions in wave control. The acoustic logical bits can achieve a phase difference of exactly π and a perfect match of the amplitudes for the transmitted waves. By programming the coding sequences, acoustic metasurfaces with various functions, including creating peculiar antenna patterns and waves focusing, have been demonstrated.
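
    The logical bits described above contribute transmitted waves with phases 0 and pi, and the coding sequence shapes the far-field pattern. A minimal sketch using a one-dimensional array factor (the element spacing and sequence lengths are illustrative assumptions, not values from the paper):

```python
import cmath
import math

def array_factor(bits, theta, d_over_lambda=0.5):
    """Far-field array factor magnitude of a 1-D coding sequence: each
    logical bit contributes phase 0 ('0' bit) or pi ('1' bit). This is a
    generic phased-array model, not the paper's full acoustic simulation."""
    k_d = 2 * math.pi * d_over_lambda
    return abs(sum(cmath.exp(1j * (k_d * n * math.sin(theta) + math.pi * b))
                   for n, b in enumerate(bits)))

uniform = [0] * 16            # coding sequence 0000... -> strong broadside beam
alternating = [0, 1] * 8      # coding sequence 0101... -> energy steered off axis

# On axis (theta = 0), the uniform sequence adds coherently while the
# alternating 0/pi sequence cancels, illustrating pattern control by coding.
assert array_factor(uniform, 0.0) > array_factor(alternating, 0.0)
```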

  5. Computerized mega code recording.

    PubMed

    Burt, T W; Bock, H C

    1988-04-01

    A system has been developed to facilitate recording of advanced cardiac life support mega code testing scenarios. By scanning a paper "keyboard" using a bar code wand attached to a portable microcomputer, the person assigned to record the scenario can easily generate an accurate, complete, timed, and typewritten record of the given situations and the obtained responses.

  6. Pseudonoise code tracking loop

    NASA Technical Reports Server (NTRS)

    Laflame, D. T. (Inventor)

    1980-01-01

    A delay-locked loop is presented for tracking a pseudonoise (PN) reference code in an incoming communication signal. The loop is less sensitive to gain imbalances, which can otherwise introduce timing errors in the PN reference code formed by the loop.
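
    The tracking principle behind such a loop can be sketched with an early-late discriminator: correlate the incoming signal against early and late replicas of the local PN code, and let the sign of the difference steer the replica timing. This is a toy whole-chip digital analog (real loops use fractional-chip spacing and continuous delays), not the patented circuit:

```python
import random

def rotate(seq, s):
    s %= len(seq)
    return seq[s:] + seq[:s]

def correlate(a, b):
    return sum(x * y for x, y in zip(a, b))

def dll_error(incoming, reference, offset):
    """Early-late discriminator: its sign tells the loop which way to
    slew the local PN replica toward alignment."""
    early = correlate(incoming, rotate(reference, offset - 1))
    late = correlate(incoming, rotate(reference, offset + 1))
    return early - late

rng = random.Random(0)
pn = [rng.choice([-1, 1]) for _ in range(127)]   # toy +/-1 PN sequence
incoming = rotate(pn, 3)                          # signal arrives 3 chips shifted

# Within a chip of lock, the discriminator points toward the true delay:
# negative error below the true offset, positive error above it.
assert dll_error(incoming, pn, 2) < 0 < dll_error(incoming, pn, 4)
```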

  7. Evolving genetic code

    PubMed Central

    OHAMA, Takeshi; INAGAKI, Yuji; BESSHO, Yoshitaka; OSAWA, Syozo

    2008-01-01

    In 1985, we reported that a bacterium, Mycoplasma capricolum, used a deviant genetic code, in which UGA, a “universal” stop codon, was read as tryptophan. This finding, together with the deviant nuclear genetic codes found in quite a few organisms and a number of mitochondria, shows that the genetic code is not universal and is in a state of evolution. To account for the changes in codon meanings, we proposed the codon capture theory, which states that all the code changes are non-disruptive, occurring without accompanying changes to the amino acid sequences of proteins. Supporting evidence for the theory is presented in this review. A possible evolutionary process from the ancient to the present-day genetic code is also discussed. PMID:18941287

  8. Radiation physics, biophysics, and radiation biology

    SciTech Connect

    Hall, E.J.; Zaider, M.

    1993-05-01

    Research at the Center for Radiological Research is a multidisciplinary blend of physics, chemistry, and biology aimed at understanding the mechanisms involved in the health problems resulting from human exposure to ionizing radiation. The focus has increasingly been on biochemistry and the application of the techniques of molecular biology to the problems of radiation biology. Research highlights of the program from the past year are described. A mathematical model describing the production of single-strand and double-strand breaks in DNA as a function of radiation quality has been completed. For the first time, Monte Carlo techniques have been used to obtain directly the spatial distribution of DNA moieties altered by radiation. This information was obtained by including in the transport codes a realistic description of the electronic structure of DNA. We have investigated structure-activity relationships for the potential oncogenicity of a new generation of bioreductive drugs that function as hypoxic cytotoxins. Experimental and theoretical investigation of the inverse dose-rate effect, whereby medium-LET radiations actually produce an enhanced effect when the dose is protracted, is now at a point where the basic mechanisms are reasonably understood, and the complex interplay between dose, dose rate, and radiation quality that is necessary for the effect to be present can now be predicted, at least in vitro. In terms of early radiobiological damage, a quantitative link has been established between basic energy deposition and locally multiply damaged sites, the radiochemical precursor of DNA double-strand breaks; specifically, the spatial and energy deposition requirements necessary to form LMDs have been evaluated. For the first time, a mechanistically understood biological "fingerprint" of high-LET radiation has been established. Specifically, measurement of the ratio of inter- to intra-chromosomal aberrations produces a unique signature from alpha-particles or neutrons.

  9. HotSpot Health Physics Codes

    SciTech Connect

    Homann, S. G.

    2013-04-18

    The HotSpot Health Physics Codes were created to provide emergency response personnel and emergency planners with a fast, field-portable set of software tools for evaluating incidents involving radioactive material. The software is also used for safety analysis of facilities handling nuclear material. HotSpot provides a fast and usually conservative means of estimating the radiation effects associated with the short-term (less than 24 hours) atmospheric release of radioactive materials.
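
    The kind of fast, conservative screening calculation a tool like HotSpot is built around is a Gaussian plume dispersion model. The textbook form below is illustrative only: the parameter values are made up, and the expression does not reproduce HotSpot's actual dispersion coefficients or dose conversion:

```python
import math

def gaussian_plume(q, u, y, z, sigma_y, sigma_z, h=0.0):
    """Textbook Gaussian plume: time-integrated concentration per unit
    release q (source term) at crosswind distance y and height z, for wind
    speed u and dispersion widths sigma_y, sigma_z; the image term models
    reflection from the ground."""
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    vertical = (math.exp(-(z - h)**2 / (2 * sigma_z**2)) +
                math.exp(-(z + h)**2 / (2 * sigma_z**2)))   # ground reflection
    return q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Concentration is highest on the plume centerline and falls off crosswind.
on_axis = gaussian_plume(q=1.0, u=5.0, y=0.0, z=0.0, sigma_y=8.0, sigma_z=4.0)
off_axis = gaussian_plume(q=1.0, u=5.0, y=20.0, z=0.0, sigma_y=8.0, sigma_z=4.0)
assert on_axis > off_axis > 0
```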

  10. Pelvic radiation - discharge

    MedlinePlus

    Radiation of the pelvis - discharge; Cancer treatment - pelvic radiation; Prostate cancer - pelvic radiation; Ovarian cancer - pelvic radiation; Cervical cancer - pelvic radiation; Uterine cancer - pelvic radiation; Rectal cancer - ...

  11. The EGS5 Code System

    SciTech Connect

    Hirayama, Hideo; Namito, Yoshihito; Bielajew, Alex F.; Wilderman, Scott J.; Nelson, Walter R.

    2005-12-20

    In the nineteen years since EGS4 was released, it has been used in a wide variety of applications, particularly in medical physics, radiation measurement studies, and industrial development. Every new user and every new application bring new challenges for Monte Carlo code designers, and code refinements and bug fixes eventually result in a code that becomes difficult to maintain. Several of the code modifications represented significant advances in electron and photon transport physics, and required a more substantial intervention than code patching. Moreover, the arcane MORTRAN3[48] computer language of EGS4 was highest on the complaint list of the users of EGS4. The size of the EGS4 user base is difficult to measure, as there never existed a formal user registration process. However, some idea of the numbers may be gleaned from the number of EGS4 manuals that were produced and distributed at SLAC: almost three thousand. Consequently, the EGS5 project was undertaken. It was decided to employ the FORTRAN 77 compiler, yet include, as much as possible, the structural beauty and power of MORTRAN3. This report consists of four chapters and several appendices. Chapter 1 is an introduction to EGS5 and to this report in general. We suggest that you read it. Chapter 2 is a major update of similar chapters in the old EGS4 report[126] (SLAC-265) and the old EGS3 report[61] (SLAC-210), in which all the details of the old physics (i.e., models which were carried over from EGS4) and the new physics are gathered together. The descriptions of the new physics are extensive, and not for the faint of heart. Detailed knowledge of the contents of Chapter 2 is not essential in order to use EGS, but sophisticated users should be aware of its contents. In particular, details of the restrictions on the range of applicability of EGS are dispersed throughout the chapter. First-time users of EGS should skip Chapter 2 and come back to it later if necessary.

  12. Radiation Transport in Dynamic Spacetimes

    NASA Astrophysics Data System (ADS)

    Schnittman, Jeremy; Baker, John; Etienne, Zachariah; Giacomazzo, Bruno; Kelly, Bernard

    2017-01-01

    We present early results from a new radiation transport calculation of gas accretion onto merging binary black holes. We use the Monte Carlo radiation transport code Pandurata, now generalized for application to dynamic spacetimes. The time variability of the metric requires careful numerical techniques for solving the geodesic equation, particularly with tabulated spacetime data from numerical relativity codes. Using a new series of general relativistic magneto-hydrodynamical simulations of magnetized flow onto binary black holes, we investigate the possibility for detecting and identifying unique electromagnetic counterparts to gravitational wave events.

  13. Charged and neutral particle transport methods and applications: The CALOR code system

    SciTech Connect

    Gabriel, T.A.; Charlton, L.A.

    1997-04-01

    The CALOR code system, which is a complete radiation transport code system, is described with emphasis on the high-energy (> 20 MeV) nuclear collision models. Codes similar to CALOR are also briefly discussed. A current application using CALOR which deals with the development of the National Spallation Neutron Source is also given.

  14. Coded-aperture imaging in nuclear medicine

    NASA Technical Reports Server (NTRS)

    Smith, Warren E.; Barrett, Harrison H.; Aarsvold, John N.

    1989-01-01

    Coded-aperture imaging is a technique for imaging sources that emit high-energy radiation. This type of imaging involves shadow casting and not reflection or refraction. High-energy sources exist in x ray and gamma-ray astronomy, nuclear reactor fuel-rod imaging, and nuclear medicine. Of these three areas nuclear medicine is perhaps the most challenging because of the limited amount of radiation available and because a three-dimensional source distribution is to be determined. In nuclear medicine a radioactive pharmaceutical is administered to a patient. The pharmaceutical is designed to be taken up by a particular organ of interest, and its distribution provides clinical information about the function of the organ, or the presence of lesions within the organ. This distribution is determined from spatial measurements of the radiation emitted by the radiopharmaceutical. The principles of imaging radiopharmaceutical distributions with coded apertures are reviewed. Included is a discussion of linear shift-variant projection operators and the associated inverse problem. A system developed at the University of Arizona in Tucson consisting of small modular gamma-ray cameras fitted with coded apertures is described.
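
    The shadow-casting and correlation-decoding principle described above can be sketched in one dimension. The random binary mask and circular geometry below are simplifying assumptions for illustration; practical systems use uniformly redundant arrays, whose correlation sidelobes are flat:

```python
import random

N = 31
rng = random.Random(42)
mask = [rng.randint(0, 1) for _ in range(N)]      # 1 = open aperture element

def encode(source):
    """Shadow casting: each source element projects a shifted copy of the
    mask pattern onto the detector (circular geometry for simplicity)."""
    return [sum(source[s] * mask[(d - s) % N] for s in range(N))
            for d in range(N)]

def decode(detector):
    """Correlation decoding: cross-correlate the detector image with the
    mask pattern; a point source reconstructs as a correlation peak."""
    return [sum(detector[d] * mask[(d - s) % N] for d in range(N))
            for s in range(N)]

source = [0.0] * N
source[7] = 1.0                                   # single point source
recon = decode(encode(source))
assert recon.index(max(recon)) == 7               # peak at the true position
```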

  15. Coded-aperture imaging in nuclear medicine

    NASA Astrophysics Data System (ADS)

    Smith, Warren E.; Barrett, Harrison H.; Aarsvold, John N.

    1989-11-01

    Coded-aperture imaging is a technique for imaging sources that emit high-energy radiation. This type of imaging involves shadow casting and not reflection or refraction. High-energy sources exist in x ray and gamma-ray astronomy, nuclear reactor fuel-rod imaging, and nuclear medicine. Of these three areas nuclear medicine is perhaps the most challenging because of the limited amount of radiation available and because a three-dimensional source distribution is to be determined. In nuclear medicine a radioactive pharmaceutical is administered to a patient. The pharmaceutical is designed to be taken up by a particular organ of interest, and its distribution provides clinical information about the function of the organ, or the presence of lesions within the organ. This distribution is determined from spatial measurements of the radiation emitted by the radiopharmaceutical. The principles of imaging radiopharmaceutical distributions with coded apertures are reviewed. Included is a discussion of linear shift-variant projection operators and the associated inverse problem. A system developed at the University of Arizona in Tucson consisting of small modular gamma-ray cameras fitted with coded apertures is described.

  16. Pyramid image codes

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B.

    1990-01-01

    All vision systems, both human and machine, transform the spatial image into a coded representation. Particular codes may be optimized for efficiency or to extract useful image features. Researchers explored image codes based on primary visual cortex in man and other primates. Understanding these codes will advance the art in image coding, autonomous vision, and computational human factors. In cortex, imagery is coded by features that vary in size, orientation, and position. Researchers have devised a mathematical model of this transformation, called the Hexagonal oriented Orthogonal quadrature Pyramid (HOP). In a pyramid code, features are segregated by size into layers, with fewer features in the layers devoted to large features. Pyramid schemes provide scale invariance, and are useful for coarse-to-fine searching and for progressive transmission of images. The HOP Pyramid is novel in three respects: (1) it uses a hexagonal pixel lattice, (2) it uses oriented features, and (3) it accurately models most of the prominent aspects of primary visual cortex. The transform uses seven basic features (kernels), which may be regarded as three oriented edges, three oriented bars, and one non-oriented blob. Application of these kernels to non-overlapping seven-pixel neighborhoods yields six oriented, high-pass pyramid layers, and one low-pass (blob) layer.
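
    The size-segregated layering that the abstract describes can be illustrated with a minimal square-lattice pyramid built from 2x2 block averages. This is only a structural sketch: the actual HOP transform uses a hexagonal lattice and seven oriented kernels, neither of which is reproduced here:

```python
def pyramid_decompose(image, levels):
    """Minimal pyramid: each level produces a low-pass layer (2x2 block
    averages, the 'blob'-like coarse layer) and a detail layer holding
    what the low-pass layer fails to represent."""
    layers = []
    cur = image
    for _ in range(levels):
        h, w = len(cur), len(cur[0])
        low = [[(cur[2*i][2*j] + cur[2*i][2*j+1] +
                 cur[2*i+1][2*j] + cur[2*i+1][2*j+1]) / 4.0
                for j in range(w // 2)] for i in range(h // 2)]
        detail = [[cur[i][j] - low[i // 2][j // 2] for j in range(w)]
                  for i in range(h)]
        layers.append(detail)
        cur = low
    layers.append(cur)        # coarsest low-pass layer
    return layers

img = [[float((i + j) % 4) for j in range(8)] for i in range(8)]
layers = pyramid_decompose(img, 2)
# Layers shrink with scale: 8x8 detail, 4x4 detail, 2x2 low-pass residue.
assert len(layers) == 3 and len(layers[-1]) == 2
```

    Fewer coefficients are devoted to the coarse layers, which is the property that makes pyramid codes useful for coarse-to-fine search and progressive transmission.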

  17. Report number codes

    SciTech Connect

    Nelson, R.N.

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are largely based on the American National Standards Institute standard ANSI Z39.23-1983, Standard Technical Report Number (STRN): Format and Creation. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: the report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report-issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name.

  18. Embedded foveation image coding.

    PubMed

    Wang, Z; Bovik, A C

    2001-01-01

    The human visual system (HVS) is highly space-variant in sampling, coding, processing, and understanding. The spatial resolution of the HVS is highest around the point of fixation (foveation point) and decreases rapidly with increasing eccentricity. By taking advantage of this fact, it is possible to remove considerable high-frequency information redundancy from the peripheral regions and still reconstruct a perceptually good quality image. Great success has been obtained previously by a class of embedded wavelet image coding algorithms, such as the embedded zerotree wavelet (EZW) and the set partitioning in hierarchical trees (SPIHT) algorithms. Embedded wavelet coding not only provides very good compression performance, but also has the property that the bitstream can be truncated at any point and still be decoded to recreate a reasonably good quality image. In this paper, we propose an embedded foveation image coding (EFIC) algorithm, which orders the encoded bitstream to optimize foveated visual quality at arbitrary bit-rates. A foveation-based image quality metric, namely, foveated wavelet image quality index (FWQI), plays an important role in the EFIC system. We also developed a modified SPIHT algorithm to improve the coding efficiency. Experiments show that EFIC integrates foveation filtering with foveated image coding and demonstrates very good coding performance and scalability in terms of foveated image quality measurement.
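
    The foveation step underlying this approach can be sketched as a spatially varying blend between an image and a blurred copy, with the blur weight growing with eccentricity from the fixation point. The half-resolution constant `e2` here is an illustrative value, not FWQI's parameterization.

    ```python
    import numpy as np

    def foveate(img, fix_y, fix_x, e2=8.0):
        """Keep detail near (fix_y, fix_x); increasingly blur the periphery."""
        img = img.astype(float)
        # crude 3x3 box blur with edge padding
        p = np.pad(img, 1, mode='edge')
        blur = sum(p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
                   for dy in range(3) for dx in range(3)) / 9.0
        yy, xx = np.indices(img.shape)
        ecc = np.hypot(yy - fix_y, xx - fix_x)   # eccentricity in pixels
        w = ecc / (ecc + e2)                     # 0 at fixation -> 1 far away
        return (1 - w) * img + w * blur
    ```

    An embedded coder like EFIC achieves the same effect in the wavelet domain by ordering bits so that foveal coefficients are refined first.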

  19. Computer aided radiation analysis for manned spacecraft

    NASA Technical Reports Server (NTRS)

    Appleby, Matthew H.; Griffin, Brand N.; Tanner, Ernest R., II; Pogue, William R.; Golightly, Michael J.

    1991-01-01

    In order to assist in the design of radiation shielding, an analytical tool is presented that can be employed in combination with CAD facilities and NASA transport codes. The nature of radiation in space is described, and the operational requirements for protection are listed as background information for the use of the technique. The method is based on the Boeing radiation exposure model (BREM) for combining NASA radiation transport codes and CAD facilities, and the output is given as contour maps of the radiation-shield distribution so that dangerous areas can be identified. Computational models are used to solve the 1D Boltzmann transport equation and determine the shielding needs for the worst-case scenario. BREM can be employed directly with the radiation computations to assess radiation protection during all phases of design, which saves time and, ultimately, spacecraft weight.

  20. Development of a shuttle plume radiation heating indicator

    NASA Technical Reports Server (NTRS)

    Reardon, John E.

    1988-01-01

    The primary objectives were to develop a Base Heating Indicator Code and a new plume radiation code for the Space Shuttle. Additional work included: revision of the Space Shuttle plume radiation environment for changes in configuration and correction of errors, evaluation of radiation measurements to establish a plume radiation model for the SRB High Performance Motor (HPM) plume, radiation predictions for preliminary designs, and participation in hydrogen disposal analysis and testing for the VAFB Shuttle launch site. The two most significant accomplishments were the development of the Base Heating Indicator Code and the Shuttle Engine Plume Radiation (SEPRAD) Code. The major efforts in revising the current Shuttle plume radiation environment were for the Orbiter base heat shield and the ET components in the Orbiter-ET interface region. The work performed is summarized in the technical discussion section with references to the documents containing detailed results. The technical discussion is followed by a summary of conclusions and recommendations for future work.

  1. The PARTRAC code: Status and recent developments

    NASA Astrophysics Data System (ADS)

    Friedland, Werner; Kundrat, Pavel

    Biophysical modeling is of particular value for predictions of radiation effects due to manned space missions. PARTRAC is an established tool for Monte Carlo-based simulations of radiation track structures, damage induction in cellular DNA and its repair [1]. Dedicated modules describe interactions of ionizing particles with the traversed medium, the production and reactions of reactive species, and score DNA damage determined by overlapping track structures with multi-scale chromatin models. The DNA repair module describes the repair of DNA double-strand breaks (DSB) via the non-homologous end-joining pathway; the code explicitly simulates the spatial mobility of individual DNA ends in parallel with their processing by major repair enzymes [2]. To simulate the yields and kinetics of radiation-induced chromosome aberrations, the repair module has been extended by tracking the information on the chromosome origin of ligated fragments as well as the presence of centromeres [3]. PARTRAC calculations have been benchmarked against experimental data on various biological endpoints induced by photon and ion irradiation. The calculated DNA fragment distributions after photon and ion irradiation reproduce corresponding experimental data and their dose- and LET-dependence. However, in particular for high-LET radiation many short DNA fragments are predicted below the detection limits of the measurements, so that the experiments significantly underestimate DSB yields by high-LET radiation [4]. The DNA repair module correctly describes the LET-dependent repair kinetics after 60Co gamma-rays and different N-ion radiation qualities [2]. First calculations on the induction of chromosome aberrations have overestimated the absolute yields of dicentrics, but correctly reproduced their relative dose-dependence and the difference between gamma- and alpha-particle irradiation [3]. Recent developments of the PARTRAC code include a model of hetero- vs euchromatin structures to enable

  2. Radiation shielding of the main injector

    SciTech Connect

    Bhat, C.M.; Martin, P.S.

    1995-05-01

    The radiation shielding in the Fermilab Main Injector (FMI) complex has been carried out by adopting a number of prescribed stringent guidelines established by a previous safety analysis. Determination of the required amount of radiation shielding at various locations of the FMI has been done using Monte Carlo computations. A three-dimensional ray-tracing code, as well as a code based upon empirical observations, has been employed in certain cases.

  3. Code Disentanglement: Initial Plan

    SciTech Connect

    Wohlbier, John Greaton; Kelley, Timothy M.; Rockefeller, Gabriel M.; Calef, Matthew Thomas

    2015-01-27

    The first step to making more ambitious changes in the EAP code base is to disentangle the code into a set of independent, levelized packages. We define a package as a collection of code, most often across a set of files, that provides a defined set of functionality; a package a) can be built and tested as an entity and b) fits within an overall levelization design. Each package contributes one or more libraries, or an application that uses the other libraries. A package set is levelized if the relationships between packages form a directed, acyclic graph and each package uses only packages at lower levels of the diagram (in Fortran this relationship is often describable by the use relationship between modules). Independent packages permit independent (and therefore parallel) development. The packages form separable units for the purposes of development and testing. This is a proven path for enabling finer-grained changes to a complex code.
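
    The levelization criterion above can be checked mechanically: a package set is levelizable iff its "uses" relationships form a DAG, in which case each package's level is one more than the maximum level of the packages it uses. A minimal sketch (hypothetical package names):

    ```python
    def levelize(uses):
        """uses: dict mapping each package to the set of packages it uses.
        Returns a package -> level map, or raises ValueError on a cycle."""
        levels, visiting = {}, set()

        def level(p):
            if p in levels:
                return levels[p]
            if p in visiting:
                raise ValueError(f"cycle through {p!r}: not levelizable")
            visiting.add(p)
            levels[p] = 1 + max((level(d) for d in uses.get(p, ())), default=0)
            visiting.remove(p)
            return levels[p]

        for p in uses:
            level(p)
        return levels
    ```

    Packages at the same level have no dependencies on each other and can be built and tested in parallel.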

  4. Seals Flow Code Development

    NASA Technical Reports Server (NTRS)

    1991-01-01

    In recognition of a deficiency in the current modeling capability for seals, an effort was established by NASA to develop verified computational fluid dynamic concepts, codes, and analyses for seals. The objectives were to develop advanced concepts for the design and analysis of seals, to effectively disseminate the information to potential users by way of annual workshops, and to provide experimental verification for the models and codes under a wide range of operating conditions.

  5. Modular optimization code package: MOZAIK

    NASA Astrophysics Data System (ADS)

    Bekar, Kursat B.

    This dissertation addresses the development of a modular optimization code package, MOZAIK, for geometric shape optimization problems in nuclear engineering applications. MOZAIK's first mission, determining the optimal shape of the D2O moderator tank for the current and new beam tube configurations for the Penn State Breazeale Reactor's (PSBR) beam port facility, is used to demonstrate its capabilities and test its performance. MOZAIK was designed as a modular optimization sequence including three primary independent modules: the initializer, the physics and the optimizer, each having a specific task. By using fixed interface blocks among the modules, the code attains its two most important characteristics: generic form and modularity. The benefit of this modular structure is that the contents of the modules can be switched depending on the requirements of accuracy, computational efficiency, or compatibility with the other modules. Oak Ridge National Laboratory's discrete ordinates transport code TORT was selected as the transport solver in the physics module of MOZAIK, and two different optimizers, Min-max and Genetic Algorithms (GA), were implemented in the optimizer module of the code package. A distributed memory parallelism was also applied to MOZAIK via MPI (Message Passing Interface) to execute the physics module concurrently on a number of processors for various states in the same search. Moreover, dynamic scheduling was enabled to enhance load balance among the processors while running MOZAIK's physics module thus improving the parallel speedup and efficiency. In this way, the total computation time consumed by the physics module is reduced by a factor close to M, where M is the number of processors. This capability also encourages the use of MOZAIK for shape optimization problems in nuclear applications because many traditional codes related to radiation transport do not have parallel execution capability. A set of computational models based on the
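
    The dynamic-scheduling idea described here (hand each pending state to the next free processor rather than pre-assigning blocks) can be sketched without MPI; the thread pool below is a stand-in for MOZAIK's MPI workers, and `evaluate_state` is a hypothetical placeholder for one transport solve.

    ```python
    from concurrent.futures import ThreadPoolExecutor

    def evaluate_state(state):
        """Placeholder for one physics-module (transport) evaluation."""
        return sum(x * x for x in state)     # dummy objective for one shape

    def evaluate_generation(states, workers=4):
        """Evaluate all candidate states, dynamically scheduled over workers."""
        with ThreadPoolExecutor(max_workers=workers) as pool:
            # map() hands states to workers as they free up, preserving order
            return list(pool.map(evaluate_state, states))
    ```

    When solve times vary between states, this keeps all workers busy, which is the load-balancing benefit the text attributes to dynamic scheduling.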

  6. Robust Nonlinear Neural Codes

    NASA Astrophysics Data System (ADS)

    Yang, Qianli; Pitkow, Xaq

    2015-03-01

    Most interesting natural sensory stimuli are encoded in the brain in a form that can only be decoded nonlinearly. But despite being a core function of the brain, nonlinear population codes are rarely studied and poorly understood. Interestingly, the few existing models of nonlinear codes are inconsistent with known architectural features of the brain. In particular, these codes have information content that scales with the size of the cortical population, even if that violates the data processing inequality by exceeding the amount of information entering the sensory system. Here we provide a valid theory of nonlinear population codes by generalizing recent work on information-limiting correlations in linear population codes. Although these generalized, nonlinear information-limiting correlations bound the performance of any decoder, they also make decoding more robust to suboptimal computation, allowing many suboptimal decoders to achieve nearly the same efficiency as an optimal decoder. Although these correlations are extremely difficult to measure directly, particularly for nonlinear codes, we provide a simple, practical test by which one can use choice-related activity in small populations of neurons to determine whether decoding is suboptimal or optimal and limited by correlated noise. We conclude by describing an example computation in the vestibular system where this theory applies. QY and XP were supported by a grant from the McNair foundation.
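
    The information-limiting idea the abstract builds on can be shown numerically in the simpler linear case (the paper's contribution is the nonlinear generalization): with noise covariance C = I + eps·f'f'ᵀ, the linear Fisher information J = f'ᵀC⁻¹f' saturates at 1/eps instead of growing with population size.

    ```python
    import numpy as np

    def fisher_info(n, eps):
        """Linear Fisher information for n neurons with differential
        (information-limiting) correlations of strength eps."""
        fp = np.ones(n)                       # tuning-curve derivatives
        cov = np.eye(n) + eps * np.outer(fp, fp)
        return fp @ np.linalg.solve(cov, fp)
    ```

    By the Sherman-Morrison identity this equals n/(1 + eps·n), so the information is bounded by 1/eps however large the population, resolving the data-processing-inequality problem noted above.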

  7. Radiation Transport Tools for Space Applications: A Review

    NASA Technical Reports Server (NTRS)

    Jun, Insoo; Evans, Robin; Cherng, Michael; Kang, Shawn

    2008-01-01

    This slide presentation contains a brief discussion of nuclear transport codes widely used in the space radiation community for shielding and scientific analyses. Seven radiation transport codes are addressed. The two general methods (i.e., the Monte Carlo method and the deterministic method) are briefly reviewed.

  8. Space Radiation

    NASA Technical Reports Server (NTRS)

    Wu, Honglu

    2006-01-01

    Astronauts receive the highest occupational radiation exposure. Effective protections are needed to ensure the safety of astronauts on long duration space missions. Increased cancer morbidity or mortality risk in astronauts may be caused by occupational radiation exposure. Acute and late radiation damage to the central nervous system (CNS) may lead to changes in motor function and behavior, or neurological disorders. Radiation exposure may result in degenerative tissue diseases (non-cancer or non-CNS) such as cardiac, circulatory, or digestive diseases, as well as cataracts. Acute radiation syndromes may occur due to occupational radiation exposure.

  9. Code CUGEL: A code to unfold Ge(Li) spectrometer polyenergetic gamma photon experimental distributions

    NASA Technical Reports Server (NTRS)

    Steyn, J. J.; Born, U.

    1970-01-01

    A FORTRAN code was developed for the Univac 1108 digital computer to unfold polyenergetic gamma-photon experimental distributions from lithium-drifted germanium semiconductor spectrometers. It was designed to analyze the combination continuous and monoenergetic gamma radiation field of radioisotope volumetric sources. The code generates the detector system response matrix function and applies it to monoenergetic spectral components discretely and to the continuum iteratively. It corrects for system drift, source decay, background, and detection efficiency. Results are presented in digital form for differential and integrated photon number and energy distributions, and for exposure dose.
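
    The core unfolding problem can be stated in miniature: the measured pulse-height spectrum m is the detector response matrix R applied to the true photon spectrum s (m = R s), and with R known, s is recovered by inversion. The triangular R below, mimicking full-energy peaks plus continua spilling into lower channels, is purely illustrative, as is the direct solve (CUGEL applies the response discretely to peaks and iteratively to the continuum).

    ```python
    import numpy as np

    n = 6
    R = np.zeros((n, n))
    for j in range(n):
        R[j, j] = 0.6                   # full-energy peak efficiency
        R[:j, j] = 0.4 / max(j, 1)      # continuum spread into lower channels

    true_s = np.array([0., 5., 0., 20., 3., 10.])
    measured = R @ true_s               # what the spectrometer records
    unfolded = np.linalg.solve(R, measured)
    ```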

  10. The fast non-LTE code DEDALE

    NASA Astrophysics Data System (ADS)

    Gilleron, Franck; Piron, Robin

    2015-12-01

    We present Dédale, a fast code implementing a simplified non-local-thermodynamic-equilibrium (NLTE) plasma model. In this approach, the stationary collisional-radiative rate equations are solved for a set of well-chosen Layzer complexes in order to determine the ion state populations. The electronic structure is approximated using the screened hydrogenic model (SHM) of More with relativistic corrections. The radiative and collisional cross-sections are based on the Kramers and Van Regemorter formulae, respectively, which are extrapolated to derive analytical expressions for all the rates. The latter are improved thereafter using Gaunt factors or more accurate tabulated data. Special care is taken for dielectronic rates, which are compared and rescaled with quantum calculations from the Averroès code. The emissivity and opacity spectra are calculated under the same assumptions as for the radiative rates, either in a detailed manner by summing the transitions between each pair of complexes, or in a coarser statistical way by summing the one-electron transitions averaged over the complexes. Optionally, nℓ-splitting can be accounted for using a WKB approach in an approximate potential reconstructed analytically from the screened charges. It is also possible to improve the spectra by replacing some transition arrays with more accurate data tabulated using the SCO-RCG or FAC codes. This latter option is particularly useful for K-shell emission spectroscopy. The Dédale code was used to submit neon and tungsten cases in the last NLTE-8 workshop (Santa Fe, November 4-8, 2013). Some of these results are presented, as well as comparisons with Averroès calculations.

  11. 76 FR 4258 - Occupational Radiation Protection; Revision

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-25

    ... Part 835 RIN 1901-AA-95 Occupational Radiation Protection; Revision AGENCY: Department of Energy...) proposes to revise the values in an appendix to its Occupational Radiation Protection requirements. The... requirements in title 10, Code of Federal Regulations, part 835 (10 CFR part 835), Occupational...

  12. Coded source neutron imaging

    SciTech Connect

    Bingham, Philip R; Santos-Villalobos, Hector J

    2011-01-01

    Coded aperture techniques have been applied to neutron radiography to address limitations in neutron flux and resolution of neutron detectors in a system labeled coded source imaging (CSI). By coding the neutron source, a magnified imaging system is designed with small spot size aperture holes (10 and 100 μm) for improved resolution beyond the detector limits and with many holes in the aperture (50% open) to account for flux losses due to the small pinhole size. An introduction to neutron radiography and coded aperture imaging is presented. A system design is developed for a CSI system with a development of equations for limitations on the system based on the coded image requirements and the neutron source characteristics of size and divergence. Simulation has been applied to the design using McStas to provide qualitative measures of performance with simulations of pinhole array objects followed by a quantitative measure through simulation of a tilted edge and calculation of the modulation transfer function (MTF) from the line spread function. MTF results for both 100 μm and 10 μm aperture hole diameters show resolutions matching the hole diameters.
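
    The MTF-from-line-spread-function step mentioned here is standard: the MTF is the magnitude of the Fourier transform of the LSF, normalized to unity at zero frequency. A Gaussian LSF is used below purely as a stand-in for the simulated tilted-edge result.

    ```python
    import numpy as np

    x = np.arange(-32, 32)
    lsf = np.exp(-0.5 * (x / 3.0) ** 2)   # illustrative line spread function
    mtf = np.abs(np.fft.rfft(lsf))        # |FT| of the LSF
    mtf /= mtf[0]                         # normalize: MTF(0) = 1
    ```

    A narrower LSF (sharper system) gives an MTF that stays high out to larger spatial frequencies, which is how the hole-diameter-limited resolutions are read off.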

  13. Error coding simulations

    NASA Technical Reports Server (NTRS)

    Noble, Viveca K.

    1993-01-01

    There are various elements such as radio frequency interference (RFI) which may induce errors in data being transmitted via a satellite communication link. When a transmission is affected by interference or other error-causing elements, the transmitted data becomes indecipherable. It becomes necessary to implement techniques to recover from these disturbances. The objective of this research is to develop software which simulates error control circuits and evaluate the performance of these modules in various bit error rate environments. The results of the evaluation provide the engineer with information which helps determine the optimal error control scheme. The Consultative Committee for Space Data Systems (CCSDS) recommends the use of Reed-Solomon (RS) and convolutional encoders and Viterbi and RS decoders for error correction. The use of forward error correction techniques greatly reduces the received signal-to-noise ratio needed for a certain desired bit error rate. The use of concatenated coding, e.g. inner convolutional code and outer RS code, provides even greater coding gain. The 16-bit cyclic redundancy check (CRC) code is recommended by CCSDS for error detection.
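
    The 16-bit CRC the CCSDS recommends for error detection is the CCITT polynomial x¹⁶ + x¹² + x⁵ + 1 with an all-ones initial value; a straightforward bitwise sketch:

    ```python
    def crc16_ccitt(data: bytes, crc: int = 0xFFFF) -> int:
        """CRC-16/CCITT-FALSE: poly 0x1021, init 0xFFFF, MSB-first."""
        for byte in data:
            crc ^= byte << 8
            for _ in range(8):
                crc = ((crc << 1) ^ 0x1021 if crc & 0x8000
                       else crc << 1) & 0xFFFF
        return crc
    ```

    The standard check value for the ASCII string "123456789" is 0x29B1, which makes a convenient self-test; production implementations normally use a 256-entry lookup table instead of the bit loop.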

  14. Coded source neutron imaging

    NASA Astrophysics Data System (ADS)

    Bingham, Philip; Santos-Villalobos, Hector; Tobin, Ken

    2011-03-01

    Coded aperture techniques have been applied to neutron radiography to address limitations in neutron flux and resolution of neutron detectors in a system labeled coded source imaging (CSI). By coding the neutron source, a magnified imaging system is designed with small spot size aperture holes (10 and 100μm) for improved resolution beyond the detector limits and with many holes in the aperture (50% open) to account for flux losses due to the small pinhole size. An introduction to neutron radiography and coded aperture imaging is presented. A system design is developed for a CSI system with a development of equations for limitations on the system based on the coded image requirements and the neutron source characteristics of size and divergence. Simulation has been applied to the design using McStas to provide qualitative measures of performance with simulations of pinhole array objects followed by a quantitative measure through simulation of a tilted edge and calculation of the modulation transfer function (MTF) from the line spread function. MTF results for both 100μm and 10μm aperture hole diameters show resolutions matching the hole diameters.

  15. Full 3D visualization tool-kit for Monte Carlo and deterministic transport codes

    SciTech Connect

    Frambati, S.; Frignani, M.

    2012-07-01

    We propose a package of tools capable of translating the geometric inputs and outputs of many Monte Carlo and deterministic radiation transport codes into open source file formats. These tools are aimed at bridging the gap between trusted, widely-used radiation analysis codes and very powerful, more recent and commonly used visualization software, thus supporting the design process and helping with shielding optimization. Three main lines of development were followed: mesh-based analysis of Monte Carlo codes, mesh-based analysis of deterministic codes and Monte Carlo surface meshing. The developed kit is considered a powerful and cost-effective computer-aided design tool for radiation transport code users in the nuclear world, particularly in the fields of core design and radiation analysis. (authors)

  16. HOTSPOT Health Physics codes for the PC

    SciTech Connect

    Homann, S.G.

    1994-03-01

    The HOTSPOT Health Physics codes were created to provide Health Physics personnel with a fast, field-portable calculation tool for evaluating accidents involving radioactive materials. HOTSPOT codes are a first-order approximation of the radiation effects associated with the atmospheric release of radioactive materials. HOTSPOT programs are reasonably accurate for a timely initial assessment. More importantly, HOTSPOT codes produce a consistent output for the same input assumptions and minimize the probability of errors associated with reading a graph incorrectly or scaling a universal nomogram during an emergency. The HOTSPOT codes are designed for short-term (less than 24 hours) release durations. Users requiring radiological release consequences for release scenarios over a longer time period, e.g., annual windrose data, are directed to such long-term models as CAPP88-PC (Parks, 1992). Users requiring more sophisticated modeling capabilities, e.g., complex terrain; multi-location real-time wind field data; etc., are directed to such capabilities as the Department of Energy's ARAC computer codes (Sullivan, 1993). Four general programs -- Plume, Explosion, Fire, and Resuspension -- calculate a downwind assessment following the release of radioactive material resulting from a continuous or puff release, explosive release, fuel fire, or an area contamination event. Other programs deal with the release of plutonium, uranium, and tritium to expedite an initial assessment of accidents involving nuclear weapons. Additional programs estimate the dose commitment from the inhalation of any one of the radionuclides listed in the database of radionuclides; calibrate a radiation survey instrument for ground-survey measurements; and screen plutonium uptake in the lung (see FIDLER Calibration and LUNG Screening sections).
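
    First-order downwind assessments of this kind are built around the textbook Gaussian plume model; the sketch below is that generic form (ground reflection included), not HOTSPOT's exact parameterization, and the dispersion parameters are illustrative rather than stability-class values.

    ```python
    import math

    def plume_conc(Q, u, sigma_y, sigma_z, y, z, h):
        """Gaussian plume air concentration at crosswind offset y and
        height z, for source term Q, wind speed u, release height h."""
        lateral = math.exp(-y**2 / (2 * sigma_y**2))
        vertical = (math.exp(-(z - h)**2 / (2 * sigma_z**2)) +
                    math.exp(-(z + h)**2 / (2 * sigma_z**2)))  # ground reflection
        return Q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical
    ```

    In a full model, sigma_y and sigma_z grow with downwind distance according to the atmospheric stability class, which is what makes concentrations fall off downwind.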

  17. Importance biasing scheme implemented in the PRIZMA code

    SciTech Connect

    Kandiev, I.Z.; Malyshkin, G.N.

    1997-12-31

    PRIZMA code is intended for Monte Carlo calculations of linear radiation transport problems. The code has wide capabilities to describe geometry, sources, and material composition, and to obtain parameters specified by the user. There is a capability to calculate the path of a particle cascade (including neutrons, photons, electrons, positrons and heavy charged particles) taking into account possible transmutations. An importance biasing scheme was implemented to solve problems which require calculation of functionals related to small probabilities (for example, problems of protection against radiation, problems of detection, etc.). The scheme enables the trajectory-building algorithm to be adapted to the peculiarities of the problem.
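
    The importance-biasing mechanism can be sketched in its simplest form (not PRIZMA's actual scheme): on entering a region whose importance ratio is r, a particle of weight w is split into r copies of weight w/r when r > 1, or played Russian roulette with survival probability r when r < 1. Either way the expected total weight, and hence the tally mean, is preserved.

    ```python
    import random

    def adjust_population(weight, ratio, rng=random):
        """Split (ratio > 1, assumed integer here for clarity) or
        roulette (ratio < 1) a particle; expected weight is conserved."""
        if ratio >= 1:
            n = int(ratio)
            return [weight / n] * n          # split: exact weight conservation
        if rng.random() < ratio:
            return [weight / ratio]          # survivor carries boosted weight
        return []                            # killed by roulette
    ```

    Splitting pushes more samples into the important (e.g., detector-adjacent) regions, while roulette stops wasting time on particles that can barely contribute, which is how rare-event functionals become tractable.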

  18. Code query by example

    NASA Astrophysics Data System (ADS)

    Vaucouleur, Sebastien

    2011-02-01

    We introduce code query by example for customisation of evolvable software products in general and of enterprise resource planning systems (ERPs) in particular. The concept is based on an initial empirical study on practices around ERP systems. We motivate our design choices based on those empirical results, and we show how the proposed solution helps with respect to the infamous upgrade problem: the conflict between the need for customisation and the need for upgrade of ERP systems. We further show how code query by example can be used as a form of lightweight static analysis, to detect automatically potential defects in large software products. Code query by example as a form of lightweight static analysis is particularly interesting in the context of ERP systems: it is often the case that programmers working in this field are not computer science specialists but more of domain experts. Hence, they require a simple language to express custom rules.

  19. Seals Code Development Workshop

    NASA Technical Reports Server (NTRS)

    Hendricks, Robert C. (Compiler); Liang, Anita D. (Compiler)

    1996-01-01

    The 1995 Seals Workshop industrial code (INDSEAL) release includes ICYL, GCYLT, IFACE, GFACE, SPIRALG, SPIRALI, DYSEAL, and KTK. The scientific code (SCISEAL) release includes conjugate heat transfer and multidomain with rotordynamic capability. Several seals and bearings codes (e.g., HYDROFLEX, HYDROTRAN, HYDROB3D, FLOWCON1, FLOWCON2) are presented and results compared. Current computational and experimental emphasis includes multiple connected cavity flows with goals of reducing parasitic losses and gas ingestion. Labyrinth seals continue to play a significant role in sealing with face, honeycomb, and new sealing concepts under investigation for advanced engine concepts in view of strict environmental constraints. The clean sheet approach to engine design is advocated with program directions and anticipated percentage SFC reductions cited. Future activities center on engine applications with coupled seal/power/secondary flow streams.

  20. SAC: Sheffield Advanced Code

    NASA Astrophysics Data System (ADS)

    Griffiths, Mike; Fedun, Viktor; Mumford, Stuart; Gent, Frederick

    2013-06-01

    The Sheffield Advanced Code (SAC) is a fully non-linear MHD code designed for simulations of linear and non-linear wave propagation in gravitationally strongly stratified magnetized plasma. It was developed primarily for the forward modelling of helioseismological processes and for the coupling processes in the solar interior, photosphere, and corona; it is built on the well-known VAC platform that allows robust simulation of the macroscopic processes in gravitationally stratified (non-)magnetized plasmas. The code has no limitations of simulation length in time imposed by complications originating from the upper boundary, nor does it require implementation of special procedures to treat the upper boundaries. SAC inherited its modular structure from VAC, thereby allowing modification to easily add new physics.

  1. Autocatalysis, information and coding.

    PubMed

    Wills, P R

    2001-01-01

    Autocatalytic self-construction in macromolecular systems requires the existence of a reflexive relationship between structural components and the functional operations they perform to synthesise themselves. The possibility of reflexivity depends on formal, semiotic features of the catalytic structure-function relationship, that is, the embedding of catalytic functions in the space of polymeric structures. Reflexivity is a semiotic property of some genetic sequences. Such sequences may serve as the basis for the evolution of coding as a result of autocatalytic self-organisation in a population of assignment catalysts. Autocatalytic selection is a mechanism whereby matter becomes differentiated in primitive biochemical systems. In the case of coding self-organisation, it corresponds to the creation of symbolic information. Prions are present-day entities whose replication through autocatalysis reflects aspects of biological semiotics less obvious than genetic coding.

  2. Code inspection instructional validation

    NASA Technical Reports Server (NTRS)

    Orr, Kay; Stancil, Shirley

    1992-01-01

    The Shuttle Data Systems Branch (SDSB) of the Flight Data Systems Division (FDSD) at Johnson Space Center contracted with Southwest Research Institute (SwRI) to validate the effectiveness of an interactive video course on the code inspection process. The purpose of this project was to determine if this course could be effective for teaching NASA analysts the process of code inspection. In addition, NASA was interested in the effectiveness of this unique type of instruction (Digital Video Interactive), for providing training on software processes. This study found the Carnegie Mellon course, 'A Cure for the Common Code', effective for teaching the process of code inspection. In addition, analysts prefer learning with this method of instruction, or this method in combination with other methods. As is, the course is definitely better than no course at all; however, findings indicate changes are needed. Following are conclusions of this study. (1) The course is instructionally effective. (2) The simulation has a positive effect on student's confidence in his ability to apply new knowledge. (3) Analysts like the course and prefer this method of training, or this method in combination with current methods of training in code inspection, over the way training is currently being conducted. (4) Analysts responded favorably to information presented through scenarios incorporating full motion video. (5) Some course content needs to be changed. (6) Some content needs to be added to the course. SwRI believes this study indicates interactive video instruction combined with simulation is effective for teaching software processes. Based on the conclusions of this study, SwRI has outlined seven options for NASA to consider. SwRI recommends the option which involves creation of new source code and data files, but uses much of the existing content and design from the current course. Although this option involves a significant software development effort, SwRI believes this option

  3. Polar Code Validation

    DTIC Science & Technology

    1989-09-30

    Approved for public release. Contents include: Summary of POLAR Achievements; POLAR Code Physical Models; Structure of the Bipolar Plasma Sheath Generated by SPEAR I; and The POLAR Code Wake Model: Comparison with In Situ Observations.

  4. Aeroacoustic Prediction Codes

    NASA Technical Reports Server (NTRS)

    Gliebe, P; Mani, R.; Shin, H.; Mitchell, B.; Ashford, G.; Salamah, S.; Connell, S.; Huff, Dennis (Technical Monitor)

    2000-01-01

    This report describes work performed on Contract NAS3-27720AoI 13 as part of the NASA Advanced Subsonic Transport (AST) Noise Reduction Technology effort. Computer codes were developed to provide quantitative prediction, design, and analysis capability for several aircraft engine noise sources. The objective was to provide improved, physics-based tools for exploration of noise-reduction concepts and understanding of experimental results. Methods and codes focused on fan broadband and 'buzz saw' noise and on low-emissions combustor noise, and complement work done by other contractors under the NASA AST program to develop methods and codes for fan harmonic tone noise and jet noise. The methods and codes developed and reported herein employ a wide range of approaches, from the strictly empirical to the completely computational, with some being semi-empirical, analytical, and/or analytical/computational. Emphasis was on capturing the essential physics while still considering method or code utility as a practical design and analysis tool for everyday engineering use. Codes and prediction models were developed for: (1) an improved empirical correlation model for fan rotor exit flow mean and turbulence properties, for use in predicting broadband noise generated by rotor exit flow turbulence interaction with downstream stator vanes; (2) fan broadband noise models for rotor and stator/turbulence interaction sources including 3D effects, noncompact-source effects, directivity modeling, and extensions to the rotor supersonic tip-speed regime; (3) fan multiple-pure-tone in-duct sound pressure prediction methodology based on computational fluid dynamics (CFD) analysis; and (4) low-emissions combustor prediction methodology and computer code based on CFD and actuator disk theory. In addition, the relative importance of dipole and quadrupole source mechanisms was studied using direct CFD source computation for a simple cascade/gust interaction problem, and an empirical combustor

  5. Securing mobile code.

    SciTech Connect

    Link, Hamilton E.; Schroeppel, Richard Crabtree; Neumann, William Douglas; Campbell, Philip LaRoche; Beaver, Cheryl Lynn; Pierson, Lyndon George; Anderson, William Erik

    2004-10-01

    If software is designed so that the software can issue functions that will move that software from one computing platform to another, then the software is said to be 'mobile'. There are two general areas of security problems associated with mobile code. The 'secure host' problem involves protecting the host from malicious mobile code. The 'secure mobile code' problem, on the other hand, involves protecting the code from malicious hosts. This report focuses on the latter problem. We have found three distinct camps of opinions regarding how to secure mobile code. There are those who believe special distributed hardware is necessary, those who believe special distributed software is necessary, and those who believe neither is necessary. We examine all three camps, with a focus on the third. In the distributed software camp we examine some commonly proposed techniques including Java, D'Agents and Flask. For the specialized hardware camp, we propose a cryptographic technique for 'tamper-proofing' code over a large portion of the software/hardware life cycle by careful modification of current architectures. This method culminates by decrypting/authenticating each instruction within a physically protected CPU, thereby protecting against subversion by malicious code. Our main focus is on the camp that believes that neither specialized software nor hardware is necessary. We concentrate on methods of code obfuscation to render an entire program or a data segment on which a program depends incomprehensible. The hope is to prevent or at least slow down reverse engineering efforts and to prevent goal-oriented attacks on the software and execution. The field of obfuscation is still in a state of development with the central problem being the lack of a basis for evaluating the protection schemes. We give a brief introduction to some of the main ideas in the field, followed by an in depth analysis of a technique called 'white-boxing'. 
We put forth some new attacks and improvements

  6. Radiofrequency Radiation Dosimetry Handbook. 4th Edition

    DTIC Science & Technology

    1986-10-01

    Since near-field radiation fields vary so much from one radiation source to another, near-field dosimetric data for specific sources could not be given; only near-field SAR data for simple illustrative radiation fields were presented. The purpose of this fourth

  7. Radiation Proctopathy

    PubMed Central

    Grodsky, Marc B.; Sidani, Shafik M.

    2015-01-01

    Radiation therapy is a widely utilized treatment modality for pelvic malignancies, including prostate cancer, rectal cancer, and cervical cancer. Given its fixed position in the pelvis, the rectum is at a high risk for injury secondary to ionizing radiation. Despite advances made in radiation science, up to 75% of the patients will suffer from acute radiation proctitis and up to 20% may experience chronic symptoms. Symptoms can be variable and include diarrhea, bleeding, incontinence, and fistulization. A multitude of treatment options exist. This article summarizes the latest knowledge relating to radiation proctopathy focusing on the vast array of treatment options. PMID:26034407

  8. Updates to the NEQAIR Radiation Solver

    NASA Technical Reports Server (NTRS)

    Cruden, Brett A.; Brandis, Aaron M.

    2014-01-01

    The NEQAIR code is one of the original heritage solvers for radiative heating prediction in aerothermal environments, and is still used today for mission design purposes. This paper discusses the implementation of the first major revision to the NEQAIR code in the last five years, NEQAIR v14.0. The most notable features of NEQAIR v14.0 are the parallelization of the radiation computation, reducing runtimes by about 30×, and the inclusion of mid-wave CO2 infrared radiation.

  9. Radiation in Particle Simulations

    SciTech Connect

    More, R; Graziani, F; Glosli, J; Surh, M

    2010-11-19

    Hot dense radiative (HDR) plasmas common to Inertial Confinement Fusion (ICF) and stellar interiors have high temperature (a few hundred eV to tens of keV), high density (tens to hundreds of g/cc) and high pressure (hundreds of megabars to thousands of gigabars). Typically, such plasmas undergo collisional, radiative, atomic and possibly thermonuclear processes. In order to describe HDR plasmas, computational physicists in ICF and astrophysics use atomic-scale microphysical models implemented in various simulation codes. Experimental validation of the models used to describe HDR plasmas is difficult to perform. Direct Numerical Simulation (DNS) of the many-body interactions of plasmas is a promising approach to model validation, but previous work either relies on the collisionless approximation or ignores radiation. We present four methods that attempt a new numerical simulation technique to address a currently unsolved problem: the extension of molecular dynamics to collisional plasmas including emission and absorption of radiation. The first method applies the Liénard-Wiechert solution of Maxwell's equations for a classical particle whose motion is assumed to be known. The second method expands the electromagnetic field in normal modes (plane waves in a box with periodic boundary conditions) and solves the equation for wave amplitudes coupled to the particle motion. The third method is a hybrid molecular dynamics/Monte Carlo (MD/MC) method which calculates radiation emitted or absorbed by electron-ion pairs during close collisions. The fourth method is a generalization of the third method to include small clusters of particles emitting radiation during close encounters: one electron simultaneously hitting two ions, two electrons simultaneously hitting one ion, etc. This approach is inspired by the virial expansion method of equilibrium statistical mechanics.
Using a combination of these methods we believe it is possible to do atomic-scale particle simulations of
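The first method, which computes the field radiated by a classical particle whose motion is assumed known, reduces in the non-relativistic limit to the Larmor formula for radiated power. A minimal sketch of that underlying physics (illustrative only; not the Liénard-Wiechert solver described in the abstract, and the example acceleration is an assumed order-of-magnitude value):

```python
import math

# Physical constants (SI units)
E_CHARGE = 1.602176634e-19   # elementary charge, C
EPS0 = 8.8541878128e-12      # vacuum permittivity, F/m
C_LIGHT = 299792458.0        # speed of light, m/s

def larmor_power(q, a):
    """Non-relativistic Larmor formula: instantaneous power radiated
    by a point charge q (C) undergoing acceleration a (m/s^2)."""
    return q**2 * a**2 / (6.0 * math.pi * EPS0 * C_LIGHT**3)

# Example: an electron accelerating at 1e23 m/s^2, a magnitude chosen
# here as representative of a close electron-ion collision
P = larmor_power(E_CHARGE, 1e23)
```

A molecular-dynamics step could sum this quantity over particles to estimate the instantaneous radiated power of the whole plasma.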

  10. Coding for urologic office procedures.

    PubMed

    Dowling, Robert A; Painter, Mark

    2013-11-01

    This article summarizes current best practices for documenting, coding, and billing common office-based urologic procedures. Topics covered include general principles, basic and advanced urologic coding, creation of medical records that support compliant coding practices, bundled codes and unbundling, global periods, modifiers for procedure codes, when to bill for evaluation and management services during the same visit, coding for supplies, and laboratory and radiology procedures pertinent to urology practice. Detailed information is included for the most common urology office procedures, and suggested resources and references are provided. This information is of value to physicians, office managers, and their coding staff.

  11. Accumulate Repeat Accumulate Coded Modulation

    NASA Technical Reports Server (NTRS)

    Abbasfar, Aliazam; Divsalar, Dariush; Yao, Kung

    2004-01-01

    In this paper we propose an innovative coded modulation scheme called 'Accumulate Repeat Accumulate Coded Modulation' (ARA coded modulation). This class of codes can be viewed as serial turbo-like codes, or as a subclass of Low Density Parity Check (LDPC) codes, combined with high-level modulation. Thus, at the decoder, belief propagation can be used for iterative decoding of ARA coded modulation on a graph, provided a demapper transforms the received in-phase and quadrature samples into bit reliabilities.
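The serial turbo-like structure named by 'accumulate repeat accumulate' can be illustrated with a toy encoder. The sketch below is an assumed minimal form, not the scheme of the paper: it chains an outer accumulator (precoder), bit repetition, a pseudo-random interleaver, and an inner accumulator, omitting the rate allocation and puncturing a real ARA design would use.

```python
import random

def accumulate(bits):
    """Rate-1 mod-2 accumulator (running XOR), i.e. the 1/(1+D) code."""
    out, acc = [], 0
    for b in bits:
        acc ^= b
        out.append(acc)
    return out

def ara_encode(info_bits, repeat=3, seed=0):
    """Toy accumulate-repeat-accumulate encoder: outer accumulator
    (precoder) -> repeat-by-q -> pseudo-random interleaver ->
    inner accumulator."""
    pre = accumulate(info_bits)                    # accumulate
    rep = [b for b in pre for _ in range(repeat)]  # repeat
    perm = list(range(len(rep)))
    random.Random(seed).shuffle(perm)              # interleave
    return accumulate([rep[p] for p in perm])      # accumulate again

codeword = ara_encode([1, 0, 1, 1])  # 12 coded bits for 4 info bits
```

The coded bits would then be grouped and fed to a modulation mapper; since every stage is linear over GF(2), the all-zero input maps to the all-zero codeword.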

  12. Dress Codes. Legal Brief.

    ERIC Educational Resources Information Center

    Zirkel, Perry A.

    2000-01-01

    As illustrated by two recent decisions, the courts in the past decade have demarcated wide boundaries for school officials considering dress codes, whether in the form of selective prohibitions or required uniforms. Administrators must warn the community, provide legitimate justification and reasonable clarity, and comply with state law. (MLH)

  13. Dress Codes and Uniforms.

    ERIC Educational Resources Information Center

    Lumsden, Linda; Miller, Gabriel

    2002-01-01

    Students do not always make choices about school dress that adults agree with. Dress-code issues are explored in this Research Roundup, and guidance is offered to principals seeking to maintain a positive school climate. In "Do School Uniforms Fit?" Kerry White discusses arguments for and against school uniforms and summarizes the…

  14. Building Codes and Regulations.

    ERIC Educational Resources Information Center

    Fisher, John L.

    The hazard of fire is of great concern to libraries due to combustible books and new plastics used in construction and interiors. Building codes and standards can offer architects and planners guidelines to follow but these standards should be closely monitored, updated, and researched for fire prevention. (DS)

  15. Student Dress Codes.

    ERIC Educational Resources Information Center

    Uerling, Donald F.

    School officials see a need for regulations that prohibit disruptive and inappropriate forms of expression and attire; students see these regulations as unwanted restrictions on their freedom. This paper reviews court litigation involving constitutional limitations on school authority, dress and hair codes, state law constraints, and school…

  16. Video Coding for ESL.

    ERIC Educational Resources Information Center

    King, Kevin

    1992-01-01

    Coding tasks, a valuable technique for teaching English as a Second Language, are presented that enable students to look at patterns and structures of marital communication as well as objectively evaluate the degree of happiness or distress in the marriage. (seven references) (JL)

  17. Multiple trellis coded modulation

    NASA Technical Reports Server (NTRS)

    Simon, Marvin K. (Inventor); Divsalar, Dariush (Inventor)

    1990-01-01

    A technique for designing trellis codes to minimize bit error performance for a fading channel. The invention provides a criterion which may be used in the design of such codes which is significantly different from that used for additive white Gaussian noise channels. The method of multiple trellis coded modulation of the present invention comprises the steps of: (a) coding b bits of input data into s intermediate outputs; (b) grouping said s intermediate outputs into k groups of s.sub.i intermediate outputs each, where the summation of all s.sub.i's is equal to s and k is equal to at least 2; (c) mapping each of said k groups of intermediate outputs into one of a plurality of symbols in accordance with a plurality of modulation schemes, one for each group, such that the first group is mapped in accordance with a first modulation scheme and the second group is mapped in accordance with a second modulation scheme; and (d) outputting each of said symbols to provide k output symbols for each b bits of input data.
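Steps (a)-(d) can be sketched with a toy mapping. The "encoder" below is a stand-in parity extension rather than a real trellis code, and the QPSK/8PSK mappers are illustrative choices; the sketch only shows how s intermediate outputs are split into k groups, each mapped under its own modulation scheme, yielding k symbols per b input bits.

```python
import cmath
import math

def toy_encoder(bits3):
    """Stand-in for step (a): code b=3 input bits into s=5 intermediate
    outputs by appending two parity bits (a real design uses a trellis
    code here)."""
    b0, b1, b2 = bits3
    return [b0, b1, b2, b0 ^ b1, b1 ^ b2]

def map_qpsk(bits2):
    """Gray-mapped unit-energy QPSK symbol for 2 bits."""
    i = 1 - 2 * bits2[0]
    q = 1 - 2 * bits2[1]
    return complex(i, q) / math.sqrt(2)

def map_8psk(bits3):
    """Natural-mapped 8PSK symbol for 3 bits."""
    n = bits3[0] * 4 + bits3[1] * 2 + bits3[2]
    return cmath.exp(1j * 2 * math.pi * n / 8)

def mtcm_map(bits3):
    """Steps (b)-(d): split the s=5 outputs into k=2 groups of sizes
    2 and 3, map each group with its own modulation scheme, and emit
    k symbols."""
    s = toy_encoder(bits3)
    return [map_qpsk(s[:2]), map_8psk(s[2:])]

symbols = mtcm_map([1, 0, 1])  # two symbols per 3 input bits
```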

  18. Coding Theory and Projective Spaces

    NASA Astrophysics Data System (ADS)

    Silberstein, Natalia

    2008-05-01

    The projective space of order n over a finite field F_q is the set of all subspaces of the vector space F_q^{n}. In this work, we consider error-correcting codes in the projective space, focusing mainly on constant dimension codes. We start with the different representations of subspaces in the projective space. These representations involve matrices in reduced row echelon form, associated binary vectors, and Ferrers diagrams. Based on these representations, we provide a new formula for the computation of the distance between any two subspaces in the projective space. We examine lifted maximum rank distance (MRD) codes, which are nearly optimal constant dimension codes. We prove that a lifted MRD code can be represented in such a way that it forms a block design known as a transversal design. The incidence matrix of the transversal design derived from a lifted MRD code can be viewed as a parity-check matrix of a linear code in the Hamming space. We find the properties of these codes, which can also be viewed as LDPC codes. We present new bounds and constructions for constant dimension codes. First, we present a multilevel construction for constant dimension codes, which can be viewed as a generalization of the lifted MRD code construction. This construction is based on a new type of rank-metric codes, called Ferrers diagram rank-metric codes. Then we derive upper bounds on the size of constant dimension codes which contain the lifted MRD code, and provide a construction for two families of codes that attain these upper bounds. We generalize the well-known concept of a punctured code for a code in the projective space to obtain large codes which are not constant dimension. We present efficient enumerative encoding and decoding techniques for the Grassmannian. Finally we describe a search method for constant dimension lexicodes.
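The metric commonly used for codes in the projective space is the subspace distance d(U, V) = dim U + dim V - 2 dim(U ∩ V). The sketch below computes it over F_2 via the identity d(U, V) = 2 dim(U + V) - dim U - dim V, representing generators as integer bitmasks (this is the standard subspace metric, not the new distance formula the thesis derives).

```python
def gf2_rank(rows):
    """Rank over GF(2); each row is an int bitmask of one generator."""
    pivots = []
    for row in rows:
        for p in pivots:
            row = min(row, row ^ p)  # XOR-basis reduction
        if row:
            pivots.append(row)
    return len(pivots)

def subspace_distance(U, V):
    """Subspace distance d(U,V) = dim U + dim V - 2 dim(U ∩ V),
    computed as 2 dim(U + V) - dim U - dim V; the generators of
    U + V are simply the concatenated generator lists."""
    return 2 * gf2_rank(U + V) - gf2_rank(U) - gf2_rank(V)

# Two 2-dimensional subspaces of F_2^4, generators as bitmasks:
U = [0b1000, 0b0100]         # span{e1, e2}
V = [0b1000, 0b0010]         # span{e1, e3}
d = subspace_distance(U, V)  # intersection is span{e1}, so d = 2
```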

  19. High-fidelity plasma codes for burn physics

    SciTech Connect

    Cooley, James; Graziani, Frank; Marinak, Marty; Murillo, Michael

    2016-10-19

    Accurate predictions of equation of state (EOS), ionic and electronic transport properties are of critical importance for high-energy-density plasma science. Transport coefficients inform radiation-hydrodynamic codes and impact diagnostic interpretation, which in turn impacts our understanding of the development of instabilities, the overall energy balance of burning plasmas, and the efficacy of self-heating from charged-particle stopping. Important processes include thermal and electrical conduction, electron-ion coupling, inter-diffusion, ion viscosity, and charged particle stopping. However, uncertainties in these coefficients are not well established. Fundamental plasma science codes, also called high-fidelity plasma codes (HFPC), are a relatively recent computational tool that augments both experimental data and theoretical foundations of transport coefficients. This paper addresses the current status of HFPC codes and their future development, and the potential impact they play in improving the predictive capability of the multi-physics hydrodynamic codes used in HED design.

  20. Maestro and Castro: Simulation Codes for Astrophysical Flows

    NASA Astrophysics Data System (ADS)

    Zingale, Michael; Almgren, Ann; Beckner, Vince; Bell, John; Friesen, Brian; Jacobs, Adam; Katz, Maximilian P.; Malone, Christopher; Nonaka, Andrew; Zhang, Weiqun

    2017-01-01

    Stellar explosions are multiphysics problems—modeling them requires the coordinated input of gravity solvers, reaction networks, radiation transport, and hydrodynamics together with microphysics recipes to describe the physics of matter under extreme conditions. Furthermore, these models involve following a wide range of spatial and temporal scales, which puts tough demands on simulation codes. We developed the codes Maestro and Castro to meet the computational challenges of these problems. Maestro uses a low Mach number formulation of the hydrodynamics to efficiently model convection. Castro solves the fully compressible radiation hydrodynamics equations to capture the explosive phases of stellar phenomena. Both codes are built upon the BoxLib adaptive mesh refinement library, which prepares them for next-generation exascale computers. Common microphysics shared between the codes allows us to transfer a problem from the low Mach number regime in Maestro to the explosive regime in Castro. Importantly, both codes are freely available (https://github.com/BoxLib-Codes). We will describe the design of the codes and some of their science applications, as well as future development directions. Support for development was provided by NSF award AST-1211563 and DOE/Office of Nuclear Physics grant DE-FG02-87ER40317 to Stony Brook and by the Applied Mathematics Program of the DOE Office of Advanced Scientific Computing Research under US DOE contract DE-AC02-05CH11231 to LBNL.

  1. A MODEL BUILDING CODE ARTICLE ON FALLOUT SHELTERS WITH RECOMMENDATIONS FOR INCLUSION OF REQUIREMENTS FOR FALLOUT SHELTER CONSTRUCTION IN FOUR NATIONAL MODEL BUILDING CODES.

    ERIC Educational Resources Information Center

    American Inst. of Architects, Washington, DC.

    A MODEL BUILDING CODE FOR FALLOUT SHELTERS WAS DRAWN UP FOR INCLUSION IN FOUR NATIONAL MODEL BUILDING CODES. DISCUSSION IS GIVEN OF FALLOUT SHELTERS WITH RESPECT TO--(1) NUCLEAR RADIATION, (2) NATIONAL POLICIES, AND (3) COMMUNITY PLANNING. FALLOUT SHELTER REQUIREMENTS FOR SHIELDING, SPACE, VENTILATION, CONSTRUCTION, AND SERVICES SUCH AS ELECTRICAL…

  2. Improved Algorithms Speed It Up for Codes

    SciTech Connect

    Hazi, A

    2005-09-20

    Huge computers, huge codes, complex problems to solve. The longer it takes to run a code, the more it costs. One way to speed things up and save time and money is through hardware improvements--faster processors, different system designs, bigger computers. But another side of supercomputing can reap savings in time and speed: software improvements to make codes--particularly the mathematical algorithms that form them--run faster and more efficiently. Speed up math? Is that really possible? According to Livermore physicist Eugene Brooks, the answer is a resounding yes. ''Sure, you get great speed-ups by improving hardware,'' says Brooks, the deputy leader for Computational Physics in N Division, which is part of Livermore's Physics and Advanced Technologies (PAT) Directorate. ''But the real bonus comes on the software side, where improvements in software can lead to orders of magnitude improvement in run times.'' Brooks knows whereof he speaks. Working with Laboratory physicist Abraham Szoeke and others, he has been instrumental in devising ways to shrink the running time of what has, historically, been a tough computational nut to crack: radiation transport codes based on the statistical or Monte Carlo method of calculation. And Brooks is not the only one. Others around the Laboratory, including physicists Andrew Williamson, Randolph Hood, and Jeff Grossman, have come up with innovative ways to speed up Monte Carlo calculations using pure mathematics.

  3. Hawking radiation

    NASA Astrophysics Data System (ADS)

    Parentani, Renaud; Spindel, Philippe

    2011-12-01

    Hawking radiation is the thermal radiation predicted to be spontaneously emitted by black holes. It arises from the steady conversion of quantum vacuum fluctuations into pairs of particles, one of which escapes to infinity while the other is trapped inside the black hole horizon. It is named after the physicist Stephen Hawking, who derived its existence in 1974. This radiation reduces the mass of black holes and is therefore also known as black hole evaporation.

  4. Radiation Environment Inside Spacecraft

    NASA Technical Reports Server (NTRS)

    O'Neill, Patrick

    2015-01-01

    Dr. Patrick O'Neill, NASA Johnson Space Center, will present a detailed description of the radiation environment inside spacecraft. The free space (outside) solar and galactic cosmic ray and trapped Van Allen belt proton spectra are significantly modified as these ions propagate through various thicknesses of spacecraft structure and shielding material. In addition to energy loss, secondary ions are created as the ions interact with the structure materials. Nuclear interaction codes (FLUKA, GEANT4, HZETRN, MCNPX, CEM03, and PHITS) transport free space spectra through different thicknesses of various materials. These "inside" energy spectra are then converted to Linear Energy Transfer (LET) spectra and dose rate, which is what electronics systems designers need. Model predictions are compared to radiation measurements made by instruments such as the Intra-Vehicular Charged Particle Directional Spectrometer (IV-CPDS) used inside the Space Station, Orion, and Space Shuttle.

  5. Binary coding for hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Wang, Jing; Chang, Chein-I.; Chang, Chein-Chi; Lin, Chinsu

    2004-10-01

    Binary coding is one of the simplest ways to characterize spectral features. One commonly used method is a binary coding-based image software system, called Spectral Analysis Manager (SPAM), for remotely sensed imagery, developed by Mazer et al. For a given spectral signature, SPAM calculates its spectral mean and inter-band spectral difference and uses them as thresholds to generate a binary code word for this particular spectral signature. Such a coding scheme is generally effective and also very simple to implement. This paper revisits SPAM and further develops three new SPAM-based binary coding methods, called equal probability partition (EPP) binary coding, halfway partition (HP) binary coding and median partition (MP) binary coding. These three binary coding methods, along with SPAM, will be evaluated for spectral discrimination and identification. In doing so, a new criterion, called a posteriori discrimination probability (APDP), is also introduced as a performance measure.
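The mean-threshold part of the scheme can be sketched as follows (an assumed simplified form: the actual SPAM system also codes inter-band spectral differences, which this sketch omits):

```python
def spam_binary_code(spectrum):
    """SPAM-style mean-threshold coding: each band is coded 1 if its
    value is at or above the spectral mean, else 0."""
    mean = sum(spectrum) / len(spectrum)
    return [1 if x >= mean else 0 for x in spectrum]

def hamming(a, b):
    """Hamming distance between two code words, usable as a crude
    spectral-discrimination measure."""
    return sum(x != y for x, y in zip(a, b))

# Two hypothetical six-band spectral signatures
sig_a = [0.12, 0.15, 0.40, 0.45, 0.43, 0.20]
sig_b = [0.30, 0.32, 0.10, 0.12, 0.11, 0.35]
code_a = spam_binary_code(sig_a)  # -> [0, 0, 1, 1, 1, 0]
code_b = spam_binary_code(sig_b)  # -> [1, 1, 0, 0, 0, 1]
```

Comparing code words by Hamming distance is what makes the representation cheap to use for discrimination: here the two signatures differ in every band position.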

  6. Computational radiology and imaging with the MCNP Monte Carlo code

    SciTech Connect

    Estes, G.P.; Taylor, W.M.

    1995-05-01

    MCNP, a 3D coupled neutron/photon/electron Monte Carlo radiation transport code, is currently used in medical applications such as cancer radiation treatment planning, interpretation of diagnostic radiation images, and treatment beam optimization. This paper will discuss MCNP's current uses and capabilities, as well as envisioned improvements that would further enhance MCNP's role in computational medicine. It will be demonstrated that the methodology exists to simulate medical images (e.g. SPECT). Techniques will be discussed that would enable the construction of 3D computational geometry models of individual patients for use in patient-specific studies that would improve the quality of care for patients.

  7. Dual-sided coded-aperture imager

    DOEpatents

    Ziock, Klaus-Peter

    2009-09-22

    In a vehicle, a single detector plane simultaneously measures radiation coming through two coded-aperture masks, one on either side of the detector. To determine which side of the vehicle a source is on, the two shadow masks are inverses of each other, i.e., one is a mask and the other is the anti-mask. All of the collected data is processed through two versions of an image reconstruction algorithm: one treats the data as if it were obtained through the mask, the other as though the data were obtained through the anti-mask.
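The mask/anti-mask idea can be illustrated in one dimension. The aperture pattern below is a toy array (not a real uniformly redundant array), and the correlation decoder is the standard coded-aperture reconstruction; the hypothesis whose decoder yields the stronger peak identifies the source side, and the peak position gives the direction.

```python
def decode(data, pattern):
    """Correlation decoder: correlate detector counts with a balanced
    (+1/-1) copy of an aperture pattern.  A point source seen through
    that pattern produces a peak at the shift matching its direction."""
    n = len(pattern)
    w = [2 * p - 1 for p in pattern]
    return [sum(data[i] * w[(i - t) % n] for i in range(n))
            for t in range(n)]

mask = [1, 0, 1, 1, 0, 0, 1, 0]   # toy 1-D aperture pattern
anti = [1 - m for m in mask]      # the opposite side's pattern

# A point source on the mask side casts a shifted copy of the mask
# onto the detector; the shift (3 here) encodes its direction.
data = [mask[(i - 3) % 8] for i in range(8)]

dec_mask = decode(data, mask)     # hypothesis: seen through the mask
dec_anti = decode(data, anti)     # hypothesis: seen through anti-mask
side = "mask" if max(dec_mask) > max(dec_anti) else "anti-mask"
pos = dec_mask.index(max(dec_mask))  # recovered direction: 3
```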

  8. Sinusoidal transform coding

    NASA Technical Reports Server (NTRS)

    Mcaulay, Robert J.; Quatieri, Thomas F.

    1988-01-01

    It has been shown that an analysis/synthesis system based on a sinusoidal representation of speech leads to synthetic speech that is essentially perceptually indistinguishable from the original. Strategies for coding the amplitudes, frequencies and phases of the sine waves have been developed that have led to a multirate coder operating at rates from 2400 to 9600 bps. The encoded speech is highly intelligible at all rates with a uniformly improving quality as the data rate is increased. A real-time fixed-point implementation has been developed using two ADSP2100 DSP chips. The methods used for coding and quantizing the sine-wave parameters for operation at the various frame rates are described.
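The analysis/synthesis idea can be sketched with a toy peak-picking DFT analyzer and a sum-of-sines synthesizer. This is illustrative only: a real sinusoidal coder estimates peaks between bins, tracks them frame to frame, and quantizes the parameters, none of which is shown here.

```python
import math

def analyze(frame, n_peaks=2):
    """Toy sine-wave analysis: take the DFT of one frame and keep the
    n_peaks strongest bins as (amplitude, bin, phase) triples."""
    n = len(frame)
    spec = []
    for k in range(n // 2 + 1):
        re = sum(frame[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = -sum(frame[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        spec.append((k, complex(re, im)))
    spec.sort(key=lambda kv: -abs(kv[1]))
    params = []
    for k, c in spec[:n_peaks]:
        scale = 1.0 if k in (0, n // 2) else 2.0  # one-sided spectrum
        params.append((scale * abs(c) / n, k, math.atan2(c.imag, c.real)))
    return params

def synthesize(params, n):
    """Rebuild the frame as a sum of sine waves."""
    return [sum(a * math.cos(2 * math.pi * k * t / n + ph)
                for a, k, ph in params) for t in range(n)]

# Two-component test frame with exact-bin frequencies
n = 64
frame = [0.8 * math.cos(2 * math.pi * 5 * t / n) +
         0.3 * math.cos(2 * math.pi * 11 * t / n + 1.0) for t in range(n)]
recon = synthesize(analyze(frame), n)
err = max(abs(x - y) for x, y in zip(frame, recon))  # near machine precision
```

For exact-bin sinusoids the resynthesis is essentially lossless; the coder's rate control comes from quantizing the amplitude, frequency, and phase triples.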

  9. WHPA Code available

    NASA Astrophysics Data System (ADS)

    The Wellhead Protection Area code is now available for distribution by the International Ground Water Modeling Center in Indianapolis, Ind. The WHPA code is a modular, semianalytical, groundwater flow model developed for the U.S. Environmental Protection Agency, Office of Ground Water Protection, designed to assist state and local technical staff with the task of Wellhead Protection Area (WHPA) delineation. A complete news item appeared in Eos, May 1, 1990, p. 690.The model consists of four independent, semianalytical modules that may be used to identify the areal extent of groundwater contribution to one or multiple pumping wells. One module is a general particle tracking program that may be used as a post-processor for two-dimensional, numerical models of groundwater flow. One module incorporates a Monte Carlo approach to investigate the effects of uncertain input parameters on capture zones. Multiple pumping and injection wells may be present and barrier or stream boundary conditions may be investigated.

  10. WHPA Code available

    NASA Astrophysics Data System (ADS)

    The Wellhead Protection Area (WHPA) code is now available for distribution by the International Ground Water Modeling Center in Indianapolis, Ind. The WHPA code is a modular, semi-analytical, groundwater flow model developed for the U.S. Environmental Protection Agency, Office of Ground Water Protection. It is designed to assist state and local technical staff with the task of WHPA delineation.The model consists of four independent, semi-analytical modules that may be used to identify the areal extent of groundwater contribution to one or multiple pumping wells. One module is a general particle tracking program that may be used as a post-processor for two-dimensional, numerical models of groundwater flow. One module incorporates a Monte Carlo approach to investigate the effects of uncertain input parameters on capture zones. Multiple pumping and injection wells may be present and barrier or stream boundary conditions may be investigated.

  11. HYCOM Code Development

    DTIC Science & Technology

    2003-02-10

    HYCOM code development. Alan J. Wallcraft, Naval Research Laboratory. Presented at the Layered Ocean Model Users' Workshop (LOM 2003), Miami, FL, February 10, 2003. Approved for public release; distribution unlimited. Topics include: a Kraus-Turner mixed layer; an Energy-Loan (passive) ice model; high-frequency atmospheric forcing; a new I/O scheme (.a and .b files); and scalability via

  12. Reeds computer code

    NASA Technical Reports Server (NTRS)

    Bjork, C.

    1981-01-01

    The REEDS (rocket exhaust effluent diffusion single layer) computer code is used for the estimation of certain rocket exhaust effluent concentrations and dosages and their distributions near the Earth's surface following a rocket launch event. Output from REEDS is used in producing near real time air quality and environmental assessments of the effects of certain potentially harmful effluents, namely HCl, Al2O3, CO, and NO.

  13. Trajectory Code Studies, 1987

    SciTech Connect

    Poukey, J.W.

    1988-01-01

    The trajectory code TRAJ has been used extensively to study nonimmersed foilless electron diodes. The basic goal of the research is to design low-emittance injectors for electron linacs and propagation experiments. Systems studied during 1987 include Delphi, Recirc, and Troll. We also discuss a partly successful attempt to extend the same techniques to high currents (tens of kA). 7 refs., 30 figs.

  14. Status of MARS Code

    SciTech Connect

    N.V. Mokhov

    2003-04-09

    Status and recent developments of the MARS 14 Monte Carlo code system for simulation of hadronic and electromagnetic cascades in shielding, accelerator and detector components in the energy range from a fraction of an electronvolt up to 100 TeV are described. These include physics models in both the strong and electromagnetic interaction sectors, variance reduction techniques, residual dose, geometry, tracking, histogramming, the MAD-MARS Beam Line Builder, and the graphical user interface.

  15. Time coded distribution via broadcasting stations

    NASA Technical Reports Server (NTRS)

    Leschiutta, S.; Pettiti, V.; Detoma, E.

    1979-01-01

    The distribution of standard time signals via AM and FM broadcasting stations presents the distinct advantages of wide-area coverage and inexpensive receivers, but the signals are radiated a limited number of times per day, are not usually available during the night, and no full and automatic synchronization of a remote clock is possible. As an attempt to overcome some of these problems, a time-coded signal with complete date information is diffused by the IEN via the national broadcasting networks in Italy. These signals are radiated by some 120 AM and about 3000 FM and TV transmitters around the country. In this way, a time-ordered system with an accuracy of a couple of milliseconds is easily achieved.

  16. Orthopedics coding and funding.

    PubMed

    Baron, S; Duclos, C; Thoreux, P

    2014-02-01

    The French tarification à l'activité (T2A) prospective payment system is a financial system in which a health-care institution's resources are based on performed activity. Activity is described via the PMSI medical information system (programme de médicalisation du système d'information). The PMSI classifies hospital cases by clinical and economic categories known as diagnosis-related groups (DRG), each with an associated price tag. Coding a hospital case involves giving as realistic a description as possible so as to categorize it in the right DRG and thus ensure appropriate payment. For this, it is essential to understand what determines the pricing of inpatient stay: namely, the code for the surgical procedure, the patient's principal diagnosis (reason for admission), codes for comorbidities (everything that adds to management burden), and the management of the length of inpatient stay. The PMSI is used to analyze the institution's activity and dynamism: change on previous year, relation to target, and comparison with competing institutions based on indicators such as the mean length of stay performance indicator (MLS PI). The T2A system improves overall care efficiency. Quality of care, however, is not presently taken account of in the payment made to the institution, as there are no indicators for this; work needs to be done on this topic.

  17. MELCOR computer code manuals

    SciTech Connect

    Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.; Stuart, D.S.; Thompson, S.L.; Hodge, S.A.; Hyman, C.R.; Sanders, R.L.

    1995-03-01

MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the U.S. Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August 1994. Volume 1 contains a primer that describes MELCOR's phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package.

  18. Bar coded retroreflective target

    DOEpatents

    Vann, Charles S.

    2000-01-01

This small, inexpensive, non-contact laser sensor can detect the location of a retroreflective target in a relatively large volume and up to six degrees of position. The tracker's laser beam is formed into a plane of light which is swept across the space of interest. When the beam illuminates the retroreflector, some of the light returns to the tracker. The intensity, angle, and time of the return beam are measured to calculate the three-dimensional location of the target. With three retroreflectors on the target, the locations of three points on the target are measured, enabling the calculation of all six degrees of target position. Until now, devices for three-dimensional tracking of objects in a large volume have been heavy, large, and very expensive. Because of the simplicity and unique characteristics of this tracker, it is capable of three-dimensional tracking of one to several objects in a large volume, yet it is compact, light-weight, and relatively inexpensive. Alternatively, a tracker produces a diverging laser beam which is directed towards a fixed position, and senses when a retroreflective target enters the fixed field of view. An optically bar-coded target can be read by the tracker to provide information about the target. The target can be formed of a ball lens with a bar code on one end. As the target moves through the field, the ball lens causes the laser beam to scan across the bar code.

  19. Radiation in Particle Simulations

    SciTech Connect

    More, R M; Graziani, F R; Glosli, J; Surh, M

    2009-06-15

Hot dense radiative (HDR) plasmas common to Inertial Confinement Fusion (ICF) and stellar interiors have high temperature (a few hundred eV to tens of keV), high density (tens to hundreds of g/cc) and high pressure (hundreds of megabars to thousands of gigabars). Typically, such plasmas undergo collisional, radiative, atomic and possibly thermonuclear processes. In order to describe HDR plasmas, computational physicists in ICF and astrophysics use atomic-scale microphysical models implemented in various simulation codes. Experimental validation of the models used to describe HDR plasmas is difficult to perform. Direct Numerical Simulation (DNS) of the many-body interactions of plasmas is a promising approach to model validation, but previous work either relies on the collisionless approximation or ignores radiation. We present four methods that attempt a new numerical simulation technique to address a currently unsolved problem: the extension of molecular dynamics to collisional plasmas including emission and absorption of radiation. The first method applies the Lienard-Wiechert solution of Maxwell's equations for a classical particle whose motion is assumed to be known (section 3). The second method expands the electromagnetic field in normal modes (plane waves in a box with periodic boundary conditions) and solves the equation for wave amplitudes coupled to the particle motion (section 4). The third method is a hybrid MD/MC (molecular dynamics/Monte Carlo) method which calculates radiation emitted or absorbed by electron-ion pairs during close collisions (section 5). The fourth method is a generalization of the third method to include small clusters of particles emitting radiation during close encounters: one electron simultaneously hitting two ions, two electrons simultaneously hitting one ion, etc. (section 6). This approach is inspired by the Virial expansion method of equilibrium statistical mechanics.
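The first method rests on classical electrodynamics: for a nonrelativistic charge, the far-field power of the Lienard-Wiechert fields reduces to the Larmor formula. A minimal sketch of that formula (the constants are standard SI values; the acceleration is an assumed, illustrative number, not taken from the paper):

```python
import math

# Standard SI constants
E_CHARGE = 1.602176634e-19   # elementary charge, C
EPS0 = 8.8541878128e-12      # vacuum permittivity, F/m
C_LIGHT = 2.99792458e8       # speed of light, m/s

def larmor_power(q, a):
    """Power radiated by a nonrelativistic point charge with acceleration a:
    P = q^2 a^2 / (6 pi eps0 c^3), the far-field limit of the
    Lienard-Wiechert fields."""
    return q**2 * a**2 / (6.0 * math.pi * EPS0 * C_LIGHT**3)

# Illustrative (assumed) acceleration for an electron in a close collision.
P = larmor_power(E_CHARGE, 1e23)   # watts
```

The quadratic scaling in acceleration is why close electron-ion collisions dominate the emission the hybrid MD/MC method tracks.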

  20. Radar frequency radiation

    NASA Astrophysics Data System (ADS)

    Malowicki, E.

    1981-11-01

A method is presented for the determination of radar frequency radiation power densities that the PAVE PAWS radar system could produce in its air and ground environment. The effort was prompted by the concern of the people in the vicinity of Otis AFB, MA and Beale AFB, CA about the possible radar frequency radiation hazard of the PAVE PAWS radar. The method is based on the following main assumptions: (a) the total field can be computed as the vector summation of the individual fields due to each antenna element; (b) the individual field can be calculated using distances for which the field point is in the far field of the antenna element. An RFR computer program was coded for the RADC HE 6180 digital computer and exercised to calculate the radiation levels in the air and ground space for the present baseline and the possible 6 dB and 10 dB growth systems of the PAVE PAWS radar system at Otis AFB, MA. The average radiation levels due to the surveillance fence were computed for three regions: in the air space in front of the radar, at the radar hazard fence at Otis AFB, MA, and at representative ground points in the Otis AFB vicinity. It was concluded that the radar frequency radiation of PAVE PAWS does not present a hazard to personnel provided there is no entry to the air hazard zone or to the area within the hazard fence. The method developed offers a cost-effective way to determine radiation levels from a phased array radar, especially in the near field and transition regions.
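Assumption (a), summing the per-element far fields as vectors, is the standard phased-array calculation. A minimal sketch for a uniform linear array (the element count and spacing are assumed, illustrative values, not the PAVE PAWS geometry):

```python
import cmath
import math

def array_field(n_elements, d_over_lambda, theta):
    """Vector sum of the far fields of a uniform linear array: each element
    contributes a unit field with a geometric phase of
    2*pi*(d/lambda)*sin(theta) relative to its neighbor, i.e. assumption (a)
    of the abstract evaluated in each element's far field."""
    phase = 2.0 * math.pi * d_over_lambda * math.sin(theta)
    return sum(cmath.exp(1j * phase * m) for m in range(n_elements))

# Illustrative 16-element, half-wavelength-spaced array.
peak = abs(array_field(16, 0.5, 0.0))           # broadside: coherent sum of 16
null = abs(array_field(16, 0.5, math.pi / 2))   # endfire null at this spacing
```

The power density follows from |E|^2, so the broadside field of N coherent elements gives an N^2 power gain over a single element.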

  1. Suboptimum decoding of block codes

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Kasami, Tadao

    1991-01-01

This paper investigates a class of decomposable codes and their distance and structural properties. It is shown that this class includes several classes of well-known and efficient codes as subclasses. Several methods for constructing decomposable codes or decomposing codes are presented. A two-stage soft-decision decoding scheme for decomposable codes, their translates, or unions of translates is devised. This two-stage soft-decision decoding is suboptimum, and provides an excellent trade-off between error performance and decoding complexity for codes of moderate and long block length.

  2. Preliminary Assessment of Turbomachinery Codes

    NASA Technical Reports Server (NTRS)

    Mazumder, Quamrul H.

    2007-01-01

This report assesses different CFD codes developed and currently being used at Glenn Research Center to predict turbomachinery fluid flow and heat transfer behavior. The following codes are considered: APNASA, TURBO, GlennHT, H3D, and SWIFT. Each code is described separately in the following section, with its current modeling capabilities, level of validation, pre/post processing, and future development and validation requirements. This report addresses only previously published validations of the codes; however, the codes have since been further developed to extend their capabilities.

  3. User's manual for the ORIGEN2 computer code

    SciTech Connect

    Croff, A.G.

    1980-07-01

    This report describes how to use a revised version of the ORIGEN computer code, designated ORIGEN2. Included are a description of the input data, input deck organization, and sample input and output. ORIGEN2 can be obtained from the Radiation Shielding Information Center at ORNL.

  4. Construction of new quantum MDS codes derived from constacyclic codes

    NASA Astrophysics Data System (ADS)

    Taneja, Divya; Gupta, Manish; Narula, Rajesh; Bhullar, Jaskaran

Obtaining quantum maximum distance separable (MDS) codes from dual-containing classical constacyclic codes using the Hermitian construction has paved a path to undertake the challenges related to such constructions. Using the same technique, some new parameters of quantum MDS codes are constructed here. One set of parameters obtained in this paper achieves a much larger distance than earlier work. The remaining constructed parameters of quantum MDS codes have large minimum distance and had not been explored before.

  5. Convolutional coding techniques for data protection

    NASA Technical Reports Server (NTRS)

    Massey, J. L.

    1975-01-01

    Results of research on the use of convolutional codes in data communications are presented. Convolutional coding fundamentals are discussed along with modulation and coding interaction. Concatenated coding systems and data compression with convolutional codes are described.

  6. Combinatorial neural codes from a mathematical coding theory perspective.

    PubMed

    Curto, Carina; Itskov, Vladimir; Morrison, Katherine; Roth, Zachary; Walker, Judy L

    2013-07-01

    Shannon's seminal 1948 work gave rise to two distinct areas of research: information theory and mathematical coding theory. While information theory has had a strong influence on theoretical neuroscience, ideas from mathematical coding theory have received considerably less attention. Here we take a new look at combinatorial neural codes from a mathematical coding theory perspective, examining the error correction capabilities of familiar receptive field codes (RF codes). We find, perhaps surprisingly, that the high levels of redundancy present in these codes do not support accurate error correction, although the error-correcting performance of receptive field codes catches up to that of random comparison codes when a small tolerance to error is introduced. However, receptive field codes are good at reflecting distances between represented stimuli, while the random comparison codes are not. We suggest that a compromise in error-correcting capability may be a necessary price to pay for a neural code whose structure serves not only error correction, but must also reflect relationships between stimuli.
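The error-correction comparison in the abstract comes down to nearest-codeword (maximum-likelihood) decoding under the Hamming metric: a code with minimum distance d is guaranteed to correct t = (d-1)//2 bit flips. A minimal sketch with a toy 4-codeword code (the codewords are hypothetical, chosen only to illustrate the mechanics, not an actual receptive field code):

```python
from itertools import combinations

def hamming(u, v):
    """Number of positions where two binary tuples differ."""
    return sum(a != b for a, b in zip(u, v))

def min_distance(code):
    """Minimum pairwise Hamming distance of a block code."""
    return min(hamming(u, v) for u, v in combinations(code, 2))

def nearest_codeword(word, code):
    """Maximum-likelihood decoding over a binary symmetric channel:
    return the codeword closest to the received word."""
    return min(code, key=lambda c: hamming(word, c))

# A toy length-6 code with 4 codewords (illustrative only).
code = [(0,0,0,0,0,0), (1,1,1,0,0,0), (0,0,0,1,1,1), (1,1,1,1,1,1)]
d = min_distance(code)     # minimum distance of the code
t = (d - 1) // 2           # number of guaranteed-correctable bit flips

received = (1, 0, 1, 0, 0, 0)            # second codeword with one flip
decoded = nearest_codeword(received, code)
```

The paper's point can be read in this frame: redundancy alone does not guarantee a large minimum distance, which is what sets t.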

  7. New quantum MDS-convolutional codes derived from constacyclic codes

    NASA Astrophysics Data System (ADS)

    Li, Fengwei; Yue, Qin

    2015-12-01

    In this paper, we utilize a family of Hermitian dual-containing constacyclic codes to construct classical and quantum MDS convolutional codes. Our classical and quantum convolutional codes are optimal in the sense that they attain the classical (quantum) generalized Singleton bound.

  8. A class of constacyclic BCH codes and new quantum codes

    NASA Astrophysics Data System (ADS)

Liu, Yang; Li, Ruihu; Lv, Liangdong; Ma, Yuena

    2017-03-01

Constacyclic BCH codes have been widely studied in the literature and have been used to construct quantum codes in recent years. However, for the class of quantum codes of length n=q^{2m}+1 over F_{q^2} with q an odd prime power, only codes of distance δ ≤ 2q^2 have been obtained in the literature. In this paper, by a detailed analysis of the properties of q^2-ary cyclotomic cosets, the maximum designed distance δ_{max} of a class of Hermitian dual-containing constacyclic BCH codes of length n=q^{2m}+1 is determined; this class of constacyclic codes has characteristics analogous to those of primitive BCH codes over F_{q^2}. We then obtain a sequence of dual-containing constacyclic codes of designed distances 2q^2 < δ ≤ δ_{max}. Consequently, new quantum codes with distance d > 2q^2 can be constructed from these dual-containing codes via the Hermitian construction. These newly obtained quantum codes have better code rates than those constructed from primitive BCH codes.
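The construction hinges on how the q^2-ary cyclotomic cosets partition Z_n for n = q^{2m}+1. A minimal sketch of computing such cosets (the parameters q = 3, m = 1 are illustrative, far smaller than the regime the paper targets):

```python
def cyclotomic_cosets(a, n):
    """a-ary cyclotomic cosets C_s = {s * a^i mod n}, which partition Z_n."""
    seen, cosets = set(), []
    for s in range(n):
        if s in seen:
            continue
        coset, x = [], s
        while x not in coset:     # follow s, s*a, s*a^2, ... until it cycles
            coset.append(x)
            x = (x * a) % n
        seen.update(coset)
        cosets.append(coset)
    return cosets

q, m = 3, 1
n = q**(2 * m) + 1                     # n = q^{2m} + 1 = 10
cosets = cyclotomic_cosets(q**2, n)    # q^2-ary cosets modulo n
```

The designed distance of a BCH-type code is governed by how many consecutive exponents these cosets cover, which is why analyzing their structure yields δ_max.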

  9. A Computer Code for the Calculation of NLTE Model Atmospheres Using ALI

    NASA Astrophysics Data System (ADS)

    Kubát, J.

    2003-01-01

    A code for calculation of NLTE model atmospheres in hydrostatic and radiative equilibrium in either spherically symmetric or plane parallel geometry is described. The method of accelerated lambda iteration is used for the treatment of radiative transfer. Other equations (hydrostatic equilibrium, radiative equilibrium, statistical equilibrium, optical depth) are solved using the Newton-Raphson method (linearization). In addition to the standard output of the model atmosphere (dependence of temperature, density, radius, and population numbers on column mass depth) the code enables optional additional outputs for better understanding of processes in the atmosphere. The code is able to calculate model atmospheres of plane-parallel and spherically symmetric semi-infinite atmospheres as well as models of plane parallel and spherical shells. There is also an option for solution of a restricted problem of a NLTE line formation (solution of radiative transfer and statistical equilibrium for a given model atmosphere). The overall scheme of the code is presented.

  10. Radiation effects on human heredity.

    PubMed

    Nakamura, Nori; Suyama, Akihiko; Noda, Asao; Kodama, Yoshiaki

    2013-01-01

    In experimental organisms such as fruit flies and mice, increased frequencies in germ cell mutations have been detected following exposure to ionizing radiation. In contrast, there has been no clear evidence for radiation-induced germ cell mutations in humans that lead to birth defects, chromosome aberrations, Mendelian disorders, etc. This situation exists partly because no sensitive and practical genetic marker is available for human studies and also because the number of people exposed to large doses of radiation and subsequently having offspring was small until childhood cancer survivors became an important study population. In addition, the genome of apparently normal individuals seems to contain large numbers of alterations, including dozens to hundreds of nonfunctional alleles. With the number of mutational events in protein-coding genes estimated as less than one per genome after 1 gray (Gy) exposure, it is unsurprising that genetic effects from radiation have not yet been detected conclusively in humans.

  11. Simulations of Laboratory Astrophysics Experiments using the CRASH code

    NASA Astrophysics Data System (ADS)

    Trantham, Matthew; Kuranz, Carolyn; Manuel, Mario; Keiter, Paul; Drake, R. P.

    2014-10-01

    Computer simulations can assist in the design and analysis of laboratory astrophysics experiments. The Center for Radiative Shock Hydrodynamics (CRASH) at the University of Michigan developed a code that has been used to design and analyze high-energy-density experiments on OMEGA, NIF, and other large laser facilities. This Eulerian code uses block-adaptive mesh refinement (AMR) with implicit multigroup radiation transport, electron heat conduction and laser ray tracing. This poster/talk will demonstrate some of the experiments the CRASH code has helped design or analyze including: Kelvin-Helmholtz, Rayleigh-Taylor, imploding bubbles, and interacting jet experiments. This work is funded by the Predictive Sciences Academic Alliances Program in NNSA-ASC via Grant DEFC52-08NA28616, by the NNSA-DS and SC-OFES Joint Program in High-Energy-Density Laboratory Plasmas, Grant Number DE-NA0001840, and by the National Laser User Facility Program, Grant Number DE-NA0000850.

  12. Simulations of Laboratory Astrophysics Experiments using the CRASH code

    NASA Astrophysics Data System (ADS)

    Trantham, Matthew; Kuranz, Carolyn; Fein, Jeff; Wan, Willow; Young, Rachel; Keiter, Paul; Drake, R. Paul

    2015-11-01

    Computer simulations can assist in the design and analysis of laboratory astrophysics experiments. The Center for Radiative Shock Hydrodynamics (CRASH) at the University of Michigan developed a code that has been used to design and analyze high-energy-density experiments on OMEGA, NIF, and other large laser facilities. This Eulerian code uses block-adaptive mesh refinement (AMR) with implicit multigroup radiation transport, electron heat conduction and laser ray tracing. This poster will demonstrate some of the experiments the CRASH code has helped design or analyze including: Kelvin-Helmholtz, Rayleigh-Taylor, magnetized flows, jets, and laser-produced plasmas. This work is funded by the following grants: DEFC52-08NA28616, DE-NA0001840, and DE-NA0002032.

  13. Radiation Therapy

    MedlinePlus

    ... can watch you during the procedure. As you go through radiation treatment, you may feel like you're all ... treatment. Avoid exposing the treated area to the sun during the weeks you're getting radiation therapy. And when the treatment's over, wear sunscreen ...

  14. Understanding Radiation.

    ERIC Educational Resources Information Center

    Department of Energy, Washington, DC. Nuclear Energy Office.

    Radiation is a natural energy force that has been a part of the environment since the Earth was formed. It takes various forms, none of which can be smelled, tasted, seen, heard, or felt. Nevertheless, scientists know what it is, where it comes from, how to measure and detect it, and how it affects people. Cosmic radiation from outer space and…

  15. Radiation Therapy

    MedlinePlus


  16. Radiation Therapy

    MedlinePlus

    ... them from spreading. About half of all cancer patients receive it. The radiation may be external, from special machines, or internal, from radioactive substances that a doctor places inside your body. The type of radiation therapy you receive depends on many factors, including The ...

  17. Radiation Exposure

    MedlinePlus

    ... particles. It occurs naturally in sunlight. Man-made radiation is used in X-rays, nuclear weapons, nuclear power plants and cancer treatment. If you are exposed to small amounts of radiation over a long time, it raises your risk ...

  18. Radiation detector

    DOEpatents

    Fultz, B.T.

    1980-12-05

Apparatus is provided for detecting radiation such as gamma rays and x-rays generated in backscatter Moessbauer effect spectroscopy and x-ray spectrometry, which has a large window for detecting radiation emanating over a wide solid angle from a specimen and which generates substantially the same output pulse height for monoenergetic radiation that passes through any portion of the detection chamber. The apparatus includes a substantially toroidal chamber with conductive walls forming a cathode, and a wire anode extending in a circle within the chamber, with the anode lying closer to the inner side of the toroid, which has the smaller diameter, than to the outer side. The placement of the anode produces an electric field, in a region close to the anode, which has substantially the same gradient in all directions extending radially from the anode, so that the number of avalanche electrons generated by ionizing radiation is independent of the path of the radiation through the chamber.

  19. Radiation detector

    DOEpatents

    Fultz, Brent T.

    1983-01-01

Apparatus is provided for detecting radiation such as gamma rays and X-rays generated in backscatter Mossbauer effect spectroscopy and X-ray spectrometry, which has a large "window" for detecting radiation emanating over a wide solid angle from a specimen and which generates substantially the same output pulse height for monoenergetic radiation that passes through any portion of the detection chamber. The apparatus includes a substantially toroidal chamber with conductive walls forming a cathode, and a wire anode extending in a circle within the chamber, with the anode lying closer to the inner side of the toroid, which has the smaller diameter, than to the outer side. The placement of the anode produces an electric field, in a region close to the anode, which has substantially the same gradient in all directions extending radially from the anode, so that the number of avalanche electrons generated by ionizing radiation is independent of the path of the radiation through the chamber.

  20. Radiation retinopathy.

    PubMed Central

    Zamber, R W; Kinyoun, J L

    1992-01-01

Radiation therapy is effective against many cancerous and noncancerous disease processes. As with other therapeutics, side effects must be anticipated, recognized, and managed appropriately. Radiation retinopathy is a vision-threatening complication of ocular, orbital, periorbital, facial, nasopharyngeal, and cranial irradiation. Factors that appear important in the pathogenesis of radiation retinopathy include total radiation dosage, fraction size, concomitant chemotherapy, and preexisting vascular disorders. Clinical manifestations of the disorder include macular edema and nonproliferative and proliferative retinopathy, similar to changes seen in diabetic retinopathy. Argon laser photocoagulation has proved efficacious for managing macular edema and fibrovascular proliferation in some of these patients. Ongoing basic laboratory and clinical research efforts have led to a better understanding of the pathogenesis, natural history, and treatment response of radiation retinopathy. The ultimate goal of this knowledge is to improve the prevention, recognition, and management of this vision-threatening complication. PMID: 1441494

  1. Diffuse radiation

    NASA Technical Reports Server (NTRS)

    1981-01-01

Diffuse celestial radiation, isotropic at least on a coarse scale, was measured from the soft X-ray region to about 150 MeV, at which energy the intensity falls below that of the galactic emission for most galactic latitudes. The spectral shape, the intensity, and the established degree of isotropy of this diffuse radiation already place severe constraints on the possible explanations for this radiation. Among the extragalactic theories, the more promising explanations of the isotropic diffuse emission appear to be radiation from exceptional galaxies or from matter-antimatter annihilation at the boundaries of superclusters of galaxies of matter and antimatter in baryon-symmetric big bang models. Other possible sources for extragalactic diffuse gamma radiation are discussed, including normal galaxies, clusters of galaxies, primordial cosmic rays interacting with intergalactic matter, primordial black holes, and cosmic ray leakage from galaxies.

  2. Summary of 1990 Code Conference

    SciTech Connect

    Cooper, R.K.; Chan, Kwok-Chi D.

    1990-01-01

The Conference on Codes and the Linear Accelerator Community was held in Los Alamos in January 1990, and had approximately 100 participants. This conference was the second in a series which has as its goal the exchange of information about codes and code practices among those writing and actually using these codes for the design and analysis of linear accelerators and their components. The first conference was held in San Diego in January 1988, and concentrated on beam dynamics codes and Maxwell solvers. This most recent conference concentrated on 3-D codes and techniques to handle the large amounts of data required for three-dimensional problems. In addition to descriptions of codes, their algorithms and implementations, there were a number of papers describing the use of many of the codes. Proceedings of both these conferences are available. 3 refs., 2 tabs.

  3. Fault-Tolerant Coding for State Machines

    NASA Technical Reports Server (NTRS)

    Naegle, Stephanie Taft; Burke, Gary; Newell, Michael

    2008-01-01

    Two reliable fault-tolerant coding schemes have been proposed for state machines that are used in field-programmable gate arrays and application-specific integrated circuits to implement sequential logic functions. The schemes apply to strings of bits in state registers, which are typically implemented in practice as assemblies of flip-flop circuits. If a single-event upset (SEU, a radiation-induced change in the bit in one flip-flop) occurs in a state register, the state machine that contains the register could go into an erroneous state or could hang, by which is meant that the machine could remain in undefined states indefinitely. The proposed fault-tolerant coding schemes are intended to prevent the state machine from going into an erroneous or hang state when an SEU occurs. To ensure reliability of the state machine, the coding scheme for bits in the state register must satisfy the following criteria: 1. All possible states are defined. 2. An SEU brings the state machine to a known state. 3. There is no possibility of a hang state. 4. No false state is entered. 5. An SEU exerts no effect on the state machine. Fault-tolerant coding schemes that have been commonly used include binary encoding and "one-hot" encoding. Binary encoding is the simplest state machine encoding and satisfies criteria 1 through 3 if all possible states are defined. Binary encoding is a binary count of the state machine number in sequence; the table represents an eight-state example. In one-hot encoding, N bits are used to represent N states: All except one of the bits in a string are 0, and the position of the 1 in the string represents the state. With proper circuit design, one-hot encoding can satisfy criteria 1 through 4. Unfortunately, the requirement to use N bits to represent N states makes one-hot coding inefficient.
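The contrast between the two common schemes can be made concrete: under binary encoding every register pattern is a valid state, so an SEU silently moves the machine to a wrong but legal state, whereas under one-hot encoding a single flip leaves zero or two bits set, a pattern the decode logic can reject. A minimal sketch for the eight-state example the text mentions:

```python
def flip_bit(state, i):
    """Model a single-event upset: flip bit i of the state register."""
    return state ^ (1 << i)

# Binary encoding of 8 states in 3 bits: every 3-bit pattern is a valid
# state, so any single upset lands in another *valid* (but wrong) state.
binary_states = set(range(8))
binary_upset = flip_bit(0b010, 0)                    # state 2 becomes state 3
binary_detectable = binary_upset not in binary_states  # upset goes unnoticed

# One-hot encoding of 8 states in 8 bits: exactly one bit set per state.
one_hot_states = {1 << k for k in range(8)}
one_hot_upset = flip_bit(1 << 2, 0)                  # two bits set: 0b00000101
one_hot_detectable = one_hot_upset not in one_hot_states  # invalid, detectable
```

This is why one-hot encoding, with proper circuit design, satisfies criterion 4 (no false state) at the cost of N bits for N states.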

  4. Nevada Test Site Radiation Protection Program

    SciTech Connect

    Radiological Control Managers' Council, Nevada Test Site

    2007-08-09

    Title 10 Code of Federal Regulations (CFR) 835, 'Occupational Radiation Protection', establishes radiation protection standards, limits, and program requirements for protecting individuals from ionizing radiation resulting from the conduct of U.S. Department of Energy (DOE) activities. 10 CFR 835.101(a) mandates that DOE activities be conducted in compliance with a documented Radiation Protection Program (RPP) as approved by DOE. This document promulgates the RPP for the Nevada Test Site (NTS), related (onsite or offsite) DOE National Nuclear Security Administration Nevada Site Office (NNSA/NSO) operations, and environmental restoration offsite projects.

  5. CRETE: Comet RadiativE Transfer and Excitation

    NASA Astrophysics Data System (ADS)

    de Val-Borro, Miguel; Wilson, Thomas G.

    2016-12-01

    CRETE (Comet RadiativE Transfer and Excitation) is a one-dimensional water excitation and radiation transfer code for sub-millimeter wavelengths based on the RATRAN code (ascl:0008.002). The code considers rotational transitions of water molecules given a Haser spherically symmetric distribution for the cometary coma and produces FITS image cubes that can be analyzed with tools like MIRIAD (ascl:1106.007). In addition to collisional processes to excite water molecules, the effect of infrared radiation from the Sun is approximated by effective pumping rates for the rotational levels in the ground vibrational state.
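The Haser distribution mentioned above has a simple closed form for parent molecules: n(r) = Q / (4π v r²) · exp(−r/L), with production rate Q, outflow speed v, and photodissociation scale length L = vτ. A minimal sketch of that profile (the numerical values are assumed, typical cometary numbers, not parameters from the CRETE paper):

```python
import math

def haser_density(r, Q, v, L):
    """Parent-molecule Haser density [m^-3] for a spherically symmetric coma:
    n(r) = Q / (4 pi v r^2) * exp(-r / L), where Q is the production rate
    [1/s], v the outflow speed [m/s], and L = v * lifetime the scale
    length [m]."""
    return Q / (4.0 * math.pi * v * r**2) * math.exp(-r / L)

# Illustrative (assumed) values for a moderately active water coma.
Q = 1e28          # production rate, molecules/s
v = 800.0         # expansion speed, m/s
L = v * 8e4       # scale length for an ~8e4 s photodissociation lifetime
n_100km = haser_density(1e5, Q, v, L)   # density 100 km from the nucleus
```

Well inside the scale length the profile is close to pure inverse-square dilution; the exponential term only matters at cometocentric distances comparable to L.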

  6. Chemical Laser Computer Code Survey,

    DTIC Science & Technology

    1980-12-01

DOCUMENTATION: Resonator Geometry Synthesis Code Requirement (V. L. Gamiz); Incorporate General Resonator into Ray Trace Code (W. H. Southwell); Synthesis Code Development (L. R. Stidham). Survey categories include optics, kinetics, and gasdynamics.

  7. Energy Codes and Standards: Facilities

    SciTech Connect

    Bartlett, Rosemarie; Halverson, Mark A.; Shankle, Diana L.

    2007-01-01

    Energy codes and standards play a vital role in the marketplace by setting minimum requirements for energy-efficient design and construction. They outline uniform requirements for new buildings as well as additions and renovations. This article covers basic knowledge of codes and standards; development processes of each; adoption, implementation, and enforcement of energy codes and standards; and voluntary energy efficiency programs.

  8. Coding Issues in Grounded Theory

    ERIC Educational Resources Information Center

    Moghaddam, Alireza

    2006-01-01

    This paper discusses grounded theory as one of the qualitative research designs. It describes how grounded theory generates from data. Three phases of grounded theory--open coding, axial coding, and selective coding--are discussed, along with some of the issues which are the source of debate among grounded theorists, especially between its…

  9. IRIG Serial Time Code Formats

    DTIC Science & Technology

    2016-08-01

Telecommunications and Timing Group, IRIG Standard 200-16: IRIG Serial Time Code Formats, RCC 200-16, August 2016. Distribution A: approved for public release. Participating organizations include the Arnold Engineering Development Complex and the National Aeronautics and Space Administration.

  10. Coding Major Fields of Study.

    ERIC Educational Resources Information Center

    Bobbitt, L. G.; Carroll, C. D.

The National Center for Education Statistics conducts surveys which require the coding of the respondent's major field of study. This paper presents a new system for the coding of major field of study. It operates on-line in a Computer Assisted Telephone Interview (CATI) environment and allows conversational checks to verify coding directly from…

  11. Improved code-tracking loop

    NASA Technical Reports Server (NTRS)

    Laflame, D. T.

    1980-01-01

Delay-locked loop tracks pseudonoise codes without introducing dc timing errors, because it is not sensitive to gain imbalance between signal processing arms. "Early" and "late" reference codes pass in combined form through both arms, and each arm acts on both codes. Circuit accommodates 1 dB weaker input signals with tracking ability equal to that of tau-dither loops.
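The early-late principle behind such loops is easy to sketch: correlate the received code against replicas advanced and retarded by one chip; at lock the two correlations are equal and the discriminator output is zero, while a timing offset drives it away from zero with the sign needed to correct the delay. A minimal sketch with a random ±1 chip sequence (the code length and one-chip offsets are illustrative, not tied to the tau-dither comparison in the abstract):

```python
import random

random.seed(1)
n = 127
code = [random.choice([-1, 1]) for _ in range(n)]   # pseudonoise chips

def corr(a, b):
    """Correlation of two equal-length chip sequences."""
    return sum(x * y for x, y in zip(a, b))

def shift(seq, k):
    """Circular shift by k chips (positive k = delay)."""
    return seq[-k % len(seq):] + seq[:-k % len(seq)]

# In lock: received code aligned with the local reference.
received = shift(code, 0)
early = corr(received, shift(code, -1))   # advanced replica
late = corr(received, shift(code, 1))     # retarded replica
error = early - late                      # zero at lock, by symmetry

# Out of lock: received code delayed by one chip.
received_late = shift(code, 1)
error_late = (corr(received_late, shift(code, -1))
              - corr(received_late, shift(code, 1)))  # negative: retard local code
```

The sign of the discriminator tells the loop which way to slew the local code clock; the dc-error claim in the abstract concerns making this discriminator insensitive to gain mismatch between the two correlator arms.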

  12. Validation of the BEPLATE code

    SciTech Connect

    Giles, G.E.; Bullock, J.S.

    1997-11-01

    The electroforming simulation code BEPLATE (Boundary Element-PLATE) has been developed and validated for specific applications at Oak Ridge. New areas of application are opening up and more validations are being performed. This paper reports the validation experience of the BEPLATE code on two types of electroforms and describes some recent applications of the code.

  13. Authorship Attribution of Source Code

    ERIC Educational Resources Information Center

    Tennyson, Matthew F.

    2013-01-01

    Authorship attribution of source code is the task of deciding who wrote a program, given its source code. Applications include software forensics, plagiarism detection, and determining software ownership. A number of methods for the authorship attribution of source code have been presented in the past. A review of those existing methods is…

  14. Ptolemy Coding Style

    DTIC Science & Technology

    2014-09-05

    COVERED 00-00-2014 to 00-00-2014 4. TITLE AND SUBTITLE Ptolemy Coding Style 5a. CONTRACT NUMBER 5b. GRANT NUMBER 5c. PROGRAM ELEMENT NUMBER 6...lisp module for GNU Emacs that has appropriate indenting rules. This file works well with Emacs under both Unix and Windows. • testsuite/ptspell is a...Unix. It is much more liberal than the commonly used “GPL” or “GNU Public License,” which encumbers the software and derivative works with the

  15. General 3D Airborne Antenna Radiation Pattern Code Users Manual.

    DTIC Science & Technology

    1983-02-01

    RADC-TR-83-39, Interim Report, February 1983 …UNCLASSIFIED… DISTRIBUTION STATEMENT (of this Repot

  16. Structured error recovery for code-word-stabilized quantum codes

    SciTech Connect

    Li Yunfan; Dumer, Ilya; Grassl, Markus; Pryadko, Leonid P.

    2010-05-15

    Code-word-stabilized (CWS) codes are, in general, nonadditive quantum codes that can correct errors by an exhaustive search of different error patterns, similar to the way that we decode classical nonlinear codes. For an n-qubit quantum code correcting errors on up to t qubits, this brute-force approach consecutively tests different errors of weight t or less and employs a separate n-qubit measurement in each test. In this article, we suggest an error grouping technique that allows one to simultaneously test large groups of errors in a single measurement. This structured error recovery technique exponentially reduces the number of measurements by about 3{sup t} times. While it still leaves exponentially many measurements for a generic CWS code, the technique is equivalent to syndrome-based recovery for the special case of additive CWS codes.

  17. Structured error recovery for code-word-stabilized quantum codes

    NASA Astrophysics Data System (ADS)

    Li, Yunfan; Dumer, Ilya; Grassl, Markus; Pryadko, Leonid P.

    2010-05-01

    Code-word-stabilized (CWS) codes are, in general, nonadditive quantum codes that can correct errors by an exhaustive search of different error patterns, similar to the way that we decode classical nonlinear codes. For an n-qubit quantum code correcting errors on up to t qubits, this brute-force approach consecutively tests different errors of weight t or less and employs a separate n-qubit measurement in each test. In this article, we suggest an error grouping technique that allows one to simultaneously test large groups of errors in a single measurement. This structured error recovery technique exponentially reduces the number of measurements by about 3t times. While it still leaves exponentially many measurements for a generic CWS code, the technique is equivalent to syndrome-based recovery for the special case of additive CWS codes.
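
    The measurement counts in the abstract are easy to tabulate. A back-of-envelope sketch (generic combinatorics, not code from the paper): a weight-≤t Pauli error on n qubits chooses up to t positions and one of three Paulis (X, Y, Z) per position, and the grouping technique cuts the number of measurements by roughly 3^t:

    ```python
    from math import comb

    def num_error_patterns(n, t):
        """Pauli errors of weight <= t on n qubits: positions times 3 Paulis each."""
        return sum(comb(n, w) * 3**w for w in range(t + 1))

    def grouped_measurements(n, t):
        """Rough measurement count once ~3^t error patterns share one measurement."""
        return num_error_patterns(n, t) // 3**t
    ```

    For example, 5 qubits with t = 1 give 1 + 5·3 = 16 brute-force measurements, which grouping reduces to about 5.
    
    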

  18. Low Density Parity Check Codes: Bandwidth Efficient Channel Coding

    NASA Technical Reports Server (NTRS)

    Fong, Wai; Lin, Shu; Maki, Gary; Yeh, Pen-Shu

    2003-01-01

    Low Density Parity Check (LDPC) Codes provide near-Shannon-capacity performance for NASA missions. These codes have high coding rates, R = 0.82 and 0.875, with moderate code lengths, n = 4096 and 8176. Their decoders have inherently parallel structures, which allows for high-speed implementation. Two codes based on Euclidean Geometry (EG) were selected for flight ASIC implementation. These codes are cyclic and quasi-cyclic in nature and therefore have a simple encoder structure, which yields power and size benefits. They also have a large minimum distance, as large as dmin = 65, giving them powerful error-correcting capabilities and very low error floors. This paper will present development of the LDPC flight encoder and decoder, its applications, and status.
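
    The parity checks underlying any LDPC decoder can be illustrated with a toy parity-check matrix. The H below is the small (7,4) Hamming code standing in for the much larger, sparser EG-based matrices described above; a valid codeword has zero syndrome, and a single flipped bit yields a syndrome that locates it:

    ```python
    def syndrome(H, word):
        """Each row of H must have even overlap with a valid codeword (mod-2 check)."""
        return [sum(h * c for h, c in zip(row, word)) % 2 for row in H]

    # Toy (7,4) Hamming parity-check matrix standing in for a large, sparse LDPC H.
    H = [
        [1, 0, 1, 0, 1, 0, 1],
        [0, 1, 1, 0, 0, 1, 1],
        [0, 0, 0, 1, 1, 1, 1],
    ]
    ```

    With this particular H, reading the three syndrome bits as a binary number gives the 1-based position of a single-bit error.
    
    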

  19. New quantum codes constructed from quaternary BCH codes

    NASA Astrophysics Data System (ADS)

    Xu, Gen; Li, Ruihu; Guo, Luobin; Ma, Yuena

    2016-10-01

    In this paper, we first study the construction of new quantum error-correcting codes (QECCs) from three classes of quaternary imprimitive BCH codes. The improved maximal designed distances of these narrow-sense imprimitive Hermitian dual-containing quaternary BCH codes are determined to be much larger than the results given by Aly et al. (IEEE Trans Inf Theory 53:1183-1188, 2007) for each code length. Thus, families of new QECCs are obtained, and the constructed QECCs have larger distance than those in the previous literature. Secondly, we apply a combinatorial construction to the imprimitive BCH codes with their corresponding primitive counterparts and construct many new linear quantum codes with good parameters, some of which have parameters exceeding the finite Gilbert-Varshamov bound for linear quantum codes.

  20. Quantum Codes From Cyclic Codes Over The Ring R2

    NASA Astrophysics Data System (ADS)

    Altinel, Alev; Güzeltepe, Murat

    2016-10-01

    Let R2 denote the ring F2 + μF2 + υF2 + μυF2 + wF2 + μwF2 + υwF2 + μυwF2. In this study, we construct quantum codes from cyclic codes over the ring R2, for arbitrary length n, with the restrictions μ2 = 0, υ2 = 0, w2 = 0, μυ = υμ, μw = wμ, υw = wυ and μ(υw) = (μυ)w. Also, we give a necessary and sufficient condition for cyclic codes over R2 to contain their duals. Finally, we obtain the parameters of quantum error-correcting codes from cyclic codes over R2 and give an example of quantum error-correcting codes from cyclic codes over R2.

  1. [Radiation carcinogenesis].

    PubMed

    Hosoi, Yoshio

    2013-11-01

    Misrepair of DNA damage induced by ionizing radiation is a potential cause of carcinogenesis following exposure to radiation. Radiation exposure increases the incidence of the same types of mutations that occur spontaneously in a given population. A high incidence of DNA double-strand breaks is characteristic of damage by ionizing radiation compared with that induced by other environmental mutagens. In China, residents living in areas with high-level background radiation (6 mSv/y) had a significantly higher frequency of dicentric and ring chromosomes than residents living in control areas (2 mSv/y). Radiation-associated increases in risk were seen for most sites. Gender-averaged excess absolute risk rates estimated at age 70, after exposure at age 30, differ among sites; the risks of gastric cancer, breast cancer, colon cancer, and lung cancer were highly increased, in that order. Latent periods for the development of leukemia and thyroid cancer after radiation exposure at ages younger than 18 were shorter than those for other solid cancers.

  2. Measuring Diagnoses: ICD Code Accuracy

    PubMed Central

    O'Malley, Kimberly J; Cook, Karon F; Price, Matt D; Wildes, Kimberly Raiford; Hurdle, John F; Ashton, Carol M

    2005-01-01

    Objective To examine potential sources of errors at each step of the described inpatient International Classification of Diseases (ICD) coding process. Data Sources/Study Setting The use of disease codes from the ICD has expanded from classifying morbidity and mortality information for statistical purposes to diverse sets of applications in research, health care policy, and health care finance. By describing a brief history of ICD coding, detailing the process for assigning codes, identifying where errors can be introduced into the process, and reviewing methods for examining code accuracy, we help code users more systematically evaluate code accuracy for their particular applications. Study Design/Methods We summarize the inpatient ICD diagnostic coding process from patient admission to diagnostic code assignment. We examine potential sources of errors at each step and offer code users a tool for systematically evaluating code accuracy. Principal Findings Main error sources along the “patient trajectory” include amount and quality of information at admission, communication among patients and providers, the clinician's knowledge and experience with the illness, and the clinician's attention to detail. Main error sources along the “paper trail” include variance in the electronic and written records, coder training and experience, facility quality-control efforts, and unintentional and intentional coder errors, such as misspecification, unbundling, and upcoding. Conclusions By clearly specifying the code assignment process and heightening their awareness of potential error sources, code users can better evaluate the applicability and limitations of codes for their particular situations. ICD codes can then be used in the most appropriate ways. PMID:16178999

  3. Genetic code for sine

    NASA Astrophysics Data System (ADS)

    Abdullah, Alyasa Gan; Wah, Yap Bee

    2015-02-01

    The computation of the approximate values of the trigonometric sines was discovered by Bhaskara I (c. 600-c. 680), a seventh-century Indian mathematician, and is known as Bhaskara I's sine approximation formula. The formula is given in his treatise titled Mahabhaskariya. In the 14th century, Madhava of Sangamagrama, a Kerala mathematician-astronomer, constructed the table of trigonometric sines of various angles. Madhava's table gives the measure of angles in arcminutes, arcseconds and sixtieths of an arcsecond. The search for more accurate formulas led to the discovery of the power series expansion by Madhava of Sangamagrama (c. 1350-c. 1425), the founder of the Kerala school of astronomy and mathematics. In 1715, the Taylor series was introduced by Brook Taylor, an English mathematician. If the Taylor series is centered at zero, it is called a Maclaurin series, named after the Scottish mathematician Colin Maclaurin. Some of the important Maclaurin series expansions include trigonometric functions. This paper introduces the genetic code of the sine of an angle without using power series expansion. The genetic code using the square root approach reveals the pattern in the signs (plus, minus) and sequence of numbers in the sine of an angle. The square root approach complements the Pythagoras method, provides a better understanding of calculating an angle and will be useful for teaching the concepts of angles in trigonometry.
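
    Bhaskara I's approximation mentioned above is compact enough to state directly: sin x ≈ 16x(π − x) / (5π² − 4x(π − x)) on [0, π], accurate to about 1.6 × 10⁻³. A quick check against the library sine:

    ```python
    import math

    def bhaskara_sin(x):
        """Bhaskara I's 7th-century rational approximation to sin(x), 0 <= x <= pi."""
        return 16 * x * (math.pi - x) / (5 * math.pi**2 - 4 * x * (math.pi - x))
    ```

    The formula is exact at x = 0, π/2, and π, and its worst-case error over the interval is well under 0.002.
    
    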

  4. Radiation dosimeter

    DOEpatents

    Fox, R.J.

    1981-09-01

    A radiation detector readout circuit is provided which produces a radiation dose-rate readout from a detector even though the detector output may be highly energy dependent. A linear charge amplifier including an output charge pump circuit amplifies the charge signal pulses from the detector and pumps the charge into a charge storage capacitor. The discharge rate of the capacitor through a resistor is controlled to provide a time-dependent voltage which when integrated provides an output proportional to the dose-rate of radiation detected by the detector. This output may be converted to digital form for readout on a digital display.

  5. Radiation dosimeter

    DOEpatents

    Fox, Richard J.

    1983-01-01

    A radiation detector readout circuit is provided which produces a radiation dose-rate readout from a detector even though the detector output may be highly energy dependent. A linear charge amplifier including an output charge pump circuit amplifies the charge signal pulses from the detector and pumps the charge into a charge storage capacitor. The discharge rate of the capacitor through a resistor is controlled to provide a time-dependent voltage which when integrated provides an output proportional to the dose-rate of radiation detected by the detector. This output may be converted to digital form for readout on a digital display.
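
    The readout principle in this patent (charge pulses pumped onto a storage capacitor, controlled discharge through a resistor, integration of the resulting voltage) can be sketched numerically. All component values below (q, C, R, pulse rate) are illustrative assumptions, not figures from the patent; in steady state each pulse contributes about q·R to the integral, so the output is proportional to the pulse count, i.e. to the detected dose:

    ```python
    import math

    def integrated_readout(q, rate, C, R, t_end, dt):
        """Pump charge pulses of size q (at `rate` Hz) onto capacitor C, let it
        discharge through R, and integrate the capacitor voltage over time."""
        v = 0.0
        total = 0.0
        next_pulse = 0.0
        decay = math.exp(-dt / (R * C))
        for k in range(int(t_end / dt)):
            t = k * dt
            if t >= next_pulse:
                v += q / C              # charge pump deposits one detector pulse
                next_pulse += 1.0 / rate
            total += v * dt             # integrator output
            v *= decay                  # RC discharge
        return total
    ```

    With 100 pulses of 1 pC into 1 nF discharging through 100 kΩ, the integrated output lands near 100 · q · R = 1 × 10⁻⁵ V·s, confirming the linearity the patent relies on.
    
    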

  6. DOE 2013 occupational radiation exposure

    SciTech Connect

    none,

    2014-11-01

    The Office of Analysis within the U.S. Department of Energy (DOE) Office of Environment, Health, Safety and Security (EHSS) publishes the annual DOE Occupational Radiation Exposure Report to provide an overview of the status of radiation protection practices at DOE (including the National Nuclear Security Administration [NNSA]). The DOE 2013 Occupational Radiation Exposure Report provides an evaluation of DOE-wide performance regarding compliance with Title 10, Code of Federal Regulations (C.F.R.), Part 835, Occupational Radiation Protection dose limits and as low as reasonably achievable (ALARA) process requirements. In addition, the report provides data to DOE organizations responsible for developing policies for protection of individuals from the adverse health effects of radiation. The report provides a summary and an analysis of occupational radiation exposure information from the monitoring of individuals involved in DOE activities. Over the past five-year period, the occupational radiation exposure information has been analyzed in terms of aggregate data, dose to individuals, and dose by site.

  7. Determinate-state convolutional codes

    NASA Technical Reports Server (NTRS)

    Collins, O.; Hizlan, M.

    1991-01-01

    A determinate state convolutional code is formed from a conventional convolutional code by pruning away some of the possible state transitions in the decoding trellis. The type of staged power transfer used in determinate state convolutional codes proves to be an extremely efficient way of enhancing the performance of a concatenated coding system. The decoder complexity is analyzed along with the free distances of these new codes, and extensive simulation results are provided of their performance at the low signal-to-noise ratios where a real communication system would operate. Concise, practical examples are provided.
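
    For background, a conventional convolutional encoder is easy to sketch; determinate-state codes are formed by pruning trellis transitions of such a code. The rate-1/2, constraint-length-3 encoder below (generators 7 and 5 octal) is the textbook example, not one of the codes from this paper:

    ```python
    def conv_encode(bits, g1=0b111, g2=0b101, k=3):
        """Rate-1/2 convolutional encoder, constraint length 3, generators (7, 5) octal."""
        state = 0
        out = []
        for b in bits:
            state = ((state << 1) | b) & ((1 << k) - 1)  # shift the new bit in
            out.append(bin(state & g1).count("1") % 2)   # parity over taps of g1
            out.append(bin(state & g2).count("1") % 2)   # parity over taps of g2
        return out
    ```

    Encoding the input 1011 from the all-zero state produces the standard output sequence 11 10 00 01.
    
    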

  8. Radiation Hydrodynamics

    SciTech Connect

    Castor, J I

    2003-10-16

    The discipline of radiation hydrodynamics is the branch of hydrodynamics in which the moving fluid absorbs and emits electromagnetic radiation, and in so doing modifies its dynamical behavior. That is, the net gain or loss of energy by parcels of the fluid material through absorption or emission of radiation is sufficient to change the pressure of the material, and therefore change its motion; alternatively, the net momentum exchange between radiation and matter may alter the motion of the matter directly. Ignoring the radiation contributions to energy and momentum will give a wrong prediction of the hydrodynamic motion when the correct description is radiation hydrodynamics. Of course, there are circumstances when a large quantity of radiation is present, yet can be ignored without causing the model to be in error. This happens when radiation from an exterior source streams through the problem, but the latter is so transparent that the energy and momentum coupling is negligible. Everything we say about radiation hydrodynamics applies equally well to neutrinos and photons (apart from the Einstein relations, specific to bosons), but in almost every area of astrophysics neutrino hydrodynamics is ignored, simply because the systems are exceedingly transparent to neutrinos, even though the energy flux in neutrinos may be substantial. Another place where we can do ''radiation hydrodynamics'' without using any sophisticated theory is deep within stars or other bodies, where the material is so opaque to the radiation that the mean free path of photons is entirely negligible compared with the size of the system, the distance over which any fluid quantity varies, and so on. In this case we can suppose that the radiation is in equilibrium with the matter locally, and its energy, pressure and momentum can be lumped in with those of the rest of the fluid. That is, it is no more necessary to distinguish photons from atoms, nuclei and electrons, than it is to distinguish

  9. Kinetic neoclassical calculations of impurity radiation profiles

    SciTech Connect

    Stotler, D. P.; Battaglia, D. J.; Hager, R.; Kim, K.; Koskela, T.; Park, G.; Reinke, M. L.

    2016-12-30

    Modifications of the drift-kinetic transport code XGC0 to include the transport, ionization, and recombination of individual charge states, as well as the associated radiation, are described. The code is first applied to a simulation of an NSTX H-mode discharge with carbon impurity to demonstrate the approach to coronal equilibrium. The effects of neoclassical phenomena on the radiated power profile are examined sequentially through the activation of individual physics modules in the code. Orbit squeezing and the neoclassical inward pinch result in increased radiation for temperatures above a few hundred eV and changes to the ratios of charge state emissions at a given electron temperature. As a result, analogous simulations with a neon impurity yield qualitatively similar results.

  10. Kinetic neoclassical calculations of impurity radiation profiles

    DOE PAGES

    Stotler, D. P.; Battaglia, D. J.; Hager, R.; ...

    2016-12-30

    Modifications of the drift-kinetic transport code XGC0 to include the transport, ionization, and recombination of individual charge states, as well as the associated radiation, are described. The code is first applied to a simulation of an NSTX H-mode discharge with carbon impurity to demonstrate the approach to coronal equilibrium. The effects of neoclassical phenomena on the radiated power profile are examined sequentially through the activation of individual physics modules in the code. Orbit squeezing and the neoclassical inward pinch result in increased radiation for temperatures above a few hundred eV and changes to the ratios of charge state emissions at a given electron temperature. As a result, analogous simulations with a neon impurity yield qualitatively similar results.

  11. SKIRT: Stellar Kinematics Including Radiative Transfer

    NASA Astrophysics Data System (ADS)

    Baes, Maarten; Dejonghe, Herwig; Davies, Jonathan

    2011-09-01

    SKIRT is a radiative transfer code based on the Monte Carlo technique. The name SKIRT, acronym for Stellar Kinematics Including Radiative Transfer, reflects the original motivation for its creation: it has been developed to study the effects of dust absorption and scattering on the observed kinematics of dusty galaxies. In a second stage, the SKIRT code was extended with a module to self-consistently calculate the dust emission spectrum under the assumption of local thermal equilibrium. This LTE version of SKIRT has been used to model the dust extinction and emission of various types of galaxies, as well as circumstellar discs and clumpy tori around active galactic nuclei. A new, extended version of the SKIRT code can perform efficient 3D radiative transfer calculations including a self-consistent calculation of the dust temperature distribution and the associated FIR/submm emission with a full incorporation of the emission of transiently heated grains and PAH molecules.

  12. Sunrise: Radiation transfer through interstellar dust

    NASA Astrophysics Data System (ADS)

    Jonsson, Patrik

    2013-03-01

    Sunrise is a Monte Carlo radiation transfer code for calculating absorption and scattering of light to study the effects of dust in hydrodynamic simulations of interacting galaxies. It uses an adaptive mesh refinement grid to describe arbitrary geometries of emitting and absorbing/scattering media, with spatial dynamical range exceeding 10^4; it can efficiently generate images of the emerging radiation at arbitrary points in space and spectral energy distributions of simulated galaxies run with the Gadget, Gasoline, Arepo, Enzo or ART codes. In addition to the monochromatic radiative transfer typically used by Monte Carlo codes, Sunrise can propagate a range of wavelengths simultaneously. This "polychromatic" algorithm gives significant improvements in efficiency and accuracy when spectral features are calculated.
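
    The Monte Carlo technique these radiative transfer codes use can be shown in miniature: sample an exponential free path for each photon and count how many cross a purely absorbing slab, where transmission should approach e^(−τ). This is a generic illustration of the method, not Sunrise code:

    ```python
    import math
    import random

    def transmitted_fraction(tau, n_photons=200_000, seed=42):
        """Fraction of photons crossing a purely absorbing slab of optical depth tau."""
        rng = random.Random(seed)
        passed = 0
        for _ in range(n_photons):
            # Exponential free path in optical-depth units: s = -ln(1 - xi), xi in [0, 1)
            if -math.log(1.0 - rng.random()) > tau:
                passed += 1
        return passed / n_photons
    ```

    Real codes like Sunrise add scattering, re-emission, 3D grids, and wavelength dependence on top of this same path-sampling core.
    
    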

  13. SSC environmental radiation shielding

    SciTech Connect

    Jackson, J.D.

    1987-07-01

    The environmental radiation shielding requirements of the SSC have been evaluated using currently available computational tools that incorporate the well known processes of energy loss and degradation of high energy particles into Monte Carlo computer codes. These tools permit determination of isodose contours in the matter surrounding a source point and therefore the specification of minimum thicknesses or extents of shielding in order to assure annual dose equivalents less than some specified design amount. For the general public the annual dose equivalent specified in the design is 10 millirem, small compared to the dose from naturally occurring radiation. The types of radiation fall into two classes for the purposes of shielding determinations: hadrons and muons. The sources of radiation at the SSC of concern for the surrounding environment are the interaction regions, the specially designed beam dumps into which the beams are dumped from time to time, and beam clean-up regions where stops remove the beam halo in order to reduce experimental backgrounds. A final, unlikely source of radiation considered is the accidental loss of the full beam at some point around the ring. Conservative choices of a luminosity of 10{sup 34} cm{sup {minus}2}s{sup {minus}1} and a beam current three times design have been made in calculating the required shielding and boundaries of the facility. In addition to determination of minimum distances for the annual dose equivalents, the question of possible radioactivity produced in nearby wells or in municipal water supplies is addressed. The designed shielding distances and beam dumps are such that the induced radioactivity in ground water is safely smaller than the levels permitted by EPA and international agencies.

  14. Coded Aperture Imaging for Fluorescent X-rays-Biomedical Applications

    SciTech Connect

    Haboub, Abdel; MacDowell, Alastair; Marchesini, Stefano; Parkinson, Dilworth

    2013-06-01

    Employing a coded-aperture pattern in front of a pixelated charge-coupled device (CCD) detector allows for imaging of fluorescent x-rays (6-25 keV) emitted from samples irradiated with x-rays. Coded apertures encode the angular direction of x-rays and allow for a large numerical-aperture x-ray imaging system. An algorithm was developed to generate the self-supported Non-Two-Holes-Touching (NTHT) coded-aperture pattern. Algorithms to reconstruct the x-ray image from the recorded encoded pattern were developed by means of modeling and confirmed by experiments. Samples were irradiated by monochromatic synchrotron x-ray radiation, and fluorescent x-rays from several different test metal samples were imaged through the newly developed coded-aperture imaging system. By choice of the exciting energy, the different metals were speciated.
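
    A one-dimensional toy version of coded-aperture imaging shows the encode/decode idea: the detector records the source correlated with the mask, and correlating again with the mask recovers the source position. The mask below is a quadratic-residue pattern mod 7 (a perfect difference set, so its off-peak autocorrelation is flat), not the NTHT pattern from the paper:

    ```python
    # Open slits at the quadratic residues mod 7: positions {1, 2, 4}.
    MASK = [0, 1, 1, 0, 1, 0, 0]

    def encode(source, mask):
        """Detector pattern: each source position casts a shifted mask shadow."""
        n = len(mask)
        det = [0] * n
        for s, intensity in enumerate(source):
            for i in range(n):
                det[(i + s) % n] += intensity * mask[i]
        return det

    def decode(det, mask):
        """Cross-correlate the detector with the mask; peaks mark source positions."""
        n = len(mask)
        return [sum(det[(i + s) % n] * mask[i] for i in range(n)) for s in range(n)]
    ```

    A point source of intensity 5 at position 3 decodes to a clean peak of 15 over a flat background of 5, because the difference-set mask has unit off-peak autocorrelation.
    
    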

  15. Boltzmann Transport Code Update: Parallelization and Integrated Design Updates

    NASA Technical Reports Server (NTRS)

    Heinbockel, J. H.; Nealy, J. E.; DeAngelis, G.; Feldman, G. A.; Chokshi, S.

    2003-01-01

    The ongoing efforts at developing a web site for radiation analysis are expected to result in increased usage of the High Charge and Energy Transport Code HZETRN, making it desirable to perform the requested calculations quickly and efficiently. Therefore the question arose: could the implementation of parallel processing speed up the calculations required? To answer this question, two modifications of the HZETRN computer code were created. The first modification selected the shield materials Al(2219), then polyethylene, then Al(2219); this modified Fortran code was labeled 1SSTRN.F. The second modification considered the shield materials CO2 and Martian regolith; this modified Fortran code was labeled MARSTRN.F.

  16. Circular codes, symmetries and transformations.

    PubMed

    Fimmel, Elena; Giannerini, Simone; Gonzalez, Diego Luis; Strüngmann, Lutz

    2015-06-01

    Circular codes, putative remnants of primeval comma-free codes, have gained considerable attention in recent years. In fact they represent a second kind of genetic code, potentially involved in detecting and maintaining the normal reading frame in protein-coding sequences. The discovery of a universal code across species has raised many theoretical and experimental questions. However, there is a key aspect relating circular codes to symmetries and transformations that remains to a large extent unexplored. In this article we address this issue by studying the symmetries and transformations that connect different circular codes. The main result is that the class of 216 C3 maximal self-complementary codes can be partitioned into 27 equivalence classes defined by a particular set of transformations. We show that such transformations can be put in a group-theoretic framework with an intuitive geometric interpretation. More general mathematical results about symmetry transformations, valid for any kind of circular code, are also presented. Our results pave the way to the study of the biological consequences of the mathematical structure behind circular codes and contribute to shed light on the evolutionary steps that led to the observed symmetries of present codes.
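
    The comma-free property mentioned above is simple to test mechanically: no codeword may appear straddling the boundary of two concatenated codewords. A sketch with toy trinucleotide codes (the real C3 maximal codes are 20-codeword sets; these two-word examples are only illustrative):

    ```python
    def is_comma_free(code):
        """A trinucleotide code is comma-free if no codeword appears out of frame
        straddling the boundary of any two concatenated codewords."""
        codeset = set(code)
        for u in code:
            for v in code:
                w = u + v
                for i in (1, 2):            # the two out-of-frame windows
                    if w[i:i + 3] in codeset:
                        return False
        return True
    ```

    For instance, {AAT, ACT} is comma-free, while {AAA} is not, since AAA read out of frame in AAAAAA is again AAA.
    
    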

  17. Numerical Methods for Radiation Magnetohydrodynamics in Astrophysics

    SciTech Connect

    Klein, R I; Stone, J M

    2007-11-20

    We describe numerical methods for solving the equations of radiation magnetohydrodynamics (MHD) for astrophysical fluid flow. Such methods are essential for the investigation of the time-dependent and multidimensional dynamics of a variety of astrophysical systems, although our particular interest is motivated by problems in star formation. Over the past few years, the authors have been members of two parallel code development efforts, and this review reflects that organization. In particular, we discuss numerical methods for MHD as implemented in the Athena code, and numerical methods for radiation hydrodynamics as implemented in the Orion code. We discuss the challenges introduced by the use of adaptive mesh refinement in both codes, as well as the most promising directions for future developments.

  18. Making your code citable with the Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, Alice; DuPrie, Kimberly; Schmidt, Judy; Berriman, G. Bruce; Hanisch, Robert J.; Mink, Jessica D.; Nemiroff, Robert J.; Shamir, Lior; Shortridge, Keith; Taylor, Mark B.; Teuben, Peter J.; Wallin, John F.

    2016-01-01

    The Astrophysics Source Code Library (ASCL, ascl.net) is a free online registry of codes used in astronomy research. With nearly 1,200 codes, it is the largest indexed resource for astronomy codes in existence. Established in 1999, it offers software authors a path to citation of their research codes even without publication of a paper describing the software, and offers scientists a way to find codes used in refereed publications, thus improving the transparency of the research. It also provides a method to quantify the impact of source codes in a fashion similar to the science metrics of journal articles. Citations using ASCL IDs are accepted by major astronomy journals and if formatted properly are tracked by ADS and other indexing services. The number of citations to ASCL entries increased sharply from 110 citations in January 2014 to 456 citations in September 2015. The percentage of code entries in ASCL that were cited at least once rose from 7.5% in January 2014 to 17.4% in September 2015. The ASCL's mid-2014 infrastructure upgrade added an easy entry submission form, more flexible browsing, search capabilities, and an RSS feeder for updates. A Changes/Additions form added this past fall lets authors submit links for papers that use their codes for addition to the ASCL entry even if those papers don't formally cite the codes, thus increasing the transparency of that research and capturing the value of their software to the community.

  19. Practices in Code Discoverability: Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, A.; Teuben, P.; Nemiroff, R. J.; Shamir, L.

    2012-09-01

    Here we describe the Astrophysics Source Code Library (ASCL), which takes an active approach to sharing astrophysics source code. ASCL's editor seeks out both new and old peer-reviewed papers that describe methods or experiments that involve the development or use of source code, and adds entries for the found codes to the library. This approach ensures that source codes are added without requiring authors to actively submit them, resulting in a comprehensive listing that covers a significant number of the astrophysics source codes used in peer-reviewed studies. The ASCL now has over 340 codes in it and continues to grow; in 2011 it has added an average of 19 codes per month. An advisory committee has been established to provide input and guide the development and expansion of the new site, and a marketing plan has been developed and is being executed. All ASCL source codes have been used to generate results published in or submitted to a refereed journal and are freely available either via a download site or from an identified source. This paper provides the history and description of the ASCL. It lists the requirements for including codes, examines the advantages of the ASCL, and outlines some of its future plans.

  20. Radiation therapy

    MedlinePlus

    ... Intensity-modulated radiotherapy (IMRT) Image-guided radiotherapy (IGRT) Proton therapy is another kind of radiation used to ... than using x-rays to destroy cancer cells, proton therapy uses a beam of special particles called ...

  1. RADIATION DETECTOR

    DOEpatents

    Wilson, H.N.; Glass, F.M.

    1960-05-10

    A radiation detector is described wherein a condenser is directly connected to the electrodes to perform the dual function of a guard ring and to provide capacitance coupling for resetting the detector system.

  2. Radiation Basics

    MedlinePlus

    ... of the heaviest radioactive elements, such as uranium , radium and polonium. Even though alpha particles are very ... is roughly the activity of one gram of Radium-226. Curies are not used to measure radiation ...

  3. Radiation Transport

    SciTech Connect

    Urbatsch, Todd James

    2015-06-15

    We present an overview of radiation transport, covering terminology, blackbody radiation, opacities, Boltzmann transport theory, and approximations to the transport equation. Next we introduce several transport methods. We present a section on Caseology, observing transport boundary layers. We briefly broach topics of software development, including verification and validation, and we close with a section on high-energy-density experiments that highlight and support radiation transport.
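
    The blackbody piece of such an overview can be checked numerically: integrating the dimensionless Planck spectrum x³/(eˣ − 1) over x should give π⁴/15, the constant behind the Stefan-Boltzmann T⁴ law. A generic sketch (standard physics, not code from the overview itself):

    ```python
    import math

    def planck_integral(n=20000, x_max=40.0):
        """Midpoint-rule integral of the dimensionless Planck spectrum x^3/(e^x - 1)."""
        h = x_max / n
        total = 0.0
        for k in range(n):
            x = (k + 0.5) * h
            total += x**3 / math.expm1(x) * h   # expm1 is accurate for small x
        return total
    ```

    The tail beyond x_max = 40 contributes only of order e⁻⁴⁰ and is safely ignored; the result matches π⁴/15 ≈ 6.4939 to several digits.
    
    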

  4. Radiation enteritis.

    PubMed

    Harb, Ali H; Abou Fadel, Carla; Sharara, Ala I

    2014-01-01

    Radiation enteritis continues to be a major health concern in recipients of radiation therapy. The incidence of radiation enteritis is expected to continue to rise during the coming years paralleling the unprecedented use of radiotherapy in pelvic cancers. Radiation enteritis can present as either an acute or chronic syndrome. The acute form presents within hours to days of radiation exposure and typically resolves within few weeks. The chronic form may present as early as 2 months or as long as 30 years after exposure. Risk factors can be divided into patient and treatment-related factors. Chronic radiation enteritis is characterized by progressive obliterative endarteritis with exaggerated submucosal fibrosis and can manifest by stricturing, formation of fistulae, local abscesses, perforation, and bleeding. In the right clinical context, diagnosis can be confirmed by cross-sectional imaging, flexible or video capsule endoscopy. Present treatment strategies are directed primarily towards symptom relief and management of emerging complications. Recently, however, there has been a shift towards rational drug design based on improved understanding of the molecular basis of disease in an effort to limit the fibrotic process and prevent organ damage.

  5. Telescope Adaptive Optics Code

    SciTech Connect

    Phillion, D.

    2005-07-28

    The Telescope AO Code has general adaptive optics capabilities plus specialized models for three telescopes with either adaptive optics or active optics systems. First, it has the capability to generate either single-layer or distributed Kolmogorov turbulence phase screens using the FFT. Missing low-order spatial frequencies are added using the Karhunen-Loeve expansion. The phase structure curve is extremely close to the theoretical. Secondly, it has the capability to simulate an adaptive optics control system. The default parameters are those of the Keck II adaptive optics system. Thirdly, it has a general wave optics capability to model the science camera halo due to scintillation from atmospheric turbulence and the telescope optics. Although this capability was implemented for the Gemini telescopes, the only default parameter specific to the Gemini telescopes is the primary mirror diameter. Finally, it has a model for the LSST active optics alignment strategy. This last model is highly specific to the LSST.
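    The FFT phase-screen technique can be sketched as follows: filter complex white noise with the square root of the Kolmogorov power spectrum and transform back. This is an illustrative sketch, not the Telescope AO Code implementation:

    ```python
    import numpy as np

    def kolmogorov_phase_screen(n, pixel_scale, r0, seed=0):
        """FFT-based Kolmogorov phase screen (radians) on an n x n grid.
        pixel_scale is m/pixel and r0 is the Fried parameter in m.
        Spatial frequencies below 1/(n*pixel_scale) are missing, which is
        why the code above adds low orders back via a Karhunen-Loeve
        expansion."""
        rng = np.random.default_rng(seed)
        f = np.fft.fftfreq(n, d=pixel_scale)   # spatial frequencies (1/m)
        kx, ky = np.meshgrid(f, f)
        k = np.hypot(kx, ky)
        k[0, 0] = 1.0                          # dummy value; piston zeroed below
        # Kolmogorov phase power spectrum: 0.023 r0^(-5/3) k^(-11/3)
        psd = 0.023 * r0 ** (-5.0 / 3.0) * k ** (-11.0 / 3.0)
        psd[0, 0] = 0.0                        # remove the piston mode
        noise = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
        df = 1.0 / (n * pixel_scale)           # frequency grid spacing
        return np.fft.fft2(noise * np.sqrt(psd) * df).real
    ```

    Because the piston mode is zeroed, each screen has exactly zero mean; different seeds give statistically independent realizations.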

  6. Code lock with microcircuit

    NASA Astrophysics Data System (ADS)

    Korobka, A.; May, I.

    1985-01-01

    A code lock with a microcircuit was invented which contains only a very few components. Two D-triggers control the state of two identical transistors. When both transistors are turned on simultaneously, the transistor VS1 is turned on so that the electromagnet YA1 pulls in the bolt and the door opens. This happens only when a logic 1 appears at the inverted output of the first trigger and at the direct output of the second one. After the door is opened, a button on it resets the contacts to return both triggers to their original state. The electromagnet is designed to produce the necessary pull force and sufficient power when under rectified 127 V line voltage, with the neutral wire of the lock circuit always connected to the - terminal of the power supply.
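    The unlock condition described above reduces to a two-input logic function of the trigger outputs; a trivial truth-table model (names are illustrative):

    ```python
    def unlocked(q1: bool, q2: bool) -> bool:
        """Door opens only when the inverted output of the first trigger
        (not q1) AND the direct output of the second trigger (q2) are
        both at logic 1, turning on both transistors and energizing the
        electromagnet YA1."""
        return (not q1) and q2
    ```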

  7. Peripheral coding of taste

    PubMed Central

    Liman, Emily R.; Zhang, Yali V.; Montell, Craig

    2014-01-01

    Five canonical tastes, bitter, sweet, umami (amino acid), salty and sour (acid) are detected by animals as diverse as fruit flies and humans, consistent with a near universal drive to consume fundamental nutrients and to avoid toxins or other harmful compounds. Surprisingly, despite this strong conservation of basic taste qualities between vertebrates and invertebrates, the receptors and signaling mechanisms that mediate taste in each are highly divergent. The identification over the last two decades of receptors and other molecules that mediate taste has led to stunning advances in our understanding of the basic mechanisms of transduction and coding of information by the gustatory systems of vertebrates and invertebrates. In this review, we discuss recent advances in taste research, mainly from the fly and mammalian systems, and we highlight principles that are common across species, despite stark differences in receptor types. PMID:24607224

  8. Surface acoustic wave coding for orthogonal frequency coded devices

    NASA Technical Reports Server (NTRS)

    Malocha, Donald (Inventor); Kozlovski, Nikolai (Inventor)

    2011-01-01

    Methods and systems for coding SAW OFC devices to mitigate code collisions in a wireless multi-tag system, with each device producing plural stepped frequencies as an OFC signal with a chip offset delay to increase code diversity. A method for assigning a different OFC to each device includes using a matrix based on the number of OFCs needed and the number of chips per code, populating each matrix cell with an OFC chip, and assigning the codes from the matrix to the devices. The asynchronous passive multi-tag system includes plural surface acoustic wave devices each producing a different OFC signal having the same number of chips and including a chip offset time delay, an algorithm for assigning OFCs to each device, and a transceiver to transmit an interrogation signal and receive OFC signals in response with minimal code collisions during transmission.
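    The matrix-based assignment can be sketched with a cyclic (Latin-square) construction, in which each device gets a distinct ordering of the frequency chips and no two devices occupy the same frequency in the same time slot. This is an illustrative stand-in, not the patented algorithm:

    ```python
    def assign_ofc_codes(n_devices, n_chips):
        """Fill a code matrix whose rows are chip-frequency orderings,
        using cyclic shifts so no two devices transmit the same frequency
        chip in the same time slot (a Latin-square property). A sketch
        only; the patent's actual assignment method may differ."""
        if n_devices > n_chips:
            raise ValueError("cyclic construction supports at most n_chips devices")
        base = list(range(n_chips))                 # chip frequency indices
        return [base[i:] + base[:i] for i in range(n_devices)]
    ```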

  9. Improved lossless intra coding for next generation video coding

    NASA Astrophysics Data System (ADS)

    Vanam, Rahul; He, Yuwen; Ye, Yan

    2016-09-01

    Recently, there have been efforts by the ITU-T VCEG and ISO/IEC MPEG to further improve the compression performance of the High Efficiency Video Coding (HEVC) standard for developing a potential next generation video coding standard. The exploratory codec software of this potential standard includes new coding tools for inter and intra coding. In this paper, we present a new intra prediction mode for lossless intra coding. Our new intra mode derives a prediction filter for each input pixel using its neighboring reconstructed pixels, and applies this filter to the nearest neighboring reconstructed pixels to generate a prediction pixel. The proposed intra mode is demonstrated to improve the performance of the exploratory software for lossless intra coding, yielding a maximum and average bitrate savings of 4.4% and 2.11%, respectively.
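    The spirit of causal lossless intra prediction can be shown with the classic gradient predictor (left + top - top-left); this is a simplified stand-in for the paper's per-pixel least-squares-derived filter, but it demonstrates the same lossless round trip:

    ```python
    import numpy as np

    def lossless_intra_residual(img):
        """Predict each pixel from its causal neighbors with the gradient
        predictor and return the residual to be entropy-coded. The
        decoder can reverse this exactly, so coding is lossless."""
        img = img.astype(np.int32)
        res = np.zeros_like(img)
        h, w = img.shape
        for y in range(h):
            for x in range(w):
                left = img[y, x - 1] if x > 0 else 0
                top = img[y - 1, x] if y > 0 else 0
                tl = img[y - 1, x - 1] if (x > 0 and y > 0) else 0
                res[y, x] = img[y, x] - (left + top - tl)
        return res

    def reconstruct(res):
        """Decoder: rebuild each pixel from already-reconstructed
        neighbors plus the residual."""
        h, w = res.shape
        rec = np.zeros_like(res)
        for y in range(h):
            for x in range(w):
                left = rec[y, x - 1] if x > 0 else 0
                top = rec[y - 1, x] if y > 0 else 0
                tl = rec[y - 1, x - 1] if (x > 0 and y > 0) else 0
                rec[y, x] = res[y, x] + left + top - tl
        return rec
    ```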

  10. Transionospheric Propagation Code (TIPC)

    SciTech Connect

    Roussel-Dupre, R.; Kelley, T.A.

    1990-10-01

    The Transionospheric Propagation Code is a computer program developed at Los Alamos National Laboratory to perform certain tasks related to the detection of VHF signals following propagation through the ionosphere. The code is written in Fortran 77, runs interactively and was designed to be as machine independent as possible. A menu format in which the user is prompted to supply appropriate parameters for a given task has been adopted for the input while the output is primarily in the form of graphics. The user has the option of selecting from five basic tasks, namely transionospheric propagation, signal filtering, signal processing, DTOA study, and DTOA uncertainty study. For the first task a specified signal is convolved against the impulse response function of the ionosphere to obtain the transionospheric signal. The user is given a choice of four analytic forms for the input pulse or of supplying a tabular form. The option of adding Gaussian-distributed white noise or spectral noise to the input signal is also provided. The deterministic ionosphere is characterized to first order in terms of a total electron content (TEC) along the propagation path. In addition, a scattering model parameterized in terms of a frequency coherence bandwidth is also available. In the second task, detection is simulated by convolving a given filter response against the transionospheric signal. The user is given a choice of a wideband filter or a narrowband Gaussian filter. It is also possible to input a filter response. The third task provides for quadrature detection, envelope detection, and three different techniques for time-tagging the arrival of the transionospheric signal at specified receivers. The latter algorithms can be used to determine a TEC and thus take out the effects of the ionosphere to first order. Task four allows the user to construct a table of delta-times-of-arrival (DTOAs) vs TECs for a specified pair of receivers.
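    Taking out the ionosphere "to first order" via TEC relies on the standard dispersive group delay, delta_t = 40.3 * TEC / (c * f^2). A sketch of the corresponding DTOA-to-TEC inversion (illustrative, not TIPC source code):

    ```python
    C_LIGHT = 2.99792458e8  # speed of light (m/s)

    def ionospheric_group_delay(tec_el_m2, freq_hz):
        """First-order ionospheric group delay in seconds for a slant
        TEC in electrons/m^2; the constant 40.3 has units m^3/s^2."""
        return 40.3 * tec_el_m2 / (C_LIGHT * freq_hz ** 2)

    def tec_from_dtoa(dt, f1, f2):
        """Invert the delta-time-of-arrival between two frequencies
        (f1 < f2, dt = delay(f1) - delay(f2)) for the total electron
        content, as in a DTOA-vs-TEC table."""
        return dt * C_LIGHT / (40.3 * (1.0 / f1 ** 2 - 1.0 / f2 ** 2))
    ```

    Because the delay scales as 1/f^2, measuring arrival times at two frequencies determines TEC uniquely to first order, which is exactly what the DTOA tables in task four exploit.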

  11. Nonequilibrium air radiation (Nequair) program: User's manual

    NASA Technical Reports Server (NTRS)

    Park, C.

    1985-01-01

    A supplement to the data relating to the calculation of nonequilibrium radiation in flight regimes of aeroassisted orbital transfer vehicles contains the listings of the computer code NEQAIR (Nonequilibrium Air Radiation), its primary input data, and explanation of the user-supplied input variables. The user-supplied input variables are the thermodynamic variables of air at a given point, i.e., number densities of various chemical species, translational temperatures of heavy particles and electrons, and vibrational temperature. These thermodynamic variables do not necessarily have to be in thermodynamic equilibrium. The code calculates emission and absorption characteristics of air under these given conditions.

  12. Spin-glass models as error-correcting codes

    NASA Astrophysics Data System (ADS)

    Sourlas, Nicolas

    1989-06-01

    DURING the transmission of information, errors may occur because of the presence of noise, such as thermal noise in electronic signals or interference with other sources of radiation. One wants to recover the information with the minimum error possible. In theory this is possible by increasing the power of the emitter source. But as the cost is proportional to the energy fed into the channel, it costs less to code the message before sending it, thus including redundant 'coding' bits, and to decode at the end. Coding theory provides rigorous bounds on the cost-effectiveness of any code. The explicit codes proposed so far for practical applications do not saturate these bounds; that is, they do not achieve optimal cost-efficiency. Here we show that theoretical models of magnetically disordered materials (spin glasses) provide a new class of error-correction codes. Their cost performance can be calculated using the methods of statistical mechanics, and is found to be excellent. These models can, under certain circumstances, constitute the first known codes to saturate Shannon's well-known cost-performance bounds.
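    Shannon's bound referred to above: for a binary symmetric channel with crossover probability p, the capacity is C = 1 - H(p) bits per channel use, and a simple repetition code shows redundancy buying reliability at a rate well below that bound. A sketch (standard textbook formulas, not from the paper):

    ```python
    import math

    def binary_entropy(p):
        """H(p) = -p log2 p - (1-p) log2 (1-p), in bits."""
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    def bsc_capacity(p):
        """Shannon capacity (bits per channel use) of a binary
        symmetric channel with crossover probability p."""
        return 1.0 - binary_entropy(p)

    def repetition_block_error(p, n):
        """Block error probability of an n-fold repetition code (n odd)
        under majority-vote decoding: more than n//2 bits flipped."""
        return sum(math.comb(n, k) * p ** k * (1 - p) ** (n - k)
                   for k in range(n // 2 + 1, n + 1))
    ```

    At p = 0.1 the capacity is about 0.53 bits/use, yet the 3-fold repetition code achieves only rate 1/3; the gap between such simple codes and capacity is what the spin-glass construction aims to close.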

  13. Computer simulation of radiation damage in gallium arsenide

    NASA Technical Reports Server (NTRS)

    Stith, John J.; Davenport, James C.; Copeland, Randolph L.

    1989-01-01

    A version of the binary-collision simulation code MARLOWE was used to study the spatial characteristics of radiation damage in proton and electron irradiated gallium arsenide. Comparisons made with the experimental results proved to be encouraging.

  14. Synchrotron radiation with radiation reaction

    NASA Astrophysics Data System (ADS)

    Nelson, Robert W.; Wasserman, Ira

    1991-04-01

    A rigorous discussion is presented of the classical motion of a relativistic electron in a magnetic field and the resulting electromagnetic radiation when radiation reaction is important. In particular, for an electron injected with initial energy gamma(0), a systematic perturbative solution to the Lorentz-Dirac equation of motion is developed for field strengths satisfying gamma(0) B much less than 6 x 10 to the 15th G. A particularly accurate solution to the electron orbital motion in this regime is found and it is demonstrated how lowest-order corrections can be calculated. It is shown that the total energy-loss rate corresponds to what would be found using the exact Larmor power formula without including radiation reaction. Provided that the particle energy and field strength satisfy the same constraint, it is explicitly demonstrated that the intuitive prescription for calculating the time-integrated radiation spectrum described above is correct.
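    The "exact Larmor power" result can be checked against the standard orbit-averaged synchrotron loss rate, P = 2 sigma_T c U_B beta^2 gamma^2 <sin^2 alpha>, with U_B = B^2 / 8 pi in cgs units. A sketch using the textbook formula (not taken from the paper):

    ```python
    import math

    # cgs constants
    SIGMA_T = 6.6524587e-25   # Thomson cross section (cm^2)
    C = 2.99792458e10         # speed of light (cm/s)
    M_E = 9.1093897e-28       # electron mass (g)

    def synchrotron_power(gamma, b_gauss, pitch_sin2=2.0 / 3.0):
        """Orbit-averaged synchrotron power (erg/s) for one electron;
        <sin^2 alpha> = 2/3 for an isotropic pitch-angle distribution."""
        u_b = b_gauss ** 2 / (8.0 * math.pi)    # magnetic energy density
        beta2 = 1.0 - 1.0 / gamma ** 2
        return 2.0 * SIGMA_T * C * u_b * beta2 * gamma ** 2 * pitch_sin2

    def cooling_time(gamma, b_gauss):
        """Synchrotron cooling timescale (s): electron energy over the
        instantaneous loss rate, ignoring radiation-reaction corrections."""
        return gamma * M_E * C ** 2 / synchrotron_power(gamma, b_gauss)
    ```

    The cooling time scales as 1/(gamma B^2), which is why radiation reaction becomes important precisely in the high-gamma(0), high-B regime the paper analyzes.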

  15. Development and application of computational aerothermodynamics flowfield computer codes

    NASA Technical Reports Server (NTRS)

    Venkatapathy, Ethiraj

    1994-01-01

    Research was performed in the area of computational modeling and application of hypersonic, high-enthalpy, thermo-chemical nonequilibrium flow (Aerothermodynamics) problems. A number of computational fluid dynamic (CFD) codes were developed and applied to simulate high altitude rocket-plume, the Aeroassist Flight Experiment (AFE), hypersonic base flow for planetary probes, the single expansion ramp nozzle (SERN) connected with the National Aerospace Plane, hypersonic drag devices, hypersonic ramp flows, ballistic range models, shock tunnel facility nozzles, transient and steady flows in the shock tunnel facility, arc-jet flows, thermochemical nonequilibrium flows around simple and complex bodies, axisymmetric ionized flows of interest to re-entry, unsteady shock induced combustion phenomena, high enthalpy pulsed facility simulations, and unsteady shock boundary layer interactions in shock tunnels. Computational modeling involved developing appropriate numerical schemes for the flows of interest and developing, applying, and validating appropriate thermochemical processes. As part of improving the accuracy of the numerical predictions, adaptive grid algorithms were explored, and a user-friendly, self-adaptive code (SAGE) was developed. Aerothermodynamic flows of interest included energy transfer due to strong radiation, and a significant level of effort was spent in developing computational codes for calculating radiation and radiation modeling. In addition, computational tools were developed and applied to predict the radiative heat flux and spectra that reach the model surface.

  16. Brain radiation - discharge

    MedlinePlus

    Radiation - brain - discharge; Cancer-brain radiation; Lymphoma - brain radiation; Leukemia - brain radiation ... Decadron) while you are getting radiation to the brain. It may make you hungrier, cause leg swelling ...

  17. Unstructured Polyhedral Mesh Thermal Radiation Diffusion

    SciTech Connect

    Palmer, T.S.; Zika, M.R.; Madsen, N.K.

    2000-07-27

    Unstructured mesh particle transport and diffusion methods are gaining wider acceptance as mesh generation, scientific visualization and linear solvers improve. This paper describes an algorithm that is currently being used in the KULL code at Lawrence Livermore National Laboratory to solve the radiative transfer equations. The algorithm employs a point-centered diffusion discretization on arbitrary polyhedral meshes in 3D. We present the results of a few test problems to illustrate the capabilities of the radiation diffusion module.
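    The diffusion discretization can be illustrated in one dimension: a backward-Euler step with zero-flux boundaries, solved with the Thomas tridiagonal algorithm. This is a 1-D pure-Python sketch, not KULL's point-centered scheme on 3-D polyhedral meshes:

    ```python
    def diffusion_step(e, d, dx, dt):
        """One backward-Euler step of dE/dt = d/dx( D dE/dx ) on a
        uniform grid with zero-flux boundaries. The implicit system is
        tridiagonal and is solved with the Thomas algorithm; zero-flux
        boundaries make the step exactly conservative."""
        n = len(e)
        r = dt / dx ** 2
        # tridiagonal coefficients (a: sub-, b: diagonal, c: super-)
        a = [0.0] * n; b = [0.0] * n; c = [0.0] * n
        for i in range(n):
            dl = d if i > 0 else 0.0        # zero-flux left boundary
            dr = d if i < n - 1 else 0.0    # zero-flux right boundary
            a[i] = -r * dl
            c[i] = -r * dr
            b[i] = 1.0 + r * (dl + dr)
        # Thomas forward sweep
        cp = [0.0] * n; ep = [0.0] * n
        cp[0] = c[0] / b[0]; ep[0] = e[0] / b[0]
        for i in range(1, n):
            m = b[i] - a[i] * cp[i - 1]
            cp[i] = c[i] / m
            ep[i] = (e[i] - a[i] * ep[i - 1]) / m
        # back substitution
        out = [0.0] * n
        out[-1] = ep[-1]
        for i in range(n - 2, -1, -1):
            out[i] = ep[i] - cp[i] * out[i + 1]
        return out
    ```

    The implicit step preserves the total energy exactly and keeps the solution nonnegative, two properties a radiation diffusion module must also maintain on unstructured meshes.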

  18. XUV polarimeter for undulator radiation measurements

    SciTech Connect

    Gluskin, E.; Mattson, J.E.; Bader, S.D.; Viccaro, P.J. ); Barbee, T.W. Jr. ); Brookes, N. ); Pitas, A. ); Watts, R. )

    1991-01-01

    A polarimeter for x-ray and vacuum ultraviolet (XUV) radiation was built to measure the spatial spectral dependence of the polarization of the light produced by the new undulator at the U5 beamline at NSLS. The fourth-harmonic radiation was measured, and it does not agree with predictions based on ideal simulation codes in the far-field approximation. 13 ref., 7 figs.

  19. On multilevel block modulation codes

    NASA Technical Reports Server (NTRS)

    Kasami, Tadao; Takata, Toyoo; Fujiwara, Toru; Lin, Shu

    1991-01-01

    The multilevel (ML) technique for combining block coding and modulation is investigated. A general formulation is presented for ML modulation codes in terms of component codes with appropriate distance measures. A specific method for constructing ML block modulation codes (MLBMCs) with interdependency among component codes is proposed. Given an MLBMC C with no interdependency among the binary component codes, the proposed method gives an MLBMC C-prime that has the same rate as C, a minimum squared Euclidean distance not less than that of C, a trellis diagram with the same number of states as that of C, and a smaller number of nearest-neighbor codewords than that of C. Finally, a technique is presented for analyzing the error performance of MLBMCs for an additive white Gaussian noise channel based on soft-decision maximum-likelihood decoding.
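    The minimum squared Euclidean distance comparison can be checked by brute force for a toy component code. A sketch using a BPSK mapping, under which Hamming distance d becomes squared Euclidean distance 4d (the paper's constructions use larger signal sets):

    ```python
    from itertools import product

    def bpsk(bits):
        """Map bits {0,1} -> BPSK symbols {+1,-1}."""
        return [1.0 - 2.0 * b for b in bits]

    def min_sq_euclid_dist(codewords):
        """Brute-force minimum squared Euclidean distance between the
        BPSK-modulated signal points of a block code."""
        pts = [bpsk(c) for c in codewords]
        best = float("inf")
        for i in range(len(pts)):
            for j in range(i + 1, len(pts)):
                d2 = sum((a - b) ** 2 for a, b in zip(pts[i], pts[j]))
                best = min(best, d2)
        return best

    # (3,2) single-parity-check code: minimum Hamming distance 2
    parity_code = [list(b) + [sum(b) % 2] for b in product([0, 1], repeat=2)]
    ```

    The parity code gives a squared Euclidean distance of 8 versus 4 for uncoded bits, illustrating the distance gain a component code contributes to the multilevel construction.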

  20. Space life sciences: radiation risk assessment and radiation measurements in low Earth orbit.

    PubMed

    2004-01-01

    The volume contains papers presented at COSPAR symposia in October 2002 about radiation risk assessment and radiation measurements in low Earth orbit. The risk assessment symposium brought together multidisciplinary expertise including physicists, biologists, and theoretical modelers. Topics included current knowledge about known and predicted radiation environments, radiation shielding, physics cross section models, improved ion beam transport codes, biological demonstrations of specific shielding materials and applications to a manned mission to Mars, advancements in biological measurement of radiation-induced protein expression profiles, and integration of physical and biological parameters to assess key elements of radiation risk. Papers from the radiation measurements in low Earth orbit symposium included data about dose, linear energy transfer spectra, and charge spectra from recent measurements on the International Space Station (ISS), comparison between calculations and measurements of dose distribution inside a human phantom and the neutron component inside the ISS; and reviews of trapped antiprotons and positrons inside the Earth's magnetosphere.