Science.gov

Sample records for radiation code cduct-larc

  1. FLYCHK Collisional-Radiative Code

    National Institute of Standards and Technology Data Gateway

    SRD 160 FLYCHK Collisional-Radiative Code (Web, free access)   FLYCHK provides a capability to generate atomic level populations and charge state distributions for low-Z to mid-Z elements under NLTE conditions.

  2. Testing Impact's Radiation Code

    SciTech Connect

    Edis, T; Cameron-Smith, P; Grant, K E; Bergmann, D; Chuang, C C

    2004-07-12

    This is a summary of work done over an 8-week period from May to July 2004, which concerned testing the longwave and shortwave radiation packages in Impact. The radiation code was initially developed primarily by Keith Grant in the context of LLNL's 2D model, and was added to Impact over the last few summers. While the radiation code had been tested and also used in some aerosol-related calculations, its 3D form in Impact had not been validated with comparisons to satellite data. Along with such comparisons, our work described here was also motivated by the need to validate the radiation code for use in the SciDAC consortium project. This involved getting the radiation code working with CAM/WACCM met data, and setting the stage for comparing CAM/WACCM radiation output with Impact results.

  3. MACRAD: A mass analysis code for radiators

    SciTech Connect

    Gallup, D.R.

    1988-01-01

    A computer code to estimate and optimize the mass of heat pipe radiators (MACRAD) is currently under development. A parametric approach is used in MACRAD, which allows the user to optimize radiator mass based on heat pipe length, length to diameter ratio, vapor to wick radius, radiator redundancy, etc. Full consideration of the heat pipe operating parameters, material properties, and shielding requirements is included in the code. Preliminary results obtained with MACRAD are discussed.

  4. TORUS: Radiation transport and hydrodynamics code

    NASA Astrophysics Data System (ADS)

    Harries, Tim

    2014-04-01

    TORUS is a flexible radiation transfer and radiation-hydrodynamics code. The code has a basic infrastructure that includes the AMR mesh scheme that is used by several physics modules including atomic line transfer in a moving medium, molecular line transfer, photoionization, radiation hydrodynamics and radiative equilibrium. TORUS is useful for a variety of problems, including magnetospheric accretion onto T Tauri stars, spiral nebulae around Wolf-Rayet stars, discs around Herbig AeBe stars, structured winds of O supergiants and Raman-scattered line formation in symbiotic binaries, and dust emission and molecular line formation in star forming clusters. The code is written in Fortran 2003 and is compiled using a standard GNU makefile. The code is parallelized using both MPI and OpenMP, and can use these parallel sections either separately or in a hybrid mode.

  5. Tests of Exoplanet Atmospheric Radiative Transfer Codes

    NASA Astrophysics Data System (ADS)

    Harrington, Joseph; Challener, Ryan; DeLarme, Emerson; Cubillos, Patricio; Blecic, Jasmina; Foster, Austin; Garland, Justin

    2016-10-01

    Atmospheric radiative transfer codes are used both to predict planetary spectra and in retrieval algorithms to interpret data. Observational plans, theoretical models, and scientific results thus depend on the correctness of these calculations. Yet, the calculations are complex and the codes implementing them are often written without modern software-verification techniques. In the process of writing our own code, we became aware of several others with artifacts of unknown origin and even outright errors in their spectra. We present a series of tests to verify atmospheric radiative-transfer codes. These include: simple, single-line line lists that, when combined with delta-function abundance profiles, should produce a broadened line that can be verified easily; isothermal atmospheres that should produce analytically-verifiable blackbody spectra at the input temperatures; and model atmospheres with a range of complexities that can be compared to the output of other codes. We apply the tests to our own code, Bayesian Atmospheric Radiative Transfer (BART) and to several other codes. The test suite is open-source software. We propose this test suite as a standard for verifying current and future radiative transfer codes, analogous to the Held-Suarez test for general circulation models. This work was supported by NASA Planetary Atmospheres grant NX12AI69G and NASA Astrophysics Data Analysis Program grant NNX13AF38G.
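
    One of the proposed tests, an isothermal atmosphere whose emergent spectrum must equal the Planck function at the input temperature, can be sketched with a toy plane-parallel formal solution (the layering, temperature, and tolerances are illustrative; this is not BART's actual implementation):

```python
import numpy as np

H = 6.62607015e-34   # Planck constant (J s)
C = 2.99792458e8     # speed of light (m/s)
KB = 1.380649e-23    # Boltzmann constant (J/K)

def planck(wavelength, T):
    """Planck spectral radiance B_lambda(T), in W m^-3 sr^-1."""
    return 2.0 * H * C**2 / wavelength**5 / np.expm1(H * C / (wavelength * KB * T))

def emergent_intensity(dtau, temps, wavelength, T_surface):
    """Normal-incidence upward intensity at the top of a plane-parallel
    atmosphere: each layer's emission attenuated by the layers above it,
    plus the attenuated surface (blackbody) emission."""
    intensity, transmission = 0.0, 1.0
    for dt, T in zip(dtau, temps):
        intensity += planck(wavelength, T) * transmission * -np.expm1(-dt)
        transmission *= np.exp(-dt)
    return intensity + planck(wavelength, T_surface) * transmission

# Isothermal test: all layers and the surface share one temperature, so the
# emergent spectrum must reduce exactly to the blackbody curve.
T0, wl = 1500.0, 2e-6                  # illustrative temperature and wavelength
dtau = np.full(50, 0.2)                # arbitrary layer optical depths
assert abs(emergent_intensity(dtau, np.full(50, T0), wl, T0) / planck(wl, T0) - 1) < 1e-12
```

The attenuation factors telescope, so the isothermal result is exact to rounding error; that is what makes this a sharp verification test, since any discretization or bookkeeping bug shows up as a deviation from the blackbody curve.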

  6. Airborne antenna radiation pattern code user's manual

    NASA Technical Reports Server (NTRS)

    Burnside, Walter D.; Kim, Jacob J.; Grandchamp, Brett; Rojas, Roberto G.; Law, Philip

    1985-01-01

    The use of a newly developed computer code to analyze the radiation patterns of antennas mounted on an ellipsoid and in the presence of a set of finite flat plates is described. It is shown how the code allows the user to simulate a wide variety of complex electromagnetic radiation problems using the ellipsoid/plates model. The code can calculate radiation patterns around an arbitrary conical cut specified by the user. The organization of the code, definitions of input and output data, and numerous practical examples are also presented. The analysis is based on the Uniform Geometrical Theory of Diffraction (UTD), and most of the computed patterns are compared with experimental results to show the accuracy of this solution.

  7. NASA Space Radiation Transport Code Development Consortium.

    PubMed

    Townsend, Lawrence W

    2005-01-01

    Recently, NASA established a consortium involving the University of Tennessee (lead institution), the University of Houston, Roanoke College and various government and national laboratories, to accelerate the development of a standard set of radiation transport computer codes for NASA human exploration applications. This effort involves further improvements of the Monte Carlo codes HETC and FLUKA and the deterministic code HZETRN, including developing nuclear reaction databases necessary to extend the Monte Carlo codes to carry out heavy ion transport, and extending HZETRN to three dimensions. The improved codes will be validated by comparing predictions with measured laboratory transport data, provided by an experimental measurements consortium, and measurements in the upper atmosphere on the balloon-borne Deep Space Test Bed (DSTB). In this paper, we present an overview of the consortium members and the current status and future plans of consortium efforts to meet the research goals and objectives of this extensive undertaking.

  8. An integrated radiation physics computer code system.

    NASA Technical Reports Server (NTRS)

    Steyn, J. J.; Harris, D. W.

    1972-01-01

    An integrated computer code system for the semi-automatic and rapid analysis of experimental and analytic problems in gamma photon and fast neutron radiation physics is presented. Such problems as the design of optimum radiation shields and radioisotope power source configurations may be studied. The system codes allow for the unfolding of complex neutron and gamma photon experimental spectra. Monte Carlo and analytic techniques are used for the theoretical prediction of radiation transport. The system includes a multichannel pulse-height analyzer scintillation and semiconductor spectrometer coupled to an on-line digital computer with appropriate peripheral equipment. The system is geometry generalized as well as self-contained with respect to material nuclear cross sections and the determination of the spectrometer response functions. Input data may be either analytic or experimental.

  10. The Helicopter Antenna Radiation Prediction Code (HARP)

    NASA Technical Reports Server (NTRS)

    Klevenow, F. T.; Lynch, B. G.; Newman, E. H.; Rojas, R. G.; Scheick, J. T.; Shamansky, H. T.; Sze, K. Y.

    1990-01-01

    The first nine months' effort in the development of a user-oriented computer code, referred to as the HARP code, for analyzing the radiation from helicopter antennas is described. The HARP code uses modern computer graphics to aid in the description and display of the helicopter geometry. At low frequencies the helicopter is modeled by polygonal plates, and the method of moments is used to compute the desired patterns. At high frequencies the helicopter is modeled by a composite ellipsoid and flat plates, and computations are made using the geometrical theory of diffraction. The HARP code will provide a user-friendly interface, employing modern computer graphics, to aid the user in describing the helicopter geometry, selecting the method of computation, constructing the desired high- or low-frequency model, and displaying the results.

  11. Radiation hydrodynamics integrated in the PLUTO code

    NASA Astrophysics Data System (ADS)

    Kolb, Stefan M.; Stute, Matthias; Kley, Wilhelm; Mignone, Andrea

    2013-11-01

    Aims: The transport of energy through radiation is very important in many astrophysical phenomena. In dynamical problems the time-dependent equations of radiation hydrodynamics have to be solved. We present a newly developed radiation-hydrodynamics module specifically designed for the versatile magnetohydrodynamic (MHD) code PLUTO. Methods: The solver is based on the flux-limited diffusion approximation in the two-temperature approach. All equations are solved in the co-moving frame in the frequency-independent (gray) approximation. The hydrodynamics is solved by the different Godunov schemes implemented in PLUTO, and for the radiation transport we use a fully implicit scheme. The resulting system of linear equations is solved either using the successive over-relaxation (SOR) method (for testing purposes) or using matrix solvers that are available in the PETSc library. We state in detail the methodology and describe several test cases to verify the correctness of our implementation. The solver works in standard coordinate systems, such as Cartesian, cylindrical, and spherical, and also for non-equidistant grids. Results: We present a new radiation-hydrodynamics solver coupled to the MHD code PLUTO: a modern, versatile, and efficient module for treating complex radiation hydrodynamical problems in astrophysics. As test cases, either purely radiative situations or full radiation-hydrodynamical setups (including radiative shocks and convection in accretion disks) were successfully studied. The new module scales very well on parallel computers using MPI. For problems in star or planet formation, we added the possibility of irradiation by a central source.
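
    The fully implicit radiation step described above yields a sparse linear system at each timestep; the SOR option mentioned in the abstract can be sketched on a 1-D model problem (an illustrative backward-Euler diffusion matrix, not PLUTO's actual FLD discretization):

```python
import numpy as np

def sor_solve(A, b, omega=1.5, tol=1e-10, max_iter=10_000):
    """Successive over-relaxation (SOR) iterations for A x = b."""
    x = np.zeros_like(b)
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(len(b)):
            # Updated values x[:i] from this sweep, old values beyond i.
            sigma = A[i, :i] @ x[:i] + A[i, i + 1:] @ x_old[i + 1:]
            x[i] = (1 - omega) * x_old[i] + omega * (b[i] - sigma) / A[i, i]
        if np.linalg.norm(x - x_old, np.inf) < tol:
            break
    return x

# Model implicit radiation-diffusion step, (I - r * Laplacian) E_new = E_old,
# on a uniform 1-D grid: (1 + 2r) on the diagonal, -r off it.
n, r = 64, 0.8
A = (1 + 2 * r) * np.eye(n) - r * (np.eye(n, k=1) + np.eye(n, k=-1))
E_old = np.exp(-0.5 * np.linspace(-3, 3, n) ** 2)   # Gaussian energy profile
E_new = sor_solve(A, E_old)
assert np.allclose(E_new, np.linalg.solve(A, E_old))
```

For a diagonally dominant system like this, SOR converges for any relaxation factor 0 < omega < 2; production runs would use a preconditioned Krylov solver (as PLUTO does via PETSc), with SOR kept as the simple reference implementation.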

  12. Validation of comprehensive space radiation transport code

    SciTech Connect

    Shinn, J.L.; Simonsen, L.C.; Cucinotta, F.A.

    1998-12-01

    The HZETRN code has been developed over the past decade to evaluate the local radiation fields within sensitive materials on spacecraft in the space environment. Most of the more important nuclear and atomic processes are now modeled, and evaluation within a complex spacecraft geometry with differing material components, including transition effects across boundaries of dissimilar materials, is included. The atomic/nuclear database and transport procedures have received limited validation in laboratory testing with high energy ion beams. The codes have been applied in the design of the SAGE-III instrument, resulting in material changes to control injurious neutron production; in the study of Space Shuttle single event upsets; and in validation with space measurements (particle telescopes, tissue equivalent proportional counters, CR-39) on Shuttle and Mir. The present paper reviews the code development and presents recent results in laboratory and space flight validation.

  13. THE MCNPX MONTE CARLO RADIATION TRANSPORT CODE

    SciTech Connect

    WATERS, LAURIE S.; MCKINNEY, GREGG W.; DURKEE, JOE W.; FENSIN, MICHAEL L.; JAMES, MICHAEL R.; JOHNS, RUSSELL C.; PELOWITZ, DENISE B.

    2007-01-10

    MCNPX (Monte Carlo N-Particle eXtended) is a general-purpose Monte Carlo radiation transport code with three-dimensional geometry and continuous-energy transport of 34 particles and light ions. It contains flexible source and tally options, interactive graphics, and support for both sequential and multi-processing computer platforms. MCNPX is based on MCNP4B, and has been upgraded to most MCNP5 capabilities. MCNP is a highly stable code tracking neutrons, photons and electrons, and using evaluated nuclear data libraries for low-energy interaction probabilities. MCNPX has extended this base to a comprehensive set of particles and light ions, with heavy ion transport in development. Models have been included to calculate interaction probabilities when libraries are not available. Recent additions focus on the time evolution of residual nuclei decay, allowing calculation of transmutation and delayed particle emission. MCNPX is now a code of great dynamic range, and the excellent neutronics capabilities allow new opportunities to simulate devices of interest to experimental particle physics, particularly calorimetry. This paper describes the capabilities of the current MCNPX version 2.6.C, and also discusses ongoing code development.

  14. LPGS. Code System for Calculating Radiation Exposure

    SciTech Connect

    White, J.E.; Eckerman, K.F.

    1983-01-01

    LPGS was developed to calculate the radiological impacts resulting from radioactive releases to the hydrosphere. The name LPGS was derived from the Liquid Pathway Generic Study for which the original code was used primarily as an analytic tool in the assessment process. The hydrosphere is represented by the following types of water bodies: estuary, small river, well, lake, and one-dimensional (1-d) river. LPGS is designed to calculate radiation dose (individual and population) to body organs as a function of time for the various exposure pathways. The radiological consequences to the aquatic biota are estimated. Several simplified radionuclide transport models are employed with built-in formulations to describe the release rate of the radionuclides. A tabulated user-supplied release model can be input, if desired. Printer plots of dose versus time for the various exposure pathways are provided.

  15. Advances in space radiation shielding codes

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Tripathi, Ram K.; Qualls, Garry D.; Cucinotta, Francis A.; Prael, Richard E.; Norbury, John W.; Heinbockel, John H.; Tweed, John; De Angelis, Giovanni

    2002-01-01

    Early space radiation shield code development relied on Monte Carlo methods and made important contributions to the space program. Monte Carlo methods have often been restricted to one-dimensional problems, leading to imperfect representation of appropriate boundary conditions. Even so, intensive computational requirements resulted, and shield evaluation was deferred to near the end of the design process. Resolving shielding issues usually had a negative impact on the design. Improved spacecraft shield design requires early entry of radiation constraints into the design process to maximize performance and minimize costs. As a result, we have been investigating high-speed computational procedures to allow shield analysis from the preliminary concept to the final design. For the last few decades, we have pursued deterministic solutions of the Boltzmann equation, allowing field mapping within the International Space Station (ISS) in tens of minutes using standard Finite Element Method (FEM) geometry common to engineering design methods. A single ray trace in such geometry requires 14 milliseconds, which limits the application of Monte Carlo methods to such engineering models. A potential means of improving Monte Carlo efficiency in coupling to spacecraft geometry is given.

  16. Space Radiation Transport Code Development: 3DHZETRN

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Slaba, Tony C.; Badavi, Francis F.; Reddell, Brandon D.; Bahadori, Amir A.

    2015-01-01

    The space radiation transport code, HZETRN, has been used extensively for research, vehicle design optimization, risk analysis, and related applications. One of the simplifying features of the HZETRN transport formalism is the straight-ahead approximation, wherein all particles are assumed to travel along a common axis. This reduces the governing equation to one spatial dimension allowing enormous simplification and highly efficient computational procedures to be implemented. Despite the physical simplifications, the HZETRN code is widely used for space applications and has been found to agree well with fully 3D Monte Carlo simulations in many circumstances. Recent work has focused on the development of 3D transport corrections for neutrons and light ions (Z < 2) for which the straight-ahead approximation is known to be less accurate. Within the development of 3D corrections, well-defined convergence criteria have been considered, allowing approximation errors at each stage in model development to be quantified. The present level of development assumes the neutron cross sections have an isotropic component treated within N explicit angular directions and a forward component represented by the straight-ahead approximation. The N = 1 solution refers to the straight-ahead treatment, while N = 2 represents the bi-directional model in current use for engineering design. The figure below shows neutrons, protons, and alphas for various values of N at locations in an aluminum sphere exposed to a solar particle event (SPE) spectrum. The neutron fluence converges quickly in simple geometry with N > 14 directions. The improved code, 3DHZETRN, transports neutrons, light ions, and heavy ions under space-like boundary conditions through general geometry while maintaining a high degree of computational efficiency. A brief overview of the 3D transport formalism for neutrons and light ions is given, and extensive benchmarking results with the Monte Carlo codes Geant4, FLUKA, and
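
    The N-direction treatment of the isotropic component is, in effect, an angular quadrature, and the convergence with increasing N described above can be illustrated on a toy angle-averaged attenuation integral (an assumed example for illustration, not the 3DHZETRN quadrature itself):

```python
import numpy as np

def slab_leakage(tau, n_dirs):
    """Angle-averaged uncollided transmission through a slab of optical
    depth tau: integral_0^1 exp(-tau/mu) d(mu), evaluated with an
    n_dirs-point Gauss-Legendre quadrature mapped onto [0, 1]."""
    nodes, weights = np.polynomial.legendre.leggauss(n_dirs)
    mu = 0.5 * (nodes + 1.0)        # map [-1, 1] -> [0, 1]
    w = 0.5 * weights
    return float(np.sum(w * np.exp(-tau / mu)))

tau = 1.0
# Fine-grid reference value of the same integral (midpoint rule):
mu = (np.arange(200_000) + 0.5) / 200_000
reference = float(np.mean(np.exp(-tau / mu)))

coarse = abs(slab_leakage(tau, 2) - reference)
fine = abs(slab_leakage(tau, 16) - reference)
assert fine < coarse    # more angular directions -> smaller quadrature error
```

This mirrors the convergence study in the abstract: a small number of directions already captures the bulk of the angle-averaged flux, and the residual quadrature error shrinks steadily as directions are added.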

  17. Description of Transport Codes for Space Radiation Shielding

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Wilson, John W.; Cucinotta, Francis A.

    2011-01-01

    This slide presentation describes transport codes and their use for studying and designing space radiation shielding. When combined with risk projection models, radiation transport codes serve as the main tool for studying radiation and designing shielding. There are three criteria for assessing the accuracy of transport codes: (1) ground-based studies with defined beams and material layouts, (2) inter-comparison of transport code results for matched boundary conditions, and (3) comparisons to flight measurements. By these three criteria, NASA's HZETRN/QMSFRG shows a very high degree of accuracy.

  18. Radiation flux tables for ICRCCM using the GLA GCM radiation codes

    NASA Technical Reports Server (NTRS)

    HARSHVARDHAN

    1986-01-01

    Tabulated values of longwave and shortwave radiation fluxes, as well as cooling and heating rates in the atmosphere for standard atmospheric profiles, are presented. The radiation codes used in the Goddard general circulation model were employed for the computations. These results were obtained for an international intercomparison project called the Intercomparison of Radiation Codes in Climate Models (ICRCCM).

  19. High Energy Radiation Transport Codes: Their Development and Application

    NASA Astrophysics Data System (ADS)

    Gabriel, Tony A.

    1996-05-01

    The development of high energy radiation transport codes has been very strongly correlated with the development of higher energy accelerators and more powerful computers. During the early 1960's a Nucleon Transport Code (NTC) was developed to transport neutrons and protons at energies below the pion threshold. During the middle 1960's this code, renamed NMTC, was expanded to include multiple pion production and could be used for particle energies up to 3.5 GeV. During the late 1960's and early 1970's, with the development of the Fermi National Accelerator Laboratory (FNAL), NMTC was again refined by the inclusion of a particle-nucleus collision scaling model which could generate reliable collision information at the higher energies necessary for the development of radiation shielding at FNAL. This was HETC. During the 1970's HETC was coupled with the EGS code for electromagnetic particle transport, the MORSE code for low-energy (<20 MeV) neutron transport, and SPECT, a HETC analysis code for obtaining energy deposition, to produce the CALOR code system, a complete high energy radiation transport code package. In this paper CALOR is described in detail and some recent applications are presented. The strengths and weaknesses, as well as the applicability, of other radiation transport code systems such as FLUKA are briefly discussed.

  20. Recent developments in the Los Alamos radiation transport code system

    SciTech Connect

    Forster, R.A.; Parsons, K.

    1997-06-01

    A brief progress report on updates to the Los Alamos Radiation Transport Code System (LARTCS) for solving criticality and fixed-source problems is provided. LARTCS integrates the Diffusion Accelerated Neutral Transport (DANT) discrete ordinates codes with the Monte Carlo N-Particle (MCNP) code. The LARTCS code is being developed with a graphical user interface for problem setup and analysis. Progress in the DANT system for criticality applications includes a two-dimensional module which can be linked to a mesh-generation code and a faster iteration scheme. Updates to MCNP Version 4A allow statistical checks of calculated Monte Carlo results.

  1. Space radiator simulation manual for computer code

    NASA Technical Reports Server (NTRS)

    Black, W. Z.; Wulff, W.

    1972-01-01

    A computer program that simulates the performance of a space radiator is presented. The program basically consists of a rigorous analysis, which analyzes a symmetrical fin panel, and an approximate analysis that predicts system characteristics for cases of non-symmetrical operation. The rigorous analysis accounts for both transient and steady state performance, including aerodynamic and radiant heating of the radiator system. The approximate analysis considers only steady state operation with no aerodynamic heating. A description of the radiator system and instructions to the user for program operation are included. The input required for the execution of all program options is described. Several examples of program output are included; sample output covers the radiator performance during ascent, reentry, and orbit.

  2. Overview of HZETRN and BRNTRN Space Radiation Shielding Codes

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Cucinotta, F. A.; Shinn, J. L.; Simonsen, L. C.; Badavi, F. F.

    1997-01-01

    The NASA Radiation Health Program has supported basic research in radiation physics over the last decade to develop ionizing radiation transport codes and corresponding databases for the protection of astronauts from galactic and solar cosmic rays on future deep space missions. The codes describe the interactions of the incident radiations with shield materials, whose content is modified by atomic and nuclear reactions: high-energy heavy ions are fragmented into less massive reaction products, and secondary radiations are produced by direct knockout of shield constituents or as de-excitation products of the reactions. This defines the radiation fields to which specific devices are subjected onboard a spacecraft. Similar reactions occur in the device itself, and these are the initiating events for the device response. An overview of the computational procedures and database, with some applications to photonic and data processing devices, is given.

  3. SPAMCART: a code for smoothed particle Monte Carlo radiative transfer

    NASA Astrophysics Data System (ADS)

    Lomax, O.; Whitworth, A. P.

    2016-10-01

    We present a code for generating synthetic spectral energy distributions and intensity maps from smoothed particle hydrodynamics simulation snapshots. The code is based on the Lucy Monte Carlo radiative transfer method, i.e. it follows discrete luminosity packets as they propagate through a density field, and then uses their trajectories to compute the radiative equilibrium temperature of the ambient dust. The sources can be extended and/or embedded, and discrete and/or diffuse. The density is not mapped on to a grid, and therefore the calculation is performed at exactly the same resolution as the hydrodynamics. We present two example calculations using this method. First, we demonstrate that the code strictly adheres to Kirchhoff's law of radiation. Secondly, we present synthetic intensity maps and spectra of an embedded protostellar multiple system. The algorithm uses data structures that are already constructed for other purposes in modern particle codes. It is therefore relatively simple to implement.
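
    The packet-propagation core of such a code can be illustrated with a minimal Monte Carlo sketch (pure absorption in a uniform slab; Lucy's path-length estimator for the dust temperature is omitted for brevity, and all parameters are illustrative):

```python
import numpy as np

def transmitted_fraction(tau_slab, n_packets, rng):
    """Launch luminosity packets into a uniform, purely absorbing slab of
    total optical depth tau_slab. Each packet travels an exponentially
    distributed optical depth before absorption, so packets whose sampled
    depth exceeds tau_slab escape out the far side."""
    tau_travelled = -np.log(rng.random(n_packets))   # sampled free paths
    return float(np.mean(tau_travelled > tau_slab))

rng = np.random.default_rng(42)
tau = 1.0
estimate = transmitted_fraction(tau, 400_000, rng)
# The Monte Carlo estimate should approach the analytic attenuation exp(-tau):
assert abs(estimate - np.exp(-tau)) < 5e-3
```

A full Lucy-method code would additionally accumulate each packet's path length through every cell it crosses and use those tallies to iterate the dust toward radiative equilibrium, but the sampling of exponentially distributed optical depths above is the same core mechanism.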

  4. The Continual Intercomparison of Radiation Codes: Results from Phase I

    NASA Technical Reports Server (NTRS)

    Oreopoulos, Lazaros; Mlawer, Eli; Delamere, Jennifer; Shippert, Timothy; Cole, Jason; Iacono, Michael; Jin, Zhonghai; Li, Jiangnan; Manners, James; Raisanen, Petri

    2011-01-01

    The computer codes that calculate the energy budget of solar and thermal radiation in Global Climate Models (GCMs), our most advanced tools for predicting climate change, have to be computationally efficient in order to not impose undue computational burden on climate simulations. By using approximations to gain execution speed, these codes sacrifice accuracy compared to more accurate, but also much slower, alternatives. International efforts to evaluate the approximate schemes have taken place in the past, but they have suffered from the drawback that the accurate standards were not validated themselves for performance. The manuscript summarizes the main results of the first phase of an effort called "Continual Intercomparison of Radiation Codes" (CIRC) where the cases chosen to evaluate the approximate models are based on observations and where we have ensured that the accurate models perform well when compared to solar and thermal radiation measurements. The effort is endorsed by international organizations such as the GEWEX Radiation Panel and the International Radiation Commission and has a dedicated website (i.e., http://circ.gsfc.nasa.gov) where interested scientists can freely download data and obtain more information about the effort's modus operandi and objectives. In a paper published in the March 2010 issue of the Bulletin of the American Meteorological Society only a brief overview of CIRC was provided with some sample results. In this paper the analysis of submissions of 11 solar and 13 thermal infrared codes relative to accurate reference calculations obtained by so-called "line-by-line" radiation codes is much more detailed. We demonstrate that, while performance of the approximate codes continues to improve, significant issues still remain to be addressed for satisfactory performance within GCMs. We hope that by identifying and quantifying shortcomings, the paper will help establish performance standards to objectively assess radiation code quality.

  5. Stratospheric Relaxation in IMPACT's Radiation Code

    SciTech Connect

    Edis, T; Grant, K; Cameron-Smith, P

    2006-11-13

    While Impact incorporates diagnostic radiation routines from our work in previous years, it has not previously included the stratospheric relaxation required for forcing calculations. We have now implemented the necessary changes for stratospheric relaxation, tested its stability, and compared the results with stratospheric temperatures obtained from CAM3 met data. The relaxation results in stable temperature profiles in the stratosphere, which is encouraging for use in forcing calculations. It does, however, produce a cooling bias when compared to CAM3, which appears to be due to differences in radiation calculations rather than the interactive treatment of ozone. The cause of this bias is as yet unclear, but it seems to be systematic and hence cancels out when differences are taken relative to a control simulation.

  6. Acceleration of a Monte Carlo radiation transport code

    SciTech Connect

    Hochstedler, R.D.; Smith, L.M.

    1996-03-01

    Execution time for the Integrated TIGER Series (ITS) Monte Carlo radiation transport code has been reduced by careful re-coding of computationally intensive subroutines. Three test cases for the TIGER (1-D slab geometry), CYLTRAN (2-D cylindrical geometry), and ACCEPT (3-D arbitrary geometry) codes were identified and used to benchmark and profile program execution. Based upon these results, sixteen top time-consuming subroutines were examined and nine of them modified to accelerate computations with equivalent numerical output to the original. The results obtained via this study indicate that speedup factors of 1.90 for the TIGER code, 1.67 for the CYLTRAN code, and 1.11 for the ACCEPT code are achievable. © 1996 American Institute of Physics.

  7. Description of transport codes for space radiation shielding.

    PubMed

    Kim, Myung-Hee Y; Wilson, John W; Cucinotta, Francis A

    2012-11-01

    Exposure to ionizing radiation in the space environment is one of the hazards faced by crews in space missions. As space radiations traverse spacecraft, habitat shielding, or tissues, their energies and compositions are altered by interactions with the shielding. Modifications to the radiation fields arise from atomic interactions of charged particles with orbital electrons and nuclear interactions leading to projectile and target fragmentation, including secondary particles such as neutrons, protons, mesons, and nuclear recoils. The transport of space radiation through shielding can be simulated using Monte Carlo techniques or deterministic solutions of the Boltzmann equation. To determine shielding requirements and to resolve radiation constraints for future human missions, the shielding evaluation of a spacecraft concept is required as an early step in the design process. To do this requires (1) accurate knowledge of space environmental models to define the boundary condition for transport calculations, (2) transport codes with detailed shielding and body geometry models to determine particle transmission into areas of internal shielding and at each critical body organ, and (3) the assessment of organ dosimetric quantities and biological risks by applying the corresponding response models for space radiation against the particle spectra that have been accurately determined from the transport code. This paper reviews current transport codes and analyzes their accuracy through comparison to laboratory and spaceflight data. This paper also introduces a probabilistic risk assessment approach for the evaluation of radiation shielding.

  8. A Radiation Shielding Code for Spacecraft and Its Validation

    NASA Technical Reports Server (NTRS)

    Shinn, J. L.; Cucinotta, F. A.; Singleterry, R. C.; Wilson, J. W.; Badavi, F. F.; Badhwar, G. D.; Miller, J.; Zeitlin, C.; Heilbronn, L.; Tripathi, R. K.

    2000-01-01

    The HZETRN code, which uses a deterministic approach pioneered at NASA Langley Research Center, has been developed over the past decade to evaluate the local radiation fields within sensitive materials (electronic devices and human tissue) on spacecraft in the space environment. The code describes the interactions of shield materials with the incident galactic cosmic rays, trapped protons, or energetic protons from solar particle events in free space and low Earth orbit. The content of incident radiations is modified by atomic and nuclear reactions with the spacecraft and radiation shield materials. High-energy heavy ions are fragmented into less massive reaction products, and reaction products are produced by direct knockout of shield constituents or from de-excitation products. An overview of the computational procedures and database which describe these interactions is given. Validation of the code with recent Monte Carlo benchmarks, and laboratory and flight measurement is also included.

  9. Controlling Energy Radiations of Electromagnetic Waves via Frequency Coding Metamaterials

    PubMed Central

    Wu, Haotian; Liu, Shuo; Wan, Xiang; Zhang, Lei; Wang, Dan; Li, Lianlin

    2017-01-01

    Metamaterials are artificial structures composed of subwavelength unit cells to control electromagnetic (EM) waves. The spatial coding representation of metamaterial has the ability to describe the material in a digital way. The spatial coding metamaterials are typically constructed by unit cells that have similar shapes with fixed functionality. Here, the concept of frequency coding metamaterial is proposed, which achieves different controls of EM energy radiations with a fixed spatial coding pattern when the frequency changes. In this case, not only are different phase responses of the unit cells considered, but different phase sensitivities are also required. Due to different frequency sensitivities of unit cells, two units with the same phase response at the initial frequency may have different phase responses at a higher frequency. To describe the frequency coding property of a unit cell, digitalized frequency sensitivity is proposed, in which the units are encoded with digits “0” and “1” to represent the low and high phase sensitivities, respectively. By this merit, two degrees of freedom, spatial coding and frequency coding, are obtained to control the EM energy radiations by a new class of frequency‐spatial coding metamaterials. The above concepts and physical phenomena are confirmed by numerical simulations and experiments. PMID:28932671

  10. Controlling Energy Radiations of Electromagnetic Waves via Frequency Coding Metamaterials.

    PubMed

    Wu, Haotian; Liu, Shuo; Wan, Xiang; Zhang, Lei; Wang, Dan; Li, Lianlin; Cui, Tie Jun

    2017-09-01

    Metamaterials are artificial structures composed of subwavelength unit cells to control electromagnetic (EM) waves. The spatial coding representation of metamaterial has the ability to describe the material in a digital way. The spatial coding metamaterials are typically constructed by unit cells that have similar shapes with fixed functionality. Here, the concept of frequency coding metamaterial is proposed, which achieves different controls of EM energy radiations with a fixed spatial coding pattern when the frequency changes. In this case, not only are different phase responses of the unit cells considered, but different phase sensitivities are also required. Due to different frequency sensitivities of unit cells, two units with the same phase response at the initial frequency may have different phase responses at a higher frequency. To describe the frequency coding property of a unit cell, digitalized frequency sensitivity is proposed, in which the units are encoded with digits "0" and "1" to represent the low and high phase sensitivities, respectively. By this merit, two degrees of freedom, spatial coding and frequency coding, are obtained to control the EM energy radiations by a new class of frequency-spatial coding metamaterials. The above concepts and physical phenomena are confirmed by numerical simulations and experiments.
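The interplay of the "0"/"1" phase-sensitivity digits with a fixed spatial code can be illustrated with a toy one-dimensional array-factor calculation. Every parameter below (16 cells, 20 mm spacing, a 180°/GHz sensitivity for the "1" digit, the 8 GHz base frequency) is an assumption for illustration only, not the paper's actual unit-cell design:

```python
import numpy as np

# Toy model: 1-D array of unit cells with 1-bit spatial coding (reflection
# phase 0 or pi at f0) and 1-bit frequency coding (low/high phase drift
# with frequency). All numbers are illustrative assumptions.
def array_factor(spatial_bits, freq_bits, f_ghz, f0_ghz=8.0,
                 d_mm=20.0, slopes_deg_per_ghz=(0.0, 180.0), n_theta=721):
    c = 299.792458  # speed of light in mm*GHz
    k = 2 * np.pi * f_ghz / c
    n = np.arange(len(spatial_bits))
    # element phase = spatial bit (0/pi at f0) + sensitivity-dependent drift
    phase = (np.pi * np.asarray(spatial_bits, dtype=float)
             + np.deg2rad([slopes_deg_per_ghz[b] for b in freq_bits])
             * (f_ghz - f0_ghz))
    theta = np.linspace(-np.pi / 2, np.pi / 2, n_theta)
    field = np.exp(1j * (k * d_mm * np.outer(np.sin(theta), n) + phase))
    return theta, np.abs(field.sum(axis=1))

# "0101...": at f0 the alternating pi phases cancel broadside radiation;
# 1 GHz higher, the bit-1 cells have drifted by pi and the array re-phases
# into a single broadside beam -- a different pattern, same spatial code.
for f in (8.0, 9.0):
    theta, af = array_factor([0, 1] * 8, [0, 1] * 8, f_ghz=f)
    print(f, af[len(af) // 2])  # field magnitude at broadside
```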

  11. Comparison of Space Radiation Calculations from Deterministic and Monte Carlo Transport Codes

    NASA Technical Reports Server (NTRS)

    Adams, J. H.; Lin, Z. W.; Nasser, A. F.; Randeniya, S.; Tripathi, r. K.; Watts, J. W.; Yepes, P.

    2010-01-01

    The presentation outline includes motivation, radiation transport codes being considered, space radiation cases being considered, results for slab geometry, results from spherical geometry, and summary. The codes considered include HZETRN, UPROP, FLUKA, and Geant4, and the cases cover solar particle event (SPE) and galactic cosmic ray (GCR) environments in slab geometry.

  12. Code for Analyzing and Designing Spacecraft Power System Radiators

    NASA Technical Reports Server (NTRS)

    Juhasz, Albert

    2005-01-01

    GPHRAD is a computer code for analysis and design of disk or circular-sector heat-rejecting radiators for spacecraft power systems. A specific application is for Stirling-cycle/linear-alternator electric-power systems coupled to radioisotope general-purpose heat sources. GPHRAD affords capabilities and options to account for thermophysical properties (thermal conductivity, density) of either metal-alloy or composite radiator materials.

  13. HERACLES: a three-dimensional radiation hydrodynamics code

    NASA Astrophysics Data System (ADS)

    González, M.; Audit, E.; Huynh, P.

    2007-03-01

    Aims: We present a new three-dimensional radiation hydrodynamics code called HERACLES that uses an original moment method to solve the radiative transfer. Methods: The radiation transfer is modelled using a two-moment model and a closure relation that allows large angular anisotropies in the radiation field to be preserved and reproduced. The radiative equations thus obtained are solved by a second-order Godunov-type method and integrated implicitly by using iterative solvers. HERACLES has been parallelized with the MPI library and implemented in Cartesian, cylindrical, and spherical coordinates. To characterize the accuracy of HERACLES and to compare it with other codes, we performed a series of tests including purely radiative tests and radiation-hydrodynamics ones. Results: The results show that the physical model used in HERACLES for the transfer is fairly accurate in both the diffusion and transport limits, as well as for semi-transparent regions. Conclusions: This makes HERACLES very well-suited to studying many astrophysical problems such as radiative shocks, molecular jets of young stars, fragmentation and formation of dense cores in the interstellar medium, and protoplanetary discs. Appendices are only available in electronic form at http://www.aanda.org
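The closure relation in a two-moment scheme can be sketched numerically. The function below implements the standard M1-type Eddington factor widely used with such models; whether it matches HERACLES' exact closure is an assumption, so treat this as a generic illustration:

```python
import math

# Sketch of a standard M1-type closure for two-moment radiation schemes:
# the Eddington factor chi interpolates between the diffusion limit
# (chi = 1/3 at f = 0, isotropic radiation) and the free-streaming limit
# (chi = 1 at f = 1, fully beamed), where f = |F| / (c E) is the reduced flux.
def eddington_factor(f):
    assert 0.0 <= f <= 1.0, "reduced flux must lie in [0, 1]"
    return (3.0 + 4.0 * f * f) / (5.0 + 2.0 * math.sqrt(4.0 - 3.0 * f * f))

print(eddington_factor(0.0))  # 1/3: diffusion limit
print(eddington_factor(1.0))  # 1: transport (free-streaming) limit
```

This is how a two-moment model can preserve large angular anisotropies: the closure smoothly steepens from isotropic to fully beamed as the reduced flux grows.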

  14. The International Intercomparison of 3-Dimensional Radiation Codes

    NASA Technical Reports Server (NTRS)

    Cahalan, R. F.; Lau, William K. M. (Technical Monitor)

    2002-01-01

    I3RC (International Intercomparison of 3-dimensional Radiation Codes) has as its primary goal to compare a wide variety of three-dimensional (3D) radiative transfer methods applied to Earth's atmosphere, with a few selected cloud fields as input, and a few selected radiative quantities as output. Phases 1 and 2 are now complete, and participants represented institutions in Canada, France, Germany, Russia, the United Kingdom, and the USA, who met for two workshops in Tucson, Arizona USA, and compared results from 5 cloud fields of varying complexity, beginning with simplified atmosphere and surface, and proceeding to more realistic cases. Phase 3 is now underway, focusing on improvement and sharing of 3D radiation code, aided by working groups on "Approximations" and "Open Source". The "Approximations" group has so far focused on diffusive approximate methods in an attempt to gain advantages in execution time, and also to advance the understanding of 3D radiation processes. The "Open Source" subgroup is developing a Monte Carlo radiative transfer toolkit that makes state-of-the-art techniques available to a wide range of users. Activities of both subgroups are further explained at the I3RC website http://i3rc.gsfc.nasa.gov. Participants in I3RC are forming a 3D Working Group under the auspices of the International Radiation Commission, and will meet for this and related activities at a workshop in Tucson in November 2002.

  15. grmonty: A MONTE CARLO CODE FOR RELATIVISTIC RADIATIVE TRANSPORT

    SciTech Connect

    Dolence, Joshua C.; Gammie, Charles F.; Leung, Po Kin; Moscibrodzka, Monika

    2009-10-01

    We describe a Monte Carlo radiative transport code intended for calculating spectra of hot, optically thin plasmas in full general relativity. The version we describe here is designed to model hot accretion flows in the Kerr metric and therefore incorporates synchrotron emission and absorption, and Compton scattering. The code can be readily generalized, however, to account for other radiative processes and an arbitrary spacetime. We describe a suite of test problems, and demonstrate the expected N^(-1/2) convergence rate, where N is the number of Monte Carlo samples. Finally, we illustrate the capabilities of the code with a model calculation, a spectrum of the slowly accreting black hole Sgr A* based on data provided by a numerical general relativistic MHD model of the accreting plasma.
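The quoted N^(-1/2) convergence rate is a general property of Monte Carlo estimators and can be demonstrated on a toy problem unrelated to grmonty's physics:

```python
import math
import random

# Demonstrate the N^(-1/2) Monte Carlo convergence rate on a toy problem:
# estimating the mean (= 1) of a unit-mean exponential distribution from
# N samples, and measuring the RMS error of the estimator over many trials.
def mc_error(n, trials=200, seed=1):
    rng = random.Random(seed)
    sq_errs = []
    for _ in range(trials):
        est = sum(rng.expovariate(1.0) for _ in range(n)) / n
        sq_errs.append((est - 1.0) ** 2)
    return math.sqrt(sum(sq_errs) / trials)  # RMS error

# Quadrupling the sample count should roughly halve the RMS error
e1, e2 = mc_error(1000), mc_error(4000)
print(e1 / e2)  # close to 2, i.e. error ~ N^(-1/2)
```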

  16. Radiative transfer code SHARM for atmospheric and terrestrial applications.

    PubMed

    Lyapustin, A I

    2005-12-20

    An overview of the publicly available radiative transfer Spherical Harmonics code (SHARM) is presented. SHARM is a rigorous code, as accurate as the Discrete Ordinate Radiative Transfer (DISORT) code, yet faster. It performs simultaneous calculations for different solar zenith angles, view zenith angles, and view azimuths and allows the user to make multiwavelength calculations in one run. The Delta-M method is implemented for calculations with highly anisotropic phase functions. Rayleigh scattering is automatically included as a function of wavelength, surface elevation, and the selected vertical profile of one of the standard atmospheric models. The current version of the SHARM code does not explicitly include atmospheric gaseous absorption, which should be provided by the user. The SHARM code has several built-in models of the bidirectional reflectance of land and wind-ruffled water surfaces that are most widely used in research and satellite data processing. A modification of the SHARM code with the built-in Mie algorithm designed for calculations with spherical aerosols is also described.

  17. Escape factors in zero-dimensional radiation-transfer codes

    NASA Astrophysics Data System (ADS)

    Phillips, G. J.; Wark, J. S.; Kerr, F. M.; Rose, S. J.; Lee, R. W.

    2008-04-01

    Several zero-dimensional non-LTE radiation-transfer codes are in common use within the laser-plasma community (for example, RATION, FLY, FLYCHK and GALAXY). These codes are capable of generating calculated emission spectra for a plasma of given density and temperature in the presence of a radiation field. Although zero-dimensional in nature, these codes can take into account the coupling of radiation and populations by use of the escape factor method, and in this sense the codes incorporate the finite size of the plasma of interest in two ways - firstly in the calculation of the effect of the radiation on the populations and secondly when using these populations to generate a spectrum. Different lengths can be used within these two distinct operations, though it has not been made clear what these lengths should be. We submit that the appropriate length to use for the calculation of populations in such zero-dimensional codes is the mean chord of the system, whilst when calculating the spectrum the appropriate length is the size of the plasma along the line of sight. Indeed, for specific plasma shapes using the appropriate escape factors it can be shown that this interpretation agrees with analytic results. However, this is only the case if the correct escape factor is employed: use of the Holstein escape factor (which is in widely distributed versions of the codes mentioned above) is found to be significantly in error under most conditions. We also note that for the case where a plasma is close to coronal equilibrium, some limited information concerning the shape of the plasma can be extracted merely from the ratio of optically thick to optically thin lines, without the need for any explicit spatial resolution.
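The mean chord recommended above is, for any convex body, given by Cauchy's formula ⟨l⟩ = 4V/S. A quick numerical check for two familiar shapes:

```python
import math

# Cauchy's mean-chord formula for a convex body: <l> = 4 * V / S.
# This is the length the paper recommends for the population calculation
# in zero-dimensional codes.
def mean_chord(volume, surface_area):
    return 4.0 * volume / surface_area

r = 1.0
# Sphere of radius r: <l> = 4*(4/3*pi*r^3)/(4*pi*r^2) = 4r/3
print(mean_chord(4.0 / 3.0 * math.pi * r**3, 4.0 * math.pi * r**2))

# Cylinder of radius r, length L: <l> = 2rL/(L + r), approaching 2r for long L
L = 1.0e6
print(mean_chord(math.pi * r**2 * L,
                 2.0 * math.pi * r * L + 2.0 * math.pi * r**2))
```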

  18. Prototype demonstration of radiation therapy planning code system

    SciTech Connect

    Little, R.C.; Adams, K.J.; Estes, G.P.; Hughes, L.S. III; Waters, L.S.

    1996-09-01

    This is the final report of a one-year, Laboratory-Directed Research and Development project at the Los Alamos National Laboratory (LANL). Radiation therapy planning is the process by which a radiation oncologist plans a treatment protocol for a patient preparing to undergo radiation therapy. The objective is to develop a protocol that delivers sufficient radiation dose to the entire tumor volume, while minimizing dose to healthy tissue. Radiation therapy planning, as currently practiced in the field, suffers from inaccuracies made in modeling patient anatomy and radiation transport. This project investigated the ability to automatically model patient-specific, three-dimensional (3-D) geometries in advanced Los Alamos radiation transport codes (such as MCNP), and to efficiently generate accurate radiation dose profiles in these geometries via sophisticated physics modeling. Modern scientific visualization techniques were utilized. The long-term goal is that such a system could be used by a non-expert in a distributed computing environment to help plan the treatment protocol for any candidate radiation source. The improved accuracy offered by such a system promises increased efficacy and reduced costs for this important aspect of health care.

  19. NERO- a post-maximum supernova radiation transport code

    NASA Astrophysics Data System (ADS)

    Maurer, I.; Jerkstrand, A.; Mazzali, P. A.; Taubenberger, S.; Hachinger, S.; Kromer, M.; Sim, S.; Hillebrandt, W.

    2011-12-01

    The interpretation of supernova (SN) spectra is essential for deriving SN ejecta properties such as density and composition, which in turn can tell us about their progenitors and the explosion mechanism. A very large number of atomic processes are important for spectrum formation. Several tools for calculating SN spectra exist, but they mainly focus on the very early or late epochs. The intermediate phase, which requires a non-local thermodynamic equilibrium (NLTE) treatment of radiation transport has rarely been studied. In this paper, we present a new SN radiation transport code, NERO, which can look at those epochs. All the atomic processes are treated in full NLTE, under a steady-state assumption. This is a valid approach between roughly 50 and 500 days after the explosion depending on SN type. This covers the post-maximum photospheric and the early and the intermediate nebular phase. As a test, we compare NERO to the radiation transport code of Jerkstrand, Fransson & Kozma and to the nebular code of Mazzali et al. All three codes have been developed independently and a comparison provides a valuable opportunity to investigate their reliability. Currently, NERO is one-dimensional and can be used for predicting spectra of synthetic explosion models or for deriving SN properties by spectral modelling. To demonstrate this, we study the spectra of the 'normal' Type Ia supernova (SN Ia) 2005cf between 50 and 350 days after the explosion and identify most of the common SN Ia line features at post-maximum epochs.

  20. A Radiation Solver for the National Combustion Code

    NASA Technical Reports Server (NTRS)

    Sockol, Peter M.

    2015-01-01

    A methodology is given that converts an existing finite volume radiative transfer method that requires input of local absorption coefficients to one that can treat a mixture of combustion gases and compute the coefficients on the fly from the local mixture properties. The Full-spectrum k-distribution method is used to transform the radiative transfer equation (RTE) to an alternate wave number variable, g . The coefficients in the transformed equation are calculated at discrete temperatures and participating species mole fractions that span the values of the problem for each value of g. These results are stored in a table and interpolation is used to find the coefficients at every cell in the field. Finally, the transformed RTE is solved for each g and Gaussian quadrature is used to find the radiant heat flux throughout the field. The present implementation is in an existing cartesian/cylindrical grid radiative transfer code and the local mixture properties are given by a solution of the National Combustion Code (NCC) on the same grid. Based on this work the intention is to apply this method to an existing unstructured grid radiation code which can then be coupled directly to NCC.
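The reordering step at the heart of the full-spectrum k-distribution method can be sketched with a synthetic spectrum. The jagged kappa below is invented for illustration, not real combustion-gas data, and the sketch omits the temperature/mole-fraction tabulation the solver performs:

```python
import numpy as np

# Sketch of the k-distribution reordering central to the full-spectrum
# k-distribution (FSK) method: a jagged absorption spectrum kappa(eta) is
# re-indexed by its cumulative distribution g, giving a smooth, monotonic
# k(g) that only a few Gaussian quadrature points integrate accurately.
eta = np.linspace(0.0, 1.0, 100_001)
kappa = 1.0 + 0.9 * np.sin(40 * np.pi * eta) ** 2   # synthetic "spectrum"

direct = kappa.mean()                    # brute-force spectral average

k = np.sort(kappa)                       # k(g), with g the fraction of the
g_grid = np.linspace(0.0, 1.0, k.size)   # spectrum with absorption below k
xq, wq = np.polynomial.legendre.leggauss(8)          # 8-point quadrature
fsk = 0.5 * np.sum(wq * np.interp(0.5 * (xq + 1.0), g_grid, k))

print(direct, fsk)   # both close to 1.45: 8 points suffice after reordering
```

The reordered k(g) is smooth even though kappa(eta) oscillates rapidly, which is why a handful of quadrature points in g replace a fine spectral grid.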

  1. A more accurate nonequilibrium air radiation code - NEQAIR second generation

    NASA Technical Reports Server (NTRS)

    Moreau, Stephane; Laux, Christophe O.; Chapman, Dean R.; Maccormack, Robert W.

    1992-01-01

    Two experiments, one an equilibrium flow in a plasma torch at Stanford, the other a nonequilibrium flow in a SDIO/IST Bow-Shock-Ultra-Violet missile flight, have provided the basis for modifying, enhancing, and testing the well-known radiation code, NEQAIR. The original code, herein termed NEQAIR1, lacked computational efficiency, accurate data for some species and the flexibility to handle a variety of species. The modified code, herein termed NEQAIR2, incorporates recent findings in the spectroscopic and radiation models. It can handle any number of species and radiative bands in a gas whose thermodynamic state can be described by up to four temperatures. It provides a new capability of computing very fine spectra in a reasonable CPU time, while including transport phenomena along the line of sight and the characteristics of instruments that were used in the measurements. Such a new tool should allow more accurate testing and diagnosis of the different physical models used in numerical simulations of radiating, low density, high energy flows.

  2. Towards a 3D Space Radiation Transport Code

    NASA Technical Reports Server (NTRS)

    Wilson, J. W.; Tripathi, R. K.; Cucinotta, F. A.; Heinbockel, J. H.; Tweed, J.

    2002-01-01

    High-speed computational procedures for space radiation shielding have relied on asymptotic expansions in terms of the off-axis scatter and replacement of the general geometry problem by a collection of flat plates. This type of solution was derived for application to human rated systems in which the radius of the shielded volume is large compared to the off-axis diffusion limiting leakage at lateral boundaries. Over the decades these computational codes have become relatively complete, and lateral diffusion effects are now being added. The analysis for developing a practical full 3D space shielding code is presented.

  3. Validation of a comprehensive space radiation transport code.

    PubMed

    Shinn, J L; Cucinotta, F A; Simonsen, L C; Wilson, J W; Badavi, F F; Badhwar, G D; Miller, J; Zeitlin, C; Heilbronn, L; Tripathi, R K; Clowdsley, M S; Heinbockel, J H; Xapsos, M A

    1998-12-01

    The HZETRN code has been developed over the past decade to evaluate the local radiation fields within sensitive materials on spacecraft in the space environment. Most of the more important nuclear and atomic processes are now modeled, and evaluation within a complex spacecraft geometry with differing material components, including transition effects across boundaries of dissimilar materials, is included. The atomic/nuclear database and transport procedures have received limited validation in laboratory testing with high energy ion beams. The codes have been applied in design of the SAGE-III instrument resulting in material changes to control injurious neutron production, in the study of the Space Shuttle single event upsets, and in validation with space measurements (particle telescopes, tissue equivalent proportional counters, CR-39) on Shuttle and Mir. The present paper reviews the code development and presents recent results in laboratory and space flight validation.

  4. New Parallel computing framework for radiation transport codes

    SciTech Connect

    Kostin, M.A.; Mokhov, N.V.; Niita, K.; /JAERI, Tokai

    2010-09-01

    A new parallel computing framework has been developed to use with general-purpose radiation transport codes. The framework was implemented as a C++ module that uses MPI for message passing. The module is largely independent of the radiation transport codes it can be used with, and is connected to the codes by means of a number of interface functions. The framework was integrated with the MARS15 code, and an effort is under way to deploy it in PHITS. Besides the parallel computing functionality, the framework offers a checkpoint facility that allows restarting calculations with a saved checkpoint file. The checkpoint facility can be used in single process calculations as well as in the parallel regime. Several checkpoint files can be merged into one, thus combining results of several calculations. The framework also corrects some of the known problems with the scheduling and load balancing found in the original implementations of the parallel computing functionality in MARS15 and PHITS. The framework can be used efficiently on homogeneous systems and networks of workstations, where the interference from the other users is possible.
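The checkpoint-merging facility described above can be sketched as history-weighted averaging of tallies. The JSON layout and field names below are hypothetical, invented for this illustration, not the framework's actual checkpoint format:

```python
import json
import os
import tempfile

# Illustrative sketch of merging checkpoints from several runs into one
# result: tally means are combined weighted by the number of histories.
# The JSON schema ("histories", "tally_means") is a made-up stand-in.
def merge_checkpoints(paths):
    total_hist, sums = 0, {}
    for p in paths:
        with open(p) as fh:
            ckpt = json.load(fh)
        n = ckpt["histories"]
        total_hist += n
        for name, mean in ckpt["tally_means"].items():
            sums[name] = sums.get(name, 0.0) + mean * n  # history-weighted
    return {"histories": total_hist,
            "tally_means": {k: s / total_hist for k, s in sums.items()}}

# Demo: two runs, 1000 and 3000 histories, merged into a single checkpoint
paths = []
for hist, mean in [(1000, 1.0), (3000, 2.0)]:
    f = tempfile.NamedTemporaryFile("w", suffix=".json", delete=False)
    json.dump({"histories": hist, "tally_means": {"dose": mean}}, f)
    f.close()
    paths.append(f.name)
merged = merge_checkpoints(paths)
for p in paths:
    os.remove(p)
print(merged["tally_means"]["dose"])  # (1000*1.0 + 3000*2.0) / 4000 = 1.75
```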

  5. A model code for the radiative theta pinch

    SciTech Connect

    Lee, S.; Saw, S. H.; Lee, P. C. K.; Akel, M.; Damideh, V.; Khattak, N. A. D.; Mongkolnavin, R.; Paosawatyanyong, B.

    2014-07-15

    A model for the theta pinch is presented with three modelled phases of radial inward shock phase, reflected shock phase, and a final pinch phase. The governing equations for the phases are derived incorporating thermodynamics and radiation and radiation-coupled dynamics in the pinch phase. A code is written incorporating correction for the effects of transit delay of small disturbing speeds and the effects of plasma self-absorption on the radiation. Two model parameters are incorporated into the model, the coupling coefficient f between the primary loop current and the induced plasma current and the mass swept up factor f{sub m}. These values are taken from experiments carried out in the Chulalongkorn theta pinch.

  6. A model code for the radiative theta pinch

    NASA Astrophysics Data System (ADS)

    Lee, S.; Saw, S. H.; Lee, P. C. K.; Akel, M.; Damideh, V.; Khattak, N. A. D.; Mongkolnavin, R.; Paosawatyanyong, B.

    2014-07-01

    A model for the theta pinch is presented with three modelled phases of radial inward shock phase, reflected shock phase, and a final pinch phase. The governing equations for the phases are derived incorporating thermodynamics and radiation and radiation-coupled dynamics in the pinch phase. A code is written incorporating correction for the effects of transit delay of small disturbing speeds and the effects of plasma self-absorption on the radiation. Two model parameters are incorporated into the model, the coupling coefficient f between the primary loop current and the induced plasma current and the mass swept up factor fm. These values are taken from experiments carried out in the Chulalongkorn theta pinch.

  7. Evaluation of coded aperture radiation detectors using a Bayesian approach

    NASA Astrophysics Data System (ADS)

    Miller, Kyle; Huggins, Peter; Labov, Simon; Nelson, Karl; Dubrawski, Artur

    2016-12-01

    We investigate tradeoffs arising from the use of coded aperture gamma-ray spectrometry to detect and localize sources of harmful radiation in the presence of a noisy background. Using an example application scenario of area monitoring and search, we empirically evaluate weakly supervised spectral, spatial, and hybrid spatio-spectral algorithms for scoring individual observations, and two alternative methods of fusing evidence obtained from multiple observations. Results of our experiments confirm the intuition that directional information provided by spectrometers masked with coded aperture enables gains in source localization accuracy, but at the expense of reduced probability of detection. Losses in detection performance can, however, be reclaimed to a substantial extent by using our new spatial and spatio-spectral scoring methods, which rely on realistic assumptions regarding masking and its impact on measured photon distributions.
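One simple way to fuse evidence across multiple observations, in the spirit of the Bayesian evaluation described (the paper's actual scoring and fusion methods differ), is to accumulate per-observation log-likelihood ratios, which is the optimal rule for independent observations:

```python
import math

# Sketch: fuse independent Poisson count observations by summing their
# log-likelihood ratios for "background + source" vs "background only".
# Rates lam_bkg and lam_src are illustrative, not from the paper.
def fused_score(counts, lam_bkg, lam_src):
    """counts: photon counts per observation; positive score favors a source."""
    llr = 0.0
    for k in counts:
        # log of Poisson(k; bkg+src) / Poisson(k; bkg); factorials cancel
        llr += k * math.log((lam_bkg + lam_src) / lam_bkg) - lam_src
    return llr

print(fused_score([12, 15, 11], lam_bkg=10.0, lam_src=3.0))  # positive: source-like
print(fused_score([9, 10, 8], lam_bkg=10.0, lam_src=3.0))    # negative: background-like
```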

  8. 3D unstructured-mesh radiation transport codes

    SciTech Connect

    Morel, J.

    1997-12-31

    Three unstructured-mesh radiation transport codes are currently being developed at Los Alamos National Laboratory. The first code is ATTILA, which uses an unstructured tetrahedral mesh in conjunction with standard Sn (discrete-ordinates) angular discretization, standard multigroup energy discretization, and linear-discontinuous spatial differencing. ATTILA solves the standard first-order form of the transport equation using source iteration in conjunction with diffusion-synthetic acceleration of the within-group source iterations. ATTILA is designed to run primarily on workstations. The second code is DANTE, which uses a hybrid finite-element mesh consisting of arbitrary combinations of hexahedra, wedges, pyramids, and tetrahedra. DANTE solves several second-order self-adjoint forms of the transport equation including the even-parity equation, the odd-parity equation, and a new equation called the self-adjoint angular flux equation. DANTE also offers three angular discretization options: Sn (discrete-ordinates), Pn (spherical harmonics), and SPn (simplified spherical harmonics). DANTE is designed to run primarily on massively parallel message-passing machines, such as the ASCI-Blue machines at LANL and LLNL. The third code is PERICLES, which uses the same hybrid finite-element mesh as DANTE, but solves the standard first-order form of the transport equation rather than a second-order self-adjoint form. PERICLES uses a standard Sn discretization in angle in conjunction with trilinear-discontinuous spatial differencing, and diffusion-synthetic acceleration of the within-group source iterations. PERICLES was initially designed to run on workstations, but a version for massively parallel message-passing machines will be built. The three codes will be described in detail and computational results will be presented.

  9. Recent radiation damage studies and developments of the Marlowe code

    NASA Astrophysics Data System (ADS)

    Ortiz, C. J.; Souidi, A.; Becquart, C. S.; Domain, C.; Hou, M.

    2014-07-01

    Radiation damage in materials relevant to applications evolves over time scales spanning from the femtosecond - the characteristic time for an atomic collision - to decades - the aging time expected for nuclear materials. The relevant kinetic energies of atoms span from thermal motion to the MeV range. The question motivating this contribution is to identify the relationship between elementary atomic displacements triggered by irradiation and the subsequent microstructural evolution of metals in the long term. The Marlowe code, based on the binary collision approximation (BCA), is used to simulate the sequences of atomic displacements generated by energetic primary recoils, and the Object Kinetic Monte Carlo code LAKIMOCA, parameterized on a range of ab initio calculations, is used to predict the subsequent long-term evolution of point defects and clusters thereof. In agreement with full Molecular Dynamics, BCA displacement cascades in body-centered cubic (BCC) Fe and a face-centered cubic (FCC) Fe-Ni-Cr alloy display recursive properties that are found useful for predictions in the long term. The case of defect evolution in W due to external irradiation with energetic H and He is also discussed. To this purpose, it was useful to extend the inelastic energy loss model available in Marlowe up to the Bethe regime. The last version of the Marlowe code (version 15) was delivered before message-passing software (such as MPI) was available, but the structure of the code was designed in such a way as to permit parallel execution within a distributed-memory environment. This makes it possible to obtain N different cascades simultaneously using N independent nodes without any communication between processors. The parallelization of the code using MPI was recently achieved by one author of this report (C.J.O.). Typically, the parallelized version of Marlowe allows simulating millions of displacement cascades using a limited number of processors (<64) within only

  10. Comparison of space radiation calculations for deterministic and Monte Carlo transport codes

    NASA Astrophysics Data System (ADS)

    Lin, Zi-Wei; Adams, James; Barghouty, Abdulnasser; Randeniya, Sharmalee; Tripathi, Ram; Watts, John; Yepes, Pablo

    For space radiation protection of astronauts or electronic equipment, it is necessary to develop and use accurate radiation transport codes. Radiation transport codes include deterministic codes, such as HZETRN from NASA and UPROP from the Naval Research Laboratory, and Monte Carlo codes such as FLUKA, the Geant4 toolkit and HETC-HEDS. The deterministic codes and Monte Carlo codes complement each other in that deterministic codes are very fast while Monte Carlo codes are more elaborate. Therefore it is important to investigate how well the results of deterministic codes compare with those of Monte Carlo transport codes and where they differ. In this study we evaluate these different codes in their space radiation applications by comparing their output results in the same given space radiation environments, shielding geometry and material. Typical space radiation environments such as the 1977 solar minimum galactic cosmic ray environment are used as the well-defined input, and simple geometries made of aluminum, water and/or polyethylene are used to represent the shielding material. We then compare various outputs of these codes, such as the dose-depth curves and the flux spectra of different fragments and other secondary particles. These comparisons enable us to learn more about the main differences between these space radiation transport codes. At the same time, they help us to learn the qualitative and quantitative features that these transport codes have in common.

  11. A dual-sided coded-aperture radiation detection system

    NASA Astrophysics Data System (ADS)

    Penny, R. D.; Hood, W. E.; Polichar, R. M.; Cardone, F. H.; Chavez, L. G.; Grubbs, S. G.; Huntley, B. P.; Kuharski, R. A.; Shyffer, R. T.; Fabris, L.; Ziock, K. P.; Labov, S. E.; Nelson, K.

    2011-10-01

    We report the development of a large-area, mobile, coded-aperture radiation imaging system for localizing compact radioactive sources in three dimensions while rejecting distributed background. The 3D Stand-Off Radiation Detection System (SORDS-3D) has been tested at speeds up to 95 km/h and has detected and located sources in the millicurie range at distances of over 100 m. Radiation data are imaged to a geospatially mapped world grid with a nominal 1.25- to 2.5-m pixel pitch at distances out to 120 m on either side of the platform. Source elevation is also extracted. Imaged radiation alarms are superimposed on a side-facing video log that can be played back for direct localization of sources in buildings in urban environments. The system utilizes a 37-element array of 5×5×50 cm³ cesium-iodide (sodium) detectors. Scintillation light is collected by a pair of photomultiplier tubes placed at either end of each detector, with the detectors achieving an energy resolution of 6.15% FWHM (662 keV) and a position resolution along their length of 5 cm FWHM. The imaging system generates a dual-sided two-dimensional image allowing users to efficiently survey a large area. Imaged radiation data and raw spectra are forwarded to the RadioNuclide Analysis Kit (RNAK), developed by our collaborators, for isotope ID. An intuitive real-time display aids users in performing searches. Detector calibration is dynamically maintained by monitoring the potassium-40 peak and digitally adjusting individual detector gains. We have recently realized improvements, both in isotope identification and in distinguishing compact sources from background, through the installation of optimal-filter reconstruction kernels.
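The dynamic calibration described above (tracking the potassium-40 background peak and digitally adjusting each detector's gain) can be sketched as a smoothed feedback loop. The drift model and smoothing constant below are illustrative assumptions, not SORDS-3D's actual algorithm:

```python
# Sketch of dynamic gain calibration against the natural potassium-40
# background line at 1460.8 keV: nudge each detector's digital gain until
# the observed peak sits at the reference energy. Illustrative only.
K40_KEV = 1460.8

def update_gain(gain, observed_peak_kev, alpha=0.05):
    """Exponentially smoothed multiplicative gain correction."""
    return gain * (1.0 - alpha) + gain * alpha * (observed_peak_kev / K40_KEV)

# Toy drift model: the detector reads 1% high; the observed peak is the
# raw peak divided by the applied gain, so the loop settles at gain = drift.
drift, gain = 1.01, 1.00
for _ in range(200):
    observed = K40_KEV * drift / gain
    gain = update_gain(gain, observed)
print(round(gain, 4))  # -> 1.01, cancelling the drift
```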

  12. Operation of the helicopter antenna radiation prediction code

    NASA Technical Reports Server (NTRS)

    Braeden, E. W.; Klevenow, F. T.; Newman, E. H.; Rojas, R. G.; Sampath, K. S.; Scheik, J. T.; Shamansky, H. T.

    1993-01-01

    HARP is a front end as well as a back end for the AMC and NEWAIR computer codes. These codes use the Method of Moments (MM) and the Uniform Geometrical Theory of Diffraction (UTD), respectively, to calculate the electromagnetic radiation patterns for antennas on aircraft. The major difficulty in using these codes is in the creation of proper input files for particular aircraft and in verifying that these files are, in fact, what is intended. HARP creates these input files in a consistent manner and allows the user to verify them for correctness using sophisticated 2D and 3D graphics. After antenna field patterns are calculated using either MM or UTD, HARP can display the results on the user's screen or provide hardcopy output. Because HARP completely automates the process of collecting data, building the 3D models, and obtaining the calculated field patterns, the researcher's productivity can be many times what it would be if these operations were done by hand. A complete, step-by-step guide is provided so that the researcher can quickly learn to make use of all the capabilities of HARP.

  13. VISRAD, 3-D Target Design and Radiation Simulation Code

    NASA Astrophysics Data System (ADS)

    Golovkin, Igor; Macfarlane, Joseph; Golovkina, Viktoriya

    2016-10-01

    The 3-D view factor code VISRAD is widely used in designing HEDP experiments at major laser and pulsed-power facilities, including NIF, OMEGA, OMEGA-EP, ORION, LMJ, Z, and PLX. It simulates target designs by generating a 3-D grid of surface elements, utilizing a variety of 3-D primitives and surface removal algorithms, and can be used to compute the radiation flux throughout the surface element grid by computing element-to-element view factors and solving power balance equations. Target set-up and beam pointing are facilitated by allowing users to specify positions and angular orientations using a variety of coordinate systems (e.g., that of any laser beam, target component, or diagnostic port). Analytic modeling for laser beam spatial profiles for OMEGA DPPs and NIF CPPs is used to compute laser intensity profiles throughout the grid of surface elements. We will discuss recent improvements to the software package and plans for future developments.
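
The element-to-element view factors at the heart of such a code can be sketched for two small patches. This is a schematic of the standard differential view-factor formula, not VISRAD source; the patch geometry below is invented for illustration.

```python
import math

# For two small surface patches the differential view factor is
#     dF12 = cos(theta1) * cos(theta2) * A2 / (pi * r^2);
# summing radiosity * F_ij over all elements j gives the incident flux
# used in the power-balance solve.

def patch_view_factor(c1, n1, c2, n2, area2):
    """View factor from patch 1 (center c1, unit normal n1) to patch 2."""
    r = [b - a for a, b in zip(c1, c2)]          # vector from patch 1 to 2
    d = math.sqrt(sum(x * x for x in r))
    cos1 = sum(a * b for a, b in zip(n1, r)) / d
    cos2 = -sum(a * b for a, b in zip(n2, r)) / d
    if cos1 <= 0.0 or cos2 <= 0.0:
        return 0.0                               # patches face away from each other
    return cos1 * cos2 * area2 / (math.pi * d * d)

# Two 10 cm^2 patches facing each other 1 m apart along z:
f = patch_view_factor((0, 0, 0), (0, 0, 1), (0, 0, 1), (0, 0, -1), 0.001)
```

A production code adds surface removal (occlusion tests) and subdivides large elements, but the per-pair kernel is this formula.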

  14. Development, Verification and Validation of Enclosure Radiation Capabilities in the CHarring Ablator Response (CHAR) Code

    NASA Technical Reports Server (NTRS)

    Salazar, Giovanni; Droba, Justin C.; Oliver, Brandon; Amar, Adam J.

    2016-01-01

    With the recent development of multi-dimensional thermal protection system (TPS) material response codes, the capability to account for radiative heating is a requirement. This paper presents the recent efforts to implement such capabilities in the CHarring Ablator Response (CHAR) code developed at NASA's Johnson Space Center. This work also describes the different numerical methods implemented in the code to compute view factors for radiation problems involving multiple surfaces. Furthermore, verification and validation of the code's radiation capabilities are demonstrated by comparing solutions to analytical results, to other codes, and to radiant test data.

  15. HELIOS: A new open-source radiative transfer code

    NASA Astrophysics Data System (ADS)

    Malik, Matej; Grosheintz, Luc; Grimm, Simon Lukas; Mendonça, João; Kitzmann, Daniel; Heng, Kevin

    2015-12-01

    I present the new open-source code HELIOS, developed to accurately describe radiative transfer in a wide variety of irradiated atmospheres. We employ a one-dimensional multi-wavelength two-stream approach with scattering. Written in CUDA C++, HELIOS exploits the GPU's potential for massive parallelization and is able to compute the TP-profile of an atmosphere in radiative equilibrium and the subsequent emission spectrum in a few minutes on a single computer (for 60 layers and 1000 wavelength bins). The required molecular opacities are obtained with the recently published code HELIOS-K [1], which calculates the line shapes from an input line list and resamples the numerous line-by-line data into a manageable k-distribution format. Based on simple equilibrium chemistry theory [2] we combine the k-distribution functions of the molecules H2O, CO2, CO & CH4 to generate a k-table, which we then employ in HELIOS. I present our results of the following: (i) various numerical tests, e.g. isothermal vs. non-isothermal treatment of layers; (ii) comparison of iteratively determined TP-profiles with their analytical parametric prescriptions [3] and of the corresponding spectra; (iii) benchmarks of TP-profiles & spectra for various elemental abundances; (iv) benchmarks of averaged TP-profiles & spectra for the exoplanets GJ1214b, HD189733b & HD209458b; (v) comparison with secondary eclipse data for HD189733b, XO-1b & CoRoT-2b. HELIOS is being developed, together with the dynamical core THOR and the chemistry solver VULCAN, in the group of Kevin Heng at the University of Bern as part of the Exoclimes Simulation Platform (ESP) [4], an open-source project aimed at providing community tools to model exoplanetary atmospheres. [1] Grimm & Heng 2015, arXiv:1503.03806. [2] Heng, Lyons & Tsai, arXiv:1506.05501; Heng & Lyons, arXiv:1507.01944. [3] e.g. Heng, Mendonça & Lee 2014, ApJS, 215, 4. [4] exoclime.net
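
The k-distribution resampling the abstract attributes to HELIOS-K can be illustrated on mock opacities. This is a sketch of the general technique, not the HELIOS-K implementation; the lognormal opacity sample and the 8-point midpoint quadrature are assumptions made for the example.

```python
import math
import random

# Within one spectral bin, high-resolution line-by-line opacities are
# sorted into a cumulative distribution k(g); a handful of g-ordinates
# then reproduces the bin-mean transmission that would otherwise need
# thousands of wavelength points.

def k_table(k_lbl, n_g=8):
    """Resample sorted line-by-line opacities at n_g g-ordinate midpoints."""
    k_sorted = sorted(k_lbl)
    n = len(k_sorted)
    return [k_sorted[int((i + 0.5) / n_g * n)] for i in range(n_g)]

rng = random.Random(0)
k_lbl = [10 ** rng.gauss(-2, 1) for _ in range(10000)]   # mock opacities
u = 0.5                                                  # column mass (arbitrary units)
t_exact = sum(math.exp(-k * u) for k in k_lbl) / len(k_lbl)
t_kdist = sum(math.exp(-k * u) for k in k_table(k_lbl)) / 8
```

Because sorting makes k(g) smooth and monotonic, the coarse quadrature tracks the exact bin-mean transmission closely despite a 1000-fold reduction in points.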

  16. CODE's new solar radiation pressure model for GNSS orbit determination

    NASA Astrophysics Data System (ADS)

    Arnold, D.; Meindl, M.; Beutler, G.; Dach, R.; Schaer, S.; Lutz, S.; Prange, L.; Sośnica, K.; Mervart, L.; Jäggi, A.

    2015-08-01

    The Empirical CODE Orbit Model (ECOM) of the Center for Orbit Determination in Europe (CODE), which was developed in the early 1990s, is widely used in the International GNSS Service (IGS) community. For a rather long time, spurious spectral lines have been known to exist in geophysical parameters, in particular in the Earth Rotation Parameters (ERPs) and in the estimated geocenter coordinates, and these could recently be attributed to the ECOM. The effects grew creepingly with the increasing influence of the GLONASS system in recent years in the CODE analysis, which has been based on a rigorous combination of GPS and GLONASS since May 2003. In a first step we show that the problems associated with the ECOM are to the largest extent caused by GLONASS, which reached full deployment by the end of 2011. GPS-only, GLONASS-only, and combined GPS/GLONASS solutions using the observations in the years 2009-2011 of a global network of 92 combined GPS/GLONASS receivers were analyzed for this purpose. In a second step we review direct solar radiation pressure (SRP) models for GNSS satellites. We demonstrate that for GPS and GLONASS satellites only even-order short-period harmonic perturbations occur along the Sun-satellite direction, and only odd-order perturbations along the direction perpendicular to both the Sun-satellite vector and the spacecraft's solar panel axis. Based on this insight we assess in a third step the performance of four candidate orbit models for the future ECOM. The geocenter coordinates, the ERP differences w.r.t. the IERS 08 C04 series of ERPs, the misclosures for the midnight epochs of the daily orbital arcs, and scale parameters of Helmert transformations for station coordinates serve as quality criteria. The old and updated ECOM are validated in addition with satellite laser ranging (SLR) observations and by comparing the orbits to those of the IGS and other analysis centers. Based on all tests, we present a new extended ECOM which
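
The even/odd harmonic structure described above can be written down directly. The coefficient values below are placeholders, not CODE's estimated parameters; only the parity structure of the terms follows from the abstract.

```python
import math

# Empirical SRP accelerations as truncated harmonic series in du, the
# satellite's argument of latitude relative to the Sun: even-order terms
# along the Sun-satellite direction D, odd-order terms along B (the
# direction perpendicular to both D and the solar panel axis).

def accel_D(du, d0, d2=0.0, d4=0.0):
    """Acceleration along the Sun-satellite direction (even harmonics only)."""
    return d0 + d2 * math.cos(2 * du) + d4 * math.cos(4 * du)

def accel_B(du, b1=0.0, b3=0.0):
    """Acceleration along the B direction (odd harmonics only)."""
    return b1 * math.cos(du) + b3 * math.cos(3 * du)
```

The parity shows up as a symmetry: the D terms are unchanged under du → π − du, while the B terms flip sign, mirroring the symmetric illumination geometry about orbit noon/midnight.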

  17. Development and Verification of Enclosure Radiation Capabilities in the CHarring Ablator Response (CHAR) Code

    NASA Technical Reports Server (NTRS)

    Salazar, Giovanni; Droba, Justin C.; Oliver, Brandon; Amar, Adam J.

    2016-01-01

    With the recent development of multi-dimensional thermal protection system (TPS) material response codes, the capability to account for surface-to-surface radiation exchange in complex geometries is critical. This paper presents recent efforts to implement such capabilities in the CHarring Ablator Response (CHAR) code developed at NASA's Johnson Space Center. This work also describes the different numerical methods implemented in the code to compute geometric view factors for radiation problems involving multiple surfaces. Verification of the code's radiation capabilities and results of a code-to-code comparison are presented. Finally, a demonstration case of a two-dimensional ablating cavity with enclosure radiation accounting for a changing geometry is shown.

  18. Development of A Monte Carlo Radiation Transport Code System For HEDS: Status Update

    NASA Technical Reports Server (NTRS)

    Townsend, Lawrence W.; Gabriel, Tony A.; Miller, Thomas M.

    2003-01-01

    Modifications of the Monte Carlo radiation transport code HETC are underway to extend the code to include transport of energetic heavy ions, such as are found in the galactic cosmic ray spectrum in space. The new HETC code will be available for use in radiation shielding applications associated with missions, such as the proposed manned mission to Mars. In this work the current status of code modification is described. Methods used to develop the required nuclear reaction models, including total, elastic and nuclear breakup processes, and their associated databases are also presented. Finally, plans for future work on the extended HETC code system and for its validation are described.

  19. Radiation transport phenomena and modeling. Part A: Codes; Part B: Applications with examples

    SciTech Connect

    Lorence, L.J. Jr.; Beutler, D.E.

    1997-09-01

    This report contains the notes from the second session of the 1997 IEEE Nuclear and Space Radiation Effects Conference Short Course on Applying Computer Simulation Tools to Radiation Effects Problems. Part A discusses the physical phenomena modeled in radiation transport codes and various types of algorithmic implementations. Part B gives examples of how these codes can be used to design experiments whose results can be easily analyzed and describes how to calculate quantities of interest for electronic devices.

  20. International "Intercomparison of 3-Dimensional (3D) Radiation Codes" (I3RC)

    NASA Technical Reports Server (NTRS)

    Cahalan, Robert F.; Einaudi, Franco (Technical Monitor)

    2000-01-01

    An international "Intercomparison of 3-dimensional (3D) Radiation Codes" (I3RC) has been initiated. It is endorsed by the GEWEX Radiation Panel, and funded jointly by the United States Department of Energy ARM program, and by the National Aeronautics and Space Administration Radiation Sciences program. It is a 3-phase effort that has as its goals to: (1) understand the errors and limits of 3D methods; (2) provide 'baseline' cases for future 3D code development; (3) promote sharing of 3D tools; (4) derive guidelines for 3D tool selection; and (5) improve atmospheric science education in 3D radiation.

  1. Comparison of codes assessing galactic cosmic radiation exposure of aircraft crew.

    PubMed

    Bottollier-Depois, J F; Beck, P; Bennett, B; Bennett, L; Bütikofer, R; Clairand, I; Desorgher, L; Dyer, C; Felsberger, E; Flückiger, E; Hands, A; Kindl, P; Latocha, M; Lewis, B; Leuthold, G; Maczka, T; Mares, V; McCall, M J; O'Brien, K; Rollet, S; Rühm, W; Wissmann, F

    2009-10-01

    The assessment of the exposure to cosmic radiation onboard aircraft is one of the preoccupations of bodies responsible for radiation protection. Cosmic particle flux is significantly higher onboard aircraft than at ground level, and its intensity depends on the solar activity. The dose is usually estimated using codes validated by experimental data. In this paper, a comparison of various codes, some of them used routinely, is presented for assessing the dose received by aircraft crew from galactic cosmic radiation. Results are provided for periods close to solar maximum and minimum and for selected flights covering major commercial routes in the world. The overall agreement between the codes, particularly those routinely used for aircraft crew dosimetry, was better than ±20% from the median in all but two cases. The agreement among the codes is considered to be fully satisfactory for radiation protection purposes.

  2. Accelerating execution of the integrated TIGER series Monte Carlo radiation transport codes

    SciTech Connect

    Smith, L.M.; Hochstedler, R.D.

    1997-02-01

    Execution of the integrated TIGER series (ITS) of coupled electron/photon Monte Carlo radiation transport codes has been accelerated by modifying the FORTRAN source code for more efficient computation. Each member code of ITS was benchmarked and profiled with a specific test case that directed the acceleration effort toward the most computationally intensive subroutines. Techniques for accelerating these subroutines included replacing linear search algorithms with binary versions, replacing the pseudo-random number generator, reducing program memory allocation, and proofing the input files for geometrical redundancies. All techniques produced identical or statistically similar results to the original code. Final benchmark timing of the accelerated code resulted in speed-up factors of 2.00 for TIGER (the one-dimensional slab geometry code), 1.74 for CYLTRAN (the two-dimensional cylindrical geometry code), and 1.90 for ACCEPT (the arbitrary three-dimensional geometry code).
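
The first technique listed, replacing linear searches with binary versions, is easy to illustrate on a sorted cross-section energy grid. This is a generic Python sketch, not the ITS FORTRAN; the grid and function names are invented for the example.

```python
import bisect

# Both functions return the index i of the interval containing `energy`,
# i.e. grid[i] <= energy < grid[i+1]. The linear scan is O(n) per lookup;
# the binary search is O(log n) and returns identical results, which is
# exactly the property the acceleration effort relied on.

def find_bin_linear(grid, energy):
    for i in range(len(grid) - 1):
        if grid[i] <= energy < grid[i + 1]:
            return i
    raise ValueError("energy outside grid")

def find_bin_binary(grid, energy):
    i = bisect.bisect_right(grid, energy) - 1
    if 0 <= i < len(grid) - 1:
        return i
    raise ValueError("energy outside grid")

grid = [0.01 * 1.1 ** k for k in range(200)]   # logarithmically spaced energy grid
```

In an inner Monte Carlo loop executed billions of times, this kind of drop-in replacement is where speed-up factors near 2 typically come from.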

  3. A Monte Carlo Code for Relativistic Radiation Transport Around Kerr Black Holes

    NASA Technical Reports Server (NTRS)

    Schnittman, Jeremy David; Krolik, Julian H.

    2013-01-01

    We present a new code for radiation transport around Kerr black holes, including arbitrary emission and absorption mechanisms, as well as electron scattering and polarization. The code is particularly useful for analyzing accretion flows made up of optically thick disks and optically thin coronae. We give a detailed description of the methods employed in the code and also present results from a number of numerical tests to assess its accuracy and convergence.

  4. A MONTE CARLO CODE FOR RELATIVISTIC RADIATION TRANSPORT AROUND KERR BLACK HOLES

    SciTech Connect

    Schnittman, Jeremy D.; Krolik, Julian H. E-mail: jhk@pha.jhu.edu

    2013-11-01

    We present a new code for radiation transport around Kerr black holes, including arbitrary emission and absorption mechanisms, as well as electron scattering and polarization. The code is particularly useful for analyzing accretion flows made up of optically thick disks and optically thin coronae. We give a detailed description of the methods employed in the code and also present results from a number of numerical tests to assess its accuracy and convergence.

  5. Flexible Radiation Codes for Numerical Weather Prediction Across Space and Time Scales

    DTIC Science & Technology

    2013-09-30

    time and space scales, especially from regional models to global models. OBJECTIVES We are adapting radiation codes developed for climate ...PSrad is now complete, thoroughly tested and debugged, and is functioning as the radiation scheme in the climate model ECHAM 6.2 developed at the Max Planck...statistically significant change at most stations, indicating that errors in most places are not primarily driven by radiation errors. We are working

  6. Development and application of a reverse Monte Carlo radiative transfer code for rocket plume base heating

    NASA Technical Reports Server (NTRS)

    Everson, John; Nelson, H. F.

    1993-01-01

    A reverse Monte Carlo radiative transfer code to predict rocket plume base heating is presented. In this technique rays representing the radiation propagation are traced backwards in time from the receiving surface to the point of emission in the plume. This increases the computational efficiency relative to the forward Monte Carlo technique when calculating the radiation reaching a specific point, as only the rays that strike the receiving point are considered.
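
The backward-tracing idea can be sketched for a gray isothermal slab viewed by a receiving wall. This is an illustrative 1D reconstruction of the general reverse Monte Carlo technique, not the authors' plume code; the sampling scheme and parameters are assumptions.

```python
import math
import random

# Rays are launched from the receiving surface back into the emitting gas,
# so only rays that actually reach the receiver are ever generated. Here
# each backward ray samples an optical depth from exp(-tau); if the sampled
# emission point lies inside the slab, the ray contributes the (constant)
# source function, otherwise it escapes and contributes nothing.

def backward_mc_intensity(kappa, source, depth, n_rays=20000, seed=1):
    """Normal-incidence intensity at a wall facing an isothermal gray slab.

    kappa: absorption coefficient [1/m]; source: blackbody source term;
    depth: slab thickness [m]. The analytic answer for this geometry is
    source * (1 - exp(-kappa * depth)).
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_rays):
        tau = -math.log(rng.random())        # sampled emission optical depth
        if tau <= kappa * depth:
            total += source
    return total / n_rays

intensity = backward_mc_intensity(kappa=2.0, source=1.0, depth=1.0)
```

The estimator converges to the analytic slab solution, and, as the abstract notes, no computation is wasted on rays that would miss the receiving point.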

  7. TRAP/SEE Code Users Manual for Predicting Trapped Radiation Environments

    NASA Technical Reports Server (NTRS)

    Armstrong, T. W.; Colborn, B. L.

    2000-01-01

    TRAP/SEE is a PC-based computer code with a user-friendly interface which predicts the ionizing radiation exposure of spacecraft having orbits in the Earth's trapped radiation belts. The code incorporates the standard AP8 and AE8 trapped proton and electron models but also allows application of an improved database interpolation method. The code treats low-Earth as well as highly-elliptical Earth orbits, taking into account trajectory perturbations due to gravitational forces from the Moon and Sun, atmospheric drag, and solar radiation pressure. Orbit-average spectra, peak spectra per orbit, and instantaneous spectra at points along the orbit trajectory are calculated. Described in this report are the features, models, model limitations and uncertainties, input and output descriptions, and example calculations and applications for the TRAP/SEE code.
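
The orbit-average and peak spectra mentioned above amount to simple per-bin statistics over the sampled trajectory. This sketch shows only that bookkeeping; the real code obtains the instantaneous spectra from the AP8/AE8 models along a perturbed trajectory, which is not reproduced here.

```python
# Instantaneous spectra sampled at equal time steps along one orbit are
# averaged per energy bin; the per-bin maximum gives the peak spectrum.

def orbit_statistics(spectra):
    """spectra: list of per-timestep {energy_MeV: flux} dicts for one orbit."""
    n = len(spectra)
    bins = spectra[0].keys()
    average = {e: sum(s[e] for s in spectra) / n for e in bins}
    peak = {e: max(s[e] for s in spectra) for e in bins}
    return average, peak

# Three samples: high flux while crossing a belt, low flux elsewhere.
avg, peak = orbit_statistics([
    {1.0: 10.0, 10.0: 1.0},
    {1.0: 1000.0, 10.0: 50.0},
    {1.0: 10.0, 10.0: 1.0},
])
```

The large gap between average and peak in this toy example is why the code reports both: shielding margins and single-event effects are driven by the peak, total dose by the average.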

  8. TRAP/SEE Code Users Manual for Predicting Trapped Radiation Environments

    NASA Astrophysics Data System (ADS)

    Armstrong, T. W.; Colborn, B. L.

    2000-01-01

    TRAP/SEE is a PC-based computer code with a user-friendly interface which predicts the ionizing radiation exposure of spacecraft having orbits in the Earth's trapped radiation belts. The code incorporates the standard AP8 and AE8 trapped proton and electron models but also allows application of an improved database interpolation method. The code treats low-Earth as well as highly-elliptical Earth orbits, taking into account trajectory perturbations due to gravitational forces from the Moon and Sun, atmospheric drag, and solar radiation pressure. Orbit-average spectra, peak spectra per orbit, and instantaneous spectra at points along the orbit trajectory are calculated. Described in this report are the features, models, model limitations and uncertainties, input and output descriptions, and example calculations and applications for the TRAP/SEE code.

  9. ICRCCM (InterComparison of Radiation Codes used in Climate Models) Phase 2: Verification and calibration of radiation codes in climate models

    SciTech Connect

    Ellingson, R.G.; Wiscombe, W.J.; Murcray, D.; Smith, W.; Strauch, R.

    1990-01-01

    Following the finding by the InterComparison of Radiation Codes used in Climate Models (ICRCCM) of large differences among fluxes predicted by sophisticated radiation models that could not be sorted out because of the lack of a set of accurate atmospheric spectral radiation data measured simultaneously with the important radiative properties of the atmosphere, our team of scientists proposed to remedy the situation by carrying out a comprehensive program of measurement and analysis called SPECTRE (Spectral Radiance Experiment). SPECTRE will establish an absolute standard against which to compare models, and will aim to remove the hidden variables'' (unknown humidities, aerosols, etc.) which radiation modelers have invoked to excuse disagreements with observation. The data to be collected during SPECTRE will form the test bed for the second phase of ICRCCM, namely verification and calibration of radiation codes used to climate models. This should lead to more accurate radiation models for use in parameterizing climate models, which in turn play a key role in the prediction of trace-gas greenhouse effects. Overall, the project is proceeding much as had been anticipated in the original proposal. The most significant accomplishments to date include the completion of the analysis of the original ICRCCM calculations, the completion of the initial sensitivity analysis of the radiation calculations for the effects of uncertainties in the measurement of water vapor and temperature and the acquisition and testing of the inexpensive spectrometers for use in the field experiment. The sensitivity analysis and the spectrometer tests given us much more confidence that the field experiment will yield the quality of data necessary to make a significant tests of and improvements to radiative transfer models used in climate studies.

  10. Protection of the genome and central protein-coding sequences by non-coding DNA against DNA damage from radiation.

    PubMed

    Qiu, Guo-Hua

    2015-01-01

    Non-coding DNA comprises a very large proportion of the total genomic content in higher organisms, but its function remains largely unclear. Non-coding DNA sequences constitute the majority of peripheral heterochromatin, which has been hypothesized to be the genome's 'bodyguard' against DNA damage from chemicals and radiation for almost four decades. The bodyguard protective function of peripheral heterochromatin in genome defense has been strengthened by the results from numerous recent studies, which are summarized in this review. These data have suggested that cells and/or organisms with a higher level of heterochromatin and more non-coding DNA sequences, including longer telomeric DNA and rDNAs, exhibit a lower frequency of DNA damage, higher radioresistance and longer lifespan after IR exposure. In addition, the majority of heterochromatin is peripherally located in the three-dimensional structure of genome organization. Therefore, the peripheral heterochromatin with non-coding DNA could play a protective role in genome defense against DNA damage from ionizing radiation by both absorbing the radicals from water radiolysis in the cytosol and reducing the energy of IR. However, the bodyguard protection by heterochromatin has been challenged by the observation that DNA damage is less frequently detected in peripheral heterochromatin than in euchromatin, which is inconsistent with the expectation and simulation results. Previous studies have also shown that the DNA damage in peripheral heterochromatin is rarely repaired and moves more quickly, broadly and outwardly to approach the nuclear pore complex (NPC). Additionally, it has been shown that extrachromosomal circular DNAs (eccDNAs) are formed in the nucleus, highly detectable in the cytoplasm (particularly under stress conditions) and shuttle between the nucleus and the cytoplasm. Based on these studies, this review speculates that the sites of DNA damage in peripheral heterochromatin could occur more

  11. CRASH: A Block-adaptive-mesh Code for Radiative Shock Hydrodynamics—Implementation and Verification

    NASA Astrophysics Data System (ADS)

    van der Holst, B.; Tóth, G.; Sokolov, I. V.; Powell, K. G.; Holloway, J. P.; Myra, E. S.; Stout, Q.; Adams, M. L.; Morel, J. E.; Karni, S.; Fryxell, B.; Drake, R. P.

    2011-06-01

    We describe the Center for Radiative Shock Hydrodynamics (CRASH) code, a block-adaptive-mesh code for multi-material radiation hydrodynamics. The implementation solves the radiation diffusion model with a gray or multi-group method and uses a flux-limited diffusion approximation to recover the free-streaming limit. Electrons and ions are allowed to have different temperatures and we include flux-limited electron heat conduction. The radiation hydrodynamic equations are solved in the Eulerian frame by means of a conservative finite-volume discretization in either one-, two-, or three-dimensional slab geometry or in two-dimensional cylindrical symmetry. An operator-split method is used to solve these equations in three substeps: (1) an explicit step of a shock-capturing hydrodynamic solver; (2) a linear advection of the radiation in frequency-logarithm space; and (3) an implicit solution of the stiff radiation diffusion, heat conduction, and energy exchange. We present a suite of verification test problems to demonstrate the accuracy and performance of the algorithms. The applications are for astrophysics and laboratory astrophysics. The CRASH code is an extension of the Block-Adaptive Tree Solarwind Roe Upwind Scheme (BATS-R-US) code with a new radiation transfer and heat conduction library and equation-of-state and multi-group opacity solvers. Both CRASH and BATS-R-US are part of the publicly available Space Weather Modeling Framework.
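
The three-substep operator splitting can be caricatured with a scalar update. This is a schematic stand-in, not the CRASH solvers; substep (3) is reduced to a backward-Euler relaxation to show why the stiff terms are treated implicitly.

```python
# One split timestep: (1) explicit shock-capturing hydro step, (2) advection
# substep, (3) implicit treatment of the stiff terms. The implicit form
# u_new = u / (1 + dt * rate) stays stable and positive even when
# dt * rate >> 1, which is the point of splitting the stiff physics out.

def advance(u, dt, explicit_rhs, advect, rate):
    u = u + dt * explicit_rhs(u)     # (1) explicit hydrodynamic update
    u = advect(u, dt)                # (2) advection substep
    u = u / (1.0 + dt * rate)        # (3) implicit relaxation toward zero
    return u

# With dt * rate = 1000, an explicit update of step (3) would produce a
# large negative value; the implicit form remains positive and bounded.
u_new = advance(1.0, 1.0, lambda u: 0.0, lambda u, dt: u, 1000.0)
```

In the real code the scalar relaxation is a coupled linear solve for radiation diffusion, heat conduction, and energy exchange, but the stability argument is the same.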

  12. CRASH: A Block-Adaptive-Mesh Code for Radiative Shock Hydrodynamics

    NASA Astrophysics Data System (ADS)

    van der Holst, B.; Toth, G.; Sokolov, I. V.; Powell, K. G.; Holloway, J. P.; Myra, E. S.; Stout, Q.; Adams, M. L.; Morel, J. E.; Drake, R. P.

    2011-01-01

    We describe the CRASH (Center for Radiative Shock Hydrodynamics) code, a block adaptive mesh code for multi-material radiation hydrodynamics. The implementation solves the radiation diffusion model with the gray or multigroup method and uses a flux limited diffusion approximation to recover the free-streaming limit. The electrons and ions are allowed to have different temperatures and we include a flux limited electron heat conduction. The radiation hydrodynamic equations are solved in the Eulerian frame by means of a conservative finite volume discretization in either one, two, or three-dimensional slab geometry or in two-dimensional cylindrical symmetry. An operator split method is used to solve these equations in three substeps: (1) solve the hydrodynamic equations with shock-capturing schemes, (2) a linear advection of the radiation in frequency-logarithm space, and (3) an implicit solve of the stiff radiation diffusion, heat conduction, and energy exchange. We present a suite of verification test problems to demonstrate the accuracy and performance of the algorithms. The CRASH code is an extension of the Block-Adaptive Tree Solarwind Roe Upwind Scheme (BATS-R-US) code with this new radiation transfer and heat conduction library and equation-of-state and multigroup opacity solvers. Both CRASH and BATS-R-US are part of the publicly available Space Weather Modeling Framework (SWMF).

  13. CRASH: A BLOCK-ADAPTIVE-MESH CODE FOR RADIATIVE SHOCK HYDRODYNAMICS-IMPLEMENTATION AND VERIFICATION

    SciTech Connect

    Van der Holst, B.; Toth, G.; Sokolov, I. V.; Myra, E. S.; Fryxell, B.; Drake, R. P.; Powell, K. G.; Holloway, J. P.; Stout, Q.; Adams, M. L.; Morel, J. E.; Karni, S.

    2011-06-01

    We describe the Center for Radiative Shock Hydrodynamics (CRASH) code, a block-adaptive-mesh code for multi-material radiation hydrodynamics. The implementation solves the radiation diffusion model with a gray or multi-group method and uses a flux-limited diffusion approximation to recover the free-streaming limit. Electrons and ions are allowed to have different temperatures and we include flux-limited electron heat conduction. The radiation hydrodynamic equations are solved in the Eulerian frame by means of a conservative finite-volume discretization in either one-, two-, or three-dimensional slab geometry or in two-dimensional cylindrical symmetry. An operator-split method is used to solve these equations in three substeps: (1) an explicit step of a shock-capturing hydrodynamic solver; (2) a linear advection of the radiation in frequency-logarithm space; and (3) an implicit solution of the stiff radiation diffusion, heat conduction, and energy exchange. We present a suite of verification test problems to demonstrate the accuracy and performance of the algorithms. The applications are for astrophysics and laboratory astrophysics. The CRASH code is an extension of the Block-Adaptive Tree Solarwind Roe Upwind Scheme (BATS-R-US) code with a new radiation transfer and heat conduction library and equation-of-state and multi-group opacity solvers. Both CRASH and BATS-R-US are part of the publicly available Space Weather Modeling Framework.

  14. Radiation and confinement in 0D fusion systems codes

    NASA Astrophysics Data System (ADS)

    Lux, H.; Kemp, R.; Fable, E.; Wenninger, R.

    2016-07-01

    In systems modelling for fusion power plants, it is essential to robustly predict the performance of a given machine design (including its respective operating scenario). One measure of machine performance is the energy confinement time τ_E, which is typically predicted from experimentally derived confinement scaling laws (e.g. IPB98(y,2)). However, the conventionally used scaling laws have been derived for ITER, which, unlike a fusion power plant, will not have significant radiation inside the separatrix. In the absence of a new confinement scaling relevant to high core radiation, we propose an ad hoc correction to the loss power P_L used in the ITER confinement scaling and to the calculation of the stored energy W_th by the radiation losses from the 'core' of the plasma, P_rad,core. Using detailed ASTRA/TGLF simulations, we find that an appropriate definition of P_rad,core is given by 60% of all radiative losses inside a normalised minor radius ρ_core = 0.75. We consider this an improvement for current design predictions, but it is far from an ideal solution. We therefore encourage more detailed experimental and theoretical work on this issue.
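
The proposed correction is simple enough to state in code. The shell data below are invented for illustration; only the 60% fraction and the ρ_core = 0.75 threshold come from the abstract.

```python
# P_rad,core is taken as 60% of the radiative losses inside normalised
# minor radius rho = 0.75, and subtracted from the loss power used in the
# confinement scaling.

def p_rad_core(shells, rho_core=0.75, fraction=0.6):
    """shells: list of (rho, radiated_power) pairs; returns the core share."""
    return fraction * sum(p for rho, p in shells if rho <= rho_core)

def corrected_loss_power(p_loss, shells):
    return p_loss - p_rad_core(shells)

# Ten shells radiating 1 MW each; seven lie inside rho = 0.75.
shells = [((i + 1) / 10, 1.0) for i in range(10)]
p_l_corr = corrected_loss_power(100.0, shells)
```

A systems code would feed the corrected loss power into the τ_E scaling in place of the raw P_L, restoring the scaling's validity for radiative scenarios.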

  15. Transit: Radiative-transfer code for planetary atmospheres

    NASA Astrophysics Data System (ADS)

    Cubillos, Patricio; Blecic, Jasmina; Harrington, Joe; Rojo, Patricio; Foster, Austin J.; Stemm, Madison; Challener, Ryan; Foster, Andrew S. D.

    2017-04-01

    Transit calculates the transmission or emission spectrum of a planetary atmosphere with application to extrasolar-planet transit and eclipse observations, respectively. It computes the spectra by solving the one-dimensional line-by-line radiative-transfer equation for an atmospheric model.
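
The slant-transmission geometry behind such a spectrum can be sketched with a one-layer model. This is a schematic, not the Transit source, which solves the full line-by-line radiative transfer over many atmospheric layers.

```python
import math

# Wavelength-dependent transit depth from an opaque planetary disk plus an
# annular atmosphere whose transparency is set by the slant optical depth
# tau at each wavelength.

def transit_depth(rp, rstar, h, tau):
    """Fractional stellar flux blocked at one wavelength.

    rp, rstar: planet and star radii; h: atmosphere layer thickness;
    tau: slant optical depth through the annulus at this wavelength.
    """
    a_disk = math.pi * rp ** 2
    a_annulus = math.pi * ((rp + h) ** 2 - rp ** 2)
    blocked = a_disk + a_annulus * (1.0 - math.exp(-tau))
    return blocked / (math.pi * rstar ** 2)
```

In transparent spectral regions the depth approaches (rp/rstar)², while in saturated lines it approaches ((rp+h)/rstar)²; the modulation between the two limits is the transmission spectrum.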

  16. The Performance of Current Atmospheric Radiation Codes in Phase I of CIRC

    NASA Technical Reports Server (NTRS)

    Oreopoulos, L.; Mlawer, E.; Shippert, T.; Cole, J.; Fomin, B.; Iacono, M.; Jin, Z.; Li, J.; Manners, J.; Raisanen, P.; Rose, F.; Zhang, Y.; Wilson, M.; Rossow, W.

    2012-01-01

    The Continual Intercomparison of Radiation Codes (CIRC) is intended as an evolving and regularly updated reference source for evaluation of radiative transfer (RT) codes used in Global Climate Models and other atmospheric applications. In our presentation we will discuss our evaluation of the performance of 13 shortwave and 11 longwave RT codes that participated in Phase I of CIRC. CIRC differs from previous intercomparisons in that it relies on an observationally validated catalogue of cases. The seven CIRC Phase I baseline cases, five cloud-free, and two with overcast liquid clouds, are built around observations by the Atmospheric Radiation Measurements (ARM) program that satisfy the goals of Phase I, namely to examine RT model performance in realistic, yet not overly complex, atmospheric conditions. Besides the seven baseline cases, additional idealized "subcases" are also examined to facilitate interpretation of model errors. We will quantify individual model performance with respect to reference line-by-line calculations, and will also highlight RT code behavior for conditions of doubled CO2, aspects of utilizing a spectral specification of surface albedo, and the impact of the inclusion of scattering in the thermal infrared. Our analysis suggests that RT codes should work towards improving their calculation of diffuse shortwave flux, shortwave absorption, treatment of spectral surface albedo, and shortwave CO2 forcing. Despite practical difficulties in comparing our results to previous results by the Intercomparison of Radiation Codes in Climate Models (ICRCCM) conducted about 20 years ago, it appears that the current generation of RT codes does indeed perform better than the codes of the ICRCCM era. By enhancing the range of conditions under which participating codes are tested, future CIRC phases will hopefully allow even more rigorous examination of RT code performance.

  17. General relativistic radiative transfer code in rotating black hole space-time: ARTIST

    NASA Astrophysics Data System (ADS)

    Takahashi, Rohta; Umemura, Masayuki

    2017-02-01

    We present a general relativistic radiative transfer code, ARTIST (Authentic Radiative Transfer In Space-Time), a perfectly causal scheme that pursues the propagation of radiation with absorption and scattering around a Kerr black hole. The code explicitly solves for the invariant radiation intensity along null geodesics in Kerr-Schild coordinates, and therefore properly includes light bending, Doppler boosting, frame dragging, and gravitational redshift. A notable aspect of ARTIST is that it conserves the radiative energy with high accuracy and is not subject to numerical diffusion, since the transfer is solved on long characteristics along null geodesics. We first solve the wavefront propagation around a Kerr black hole that was originally explored by Hanni. This demonstrates repeated wavefront collisions, light bending, and causal propagation of radiation at the speed of light. We show that the decay rate of the total energy of wavefronts near a black hole is determined solely by the black hole spin in late phases, in agreement with analytic expectations. As a result, ARTIST turns out to correctly solve the general relativistic radiation fields until late phases as t ≈ 90 M. We also explore the effects of absorption and scattering, and apply the code to a photon wall problem and an orbiting hotspot problem. All the simulations in this study are performed in the equatorial plane around a Kerr black hole. ARTIST is a first step toward general relativistic radiation hydrodynamics.
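
    ARTIST's long characteristics follow null geodesics, along which light bending arises naturally. As a much simplified illustration (not ARTIST's Kerr-Schild formulation), the sketch below integrates the equatorial photon orbit equation u'' = -u + 3Mu² for a non-rotating (Schwarzschild) black hole and recovers the weak-field bending angle of about 4M/b; the step size and impact parameter are arbitrary choices.

```python
import math

def photon_deflection(M, b, h=1e-4):
    """Integrate the equatorial photon orbit u'' = -u + 3*M*u**2 (u = 1/r)
    with RK4 in the azimuthal angle phi, from far away, past the turning
    point, and back out; return total swept angle minus pi."""
    u, du, phi = 1e-8, 1.0 / b, 0.0

    def acc(u):
        return -u + 3.0 * M * u * u

    while u >= 0.0:                 # u drops below zero once the ray escapes
        k1u, k1d = du, acc(u)
        k2u, k2d = du + 0.5 * h * k1d, acc(u + 0.5 * h * k1u)
        k3u, k3d = du + 0.5 * h * k2d, acc(u + 0.5 * h * k2u)
        k4u, k4d = du + h * k3d, acc(u + h * k3u)
        u += h * (k1u + 2 * k2u + 2 * k3u + k4u) / 6.0
        du += h * (k1d + 2 * k2d + 2 * k3d + k4d) / 6.0
        phi += h
    return phi - math.pi

deflection = photon_deflection(M=1.0, b=50.0)   # weak-field prediction: 4*M/b
```

    For b = 50M the integrated bending angle agrees with the 4M/b estimate to within higher-order corrections.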

  18. Evaluation of the ECHAM family radiation codes performance in the representation of the solar signal

    NASA Astrophysics Data System (ADS)

    Sukhodolov, T.; Rozanov, E.; Shapiro, A. I.; Anet, J.; Cagnazzo, C.; Peter, T.; Schmutz, W.

    2014-12-01

    Solar radiation is the main source of energy for the Earth's atmosphere and in many respects defines its composition, photochemistry, temperature profile and dynamics. The magnitude of the solar irradiance variability depends strongly on the wavelength, which makes its representation in climate models difficult. Due to deficiencies in the applied radiation codes, several models fail to show a clear response of middle stratospheric heating rates to solar spectral irradiance variability; it is therefore important to evaluate model performance in this respect before doing multiple runs. In this work we evaluate the performance of three generations of ECHAM (4, 5 and 6) solar radiation schemes by comparison with the reference high-resolution libRadtran code. We found that all original ECHAM radiation codes miss almost all of the solar signal in the heating rates in the mesosphere. In the stratosphere, the two-band ECHAM4 code (E4) has an almost negligible radiative response to solar irradiance changes and the six-band ECHAM5 code (E5c) reproduces only about half of the reference signal, while the representation in the ECHAM6 code (E6) is better - it misses at most about 15% in the upper stratosphere. On the basis of the comparison results we suggest necessary improvements to the ECHAM family codes through the inclusion of available parameterizations of the heating rate due to absorption by oxygen (O2) and ozone (O3). The improvement is presented for E5c and E6; with the introduced parameterizations, both codes represent the heating rate response to the spectral solar irradiance variability simulated with libRadtran much better, without a substantial increase in computer time. The suggested parameterizations are recommended for the middle-atmosphere versions of the ECHAM5 and ECHAM6 models for the study of the solar irradiance influence on climate.
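
    The heating rates being compared derive from the divergence of radiative flux across model layers. A minimal sketch of this relationship, with an entirely invented single-band absorber profile standing in for the ozone parameterization (none of ECHAM's or libRadtran's actual coefficients appear here):

```python
import numpy as np

g, cp = 9.81, 1004.0                      # gravity (m/s2), heat capacity (J/kg/K)
p = np.linspace(100.0, 100000.0, 200)     # pressure levels, Pa (top -> surface)
q_abs = 8e-6 * np.exp(-(np.log(p / 2000.0))**2)  # toy absorber mixing ratio
k_abs = 200.0                             # mass absorption coefficient, m2/kg (assumed)
F_top = 10.0                              # incident band flux, W/m2 (assumed)

# optical depth accumulated downward: dtau = k * q * dp / g
dtau = k_abs * 0.5 * (q_abs[1:] + q_abs[:-1]) * np.diff(p) / g
tau = np.concatenate(([0.0], np.cumsum(dtau)))
F = F_top * np.exp(-tau)                  # downward band flux at each level

# layer heating rate Q = -(g/cp) * dF/dp in K/s, converted to K/day
Q = -(g / cp) * np.diff(F) / np.diff(p)
Q_Kday = Q * 86400.0
```

    With these illustrative numbers the peak heating is of order 1 K/day, the magnitude at which errors of 15% or 50% in the solar response become climatically relevant.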

  19. Fan Noise Prediction System Development: Source/Radiation Field Coupling and Workstation Conversion for the Acoustic Radiation Code

    NASA Technical Reports Server (NTRS)

    Meyer, H. D.

    1993-01-01

    The Acoustic Radiation Code (ARC) is a finite element program used on the IBM mainframe to predict far-field acoustic radiation from a turbofan engine inlet. In this report, requirements for developers of internal aerodynamic codes regarding use of their program output as input for the ARC are discussed. More specifically, the particular input needed from the Bolt, Beranek and Newman/Pratt and Whitney (turbofan source noise generation) Code (BBN/PWC) is described. In a separate analysis, a method of coupling the source and radiation models that recognizes waves crossing the interface in both directions has been derived. A preliminary version of the coupled code has been developed and used for initial evaluation of coupling issues. Results thus far have shown that reflection from the inlet is sufficient to indicate that full coupling of the source and radiation fields is needed for accurate noise predictions. Also, for this contract, the ARC has been modified for use on the Sun and Silicon Graphics Iris UNIX workstations. Changes and additions involved in this effort are described in an appendix.

  20. MORSE Monte Carlo radiation transport code system. [Sample problems

    SciTech Connect

    Emmett, M.B.

    1984-07-02

    For a number of years the MORSE user community has requested additional help in setting up problems using various options. The sample problems distributed with MORSE did not fully demonstrate the capability of the code. At Oak Ridge National Laboratory the code originators had a complete set of sample problems, but funds for documenting and distributing them were never available. Recently the number of requests for listings of input data and results for running some particular option the user was trying to implement has increased to the point where it is not feasible to handle them on an individual basis. Consequently it was decided to package a set of sample problems which illustrates more adequately how to run MORSE. This write-up may be added to Part III of the MORSE report. These sample problems include a combined neutron-gamma case, a neutron only case, a gamma only case, an adjoint case, a fission case, a time-dependent fission case, the collision density case, an XCHEKR run and a PICTUR run.

  1. Radiative transfer codes for atmospheric correction and aerosol retrieval: intercomparison study.

    PubMed

    Kotchenova, Svetlana Y; Vermote, Eric F; Levy, Robert; Lyapustin, Alexei

    2008-05-01

    Results are summarized for a scientific project devoted to the comparison of four atmospheric radiative transfer codes incorporated into different satellite data processing algorithms, namely, 6SV1.1 (second simulation of a satellite signal in the solar spectrum, vector, version 1.1), RT3 (radiative transfer), MODTRAN (moderate resolution atmospheric transmittance and radiance code), and SHARM (spherical harmonics). The performance of the codes is tested against well-known benchmarks, such as Coulson's tabulated values and a Monte Carlo code. The influence of revealed differences on aerosol optical thickness and surface reflectance retrieval is estimated theoretically by using a simple mathematical approach. All information about the project can be found at http://rtcodes.ltdri.org.

  2. Vector radiative transfer code SORD: Performance analysis and quick start guide

    NASA Astrophysics Data System (ADS)

    Korkin, Sergey; Lyapustin, Alexei; Sinyuk, Alexander; Holben, Brent; Kokhanovsky, Alexander

    2017-10-01

    We present a new open source polarized radiative transfer code SORD written in Fortran 90/95. SORD numerically simulates propagation of monochromatic solar radiation in a plane-parallel atmosphere over a reflecting surface using the method of successive orders of scattering (hence the name). Thermal emission is ignored. We did not improve the method in any way, but report the accuracy and runtime in 52 benchmark scenarios. This paper also serves as a quick start user's guide for the code available from ftp://maiac.gsfc.nasa.gov/pub/skorkin, from the JQSRT website, or from the corresponding (first) author.
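
    The method of successive orders of scattering builds the radiation field by adding the once-scattered, twice-scattered, and higher-order contributions. A toy version for the source function in a homogeneous slab, using a two-stream exponential kernel in place of the exact exponential-integral kernel (all parameters illustrative, not SORD's actual discretization):

```python
import numpy as np

omega, tau0, n_lev = 0.9, 2.0, 201        # single-scattering albedo, slab depth, grid
tau = np.linspace(0.0, tau0, n_lev)
dtau = tau[1] - tau[0]

J1 = omega / 4.0 * np.exp(-tau)           # order-1 source: singly scattered overhead beam
K = (omega / 2.0) * 0.5 * np.exp(-np.abs(tau[:, None] - tau[None, :])) * dtau

total, term = J1.copy(), J1.copy()
for n in range(2, 200):                   # add scattering orders 2, 3, ...
    term = K @ term
    total += term
    if term.max() < 1e-12:
        break

# the converged sum must satisfy the integral equation total = J1 + K @ total
residual = np.abs(total - (J1 + K @ total)).max()
```

    Because the kernel norm is bounded by the single-scattering albedo, the order-by-order sum converges geometrically; conservative scattering (omega near 1) is where the method becomes slow.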

  3. On the Development of a Deterministic Three-Dimensional Radiation Transport Code

    NASA Technical Reports Server (NTRS)

    Rockell, Candice; Tweed, John

    2011-01-01

    Since astronauts on future deep space missions will be exposed to dangerous radiation, there is a need to accurately model the transport of radiation through shielding materials and to estimate the received radiation dose. In response to this need, a three-dimensional deterministic code for space radiation transport is now under development. The new code GRNTRN is based on a Green's function solution of the Boltzmann transport equation that is constructed in the form of a Neumann series. Analytical approximations will be obtained for the first three terms of the Neumann series and the remainder will be estimated by a non-perturbative technique. This work discusses progress made to date and exhibits some computations based on the first two Neumann series terms.
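
    The Neumann-series construction can be illustrated on a toy 1D straight-ahead transport problem: the uncollided flux is obtained from the Green's function of the streaming-plus-removal operator, and each further series term adds one more collision. This is a discretized caricature, not GRNTRN's actual Green's function:

```python
import numpy as np

n = 100
x = np.linspace(0.0, 5.0, n)
dx = x[1] - x[0]
sigma_t, sigma_s = 1.0, 0.4               # total and scattering cross sections

# Green's function of streaming + removal: attenuated downstream propagation
G = np.tril(np.exp(-sigma_t * (x[:, None] - x[None, :]))) * dx

q = np.exp(-x)                            # external source
psi0 = G @ q                              # uncollided (first Neumann) term
series, term = psi0.copy(), psi0.copy()
for _ in range(50):                       # once-collided, twice-collided, ...
    term = G @ (sigma_s * term)
    series += term

# the series must agree with the direct solve of (I - sigma_s*G) psi = psi0
psi_direct = np.linalg.solve(np.eye(n) - sigma_s * G, psi0)
err = np.abs(series - psi_direct).max()
```

    Truncating after two or three terms, as the abstract describes for GRNTRN, is accurate when the scattering ratio sigma_s/sigma_t is small; the remainder then needs only a rough estimate.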

  4. Method for calculating internal radiation and ventilation with the ADINAT heat-flow code

    SciTech Connect

    Butkovich, T.R.; Montan, D.N.

    1980-04-01

    One objective of the spent fuel test in Climax Stock granite (SFTC) is to correctly model the thermal transport and the changes in the stress field and accompanying displacements from the application of the thermal loads. We have chosen the ADINA and ADINAT finite element codes for these calculations. ADINAT is a heat transfer code compatible with the ADINA displacement and stress analysis code. The heat flow problem encountered at SFTC requires a code with conduction, radiation, and ventilation capabilities, which the present version of ADINAT does not have. We have devised a method for calculating internal radiation and ventilation with the ADINAT code. This method effectively reproduces the results from the TRUMP multi-dimensional finite difference code, which correctly models radiative heat transport between drift surfaces, conductive and convective thermal transport to and through air in the drifts, and mass flow of air in the drifts. The temperature histories for each node in the finite element mesh calculated with ADINAT using this method can be used directly in the ADINA thermal-mechanical calculation.
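
    A conduction-only heat-flow code can emulate drift-to-drift radiative exchange with a linearized radiative conductance. The sketch below (illustrative temperatures and emissivities, idealized parallel-plate geometry rather than the actual drift geometry) shows how small the linearization error is over a modest temperature difference:

```python
SIGMA = 5.670374419e-8          # Stefan-Boltzmann constant, W/m2/K4
eps1 = eps2 = 0.9               # surface emissivities (assumed)
T1, T2 = 330.0, 300.0           # facing surface temperatures, K (illustrative)
denom = 1.0 / eps1 + 1.0 / eps2 - 1.0

# exact gray-body exchange between infinite parallel plates
q_exact = SIGMA * (T1**4 - T2**4) / denom

# linearized radiative conductance about the mean temperature:
# q ~ h_r * (T1 - T2), usable as an effective conduction link
Tm = 0.5 * (T1 + T2)
h_r = 4.0 * SIGMA * Tm**3 / denom
q_lin = h_r * (T1 - T2)
rel_err = abs(q_lin - q_exact) / q_exact
```

    For a 30 K difference around 315 K the linearized flux is within a fraction of a percent of the exact gray-body exchange, which is why a heat-flow code can absorb radiation into an effective conductance.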

  5. RRTMGP: A High-Performance Broadband Radiation Code for the Next Decade

    DTIC Science & Technology

    2015-09-30

    Radiation calculations needed for climate simulations require many independent and complicated calculations, and are therefore an inviting target for new ... parallel'), a modern version of the radiation code (RRTMG) used by many climate models, directed at the current generation of vector- and cache-based ... represent the interests of the Navy GCM (NAVGEM). Due to the expected wide impact of this development effort on climate and weather modeling ...

  6. A multigroup radiation diffusion test problem: Comparison of code results with analytic solution

    SciTech Connect

    Shestakov, A I; Harte, J A; Bolstad, J H; Offner, S R

    2006-12-21

    We consider a 1D, slab-symmetric test problem for the multigroup radiation diffusion and matter energy balance equations. The test simulates diffusion of energy from a hot central region. Opacities vary with the cube of the frequency and radiation emission is given by a Wien spectrum. We compare results from two LLNL codes, Raptor and Lasnex, with tabular data that define the analytic solution.
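
    The test's ingredients, a Wien emission spectrum and opacities varying with the cube of frequency, are easy to reproduce in a multigroup discretization. A sketch in the dimensionless variable x = h*nu/kT (the group structure here is chosen arbitrarily, not taken from Raptor or Lasnex):

```python
import numpy as np

edges = np.logspace(-2, 2, 65)            # 64 group edges in x = h*nu/kT

def group_integral(a, b, n=200):
    """Trapezoidal integral of the Wien integrand x^3*exp(-x) over [a, b]."""
    x = np.linspace(a, b, n)
    f = x**3 * np.exp(-x)
    return ((f[1:] + f[:-1]) * 0.5 * np.diff(x)).sum()

b_g = np.array([group_integral(a, b) for a, b in zip(edges[:-1], edges[1:])])
total = b_g.sum()                         # analytic frequency integral: Gamma(4) = 6

centers = 0.5 * (edges[:-1] + edges[1:])
kappa_g = centers**3                      # group opacities for kappa ~ nu^3
```

    Summing the group emissions recovers the analytic Wien integral Gamma(4) = 6, a quick check that a group structure resolves the spectrum before it is used in a diffusion calculation.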

  7. MODTRAN6: a major upgrade of the MODTRAN radiative transfer code

    NASA Astrophysics Data System (ADS)

    Berk, Alexander; Conforti, Patrick; Kennett, Rosemary; Perkins, Timothy; Hawes, Frederick; van den Bosch, Jeannette

    2014-06-01

    The MODTRAN6 radiative transfer (RT) code is a major advancement over earlier versions of the MODTRAN atmospheric transmittance and radiance model. This version of the code incorporates modern software architecture including an application programming interface, enhanced physics features including a line-by-line algorithm, a supplementary physics toolkit, and new documentation. The application programming interface has been developed for ease of integration into user applications. The MODTRAN code has been restructured towards a modular, object-oriented architecture to simplify upgrades as well as facilitate integration with other developers' codes. MODTRAN now includes a line-by-line algorithm for high resolution RT calculations as well as coupling to optical scattering codes for easy implementation of custom aerosols and clouds.
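
    A line-by-line algorithm evaluates the monochromatic absorption coefficient by summing individual line profiles on a fine spectral grid. A minimal sketch with invented Lorentz lines (MODTRAN6's actual line data, line shapes and band models are far more elaborate):

```python
import numpy as np

nu = np.linspace(0.0, 100.0, 200001)             # wavenumber grid, cm^-1
lines = [(30.0, 2.0), (50.0, 5.0), (70.0, 1.0)]  # (center, strength) - invented
gamma = 0.1                                      # Lorentz half-width, cm^-1

k = np.zeros_like(nu)
for nu0, S in lines:                             # sum normalized Lorentz profiles
    k += S * (gamma / np.pi) / ((nu - nu0)**2 + gamma**2)

u = 0.5                                          # absorber amount (arbitrary units)
transmittance = np.exp(-k * u)

# each normalized Lorentz profile integrates to its strength S
integral = ((k[1:] + k[:-1]) * 0.5 * np.diff(nu)).sum()
```

    The spectrally integrated absorption coefficient recovers the sum of the line strengths (up to truncated far wings), a standard consistency check for a line-by-line grid.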

  8. RRTMGP: A fast and accurate radiation code for the next decade

    NASA Astrophysics Data System (ADS)

    Mlawer, E. J.; Pincus, R.; Wehe, A.; Delamere, J.

    2015-12-01

    Atmospheric radiative processes are key drivers of the Earth's climate and must be accurately represented in global circulation models (GCMs) to allow faithful simulations of the planet's past, present, and future. The radiation code RRTMG is widely utilized by global modeling centers for both climate and weather predictions, but it has become increasingly out-of-date. The code's structure is not well suited for the current generation of computer architectures and its stored absorption coefficients are not consistent with the most recent spectroscopic information. We are developing a new broadband radiation code for the current generation of computational architectures. This code, called RRTMGP, will be a completely restructured and modern version of RRTMG. The new code preserves the strengths of the existing RRTMG parameterization, especially the high accuracy of the k-distribution treatment of absorption by gases, but the entire code is being rewritten to provide highly efficient computation across a range of architectures. Our redesign includes refactoring the code into discrete kernels corresponding to fundamental computational elements (e.g. gas optics), optimizing the code for operating on multiple columns in parallel, simplifying the subroutine interface, revisiting the existing gas optics interpolation scheme to reduce branching, and adding flexibility with respect to run-time choices of streams, need for consideration of scattering, aerosol and cloud optics, etc. The result of the proposed development will be a single, well-supported and well-validated code amenable to optimization across a wide range of platforms. Our main emphasis is on highly-parallel platforms including Graphical Processing Units (GPUs) and Many-Integrated-Core processors (MICs), which experience shows can accelerate broadband radiation calculations by as much as a factor of fifty. RRTMGP will provide highly efficient and accurate radiative flux calculations for coupled global
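
    The k-distribution treatment mentioned above replaces an expensive spectral integral by a short sum over reordered absorption coefficients. A toy demonstration on a synthetic spectrum (not RRTMG/RRTMGP's actual k-tables), including the much larger error of a single gray coefficient:

```python
import numpy as np

rng = np.random.default_rng(1)
nu = np.linspace(0.0, 1.0, 20001)                # normalized band coordinate
k_spec = np.exp(3.0 * np.sin(40.0 * nu) + rng.normal(0.0, 0.3, nu.size))

u = 0.2                                          # absorber amount (arbitrary)
T_lbl = np.exp(-k_spec * u).mean()               # "line-by-line" band-mean transmittance

# reorder k into its cumulative distribution; quadrature over g in [0, 1]
g_pts = (np.arange(32) + 0.5) / 32.0             # 32 equal-weight g-points
k_g = np.quantile(k_spec, g_pts)
T_kdist = np.exp(-k_g * u).mean()

T_gray = np.exp(-k_spec.mean() * u)              # single gray coefficient
err_kdist = abs(T_kdist - T_lbl)
err_gray = abs(T_gray - T_lbl)
```

    Thirty-two g-points reproduce the 20001-point spectral mean to a few parts per thousand here, while the gray approximation misses badly; this size reduction is what makes broadband GCM radiation affordable.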

  9. Evaluation of the ECHAM family radiation codes performance in the representation of the solar signal

    NASA Astrophysics Data System (ADS)

    Sukhodolov, T.; Rozanov, E.; Shapiro, A. I.; Anet, J.; Cagnazzo, C.; Peter, T.; Schmutz, W.

    2014-02-01

    Solar radiation is the main source of energy for the Earth's atmosphere and in many respects defines its composition, photochemistry, temperature profile and dynamics. The magnitude of the solar irradiance variability depends strongly on the wavelength, which makes its representation in climate models difficult. Due to deficiencies of the applied radiation codes, several models fail to show a clear response of middle stratospheric heating rates to solar spectral irradiance variability; it is therefore important to establish reasonable model performance in this respect before doing multiple model runs. In this work we evaluate the performance of three generations of ECHAM (4, 5 and 6) radiation schemes by comparison with the reference high-resolution libRadtran code. We found that both the original ECHAM5 and ECHAM6 solar radiation codes miss almost all of the solar signal in the heating rates in the mesosphere. In the stratosphere, the ECHAM5 code reproduces only about half of the reference signal, while the representation of the ECHAM6 code is better - it misses at most about 17% in the upper stratosphere. On the basis of the comparison results we suggest necessary improvements to the ECHAM family codes through the inclusion of available parameterizations of the heating rate due to absorption by oxygen (O2) and ozone (O3). With the introduced parameterizations, both codes represent the heating rate response to the spectral solar irradiance variability simulated with libRadtran much better, without a substantial increase of computer time. The suggested parameterizations are recommended for the middle-atmosphere versions of the ECHAM5 and ECHAM6 models for the study of the solar irradiance influence on climate.

  10. Benchmarking Space Radiation Transport Codes Using Measured LET Spectra from the Crater Instrument on LRO

    NASA Astrophysics Data System (ADS)

    Townsend, L. W.; Porter, J.; Spence, H. E.; Golightly, M. J.; Smith, S. S.; Schwadron, N.; Kasper, J. C.; Case, A. W.; Blake, J. B.; Mazur, J. E.; Looper, M. D.; Zeitlin, C. J.

    2014-12-01

    The Cosmic Ray Telescope for the Effects of Radiation (CRaTER) instrument on the Lunar Reconnaissance Orbiter (LRO) spacecraft measures the energy depositions by solar and galactic cosmic radiation in its silicon detectors. These energy depositions are converted to linear energy transfer (LET) spectra, which can contribute to benchmarking space radiation transport codes and can also be used to estimate doses for the lunar environment. In this work the Monte Carlo transport code HETC-HEDS (High Energy Transport Code - Human Exploration and Development in Space) and the deterministic NASA space radiation transport code HZETRN2010 are used to estimate LET and dose contributions from the incident primary ions and their charged secondaries produced in nuclear collisions within the components of the CRaTER instrument. Comparisons of the calculated LET spectra with measurements of LET from the CRaTER instrument are made and clearly show the importance of including corrections to the calculated average energy deposition spectra in the silicon detectors using a Vavilov distribution function.
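
    The conversion from detector energy depositions to an LET spectrum and dose follows from the deposited energy, the path length through the silicon, and the detector mass. A sketch with synthetic hit data (all numbers illustrative; CRaTER's actual geometry and the Vavilov correction are not modeled):

```python
import numpy as np

rng = np.random.default_rng(2)
rho_si = 2.329e3                 # silicon density, kg/m3
thickness = 1.0e-3               # detector thickness, m (assumed)
area = 1.0e-4                    # detector area, m2 (assumed, 1 cm2)
mass = rho_si * thickness * area

dE_MeV = rng.lognormal(0.0, 1.0, 10000)           # synthetic deposited energy per hit
let_MeV_cm = dE_MeV / (thickness * 100.0)         # LET, normal incidence assumed
let_keV_um = dE_MeV * 1.0e3 / (thickness * 1.0e6) # same quantity in keV/micron

J_PER_MEV = 1.602176634e-13
dose_Gy = dE_MeV.sum() * J_PER_MEV / mass         # absorbed dose, Gy

hist, edges = np.histogram(let_keV_um, bins=np.logspace(-2, 2, 41))
```

    The histogram over logarithmic LET bins is the form in which such spectra are usually compared against transport-code predictions.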

  11. Creation and utilization of a World Wide Web based space radiation effects code: SIREST

    NASA Technical Reports Server (NTRS)

    Singleterry, R. C. Jr; Wilson, J. W.; Shinn, J. L.; Tripathi, R. K.; Thibeault, S. A.; Noor, A. K.; Cucinotta, F. A.; Badavi, F. F.; Chang, C. K.; Qualls, G. D.; hide

    2001-01-01

    In order for humans and electronics to fully and safely operate in the space environment, codes like HZETRN (High Charge and Energy Transport) must be included in any designer's toolbox for design evaluation with respect to radiation damage. Currently, spacecraft designers do not have easy access to accurate radiation codes like HZETRN to evaluate their designs for radiation effects on humans and electronics. Today, the World Wide Web is sophisticated enough to support the entire HZETRN code and all of the associated pre- and post-processing tools. This package is called SIREST (Space Ionizing Radiation Effects and Shielding Tools). There are many advantages to SIREST. The most important is the instant update capability of the web. Another major advantage is the modularity that the web imposes on the code. Currently, the major disadvantage of SIREST is its modularity within the designer's system, mostly because a consistent interface between the designer and the computer system used to evaluate the design is incomplete. This, however, is to be solved in the Intelligent Synthesis Environment (ISE) program currently being funded by NASA.

  12. User's manual for the Heat Pipe Space Radiator design and analysis Code (HEPSPARC)

    NASA Technical Reports Server (NTRS)

    Hainley, Donald C.

    1991-01-01

    A heat pipe space radiator code (HEPSPARC) was written for the NASA Lewis Research Center and is used for the design and analysis of a radiator constructed from a pumped fluid loop that transfers heat to the evaporator sections of heat pipes. This manual is designed to familiarize the user with the new code and to serve as a reference for its use. It documents the completed work and is intended as the first step towards verification of the HEPSPARC code. Details are furnished to describe all the requirements and variables used in the design and analysis of a combined pumped loop/heat pipe radiator system. A description of the subroutines used in the program is furnished for those interested in understanding its detailed workings.

  13. TAU: A 1D radiative transfer code for transmission spectroscopy of extrasolar planet atmospheres

    NASA Astrophysics Data System (ADS)

    Hollis, M. D. J.; Tessenyi, M.; Tinetti, G.

    2013-10-01

    The TAU code is a 1D line-by-line radiative transfer code, which is generally applicable for modelling transmission spectra of close-in extrasolar planets. The inputs are the assumed pressure-temperature profile of the planetary atmosphere, the continuum absorption coefficients and the absorption cross-sections for the trace molecular absorbers present in the model, as well as the fundamental system parameters taken from the published literature. The program then calculates the optical path through the planetary atmosphere of the radiation from the host star, and quantifies the absorption due to the modelled composition in a transmission spectrum of transit depth as a function of wavelength. The code is written in C++, parallelised using OpenMP, and is available for public download and use from http://www.ucl.ac.uk/exoplanets/. Running time: from 0.5 to 500 s, depending on run parameters.
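
    The transit-depth calculation amounts to summing, over impact parameters, the fraction of stellar light each atmospheric annulus absorbs along its slant path. A monochromatic sketch with an assumed exponential slant-opacity profile (illustrative radii and scale height, not TAU's layered line-by-line treatment):

```python
import numpy as np

R_p, R_s = 7.0e7, 7.0e8          # planet and star radii, m (illustrative)
H = 5.0e5                        # atmospheric scale height, m (illustrative)
tau_ref = 10.0                   # slant optical depth at the solid surface

b = np.linspace(R_p, R_p + 20.0 * H, 2000)   # ray impact parameters
tau = tau_ref * np.exp(-(b - R_p) / H)       # slant optical depth per ray
absorbed = 1.0 - np.exp(-tau)                # light blocked by each annulus

bm = 0.5 * (b[1:] + b[:-1])
am = 0.5 * (absorbed[1:] + absorbed[:-1])
blocked_area = np.pi * R_p**2 + (2.0 * np.pi * bm * am * np.diff(b)).sum()

depth = blocked_area / (np.pi * R_s**2)      # transit depth with atmosphere
depth_solid = (R_p / R_s)**2                 # transit depth without it
```

    The atmosphere adds an effective few-scale-height annulus to the opaque disk; repeating the calculation per wavelength, with opacity from the molecular cross-sections, yields the transmission spectrum.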

  14. TAU: A 1D radiative transfer code for transmission spectroscopy of extrasolar planet atmospheres

    NASA Astrophysics Data System (ADS)

    Hollis, M. D. J.; Tessenyi, M.; Tinetti, G.

    2014-02-01

    The TAU code is a 1D line-by-line radiative transfer code, which is generally applicable for modeling transmission spectra of close-in extrasolar planets. The inputs are the assumed temperature-pressure profile of the planetary atmosphere, the continuum absorption coefficients and the absorption cross-sections for the trace molecular absorbers present in the model, as well as the fundamental system parameters taken from the published literature. The program then calculates the optical path through the planetary atmosphere of the radiation from the host star, and quantifies the absorption due to the modeled composition in a transmission spectrum of transit depth as a function of wavelength. The code is written in C++, parallelized using OpenMP, and is available for public download and use from http://www.ucl.ac.uk/exoplanets/.

  15. Development of a coupling code for PWR reactor cavity radiation streaming calculation

    SciTech Connect

    Zheng, Z.; Wu, H.; Cao, L.; Zheng, Y.; Zhang, H.; Wang, M.

    2012-07-01

    PWR reactor cavity radiation streaming is important for the safety of personnel and equipment, so calculations have to be performed to evaluate the neutron flux distribution around the reactor. For this calculation, deterministic codes have difficulty with fine geometrical modeling and need huge computer resources, while Monte Carlo codes require very long sampling times to obtain results with acceptable precision. Therefore, a coupling method has been developed to eliminate these problems. In this study, we develop a coupling code named DORT2MCNP to link the Sn code DORT and the Monte Carlo code MCNP. DORT2MCNP is used to produce a combined surface source containing top, bottom and side surfaces simultaneously. Because the SDEF card is unsuitable for the combined surface source, we modified the SOURCE subroutine of MCNP and recompiled MCNP for this application. Numerical results demonstrate the correctness of the coupling code DORT2MCNP and show reasonable agreement between the coupling method and the other two codes (DORT and MCNP). (authors)
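
    The essence of such a coupling is converting a deterministic code's tallied surface flux into a Monte Carlo surface source. A toy sketch of the sampling step only (invented flux table; the real DORT2MCNP handling of geometry, direction and energy is omitted):

```python
import numpy as np

rng = np.random.default_rng(3)
# hypothetical outgoing flux tallied on 3 surfaces x 4 angular bins
flux = np.array([[4.0, 3.0, 2.0, 1.0],
                 [2.0, 2.0, 1.0, 1.0],
                 [1.0, 1.0, 1.0, 1.0]])
weights = flux.ravel() / flux.sum()       # per-bin sampling probabilities
cdf = np.cumsum(weights)

n = 200000                                # Monte Carlo source particles
picks = np.searchsorted(cdf, rng.random(n))
counts = np.bincount(picks, minlength=weights.size)
max_dev = np.abs(counts / n - weights).max()
```

    Sampling bins from the cumulative distribution reproduces the deterministic flux shape to within statistical noise, which is the property the coupled calculation relies on.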

  16. Three-dimensional radiation dose mapping with the TORT computer code

    SciTech Connect

    Slater, C.O.; Pace, J.V. III; Childs, R.L.; Haire, M.J. ); Koyama, T. )

    1991-01-01

    The Consolidated Fuel Reprocessing Program (CFRP) at Oak Ridge National Laboratory (ORNL) has performed radiation shielding studies in support of various facility designs for many years. Computer codes employing the point-kernel method have been used, and the accuracy of these codes is within acceptable limits. However, to further improve the accuracy and to calculate dose at a larger number of locations, a higher-order method is desired, even for analyses performed in the early stages of facility design. Consequently, the three-dimensional discrete ordinates transport code TORT, developed at ORNL in the mid-1980s, was selected to examine in detail the dose received at equipment locations. The capabilities of the code have been reported previously. Recently, the Power Reactor and Nuclear Fuel Development Corporation in Japan and the US Department of Energy have used the TORT code as part of a collaborative agreement to jointly develop breeder reactor fuel reprocessing technology. In particular, CFRP used the TORT code to estimate radiation dose levels within the main process cell for a conceptual plant design and to establish process equipment lifetimes. The results reported in this paper are for a conceptual plant design that included the mechanical head end (i.e., the disassembly and shear machines), solvent extraction equipment, and miscellaneous process support equipment.

  17. FURN3D: A computer code for radiative heat transfer in pulverized coal furnaces

    SciTech Connect

    Ahluwalia, R.K.; Im, K.H.

    1992-08-01

    A computer code FURN3D has been developed for assessing the impact of burning different coals on the heat absorption pattern in pulverized coal furnaces. The code is unique in its ability to conduct detailed spectral calculations of radiation transport in furnaces, fully accounting for the size distributions of char, soot and ash particles, ash content, and ash composition. The code uses a hybrid technique for solving the three-dimensional radiation transport equation for absorbing, emitting and anisotropically scattering media. The technique achieves an optimal mix of computational speed and accuracy by combining the discrete ordinates method (S4), modified differential approximation (MDA) and P1 approximation in different ranges of optical thickness. The code uses spectroscopic data for estimating the absorption coefficients of the participating gases CO2, H2O and CO. It invokes Mie theory for determining the extinction and scattering coefficients of combustion particulates. The optical constants of char, soot and ash are obtained from dispersion relations derived from reflectivity, transmissivity and extinction measurements. A control-volume formulation is adopted for determining the temperature field inside the furnace. A simple char burnout model is employed for estimating heat release and the evolution of the particle size distribution. The code is written in Fortran 77, has modular form, and is machine-independent. The computer memory required by the code depends upon the number of grid points specified and whether the transport calculations are performed on a spectral or gray basis.


  19. HADES code for numerical simulations of high-mach number astrophysical radiative flows

    NASA Astrophysics Data System (ADS)

    Michaut, C.; Di Menza, L.; Nguyen, H. C.; Bouquet, S. E.; Mancini, M.

    2017-03-01

    The understanding of astrophysical phenomena requires robust numerical tools that can handle realistic scales in terms of energy, characteristic lengths and Mach number that cannot easily be reproduced in laboratory experiments. In this paper, we present the 2D numerical code HADES for the simulation of realistic astrophysical phenomena in various contexts, first taking radiative losses into account. The version of HADES including a multigroup modeling of radiative transfer will be presented in a forthcoming study. Validation of HADES is performed using several benchmark tests, and some realistic applications are discussed. Optically thin radiative loss is modeled by a cooling function in the conservation law of energy. The numerical methods involve the MUSCL-Hancock finite volume scheme as well as HLLC and HLLE Riemann solvers, coupled with a second-order ODE solver by means of a Strang splitting algorithm that handles source terms arising from geometrical or radiative contributions, for Cartesian or axisymmetric configurations. Good agreement has been observed for all benchmark tests, in both hydrodynamic and radiative cases. Furthermore, an overview of the main astrophysical studies driven with this code is proposed. First, simulations of radiative shocks in accretion columns and of supernova remnant dynamics at large timescales, including the Vishniac instability, have improved the understanding of these phenomena. Finally, astrophysical jets are investigated and the influence of the cooling effect on the jet morphology is numerically demonstrated. It is also found that a periodic source enables the recovery of pulsating jets that mimic the structure of Herbig-Haro objects. The HADES code has proved robust, especially for the wall-shock test and for the so-called implosion test, which turns out to be a severe one since the hydrodynamic variables are self-similar and become infinite at finite time. The simulations have proved the efficiency of
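
    The Strang splitting used to couple the finite volume scheme to source terms can be illustrated on a scalar model problem. Here exact spectral advection stands in for the hydrodynamics and a linear, space-dependent decay stands in for the cooling function (both are placeholders, not HADES's actual operators); the half-source/full-transport/half-source ordering should exhibit second-order convergence in the time step:

```python
import numpy as np

n = 256
x = np.linspace(0.0, 1.0, n, endpoint=False)
c = 1.0 + 0.5 * np.cos(2.0 * np.pi * x)   # space-dependent cooling rate
v, T = 0.3, 1.0                           # advection speed, final time
k = np.fft.fftfreq(n, d=1.0 / n)          # integer wavenumbers on [0, 1)

def advect(e, dt):                        # exact spectral shift by v*dt
    return np.fft.ifft(np.fft.fft(e) * np.exp(-2j * np.pi * k * v * dt)).real

def cool(e, dt):                          # exact solution of de/dt = -c(x)*e
    return e * np.exp(-c * dt)

def run(nsteps):
    e = 1.0 + 0.5 * np.sin(2.0 * np.pi * x)
    dt = T / nsteps
    for _ in range(nsteps):
        e = cool(e, 0.5 * dt)             # half source step
        e = advect(e, dt)                 # full transport step
        e = cool(e, 0.5 * dt)             # half source step
    return e

ref = run(1024)
err10 = np.abs(run(10) - ref).max()
err20 = np.abs(run(20) - ref).max()
order = np.log2(err10 / err20)            # should approach 2
```

    Because the two substeps do not commute when the cooling rate varies in space, the splitting error is nonzero, but halving the time step reduces it by roughly a factor of four.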

  20. The Continual Intercomparison of Radiation Codes (CIRC) Assessing Anew the Quality of GCM Radiation Algorithms

    NASA Technical Reports Server (NTRS)

    Oreopoulos, Lazaros; Mlawer, Eli

    2010-01-01

    The simulation of changes in the Earth's climate due to solar and thermal radiative processes with global climate models (GCMs) is highly complex, depending on the parameterization of a multitude of nonlinearly coupled physical processes. In contrast, the germ of global climate change, the radiative forcing from enhanced abundances of greenhouse gases, is relatively well understood. The impressive agreement between detailed radiation calculations and highly resolved spectral radiation measurements in the thermal infrared under cloudless conditions (see, for example, Fig. 1) instills confidence in our knowledge of the sources of gaseous absorption. That the agreement spans a broad range of temperature and humidity regimes using instruments mounted on surface, aircraft, and satellite platforms not only attests to our capability to accurately calculate radiative fluxes under present conditions, but also provides confidence in the spectroscopic basis for computation of fluxes under conditions that might characterize future global climate (e.g., radiative forcing). Alas, the computational costs of highly resolved spectral radiation calculations cannot be afforded presently in GCMs. Such calculations have instead been used as the foundation for approximations implemented in fast but generally less accurate algorithms performing the needed radiative transfer (RT) calculations in GCMs. Credible climate simulations by GCMs cannot be ensured without accurate solar and thermal radiative flux calculations under all types of sky conditions: pristine cloudless, aerosol-laden, and cloudy. The need for accuracy in RT calculations is not only important for greenhouse gas forcing scenarios, but is also profoundly needed for the robust simulation of many other atmospheric phenomena, such as convective processes.

  1. Development of a GPU Compatible Version of the Fast Radiation Code RRTMG

    NASA Astrophysics Data System (ADS)

    Iacono, M. J.; Mlawer, E. J.; Berthiaume, D.; Cady-Pereira, K. E.; Suarez, M.; Oreopoulos, L.; Lee, D.

    2012-12-01

    The absorption of solar radiation and emission/absorption of thermal radiation are crucial components of the physics that drive Earth's climate and weather. Therefore, accurate radiative transfer calculations are necessary for realistic climate and weather simulations. Efficient radiation codes have been developed for this purpose, but their accuracy requirements still necessitate that as much as 30% of the computational time of a GCM is spent computing radiative fluxes and heating rates. The overall computational expense constitutes a limitation on a GCM's predictive ability if it becomes an impediment to adding new physics to or increasing the spatial and/or vertical resolution of the model. The emergence of Graphics Processing Unit (GPU) technology, which will allow the parallel computation of multiple independent radiative calculations in a GCM, will lead to a fundamental change in the competition between accuracy and speed. Processing time previously consumed by radiative transfer will now be available for the modeling of other processes, such as physics parameterizations, without any sacrifice in the accuracy of the radiative transfer. Furthermore, fast radiation calculations can be performed much more frequently and will allow the modeling of radiative effects of rapid changes in the atmosphere. The fast radiation code RRTMG, developed at Atmospheric and Environmental Research (AER), is utilized operationally in many dynamical models throughout the world. We will present the results from the first stage of an effort to create a version of the RRTMG radiation code designed to run efficiently in a GPU environment. This effort will focus on the RRTMG implementation in GEOS-5. RRTMG has an internal pseudo-spectral vector of length of order 100 that, when combined with the much greater length of the global horizontal grid vector from which the radiation code is called in GEOS-5, makes RRTMG/GEOS-5 particularly suited to achieving a significant speed improvement.
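    The column independence that makes such a GPU port attractive can be sketched in a few lines (illustrative only, not RRTMG or GEOS-5 code; the grid sizes, flux profile, and the simple heating-rate formula H = (g/cp) dF/dp for net downward flux are stand-in assumptions):

    ```python
    import numpy as np

    # Illustrative sketch: radiative heating rates for every column of a model
    # grid are mutually independent, which is what maps naturally onto
    # data-parallel hardware such as GPUs. Here the same column-wise operation
    # is applied to all columns in one vectorized call.
    G = 9.80665   # gravity, m s^-2
    CP = 1004.0   # specific heat of dry air, J kg^-1 K^-1

    def heating_rate(net_flux, p_levels):
        """Heating rate (K/s) from net downward flux (W m^-2) at pressure
        levels (Pa). Both inputs have shape (ncol, nlev); every column is
        processed independently."""
        dF = np.diff(net_flux, axis=1)
        dp = np.diff(p_levels, axis=1)
        return (G / CP) * dF / dp   # sign convention: p increases downward

    ncol, nlev = 8, 5   # toy sizes; a real GCM grid has tens of thousands of columns
    p = np.linspace(100e2, 1000e2, nlev)[None, :].repeat(ncol, axis=0)
    F = np.linspace(200.0, 150.0, nlev)[None, :].repeat(ncol, axis=0)
    hr = heating_rate(F, p)
    ```

    Because no column reads another column's data, the loop over columns (and over the internal pseudo-spectral vector) can be distributed across GPU threads without synchronization.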

  2. Monte Carlo Code System for High-Energy Radiation Transport Calculations.

    SciTech Connect

    FILGES, DETLEF

    2000-02-16

    Version 00 HERMES-KFA consists of a set of Monte Carlo codes used to simulate particle radiation and interaction with matter. The main codes are HETC, MORSE, and EGS. They are supported by a common geometry package, common random routines, a command interpreter, and auxiliary codes such as NDEM, which is used to generate a gamma-ray source from nuclear de-excitation after spallation processes. The codes have been modified so that any particle history falling outside the domain of the physical theory of one program can be submitted to another program in the suite to complete the work. Also, response data can be submitted by each program, to be collected and combined by a statistics package included within the command interpreter.

  3. A public code for general relativistic, polarised radiative transfer around spinning black holes

    NASA Astrophysics Data System (ADS)

    Dexter, Jason

    2016-10-01

    Ray tracing radiative transfer is a powerful method for comparing theoretical models of black hole accretion flows and jets with observations. We present a public code, GRTRANS, for carrying out such calculations in the Kerr metric, including the full treatment of polarised radiative transfer and parallel transport along geodesics. The code is written in FORTRAN 90 and efficiently parallelises with OPENMP, and the full code and several components have PYTHON interfaces. We describe several tests which are used for verifying the code, and we compare the results for polarised thin accretion disc and semi-analytic jet problems with those from the literature as examples of its use. Along the way, we provide accurate fitting functions for polarised synchrotron emission and transfer coefficients from thermal and power-law distribution functions, and compare results from numerical integration and quadrature solutions of the polarised radiative transfer equations. We also show that all transfer coefficients can play an important role in predicted images and polarisation maps of the Galactic centre black hole, Sgr A*, at submillimetre wavelengths.

  4. Simulation of high-energy radiation belt electron fluxes using NARMAX-VERB coupled codes.

    PubMed

    Pakhotin, I P; Drozdov, A Y; Shprits, Y Y; Boynton, R J; Subbotin, D A; Balikhin, M A

    2014-10-01

    This study presents a fusion of data-driven and physics-driven methodologies of energetic electron flux forecasting in the outer radiation belt. Data-driven NARMAX (Nonlinear AutoRegressive Moving Averages with eXogenous inputs) model predictions for geosynchronous orbit fluxes have been used as an outer boundary condition to drive the physics-based Versatile Electron Radiation Belt (VERB) code, to simulate energetic electron fluxes in the outer radiation belt environment. The coupled system has been tested for three extended time periods totalling several weeks of observations. The time periods involved periods of quiet, moderate, and strong geomagnetic activity and captured a range of dynamics typical of the radiation belts. The model has successfully simulated energetic electron fluxes for various magnetospheric conditions. Physical mechanisms that may be responsible for the discrepancies between the model results and observations are discussed.

  5. Simulation of high-energy radiation belt electron fluxes using NARMAX-VERB coupled codes

    PubMed Central

    Pakhotin, I P; Drozdov, A Y; Shprits, Y Y; Boynton, R J; Subbotin, D A; Balikhin, M A

    2014-01-01

    This study presents a fusion of data-driven and physics-driven methodologies of energetic electron flux forecasting in the outer radiation belt. Data-driven NARMAX (Nonlinear AutoRegressive Moving Averages with eXogenous inputs) model predictions for geosynchronous orbit fluxes have been used as an outer boundary condition to drive the physics-based Versatile Electron Radiation Belt (VERB) code, to simulate energetic electron fluxes in the outer radiation belt environment. The coupled system has been tested for three extended time periods totalling several weeks of observations. The time periods involved periods of quiet, moderate, and strong geomagnetic activity and captured a range of dynamics typical of the radiation belts. The model has successfully simulated energetic electron fluxes for various magnetospheric conditions. Physical mechanisms that may be responsible for the discrepancies between the model results and observations are discussed. PMID:26167432
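    The data-driven half of such a coupling can be illustrated with a minimal NARX-style one-step predictor (lagged outputs plus a lagged exogenous driver, fitted by least squares). This is a sketch of the general technique, not the published NARMAX model; the lag structure, coefficients, and synthetic driver are all made up:

    ```python
    import numpy as np

    # Generate synthetic "truth" from a known nonlinear autoregressive model
    # with an exogenous input u (standing in for, e.g., solar wind driving),
    # then recover its coefficients by ordinary least squares.
    rng = np.random.default_rng(0)
    n = 500
    u = rng.standard_normal(n)          # exogenous driver
    y = np.zeros(n)
    for t in range(2, n):
        y[t] = 0.6 * y[t - 1] - 0.2 * y[t - 2] + 0.5 * u[t - 1] \
               + 0.1 * y[t - 1] * u[t - 1]

    def features(y, u, t):
        # candidate regressors: two output lags, one input lag, one cross term
        return [y[t - 1], y[t - 2], u[t - 1], y[t - 1] * u[t - 1]]

    X = np.array([features(y, u, t) for t in range(2, n)])
    target = y[2:]
    coef, *_ = np.linalg.lstsq(X, target, rcond=None)
    pred = X @ coef   # one-step-ahead predictions
    ```

    In a coupled system such predictions at the outer boundary (here, geosynchronous orbit) would then drive the physics-based diffusion model inward.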

  6. A Radiation Chemistry Code Based on the Green's Function of the Diffusion Equation

    NASA Technical Reports Server (NTRS)

    Plante, Ianik; Wu, Honglu

    2014-01-01

    Stochastic radiation track structure codes are of great interest for space radiation studies and hadron therapy in medicine. These codes are used for many purposes, notably for microdosimetry and DNA damage studies. In the last two decades, they were also used with the Independent Reaction Times (IRT) method in the simulation of chemical reactions, to calculate the yield of various radiolytic species produced during the radiolysis of water and in chemical dosimeters. Recently, we have developed a Green's function based code to simulate reversible chemical reactions with an intermediate state, which yielded results in excellent agreement with those obtained by using the IRT method. This code was also used to simulate the interaction of particles with membrane receptors. We are in the process of including this program for use with the Monte-Carlo track structure code Relativistic Ion Tracks (RITRACKS). This recent addition should greatly expand the capabilities of RITRACKS, notably to simulate DNA damage by both the direct and indirect effects.
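    The building block behind such codes is the Green's function of the free diffusion equation, which doubles as the density for sampling diffusion steps. A minimal sketch (free 3-D diffusion only; the published code additionally handles reactions and intermediate states, which are omitted here, and the diffusion coefficient below is just a plausible order of magnitude):

    ```python
    import numpy as np

    def green3d(r, t, D):
        """Green's function of 3-D free diffusion: probability density for a
        displacement of distance r after time t with diffusion coefficient D,
        G(r, t) = (4*pi*D*t)^(-3/2) * exp(-r^2 / (4*D*t))."""
        return np.exp(-r**2 / (4.0 * D * t)) / (4.0 * np.pi * D * t) ** 1.5

    def sample_step(n, t, D, rng):
        """Draw n diffusion displacements; each Cartesian component is
        normally distributed with variance 2*D*t."""
        return rng.normal(0.0, np.sqrt(2.0 * D * t), size=(n, 3))

    rng = np.random.default_rng(1)
    D, t = 4.3e-9, 1e-9   # m^2/s and s: rough scales for aqueous radiolysis species
    steps = sample_step(200_000, t, D, rng)
    ```

    Sampling each step directly from the Green's function is what lets such codes propagate radiolytic species between reaction events without resolving intermediate times.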

  7. A unified radiative magnetohydrodynamics code for lightning-like discharge simulations

    SciTech Connect

    Chen, Qiang; Chen, Bin; Xiong, Run; Cai, Zhaoyang; Chen, P. F.

    2014-03-15

    A two-dimensional Eulerian finite difference code is developed for solving the non-ideal magnetohydrodynamic (MHD) equations including the effects of self-consistent magnetic field, thermal conduction, resistivity, gravity, and radiation transfer, which when combined with specified pulse current models and plasma equations of state, can be used as a unified lightning return stroke solver. The differential equations are written in the covariant form in the cylindrical geometry and kept in the conservative form, which enables high-accuracy shock-capturing schemes to be applied naturally to the lightning channel configuration. In this code, the fifth-order weighted essentially non-oscillatory scheme combined with the Lax-Friedrichs flux splitting method is introduced for computing the convection terms of the MHD equations. The third-order total variation diminishing Runge-Kutta integral operator is also used to maintain consistent time-space accuracy. The numerical algorithms for non-ideal terms, e.g., artificial viscosity, resistivity, and thermal conduction, are introduced in the code via the operator splitting method. This code assumes the radiation is in local thermodynamic equilibrium with the plasma components, and the flux-limited diffusion algorithm with grey opacities is implemented for computing the radiation transfer. The transport coefficients and equation of state in this code are obtained from detailed particle population distribution calculations, which makes the numerical model self-consistent. The code is systematically validated against the Sedov blast solutions and then used for lightning return stroke simulations with peak currents of 20 kA, 30 kA, and 40 kA, respectively. The results show that this numerical model is consistent with observations and previous numerical results. The population distribution evolution and energy conservation problems are also discussed.
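    The time-stepping and flux-splitting skeleton described above can be sketched in reduced form. This is 1-D linear advection with first-order reconstruction standing in for the fifth-order WENO scheme, so only the global Lax-Friedrichs splitting and the third-order TVD (SSP) Runge-Kutta structure of the real code are represented; grid sizes and the initial profile are arbitrary:

    ```python
    import numpy as np

    def rhs(u, a, dx):
        # global Lax-Friedrichs flux splitting: f = f+ + f-, f± = (f ± alpha*u)/2
        alpha = abs(a)
        fp = 0.5 * (a * u + alpha * u)   # right-going part
        fm = 0.5 * (a * u - alpha * u)   # left-going part
        # first-order upwind interface fluxes on a periodic grid:
        # interface i+1/2 takes f+ from cell i and f- from cell i+1
        flux = fp + np.roll(fm, -1)
        return -(flux - np.roll(flux, 1)) / dx   # conservative divergence

    def ssp_rk3(u, a, dx, dt):
        # third-order TVD (strong-stability-preserving) Runge-Kutta
        u1 = u + dt * rhs(u, a, dx)
        u2 = 0.75 * u + 0.25 * (u1 + dt * rhs(u1, a, dx))
        return u / 3.0 + 2.0 / 3.0 * (u2 + dt * rhs(u2, a, dx))

    n, a = 200, 1.0
    dx = 1.0 / n
    x = np.arange(n) * dx
    u = np.exp(-200.0 * (x - 0.5) ** 2)   # smooth pulse
    dt = 0.4 * dx / a                     # CFL-limited step
    total0 = u.sum()
    for _ in range(100):
        u = ssp_rk3(u, a, dx, dt)
    ```

    Writing the update as a difference of interface fluxes is what makes the scheme conservative: total mass is preserved to round-off, and the TVD integrator keeps the solution free of new extrema.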

  8. Milagro Version 2 An Implicit Monte Carlo Code for Thermal Radiative Transfer: Capabilities, Development, and Usage

    SciTech Connect

    T.J. Urbatsch; T.M. Evans

    2006-02-15

    We have released Version 2 of Milagro, an object-oriented C++ code that performs radiative transfer using Fleck and Cummings' Implicit Monte Carlo method. Milagro, a part of the Jayenne program, is a stand-alone driver code used as a methods research vehicle and to verify its underlying classes. These underlying classes are used to construct Implicit Monte Carlo packages for external customers. Milagro-2 represents a design overhaul that allows better parallelism and extensibility. New features in Milagro-2 include verified momentum deposition, restart capability, graphics capability, exact energy conservation, and improved load balancing and parallel efficiency. A users' guide also describes how to configure, make, and run Milagro-2.

  9. DOPEX-1D2C: A one-dimensional, two-constraint radiation shield optimization code

    NASA Technical Reports Server (NTRS)

    Lahti, G. P.

    1973-01-01

    A one-dimensional, two-constraint radiation shield weight optimization procedure and a computer program, DOPEX-1D2C, are described. DOPEX-1D2C uses the steepest descent method to alter a set of initial (input) thicknesses of a spherical shield configuration to achieve a minimum weight while simultaneously satisfying two dose-rate constraints. The code assumes an exponential dose-shield thickness relation with parameters specified by the user. Code input instructions, a FORTRAN-4 listing, and a sample problem are given. Typical computer time required to optimize a seven-layer shield is less than 1/2 minute on an IBM 7094.
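    A toy re-creation of this kind of problem gives a feel for the method. All numbers below are hypothetical, the spherical geometry and layer volumes of the real code are ignored, and the two dose constraints are handled with a quadratic penalty rather than whatever scheme DOPEX-1D2C uses; only the assumed exponential dose-thickness relation and the steepest-descent idea are taken from the abstract:

    ```python
    import numpy as np

    # Minimize shield weight c.t subject to two dose-rate constraints of the
    # assumed exponential form dose_i(t) = A_i * exp(-(mu @ t)_i) <= limit_i,
    # via steepest descent on weight + quadratic penalty.
    c = np.array([1.0, 2.0, 1.5])       # weight per unit thickness of each layer
    mu = np.array([[0.5, 0.2, 0.8],     # attenuation coefficients, constraint 1
                   [0.1, 0.9, 0.3]])    # attenuation coefficients, constraint 2
    A = np.array([100.0, 50.0])         # unshielded dose rates
    limit = np.array([1.0, 1.0])        # dose-rate limits

    def dose(t):
        return A * np.exp(-mu @ t)

    t = np.full(3, 10.0)                # initial (input) thicknesses
    w0 = c @ t
    lam, lr = 100.0, 0.002              # penalty weight, descent step size
    for _ in range(50000):
        excess = np.maximum(dose(t) - limit, 0.0)
        # gradient of weight plus penalty; d(dose_i)/d(t_j) = -mu_ij * dose_i
        grad = c - 2.0 * lam * (excess * dose(t)) @ mu
        t = np.maximum(t - lr * grad, 0.0)   # thicknesses cannot go negative
    ```

    The descent first shrinks all layers along the weight gradient, then slides along the active dose constraints until no weight-reducing feasible direction remains, mirroring the thinning of an over-thick initial guess.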

  10. PBMC: Pre-conditioned Backward Monte Carlo code for radiative transport in planetary atmospheres

    NASA Astrophysics Data System (ADS)

    García Muñoz, A.; Mills, F. P.

    2017-08-01

    PBMC (Pre-Conditioned Backward Monte Carlo) solves the vector Radiative Transport Equation (vRTE) and can be applied to planetary atmospheres irradiated from above. The code builds the solution by simulating the photon trajectories from the detector towards the radiation source, i.e. in the reverse order of the actual photon displacements. In accounting for the polarization in the sampling of photon propagation directions and pre-conditioning the scattering matrix with information from the scattering matrices of prior (in the BMC integration order) photon collisions, PBMC avoids the unstable and biased solutions of classical BMC algorithms for conservative, optically-thick, strongly-polarizing media such as Rayleigh atmospheres.
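    The backward (reverse) tracing idea can be reduced to its simplest instance: a purely absorbing slab with no scattering or polarisation, which is far simpler than what PBMC solves but shows the direction reversal. Paths are launched from the detector, and a path "reaches the source" if its sampled optical depth exceeds the slab's total optical depth; the survival fraction estimates the Beer-Lambert transmission (the optical depth and sample count below are arbitrary):

    ```python
    import numpy as np

    def backward_transmission(tau, n, rng):
        """Backward Monte Carlo estimate of direct transmission exp(-tau)
        through a homogeneous, absorption-only slab of optical depth tau."""
        s = -np.log(rng.random(n))   # free-path optical depths, Exponential(1)
        return np.mean(s > tau)      # fraction of reverse paths exiting the slab

    rng = np.random.default_rng(2)
    tau = 1.5
    estimate = backward_transmission(tau, 1_000_000, rng)
    ```

    The advantage of tracing from the detector is that every sampled path contributes to the one line of sight actually observed, instead of most forward-traced photons missing a small detector.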

  11. Evaluation of Error-Correcting Codes for Radiation-Tolerant Memory

    NASA Astrophysics Data System (ADS)

    Jeon, S.; Vijaya Kumar, B. V. K.; Hwang, E.; Cheng, M. K.

    2010-05-01

    In space, radiation particles can introduce temporary or permanent errors in memory systems. To protect against potential memory faults, either thick shielding or error-correcting codes (ECC) are used by memory modules. Thick shielding translates into increased mass, and conventional ECCs designed for memories are typically capable of correcting only a single error and detecting a double error. Decoding is usually performed through hard decisions where bits are treated as either correct or flipped in polarity. We demonstrate that low-density parity-check (LDPC) codes that are already prevalent in many communication applications can also be used to protect memories in space. Because the achievable code rate monotonically decreases with time due to the accumulation of permanent errors, the achievable rate serves as a useful metric in designing an appropriate ECC. We describe how to compute soft symbol reliabilities on our channel and compare the performance of soft-decision decoding LDPC codes against conventional hard-decision decoding of Reed-Solomon (RS) codes and Bose-Chaudhuri-Hocquenghem (BCH) codes for a specific memory structure.
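    The hard-versus-soft distinction drawn above can be demonstrated with the simplest possible code, 3x repetition, rather than the LDPC/RS/BCH codes of the paper (the received values are a contrived illustration):

    ```python
    import numpy as np

    # Bit 0 is sent as +1 three times over a noisy channel.
    received = np.array([0.9, -0.4, -0.3])   # noisy copies of +1

    # Hard decision: quantize each copy to a sign, then take a majority vote.
    hard_bits = np.sign(received)            # two of three copies flipped
    hard_decision = 1.0 if hard_bits.sum() > 0 else -1.0

    # Soft decision: for an AWGN channel the per-copy log-likelihood ratio is
    # proportional to the received value, so summing the raw values combines
    # the copies weighted by their reliabilities.
    soft_decision = 1.0 if received.sum() > 0 else -1.0
    ```

    Here the majority vote is fooled by two weakly negative copies, while the soft combination notices that the single positive copy is far more reliable, which is exactly the advantage soft-decision LDPC decoding has over hard-decision RS/BCH decoding.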

  12. Intercomparison of three microwave/infrared high resolution line-by-line radiative transfer codes

    NASA Astrophysics Data System (ADS)

    Schreier, F.; Garcia, S. Gimeno; Milz, M.; Kottayil, A.; Höpfner, M.; von Clarmann, T.; Stiller, G.

    2013-05-01

    An intercomparison of three line-by-line (lbl) codes developed independently for atmospheric sounding - ARTS, GARLIC, and KOPRA - has been performed for a thermal infrared nadir sounding application assuming a HIRS-like (High resolution Infrared Radiation Sounder) setup. Radiances for the HIRS infrared channels and a set of 42 atmospheric profiles from the "Garand dataset" have been computed. Results of this intercomparison and a discussion of the reasons for the observed differences are presented.

  13. HT-FRTC: a fast radiative transfer code using kernel regression

    NASA Astrophysics Data System (ADS)

    Thelen, Jean-Claude; Havemann, Stephan; Lewis, Warren

    2016-09-01

    The HT-FRTC is a principal component based fast radiative transfer code that can be used across the electromagnetic spectrum from the microwave through to the ultraviolet to calculate transmittance, radiance and flux spectra. The principal components cover the spectrum at a very high spectral resolution, which allows very fast line-by-line, hyperspectral and broadband simulations for satellite-based, airborne and ground-based sensors. The principal components are derived during a code training phase from line-by-line simulations for a diverse set of atmosphere and surface conditions. The derived principal components are sensor independent, i.e. no extra training is required to include additional sensors. During the training phase we also derive the predictors which are required by the fast radiative transfer code to determine the principal component scores from the monochromatic radiances (or fluxes, transmittances). These predictors are calculated for each training profile at a small number of frequencies, which are selected by a k-means cluster algorithm during the training phase. Until recently the predictors were calculated using a linear regression. However, during a recent rewrite of the code the linear regression was replaced by a Gaussian Process (GP) regression which resulted in a significant increase in accuracy when compared to the linear regression. The HT-FRTC has been trained with a large variety of gases, surface properties and scatterers. Rayleigh scattering as well as scattering by frozen/liquid clouds, hydrometeors and aerosols have all been included. The scattering phase function can be fully accounted for by an integrated line-by-line version of the Edwards-Slingo spherical harmonics radiation code or approximately by a modification to the extinction (Chou scaling).
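    The principal-component idea can be sketched end to end in its linear-regression flavour (as in the earlier version of HT-FRTC described above; the Gaussian Process step is omitted). Everything below is synthetic: the "spectra" are built to lie on a low-dimensional manifold, and the predictor channels are hand-picked stand-ins for the k-means-selected frequencies:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n_train, n_wave, k = 200, 1000, 5

    # Synthetic training spectra that genuinely live on a k-dimensional manifold.
    basis = rng.standard_normal((k, n_wave))
    scores_true = rng.standard_normal((n_train, k))
    spectra = scores_true @ basis

    # Training phase: compress the spectra with an SVD to get principal
    # components, then learn to predict the PC scores from a few channels.
    mean = spectra.mean(axis=0)
    U, S, Vt = np.linalg.svd(spectra - mean, full_matrices=False)
    pcs = Vt[:k]                                 # leading principal components
    scores = (spectra - mean) @ pcs.T

    predictor_idx = [10, 250, 500, 750, 990]     # stand-in predictor channels
    X = spectra[:, predictor_idx]
    coef, *_ = np.linalg.lstsq(np.c_[X, np.ones(n_train)], scores, rcond=None)

    # Fast path: few monochromatic values -> PC scores -> full spectrum.
    pred_scores = np.c_[X, np.ones(n_train)] @ coef
    reconstructed = pred_scores @ pcs + mean
    ```

    The speed-up comes from the fast path: only a handful of monochromatic calculations are needed per profile, yet the full high-resolution spectrum is recovered through the sensor-independent principal components.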

  14. Comparison of radiation spectra from selected source-term computer codes

    SciTech Connect

    Brady, M.C.; Hermann, O.W.; Wilson, W.B.

    1989-04-01

    This report compares the radiation spectra and intensities predicted by three radionuclide inventory/depletion codes, ORIGEN2, ORIGEN-S, and CINDER-2. The comparisons were made for a series of light-water reactor models (including three pressurized-water reactors (PWR) and two boiling-water reactors (BWR)) at cooling times ranging from 30 d to 100 years. The work presented here complements the results described in an earlier report that discusses in detail the three depletion codes, the various reactor models, and the nuclide-by-nuclide comparison of the inventories, activities, and decay heat predictions for the three codes. In this report, the photon production rates from fission product nuclides and actinides were compared, as well as the total photon production rates and energy spectra. Very good agreement was observed in the photon source terms predicted by ORIGEN2 and ORIGEN-S. The absence of bremsstrahlung radiation in the CINDER-2 calculations resulted in large differences in both the production rates and spectra in comparison with the ORIGEN2 and ORIGEN-S results. A comparison of the CINDER-2 photon production rates with an ORIGEN-S calculation neglecting bremsstrahlung radiation showed good agreement. An additional discrepancy was observed in the photon spectra predicted from the CINDER-2 calculations and has been attributed to the absence of spectral data for ¹⁴⁴Pr in those calculations. 12 refs., 26 figs., 36 tabs.

  15. Performance of the dot product function in radiative transfer code SORD

    NASA Astrophysics Data System (ADS)

    Korkin, Sergey; Lyapustin, Alexei; Sinyuk, Aliaksandr; Holben, Brent

    2016-10-01

    The successive orders of scattering radiative transfer (RT) codes frequently call the scalar (dot) product function. In this paper, we study the performance of several implementations of the dot product in the RT code SORD, using 50 scenarios for light scattering in the atmosphere-surface system. In the dot product function, we use the unrolled-loops technique with different unrolling factors. We also consider the intrinsic Fortran functions. We show results for two machines: the ifort compiler under Windows, and pgf90 under Linux. The intrinsic DOT_PRODUCT function showed the best performance for ifort. For pgf90, the dot product implemented with unrolling factor 4 was the fastest. The RT code SORD, together with the interface that runs all the mentioned tests, is publicly available from ftp://maiac.gsfc.nasa.gov/pub/skorkin/SORD_IP_16B (current release) or by email request from the corresponding (first) author.
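    The structure of an unrolled-by-4 dot product looks as follows. This is a structural illustration only: the paper's implementations are in Fortran, where the four independent partial sums expose instruction-level parallelism for the compiler to pipeline; in Python the idiomatic fast path would simply be numpy's dot:

    ```python
    def dot_unrolled4(a, b):
        """Dot product with the main loop unrolled by a factor of 4,
        accumulating into four independent partial sums, plus a remainder
        loop for lengths not divisible by 4."""
        n = len(a)
        s0 = s1 = s2 = s3 = 0.0
        for i in range(0, n - n % 4, 4):   # main unrolled loop
            s0 += a[i] * b[i]
            s1 += a[i + 1] * b[i + 1]
            s2 += a[i + 2] * b[i + 2]
            s3 += a[i + 3] * b[i + 3]
        total = s0 + s1 + s2 + s3
        for j in range(n - n % 4, n):      # remainder loop
            total += a[j] * b[j]
        return total

    a = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0]
    b = [1.0] * 7
    ```

    Breaking the single serial dependency chain of a naive accumulator into four chains is the whole trick; the intrinsic DOT_PRODUCT leaves the same choice to the compiler.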

  16. New Particle-in-Cell Code for Numerical Simulation of Coherent Synchrotron Radiation

    SciTech Connect

    Balsa Terzic, Rui Li

    2010-05-01

    We present a first look at the new code for self-consistent, 2D simulations of beam dynamics affected by coherent synchrotron radiation. The code is of the particle-in-cell variety: the beam bunch is sampled by point-charge particles, which are deposited on the grid; the corresponding forces on the grid are then computed using retarded potentials according to causality, and interpolated so as to advance the particles in time. The retarded potentials are evaluated by integrating over the 2D path history of the bunch, with the charge and current density at the retarded time obtained from interpolation of the particle distributions recorded at discrete timesteps. The code is benchmarked against analytical results obtained for a rigid-line bunch. We also outline the features and applications which are currently being developed.

  17. SKIRT: An advanced dust radiative transfer code with a user-friendly architecture

    NASA Astrophysics Data System (ADS)

    Camps, P.; Baes, M.

    2015-03-01

    We discuss the architecture and design principles that underpin the latest version of SKIRT, a state-of-the-art open source code for simulating continuum radiation transfer in dusty astrophysical systems, such as spiral galaxies and accretion disks. SKIRT employs the Monte Carlo technique to emulate the relevant physical processes including scattering, absorption and emission by the dust. The code features a wealth of built-in geometries, radiation source spectra, dust characterizations, dust grids, and detectors, in addition to various mechanisms for importing snapshots generated by hydrodynamical simulations. The configuration for a particular simulation is defined at run-time through a user-friendly interface suitable for both occasional and power users. These capabilities are enabled by careful C++ code design. The programming interfaces between components are well defined and narrow. Adding a new feature is usually as simple as adding another class; the user interface automatically adjusts to allow configuring the new options. We argue that many scientific codes, like SKIRT, can benefit from careful object-oriented design and from a friendly user interface, even if it is not a graphical user interface.

  18. Using Procedure Codes to Define Radiation Toxicity in Administrative Data: The Devil is in the Details.

    PubMed

    Meyer, Anne-Marie; Kuo, Tzy-Mey; Chang, YunKyung; Carpenter, William R; Chen, Ronald C; Sturmer, Til

    2017-05-01

    Systematic coding systems are used to define clinically meaningful outcomes when leveraging administrative claims data for research. How and when these codes are applied within a research study can have implications for study validity, and their specificity can vary significantly depending on the treatment received. Data are from the Surveillance, Epidemiology, and End Results-Medicare linked dataset. We use propensity score methods in a retrospective cohort of prostate cancer patients first examined in a recently published radiation oncology comparative effectiveness study. With the narrowly defined outcome definition, the toxicity event outcome rate ratio was 0.88 per 100 person-years (95% confidence interval, 0.71-1.08). With the broadly defined outcome, the rate ratio was comparable, at 0.89 per 100 person-years (95% confidence interval, 0.76-1.04), although individual event rates were doubled. Some evidence of surveillance bias was suggested by a higher rate of endoscopic procedures in the first year of follow-up in patients who received proton therapy compared with those receiving intensity-modulated radiation treatment (11.15 vs. 8.90, respectively). This study demonstrates the risk of introducing bias through subjective application of procedure codes. Careful consideration is required when using procedure codes to define outcomes in administrative data.

  19. Application of the new MultiTrans SP3 radiation transport code in BNCT dose planning.

    PubMed

    Kotiluoto, P; Hiisamäki, P; Savolainen, S

    2001-09-01

    Dose planning in boron neutron capture therapy (BNCT) is a complex problem and requires sophisticated numerical methods. In the framework of the Finnish BNCT project, a new deterministic three-dimensional radiation transport code, MultiTrans SP3, has been developed at VTT Chemical Technology, based on a novel application of the tree multigrid technique. To test the applicability of this new code in a realistic BNCT dose planning problem, a cylindrical PMMA (polymethyl methacrylate) phantom was chosen as a benchmark case. It is a convenient benchmark, as it has been modeled by several different codes, including the well-known DORT and MCNP. Extensive measured data also exist. In this paper, a comparison of the new MultiTrans SP3 code with other methods is presented for the PMMA phantom case. Results show that the total neutron dose rate to ICRU adult brain calculated by the MultiTrans SP3 code differs by less than 4% at 2 cm depth in the phantom (at the thermal maximum) from the DORT calculation. Results also show that the calculated 197Au(n,gamma) and 55Mn(n,gamma) reaction rates at 2 cm depth in the phantom differ by less than 4% and 1% from the measured values, respectively. However, the photon dose calculated by the MultiTrans SP3 code seems to be incorrect in this PMMA phantom case, which requires further study. As expected, the deterministic MultiTrans SP3 code is over an order of magnitude faster than stochastic Monte Carlo codes (with similar resolution), thus providing a very efficient tool for BNCT dose planning.

  20. Radiation Coupling with the FUN3D Unstructured-Grid CFD Code

    NASA Technical Reports Server (NTRS)

    Wood, William A.

    2012-01-01

    The HARA radiation code is fully coupled to the FUN3D unstructured-grid CFD code for the purpose of simulating high-energy hypersonic flows. The radiation energy source terms and surface heat transfer, under the tangent slab approximation, are included within the fluid dynamic flow solver. The Fire II flight test, at the Mach-31 1643-second trajectory point, is used as a demonstration case. Comparisons are made with an existing structured-grid capability, the LAURA/HARA coupling. The radiative surface heat transfer rates from the present approach match the benchmark values within 6%. Although radiation coupling is the focus of the present work, convective surface heat transfer rates are also reported, and are seen to vary depending upon the choice of mesh connectivity and FUN3D flux reconstruction algorithm. On a tetrahedral-element mesh the convective heating matches the benchmark at the stagnation point, but under-predicts by 15% on the Fire II shoulder. Conversely, on a mixed-element mesh the convective heating over-predicts at the stagnation point by 20%, but matches the benchmark away from the stagnation region.

  1. Reanalysis and forecasting killer electrons in Earth's radiation belts using the VERB code

    NASA Astrophysics Data System (ADS)

    Kellerman, Adam; Kondrashov, Dmitri; Shprits, Yuri; Podladchikova, Tatiana; Drozdov, Alexander

    2016-07-01

    The Van Allen radiation belts are torus-shaped regions of trapped energetic particles that, in recent years, have become a principal focus for satellite operators and engineers. During geomagnetic storms, electrons can be accelerated up to relativistic energies, where they may penetrate spacecraft shielding and damage electrical systems, causing permanent damage or loss of spacecraft. Data assimilation provides an optimal way to combine observations of the radiation belts with a physics-based model in order to more accurately specify the global state of the Earth's radiation belts. We present recent advances to the data-assimilative version of the Versatile Electron Radiation Belt (VERB) code, including more sophisticated error analysis, and incorporation of realistic field models to more accurately specify fluxes at a given MLT or along a spacecraft trajectory. The effect of recent stream-interaction-region (SIR) driven enhancements is investigated using the improved model. We also present a real-time forecast model based on the data-assimilative VERB code, and discuss the forecast performance over the past 12 months.
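    The analysis step at the heart of data assimilation can be shown in one dimension. This is a generic Kalman update, not the specific scheme used in the VERB framework, and the numbers are invented for illustration:

    ```python
    import numpy as np

    def kalman_update(x_f, p_f, y, r):
        """Blend a model forecast x_f (error variance p_f) with an
        observation y (error variance r), weighting by their variances."""
        k = p_f / (p_f + r)           # Kalman gain
        x_a = x_f + k * (y - x_f)     # analysis state
        p_a = (1.0 - k) * p_f         # analysis error variance
        return x_a, p_a

    # e.g. log10 electron flux: forecast 4.0 (variance 0.5),
    # satellite observation 4.6 (variance 0.25)
    x_a, p_a = kalman_update(4.0, 0.5, 4.6, 0.25)
    ```

    The analysis always lands between forecast and observation, pulled toward whichever is more certain, and its error variance is smaller than either input, which is why assimilative reanalyses specify the belt state better than model or data alone.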

  2. EMMA: an adaptive mesh refinement cosmological simulation code with radiative transfer

    NASA Astrophysics Data System (ADS)

    Aubert, Dominique; Deparis, Nicolas; Ocvirk, Pierre

    2015-11-01

    EMMA is a cosmological simulation code aimed at investigating the reionization epoch. It handles simultaneously collisionless and gas dynamics, as well as radiative transfer physics using a moment-based description with the M1 approximation. Field quantities are stored and computed on an adaptive three-dimensional mesh, and the spatial resolution can be dynamically modified based on physically motivated criteria. Physical processes can be coupled at all spatial and temporal scales. We also introduce a new and optional approximation to handle radiation: the light is transported at the resolution of the non-refined grid and only once the dynamics has been fully updated, whereas thermo-chemical processes are still tracked on the refined elements. Such an approximation reduces the overheads induced by the treatment of radiation physics. A suite of standard tests is presented and passed by EMMA, providing a validation for its future use in studies of the reionization epoch. The code is parallel and is able to use graphics processing units (GPUs) to accelerate hydrodynamics and radiative transfer calculations. Depending on the optimizations and the compilers used to generate the CPU reference, global GPU acceleration factors between ×3.9 and ×16.9 can be obtained. Vectorization and transfer operations currently prevent better GPU performance, and we expect that future optimizations and hardware evolution will lead to greater accelerations.
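    The M1 approximation mentioned above closes the radiation moment hierarchy with an Eddington factor that interpolates between the diffusive and free-streaming limits. A commonly used form is Levermore's closure (whether EMMA uses this exact form is an assumption here):

    ```python
    import numpy as np

    def eddington_factor(f):
        """Levermore's M1 closure: chi(f) = (3 + 4 f^2) / (5 + 2 sqrt(4 - 3 f^2))
        for the reduced flux f = |F| / (c E), with 0 <= f <= 1."""
        return (3.0 + 4.0 * f**2) / (5.0 + 2.0 * np.sqrt(4.0 - 3.0 * f**2))

    # diffusive limit (f -> 0): chi -> 1/3; free-streaming limit (f -> 1): chi -> 1
    ```

    Closing the system this way is what lets a moment-based code like EMMA transport radiation with hyperbolic equations instead of tracing individual rays.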

  3. Predicting radiative heat transfer in thermochemical nonequilibrium flow fields. Theory and user's manual for the LORAN code

    NASA Astrophysics Data System (ADS)

    Chambers, Lin Hartung

    1994-09-01

    The theory for radiation emission, absorption, and transfer in a thermochemical nonequilibrium flow is presented. The expressions developed reduce correctly to the limit at equilibrium. To implement the theory in a practical computer code, some approximations are used, particularly the smearing of molecular radiation. Details of these approximations are presented and helpful information is included concerning the use of the computer code. This user's manual should benefit both occasional users of the Langley Optimized Radiative Nonequilibrium (LORAN) code and those who wish to use it to experiment with improved models or properties.

  4. Predicting radiative heat transfer in thermochemical nonequilibrium flow fields. Theory and user's manual for the LORAN code

    NASA Technical Reports Server (NTRS)

    Chambers, Lin Hartung

    1994-01-01

    The theory for radiation emission, absorption, and transfer in a thermochemical nonequilibrium flow is presented. The expressions developed reduce correctly to the limit at equilibrium. To implement the theory in a practical computer code, some approximations are used, particularly the smearing of molecular radiation. Details of these approximations are presented and helpful information is included concerning the use of the computer code. This user's manual should benefit both occasional users of the Langley Optimized Radiative Nonequilibrium (LORAN) code and those who wish to use it to experiment with improved models or properties.

  5. TIME-DEPENDENT MULTI-GROUP MULTI-DIMENSIONAL RELATIVISTIC RADIATIVE TRANSFER CODE BASED ON SPHERICAL HARMONIC DISCRETE ORDINATE METHOD

    SciTech Connect

    Tominaga, Nozomu; Shibata, Sanshiro; Blinnikov, Sergei I.

    2015-08-15

    We develop a time-dependent, multi-group, multi-dimensional relativistic radiative transfer code, which is required to numerically investigate radiation from relativistic fluids that are involved in, e.g., gamma-ray bursts and active galactic nuclei. The code is based on the spherical harmonic discrete ordinate method (SHDOM) which evaluates a source function including anisotropic scattering in spherical harmonics and implicitly solves the static radiative transfer equation with ray tracing in discrete ordinates. We implement treatments of time dependence, multi-frequency bins, Lorentz transformation, and elastic Thomson and inelastic Compton scattering to the publicly available SHDOM code. Our code adopts a mixed-frame approach; the source function is evaluated in the comoving frame, whereas the radiative transfer equation is solved in the laboratory frame. This implementation is validated using various test problems and comparisons with the results from a relativistic Monte Carlo code. These validations confirm that the code correctly calculates the intensity and its evolution in the computational domain. The code enables us to obtain an Eddington tensor that relates the first and third moments of intensity (energy density and radiation pressure) and is frequently used as a closure relation in radiation hydrodynamics calculations.
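    The frame transformations a mixed-frame code must apply at every step are standard special relativity (these formulas are generic, not the code's internals): the Doppler shift and aberration for a flow of speed beta (in units of c), with mu the direction cosine relative to the flow:

    ```python
    import numpy as np

    def to_comoving(nu_lab, mu_lab, beta):
        """Transform photon frequency and direction cosine from the laboratory
        frame into the frame comoving with a flow of speed beta (units of c)."""
        gamma = 1.0 / np.sqrt(1.0 - beta**2)
        nu_com = gamma * nu_lab * (1.0 - beta * mu_lab)     # Doppler shift
        mu_com = (mu_lab - beta) / (1.0 - beta * mu_lab)    # aberration
        return nu_com, mu_com

    def to_lab(nu_com, mu_com, beta):
        """Inverse transform: comoving frame back to the laboratory frame."""
        gamma = 1.0 / np.sqrt(1.0 - beta**2)
        nu_lab = gamma * nu_com * (1.0 + beta * mu_com)
        mu_lab = (mu_com + beta) / (1.0 + beta * mu_com)
        return nu_lab, mu_lab

    # a lab-frame photon seen from a flow moving at 0.9c
    nu_c, mu_c = to_comoving(1.0, 0.5, 0.9)
    ```

    In a mixed-frame scheme the source function is evaluated in the comoving frame with `to_comoving`-style arguments, while the transfer equation itself is advanced in laboratory-frame coordinates, so the two transforms must be exact inverses of each other.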

  6. European Code against Cancer 4th Edition: Ionising and non-ionising radiation and cancer.

    PubMed

    McColl, Neil; Auvinen, Anssi; Kesminiene, Ausrele; Espina, Carolina; Erdmann, Friederike; de Vries, Esther; Greinert, Rüdiger; Harrison, John; Schüz, Joachim

    2015-12-01

    Ionising radiation can transfer sufficient energy to ionise molecules, and this can lead to chemical changes, including DNA damage in cells. Key evidence for the carcinogenicity of ionising radiation comes from: follow-up studies of the survivors of the atomic bombings in Japan; other epidemiological studies of groups that have been exposed to radiation from medical, occupational or environmental sources; experimental animal studies; and studies of cellular responses to radiation. Considering exposure to environmental ionising radiation, inhalation of naturally occurring radon is the major source of radiation in the population - in doses orders of magnitude higher than those from nuclear power production or nuclear fallout. Indoor exposure to radon and its decay products is an important cause of lung cancer; radon may cause approximately one in ten lung cancers in Europe. Exposures to radon in buildings can be reduced via a three-step process of identifying those with potentially elevated radon levels, measuring radon levels, and reducing exposure by installation of remediation systems. In the 4th Edition of the European Code against Cancer it is therefore recommended to: "Find out if you are exposed to radiation from naturally high radon levels in your home. Take action to reduce high radon levels". Non-ionising types of radiation (those with insufficient energy to ionise molecules) - including extremely low-frequency electric and magnetic fields as well as radiofrequency electromagnetic fields - are not an established cause of cancer and are therefore not addressed in the recommendations to reduce cancer risk. Copyright © 2015 International Agency for Research on Cancer. Published by Elsevier Ltd. All rights reserved.

  7. MULTI2D - a computer code for two-dimensional radiation hydrodynamics

    NASA Astrophysics Data System (ADS)

    Ramis, R.; Meyer-ter-Vehn, J.; Ramírez, J.

    2009-06-01

    Simulation of radiation hydrodynamics in two spatial dimensions is developed, having in mind, in particular, target design for indirectly driven inertial fusion energy (IFE) and the interpretation of related experiments. Intense radiation pulses by laser or particle beams heat high-Z target configurations of different geometries and lead to a regime which is optically thick in some regions and optically thin in others. A diffusion description is inadequate in this situation. A new numerical code has been developed which describes hydrodynamics in two spatial dimensions (cylindrical R-Z geometry) and radiation transport along rays in three dimensions with the 4π solid angle discretized in direction. Matter moves on a non-structured mesh composed of trilateral and quadrilateral elements. Radiation flux of a given direction enters on two (one) sides of a triangle and leaves on the opposite side(s) in proportion to the viewing angles depending on the geometry. This scheme allows sharply edged beams to be propagated without ray tracing, though at the price of some lateral diffusion. The algorithm treats correctly both the optically thin and optically thick regimes. A symmetric semi-implicit (SSI) method is used to guarantee numerical stability.
    Program summary
    Program title: MULTI2D
    Catalogue identifier: AECV_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AECV_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 151 098
    No. of bytes in distributed program, including test data, etc.: 889 622
    Distribution format: tar.gz
    Programming language: C
    Computer: PC (32 bits architecture)
    Operating system: Linux/Unix
    RAM: 2 Mbytes
    Word size: 32 bits
    Classification: 19.7
    External routines: X-window standard library (libX11.so) and corresponding heading files (X11/*.h) are
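    The edge-apportioning idea described above (flux leaving a cell split among downstream sides in proportion to viewing angles) can be sketched generically. This is an illustrative stand-in, not MULTI2D's implementation: the weighting here is the projected edge width, d·n times edge length, for a convex 2-D cell.

```python
def outflow_fractions(verts, direction):
    """For a convex polygon cell (list of (x, y) vertices, counter-clockwise)
    and a unit transport direction, return the fraction of radiation leaving
    through each edge, proportional to the projected width d . n * L of the
    edge (zero for upstream edges)."""
    n = len(verts)
    proj = []
    for i in range(n):
        x0, y0 = verts[i]
        x1, y1 = verts[(i + 1) % n]
        # outward normal of a CCW polygon edge, un-normalized so it carries
        # the edge length L as its magnitude
        nx, ny = (y1 - y0), -(x1 - x0)
        p = direction[0] * nx + direction[1] * ny
        proj.append(max(p, 0.0))  # keep only downstream (outflow) edges
    total = sum(proj)
    return [p / total for p in proj]
```

    For a unit square and a diagonal direction, the outgoing flux splits evenly between the two downstream edges.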

  8. Spectral and Structure Modeling of Low and High Mass Young Stars Using a Radiative Transfer Code

    NASA Astrophysics Data System (ADS)

    Robson Rocha, Will; Pilling, Sergio

    The spectroscopic data from space telescopes (ISO, Spitzer, Herschel) show that, in addition to dust grains (e.g. silicates), frozen molecular species (astrophysical ices, such as H2O, CO, CO2, CH3OH) are also present in circumstellar environments. In this work we present a study of the modeling of low and high mass young stellar objects (YSOs), in which we highlight the importance of using astrophysical ices processed by the radiation (UV, cosmic rays) coming from stars in the formation process. This is important for characterizing the physicochemical evolution of the ices distributed through the protostellar disk and its envelope in some situations. To perform this analysis, we gathered (i) observational data from the Infrared Space Observatory (ISO) related to the low mass protostar Elias29 and the high mass protostar W33A, (ii) experimental absorbance data in the infrared spectral range used to determine the optical constants of the materials observed around these objects and (iii) a powerful radiative transfer code to simulate the astrophysical environment (RADMC-3D, Dullemond et al. 2012). Briefly, the radiative transfer calculation of the YSOs was done employing the RADMC-3D code. The model outputs were the spectral energy distribution and theoretical images of the studied objects at different wavelengths. The functionality of this code is based on the Monte Carlo methodology in addition to Mie theory for the interaction between radiation and matter. The observational data from different space telescopes were used as a reference for comparison with the modeled data. The optical constants in the infrared, used as input in the models, were calculated directly from absorbance data obtained in the laboratory for both unprocessed and processed simulated interstellar samples by using the NKABS code (Rocha & Pilling 2014). We show from this study that some absorption bands in the infrared, observed in the spectra of Elias29 and W33A, can arise after the ices

  9. AREVA Developments for an Efficient and Reliable use of Monte Carlo codes for Radiation Transport Applications

    NASA Astrophysics Data System (ADS)

    Chapoutier, Nicolas; Mollier, François; Nolin, Guillaume; Culioli, Matthieu; Mace, Jean-Reynald

    2017-09-01

    In the context of the rise of Monte Carlo transport calculations for all kinds of applications, AREVA recently improved its suite of engineering tools in order to produce an efficient Monte Carlo workflow. Monte Carlo codes, such as MCNP or TRIPOLI, are recognized as reference codes for dealing with a large range of radiation transport problems. However, the inherent drawbacks of these codes - laborious input file creation and long computation times - contrast with the maturity of their treatment of the physical phenomena. The goal of the recent AREVA developments was to reach efficiency similar to that of other mature engineering disciplines such as finite element analysis (e.g. structural or fluid dynamics). Among the main objectives, the creation of a graphical user interface offering CAD tools for geometry creation and other graphical features dedicated to the radiation field (source definition, tally definition) has been reached. Computation times are drastically reduced compared to a few years ago thanks to the use of massively parallel runs and, above all, the implementation of hybrid variance reduction techniques. Engineering teams are now able to deliver much more prompt support to any nuclear project dealing with reactors or fuel cycle facilities, from the conceptual phase to decommissioning.

  10. Structure of the solar photosphere studied from the radiation hydrodynamics code ANTARES

    NASA Astrophysics Data System (ADS)

    Leitner, P.; Lemmerer, B.; Hanslmeier, A.; Zaqarashvili, T.; Veronig, A.; Grimm-Strele, H.; Muthsam, H. J.

    2017-09-01

    The ANTARES radiation hydrodynamics code is capable of simulating the solar granulation in detail unequaled by direct observation. We introduce a state-of-the-art numerical tool to the solar physics community and demonstrate its applicability to model the solar granulation. The code is based on the weighted essentially non-oscillatory finite volume method and, by its implementation of local mesh refinement, is also capable of simulating turbulent fluids. While the ANTARES code already provides promising insights into small-scale dynamical processes occurring in the quiet-Sun photosphere, it will soon be capable of modeling the latter in the scope of radiation magnetohydrodynamics. In this first preliminary study we focus on the vertical photospheric stratification by examining a 3-D model photosphere with an evolution time much larger than the dynamical timescales of the solar granulation and of particularly large horizontal extent, corresponding to 25''×25'' on the solar surface, to smooth out horizontal spatial inhomogeneities separately for up- and downflows. The highly resolved Cartesian grid thereby covers ˜4 Mm of the upper convection zone and the adjacent photosphere. Correlation analysis, both local and two-point, provides a suitable means to probe the photospheric structure and thereby to identify several layers of characteristic dynamics: the thermal convection zone is found to reach some ten kilometers above the solar surface, while convectively overshooting gas penetrates even higher into the low photosphere. An ≈145 km wide transition layer separates the convective from the oscillatory layers in the higher photosphere.

  11. Calibration of radiation codes in climate models: Comparison of calculations with observations from the SPECtral Radiation Experiment (SPECTRE)

    NASA Technical Reports Server (NTRS)

    Ellingson, R. G.; Wiscombe, W. J.; Deluisi, J.; Melfi, H.; Smith, W.

    1993-01-01

    The primary goal of SPECTRE is to close the loopholes by which longwave radiation models have eluded incisive comparisons with measurements. The experimental approach was quite simple in concept, namely: accurately measure the zenith infrared radiance at high spectral resolution while simultaneously profiling the radiatively important atmospheric properties with conventional and remote sensing devices. The field phase of SPECTRE was carried out as part of FIRE Cirrus II, and detailed spectra of the downwelling radiance were obtained by several interferometers simultaneously with the measurement of the optical properties of the atmosphere. We are now well along in the process of analyzing the data and calibrating radiation codes so that they may be used more effectively in climate related studies. The calibration is being done with models ranging from the most detailed (line-by-line) to the broad-band parameterizations used in climate models. This paper summarizes our progress in the calibration for clear-sky conditions. When this stage is completed, we will move on to the calibration for cirrus conditions.

  12. Comparison of Radiation Transport Codes, HZETRN, HETC and FLUKA, Using the 1956 Webber SPE Spectrum

    NASA Technical Reports Server (NTRS)

    Heinbockel, John H.; Slaba, Tony C.; Blattnig, Steve R.; Tripathi, Ram K.; Townsend, Lawrence W.; Handler, Thomas; Gabriel, Tony A.; Pinsky, Lawrence S.; Reddell, Brandon; Clowdsley, Martha S.

    2009-01-01

    Protection of astronauts and instrumentation from galactic cosmic rays (GCR) and solar particle events (SPE) in the harsh environment of space is of prime importance in the design of personal shielding, spacecraft, and mission planning. Early entry of radiation constraints into the design process enables optimal shielding strategies, but demands efficient and accurate tools that can be used by design engineers in every phase of an evolving space project. The radiation transport code HZETRN is an efficient tool for analyzing the shielding effectiveness of materials exposed to space radiation. In this paper, HZETRN is compared to the Monte Carlo codes HETC-HEDS and FLUKA for a shield/target configuration comprised of a 20 g/cm^2 aluminum slab in front of a 30 g/cm^2 slab of water exposed to the February 1956 SPE, as modeled by the Webber spectrum. Neutron and proton fluence spectra, as well as dose and dose equivalent values, are compared at various depths in the water target. This study shows that there are many regions where HZETRN agrees with both HETC-HEDS and FLUKA for this shield/target configuration and the SPE environment. However, there are also regions with appreciable differences between the three computer codes.
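    Dose values of the kind compared here are obtained by folding a fluence spectrum with an energy-dependent response. As a hedged sketch (the generic fluence-to-dose fold below, with trapezoidal integration and a unit-conversion constant, is a textbook approach, not the specific machinery of HZETRN, HETC-HEDS, or FLUKA):

```python
def dose_from_fluence(energies_MeV, fluence, stopping_MeV_cm2_per_g):
    """Absorbed dose (Gy) from a differential fluence spectrum
    phi(E) [particles / (cm^2 MeV)] and a mass stopping power
    S/rho(E) [MeV cm^2 / g], via trapezoidal integration of
    phi(E) * S/rho(E) dE. Conversion: 1 MeV/g = 1.602e-10 Gy."""
    MEV_PER_G_TO_GY = 1.602e-10
    total = 0.0
    for i in range(len(energies_MeV) - 1):
        dE = energies_MeV[i + 1] - energies_MeV[i]
        f0 = fluence[i] * stopping_MeV_cm2_per_g[i]
        f1 = fluence[i + 1] * stopping_MeV_cm2_per_g[i + 1]
        total += 0.5 * (f0 + f1) * dE  # trapezoid rule on this energy bin
    return total * MEV_PER_G_TO_GY
```

    A flat unit fluence with unit stopping power over 10 MeV deposits 10 MeV/g, i.e. about 1.6 nGy.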

  13. The Development of the Ducted Fan Noise Propagation and Radiation Code CDUCT-LaRC

    NASA Technical Reports Server (NTRS)

    Nark, Douglas M.; Farassat, F.; Pope, D. Stuart; Vatsa, Veer

    2003-01-01

    The development of the ducted fan noise propagation and radiation code CDUCT-LaRC at NASA Langley Research Center is described. This code calculates the propagation and radiation of given acoustic modes ahead of the fan face or aft of the exhaust guide vanes in the inlet or exhaust ducts, respectively. This paper gives a description of the modules comprising CDUCT-LaRC. The grid generation module provides automatic creation of numerical grids for complex (non-axisymmetric) geometries that include single or multiple pylons. Files for performing automatic inviscid mean flow calculations are also generated within this module. The duct propagation is based on the parabolic approximation theory of R. P. Dougherty. This theory allows the handling of complex internal geometries and the ability to study the effect of non-uniform (i.e. circumferentially and axially segmented) liners. Finally, the duct radiation module is based on the Ffowcs Williams-Hawkings (FW-H) equation with a penetrable data surface. Refraction of sound through the shear layer between the external flow and bypass duct flow is included. Results for benchmark annular ducts, as well as other geometries with pylons, are presented and compared with available analytical data.
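    Whether a given (m, n) acoustic mode propagates down the duct at all is governed by a cut-on condition. As a hedged aside, the textbook case of a hard-walled circular duct without mean flow (not CDUCT-LaRC's annular geometry, flow treatment, or parabolic-approximation method) can be checked as follows, where the mode eigenvalue is the n-th zero of the Bessel-function derivative J_m':

```python
import math
from scipy.special import jnp_zeros

def is_cut_on(m, n, frequency_hz, duct_radius_m, c0=340.0):
    """Check whether the (m, n) mode of a hard-walled circular duct with no
    mean flow propagates: cut-on if the free-space wavenumber k = 2*pi*f/c0
    exceeds the radial eigenvalue j'_{m,n} / a. (Note scipy's jnp_zeros
    skips the trivial zero at x = 0 for m = 0.)"""
    k = 2.0 * math.pi * frequency_hz / c0
    k_mn = jnp_zeros(m, n)[n - 1] / duct_radius_m  # n-th zero of J_m'
    return k > k_mn
```

    For a 1 m radius duct, the (1, 1) mode (j'_{1,1} ≈ 1.841) cuts on near 100 Hz: a 200 Hz excitation propagates, a 50 Hz one decays evanescently.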

  14. Data assimilation in the radiation belts using the Salammbô code

    NASA Astrophysics Data System (ADS)

    Maget, Vincent; Bourdarie, Sébastien

    2017-04-01

    The natural energetic electron environment in the Earth's radiation belts is of general importance, as dynamic variations in this environment can impact space hardware and contribute significantly to background signals in a range of other instruments flying in that region. The most dramatic changes in the relativistic electron populations occur during enhanced periods of geomagnetic activity. The relative importance of all competing physical processes involved in the radiation belt dynamics changes from storm to storm, and the net result on the particle distribution may then be very different. Modeling Earth's radiation belts still constitutes an active field of research. The most common practice is to deduce empirical formulae for the amplitudes of physical processes versus one or more proxies, such as Kp, Dst or solar wind parameters, from statistical studies. Although this allows us to reproduce the mean dynamics of the radiation belts, it may introduce errors in the system, which become even more important for high magnetic activity conditions for which statistics are usually poor. In parallel, it has been shown in recent years that a data assimilation scheme based on an Ensemble Kalman Filter (EnKF) may lead to great improvements in (1) the accuracy of modeling the different regions of Earth's radiation belts, (2) the possibility to accurately predict the state of the radiation belts, and (3) accurately reanalyzing a long time period as a basis for specification models and climatology. This talk aims at presenting a global overview of the recent efforts undertaken at ONERA concerning data assimilation in the radiation belts based on the Salammbô code and an EnKF. We will in particular focus our attention on the benefits of being able to accurately assimilate different types of measurements in our data assimilation tool.
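    The core of an EnKF as invoked above is the analysis step that blends a forecast ensemble with observations. A minimal generic sketch (the stochastic, perturbed-observations variant of Burgers et al. 1998; the function name and interfaces are assumptions, not the Salammbô implementation):

```python
import numpy as np

def enkf_analysis(ensemble, obs, H, obs_err_std, seed=0):
    """Stochastic Ensemble Kalman Filter analysis step.
    ensemble: (n_state, n_members) forecast ensemble; obs: (n_obs,) vector;
    H: (n_obs, n_state) linear observation operator; obs_err_std: scalar
    observation error standard deviation. Returns the analysis ensemble."""
    rng = np.random.default_rng(seed)
    n_obs, n_mem = len(obs), ensemble.shape[1]
    A = ensemble - ensemble.mean(axis=1, keepdims=True)   # state anomalies
    HX = H @ ensemble
    HA = HX - HX.mean(axis=1, keepdims=True)              # obs-space anomalies
    R = (obs_err_std ** 2) * np.eye(n_obs)
    P_HT = A @ HA.T / (n_mem - 1)                         # cross covariance
    S = HA @ HA.T / (n_mem - 1) + R                       # innovation covariance
    K = P_HT @ np.linalg.inv(S)                           # Kalman gain
    # perturbed observations, one realization per ensemble member
    Y = obs[:, None] + rng.normal(0.0, obs_err_std, (n_obs, n_mem))
    return ensemble + K @ (Y - HX)
```

    With a precise observation, the analysis ensemble shifts toward the observed value and its spread shrinks accordingly.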

  15. GRAVE: An Interactive Geometry Construction and Visualization Software System for the TORT Nuclear Radiation Transport Code

    SciTech Connect

    Blakeman, E.D.

    2000-05-07

    A software system, GRAVE (Geometry Rendering and Visual Editor), has been developed at the Oak Ridge National Laboratory (ORNL) to perform interactive visualization and development of models used as input to the TORT three-dimensional discrete ordinates radiation transport code. Three-dimensional and two-dimensional visualization displays are included. Display capabilities include image rotation, zoom, translation, wire-frame and translucent display, geometry cuts and slices, and display of individual component bodies and material zones. The geometry can be interactively edited and saved in TORT input file format. This system is an advancement over the current, non-interactive, two-dimensional display software. GRAVE is programmed in the Java programming language and can be implemented on a variety of computer platforms. Three-dimensional visualization is enabled through the Visualization Toolkit (VTK), a freeware C++ software library developed for geometric and data visual display. Future plans include an extension of the system to read inputs using binary zone maps and combinatorial geometry models containing curved surfaces, such as those used for Monte Carlo code inputs. Also, GRAVE will be extended to geometry visualization/editing for the DORT two-dimensional transport code and will be integrated into a single GUI-based system for all of the ORNL discrete ordinates transport codes.

  16. Abstracts of digital computer code packages assembled by the Radiation Shielding Information Center

    SciTech Connect

    Carter, B.J.; Maskewitz, B.F.

    1985-04-01

    This publication, ORNL/RSIC-13, Volumes I to III Revised, has resulted from an internal audit of the first 168 packages of computing technology in the Computer Codes Collection (CCC) of the Radiation Shielding Information Center (RSIC). It replaces the earlier three documents published as single volumes between 1966 and 1972. A significant number of the early code packages were considered obsolete and were removed from the collection in the audit process, and their CCC numbers were not reassigned. Others not currently being used by the nuclear R and D community were retained in the collection to preserve technology not replaced by newer methods, or were considered of potential value for reference purposes. Much of the early technology, however, has improved through developer/RSIC/user interaction and continues at the forefront of the advancing state-of-the-art.

  17. Update on the Radiation Code in IMPACT: Clouds, Heating Rates, and Comparisons

    SciTech Connect

    Edis, T; Grant, K; Cameron-Smith, P

    2005-07-22

    This is a summary of work done over two months in the summer of 2005, which was devoted to improving the radiation code of IMPACT, the LLNL 3D global atmospheric chemistry and aerosol model. Most of the work concerned the addition and testing of new cloud optical property routines designed to work with CAM3 meteorological data, and the comparison of CAM3 with the results of IMPACT runs using meteorological data from CAM3 and MACCM3. Additional related work done in the course of these main tasks will be described as necessary.

  18. Simulation of Smith-Purcell terahertz radiation using a particle-in-cell code

    NASA Astrophysics Data System (ADS)

    Donohue, J. T.; Gardelle, J.

    2006-06-01

    A simulation of the generation of Smith-Purcell (SP) radiation at terahertz frequencies has been performed using the two-dimensional particle-in-cell code MAGIC. The simulation supposes that a thin (but infinitely wide) monoenergetic electron beam passes over a diffraction grating. We simulate two configurations, one similar to the Dartmouth SP free-electron laser, with a low-energy continuous beam (we use an axial magnetic field to constrain the electrons to essentially one-dimensional motion). The other is similar to the recent MIT experiment that uses a prebunched 15 MeV beam.
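    The frequencies produced in such simulations follow the standard Smith-Purcell dispersion relation, λ = (l/n)(1/β − cos θ), with grating period l, diffraction order n, and angle θ from the beam direction. A small sketch (a generic evaluation of this textbook relation, not part of the MAGIC setup; the function name and defaults are assumptions):

```python
import math

def smith_purcell_wavelength(grating_period_m, beam_energy_MeV, angle_deg, order=1):
    """Smith-Purcell wavelength lambda = (l / n) * (1 / beta - cos(theta)),
    with theta measured from the electron beam direction and beta derived
    from the beam kinetic energy (electron rest energy 0.511 MeV)."""
    E0 = 0.511  # electron rest energy, MeV
    gamma = 1.0 + beam_energy_MeV / E0
    beta = math.sqrt(1.0 - 1.0 / gamma ** 2)
    theta = math.radians(angle_deg)
    return (grating_period_m / order) * (1.0 / beta - math.cos(theta))
```

    For a 15 MeV beam (β ≈ 0.9995) and a 1 mm grating period, first-order emission at 90° is just above 1 mm and approaches 2 mm in the backward direction.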

  19. A novel method involving Matlab coding to determine the distribution of a collimated ionizing radiation beam

    NASA Astrophysics Data System (ADS)

    Ioan, M.-R.

    2016-08-01

    In ionizing radiation experiments, precise knowledge of the parameters involved is a very important task. Some of these experiments involve electromagnetic ionizing radiation such as gamma rays and X rays; others make use of energetic charged or uncharged small particles such as protons, electrons and neutrons, and in other cases even larger accelerated particles such as helium or deuterium nuclei are used. In all these cases the beam used to hit an exposed target must be previously collimated and precisely characterized. In this paper, a novel method to determine the distribution of the collimated beam involving Matlab coding is proposed. The method was implemented by placing Pyrex glass test samples in the beam whose distribution and dimensions must be determined, taking high quality pictures of them, and then digitally processing the resulting images. By this method, information regarding the dose absorbed in the exposed sample volume is obtained as well.
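    The image-processing step of such a workflow (threshold the photographed sample, then extract the beam spot's position and size) can be sketched in Python. This is a hypothetical stand-in for the paper's Matlab code; the thresholding fraction and the extent estimate are illustrative choices, not the author's method:

```python
import numpy as np

def beam_profile_stats(image, threshold_frac=0.5):
    """Estimate the centroid and extent of a collimated beam spot from a 2-D
    intensity image: threshold at a fraction of the peak intensity, then take
    the intensity-weighted centroid and the bounding extent (in pixels) of
    the above-threshold region. Returns ((cy, cx), (extent_y, extent_x))."""
    img = np.asarray(image, dtype=float)
    mask = img >= threshold_frac * img.max()
    ys, xs = np.nonzero(mask)
    w = img[mask]
    cy = np.average(ys, weights=w)   # intensity-weighted row centroid
    cx = np.average(xs, weights=w)   # intensity-weighted column centroid
    return (cy, cx), (ys.max() - ys.min() + 1, xs.max() - xs.min() + 1)
```

    For a Gaussian spot, the 50%-of-peak extent approximates the full width at half maximum (≈ 2.355 σ).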

  20. Implementation, capabilities, and benchmarking of Shift, a massively parallel Monte Carlo radiation transport code

    SciTech Connect

    Pandya, Tara M.; Johnson, Seth R.; Evans, Thomas M.; Davidson, Gregory G.; Hamilton, Steven P.; Godfrey, Andrew T.

    2015-12-21

    This paper discusses the implementation, capabilities, and validation of Shift, a massively parallel Monte Carlo radiation transport package developed and maintained at Oak Ridge National Laboratory. It has been developed to scale well from laptop to small computing clusters to advanced supercomputers. Special features of Shift include hybrid capabilities for variance reduction such as CADIS and FW-CADIS, and advanced parallel decomposition and tally methods optimized for scalability on supercomputing architectures. Shift has been validated and verified against various reactor physics benchmarks and compares well to other state-of-the-art Monte Carlo radiation transport codes such as MCNP5, CE KENO-VI, and OpenMC. Some specific benchmarks used for verification and validation include the CASL VERA criticality test suite and several Westinghouse AP1000® problems. These benchmark and scaling studies show promising results.

  2. Improving the Salammbo code modelling and using it to better predict radiation belts dynamics

    NASA Astrophysics Data System (ADS)

    Maget, Vincent; Sicard-Piet, Angelica; Grimald, Sandrine Rochel; Boscher, Daniel

    2016-07-01

    In the framework of the FP7-SPACESTORM project, one objective is to improve the reliability of the model-based predictions of radiation belt dynamics (first developed during the FP7-SPACECAST project). For this purpose we have analyzed and improved the way simulations using the ONERA Salammbô code are performed, especially by: - Better controlling the driving parameters of the simulation; - Improving the initialization of the simulation in order to be more accurate at most energies for L values between 4 and 6; - Improving the physics of the model. For the first point, a statistical analysis of the accuracy of the Kp index has been conducted. For the second point, we have based our method on a long duration simulation in order to extract typical radiation belt states depending on the solar wind stress and geomagnetic activity. For the last point, we have first improved separately the modelling of different processes acting in the radiation belts and then analyzed the global improvements obtained when simulating them together. We discuss all these points and the balance that has to be struck between modeled processes to globally improve the radiation belt modelling.

  3. HELIOS-CR: A 1-D radiation-magnetohydrodynamics code with inline atomic kinetics modeling

    NASA Astrophysics Data System (ADS)

    Macfarlane, J. J.; Golovkin, I. E.; Woodruff, P. R.

    2006-05-01

    HELIOS-CR is a user-oriented 1D radiation-magnetohydrodynamics code to simulate the dynamic evolution of laser-produced plasmas and z-pinch plasmas. It includes an in-line collisional-radiative (CR) model for computing non-LTE atomic level populations at each time step of the hydrodynamics simulation. HELIOS-CR has been designed for ease of use, and is well-suited for experimentalists, as well as graduate and undergraduate student researchers. The energy equations employed include models for laser energy deposition, radiation from external sources, and high-current discharges. Radiative transport can be calculated using either a multi-frequency flux-limited diffusion model, or a multi-frequency, multi-angle short characteristics model. HELIOS-CR supports the use of SESAME equation of state (EOS) tables, PROPACEOS EOS/multi-group opacity data tables, and non-LTE plasma properties computed using the inline CR modeling. Time-, space-, and frequency-dependent results from HELIOS-CR calculations are readily displayed with the HydroPLOT graphics tool. In addition, the results of HELIOS simulations can be post-processed using the SPECT3D Imaging and Spectral Analysis Suite to generate images and spectra that can be directly compared with experimental measurements. The HELIOS-CR package runs on Windows, Linux, and Mac OSX platforms, and includes online documentation. We will discuss the major features of HELIOS-CR, and present example results from simulations.
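    The multi-frequency flux-limited diffusion option mentioned above relies on a limiter that interpolates between the diffusion and free-streaming regimes. As a generic illustration (the Levermore-Pomraning limiter shown here is a standard choice in the literature; the abstract does not state which limiter HELIOS-CR uses):

```python
import math

def fld_diffusion_coefficient(c, opacity, energy_density, grad_E):
    """Flux-limited diffusion coefficient D = c * lambda(R) / chi with the
    Levermore-Pomraning limiter lambda(R) = (coth R - 1/R) / R, where
    R = |grad E| / (chi * E). Recovers D = c / (3 chi) as R -> 0 and keeps
    the flux |F| = D |grad E| below the free-streaming limit c * E."""
    R = abs(grad_E) / (opacity * energy_density)
    if R < 1e-6:
        lam = 1.0 / 3.0 - R ** 2 / 45.0   # series expansion near R = 0
    else:
        lam = (1.0 / math.tanh(R) - 1.0 / R) / R
    return c * lam / opacity
```

    In the optically thick limit (small R) this reproduces classical diffusion; for steep gradients the implied flux saturates at c times the radiation energy density instead of growing without bound.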

  4. Extension of the MURaM Radiative MHD Code for Coronal Simulations

    NASA Astrophysics Data System (ADS)

    Rempel, M.

    2017-01-01

    We present a new version of the MURaM radiative magnetohydrodynamics (MHD) code that allows for simulations spanning from the upper convection zone into the solar corona. We implement the relevant coronal physics in terms of optically thin radiative loss, field aligned heat conduction, and an equilibrium ionization equation of state. We artificially limit the coronal Alfvén and heat conduction speeds to computationally manageable values using an approximation to semi-relativistic MHD with an artificially reduced speed of light (Boris correction). We present example solutions ranging from quiet to active Sun in order to verify the validity of our approach. We quantify the role of numerical diffusivity for the effective coronal heating. We find that the (numerical) magnetic Prandtl number determines the ratio of resistive to viscous heating and that owing to the very large magnetic Prandtl number of the solar corona, heating is expected to happen predominantly through viscous dissipation. We find that reasonable solutions can be obtained with values of the reduced speed of light just marginally larger than the maximum sound speed. Overall this leads to a fully explicit code that can compute the time evolution of the solar corona in response to photospheric driving using numerical time steps not much smaller than 0.1 s. Numerical simulations of the coronal response to flux emergence covering a time span of a few days are well within reach using this approach.
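    The payoff of the Boris correction is the explicit time step: capping the Alfvén (and heat conduction) speed at an artificially reduced speed of light caps the CFL-limiting signal speed. A minimal sketch with assumed coronal numbers (the formula for the effective Alfvén speed is the standard semi-relativistic reduction; the function and parameter names are illustrative, not MURaM's internals):

```python
import math

def explicit_time_step(dx, c_sound, v_alfven, c_reduced, cfl=0.5):
    """Largest stable explicit time step dt = CFL * dx / v_max. The Boris
    correction caps the effective Alfven speed at the reduced speed of
    light: v_A' = v_A / sqrt(1 + v_A^2 / c_reduced^2); conduction speeds
    are likewise limited to c_reduced."""
    v_a_eff = v_alfven / math.sqrt(1.0 + (v_alfven / c_reduced) ** 2)
    v_max = max(c_sound, v_a_eff, c_reduced)
    return cfl * dx / v_max
```

    With, say, 64 km cells, a 200 km/s sound speed, a 2000 km/s nominal Alfvén speed, and c_reduced = 300 km/s (just above the sound speed), the time step lands near 0.1 s, consistent with the abstract's estimate.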

  5. A Random Walk on WASP-12b with the Bayesian Atmospheric Radiative Transfer (BART) Code

    NASA Astrophysics Data System (ADS)

    Harrington, Joseph; Cubillos, Patricio; Blecic, Jasmina; Challener, Ryan; Rojo, Patricio; Lust, Nathaniel B.; Bowman, Oliver; Blumenthal, Sarah D.; Foster, Andrew S. D.; Foster, Austin James; Stemm, Madison; Bruce, Dylan

    2016-01-01

    We present the Bayesian Atmospheric Radiative Transfer (BART) code for atmospheric property retrievals from transit and eclipse spectra, and apply it to WASP-12b, a hot (~3000 K) exoplanet with a high eclipse signal-to-noise ratio. WASP-12b has been controversial. We (Madhusudhan et al. 2011, Nature) claimed it was the first planet with a high C/O abundance ratio. Line et al. (2014, ApJ) suggested a high CO2 abundance to explain the data. Stevenson et al. (2014, ApJ, atmospheric model by Madhusudhan) add additional data and reaffirm the original result, stating that C2H2 and HCN, not included in the Line et al. models, explain the data. We explore several modeling configurations and include Hubble, Spitzer, and ground-based eclipse data.BART consists of a differential-evolution Markov-Chain Monte Carlo sampler that drives a line-by-line radiative transfer code through the phase space of thermal- and abundance-profile parameters. BART is written in Python and C. Python modules generate atmospheric profiles from sets of MCMC parameters and integrate the resulting spectra over observational bandpasses, allowing high flexibility in modeling the planet without interacting with the fast, C portions that calculate the spectra. BART's shared memory and optimized opacity calculation allow it to run on a laptop, enabling classroom use. Runs can scale constant abundance profiles, profiles of thermochemical equilibrium abundances (TEA) calculated by the included TEA code, or arbitrary curves. Several thermal profile parameterizations are available. BART is an open-source, reproducible-research code. Users must release any code or data modifications if they publish results from it, and we encourage the community to use it and to participate in its development via http://github.com/ExOSPORTS/BART.This work was supported by NASA Planetary Atmospheres grant NNX12AI69G and NASA Astrophysics Data Analysis Program grant NNX13AF38G. J. Blecic holds a NASA Earth and Space Science
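    The differential-evolution MCMC sampler at BART's core proposes moves built from the difference of two other chains. A generic sketch of one generation of the ter Braak (2006) scheme (a toy illustration, not BART's actual sampler; the function name, tuning constants, and pure-Python style are assumptions):

```python
import math, random

def de_mc_step(chains, log_post, gamma=None, eps=1e-6, rng=random.Random(0)):
    """One generation of differential-evolution MCMC: each chain proposes
    x' = x + gamma * (x_r1 - x_r2) + small noise, using two other randomly
    chosen chains, and accepts with the Metropolis rule. chains is a list
    of parameter vectors (lists); log_post maps a vector to log posterior."""
    n, d = len(chains), len(chains[0])
    if gamma is None:
        gamma = 2.38 / math.sqrt(2 * d)   # ter Braak's recommended scale
    new = []
    for i, x in enumerate(chains):
        r1, r2 = rng.sample([j for j in range(n) if j != i], 2)
        prop = [x[k] + gamma * (chains[r1][k] - chains[r2][k])
                + rng.gauss(0.0, eps) for k in range(d)]
        if math.log(rng.random()) < log_post(prop) - log_post(x):
            new.append(prop)   # accept the proposal
        else:
            new.append(x)      # keep the current state
    return new
```

    Run on a one-dimensional standard-normal log posterior, the population of chains relaxes toward samples with mean near 0 and unit variance.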

  6. WASP-12b According to the Bayesian Atmospheric Radiative Transfer (BART) Code

    NASA Astrophysics Data System (ADS)

    Harrington, Joseph; Cubillos, Patricio E.; Blecic, Jasmina; Challener, Ryan C.; Rojo, Patricio M.; Lust, Nate B.; Bowman, M. Oliver; Blumenthal, Sarah D.; Foster, Andrew SD; Foster, A. J.

    2015-11-01

    We present the Bayesian Atmospheric Radiative Transfer (BART) code for atmospheric property retrievals from transit and eclipse spectra, and apply it to WASP-12b, a hot (~3000 K) exoplanet with a high eclipse signal-to-noise ratio. WASP-12b has been controversial. We (Madhusudhan et al. 2011, Nature) claimed it was the first planet with a high C/O abundance ratio. Line et al. (2014, ApJ) suggested a high CO2 abundance to explain the data. Stevenson et al. (2014, ApJ, atmospheric model by Madhusudhan) add additional data and reaffirm the original result, stating that C2H2 and HCN, not included in the Line et al. models, explain the data. We explore several modeling configurations and include Hubble, Spitzer, and ground-based eclipse data.BART consists of a differential-evolution Markov-Chain Monte Carlo sampler that drives a line-by-line radiative transfer code through the phase space of thermal- and abundance-profile parameters. BART is written in Python and C. Python modules generate atmospheric profiles from sets of MCMC parameters and integrate the resulting spectra over observational bandpasses, allowing high flexibility in modeling the planet without interacting with the fast, C portions that calculate the spectra. BART's shared memory and optimized opacity calculation allow it to run on a laptop, enabling classroom use. Runs can scale constant abundance profiles, profiles of thermochemical equilibrium abundances (TEA) calculated by the included TEA code, or arbitrary curves. Several thermal profile parameterizations are available. BART is an open-source, reproducible-research code. Users must release any code or data modifications if they publish results from it, and we encourage the community to use it and to participate in its development via http://github.com/ExOSPORTS/BART.This work was supported by NASA Planetary Atmospheres grant NNX12AI69G and NASA Astrophysics Data Analysis Program grant NNX13AF38G. J. Blecic holds a NASA Earth and Space Science

  7. Specification and Prediction of the Radiation Environment Using Data Assimilative VERB code

    NASA Astrophysics Data System (ADS)

    Shprits, Yuri; Kellerman, Adam

    2016-07-01

    We discuss how data assimilation can be used for reconstruction of the long-term evolution of the radiation belts and ring current, for benchmarking of physics-based codes, and for improving nowcasting and forecasting. We also discuss advanced data assimilation methods such as parameter estimation and smoothing. We present a number of data assimilation applications using the VERB 3D code. The 3D data-assimilative VERB allows us to blend together data from GOES, RBSP A, and RBSP B. 1) The model with data assimilation allows us to propagate data to different pitch angles, energies, and L-shells and blends them together with the physics-based VERB code in an optimal way. We illustrate how to use this capability for the analysis of previous events and for obtaining a global and statistical view of the system. 2) Model predictions strongly depend on the initial conditions given to the model; the model is only as good as the initial conditions it uses. To produce the best possible initial conditions, data from different sources (GOES, RBSP A and B, and our empirical model predictions based on ACE) are blended together in an optimal way by means of data assimilation, as described above. The resulting initial conditions have no gaps, which allows us to make more accurate predictions. A real-time prediction framework operating on our website, based on GOES, RBSP A and B, and ACE data and the 3D VERB code, is presented and discussed.
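The "optimal blending" of model forecast and multi-spacecraft data is, at its core, a Kalman update. A deliberately scalar sketch follows; the flux values and error variances are invented for illustration and are not VERB's actual state vector or covariances.

```python
def kalman_update(x_f, P_f, obs, obs_var):
    """Sequentially assimilate scalar observations into a forecast x_f with variance P_f."""
    x, P = x_f, P_f
    for z, r in zip(obs, obs_var):
        K = P / (P + r)         # Kalman gain: relative weight of the observation
        x = x + K * (z - x)     # analysis pulled toward the data
        P = (1.0 - K) * P       # analysis variance shrinks with each observation
    return x, P

# model forecast of (log10) flux, then "observations" from GOES, RBSP-A, RBSP-B
x_a, P_a = kalman_update(x_f=5.0, P_f=1.0, obs=[5.6, 5.4, 5.5], obs_var=[0.2, 0.2, 0.2])
print(x_a, P_a)    # analysis lies between forecast and observations, with reduced variance
```

The analysis (about 5.47 with variance 0.0625 here) fills the role the abstract describes: a gap-free, optimally weighted initial condition combining all available sources.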

  8. The FLUKA code: new developments and applications for radiation protection in deep space

    NASA Astrophysics Data System (ADS)

    Ferrari, A.; Fluka Collaboration

    The FLUKA code is used for dosimetry, radioprotection and physics simulations in several fields, ranging from accelerators to commercial flight dosimetry and space radiation. It is the code used for all radioprotection and dosimetry applications at CERN, in particular for the Large Hadron Collider project, which when operational in 2008 will accelerate protons up to 7 TeV, and Pb ions up to 2.7 TeV/n. In recent years, the code underwent significant improvements in the treatment of heavy ion beams in an energy range covering therapy, space dosimetry and fundamental physics related to galactic cosmic rays and future LHC beams. The talk will present the latest developments in the modelling of nucleus-nucleus interactions, including the implementation of a model for ion electromagnetic dissociation. The talk will also include an application of FLUKA models to an Fe ion beam of interest for dosimetry and radiobiological applications and experiments. Various results obtained with the models, as well as several issues related to hadron beam dosimetry, will be presented and discussed.

  9. A Radiation Chemistry Code Based on the Green's Functions of the Diffusion Equation

    NASA Technical Reports Server (NTRS)

    Plante, Ianik; Wu, Honglu

    2014-01-01

    Ionizing radiation produces several radiolytic species, such as •OH, e−aq, and H•, when interacting with biological matter. Following their creation, radiolytic species diffuse and chemically react with biological molecules such as DNA. Despite years of research, many questions on DNA damage by ionizing radiation remain, notably on the indirect effect, i.e. the damage resulting from the reactions of the radiolytic species with DNA. To simulate DNA damage by ionizing radiation, we are developing a step-by-step radiation chemistry code that is based on the Green's functions of the diffusion equation (GFDE), which is able to follow the trajectories of all particles and their reactions with time. In recent years, simulations based on the GFDE have been used extensively in biochemistry, notably to simulate biochemical networks in time and space, and are often used as the "gold standard" to validate diffusion-reaction theories. The exact GFDE for partially diffusion-controlled reactions is difficult to use because of its complex form; therefore, the radial Green's function, which is much simpler, is often used. Hence, much effort has been devoted to the sampling of the radial Green's functions, for which we have developed a sampling algorithm. This algorithm only yields the inter-particle distance vector length after a time step; the sampling of the deviation angle of the inter-particle vector is not taken into consideration. In this work, we show that the radial distribution is predicted by the exact radial Green's function. We also use a technique developed by Clifford et al. to generate the inter-particle vector deviation angles, knowing the inter-particle vector length before and after a time step. The results are compared with those predicted by the exact GFDE and by the analytical angular functions for free diffusion.
This first step in the creation of the radiation chemistry code should help the understanding of the contribution of the indirect effect in the
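For the free-diffusion limit mentioned above, both the new separation and the deviation angle can be sampled in one step, because the Green's function of the inter-particle vector is an isotropic Gaussian. A sketch follows; the relative diffusion coefficient, time step, and initial separation are invented values for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)
D_rel = 1.0e-9    # relative diffusion coefficient of the pair (m^2/s), invented
dt = 1.0e-12      # time step (s), invented
r0 = 1.0e-9       # initial inter-particle separation (m), invented

def step_free(r0, n=200_000):
    """One free-diffusion step of the inter-particle vector, initially (r0, 0, 0)."""
    sigma = np.sqrt(2.0 * D_rel * dt)                 # per-axis displacement std dev
    r = np.array([r0, 0.0, 0.0]) + rng.normal(0.0, sigma, size=(n, 3))
    dist = np.linalg.norm(r, axis=1)                  # new separation |r|
    cos_theta = r[:, 0] / dist                        # cosine of the deviation angle
    return dist, cos_theta

dist, cos_t = step_free(r0)
print(dist.mean(), cos_t.mean())   # separation barely changes; deviation angle stays small
```

The partially diffusion-controlled case treated in the paper replaces this Gaussian with the much more complicated exact GFDE; the free case above is only the analytic reference against which those results are compared.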

  10. Review of Fast Monte Carlo Codes for Dose Calculation in Radiation Therapy Treatment Planning

    PubMed Central

    Jabbari, Keyvan

    2011-01-01

    An important requirement in radiation therapy is a fast and accurate treatment planning system. This system, using computed tomography (CT) data and the direction and characteristics of the beam, calculates the dose at all points of the patient's volume. The two main factors in a treatment planning system are accuracy and speed, and various generations of treatment planning systems have been developed according to these factors. This article is a review of fast Monte Carlo treatment planning algorithms, which are accurate and fast at the same time. The Monte Carlo techniques are based on the transport of each individual particle (e.g., photon or electron) in the tissue. The transport of the particle is done using the physics of the interaction of the particles with matter; other techniques transport the particles as a group. For a typical dose calculation in radiation therapy the code has to transport several million particles, which takes a few hours; therefore, the Monte Carlo techniques are accurate but slow for clinical use. In recent years, with the development of the ‘fast’ Monte Carlo systems, one is able to perform dose calculation in a reasonable time for clinical use. The acceptable time for dose calculation is in the range of one minute. There is currently a growing interest in fast Monte Carlo treatment planning systems, and there are many commercial treatment planning systems that perform dose calculation in radiation therapy based on the Monte Carlo technique. PMID:22606661
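The per-particle transport idea can be seen in a toy slab problem: each photon's free path is sampled from an exponential distribution, and the fraction of survivors converges to the analytic transmission exp(-μd). The attenuation coefficient, thickness, and history count below are arbitrary example values, not clinical parameters.

```python
import math, random

random.seed(0)
mu, d, n = 0.2, 5.0, 200_000   # attenuation coeff (1/cm), slab thickness (cm), histories

# one photon at a time: exponential free path; it crosses the slab if the path exceeds d
transmitted = sum(1 for _ in range(n) if -math.log(random.random()) / mu > d)
frac = transmitted / n
print(frac, math.exp(-mu * d))   # Monte Carlo tally vs. analytic transmission
```

A real treatment-planning code tracks scattering, energy loss, and secondaries through heterogeneous CT voxels, which is exactly why millions of histories are needed and why the "fast" variants reviewed here matter.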

  11. A study of the earth radiation budget using a 3D Monte-Carlo radiative transfer code

    NASA Astrophysics Data System (ADS)

    Okata, M.; Nakajima, T.; Sato, Y.; Inoue, T.; Donovan, D. P.

    2013-12-01

    The purpose of this study is to evaluate the earth's radiation budget when data are available from satellite-borne active sensors, i.e. cloud profiling radar (CPR) and lidar, and a multi-spectral imager (MSI) in the project of the Earth Explorer/EarthCARE mission. For this purpose, we first developed forward and backward 3D Monte Carlo radiative transfer codes that can treat a broadband solar flux calculation including thermal infrared emission calculation by k-distribution parameters of Sekiguchi and Nakajima (2008). In order to construct the 3D cloud field, we tried the following three methods: 1) stochastic clouds generated from a randomized optical-thickness distribution in each layer, plus regularly-distributed tilted clouds, 2) numerical simulations by a non-hydrostatic model with a bin cloud microphysics model, and 3) the Minimum cloud Information Deviation Profiling Method (MIDPM), as explained later. As for method 2 (numerical modeling), we employed numerical simulation results of Californian summer stratus clouds simulated by a non-hydrostatic atmospheric model with a bin-type cloud microphysics model based on the JMA NHM model (Iguchi et al., 2008; Sato et al., 2009, 2012) with horizontal (vertical) grid spacing of 100m (20m) and 300m (20m) in a domain of 30km (x), 30km (y), 1.5km (z) and with a horizontally periodic lateral boundary condition. Two different cell systems were simulated depending on the cloud condensation nuclei (CCN) concentration. In the case of horizontal resolution of 100m, the regionally averaged cloud optical thickness (COT) and its standard deviation were 3.0 and 4.3 for the pristine case and 8.5 and 7.4 for the polluted case, respectively. In the MIDPM method, we first construct a library of pairs of observed vertical profiles from active sensors and collocated imager products at the nadir footprint, i.e. spectral imager radiances, COT, effective particle radius (RE) and cloud top temperature (Tc). We then select a

  12. The BIANCA model/code of radiation-induced cell death: application to human cells exposed to different radiation types.

    PubMed

    Ballarini, Francesca; Altieri, Saverio; Bortolussi, Silva; Carante, Mario; Giroletti, Elio; Protti, Nicoletta

    2014-08-01

    This paper presents a biophysical model of radiation-induced cell death, implemented as a Monte Carlo code called BIophysical ANalysis of Cell death and chromosome Aberrations (BIANCA), based on the assumption that some chromosome aberrations (dicentrics, rings, and large deletions, called "lethal aberrations") lead to clonogenic inactivation. In turn, chromosome aberrations are assumed to derive from clustered, and thus severe, DNA lesions (called "cluster lesions," or CL) interacting at the micrometer scale; the CL yield and the threshold distance governing CL interaction are the only model parameters. After a pilot study on V79 hamster cells exposed to protons and carbon ions, in the present work the model was extended and applied to AG1522 human cells exposed to photons, He ions, and heavier ions including carbon and neon. The agreement with experimental survival data taken from the literature supported the assumptions. In particular, the inactivation of AG1522 cells was explained by lethal aberrations not only for X-rays, as already reported by others, but also for the aforementioned radiation types. Furthermore, the results are consistent with the hypothesis that the critical initial lesions leading to cell death are DNA cluster lesions having yields on the order of ~2 CL Gy-1 cell-1 at low LET and ~20 CL Gy-1 cell-1 at high LET, and that the processing of these lesions is modulated by proximity effects at the micrometer scale related to interphase chromatin organization. The model was then applied to calculate the fraction of inactivated cells, as well as the yields of lethal aberrations and cluster lesions, as a function of LET; the results showed a maximum around 130 keV/μm, and this maximum was much higher for cluster lesions and lethal aberrations than for cell inactivation.
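The lethal-aberration assumption links to clonogenic survival through Poisson statistics: if a dose induces on average m lethal aberrations per cell, the surviving (zero-aberration) fraction is S = exp(-m). The sketch below uses the order-of-magnitude CL yields quoted in the abstract, but the `lethality` fraction converting cluster lesions to lethal aberrations is an invented illustrative number, not a BIANCA parameter.

```python
import math

def survival(dose_gy, cl_per_gy, lethality=0.1):
    """Surviving fraction, assuming a fraction `lethality` of cluster lesions
    becomes a lethal aberration (the 0.1 value is invented for illustration)."""
    m = lethality * cl_per_gy * dose_gy    # mean lethal aberrations per cell
    return math.exp(-m)                    # Poisson zero-aberration probability

print(survival(2.0, 2.0))     # low-LET yield (~2 CL/Gy/cell) at 2 Gy
print(survival(2.0, 20.0))    # high-LET yield (~20 CL/Gy/cell) at 2 Gy
```

The tenfold difference in CL yield translates into a dramatic survival difference (roughly 0.67 versus 0.02 here), which is the qualitative low- versus high-LET behavior the model reproduces.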

  13. CEM2k and LAQGSM Codes as Event-Generators for Space Radiation Shield and Cosmic Rays Propagation Applications

    NASA Technical Reports Server (NTRS)

    Mashnik, S. G.; Gudima, K. K.; Sierk, A. J.; Moskalenko, I. V.

    2002-01-01

    Space radiation shield applications and studies of cosmic ray propagation in the Galaxy require reliable cross sections to calculate spectra of secondary particles and yields of the isotopes produced in nuclear reactions induced both by particles and nuclei at energies from threshold to hundreds of GeV per nucleon. Since the data often exist in a very limited energy range or sometimes not at all, the only way to obtain an estimate of the production cross sections is to use theoretical models and codes. Recently, we have developed improved versions of the Cascade-Exciton Model (CEM) of nuclear reactions: the codes CEM97 and CEM2k for description of particle-nucleus reactions at energies up to about 5 GeV. In addition, we have developed a LANL version of the Quark-Gluon String Model (LAQGSM) to describe reactions induced both by particles and nuclei at energies up to hundreds of GeV/nucleon. We have tested and benchmarked the CEM and LAQGSM codes against a large variety of experimental data and have compared their results with predictions by other currently available models and codes. Our benchmarks show that CEM and LAQGSM codes have predictive powers no worse than other currently used codes and describe many reactions better than other codes; therefore both our codes can be used as reliable event-generators for space radiation shield and cosmic ray propagation applications. The CEM2k code is being incorporated into the transport code MCNPX (and several other transport codes), and we plan to incorporate LAQGSM into MCNPX in the near future. Here, we present the current status of the CEM2k and LAQGSM codes, and show results and applications to studies of cosmic ray propagation in the Galaxy.

  15. Updating the Tools Used to Estimate Space Radiation Exposures for Operations: Codes, Models and Interfaces

    NASA Astrophysics Data System (ADS)

    Zapp, E.; Shelfer, T.; Semones, E.; Johnson, A.; Weyland, M.; Golightly, M.; Smith, G.; Dardano, C.

    increase in speed due to "in-lining" calculations and restructuring of the algorithms in a manner that calls for fewer elemental calculations, as well as time saved through better interfaces with geometry models and code input routines. The overall result is to enhance the radiation protection capabilities available for manned space flight.

  16. An object-oriented implementation of a parallel Monte Carlo code for radiation transport

    NASA Astrophysics Data System (ADS)

    Santos, Pedro Duarte; Lani, Andrea

    2016-05-01

    This paper describes the main features of a state-of-the-art Monte Carlo solver for radiation transport which has been implemented within COOLFluiD, a world-class open source object-oriented platform for scientific simulations. The Monte Carlo code makes use of efficient ray tracing algorithms (for 2D, axisymmetric and 3D arbitrary unstructured meshes) which are described in detail. The solver accuracy is first verified in test cases for which analytical solutions are available, then validated for a space re-entry flight experiment (i.e. FIRE II) for which comparisons against both experiments and reference numerical solutions are provided. Through the flexible design of the physical models, ray tracing and parallelization strategy (fully reusing the mesh decomposition inherited by the fluid simulator), the implementation was made efficient and reusable.

  17. Implementation, capabilities, and benchmarking of Shift, a massively parallel Monte Carlo radiation transport code

    NASA Astrophysics Data System (ADS)

    Pandya, Tara M.; Johnson, Seth R.; Evans, Thomas M.; Davidson, Gregory G.; Hamilton, Steven P.; Godfrey, Andrew T.

    2016-03-01

    This work discusses the implementation, capabilities, and validation of Shift, a massively parallel Monte Carlo radiation transport package authored at Oak Ridge National Laboratory. Shift has been developed to scale well from laptops to small computing clusters to advanced supercomputers and includes features such as support for multiple geometry and physics engines, hybrid capabilities for variance reduction methods such as the Consistent Adjoint-Driven Importance Sampling methodology, advanced parallel decompositions, and tally methods optimized for scalability on supercomputing architectures. The scaling studies presented in this paper demonstrate good weak and strong scaling behavior for the implemented algorithms. Shift has also been validated and verified against various reactor physics benchmarks, including the Consortium for Advanced Simulation of Light Water Reactors' Virtual Environment for Reactor Analysis criticality test suite and several Westinghouse AP1000® problems presented in this paper. These benchmark results compare well to those from other contemporary Monte Carlo codes such as MCNP5 and KENO.
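Strong-scaling results of the kind reported here are conventionally summarized with Amdahl's law: speedup = 1 / (s + (1 - s)/N) for serial fraction s on N processes. The 5% serial fraction below is an arbitrary example, not a Shift measurement.

```python
def amdahl(serial_frac, n_procs):
    """Predicted strong-scaling speedup for a given serial fraction."""
    return 1.0 / (serial_frac + (1.0 - serial_frac) / n_procs)

for n in (1, 16, 256, 4096):
    print(n, round(amdahl(0.05, n), 1))   # speedup plateaus near 1/0.05 = 20
```

The plateau is why Monte Carlo codes targeting supercomputers work so hard (parallel decompositions, scalable tallies) to drive the effective serial fraction toward zero.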

  18. Development and validation of a GEANT4 radiation transport code for CT dosimetry

    PubMed Central

    Carver, DE; Kost, SD; Fernald, MJ; Lewis, KG; Fraser, ND; Pickens, DR; Price, RR; Stabin, MG

    2014-01-01

    We have created a radiation transport code using the GEANT4 Monte Carlo toolkit to simulate pediatric patients undergoing CT examinations. The focus of this paper is to validate our simulation with real-world physical dosimetry measurements using two independent techniques. Exposure measurements were made with a standard 100-mm CT pencil ionization chamber, and absorbed doses were also measured using optically stimulated luminescent (OSL) dosimeters. Measurements were made in air, a standard 16-cm acrylic head phantom, and a standard 32-cm acrylic body phantom. Physical dose measurements determined from the ionization chamber in air for 100 and 120 kVp beam energies were used to derive photon-fluence calibration factors. Both ion chamber and OSL measurement results provide useful comparisons in the validation of our Monte Carlo simulations. We found that simulated and measured CTDI values were within an overall average of 6% of each other. PMID:25706135

  19. Code System for Calculating Radiation Exposure Resulting from Accidental Radioactive Releases to the Hydrosphere.

    SciTech Connect

    1982-11-18

    Version 00 LPGS was developed to calculate the radiological impacts resulting from radioactive releases to the hydrosphere. The name LPGS was derived from the Liquid Pathway Generic Study for which the original code was used primarily as an analytic tool in the assessment process. The hydrosphere is represented by the following types of water bodies: estuary, small river, well, lake, and one-dimensional (1-D) river. LPGS is designed to calculate radiation dose (individual and population) to body organs as a function of time for the various exposure pathways. The radiological consequences to the aquatic biota are estimated. Several simplified radionuclide transport models are employed with built-in formulations to describe the release rate of the radionuclides. A tabulated user-supplied release model can be input, if desired. Printer plots of dose versus time for the various exposure pathways are provided.

  20. The Monte Carlo code MCPTV--Monte Carlo dose calculation in radiation therapy with carbon ions.

    PubMed

    Karg, Juergen; Speer, Stefan; Schmidt, Manfred; Mueller, Reinhold

    2010-07-07

    The Monte Carlo code MCPTV is presented. MCPTV is designed for dose calculation in treatment planning in radiation therapy with particles and especially carbon ions. MCPTV has a voxel-based concept and can perform a fast calculation of the dose distribution on patient CT data. Material and density information from CT are taken into account. Electromagnetic and nuclear interactions are implemented. Furthermore the algorithm gives information about the particle spectra and the energy deposition in each voxel. This can be used to calculate the relative biological effectiveness (RBE) for each voxel. Depth dose distributions are compared to experimental data giving good agreement. A clinical example is shown to demonstrate the capabilities of the MCPTV dose calculation.

  1. Comparison of spent-fuel cask radiation doses calculated by one- and two-dimensional shielding codes

    SciTech Connect

Carbajo, J.J.

    1992-01-01

    Spent-fuel cask shield design and calculation of radiation doses are major parts of the overall cask design. This paper compares radiation doses calculated by one- and two-dimensional or three-dimensional shielding codes. The paper also investigates the appropriateness of using one-dimensional codes for two-dimensional geometries. From these results, it can be concluded that the one-dimensional XSDRNPM/XSDOSE codes are adequate for both radial and axial shielding calculations if appropriate bucklings are used. For radial calculations, no buckling or a buckling equal to the length of the fuel are appropriate. For axial calculations, a buckling at least equal to the diameter of the cask must be used for neutron doses. For gamma axial doses, a buckling around the diameter of the fuel region is adequate. More complicated two- or three-dimensional codes are not needed for these types of problems.

  2. Acoustic radiation force impulse (ARFI) imaging of zebrafish embryo by high-frequency coded excitation sequence.

    PubMed

    Park, Jinhyoung; Lee, Jungwoo; Lau, Sien Ting; Lee, Changyang; Huang, Ying; Lien, Ching-Ling; Kirk Shung, K

    2012-04-01

    Acoustic radiation force impulse (ARFI) imaging has been developed as a non-invasive method for quantitative illustration of tissue stiffness or displacement. Conventional ARFI imaging (2-10 MHz) has been implemented in commercial scanners for illustrating elastic properties of several organs. The image resolution, however, is too coarse to study mechanical properties of micro-sized objects such as cells. This article thus presents a high-frequency coded excitation ARFI technique, with the ultimate goal of displaying elastic characteristics of cellular structures. Tissue-mimicking phantoms and zebrafish embryos are imaged with a 100-MHz lithium niobate (LiNbO₃) transducer, by cross-correlating tracked RF echoes with the reference. The phantom results show that the contrast of the ARFI image with coded excitation (14 dB) is better than that of the conventional ARFI image (9 dB). The depths of penetration are 2.6 and 2.2 mm, respectively. The stiffness data of the zebrafish demonstrate that the envelope is harder than the embryo region. The temporal displacement changes at the embryo and the chorion are as large as 36 and 3.6 μm, respectively. Consequently, this high-frequency ARFI approach may serve as a remote palpation imaging tool that reveals viscoelastic properties of small biological samples.
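The displacement-tracking step, cross-correlating tracked RF echoes with a reference, can be sketched as follows. The pulse parameters and the 7-sample shift are invented for the example; a real system would also interpolate for sub-sample displacements.

```python
import numpy as np

fs, f0 = 500e6, 100e6                      # sampling and center frequency (Hz), invented
t = np.arange(0.0, 2e-7, 1.0 / fs)         # 100 samples of RF signal
ref = np.sin(2 * np.pi * f0 * t) * np.exp(-((t - 1e-7) ** 2) / (2e-8) ** 2)

shift = 7                                   # "true" displacement, in samples
tracked = np.roll(ref, shift)               # tracked echo = delayed copy of the reference

xcorr = np.correlate(tracked, ref, mode="full")
lag = int(np.argmax(xcorr)) - (len(ref) - 1)   # correlation-peak position -> lag
print(lag)                                  # recovers the 7-sample shift
```

The Gaussian envelope is what makes the global correlation peak unambiguous; a pure sinusoid would have equal peaks at every period, which is one motivation for coded excitation in practice.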

  3. Development of Parallel Computing Framework to Enhance Radiation Transport Code Capabilities for Rare Isotope Beam Facility Design

    SciTech Connect

    Kostin, Mikhail; Mokhov, Nikolai; Niita, Koji

    2013-09-25

    A parallel computing framework has been developed for use with general-purpose radiation transport codes. The framework was implemented as a C++ module that uses MPI for message passing. It is intended to be used with older radiation transport codes implemented in Fortran 77, Fortran 90, or C. The module is largely independent of the radiation transport codes it can be used with and is connected to them by means of a number of interface functions. The framework was developed and tested in conjunction with the MARS15 code. It is possible to use it with other codes, such as PHITS, FLUKA and MCNP, after certain adjustments. Besides the parallel computing functionality, the framework offers a checkpoint facility that allows restarting calculations from a saved checkpoint file. The checkpoint facility can be used in single-process calculations as well as in the parallel regime. The framework corrects some of the known problems with the scheduling and load balancing found in the original implementations of the parallel computing functionality in MARS15 and PHITS. The framework can be used efficiently on homogeneous systems and networks of workstations, where interference from other users is possible.
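The checkpoint facility can be illustrated in miniature: histories are processed in batches, and the batch index, accumulated tally, and RNG state are saved after each batch so that an interrupted run resumes where it stopped instead of repeating work. This is a hypothetical sketch, not the framework's actual C++/MPI API; the batch sizes and "physics" are stand-ins.

```python
import os
import pickle
import random

CKPT = "run.ckpt"   # hypothetical checkpoint file name

def run(n_batches=10, batch=1000):
    """Process batches of histories, checkpointing progress after each one."""
    if os.path.exists(CKPT):                      # resume from a prior run
        with open(CKPT, "rb") as f:
            start, tally, rng_state = pickle.load(f)
        random.setstate(rng_state)
    else:                                         # fresh start
        start, tally = 0, 0.0
        random.seed(42)
    for b in range(start, n_batches):
        tally += sum(random.random() for _ in range(batch))   # stand-in "physics"
        with open(CKPT, "wb") as f:               # save progress + RNG state
            pickle.dump((b + 1, tally, random.getstate()), f)
    return tally / (n_batches * batch)

mean = run()
print(mean)          # sample mean of uniform draws, near 0.5
os.remove(CKPT)      # clean up the checkpoint file
```

Saving the RNG state alongside the tally is the essential detail: it makes a resumed run statistically identical to an uninterrupted one.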

  4. Implementation of tetrahedral-mesh geometry in Monte Carlo radiation transport code PHITS.

    PubMed

    Furuta, Takuya; Sato, Tatsuhiko; Han, Min; Yeom, Yeon; Kim, Chan; Brown, Justin; Bolch, Wesley

    2017-04-04

    A new function to treat tetrahedral-mesh geometry was implemented in the Particle and Heavy Ion Transport code System (PHITS). To accelerate the computational speed in the transport process, an original algorithm was introduced to initially prepare decomposition maps for the container box of the tetrahedral-mesh geometry. The computational performance was tested by conducting radiation transport simulations of 100 MeV protons and 1 MeV photons in a water phantom represented by tetrahedral mesh. The simulation was repeated with varying numbers of meshes and the required computational times were then compared with those of the conventional voxel representation. Our results show that the computational costs for each boundary crossing of the region mesh are essentially equivalent for both representations. This study suggests that the tetrahedral-mesh representation offers not only a flexible description of the transport geometry but also improvement of computational efficiency for the radiation transport. Due to the adaptability of tetrahedrons in both size and shape, dosimetrically equivalent objects can be represented by tetrahedrons with a much smaller number of meshes as compared to the voxelized representation. Our study additionally included dosimetric calculations using a computational human phantom. A significant acceleration of the computational speed, about 4 times, was confirmed by the adoption of a tetrahedral mesh over the traditional voxel mesh geometry.
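The elementary geometric query behind tetrahedral-mesh tracking is locating a point within a tetrahedron, e.g. via signed volumes (equivalently, the signs of barycentric coordinates). A minimal sketch follows; the decomposition-map acceleration described in the abstract is not shown.

```python
import numpy as np

def inside_tet(tet, p):
    """True if point p lies inside the tetrahedron given as a (4, 3) vertex array."""
    a, b, c, d = tet
    def svol(p0, p1, p2, p3):
        return np.dot(np.cross(p1 - p0, p2 - p0), p3 - p0)   # 6 * signed volume
    v = svol(a, b, c, d)
    # replace each vertex with p in turn; p is inside iff every
    # sub-tetrahedron's signed volume has the same sign as the whole
    subs = (svol(p, b, c, d), svol(a, p, c, d), svol(a, b, p, d), svol(a, b, c, p))
    return all(s * v >= 0 for s in subs)

unit = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
print(inside_tet(unit, np.array([0.1, 0.1, 0.1])))   # True
print(inside_tet(unit, np.array([0.6, 0.6, 0.6])))   # False
```

Precomputed maps from container-box cells to candidate tetrahedra keep the number of such tests per boundary crossing small, which is how the reported voxel-like per-crossing cost is achieved.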

  5. ODYSSEY: A PUBLIC GPU-BASED CODE FOR GENERAL RELATIVISTIC RADIATIVE TRANSFER IN KERR SPACETIME

    SciTech Connect

    Pu, Hung-Yi; Younsi, Ziri

    2016-04-01

    General relativistic radiative transfer calculations coupled with the calculation of geodesics in the Kerr spacetime are an essential tool for determining the images, spectra, and light curves from matter in the vicinity of black holes. Such studies are especially important for ongoing and upcoming millimeter/submillimeter very long baseline interferometry observations of the supermassive black holes at the centers of Sgr A* and M87. To this end we introduce Odyssey, a graphics processing unit (GPU) based code for ray tracing and radiative transfer in the Kerr spacetime. On a single GPU, the performance of Odyssey can exceed 1 ns per photon, per Runge–Kutta integration step. Odyssey is publicly available, fast, accurate, and flexible enough to be modified to suit the specific needs of new users. Along with a Graphical User Interface powered by a video-accelerated display architecture, we also present an educational software tool, Odyssey-Edu, for showing in real time how null geodesics around a Kerr black hole vary as a function of black hole spin and angle of incidence onto the black hole.
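The per-photon performance quoted above is per Runge-Kutta integration step. A generic fourth-order step looks as follows, here integrating a trivial circular orbit as a stand-in for the Kerr geodesic equations (which Odyssey of course evaluates on the GPU, not in Python).

```python
import numpy as np

def rk4_step(f, y, h):
    """One classical fourth-order Runge-Kutta step for dy/dt = f(y)."""
    k1 = f(y)
    k2 = f(y + 0.5 * h * k1)
    k3 = f(y + 0.5 * h * k2)
    k4 = f(y + h * k3)
    return y + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

circle = lambda y: np.array([-y[1], y[0]])   # stand-in ODE: uniform circular motion
y = np.array([1.0, 0.0])
h = 2 * np.pi / 1000
for _ in range(1000):                        # integrate one full period
    y = rk4_step(circle, y, h)
print(y)    # returns very close to the starting point (1, 0)
```

Because every photon's geodesic is independent, this step parallelizes trivially across GPU threads, which is what makes the nanosecond-per-photon-per-step figure attainable.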

  6. European Code against Cancer 4th Edition: Ultraviolet radiation and cancer.

    PubMed

    Greinert, Rüdiger; de Vries, Esther; Erdmann, Friederike; Espina, Carolina; Auvinen, Anssi; Kesminiene, Ausrele; Schüz, Joachim

    2015-12-01

    Ultraviolet radiation (UVR) is part of the electromagnetic spectrum emitted naturally from the sun or from artificial sources such as tanning devices. Acute skin reactions induced by UVR exposure are erythema (skin reddening), or sunburn, and the acquisition of a suntan triggered by UVR-induced DNA damage. UVR exposure is the main cause of skin cancer, including cutaneous malignant melanoma, basal-cell carcinoma, and squamous-cell carcinoma. Skin cancer is the most common cancer in fair-skinned populations, and its incidence has increased steeply over recent decades. According to estimates for 2012, about 100,000 new cases of cutaneous melanoma and about 22,000 deaths from it occurred in Europe. The main mechanisms by which UVR causes cancer are well understood. Exposure during childhood appears to be particularly harmful. Exposure to UVR is a risk factor modifiable by individuals' behaviour. Excessive exposure from natural sources can be avoided by seeking shade when the sun is strongest, by wearing appropriate clothing, and by appropriately applying sunscreens if direct sunlight is unavoidable. Exposure from artificial sources can be completely avoided by not using sunbeds. Beneficial effects of sun or UVR exposure, such as for vitamin D production, can be fully achieved while still avoiding too much sun exposure and the use of sunbeds. Taking all the scientific evidence together, the recommendation of the 4th edition of the European Code Against Cancer for ultraviolet radiation is: "Avoid too much sun, especially for children. Use sun protection. Do not use sunbeds."

  7. Implementation of tetrahedral-mesh geometry in Monte Carlo radiation transport code PHITS

    NASA Astrophysics Data System (ADS)

    Furuta, Takuya; Sato, Tatsuhiko; Han, Min Cheol; Yeom, Yeon Soo; Kim, Chan Hyeong; Brown, Justin L.; Bolch, Wesley E.

    2017-06-01

    A new function to treat tetrahedral-mesh geometry was implemented in the Particle and Heavy Ion Transport code System (PHITS). To accelerate the computational speed in the transport process, an original algorithm was introduced to initially prepare decomposition maps for the container box of the tetrahedral-mesh geometry. The computational performance was tested by conducting radiation transport simulations of 100 MeV protons and 1 MeV photons in a water phantom represented by tetrahedral mesh. The simulation was repeated with varying numbers of meshes and the required computational times were then compared with those of the conventional voxel representation. Our results show that the computational costs for each boundary crossing of the region mesh are essentially equivalent for both representations. This study suggests that the tetrahedral-mesh representation offers not only a flexible description of the transport geometry but also improvement of computational efficiency for the radiation transport. Due to the adaptability of tetrahedrons in both size and shape, dosimetrically equivalent objects can be represented by tetrahedrons with a much smaller number of meshes as compared to the voxelized representation. Our study additionally included dosimetric calculations using a computational human phantom. A significant acceleration of the computational speed, about 4 times, was confirmed by the adoption of a tetrahedral mesh over the traditional voxel mesh geometry.

  8. Odyssey: A Public GPU-based Code for General Relativistic Radiative Transfer in Kerr Spacetime

    NASA Astrophysics Data System (ADS)

    Pu, Hung-Yi; Yun, Kiyun; Younsi, Ziri; Yoon, Suk-Jin

    2016-04-01

    General relativistic radiative transfer calculations coupled with the calculation of geodesics in the Kerr spacetime are an essential tool for determining the images, spectra, and light curves from matter in the vicinity of black holes. Such studies are especially important for ongoing and upcoming millimeter/submillimeter very long baseline interferometry observations of the supermassive black holes at the Galactic Center (Sgr A*) and in M87. To this end we introduce Odyssey, a graphics processing unit (GPU) based code for ray tracing and radiative transfer in the Kerr spacetime. On a single GPU, Odyssey can reach a performance of about 1 ns per photon per Runge-Kutta integration step. Odyssey is publicly available, fast, accurate, and flexible enough to be modified to suit the specific needs of new users. Along with a Graphical User Interface powered by a video-accelerated display architecture, we also present an educational software tool, Odyssey_Edu, for showing in real time how null geodesics around a Kerr black hole vary as a function of black hole spin and angle of incidence onto the black hole.
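    Ray tracers like Odyssey advance each photon's geodesic with Runge-Kutta steps. As an illustration only (a generic sketch in Python, not the GPU-based Odyssey source), here is the classic fixed-step fourth-order Runge-Kutta update applied to an arbitrary state vector:

```python
# Generic RK4 stepping, the integration scheme named in the abstract.
import numpy as np

def rk4_step(f, t, y, h):
    """One fourth-order Runge-Kutta step of size h for dy/dt = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

def integrate(f, t0, y0, t1, n):
    """March from t0 to t1 in n RK4 steps, as a ray tracer would march a
    photon state (position and momentum) along its geodesic."""
    h = (t1 - t0) / n
    t, y = t0, np.asarray(y0, dtype=float)
    for _ in range(n):
        y = rk4_step(f, t, y, h)
        t += h
    return y
```

    In a geodesic integrator, `y` would hold the photon's coordinates and momenta and `f` the Kerr geodesic equations; the per-step cost of this update is what the "1 ns per photon per Runge-Kutta step" figure measures.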

  9. Radiation environment at the Moon: Comparisons of transport code modeling and measurements from the CRaTER instrument

    NASA Astrophysics Data System (ADS)

    Porter, Jamie A.; Townsend, Lawrence W.; Spence, Harlan; Golightly, Michael; Schwadron, Nathan; Kasper, Justin; Case, Anthony W.; Blake, John B.; Zeitlin, Cary

    2014-06-01

    The Cosmic Ray Telescope for the Effects of Radiation (CRaTER), an instrument carried on the Lunar Reconnaissance Orbiter spacecraft, directly measures the energy depositions by solar and galactic cosmic radiations in its silicon wafer detectors. These energy depositions are converted to linear energy transfer (LET) spectra. High LET particles, which are mainly high-energy heavy ions found in the incident cosmic ray spectrum, or target fragments and recoils produced by protons and heavier ions, are of particular importance because of their potential to cause significant damage to human tissue and electronic components. Aside from providing LET data useful for space radiation risk analyses for lunar missions, the observed LET spectra can also be used to help validate space radiation transport codes, used for shielding design and risk assessment applications, which is a major thrust of this work. In this work the Monte Carlo transport code HETC-HEDS (High-Energy Transport Code-Human Exploration and Development in Space) is used to estimate LET contributions from the incident primary ions and their charged secondaries produced by nuclear collisions as they pass through the three pairs of silicon detectors. Also in this work, the contributions to the LET of the primary ions and their charged secondaries are analyzed and compared with estimates obtained using the deterministic space radiation code HZETRN 2010, developed at NASA Langley Research Center. LET estimates obtained from the two transport codes are compared with measurements of LET from the CRaTER instrument during the mission. Overall, a comparison of the LET predictions of the HETC-HEDS code to the predictions of the HZETRN code displays good agreement. The code predictions are also in good agreement with the CRaTER LET measurements above 15 keV/µm but differ from the measurements for smaller values of LET. 
A possible reason for this disagreement between measured and calculated spectra below 15 keV/µm is an

  10. Development of a Space Radiation Monte-Carlo Computer Simulation Based on the FLUKA and ROOT Codes

    NASA Technical Reports Server (NTRS)

    Pinsky, L. S.; Wilson, T. L.; Ferrari, A.; Sala, Paola; Carminati, F.; Brun, R.

    2001-01-01

    The radiation environment in space is a complex problem to model. Trying to extrapolate the projections of that environment into all areas of the internal spacecraft geometry is even more daunting. With the support of our CERN colleagues, our research group in Houston is embarking on a project to develop a radiation transport tool that is tailored to the problem of taking the external radiation flux incident on any particular spacecraft and simulating the evolution of that flux through a geometrically accurate model of the spacecraft material. The output will be a prediction of the detailed nature of the resulting internal radiation environment within the spacecraft as well as its secondary albedo. Beyond doing the physics transport of the incident flux, the software tool we are developing will provide a self-contained stand-alone object-oriented analysis and visualization infrastructure. It will also include a graphical user interface and a set of input tools to facilitate the simulation of space missions in terms of nominal radiation models and mission trajectory profiles. The goal of this project is to produce a code that is considerably more accurate and user-friendly than existing Monte-Carlo-based tools for the evaluation of the space radiation environment. Furthermore, the code will be an essential complement to the currently existing analytic codes in the BRYNTRN/HZETRN family for the evaluation of radiation shielding. The code will be directly applicable to the simulation of environments in low earth orbit, on the lunar surface, on planetary surfaces (including the Earth) and in the interplanetary medium such as on a transit to Mars (and even in the interstellar medium). The software will include modules whose underlying physics base can continue to be enhanced and updated for physics content, as future data become available beyond the timeframe of the initial development now foreseen. 
This future maintenance will be available from the authors of FLUKA as

  11. Radiation transport codes for potential applications related to radiobiology and radiotherapy using protons, neutrons, and negatively charged pions

    NASA Technical Reports Server (NTRS)

    Armstrong, T. W.

    1972-01-01

    Several Monte Carlo radiation transport computer codes are used to predict quantities of interest in the fields of radiotherapy and radiobiology. The calculational methods are described and comparisons of calculated and experimental results are presented for dose distributions produced by protons, neutrons, and negatively charged pions. Comparisons of calculated and experimental cell survival probabilities are also presented.

  12. FitSKIRT: genetic algorithms to automatically fit dusty galaxies with a Monte Carlo radiative transfer code

    NASA Astrophysics Data System (ADS)

    De Geyter, G.; Baes, M.; Fritz, J.; Camps, P.

    2013-02-01

    We present FitSKIRT, a method to efficiently fit radiative transfer models to UV/optical images of dusty galaxies. These images have the advantage of better spatial resolution than FIR/submm data. FitSKIRT uses the GAlib genetic algorithm library to optimize the output of the SKIRT Monte Carlo radiative transfer code. Genetic algorithms prove to be a valuable tool in handling the multi-dimensional search space as well as the noise induced by the random nature of the Monte Carlo radiative transfer code. FitSKIRT is tested on artificial images of a simulated edge-on spiral galaxy, where we gradually increase the number of fitted parameters. We find that we can recover all model parameters, even if all 11 model parameters are left unconstrained. Finally, we apply the FitSKIRT code to a V-band image of the edge-on spiral galaxy NGC 4013. This galaxy has been modeled previously by other authors using different combinations of radiative transfer codes and optimization methods. Given the different models and techniques and the complexity and degeneracies in the parameter space, we find reasonable agreement between the different models. We conclude that the FitSKIRT method allows comparison between different models and geometries in a quantitative manner and minimizes the need for human intervention and biasing. The high level of automation makes it an ideal tool to use on larger sets of observed data.
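    The reason a genetic algorithm copes well with a noisy objective, such as the residual between an observed image and a Monte Carlo radiative transfer model, is that selection averages over the noise instead of following a gradient. The toy sketch below (plain Python with made-up parameter names, not GAlib or FitSKIRT) minimizes a noisy two-parameter residual:

```python
# Toy genetic algorithm for a noisy objective (illustrative only).
import random

def genetic_minimize(objective, bounds, pop_size=60, generations=80,
                     mutation=0.1, seed=42):
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(*bounds[d]) for d in range(dim)]
           for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=objective)
        elite = scored[:pop_size // 2]            # truncation selection
        children = []
        while len(children) < pop_size - len(elite):
            a, b = rng.sample(elite, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]  # crossover
            for d in range(dim):                          # mutation
                if rng.random() < mutation:
                    lo, hi = bounds[d]
                    child[d] += rng.gauss(0, 0.1 * (hi - lo))
                    child[d] = min(max(child[d], lo), hi)
            children.append(child)
        pop = elite + children
    return min(pop, key=objective)

# Noisy objective with true minimum at (2, -1); the Gaussian term stands
# in for Monte Carlo noise in the radiative transfer model.
def noisy_residual(p, rng=random.Random(0)):
    return (p[0] - 2) ** 2 + (p[1] + 1) ** 2 + rng.gauss(0, 0.05)

best = genetic_minimize(noisy_residual, [(-5, 5), (-5, 5)])
```

    Convergence stalls once the fitness differences fall below the noise floor, which is why FitSKIRT-style fits quote uncertainties rather than exact optima.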

  13. FACET: a radiation view factor computer code for axisymmetric, 2D planar, and 3D geometries with shadowing

    SciTech Connect

    Shapiro, A.B.

    1983-08-01

    The computer code FACET calculates the radiation geometric view factor (alternatively called shape factor, angle factor, or configuration factor) between surfaces for axisymmetric, two-dimensional planar and three-dimensional geometries with interposed third surface obstructions. FACET was developed to calculate view factors for input to finite-element heat-transfer analysis codes. The first section of this report is a brief review of previous radiation-view-factor computer codes. The second section presents the defining integral equation for the geometric view factor between two surfaces and the assumptions made in its derivation. Also in this section are the numerical algorithms used to integrate this equation for the various geometries. The third section presents the algorithms used to detect self-shadowing and third-surface shadowing between the two surfaces for which a view factor is being calculated. The fourth section provides a user's input guide followed by several example problems.
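    The defining double-area integral mentioned in the abstract, F12 = (1/A1) ∫∫ cosθ1 cosθ2 / (π s²) dA2 dA1, can be approximated by midpoint quadrature over small patches. This toy sketch (our own, not FACET) evaluates the view factor between two directly opposed parallel unit squares one unit apart, for which the analytic value is about 0.1998:

```python
# Midpoint-quadrature evaluation of the geometric view factor integral.
import math

def view_factor_parallel_squares(h=1.0, n=40):
    d = 1.0 / n                       # patch edge; patch area d*d
    f = 0.0
    for i in range(n):
        for j in range(n):
            x1, y1 = (i + 0.5) * d, (j + 0.5) * d
            for k in range(n):
                for l in range(n):
                    x2, y2 = (k + 0.5) * d, (l + 0.5) * d
                    s2 = (x2 - x1) ** 2 + (y2 - y1) ** 2 + h * h
                    # cos(theta1) = cos(theta2) = h / s for facing patches
                    f += (h * h / s2) / (math.pi * s2) * d ** 4
    return f                          # already divided by A1 = 1
```

    A production code like FACET adds third-surface shadowing tests inside the inner loop, rejecting patch pairs whose connecting ray is blocked.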

  14. Radiation Protection Studies for Medical Particle Accelerators using the FLUKA Monte Carlo Code.

    PubMed

    Infantino, Angelo; Cicoria, Gianfranco; Lucconi, Giulia; Pancaldi, Davide; Vichi, Sara; Zagni, Federico; Mostacci, Domiziano; Marengo, Mario

    2016-11-24

    Radiation protection (RP) in the use of medical cyclotrons involves many aspects both in the routine use and for the decommissioning of a site. Guidelines for site planning and installation, as well as for RP assessment, are given in international documents; however, the latter typically offer analytic methods of calculation of shielding and materials activation, in approximate or idealised geometry set-ups. The availability of Monte Carlo (MC) codes with accurate up-to-date libraries for transport and interaction of neutrons and charged particles at energies below 250 MeV, together with the continuously increasing power of modern computers, makes the systematic use of simulations with realistic geometries possible, yielding equipment and site-specific evaluation of the source terms, shielding requirements and all quantities relevant to RP at the same time. In this work, the well-known FLUKA MC code was used to simulate different aspects of RP in the use of biomedical accelerators, particularly for the production of medical radioisotopes. In the context of the Young Professionals Award, held at the IRPA 14 conference, only a part of the complete work is presented. In particular, the simulation of the GE PETtrace cyclotron (16.5 MeV) installed at S. Orsola-Malpighi University Hospital evaluated the effective dose distribution around the equipment; the effective number of neutrons produced per incident proton and their spectral distribution; the activation of the structure of the cyclotron and the vault walls; the activation of the ambient air, in particular the production of (41)Ar. The simulations were validated, in terms of physical and transport parameters to be used at the energy range of interest, through an extensive measurement campaign of the neutron environmental dose equivalent using a rem-counter and TLD dosemeters. The validated model was then used in the design and the licensing request of a new Positron Emission Tomography facility.

  15. The Havemann-Taylor Fast Radiative Transfer Code (HT-FRTC) and its applications

    NASA Astrophysics Data System (ADS)

    Thelen, Jean-Claude; Havemann, Stephan; Lewis, Warren

    2015-09-01

    The Havemann-Taylor Fast Radiative Transfer Code (HT-FRTC) is a component of the Met Office NEON Tactical Decision Aid (TDA). Within NEON, the HT-FRTC has for a number of years been used to predict the IR apparent thermal contrasts between different surface types as observed by an airborne sensor. To achieve this, the HT-FRTC is supplied with the inherent temperatures and spectral properties of these surfaces (i.e. ground target(s) and background). A key strength of the HT-FRTC is its ability to take into account the detailed properties of the atmosphere, which in the context of NEON tend to be provided by a Numerical Weather Prediction (NWP) forecast model. While water vapour and ozone are generally the most important gases, additional trace gases are now being incorporated into the HT-FRTC. The HT-FRTC also includes an exact treatment of atmospheric scattering based on spherical harmonics. This allows the treatment of several different aerosol species and of liquid and ice clouds. Recent developments can even account for rain and falling snow. The HT-FRTC works in Principal Component (PC) space and is trained on a wide variety of atmospheric and surface conditions, which significantly reduces the computational requirements regarding memory and time. One clear-sky simulation takes approximately one millisecond. Recent developments allow the training to be completely general and sensor independent. This is significant as the user of the code can add new sensors and new surfaces/targets by simply supplying extra files which contain their (possibly classified) spectral properties. The HT-FRTC has been extended to cover the spectral range of Photopic and NVG sensors. One aim here is to give guidance on the expected, directionally resolved sky brightness, especially at night, again taking the actual or forecast atmospheric conditions into account. Recent developments include light level predictions during the period of twilight.
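    A minimal illustration of why working in principal-component (PC) space is fast (our own sketch, not the HT-FRTC): a training set of spectra is compressed to a few PC scores, so a fast model only has to predict those scores rather than every spectral channel. The synthetic training set here is an assumption standing in for line-by-line calculations:

```python
# PC-space compression and reconstruction of spectra (illustrative).
import numpy as np

def train_pc_basis(spectra, n_pc):
    """spectra: (n_samples, n_channels). Returns mean and PC basis."""
    mean = spectra.mean(axis=0)
    _, _, vt = np.linalg.svd(spectra - mean, full_matrices=False)
    return mean, vt[:n_pc]            # (n_pc, n_channels)

def to_scores(spectrum, mean, basis):
    return basis @ (spectrum - mean)

def from_scores(scores, mean, basis):
    return mean + scores @ basis

# Synthetic "training set": smooth spectra built from three modes.
rng = np.random.default_rng(1)
wn = np.linspace(0, 1, 200)
train = np.array([a * np.sin(2 * np.pi * wn) + b * wn + c
                  for a, b, c in rng.normal(size=(50, 3))])
mean, basis = train_pc_basis(train, n_pc=3)
spec = train[0]
recon = from_scores(to_scores(spec, mean, basis), mean, basis)
```

    Because the synthetic spectra live in a three-dimensional subspace, three PCs reconstruct them essentially exactly; real atmospheric spectra need more components, but still far fewer than the channel count, which is where the millisecond-per-simulation speed comes from.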

  16. Particle-In-Cell (PIC) code simulation results and comparison with theory scaling laws for photoelectron-generated radiation

    SciTech Connect

    Dipp, T.M.

    1993-12-01

    The generation of radiation via photoelectrons emitted from a conducting surface was explored using Particle-In-Cell (PIC) code computer simulations. Using the MAGIC PIC code, the simulations were performed in one dimension to handle the diverse scale lengths of the particles and fields in the problem. The simulations involved monoenergetic, nonrelativistic photoelectrons emitted normal to the illuminated conducting surface. A sinusoidal, 100% modulated, 6.3263 ns pulse train, as well as unmodulated emission, were used to explore the behavior of the particles, fields, and generated radiation. A special postprocessor was written to convert the PIC code simulated electron sheath into far-field radiation parameters by means of rigorous retarded time calculations. The results of the small-spot PIC simulations were used to generate various graphs showing resonance and nonresonance radiation quantities such as radiated lobe patterns, frequency, and power. A database of PIC simulation results was created and, using a nonlinear curve-fitting program, compared with theoretical scaling laws. Overall, the small-spot behavior predicted by the theoretical scaling laws was generally observed in the PIC simulation data, providing confidence in both the theoretical scaling laws and the PIC simulations.

  17. Pymiedap: a versatile radiative transfer code with polarization for terrestrial (exo)planets.

    NASA Astrophysics Data System (ADS)

    Rossi, Loïc; Stam, Daphne; Hogenboom, Michael

    2016-04-01

    Polarimetry promises to be an important method to detect exoplanets: the light of a star is usually unpolarized [1], while scattering by gas and clouds in an atmosphere can generate high levels of polarization. Furthermore, the polarization of scattered light contains information about the properties of the atmosphere and surface of a planet, enabling their characterization [2], a method already validated in the solar system with Venus [3,4]. We present here Pymiedap (Python Mie Doubling-Adding Program): a set of Python objects interfaced with Fortran radiative transfer codes that allows the user to define a planetary atmosphere and compute the flux and polarization of the scattered light. Several properties of the planet can be set interactively by the user through the Python interface, such as gravity, distance to the star, surface properties, atmospheric layers, and gaseous and aerosol composition. The radiative transfer calculations are then performed following the doubling-adding method (de Haan et al. 1987). We present some results of the code and show its possible use for different planetary atmospheres for both resolved and disk-integrated measurements. We investigate the effect of gas, cloud, and aerosol composition and of surface properties for horizontally homogeneous and inhomogeneous Earth-like planets. We also study the effect of gaseous absorption on the flux and polarization as a marker for gaseous abundance and cloud-top altitude. [1] Kemp et al. The optical polarization of the sun measured at a sensitivity of parts in ten million. Nature, 1987, 326, 270-273. [2] Stam, D. M. Spectropolarimetric signatures of Earth-like extrasolar planets. A&A, 2008, 482, 989-1007. [3] Hansen, J. E. & Hovenier, J. W. Interpretation of the polarization of Venus. Journal of Atmospheric Sciences, 1974, 31, 1137-1160. [4] Rossi et al.
Preliminary study of Venus cloud layers
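    The doubling-adding principle can be shown in its simplest scalar, unpolarized form (a far cry from Pymiedap's full vector treatment, but the same recursion): "adding" combines the reflection and transmission of two layers accounting for all inter-reflections, and "doubling" repeats this to build up the full optical thickness from an optically thin starting layer.

```python
# Scalar doubling-adding sketch (illustrative, not the Pymiedap code).

def add_layers(r1, t1, r2, t2):
    """Combine two layers; the geometric series of bounces between them
    sums to the 1/(1 - r1*r2) factor."""
    denom = 1.0 - r1 * r2
    r = r1 + t1 * r2 * t1 / denom
    t = t1 * t2 / denom
    return r, t

def doubling(r_thin, t_thin, n_doublings):
    """Start from an optically thin layer and double its thickness
    n_doublings times (thickness grows by 2**n_doublings)."""
    r, t = r_thin, t_thin
    for _ in range(n_doublings):
        r, t = add_layers(r, t, r, t)
    return r, t

# A conservative (non-absorbing) thin layer: r + t = 1 must be
# preserved under doubling.
r, t = doubling(0.001, 0.999, 10)
```

    In the full polarized problem, r and t become 4x4 matrices in the Stokes parameters and the division becomes a matrix inverse, but the structure of the recursion is unchanged.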

  18. Accurate dose assessment system for an exposed person utilising radiation transport calculation codes in emergency response to a radiological accident.

    PubMed

    Takahashi, F; Shigemori, Y; Seki, A

    2009-01-01

    A system has been developed to assess the radiation dose distribution inside the body of persons exposed in a radiological accident by utilising the radiation transport calculation codes MCNP and MCNPX. The system consists mainly of two parts, a pre-processor and a post-processor of the radiation transport calculation. Programs for the pre-processor are used to set up a 'problem-dependent' input file, which defines the accident condition and the dosimetric quantities to be estimated. The program developed for the post-processor part can effectively indicate dose information based upon the output file of the code. All of the programs in the dosimetry system can be executed on a generally used personal computer and accurately give the dose profile of an exposed person in a radiological accident without complicated procedures. An experiment using a physical phantom was carried out to verify the availability of the dosimetry system with the developed programs in a gamma ray irradiation field.

  19. Coronal extension of the MURaM radiative MHD code: From quiet sun to flare simulations

    NASA Astrophysics Data System (ADS)

    Rempel, Matthias D.; Cheung, Mark

    2016-05-01

    We present a new version of the MURaM radiative MHD code, which includes a treatment of the solar corona in terms of MHD, optically thin radiative losses, and field-aligned heat conduction. In order to relax the severe time-step constraints imposed by large Alfven velocities and heat conduction, we use a combination of semi-relativistic MHD with reduced speed of light ("Boris correction") and a hyperbolic formulation of heat conduction. We apply the numerical scheme to four different setups: a mixed-polarity quiet sun, an open-flux region, an arcade solution, and an active region, and find in all cases an amount of coronal heating sufficient to maintain a corona with temperatures from 1 MK (quiet sun) to 2 MK (active region, arcade). In all our setups the Poynting flux is self-consistently created by photospheric and sub-photospheric magneto-convection in the lower part of our simulation domain. Varying the maximum allowed Alfven velocity ("reduced speed of light") leads to only minor changes in the coronal structure as long as the limited Alfven velocity remains larger than the speed of sound and about 1.5-3 times larger than the peak advection velocity. We also found that varying details of the numerical diffusivities that govern the resistive and viscous energy dissipation does not strongly affect the overall coronal heating, but the ratio of resistive to viscous energy dissipation does depend strongly on the effective numerical magnetic Prandtl number. We use our active region setup to simulate a flare triggered by the emergence of a twisted flux rope into a pre-existing bipolar active region. Our simulation yields a series of flares, with the strongest one reaching GOES M1 class. The simulation reproduces many observed properties of eruptions such as flare ribbons, post-flare loops, and a sunquake.

  20. XTAT: A New Multilevel-Multiline Polarized Radiative Transfer Code with PRD

    NASA Astrophysics Data System (ADS)

    Bommier, V.

    2014-10-01

    This work is intended for the interpretation of the so-called "Second Solar Spectrum" (Stenflo 1996), the spectrum of the linear polarization formed by scattering and observed close to the solar limb. The lines are also optically thick, and the problem is to solve in a coherent manner the statistical equilibrium of the atomic density matrix and the polarized radiative transfer in the atmosphere. Following Belluzzi & Landi Degl'Innocenti (2009), 30% of the linearly polarized line profiles in the solar visible display the M-type shape typical of coherent scattering effects in the far wings. A new theory including both coherent (Rayleigh) and resonant scattering was developed by Bommier (1997a,b); Raman scattering was later added (Bommier 1999, SPW2). In this theory, which is derived directly from the Schrödinger equation for the atomic density matrix, radiative line broadening appears as a non-Markovian process of atom-photon interaction. Collisional broadening is included. Rayleigh (Raman) scattering appears as an additional term in the emissivity from the fourth order of the perturbation development of the atom-photon interaction. The development is pursued and finally summed, leading to a non-perturbative final result. In this formalism, the use of redistribution functions is avoided. The published formalism was limited to a two-level atom without lower-level alignment, but most solar lines are more complex. We will present how the theory has to be complemented for multi-level atom modeling, including lower-level alignment. The role of collisions in balancing coherent and resonant scattering is fully taken into account. A progress report will be given on the development of a new code for the numerical iterative solution of the statistical equilibrium and polarized radiative transfer equations, for multi-level atoms and their multi-line spectra. Fine and hyperfine structures, and Hanle, Kemp (Kemp et al. 1984), Zeeman

  1. C5 Benchmark Problem with Discrete Ordinate Radiation Transport Code DENOVO

    SciTech Connect

    Yesilyurt, Gokhan; Clarno, Kevin T; Evans, Thomas M; Davidson, Gregory G; Fox, Patricia B

    2011-01-01

    The C5 benchmark problem proposed by the Organisation for Economic Co-operation and Development/Nuclear Energy Agency was modeled to examine the capabilities of Denovo, a three-dimensional (3-D) parallel discrete ordinates (SN) radiation transport code, for problems with no spatial homogenization. Denovo uses state-of-the-art numerical methods to obtain accurate solutions to the Boltzmann transport equation. Problems were run in parallel on Jaguar, a high-performance supercomputer located at Oak Ridge National Laboratory. Both the two-dimensional (2-D) and 3-D configurations were analyzed, and the results were compared with the reference MCNP Monte Carlo calculations. For an additional comparison, SCALE/KENO-V.a Monte Carlo solutions were also included. In addition, a sensitivity analysis was performed for the optimal angular quadrature and mesh resolution for both the 2-D and 3-D infinite lattices of UO2 fuel pin cells. Denovo was verified with the C5 problem. The effective multiplication factors, pin powers, and assembly powers were found to be in good agreement with the reference MCNP and SCALE/KENO-V.a Monte Carlo calculations.

  2. Exo-Transmit: Radiative transfer code for calculating exoplanet transmission spectra

    NASA Astrophysics Data System (ADS)

    Kempton, Eliza M.-R.; Lupu, Roxana E.; Owusu-Asare, Albert; Slough, Patrick; Cale, Bryson

    2016-11-01

    Exo-Transmit calculates the transmission spectrum of an exoplanet atmosphere given specified input information about the planetary and stellar radii, the planet's surface gravity, the atmospheric temperature-pressure (T-P) profile, the location (in terms of pressure) of any cloud layers, the composition of the atmosphere, and opacity data for the atoms and molecules that make up the atmosphere. The code solves the equation of radiative transfer for absorption of starlight passing through the planet's atmosphere as it transits, accounting for the oblique path of light through the planetary atmosphere along an Earth-bound observer's line of sight. The fraction of light absorbed (or blocked) by the planet plus its atmosphere is calculated as a function of wavelength to produce the wavelength-dependent transmission spectrum. Functionality is provided to simulate the presence of atmospheric aerosols in two ways: an optically thick (gray) cloud deck can be generated at a user-specified height in the atmosphere, and the nominal Rayleigh scattering can be increased by a specified factor.
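    The slant-path geometry the abstract describes can be sketched schematically (our own toy model with assumed planetary parameters, not the Exo-Transmit source): for each impact parameter b, starlight crosses the atmosphere on a slant chord, and the wavelength-dependent opacity sets how much annulus area is effectively blocked.

```python
# Schematic transit-depth calculation for an isothermal atmosphere.
import math

def transit_depth(sigma, rp=7.0e7, rs=7.0e8, h=5.0e5, n0=1.0e25,
                  n_b=200, n_s=400):
    """Transit depth (R_eff/Rs)^2 for absorption cross-section sigma [m^2],
    planet radius rp, stellar radius rs, scale height h, base density n0."""
    top = rp + 15 * h                  # top of the model atmosphere
    blocked = math.pi * rp ** 2        # opaque solid disk
    db = (top - rp) / n_b
    for i in range(n_b):
        b = rp + (i + 0.5) * db        # impact parameter of the annulus
        half = math.sqrt(top ** 2 - b ** 2)
        ds = 2 * half / n_s
        tau = 0.0
        for j in range(n_s):           # slant (chord) optical depth
            s = -half + (j + 0.5) * ds
            r = math.sqrt(b ** 2 + s ** 2)
            tau += sigma * n0 * math.exp(-(r - rp) / h) * ds
        blocked += 2 * math.pi * b * (1 - math.exp(-tau)) * db
    return blocked / (math.pi * rs ** 2)
```

    Evaluating this at many wavelengths, with sigma taken from opacity tables, yields the wavelength-dependent transmission spectrum; stronger opacity pushes the effective blocking radius to higher altitudes and deepens the transit.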

  3. Recent Developments in the VISRAD 3-D Target Design and Radiation Simulation Code

    NASA Astrophysics Data System (ADS)

    Macfarlane, Joseph; Woodruff, P.; Golovkin, I.

    2011-10-01

    The 3-D view factor code VISRAD is widely used in designing HEDP experiments at major laser and pulsed-power facilities, including NIF, OMEGA, OMEGA-EP, ORION, Z, and PLX. It simulates target designs by generating a 3-D grid of surface elements, utilizing a variety of 3-D primitives and surface removal algorithms, and can be used to compute the radiation flux throughout the surface element grid by computing element-to-element view factors and solving power balance equations. Target set-up and beam pointing are facilitated by allowing users to specify positions and angular orientations using a variety of coordinate systems (e.g., that of any laser beam, target component, or diagnostic port). Analytic modeling of laser beam spatial profiles for OMEGA DPPs and NIF CPPs is used to compute laser intensity profiles throughout the grid of surface elements. VISRAD includes a variety of user-friendly graphics for setting up targets and displaying results, can readily display views from any point in space, and can be used to generate image sequences for animations. We will discuss recent improvements to the software package and plans for future developments.

  4. Quality improvement of International Classification of Diseases, 9th revision, diagnosis coding in radiation oncology: single-institution prospective study at University of California, San Francisco.

    PubMed

    Chen, Chien P; Braunstein, Steve; Mourad, Michelle; Hsu, I-Chow J; Haas-Kogan, Daphne; Roach, Mack; Fogh, Shannon E

    2015-01-01

    Accurate International Classification of Diseases (ICD) diagnosis coding is critical for patient care, billing purposes, and research endeavors. In this single-institution study, we evaluated our baseline ICD-9 (9th revision) diagnosis coding accuracy, identified the most common errors contributing to inaccurate coding, and implemented a multimodality strategy to improve radiation oncology coding. We prospectively studied ICD-9 coding accuracy in our radiation therapy-specific electronic medical record system. Baseline ICD-9 coding accuracy was obtained from chart review targeting the ICD-9 coding accuracy of all patients treated at our institution between March and June of 2010. To improve performance, an educational session highlighted common coding errors, and a user-friendly software tool, RadOnc ICD Search, version 1.0, for coding radiation oncology-specific diagnoses was implemented. We then prospectively analyzed ICD-9 coding accuracy for all patients treated from July 2010 to June 2011, with the goal of maintaining 80% or higher coding accuracy. Data on coding accuracy were analyzed and fed back monthly to individual providers. Baseline coding accuracy for physicians was 463 of 661 (70%) cases. Only 46% of physicians had coding accuracy above 80%. The most common errors involved metastatic cases, in which primary or secondary site ICD-9 codes were either incorrect or missing, and special procedures such as stereotactic radiosurgery cases. After implementing our project, overall coding accuracy rose to 92% (range, 86%-96%). The median accuracy for all physicians was 93% (range, 77%-100%), with only 1 attending having accuracy below 80%. Incorrect primary and secondary ICD-9 codes in metastatic cases showed the most significant improvement (10% vs 2% after intervention). Identifying common coding errors and implementing both education and systems changes led to significantly improved coding accuracy.
This quality assurance project highlights the potential problem

  5. Description of the strategic high-altitude atmospheric radiation code (SHARC). Scientific report, Jan 89-Oct 90

    SciTech Connect

    Duff, J.W.; Sundberg, R.L.; Gruninger, J.H.; Bernstein, L.S.; Robertson, D.C.

    1990-11-27

    The report describes an upgraded version of the strategic high-altitude radiance code, SHARC-2. SHARC calculates atmospheric radiance and transmittance over the 2-40 micrometer spectral region for arbitrary paths between 50 and 300 km altitude, including space viewing. It models radiation due to NLTE (Non-Local Thermodynamic Equilibrium) molecular emissions, which are the dominant sources at these altitudes. This new version, which is now ready for distribution, has been upgraded to include a fully integrated auroral model with time-dependent chemistry, extension down to 50 km altitude, and radiation from the minor isotopes of CO2. In addition, there have been numerous internal upgrades to the various modules. These include a Voigt lineshape for the radiative excitation module, embedding of the auroral region into a quiescent atmosphere, and improvements in the radiation transport algorithms.

  6. Fast aerosol optical thickness retrieval from MERIS data with the use of fast radiative transfer code and analytical radiative transfer solutions

    NASA Astrophysics Data System (ADS)

    Kokhanovsky, Alexander; Katsev, Iosif; Prikhach, Alexander; Zege, Eleonora

We present a new fast aerosol retrieval (FAR) technique to retrieve the aerosol optical thickness (AOT), Angstrom parameter, and land reflectance from spectral satellite data. The most important difference between the proposed technique and the NASA/MODIS, ESA/MERIS, and other well-known AOT retrieval codes is that our retrieval does not use the look-up table (LUT) technique; instead, it is based on our previously developed, extremely fast code RAY for radiative transfer (RT) computations and includes analytical solutions of radiative transfer. The previous version of the retrieval code (ART) was based entirely on RT computations. The FAR technique is about 100 times faster than ART because it combines RAY computations with analytical solutions of radiative transfer theory. The accuracy of these approximate solutions has been thoroughly checked. Using RT computations in the course of the AOT retrieval allows one to include any available local models of the molecular atmosphere and of aerosol in the upper and middle atmospheric layers for the treated area. Any set of wavelengths from any satellite optical instrument can be processed. Moreover, we use the method of least squares in the retrieval of aerosol optical parameters because the RAY code provides the derivatives of the radiation characteristics with respect to the parameters in question. This technique allows the optimal use of multi-spectral information. The retrieval methods are flexible and can be used in synergetic algorithms, which couple data from two or more satellite receivers. These features may be considered definite merits in comparison with the LUT technique. A successful comparison of FAR-retrieved data with the results of some other algorithms and with AERONET measurements will be demonstrated.
Besides, two important problems, namely the effect of the a priori choice of aerosol model on the retrieved AOT accuracy and the effect of adjacent pixels containing clouds or snow spots, is
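The least-squares retrieval with analytic derivatives that the abstract attributes to the RAY code can be sketched with a Gauss-Newton iteration. The toy forward model, band set, and Angstrom-law exponent below are invented for illustration (FAR's actual forward model is the RAY RT code); only the structure, fitting AOT and surface reflectance via Jacobians instead of a LUT search, reflects the described technique:

```python
import math

def retrieve_aot(wavelengths, r_obs, tau0=0.1, rho0=0.1, n_iter=30):
    """Gauss-Newton fit of (AOT tau, surface reflectance rho) to multi-spectral
    TOA reflectances, using analytic derivatives of a toy forward model."""
    tau, rho = tau0, rho0
    for _ in range(n_iter):
        jtj = [[0.0, 0.0], [0.0, 0.0]]
        jtr = [0.0, 0.0]
        for lam, r in zip(wavelengths, r_obs):
            k = (0.55 / lam) ** 1.3            # toy Angstrom-law spectral dependence
            t2 = math.exp(-2.0 * tau * k)      # two-way direct transmittance
            model = rho * t2 + 0.1 * (1.0 - math.exp(-tau * k))
            dtau = -2.0 * k * rho * t2 + 0.1 * k * math.exp(-tau * k)
            drho = t2
            resid = r - model
            jtj[0][0] += dtau * dtau; jtj[0][1] += dtau * drho; jtj[1][1] += drho * drho
            jtr[0] += dtau * resid;   jtr[1] += drho * resid
        det = jtj[0][0] * jtj[1][1] - jtj[0][1] ** 2
        tau += (jtj[1][1] * jtr[0] - jtj[0][1] * jtr[1]) / det
        rho += (jtj[0][0] * jtr[1] - jtj[0][1] * jtr[0]) / det
    return tau, rho
```

Because the derivatives come from the forward model itself, all spectral bands constrain both parameters at once, which is the "optimal use of multi-spectral information" the abstract refers to.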

  7. Improvements of the Radiation Code "MstrnX" in AORI/NIES/JAMSTEC Models

    NASA Astrophysics Data System (ADS)

    Sekiguchi, M.; Suzuki, K.; Takemura, T.; Watanabe, M.; Ogura, T.

    2015-12-01

There is a large demand for an accurate yet rapid radiative transfer scheme for general climate models. The broadband radiative transfer code "mstrnX" was developed by the Atmosphere and Ocean Research Institute (AORI) and has been implemented in several global and regional climate models cooperatively developed in the Japanese research community, for example MIROC (the Model for Interdisciplinary Research on Climate) [Watanabe et al., 2010], NICAM (Non-hydrostatic Icosahedral Atmospheric Model) [Satoh et al., 2008], and CReSS (Cloud Resolving Storm Simulator) [Tsuboki and Sakakibara, 2002]. In this study, we improve the gas absorption process and the scattering process of ice particles. To update the gas absorption process, the absorption line database is replaced by the latest version of the Harvard-Smithsonian Center database, HITRAN2012. An optimization method is adopted in mstrnX to decrease the number of integration points for the wavenumber integration using the correlated k-distribution method and to increase the computational efficiency in each band. The integration points and weights of the correlated k-distribution are optimized for accurate calculation of the heating rate up to an altitude of 70 km. For this purpose we adopted a new non-linear optimization method for the correlated k-distribution and studied an optimal initial condition and cost function for the non-linear optimization. It is known that mstrnX has a considerable bias in the case of quadrupled carbon dioxide concentrations [Pincus et al., 2015]; this bias is decreased by the improvement. To update the scattering process of ice particles, we adopt a solid column as the ice crystal habit [Yang et al., 2013]. The single scattering properties are calculated and tabulated in advance. The size parameter in this table ranged from 0.1 to 1000 in mstrnX; we expand the maximum to 50000 in order to cover large particles such as fog and rain drops. These updates will be introduced to
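The correlated k-distribution step that mstrnX optimizes works by sorting the rapidly varying absorption spectrum into a smooth cumulative distribution k(g) and integrating over g with only a few points. A minimal sketch on a synthetic spectrum (the line shapes and quadrature count are illustrative, not mstrnX's actual optimized points):

```python
import math

def band_transmittance_lbl(k_spectrum, u):
    """'Line-by-line' reference: average transmittance over every spectral point."""
    return sum(math.exp(-k * u) for k in k_spectrum) / len(k_spectrum)

def band_transmittance_ckd(k_spectrum, u, n_points=16):
    """Correlated k-distribution: sort the absorption coefficients into a smooth
    cumulative distribution k(g), then integrate over g with a few midpoints."""
    ks = sorted(k_spectrum)
    n = len(ks)
    total = 0.0
    for j in range(n_points):
        g = (j + 0.5) / n_points          # midpoint quadrature in g-space
        k = ks[int(g * n)]                # k at cumulative probability g
        total += math.exp(-k * u) / n_points
    return total
```

The non-linear optimization described in the abstract amounts to tuning the g-points and weights so that far fewer than 16 points still reproduce heating rates accurately.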

  8. Multi-Code Ab Initio Calculation of Ionization Distributions and Radiation Losses for Tungsten in Tokamak Plasmas

    SciTech Connect

    Ralchenko, Yu.; Abdallah, J. Jr.; Colgan, J.; Fontes, C. J.; Foster, M.; Zhang, H. L.; Bar-Shalom, A.; Oreg, J.; Bauche, J.; Bauche-Arnoult, C.; Bowen, C.; Faussurier, G.; Chung, H.-K.; Hansen, S. B.; Lee, R. W.; Scott, H.; Gaufridy de Dortan, F. de; Poirier, M.; Golovkin, I.; Novikov, V.

    2009-09-10

We present calculations of ionization balance and radiative power losses for tungsten in magnetic fusion plasmas. The simulations were performed within the framework of the Non-Local Thermodynamic Equilibrium (NLTE) Code Comparison Workshops utilizing several independent collisional-radiative models. The calculations generally agree with each other; however, a clear disagreement with experimental ionization distributions is found at low temperatures (below about 2 keV).

  9. All-sky radiative transfer calculations for IASI and IASI-NG: The σ-IASI-as code

    NASA Astrophysics Data System (ADS)

    Liuzzi, G.; Blasi, M. G.; Masiello, G.; Serio, C.; Venafra, S.

    2017-02-01

In the context of the development by EUMETSAT of a new generation of meteorological satellites, we have built the new σ-IASI-as (where "as" stands for "all sky") radiative transfer code. Unlike its predecessor σ-IASI, the code is able to calculate both clear- and cloudy-sky radiances, as well as their Jacobians with respect to any desired geophysical parameter. In addition, σ-IASI-as can simulate the extinction effect of the most common types of atmospheric aerosols and of clouds via ab-initio Mie calculations. We briefly describe the analytical scheme on which the model is based and illustrate its potential with some sample calculations. Overall, the new model is a complete and fast radiative transfer tool for IASI, and is already available for IASI-NG and MTG-IRS.

  10. GENII (Generation II): The Hanford Environmental Radiation Dosimetry Software System: Volume 3, Code maintenance manual: Hanford Environmental Dosimetry Upgrade Project

    SciTech Connect

    Napier, B.A.; Peloquin, R.A.; Strenge, D.L.; Ramsdell, J.V.

    1988-09-01

    The Hanford Environmental Dosimetry Upgrade Project was undertaken to incorporate the internal dosimetry models recommended by the International Commission on Radiological Protection (ICRP) in updated versions of the environmental pathway analysis models used at Hanford. The resulting second generation of Hanford environmental dosimetry computer codes is compiled in the Hanford Environmental Dosimetry System (Generation II, or GENII). This coupled system of computer codes is intended for analysis of environmental contamination resulting from acute or chronic releases to, or initial contamination of, air, water, or soil, on through the calculation of radiation doses to individuals or populations. GENII is described in three volumes of documentation. This volume is a Code Maintenance Manual for the serious user, including code logic diagrams, global dictionary, worksheets to assist with hand calculations, and listings of the code and its associated data libraries. The first volume describes the theoretical considerations of the system. The second volume is a Users' Manual, providing code structure, users' instructions, required system configurations, and QA-related topics. 7 figs., 5 tabs.

  11. Activities of the Radiation Shielding Information Center and a report on codes/data for high energy radiation transport

    SciTech Connect

    Roussin, R.W.

    1993-01-01

    From the very early days in its history Radiation Shielding Information Center (RSIC) has been involved with high energy radiation transport. The National Aeronautics and Space Administration was an early sponsor of RSIC until the completion of the Apollo Moon Exploration Program. In addition, the intranuclear cascade work of Bertini at Oak Ridge National Laboratory provided valuable resources which were made available through RSIC. Over the years, RSIC has had interactions with many of the developers of high energy radiation transport computing technology and data libraries and has been able to collect and disseminate this technology. The current status of this technology will be reviewed and prospects for new advancements will be examined.

  12. Activities of the Radiation Shielding Information Center and a report on codes/data for high energy radiation transport

    SciTech Connect

    Roussin, R.W.

    1993-03-01

    From the very early days in its history Radiation Shielding Information Center (RSIC) has been involved with high energy radiation transport. The National Aeronautics and Space Administration was an early sponsor of RSIC until the completion of the Apollo Moon Exploration Program. In addition, the intranuclear cascade work of Bertini at Oak Ridge National Laboratory provided valuable resources which were made available through RSIC. Over the years, RSIC has had interactions with many of the developers of high energy radiation transport computing technology and data libraries and has been able to collect and disseminate this technology. The current status of this technology will be reviewed and prospects for new advancements will be examined.

  13. Recent developments in the TRIPOLI-4® Monte-Carlo code for shielding and radiation protection applications

    NASA Astrophysics Data System (ADS)

    Malouch, Fadhel; Brun, Emeric; Diop, Cheikh; Hugot, François-Xavier; Jouanne, Cédric; Lee, Yi-Kang; Malvagi, Fausto; Mancusi, Davide; Mazzolo, Alain; Petit, Odile; Trama, Jean-Christophe; Visonneau, Thierry; Zoia, Andrea

    2017-09-01

    TRIPOLI-4® is a 3D continuous-energy Monte-Carlo particle transport code developed by CEA (SERMA) and devoted to shielding, reactor physics, criticality-safety and nuclear instrumentation. In this paper, we present the recent developments in the TRIPOLI-4® for shielding and radiation protection applications. Some of these additional features are already available in the TRIPOLI-4® version 10 released in December 2015. Other features are in development.

  14. Recent updates in the "Synchrotron Radiation Workshop" code, on-going developments, simulation activities, and plans for the future

    NASA Astrophysics Data System (ADS)

    Chubar, Oleg

    2014-09-01

    Recent updates in the "Synchrotron Radiation Workshop" physical optics computer code, including the transition to the Open Source development format, the results of the on-going collaborative development efforts in the area of X-ray optics, in particular grazing incidence mirrors, gratings and crystal monochromators, and in other areas, as well as some simulation activities for storage ring and X-ray free-electron laser sources are reported. Future development plans are discussed.

  15. Experiences in the Performance Analysis and Optimization of a Deterministic Radiation Transport Code on the Cray SV1

    SciTech Connect

    Peter Cebull

    2004-05-01

    The Attila radiation transport code, which solves the Boltzmann neutron transport equation on three-dimensional unstructured tetrahedral meshes, was ported to a Cray SV1. Cray's performance analysis tools pointed to two subroutines that together accounted for 80%-90% of the total CPU time. Source code modifications were performed to enable vectorization of the most significant loops, to correct unfavorable strides through memory, and to replace a conjugate gradient solver subroutine with a call to the Cray Scientific Library. These optimizations resulted in a speedup of 7.79 for the INEEL's largest ATR model. Parallel scalability of the OpenMP version of the code is also discussed, and timing results are given for other non-vector platforms.

  16. Bayesian Atmospheric Radiative Transfer (BART): Thermochemical Equilibrium Abundance (TEA) Code and Application to WASP-43b

    NASA Astrophysics Data System (ADS)

    Blecic, Jasmina; Harrington, Joseph; Bowman, Matthew O.; Cubillos, Patricio E.; Stemm, Madison; Foster, Andrew

    2014-11-01

We present a new, open-source Thermochemical Equilibrium Abundances (TEA) code that calculates the abundances of gaseous molecular species. TEA uses the Gibbs-free-energy minimization method with an iterative Lagrangian optimization scheme. It initializes the radiative-transfer calculation in our Bayesian Atmospheric Radiative Transfer (BART) code. Given elemental abundances, TEA calculates molecular abundances for a particular temperature and pressure or a list of temperature-pressure pairs. The code is tested against the original method developed by White et al. (1958), the analytic method developed by Burrows and Sharp (1999), and the Newton-Raphson method implemented in the open-source Chemical Equilibrium with Applications (CEA) code. TEA is written in Python and is available to the community via the open-source development site GitHub.com. We also present BART applied to eclipse depths of the exoplanet WASP-43b, constraining atmospheric thermal and chemical parameters. This work was supported by NASA Planetary Atmospheres grant NNX12AI69G and NASA Astrophysics Data Analysis Program grant NNX13AF38G. JB holds a NASA Earth and Space Science Fellowship.
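Gibbs-free-energy minimization can be illustrated on the smallest possible case, a two-species ideal mixture A ⇌ B, where the minimum of the mixture Gibbs energy reproduces the analytic equilibrium fraction x = 1/(1 + exp(ΔG/RT)). The golden-section search below is a stand-in for TEA's iterative Lagrangian scheme, which handles many species and mass-balance constraints:

```python
import math

def equilibrium_fraction(delta_g, rt, tol=1e-12):
    """Minimize the Gibbs energy of an ideal A <-> B mixture by golden-section
    search; x is the mole fraction of B and delta_g = mu_B - mu_A."""
    def gibbs(x):
        return x * delta_g + rt * (x * math.log(x) + (1 - x) * math.log(1 - x))
    phi = (math.sqrt(5) - 1) / 2
    lo, hi = 1e-9, 1.0 - 1e-9
    a = hi - phi * (hi - lo)
    b = lo + phi * (hi - lo)
    while hi - lo > tol:
        if gibbs(a) < gibbs(b):
            hi, b = b, a
            a = hi - phi * (hi - lo)
        else:
            lo, a = a, b
            b = lo + phi * (hi - lo)
    return 0.5 * (lo + hi)
```

Setting dG/dx = ΔG + RT ln(x/(1-x)) = 0 gives the analytic answer the search converges to, which is the one-dimensional analogue of the stationarity conditions TEA enforces with Lagrange multipliers.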

  17. HZETRN: A heavy ion/nucleon transport code for space radiations

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Chun, Sang Y.; Badavi, Forooz F.; Townsend, Lawrence W.; Lamkin, Stanley L.

    1991-01-01

    The galactic heavy ion transport code (GCRTRN) and the nucleon transport code (BRYNTRN) are integrated into a code package (HZETRN). The code package is computer efficient and capable of operating in an engineering design environment for manned deep space mission studies. The nuclear data set used by the code is discussed including current limitations. Although the heavy ion nuclear cross sections are assumed constant, the nucleon-nuclear cross sections of BRYNTRN with full energy dependence are used. The relation of the final code to the Boltzmann equation is discussed in the context of simplifying assumptions. Error generation and propagation is discussed, and comparison is made with simplified analytic solutions to test numerical accuracy of the final results. A brief discussion of biological issues and their impact on fundamental developments in shielding technology is given.

  18. Combining node-centered parallel radiation transport and higher-order multi-material cell-centered hydrodynamics methods in three-temperature radiation hydrodynamics code TRHD

    NASA Astrophysics Data System (ADS)

    Sijoy, C. D.; Chaturvedi, S.

    2016-06-01

Higher-order cell-centered multi-material hydrodynamics (HD) and parallel node-centered radiation transport (RT) schemes are combined self-consistently in the three-temperature (3T) radiation hydrodynamics (RHD) code TRHD (Sijoy and Chaturvedi, 2015), developed for the simulation of intense thermal radiation or high-power laser driven RHD. For RT, a node-centered gray model implemented in the popular RHD code MULTI2D (Ramis et al., 2009) is used. This scheme, in principle, can handle RT in both optically thick and thin materials. The RT module has been parallelized using the message passing interface (MPI) for parallel computation. Presently, for multi-material HD, we have used a simple and robust closure model in which a common strain rate for all materials in a mixed cell is assumed. The closure model has been further generalized to allow different temperatures for the electrons and ions. In addition, the electron and radiation temperatures are assumed to be in non-equilibrium. Therefore, the thermal relaxation between the electrons and ions and the coupling between the radiation and matter energies are required to be computed self-consistently. This has been achieved by using a node-centered symmetric-semi-implicit (SSI) integration scheme. The electron thermal conduction is calculated using a cell-centered, monotonic, non-linear finite volume (NLFV) scheme suitable for unstructured meshes. In this paper, we describe the details of the 2D, 3T, non-equilibrium, multi-material RHD code with special attention to the coupling of the various cell-centered and node-centered formulations, along with a suite of validation test problems to demonstrate the accuracy and performance of the algorithms. We also report the parallel performance of the RT module. Finally, in order to demonstrate the full capability of the code implementation, we present the simulation of laser driven shock propagation in a layered thin foil.
The simulation results are found to be in good
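The symmetric-semi-implicit (SSI) idea used for the electron-ion thermal relaxation can be sketched on a two-temperature toy problem: each update treats its own temperature implicitly, so the scheme stays bounded even when the time step far exceeds the relaxation time. The scalar form, heat capacities, and coupling rate below are illustrative simplifications of the full 3T system in TRHD:

```python
def relax_ssi(te, ti, ce, ci, nu, dt, steps):
    """Symmetric-semi-implicit update of electron/ion temperature relaxation:
    each equation treats its own temperature implicitly (new value on both
    sides), giving a convex combination of old temperatures at every step,
    hence unconditional boundedness."""
    for _ in range(steps):
        te_new = (te + dt * nu / ce * ti) / (1.0 + dt * nu / ce)
        ti_new = (ti + dt * nu / ci * te) / (1.0 + dt * nu / ci)
        te, ti = te_new, ti_new
    return te, ti
```

With equal heat capacities the scheme conserves the total energy exactly and both temperatures relax to the mean; with a huge time step the update still never leaves the interval spanned by the initial temperatures.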

  19. Application of 3-dimensional radiation transport codes to the analysis of the CRBR prototypic coolant pipe chaseway neutron streaming experiment

    SciTech Connect

Chatani, K.

    1992-08-01

This report summarizes the calculational results from analyses of a Clinch River Breeder Reactor (CRBR) prototypic coolant pipe chaseway neutron streaming experiment. Comparisons of calculated and measured results are presented, major emphasis being placed on results at bends in the chaseway. Calculations were performed with three three-dimensional radiation transport codes: the discrete ordinates code TORT and the Monte Carlo code MORSE, both developed by the Oak Ridge National Laboratory (ORNL), and the discrete ordinates code ENSEMBLE, developed in Japan. The calculated results from the three codes are compared (1) with previously-calculated DOT3.5 two-dimensional results, (2) among themselves, and (3) with measured results. Calculations with TORT used both the weighted-difference and nodal methods. Only the weighted-difference method was used in ENSEMBLE. When the calculated results were compared to measured results, it was found that calculation-to-experiment (C/E) ratios were good in the regions of the chaseway where two-dimensional modeling might be difficult and where there were no significant discrete ordinates ray effects. Excellent agreement was observed for responses dominated by thermal neutron contributions. MORSE-calculated results and comparisons are described also, and detailed results are presented in an appendix.

  20. Activities of the Radiation Shielding Information Center and a report on codes/data for high energy radiation transport

    SciTech Connect

    Roussin, R.W.

    1994-10-01

    From the very early days in its history RSIC has been involved with high energy radiation transport. The National Aeronautics and Space Administration was an early sponsor of RSIC until the completion of the Apollo Moon Exploration Program. In addition, the intranuclear cascade work of Bertini at Oak Ridge National Laboratory provided valuable resources which were made available through RSIC. Over the years, RSIC has had interactions with many of the developers of high energy radiation transport computing technology and data libraries and has been able to collect and disseminate this technology. The current status of this technology will be reviewed and prospects for new advancements will be examined.

  1. The FLUKA radiation transport code and its use for space problems.

    PubMed

    Ferrari, A; Ranft, J; Sala, P R

    2001-01-01

    FLUKA is a multiparticle transport code capable of handling hadronic and electromagnetic showers up to very high energies (100 TeV), widely used for radioprotection and detector simulation studies. The physical models embedded into FLUKA are briefly described and their capabilities demonstrated against available experimental data. The complete modelling of cosmic ray showers in the earth atmosphere with FLUKA is also described, and its relevance for benchmarking the code for space-like environments discussed. Finally, the ongoing developments of the physical models of the code are presented and discussed.

  2. Improvement of Specter II Code: Injection and Evolution of an Artificial Radiation Belt

    DTIC Science & Technology

    1979-08-01

by processes associated with the explosion. The latter possibility was first suggested by Colgate (Ref. 20); he estimated that the counting rate of... J. I. Vette, "Trapped Radiation Population", in the Trapped Radiation Handbook, DNA 2524F, revised 1977. 20. S. A. Colgate, "Energetic Electrons

  3. Code of Practice for the Use of Ionizing Radiations in Secondary Schools.

    ERIC Educational Resources Information Center

    National Health and Medical Research Council, Canberra (Australia).

    The appreciation of the potential hazard of ionizing radiation led to the setting up of national, and later, international commissions for the defining of standards of protection for the occupationally exposed worker in the use of ionizing radiation. However, in the last twenty years, with the large scale development of nuclear energy, the need…

  4. GARLIC - A general purpose atmospheric radiative transfer line-by-line infrared-microwave code: Implementation and evaluation

    NASA Astrophysics Data System (ADS)

    Schreier, Franz; Gimeno García, Sebastián; Hedelt, Pascal; Hess, Michael; Mendrok, Jana; Vasquez, Mayte; Xu, Jian

    2014-04-01

    A suite of programs for high resolution infrared-microwave atmospheric radiative transfer modeling has been developed with emphasis on efficient and reliable numerical algorithms and a modular approach appropriate for simulation and/or retrieval in a variety of applications. The Generic Atmospheric Radiation Line-by-line Infrared Code - GARLIC - is suitable for arbitrary observation geometry, instrumental field-of-view, and line shape. The core of GARLIC's subroutines constitutes the basis of forward models used to implement inversion codes to retrieve atmospheric state parameters from limb and nadir sounding instruments. This paper briefly introduces the physical and mathematical basics of GARLIC and its descendants and continues with an in-depth presentation of various implementation aspects: An optimized Voigt function algorithm combined with a two-grid approach is used to accelerate the line-by-line modeling of molecular cross sections; various quadrature methods are implemented to evaluate the Schwarzschild and Beer integrals; and Jacobians, i.e. derivatives with respect to the unknowns of the atmospheric inverse problem, are implemented by means of automatic differentiation. For an assessment of GARLIC's performance, a comparison of the quadrature methods for solution of the path integral is provided. Verification and validation are demonstrated using intercomparisons with other line-by-line codes and comparisons of synthetic spectra with spectra observed on Earth and from Venus.
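The quadrature comparison mentioned for the Beer integral can be reproduced on a toy exponential absorber profile. The two rules below are generic trapezoid and Simpson schemes, not GARLIC's actual implementation; they merely illustrate why the choice of path quadrature matters for the optical depth tau = ∫ k(s) ds:

```python
import math

def optical_depth(k, s0, s1, n, method="trapezoid"):
    """Evaluate the Beer integral tau = integral of k(s) ds on [s0, s1]
    with n panels, using either the trapezoid or Simpson rule."""
    h = (s1 - s0) / n
    if method == "trapezoid":
        total = 0.5 * (k(s0) + k(s1)) + sum(k(s0 + i * h) for i in range(1, n))
        return total * h
    if method == "simpson":  # n must be even
        total = k(s0) + k(s1)
        for i in range(1, n):
            total += (4 if i % 2 else 2) * k(s0 + i * h)
        return total * h / 3.0
    raise ValueError(method)
```

For a smooth exponential profile the Simpson result converges as h^4 versus h^2 for the trapezoid rule, so far fewer path points are needed for the same transmittance accuracy.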

  5. HELIOS: An Open-source, GPU-accelerated Radiative Transfer Code for Self-consistent Exoplanetary Atmospheres

    NASA Astrophysics Data System (ADS)

    Malik, Matej; Grosheintz, Luc; Mendonça, João M.; Grimm, Simon L.; Lavie, Baptiste; Kitzmann, Daniel; Tsai, Shang-Min; Burrows, Adam; Kreidberg, Laura; Bedell, Megan; Bean, Jacob L.; Stevenson, Kevin B.; Heng, Kevin

    2017-02-01

We present the open-source radiative transfer code named HELIOS, which is constructed for studying exoplanetary atmospheres. In its initial version, the model atmospheres of HELIOS are one-dimensional and plane-parallel, and the equation of radiative transfer is solved in the two-stream approximation with nonisotropic scattering. A small set of the main infrared absorbers is employed, computed with the opacity calculator HELIOS-K and combined using a correlated-k approximation. The molecular abundances originate from validated analytical formulae for equilibrium chemistry. We compare HELIOS with the work of Miller-Ricci & Fortney using a model of GJ 1214b, and perform several tests, where we find: model atmospheres with single-temperature layers struggle to converge to radiative equilibrium; k-distribution tables constructed with ≳ 0.01 cm-1 resolution in the opacity function (≲ 10³ points per wavenumber bin) may result in errors ≳ 1%-10% in the synthetic spectra; and a diffusivity factor of 2 approximates well the exact radiative transfer solution in the limit of pure absorption. We construct "null-hypothesis" models (chemical equilibrium, radiative equilibrium, and solar elemental abundances) for six hot Jupiters. We find that the dayside emission spectra of HD 189733b and WASP-43b are consistent with the null hypothesis, while the null-hypothesis models consistently underpredict the observed fluxes of WASP-8b, WASP-12b, WASP-14b, and WASP-33b. We demonstrate that our results are somewhat insensitive to the choice of stellar models (blackbody, Kurucz, or PHOENIX) and metallicity, but are strongly affected by higher carbon-to-oxygen ratios. The code is publicly available as part of the Exoclimes Simulation Platform (exoclime.net).
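The diffusivity-factor result, that exp(-D·τ) with D ≈ 2 approximates the exact hemispheric flux transmittance 2∫₀¹ μ exp(-τ/μ) dμ in the pure-absorption limit, is easy to check numerically. The brute-force midpoint quadrature below is only for verification, not how a two-stream solver would compute it:

```python
import math

def flux_transmittance(tau, n=20000):
    """Exact angular integral 2 * integral of mu * exp(-tau/mu) over mu in (0,1),
    evaluated by midpoint quadrature (equals 2*E3(tau))."""
    total = 0.0
    for i in range(n):
        mu = (i + 0.5) / n
        total += 2.0 * mu * math.exp(-tau / mu) / n
    return total

def diffusivity_transmittance(tau, d=2.0):
    """One-ray 'diffusivity factor' shortcut used by two-stream solvers."""
    return math.exp(-d * tau)
```

The one-ray shortcut replaces the angular integral with a single effective slant path, which is why two-stream codes can avoid angular quadrature entirely; the approximation is tightest for optically thin layers.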

  6. A fast Monte Carlo code for proton transport in radiation therapy based on MCNPX.

    PubMed

    Jabbari, Keyvan; Seuntjens, Jan

    2014-07-01

An important requirement for proton therapy is software for dose calculation. Monte Carlo is the most accurate method for dose calculation, but it is very slow. In this work, a method is developed to improve the speed of dose calculation. The method is based on pre-generated tracks for particle transport. The MCNPX code has been used for generation of the tracks. A set of data including the particle track was produced in each particular material (water, air, lung tissue, bone, and soft tissue). The code can transport protons over a wide range of energies (up to 200 MeV). The validity of the fast Monte Carlo (MC) code is evaluated against MCNPX as a reference code. While an analytical pencil-beam transport algorithm shows large errors (up to 10%) near small high-density heterogeneities, our dose calculations and isodose distributions deviated from the MCNPX results by less than 2%. In terms of speed, the code runs 200 times faster than MCNPX. With the fast MC code developed in this work, it takes less than 2 minutes to calculate the dose for 10⁶ particles on an Intel Core 2 Duo 2.66 GHz desktop computer.
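The pre-generated-track idea can be sketched as follows: step sequences are sampled once by an expensive reference simulation and then reused, rescaled, for fast transport in other materials (a track-repeating scheme). The exponential step lengths and simple density scaling below are toy stand-ins for the MCNPX-generated track data described in the abstract:

```python
import random

def pregenerate_tracks(n_tracks, n_steps, seed=1):
    """One-off 'expensive' simulation: random step lengths (cm in water)
    for each stored track."""
    rng = random.Random(seed)
    return [[0.1 * rng.expovariate(1.0) for _ in range(n_steps)]
            for _ in range(n_tracks)]

def mean_range(tracks, density_ratio):
    """Fast transport: reuse the stored water tracks in another material by
    scaling every step by the water-to-material density ratio."""
    ranges = [sum(step * density_ratio for step in track) for track in tracks]
    return sum(ranges) / len(ranges)
```

The speedup comes from skipping the per-step physics sampling entirely at run time; only cheap geometric rescaling of stored tracks remains.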

  7. A fast Monte Carlo code for proton transport in radiation therapy based on MCNPX

    PubMed Central

    Jabbari, Keyvan; Seuntjens, Jan

    2014-01-01

An important requirement for proton therapy is software for dose calculation. Monte Carlo is the most accurate method for dose calculation, but it is very slow. In this work, a method is developed to improve the speed of dose calculation. The method is based on pre-generated tracks for particle transport. The MCNPX code has been used for generation of the tracks. A set of data including the particle track was produced in each particular material (water, air, lung tissue, bone, and soft tissue). The code can transport protons over a wide range of energies (up to 200 MeV). The validity of the fast Monte Carlo (MC) code is evaluated against MCNPX as a reference code. While an analytical pencil-beam transport algorithm shows large errors (up to 10%) near small high-density heterogeneities, our dose calculations and isodose distributions deviated from the MCNPX results by less than 2%. In terms of speed, the code runs 200 times faster than MCNPX. With the fast MC code developed in this work, it takes less than 2 minutes to calculate the dose for 10⁶ particles on an Intel Core 2 Duo 2.66 GHz desktop computer. PMID:25190994

  8. Automatic commissioning of a GPU-based Monte Carlo radiation dose calculation code for photon radiotherapy

    NASA Astrophysics Data System (ADS)

    Tian, Zhen; Jiang Graves, Yan; Jia, Xun; Jiang, Steve B.

    2014-10-01

Monte Carlo (MC) simulation is commonly considered the most accurate method for radiation dose calculations. Commissioning of a beam model in the MC code against a clinical linear accelerator beam is of crucial importance for its clinical implementation. In this paper, we propose an automatic commissioning method for our GPU-based MC dose engine, gDPM. gDPM utilizes a beam model based on a concept of phase-space-let (PSL). A PSL contains a group of particles that are of the same type and close in space and energy. A set of generic PSLs was generated by splitting a reference phase-space file. Each PSL was associated with a weighting factor, and in dose calculations the particle carried a weight corresponding to the PSL it came from. Dose for each PSL in water was pre-computed, and hence the dose in water for a whole beam under a given set of PSL weighting factors was the weighted sum of the PSL doses. At the commissioning stage, an optimization problem was solved to adjust the PSL weights in order to minimize the difference between the calculated dose and the measured one. Symmetry and smoothness regularizations were utilized to uniquely determine the solution. An augmented Lagrangian method was employed to solve the optimization problem. To validate our method, a phase-space file of a Varian TrueBeam 6 MV beam was used to generate the PSLs for 6 MV beams. In a simulation study, we commissioned a Siemens 6 MV beam on which a set of field-dependent phase-space files was available. The dose data of this desired beam for different open fields and a small off-axis open field were obtained by calculating doses using these phase-space files. The 3D γ-index test passing rate within the regions with dose above 10% of the dmax dose for those open fields tested was improved on average from 70.56% to 99.36% for the 2%/2 mm criterion and from 32.22% to 89.65% for the 1%/1 mm criterion. We also tested our commissioning method on a six-field head-and-neck cancer IMRT plan. The
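The commissioning step, adjusting PSL weights so that the weighted sum of pre-computed per-PSL doses matches measurement, is a small constrained least-squares problem. The sketch below uses plain projected gradient descent on invented 3-PSL, 3-voxel data; the paper's actual solver is an augmented Lagrangian method with symmetry and smoothness regularization:

```python
def commission_psl_weights(psl_doses, measured, n_iter=5000, lr=0.05):
    """Adjust PSL weights so the weighted sum of pre-computed per-PSL dose
    distributions matches the measured dose, via gradient descent on the
    squared misfit with a non-negativity projection."""
    n_psl = len(psl_doses)
    n_vox = len(measured)
    w = [1.0] * n_psl
    for _ in range(n_iter):
        resid = [sum(w[j] * psl_doses[j][v] for j in range(n_psl)) - measured[v]
                 for v in range(n_vox)]
        for j in range(n_psl):
            grad = sum(2.0 * resid[v] * psl_doses[j][v] for v in range(n_vox))
            w[j] = max(0.0, w[j] - lr * grad)   # weights stay non-negative
    return w
```

Because the per-PSL doses are pre-computed once, each commissioning iteration is just a cheap weighted sum, which is what makes re-commissioning against a new linac beam fast.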

  9. MESTRN: A Deterministic Meson-Muon Transport Code for Space Radiation

    NASA Technical Reports Server (NTRS)

    Blattnig, Steve R.; Norbury, John W.; Norman, Ryan B.; Wilson, John W.; Singleterry, Robert C., Jr.; Tripathi, Ram K.

    2004-01-01

    A safe and efficient exploration of space requires an understanding of space radiations, so that human life and sensitive equipment can be protected. On the way to these sensitive sites, the radiation fields are modified in both quality and quantity. Many of these modifications are thought to be due to the production of pions and muons in the interactions between the radiation and intervening matter. A method used to predict the effects of the presence of these particles on the transport of radiation through materials is developed. This method was then used to develop software, which was used to calculate the fluxes of pions and muons after the transport of a cosmic ray spectrum through aluminum and water. Software descriptions are given in the appendices.

  10. The Strategic High-Altitude Atmospheric Radiation Code (SHARC) User Instructions.

    DTIC Science & Technology

    1989-02-03

40 spectral region. It models radiation due to NLTE (Non-Local Thermodynamic Equilibrium) molecular emissions which are the dominant sources at these...radiative decay. This leads to a condition of Non-Local Thermodynamic Equilibrium (NLTE) where the various degrees of vibrational, rotational, and...The SHARC INTERPRETER is a modified Sandia interpreter from which information on elements in the periodic table, the thermodynamic data base

  11. ZEUS-2D: A radiation magnetohydrodynamics code for astrophysical flows in two space dimensions. I - The hydrodynamic algorithms and tests.

    NASA Astrophysics Data System (ADS)

    Stone, James M.; Norman, Michael L.

    1992-06-01

    A detailed description of ZEUS-2D, a numerical code for the simulation of fluid dynamical flows including a self-consistent treatment of the effects of magnetic fields and radiation transfer is presented. Attention is given to the hydrodynamic (HD) algorithms which form the foundation for the more complex MHD and radiation HD algorithms. The effect of self-gravity on the flow dynamics is accounted for by an iterative solution of the sparse-banded matrix resulting from discretizing the Poisson equation in multidimensions. The results of an extensive series of HD test problems are presented. A detailed description of the MHD algorithms in ZEUS-2D is presented. A new method of computing the electromotive force is developed using the method of characteristics (MOC). It is demonstrated through the results of an extensive series of MHD test problems that the resulting hybrid MOC-constrained transport method provides for the accurate evolution of all modes of MHD wave families.

  12. Retrieving the Molecular Composition of Planet-Forming Material: An Accurate Non-LTE Radiative Transfer Code for JWST

    NASA Astrophysics Data System (ADS)

    Pontoppidan, Klaus

    Based on the observed distributions of exoplanets and dynamical models of their evolution, the primary planet-forming regions of protoplanetary disks are thought to span distances of 1-20 AU from typical stars. A key observational challenge of the next decade will be to understand the links between the formation of planets in protoplanetary disks and the chemical composition of exoplanets. Potentially habitable planets in particular are likely formed by solids growing within radii of a few AU, augmented by unknown contributions from volatiles formed at larger radii of 10-50 AU. The basic chemical composition of these inner disk regions is characterized by near- to far-infrared (2-200 micron) emission lines from molecular gas at temperatures of 50-1500 K. A critical step toward measuring the chemical composition of planet-forming regions is therefore to convert observed infrared molecular line fluxes, profiles and images to gas temperatures, densities and molecular abundances. However, current techniques typically employ approximate radiative transfer methods and assumptions of local thermodynamic equilibrium (LTE) to retrieve abundances, leading to uncertainties of orders of magnitude and inconclusive comparisons to chemical models. Ultimately, the scientific impact of the high quality spectroscopic data expected from the James Webb Space Telescope (JWST) will be limited by the availability of radiative transfer tools for infrared molecular lines. We propose to develop a numerically accurate, non-LTE 3D line radiative transfer code, needed to interpret mid-infrared molecular line observations of protoplanetary and debris disks in preparation for the James Webb Space Telescope (JWST). This will be accomplished by adding critical functionality to the existing Monte Carlo code LIME, which was originally developed to support (sub)millimeter interferometric observations. 
In contrast to existing infrared codes, LIME calculates the exact statistical balance of arbitrary
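The statistical-balance step that a non-LTE solver must perform can be illustrated with a toy two-level system: set the net radiative and collisional rates into and out of each level to zero and solve the resulting linear system for the level populations. All rate values below are illustrative placeholders, not LIME inputs.

```python
import numpy as np

# Toy statistical-equilibrium solve for a two-level "atom": balance
# spontaneous decay (A_ul) against collisional excitation/de-excitation
# (C_lu, C_ul). All rates are illustrative placeholders.
A_ul = 6.0e-5      # spontaneous decay rate [1/s]
C_ul = 1.0e-6      # collisional de-excitation rate [1/s]
C_lu = 2.0e-7      # collisional excitation rate [1/s]

# Rate matrix R: d(n)/dt = R @ n = 0. Replace one row with particle
# conservation (n_l + n_u = 1) to make the system non-singular.
R = np.array([[-C_lu,        A_ul + C_ul],
              [ C_lu,       -(A_ul + C_ul)]])
M = R.copy()
M[0, :] = 1.0                      # conservation row
b = np.array([1.0, 0.0])
n = np.linalg.solve(M, b)
print(n)  # [n_l, n_u]; n_u << n_l because A_ul dominates the collisions
```

A full non-LTE code solves the same kind of system for many levels at every grid cell, coupled to the radiation field.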

  13. An Evaluation of Potential Modifications to the SEER (Simplified Estimation of Exposure to Radiation) Fallout Code.

    DTIC Science & Technology

    1983-09-01

DEFENSE NUCLEAR AGENCY UNDER RDT&E RMSS CODE B364080464 V99QAXNJ31203 H2590D. Prepared for: Director, Defense Nuclear Agency, Washington, DC 20305-1000. Performing organization: Science Applications International Corporation, P.O. Box 1303, McLean, VA.

  14. Voxel2MCNP: a framework for modeling, simulation and evaluation of radiation transport scenarios for Monte Carlo codes.

    PubMed

    Pölz, Stefan; Laubersheimer, Sven; Eberhardt, Jakob S; Harrendorf, Marco A; Keck, Thomas; Benzler, Andreas; Breustedt, Bastian

    2013-08-21

The basic idea of Voxel2MCNP is to provide a framework supporting users in modeling radiation transport scenarios using voxel phantoms and other geometric models, generating corresponding input for the Monte Carlo code MCNPX, and evaluating simulation output. Applications at Karlsruhe Institute of Technology are primarily whole and partial body counter calibration and calculation of dose conversion coefficients. A new generic data model describing data related to radiation transport, including phantom and detector geometries and their properties, sources, tallies and materials, has been developed. It is modular and generally independent of the targeted Monte Carlo code. The data model has been implemented as an XML-based file format to facilitate data exchange, and integrated with Voxel2MCNP to provide a common interface for modeling, visualization, and evaluation of data. Also, extensions to allow compatibility with several file formats, such as ENSDF for nuclear structure properties and radioactive decay data, SimpleGeo for solid geometry modeling, ImageJ for voxel lattices, and MCNPX's MCTAL for simulation results, have been added. The framework is presented and discussed in this paper, and example workflows for body counter calibration and calculation of dose conversion coefficients are given to illustrate its application.

  15. The Intercomparison of 3D Radiation Codes (I3RC): Showcasing Mathematical and Computational Physics in a Critical Atmospheric Application

    NASA Astrophysics Data System (ADS)

    Davis, A. B.; Cahalan, R. F.

    2001-05-01

The Intercomparison of 3D Radiation Codes (I3RC) is an on-going initiative involving an international group of over 30 researchers engaged in the numerical modeling of three-dimensional radiative transfer as applied to clouds. Because of their strong variability and extreme opacity, clouds are indeed a major source of uncertainty in the Earth's local radiation budget (at GCM grid scales). Also, 3D effects (at satellite pixel scales) invalidate the standard plane-parallel assumption made in routine cloud-property remote sensing at NASA and NOAA. Accordingly, the test cases used in I3RC are based on inputs and outputs that relate to cloud effects on atmospheric heating rates and to real-world remote sensing geometries. The main objectives of I3RC are to (1) enable participants to improve their models, (2) publish results as a community, (3) archive source code, and (4) educate. We will survey the status of I3RC and its plans for the near future with a special emphasis on the mathematical models and computational approaches. We will also describe some of the prime applications of I3RC's efforts in climate models, cloud-resolving models, and remote-sensing observations of clouds, or of the surface in their presence. In all these application areas, computational efficiency is the main concern, not accuracy. One of I3RC's main goals is to document the performance of as wide a variety as possible of three-dimensional radiative transfer models for a small but representative number of "cases." However, the initiative is dominated by modelers working at the level of linear transport theory (i.e., they solve the radiative transfer equation), and an overwhelming majority of these participants use slow-but-robust Monte Carlo techniques. This means that only a small portion of the efficiency vs. accuracy vs. flexibility domain is currently populated by I3RC participants. To balance this natural clustering the present authors have organized a systematic outreach towards
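The Monte Carlo approach most I3RC participants rely on can be reduced to a minimal sketch: sample exponential free paths and count photons crossing a purely absorbing slab, then check the estimate against the Beer-Lambert law. Real I3RC cases add scattering and 3D cloud structure; this is only the skeleton of the technique.

```python
import random, math

# Minimal Monte Carlo: direct transmission through a purely absorbing,
# homogeneous slab of optical depth tau, compared to exp(-tau).
random.seed(1)
tau = 2.0
N = 200_000
# A photon survives if its sampled optical path exceeds the slab depth.
transmitted = sum(1 for _ in range(N)
                  if -math.log(1.0 - random.random()) > tau)
mc = transmitted / N
exact = math.exp(-tau)    # ~0.1353
assert abs(mc - exact) < 0.01
print(mc, exact)
```

Adding scattering turns each history into a random walk, which is why these codes are robust but slow.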

  16. A NEW SEMI-EMPIRICAL AMBIENT TO EFFECTIVE DOSE CONVERSION MODEL FOR THE PREDICTIVE CODE FOR AIRCREW RADIATION EXPOSURE (PCAIRE).

    PubMed

    Dumouchel, T; McCall, M; Lemay, F; Bennett, L; Lewis, B; Bean, M

    2016-12-01

    The Predictive Code for Aircrew Radiation Exposure (PCAIRE) is a semi-empirical code that estimates both ambient dose equivalent, based on years of on-board measurements, and effective dose to aircrew. Currently, PCAIRE estimates effective dose by converting the ambient dose equivalent to effective dose (E/H) using a model that is based on radiation transport calculations and on the radiation weighting factors recommended in International Commission on Radiological Protection (ICRP) 60. In this study, a new semi-empirical E/H model is proposed to replace the existing transport calculation models. The new model is based on flight data measured using a tissue-equivalent proportional counter (TEPC). The measured flight TEPC data are separated into a low- and a high-lineal-energy spectrum using an amplitude-weighted (137)Cs TEPC spectrum. The high-lineal-energy spectrum is determined by subtracting the low-lineal-energy spectrum from the measured flight TEPC spectrum. With knowledge of E/H for the low- and high-lineal-energy spectra, the total E/H is estimated for a given flight altitude and geographic location. The semi-empirical E/H model also uses new radiation weighting factors to align the model with the most recent ICRP 103 recommendations. The ICRP 103-based semi-empirical effective dose model predicts that there is a ∼30 % reduction in dose in comparison with the ICRP 60-based model. Furthermore, the ambient dose equivalent is now a more conservative dose estimate for jet aircraft altitudes in the range of 7-13 km (FL230-430). This new semi-empirical E/H model is validated against E/H predicted from a Monte Carlo N-Particle transport code simulation of cosmic ray propagation through the Earth's atmosphere. Its implementation allows PCAIRE to provide an accurate semi-empirical estimate of the effective dose. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
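The dose-weighted combination implied by the two-component split above can be sketched as follows; the numeric values are purely illustrative, not PCAIRE coefficients:

```python
# Combine per-component E/H ratios, weighting by each component's share of
# the ambient dose equivalent H*. Inputs here are illustrative placeholders.
def total_E_over_H(H_low, H_high, eh_low, eh_high):
    return (H_low * eh_low + H_high * eh_high) / (H_low + H_high)

# e.g. 60% of H* from the low-lineal-energy component with E/H = 0.8,
#      40% from the high-lineal-energy component with E/H = 0.4
print(round(total_E_over_H(0.6, 0.4, 0.8, 0.4), 2))  # 0.64
```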

  17. Quantitative comparisons between experimentally measured 2-D carbon radiation and Monte Carlo impurity (MCI) code simulations

    SciTech Connect

    Evans, T.E.; Leonard, A.W.; West, W.P.; Finkenthal, D.F.; Fenstermacher, M.E.; Porter, G.D.

    1998-08-01

Experimentally measured carbon line emissions and total radiated power distributions from the DIII-D divertor and Scrape-Off Layer (SOL) are compared to those calculated with the Monte Carlo Impurity (MCI) model. A UEDGE background plasma is used in MCI with the Roth and Garcia-Rosales (RG-R) chemical sputtering model and/or one of six physical sputtering models. While results from these simulations do not reproduce all of the features seen in the experimentally measured radiation patterns, the total radiated power calculated in MCI is in relatively good agreement with that measured by the DIII-D bolometric system when the Smith78 physical sputtering model is coupled to RG-R chemical sputtering in an unaltered UEDGE plasma. Alternatively, MCI simulations done with UEDGE background ion temperatures along the divertor target plates adjusted to better match those measured in the experiment yielded three physical sputtering models which, when coupled to the RG-R model, gave a total radiated power within 10% of the measured value.

  18. The Air Transport of Radiation (ATR) Code: Development and Testing of ATR5

    DTIC Science & Technology

    1990-01-02

"Code for Computing Fission Product Gamma Dose and Dose Rates," RRA-N7236 (October 1972). F.R. Mynatt, et al., "Calculations of the Penetration of..." ...penetration of missile silos. Works such as that by Mynatt, et al., incorporated a method for estimating the contribution of debris radiation to the total.

  19. Bayesian Atmospheric Radiative Transfer (BART) Code and Application to WASP-43b

    NASA Astrophysics Data System (ADS)

    Blecic, Jasmina; Harrington, Joseph; Cubillos, Patricio; Bowman, Oliver; Rojo, Patricio; Stemm, Madison; Lust, Nathaniel B.; Challener, Ryan; Foster, Austin James; Foster, Andrew S.; Blumenthal, Sarah D.; Bruce, Dylan

    2016-01-01

We present a new open-source Bayesian radiative-transfer framework, Bayesian Atmospheric Radiative Transfer (BART, https://github.com/exosports/BART), and its application to WASP-43b. BART initializes a model for the atmospheric retrieval calculation, generates thousands of theoretical model spectra using parametrized pressure and temperature profiles and line-by-line radiative-transfer calculations, and employs a statistical package to compare the models with the observations. It consists of three self-sufficient modules available to the community under the reproducible-research license: the Thermochemical Equilibrium Abundances module (TEA, https://github.com/dzesmin/TEA, Blecic et al. 2015), the radiative-transfer module (Transit, https://github.com/exosports/transit), and the Multi-core Markov-chain Monte Carlo statistical module (MCcubed, https://github.com/pcubillos/MCcubed, Cubillos et al. 2015). We applied BART to all available WASP-43b secondary eclipse data from space- and ground-based observations, constraining the temperature-pressure profile and molecular abundances of the dayside atmosphere of WASP-43b. This work was supported by NASA Planetary Atmospheres grant NNX12AI69G and NASA Astrophysics Data Analysis Program grant NNX13AF38G. JB holds a NASA Earth and Space Science Fellowship.
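The statistical step that BART delegates to its MCMC module is Markov-chain Monte Carlo; a bare-bones Metropolis sampler of a 1-D Gaussian posterior (a toy stand-in, not BART's actual retrieval machinery) looks like this:

```python
import random, math

# Metropolis sampling of a standard-normal "posterior" (up to a constant).
random.seed(2)
def log_post(x):
    return -0.5 * x * x

x, chain = 0.0, []
for _ in range(100_000):
    prop = x + random.uniform(-1.0, 1.0)          # symmetric proposal
    # Accept with probability min(1, post(prop)/post(x)).
    if math.log(1.0 - random.random()) < log_post(prop) - log_post(x):
        x = prop
    chain.append(x)

burned = chain[10_000:]                           # discard burn-in
mean = sum(burned) / len(burned)
var = sum((v - mean) ** 2 for v in burned) / len(burned)
print(mean, var)   # ≈ 0 and ≈ 1 for a well-mixed chain
```

In a retrieval, `log_post` would instead compare a model spectrum against the observed eclipse depths.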

  20. Massive Neutrino Decay Driven Radiative Instabilities, Sub-Structure Survival in Galaxy Clusters and a Nested - Particle-Mesh Code

    NASA Astrophysics Data System (ADS)

    Splinter, Randall John

    1995-01-01

I have performed a series of studies concerning the clustering of mass on large scales in the universe, with the goal being an increased understanding of the role of various processes in the formation of structure in the universe. One of the first dark matter candidates was the massive neutrino. In this section I investigate the role of a radiative decay mode for a massive neutrino species, and its impact on structure formation. By reviving a concept known as "mock gravity" I attempt to provide seed masses for eventual galaxy formation in a Hot Dark Matter universe. I show that mock gravity is ineffective at generating seed masses for galaxy formation; the ionization rate is too large and the universe becomes fully ionized well before the radiation pressure can have any effect on the clumping of matter. The final section of this thesis presents a series of N-body experiments which are aimed at understanding the theoretical sources of substructure in galaxy clusters. I perform a series of simulations using a variety of power-law initial conditions to generate our cluster data sets. From there I use the statistical methods developed by Bird to analyze the subsequent survival of the sub-structures. I find that for a high-omega universe a significant number of clusters should exhibit sub-structure for very long periods of time after their formation. To test whether the sub-structure results are dependent upon the resolution of the N-body code, I develop a nested-grid code for use in performing high-resolution studies of gravitational instability. In the next section I present an N-body code which features nested-grid technology. This nested-grid method allows me to extend both the force and mass resolution of a traditional particle-mesh type code. This code will prove extremely useful for studying problems in large-scale structure formation where one is focusing on highly non-linear objects, and hence force and mass resolution are at a premium.
In this chapter I

  1. General circulation and thermal structure simulated by a Venus AGCM with a two-stream radiative code

    NASA Astrophysics Data System (ADS)

    Yamamoto, Masaru; Ikeda, Kohei; Takahashi, Masaaki

    2016-10-01

An atmospheric general circulation model (AGCM) is expected to be a powerful tool for understanding Venus climate and atmospheric dynamics. At the present stage, however, the full-physics model is under development. Ikeda (2011) developed a two-stream radiative transfer code, which covers the solar to infrared radiative processes due to gases and aerosol particles. The radiative code was applied to the Venus AGCM (T21L52) at the Atmosphere and Ocean Research Institute, Univ. Tokyo. We analyzed the results of a simulation of a few Venus days that was restarted after nudging the zonal wind to a super-rotating state until equilibrium. The simulated thermal structure has a low-stability layer around 10^5 Pa at low latitudes, and the neutral stability extends from ~10^5 Pa to the lower atmosphere at high latitudes. At the equatorial cloud top, the temperature is lower in the region between noon and the evening terminator. For zonal and meridional winds, differences between the zonal and day-side means are seen. As indicated in previous works, the day-side mean meridional wind speed mostly corresponds to the poleward component of the thermal tide and is much higher than the zonal mean. Toward understanding the dynamical roles of waves in UV cloud tracking and brightness, we calculated the eddy heat and momentum fluxes averaged over the day-side hemisphere. The eddy heat and momentum fluxes are poleward on the poleward flank of the jet. In contrast, the fluxes are relatively weak and equatorward at low latitudes. The eddy momentum flux becomes equatorward in the dynamical situation in which the simulated equatorial wind is weaker than the midlatitude jet. The sensitivity to the zonal flow used for the nudging will also be discussed in the model validation.
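As a minimal illustration of the two-stream idea (a textbook limiting case, not the Ikeda (2011) code): the albedo of a semi-infinite, isotropically scattering layer with single-scattering albedo w is R = (1 - sqrt(1 - w)) / (1 + sqrt(1 - w)).

```python
import math

# Two-stream albedo of a semi-infinite, isotropically scattering layer.
# w = single-scattering albedo (fraction of extinction that is scattering).
def two_stream_albedo(w):
    s = math.sqrt(1.0 - w)
    return (1.0 - s) / (1.0 + s)

print(round(two_stream_albedo(0.9), 3))   # 0.519
```

A conservative layer (w = 1) reflects everything; a non-scattering one (w = 0) reflects nothing, which is a quick sanity check on the formula.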

  2. An Open-source Neutrino Radiation Hydrodynamics Code for Core-collapse Supernovae

    NASA Astrophysics Data System (ADS)

    O'Connor, Evan

    2015-08-01

    We present an open-source update to the spherically symmetric, general-relativistic hydrodynamics, core-collapse supernova (CCSN) code GR1D. The source code is available at http://www.GR1Dcode.org. We extend its capabilities to include a general-relativistic treatment of neutrino transport based on the moment formalisms of Shibata et al. and Cardall et al. We pay special attention to implementing and testing numerical methods and approximations that lessen the computational demand of the transport scheme by removing the need to invert large matrices. This is especially important for the implementation and development of moment-like transport methods in two and three dimensions. A critical component of neutrino transport calculations is the neutrino-matter interaction coefficients that describe the production, absorption, scattering, and annihilation of neutrinos. In this article we also describe our open-source neutrino interaction library NuLib (available at http://www.nulib.org). We believe that an open-source approach to describing these interactions is one of the major steps needed to progress toward robust models of CCSNe and robust predictions of the neutrino signal. We show, via comparisons to full Boltzmann neutrino-transport simulations of CCSNe, that our neutrino transport code performs remarkably well. Furthermore, we show that the methods and approximations we employ to increase efficiency do not decrease the fidelity of our results. We also test the ability of our general-relativistic transport code to model failed CCSNe by evolving a 40-solar-mass progenitor to the onset of collapse to a black hole.
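The moment-closure idea at the heart of such transport schemes can be sketched with an analytic Eddington factor: close the moment hierarchy by expressing chi as a function of the flux factor f = |F|/(cE). The Levermore closure below is one common choice, shown for illustration rather than as GR1D's exact prescription.

```python
import math

# Levermore M1 closure: Eddington factor chi(f) interpolating between the
# diffusion limit (f = 0, chi = 1/3) and free streaming (f = 1, chi = 1).
def eddington_factor(f):
    return (3.0 + 4.0 * f * f) / (5.0 + 2.0 * math.sqrt(4.0 - 3.0 * f * f))

assert abs(eddington_factor(0.0) - 1.0 / 3.0) < 1e-12   # diffusion limit
assert abs(eddington_factor(1.0) - 1.0) < 1e-12         # free streaming
print(eddington_factor(0.5))
```

Closures like this are what remove the need to solve (and invert) the full angular problem at every step.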

  4. On the Green's function of the partially diffusion-controlled reversible ABCD reaction for radiation chemistry codes

    SciTech Connect

    Plante, Ianik; Devroye, Luc

    2015-09-15

Several computer codes simulating chemical reactions in particle systems are based on the Green's functions of the diffusion equation (GFDE). Indeed, many types of chemical systems have been simulated using the exact GFDE, which has also become the gold standard for validating other theoretical models. In this work, a simulation algorithm is presented to sample the interparticle distance for the partially diffusion-controlled reversible ABCD reaction. This algorithm is considered exact for two-particle systems, is faster than conventional look-up tables and uses only a few kilobytes of memory. The simulation results obtained with this method are compared with those obtained with the independent reaction times (IRT) method. This work is part of our effort in developing models to understand the role of chemical reactions in radiation effects on cells and tissues and may eventually be included in event-based models of space radiation risks. Moreover, since many reactions in biological systems are of this type, this algorithm might play a pivotal role in future simulation programs, not only in radiation chemistry but also in the simulation of biochemical networks in time and space.
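As a much-simplified stand-in for the sampling problem (free 3D relative diffusion with no reaction, rather than the paper's partially diffusion-controlled reversible case), one can draw a new inter-particle separation by adding a Gaussian displacement per axis and check the result against the analytic second moment:

```python
import random, math

# Free relative diffusion: after time dt, each axis of the separation vector
# gains a Gaussian displacement with variance 2*D*dt. The initial separation
# r0 is taken along the x-axis.
def sample_separation(r0, D, dt, rng=random):
    s = math.sqrt(2.0 * D * dt)
    x = r0 + rng.gauss(0.0, s)
    y = rng.gauss(0.0, s)
    z = rng.gauss(0.0, s)
    return math.sqrt(x * x + y * y + z * z)

random.seed(0)
samples = [sample_separation(1.0, 1.0, 0.01) for _ in range(50_000)]
mean_sq = sum(r * r for r in samples) / len(samples)
# Analytically, E[r^2] = r0^2 + 6*D*dt = 1.06 for these parameters.
print(mean_sq)
```

The paper's algorithm replaces this free-space Green's function with the much harder reactive one while keeping the same draw-a-new-distance structure.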

  6. FY05 LDRD Final Report Molecular Radiation Biodosimetry LDRD Project Tracking Code: 04-ERD-076

    SciTech Connect

    Jones, I M; A.Coleman, M; Lehmann, J; Manohar, C F; Marchetti, F; Mariella, R; Miles, R; Nelson, D O; Wyrobek, A J

    2006-02-03

In the event of a nuclear or radiological accident or terrorist event, it is important to identify individuals who can benefit from prompt medical care and to reassure those who do not need it. Achieving these goals will maximize the ability to manage the medical consequences of radiation exposure that unfold over a period of hours, days, weeks, or years, depending on dose. Medical interventions that reduce near-term morbidity and mortality from high but non-lethal exposures require advanced medical support and must be focused on those in need as soon as possible. There are two traditional approaches to radiation dosimetry, physical and biological. Each as currently practiced has strengths and limitations. Physical dosimetry for radiation exposure is routine for selected sites and for individual nuclear workers in certain industries, medical centers and research institutions. No monitoring of individuals in the general population is currently performed. When physical dosimetry is available at the time of an accident/event or soon thereafter, it can provide valuable information in support of accident/event triage. Lack of data for most individuals is a major limitation, as differences in exposure can be significant due to shielding, atmospherics, etc. A smaller issue in terms of the number of people affected is that the same dose may have more or less biological effect on subsets of the population. Biological dosimetry is the estimation of exposure based on physiological or cellular alterations induced in an individual by radiation. The best established and most precise biodosimetric methods are measurement of the decline of blood cells over time and measurement of the frequency of chromosome aberrations. In accidents or events affecting small numbers of people, it is practical to allocate the resources and time (days of clinical follow-up or specialist laboratory time) to conduct these studies. However, if large numbers of people have been exposed, or fear they may have

  7. PORTA: A Massively Parallel Code for 3D Non-LTE Polarized Radiative Transfer

    NASA Astrophysics Data System (ADS)

    Štěpán, J.

    2014-10-01

The interpretation of the Stokes profiles of the solar (stellar) spectral line radiation requires solving a non-LTE radiative transfer problem that can be very complex, especially when the main interest lies in modeling the linear polarization signals produced by scattering processes and their modification by the Hanle effect. One of the main difficulties is due to the fact that the plasma of a stellar atmosphere can be highly inhomogeneous and dynamic, which implies the need to solve the non-equilibrium problem of generation and transfer of polarized radiation in realistic three-dimensional stellar atmospheric models. Here we present PORTA, a computer program we have developed for solving, in three-dimensional (3D) models of stellar atmospheres, the problem of the generation and transfer of spectral line polarization taking into account anisotropic radiation pumping and the Hanle and Zeeman effects in multilevel atoms. The numerical method of solution is based on a highly convergent iterative algorithm, whose convergence rate is insensitive to the grid size, and on an accurate short-characteristics formal solver of the Stokes-vector transfer equation which uses monotonic Bézier interpolation. In addition to the iterative method and the 3D formal solver, another important feature of PORTA is a novel parallelization strategy suitable for taking advantage of massively parallel computers. Linear scaling of the solution with the number of processors allows the solution time to be reduced by several orders of magnitude. We present useful benchmarks and a few illustrations of applications using a 3D model of the solar chromosphere resulting from MHD simulations. Finally, we present our conclusions with a view to future research. For more details see Štěpán & Trujillo Bueno (2013).

  8. Code System for Calculating Radiation Exposure to Man from Routine Release of Nuclear Reactor Liquid Effluents.

    SciTech Connect

    1980-02-29

    Version 00 LADTAP II calculates the radiation exposure to man from potable water, aquatic foods, shoreline deposits, swimming, boating, and irrigated foods, and also the dose to biota. Doses are calculated for both the maximum individual and for the population and are summarized for each pathway by age group and organ. It also calculates the doses to certain representative biota other than man in the aquatic environment such as fish, invertebrates, algae, muskrats, raccoons, herons, and ducks using models presented in WASH-1258.
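The pathway bookkeeping described above reduces to a concentration × usage × dose-factor sum over exposure pathways; all numeric values below are illustrative placeholders, not LADTAP II defaults:

```python
# Per-pathway dose = nuclide concentration * annual usage * dose factor,
# summed over pathways. Values are illustrative placeholders only.
pathways = {
    # pathway: (concentration [Bq/L or Bq/kg], annual usage [L or kg],
    #           ingestion dose factor [Sv/Bq])
    "potable water": (0.5, 730.0, 2.0e-9),
    "aquatic foods": (3.0,  20.0, 2.0e-9),
    "shoreline":     (0.1, 100.0, 1.0e-9),
}
dose = sum(c * u * f for c, u, f in pathways.values())
print(dose)   # total annual individual dose, Sv
```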

  9. MagRad: A code to optimize the operation of superconducting magnets in a radiation environment

    SciTech Connect

    Yeaw, Christopher T.

    1995-01-01

    A powerful computational tool, called MagRad, has been developed which optimizes magnet design for operation in radiation fields. Specifically, MagRad has been used for the analysis and design modification of the cable-in-conduit conductors of the TF magnet systems in fusion reactor designs. Since the TF magnets must operate in a radiation environment which damages the material components of the conductor and degrades their performance, the optimization of conductor design must account not only for start-up magnet performance, but also shut-down performance. The degradation in performance consists primarily of three effects: reduced stability margin of the conductor; a transition out of the well-cooled operating regime; and an increased maximum quench temperature attained in the conductor. Full analysis of the magnet performance over the lifetime of the reactor includes: radiation damage to the conductor, stability, protection, steady state heat removal, shielding effectiveness, optimal annealing schedules, and finally costing of the magnet and reactor. Free variables include primary and secondary conductor geometric and compositional parameters, as well as fusion reactor parameters. A means of dealing with the radiation damage to the conductor, namely high temperature superconductor anneals, is proposed, examined, and demonstrated to be both technically feasible and cost effective. Additionally, two relevant reactor designs (ITER CDA and ARIES-II/IV) have been analyzed. Upon addition of pure copper strands to the cable, the ITER CDA TF magnet design was found to be marginally acceptable, although much room for both performance improvement and cost reduction exists. A cost reduction of 10-15% of the capital cost of the reactor can be achieved by adopting a suitable superconductor annealing schedule. In both of these reactor analyses, the performance predictive capability of MagRad and its associated costing techniques have been demonstrated.

  10. NUSTART: A PC code for NUclear STructure And Radiative Transition analysis and supplementation

    SciTech Connect

    Larsen, G.L.; Gardner, D.G.; Gardner, M.A.

    1990-10-01

NUSTART is a computer program for the IBM PC/AT. It is designed for use with the nuclear reaction cross-section code STAPLUS, which is a STAPRE-based CRAY computer code that is being developed at Lawrence Livermore National Laboratory. The NUSTART code was developed to handle large sets of discrete nuclear levels and the multipole transitions among these levels; it operates in three modes. The Data File Error Analysis mode analyzes an existing STAPLUS input file containing the levels and their multipole transition branches for a number of physics and/or typographical errors. The Interactive Data File Generation mode allows the user to create input files of discrete levels and their branching fractions in the format required by STAPLUS, even though the user enters the information in the (different) format used by many people in the nuclear structure field. In the Branching Fractions Calculations mode, the discrete nuclear level set is read, and the multipole transitions among the levels are computed under one of two possible assumptions: (1) the levels have no collective character, or (2) the levels are all rotational band heads. Only E1, M1, and E2 transitions are considered, and the respective strength functions may be constants or, in the case of E1 transitions, the strength function may be energy dependent. The first option is used for nuclei near closed shells; the bandhead option may be used to vary the E1, M1, and E2 strengths for interband transitions. K-quantum number selection rules may be invoked if desired. 19 refs.
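Under the constant-strength-function assumption, branching fractions follow from partial widths that scale as S_XL * E_gamma^(2L+1) (L = 1 for E1 and M1, L = 2 for E2). The sketch below uses arbitrary illustrative strengths, not NUSTART's values:

```python
# Branching fractions from single-particle-like partial widths
# Gamma ~ S_XL * E_gamma^(2L+1), with constant (illustrative) strengths.
def branching_fractions(transitions):
    # transitions: list of (E_gamma_MeV, multipole), multipole in {E1, M1, E2}
    strength = {"E1": 1.0, "M1": 0.3, "E2": 0.05}   # arbitrary constants
    L = {"E1": 1, "M1": 1, "E2": 2}
    widths = [strength[m] * e ** (2 * L[m] + 1) for e, m in transitions]
    total = sum(widths)
    return [w / total for w in widths]

fracs = branching_fractions([(1.2, "E1"), (0.8, "M1"), (1.5, "E2")])
assert abs(sum(fracs) - 1.0) < 1e-12
print(fracs)
```

The E_gamma^(2L+1) energy dependence is why higher-multipole branches are strongly suppressed at low transition energies.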

  11. A Computer Code to Calculate Emission and Transmission of Infrared Radiation through Non-Equilibrium Atmospheres.

    DTIC Science & Technology

    1983-07-08


  12. FESTR: Finite-Element Spectral Transfer of Radiation spectroscopic modeling and analysis code

    DOE PAGES

    Hakel, Peter

    2016-10-01

    Here we report on the development of a new spectral postprocessor of hydrodynamic simulations of hot, dense plasmas. Based on given time histories of one-, two-, and three-dimensional spatial distributions of materials, and their local temperature and density conditions, spectroscopically-resolved signals are computed. The effects of radiation emission and absorption by the plasma on the emergent spectra are simultaneously taken into account. This program can also be used independently of hydrodynamic calculations to analyze available experimental data with the goal of inferring plasma conditions.

  13. Impact of differences in the solar irradiance spectrum on surface reflectance retrieval with different radiative transfer codes

    NASA Technical Reports Server (NTRS)

    Staenz, K.; Williams, D. J.; Fedosejevs, G.; Teillet, P. M.

    1995-01-01

    Surface reflectance retrieval from imaging spectrometer data as acquired with the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) has become important for quantitative analysis. In order to calculate surface reflectance from remotely measured radiance, radiative transfer codes such as 5S and MODTRAN2 play an increasing role for removal of scattering and absorption effects of the atmosphere. Accurate knowledge of the exo-atmospheric solar irradiance (E(sub 0)) spectrum at the spectral resolution of the sensor is important for this purpose. The present study investigates the impact of differences in the solar irradiance function, as implemented in a modified version of 5S (M5S), 6S, and MODTRAN2, and as proposed by Green and Gao, on the surface reflectance retrieved from AVIRIS data. Reflectance measured in situ is used as a basis of comparison.
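The sensitivity being tested can be seen in the simplest (no-atmosphere) form of the retrieval, rho = pi * L / (E0 * cos(theta_s)): any bias in the adopted solar irradiance E0 maps directly into the retrieved reflectance. This is a sketch of the dependence only, not the 5S/6S/MODTRAN2 formulation, which also corrects for atmospheric scattering and absorption.

```python
import math

# Simplest reflectance retrieval, ignoring the atmosphere entirely:
# rho = pi * L / (E0 * cos(sun zenith)). Units: L and E0 consistent.
def reflectance(L, E0, sun_zenith_deg):
    return math.pi * L / (E0 * math.cos(math.radians(sun_zenith_deg)))

rho_a = reflectance(L=40.0, E0=1000.0, sun_zenith_deg=30.0)
rho_b = reflectance(L=40.0, E0=1030.0, sun_zenith_deg=30.0)  # E0 3% higher
print(rho_a, rho_b)   # rho scales inversely with the adopted E0
```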

  14. INTDOS: a computer code for estimating internal radiation dose using recommendations of the International Commission on Radiological Protection

    SciTech Connect

    Ryan, M.T.

    1981-09-01

    INTDOS is a user-oriented computer code designed to calculate estimates of internal radiation dose commitment resulting from the acute inhalation intake of various radionuclides. It is designed so that users unfamiliar with the details of such calculations can obtain results by answering a few questions regarding the exposure case. The user must identify the radionuclide name, solubility class, particle size, time since exposure, and the measured lung burden. INTDOS calculates the fraction of the lung burden remaining at time t post-exposure, considering the solubility class and particle size information. From the fraction remaining in the lung at time t, the quantity inhaled is estimated. Radioactive decay is accounted for in the estimate. Finally, effective committed dose equivalents to various organs and tissues of the body are calculated using inhalation committed dose factors presented by the International Commission on Radiological Protection (ICRP). This computer code was written for execution on a Digital Equipment Corporation PDP-10 computer and is written in Fortran IV. A flow chart and example calculations are discussed in detail to aid the user who is unfamiliar with computer operations.
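    The back-calculation the abstract describes can be sketched in a few lines. This is an illustrative simplification, not the ICRP lung model INTDOS implements: lung retention is reduced to a single exponential, and all numerical values are hypothetical.

```python
import math

def committed_dose(lung_burden_bq, t_days, bio_halflife_days,
                   rad_halflife_days, dose_factor_sv_per_bq):
    """Back-calculate intake from a measured lung burden at time t, then apply
    an ICRP-style committed-dose factor. Single-exponential retention is an
    illustrative stand-in for the full ICRP lung model."""
    lam = math.log(2) / bio_halflife_days + math.log(2) / rad_halflife_days
    fraction_remaining = math.exp(-lam * t_days)   # biological clearance + decay
    intake_bq = lung_burden_bq / fraction_remaining
    return intake_bq * dose_factor_sv_per_bq

# hypothetical case: 500 Bq measured 30 days post-exposure
dose_sv = committed_dose(500.0, 30.0, 500.0, 1.2e4, 1.0e-5)
```

    The key step is dividing the measured burden by the retained fraction: the longer after exposure the measurement is made, the larger the inferred intake (and dose) for the same measured burden.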

  15. Development of parallel monte carlo electron and photon transport (PMCEPT) code III: Applications to medical radiation physics

    NASA Astrophysics Data System (ADS)

    Kum, Oyeon; Han, Youngyih; Jeong, Hae Sun

    2012-05-01

    Minimizing the differences between dose distributions calculated at the treatment planning stage and those delivered to the patient is an essential requirement for successful radiotherapy. Accurate calculation of dose distributions in the treatment planning process is important and can be done only by using a Monte Carlo calculation of particle transport. In this paper, we perform a further validation of our previously developed parallel Monte Carlo electron and photon transport (PMCEPT) code [Kum and Lee, J. Korean Phys. Soc. 47, 716 (2005) and Kim and Kum, J. Korean Phys. Soc. 49, 1640 (2006)] for applications to clinical radiation problems. A linear accelerator, Siemens' Primus 6 MV, was modeled and commissioned. A thorough validation includes both small fields, closely related to the intensity modulated radiation treatment (IMRT), and large fields. Two-dimensional comparisons with film measurements were also performed. The PMCEPT results, in general, agreed well with the measured data within a maximum error of about 2%. Considering the experimental errors, the PMCEPT results can provide the gold standard of dose distributions for radiotherapy. The computing time was also much faster, compared to that needed for experiments, although it is still a bottleneck for direct applications to the daily routine treatment planning procedure.

  16. Development of radiative transfer code for JUICE/SWI mission toward the atmosphere of icy moons of Jupiter

    NASA Astrophysics Data System (ADS)

    Yamada, Takayoshi; Kasai, Yasuko; Yoshida, Naohiro

    2016-07-01

    The Submillimeter Wave Instrument (SWI) is one of the scientific instruments on the JUpiter Icy moon Explorer (JUICE). We plan to observe atmospheric compositions including water vapor and its isotopomers in the Galilean moons (Io, Europa, Ganymede, and Callisto). The frequency windows of SWI are 530 to 625 GHz and 1080 to 1275 GHz with 100 kHz spectral resolution. We are developing a radiative transfer code in Japan with a line-by-line method for the Ganymede atmosphere in the THz region (0 - 3 THz). Molecular line parameters (line intensity and partition function) were taken from the JPL (Jet Propulsion Laboratory) catalogue. A pencil beam was assumed to calculate a spectrum of H_{2}O and CO in rotational transitions at the THz region. We performed comparisons between our model and ARTS (Atmospheric Radiative Transfer Simulator). The differences were less than 10% and 5% for H_{2}O and CO, respectively, under the condition of local thermodynamic equilibrium (LTE). Comparison with several models with a non-LTE assumption will be presented.

  17. Coupling External Radiation Transport Code Results to the GADRAS Detector Response Function

    SciTech Connect

    Mitchell, Dean J.; Thoreson, Gregory G.; Horne, Steven M.

    2014-01-01

    Simulating gamma spectra is useful for analyzing special nuclear materials. Gamma spectra are influenced not only by the source and the detector, but also by the external, and potentially complex, scattering environment. The scattering environment can make accurate representations of gamma spectra difficult to obtain. By coupling the Monte Carlo N-Particle (MCNP) code with the Gamma Detector Response and Analysis Software (GADRAS) detector response function, gamma spectrum simulations can be computed with a high degree of fidelity even in the presence of a complex scattering environment. Traditionally, GADRAS represents the external scattering environment with empirically derived scattering parameters. By modeling the external scattering environment in MCNP and using the results as input for the GADRAS detector response function, gamma spectra can be obtained with a high degree of fidelity. This method was verified with experimental data obtained in an environment with a significant amount of scattering material. The experiment used both gamma-emitting sources and moderated and bare neutron-emitting sources. The sources were modeled using GADRAS and MCNP in the presence of the external scattering environment, producing accurate representations of the experimental data.

  18. Parameterization of the level-resolved radiative recombination rate coefficients for the SPEX code

    NASA Astrophysics Data System (ADS)

    Mao, Junjie; Kaastra, Jelle

    2016-03-01

    The level-resolved radiative recombination (RR) rate coefficients for H-like to Na-like ions from H (Z = 1) up to and including Zn (Z = 30) are studied here. For H-like ions, the quantum-mechanical exact photoionization cross sections for nonrelativistic hydrogenic systems are used to calculate the RR rate coefficients under the principle of detailed balance, while for He-like to Na-like ions, the archival data on ADAS are adopted. Parameterizations are made for the direct capture rates over a wide temperature range. The fitting accuracies are better than 5% for about 99% of the ~3 × 104 levels considered here. The ~1% exceptions include levels from low-charged many-electron ions and/or high-shell (n ≳ 4) levels, which are less important in terms of interpreting X-ray emitting astrophysical plasmas. The RR data will be incorporated into the high-resolution spectral analysis package SPEX. Results of the parameterizations are only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/587/A84
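    A fitting-accuracy criterion like the "better than 5%" figure quoted above amounts to checking the worst-case relative deviation of the parameterized rates from the tabulated ones over the temperature grid. A small sketch (the rate values are invented, not ADAS data):

```python
def max_relative_error(fitted, tabulated):
    """Worst-case relative deviation of a parameterized fit from tabulated
    rate coefficients, evaluated pointwise over a temperature grid."""
    return max(abs(f - t) / t for f, t in zip(fitted, tabulated))

# hypothetical rate coefficients (cm^3 s^-1) on a three-point temperature grid
tab = [1.00e-13, 5.00e-14, 2.00e-14]
fit = [1.02e-13, 4.92e-14, 2.05e-14]
ok = max_relative_error(fit, tab) < 0.05   # the "better than 5%" screen
```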

  19. Radiation Transport Calculations of a Simple Structure Using the Vehicle Code System with 69-Group Cross Sections and the Monte-Carlo Neutron and Photon Code

    DTIC Science & Technology

    1989-08-01

    [Reference-list fragment] ... and L.M. Petrie, Vehicle Code System (VCS) User's Manual, Oak Ridge National Laboratory, ORNL-TM-4648 (1974). (UNCLASSIFIED) 3. F.R. Mynatt, F.J. Muckenthaler and P.N. ...

  20. Methods for Ensuring High Quality of Coding of Cause of Death. The Mortality Register to Follow Southern Urals Populations Exposed to Radiation.

    PubMed

    Startsev, N; Dimov, P; Grosche, B; Tretyakov, F; Schüz, J; Akleyev, A

    2015-01-01

    To follow up populations exposed to several radiation accidents in the Southern Urals, a cause-of-death registry was established at the Urals Center capturing deaths in the Chelyabinsk, Kurgan and Sverdlovsk regions since 1950. When registering deaths over such a long period, measures need to be in place to maintain coding quality and to reduce the impact of individual coders as well as of quality changes in death certificates. To ensure the uniformity of coding, a method for semi-automatic coding was developed, which is described here. Briefly, the method is based on a dynamic thesaurus, database-supported coding, and parallel coding by two different individuals. A comparison of the proposed method for organizing the coding process with the common procedure of coding showed good agreement, with 70-90% agreement for the three-digit ICD-9 rubrics at the end of the coding process. The semi-automatic method ensures a sufficiently high quality of coding while at the same time providing an opportunity to reduce the labor intensity inherent in the creation of large-volume cause-of-death registries.
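    The parallel-coding quality check reduces to comparing two coders' assignments at the three-digit rubric level. A sketch with hypothetical ICD-9 codes (the registry's actual data and thesaurus logic are not shown):

```python
def three_digit_agreement(codes_a, codes_b):
    """Agreement rate between two independent coders at the three-digit
    ICD rubric level: compare only the first three characters of each code."""
    pairs = list(zip(codes_a, codes_b))
    same = sum(a[:3] == b[:3] for a, b in pairs)
    return same / len(pairs)

# hypothetical ICD-9 assignments from two coders for five certificates
coder_a = ["4109", "1629", "4919", "250", "4280"]
coder_b = ["4101", "1509", "4918", "2500", "4281"]
rate = three_digit_agreement(coder_a, coder_b)
```

    Comparing at three digits rather than the full code deliberately tolerates fourth-digit disagreements, which is why it is a natural unit for the 70-90% figure quoted above.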

  1. US medical researchers, the Nuremberg Doctors Trial, and the Nuremberg Code. A review of findings of the Advisory Committee on Human Radiation Experiments.

    PubMed

    Faden, R R; Lederer, S E; Moreno, J D

    1996-11-27

    The Advisory Committee on Human Radiation Experiments (ACHRE), established to review allegations of abuses of human subjects in federally sponsored radiation research, was charged with identifying appropriate standards to evaluate the ethics of cold war radiation experiments. One central question for ACHRE was to determine what role, if any, the Nuremberg Code played in the norms and practices of US medical researchers. Based on the evidence from ACHRE's Ethics Oral History Project and extensive archival research, we conclude that the Code, at the time it was promulgated, had little effect on mainstream medical researchers engaged in human subjects research. Although some clinical investigators raised questions about the conduct of research involving human beings, the medical profession did not pursue this issue until the 1960s.

  2. A Multigroup diffusion solver using pseudo transient continuation for a radiation-hydrodynamic code with patch-based AMR

    SciTech Connect

    Shestakov, A I; Offner, S R

    2006-09-21

    We present a scheme to solve the nonlinear multigroup radiation diffusion (MGD) equations. The method is incorporated into a massively parallel, multidimensional, Eulerian radiation-hydrodynamic code with adaptive mesh refinement (AMR). The patch-based AMR algorithm refines in both space and time, creating a hierarchy of levels, coarsest to finest. The physics modules are time-advanced using operator splitting. On each level, separate 'level-solve' packages advance the modules. Our multigroup level-solve adapts an implicit procedure which leads to a two-step iterative scheme that alternates between elliptic solves for each group with intra-cell group coupling. For robustness, we introduce pseudo transient continuation (Ψtc). We analyze the magnitude of the Ψtc parameter to ensure positivity of the resulting linear system, diagonal dominance, and convergence of the two-step scheme. For AMR, a level defines a subdomain for refinement. For diffusive processes such as MGD, the refined level uses Dirichlet boundary data at the coarse-fine interface, and the data is derived from the coarse level solution. After advancing on the fine level, an additional procedure, the sync-solve (SS), is required in order to enforce conservation. The MGD SS reduces to an elliptic solve on a combined grid for a system of G equations, where G is the number of groups. We adapt the 'partial temperature' scheme for the SS; hence, we reuse the infrastructure developed for scalar equations. Results are presented. We consider a multigroup test problem with a known analytic solution. We demonstrate the utility of Ψtc by running with increasingly larger timesteps. Lastly, we simulate the sudden release of energy Y inside an Al sphere (r = 15 cm) suspended in air at STP. For Y = 11 kT, we find that gray radiation diffusion and MGD produce similar results. However, if Y = 1 MT, the two packages yield different results. Our large Y simulation contradicts a long-standing theory and demonstrates
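    The essence of pseudo transient continuation is to augment the linear system's diagonal with a 1/dtau term, mimicking a time derivative: a small pseudo-timestep makes the matrix diagonally dominant (hence more robustly solvable), and dtau is then grown toward the steady solve. A toy sketch, not the paper's actual MGD system:

```python
import numpy as np

def apply_ptc(a, dtau):
    """Pseudo transient continuation: add 1/dtau to the system diagonal,
    mimicking a time-derivative term. Illustrative sketch only."""
    return a + np.eye(a.shape[0]) / dtau

def diag_dominant(m):
    """Check weak diagonal dominance row by row."""
    n = m.shape[0]
    return all(abs(m[i, i]) >= sum(abs(m[i, j]) for j in range(n) if j != i)
               for i in range(n))

a = np.array([[ 1.0, -2.0],
              [-0.5,  1.0]])     # toy matrix, not diagonally dominant
b = apply_ptc(a, 0.5)            # adds 2.0 to each diagonal entry
```

    Choosing the Ψtc parameter is the analysis the abstract refers to: too large a dtau gives no robustness benefit, while a finite dtau restores dominance at the cost of extra pseudo-time iterations.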

  3. A simulation study of the effect of drift electric fields on the response of radiation detectors using the PENELOPE code

    NASA Astrophysics Data System (ADS)

    Távora, L. M. N.; Dias, T. H. V. T.; Conde, C. A. N.

    2006-06-01

    The effect of the presence of a drift electric field on the response of gaseous and semiconductor radiation detectors to energetic X-rays (energies Eph from 20 to 200 keV) is investigated using the PENELOPE code to simulate the photo-absorption and the slow-down of the electrons produced in Si, Ge, and Xe gas at 1 atm. For typical drift fields, the energy Ed deposited in the detection media is calculated taking into account the energy exchanged by the electrons with the field. The analysis of the calculated Ed distributions shows that the effect of the field on the distributions is negligible in Si and Ge semiconductor detectors, but not in Xe gas detectors, where for E/p=0.8 V cm-1 Torr-1 the fluctuations introduced by the field for Eph≈180 keV approach the intrinsic values for Xe, and the intrinsic discontinuity in linearity when Eph crosses the Xe K-edge (34.56 keV) is further reduced by ≈4%. The simulation data also suggest that this field effect may cause some deviations to the expected Gaussian response of Xe detectors to the absorption of monoenergetic photons.

  4. Filling-In, Spatial Summation, and Radiation of Pain: Evidence for a Neural Population Code in the Nociceptive System

    PubMed Central

    Quevedo, Alexandre S.

    2009-01-01

    The receptive field organization of nociceptive neurons suggests that noxious information may be encoded by population-based mechanisms. Electrophysiological evidence of population coding mechanisms has remained limited. However, psychophysical studies examining interactions between multiple noxious stimuli can provide indirect evidence that neuron population recruitment can contribute to both spatial and intensity-related percepts of pain. In the present study, pairs of thermal stimuli (35°C/49°C or 49°C/49°C) were delivered at different distances on the leg (0, 5, 10, 20, 40 cm) and abdomen (within and across dermatomes) and subjects evaluated pain intensity and perceived spatial attributes of stimuli. Reports of perceived pain spreading to involve areas that were not stimulated (radiation of pain) were most frequent at 5- and 10-cm distances (χ2 = 34.107, P < 0.0001). Perceived connectivity between two noxious stimuli (filling-in) was influenced by the distance between stimuli (χ2 = 16.756, P < 0.01), with the greatest connectivity reported at 5- and 10-cm separation distances. Spatial summation of pain occurred over probe separation distances as large as 40 cm and six dermatomes (P < 0.05), but was maximal at 5- and 10-cm separation distances. Taken together, all three of these phenomena suggest that interactions between recruited populations of neurons may support both spatial and intensity-related dimensions of the pain experience. PMID:19759320

  5. ZEUS-2D: A Radiation Magnetohydrodynamics Code for Astrophysical Flows in Two Space Dimensions. II. The Magnetohydrodynamic Algorithms and Tests

    NASA Astrophysics Data System (ADS)

    Stone, James M.; Norman, Michael L.

    1992-06-01

    In this, the second of a series of three papers, we continue a detailed description of ZEUS-2D, a numerical code for the simulation of fluid dynamical flows in astrophysics including a self-consistent treatment of the effects of magnetic fields and radiation transfer. In this paper, we give a detailed description of the magnetohydrodynamical (MHD) algorithms in ZEUS-2D. The recently developed constrained transport (CT) algorithm is implemented for the numerical evolution of the components of the magnetic field for MHD simulations. This formalism guarantees the numerically evolved field components will satisfy the divergence-free constraint at all times. We find, however, that the method used to compute the electromotive forces must be chosen carefully to propagate accurately all modes of MHD wave families (in particular shear Alfvén waves). A new method of computing the electromotive force is developed using the method of characteristics (MOC). It is demonstrated through the results of an extensive series of MHD test problems that the resulting hybrid MOC-CT method provides for the accurate evolution of all modes of MHD wave families.
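    The divergence-free guarantee of constrained transport can be illustrated on a small 2D staggered mesh: face-centered field components are updated from a single corner-centered EMF array, so every EMF value enters the discrete divergence of a cell twice with opposite signs and cancels exactly. A sketch (Python for illustration only; ZEUS-2D is not a Python code, and the array shapes are invented):

```python
import numpy as np

def ct_update(bx, by, emf, dt, dx, dy):
    """Advance face-centered Bx, By from one corner-centered EMF (Ez) array,
    the essence of constrained transport: adjacent faces share EMF values,
    so the discrete div(B) cannot change."""
    bxn = bx - dt * (emf[:, 1:] - emf[:, :-1]) / dy   # Bx faces: (nx+1, ny)
    byn = by + dt * (emf[1:, :] - emf[:-1, :]) / dx   # By faces: (nx, ny+1)
    return bxn, byn

def div_b(bx, by, dx, dy):
    """Cell-centered discrete divergence on the staggered mesh."""
    return (bx[1:, :] - bx[:-1, :]) / dx + (by[:, 1:] - by[:, :-1]) / dy

# even a random EMF field leaves an initially divergence-free B divergence-free
rng = np.random.default_rng(0)
bx0 = np.zeros((5, 3)); by0 = np.zeros((4, 4))
emf = rng.standard_normal((5, 4))                     # corners: (nx+1, ny+1)
bx1, by1 = ct_update(bx0, by0, emf, 0.1, 1.0, 1.0)
```

    What CT does not fix, as the abstract notes, is the accuracy of the EMF itself, which is why the MOC-CT hybrid computes the electromotive force with a method-of-characteristics step.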

  6. Using the FLUKA Monte Carlo Code to Simulate the Interactions of Ionizing Radiation with Matter to Assist and Aid Our Understanding of Ground Based Accelerator Testing, Space Hardware Design, and Secondary Space Radiation Environments

    NASA Technical Reports Server (NTRS)

    Reddell, Brandon

    2015-01-01

    Designing hardware to operate in the space radiation environment is a very difficult and costly activity. Ground based particle accelerators can be used to test for exposure to the radiation environment, one species at a time, however, the actual space environment cannot be duplicated because of the range of energies and isotropic nature of space radiation. The FLUKA Monte Carlo code is an integrated physics package based at CERN that has been under development for the last 40+ years and includes the most up-to-date fundamental physics theory and particle physics data. This work presents an overview of FLUKA and how it has been used in conjunction with ground based radiation testing for NASA and improve our understanding of secondary particle environments resulting from the interaction of space radiation with matter.

  7. TRHD: Three-temperature radiation-hydrodynamics code with an implicit non-equilibrium radiation transport using a cell-centered monotonic finite volume scheme on unstructured-grids

    NASA Astrophysics Data System (ADS)

    Sijoy, C. D.; Chaturvedi, S.

    2015-05-01

    A three-temperature (3T), unstructured-mesh, non-equilibrium radiation hydrodynamics (RHD) code has been developed for the simulation of intense thermal radiation or high-power laser driven radiative shock hydrodynamics in two-dimensional (2D) axis-symmetric geometries. The governing hydrodynamics equations are solved using a compatible unstructured Lagrangian method based on a control volume differencing (CVD) scheme. A second-order predictor-corrector (PC) integration scheme is used for the temporal discretization of the hydrodynamics equations. For the radiation energy transport, a frequency-averaged gray model is used in which the flux-limited diffusion (FLD) approximation is used to recover the free-streaming limit of the radiation propagation in optically thin regions. The proposed RHD model allows different temperatures for the electrons and ions. In addition to this, the electron and thermal radiation temperatures are assumed to be in non-equilibrium. Therefore, the thermal relaxation between the electrons and ions and the coupling between the radiation and matter energies are required to be computed self-consistently. For this, the coupled flux-limited electron heat conduction and the non-equilibrium radiation diffusion equations are solved simultaneously by using an implicit, axis-symmetric, cell-centered, monotonic, nonlinear finite volume (NLFV) scheme. In this paper, we have described the details of the 2D, 3T, non-equilibrium RHD code developed along with a suite of validation test problems to demonstrate the accuracy and performance of the algorithms. We have also conducted a performance analysis with different linearity preserving interpolation schemes that are used for the evaluation of the nodal values in the NLFV scheme. Finally, in order to demonstrate the full capability of the code implementation, we have presented the simulation of laser-driven thin aluminum (Al) foil acceleration. The simulation results are found to be in good agreement
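    Flux-limited diffusion works by inserting a limiter into the diffusive flux so that it reduces to -c/(3 kappa) grad E in optically thick regions but never exceeds the free-streaming bound c*E. A sketch using one common choice, the Levermore-Pomraning limiter (the abstract does not specify which limiter TRHD uses; the numbers are illustrative):

```python
def fld_flux(c, kappa, e, grad_e):
    """Flux-limited diffusion radiation flux with the Levermore-Pomraning
    limiter: recovers -c/(3*kappa)*grad_e when optically thick and caps
    |F| at c*e in the free-streaming (optically thin) limit."""
    r = abs(grad_e) / (kappa * e)              # dimensionless transparency ratio
    lam = (2.0 + r) / (6.0 + 3.0 * r + r * r)  # limiter: 1/3 thick, ~1/r thin
    return -c * lam / kappa * grad_e
```

    As kappa grows, r tends to 0 and lam tends to 1/3 (classical diffusion); as kappa tends to 0, lam tends to 1/r and the flux magnitude saturates at c*e instead of diverging.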

  8. The Havemann-Taylor Fast Radiative Transfer Code (HT-FRTC) and its application within Tactical Decision Aids (TDAs)

    NASA Astrophysics Data System (ADS)

    Thelen, Jean-Claude; Havemann, Stephan; Wong, Gerald

    2015-10-01

    The Havemann-Taylor Fast Radiative Transfer Code (HT-FRTC) is a core component of the Met Office NEON Tactical Decision Aid (TDA). Within NEON, the HT-FRTC has for a number of years been used to predict the infrared apparent thermal contrasts between different surface types as observed by an airborne sensor. To achieve this, the HT-FRTC is supplied with the inherent temperatures and spectral properties of these surfaces (i.e. ground target(s) and backgrounds). A key strength of the HT-FRTC is its ability to take into account the detailed properties of the atmosphere, which in the context of NEON tend to be provided by a Numerical Weather Prediction (NWP) forecast model. While water vapour and ozone are generally the most important gases, additional trace gases are now being incorporated into the HT-FRTC. The HT-FRTC also includes an exact treatment of atmospheric scattering based on spherical harmonics. This allows for the treatment of several different aerosol species and of liquid and ice clouds. Recent developments can even account for rain and falling snow. The HT-FRTC works in Principal Component (PC) space and is trained on a wide variety of atmospheric and surface conditions, which significantly reduces the computational requirements regarding memory and processing time. One clear-sky simulation takes approximately one millisecond at the time of writing. Recent developments allow the training of HT-FRTC to be both completely generalised and sensor independent. This is significant as the user of the code can add new sensors and new surfaces/targets by supplying extra files which contain their (possibly classified) spectral properties. The HT-FRTC has been extended to cover the spectral range of Photopic and NVG sensors. One aim here is to give guidance on the expected, directionally resolved sky brightness, especially at night, again taking the actual or forecast atmospheric conditions into account. Recent developments include light level predictions during

  9. Comparison of the 3D VERB Code Simulations of the Dynamic Evolution of the Outer and Inner Radiation Belts With the Reanalysis Obtained from Observations on Multiple Spacecraft

    NASA Astrophysics Data System (ADS)

    Shprits, Y.; Subbotin, D.; Ni, B.; Daae, M.; Kondrashov, D. A.; Hartinger, M.; Kim, K.; Orlova, K.; Nagai, T.; Friedel, R. H.; Chen, Y.

    2010-12-01

    In this study we present simulations of the inner and outer radiation belts using the Versatile Electron Radiation Belt (VERB) code accounting for radial, pitch-angle, energy, and mixed diffusion. Quasi-linear diffusion coefficients are computed using the Full Diffusion Code (FDC) due to day-side and night-side chorus waves, magnetosonic waves, plasmaspheric hiss waves, EMIC and hiss waves in the regions of plumes, lightning-generated whistlers, and anthropogenic whistlers. Sensitivity simulations show that knowledge of wave spectral properties and the spatial distribution of waves is crucially important for reproducing long-term observations. The 3D VERB code simulations are compared to 3D reanalysis of the radiation belt fluxes obtained by blending the predictive model with observations from LANL GEO, CRRES, Akebono, and GPS. We also discuss the initial results of coupled RCM-VERB simulations. Finally, we present a statistical analysis of radiation belt phase space density obtained from reanalysis to explore sudden dropouts of the radiation belt fluxes and the location of peaks in phase space density. The application of the developed tools to future measurements on board RBSP is discussed.

  10. Spectral longwave emission in the tropics: FTIR measurements at the sea surface and comparison with fast radiation codes

    SciTech Connect

    Lubin, D.; Cutchin, D.; Conant, W.; Grassl, H.; Schmid, U.; Biselli, W.

    1995-02-01

    Longwave emission by the tropical western Pacific atmosphere has been measured at the ocean surface by a Fourier Transform Infrared (FTIR) spectroradiometer deployed aboard the research vessel John Vickers as part of the Central Equatorial Pacific Experiment. The instrument operated throughout a Pacific Ocean crossing, beginning on 7 March 1993 in Honiara, Solomon Islands, and ending on 29 March 1993 in Los Angeles, and recorded longwave emission spectra under atmospheres associated with sea surface temperatures ranging from 291.0 to 302.8 K. Precipitable water vapor abundances ranged from 1.9 to 5.5 column centimeters. Measured emission spectra (downwelling zenith radiance) covered the middle infrared (5-20 {mu}m) with one inverse centimeter spectral resolution. FTIR measurements made under an entirely clear field of view are compared with spectra generated by LOWTRAN 7 and MODTRAN 2, as well as downwelling flux calculated by the NCAR Community Climate Model (CCM-2) radiation code, using radiosonde profiles as input data for these calculations. In the spectral interval 800-1000 cm{sup -1}, these comparisons show a discrepancy between FTIR data and MODTRAN 2 having an overall variability of 6-7 mW m{sup -2} sr{sup -1} cm and a concave shape that may be related to the representation of water vapor continuum emission in MODTRAN 2. Another discrepancy appears in the spectral interval 1200-1300 cm{sup -1}, where MODTRAN 2 appears to overestimate zenith radiance by 5 mW m{sup -2} sr{sup -1} cm. These discrepancies appear consistently; however, they become only slightly larger at the highest water vapor abundances. Because these radiance discrepancies correspond to broadband (500-2000 cm{sup -1}) flux uncertainties of around 3 W m{sup -2}, there appear to be no serious inadequacies with the performance of MODTRAN 2 or LOWTRAN 7 at high atmospheric temperatures and water vapor abundances. 23 refs., 10 figs.

  11. Three dimensional data-assimilative VERB-code simulations of the Earth's radiation belts: Reanalysis during the Van Allen Probe era, and operational forecasting

    NASA Astrophysics Data System (ADS)

    Kellerman, Adam; Shprits, Yuri; Podladchikova, Tatiana; Kondrashov, Dmitri

    2016-04-01

    The Versatile Electron Radiation Belt (VERB) code 2.0 models the dynamics of radiation-belt electron phase space density (PSD) in Earth's magnetosphere. Recently, a data-assimilative version of this code has been developed, which utilizes a split-operator Kalman-filtering approach to solve for electron PSD in terms of adiabatic invariants. A new dataset based on the TS07d magnetic field model is presented, which may be utilized for analysis of past geomagnetic storms, and for initial and boundary conditions in running simulations. Further, a data-assimilative forecast model is introduced, which has the capability to forecast electron PSD several days into the future, given a forecast Kp index. The model also assimilates an empirical model capable of forecasting conditions at geosynchronous orbit. The model currently runs in real time, and a forecast is available online at http://rbm.epss.ucla.edu.
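    In a split-operator Kalman scheme of this kind, each analysis step blends the model forecast with an observation, weighted by their respective uncertainties. A minimal scalar sketch (the real code works on a full 3D adiabatic-invariant grid; the numbers are hypothetical):

```python
def kalman_update(x_model, p_model, y_obs, r_obs):
    """One Kalman analysis step for a single scalar state (e.g. log PSD at one
    grid point): blend the model forecast with an observation, weighting by
    the forecast variance p_model and observation variance r_obs."""
    k = p_model / (p_model + r_obs)         # Kalman gain in [0, 1]
    x_an = x_model + k * (y_obs - x_model)  # analysis state
    p_an = (1.0 - k) * p_model              # analysis variance (reduced)
    return x_an, p_an

# equal trust in model and data pulls the estimate halfway to the observation
x_an, p_an = kalman_update(x_model=2.0, p_model=1.0, y_obs=4.0, r_obs=1.0)
```

    Between analysis steps the model (here VERB's diffusion solve) propagates the state and inflates the variance, which is what makes the reanalysis a genuine model-data blend rather than interpolation of observations.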

  12. Extension of radiative transfer code MOMO, matrix-operator model to the thermal infrared - Clear air validation by comparison to RTTOV and application to CALIPSO-IIR

    NASA Astrophysics Data System (ADS)

    Doppler, Lionel; Carbajal-Henken, Cintia; Pelon, Jacques; Ravetta, François; Fischer, Jürgen

    2014-09-01

    The 1-D radiative transfer code Matrix-Operator Model (MOMO) has been extended from the [0.2-3.65 μm] band to the whole [0.2-100 μm] spectrum. MOMO can now be used for the computation of a full range of radiation budgets (shortwave and longwave). This extension to the longwave part of the spectrum required consideration of radiative transfer processes that are features of the thermal infrared: the spectroscopy of the water vapor self- and foreign-continuum of absorption at 12 μm and the emission of radiation by gases, aerosols, clouds, and the surface. MOMO's spectroscopy module, Coefficient of Gas Absorption (CGASA), has been developed for computation of gas extinction coefficients, considering continua and spectral line absorptions. The spectral dependences of gas emission/absorption coefficients and of Planck's function are treated using a k-distribution. The emission of radiation is implemented in the adding-doubling process of the matrix operator method using Schwarzschild's approach in the radiative transfer equation (a pure absorbing/emitting medium, namely without scattering). Within the layer, the Planck function is assumed to have an exponential dependence on optical depth. In this paper, validation tests are presented for clear air case studies: comparisons to the analytical solution of a monochromatic Schwarzschild's case without scattering show an error of less than 0.07% for a realistic atmosphere with an optical depth and a blackbody temperature that decrease linearly with altitude. Comparisons to the radiative transfer code RTTOV are presented for simulations of top of atmosphere brightness temperature for channels of the space-borne instrument MODIS. Results show an agreement varying from 0.1 K to less than 1 K depending on the channel. Finally MOMO results are compared to CALIPSO Infrared Imager Radiometer (IIR) measurements for clear air cases. A good agreement was found between computed and observed radiance: biases are smaller than 0.5 K
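    Schwarzschild transfer through a single layer with an exponential Planck-function profile can be integrated in closed form, which is the kind of per-layer building block an adding-doubling scheme stacks. An illustrative sketch (not MOMO's implementation; the degenerate case where the exponential rate equals -1 is omitted):

```python
import math

def layer_radiance(i_in, tau, b_top, b_bot):
    """Schwarzschild transfer (absorption/emission only, no scattering) across
    one layer of optical depth tau, with the Planck function varying
    exponentially in optical depth t: B(t) = b_top * (b_bot/b_top)**(t/tau)."""
    if tau == 0.0:
        return i_in                     # transparent layer passes radiance through
    a = math.log(b_bot / b_top) / tau   # exponential rate of B(t)
    # I_out = I_in*e^{-tau} + integral_0^tau B(t) e^{-(tau-t)} dt, analytically:
    emitted = b_top * math.exp(-tau) * (math.exp((a + 1.0) * tau) - 1.0) / (a + 1.0)
    return i_in * math.exp(-tau) + emitted

# isothermal layer: the emitted term reduces to B * (1 - e^{-tau})
iso = layer_radiance(0.0, 1.0, 2.0, 2.0)
```

    The exponential-in-tau assumption is what lets each layer be handled with one analytic expression instead of sub-layer quadrature.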

  13. PORTA: A three-dimensional multilevel radiative transfer code for modeling the intensity and polarization of spectral lines with massively parallel computers

    NASA Astrophysics Data System (ADS)

    Štěpán, Jiří; Trujillo Bueno, Javier

    2013-09-01

    The interpretation of the intensity and polarization of the spectral line radiation produced in the atmosphere of the Sun and of other stars requires solving a radiative transfer problem that can be very complex, especially when the main interest lies in modeling the spectral line polarization produced by scattering processes and the Hanle and Zeeman effects. One of the difficulties is that the plasma of a stellar atmosphere can be highly inhomogeneous and dynamic, which implies the need to solve the non-equilibrium problem of the generation and transfer of polarized radiation in realistic three-dimensional (3D) stellar atmospheric models. Here we present PORTA, an efficient multilevel radiative transfer code we have developed for the simulation of the spectral line polarization caused by scattering processes and the Hanle and Zeeman effects in 3D models of stellar atmospheres. The numerical method of solution is based on the non-linear multigrid iterative method and on a novel short-characteristics formal solver of the Stokes-vector transfer equation which uses monotonic Bézier interpolation. Therefore, with PORTA the computing time needed to obtain at each spatial grid point the self-consistent values of the atomic density matrix (which quantifies the excitation state of the atomic system) scales linearly with the total number of grid points. Another crucial feature of PORTA is its parallelization strategy, which allows us to speed up the numerical solution of complicated 3D problems by several orders of magnitude with respect to sequential radiative transfer approaches, given its excellent linear scaling with the number of available processors. The PORTA code can also be conveniently applied to solve the simpler 3D radiative transfer problem of unpolarized radiation in multilevel systems.

  14. Assessment of shielding analysis methods, codes, and data for spent fuel transport/storage applications. [Radiation dose rates from shielded spent fuels and high-level radioactive waste

    SciTech Connect

    Parks, C.V.; Broadhead, B.L.; Hermann, O.W.; Tang, J.S.; Cramer, S.N.; Gauthey, J.C.; Kirk, B.L.; Roussin, R.W.

    1988-07-01

    This report provides a preliminary assessment of the computational tools and existing methods used to obtain radiation dose rates from shielded spent nuclear fuel and high-level radioactive waste (HLW). Particular emphasis is placed on analysis tools and techniques applicable to facilities/equipment designed for the transport or storage of spent nuclear fuel or HLW. Applications to cask transport, storage, and facility handling are considered. The report reviews the analytic techniques for generating appropriate radiation sources, evaluating the radiation transport through the shield, and calculating the dose at a desired point or surface exterior to the shield. Discrete ordinates, Monte Carlo, and point kernel methods for evaluating radiation transport are reviewed, along with existing codes and data that utilize these methods. A literature survey was employed to select a cadre of codes and data libraries to be reviewed. The selection process was based on specific criteria presented in the report. Separate summaries were written for several codes (or family of codes) that provided information on the method of solution, limitations and advantages, availability, data access, ease of use, and known accuracy. For each data library, the summary covers the source of the data, applicability of these data, and known verification efforts. Finally, the report discusses the overall status of spent fuel shielding analysis techniques and attempts to illustrate areas where inaccuracy and/or uncertainty exist. The report notes the advantages and limitations of several analysis procedures and illustrates the importance of using adequate cross-section data sets. Additional work is recommended to enable final selection/validation of analysis tools that will best meet the US Department of Energy's requirements for use in developing a viable HLW management system. 188 refs., 16 figs., 27 tabs.

  15. Mechanisms of the alternative activation of macrophages and non-coding RNAs in the development of radiation-induced lung fibrosis

    PubMed Central

    Duru, Nadire; Wolfson, Benjamin; Zhou, Qun

    2016-01-01

    Radiation-induced lung fibrosis (RILF) is a common side effect of thoracic irradiation therapy and leads to high mortality rates after cancer treatment. Radiation injury induces inflammatory M1 macrophage polarization leading to radiation pneumonitis, the first stage of RILF progression. Fibrosis occurs due to the transition of M1 macrophages to the anti-inflammatory pro-fibrotic M2 phenotype, and the resulting imbalance of macrophage regulated inflammatory signaling. Non-coding RNA signaling has been shown to play a large role in the regulation of the M2 mediated signaling pathways that are associated with the development and progression of fibrosis. While many studies show the link between M2 macrophages and fibrosis, there are only a few that explore their distinct role and the regulation of their signaling by non-coding RNA in RILF. In this review we summarize the current body of knowledge describing the roles of M2 macrophages in RILF, with an emphasis on the expression and functions of non-coding RNAs. PMID:27957248

  16. Mechanisms of the alternative activation of macrophages and non-coding RNAs in the development of radiation-induced lung fibrosis.

    PubMed

    Duru, Nadire; Wolfson, Benjamin; Zhou, Qun

    2016-11-26

    Radiation-induced lung fibrosis (RILF) is a common side effect of thoracic irradiation therapy and leads to high mortality rates after cancer treatment. Radiation injury induces inflammatory M1 macrophage polarization leading to radiation pneumonitis, the first stage of RILF progression. Fibrosis occurs due to the transition of M1 macrophages to the anti-inflammatory pro-fibrotic M2 phenotype, and the resulting imbalance of macrophage regulated inflammatory signaling. Non-coding RNA signaling has been shown to play a large role in the regulation of the M2 mediated signaling pathways that are associated with the development and progression of fibrosis. While many studies show the link between M2 macrophages and fibrosis, there are only a few that explore their distinct role and the regulation of their signaling by non-coding RNA in RILF. In this review we summarize the current body of knowledge describing the roles of M2 macrophages in RILF, with an emphasis on the expression and functions of non-coding RNAs.

  17. Intercomparison of Monte Carlo Radiation Transport Codes MCNPX, GEANT4, and FLUKA for Simulating Proton Radiotherapy of the Eye

    PubMed Central

    Randeniya, S. D.; Taddei, P. J.; Newhauser, W. D.; Yepes, P.

    2010-01-01

    Monte Carlo simulations of an ocular treatment beam-line consisting of a nozzle and a water phantom were carried out using MCNPX, GEANT4, and FLUKA to compare the dosimetric accuracy and the simulation efficiency of the codes. Simulated central axis percent depth-dose profiles and cross-field dose profiles were compared with experimentally measured data. Simulation speed was evaluated by comparing the number of proton histories simulated per second using each code. The results indicate that all three Monte Carlo transport codes calculate sufficiently accurate proton dose distributions in the eye and that the FLUKA transport code has the highest simulation efficiency. PMID:20865141

  18. RH 1.5D: a massively parallel code for multi-level radiative transfer with partial frequency redistribution and Zeeman polarisation

    NASA Astrophysics Data System (ADS)

    Pereira, Tiago M. D.; Uitenbroek, Han

    2015-02-01

    The emergence of three-dimensional magneto-hydrodynamic simulations of stellar atmospheres has sparked a need for efficient radiative transfer codes to calculate detailed synthetic spectra. We present RH 1.5D, a massively parallel code based on the RH code and capable of performing Zeeman polarised multi-level non-local thermodynamical equilibrium calculations with partial frequency redistribution for an arbitrary number of chemical species. The code calculates spectra from 3D, 2D or 1D atmospheric models on a column-by-column basis (or 1.5D). While the 1.5D approximation breaks down in the cores of very strong lines in an inhomogeneous environment, it is nevertheless suitable for a large range of scenarios and allows for faster convergence with finer control over the iteration of each simulation column. The code scales well to at least tens of thousands of CPU cores, and is publicly available. In the present work we briefly describe its inner workings, strategies for convergence optimisation, its parallelism, and some possible applications.

  19. Radiation physics and shielding codes and analyses applied to design-assist and safety analyses of CANDU® and ACR™ reactors

    SciTech Connect

    Aydogdu, K.; Boss, C. R.

    2006-07-01

    This paper discusses the radiation physics and shielding codes and analyses applied in the design of CANDU and ACR reactors. The focus is on the types of analyses undertaken rather than the inputs supplied to the engineering disciplines. Nevertheless, the discussion does show how these analyses contribute to the engineering design. Analyses in radiation physics and shielding can be categorized as either design-assist or safety and licensing (accident) analyses. Many of the analyses undertaken are designated 'design-assist' where the analyses are used to generate recommendations that directly influence plant design. These recommendations are directed at mitigating or reducing the radiation hazard of the nuclear power plant with engineered systems and components. Thus the analyses serve a primary safety function by ensuring the plant can be operated with acceptable radiation hazards to the workers and public. In addition to this role of design assist, radiation physics and shielding codes are also deployed in safety and licensing assessments of the consequences of radioactive releases of gaseous and liquid effluents during normal operation and gaseous effluents following accidents. In the latter category, the final consequences of accident sequences, expressed in terms of radiation dose to members of the public, and inputs to accident analysis, e.g., decay heat in fuel following a loss-of-coolant accident, are also calculated. Another role of the analyses is to demonstrate that the design of the plant keeps radiation doses ALARA (as low as reasonably achievable). This principle is applied throughout the design process to minimize worker and public doses. The principle of ALARA is an inherent part of all design-assist recommendations and safety and licensing assessments. The main focus of an ALARA exercise at the design stage is to minimize the radiation hazards at the source. This exploits material selection and impurity specifications and relies

  20. Development of rotating shadowband spectral radiometers and GCM radiation code test data sets in support of ARM. Technical progress report, September 15, 1990--September 14, 1991

    SciTech Connect

    Harrison, L.; Michalsky, J.

    1991-03-13

    Three separate tasks are included in the first year of the project. Two involve assembling data sets useful for testing radiation models in global climate modeling (GCM) codes, and the third is concerned with the development of advanced instrumentation for performing accurate spectral radiation measurements. Task 1: Three existing data sets have been merged for two locations, one in the wet northeastern US and a second in the dry western US. The data sets are meteorological data from the WBAN network, upper air data from the NCDC, and high quality solar radiation measurements from Albany, New York and Golden, Colorado. These represent test data sets for those modelers developing radiation codes for the GCM models. Task 2: Existing data are not quite adequate from a modeler's perspective without downwelling infrared data and surface albedo, or reflectance, data. Before the deployment of the first CART site in ARM the authors are establishing this more complete set of radiation measurements at the Albany site to be operational only until CART is operational. The authors will have the site running by April 1991, which will provide about one year's data from this location. They will coordinate their measurements with satellite overpasses, and, to the extent possible, with radiosonde releases, in order that the data set be coincident in time. Task 3: Work has concentrated on the multiple filter instrument. The mechanical, optical, and software engineering for this instrument is complete, and the first field prototype is running at the Rattlesnake Mountain Observatory (RMO) test site. This instrument is performing well, and is already delivering reliable and useful information.

  1. A Study of Longwave Radiation Codes for Climate Studies: Validation with ARM Observations and Tests in General Circulation Models

    SciTech Connect

    Robert G. Ellingson

    2004-09-28

    One specific goal of the Atmospheric Radiation Measurements (ARM) program is to improve the treatment of radiative transfer in General Circulation Models (GCMs) under clear-sky, general overcast and broken cloud conditions. Our project was geared to contribute to this goal by attacking major problems associated with one of the dominant radiation components of the problem: longwave radiation. The primary long-term project objectives were to: (1) develop an optimum longwave radiation model for use in GCMs that has been calibrated with state-of-the-art observations for clear and cloudy conditions, and (2) determine how the longwave radiative forcing with an improved algorithm contributes relatively in a GCM when compared to shortwave radiative forcing, sensible heating, thermal advection and convection. The approach has been to build upon existing models in an iterative, predictive fashion. We focused on comparing calculations from a set of models with operationally observed data for clear, overcast and broken cloud conditions. The differences found through the comparisons and physical insights have been used to develop new models, most of which have been tested with new data. Our initial GCM studies used existing GCMs to study the climate model-radiation sensitivity problem. Although this portion of our initial plans was curtailed midway through the project, we anticipate that the eventual outcome of this approach will be both a better longwave radiative forcing algorithm and, through a better understanding of how longwave radiative forcing influences the model equilibrium climate, insight into how improvements in climate prediction can be achieved with this algorithm.

  2. CSDUST3 - A radiation transport code for a dusty medium with 1-D planar, spherical or cylindrical geometry

    NASA Technical Reports Server (NTRS)

    Egan, Michael P.; Leung, Chun Ming; Spagna, George F., Jr.

    1988-01-01

    The program solves the radiation transport problem in a dusty medium with one-dimensional planar, spherical or cylindrical geometry. It determines self-consistently the effects of multiple scattering, absorption, and re-emission of photons on the temperature of dust grains and the characteristics of the internal radiation field. The program can treat radiation field anisotropy, linear anisotropic scattering, and multi-grain components. The program output consists of the dust-temperature distribution, flux spectrum, surface brightness at each frequency and the observed intensities (involving a convolution with a telescope beam pattern).
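The last step the abstract mentions, convolving the model surface brightness with a telescope beam pattern, can be sketched for the common case of a Gaussian beam. This is a hypothetical helper for illustration, not CSDUST3's actual routine (CSDUST3 is Fortran):

```python
import numpy as np

def convolve_beam(profile, fwhm_pix):
    """Convolve a 1D surface-brightness profile with a Gaussian beam.

    fwhm_pix is the beam FWHM in pixels; the kernel is normalized to
    unit area so total flux is conserved for sources away from the
    array edges.
    """
    sigma = fwhm_pix / 2.3548            # FWHM -> Gaussian standard deviation
    half = int(4 * sigma)                # truncate the kernel at +/- 4 sigma
    x = np.arange(-half, half + 1)
    beam = np.exp(-0.5 * (x / sigma) ** 2)
    beam /= beam.sum()                   # unit-area beam
    return np.convolve(profile, beam, mode="same")
```

Convolving each frequency's brightness profile with the appropriate beam width is what turns the model output into quantities directly comparable with observed intensities.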

  3. TH-A-19A-11: Validation of GPU-Based Monte Carlo Code (gPMC) Versus Fully Implemented Monte Carlo Code (TOPAS) for Proton Radiation Therapy: Clinical Cases Study

    SciTech Connect

    Giantsoudi, D; Schuemann, J; Dowdell, S; Paganetti, H; Jia, X; Jiang, S

    2014-06-15

    Purpose: For proton radiation therapy, Monte Carlo simulation (MCS) methods are recognized as the gold-standard dose calculation approach. Although previously unrealistic due to limitations in available computing power, GPU-based applications allow MCS of proton treatment fields to be performed in routine clinical use, on time scales comparable to that of conventional pencil-beam algorithms. This study focuses on validating the results of our GPU-based code (gPMC) versus a fully implemented proton therapy MCS code (TOPAS) for clinical patient cases. Methods: Two treatment sites were selected to provide clinical cases for this study: head-and-neck cases due to anatomical geometrical complexity (air cavities and density heterogeneities), making dose calculation very challenging, and prostate cases due to the higher proton energies used and the close proximity of the treatment target to sensitive organs at risk. Both gPMC and TOPAS were used to calculate 3-dimensional dose distributions for all patients in this study. Comparisons were performed based on target coverage indices (mean dose, V90 and D90) and gamma index distributions for 2% of the prescription dose and 2 mm. Results: For seven out of eight studied cases, mean target dose, V90 and D90 differed by less than 2% between TOPAS and gPMC dose distributions. Gamma index analysis for all prostate patients resulted in a passing rate of more than 99% of voxels in the target. Four out of five head-and-neck cases showed a gamma index passing rate for the target of more than 99%, with the fifth having a passing rate of 93%. Conclusion: Our current work showed excellent agreement between our GPU-based MCS code and a fully implemented proton therapy MC code for a group of dosimetrically challenging patient cases.
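The gamma index used in comparisons like this combines a dose-difference tolerance (here 2% of prescription) with a distance-to-agreement tolerance (here 2 mm); a point passes when its gamma value is at most 1. A minimal 1D sketch of the computation (function and parameter names are illustrative; clinical implementations work on 3D dose grids with sub-voxel interpolation):

```python
import numpy as np

def gamma_index(dose_ref, dose_eval, positions, dose_tol, dist_tol):
    """Simplified global 1D gamma index (e.g. 2%/2 mm).

    dose_tol is an absolute dose tolerance (e.g. 2% of the prescription
    dose); dist_tol is in the same units as `positions`.  Returns the
    gamma value at each reference point; a point passes if gamma <= 1.
    """
    dose_ref = np.asarray(dose_ref, dtype=float)
    dose_eval = np.asarray(dose_eval, dtype=float)
    positions = np.asarray(positions, dtype=float)
    gammas = np.empty_like(dose_ref)
    for i, (x_r, d_r) in enumerate(zip(positions, dose_ref)):
        dd = (dose_eval - d_r) / dose_tol    # dose-difference term
        dx = (positions - x_r) / dist_tol    # distance-to-agreement term
        gammas[i] = np.sqrt(dd ** 2 + dx ** 2).min()
    return gammas
```

A "passing rate" such as the 99% quoted above is then simply the fraction of evaluated points with gamma at most 1.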

  4. SU-F-18C-09: Assessment of OSL Dosimeter Technology in the Validation of a Monte Carlo Radiation Transport Code for CT Dosimetry

    SciTech Connect

    Carver, D; Kost, S; Pickens, D; Price, R; Stabin, M

    2014-06-15

    Purpose: To assess the utility of optically stimulated luminescent (OSL) dosimeter technology in calibrating and validating a Monte Carlo radiation transport code for computed tomography (CT). Methods: Exposure data were taken using both a standard CT 100-mm pencil ionization chamber and a series of 150-mm OSL CT dosimeters. Measurements were made at system isocenter in air as well as in standard 16-cm (head) and 32-cm (body) CTDI phantoms at isocenter and at the 12 o'clock positions. Scans were performed on a Philips Brilliance 64 CT scanner for 100 and 120 kVp at 300 mAs with a nominal beam width of 40 mm. A radiation transport code to simulate the CT scanner conditions was developed using the GEANT4 physics toolkit. The imaging geometry and associated parameters were simulated for each ionization chamber and phantom combination. Simulated absorbed doses were compared to both CTDI100 values determined from the ion chamber and to CTDI100 values reported from the OSLs. The dose profiles from each simulation were also compared to the physical OSL dose profiles. Results: CTDI100 values reported by the ion chamber and OSLs are generally in good agreement (average percent difference of 9%), and provide a suitable way to calibrate doses obtained from simulation to real absorbed doses. Simulated and real CTDI100 values agree to within 10% or less, and the simulated dose profiles also predict the physical profiles reported by the OSLs. Conclusion: Ionization chambers are generally considered the standard for absolute dose measurements. However, OSL dosimeters may also serve as a useful tool with the significant benefit of also assessing the radiation dose profile. This may offer an advantage to those developing simulations for assessing radiation dosimetry, such as verification of spatial dose distribution and beam width.
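The CTDI100 quantity compared above is defined as the integral of the single-axial-rotation dose profile D(z) over the central 100 mm (the length of the standard pencil chamber), divided by the nominal total beam width N·T. A minimal sketch of that calculation from a simulated or OSL-measured profile (hypothetical helper name):

```python
import numpy as np

def ctdi100(z_mm, dose_profile, beam_width_mm):
    """CTDI100 from a single-rotation dose profile D(z).

    Integrates D(z) over the central 100 mm and divides by the nominal
    total beam width N*T (passed directly as beam_width_mm).
    """
    z = np.asarray(z_mm, dtype=float)
    d = np.asarray(dose_profile, dtype=float)
    mask = np.abs(z) <= 50.0                 # 100 mm integration window
    z, d = z[mask], d[mask]
    # Trapezoidal rule over the windowed profile
    integral = np.sum(0.5 * (d[1:] + d[:-1]) * np.diff(z))
    return integral / beam_width_mm
```

For an idealized rectangular profile exactly as wide as the beam, this returns the profile height; scatter tails broaden real profiles, which is what makes the 150-mm OSL strip's full dose profile informative.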

  5. Nuclear Physics Issues in Space Radiation Risk Assessment-The FLUKA Monte Carlo Transport Code Used for Space Radiation Measurement and Protection

    SciTech Connect

    Lee, K. T.

    2007-02-12

    The long-term human exploration goals that NASA has embraced require an understanding of the primary radiation and secondary particle production under a variety of environmental conditions. In order to perform accurate transport simulations for the incident particles found in the space environment, accurate nucleus-nucleus inelastic event generators are needed, and NASA is funding their development. For the first time, NASA is including the radiation problem in the design of the next manned exploration vehicle. The NASA-funded FLUER-S (FLUKA Executing Under ROOT-Space) project has several goals beyond the improvement of the internal nuclear physics simulations. These include making FLUKA more user-friendly. Several tools have been developed to simplify the use of FLUKA without compromising its accuracy or versatility. Among these tools are a general source input, support for distributed computing, simplified geometry input, geometry and event visualization, and standard FLUKA scoring output analysis using a ROOT GUI. In addition to describing these tools, we will show how they have been used for space radiation environment data analysis in MARIE, IVCPDS, and EVCPDS. Similar analyses can be performed for future radiation measurement detectors before they are deployed in order to optimize their design. These tools can also be used in the design of nuclear-based power systems on manned exploration vehicles and planetary surfaces. In addition to these space applications, the simulations are being used to support accelerator-based experiments like the cross-section measurements being performed at HIMAC and at NSRL at BNL.

  6. Nuclear Physics Issues in Space Radiation Risk Assessment-The FLUKA Monte Carlo Transport Code Used for Space Radiation Measurement and Protection

    NASA Astrophysics Data System (ADS)

    Lee, K. T.

    2007-02-01

    The long-term human exploration goals that NASA has embraced require an understanding of the primary radiation and secondary particle production under a variety of environmental conditions. In order to perform accurate transport simulations for the incident particles found in the space environment, accurate nucleus-nucleus inelastic event generators are needed, and NASA is funding their development. For the first time, NASA is including the radiation problem in the design of the next manned exploration vehicle. The NASA-funded FLUER-S (FLUKA Executing Under ROOT-Space) project has several goals beyond the improvement of the internal nuclear physics simulations. These include making FLUKA more user-friendly. Several tools have been developed to simplify the use of FLUKA without compromising its accuracy or versatility. Among these tools are a general source input, support for distributed computing, simplified geometry input, geometry and event visualization, and standard FLUKA scoring output analysis using a ROOT GUI. In addition to describing these tools, we will show how they have been used for space radiation environment data analysis in MARIE, IVCPDS, and EVCPDS. Similar analyses can be performed for future radiation measurement detectors before they are deployed in order to optimize their design. These tools can also be used in the design of nuclear-based power systems on manned exploration vehicles and planetary surfaces. In addition to these space applications, the simulations are being used to support accelerator-based experiments like the cross-section measurements being performed at HIMAC and at NSRL at BNL.

  7. IM3D: A parallel Monte Carlo code for efficient simulations of primary radiation displacements and damage in 3D geometry

    PubMed Central

    Li, Yong Gang; Yang, Yang; Short, Michael P.; Ding, Ze Jun; Zeng, Zhi; Li, Ju

    2015-01-01

    SRIM-like codes have limitations in describing general 3D geometries for modeling radiation displacements and damage in nanostructured materials. A universal, computationally efficient and massively parallel 3D Monte Carlo code, IM3D, has been developed with excellent parallel scaling performance. IM3D is based on fast indexing of scattering integrals and the SRIM stopping power database, and allows the user a choice of Constructive Solid Geometry (CSG) or Finite Element Triangle Mesh (FETM) methods for constructing 3D shapes and microstructures. For 2D films and multilayers, IM3D perfectly reproduces SRIM results, and can be ∼10^2 times faster in serial execution and >10^4 times faster using parallel computation. For 3D problems, it provides a fast approach for analyzing the spatial distributions of primary displacements and defect generation under ion irradiation. Herein we also provide a detailed discussion of our open-source collision cascade physics engine, revealing the true meaning and limitations of the “Quick Kinchin-Pease” and “Full Cascades” options. The issues of femtosecond to picosecond timescales in defining displacement versus damage, and the limitations of the displacements per atom (DPA) unit in quantifying radiation damage (such as its inadequacy in quantifying the degree of chemical mixing), are discussed. PMID:26658477
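The "Quick Kinchin-Pease" option referred to above is commonly the NRT (Norgett-Robinson-Torrens) displacement estimate, which assigns a number of stable displacements per cascade in three regimes of the damage energy. A sketch under that assumption (the function name is hypothetical, and the 40 eV threshold is only a typical value, e.g. for iron):

```python
def nrt_displacements(t_dam_ev, e_d_ev=40.0):
    """NRT ("quick Kinchin-Pease") displacement count for one cascade.

    t_dam_ev : damage energy (PKA energy minus electronic losses), in eV
    e_d_ev   : threshold displacement energy, in eV
    """
    if t_dam_ev < e_d_ev:
        return 0.0                           # no stable displacement
    if t_dam_ev < 2.5 * e_d_ev:
        return 1.0                           # a single Frenkel pair
    return 0.8 * t_dam_ev / (2.0 * e_d_ev)   # linear cascade regime
```

Summing this count over all cascades and dividing by the number of atoms in the irradiated volume gives the DPA value whose limitations the abstract discusses.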

  8. Scaling and performance of a 3-D radiation hydrodynamics code on message-passing parallel computers: final report

    SciTech Connect

    Hayes, J C; Norman, M

    1999-10-28

    This report details an investigation into the efficacy of two approaches to solving the radiation diffusion equation within a radiation hydrodynamic simulation. Because leading-edge scientific computing platforms have evolved from large single-node vector processors to parallel aggregates containing tens to thousands of individual CPU's, the ability of an algorithm to maintain high compute efficiency when distributed over a large array of nodes is critically important. The viability of an algorithm thus hinges upon the tripartite question of numerical accuracy, total time to solution, and parallel efficiency.

  9. PiC code KARAT simulations of Coherent THz Smith-Purcell Radiation from diffraction gratings of various profiles

    NASA Astrophysics Data System (ADS)

    Artyomov, K. P.; Ryzhov, V. V.; Potylitsyn, A. P.; Sukhikh, L. G.

    2017-05-01

    Generation of coherent THz Smith-Purcell radiation by a single electron bunch or a multi-bunched electron beam was simulated for lamellar, sinusoidal and echelette gratings. The dependence of the CSPR intensity on the grating corrugation depth was investigated. The angular and spectral characteristics of the CSPR were obtained for different diffraction grating profiles. It is shown that, in the case of a femtosecond multi-bunched electron beam with 10 MeV energy, a sinusoidal grating with a period of 292 μm and a groove depth of 60 μm yields a uniform angular distribution with high radiation intensity.
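Smith-Purcell radiation obeys the dispersion relation λ = (d/n)(1/β − cos θ), which fixes the emitted wavelength from the grating period d, diffraction order n, observation angle θ, and normalized electron speed β. A small sketch (hypothetical helper name, not the KARAT code):

```python
import math

def smith_purcell_wavelength(period_um, beta, theta_deg, order=1):
    """Smith-Purcell dispersion relation: lambda = (d/n)(1/beta - cos(theta)).

    period_um : grating period d, in micrometres
    beta      : electron speed as a fraction of c
    theta_deg : observation angle measured from the beam direction
    """
    theta = math.radians(theta_deg)
    return (period_um / order) * (1.0 / beta - math.cos(theta))
```

For the parameters quoted in the abstract (10 MeV electrons, so β ≈ 0.9988, and a 292 μm period), first-order emission near θ = 90° falls at roughly 292 μm, i.e. around 1 THz, consistent with the THz regime of the study.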

  10. Radiation

    NASA Image and Video Library

    Outside the protective cocoon of Earth's atmosphere, the universe is full of harmful radiation. Astronauts who live and work in space are exposed not only to ultraviolet rays but also to space radi...

  11. Development and Implementation of Photonuclear Cross-Section Data for Mutually Coupled Neutron-Photon Transport Calculations in the Monte Carlo N-Particle (MCNP) Radiation Transport Code

    SciTech Connect

    White, Morgan C.

    2000-07-01

    The fundamental motivation for the research presented in this dissertation was the need to develop a more accurate prediction method for the characterization of mixed radiation fields around medical electron accelerators (MEAs). Specifically, a model is developed for the simulation of neutron and other particle production from photonuclear reactions and incorporated in the Monte Carlo N-Particle (MCNP) radiation transport code. This extension of the capability within the MCNP code provides for a more accurate assessment of mixed radiation fields. The Nuclear Theory and Applications group of the Los Alamos National Laboratory has recently provided first-of-a-kind evaluated photonuclear data for a select group of isotopes. These data provide the reaction probabilities as functions of incident photon energy, with angular and energy distribution information for all reaction products. The availability of these data is the cornerstone of the new methodology for state-of-the-art mutually coupled photon-neutron transport simulations. The dissertation includes details of the model development and implementation necessary to use the new photonuclear data within MCNP simulations. A new data format has been developed to include tabular photonuclear data. Data are processed from the Evaluated Nuclear Data Format (ENDF) to the new class "u" A Compact ENDF (ACE) format using a standalone processing code. MCNP modifications have been completed to enable Monte Carlo sampling of photonuclear reactions. Note that both neutron and gamma production are included in the present model. The new capability has been subjected to extensive verification and validation (V&V) testing. Verification testing has established the expected basic functionality. Two validation projects were undertaken. First, comparisons were made to benchmark data from the literature. These calculations demonstrate the accuracy of the new data and transport routines to better than 25 percent. Second, the ability to

  12. Emission from Very Small Grains and PAH Molecules in Monte Carlo Radiation Transfer Codes: Application to the Edge-On Disk of Gomez's Hamburger

    NASA Astrophysics Data System (ADS)

    Wood, Kenneth; Whitney, Barbara A.; Robitaille, Thomas; Draine, Bruce T.

    2008-12-01

    We have modeled optical to far-infrared images, photometry, and spectroscopy of the object known as Gomez's Hamburger. We reproduce the images and spectrum with an edge-on disk of mass 0.3 M⊙ and radius 1600 AU, surrounding an A0 III star at a distance of 280 pc. Our mass estimate is in excellent agreement with recent CO observations. However, our distance determination is more than an order of magnitude smaller than previous analyses, which inaccurately interpreted the optical spectrum. To accurately model the infrared spectrum we have extended our Monte Carlo radiation transfer codes to include emission from polycyclic aromatic hydrocarbon (PAH) molecules and very small grains (VSG). We do this using precomputed PAH/VSG emissivity files for a wide range of values of the mean intensity of the exciting radiation field. When Monte Carlo energy packets are absorbed by PAHs/VSGs, we reprocess them to other wavelengths by sampling from the emissivity files, thus simulating the absorption and reemission process without reproducing lengthy computations of statistical equilibrium, excitation, and de-excitation in the complex many-level molecules. Using emissivity lookup tables in our Monte Carlo codes gives us the flexibility to use the latest grain physics calculations of PAH/VSG emissivity and opacity that are being continually updated in the light of higher resolution infrared spectra. We find our approach gives a good representation of the observed PAH spectrum from the disk of Gomez's Hamburger. Our models also indicate that the PAHs/VSGs in the disk have a larger scale height than larger radiative equilibrium grains, providing evidence for dust coagulation and settling to the midplane.

  13. A study of longwave radiation codes for climate studies: Validation with ARM observations and tests in general circulation models. Technical report, 15 September 1990--25 April 1993

    SciTech Connect

    Ellingson, R.G.; Baer, F.

    1993-12-31

    This report summarizes the activities of our group to meet our stated objectives. The report is divided into sections entitled: Radiation Model Testing Activities, General Circulation Model Testing Activities, Science Team Activities, and Publications, Presentations and Meetings. The section on Science Team Activities summarizes our participation with the science team to further advance the observation and modeling programs. Appendix A lists graduate students supported and post-doctoral appointments during the project. Reports on the activities during each of the first two years are included as Appendix B. Significant progress has been made in: determining the ability of line-by-line radiation models to calculate the downward longwave flux at the surface; determining the uncertainties in calculating the downwelling radiance and flux at the surface associated with the use of different proposed profiling techniques; intercomparing clear-sky radiance and flux observations with calculations from the radiation codes of different climate models; determining the uncertainties associated with estimating N* from surface longwave flux observations; and determining the sensitivity of model calculations to different formulations of the effects of finite-sized clouds.

  14. Validation of a GPU-based Monte Carlo code (gPMC) for proton radiation therapy: clinical cases study

    NASA Astrophysics Data System (ADS)

    Giantsoudi, Drosoula; Schuemann, Jan; Jia, Xun; Dowdell, Stephen; Jiang, Steve; Paganetti, Harald

    2015-03-01

    Monte Carlo (MC) methods are recognized as the gold standard for dose calculation; however, they have not yet replaced analytical methods due to their lengthy calculation times. GPU-based applications allow MC dose calculations to be performed on time scales comparable to conventional analytical algorithms. This study focuses on validating our GPU-based MC code for proton dose calculation (gPMC) using an experimentally validated multi-purpose MC code (TOPAS) and comparing their performance for clinical patient cases. Clinical cases from five treatment sites were selected, covering the full range from very homogeneous patient geometries (liver) to patients with high geometrical complexity (air cavities and density heterogeneities in head-and-neck and lung patients) and from short beam range (breast) to large beam range (prostate). Both gPMC and TOPAS were used to calculate 3D dose distributions for all patients. Comparisons were performed based on target coverage indices (mean dose, V95, D98, D50, D02) and gamma index distributions. Dosimetric indices differed by less than 2% between TOPAS and gPMC dose distributions for most cases. Gamma index analysis with a 1%/1 mm criterion resulted in a passing rate of more than 94% of all patient voxels receiving more than 10% of the mean target dose, for all patients except the prostate cases. Although clinically insignificant, gPMC resulted in a systematic underestimation of target dose for prostate cases by 1-2% compared to TOPAS. Correspondingly, the gamma index analysis with the 1%/1 mm criterion failed for most beams for this site, while for the 2%/1 mm criterion passing rates of more than 94.6% of all patient voxels were observed. For the same initial number of simulated particles, calculation time for a single beam for a typical head and neck patient plan decreased from 4 CPU hours per million particles (2.8-2.9 GHz Intel X5600) for TOPAS to 2.4 s per million particles (NVIDIA TESLA C2075) for gPMC. Excellent agreement was

  15. Monte Carlo Simulation of a 6 MV X-Ray Beam for Open and Wedge Radiation Fields, Using GATE Code

    PubMed Central

    Bahreyni-Toosi, Mohammad-Taghi; Nasseri, Shahrokh; Momennezhad, Mahdi; Hasanabadi, Fatemeh; Gholamhosseinian, Hamid

    2014-01-01

    The aim of this study is to provide a control software system, based on Monte Carlo simulation, and calculations of dosimetric parameters of standard and wedge radiation fields, using a Monte Carlo method. GATE version 6.1 (OpenGATE Collaboration) was used to simulate a compact 6 MV linear accelerator system. In order to accelerate the calculations, the phase-space technique and cluster computing (Condor version 7.2.4, Condor Team, University of Wisconsin–Madison) were used. Dosimetric parameters used in treatment planning systems for the standard and wedge radiation fields (10 cm × 10 cm to 30 cm × 30 cm and a 60° wedge), including the percentage depth dose and dose profiles, were obtained by both computational and experimental methods. Gamma index was applied to compare calculated and measured results with 3%/3 mm criteria. Almost all calculated data points satisfied the 3%/3 mm gamma index criteria. Based on the good agreement between calculated and measured results obtained for various radiation fields in this study, GATE may be used as a useful tool for quality control or pretreatment verification procedures in radiotherapy. PMID:25426430
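As a minimal illustration of one of the dosimetric parameters named above, the following sketch normalizes a depth-dose curve to its maximum to obtain a percentage depth dose (PDD); the depth and dose samples are invented, not GATE output.

```python
def percentage_depth_dose(doses):
    """Normalize a depth-dose curve to the dose at d_max (PDD in percent)."""
    d_ref = max(doses)
    return [100.0 * d / d_ref for d in doses]

depths = [0.0, 0.5, 1.5, 5.0, 10.0, 20.0]      # cm, illustrative sampling
doses  = [0.45, 0.85, 1.00, 0.86, 0.67, 0.38]  # arbitrary units, invented
pdd = percentage_depth_dose(doses)             # peaks at 100% at d_max
```

Measured and simulated PDD curves built this way are what the gamma comparison in the abstract evaluates point by point.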

  16. Monte Carlo Simulation of a 6 MV X-Ray Beam for Open and Wedge Radiation Fields, Using GATE Code.

    PubMed

    Bahreyni-Toosi, Mohammad-Taghi; Nasseri, Shahrokh; Momennezhad, Mahdi; Hasanabadi, Fatemeh; Gholamhosseinian, Hamid

    2014-10-01

    The aim of this study is to provide a control software system, based on Monte Carlo simulation, and calculations of dosimetric parameters of standard and wedge radiation fields, using a Monte Carlo method. GATE version 6.1 (OpenGATE Collaboration) was used to simulate a compact 6 MV linear accelerator system. In order to accelerate the calculations, the phase-space technique and cluster computing (Condor version 7.2.4, Condor Team, University of Wisconsin-Madison) were used. Dosimetric parameters used in treatment planning systems for the standard and wedge radiation fields (10 cm × 10 cm to 30 cm × 30 cm and a 60° wedge), including the percentage depth dose and dose profiles, were obtained by both computational and experimental methods. Gamma index was applied to compare calculated and measured results with 3%/3 mm criteria. Almost all calculated data points satisfied the 3%/3 mm gamma index criteria. Based on the good agreement between calculated and measured results obtained for various radiation fields in this study, GATE may be used as a useful tool for quality control or pretreatment verification procedures in radiotherapy.

  17. Reanalysis of Radiation Belt Electron Phase Space Density using the UCLA 1-D VERB code and Kalman filtering: Correlation between the inner edge of the outer radiation belt phase space density and the plasmapause location

    NASA Astrophysics Data System (ADS)

    Espy, P. J.; Daae, M.; Shprits, Y.

    2010-12-01

    The correlation between the inner edge of the outer radiation belt phase space density (PSD) and the plasmapause location (Lpp) is investigated using reanalysis. A large data set is used for the statistical analysis, with data from 1990-1991 from the CRRES satellite, GEO 1989, GPS-ns18, and Akebono. These data are incorporated into the reanalysis by means of a Kalman filter with the UCLA 1-D VERB code. The result is a continuous radial and temporal distribution of the PSD from L*=3 to L*=7. The innovation vector of the reconstructed PSD can give us information about regions where local loss or source processes dominate. We analyze both the PSD and the innovation vector by binning them into slots of Dst and Kp values. This has been done by finding the times when the Dst (Kp) is within each bin size of 20 nT (1) from 10 nT to -130 nT (1 to 8). The PSD and innovation vector were then averaged over each of those times. The result shows a good correlation between the location of the inner edge of the outer radiation belt in the PSD and the location of the plasmapause, which is consistent with previous observations. The boundary between the inner edge of the radiation belt and the Lpp becomes sharper, and the radiation belt becomes thinner, during times of high geomagnetic activity. The innovation vector shows that the inner edge of the source region also lines up well with the Lpp, further indicating a competition between loss and source processes during active times. This study also illustrates how data assimilation in the radiation belts can be used to understand the underlying processes of acceleration and loss in the inner magnetosphere.
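The Dst/Kp binning procedure described above can be sketched as follows; the sample values, bin edges, and data layout are hypothetical, not the actual CRRES/VERB reanalysis data.

```python
def bin_by_index(times_values, index_series, edges):
    """Average a quantity over epochs whose geomagnetic index falls in each bin.

    times_values : list of (t, value) samples (e.g. PSD at a fixed L*)
    index_series : dict t -> index value (e.g. Dst in nT)
    edges        : bin edges, descending for Dst (e.g. 10, -10, ..., -130)
    Returns one mean value per bin (None for an empty bin).
    """
    means = []
    for hi, lo in zip(edges[:-1], edges[1:]):
        vals = [v for t, v in times_values if lo < index_series[t] <= hi]
        means.append(sum(vals) / len(vals) if vals else None)
    return means

# Toy example: Dst bins of 20 nT from 10 nT down to -30 nT.
samples = [(0, 1.0), (1, 2.0), (2, 10.0), (3, 12.0)]  # invented PSD samples
dst = {0: 5, 1: -5, 2: -15, 3: -25}                   # invented Dst values
binned = bin_by_index(samples, dst, [10, -10, -30])
```

The same binning would be applied to the innovation vector to compare loss/source regions across activity levels.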

  18. The PHARO Code.

    DTIC Science & Technology

    1981-11-24

    Visible radiation; Sensors; Infrared radiation; Line and band transitions; Isophots; High altitude nuclear data...radiation (watts/sr) in arbitrary wavelength intervals is determined. The results are a series of "isophot" plots for arbitrarily placed cameras or sensors...Section II. The output of the PHARO code consists of contour plots of radiative intensity (watts/cm² sr) or "isophot" plots for arbitrarily placed sensors

  19. The Microwave Applications Theory Program at NRL and Some Chemistry Code Applications to Ionospheric Heating by Microwave Radiation.

    DTIC Science & Technology

    1980-08-26

    RADIATION 1. INTRODUCTION The advent of high power pulsed microwave devices, the magnetrons, at NRL, which currently generate ~1 GWatt at λ = 10 cm and a...separation needed to sustain such a plasma. (g) relaxation of the disturbed air and the impact of the late-time air chemistry on multi-pulse breakdown...and the first negative bands of N2+. These two band systems correspond to the N2+(B²Σ - X²Σ) and N2(C³Πu - B³Πg) transitions, respectively. The

  20. An Intercomparison of Radiation Codes for Retrieving Upper Tropospheric Humidity in the 6.3-micron Band: A Report from the 1st GVaP Workshop

    NASA Technical Reports Server (NTRS)

    Soden, B.; Tjemkes, S.; Schmetz, J.; Saunders, R.; Bates, J.; Ellingson, B.; Engelen, R.; Garand, L.; Jackson, D.; Jedlovec, G.

    1999-01-01

    An intercomparison of radiation codes used in retrieving upper tropospheric humidity (UTH) from observations in the ν2 (6.3 μm) water vapor absorption band was performed. This intercomparison is one part of a coordinated effort within the GEWEX Water Vapor Project (GVaP) to assess our ability to monitor the distribution and variations of upper tropospheric moisture from space-borne sensors. A total of 23 different codes, ranging from detailed line-by-line (LBL) models, to coarser resolution narrow-band (NB) models, to highly-parameterized single-band (SB) models participated in the study. Forward calculations were performed using a carefully selected set of temperature and moisture profiles chosen to be representative of a wide range of atmospheric conditions. The LBL model calculations exhibited the greatest consistency with each other, typically agreeing to within 0.5 K in terms of the equivalent blackbody brightness temperature (T(sub b)). The majority of NB and SB models agreed to within +/- 1 K of the LBL models, although a few older models exhibited systematic T(sub b) biases in excess of 2 K. A discussion of the discrepancies between various models, their association with differences in model physics (e.g. continuum absorption), and their implications for UTH retrieval and radiance assimilation is presented.
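The equivalent blackbody brightness temperature T(sub b) used as the comparison metric above can be computed by inverting the Planck function; this sketch is monochromatic at 6.3 μm, whereas the intercompared models integrate over the instrument band.

```python
import math

# Physical constants (SI)
H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    """Spectral radiance B(lambda, T) in W m^-2 sr^-1 m^-1."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = H * C / (wavelength_m * KB * temp_k)
    return a / (math.exp(b) - 1.0)

def brightness_temperature(wavelength_m, radiance):
    """Invert the Planck function for the equivalent blackbody temperature."""
    a = 2.0 * H * C**2 / wavelength_m**5
    return H * C / (wavelength_m * KB * math.log(a / radiance + 1.0))

wl = 6.3e-6                        # 6.3 micron water-vapour band
rad = planck_radiance(wl, 250.0)   # radiance of a 250 K blackbody
tb = brightness_temperature(wl, rad)  # round-trip recovers 250 K
```

Model-to-model T(sub b) differences of 0.5-2 K, as quoted above, would appear directly as offsets in the recovered temperature.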

  1. A review of the use and potential of the GATE Monte Carlo simulation code for radiation therapy and dosimetry applications.

    PubMed

    Sarrut, David; Bardiès, Manuel; Boussion, Nicolas; Freud, Nicolas; Jan, Sébastien; Létang, Jean-Michel; Loudos, George; Maigne, Lydia; Marcatili, Sara; Mauxion, Thibault; Papadimitroulas, Panagiotis; Perrot, Yann; Pietrzyk, Uwe; Robert, Charlotte; Schaart, Dennis R; Visvikis, Dimitris; Buvat, Irène

    2014-06-01

    In this paper, the authors review the applicability of the open-source GATE Monte Carlo simulation platform based on the GEANT4 toolkit for radiation therapy and dosimetry applications. The many applications of GATE for state-of-the-art radiotherapy simulations are described, including external beam radiotherapy, brachytherapy, intraoperative radiotherapy, hadrontherapy, molecular radiotherapy, and in vivo dose monitoring. Investigations that have been performed using GEANT4 only are also mentioned to illustrate the potential of GATE. The very practical feature of GATE making it easy to model both a treatment and an imaging acquisition within the same framework is emphasized. The computational times associated with several applications are provided to illustrate the practical feasibility of the simulations using current computing facilities.

  2. A review of the use and potential of the GATE Monte Carlo simulation code for radiation therapy and dosimetry applications

    SciTech Connect

    Sarrut, David; Bardiès, Manuel; Marcatili, Sara; Mauxion, Thibault; Boussion, Nicolas; Freud, Nicolas; Létang, Jean-Michel; Jan, Sébastien; Maigne, Lydia; Perrot, Yann; Pietrzyk, Uwe; Robert, Charlotte; and others

    2014-06-15

    In this paper, the authors review the applicability of the open-source GATE Monte Carlo simulation platform based on the GEANT4 toolkit for radiation therapy and dosimetry applications. The many applications of GATE for state-of-the-art radiotherapy simulations are described, including external beam radiotherapy, brachytherapy, intraoperative radiotherapy, hadrontherapy, molecular radiotherapy, and in vivo dose monitoring. Investigations that have been performed using GEANT4 only are also mentioned to illustrate the potential of GATE. The very practical feature of GATE making it easy to model both a treatment and an imaging acquisition within the same framework is emphasized. The computational times associated with several applications are provided to illustrate the practical feasibility of the simulations using current computing facilities.

  3. Measurements and code comparison of wave dispersion and antenna radiation resistance for helicon waves in a high density cylindrical plasma source

    NASA Astrophysics Data System (ADS)

    Schneider, D. A.; Borg, G. G.; Kamenski, I. V.

    1999-03-01

    Helicon wave dispersion and radiation resistance measurements in a high density (ne ≈ 10^19-10^20 m^-3) and magnetic field (B < 0.2 T) cylindrical plasma source are compared to the results of a recently developed numerical plasma wave code [I. V. Kamenski and G. G. Borg, Phys. Plasmas 3, 4396 (1996)]. Results are compared for plasmas formed by a double saddle coil antenna and a helical antenna. In both cases, measurements reveal a dominance of the m=+1 azimuthal mode to the exclusion of most other modes; in particular, no significant m=-1 mode was observed. The helical antenna, designed to launch m<0 and m>0 modes in opposite directions along the field, resulted in an axially asymmetric discharge with very little plasma on the m<0 side of the antenna. For both antennas, good agreement of the antenna radiation resistance and wave dispersion with the model was obtained. It is concluded that unshielded antennas formed from current loops with an important |m|=1 component for the conditions of our experiment, couple most of their power to the m=+1 helicon mode and thus have negligible parasitic, nonhelicon plasma loading. This result greatly simplifies calculations of power balance in these sources by identifying the helicon as the mode by which energy is transferred to the plasma.

  4. Compressible Astrophysics Simulation Code

    SciTech Connect

    Howell, L.; Singer, M.

    2007-07-18

    This is an astrophysics simulation code involving a radiation diffusion module developed at LLNL coupled to compressible hydrodynamics and adaptive mesh infrastructure developed at LBNL. One intended application is to neutrino diffusion in core collapse supernovae.

  5. Montecarlo simulation code in optimisation of the IntraOperative Radiation Therapy treatment with mobile dedicated accelerator

    NASA Astrophysics Data System (ADS)

    Catalano, M.; Agosteo, S.; Moretti, R.; Andreoli, S.

    2007-06-01

    The principle of optimisation of the EURATOM 97/43 directive foresees that for all medical exposure of individuals for radiotherapeutic purposes, exposures of target volumes shall be individually planned, taking into account that doses to non-target volumes and tissues shall be as low as reasonably achievable and consistent with the intended radiotherapeutic purpose of the exposure. Treatment optimisation has to be carried out especially in nonconventional radiotherapeutic procedures, such as Intra Operative Radiation Therapy (IORT) with a mobile dedicated LINear ACcelerator (LINAC), which does not make use of a Treatment Planning System. IORT is carried out with electron beams and refers to the application of radiation during a surgical intervention, after the removal of a neoplastic mass; it can also be used as a one-time/stand-alone treatment in initial cancer of small volume. IORT foresees a single session and a single beam only; therefore it is necessary to use protection systems (disks) temporarily positioned between the target volume and the underlying tissues, along the beam axis. A single high-Z shielding disk is used to stop the electrons of the beam at a certain depth and protect the tissues located below. Electron backscatter produces an enhancement in the dose above the disk, and this can be reduced if a second low-Z disk is placed above the first. Therefore two protection disks are used in clinical application. On the other hand, the dose enhancement at the interface of the high-Z disk and the target, due to backscattered radiation, can be usefully exploited to improve uniformity in the treatment of thicker target volumes. Furthermore, the dose above disks of different Z materials has to be evaluated in order to study the optimal combination of shielding disks that allows both protection of the underlying tissues and the most uniform dose distribution in target volumes of different thicknesses. The dose enhancement can be evaluated using the electron

  6. Total effective dose equivalent assessment after exposure to high-level natural radiation using the RESRAD code.

    PubMed

    Ziajahromi, Shima; Khanizadeh, Meysam; Nejadkoorki, Farhad

    2014-03-01

    The current work reports the activity concentrations of several natural radionuclides ((226)Ra, (232)Th, and (40)K) in the Khak-Sefid area of Ramsar, Iran. An evaluation of the total effective dose equivalent (TEDE) from exposure to high-level natural radiation is also presented. Soil samples were analyzed using a high-purity germanium detector with 80 % relative efficiency. The TEDE was calculated on a land area of 40,000 m(2) with a 1.5-m thickness of contaminated zone for members of three critical groups (farmer, construction worker, and resident) using the Residual Radioactive Material Guidelines (RESRAD) modeling program. It was found that the mean activity concentrations (in Bq/kg) were 23,118 ± 468, 25.8 ± 2.3, and 402.6 ± 16.5 for (226)Ra, (232)Th, and (40)K, respectively. The maximum calculated TEDE during 1,000 years was 107.1 mSv/year at year 90, 92.42 mSv/year at year 88, and 22.09 mSv/year at year 46 for the farmer, resident, and construction worker scenarios, respectively. The maximum TEDE in the farmer scenario can be reduced below the dose limit of 1 mSv/year, which is protective of public health, by covering the contaminated zone with soil at least 50 cm thick. According to the RESRAD prediction, the TEDE received by individuals in all exposure scenarios considerably exceeds the set dose limit, mainly due to (226)Ra.
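For a rough cross-check of why (226)Ra dominates, the reported activity concentrations can be converted to an external gamma dose rate with the standard UNSCEAR (2000) coefficients; note this simplified external-exposure estimate is not the multi-pathway RESRAD TEDE.

```python
def outdoor_dose_rate(c_ra, c_th, c_k):
    """Absorbed dose rate in air 1 m above ground, nGy/h.

    Uses the UNSCEAR (2000) conversion coefficients for 226Ra, 232Th,
    and 40K activity concentrations given in Bq/kg.
    """
    return 0.462 * c_ra + 0.604 * c_th + 0.0417 * c_k

def annual_effective_dose_msv(dose_rate_ngy_h, occupancy=0.2):
    """Outdoor annual effective dose in mSv/y (0.7 Sv/Gy, 8760 h/y)."""
    return dose_rate_ngy_h * 8760 * occupancy * 0.7 * 1e-6

# Mean activities reported for Khak-Sefid (Bq/kg)
d = outdoor_dose_rate(23118, 25.8, 402.6)
annual = annual_effective_dose_msv(d)
```

The 226Ra term contributes over 99% of the dose rate here, consistent with the abstract's conclusion, though the absolute value differs from the pathway-resolved RESRAD result.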

  7. Spacecraft Solar Particle Event (SPE) Shielding: Shielding Effectiveness as a Function of SPE model as Determined with the FLUKA Radiation Transport Code

    NASA Technical Reports Server (NTRS)

    Koontz, Steve; Atwell, William; Reddell, Brandon; Rojdev, Kristina

    2010-01-01

    Analysis of both satellite and surface neutron monitor data demonstrates that the widely utilized Exponential model of solar particle event (SPE) proton kinetic energy spectra can seriously underestimate SPE proton flux, especially at the highest kinetic energies. The more recently developed Band model produces better agreement with neutron monitor data for ground level events (GLEs) and is believed to be considerably more accurate at high kinetic energies. Here, we report the results of modeling and simulation studies in which the radiation transport code FLUKA (FLUktuierende KAskade) is used to determine the changes in total ionizing dose (TID) and single-event effect (SEE) environments behind aluminum, polyethylene, carbon, and titanium shielding masses when the assumed form (i.e., Band or Exponential) of the solar particle event (SPE) kinetic energy spectra is changed. FLUKA simulations are fully three-dimensional, with an isotropic particle flux incident on a concentric spherical shell shielding mass and detector structure. The effects are reported for both energetic primary protons penetrating the shield mass and secondary particle showers caused by energetic primary protons colliding with shielding mass nuclei. Our results, in agreement with previous studies, show that use of the Exponential form of the event

  8. Radiation Therapy: Additional Treatment Options

    MedlinePlus

    ... What is Radiation Therapy? ... This is referred to as immunotherapy. Radiation therapy given during surgery is called intraoperative radiation therapy ...

  9. A study of longwave radiation codes for climate studies: Validation with ARM observations and tests in general circulation models. Final report, September 15, 1990--October 31, 1994

    SciTech Connect

    Ellingson, R.G.; Baer, F.

    1998-09-01

    DOE has launched a major initiative -- the Atmospheric Radiation Measurements (ARM) Program -- directed at improving the parameterization of the physics governing cloud and radiative processes in general circulation models (GCMs). One specific goal of ARM is to improve the treatment of radiative transfer in GCMs under clear-sky, general overcast and broken cloud conditions. In 1990, the authors proposed to contribute to this goal by attacking major problems connected with one of the dominant radiation components of the problem -- longwave radiation. In particular, their long-term research goals are to: develop an optimum longwave radiation model for use in GCMs that has been calibrated with state-of-the-art observations, assess the impact of the longwave radiative forcing in a GCM, determine the sensitivity of a GCM to the radiative model used in it, and determine how the longwave radiative forcing contributes relatively when compared to shortwave radiative forcing, sensible heating, thermal advection and expansion.

  10. Clinical coding. Code breakers.

    PubMed

    Mathieson, Steve

    2005-02-24

    --The advent of payment by results has seen the role of the clinical coder pushed to the fore in England. --Examinations for a clinical coding qualification began in 1999. In 2004, approximately 200 people took the qualification. --Trusts are attracting people to the role by offering training from scratch or through modern apprenticeships.

  11. MO-G-BRE-05: Clinical Process Improvement and Billing in Radiation Oncology: A Case Study of Applying FMEA for CPT Code 77336 (continuing Medical Physics Consultation)

    SciTech Connect

    Spirydovich, S; Huq, M

    2014-06-15

    Purpose: The improvement of quality in healthcare can be assessed by Failure Mode and Effects Analysis (FMEA). In radiation oncology, FMEA, as applied to the billing CPT code 77336, can improve both charge capture and, most importantly, the quality of the performed services. Methods: We created an FMEA table for the process performed under CPT code 77336. For a given process step, each member of the assembled team (physicist, dosimetrist, and therapist) independently assigned numerical values for: probability of occurrence (O, 1–10), severity (S, 1–10), and probability of detection (D, 1–10) for every failure mode cause and effect combination. The risk priority number, RPN, was then calculated as the product of O, S and D, from which an average RPN was calculated for each combination mentioned above. A fault tree diagram, with each process sorted into 6 categories, was created with linked RPNs. For processes with high RPN, recommended actions were assigned. Two separate record-and-verify (R&V) systems (Lantis and the EMR-based ARIA) were considered. Results: We identified 9 potential failure modes and 19 corresponding potential causes of these failure modes, all resulting in an unjustified 77336 charge and compromised quality of care. In Lantis, the range of RPN was 24.5–110.8, and of S values 2–10. The highest-ranking RPN of 110.8 came from the failure mode described as “end-of-treatment check not done before the completion of treatment”, and the highest S value of 10 (RPN=105) from “overrides not checked”. For the same failure modes, within the ARIA electronic environment with its additional controls, RPN values were significantly lower (44.3 for the missing end-of-treatment check and 20.0 for overrides not checked). Conclusion: Our work has shown that missed charge capture also meant that some services were not performed. The absence of such necessary services may result in sub-optimal quality of care rendered to patients.
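The RPN arithmetic described in the Methods can be sketched directly; the three raters' scores below are hypothetical, not the study's actual ratings.

```python
def average_rpn(scores):
    """Mean risk priority number over team members' independent ratings.

    scores: list of (O, S, D) tuples, each on a 1-10 scale, one per rater.
    RPN = O * S * D is computed per rater, then averaged across raters.
    """
    rpns = [o * s * d for o, s, d in scores]
    return sum(rpns) / len(rpns)

# Hypothetical ratings by physicist, dosimetrist, and therapist for one
# failure-mode cause/effect combination.
ratings = [(4, 9, 3), (5, 10, 2), (4, 8, 3)]
rpn = average_rpn(ratings)
```

Combinations whose averaged RPN exceeds a chosen threshold would then receive recommended actions, as in the fault-tree step above.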

  12. Optical colours and spectral indices of z = 0.1 eagle galaxies with the 3D dust radiative transfer code skirt

    NASA Astrophysics Data System (ADS)

    Trayford, James W.; Camps, Peter; Theuns, Tom; Baes, Maarten; Bower, Richard G.; Crain, Robert A.; Gunawardhana, Madusha L. P.; Schaller, Matthieu; Schaye, Joop; Frenk, Carlos S.

    2017-09-01

    We present mock optical images, broad-band and H α fluxes, and D4000 spectral indices for 30 145 galaxies from the eagle hydrodynamical simulation at redshift z = 0.1, modelling dust with the skirt Monte Carlo radiative transfer code. The modelling includes a subgrid prescription for dusty star-forming regions, with both the subgrid obscuration of these regions and the fraction of metals in diffuse interstellar dust calibrated against far-infrared fluxes of local galaxies. The predicted optical colours as a function of stellar mass agree well with observation, with the skirt model showing marked improvement over a simple dust-screen model. The orientation dependence of attenuation is weaker than observed because eagle galaxies are generally puffier than real galaxies, due to the pressure floor imposed on the interstellar medium (ISM). The mock H α luminosity function agrees reasonably well with the data, and we quantify the extent to which dust obscuration affects observed H α fluxes. The distribution of D4000 break values is bimodal, as observed. In the simulation, 20 per cent of galaxies deemed 'passive' for the skirt model, i.e. exhibiting D4000 > 1.8, are classified 'active' when ISM dust attenuation is not included. The fraction of galaxies with stellar mass greater than 10^10 M⊙ that are deemed passive is slightly smaller than observed, which is due to low levels of residual star formation in these simulated galaxies. Colour images, fluxes and spectra of eagle galaxies are to be made available through the public eagle data base.

  13. Radiator technology

    NASA Technical Reports Server (NTRS)

    Juhasz, Albert J.

    1993-01-01

    Radiator technology is discussed in the context of the Civilian Space Technology Initiative's (CSTI's) high capacity power-thermal management project. The CSTI project is a subset of a project to develop a piloted Mars nuclear electric propulsion (NEP) vehicle. The following topics are presented in vugraph form: advanced radiator concepts; heat pipe codes and testing; composite materials; radiator design and integration; and surface morphology.

  14. Spacecraft Solar Particle Event (SPE) Shielding: Shielding Effectiveness as a Function of SPE Model as Determined with the FLUKA Radiation Transport Code

    NASA Astrophysics Data System (ADS)

    Koontz, S. L.; Atwell, W. A.; Reddell, B.; Rojdev, K.

    2010-12-01

    In this paper, we report the results of modeling and simulation studies in which the radiation transport code FLUKA (FLUktuierende KAskade) is used to determine the changes in total ionizing dose (TID) and single-event effect (SEE) environments behind aluminum, polyethylene, carbon, and titanium shielding masses when the assumed form (i.e., Band or Exponential) of the solar particle event (SPE) kinetic energy spectra is changed. FLUKA simulations are fully three dimensional with an isotropic particle flux incident on a concentric spherical shell shielding mass and detector structure. FLUKA is a fully integrated and extensively verified Monte Carlo simulation package for the interaction and transport of high-energy particles and nuclei in matter. The effects are reported of both energetic primary protons penetrating the shield mass and secondary particle showers caused by energetic primary protons colliding with shielding mass nuclei. SPE heavy ion spectra are not addressed. Our results, in agreement with previous studies, show that use of the Exponential form of the event spectra can seriously underestimate spacecraft SPE TID and SEE environments in some, but not all, shielding mass cases. The SPE spectra investigated are taken from four specific SPEs that produced ground-level events (GLEs) during solar cycle 23 (1997-2008). GLEs are produced by highly energetic solar particle (ESP) events, i.e., those that contain significant fluences of 700 MeV to 10 GeV protons. Highly energetic SPEs are implicated in increased rates of spacecraft anomalies and spacecraft failures. High-energy protons interact with Earth’s atmosphere via nuclear reactions to produce secondary particles, some of which are neutrons that can be detected at the Earth’s surface by the global neutron monitor network. GLEs are one part of the overall SPE resulting from a particular solar flare or coronal mass ejection event on the sun. The ESP part of the particle event, detected by spacecraft
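The qualitative difference between the two spectral forms can be illustrated with a toy integral-fluence model; the functional form of the "Band-like" tail and every parameter below are invented for illustration and are not the fits used in the study.

```python
import math

def exponential_spectrum(e_mev, j0, e0):
    """Integral fluence above energy E for a purely exponential spectrum."""
    return j0 * math.exp(-e_mev / e0)

def band_like_spectrum(e_mev, j0, e0, gamma, e_break):
    """Exponential below a break energy, power-law tail above it
    (a simplified stand-in for a Band-type double power-law fit)."""
    if e_mev <= e_break:
        return j0 * math.exp(-e_mev / e0)
    j_break = j0 * math.exp(-e_break / e0)
    return j_break * (e_mev / e_break) ** (-gamma)

# Hypothetical parameters: the two forms agree below 100 MeV.
exp_1gev = exponential_spectrum(1000.0, 1e9, 30.0)
band_1gev = band_like_spectrum(1000.0, 1e9, 30.0, 3.0, 100.0)
ratio = band_1gev / exp_1gev   # many orders of magnitude in this toy case
```

Because TID and SEE rates behind thick shields are driven by exactly these high-energy protons, the underestimation the abstract describes follows directly from the exponential tail falling off too fast.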

  15. Superluminal Labview Code

    SciTech Connect

    Wheat, Robert; Marksteiner, Quinn; Quenzer, Jonathan; Higginson, Ian

    2012-03-26

    This labview code is used to set the phase and amplitudes on the 72 antennas of the superluminal machine, and to map out the radiation pattern from the superluminal antenna. Each antenna radiates a modulated signal consisting of two separate frequencies, in the range of 2 GHz to 2.8 GHz. The phases and amplitudes from each antenna are controlled by a pair of AD8349 vector modulators (VMs). These VMs set the phase and amplitude of a high frequency signal using a set of four DC inputs, which are controlled by Linear Technologies LTC1990 digital to analog converters (DACs). The labview code controls these DACs through an 8051 microcontroller. This code also monitors the phases and amplitudes of the 72 channels. Near each antenna, there is a coupler that channels a portion of the power into a binary network. Through a labview controlled switching array, any of the 72 coupled signals can be channeled into the Tektronix TDS 7404 digital oscilloscope. Then the labview code takes an FFT of the signal and compares it to the FFT of a reference signal in the oscilloscope to determine the magnitude and phase of each sideband of the signal. The code compensates for phase and amplitude errors introduced by differences in cable lengths. The labview code sets each of the 72 elements to a user determined phase and amplitude. For each element, the code runs an iterative procedure, where it adjusts the DACs until the correct phases and amplitudes have been reached.
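The phase/amplitude-to-DC-input mapping performed by the vector modulators amounts to an I/Q decomposition; the sketch below shows the generic math only, not the AD8349's actual input scaling.

```python
import math

def phase_amp_to_iq(amplitude, phase_deg):
    """Convert a desired amplitude and phase into the I/Q baseband pair
    that a vector modulator's DC inputs would encode."""
    phase = math.radians(phase_deg)
    return amplitude * math.cos(phase), amplitude * math.sin(phase)

def iq_to_phase_amp(i, q):
    """Recover amplitude and phase (degrees) from an I/Q pair, as one
    would from the FFT of a monitored channel."""
    return math.hypot(i, q), math.degrees(math.atan2(q, i))

# Round trip: set a channel, then read it back from its I/Q representation.
i, q = phase_amp_to_iq(0.8, 45.0)
amp, phase = iq_to_phase_amp(i, q)
```

An iterative controller like the one described above would repeat this read-back, compare against the user's target, and nudge the DAC values until the measured phase and amplitude converge.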

  16. Combustion chamber analysis code

    NASA Technical Reports Server (NTRS)

    Przekwas, A. J.; Lai, Y. G.; Krishnan, A.; Avva, R. K.; Giridharan, M. G.

    1993-01-01

    A three-dimensional, time dependent, Favre averaged, finite volume Navier-Stokes code has been developed to model compressible and incompressible flows (with and without chemical reactions) in liquid rocket engines. The code has a non-staggered formulation with generalized body-fitted-coordinates (BFC) capability. Higher order differencing methodologies such as MUSCL and Osher-Chakravarthy schemes are available. Turbulent flows can be modeled using any of the five turbulent models present in the code. A two-phase, two-liquid, Lagrangian spray model has been incorporated into the code. Chemical equilibrium and finite rate reaction models are available to model chemically reacting flows. The discrete ordinate method is used to model effects of thermal radiation. The code has been validated extensively against benchmark experimental data and has been applied to model flows in several propulsion system components of the SSME and the STME.

  17. Electrical Circuit Simulation Code

    SciTech Connect

    Wix, Steven D.; Waters, Arlon J.; Shirley, David

    2001-08-09

    Massively-Parallel Electrical Circuit Simulation Code. CHILESPICE is a massively-parallel distributed-memory electrical circuit simulation tool that contains many enhanced radiation, time-based, and thermal features and models. Large-scale electronic circuit simulation. Shared memory, parallel processing, enhanced convergence. Sandia-specific device models.

  18. Ethical coding.

    PubMed

    Resnik, Barry I

    2009-01-01

    It is ethical, legal, and proper for a dermatologist to maximize income through proper coding of patient encounters and procedures. The overzealous physician can misinterpret reimbursement requirements or receive bad advice from other physicians and cross the line from aggressive coding to coding fraud. Several of the more common problem areas are discussed.

  19. Uplink Coding

    NASA Technical Reports Server (NTRS)

    Pollara, Fabrizio; Hamkins, Jon; Dolinar, Sam; Andrews, Ken; Divsalar, Dariush

    2006-01-01

    This viewgraph presentation reviews uplink coding. The purpose and goals of the briefing are (1) Show a plan for using uplink coding and describe benefits (2) Define possible solutions and their applicability to different types of uplink, including emergency uplink (3) Concur with our conclusions so we can embark on a plan to use proposed uplink system (4) Identify the need for the development of appropriate technology and infusion in the DSN (5) Gain advocacy to implement uplink coding in flight projects Action Item EMB04-1-14 -- Show a plan for using uplink coding, including showing where it is useful or not (include discussion of emergency uplink coding).

  20. Comparison of measured responses in two spectrally-sensitive X-ray detectors to predictions obtained using the its radiation transport code

    SciTech Connect

    Carlson, G.A.; Beutler, D.E.; Seager, K.D.; Knott, D.P.

    1988-12-01

    Responses of a Ge detector and a filtered TLD array detector have been measured at a steady-state bremsstrahlung source (the Pelletron), at endpoint energies from 150 to 900 keV. Predictions of detector response using Monte Carlo ITS codes are found to be in excellent agreement with measured responses for both detectors. These results extend the range of validity of the ITS codes. With calibration provided by these experiments and by ITS predictions, dose-depth data from the TLD arrays can be used to estimate flash X-ray source endpoint energies.

  1. Comparison of measured responses in two spectrally-sensitive x-ray detectors to predictions obtained using the ITS (Integrated Tiger Series) radiation transport code

    SciTech Connect

    Carlson, G.A.; Beutler, D.E.; Seager, K.D.; Knott, D.P.

    1988-01-01

    Responses of a Ge detector and a filtered TLD array detector have been measured at a steady-state bremsstrahlung source (the Pelletron), at endpoint energies from 150 to 900 keV. Predictions of detector response using Monte Carlo ITS codes are found to be in excellent agreement with measured response for both detectors. These results extend the range of validity of the ITS codes. With calibration provided by these experiments and by ITS predictions, dose-depth data from the TLD arrays can be used to estimate flash x-ray source endpoint energies.

  2. Overview of the FLUKA code

    NASA Astrophysics Data System (ADS)

    Battistoni, Giuseppe; Boehlen, Till; Cerutti, Francesco; Chin, Pik Wai; Salvatore Esposito, Luigi; Fassò, Alberto; Ferrari, Alfredo; Mereghetti, Alessio; Garcia Ortega, Pablo; Ranft, Johannes; Roesler, Stefan; Sala, Paola R.; Vlachoudis, Vasilis

    2014-06-01

    The capabilities and physics models implemented inside the FLUKA code are briefly described, with emphasis on hadronic interactions. Examples of the performance of the code are presented, including basic (thin-target) and complex benchmarks, and radiation-detector-specific applications. In particular, the ability of FLUKA to describe existing calorimeter performance and to predict that of future calorimeters, as well as the use of the code for neutron and mixed-field radiation detectors, is demonstrated with several examples.

  3. Development of rotating shadowband spectral radiometers and GCM radiation code test data sets in support of ARM. Technical progress report, September 15, 1992--October 31, 1993

    SciTech Connect

    Michalsky, J.; Harrison, L.

    1993-04-30

    The ARM goal is to help improve both longwave and shortwave models by providing improved radiometric shortwave data. These data can be used directly to test shortwave model predictions. As described below, they can also provide inferred values for aerosol and cloud properties that are useful for longwave modeling efforts as well. The current ARM research program includes three tasks, all related to the study of shortwave radiation transfer through clouds and aerosols. Two of the tasks involve the assembly of archived and new radiation and meteorological data sets; the third and dominant task has been the development and use of new shortwave radiometric sensors. Archived data from Golden, Colorado, and Albany, New York, were combined with National Weather Service ground and upper-air data for testing radiation models for the era when the Earth Radiation Budget Experiment (ERBE) was operational. These data do not include optimum surface radiation measurements; consequently we are acquiring downwelling shortwave, including direct and diffuse irradiance, plus downwelling longwave, upwelling shortwave, and aerosol optical depth, at our own institution, as an additional dataset for ARM modelers.

  4. Intercomparison of Monte Carlo radiation transport codes to model TEPC response in low-energy neutron and gamma-ray fields.

    PubMed

    Ali, F; Waker, A J; Waller, E J

    2014-10-01

    Tissue-equivalent proportional counters (TEPC) can potentially be used as a portable and personal dosemeter in mixed neutron and gamma-ray fields, but what hinders this use is their typically large physical size. To formulate compact TEPC designs, the use of a Monte Carlo transport code is necessary to predict the performance of compact designs in these fields. To perform this modelling, three candidate codes were assessed: MCNPX 2.7.E, FLUKA 2011.2 and PHITS 2.24. In each code, benchmark simulations were performed involving the irradiation of a 5-in. TEPC with monoenergetic neutron fields and a 4-in. wall-less TEPC with monoenergetic gamma-ray fields. The frequency and dose mean lineal energies and dose distributions calculated from each code were compared with experimentally determined data. For the neutron benchmark simulations, PHITS produces data closest to the experimental values and for the gamma-ray benchmark simulations, FLUKA yields data closest to the experimentally determined quantities.
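
The frequency-mean and dose-mean lineal energies compared in this benchmark are simple moments of the measured distribution: y_F is the frequency-weighted mean and y_D is the second moment divided by the first. A minimal sketch of that bookkeeping for binned data (the function name is illustrative, not from any of the codes assessed):

```python
import numpy as np

def lineal_energy_means(y, f):
    """Frequency-mean (y_F) and dose-mean (y_D) lineal energies from a
    binned distribution: lineal-energy values y and their frequencies f(y)."""
    y = np.asarray(y, dtype=float)
    f = np.asarray(f, dtype=float)
    y_f = np.sum(y * f) / np.sum(f)            # frequency-mean lineal energy
    y_d = np.sum(y**2 * f) / np.sum(y * f)     # dose-mean lineal energy
    return y_f, y_d
```

For a flat distribution over y = 1, 2, 3 this gives y_F = 2 and y_D = 14/6, illustrating that the dose mean is always pulled toward the larger events.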

  5. Sharing code.

    PubMed

    Kubilius, Jonas

    2014-01-01

    Sharing code is becoming increasingly important in the wake of Open Science. In this review I describe and compare two popular code-sharing utilities, GitHub and Open Science Framework (OSF). GitHub is a mature, industry-standard tool but is not geared towards researchers. In comparison, OSF offers a one-stop solution for researchers, but much of its functionality is still under development. I conclude by listing alternative, lesser-known tools for code and materials sharing.

  6. The FLUKA Code: An Overview

    NASA Technical Reports Server (NTRS)

    Ballarini, F.; Battistoni, G.; Campanella, M.; Carboni, M.; Cerutti, F.; Empl, A.; Fasso, A.; Ferrari, A.; Gadioli, E.; Garzelli, M. V.; hide

    2006-01-01

    FLUKA is a multipurpose Monte Carlo code which can transport a variety of particles over a wide energy range in complex geometries. The code is a joint project of INFN and CERN: part of its development is also supported by the University of Houston and NASA. FLUKA is successfully applied in several fields, including but not only, particle physics, cosmic ray physics, dosimetry, radioprotection, hadron therapy, space radiation, accelerator design and neutronics. The code is the standard tool used at CERN for dosimetry, radioprotection and beam-machine interaction studies. Here we give a glimpse into the code physics models with a particular emphasis to the hadronic and nuclear sector.

  7. The FLUKA Code: an Overview

    SciTech Connect

    Ballarini, F.; Battistoni, G.; Campanella, M.; Carboni, M.; Cerutti, F.; Empl, A.; Fasso, A.; Ferrari, A.; Gadioli, E.; Garzelli, M.V.; Lantz, M.; Liotta, M.; Mairani, A.; Mostacci, A.; Muraro, S.; Ottolenghi, A.; Pelliccioni, M.; Pinsky, L.; Ranft, J.; Roesler, S.; Sala, P.R.; /Milan U. /INFN, Milan /Pavia U. /INFN, Pavia /CERN /Siegen U. /Houston U. /SLAC /Frascati /NASA, Houston /ENEA, Frascati

    2005-11-09

    FLUKA is a multipurpose Monte Carlo code which can transport a variety of particles over a wide energy range in complex geometries. The code is a joint project of INFN and CERN: part of its development is also supported by the University of Houston and NASA. FLUKA is successfully applied in several fields, including but not only, particle physics, cosmic ray physics, dosimetry, radioprotection, hadron therapy, space radiation, accelerator design and neutronics. The code is the standard tool used at CERN for dosimetry, radioprotection and beam-machine interaction studies. Here we give a glimpse into the code physics models with a particular emphasis to the hadronic and nuclear sector.

  8. CTI Correction Code

    NASA Astrophysics Data System (ADS)

    Massey, Richard; Stoughton, Chris; Leauthaud, Alexie; Rhodes, Jason; Koekemoer, Anton; Ellis, Richard; Shaghoulian, Edgar

    2013-07-01

    Charge Transfer Inefficiency (CTI) due to radiation damage above the Earth's atmosphere creates spurious trailing in images from Charge-Coupled Device (CCD) imaging detectors. Radiation damage also creates unrelated warm pixels, which can be used to measure CTI. This code provides pixel-based correction for CTI and has proven effective in Hubble Space Telescope Advanced Camera for Surveys raw images, successfully reducing the CTI trails by a factor of ~30 everywhere in the CCD and at all flux levels. The core is written in java for speed, and a front-end user interface is provided in IDL. The code operates on raw data by returning individual electrons to pixels from which they were unintentionally dragged during readout. Correction takes about 25 minutes per ACS exposure, but is trivially parallelisable to multiple processors.
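
The pixel-based strategy described here, returning electrons to the pixels from which readout dragged them, amounts to inverting a forward model of trailing. A toy one-column sketch under assumed parameters (the trail fraction, function names, and fixed-point iteration are illustrative, not the actual java/IDL implementation):

```python
import numpy as np

TRAIL_FRACTION = 0.02  # assumed capture probability per transfer (illustrative)

def add_cti_trail(column):
    """Toy readout model: each transfer leaves a fixed fraction of a pixel's
    charge behind, which re-emerges in the following (trailing) pixel."""
    out = np.array(column, dtype=float)
    for i in range(len(out) - 1):
        captured = TRAIL_FRACTION * out[i]
        out[i] -= captured
        out[i + 1] += captured
    return out

def correct_cti(observed, n_iter=10):
    """Iteratively invert the forward model: refine an estimate of the true
    image until re-trailing the estimate reproduces the observed column."""
    estimate = np.array(observed, dtype=float)
    for _ in range(n_iter):
        estimate += observed - add_cti_trail(estimate)
    return estimate
```

Because the trailing operator is close to the identity, this fixed-point iteration converges quickly, which is why correction of a full exposure can be done in minutes.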

  9. Confocal coded aperture imaging

    DOEpatents

    Tobin, Jr., Kenneth William; Thomas, Jr., Clarence E.

    2001-01-01

    A method for imaging a target volume comprises the steps of: radiating a small bandwidth of energy toward the target volume; focusing the small bandwidth of energy into a beam; moving the target volume through a plurality of positions within the focused beam; collecting a beam of energy scattered from the target volume with a non-diffractive confocal coded aperture; generating a shadow image of said aperture from every point source of radiation in the target volume; and, reconstructing the shadow image into a 3-dimensional image of the every point source by mathematically correlating the shadow image with a digital or analog version of the coded aperture. The method can comprise the step of collecting the beam of energy scattered from the target volume with a Fresnel zone plate.
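
The decoding step in the final claim, mathematically correlating the recorded shadow with a version of the coded aperture, can be sketched in one dimension: each point source casts a shifted copy of the mask, and correlation with the mask recovers the source position. The mask pattern and names below are illustrative, not the patented design:

```python
import numpy as np

aperture = np.array([1, 0, 1, 1, 0, 1, 0])  # assumed binary mask pattern

def shadow_of(source):
    """Each point source projects a shifted, scaled copy of the aperture."""
    shadow = np.zeros(len(source) + len(aperture) - 1)
    for pos, strength in enumerate(source):
        shadow[pos:pos + len(aperture)] += strength * aperture
    return shadow

def reconstruct(shadow):
    """Decode by correlating the shadow image with the aperture pattern;
    the correlation peaks at each source position."""
    return np.correlate(shadow, aperture, mode='valid')
```

For a single point source the correlation peak sits exactly at the source position; well-chosen mask patterns keep the sidelobes of the autocorrelation low.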

  10. The Phantom SPH code

    NASA Astrophysics Data System (ADS)

    Price, Daniel; Wurster, James; Nixon, Chris

    2016-05-01

    I will present the capabilities of the Phantom SPH code for global simulations of dust and gas in protoplanetary discs. I will present our new algorithms for simulating both small and large grains in discs, as well as our progress towards simulating evolving grain populations and coupling with radiation. Finally, I will discuss our recent applications to HL Tau and the physics of dust gap opening.

  11. DNA codes

    SciTech Connect

    Torney, D. C.

    2001-01-01

    We have begun to characterize a variety of codes, motivated by potential implementation as (quaternary) DNA n-sequences, with letters denoted A, C, G, and T. The first codes we studied are the most reminiscent of conventional group codes. For these codes, Hamming similarity was generalized so that the score for matched letters takes more than one value, depending upon which letters are matched [2]. These codes consist of n-sequences satisfying an upper bound on the similarities, summed over the letter positions, of distinct codewords. We chose similarity 2 for matches of the letters A and T and 3 for matches of the letters C and G, providing a rough approximation to double-strand bond energies in DNA. An inherent novelty of DNA codes is 'reverse complementation'. The latter may be defined, as follows, not only for alphabets of size four but, more generally, for any even-size alphabet. All that is required is a matching of the letters of the alphabet: a partition into pairs. Then, the reverse complement of a codeword is obtained by reversing the order of its letters and replacing each letter by its match. For DNA, the matching is AT/CG because these are the Watson-Crick bonding pairs. Reversal arises because two DNA sequences form a double strand with opposite relative orientations. Thus, as will be described in detail, because in vitro decoding involves the formation of double-stranded DNA from two codewords, it is reasonable to assume - for universal applicability - that the reverse complement of any codeword is also a codeword. In particular, self-reverse complementary codewords are expressly forbidden in reverse-complement codes. Thus, an appropriate distance between all pairs of codewords must, when large, effectively prohibit the respective codewords from binding to form a double strand. Only reverse-complement pairs of codewords should be able to bind. For most applications, a DNA code is to be bi-partitioned, such that the reverse-complementary pairs are separated
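
The reverse-complement operation and the weighted similarity described above are easy to make concrete. A minimal sketch (the match scores 2 for A/T and 3 for C/G are the values stated in the abstract; the function names are illustrative):

```python
# Watson-Crick complementation and the weighted Hamming similarity
# used to approximate double-strand bond energies.
COMPLEMENT = {'A': 'T', 'T': 'A', 'C': 'G', 'G': 'C'}
MATCH_SCORE = {'A': 2, 'T': 2, 'C': 3, 'G': 3}  # scores from the abstract

def reverse_complement(word):
    """Reverse the letter order, then replace each letter by its match."""
    return ''.join(COMPLEMENT[ch] for ch in reversed(word))

def similarity(u, v):
    """Weighted Hamming similarity: matched positions score by letter type."""
    return sum(MATCH_SCORE[a] for a, b in zip(u, v) if a == b)
```

Note that a word such as ACGT equals its own reverse complement; it is exactly such self-reverse-complementary codewords that these codes expressly forbid.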

  12. Inner Radiation Belt Representation of the Energetic Electron Environment: Model and Data Synthesis Using the Salammbo Radiation Belt Transport Code and Los Alamos Geosynchronous and GPS Energetic Particle Data

    NASA Technical Reports Server (NTRS)

    Friedel, R. H. W.; Bourdarie, S.; Fennell, J.; Kanekal, S.; Cayton, T. E.

    2004-01-01

    The highly energetic electron environment in the inner magnetosphere (GEO inward) has received a lot of research attention in recent years, as the dynamics of relativistic electron acceleration and transport are not yet fully understood. These electrons can cause deep dielectric charging in any space hardware in the MEO to GEO region. We use a new and novel approach to obtain a global representation of the inner magnetospheric energetic electron environment, which can reproduce the absolute environment (flux) for any spacecraft orbit in that region to within a factor of 2 for the energy range of 100 keV to 5 MeV electrons, for any level of magnetospheric activity. We combine the extensive set of inner magnetospheric energetic electron observations available at Los Alamos with the physics-based Salammbo transport code, using the data assimilation technique of "nudging". This in effect inputs in-situ data into the code and allows the diffusion mechanisms in the code to interpolate the data into regions and times with no data available. We present here details of the methods used, both in the data assimilation process and in the necessary inter-calibration of the input data. We will present sample runs of the model/data code and compare the results to test-spacecraft data not used in the data assimilation process.
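
The "nudging" technique mentioned above relaxes the modelled electron distribution toward in-situ data wherever data exist, leaving the model's diffusion terms to interpolate elsewhere. A schematic sketch (all names and the relaxation timescale are illustrative, not Salammbo's actual interface):

```python
import numpy as np

def nudge(model_state, obs, obs_mask, dt, tau=1.0):
    """One nudging step: where observations exist (obs_mask True), relax the
    modelled value toward the observed value on timescale tau; elsewhere
    leave the model state untouched so its physics fills the gaps."""
    nudged = np.array(model_state, dtype=float)
    nudged[obs_mask] += (obs[obs_mask] - nudged[obs_mask]) * (dt / tau)
    return nudged
```

Applied every timestep inside the transport loop, this keeps the simulation pinned to the spacecraft data while the diffusion operators spread that information through the rest of phase space.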

  13. Improved outer boundary conditions for outer radiation belt data assimilation using THEMIS-SST data and the Salammbo-EnKF code

    NASA Astrophysics Data System (ADS)

    Maget, V.; Sicard-Piet, A.; Bourdarie, S.; Lazaro, D.; Turner, D. L.; Daglis, I. A.; Sandberg, I.

    2015-07-01

    Over the last decade, efforts have been made in the radiation belt community to develop data assimilation tools in order to improve the accuracy of radiation belt models. In this paper we present a new method to correctly take into account the outer boundary conditions at L* = 8 in such an enhanced model of the radiation belts. To do that we based our work on the Time History of Events and Macroscale Interactions during Substorms/Solid State Telescope data set. Statistics are developed to define a consistent electron distribution at L* = 8 (in both equatorial pitch angle and energy), and a variance-covariance matrix is estimated in order to more realistically drive the Monte Carlo sampling required by the Ensemble Kalman Filter (EnKF). Data processing is first described, along with the caveats avoided, and then the use of this information in a machinery such as the EnKF is described. It is shown that the way the Monte Carlo simulations are performed is of great importance for realistically reproducing the outer boundary distribution needed by the physics-based Salammbô model. Finally, EnKF simulations are performed and compared for September 2011 in order to analyze the improvements gained using this new method of defining outer boundary conditions. In particular, we highlight in this study that such a method provides great improvement in the reconstruction of the dynamics observed at geosynchronous orbit, both during quiet and active magnetic conditions.
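
For context, the analysis step of a stochastic Ensemble Kalman Filter, the machinery whose Monte Carlo sampling the paper drives with its estimated variance-covariance matrix, can be sketched as follows. This is a generic textbook EnKF update, not the Salammbo-EnKF implementation:

```python
import numpy as np

def enkf_update(ensemble, obs, obs_cov, H):
    """Stochastic EnKF analysis step.
    ensemble: (n_members, n_state); obs: (n_obs,);
    obs_cov: observation variance-covariance matrix; H: observation operator."""
    rng = np.random.default_rng(0)
    n, _ = ensemble.shape
    X = ensemble - ensemble.mean(axis=0)          # ensemble anomalies
    P = X.T @ X / (n - 1)                         # sample forecast covariance
    S = H @ P @ H.T + obs_cov                     # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)                # Kalman gain
    # Perturb the observations for each member by Monte Carlo sampling the
    # variance-covariance matrix (the step the abstract emphasises).
    perturbed = obs + rng.multivariate_normal(np.zeros(len(obs)), obs_cov, size=n)
    innovations = perturbed - ensemble @ H.T
    return ensemble + innovations @ K.T
```

With an accurate observation and a spread-out prior ensemble, the updated ensemble mean is pulled almost entirely onto the observation, which is the behaviour the boundary-condition statistics are meant to exploit.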

  14. Sharing code

    PubMed Central

    Kubilius, Jonas

    2014-01-01

    Sharing code is becoming increasingly important in the wake of Open Science. In this review I describe and compare two popular code-sharing utilities, GitHub and Open Science Framework (OSF). GitHub is a mature, industry-standard tool but is not geared towards researchers. In comparison, OSF offers a one-stop solution for researchers, but much of its functionality is still under development. I conclude by listing alternative, lesser-known tools for code and materials sharing. PMID:25165519

  15. Investigation of the Performance of Various CVD Diamond Crystal Qualities for the Measurement of Radiation Doses from a Low Energy Mammography X-Ray Beam, Compared with MC Code (PENELOPE) Calculations

    NASA Astrophysics Data System (ADS)

    Zakari, Y. I.; Mavunda, R. D.; Nam, T. L.; Keddy, R. J.

    The tissue equivalence of diamond allows for accurate radiation dose determination without large corrections for different attenuation values in biological tissue, but its low Z value limits this advantage to lower-energy photons, such as those in mammography X-ray beams. This paper assays the performance of nine Chemical Vapour Deposition (CVD) diamonds for use as radiation-sensing material. The specimens, fabricated in wafer form, are classified as detector grade, optical grade and single crystal. It is well known that the presence of defects in diamonds, including CVD specimens, not only dictates but also affects the response of diamond to radiation in different ways. In this investigation, tools such as electron spin resonance (ESR), thermoluminescence (TL), Raman spectroscopy and ultraviolet (UV) spectroscopy were used to probe each of the samples. The linearity, sensitivity and other characteristics of the detector response to photon interaction were analyzed from the I-V characteristics. The diamonds, categorized into four each of the so-called detector and optical grades plus a single-crystal CVD specimen, were exposed to a low X-ray peak-voltage range (22 to 27 kVp) with trans-crystal polarizing fields of 0.4 kV.cm-1, 0.66 kV.cm-1 and 0.8 kV.cm-1. The presentation discusses the presence of defects identifiable by the techniques used and correlates the radiation performance of the three types of crystals with their presence. The choice of a wafer as either a spectrometer or an X-ray dosimeter within the selected energy range was made. The analysis was validated with the Monte Carlo code PENELOPE.

  16. Alpha particles at energies of 10 MeV to 1 TeV: conversion coefficients for fluence-to-absorbed dose, effective dose, and gray equivalent, calculated using Monte Carlo radiation transport code MCNPX 2.7.A.

    PubMed

    Copeland, Kyle; Parker, Donald E; Friedberg, Wallace

    2010-03-01

    Conversion coefficients have been calculated for fluence to absorbed dose, fluence to effective dose and fluence to gray equivalent, for isotropic exposure to alpha particles in the energy range of 10 MeV to 1 TeV (0.01-1000 GeV). The coefficients were calculated using Monte Carlo transport code MCNPX 2.7.A and BodyBuilder 1.3 anthropomorphic phantoms modified to allow calculation of effective dose to a Reference Person using tissues and tissue weighting factors from 1990 and 2007 recommendations of the International Commission on Radiological Protection (ICRP) and gray equivalent to selected tissues as recommended by the National Council on Radiation Protection and Measurements. Coefficients for effective dose are within 30 % of those calculated using ICRP 1990 recommendations.
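
The effective-dose bookkeeping behind such coefficients is the ICRP tissue-weighted sum E = Σ_T w_T · H_T over equivalent doses H_T to the phantom's tissues. A minimal sketch using a small illustrative subset of the ICRP Publication 103 (2007) weighting factors, not the full set the phantom calculations require:

```python
# Illustrative subset of ICRP 103 tissue weighting factors w_T.
# The full recommendation assigns weights to many more tissues,
# summing to 1 over the whole body.
W_T = {'lung': 0.12, 'stomach': 0.12, 'gonads': 0.08, 'thyroid': 0.04}

def effective_dose(tissue_doses):
    """Effective dose E (Sv) from per-tissue equivalent doses H_T (Sv):
    E = sum over tissues of w_T * H_T."""
    return sum(W_T[tissue] * h_t for tissue, h_t in tissue_doses.items())
```

Because the 1990 and 2007 recommendations differ mainly in the set of weights w_T (and the tissues they cover), the same phantom run yields two sets of coefficients, which is exactly the comparison the abstract reports.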

  17. Fluence to absorbed dose, effective dose and gray equivalent conversion coefficients for iron nuclei from 10 MeV to 1 TeV, calculated using Monte Carlo radiation transport code MCNPX 2.7.A.

    PubMed

    Copeland, Kyle; Parker, Donald E; Friedberg, Wallace

    2010-03-01

    Conversion coefficients have been calculated for fluence-to-absorbed dose, fluence-to-effective dose and fluence-to-gray equivalent for isotropic exposure of an adult male and an adult female to (56)Fe(26+) in the energy range of 10 MeV to 1 TeV (0.01-1000 GeV). The coefficients were calculated using Monte Carlo transport code MCNPX 2.7.A and BodyBuilder 1.3 anthropomorphic phantoms modified to allow calculation of effective dose using tissues and tissue weighting factors from either the 1990 or 2007 recommendations of the International Commission on Radiological Protection (ICRP) and gray equivalent to selected tissues as recommended by the National Council on Radiation Protection and Measurements. Calculations using ICRP 2007 recommendations result in fluence-to-effective dose conversion coefficients that are almost identical at most energies to those calculated using ICRP 1990 recommendations.

  18. Speech coding

    SciTech Connect

    Ravishankar, C., Hughes Network Systems, Germantown, MD

    1998-05-08

    Speech is the predominant means of communication between human beings, and since the invention of the telephone by Alexander Graham Bell in 1876, speech services have remained the core service in almost all telecommunication systems. Original analog methods of telephony had the disadvantage of the speech signal getting corrupted by noise, cross-talk and distortion. Long-haul transmissions, which use repeaters to compensate for the loss in signal strength on transmission links, also increase the associated noise and distortion. On the other hand, digital transmission is relatively immune to noise, cross-talk and distortion, primarily because of the capability to faithfully regenerate the digital signal at each repeater purely on the basis of a binary decision. Hence the end-to-end performance of the digital link essentially becomes independent of the length and operating frequency bands of the link, and from a transmission point of view digital transmission has been the preferred approach due to its higher immunity to noise. The need to carry digital speech became extremely important from a service-provision point of view as well. Modern requirements have introduced the need for robust, flexible and secure services that can carry a multitude of signal types (such as voice, data and video) without a fundamental change in infrastructure. Such a requirement could not have been easily met without the advent of digital transmission systems, thereby requiring speech to be coded digitally. The term speech coding often refers to techniques that represent or code speech signals either directly as a waveform or as a set of parameters obtained by analyzing the speech signal. In either case, the codes are transmitted to the distant end, where speech is reconstructed or synthesized using the received set of codes. A more generic term that is applicable to these techniques and is often used interchangeably with speech coding is the term voice coding. This term is more generic in the sense that the
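
As a concrete example of the waveform-coding family described above, logarithmic companding compresses each sample before uniform quantisation so that quiet passages keep more resolution; μ-law companding, standardised for telephony PCM in ITU-T G.711, is the classic case. A self-contained sketch of the compressor/expander pair (continuous form, without the 8-bit quantiser a real codec would add):

```python
import math

MU = 255  # mu-law parameter used in G.711 (North American / Japanese PCM)

def mu_law_compress(x):
    """Compand a sample in [-1, 1]: logarithmic compression that allocates
    more of the quantiser's range to small amplitudes."""
    return math.copysign(math.log1p(MU * abs(x)) / math.log1p(MU), x)

def mu_law_expand(y):
    """Inverse companding applied at the receiving end."""
    return math.copysign(math.expm1(abs(y) * math.log1p(MU)) / MU, y)
```

In a real G.711 channel the companded value is quantised to 8 bits between these two steps; without the quantiser the pair is an exact inverse.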

  19. Deuterons at energies of 10 MeV to 1 TeV: conversion coefficients for fluence-to-absorbed dose, equivalent dose, effective dose and gray equivalent, calculated using Monte Carlo radiation transport code MCNPX 2.7.C.

    PubMed

    Copeland, Kyle; Parker, Donald E; Friedberg, Wallace

    2011-01-01

    Conversion coefficients were calculated for fluence-to-absorbed dose, fluence-to-equivalent dose, fluence-to-effective dose and fluence-to-gray equivalent for isotropic exposure of an adult female and an adult male to deuterons ((2)H(+)) in the energy range 10 MeV-1 TeV (0.01-1000 GeV). Coefficients were calculated using the Monte Carlo transport code MCNPX 2.7.C and BodyBuilder™ 1.3 anthropomorphic phantoms. Phantoms were modified to allow calculation of the effective dose to a Reference Person using tissues and tissue weighting factors from 1990 and 2007 recommendations of the International Commission on Radiological Protection (ICRP) and gray equivalent to selected tissues as recommended by the National Council on Radiation Protection and Measurements. Coefficients for the equivalent and effective dose incorporated a radiation weighting factor of 2. At 15 of 19 energies for which coefficients for the effective dose were calculated, coefficients based on ICRP 1990 and 2007 recommendations differed by <3%. The greatest difference, 47%, occurred at 30 MeV.

  20. Nature's Code

    NASA Astrophysics Data System (ADS)

    Hill, Vanessa J.; Rowlands, Peter

    2008-10-01

    We propose that the mathematical structures related to the `universal rewrite system' define a universal process applicable to Nature, which we may describe as `Nature's code'. We draw attention here to such concepts as 4 basic units, 64- and 20-unit structures, symmetry-breaking and 5-fold symmetry, chirality, double 3-dimensionality, the double helix, the Van der Waals force and the harmonic oscillator mechanism, and our explanation of how they necessarily lead to self-aggregation, complexity and emergence in higher-order systems. Biological concepts, such as translation, transcription, replication, the genetic code and the grouping of amino acids appear to be driven by fundamental processes of this kind, and it would seem that the Platonic solids, pentagonal symmetry and Fibonacci numbers have significant roles in organizing `Nature's code'.

  1. Show Code.

    PubMed

    Shalev, Daniel

    2017-01-01

    "Let's get one thing straight: there is no such thing as a show code," my attending asserted, pausing for effect. "You either try to resuscitate, or you don't. None of this halfway junk." He spoke so loudly that the two off-service consultants huddled at computers at the end of the unit looked up… We did four rounds of compressions and pushed epinephrine twice. It was not a long code. We did good, strong compressions and coded this man in earnest until the end. Toward the final round, though, as I stepped up to do compressions, my attending looked at me in a deep way. It was a look in between willing me as some object under his command and revealing to me everything that lay within his brash, confident surface but could not be spoken. © 2017 The Hastings Center.

  2. Radiation transport calculations for cosmic radiation.

    PubMed

    Endo, A; Sato, T

    2012-01-01

    The radiation environment inside and near spacecraft consists of various components of primary radiation in space and secondary radiation produced by the interaction of the primary radiation with the walls and equipment of the spacecraft. Radiation fields inside astronauts are different from those outside them, because of the body's self-shielding as well as the nuclear fragmentation reactions occurring in the human body. Several computer codes have been developed to simulate the physical processes of the coupled transport of protons, high-charge and high-energy nuclei, and the secondary radiation produced in atomic and nuclear collision processes in matter. These computer codes have been used in various space radiation protection applications: shielding design for spacecraft and planetary habitats, simulation of instrument and detector responses, analysis of absorbed doses and quality factors in organs and tissues, and study of biological effects. This paper focuses on the methods and computer codes used for radiation transport calculations on cosmic radiation, and their application to the analysis of radiation fields inside spacecraft, evaluation of organ doses in the human body, and calculation of dose conversion coefficients using the reference phantoms defined in ICRP Publication 110. Copyright © 2012. Published by Elsevier Ltd.

  3. An Overview of the Monte Carlo Methods, Codes, & Applications Group

    SciTech Connect

    Trahan, Travis John

    2016-08-30

    This report sketches the work of the Group to deliver first-principle Monte Carlo methods, production quality codes, and radiation transport-based computational and experimental assessments using the codes MCNP and MCATK for such applications as criticality safety, non-proliferation, nuclear energy, nuclear threat reduction and response, radiation detection and measurement, radiation health protection, and stockpile stewardship.

  4. QR Codes

    ERIC Educational Resources Information Center

    Lai, Hsin-Chih; Chang, Chun-Yen; Li, Wen-Shiane; Fan, Yu-Lin; Wu, Ying-Tien

    2013-01-01

    This study presents an m-learning method that incorporates Integrated Quick Response (QR) codes. This learning method not only achieves the objectives of outdoor education, but it also increases applications of Cognitive Theory of Multimedia Learning (CTML) (Mayer, 2001) in m-learning for practical use in a diverse range of outdoor locations. When…

  6. Uplink Coding

    NASA Technical Reports Server (NTRS)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the objectives, meeting goals and overall NASA goals for the NASA Data Standards Working Group. The presentation includes information on the technical progress surrounding the objective, short LDPC codes, and the general results on the Pu-Pw tradeoff.

  7. Development of the Code RITRACKS

    NASA Technical Reports Server (NTRS)

    Plante, Ianik; Cucinotta, Francis A.

    2013-01-01

    A document discusses the code RITRACKS (Relativistic Ion Tracks), which was developed to simulate heavy ion track structure at the microscopic and nanoscopic scales. It is a Monte-Carlo code that simulates the production of radiolytic species in water, event-by-event, and which may be used to simulate tracks and also to calculate dose in targets and voxels of different sizes. The dose deposited by the radiation can be calculated in nanovolumes (voxels). RITRACKS allows simulation of radiation tracks without the need of extensive knowledge of computer programming or Monte-Carlo simulations. It is installed as a regular application on Windows systems. The main input parameters entered by the user are the type and energy of the ion, the length and size of the irradiated volume, the number of ions impacting the volume, and the number of histories. The simulation can be started after the input parameters are entered in the GUI. The number of each kind of interactions for each track is shown in the result details window. The tracks can be visualized in 3D after the simulation is complete. It is also possible to see the time evolution of the tracks and zoom on specific parts of the tracks. The software RITRACKS can be very useful for radiation scientists to investigate various problems in the fields of radiation physics, radiation chemistry, and radiation biology. For example, it can be used to simulate electron ejection experiments (radiation physics).

  8. Studies of acute and chronic radiation injury at the Biological and Medical Research Division, Argonne National Laboratory, 1953-1970: Description of individual studies, data files, codes, and summaries of significant findings

    SciTech Connect

    Grahn, D.; Fox, C.; Wright, B.J.; Carnes, B.A.

    1994-05-01

    Between 1953 and 1970, studies on the long-term effects of external x-ray and γ irradiation on inbred and hybrid mouse stocks were carried out at the Biological and Medical Research Division, Argonne National Laboratory. The results of these studies, plus the mating, litter, and pre-experimental stock records, were routinely coded on IBM cards for statistical analysis and record maintenance. Also retained were the survival data from studies performed in the period 1943-1953 at the National Cancer Institute, National Institutes of Health, Bethesda, Maryland. The card-image data files have been corrected where necessary and refiled on hard disks for long-term storage and ease of accessibility. In this report, the individual studies and data files are described, and pertinent factors regarding caging, husbandry, radiation procedures, choice of animals, and other logistical details are summarized. Some of the findings are also presented. Descriptions of the different mouse stocks and hybrids are included in an appendix; more than three dozen stocks were involved in these studies. Two other appendices detail the data files in their original card-image format and the numerical codes used to describe the animal's exit from an experiment and, for some studies, any associated pathologic findings. Tabular summaries of sample sizes, dose levels, and other variables are also given to assist investigators in their selection of data for analysis. The archive is open to any investigator with legitimate interests and a willingness to collaborate and acknowledge the source of the data and to recognize appropriate conditions or caveats.

  9. Spaceflight Validation of Hzetrn Code

    NASA Technical Reports Server (NTRS)

    Wilson, J. W.; Shinn, J. L.; Singleterry, R. C.; Badavi, F. F.; Badhwar, G. D.; Reitz, G.; Beaujean, R.; Cucinotta, F. A.

    1999-01-01

    HZETRN is being developed as a fast deterministic radiation transport code applicable to neutrons, protons, and multiply charged ions in the space environment. It was recently applied to 50 hours of IMP8 data measured during the August 4, 1972 solar event to map the hourly exposures within the human body under several shield configurations. This calculation required only 18 hours on a VAX 4000 machine. A similar calculation using the Monte Carlo method would have required two years of dedicated computer time. The code has been benchmarked against well-documented and tested Monte Carlo proton transport codes with good success. The code will allow important trade studies to be made with relative ease due to the computational speed and will be useful in assessing design alternatives in an integrated system software environment. Since there are no well tested Monte Carlo codes for HZE particles, we have been engaged in flight validation of the HZETRN results. To date we have made comparisons with TEPC, CR-39, charged-particle telescopes, and Bonner spheres. This broad range of detectors allows us to test a number of functions related to differing physical processes which add to the complicated radiation fields within a spacecraft or the human body, functions that can be calculated by the HZETRN code system. In the present report we will review these results.

  10. FAST GYROSYNCHROTRON CODES

    SciTech Connect

    Fleishman, Gregory D.; Kuznetsov, Alexey A.

    2010-10-01

    Radiation produced by charged particles gyrating in a magnetic field is highly significant in the astrophysics context. Persistently increasing resolution of astrophysical observations calls for corresponding three-dimensional modeling of the radiation. However, available exact equations are prohibitively slow in computing a comprehensive table of high-resolution models required for many practical applications. To remedy this situation, we develop approximate gyrosynchrotron (GS) codes capable of quickly calculating the GS emission (in non-quantum regime) from both isotropic and anisotropic electron distributions in non-relativistic, mildly relativistic, and ultrarelativistic energy domains applicable throughout a broad range of source parameters including dense or tenuous plasmas and weak or strong magnetic fields. The computation time is reduced by several orders of magnitude compared with the exact GS algorithm. The new algorithm performance can gradually be adjusted to the user's needs depending on whether precision or computation speed is to be optimized for a given model. The codes are made available for users as a supplement to this paper.

  11. Tritons at energies of 10 MeV to 1 TeV: conversion coefficients for fluence-to-absorbed dose, equivalent dose, effective dose and gray equivalent, calculated using Monte Carlo radiation transport code MCNPX 2.7.C.

    PubMed

    Copeland, Kyle; Parker, Donald E; Friedberg, Wallace

    2010-12-01

    Conversion coefficients were calculated for fluence-to-absorbed dose, fluence-to-equivalent dose, fluence-to-effective dose and fluence-to-gray equivalent for isotropic exposure of an adult female and an adult male to tritons ((3)H(+)) in the energy range of 10 MeV to 1 TeV (0.01-1000 GeV). Coefficients were calculated using Monte Carlo transport code MCNPX 2.7.C and BodyBuilder™ 1.3 anthropomorphic phantoms. Phantoms were modified to allow calculation of effective dose to a Reference Person using tissues and tissue weighting factors from 1990 and 2007 recommendations of the International Commission on Radiological Protection (ICRP) and calculation of gray equivalent to selected tissues as recommended by the National Council on Radiation Protection and Measurements. At 15 of the 19 energies for which coefficients for effective dose were calculated, coefficients based on ICRP 2007 and 1990 recommendations differed by less than 3%. The greatest difference, 43%, occurred at 30 MeV.
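The fluence-to-effective-dose coefficients described above rest on the ICRP recipe of tissue-weighted, sex-averaged equivalent doses. A minimal sketch of that weighting step (the tissue names and weights below are illustrative stand-ins, not the full ICRP 1990 or 2007 sets):

```python
def effective_dose(h_male, h_female, w_tissue):
    """ICRP-style effective dose: sex-average the equivalent dose for each
    tissue, weight by the tissue weighting factor, and sum over tissues."""
    return sum(w * 0.5 * (h_male[t] + h_female[t]) for t, w in w_tissue.items())


# Illustrative inputs: equivalent doses (Sv) per tissue for each phantom
w = {"lung": 0.12, "skin": 0.01}
hm = {"lung": 2.0, "skin": 1.0}
hf = {"lung": 4.0, "skin": 1.0}
e = effective_dose(hm, hf, w)
```

Dividing such an effective dose by the incident fluence yields a fluence-to-effective-dose conversion coefficient of the kind tabulated in the paper.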

  12. Helions at energies of 10 MeV to 1 TeV: conversion coefficients for fluence-to-absorbed dose, equivalent dose, effective dose and gray equivalent, calculated using Monte Carlo radiation transport code MCNPX 2.7.C.

    PubMed

    Copeland, Kyle; Parker, Donald E; Friedberg, Wallace

    2010-12-01

    Conversion coefficients were calculated for fluence-to-absorbed dose, fluence-to-equivalent dose, fluence-to-effective dose and fluence-to-gray equivalent, for isotropic exposure of an adult male and an adult female to helions ((3)He(2+)) in the energy range of 10 MeV to 1 TeV (0.01-1000 GeV). Calculations were performed using Monte Carlo transport code MCNPX 2.7.C and BodyBuilder™ 1.3 anthropomorphic phantoms modified to allow calculation of effective dose using tissues and tissue weighting factors from either the 1990 or 2007 recommendations of the International Commission on Radiological Protection (ICRP), and gray equivalent to selected tissues as recommended by the National Council on Radiation Protection and Measurements. At 15 of the 19 energies for which coefficients for effective dose were calculated, coefficients based on ICRP 2007 and 1990 recommendations differed by less than 2%. The greatest difference, 62%, occurred at 100 MeV.

  13. The Integrated TIGER Series Codes

    SciTech Connect

    Kensek, Ronald P.; Franke, Brian C.; Laub, Thomas W.

    2006-01-15

    ITS is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. Our goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The ease with which the makefile system is applied combines with an input scheme based on order-independent descriptive keywords that makes maximum use of defaults and internal error checking to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend the capabilities of the codes to more complex applications. Version 5.0, the latest version of ITS, contains (1) improvements to the ITS 3.0 continuous-energy codes, (2) multigroup codes with adjoint transport capabilities, (3) parallel implementations of all ITS codes, (4) a general purpose geometry engine for linking with CAD or other geometry formats, and (5) the Cholla facet geometry library. Moreover, the general user friendliness of the software has been enhanced through increased internal error checking and improved code portability.

  14. Simultaneous Variational Retrievals of Temperature, Humidity, Surface and Cloud Properties from Satellite and Airborne Hyperspectral Infrared Sounder Data using the Havemann-Taylor Fast Radiative Transfer Code (HT-FRTC) as the Forward Model Operator

    NASA Astrophysics Data System (ADS)

    Havemann, S.; Thelen, J. C.; Harlow, R. C.

    2016-12-01

    Full scattering radiative transfer simulations for hyperspectral infrared and shortwave sounders are essential in order to be able to extract the maximal information content from these instruments for cloudy scenes and those with significant aerosol loading, but have been rarely done because of the high computational demands. The Havemann-Taylor Fast Radiative Transfer Code works in Principal Component space, reducing the computational demand by orders of magnitude, thereby making fast simultaneous retrievals of vertical profiles of temperature and humidity, surface temperature and emissivity as well as cloud and aerosol properties feasible. Results of successful retrievals using IASI sounder data as well as data taken during flights of the Airborne Research Interferometer Evaluation System (ARIES) on board the FAAM BAe 146 aircraft will be presented. These will demonstrate that the use of all the instrument channels in PC space can provide valuable information both on temperature and humidity profiles relevant for NWP and on the cirrus cloud properties at the same time. There is very significant information on the humidity profile below semi-transparent cirrus to be gained from IR sounder data. The retrieved ice water content is in good agreement with airborne in-situ measurements during Lagrangian spiral descents. In addition to the full scattering calculations, the HT-FRTC has also been trained with a fast approximation to the scattering problem which reduces it to a clear-sky calculation but with a modified extinction (Chou scaling). Chou scaling is a reasonable approximation in the infrared but is very poor where the solar contribution becomes significant. The comparison of the retrieval performance with the full scattering solution and the Chou scaling solution in the forward model operator for infrared sounders shows that temperature and humidity profiles are only marginally degraded by the use of the Chou scaling approximation. Retrievals of the specific

  15. Error-correction coding

    NASA Technical Reports Server (NTRS)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.
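The interleaved RS(255,223) outer code mentioned above relies on an interleaver to spread a burst of channel errors across many codewords, so each codeword sees only a few symbol errors. A minimal block-interleaver sketch (the depth and length arguments are illustrative; real systems use standardized interleaving depths):

```python
def interleave(symbols, depth, length):
    # Write symbols row-by-row into a depth x length array, read column-by-
    # column. A burst of up to `depth` consecutive channel errors then
    # touches each row (each RS codeword) at most once.
    assert len(symbols) == depth * length
    rows = [symbols[r * length:(r + 1) * length] for r in range(depth)]
    return [rows[r][c] for c in range(length) for r in range(depth)]

def deinterleave(symbols, depth, length):
    # Inverse permutation: scatter the column-ordered stream back into rows.
    out = [None] * (depth * length)
    for i, s in enumerate(symbols):
        c, r = divmod(i, depth)
        out[r * length + c] = s
    return out
```

For example, with depth 3 the first three interleaved symbols come from three different codewords, so a burst hitting all three costs each codeword only one symbol.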

  16. TU-EF-304-10: Efficient Multiscale Simulation of the Proton Relative Biological Effectiveness (RBE) for DNA Double Strand Break (DSB) Induction and Bio-Effective Dose in the FLUKA Monte Carlo Radiation Transport Code

    SciTech Connect

    Moskvin, V; Tsiamas, P; Axente, M; Farr, J; Stewart, R

    2015-06-15

    Purpose: One of the more critical initiating events for reproductive cell death is the creation of a DNA double strand break (DSB). In this study, we present a computationally efficient way to determine spatial variations in the relative biological effectiveness (RBE) of proton therapy beams within the FLUKA Monte Carlo (MC) code. Methods: We used the independently tested Monte Carlo Damage Simulation (MCDS) developed by Stewart and colleagues (Radiat. Res. 176, 587–602, 2011) to estimate the RBE for DSB induction of monoenergetic protons, tritium, deuterium, helium-3, helium-4 ions and delta-electrons. The dose-weighted RBE coefficients were incorporated into FLUKA to determine the equivalent {sup 60}Co γ-ray dose for representative proton beams incident on cells in an aerobic and anoxic environment. Results: We found that the proton beam RBE for DSB induction at the tip of the Bragg peak, including primary and secondary particles, is close to 1.2. Furthermore, the RBE increases laterally from the beam axis in the area of the Bragg peak. At the distal edge, the RBE is in the range 1.3–1.4 for cells irradiated under aerobic conditions and may be as large as 1.5–1.8 for cells irradiated under anoxic conditions. Across the plateau region, the recorded RBE for DSB induction is 1.02 for aerobic cells and 1.05 for cells irradiated under anoxic conditions. The contribution to total effective dose from secondary heavy ions decreases with depth and is higher at shallow depths (e.g., at the surface of the skin). Conclusion: Multiscale simulation of the RBE for DSB induction provides useful insights into spatial variations in proton RBE within pristine Bragg peaks. This methodology is potentially useful for the biological optimization of proton therapy for the treatment of cancer. The study highlights the need to incorporate spatial variations in proton RBE into proton therapy treatment plans.
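Folding per-species doses into a single reference-radiation-equivalent dose, as done when RBE coefficients are incorporated into a transport code, amounts to a dose-weighted sum. A minimal sketch (the component RBE and dose values below are made up for illustration):

```python
def rbe_weighted_dose(components):
    """Equivalent (reference-radiation) dose as a dose-weighted sum over
    particle components: D_eq = sum_i RBE_i * D_i."""
    return sum(rbe * dose for rbe, dose in components)

def dose_averaged_rbe(components):
    # Overall RBE of the mixed field = equivalent dose / total physical dose
    total = sum(dose for _, dose in components)
    return rbe_weighted_dose(components) / total


# Illustrative mixed field: (RBE, dose in Gy) for primaries and secondaries
field = [(1.0, 1.8), (1.4, 0.2)]
```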

  17. Electromagnetic particle simulation codes

    NASA Technical Reports Server (NTRS)

    Pritchett, P. L.

    1985-01-01

    Electromagnetic particle simulations solve the full set of Maxwell's equations. They thus include the effects of self-consistent electric and magnetic fields, magnetic induction, and electromagnetic radiation. The algorithms for an electromagnetic code which works directly with the electric and magnetic fields are described. The fields and current are separated into transverse and longitudinal components. The transverse E and B fields are integrated in time using a leapfrog scheme applied to the Fourier components. The particle pushing is performed via the relativistic Lorentz force equation for the particle momentum. As an example, simulation results are presented for the electron cyclotron maser instability which illustrate the importance of relativistic effects on the wave-particle resonance condition and on wave dispersion.
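The relativistic Lorentz-force push described above can be sketched as a single explicit momentum update. Production particle-in-cell codes use a time-centered (Boris-style) rotation rather than this forward-Euler form, which only illustrates the force law; units for charge, mass, and c are the caller's choice:

```python
import math

def lorentz_push(p, E, B, q, m, dt, c=1.0):
    """One explicit step of dp/dt = q (E + v x B), with v = p / (gamma m)."""
    gamma = math.sqrt(1.0 + sum(pi * pi for pi in p) / (m * c) ** 2)
    v = [pi / (gamma * m) for pi in p]
    vxB = [v[1] * B[2] - v[2] * B[1],
           v[2] * B[0] - v[0] * B[2],
           v[0] * B[1] - v[1] * B[0]]
    return [p[i] + q * dt * (E[i] + vxB[i]) for i in range(3)]
```

The gamma factor in the velocity is what shifts the wave-particle resonance condition, the relativistic effect the abstract highlights for the cyclotron maser instability.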

  18. The stellar atmosphere simulation code Bifrost. Code description and validation

    NASA Astrophysics Data System (ADS)

    Gudiksen, B. V.; Carlsson, M.; Hansteen, V. H.; Hayek, W.; Leenaarts, J.; Martínez-Sykora, J.

    2011-07-01

    Context. Numerical simulations of stellar convection and photospheres have been developed to the point where detailed shapes of observed spectral lines can be explained. Stellar atmospheres are very complex, and very different physical regimes are present in the convection zone, photosphere, chromosphere, transition region and corona. To understand the details of the atmosphere it is necessary to simulate the whole atmosphere since the different layers interact strongly. These physical regimes are very diverse and it takes a highly efficient massively parallel numerical code to solve the associated equations. Aims: The design, implementation and validation of the massively parallel numerical code Bifrost for simulating stellar atmospheres from the convection zone to the corona. Methods: The code is subjected to a number of validation tests, among them the Sod shock tube test, the Orszag-Tang colliding shock test, boundary condition tests and tests of how the code treats magnetic field advection, chromospheric radiation, radiative transfer in an isothermal scattering atmosphere, hydrogen ionization and thermal conduction. Results: Bifrost completes the tests with good results and shows near linear efficiency scaling to thousands of computing cores.

  19. Radiation load to the SNAP CCD

    SciTech Connect

    Mokhov, N. V.; Rakhno, I. L.; Striganov, S. I.

    2003-08-14

    Results of an express Monte Carlo analysis with the MARS14 code of the radiation load to the CCD optical detectors in the Supernova Acceleration Project (SNAP) mission are presented for a realistic radiation environment over the satellite orbit.

  20. Homological stabilizer codes

    SciTech Connect

    Anderson, Jonas T.

    2013-03-15

    In this paper we define homological stabilizer codes on qubits which encompass codes such as Kitaev's toric code and the topological color codes. These codes are defined solely by the graphs they reside on. This feature allows us to use properties of topological graph theory to determine the graphs which are suitable as homological stabilizer codes. We then show that all toric codes are equivalent to homological stabilizer codes on 4-valent graphs. We show that the topological color codes and toric codes correspond to two distinct classes of graphs. We define the notion of label set equivalencies and show that under a small set of constraints the only homological stabilizer codes without local logical operators are equivalent to Kitaev's toric code or to the topological color codes. - Highlights: ▶ We show that Kitaev's toric codes are equivalent to homological stabilizer codes on 4-valent graphs. ▶ We show that toric codes and color codes correspond to homological stabilizer codes on distinct graphs. ▶ We find and classify all 2D homological stabilizer codes. ▶ We find optimal codes among the homological stabilizer codes.
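The claim that toric codes live on 4-valent graphs can be made concrete by listing stabilizer supports on an L x L torus. A sketch (the edge indexing is a choice made here for illustration, not taken from the paper):

```python
def toric_code_stabilizers(L):
    """Vertex (X-type) and plaquette (Z-type) stabilizer supports of
    Kitaev's toric code on an L x L torus; every vertex is 4-valent."""
    H = lambda x, y: (x % L) + L * (y % L)           # horizontal edge ids
    V = lambda x, y: L * L + (x % L) + L * (y % L)   # vertical edge ids
    stars, plaquettes = [], []
    for x in range(L):
        for y in range(L):
            # star: the 4 edges incident to vertex (x, y)
            stars.append({H(x, y), H(x - 1, y), V(x, y), V(x, y - 1)})
            # plaquette: the 4 edges bounding the face at (x, y)
            plaquettes.append({H(x, y), H(x, y + 1), V(x, y), V(x + 1, y)})
    return stars, plaquettes
```

Every star and plaquette overlap on an even number of edges, so the X- and Z-type stabilizers commute; with 2L² qubits and 2L² stabilizers subject to two dependencies, two logical qubits remain.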

  1. How Should I Care for Myself During Radiation Therapy?

    MedlinePlus

    How Should I Care for Myself During Radiation Therapy? Get plenty of rest. Many patients experience ...

  2. Model Children's Code.

    ERIC Educational Resources Information Center

    New Mexico Univ., Albuquerque. American Indian Law Center.

    The Model Children's Code was developed to provide a legally correct model code that American Indian tribes can use to enact children's codes that fulfill their legal, cultural and economic needs. Code sections cover the court system, jurisdiction, juvenile offender procedures, minor-in-need-of-care, and termination. Almost every Code section is…

  3. Coding of Neuroinfectious Diseases.

    PubMed

    Barkley, Gregory L

    2015-12-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  4. Diagnostic Coding for Epilepsy.

    PubMed

    Williams, Korwyn; Nuwer, Marc R; Buchhalter, Jeffrey R

    2016-02-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  5. Phylogeny of genetic codes and punctuation codes within genetic codes.

    PubMed

    Seligmann, Hervé

    2015-03-01

    Punctuation codons (starts, stops) delimit genes and reflect translation apparatus properties. Most codon reassignments involve punctuation. Here two complementary approaches classify natural genetic codes: (A) properties of amino acids assigned to codons (classical phylogeny), coding stops as X (A1, antitermination/suppressor tRNAs insert unknown residues), or as gaps (A2, no translation, classical stop); and (B) considering only punctuation status (start, stop and other codons coded as -1, 0 and 1 (B1); 0, -1 and 1 (B2, reflects ribosomal translational dynamics); and 1, -1, and 0 (B3, starts/stops as opposites)). All methods separate most mitochondrial codes from most nuclear codes; Gracilibacteria consistently cluster with metazoan mitochondria; mitochondria co-hosted with chloroplasts cluster with nuclear codes. Method A1 clusters the euplotid nuclear code with metazoan mitochondria; A2 separates euplotids from mitochondria. Firmicute bacteria Mycoplasma/Spiroplasma and protozoan (and lower metazoan) mitochondria share codon-amino acid assignments. A1 clusters them with mitochondria; they cluster with the standard genetic code under A2: constraints on amino acid ambiguity versus punctuation-signaling produced the mitochondrial versus bacterial versions of this genetic code. Punctuation analysis B2 converges best with classical phylogenetic analyses, stressing the need for a unified theory of genetic code punctuation accounting for ribosomal constraints.
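Method B's punctuation-only encoding can be sketched directly: each codon is reduced to its punctuation status, and the resulting numeric vectors are what the classification compares across codes. A minimal sketch of the B1 labeling (start, stop, other as -1, 0, 1); the start/stop sets are supplied by the caller:

```python
def punctuation_vector(codons, starts, stops, labels=(-1, 0, 1)):
    """Encode each codon by punctuation status only (method B1 style):
    start -> labels[0], stop -> labels[1], all other codons -> labels[2]."""
    start_l, stop_l, other_l = labels
    return [start_l if c in starts else stop_l if c in stops else other_l
            for c in codons]
```

Swapping the `labels` tuple to (0, -1, 1) or (1, -1, 0) gives the B2 and B3 variants described in the abstract.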

  6. To Code or Not To Code?

    ERIC Educational Resources Information Center

    Parkinson, Brian; Sandhu, Parveen; Lacorte, Manel; Gourlay, Lesley

    1998-01-01

    This article considers arguments for and against the use of coding systems in classroom-based language research and touches on some relevant considerations from ethnographic and conversational analysis approaches. The four authors each explain and elaborate on their practical decision to code or not to code events or utterances at a specific point…

  7. Bar Code Reader

    NASA Astrophysics Data System (ADS)

    Clair, Jean J.

    1980-05-01

    The bar code system will be used in every market and supermarket. The code, which is standardised in the US and Europe (the EAN code), gives information on price, storage and nature of goods, and allows real-time management of the shop.

  8. International assessment of PCA codes

    SciTech Connect

    Neymotin, L.; Lui, C.; Glynn, J.; Archarya, S.

    1993-11-01

    Over the past three years (1991-1993), an extensive international exercise for intercomparison of a group of six Probabilistic Consequence Assessment (PCA) codes was undertaken. The exercise was jointly sponsored by the Commission of European Communities (CEC) and OECD Nuclear Energy Agency. This exercise was a logical continuation of a similar effort undertaken by OECD/NEA/CSNI in 1979-1981. The PCA codes are currently used by different countries for predicting radiological health and economic consequences of severe accidents at nuclear power plants (and certain types of non-reactor nuclear facilities) resulting in releases of radioactive materials into the atmosphere. The codes participating in the exercise were: ARANO (Finland), CONDOR (UK), COSYMA (CEC), LENA (Sweden), MACCS (USA), and OSCAAR (Japan). In parallel with this inter-code comparison effort, two separate groups performed a similar set of calculations using two of the participating codes, MACCS and COSYMA. Results of the intercode and inter-MACCS comparisons are presented in this paper. The MACCS group included four participants: GREECE: Institute of Nuclear Technology and Radiation Protection, NCSR Demokritos; ITALY: ENEL, ENEA/DISP, and ENEA/NUC-RIN; SPAIN: Universidad Politecnica de Madrid (UPM) and Consejo de Seguridad Nuclear; USA: Brookhaven National Laboratory, US NRC and DOE.

  9. Generalized concatenated quantum codes

    SciTech Connect

    Grassl, Markus; Shor, Peter; Smith, Graeme; Smolin, John; Zeng Bei

    2009-05-15

    We discuss the concept of generalized concatenated quantum codes. This generalized concatenation method provides a systematical way for constructing good quantum codes, both stabilizer codes and nonadditive codes. Using this method, we construct families of single-error-correcting nonadditive quantum codes, in both binary and nonbinary cases, which not only outperform any stabilizer codes for finite block length but also asymptotically meet the quantum Hamming bound for large block length.
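The quantum Hamming bound mentioned above can be checked numerically: an ((n, K, d)) code correcting t = (d-1)/2 errors must satisfy K·Σ_{j≤t} 3^j C(n, j) ≤ 2^n. A small checker using that standard form:

```python
from math import comb

def meets_quantum_hamming_bound(n, K, d):
    """True if an ((n, K, d)) quantum code satisfies the quantum Hamming
    bound: K * sum_{j=0}^{t} 3**j * C(n, j) <= 2**n, with t = (d - 1) // 2."""
    t = (d - 1) // 2
    return K * sum(3 ** j * comb(n, j) for j in range(t + 1)) <= 2 ** n
```

For example, the perfect [[5,1,3]] code (n=5, K=2, d=3) meets the bound with equality: 2·(1 + 3·5) = 32 = 2⁵.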

  10. Power System Optimization Codes Modified

    NASA Technical Reports Server (NTRS)

    Juhasz, Albert J.

    1999-01-01

    A major modification of and addition to existing Closed Brayton Cycle (CBC) space power system optimization codes was completed. These modifications relate to the global minimum mass search driver programs containing three nested iteration loops comprising iterations on cycle temperature ratio, and three separate pressure ratio iteration loops--one loop for maximizing thermodynamic efficiency, one for minimizing radiator area, and a final loop for minimizing overall power system mass. Using the method of steepest ascent, the code sweeps through the pressure ratio space repeatedly, each time with smaller iteration step sizes, so that the three optimum pressure ratios can be obtained to any desired accuracy for each of the objective functions referred to above (i.e., maximum thermodynamic efficiency, minimum radiator area, and minimum system mass). Two separate options for the power system heat source are available: 1. A nuclear fission reactor can be used. It is provided with a radiation shield composed of a lithium hydride (LiH) neutron shield and a tungsten (W) gamma shield. Suboptions can be used to select the type of reactor (i.e., fast spectrum liquid metal cooled or epithermal high-temperature gas reactor (HTGR)). 2. A solar heat source can be used. This option includes a parabolic concentrator and heat receiver for raising the temperature of the recirculating working fluid. A useful feature of the code modifications is that key cycle parameters are displayed, including the overall system specific mass in kilograms per kilowatt and the system specific power in watts per kilogram, as the results for each temperature ratio are computed. As the minimum mass temperature ratio is encountered, a message is printed out. Several levels of detailed information on cycle state points, subsystem mass results, and radiator temperature profiles are stored for this temperature ratio condition and can be displayed or printed by users.
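The driver's repeated sweeps with shrinking step sizes amount to a coarse-to-fine grid search in each pressure-ratio loop. A minimal sketch of that search pattern (the pass and grid counts are illustrative, and the quadratic objective stands in for, e.g., system mass as a function of pressure ratio):

```python
def sweep_minimize(f, lo, hi, passes=5, points=11):
    """Repeatedly sweep [lo, hi] on a uniform grid, then narrow the bracket
    around the best point, shrinking the step size on each pass."""
    for _ in range(passes):
        step = (hi - lo) / (points - 1)
        xs = [lo + i * step for i in range(points)]
        best = min(xs, key=f)          # best grid point this pass
        lo, hi = best - step, best + step
    return best
```

The real code runs three such loops (efficiency, radiator area, system mass) inside an outer iteration over cycle temperature ratio.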

  12. The flying radiation case

    SciTech Connect

    Brownell, J.H.; Bowers, R.L.

    1997-04-01

    The Los Alamos foil implosion program has the goal of producing an intense, high-energy density x-ray source by converting the energy of a magnetically imploded plasma into radiation and material energy. One of the methods for converting the plasma energy into thermal energy and radiation and utilizing it for experiments is called the flying radiation case (FRC). In this paper the authors shall model the FRC and provide a physical description of the processes involved. An analytic model of a planar FRC in the hydrodynamic approximation is used to describe the assembly and shock heating of a central cushion by a conducting liner driver. The results are also used to benchmark a hydrodynamics code for modeling an FRC. They then use a radiation-hydrodynamics computational model to explore the effects of radiation production and transport when a gold plasma assembles on a CH cushion. Results are presented for the structure and evolution of the radiation hohlraum.

  13. Accumulate repeat accumulate codes

    NASA Technical Reports Server (NTRS)

    Abbasfar, Aliazam; Divsalar, Dariush; Yao, Kung

    2004-01-01

    In this paper we propose an innovative channel coding scheme called 'Accumulate Repeat Accumulate codes' (ARA). This class of codes can be viewed as serial turbo-like codes, or as a subclass of Low Density Parity Check (LDPC) codes, thus belief propagation can be used for iterative decoding of ARA codes on a graph. The structure of the encoder for this class can be viewed as a precoded Repeat Accumulate (RA) code or as a precoded Irregular Repeat Accumulate (IRA) code, where simply an accumulator is chosen as the precoder. Thus ARA codes have a simple and very fast encoder structure when representing LDPC codes. Based on density evolution for LDPC codes, we show through some examples of ARA codes that for a maximum variable node degree of 5 a minimum bit SNR as low as 0.08 dB from channel capacity for rate 1/2 can be achieved as the block size goes to infinity. Thus, for a fixed low maximum variable node degree, its threshold outperforms not only the RA and IRA codes but also the best known LDPC codes with the same maximum node degree. Furthermore, by puncturing the accumulators, any desired high rate codes close to code rate 1 can be obtained with thresholds that stay uniformly close to the channel capacity thresholds. Iterative decoding simulation results are provided. The ARA codes also have a projected graph or protograph representation that allows for high-speed decoder implementation.
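The encoder structure described above (an accumulator as precoder, followed by repetition, interleaving, and an outer accumulator) can be sketched in a few lines. The identity interleaver below is a placeholder for the pseudo-random permutation a real ARA code would use:

```python
def accumulate(bits):
    # 1/(1+D) accumulator: running XOR (mod-2 prefix sum) of the stream
    out, s = [], 0
    for b in bits:
        s ^= b
        out.append(s)
    return out

def ara_encode(bits, repeat=3, interleaver=None):
    """Accumulate-Repeat-Accumulate sketch: accumulator precoder, bit
    repetition, permutation, then a second accumulator."""
    pre = accumulate(bits)                          # precoder stage
    rep = [b for b in pre for _ in range(repeat)]   # rate-1/repeat repetition
    perm = list(range(len(rep))) if interleaver is None else interleaver
    return accumulate([rep[i] for i in perm])       # outer accumulator
```

Puncturing the output of the outer accumulator (transmitting only a subset of its bits) is what yields the higher-rate codes mentioned in the abstract.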

  14. Coset Codes Viewed as Terminated Convolutional Codes

    NASA Technical Reports Server (NTRS)

    Fossorier, Marc P. C.; Lin, Shu

    1996-01-01

    In this paper, coset codes are considered as terminated convolutional codes. Based on this approach, three new general results are presented. First, it is shown that the iterative squaring construction can equivalently be defined from a convolutional code whose trellis terminates. This convolutional code determines a simple encoder for the coset code considered, and the state and branch labelings of the associated trellis diagram become straightforward. Also, from the generator matrix of the code in its convolutional code form, much information about the trade-off between the state connectivity and complexity at each section, and the parallel structure of the trellis, is directly available. Based on this generator matrix, it is shown that the parallel branches in the trellis diagram of the convolutional code represent the same coset code C(sub 1), of smaller dimension and shorter length. Utilizing this fact, a two-stage optimum trellis decoding method is devised. The first stage decodes C(sub 1), while the second stage decodes the associated convolutional code, using the branch metrics delivered by stage 1. Finally, a bidirectional decoding of each received block starting at both ends is presented. If about the same number of computations is required, this approach remains very attractive from a practical point of view as it roughly doubles the decoding speed. This fact is particularly interesting whenever the second half of the trellis is the mirror image of the first half, since the same decoder can be implemented for both parts.

  15. Concatenated Coding Using Trellis-Coded Modulation

    NASA Technical Reports Server (NTRS)

    Thompson, Michael W.

    1997-01-01

    In the late seventies and early eighties a technique known as Trellis Coded Modulation (TCM) was developed for providing spectrally efficient error correction coding. Instead of adding redundant information in the form of parity bits, redundancy is added at the modulation stage, thereby increasing bandwidth efficiency. A digital communications system can be designed to use bandwidth-efficient multilevel/phase modulation such as Amplitude Shift Keying (ASK), Phase Shift Keying (PSK), Differential Phase Shift Keying (DPSK) or Quadrature Amplitude Modulation (QAM). Performance gain can be achieved by increasing the number of signals over the corresponding uncoded system to compensate for the redundancy introduced by the code. A considerable amount of research and development has been devoted toward developing good TCM codes for severely bandlimited applications. More recently, the use of TCM for satellite and deep space communications applications has received increased attention. This report describes the general approach of using a concatenated coding scheme that features TCM and RS coding. Results have indicated that substantial (6-10 dB) performance gains can be achieved with this approach with comparatively little bandwidth expansion. Since all of the bandwidth expansion is due to the RS code, we see that TCM-based concatenated coding results in roughly 10-50% bandwidth expansion, compared to 70-150% expansion for a similar concatenated scheme which uses a convolutional code. We stress that combined coding and modulation optimization is important for achieving performance gains while maintaining spectral efficiency.
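The bandwidth-expansion figures quoted above follow from the code rates alone, since TCM expands the signal set rather than the bandwidth. A quick check (the rate-1/2 convolutional inner code is chosen here as a representative comparison, not taken from the report):

```python
def expansion(rate):
    # Fractional bandwidth expansion of a code of rate k/n at a fixed data
    # rate: transmitting n symbols per k data symbols costs n/k - 1 extra.
    return 1.0 / rate - 1.0

rs_rate = 223 / 255                 # RS(255,223) outer code
tcm_rs = expansion(rs_rate)         # TCM adds no expansion: about 14%
conv_rs = expansion(rs_rate * 0.5)  # with a rate-1/2 convolutional inner code
```

The results land inside the ranges the report cites: about 14% for TCM plus RS, versus roughly 129% for the RS plus rate-1/2 convolutional combination.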

  16. Code Verification Results of an LLNL ASC Code on Some Tri-Lab Verification Test Suite Problems

    SciTech Connect

    Anderson, S R; Bihari, B L; Salari, K; Woodward, C S

    2006-12-29

    As scientific codes become more complex and involve larger numbers of developers and algorithms, chances for algorithmic implementation mistakes increase. In this environment, code verification becomes essential to building confidence in the code implementation. This paper will present first results of a new code verification effort within LLNL's B Division. In particular, we will show results of code verification of the LLNL ASC ARES code on the test problems: Su Olson non-equilibrium radiation diffusion, Sod shock tube, Sedov point blast modeled with shock hydrodynamics, and Noh implosion.
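One standard measure behind such test-problem comparisons is the observed order of accuracy, computed from the error against the exact solution on two grid resolutions. A minimal sketch, assuming the error behaves as C·h^p (the numbers are hypothetical, not ARES results):

```python
import math

def observed_order(e_coarse: float, e_fine: float,
                   h_coarse: float, h_fine: float) -> float:
    """Observed order of accuracy p from errors on two grids,
    assuming error ~ C * h**p."""
    return math.log(e_coarse / e_fine) / math.log(h_coarse / h_fine)

# Hypothetical errors against an exact solution (e.g. Sod shock tube):
p = observed_order(4.0e-2, 1.0e-2, h_coarse=0.1, h_fine=0.05)
print(p)  # 2.0, consistent with a second-order scheme
```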

  17. Discussion on LDPC Codes and Uplink Coding

    NASA Technical Reports Server (NTRS)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

This slide presentation reviews the progress of the workgroup on Low-Density Parity-Check (LDPC) codes for space link coding. The workgroup is tasked with developing and recommending new error correcting codes for near-Earth, Lunar, and deep space applications. Included in the presentation is a summary of the technical progress of the workgroup. Charts that show the LDPC decoder sensitivity to symbol scaling errors are reviewed, as well as a chart showing the performance of several frame synchronizer algorithms compared to that of some good codes, and LDPC decoder tests at ESTL. Also reviewed is a study on Coding, Modulation, and Link Protocol (CMLP), and the recommended codes. A design for the Pseudo-Randomizer with LDPC Decoder and CRC is also reviewed. A chart that summarizes the three proposed coding systems is also presented.

  19. Bar Codes for Libraries.

    ERIC Educational Resources Information Center

    Rahn, Erwin

    1984-01-01

    Discusses the evolution of standards for bar codes (series of printed lines and spaces that represent numbers, symbols, and/or letters of alphabet) and describes the two types most frequently adopted by libraries--Code-A-Bar and CODE 39. Format of the codes is illustrated. Six references and definitions of terminology are appended. (EJS)

  20. Manually operated coded switch

    DOEpatents

    Barnette, Jon H.

    1978-01-01

    The disclosure relates to a manually operated recodable coded switch in which a code may be inserted, tried and used to actuate a lever controlling an external device. After attempting a code, the switch's code wheels must be returned to their zero positions before another try is made.

  1. QR Codes 101

    ERIC Educational Resources Information Center

    Crompton, Helen; LaFrance, Jason; van 't Hooft, Mark

    2012-01-01

    A QR (quick-response) code is a two-dimensional scannable code, similar in function to a traditional bar code that one might find on a product at the supermarket. The main difference between the two is that, while a traditional bar code can hold a maximum of only 20 digits, a QR code can hold up to 7,089 characters, so it can contain much more…

  2. ARA type protograph codes

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush (Inventor); Abbasfar, Aliazam (Inventor); Jones, Christopher R. (Inventor); Dolinar, Samuel J. (Inventor); Thorpe, Jeremy C. (Inventor); Andrews, Kenneth S. (Inventor); Yao, Kung (Inventor)

    2008-01-01

    An apparatus and method for encoding low-density parity check codes. Together with a repeater, an interleaver and an accumulator, the apparatus comprises a precoder, thus forming accumulate-repeat-accumulate (ARA codes). Protographs representing various types of ARA codes, including AR3A, AR4A and ARJA codes, are described. High performance is obtained when compared to the performance of current repeat-accumulate (RA) or irregular-repeat-accumulate (IRA) codes.

  3. Radiation Protection

    MedlinePlus

EPA (United States Environmental Protection Agency) Radiation Protection Document Library: view and download EPA radiation protection documents.

  4. XSTAR Code and Database Status

    NASA Astrophysics Data System (ADS)

    Kallman, Timothy R.

    2017-08-01

    The XSTAR code is a simulation tool for calculating spectra associated with plasmas which are in a time-steady balance among the microphysical processes. It allows for treatment of plasmas which are exposed to illumination by energetic photons, but also treats processes relevant to collision-dominated plasmas. Processes are treated in a full collisional-radiative formalism which includes convergence to local thermodynamic equilibrium under suitable conditions. It features an interface to the most widely used software for fitting to astrophysical spectra, and has also been compared with laboratory plasma experiments. This poster will describe the recent updates to XSTAR, including atomic data, new features, and some recent applications of the code.

  5. Comparison of the Gauss-Seidel spherical polarized radiative transfer code with other radiative transfer codes.

    PubMed

    Herman, B M; Caudill, T R; Flittner, D E; Thome, K J; Ben-David, A

    1995-07-20

    Calculations that use the Gauss-Seidel method are presented of the diffusely scattered light in a spherical atmosphere with polarization fully included. Comparisons are made between this method and the Monte Carlo calculations of other researchers for spherical geometry in a pure Rayleigh atmosphere. Comparisons with plane-parallel atmospheres are also presented. Single-scatter intensity comparisons with spherical geometry show excellent agreement. When all orders of scattering are included, comparisons of polarization parameters I, Q and U as well as the plane of polarization show good agreement when allowances are made for the statistical variability inherent in the Monte Carlo method.
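The Gauss-Seidel method named in the title is a classical iterative sweep in which each unknown is updated immediately using the latest values of the others. The radiative transfer application is far more involved; a minimal sketch on a generic diagonally dominant linear system:

```python
def gauss_seidel(A, b, x0=None, tol=1e-10, max_iter=1000):
    """Solve A x = b by Gauss-Seidel iteration: sweep through the
    unknowns, using updated values immediately (unlike Jacobi)."""
    n = len(b)
    x = list(x0) if x0 is not None else [0.0] * n
    for _ in range(max_iter):
        max_change = 0.0
        for i in range(n):
            s = sum(A[i][j] * x[j] for j in range(n) if j != i)
            new_xi = (b[i] - s) / A[i][i]
            max_change = max(max_change, abs(new_xi - x[i]))
            x[i] = new_xi
        if max_change < tol:
            break
    return x

# Small diagonally dominant system (illustrative, not radiative transfer):
A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = gauss_seidel(A, b)
print(x)  # close to [1/11, 7/11], i.e. about [0.0909, 0.6364]
```

Diagonal dominance guarantees convergence here; for radiative transfer the same idea is applied order by order of scattering.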

  6. Comparison of the Gauss-Seidel spherical polarized radiative transfer code with other radiative transfer codes

    NASA Astrophysics Data System (ADS)

    Herman, B. M.; Flittner, D. E.; Caudill, T. R.; Thome, K. J.; Ben-David, A.

    1995-07-01

    Calculations that use the Gauss-Seidel method are presented of the diffusely scattered light in a spherical atmosphere with polarization fully included. Comparisons are made between this method and the Monte Carlo calculations of other researchers for spherical geometry in a pure Rayleigh atmosphere. Comparisons with plane-parallel atmospheres are also presented. Single-scatter intensity comparisons with spherical geometry show excellent agreement. When all orders of scattering are included, comparisons of polarization parameters I, Q and U as well as the plane of polarization show good agreement when allowances are made for the statistical variability inherent in the Monte Carlo method.

  7. The program RADLST (Radiation Listing)

    SciTech Connect

    Burrows, T.W.

    1988-02-29

    The program RADLST (Radiation Listing) is designed to calculate the nuclear and atomic radiations associated with the radioactive decay of nuclei. It uses as its primary input nuclear decay data in the Evaluated Nuclear Structure Data File (ENSDF) format. The code is written in FORTRAN 77 and, with a few exceptions, is consistent with the ANSI standard. 65 refs.

  8. Efficient entropy coding for scalable video coding

    NASA Astrophysics Data System (ADS)

    Choi, Woong Il; Yang, Jungyoup; Jeon, Byeungwoo

    2005-10-01

    The standardization for the scalable extension of H.264 has called for additional functionality based on H.264 standard to support the combined spatio-temporal and SNR scalability. For the entropy coding of H.264 scalable extension, Context-based Adaptive Binary Arithmetic Coding (CABAC) scheme is considered so far. In this paper, we present a new context modeling scheme by using inter layer correlation between the syntax elements. As a result, it improves coding efficiency of entropy coding in H.264 scalable extension. In simulation results of applying the proposed scheme to encoding the syntax element mb_type, it is shown that improvement in coding efficiency of the proposed method is up to 16% in terms of bit saving due to estimation of more adequate probability model.
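Full CABAC is beyond a snippet, but the core of context modeling is simple: keep per-context statistics and derive the probability handed to the arithmetic coder from them. A minimal sketch with Laplace-smoothed counts (the context name is illustrative):

```python
class BinaryContextModel:
    """Minimal adaptive context model: tracks bit counts per context and
    returns a Laplace-smoothed probability that the next bit is 1."""
    def __init__(self):
        self.counts = {}  # context -> [count of 0s, count of 1s]

    def prob_one(self, ctx):
        zeros, ones = self.counts.get(ctx, [0, 0])
        return (ones + 1) / (zeros + ones + 2)

    def update(self, ctx, bit):
        self.counts.setdefault(ctx, [0, 0])[bit] += 1

model = BinaryContextModel()
for bit in [1, 1, 1, 0, 1, 1, 0, 1, 1, 1]:   # 8 ones, 2 zeros
    model.update("mb_type", bit)
print(model.prob_one("mb_type"))  # (8+1)/(10+2) = 0.75
```

The inter-layer idea in the abstract amounts to choosing the context from syntax elements of the lower layer, so correlated layers share sharper probability estimates.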

  9. Radiation sickness

    MedlinePlus

    ... to determine the amount of radiation exposure from nuclear accidents, the best signs of the severity of the ... doses of radiation, such as radiation from a nuclear power plant accident Exposure to excessive radiation for medical treatments

  10. Radiation enteritis

    MedlinePlus

    Radiation enteropathy; Radiation-induced small bowel injury; Post-radiation enteritis ... Radiation therapy uses high-powered x-rays, particles, or radioactive seeds to kill cancer cells. The therapy ...

  11. Radiation dosimetry.

    PubMed Central

    Cameron, J

    1991-01-01

    This article summarizes the basic facts about the measurement of ionizing radiation, usually referred to as radiation dosimetry. The article defines the common radiation quantities and units; gives typical levels of natural radiation and medical exposures; and describes the most important biological effects of radiation and the methods used to measure radiation. Finally, a proposal is made for a new radiation risk unit to make radiation risks more understandable to nonspecialists. PMID:2040250
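As a worked example of the quantities the article defines: equivalent dose (sieverts) is absorbed dose (grays) weighted by radiation type. The weighting factors below follow commonly tabulated ICRP values and are included for illustration only:

```python
# Equivalent dose H (Sv) = absorbed dose D (Gy) * radiation weighting
# factor w_R.  Values follow commonly tabulated ICRP figures; illustrative.
W_R = {"photon": 1, "electron": 1, "proton": 2, "alpha": 20}

def equivalent_dose_sv(dose_gy: float, radiation: str) -> float:
    return dose_gy * W_R[radiation]

print(equivalent_dose_sv(0.1, "photon"))  # 0.1 Sv
print(equivalent_dose_sv(0.1, "alpha"))   # 2.0 Sv: same absorbed dose, higher risk
```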

  12. Effects of Nuclear Interactions in Space Radiation Transport

    NASA Technical Reports Server (NTRS)

    Lin, Zi-Wei; Barghouty, A. F.

    2004-01-01

    Space radiation transport codes have been developed to calculate radiation effects behind materials in human missions to the Moon, Mars or beyond. We study how nuclear fragmentation processes affect predictions from such radiation transport codes. In particular, we investigate the effects of fragmentation cross sections at different energies on fluxes, dose and dose-equivalent from galactic cosmic rays behind typical shielding materials.
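The simplest way to see how an interaction cross section enters such a transport calculation is the exponential attenuation of the primary flux with depth. Real transport codes also track the secondary fragments this sketch ignores, and the numbers below are illustrative, not evaluated cross sections:

```python
import math

def surviving_fraction(sigma_cm2, n_per_cm3, depth_cm):
    """Fraction of primary particles traversing `depth_cm` of material
    without a nuclear interaction: exp(-sigma * n * x)."""
    return math.exp(-sigma_cm2 * n_per_cm3 * depth_cm)

# Illustrative numbers only:
sigma = 5.0e-25      # interaction cross section, cm^2
n = 6.0e22           # target nuclei per cm^3 (roughly aluminum-like)
for depth in (1.0, 5.0, 10.0):
    print(depth, surviving_fraction(sigma, n, depth))
```

A change in the fragmentation cross section shifts this curve and, through the fragments produced, the dose and dose-equivalent behind the shield.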

  13. Effects of Nuclear Interactions in Space Radiation Transport

    NASA Technical Reports Server (NTRS)

    Lin, Zi-Wei; Barghouty, A. F.

    2005-01-01

Space radiation transport codes have been developed to calculate radiation effects behind materials in human missions to the Moon, Mars or beyond. We study how nuclear fragmentation processes affect predictions from such radiation transport codes. In particular, we investigate the effects of fragmentation cross sections at different energies on fluxes, dose and dose-equivalent from galactic cosmic rays behind typical shielding materials.

  14. What is Code Biology?

    PubMed

    Barbieri, Marcello

    2017-10-06

    Various independent discoveries have shown that many organic codes exist in living systems, and this implies that they came into being during the history of life and contributed to that history. The genetic code appeared in a population of primitive systems that has been referred to as the common ancestor, and it has been proposed that three distinct signal processing codes gave origin to the three primary kingdoms of Archaea, Bacteria and Eukarya. After the genetic code and the signal processing codes, on the other hand, only the ancestors of the eukaryotes continued to explore the coding space and gave origin to splicing codes, histone code, tubulin code, compartment codes and many others. A first theoretical consequence of this historical fact is the idea that the Eukarya became increasingly more complex because they maintained the potential to bring new organic codes into existence. A second theoretical consequence comes from the fact that the evolution of the individual rules of a code can take an extremely long time, but the origin of a new organic code corresponds to the appearance of a complete set of rules and from a geological point of view this amounts to a sudden event. The great discontinuities of the history of life, in other words, can be explained as the result of the appearance of new codes. A third theoretical consequence comes from the fact that the organic codes have been highly conserved in evolution, which shows that they are the great invariants of life, the sole entities that have gone intact through billions of years while everything else has changed. This tells us that the organic codes are fundamental components of life and their study - the new research field of Code Biology - is destined to become an increasingly relevant part of the life sciences. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. DIANE multiparticle transport code

    NASA Astrophysics Data System (ADS)

    Caillaud, M.; Lemaire, S.; Ménard, S.; Rathouit, P.; Ribes, J. C.; Riz, D.

    2014-06-01

    DIANE is the general Monte Carlo code developed at CEA-DAM. DIANE is a 3D multiparticle multigroup code. DIANE includes automated biasing techniques and is optimized for massive parallel calculations.

  16. QR Code Mania!

    ERIC Educational Resources Information Center

    Shumack, Kellie A.; Reilly, Erin; Chamberlain, Nik

    2013-01-01

    space, has error-correction capacity, and can be read from any direction. These codes are used in manufacturing, shipping, and marketing, as well as in education. QR codes can be created to produce…

  17. Honesty and Honor Codes.

    ERIC Educational Resources Information Center

    McCabe, Donald; Trevino, Linda Klebe

    2002-01-01

    Explores the rise in student cheating and evidence that students cheat less often at schools with an honor code. Discusses effective use of such codes and creation of a peer culture that condemns dishonesty. (EV)

  18. Cellulases and coding sequences

    SciTech Connect

    Li, Xin-Liang; Ljungdahl, Lars G.; Chen, Huizhong

    2001-01-01

    The present invention provides three fungal cellulases, their coding sequences, recombinant DNA molecules comprising the cellulase coding sequences, recombinant host cells and methods for producing same. The present cellulases are from Orpinomyces PC-2.

  19. Cellulases and coding sequences

    DOEpatents

    Li, Xin-Liang; Ljungdahl, Lars G.; Chen, Huizhong

    2001-02-20

    The present invention provides three fungal cellulases, their coding sequences, recombinant DNA molecules comprising the cellulase coding sequences, recombinant host cells and methods for producing same. The present cellulases are from Orpinomyces PC-2.

  1. Practices in Code Discoverability

    NASA Astrophysics Data System (ADS)

    Teuben, P.; Allen, A.; Nemiroff, R. J.; Shamir, L.

    2012-09-01

    Much of scientific progress now hinges on the reliability, falsifiability and reproducibility of computer source codes. Astrophysics in particular is a discipline that today leads other sciences in making useful scientific components freely available online, including data, abstracts, preprints, and fully published papers, yet even today many astrophysics source codes remain hidden from public view. We review the importance and history of source codes in astrophysics and previous efforts to develop ways in which information about astrophysics codes can be shared. We also discuss why some scientist coders resist sharing or publishing their codes, the reasons for and importance of overcoming this resistance, and alert the community to a reworking of one of the first attempts for sharing codes, the Astrophysics Source Code Library (ASCL). We discuss the implementation of the ASCL in an accompanying poster paper. We suggest that code could be given a similar level of referencing as data gets in repositories such as ADS.

  2. STEEP32 computer code

    NASA Technical Reports Server (NTRS)

    Goerke, W. S.

    1972-01-01

    A manual is presented as an aid in using the STEEP32 code. The code is the EXEC 8 version of the STEEP code (STEEP is an acronym for shock two-dimensional Eulerian elastic plastic). The major steps in a STEEP32 run are illustrated in a sample problem. There is a detailed discussion of the internal organization of the code, including a description of each subroutine.

  3. Universal Noiseless Coding Subroutines

    NASA Technical Reports Server (NTRS)

    Schlutsmeyer, A. P.; Rice, R. F.

    1986-01-01

    Software package consists of FORTRAN subroutines that perform universal noiseless coding and decoding of integer and binary data strings. Purpose of this type of coding to achieve data compression in sense that coded data represents original data perfectly (noiselessly) while taking fewer bits to do so. Routines universal because they apply to virtually any "real-world" data source.
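These routines belong to the Rice coding family. A minimal sketch of a single fixed-parameter Rice codeword, quotient in unary plus k remainder bits, is shown below; the actual subroutines adaptively select the coding option per block, which this sketch omits:

```python
def rice_encode(n: int, k: int) -> str:
    """Rice code of nonnegative integer n with parameter k:
    unary-coded quotient (n >> k), a 0 terminator, then k remainder bits."""
    q, r = n >> k, n & ((1 << k) - 1)
    tail = format(r, "0{}b".format(k)) if k > 0 else ""
    return "1" * q + "0" + tail

def rice_decode(bits: str, k: int) -> int:
    q = bits.index("0")                            # count of leading 1s
    r = int(bits[q + 1:q + 1 + k], 2) if k > 0 else 0
    return (q << k) | r

print(rice_encode(9, 2))  # quotient 2, remainder 1 -> "11001"
for n in range(20):       # lossless round trip
    assert rice_decode(rice_encode(n, 2), 2) == n
```

Small values get short codewords, which is why the method compresses well when the source statistics match the chosen k.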

  5. Morse Code Activity Packet.

    ERIC Educational Resources Information Center

    Clinton, Janeen S.

    This activity packet offers simple directions for setting up a Morse Code system appropriate to interfacing with any of several personal computer systems. Worksheets are also included to facilitate teaching Morse Code to persons with visual or other disabilities including blindness, as it is argued that the code is best learned auditorily. (PB)

  6. EMF wire code research

    SciTech Connect

    Jones, T.

    1993-11-01

This paper examines the results of previous wire code research to determine the relationship among wire codes, electromagnetic fields, and childhood cancer. The paper suggests that, in the original Savitz study, biases toward producing a false positive association between high wire codes and childhood cancer were created by the selection procedure.

  7. Mapping Local Codes to Read Codes.

    PubMed

    Bonney, Wilfred; Galloway, James; Hall, Christopher; Ghattas, Mikhail; Tramma, Leandro; Nind, Thomas; Donnelly, Louise; Jefferson, Emily; Doney, Alexander

    2017-01-01

    Background & Objectives: Legacy laboratory test codes make it difficult to use clinical datasets for meaningful translational research, where populations are followed for disease risk and outcomes over many years. The Health Informatics Centre (HIC) at the University of Dundee hosts continuous biochemistry data from the clinical laboratories in Tayside and Fife dating back as far as 1987. However, the HIC-managed biochemistry dataset is coupled with incoherent sample types and unstandardised legacy local test codes, which increases the complexity of using the dataset for reasonable population health outcomes. The objective of this study was to map the legacy local test codes to the Scottish 5-byte Version 2 Read Codes using biochemistry data extracted from the repository of the Scottish Care Information (SCI) Store.

  8. SPIN: An Inversion Code for the Photospheric Spectral Line

    NASA Astrophysics Data System (ADS)

    Yadav, Rahul; Mathew, Shibu K.; Tiwary, Alok Ranjan

    2017-08-01

Inversion codes are the most useful tools to infer the physical properties of the solar atmosphere from the interpretation of Stokes profiles. In this paper, we present the details of a new Stokes Profile INversion code (SPIN) developed specifically to invert the spectro-polarimetric data of the Multi-Application Solar Telescope (MAST) at Udaipur Solar Observatory. The SPIN code has adopted Milne-Eddington approximations to solve the polarized radiative transfer equation (RTE), and for the purpose of fitting, a modified Levenberg-Marquardt algorithm has been employed. We describe the details and utilization of the SPIN code to invert the spectro-polarimetric data. We also present the details of tests performed to validate the inversion code by comparing its results with those from other widely used inversion codes (VFISV and SIR). The inverted results of the SPIN code after its application to Hinode/SP data have also been compared with results from other inversion codes.
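A full Milne-Eddington Stokes inversion is well beyond a snippet, but the damping logic of a Levenberg-Marquardt fit can be sketched on a one-parameter toy model, y = exp(-a*x). This is illustrative only; SPIN fits the polarized RTE solution, not this function:

```python
import math

def levenberg_marquardt_1d(xs, ys, a0, lam=1e-2, iters=50):
    """Minimal one-parameter Levenberg-Marquardt fit of y = exp(-a*x)."""
    a = a0
    def chi2(aa):
        return sum((y - math.exp(-aa * x)) ** 2 for x, y in zip(xs, ys))
    for _ in range(iters):
        JtJ = Jtr = 0.0
        for x, y in zip(xs, ys):
            f = math.exp(-a * x)
            J = -x * f            # derivative of the model w.r.t. a
            JtJ += J * J
            Jtr += J * (y - f)
        step = Jtr / (JtJ + lam)  # damped Gauss-Newton step
        if chi2(a + step) < chi2(a):
            a += step
            lam *= 0.5            # accepted: move toward Gauss-Newton
        else:
            lam *= 2.0            # rejected: increase damping
    return a

xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [math.exp(-0.8 * x) for x in xs]   # noiseless data with a = 0.8
print(levenberg_marquardt_1d(xs, ys, a0=0.1))  # converges near 0.8
```

The damping parameter interpolates between gradient descent (large lam) and Gauss-Newton (small lam), which is what makes the method robust far from the solution.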

  9. Software Certification - Coding, Code, and Coders

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.

  11. Coding for Electronic Mail

    NASA Technical Reports Server (NTRS)

    Rice, R. F.; Lee, J. J.

    1986-01-01

    Scheme for coding facsimile messages promises to reduce data transmission requirements to one-tenth current level. Coding scheme paves way for true electronic mail in which handwritten, typed, or printed messages or diagrams sent virtually instantaneously - between buildings or between continents. Scheme, called Universal System for Efficient Electronic Mail (USEEM), uses unsupervised character recognition and adaptive noiseless coding of text. Image quality of resulting delivered messages improved over messages transmitted by conventional coding. Coding scheme compatible with direct-entry electronic mail as well as facsimile reproduction. Text transmitted in this scheme automatically translated to word-processor form.

  12. STATUS OF THE MCNPX TRANSPORT CODE

    SciTech Connect

    Hughes, H.G.; Chadwick, M.B.

    2000-10-01

The Monte Carlo particle transport code MCNPX and its associated data have been the focus of a major development effort at Los Alamos for several years. The system has reached a mature state, and has become a significant tool for many intermediate and high-energy particle transport applications. A recent version has been released to the Radiation Safety Information Computational Center (RSICC). A recent report provides an overview of the code and an extensive set of references for the component physics modules used in the code. In this paper we review the status of the developmental version of MCNPX, and describe some important new enhancements, including the use of evaluated nuclear data files for proton transport; the use of photonuclear reaction data; improved elastic and inelastic reaction cross sections for nucleons, antinucleons, pions, and kaons; and two new modes of operation of the code. We also illustrate the use of the new proton and photonuclear data in two representative applications.

  13. Type I X-ray burst simulation code

    SciTech Connect

    Fisker, J. L.; Hix, W. R.; Liebendoerfer, M.

    2007-07-01

    dAGILE is an astrophysical code that simulates accretion of matter onto a neutron star and the subsequent x-ray burst. It is a one-dimensional time-dependent spherically symmetric code with generalized nuclear reaction networks, diffusive radiation/conduction, realistic boundary conditions, and general relativistic hydrodynamics. The code is described in more detail in Astrophysical Journal 650(2006)332 and Astrophysical Journal Supplements 174(2008)261.

  14. XSOR codes users manual

    SciTech Connect

    Jow, Hong-Nian; Murfin, W.B.; Johnson, J.D.

    1993-11-01

This report describes the source term estimation codes, XSORs. The codes are written for three pressurized water reactors (Surry, Sequoyah, and Zion) and two boiling water reactors (Peach Bottom and Grand Gulf). The ensemble of codes has been named "XSOR". The purpose of the XSOR codes is to estimate the source terms which would be released to the atmosphere in severe accidents. A source term includes the release fractions of several radionuclide groups, the timing and duration of releases, the rates of energy release, and the elevation of releases. The codes have been developed by Sandia National Laboratories for the US Nuclear Regulatory Commission (NRC) in support of the NUREG-1150 program. The XSOR codes are fast running parametric codes and are used as surrogates for detailed mechanistic codes. The XSOR codes also provide the capability to explore the phenomena and their uncertainty which are not currently modeled by the mechanistic codes. The uncertainty distributions of input parameters may be used by an XSOR code to estimate the uncertainty of source terms.

  15. DLLExternalCode

    SciTech Connect

    Greg Flach, Frank Smith

    2014-05-14

DLLExternalCode is a general dynamic-link library (DLL) interface for linking GoldSim (www.goldsim.com) with external codes. The overall concept is to use GoldSim as top level modeling software with interfaces to external codes for specific calculations. The DLLExternalCode DLL that performs the linking function is designed to take a list of code inputs from GoldSim, create an input file for the external application, run the external code, and return a list of outputs, read from files created by the external application, back to GoldSim. Instructions for creating the input file, running the external code, and reading the output are contained in an instructions file that is read and interpreted by the DLL.
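The write-inputs / run / read-outputs cycle the DLL performs can be sketched in a few lines. All file names here are illustrative, and a trivial stand-in plays the role of the external code:

```python
import os
import subprocess
import sys
import tempfile

def run_external(inputs, command):
    """Sketch of the DLLExternalCode pattern: write inputs to a file,
    run an external program on it, read the outputs back."""
    with tempfile.TemporaryDirectory() as workdir:
        in_path = os.path.join(workdir, "model.inp")    # illustrative names
        out_path = os.path.join(workdir, "model.out")
        with open(in_path, "w") as f:
            f.write("\n".join(str(v) for v in inputs))
        subprocess.run(command + [in_path, out_path], check=True)
        with open(out_path) as f:
            return [float(line) for line in f]

# Stand-in "external code": doubles each input value.
doubler = [sys.executable, "-c",
           "import sys;"
           "vals=[float(v) for v in open(sys.argv[1])];"
           "open(sys.argv[2],'w').write('\\n'.join(str(2*v) for v in vals))"]

print(run_external([1.0, 2.5, 4.0], doubler))  # [2.0, 5.0, 8.0]
```

In the real system the instructions file, not hard-coded logic, tells the DLL how to format the input and parse the output.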

  16. Defeating the coding monsters.

    PubMed

    Colt, Ross

    2007-02-01

    Accuracy in coding is rapidly becoming a required skill for military health care providers. Clinic staffing, equipment purchase decisions, and even reimbursement will soon be based on the coding data that we provide. Learning the complicated myriad of rules to code accurately can seem overwhelming. However, the majority of clinic visits in a typical outpatient clinic generally fall into two major evaluation and management codes, 99213 and 99214. If health care providers can learn the rules required to code a 99214 visit, then this will provide a 90% solution that can enable them to accurately code the majority of their clinic visits. This article demonstrates a step-by-step method to code a 99214 visit, by viewing each of the three requirements as a monster to be defeated.
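The three requirements the article refers to are the classic evaluation-and-management components, of which an established-patient visit must satisfy two of three under the pre-2021 rules. A deliberately simplified sketch follows; it is not a substitute for the actual CPT guidelines, and judging whether each component is met is the hard part the article teaches:

```python
# Simplified two-of-three rule for an established-patient 99214 visit
# (pre-2021 E/M framework): detailed history, detailed exam, and
# moderate-complexity medical decision making.  Illustrative only.

def qualifies_for_99214(detailed_history: bool,
                        detailed_exam: bool,
                        moderate_mdm: bool) -> bool:
    met = sum([detailed_history, detailed_exam, moderate_mdm])
    return met >= 2

print(qualifies_for_99214(True, False, True))   # True: two components met
print(qualifies_for_99214(False, False, True))  # False: only one met
```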

  17. Nonequilibrium radiation and chemistry models for aerocapture vehicle flowfields

    NASA Technical Reports Server (NTRS)

    Carlson, Leland A.

    1990-01-01

    The continued development and improvement of the viscous shock layer (VSL) nonequilibrium chemistry blunt body engineering code, the incorporation in a coupled manner of radiation models into the VSL code, and the initial development of appropriate precursor models are presented.

  18. Relativistic radiation damping for simulation

    NASA Astrophysics Data System (ADS)

    Chotia, Amodsen

    2005-10-01

The aim of this work is to implement radiation braking in a simulation code. The radiation physics of accelerated charges is not new: it dates from the end of the 19th century, from Maxwell's theory and the work of Larmor, Poynting, Thomson, Poincaré, Lorentz, von Laue, Abraham, Schott, Planck, Landau, Einstein, Dirac, Wheeler and Feynman (and many others). Its consequences range from the lifetimes of excited atomic levels and the behavior of antennas to the production of radiation by bremsstrahlung in particle accelerators, as well as space and stellar astrophysics. In this work we start from the Landau-Lifshitz equation to express the four-acceleration in terms of the fields. Using a result from Pomeranchuk, we deduce the energy lost by radiation. We then perform an instantaneous collinear projection of the velocity vector in order to subtract the kinetic energy lost to radiation. The equation of motion is solved using the Boris algorithm. The code is tested on a few examples.
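The Boris algorithm mentioned at the end is the standard particle pusher that the radiation-braking correction is grafted onto. A minimal non-relativistic sketch, with the braking term omitted:

```python
import math

def cross(a, b):
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def boris_push(v, E, B, qm_dt):
    """One Boris step (non-relativistic, no radiation braking):
    half electric kick, exact magnetic rotation, half electric kick.
    qm_dt = (q/m) * dt."""
    v_minus = [v[i] + 0.5 * qm_dt * E[i] for i in range(3)]
    t = [0.5 * qm_dt * B[i] for i in range(3)]
    t2 = sum(c * c for c in t)
    s = [2.0 * c / (1.0 + t2) for c in t]
    vxt = cross(v_minus, t)
    v_prime = [v_minus[i] + vxt[i] for i in range(3)]
    vps = cross(v_prime, s)
    v_plus = [v_minus[i] + vps[i] for i in range(3)]
    return [v_plus[i] + 0.5 * qm_dt * E[i] for i in range(3)]

# Pure magnetic field: the Boris rotation conserves kinetic energy exactly,
# which is why it is a good base for adding a separate braking correction.
v = [1.0, 0.0, 0.0]
for _ in range(1000):
    v = boris_push(v, E=[0.0, 0.0, 0.0], B=[0.0, 0.0, 1.0], qm_dt=0.1)
speed = math.sqrt(sum(c * c for c in v))
print(speed)  # stays 1.0 up to roundoff
```

The abstract's collinear-projection step would rescale the speed after each such push to remove the radiated energy.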

  19. Radiation from advanced solid rocket motor plumes

    NASA Technical Reports Server (NTRS)

    Farmer, Richard C.; Smith, Sheldon D.; Myruski, Brian L.

    1994-01-01

The overall objective of this study was to develop an understanding of solid rocket motor (SRM) plumes in sufficient detail to accurately explain the majority of plume radiation test data. Improved flowfield and radiation analysis codes were developed to accurately and efficiently account for all the factors which affect radiation heating from rocket plumes. These codes were verified by comparing predicted plume behavior with measured NASA/MSFC ASRM test data. Upon conducting a thorough review of the current state-of-the-art of SRM plume flowfield and radiation prediction methodology and the pertinent data base, the following analyses were developed for future design use. The NOZZRAD code was developed for preliminary base heating design and Al2O3 particle optical property data evaluation using a generalized two-flux solution to the radiative transfer equation. The IDARAD code was developed for rapid evaluation of plume radiation effects using the spherical harmonics method of differential approximation to the radiative transfer equation. The FDNS CFD code with fully coupled Euler-Lagrange particle tracking was validated by comparison to predictions made with the industry standard RAMP code for SRM nozzle flowfield analysis. The FDNS code provides the ability to analyze not only rocket nozzle flow, but also axisymmetric and three-dimensional plume flowfields with state-of-the-art CFD methodology. Procedures for conducting meaningful thermo-vision camera studies were developed.

  20. Radiation from advanced solid rocket motor plumes

    NASA Astrophysics Data System (ADS)

    Farmer, Richard C.; Smith, Sheldon D.; Myruski, Brian L.

    1994-12-01

    The overall objective of this study was to develop an understanding of solid rocket motor (SRM) plumes in sufficient detail to accurately explain the majority of plume radiation test data. Improved flowfield and radiation analysis codes were developed to accurately and efficiently account for all the factors which affect radiation heating from rocket plumes. These codes were verified by comparing predicted plume behavior with measured NASA/MSFC ASRM test data. Upon conducting a thorough review of the current state-of-the-art of SRM plume flowfield and radiation prediction methodology and the pertinent database, the following analyses were developed for future design use. The NOZZRAD code was developed for preliminary base heating design and Al2O3 particle optical property data evaluation using a generalized two-flux solution to the radiative transfer equation. The IDARAD code was developed for rapid evaluation of plume radiation effects using the spherical harmonics method of differential approximation to the radiative transfer equation. The FDNS CFD code with fully coupled Euler-Lagrange particle tracking was validated by comparison to predictions made with the industry standard RAMP code for SRM nozzle flowfield analysis. The FDNS code provides the ability to analyze not only rocket nozzle flow, but also axisymmetric and three-dimensional plume flowfields with state-of-the-art CFD methodology. Procedures for conducting meaningful thermo-vision camera studies were developed.

  1. Radiation transport Part B: Applications with examples

    SciTech Connect

    Beutler, D.E.

    1997-06-01

    In the previous sections Len Lorence has described the need, theory, and types of radiation codes that can be applied to model the results of radiation effects tests or working environments for electronics. For the rest of this segment, the author will concentrate on the specific ways the codes can be used to predict device response or analyze radiation test results. Regardless of whether one is predicting responses in a working or test environment, the procedures are virtually the same. The same can be said for the use of 1-, 2-, or 3-dimensional codes and Monte Carlo or discrete ordinates codes. No attempt is made to instruct the student on the specifics of the code. For example, the author will not discuss the details, such as the number of meshes, energy groups, etc., that are appropriate for a discrete ordinates code. For the sake of simplicity, he will restrict himself to the 1-dimensional code CEPXS/ONELD. This code, along with a wide variety of other radiation codes, can be obtained from the Radiation Safety Information Computational Center (RSICC) for a nominal handling fee.

  2. DOE 2011 occupational radiation exposure

    SciTech Connect

    none,

    2012-12-01

    The U.S. Department of Energy (DOE) Office of Analysis within the Office of Health, Safety and Security (HSS) publishes the annual DOE Occupational Radiation Exposure Report to provide an overview of the status of radiation protection practices at DOE (including the National Nuclear Security Administration [NNSA]). The DOE 2011 Occupational Radiation Exposure Report provides an evaluation of DOE-wide performance regarding compliance with Title 10, Code of Federal Regulations (C.F.R.), Part 835, Occupational Radiation Protection dose limits and as low as reasonably achievable (ALARA) process requirements. In addition, the report provides data to DOE organizations responsible for developing policies for protection of individuals from the adverse health effects of radiation. The report provides a summary and an analysis of occupational radiation exposure information from the monitoring of individuals involved in DOE activities. The occupational radiation exposure information is analyzed in terms of aggregate data, dose to individuals, and dose by site over the past five years.

  3. DOE 2012 occupational radiation exposure

    SciTech Connect

    none,

    2013-10-01

    The U.S. Department of Energy (DOE) Office of Analysis within the Office of Health, Safety and Security (HSS) publishes the annual DOE Occupational Radiation Exposure Report to provide an overview of the status of radiation protection practices at DOE (including the National Nuclear Security Administration [NNSA]). The DOE 2012 Occupational Radiation Exposure Report provides an evaluation of DOE-wide performance regarding compliance with Title 10, Code of Federal Regulations (C.F.R.), Part 835, Occupational Radiation Protection dose limits and as low as reasonably achievable (ALARA) process requirements. In addition, the report provides data to DOE organizations responsible for developing policies for protection of individuals from the adverse health effects of radiation. The report provides a summary and an analysis of occupational radiation exposure information from the monitoring of individuals involved in DOE activities. Over the past 5-year period, the occupational radiation exposure information is analyzed in terms of aggregate data, dose to individuals, and dose by site.

  4. Benchmarking the Multidimensional Stellar Implicit Code MUSIC

    NASA Astrophysics Data System (ADS)

    Goffrey, T.; Pratt, J.; Viallet, M.; Baraffe, I.; Popov, M. V.; Walder, R.; Folini, D.; Geroux, C.; Constantino, T.

    2017-04-01

    We present the results of a numerical benchmark study for the MUltidimensional Stellar Implicit Code (MUSIC) based on widely applicable two- and three-dimensional compressible hydrodynamics problems relevant to stellar interiors. MUSIC is an implicit large eddy simulation code that uses implicit time integration, implemented as a Jacobian-free Newton-Krylov method. A physics-based preconditioning technique, which can be adjusted to target varying physics, is used to improve the performance of the solver. The problems used for this benchmark study include the Rayleigh-Taylor and Kelvin-Helmholtz instabilities, and the decay of the Taylor-Green vortex. Additionally we show a test of hydrostatic equilibrium in a stellar environment which is dominated by radiative effects. In this setting the flexibility of the preconditioning technique is demonstrated. This work aims to bridge the gap between the hydrodynamic test problems typically used during development of numerical methods and the complex flows of stellar interiors. A series of multidimensional tests were performed, and each test case was analysed with a simple scalar diagnostic, with the aim of enabling direct code comparisons. As the tests performed do not have analytic solutions, we verify MUSIC by comparing it to established codes including ATHENA and the PENCIL code. MUSIC is able both to reproduce the behaviour of established and widely used codes and to match results expected from theoretical predictions. This benchmarking study concludes a series of papers describing the development of the MUSIC code and provides confidence in future applications.

  5. Benchmarking of Neutron Production of Heavy-Ion Transport Codes

    SciTech Connect

    Remec, Igor; Ronningen, Reginald M.; Heilbronn, Lawrence

    2012-01-01

    Accurate prediction of radiation fields generated by heavy ion interactions is important in medical applications, space missions, and in design and operation of rare isotope research facilities. In recent years, several well-established computer codes in widespread use for particle and radiation transport calculations have been equipped with the capability to simulate heavy ion transport and interactions. To assess and validate these capabilities, we performed simulations of a series of benchmark-quality heavy ion experiments with the computer codes FLUKA, MARS15, MCNPX, and PHITS. We focus on the comparisons of secondary neutron production. Results are encouraging; however, further improvements in models and codes and additional benchmarking are required.

  6. Implicit Monte Carlo Radiation Transport Simulations of Four Test Problems

    SciTech Connect

    Gentile, N

    2007-08-01

    Radiation transport codes, like almost all codes, are difficult to develop and debug. It is helpful to have small, easy to run test problems with known answers to use in development and debugging. It is also prudent to re-run test problems periodically during development to ensure that previous code capabilities have not been lost. We describe four radiation transport test problems with analytic or approximate analytic answers. These test problems are suitable for use in debugging and testing radiation transport codes. We also give results of simulations of these test problems performed with an Implicit Monte Carlo photonics code.

  7. Radiation Therapy

    MedlinePlus

    ... how to cope with side effects. What Is Radiation Therapy? Cancer is a disease that causes cells ...

  8. Radiation Therapy

    MedlinePlus

    ... how to cope with side effects. What Is Radiation Therapy? Cancer is a disease that causes cells ...

  9. Radiation Therapy

    MedlinePlus

    Radiation therapy is a cancer treatment. It uses high doses of radiation to kill cancer cells and stop them from ... half of all cancer patients receive it. The radiation may be external, from special machines, or internal, ...

  10. More box codes

    NASA Technical Reports Server (NTRS)

    Solomon, G.

    1992-01-01

    A new investigation shows that, starting from the BCH (21,15;3) code represented as a 7 x 3 matrix and adding a row and column to enforce even parity, one obtains an 8 x 4 matrix (32,15;8) code. An additional dimension is obtained by specifying odd parity on the rows and even parity on the columns, i.e., by adjoining to the 8 x 4 matrix the matrix that is zero except for its fourth column (all ones). Furthermore, any seven rows and three columns form the BCH (21,15;3) code. This box code has the same weight structure as the quadratic residue and BCH codes of the same dimensions. Whether there exists an algebraic isomorphism to either code is as yet unknown.
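    The parity-extension step described above can be sketched concretely. The 7 x 3 data matrix below is an arbitrary stand-in for illustration, not the actual BCH (21,15;3) codeword layout; the point is only the mechanics of appending an even-parity column and row.

    ```python
    import numpy as np

    def extend_even_parity(m):
        """Append an even-parity column (one bit per row) and an even-parity
        row (one bit per column) to a binary matrix, as in extending a
        7 x 3 code matrix to an 8 x 4 matrix."""
        m = np.asarray(m) % 2
        col = m.sum(axis=1) % 2            # parity bit for each row
        with_col = np.column_stack([m, col])
        row = with_col.sum(axis=0) % 2     # parity bit for each column
        return np.vstack([with_col, row])

    demo = np.random.default_rng(0).integers(0, 2, size=(7, 3))
    ext = extend_even_parity(demo)
    # every row and every column of the 8 x 4 result now has even parity;
    # the corner bit is consistent because the total number of ones is even
    ```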

  11. Mechanical code comparator

    DOEpatents

    Peter, Frank J.; Dalton, Larry J.; Plummer, David W.

    2002-01-01

    A new class of mechanical code comparators is described which have broad potential for application in safety, surety, and security applications. These devices can be implemented as micro-scale electromechanical systems that isolate a secure or otherwise controlled device until an access code is entered. This access code is converted into a series of mechanical inputs to the mechanical code comparator, which compares the access code to a pre-input combination, entered previously into the mechanical code comparator by an operator at the system security control point. These devices provide extremely high levels of robust security. Being totally mechanical in operation, an access control system properly based on such devices cannot be circumvented by software attack alone.

  12. Rewriting the Genetic Code.

    PubMed

    Mukai, Takahito; Lajoie, Marc J; Englert, Markus; Söll, Dieter

    2017-09-08

    The genetic code-the language used by cells to translate their genomes into proteins that perform many cellular functions-is highly conserved throughout natural life. Rewriting the genetic code could lead to new biological functions such as expanding protein chemistries with noncanonical amino acids (ncAAs) and genetically isolating synthetic organisms from natural organisms and viruses. It has long been possible to transiently produce proteins bearing ncAAs, but stabilizing an expanded genetic code for sustained function in vivo requires an integrated approach: creating recoded genomes and introducing new translation machinery that function together without compromising viability or clashing with endogenous pathways. In this review, we discuss design considerations and technologies for expanding the genetic code. The knowledge obtained by rewriting the genetic code will deepen our understanding of how genomes are designed and how the canonical genetic code evolved.

  13. Generating code adapted for interlinking legacy scalar code and extended vector code

    DOEpatents

    Gschwind, Michael K

    2013-06-04

    Mechanisms for intermixing code are provided. Source code is received for compilation using an extended Application Binary Interface (ABI) that extends a legacy ABI and uses a different register configuration than the legacy ABI. First compiled code is generated based on the source code, the first compiled code comprising code for accommodating the difference in register configurations used by the extended ABI and the legacy ABI. The first compiled code and second compiled code are intermixed to generate intermixed code, the second compiled code being compiled code that uses the legacy ABI. The intermixed code comprises at least one call instruction that is one of a call from the first compiled code to the second compiled code or a call from the second compiled code to the first compiled code. The code for accommodating the difference in register configurations is associated with the at least one call instruction.

  14. Breaking the Neural Code

    DTIC Science & Technology

    2015-05-21

    This seedling proposed to use advanced imaging techniques to break the neuronal code that links the firing of neurons in ... generating a closed-loop on-line experimental platform. We have completed all proposed tasks of the seedling and successfully completed preliminary

  15. Phonological coding during reading

    PubMed Central

    Leinenger, Mallorie

    2014-01-01

    The exact role that phonological coding (the recoding of written, orthographic information into a sound based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early (pre-lexical) or that phonological codes come online late (post-lexical)) are discussed. Next the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eyetracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model (Van Orden, 1987), dual-route model (e.g., Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001), parallel distributed processing model (Seidenberg & McClelland, 1989)) are discussed. PMID:25150679

  16. Industrial Computer Codes

    NASA Technical Reports Server (NTRS)

    Shapiro, Wilbur

    1996-01-01

    This is an overview of new and updated industrial codes for seal design and testing. GCYLT (gas cylindrical seals -- turbulent), SPIRALI (spiral-groove seals -- incompressible), KTK (knife-to-knife) labyrinth seal code, and DYSEAL (dynamic seal analysis) are covered. GCYLT uses G-factors for Poiseuille and Couette turbulence coefficients. SPIRALI is updated to include turbulence and inertia, but maintains the narrow-groove theory. The KTK labyrinth seal code handles straight or stepped seals, and DYSEAL provides dynamics for the seal geometry.

  17. Ptolemy Coding Style

    DTIC Science & Technology

    2014-09-05

    Ptolemy Coding Style. Christopher Brooks, Edward A. Lee. Electrical Engineering and Computer Sciences, University of California at Berkeley. ... constraints, so such constraints are not new to the academic community. This document describes the coding style used in Ptolemy II, a package with

  18. Industrial Computer Codes

    NASA Technical Reports Server (NTRS)

    Shapiro, Wilbur

    1996-01-01

    This is an overview of new and updated industrial codes for seal design and testing. GCYLT (gas cylindrical seals -- turbulent), SPIRALI (spiral-groove seals -- incompressible), KTK (knife-to-knife) labyrinth seal code, and DYSEAL (dynamic seal analysis) are covered. GCYLT uses G-factors for Poiseuille and Couette turbulence coefficients. SPIRALI is updated to include turbulence and inertia, but maintains the narrow-groove theory. The KTK labyrinth seal code handles straight or stepped seals, and DYSEAL provides dynamics for the seal geometry.

  19. A MULTIPURPOSE COHERENT INSTABILITY SIMULATION CODE

    SciTech Connect

    BLASKIEWICZ,M.

    2007-06-25

    A multipurpose coherent instability simulation code has been written, documented, and released for use. TRANFT (tran-eff-tee) uses fast Fourier transforms to model transverse wakefields, transverse detuning wakes and longitudinal wakefields in a computationally efficient way. Dual harmonic RF allows for the study of enhanced synchrotron frequency spread. When coupled with chromaticity, the theoretically challenging but highly practical post head-tail regime is open to study. Detuning wakes allow for transverse space charge forces in low energy hadron beams, and a switch allowing for radiation damping makes the code useful for electrons.
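    The FFT-based wakefield evaluation that the abstract credits with computational efficiency can be illustrated with a minimal sketch (the linear wake model and all names here are illustrative assumptions, not TRANFT's actual implementation): the kick on each longitudinal slice is the causal convolution of the bunch charge profile with the wake function, which zero-padded FFTs evaluate in O(N log N) instead of the direct O(N^2) sum.

    ```python
    import numpy as np

    def wake_kicks(charge, wake):
        """Kick on slice i = sum over j <= i of charge[j] * wake[i - j]
        (causal wake), evaluated with zero-padded FFTs so the circular
        convolution does not wrap around."""
        n = len(charge)
        size = 2 * n
        spectrum = np.fft.rfft(charge, size) * np.fft.rfft(wake, size)
        return np.fft.irfft(spectrum, size)[:n]
    ```

    The zero-padding to twice the profile length is what turns the FFT's circular convolution into the linear, causal one the physics requires.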

  20. GALPROP: New Developments in CR Propagation Code

    NASA Technical Reports Server (NTRS)

    Moskalenko, I. V.; Jones, F. C.; Mashnik, S. G.; Strong, A. W.; Ptuskin, V. S.

    2003-01-01

    The numerical Galactic CR propagation code GALPROP has been shown to reproduce simultaneously observational data of many kinds related to CR origin and propagation. It has been validated on direct measurements of nuclei, antiprotons, electrons, positrons as well as on astronomical measurements of gamma rays and synchrotron radiation. Such data provide many independent constraints on model parameters while revealing some contradictions in the conventional view of Galactic CR propagation. Using a new version of GALPROP we study new effects such as processes of wave-particle interactions in the interstellar medium. We also report about other developments in the CR propagation code.

  1. Hybrid Compton camera/coded aperture imaging system

    DOEpatents

    Mihailescu, Lucian [Livermore, CA; Vetter, Kai M [Alameda, CA

    2012-04-10

    A system in one embodiment includes an array of radiation detectors; and an array of imagers positioned behind the array of detectors relative to an expected trajectory of incoming radiation. A method in another embodiment includes detecting incoming radiation with an array of radiation detectors; detecting the incoming radiation with an array of imagers positioned behind the array of detectors relative to a trajectory of the incoming radiation; and performing at least one of Compton imaging using at least the imagers and coded aperture imaging using at least the imagers. A method in yet another embodiment includes detecting incoming radiation with an array of imagers positioned behind an array of detectors relative to a trajectory of the incoming radiation; and performing Compton imaging using at least the imagers.

  2. Transonic airfoil codes

    NASA Technical Reports Server (NTRS)

    Garabedian, P. R.

    1979-01-01

    Computer codes for the design and analysis of transonic airfoils are considered. The design code relies on the method of complex characteristics in the hodograph plane to construct shockless airfoils. The analysis code uses artificial viscosity to calculate flows with weak shock waves at off-design conditions. Comparisons with experiments show that an excellent simulation of two dimensional wind tunnel tests is obtained. The codes have been widely adopted by the aircraft industry as a tool for the development of supercritical wing technology.

  3. FAA Smoke Transport Code

    SciTech Connect

    Domino, Stefan; Luketa-Hanlin, Anay; Gallegos, Carlos

    2006-10-27

    FAA Smoke Transport Code, a physics-based Computational Fluid Dynamics tool, which couples heat, mass, and momentum transfer, has been developed to provide information on smoke transport in cargo compartments with various geometries and flight conditions. The software package contains a graphical user interface for specification of geometry and boundary conditions, analysis module for solving the governing equations, and a post-processing tool. The current code was produced by making substantial improvements and additions to a code obtained from a university. The original code was able to compute steady, uniform, isothermal turbulent pressurization. In addition, a preprocessor and postprocessor were added to arrive at the current software package.

  4. Tokamak Systems Code

    SciTech Connect

    Reid, R.L.; Barrett, R.J.; Brown, T.G.; Gorker, G.E.; Hooper, R.J.; Kalsi, S.S.; Metzler, D.H.; Peng, Y.K.M.; Roth, K.E.; Spampinato, P.T.

    1985-03-01

    The FEDC Tokamak Systems Code calculates tokamak performance, cost, and configuration as a function of plasma engineering parameters. This version of the code models experimental tokamaks. It does not currently consider tokamak configurations that generate electrical power or incorporate breeding blankets. The code has a modular (or subroutine) structure to allow independent modeling for each major tokamak component or system. A primary benefit of modularization is that a component module may be updated without disturbing the remainder of the systems code as long as the input to or output from the module remains unchanged.

  5. Topological subsystem codes

    SciTech Connect

    Bombin, H.

    2010-03-15

    We introduce a family of two-dimensional (2D) topological subsystem quantum error-correcting codes. The gauge group is generated by two-local Pauli operators, so that two-local measurements are enough to recover the error syndrome. We study the computational power of code deformation in these codes and show that boundaries cannot be introduced in the usual way. In addition, we give a general mapping connecting suitable classical statistical mechanical models to optimal error correction in subsystem stabilizer codes that suffer from depolarizing noise.

  6. Radiation exchange

    SciTech Connect

    Taylor, J.H.

    1990-01-01

    This book deals with radiation laws, the phenomena of radiation exchange, the quantification of radiation, and the mechanisms whereby radiation is attenuated in passing through the earth's atmosphere. Applications of radiation exchange are discussed, such as the measurement of the effective radiating temperature of the ozonosphere. Also presented is the development of the concept of atmospheric windows and atmospheric transmittance. Radiation exchange experiments between Earth and space are presented and their interpretations given. The book gives detailed, step-by-step procedures for carrying out the radiometric calibration of an infrared prism spectrometer and a radiation thermopile.

  7. Atmospheric radiation

    SciTech Connect

    Harshvardhan, M.R.

    1991-01-01

    Studies of atmospheric radiative processes are summarized for the period 1987-1990. Topics discussed include radiation modeling; clouds and radiation; radiative effects in dynamics and climate; radiation budget and aerosol effects; and gaseous absorption, particulate scattering and surface reflection. It is concluded that the key developments of the period are a defining of the radiative forcing to the climate system by trace gases and clouds, the recognition that cloud microphysics and morphology need to be incorporated not only into radiation models but also climate models, and the isolation of a few important unsolved theoretical problems in atmospheric radiation.

  8. CosmoRec: Cosmological Recombination code

    NASA Astrophysics Data System (ADS)

    Chluba, Jens; Thomas, Rajat Mani

    2013-04-01

    CosmoRec solves the recombination problem including recombinations to highly excited states, corrections to the 2s-1s two-photon channel, HI Lyn-feedback, n>2 two-photon profile corrections, and n≥2 Raman-processes. The code can solve the radiative transfer equation of the Lyman-series photon field to obtain the required modifications to the rate equations of the resolved levels, and handles electron scattering, the effect of HeI intercombination transitions, and absorption of helium photons by hydrogen. It also allows accounting for dark matter annihilation and optionally includes detailed helium radiative transfer effects.

  9. Fast Coding Unit Encoding Mechanism for Low Complexity Video Coding.

    PubMed

    Gao, Yuan; Liu, Pengyu; Wu, Yueying; Jia, Kebin; Gao, Guandong

    2016-01-01

    In high efficiency video coding (HEVC), the coding tree contributes to excellent compression performance. However, the coding tree brings extremely high computational complexity. Innovative work on improving the coding tree to further reduce encoding time is presented in this paper. A novel low complexity coding tree mechanism is proposed for HEVC fast coding unit (CU) encoding. Firstly, this paper makes an in-depth study of the relationship among CU distribution, quantization parameter (QP) and content change (CC). Secondly, a CU coding tree probability model is proposed for modeling and predicting CU distribution. Eventually, a CU coding tree probability update is proposed, aiming to address probabilistic model distortion problems caused by CC. Experimental results show that the proposed low complexity CU coding tree mechanism significantly reduces encoding time by 27% for lossy coding and 42% for visually lossless coding and lossless coding. The proposed low complexity CU coding tree mechanism is devoted to improving coding performance under various application conditions.
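    The general idea of using a split-probability model to prune the CU coding tree can be sketched as follows. This is a hypothetical illustration of probability-driven early termination, not the paper's actual model; the class name, the QP bucketing, the Laplace prior, and the threshold are all assumptions.

    ```python
    from collections import defaultdict

    class SplitPredictor:
        """Track empirical P(split | depth, qp bucket) and skip the costly
        rate-distortion evaluation of a CU split when that probability
        falls below a threshold."""
        def __init__(self, threshold=0.05):
            # [split count, total count] with a small Laplace-style prior
            self.counts = defaultdict(lambda: [1, 2])
            self.threshold = threshold

        def record(self, depth, qp, did_split):
            c = self.counts[(depth, qp // 6)]
            c[0] += int(did_split)
            c[1] += 1

        def should_try_split(self, depth, qp):
            splits, total = self.counts[(depth, qp // 6)]
            return splits / total >= self.threshold
    ```

    A real encoder would also have to handle the model-distortion problem the abstract mentions, e.g. by decaying the counts when the content changes.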

  10. Fast Coding Unit Encoding Mechanism for Low Complexity Video Coding

    PubMed Central

    Wu, Yueying; Jia, Kebin; Gao, Guandong

    2016-01-01

    In high efficiency video coding (HEVC), the coding tree contributes to excellent compression performance. However, the coding tree brings extremely high computational complexity. Innovative work on improving the coding tree to further reduce encoding time is presented in this paper. A novel low complexity coding tree mechanism is proposed for HEVC fast coding unit (CU) encoding. Firstly, this paper makes an in-depth study of the relationship among CU distribution, quantization parameter (QP) and content change (CC). Secondly, a CU coding tree probability model is proposed for modeling and predicting CU distribution. Eventually, a CU coding tree probability update is proposed, aiming to address probabilistic model distortion problems caused by CC. Experimental results show that the proposed low complexity CU coding tree mechanism significantly reduces encoding time by 27% for lossy coding and 42% for visually lossless coding and lossless coding. The proposed low complexity CU coding tree mechanism is devoted to improving coding performance under various application conditions. PMID:26999741

  11. Radiation physics, biophysics, and radiation biology

    SciTech Connect

    Hall, E.J.; Zaider, M.

    1993-05-01

    Research at the Center for Radiological Research is a multidisciplinary blend of physics, chemistry and biology aimed at understanding the mechanisms involved in the health problems resulting from human exposure to ionizing radiations. There is an increased focus on biochemistry and the application of the techniques of molecular biology to the problems of radiation biology. Research highlights of the program from the past year are described. A mathematical model describing the production of single-strand and double-strand breaks in DNA as a function of radiation quality has been completed. For the first time, Monte Carlo techniques have been used to obtain directly the spatial distribution of DNA moieties altered by radiation. This information was obtained by including in the transport codes a realistic description of the electronic structure of DNA. We have investigated structure-activity relationships for the potential oncogenicity of a new generation of bioreductive drugs that function as hypoxic cytotoxins. Experimental and theoretical investigation of the inverse dose-rate effect, whereby medium-LET radiations actually produce an enhanced effect when the dose is protracted, is now at a point where the basic mechanisms are reasonably understood, and the complex interplay between dose, dose rate and radiation quality which is necessary for the effect to be present can now be predicted, at least in vitro. In terms of early radiobiological damage, a quantitative link has been established between basic energy deposition and locally multiply damaged sites, the radiochemical precursors of DNA double-strand breaks; specifically, the spatial and energy deposition requirements necessary to form LMDs have been evaluated. For the first time, a mechanistically understood biological "fingerprint" of high-LET radiation has been established: specifically, measurement of the ratio of inter- to intra-chromosomal aberrations produces a unique signature of alpha particles or neutrons.

  12. Pelvic radiation - discharge

    MedlinePlus

    Radiation of the pelvis - discharge; Cancer treatment - pelvic radiation; Prostate cancer - pelvic radiation; Ovarian cancer - pelvic radiation; Cervical cancer - pelvic radiation; Uterine cancer - pelvic radiation; Rectal cancer - ...

  13. Insurance billing and coding.

    PubMed

    Napier, Rebecca H; Bruelheide, Lori S; Demann, Eric T K; Haug, Richard H

    2008-07-01

    The purpose of this article is to highlight the importance of understanding various numeric and alpha-numeric codes for accurately billing dental and medically related services to private pay or third-party insurance carriers. In the United States, common dental terminology (CDT) codes are most commonly used by dentists to submit claims, whereas current procedural terminology (CPT) and International Classification of Diseases, Ninth Revision, Clinical Modification (ICD.9.CM) codes are more commonly used by physicians to bill for their services. The CPT and ICD.9.CM coding systems complement each other in that CPT codes provide the procedure and service information and ICD.9.CM codes provide the reason or rationale for a particular procedure or service. These codes are more commonly used for "medical necessity" determinations, and general dentists and specialists who routinely perform care, including trauma-related care, biopsies, and dental treatment as a result of or in anticipation of a cancer-related treatment, are likely to use these codes. Claim submissions for care provided can be completed electronically or by means of paper forms.

  14. Coding Acoustic Metasurfaces.

    PubMed

    Xie, Boyang; Tang, Kun; Cheng, Hua; Liu, Zhengyou; Chen, Shuqi; Tian, Jianguo

    2017-02-01

    Coding acoustic metasurfaces can combine simple logical bits to acquire sophisticated functions in wave control. The acoustic logical bits can achieve a phase difference of exactly π and a perfect match of the amplitudes for the transmitted waves. By programming the coding sequences, acoustic metasurfaces with various functions, including creating peculiar antenna patterns and wave focusing, have been demonstrated.
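
    As a hedged sketch of the coding idea (the geometry, frequency, and focal length below are assumed for illustration, not taken from the article): a focusing metasurface can be programmed by quantizing the ideal hyperbolic lens phase profile to the two logical bits 0 and π.

```python
import math

def focusing_bits(n_units, pitch, focal_len, wavelength):
    """Quantize the ideal hyperbolic lens phase profile to binary
    (0 / pi) logical bits, one bit per metasurface unit."""
    k = 2 * math.pi / wavelength              # wavenumber
    bits = []
    for i in range(n_units):
        x = (i - (n_units - 1) / 2) * pitch   # unit position on the surface
        # extra path length from this unit to the focal point
        phase = k * (math.sqrt(x ** 2 + focal_len ** 2) - focal_len)
        # fold into [0, 2*pi) and round to the nearer of {0, pi}
        bits.append(round((phase % (2 * math.pi)) / math.pi) % 2)
    return bits

# 16 units on a 20 mm pitch, focusing 3.4 kHz sound in air
# (0.1 m wavelength) at 0.3 m: a symmetric 0/1 coding sequence
sequence = focusing_bits(16, 0.02, 0.3, 0.1)
```

    The sequence is mirror-symmetric about the focal axis, as expected for an on-axis focus.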

  15. Synthesizing Certified Code

    NASA Technical Reports Server (NTRS)

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach to demonstrate software quality on a formal level. Its basic idea is to require producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates which can be checked independently. Since code certification uses the same underlying technology as program verification, it also requires many detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding these annotations to the code is time-consuming and error-prone. We address this problem by combining code certification with automatic program synthesis. We propose an approach to generate simultaneously, from a high-level specification, code and all annotations required to certify the generated code. Here, we describe a certification extension of AUTOBAYES, a synthesis tool which automatically generates complex data analysis programs from compact specifications. AUTOBAYES contains sufficient high-level domain knowledge to generate detailed annotations. This allows us to use a general-purpose verification condition generator to produce a set of proof obligations in first-order logic. The obligations are then discharged using the automated theorem prover E-SETHEO. We demonstrate our approach by certifying operator safety for a generated iterative data classification program without manual annotation of the code.

  16. Dress Codes for Teachers?

    ERIC Educational Resources Information Center

    Million, June

    2004-01-01

    In this article, the author discusses an e-mail survey of principals from across the country regarding whether or not their school had a formal staff dress code. The results indicate that most did not have a formal dress code, but agreed that professional dress for teachers was not only necessary, but showed respect for the school and had a…

  17. Pseudonoise code tracking loop

    NASA Technical Reports Server (NTRS)

    Laflame, D. T. (Inventor)

    1980-01-01

    A delay-locked loop is presented for tracking a pseudonoise (PN) reference code in an incoming communication signal. The loop is less sensitive to gain imbalances, which can otherwise introduce timing errors in the PN reference code formed by the loop.
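
    The early-late principle behind such a delay-locked loop can be sketched with a toy chip-level simulation (the random ±1 sequence below stands in for a real LFSR-generated PN code, and the one-chip early/late spacing is an assumption):

```python
import random

def correlate(a, b):
    return sum(x * y for x, y in zip(a, b))

def shift(seq, t):
    """Circularly delay a chip sequence by t chips."""
    n = len(seq)
    return [seq[(i - t) % n] for i in range(n)]

def dll_discriminator(received, code, tau):
    """Early-minus-late correlator: compare replicas one chip ahead
    of and behind the candidate delay tau.  The output is near zero
    at the true code delay, and its sign tells the loop which way
    to slew the local reference code."""
    early = correlate(received, shift(code, tau - 1))
    late = correlate(received, shift(code, tau + 1))
    return early - late

rng = random.Random(1)
code = [rng.choice((-1, 1)) for _ in range(127)]   # toy PN code
received = shift(code, 3)                          # true delay: 3 chips
disc = {tau: dll_discriminator(received, code, tau) for tau in (2, 3, 4)}
```

    At the true delay the early and late correlations balance; one chip off in either direction, the full-height correlation peak falls into one arm and the discriminator swings strongly in the corresponding direction.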

  18. Modified JPEG Huffman coding.

    PubMed

    Lakhani, Gopal

    2003-01-01

    It is a well observed characteristic that when a DCT block is traversed in the zigzag order, the AC coefficients generally decrease in size and the run-lengths of zero coefficients increase. This article presents a minor modification to the Huffman coding of the JPEG baseline compression algorithm to exploit this redundancy. For this purpose, DCT blocks are divided into bands so that each band can be coded using a separate code table. Three implementations are presented, all of which move the end-of-block marker up into the middle of the DCT block and use it to indicate the band boundaries. Experimental results are presented to compare the reduction in code size obtained by our methods with the JPEG sequential-mode Huffman coding and arithmetic coding methods. The average code reduction relative to the total image code size for one of our methods is 4%. Our methods can also be used for progressive image transmission, and hence experimental results are also given to compare them with two-, three-, and four-band implementations of the JPEG spectral selection method.
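
    The zigzag traversal and band partition the abstract relies on can be sketched as follows (the band boundaries here are illustrative; the article's own three-band split is not reproduced):

```python
def zigzag(block):
    """Read an n x n DCT block in JPEG zigzag order: traverse the
    anti-diagonals, alternating direction on odd/even diagonals."""
    n = len(block)
    order = sorted(((r, c) for r in range(n) for c in range(n)),
                   key=lambda rc: (rc[0] + rc[1],
                                   rc[0] if (rc[0] + rc[1]) % 2 else -rc[0]))
    return [block[r][c] for r, c in order]

# label each coefficient with its raster index to expose the scan order
block = [[8 * r + c for c in range(8)] for r in range(8)]
zz = zigzag(block)

# split the 64 zigzag positions into bands, each of which would get
# its own Huffman table (boundaries chosen arbitrarily here)
BANDS = [(0, 6), (6, 15), (15, 64)]
bands = [zz[lo:hi] for lo, hi in BANDS]
```

    Because high-frequency coefficients cluster at the tail of the zigzag scan, the later bands are dominated by zeros, which is the redundancy the per-band tables exploit.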

  19. Code of Ethics

    ERIC Educational Resources Information Center

    Division for Early Childhood, Council for Exceptional Children, 2009

    2009-01-01

    The Code of Ethics of the Division for Early Childhood (DEC) of the Council for Exceptional Children is a public statement of principles and practice guidelines supported by the mission of DEC. The foundation of this Code is based on sound ethical reasoning related to professional practice with young children with disabilities and their families…

  1. Lichenase and coding sequences

    SciTech Connect

    Li, Xin-Liang; Ljungdahl, Lars G.; Chen, Huizhong

    2000-08-15

    The present invention provides a fungal lichenase, i.e., an endo-1,3-1,4-β-D-glucanohydrolase, its coding sequence, recombinant DNA molecules comprising the lichenase coding sequences, recombinant host cells and methods for producing same. The present lichenase is from Orpinomyces PC-2.

  2. Legacy Code Modernization

    NASA Technical Reports Server (NTRS)

    Hribar, Michelle R.; Frumkin, Michael; Jin, Haoqiang; Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    Over the past decade, high performance computing has evolved rapidly; systems based on commodity microprocessors have been introduced in quick succession from at least seven vendors/families. Porting codes to every new architecture is a difficult problem; in particular, here at NASA, there are many large CFD applications that are very costly to port to new machines by hand. The LCM ("Legacy Code Modernization") Project is the development of an integrated parallelization environment (IPE) which performs the automated mapping of legacy CFD (Fortran) applications to state-of-the-art high performance computers. While most projects to port codes focus on the parallelization of the code, we consider porting to be an iterative process consisting of several steps: 1) code cleanup, 2) serial optimization, 3) parallelization, 4) performance monitoring and visualization, 5) intelligent tools for automated tuning using performance prediction and 6) machine specific optimization. The approach for building this parallelization environment is to build the components for each of the steps simultaneously and then integrate them together. The demonstration will exhibit our latest research in building this environment: 1. Parallelizing tools and compiler evaluation. 2. Code cleanup and serial optimization using automated scripts 3. Development of a code generator for performance prediction 4. Automated partitioning 5. Automated insertion of directives. These demonstrations will exhibit the effectiveness of an automated approach for all the steps involved with porting and tuning a legacy code application for a new architecture.

  3. Computerized mega code recording.

    PubMed

    Burt, T W; Bock, H C

    1988-04-01

    A system has been developed to facilitate recording of advanced cardiac life support mega code testing scenarios. By scanning a paper "keyboard" using a bar code wand attached to a portable microcomputer, the person assigned to record the scenario can easily generate an accurate, complete, timed, and typewritten record of the given situations and the obtained responses.

  5. Energy Conservation Code Decoded

    SciTech Connect

    Cole, Pam C.; Taylor, Zachary T.

    2006-09-01

    Designing an energy-efficient, affordable, and comfortable home is a lot easier thanks to a slim, easier-to-read booklet, the 2006 International Energy Conservation Code (IECC), published in March 2006. States, counties, and cities have begun reviewing the new code as a potential upgrade to their existing codes. Maintained under the public consensus process of the International Code Council, the IECC is designed to do just what its title says: promote the design and construction of energy-efficient homes and commercial buildings. Homes in this case means traditional single-family homes, duplexes, condominiums, and apartment buildings having three or fewer stories. The U.S. Department of Energy, which played a key role in proposing the changes that resulted in the new code, is offering a free training course that covers the residential provisions of the 2006 IECC.

  6. Evolving genetic code

    PubMed Central

    OHAMA, Takeshi; INAGAKI, Yuji; BESSHO, Yoshitaka; OSAWA, Syozo

    2008-01-01

    In 1985, we reported that a bacterium, Mycoplasma capricolum, used a deviant genetic code, namely UGA, a “universal” stop codon, was read as tryptophan. This finding, together with the deviant nuclear genetic codes in not a few organisms and a number of mitochondria, shows that the genetic code is not universal, and is in a state of evolution. To account for the changes in codon meanings, we proposed the codon capture theory stating that all the code changes are non-disruptive, without accompanying changes to the amino acid sequences of proteins. Supporting evidence for the theory is presented in this review. A possible evolutionary process from the ancient to the present-day genetic code is also discussed. PMID:18941287
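
    The UGA reassignment the authors report can be illustrated with a toy translation routine (only a four-codon fragment of the standard table is included; the full 64-codon code is omitted for brevity):

```python
# fragment of the standard genetic code (full 64-codon table omitted)
STANDARD = {"AAA": "Lys", "UGG": "Trp", "UGA": "Stop", "UAA": "Stop"}

def translate(rna, table):
    """Translate an RNA string codon by codon, halting at a stop codon."""
    protein = []
    for i in range(0, len(rna) - 2, 3):
        amino = table[rna[i:i + 3]]
        if amino == "Stop":
            break
        protein.append(amino)
    return protein

# Mycoplasma capricolum reads the "universal" stop codon UGA as Trp
MYCOPLASMA = dict(STANDARD, UGA="Trp")

rna = "AAAUGAUGG"
universal = translate(rna, STANDARD)     # terminates at the UGA codon
deviant = translate(rna, MYCOPLASMA)     # reads through UGA as Trp
```

    The same transcript yields a one-residue peptide under the standard table but a three-residue peptide under the deviant one, which is why such reassignments matter for the codon capture argument.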

  7. Radiation Basics

    EPA Pesticide Factsheets

    Radiation can come from unstable atoms or it can be produced by machines. There are two kinds of radiation: ionizing and non-ionizing radiation. Learn about alpha, beta, gamma and x-ray radiation, as well as the different types of doses.

  8. Radiation Transport in Dynamic Spacetimes

    NASA Astrophysics Data System (ADS)

    Schnittman, Jeremy; Baker, John G.; Etienne, Zachariah; Giacomazzo, Bruno; Kelly, Bernard J.

    2017-08-01

    We present early results from a new radiation transport calculation of gas accretion onto merging binary black holes. We use the Monte Carlo radiation transport code Pandurata, now generalized for application to dynamic spacetimes. The time variability of the metric requires careful numerical techniques for solving the geodesic equation, particularly with tabulated spacetime data from numerical relativity codes. Using a new series of general relativistic magneto-hydrodynamical simulations of magnetized flow onto binary black holes, we investigate the possibility for detecting and identifying unique electromagnetic counterparts to gravitational wave events.

  9. Radiation Transport in Dynamic Spacetimes

    NASA Astrophysics Data System (ADS)

    Schnittman, Jeremy; Baker, John; Etienne, Zachariah; Giacomazzo, Bruno; Kelly, Bernard

    2017-01-01

    We present early results from a new radiation transport calculation of gas accretion onto merging binary black holes. We use the Monte Carlo radiation transport code Pandurata, now generalized for application to dynamic spacetimes. The time variability of the metric requires careful numerical techniques for solving the geodesic equation, particularly with tabulated spacetime data from numerical relativity codes. Using a new series of general relativistic magneto-hydrodynamical simulations of magnetized flow onto binary black holes, we investigate the possibility for detecting and identifying unique electromagnetic counterparts to gravitational wave events.

  10. HotSpot Health Physics Codes

    SciTech Connect

    Homann, S. G.

    2013-04-18

    The HotSpot Health Physics Codes were created to provide emergency response personnel and emergency planners with a fast, field-portable set of software tools for evaluating incidents involving radioactive material. The software is also used for safety analysis of facilities handling nuclear material. HotSpot provides a fast and usually conservative means of estimating the radiation effects associated with the short-term (less than 24 hours) atmospheric release of radioactive materials.
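
    Atmospheric-release estimates of this kind are typically built on Gaussian plume modeling; a minimal sketch follows (the dispersion sigmas and source values below are assumed inputs for illustration, not HotSpot's own correlations or output):

```python
import math

def plume_concentration(q, u, y, z, h, sigma_y, sigma_z):
    """Ground-reflecting Gaussian plume: air concentration at a
    receptor offset y crosswind and z above ground, for a release
    of strength q at effective height h into wind speed u, given
    the dispersion sigmas evaluated at the receptor's downwind range."""
    lateral = math.exp(-y ** 2 / (2 * sigma_y ** 2))
    vertical = (math.exp(-(z - h) ** 2 / (2 * sigma_z ** 2)) +
                math.exp(-(z + h) ** 2 / (2 * sigma_z ** 2)))  # ground reflection
    return q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# ground-level centerline vs. off-axis values for an assumed release
centerline = plume_concentration(1e9, 3.0, 0.0, 0.0, 10.0, 70.0, 35.0)
off_axis = plume_concentration(1e9, 3.0, 100.0, 0.0, 10.0, 70.0, 35.0)
```

    The image term in `vertical` reflects the plume at the ground, which is one source of the model's conservatism near the surface.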

  11. Computer aided radiation analysis for manned spacecraft

    NASA Technical Reports Server (NTRS)

    Appleby, Matthew H.; Griffin, Brand N.; Tanner, Ernest R., II; Pogue, William R.; Golightly, Michael J.

    1991-01-01

    In order to assist in the design of radiation shielding, an analytical tool is presented that can be employed in combination with CAD facilities and NASA transport codes. The nature of radiation in space is described, and the operational requirements for protection are listed as background information for the use of the technique. The method is based on the Boeing radiation exposure model (BREM) for combining NASA radiation transport codes and CAD facilities, and the output is given as contour maps of the radiation-shield distribution so that dangerous areas can be identified. Computational models are used to solve the 1D Boltzmann transport equation and determine the shielding needs for the worst-case scenario. BREM can be employed directly with the radiation computations to assess radiation protection during all phases of design, which saves time and ultimately spacecraft weight.
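
    For contrast with the 1D Boltzmann treatment described above, the crudest shielding estimate is narrow-beam exponential attenuation; a hedged sketch (the attenuation coefficient is an assumed value, and buildup and scattering are ignored):

```python
import math

def shield_thickness(dose_unshielded, dose_limit, mu):
    """Slab thickness needed so that narrow-beam attenuation
    exp(-mu * t) brings the dose under the limit.  This ignores
    buildup and scattering, which a transport solution captures."""
    return math.log(dose_unshielded / dose_limit) / mu

# attenuate a dose by a factor of 100 with mu = 0.5 per cm (assumed)
t = shield_thickness(100.0, 1.0, 0.5)
```

    Inverting the exponential like this gives a quick lower-fidelity cross-check on thicknesses that a transport code such as BREM's would refine.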

  13. Radiation Pattern of Chair Armed Microstrip Antenna

    NASA Astrophysics Data System (ADS)

    Mishra, Rabindra Kishore; Sahu, Kumar Satyabrat

    2016-12-01

    This work analyzes planar antenna conformable to chair arm shaped surfaces for WLAN application. Closed form expressions for its radiation pattern are developed and validated using measurements on prototype and commercial EM code at 2.4 GHz.

  14. Quantum convolutional codes derived from constacyclic codes

    NASA Astrophysics Data System (ADS)

    Yan, Tingsu; Huang, Xinmei; Tang, Yuansheng

    2014-12-01

    In this paper, three families of quantum convolutional codes are constructed. The first one and the second one can be regarded as a generalization of Theorems 3, 4, 7 and 8 [J. Chen, J. Li, F. Yang and Y. Huang, Int. J. Theor. Phys., doi:10.1007/s10773-014-2214-6 (2014)], in the sense that we drop the constraint q ≡ 1 (mod 4). Furthermore, the second one and the third one attain the quantum generalized Singleton bound.

  15. The EGS5 Code System

    SciTech Connect

    Hirayama, Hideo; Namito, Yoshihito; Bielajew, Alex F.; Wilderman, Scott J.; U., Michigan; Nelson, Walter R.; /SLAC

    2005-12-20

    In the nineteen years since EGS4 was released, it has been used in a wide variety of applications, particularly in medical physics, radiation measurement studies, and industrial development. Every new user and every new application bring new challenges for Monte Carlo code designers, and code refinements and bug fixes eventually result in a code that becomes difficult to maintain. Several of the code modifications represented significant advances in electron and photon transport physics, and required a more substantial intervention than code patching. Moreover, the arcane MORTRAN3[48] computer language of EGS4 was highest on the complaint list of the users of EGS4. The size of the EGS4 user base is difficult to measure, as there never existed a formal user registration process. However, some idea of the numbers may be gleaned from the number of EGS4 manuals that were produced and distributed at SLAC: almost three thousand. Consequently, the EGS5 project was undertaken. It was decided to employ the FORTRAN 77 compiler, yet include, as much as possible, the structural beauty and power of MORTRAN3. This report consists of four chapters and several appendices. Chapter 1 is an introduction to EGS5 and to this report in general. We suggest that you read it. Chapter 2 is a major update of similar chapters in the old EGS4 report[126] (SLAC-265) and the old EGS3 report[61] (SLAC-210), in which all the details of the old physics (i.e., models which were carried over from EGS4) and the new physics are gathered together. The descriptions of the new physics are extensive, and not for the faint of heart. Detailed knowledge of the contents of Chapter 2 is not essential in order to use EGS, but sophisticated users should be aware of its contents. In particular, details of the restrictions on the range of applicability of EGS are dispersed throughout the chapter. First-time users of EGS should skip Chapter 2 and come back to it later if necessary.
With the release of the EGS4 version

  16. Development of a shuttle plume radiation heating indicator

    NASA Technical Reports Server (NTRS)

    Reardon, John E.

    1988-01-01

    The primary objectives were to develop a Base Heating Indicator Code and a new plume radiation code for the Space Shuttle. Additional work included: revision of the Space Shuttle plume radiation environment for changes in configuration and correction of errors, evaluation of radiation measurements to establish a plume radiation model for the SRB High Performance Motor (HPM) plume, radiation predictions for preliminary designs, and participation in hydrogen disposal analysis and testing for the VAFB Shuttle launch site. The two most significant accomplishments were the development of the Base Heating Indicator Code and the Shuttle Engine Plume Radiation (SEPRAD) Code. The major efforts in revising the current Shuttle plume radiation environment were for the Orbiter base heat shield and the ET components in the Orbiter-ET interface region. The work performed is summarized in the technical discussion section with references to the documents containing detailed results. The technical discussion is followed by a summary of conclusions and recommendations for future work.

  17. Charged and neutral particle transport methods and applications: The CALOR code system

    SciTech Connect

    Gabriel, T.A.; Charlton, L.A.

    1997-04-01

    The CALOR code system, which is a complete radiation transport code system, is described with emphasis on the high-energy (> 20 MeV) nuclear collision models. Codes similar to CALOR are also briefly discussed. A current application using CALOR which deals with the development of the National Spallation Neutron Source is also given.

  18. Coded-aperture imaging in nuclear medicine

    NASA Technical Reports Server (NTRS)

    Smith, Warren E.; Barrett, Harrison H.; Aarsvold, John N.

    1989-01-01

    Coded-aperture imaging is a technique for imaging sources that emit high-energy radiation. This type of imaging involves shadow casting and not reflection or refraction. High-energy sources exist in x ray and gamma-ray astronomy, nuclear reactor fuel-rod imaging, and nuclear medicine. Of these three areas nuclear medicine is perhaps the most challenging because of the limited amount of radiation available and because a three-dimensional source distribution is to be determined. In nuclear medicine a radioactive pharmaceutical is administered to a patient. The pharmaceutical is designed to be taken up by a particular organ of interest, and its distribution provides clinical information about the function of the organ, or the presence of lesions within the organ. This distribution is determined from spatial measurements of the radiation emitted by the radiopharmaceutical. The principles of imaging radiopharmaceutical distributions with coded apertures are reviewed. Included is a discussion of linear shift-variant projection operators and the associated inverse problem. A system developed at the University of Arizona in Tucson consisting of small modular gamma-ray cameras fitted with coded apertures is described.

  19. Coded-aperture imaging in nuclear medicine

    NASA Astrophysics Data System (ADS)

    Smith, Warren E.; Barrett, Harrison H.; Aarsvold, John N.

    1989-11-01

    Coded-aperture imaging is a technique for imaging sources that emit high-energy radiation. This type of imaging involves shadow casting and not reflection or refraction. High-energy sources exist in x ray and gamma-ray astronomy, nuclear reactor fuel-rod imaging, and nuclear medicine. Of these three areas nuclear medicine is perhaps the most challenging because of the limited amount of radiation available and because a three-dimensional source distribution is to be determined. In nuclear medicine a radioactive pharmaceutical is administered to a patient. The pharmaceutical is designed to be taken up by a particular organ of interest, and its distribution provides clinical information about the function of the organ, or the presence of lesions within the organ. This distribution is determined from spatial measurements of the radiation emitted by the radiopharmaceutical. The principles of imaging radiopharmaceutical distributions with coded apertures are reviewed. Included is a discussion of linear shift-variant projection operators and the associated inverse problem. A system developed at the University of Arizona in Tucson consisting of small modular gamma-ray cameras fitted with coded apertures is described.
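
    The shadow-casting and correlation-decoding principle reviewed above can be sketched in one dimension (the open/opaque pattern below is an arbitrary illustration, not a real uniformly redundant array, so the reconstruction has sidelobes):

```python
def circular_correlate(a, b):
    n = len(a)
    return [sum(a[(i + j) % n] * b[j] for j in range(n)) for i in range(n)]

# 1D aperture: 1 = open element, 0 = opaque element (arbitrary pattern)
mask = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1, 0]
n = len(mask)

# two point sources of strengths 5 and 2 at positions 3 and 9
source = [0] * n
source[3], source[9] = 5, 2

# shadow casting: each source projects a shifted copy of the mask
detector = [sum(source[s] * mask[(i - s) % n] for s in range(n))
            for i in range(n)]

# decode by correlating with a zero-mean version of the mask; the
# reconstruction peaks at the source positions
mean_open = sum(mask) / n
decoder = [m - mean_open for m in mask]
image = circular_correlate(detector, decoder)
```

    With a true URA mask the decoding sidelobes would be flat; here they merely stay well below the two source peaks, which is enough to show the principle.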

  20. Pyramid image codes

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B.

    1990-01-01

    All vision systems, both human and machine, transform the spatial image into a coded representation. Particular codes may be optimized for efficiency or to extract useful image features. Researchers explored image codes based on primary visual cortex in man and other primates. Understanding these codes will advance the art in image coding, autonomous vision, and computational human factors. In cortex, imagery is coded by features that vary in size, orientation, and position. Researchers have devised a mathematical model of this transformation, called the Hexagonal oriented Orthogonal quadrature Pyramid (HOP). In a pyramid code, features are segregated by size into layers, with fewer features in the layers devoted to large features. Pyramid schemes provide scale invariance, and are useful for coarse-to-fine searching and for progressive transmission of images. The HOP Pyramid is novel in three respects: (1) it uses a hexagonal pixel lattice, (2) it uses oriented features, and (3) it accurately models most of the prominent aspects of primary visual cortex. The transform uses seven basic features (kernels), which may be regarded as three oriented edges, three oriented bars, and one non-oriented blob. Application of these kernels to non-overlapping seven-pixel neighborhoods yields six oriented, high-pass pyramid layers, and one low-pass (blob) layer.
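
    A much-simplified, one-dimensional analogue of such a pyramid (two-tap averaging in place of the seven-pixel hexagonal HOP kernels, which are not reproduced here) shows the layer structure and exact reconstruction:

```python
def analyze(signal):
    """One pyramid level: split an even-length signal into a
    half-rate low-pass layer plus the detail needed to invert it."""
    low = [(signal[2 * i] + signal[2 * i + 1]) / 2
           for i in range(len(signal) // 2)]
    detail = [signal[2 * i] - low[i] for i in range(len(low))]
    return low, detail

def synthesize(low, detail):
    """Exactly invert one analyze() level."""
    out = []
    for l, d in zip(low, detail):
        out += [l + d, l - d]
    return out

def build_pyramid(signal, levels):
    """Detail layers fine-to-coarse, then the coarsest low-pass layer;
    coarser layers hold fewer coefficients, as in any pyramid code."""
    layers = []
    for _ in range(levels):
        signal, detail = analyze(signal)
        layers.append(detail)
    layers.append(signal)
    return layers

x = [3, 1, 4, 1, 5, 9, 2, 6]
pyramid = build_pyramid(x, 2)
```

    The halving of layer sizes mirrors the passage's point that fewer features are devoted to the larger (coarser) scales.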

  1. Embedded foveation image coding.

    PubMed

    Wang, Z; Bovik, A C

    2001-01-01

    The human visual system (HVS) is highly space-variant in sampling, coding, processing, and understanding. The spatial resolution of the HVS is highest around the point of fixation (foveation point) and decreases rapidly with increasing eccentricity. By taking advantage of this fact, it is possible to remove considerable high-frequency information redundancy from the peripheral regions and still reconstruct a perceptually good quality image. Great success has been obtained previously by a class of embedded wavelet image coding algorithms, such as the embedded zerotree wavelet (EZW) and the set partitioning in hierarchical trees (SPIHT) algorithms. Embedded wavelet coding not only provides very good compression performance, but also has the property that the bitstream can be truncated at any point and still be decoded to recreate a reasonably good quality image. In this paper, we propose an embedded foveation image coding (EFIC) algorithm, which orders the encoded bitstream to optimize foveated visual quality at arbitrary bit-rates. A foveation-based image quality metric, namely, foveated wavelet image quality index (FWQI), plays an important role in the EFIC system. We also developed a modified SPIHT algorithm to improve the coding efficiency. Experiments show that EFIC integrates foveation filtering with foveated image coding and demonstrates very good coding performance and scalability in terms of foveated image quality measurement.
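
    The space-variant resolution that motivates EFIC can be mimicked with a toy one-dimensional foveation filter (the linear growth of blur radius with eccentricity is an assumption; FWQI and the wavelet machinery are not modeled):

```python
def foveate(signal, fixation, scale=0.5):
    """Blur each sample with a moving average whose radius grows
    with eccentricity (distance from the fixation point), so detail
    is preserved at fixation and discarded in the periphery."""
    n = len(signal)
    out = []
    for i in range(n):
        radius = int(abs(i - fixation) * scale)
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        window = signal[lo:hi]
        out.append(sum(window) / len(window))
    return out

fine_detail = [i % 2 for i in range(40)]   # highest-frequency test signal
foveated = foveate(fine_detail, fixation=0)
```

    At the fixation point the signal passes through untouched, while far from it the alternating pattern is averaged toward its mean, which is the redundancy a foveated coder spends fewer bits on.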

  2. Report number codes

    SciTech Connect

    Nelson, R.N.

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on American National Standards Institute standard Z39.23-1983, Standard Technical Report Number (STRN): Format and Creation. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: the report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name.
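
    A minimal sketch of splitting an STRN into the two parts described (the naive split at the last hyphen is an assumption; real report codes can themselves contain hyphens or slashes):

```python
def split_strn(strn):
    """Split a Standard Technical Report Number into its report
    code and sequential number at the last hyphen."""
    code, _, sequential = strn.rpartition("-")
    return code, sequential
```

    For example, `split_strn("SLAC-265")` separates the issuing-organization code from the sequence number assigned by that organization.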

  3. TACO: a finite element heat transfer code

    SciTech Connect

    Mason, W.E. Jr.

    1980-02-01

    TACO is a two-dimensional implicit finite element code for heat transfer analysis. It can perform both linear and nonlinear analyses and can be used to solve either transient or steady state problems. Either plane or axisymmetric geometries can be analyzed. TACO has the capability to handle time or temperature dependent material properties, and materials may be either isotropic or orthotropic. A variety of time and temperature dependent loadings and boundary conditions are available, including temperature, flux, convection, and radiation boundary conditions and internal heat generation. Additionally, TACO has some specialized features such as internal surface conditions (e.g., contact resistance), bulk nodes, enclosure radiation with view factor calculations, and chemical reaction kinetics. A user subprogram feature allows for any type of functional representation of any independent variable. A bandwidth and profile minimization option is also available in the code. Graphical representation of data generated by TACO is provided by a companion post-processor named POSTACO. The theory on which TACO is based is outlined, the capabilities of the code are explained, and the input data required to perform an analysis with TACO are described. Some simple examples are provided to illustrate the use of the code.
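
    TACO itself is a 2D finite element code; the implicit time integration it relies on can be sketched in one dimension with a backward-Euler conduction step on a uniform grid (fixed end temperatures; all values below are illustrative, not TACO input):

```python
def solve_tridiagonal(a, b, c, d):
    """Thomas algorithm for a tridiagonal system
    (a: sub-diagonal, b: diagonal, c: super-diagonal, d: right side)."""
    n = len(b)
    b, c, d = b[:], c[:], d[:]
    for i in range(1, n):
        m = a[i] / b[i - 1]
        b[i] -= m * c[i - 1]
        d[i] -= m * d[i - 1]
    x = [0.0] * n
    x[-1] = d[-1] / b[-1]
    for i in range(n - 2, -1, -1):
        x[i] = (d[i] - c[i] * x[i + 1]) / b[i]
    return x

def step_heat(T, alpha, dx, dt):
    """One backward-Euler step of 1-D conduction, with the end
    temperatures held fixed (Dirichlet boundaries)."""
    n = len(T)
    r = alpha * dt / dx ** 2
    sub = [0.0] + [-r] * (n - 2) + [0.0]
    diag = [1.0] + [1 + 2 * r] * (n - 2) + [1.0]
    sup = [0.0] + [-r] * (n - 2) + [0.0]
    return solve_tridiagonal(sub, diag, sup, T[:])

# a hot interior node diffusing between cold fixed ends
T0 = [0.0, 0.0, 0.0, 100.0, 0.0, 0.0, 0.0]
T1 = step_heat(T0, alpha=1.0, dx=1.0, dt=1.0)
```

    Being implicit, the step is stable even at this large time step, which is the practical reason codes like TACO favor implicit formulations for transient problems.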

  4. The PARTRAC code: Status and recent developments

    NASA Astrophysics Data System (ADS)

    Friedland, Werner; Kundrat, Pavel

    Biophysical modeling is of particular value for predictions of radiation effects due to manned space missions. PARTRAC is an established tool for Monte Carlo-based simulations of radiation track structures, damage induction in cellular DNA and its repair [1]. Dedicated modules describe interactions of ionizing particles with the traversed medium, the production and reactions of reactive species, and score DNA damage determined by overlapping track structures with multi-scale chromatin models. The DNA repair module describes the repair of DNA double-strand breaks (DSB) via the non-homologous end-joining pathway; the code explicitly simulates the spatial mobility of individual DNA ends in parallel with their processing by major repair enzymes [2]. To simulate the yields and kinetics of radiation-induced chromosome aberrations, the repair module has been extended by tracking the information on the chromosome origin of ligated fragments as well as the presence of centromeres [3]. PARTRAC calculations have been benchmarked against experimental data on various biological endpoints induced by photon and ion irradiation. The calculated DNA fragment distributions after photon and ion irradiation reproduce corresponding experimental data and their dose- and LET-dependence. However, in particular for high-LET radiation many short DNA fragments are predicted below the detection limits of the measurements, so that the experiments significantly underestimate DSB yields by high-LET radiation [4]. The DNA repair module correctly describes the LET-dependent repair kinetics after 60Co gamma-rays and different N-ion radiation qualities [2]. First calculations on the induction of chromosome aberrations have overestimated the absolute yields of dicentrics, but correctly reproduced their relative dose-dependence and the difference between gamma- and alpha particle irradiation [3].
Recent developments of the PARTRAC code include a model of hetero- vs euchromatin structures to enable

  5. Radiation shielding of the main injector

    SciTech Connect

    Bhat, C.M.; Martin, P.S.

    1995-05-01

    The radiation shielding in the Fermilab Main Injector (FMI) complex has been carried out by adopting a number of stringent guidelines prescribed by a previous safety analysis. The required amount of radiation shielding at various locations of the FMI has been determined using Monte Carlo computations. A three-dimensional ray-tracing code, as well as a code based upon empirical observations, has been employed in certain cases.

  6. Coding for surgical audit.

    PubMed

    Pettigrew, R A; van Rij, A M

    1990-05-01

    A simple system of codes for operations, diagnoses and complications, developed specifically for computerized surgical audit, is described. This arose following a review of our established surgical audit in which problems in the retrieval of data from the database were identified. Evaluation of current methods of classification of surgical data highlighted the need for a dedicated coding system that was suitable for classifying surgical audit data, enabling rapid retrieval from large databases. After 2 years of use, the coding system has been found to fulfil the criteria of being sufficiently flexible and specific for computerized surgical audit, yet simple enough for medical staff to use.

  7. Collaborative Comparison of High-Energy-Density Physics Codes

    NASA Astrophysics Data System (ADS)

    Fryxell, Bruce Alan; Fatenejad, M.; Lamb, D.; Grazianni, C.; Myra, E.; Fryer, C.; Wohlbier, J.

    2012-05-01

    Performing radiation-hydrodynamic simulations is vital to the understanding of laboratory astrophysics experiments. A number of codes have been developed for this purpose. A collaboration has begun to compare several of these codes, including CRASH (University of Michigan), FLASH (University of Chicago), RAGE and CASSIO (LANL), and HYDRA (LLNL). We are in the process of testing these codes on a wide variety of problems, ranging from very simple tests to full laboratory astrophysics experiments. The algorithms and physics models differ significantly between these codes, so complete agreement is not expected, especially on the full-experiment simulations. The goal is to understand the differences between the codes and how these differences influence the results. We intend to determine which codes contain the most accurate algorithms and physics models and, where possible, to improve the other codes to produce more faithful representations of the experiments. The first suite of tests consists of simple temperature relaxation problems in an infinite, uniform medium. The second suite of tests was designed to test the diffusion solvers (both conduction and radiation) in the codes. Following this, tests will be performed that include hydrodynamic effects. Results of these comparisons will be presented. The eventual goal is to compare the results from all of the codes on simulations of radiative shock experiments being performed by the Center for Radiative Shock Hydrodynamics at the University of Michigan and to understand any discrepancies between the results of the simulations and the experiments. This research was supported by the DOE NNSA/ASC under the Predictive Science Academic Alliance Program by grant number DEFC52-08NA28616.

  8. SASSYS LMFBR systems code

    SciTech Connect

    Dunn, F.E.; Prohammer, F.G.; Weber, D.P.

    1983-01-01

    The SASSYS LMFBR systems analysis code is being developed mainly to analyze the behavior of the shut-down heat-removal system and the consequences of failures in the system, although it is also capable of analyzing a wide range of transients, from mild operational transients through more severe transients leading to sodium boiling in the core and possible melting of clad and fuel. The code includes a detailed SAS4A multi-channel core treatment plus a general thermal-hydraulic treatment of the primary and intermediate heat-transport loops and the steam generators. The code can handle any LMFBR design, loop or pool, with an arbitrary arrangement of components. The code is fast running: usually faster than real time.

  9. Code Disentanglement: Initial Plan

    SciTech Connect

    Wohlbier, John Greaton; Kelley, Timothy M.; Rockefeller, Gabriel M.; Calef, Matthew Thomas

    2015-01-27

    The first step to making more ambitious changes in the EAP code base is to disentangle the code into a set of independent, levelized packages. We define a package as a collection of code, most often across a set of files, that provides a defined set of functionality; a package a) can be built and tested as an entity and b) fits within an overall levelization design. Each package contributes one or more libraries, or an application that uses the other libraries. A package set is levelized if the relationships between packages form a directed, acyclic graph and each package uses only packages at lower levels of the diagram (in Fortran this relationship is often describable by the use relationship between modules). Independent packages permit independent, and therefore parallel, development. The packages form separable units for the purposes of development and testing. This is a proven path for enabling finer-grained changes to a complex code.
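
    The levelization criterion described above (package relationships forming a directed acyclic graph, each package using only lower-level packages) can be checked mechanically. A minimal sketch with invented package names standing in for real EAP packages, assigning each package a level via the standard peel-off (Kahn-style) algorithm:

```python
def levelize(deps):
    """Assign each package a level so that it uses only lower-level
    packages; raises ValueError if the 'use' graph has a cycle.
    deps maps package name -> set of packages it uses."""
    level = {}
    remaining = {p: set(d) for p, d in deps.items()}
    while remaining:
        # Packages whose dependencies have all been assigned a level.
        ready = [p for p, d in remaining.items() if not d]
        if not ready:
            raise ValueError("dependency cycle: package set is not levelizable")
        for p in ready:
            level[p] = max((level[q] + 1 for q in deps[p]), default=0)
            del remaining[p]
        for d in remaining.values():
            d.difference_update(ready)
    return level

# Hypothetical package set: 'driver' uses 'hydro' and 'eos'; 'hydro' uses 'eos'.
print(levelize({"driver": {"hydro", "eos"}, "hydro": {"eos"}, "eos": set()}))
# → {'eos': 0, 'hydro': 1, 'driver': 2}
```

    A cyclic "use" relationship is exactly what the exception reports, which is the practical signal that two packages must be merged or an interface extracted.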

  10. Critical Care Coding for Neurologists.

    PubMed

    Nuwer, Marc R; Vespa, Paul M

    2015-10-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  11. Space Radiation

    NASA Technical Reports Server (NTRS)

    Wu, Honglu

    2006-01-01

    Astronauts receive the highest occupational radiation exposure. Effective protections are needed to ensure the safety of astronauts on long duration space missions. Increased cancer morbidity or mortality risk in astronauts may be caused by occupational radiation exposure. Acute and late radiation damage to the central nervous system (CNS) may lead to changes in motor function and behavior, or neurological disorders. Radiation exposure may result in degenerative tissue diseases (non-cancer or non-CNS) such as cardiac, circulatory, or digestive diseases, as well as cataracts. Acute radiation syndromes may occur due to occupational radiation exposure.

  12. Seals Flow Code Development

    NASA Technical Reports Server (NTRS)

    1991-01-01

    In recognition of a deficiency in the current modeling capability for seals, an effort was established by NASA to develop verified computational fluid dynamic concepts, codes, and analyses for seals. The objectives were to develop advanced concepts for the design and analysis of seals, to effectively disseminate the information to potential users by way of annual workshops, and to provide experimental verification for the models and codes under a wide range of operating conditions.

  13. Modular optimization code package: MOZAIK

    NASA Astrophysics Data System (ADS)

    Bekar, Kursat B.

    This dissertation addresses the development of a modular optimization code package, MOZAIK, for geometric shape optimization problems in nuclear engineering applications. MOZAIK's first mission, determining the optimal shape of the D2O moderator tank for the current and new beam tube configurations for the Penn State Breazeale Reactor's (PSBR) beam port facility, is used to demonstrate its capabilities and test its performance. MOZAIK was designed as a modular optimization sequence including three primary independent modules: the initializer, the physics and the optimizer, each having a specific task. By using fixed interface blocks among the modules, the code attains its two most important characteristics: generic form and modularity. The benefit of this modular structure is that the contents of the modules can be switched depending on the requirements of accuracy, computational efficiency, or compatibility with the other modules. Oak Ridge National Laboratory's discrete ordinates transport code TORT was selected as the transport solver in the physics module of MOZAIK, and two different optimizers, Min-max and Genetic Algorithms (GA), were implemented in the optimizer module of the code package. A distributed memory parallelism was also applied to MOZAIK via MPI (Message Passing Interface) to execute the physics module concurrently on a number of processors for various states in the same search. Moreover, dynamic scheduling was enabled to enhance load balance among the processors while running MOZAIK's physics module thus improving the parallel speedup and efficiency. In this way, the total computation time consumed by the physics module is reduced by a factor close to M, where M is the number of processors. This capability also encourages the use of MOZAIK for shape optimization problems in nuclear applications because many traditional codes related to radiation transport do not have parallel execution capability. A set of computational models based on the

  14. Radiation Transport Tools for Space Applications: A Review

    NASA Technical Reports Server (NTRS)

    Jun, Insoo; Evans, Robin; Cherng, Michael; Kang, Shawn

    2008-01-01

    This slide presentation contains a brief discussion of nuclear transport codes widely used in the space radiation community for shielding and scientific analyses. Seven radiation transport codes are addressed, and the two general solution methods (the Monte Carlo method and the deterministic method) are briefly reviewed.

  16. Robust Nonlinear Neural Codes

    NASA Astrophysics Data System (ADS)

    Yang, Qianli; Pitkow, Xaq

    2015-03-01

    Most interesting natural sensory stimuli are encoded in the brain in a form that can only be decoded nonlinearly. But despite being a core function of the brain, nonlinear population codes are rarely studied and poorly understood. Interestingly, the few existing models of nonlinear codes are inconsistent with known architectural features of the brain. In particular, these codes have information content that scales with the size of the cortical population, even if that violates the data processing inequality by exceeding the amount of information entering the sensory system. Here we provide a valid theory of nonlinear population codes by generalizing recent work on information-limiting correlations in linear population codes. Although these generalized, nonlinear information-limiting correlations bound the performance of any decoder, they also make decoding more robust to suboptimal computation, allowing many suboptimal decoders to achieve nearly the same efficiency as an optimal decoder. Although these correlations are extremely difficult to measure directly, particularly for nonlinear codes, we provide a simple, practical test by which one can use choice-related activity in small populations of neurons to determine whether decoding is suboptimal or optimal and limited by correlated noise. We conclude by describing an example computation in the vestibular system where this theory applies. QY and XP were supported by a grant from the McNair Foundation.

  17. Scalable motion vector coding

    NASA Astrophysics Data System (ADS)

    Barbarien, Joeri; Munteanu, Adrian; Verdicchio, Fabio; Andreopoulos, Yiannis; Cornelis, Jan P.; Schelkens, Peter

    2004-11-01

    Modern video coding applications require transmission of video data over variable-bandwidth channels to a variety of terminals with different screen resolutions and available computational power. Scalable video coding is needed to optimally support these applications. Recently proposed wavelet-based video codecs employing spatial domain motion compensated temporal filtering (SDMCTF) provide quality, resolution and frame-rate scalability while delivering compression performance comparable to that of the state-of-the-art non-scalable H.264-codec. These codecs require scalable coding of the motion vectors in order to support a large range of bit-rates with optimal compression efficiency. Scalable motion vector coding algorithms based on the integer wavelet transform followed by embedded coding of the wavelet coefficients were recently proposed. In this paper, a new and fundamentally different scalable motion vector codec (MVC) using median-based motion vector prediction is proposed. Extensive experimental results demonstrate that the proposed MVC systematically outperforms the wavelet-based state-of-the-art solutions. To be able to take advantage of the proposed scalable MVC, a rate allocation mechanism capable of optimally dividing the available rate among texture and motion information is required. Two rate allocation strategies are proposed and compared. The proposed MVC and rate allocation schemes are incorporated into an SDMCTF-based video codec and the benefits of scalable motion vector coding are experimentally demonstrated.
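
    As an illustration of the median-based prediction idea (a generic sketch, not the authors' exact codec), the predictor for a block's motion vector can be taken as the component-wise median of three causal neighbors, and only the residual against that predictor is entropy-coded:

```python
def median_mv(left, top, topright):
    """Component-wise median of the three causal neighbor motion vectors."""
    med = lambda a, b, c: sorted((a, b, c))[1]
    return (med(left[0], top[0], topright[0]),
            med(left[1], top[1], topright[1]))

def mv_residual(current, left, top, topright):
    # Only the residual is coded; a good predictor concentrates
    # residuals near zero, which is what makes them cheap to encode.
    px, py = median_mv(left, top, topright)
    return (current[0] - px, current[1] - py)

print(median_mv((3, -1), (4, 0), (2, 2)))            # → (3, 0)
print(mv_residual((4, 1), (3, -1), (4, 0), (2, 2)))  # → (1, 1)
```

    The median is robust to a single outlier neighbor, which is why it usually beats a plain average for motion fields with object boundaries.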

  18. Radiation Therapy: Professions in Radiation Therapy

    MedlinePlus

    Professions in radiation therapy include the radiation oncologist, therapeutic medical physicist, radiation therapist, dosimetrist, radiation oncology nurse, social worker, and dietitian. Radiation oncologists are physicians who oversee the ...

  19. Radiation shielding quality assurance

    NASA Astrophysics Data System (ADS)

    Um, Dallsun

    For radiation shielding quality assurance, the validity and reliability of the neutron transport code MCNP, now one of the most widely used radiation shielding analysis codes, were checked against a number of benchmark experiments. As a practical example, the following work was performed in this thesis. An integral neutron transport experiment to measure the effect of neutron streaming in iron and void was performed with the Dog-Legged Void Assembly at Knolls Atomic Power Laboratory in 1991. Neutron flux was measured at six different places with methane detectors and a BF-3 detector. The main purpose of the measurements was to provide a benchmark against which various neutron transport calculation tools could be compared. Those data were used to verify the Monte Carlo Neutron & Photon Transport Code, MCNP, with a model of the assembly. Experimental and calculated results were compared in two ways: as the total integrated value of neutron flux over the energy range from 10 keV to 2 MeV, and as the neutron spectrum across that energy range. The two sets of results agree within the +/-20% statistical error. MCNP results were also compared with those of TORT, a three-dimensional discrete ordinates code developed by Oak Ridge National Laboratory. The MCNP results are superior to the TORT results at all detector locations except one. MCNP is thus demonstrated to be a very powerful tool for the analysis of neutron transport through iron and air, and it could further be used as a powerful tool for radiation shielding analysis. As an application of the analysis of variance (ANOVA) to neutron and gamma transport problems, uncertainties in the calculated values of the criticality constant k were evaluated by ANOVA on the statistical data.

  20. Code CUGEL: A code to unfold Ge(Li) spectrometer polyenergetic gamma photon experimental distributions

    NASA Technical Reports Server (NTRS)

    Steyn, J. J.; Born, U.

    1970-01-01

    A FORTRAN code was developed for the Univac 1108 digital computer to unfold polyenergetic gamma-photon experimental distributions from lithium-drifted germanium (Ge(Li)) semiconductor spectrometers. It was designed to analyze the combined continuous and monoenergetic gamma radiation field of radioisotope volumetric sources. The code generates the detector system response matrix function and applies it to monoenergetic spectral components discretely and to the continuum iteratively. It corrects for system drift, source decay, background, and detection efficiency. Results are presented in digital form for differential and integrated photon number and energy distributions, and for exposure dose.
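
    The iterative treatment of the continuum is a form of response-matrix unfolding. The sketch below is a generic van Cittert-style iteration on an invented 2-channel response matrix, not CUGEL's actual matrix or algorithm: the estimate is repeatedly corrected by the residual between the measured spectrum and the folded estimate.

```python
def unfold(R, y, iters=200):
    """Van Cittert-style iterative unfolding: repeatedly add the residual
    between the measured spectrum y and the folded current estimate."""
    n = len(y)
    x = list(y)  # start from the measured distribution
    for _ in range(iters):
        folded = [sum(R[i][j] * x[j] for j in range(n)) for i in range(n)]
        x = [x[i] + (y[i] - folded[i]) for i in range(n)]
    return x

# Invented 2-channel detector response (diagonal: full-energy efficiency;
# off-diagonal: cross-talk between channels).
R = [[0.9, 0.0],
     [0.3, 0.8]]
true = [100.0, 50.0]
measured = [sum(R[i][j] * true[j] for j in range(2)) for i in range(2)]
print([round(v, 6) for v in unfold(R, measured)])  # → [100.0, 50.0]
```

    Convergence requires the response matrix to be close enough to the identity (spectral radius of I - R below one), which holds for the well-conditioned toy matrix above.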

  1. Radiation safety.

    PubMed

    Skinner, Sarah

    2013-06-01

    Diagnostic radiology procedures, such as computed tomography (CT) and X-ray, are an increasing source of ionising radiation exposure to our community. Exposure to ionising radiation is associated with increased risk of malignancy, proportional to the level of exposure. Every diagnostic test using ionising radiation needs to be justified by clinical need. General practitioners need a working knowledge of radiation safety so they can adequately inform their patients of the risks and benefits of diagnostic imaging procedures.

  2. 76 FR 4258 - Occupational Radiation Protection; Revision

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-25

    ... Part 835 RIN 1901-AA-95 Occupational Radiation Protection; Revision AGENCY: Department of Energy...) proposes to revise the values in an appendix to its Occupational Radiation Protection requirements. The... requirements in title 10, Code of Federal Regulations, part 835 (10 CFR part 835), Occupational...

  3. The fast non-LTE code DEDALE

    NASA Astrophysics Data System (ADS)

    Gilleron, Franck; Piron, Robin

    2015-12-01

    We present Dédale, a fast code implementing a simplified non-local-thermodynamic-equilibrium (NLTE) plasma model. In this approach, the stationary collisional-radiative rates equations are solved for a set of well-chosen Layzer complexes in order to determine the ion state populations. The electronic structure is approximated using the screened hydrogenic model (SHM) of More with relativistic corrections. The radiative and collisional cross-sections are based on Kramers and Van Regemorter formula, respectively, which are extrapolated to derive analytical expressions for all the rates. The latter are improved thereafter using Gaunt factors or more accurate tabulated data. Special care is taken for dielectronic rates which are compared and rescaled with quantum calculations from the Averroès code. The emissivity and opacity spectra are calculated under the same assumptions as for the radiative rates, either in a detailed manner by summing the transitions between each pair of complexes, or in a coarser statistical way by summing the one-electron transitions averaged over the complexes. Optionally, nℓ-splitting can be accounted for using a WKB approach in an approximate potential reconstructed analytically from the screened charges. It is also possible to improve the spectra by replacing some transition arrays with more accurate data tabulated using the SCO-RCG or FAC codes. This latter option is particularly useful for K-shell emission spectroscopy. The Dédale code was used to submit neon and tungsten cases in the last NLTE-8 workshop (Santa Fe, November 4-8, 2013). Some of these results are presented, as well as comparisons with Averroès calculations.
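
    Once the rates are fixed, the stationary collisional-radiative rate equations reduce to a linear solve: the populations n satisfy A n = 0 together with a normalization condition. A toy illustration for a small level system follows; the rates are invented for the example and are not Dédale's.

```python
def steady_state(rates):
    """Stationary populations of a small level system: solve A n = 0 with
    sum(n) = 1, where rates[i][j] is the total rate for i -> j (s^-1)."""
    n = len(rates)
    # dn_k/dt = sum_j n_j * rates[j][k]  -  n_k * sum_j rates[k][j]
    A = [[rates[j][k] if j != k else -sum(rates[k]) for j in range(n)]
         for k in range(n)]
    # Replace the last (redundant) equation by the normalization sum(n) = 1.
    A[-1] = [1.0] * n
    b = [0.0] * (n - 1) + [1.0]
    # Plain Gaussian elimination with partial pivoting.
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(A[r][c] * x[c] for c in range(r + 1, n))
        x[r] = (b[r] - s) / A[r][r]
    return x

# Toy 2-level system: excitation rate 1.0 s^-1, de-excitation rate 3.0 s^-1.
print(steady_state([[0.0, 1.0], [3.0, 0.0]]))  # → [0.75, 0.25]
```

    In a real NLTE model the rate matrix couples many Layzer complexes and the rates themselves depend on temperature and density, but the stationary solve has this same structure.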

  4. ENZO: AN ADAPTIVE MESH REFINEMENT CODE FOR ASTROPHYSICS

    SciTech Connect

    Bryan, Greg L.; Turk, Matthew J.; Norman, Michael L.; Bordner, James; Xu, Hao; Kritsuk, Alexei G.; O'Shea, Brian W.; Smith, Britton; Abel, Tom; Wang, Peng; Skillman, Samuel W.; Wise, John H.; Reynolds, Daniel R.; Collins, David C.; Harkness, Robert P.; Kim, Ji-hoon; Kuhlen, Michael; Goldbaum, Nathan; Hummels, Cameron; Collaboration: Enzo Collaboration; and others

    2014-04-01

    This paper describes the open-source code Enzo, which uses block-structured adaptive mesh refinement to provide high spatial and temporal resolution for modeling astrophysical fluid flows. The code is Cartesian, can be run in one, two, and three dimensions, and supports a wide variety of physics including hydrodynamics, ideal and non-ideal magnetohydrodynamics, N-body dynamics (and, more broadly, self-gravity of fluids and particles), primordial gas chemistry, optically thin radiative cooling of primordial and metal-enriched plasmas (as well as some optically-thick cooling models), radiation transport, cosmological expansion, and models for star formation and feedback in a cosmological context. In addition to explaining the algorithms implemented, we present solutions for a wide range of test problems, demonstrate the code's parallel performance, and discuss the Enzo collaboration's code development methodology.

  5. Radiation Exposure

    MedlinePlus

    Radiation is energy that travels in the form of waves or high-speed particles. It occurs naturally in sunlight. Man-made radiation is used in X-rays, nuclear weapons, nuclear power plants and cancer treatment. If you are exposed to small amounts of radiation over a ...

  6. Enhanced radiation belts and systems implications workshop

    NASA Astrophysics Data System (ADS)

    Crain, C. M.

    1983-03-01

    The workshop aimed to determine the degree of understanding of the effects on space systems produced by enhancement of the natural radiation belts, to identify areas where additional understanding is needed, and to provide suggestions for further research. Topics relevant to enhanced radiation belts and their potential effects on the architecture of enduring space systems were discussed, including injection of trapped radiation from fission debris, loss mechanisms and lifetimes, SPECTER codes for predicting total radiation flux, mission considerations of trapped radiation, hardware vulnerability and hardening, single-event phenomena, and planning for a chemical release satellite.

  7. Prioritized LT Codes

    NASA Technical Reports Server (NTRS)

    Woo, Simon S.; Cheng, Michael K.

    2011-01-01

    The original Luby Transform (LT) coding scheme is extended to account for data transmissions where some information symbols in a message block are more important than others. Prioritized LT codes provide unequal error protection (UEP) of data on an erasure channel by modifying the original LT encoder. The prioritized algorithm improves high-priority data protection without penalizing low-priority data recovery. Moreover, low-latency decoding is also obtained for high-priority data due to fast encoding. Prioritized LT codes only require a slight change in the original encoding algorithm, and no changes at all at the decoder. Hence, with a small complexity increase in the LT encoder, an improved UEP and low-decoding latency performance for high-priority data can be achieved. LT encoding partitions a data stream into fixed-sized message blocks each with a constant number of information symbols. To generate a code symbol from the information symbols in a message, the Robust-Soliton probability distribution is first applied in order to determine the number of information symbols to be used to compute the code symbol. Then, the specific information symbols are chosen uniform randomly from the message block. Finally, the selected information symbols are XORed to form the code symbol. The Prioritized LT code construction includes an additional restriction that code symbols formed by a relatively small number of XORed information symbols select some of these information symbols from the pool of high-priority data. Once high-priority data are fully covered, encoding continues with the conventional LT approach where code symbols are generated by selecting information symbols from the entire message block including all different priorities. Therefore, if code symbols derived from high-priority data experience an unusual high number of erasures, Prioritized LT codes can still reliably recover both high- and low-priority data. 
This hybrid approach decides not only "how to encode
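
    The encoding steps described above can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: for brevity it samples degrees from the simpler Ideal Soliton distribution (the scheme uses the Robust Soliton, which would only change the weights), and it applies the prioritized restriction that low-degree code symbols draw from the still-uncovered high-priority pool.

```python
import random
from functools import reduce

def sample_degree(k, rng):
    # Ideal Soliton degree distribution over 1..k (stand-in for the
    # Robust Soliton used by the actual scheme).
    weights = [1.0 / k] + [1.0 / (d * (d - 1)) for d in range(2, k + 1)]
    return rng.choices(range(1, k + 1), weights=weights)[0]

def prioritized_lt_symbol(message, high, covered, rng):
    """One code symbol: sample a degree d, choose d info symbols, XOR them.
    While high-priority symbols remain uncovered, low-degree code symbols
    are drawn from the high-priority pool only."""
    k = len(message)
    d = sample_degree(k, rng)
    uncovered_high = high - covered
    if uncovered_high and d <= len(uncovered_high):
        pool = sorted(uncovered_high)       # prioritized restriction
    else:
        pool = list(range(k))               # conventional LT selection
    idx = rng.sample(pool, min(d, len(pool)))
    covered.update(i for i in idx if i in high)
    return idx, reduce(lambda a, b: a ^ b, (message[i] for i in idx))

rng = random.Random(7)
message = [0x11, 0x22, 0x33, 0x44, 0x55, 0x66]  # fixed-size message block
high, covered = {0, 1}, set()                   # indices of high-priority data
for _ in range(4):
    idx, sym = prioritized_lt_symbol(message, high, covered, rng)
    print(sorted(idx), hex(sym))
```

    As the abstract notes, the decoder is unchanged: a standard LT peeling decoder works on these symbols, since only the encoder's symbol-selection rule differs.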

  8. Comparison of synchrotron radiation calculation between EGS4, FLUKA, PHOTON and STAC8

    SciTech Connect

    Liu, James C

    2002-09-25

    Doses due to scattered synchrotron radiation were calculated, for the cases of a thin shield or no shield, both with and without considering the linear polarization effect, using shielding design codes for synchrotron radiation beamlines (STAC8 and PHOTON) and Monte Carlo simulation codes (EGS4 and FLUKA). The comparison shows reasonable agreement between the codes.

  9. Full 3D visualization tool-kit for Monte Carlo and deterministic transport codes

    SciTech Connect

    Frambati, S.; Frignani, M.

    2012-07-01

    We propose a package of tools capable of translating the geometric inputs and outputs of many Monte Carlo and deterministic radiation transport codes into open source file formats. These tools are aimed at bridging the gap between trusted, widely-used radiation analysis codes and very powerful, more recent and commonly used visualization software, thus supporting the design process and helping with shielding optimization. Three main lines of development were followed: mesh-based analysis of Monte Carlo codes, mesh-based analysis of deterministic codes and Monte Carlo surface meshing. The developed kit is considered a powerful and cost-effective tool in the computer-aided design for radiation transport code users of the nuclear world, and in particular in the fields of core design and radiation analysis. (authors)

  10. Coded source neutron imaging

    SciTech Connect

    Bingham, Philip R; Santos-Villalobos, Hector J

    2011-01-01

    Coded aperture techniques have been applied to neutron radiography to address limitations in neutron flux and resolution of neutron detectors in a system labeled coded source imaging (CSI). By coding the neutron source, a magnified imaging system is designed with small spot size aperture holes (10 and 100 μm) for improved resolution beyond the detector limits and with many holes in the aperture (50% open) to account for flux losses due to the small pinhole size. An introduction to neutron radiography and coded aperture imaging is presented. A system design is developed for a CSI system with a development of equations for limitations on the system based on the coded image requirements and the neutron source characteristics of size and divergence. Simulation has been applied to the design using McStas to provide qualitative measures of performance with simulations of pinhole array objects followed by a quantitative measure through simulation of a tilted edge and calculation of the modulation transfer function (MTF) from the line spread function. MTF results for both 100 μm and 10 μm aperture hole diameters show resolutions matching the hole diameters.
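
    The MTF calculation mentioned above, obtaining the modulation transfer function from the line spread function, can be sketched generically: the MTF is the magnitude of the Fourier transform of the LSF, normalized to its zero-frequency value. The edge-spread samples below are invented for illustration; the LSF is taken as the discrete derivative of the edge response.

```python
import math

def mtf_from_lsf(lsf):
    """Modulation transfer function: magnitude of the DFT of the line
    spread function, normalized to its zero-frequency (DC) value."""
    n = len(lsf)
    mags = []
    for k in range(n // 2 + 1):
        re = sum(v * math.cos(2 * math.pi * k * i / n) for i, v in enumerate(lsf))
        im = sum(v * math.sin(2 * math.pi * k * i / n) for i, v in enumerate(lsf))
        mags.append(math.hypot(re, im))
    return [m / mags[0] for m in mags]

# Invented edge-spread samples from a tilted-edge measurement; the LSF is
# the discrete derivative of the edge response.
esf = [0.0, 0.0, 0.05, 0.25, 0.60, 0.90, 1.0, 1.0]
lsf = [b - a for a, b in zip(esf, esf[1:])]
print([round(m, 3) for m in mtf_from_lsf(lsf)])
```

    For a nonnegative LSF the DC value bounds every other frequency component, so the curve starts at 1 and stays in [0, 1]; the spatial frequency at which it drops below a chosen threshold is the usual resolution figure.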

  11. Error coding simulations

    NASA Technical Reports Server (NTRS)

    Noble, Viveca K.

    1993-01-01

    There are various elements such as radio frequency interference (RFI) which may induce errors in data being transmitted via a satellite communication link. When a transmission is affected by interference or other error-causing elements, the transmitted data becomes indecipherable. It becomes necessary to implement techniques to recover from these disturbances. The objective of this research is to develop software which simulates error control circuits and evaluate the performance of these modules in various bit error rate environments. The results of the evaluation provide the engineer with information which helps determine the optimal error control scheme. The Consultative Committee for Space Data Systems (CCSDS) recommends the use of Reed-Solomon (RS) and convolutional encoders and Viterbi and RS decoders for error correction. The use of forward error correction techniques greatly reduces the received signal-to-noise ratio needed for a certain desired bit error rate. The use of concatenated coding, e.g. inner convolutional code and outer RS code, provides even greater coding gain. The 16-bit cyclic redundancy check (CRC) code is recommended by CCSDS for error detection.
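
    The 16-bit CRC used for error detection can be illustrated with a bitwise CRC-16-CCITT computation (polynomial 0x1021, initial value 0xFFFF); whether these exact parameters match the CCSDS recommendation should be checked against the standard itself.

```python
def crc16_ccitt(data: bytes, crc: int = 0xFFFF) -> int:
    """Bitwise CRC-16-CCITT, polynomial 0x1021, MSB first. The receiver
    recomputes the CRC and rejects the frame on a mismatch."""
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            crc = ((crc << 1) ^ 0x1021) & 0xFFFF if crc & 0x8000 else (crc << 1) & 0xFFFF
    return crc

frame = b"123456789"
print(hex(crc16_ccitt(frame)))  # → 0x29b1 (the standard check value for this input)
print(crc16_ccitt(frame) != crc16_ccitt(b"123456780"))  # corruption detected → True
```

    Unlike the RS and convolutional codes, the CRC corrects nothing; it only flags that a frame arrived damaged so that higher layers can discard or re-request it.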

  12. Induction technology optimization code

    SciTech Connect

    Caporaso, G.J.; Brooks, A.L.; Kirbie, H.C.

    1992-08-21

    A code has been developed to evaluate relative costs of induction accelerator driver systems for relativistic klystrons. The code incorporates beam generation, transport and pulsed power system constraints to provide an integrated design tool. The code generates an injector/accelerator combination which satisfies the top level requirements and all system constraints once a small number of design choices have been specified (rise time of the injector voltage and aspect ratio of the ferrite induction cores, for example). The code calculates dimensions of accelerator mechanical assemblies and values of all electrical components. Cost factors for machined parts, raw materials and components are applied to yield a total system cost. These costs are then plotted as a function of the two design choices to enable selection of an optimum design based on various criteria. The Induction Technology Optimization Study (ITOS) was undertaken to examine viable combinations of a linear induction accelerator and a relativistic klystron (RK) for high power microwave production. It is proposed that microwaves from the RK will power a high-gradient accelerator structure for linear collider development. Previous work indicates that the RK will require a nominal 3-MeV, 3-kA electron beam with a 100-ns flat top. The proposed accelerator-RK combination will be a high average power system capable of sustained microwave output at a 300-Hz pulse repetition frequency. The ITOS code models many combinations of injector, accelerator, and pulse power designs that will supply an RK with the beam parameters described above.

  13. Coded source neutron imaging

    NASA Astrophysics Data System (ADS)

    Bingham, Philip; Santos-Villalobos, Hector; Tobin, Ken

    2011-03-01

    Coded aperture techniques have been applied to neutron radiography to address limitations in neutron flux and resolution of neutron detectors in a system labeled coded source imaging (CSI). By coding the neutron source, a magnified imaging system is designed with small spot size aperture holes (10 and 100μm) for improved resolution beyond the detector limits and with many holes in the aperture (50% open) to account for flux losses due to the small pinhole size. An introduction to neutron radiography and coded aperture imaging is presented. A system design is developed for a CSI system with a development of equations for limitations on the system based on the coded image requirements and the neutron source characteristics of size and divergence. Simulation has been applied to the design using McStas to provide qualitative measures of performance with simulations of pinhole array objects followed by a quantitative measure through simulation of a tilted edge and calculation of the modulation transfer function (MTF) from the line spread function. MTF results for both 100μm and 10μm aperture hole diameters show resolutions matching the hole diameters.

  14. HOTSPOT Health Physics codes for the PC

    SciTech Connect

    Homann, S.G.

    1994-03-01

    The HOTSPOT Health Physics codes were created to provide Health Physics personnel with a fast, field-portable calculation tool for evaluating accidents involving radioactive materials. HOTSPOT codes are a first-order approximation of the radiation effects associated with the atmospheric release of radioactive materials. HOTSPOT programs are reasonably accurate for a timely initial assessment. More importantly, HOTSPOT codes produce a consistent output for the same input assumptions and minimize the probability of errors associated with reading a graph incorrectly or scaling a universal nomogram during an emergency. The HOTSPOT codes are designed for short-term (less than 24 hours) release durations. Users requiring radiological release consequences for release scenarios over a longer time period, e.g., annual windrose data, are directed to such long-term models as CAPP88-PC (Parks, 1992). Users requiring more sophisticated modeling capabilities, e.g., complex terrain; multi-location real-time wind field data; etc., are directed to such capabilities as the Department of Energy's ARAC computer codes (Sullivan, 1993). Four general programs -- Plume, Explosion, Fire, and Resuspension -- calculate a downwind assessment following the release of radioactive material resulting from a continuous or puff release, explosive release, fuel fire, or an area contamination event. Other programs deal with the release of plutonium, uranium, and tritium to expedite an initial assessment of accidents involving nuclear weapons. Additional programs estimate the dose commitment from the inhalation of any one of the radionuclides listed in the database of radionuclides; calibrate a radiation survey instrument for ground-survey measurements; and screen plutonium uptake in the lung (see FIDLER Calibration and LUNG Screening sections).
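
    First-order downwind assessments of this kind are conventionally built on the Gaussian plume model. A minimal sketch of the standard ground-level centerline formula with total ground reflection; the power-law dispersion coefficients below are illustrative assumptions, not HOTSPOT's tabulated stability-class values.

```python
import math

def plume_centerline(Q, u, x, H, ay=0.08, by=0.90, az=0.06, bz=0.85):
    """Ground-level centerline concentration (per m^3) for a continuous
    release: C = Q / (pi * sigma_y * sigma_z * u) * exp(-H^2 / (2 sigma_z^2)),
    which includes total reflection at the ground. Q is the release rate,
    u the wind speed (m/s), x the downwind distance (m), H the effective
    release height (m)."""
    sigma_y = ay * x ** by   # horizontal dispersion (m), assumed power law
    sigma_z = az * x ** bz   # vertical dispersion (m), assumed power law
    return Q / (math.pi * sigma_y * sigma_z * u) * math.exp(-H * H / (2.0 * sigma_z ** 2))

# 1 GBq/s release at 30 m effective height in a 3 m/s wind: for an elevated
# release the ground-level concentration peaks some distance downwind.
for x in (500.0, 1000.0, 2000.0):
    print(f"{x:6.0f} m : {plume_centerline(1e9, 3.0, x, 30.0):.3e}")
```

    Multiplying the concentration by a breathing rate and a dose-per-unit-intake factor for the released radionuclide turns this into the kind of inhalation dose estimate the codes report.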

  15. Radiation Proctopathy

    PubMed Central

    Grodsky, Marc B.; Sidani, Shafik M.

    2015-01-01

    Radiation therapy is a widely utilized treatment modality for pelvic malignancies, including prostate cancer, rectal cancer, and cervical cancer. Given its fixed position in the pelvis, the rectum is at high risk for injury secondary to ionizing radiation. Despite advances in radiation science, up to 75% of patients will suffer from acute radiation proctitis and up to 20% may experience chronic symptoms. Symptoms are variable and include diarrhea, bleeding, incontinence, and fistulization. A multitude of treatment options exist. This article summarizes the latest knowledge relating to radiation proctopathy, focusing on the vast array of treatment options. PMID:26034407

  16. Importance biasing scheme implemented in the PRIZMA code

    SciTech Connect

    Kandiev, I.Z.; Malyshkin, G.N.

    1997-12-31

    The PRIZMA code is intended for Monte Carlo calculations of linear radiation transport problems. The code has extensive capabilities for describing geometry, sources, and material composition, and for obtaining parameters specified by the user. It can track full particle cascades (including neutrons, photons, electrons, positrons, and heavy charged particles), taking possible transmutations into account. An importance biasing scheme was implemented to solve problems that require calculating functionals related to small probabilities (for example, radiation shielding and detection problems). The scheme allows the trajectory-building algorithm to be adapted to the peculiarities of the problem.
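
    Importance biasing in Monte Carlo transport is usually realized by splitting and Russian roulette at importance boundaries: particles entering a more important region are split into lighter copies, while those entering a less important region are rouletted, with statistical weights adjusted so the estimator stays unbiased. The sketch below illustrates that generic mechanism; it is not PRIZMA's actual scheme, and the function and representation (a particle as a bare weight) are assumptions for illustration.

```python
import random

def adjust_population(particles, importance_ratio):
    """Split or roulette particle weights at an importance boundary.

    importance_ratio = I_new / I_old. Ratios > 1 split the particle
    into copies of reduced weight; ratios < 1 play Russian roulette,
    boosting the weight of survivors. In both cases the expected
    total weight is preserved, which keeps the estimate unbiased.
    """
    survivors = []
    for w in particles:              # each particle is just its weight
        r = importance_ratio
        if r >= 1.0:
            n = int(r)
            frac = r - n             # split into n or n+1 copies so the
            copies = n + (1 if random.random() < frac else 0)
            survivors.extend([w / r] * copies)   # mean count equals r
        elif random.random() < r:    # survive roulette with prob. r ...
            survivors.append(w / r)  # ... carrying proportionally more weight
    return survivors
```

    This is the sense in which a biasing scheme "adapts" trajectory building: the particle population is concentrated where it contributes most to the small-probability functional, without changing the expected answer.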

  17. Seals Code Development Workshop

    NASA Technical Reports Server (NTRS)

    Hendricks, Robert C. (Compiler); Liang, Anita D. (Compiler)

    1996-01-01

    The 1995 Seals Workshop industrial code (INDSEAL) release includes ICYL, GCYLT, IFACE, GFACE, SPIRALG, SPIRALI, DYSEAL, and KTK. The scientific code (SCISEAL) release includes conjugate heat transfer and multidomain capability with rotordynamics. Several seals and bearings codes (e.g., HYDROFLEX, HYDROTRAN, HYDROB3D, FLOWCON1, FLOWCON2) are presented and their results compared. Current computational and experimental emphasis includes multiple connected cavity flows, with the goals of reducing parasitic losses and gas ingestion. Labyrinth seals continue to play a significant role in sealing, with face, honeycomb, and new sealing concepts under investigation for advanced engine concepts in view of strict environmental constraints. The clean-sheet approach to engine design is advocated, with program directions and anticipated percentage SFC reductions cited. Future activities center on engine applications with coupled seal/power/secondary flow streams.

  18. SAC: Sheffield Advanced Code

    NASA Astrophysics Data System (ADS)

    Griffiths, Mike; Fedun, Viktor; Mumford, Stuart; Gent, Frederick

    2013-06-01

    The Sheffield Advanced Code (SAC) is a fully non-linear MHD code designed for simulations of linear and non-linear wave propagation in gravitationally strongly stratified magnetized plasma. It was developed primarily for the forward modelling of helioseismological processes and for the coupling processes in the solar interior, photosphere, and corona; it is built on the well-known VAC platform that allows robust simulation of the macroscopic processes in gravitationally stratified (non-)magnetized plasmas. The code has no limitations of simulation length in time imposed by complications originating from the upper boundary, nor does it require implementation of special procedures to treat the upper boundaries. SAC inherited its modular structure from VAC, thereby allowing modification to easily add new physics.

  19. Code query by example

    NASA Astrophysics Data System (ADS)

    Vaucouleur, Sebastien

    2011-02-01

    We introduce code query by example for customisation of evolvable software products in general and of enterprise resource planning systems (ERPs) in particular. The concept is based on an initial empirical study of practices around ERP systems. We motivate our design choices based on those empirical results, and we show how the proposed solution helps with respect to the infamous upgrade problem: the conflict between the need for customisation and the need for upgrade of ERP systems. We further show how code query by example can be used as a form of lightweight static analysis, to automatically detect potential defects in large software products. Code query by example as a form of lightweight static analysis is particularly interesting in the context of ERP systems: programmers working in this field are often not computer science specialists but rather domain experts, so they require a simple language in which to express custom rules.
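
    The core idea of querying code "by example" is that the query is itself a snippet of code, with some parts treated as wildcards, matched structurally against the program's syntax tree. The toy below illustrates that idea with Python's ast module (names beginning with an underscore act as wildcards); it is only a sketch of the concept, not the query language or ERP tooling described in the paper.

```python
import ast

def query_by_example(source, example):
    """Find calls in `source` whose AST shape matches `example`.

    The example is a Python expression; names starting with "_"
    are wildcards that match any sub-expression.
    """
    pattern = ast.parse(example, mode="eval").body

    def matches(node, pat):
        if isinstance(pat, ast.Name) and pat.id.startswith("_"):
            return True                      # wildcard matches anything
        if type(node) is not type(pat):
            return False
        if isinstance(pat, ast.Name):
            return node.id == pat.id         # concrete names must agree
        if isinstance(pat, ast.Call):
            return (matches(node.func, pat.func)
                    and len(node.args) == len(pat.args)
                    and all(matches(a, p)
                            for a, p in zip(node.args, pat.args)))
        return ast.dump(node) == ast.dump(pat)

    return [ast.unparse(n) for n in ast.walk(ast.parse(source))
            if isinstance(n, ast.Call) and matches(n, pattern)]
```

    Used as lightweight static analysis, a rule like "flag every `eval(_x)`" can then be written by a domain expert who knows the code idiom to forbid but not the internals of an AST-walking framework.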

  20. Autocatalysis, information and coding.

    PubMed

    Wills, P R

    2001-01-01

    Autocatalytic self-construction in macromolecular systems requires the existence of a reflexive relationship between structural components and the functional operations they perform to synthesise themselves. The possibility of reflexivity depends on formal, semiotic features of the catalytic structure-function relationship, that is, the embedding of catalytic functions in the space of polymeric structures. Reflexivity is a semiotic property of some genetic sequences. Such sequences may serve as the basis for the evolution of coding as a result of autocatalytic self-organisation in a population of assignment catalysts. Autocatalytic selection is a mechanism whereby matter becomes differentiated in primitive biochemical systems. In the case of coding self-organisation, it corresponds to the creation of symbolic information. Prions are present-day entities whose replication through autocatalysis reflects aspects of biological semiotics less obvious than genetic coding.