Sample records for rays control coding

  1. The Alba ray tracing code: ART

    NASA Astrophysics Data System (ADS)

    Nicolas, Josep; Barla, Alessandro; Juanhuix, Jordi

    2013-09-01

    The Alba ray tracing code (ART) is a suite of Matlab functions and tools for the ray tracing simulation of x-ray beamlines. The code is structured in different layers, which allow its usage as part of optimization routines as well as an easy control from a graphical user interface. Additional tools for slope error handling and for grating efficiency calculations are also included. Generic characteristics of ART include the accumulation of rays to improve statistics without memory limitations, and still providing normalized values of flux and resolution in physically meaningful units.
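    The "accumulation of rays" idea described above can be sketched in a few lines: rays are traced and binned batch by batch, so the memory footprint is set by the batch size rather than by the total ray count. This is an illustrative sketch only (ART itself is a Matlab suite); the Gaussian "trace" stands in for a real beamline model.

```python
import numpy as np

def accumulate_flux(n_rays, batch_size, bins=50, seed=0):
    """Accumulate a detector histogram over many ray batches.

    Rays are generated and binned batch by batch, so memory use is
    bounded by batch_size no matter how many rays are traced in total.
    """
    rng = np.random.default_rng(seed)
    edges = np.linspace(-3.0, 3.0, bins + 1)
    hist = np.zeros(bins)
    traced = 0
    while traced < n_rays:
        n = min(batch_size, n_rays - traced)
        # Stand-in for a real ray trace: Gaussian footprint at the detector.
        x = rng.normal(0.0, 1.0, n)
        hist += np.histogram(x, bins=edges)[0]
        traced += n
    return hist / traced          # normalized flux per traced ray

flux = accumulate_flux(100_000, batch_size=10_000)
```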

  2. The complete mitochondrial genome of the Giant Manta ray, Manta birostris.

    PubMed

    Hinojosa-Alvarez, Silvia; Díaz-Jaimes, Pindaro; Marcet-Houben, Marina; Gabaldón, Toni

    2015-01-01

    The complete mitochondrial genome of the giant manta ray (Manta birostris) consists of 18,075 bp and is A + T rich with a low G content. Gene organization and length are similar to those of other ray species. The genome comprises 13 protein-coding genes, 2 rRNA genes, 23 tRNA genes, and 1 non-coding sequence, the control region. We identified an AT tandem repeat region similar to that reported in Mobula japanica.

  3. Learning to Analyze and Code Accounting Transactions in Interactive Mode.

    ERIC Educational Resources Information Center

    Bentz, William F.; Ambler, Eric E.

    An interactive computer-assisted instructional (CAI) system, called CODE, is used to teach transactional analysis, or coding, in elementary accounting. The first major component of CODE is TEACH, a program which controls student input and output. After a statement of financial position is displayed on a cathode-ray tube, TEACH describes an event to…

  4. Neutron transport analysis for nuclear reactor design

    DOEpatents

    Vujic, Jasmina L.

    1993-01-01

    Replacing regular mesh-dependent ray tracing modules in a collision/transfer probability (CTP) code with a ray tracing module based upon combinatorial geometry of a modified geometrical module (GMC) provides a general geometry transfer theory code in two dimensions (2D) for analyzing nuclear reactor design and control. The primary modification of the GMC module involves generation of a fixed inner frame and a rotating outer frame, where the inner frame contains all reactor regions of interest, e.g., part of a reactor assembly, an assembly, or several assemblies, and the outer frame, with a set of parallel equidistant rays (lines) attached to it, rotates around the inner frame. The modified GMC module allows for determining for each parallel ray (line), the intersections with zone boundaries, the path length between the intersections, the total number of zones on a track, the zone and medium numbers, and the intersections with the outer surface, which parameters may be used in the CTP code to calculate collision/transfer probability and cross-section values.
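    The per-ray bookkeeping the patent describes, i.e. the intersections with zone boundaries and the path length between them, can be illustrated for the simplest case of concentric circular zones crossed by one of the parallel rays. The function below is a hypothetical sketch, not the GMC module itself.

```python
import math

def zone_path_lengths(b, radii):
    """Path lengths of a straight ray, at impact parameter b, through
    concentric circular zones with sorted, increasing outer radii.

    Returns one length per zone (innermost disc first, then annuli);
    zones the ray misses get length 0.
    """
    # Full chord through a circle of radius R at impact parameter b.
    chords = [2.0 * math.sqrt(R * R - b * b) if R > abs(b) else 0.0
              for R in radii]
    lengths = [chords[0]]
    for inner, outer in zip(chords, chords[1:]):
        lengths.append(outer - inner)   # annulus = outer chord - inner chord
    return lengths
```

Summing the per-zone lengths recovers the full chord through the outermost boundary, a quick consistency check on the track.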

  5. Neutron transport analysis for nuclear reactor design

    DOEpatents

    Vujic, J.L.

    1993-11-30

    Replacing regular mesh-dependent ray tracing modules in a collision/transfer probability (CTP) code with a ray tracing module based upon combinatorial geometry of a modified geometrical module (GMC) provides a general geometry transfer theory code in two dimensions (2D) for analyzing nuclear reactor design and control. The primary modification of the GMC module involves generation of a fixed inner frame and a rotating outer frame, where the inner frame contains all reactor regions of interest, e.g., part of a reactor assembly, an assembly, or several assemblies, and the outer frame, with a set of parallel equidistant rays (lines) attached to it, rotates around the inner frame. The modified GMC module allows for determining for each parallel ray (line), the intersections with zone boundaries, the path length between the intersections, the total number of zones on a track, the zone and medium numbers, and the intersections with the outer surface, which parameters may be used in the CTP code to calculate collision/transfer probability and cross-section values. 28 figures.

  6. Hydrodynamic evolution of plasma waveguides for soft-x-ray amplifiers

    NASA Astrophysics Data System (ADS)

    Oliva, Eduardo; Depresseux, Adrien; Cotelo, Manuel; Lifschitz, Agustín; Tissandier, Fabien; Gautier, Julien; Maynard, Gilles; Velarde, Pedro; Sebban, Stéphane

    2018-02-01

    High-density, collisionally pumped plasma-based soft-x-ray lasers have recently delivered pulses of a few hundred femtoseconds, breaking the longstanding one-picosecond barrier. To pump these amplifiers, an intense infrared pulse must remain focused over the full length of the amplifier, which spans several Rayleigh lengths. However, strong nonlinear effects hinder the propagation of the laser beam. A plasma waveguide overcomes these drawbacks, provided the hydrodynamic processes that dominate the creation and subsequent evolution of the waveguide are controlled and optimized. In this paper we present experimental measurements of the radial density profile and transmittance of such a waveguide and compare them with numerical calculations using hydrodynamic and particle-in-cell codes. Controlling the properties of the waveguide (electron density and its radial gradient) with the help of numerical codes promises the delivery of ultrashort (tens of femtoseconds), coherent soft-x-ray pulses.

  7. [Digital acoustic burglar alarm system using infrared radio remote control].

    PubMed

    Wang, Song-De; Zhao, Yan; Yao, Li-Ping; Zhang, Shuan-Ji

    2009-03-01

    A digital acoustic burglar alarm with infrared radio remote control was designed using paired infrared emitter/sensor units, radio transmitting and receiving modules, a dual-function encoding/decoding integrated circuit, LEDs, and related components. The infrared beams, invisible to the eye, define a protected perimeter within radio range. When a person or object interrupts a beam, the detector outputs a trigger signal and the transmitter is activated. The coded radio signal is picked up by the receiver and processed by a serial circuit; the resulting control signal triggers the sounder to raise an alarm, cueing the operator to the intrusion. At the same time, a digital display lights up to indicate the alarm location. Because digital coding is used, a number of sub-alarm circuits can be connected to the main receiver, so many places can be monitored. The whole system has a modular structure and is easy to align, stable in operation, and largely maintenance-free. It offers an omnidirectional alarm range of up to 1,000 meters and can be widely used in homes, shops, warehouses, orchards, and similar settings.

  8. Simulating X-ray bursts with a radiation hydrodynamics code

    NASA Astrophysics Data System (ADS)

    Seong, Gwangeon; Kwak, Kyujin

    2018-04-01

    Previous simulations of X-ray bursts (XRBs), for example those performed with MESA (Modules for Experiments in Stellar Astrophysics), could not address the dynamical effects of strong radiation, which are important for explaining the photospheric radius expansion (PRE) phenomena seen in many XRBs. In order to study these effects, we propose to use SNEC (the SuperNova Explosion Code), a 1D Lagrangian open-source code designed to solve hydrodynamics and equilibrium-diffusion radiation transport together. Because SNEC's radiation-hydrodynamics modules can be controlled for properly mapped inputs, the radiation-dominated pressure occurring in PRE XRBs can be handled. Here we present simulation models of PRE XRBs obtained by applying SNEC together with MESA.

  9. Three-dimensional Monte-Carlo simulation of gamma-ray scattering and production in the atmosphere

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morris, D.J.

    1989-05-15

    Monte Carlo codes have been developed to simulate gamma-ray scattering and production in the atmosphere. The scattering code simulates interactions of low-energy gamma rays (20 to several hundred keV) from an astronomical point source in the atmosphere; a modified code also simulates scattering in a spacecraft. Four incident spectra, typical of gamma-ray bursts, solar flares, and the Crab pulsar, and 511 keV line radiation have been studied. These simulations are consistent with observations of solar flare radiation scattered from the atmosphere. The production code simulates the interactions of cosmic rays which produce high-energy (above 10 MeV) photons and electrons. It has been used to calculate gamma-ray and electron albedo intensities at Palestine, Texas and at the equator; the results agree with observations in most respects. With minor modifications this code can be used to calculate intensities of other high-energy particles. Both codes are fully three-dimensional, incorporating a curved atmosphere; the production code also incorporates the variation with both zenith and azimuth of the incident cosmic-ray intensity due to geomagnetic effects. These effects are clearly reflected in the calculated albedo by intensity contrasts between the horizon and nadir, and between the east and west horizons.
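    The scattering code's basic loop can be caricatured with a toy plane-parallel Monte Carlo: sample a free path, decide scatter versus absorption, and count photons that escape back out of the top. This is a pedagogical sketch with an isotropic phase function, far simpler than the three-dimensional curved-atmosphere codes described above.

```python
import math
import random

def albedo_fraction(n_photons, optical_depth, albedo=0.5, seed=1):
    """Toy Monte Carlo: photons enter a plane-parallel slab from the top
    travelling straight down; at each interaction they scatter
    isotropically with probability `albedo`, else are absorbed.
    Returns the fraction that escapes back out of the top."""
    rng = random.Random(seed)
    escaped_up = 0
    for _ in range(n_photons):
        tau, mu = 0.0, 1.0                 # depth below top; +1 = downward
        while True:
            tau += -math.log(1.0 - rng.random()) * mu   # next interaction
            if tau < 0.0:                  # back out through the top
                escaped_up += 1
                break
            if tau > optical_depth:        # out through the bottom
                break
            if rng.random() > albedo:      # absorbed
                break
            mu = 2.0 * rng.random() - 1.0  # isotropic re-scatter
    return escaped_up / n_photons
```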

  10. Hard X-ray imaging from Explorer

    NASA Technical Reports Server (NTRS)

    Grindlay, J. E.; Murray, S. S.

    1981-01-01

    Coded aperture X-ray detectors were applied to obtain large increases in sensitivity as well as angular resolution. A hard X-ray coded aperture detector concept is described which enables very high sensitivity studies of persistent hard X-ray sources and gamma-ray bursts. Coded aperture imaging is employed so that source locations of approximately 2 arcmin can be derived within a 3-deg field of view. Gamma-ray bursts would be located initially to within approximately 2 deg, and X-ray/hard X-ray spectra and timing, as well as precise locations, derived for possible burst afterglow emission. It is suggested that hard X-ray imaging should be conducted from an Explorer mission, where long exposure times are possible.

  11. Coded mask telescopes for X-ray astronomy

    NASA Astrophysics Data System (ADS)

    Skinner, G. K.; Ponman, T. J.

    1987-04-01

    The principles of the coded mask technique are discussed together with methods of image reconstruction. The coded mask telescopes built at the University of Birmingham are described, including the SL 1501 coded mask X-ray telescope flown on the Skylark rocket and the Coded Mask Imaging Spectrometer (COMIS) projected for the Soviet space station Mir. A diagram of a coded mask telescope and some designs for coded masks are included.
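    The core of coded mask image reconstruction is cross-correlating the recorded detector counts with a decoding array matched to the mask. A minimal 1D cyclic example, using a length-7 pattern with perfect periodic autocorrelation (the pattern is illustrative, not one of the Birmingham flight masks):

```python
import numpy as np

# 7-element cyclic mask with perfect periodic autocorrelation
# (an m-sequence; illustrative, not an actual flight mask pattern).
mask = np.array([1, 1, 1, 0, 0, 1, 0], dtype=float)
sky = np.zeros(7)
sky[2] = 10.0                       # a single point source

# Detector counts: the sky circularly convolved with the mask pattern.
detector = np.fft.ifft(np.fft.fft(sky) * np.fft.fft(mask)).real

# Balanced decoding array: open elements -> +1, closed -> -1.
G = np.where(mask == 1, 1.0, -1.0)

# Image reconstruction: circular cross-correlation of counts with G.
image = np.fft.ifft(np.fft.fft(detector) * np.conj(np.fft.fft(G))).real
# For this mask, image == 4 * sky: a clean peak at the source position.
```

The choice of mask matters: this pattern's mask-decoder cross-correlation is a delta function, so the point source is recovered with flat sidelobes.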

  12. MMAPDNG: A new, fast code backed by a memory-mapped database for simulating delayed γ-ray emission with MCNPX package

    NASA Astrophysics Data System (ADS)

    Lou, Tak Pui; Ludewigt, Bernhard

    2015-09-01

    The simulation of the emission of beta-delayed gamma rays following nuclear fission and the calculation of time-dependent energy spectra is a computational challenge. The widely used radiation transport code MCNPX includes a delayed gamma-ray routine that is inefficient and not suitable for simulating complex problems. This paper describes the code "MMAPDNG" (Memory-Mapped Delayed Neutron and Gamma), an optimized delayed gamma module written in C, discusses usage and merits of the code, and presents results. The approach is based on storing required Fission Product Yield (FPY) data, decay data, and delayed particle data in a memory-mapped file. When compared to the original delayed gamma-ray code in MCNPX, memory utilization is reduced by two orders of magnitude and the ray sampling is sped up by three orders of magnitude. Other delayed particles such as neutrons and electrons can be implemented in future versions of MMAPDNG code using its existing framework.
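    The memory-mapping strategy MMAPDNG uses (in C) can be sketched in Python with the standard mmap and struct modules: fixed-size records are written to a file once, and subsequent lookups read through the map, so only the pages actually touched become resident. The file name and record layout here are invented for the example; MMAPDNG's actual FPY/decay record format differs.

```python
import mmap
import os
import struct
import tempfile

# Fixed-size binary records: int64 key + float64 value.
RECORD = struct.Struct("<qd")

path = os.path.join(tempfile.mkdtemp(), "yields.bin")
with open(path, "wb") as f:                    # build the table once
    for key in range(1000):
        f.write(RECORD.pack(key, key * 0.5))

def lookup(mm, index):
    """Read record `index` straight out of the memory map."""
    return RECORD.unpack_from(mm, index * RECORD.size)

with open(path, "rb") as f:
    with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as mm:
        key, value = lookup(mm, 42)            # only touched pages load
```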

  13. Evaluation of the cosmic-ray induced background in coded aperture high energy gamma-ray telescopes

    NASA Technical Reports Server (NTRS)

    Owens, Alan; Barbier, Loius M.; Frye, Glenn M.; Jenkins, Thomas L.

    1991-01-01

    While the application of coded-aperture techniques to high-energy gamma-ray astronomy offers potential arc-second angular resolution, concerns were raised about the level of secondary radiation produced in a thick high-Z mask. A series of Monte Carlo calculations was conducted to evaluate and quantify the cosmic-ray-induced neutral-particle background produced in a coded-aperture mask. It is shown that this component may be neglected, being at least a factor of 50 lower in intensity than the diffuse cosmic gamma rays.

  14. Modification and benchmarking of MCNP for low-energy tungsten spectra.

    PubMed

    Mercier, J R; Kopp, D T; McDavid, W D; Dove, S B; Lancaster, J L; Tucker, D M

    2000-12-01

    The MCNP Monte Carlo radiation transport code was modified for diagnostic medical physics applications. In particular, the modified code was thoroughly benchmarked for the production of polychromatic tungsten x-ray spectra in the 30-150 kV range. Validating the modified code for coupled electron-photon transport with benchmark spectra was supplemented with independent electron-only and photon-only transport benchmarks. Major revisions to the code included the proper treatment of characteristic K x-ray production and scoring, new impact ionization cross sections, and new bremsstrahlung cross sections. Minor revisions included updated photon cross sections, electron-electron bremsstrahlung production, and K x-ray yield. The modified MCNP code is benchmarked to electron backscatter factors, x-ray spectra production, and primary and scatter photon transport.

  15. Coded diffraction system in X-ray crystallography using a boolean phase coded aperture approximation

    NASA Astrophysics Data System (ADS)

    Pinilla, Samuel; Poveda, Juan; Arguello, Henry

    2018-03-01

    Phase retrieval is a problem present in many applications such as optics, astronomical imaging, computational biology and X-ray crystallography. Recent work has shown that the phase can be better recovered when the acquisition architecture includes a coded aperture, which modulates the signal before diffraction, such that the underlying signal is recovered from coded diffraction patterns. Moreover, this type of modulation effect, before the diffraction operation, can be obtained using a phase coded aperture placed just after the sample under study. However, a practical implementation of a phase coded aperture in an X-ray application is not feasible, because it is computationally modeled as a matrix with complex entries, which requires changing the phase of the diffracted beams. In fact, changing the phase implies finding a material that can deviate the direction of an X-ray beam, which can considerably increase implementation costs. Hence, this paper describes a low-cost coded X-ray diffraction system based on block-unblock coded apertures that enables phase reconstruction. The proposed system approximates the phase coded aperture with a block-unblock coded aperture by using the detour-phase method. Moreover, the SAXS/WAXS X-ray crystallography software was used to simulate the diffraction patterns of a real crystal structure called Rhombic Dodecahedron. Additionally, several simulations were carried out to analyze the performance of block-unblock approximations in recovering the phase, using the simulated diffraction patterns. Furthermore, the quality of the reconstructions was measured in terms of the Peak Signal to Noise Ratio (PSNR). Results show that the performance of the block-unblock phase coded aperture approximation decreases by at most 12.5% compared with the phase coded apertures, and the quality of the reconstructions using the boolean approximations is at most 2.5 dB of PSNR below that of the phase coded aperture reconstructions.
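    The PSNR figure of merit quoted in the abstract is straightforward to compute; a minimal implementation (assuming images normalized to a known peak value) might look like:

```python
import numpy as np

def psnr(reference, estimate, peak=1.0):
    """Peak signal-to-noise ratio in dB between two images."""
    ref = np.asarray(reference, dtype=float)
    est = np.asarray(estimate, dtype=float)
    mse = np.mean((ref - est) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)

value = psnr(np.zeros((8, 8)), np.full((8, 8), 0.1))   # mse = 0.01 -> 20 dB
```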

  16. Hard gamma-ray background from the coding collimator of a gamma-ray telescope under the conditions of a space experiment

    NASA Astrophysics Data System (ADS)

    Aleksandrov, A. P.; Berezovoj, A. N.; Gal'Per, A. M.; Grachev, V. M.; Dmitrenko, V. V.; Kirillov-Ugryumov, V. G.; Lebedev, V. V.; Lyakhov, V. A.; Moiseev, A. A.; Ulin, S. E.; Shchvets, N. I.

    1984-11-01

    Coding collimators are used to improve the angular resolution of gamma-ray telescopes at energies above 50 MeV. However, the interaction of cosmic rays with the collimator material can lead to the appearance of a gamma-ray background flux which can have a deleterious effect on measurement efficiency. An experiment was performed on the Salyut-6-Soyuz spacecraft system with the Elena-F small-scale gamma-ray telescope in order to measure the magnitude of this background. It is shown that, even at a zenith angle of approximately zero degrees (the angle at which the gamma-ray observations are made), the coding collimator has only an insignificant effect on the background conditions.

  17. Control of the Low-energy X-rays by Using MCNP5 and Numerical Analysis for a New Concept Intra-oral X-ray Imaging System

    NASA Astrophysics Data System (ADS)

    Huh, Jangyong; Ji, Yunseo; Lee, Rena

    2018-05-01

    An X-ray control algorithm to modulate the X-ray intensity distribution over the FOV (field of view) has been developed by using numerical analysis and MCNP5, a particle transport simulation code based on the Monte Carlo method. X-rays, which are widely used in medical diagnostic imaging, must be controlled in order to maximize the performance of the X-ray imaging system; however, they cannot be transported the way a liquid or a gas is conveyed through a physical conduit such as a pipe. In the present study, an X-ray control algorithm and technique to make the X-ray intensity projected on the image sensor uniform were developed using a flattening filter and a collimator, in order to alleviate the anisotropy of the X-ray distribution caused by intrinsic features of the X-ray generator. The proposed method, which combines MCNP5 modeling and numerical analysis, aimed to optimize a flattening filter and a collimator for a uniform distribution of X-rays; their size and shape were estimated from the method. The simulation and the experimental results both showed that the method yielded an intensity distribution over an X-ray field of 6×4 cm2 at an SID (source to image-receptor distance) of 5 cm with a uniformity of more than 90% when the flattening filter and the collimator were mounted on the system. The proposed algorithm and technique are not confined to flattening filter development and can also be applied to other X-ray related research and development efforts.
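    One common way to quantify the >90% uniformity quoted above is the figure 1 - (Imax - Imin)/(Imax + Imin) over the field of view. The sketch below applies it to a hypothetical tilted raw field and an idealized flattening filter shaped to attenuate the field down to its weakest point; the numbers are illustrative, not from the paper.

```python
import numpy as np

def uniformity(intensity):
    """Flatness figure of merit: 1 - (Imax - Imin) / (Imax + Imin)."""
    lo, hi = float(intensity.min()), float(intensity.max())
    return 1.0 - (hi - lo) / (hi + lo)

x = np.linspace(-1.0, 1.0, 101)          # position across the FOV
raw = 1.0 - 0.4 * x                      # tilted raw field (40% swing)
transmission = raw.min() / raw           # idealized flattening filter
flattened = raw * transmission           # constant field after the filter
```

With this metric the tilted field scores 0.6, while the perfectly flattened field scores 1.0, at the cost of throwing away intensity everywhere except the weakest point.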

  18. Three dimensional ray tracing of the Jovian magnetosphere in the low frequency range

    NASA Technical Reports Server (NTRS)

    Menietti, J. D.

    1984-01-01

    Jovian low-frequency emissions were studied by ray tracing. A comprehensive three-dimensional ray tracing computer code for examining model Jovian decametric (DAM) emission was developed. The improvements to the computer code are outlined and described, and the results of the ray tracings of Jovian emissions are presented in summary form.

  19. Grid-enhanced X-ray coded aperture microscopy with polycapillary optics

    PubMed Central

    Sowa, Katarzyna M.; Last, Arndt; Korecki, Paweł

    2017-01-01

    Polycapillary devices focus X-rays by means of multiple reflections of X-rays in arrays of bent glass capillaries. The size of the focal spot (typically 10–100 μm) limits the resolution of scanning, absorption and phase-contrast X-ray imaging using these devices. At the expense of a moderate resolution, polycapillary elements provide high intensity and are frequently used for X-ray micro-imaging with both synchrotrons and X-ray tubes. Recent studies have shown that the internal microstructure of such an optics can be used as a coded aperture that encodes high-resolution information about objects located inside the focal spot. However, further improvements to this variant of X-ray microscopy will require the challenging fabrication of tailored devices with a well-defined capillary microstructure. Here, we show that submicron coded aperture microscopy can be realized using a periodic grid that is placed at the output surface of a polycapillary optics. Grid-enhanced X-ray coded aperture microscopy with polycapillary optics does not rely on the specific microstructure of the optics but rather takes advantage only of its focusing properties. Hence, submicron X-ray imaging can be realized with standard polycapillary devices and existing set-ups for micro X-ray fluorescence spectroscopy. PMID:28322316

  20. Grid-enhanced X-ray coded aperture microscopy with polycapillary optics.

    PubMed

    Sowa, Katarzyna M; Last, Arndt; Korecki, Paweł

    2017-03-21

    Polycapillary devices focus X-rays by means of multiple reflections of X-rays in arrays of bent glass capillaries. The size of the focal spot (typically 10-100 μm) limits the resolution of scanning, absorption and phase-contrast X-ray imaging using these devices. At the expense of a moderate resolution, polycapillary elements provide high intensity and are frequently used for X-ray micro-imaging with both synchrotrons and X-ray tubes. Recent studies have shown that the internal microstructure of such an optics can be used as a coded aperture that encodes high-resolution information about objects located inside the focal spot. However, further improvements to this variant of X-ray microscopy will require the challenging fabrication of tailored devices with a well-defined capillary microstructure. Here, we show that submicron coded aperture microscopy can be realized using a periodic grid that is placed at the output surface of a polycapillary optics. Grid-enhanced X-ray coded aperture microscopy with polycapillary optics does not rely on the specific microstructure of the optics but rather takes advantage only of its focusing properties. Hence, submicron X-ray imaging can be realized with standard polycapillary devices and existing set-ups for micro X-ray fluorescence spectroscopy.

  1. RAY-RAMSES: a code for ray tracing on the fly in N-body simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barreira, Alexandre; Llinares, Claudio; Bose, Sownak

    2016-05-01

    We present a ray tracing code to compute integrated cosmological observables on the fly in AMR N-body simulations. Unlike conventional ray tracing techniques, our code takes full advantage of the time and spatial resolution attained by the N-body simulation by computing the integrals along the line of sight on a cell-by-cell basis through the AMR simulation grid. Moreover, since it runs on the fly in the N-body run, our code can produce maps of the desired observables without storing large (or any) amounts of data for post-processing. We implemented our routines in the RAMSES N-body code and tested the implementation using an example of a weak lensing simulation. We analyse basic statistics of lensing convergence maps and find good agreement with semi-analytical methods. The ray tracing methodology presented here can be used in several cosmological analyses, such as Sunyaev-Zel'dovich and integrated Sachs-Wolfe effect studies, as well as in modified gravity. Our code can also be used in cross-checks of the more conventional methods, which can be important in tests of theory systematics in preparation for upcoming large scale structure surveys.
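    The cell-by-cell line-of-sight integration described above reduces, schematically, to accumulating weight × density contrast × path length as the ray steps through grid cells, with nothing stored for post-processing. The sketch below is a schematic stand-in (the kernel and cell values are invented), not the RAMSES implementation.

```python
def integrate_los(cells, weight):
    """Accumulate an observable along one line of sight, cell by cell.

    `cells` yields (chi, dchi, delta) as the ray steps through the grid;
    nothing is stored, mirroring the on-the-fly approach.
    """
    total = 0.0
    for chi, dchi, delta in cells:
        total += weight(chi) * delta * dchi
    return total

# Lensing-like kernel, with comoving distance in units of the source
# distance; the cell values are invented for the example.
w = lambda chi: chi * (1.0 - chi)
cells = [(0.25, 0.5, 2.0), (0.75, 0.5, 4.0)]
kappa = integrate_los(cells, w)
```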

  2. Method for rapid high-frequency seismogram calculation

    NASA Astrophysics Data System (ADS)

    Stabile, Tony Alfredo; De Matteis, Raffaella; Zollo, Aldo

    2009-02-01

    We present a method for rapid, high-frequency seismogram calculation that makes use of an algorithm to automatically generate an exhaustive set of seismic phases with appreciable amplitude on the seismogram. The method uses a hierarchical order of ray and seismic-phase generation, taking into account some existing constraints for ray paths and some physical constraints. To compute synthetic seismograms, the COMRAD code (from the Italian "COdice Multifase per il RAy-tracing Dinamico") uses a dynamic ray-tracing code as its core. To validate the code, we computed synthetic seismograms in a layered medium using both COMRAD and a code that computes the complete wave field by the discrete-wavenumber method. The seismograms are compared according to a time-frequency misfit criterion based on the continuous wavelet transform of the signals. Although the number of phases is considerably reduced by the selection criteria, the results show that the loss in amplitude on the whole seismogram is negligible. Moreover, the time to compute the synthetics using the COMRAD code (truncating the ray series at the 10th generation) is 3- to 4-fold less than that needed by the AXITRA code (up to a frequency of 25 Hz).

  3. Hard gamma radiation background from coding collimator of gamma telescope under space experiment conditions

    NASA Astrophysics Data System (ADS)

    Aleksandrov, A. P.; Berezovoy, A. N.; Galper, A. M.; Grachev, V. M.; Dmitrenko, V. V.; Kirillov-Ugryumov, V. G.; Lebedev, V. V.; Lyakhov, V. A.; Moiseyev, A. A.; Ulin, S. Y.

    1985-09-01

    Coding collimators are used to improve the angular resolution of gamma-ray telescopes at energies above 50 MeV. However, the interaction of cosmic rays with the collimation material can lead to the appearance of a gamma-ray background flux which can have a deleterious effect on measurement efficiency. An experiment was performed on the Salyut-6-Soyuz spacecraft system with the Elena-F small-scale gamma-ray telescope in order to measure the magnitude of this background. It is shown that, even at a zenith angle of approximately zero degrees (the angle at which the gamma-ray observations are made), the coding collimator has only an insignificant effect on the background conditions.

  4. Nuclear Physics Meets the Sources of the Ultra-High Energy Cosmic Rays.

    PubMed

    Boncioli, Denise; Fedynitch, Anatoli; Winter, Walter

    2017-07-07

    The determination of the injection composition of cosmic ray nuclei within astrophysical sources requires sufficiently accurate descriptions of the source physics and the propagation - apart from controlling astrophysical uncertainties. We therefore study the implications of nuclear data and models for cosmic ray astrophysics, which involves the photo-disintegration of nuclei up to iron in astrophysical environments. We demonstrate that the impact of nuclear model uncertainties is potentially larger in environments with non-thermal radiation fields than in the cosmic microwave background. We also study the impact of nuclear models on the nuclear cascade in a gamma-ray burst radiation field, simulated at a level of complexity comparable to the most precise cosmic ray propagation code. We conclude with an isotope chart describing which information is in principle necessary to describe nuclear interactions in cosmic ray sources and propagation.

  5. Flowfield computer graphics

    NASA Technical Reports Server (NTRS)

    Desautel, Richard

    1993-01-01

    The objectives of this research include supporting the Aerothermodynamics Branch's research by developing graphical visualization tools for both the branch's adaptive grid code and flow field ray tracing code. The completed research for the reporting period includes development of a graphical user interface (GUI) and its implementation into the NAS Flowfield Analysis Software Tool kit (FAST), for both the adaptive grid code (SAGE) and the flow field ray tracing code (CISS).

  6. Validation of Ray Tracing Code Refraction Effects

    NASA Technical Reports Server (NTRS)

    Heath, Stephanie L.; McAninch, Gerry L.; Smith, Charles D.; Conner, David A.

    2008-01-01

    NASA's current predictive capabilities using the ray tracing program (RTP) are validated using helicopter noise data taken at Eglin Air Force Base in 2007. By including refractive propagation effects due to wind and temperature, the ray tracing code is able to explain large variations in the data observed during the flight test.

  7. The POPOP4 library and codes for preparing secondary gamma-ray production cross sections

    NASA Technical Reports Server (NTRS)

    Ford, W. E., III

    1972-01-01

    The POPOP4 code for converting secondary gamma-ray yield data to multigroup secondary gamma-ray production cross sections and the POPOP4 library of secondary gamma-ray yield data are described. Recent results of the testing of uranium and iron data sets from the POPOP4 library are given. The data sets were tested by comparing calculated secondary gamma-ray pulse-height spectra with spectra measured at the ORNL TSR-II reactor.

  8. High-resolution imaging gamma-ray spectroscopy with externally segmented germanium detectors

    NASA Technical Reports Server (NTRS)

    Callas, J. L.; Mahoney, W. A.; Varnell, L. S.; Wheaton, W. A.

    1993-01-01

    Externally segmented germanium detectors promise a breakthrough in gamma-ray imaging capabilities while retaining the superb energy resolution of germanium spectrometers. An angular resolution of 0.2 deg becomes practical by combining position-sensitive germanium detectors having a segment thickness of a few millimeters with a one-dimensional coded aperture located about a meter from the detectors. Correspondingly higher angular resolutions are possible with larger separations between the detectors and the coded aperture. Two-dimensional images can be obtained by rotating the instrument. Although the basic concept is similar to optical or X-ray coded-aperture imaging techniques, several complicating effects arise because of the penetrating nature of gamma rays. The complications include partial transmission through the coded aperture elements, Compton scattering in the germanium detectors, and high background count rates. Extensive electron-photon Monte Carlo modeling of a realistic detector/coded-aperture/collimator system has been performed. Results show that these complicating effects can be characterized and accounted for with no significant loss in instrument sensitivity.
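    The quoted 0.2 deg resolution follows from simple geometry: the angular resolution of a coded-aperture system is roughly the detector segment (or mask element) size divided by the mask-detector separation. A sketch, with an assumed few-millimeter segment at a one-meter separation:

```python
import math

def angular_resolution_deg(element_mm, separation_m):
    """Geometric resolution of a coded-aperture system: element size
    over the mask-detector separation, converted to degrees."""
    return math.degrees(math.atan2(element_mm * 1e-3, separation_m))

# A 3.5 mm segment at 1 m (assumed numbers) gives about 0.2 degrees,
# consistent with the resolution quoted in the abstract.
theta = angular_resolution_deg(3.5, 1.0)
```

The same relation explains why larger detector/aperture separations give correspondingly finer angular resolution, as the abstract notes.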

  9. CEM2k and LAQGSM Codes as Event-Generators for Space Radiation Shield and Cosmic Rays Propagation Applications

    NASA Technical Reports Server (NTRS)

    Mashnik, S. G.; Gudima, K. K.; Sierk, A. J.; Moskalenko, I. V.

    2002-01-01

    Space radiation shield applications and studies of cosmic ray propagation in the Galaxy require reliable cross sections to calculate spectra of secondary particles and yields of the isotopes produced in nuclear reactions induced both by particles and nuclei at energies from threshold to hundreds of GeV per nucleon. Since the data often exist in a very limited energy range or sometimes not at all, the only way to obtain an estimate of the production cross sections is to use theoretical models and codes. Recently, we have developed improved versions of the Cascade-Exciton Model (CEM) of nuclear reactions: the codes CEM97 and CEM2k for description of particle-nucleus reactions at energies up to about 5 GeV. In addition, we have developed a LANL version of the Quark-Gluon String Model (LAQGSM) to describe reactions induced both by particles and nuclei at energies up to hundreds of GeV/nucleon. We have tested and benchmarked the CEM and LAQGSM codes against a large variety of experimental data and have compared their results with predictions by other currently available models and codes. Our benchmarks show that the CEM and LAQGSM codes have predictive powers no worse than other currently used codes and describe many reactions better than other codes; therefore both our codes can be used as reliable event-generators for space radiation shield and cosmic ray propagation applications. The CEM2k code is being incorporated into the transport code MCNPX (and several other transport codes), and we plan to incorporate LAQGSM into MCNPX in the near future. Here, we present the current status of the CEM2k and LAQGSM codes, and show results and applications to studies of cosmic ray propagation in the Galaxy.

  10. Chromaticity calculations and code comparisons for x-ray lithography source XLS and SXLS rings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parsa, Z.

    1988-06-16

This note presents the chromaticity calculations and code comparison results for the XLS (x-ray lithography source; Chasman Green, XUV Cosy lattice) and SXLS (2 magnet, 4 T) lattices, obtained with the standard beam optics codes, including the programs SYNCH88.5, MAD6, PATRICIA88.4, PATPET88.2, DIMAD, BETA, and MARYLIE. This analysis is a part of our ongoing accelerator physics code studies. 4 figs., 10 tabs.

  11. Space astrophysics with large structures - CASES and P/OF. [Controls, Astrophysics, and Structures Experiment in Space and Pinhole/Occulter Facility

    NASA Technical Reports Server (NTRS)

    Hudson, Hugh S.; Davis, J. M.

    1990-01-01

    Space instruments for remote sensing, of the types used for astrophysics and solar-terrestrial physics among many disciplines, will grow to larger physical sizes in the future. The zero-g space environment does not inherently restrict such growth, because relatively lightweight structures can be used. Active servo control of the structures can greatly increase their size for a given mass. The Pinhole/Occulter Facility, a candidate Space Station attached payload, offers an example: it will achieve 0.2 arc s resolution by use of a 50-m baseline for coded-aperture telescopes for hard X-ray and gamma-ray imagers.

  12. LabVIEW control software for scanning micro-beam X-ray fluorescence spectrometer.

    PubMed

    Wrobel, Pawel; Czyzycki, Mateusz; Furman, Leszek; Kolasinski, Krzysztof; Lankosz, Marek; Mrenca, Alina; Samek, Lucyna; Wegrzynek, Dariusz

    2012-05-15

A confocal micro-beam X-ray fluorescence microscope was constructed. The system was assembled from commercially available components - a low power X-ray tube source, polycapillary X-ray optics and a silicon drift detector - controlled by in-house developed LabVIEW software. A video camera coupled to an optical microscope was utilized to display the area excited by the X-ray beam. The camera image calibration and scan area definition software were also based entirely on LabVIEW code. Presently, the main area of application of the newly constructed spectrometer is 2-dimensional mapping of element distributions in environmental, biological and geological samples with micrometer spatial resolution. The hardware and the developed software can already handle volumetric 3-D confocal scans. In this work, the front panel graphical user interface as well as the communication protocols between hardware components were described. Two applications of the spectrometer, to homogeneity testing of titanium layers and to imaging of various types of grains in air particulate matter collected on membrane filters, were presented.

  13. Practical Applications of Cosmic Ray Science: Spacecraft, Aircraft, Ground-Based Computation and Control Systems, and Human Health and Safety

    NASA Technical Reports Server (NTRS)

    Atwell, William; Koontz, Steve; Normand, Eugene

    2012-01-01

Three twentieth century technological developments, 1) high altitude commercial and military aircraft, 2) manned and unmanned spacecraft, and 3) increasingly complex and sensitive solid state micro-electronics systems, have driven an ongoing evolution of basic cosmic ray science into a set of practical engineering tools needed to design, test, and verify the safety and reliability of modern complex technological systems. The effects of primary cosmic ray particles, and of the secondary particle showers produced by nuclear reactions with the atmosphere, can determine the design and verification processes (as well as the total dollar cost) for manned and unmanned spacecraft avionics systems. Similar considerations apply to commercial and military aircraft operating at high latitudes and altitudes near the atmospheric Pfotzer maximum. Even ground-based computational and control systems can be negatively affected by secondary particle showers at the Earth's surface, especially if the net target area of the sensitive electronic system components is large. Finally, the accumulated radiation dose from both primary cosmic rays and the particle showers they induce is an important health and safety consideration for commercial and military air crews operating at high altitude/latitude, and is also one of the most important factors presently limiting manned space flight operations beyond low-Earth orbit (LEO). In this paper we review the discovery of cosmic ray effects on the performance and reliability of microelectronic systems, as well as on human health, and the development of the engineering and health science tools used to evaluate and mitigate cosmic ray effects in ground-based, atmospheric flight, and space flight environments. Ground test methods applied to microelectronic components and systems are used in combination with radiation transport and reaction codes to predict the performance of microelectronic systems in their operating environments. Similar radiation transport codes are used to evaluate possible human health effects of cosmic ray exposure; however, the health effects are based on worst-case analysis and extrapolation of a very limited human exposure database combined with some limited experimental animal data. Finally, the limitations on human space operations beyond low-Earth orbit imposed by long-term exposure to galactic cosmic rays are discussed.

  14. Effects of cosmic rays on single event upsets

    NASA Technical Reports Server (NTRS)

    Venable, D. D.; Zajic, V.; Lowe, C. W.; Olidapupo, A.; Fogarty, T. N.

    1989-01-01

Assistance was provided to the Brookhaven Single Event Upset (SEU) Test Facility. Computer codes were developed for fragmentation and secondary radiation affecting Very Large Scale Integration (VLSI) in space. A computer-controlled CV (HP4192) test was developed for Terman analysis. Also developed were high-speed parametric tests which are independent of operator judgment, and a charge-pumping technique for measurement of D(sub it)(E). The X-ray secondary effects and parametric degradation as a function of dose rate were simulated. The SPICE simulation of static RAMs with various resistor filters was tested.

  15. An Overview of the XGAM Code and Related Software for Gamma-ray Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Younes, W.

    2014-11-13

The XGAM spectrum-fitting code and associated software were developed specifically to analyze the complex gamma-ray spectra that can result from neutron-induced reactions. The XGAM code is designed to fit a spectrum over the entire available gamma-ray energy range as a single entity, in contrast to the more traditional piecewise approaches. This global-fit philosophy enforces background continuity as well as consistency between local and global behavior throughout the spectrum, and in a natural way. This report presents XGAM and the suite of programs built around it with an emphasis on how they fit into an overall analysis methodology for complex gamma-ray data. An application to the analysis of time-dependent delayed gamma-ray yields from 235U fission is shown in order to showcase the codes and how they interact.

  16. A comparison of models for supernova remnants including cosmic rays

    NASA Astrophysics Data System (ADS)

    Kang, Hyesung; Drury, L. O'C.

    1992-11-01

    A simplified model which can follow the dynamical evolution of a supernova remnant including the acceleration of cosmic rays without carrying out full numerical simulations has been proposed by Drury, Markiewicz, & Voelk in 1989. To explore the accuracy and the merits of using such a model, we have recalculated with the simplified code the evolution of the supernova remnants considered in Jones & Kang, in which more detailed and accurate numerical simulations were done using a full hydrodynamic code based on the two-fluid approximation. For the total energy transferred to cosmic rays the two codes are in good agreement, the acceleration efficiency being the same within a factor of 2 or so. The dependence of the results of the two codes on the closure parameters for the two-fluid approximation is also qualitatively similar. The agreement is somewhat degraded in those cases where the shock is smoothed out by the cosmic rays.

  17. Ray-tracing 3D dust radiative transfer with DART-Ray: code upgrade and public release

    NASA Astrophysics Data System (ADS)

    Natale, Giovanni; Popescu, Cristina C.; Tuffs, Richard J.; Clarke, Adam J.; Debattista, Victor P.; Fischera, Jörg; Pasetto, Stefano; Rushton, Mark; Thirlwall, Jordan J.

    2017-11-01

We present an extensively updated version of the purely ray-tracing 3D dust radiation transfer code DART-Ray. The new version includes five major upgrades: 1) a series of optimizations for the ray-angular density and the scattered radiation source function; 2) the implementation of several data and task parallelizations using hybrid MPI+OpenMP schemes; 3) the inclusion of dust self-heating; 4) the ability to produce surface brightness maps for observers within the models in HEALPix format; 5) the ability to set the expected numerical accuracy at the start of the calculation. We tested the updated code with benchmark models where the dust self-heating is not negligible. Furthermore, we performed a study of the extent of the source influence volumes, using galaxy models, which are critical in determining the efficiency of the DART-Ray algorithm. The new code is publicly available, documented for both users and developers, and accompanied by several programmes to create input grids for different model geometries and to import the results of N-body and SPH simulations. These programmes can be easily adapted to different input geometries, and for different dust models or stellar emission libraries.
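The core operation in a ray-tracing radiative transfer code of this kind is marching the formal solution of dI/dτ = -I + S along rays through the dust grid. A toy illustration of that single step, not DART-Ray's actual algorithm or data structures:

```python
import numpy as np

def integrate_ray(source, dtau, i0=0.0):
    """March the transfer equation dI/dtau = -I + S cell by cell along a
    ray: attenuate the incoming intensity and add each cell's source
    contribution (S assumed constant within a cell)."""
    i = i0
    for s, dt in zip(source, dtau):
        att = np.exp(-dt)
        i = i * att + s * (1.0 - att)
    return i

# Sanity check: a uniform source at large total optical depth drives the
# emergent intensity toward I = S.
print(round(float(integrate_ray(np.ones(100), np.full(100, 0.5))), 3))  # → 1.0
```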

  18. Systematic Comparison of Photoionized Plasma Codes with Application to Spectroscopic Studies of AGN in X-Rays

    NASA Technical Reports Server (NTRS)

    Mehdipour, M.; Kaastra, J. S.; Kallman, T.

    2016-01-01

Atomic data and plasma models play a crucial role in the diagnosis and interpretation of astrophysical spectra, thus influencing our understanding of the Universe. In this investigation we present a systematic comparison of the leading photoionization codes to determine how much their intrinsic differences impact X-ray spectroscopic studies of hot plasmas in photoionization equilibrium. We carry out our computations using the Cloudy, SPEX, and XSTAR photoionization codes, and compare their derived thermal and ionization states for various ionizing spectral energy distributions. We examine the resulting absorption-line spectra from these codes for the case of ionized outflows in active galactic nuclei. By comparing the ionic abundances as a function of ionization parameter ξ, we find that on average there is about 30% deviation between the codes in the ξ at which ionic abundances peak. For H-like to B-like sequence ions alone, this deviation in ξ is smaller, at about 10% on average. The comparison of the absorption-line spectra in the X-ray band shows that there is on average about 30% deviation between the codes in the optical depth of the lines produced at log ξ ≈ 1 to 2, reducing to about 20% deviation at log ξ ≈ 3. We also simulate spectra of the ionized outflows with the current and upcoming high-resolution X-ray spectrometers on board XMM-Newton, Chandra, Hitomi, and Athena. From these simulations we obtain the deviation in the best-fit model parameters arising from the use of different photoionization codes, which is about 10 to 40%. We compare these modeling uncertainties with the observational uncertainties from the simulations. The results highlight the importance of continuous development and enhancement of photoionization codes for the upcoming era of X-ray astronomy with Athena.

  19. Intercomparison of Monte Carlo radiation transport codes to model TEPC response in low-energy neutron and gamma-ray fields.

    PubMed

    Ali, F; Waker, A J; Waller, E J

    2014-10-01

Tissue-equivalent proportional counters (TEPC) can potentially be used as a portable and personal dosemeter in mixed neutron and gamma-ray fields, but what hinders this use is their typically large physical size. To formulate compact TEPC designs, the use of a Monte Carlo transport code is necessary to predict the performance of compact designs in these fields. To perform this modelling, three candidate codes were assessed: MCNPX 2.7.E, FLUKA 2011.2 and PHITS 2.24. In each code, benchmark simulations were performed involving the irradiation of a 5-in. TEPC with monoenergetic neutron fields and a 4-in. wall-less TEPC with monoenergetic gamma-ray fields. The frequency and dose mean lineal energies and dose distributions calculated from each code were compared with experimentally determined data. For the neutron benchmark simulations, PHITS produces data closest to the experimental values and for the gamma-ray benchmark simulations, FLUKA yields data closest to the experimentally determined quantities.
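The benchmark quantities compared here, the frequency-mean and dose-mean lineal energies, are standard moments of the microdosimetric lineal-energy distribution f(y). A small sketch of how they can be computed from a sampled distribution (the code and the flat test distribution are illustrative, not taken from the study):

```python
import numpy as np

def lineal_energy_means(y, f):
    """Frequency-mean (y_F) and dose-mean (y_D) lineal energy from a
    sampled lineal-energy distribution f(y)."""
    y = np.asarray(y, dtype=float)
    f = np.asarray(f, dtype=float)
    f = f / np.trapz(f, y)                # normalize the distribution
    y_f = np.trapz(y * f, y)              # frequency-mean lineal energy
    y_d = np.trapz(y**2 * f, y) / y_f     # dose-mean lineal energy
    return y_f, y_d

# Flat f(y) on [0, 2] keV/um gives y_F = 1 and y_D = 4/3 analytically.
y = np.linspace(0.0, 2.0, 1001)
y_f, y_d = lineal_energy_means(y, np.ones_like(y))
print(round(y_f, 3), round(y_d, 3))  # → 1.0 1.333
```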

  20. Modeling the Martian neutron and gamma-ray leakage fluxes using Geant4

    NASA Astrophysics Data System (ADS)

    Pirard, Benoit; Desorgher, Laurent; Diez, Benedicte; Gasnault, Olivier

A new evaluation of the Martian neutron and gamma-ray (continuum and line) leakage fluxes has been performed using the Geant4 code. Although numerous studies have recently been carried out with Monte Carlo methods to characterize planetary radiation environments, only a few have been able to reproduce in detail the neutron and gamma-ray spectra observed in orbit. We report on the efforts performed to adapt and validate the Geant4-based PLANETOCOSMICS code for use in planetary neutron and gamma-ray spectroscopy data analysis. Besides the advantages of high transparency and modularity common to Geant4 applications, the new code uses reviewed nuclear cross-section data, realistic atmospheric profiles and soil layering, as well as specific effects such as gravitational acceleration for low-energy neutrons. Results from first simulations are presented for some Martian reference compositions and show high consistency with the corresponding neutron and gamma-ray spectra measured on board Mars Odyssey. Finally, we discuss the advantages and perspectives of the improved code for precise simulation of planetary radiation environments.

  1. Application of computational fluid dynamics and laminar flow technology for improved performance and sonic boom reduction

    NASA Technical Reports Server (NTRS)

    Bobbitt, Percy J.

    1992-01-01

    A discussion is given of the many factors that affect sonic booms with particular emphasis on the application and development of improved computational fluid dynamics (CFD) codes. The benefits that accrue from interference (induced) lift, distributing lift using canard configurations, the use of wings with dihedral or anhedral and hybrid laminar flow control for drag reduction are detailed. The application of the most advanced codes to a wider variety of configurations along with improved ray-tracing codes to arrive at more accurate and, hopefully, lower sonic booms is advocated. Finally, it is speculated that when all of the latest technology is applied to the design of a supersonic transport it will be found environmentally acceptable.

  2. Modeling IrisCode and its variants as convex polyhedral cones and its security implications.

    PubMed

    Kong, Adams Wai-Kin

    2013-03-01

IrisCode, developed by Daugman in 1993, is the most influential iris recognition algorithm. A thorough understanding of IrisCode is essential, because over 100 million persons have been enrolled by this algorithm and many biometric personal identification and template protection methods have been developed based on IrisCode. This paper shows that a template produced by IrisCode or its variants is a convex polyhedral cone in a hyperspace. Its central ray, being a rough representation of the original biometric signal, can be computed by a simple algorithm, which can often be implemented in one Matlab command line. The central ray is an expected ray and also an optimal ray of an objective function on a group of distributions. This algorithm is derived from geometric properties of a convex polyhedral cone but does not rely on any prior knowledge (e.g., iris images). The experimental results show that biometric templates, including iris and palmprint templates, produced by different recognition methods can be matched through the central rays in their convex polyhedral cones, and that templates protected by a method extended from IrisCode can be broken into. These experimental results indicate that, without a thorough security analysis, convex polyhedral cone templates cannot be assumed secure. Additionally, the simplicity of the algorithm implies that even junior hackers without knowledge of advanced image processing and biometric databases can still break into protected templates and reveal relationships among templates produced by different recognition methods.
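The "one command line" central-ray computation can be illustrated with a toy analogue: if a template stores the sign bits of a signal's projections onto a filter bank, then the bit-weighted sum of the filters already points close to the original signal. Random filters stand in for the actual iris Gabor filters here; this sketches the geometric idea only, not Daugman's or Kong's exact algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy encoding: a signal x is reduced to the sign bits of its projections
# onto filters f_i, so the template defines the convex polyhedral cone
# { z : sign(F z) = b }.
n_bits, dim = 512, 64
F = rng.standard_normal((n_bits, dim))   # hypothetical filter bank
x = rng.standard_normal(dim)             # original "biometric" signal
b = np.sign(F @ x)                       # binary template (the "code")

# A rough central ray of that cone: the bit-weighted sum of the filters.
central_ray = F.T @ b

# The reconstruction correlates strongly with the original signal, which
# is why leaked templates can reveal the underlying biometric.
cos = central_ray @ x / (np.linalg.norm(central_ray) * np.linalg.norm(x))
print(cos > 0.7)  # → True
```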

  3. Improvements in the MGA Code Provide Flexibility and Better Error Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruhter, W D; Kerr, J

    2005-05-26

The Multi-Group Analysis (MGA) code is widely used to determine nondestructively the relative isotopic abundances of plutonium by gamma-ray spectrometry. MGA users have expressed concern about the lack of flexibility and transparency in the code. Users often have to ask the code developers for modifications to the code to accommodate new measurement situations, such as additional peaks being present in the plutonium spectrum or expected peaks being absent. We are testing several new improvements to a prototype, general gamma-ray isotopic analysis tool with the intent of either revising or replacing the MGA code. These improvements will give the user the ability to modify, add, or delete the gamma- and x-ray energies and branching intensities used by the code in determining a more precise gain and in the determination of the relative detection efficiency. We have also fully integrated the determination of the relative isotopic abundances with the determination of the relative detection efficiency to provide a more accurate determination of the errors in the relative isotopic abundances. We provide details in this paper on these improvements and a comparison of results obtained with current versions of the MGA code.

  4. Filter-fluorescer measurement of low-voltage simulator x-ray energy spectra

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baldwin, G.T.; Craven, R.E.

X-ray energy spectra of the Maxwell Laboratories MBS and Physics International Pulserad 737 were measured using an eight-channel filter-fluorescer array. The PHOSCAT computer code was used to calculate channel response functions, and the UFO code to unfold the spectra.

  5. Implementation of Soft X-ray Tomography on NSTX

    NASA Astrophysics Data System (ADS)

    Tritz, K.; Stutman, D.; Finkenthal, M.; Granetz, R.; Menard, J.; Park, W.

    2003-10-01

A set of poloidal ultrasoft X-ray arrays is operated by the Johns Hopkins group on NSTX. To enable MHD mode analysis independent of the magnetic reconstruction, the McCormick-Granetz tomography code developed at MIT is being adapted to the NSTX geometry. Tests of the code using synthetic data show that the present X-ray system is adequate for m=1 tomography. In addition, we have found that spline basis functions may be better suited than Bessel functions for the reconstruction of radially localized phenomena in NSTX. The tomography code was also used to determine the necessary array expansion and optimal array placement for the characterization of higher m modes (m=2,3) in the future. Initial reconstruction of experimental soft X-ray data has been performed for m=1 internal modes, which are often encountered in high beta NSTX discharges. The reconstruction of these modes will be compared to predictions from the M3D code and magnetic measurements.
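Tomography codes of this type expand the emissivity in basis functions and invert the chord-integrated detector signals by linear least squares. A toy noiseless version with made-up geometry and Gaussian "spline-like" basis functions (not the McCormick-Granetz implementation):

```python
import numpy as np

rng = np.random.default_rng(1)

# Emissivity profile g(r) expressed in 8 radial basis functions.
r = np.linspace(0.0, 1.0, 200)
basis = np.array([np.exp(-((r - c) / 0.15) ** 2)
                  for c in np.linspace(0.0, 0.9, 8)])
true_coeffs = np.array([3.0, 2.5, 2.0, 1.4, 0.9, 0.5, 0.2, 0.05])
emissivity = true_coeffs @ basis

# Each detector chord integrates the emissivity along its line of sight;
# here the chord weights are random stand-ins for real view geometry.
chords = rng.random((40, r.size))        # 40 viewing chords
signals = chords @ emissivity            # measured brightnesses

# Invert: fit the basis coefficients that reproduce the chord signals.
A = chords @ basis.T
coeffs, *_ = np.linalg.lstsq(A, signals, rcond=None)
recon = coeffs @ basis

print(bool(np.max(np.abs(recon - emissivity)) < 1e-6))  # → True
```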

  6. Solar Proton Transport Within an ICRU Sphere Surrounded by a Complex Shield: Ray-trace Geometry

    NASA Technical Reports Server (NTRS)

    Slaba, Tony C.; Wilson, John W.; Badavi, Francis F.; Reddell, Brandon D.; Bahadori, Amir A.

    2015-01-01

    A computationally efficient 3DHZETRN code with enhanced neutron and light ion (Z is less than or equal to 2) propagation was recently developed for complex, inhomogeneous shield geometry described by combinatorial objects. Comparisons were made between 3DHZETRN results and Monte Carlo (MC) simulations at locations within the combinatorial geometry, and it was shown that 3DHZETRN agrees with the MC codes to the extent they agree with each other. In the present report, the 3DHZETRN code is extended to enable analysis in ray-trace geometry. This latest extension enables the code to be used within current engineering design practices utilizing fully detailed vehicle and habitat geometries. Through convergence testing, it is shown that fidelity in an actual shield geometry can be maintained in the discrete ray-trace description by systematically increasing the number of discrete rays used. It is also shown that this fidelity is carried into transport procedures and resulting exposure quantities without sacrificing computational efficiency.
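The convergence testing described here amounts to increasing the number of discrete rays until the exposure estimate stabilizes. A schematic sketch with a made-up angular shield-thickness function and simple exponential attenuation (3DHZETRN's actual transport physics is far more detailed):

```python
import math
import random

def dose_estimate(n_rays, thickness_fn, mu=0.05, seed=0):
    """Average transmitted fraction over n_rays isotropic directions,
    with exponential attenuation through thickness_fn(u, phi) g/cm^2."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_rays):
        u = rng.uniform(-1.0, 1.0)            # cosine of the polar angle
        phi = rng.uniform(0.0, 2.0 * math.pi)
        total += math.exp(-mu * thickness_fn(u, phi))
    return total / n_rays

# Hypothetical shield: 20 g/cm^2 everywhere plus up to 10 more near poles.
shield = lambda u, phi: 20.0 + 10.0 * abs(u)

# Convergence test: double the ray count until successive estimates agree.
n, prev = 64, None
while prev is None or abs(dose_estimate(n, shield) - prev) >= 1e-3:
    prev, n = dose_estimate(n, shield), n * 2
print(round(dose_estimate(n, shield), 2))
```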

  7. Solar proton exposure of an ICRU sphere within a complex structure part II: Ray-trace geometry.

    PubMed

    Slaba, Tony C; Wilson, John W; Badavi, Francis F; Reddell, Brandon D; Bahadori, Amir A

    2016-06-01

A computationally efficient 3DHZETRN code with enhanced neutron and light ion (Z ≤ 2) propagation was recently developed for complex, inhomogeneous shield geometry described by combinatorial objects. Comparisons were made between 3DHZETRN results and Monte Carlo (MC) simulations at locations within the combinatorial geometry, and it was shown that 3DHZETRN agrees with the MC codes to the extent they agree with each other. In the present report, the 3DHZETRN code is extended to enable analysis in ray-trace geometry. This latest extension enables the code to be used within current engineering design practices utilizing fully detailed vehicle and habitat geometries. Through convergence testing, it is shown that fidelity in an actual shield geometry can be maintained in the discrete ray-trace description by systematically increasing the number of discrete rays used. It is also shown that this fidelity is carried into transport procedures and resulting exposure quantities without sacrificing computational efficiency.

  8. Systemizers Are Better Code-Breakers: Self-Reported Systemizing Predicts Code-Breaking Performance in Expert Hackers and Naïve Participants

    PubMed Central

    Harvey, India; Bolgan, Samuela; Mosca, Daniel; McLean, Colin; Rusconi, Elena

    2016-01-01

    Studies on hacking have typically focused on motivational aspects and general personality traits of the individuals who engage in hacking; little systematic research has been conducted on predispositions that may be associated not only with the choice to pursue a hacking career but also with performance in either naïve or expert populations. Here, we test the hypotheses that two traits that are typically enhanced in autism spectrum disorders—attention to detail and systemizing—may be positively related to both the choice of pursuing a career in information security and skilled performance in a prototypical hacking task (i.e., crypto-analysis or code-breaking). A group of naïve participants and of ethical hackers completed the Autism Spectrum Quotient, including an attention to detail scale, and the Systemizing Quotient (Baron-Cohen et al., 2001, 2003). They were also tested with behavioral tasks involving code-breaking and a control task involving security X-ray image interpretation. Hackers reported significantly higher systemizing and attention to detail than non-hackers. We found a positive relation between self-reported systemizing (but not attention to detail) and code-breaking skills in both hackers and non-hackers, whereas attention to detail (but not systemizing) was related with performance in the X-ray screening task in both groups, as previously reported with naïve participants (Rusconi et al., 2015). We discuss the theoretical and translational implications of our findings. PMID:27242491

  9. Systemizers Are Better Code-Breakers: Self-Reported Systemizing Predicts Code-Breaking Performance in Expert Hackers and Naïve Participants.

    PubMed

    Harvey, India; Bolgan, Samuela; Mosca, Daniel; McLean, Colin; Rusconi, Elena

    2016-01-01

    Studies on hacking have typically focused on motivational aspects and general personality traits of the individuals who engage in hacking; little systematic research has been conducted on predispositions that may be associated not only with the choice to pursue a hacking career but also with performance in either naïve or expert populations. Here, we test the hypotheses that two traits that are typically enhanced in autism spectrum disorders-attention to detail and systemizing-may be positively related to both the choice of pursuing a career in information security and skilled performance in a prototypical hacking task (i.e., crypto-analysis or code-breaking). A group of naïve participants and of ethical hackers completed the Autism Spectrum Quotient, including an attention to detail scale, and the Systemizing Quotient (Baron-Cohen et al., 2001, 2003). They were also tested with behavioral tasks involving code-breaking and a control task involving security X-ray image interpretation. Hackers reported significantly higher systemizing and attention to detail than non-hackers. We found a positive relation between self-reported systemizing (but not attention to detail) and code-breaking skills in both hackers and non-hackers, whereas attention to detail (but not systemizing) was related with performance in the X-ray screening task in both groups, as previously reported with naïve participants (Rusconi et al., 2015). We discuss the theoretical and translational implications of our findings.

  10. A survey of hard X-ray imaging concepts currently proposed for viewing solar flares

    NASA Technical Reports Server (NTRS)

    Campbell, Jonathan W.; Davis, John M.; Emslie, A. G.

    1991-01-01

Several approaches to imaging hard X-rays emitted from solar flares have been proposed. These include the fixed modulation collimator, the rotating modulation collimator, the spiral Fresnel zone pattern, and the redundantly coded aperture. These techniques are under consideration for use in the Solar Maximum '91 balloon program, the Japanese Solar-A satellite, the Controls, Astrophysics, and Structures Experiment in Space, and the Pinhole/Occulter Facility. They are outlined and discussed in the context of preliminary results from numerical modeling and the requirements derived from current ideas as to the expected hard X-ray structures in the impulsive phase of solar flares. Preliminary indications are that all of the approaches are promising, but each has its own unique set of limitations.

  11. High frequency RF waves

    NASA Astrophysics Data System (ADS)

    Horton, William; Brookman, M.; Goniche, M.; Peysson, Y.; Ekedahl, A.

    2017-10-01

ECH and LHCD waves are scattered by the density and magnetic-field turbulence from drift waves, as measured in Tore Supra-WEST, EAST, and DIII-D. Ray equations give the spreading from plasma refraction from the antenna through the core plasma as the parallel phase velocity evolves to where the RF waves are absorbed by the electrons. Extensive LH ray tracing and absorption has been reported using the coupled C3PO ray-tracing and LUKE electron phase-space density codes with collisionless electron-wave resonant absorption. Theory and simulations are shown for the ray propagation, with the resulting electron distributions and the predicted X-ray distribution compared to the measured X-ray spectrum. Lower hybrid current drive is essential for steady-state operation in tokamaks, with control of the high-energy electrons intrinsic to tokamak confinement and heating. The record steady-state tokamak plasma is a six-minute Tore Supra discharge with 1 gigajoule of energy passing through the plasma. WEST is repeating the experiments with an ITER-shaped separatrix and divertor chamber, and EAST has achieved comparably long-pulse plasmas. Results are presented from an IFS 3D spectral code with a pair of inside-outside LHCD antennas and a figure-8 magnetic separatrix. Scattering of the slow wave into the fast wave is explored, showing that RF scattering from drift-wave density and magnetic fluctuations increases the core penetration and may account for the measured broad X-ray spectrum. Work supported by the DoE through Grants to the Institute for Fusion Studies [DE-FG02-04ER54742], ARLUT and General Atomics, San Diego, California, USA and the IRFM at Cadarache by the Comissariat Energie Atomique, France.

  12. Analysis of neutron and gamma-ray streaming along the maze of NRCAM thallium production target room.

    PubMed

    Raisali, G; Hajiloo, N; Hamidi, S; Aslani, G

    2006-08-01

The shield performance of a thallium-203 production target room is investigated in this work. Neutron and gamma-ray equivalent dose rates at various points of the maze are calculated by simulating the transport of streaming neutrons and photons using the Monte Carlo method. For determination of the neutron and gamma-ray source intensities and their energy spectra, we have applied the SRIM 2003 and ALICE91 computer codes to the Tl target and its Cu substrate for a 145 microA beam of 28.5 MeV protons. The MCNP/4C code has been applied with the neutron source term in mode n p to consider both prompt neutrons and secondary gamma-rays. The code is then applied with the prompt gamma-rays as the source term. The neutron-flux energy spectrum and the equivalent dose rates for neutrons and gamma-rays at various positions in the maze have been calculated. It has been found that the deviation between calculated and measured dose values along the maze is less than 20%.

  13. A New Approach in Coal Mine Exploration Using Cosmic Ray Muons

    NASA Astrophysics Data System (ADS)

    Darijani, Reza; Negarestani, Ali; Rezaie, Mohammad Reza; Fatemi, Syed Jalil; Akhond, Ahmad

    2016-08-01

Muon radiography is a technique that uses cosmic ray muons to image the interior of large-scale geological structures. Muon absorption in matter is the most important parameter in cosmic ray muon radiography, which is similar in principle to X-ray radiography. The main aim of this survey is the simulation of muon radiography for the exploration of mines. The production source, tracking, and detection of cosmic ray muons were simulated with the MCNPX code. For this purpose, the input data of the source card in MCNPX were extracted from the muon energy spectrum at sea level. In addition, other input data used in this code, such as the average density and thickness of layers, are measured data from the Pabdana (Kerman, Iran) coal mines. The average thickness and density of these layers in the coal mines are 2 to 4 m and 1.3 g/cm3, respectively. To increase the spatial resolution, a detector was placed inside the mountain. The results indicated that, using this approach, layers with a minimum thickness of about 2.5 m can be identified.
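The contrast such a survey must resolve can be estimated from the column opacity along a line of sight. A back-of-envelope sketch using assumed round numbers (~2 MeV per g/cm2 muon energy loss, an E^-1.7 integral flux spectrum, 2.6 g/cm3 rock), none of which come from the abstract itself:

```python
# Toy sensitivity estimate for coal-seam muon radiography.
DEDX = 2.0      # assumed muon energy loss, MeV per g/cm^2
GAMMA = 1.7     # assumed integral spectral index of the muon flux

def rate(opacity_g_cm2):
    """Relative muon count rate behind a column of given opacity:
    only muons above the punch-through energy survive."""
    e_min = DEDX * opacity_g_cm2
    return e_min ** -GAMMA

# 50 m of rock (2.6 g/cm^3) versus the same column with a 2.5 m coal
# seam (1.3 g/cm^3) replacing rock: the seam lowers the opacity, so
# slightly more muons arrive along that line of sight.
opacity_rock = 2.6 * 5000.0                    # g/cm^2 (50 m = 5000 cm)
opacity_seam = 2.6 * 4750.0 + 1.3 * 250.0      # 47.5 m rock + 2.5 m coal
contrast = rate(opacity_seam) / rate(opacity_rock) - 1.0
print(f"{contrast:.1%}")  # → 4.4%
```

A few-percent rate excess sets the counting statistics (and hence exposure time) the detector inside the mountain needs.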

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ceglio, N.M.; George, E.V.; Brooks, K.M.

The first successful demonstration of high-resolution, tomographic imaging of a laboratory plasma using coded imaging techniques is reported. Zone plate coded imaging (ZPCI) has been used to image the x-ray emission from laser-compressed, DT-filled microballoons. The zone plate camera viewed an x-ray spectral window extending from below 2 keV to above 6 keV. It exhibited a resolution of approximately 8 µm, a magnification factor of approximately 13, and subtended a radiation collection solid angle at the target of approximately 10^-2 sr. X-ray images using ZPCI were compared with those taken using a grazing-incidence reflection x-ray microscope; the agreement was excellent. In addition, the zone plate camera produced tomographic images. The nominal tomographic resolution was approximately 75 µm. This allowed three-dimensional viewing of target emission from a single shot in planar "slices". Beyond its tomographic capability, the great advantage of the coded imaging technique lies in its applicability to hard (greater than 10 keV) x-ray and charged particle imaging. Experiments involving coded imaging of the suprathermal x-ray and high-energy alpha particle emission from laser-compressed microballoon targets are discussed.

  15. A Spherical Active Coded Aperture for 4π Gamma-ray Imaging

    DOE PAGES

    Hellfeld, Daniel; Barton, Paul; Gunter, Donald; ...

    2017-09-22

Gamma-ray imaging facilitates the efficient detection, characterization, and localization of compact radioactive sources in cluttered environments. Fieldable detector systems employing active planar coded apertures have demonstrated broad energy sensitivity via both coded aperture and Compton imaging modalities. However, planar configurations suffer from a limited field-of-view, especially in the coded aperture mode. To overcome this limitation, we introduce a novel design that rearranges the detectors into an active coded spherical configuration, resulting in a 4π isotropic field-of-view for both coded aperture and Compton imaging. This work focuses on the low-energy coded aperture modality and the optimization techniques used to determine the optimal number and configuration of 1 cm^3 CdZnTe coplanar grid detectors on a 14 cm diameter sphere with 192 available detector locations.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anthony, Stephen

The Sandia hyperspectral upper-bound spectrum algorithm (hyper-UBS) is a cosmic ray despiking algorithm for hyperspectral data sets. When naturally occurring, high-energy (gigaelectronvolt) cosmic rays impact the earth's atmosphere, they create an avalanche of secondary particles which register as large, positive spikes on any spectroscopic detector they hit. Cosmic ray spikes are therefore an unavoidable spectroscopic contaminant which can interfere with subsequent analysis. A variety of cosmic ray despiking algorithms already exist and can potentially be applied to hyperspectral data matrices, most notably the upper-bound spectrum data matrices (UBS-DM) algorithm by Dongmao Zhang and Dor Ben-Amotz, which served as the basis for the hyper-UBS algorithm. However, the existing algorithms either cannot be applied to hyperspectral data, require information that is not always available, introduce undesired spectral bias, or have otherwise limited effectiveness for some experimentally relevant conditions. Hyper-UBS is more effective at removing a wider variety of cosmic ray spikes from hyperspectral data without introducing undesired spectral bias. In addition to the core algorithm, the Sandia hyper-UBS software package includes additional source code useful in evaluating the effectiveness of the hyper-UBS algorithm: code to generate simulated hyperspectral data contaminated by cosmic ray spikes, several existing despiking algorithms, and code to evaluate the performance of the despiking algorithms on simulated data.
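As a rough illustration of the idea behind despiking repeated acquisitions (cosmic ray spikes are rare, large, positive outliers across spectra of the same sample), here is a generic median/MAD clipper. It is not the Sandia hyper-UBS algorithm, whose details are not given in the record:

```python
import statistics

def despike(spectra, k=8.0):
    """Clip positive outliers channel-by-channel across repeated spectra.

    spectra: list of equal-length lists (repeated acquisitions of the
    same sample).  Values more than k robust deviations above the
    channel median are replaced by the median.  This is a generic
    median/MAD despiker, NOT the Sandia hyper-UBS algorithm.
    """
    n_chan = len(spectra[0])
    cleaned = [row[:] for row in spectra]
    for c in range(n_chan):
        col = [row[c] for row in spectra]
        med = statistics.median(col)
        # median absolute deviation; fall back to 1.0 when all values agree
        mad = statistics.median(abs(v - med) for v in col) or 1.0
        for r, v in enumerate(col):
            if v > med + k * 1.4826 * mad:   # 1.4826 scales MAD to sigma
                cleaned[r][c] = med
    return cleaned

# Three repeats of a flat spectrum; the second acquisition carries a spike.
spectra = [[10.0] * 5, [10.0, 10.0, 500.0, 10.0, 10.0], [10.0] * 5]
clean = despike(spectra)
```

Because only the spiked channel of the spiked acquisition is touched, no spectral bias is introduced into the uncontaminated data.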

  17. Effects of a wavy neutral sheet on cosmic ray anisotropies

    NASA Technical Reports Server (NTRS)

    Kota, J.; Jokipii, J. R.

    1985-01-01

The first results of a three-dimensional numerical code calculating cosmic ray anisotropies are presented. The code includes diffusion, convection, adiabatic cooling, and drift in an interplanetary magnetic field model containing a wavy neutral sheet. The 3-D model can reproduce all the principal observations for a reasonable set of parameters.

  18. A soft X-ray source based on a low divergence, high repetition rate ultraviolet laser

    NASA Astrophysics Data System (ADS)

    Crawford, E. A.; Hoffman, A. L.; Milroy, R. D.; Quimby, D. C.; Albrecht, G. F.

The CORK code is utilized to evaluate the applicability of low-divergence ultraviolet lasers for efficient production of soft X-rays. The use of the axial hydrodynamic code with a one-zone radial expansion model to estimate radial motion and laser energy deposition is examined. The calculation of plasma ionization levels and radiation rates with the atomic physics and radiation model included in the CORK code is described. Computations using the hydrodynamic code to determine the effect of laser intensity, spot size, and wavelength on plasma electron temperature are provided, and the X-ray conversion efficiencies of the lasers are analyzed. For 1 GW of laser power, the X-ray conversion efficiency is found to be a function of spot size, only weakly dependent on pulse length for time scales exceeding 100 psec, with better conversion efficiencies obtained at shorter wavelengths. It is concluded that these small lasers, focused to 30 micron spot sizes and 10^14 W/sq cm intensities, are useful sources of 1-2 keV radiation.

  19. A comparison between EGS4 and MCNP computer modeling of an in vivo X-ray fluorescence system.

    PubMed

    Al-Ghorabie, F H; Natto, S S; Al-Lyhiani, S H

    2001-03-01

The Monte Carlo computer codes EGS4 and MCNP were used to develop a theoretical model of a 180-degree-geometry in vivo X-ray fluorescence system for the measurement of platinum concentration in head and neck tumors. The model included specification of the photon source, collimators, phantoms and detector. Theoretical results were compared and evaluated against X-ray fluorescence data obtained experimentally from an existing system developed by the Swansea In Vivo Analysis and Cancer Research Group. The EGS4 results agreed well with the MCNP results. However, agreement between the measured spectral shape obtained with the experimental X-ray fluorescence system and the simulated spectral shapes obtained with the two Monte Carlo codes was relatively poor. The main reason for the disagreement is a basic assumption shared by the two codes: both assume a "free" electron model for Compton interactions. This assumption underestimates the results and undermines direct comparison of predicted and experimental spectra.

  20. Computational techniques in gamma-ray skyshine analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    George, D.L.

    1988-12-01

Two computer codes were developed to analyze gamma-ray skyshine, the scattering of gamma photons by air molecules. A review of previous gamma-ray skyshine studies discusses several Monte Carlo codes, programs using a single-scatter model, and the MicroSkyshine program for microcomputers. A benchmark gamma-ray skyshine experiment performed at Kansas State University is also described. A single-scatter numerical model was presented which traces photons from the source to their first scatter, then applies a buildup factor along a direct path from the scattering point to a detector. The FORTRAN code SKY, developed with this model before the present study, was modified to use Gauss quadrature, recent photon attenuation data, and a more accurate buildup approximation. The resulting code, SILOGP, computes the response from a point photon source on the axis of a silo, with and without concrete shielding over the opening. Another program, WALLGP, was developed using the same model to compute the response from a point gamma source behind a perfectly absorbing wall, with and without shielding overhead. 29 refs., 48 figs., 13 tabs.

  1. Imaging Analysis of the Hard X-Ray Telescope ProtoEXIST2 and New Techniques for High-Resolution Coded-Aperture Telescopes

    NASA Technical Reports Server (NTRS)

    Hong, Jaesub; Allen, Branden; Grindlay, Jonathan; Barthelmy, Scott D.

    2016-01-01

Wide-field (greater than or approximately equal to 100 square degrees) hard X-ray coded-aperture telescopes with high angular resolution (approximately 2 arcminutes or better) will enable a wide range of time-domain astrophysics. For instance, transient sources such as gamma-ray bursts can be precisely localized without the assistance of secondary focusing X-ray telescopes, enabling rapid follow-up studies. On the other hand, high angular resolution in coded-aperture imaging introduces a new challenge in handling systematic uncertainty: the average photon count per pixel is often too small to establish a proper background pattern or to model the systematic uncertainty on a timescale over which the model remains invariant. We introduce two new techniques to improve detection sensitivity, designed for, but not limited to, high-resolution coded-aperture systems: a self-background modeling scheme which utilizes continuous scan or dithering operations, and a Poisson-statistics-based probabilistic approach to evaluate the significance of source detection without background subtraction. We illustrate these new imaging analysis techniques using data acquired by the wide-field hard X-ray telescope ProtoEXIST2 during a high-altitude balloon flight in fall 2012. We review the imaging sensitivity of ProtoEXIST2 during the flight and demonstrate the performance of the new techniques on the balloon flight data in comparison with a simulated ideal Poisson background.
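The Poisson-statistics approach to detection significance can be illustrated in miniature: rather than subtracting the background, one asks how likely the estimated background alone is to produce at least the observed counts. The numbers below are invented for illustration and are not taken from the ProtoEXIST2 analysis:

```python
import math

def poisson_tail(n_obs, mean_bkg):
    """P(N >= n_obs) for N ~ Poisson(mean_bkg): the chance that the
    background alone yields at least the observed counts in a pixel."""
    cdf = 0.0
    term = math.exp(-mean_bkg)      # P(N = 0)
    for k in range(n_obs):          # accumulate P(N = 0 .. n_obs-1)
        cdf += term
        term *= mean_bkg / (k + 1)
    return max(0.0, 1.0 - cdf)

# 12 counts on an expected background of 3.0 is a significant excess;
# 4 counts is an ordinary fluctuation.
p_source = poisson_tail(12, 3.0)
p_fluct = poisson_tail(4, 3.0)
```

A small tail probability flags a candidate source; no background map is ever subtracted from the image, which matters when counts per pixel are low.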

  2. Coded-aperture imaging of the Galactic center region at gamma-ray energies

    NASA Technical Reports Server (NTRS)

    Cook, Walter R.; Grunsfeld, John M.; Heindl, William A.; Palmer, David M.; Prince, Thomas A.

    1991-01-01

The first coded-aperture images of the Galactic center region at energies above 30 keV have revealed two strong gamma-ray sources. One source has been identified with the X-ray source 1E 1740.7-2942, located 0.8 deg away from the nucleus. If this source is at the distance of the Galactic center, it is one of the most luminous objects in the galaxy at energies from 35 to 200 keV. The second source is consistent in location with the X-ray source GX 354+0 (MXB 1728-34). In addition, gamma-ray flux from the location of GX 1+4 was marginally detected at a level consistent with other post-1980 measurements. No significant hard X-ray or gamma-ray flux was detected from the direction of the Galactic nucleus or from the direction of the recently discovered gamma-ray source GRS 1758-258.

  3. Characterization of gamma rays existing in the NMIJ standard neutron field.

    PubMed

    Harano, H; Matsumoto, T; Ito, Y; Uritani, A; Kudo, K

    2004-01-01

Our laboratory provides national standards on fast neutron fluence. Neutron fields are always accompanied by gamma rays produced in neutron sources and surroundings. We have characterised these gamma rays in the 5.0 MeV standard neutron field. Gamma ray measurement was performed using an NE213 liquid scintillator. Pulse shape discrimination was incorporated to separate the events induced by gamma rays from those by neutrons. The measured gamma ray spectra were unfolded with the HEPRO program package to obtain the spectral fluences using the response matrix prepared with the EGS4 code. Corrections were made for the gamma rays produced by neutrons in the detector assembly using the MCNP4C code. The effective dose equivalents were estimated to be of the order of 25 microSv at a neutron fluence of 10^7 neutrons cm^-2.

  4. Secondary gamma-ray production in a coded aperture mask

    NASA Technical Reports Server (NTRS)

    Owens, A.; Frye, G. M., Jr.; Hall, C. J.; Jenkins, T. L.; Pendleton, G. N.; Carter, J. N.; Ramsden, D.; Agrinier, B.; Bonfand, E.; Gouiffes, C.

    1985-01-01

    The application of the coded aperture mask to high energy gamma-ray astronomy will provide the capability of locating a cosmic gamma-ray point source with a precision of a few arc-minutes above 20 MeV. Recent tests using a mask in conjunction with drift chamber detectors have shown that the expected point spread function is achieved over an acceptance cone of 25 deg. A telescope employing this technique differs from a conventional telescope only in that the presence of the mask modifies the radiation field in the vicinity of the detection plane. In addition to reducing the primary photon flux incident on the detector by absorption in the mask elements, the mask will also be a secondary radiator of gamma-rays. The various background components in a CAMTRAC (Coded Aperture Mask Track Chamber) telescope are considered. Monte-Carlo calculations are compared with recent measurements obtained using a prototype instrument in a tagged photon beam line.

  5. DynamiX, numerical tool for design of next-generation x-ray telescopes.

    PubMed

    Chauvin, Maxime; Roques, Jean-Pierre

    2010-07-20

    We present a new code aimed at the simulation of grazing-incidence x-ray telescopes subject to deformations and demonstrate its ability with two test cases: the Simbol-X and the International X-ray Observatory (IXO) missions. The code, based on Monte Carlo ray tracing, computes the full photon trajectories up to the detector plane, accounting for the x-ray interactions and for the telescope motion and deformation. The simulation produces images and spectra for any telescope configuration using Wolter I mirrors and semiconductor detectors. This numerical tool allows us to study the telescope performance in terms of angular resolution, effective area, and detector efficiency, accounting for the telescope behavior. We have implemented an image reconstruction method based on the measurement of the detector drifts by an optical sensor metrology. Using an accurate metrology, this method allows us to recover the loss of angular resolution induced by the telescope instability. In the framework of the Simbol-X mission, this code was used to study the impacts of the parameters on the telescope performance. In this paper we present detailed performance analysis of Simbol-X, taking into account the satellite motions and the image reconstruction. To illustrate the versatility of the code, we present an additional performance analysis with a particular configuration of IXO.

  6. Ray-tracing critical-angle transmission gratings for the X-ray Surveyor and Explorer-size missions

    NASA Astrophysics Data System (ADS)

    Günther, Hans M.; Bautz, Marshall W.; Heilmann, Ralf K.; Huenemoerder, David P.; Marshall, Herman L.; Nowak, Michael A.; Schulz, Norbert S.

    2016-07-01

We study a critical angle transmission (CAT) grating spectrograph that delivers a spectral resolution significantly above that of any X-ray spectrograph ever flown. This new technology will allow us to resolve kinematic components in absorption and emission lines of galactic and extragalactic matter down to unprecedented dispersion levels. We perform ray-trace simulations to characterize the performance of the spectrograph in the context of an X-ray Surveyor or Arcus-like layout (two mission concepts currently under study). Our newly developed ray-trace code is a tool suite to simulate the performance of X-ray observatories. The simulator code is written in Python, because the use of a high-level scripting language allows modifications of the simulated instrument design in very few lines of code. This is especially important in the early phase of mission development, when the performances of different configurations are contrasted. To reduce the run time and allow simulations of a few million photons in a few minutes on a desktop computer, the simulator code uses tabulated input (from theoretical models or laboratory measurements of samples) for grating efficiencies and mirror reflectivities. We find that the grating facet alignment tolerances required to maintain at least 90% of the resolving power achieved with perfect alignment are (i) translation parallel to the optical axis below 0.5 mm, (ii) rotation around the optical axis or the groove direction below a few arcminutes, and (iii) constancy of the grating period to 1:10^5. Translations along and rotations around the remaining axes can be significantly larger without impacting the performance.

  7. Validation of a Monte Carlo code system for grid evaluation with interference effect on Rayleigh scattering

    NASA Astrophysics Data System (ADS)

    Zhou, Abel; White, Graeme L.; Davidson, Rob

    2018-02-01

Anti-scatter grids are commonly used in x-ray imaging systems to reduce the scatter radiation reaching the image receptor. Anti-scatter grid performance can be simulated and validated through Monte Carlo (MC) methods. Our recently reported work modified existing MC codes, resulting in improved performance when simulating x-ray imaging. The aim of this work is to validate the transmission of x-ray photons in grids computed by the recently reported new MC codes against experimental results and results previously reported in other literature. The results show that the scatter-to-primary ratio (SPR) and the transmissions of primary (Tp), scatter (Ts), and total (Tt) radiation determined using this new MC code system agree strongly with the experimental results and the results reported in the literature; the Tp, Ts, Tt, and SPR values determined by the new MC simulation code system are valid. These results also show that the interference effect on Rayleigh scattering should not be neglected in the evaluation of both mammographic and general-purpose grids. Our new MC simulation code system has been shown to be valid and can be used for analysing and evaluating grid designs.
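Given the transmissions Tp and Ts that such a simulation produces, the standard grid figures of merit follow from a few textbook formulas. The sketch below uses illustrative values, not results from the paper:

```python
def grid_metrics(tp, ts, spr_in):
    """Standard anti-scatter grid figures of merit from the
    primary/scatter transmissions (textbook formulas; illustrative,
    not the Monte Carlo code described in the paper).

    tp, ts : primary and scatter transmission fractions of the grid
    spr_in : scatter-to-primary ratio incident on the grid
    """
    spr_out = spr_in * ts / tp                 # SPR behind the grid
    tt = (tp + spr_in * ts) / (1.0 + spr_in)   # total transmission
    selectivity = tp / ts                      # grid selectivity
    cif = (1.0 + spr_in) / (1.0 + spr_out)     # contrast improvement
    bucky = 1.0 / tt                           # dose (Bucky) factor
    return {"SPR": spr_out, "Tt": tt, "selectivity": selectivity,
            "CIF": cif, "Bucky": bucky}

# Plausible general-radiography values: Tp ~ 0.7, Ts ~ 0.1, SPR_in ~ 4.
m = grid_metrics(0.7, 0.1, 4.0)
```

A good grid keeps Tp high and Ts low; the Bucky factor shows the dose penalty paid for the contrast improvement.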

  8. X-Ray, EUV, UV and Optical Emissivities of Astrophysical Plasmas

    NASA Technical Reports Server (NTRS)

    Raymond, John C.; West, Donald (Technical Monitor)

    2000-01-01

    This grant primarily covered the development of the thermal X-ray emission model code called APEC, which is meant to replace the Raymond and Smith (1977) code. The new code contains far more spectral lines and a great deal of updated atomic data. The code is now available (http://hea-www.harvard.edu/APEC), though new atomic data is still being added, particularly at longer wavelengths. While initial development of the code was funded by this grant, current work is carried on by N. Brickhouse, R. Smith and D. Liedahl under separate funding. Over the last five years, the grant has provided salary support for N. Brickhouse, R. Smith, a summer student (L. McAllister), an SAO predoctoral fellow (A. Vasquez), and visits by T. Kallman, D. Liedahl, P. Ghavamian, J.M. Laming, J. Li, P. Okeke, and M. Martos. In addition to the code development, the grant supported investigations into X-ray and UV spectral diagnostics as applied to shock waves in the ISM, accreting black holes and white dwarfs, and stellar coronae. Many of these efforts are continuing. Closely related work on the shock waves and coronal mass ejections in the solar corona has grown out of the efforts supported by the grant.

  9. Ways with Data: Understanding Coding as Writing

    ERIC Educational Resources Information Center

    Lindgren, Chris

    2017-01-01

    In this dissertation, I report findings from an exploratory case-study about Ray, a web developer, who works on a data-driven news team that finds and tells compelling stories with large sets of data. I implicate this case of Ray's coding on a data team in a writing studies epistemology, which is guided by the following question: "What might…

  10. hybridMANTIS: a CPU-GPU Monte Carlo method for modeling indirect x-ray detectors with columnar scintillators

    NASA Astrophysics Data System (ADS)

    Sharma, Diksha; Badal, Andreu; Badano, Aldo

    2012-04-01

    The computational modeling of medical imaging systems often requires obtaining a large number of simulated images with low statistical uncertainty, which translates into prohibitive computing times. We describe a novel hybrid approach for Monte Carlo simulations that maximizes utilization of CPUs and GPUs in modern workstations. We apply the method to the modeling of indirect x-ray detectors using a new and improved version of the code MANTIS, an open source software tool used for Monte Carlo simulations of indirect x-ray imagers. We first describe a GPU implementation of the physics and geometry models in fastDETECT2 (the optical transport model) and a serial CPU version of the same code. We discuss its new features, such as on-the-fly column geometry and columnar crosstalk, in relation to the MANTIS code, and point out areas where our model provides more flexibility for modeling realistic columnar structures in large-area detectors. Second, we modify PENELOPE (the open source software package that handles the x-ray and electron transport in MANTIS) to allow direct output of the location and energy deposited during x-ray and electron interactions occurring within the scintillator. This information is then handled by the optical transport routines in fastDETECT2. A load balancer dynamically allocates optical transport showers to the GPU and CPU computing cores. Our hybridMANTIS approach achieves a significant speed-up factor of 627 when compared to MANTIS and of 35 when compared to the same code running only in a CPU instead of a GPU. Using hybridMANTIS, we successfully hide hours of optical transport time by running it in parallel with the x-ray and electron transport, thus shifting the computational bottleneck from optical to x-ray transport. The new code requires much less memory than MANTIS and, as a result, allows us to efficiently simulate large-area detectors.

  11. A universal heliostat control system

    NASA Astrophysics Data System (ADS)

    Gross, Fabian; Geiger, Mark; Buck, Reiner

    2017-06-01

    This paper describes the development of a universal heliostat control system as part of the AutoR project [1]. The system can control multiple receivers and heliostat types in a single application. The system offers support for multiple operators on different machines and is designed to be as adaptive as possible. Thus, the system can be used for different heliostat field setups with only minor adaptations of the system's source code. This is achieved by extensive usage of modern programming techniques like reflection and dependency injection. Furthermore, the system features co-simulation of a ray tracer, a reference PID-controller implementation for open volumetric receivers and methods for heliostat calibration and monitoring.

  12. GRAYSKY-A new gamma-ray skyshine code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Witts, D.J.; Twardowski, T.; Watmough, M.H.

    1993-01-01

This paper describes a new prototype gamma-ray skyshine code, GRAYSKY (Gamma-RAY SKYshine), that has been developed at BNFL, as part of an industrially based master of science course, to overcome the problems encountered with SKYSHINEII and RANKERN. GRAYSKY is a point kernel code based on the use of a skyshine response function. Scattering within source or shield materials is accounted for by the use of buildup factors. This is an approximate method of solution, but one that has been shown to produce results acceptable for dose rate predictions on operating plants. The novel features of GRAYSKY are as follows: 1. The code is fully integrated with a semianalytical point kernel shielding code, currently under development at BNFL, which offers powerful solid-body modeling capabilities. 2. The geometry modeling also allows the skyshine response function to be used in a manner that accounts for the shielding of air-scattered radiation. 3. Skyshine buildup factors calculated using the skyshine response function have been used, as well as dose buildup factors.
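The point-kernel-with-buildup idea can be sketched in a few lines: multiply the uncollided inverse-square, exponentially attenuated flux by a buildup factor that accounts for scattered photons. The Taylor-form coefficients below are illustrative placeholders, not GRAYSKY's fitted data:

```python
import math

def point_kernel_dose(s, mu, r, a1=20.0, alpha1=-0.10, alpha2=0.03):
    """Point-kernel response with a Taylor-form buildup factor.

    s  : source strength (photons/s; a flux-to-dose factor of 1 is assumed)
    mu : linear attenuation coefficient of the medium (1/cm)
    r  : source-detector distance (cm)

    Buildup: B(mu*r) = a1*exp(-alpha1*mu*r) + (1 - a1)*exp(-alpha2*mu*r).
    The coefficients are illustrative, NOT GRAYSKY's fitted data.
    """
    mur = mu * r
    buildup = (a1 * math.exp(-alpha1 * mur)
               + (1.0 - a1) * math.exp(-alpha2 * mur))
    uncollided = s * math.exp(-mur) / (4.0 * math.pi * r * r)
    return buildup * uncollided

d_near = point_kernel_dose(1e9, 0.15, 50.0)
d_far = point_kernel_dose(1e9, 0.15, 100.0)
```

Because B > 1, the scattered contribution raises the dose above the uncollided value while the solution remains a cheap closed-form kernel, which is what makes point kernel codes fast enough for whole-plant models.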

  13. Radiological characterization of the pressure vessel internals of the BNL High Flux Beam Reactor.

    PubMed

    Holden, Norman E; Reciniello, Richard N; Hu, Jih-Perng

    2004-08-01

In preparation for the eventual decommissioning of the High Flux Beam Reactor after the permanent removal of its fuel elements from the Brookhaven National Laboratory, measurements and calculations of the decay gamma-ray dose rate were performed in the reactor pressure vessel and on vessel internal structures such as the upper and lower thermal shields, the Transition Plate, and the Control Rod blades. Measurements of gamma-ray dose rates were made using Red Perspex polymethyl methacrylate high-dose film, a Radcal "peanut" ion chamber, and Eberline's RO-7 high-range ion chamber. As a comparison, the Monte Carlo MCNP code and the MicroShield code were used to model the gamma-ray transport and dose buildup. The gamma-ray dose rate at 8 cm above the center of the Transition Plate was measured to be 160 Gy h^-1 (using an RO-7) and 88 Gy h^-1 at 8 cm above and about 5 cm lateral to the Transition Plate (using Red Perspex film). This compares with a calculated dose rate of 172 Gy h^-1 using MicroShield. The gamma-ray dose rate was 16.2 Gy h^-1 measured at 76 cm from the reactor core (using the "peanut" ion chamber) and 16.3 Gy h^-1 at 87 cm from the core (using Red Perspex film). The similarity of dose rates measured with different instruments indicates that using different methods and instruments is acceptable if the measurement (and calculation) parameters are well defined. Different measurement techniques may be necessary due to constraints such as size restrictions.

  14. Effect of the diffusion parameters on the observed γ-ray spectrum of sources and their contribution to the local all-electron spectrum: The EDGE code

    NASA Astrophysics Data System (ADS)

    López-Coto, R.; Hahn, J.; BenZvi, S.; Dingus, B.; Hinton, J.; Nisa, M. U.; Parsons, R. D.; Greus, F. Salesa; Zhang, H.; Zhou, H.

    2018-11-01

The positron excess measured by PAMELA and AMS can only be explained if one or more sources are injecting positrons. Moreover, at the highest energies, it requires the presence of nearby (∼hundreds of parsecs) and middle-aged (∼hundreds of kyr at most) sources. Pulsars, as factories of electrons and positrons, are among the proposed candidates to explain the origin of this excess. To calculate the contribution of these sources to the electron and positron flux at the Earth, we developed EDGE (Electron Diffusion and Gamma rays to the Earth), a code that treats the propagation of electrons and computes their diffusion from a central source with a flexible injection spectrum. Using this code, we can derive a source's gamma-ray spectrum and spatial extension, the all-electron density in space, the electron and positron flux reaching the Earth, and the positron fraction measured at the Earth. We present in this paper the foundations of the code and study how different parameters affect the gamma-ray spectrum of a source and the electron flux measured at the Earth. We also study the effect of several approximations usually performed in such studies. This code has been used to derive the positron flux results presented in [1].
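The kernel underlying such propagation codes is the free-diffusion Green's function for an instantaneous point injection. The sketch below ignores energy losses and uses Geminga-like numbers purely for illustration; it is not the EDGE code:

```python
import math

KPC_CM = 3.086e21   # cm per kpc
YR_S = 3.156e7      # seconds per year

def burst_density(n_tot, r_kpc, age_yr, d_cm2_s):
    """Electron density (cm^-3) at distance r from an instantaneous
    point injection of n_tot particles, using the free-diffusion
    Green's function

        n = N * (4*pi*D*t)^(-3/2) * exp(-r^2 / (4*D*t)).

    Energy losses and the injection spectrum are ignored; this is the
    textbook kernel that codes like EDGE refine, not EDGE itself.
    """
    t = age_yr * YR_S
    r = r_kpc * KPC_CM
    r_diff_sq = 4.0 * d_cm2_s * t            # (diffusion radius)^2
    norm = (math.pi * r_diff_sq) ** -1.5
    return n_tot * norm * math.exp(-r * r / r_diff_sq)

# Geminga-like illustrative numbers: ~250 pc away, ~340 kyr old,
# D ~ 1e28 cm^2/s for ~100 GeV electrons, 1e48 particles injected.
n_here = burst_density(1e48, 0.25, 3.4e5, 1e28)
```

The exponential cut beyond the diffusion radius is why only nearby, middle-aged sources can contribute at the highest energies.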

  15. Accelerator test of the coded aperture mask technique for gamma-ray astronomy

    NASA Technical Reports Server (NTRS)

    Jenkins, T. L.; Frye, G. M., Jr.; Owens, A.; Carter, J. N.; Ramsden, D.

    1982-01-01

    A prototype gamma-ray telescope employing the coded aperture mask technique has been constructed and its response to a point source of 20 MeV gamma-rays has been measured. The point spread function is approximately a Gaussian with a standard deviation of 12 arc minutes. This resolution is consistent with the cell size of the mask used and the spatial resolution of the detector. In the context of the present experiment, the error radius of the source position (90 percent confidence level) is 6.1 arc minutes.
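The decoding step of a coded aperture mask telescope can be demonstrated with a one-dimensional toy: a quadratic-residue mask has perfectly flat correlation sidelobes, so cross-correlating the detector counts with a balanced decoding array recovers a point source. This is illustrative only; the instrument's two-dimensional mask and detector are far larger:

```python
# 1-D coded-aperture imaging toy.  A length-7 quadratic-residue mask is
# a (7, 3, 1) difference set, so its correlation sidelobes are flat and
# decoding reduces to one cyclic cross-correlation.
P = 7
qr = {(i * i) % P for i in range(1, P)}          # quadratic residues mod 7
mask = [1 if i in qr else 0 for i in range(P)]   # open (1) / opaque (0)
decode = [1 if m else -1 for m in mask]          # balanced decoding array

def shadowgram(sky):
    """Detector counts: cyclic correlation of the sky with the mask."""
    return [sum(sky[i] * mask[(i + j) % P] for i in range(P))
            for j in range(P)]

def reconstruct(det):
    """Sky estimate: cyclic correlation of the counts with the decoder."""
    return [sum(det[j] * decode[(i + j) % P] for j in range(P))
            for i in range(P)]

sky = [0, 0, 0, 5, 0, 0, 0]      # one point source at position 3
image = reconstruct(shadowgram(sky))
```

The peak lands at the true source position with uniform sidelobes, which is the property that gives coded apertures their clean point spread function.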

  16. Noiseless coding for the Gamma Ray spectrometer

    NASA Technical Reports Server (NTRS)

    Rice, R.; Lee, J. J.

    1985-01-01

    The payload of several future unmanned space missions will include a sophisticated gamma ray spectrometer. Severely constrained data rates during certain portions of these missions could limit the possible science return from this instrument. This report investigates the application of universal noiseless coding techniques to represent gamma ray spectrometer data more efficiently without any loss in data integrity. Performance results demonstrate compression factors from 2.5:1 to 20:1 in comparison to a standard representation. Feasibility was also demonstrated by implementing a microprocessor breadboard coder/decoder using an Intel 8086 processor.
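One simple member of the family of universal noiseless codes is the Golomb-Rice code, which spends few bits on the small values that dominate well-predicted spectrometer data. The sketch below is illustrative only, not the 8086 flight implementation described in the report:

```python
def rice_encode(values, k):
    """Golomb-Rice code: each non-negative value v is written as
    (v >> k) one-bits, a zero stop bit, then the k low-order bits of v.
    Lossless, and efficient when most values are small (as for
    differences between adjacent spectrometer channels)."""
    bits = []
    for v in values:
        bits.extend([1] * (v >> k))               # unary quotient
        bits.append(0)                            # stop bit
        bits.extend((v >> i) & 1 for i in reversed(range(k)))
    return bits

def rice_decode(bits, n, k):
    """Recover n values from a Golomb-Rice bitstream with parameter k."""
    out, pos = [], 0
    for _ in range(n):
        q = 0
        while bits[pos]:                          # count unary one-bits
            q, pos = q + 1, pos + 1
        pos += 1                                  # skip the stop bit
        r = 0
        for _ in range(k):
            r, pos = (r << 1) | bits[pos], pos + 1
        out.append((q << k) | r)
    return out

samples = [3, 0, 7, 2, 1, 0, 4, 6]
coded = rice_encode(samples, k=2)
restored = rice_decode(coded, len(samples), k=2)
```

Round-tripping exactly is the "noiseless" property: compression comes purely from the statistics of the data, with no loss of data integrity.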

  17. Design Considerations of a Virtual Laboratory for Advanced X-ray Sources

    NASA Astrophysics Data System (ADS)

    Luginsland, J. W.; Frese, M. H.; Frese, S. D.; Watrous, J. J.; Heileman, G. L.

    2004-11-01

The field of scientific computation has greatly advanced in the last few years, resulting in the ability to perform complex computer simulations that can predict the performance of real-world experiments in a number of fields of study. Among the forces driving this new computational capability is the advent of parallel algorithms, allowing calculations in three-dimensional space with realistic time scales. Electromagnetic radiation sources driven by high-voltage, high-current electron beams offer an area to further push the state-of-the-art in high fidelity, first-principles simulation tools. The physics of these x-ray sources combines kinetic plasma physics (electron beams) with dense fluid-like plasma physics (anode plasmas) and x-ray generation (bremsstrahlung). There are a number of mature techniques and software packages for dealing with the individual aspects of these sources, such as Particle-In-Cell (PIC), Magneto-Hydrodynamics (MHD), and radiation transport codes. The current effort is focused on developing an object-oriented software environment, using the Rational Unified Process and the Unified Modeling Language (UML), to provide a framework where multiple 3D parallel physics packages, such as a PIC code (ICEPIC), an MHD code (MACH), and an x-ray transport code (ITS), can co-exist in a system-of-systems approach to modeling advanced x-ray sources. Initial software design and assessments of the fidelity of the various physics algorithms will be presented.

  18. Design and performance of coded aperture optical elements for the CESR-TA x-ray beam size monitor

    NASA Astrophysics Data System (ADS)

    Alexander, J. P.; Chatterjee, A.; Conolly, C.; Edwards, E.; Ehrlichman, M. P.; Flanagan, J. W.; Fontes, E.; Heltsley, B. K.; Lyndaker, A.; Peterson, D. P.; Rider, N. T.; Rubin, D. L.; Seeley, R.; Shanks, J.

    2014-12-01

We describe the design and performance of optical elements for an x-ray beam size monitor (xBSM), a device measuring e+ and e- beam sizes in the CESR-TA storage ring. The device can measure vertical beam sizes of 10-100 μm on a turn-by-turn, bunch-by-bunch basis at e± beam energies of 2-5 GeV. X-rays produced by a hard-bend magnet pass through a single- or multiple-slit (coded aperture) optical element onto a detector. The coded aperture slit pattern and the thickness of the masking material forming that pattern can both be tuned for optimal resolving power. We describe several such optical elements and show how well predictions of simple models track measured performances.

  19. CREME96 and Related Error Rate Prediction Methods

    NASA Technical Reports Server (NTRS)

    Adams, James H., Jr.

    2012-01-01

    Predicting the rate of occurrence of single event effects (SEEs) in space requires knowledge of the radiation environment and the response of electronic devices to that environment. Several analytical models have been developed over the past 36 years to predict SEE rates. The first error rate calculations were performed by Binder, Smith and Holman. Bradford, and Pickel and Blandford in their CRIER (Cosmic-Ray-Induced-Error-Rate) analysis code, introduced the basic rectangular parallelepiped (RPP) method for error rate calculations. For the radiation environment at the part, both made use of the cosmic ray LET (linear energy transfer) spectra calculated by Heinrich for various absorber depths. A more detailed model for the space radiation environment within spacecraft was developed by Adams and co-workers. This model, together with a reformulation of the RPP method published by Pickel and Blandford, was used to create the CREME (Cosmic Ray Effects on Micro-Electronics) code. About the same time, Shapiro wrote the CRUP (Cosmic Ray Upset Program) based on the RPP method published by Bradford. It was the first code to specifically take into account charge collection from outside the depletion region due to deformation of the electric field caused by the incident cosmic ray. Other early rate prediction methods and codes include the Single Event Figure of Merit, NOVICE, the Space Radiation code, and the effective flux method of Binder, which is the basis of the SEFA (Scott Effective Flux Approximation) model. By the early 1990s it was becoming clear that CREME and the other early models needed revision. This revision, CREME96, was completed and released as a WWW-based tool, one of the first of its kind. The revisions in CREME96 included improved environmental models and improved models for calculating single event effects. The need for a revision of CREME also stimulated the development of the CHIME (CRRES/SPACERAD Heavy Ion Model of the Environment) and MACREE (Modeling and Analysis of Cosmic Ray Effects in Electronics) codes. The Single Event Figure of Merit method was also revised to use the solar-minimum galactic cosmic ray spectrum and extended to circular orbits down to 200 km at any inclination. More recently, a series of commercial codes was developed by TRAD (Test & Radiations), including the OMERE code, which calculates single event effects. There are other error rate prediction methods which use Monte Carlo techniques. In this chapter the analytic methods for estimating the environment within spacecraft will be discussed.
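    The RPP method's central quantity is the charge a particle deposits along its chord through the sensitive volume; an upset is scored when that charge exceeds the device's critical charge. A minimal sketch of that kernel follows; the 0.01033 pC/μm per MeV·cm²/mg conversion is the standard value for silicon (3.6 eV per electron-hole pair, density 2.32 g/cm³), while the critical charge and chord length in the example are purely illustrative, not taken from any specific device.

```python
# Conversion for silicon: a particle with LET = 1 MeV*cm^2/mg deposits
# about 0.01033 pC per micrometer of path (3.6 eV/pair, 2.32 g/cm^3).
PC_PER_UM_PER_LET = 0.01033

def deposited_charge_pc(let_mev_cm2_mg, chord_um):
    """Charge (pC) deposited by a particle of the given LET over a chord."""
    return PC_PER_UM_PER_LET * let_mev_cm2_mg * chord_um

def critical_let(q_crit_pc, chord_um):
    """Smallest LET (MeV*cm^2/mg) that upsets a cell of critical charge
    q_crit_pc (pC) when traversed along a chord of length chord_um (um)."""
    return q_crit_pc / (PC_PER_UM_PER_LET * chord_um)

# illustrative numbers only: 0.5 pC critical charge, 2 um chord
print(critical_let(0.5, 2.0))   # ~24.2 MeV*cm^2/mg
```

In an RPP rate calculation this threshold is folded over the chord-length distribution of the parallelepiped and the integral LET spectrum of the environment.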

  20. Cosmic Rays and Their Radiative Processes in Numerical Cosmology

    NASA Technical Reports Server (NTRS)

    Ryu, Dongsu; Miniati, Francesco; Jones, Tom W.; Kang, Hyesung

    2000-01-01

    A cosmological hydrodynamic code is described, which includes a routine to compute cosmic ray acceleration and transport in a simplified way. The routine was designed to explicitly follow diffusive acceleration at shocks, and second-order Fermi acceleration and adiabatic loss in smooth flows. Synchrotron cooling of the electron population can also be followed. The updated code is intended to be used to study the properties of nonthermal synchrotron emission and inverse Compton scattering from electron cosmic rays in clusters of galaxies, in addition to the properties of thermal bremsstrahlung emission from hot gas. The results of a test simulation using a grid of 128³ cells are presented, where cosmic rays and magnetic field have been treated passively and synchrotron cooling of cosmic ray electrons has not been included.

  1. Cosmic Rays and Their Radiative Processes in Numerical Cosmology

    NASA Astrophysics Data System (ADS)

    Ryu, D.; Miniati, F.; Jones, T. W.; Kang, H.

    2000-05-01

    A cosmological hydrodynamic code is described, which includes a routine to compute cosmic ray acceleration and transport in a simplified way. The routine was designed to follow explicitly diffusive acceleration at shocks, and second-order Fermi acceleration and adiabatic loss in smooth flows. Synchrotron cooling of the electron population can also be followed. The updated code is intended to be used to study the properties of nonthermal synchrotron emission and inverse Compton scattering from electron cosmic rays in clusters of galaxies, in addition to the properties of thermal bremsstrahlung emission from hot gas. The results of a test simulation using a grid of 128³ cells are presented, where cosmic rays and magnetic field have been treated passively and synchrotron cooling of cosmic ray electrons has not been included.
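    The synchrotron cooling such a routine follows obeys dE/dt = −bE², which has the closed form E(t) = E₀/(1 + bE₀t). A toy numerical integration, with an arbitrary cooling constant b not tied to any cluster magnetic field, shows how a simple update tracks the analytic solution:

```python
def cool_numeric(e0, b, t, n_steps=10000):
    """Forward-Euler integration of dE/dt = -b * E^2 (synchrotron-like cooling)."""
    dt = t / n_steps
    e = e0
    for _ in range(n_steps):
        e -= b * e * e * dt
    return e

def cool_analytic(e0, b, t):
    """Closed-form solution E(t) = E0 / (1 + b * E0 * t)."""
    return e0 / (1.0 + b * e0 * t)

# with e0 = b = t = 1 the electron loses exactly half its energy
print(cool_numeric(1.0, 1.0, 1.0), cool_analytic(1.0, 1.0, 1.0))  # both ~0.5
```

The quadratic energy dependence is why the highest-energy electrons fade first, steepening the synchrotron spectrum over time.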

  2. Systematic design and three-dimensional simulation of X-ray FEL oscillator for Shanghai Coherent Light Facility

    NASA Astrophysics Data System (ADS)

    Li, Kai; Deng, Haixiao

    2018-07-01

    The Shanghai Coherent Light Facility (SCLF) is a quasi-continuous wave hard X-ray free electron laser facility, which is currently under construction. Due to the high repetition rate and high-quality electron beams, it is straightforward to consider X-ray free electron laser oscillator (XFELO) operation for the SCLF. In this paper, the main processes for XFELO design, and parameter optimization of the undulator, X-ray cavity, and electron beam are described. A three-dimensional X-ray crystal Bragg diffraction code, named BRIGHT, was introduced for the first time, which can be combined with the GENESIS and OPC codes for the numerical simulations of the XFELO. The performance of the XFELO of the SCLF is investigated and optimized by theoretical analysis and numerical simulation.
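    An XFELO cavity crystal must satisfy the Bragg condition λ = 2d sin θ, with λ[Å] ≈ 12.398/E[keV]. A quick sketch of that relation; the d-spacing in the example is a placeholder value, not the actual SCLF cavity crystal:

```python
import math

HC_KEV_ANGSTROM = 12.398  # photon energy-wavelength product, keV * Angstrom

def bragg_angle_deg(energy_kev, d_spacing_angstrom):
    """First-order Bragg angle (degrees) for a given photon energy and
    lattice-plane spacing; raises if the Bragg condition cannot be met."""
    wavelength = HC_KEV_ANGSTROM / energy_kev
    s = wavelength / (2.0 * d_spacing_angstrom)
    if s > 1.0:
        raise ValueError("photon energy too low for this d-spacing")
    return math.degrees(math.asin(s))

# illustrative: a d = 1.0 Angstrom plane at 12.398 keV diffracts at 30 degrees
print(bragg_angle_deg(12.398, 1.0))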

  3. CGRO Guest Investigator Program

    NASA Technical Reports Server (NTRS)

    Begelman, Mitchell C.

    1997-01-01

    The following are highlights from the research supported by this grant: (1) Theory of gamma-ray blazars: We studied the theory of gamma-ray blazars, being among the first investigators to propose that the GeV emission arises from Comptonization of diffuse radiation surrounding the jet, rather than from the synchrotron-self-Compton mechanism. In related work, we uncovered possible connections between the mechanisms of gamma-ray blazars and those of intraday radio variability, and have conducted a general study of the role of Compton radiation drag on the dynamics of relativistic jets. (2) A Nonlinear Monte Carlo code for gamma-ray spectrum formation: We developed, tested, and applied the first Nonlinear Monte Carlo (NLMC) code for simulating gamma-ray production and transfer under much more general (and realistic) conditions than are accessible with other techniques. The present version of the code is designed to simulate conditions thought to be present in active galactic nuclei and certain types of X-ray binaries, and includes the physics needed to model thermal and nonthermal electron-positron pair cascades. Unlike traditional Monte Carlo techniques, our method can accurately handle highly nonlinear systems in which the radiation and particle backgrounds must be determined self-consistently and in which the particle energies span many orders of magnitude. Unlike models based on kinetic equations, our code can handle arbitrary source geometries and relativistic kinematic effects. In its first important application following testing, we showed that popular semi-analytic accretion disk corona models for Seyfert spectra are seriously in error, and demonstrated how the spectra can be simulated if the disk is sparsely covered by localized 'flares'.

  4. Using computational modeling to compare X-ray tube Practical Peak Voltage for Dental Radiology

    NASA Astrophysics Data System (ADS)

    Holanda Cassiano, Deisemar; Arruda Correa, Samanda Cristine; de Souza, Edmilson Monteiro; da Silva, Ademir Xaxier; Pereira Peixoto, José Guilherme; Tadeu Lopes, Ricardo

    2014-02-01

    The Practical Peak Voltage (PPV) has been adopted to measure the voltage applied to an X-ray tube. The PPV was recommended by the IEC document and accepted and published in the TRS No. 457 code of practice. The PPV is defined for, and applies to, all waveform types, and is related to the spectral distribution of X-rays and to the properties of the image. The calibration of X-ray tubes was performed using the MCNPX Monte Carlo code. An X-ray tube for dental radiology (operated from a single-phase power supply) and an X-ray tube used as a reference (supplied from a constant-potential power supply) were used in simulations across the energy range of interest, 40 kV to 100 kV. The results obtained indicated a linear relationship between the tubes involved.

  5. Modeling of Dynamic Behavior of Carbon Fiber-Reinforced Polymer (CFRP) Composite under X-ray Radiation.

    PubMed

    Zhang, Kun; Tang, Wenhui; Fu, Kunkun

    2018-01-16

    Carbon fiber-reinforced polymer (CFRP) composites have been increasingly used in spacecraft applications. Spacecraft may encounter high-energy-density X-ray radiation in outer space that can cause severe damage. To protect spacecraft from such unexpected damage, it is essential to predict the dynamic behavior of CFRP composites under X-ray radiation. In this study, we developed an in-house three-dimensional explicit finite element method (FEM) code to investigate the dynamic responses of CFRP composite under X-ray radiation for the first time, by incorporating a modified PUFF equation of state. First, the blow-off impulse (BOI) momentum of an aluminum panel was predicted by our FEM code and compared with an existing radiation experiment. Then, the FEM code was utilized to determine the dynamic behavior of a CFRP composite under various radiation conditions. It was found that the numerical result was comparable with the experimental one. Furthermore, the CFRP composite was more effective than the aluminum panel in reducing radiation-induced pressure and BOI momentum. The numerical results also revealed that a 1 keV X-ray led to vaporization of surface materials and a high-magnitude compressive stress wave, whereas a low-magnitude stress wave was generated with no surface vaporization when a 3 keV X-ray was applied.

  6. Modeling of Dynamic Behavior of Carbon Fiber-Reinforced Polymer (CFRP) Composite under X-ray Radiation

    PubMed Central

    Zhang, Kun; Tang, Wenhui; Fu, Kunkun

    2018-01-01

    Carbon fiber-reinforced polymer (CFRP) composites have been increasingly used in spacecraft applications. Spacecraft may encounter high-energy-density X-ray radiation in outer space that can cause severe damage. To protect spacecraft from such unexpected damage, it is essential to predict the dynamic behavior of CFRP composites under X-ray radiation. In this study, we developed an in-house three-dimensional explicit finite element method (FEM) code to investigate the dynamic responses of CFRP composite under X-ray radiation for the first time, by incorporating a modified PUFF equation of state. First, the blow-off impulse (BOI) momentum of an aluminum panel was predicted by our FEM code and compared with an existing radiation experiment. Then, the FEM code was utilized to determine the dynamic behavior of a CFRP composite under various radiation conditions. It was found that the numerical result was comparable with the experimental one. Furthermore, the CFRP composite was more effective than the aluminum panel in reducing radiation-induced pressure and BOI momentum. The numerical results also revealed that a 1 keV X-ray led to vaporization of surface materials and a high-magnitude compressive stress wave, whereas a low-magnitude stress wave was generated with no surface vaporization when a 3 keV X-ray was applied. PMID:29337891

  7. Neutron Capture Gamma-Ray Libraries for Nuclear Applications

    NASA Astrophysics Data System (ADS)

    Sleaford, B. W.; Firestone, R. B.; Summers, N.; Escher, J.; Hurst, A.; Krticka, M.; Basunia, S.; Molnar, G.; Belgya, T.; Revay, Z.; Choi, H. D.

    2011-06-01

    The neutron capture reaction is useful in identifying and analyzing the gamma-ray spectrum from an unknown assembly, as it gives unambiguous information on its composition. This can be done passively, or actively where an external neutron source is used to probe an unknown assembly. There are known capture gamma-ray data gaps in the ENDF libraries used by transport codes for various nuclear applications. The Evaluated Gamma-ray Activation File (EGAF) is a new thermal neutron capture database of discrete line spectra and cross sections for over 260 isotopes that was developed as part of an IAEA Coordinated Research Project. EGAF is being used to improve the capture gamma production in ENDF libraries. For medium to heavy nuclei the quasi-continuum contribution to the gamma cascades is not experimentally resolved. The continuum contains up to 90% of all the decay energy and is modeled here with the statistical nuclear structure code DICEBOX. This code also provides a consistency check of the level-scheme nuclear structure evaluation. The calculated continuum is of sufficient accuracy to include in the ENDF libraries. This analysis also determines new total thermal capture cross sections and provides an improved RIPL database. For higher energy neutron capture there is less experimental data available, making benchmarking of the modeling codes more difficult. We are investigating the capture spectra from higher energy neutrons experimentally using surrogate reactions and modeling this with Hauser-Feshbach codes. These results can then be used to benchmark CASINO, a version of DICEBOX modified for neutron capture at higher energies, which can simulate spectra from neutron capture at incident neutron energies up to 20 MeV to improve the gamma-ray spectrum in neutron data libraries used for transport modeling of unknown assemblies.
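    The statistical-cascade idea behind DICEBOX can be caricatured in a few lines: starting from the capture state, each gamma's energy is drawn with a weight combining a strength-function factor (here a toy E_γ³ dependence) and the level density of the final state (here a constant-temperature form). None of the numbers below are real nuclear data; this only illustrates the sampling scheme:

```python
import math
import random

def sample_cascade(e_capture, temperature=0.8, n_bins=200, rng=random):
    """Sample one toy gamma cascade from e_capture (MeV) to the ground state.
    Transition weight to a final level at energy ef:
    (E_gamma)^3 * exp(ef / T), i.e. toy E1 strength times level density."""
    gammas = []
    e = e_capture
    while e > 1e-9:
        finals = [e * i / n_bins for i in range(n_bins)]   # candidate levels
        weights = [(e - ef) ** 3 * math.exp(ef / temperature) for ef in finals]
        r = rng.random() * sum(weights)
        acc = 0.0
        for ef, w in zip(finals, weights):
            acc += w
            if acc >= r:
                gammas.append(e - ef)
                e = ef
                break
        else:  # floating-point round-off guard: emit the remainder as one gamma
            gammas.append(e)
            e = 0.0
    return gammas

rng = random.Random(42)
cascade = sample_cascade(7.0, rng=rng)
print(len(cascade), sum(cascade))  # multiplicity, and total energy ~7.0 MeV
```

Averaging many such cascades over many simulated level schemes is what lets the real code attach uncertainties to the unresolved continuum.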

  8. ARES: automated response function code. Users manual. [HPGAM and LSQVM]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maung, T.; Reynolds, G.M.

    This ARES user's manual provides detailed instructions for a general understanding of the Automated Response Function Code and gives step-by-step instructions for using the complete code package on an HP-1000 system. This code is designed to calculate response functions of NaI gamma-ray detectors with cylindrical or rectangular geometries.

  9. GAPD: a GPU-accelerated atom-based polychromatic diffraction simulation code.

    PubMed

    E, J C; Wang, L; Chen, S; Zhang, Y Y; Luo, S N

    2018-03-01

    GAPD, a graphics-processing-unit (GPU)-accelerated atom-based polychromatic diffraction simulation code for direct, kinematics-based, simulations of X-ray/electron diffraction of large-scale atomic systems with mono-/polychromatic beams and arbitrary plane detector geometries, is presented. This code implements GPU parallel computation via both real- and reciprocal-space decompositions. With GAPD, direct simulations are performed of the reciprocal lattice node of ultralarge systems (∼5 billion atoms) and diffraction patterns of single-crystal and polycrystalline configurations with mono- and polychromatic X-ray beams (including synchrotron undulator sources), and validation, benchmark and application cases are presented.
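    The kinematic approximation such a code implements reduces, for each scattering vector q, to coherently summing one phase factor per atom: I(q) = |Σ_j f_j exp(i q·r_j)|². A minimal monochromatic sketch of that sum (not GAPD's actual API, and with a uniform scattering factor for simplicity):

```python
import numpy as np

def kinematic_intensity(positions, q, f=1.0):
    """Kinematic diffraction intensity I(q) = |sum_j f * exp(i q.r_j)|^2.
    positions: (N, 3) array of atomic coordinates; q: scattering vector."""
    phases = positions @ q                  # q.r_j for every atom at once
    amplitude = f * np.exp(1j * phases).sum()
    return float(np.abs(amplitude) ** 2)

# 8-atom chain with unit spacing: the Bragg condition q = 2*pi gives the
# coherent maximum I = N^2 = 64, while q = pi puts atoms out of phase (I ~ 0)
chain = np.array([[n, 0.0, 0.0] for n in range(8)])
print(kinematic_intensity(chain, np.array([2 * np.pi, 0.0, 0.0])))
print(kinematic_intensity(chain, np.array([np.pi, 0.0, 0.0])))
```

Polychromatic beams amount to repeating this sum over the source spectrum, which is what makes GPU decomposition over atoms or reciprocal-space points attractive.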

  10. An analysis of options available for developing a common laser ray tracing package for Ares and Kull code frameworks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weeratunga, S K

    Ares and Kull are mature code frameworks that support ALE hydrodynamics for a variety of HEDP applications at LLNL, using two widely different meshing approaches. While Ares is based on a 2-D/3-D block-structured mesh data base, Kull is designed to support unstructured, arbitrary polygonal/polyhedral meshes. In addition, both frameworks are capable of running applications on large, distributed-memory parallel machines. Currently, both these frameworks separately support assorted collections of physics packages related to HEDP, including one for the energy deposition by laser/ion-beam ray tracing. This study analyzes the options available for developing a common laser/ion-beam ray tracing package that can be easily shared between these two code frameworks and concludes with a set of recommendations for its development.

  11. A model of polarized-beam AGS in the ray-tracing code Zgoubi

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meot, F.; Ahrens, L.; Brown, K.

    A model of the Alternating Gradient Synchrotron, based on the AGS snapramps, has been developed in the stepwise ray-tracing code Zgoubi. It has been used over the past 5 years in a number of accelerator studies aimed at enhancing RHIC proton beam polarization. It is also used to study and optimize proton and Helion beam polarization in view of future RHIC and eRHIC programs. The AGS model in Zgoubi is operational on-line via three different applications, ’ZgoubiFromSnaprampCmd’, ’AgsZgoubiModel’ and ’AgsModelViewer’, with the latter two essentially interfaces to the former, which is the actual model ’engine’. All three commands are available from the controls system application launcher in the AGS ’StartUp’ menu, or from eponymous commands on shell terminals. Main aspects of the model and of its operation are presented in this technical note; brief excerpts from various studies performed so far are given for illustration, and the means and methods entering ZgoubiFromSnaprampCmd are developed further in the appendix.

  12. Ray tracing through a hexahedral mesh in HADES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henderson, G L; Aufderheide, M B

    In this paper we describe a new ray tracing method targeted for inclusion in HADES. The algorithm tracks rays through three-dimensional tetrakis hexahedral mesh objects, like those used by the ARES code to model inertial confinement experiments.
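    Per cell, tracking of this kind reduces to a ray-interval intersection. For the special case of an axis-aligned hexahedron, the standard slab method gives the in-cell path length directly; the sketch below shows only that geometric kernel, not the HADES algorithm itself, which must also handle skewed cells and walk cell-to-cell through the mesh:

```python
def ray_box_path_length(origin, direction, lo, hi):
    """Length of the segment of a ray (unit-length direction) inside the
    axis-aligned box [lo, hi], via the slab method; 0.0 on a miss."""
    t_near, t_far = 0.0, float("inf")
    for o, d, l, h in zip(origin, direction, lo, hi):
        if abs(d) < 1e-12:                 # ray parallel to this slab pair
            if o < l or o > h:
                return 0.0
        else:
            t1, t2 = (l - o) / d, (h - o) / d
            if t1 > t2:
                t1, t2 = t2, t1
            t_near, t_far = max(t_near, t1), min(t_far, t2)
    return max(0.0, t_far - t_near)

# unit cube, ray along +x entering through the x = 0 face: path length 1
print(ray_box_path_length((-1, 0.5, 0.5), (1, 0, 0), (0, 0, 0), (1, 1, 1)))
```

Accumulating attenuation along these per-cell path lengths is the core of radiographic simulation through a meshed object.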

  13. REgolith X-Ray Imaging Spectrometer (REXIS) Aboard NASA’s OSIRIS-REx Mission

    NASA Astrophysics Data System (ADS)

    Hong, JaeSub; Allen, Branden; Grindlay, Jonathan E.; Binzel, Richard P.; Masterson, Rebecca; Inamdar, Niraj K; Chodas, Mark; Smith, Matthew W; Bautz, Mark W.; Kissel, Steven E; Villasenor, Jesus Noel; Oprescu, Antonia

    2014-06-01

    The REgolith X-Ray Imaging Spectrometer (REXIS) is a student-led instrument being designed, built, and operated as a collaborative effort involving MIT and Harvard. It is a part of NASA's OSIRIS-REx mission, which is scheduled for launch in September of 2016 for a rendezvous with, and collection of a sample from the surface of, the primitive carbonaceous chondrite-like asteroid 101955 Bennu in 2019. REXIS will determine spatial variations in elemental composition of Bennu's surface through solar-induced X-ray fluorescence. REXIS consists of four X-ray CCDs in the detector plane and an X-ray mask. It is the first coded-aperture X-ray telescope in a planetary mission, combining the high X-ray throughput of wide-field collimation with the imaging capability of a coded mask, enabling detection of elemental surface distributions at approximately 50-200 m scales. We present an overview of the REXIS instrument and the expected performance.

  14. Modification and benchmarking of SKYSHINE-III for use with ISFSI cask arrays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hertel, N.E.; Napolitano, D.G.

    1997-12-01

    Dry cask storage arrays are becoming more and more common at nuclear power plants in the United States. Title 10 of the Code of Federal Regulations, Part 72, limits doses at the controlled area boundary of these independent spent-fuel storage installations (ISFSI) to 0.25 mSv (25 mrem)/yr. The minimum controlled area boundaries of such a facility are determined by cask array dose calculations, which include direct radiation and radiation scattered by the atmosphere, also known as skyshine. NAC International (NAC) uses SKYSHINE-III to calculate the gamma-ray and neutron dose rates as a function of distance from ISFSI arrays. In this paper, we present modifications to SKYSHINE-III that more explicitly model cask arrays. In addition, we have benchmarked the radiation transport methods used in SKYSHINE-III against ⁶⁰Co gamma-ray experiments and MCNP neutron calculations.

  15. Multimode imaging device

    DOEpatents

    Mihailescu, Lucian; Vetter, Kai M

    2013-08-27

    Apparatus for detecting and locating a source of gamma rays of energies ranging from 10-20 keV to several MeV includes plural gamma ray detectors arranged in a generally closed extended array so as to provide Compton scattering imaging and coded aperture imaging simultaneously. First detectors are arranged in a spaced manner about a surface defining the closed extended array, which may be in the form of a circle, a sphere, a square, a pentagon or a higher-order polygon. Some of the gamma rays are absorbed by the first detectors closest to the gamma source in Compton scattering, while the photons that pass unabsorbed through gaps disposed between adjacent first detectors are incident upon second detectors disposed on the side farthest from the gamma ray source, where the first spaced detectors form a coded aperture array for two- or three-dimensional gamma ray source detection.
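    The Compton kinematics the first-layer detectors exploit relate the scattered photon energy to the scattering angle, E' = E / (1 + (E/mₑc²)(1 − cos θ)); measuring the energies deposited in two layers therefore constrains the source direction to a cone. A sketch of the relation (mₑc² = 511 keV):

```python
import math

ELECTRON_REST_KEV = 511.0  # electron rest energy m_e c^2

def compton_scattered_energy(e_kev, theta_rad):
    """Photon energy (keV) after Compton scattering through angle theta."""
    return e_kev / (1.0 + (e_kev / ELECTRON_REST_KEV) * (1.0 - math.cos(theta_rad)))

# a 511 keV photon scattered through 90 degrees emerges at half its energy
print(compton_scattered_energy(511.0, math.pi / 2))   # ~255.5 keV
print(compton_scattered_energy(511.0, math.pi))       # backscatter, ~170.3 keV
```

Inverting this formula for θ from the measured energies is the first step of Compton cone reconstruction.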

  16. X-ray spectral signatures of photoionized plasmas. [astrophysics

    NASA Technical Reports Server (NTRS)

    Liedahl, Duane A.; Kahn, Steven M.; Osterheld, Albert L.; Goldstein, William H.

    1990-01-01

    Plasma emission codes have become a standard tool for the analysis of spectroscopic data from cosmic X-ray sources. However, the assumption of collisional equilibrium, typically invoked in these codes, renders them inapplicable to many important astrophysical situations, particularly those involving X-ray photoionized nebulae. This point is illustrated by comparing model spectra which have been calculated under conditions appropriate to both coronal plasmas and X-ray photoionized plasmas. It is shown that the (3s-2p)/(3d-2p) line ratios in the Fe L-shell spectrum can be used to effectively discriminate between these two cases. This diagnostic will be especially useful for data analysis associated with AXAF and XMM, which will carry spectroscopic instrumentation with sufficient sensitivity and resolution to identify X-ray photoionized nebulae in a wide range of astrophysical environments.

  17. Accurate Ray-tracing of Realistic Neutron Star Atmospheres for Constraining Their Parameters

    NASA Astrophysics Data System (ADS)

    Vincent, Frederic H.; Bejger, Michał; Różańska, Agata; Straub, Odele; Paumard, Thibaut; Fortin, Morgane; Madej, Jerzy; Majczyna, Agnieszka; Gourgoulhon, Eric; Haensel, Paweł; Zdunik, Leszek; Beldycki, Bartosz

    2018-03-01

    Thermal-dominated X-ray spectra of neutron stars in quiescent, transient X-ray binaries and neutron stars that undergo thermonuclear bursts are sensitive to mass and radius. The mass–radius relation of neutron stars depends on the equation of state (EoS) that governs their interior. Constraining this relation accurately is therefore of fundamental importance to understand the nature of dense matter. In this context, we introduce a pipeline to calculate realistic model spectra of rotating neutron stars with hydrogen and helium atmospheres. An arbitrarily fast-rotating neutron star with a given EoS generates the spacetime in which the atmosphere emits radiation. We use the LORENE/NROTSTAR code to compute the spacetime numerically and the ATM24 code to solve the radiative transfer equations self-consistently. Emerging specific intensity spectra are then ray-traced through the neutron star’s spacetime from the atmosphere to a distant observer with the GYOTO code. Here, we present and test our fully relativistic numerical pipeline. To discuss and illustrate the importance of realistic atmosphere models, we compare our model spectra to simpler models like the commonly used isotropic color-corrected blackbody emission. We highlight the importance of considering realistic model-atmosphere spectra together with relativistic ray-tracing to obtain accurate predictions. We also emphasize the crucial impact of the star’s rotation on the observables. Finally, we close a controversy that has been ongoing in the literature in recent years regarding the validity of the ATM24 code.
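    For intuition about why such spectra constrain mass and radius: in the non-rotating (Schwarzschild) limit, every photon leaving the surface is redshifted by the factor √(1 − 2GM/Rc²), tying the observed spectrum directly to M and R. The pipeline above handles the much harder rotating case; the sketch below is only this static limit, with a conventional 1.4 M⊙, 10 km example star:

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg

def surface_redshift_factor(mass_msun, radius_km):
    """sqrt(1 - r_s/R): ratio of observed to emitted photon energy for a
    static, spherically symmetric star (Schwarzschild exterior)."""
    r_s = 2.0 * G * mass_msun * M_SUN / C**2   # Schwarzschild radius, m
    return math.sqrt(1.0 - r_s / (radius_km * 1e3))

# canonical 1.4 M_sun, 10 km star: observed energies ~77% of emitted
print(surface_redshift_factor(1.4, 10.0))
```

Rotation adds Doppler boosting, frame dragging, and oblateness on top of this, which is why full numerical spacetimes and ray tracing are needed.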

  18. Silicon Drift Detector response function for PIXE spectra fitting

    NASA Astrophysics Data System (ADS)

    Calzolai, G.; Tapinassi, S.; Chiari, M.; Giannoni, M.; Nava, S.; Pazzi, G.; Lucarelli, F.

    2018-02-01

    The correct determination of the X-ray peak areas in PIXE spectra by fitting with a computer program depends crucially on an accurate parameterization of the detector peak response function. In GUPIXWin, the Guelph PIXE software package and one of the most widely used PIXE spectrum analysis codes, the response of a semiconductor detector to monochromatic X-ray radiation is described by a linear combination of several analytical functions: a Gaussian profile for the X-ray line itself, and additional tail contributions (exponential tails and step functions) on the low-energy side of the X-ray line to describe incomplete charge collection effects. The literature on the spectral response of silicon X-ray detectors for PIXE applications is rather scarce; in particular, data for Silicon Drift Detectors (SDD) and for a large range of X-ray energies are missing. Using a set of analytical functions, the SDD response functions were satisfactorily reproduced for the X-ray energy range 1-15 keV. The behaviour of the parameters involved in the SDD tailing functions with X-ray energy is described by simple polynomial functions, which permit an easy implementation in PIXE spectra fitting codes.
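    The line shape described above can be written down compactly: a Gaussian peak, plus an exponential low-energy tail convolved with the same Gaussian resolution (a Hypermet-style term), plus a smoothed step. The sketch below shows that generic functional form, not GUPIXWin's exact parameterization, and the parameter values in the demo call are arbitrary, chosen only to exercise the shape:

```python
import math

def detector_line_shape(x, x0, sigma, tail_amp, tail_beta, step_amp):
    """Monochromatic-line response: Gaussian peak at x0 with width sigma,
    plus a low-energy exponential tail (decay constant tail_beta) convolved
    with the Gaussian, plus a smoothed step; amplitudes are relative to a
    unit-height Gaussian peak."""
    z = (x - x0) / (math.sqrt(2.0) * sigma)
    gauss = math.exp(-((x - x0) ** 2) / (2.0 * sigma ** 2))
    tail = tail_amp * math.exp((x - x0) / tail_beta) * math.erfc(
        z + sigma / (math.sqrt(2.0) * tail_beta))
    step = step_amp * math.erfc(z)
    return gauss + tail + step

# far below the peak only the step survives (erfc -> 2), giving ~2 * step_amp
print(detector_line_shape(0.0, 100.0, 2.0, 0.05, 2.0, 0.01))  # ~0.02
```

Fitting a spectrum then amounts to summing one such shape per X-ray line, with the tail and step parameters varying smoothly with energy as the abstract describes.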

  19. The FLUKA code for space applications: recent developments

    NASA Technical Reports Server (NTRS)

    Andersen, V.; Ballarini, F.; Battistoni, G.; Campanella, M.; Carboni, M.; Cerutti, F.; Empl, A.; Fasso, A.; Ferrari, A.; Gadioli, E.; hide

    2004-01-01

    The FLUKA Monte Carlo transport code is widely used for fundamental research, radioprotection and dosimetry, hybrid nuclear energy systems and cosmic ray calculations. The validity of its physical models has been benchmarked against a variety of experimental data over a wide range of energies, ranging from accelerator data to cosmic ray showers in the Earth's atmosphere. The code is presently undergoing several developments in order to better fit the needs of space applications. The generation of particle spectra according to up-to-date cosmic ray data as well as the effect of the solar and geomagnetic modulation have been implemented and already successfully applied to a variety of problems. The implementation of suitable models for heavy ion nuclear interactions has reached an operational stage. At medium/high energy FLUKA is using the DPMJET model. The major task of incorporating heavy ion interactions from a few GeV/n down to the threshold for inelastic collisions is also progressing and promising results have been obtained using a modified version of the RQMD-2.4 code. This interim solution is now fully operational, while waiting for the development of new models based on the FLUKA hadron-nucleus interaction code, a newly developed QMD code, and the implementation of the Boltzmann master equation theory for low energy ion interactions. © 2004 COSPAR. Published by Elsevier Ltd. All rights reserved.

  20. Should One Use the Ray-by-Ray Approximation in Core-Collapse Supernova Simulations?

    DOE PAGES

    Skinner, M. Aaron; Burrows, Adam; Dolence, Joshua C.

    2016-10-28

    We perform the first self-consistent, time-dependent, multi-group calculations in two dimensions (2D) to address the consequences of using the ray-by-ray+ transport simplification in core-collapse supernova simulations. Such a dimensional reduction is employed by many researchers to facilitate their resource-intensive calculations. Our new code (Fornax) implements multi-D transport, and can, by zeroing out transverse flux terms, emulate the ray-by-ray+ scheme. Using the same microphysics, initial models, resolution, and code, we compare the results of simulating 12-, 15-, 20-, and 25-M⊙ progenitor models using these two transport methods. Our findings call into question the wisdom of the pervasive use of the ray-by-ray+ approach. Employing it leads to maximum post-bounce/preexplosion shock radii that are almost universally larger by tens of kilometers than those derived using the more accurate scheme, typically leaving the post-bounce matter less bound and artificially more “explodable.” In fact, for our 25-M⊙ progenitor, the ray-by-ray+ model explodes, while the corresponding multi-D transport model does not. Therefore, in two dimensions the combination of ray-by-ray+ with the axial sloshing hydrodynamics that is a feature of 2D supernova dynamics can result in quantitatively, and perhaps qualitatively, incorrect results.

  1. Should One Use the Ray-by-Ray Approximation in Core-collapse Supernova Simulations?

    NASA Astrophysics Data System (ADS)

    Skinner, M. Aaron; Burrows, Adam; Dolence, Joshua C.

    2016-11-01

    We perform the first self-consistent, time-dependent, multi-group calculations in two dimensions (2D) to address the consequences of using the ray-by-ray+ transport simplification in core-collapse supernova simulations. Such a dimensional reduction is employed by many researchers to facilitate their resource-intensive calculations. Our new code (Fornax) implements multi-D transport, and can, by zeroing out transverse flux terms, emulate the ray-by-ray+ scheme. Using the same microphysics, initial models, resolution, and code, we compare the results of simulating 12, 15, 20, and 25 M⊙ progenitor models using these two transport methods. Our findings call into question the wisdom of the pervasive use of the ray-by-ray+ approach. Employing it leads to maximum post-bounce/pre-explosion shock radii that are almost universally larger by tens of kilometers than those derived using the more accurate scheme, typically leaving the post-bounce matter less bound and artificially more “explodable.” In fact, for our 25 M⊙ progenitor, the ray-by-ray+ model explodes, while the corresponding multi-D transport model does not. Therefore, in two dimensions, the combination of ray-by-ray+ with the axial sloshing hydrodynamics that is a feature of 2D supernova dynamics can result in quantitatively, and perhaps qualitatively, incorrect results.

  2. Simulation the spatial resolution of an X-ray imager based on zinc oxide nanowires in anodic aluminium oxide membrane by using MCNP and OPTICS Codes

    NASA Astrophysics Data System (ADS)

    Samarin, S. N.; Saramad, S.

    2018-05-01

    The spatial resolution of a detector is a very important parameter for X-ray imaging. A bulk scintillation detector does not have good spatial resolution because of the spreading of light inside the scintillator. Nanowire scintillators, because of their waveguiding behavior, can prevent this spreading of light and improve the spatial resolution of traditional scintillation detectors. The zinc oxide (ZnO) scintillator nanowire, easily fabricated by electrochemical deposition in the regular hexagonal pore structure of an anodic aluminium oxide membrane, has many advantages. The three-dimensional absorption of X-ray energy in the ZnO scintillator is simulated with the Monte Carlo transport code MCNP. The transport, attenuation and scattering of the generated photons are simulated with OPTICS, a general-purpose scintillator light response simulation code. The results are compared with a previous publication that used Geant4, a simulation code for the passage of particles through matter. The results verify that this scintillator nanowire structure has a spatial resolution of less than one micrometer.

  3. Development of a Coded Aperture X-Ray Backscatter Imager for Explosive Device Detection

    NASA Astrophysics Data System (ADS)

    Faust, Anthony A.; Rothschild, Richard E.; Leblanc, Philippe; McFee, John Elton

    2009-02-01

    Defence R&D Canada has an active research and development program on detection of explosive devices using nuclear methods. One system under development is a coded aperture-based X-ray backscatter imaging detector designed to provide sufficient speed, contrast and spatial resolution to detect antipersonnel landmines and improvised explosive devices. The successful development of a hand-held imaging detector requires, among other things, a light-weight, ruggedized detector with low power requirements, supplying high spatial resolution. The University of California, San Diego-designed HEXIS detector provides a modern, large area, high-temperature CZT imaging surface, robustly packaged in a light-weight housing with sound mechanical properties. Based on the potential for the HEXIS detector to be incorporated as the detection element of a hand-held imaging detector, the authors initiated a collaborative effort to demonstrate the capability of a coded aperture-based X-ray backscatter imaging detector. This paper will discuss the landmine and IED detection problem and review the coded aperture technique. Results from initial proof-of-principle experiments will then be reported.

  4. Method for measuring the focal spot size of an x-ray tube using a coded aperture mask and a digital detector.

    PubMed

    Russo, Paolo; Mettivier, Giovanni

    2011-04-01

    The goal of this study is to evaluate a new method based on a coded aperture mask combined with a digital x-ray imaging detector for measurements of the focal spot sizes of diagnostic x-ray tubes. Common techniques for focal spot size measurements employ a pinhole camera, a slit camera, or a star resolution pattern. The coded aperture mask is a radiation collimator consisting of a large number of apertures arranged in an array on a predetermined grid, through which the radiation source is imaged onto a digital x-ray detector. The coded mask camera method yields a one-shot, accurate, and direct measurement of both dimensions of the focal spot (as with a pinhole camera) but at a low tube loading (as with a slit camera). The many small apertures in the coded mask operate as a "multipinhole" with greater efficiency than a single pinhole, while keeping the resolution of a single pinhole. X-ray images result from the multiplexed output on the detector image plane of such a multiple aperture array, and the image of the source is digitally reconstructed with a deconvolution algorithm. Images of the focal spot of a laboratory x-ray tube (W anode; 35-80 kVp; focal spot size of 0.04 mm) were acquired at different geometrical magnifications with two different types of digital detector (a photon-counting hybrid silicon pixel detector with 0.055 mm pitch and a flat panel CMOS digital detector with 0.05 mm pitch) using a high resolution coded mask (type no-two-holes-touching modified uniformly redundant array) with 480 apertures of 0.07 mm, designed for imaging at energies below 35 keV. Measurements with a slit camera were performed for comparison. A test with a pinhole camera and with the coded mask on a computed radiography mammography unit with a 0.3 mm focal spot was also carried out.
    The full width at half maximum focal spot sizes were obtained from the line profiles of the decoded images, showing a focal spot of 0.120 mm x 0.105 mm at 35 kVp and M = 6.1, with a detector entrance exposure as low as 1.82 mR (0.125 mA s tube load). The slit camera indicated a focal spot of 0.112 mm x 0.104 mm at 35 kVp and M = 3.15, with an exposure at the detector of 72 mR. Focal spot measurements with the coded mask could be performed up to 80 kVp. Tolerance to angular misalignment with the reference beam of up to 7 degrees in in-plane rotations and 1 degree in out-of-plane rotations was observed. The axial distance of the focal spot from the coded mask could also be determined. It is possible to determine the beam intensity via measurement of the intensity of the decoded image of the focal spot and a calibration procedure. Coded aperture masks coupled to a digital area detector produce precise determinations of the focal spot of an x-ray tube with reduced tube loading and measurement time, together with a large tolerance in the alignment of the mask.
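
    The correlation decoding underlying a coded mask camera can be sketched numerically. The toy model below is not the authors' reconstruction pipeline: it assumes an idealized noiseless detector and a random binary mask rather than the no-two-holes-touching MURA of the paper, but it shows how a multiplexed shadowgram of a two-point "focal spot" is decoded by correlating with a balanced decoding array.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64
# Hypothetical random binary aperture, ~50% open (the paper uses an NTHT MURA).
mask = (rng.random((N, N)) < 0.5).astype(float)

# A point-like "focal spot" made of two emitters of unequal strength.
src = np.zeros((N, N))
src[30, 30] = 1.0
src[30, 33] = 0.6

# The detector records the source circularly convolved with the mask pattern.
F = np.fft.fft2
detector = np.real(np.fft.ifft2(F(src) * F(mask)))

# Decode by correlating with the balanced array G = 2A - 1.
G = 2 * mask - 1
decoded = np.real(np.fft.ifft2(F(detector) * np.conj(F(G))))

# The brightest decoded pixel recovers the stronger emitter's position.
peak = tuple(int(v) for v in np.unravel_index(np.argmax(decoded), decoded.shape))
```

    The balanced array G = 2A - 1 has zero mean, so its cross-correlation with the mask approximates a delta function, and the decoded image recovers the source up to flat sidelobe noise.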

  5. NPTFit: A Code Package for Non-Poissonian Template Fitting

    NASA Astrophysics Data System (ADS)

    Mishra-Sharma, Siddharth; Rodd, Nicholas L.; Safdi, Benjamin R.

    2017-06-01

    We present NPTFit, an open-source code package, written in Python and Cython, for performing non-Poissonian template fits (NPTFs). The NPTF is a recently developed statistical procedure for characterizing the contribution of unresolved point sources (PSs) to astrophysical data sets. The NPTF was first applied to Fermi gamma-ray data to provide evidence that the excess of ˜GeV gamma-rays observed in the inner regions of the Milky Way likely arises from a population of sub-threshold point sources, and the NPTF has since found additional applications studying sub-threshold extragalactic sources at high Galactic latitudes. The NPTF generalizes traditional astrophysical template fits to allow searching for populations of unresolved PSs that may follow a given spatial distribution. NPTFit builds upon the framework of the fluctuation analyses developed in X-ray astronomy, thus it likely has applications beyond those demonstrated with gamma-ray data. The NPTFit package utilizes novel computational methods to perform the NPTF efficiently. The code is available at http://github.com/bsafdi/NPTFit and up-to-date and extensive documentation may be found at http://nptfit.readthedocs.io.
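
    The statistical idea behind an NPTF can be illustrated without the package itself. The sketch below (illustrative numbers, not the NPTFit API) builds two synthetic counts maps with the same mean: one from a smooth Poissonian template, one from a population of unresolved point sources. The point-source map has a far broader pixel-counts histogram, which is exactly the non-Poissonian signature the fit exploits.

```python
import numpy as np

rng = np.random.default_rng(1)
npix = 100_000          # pixels in the synthetic sky map
mean_counts = 5.0       # same mean flux for both maps

# Smooth (purely Poissonian) template: identical expected rate everywhere.
smooth = rng.poisson(mean_counts, npix)

# Unresolved point-source population: one source per ~50 pixels on average,
# each bright enough to keep the map's mean unchanged (illustrative numbers).
n_src = npix // 50
flux_per_src = mean_counts * 50
rate = np.zeros(npix)
np.add.at(rate, rng.integers(0, npix, n_src), flux_per_src)
ps = rng.poisson(rate)

# Same mean, far larger variance: the non-Poissonian signature.
means = (smooth.mean(), ps.mean())    # both close to 5
variances = (smooth.var(), ps.var())  # ps.var() is much larger
```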

  6. Simulation of image formation in x-ray coded aperture microscopy with polycapillary optics.

    PubMed

    Korecki, P; Roszczynialski, T P; Sowa, K M

    2015-04-06

    In x-ray coded aperture microscopy with polycapillary optics (XCAMPO), the microstructure of focusing polycapillary optics is used as a coded aperture and enables depth-resolved x-ray imaging at a resolution better than the focal spot dimensions. Improvements in the resolution and development of 3D encoding procedures require a simulation model that can predict the outcome of XCAMPO experiments. In this work we introduce a model of image formation in XCAMPO which enables calculation of XCAMPO datasets for arbitrary positions of the object relative to the focal plane and which can incorporate optics imperfections. In the model, the exit surface of the optics is treated as a micro-structured x-ray source that illuminates a periodic object. This makes it possible to express the intensity of XCAMPO images as a convolution series and to perform simulations by means of fast Fourier transforms. For non-periodic objects, the model can be applied by enforcing artificial periodicity and setting the spatial period larger than the field-of-view. Simulations are verified by comparison with experimental data.
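
    The FFT evaluation of the convolution series rests on the standard convolution theorem for periodic arrays. A minimal sketch (generic random arrays, not actual XCAMPO data) verifying that the FFT route reproduces the direct circular convolution:

```python
import numpy as np

def fft_convolve2d(a, b):
    # Periodic (circular) convolution via the convolution theorem.
    return np.real(np.fft.ifft2(np.fft.fft2(a) * np.fft.fft2(b)))

rng = np.random.default_rng(2)
src = rng.random((8, 8))   # stands in for the structured source pattern
obj = rng.random((8, 8))   # stands in for one period of the object

fast = fft_convolve2d(src, obj)

# Direct O(N^4) circular convolution for comparison.
slow = np.zeros((8, 8))
for i in range(8):
    for j in range(8):
        for k in range(8):
            for m in range(8):
                slow[i, j] += src[k, m] * obj[(i - k) % 8, (j - m) % 8]
```

    For a non-periodic object one would zero-pad both arrays before transforming, which is the numerical counterpart of choosing the artificial period larger than the field of view.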

  7. Some issues and subtleties in numerical simulation of X-ray FEL's

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fawley, William M.

    Part of the overall design effort for x-ray FEL's such as the LCLS and TESLA projects has involved extensive use of particle simulation codes to predict their output performance and underlying sensitivity to various input parameters (e.g. electron beam emittance). This paper discusses some of the numerical issues that must be addressed by simulation codes in this regime. We first give a brief overview of the standard approximations and simulation methods adopted by time-dependent (i.e. polychromatic) codes such as GINGER, GENESIS, and FAST3D, including the effects of temporal discretization and the resultant limited spectral bandpass, and then discuss the accuracies and inaccuracies of these codes in predicting incoherent spontaneous emission (i.e. the extremely low gain regime).

  8. Total x-ray power measurements in the Sandia LIGA program.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malinowski, Michael E.; Ting, Aili

    2005-08-01

    Total X-ray power measurements using aluminum block calorimetry and other techniques were made at LIGA X-ray scanner synchrotron beamlines located at both the Advanced Light Source (ALS) and the Advanced Photon Source (APS). This block calorimetry work was initially performed on LIGA beamline 3.3.1 of the ALS to provide experimental checks of predictions of the LEX-D (LIGA Exposure-Development) code for LIGA X-ray exposures, version 7.56, the version of the code in use at the time the calorimetry was done. These experiments showed that it was necessary to use bend magnet field strengths and electron storage ring energies different from the default values originally in the code in order to obtain good agreement between experiment and theory. The results indicated that agreement between LEX-D predictions and experiment could be as good as 5% only if (1) more accurate values of the ring energies, (2) local values of the magnet field at the beamline source point, and (3) the NIST database for X-ray/materials interactions were used as code inputs. These local magnetic field values and accurate ring energies, together with the NIST database, are now defaults in the newest release of LEX-D, version 7.61. Three-dimensional simulations of the temperature distributions in the aluminum calorimeter block for a typical ALS power measurement were made with the ABAQUS code and found to be in good agreement with the experimental temperature data. As an application of the block calorimetry technique, the X-ray power exiting the mirror in place at a LIGA scanner located at APS beamline 10-BM was measured with a calorimeter similar to the one used at the ALS. The overall results at the APS demonstrated the utility of calorimetry in helping to characterize the total X-ray power in LIGA beamlines.
    In addition to the block calorimetry work at the ALS and APS, a preliminary comparison of the use of heat flux sensors, photodiodes and modified beam calorimeters as total X-ray power monitors was made at the ALS on beamline 3.3.1. This work showed that a modification of a commercially available heat flux sensor could result in a simple, direct-reading beam power meter that could be useful for monitoring total X-ray power in Sandia's LIGA exposure stations at the ALS, APS and Stanford Synchrotron Radiation Laboratory (SSRL).
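
    The principle of block calorimetry used above is simply that absorbed beam power equals the block's heat capacity times its rate of temperature rise, P = m c dT/dt. A minimal sketch with assumed, illustrative numbers (not the Sandia measurements):

```python
# Block calorimetry: absorbed beam power from the slope of the temperature
# rise, P = m * c * dT/dt. All numbers below are illustrative placeholders.
mass_kg = 0.050   # assumed 50 g aluminum block
c_al = 897.0      # specific heat of aluminum, J/(kg*K)
dT_dt = 0.12      # assumed observed heating rate, K/s

power_W = mass_kg * c_al * dT_dt   # absorbed X-ray power, W
```

    In practice the heating slope is taken early in the exposure, before conductive and radiative losses from the block bias the estimate.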

  9. Regolith X-Ray Imaging Spectrometer (REXIS) Aboard the OSIRIS-REx Asteroid Sample Return Mission

    NASA Astrophysics Data System (ADS)

    Masterson, R. A.; Chodas, M.; Bayley, L.; Allen, B.; Hong, J.; Biswas, P.; McMenamin, C.; Stout, K.; Bokhour, E.; Bralower, H.; Carte, D.; Chen, S.; Jones, M.; Kissel, S.; Schmidt, F.; Smith, M.; Sondecker, G.; Lim, L. F.; Lauretta, D. S.; Grindlay, J. E.; Binzel, R. P.

    2018-02-01

    The Regolith X-ray Imaging Spectrometer (REXIS) is the student collaboration experiment proposed and built by an MIT-Harvard team, launched aboard NASA's OSIRIS-REx asteroid sample return mission. REXIS complements the scientific investigations of other OSIRIS-REx instruments by determining the relative abundances of key elements present on the asteroid's surface by measuring the X-ray fluorescence spectrum (stimulated by the natural solar X-ray flux) over the range of energies 0.5 to 7 keV. REXIS consists of two components: a main imaging spectrometer with a coded aperture mask and a separate solar X-ray monitor to account for the Sun's variability. In addition to element abundance ratios (relative to Si) pinpointing the asteroid's most likely meteorite association, REXIS also maps elemental abundance variability across the asteroid's surface using the asteroid's rotation as well as the spacecraft's orbital motion. Image reconstruction at the highest resolution is facilitated by the coded aperture mask. Through this operation, REXIS will be the first application of X-ray coded aperture imaging to planetary surface mapping, making this student-built instrument a pathfinder toward future planetary exploration. To date, 60 students at the undergraduate and graduate levels have been involved with the REXIS project, with the hands-on experience translating to a dozen Master's and Ph.D. theses and other student publications.

  10. March 7, 1970 solar eclipse investigation

    NASA Technical Reports Server (NTRS)

    Accardo, C. A.

    1972-01-01

    Studies from rockets directed toward establishing the solar X-ray fluxes during the 7 March 1970 total eclipse over the North American continent are reported. A map of the eclipse path is presented. The measured absorption profiles for the residual X-rays are useful in establishing their contribution to the D and E region ionization during the eclipse. The studies were performed with two Nike-Apache payloads launched over Wallops Island, Virginia. In addition to three X-ray detectors covering the 1 to 8 Å, 8 to 20 Å and 44 to 60 Å bands, the payloads included two additional experiments: an electric field experiment and an epithermal photoelectron experiment. The X-ray instrumentation, payload description, flight circumstances and, finally, the X-ray results obtained are described. The various computer codes employed for reducing the telemetered data, as well as the eclipse codes, are included.

  11. Advanced x-ray imaging spectrometer

    NASA Technical Reports Server (NTRS)

    Callas, John L. (Inventor); Soli, George A. (Inventor)

    1998-01-01

    An x-ray spectrometer that also provides images of an x-ray source. Coded aperture imaging techniques are used to provide high resolution images. Imaging position-sensitive x-ray sensors with good energy resolution are utilized to provide excellent spectroscopic performance. The system produces high resolution spectral images of the x-ray source which can be viewed in any one of a number of specific energy bands.

  12. A broad band X-ray imaging spectrophotometer for astrophysical studies

    NASA Technical Reports Server (NTRS)

    Lum, Kenneth S. K.; Lee, Dong Hwan; Ku, William H.-M.

    1988-01-01

    A broadband X-ray imaging spectrophotometer (BBXRIS) has been built for astrophysical studies. The BBXRIS is based on a large-imaging gas scintillation proportional counter (LIGSPC), a combination of a gas scintillation proportional counter and a multiwire proportional counter, which achieves 8 percent (FWHM) energy resolution and 1.5-mm (FWHM) spatial resolution at 5.9 keV. The LIGSPC can be integrated with a grazing incidence mirror and a coded aperture mask to provide imaging over a broad range of X-ray energies. The results of tests involving the LIGSPC and a coded aperture mask are presented, and possible applications of the BBXRIS are discussed.

  13. On the effect of the neutral Hydrogen density on the 26 day variations of galactic cosmic rays

    NASA Astrophysics Data System (ADS)

    Engelbrecht, Nicholas; Burger, Renier; Ferreira, Stefan; Hitge, Mariette

    Preliminary results of a 3D, steady-state, ab initio cosmic ray modulation code are presented. This modulation code utilizes analytical expressions for the parallel and perpendicular mean free paths based on the work of Teufel and Schlickeiser (2003) and Shalchi et al. (2004), incorporating the model of Breech et al. (2008) for the 2D variance, correlation scale, and normalized cross helicity. The effect of such a model for basic turbulence quantities, coupled with a 3D model for the neutral hydrogen density, on the 26-day variations of cosmic rays is investigated, utilizing a Schwadron-Parker hybrid heliospheric magnetic field.
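
    For orientation, the magnitude of the classic Parker spiral field (the textbook limit of the Schwadron-Parker hybrid field mentioned above) can be evaluated directly. The parameter values below are typical but illustrative, not those of the paper:

```python
import math

def parker_B(r_AU, theta_deg, B0_nT=5.0, V_kms=400.0):
    """Magnitude of the classic Parker spiral field at heliocentric
    radius r (AU) and colatitude theta. B0_nT is the assumed radial
    field at 1 AU; the solar wind speed V is taken constant."""
    omega = 2.0 * math.pi / (25.4 * 86400.0)   # solar rotation rate, rad/s
    r_m = r_AU * 1.496e11                      # AU -> m
    tan_psi = omega * r_m * math.sin(math.radians(theta_deg)) / (V_kms * 1e3)
    return B0_nT * (1.0 / r_AU) ** 2 * math.sqrt(1.0 + tan_psi ** 2)

b_earth = parker_B(1.0, 90.0)   # roughly 7 nT in the ecliptic at 1 AU
```

    The spiral winding angle grows with radius, so the field falls as 1/r² near the Sun but only as 1/r in the outer heliosphere.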

  14. Simulation of the Mg(Ar) ionization chamber currents by different Monte Carlo codes in benchmark gamma fields

    NASA Astrophysics Data System (ADS)

    Lin, Yi-Chun; Liu, Yuan-Hao; Nievaart, Sander; Chen, Yen-Fu; Wu, Shu-Wei; Chou, Wen-Tsae; Jiang, Shiang-Huei

    2011-10-01

    High-energy photon (over 10 MeV) and neutron beams adopted in radiobiology and radiotherapy always produce mixed neutron/gamma-ray fields. Mg(Ar) ionization chambers are commonly applied to determine the gamma-ray dose because of their neutron-insensitive characteristic. Nowadays, many perturbation corrections for accurate dose estimation and many treatment planning systems are based on the Monte Carlo technique. The Monte Carlo codes EGSnrc, FLUKA, GEANT4, MCNP5, and MCNPX were used to evaluate energy-dependent response functions of the Exradin M2 Mg(Ar) ionization chamber to a parallel photon beam with mono-energies from 20 keV to 20 MeV. For the sake of validation, measurements were carefully performed in well-defined (a) primary M-100 X-ray calibration field, (b) primary 60Co calibration beam, (c) 6-MV, and (d) 10-MV therapeutic beams in hospital. In the energy region below 100 keV, MCNP5 and MCNPX both had lower responses than the other codes. For energies above 1 MeV, the MCNP ITS-mode greatly resembled the other three codes and the differences were within 5%. Compared to the measured currents, MCNP5 and MCNPX using ITS-mode agreed very well with the 60Co and 10-MV beams, but in the X-ray energy region the deviations reached 17%. This work provides better insight into the performance of different Monte Carlo codes in photon-electron transport calculations. For mixed-field dosimetry applications such as BNCT, MCNP with ITS-mode is recognized by this work as the most suitable tool.

  15. Poster - 28: Shielding of X-ray Rooms in Ontario in the Absence of Best Practice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frimeth, Jeff; Richer, Jeff; Nesbitt, James

    This poster will be strictly based on the Healing Arts Radiation Protection (HARP) Act, Regulation 543 under this Act (X-ray Safety Code), and personal communication the presenting author has had. In Ontario, the process of approval of an X-ray machine installation by the Director of the X-ray Inspection Service (XRIS) follows a certain protocol. Initially, the applicant submits a series of forms, including recommended shielding amounts, in order to satisfy the law. This documentation is then transferred to a third-party vendor (i.e. a professional engineer – P.Eng.) outsourced by the Ministry of Health and Long-term Care (MOHLTC). The P.Eng. then evaluates the submitted documentation for appropriate fulfillment of the HARP Act and Reg. 543 requirements. If the P.Eng.’s evaluation of the documentation is to their satisfaction, the XRIS is then notified. Finally, the Director will then issue a letter of approval to install the equipment at the facility. The methodology required to be used by the P.Eng. in order to determine the required amounts of protective barriers, and recommended to be used by the applicant, is contained within Safety Code 20A. However, Safety Code 35 has replaced the obsolete Safety Code 20A document and employs best practices in shielding design. This talk will focus further on specific intentions and limitations of Safety Code 20A. Furthermore, this talk will discuss the definition of the “practice of professional engineering” in Ontario. COMP members who are involved in shielding design are strongly encouraged to attend.

  16. 3D Laser Imprint Using a Smoother Ray-Traced Power Deposition Method

    NASA Astrophysics Data System (ADS)

    Schmitt, Andrew J.

    2017-10-01

    Imprinting of laser nonuniformities in directly driven ICF targets is a challenging problem to simulate accurately with large radiation-hydro codes. One of the most challenging aspects is the proper construction of the complex and rapidly changing laser interference structure driving the imprint using the reduced laser propagation models (usually ray tracing) found in these codes. We have upgraded the modelling capability in our massively parallel FASTRAD3D code by adding a more realistic EM-wave interference structure. This interference model adds an axial laser speckle to the previous transverse-only laser structure, and can be impressed on our improved smoothed 3D ray-trace package. This latter package, which connects rays to form bundles and performs power deposition calculations on the bundles, is intended to decrease ray-trace noise (which can mask or add to imprint) while using fewer rays. We apply this improved model to 3D simulations of recent imprint experiments performed on the Omega-EP laser and the Nike laser that examined the reduction of imprinting due to very thin high-Z target coatings. We report on the conditions in which this new model makes a significant impact on the development of laser imprint. Supported by US DoE/NNSA.

  17. Coupling extended magnetohydrodynamic fluid codes with radiofrequency ray tracing codes for fusion modeling

    NASA Astrophysics Data System (ADS)

    Jenkins, Thomas G.; Held, Eric D.

    2015-09-01

    Neoclassical tearing modes are macroscopic (L ∼ 1 m) instabilities in magnetic fusion experiments; if unchecked, these modes degrade plasma performance and may catastrophically destroy plasma confinement by inducing a disruption. Fortunately, the use of properly tuned and directed radiofrequency waves (λ ∼ 1 mm) can eliminate these modes. Numerical modeling of this difficult multiscale problem requires the integration of separate mathematical models for each length and time scale (Jenkins and Kruger, 2012 [21]); the extended MHD model captures macroscopic plasma evolution while the RF model tracks the flow and deposition of injected RF power through the evolving plasma profiles. The scale separation enables use of the eikonal (ray-tracing) approximation to model the RF wave propagation. In this work we demonstrate a technique, based on methods of computational geometry, for mapping the ensuing RF data (associated with discrete ray trajectories) onto the finite-element/pseudospectral grid that is used to model the extended MHD physics. In the new representation, the RF data can then be used to construct source terms in the equations of the extended MHD model, enabling quantitative modeling of RF-induced tearing mode stabilization. Though our specific implementation uses the NIMROD extended MHD (Sovinec et al., 2004 [22]) and GENRAY RF (Smirnov et al., 1994 [23]) codes, the approach presented can be applied more generally to any code coupling requiring the mapping of ray tracing data onto Eulerian grids.
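
    The mapping step described above, taking quantities defined on discrete ray trajectories and producing source terms on a fluid grid, can be illustrated on a uniform structured grid with area-weighted (bilinear) deposition. This is a simplified analog for intuition, not the NIMROD/GENRAY finite-element machinery:

```python
import numpy as np

def deposit_bilinear(points, power, nx, ny, extent=(0.0, 1.0, 0.0, 1.0)):
    """Spread each ray's deposited power over the four surrounding nodes
    of a uniform nx-by-ny grid (area-weighted / bilinear deposition)."""
    x0, x1, y0, y1 = extent
    grid = np.zeros((nx, ny))
    for (x, y), p in zip(points, power):
        fx = (x - x0) / (x1 - x0) * (nx - 1)
        fy = (y - y0) / (y1 - y0) * (ny - 1)
        i, j = int(fx), int(fy)          # lower-left node of the host cell
        tx, ty = fx - i, fy - j          # fractional position inside the cell
        grid[i, j] += p * (1 - tx) * (1 - ty)
        grid[i + 1, j] += p * tx * (1 - ty)
        grid[i, j + 1] += p * (1 - tx) * ty
        grid[i + 1, j + 1] += p * tx * ty
    return grid

# Two hypothetical ray deposition points with powers 2.0 and 1.0.
grid = deposit_bilinear([(0.30, 0.40), (0.72, 0.15)], [2.0, 1.0], 16, 16)
```

    The bilinear weights for each point sum to one, so the total deposited power is conserved on the grid, a property any such coupling scheme must preserve.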

  18. SHOULD ONE USE THE RAY-BY-RAY APPROXIMATION IN CORE-COLLAPSE SUPERNOVA SIMULATIONS?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skinner, M. Aaron; Burrows, Adam; Dolence, Joshua C., E-mail: burrows@astro.princeton.edu, E-mail: askinner@astro.princeton.edu, E-mail: jdolence@lanl.gov

    2016-11-01

    We perform the first self-consistent, time-dependent, multi-group calculations in two dimensions (2D) to address the consequences of using the ray-by-ray+ transport simplification in core-collapse supernova simulations. Such a dimensional reduction is employed by many researchers to facilitate their resource-intensive calculations. Our new code (Fornax) implements multi-D transport, and can, by zeroing out transverse flux terms, emulate the ray-by-ray+ scheme. Using the same microphysics, initial models, resolution, and code, we compare the results of simulating 12, 15, 20, and 25 M⊙ progenitor models using these two transport methods. Our findings call into question the wisdom of the pervasive use of the ray-by-ray+ approach. Employing it leads to maximum post-bounce/pre-explosion shock radii that are almost universally larger by tens of kilometers than those derived using the more accurate scheme, typically leaving the post-bounce matter less bound and artificially more “explodable.” In fact, for our 25 M⊙ progenitor, the ray-by-ray+ model explodes, while the corresponding multi-D transport model does not. Therefore, in two dimensions, the combination of ray-by-ray+ with the axial sloshing hydrodynamics that is a feature of 2D supernova dynamics can result in quantitatively, and perhaps qualitatively, incorrect results.

  19. CLUMPY: A code for γ-ray signals from dark matter structures

    NASA Astrophysics Data System (ADS)

    Charbonnier, Aldée; Combet, Céline; Maurin, David

    2012-03-01

    We present the first public code for semi-analytical calculation of the γ-ray flux astrophysical J-factor from dark matter annihilation/decay in the Galaxy, including dark matter substructures. The core of the code is the calculation of the line of sight integral of the dark matter density squared (for annihilations) or density (for decaying dark matter). The code can be used in three modes: i) to draw skymaps from the Galactic smooth component and/or the substructure contributions, ii) to calculate the flux from a specific halo (that is not the Galactic halo, e.g. dwarf spheroidal galaxies) or iii) to perform simple statistical operations from a list of allowed DM profiles for a given object. Extragalactic contributions and other tracers of DM annihilation (e.g. positrons, anti-protons) will be included in a second release.
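
    The core line-of-sight integral performed by CLUMPY can be sketched for the smooth Galactic component alone. The NFW profile and halo parameters below are illustrative placeholders, and a simple midpoint rule stands in for the code's quadrature:

```python
import numpy as np

def nfw_rho(r, rho_s=1.0, r_s=20.0):
    """NFW density profile; normalization and scale radius are placeholders."""
    x = r / r_s
    return rho_s / (x * (1.0 + x) ** 2)

def j_factor(psi_deg, d_sun=8.5, l_max=100.0, n=20000):
    """Line-of-sight integral of rho^2 (annihilation case) toward an angle
    psi from the Galactic centre, by midpoint quadrature. Units arbitrary."""
    psi = np.radians(psi_deg)
    dl = l_max / n
    l = (np.arange(n) + 0.5) * dl      # distance along the line of sight, kpc
    # Galactocentric radius at each point, by the law of cosines.
    r = np.sqrt(d_sun**2 + l**2 - 2.0 * d_sun * l * np.cos(psi))
    return float(np.sum(nfw_rho(r) ** 2) * dl)

# J falls off away from the Galactic centre.
j_inner, j_outer = j_factor(1.0), j_factor(30.0)
```

    For the decay case one would integrate the density itself rather than its square; substructure boosts enter as additional terms in the same integral.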

  20. Air kerma calibration factors and chamber correction values for PTW soft x-ray, NACP and Roos ionization chambers at very low x-ray energies.

    PubMed

    Ipe, N E; Rosser, K E; Moretti, C J; Manning, J W; Palmer, M J

    2001-08-01

    This paper evaluates the characteristics of ionization chambers for the measurement of absorbed dose to water using very low-energy x-rays. The values of the chamber correction factor, k(ch), used in the IPEMB 1996 code of practice for the UK secondary standard ionization chambers (PTW type M23342 and PTW type M23344), the Roos (PTW type 34001) and NACP electron chambers are derived. The responses in air of the small and large soft x-ray chambers (PTW type M23342 and PTW type M23344) and the NACP and Roos electron ionization chambers were compared. Besides the soft x-ray chambers, the NACP and Roos chambers can be used for very low-energy x-ray dosimetry provided that they are used in the restricted energy range for which their response does not change by more than 5%. The chamber correction factor was found by comparing the absorbed dose to water determined using the dosimetry protocol recommended for low-energy x-rays with that for very low-energy x-rays. The overlap energy range was extended using data from Grosswendt and Knight. Chamber correction factors given in this paper are chamber dependent, varying from 1.037 to 1.066 for a PTW type M23344 chamber, which is very different from the value of unity given in the IPEMB code. However, the values of k(ch) determined in this paper agree with those given in the DIN standard within experimental uncertainty. The authors recommend that the very low-energy section of the IPEMB code be amended to include the most up-to-date values of k(ch).

  1. Practical Applications of Cosmic Ray Science: Spacecraft, Aircraft, Ground Based Computation and Control Systems and Human Health and Safety

    NASA Technical Reports Server (NTRS)

    Atwell, William; Koontz, Steve; Normand, Eugene

    2012-01-01

    In this paper we review the discovery of cosmic ray effects on the performance and reliability of microelectronic systems and on human health and safety, as well as the development of the engineering and health science tools used to evaluate and mitigate cosmic ray effects in earth surface, atmospheric flight, and space flight environments. Three twentieth-century technological developments, 1) high altitude commercial and military aircraft, 2) manned and unmanned spacecraft, and 3) increasingly complex and sensitive solid state micro-electronics systems, have driven an ongoing evolution of basic cosmic ray science into a set of practical engineering tools (e.g. ground-based test methods as well as high energy particle transport and reaction codes) needed to design, test, and verify the safety and reliability of modern complex electronic systems as well as effects on human health and safety. The effects of primary cosmic ray particles, and of secondary particle showers produced by nuclear reactions with spacecraft materials, can determine the design and verification processes (as well as the total dollar cost) for manned and unmanned spacecraft avionics systems. Similar considerations apply to commercial and military aircraft operating at high latitudes and altitudes near the atmospheric Pfotzer maximum. Even ground-based computational and control systems can be negatively affected by secondary particle showers at the Earth's surface, especially if the net target area of the sensitive electronic system components is large. Accumulation of radiation dose from both primary cosmic rays and secondary cosmic-ray-induced particle showers is an important health and safety consideration for commercial or military air crews operating at high altitude/latitude and is also one of the most important factors presently limiting manned space flight operations beyond low-Earth orbit (LEO).

  2. Response of the first wetted wall of an IFE reactor chamber to the energy release from a direct-drive DT capsule

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Medin, Stanislav A.; Basko, Mikhail M.; Orlov, Yurii N.

    2012-07-11

    Radiation hydrodynamics 1D simulations were performed with two concurrent codes, DEIRA and RAMPHY. The DEIRA code was used for the DT capsule implosion and burn, and the RAMPHY code was used for computation of X-ray and fast ion deposition in the first-wall liquid film of the reactor chamber. The simulations were run for a 740 MJ direct-drive DT capsule and a 10-m-diameter reactor chamber with a thin liquid Pb wall. Temporal profiles of the X-ray, neutron and fast ⁴He ion power leaking from the DT capsule were obtained, and spatial profiles of the liquid film flow parameters were computed and analyzed.

  3. Study of optical design of Blu-ray pickup head system with a liquid crystal element.

    PubMed

    Fang, Yi-Chin; Yen, Chih-Ta; Hsu, Jui-Hsin

    2014-10-10

    This paper proposes a newly developed optical design and an active compensation method for a Blu-ray pickup head system with a liquid crystal (LC) element. Different from traditional pickup lens designs, this new optical design delivers performance as good as the conventional one but has more room for tolerance control, which plays a role in antishaking devices, such as portable Blu-ray players. A hole-pattern electrode and LC optics with external voltage input were employed to generate a symmetric nonuniform electric field in the LC layer that directs the LC molecules into the appropriate gradient refractive index distribution, resulting in the convergence or divergence of specific light beams. LC optics deliver fast and, most importantly, active compensation through optical design when errors occur. Simulations and tolerance analysis were conducted using Code V software, including various tolerance analyses, such as defocus, tilt, and decenter, and their related compensations. Two distinct Blu-ray pickup head system designs were examined in this study. In traditional Blu-ray pickup head system designs, the aperture stop is always set on the objective lenses; in the new design, the aperture stop is instead set on the LC lens. The results revealed that an optical design with the aperture stop set on the LC lens as an active compensation device successfully eliminated up to 57% of coma aberration compared with traditional optical designs, giving this pickup head lens design more room for tolerance control.

  4. Symmetry control using beam phasing in ~0.2 NIF scale high temperature Hohlraum experiment on OMEGA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Delamater, Norman D; Wilson, Doug C; Kyrala, George A

    2009-01-01

    Results are shown from recent experiments at the Omega laser facility, using 40 Omega beams to drive the hohlraum with 3 cones from each side and up to 19.5 kJ of laser energy. Beam phasing is achieved by separately decreasing the energy in each of the three cones by 3 kJ, for a total drive energy of 16.5 kJ. This produces a more asymmetric drive, which varies the shape of the imploded symmetry-capsule core from round to oblate or prolate in a systematic and controlled manner. These results would be the first demonstration of beam phasing for implosions in such 'high temperature' (275 eV) hohlraums at Omega. Dante measurements confirmed the predicted peak drive temperatures of 275 eV. Time-dependent x-ray images of the implosion core were obtained from framing-camera data; they show the expected change in symmetry due to beam phasing and agree well with post-processed hydro-code calculations. Time-resolved hard x-ray data were also obtained, showing that the hard x-rays are correlated mainly with the low-angle 21° cone.

  5. SOC-DS computer code provides tool for design evaluation of homogeneous two-material nuclear shield

    NASA Technical Reports Server (NTRS)

    Disney, R. K.; Ricks, L. O.

    1967-01-01

    SOC-DS Code /Shield Optimization Code-Direct Search/ selects a nuclear shield material of optimum volume, weight, or cost to meet the requirements of a given radiation dose rate or energy transmission constraint. It is applicable to evaluating neutron and gamma-ray shields for all nuclear reactors.

  6. Computing Challenges in Coded Mask Imaging

    NASA Technical Reports Server (NTRS)

    Skinner, Gerald

    2009-01-01

    This slide presentation reviews the complications and challenges of developing computer systems for coded-mask imaging telescopes. The coded-mask technique is used when there is no other way to build the telescope, i.e., for wide fields of view, energies too high for focusing optics or too low for Compton/tracker techniques, combined with very good angular resolution. The coded-mask telescope and its mask are described. The coded masks for the INTErnational Gamma-Ray Astrophysics Laboratory (INTEGRAL) instruments are shown, along with a chart of the types of position-sensitive detectors used in coded-mask telescopes. Slides describe the mechanism of recovering an image from the masked pattern via correlation with the mask pattern; the matrix approach and other approaches to image reconstruction are also described. The presentation includes a review of the Energetic X-ray Imaging Survey Telescope (EXIST) / High Energy Telescope (HET), with information about the mission, the operation of the telescope, a comparison of EXIST/HET with SWIFT/BAT, and details of the EXIST/HET design.
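    The correlation-decoding step mentioned above can be sketched numerically. The following is a minimal 1D illustration, not code from the presentation: the mask is a URA-style pattern built from quadratic residues modulo a prime p ≡ 3 (mod 4), whose balanced autocorrelation is delta-like, so correlating the detector image with the balanced mask recovers the sky. The mask size and source positions are arbitrary.

    ```python
    import numpy as np

    # URA-style 1D coded mask from quadratic residues mod p (p ≡ 3 mod 4);
    # this construction gives a two-valued, delta-like balanced autocorrelation.
    p = 59
    residues = {(i * i) % p for i in range(1, p)}
    mask = np.array([1.0 if i in residues else 0.0 for i in range(p)])

    # Sky with two point sources of different brightness.
    sky = np.zeros(p)
    sky[10] = 5.0
    sky[40] = 3.0

    # The detector records the sky circularly convolved with the mask pattern.
    detector = np.real(np.fft.ifft(np.fft.fft(sky) * np.fft.fft(mask)))

    # Correlation decoding with the balanced mask G = 2*mask - 1:
    # recon[k] = sum_n detector[n] * G[n - k]
    G = 2 * mask - 1
    recon = np.real(np.fft.ifft(np.fft.fft(detector) * np.conj(np.fft.fft(G))))

    peaks = sorted(int(i) for i in np.argsort(recon)[-2:])
    print(peaks)  # [10, 40]: both source positions recovered
    ```

    Because every nonzero difference occurs equally often in the quadratic-residue set, the decoded image is exactly 30·sky[k] minus a flat pedestal, with no structured sidelobes.
    
    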

  7. Approximated transport-of-intensity equation for coded-aperture x-ray phase-contrast imaging.

    PubMed

    Das, Mini; Liang, Zhihua

    2014-09-15

    Transport-of-intensity equations (TIEs) allow better understanding of image formation and assist in simplifying the "phase problem" associated with phase-sensitive x-ray measurements. In this Letter, we present for the first time to our knowledge a simplified form of TIE that models x-ray differential phase-contrast (DPC) imaging with coded-aperture (CA) geometry. The validity of our approximation is demonstrated through comparison with an exact TIE in numerical simulations. The relative contributions of absorption, phase, and differential phase to the acquired phase-sensitive intensity images are made readily apparent with the approximate TIE, which may prove useful for solving the inverse phase-retrieval problem associated with these CA-geometry-based DPC systems.

  8. Comparison of calculated beta- and gamma-ray doses after the Fukushima accident with data from single-grain luminescence retrospective dosimetry of quartz inclusions in a brick sample

    PubMed Central

    Endo, Satoru; Fujii, Keisuke; Kajimoto, Tsuyoshi; Tanaka, Kenichi; Stepanenko, Valeriy; Kolyzhenkov, Timofey; Petukhov, Aleksey; Akhmedova, Umukusum; Bogacheva, Viktoriia

    2018-01-01

    To estimate the beta- and gamma-ray doses in a brick sample taken from Odaka, Minami-Soma City, Fukushima Prefecture, Japan, a Monte Carlo calculation was performed with the Particle and Heavy Ion Transport code System (PHITS). The calculated results were compared with data obtained by single-grain retrospective luminescence dosimetry of quartz inclusions in the brick sample. The calculated result agreed well with the measured data. The dose increase measured at the brick surface was explained by the beta-ray contribution, and the slight slope in the dose profile deeper in the brick was due to the gamma-ray contribution. The skin dose was estimated from the calculated result as 164 mGy over 3 years at the sampling site. PMID:29385528

  9. Comparison of calculated beta- and gamma-ray doses after the Fukushima accident with data from single-grain luminescence retrospective dosimetry of quartz inclusions in a brick sample.

    PubMed

    Endo, Satoru; Fujii, Keisuke; Kajimoto, Tsuyoshi; Tanaka, Kenichi; Stepanenko, Valeriy; Kolyzhenkov, Timofey; Petukhov, Aleksey; Akhmedova, Umukusum; Bogacheva, Viktoriia

    2018-05-01

    To estimate the beta- and gamma-ray doses in a brick sample taken from Odaka, Minami-Soma City, Fukushima Prefecture, Japan, a Monte Carlo calculation was performed with the Particle and Heavy Ion Transport code System (PHITS). The calculated results were compared with data obtained by single-grain retrospective luminescence dosimetry of quartz inclusions in the brick sample. The calculated result agreed well with the measured data. The dose increase measured at the brick surface was explained by the beta-ray contribution, and the slight slope in the dose profile deeper in the brick was due to the gamma-ray contribution. The skin dose was estimated from the calculated result as 164 mGy over 3 years at the sampling site.

  10. Optical Surface Analysis Code (OSAC). 7.0

    NASA Technical Reports Server (NTRS)

    Glenn, P.

    1998-01-01

    The purpose of this modification to the Optical Surface Analysis Code (OSAC) is to upgrade the PSF program to allow the user to get proper diffracted energy normalization even when deliberately obscuring rays with internal obscurations.

  11. Microfocusing of the FERMI@Elettra FEL beam with a K-B active optics system: Spot size predictions by application of the WISE code

    NASA Astrophysics Data System (ADS)

    Raimondi, L.; Svetina, C.; Mahne, N.; Cocco, D.; Abrami, A.; De Marco, M.; Fava, C.; Gerusina, S.; Gobessi, R.; Capotondi, F.; Pedersoli, E.; Kiskinova, M.; De Ninno, G.; Zeitoun, P.; Dovillaire, G.; Lambert, G.; Boutu, W.; Merdji, H.; Gonzalez, A. I.; Gauthier, D.; Zangrando, M.

    2013-05-01

    FERMI@Elettra, the first seeded EUV-SXR free-electron laser (FEL) facility, located at Elettra Sincrotrone Trieste, has been conceived to provide very short (10-100 fs) pulses with ultrahigh peak brightness at wavelengths from 100 nm to 4 nm. A section fully dedicated to photon transport and analysis diagnostics, named PADReS, has already been installed and commissioned. Three of the beamlines installed after the PADReS section, EIS-TIMEX, DiProI and LDM, are in an advanced commissioning state and will accept the first users in December 2012. These beamlines employ active X-ray optics to focus the FEL beam as well as to perform controlled beam shaping at focus. Starting from mirror-surface metrology, it is difficult to predict the focal spot shape using only methods based on geometrical optics, such as ray tracing. Within the geometrical-optics approach one cannot account for the diffraction effect from the optics edges, i.e. aperture diffraction, or for the impact of the different surface spatial wavelengths on spot-size degradation; both effects depend strongly on the photon energy and the mirror incidence angles. We therefore employed a method based on physical optics, which applies the Huygens-Fresnel principle to reflection and on which the WISE code is based. In this work we report the results of the first measurements of the focal spot in the DiProI beamline end-station and compare them with the predictions computed with the Shadow and WISE codes, starting from the measured mirror surface profiles.
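    The gap between geometrical and physical optics noted above can be illustrated with a minimal Huygens-Fresnel wavelet sum (this is not the WISE code; the focal length and aperture below are hypothetical). Where ray tracing predicts a point focus for a perfect mirror, the wavelet sum yields a diffraction-limited spot of width ~λf/D:

    ```python
    import numpy as np

    # Huygens-Fresnel sketch (1D): a perfectly focusing aperture of size D
    # at distance f, at the shortest FERMI wavelength. Geometric optics gives
    # a point focus; physical optics gives a spot of FWHM ≈ 0.886*lam*f/D.
    lam = 4e-9        # wavelength: 4 nm
    f = 1.2           # focal distance in m (hypothetical)
    D = 2e-3          # illuminated aperture in m (hypothetical)
    k = 2 * np.pi / lam

    xs = np.linspace(-D / 2, D / 2, 2000)   # secondary-source points on aperture
    xf = np.linspace(-5e-6, 5e-6, 1001)     # observation points in focal plane

    # Each aperture point radiates a wavelet; subtracting the path length to
    # the focus center models the ideal converging (mirror/lens) phase.
    r = np.sqrt(f**2 + (xf[:, None] - xs[None, :])**2)
    field = np.sum(np.exp(1j * k * (r - np.sqrt(f**2 + xs**2))), axis=1)
    inten = np.abs(field)**2 / np.max(np.abs(field)**2)

    # FWHM of the central lobe vs. the textbook diffraction estimate.
    above = xf[inten >= 0.5]
    fwhm = above.max() - above.min()
    print(fwhm, lam * f / D)   # ~2.1e-6 m and 2.4e-6 m (FWHM ≈ 0.886*lam*f/D)
    ```

    Surface figure errors would enter this sum as an extra phase term on each wavelet, which is essentially how spatial-wavelength-dependent spot degradation arises.
    
    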

  12. X-ray astronomy in the laboratory with a miniature compact object produced by laser-driven implosion

    NASA Astrophysics Data System (ADS)

    Fujioka, Shinsuke; Takabe, Hideaki; Yamamoto, Norimasa; Salzmann, David; Wang, Feilu; Nishimura, Hiroaki; Li, Yutong; Dong, Quanli; Wang, Shoujun; Zhang, Yi; Rhee, Yong-Joo; Lee, Yong-Woo; Han, Jae-Min; Tanabe, Minoru; Fujiwara, Takashi; Nakabayashi, Yuto; Zhao, Gang; Zhang, Jie; Mima, Kunioki

    2009-11-01

    X-ray spectroscopy is an important tool for understanding the extreme photoionization processes that drive the behaviour of non-thermal equilibrium plasmas in compact astrophysical objects such as black holes. Even so, the distance of these objects from the Earth and the inability to control or accurately ascertain the conditions that govern their behaviour makes it difficult to interpret the origin of the features in astronomical X-ray measurements. Here, we describe an experiment that uses the implosion driven by a 3 TW, 4 kJ laser system to produce a 0.5 keV blackbody radiator that mimics the conditions that exist in the neighbourhood of a black hole. The X-ray spectra emitted from photoionized silicon plasmas resemble those observed from the binary stars Cygnus X-3 (refs 7, 8) and Vela X-1 (refs 9-11) with the Chandra X-ray satellite. As well as demonstrating the ability to create extreme radiation fields in a laboratory plasma, our theoretical interpretation of these laboratory spectra contrasts starkly with the generally accepted explanation for the origin of similar features in astronomical observations. Our experimental approach offers a powerful means to test and validate the computer codes used in X-ray astronomy.

  13. NPTFit: A Code Package for Non-Poissonian Template Fitting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mishra-Sharma, Siddharth; Rodd, Nicholas L.; Safdi, Benjamin R., E-mail: smsharma@princeton.edu, E-mail: nrodd@mit.edu, E-mail: bsafdi@mit.edu

    We present NPTFit, an open-source code package, written in Python and Cython, for performing non-Poissonian template fits (NPTFs). The NPTF is a recently developed statistical procedure for characterizing the contribution of unresolved point sources (PSs) to astrophysical data sets. The NPTF was first applied to Fermi gamma-ray data to provide evidence that the excess of ∼GeV gamma-rays observed in the inner regions of the Milky Way likely arises from a population of sub-threshold point sources, and the NPTF has since found additional applications studying sub-threshold extragalactic sources at high Galactic latitudes. The NPTF generalizes traditional astrophysical template fits to allow for searches for populations of unresolved PSs that may follow a given spatial distribution. NPTFit builds upon the framework of the fluctuation analyses developed in X-ray astronomy, and thus likely has applications beyond those demonstrated with gamma-ray data. The NPTFit package utilizes novel computational methods to perform the NPTF efficiently. The code is available at http://github.com/bsafdi/NPTFit, and up-to-date, extensive documentation may be found at http://nptfit.readthedocs.io.
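    The statistical signature the NPTF exploits can be illustrated with a toy simulation (this is not the NPTFit API; all numbers are arbitrary): a population of unresolved point sources produces over-dispersed, non-Poissonian pixel counts even when the mean matches a smooth template.

    ```python
    import numpy as np

    # Compare pixel-count statistics for smooth emission vs. an unresolved
    # point-source (PS) population with the same mean counts per pixel.
    rng = np.random.default_rng(1)
    npix = 200_000
    mean_counts = 5.0

    # Smooth emission: counts are Poisson with the same mean in every pixel.
    smooth = rng.poisson(mean_counts, npix)

    # PS population: each pixel hosts a Poisson number of faint sources, each
    # contributing a Poisson number of photons. This compound distribution is
    # over-dispersed: variance/mean = 1 + flux_per_source.
    flux_per_source = 4.0
    n_src = rng.poisson(mean_counts / flux_per_source, npix)
    ps = rng.poisson(flux_per_source * n_src)

    print(smooth.var() / smooth.mean())  # ~1.0 (Poissonian)
    print(ps.var() / ps.mean())          # ~5.0 (non-Poissonian)
    ```

    A non-Poissonian template fit models exactly this broadening of the per-pixel count histogram to infer the sub-threshold source population.
    
    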

  14. Analysis of the Effect of Electron Density Perturbations Generated by Gravity Waves on HF Communication Links

    NASA Astrophysics Data System (ADS)

    Fagre, M.; Elias, A. G.; Chum, J.; Cabrera, M. A.

    2017-12-01

    In the present work, ray tracing of high-frequency (HF) signals under disturbed ionospheric conditions is analyzed, particularly in the presence of electron density perturbations generated by gravity waves (GWs). The three-dimensional numerical ray tracing code of Jones and Stephenson, based on Hamilton's equations and commonly used to study radio propagation through the ionosphere, is employed. An electron density perturbation model is incorporated into this code, based on atmospheric GWs generated at a height of 150 km in the thermosphere and propagating up into the ionosphere. The motion of the neutral gas at these altitudes induces disturbances in the background plasma which affect HF signal propagation. To model the GWs realistically and analyze their propagation and dispersion characteristics, a GW ray tracing method with kinematic viscosity and thermal diffusivity was applied. The IRI-2012, HWM14 and NRLMSISE-00 models were used to provide the electron density, wind velocities, neutral temperature and total mass density needed by the ray tracing codes. Preliminary results of gravity wave effects on ground range and reflection height are presented for the low- and mid-latitude ionosphere.
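    The Hamiltonian ray-tracing idea underlying the Jones-Stephenson code can be sketched for the simplest case of a cold, isotropic, stratified plasma (a toy model, not the actual code; the layer profile and frequencies are hypothetical). The ray bends and turns around where the plasma frequency meets the reflection condition, and the integrated apex can be checked against the analytic reflection height:

    ```python
    import numpy as np

    # 2D Hamiltonian ray tracing in a stratified ionosphere. Cold isotropic
    # plasma dispersion: c^2 k^2 = w^2 - wp^2(z), with Hamiltonian
    #   H = (c^2 k^2 - w^2 + wp^2(z)) / (2 w)
    # so dr/dt = c^2 k / w and dkz/dt = -(dwp2/dz) / (2 w); kx is conserved.
    c = 3e5                      # speed of light, km/s
    f, fmax = 6e6, 8e6           # wave and peak plasma frequency, Hz (hypothetical)
    w = 2 * np.pi * f
    zb, zt = 100.0, 200.0        # layer base and thickness, km (hypothetical)

    def wp2(z):                  # plasma frequency squared: linear above the base
        return (2 * np.pi * fmax) ** 2 * max(0.0, (z - zb) / zt)

    def dwp2_dz(z):
        return (2 * np.pi * fmax) ** 2 / zt if z > zb else 0.0

    # Launch from the ground at 60 degrees elevation.
    el = np.radians(60.0)
    x, z = 0.0, 0.0
    kx, kz = (w / c) * np.cos(el), (w / c) * np.sin(el)

    dt, apex = 1e-6, 0.0
    while z >= 0.0:              # integrate until the ray returns to the ground
        zm = z + 0.5 * dt * c**2 * kz / w          # midpoint (RK2) step
        x += dt * c**2 * kx / w
        z += dt * c**2 * (kz - 0.5 * dt * dwp2_dz(z) / (2 * w)) / w
        kz -= dt * dwp2_dz(zm) / (2 * w)
        apex = max(apex, z)

    # Analytic turning point: wp2(z_r) = w^2 sin^2(el)
    z_r = zb + zt * (f / fmax) ** 2 * np.sin(el) ** 2
    print(apex, z_r)             # both ~184 km; x is the ground range
    ```

    Adding a GW-induced perturbation to `wp2` would perturb `dkz/dt` along the path, which is how electron density disturbances shift the ground range and reflection height in the full 3D code.
    
    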

  15. Main functions, recent updates, and applications of Synchrotron Radiation Workshop code

    NASA Astrophysics Data System (ADS)

    Chubar, Oleg; Rakitin, Maksim; Chen-Wiegart, Yu-Chen Karen; Chu, Yong S.; Fluerasu, Andrei; Hidas, Dean; Wiegart, Lutz

    2017-08-01

    The paper presents an overview of the main functions and new application examples of the "Synchrotron Radiation Workshop" (SRW) code. SRW supports high-accuracy calculations of different types of synchrotron radiation, and simulations of propagation of fully-coherent radiation wavefronts, partially-coherent radiation from a finite-emittance electron beam of a storage ring source, and time-/frequency-dependent radiation pulses of a free-electron laser, through X-ray optical elements of a beamline. An extended library of physical-optics "propagators" for different types of reflective, refractive and diffractive X-ray optics with its typical imperfections, implemented in SRW, enable simulation of practically any X-ray beamline in a modern light source facility. The high accuracy of calculation methods used in SRW allows for multiple applications of this code, not only in the area of development of instruments and beamlines for new light source facilities, but also in areas such as electron beam diagnostics, commissioning and performance benchmarking of insertion devices and individual X-ray optical elements of beamlines. Applications of SRW in these areas, facilitating development and advanced commissioning of beamlines at the National Synchrotron Light Source II (NSLS-II), are described.

  16. TIME-DEPENDENT MULTI-GROUP MULTI-DIMENSIONAL RELATIVISTIC RADIATIVE TRANSFER CODE BASED ON SPHERICAL HARMONIC DISCRETE ORDINATE METHOD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tominaga, Nozomu; Shibata, Sanshiro; Blinnikov, Sergei I., E-mail: tominaga@konan-u.ac.jp, E-mail: sshibata@post.kek.jp, E-mail: Sergei.Blinnikov@itep.ru

    We develop a time-dependent, multi-group, multi-dimensional relativistic radiative transfer code, which is required to numerically investigate radiation from relativistic fluids involved in, e.g., gamma-ray bursts and active galactic nuclei. The code is based on the spherical harmonic discrete ordinate method (SHDOM), which evaluates a source function including anisotropic scattering in spherical harmonics and implicitly solves the static radiative transfer equation with ray tracing in discrete ordinates. We implement treatments of time dependence, multi-frequency bins, Lorentz transformation, and elastic Thomson and inelastic Compton scattering in the publicly available SHDOM code. Our code adopts a mixed-frame approach: the source function is evaluated in the comoving frame, whereas the radiative transfer equation is solved in the laboratory frame. This implementation is validated using various test problems and comparisons with the results from a relativistic Monte Carlo code. These validations confirm that the code correctly calculates the intensity and its evolution in the computational domain. The code enables us to obtain an Eddington tensor that relates the first and third moments of intensity (energy density and radiation pressure) and is frequently used as a closure relation in radiation hydrodynamics calculations.
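    The core operation of such a solver, the formal solution of the static transfer equation along a single ray, can be sketched in a few lines (a schematic illustration, not the SHDOM implementation):

    ```python
    import numpy as np

    # Formal solution of the static transfer equation along one ray,
    #   dI/dtau = S - I,
    # swept cell by cell as a discrete-ordinates/ray-tracing solver does.
    # For constant S the exact answer is I(tau) = I0*e^-tau + S*(1 - e^-tau).
    S, I0, tau_tot, ncell = 2.0, 0.5, 3.0, 300
    dtau = tau_tot / ncell

    I = I0
    for _ in range(ncell):
        # exponential cell update, exact for piecewise-constant source function
        I = I * np.exp(-dtau) + S * (1 - np.exp(-dtau))

    exact = I0 * np.exp(-tau_tot) + S * (1 - np.exp(-tau_tot))
    print(I, exact)   # both ~1.925
    ```

    In the full method this sweep is repeated over all discrete ordinates, with S rebuilt from the spherical-harmonic expansion of the radiation field until the iteration converges.
    
    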

  17. Simulation of prompt gamma-ray emission during proton radiotherapy.

    PubMed

    Verburg, Joost M; Shih, Helen A; Seco, Joao

    2012-09-07

    The measurement of prompt gamma rays emitted from proton-induced nuclear reactions has been proposed as a method to verify in vivo the range of a clinical proton radiotherapy beam. A good understanding of the prompt gamma-ray emission during proton therapy is key to develop a clinically feasible technique, as it can facilitate accurate simulations and uncertainty analysis of gamma detector designs. Also, the gamma production cross-sections may be incorporated as prior knowledge in the reconstruction of the proton range from the measurements. In this work, we performed simulations of proton-induced nuclear reactions with the main elements of human tissue, carbon-12, oxygen-16 and nitrogen-14, using the nuclear reaction models of the GEANT4 and MCNP6 Monte Carlo codes and the dedicated nuclear reaction codes TALYS and EMPIRE. For each code, we made an effort to optimize the input parameters and model selection. The results of the models were compared to available experimental data of discrete gamma line cross-sections. Overall, the dedicated nuclear reaction codes reproduced the experimental data more consistently, while the Monte Carlo codes showed larger discrepancies for a number of gamma lines. The model differences lead to a variation of the total gamma production near the end of the proton range by a factor of about 2. These results indicate a need for additional theoretical and experimental study of proton-induced gamma emission in human tissue.

  18. Modelling of an Orthovoltage X-ray Therapy Unit with the EGSnrc Monte Carlo Package

    NASA Astrophysics Data System (ADS)

    Knöös, Tommy; Rosenschöld, Per Munck Af; Wieslander, Elinore

    2007-06-01

    Simulations with the EGSnrc code package of an orthovoltage x-ray machine have been performed. The BEAMnrc code was used to transport electrons, produce x-ray photons in the target, and transport these through the treatment machine down to the exit level of the applicator. Further transport in water or CT-based phantoms was facilitated by the DOSXYZnrc code. Phase space files were scored with BEAMnrc and analysed regarding the energy spectra at the end of the applicator. Tuning of simulation parameters was based on the half-value layer quantity for the beams in either Al or Cu. Calculated depth dose and profile curves have been compared against measurements and show good agreement except at shallow depths. The MC model tested in this study can be used for various dosimetric studies as well as for generating a library of typical treatment cases that can serve both as educational material and as guidance in clinical practice.
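    The half-value-layer tuning mentioned above boils down to finding the filter thickness that halves the transmitted signal of a polychromatic beam. A minimal sketch, with a made-up two-line spectrum and illustrative attenuation coefficients (not NIST data):

    ```python
    import numpy as np

    # Half-value layer (HVL) of a polychromatic beam: the Al thickness that
    # halves the fluence-weighted transmission. Spectrum and mu values are
    # illustrative only.
    energies = np.array([30.0, 60.0])   # keV (labels only, not used below)
    weights = np.array([0.4, 0.6])      # relative fluence per line
    mu_al = np.array([1.13, 0.25])      # linear attenuation in Al, 1/cm (assumed)

    def transmission(t_cm):
        return np.sum(weights * np.exp(-mu_al * t_cm)) / np.sum(weights)

    # Beam hardening makes T(t) non-exponential, so solve T(t) = 0.5
    # numerically by bisection.
    lo, hi = 0.0, 20.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if transmission(mid) > 0.5 else (lo, mid)
    hvl = 0.5 * (lo + hi)

    print(hvl)   # HVL in cm of Al for this toy spectrum
    ```

    In a benchmarking workflow, simulation parameters (e.g., the assumed tube potential or inherent filtration) are adjusted until the HVL computed this way from the simulated spectrum matches the measured one.
    
    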

  19. Computation of Cosmic Ray Ionization and Dose at Mars: a Comparison of HZETRN and Planetocosmics for Proton and Alpha Particles

    NASA Technical Reports Server (NTRS)

    Gronoff, Guillaume; Norman, Ryan B.; Mertens, Christopher J.

    2014-01-01

    The ability to evaluate the cosmic ray environment at Mars is of interest for future manned exploration. To support exploration, tools must be developed to accurately assess the radiation environment in both free space and on planetary surfaces. The primary tool NASA uses to quantify radiation exposure behind shielding materials is the space radiation transport code HZETRN. In order to build confidence in HZETRN, code benchmarking against Monte Carlo radiation transport codes is often used. This work compares the dose calculations at Mars by HZETRN and the Geant4 application Planetocosmics. The dose at ground and the energy deposited in the atmosphere by galactic cosmic ray protons and alpha particles have been calculated for the Curiosity landing conditions. In addition, this work has considered Solar Energetic Particle events, allowing for the comparison of varying input radiation environments. The results for protons and alpha particles show very good agreement between HZETRN and Planetocosmics.

  20. Photon Throughput Calculations for a Spherical Crystal Spectrometer

    NASA Astrophysics Data System (ADS)

    Gilman, C. J.; Bitter, M.; Delgado-Aparicio, L.; Efthimion, P. C.; Hill, K.; Kraus, B.; Gao, L.; Pablant, N.

    2017-10-01

    X-ray imaging crystal spectrometers of the type described in Refs. have become a standard diagnostic for Doppler measurements of profiles of the ion temperature and the plasma flow velocities in magnetically confined, hot fusion plasmas. These instruments have by now been implemented on major tokamak and stellarator experiments in Korea, China, Japan, and Germany and are currently also being designed by PPPL for ITER. A still missing part in the present data analysis is an efficient code for photon throughput calculations to evaluate the chord-integrated spectral data. The existing ray tracing codes cannot be used for a data analysis between shots, since they require extensive and time consuming numerical calculations. Here, we present a detailed analysis of the geometrical properties of the ray pattern. This method allows us to minimize the extent of numerical calculations and to create a more efficient code. This work was performed under the auspices of the U.S. Department of Energy by Princeton Plasma Physics Laboratory under contract DE-AC02-09CH11466.

  1. Monte Carlo Simulation of X-Ray Spectra in Mammography and Contrast-Enhanced Digital Mammography Using the Code PENELOPE

    NASA Astrophysics Data System (ADS)

    Cunha, Diego M.; Tomal, Alessandra; Poletti, Martin E.

    2013-04-01

    In this work, the Monte Carlo (MC) code PENELOPE was employed for simulation of x-ray spectra in mammography and contrast-enhanced digital mammography (CEDM). Spectra for Mo, Rh and W anodes were obtained for tube potentials between 24-36 kV, for mammography, and between 45-49 kV, for CEDM. The spectra obtained from the simulations were analytically filtered to correspond to the anode/filter combinations usually employed in each technique (Mo/Mo, Rh/Rh and W/Rh for mammography and Mo/Cu, Rh/Cu and W/Cu for CEDM). For the Mo/Mo combination, the simulated spectra were compared with those obtained experimentally, and for spectra for the W anode, with experimental data from the literature, through comparison of distribution shape, average energies, half-value layers (HVL) and transmission curves. For all combinations evaluated, the simulated spectra were also compared with those provided by different models from the literature. Results showed that the code PENELOPE provides mammographic x-ray spectra in good agreement with those experimentally measured and those from the literature. The differences in the values of HVL ranged between 2-7%, for anode/filter combinations and tube potentials employed in mammography, and they were less than 5% for those employed in CEDM. The transmission curves for the spectra obtained also showed good agreement compared to those computed from reference spectra, with average relative differences less than 12% for mammography and CEDM. These results show that the code PENELOPE can be a useful tool to generate x-ray spectra for studies in mammography and CEDM, and also for evaluation of new x-ray tube designs and new anode materials.

  2. 75 FR 39985 - In the Matter of Aerotest Operations, Inc. (Aerotest Radiography and Research Reactor); Order...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-13

    ...-Ray Industries, Inc., (X-Ray), and Autoliv requested that the NRC, pursuant to of Title 10 of the Code... of Aerotest's license to possess, use, and operate the ARRR, from its current owner, Autoliv to X-Ray. Autoliv, the parent company of OEA, Inc., (which is the parent company of Aerotest) and X-Ray have entered...

  3. Utilization of Patch/Triangular Target Description Data in BRL Parallel Ray Vulnerability Assessment Codes

    DTIC Science & Technology

    1979-09-01

    KEY WORDS (Continue on reverse side if necessary and identify by block number): Target Descriptions, GIFT Code, COMGEOM Descriptions, FASTGEN Code ...which accepts the COMGEOM target description and produces the shotline data is the GIFT code. The GIFT code evolved from and has ...the COMGEOM/GIFT methodology, while the Navy and Air Force use the PATCH/SHOTGEN-FASTGEN methodology. Lawrence W. Bain, Mathew J. Heisinger

  4. Application of quasi-distributions for solving inverse problems of neutron and {gamma}-ray transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pogosbekyan, L.R.; Lysov, D.A.

    The considered inverse problems deal with the calculation of unknown parameters of nuclear installations from known (goal) functionals of the neutron/{gamma}-ray distributions. Examples of such problems include calculating automatic control rod positions as a function of neutron sensor readings, or calculating experimentally corrected values of cross-sections, isotope concentrations, and fuel enrichment from the measured functionals. The authors have developed a new method to solve the inverse problem. It finds the flux density as a quasi-solution of the particle-conservation linear system adjoined to the equalities for the functionals. The method is more effective than one based on classical perturbation theory; it is suitable for vectorization and can be used successfully in optimization codes.

  5. A search for outflows from X-ray bright points in coronal holes

    NASA Technical Reports Server (NTRS)

    Mullan, D. J.; Waldron, W. L.

    1986-01-01

    Properties of X-ray bright points were investigated using two of the instruments on the Solar Maximum Mission. The mass outflows from magnetic regions were modeled using a two-dimensional MHD code. It was concluded that mass outflows can be detected from X-ray bright points provided that the magnetic topology is favorable.

  6. Experimental characterization of an ultra-fast Thomson scattering x-ray source with three-dimensional time and frequency-domain analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuba, J; Slaughter, D R; Fittinghoff, D N

    We present a detailed comparison of the measured characteristics of Thomson backscattered x-rays produced at the PLEIADES (Picosecond Laser-Electron Interaction for the Dynamic Evaluation of Structures) facility at Lawrence Livermore National Laboratory with predicted results from a newly developed, fully three-dimensional time- and frequency-domain code. Based on the relativistic differential cross section, this code has the capability to calculate time- and space-dependent spectra of the x-ray photons produced from linear Thomson scattering for both bandwidth-limited and chirped incident laser pulses. Spectral broadening of the scattered x-ray pulse resulting from the incident laser bandwidth, perpendicular wave-vector components in the laser focus, and the transverse and longitudinal phase space of the electron beam are included. Electron beam energy, energy spread, and transverse phase space measurements of the electron beam at the interaction point are presented, and the corresponding predicted x-ray characteristics are determined. In addition, time-integrated measurements of the x-rays produced in the interaction are presented and shown to agree well with the simulations.

  7. ECCD-induced tearing mode stabilization via active control in coupled NIMROD/GENRAY HPC simulations

    NASA Astrophysics Data System (ADS)

    Jenkins, Thomas; Kruger, S. E.; Held, E. D.; Harvey, R. W.

    2012-10-01

    Actively controlled electron cyclotron current drive (ECCD) applied within magnetic islands formed by neoclassical tearing modes (NTMs) has been shown to control or suppress these modes. In conjunction with ongoing experimental efforts, the development and verification of integrated numerical models of this mode stabilization process is of paramount importance in determining optimal NTM stabilization strategies for ITER. In the advanced model developed by the SWIM Project, the equations/closures of extended (not reduced) MHD contain new terms arising from 3D (not toroidal or bounce-averaged) RF-induced quasilinear diffusion. The quasilinear operator formulation models the equilibration of driven current within the island using the same extended MHD dynamics which govern the physics of island formation, yielding a more accurate and self-consistent picture of 3D island response to RF drive. Results of computations which model ECRF deposition using ray tracing, assemble the 3D quasilinear operator from ray/profile data, and calculate the resultant forces within the extended MHD code will be presented. We also discuss the efficacy of various numerical active feedback control systems, which gather data from synthetic diagnostics to dynamically trigger and spatially align RF fields.

  8. The X-Ray Polarization of the Accretion Disk Coronae of Active Galactic Nuclei

    NASA Astrophysics Data System (ADS)

    Beheshtipour, Banafsheh; Krawczynski, Henric; Malzac, Julien

    2017-11-01

    Hard X-rays observed in Active Galactic Nuclei (AGNs) are thought to originate from the Comptonization of the optical/UV accretion disk photons in a hot corona. Polarization studies of these photons can help to constrain the corona geometry and the plasma properties. We have developed a ray-tracing code that simulates the Comptonization of accretion disk photons in coronae of arbitrary shapes, and use it here to study the polarization of the X-ray emission from wedge and spherical coronae. We study the predicted polarization signatures for the fully relativistic and various approximate treatments of the elemental Compton scattering processes. We furthermore use the code to evaluate the impact of nonthermal electrons and cyclo-synchrotron photons on the polarization properties. Finally, we model the NuSTAR observations of the Seyfert I galaxy Mrk 335 and predict the associated polarization signal. Our studies show that X-ray polarimetry missions such as NASA’s Imaging X-ray Polarimetry Explorer and the X-ray Imaging Polarimetry Explorer proposed to ESA will provide valuable new information about the physical properties of the plasma close to the event horizon of AGN black holes.

  9. Kinetic Modeling of Ultraintense X-ray Laser-Matter Interactions

    NASA Astrophysics Data System (ADS)

    Royle, Ryan; Sentoku, Yasuhiko; Mancini, Roberto

    2016-10-01

    Hard x-ray free-electron lasers (XFELs) have had a profound impact on the physical, chemical, and biological sciences. They can produce millijoule x-ray laser pulses just tens of femtoseconds in duration with more than 10^12 photons each, making them the brightest laboratory x-ray sources ever produced by several orders of magnitude. An XFEL pulse can be intensified to 10^20 W/cm^2 when focused to submicron spot sizes, making it possible to isochorically heat solid matter well beyond 100 eV. These characteristics enable XFELs to create and probe well-characterized warm and hot dense plasmas of relevance to HED science, planetary science, laboratory astrophysics, relativistic laser plasmas, and fusion research. Several newly developed atomic physics models including photoionization, Auger ionization, and continuum-lowering have been implemented in a particle-in-cell code, PICLS, which self-consistently solves the x-ray transport, to enable the simulation of the non-LTE plasmas created by ultraintense x-ray laser interactions with solid density matter. The code is validated against the results of several recent experiments and is used to simulate the maximum-intensity x-ray heating of solid iron targets. This work was supported by DOE/OFES under Contract No. DE-SC0008827.

  10. Monte Carlo simulation of X-ray imaging and spectroscopy experiments using quadric geometry and variance reduction techniques

    NASA Astrophysics Data System (ADS)

    Golosio, Bruno; Schoonjans, Tom; Brunetti, Antonio; Oliva, Piernicola; Masala, Giovanni Luca

    2014-03-01

    The simulation of X-ray imaging experiments is often performed using deterministic codes, which can be relatively fast and easy to use. However, such codes are generally not suitable for the simulation of even slightly more complex experimental conditions, involving, for instance, first-order or higher-order scattering, X-ray fluorescence emissions, or more complex geometries, particularly for experiments that combine spatial resolution with spectral information. In such cases, simulations are often performed using codes based on the Monte Carlo method. In a simple Monte Carlo approach, the interaction position of an X-ray photon and the state of the photon after an interaction are obtained simply according to the theoretical probability distributions. This approach may be quite inefficient because the final channels of interest may include only a limited region of space or photons produced by a rare interaction, e.g., fluorescent emission from elements with very low concentrations. In the field of X-ray fluorescence spectroscopy, this problem has been solved by combining the Monte Carlo method with variance reduction techniques, which can reduce the computation time by several orders of magnitude. In this work, we present a C++ code for the general simulation of X-ray imaging and spectroscopy experiments, based on the application of the Monte Carlo method in combination with variance reduction techniques, with a description of sample geometry based on quadric surfaces. We describe the benefits of the object-oriented approach in terms of code maintenance, the flexibility of the program for the simulation of different experimental conditions and the possibility of easily adding new modules. Sample applications in the fields of X-ray imaging and X-ray spectroscopy are discussed. Catalogue identifier: AERO_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AERO_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. 
Ireland Licensing provisions: GNU General Public License version 3 No. of lines in distributed program, including test data, etc.: 83617 No. of bytes in distributed program, including test data, etc.: 1038160 Distribution format: tar.gz Programming language: C++. Computer: Tested on several PCs and on Mac. Operating system: Linux, Mac OS X, Windows (native and cygwin). RAM: It is dependent on the input data but usually between 1 and 10 MB. Classification: 2.5, 21.1. External routines: XrayLib (https://github.com/tschoonj/xraylib/wiki) Nature of problem: Simulation of a wide range of X-ray imaging and spectroscopy experiments using different types of sources and detectors. Solution method: XRMC is a versatile program that is useful for the simulation of a wide range of X-ray imaging and spectroscopy experiments. It enables the simulation of monochromatic and polychromatic X-ray sources, with unpolarised or partially/completely polarised radiation. Single-element detectors as well as two-dimensional pixel detectors can be used in the simulations, with several acquisition options. In the current version of the program, the sample is modelled by combining convex three-dimensional objects demarcated by quadric surfaces, such as planes, ellipsoids and cylinders. The Monte Carlo approach makes XRMC able to accurately simulate X-ray photon transport and interactions with matter up to any order of interaction. The differential cross-sections and all other quantities related to the interaction processes (photoelectric absorption, fluorescence emission, elastic and inelastic scattering) are computed using the xraylib software library, which is currently the most complete and up-to-date software library for X-ray parameters. The use of variance reduction techniques makes XRMC able to reduce the simulation time by several orders of magnitude compared to other general-purpose Monte Carlo simulation programs. Running time: It is dependent on the complexity of the simulation. 
For the examples distributed with the code, it ranges from less than 1 s to a few minutes.

  11. Coded aperture detector: an image sensor with sub 20-nm pixel resolution.

    PubMed

    Miyakawa, Ryan; Mayer, Rafael; Wojdyla, Antoine; Vannier, Nicolas; Lesser, Ian; Aron-Dine, Shifrah; Naulleau, Patrick

    2014-08-11

We describe the coded aperture detector, a novel image sensor based on uniformly redundant arrays (URAs) with customizable pixel size, resolution, and operating photon energy regime. In this sensor, a coded aperture is scanned laterally at the image plane of an optical system, and the transmitted intensity is measured by a photodiode. The image intensity is then digitally reconstructed using a simple convolution. We present results from a proof-of-principle optical prototype, demonstrating high-fidelity image sensing comparable to a CCD. We also present a 20-nm half-pitch URA, fabricated by the Center for X-ray Optics (CXRO) nano-fabrication laboratory, that is suitable for high-resolution image sensing at EUV and soft X-ray wavelengths.
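
The URA decode step — a circular correlation with a ±1 decoding array — can be sketched in one dimension. The quadratic-residue construction below is a standard URA recipe, used here only to illustrate the principle, not as the CXRO detector's actual mask:

```python
import numpy as np

def legendre_aperture(p):
    """1-D uniformly redundant array from quadratic residues mod p
    (p prime, p % 4 == 3): open slit at 0 and at every residue."""
    residues = {(i * i) % p for i in range(1, p)}
    a = np.zeros(p, dtype=int)
    a[0] = 1
    for r in residues:
        a[r] = 1
    return a

def decode(recorded, aperture):
    """Reconstruct by circular correlation with the +/-1 decoding array."""
    g = 2 * aperture - 1
    return np.array([np.dot(recorded, np.roll(g, k)) for k in range(len(aperture))])

p = 11                              # any prime with p % 4 == 3 works
A = legendre_aperture(p)
source_pos = 4
recorded = np.roll(A, source_pos)   # point source -> shifted mask shadow
image = decode(recorded, A)
print(image.argmax())               # recovers the source position: 4
```

The off-peak correlation is exactly zero for this mask family, which is what "uniformly redundant" buys over a random pinhole pattern.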

  12. Partially coherent X-ray wavefront propagation simulations including grazing-incidence focusing optics.

    PubMed

    Canestrari, Niccolo; Chubar, Oleg; Reininger, Ruben

    2014-09-01

    X-ray beamlines in modern synchrotron radiation sources make extensive use of grazing-incidence reflective optics, in particular Kirkpatrick-Baez elliptical mirror systems. These systems can focus the incoming X-rays down to nanometer-scale spot sizes while maintaining relatively large acceptance apertures and high flux in the focused radiation spots. In low-emittance storage rings and in free-electron lasers such systems are used with partially or even nearly fully coherent X-ray beams and often target diffraction-limited resolution. Therefore, their accurate simulation and modeling has to be performed within the framework of wave optics. Here the implementation and benchmarking of a wave-optics method for the simulation of grazing-incidence mirrors based on the local stationary-phase approximation or, in other words, the local propagation of the radiation electric field along geometrical rays, is described. The proposed method is CPU-efficient and fully compatible with the numerical methods of Fourier optics. It has been implemented in the Synchrotron Radiation Workshop (SRW) computer code and extensively tested against the geometrical ray-tracing code SHADOW. The test simulations have been performed for cases without and with diffraction at mirror apertures, including cases where the grazing-incidence mirrors can be hardly approximated by ideal lenses. Good agreement between the SRW and SHADOW simulation results is observed in the cases without diffraction. The differences between the simulation results obtained by the two codes in diffraction-dominated cases for illumination with fully or partially coherent radiation are analyzed and interpreted. The application of the new method for the simulation of wavefront propagation through a high-resolution X-ray microspectroscopy beamline at the National Synchrotron Light Source II (Brookhaven National Laboratory, USA) is demonstrated.

  13. Design criteria for small coded aperture masks in gamma-ray astronomy

    NASA Technical Reports Server (NTRS)

    Sembay, S.; Gehrels, Neil

    1990-01-01

Most theoretical work on coded aperture masks in X-ray and low-energy gamma-ray astronomy has concentrated on masks with large numbers of elements. For gamma-ray spectrometers in the MeV range, the detector plane usually has only a few discrete elements, so that masks with small numbers of elements are called for. For this case it is feasible to analyze by computer all the possible mask patterns of given dimension to find the ones that best satisfy the desired performance criteria. A particular set of performance criteria for comparing the flux sensitivities, source positioning accuracies and transparencies of different mask patterns is developed. The results of such a computer analysis for masks of up to 5 × 5 unit cells are presented, and it is concluded that there is a great deal of flexibility in the choice of mask pattern for each dimension.
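
The exhaustive-search idea is easy to sketch in one dimension: enumerate every small binary mask with a fixed open fraction and keep the one with the flattest decoding sidelobes. The length-7 example and the sidelobe metric below are illustrative choices, not the authors' actual performance criteria:

```python
import numpy as np
from itertools import product

def sidelobe_score(mask):
    """Worst off-peak circular cross-correlation between a 0/1 mask and
    its +/-1 decoding array -- lower means cleaner reconstruction."""
    a = np.array(mask)
    g = 2 * a - 1
    return max(abs(int(np.dot(a, np.roll(g, k)))) for k in range(1, len(a)))

n = 7
# Restrict to the standard URA open fraction, (n + 1) / 2 open elements.
candidates = [m for m in product((0, 1), repeat=n) if sum(m) == (n + 1) // 2]
best = min(candidates, key=sidelobe_score)
print(best, sidelobe_score(best))
```

For n = 7 the search recovers perfectly flat (zero-sidelobe) patterns, including the quadratic-residue mask (1, 1, 1, 0, 1, 0, 0); for 5 × 5 masks the same brute-force loop runs over 2^25 patterns and remains tractable.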

  14. X-ray metrology and performance of a 45-cm long x-ray deformable mirror

    DOE PAGES

    Poyneer, Lisa A.; Brejnholt, Nicolai F.; Hill, Randall; ...

    2016-05-20

We describe experiments with a 45-cm long x-ray deformable mirror (XDM) that have been conducted in End Station 2, Beamline 5.3.1 at the Advanced Light Source. A detailed description of the hardware implementation is provided. We explain our one-dimensional Fresnel propagation code that correctly handles grazing incidence and includes a model of the XDM. This code is used to simulate and verify experimental results. Initial long trace profiler metrology of the XDM at 7.5 keV is presented. The ability to measure a large (150-nm amplitude) height change on the XDM is demonstrated. The results agree well with the simulated experiment at an error level of 1 μrad RMS. Lastly, direct imaging of the x-ray beam also shows the expected change in intensity profile at the detector.
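
The core of a one-dimensional Fresnel propagation code — here the standard transfer-function method, without the paper's grazing-incidence handling or XDM model — can be sketched as follows (wavelength, sampling and beam waist are arbitrary illustrative values):

```python
import numpy as np

def fresnel_propagate(u0, dx, wavelength, z):
    """Propagate a 1-D complex field u0 over a distance z by applying the
    Fresnel transfer function exp(-i*pi*lambda*z*f^2) in Fourier space."""
    f = np.fft.fftfreq(len(u0), d=dx)           # spatial frequencies
    H = np.exp(-1j * np.pi * wavelength * z * f**2)
    return np.fft.ifft(np.fft.fft(u0) * H)

# Toy example: a Gaussian beam at 1 Å (hard x-ray) propagated over 1 m.
n, dx = 2048, 1e-6                              # 1 µm sampling
x = (np.arange(n) - n / 2) * dx
u0 = np.exp(-(x / 50e-6) ** 2)                  # 50 µm waist
u1 = fresnel_propagate(u0, dx, 1e-10, 1.0)

# Free-space propagation is unitary: total intensity is conserved.
print(np.allclose(np.sum(np.abs(u0)**2), np.sum(np.abs(u1)**2)))  # True
```

Because the transfer function has unit modulus, energy conservation is a convenient sanity check for any implementation of this method.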

  15. X-ray metrology and performance of a 45-cm long x-ray deformable mirror

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poyneer, Lisa A., E-mail: poyneer1@llnl.gov; Brejnholt, Nicolai F.; Hill, Randall

    2016-05-15

We describe experiments with a 45-cm long x-ray deformable mirror (XDM) that have been conducted in End Station 2, Beamline 5.3.1 at the Advanced Light Source. A detailed description of the hardware implementation is provided. We explain our one-dimensional Fresnel propagation code that correctly handles grazing incidence and includes a model of the XDM. This code is used to simulate and verify experimental results. Initial long trace profiler metrology of the XDM at 7.5 keV is presented. The ability to measure a large (150-nm amplitude) height change on the XDM is demonstrated. The results agree well with the simulated experiment at an error level of 1 μrad RMS. Direct imaging of the x-ray beam also shows the expected change in intensity profile at the detector.

  16. X-ray microlaminography with polycapillary optics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dabrowski, K. M.; Dul, D. T.; Wrobel, A.

    2013-06-03

We demonstrate layer-by-layer x-ray microimaging using polycapillary optics. The depth resolution is achieved without sample or source rotation, in a way similar to classical tomography or laminography. The method takes advantage of the large angular apertures of polycapillary optics and of their specific microstructure, which is treated as a coded aperture. The imaging geometry is compatible with polychromatic x-ray sources and with scanning and confocal x-ray fluorescence setups.

  17. Gamma-ray spectroscopy: The diffuse galactic glow

    NASA Technical Reports Server (NTRS)

    Hartmann, Dieter H.

    1991-01-01

    The goal of this project is the development of a numerical code that provides statistical models of the sky distribution of gamma-ray lines due to the production of radioactive isotopes by ongoing Galactic nucleosynthesis. We are particularly interested in quasi-steady emission from novae, supernovae, and stellar winds, but continuum radiation and transient sources must also be considered. We have made significant progress during the first half period of this project and expect the timely completion of a code that can be applied to Oriented Scintillation Spectrometer Experiment (OSSE) Galactic plane survey data.

  18. Quasi-real-time end-to-end simulations of ELT-scale adaptive optics systems on GPUs

    NASA Astrophysics Data System (ADS)

    Gratadour, Damien

    2011-09-01

Our team has started the development of a code dedicated to GPUs for the simulation of AO systems at the E-ELT scale. It uses the CUDA toolkit and an original binding to Yorick (an open source interpreted language) to provide the user with a comprehensive interface. In this paper we present the first performance analysis of our simulation code, showing its ability to provide Shack-Hartmann (SH) images and measurements at the kHz scale for a VLT-sized AO system and in quasi-real-time (up to 70 Hz) for ELT-sized systems on a single top-end GPU. The simulation code includes multi-layer atmospheric turbulence generation, ray tracing through these layers, image formation at the focal plane of every sub-aperture of a SH sensor using either natural or laser guide stars, and centroiding on these images using various algorithms. Turbulence is generated on-the-fly, giving the ability to simulate hours of observations without the need to load extremely large phase screens into global memory. Because of its performance, this code additionally provides the unique ability to test real-time controllers for future AO systems under nominal conditions.
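
The centroiding step on a Shack-Hartmann sub-aperture image reduces to a center-of-gravity calculation. A CPU sketch with a synthetic spot (the paper's implementation runs on GPU via CUDA and offers several centroiding algorithms):

```python
import numpy as np

def centroid(img):
    """Center-of-gravity spot position (x, y) in pixel units."""
    total = img.sum()
    ys, xs = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    return (xs * img).sum() / total, (ys * img).sum() / total

# Synthetic sub-aperture spot, displaced by the local wavefront slope.
n = 32
ys, xs = np.mgrid[0:n, 0:n]
x0, y0 = 18.25, 13.5                      # true spot position (invented)
spot = np.exp(-((xs - x0) ** 2 + (ys - y0) ** 2) / (2 * 2.0 ** 2))
cx, cy = centroid(spot)
print(round(cx, 2), round(cy, 2))         # ≈ 18.25 13.5
```

Each sub-aperture's centroid shift is proportional to the mean wavefront slope across it, which is the measurement a real-time controller consumes.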

  19. 10 CFR 170.31 - Schedule of fees for materials licenses and other regulatory services, including inspections, and...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ..., including x-ray fluorescence analyzers.4 Application [Program Code(s): 22140] $1,200 D. All other special... extraction of metals other than uranium or thorium, including licenses authorizing the possession of...

  20. Diagnostic x-ray dosimetry using Monte Carlo simulation.

    PubMed

    Ioppolo, J L; Price, R I; Tuchyna, T; Buckley, C E

    2002-05-21

An Electron Gamma Shower version 4 (EGS4) based user code was developed to simulate the absorbed dose in humans during routine diagnostic radiological procedures. Measurements of absorbed dose using thermoluminescent dosimeters (TLDs) were compared directly with EGS4 simulations of absorbed dose in homogeneous, heterogeneous and anthropomorphic phantoms. Realistic voxel-based models characterizing the geometry of the phantoms were used as input to the EGS4 code. The voxel geometry of the anthropomorphic Rando phantom was derived from a CT scan of Rando. The 100 kVp diagnostic energy x-ray spectra of the apparatus used to irradiate the phantoms were measured, and provided as input to the EGS4 code. The TLDs were placed at evenly spaced points symmetrically about the central beam axis, which was perpendicular to the cathode-anode x-ray axis, at a number of depths. The TLD measurements in the homogeneous and heterogeneous phantoms were on average within 7% of the values calculated by EGS4. Estimates of effective dose with errors of less than 10% required fewer photon histories (1 × 10^7) than the calculation of dose profiles (1 × 10^9). The EGS4 code was able to predict absorbed dose satisfactorily, and thereby provides an instrument for reducing the effective dose imparted to patients and staff during radiological investigations.
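
The photon-transport core of any such Monte Carlo dose code rests on sampling exponential free paths from the attenuation coefficient. A toy sketch (mono-energetic pencil beam, invented attenuation coefficient, no scatter or secondary electrons, unlike a full EGS4 simulation):

```python
import numpy as np

rng = np.random.default_rng(1)
mu = 0.5          # linear attenuation coefficient, 1/cm (assumed value)
t = 4.0           # slab thickness, cm
n = 200_000       # photon histories

# Sample each photon's first-interaction depth from an exponential
# distribution with mean free path 1/mu.
depths = rng.exponential(1.0 / mu, n)
transmitted = (depths > t).mean()
print(transmitted)       # ≈ exp(-mu * t) = exp(-2) ≈ 0.135
```

Binning the interaction depths (and, in a real code, the energy deposited by the resulting secondaries) is what turns this sampling loop into a dose profile.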

  1. Diagnostic x-ray dosimetry using Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Ioppolo, J. L.; Price, R. I.; Tuchyna, T.; Buckley, C. E.

    2002-05-01

An Electron Gamma Shower version 4 (EGS4) based user code was developed to simulate the absorbed dose in humans during routine diagnostic radiological procedures. Measurements of absorbed dose using thermoluminescent dosimeters (TLDs) were compared directly with EGS4 simulations of absorbed dose in homogeneous, heterogeneous and anthropomorphic phantoms. Realistic voxel-based models characterizing the geometry of the phantoms were used as input to the EGS4 code. The voxel geometry of the anthropomorphic Rando phantom was derived from a CT scan of Rando. The 100 kVp diagnostic energy x-ray spectra of the apparatus used to irradiate the phantoms were measured, and provided as input to the EGS4 code. The TLDs were placed at evenly spaced points symmetrically about the central beam axis, which was perpendicular to the cathode-anode x-ray axis, at a number of depths. The TLD measurements in the homogeneous and heterogeneous phantoms were on average within 7% of the values calculated by EGS4. Estimates of effective dose with errors of less than 10% required fewer photon histories (1 × 10^7) than the calculation of dose profiles (1 × 10^9). The EGS4 code was able to predict absorbed dose satisfactorily, and thereby provides an instrument for reducing the effective dose imparted to patients and staff during radiological investigations.

  2. Comparison of a 3-D GPU-Assisted Maxwell Code and Ray Tracing for Reflectometry on ITER

    NASA Astrophysics Data System (ADS)

    Gady, Sarah; Kubota, Shigeyuki; Johnson, Irena

    2015-11-01

Electromagnetic wave propagation and scattering in magnetized plasmas are important diagnostics for high temperature plasmas. 1-D and 2-D full-wave codes are standard tools for measurements of the electron density profile and fluctuations; however, ray tracing results have shown that beam propagation in tokamak plasmas is inherently a 3-D problem. The GPU-Assisted Maxwell Code utilizes the FDTD (Finite-Difference Time-Domain) method for solving the Maxwell equations with the cold plasma approximation in a 3-D geometry. Parallel processing with GPGPU (General-Purpose computing on Graphics Processing Units) is used to accelerate the computation. Previously, we reported on initial comparisons of the code results to 1-D numerical and analytical solutions, where the size of the computational grid was limited by the on-board memory of the GPU. In the current study, this limitation is overcome by using domain decomposition and an additional GPU. As a practical application, this code is used to study the current design of the ITER Low Field Side Reflectometer (LSFR) for the Equatorial Port Plug 11 (EPP11). A detailed examination of Gaussian beam propagation in the ITER edge plasma will be presented, as well as comparisons with ray tracing. This work was made possible by funding from the Department of Energy for the Summer Undergraduate Laboratory Internship (SULI) program. This work is supported by US DOE Contract Nos. DE-AC02-09CH11466 and DE-FG02-99-ER54527.
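
The FDTD method at the heart of the code can be sketched in one dimension for vacuum (the full code solves the 3-D Maxwell equations with the cold-plasma current; the grid size and pulse parameters below are arbitrary illustrative choices):

```python
import numpy as np

# Minimal 1-D vacuum FDTD (Yee scheme) in normalized units, Courant number 1.
n_cells, n_steps = 400, 120
ez = np.zeros(n_cells)
hy = np.zeros(n_cells)

# Initial Gaussian pulse in Ez only: it splits into two counter-propagating
# halves, each travelling one cell per step at Courant number 1.
x = np.arange(n_cells)
ez[:] = np.exp(-((x - 200) / 10.0) ** 2)

for _ in range(n_steps):
    hy[:-1] += ez[1:] - ez[:-1]       # update H from the curl of E
    ez[1:] += hy[1:] - hy[:-1]        # update E from the curl of H

# Peak of the right-going half, near cell 200 + n_steps = 320.
print(int(np.argmax(ez[250:])) + 250)
```

Adding the plasma response amounts to coupling a current-density update (driven by the local electron density and magnetic field) into the E-field step, which is where the cold plasma approximation enters.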

  3. Code-division-multiplexed readout of large arrays of TES microcalorimeters

    NASA Astrophysics Data System (ADS)

    Morgan, K. M.; Alpert, B. K.; Bennett, D. A.; Denison, E. V.; Doriese, W. B.; Fowler, J. W.; Gard, J. D.; Hilton, G. C.; Irwin, K. D.; Joe, Y. I.; O'Neil, G. C.; Reintsema, C. D.; Schmidt, D. R.; Ullom, J. N.; Swetz, D. S.

    2016-09-01

    Code-division multiplexing (CDM) offers a path to reading out large arrays of transition edge sensor (TES) X-ray microcalorimeters with excellent energy and timing resolution. We demonstrate the readout of X-ray TESs with a 32-channel flux-summed code-division multiplexing circuit based on superconducting quantum interference device (SQUID) amplifiers. The best detector has energy resolution of 2.28 ± 0.12 eV FWHM at 5.9 keV and the array has mean energy resolution of 2.77 ± 0.02 eV over 30 working sensors. The readout channels are sampled sequentially at 160 ns/row, for an effective sampling rate of 5.12 μs/channel. The SQUID amplifiers have a measured flux noise of 0.17 μΦ0/√Hz (non-multiplexed, referred to the first stage SQUID). The multiplexed noise level and signal slew rate are sufficient to allow readout of more than 40 pixels per column, making CDM compatible with requirements outlined for future space missions. Additionally, because the modulated data from the 32 SQUID readout channels provide information on each X-ray event at the row rate, our CDM architecture allows determination of the arrival time of an X-ray event to within 275 ns FWHM with potential benefits in experiments that require detection of near-coincident events.
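
Flux-summed CDM can be illustrated with orthogonal Walsh codes: each row-time sample measures a code-weighted sum of all detectors in a column, and multiplying by the transposed code matrix demodulates the individual signals. A toy sketch with invented signal values (real systems must also contend with SQUID noise and slew-rate limits):

```python
import numpy as np

def hadamard(n):
    """Sylvester construction of an n x n Hadamard (Walsh) matrix."""
    h = np.array([[1]])
    while h.shape[0] < n:
        h = np.block([[h, h], [h, -h]])
    return h

n_rows = 8                       # detectors per readout column (toy size)
W = hadamard(n_rows)             # +/-1 Walsh codes, one row per sample time

signals = np.array([0.3, -1.2, 0.0, 2.5, 0.7, -0.4, 1.1, 0.05])

# Each row-time sample is the code-weighted sum of all detector signals.
measured = W @ signals

# Demodulation: Walsh codes are orthogonal, so W^T W = n * I.
recovered = (W.T @ measured) / n_rows
print(np.allclose(recovered, signals))   # True
```

Because every sample contains flux from every detector, each pixel is effectively observed at the full row rate, which is what enables the sub-microsecond arrival-time estimates quoted above.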

  4. TIM, a ray-tracing program for METATOY research and its dissemination

    NASA Astrophysics Data System (ADS)

    Lambert, Dean; Hamilton, Alasdair C.; Constable, George; Snehanshu, Harsh; Talati, Sharvil; Courtial, Johannes

    2012-03-01

TIM (The Interactive METATOY) is a ray-tracing program specifically tailored towards our research in METATOYs, which are optical components that appear to be able to create wave-optically forbidden light-ray fields. For this reason, TIM possesses features not found in other ray-tracing programs. TIM can either be used interactively or by modifying the openly available source code; in both cases, it can easily be run as an applet embedded in a web page. Here we describe the basic structure of TIM's source code and how to extend it, and we give examples of how we have used TIM in our own research. Program summary Program title: TIM Catalogue identifier: AEKY_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEKY_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public License No. of lines in distributed program, including test data, etc.: 124 478 No. of bytes in distributed program, including test data, etc.: 4 120 052 Distribution format: tar.gz Programming language: Java Computer: Any computer capable of running the Java Virtual Machine (JVM) 1.6 Operating system: Any; developed under Mac OS X Version 10.6 RAM: Typically 145 MB (interactive version running under Mac OS X Version 10.6) Classification: 14, 18 External routines: JAMA [1] (source code included) Nature of problem: Visualisation of scenes that include scene objects that create wave-optically forbidden light-ray fields. Solution method: Ray tracing. Unusual features: Specifically designed to visualise wave-optically forbidden light-ray fields; can visualise ray trajectories; can visualise geometric optic transformations; can create anaglyphs (for viewing with coloured "3D glasses") and random-dot autostereograms of the scene; integrable into web pages. Running time: Problem-dependent; typically seconds for a simple scene.
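
The basic operation of any ray tracer, TIM included, is intersecting rays with scene primitives. A minimal ray-sphere intersection sketch (not taken from TIM's Java source) is:

```python
import numpy as np

def ray_sphere(origin, direction, center, radius):
    """Distance along a normalized ray to its first hit with a sphere,
    or None if the ray misses (solves the quadratic |o + t*d - c| = r)."""
    oc = origin - center
    b = np.dot(oc, direction)
    disc = b * b - (np.dot(oc, oc) - radius ** 2)
    if disc < 0:
        return None                    # ray misses the sphere
    t = -b - np.sqrt(disc)             # nearer of the two roots
    return t if t > 0 else None

o = np.array([0.0, 0.0, -5.0])
d = np.array([0.0, 0.0, 1.0])
print(ray_sphere(o, d, np.array([0.0, 0.0, 0.0]), 1.0))  # → 4.0
```

What makes a METATOY visualisation different is not the intersection step but the surface response: the outgoing ray direction may be remapped in ways no wave-optical field could support.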

  5. Studies of auroral X-ray imaging from high altitude spacecraft

    NASA Technical Reports Server (NTRS)

    Mckenzie, D. L.; Mizera, P. F.; Rice, C. J.

    1980-01-01

Results of a study of techniques for imaging the aurora from a high altitude satellite at X-ray wavelengths are summarized. The X-ray observations allow the straightforward derivation of the primary auroral X-ray spectrum and can be made at all local times, day and night. Five candidate imaging systems are identified: X-ray telescope, multiple pinhole camera, coded aperture, rastered collimator, and imaging collimator. Examples of each are specified, subject to common weight and size limits which allow them to be intercompared. The imaging ability of each system is tested using a wide variety of sample spectra which are based on previous satellite observations. The study shows that the pinhole camera and coded aperture are both good auroral imaging systems. The two collimated detectors are significantly less sensitive. The X-ray telescope provides better image quality than the other systems in almost all cases, but a limitation to energies below about 4 keV prevents this system from providing the spectral data essential to deriving electron spectra, energy input to the atmosphere, and atmospheric densities and conductivities. The orbit selection requires a tradeoff between spatial resolution and duty cycle.

  6. Detection of Explosive Devices using X-ray Backscatter Radiation

    NASA Astrophysics Data System (ADS)

    Faust, Anthony A.

    2002-09-01

It is our goal to develop a coded-aperture-based X-ray backscatter imaging detector that will provide sufficient speed, contrast and spatial resolution to detect antipersonnel landmines and improvised explosive devices (IEDs). While our final objective is to field a hand-held detector, we have currently constrained ourselves to a design that can be fielded on a small robotic platform. Coded aperture imaging has been used by the observational gamma-ray astronomy community for a number of years; however, it is recent advances in the field of medical nuclear imaging that have allowed the technique to be applied in a backscatter scenario. In addition, driven by requirements in medical applications, advances in X-ray detection are continually being made, and detectors are now being produced that are faster, cheaper and lighter than those of only a decade ago. With these advances, a coded aperture hand-held imaging system has only recently become a possibility. This paper begins with an introduction to the technique, identifies recent advances which have made this approach possible, presents a simulated example case, and concludes with a discussion of future work.

  7. The potential of detecting intermediate-scale biomass and canopy interception in a coniferous forest using cosmic-ray neutron intensity measurements and neutron transport modeling

    NASA Astrophysics Data System (ADS)

    Andreasen, M.; Looms, M. C.; Bogena, H. R.; Desilets, D.; Zreda, M. G.; Sonnenborg, T. O.; Jensen, K. H.

    2014-12-01

The water stored in the various compartments of the terrestrial ecosystem (in snow, canopy interception, soil and litter) controls the exchange of water and energy between the land surface and the atmosphere. Measurements of the water stored within these pools are therefore critical for the prediction of, e.g., evapotranspiration and groundwater recharge. Detection of cosmic-ray neutron intensity is a novel non-invasive method for the continuous quantification of soil moisture at an intermediate scale. The footprint of the cosmic-ray neutron probe is a hemisphere with a radius of a few hectometers, probing subsurface depths of 10-70 cm depending on wetness. The cosmic-ray neutron method thus offers measurements at a scale between point-scale measurements and large-scale satellite retrievals. The cosmic-ray neutron intensity is inversely correlated with the hydrogen stored within the footprint. Overall, soil moisture represents the largest pool of hydrogen, and changes in soil moisture clearly affect the cosmic-ray neutron signal. However, the neutron intensity is also sensitive to variations of hydrogen in snow, canopy interception and biomass, offering the potential to determine the water content of such pools from the signal. In this study we tested the potential of determining canopy interception and biomass using cosmic-ray neutron intensity measurements within the framework of the Danish Hydrologic Observatory (HOBE) and the Terrestrial Environmental Observatories (TERENO). Continuous measurements at the ground and canopy levels, along with profile measurements, were conducted at towers at forest field sites.
Field experiments, including shielding the cosmic-ray neutron probes with cadmium foil (to remove lower-energy neutrons) and measuring reference intensity rates under completely water-saturated conditions (on the sea close to the HOBE site), were further conducted to obtain a better understanding of the physics controlling cosmic-ray neutron transport and of the equipment used. Additionally, neutron transport modeling, using the extended version of the Monte Carlo N-Particle Transport Code, was conducted. The responses of the cosmic-ray neutron intensity to the reference condition and to different amounts of biomass, soil moisture and canopy interception were simulated and compared to the measurements.
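
The inverse relation between neutron intensity and hydrogen content is commonly parameterized with a Desilets-type calibration function, θ(N) = a0/(N/N0 − a1) − a2. The coefficient values below are the widely quoted literature defaults, not calibration results from this study, and N0 must be determined per site:

```python
def soil_moisture(N, N0, a0=0.0808, a1=0.372, a2=0.115):
    """Volumetric soil moisture from a neutron count rate N, given the
    count rate N0 over dry soil. Coefficients are the commonly quoted
    literature defaults (illustrative, not site-specific)."""
    return a0 / (N / N0 - a1) - a2

# Counts drop as the footprint gets wetter:
print(round(soil_moisture(0.85 * 1000, 1000), 3))  # → 0.054
print(round(soil_moisture(0.70 * 1000, 1000), 3))  # → 0.131
```

Hydrogen held in biomass, snow or intercepted water shifts N in the same direction as soil moisture, which is exactly why the shielding and saturated-reference experiments above are needed to separate the pools.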

  8. Diagnosing and Mapping Pulmonary Emphysema on X-Ray Projection Images: Incremental Value of Grating-Based X-Ray Dark-Field Imaging

    PubMed Central

    Meinel, Felix G.; Schwab, Felix; Schleede, Simone; Bech, Martin; Herzen, Julia; Achterhold, Klaus; Auweter, Sigrid; Bamberg, Fabian; Yildirim, Ali Ö.; Bohla, Alexander; Eickelberg, Oliver; Loewen, Rod; Gifford, Martin; Ruth, Ronald; Reiser, Maximilian F.; Pfeiffer, Franz; Nikolaou, Konstantin

    2013-01-01

Purpose: To assess whether grating-based X-ray dark-field imaging can increase the sensitivity of X-ray projection images in the diagnosis of pulmonary emphysema and allow for a more accurate assessment of emphysema distribution. Materials and Methods: Lungs from three mice with pulmonary emphysema and three healthy mice were imaged ex vivo using a laser-driven compact synchrotron X-ray source. Median signal intensities of transmission (T), dark-field (V) and a combined parameter (normalized scatter) were compared between the emphysema and control groups. To determine the diagnostic value of each parameter in differentiating between healthy and emphysematous lung tissue, a receiver-operating-characteristic (ROC) curve analysis was performed on both a per-pixel and a per-individual basis. Parametric maps of emphysema distribution were generated using the transmission, dark-field and normalized scatter signals and correlated with histopathology. Results: Transmission values relative to water were higher for emphysematous lungs than for control lungs (1.11 vs. 1.06, p<0.001). There was no difference in median dark-field signal intensities between the two groups (0.66 vs. 0.66). Median normalized scatter was significantly lower in the emphysematous lungs compared to controls (4.9 vs. 10.8, p<0.001), and was the best parameter for differentiating healthy from emphysematous lung tissue. In a per-pixel analysis, the area under the ROC curve (AUC) for the normalized scatter value was significantly higher than for the transmission (0.86 vs. 0.78, p<0.001) and dark-field values (0.86 vs. 0.52, p<0.001) alone. Normalized scatter showed very high sensitivity over a wide range of specificity values (94% sensitivity at 75% specificity). Using the normalized scatter signal to display the regional distribution of emphysema provides color-coded parametric maps, which show the best correlation with histopathology. 
Conclusion: In a murine model, the complementary information provided by X-ray transmission and dark-field images adds incremental diagnostic value in detecting pulmonary emphysema and visualizing its regional distribution as compared to conventional X-ray projections. PMID:23555692
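
The per-pixel AUC reported above is equivalent to the Mann-Whitney statistic: the probability that a randomly chosen emphysema pixel scores higher than a randomly chosen healthy pixel. A sketch with invented pixel values (note the sign flip, since emphysema lowers the normalized scatter):

```python
import numpy as np

def auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney statistic: the
    probability that a random positive scores higher than a random
    negative (ties count one half)."""
    pos = np.asarray(scores_pos, dtype=float)[:, None]
    neg = np.asarray(scores_neg, dtype=float)[None, :]
    return ((pos > neg).sum() + 0.5 * (pos == neg).sum()) / (pos.size * neg.size)

# Toy per-pixel values: lower normalized scatter indicates emphysema,
# so the negated value serves as the "emphysema score".
emphysema = -np.array([4.2, 5.1, 4.9, 6.0])    # normalized scatter ~ 5
healthy = -np.array([10.3, 11.2, 9.8, 10.9])   # normalized scatter ~ 11
print(auc(emphysema, healthy))                  # → 1.0 (perfect separation)
```

An AUC of 0.86, as found for normalized scatter, means a random emphysema pixel outranks a random healthy pixel 86% of the time.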

  9. Diagnosing and mapping pulmonary emphysema on X-ray projection images: incremental value of grating-based X-ray dark-field imaging.

    PubMed

    Meinel, Felix G; Schwab, Felix; Schleede, Simone; Bech, Martin; Herzen, Julia; Achterhold, Klaus; Auweter, Sigrid; Bamberg, Fabian; Yildirim, Ali Ö; Bohla, Alexander; Eickelberg, Oliver; Loewen, Rod; Gifford, Martin; Ruth, Ronald; Reiser, Maximilian F; Pfeiffer, Franz; Nikolaou, Konstantin

    2013-01-01

To assess whether grating-based X-ray dark-field imaging can increase the sensitivity of X-ray projection images in the diagnosis of pulmonary emphysema and allow for a more accurate assessment of emphysema distribution. Lungs from three mice with pulmonary emphysema and three healthy mice were imaged ex vivo using a laser-driven compact synchrotron X-ray source. Median signal intensities of transmission (T), dark-field (V) and a combined parameter (normalized scatter) were compared between the emphysema and control groups. To determine the diagnostic value of each parameter in differentiating between healthy and emphysematous lung tissue, a receiver-operating-characteristic (ROC) curve analysis was performed on both a per-pixel and a per-individual basis. Parametric maps of emphysema distribution were generated using the transmission, dark-field and normalized scatter signals and correlated with histopathology. Transmission values relative to water were higher for emphysematous lungs than for control lungs (1.11 vs. 1.06, p<0.001). There was no difference in median dark-field signal intensities between the two groups (0.66 vs. 0.66). Median normalized scatter was significantly lower in the emphysematous lungs compared to controls (4.9 vs. 10.8, p<0.001), and was the best parameter for differentiating healthy from emphysematous lung tissue. In a per-pixel analysis, the area under the ROC curve (AUC) for the normalized scatter value was significantly higher than for the transmission (0.86 vs. 0.78, p<0.001) and dark-field values (0.86 vs. 0.52, p<0.001) alone. Normalized scatter showed very high sensitivity over a wide range of specificity values (94% sensitivity at 75% specificity). Using the normalized scatter signal to display the regional distribution of emphysema provides color-coded parametric maps, which show the best correlation with histopathology. 
In a murine model, the complementary information provided by X-ray transmission and dark-field images adds incremental diagnostic value in detecting pulmonary emphysema and visualizing its regional distribution as compared to conventional X-ray projections.
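
The per-pixel ROC analysis described above can be sketched with a rank-based AUC estimate. The group medians (4.9 vs. 10.8) come from the abstract, while the per-pixel spread (4.0) and pixel counts are invented for illustration:

```python
import numpy as np

def roc_auc(scores_pos, scores_neg):
    """AUC via the Mann-Whitney U statistic: the probability that a
    positive-class score exceeds a negative-class one (ties ignored)."""
    scores = np.concatenate([scores_pos, scores_neg])
    ranks = np.argsort(np.argsort(scores)) + 1.0   # 1-based ranks
    n_pos, n_neg = len(scores_pos), len(scores_neg)
    r_pos = ranks[:n_pos].sum()
    return (r_pos - n_pos * (n_pos + 1) / 2.0) / (n_pos * n_neg)

rng = np.random.default_rng(0)
# Hypothetical per-pixel normalized-scatter values; only the medians are
# taken from the abstract, the spread and sample sizes are assumptions.
scatter_emph = rng.normal(4.9, 4.0, 5000)
scatter_ctrl = rng.normal(10.8, 4.0, 5000)
# Emphysema lowers the scatter signal, so negate it to score emphysema as positive.
auc = roc_auc(-scatter_emph, -scatter_ctrl)
print(f"per-pixel AUC ~ {auc:.2f}")
```

With this invented spread the AUC lands near the 0.86 reported for normalized scatter; a wider per-pixel spread would pull it toward 0.5.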

  10. Evolutionary Novelty in a Butterfly Wing Pattern through Enhancer Shuffling

    PubMed Central

    Pardo-Diaz, Carolina; Hanly, Joseph J.; Martin, Simon H.; Mallet, James; Dasmahapatra, Kanchon K.; Salazar, Camilo; Joron, Mathieu; Nadeau, Nicola; McMillan, W. Owen; Jiggins, Chris D.

    2016-01-01

    An important goal in evolutionary biology is to understand the genetic changes underlying novel morphological structures. We investigated the origins of a complex wing pattern found among Amazonian Heliconius butterflies. Genome sequence data from 142 individuals across 17 species identified narrow regions associated with two distinct red colour pattern elements, dennis and ray. We hypothesise that these modules in non-coding sequence represent distinct cis-regulatory loci that control expression of the transcription factor optix, which in turn controls red pattern variation across Heliconius. Phylogenetic analysis of the two elements demonstrated that they have distinct evolutionary histories and that novel adaptive morphological variation was created by shuffling these cis-regulatory modules through recombination between divergent lineages. In addition, recombination of modules into different combinations within species further contributes to diversity. Analysis of the timing of diversification in these two regions supports the hypothesis of introgression moving regulatory modules between species, rather than shared ancestral variation. The dennis phenotype introgressed into Heliconius melpomene at about the same time that ray originated in this group, while ray introgressed back into H. elevatus much more recently. We show that shuffling of existing enhancer elements both within and between species provides a mechanism for rapid diversification and generation of novel morphological combinations during adaptive radiation. PMID:26771987

  11. Evolutionary Novelty in a Butterfly Wing Pattern through Enhancer Shuffling.

    PubMed

    Wallbank, Richard W R; Baxter, Simon W; Pardo-Diaz, Carolina; Hanly, Joseph J; Martin, Simon H; Mallet, James; Dasmahapatra, Kanchon K; Salazar, Camilo; Joron, Mathieu; Nadeau, Nicola; McMillan, W Owen; Jiggins, Chris D

    2016-01-01

    An important goal in evolutionary biology is to understand the genetic changes underlying novel morphological structures. We investigated the origins of a complex wing pattern found among Amazonian Heliconius butterflies. Genome sequence data from 142 individuals across 17 species identified narrow regions associated with two distinct red colour pattern elements, dennis and ray. We hypothesise that these modules in non-coding sequence represent distinct cis-regulatory loci that control expression of the transcription factor optix, which in turn controls red pattern variation across Heliconius. Phylogenetic analysis of the two elements demonstrated that they have distinct evolutionary histories and that novel adaptive morphological variation was created by shuffling these cis-regulatory modules through recombination between divergent lineages. In addition, recombination of modules into different combinations within species further contributes to diversity. Analysis of the timing of diversification in these two regions supports the hypothesis of introgression moving regulatory modules between species, rather than shared ancestral variation. The dennis phenotype introgressed into Heliconius melpomene at about the same time that ray originated in this group, while ray introgressed back into H. elevatus much more recently. We show that shuffling of existing enhancer elements both within and between species provides a mechanism for rapid diversification and generation of novel morphological combinations during adaptive radiation.

  12. Generation of bright attosecond x-ray pulse trains via Thomson scattering from laser-plasma accelerators.

    PubMed

    Luo, W; Yu, T P; Chen, M; Song, Y M; Zhu, Z C; Ma, Y Y; Zhuo, H B

    2014-12-29

    Generation of attosecond x-ray pulses is attracting increasing attention within the advanced light source user community due to its potentially wide applications. Here we propose an all-optical scheme to generate bright, attosecond hard x-ray pulse trains by Thomson backscattering of similarly structured electron beams produced in a vacuum channel by a tightly focused laser pulse. Design parameters for a proof-of-concept experiment are presented and demonstrated by using a particle-in-cell code and a four-dimensional laser-Compton scattering simulation code to model both the laser-based electron acceleration and Thomson scattering processes. Trains of 200 attosecond duration hard x-ray pulses with stable longitudinal spacing, photon energies approaching 50 keV, and maximum achievable peak brightness up to 10^20 photons/s/mm^2/mrad^2/0.1%BW for each micro-bunch are observed. The suggested physical scheme for attosecond x-ray pulse train generation may directly access the fastest time scales relevant to electron dynamics in atoms, molecules and materials.
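
As a rough consistency check on the quoted numbers (not part of the study itself), the wavelength of a 50 keV photon and the number of field cycles in a 200 as pulse follow directly from E = hc/λ:

```python
h_eV_s = 4.135667e-15        # Planck constant, eV*s
c = 2.99792458e8             # speed of light, m/s
E_photon = 50e3              # 50 keV photon energy, in eV

wavelength = h_eV_s * c / E_photon     # ~2.5e-11 m (~0.25 angstrom): hard x-ray
n_cycles = 200e-18 / (wavelength / c)  # field cycles within a 200 as pulse
print(f"{wavelength:.2e} m, ~{n_cycles:.0f} cycles per pulse")
```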

  13. Quantitative analysis of biomedical samples using synchrotron radiation microbeams

    NASA Astrophysics Data System (ADS)

    Ektessabi, Ali; Shikine, Shunsuke; Yoshida, Sohei

    2001-07-01

    X-ray fluorescence (XRF) using a synchrotron radiation (SR) microbeam was applied to investigate distributions and concentrations of elements in single neurons of patients with neurodegenerative diseases. In this paper we introduce a computer code that has been developed to quantify the trace elements and matrix elements at the single cell level. This computer code has been used in studies of several important neurodegenerative diseases such as Alzheimer's disease (AD), Parkinson's disease (PD) and parkinsonism-dementia complex (PDC), as well as in basic biological experiments to determine the elemental changes in cells due to incorporation of foreign metal elements. The substantia nigra (SN) tissue obtained from the autopsy specimens of patients with Guamanian parkinsonism-dementia complex (PDC) and control cases were examined. Quantitative XRF analysis showed that neuromelanin granules of Parkinsonian SN contained higher levels of Fe than those of the control. The concentrations were in the ranges of 2300-3100 ppm and 2000-2400 ppm, respectively. In contrast, Zn and Ni in neuromelanin granules of SN tissue from the PDC case were lower than those of the control. In particular, Zn was less than 40 ppm in SN tissue from the PDC case while it was 560-810 ppm in the control. These changes are considered to be closely related to the neurodegeneration and cell death.

  14. Global Coordinates and Exact Aberration Calculations Applied to Physical Optics Modeling of Complex Optical Systems

    NASA Astrophysics Data System (ADS)

    Lawrence, G.; Barnard, C.; Viswanathan, V.

    1986-11-01

    Historically, wave optics computer codes have been paraxial in nature. Folded systems could be modeled by "unfolding" the optical system. Calculation of optical aberrations is, in general, left for the analyst to do with off-line codes. While such paraxial codes were adequate for the simpler systems being studied 10 years ago, current problems such as phased arrays, ring resonators, coupled resonators, and grazing incidence optics require a major advance in analytical capability. This paper describes extension of the physical optics codes GLAD and GLAD V to include a global coordinate system and exact ray aberration calculations. The global coordinate system allows components to be positioned and rotated arbitrarily. Exact aberrations are calculated for components in aligned or misaligned configurations by using ray tracing to compute optical path differences and diffraction propagation. Optical path lengths between components and beam rotations in complex mirror systems are calculated accurately so that coherent interactions in phased arrays and coupled devices may be treated correctly.
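
A minimal sketch of the global-coordinate ray handling described above: a component is positioned and rotated arbitrarily in a world frame, and exact ray tracing yields the geometric path length and reflected direction. The 45-degree fold mirror and the distances are hypothetical, and this is not GLAD's actual interface:

```python
import numpy as np

def rot_y(theta):
    """Rotation matrix about the global y-axis (theta in radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def reflect(o, d, p, n):
    """Exact trace of a ray (origin o, unit direction d) to the plane
    through point p with unit normal n, followed by specular reflection."""
    t = np.dot(p - o, n) / np.dot(d, n)  # geometric path length to the mirror
    hit = o + t * d                      # intersection in global coordinates
    d_out = d - 2.0 * np.dot(d, n) * n   # law of reflection
    return hit, d_out, t

# Hypothetical fold mirror: 10 units downstream, tilted 45 degrees about y.
p = np.array([0.0, 0.0, 10.0])
n = rot_y(np.radians(45.0)) @ np.array([0.0, 0.0, 1.0])  # mirror normal, global frame
hit, d_out, path = reflect(np.zeros(3), np.array([0.0, 0.0, 1.0]), p, n)
print(hit, d_out, path)  # beam folded by 90 degrees after a 10-unit path
```

Accumulating t across components gives the optical path lengths needed for coherent phasing of arrays; misalignments are modeled by perturbing p and the rotation angle.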

  15. Energy transport in plasmas produced by a high brightness krypton fluoride laser focused to a line

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Al-Hadithi, Y.; Tallents, G.J.; Zhang, J.

    A high brightness krypton fluoride Raman laser (wavelength 0.268 µm) generating 0.3 TW, 12 ps pulses with 20 µrad beam divergence and a prepulse of less than 10^-10 has been focused to produce a 10 µm wide line focus (irradiances ~0.8-4×10^15 W cm^-2) on plastic targets with a diagnostic sodium fluoride (NaF) layer buried within the target. Axial and lateral transport of energy has been measured by analysis of x-ray images of the line focus and from x-ray spectra emitted by the layer of NaF with varying overlay thicknesses. It is shown that the ratio of the distance between the critical density surface and the ablation surface to the laser focal width controls lateral transport in a similar manner as for previous spot focus experiments. The measured axial energy transport is compared to MEDUSA [J. P. Christiansen, D. E. T. F. Ashby, and K. V. Roberts, Comput. Phys. Commun. 7, 271 (1974)] one-dimensional hydrodynamic code simulations with an average-atom post-processor for predicting spectral line intensities. An energy absorption of ~10% in the code gives agreement with the experimental axial penetration. Various measured line ratios of hydrogen- and helium-like Na and F are investigated as temperature diagnostics in the NaF layer using the RATION [R. W. Lee, B. L. Whitten, and R. E. Strout, J. Quant. Spectrosc. Radiat. Transfer 32, 91 (1984)] code.

  16. MCNP modelling of scintillation-detector gamma-ray spectra from natural radionuclides.

    PubMed

    Hendriks, P H G M; Maucec, M; de Meijer, R J

    2002-09-01

    Gamma-ray spectra of natural radionuclides are simulated for a BGO detector in a borehole geometry using the Monte Carlo code MCNP. All gamma-ray emissions of the decay of 40K and the series of 232Th and 238U are used to describe the source. A procedure is proposed which excludes the time-consuming electron tracking in less relevant areas of the geometry. The simulated gamma-ray spectra are benchmarked against laboratory data.

  17. Ray Effect Mitigation Through Reference Frame Rotation

    DOE PAGES

    Tencer, John

    2016-05-01

    The discrete ordinates method is a popular and versatile technique for solving the radiative transport equation; a major drawback is the presence of ray effects. Mitigation of ray effects can yield significantly more accurate results and enhanced numerical stability for combined-mode codes. Moreover, when ray effects are present, the solution is highly dependent upon the relative orientation of the geometry and the global reference frame, which is an undesirable property. A novel ray-effect mitigation technique is proposed: averaging the computed solutions obtained for various reference frame orientations.
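
The averaging idea can be illustrated with a toy 2-D angular quadrature rather than a full transport solve: a coarse ordinate set integrates a forward-peaked intensity badly (the regime where ray effects appear), while averaging the same estimate over random reference-frame rotations recovers the reference value. All functions and parameters below are illustrative, not the paper's solver:

```python
import numpy as np

def quadrature_dirs(n):
    """Toy 2-D ordinate set: n equally spaced azimuthal directions."""
    phi = 2.0 * np.pi * np.arange(n) / n
    return np.stack([np.cos(phi), np.sin(phi)], axis=1)

def angular_average(f, dirs):
    return f(dirs).mean()

def f(dirs):
    """Sharply forward-peaked angular intensity along +x."""
    return np.exp(20.0 * (dirs @ np.array([1.0, 0.0]) - 1.0))

dirs = quadrature_dirs(8)              # deliberately coarse ordinate set
single = angular_average(f, dirs)      # one fixed reference frame

rng = np.random.default_rng(1)
estimates = []
for _ in range(200):                   # average over random frame rotations
    a = rng.uniform(0.0, 2.0 * np.pi)
    R = np.array([[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]])
    estimates.append(angular_average(f, dirs @ R.T))
rotated = float(np.mean(estimates))

exact = angular_average(f, quadrature_dirs(4096))  # dense reference quadrature
print(abs(single - exact), abs(rotated - exact))   # rotation averaging wins
```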

  18. CAFNA®, coded aperture fast neutron analysis for contraband detection: Preliminary results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, L.; Lanza, R.C.

    1999-12-01

    The authors have developed a near field coded aperture imaging system for use with fast neutron techniques as a tool for the detection of contraband and hidden explosives through nuclear elemental analysis. The technique relies on the prompt gamma rays produced by fast neutron interactions with the object being examined. The position of the nuclear elements is determined by the location of the gamma emitters. For existing fast neutron techniques, in Pulsed Fast Neutron Analysis (PFNA), neutrons are used with very low efficiency; in Fast Neutron Analysis (FNA), the sensitivity for detection of the signature gamma rays is very low. For the Coded Aperture Fast Neutron Analysis (CAFNA®) the authors have developed, the efficiency for both using the probing fast neutrons and detecting the prompt gamma rays is high. For a probed volume of n^3 volume elements (voxels) in a cube of n resolution elements on a side, they can compare the sensitivity with other neutron probing techniques. As compared to PFNA, the improvement in neutron utilization is n^2, where the total number of voxels in the object being examined is n^3. Compared to FNA, the improvement in gamma-ray imaging is proportional to the total open area of the coded aperture plane; a typical value is n^2/2, where n^2 is the number of total detector resolution elements or the number of pixels in an object layer. It should be noted that the actual signal-to-noise ratio of a system depends also on the nature and distribution of background events, and this comparison may somewhat reduce the effective sensitivity of CAFNA. They have performed analysis, Monte Carlo simulations, and preliminary experiments using low and high energy gamma-ray sources. The results show that a high sensitivity 3-D contraband imaging and detection system can be realized by using CAFNA.
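
The scaling comparisons quoted above amount to simple arithmetic; for a hypothetical resolution of n = 64 elements per side:

```python
n = 64                      # hypothetical resolution elements per side
voxels = n ** 3             # total probed voxels in the n x n x n cube
gain_vs_pfna = n ** 2       # neutron-utilization improvement over PFNA
gain_vs_fna = n ** 2 // 2   # gamma-imaging gain ~ open area of a 50% mask
print(voxels, gain_vs_pfna, gain_vs_fna)  # 262144 4096 2048
```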

  19. The "Sigmoid Sniffer" and the "Advanced Automated Solar Filament Detection and Characterization Code" Modules

    NASA Astrophysics Data System (ADS)

    Raouafi, Noureddine; Bernasconi, P. N.; Georgoulis, M. K.

    2010-05-01

    We present two pattern recognition algorithms, the "Sigmoid Sniffer" and the "Advanced Automated Solar Filament Detection and Characterization Code," that are among the Feature Finding modules of the Solar Dynamics Observatory: 1) Coronal sigmoids visible in X-rays and the EUV are the result of highly twisted magnetic fields. They can occur anywhere on the solar disk and are closely related to solar eruptive activity (e.g., flares, CMEs). Their appearance typically signals imminent solar eruptions, so they can serve as a tool to forecast solar activity. Automatic X-ray sigmoid identification offers an unbiased way of detecting short-to-mid term CME precursors. The "Sigmoid Sniffer" module is capable of automatically detecting sigmoids in full-disk X-ray images and determining their chirality, as well as other characteristics. It uses multiple thresholds to identify persistent bright structures in a full-disk X-ray image of the Sun. We plan to apply the code to X-ray images from Hinode/XRT, as well as to SDO/AIA images. When implemented in a near real-time environment, the Sigmoid Sniffer could allow 3-7 day forecasts of CMEs and their potential to cause major geomagnetic storms. 2) The "Advanced Automated Solar Filament Detection and Characterization Code" aims to identify, classify, and track solar filaments in full-disk Hα images. The code can reliably identify filaments and determine their chirality and other relevant parameters such as filament area, length, and average orientation with respect to the equator. It is also capable of tracking the day-by-day evolution of filaments as they traverse the visible disk. The code was tested by analyzing daily Hα images taken at the Big Bear Solar Observatory from mid-2000 to early-2005. It identified and established the chirality of thousands of filaments without human intervention.
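
Multi-threshold detection of persistent bright structures, as the Sigmoid Sniffer is described as doing, can be sketched as a two-threshold (hysteresis) segmentation on a synthetic array. This toy version is an assumption about the general approach, not the module's actual algorithm:

```python
import numpy as np
from collections import deque

def hysteresis(img, t_low, t_high):
    """Keep pixels above t_low only if they are 4-connected to a bright
    seed above t_high; isolated moderately bright pixels are rejected."""
    low = img >= t_low
    keep = np.zeros(img.shape, dtype=bool)
    seeds = list(zip(*np.where(img >= t_high)))
    for r, c in seeds:
        keep[r, c] = True
    q = deque(seeds)
    while q:                              # breadth-first region growing
        r, c = q.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if (0 <= rr < img.shape[0] and 0 <= cc < img.shape[1]
                    and low[rr, cc] and not keep[rr, cc]):
                keep[rr, cc] = True
                q.append((rr, cc))
    return keep

img = np.array([[0, 0, 0, 0, 0],
                [0, 5, 9, 0, 0],
                [0, 5, 5, 0, 0],
                [0, 0, 0, 0, 6],
                [0, 0, 0, 0, 0]], dtype=float)
mask = hysteresis(img, t_low=4.0, t_high=8.0)
print(mask.sum())  # 4: the connected blob survives, the isolated pixel does not
```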

  20. X-ray Fluorescence Spectroscopy: the Potential of Astrophysics-developed Techniques

    NASA Astrophysics Data System (ADS)

    Elvis, M.; Allen, B.; Hong, J.; Grindlay, J.; Kraft, R.; Binzel, R. P.; Masterton, R.

    2012-12-01

    X-ray fluorescence from the surface of airless bodies has been studied since the Apollo X-ray fluorescence experiment mapped parts of the lunar surface in 1971-1972. That experiment used a collimated proportional counter with a resolving power of ~1 and a beam size of ~1 degree. Filters separated only the Mg, Al and Si lines. We review progress in X-ray detectors and imaging for astrophysics and show how these advances enable much more powerful use of X-ray fluorescence for the study of airless bodies. Astrophysics X-ray instrumentation has developed enormously since 1972. Low noise, high quantum efficiency X-ray CCDs have flown on ASCA, XMM-Newton, the Chandra X-ray Observatory, Swift and Suzaku, and are the workhorses of X-ray astronomy. They normally span 0.5 to ~8 keV with an energy resolution of ~100 eV. New developments in silicon-based detectors, especially individual pixel addressable devices such as CMOS detectors, can withstand many orders of magnitude more radiation than conventional CCDs before degradation. The capability of high read rates provides dynamic range and temporal resolution. Additionally, the rapid read rates minimize shot noise from thermal dark current and optical light. CMOS detectors can therefore run at warmer temperatures and with ultra-thin optical blocking filters. Thin OBFs mean near unity quantum efficiency below 1 keV, thus maximizing response at the C and O lines. X-ray imaging has advanced similarly far. Two types of imager are now available: specular reflection and coded apertures. X-ray mirrors have been flown on the Einstein Observatory, XMM-Newton, Chandra and others. However, as X-ray reflection only occurs at small (~1 degree) incidence angles, which then require long focal lengths (meters), mirrors are not usually practical for planetary missions. Moreover the field of view of X-ray mirrors is comparable to the incident angle, so they can only image relatively small regions.
More useful are coded-aperture imagers, which have flown on ART-P, Integral, and Swift. The shadow pattern from a 50% open mask allows the distribution of X-rays from a wide (tens of degrees) field of view to be imaged, although uniform emission presents difficulties. A version of a coded-aperture plus CCD detector for the study of airless bodies is being built for OSIRIS-REx as the student experiment REXIS. We will show the quality of the spectra that can be expected from this class of instrument.

  1. GRABGAM Analysis of Ultra-Low-Level HPGe Gamma Spectra

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Winn, W.G.

    The GRABGAM code has been used successfully for ultra-low level HPGe gamma spectrometry analysis since its development in 1985 at Savannah River Technology Center (SRTC). Although numerous gamma analysis codes existed at that time, reviews of institutional and commercial codes indicated that none addressed all features that were desired by SRTC. Furthermore, it was recognized that development of an in-house code would better facilitate future evolution of the code to address SRTC needs based on experience with low-level spectra. GRABGAM derives its name from Gamma Ray Analysis BASIC Generated At MCA/PC.

  2. From Pinholes to Black Holes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fenimore, Edward E.

    2014-10-06

    Pinhole photography has made major contributions to astrophysics through the use of “coded apertures”. Coded apertures were instrumental in locating gamma-ray bursts and proving that they originate in faraway galaxies, some from the birth of black holes from the first stars that formed just after the big bang.

  3. Measuring implosion velocities in experiments and simulations of laser-driven cylindrical implosions on the OMEGA laser

    NASA Astrophysics Data System (ADS)

    Hansen, E. C.; Barnak, D. H.; Betti, R.; Campbell, E. M.; Chang, P.-Y.; Davies, J. R.; Glebov, V. Yu; Knauer, J. P.; Peebles, J.; Regan, S. P.; Sefkow, A. B.

    2018-05-01

    Laser-driven magnetized liner inertial fusion (MagLIF) on OMEGA involves cylindrical implosions, a preheat beam, and an applied magnetic field. Initial experiments excluded the preheat beam and magnetic field to better characterize the implosion. X-ray self-emission as measured by framing cameras was used to determine the shell trajectory. The 1D code LILAC was used to model the central region of the implosion, and results were compared to 2D simulations from the HYDRA code. Post-processing of simulation output with SPECT3D and Yorick produced synthetic x-ray images that were used to compare the simulation results with the x-ray framing camera data. Quantitative analysis shows that higher measured neutron yields correlate with higher implosion velocities. The future goal is to further analyze the x-ray images to characterize the uniformity of the implosions and apply these analysis techniques to integrated laser-driven MagLIF shots to better understand the effects of preheat and the magnetic field.

  4. IonRayTrace: An HF Propagation Model for Communications and Radar Applications

    DTIC Science & Technology

    2014-12-01

    IonRayTrace models the impact of ionosphere variability on detection algorithms; modification of IonRayTrace's source code to include flexible gridding is described. Figures map plasma frequency (MHz) and ionospheric absorption (dB). IonRayTrace uses the … Ionosphere for its environmental background [3]; its operation is summarized briefly in Section 3 of the report.

  5. A final report to the Laboratory Directed Research and Development committee on Project 93-ERP-075: "X-ray laser propagation and coherence: Diagnosing fast-evolving, high-density laser plasmas using X-ray lasers"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wan, A.S.; Cauble, R.; Da Silva, L.B.

    1996-02-01

    This report summarizes the major accomplishments of this three-year Laboratory Directed Research and Development (LDRD) Exploratory Research Project (ERP) entitled "X-ray Laser Propagation and Coherence: Diagnosing Fast-evolving, High-density Laser Plasmas Using X-ray Lasers," tracking code 93-ERP-075. The most significant accomplishment of this project is the demonstration of a new laser plasma diagnostic: a soft x-ray Mach-Zehnder interferometer using a neonlike yttrium x-ray laser at 155 Å as the probe source. Detailed comparisons of absolute two-dimensional electron density profiles obtained from soft x-ray laser interferograms and profiles obtained from radiation hydrodynamics codes, such as LASNEX, will allow us to validate and benchmark complex numerical models used to study the physics of laser-plasma interactions. Thus the development of the soft x-ray interferometry technique provides a mechanism to probe the deficiencies of the numerical models and is an important tool for the high-energy-density physics and science-based stockpile stewardship programs. The authors have used the soft x-ray interferometer to study a number of high-density, fast-evolving, laser-produced plasmas, such as the dynamics of exploding foils and colliding plasmas. They are pursuing the application of the soft x-ray interferometer to study ICF-relevant plasmas, such as capsules and hohlraums, on the Nova 10-beam facility. They have also studied the development of enhanced-coherence, shorter-pulse-duration, and high-brightness x-ray lasers. The utilization of improved x-ray laser sources can ultimately enable them to obtain three-dimensional holographic images of laser-produced plasmas.

  6. Investigation of irradiation effects on highly integrated leading-edge electronic components of diagnostics and control systems for LHD deuterium operation

    NASA Astrophysics Data System (ADS)

    Ogawa, K.; Nishitani, T.; Isobe, M.; Murata, I.; Hatano, Y.; Matsuyama, S.; Nakanishi, H.; Mukai, K.; Sato, M.; Yokota, M.; Kobuchi, T.; Nishimura, T.; Osakabe, M.

    2017-08-01

    High-temperature and high-density plasmas are achieved by means of real-time control, fast diagnostics, and high-power heating systems. Those systems are precisely controlled via highly integrated electronic components, but can be seriously affected by radiation damage. Therefore, the effects of irradiation on currently used electronic components should be investigated for the control and measurement of Large Helical Device (LHD) deuterium plasmas. For the precise estimation of the radiation field in the LHD torus hall, the MCNP6 code is used with the cross-section library ENDF/B-VI. The geometry is modeled from the computer-aided design data. The dose on silicon, which is a major ingredient of electronic components, over nine years of LHD deuterium operation shows that the gamma-ray contribution is dominant. Neutron irradiation tests were performed at the OKTAVIAN facility at Osaka University and the Fast Neutron Laboratory at Tohoku University. Gamma-ray irradiation tests were performed at the Nagoya University Cobalt-60 irradiation facility. We found that there are Ethernet connection failures of programmable logic controller (PLC) modules due to neutron irradiation with a neutron flux of 3 × 10^6 cm^-2 s^-1. This neutron flux is equivalent to that expected at basement level in the LHD torus hall without a neutron shield. Most modules of the PLC are broken around a gamma-ray dose of 100 Gy. This is comparable with the dose in the LHD torus hall over nine years. If we consider the dose only, these components may survive more than nine years. For the safety of LHD operation, the electronic components in the torus hall have been rearranged.

  7. Calculations of the skyshine gamma-ray dose rates from independent spent fuel storage installations (ISFSI) under worst case accident conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pace, J.V. III; Cramer, S.N.; Knight, J.R.

    1980-09-01

    Calculations of the skyshine gamma-ray dose rates from three spent fuel storage pools under worst case accident conditions have been made using the discrete ordinates code DOT-IV and the Monte Carlo code MORSE and have been compared to those of two previous methods. The DNA 37N-21G group cross-section library was utilized in the calculations, together with the Claiborne-Trubey gamma-ray dose factors taken from the same library. Plots of all results are presented. It was found that the dose was a strong function of the iron thickness over the fuel assemblies, the initial angular distribution of the emitted radiation, and the photon source near the top of the assemblies. 16 refs., 11 figs., 7 tabs.

  8. Characterization of a hybrid target multi-keV x-ray source by a multi-parameter statistical analysis of titanium K-shell emission

    DOE PAGES

    Primout, M.; Babonneau, D.; Jacquet, L.; ...

    2015-11-10

    We studied the titanium K-shell emission spectra from multi-keV x-ray source experiments with hybrid targets on the OMEGA laser facility. Using the collisional-radiative TRANSPEC code, dedicated to K-shell spectroscopy, we reproduced the main features of the detailed spectra measured with the time-resolved MSPEC spectrometer. We developed a general method to infer the Ne, Te and Ti characteristics of the target plasma from the spectral analysis (ratio of integrated Lyman-α to Helium-α in-band emission and the peak amplitude of individual line ratios) of the multi-keV x-ray emission. Finally, these thermodynamic conditions are compared to those calculated independently by the radiation-hydrodynamics transport code FCI2.

  9. SKYSINE-II procedure: calculation of the effects of structure design on neutron, primary gamma-ray and secondary gamma-ray dose rates in air

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lampley, C.M.

    1979-01-01

    An updated version of the SKYSHINE Monte Carlo procedure has been developed. The new computer code, SKYSHINE-II, provides a substantial increase in versatility in that the program possesses the ability to address three types of point-isotropic radiation sources: (1) primary gamma rays, (2) neutrons, and (3) secondary gamma rays. In addition, the emitted radiation may now be characterized by an energy emission spectrum. A new energy-dependent atmospheric transmission data base, developed by Radiation Research Associates, Inc. for each of the three source types described above, supports these calculations. Most of the computational options present in the original program have been retained in the new version. Hence, the SKYSHINE-II computer code provides a versatile and viable tool for the analysis of the radiation environment in the vicinity of a building structure containing radiation sources, situated within the confines of a nuclear power plant. This report describes many of the calculational methods employed within the SKYSHINE-II program. A brief description of the new data base is included. Utilization instructions are provided for operation of the SKYSHINE-II code on the Brookhaven National Laboratory Central Scientific Computing Facility. A listing of the source decks, block data routines, and the new atmospheric transmission data base are provided in the appendices of the report.

  10. 30 CFR 50.20-6 - Criteria-MSHA Form 7000-1, Section C.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... daughters, non-medical, non-therapeutic X-rays, radium); effects of nonionizing radiation (welding flash, ultra-violet rays, micro-waves, sunburn). (vi) Code 26—Disorders Associated with Repeated Trauma...). Examples: Poisoning by lead, mercury, cadmium, arsenic, or other metals, poisoning by carbon monoxide...

  11. Binary encoding of multiplexed images in mixed noise.

    PubMed

    Lalush, David S

    2008-09-01

    Binary coding of multiplexed signals and images has been studied in the context of spectroscopy with models of either purely constant or purely proportional noise, and has been shown to result in improved noise performance under certain conditions. We consider the case of mixed noise in an imaging system consisting of multiple individually-controllable sources (X-ray or near-infrared, for example) shining on a single detector. We develop a mathematical model for the noise in such a system and show that the noise is dependent on the properties of the binary coding matrix and on the average number of sources used for each code. Each binary matrix has a characteristic linear relationship between the ratio of proportional-to-constant noise and the noise level in the decoded image. We introduce a criterion for noise level, which is minimized via a genetic algorithm search. The search procedure results in the discovery of matrices that outperform the Hadamard S-matrices at certain levels of mixed noise. Simulation of a seven-source radiography system demonstrates that the noise model predicts trends and rank order of performance in regions of nonuniform images and in a simple tomosynthesis reconstruction. We conclude that the model developed provides a simple framework for analysis, discovery, and optimization of binary coding patterns used in multiplexed imaging systems.
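
The Hadamard S-matrices mentioned above can be constructed and inverted in closed form. This sketch shows a seven-source encode/decode in the noiseless limit, with invented per-source intensities, as the multiplexing scheme the noise analysis builds on:

```python
import numpy as np

def hadamard(n):
    """Sylvester-construction Hadamard matrix; n must be a power of two."""
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

def s_matrix(n):
    """Hadamard S-matrix of order n = 2**k - 1: drop the first row and
    column of H_(n+1), then map -1 -> 1 (source on) and +1 -> 0 (off)."""
    core = hadamard(n + 1)[1:, 1:]
    return ((1 - core) // 2).astype(float)

n = 7
S = s_matrix(n)              # each measurement switches (n+1)/2 = 4 sources on
x = np.arange(1.0, n + 1)    # invented per-source intensities
y = S @ x                    # one multiplexed measurement per matrix row
S_inv = (2.0 / (n + 1)) * (2.0 * S.T - np.ones((n, n)))  # closed-form inverse
x_hat = S_inv @ y
print(np.allclose(x_hat, x))  # True: exact recovery in the noiseless limit
```

With noise added to y, the decoded error depends on the matrix properties and the average number of active sources per code, which is what the paper's genetic search optimizes over.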

  12. Spallogenic Light Elements and Cosmic Ray Origin

    NASA Technical Reports Server (NTRS)

    Ramaty, Reuven

    2000-01-01

    Most of the Galactic Li-6, all of the Be, and the bulk of the B are cosmic-ray produced. I will discuss the production mechanisms and detail a recently developed evolutionary code for Fe, O, and these light elements. I will review the leading models for Li, Be and B origin and discuss their implications for cosmic ray origin. I will also show evidence for extragalactic production of Li-6.

  13. Initial performances of first undulator-based hard x-ray beamlines of NSLS-II compared to simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chubar, Oleg, E-mail: chubar@bnl.gov; Chu, Yong S.; Huang, Xiaojing

    2016-07-27

    Commissioning of the first X-ray beamlines of NSLS-II included detailed measurements of spectral and spatial distributions of the radiation at different locations of the beamlines, from front-ends to sample positions. Comparison of some of these measurement results with high-accuracy calculations of synchrotron (undulator) emission and wavefront propagation through X-ray transport optics, performed using SRW code, is presented.

  14. High-spatial resolution and high-spectral resolution detector for use in the measurement of solar flare hard X-rays

    NASA Technical Reports Server (NTRS)

    Desai, U. D.; Orwig, Larry E.

    1988-01-01

In the area of high spatial resolution, the evaluation of a hard X-ray detector with 65 micron spatial resolution for operation in the energy range from 30 to 400 keV is proposed. The basic detector is a thick large-area scintillator faceplate, composed of a matrix of high-density scintillating glass fibers, attached to a proximity type image intensifier tube with a resistive-anode digital readout system. Such a detector, combined with a coded-aperture mask, would be ideal for use as a modest-sized hard X-ray imaging instrument up to X-ray energies as high as several hundred keV. As an integral part of this study it was also proposed that several techniques be critically evaluated for X-ray image coding which could be used with this detector. In the area of high spectral resolution, it is proposed to evaluate two different types of detectors for use as X-ray spectrometers for solar flares: planar silicon detectors and high-purity germanium (HPGe) detectors. Instruments utilizing these high-spatial-resolution detectors for hard X-ray imaging measurements from 30 to 400 keV and high-spectral-resolution detectors for measurements over a similar energy range would be ideally suited for making crucial solar flare observations during the upcoming maximum in the solar cycle.

  15. Estimates of galactic cosmic ray shielding requirements during solar minimum

    NASA Technical Reports Server (NTRS)

    Townsend, Lawrence W.; Nealy, John E.; Wilson, John W.; Simonsen, Lisa C.

    1990-01-01

Estimates of radiation risk from galactic cosmic rays are presented for manned interplanetary missions. The calculations use the Naval Research Laboratory cosmic ray spectrum model as input into the Langley Research Center galactic cosmic ray transport code. This transport code, which transports both heavy ions and nucleons, can be used with any number of layers of target material, consisting of up to five different arbitrary constituents per layer. Calculated galactic cosmic ray fluxes, doses and dose equivalents behind various thicknesses of aluminum, water and liquid hydrogen shielding are presented for the solar minimum period. Estimates of risk to the skin and the blood-forming organs (BFO) are made using 0-cm and 5-cm depth dose/dose equivalent values, respectively, for water. These results indicate that at least 3.5 g/sq cm (3.5 cm) of water, or 6.5 g/sq cm (2.4 cm) of aluminum, or 1.0 g/sq cm (14 cm) of liquid hydrogen shielding is required to reduce the annual exposure below the currently recommended BFO limit of 0.5 Sv. Because of large uncertainties in fragmentation parameters and the input cosmic ray spectrum, these exposure estimates may be uncertain by as much as a factor of 2 or more. The effects of these potential exposure uncertainties on shield thickness requirements are analyzed.
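
The quoted thicknesses follow directly from dividing areal density (g/sq cm) by material density. A quick check, using nominal densities assumed here (aluminum 2.70 g/cm^3, liquid hydrogen 0.071 g/cm^3):

```python
# Convert shield areal density (g/cm^2) to physical thickness (cm):
# thickness = areal_density / density. Densities are nominal assumptions.
shields = {
    "water":           (3.5, 1.00),   # (g/cm^2, g/cm^3)
    "aluminum":        (6.5, 2.70),
    "liquid hydrogen": (1.0, 0.071),
}
for name, (areal, rho) in shields.items():
    print(f"{name}: {areal / rho:.1f} cm")
```

This reproduces the 3.5 cm (water), 2.4 cm (aluminum), and roughly 14 cm (liquid hydrogen) figures in the abstract.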

  16. Techniques for the analysis of data from coded-mask X-ray telescopes

    NASA Technical Reports Server (NTRS)

    Skinner, G. K.; Ponman, T. J.; Hammersley, A. P.; Eyles, C. J.

    1987-01-01

    Several techniques useful in the analysis of data from coded-mask telescopes are presented. Methods of handling changes in the instrument pointing direction are reviewed and ways of using FFT techniques to do the deconvolution considered. Emphasis is on techniques for optimally-coded systems, but it is shown that the range of systems included in this class can be extended through the new concept of 'partial cycle averaging'.
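
FFT-based decoding of a coded-mask observation can be illustrated in one dimension with a cyclic mask built from the quadratic residues mod 7, which form a (7, 3, 1) difference set. The mask pattern, decoding array, and normalization below are illustrative assumptions, not the instrument's actual configuration:

```python
import numpy as np

# 1-D cyclic coded mask: open elements at the quadratic residues mod 7.
n, k = 7, 3
mask = np.zeros(n)
mask[[1, 2, 4]] = 1.0
G = 2 * mask - 1                       # balanced decoding array

sky = np.array([5.0, 0, 0, 1.0, 0, 0, 0])   # true source map
# Detector counts: circular convolution of sky with mask, done via FFT.
det = np.real(np.fft.ifft(np.fft.fft(sky) * np.fft.fft(mask)))

# Cross-correlate detector with G via FFT. For this difference set the
# mask/G correlation is 4*delta - 1, so corr = 4*sky - sum(sky).
corr = np.real(np.fft.ifft(np.fft.fft(det) * np.conj(np.fft.fft(G))))
total = det.sum() / k                  # sum(sky): each photon passes k slits
recon = (corr + total) / 4.0
print(np.round(recon, 6))
```

The same correlation, done as products in Fourier space, is what makes FFT deconvolution attractive for the large 2-D masks discussed in the abstract.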

  17. Analysis of a Distributed Pulse Power System Using a Circuit Analysis Code

    DTIC Science & Technology

    1979-06-01

dose rate was then integrated to give a number that could be compared with measurements made using thermal luminescent dosimeters (TLD's). Since...NM 87117 AND THE BDM CORPORATION, ALBUQUERQUE, NM 87106 Abstract A sophisticated computer code (SCEPTRE), used to analyze electronic circuits, was used to evaluate the performance of a large flash X-ray machine. This device was

  18. Observational Conditions for the Detection of X-Ray Fluorescence from Sodium by the MIXS Instrument on BepiColombo

    NASA Astrophysics Data System (ADS)

    Cooper, R.; Grande, M.; Martindale, A.; Bunce, E.

    2018-05-01

    We model the expected fluorescence from the exosphere and surface of Mercury, as observed by the Mercury Imaging X-ray Spectrometer (MIXS) on the upcoming BepiColombo mission, using code modified from that used for the SMART-1 D-CIXS instrument.

  19. Extensive Air Showers in the Classroom

    ERIC Educational Resources Information Center

    Badala, A.; Blanco, F.; La Rocca, P.; Pappalardo, G. S.; Pulvirenti, A.; Riggi, F.

    2007-01-01

    The basic properties of extensive air showers of particles produced in the interaction of a high-energy primary cosmic ray in the Earth's atmosphere are discussed in the context of educational cosmic ray projects involving undergraduate students and high-school teams. Simulation results produced by an air shower development code were made…

  20. XPATCH: a high-frequency electromagnetic scattering prediction code using shooting and bouncing rays

    NASA Astrophysics Data System (ADS)

    Hazlett, Michael; Andersh, Dennis J.; Lee, Shung W.; Ling, Hao; Yu, C. L.

    1995-06-01

    This paper describes an electromagnetic computer prediction code for generating radar cross section (RCS), time domain signatures, and synthetic aperture radar (SAR) images of realistic 3-D vehicles. The vehicle, typically an airplane or a ground vehicle, is represented by a computer-aided design (CAD) file with triangular facets, curved surfaces, or solid geometries. The computer code, XPATCH, based on the shooting and bouncing ray technique, is used to calculate the polarimetric radar return from the vehicles represented by these different CAD files. XPATCH computes the first-bounce physical optics plus the physical theory of diffraction contributions and the multi-bounce ray contributions for complex vehicles with materials. It has been found that the multi-bounce contributions are crucial for many aspect angles of all classes of vehicles. Without the multi-bounce calculations, the radar return is typically 10 to 15 dB too low. Examples of predicted range profiles, SAR imagery, and radar cross sections (RCS) for several different geometries are compared with measured data to demonstrate the quality of the predictions. The comparisons are from the UHF through the Ka frequency ranges. Recent enhancements to XPATCH for MMW applications and target Doppler predictions are also presented.

  1. Coupled multi-group neutron photon transport for the simulation of high-resolution gamma-ray spectroscopy applications

    NASA Astrophysics Data System (ADS)

    Burns, Kimberly Ann

    The accurate and efficient simulation of coupled neutron-photon problems is necessary for several important radiation detection applications. Examples include the detection of nuclear threats concealed in cargo containers and prompt gamma neutron activation analysis for nondestructive determination of elemental composition of unknown samples. In these applications, high-resolution gamma-ray spectrometers are used to preserve as much information as possible about the emitted photon flux, which consists of both continuum and characteristic gamma rays with discrete energies. Monte Carlo transport is the most commonly used modeling tool for this type of problem, but computational times for many problems can be prohibitive. This work explores the use of coupled Monte Carlo-deterministic methods for the simulation of neutron-induced photons for high-resolution gamma-ray spectroscopy applications. RAdiation Detection Scenario Analysis Toolbox (RADSAT), a code which couples deterministic and Monte Carlo transport to perform radiation detection scenario analysis in three dimensions [1], was used as the building block for the methods derived in this work. RADSAT was capable of performing coupled deterministic-Monte Carlo simulations for gamma-only and neutron-only problems. The purpose of this work was to develop the methodology necessary to perform coupled neutron-photon calculations and add this capability to RADSAT. Performing coupled neutron-photon calculations requires four main steps: the deterministic neutron transport calculation, the neutron-induced photon spectrum calculation, the deterministic photon transport calculation, and the Monte Carlo detector response calculation. The necessary requirements for each of these steps were determined. A major challenge in utilizing multigroup deterministic transport methods for neutron-photon problems was maintaining the discrete neutron-induced photon signatures throughout the simulation. 
Existing coupled neutron-photon cross-section libraries and the methods used to produce neutron-induced photons were unsuitable for high-resolution gamma-ray spectroscopy applications. Central to this work was the development of a method for generating multigroup neutron-photon cross-sections in a way that separates the discrete and continuum photon emissions so the neutron-induced photon signatures were preserved. The RADSAT-NG cross-section library was developed as a specialized multigroup neutron-photon cross-section set for the simulation of high-resolution gamma-ray spectroscopy applications. The methodology and cross sections were tested using code-to-code comparison with MCNP5 [2] and NJOY [3]. A simple benchmark geometry was used for all cases compared with MCNP. The geometry consists of a cubical sample with a 252Cf neutron source on one side and a HPGe gamma-ray spectrometer on the opposing side. Different materials were examined in the cubical sample: polyethylene (C2H4), P, N, O, and Fe. The cross sections for each of the materials were compared to cross sections collapsed using NJOY. Comparisons of the volume-averaged neutron flux within the sample, volume-averaged photon flux within the detector, and high-purity gamma-ray spectrometer response (only for polyethylene) were completed using RADSAT and MCNP. The code-to-code comparisons show promising results for the coupled Monte Carlo-deterministic method. The RADSAT-NG cross-section production method showed good agreement with NJOY for all materials considered, although some additional work is needed in the resonance region and in the first and last energy bins. Some cross-section discrepancies existed in the lowest and highest energy bins, but the overall shape and magnitude of the two methods agreed. For the volume-averaged photon flux within the detector, typically the five most intense lines agree to within approximately 5% of the MCNP calculated flux for all of the materials considered.
The agreement in the code-to-code comparisons cases demonstrates a proof-of-concept of the method for use in RADSAT for coupled neutron-photon problems in high-resolution gamma-ray spectroscopy applications. One of the primary motivators for using the coupled method over pure Monte Carlo method is the potential for significantly lower computational times. For the code-to-code comparison cases, the run times for RADSAT were approximately 25--500 times shorter than for MCNP, as shown in Table 1. This was assuming a 40 mCi 252Cf neutron source and 600 seconds of "real-world" measurement time. The only variance reduction technique implemented in the MCNP calculation was forward biasing of the source toward the sample target. Improved MCNP runtimes could be achieved with the addition of more advanced variance reduction techniques.

  2. Illuminating heterogeneous anisotropic upper mantle: testing a new anisotropic teleseismic body-wave tomography code - part II: Inversion mode

    NASA Astrophysics Data System (ADS)

    Munzarova, Helena; Plomerova, Jaroslava; Kissling, Edi

    2015-04-01

Considering only isotropic wave propagation and neglecting anisotropy in teleseismic tomography studies is a simplification obviously incongruous with current understanding of mantle-lithosphere plate dynamics. Furthermore, in solely isotropic high-resolution tomography results, potentially significant artefacts (i.e., amplitude and/or geometry distortions of 3D velocity heterogeneities) may result from such neglect. Therefore, we have undertaken to develop a code for anisotropic teleseismic tomography (AniTomo), which will allow us to invert the relative P-wave travel time residuals simultaneously for coupled isotropic-anisotropic P-wave velocity models of the upper mantle. To accomplish that, we have modified the frequently-used isotropic teleseismic tomography code Telinv (e.g., Weiland et al., JGR, 1995; Lippitsch, JGR, 2003; Karousova et al., GJI, 2013). Apart from isotropic velocity heterogeneities, a weak hexagonal anisotropy is also assumed to be responsible for the observed P-wave travel-time residuals. Moreover, no limitations on the orientation of the symmetry axis are prescribed in the code. We allow a search for anisotropy oriented generally in 3D, which represents a unique approach among recent trials that otherwise incorporate only azimuthal anisotropy into body-wave tomography. The presented code for retrieving anisotropy in 3D thus enables its direct application to datasets from tectonically diverse regions. In this contribution, we outline the theoretical background of the AniTomo anisotropic tomography code. We parameterize the mantle lithosphere and asthenosphere with an orthogonal grid of nodes with various values of isotropic velocities, as well as of strength and orientation of anisotropy in 3D, which is defined by azimuth and inclination of either the fast or slow symmetry axis of the hexagonal approximation of the media.
Careful testing of the new code on synthetics, concentrating on code functionality, strengths and weaknesses, is a necessary step before AniTomo is applied to real datasets. We examine various aspects of anisotropic tomography, such as setting a starting anisotropic model and the parameters controlling the inversion, and particularly the influence of ray coverage on the resolvability of individual anisotropic parameters. Synthetic testing also allows investigation of the well-known trade-off between effects of P-wave anisotropy and isotropic heterogeneities. Therefore, the target synthetic models are designed to represent schematically different heterogeneous anisotropic structures of the upper mantle. Testing the inversion mode of the AniTomo code, considering an azimuthally quasi-equal distribution of rays and teleseismic P-wave incidences, shows that a separation of seismic anisotropy and isotropic velocity heterogeneities is plausible and that the correct orientation of the symmetry axes in a model can be found within three iterations for well-tuned damping factors.

  3. Time-dependent, x-ray spectral unfolds and brightness temperatures for intense Li{sup +} ion beam-driven hohlraums

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fehl, D.L.; Chandler, G.A.; Biggs, F.

X-ray-producing hohlraums are being studied as indirect drives for Inertial Confinement Fusion targets. In a 1994 target series on the PBFAII accelerator, cylindrical hohlraum targets were heated by an intense Li{sup +} ion beam and viewed by an array of 13 time-resolved, filtered x-ray detectors (XRDs). The UFO unfold code and its suite of auxiliary functions were used extensively in obtaining time-resolved x-ray spectra and radiation temperatures from this diagnostic. UFO was also used to obtain fitted response functions from calibration data, to simulate data from blackbody x-ray spectra of interest, to determine the suitability of various unfolding parameters (e.g., energy domain, energy partition, smoothing conditions, and basis functions), to interpolate the XRD signal traces, and to unfold experimental data. The simulation capabilities of the code were useful in understanding an anomalous feature in the unfolded spectra at low photon energies ({le} 100 eV). Uncertainties in the differential and energy-integrated unfolded spectra were estimated from uncertainties in the data. The time-history of the radiation temperature agreed well with independent calculations of the wall temperature in the hohlraum.

  4. Continuous Active Sonar for Undersea Vehicles Final Report: Input of Factor Graphs into the Detection, Classification, and Localization Chain and Continuous Active SONAR in Undersea Vehicles

    DTIC Science & Technology

    2015-12-31

image from NURP annual report. The ray-cone code simulates the CAS signal received after being reflected from two different targets, and...where the first set of nodes are X's parents, and nodes C1, ..., Cm are X's children. Image based on (Duda, Hart, & Stork, 2001). The first...Sorenson, 1970). Using the reference (Welch & Bishop, 2006), the procedure for estimating the real state x of a discrete-time controlled process will

  5. The FLUKA Code: An Overview

    NASA Technical Reports Server (NTRS)

    Ballarini, F.; Battistoni, G.; Campanella, M.; Carboni, M.; Cerutti, F.; Empl, A.; Fasso, A.; Ferrari, A.; Gadioli, E.; Garzelli, M. V.; hide

    2006-01-01

FLUKA is a multipurpose Monte Carlo code which can transport a variety of particles over a wide energy range in complex geometries. The code is a joint project of INFN and CERN: part of its development is also supported by the University of Houston and NASA. FLUKA is successfully applied in several fields, including, but not limited to, particle physics, cosmic ray physics, dosimetry, radioprotection, hadron therapy, space radiation, accelerator design and neutronics. The code is the standard tool used at CERN for dosimetry, radioprotection and beam-machine interaction studies. Here we give a glimpse into the code's physics models, with particular emphasis on the hadronic and nuclear sector.

  6. Hexagonal Uniformly Redundant Arrays (HURAs) for scintillator based coded aperture neutron imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gamage, K.A.A.; Zhou, Q.

    2015-07-01

A series of Monte Carlo simulations have been conducted, making use of the EJ-426 neutron scintillator detector, to investigate the potential of using hexagonal uniformly redundant arrays (HURAs) for scintillator based coded aperture neutron imaging. This type of scintillator material has a low sensitivity to gamma rays and is therefore of particular use in a system with a source that emits both neutrons and gamma rays. The simulations used an AmBe source; neutron images have been produced using different coded-aperture materials (boron-10, cadmium-113 and gadolinium-157) and location error has also been estimated. In each case the neutron image clearly shows the location of the source with a relatively small location error. Neutron images with high resolution can be easily used to identify and locate nuclear materials precisely in nuclear security and nuclear decommissioning applications. (authors)

  7. K-shell X-ray transition energies of multi-electron ions of silicon and sulfur

    NASA Astrophysics Data System (ADS)

    Beiersdorfer, P.; Brown, G. V.; Hell, N.; Santana, J. A.

    2017-10-01

    Prompted by the detection of K-shell absorption or emission features in the spectra of plasma surrounding high mass X-ray binaries and black holes, recent measurements using the Livermore electron beam ion trap have focused on the energies of the n = 2 to n = 1 K-shell transitions in the L-shell ions of lithiumlike through fluorinelike silicon and sulfur. In parallel, we have made calculations of these transitions using the Flexible Atomic Code and the multi-reference Møller-Plesset (MRMP) atomic physics code. Using this code we have attempted to produce sets of theoretical atomic data with spectroscopic accuracy for all the L-shell ions of silicon and sulfur. We present results of our calculations for oxygenlike and fluorinelike silicon and compare them to the recent electron beam ion trap measurements as well as previous calculations.

  8. Medicine, material science and security: the versatility of the coded-aperture approach.

    PubMed

    Munro, P R T; Endrizzi, M; Diemoz, P C; Hagen, C K; Szafraniec, M B; Millard, T P; Zapata, C E; Speller, R D; Olivo, A

    2014-03-06

The principal limitation to the widespread deployment of X-ray phase imaging in a variety of applications is probably a lack of versatility. A versatile X-ray phase imaging system must be able to work with polychromatic and non-microfocus sources (for example, those currently used in medical and industrial applications), have physical dimensions sufficiently large to accommodate samples of interest, be insensitive to environmental disturbances (such as vibrations and temperature variations), require only simple system set-up and maintenance, and be able to perform quantitative imaging. The coded-aperture technique, based upon the edge illumination principle, satisfies each of these criteria. To date, we have applied the technique to mammography, materials science, small-animal imaging, non-destructive testing and security. In this paper, we outline the theory of coded-aperture phase imaging and show an example of how the technique may be applied to imaging samples with a practically important scale.

  9. K-shell X-ray transition energies of multi-electron ions of silicon and sulfur

    DOE PAGES

    Beiersdorfer, P.; Brown, G. V.; Hell, N.; ...

    2017-04-20

Prompted by the detection of K-shell absorption or emission features in the spectra of plasma surrounding high mass X-ray binaries and black holes, recent measurements using the Livermore electron beam ion trap have focused on the energies of the n = 2 to n = 1 K-shell transitions in the L-shell ions of lithiumlike through fluorinelike silicon and sulfur. In parallel, we have made calculations of these transitions using the Flexible Atomic Code and the multi-reference Møller-Plesset (MRMP) atomic physics code. Using this code we have attempted to produce sets of theoretical atomic data with spectroscopic accuracy for all the L-shell ions of silicon and sulfur. Here, we present results of our calculations for oxygenlike and fluorinelike silicon and compare them to the recent electron beam ion trap measurements as well as previous calculations.

  10. Understanding uncertainties in modeling the galactic diffuse gamma-ray emission

    NASA Astrophysics Data System (ADS)

    Storm, Emma; Calore, Francesca; Weniger, Christoph

    2017-01-01

The nature of the Galactic diffuse gamma-ray emission as measured by the Fermi Gamma-ray Space Telescope has remained an active area of research for the last several years. A standard technique to disentangle the origins of the diffuse emission is the template fitting approach, where predictions for various diffuse components, such as emission from cosmic rays derived from Galprop or Dragon, are compared to the data. However, this method always results in an overall bad fit to the data, with strong residuals that are difficult to interpret. Additionally, there are intrinsic uncertainties in the predicted templates that are not accounted for naturally with this method. We therefore introduce a new template fitting approach to study the various components of the Galactic diffuse gamma-ray emission, and their correlations and uncertainties. We call this approach Sky Factorization with Adaptive Constrained Templates (SkyFACT). Rather than using fixed predictions from cosmic-ray propagation codes and examining the residuals to evaluate the quality of fits and the presence of excesses, we introduce additional fine-grained variations in the templates that account for uncertainties in the predictions, such as uncertainties in the gas tracers and from small scale variations in the density of cosmic rays. We show that fits to the gamma-ray diffuse emission can be dramatically improved by including an appropriate level of uncertainty in the initial spatial templates from cosmic-ray propagation codes. We further show that we can recover the morphology of the Fermi Bubbles from its spectrum alone with SkyFACT.
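
Template fitting in its simplest, fixed-template least-squares form, the baseline that SkyFACT generalizes with per-pixel modulation parameters, can be sketched as follows. The templates and coefficients here are synthetic assumptions, not Fermi data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two synthetic spatial templates (flattened sky maps) and mock data
# built as a known linear combination of them.
npix = 500
T = np.column_stack([rng.uniform(0.5, 1.5, npix),    # e.g. gas-correlated
                     rng.uniform(0.0, 1.0, npix)])   # e.g. inverse Compton
true_coeffs = np.array([2.0, 0.5])
data = T @ true_coeffs

# Fixed-template fit: solve min ||T c - data||^2 for the coefficients c.
coeffs, *_ = np.linalg.lstsq(T, data, rcond=None)
print(np.round(coeffs, 3))
```

When the true sky deviates from the fixed templates, this fit leaves the structured residuals the abstract describes; allowing the templates themselves to vary within uncertainties is the SkyFACT refinement.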

  11. Development of slow control system for the Belle II ARICH counter

    NASA Astrophysics Data System (ADS)

    Yonenaga, M.; Adachi, I.; Dolenec, R.; Hataya, K.; Iori, S.; Iwata, S.; Kakuno, H.; Kataura, R.; Kawai, H.; Kindo, H.; Kobayashi, T.; Korpar, S.; Križan, P.; Kumita, T.; Mrvar, M.; Nishida, S.; Ogawa, K.; Ogawa, S.; Pestotnik, R.; Šantelj, L.; Sumiyoshi, T.; Tabata, M.; Yusa, Y.

    2017-12-01

    A slow control system (SCS) for the Aerogel Ring Imaging Cherenkov (ARICH) counter in the Belle II experiment was newly developed and coded in the development frameworks of the Belle II DAQ software. The ARICH is based on 420 Hybrid Avalanche Photo-Detectors (HAPDs). Each HAPD has 144 pixels to be readout and requires 6 power supply (PS) channels, therefore a total number of 2520 PS channels and 60,480 pixels have to be configured and controlled. Graphical User Interfaces (GUIs) with detector oriented view and device oriented view, were also implemented to ease the detector operation. The ARICH SCS is in operation for detector construction and cosmic rays tests. The paper describes the detailed features of the SCS and preliminary results of operation of a reduced set of hardware which confirm the scalability to the full detector.
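
The channel totals quoted above follow directly from the detector granularity:

```python
# Totals implied by the ARICH readout granularity described above.
n_hapd = 420
pixels_per_hapd = 144
ps_channels_per_hapd = 6

total_pixels = n_hapd * pixels_per_hapd        # readout pixels
total_ps = n_hapd * ps_channels_per_hapd       # power supply channels
print(total_pixels, total_ps)                  # 60480 2520
```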

  12. SolTrace | Concentrating Solar Power | NREL

    Science.gov Websites

SolTrace is available as an NREL packaged distribution or from source code at the SolTrace open source project website. The code uses Monte-Carlo ray-tracing methodology.

  13. The Sensitivity of Coded Mask Telescopes

    NASA Technical Reports Server (NTRS)

    Skinner, Gerald K.

    2008-01-01

Simple formulae are often used to estimate the sensitivity of coded mask X-ray or gamma-ray telescopes, but these are strictly only applicable if a number of basic assumptions are met. Complications arise, for example, if a grid structure is used to support the mask elements, if the detector spatial resolution is not good enough to completely resolve all the detail in the shadow of the mask or if any of a number of other simplifying conditions are not fulfilled. We derive more general expressions for the Poisson-noise-limited sensitivity of astronomical telescopes using the coded mask technique, noting explicitly in what circumstances they are applicable. The emphasis is on using nomenclature and techniques that result in simple and revealing results. Where no convenient expression is available a procedure is given which allows the calculation of the sensitivity. We consider certain aspects of the optimisation of the design of a coded mask telescope and show that when the detector spatial resolution and the mask to detector separation are fixed, the best source location accuracy is obtained when the mask elements are equal in size to the detector pixels.

  14. RAINIER: A simulation tool for distributions of excited nuclear states and cascade fluctuations

    NASA Astrophysics Data System (ADS)

    Kirsch, L. E.; Bernstein, L. A.

    2018-06-01

    A new code has been developed named RAINIER that simulates the γ-ray decay of discrete and quasi-continuum nuclear levels for a user-specified range of energy, angular momentum, and parity including a realistic treatment of level spacing and transition width fluctuations. A similar program, DICEBOX, uses the Monte Carlo method to simulate level and width fluctuations but is restricted in its initial level population algorithm. On the other hand, modern reaction codes such as TALYS and EMPIRE populate a wide range of states in the residual nucleus prior to γ-ray decay, but do not go beyond the use of deterministic functions and therefore neglect cascade fluctuations. This combination of capabilities allows RAINIER to be used to determine quasi-continuum properties through comparison with experimental data. Several examples are given that demonstrate how cascade fluctuations influence experimental high-resolution γ-ray spectra from reactions that populate a wide range of initial states.
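
The statistical ingredients behind such cascade fluctuations, Wigner-type level-spacing fluctuations and Porter-Thomas transition-width fluctuations, can be sampled as below. The sample sizes and seed are arbitrary; this is a schematic of the two distributions, not RAINIER's algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)
nsamp = 200_000

# Nearest-neighbor level spacings from the Wigner surmise (unit mean):
# P(s) = (pi/2) s exp(-pi s^2 / 4), sampled by inverse-CDF transform.
u = rng.uniform(size=nsamp)
spacings = np.sqrt(-4.0 * np.log(1.0 - u) / np.pi)

# Transition widths from the Porter-Thomas distribution:
# Gamma / <Gamma> follows a chi-squared law with one degree of freedom.
gamma_avg = 1.0
widths = gamma_avg * rng.standard_normal(nsamp) ** 2

print(round(spacings.mean(), 3), round(widths.mean(), 3))
```

Drawing each level's spacing and width from these distributions, rather than using their deterministic averages, is what produces the spectrum-to-spectrum fluctuations the abstract compares against high-resolution data.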

  15. Benchmarking Geant4 for simulating galactic cosmic ray interactions within planetary bodies

    DOE PAGES

    Mesick, K. E.; Feldman, W. C.; Coupland, D. D. S.; ...

    2018-06-20

Galactic cosmic rays undergo complex nuclear interactions with nuclei within planetary bodies that have little to no atmosphere. Radiation transport simulations are a key tool used in understanding the neutron and gamma-ray albedo coming from these interactions and tracing these signals back to geochemical composition of the target. In this paper, we study the validity of the code Geant4 for simulating such interactions by comparing simulation results to data from the Apollo 17 Lunar Neutron Probe Experiment. Different assumptions regarding the physics are explored to demonstrate how these impact the Geant4 simulation results. In general, all of the Geant4 results over-predict the data; however, certain physics lists perform better than others. Finally, we show that results from the radiation transport code MCNP6 are similar to those obtained using Geant4.

  16. Benchmarking Geant4 for simulating galactic cosmic ray interactions within planetary bodies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mesick, K. E.; Feldman, W. C.; Coupland, D. D. S.

Galactic cosmic rays undergo complex nuclear interactions with nuclei within planetary bodies that have little to no atmosphere. Radiation transport simulations are a key tool used in understanding the neutron and gamma-ray albedo coming from these interactions and tracing these signals back to geochemical composition of the target. In this paper, we study the validity of the code Geant4 for simulating such interactions by comparing simulation results to data from the Apollo 17 Lunar Neutron Probe Experiment. Different assumptions regarding the physics are explored to demonstrate how these impact the Geant4 simulation results. In general, all of the Geant4 results over-predict the data; however, certain physics lists perform better than others. Finally, we show that results from the radiation transport code MCNP6 are similar to those obtained using Geant4.

  17. MUFFSgenMC: An Open Source MUon Flexible Framework for Spectral GENeration for Monte Carlo Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chatzidakis, Stylianos; Greulich, Christopher

    A cosmic ray Muon Flexible Framework for Spectral GENeration for Monte Carlo Applications (MUFFSgenMC) has been developed to support state-of-the-art cosmic ray muon tomographic applications. The flexible framework allows for easy and fast creation of source terms for popular Monte Carlo applications like GEANT4 and MCNP. This code framework simplifies the process of simulations used for cosmic ray muon tomography.
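
A minimal source-term generator of the kind such a framework produces might sample the sea-level muon zenith-angle distribution, which is approximately proportional to cos^2(theta). This toy uses inverse-CDF sampling in cos(theta) and is an assumption-laden sketch, not the MUFFSgenMC algorithm:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Sea-level muon intensity is approximately proportional to cos^2(theta).
# For pdf(c) = 3 c^2 with c = cos(theta) in [0, 1], the inverse CDF is
# c = u^(1/3) for uniform u.
cos_theta = rng.uniform(size=n) ** (1.0 / 3.0)
theta_deg = np.degrees(np.arccos(cos_theta))

print(round(cos_theta.mean(), 3))   # expect approximately 0.75 (E[c] = 3/4)
```

Each sampled angle, paired with an energy drawn from a suitable spectrum, would be written out as a source particle for GEANT4 or MCNP.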

  18. The GRO remote terminal system

    NASA Technical Reports Server (NTRS)

    Zillig, David J.; Valvano, Joe

    1994-01-01

In March 1992, NASA HQ challenged GSFC/Code 531 to propose a fast, low-cost approach to close the Tracking Data Relay Satellite System (TDRSS) Zone-of-Exclusion (ZOE) over the Indian Ocean in order to provide global communications coverage for the Compton Gamma Ray Observatory (GRO) spacecraft. GRO had lost its tape recording capability which limited its valuable science data return to real-time contacts with the TDRS-E and TDRS-W synchronous data relay satellites, yielding only approximately 62 percent of the possible data obtainable. To achieve global coverage, a TDRS spacecraft would have to be moved over the Indian Ocean out of line-of-sight control of White Sands Ground Terminal (WSGT). To minimize operations life cycle costs, Headquarters also set a goal for remote control, from the WSGT, of the overseas ground station which was required for direct communications with TDRS-1. On August 27, 1992, Code 531 was given the go ahead to implement the proposed GRO Relay Terminal System (GRTS). This paper describes the Remote Ground Relay Terminal (RGRT) which went operational at the Canberra Deep Space Communications Complex (CDSCC) in Canberra, Australia in December 1993 and is currently augmenting the TDRSS constellation in returning between 80 and 100 percent of GRO science data under the control of a single operator at WSGT.

  19. Space Radiation Transport Codes: A Comparative Study for Galactic Cosmic Rays Environment

    NASA Astrophysics Data System (ADS)

    Tripathi, Ram; Wilson, John W.; Townsend, Lawrence W.; Gabriel, Tony; Pinsky, Lawrence S.; Slaba, Tony

    For long duration and/or deep space human missions, protection from severe space radiation exposure is a challenging design constraint and may be a potential limiting factor. The space radiation environment consists of galactic cosmic rays (GCR), solar particle events (SPE), and trapped radiation, and includes ions of all the known elements over a very broad energy range. These ions penetrate spacecraft materials producing nuclear fragments and secondary particles that damage biological tissues, microelectronic devices, and materials. In deep space missions, where the Earth's magnetic field does not provide protection from space radiation, the GCR environment is significantly enhanced due to the absence of geomagnetic cut-off and is a major component of radiation exposure. Accurate risk assessments critically depend on the accuracy of the input information as well as the radiation transport codes used, and so systematic verification of codes is necessary. In this study, comparisons are made between the deterministic code HZETRN2006 and the Monte Carlo codes HETC-HEDS and FLUKA for an aluminum shield followed by a water target exposed to the 1977 solar minimum GCR spectrum. The interaction and transport of the high-charge ions present in the GCR environment provide a more stringent test in the comparison of the codes. Dose, dose equivalent and flux spectra are compared; details of the comparisons will be discussed, and conclusions will be drawn for future directions.

  20. Ion absorption of the high harmonic fast wave in the National Spherical Torus Experiment

    NASA Astrophysics Data System (ADS)

    Rosenberg, Adam Lewis

    Ion absorption of the high harmonic fast wave in a spherical torus is of critical importance to assessing the viability of the wave as a means of heating and driving current. Analysis of recent NSTX shots has revealed that under some conditions when neutral beam and RF power are injected into the plasma simultaneously, a fast ion population with energy above the beam injection energy is sustained by the wave. In agreement with modeling, these experiments find the RF-induced fast ion tail strength and neutron rate at lower B-fields to be less enhanced, likely due to a larger β profile, which promotes greater off-axis absorption where the fast ion population is small. Ion loss codes find the increased loss fraction with decreased B insufficient to account for the changes in tail strength, providing further evidence that this is an RF interaction effect. Though greater ion absorption is predicted with lower k∥, surprisingly little variation in the tail was observed, along with a neutron rate enhancement with higher k∥. Data from the neutral particle analyzer, neutron detectors, x-ray crystal spectrometer, and Thomson scattering are presented, along with results from the TRANSP transport analysis code, ray-tracing codes HPRT and CURRAY, full-wave code AORSA, quasilinear code CQL3D, and ion loss codes EIGOL and CONBEAM.

  1. Monte Carlo analysis of a time-dependent neutron and secondary gamma-ray integral experiment on a thick concrete and steel shield

    NASA Astrophysics Data System (ADS)

    Cramer, S. N.; Roussin, R. W.

    1981-11-01

    A Monte Carlo analysis of a time-dependent neutron and secondary gamma-ray integral experiment on a thick concrete and steel shield is presented. The analysis covers neutron source energies from 15 down to 2 MeV. The multigroup MORSE code was used with the VITAMIN C 171-36 neutron-gamma-ray cross-section data set. Both neutron and gamma-ray count rates and unfolded energy spectra are presented and compared with experimental results, showing good general agreement.

  2. Chandra Reads the Cosmic Bar Code of Gas Around a Black Hole

    NASA Astrophysics Data System (ADS)

    2000-02-01

    An international team of astronomers has used NASA's Chandra X-ray Observatory to make an energy bar code of hot gas in the vicinity of a giant black hole. These measurements, the most precise of their kind ever made with an X-ray telescope, demonstrate the existence of a blanket of warm gas that is expanding rapidly away from the black hole. The team consists of Jelle Kaastra, Rolf Mewe and Albert Brinkman of Space Research Organization Netherlands (SRON) in Utrecht, Duane Liedahl of Lawrence Livermore National Laboratory in Livermore, Calif., and Stefanie Komossa of Max Planck Institute in Garching, Germany. A report of their findings will be published in the March issue of the European journal Astronomy & Astrophysics. Kaastra and colleagues used the Low Energy Transmission Grating in conjunction with the High Resolution Camera to measure the number of X rays present at each energy. With this information they constructed an X-ray spectrum of the source. Their target was the central region, or nucleus of the galaxy NGC 5548, which they observed for 24 hours. This galaxy is one of a class of galaxies known to have unusually bright nuclei that are associated with gas flowing around and into giant black holes. This inflow produces an enormous outpouring of energy that blows some of the matter away from the black hole. Astronomers have used optical, ultraviolet, and X-ray telescopes in an effort to disentangle the complex nature of inflowing and outflowing gas at different distances from the black hole in NGC 5548. X-ray observations provide a ringside seat to the action around the black hole. By using the Low Energy Transmission Grating, the Dutch-US-German team concentrated on gas that forms a warm blanket that partially covers the innermost region where the highest energy X-rays are produced. 
As the high-energy X rays stream away from the vicinity of the black hole, they heat the blanketing gas to temperatures of a few million degrees, and the blanket absorbs some of the X rays from the central source. This produces dark stripes, or absorption lines in the X-ray spectrum. Bright stripes or emission lines due to emission from the blanketing gas are also present. Since each element has its own unique structure, these lines can be read like a cosmic bar code to take inventory of the gas. The team was able to determine what atoms the gas contains and how many, the number of electrons each atom has retained in the hostile environment of the black hole, and how the gas is moving there. They found lines from eight different elements including carbon, nitrogen, oxygen, and iron. The amount of this gas was found to be about 100 times greater than that found with optical and ultraviolet observations. The Low Energy Transmission Grating was built by SRON and the Max Planck Institute under the direction of Albert Brinkman. The High Resolution Camera was built by the Smithsonian Astrophysical Observatory in Cambridge, Mass. under the direction of Stephen Murray. To follow Chandra's progress or download images visit the Chandra sites at: http://chandra.harvard.edu/photo/2000/0170/index.html AND http://chandra.nasa.gov NASA's Marshall Space Flight Center in Huntsville, Ala., manages the Chandra program. TRW, Inc., Redondo Beach, Calif., is the prime contractor for the spacecraft. The Smithsonian's Chandra X-ray Center controls science and flight operations from Cambridge, Mass. High resolution digital versions of the X-ray spectrum (JPG, 300 dpi TIFF ) and other information associated with this release are available on the Internet at: http://chandra.harvard.edu
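
    The "bar code" reading described above amounts to matching measured absorption-line energies against known atomic transitions. A minimal Python sketch of that lookup, using rough textbook energies for a few hydrogen-like ions (illustrative values only, not the NGC 5548 measurements):

```python
# Toy "cosmic bar code" reader: match measured X-ray line energies (keV)
# to a small table of known transitions. Energies are approximate
# textbook values for hydrogen-like ions, used here for illustration.
KNOWN_LINES = {
    0.368: "C VI Ly-alpha",
    0.500: "N VII Ly-alpha",
    0.654: "O VIII Ly-alpha",
    6.966: "Fe XXVI Ly-alpha",
}

def identify_lines(measured_kev, tol=0.01):
    """Return (measured energy, transition label) pairs within tol keV."""
    matches = []
    for e in measured_kev:
        for e0, label in KNOWN_LINES.items():
            if abs(e - e0) <= tol:
                matches.append((e, label))
    return matches

print(identify_lines([0.653, 6.97]))
```

    The real analysis additionally folds in Doppler shifts, line widths, and ionization balance; this toy version matches on energy alone.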

  3. Measuring implosion velocities in experiments and simulations of laser-driven cylindrical implosions on the OMEGA laser

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hansen, E. C.; Barnak, D. H.; Betti, R.

    Laser-driven magnetized liner inertial fusion (MagLIF) on OMEGA involves cylindrical implosions, a preheat beam, and an applied magnetic field. Initial experiments excluded the preheat beam and magnetic field to better characterize the implosion. X-ray self-emission as measured by framing cameras was used to determine the shell trajectory. The 1-D code LILAC was used to model the central region of the implosion, and results were compared to 2-D simulations from the HYDRA code. Post-processing of simulation output with SPECT3D and Yorick produced synthetic x-ray images that were used to compare the simulation results with the x-ray framing camera data. Quantitative analysis shows that higher measured neutron yields correlate with higher implosion velocities. The future goal is to further analyze the x-ray images to characterize the uniformity of the implosions and apply these analysis techniques to integrated laser-driven MagLIF shots to better understand the effects of preheat and the magnetic field.

  4. Measuring implosion velocities in experiments and simulations of laser-driven cylindrical implosions on the OMEGA laser

    DOE PAGES

    Hansen, E. C.; Barnak, D. H.; Betti, R.; ...

    2018-04-04

    Laser-driven magnetized liner inertial fusion (MagLIF) on OMEGA involves cylindrical implosions, a preheat beam, and an applied magnetic field. Initial experiments excluded the preheat beam and magnetic field to better characterize the implosion. X-ray self-emission as measured by framing cameras was used to determine the shell trajectory. The 1-D code LILAC was used to model the central region of the implosion, and results were compared to 2-D simulations from the HYDRA code. Post-processing of simulation output with SPECT3D and Yorick produced synthetic x-ray images that were used to compare the simulation results with the x-ray framing camera data. Quantitative analysis shows that higher measured neutron yields correlate with higher implosion velocities. The future goal is to further analyze the x-ray images to characterize the uniformity of the implosions and apply these analysis techniques to integrated laser-driven MagLIF shots to better understand the effects of preheat and the magnetic field.

  5. Physical basis of radiation protection in space travel

    NASA Astrophysics Data System (ADS)

    Durante, Marco; Cucinotta, Francis A.

    2011-10-01

    The health risks of space radiation are arguably the most serious challenge to space exploration, possibly preventing these missions due to safety concerns or increasing their costs to amounts beyond what would be acceptable. Radiation in space is substantially different from that on Earth: high-energy (E) and charge (Z) particles (HZE) provide the main contribution to the equivalent dose in deep space, whereas γ rays and low-energy α particles are major contributors on Earth. This difference causes a high uncertainty on the estimated radiation health risk (including cancer and noncancer effects), and makes protection extremely difficult. In fact, shielding is very difficult in space: the very high energy of the cosmic rays and the severe mass constraints in spaceflight represent a serious hindrance to effective shielding. Here the physical basis of space radiation protection is described, including the most recent achievements in space radiation transport codes and shielding approaches. Although deterministic and Monte Carlo transport codes can now describe well the interaction of cosmic rays with matter, more accurate double-differential nuclear cross sections are needed to improve the codes. Energy deposition in biological molecules and related effects should also be developed to achieve accurate risk models for long-term exploratory missions. Passive shielding can be effective for solar particle events; however, it is limited for galactic cosmic rays (GCR). Active shielding would have to overcome challenging technical hurdles to protect against GCR. Thus, improved risk assessment and genetic and biomedical approaches are a more likely solution to GCR radiation protection issues.
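
    The distinction drawn above between absorbed dose and equivalent dose can be made concrete with a toy calculation: each component's absorbed dose is weighted by a quality factor Q, and the HZE term can dominate even when its absorbed dose is small. All numbers below are invented for illustration, not mission data:

```python
# Toy dose-equivalent estimate H = sum_i Q_i * D_i (Sv), illustrating why
# HZE particles dominate the equivalent dose in deep space. Doses (Gy) and
# quality factors are placeholder values chosen for illustration only.
contributions = [
    {"component": "GCR protons", "dose_gy": 0.10, "Q": 1.5},
    {"component": "GCR HZE ions", "dose_gy": 0.05, "Q": 20.0},
    {"component": "trapped electrons", "dose_gy": 0.02, "Q": 1.0},
]

def dose_equivalent_sv(parts):
    """Weight each absorbed-dose component by its quality factor."""
    return sum(p["dose_gy"] * p["Q"] for p in parts)

print(round(dose_equivalent_sv(contributions), 3))
```

    With these placeholder values the HZE ions contribute 1.0 Sv of the 1.17 Sv total despite having only half the proton absorbed dose, mirroring the abstract's point about deep-space exposure.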

  6. Coded-aperture Compton camera for gamma-ray imaging

    NASA Astrophysics Data System (ADS)

    Farber, Aaron M.

    This dissertation describes the development of a novel gamma-ray imaging system concept and presents results from Monte Carlo simulations of the new design. Current designs for large field-of-view gamma cameras suitable for homeland security applications implement either a coded aperture or a Compton scattering geometry to image a gamma-ray source. Both of these systems require large, expensive position-sensitive detectors in order to work effectively. By combining characteristics of both of these systems, a new design can be implemented that does not require such expensive detectors and that can be scaled down to a portable size. This new system has significant promise in homeland security, astronomy, botany and other fields, while future iterations may prove useful in medical imaging, other biological sciences and other areas, such as non-destructive testing. A proof-of-principle study of the new gamma-ray imaging system has been performed by Monte Carlo simulation. Various reconstruction methods have been explored and compared. General-Purpose Graphics-Processor-Unit (GPGPU) computation has also been incorporated. The resulting code is a primary design tool for exploring variables such as detector spacing, material selection and thickness and pixel geometry. The advancement of the system from a simple 1-dimensional simulation to a full 3-dimensional model is described. Methods of image reconstruction are discussed and results of simulations consisting of both a 4 x 4 and a 16 x 16 object space mesh have been presented. A discussion of the limitations and potential areas of further study is also presented.
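
    The coded-aperture half of the hybrid design relies on correlation decoding: the detector records the superposed shadows of the mask, and cross-correlating that pattern with a decoding array recovers the source map. A hedged 1-D toy in Python (the length-7 URA mask and point source are illustrative; the dissertation's system is 2-D and GPU-accelerated):

```python
# 1-D coded-aperture imaging toy: project a source through a uniformly
# redundant array (URA) mask, then decode by cyclic cross-correlation.
mask = [0, 1, 1, 0, 1, 0, 0]          # length-7 URA (quadratic residues mod 7)
decoder = [2 * m - 1 for m in mask]   # balanced decoding array (+1 / -1)

def project(source, mask):
    """Detector counts: each source bin casts a shifted copy of the mask."""
    n = len(mask)
    det = [0] * n
    for s, strength in enumerate(source):
        for d in range(n):
            det[d] += strength * mask[(d - s) % n]
    return det

def decode(det, decoder):
    """Cyclic cross-correlation of the detector pattern with the decoder."""
    n = len(decoder)
    return [sum(det[d] * decoder[(d - s) % n] for d in range(n)) for s in range(n)]

source = [0, 0, 0, 5, 0, 0, 0]            # point source in bin 3
image = decode(project(source, mask), decoder)
print(image.index(max(image)))             # recovered source position: 3
```

    The URA's flat sidelobes (every off-peak bin decodes to the same value) are what make the correlation step well conditioned; the hybrid instrument combines this with Compton-scatter kinematics at higher energies.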

  7. Delayed Gamma-Ray Spectroscopy for Non-Destructive Assay of Nuclear Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ludewigt, Bernhard; Mozin, Vladimir; Campbell, Luke

    2015-06-01

    High-energy, beta-delayed gamma-ray spectroscopy is a potential non-destructive assay technique for the independent verification of declared quantities of special nuclear materials at key stages of the fuel cycle and for directly assaying nuclear material inventories for spent fuel handling, interim storage, reprocessing facilities, repository sites, and final disposal. Other potential applications include determination of MOX fuel composition, characterization of nuclear waste packages, and challenges in homeland security and arms control verification. Experimental measurements were performed to evaluate fission fragment yields, to test methods for determining isotopic fractions, and to benchmark the modeling code package. Experimental measurement campaigns were carried out at the IAC using a photo-neutron source and at OSU using a thermal neutron beam from the TRIGA reactor to characterize the emission of high-energy delayed gamma rays from 235U, 239Pu, and 241Pu targets following neutron-induced fission. Data were collected for pure and combined targets for several irradiation/spectroscopy cycle times ranging from 10/10 seconds to 15/30 minutes. The delayed gamma-ray signature of 241Pu, a significant fissile constituent in spent fuel, was measured and compared to 239Pu. The 241Pu/239Pu ratios varied between 0.5 and 1.2 for ten prominent lines in the 2700-3600 keV energy range. Such significant differences in relative peak intensities make it possible to determine relative fractions of these isotopes in a mixed sample. A method for determining fission product yields by fitting the energy and time dependence of the delayed gamma-ray emission was developed and demonstrated on a limited 235U data set. De-convolution methods for determining fissile fractions were developed and tested on the experimental data. The use of high count-rate LaBr3 detectors was investigated as a potential alternative to HPGe detectors. 
Modeling capabilities were added to an existing framework and codes were adapted as needed for analyzing experiments and assessing application-specific assay concepts. A de-convolution analysis of the delayed gamma-ray response spectra modeled for spent fuel assemblies was performed using the same method that was applied to the experimental spectra.
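
    The de-convolution idea in the abstract — recovering isotopic fractions from relative peak intensities — reduces, in the simplest case, to solving a small linear system. A hypothetical two-line, two-isotope Python sketch (the signature intensities below are invented for illustration, not the measured 239Pu/241Pu values):

```python
# Toy de-convolution: measured peak intensities are modeled as a linear mix
# of per-isotope signatures, and the fractions are recovered by solving the
# 2x2 system with Cramer's rule. Signature values are illustrative only.
sig_239 = [1.00, 0.80]   # relative intensities of two lines for "239Pu"
sig_241 = [0.50, 0.96]   # the same two lines for "241Pu"

def unmix(measured):
    """Solve f239*sig_239 + f241*sig_241 = measured for the two fractions."""
    a, c = sig_239
    b, d = sig_241
    det = a * d - b * c
    f239 = (measured[0] * d - b * measured[1]) / det
    f241 = (a * measured[1] - measured[0] * c) / det
    return f239, f241

# a 70/30 mix of the two signatures should be recovered exactly
mix = [0.7 * 1.00 + 0.3 * 0.50, 0.7 * 0.80 + 0.3 * 0.96]
print([round(f, 3) for f in unmix(mix)])
```

    With ten prominent lines rather than two, the real problem is overdetermined and is solved by least squares, with counting statistics propagated into the fraction uncertainties.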

  8. Intra-binary Shock Heating of Black Widow Companions

    NASA Astrophysics Data System (ADS)

    Romani, Roger W.; Sanchez, Nicolas

    2016-09-01

    The low-mass companions of evaporating binary pulsars (black widows and similar) are strongly heated on the side facing the pulsar. However, in high-quality photometric and spectroscopic data, the heating pattern does not match that expected for direct pulsar illumination. Here we explore a model where the pulsar power is intercepted by an intra-binary shock (IBS) before heating the low-mass companion. We develop a simple analytic model and implement it in the popular “ICARUS” light curve code. The model is parameterized by the wind momentum ratio β and the companion wind speed f_v v_orb, and assumes that the reprocessed pulsar wind emits prompt particles or radiation to heat the companion surface. We illustrate an interesting range of light curve asymmetries controlled by these parameters. The code also computes the IBS synchrotron emission pattern, and thus can model black widow X-ray light curves. As a test, we apply the results to the high-quality asymmetric optical light curves of PSR J2215+5135; the resulting fit gives a substantial improvement upon direct heating models and produces an X-ray light curve consistent with that seen. The IBS model parameters imply that at the present loss rate, the companion evaporation has a characteristic timescale of τ_evap ≈ 150 Myr. Still, the model is not fully satisfactory, indicating that there are additional unmodeled physical effects.

  9. Electron cyclotron emission from nonthermal tokamak plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harvey, R.W.; O'Brien, M.R.; Rozhdestvensky, V.V.

    1993-02-01

    Electron cyclotron emission can be a sensitive indicator of nonthermal electron distributions. A new, comprehensive ray-tracing and cyclotron emission code that is aimed at predicting and interpreting the cyclotron emission from tokamak plasmas is described. The radiation transfer equation is solved along Wentzel–Kramers–Brillouin (WKB) rays using a fully relativistic calculation of the emission and absorption from electron distributions that are gyrotropic and toroidally symmetric, but may be otherwise arbitrary functions of the constants of motion. Using a radial array of electron distributions obtained from a bounce-averaged Fokker–Planck code modeling dc electric field and electron cyclotron heating effects, the cyclotron emission spectra are obtained. A pronounced strong nonthermal cyclotron emission feature that occurs at frequencies relativistically downshifted to second harmonic cyclotron frequencies outside the tokamak is calculated, in agreement with experimental results from the DIII-D [J. L. Luxon and L. G. Davies, Fusion Technol. 8, 441 (1985)] and FT-1 [D. G. Bulyginsky et al., in Proceedings of the 15th European Conference on Controlled Fusion and Plasma Heating, Dubrovnik, 1988 (European Physical Society, Petit-Lancy, 1988), Vol. 12B, Part II, p. 823] tokamaks. The calculations indicate the presence of a strong loss mechanism that operates on electrons in the 100–150 keV energy range.

  10. A FORTRAN code for the calculation of probe volume geometry changes in a laser anemometry system caused by window refraction

    NASA Technical Reports Server (NTRS)

    Owen, Albert K.

    1987-01-01

    A computer code was written which utilizes ray tracing techniques to predict the changes in position and geometry of a laser Doppler velocimeter probe volume resulting from refraction effects. The code predicts the position change, changes in beam crossing angle, and the amount of uncrossing that occur when the beams traverse a region with a changed index of refraction, such as a glass window. The code calculates the changes for flat plate, cylinder, general axisymmetric and general surface windows and is currently operational on a VAX 8600 computer system.
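
    For the flat-plate window case, the key physics is that Snell's law bends each beam inside the glass so it emerges parallel to its original direction but laterally displaced, which shifts the probe-volume position. A minimal sketch of that displacement (geometry and indices are illustrative; the actual code traces full 3-D rays through several window types):

```python
# Lateral displacement of a laser beam passing through a flat window:
# Snell's law gives the internal angle, and the beam exits parallel but
# shifted by d = t * sin(t1 - t2) / cos(t2). Illustrative values only.
import math

def flat_window_shift(theta_deg, thickness_mm, n_glass=1.5, n_air=1.0):
    """Lateral beam displacement (mm) through a flat window."""
    t1 = math.radians(theta_deg)                    # incidence angle
    t2 = math.asin(n_air * math.sin(t1) / n_glass)  # Snell's law
    return thickness_mm * math.sin(t1 - t2) / math.cos(t2)

# a beam at 30 degrees through a 10 mm window shifts by roughly 1.9 mm
print(round(flat_window_shift(30.0, 10.0), 3))
```

    Because the two crossing beams of an LDV probe hit the window at different angles, their displacements differ, which is exactly how the crossing angle changes and beam uncrossing described above arise.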

  11. General Tool for Evaluating High-Contrast Coronagraphic Telescope Performance Error Budgets

    NASA Technical Reports Server (NTRS)

    Marchen, Luis F.

    2011-01-01

    The Coronagraph Performance Error Budget (CPEB) tool automates many of the key steps required to evaluate the scattered starlight contrast in the dark hole of a space-based coronagraph. The tool uses a Code V prescription of the optical train, and uses MATLAB programs to call ray-trace code that generates linear beam-walk and aberration sensitivity matrices for motions of the optical elements and line-of-sight pointing, with and without controlled fine-steering mirrors (FSMs). The sensitivity matrices are imported by macros into Excel 2007, where the error budget is evaluated. The user specifies the particular optics of interest, and chooses the quality of each optic from a predefined set of PSDs. The spreadsheet creates a nominal set of thermal and jitter motions, and combines that with the sensitivity matrices to generate an error budget for the system. CPEB also contains a combination of form and ActiveX controls with Visual Basic for Applications code to allow for user interaction in which the user can perform trade studies such as changing engineering requirements, and identifying and isolating stringent requirements. It contains summary tables and graphics that can be instantly used for reporting results in view graphs. The entire process to obtain a coronagraphic telescope performance error budget has been automated into three stages: conversion of optical prescription from Zemax or Code V to MACOS (in-house optical modeling and analysis tool), a linear models process, and an error budget tool process. The first process was improved by developing a MATLAB package based on the Class Constructor Method with a number of user-defined functions that allow the user to modify the MACOS optical prescription. The second process was modified by creating a MATLAB package that contains user-defined functions that automate the process. 
The user interfaces with the process by utilizing an initialization file where the user defines the parameters of the linear model computations. Other than this, the process is fully automated. The third process was developed based on the Terrestrial Planet Finder coronagraph Error Budget Tool, but was fully automated by using VBA code, form, and ActiveX controls.
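
    The central combination step described above — applying ray-trace sensitivity matrices to an assumed set of thermal and jitter motions and accumulating the resulting contrast terms — can be sketched in a few lines. All matrix entries and motion amplitudes below are placeholders, not CPEB values:

```python
# Error-budget combination sketch: a linear sensitivity matrix (contrast
# change per unit motion, as produced by ray tracing) is applied to assumed
# motions, and the per-optic terms are root-sum-squared. Placeholder data.
sensitivity = [          # rows: optics; columns: [decenter, tilt]
    [2.0e-11, 5.0e-11],
    [1.0e-11, 8.0e-11],
]
motions = [0.5, 0.2]     # assumed motion amplitude per degree of freedom

def contrast_rss(S, x):
    """Apply the sensitivity matrix to the motions, then RSS the terms."""
    terms = [sum(s_ij * x_j for s_ij, x_j in zip(row, x)) for row in S]
    return sum(t * t for t in terms) ** 0.5

print(f"{contrast_rss(sensitivity, motions):.2e}")
```

    Trade studies of the kind the tool automates then amount to rescaling rows of the matrix (optic quality) or entries of the motion vector (engineering requirements) and recomputing the total.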

  12. NASA Tech Briefs, May 2013

    NASA Technical Reports Server (NTRS)

    2013-01-01

    Topics include: Test Waveform Applications for JPL STRS Operating Environment; Pneumatic Proboscis Heat-Flow Probe; Method to Measure Total Noise Temperature of a Wireless Receiver During Operation; Cursor Control Device Test Battery; Functional Near-Infrared Spectroscopy Signals Measure Neuronal Activity in the Cortex; ESD Test Apparatus for Soldering Irons; FPGA-Based X-Ray Detection and Measurement for an X-Ray Polarimeter; Sequential Probability Ratio Test for Spacecraft Collision Avoidance Maneuver Decisions; Silicon/Carbon Nanotube Photocathode for Splitting Water; Advanced Materials and Fabrication Techniques for the Orion Attitude Control Motor; Flight Hardware Packaging Design for Stringent EMC Radiated Emission Requirements; RF Reference Switch for Spaceflight Radiometer Calibration; An Offload NIC for NASA, NLR, and Grid Computing; Multi-Scale CNT-Based Reinforcing Polymer Matrix Composites for Lightweight Structures; Ceramic Adhesive and Methods for On-Orbit Repair of Re-Entry Vehicles; Self-Healing Nanocomposites for Reusable Composite Cryotanks; Pt-Ni and Pt-Co Catalyst Synthesis Route for Fuel Cell Applications; Aerogel-Based Multilayer Insulation with Micrometeoroid Protection; Manufacturing of Nanocomposite Carbon Fibers and Composite Cylinders; Optimized Radiator Geometries for Hot Lunar Thermal Environments; A Mission Concept: Re-Entry Hopper-Aero-Space-Craft System on-Mars (REARM-Mars); New Class of Flow Batteries for Terrestrial and Aerospace Energy Storage Applications; Reliability of CCGA 1152 and CCGA 1272 Interconnect Packages for Extreme Thermal Environments; Using a Blender to Assess the Microbial Density of Encapsulated Organisms; Mixed Integer Programming and Heuristic Scheduling for Space Communication; Video Altimeter and Obstruction Detector for an Aircraft; Control Software for Piezo Stepping Actuators; Galactic Cosmic Ray Event-Based Risk Model (GERM) Code; Sasquatch Footprint Tool; and Multi-User Space Link Extension (SLE) System.

  13. The NuSTAR Mission: Implementation and Science Prospects

    NASA Technical Reports Server (NTRS)

    Zhang, William W.

    2009-01-01

    NuSTAR is NASA's next X-ray observatory scheduled to be launched in 2011. It will have two multi-layered X-ray mirror assemblies capable of focusing X-rays in the band of 5 to 80 keV, providing unprecedented detection and imaging sensitivity in a band where previously only coded-mask or collimated detection was possible. In this talk I will describe the instrumentation and the prospects of using it to perform various kinds of astronomical studies.

  14. Studying the response of a plastic scintillator to gamma rays using the Geant4 Monte Carlo code.

    PubMed

    Ghadiri, Rasoul; Khorsandi, Jamshid

    2015-05-01

    To determine the gamma ray response function of an NE-102 scintillator and to investigate the gamma spectra due to the transport of optical photons, we simulated an NE-102 scintillator using Geant4 code. The results of the simulation were compared with experimental data. Good consistency between the simulation and the data was observed. In addition, the time and spatial distributions, along with the energy distribution and surface treatments of scintillation detectors, were calculated. This simulation enables optimization of the photomultiplier tube (or photodiode) position to yield the best coupling to the detector. Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. NASA Tech Briefs, January 2004

    NASA Technical Reports Server (NTRS)

    2004-01-01

    Topics covered include: Multisensor Instrument for Real-Time Biological Monitoring; Sensor for Monitoring Nanodevice-Fabrication Plasmas; Backed Bending Actuator; Compact Optoelectronic Compass; Micro Sun Sensor for Spacecraft; Passive IFF: Autonomous Nonintrusive Rapid Identification of Friendly Assets; Finned-Ladder Slow-Wave Circuit for a TWT; Directional Radio-Frequency Identification Tag Reader; Integrated Solar-Energy-Harvesting and -Storage Device; Event-Driven Random-Access-Windowing CCD Imaging System; Stroboscope Controller for Imaging Helicopter Rotors; Software for Checking State-charts; Program Predicts Broadband Noise from a Turbofan Engine; Protocol for a Delay-Tolerant Data-Communication Network; Software Implements a Space-Mission File-Transfer Protocol; Making Carbon-Nanotube Arrays Using Block Copolymers: Part 2; Modular Rake of Pitot Probes; Preloading To Accelerate Slow-Crack-Growth Testing; Miniature Blimps for Surveillance and Collection of Samples; Hybrid Automotive Engine Using Ethanol-Burning Miller Cycle; Fabricating Blazed Diffraction Gratings by X-Ray Lithography; Freeze-Tolerant Condensers; The StarLight Space Interferometer; Champagne Heat Pump; Controllable Sonar Lenses and Prisms Based on ERFs; Measuring Gravitation Using Polarization Spectroscopy; Serial-Turbo-Trellis-Coded Modulation with Rate-1 Inner Code; Enhanced Software for Scheduling Space-Shuttle Processing; Bayesian-Augmented Identification of Stars in a Narrow View; Spacecraft Orbits for Earth/Mars-Lander Radio Relay; and Self-Inflatable/Self-Rigidizable Reflectarray Antenna.

  16. Time-dependent, x-ray spectral unfolds and brightness temperatures for intense Li+ ion beam-driven hohlraums

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fehl, D.L.; Chandler, G.A.; Biggs, F.

    X-ray-producing hohlraums are being studied as indirect drives for inertial confinement fusion targets. In a 1994 target series on the PBFA II accelerator, cylindrical hohlraum targets were heated by an intense Li+ ion beam and viewed by an array of 13 time-resolved, filtered x-ray detectors (XRDs). The unfold operator (UFO) code and its suite of auxiliary functions were used extensively in obtaining time-resolved x-ray spectra and radiation temperatures from this diagnostic. The UFO was also used to obtain fitted response functions from calibration data, to simulate data from blackbody x-ray spectra of interest, to determine the suitability of various unfolding parameters (e.g., energy domain, energy partition, smoothing conditions, and basis functions), to interpolate the XRD signal traces, and to unfold experimental data. The simulation capabilities of the code were useful in understanding an anomalous feature in the unfolded spectra at low photon energies (≤100 eV). Uncertainties in the differential and energy-integrated unfolded spectra were estimated from uncertainties in the data. The time history of the radiation temperature agreed well with independent calculations of the wall temperature in the hohlraum. © 1997 American Institute of Physics.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hellfeld, Daniel; Barton, Paul; Gunter, Donald

    Gamma-ray imaging facilitates the efficient detection, characterization, and localization of compact radioactive sources in cluttered environments. Fieldable detector systems employing active planar coded apertures have demonstrated broad energy sensitivity via both coded aperture and Compton imaging modalities. But planar configurations suffer from a limited field-of-view, especially in the coded aperture mode. In order to improve upon this limitation, we introduce a novel design by rearranging the detectors into an active coded spherical configuration, resulting in a 4π isotropic field-of-view for both coded aperture and Compton imaging. This work focuses on the low-energy coded aperture modality and the optimization techniques used to determine the optimal number and configuration of 1 cm³ CdZnTe coplanar grid detectors on a 14 cm diameter sphere with 192 available detector locations.

  18. Development of a new EMP code at LANL

    NASA Astrophysics Data System (ADS)

    Colman, J. J.; Roussel-Dupré, R. A.; Symbalisty, E. M.; Triplett, L. A.; Travis, B. J.

    2006-05-01

    A new code for modeling the generation of an electromagnetic pulse (EMP) by a nuclear explosion in the atmosphere is being developed. The source of the EMP is the Compton current produced by the prompt radiation (γ-rays, X-rays, and neutrons) of the detonation. As a first step in building a multi-dimensional EMP code we have written three kinetic codes, Plume, Swarm, and Rad. Plume models the transport of energetic electrons in air. The Plume code solves the relativistic Fokker-Planck equation over a specified energy range that can include ~ 3 keV to 50 MeV and computes the resulting electron distribution function at each cell in a two dimensional spatial grid. The energetic electrons are allowed to transport, scatter, and experience Coulombic drag. Swarm models the transport of lower energy electrons in air, spanning 0.005 eV to 30 keV. The Swarm code performs a full 2-D solution to the Boltzmann equation for electrons in the presence of an applied electric field. Over this energy range the relevant processes to be tracked are elastic scattering, three body attachment, two body attachment, rotational excitation, vibrational excitation, electronic excitation, and ionization. All of these occur due to collisions between the electrons and neutral bodies in air. The Rad code solves the full radiation transfer equation in the energy range of 1 keV to 100 MeV. It includes effects of photo-absorption, Compton scattering, and pair-production. All of these codes employ a spherical coordinate system in momentum space and a cylindrical coordinate system in configuration space. The "z" axes of the momentum and configuration spaces are assumed to be parallel, and we are currently also assuming complete spatial symmetry around the "z" axis. Benchmarking for each of these codes will be discussed as well as the way forward towards an integrated modern EMP code.

  19. Study on detecting spatial distribution of neutrons and gamma rays using a multi-imaging plate system.

    PubMed

    Tanaka, Kenichi; Sakurai, Yoshinori; Endo, Satoru; Takada, Jun

    2014-06-01

In order to measure the spatial distributions of neutrons and gamma rays separately using an imaging plate, the requirements for a converter to enhance a specific component were investigated with the PHITS code. Consequently, enhancing the fast neutron component using recoil protons from epoxy resin was not effective, due to the high sensitivity of the imaging plate to gamma rays. However, a converter of epoxy resin doped with ¹⁰B was found to have potential for thermal and epithermal neutrons, and graphite for gamma rays. Copyright © 2014 Elsevier Ltd. All rights reserved.

  20. Laser Ray Tracing in a Parallel Arbitrary Lagrangian-Eulerian Adaptive Mesh Refinement Hydrocode

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Masters, N D; Kaiser, T B; Anderson, R W

    2009-09-28

ALE-AMR is a new hydrocode that we are developing as a predictive modeling tool for debris and shrapnel formation in high-energy laser experiments. In this paper we present our approach to implementing laser ray-tracing in ALE-AMR. We present the equations of laser ray tracing and our approach to efficient traversal of the adaptive mesh hierarchy, in which we propagate computational rays through a virtual composite mesh consisting of the finest-resolution representation of the modeled space. We anticipate simulations that will be compared to experiments for code validation.
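The "virtual composite mesh" idea, sampling whatever refinement level is finest at each point along the ray, can be caricatured in one dimension (a hypothetical hierarchy, not the ALE-AMR data structures):

```python
# Each level is (cell_size, coverage_start, coverage_end); level 0 covers
# the whole domain and later entries are progressively finer patches.
levels = [
    (1.0, 0.0, 8.0),
    (0.5, 2.0, 4.0),    # refined patch
    (0.25, 2.5, 3.0),   # doubly refined patch
]

def finest_cell_size(x):
    """Cell size of the finest level covering position x."""
    size = None
    for dx, lo, hi in levels:
        if lo <= x < hi:
            size = dx  # later (finer) levels override coarser ones
    return size

def march_ray(x0, x1):
    """Step a ray from x0 to x1, taking steps of the local finest cell size."""
    xs = [x0]
    x = x0
    while x < x1:
        x = min(x + finest_cell_size(x), x1)
        xs.append(x)
    return xs
```

A real implementation walks the hierarchy with block-level metadata instead of scanning every level per point, but the composite-mesh semantics are the same.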

  1. Effective algorithm for ray-tracing simulations of lobster eye and similar reflective optical systems

    NASA Astrophysics Data System (ADS)

    Tichý, Vladimír; Hudec, René; Němcová, Šárka

    2016-06-01

The algorithm presented is intended mainly for lobster eye optics. This type of optics (and some similar types) allows for a simplification of the classical ray-tracing procedure, which requires a great many rays to be simulated. The method presented simulates only a few rays and is therefore extremely efficient. Moreover, a specific mathematical formalism is used to simplify the equations. Only a few simple equations are needed, so the program code can be simple as well. The paper also outlines how to apply the method to some other reflective optical systems.
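Each simulated ray in such reflective optics undergoes one or two specular reflections; the reflection step itself is just the standard formula d' = d - 2(d·n)n. A minimal sketch (illustrative, not the authors' code):

```python
def reflect(d, n):
    """Specular reflection of direction vector d off a surface with unit normal n."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return [di - 2.0 * dot * ni for di, ni in zip(d, n)]

# A ray traveling down-and-right hits a horizontal mirror from above.
out = reflect([1.0, -1.0, 0.0], [0.0, 1.0, 0.0])  # -> [1.0, 1.0, 0.0]
```

Because a lobster-eye channel reflects each ray at most a couple of times, chaining this formula a handful of times per representative ray is all the geometry the simplified procedure needs.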

  2. Partially coherent wavefront propagation simulations: Mirror and monochromator crystal quality assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wiegart, L., E-mail: lwiegart@bnl.gov; Fluerasu, A.; Chubar, O.

    2016-07-27

We have applied fully- and partially-coherent synchrotron radiation wavefront propagation simulations, implemented in the “Synchrotron Radiation Workshop” (SRW) computer code, to analyse the effects of imperfect mirrors and monochromator crystals at the Coherent Hard X-ray beamline. This beamline is designed for X-ray Photon Correlation Spectroscopy, a technique that heavily relies on the partial coherence of the X-ray beam and benefits from a careful preservation of the X-ray wavefront. We present simulations and a comparison with the measured beam profile at the sample position, which show the impact of imperfect optics on the wavefront.

  3. General Electromagnetic Model for the Analysis of Complex Systems (GEMACS) Computer Code Documentation (Version 3). Volume 3. Part 2.

    DTIC Science & Technology

    1983-09-01

6. CALLING ROUTINE: FLDDRV ... 1. NAME: PLAINT (GTD) 2. PURPOSE: To determine if a ray traveling from a given source loca...determine if a source ray reflection from plate MP occurs. If a ray traveling from the source image location in the reflected ray direction passes through

  4. Integrated Idl Tool For 3d Modeling And Imaging Data Analysis

    NASA Astrophysics Data System (ADS)

    Nita, Gelu M.; Fleishman, G. D.; Gary, D. E.; Kuznetsov, A. A.; Kontar, E. P.

    2012-05-01

Addressing many key problems in solar physics requires detailed analysis of non-simultaneous imaging data obtained in various wavelength domains with different spatial resolution, and their comparison with each other supported by advanced 3D physical models. To facilitate achieving this goal, we have undertaken a major enhancement of IDL-based simulation tools developed earlier for modeling microwave and X-ray emission. The greatly enhanced object-based architecture provides an interactive graphical user interface that allows the user i) to import photospheric magnetic field maps and perform magnetic field extrapolations to almost instantly generate 3D magnetic field models; ii) to investigate the magnetic topology of these models by interactively creating magnetic field lines and associated magnetic field tubes; iii) to populate them with user-defined nonuniform thermal plasma and anisotropic nonuniform nonthermal electron distributions; and iv) to calculate the spatial and spectral properties of radio and X-ray emission. The application integrates DLLs and shared libraries containing fast gyrosynchrotron emission codes developed in FORTRAN and C++, soft and hard X-ray codes developed in IDL, and a potential field extrapolation DLL based on original FORTRAN code developed by V. Abramenko and V. Yurchishin. The interactive interface allows users to add any user-defined IDL or external callable radiation code, as well as user-defined magnetic field extrapolation routines. To illustrate the tool capabilities, we present a step-by-step live computation of microwave and X-ray images from realistic magnetic structures obtained from a magnetic field extrapolation preceding a real event, and compare them with the actual imaging data produced by the NORH and RHESSI instruments.
This work was supported in part by NSF grants AGS-0961867, AST-0908344, AGS-0969761, and NASA grants NNX10AF27G and NNX11AB49G to New Jersey Institute of Technology, by a UK STFC rolling grant, the Leverhulme Trust, UK, and by the European Commission through the Radiosun and HESPE Networks.

  5. Assessment of radiological protection systems among diagnostic radiology facilities in North East India.

    PubMed

    Singh, Thokchom Dewan; Jayaraman, T; Arunkumar Sharma, B

    2017-03-01

This study aims to assess the adequacy of radiological protection systems available in the diagnostic radiology facilities located in three capital cities of North East (NE) India. It further attempts to understand, using a multi-disciplinary approach, how the safety codes/standards in diagnostic radiology framed by the Atomic Energy Regulatory Board (AERB) and the International Atomic Energy Agency (IAEA) to achieve adequate radiological protection in facilities have been perceived, conceptualized, and applied in these facilities. About 30 diagnostic radiology facilities were randomly selected from three state capitals in NE India, namely Imphal (Manipur), Shillong (Meghalaya) and Guwahati (Assam). A semi-structured questionnaire developed based on a multi-disciplinary approach was used for this study. It was observed that radiological practices undertaken in these facilities were not exactly in line with the safety codes/standards in diagnostic radiology of the AERB and the IAEA. About 50% of the facilities had registered/licensed x-ray equipment with the AERB. More than 80% of the workers did not use radiation protective devices, although these devices were available in the facilities. About 85% of facilities had no institutional risk management system. About 70% of the facilities did not carry out periodic quality assurance testing of their x-ray equipment or surveys of radiation leakage around the x-ray room, and did not display radiation safety indicators in the x-ray rooms. Workers in these facilities exhibited low risk perception about the risks associated with these practices. The majority of diagnostic radiology facilities in NE India did not comply with the radiological safety codes/standards framed by the AERB and IAEA. The study found inadequate levels of radiological protection systems in the majority of facilities.
This study suggests a need to establish firm measures that comply with the radiological safety codes/standards of the AERB and IAEA to protect patients, workers and the public of this region.

  6. INVESTIGATION OF THE SUN'S X-RAYS. III. ELECTRONIC APPARATUS (in Russian)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vasil'ev, B.N.; Shurigin, A.I.; Tindo, I.P.

    1963-01-01

The electronic portion of an apparatus constructed for the investigation of soft x rays emitted by the sun is described. The apparatus is used in geophysical rockets and in cosmic space ships and earth satellites. In the geophysical rockets two separate detection channels are employed, one for the working counters and the other for the control counters. The working counter is always directed towards the sun while the control counter is turned 15 deg away from the sun. In the second Sputnik six identical counters were used and arranged so that their lines of sight were oriented along three mutually perpendicular axes. In the third Sputnik the working and control counters were distributed in a system which was self-orienting with respect to the sun. In addition, two stationary counters were employed; their direction with respect to the sun changed during the course of the flight. The electronic apparatus consists of the following basic components: a circuit that forms the amplitude and shape of the counter pulses, the triggering device, the separating cascade circuit, and the coding set-up. Each of these circuits is described in detail; block diagrams are shown. (TTT)

  7. Bistatic 3D Electromagnetic Scattering From a Right-Angle Dihedral at Arbitrary Orientation and Position

    DTIC Science & Technology

    2011-03-24

compared to shooting and bouncing rays (SBR) and method of moments (MoM) predictions, as well as measured data for applicable cases. The model in this...prediction codes based on Shooting and Bouncing Rays (SBR) or Method of Moments (MoM) can be used to obtain accurate bistatic scattering solutions for a...in-plane RCS pattern for dihedral. (a) For monostatic in-plane scattering, rays entering a right-angle dihedral are reflected back in the direction

  8. Laser x-ray Conversion and Electron Thermal Conductivity

    NASA Astrophysics Data System (ADS)

    Wang, Guang-yu; Chang, Tie-qiang

    2001-02-01

The influence of electron thermal conductivity on laser x-ray conversion in the coupling of a 3ω0 laser with an Au plane target has been investigated using a non-LTE radiation hydrodynamic code. A non-local electron thermal conductivity is introduced and compared with two kinds of flux-limited Spitzer-Härm descriptions. The results show that the non-local thermal conductivity increases the laser x-ray conversion efficiency and causes important changes in the plasma state and coupling features.

  9. Spectroscopy of M-shell x-ray transitions in Zn-like through Co-like W

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clementson, J; Beiersdorfer, P; Brown, G V

    2009-07-08

The M-shell x-ray emission of highly charged tungsten ions has been investigated at the Livermore electron beam ion trap facility. Using the SuperEBIT electron beam ion trap and a NASA x-ray calorimeter array, transitions connecting the ground configurations in the 1500-3600 eV spectral range of zinc-like W^44+ through cobalt-like W^47+ have been measured. The measured spectra are compared with theoretical line positions and emissivities calculated using the FAC code.

  10. Ray Methods for Acoustic Scattering, Optics Of Bubbles, Diffraction Catastrophes, and Nonlinear Acoustics.

    DTIC Science & Technology

    1992-11-24

15 Code I: Internal Reports ... 19 Code M: Oral ... experiments. 13. S. M. Baumer: completed M.S. thesis in 1988 on light scattering. 14. C. E. Dean: completed Ph.D. dissertation in 1989 on light ... novel oscillation induced flow instabilities. 18. J. M. Winey: awarded M.S. degree in 1990 with project on capillary wave experiments. He

  11. Fully Automated Data Collection Using PAM and the Development of PAM/SPACE Reversible Cassettes

    NASA Astrophysics Data System (ADS)

    Hiraki, Masahiko; Watanabe, Shokei; Chavas, Leonard M. G.; Yamada, Yusuke; Matsugaki, Naohiro; Igarashi, Noriyuki; Wakatsuki, Soichi; Fujihashi, Masahiro; Miki, Kunio; Baba, Seiki; Ueno, Go; Yamamoto, Masaki; Suzuki, Mamoru; Nakagawa, Atsushi; Watanabe, Nobuhisa; Tanaka, Isao

    2010-06-01

    To remotely control and automatically collect data in high-throughput X-ray data collection experiments, the Structural Biology Research Center at the Photon Factory (PF) developed and installed sample exchange robots PAM (PF Automated Mounting system) at PF macromolecular crystallography beamlines; BL-5A, BL-17A, AR-NW12A and AR-NE3A. We developed and installed software that manages the flow of the automated X-ray experiments; sample exchanges, loop-centering and X-ray diffraction data collection. The fully automated data collection function has been available since February 2009. To identify sample cassettes, PAM employs a two-dimensional bar code reader. New beamlines, BL-1A at the Photon Factory and BL32XU at SPring-8, are currently under construction as part of Targeted Proteins Research Program (TPRP) by the Ministry of Education, Culture, Sports, Science and Technology of Japan. However, different robots, PAM and SPACE (SPring-8 Precise Automatic Cryo-sample Exchanger), will be installed at BL-1A and BL32XU, respectively. For the convenience of the users of both facilities, pins and cassettes for PAM and SPACE are developed as part of the TPRP.

  12. Engineering a therapeutic lectin by uncoupling mitogenicity from antiviral activity.

    PubMed

    Swanson, Michael D; Boudreaux, Daniel M; Salmon, Loïc; Chugh, Jeetender; Winter, Harry C; Meagher, Jennifer L; André, Sabine; Murphy, Paul V; Oscarson, Stefan; Roy, René; King, Steven; Kaplan, Mark H; Goldstein, Irwin J; Tarbet, E Bart; Hurst, Brett L; Smee, Donald F; de la Fuente, Cynthia; Hoffmann, Hans-Heinrich; Xue, Yi; Rice, Charles M; Schols, Dominique; Garcia, J Victor; Stuckey, Jeanne A; Gabius, Hans-Joachim; Al-Hashimi, Hashim M; Markovitz, David M

    2015-10-22

    A key effector route of the Sugar Code involves lectins that exert crucial regulatory controls by targeting distinct cellular glycans. We demonstrate that a single amino-acid substitution in a banana lectin, replacing histidine 84 with a threonine, significantly reduces its mitogenicity, while preserving its broad-spectrum antiviral potency. X-ray crystallography, NMR spectroscopy, and glycocluster assays reveal that loss of mitogenicity is strongly correlated with loss of pi-pi stacking between aromatic amino acids H84 and Y83, which removes a wall separating two carbohydrate binding sites, thus diminishing multivalent interactions. On the other hand, monovalent interactions and antiviral activity are preserved by retaining other wild-type conformational features and possibly through unique contacts involving the T84 side chain. Through such fine-tuning, target selection and downstream effects of a lectin can be modulated so as to knock down one activity, while preserving another, thus providing tools for therapeutics and for understanding the Sugar Code. Copyright © 2015 Elsevier Inc. All rights reserved.

  13. Engineering a Therapeutic Lectin by Uncoupling Mitogenicity from Antiviral Activity

    PubMed Central

    Swanson, Michael D.; Boudreaux, Daniel M.; Salmon, Loïc; Chugh, Jeetender; Winter, Harry C.; Meagher, Jennifer L.; André, Sabine; Murphy, Paul V.; Oscarson, Stefan; Roy, René; King, Steven; Kaplan, Mark H.; Goldstein, Irwin J.; Tarbet, E. Bart; Hurst, Brett L.; Smee, Donald F.; de la Fuente, Cynthia; Hoffmann, Hans-Heinrich; Xue, Yi; Rice, Charles M.; Schols, Dominique; Garcia, J. Victor; Stuckey, Jeanne A.; Gabius, Hans-Joachim; Al-Hashimi, Hashim M.; Markovitz, David M.

    2015-01-01

A key effector route of the Sugar Code involves lectins that exert crucial regulatory controls by targeting distinct cellular glycans. We demonstrate that a single amino acid substitution in a banana lectin, replacing histidine 84 with a threonine, significantly reduces its mitogenicity while preserving its broad-spectrum antiviral potency. X-ray crystallography, NMR spectroscopy, and glycocluster assays reveal that loss of mitogenicity is strongly correlated with loss of pi-pi stacking between aromatic amino acids H84 and Y83, which removes a wall separating two carbohydrate binding sites, thus diminishing multivalent interactions. On the other hand, monovalent interactions and antiviral activity are preserved by retaining other wild-type conformational features and possibly through unique contacts involving the T84 side chain. Through such fine-tuning, target selection and downstream effects of a lectin can be modulated so as to knock down one activity while preserving another, thus providing tools for therapeutics and for understanding the Sugar Code. PMID:26496612

  14. Common Errors in the Calculation of Aircrew Doses from Cosmic Rays

    NASA Astrophysics Data System (ADS)

    O'Brien, Keran; Felsberger, Ernst; Kindl, Peter

    2010-05-01

Radiation doses to air crew are calculated using flight codes. Flight codes integrate dose rates, calculated by transport codes or obtained from measurements, over the aircraft flight path from takeoff at one airport to landing at another. The dose rates are stored in various ways, such as by latitude and longitude, or in terms of the geomagnetic vertical cutoff. The transport codes are generally quite satisfactory, but the treatment of the boundary conditions is frequently incorrect. Both the treatment of solar modulation and of the effect of the geomagnetic field are often defective, leading to systematic overestimates of the crew doses.
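The core operation of a flight code, integrating a stored dose-rate field along the flight path, can be sketched as follows. The dose-rate function here is a hypothetical stand-in keyed on geomagnetic vertical cutoff, not any real tabulation:

```python
def dose_rate_uSv_per_h(cutoff_gv):
    """Hypothetical tabulated dose rate: higher cutoff (equator) -> lower rate."""
    return 5.0 / (1.0 + cutoff_gv)

def route_dose(waypoints):
    """Trapezoidal integration of dose rate over (time_h, cutoff_GV) waypoints."""
    total = 0.0
    for (t0, c0), (t1, c1) in zip(waypoints, waypoints[1:]):
        r0 = dose_rate_uSv_per_h(c0)
        r1 = dose_rate_uSv_per_h(c1)
        total += 0.5 * (r0 + r1) * (t1 - t0)
    return total  # microsieverts

# An 8-hour route from a mid-cutoff airport toward the pole and back.
dose = route_dose([(0.0, 2.0), (4.0, 0.5), (8.0, 2.0)])
```

The errors the abstract criticizes live inside the dose-rate table itself (solar modulation and geomagnetic boundary conditions), not in this integration step, which is usually the uncontroversial part.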

  15. 3D-PDR: Three-dimensional photodissociation region code

    NASA Astrophysics Data System (ADS)

    Bisbas, T. G.; Bell, T. A.; Viti, S.; Yates, J.; Barlow, M. J.

    2018-03-01

    3D-PDR is a three-dimensional photodissociation region code written in Fortran. It uses the Sundials package (written in C) to solve the set of ordinary differential equations and it is the successor of the one-dimensional PDR code UCL_PDR (ascl:1303.004). Using the HEALpix ray-tracing scheme (ascl:1107.018), 3D-PDR solves a three-dimensional escape probability routine and evaluates the attenuation of the far-ultraviolet radiation in the PDR and the propagation of FIR/submm emission lines out of the PDR. The code is parallelized (OpenMP) and can be applied to 1D and 3D problems.
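The attenuation step such a code evaluates along its HEALPix rays can be illustrated with a toy version. The exponential form and the γ ≈ 3.02 dust coefficient are common PDR-modeling conventions assumed here, not necessarily the exact treatment in 3D-PDR:

```python
import math

GAMMA = 3.02  # assumed FUV dust-attenuation exponent per magnitude of A_V

def attenuated_fuv(chi0, av_along_rays):
    """Average the attenuated FUV field chi0*exp(-GAMMA*Av) over a set of rays."""
    return chi0 * sum(math.exp(-GAMMA * av) for av in av_along_rays) / len(av_along_rays)

# An unshielded ray, a moderately shielded ray, and a deeply embedded one.
chi = attenuated_fuv(1.0, [0.0, 1.0, 10.0])
```

In the 3D case each cell averages over the full set of ray directions, so a single unobscured direction can dominate the local FUV field, which is exactly why a multi-ray scheme matters.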

  16. Streaked x-ray backlighting with twin-slit imager for study of density profile and trajectory of low-density foam target filled with deuterium liquid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shiraga, H.; Mahigashi, N.; Yamada, T.

    2008-10-15

Low-density plastic foam filled with liquid deuterium is one of the candidates for an inertial fusion target. The density profile and trajectory of a 527 nm laser-irradiated planar foam-deuterium target in the acceleration phase were observed with streaked side-on x-ray backlighting. An x-ray imager employing twin slits coupled to an x-ray streak camera was used to simultaneously observe three images of the target: self-emission from the target, the x-ray backlighter profile, and the backlit target. The experimentally obtained density profile and trajectory were in good agreement with predictions by the one-dimensional hydrodynamic simulation code ILESTA-1D.

  17. X-ray beam-shaping via deformable mirrors: surface profile and point spread function computation for Gaussian beams using physical optics.

    PubMed

    Spiga, D

    2018-01-01

X-ray mirrors with high focusing performances are commonly used in different sectors of science, such as X-ray astronomy, medical imaging and synchrotron/free-electron laser beamlines. While deformations of the mirror profile may cause degradation of the focus sharpness, a deliberate deformation of the mirror can be made to endow the focus with a desired size and distribution, via piezo actuators. The resulting profile can be characterized with suitable metrology tools and correlated with the expected optical quality via a wavefront propagation code or, sometimes, predicted using geometric optics. In the latter case and for the special class of profile deformations with monotonically increasing derivative, i.e. concave upwards, the point spread function (PSF) can even be predicted analytically. Moreover, under these assumptions, the relation can also be reversed: from the desired PSF the required profile deformation can be computed analytically, avoiding the use of trial-and-error search codes. However, the computation has so far been limited to geometric optics, which entailed some limitations: for example, mirror diffraction effects and the size of the coherent X-ray source were not considered. In this paper, the beam-shaping formalism in the framework of physical optics is reviewed, in the limit of small light wavelengths and in the case of Gaussian intensity wavefronts. Some examples of shaped profiles are also shown, aiming at turning a Gaussian intensity distribution into a top-hat one, and the shaping performance is checked by computing the at-wavelength PSF by means of the WISE code.
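In the geometric-optics limit mentioned above, a long-period slope deformation s(x) deflects each reflected ray by 2 s(x), so the ray lands at y = 2 f s(x) in a focal plane at distance f; binning the landings gives the geometric PSF. A minimal sketch with hypothetical numbers (not the WISE code):

```python
import math

f = 10.0        # focal distance in m (assumed)
L = 0.2         # mirror length in m (assumed)
n_rays, n_bins = 10001, 51

# Trace rays: each ray's focal-plane landing is y = 2*f*s(x).
landings = []
for i in range(n_rays):
    x = L * i / (n_rays - 1)
    s = 1e-6 * math.sin(2.0 * math.pi * x / L)  # hypothetical slope error, rad
    landings.append(2.0 * f * s)

# Histogram the landings into the normalized geometric PSF.
y_min, y_max = min(landings), max(landings)
psf = [0] * n_bins
for y in landings:
    k = min(int((y - y_min) / (y_max - y_min) * n_bins), n_bins - 1)
    psf[k] += 1
psf = [c / n_rays for c in psf]
```

The physical-optics treatment the paper develops replaces this histogram with a wavefront propagation, which is what restores diffraction and source-size effects.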

  18. Morse Monte Carlo Radiation Transport Code System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Emmett, M.B.

    1975-02-01

The report contains descriptions of the MORSE and PICTURE codes, input descriptions, sample problems, derivations of the physical equations, and explanations of the various error messages. The MORSE code is a multipurpose neutron and gamma-ray transport Monte Carlo code. Time dependence for both shielding and criticality problems is provided. General three-dimensional geometry may be used with an albedo option available at any material surface. The PICTURE code provides aid in preparing correct input data for the combinatorial geometry package CG. It provides a printed view of arbitrary two-dimensional slices through the geometry. By inspecting these pictures one may determine if the geometry specified by the input cards is indeed the desired geometry. 23 refs. (WRF)
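The most basic Monte Carlo primitive such transport codes rely on, sampling exponential free paths between collisions, can be sketched in a few lines (an illustration, not MORSE itself):

```python
import math
import random

def transmitted_fraction(mu, t, n=200_000, seed=1):
    """Estimate the fraction of particles crossing a slab of thickness t
    (total macroscopic cross section mu) without colliding, by sampling
    exponential path lengths s = -ln(u)/mu."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n) if -math.log(1.0 - rng.random()) / mu > t)
    return hits / n

est = transmitted_fraction(mu=1.0, t=2.0)
exact = math.exp(-2.0)  # analytic uncollided transmission
```

With 200k histories the estimate agrees with exp(-2) to a few parts per thousand; variance-reduction techniques in production codes exist precisely to beat this slow 1/√N convergence.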

  19. RAINIER: A simulation tool for distributions of excited nuclear states and cascade fluctuations

    DOE PAGES

    Kirsch, L. E.; Bernstein, L. A.

    2018-03-04

In this paper, a new code named RAINIER has been developed that simulates the γ-ray decay of discrete and quasi-continuum nuclear levels for a user-specified range of energy, angular momentum, and parity, including a realistic treatment of level spacing and transition width fluctuations. A similar program, DICEBOX, uses the Monte Carlo method to simulate level and width fluctuations but is restricted in its initial level population algorithm. On the other hand, modern reaction codes such as TALYS and EMPIRE populate a wide range of states in the residual nucleus prior to γ-ray decay, but do not go beyond the use of deterministic functions and therefore neglect cascade fluctuations. This combination of capabilities allows RAINIER to be used to determine quasi-continuum properties through comparison with experimental data. Finally, several examples are given that demonstrate how cascade fluctuations influence experimental high-resolution γ-ray spectra from reactions that populate a wide range of initial states.
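A "realistic treatment of level spacing and transition width fluctuations" conventionally means Wigner-surmise nearest-neighbor spacings and Porter-Thomas (χ² with one degree of freedom) partial widths; the sketch below samples both under that assumption (it is not RAINIER's actual sampler):

```python
import math
import random

def wigner_spacing(mean, rng):
    """Sample a nearest-neighbor level spacing from the Wigner surmise
    via inverse-CDF: F(s) = 1 - exp(-pi*s**2 / (4*mean**2))."""
    u = rng.random()
    return mean * math.sqrt(-4.0 * math.log(1.0 - u) / math.pi)

def porter_thomas_width(mean, rng):
    """Sample a partial width: chi-squared with one degree of freedom."""
    return mean * rng.gauss(0.0, 1.0) ** 2

rng = random.Random(42)
spacings = [wigner_spacing(1.0, rng) for _ in range(100_000)]
widths = [porter_thomas_width(1.0, rng) for _ in range(100_000)]
```

Both distributions reproduce the prescribed mean but carry large relative fluctuations; it is exactly these fluctuations that deterministic functions in TALYS or EMPIRE average away.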

  20. RAINIER: A simulation tool for distributions of excited nuclear states and cascade fluctuations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kirsch, L. E.; Bernstein, L. A.

In this paper, a new code named RAINIER has been developed that simulates the γ-ray decay of discrete and quasi-continuum nuclear levels for a user-specified range of energy, angular momentum, and parity, including a realistic treatment of level spacing and transition width fluctuations. A similar program, DICEBOX, uses the Monte Carlo method to simulate level and width fluctuations but is restricted in its initial level population algorithm. On the other hand, modern reaction codes such as TALYS and EMPIRE populate a wide range of states in the residual nucleus prior to γ-ray decay, but do not go beyond the use of deterministic functions and therefore neglect cascade fluctuations. This combination of capabilities allows RAINIER to be used to determine quasi-continuum properties through comparison with experimental data. Finally, several examples are given that demonstrate how cascade fluctuations influence experimental high-resolution γ-ray spectra from reactions that populate a wide range of initial states.

  1. ^235U(n,xnγ) Excitation Function Measurements Using Gamma-Ray Spectroscopy at GEANIE

    NASA Astrophysics Data System (ADS)

    Younes, W.; Becker, J. A.; Bernstein, L. A.; Archer, D. E.; Stoyer, M. A.; Hauschild, K.; Drake, D. M.; Johns, G. D.; Nelson, R. O.; Wilburn, S. W.

    1998-04-01

The ^235U(n,xn) cross sections (where x<=2) have previously been measured at several incident neutron energies. In particular, the ^235U(n,2n) cross section has been measured reliably (J. Frehaut et al., Nucl. Sci. Eng. 74, 29 (1980)) up to its peak near E_n ≈ 11 MeV, but not along the tail, which some codes (M.B. Chadwick, private communication) predict to yield significant (e.g. >= 10% of peak) cross section out to E_n ≈ 30 MeV. We have measured gamma-ray spectra resulting from ^235U(n,xn) as a function of neutron energy in the range 1 MeV <~ E_n <~ 200 MeV using the GEANIE spectrometer at the LANSCE/WNR ``white'' neutron source. We will present excitation functions for the de-excitation gamma rays in ^234,235U compared to predictions from the Hauser-Feshbach-preequilibrium code GNASH (M.B. Chadwick and P.G. Young, Los Alamos Report No. LA-UR-93-104, 1993).

  2. Simulations of a Molecular Cloud experiment using CRASH

    NASA Astrophysics Data System (ADS)

    Trantham, Matthew; Keiter, Paul; Vandervort, Robert; Drake, R. Paul; Shvarts, Dov

    2017-10-01

    Recent laboratory experiments explore molecular cloud radiation hydrodynamics. The experiment irradiates a gold foil with a laser producing x-rays to drive the implosion or explosion of a foam ball. The CRASH code, an Eulerian code with block-adaptive mesh refinement, multigroup diffusive radiation transport, and electron heat conduction developed at the University of Michigan to design and analyze high-energy-density experiments, is used to perform a parameter search in order to identify optically thick, optically thin and transition regimes suitable for these experiments. Specific design issues addressed by the simulations are the x-ray drive temperature, foam density, distance from the x-ray source to the ball, as well as other complicating issues such as the positioning of the stalk holding the foam ball. We present the results of this study and show ways the simulations helped improve the quality of the experiment. This work is funded by the LLNL under subcontract B614207 and NNSA-DS and SC-OFES Joint Program in High-Energy-Density Laboratory Plasmas, Grant Number DE-NA0002956.

  3. Acoustic waves in M dwarfs: Maintaining a corona

    NASA Technical Reports Server (NTRS)

    Mullan, D. J.; Cheng, Q. Q.

    1994-01-01

    We use a time-dependent hydrodynamics code to follow the propagation of acoustic waves into the corona of an M dwarf star. An important qualitative difference between M dwarfs and stars such as the Sun is that the acoustic spectrum in M dwarfs is expected to peak at periods close to the acoustic cutoff P(sub A): this allows more effective penetration of waves into the corona. In our code, radiative losses in the photosphere, chromosphere, and corona are computed using Rosseland mean opacities, Mg II kappa and Ly alpha emission, and optically thin emissivities respectively. We find that acoustic heating can maintain a corona with a temperature of order 0.7-1 x 10(exp 6) K and a surface X-ray flux as large as 10(exp 5)ergs/sq cm/s. In a recent survey of X-rays from M dwarfs, some (20%-30%) of the stars lie at or below this limiting X-ray flux: we suggest that such stars may be candidates for acoustically maintained coronae.

  4. Validation of a Laser-Ray Package in an Eulerian Code

    NASA Astrophysics Data System (ADS)

    Bradley, Paul; Hall, Mike; McKenty, Patrick; Collins, Tim; Keller, David

    2014-10-01

A laser-ray absorption package was recently installed in the RAGE code by the Laboratory for Laser Energetics (LLE). In this presentation, we describe our use of this package to implode Omega 60-beam symmetric direct-drive capsules. The capsules have outer diameters of about 860 microns, CH plastic shell thicknesses between 8 and 32 microns, DD or DT gas fills between 5 and 20 atmospheres, and a 1 ns square pulse of 23 to 27 kJ. These capsule implosions were previously modeled with a calibrated energy source in the outer layer of the capsule, where we matched bang time and burn ion temperature well, but the simulated yields were two to three times higher than the data. We will run simulations with laser-ray energy deposition and compare the results to the yield and spectroscopic data. Work performed by Los Alamos National Laboratory under Contract DE-AC52-06NA25396 for the National Nuclear Security Administration of the U.S. Department of Energy.

  5. Study of solid-conversion gaseous detector based on GEM for high energy X-ray industrial CT.

    PubMed

    Zhou, Rifeng; Zhou, Yaling

    2014-01-01

The general gaseous ionization detectors are not suitable for high energy X-ray industrial computed tomography (HEICT) because of their inherent limitations, especially low detection efficiency and large volume. The goal of this study was to investigate a new type of gaseous detector to solve these problems. The novel detector uses a metal foil as an X-ray converter to improve conversion efficiency, and a Gas Electron Multiplier (hereinafter "GEM") as an electron amplifier to reduce its volume. The detection mechanism and signal formation of the detector are discussed in detail. The conversion efficiency was calculated using the EGSnrc Monte Carlo code, and the transport of photons and the secondary electron avalanche in the detector were simulated with the Maxwell and Garfield codes. The results indicated that this detector has higher conversion efficiency as well as a smaller volume. Theoretically this kind of detector could be a strong candidate for replacing conventional detectors in HEICT.
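A zeroth-order version of the quantity the EGSnrc calculation refines, the probability that an incident photon interacts in the converter foil, is 1 - exp(-μt); the coefficient and thickness below are purely hypothetical:

```python
import math

def conversion_probability(mu_cm, t_cm):
    """Fraction of incident photons that interact in a foil of thickness
    t_cm [cm] with linear attenuation coefficient mu_cm [1/cm]."""
    return 1.0 - math.exp(-mu_cm * t_cm)

p = conversion_probability(mu_cm=4.0, t_cm=0.05)  # hypothetical values
```

A full Monte Carlo treatment is still needed because not every interacting photon ejects an electron that escapes the foil into the gas; this simple bound only caps the achievable efficiency.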

  6. Experimental verification of bremsstrahlung production and dosimetry predictions for 15.5 MeV electrons

    NASA Astrophysics Data System (ADS)

    Sanford, T. W. L.; Beutler, D. E.; Halbleib, J. A.; Knott, D. P.

    1991-12-01

    The radiation produced by a 15.5-MeV monoenergetic electron beam incident on optimized and nonoptimized bremsstrahlung targets is characterized using the ITS Monte Carlo code and measurements with equilibrated and nonequilibrated TLD dosimetry. Comparisons between calculations and measurements verify the calculations and demonstrate that the code can be used to predict both bremsstrahlung production and TLD response for radiation fields that are characteristic of those produced by pulsed simulators of gamma rays. The comparisons provide independent confirmation of the validity of the TLD calibration for photon fields characteristic of gamma-ray simulators. The empirical Martin equation, which is often used to calculate radiation dose from optimized bremsstrahlung targets, is examined, and its range of validity is established.

  7. Analysis and Description of HOLTIN Service Provision for AECG monitoring in Complex Indoor Environments

    PubMed Central

    Led, Santiago; Azpilicueta, Leire; Aguirre, Erik; de Espronceda, Miguel Martínez; Serrano, Luis; Falcone, Francisco

    2013-01-01

In this work, a novel in-house ambulatory ECG monitoring device called HOLTIN is analyzed while operating in complex indoor scenarios. The HOLTIN system is described from the technological platform level to its functional model. In addition, the wireless channel behavior that enables ubiquitous operation is characterized using an in-house 3D ray launching simulation code. The effect of human body presence is taken into account by a novel simplified model embedded within the 3D ray launching code. Simulation as well as measurement results are presented, showing good agreement. These results may aid in the adequate deployment of this novel device to automate conventional medical processes, increasing the coverage radius and optimizing energy consumption. PMID:23584122

  8. Analysis of soft x-ray emission spectra of laser-produced dysprosium, erbium and thulium plasmas

    NASA Astrophysics Data System (ADS)

    Sheil, John; Dunne, Padraig; Higashiguchi, Takeshi; Kos, Domagoj; Long, Elaine; Miyazaki, Takanori; O'Reilly, Fergal; O'Sullivan, Gerard; Sheridan, Paul; Suzuki, Chihiro; Sokell, Emma; White, Elgiva; Kilbane, Deirdre

    2017-03-01

    Soft x-ray emission spectra of dysprosium, erbium and thulium ions created in laser-produced plasmas were recorded with a flat-field grazing-incidence spectrometer in the 2.5-8 nm spectral range. The ions were produced using an Nd:YAG laser of 7 ns pulse duration and the spectra were recorded at various power densities. The experimental spectra were interpreted with the aid of the Cowan suite of atomic structure codes and the flexible atomic code. At wavelengths above 5.5 nm the spectra are dominated by overlapping n = 4 - n = 4 unresolved transition arrays from adjacent ion stages. Below 6 nm, n = 4 - n = 5 transitions also give rise to a series of interesting overlapping spectral features.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lane, Taylor; Parma, Edward J.

Delayed fission gamma-rays play an important role in determining the time-dependent ionizing dose for experiments in the central irradiation cavity of the Annular Core Research Reactor (ACRR). Delayed gamma-rays are produced both from fission product decay and from activation of materials in the core, such as cladding and support structures. Knowing both the delayed gamma-ray emission rate and the time-dependent gamma-ray energy spectrum is necessary in order to properly determine the dose contributions from delayed fission gamma-rays. This information is especially important when attempting to deconvolute the time-dependent neutron, prompt gamma-ray, and delayed gamma-ray contributions to the response of a diamond photo-conducting diode (PCD) or fission chamber in time frames of milliseconds to seconds following a reactor pulse. This work focused on investigating delayed gamma-ray characteristics produced from fission products from thermal, fast, and high-energy fission of Th-232, U-233, U-235, U-238, and Pu-239. This work uses a modified version of CINDER2008, a transmutation code developed at Los Alamos National Laboratory, to model time- and energy-dependent photon characteristics due to fission. This modified code adds the capability to track photon-induced transmutations, photo-fission, and the subsequent radiation caused by fission products due to photo-fission. The data is compared against previous work done with SNL-modified CINDER2008 [1], experimental data [2, 3], and other published literature, including ENDF/B-VII.1 [4]. The ability to produce a high-fidelity (7,428 group) energy-dependent photon fluence at various times post-fission can improve the delayed photon characterization for radiation effects tests at research reactors, as well as other applications.
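The bookkeeping behind such delayed-photon calculations is the evolution of decay chains. As a minimal illustration (CINDER2008 solves thousands of coupled chains with evaluated nuclear data; the chain length, decay constants, and gamma yields below are invented placeholders), a linear chain can be evolved analytically with the Bateman solution, and the delayed photon emission rate follows from the activities:

```python
import math

def bateman(n1_0, lams, t):
    """Atoms of each nuclide in a linear chain 1 -> 2 -> ... at time t,
    starting from n1_0 atoms of nuclide 1 (distinct decay constants assumed)."""
    out = []
    for i in range(len(lams)):
        coeff = n1_0
        for j in range(i):                 # product of feeding decay constants
            coeff *= lams[j]
        total = 0.0
        for j in range(i + 1):
            denom = 1.0
            for k in range(i + 1):
                if k != j:
                    denom *= lams[k] - lams[j]
            total += math.exp(-lams[j] * t) / denom
        out.append(coeff * total)
    return out

# Delayed photon rate = sum over nuclides of activity * photons per decay;
# decay constants (1/s) and gamma yields here are purely illustrative.
lams, yields = [0.7, 0.05], [1.2, 0.4]
n = bateman(1.0e10, lams, t=10.0)
photon_rate = sum(lam * ni * y for lam, ni, y in zip(lams, n, yields))
```

A real calculation would repeat this over an energy-group structure (7,428 groups in the work above) and sum over all fission-product chains.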

  10. Correlated prompt fission data in transport simulations

    NASA Astrophysics Data System (ADS)

    Talou, P.; Vogt, R.; Randrup, J.; Rising, M. E.; Pozzi, S. A.; Verbeke, J.; Andrews, M. T.; Clarke, S. D.; Jaffke, P.; Jandel, M.; Kawano, T.; Marcath, M. J.; Meierbachtol, K.; Nakae, L.; Rusev, G.; Sood, A.; Stetcu, I.; Walker, C.

    2018-01-01

    Detailed information on the fission process can be inferred from the observation, modeling and theoretical understanding of prompt fission neutron and γ-ray observables. Beyond simple average quantities, the study of distributions and correlations in prompt data, e.g., multiplicity-dependent neutron and γ-ray spectra, angular distributions of the emitted particles, n - n, n - γ, and γ - γ correlations, can place stringent constraints on fission models and parameters that would otherwise be free to be tuned separately to represent individual fission observables. The FREYA and CGMF codes have been developed to follow the sequential emissions of prompt neutrons and γ rays from the initial excited fission fragments produced right after scission. Both codes implement Monte Carlo techniques to sample initial fission fragment configurations in mass, charge and kinetic energy and sample probabilities of neutron and γ emission at each stage of the decay. This approach naturally leads to using simple but powerful statistical techniques to infer distributions and correlations among many observables and model parameters. The comparison of model calculations with experimental data provides a rich arena for testing various nuclear physics models such as those related to the nuclear structure and level densities of neutron-rich nuclei, the γ-ray strength functions of dipole and quadrupole transitions, the mechanism for dividing the excitation energy between the two nascent fragments near scission, and the mechanisms behind the production of angular momentum in the fragments, etc. Beyond the obvious interest from a fundamental physics point of view, such studies are also important for addressing data needs in various nuclear applications. 
The inclusion of the FREYA and CGMF codes into the MCNP6.2 and MCNPX - PoliMi transport codes, for instance, provides a new and powerful tool to simulate correlated fission events in neutron transport calculations important in nonproliferation, safeguards, nuclear energy, and defense programs. This review provides an overview of the topic, starting from theoretical considerations of the fission process, with a focus on correlated signatures. It then explores the status of experimental correlated fission data and current efforts to address some of the known shortcomings. Numerical simulations employing the FREYA and CGMF codes are compared to experimental data for a wide range of correlated fission quantities. The inclusion of those codes into the MCNP6.2 and MCNPX - PoliMi transport codes is described and discussed in the context of relevant applications. The accuracy of the model predictions and their sensitivity to model assumptions and input parameters are discussed. Finally, a series of important experimental and theoretical questions that remain unanswered are presented, suggesting a renewed effort to address these shortcomings.
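The Monte Carlo strategy the review describes (sample a fragment configuration, then emit neutrons sequentially and histogram event-by-event observables) can be sketched with a toy model. FREYA and CGMF use physical level densities and evaluated yields; every number below (mean excitation, separation energy, emission temperature) is an illustrative placeholder, not nuclear data:

```python
import random

def sample_multiplicity(rng, exc_mean=20.0, exc_sigma=5.0, sep=6.0, temp=1.0):
    """Toy fragment de-excitation: draw an excitation energy (MeV), then
    emit neutrons, each removing the separation energy plus a sampled
    kinetic energy, until emission is no longer energetically possible."""
    e = max(rng.gauss(exc_mean, exc_sigma), 0.0)
    mult = 0
    while e > sep:
        e -= sep + rng.expovariate(1.0 / temp)  # evaporation-like spectrum
        mult += 1
    return mult

rng = random.Random(12345)
events = [sample_multiplicity(rng) for _ in range(20000)]
nu_bar = sum(events) / len(events)              # mean prompt multiplicity
p_nu = [events.count(k) / len(events) for k in range(max(events) + 1)]
```

Because each event carries a full emission history, correlations (e.g., multiplicity-gated spectra) come out of the same loop for free, which is the practical appeal of the event-by-event approach.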

  11. 29 CFR 1910.6 - Incorporation by reference.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...-63 Safety Standard for Non-Medical X-Ray and Sealed Gamma Ray Sources, IBR approved for § 1910.252(d...). (8) ANSI A14.2-56 Safety Code for Portable Metal Ladders, Supplemented by ANSI A14.2a-77, IBR... Conveyors, Cableways, and Related Equipment, IBR approved for §§ 1910.218(j)(3); 1910.261 (a)(3)(x), (b)(1...

  12. 77 FR 65314 - Missouri: Final Authorization of State Hazardous Waste Management Program Revisions

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-26

    ..., Appendix I, Item O (see section H.1.e for discussion). RCRA Cluster XVII Cathode Ray Tubes Rule, 71 FR... provisions at: 40 CFR 261.39(a)(5)(exports of cathode ray tubes); 40 CFR 262.21 (Manifest Registry); 40 CFR... Hazardous Waste in Boilers and Industrial Furnaces (BIFs) that were introduced into the Federal code by a...

  13. 77 FR 12226 - Sadex Corp.; Filing of Food Additive Petition (Animal Use); Electron Beam and X-Ray Sources for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-29

    ... Sources for Irradiation of Poultry Feed and Poultry Feed Ingredients AGENCY: Food and Drug Administration... the safe use of electron beam and x-ray sources for irradiation of poultry feed and poultry feed... petition proposes to amend Title 21 of the Code of Federal Regulations (CFR) in part 579 Irradiation in the...

  14. Simultaneous modelling of X-ray emission and optical polarization of intermediate polars: the case of V405 Aur

    NASA Astrophysics Data System (ADS)

    J. Lima, I.; Vilega Rodrigues, C.; Medeiros Gomes Silva, K.; Luna, G.; D Amico, F.; Goulart Coelho, J.

    2017-10-01

Intermediate polars are compact binaries in which mass transfer occurs from a low-mass star onto a magnetic white dwarf. A shock structure is formed in the magnetic accretion column near the white-dwarf surface. High-energy emission is produced in the post-shock region, and the main physical processes involved are bremsstrahlung and line emission. Some systems show optical polarization, which may also originate in the post-shock region. Our main goal is to study the magnetic structure of intermediate polars by simultaneously modelling optical polarimetry and X-ray data using the CYCLOPS code. This code was developed by our group to perform multi-wavelength fitting of the accretion column flux. It considers cyclotron and free-free emission from a 3D post-shock region, which is non-homogeneous in terms of density, temperature, and magnetic field. In this study, we present our modelling of the optical polarization and X-ray emission of V405 Aurigae, the intermediate polar with the highest known magnetic field. Previous studies of this system did not succeed in proposing a geometry that explains both the optical and X-ray emissions.

  15. Design and implementation of a channel decoder with LDPC code

    NASA Astrophysics Data System (ADS)

    Hu, Diqing; Wang, Peng; Wang, Jianzong; Li, Tianquan

    2008-12-01

Because Toshiba quit the competition, there is now only one blue-laser disc standard, Blu-ray Disc (BD), which satisfies the demands of high-density video programs. However, almost all the relevant patents are held by large companies such as Sony and Philips, so products that use BD must pay substantial licensing fees. Next-Generation Versatile Disc (NVD), our own high-density optical disc storage system, proposes a new data format and error correction code with independent intellectual property rights and high cost performance; it offers higher coding efficiency than DVD and a 12 GB capacity that meets the demands of playing high-density video programs. In this paper, we develop a channel encoding process based on Low-Density Parity-Check (LDPC) codes and an application scheme using a Q-matrix-based LDPC encoder for NVD's channel decoder. Combined with the embedded-system portability of the SOPC platform, we implemented all of the decoding modules on an FPGA, and tests were performed in the NVD experimental environment. Although there are conflicts between LDPC and the Run-Length-Limited (RLL) modulation codes frequently used in optical storage systems, the system is shown to be a suitable solution. At the same time, it overcomes the instability and inextensibility of NVD's former decoding system, which was implemented purely in hardware.
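The NVD decoder itself is Q-matrix-based and FPGA-hosted, which the abstract does not detail. As a generic illustration of the hard-decision decoding principle behind LDPC codes (the tiny parity-check matrix below is a textbook (7,4) example, not the NVD code), Gallager-style bit flipping can be sketched as:

```python
import numpy as np

# Parity-check matrix of a small (7,4) code, standing in for the much
# larger sparse matrices used in an optical-disc channel decoder.
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]], dtype=int)

def bit_flip_decode(H, r, max_iter=20):
    """Hard-decision bit-flipping decoding: repeatedly flip the bit
    involved in the largest number of unsatisfied parity checks."""
    r = r.copy()
    for _ in range(max_iter):
        syndrome = H @ r % 2
        if not syndrome.any():
            return r                           # all checks satisfied
        # count failed checks touching each bit
        fails = (H * syndrome[:, None]).sum(axis=0)
        r[np.argmax(fails)] ^= 1               # flip the worst offender
    return r
```

Production decoders use soft-decision message passing for better performance, but the check-node/bit-node structure is the same.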

  16. Kinetic modeling of x-ray laser-driven solid Al plasmas via particle-in-cell simulation

    NASA Astrophysics Data System (ADS)

    Royle, R.; Sentoku, Y.; Mancini, R. C.; Paraschiv, I.; Johzaki, T.

    2017-06-01

Solid-density plasmas driven by intense x-ray free-electron laser (XFEL) radiation are seeded by sources of nonthermal photoelectrons and Auger electrons that ionize and heat the target via collisions. Simulation codes that are commonly used to model such plasmas, such as collisional-radiative (CR) codes, typically assume a Maxwellian distribution and thus instantaneous thermalization of the source electrons. In this study, we present a detailed description and initial applications of a collisional particle-in-cell code, picls, that has been extended with a self-consistent radiation transport model and Monte Carlo models for photoionization and KLL Auger ionization, enabling the fully kinetic simulation of XFEL-driven plasmas. The code is used to simulate two experiments previously performed at the Linac Coherent Light Source investigating XFEL-driven solid-density Al plasmas. It is shown that picls-simulated pulse transmissions using the Ecker-Kröll continuum-lowering model agree much better with measurements than do simulations using the Stewart-Pyatt model. Good quantitative agreement is also found between the time-dependent picls results and those of analogous simulations by the CR code scfly, which was used in the analysis of the experiments to accurately reproduce the observed Kα emissions and pulse transmissions. Finally, it is shown that the effects of the nonthermal electrons are negligible for the conditions of the particular experiments under investigation.

  17. Implementing displacement damage calculations for electrons and gamma rays in the Particle and Heavy-Ion Transport code System

    NASA Astrophysics Data System (ADS)

    Iwamoto, Yosuke

    2018-03-01

In this study, the Monte Carlo displacement damage calculation method in the Particle and Heavy-Ion Transport code System (PHITS) was improved to calculate displacements per atom (DPA) values due to irradiation by electrons (or positrons) and gamma rays. For the damage due to electrons and gamma rays, PHITS simulates electromagnetic cascades using the Electron Gamma Shower version 5 (EGS5) algorithm and calculates DPA values using the recoil energies and the McKinley-Feshbach cross section. A comparison of DPA values calculated by PHITS and the Monte Carlo assisted Classical Method (MCCM) reveals that they were in good agreement for gamma-ray irradiations of silicon and iron at energies that were less than 10 MeV. Above 10 MeV, PHITS can calculate DPA values not only for electrons but also for charged particles produced by photonuclear reactions. In DPA depth distributions under electron and gamma-ray irradiations, build-up effects can be observed near the target's surface. For irradiation of 90-cm-thick carbon by protons with energies of more than 30 GeV, the ratio of the secondary-electron DPA values to the total DPA values is more than 10% and increases with an increase in incident energy. In summary, PHITS can calculate DPA values for all particles and materials over a wide energy range between 1 keV and 1 TeV for electrons, gamma rays, and charged particles and between 10^-5 eV and 1 TeV for neutrons.
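The DPA values above come from folding sampled recoil energies with a displacement model. As a minimal sketch of the standard NRT displacement formula commonly used for this step (PHITS's actual treatment, including the McKinley-Feshbach cross section, is more involved; E_d = 40 eV is merely a common choice for iron, and values are material specific):

```python
def nrt_displacements(t_dam_ev, e_d_ev=40.0):
    """NRT model: number of displaced atoms produced by a recoil that
    deposits damage energy t_dam_ev (eV), given a threshold
    displacement energy e_d_ev (eV)."""
    if t_dam_ev < e_d_ev:
        return 0.0                              # below threshold: no defect
    if t_dam_ev < 2.0 * e_d_ev / 0.8:
        return 1.0                              # single Frenkel pair
    return 0.8 * t_dam_ev / (2.0 * e_d_ev)      # cascade regime
```

The DPA value is then this quantity accumulated over all recoils, normalized per target atom.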

  18. Simulation of a complete X-ray digital radiographic system for industrial applications.

    PubMed

    Nazemi, E; Rokrok, B; Movafeghi, A; Choopan Dastjerdi, M H

    2018-05-19

Simulating X-ray images is of great importance in industry and medicine, since simulation permits optimization of the parameters that affect image quality without the limitations of an experimental procedure. This study presents a novel methodology to simulate a complete industrial X-ray digital radiographic system, composed of an X-ray tube and a computed radiography (CR) image plate, using the Monte Carlo N Particle eXtended (MCNPX) code. An industrial X-ray tube with a maximum voltage of 300 kV and a current of 5 mA was simulated. A three-layer uniform plate consisting of a polymer overcoat layer, a phosphor layer and a polycarbonate backing layer was defined and simulated as the CR imaging plate. To model image formation in the image plate, the absorbed dose was first calculated in each pixel inside the phosphor layer of the CR imaging plate using the mesh tally in the MCNPX code and then converted to a gray value using a mathematical relationship determined in a separate procedure. To validate the simulation results, an experimental setup was designed and the images of two step wedges made of aluminum and steel were captured experimentally and compared with the simulations. The results show that the simulated images are in good agreement with the experimental ones, demonstrating the ability of the proposed methodology to simulate an industrial X-ray imaging system. Copyright © 2018 Elsevier Ltd. All rights reserved.
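The image-formation step described above maps a per-pixel absorbed dose (from the MCNPX mesh tally) to a digital gray value. The paper determines its dose-to-gray relationship experimentally and does not publish it; the linear coefficients below are placeholders used only to show the shape of such a conversion:

```python
import numpy as np

def dose_to_gray(dose, a=1000.0, b=50.0, bits=16):
    """Convert a 2D absorbed-dose map to digital gray values using a
    hypothetical linear calibration GV = a * dose + b (a and b are
    assumed, not from the paper), clipped to the detector bit depth."""
    gv = a * np.asarray(dose, dtype=float) + b
    return np.clip(np.rint(gv), 0, 2 ** bits - 1).astype(np.uint16)
```

In practice the calibration curve for CR plates is measured by exposing the plate to known doses and fitting the readout response, then applied pixel-by-pixel exactly as above.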

  19. What can we learn from in-soil imaging of a live plant: X-ray Computed Tomography and 3D numerical simulation of root-soil system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Xiaofan; Varga, Tamas; Liu, Chongxuan

Plant roots play a critical role in plant-soil-microbe interactions that occur in the rhizosphere. X-ray Computed Tomography (XCT) has been proven to be an effective tool for non-invasive root imaging and analysis. A combination of XCT, open-source software, and in-house developed code was used to non-invasively image a prairie dropseed (Sporobolus heterolepis) specimen, segment the root data to obtain a 3D image of the root structure, and extract quantitative information from the 3D data, respectively. Based on the explicitly-resolved root structure, pore-scale computational fluid dynamics (CFD) simulations were applied to numerically investigate the root-soil-groundwater system. The plant root conductivity, soil hydraulic conductivity and transpiration rate were shown to control the groundwater distribution. Furthermore, the coupled imaging-modeling approach demonstrates a realistic platform to investigate rhizosphere flow processes and would be feasible to provide useful information linked to upscaled models.

  20. What can we learn from in-soil imaging of a live plant: X-ray Computed Tomography and 3D numerical simulation of root-soil system

    DOE PAGES

    Yang, Xiaofan; Varga, Tamas; Liu, Chongxuan; ...

    2017-05-04

Plant roots play a critical role in plant-soil-microbe interactions that occur in the rhizosphere. X-ray Computed Tomography (XCT) has been proven to be an effective tool for non-invasive root imaging and analysis. A combination of XCT, open-source software, and in-house developed code was used to non-invasively image a prairie dropseed (Sporobolus heterolepis) specimen, segment the root data to obtain a 3D image of the root structure, and extract quantitative information from the 3D data, respectively. Based on the explicitly-resolved root structure, pore-scale computational fluid dynamics (CFD) simulations were applied to numerically investigate the root-soil-groundwater system. The plant root conductivity, soil hydraulic conductivity and transpiration rate were shown to control the groundwater distribution. Furthermore, the coupled imaging-modeling approach demonstrates a realistic platform to investigate rhizosphere flow processes and would be feasible to provide useful information linked to upscaled models.

  1. Direct Measurement of Lateral Correlations under Controlled Nanoconfinement

    NASA Astrophysics Data System (ADS)

    Kékicheff, P.; Iss, J.; Fontaine, P.; Johner, A.

    2018-03-01

Lateral correlations along hydrophobic surfaces whose separation can be varied continuously are measured by x-ray scattering using a modified surface force apparatus coupled with synchrotron radiation, named SFAX. A weak isotropic diffuse scattering along the equatorial plane is revealed for mica surfaces rendered hydrophobic and charge neutral by immersion in cationic surfactant solutions at low concentrations. The peak corresponds to a lateral surface correlation length ξ ≈ 12 nm, without long-range order. These findings are compatible with the atomic force microscopy imaging of a single surface, where adsorbed surfactant stripes appear surrounded by bare mica zones. Remarkably, the scattering patterns remain stable for gap widths D larger than the lateral period but change in intensity and shape (to a lesser extent) as soon as D < ξ. This evolution codes for a redistribution of counterions (counterion release from antagonistic patches) and the associated new x-ray labeling of the patterns. The redistribution of counterions is also the key mechanism to the long-range electrostatic attraction between similar, overall charge-neutral walls, reported earlier.

  2. X-ray Device Makes Scrubbing Rugs Clean a Spotless Effort

    NASA Technical Reports Server (NTRS)

    2006-01-01

    If "pulling the rug out from under" means suddenly withdrawing support and assistance, then NASA is pretty good at "putting the rug under" when it comes to offering technical support and assistance to private industry. In the case of a new X-ray fluorescence (XRF) sensor featuring enhancements compliments of NASA, the Space Agency not only provided the rug, but helped give private industry a means to ensure it keeps clean. This sensor, utilized by NASA to read chemical bar codes concealed by paint and other coatings, perform on-the-spot chemical analyses in field conditions, and detect difficult-to-identify contaminants, has found another use as a tool that can measure how much soil is removed from household and commercial carpets. The original technology was developed in 2002 to conduct quality control for critical aluminum alloy parts destined for the space shuttle. Evaluation of these parts is critical for the Space Agency, as any signs of contamination, corrosion, or material deviation could compromise a shuttle mission.

  3. Modeling study of a proposed field calibration source using K-40 and high-Z targets for sodium iodide detectors

    DOE PAGES

    Rogers, Jeremy; Marianno, Craig; Kallenbach, Gene; ...

    2016-06-01

Calibration sources based on the primordial isotope potassium-40 (40K) have reduced controls on the source's activity due to its terrestrial ubiquity and very low specific activity. Potassium-40's beta emissions and 1,460.8 keV gamma ray can be used to induce K-shell fluorescence x rays in high-Z metals between 60 and 80 keV. A gamma-ray calibration source that uses potassium chloride salt and a high-Z metal to create a two-point calibration for a sodium iodide field gamma spectroscopy instrument is thus proposed. The calibration source was designed in collaboration with the Sandia National Laboratory using the Monte Carlo N-Particle eXtended (MCNPX) transport code. Two methods of x-ray production were explored. First, a thin high-Z layer (HZL) was interposed between the detector and the potassium chloride-urethane source matrix. Second, bismuth metal powder was homogeneously mixed with a urethane binding agent to form a potassium chloride-bismuth matrix (KBM). The bismuth-based source was selected as the development model because it is inexpensive, nontoxic, and outperforms the high-Z layer method in simulation. As a result, based on the MCNPX studies, sealing a mixture of bismuth powder and potassium chloride into a thin plastic case could provide a light, inexpensive field calibration source.
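The two-point calibration such a source enables reduces to a linear channel-to-energy fit between two reference photopeaks. The channel numbers below are hypothetical, and the low-energy anchor (roughly 77 keV for the bismuth K x-ray region) is an assumed round value, not taken from the paper:

```python
def two_point_calibration(ch1, e1_kev, ch2, e2_kev):
    """Linear energy calibration E = gain * channel + offset fitted
    through two reference photopeaks."""
    gain = (e2_kev - e1_kev) / (ch2 - ch1)
    offset = e1_kev - gain * ch1
    return gain, offset

# hypothetical peak centroids: Bi K x rays (~77 keV) and the 40K line
gain, offset = two_point_calibration(52, 77.0, 975, 1460.8)
energy_of = lambda ch: gain * ch + offset
```

With the two peaks well separated in energy, this single linear fit spans most of the range a field NaI spectrometer needs.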

  4. Mobile, hybrid Compton/coded aperture imaging for detection, identification and localization of gamma-ray sources at stand-off distances

    NASA Astrophysics Data System (ADS)

    Tornga, Shawn R.

The Stand-off Radiation Detection System (SORDS) program is an Advanced Technology Demonstration (ATD) project through the Department of Homeland Security's Domestic Nuclear Detection Office (DNDO) with the goal of detection, identification and localization of weak radiological sources in the presence of large dynamic backgrounds. The Raytheon-SORDS Tri-Modal Imager (TMI) is a mobile, truck-based, hybrid gamma-ray imaging system able to quickly detect, identify and localize radiation sources at standoff distances through improved sensitivity while minimizing the false alarm rate. Reconstruction of gamma-ray sources is performed using a combination of two imaging modalities: coded aperture and Compton scatter imaging. The TMI consists of 35 sodium iodide (NaI) crystals, 5x5x2 in³ each, arranged in a random coded aperture mask array (CA), followed by 30 position-sensitive NaI bars, each 24x2.5x3 in³, called the detection array (DA). The CA array acts as both a coded aperture mask and a scattering detector for Compton events. The large-area DA array acts as a collection detector for both Compton-scattered events and coded aperture events. In this thesis, the coded aperture, Compton and hybrid imaging algorithms that were developed will be described along with their performance. It will be shown that multiple imaging modalities can be fused to improve detection sensitivity over a broader energy range than either alone. Since the TMI is a moving system, peripheral data, such as from a Global Positioning System (GPS) and an Inertial Navigation System (INS), must also be incorporated. A method of adapting static imaging algorithms to a moving platform has been developed. Algorithms were also developed in parallel with detector hardware, through the use of extensive simulations performed with the Geometry and Tracking Toolkit v4 (GEANT4). Simulations have been well validated against measured data.
Results of image reconstruction algorithms at various speeds and distances will be presented as well as localization capability. Utilizing imaging information will show signal-to-noise gains over spectroscopic algorithms alone.
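The coded-aperture half of the hybrid reconstruction rests on correlation decoding: the detector records a shifted copy of the mask pattern, and cross-correlating the data with a balanced decoding array recovers the source position. A 1D toy sketch (the random pattern, source position, and ideal noiseless shadow are invented; the TMI uses a 2D mask and far more elaborate algorithms):

```python
import numpy as np

rng = np.random.default_rng(0)
mask = rng.integers(0, 2, 64)              # random open(1)/closed(0) pattern
src = 17                                   # true source shift (invented)
shadow = np.roll(mask, src).astype(float)  # ideal detector counts

# Balanced decoding array: open -> +1, closed -> -(open/closed ratio),
# so a flat background correlates to zero.
n_open = int(mask.sum())
dec = np.where(mask == 1, 1.0, -n_open / (mask.size - n_open))

# Circular cross-correlation of the detector data with the decoding
# array; the reconstruction peaks at the source position.
recon = np.array([np.dot(shadow, np.roll(dec, s)) for s in range(mask.size)])
```

The same correlation, carried out in 2D against the known tile pattern, underlies the coded-aperture imaging mode described above.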

  5. Comparative modelling of lower hybrid current drive with two launcher designs in the Tore Supra tokamak

    NASA Astrophysics Data System (ADS)

    Nilsson, E.; Decker, J.; Peysson, Y.; Artaud, J.-F.; Ekedahl, A.; Hillairet, J.; Aniel, T.; Basiuk, V.; Goniche, M.; Imbeaux, F.; Mazon, D.; Sharma, P.

    2013-08-01

Fully non-inductive operation with lower hybrid current drive (LHCD) in the Tore Supra tokamak is achieved using either a fully active multijunction (FAM) launcher or a more recent ITER-relevant passive active multijunction (PAM) launcher, or both launchers simultaneously. While both antennas show comparable experimental efficiencies, the analysis of stability properties in long discharges suggests different current profiles. We present comparative modelling of LHCD with the two different launchers to characterize the effect of the respective antenna spectra on the driven current profile. The interpretative modelling of LHCD is carried out using a chain of codes calculating, respectively, the global discharge evolution (tokamak simulator METIS), the spectrum at the antenna mouth (LH coupling code ALOHA), the LH wave propagation (ray-tracing code C3PO), and the distribution function (3D Fokker-Planck code LUKE). Essential aspects of the fast electron dynamics in time, space and energy are obtained from hard x-ray measurements of fast electron bremsstrahlung emission using a dedicated tomographic system. LHCD simulations are validated by systematic comparisons between these experimental measurements and the reconstructed signal calculated by the code R5X2 from the LUKE electron distribution. Excellent agreement is obtained in the presence of strong Landau damping (found under low density and high-power conditions in Tore Supra), for which the ray-tracing model is valid for modelling the LH wave propagation. Two aspects of the antenna spectra are found to have a significant effect on LHCD. First, the driven current is found to be proportional to the directivity, which depends upon the respective weight of the main positive and main negative lobes and is particularly sensitive to the density in front of the antenna. Second, the position of the main negative lobe in the spectrum is different for the two launchers.
As this lobe drives a counter-current, the resulting driven current profile is also different for the FAM and PAM launchers.

  6. Determination of Differential Emission Measure from Solar Extreme Ultraviolet Images

    NASA Astrophysics Data System (ADS)

    Su, Yang; Veronig, Astrid M.; Hannah, Iain G.; Cheung, Mark C. M.; Dennis, Brian R.; Holman, Gordon D.; Gan, Weiqun; Li, Youping

    2018-03-01

The Atmospheric Imaging Assembly (AIA) on board the Solar Dynamics Observatory (SDO) has been providing high-cadence, high-resolution, full-disk UV-visible/extreme ultraviolet (EUV) images since 2010, with the best time coverage among all the solar missions. A number of codes have been developed to extract plasma differential emission measures (DEMs) from AIA images. Although widely used, they cannot effectively constrain the DEM at flaring temperatures with AIA data alone. This often results in much higher X-ray fluxes than observed. One way to solve the problem is by adding more constraints from other data sets (such as soft X-ray images and fluxes). However, the spatial information of the plasma DEMs is lost in many cases. In this Letter, we present a different approach to constrain the DEMs. We tested the sparse inversion code and show that the default settings reproduce X-ray fluxes that could be too high. Based on tests with both simulated and observed AIA data, we provide recommended settings of basis functions and tolerances. The new DEM solutions derived from AIA images alone are much more consistent with (thermal) X-ray observations, and provide valuable information by mapping the thermal plasma from ∼0.3 to ∼30 MK. Such improvement is a key step in understanding the nature of individual X-ray sources, and is particularly important for studies of flare initiation.
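The DEM problem the Letter addresses is a linear inversion: each channel flux is the instrument's temperature response folded with the DEM. As a toy, hedged stand-in for the sparse inversion actually used (the channel responses and the "true" DEM below are synthetic Gaussians, not AIA responses), a positivity-constrained Landweber iteration recovers a non-negative DEM consistent with the fluxes:

```python
import numpy as np

# Synthetic channel responses K(i, T) and a "true" DEM (illustrative only)
logT = np.linspace(5.5, 7.5, 40)
centers = np.array([5.8, 6.2, 6.6, 7.0, 7.3])
K = np.exp(-((logT[None, :] - centers[:, None]) / 0.15) ** 2)
dem_true = np.exp(-((logT - 6.4) / 0.2) ** 2)
F = K @ dem_true                          # observed channel fluxes

# Positivity-constrained Landweber iteration: clip dem >= 0 every step
dem = np.zeros_like(dem_true)
step = 1.0 / np.linalg.norm(K, 2) ** 2    # step below the stability limit
for _ in range(5000):
    dem = np.clip(dem + step * K.T @ (F - K @ dem), 0.0, None)
```

The real inversion adds sparsity-promoting basis functions and tolerances (the settings the Letter tunes), but the same few-fluxes-to-many-temperature-bins structure is what makes extra X-ray constraints so valuable.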

  7. Use of simulation to optimize the pinhole diameter and mask thickness for an x-ray backscatter imaging system

    NASA Astrophysics Data System (ADS)

    Vella, A.; Munoz, Andre; Healy, Matthew J. F.; Lane, David; Lockley, D.

    2017-08-01

The PENELOPE Monte Carlo simulation code was used to determine the optimum thickness and aperture diameter of a pinhole mask for X-ray backscatter imaging in a security application. The mask material needs to be thick enough to absorb most X-rays, and the pinhole must be wide enough for a sufficient field of view whilst narrow enough for sufficient image spatial resolution. The model consisted of a fixed-geometry test object, various masks with and without pinholes, and a 1040 x 1340 pixel area detector inside a lead-lined camera housing. The photon energy distribution incident upon the masks was flat up to selected energy limits; this artificial source was used to avoid the optimisation being specific to any particular X-ray source technology. The pixelated detector was modelled by digitising the surface area represented by the PENELOPE phase space file and integrating the energies of the photons impacting within each pixel; a MATLAB code was written for this. The image contrast, signal-to-background ratio, spatial resolution, and collimation effect were calculated at the simulated detector as a function of pinhole diameter and various thicknesses of mask made of tungsten, tungsten/epoxy composite or bismuth alloy. A process of elimination was applied to identify suitable masks for a viable X-ray backscattering security application.
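The detector-digitisation step described above (binning phase-space photons into pixels and integrating their energies) reduces to a weighted 2D histogram. A sketch in Python rather than the paper's MATLAB, with an assumed physical extent (the pixel grid matches the stated 1040 x 1340 detector, but the 67 x 52 mm area is an invented placeholder):

```python
import numpy as np

def pixelate(x_mm, y_mm, e_kev, nx=1340, ny=1040,
             extent=(0.0, 67.0, 0.0, 52.0)):
    """Integrate photon energies into detector pixels: a weighted 2D
    histogram of impact positions over the detector area (the extent
    here is assumed, not from the paper)."""
    x0, x1, y0, y1 = extent
    img, _, _ = np.histogram2d(x_mm, y_mm, bins=[nx, ny],
                               range=[[x0, x1], [y0, y1]], weights=e_kev)
    return img
```

Image metrics such as contrast and signal-to-background ratio can then be computed directly on the returned array.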

  8. TransFit: Finite element analysis data fitting software

    NASA Technical Reports Server (NTRS)

    Freeman, Mark

    1993-01-01

The Advanced X-Ray Astrophysics Facility (AXAF) mission support team has made extensive use of geometric ray tracing to analyze the performance of AXAF developmental and flight optics. One important aspect of this performance modeling is the incorporation of finite element analysis (FEA) data into the surface deformations of the optical elements. TransFit is software designed for the fitting of FEA data of Wolter I optical surface distortions with a continuous surface description which can then be used by SAO's analytic ray tracing software, currently OSAC (Optical Surface Analysis Code). The improved capabilities of TransFit over previous methods include bicubic spline fitting of FEA data to accommodate higher spatial frequency distortions, fitted-data visualization for assessing the quality of fit, the ability to accommodate input data from three FEA codes plus other standard formats, and options for alignment of the model coordinate system with the ray trace coordinate system. TransFit uses the AnswerGarden graphical user interface (GUI) to edit input parameters and then accesses routines written in PV-WAVE, C, and FORTRAN to allow the user to interactively create, evaluate, and modify the fit. The topics covered include an introduction to TransFit: requirements, design philosophy, and implementation; design specifics: modules, parameters, fitting algorithms, and data displays; a procedural example; verification of performance; future work; and appendices on online help and ray trace results of the verification section.
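The core fitting task here is turning discrete FEA nodal deformations into a continuous surface a ray tracer can evaluate anywhere. TransFit itself uses bicubic splines; as a simplified, hedged stand-in, the same idea can be shown with a least-squares polynomial surface over synthetic "FEA" data (all data and basis choices below are invented for illustration):

```python
import numpy as np

# Scattered "FEA" nodal displacements z(x, y) on a synthetic surface
rng = np.random.default_rng(1)
x, y = rng.uniform(-1, 1, 200), rng.uniform(-1, 1, 200)
z = 0.3 - 0.1 * x + 0.05 * x * y + 0.2 * y ** 2

# Least-squares fit of a continuous quadratic surface description
A = np.column_stack([np.ones_like(x), x, y, x * y, x ** 2, y ** 2])
coef, *_ = np.linalg.lstsq(A, z, rcond=None)

fit = A @ coef          # surface evaluated back at the nodes
```

A spline basis replaces the global polynomial when higher spatial-frequency distortions must be captured, which is exactly the capability TransFit adds.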

  9. Simulations of Laboratory Astrophysics Experiments using the CRASH code

    NASA Astrophysics Data System (ADS)

    Trantham, Matthew; Kuranz, Carolyn; Fein, Jeff; Wan, Willow; Young, Rachel; Keiter, Paul; Drake, R. Paul

    2015-11-01

    Computer simulations can assist in the design and analysis of laboratory astrophysics experiments. The Center for Radiative Shock Hydrodynamics (CRASH) at the University of Michigan developed a code that has been used to design and analyze high-energy-density experiments on OMEGA, NIF, and other large laser facilities. This Eulerian code uses block-adaptive mesh refinement (AMR) with implicit multigroup radiation transport, electron heat conduction and laser ray tracing. This poster will demonstrate some of the experiments the CRASH code has helped design or analyze including: Kelvin-Helmholtz, Rayleigh-Taylor, magnetized flows, jets, and laser-produced plasmas. This work is funded by the following grants: DEFC52-08NA28616, DE-NA0001840, and DE-NA0002032.

  10. Large Coded Aperture Mask for Spaceflight Hard X-ray Images

    NASA Technical Reports Server (NTRS)

    Vigneau, Danielle N.; Robinson, David W.

    2002-01-01

    The 2.6 square meter coded aperture mask is a vital part of the Burst Alert Telescope on the Swift mission. A random, but known pattern of more than 50,000 lead tiles, each 5 mm square, was bonded to a large honeycomb panel which projects a shadow on the detector array during a gamma ray burst. A two-year development process was necessary to explore ideas, apply techniques, and finalize procedures to meet the strict requirements for the coded aperture mask. Challenges included finding a honeycomb substrate with minimal gamma ray attenuation, selecting an adhesive with adequate bond strength to hold the tiles in place but soft enough to allow the tiles to expand and contract without distorting the panel under large temperature gradients, and eliminating excess adhesive from all untiled areas. The largest challenge was to find an efficient way to bond the > 50,000 lead tiles to the panel with positional tolerances measured in microns. In order to generate the desired bondline, adhesive was applied to each tile and allowed to cure. The pre-cured tiles were located in a tool to maintain positional accuracy, wet adhesive was applied to the panel, and the panel was lowered to the tile surface with synchronized actuators. Using this procedure, the entire tile pattern was transferred to the large honeycomb panel in a single bond. The pressure for the bond was achieved by enclosing the entire system in a vacuum bag. Thermal vacuum and acoustic tests validated this approach. This paper discusses the methods, materials, and techniques used to fabricate this very large and unique coded aperture mask for the Swift mission.

  11. Line x-ray source for diffraction enhanced imaging in clinical and industrial applications

    NASA Astrophysics Data System (ADS)

    Wang, Xiaoqin

    Mammography is an imaging modality that uses low-dose x-rays or other radiation sources for examination of the breast. It plays a central role in early detection of breast cancers. The similarity between tumor cells and healthy cells, breast implant surgery, and other factors make breast cancers hard to visualize and detect. Diffraction enhanced imaging (DEI), first proposed and investigated by D. Chapman, is a new x-ray radiographic imaging modality using monochromatic x-rays from a synchrotron source, which produces images of thick absorbing objects that are almost completely free of scatter. It shows dramatically improved contrast over standard imaging when applied to the same phantom. The contrast is based not only on attenuation but also on the refraction and diffraction properties of the sample. This imaging method may improve image quality in mammography, other medical applications, industrial radiography for non-destructive testing, and x-ray computed tomography. However, the size and cost of a synchrotron source limit the application of the new modality at clinical levels. This research investigates the feasibility of a designed line x-ray source producing intensity comparable to synchrotron sources. It is composed of a 2-cm-long tungsten filament, installed on a carbon steel filament cup (backing plate), as the cathode, and a stationary oxygen-free copper anode with a molybdenum coating on the front surface as the target. Characteristic properties of the line x-ray source were computationally studied and the prototype was experimentally investigated. The SIMION code was used to computationally study the electron trajectories emanating from the filament towards the molybdenum target. A Faraday cup on the proof-of-principle prototype device was used to measure the distribution of electrons on the target, which compares favorably to computational results. The intensities of characteristic x-rays for molybdenum, tungsten and rhodium targets were investigated with different window materials for -30 kV to -100 kV applied potential. Heat loading and thermal management of the target were investigated computationally using the COMSOL code package, and experimental measurements of the target temperature rise were taken via thermocouples attached to the target. Temperature measurements in the low-voltage, low-current regime without active cooling were compared to computational results for code-experiment benchmarking. Two different phantoms were used in the simulation of DEI images, which showed that the designed x-ray source with the DEI setup could produce images with significantly improved contrast. The computational results, along with experimental measurements on the prototype setup, indicate the possibility of scaling up to a larger-area x-ray source adequate for DEI applications.

  12. Modeling experimental plasma diagnostics in the FLASH code: Thomson scattering

    NASA Astrophysics Data System (ADS)

    Weide, Klaus; Flocke, Norbert; Feister, Scott; Tzeferacos, Petros; Lamb, Donald

    2017-10-01

    Spectral analysis of the Thomson scattering of laser light sent into a plasma provides an experimental method to quantify plasma properties in laser-driven plasma experiments. We have implemented such a synthetic Thomson scattering diagnostic unit in the FLASH code, to emulate the probe-laser propagation, scattering and spectral detection. User-defined laser rays propagate into the FLASH simulation region and experience scattering (change in direction and frequency) based on plasma parameters. After scattering, the rays propagate out of the interaction region and are spectrally characterized. The diagnostic unit can be used either during a physics simulation or in post-processing of simulation results. FLASH is publicly available at flash.uchicago.edu. U.S. DOE NNSA, U.S. DOE NNSA ASC, U.S. DOE Office of Science and NSF.
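
    The frequency change a scattered ray picks up can be illustrated with the non-relativistic Doppler relation; this is a sketch of the underlying physics with invented parameter values, not the FLASH implementation:

```python
import math

C = 2.998e8  # speed of light (m/s)

def doppler_shift(lambda_i, theta, v_flow):
    """Doppler-shifted wavelength of Thomson-scattered probe light.

    lambda_i -- incident wavelength (m)
    theta    -- scattering angle (rad)
    v_flow   -- plasma flow speed along the scattering vector k = k_s - k_i (m/s)
    Non-relativistic sketch: delta_omega = |k| v with |k| = 2 k_i sin(theta/2).
    """
    k_i = 2 * math.pi / lambda_i
    k = 2 * k_i * math.sin(theta / 2)
    omega_s = 2 * math.pi * C / lambda_i + k * v_flow
    return 2 * math.pi * C / omega_s
```

    For a 526.5 nm probe at 90 degrees scattering off a 100 km/s flow, the shift is a fraction of a nanometer, which is why the spectral characterization of the outgoing rays is the sensitive part of the diagnostic.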

  13. Gravitational microlensing of gamma-ray bursts

    NASA Technical Reports Server (NTRS)

    Mao, Shude

    1993-01-01

    A Monte Carlo code is developed to calculate gravitational microlensing in three dimensions when the lensing optical depth is low or moderate (not greater than 0.25). The code calculates positions of microimages and time delays between the microimages. The majority of lensed gamma-ray bursts should show a simple double-burst structure, as predicted by a single point mass lens model. A small fraction should show complicated multiple events due to the collective effects of several point masses (black holes). Cosmological models with a significant fraction of mass density in massive compact objects can be tested by searching for microlensing events in the current BATSE data. Our catalog generated by 10,000 Monte Carlo models is accessible through the computer network. The catalog can be used to take realistic selection effects into account.
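
    The double-burst structure predicted by a single point-mass lens follows from the standard point-lens time-delay formula. A sketch (not the paper's code), using the usual expression in terms of the source offset y in Einstein radii:

```python
import math

G = 6.674e-11       # gravitational constant (SI)
C = 2.998e8         # speed of light (m/s)
M_SUN = 1.989e30    # solar mass (kg)

def point_lens_delay(mass_solar, y, z_lens=0.0):
    """Time delay (s) between the two images of a point-mass lens.

    Standard result:
    dt = (1+z_l)(4GM/c^3)[ y*sqrt(y^2+4)/2 + ln((sqrt(y^2+4)+y)/(sqrt(y^2+4)-y)) ]
    """
    s = math.sqrt(y * y + 4.0)
    tau = 0.5 * y * s + math.log((s + y) / (s - y))
    return (1.0 + z_lens) * (4.0 * G * mass_solar * M_SUN / C**3) * tau

delay_us = 1e6 * point_lens_delay(1.0, 1.0)  # ~tens of microseconds for a solar-mass lens
```

    The delay scales linearly with lens mass and with (1 + z_lens), so stellar-mass black holes produce microsecond echoes while ~10^6 solar-mass objects produce delays of tens of seconds, which is the regime testable with burst light curves.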

  14. An MCNP-based model of a medical linear accelerator x-ray photon beam.

    PubMed

    Ajaj, F A; Ghassal, N M

    2003-09-01

    The major components in the x-ray photon beam path of the treatment head of the VARIAN Clinac 2300 EX medical linear accelerator were modeled and simulated using the Monte Carlo N-Particle radiation transport computer code (MCNP). Simulated components include the x-ray target, primary conical collimator, x-ray beam flattening filter and secondary collimators. X-ray photon energy spectra and angular distributions were calculated using the model. The x-ray beam emerging from the secondary collimators was scored by considering the total x-ray spectrum from the target as the source of x-rays at the target position. The depth dose distribution and dose profiles at different depths and field sizes have been calculated at a nominal operating potential of 6 MV and found to be within acceptable limits. It is concluded that accurate specification of the component dimensions, composition and nominal accelerating potential gives a good assessment of the x-ray energy spectra.

  15. Swift Burst Alert Telescope (BAT) Instrument Response

    NASA Technical Reports Server (NTRS)

    Parsons, A.; Hullinger, D.; Markwardt, C.; Barthelmy, S.; Cummings, J.; Gehrels, N.; Krimm, H.; Tueller, J.; Fenimore, E.; Palmer, D.

    2004-01-01

    The Burst Alert Telescope (BAT), a large coded aperture instrument with a wide field-of-view (FOV), provides the gamma-ray burst triggers and locations for the Swift Gamma-Ray Burst Explorer. In addition to providing this imaging information, BAT will perform a 15 keV - 150 keV all-sky hard x-ray survey based on the serendipitous pointings resulting from the study of gamma-ray bursts and will also monitor the sky for transient hard x-ray sources. For BAT to provide spectral and photometric information for the gamma-ray bursts, the transient sources and the all-sky survey, the BAT instrument response must be determined to an increasingly greater accuracy. In this talk, we describe the BAT instrument response as determined to an accuracy suitable for gamma-ray burst studies. We will also discuss the public data analysis tools developed to calculate the BAT response to sources at different energies and locations in the FOV. The level of accuracy required for the BAT instrument response used for the hard x-ray survey is significantly higher because this response must be used in the iterative clean algorithm for finding fainter sources. Because the bright sources add a lot of coding noise to the BAT sky image, fainter sources can be seen only after the counts due to the bright sources are removed. The better we know the BAT response, the lower the noise in the cleaned spectrum and thus the more sensitive the survey. Since the BAT detector plane consists of 32768 individual, 4 mm square CZT gamma-ray detectors, the most accurate BAT response would include 32768 individual detector response functions to separate mask modulation effects from differences in detector efficiencies! We describe our continuing work to improve the accuracy of the BAT instrument response and will present the current results of Monte Carlo simulations as well as BAT ground calibration data.
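
    The correlation step on which coded-mask sky reconstruction (and hence the iterative clean algorithm) is built can be shown with a 1-D cyclic toy model. Everything here (mask length, source position, the quadratic-residue pattern) is illustrative, not the BAT analysis tools:

```python
import numpy as np

p = 11  # mask length (prime); open elements at quadratic residues mod p
qr = {(i * i) % p for i in range(1, p)}
mask = np.array([1 if i in qr else 0 for i in range(p)])

sky = np.zeros(p)
sky[7] = 100.0  # a single bright source

# Detector counts: each detector position sees the sky modulated by a
# shifted copy of the mask (cyclic shadowing model).
det = np.array([np.sum(sky * np.roll(mask, -k)) for k in range(p)])

# Balanced cross-correlation reconstruction with decoding array g = 2m - 1:
g = 2 * mask - 1
recon = np.array([np.sum(det * np.roll(g, -j)) for j in range(p)])
```

    The reconstruction peaks at the source position; the off-peak values are the coding noise the abstract refers to, and cleaning amounts to modeling the bright source's contribution to `det`, subtracting it, and reconstructing again to reveal fainter sources.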

  16. PRay - A graphical user interface for interactive visualization and modification of rayinvr models

    NASA Astrophysics Data System (ADS)

    Fromm, T.

    2016-01-01

    PRay is a graphical user interface for the interactive display and editing of velocity models for seismic refraction. It is optimized for editing rayinvr models but can also be used as a dynamic viewer for ray tracing results from other software. The main features are graphical editing of nodes and fast adjustment of the display (stations and phases). It can be extended by user-defined shell scripts and links to phase-picking software. PRay is open source software written in the scripting language Perl, runs on Unix-like operating systems including Mac OS X, and provides a version-controlled source code repository for community development (https://sourceforge.net/projects/pray-plot-rayinvr/).

  17. Inline CBET Model Including SRS Backscatter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bailey, David S.

    2015-06-26

    Cross-beam energy transfer (CBET) has been used as a tool on the National Ignition Facility (NIF) since the first energetics experiments in 2009 to control the energy deposition in ignition hohlraums and tune the implosion symmetry. As large amounts of power are transferred between laser beams at the entrance holes of NIF hohlraums, the presence of many overlapping beat waves can lead to stochastic ion heating in the regions where laser beams overlap [P. Michel et al., Phys. Rev. Lett. 109, 195004 (2012)]. Using the CBET gains derived in this paper, we show how to implement these equations in a ray-based laser source for a rad-hydro code.

  18. Implementation of a tree algorithm in MCNP code for nuclear well logging applications.

    PubMed

    Li, Fusheng; Han, Xiaogang

    2012-07-01

    The goal of this paper is to develop some modeling capabilities that are missing in the current MCNP code. Those missing capabilities can greatly help with certain nuclear tool designs, such as a nuclear lithology/mineralogy spectroscopy tool. The new capabilities to be developed in this paper include the following: zone tally, neutron interaction tally, gamma-ray index tally and enhanced pulse-height tally. The patched MCNP code can also be used to compute neutron slowing-down length and thermal neutron diffusion length. Copyright © 2011 Elsevier Ltd. All rights reserved.

  19. How to Build a Time Machine: Interfacing Hydrodynamics, Ionization Calculations and X-ray Spectral Codes for Supernova Remnants

    NASA Astrophysics Data System (ADS)

    Badenes, Carlos

    2006-02-01

    Thanks to Chandra and XMM-Newton, spatially resolved spectroscopy of SNRs in the X-ray band has become a reality. Several impressive data sets for ejecta-dominated SNRs can now be found in the archives, the Cas A VLP just being one (albeit probably the most spectacular) example. However, it is often hard to establish quantitative, unambiguous connections between the X-ray observations of SNRs and the dramatic events involved in a core collapse or thermonuclear SN explosion. The reason for this is that the very high quality of the data sets generated by Chandra and XMM for the likes of Cas A, SNR 292.0+1.8, Tycho, and SN 1006 has surpassed our ability to analyze them. The core of the problem is in the transient nature of the plasmas in SNRs, which results in an intimate relationship between the structure of the ejecta and AM, the SNR dynamics arising from their interaction, and the ensuing X-ray emission. Thus, the ONLY way to understand the X-ray observations of ejecta-dominated SNRs at all levels, from the spatially integrated spectra to the subarcsecond scales that can be resolved by Chandra, is to couple hydrodynamic simulations to nonequilibrium ionization (NEI) calculations and X-ray spectral codes. I will review the basic ingredients that enter this kind of calculation, and the prospects for using them to understand the X-ray emission from the shocked ejecta in young SNRs. This understanding (when it is possible) can turn SNRs into veritable time machines, revealing the secrets of the titanic explosions that generated them hundreds of years ago.

  20. The atmospheric structures of the companion stars of eclipsing binary x ray sources

    NASA Technical Reports Server (NTRS)

    Clark, George W.

    1992-01-01

    This investigation was aimed at determining structural features of the atmospheres of the massive early-type companion stars of eclipsing x-ray pulsars by measurement of the attenuation of the x-ray spectrum during eclipse transitions and in deep eclipse. Several extended visits were made to ISAS in Japan by G. Clark and his graduate student, Jonathan Woo, to coordinate the Ginga observations and preliminary data reduction, and to work with the Japanese host scientist, Fumiaki Nagase, on the interpretation of the data. At MIT extensive developments were made in software systems for data interpretation. In particular, a Monte Carlo code was developed for a 3-D simulation of the propagation of x-rays from the neutron star through the ionized atmosphere of the companion. With this code it was possible to determine the spectrum of Compton-scattered x-rays in deep eclipse and to subtract that component from the observed spectra, thereby isolating the soft component that is attributable in large measure to x-rays that have been scattered by interstellar grains. This research culminated in the submission of a paper to the Astrophysical Journal on the determination of properties of the atmosphere of QV Nor, the B0 I companion of 4U 1538-52, and the properties of interstellar dust grains along the line of sight to the source. The latter results were an unanticipated byproduct of the investigation. Data from Ginga observations of the Magellanic binaries SMC X-1 and LMC X-4 are currently under investigation as the PhD thesis project of Jonathan Woo, who anticipates completion in the spring of 1993.
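
    The kernel of any such photon-transport Monte Carlo is sampling free paths against an optical depth. A minimal, self-contained sketch (a uniform slab with invented optical depth, not the 3-D stellar-atmosphere code described above):

```python
import math
import random

def transmit_unscattered(tau, n=200000, rng=random.Random(0)):
    """Fraction of photons crossing a slab of optical depth tau without
    interacting, estimated by sampling exponential free paths.
    Illustrative Monte Carlo kernel; real codes follow the scattered
    photons through further interactions as well."""
    hits = 0
    for _ in range(n):
        # free path in optical-depth units: -ln(1 - u), u uniform in [0, 1)
        if -math.log(1.0 - rng.random()) > tau:
            hits += 1
    return hits / n
```

    The estimate converges to exp(-tau), the Beer-Lambert attenuation; separating the photons that do interact into a Compton-scattered spectrum is the extra bookkeeping the 3-D simulation performs.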

  1. Time Evolving Fission Chain Theory and Fast Neutron and Gamma-Ray Counting Distributions

    DOE PAGES

    Kim, K. S.; Nakae, L. F.; Prasad, M. K.; ...

    2015-11-01

    Here, we solve a simple theoretical model of time evolving fission chains due to Feynman that generalizes and asymptotically approaches the point model theory. The point model theory has been used to analyze thermal neutron counting data. This extension of the theory underlies fast counting data for both neutrons and gamma rays from metal systems. Fast neutron and gamma-ray counting is now possible using liquid scintillator arrays with nanosecond time resolution. For individual fission chains, the differential equations describing three correlated probability distributions are solved: the time-dependent internal neutron population, accumulation of fissions in time, and accumulation of leaked neutrons in time. Explicit analytic formulas are given for correlated moments of the time evolving chain populations. The equations for random time gate fast neutron and gamma-ray counting distributions, due to randomly initiated chains, are presented. Correlated moment equations are given for both random time gate and triggered time gate counting. Explicit formulas are given for all correlated moments up to third order, for all combinations of correlated fast neutrons and gamma rays. The nonlinear differential equations for probabilities of time dependent fission chain populations have a remarkably simple Monte Carlo realization. A Monte Carlo code was developed for this theory and is shown to statistically realize the solutions to the fission chain theory probability distributions. Combined with random initiation of chains and detection of external quanta, the Monte Carlo code generates time tagged data for neutron and gamma-ray counting, and from these data the counting distributions.
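
    The "remarkably simple Monte Carlo realization" of a branching fission chain can be sketched as follows. The probabilities and multiplicity distribution are illustrative stand-ins, and the sketch counts only leaked neutrons without evolving the chain in time, unlike the paper's model:

```python
import random

def chain_leakage(p_fission, p_leak, nu_pmf, rng):
    """Number of neutrons leaking from one fission chain started by a
    single neutron. Each neutron leaks with probability p_leak, induces
    fission with probability p_fission (daughters drawn from nu_pmf),
    and is otherwise captured."""
    pending, leaked = 1, 0
    while pending:
        pending -= 1
        r = rng.random()
        if r < p_leak:
            leaked += 1                      # neutron escapes, can be counted
        elif r < p_leak + p_fission:
            # induced fission: add daughter neutrons to the chain
            pending += rng.choices(range(len(nu_pmf)), weights=nu_pmf)[0]
        # otherwise: parasitic capture ends this neutron's history
    return leaked

rng = random.Random(1)
nu_pmf = [0.1, 0.3, 0.4, 0.2]           # P(nu = 0..3) neutrons per fission
counts = [chain_leakage(0.3, 0.5, nu_pmf, rng) for _ in range(20000)]
mean_leaked = sum(counts) / len(counts)
```

    With these numbers the chain multiplication is subcritical (0.3 x mean nu = 0.51 < 1), so chains terminate; histograms of `counts` over many chains are the kind of counting distribution the theory predicts analytically.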

  2. NVIDIA OptiX ray-tracing engine as a new tool for modelling medical imaging systems

    NASA Astrophysics Data System (ADS)

    Pietrzak, Jakub; Kacperski, Krzysztof; Cieślar, Marek

    2015-03-01

    The most accurate technique to model the X- and gamma radiation path through a numerically defined object is the Monte Carlo simulation which follows single photons according to their interaction probabilities. A simplified and much faster approach, which just integrates total interaction probabilities along selected paths, is known as ray tracing. Both techniques are used in medical imaging for simulating real imaging systems and as projectors required in iterative tomographic reconstruction algorithms. These approaches are ready for massive parallel implementation e.g. on Graphics Processing Units (GPU), which can greatly accelerate the computation time at a relatively low cost. In this paper we describe the application of the NVIDIA OptiX ray-tracing engine, popular in professional graphics and rendering applications, as a new powerful tool for X- and gamma ray-tracing in medical imaging. It allows the implementation of a variety of physical interactions of rays with pixel-, mesh- or NURBS-based objects, and recording any required quantities, like path integrals, interaction sites, deposited energies, and others. Using the OptiX engine we have implemented a code for rapid Monte Carlo simulations of Single Photon Emission Computed Tomography (SPECT) imaging, as well as the ray-tracing projector, which can be used in reconstruction algorithms. The engine generates efficient, scalable and optimized GPU code, ready to run on multi GPU heterogeneous systems. We have compared the results of our simulations with the GATE package. With the OptiX engine the computation time of a Monte Carlo simulation can be reduced from days to minutes.
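
    The path-integral projector idea, integrating interaction probability along a ray through a voxelized object, can be sketched on the CPU with simple midpoint sampling. This is an illustration of the concept only (a production projector would use an exact voxel-traversal algorithm such as Siddon's, and the paper's implementation runs on the GPU via OptiX):

```python
import numpy as np

def path_integral(mu, start, end, n_samples=1000):
    """Approximate attenuation line integral through a 2-D voxel map.

    mu        -- 2-D array of attenuation coefficients (per unit length);
                 voxel mu[i, j] covers [i, i+1) x [j, j+1)
    start/end -- (x, y) endpoints of the ray
    """
    start, end = np.asarray(start, float), np.asarray(end, float)
    length = np.linalg.norm(end - start)
    t = (np.arange(n_samples) + 0.5) / n_samples      # midpoint samples
    pts = start + t[:, None] * (end - start)
    ix = np.clip(pts[:, 0].astype(int), 0, mu.shape[0] - 1)
    iy = np.clip(pts[:, 1].astype(int), 0, mu.shape[1] - 1)
    return length / n_samples * np.sum(mu[ix, iy])

mu = np.full((4, 4), 0.5)                              # uniform test object
line_integral = path_integral(mu, (0.0, 2.5), (4.0, 2.5))  # expect 4 * 0.5 = 2
```

    The transmitted intensity along the ray is then I = I0 * exp(-line_integral); evaluating many such rays in parallel, one per detector bin, is exactly the workload a ray-tracing engine maps onto the GPU.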

  3. Full wave simulations of helicon wave losses in the scrape-off-layer of the DIII-D tokamak

    NASA Astrophysics Data System (ADS)

    Lau, Cornwall; Jaeger, Fred; Berry, Lee; Bertelli, Nicola; Pinsker, Robert

    2017-10-01

    Helicon waves have recently been proposed as an off-axis current drive actuator for DIII-D. Previous modeling using the hot plasma, full wave code AORSA has shown good agreement with the ray tracing code GENRAY for helicon wave propagation and absorption in the core plasma. AORSA and a new, reduced finite-element model show that discrepancies between ray tracing and full wave occur in the scrape-off-layer (SOL), especially at high densities. The reduced model is much faster than AORSA and reproduces most of the important features of the AORSA model. The reduced model also allows for larger parametric scans and for the easy use of arbitrary tokamak geometry. Results of the full wave codes AORSA and COMSOL are shown for helicon wave losses in the SOL over a large range of parameters, such as SOL density profiles, n||, radial and vertical locations of the antenna, and different tokamak vessel geometries. This work was supported by DE-AC05-00OR22725, DE-AC02-09CH11466, and DE-FC02-04ER54698.

  4. X-ray backscatter radiography with lower open fraction coded masks

    NASA Astrophysics Data System (ADS)

    Muñoz, André A. M.; Vella, Anna; Healy, Matthew J. F.; Lane, David W.; Jupp, Ian; Lockley, David

    2017-09-01

    Single sided radiographic imaging would find great utility in medical, aerospace and security applications. While coded apertures can be used to form such an image from backscattered X-rays, they suffer from near field limitations that introduce noise. Several theoretical studies have indicated that for an extended source the image's signal-to-noise ratio may be optimised by using a low open fraction (<0.5) mask. However, few experimental results have been published for such low open fraction patterns, and details of their formulation are often unavailable or ambiguous. In this paper we address this process for two types of low open fraction mask, the dilute URA and the Singer set array. For the dilute URA, the procedure for producing multiple 2D array patterns from given 1D binary sequences (Barker codes) is explained. Their point spread functions are calculated and their imaging properties are critically reviewed. These results are then compared to those from the Singer set, and experimental exposures are presented for both types of pattern; their prospects for near field imaging are discussed.
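
    As a 1-D analogue of the point-spread-function calculation described above, a Singer perfect difference set of length 13 (open fraction 4/13 < 0.5) yields a cyclic PSF with a central peak and perfectly flat sidelobes under balanced decoding. This is an illustrative sketch, not the authors' 2-D arrays:

```python
import numpy as np

# Singer perfect difference set {0, 1, 3, 9} mod 13: every nonzero cyclic
# shift of the set overlaps it in exactly one element (lambda = 1).
n, open_positions = 13, {0, 1, 3, 9}
m = np.array([1 if i in open_positions else 0 for i in range(n)])

# Cyclic point spread function with the balanced decoding array g = 2m - 1:
g = 2 * m - 1
psf = np.array([np.sum(m * np.roll(g, -s)) for s in range(n)])
```

    The peak equals the number of open elements (4) while every sidelobe is the constant 2*lambda - 4 = -2, so apart from a uniform offset the mask images a point source as a point, which is the property that makes such low-open-fraction patterns usable for imaging.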

  5. How DARHT Works - the World's Most Powerful X-ray Machine

    ScienceCinema

    None

    2018-06-01

    The Dual Axis Radiographic Hydrodynamic Test (DARHT) facility at Los Alamos National Laboratory is an essential scientific tool that supports Stockpile Stewardship at the Laboratory. The world's most powerful x-ray machine, it is used to take high-speed images of mock nuclear devices - data that are used to confirm and modify advanced computer codes in assuring the safety, security, and effectiveness of the U.S. nuclear deterrent.

  6. The High Energy Telescope on EXIST: Hunting High Red-shift GRBs and Other Exotic Transients

    NASA Astrophysics Data System (ADS)

    Hong, JaeSub; Grindlay, J.; Allen, B.; Skinner, G. K.; Finger, M. H.; Jernigan, J. G.; EXIST Team

    2009-01-01

    The current baseline design of the High Energy Telescope (HET) on EXIST will localize high red-shift Gamma-Ray Bursts (GRBs) and other exotic transients fast (<10 sec) and accurately (<17") in order to allow rapid (<1-2 min) follow-up onboard optical/IR imaging and spectroscopy. HET employs coded-aperture imaging with a 5.5 m2 CZT detector and a large hybrid tungsten mask (see also Skinner et al. in this meeting). The wide energy band coverage (5-600 keV) is optimal for capturing these transients and highly obscured AGNs. The continuous scan with the wide field of view (45 deg radius at 25% coding fraction) increases the chance of capturing rare elusive events such as soft gamma-ray repeaters and tidal disruption events of stars by dormant supermassive black holes. Sweeping nearly the entire sky every two orbits (3 hours) will also establish a finely-sampled long-term history of the X-ray variability of many X-ray sources, opening up a new time domain for variability studies. In light of the new EXIST design concept, we review the observing strategy to maximize the science return and report the latest development of the CZT detectors for HET.

  7. Hydrodynamic study of plasma amplifiers for soft-x-ray lasers: a transition in hydrodynamic behavior for plasma columns with widths ranging from 20 μm to 2 mm.

    PubMed

    Oliva, Eduardo; Zeitoun, Philippe; Velarde, Pedro; Fajardo, Marta; Cassou, Kevin; Ros, David; Sebban, Stephan; Portillo, David; le Pape, Sebastien

    2010-11-01

    Plasma-based seeded soft-x-ray lasers have the potential to generate high energy and highly coherent short pulse beams. Due to their high density, plasmas created by the interaction of an intense laser with a solid target should store the highest amount of energy density among all plasma amplifiers. Our previous numerical work with a two-dimensional (2D) adaptive mesh refinement hydrodynamic code demonstrated that careful tailoring of plasma shapes leads to a dramatic enhancement of both soft-x-ray laser output energy and pumping efficiency. Benchmarking of our 2D hydrodynamic code in previous experiments demonstrated a high level of confidence, allowing us to perform a full study aimed at paving the way for 10-100 μJ seeded soft-x-ray lasers. In this paper, we describe in detail the mechanisms that drive the hydrodynamics of plasma columns. We observed transitions between narrow plasmas, where very strong bidimensional flow prevents them from storing energy, and large plasmas that store a high amount of energy. Millimeter-sized plasmas are outstanding amplifiers, but they have the limitation of transverse lasing. In this paper, we provide a preliminary solution to this problem.

  8. The INTEGRAL spectrometer SPI

    NASA Technical Reports Server (NTRS)

    Mandrou, P.; Vedrenne, G.; Jean, P.; Kandel, B.; vonBallmoos, P.; Albernhe, F.; Lichti, G.; Schoenfelder, V.; Diehl, R.; Georgii, R.

    1997-01-01

    The INTErnational Gamma Ray Astrophysics Laboratory (INTEGRAL) mission's onboard spectrometer, the INTEGRAL spectrometer (SPI), is described. The SPI constitutes one of the four main mission instruments. It is optimized for detailed measurements of gamma ray lines and for the mapping of diffuse sources. It combines a coded aperture mask with an array of large volume, high purity germanium detectors. The detectors make precise measurements of the gamma ray energies over the 20 keV to 8 MeV range. The instrument's characteristics are described and the Monte Carlo simulation of its performance is outlined. It will be possible to study gamma ray emission from compact objects or line profiles with a high energy resolution and a high angular resolution.

  9. X-Ray Spectra from MHD Simulations of Accreting Black Holes

    NASA Technical Reports Server (NTRS)

    Schnittman, Jeremy D.; Noble, Scott C.; Krolik, Julian H.

    2011-01-01

    We present new global calculations of X-ray spectra from fully relativistic magnetohydrodynamic (MHD) simulations of black hole (BH) accretion disks. With a self-consistent radiative transfer code including Compton scattering and returning radiation, we can reproduce the predominant spectral features seen in decades of X-ray observations of stellar-mass BHs: a broad thermal peak around 1 keV, a power-law continuum up to >100 keV, and a relativistically broadened iron fluorescent line. By varying the mass accretion rate, different spectral states naturally emerge: thermal-dominant, steep power-law, and low/hard. In addition to the spectral features, we briefly discuss applications to X-ray timing and polarization.

  10. The Challenge of Time-Dependent Control of Both Processing and Performance of Materials at the Mesoscale, and the MaRIE Project

    NASA Astrophysics Data System (ADS)

    Barnes, Cris W.

    DOE and NNSA are recognizing a mission need for flexible and reduced-cost product-based solutions to materials through accelerated qualification, certification, and assessment. The science challenge lies between the nanoscale of materials and the integral device scale, at the middle or ''mesoscale'' where interfaces, defects, and microstructure determine the performance of the materials over the lifecycle of the intended use. Time-dependent control of the processing, structure and properties of materials at this scale lies at the heart of qualifying and certifying additive manufactured parts; experimental data of high fidelity and high resolution are necessary to discover the right physical mechanisms to model and to validate and calibrate those reduced-order models in codes on Exascale computers. The scientific requirements to do this are aided by a revolution in coherent imaging of non-periodic features that can be combined with scattering off periodic structures. This drives the requirement for a coherent x-ray source, brilliant and of high repetition rate, of sufficiently high energy to see into and through the mesoscale. The Matter-Radiation Interactions in Extremes (MaRIE) Project is a proposal to build such a very-high-energy X-ray Free Electron Laser.

  11. Radiation exposure for manned Mars surface missions

    NASA Technical Reports Server (NTRS)

    Simonsen, Lisa C.; Nealy, John E.; Townsend, Lawrence W.; Wilson, John W.

    1990-01-01

    The Langley cosmic ray transport code and the Langley nucleon transport code (BRYNTRN) are used to quantify the transport and attenuation of galactic cosmic rays (GCR) and solar proton flares through the Martian atmosphere. Surface doses are estimated using both a low density and a high density carbon dioxide model of the atmosphere which, in the vertical direction, provides a total of 16 g/sq cm and 22 g/sq cm of protection, respectively. At the Mars surface during the solar minimum cycle, a blood-forming organ (BFO) dose equivalent of 10.5 to 12 rem/yr due to galactic cosmic ray transport and attenuation is calculated. Estimates of the BFO dose equivalents which would have been incurred from the three large solar flare events of August 1972, November 1960, and February 1956 are also calculated at the surface. Results indicate surface BFO dose equivalents of approximately 2 to 5, 5 to 7, and 8 to 10 rem per event, respectively. Doses are also estimated at altitudes up to 12 km above the Martian surface where the atmosphere will provide less total protection.

  12. Space radiation dose estimates on the surface of Mars

    NASA Technical Reports Server (NTRS)

    Simonsen, Lisa C.; Nealy, John E.; Townsend, Lawrence W.; Wilson, John W.

    1990-01-01

    The Langley cosmic ray transport code and the Langley nucleon transport code (BRYNTRN) are used to quantify the transport and attenuation of galactic cosmic rays (GCR) and solar proton flares through the Martian atmosphere. Surface doses are estimated using both a low density and a high density carbon dioxide model of the atmosphere which, in the vertical direction, provides a total of 16 g/sq cm and 22 g/sq cm of protection, respectively. At the Mars surface during the solar minimum cycle, a blood-forming organ (BFO) dose equivalent of 10.5 to 12 rem/yr due to galactic cosmic ray transport and attenuation is calculated. Estimates of the BFO dose equivalents which would have been incurred from the three large solar flare events of August 1972, November 1960, and February 1956 are also calculated at the surface. Results indicate surface BFO dose equivalents of approximately 2 to 5, 5 to 7, and 8 to 10 rem per event, respectively. Doses are also estimated at altitudes up to 12 km above the Martian surface where the atmosphere will provide less total protection.

  13. Comment on "Comments on the use of asymmetric monochromators for x-ray diffraction on a synchrotron source" [Rev. Sci. Instrum. 66, 2174 (1995)]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanchez del Rio, M.; Cerrina, F.

    1996-10-01

    In the article "Comments on the use of asymmetric monochromators for x-ray diffraction on a synchrotron source," by Colin Nave, Ana Gonzalez, Graham Clark, Sean McSweeney, Stewart Cummings, and Michael Hart, Rev. Sci. Instrum. 66, 2174 (1995), paragraph II, the authors' unfamiliarity with our modeling codes leads them to claim that our approach to treating bent, asymmetrically cut crystals in ray tracing calculations is incorrect. Since SHADOW is a widely used code, it is important to correct any misunderstandings, and we give here arguments to demonstrate that our approach is perfectly valid, and that the arguments used by the authors to criticize our method are based on an unwarranted conclusion extracted from one of our previous articles. We show that SHADOW, when properly run, treats the cases raised exactly. Indeed, their arguments provide a nice benchmark test for verifying the accuracy of SHADOW. © 1996 American Institute of Physics.

  14. Digital pile-up rejection for plutonium experiments with solution-grown stilbene

    NASA Astrophysics Data System (ADS)

    Bourne, M. M.; Clarke, S. D.; Paff, M.; DiFulvio, A.; Norsworthy, M.; Pozzi, S. A.

    2017-01-01

    A solution-grown stilbene detector was used in several experiments with plutonium samples, including plutonium oxide, mixed oxide, and plutonium metal samples. Neutrons from different reactions and plutonium isotopes are accompanied by numerous gamma rays, especially the 59-keV gamma ray of 241Am. Identifying neutrons correctly is important for nuclear nonproliferation applications and makes neutron/gamma discrimination and pile-up rejection necessary. Each experimental dataset is presented with and without pile-up filtering using a previously developed algorithm. The experiments were simulated using MCNPX-PoliMi, a Monte Carlo code designed to accurately model scintillation detector response. Collision output from MCNPX-PoliMi was processed using the specialized MPPost post-processing code to convert neutron energy depositions event-by-event into light pulses. The model was compared to experimental data after pulse-shape discrimination identified waveforms as gamma ray or neutron interactions. We show that the use of the digital pile-up rejection algorithm allows for accurate neutron counting with stilbene to within 2% even when not using lead shielding.
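
    The pulse-shape discrimination step mentioned above is commonly done with a charge-integration (tail-to-total) ratio; a minimal sketch follows, in which the waveform shapes, decay constants, and integration split point are illustrative assumptions, not the paper's calibration.

```python
import numpy as np

def psd_ratio(pulse, t_split=20):
    """Tail-to-total charge-integration ratio: in organic scintillators
    such as stilbene, neutron recoils produce more delayed light, so
    neutron pulses have a larger tail fraction than gamma-ray pulses."""
    pulse = np.asarray(pulse, dtype=float)
    return pulse[t_split:].sum() / pulse.sum()

# Toy waveforms: a fast (gamma-like) decay and one with a slow
# component (neutron-like); time constants are illustrative only.
t = np.arange(100)
gamma_like = np.exp(-t / 5.0)
neutron_like = 0.8 * np.exp(-t / 5.0) + 0.2 * np.exp(-t / 40.0)
```

    A threshold on this ratio then separates the two populations; pile-up events, which distort the tail integral, are exactly what a rejection algorithm must filter out before this classification.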

  15. Technology Needs for Gamma Ray Astronomy

    NASA Technical Reports Server (NTRS)

    Gehrels, Neil

    2011-01-01

    Gamma ray astronomy is currently in an exciting period of multiple missions and a wealth of data. Results from INTEGRAL, Fermi, AGILE, Suzaku and Swift are making large contributions to our knowledge of high energy processes in the universe. The advances are due to new detector and imaging technologies. The steps to date have been from scintillators to solid state detectors for sensors and from light buckets to coded aperture masks and pair telescopes for imagers. A key direction for the future is toward focusing telescopes pushing into the hard X-ray regime and Compton telescopes and pair telescopes with fine spatial resolution for medium and high energy gamma rays. These technologies will provide finer imaging of gamma-ray sources. Importantly, they will also enable large steps forward in sensitivity by reducing background.

  16. Computational design of short pulse laser driven iron opacity experiments

    DOE PAGES

    Martin, M. E.; London, R. A.; Goluoglu, S.; ...

    2017-02-23

    Here, the resolution of current disagreements between solar parameters calculated from models and observations would benefit from the experimental validation of theoretical opacity models. Iron's complex ionic structure and large contribution to the opacity in the radiative zone of the sun make iron a good candidate for validation. Short pulse lasers can be used to heat buried layer targets to plasma conditions comparable to the radiative zone of the sun, and the frequency dependent opacity can be inferred from the target's measured x-ray emission. Target and laser parameters must be optimized to reach specific plasma conditions and meet x-ray emission requirements. The HYDRA radiation hydrodynamics code is used to investigate the effects of modifying laser irradiance and target dimensions on the plasma conditions, x-ray emission, and inferred opacity of iron and iron-magnesium buried layer targets. It was determined that plasma conditions are dominantly controlled by the laser energy and the tamper thickness. The accuracy of the inferred opacity is sensitive to tamper emission and optical depth effects. Experiments at conditions relevant to the radiative zone of the sun would investigate the validity of opacity theories important to resolving disagreements between solar parameters calculated from models and observations.

  17. Computational design of short pulse laser driven iron opacity experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martin, M. E.; London, R. A.; Goluoglu, S.

    Here, the resolution of current disagreements between solar parameters calculated from models and observations would benefit from the experimental validation of theoretical opacity models. Iron's complex ionic structure and large contribution to the opacity in the radiative zone of the sun make iron a good candidate for validation. Short pulse lasers can be used to heat buried layer targets to plasma conditions comparable to the radiative zone of the sun, and the frequency dependent opacity can be inferred from the target's measured x-ray emission. Target and laser parameters must be optimized to reach specific plasma conditions and meet x-ray emission requirements. The HYDRA radiation hydrodynamics code is used to investigate the effects of modifying laser irradiance and target dimensions on the plasma conditions, x-ray emission, and inferred opacity of iron and iron-magnesium buried layer targets. It was determined that plasma conditions are dominantly controlled by the laser energy and the tamper thickness. The accuracy of the inferred opacity is sensitive to tamper emission and optical depth effects. Experiments at conditions relevant to the radiative zone of the sun would investigate the validity of opacity theories important to resolving disagreements between solar parameters calculated from models and observations.

  18. Mechanical behaviour of a fibrous scaffold for ligament tissue engineering: finite elements analysis vs. X-ray tomography imaging.

    PubMed

    Laurent, Cédric P; Latil, Pierre; Durville, Damien; Rahouadj, Rachid; Geindreau, Christian; Orgéas, Laurent; Ganghoffer, Jean-François

    2014-12-01

    The use of biodegradable scaffolds seeded with cells to regenerate functional tissue-engineered substitutes offers an interesting alternative to common medical approaches for ligament repair. In particular, the finite element (FE) method makes it possible to predict and optimise both the macroscopic behaviour of these scaffolds and the local mechanical signals that control cell activity. In this study, we investigate the ability of a dedicated FE code to predict the geometrical evolution of a new braided and biodegradable polymer scaffold for ligament tissue engineering by comparing scaffold geometries obtained from FE simulations and from X-ray tomographic imaging during a tensile test. Moreover, we compare two types of FE simulations whose initial geometries are derived either from X-ray imaging or from a computed idealised configuration. We report that the dedicated FE simulations from an idealised reference configuration can reasonably be used in the future to predict the global and local mechanical behaviour of the braided scaffold. A valuable and original dialogue between experimental and numerical characterisation of such fibrous media is thus achieved. In the future, this approach should enable more accurate characterisation of the local and global behaviour of tissue-engineering scaffolds. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. Photon-induced positron annihilation lifetime spectroscopy using ultrashort laser-Compton-scattered gamma-ray pulses

    NASA Astrophysics Data System (ADS)

    Taira, Y.; Toyokawa, H.; Kuroda, R.; Yamamoto, N.; Adachi, M.; Tanaka, S.; Katoh, M.

    2013-05-01

    High-energy ultrashort gamma-ray pulses can be generated via laser Compton scattering with 90° collisions at the UVSOR-II electron storage ring. As an applied study of ultrashort gamma-ray pulses, a new photon-induced positron annihilation lifetime spectroscopy approach has been developed. Ultrashort gamma-ray pulses with a maximum energy of 6.6 MeV and pulse width of 2.2 ps created positrons throughout bulk lead via pair production. Annihilation gamma rays were detected by a BaF2 scintillator mounted on a photomultiplier tube. A positron lifetime spectrum was obtained by measuring the time difference between the RF frequency of the electron storage ring and the detection time of the annihilation gamma rays. We calculated the response of the BaF2 scintillator and the time jitter caused by the variation in the total path length of the ultrashort gamma-ray pulses, annihilation gamma rays, and scintillation light using a Monte Carlo simulation code. The positron lifetime for bulk lead was successfully measured.
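
    The timing measurement described above — the delay between the storage-ring RF reference and the annihilation gamma-ray detection — can be sketched as follows. The lifetime, jitter, and event count are illustrative assumptions, not the paper's measured values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not the experiment's results)
tau_ns = 0.2        # positron lifetime to simulate, ns
jitter_ns = 0.05    # combined detector/path-length timing jitter, ns
n_events = 100_000

# Each event's delay: an exponentially distributed positron lifetime
# smeared by Gaussian timing jitter.
delays = rng.exponential(tau_ns, n_events) + rng.normal(0.0, jitter_ns, n_events)

# Because the jitter has zero mean, the sample mean of the delay
# distribution is an unbiased estimator of the lifetime.
tau_est = delays.mean()
```

    In practice the lifetime is extracted by fitting the histogrammed delay spectrum with the detector response convolved in, which is why the paper computes the scintillator response and path-length jitter with a Monte Carlo code.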

  20. Explaining TeV cosmic-ray anisotropies with non-diffusive cosmic-ray propagation

    DOE PAGES

    Harding, James Patrick; Fryer, Chris Lee; Mendel, Susan Marie

    2016-05-11

    Constraining the behavior of cosmic ray data observed at Earth requires a precise understanding of how the cosmic rays propagate in the interstellar medium. The interstellar medium is not homogeneous; although turbulent magnetic fields dominate over large scales, small coherent regions of magnetic field exist on scales relevant to particle propagation in the nearby Galaxy. Guided propagation through a coherent field is significantly different from random particle diffusion and could be the explanation of spatial anisotropies in the observed cosmic rays. We present a Monte Carlo code to propagate cosmic particles through realistic magnetic field structures. We discuss the details of the model as well as some preliminary studies which indicate that coherent magnetic structures are important effects in local cosmic-ray propagation, increasing the flux of cosmic rays by over two orders of magnitude at anisotropic locations on the sky. Furthermore, the features induced by coherent magnetic structure could be the cause of the observed TeV cosmic-ray anisotropy.
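
    Propagating a charged particle through a prescribed magnetic field structure reduces to integrating the Lorentz force along each trajectory; a standard, energy-conserving integrator for this is the Boris rotation. The sketch below is a generic implementation in dimensionless units, not the authors' code.

```python
import numpy as np

def boris_step(x, v, B, q_over_m, dt):
    """One Boris rotation step for a charged particle in a static
    magnetic field (E = 0); it conserves kinetic energy exactly,
    which matters for long propagation runs."""
    t = 0.5 * q_over_m * dt * B
    s = 2.0 * t / (1.0 + np.dot(t, t))
    v_prime = v + np.cross(v, t)
    v_new = v + np.cross(v_prime, s)
    return x + v_new * dt, v_new

# Gyration in a uniform field along z: the speed must stay constant.
x = np.array([1.0, 0.0, 0.0])
v = np.array([0.0, 1.0, 0.0])
B = np.array([0.0, 0.0, 1.0])
for _ in range(1000):
    x, v = boris_step(x, v, B, q_over_m=1.0, dt=0.01)

speed = float(np.linalg.norm(v))
```

    Replacing the uniform `B` with an evaluation of a coherent-plus-turbulent field model at each position gives the kind of guided propagation the abstract contrasts with diffusion.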

  1. EXPLAINING TEV COSMIC-RAY ANISOTROPIES WITH NON-DIFFUSIVE COSMIC-RAY PROPAGATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harding, J. Patrick; Fryer, Chris L.; Mendel, Susan, E-mail: jpharding@lanl.gov, E-mail: fryer@lanl.gov, E-mail: smendel@lanl.gov

    2016-05-10

    Constraining the behavior of cosmic ray data observed at Earth requires a precise understanding of how the cosmic rays propagate in the interstellar medium. The interstellar medium is not homogeneous; although turbulent magnetic fields dominate over large scales, small coherent regions of magnetic field exist on scales relevant to particle propagation in the nearby Galaxy. Guided propagation through a coherent field is significantly different from random particle diffusion and could be the explanation of spatial anisotropies in the observed cosmic rays. We present a Monte Carlo code to propagate cosmic particles through realistic magnetic field structures. We discuss the details of the model as well as some preliminary studies which indicate that coherent magnetic structures are important effects in local cosmic-ray propagation, increasing the flux of cosmic rays by over two orders of magnitude at anisotropic locations on the sky. The features induced by coherent magnetic structure could be the cause of the observed TeV cosmic-ray anisotropy.

  2. Cyclotron resonant scattering feature simulations. II. Description of the CRSF simulation process

    NASA Astrophysics Data System (ADS)

    Schwarm, F.-W.; Ballhausen, R.; Falkner, S.; Schönherr, G.; Pottschmidt, K.; Wolff, M. T.; Becker, P. A.; Fürst, F.; Marcu-Cheatham, D. M.; Hemphill, P. B.; Sokolova-Lapa, E.; Dauser, T.; Klochkov, D.; Ferrigno, C.; Wilms, J.

    2017-05-01

    Context. Cyclotron resonant scattering features (CRSFs) are formed by scattering of X-ray photons off quantized plasma electrons in the strong magnetic field (of the order 1012 G) close to the surface of an accreting X-ray pulsar. Due to the complex scattering cross-sections, the line profiles of CRSFs cannot be described by an analytic expression. Numerical methods, such as Monte Carlo (MC) simulations of the scattering processes, are required in order to predict precise line shapes for a given physical setup, which can be compared to observations to gain information about the underlying physics in these systems. Aims: A versatile simulation code is needed for the generation of synthetic cyclotron lines. The code should make the simulation of sophisticated geometries possible for the first time. Methods: The simulation utilizes the mean free path tables described in the first paper of this series for the fast interpolation of propagation lengths. The code is parallelized to make the very time-consuming simulations possible on convenient time scales. Furthermore, it can generate responses to monoenergetic photon injections, producing Green's functions, which can be used later to generate spectra for arbitrary continua. Results: We develop a new simulation code to generate synthetic cyclotron lines for complex scenarios, allowing for unprecedented physical interpretation of the observed data. An associated XSPEC model implementation is used to fit synthetic line profiles to NuSTAR data of Cep X-4. The code has been developed with the main goal of overcoming previous geometrical constraints in MC simulations of CRSFs. By applying this code also to more simple, classic geometries used in previous works, we furthermore address issues of code verification and cross-comparison of various models. The XSPEC model and the Green's function tables are available online (see link in footnote, page 1).
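
    The Green's-function approach described above — simulate monoenergetic injections once, then synthesize the line profile for any continuum as a weighted sum — can be illustrated with a toy redistribution kernel. The Gaussian redistribution and absorption dip below are illustrative assumptions, not the physical cross-sections.

```python
import numpy as np

e = np.linspace(10.0, 60.0, 501)   # keV grid
e_cyc = 30.0                       # toy cyclotron line energy

def green(e_out, e_in):
    """Toy Green's function: photons injected at e_in are slightly
    redistributed in energy and partially absorbed near e_cyc."""
    redist = np.exp(-0.5 * ((e_out - e_in) / 1.0) ** 2)
    redist /= redist.sum()
    survival = 1.0 - 0.6 * np.exp(-0.5 * ((e_in - e_cyc) / 3.0) ** 2)
    return redist * survival

G = np.array([green(e, ei) for ei in e])  # row i: response to injection at e[i]
continuum = e ** -1.5                     # any continuum can be plugged in here
spectrum = continuum @ G                  # synthetic line profile
```

    The expensive scattering simulation is done once to build `G`; changing the continuum afterwards costs only a matrix product, which is what makes fitting to observed data (as with the NuSTAR Cep X-4 spectra) practical.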

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, K. S.; Nakae, L. F.; Prasad, M. K.

    Here, we solve a simple theoretical model of time evolving fission chains due to Feynman that generalizes and asymptotically approaches the point model theory. The point model theory has been used to analyze thermal neutron counting data. This extension of the theory underlies fast counting data for both neutrons and gamma rays from metal systems. Fast neutron and gamma-ray counting is now possible using liquid scintillator arrays with nanosecond time resolution. For individual fission chains, the differential equations describing three correlated probability distributions are solved: the time-dependent internal neutron population, accumulation of fissions in time, and accumulation of leaked neutrons in time. Explicit analytic formulas are given for correlated moments of the time evolving chain populations. The equations for random time gate fast neutron and gamma-ray counting distributions, due to randomly initiated chains, are presented. Correlated moment equations are given for both random time gate and triggered time gate counting. Explicit formulas are given for all correlated moments up to triple order, for all combinations of correlated fast neutrons and gamma rays. The nonlinear differential equations for probabilities for time dependent fission chain populations have a remarkably simple Monte Carlo realization. A Monte Carlo code was developed for this theory and is shown to statistically realize the solutions to the fission chain theory probability distributions. Combined with random initiation of chains and detection of external quanta, the Monte Carlo code generates time tagged data for neutron and gamma-ray counting and from these data the counting distributions.
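
    The "remarkably simple Monte Carlo realization" of a fission chain can be sketched as a branching process; the fission probability and multiplicity distribution below are illustrative assumptions, not evaluated nuclear data.

```python
import random

random.seed(1)

# Toy neutron multiplicity distribution (illustrative, not evaluated data)
MULT = [0, 1, 2, 3, 4]
MULT_P = [0.03, 0.16, 0.33, 0.30, 0.18]   # mean multiplicity ~2.44

def fission_chain(p_fission=0.3):
    """One chain started by a single neutron: each neutron either
    induces a fission (probability p_fission), adding new neutrons
    drawn from MULT, or leaks out of the system. Terminates because
    the chain is subcritical (p_fission * mean multiplicity < 1).
    Returns (fissions, leaked neutrons)."""
    fissions = leaked = 0
    population = 1
    while population:
        population -= 1
        if random.random() < p_fission:
            fissions += 1
            population += random.choices(MULT, MULT_P)[0]
        else:
            leaked += 1
    return fissions, leaked

# For a subcritical branching process the mean number of leaked
# neutrons per chain approaches (1 - p) / (1 - p*nu), about 2.6 here.
n_chains = 20_000
mean_leaked = sum(fission_chain()[1] for _ in range(n_chains)) / n_chains
```

    Time-tagging each leaked neutron and adding a detection probability would turn this skeleton into the kind of simulated counting data the paper compares against the analytic moment formulas.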

  4. Comparison of space radiation calculations for deterministic and Monte Carlo transport codes

    NASA Astrophysics Data System (ADS)

    Lin, Zi-Wei; Adams, James; Barghouty, Abdulnasser; Randeniya, Sharmalee; Tripathi, Ram; Watts, John; Yepes, Pablo

    For space radiation protection of astronauts or electronic equipment, it is necessary to develop and use accurate radiation transport codes. Radiation transport codes include deterministic codes, such as HZETRN from NASA and UPROP from the Naval Research Laboratory, and Monte Carlo codes such as FLUKA, the Geant4 toolkit and HETC-HEDS. The deterministic codes and Monte Carlo codes complement each other in that deterministic codes are very fast while Monte Carlo codes are more elaborate. Therefore it is important to investigate how well the results of deterministic codes compare with those of Monte Carlo transport codes and where they differ. In this study we evaluate these different codes in their space radiation applications by comparing their output results in the same given space radiation environments, shielding geometry and material. Typical space radiation environments such as the 1977 solar minimum galactic cosmic ray environment are used as the well-defined input, and simple geometries made of aluminum, water and/or polyethylene are used to represent the shielding material. We then compare various outputs of these codes, such as the dose-depth curves and the flux spectra of different fragments and other secondary particles. These comparisons enable us to learn more about the main differences between these space radiation transport codes. At the same time, they help us to learn the qualitative and quantitative features that these transport codes have in common.

  5. Scanning sky monitor (SSM) onboard AstroSat

    NASA Astrophysics Data System (ADS)

    Ramadevi, M. C.; Seetha, S.; Bhattacharya, Dipankar; Ravishankar, B. T.; Sitaramamurthy, N.; Meena, G.; Sharma, M. Ramakrishna; Kulkarni, Ravi; Babu, V. Chandra; Kumar; Singh, Brajpal; Jain, Anand; Yadav, Reena; Vaishali, S.; Ashoka, B. N.; Agarwal, Anil; Balaji, K.; Nagesh, G.; Kumar, Manoj; Gaan, Dhruti Ranjan; Kulshresta, Prashanth; Agarwal, Pankaj; Sebastian, Mathew; Rajarajan, A.; Radhika, D.; Nandi, Anuj; Girish, V.; Agarwal, Vivek Kumar; Kushwaha, Ankur; Iyer, Nirmal Kumar

    2017-10-01

    Scanning Sky Monitor (SSM) onboard AstroSat is an X-ray sky monitor in the soft X-ray band designed with a large field of view to detect and locate transient X-ray sources and alert the astronomical community about interesting phenomena in the X-ray sky. SSM comprises position sensitive proportional counters with a 1D coded mask for imaging. There are three detector units mounted on a platform capable of rotation, which helps cover about 50% of the sky in one full rotation. This paper discusses the details of the instrument and a few immediate results from the instrument after launch.

  6. Bragg x-ray survey spectrometer for ITER.

    PubMed

    Varshney, S K; Barnsley, R; O'Mullane, M G; Jakhar, S

    2012-10-01

    Several potential impurity ions in the ITER plasmas will lead to loss of confined energy through line and continuum emission. For real time monitoring of impurities, a seven channel Bragg x-ray spectrometer (XRCS survey) is considered. This paper presents the design and analysis of the spectrometer, including x-ray tracing by the Shadow-XOP code, sensitivity calculations for the reference H-mode plasma, and a neutronics assessment. The XRCS survey performance analysis shows that the ITER measurement requirements of impurity monitoring in 10 ms integration time at the minimum levels for low-Z to high-Z impurity ions can largely be met.

  7. Calibration and performance of a real-time gamma-ray spectrometry water monitor using a LaBr3(Ce) detector

    NASA Astrophysics Data System (ADS)

    Prieto, E.; Casanovas, R.; Salvadó, M.

    2018-03-01

    A scintillation gamma-ray spectrometry water monitor with a 2″ × 2″ LaBr3(Ce) detector was characterized in this study. This monitor measures gamma-ray spectra of river water. Energy and resolution calibrations were performed experimentally, whereas the detector efficiency was determined using Monte Carlo simulations with the EGS5 code system. Values of the minimum detectable activity concentrations for 131I and 137Cs were calculated for different integration times. As an example of the monitor performance after calibration, a radiological increment during a rainfall episode was studied.
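
    Minimum detectable activity figures like those quoted above are conventionally derived from Currie's a-priori detection limit; a minimal sketch follows, in which the background counts, efficiency, emission probability, and counting time are illustrative assumptions, not this monitor's calibration values.

```python
import math

def currie_mda(background_counts, efficiency, emission_prob, live_time_s):
    """A-priori minimum detectable activity (Bq) from Currie's
    detection limit L_D = 2.71 + 4.65*sqrt(B) counts, converted to
    activity via efficiency, emission probability, and live time."""
    ld_counts = 2.71 + 4.65 * math.sqrt(background_counts)
    return ld_counts / (efficiency * emission_prob * live_time_s)

# Illustrative numbers: 400 background counts in the 137Cs window,
# 2% full-energy-peak efficiency, 0.85 emission probability, 600 s
# integration time.
mda_bq = currie_mda(400, 0.02, 0.85, 600)
```

    Because the detection limit scales with the square root of the background, longer integration times lower the MDA in concentration terms, which is why the paper tabulates it for several integration times.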

  8. A graphics-card implementation of Monte-Carlo simulations for cosmic-ray transport

    NASA Astrophysics Data System (ADS)

    Tautz, R. C.

    2016-05-01

    A graphics card implementation of a test-particle simulation code is presented that is based on the CUDA extension of the C/C++ programming language. The original CPU version has been developed for the calculation of cosmic-ray diffusion coefficients in artificial Kolmogorov-type turbulence. In the new implementation, the magnetic turbulence generation, which is the most time-consuming part, is separated from the particle transport and is performed on a graphics card. In this article, the modification of the basic approach of integrating test particle trajectories to employ the SIMD (single instruction, multiple data) model is presented and verified. The efficiency of the new code is tested and several language-specific accelerating factors are discussed. For the example of isotropic magnetostatic turbulence, sample results are shown and a comparison to the results of the CPU implementation is performed.
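
    The pre-computed turbulence that dominates the run time is typically a superposition of plane-wave modes with random phases and a Kolmogorov amplitude spectrum. A minimal serial sketch is shown below; the mode count, wavenumber range, and normalization are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic turbulence (illustrative sketch): a sum of plane-wave
# modes with fixed random phases and amplitudes following a
# Kolmogorov spectrum P(k) ~ k^(-5/3).
n_modes = 64
k = np.logspace(-1, 1, n_modes)                     # wavenumbers
amp = np.sqrt(k ** (-5.0 / 3.0) * np.gradient(k))   # per-mode amplitude
phase = rng.uniform(0.0, 2.0 * np.pi, n_modes)

def delta_B(x):
    """One transverse component of the turbulent field at position x;
    this per-position evaluation is the hot loop a GPU port maps onto
    many threads (SIMD) at once."""
    return float(np.sum(amp * np.cos(k * x + phase)))

field = np.array([delta_B(xi) for xi in np.linspace(0.0, 100.0, 200)])
```

    The sum over modes must be re-evaluated at every particle position on every time step, which is why moving this loop to the graphics card yields the speed-up the article reports.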

  9. A method to optimize the shield compact and lightweight combining the structure with components together by genetic algorithm and MCNP code.

    PubMed

    Cai, Yao; Hu, Huasi; Pan, Ziheng; Hu, Guang; Zhang, Tao

    2018-05-17

    To make a shield for neutrons and gamma rays compact and lightweight, an optimization method combining the shield structure and its components was established, employing genetic algorithms and the MCNP code. As a typical case, the fission energy spectrum of 235U, which contains both neutrons and gamma rays, was adopted in this study. Six types of materials were presented and optimized by the method. Spherical geometry was adopted in the optimization after checking the geometry effect. Simulations were made to verify the reliability of the optimization method and the efficiency of the optimized materials. To compare the materials visually and conveniently, the volume and weight needed to build a shield are employed. The results showed that the composite multilayer material has the best performance. Copyright © 2018 Elsevier Ltd. All rights reserved.
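
    The coupling of a genetic algorithm to a transport calculation can be sketched as follows; here a toy exponential-attenuation model stands in for the MCNP fitness evaluation, and all coefficients, layer counts, and GA parameters are illustrative assumptions.

```python
import math
import random

random.seed(0)

MU = [0.10, 0.25, 0.18]    # toy removal coefficients per layer, 1/cm
RHO = [1.0, 7.8, 2.3]      # toy densities, g/cm^3

def fitness(thicknesses, target_atten=1e-3):
    """Surrogate for an MCNP run: exponential attenuation through the
    layers. Shields missing the attenuation target are heavily
    penalized; feasible ones are ranked by areal density (weight)."""
    atten = math.exp(-sum(mu * t for mu, t in zip(MU, thicknesses)))
    weight = sum(rho * t for rho, t in zip(RHO, thicknesses))
    return weight + (1e6 if atten > target_atten else 0.0)

def evolve(pop_size=40, generations=60):
    pop = [[random.uniform(0, 50) for _ in range(3)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]          # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = [random.choice(g) for g in zip(a, b)]   # uniform crossover
            if random.random() < 0.3:                       # mutation
                i = random.randrange(3)
                child[i] = max(0.0, child[i] + random.gauss(0, 2))
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

best = evolve()
```

    In the actual method each fitness evaluation is a transport calculation, so the GA's population size and generation count trade directly against MCNP run time.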

  10. LINE: a code which simulates spectral line shapes for fusion reaction products generated by various speed distributions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Slaughter, D.

    1985-03-01

    A computer code is described which estimates the energy spectrum or "line shape" for the charged particles and gamma-rays produced by the fusion of low-Z ions in a hot plasma. The simulation has several built-in ion velocity distributions characteristic of heated plasmas, and it also accepts arbitrary speed and angular distributions, although they must all be symmetric about the z-axis. An energy spectrum of one of the reaction products (ion, neutron, or gamma-ray) is calculated at one angle with respect to the symmetry axis. The results are shown in tabular form and plotted graphically, and the moments of the spectrum to order ten are calculated both with respect to the origin and with respect to the mean.
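
    Moments of a binned spectrum about the origin and about the mean, of the kind the code tabulates, can be computed directly from the line shape; the Gaussian line below is an illustrative stand-in, not the code's physics.

```python
import numpy as np

# Illustrative binned line shape: a Gaussian centered at 14.1 MeV
# with 0.1 MeV width (a stand-in spectrum for demonstration).
e = np.linspace(13.0, 15.0, 2001)
spectrum = np.exp(-0.5 * ((e - 14.1) / 0.1) ** 2)

w = spectrum / spectrum.sum()   # normalized spectral weights

# Moments about the origin and about the mean, orders 1..4
moments_origin = [float(np.sum(w * e ** n)) for n in range(1, 5)]
mean = moments_origin[0]
moments_central = [float(np.sum(w * (e - mean) ** n)) for n in range(1, 5)]
```

    The first moment recovers the line centroid and the second central moment its variance; higher central moments quantify the skew and kurtosis that distinguish different ion velocity distributions.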

  11. Simulation study of 3-5 keV x-ray conversion efficiency from Ar K-shell vs. Ag L-shell targets on the National Ignition Facility laser

    NASA Astrophysics Data System (ADS)

    Kemp, G. E.; Colvin, J. D.; Fournier, K. B.; May, M. J.; Barrios, M. A.; Patel, M. V.; Scott, H. A.; Marinak, M. M.

    2015-05-01

    Tailored, high-flux, multi-keV x-ray sources are desirable for studying x-ray interactions with matter for various civilian, space and military applications. For this study, we focus on designing an efficient laser-driven non-local thermodynamic equilibrium 3-5 keV x-ray source from photon-energy-matched Ar K-shell and Ag L-shell targets at sub-critical densities (~nc/10) to ensure supersonic, volumetric laser heating with minimal losses to kinetic energy, thermal x rays and laser-plasma instabilities. Using Hydra, a multi-dimensional, arbitrary Lagrangian-Eulerian, radiation-hydrodynamics code, we performed a parameter study by varying initial target density and laser parameters for each material using conditions readily achievable on the National Ignition Facility (NIF) laser. We employ a model, benchmarked against Kr data collected on the NIF, that uses flux-limited Lee-More thermal conductivity and multi-group implicit Monte-Carlo photonics with non-local thermodynamic equilibrium, detailed super-configuration accounting opacities from Cretin, an atomic-kinetics code. While the highest power laser configurations produced the largest x-ray yields, we report that the peak simulated laser to 3-5 keV x-ray conversion efficiencies of 17.7% and 36.4% for Ar and Ag, respectively, occurred at lower powers between ~100-150 TW. For identical initial target densities and laser illumination, the Ag L-shell is observed to have ≳10× higher emissivity per ion per deposited laser energy than the Ar K-shell. Although such low-density Ag targets have not yet been demonstrated, simulations of targets fabricated using atomic layer deposition of Ag on silica aerogels (~20% by atomic fraction) suggest similar performance to atomically pure metal foams and that either fabrication technique may be worth pursuing for an efficient 3-5 keV x-ray source on NIF.

  12. EVIDENCE FOR ENHANCED {sup 3}HE IN FLARE-ACCELERATED PARTICLES BASED ON NEW CALCULATIONS OF THE GAMMA-RAY LINE SPECTRUM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murphy, R. J.; Kozlovsky, B.; Share, G. H., E-mail: murphy@ssd5.nrl.navy.mil, E-mail: benz@wise.tau.ac.il, E-mail: share@astro.umd.edu

    2016-12-20

    The ³He abundance in impulsive solar energetic particle (SEP) events is enhanced up to several orders of magnitude compared to its photospheric value of [³He]/[⁴He] = 1–3 × 10⁻⁴. Interplanetary magnetic field and timing observations suggest that these events are related to solar flares. Observations of ³He in flare-accelerated ions would clarify the relationship between these two phenomena. Energetic ³He interactions in the solar atmosphere produce gamma-ray nuclear-deexcitation lines, both lines that are also produced by protons and α particles and lines that are essentially unique to ³He. Gamma-ray spectroscopy can, therefore, reveal enhanced levels of accelerated ³He. In this paper, we identify all significant deexcitation lines produced by ³He interactions in the solar atmosphere. We evaluate their production cross sections and incorporate them into our nuclear deexcitation-line code. We find that enhanced ³He can affect the entire gamma-ray spectrum. We identify gamma-ray line features for which the yield ratios depend dramatically on the ³He abundance. We determine the accelerated ³He/α ratio by comparing these ratios with flux ratios measured previously from the gamma-ray spectrum obtained by summing the 19 strongest flares observed with the Solar Maximum Mission Gamma-Ray Spectrometer. All six flux ratios investigated show enhanced ³He, confirming earlier suggestions. The ³He/α weighted mean of these new measurements ranges from 0.05 to 0.3 (depending on the assumed accelerated α/proton ratio) and has a <1 × 10⁻³ probability of being consistent with the photospheric value. With the improved code, we can now exploit the full potential of gamma-ray spectroscopy to establish the relationship between flare-accelerated ions and ³He-rich SEPs.

  13. X-ray modeling for SMILE

    NASA Astrophysics Data System (ADS)

    Sun, T.; Wang, C.; Wei, F.; Liu, Z. Q.; Zheng, J.; Yu, X. Z.; Sembay, S.; Branduardi-Raymont, G.

    2016-12-01

    SMILE (Solar wind Magnetosphere Ionosphere Link Explorer) is a novel mission to explore the coupling of the solar wind-magnetosphere-ionosphere system by providing global images of the magnetosphere and aurora. As X-ray imaging is a brand-new technique for studying the large-scale magnetopause, modeling of the solar wind charge exchange (SWCX) X-ray emissions in the magnetosheath and cusps is vital in various aspects: it helps the design of the Soft X-ray Imager (SXI) on SMILE, the selection of satellite orbits, as well as the analysis of expected scientific outcomes. Based on the PPMLR-MHD code, we present simulation results of the X-ray emissions in geospace during storm time. Both the polar orbit and the Molniya orbit are used. From the X-ray images of the magnetosheath and cusps, the magnetospheric responses to an interplanetary shock and an IMF southward turning are analyzed.

  14. Modelling of Divertor Detachment in MAST Upgrade

    NASA Astrophysics Data System (ADS)

    Moulton, David; Carr, Matthew; Harrison, James; Meakins, Alex

    2017-10-01

    MAST Upgrade will have extensive capabilities to explore the benefits of alternative divertor configurations such as the conventional, Super-X, X divertor, snowflake and variants in a single device with closed divertors. Initial experiments will concentrate on exploring the Super-X and conventional configurations, in terms of power and particle loads to divertor surfaces, access to detachment and its control. Simulations have been carried out with the SOLPS5.0 code validated against MAST experiments. The simulations predict that the Super-X configuration has significant advantages over the conventional, such as a lower detachment threshold (2-3x lower in terms of upstream density and 4x higher in terms of PSOL). Synthetic spectroscopy diagnostics from these simulations have been created using the Raysect ray tracing code to produce synthetic filtered camera images, spectra and foil bolometer data. Forward modelling of the current set of divertor diagnostics will be presented, together with a discussion of future diagnostics and analysis to improve estimates of the plasma conditions. Work supported by the RCUK Energy Programme [Grant Number EP/P012450/1] and EURATOM.

  15. A General Tool for Evaluating High-Contrast Coronagraphic Telescope Performance Error Budgets

    NASA Technical Reports Server (NTRS)

    Marchen, Luis F.; Shaklan, Stuart B.

    2009-01-01

    This paper describes a general purpose Coronagraph Performance Error Budget (CPEB) tool that we have developed under the NASA Exoplanet Exploration Program. The CPEB automates many of the key steps required to evaluate the scattered starlight contrast in the dark hole of a space-based coronagraph. It operates in 3 steps: first, a CodeV or Zemax prescription is converted into a MACOS optical prescription. Second, a Matlab program calls ray-trace code that generates linear beam-walk and aberration sensitivity matrices for motions of the optical elements and line-of-sight pointing, with and without controlled coarse and fine-steering mirrors. Third, the sensitivity matrices are imported by macros into Excel 2007 where the error budget is created. Once created, the user specifies the quality of each optic from a predefined set of PSDs. The spreadsheet creates a nominal set of thermal and jitter motions and combines them with the sensitivity matrices to generate an error budget for the system. The user can easily modify the motion allocations to perform trade studies.
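
    The core bookkeeping — combining motion allocations with the ray-trace sensitivity matrices into a contrast budget — reduces to a few matrix operations. The sketch below uses hypothetical shapes and magnitudes; in the real tool the sensitivities come from the MACOS ray trace and the budget lives in the spreadsheet.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical dimensions: 5 dark-hole zones, 12 rigid-body/pointing
# motions (illustrative assumptions, not the tool's actual matrices).
S = rng.normal(0.0, 1e-9, (5, 12))   # contrast change per nm of each motion
motions = np.full(12, 0.5)           # allocated motion amplitudes, nm RMS

# Root-sum-square the independent contributions within each zone.
contrast = np.sqrt(((S * motions) ** 2).sum(axis=1))
```

    Trade studies then amount to rescaling entries of `motions` and re-evaluating, which is exactly the workflow the spreadsheet front end supports.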

  16. System for loading executable code into volatile memory in a downhole tool

    DOEpatents

    Hall, David R.; Bartholomew, David B.; Johnson, Monte L.

    2007-09-25

    A system for loading an executable code into volatile memory in a downhole tool string component comprises a surface control unit comprising executable code. An integrated downhole network comprises data transmission elements in communication with the surface control unit and the volatile memory. The executable code, stored in the surface control unit, is not permanently stored in the downhole tool string component. In a preferred embodiment of the present invention, the downhole tool string component comprises boot memory. In another embodiment, the executable code is an operating system executable code. Preferably, the volatile memory comprises random access memory (RAM). A method for loading executable code to volatile memory in a downhole tool string component comprises sending the code from the surface control unit to a processor in the downhole tool string component over the network. A central processing unit writes the executable code in the volatile memory.

  17. High Resolution Energetic X-ray Imager (HREXI)

    NASA Astrophysics Data System (ADS)

    Grindlay, Jonathan

    We propose to design and build the first imaging hard X-ray detector system that incorporates 3D stacking of closely packed detector readouts in finely-spaced imaging arrays with their required data processing and control electronics. In virtually all imaging astronomical detectors, detector readout is done with flex connectors or connections that are not vertical but horizontal, at the cost of focal plane area. For high resolution pixel detectors needed for high speed event-based X-ray imaging, from low energy applications (CMOS) with focusing X-ray telescopes to hard X-ray applications with pixelated CZT for large area coded aperture telescopes, this new detector development offers great promise. We propose to extend our previous and current APRA supported ProtoEXIST program, which has developed the first large area imaging CZT detectors and demonstrated their astrophysical capabilities on two successful balloon flights, to a next generation High Resolution Energetic X-ray Imager (HREXI), which would incorporate microvia technology for the first time to connect the readout ASIC on each CZT crystal directly to its control and data processing system. This 3-dimensional stacking of detector and readout/control system means that large area (>2 m2) imaging detector planes for a High Resolution Wide-field hard X-ray telescope can be built with initially greatly reduced detector gaps and ultimately with no gaps. This increases detector area, efficiency, and simplicity of detector integration. Thus higher sensitivity wide-field imagers will be possible at lower cost. HREXI will enable a post-Swift NASA mission such as the EREXS concept proposed to PCOS to be conducted as a future MIDEX mission. This mission would conduct a high resolution (<2 arcmin), broad-band (5-200 keV) hard X-ray survey of black holes on all scales with ~10X higher sensitivity than Swift.
In the current era of Time Domain Astrophysics, such a survey capability, in conjunction with a nIR telescope in space, will enable GRBs to be used as probes of the formation of the first stars and structure in the Universe. HREXI on its own, with broad bandwidth and high spectral and spatial resolution, will extend both Galactic surveys for obscured young supernova remnants (44Ti sources) and surveys for transients, black holes, flaring AGN and TDEs, at greatly increased sensitivity and spatial/spectral resolution relative to what has been done with Swift or INTEGRAL. If the HREXI-1 technology is developed in the first year of this proposed effort, it could be used on the upcoming Brazil-US MIRAX telescope on the Lattes satellite, scheduled for a 2018 launch with imaging detector planes to be provided (under contract) by our group. Finally, the 3D stacking technology development proposed here for imaging detector arrays has broad application to Wide Field soft X-ray imaging, to CMB polarization mode (B mode) imaging detectors with very high detector-pixel count, and to Homeland Security.

  18. Investigation of photon attenuation coefficient of some building materials used in Turkey

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dogan, B.; Altinsoy, N.

    In this study, some building materials regularly used in Turkey, such as concrete, gas concrete, pumice and brick, have been investigated in terms of the mass attenuation coefficient at different gamma-ray energies. Measurements were carried out with a gamma spectrometry system containing a NaI(Tl) detector. Narrow-beam gamma-ray transmission geometry was used for the attenuation measurements. The results are in good agreement with the theoretical calculations of the XCOM code.
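    The narrow-beam transmission measurement reduces to the Beer-Lambert law, from which the mass attenuation coefficient follows directly. A minimal sketch with illustrative values (not data from the study):

```python
import math

def mass_attenuation_coefficient(i0, i, density_g_cm3, thickness_cm):
    """Mass attenuation coefficient (cm^2/g) from narrow-beam transmission.

    Beer-Lambert law: I = I0 * exp(-(mu/rho) * rho * t), so
    mu/rho = ln(I0/I) / (rho * t).
    """
    return math.log(i0 / i) / (density_g_cm3 * thickness_cm)

# Illustrative numbers only (not from the study): a 5 cm concrete slab
# (rho ~ 2.3 g/cm^3) transmitting 40% of a narrow gamma beam.
mu_rho = mass_attenuation_coefficient(1.0, 0.40, 2.3, 5.0)
```

Values computed this way from measured count rates are what get compared against XCOM's tabulated coefficients.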

  19. Investigating inertial confinement fusion target fuel conditions through x-ray spectroscopy

    NASA Astrophysics Data System (ADS)

    Hansen, Stephanie B.

    2012-05-01

    Inertial confinement fusion (ICF) targets are designed to produce hot, dense fuel in a neutron-producing core that is surrounded by a shell of compressing material. The x-rays emitted from ICF plasmas can be analyzed to reveal details of the temperatures, densities, gradients, velocities, and mix characteristics of ICF targets. Such diagnostics are critical to understand the target performance and to improve the predictive power of simulation codes.

  20. Prompt gamma-ray imaging for small animals

    NASA Astrophysics Data System (ADS)

    Xu, Libai

    Small animal imaging is recognized as a powerful discovery tool for small animal modeling of human diseases, which provides important clues toward a complete understanding of disease mechanisms and helps researchers develop and test new treatments. The current small animal imaging techniques include positron emission tomography (PET), single photon emission tomography (SPECT), computed tomography (CT), magnetic resonance imaging (MRI), and ultrasound (US). A new imaging modality called prompt gamma-ray imaging (PGI) has been identified and investigated primarily by Monte Carlo simulation. Currently it is suggested for use on small animals. This new technique could greatly enhance and extend the present capabilities of PET and SPECT imaging from ingested radioisotopes to the imaging of selected non-radioactive elements, such as Gd, Cd, Hg, and B, and has great potential for use in Neutron Cancer Therapy to monitor neutron distribution and neutron-capture agent distribution. This approach consists of irradiating small animals in the thermal neutron beam of a nuclear reactor to produce prompt gamma rays from the elements in the sample by the radiative capture (n, gamma) reaction. These prompt gamma rays are emitted at energies that are characteristic of each element, and they are also produced in characteristic coincident chains. After measuring these prompt gamma rays with a surrounding spectrometry array, the distribution of each element of interest in the sample is reconstructed from the mapping of each detected signature gamma ray by either electronic or mechanical collimation. In addition, the transmitted neutrons from the beam can be simultaneously used for very sensitive anatomical imaging, which provides the registration for the elemental distributions obtained from PGI. 
The primary approach is to use Monte Carlo simulation methods, either with the specific-purpose code CEARCPG, developed at NC State University, or with the general-purpose codes GEANT4 or MCNP5, to predict results and investigate the feasibility of this new imaging idea. Benchmark experiments have been conducted to test the capability of the code to simulate prompt gamma rays, which are produced by following the nuclear structure of each irradiated isotope, and coincidence counting techniques, which are considered the most important improvement in neutron-related gamma-ray detection applications for reducing gamma background and improving system signal-to-noise ratios. With coincidence prompt gamma rays available, two major imaging techniques, electronic and mechanical collimation, are implemented in the simulation to illustrate the feasibility of imaging elemental distributions with this new technique. The expectation maximization algorithm is employed in electronic collimation to reconstruct images. Common SPECT imaging algorithms are used in mechanical collimation to reconstruct an image. Several critical topics concerning practical applications are also discussed, such as the radiation dose to the mouse and the detection efficiency of high-energy gamma rays. The funding of this work is provided by the Center for Engineering Application of Radioisotopes (CEAR) at North Carolina State University (NCSU) and Nuclear Engineering Education Research.
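    The expectation-maximization reconstruction mentioned above can be illustrated with the generic MLEM update (a textbook sketch, not the CEARCPG implementation; the toy system matrix and activities are invented for the example):

```python
import numpy as np

def mlem(system_matrix, counts, n_iters=500):
    """Maximum-likelihood EM reconstruction (generic MLEM update):
    lam <- lam * A^T(y / (A lam)) / (A^T 1)."""
    a = np.asarray(system_matrix, float)
    y = np.asarray(counts, float)
    lam = np.ones(a.shape[1])        # uniform positive initial estimate
    sens = a.sum(axis=0)             # per-voxel sensitivity A^T 1
    for _ in range(n_iters):
        proj = a @ lam               # forward-project current estimate
        lam *= (a.T @ (y / proj)) / sens
    return lam

# Toy 2-detector / 2-voxel problem with noiseless data, so the iteration
# should recover the known activities.
A = np.array([[0.8, 0.2],
              [0.3, 0.7]])
true_activity = np.array([4.0, 1.0])
recon = mlem(A, A @ true_activity)
```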

  1. GRMHD and GRPIC Simulations

    NASA Technical Reports Server (NTRS)

    Nishikawa, K.-I.; Mizuno, Y.; Watson, M.; Fuerst, S.; Wu, K.; Hardee, P.; Fishman, G. J.

    2007-01-01

    We have developed a new three-dimensional general relativistic magnetohydrodynamic (GRMHD) code by using a conservative, high-resolution shock-capturing scheme. The numerical fluxes are calculated using the HLL approximate Riemann solver scheme. The flux-interpolated constrained transport scheme is used to maintain a divergence-free magnetic field. We have performed various 1-dimensional test problems in both special and general relativity by using several reconstruction methods and found that the new 3D GRMHD code shows substantial improvements over our previous code. The simulation results show jet formation from a geometrically thin accretion disk near a nonrotating and a rotating black hole. We will discuss how the jet properties depend on the rotation of the black hole and the magnetic field configuration, including issues for future research. A General Relativistic Particle-in-Cell code (GRPIC) has been developed using the Kerr-Schild metric. The code includes kinetic effects and is consistent with the GRMHD code. Since the gravitational force acting on particles is extreme near black holes, there are some difficulties in numerically describing these processes. The preliminary code consists of an accretion disk and a free-falling corona. Results indicate that particles are ejected from the black hole. These results are consistent with other GRMHD simulations. The GRPIC simulation results will be presented, along with some remarks and future improvements. The emission is calculated from relativistic flows in black hole systems using a fully general relativistic radiative transfer formulation, with flow structures obtained from GRMHD simulations, considering thermal free-free and thermal synchrotron emission. Bright filament-like features protrude (visually) from the accretion disk surface; these are enhancements of synchrotron emission where the magnetic field roughly aligns with the line-of-sight in the co-moving frame. 
The features move back and forth as the accretion flow evolves, but their visibility and morphology are robust. We would like to extend this research using GRPIC simulations and examine a possible new mechanism for certain X-ray quasi-periodic oscillations (QPOs) observed in black-hole X-ray binaries.

  2. RAPTOR. I. Time-dependent radiative transfer in arbitrary spacetimes

    NASA Astrophysics Data System (ADS)

    Bronzwaer, T.; Davelaar, J.; Younsi, Z.; Mościbrodzka, M.; Falcke, H.; Kramer, M.; Rezzolla, L.

    2018-05-01

    Context. Observational efforts to image the immediate environment of a black hole at the scale of the event horizon benefit from the development of efficient imaging codes that are capable of producing synthetic data, which may be compared with observational data. Aims: We aim to present RAPTOR, a new public code that produces accurate images, animations, and spectra of relativistic plasmas in strong gravity by numerically integrating the equations of motion of light rays and performing time-dependent radiative transfer calculations along the rays. The code is compatible with any analytical or numerical spacetime. It is hardware-agnostic and may be compiled and run both on GPUs and CPUs. Methods: We describe the algorithms used in RAPTOR and test the code's performance. We have performed a detailed comparison of RAPTOR output with that of other radiative-transfer codes and demonstrate convergence of the results. We then applied RAPTOR to study accretion models of supermassive black holes, performing time-dependent radiative transfer through general relativistic magneto-hydrodynamical (GRMHD) simulations and investigating the expected observational differences between the so-called fast-light and slow-light paradigms. Results: Using RAPTOR to produce synthetic images and light curves of a GRMHD model of an accreting black hole, we find that the relative difference between fast-light and slow-light light curves is less than 5%. Using two distinct radiative-transfer codes to process the same data, we find integrated flux densities with a relative difference less than 0.01%. Conclusions: For two-dimensional GRMHD models, such as those examined in this paper, the fast-light approximation suffices as long as errors of a few percent are acceptable. The convergence of the results of two different codes demonstrates that they are, at a minimum, consistent. The public version of RAPTOR is available at the following URL: https://github.com/tbronzwaer/raptor
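    The core of a ray-based radiative transfer integrator like the one described here is the update of specific intensity along each ray. The flat-space, single-frequency sketch below is an illustration of that step only, not RAPTOR's algorithm: the coefficient functions and the semi-analytic per-step update are assumptions for the example.

```python
import math

def integrate_ray(j_nu, alpha_nu, s_max, n_steps):
    """Integrate dI/ds = j - alpha*I along a ray (flat space, one frequency).

    j_nu(s) and alpha_nu(s) are emissivity and absorption coefficients.
    The per-step update relaxes I toward the local source function j/alpha,
    which stays stable even when the optical depth per step is large.
    """
    ds = s_max / n_steps
    intensity = 0.0
    for k in range(n_steps):
        s = (k + 0.5) * ds
        j, a = j_nu(s), alpha_nu(s)
        if a > 0.0:
            source = j / a
            tau = a * ds                     # optical depth of this step
            intensity = source + (intensity - source) * math.exp(-tau)
        else:
            intensity += j * ds              # pure emission, no absorption
    return intensity

# Constant-coefficient ray: intensity should relax toward the source
# function j/alpha = 2.0 once the ray becomes optically thick.
I = integrate_ray(lambda s: 1.0, lambda s: 0.5, s_max=50.0, n_steps=500)
```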

  3. Evaluation of Evidence for Altered Behavior and Auditory Deficits in Fishes Due to Human-Generated Noise Sources

    DTIC Science & Technology

    2006-04-01

    prepared by the Research and Animal Care Branch, Code 2351, of the Biosciences Division, Code 235, SSC San Diego. This is a work of the United...and Animal Care Branch Under authority of M. Rothe, Head Biosciences Division i EXECUTIVE SUMMARY In this study, we have evaluated peer... sharks , skates, and rays) and teleost fishes (modern bony fishes) and provide recommendations for research to address remaining issues. Clear responses

  4. 3D Radiative Transfer Code for Polarized Scattered Light with Aligned Grains

    NASA Astrophysics Data System (ADS)

    Pelkonen, V. M.; Penttilä, A.; Juvela, M.; Muinonen, K.

    2017-12-01

    Polarized scattered light has been observed in cometary comae and in circumstellar disks. It carries information about the grains from which the light scattered. However, modelling polarized scattered light is a complicated problem. We are working on a 3D Monte Carlo radiative transfer code which incorporates a hierarchical grid structure (octree) and the full Stokes vector for both the incoming radiation and the radiation scattered by dust grains. In the octree grid format, an upper-level cell can be divided into 8 subcells by halving the cell along each of the three axes. Levels of further refinement of the grid may be added until the desired resolution is reached. The radiation field is calculated with Monte Carlo methods. The path of the model ray is traced in the cloud: absorbed intensity is counted in each cell, and from time to time the model ray is scattered towards a new direction as determined by the dust model. Due to the non-spherical grains and the polarization, the scattering problem will be the main issue for the code and the most time-consuming part. The scattering parameters will be taken from models for individual grains. We can introduce populations of different grain shapes into the dust model and randomly select, based on their amounts, from which shape the model ray scatters. Similarly, we can include aligned and non-aligned subpopulations of these grains, based on the grain alignment calculations, to see which grains should be oriented with the magnetic field, or, in the absence of a magnetic field close to the comet nucleus, with another axis of alignment (e.g., the radiation direction). The 3D nature of the grid allows us to assign these values, as well as density, for each computational cell, to model phenomena such as cometary jets. The code will record polarized scattered light towards one or more observer directions within a single simulation run. 
These results can then be compared with observations of comets at different phase angles, or, in the case of other star systems, of circumstellar disks, to help us study these objects. We will present tests of the code under development with simple models.

  5. An efficient HZETRN (a galactic cosmic ray transport code)

    NASA Technical Reports Server (NTRS)

    Shinn, Judy L.; Wilson, John W.

    1992-01-01

    An accurate and efficient engineering code for analyzing the shielding requirements against the high-energy galactic heavy ions is needed. The HZETRN is a deterministic code developed at Langley Research Center that is constantly under improvement both in physics and numerical computation and is targeted for such use. One problem area connected with the space-marching technique used in this code is the propagation of the local truncation error. By improving the numerical algorithms for interpolation, integration, and grid distribution formula, the efficiency of the code is increased by a factor of eight as the number of energy grid points is reduced. The numerical accuracy of better than 2 percent for a shield thickness of 150 g/cm(exp 2) is found when a 45 point energy grid is used. The propagating step size, which is related to the perturbation theory, is also reevaluated.

  6. Monte Carlo investigation of backscatter point spread function for x-ray imaging examinations

    NASA Astrophysics Data System (ADS)

    Xiong, Zhenyu; Vijayan, Sarath; Rudin, Stephen; Bednarek, Daniel R.

    2017-03-01

    X-ray imaging examinations, especially complex interventions, may result in relatively high doses to the patient's skin, inducing skin injuries. A method was developed to determine the skin-dose distribution for non-uniform x-ray beams by convolving the backscatter point-spread-function (PSF) with the primary-dose distribution to generate the backscatter distribution that, when added to the primary dose, gives the total-dose distribution. This technique was incorporated in the dose-tracking system (DTS), which provides a real-time color-coded 3D-mapping of skin dose during fluoroscopic procedures. The aim of this work is to investigate the variation of the backscatter PSF with different parameters. A backscatter PSF of a 1-mm x-ray beam was generated by the EGSnrc Monte-Carlo code for different x-ray beam energies, different soft-tissue thicknesses above bone, different bone thicknesses and different entrance-beam angles, as well as for different locations on the SK-150 anthropomorphic head phantom. The results show a reduction of the peak scatter-to-primary dose ratio of 48% when the x-ray beam energy is increased from 40 keV to 120 keV. The backscatter dose was reduced when bone was beneath the soft tissue layer, and this reduction increased with thinner soft tissue and thicker bone layers. The backscatter factor increased about 21% as the angle of incidence of the beam with the entrance surface decreased from 90° (perpendicular) to 30°. The backscatter PSF differed for different locations on the SK-150 phantom by up to 15%. The results of this study can be used to improve the accuracy of dose calculation when using PSF convolution in the DTS.
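    The convolution step described above (backscatter = PSF convolved with primary dose; total = primary + backscatter) can be sketched with a toy PSF. The 30% backscatter fraction, the flat PSF, and the array sizes below are invented for illustration, not the EGSnrc-derived PSF from the study:

```python
import numpy as np

def total_dose(primary, psf):
    """Total skin dose = primary + (backscatter PSF convolved with primary).

    A 'same'-size 2D convolution done by zero-padding the primary field;
    psf is a normalized backscatter point-spread function.
    """
    ph, pw = psf.shape
    pad_y, pad_x = ph // 2, pw // 2
    padded = np.pad(primary, ((pad_y, pad_y), (pad_x, pad_x)))
    backscatter = np.zeros_like(primary, dtype=float)
    for dy in range(ph):
        for dx in range(pw):
            backscatter += psf[dy, dx] * padded[dy:dy + primary.shape[0],
                                                dx:dx + primary.shape[1]]
    return primary + backscatter

# Uniform primary field and a PSF integrating to 0.3 (a hypothetical 30%
# backscatter fraction): interior dose becomes 1.3x the primary dose.
primary = np.ones((8, 8))
psf = np.full((3, 3), 0.3 / 9.0)
dose = total_dose(primary, psf)
```

Near the field edges the zero padding reduces the backscatter contribution, mimicking the smaller scatter volume at a beam edge.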

  7. Simulations of ultra-high energy cosmic rays in the local Universe and the origin of cosmic magnetic fields

    NASA Astrophysics Data System (ADS)

    Hackstein, S.; Vazza, F.; Brüggen, M.; Sorce, J. G.; Gottlöber, S.

    2018-04-01

    We simulate the propagation of cosmic rays at ultra-high energies, ≳1018 eV, in models of extragalactic magnetic fields in constrained simulations of the local Universe. We use constrained initial conditions with the cosmological magnetohydrodynamics code ENZO. The resulting models of the distribution of magnetic fields in the local Universe are used in the CRPROPA code to simulate the propagation of ultra-high energy cosmic rays. We investigate the impact of six different magneto-genesis scenarios, both primordial and astrophysical, on the propagation of cosmic rays over cosmological distances. Moreover, we study the influence of different source distributions around the Milky Way. Our study shows that different scenarios of magneto-genesis do not have a large impact on the anisotropy measurements of ultra-high energy cosmic rays. However, at high energies above the Greisen-Zatsepin-Kuzmin (GZK)-limit, there is anisotropy caused by the distribution of nearby sources, independent of the magnetic field model. This provides a chance to identify cosmic ray sources with future full-sky measurements and high number statistics at the highest energies. Finally, we compare our results to the dipole signal measured by the Pierre Auger Observatory. All our source models and magnetic field models could reproduce the observed dipole amplitude with a pure iron injection composition. Our results indicate that the dipole is observed due to clustering of secondary nuclei in the direction of nearby sources of heavy nuclei. A light injection composition is disfavoured, since the increase in dipole angular power from 4 to 8 EeV is too slow compared to observations by the Pierre Auger Observatory.

  8. Numerical optimization of a picosecond pulse driven Ni-like Nb x-ray laser at 20.3 nm

    NASA Astrophysics Data System (ADS)

    Lu, X.; Zhong, J. Y.; Li, Y. J.; Zhang, J.

    2003-07-01

    Detailed simulations of a Ni-like Nb x-ray laser pumped by a nanosecond prepulse followed by a picosecond main pulse are presented. The atomic physics data are obtained using the Cowan code [R. D. Cowan, The Theory of Atomic Structure and Spectra (University of California Press, Berkeley, CA, 1981)]. The optimization calculations are performed in terms of the intensity of the prepulse and the time delay between the prepulse and the main pulse. A high gain of over 150 cm^-1 is obtained for the optimized drive pulse configuration. The ray-tracing calculations suggest that the total pump energy for a saturated x-ray laser can be reduced to less than 1 J.

  9. Characterization of germanium detectors for the measurement of the angular distribution of prompt γ-rays at the ANNRI in the MLF of the J-PARC

    NASA Astrophysics Data System (ADS)

    Takada, S.; Okudaira, T.; Goto, F.; Hirota, K.; Kimura, A.; Kitaguchi, M.; Koga, J.; Nakao, T.; Sakai, K.; Shimizu, H. M.; Yamamoto, T.; Yoshioka, T.

    2018-02-01

    In this study, the germanium detector assembly, installed at the Accurate Neutron-Nuclear Reaction measurement Instruments (ANNRI) in the Material and Life Science Facility (MLF) operated by the Japan Proton Accelerator Research Complex (J-PARC), has been characterized for extension to the measurement of the angular distribution of individual γ-ray transitions from neutron-induced compound states. We have developed a Monte Carlo simulation code using the GEANT4 toolkit, which can reproduce the pulse-height spectra of γ-rays from radioactive sources and (n,γ) reactions. The simulation is applicable to the measurement of γ-rays in the energy region of 0.5-11.0 MeV.

  10. DQE simulation of a-Se x-ray detectors using ARTEMIS

    NASA Astrophysics Data System (ADS)

    Fang, Yuan; Badano, Aldo

    2016-03-01

    Detective Quantum Efficiency (DQE) is one of the most important image quality metrics for evaluating the spatial resolution performance of flat-panel x-ray detectors. In this work, we simulate the DQE of amorphous selenium (a-Se) x-ray detectors with a detailed Monte Carlo transport code (ARTEMIS) for modeling semiconductor-based direct x-ray detectors. The transport of electron-hole pairs is achieved with a spatiotemporal model that accounts for recombination and trapping of carriers and Coulombic effects of space charge and the external applied electric field. A range of x-ray energies has been simulated from 10 to 100 keV. The DQE results can be used to study the spatial resolution characteristics of detectors at different energies.

  11. Power requirements for cosmic ray propagation models involving diffusive reacceleration; estimates and implications for the damping of interstellar turbulence

    NASA Astrophysics Data System (ADS)

    Drury, Luke O.'C.; Strong, Andrew W.

    2017-01-01

    We make quantitative estimates of the power supplied to the Galactic cosmic ray population by second-order Fermi acceleration in the interstellar medium, or as it is usually termed in cosmic ray propagation studies, diffusive reacceleration. Using recent results on the local interstellar spectrum, following Voyager 1's crossing of the heliopause, we show that for parameter values, in particular the Alfvén speed, typically used in propagation codes such as GALPROP to fit the B/C ratio, the power contributed by diffusive reacceleration is significant and can be of order 50% of the total Galactic cosmic ray power. The implications for the damping of interstellar turbulence are briefly considered.

  12. Improving the Multi-Wavelength Capability of Chandra Large Programs

    NASA Astrophysics Data System (ADS)

    Pacucci, Fabio

    2017-09-01

    In order to fully exploit the joint Chandra/JWST/HST ventures to detect faint sources, we urgently need an advanced matching algorithm between optical/NIR and X-ray catalogs/images. This will be of paramount importance in bridging the gap between upcoming optical/NIR facilities (JWST) and later X-ray ones (Athena, Lynx). We propose to develop an advanced and automated tool to improve the identification of Chandra X-ray counterparts detected in deep optical/NIR fields, based on T-PHOT, a software package widely used in the community. The developed code will incorporate more than 20 years of advancements in X-ray data analysis and will be released to the public. Finally, we will release an updated catalog of X-ray sources in the CANDELS regions: a leap forward in our endeavor of charting the Universe.

  13. SU-D-206-02: Evaluation of Partial Storage of the System Matrix for Cone Beam Computed Tomography Using a GPU Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matenine, D; Cote, G; Mascolo-Fortin, J

    2016-06-15

    Purpose: Iterative reconstruction algorithms in computed tomography (CT) require a fast method for computing the intersections between the photons’ trajectories and the object, also called ray-tracing or system matrix computation. This work evaluates different ways to store the system matrix, aiming to reconstruct dense image grids in reasonable time. Methods: We propose an optimized implementation of Siddon’s algorithm using graphics processing units (GPUs) with a novel data storage scheme. The algorithm computes a part of the system matrix on demand, typically for one projection angle. The proposed method was enhanced with accelerating options: storage of larger subsets of the system matrix, systematic reuse of data via geometric symmetries, an arithmetic-rich parallel code and code configuration via machine learning. It was tested on geometries mimicking a cone beam CT acquisition of a human head. To realistically assess the execution time, the ray-tracing routines were integrated into a regularized Poisson-based reconstruction algorithm. The proposed scheme was also compared to a different approach, where the system matrix is fully pre-computed and loaded at reconstruction time. Results: Fast ray-tracing of realistic acquisition geometries, which often lack spatial symmetry properties, was enabled via the proposed method. Ray-tracing interleaved with projection and backprojection operations required significant additional time. In most cases, ray-tracing was shown to use about 66% of the total reconstruction time. In absolute terms, tracing times varied from 3.6 s to 7.5 min, depending on the problem size. The presence of geometrical symmetries allowed for non-negligible ray-tracing and reconstruction time reduction. Arithmetic-rich parallel code and machine learning permitted a modest reconstruction time reduction, on the order of 1%.
Conclusion: Partial system matrix storage permitted the reconstruction of higher 3D image grid sizes and larger projection datasets at the cost of additional time, when compared to the fully pre-computed approach. This work was supported in part by the Fonds de recherche du Quebec - Nature et technologies (FRQ-NT). The authors acknowledge partial support by the CREATE Medical Physics Research Training Network grant of the Natural Sciences and Engineering Research Council of Canada (Grant No. 432290).
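    The ray-tracing kernel at the heart of this work, Siddon's parametric method, can be sketched in a minimal 2D form. This is a plain CPU illustration of the idea (parametrize the ray, collect all pixel-boundary crossings, and read off per-pixel intersection lengths), not the optimized GPU implementation:

```python
import numpy as np

def siddon_2d(p0, p1, nx, ny):
    """Intersection lengths of segment p0->p1 with a unit-spaced nx x ny grid.

    Returns {(ix, iy): length}. Minimal 2D version of Siddon's parametric
    method: merge the sorted parameters of all x- and y-plane crossings;
    each consecutive parameter pair then lies inside a single pixel.
    """
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    d = p1 - p0
    alphas = [0.0, 1.0]
    for axis, n in ((0, nx), (1, ny)):
        if d[axis] != 0.0:
            for plane in range(n + 1):
                a = (plane - p0[axis]) / d[axis]
                if 0.0 < a < 1.0:
                    alphas.append(a)
    alphas = sorted(set(alphas))
    length = np.hypot(*d)
    out = {}
    for a0, a1 in zip(alphas[:-1], alphas[1:]):
        mid = p0 + 0.5 * (a0 + a1) * d      # midpoint identifies the pixel
        ix, iy = int(mid[0]), int(mid[1])
        if 0 <= ix < nx and 0 <= iy < ny:
            out[(ix, iy)] = out.get((ix, iy), 0.0) + (a1 - a0) * length
    return out

# A horizontal ray across a 4x4 grid crosses four pixels, 1.0 each.
lengths = siddon_2d((0.0, 1.5), (4.0, 1.5), 4, 4)
```

Computing these lengths on demand for one projection angle, rather than storing them all, is the storage trade-off the abstract evaluates.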

  14. Air Traffic Controller Working Memory: Considerations in Air Traffic Control Tactical Operations

    DTIC Science & Technology

    1993-09-01

    INFORMATION PROCESSING SYSTEM 3 2. AIR TRAFFIC CONTROLLER MEMORY 5 2.1 MEMORY CODES 6 2.1.1 Visual Codes 7 2.1.2 Phonetic Codes 7 2.1.3 Semantic Codes 8...raise an awareness of the memory requirements of ATC tactical operations by presenting information on working memory processes that are relevant to...working memory permeates every aspect of the controller’s ability to process air traffic information and control live traffic. The

  15. 30 CFR 905.773 - Requirements for permits and permit processing.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., 42 U.S.C. 7401 et seq California Air Pollution Control Laws, Cal. Health & Safety Code section 39000... (11) Noise Control Act, 42 U.S.C. 4903 California Noise Control Act of 1973, Cal. Health & Safety Code... Pollution Control Laws, Cal. Health & Safety Code section 39000 et seq.; the Hazardous Waste Control Law...

  16. A concatenated coding scheme for error control

    NASA Technical Reports Server (NTRS)

    Kasami, T.; Fujiwara, T.; Lin, S.

    1986-01-01

    In this paper, a concatenated coding scheme for error control in data communications is presented and analyzed. In this scheme, the inner code is used for both error correction and detection; however, the outer code is used only for error detection. A retransmission is requested if either the inner code decoder fails to make a successful decoding or the outer code decoder detects the presence of errors after the inner code decoding. Probability of undetected error (or decoding error) of the proposed scheme is derived. An efficient method for computing this probability is presented. Throughput efficiency of the proposed error control scheme incorporated with a selective-repeat ARQ retransmission strategy is also analyzed. Three specific examples are presented. One of the examples is proposed for error control in the NASA Telecommand System.
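    The retransmission logic of the scheme (inner code corrects and detects; outer code detects only; retransmit on either an inner decoding failure or an outer detection) can be sketched with stand-in codes. The (4,1) repetition inner code and single-parity outer code below are illustrative substitutes, not the codes analyzed in the paper:

```python
def inner_encode(bit):
    return [bit] * 4  # (4,1) repetition: corrects 1 error, detects 2 (tie)

def inner_decode(word):
    ones = sum(word)
    if ones >= 3:
        return 1
    if ones <= 1:
        return 0
    return None  # 2-2 tie: detected but uncorrectable -> decode failure

def outer_encode(bits):
    return bits + [sum(bits) % 2]  # outer parity bit: detection only

def outer_check(bits):
    return sum(bits[:-1]) % 2 == bits[-1]

def transmit(frame, flip_positions):
    """Send one outer-coded frame through the inner code over a channel
    that flips the given channel-bit positions. Returns the decoded
    payload, or None to signal a retransmission request (ARQ)."""
    channel = [b for bit in frame for b in inner_encode(bit)]
    for p in flip_positions:
        channel[p] ^= 1
    decoded = []
    for i in range(0, len(channel), 4):
        bit = inner_decode(channel[i:i + 4])
        if bit is None:
            return None  # inner decoder failure -> request retransmission
        decoded.append(bit)
    if not outer_check(decoded):
        return None      # outer code detects residual errors -> retransmit
    return decoded[:-1]

payload = [1, 0, 1, 1]
frame = outer_encode(payload)
ok = transmit(frame, flip_positions=[0])       # single flip: corrected
fail = transmit(frame, flip_positions=[0, 1])  # 2-2 tie: retransmission
```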

  17. HOT X-RAY CORONAE AROUND MASSIVE SPIRAL GALAXIES: A UNIQUE PROBE OF STRUCTURE FORMATION MODELS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bogdan, Akos; Forman, William R.; Vogelsberger, Mark

    2013-08-01

    Luminous X-ray gas coronae in the dark matter halos of massive spiral galaxies are a fundamental prediction of structure formation models, yet only a few such coronae have been detected so far. In this paper, we study the hot X-ray coronae beyond the optical disks of two 'normal' massive spirals, NGC 1961 and NGC 6753. Based on XMM-Newton X-ray observations, hot gaseous emission is detected to ~60 kpc, well beyond their optical radii. The hot gas has a best-fit temperature of kT ~ 0.6 keV and an abundance of ~0.1 Solar, and exhibits a fairly uniform distribution, suggesting that the quasi-static gas resides in hydrostatic equilibrium in the potential well of the galaxies. The bolometric luminosity of the gas in the (0.05-0.15)r_200 region (r_200 is the virial radius) is ~6 x 10^40 erg s^-1 for both galaxies. The baryon mass fractions of NGC 1961 and NGC 6753 are f_b,NGC1961 ~ 0.11 and f_b,NGC6753 ~ 0.09, which values fall short of the cosmic baryon fraction. The hot coronae around NGC 1961 and NGC 6753 offer an excellent basis to probe structure formation simulations. To this end, the observations are confronted with the moving mesh code AREPO and the smoothed particle hydrodynamics code GADGET. Although neither model gives a perfect description, the observed luminosities, gas masses, and abundances favor the AREPO code. Moreover, the shape and the normalization of the observed density profiles are better reproduced by AREPO within ~0.5r_200. However, neither model incorporates efficient feedback from supermassive black holes or supernovae, which could alter the simulated properties of the X-ray coronae. With the further advance of numerical models, the present observations will be essential in constraining the feedback effects in structure formation simulations.

  18. Simulations of GCR interactions within planetary bodies using GEANT4

    NASA Astrophysics Data System (ADS)

    Mesick, K.; Feldman, W. C.; Stonehill, L. C.; Coupland, D. D. S.

    2017-12-01

On planetary bodies with little to no atmosphere, Galactic Cosmic Rays (GCRs) can hit the body and produce neutrons, primarily through nuclear spallation within the top few meters of the surface. These neutrons undergo further nuclear interactions with elements near the planetary surface, and some escape the surface and can be detected by landed or orbiting neutron radiation detector instruments. The neutron leakage signal at fast neutron energies provides a measure of the average atomic mass of the near-surface material, and in the epithermal and thermal energy ranges it is highly sensitive to the presence of hydrogen. Gamma-rays can also escape the surface, produced at characteristic energies depending on surface composition, and can be detected by gamma-ray instruments. The intra-nuclear cascade (INC) that occurs when high-energy GCRs interact with elements within a planetary surface to produce the leakage neutron and gamma-ray signals is highly complex, and therefore Monte Carlo based radiation transport simulations are commonly used for predicting and interpreting measurements from planetary neutron and gamma-ray spectroscopy instruments. In the past, the simulation code widely used for this type of analysis has been MCNPX [1], which was benchmarked against data from the Lunar Neutron Probe Experiment (LPNE) on Apollo 17 [2]. In this work, we consider the validity of the radiation transport code GEANT4 [3], another widely used but open-source code, by benchmarking simulated predictions of the LPNE experiment against the Apollo 17 data. We consider the impact of different physics model options on the results, and show which models best describe the INC based on agreement with the Apollo 17 data. The success of this validation gives us confidence in using GEANT4 to simulate GCR-induced neutron leakage signals on Mars in support of a re-analysis of Mars Odyssey Neutron Spectrometer data. References: [1] D.B. Pelowitz, Los Alamos National Laboratory, LA-CP-05-0369, 2005. [2] G.W. McKinney et al., Journal of Geophysical Research, 111, E06004, 2006. [3] S. Agostinelli et al., Nuclear Instruments and Methods A, 506, 2003.

  19. Application of microprocessors in an upper atmosphere instrument package

    NASA Technical Reports Server (NTRS)

    Lim, T. S.; Ehrman, C. H.; Allison, S.

    1981-01-01

    A servo-driven magnetometer table measuring offset from magnetic north has been developed by NASA to calculate the payload azimuth required to point at a celestial target. Used as an aid to the study of gamma-ray phenomena, the high-altitude balloon-borne instrument determines a geocentric reference system and calculates a set of pointing directions with respect to that system. Principal components include the magnetometer, stepping motor, microcomputer, and Gray code shaft encoder. The single-chip microcomputer is used to control the orientation of the system, and consists of a central processing unit, program memory, data memory and input/output ports. Principal advantages include a low power requirement, consuming 6 watts, as compared to 30 watts consumed by the previous system.
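The azimuth arithmetic the microcomputer performs can be illustrated with simple heading sums. A sketch under assumed conventions (the function and parameter names are hypothetical; the flight software's full geocentric transformation is more involved):

```python
def motor_rotation_deg(magnetometer_offset_deg, declination_deg, target_true_az_deg):
    """Rotation to command so the payload points at a target azimuth.

    magnetometer_offset_deg: measured payload offset from magnetic north;
    declination_deg: local magnetic declination (magnetic north -> true north).
    All angles in degrees, azimuth measured clockwise from true north.
    """
    payload_true_heading = (magnetometer_offset_deg + declination_deg) % 360.0
    return (target_true_az_deg - payload_true_heading) % 360.0
```

For example, a payload reading 350° from magnetic north with +10° declination is already pointing at true north, so a true-north target needs zero rotation.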

  20. ROS Hexapod

    NASA Technical Reports Server (NTRS)

    Davis, Kirsch; Bankieris, Derek

    2016-01-01

    As an intern at NASA Johnson Space Center (JSC), my job was to become familiar with the Robot Operating System (ROS) and apply it to existing software. The project converted existing software assets into ROS nodes, enabling a robotic Hexapod to be controlled by an existing PlayStation 3 (PS3) controller. When the internship started, the Hexapod's C++ control algorithms and libraries had no ROS capabilities; converting the C++ code made it compatible with ROS, and the robot is now driven by the PS3 controller. My role also included designing ROS messages and script programs that let these assets participate in the ROS ecosystem by subscribing to and publishing messages. The source code is written in C++ and was tested by compiling and running it from a terminal in a Linux environment. Several compilation problems occurred, so the code was modified until it compiled cleanly; it was then uploaded to the Hexapod and driven with the PS3 controller. The project outcome is a Hexapod that is fully functional under ROS and operated with the PlayStation 3 controller. In addition, an open-source Arduino board will be integrated into the ecosystem, with breadboard circuitry adding behavior through push buttons, potentiometers, and other simple electrical elements. Other Arduino projects include a GPS module and a digital clock that uses a GPS signal from up to 22 satellites, received through an internal patch antenna, to show accurate real time.
    This internship experience has also pushed me to write, subscribe, and publish my own source code more efficiently and effectively in different programming languages; greater familiarity with software programming will strengthen my skills in the electrical engineering field. My time at JSC with the Simulation and Graphics Branch (ER7) has made my coding more proficient, increased my knowledge of software programming, and enhanced my skills in ROS. I will take this knowledge back to my university for a school project on the PR2 robot, which is controlled by ROS software: using the skills learned here, I will change the C++ code to subscribe and publish ROS messages to the PR2 and drive it with an existing PS3 controller. Overall, the skills obtained here will not be lost, but increased.
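The subscribe/publish pattern described above can be illustrated with a toy message bus. This is deliberately not the ROS API (real nodes would use ROS publishers and subscribers over named topics, and the Hexapod code is C++); it is only a minimal sketch of the idea, with hypothetical topic and message names:

```python
class TinyBus:
    """Toy topic bus illustrating the publish/subscribe pattern ROS nodes use.

    Not the ROS API: real nodes register publishers/subscribers with a ROS
    master and exchange typed messages; here callbacks are called directly.
    """

    def __init__(self):
        self._subs = {}  # topic name -> list of callbacks

    def subscribe(self, topic, callback):
        self._subs.setdefault(topic, []).append(callback)

    def publish(self, topic, msg):
        for cb in self._subs.get(topic, []):
            cb(msg)


bus = TinyBus()
received = []
# A "driver" node subscribes to velocity commands; a "joystick" node publishes.
bus.subscribe("/hexapod/cmd_vel", received.append)
bus.publish("/hexapod/cmd_vel", {"linear": 0.2, "angular": 0.0})
```

The decoupling shown here, where the publisher never names the subscriber, is what lets a PS3-controller node drive the Hexapod (or a PR2) without either side's code knowing about the other.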

  1. Implementation of Energy Code Controls Requirements in New Commercial Buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosenberg, Michael I.; Hart, Philip R.; Hatten, Mike

    Most state energy codes in the United States are based on one of two national model codes: ANSI/ASHRAE/IES 90.1 (Standard 90.1) or the International Code Council (ICC) International Energy Conservation Code (IECC). Since 2004, covering the last four cycles of Standard 90.1 updates, about 30% of all new requirements have been related to building controls. These requirements can be difficult to implement, and verification is beyond the expertise of most building code officials, yet the assumption in studies that measure the savings from energy codes is that they are implemented and working correctly. The objective of the current research is to evaluate the degree to which high-impact controls requirements included in commercial energy codes are properly designed, commissioned and implemented in new buildings. This study also evaluates the degree to which these control requirements are realizing their savings potential. This was done using a three-step process. The first step involved interviewing commissioning agents to get a better understanding of their activities as they relate to energy code required controls measures. The second involved field audits of a sample of commercial buildings to determine whether the code required control measures are being designed, commissioned and correctly implemented and functioning in new buildings. The third step includes compilation and analysis of the information gathered during the first two steps. Information gathered during these activities could be valuable to code developers, energy planners, designers, building owners, and building officials.

  2. Galactic cosmic ray radiation levels in spacecraft on interplanetary missions

    NASA Technical Reports Server (NTRS)

    Shinn, J. L.; Nealy, J. E.; Townsend, L. W.; Wilson, J. W.; Wood, J.S.

    1994-01-01

    Using the Langley Research Center Galactic Cosmic Ray (GCR) transport computer code (HZETRN) and the Computerized Anatomical Man (CAM) model, crew radiation levels inside manned spacecraft on interplanetary missions are estimated. These radiation-level estimates include particle fluxes, LET (Linear Energy Transfer) spectra, absorbed dose, and dose equivalent within various organs of interest in GCR protection studies. Changes in these radiation levels resulting from the use of various types of shield materials are presented.
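The step from an absorbed-dose LET spectrum to dose equivalent is a weighting by a LET-dependent quality factor. A sketch using the ICRP-60 Q(L) relationship (the actual HZETRN/CAM pipeline folds this over full organ flux spectra; the bin values below are hypothetical):

```python
import math

def icrp60_quality_factor(let_kev_um: float) -> float:
    """ICRP-60 quality factor Q(L) vs. unrestricted LET in water (keV/um):
    Q = 1 for L < 10; Q = 0.32*L - 2.2 for 10 <= L <= 100; Q = 300/sqrt(L) above."""
    if let_kev_um < 10.0:
        return 1.0
    if let_kev_um <= 100.0:
        return 0.32 * let_kev_um - 2.2
    return 300.0 / math.sqrt(let_kev_um)

def dose_equivalent(bins):
    """Dose equivalent H = sum over LET bins of Q(L_i) * D_i,
    where bins = [(LET in keV/um, absorbed dose in Gy), ...]."""
    return sum(icrp60_quality_factor(L) * D for L, D in bins)

# Hypothetical two-bin organ spectrum: mostly low-LET, a little high-LET dose.
H = dose_equivalent([(5.0, 1.0), (100.0, 0.1)])
```

The high-LET bin contributes far more per unit absorbed dose, which is why GCR heavy ions dominate dose equivalent even when their absorbed dose is small.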

  3. Advanced gamma ray balloon experiment ground checkout and data analysis

    NASA Technical Reports Server (NTRS)

    Blackstone, M.

    1976-01-01

    A software programming package to be used in the ground checkout and handling of data from the advanced gamma ray balloon experiment is described. The Operator's Manual permits someone unfamiliar with the inner workings of the software system (called LEO) to operate on the experimental data as it comes from the Pulse Code Modulation interface, converting it to a form for later analysis, and monitoring the progress of an experiment. A Programmer's Manual is included.

  4. Hard x ray imaging graphics development and literature search

    NASA Technical Reports Server (NTRS)

    Emslie, A. Gordon

    1991-01-01

    This report presents work performed between June 1990 and June 1991 and has the following objectives: (1) a comprehensive literature search of imaging technology and coded aperture imaging as well as relevant topics relating to solar flares; (2) an analysis of random number generators; and (3) programming simulation models of hard x ray telescopes. All programs are compatible with the NASA/MSFC Space Science Laboratory VAX cluster and are written in VAX FORTRAN and VAX IDL (Interactive Data Language).

  5. Population Synthesis of Radio and Gamma-ray Pulsars using the Maximum Likelihood Approach

    NASA Astrophysics Data System (ADS)

    Billman, Caleb; Gonthier, P. L.; Harding, A. K.

    2012-01-01

    We present the results of a pulsar population synthesis of normal pulsars from the Galactic disk using a maximum likelihood method. We seek to maximize the likelihood of a set of parameters in a Monte Carlo population statistics code to better understand their uncertainties and the confidence region of the model's parameter space. The maximum likelihood method allows for the use of more applicable Poisson statistics in the comparison of distributions of small numbers of detected gamma-ray and radio pulsars. Our code simulates pulsars at birth using Monte Carlo techniques and evolves them to the present assuming initial spatial, kick velocity, magnetic field, and period distributions. Pulsars are spun down to the present and given radio and gamma-ray emission characteristics. We select measured distributions of radio pulsars from the Parkes Multibeam survey and Fermi gamma-ray pulsars to perform a likelihood analysis of the assumed model parameters such as initial period and magnetic field, and radio luminosity. We present the results of a grid search of the parameter space as well as a search for the maximum likelihood using a Markov Chain Monte Carlo method. We express our gratitude for the generous support of the Michigan Space Grant Consortium, of the National Science Foundation (REU and RUI), the NASA Astrophysics Theory and Fundamental Program and the NASA Fermi Guest Investigator Program.
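The Poisson likelihood comparison at the heart of this method can be sketched directly: each bin of detected pulsars contributes a Poisson log-likelihood term, and the parameter set maximizing the sum is preferred. The bin contents below are hypothetical illustrations, not the Parkes or Fermi data:

```python
import math

def poisson_loglike(observed, model):
    """Sum over bins of ln P(n_i | m_i) for Poisson counts, including the
    -ln(n_i!) term; appropriate when detected numbers are too small for
    Gaussian chi-square statistics."""
    return sum(n * math.log(m) - m - math.lgamma(n + 1)
               for n, m in zip(observed, model))

observed = [3, 1, 0]  # hypothetical detected pulsars per bin
candidates = {            # model expectations from two parameter sets
    "A": [2.5, 1.2, 0.4],
    "B": [6.0, 3.0, 1.5],
}
best = max(candidates, key=lambda k: poisson_loglike(observed, candidates[k]))
```

A grid search or Markov Chain Monte Carlo walk over the model parameters, as in the abstract, repeats exactly this evaluation at each candidate point.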

  6. A concatenated coding scheme for error control

    NASA Technical Reports Server (NTRS)

    Lin, S.

    1985-01-01

    A concatenated coding scheme for error control in data communications is analyzed. The inner code is used for both error correction and detection; however, the outer code is used only for error detection. A retransmission is requested if the outer code detects the presence of errors after the inner code decoding. The probability of undetected error of the above error control scheme is derived and upper bounded. Two specific examples are analyzed. In the first example, the inner code is a distance-4 shortened Hamming code with generator polynomial (X+1)(X(6)+X+1) = X(7)+X(6)+X(2)+1 and the outer code is a distance-4 shortened Hamming code with generator polynomial (X+1)(X(15)+X(14)+X(13)+X(12)+X(4)+X(3)+X(2)+X+1) = X(16)+X(12)+X(5)+1, which is the X.25 standard for packet-switched data networks. This example is proposed for error control on NASA telecommand links. In the second example, the inner code is the same as that in the first example but the outer code is a shortened Reed-Solomon code with symbols from GF(2(8)) and generator polynomial (X+1)(X+alpha), where alpha is a primitive element in GF(2(8)).
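The outer-code error detection in the first example is the familiar 16-bit CRC with generator X(16)+X(12)+X(5)+1 (0x1021 in the usual hexadecimal shorthand). A minimal bit-serial sketch of the polynomial division; the register initialization and bit ordering are simplified relative to full X.25 framing, and the message is an arbitrary placeholder:

```python
def crc16(data: bytes, poly: int = 0x1021) -> int:
    """Remainder of the message polynomial divided by X^16 + X^12 + X^5 + 1,
    computed bit-serially with a 16-bit shift register initialized to zero."""
    crc = 0
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            if crc & 0x8000:
                crc = ((crc << 1) ^ poly) & 0xFFFF
            else:
                crc = (crc << 1) & 0xFFFF
    return crc

msg = b"NASA telecommand frame"      # placeholder payload
tag = crc16(msg)
corrupted = bytes([msg[0] ^ 0x01]) + msg[1:]  # single-bit channel error
# crc16(corrupted) differs from tag, so the receiver requests retransmission.
```

Because the generator has more than one term, any single-bit error changes the remainder, which is the detection-then-retransmit behavior the scheme relies on.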

  7. Radiation protection for human missions to the Moon and Mars

    NASA Technical Reports Server (NTRS)

    Simonsen, Lisa C.; Nealy, John E.

    1991-01-01

    Radiation protection assessments are performed for advanced manned Lunar and Mars missions. The Langley cosmic ray transport code and the nucleon transport code are used to quantify the transport and attenuation of galactic cosmic rays and solar proton flares through various shielding media. Galactic cosmic radiation at solar maximum and minimum, as well as various flare scenarios, are considered. Propagation data for water, aluminum, liquid hydrogen, lithium hydride, lead, and lunar and Martian regolith (soil) are included. Shield thickness and shield mass estimates required to maintain incurred doses below 30-day and annual limits (as set for Space Station Freedom and used as a guide for space exploration) are determined for simple-geometry transfer vehicles. On the surface of Mars, dose estimates are presented for crews with their only protection being the carbon dioxide atmosphere and for crews protected by shielding provided by Martian regolith for a candidate habitat.
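The shield-thickness estimates described above amount to finding the depth at which the transported dose drops below a limit. As a deliberately oversimplified sketch, treat attenuation as a single exponential; real GCR and flare transport through these materials, as computed by the Langley codes, is far from exponential, and the attenuation coefficient and doses below are hypothetical:

```python
import math

def shield_thickness_for_limit(dose_unshielded, dose_limit, mu_per_cm):
    """Thickness x such that D(x) = D0 * exp(-mu * x) equals the dose limit,
    i.e. x = ln(D0 / limit) / mu. Illustrative single-exponential model only."""
    return math.log(dose_unshielded / dose_limit) / mu_per_cm

# Hypothetical numbers: 100 cSv unshielded, 50 cSv annual limit,
# and an effective mu chosen so dose halves every centimeter.
x = shield_thickness_for_limit(100.0, 50.0, math.log(2.0))
```

Even this toy version shows why shield mass grows only logarithmically with the required dose reduction, which drives the trade studies in the abstract.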

  8. Introducing GAMER: A fast and accurate method for ray-tracing galaxies using procedural noise

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Groeneboom, N. E.; Dahle, H., E-mail: nicolaag@astro.uio.no

    2014-03-10

    We developed a novel approach for fast and accurate ray-tracing of galaxies using procedural noise fields. Our method allows for efficient and realistic rendering of synthetic galaxy morphologies, where individual components such as the bulge, disk, stars, and dust can be synthesized in different wavelengths. These components follow empirically motivated overall intensity profiles but contain an additional procedural noise component that gives rise to complex natural patterns that mimic interstellar dust and star-forming regions. These patterns produce more realistic-looking galaxy images than using analytical expressions alone. The method is fully parallelized and creates accurate high- and low-resolution images that can be used, for example, in codes simulating strong and weak gravitational lensing. In addition to having a user-friendly graphical user interface, the C++ software package GAMER is easy to implement into an existing code.

  9. RIXS of Ammonium Nitrate using OCEAN

    NASA Astrophysics Data System (ADS)

    Vinson, John; Jach, Terrence; Mueller, Matthias; Unterumsberger, Rainer; Beckhoff, Burkhard

    The ocean code allows for calculations of near-edge x-ray spectroscopies using a GW/Bethe-Salpeter equation (BSE) approach. Here we present an extension of the code for calculating resonant inelastic x-ray scattering (RIXS). Recent work has shown that peak-specific broadening of nitrogen Kα emission in nitrates is due to a valence-band lifetime that is an order of magnitude shorter than that of the nitrogen 1s hole, an inversion of the usual assumption that valence holes have longer lifetimes than core-level holes. Using the BSE, including GW corrections to the DFT energies, as implemented in ocean, we are able to compare calculated RIXS with measured spectra. By utilizing an approach free from fitting parameters we are able to identify the origins of various broadening effects observed in experiment.

  10. Comparisons of laboratory wavelength measurements with theoretical calculations for neon-like through lithium-like argon, sulfur, and silicon

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lepson, J K; Beiersdorfer, P; Behar, E

    Atomic structure codes have a difficult time accurately calculating the wavelengths of many-electron ions without the benefit of laboratory measurements. This is especially true for wavelengths of lines in the extreme ultraviolet and soft x-ray regions. We are using the low-energy capability of the Livermore electron beam ion traps to compile a comprehensive catalog of astrophysically relevant emission lines in support of satellite x-ray observations. Our database includes wavelength measurements, relative intensities, and line assignments, and is compared to a full set of calculations using the Hebrew University - Lawrence Livermore Atomic Code (HULLAC). Mean deviation of HULLAC calculations from our measured wavelength values is highest for L-shell transitions of neon-like ions and lowest for lithium-like ions, ranging from a mean deviation of over 0.5 Å for Si V to 12 mÅ in Ar XVI.

  11. VizieR Online Data Catalog: Blazars in the Swift-BAT hard X-ray sky (Maselli+, 2010)

    NASA Astrophysics Data System (ADS)

    Maselli, A.; Cusumano, G.; Massaro, E.; La Parola, V.; Segreto, A.; Sbarufatti, B.

    2010-06-01

    We report the list of hard X-ray blazars obtained by adopting σ=3 as the detection threshold; with this choice, three spurious sources are expected among a total of 121 blazars. Each blazar is identified by a three-letter code, where the first two letters are BZ for blazar and the third specifies the type, followed by the truncated equatorial coordinates (J2000). The codes are defined in the "Note (1)" below. We obtained 69 FSRQs, 24 BL Lac objects and 28 blazars of uncertain classification, representing 4.4%, 2.4% and 11.0% of the corresponding populations classified in the BZCAT, respectively. This sample has been compared with other lists and catalogues found in the literature (Tueller et al., 2010, Cat. J/ApJS/186/378, Ajello et al. 2009ApJ...699..603A, Cusumano et al., 2010, Cat. J/A+A/510/A48). (1 data file).

  12. Introducing GAMER: A Fast and Accurate Method for Ray-tracing Galaxies Using Procedural Noise

    NASA Astrophysics Data System (ADS)

    Groeneboom, N. E.; Dahle, H.

    2014-03-01

    We developed a novel approach for fast and accurate ray-tracing of galaxies using procedural noise fields. Our method allows for efficient and realistic rendering of synthetic galaxy morphologies, where individual components such as the bulge, disk, stars, and dust can be synthesized in different wavelengths. These components follow empirically motivated overall intensity profiles but contain an additional procedural noise component that gives rise to complex natural patterns that mimic interstellar dust and star-forming regions. These patterns produce more realistic-looking galaxy images than using analytical expressions alone. The method is fully parallelized and creates accurate high- and low-resolution images that can be used, for example, in codes simulating strong and weak gravitational lensing. In addition to having a user-friendly graphical user interface, the C++ software package GAMER is easy to implement into an existing code.

  13. Analysis of X-ray and EUV spectra of solar active regions

    NASA Technical Reports Server (NTRS)

    Strong, K. T.; Acton, L. W.

    1979-01-01

    Data acquired by two flights of an array of six Bragg crystal spectrometers on an Aerobee rocket to obtain high spatial and spectral resolution observations of various coronal features at soft X-ray wavelengths (9-23 Å) were analyzed. The various aspects of the analysis of the X-ray data are described. These observations were coordinated with observations from the experiments on the Apollo Telescope Mount and the various data sets were related to one another. The Appendices contain the published results, abstracts of papers, computer code descriptions and preprints of papers, all produced as a result of this research project.

  14. Rolf Mewe: a career devoted to X-ray spectroscopy

    NASA Astrophysics Data System (ADS)

    Kaastra, Jelle S.; Mewe, Rolf

    2005-06-01

    An overview of the life and work of Rolf Mewe (1935-2004) as an X-ray spectroscopist is given. He was one of the pioneers in the field of X-ray spectroscopy. His work illustrates nicely how this field developed from the early days up to the present high-resolution era. The plasma emission codes developed by him and his collaborators over several decades are among the most widely used. His thorough knowledge of the field, as well as his ability and enthusiasm to cooperate with many colleagues, made his career a success. He will be missed by all of us for his work and personality.

  15. The detector response matrices of the burst and transient source experiment (BATSE) on the Compton Gamma Ray Observatory

    NASA Technical Reports Server (NTRS)

    Pendleton, Geoffrey N.; Paciesas, William S.; Mallozzi, Robert S.; Koshut, Tom M.; Fishman, Gerald J.; Meegan, Charles A.; Wilson, Robert B.; Horack, John M.; Lestrade, John Patrick

    1995-01-01

    The detector response matrices for the Burst And Transient Source Experiment (BATSE) on board the Compton Gamma Ray Observatory (CGRO) are described, including their creation and operation in data analysis. These response matrices are a detailed abstract representation of the gamma-ray detectors' operating characteristics that are needed for data analysis. They are constructed from an extensive set of calibration data coupled with a complex geometry electromagnetic cascade Monte Carlo simulation code. The calibration tests and simulation algorithm optimization are described. The characteristics of the BATSE detectors in the spacecraft environment are also described.

  16. Study of X-ray photoionized Fe plasma and comparisons with astrophysical modeling codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foord, M E; Heeter, R F; Chung, H

    The charge state distributions of Fe, Na and F are determined in a photoionized laboratory plasma using high resolution x-ray spectroscopy. Independent measurements of the density and radiation flux indicate the ionization parameter ζ in the plasma reaches values ζ = 20-25 erg cm s^-1 under near steady-state conditions. A curve-of-growth analysis, which includes the effects of velocity gradients in a one-dimensional expanding plasma, fits the observed line opacities. Absorption lines are tabulated in the wavelength region 8-17 Å. Initial comparisons with a number of astrophysical x-ray photoionization models show reasonable agreement.
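For context, an ionization parameter with units of erg cm s^-1 is conventionally defined as ξ = L / (n_e r²), with L the ionizing luminosity, n_e the electron density, and r the distance from the source; assuming the paper's ζ follows that standard convention, the evaluation is one line:

```python
def ionization_parameter(luminosity_erg_s, n_e_cm3, r_cm):
    """xi = L / (n_e * r^2), in erg cm s^-1 (standard photoionization
    convention; the paper's zeta is assumed to follow this definition)."""
    return luminosity_erg_s / (n_e_cm3 * r_cm ** 2)

# Hypothetical values: L = 4e37 erg/s, n_e = 1e10 cm^-3, r = 2e13 cm.
xi = ionization_parameter(4e37, 1e10, 2e13)
```

Matching a laboratory plasma's ξ to that inferred for an astrophysical source is what makes such experiments a direct benchmark for photoionization codes.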

  17. Characterisation of a MeV Bremsstrahlung x-ray source produced from a high intensity laser for high areal density object radiography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Courtois, C.; Compant La Fontaine, A.; Bazzoli, S.

    2013-08-15

    Results of an experiment to characterise MeV Bremsstrahlung x-ray emission created by a short (<10 ps) pulse, high intensity (1.4 × 10^19 W/cm^2) laser are presented. X-ray emission is characterized using several diagnostics: nuclear activation measurements, a calibrated hard x-ray spectrometer, and dosimeters. Results from the reconstructed x-ray energy spectra are consistent with numerical simulations using PIC and Monte Carlo codes between 0.3 and 30 MeV. The intense Bremsstrahlung x-ray source is used to radiograph an image quality indicator (IQI) heavily filtered with thick tungsten absorbers. Observations suggest that internal features of the IQI can be resolved up to an external areal density of 85 g/cm^2. The x-ray source size, inferred from the radiography of a thick resolution grid, is estimated to be approximately 400 μm (full width at half maximum of the x-ray source point spread function).

  18. (U) Second-Order Sensitivity Analysis of Uncollided Particle Contributions to Radiation Detector Responses Using Ray-Tracing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Favorite, Jeffrey A.

    The Second-Level Adjoint Sensitivity System (2nd-LASS) that yields the second-order sensitivities of a response of uncollided particles with respect to isotope densities, cross sections, and source emission rates is derived in Refs. 1 and 2. In Ref. 2, we solved problems for the uncollided leakage from a homogeneous sphere and a multiregion cylinder using the PARTISN multigroup discrete-ordinates code. In this memo, we derive solutions of the 2nd-LASS for the particular case when the response is a flux or partial current density computed at a single point on the boundary, and the inner products are computed using ray-tracing. Both the PARTISN approach and the ray-tracing approach are implemented in a computer code, SENSPG. The next section of this report presents the equations of the 1st- and 2nd-LASS for uncollided particles and the first- and second-order sensitivities that use the solutions of the 1st- and 2nd-LASS. Section III presents solutions of the 1st- and 2nd-LASS equations for the case of ray-tracing from a detector point. Section IV presents specific solutions of the 2nd-LASS and derives the ray-trace form of the inner products needed for second-order sensitivities. Numerical results for the total leakage from a homogeneous sphere are presented in Sec. V and for the leakage from one side of a two-region slab in Sec. VI. Section VII is a summary and conclusions.
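The ray-trace evaluation of an uncollided point response reduces to an inverse-square factor times exponential attenuation accumulated along the ray. A sketch of that kernel (the segment data are hypothetical; the memo's 2nd-LASS machinery builds first- and second-order sensitivities on top of inner products of exactly this form):

```python
import math

def uncollided_point_flux(source_strength, distance, segments):
    """Uncollided flux at a point detector from an isotropic point source:
    S / (4 pi d^2) * exp(-sum of sigma_t * t over ray segments), where
    segments = [(total cross section in 1/cm, path length in cm), ...]."""
    optical_depth = sum(sigma * t for sigma, t in segments)
    return source_strength / (4.0 * math.pi * distance ** 2) * math.exp(-optical_depth)

# Hypothetical two-region ray: 1 cm with sigma_t = ln(2), then 2 cm of void.
phi = uncollided_point_flux(1.0, 3.0, [(math.log(2.0), 1.0), (0.0, 2.0)])
```

Differentiating this expression with respect to a region's cross section or density is what produces the ray-trace forms of the sensitivity inner products.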

  19. Reconstruction of coded aperture images

    NASA Technical Reports Server (NTRS)

    Bielefeld, Michael J.; Yin, Lo I.

    1987-01-01

    The balanced correlation method and the Maximum Entropy Method (MEM) were implemented to reconstruct a laboratory X-ray source as imaged by a Uniformly Redundant Array (URA) system. Although the MEM method has advantages over the balanced correlation method, it is computationally time consuming because of the iterative nature of its solution. Massively parallel processing, with its parallel array structure, is ideally suited for such computations. These preliminary results indicate that it is possible to use the MEM method in future coded-aperture experiments with the help of the MPP.
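The balanced correlation method can be demonstrated on a one-dimensional analogue of a URA. Using the (7, 3, 1) cyclic difference set {1, 2, 4} as the mask, a point source casts a cyclically shifted copy of the mask on the detector, and correlating with a balanced decoder (+1 on open elements, -K/(N-K) on closed ones) recovers the source position with flat sidelobes. A toy sketch, not the two-dimensional URA of the experiment:

```python
N = 7
mask = [1 if i in (1, 2, 4) else 0 for i in range(N)]  # quadratic residues mod 7
K = sum(mask)
decoder = [1.0 if m else -K / (N - K) for m in mask]   # balanced: entries sum to 0

def shadowgram(source_pos):
    """A point source at source_pos shifts the mask pattern cyclically."""
    return [mask[(i - source_pos) % N] for i in range(N)]

def decode(detector):
    """Balanced cyclic cross-correlation of detector data with the decoder."""
    return [sum(detector[(i + tau) % N] * decoder[i] for i in range(N))
            for tau in range(N)]

rec = decode(shadowgram(5))   # peak of K = 3 at index 5, flat -0.5 sidelobes
```

The difference-set property (every nonzero shift overlaps the mask in exactly λ = 1 open element) is what makes the sidelobes constant, so the reconstruction is a delta function on a flat pedestal.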

  20. Numerical simulation of the early-time high altitude electromagnetic pulse

    NASA Astrophysics Data System (ADS)

    Meng, Cui; Chen, Yu-Sheng; Liu, Shun-Kun; Xie, Qin-Chuan; Chen, Xiang-Yue; Gong, Jian-Cheng

    2003-12-01

    In this paper, the finite difference method is used to develop the Fortran software MCHII. The physical process in which the electromagnetic signal is generated by the interaction of nuclear-explosion-induced Compton currents with the geomagnetic field is numerically simulated. The electromagnetic pulse waveforms below the burst point are investigated. The effects of burst height, yield, and the time dependence of the gamma-ray output are calculated using the MCHII code. The results agree well with those obtained by using the CHAP code.
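The finite-difference time-domain scheme underlying such electromagnetic simulations can be illustrated with a minimal one-dimensional Yee-style update in normalized units. This is a generic FDTD sketch, not the MCHII algorithm, which additionally couples Compton currents and the geomagnetic field:

```python
import math

# 1D FDTD on a 200-cell grid, normalized units (c = 1, dz = 1), Courant number 0.5.
nz, nt = 200, 150
ez = [0.0] * nz   # electric field samples
hy = [0.0] * nz   # magnetic field samples, staggered half a cell
courant = 0.5

for n in range(nt):
    # Update H from the spatial difference of E (leapfrog in time).
    for k in range(nz - 1):
        hy[k] += courant * (ez[k + 1] - ez[k])
    # Update E from the spatial difference of H.
    for k in range(1, nz):
        ez[k] += courant * (hy[k] - hy[k - 1])
    # Soft Gaussian source at the grid center stands in for the driving current.
    ez[100] += math.exp(-((n - 30) ** 2) / 100.0)
```

After 150 steps the injected pulse has split and propagated tens of cells in both directions, which is the basic behavior any EMP waveform code must reproduce before the physics source terms are added.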

  1. Coding for reliable satellite communications

    NASA Technical Reports Server (NTRS)

    Gaarder, N. T.; Lin, S.

    1986-01-01

    This research project was set up to study various kinds of coding techniques for error control in satellite and space communications for NASA Goddard Space Flight Center. During the project period, researchers investigated the following areas: (1) decoding of Reed-Solomon codes in terms of dual basis; (2) concatenated and cascaded error control coding schemes for satellite and space communications; (3) use of hybrid coding schemes (error correction and detection incorporated with retransmission) to improve system reliability and throughput in satellite communications; (4) good codes for simultaneous error correction and error detection, and (5) error control techniques for ring and star networks.

  2. DXRaySMCS: a user-friendly interface developed for prediction of diagnostic radiology X-ray spectra produced by Monte Carlo (MCNP-4C) simulation.

    PubMed

    Bahreyni Toossi, M T; Moradi, H; Zare, H

    2008-01-01

    In this work, the general purpose Monte Carlo N-particle radiation transport computer code (MCNP-4C) was used for the simulation of X-ray spectra in diagnostic radiology. The electron's path in the target was followed until its energy was reduced to 10 keV. A user-friendly interface named 'diagnostic X-ray spectra by Monte Carlo simulation (DXRaySMCS)' was developed to facilitate the application of the MCNP-4C code for diagnostic radiology spectrum prediction. The program provides a user-friendly interface for: (i) modifying the MCNP input file, (ii) launching the MCNP program to simulate electron and photon transport and (iii) processing the MCNP output file to yield a summary of the results (relative photon number per energy bin). In this article, the development and characteristics of DXRaySMCS are outlined. As part of the validation process, output spectra for 46 diagnostic radiology system settings produced by DXRaySMCS were compared with the corresponding IPEM78 spectra. Generally, there is good agreement between the two sets of spectra. No statistically significant differences have been observed between the IPEM78 reported spectra and the simulated spectra generated in this study.

  3. Anode optimization for miniature electronic brachytherapy X-ray sources using Monte Carlo and computational fluid dynamic codes

    PubMed Central

    Khajeh, Masoud; Safigholi, Habib

    2015-01-01

    A miniature X-ray source has been optimized for electronic brachytherapy. The cooling fluid for this device is water. Unlike radionuclide brachytherapy sources, this source is able to operate at variable voltages and currents to match the dose with the tumor depth. First, Monte Carlo (MC) optimization was performed on the tungsten target-buffer thickness layers versus energy such that the minimum X-ray attenuation occurred. The second optimization was done on the selection of the anode shape, based on the Monte Carlo in-water TG-43U1 anisotropy function. This optimization was carried out to bring the dose anisotropy function closer to unity at any angle from 0° to 170°. Three anode shapes, cylindrical, spherical, and conical, were considered. Moreover, using a Computational Fluid Dynamics (CFD) code, the optimal target-buffer shape and different nozzle shapes for electronic brachytherapy were evaluated. The characterization criteria of the CFD were the minimum temperature on the anode shape, the cooling water, and the pressure loss from inlet to outlet. The optimal anode was conical in shape with a conical nozzle. Finally, the TG-43U1 parameters of the optimal source were compared with the literature. PMID:26966563

  4. Kinetic Modeling of Ultraintense X-Ray Laser-Matter Interactions

    NASA Astrophysics Data System (ADS)

    Royle, Ryan; Sentoku, Yasuhiko; Mancini, Roberto; Johzaki, Tomoyuki

    2015-11-01

    High-intensity XFELs have become a novel way of creating and studying hot dense plasmas. The LCLS at Stanford can deliver a millijoule of energy with more than 10^12 photons in a ~100 femtosecond pulse. By tightly focusing the beam to a micron-scale spot size, the XFEL can be intensified to more than 10^18 W/cm^2, making it possible to heat solid matter isochorically beyond a million degrees (>100 eV). Such extreme states of matter are of considerable interest due to their relevance to astrophysical plasmas. Additionally, they will allow novel ways of studying equation-of-state and opacity physics under Gbar pressure and strong fields. Photoionization is the dominant x-ray absorption mechanism and triggers the heating processes. A photoionization model that takes into account the subshell cross-sections has been developed in a kinetic plasma simulation code, PICLS, that solves the x-ray transport self-consistently. The XFEL-matter interaction with several elements, including solid carbon, aluminum, and iron, is studied with the code, and the results are compared with recent LCLS experiments. This work was supported by the DOE/OFES under Contract No. DE-SC0008827.

  5. Mechanical code comparator

    DOEpatents

    Peter, Frank J.; Dalton, Larry J.; Plummer, David W.

    2002-01-01

    A new class of mechanical code comparators is described which have broad potential for application in safety, surety, and security applications. These devices can be implemented as micro-scale electromechanical systems that isolate a secure or otherwise controlled device until an access code is entered. This access code is converted into a series of mechanical inputs to the mechanical code comparator, which compares the access code to a pre-input combination, entered previously into the mechanical code comparator by an operator at the system security control point. These devices provide extremely high levels of robust security. Being totally mechanical in operation, an access control system properly based on such devices cannot be circumvented by software attack alone.

  6. Ray tracing method for the evaluation of grazing incidence x-ray telescopes described by spatially sampled surfaces.

    PubMed

    Yu, Jun; Shen, Zhengxiang; Sheng, Pengfeng; Wang, Xiaoqiang; Hailey, Charles J; Wang, Zhanshan

    2018-03-01

    The nested grazing incidence telescope can achieve a large collecting area in x-ray astronomy, with a large number of closely packed, thin conical mirrors. Exploiting surface metrology data, the ray tracing method used to reconstruct the shell surface topography and evaluate the imaging performance is a powerful tool to assist iterative improvement of the fabrication process. However, current two-dimensional (2D) ray tracing codes, especially when used with densely sampled surface shape data, may not provide sufficient reconstruction accuracy and are computationally cumbersome. In particular, the 2D ray tracing currently employed considers coplanar rays and thus simulates only rays in the meridional plane. This captures axial figure errors but leaves other important errors, such as roundness errors, unaccounted for. We introduce a semianalytic, three-dimensional (3D) ray tracing approach for x-ray optics that overcomes these shortcomings and is both computationally fast and accurate. We first introduce the principles and the computational details of this 3D ray tracing method. Then computer simulations of this approach are compared to 2D ray tracing, using an ideal conic Wolter-I telescope for benchmarking. Finally, the present 3D ray tracing is used to evaluate the performance of a prototype x-ray telescope fabricated for the enhanced x-ray timing and polarization mission.
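    As a minimal illustration of 3D (non-coplanar) ray tracing against a conical shell, the sketch below intersects a ray with the surface x² + y² = (r0 + m·z)² and reflects it specularly. This is a toy geometry, not the paper's semianalytic method or its sampled-surface reconstruction:

```python
import numpy as np

def trace_to_cone(origin, direction, r0, slope):
    """Nearest forward intersection of the ray p(t) = origin + t*direction
    with the cone x^2 + y^2 = (r0 + slope*z)^2 (axis along z).
    Assumes the quadratic coefficient is nonzero."""
    o = np.asarray(origin, float)
    d = np.asarray(direction, float)
    rz = r0 + slope * o[2]
    a = d[0]**2 + d[1]**2 - (slope * d[2])**2
    b = 2.0 * (o[0]*d[0] + o[1]*d[1] - slope * d[2] * rz)
    c = o[0]**2 + o[1]**2 - rz**2
    disc = b*b - 4.0*a*c
    if disc < 0.0:
        return None, None
    sq = np.sqrt(disc)
    hits = [t for t in ((-b - sq) / (2*a), (-b + sq) / (2*a)) if t > 1e-9]
    if not hits:
        return None, None
    t = min(hits)
    return o + t*d, t

def reflect(direction, point, r0, slope):
    """Specular reflection d' = d - 2(d.n)n, with n from the cone gradient."""
    n = np.array([point[0], point[1], -slope * (r0 + slope * point[2])])
    n /= np.linalg.norm(n)
    d = np.asarray(direction, float)
    return d - 2.0 * np.dot(d, n) * n

# A ray parallel to the axis strikes the shell where the cone radius
# equals the ray's off-axis distance: r(z) = 1 + 0.1*z = 0.5 at z = -5.
origin = np.array([0.5, 0.0, -10.0])
direction = np.array([0.0, 0.0, 1.0])
point, t = trace_to_cone(origin, direction, r0=1.0, slope=0.1)
assert np.isclose(point[2], -5.0)
out = reflect(direction, point, r0=1.0, slope=0.1)
assert np.isclose(np.linalg.norm(out), 1.0)
```

Replacing the analytic cone with interpolated metrology data at each intersection is the step that distinguishes a real sampled-surface tracer from this sketch.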

  7. Resonant scattering as a sensitive diagnostic of current collisional plasma models

    NASA Astrophysics Data System (ADS)

    Ogorzalek, Anna; Zhuravleva, Irina; Allen, Steven W.; Pinto, Ciro; Werner, Norbert; Mantz, Adam; Canning, Rebecca; Fabian, Andrew C.; Kaastra, Jelle S.; de Plaa, Jelle

    2017-08-01

    Resonant scattering is a subtle process that suppresses fluxes of some of the brightest optically thick X-ray emission lines produced by collisional plasmas in galaxy clusters and massive early-type galaxies. The amplitude of the effect depends on the turbulent structure of the hot gas, making it a sensitive velocity probe. It is therefore crucial to properly model this effect in order to correctly interpret high-resolution X-ray spectra. Our measurements of resonant scattering with the XMM-Newton Reflection Grating Spectrometer in giant elliptical galaxies and with Hitomi in the center of the Perseus Cluster show that the potentially rich inference from this effect is limited by uncertainties in the atomic data underlying plasma codes such as APEC and SPEX. Typically, the effect is of the order of 10-20%, while the discrepancy between the two codes is of similar order or even higher. Precise knowledge of the emissivity and oscillator strengths of lines emitted by Fe XVII and Fe XXV, as well as their respective uncertainties propagated through plasma codes, is key to understanding gas dynamics and microphysics in giant galaxies and the cluster ICM, respectively. This is especially crucial for massive ellipticals, where sub-eV resolution would be needed to measure line broadening precisely, making resonant scattering an important velocity diagnostic in these systems for the foreseeable future. In this poster, I will summarize the current status of resonant scattering measurements and show how they depend on the assumed atomic data. I will also discuss which improvements are essential to maximize scientific inference from future high-resolution X-ray spectra.

  8. 3D-printed coded apertures for x-ray backscatter radiography

    NASA Astrophysics Data System (ADS)

    Muñoz, André A. M.; Vella, Anna; Healy, Matthew J. F.; Lane, David W.; Jupp, Ian; Lockley, David

    2017-09-01

    Many different mask patterns can be used for X-ray backscatter imaging using coded apertures, which can find application in the medical, industrial and security sectors. While some of these patterns may be considered to have a self-supporting structure, this is not the case for some of the most frequently used patterns, such as uniformly redundant arrays or any pattern with a high open fraction. This makes mask construction difficult and usually requires a compromise in its design by drilling holes or adopting a no-two-holes-touching version of the original pattern. In this study, this compromise was avoided by 3D printing a support structure that was then filled with a radiopaque material to create the completed mask. The coded masks were manufactured using two different methods, hot casting and cold casting. Hot casting involved casting a bismuth alloy at 80°C into a 3D printed acrylonitrile butadiene styrene mould, which produced an absorber with a density of 8.6 g cm⁻³. Cold casting was undertaken at room temperature, when a tungsten/epoxy composite was cast into a 3D printed polylactic acid mould. The cold cast procedure offered a greater density of around 9.6 to 10 g cm⁻³ and consequently greater X-ray attenuation. It was also found to be much easier to manufacture and more cost effective. A critical review of the manufacturing procedure is presented along with some typical images. In both cases the 3D printing process allowed square apertures to be created, avoiding their approximation by circular holes when conventional drilling is used.

  9. Contribution from individual nearby sources to the spectrum of high-energy cosmic-ray electrons

    NASA Astrophysics Data System (ADS)

    Sedrati, R.; Attallah, R.

    2014-04-01

    In the last few years, very important data on high-energy cosmic-ray electrons and positrons from high-precision space-borne and ground-based experiments have attracted a great deal of interest. These particles represent a unique probe for studying local cosmic-ray accelerators because they lose energy very rapidly. These energy losses reduce the lifetime so drastically that high-energy cosmic-ray electrons can reach the Earth only from rather local astrophysical sources. This work aims at calculating, by means of Monte Carlo simulation, the contribution from some known nearby astrophysical sources to the cosmic-ray electron/positron spectra at high energy (≥ 10 GeV). The background to the electron energy spectrum from distant sources is determined with the help of the GALPROP code. The obtained numerical results are compared with a set of experimental data.

  10. FIER: Software for analytical modeling of delayed gamma-ray spectra

    NASA Astrophysics Data System (ADS)

    Matthews, E. F.; Goldblum, B. L.; Bernstein, L. A.; Quiter, B. J.; Brown, J. A.; Younes, W.; Burke, J. T.; Padgett, S. W.; Ressler, J. J.; Tonchev, A. P.

    2018-05-01

    A new software package, the Fission Induced Electromagnetic Response (FIER) code, has been developed to analytically predict delayed γ-ray spectra following fission. FIER uses evaluated nuclear data and solutions to the Bateman equations to calculate the time-dependent populations of fission products and their decay daughters resulting from irradiation of a fissionable isotope. These populations are then used in the calculation of γ-ray emission rates to obtain the corresponding delayed γ-ray spectra. FIER output was compared to experimental data obtained by irradiation of a 235U sample in the Godiva critical assembly. This investigation illuminated discrepancies in the input nuclear data libraries, showcasing the usefulness of FIER as a tool to address nuclear data deficiencies through comparison with experimental data. FIER provides traceability between γ-ray emissions and their contributing nuclear species, decay chains, and parent fission fragments, yielding a new capability for the nuclear science community.
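    FIER's core step, solving the Bateman equations for a linear decay chain, can be sketched as follows. This is the textbook analytic solution for distinct, nonzero decay constants, not FIER itself; the γ-ray emission rate of chain member k would then follow as λ_k·N_k(t) times the photon intensity per decay from evaluated nuclear data:

```python
import numpy as np

def bateman(n0, lambdas, t):
    """Populations of a linear decay chain A1 -> A2 -> ... at time t,
    starting from n0 atoms of the chain head. Classic analytic solution;
    requires all decay constants to be distinct and nonzero."""
    lam = np.asarray(lambdas, float)
    pops = []
    for k in range(len(lam)):
        coeff = n0 * np.prod(lam[:k])     # product of feeding decay constants
        total = 0.0
        for i in range(k + 1):
            denom = np.prod([lam[j] - lam[i] for j in range(k + 1) if j != i])
            total += np.exp(-lam[i] * t) / denom
        pops.append(coeff * total)
    return np.array(pops)

# Two-member chain, checked against the closed-form daughter solution.
pops = bateman(1000.0, [2.0, 1.0], t=0.5)
assert np.isclose(pops[0], 1000.0 * np.exp(-1.0))
daughter = 1000.0 * 2.0 / (1.0 - 2.0) * (np.exp(-1.0) - np.exp(-0.5))
assert np.isclose(pops[1], daughter)
# A gamma-ray emission rate then follows as lam[k] * pops[k] times the
# photon intensity per decay taken from evaluated nuclear data.
```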

  11. Measurement of angularly dependent spectra of betatron gamma-rays from a laser plasma accelerator with quadrant-sectored range filters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeon, Jong Ho, E-mail: jhjeon07@ibs.re.kr; Nakajima, Kazuhisa, E-mail: naka115@dia-net.ne.jp; Rhee, Yong Joo

    Measurements of angularly dependent spectra of betatron gamma-rays radiated by GeV electron beams from laser wakefield accelerators (LWFAs) are presented. The angle-resolved spectrum of betatron radiation was deconvolved from the position-dependent data measured for a single laser shot with a broadband gamma-ray spectrometer comprising four-quadrant sectored range filters and an unfolding algorithm based on the Monte Carlo code GEANT4. The unfolded gamma-ray spectra in the photon energy range of 0.1-10 MeV revealed an approximately isotropic angular dependence of the peak photon energy and photon energy-integrated fluence. As expected from the analysis of betatron radiation from LWFAs, the results indicate that unpolarized gamma-rays are emitted by electrons undergoing betatron motion in isotropically distributed orbit planes.

  12. Analysis of monochromatic and quasi-monochromatic X-ray sources in imaging and therapy

    NASA Astrophysics Data System (ADS)

    Westphal, Maximillian; Lim, Sara; Nahar, Sultana; Orban, Christopher; Pradhan, Anil

    2017-04-01

    We studied biomedical imaging and therapeutic applications of recently developed quasi-monochromatic and monochromatic X-ray sources. Using the Monte Carlo code GEANT4, we found that the quasi-monochromatic 65 keV Gaussian X-ray spectrum created by inverse Compton scattering with relativistic electron beams was capable of producing better image contrast with less radiation compared to conventional 120 kV broadband CT scans. We also explored possible experimental detection of theoretically predicted Kα resonance fluorescence in high-Z elements using the European Synchrotron Radiation Facility with a tungsten (Z = 74) target. In addition, we studied a newly developed quasi-monochromatic source generated by converting broadband X-rays to monochromatic Kα and Kβ X-rays with a zirconium target (Z = 40). We will further study how these Kα- and Kβ-dominated spectra can be implemented in conjunction with nanoparticles for targeted therapy. Acknowledgement: Ohio Supercomputer Center, Columbus, OH.

  13. Study of fission fragment de-excitation by gamma-ray spectrometry with the EXILL experiment

    NASA Astrophysics Data System (ADS)

    Materna, Thomas; Rapała, Michał; Letourneau, Alain; Marchix, Anthony; Litaize, Olivier; Sérot, Olivier; Urban, Waldemar; Blanc, Aurélien; Jentschel, Michael; Köster, Ulli; Mutti, Paolo; Soldner, Torsten; Simpson, Gary; Ur, Călin A.; France, Gilles de

    2017-09-01

    A large array of Ge detectors installed at the ILL around a 235U target irradiated with cold neutrons (EXILL) allowed measurement of prompt gamma-ray cascades occurring in fission fragments, with unambiguous identification of the fragments. Here we present preliminary results of a systematic comparison between experimental γ-ray intensities and those obtained from the Monte Carlo simulation code FIFRELIN, which is dedicated to the de-excitation of fission fragments. Major γ-ray intensities in the 142Ba and 92Kr fission products, extracted from EXILL data, were compared to FIFRELIN, as well as to reported values (when available) obtained with EUROGAM2 in the spontaneous fission of 248Cm. The evolution of γ-ray intensities in 92Kr versus the complementary partner in fission (i.e. versus the total number of neutrons evaporated by the fission pair) was then extracted and compared to FIFRELIN.

  14. 27 CFR 73.12 - What security controls must I use for identification codes and passwords?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...What security controls must I use for identification codes and passwords? If you use electronic signatures based upon use of identification codes in combination with passwords, you must employ controls to ensure their...

  15. Air-kerma strength determination of a miniature x-ray source for brachytherapy applications

    NASA Astrophysics Data System (ADS)

    Davis, Stephen D.

    A miniature x-ray source has been developed by Xoft Inc. for high dose-rate brachytherapy treatments. The source is contained in a 5.4 mm diameter water-cooling catheter. The source voltage can be adjusted from 40 kV to 50 kV and the beam current is adjustable up to 300 μA. Electrons are accelerated toward a tungsten-coated anode to produce a lightly-filtered bremsstrahlung photon spectrum. The sources were initially used for early-stage breast cancer treatment using a balloon applicator. More recently, Xoft Inc. has developed vaginal and surface applicators. The miniature x-ray sources have been characterized using a modification of the American Association of Physicists in Medicine Task Group No. 43 formalism normally used for radioactive brachytherapy sources. Primary measurements of air kerma were performed using free-air ionization chambers at the University of Wisconsin (UW) and the National Institute of Standards and Technology (NIST). The measurements at UW were used to calibrate a well-type ionization chamber for clinical verification of source strength. Accurate knowledge of the emitted photon spectrum was necessary to calculate the corrections required to determine air-kerma strength, defined in vacuo. Theoretical predictions of the photon spectrum were calculated using three separate Monte Carlo codes: MCNP5, EGSnrc, and PENELOPE. Each code used different implementations of the underlying radiological physics. Benchmark studies were performed to investigate these differences in detail. The most important variation among the codes was found to be the calculation of fluorescence photon production following electron-induced vacancies in the L shell of tungsten atoms. The low-energy tungsten L-shell fluorescence photons have little clinical significance at the treatment distance, but could have a large impact on air-kerma measurements. 
Calculated photon spectra were compared to spectra measured with high-purity germanium spectroscopy systems at both UW and NIST. The effects of escaped germanium fluorescence photons and Compton-scattered photons were taken into account for the UW measurements. The photon spectrum calculated using the PENELOPE Monte Carlo code had the best agreement with the spectrum measured at NIST. Corrections were applied to the free-air chamber measurements to arrive at an air-kerma strength determination for the miniature x-ray sources.

  16. Class of near-perfect coded apertures

    NASA Technical Reports Server (NTRS)

    Cannon, T. M.; Fenimore, E. E.

    1977-01-01

    Coded aperture imaging of gamma ray sources has long promised an improvement in the sensitivity of various detector systems. The promise has remained largely unfulfilled, however, for either one of two reasons. First, the encoding/decoding method produces artifacts, which even in the absence of quantum noise, restrict the quality of the reconstructed image. This is true of most correlation-type methods. Second, if the decoding procedure is of the deconvolution variety, small terms in the transfer function of the aperture can lead to excessive noise in the reconstructed image. It is proposed to circumvent both of these problems by use of a uniformly redundant array (URA) as the coded aperture in conjunction with a special correlation decoding method.
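    The artifact-free correlation property that URAs exploit can be demonstrated with a 1-D quadratic-residue (Legendre) sequence, a close relative of the URA construction. Its periodic autocorrelation is two-valued (a sharp peak on a perfectly flat background), so correlation decoding of a point source's shadowgram produces no sidelobe artifacts. A small numerical sketch (illustrative, not the authors' code):

```python
import numpy as np

def legendre_sequence(p):
    """+/-1 quadratic-residue sequence of odd prime length p (0 at index 0)."""
    seq = -np.ones(p)
    seq[0] = 0
    residues = {(i * i) % p for i in range(1, p)}
    for r in residues:
        seq[r] = 1
    return seq

def circular_correlation(x, y):
    n = len(x)
    return np.array([np.sum(x * np.roll(y, -k)) for k in range(n)])

p = 11
mask = legendre_sequence(p)

# Two-valued periodic autocorrelation: p-1 at zero lag, -1 everywhere else.
auto = circular_correlation(mask, mask)
assert auto[0] == p - 1
assert np.all(auto[1:] == -1)

# A point source at offset 4 casts a shifted copy of the mask pattern;
# correlating with the same sequence recovers a sharp, artifact-free peak.
shadow = np.roll(mask, 4)
recon = circular_correlation(mask, shadow)
assert np.argmax(recon) == 4
```

Full 2-D URAs extend this idea to rectangular arrays with the same flat-sidelobe correlation behavior.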

  17. Delayed photo-emission model for beam optics codes

    DOE PAGES

    Jensen, Kevin L.; Petillo, John J.; Panagos, Dimitrios N.; ...

    2016-11-22

    Future advanced light sources and x-ray Free Electron Lasers require fast response from the photocathode to enable short electron pulse durations as well as pulse shaping, so the ability to model delays in emission is needed in beam optics codes. The development of a time-dependent emission model accounting for delayed photoemission due to transport and scattering is given, and its inclusion in the Particle-in-Cell code MICHELLE results in changes to the pulse shape that are described. Furthermore, the model is applied to pulse elongation of a bunch traversing an rf injector, and to the smoothing of laser jitter on a short pulse.

  18. Time-dependent spherically symmetric accretion onto compact X-ray sources

    NASA Technical Reports Server (NTRS)

    Cowie, L. L.; Ostriker, J. P.; Stark, A. A.

    1978-01-01

    Analytical arguments and a numerical hydrodynamic code are used to investigate spherically symmetric accretion onto a compact object, in an attempt to provide some insight into gas flows heated by an outgoing X-ray flux. It is shown that preheating of spherically symmetric accretion flows by energetic radiation from an X-ray source results in time-dependent behavior for a much wider range of source parameters than was determined previously and that there are two distinct types of instability. The results are compared with observations of X-ray bursters and transients as well as with theories on quasars and active galactic nuclei that involve quasi-spherically symmetric accretion onto massive black holes. Models based on spherically symmetric accretion are found to be inconsistent with observations of bursters and transients.

  19. Radiography simulation on single-shot dual-spectrum X-ray for cargo inspection system.

    PubMed

    Gil, Youngmi; Oh, Youngdo; Cho, Moohyun; Namkung, Won

    2011-02-01

    We propose a method to identify materials in the dual-energy X-ray (DeX) inspection system. This method identifies materials by combining information on the relative proportions T of high-energy and low-energy X-rays transmitted through the material, and the ratio R of the attenuation coefficient of the material when high-energy X-rays are used to that when low-energy X-rays are used. In Monte Carlo N-Particle Transport Code (MCNPX) simulations using the same geometry as that of the real container inspection system, this T vs. R method successfully identified tissue-equivalent plastic and several metals. In further simulations, the single-shot mode of operating the accelerator led to better discrimination of materials than the dual-shot system. Copyright © 2010 Elsevier Ltd. All rights reserved.
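    The thickness independence that makes the R signature useful follows directly from the exponential attenuation law: with T = e^(−μd), the ratio ln T_high / ln T_low reduces to μ_high/μ_low regardless of d. A small sketch with invented attenuation coefficients (not MCNPX results from the paper):

```python
import numpy as np

def attenuation_ratio(t_high, t_low):
    """Thickness-independent material signature: ratio of the effective
    attenuation coefficient at high energy to that at low energy.
    With T = exp(-mu * d), the unknown thickness d cancels."""
    return np.log(t_high) / np.log(t_low)

# Hypothetical linear attenuation coefficients (cm^-1) at the two beam
# energies -- illustrative numbers only.
mu = {"plastic": (0.18, 0.22), "steel": (2.2, 3.1)}   # (high, low)

for material, (mu_hi, mu_lo) in mu.items():
    for d in (1.0, 3.0, 10.0):  # cm; the signature must not depend on d
        t_hi, t_lo = np.exp(-mu_hi * d), np.exp(-mu_lo * d)
        assert np.isclose(attenuation_ratio(t_hi, t_lo), mu_hi / mu_lo)
```

Materials then separate in the T-R plane because R tracks composition while T tracks areal density.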

  20. Shielding requirements for constant-potential diagnostic x-ray beams determined by a Monte Carlo calculation.

    PubMed

    Simpkin, D J

    1989-02-01

    A Monte Carlo calculation has been performed to determine the transmission of broad constant-potential x-ray beams through Pb, concrete, gypsum wallboard, steel and plate glass. The EGS4 code system was used with a simple broad-beam geometric model to generate exposure transmission curves for published 70, 100, 120 and 140-kVcp x-ray spectra. These curves are compared to measured three-phase generated x-ray transmission data in the literature and found to be reasonable. For calculation ease the data are fit to an equation previously shown to describe such curves quite well. These calculated transmission data are then used to create three-phase shielding tables for Pb and concrete, as well as other materials not available in Report No. 49 of the NCRP.
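    The fitting equation referred to is presumably the three-parameter Archer model commonly used for broad-beam transmission curves; it has the convenient property of being analytically invertible for barrier thickness. A sketch with made-up fit parameters (not the paper's published coefficients):

```python
import math

def archer_transmission(x, a, b, g):
    """Three-parameter Archer-type model for broad-beam transmission
    through a barrier of thickness x."""
    return ((1 + b / a) * math.exp(a * g * x) - b / a) ** (-1.0 / g)

def barrier_thickness(B, a, b, g):
    """Analytic inversion: thickness needed to achieve transmission B."""
    return math.log((B ** -g + b / a) / (1 + b / a)) / (a * g)

# Illustrative (made-up) fit parameters for a diagnostic beam in lead.
a, b, g = 2.5, 15.0, 0.75   # mm^-1, mm^-1, dimensionless
x = barrier_thickness(1e-3, a, b, g)
assert x > 0
assert math.isclose(archer_transmission(x, a, b, g), 1e-3, rel_tol=1e-6)
```

This inversion is what turns fitted transmission curves into the shielding tables the abstract describes.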

  1. X-ray simulation algorithms used in ISP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sullivan, John P.

    ISP is a simulation code which is sometimes used in the USNDS program. ISP is maintained by Sandia National Lab. However, the X-ray simulation algorithm used by ISP was written by scientists at LANL, mainly by Ed Fenimore, with some contributions from John Sullivan, George Neuschaefer, and probably others. In an email to John Sullivan on July 25, 2016, Jill Rivera, ISP project lead, said “ISP uses the function xdosemeters_sim from the xgen library.” This is a Fortran subroutine which is also used to simulate the X-ray response in consim (a descendant of xgen). Therefore, no separate documentation of the X-ray simulation algorithms in ISP has been written; the documentation for the consim simulation can be used.

  2. Quantitative Kα line spectroscopy for energy transport in ultra-intense laser plasma interaction

    NASA Astrophysics Data System (ADS)

    Zhang, Z.; Nishimura, H.; Namimoto, T.; Fujioka, S.; Arikawa, Y.; Nakai, M.; Koga, M.; Shiraga, H.; Kojima, S.; Azechi, H.; Ozaki, T.; Chen, H.; Park, J.; Williams, G. J.; Nishikino, M.; Kawachi, T.; Sagisaka, A.; Orimo, S.; Ogura, K.; Pirozhkov, A.; Yogo, A.; Kiriyama, H.; Kondo, K.; Okano, Y.

    2012-10-01

    X-ray line spectra ranging from 17 to 77 keV were quantitatively measured with a Laue spectrometer composed of a cylindrically curved crystal and a detector. The absolute sensitivity of the spectrometer system was calibrated using pre-characterized laser-produced x-ray sources and radioisotopes for the detectors and crystal, respectively. The integrated reflectivity of the crystal is in good agreement with predictions by an open code for x-ray diffraction. The energy transfer efficiency from the incident laser beams to hot electrons, the agent responsible for Au Kα x-ray line emission, is derived as a consequence of this work. By considering the hot electron temperature, the transfer efficiency from the LFEX laser to the Au plate target is about 8% to 10%.

  3. 21 CFR 106.90 - Coding.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... CONSUMPTION INFANT FORMULA QUALITY CONTROL PROCEDURES Quality Control Procedures for Assuring Nutrient Content of Infant Formulas § 106.90 Coding. The manufacturer shall code all infant formulas in conformity...

  4. Incorporating Manual and Autonomous Code Generation

    NASA Technical Reports Server (NTRS)

    McComas, David

    1998-01-01

    Code can be generated manually or by code-generation software tools, but how do you integrate the two? This article looks at a design methodology that combines object-oriented design with automatic code generation for attitude control flight software. Recent improvements in space flight computers are allowing software engineers to spend more time engineering the applications software. The application developed was the attitude control flight software for an astronomical satellite called the Microwave Anisotropy Probe (MAP). The MAP flight system is being designed, developed, and integrated at NASA's Goddard Space Flight Center. The MAP controls engineers are using Integrated Systems Inc.'s MATRIXx for their controls analysis. In addition to providing a graphical analysis environment, MATRIXx includes an automatic code generation facility called AutoCode. This article examines the forces that shaped the final design and describes three highlights of the design process: (1) defining the interface between manual and automatically generated code; (2) applying object-oriented design to the manual flight code; (3) implementing the object-oriented design in C.

  5. Nucleon-Nucleon Total Cross Section

    NASA Technical Reports Server (NTRS)

    Norbury, John W.

    2008-01-01

    The total proton-proton and neutron-proton cross sections currently used in the transport code HZETRN show significant disagreement with experiment in the GeV and EeV energy ranges. The GeV range is near the region of maximum cosmic ray intensity. It is therefore important to correct these cross sections, so that predictions of space radiation environments will be accurate. Parameterizations of nucleon-nucleon total cross sections are developed which are accurate over the entire energy range of the cosmic ray spectrum.

  6. Using the Monte Carlo method for assessing the tissue and organ doses of patients in dental radiography

    NASA Astrophysics Data System (ADS)

    Makarevich, K. O.; Minenko, V. F.; Verenich, K. A.; Kuten, S. A.

    2016-05-01

    This work is dedicated to modeling dental radiographic examinations to assess the absorbed doses of patients and effective doses. The TASMIP empirical model is used to simulate the X-ray spectra. Doses are assessed by the Monte Carlo method, using the MCNP code with the ICRP voxel phantoms. The results of the assessment of doses to individual organs and effective doses for different types of dental examinations and X-ray tube settings are presented.

  7. X-Ray Attenuation Coefficients from 10 keV to 100 MeV,

    DTIC Science & Technology

    1957-04-30

    National Bureau of Standards Circular 583: X-ray Attenuation Coefficients from 10 keV to 100 MeV. United States Department of Commerce, Sinclair Weeks, Secretary; National Bureau of Standards, A. V. Astin, Director. Issued April 30, 1957.

  8. Probing Atmospheric Electric Fields in Thunderstorms through Radio Emission from Cosmic-Ray-Induced Air Showers.

    PubMed

    Schellart, P; Trinh, T N G; Buitink, S; Corstanje, A; Enriquez, J E; Falcke, H; Hörandel, J R; Nelles, A; Rachen, J P; Rossetto, L; Scholten, O; Ter Veen, S; Thoudam, S; Ebert, U; Koehn, C; Rutjes, C; Alexov, A; Anderson, J M; Avruch, I M; Bentum, M J; Bernardi, G; Best, P; Bonafede, A; Breitling, F; Broderick, J W; Brüggen, M; Butcher, H R; Ciardi, B; de Geus, E; de Vos, M; Duscha, S; Eislöffel, J; Fallows, R A; Frieswijk, W; Garrett, M A; Grießmeier, J; Gunst, A W; Heald, G; Hessels, J W T; Hoeft, M; Holties, H A; Juette, E; Kondratiev, V I; Kuniyoshi, M; Kuper, G; Mann, G; McFadden, R; McKay-Bukowski, D; McKean, J P; Mevius, M; Moldon, J; Norden, M J; Orru, E; Paas, H; Pandey-Pommier, M; Pizzo, R; Polatidis, A G; Reich, W; Röttgering, H; Scaife, A M M; Schwarz, D J; Serylak, M; Smirnov, O; Steinmetz, M; Swinbank, J; Tagger, M; Tasse, C; Toribio, M C; van Weeren, R J; Vermeulen, R; Vocks, C; Wise, M W; Wucknitz, O; Zarka, P

    2015-04-24

    We present measurements of radio emission from cosmic ray air showers that took place during thunderstorms. The intensity and polarization patterns of these air showers are radically different from those measured during fair-weather conditions. With the use of a simple two-layer model for the atmospheric electric field, these patterns can be well reproduced by state-of-the-art simulation codes. This in turn provides a novel way to study atmospheric electric fields.

  9. Feature extraction for ultrasonic sensor based defect detection in ceramic components

    NASA Astrophysics Data System (ADS)

    Kesharaju, Manasa; Nagarajah, Romesh

    2014-02-01

    High-density silicon carbide materials are commonly used as the ceramic element of hard armour inserts in traditional body armour systems to reduce their weight while providing improved hardness, strength and elastic response to stress. Currently, armour ceramic tiles are inspected visually offline using an X-ray technique that is time consuming and very expensive. In addition, X-ray inspection can misinterpret multiple defects as a single defect. To address these problems, an ultrasonic non-destructive approach is being investigated. Ultrasound-based inspection would be far more cost effective and reliable, as the methodology is applicable to on-line quality control, including implementation of accept/reject criteria. This paper describes a recently developed methodology to detect, locate and classify various manufacturing defects in ceramic tiles using sub-band coding of ultrasonic test signals. The wavelet transform is applied to the ultrasonic signal, and wavelet coefficients in the different frequency bands are extracted and used as input features to an artificial neural network (ANN) for signal classification. Two different classifiers, using artificial neural networks (supervised) and clustering (unsupervised), are supplied with features selected using Principal Component Analysis (PCA), and their classification performance is compared. This investigation establishes experimentally that PCA can be effectively used as a feature selection method that provides superior results for classifying various defects in the context of ultrasonic inspection, in comparison with the X-ray technique.
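    The feature pipeline described, sub-band energies from a wavelet decomposition followed by PCA, can be sketched with a plain Haar transform and an SVD-based PCA. The synthetic signals below are invented for illustration; the paper's actual ultrasonic data and classifier settings are not reproduced:

```python
import numpy as np

def haar_subband_energies(signal, levels=3):
    """Energy of the detail band at each Haar decomposition level -- a
    simple stand-in for the sub-band features described in the abstract
    (signal length must be divisible by 2**levels)."""
    s = np.asarray(signal, float)
    feats = []
    for _ in range(levels):
        approx = (s[0::2] + s[1::2]) / np.sqrt(2.0)
        detail = (s[0::2] - s[1::2]) / np.sqrt(2.0)
        feats.append(np.sum(detail ** 2))
        s = approx
    return np.array(feats)

def pca(features, n_components=2):
    """Principal component scores via SVD of the mean-centred matrix."""
    X = features - features.mean(axis=0)
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    return X @ vt[:n_components].T

# Two synthetic "A-scan" classes: clean echoes vs. echoes with an extra
# high-frequency defect signature (illustrative, not real ultrasonic data).
rng = np.random.default_rng(0)
t = np.arange(256)
clean = [np.sin(2 * np.pi * t / 64) + 0.05 * rng.standard_normal(256)
         for _ in range(10)]
defect = [np.sin(2 * np.pi * t / 64) + 0.5 * np.sin(2 * np.pi * t / 4)
          + 0.05 * rng.standard_normal(256) for _ in range(10)]

X = np.array([haar_subband_energies(s) for s in clean + defect])
scores = pca(X)
# The first principal component separates the two classes: the class
# means land on opposite sides of zero after centring.
assert scores[:10, 0].mean() * scores[10:, 0].mean() < 0
```

In the paper's pipeline these PCA-selected features would then feed the supervised ANN or unsupervised clustering stage.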

  10. Vulnerability assessment of a space based weapon platform electronic system exposed to a thermonuclear weapon detonation

    NASA Astrophysics Data System (ADS)

    Perez, C. L.; Johnson, J. O.

    Rapidly changing world events, the increased number of nations with intercontinental ballistic missile capability, and the proliferation of nuclear weapon technology will increase the number of nuclear threats facing the world today. Monitoring these nations' activities and providing an early-warning and/or intercept system via reconnaissance and surveillance satellites and space-based weapon platforms is a viable deterrent against a surprise nuclear attack. However, the deployment of satellite and weapon platform assets in space will subject the sensitive electronic equipment to a variety of natural and man-made radiation environments. These include Van Allen Belt protons and electrons; galactic and solar flare protons; and neutrons, gamma rays, and x-rays from intentionally detonated fission and fusion weapons. In this paper, the MASH v1.0 code system is used to estimate the dose to the critical electronics components of an idealized space-based weapon platform from neutron and gamma-ray radiation emitted by a thermonuclear weapon detonation in space. Fluence and dose assessments were performed for the platform fully loaded and in several stages representing limited engagement scenarios. The results indicate vulnerabilities to the Command, Control, and Communication bay instruments from radiation damage for a nuclear weapon detonation for certain source/platform orientations. The distance at which damage occurs will depend on the weapon yield (neutron and gamma-ray output per kiloton) and size (kilotons).

  11. [Shielding effect of clinical X-ray protector and lead glass against annihilation radiation and gamma rays of 99mTc].

    PubMed

    Fukuda, Atsushi; Koshida, Kichiro; Yamaguchi, Ichiro; Takahashi, Masaaki; Kitabayashi, Keitarou; Matsubara, Kousuke; Noto, Kimiya; Kawabata, Chikako; Nakagawa, Hiroto

    2004-12-01

    Various pharmaceutical companies in Japan are making radioactive drugs available for positron emission tomography (PET) in hospitals without a cyclotron. With the distribution of these drugs to hospitals, medical check-ups and examinations using PET are expected to increase. However, the safety guidelines for radiation in the new deployment of PET have not been adequately improved. Therefore, we measured the shielding effect of a clinical X-ray protector and lead glass against annihilation radiation and gamma rays of (99m)Tc. We then calculated the shielding effect of a 0.25 mm lead protector, 1 mm lead, and lead glass using the EGS4 (Electron Gamma Shower Version 4) code. The shielding effects of 22-mm lead glass against annihilation radiation and gamma rays of (99m)Tc were approximately 31.5% and 93.3%, respectively. The clinical X-ray protector against annihilation radiation approximately doubled the skin-absorbed dose.

  12. The Use of Gamma-Ray Imaging to Improve Portal Monitor Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ziock, Klaus-Peter; Collins, Jeff; Fabris, Lorenzo

    2008-01-01

    We have constructed a prototype, rapid-deployment portal monitor that uses visible-light and gamma-ray imaging to allow simultaneous monitoring of multiple lanes of traffic from the side of a roadway. Our Roadside Tracker uses automated target acquisition and tracking (TAT) software to identify and track vehicles in visible-light images. The field of view of the visible camera overlaps with and is calibrated to that of a one-dimensional gamma-ray imager. The TAT code passes information on when vehicles enter and exit the system field of view and when they cross gamma-ray pixel boundaries. Based on this information, the gamma-ray imager "harvests" the gamma-ray data specific to each vehicle, integrating its radiation signature for the entire time that it is in the field of view. In this fashion we are able to generate vehicle-specific radiation signatures and avoid the source confusion problems that plague nonimaging approaches to the same problem.
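    The "harvesting" idea — attributing per-pixel counts to whichever tracked vehicle currently occupies that pixel — can be sketched as a small event-driven accumulator. All class and method names here are illustrative, not the Roadside Tracker's actual interfaces:

```python
from collections import defaultdict

class VehicleHarvester:
    """Toy harvester: accumulate gamma-ray counts per tracked vehicle
    while it occupies gamma-ray imager pixels."""

    def __init__(self):
        self.signatures = defaultdict(int)  # vehicle_id -> total counts
        self.occupancy = {}                 # pixel index -> vehicle_id

    def on_pixel_entry(self, vehicle_id, pixel):
        """TAT software reports a vehicle crossing into a pixel."""
        self.occupancy[pixel] = vehicle_id

    def on_pixel_exit(self, pixel):
        """TAT software reports a pixel becoming empty."""
        self.occupancy.pop(pixel, None)

    def on_gamma_counts(self, counts_per_pixel):
        """One integration frame: {pixel: counts}; counts in occupied
        pixels are credited to the occupying vehicle's signature."""
        for pixel, counts in counts_per_pixel.items():
            vid = self.occupancy.get(pixel)
            if vid is not None:
                self.signatures[vid] += counts

h = VehicleHarvester()
h.on_pixel_entry("car-1", pixel=3)
h.on_gamma_counts({3: 5, 4: 2})  # pixel 4 is empty road -> not credited
h.on_pixel_entry("car-1", pixel=4)
h.on_gamma_counts({3: 1, 4: 4})
print(h.signatures["car-1"])
```

    Integrating over the whole transit in this way is what lets the system build a vehicle-specific signature instead of a lane-averaged one.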

  13. Hard X-ray Detectability of Small Impulsive Heating Events in the Solar Corona

    NASA Astrophysics Data System (ADS)

    Glesener, L.; Klimchuk, J. A.; Bradshaw, S. J.; Marsh, A.; Krucker, S.; Christe, S.

    2015-12-01

    Impulsive heating events ("nanoflares") are a candidate to supply the solar corona with its ~2 MK temperature. These transient events can be studied using extreme ultraviolet and soft X-ray observations, among others. However, the impulsive events may occur in tenuous loops on small enough timescales that the heating is essentially not observed due to ionization timescales, and only the cooling phase is observed. Bremsstrahlung hard X-rays could serve as a more direct and prompt indicator of transient heating events. A hard X-ray spacecraft based on the direct-focusing technology pioneered by the Focusing Optics X-ray Solar Imager (FOXSI) sounding rocket could search for these direct signatures. In this work, we use the hydrodynamical EBTEL code to simulate differential emission measures produced by individual heating events and by ensembles of such events. We then directly predict hard X-ray spectra and consider their observability by a future spaceborne FOXSI, and also by the RHESSI and NuSTAR spacecraft.

  14. Performance analysis of a concatenated coding scheme for error control

    NASA Technical Reports Server (NTRS)

    Costello, D. J., Jr.; Lin, S.; Kasami, T.

    1983-01-01

    A concatenated coding scheme for error control in data communications is analyzed. In this scheme, the inner code is used for both error correction and detection, whereas the outer code is used only for error detection. A retransmission is requested if the outer code detects the presence of errors after inner-code decoding. The probability of undetected error is derived and bounded. A particular example, proposed for the planetary program, is analyzed.
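    The inner-correct/outer-detect division of labor can be demonstrated with a deliberately tiny Monte Carlo sketch. The component codes below — a (3,1) repetition inner code and a single outer parity bit — are illustrative stand-ins, far weaker than the codes analyzed in the paper:

```python
import random

def bsc(bits, eps, rng):
    """Pass bits through a binary symmetric channel with crossover eps."""
    return [b ^ (rng.random() < eps) for b in bits]

def simulate(eps=0.05, n_data=32, trials=2000, seed=1):
    """Toy concatenated scheme: the inner (3,1) repetition code corrects
    single flips per symbol; an outer parity bit over the data detects
    residual errors and triggers a retransmission request."""
    rng = random.Random(seed)
    retx = undetected = 0
    for _ in range(trials):
        data = [rng.randint(0, 1) for _ in range(n_data)]
        word = data + [sum(data) % 2]       # append outer parity bit
        received = []
        for b in word:                      # inner encode / channel / decode
            triplet = bsc([b, b, b], eps, rng)
            received.append(1 if sum(triplet) >= 2 else 0)
        if sum(received[:-1]) % 2 != received[-1]:
            retx += 1                       # outer code requests retransmission
        elif received[:-1] != data:
            undetected += 1                 # errors slipped past both codes
    return retx / trials, undetected / trials

p_retx, p_ue = simulate()
print(f"P(retransmit) ~ {p_retx:.4f}, P(undetected error) ~ {p_ue:.5f}")
```

    Even with these weak components, undetected errors require an even number of residual symbol errors, so the undetected-error rate sits well below the retransmission rate — the qualitative behavior the paper bounds analytically.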

  15. Contact Control, Version 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    von Sternberg, Alex

    The contact control code is a generalized force control scheme meant to interface with a robotic arm controlled using the Robot Operating System (ROS). The code allows the user to specify a control scheme for each control dimension, so that many different task controllers can be built from the same generalized controller. The input to the code includes a maximum velocity, maximum force, maximum displacement, and a control law assigned to each direction; the output is a 6-degree-of-freedom velocity command that is sent to the robot controller.
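    The per-dimension structure described above can be sketched as one control law per axis feeding a 6-DOF twist. The law names, gains, and limits here are illustrative assumptions, not the actual ROS package's interface:

```python
def clamp(x, limit):
    return max(-limit, min(limit, x))

def axis_command(law, measured_force, target, gain, v_max, f_max):
    """One-axis control law of the kind a generalized controller could
    assign per dimension (hypothetical laws, not the package's own)."""
    if law == "force":          # drive measured contact force to a setpoint
        if abs(measured_force) > f_max:
            return 0.0          # safety stop on excessive contact force
        return clamp(gain * (target - measured_force), v_max)
    if law == "velocity":       # pass a velocity setpoint through the limit
        return clamp(target, v_max)
    return 0.0                  # uncontrolled axis: command zero velocity

# Build a 6-DOF twist [vx, vy, vz, wx, wy, wz]: force control on z only
laws = ["velocity", "velocity", "force", "none", "none", "none"]
targets = [0.01, 0.0, 5.0, 0.0, 0.0, 0.0]   # m/s or N, per axis
forces = [0.0, 0.0, 3.2, 0.0, 0.0, 0.0]     # measured wrench components
twist = [axis_command(l, f, t, gain=0.002, v_max=0.05, f_max=20.0)
         for l, f, t in zip(laws, forces, targets)]
print(twist)
```

    Assigning a different law to each dimension is what lets one generalized controller express tasks like "press with 5 N in z while sliding in x."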

  16. TOMO3D: 3-D joint refraction and reflection traveltime tomography parallel code for active-source seismic data—synthetic test

    NASA Astrophysics Data System (ADS)

    Meléndez, A.; Korenaga, J.; Sallarès, V.; Miniussi, A.; Ranero, C. R.

    2015-10-01

    We present a new 3-D traveltime tomography code (TOMO3D) for the modelling of active-source seismic data that uses the arrival times of both refracted and reflected seismic phases to derive the velocity distribution and the geometry of reflecting boundaries in the subsurface. This code is based on its popular 2-D version TOMO2D, from which it inherited the methods to solve the forward and inverse problems. The traveltime calculations are done using a hybrid ray-tracing technique combining the graph and bending methods. The LSQR algorithm is used to perform the iterative regularized inversion to improve the initial velocity and depth models. In order to cope with the increased computational demand due to the incorporation of the third dimension, the forward problem solver, which takes most of the run time (˜90 per cent in the test presented here), has been parallelized with a combination of multi-processing and message passing interface standards. This parallelization distributes the ray-tracing and traveltime calculations among the available computational resources. The code's performance is illustrated with a realistic synthetic example, including a checkerboard anomaly and two reflectors, which simulates the geometry of a subduction zone. The code is designed to invert for a single reflector at a time, so a data-driven layer-stripping strategy is proposed for cases involving multiple reflectors, and it is tested for the successive inversion of the two reflectors. Layers are bound by consecutive reflectors, and the initial velocity model for each inversion step incorporates the results from previous steps. This strategy poses simpler inversion problems at each step, allowing the recovery of strong velocity discontinuities that would otherwise be smoothed.
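    The regularized inversion step solves a damped linear system relating slowness perturbations to traveltime residuals. A minimal dense sketch, using a least-squares solve in place of the LSQR iterations a real tomography code uses, with a made-up two-cell model:

```python
import numpy as np

def regularized_slowness_update(G, dt, damp):
    """One linearized tomography step: minimize ||G ds - dt||^2 +
    damp^2 ||ds||^2 for slowness perturbations ds, by appending
    damping rows to the ray-path matrix G."""
    n = G.shape[1]
    A = np.vstack([G, damp * np.eye(n)])   # augmented (damped) system
    b = np.concatenate([dt, np.zeros(n)])
    ds, *_ = np.linalg.lstsq(A, b, rcond=None)
    return ds

# Tiny illustrative example: 3 rays crossing 2 cells (path lengths in km)
G = np.array([[1.0, 0.0],
              [0.5, 0.5],
              [0.0, 1.0]])
true_ds = np.array([0.02, -0.01])          # slowness perturbation, s/km
dt = G @ true_ds                           # synthetic traveltime residuals
print(regularized_slowness_update(G, dt, damp=1e-6))
```

    With noise-free data and light damping the true perturbation is recovered; in practice the damping weight trades data fit against model smoothness, and LSQR performs the same minimization matrix-free at 3-D problem sizes.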

  17. Automatic removal of cosmic ray signatures in Deep Impact images

    NASA Astrophysics Data System (ADS)

    Ipatov, S. I.; A'Hearn, M. F.; Klaasen, K. P.

    The performance of several codes, written by different authors, at recognizing cosmic ray (CR) signatures in single images taken during the Deep Impact mission was analyzed. For automatic removal of CR signatures from many images, we suggest using the code imgclean (http://pdssbn.astro.umd.edu/volume/didoc_0001/document/calibration_software/dical_v5/) written by E. Deutsch, as the other codes considered do not work properly in automatic mode with a large number of images and do not run to completion for some images; however, other codes can be better for analysis of certain specific images. Sometimes imgclean detects false CR signatures near the edge of a comet nucleus, and it often does not recognize all pixels of long CR signatures. Our code rmcr is the only code among those considered that allows one to work with raw images. For most visual images made during low solar activity at exposure times t > 4 s, the number of clusters of bright pixels on an image per second per sq. cm of CCD was about 2-4, both for dark and normal sky images. At high solar activity, it sometimes exceeded 10. The ratio of the number of CR signatures consisting of n pixels obtained at high solar activity to that at low solar activity was greater for greater n. The number of clusters detected as CR signatures on a single infrared image is at least several times greater than the actual number of CR signatures; the number of clusters based on analysis of two successive dark infrared frames is in agreement with the expected number of CR signatures. Some false CR signatures arise from glitches: bright pixels repeatedly present on different infrared images. Our interactive code imr allows a user to choose the regions of an image where glitches detected by imgclean as CR signatures are ignored. In other regions chosen by the user, the brightness of a pixel is replaced by the local median brightness if it exceeds the median brightness by some factor.
    The interactive code allows one to delete long CR signatures and prevents removal of false CR signatures near the edge of the comet nucleus. The interactive code can be applied to editing any digital images. The results obtained can be used for other missions to comets.
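    The median-replacement rule described in this record — replace a pixel with the local median if it exceeds that median by some factor — can be sketched directly. The window size and threshold factor below are arbitrary illustrative choices, not the actual imr parameters:

```python
import numpy as np

def remove_cosmic_rays(image, window=5, factor=3.0):
    """Replace pixels much brighter than their local median with that
    median (simple CR-signature cleaning; illustrative sketch only)."""
    pad = window // 2
    padded = np.pad(image, pad, mode="edge")
    cleaned = image.astype(float).copy()
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            local = np.median(padded[i:i + window, j:j + window])
            if local > 0 and image[i, j] > factor * local:
                cleaned[i, j] = local
    return cleaned

rng = np.random.default_rng(0)
img = rng.poisson(100, size=(16, 16)).astype(float)
img[8, 8] = 5000.0                # inject a cosmic-ray-like spike
out = remove_cosmic_rays(img)
print(out[8, 8], img[8, 8])       # spike suppressed, original untouched
```

    Because a single hot pixel barely shifts a 5x5 median, the spike is flattened while ordinary Poisson background pixels pass through unchanged — which is also why long multi-pixel CR tracks (and edges of real structure like a comet nucleus) need the interactive handling described above.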

  18. A cascaded coding scheme for error control and its performance analysis

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Kasami, Tadao; Fujiwara, Tohru; Takata, Toyoo

    1986-01-01

    A coding scheme is investigated for error control in data communication systems. The scheme is obtained by cascading two error correcting codes, called the inner and outer codes. The error performance of the scheme is analyzed for a binary symmetric channel with bit error rate epsilon < 1/2. It is shown that if the inner and outer codes are chosen properly, extremely high reliability can be attained even for a high channel bit error rate. Various specific example schemes, with inner codes ranging from high rates to very low rates and Reed-Solomon codes as outer codes, are considered, and their error probabilities are evaluated. They all provide extremely high reliability even for very high bit error rates. Several example schemes are being considered by NASA for satellite and spacecraft downlink error control.
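    For a linear code used purely for detection on a BSC, the undetected-error probability is the chance that the error pattern is itself a nonzero codeword: P_ud = Σ_w A_w ε^w (1−ε)^(n−w), summed over the code's weight distribution {A_w}. A short worked example using the (7,4) Hamming code, whose weight distribution is standard (this is a generic illustration, not one of the paper's cascaded schemes):

```python
def p_undetected(n, weight_dist, eps):
    """P_ud = sum over nonzero codeword weights w of A_w eps^w (1-eps)^(n-w):
    the chance a BSC error pattern is itself a codeword and so evades a
    code used purely for error detection."""
    return sum(a_w * eps**w * (1 - eps)**(n - w)
               for w, a_w in weight_dist.items() if w > 0)

# (7,4) Hamming code weight distribution: A_0=1, A_3=7, A_4=7, A_7=1
hamming = {0: 1, 3: 7, 4: 7, 7: 1}
for eps in (0.5, 0.1, 0.01):
    print(f"eps={eps}: P_ud = {p_undetected(7, hamming, eps):.3e}")
```

    At ε = 1/2 every error pattern is equally likely, so P_ud = (2^4 − 1)/2^7 = 15/128; as ε falls, P_ud drops rapidly because the lowest-weight codewords dominate — the mechanism that lets a well-chosen cascaded pair reach extremely high reliability.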

  19. CREME: The 2011 Revision of the Cosmic Ray Effects on Micro-Electronics Code

    NASA Technical Reports Server (NTRS)

    Adams, James H., Jr.; Barghouty, Abdulnasser F.; Reed, Robert A.; Sierawski, Brian D.; Watts, John W., Jr.

    2012-01-01

    We describe a tool suite, CREME, which combines existing capabilities of CREME96 and CREME86 with new radiation environment models and new Monte Carlo computational capabilities for single event effects and total ionizing dose.

  20. Correlated prompt fission data in transport simulations

    DOE PAGES

    Talou, P.; Vogt, R.; Randrup, J.; ...

    2018-01-24

    Detailed information on the fission process can be inferred from the observation, modeling and theoretical understanding of prompt fission neutron and γ-ray observables. Beyond simple average quantities, the study of distributions and correlations in prompt data, e.g., multiplicity-dependent neutron and γ-ray spectra, angular distributions of the emitted particles, and n-n, n-γ, and γ-γ correlations, can place stringent constraints on fission models and parameters that would otherwise be free to be tuned separately to represent individual fission observables. The FREYA and CGMF codes have been developed to follow the sequential emission of prompt neutrons and γ rays from the initial excited fission fragments produced right after scission. Both codes implement Monte Carlo techniques to sample initial fission fragment configurations in mass, charge and kinetic energy and to sample probabilities of neutron and γ emission at each stage of the decay. This approach naturally leads to using simple but powerful statistical techniques to infer distributions and correlations among many observables and model parameters. The comparison of model calculations with experimental data provides a rich arena for testing various nuclear physics models, such as those related to the nuclear structure and level densities of neutron-rich nuclei, the γ-ray strength functions of dipole and quadrupole transitions, the mechanism for dividing the excitation energy between the two nascent fragments near scission, and the mechanisms behind the production of angular momentum in the fragments. Beyond the obvious interest from a fundamental physics point of view, such studies are also important for addressing data needs in various nuclear applications.
    The inclusion of the FREYA and CGMF codes into the MCNP6.2 and MCNPX-PoliMi transport codes, for instance, provides a new and powerful tool to simulate correlated fission events in neutron transport calculations important in nonproliferation, safeguards, nuclear energy, and defense programs. This review provides an overview of the topic, starting from theoretical considerations of the fission process, with a focus on correlated signatures. It then explores the status of experimental correlated fission data and current efforts to address some of the known shortcomings. Numerical simulations employing the FREYA and CGMF codes are compared to experimental data for a wide range of correlated fission quantities. The inclusion of those codes into the MCNP6.2 and MCNPX-PoliMi transport codes is described and discussed in the context of relevant applications. The accuracy of the model predictions and their sensitivity to model assumptions and input parameters are discussed. Lastly, a series of important experimental and theoretical questions that remain unanswered are presented, suggesting a renewed effort to address these shortcomings.
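    The event-by-event Monte Carlo idea — sample a configuration, then sample the emitted particles — can be caricatured in a few lines. The distributions below (Poisson multiplicity, Maxwellian spectrum, which is a Gamma(3/2, T) distribution in energy) and the parameter values are deliberately simplified illustrations, not the physics models actually implemented in FREYA or CGMF:

```python
import numpy as np

def sample_prompt_neutrons(n_events, nu_bar=2.4, temp_mev=1.32, seed=7):
    """Toy event-by-event sampler: neutron multiplicity drawn from a
    Poisson around nu_bar, energies from a Maxwellian spectrum
    f(E) ~ sqrt(E) exp(-E/T), i.e. a Gamma(3/2, T) distribution."""
    rng = np.random.default_rng(seed)
    events = []
    for _ in range(n_events):
        nu = rng.poisson(nu_bar)                  # prompt multiplicity
        events.append(rng.gamma(1.5, temp_mev, size=nu))
    return events

events = sample_prompt_neutrons(10000)
all_e = np.concatenate(events)
print(f"mean multiplicity: {np.mean([len(e) for e in events]):.2f}")
print(f"mean neutron energy: {all_e.mean():.2f} MeV")
```

    Because each event carries its full list of emitted particles, histogramming the sample by multiplicity immediately yields multiplicity-dependent spectra — the kind of correlated observable the review emphasizes, and the reason event-by-event codes plug naturally into transport simulations.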

  2. Simulation study of 3–5 keV x-ray conversion efficiency from Ar K-shell vs. Ag L-shell targets on the National Ignition Facility laser

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kemp, G. E., E-mail: kemp10@llnl.gov; Colvin, J. D.; Fournier, K. B.

    2015-05-15

    Tailored, high-flux, multi-keV x-ray sources are desirable for studying x-ray interactions with matter for various civilian, space and military applications. For this study, we focus on designing an efficient laser-driven non-local thermodynamic equilibrium 3–5 keV x-ray source from photon-energy-matched Ar K-shell and Ag L-shell targets at sub-critical densities (∼n_c/10) to ensure supersonic, volumetric laser heating with minimal losses to kinetic energy, thermal x rays and laser-plasma instabilities. Using HYDRA, a multi-dimensional, arbitrary Lagrangian-Eulerian, radiation-hydrodynamics code, we performed a parameter study by varying initial target density and laser parameters for each material using conditions readily achievable on the National Ignition Facility (NIF) laser. We employ a model, benchmarked against Kr data collected on the NIF, that uses flux-limited Lee-More thermal conductivity and multi-group implicit Monte-Carlo photonics with non-local thermodynamic equilibrium, detailed super-configuration accounting opacities from CRETIN, an atomic-kinetics code. While the highest power laser configurations produced the largest x-ray yields, we report that the peak simulated laser to 3–5 keV x-ray conversion efficiencies of 17.7% and 36.4% for Ar and Ag, respectively, occurred at lower powers between ∼100–150 TW. For identical initial target densities and laser illumination, the Ag L-shell is observed to have ≳10× higher emissivity per ion per deposited laser energy than the Ar K-shell. Although such low-density Ag targets have not yet been demonstrated, simulations of targets fabricated using atomic layer deposition of Ag on silica aerogels (∼20% by atomic fraction) suggest similar performance to atomically pure metal foams and that either fabrication technique may be worth pursuing for an efficient 3–5 keV x-ray source on NIF.

  3. The Experimental Study of Characterized Noble Gas Puffs Irradiated by Ultra-Short Laser Pulses Compared with X-Pinches as an X-Ray Source

    NASA Astrophysics Data System (ADS)

    Schultz, Kimberly Ann

    The goal of this dissertation is to study the basic physics and X-ray emission (1-10 keV) of two X-ray sources: X-pinch plasmas and a clustered gas-puff irradiated by an ultrashort laser pulse. X-pinches and other typical X-ray sources using solid targets create hot debris that can damage sensitive equipment. Therefore, to perform sensitive backlighting or X-ray effects testing, debris-free sources of radiation must be investigated. In this work, the author presents a broad study of clustered noble gas puffs including characterization measurements and laser heating experiments using several gas nozzles and multiple gases. Ultimately, the goal is to compare the laser-irradiated gas-puff and X-pinch plasmas as X-ray sources. Characterization of the gas puffs is performed at the Radiation Physics Laboratory at the University of Nevada, Reno (UNR) Physics Department using optical interferometry and Rayleigh scattering to determine density and cluster radius. By changing the gas-puff variables, control of both the density and cluster size of the gas jets is obtained. Two laser systems provide the high intensities desired for the laser-irradiated gas puff experiments: the UNR Leopard Laser (1-2×10^19 W/cm^2) and the Lawrence Livermore National Laboratory's Titan Laser (7×10^19 W/cm^2). X-ray emission is studied as a function of laser pulse parameters, gas target type, gas puff density, and the gas-delay timing between puff initiation and laser interaction with the puff. The tested gases are Ar, Kr, Xe, and four mixtures of the noble gases. Time-resolved X-ray measurements are captured with silicon diodes and photoconducting diamond detectors. Electron beam detectors include Faraday cups and a high-energy (> 1 MeV) electron spectrometer. Modeling of spectra from X-ray crystal spectrometers provides plasma density and temperature measurement and a molecular dynamics (MD) code describes cluster interactions with the laser pulse.
The conversion of laser energy into X rays is also measured. Laser beam transmission through and absorption by the gas puff reveal the complexity of using laser-irradiated gas puffs as X-ray sources. A strong anisotropy of X-ray and electron emissions was observed at both laser facilities. X-pinch plasmas can provide intense hard X rays and strong electron beams originating from small sources with many applications. Recent research has been conducted into four-wire X-pinches at the UNR Zebra machine, a 1-MA pulsed power generator. Two different wire materials are considered in this study, Ag and Mo. We observe a relatively linear correlation between load mass and implosion time for Mo X-pinches; in fact, this relationship also extends to include Ag. Interestingly, X-ray burst features drastically change in shape when the load mass is varied. Advantages of laser-irradiated gas puffs include a lack of damaging debris, a high repetition rate, and ease of control. Their disadvantages include inefficiency at converting electrical energy to X-rays, which is mostly limited by laser efficiency, and a relatively low total energy yield. X-pinches, on the other hand, produce kilojoules of energy in a broad spectral region. However, they create a large amount of debris, have a low repetition rate, and, at 1 MA, have hard-to-predict implosion times.

  4. MONTE CARLO STUDY OF THE CARDIAC ABSORBED DOSE DURING X-RAY EXAMINATION OF AN ADULT PATIENT.

    PubMed

    Kadri, O; Manai, K; Alfuraih, A

    2016-12-01

    The computational voxel phantom 'High-Definition Reference Korean-Man (HDRK-Man)' was implemented into the Monte Carlo transport toolkit Geant4. The voxel model, adjusted to the Reference Korean Man, is 171 cm in height and 68 kg in weight and is composed of ∼30 million voxels of size 1.981 × 1.981 × 2.0854 mm^3. The Geant4 code is then utilised to compute the dose conversion coefficients (DCCs), expressed in absorbed dose per air kerma free in air, for >30 tissues and organs, including almost all organs required in the new recommendation of ICRP 103, due to a broad parallel beam of monoenergetic photons impinging in the antero-posterior direction with energy ranging from 10 to 150 keV. The computed DCCs of different organs are found to be in good agreement with data published using other simulation codes. Also, the influence of patient size on DCC values was investigated for a representative body size of the adult Korean patient population. The study was performed using five different sizes covering the range of 0.8-1.2 magnification of the original HDRK-Man. It focussed on the computation of DCCs for the human heart. Moreover, the provided DCCs were used to present an analytical parameterisation for the calculation of the cardiac absorbed dose for any arbitrary X-ray spectrum and for those patient sizes. Thus, the present work can be considered an enhancement of the continuing studies performed by medical physicists as part of quality control tests and radiation protection dosimetry.
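    Applying monoenergetic DCCs to an arbitrary spectrum amounts to a spectrum-weighted average: fold the DCC curve with the (kerma-weighted) spectrum, then scale by the measured free-in-air kerma. A minimal sketch — the DCC curve and spectrum below are invented illustrative numbers, not the HDRK-Man values:

```python
import numpy as np

def organ_dose(energies, kerma_spectrum, dcc_table, total_air_kerma_gy):
    """Fold monoenergetic dose conversion coefficients (Gy per Gy air
    kerma) with an X-ray spectrum: the spectrum-averaged DCC times the
    measured free-in-air kerma gives the organ absorbed dose."""
    dcc = np.interp(energies, dcc_table[0], dcc_table[1])
    avg_dcc = np.sum(kerma_spectrum * dcc) / np.sum(kerma_spectrum)
    return avg_dcc * total_air_kerma_gy

# Hypothetical DCC curve: energy (keV) vs Gy per Gy air kerma
dcc_table = (np.array([10.0, 30.0, 60.0, 100.0, 150.0]),
             np.array([0.001, 0.05, 0.30, 0.55, 0.70]))
energies = np.linspace(20.0, 120.0, 101)
kerma_spectrum = np.exp(-((energies - 60.0) / 25.0) ** 2)  # toy spectrum
print(f"{organ_dose(energies, kerma_spectrum, dcc_table, 5e-3):.2e} Gy")
```

    The paper's analytical parameterisation of the heart DCC serves the same purpose as the interpolation table here, while also folding in the patient-size dependence.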

  5. Experimental check of bremsstrahlung dosimetry predictions for 0.75 MeV electrons

    NASA Astrophysics Data System (ADS)

    Sanford, T. W. L.; Halbleib, J. A.; Beezhold, W.

    Bremsstrahlung dose in CaF2 TLDs from the radiation produced by 0.75 MeV electrons incident on Ta/C targets is measured and compared with that calculated via the CYLTRAN Monte Carlo code. The comparison was made to validate the code, which is used to predict and analyze radiation environments of flash X-ray simulators measured by TLDs. Over a wide range of Ta target thicknesses and radiation angles, the code is found to agree with the measurements to within their 5% uncertainty. For Ta thicknesses near those that optimize the radiation output, however, the code overestimates the radiation dose at small angles; the maximum overprediction is about 14 ± 5%. The general agreement, nonetheless, gives confidence in using the code at this energy and in the TLD calibration procedure. For the bulk of the measurements, a standard TLD employing a 2.2 mm thick Al equilibrator was used. In this paper we also show that this thickness can significantly attenuate the free-field dose and introduce significant photon buildup in the equilibrator.

  6. Color coding of control room displays: the psychocartography of visual layering effects.

    PubMed

    Van Laar, Darren; Deshe, Ofer

    2007-06-01

    To evaluate which of three color coding methods (monochrome, maximally discriminable, and visual layering) used to code four types of control room display format (bars, tables, trend, mimic) was superior in two classes of task (search, compare). It has recently been shown that color coding of visual layers, as used in cartography, may be used to color code any type of information display, but this has yet to be fully evaluated. Twenty-four people took part in a 2 (task) x 3 (coding method) x 4 (format) wholly repeated measures design. The dependent variables assessed were target location reaction time, error rates, workload, and subjective feedback. Overall, the visual layers coding method produced significantly faster reaction times than did the maximally discriminable and the monochrome methods for both the search and compare tasks. No significant difference in errors was observed between conditions for either task type. Significantly less perceived workload was experienced with the visual layers coding method, which was also rated more highly than the other coding methods on a 14-item visual display quality questionnaire. The visual layers coding method is superior to other color coding methods for control room displays when the method supports the user's task. The visual layers color coding method has wide applicability to the design of all complex information displays utilizing color coding, from the most maplike (e.g., air traffic control) to the most abstract (e.g., abstracted ecological display).

  7. Hard X-ray Observation of Cygnus X-1 By the Marshall Imaging X-ray Experiment (MIXE2)

    NASA Technical Reports Server (NTRS)

    Minamitani, Takahisa; Apple, J. A.; Austin, R. A.; Dietz, K. L.; Koloziejczak, J. J.; Ramsey, B. D.; Weisskopf, M. C.

    1998-01-01

    The second generation of the Marshall Imaging X-ray Experiment (MIXE2) was flown from Fort Sumner, New Mexico on May 7-8, 1997. The experiment consists of a coded-aperture telescope with a field of view of 1.8 degrees (FWHM) and an angular resolution of 6.9 arcminutes. The detector is a large (7.84×10^4 sq cm) effective area microstrip proportional counter filled with 2.0×10^5 Pa of xenon with 2% isobutylene. We present MIXE2 observations of the 20-80 keV spectrum and timing variability of Cygnus X-1 made during the balloon flight.

  8. Effective increase in beam emittance by phase-space expansion using asymmetric Bragg diffraction.

    PubMed

    Chu, Chia-Hung; Tang, Mau-Tsu; Chang, Shih-Lin

    2015-08-24

    We propose an innovative method to extend the utilization of the phase space downstream of a synchrotron light source for X-ray transmission microscopy. Based on the dynamical theory of X-ray diffraction, asymmetrically cut perfect crystals are applied to reshape the position-angle-wavelength space of the light source, by which the usable phase space of the source can be magnified by over one hundred times, thereby "phase-space-matching" the source with the objective lens of the microscope. The method's validity is confirmed using SHADOW code simulations, and aberration through an optical lens such as a Fresnel zone plate is examined via matrix optics for nano-resolution X-ray images.

  9. Radiation production and absorption in human spacecraft shielding systems under high charge and energy Galactic Cosmic Rays: Material medium, shielding depth, and byproduct aspects

    NASA Astrophysics Data System (ADS)

    Barthel, Joseph; Sarigul-Klijn, Nesrin

    2018-03-01

    Deep space missions such as the planned 2025 mission to asteroids require spacecraft shields to protect electronics and humans from adverse effects caused by the space radiation environment, primarily Galactic Cosmic Rays. This paper first reviews the theory of how these rays of charged particles interact with matter, and then presents a simulation for a 500-day Mars flyby mission using a deterministic-based computer code. High-density polyethylene and aluminum shielding materials at solar minimum are considered. Plots of effective dose with varying shield depth, charged particle flux, and dose in silicon and human tissue behind shielding are presented.

  10. Monte Carlo Simulations of Background Spectra in Integral Imager Detectors

    NASA Technical Reports Server (NTRS)

    Armstrong, T. W.; Colborn, B. L.; Dietz, K. L.; Ramsey, B. D.; Weisskopf, M. C.

    1998-01-01

    Predictions of the expected gamma-ray backgrounds in the ISGRI (CdTe) and PICsIT (CsI) detectors on INTEGRAL due to cosmic-ray interactions and the diffuse gamma-ray background have been made using a coupled set of Monte Carlo radiation transport codes (HETC, FLUKA, EGS4, and MORSE) and a detailed, 3-D mass model of the spacecraft and detector assemblies. The simulations include both the prompt background component from induced hadronic and electromagnetic cascades and the delayed component due to emissions from induced radioactivity. Background spectra have been obtained with and without the use of active (BGO) shielding and charged particle rejection to evaluate the effectiveness of anticoincidence counting on background rejection.

  11. Space-Shuttle Emulator Software

    NASA Technical Reports Server (NTRS)

    Arnold, Scott; Askew, Bill; Barry, Matthew R.; Leigh, Agnes; Mermelstein, Scott; Owens, James; Payne, Dan; Pemble, Jim; Sollinger, John; Thompson, Hiram; et al.

    2007-01-01

    A package of software has been developed to execute a raw binary image of the space shuttle flight software for simulation of the computational effects of operation of space shuttle avionics. This software can be run on inexpensive computer workstations. Heretofore, it was necessary to use real flight computers to perform such tests and simulations. The package includes a program that emulates the space shuttle orbiter general-purpose computer [consisting of a central processing unit (CPU), input/output processor (IOP), master sequence controller, and bus-control elements]; an emulator of the orbiter display electronics unit and models of the associated cathode-ray tubes, keyboards, and switch controls; computational models of the data-bus network; computational models of the multiplexer-demultiplexer components; an emulation of the pulse-code modulation master unit; an emulation of the payload data interleaver; a model of the master timing unit; a model of the mass memory unit; and a software component that ensures compatibility of telemetry and command services between the simulated space shuttle avionics and a mission control center. The software package is portable to several host platforms.

  12. Testing cosmic ray acceleration with radio relics: a high-resolution study using MHD and tracers

    NASA Astrophysics Data System (ADS)

    Wittor, D.; Vazza, F.; Brüggen, M.

    2017-02-01

    Weak shocks in the intracluster medium may accelerate cosmic-ray protons and cosmic-ray electrons differently depending on the angle between the upstream magnetic field and the shock normal. In this work, we investigate how shock obliquity affects the production of cosmic rays in high-resolution simulations of galaxy clusters. For this purpose, we performed a magnetohydrodynamical simulation of a galaxy cluster using the mesh refinement code ENZO. We use Lagrangian tracers to follow the properties of the thermal gas, the cosmic rays and the magnetic fields over time. We tested a number of different acceleration scenarios by varying the obliquity-dependent acceleration efficiencies of protons and electrons, and by examining the resulting hadronic γ-ray and radio emission. We find that the radio emission does not change significantly if only quasi-perpendicular shocks are able to accelerate cosmic-ray electrons. Our analysis suggests that radio-emitting electrons found in relics have been typically shocked many times before z = 0. On the other hand, the hadronic γ-ray emission from clusters is found to decrease significantly if only quasi-parallel shocks are allowed to accelerate cosmic ray protons. This might reduce the tension with the low upper limits on γ-ray emission from clusters set by the Fermi satellite.
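    The obliquity criterion at the heart of this study — the angle between the upstream magnetic field and the shock normal deciding which species a shock accelerates — is easy to make concrete. The 45-degree quasi-parallel/quasi-perpendicular split below is a common illustrative convention, not necessarily the exact threshold used in the paper:

```python
import numpy as np

def obliquity_deg(b_field, shock_normal):
    """Angle between the upstream magnetic field and the shock normal."""
    cosang = abs(np.dot(b_field, shock_normal)) / (
        np.linalg.norm(b_field) * np.linalg.norm(shock_normal))
    return np.degrees(np.arccos(np.clip(cosang, 0.0, 1.0)))

def accelerates(theta_deg, species):
    """Toy obliquity switch of the kind tested in the paper: electrons at
    quasi-perpendicular shocks, protons at quasi-parallel ones."""
    if species == "electron":
        return theta_deg > 45.0
    return theta_deg < 45.0      # proton

b = np.array([1.0, 2.0, 0.0])    # upstream magnetic field (arbitrary units)
n = np.array([1.0, 0.0, 0.0])    # shock normal
theta = obliquity_deg(b, n)
print(theta, accelerates(theta, "electron"), accelerates(theta, "proton"))
```

    Evaluating this angle along Lagrangian tracers at each shock crossing, and switching the injection efficiency on or off per species, is essentially the experiment whose radio and γ-ray consequences the paper quantifies.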

  13. X-ray Radiative Transfer in Protoplanetary Disks with ProDiMo

    NASA Astrophysics Data System (ADS)

    Rab, Christian; Woitke, Peter; Güdel, Manuel; Min, Michiel; Diana Team

    2013-07-01

    X-ray emission is a common property of YSOs. T Tauri stars show X-ray luminosities up to 10^32 erg/s, and Herbig Ae/Be stars can also show moderate X-ray emission in the range of 10^28 to 10^31 erg/s. We want to investigate the impact of X-ray radiation on the thermal and chemical structure of protoplanetary discs around these YSOs. Therefore, we have added a new X-ray radiative transfer module to the radiation thermo-chemical code ProDiMo (Protoplanetary Disc Modeling), extending the existing X-ray chemistry implementation by Aresu et al. This new module considers gas and dust opacities (including scattering) and a possible X-ray background field. Furthermore, we added a new set of FUV photoreactions to the X-ray chemistry module of ProDiMo, since fast electrons created by X-ray ionisation can produce a significant secondary FUV radiation field by exciting atomic or molecular hydrogen. We discuss the importance of these processes for the thermal and chemical structure of the protoplanetary disc, and present them on the basis of a typical T Tauri disc model. This work is performed in the context of the EU FP7 project DIANA (www.diana-project.com).

  14. Improved EOS for Describing High-Temperature Off-Hugoniot States in Epoxy

    NASA Astrophysics Data System (ADS)

    Mulford, R. N.; Lanier, N. E.; Swift, D.; Workman, J.; Graham, Peter; Moore, Alastair

    2007-06-01

    Modeling of off-Hugoniot states in an expanding interface subjected to a shock reveals the importance of a chemically complete description of the materials. Hydrodynamic experiments typically rely on pre-shot target characterization to predict how initial perturbations will affect the late-time hydrodynamic mixing. However, it is the condition of these perturbations at the time of shock arrival that dominates their eventual late-time evolution. In some cases these perturbations are heated prior to the arrival of the main shock. Correctly modeling how temperature and density gradients will develop in the pre-heated material requires an understanding of the equation-of-state. In the experiment modeled, an epoxy/foam layered package was subjected to tin L-shell radiation, producing an expanding assembly at a well-defined temperature. This assembly was then subjected to a controlled shock, and the evolution of the epoxy-foam interface imaged with x-ray radiography. Modeling of the data with the hydrodynamics code RAGE was unsuccessful under certain shock conditions, unless condensation of chemical species from the plasma is explicitly included. The EOS code CHEETAH was used to prepare suitable EOS for input into the hydrodynamics modeling.

  15. Improved EOS for Describing High-Temperature Off-Hugoniot States in Epoxy

    NASA Astrophysics Data System (ADS)

    Mulford, R. N.; Swift, D. C.; Lanier, N. E.; Workman, J.; Holmes, R. L.; Graham, P.; Moore, A.

    2007-12-01

    Modelling of off-Hugoniot states in an expanding interface subjected to a shock reveals the importance of a chemically complete description of the materials. Hydrodynamic experiments typically rely on pre-shot target characterization to predict how initial perturbations will affect the late-time hydrodynamic mixing. However, it is the condition of these perturbations at the time of shock arrival that dominates their eventual late-time evolution. In some cases these perturbations are heated prior to the arrival of the main shock. Correctly modelling how temperature and density gradients will develop in the pre-heated material requires an understanding of the equation-of-state. In the experiment modelled, an epoxy/foam layered package was subjected to tin L-shell radiation, producing an expanding assembly at a well-defined temperature. This assembly was then subjected to a controlled shock, and the evolution of the epoxy-foam interface imaged with x-ray radiography. Modelling of the data with the hydrodynamics code RAGE was unsuccessful under certain shock conditions, unless condensation of chemical species from the plasma is explicitly included. The EOS code Cheetah was used to prepare suitable EOS for input into the hydrodynamics modelling.

  16. Quasi-linear modeling of lower hybrid current drive in ITER and DEMO

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cardinali, A., E-mail: alessandro.cardinali@enea.it; Cesario, R.; Panaccione, L.

    2015-12-10

    First-pass absorption of the Lower Hybrid waves in thermonuclear devices like ITER and DEMO is modeled by coupling the ray tracing equations with the quasi-linear evolution of the electron distribution function in 2D velocity space. As usually assumed, Lower Hybrid Current Drive is not effective in the plasma of a tokamak fusion reactor, owing to the accessibility condition, which, depending on the density, restricts the parallel wavenumber to values greater than n∥crit, and, at the same time, to the high electron temperature, which enhances the wave absorption and restricts the RF power deposition to the very periphery of the plasma column (near the separatrix). In this work, by extensively using the "ray*" code, a very accurate parametric study of the propagation and absorption of the LH wave as a function of the coupled wave spectrum (its width and peak value) has been performed. This careful investigation aims at keeping the power deposition layer within the outer half radius of the plasma, thus providing a valuable aid to controlling the plasma current profile in a toroidal magnetic configuration and helping to suppress MHD modes that can develop in the outer part of the plasma. This analysis is useful not only for exploring the possibility of profile control in a pulsed-operation reactor and for tearing mode stabilization, but also for reconsidering the feasibility of a steady-state regime for DEMO.
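    The accessibility condition mentioned above sets a critical parallel refractive index below which the LH wave cannot penetrate to a given density. As a rough illustration only, a commonly quoted cold-plasma (Golant-type) form of the criterion can be evaluated as follows; both the formula choice and the plasma parameters are assumptions for illustration, not values from this record:

```python
import math

EPS0 = 8.8541878128e-12   # vacuum permittivity, F/m
QE   = 1.602176634e-19    # elementary charge, C
ME   = 9.1093837015e-31   # electron mass, kg

def n_par_accessibility(n_e, b_t):
    """Critical parallel refractive index for LH-wave accessibility, using the
    commonly quoted cold-plasma form
        n_par_crit = w_pe/w_ce + sqrt(1 + (w_pe/w_ce)**2),
    evaluated at electron density n_e (m^-3) and magnetic field b_t (T)."""
    w_pe = math.sqrt(n_e * QE**2 / (EPS0 * ME))  # electron plasma frequency
    w_ce = QE * b_t / ME                          # electron cyclotron frequency
    r = w_pe / w_ce
    return r + math.sqrt(1.0 + r * r)

# Illustrative (hypothetical) core parameters: n_e = 1e20 m^-3, B = 5.3 T
print(round(n_par_accessibility(1e20, 5.3), 2))  # critical n_par for these assumed inputs
```

    Higher density raises the critical n∥, which is why the launched spectrum must sit above n∥crit for the wave to reach the core, at the cost of weaker current-drive efficiency.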

  17. Searching for evidence of quasi-periodic pulsations in solar flares using the AFINO code

    NASA Astrophysics Data System (ADS)

    Inglis, Andrew; Ireland, Jack; Dennis, Brian R.; Hayes, Laura Ann; Gallagher, Peter T.

    2017-08-01

    The AFINO (Automated Flare Inference of Oscillations) code is a new tool to allow analysis of temporal solar data in search of oscillatory signatures. Using AFINO, we carry out a large-scale search for evidence of signals consistent with quasi-periodic pulsations (QPP) in solar flares, focusing on the 1-300 s timescale. We analyze 675 M- and X-class flares observed by GOES in 1-8 Å soft X-rays between 2011 February 1 and 2015 December 31. Additionally, over the same interval we analyze Fermi/GBM 15-25 keV X-ray data for each of these flares associated with a GBM solar flare trigger, a total of 261 events. Using a model comparison method and the Bayesian Information Criterion statistic, we determine whether there is evidence for a substantial enhancement in the Fourier power spectrum that may be consistent with a QPP-like signature. Quasi-periodic signatures appear more prevalently in thermal soft X-ray data than in the counterpart hard X-ray emission: according to AFINO, ~30% of GOES flares but only ~8% of the same flares observed by GBM show strong signatures consistent with classical interpretations of QPP, which include MHD wave processes and oscillatory reconnection events. For both data sets, preferred characteristic timescales of ~5-30 s were found in the QPP-like events, with no clear dependence on flare magnitude. Individual events in the sample also show similar characteristic timescales in both GBM and GOES data sets, indicating that the same phenomenon is sometimes observed simultaneously in soft and hard X-rays. We discuss the implications of these survey results, and future developments of the analysis method. AFINO continues to run daily on new flares observed by GOES, and the full AFINO catalogue is made available online.
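    AFINO's model-comparison step can be caricatured as fitting a plain power law and a power law plus a localized bump to the flare's Fourier power spectrum, then differencing their BIC values. The toy version below uses a Gaussian least-squares fit in log space and a grid-searched bump, both simplifications of the published method, so the numbers are illustrative only:

```python
import numpy as np

def bic(n, k, rss):
    """BIC for a Gaussian least-squares fit: n*ln(rss/n) + k*ln(n)."""
    return n * np.log(rss / n) + k * np.log(n)

def compare_models(freq, power):
    """Fit (0) a straight line and (1) a line plus Gaussian bump to the
    log power spectrum. Returns delta_BIC = BIC0 - BIC1; a large positive
    value favours the bump model, i.e. a candidate QPP-like signature."""
    x, y = np.log(freq), np.log(power)
    n = len(x)

    # Model 0: power law (straight line in log-log space, 2 parameters)
    coef0 = np.polyfit(x, y, 1)
    rss0 = float(np.sum((y - np.polyval(coef0, x)) ** 2))

    # Model 1: line + Gaussian bump; crude grid over bump centre/width (5 params)
    best_rss1 = rss0
    for mu in np.linspace(x.min(), x.max(), 40):
        for sig in (0.1, 0.2, 0.4):
            g = np.exp(-0.5 * ((x - mu) / sig) ** 2)
            A = np.column_stack([x, np.ones(n), g])
            beta, *_ = np.linalg.lstsq(A, y, rcond=None)
            best_rss1 = min(best_rss1, float(np.sum((y - A @ beta) ** 2)))

    return bic(n, 2, rss0) - bic(n, 5, best_rss1)

# Synthetic spectrum with an injected bump near f = 0.1 Hz (period ~10 s)
rng = np.random.default_rng(0)
f = np.linspace(0.005, 0.5, 200)
p = f**-2 * (1 + 3 * np.exp(-0.5 * ((np.log(f) - np.log(0.1)) / 0.2) ** 2))
p *= np.exp(rng.normal(0, 0.1, f.size))
print(compare_models(f, p))  # large positive delta favours the bump model
```

    The published method also accounts for the exponential statistics of periodogram powers; the least-squares likelihood here is only a stand-in for the idea of penalized model comparison.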

  18. Technical Note: spektr 3.0-A computational tool for x-ray spectrum modeling and analysis.

    PubMed

    Punnoose, J; Xu, J; Sisniega, A; Zbijewski, W; Siewerdsen, J H

    2016-08-01

    A computational toolkit (spektr 3.0) has been developed to calculate x-ray spectra based on the tungsten anode spectral model using interpolating cubic splines (TASMICS) algorithm, updating previous work based on the tungsten anode spectral model using interpolating polynomials (TASMIP). The toolkit includes a MATLAB (The MathWorks, Natick, MA) function library and improved user interface (UI) along with an optimization algorithm to match calculated beam quality with measurements. The spektr code generates x-ray spectra (photons/mm^2/mAs at 100 cm from the source) using TASMICS as default (with TASMIP as an option) in 1 keV energy bins over beam energies of 20-150 kV, extensible to 640 kV using the TASMICS spectra. An optimization tool was implemented to compute the added filtration (Al and W) that provides a best match between calculated and measured x-ray tube output (mGy/mAs or mR/mAs) for individual x-ray tubes that may differ from that assumed in TASMICS or TASMIP and to account for factors such as anode angle. The median percent difference in photon counts for a TASMICS and TASMIP spectrum was 4.15% for tube potentials in the range 30-140 kV, with the largest percentage difference arising in the low and high energy bins due to measurement errors in the empirically based TASMIP model and inaccurate polynomial fitting. The optimization tool reported a close agreement between measured and calculated spectra with a Pearson coefficient of 0.98. The computational toolkit, spektr, has been updated to version 3.0, validated against measurements and existing models, and made available as open source code. Video tutorials for the spektr function library, UI, and optimization tool are available.
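    The filtration-matching optimization described above amounts to solving for an added-filtration thickness whose Beer-Lambert-attenuated spectrum reproduces the measured tube output. A toy sketch of that inverse problem, with a hypothetical two-bin spectrum and assumed attenuation coefficients (none of these numbers come from spektr or TASMICS):

```python
import numpy as np

# Hypothetical two-bin spectrum and Al attenuation coefficients -- illustrative only.
energies = np.array([40.0, 80.0])      # keV
fluence  = np.array([1.0e6, 5.0e5])    # photons/mm^2/mAs (assumed)
mu_al    = np.array([0.15, 0.05])      # 1/mm (assumed)

def output(t_mm):
    """Energy-weighted output after t_mm of Al filtration (arbitrary units),
    a stand-in for the mGy/mAs tube-output metric that spektr matches."""
    return float(np.sum(energies * fluence * np.exp(-mu_al * t_mm)))

def fit_filtration(measured, lo=0.0, hi=50.0, tol=1e-6):
    """Bisect on Al thickness until the modelled output matches `measured`;
    output decreases monotonically with thickness, so bisection suffices."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if output(mid) > measured:
            lo = mid   # too thin: output still too high
        else:
            hi = mid
    return 0.5 * (lo + hi)

true_t = 2.5                             # mm, pretend inherent tube filtration
t_fit = fit_filtration(output(true_t))
print(round(t_fit, 3))                   # recovers the assumed 2.5 mm
```

    The real tool fits both Al and W thicknesses against measured exposure, which turns this into a small multi-parameter optimization rather than a one-dimensional root find.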

  19. Controlling Energy Radiations of Electromagnetic Waves via Frequency Coding Metamaterials.

    PubMed

    Wu, Haotian; Liu, Shuo; Wan, Xiang; Zhang, Lei; Wang, Dan; Li, Lianlin; Cui, Tie Jun

    2017-09-01

    Metamaterials are artificial structures composed of subwavelength unit cells to control electromagnetic (EM) waves. The spatial coding representation of a metamaterial makes it possible to describe the material in a digital way. Spatial coding metamaterials are typically constructed from unit cells that have similar shapes and fixed functionality. Here, the concept of frequency coding metamaterial is proposed, which achieves different control of EM energy radiation with a fixed spatial coding pattern as the frequency changes. In this case, not only the phase responses of the unit cells but also their phase sensitivities must be considered. Due to the different frequency sensitivities of the unit cells, two units with the same phase response at the initial frequency may have different phase responses at a higher frequency. To describe the frequency coding property of a unit cell, digitalized frequency sensitivity is proposed, in which the units are encoded with digits "0" and "1" to represent low and high phase sensitivities, respectively. In this way, two degrees of freedom, spatial coding and frequency coding, are obtained to control the EM energy radiation by a new class of frequency-spatial coding metamaterials. The above concepts and physical phenomena are confirmed by numerical simulations and experiments.
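    The effect of a 1-bit spatial coding pattern on the radiated energy can be illustrated with a simple array-factor calculation in which digits "0" and "1" contribute 0 and π reflection phases. This is a generic sketch, not the paper's simulation setup; the element spacing and the example codes are illustrative assumptions:

```python
import numpy as np

def array_factor(coding, d_over_lambda, theta_deg):
    """Far-field array factor magnitude of a 1-D 1-bit coding metasurface:
    digit '0' radiates with phase 0, digit '1' with phase pi."""
    phases = np.pi * np.array([int(c) for c in coding])
    n = np.arange(len(phases))
    theta = np.radians(np.asarray(theta_deg, dtype=float))
    # per-element phase: k*d*n*sin(theta) + coded phase
    psi = 2 * np.pi * d_over_lambda * np.outer(np.sin(theta), n) + phases
    return np.abs(np.exp(1j * psi).sum(axis=1))

theta = np.linspace(-90, 90, 721)
af_uniform = array_factor("00000000", 0.5, theta)  # all-'0' code: single broadside beam
af_coded   = array_factor("00110011", 0.5, theta)  # '0011' supercell: two symmetric beams

print(theta[np.argmax(af_uniform)])                # broadside peak at 0 degrees
print(abs(theta[np.argmax(af_coded)]))             # split beams near +/-30 degrees
```

    For the "0011…" code the coding supercell has period 2λ, so the π phase grating steers the scattered energy into the ±1 orders at sinθ = ±0.5 while suppressing the specular beam; changing the code redistributes the energy without changing the physical structure, which is the digital-control idea of the abstract.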

  20. Investigation of the hard x-ray background in backlit pinhole imagers.

    PubMed

    Fein, J R; Peebles, J L; Keiter, P A; Holloway, J P; Klein, S R; Kuranz, C C; Manuel, M J-E; Drake, R P

    2014-11-01

    Hard x-rays from laser-produced hot electrons (>10 keV) in backlit pinhole imagers can give rise to a background signal that decreases signal dynamic range in radiographs. Consequently, significant uncertainties are introduced to the measured optical depth of imaged plasmas. Past experiments have demonstrated that hard x-rays are produced when hot electrons interact with the high-Z pinhole substrate used to collimate the softer He-α x-ray source. Results are presented from recent experiments performed on the OMEGA-60 laser to further study the production of hard x-rays in the pinhole substrate and how these x-rays contribute to the background signal in radiographs. Radiographic image plates measured hard x-rays from pinhole imagers with Mo, Sn, and Ta pinhole substrates. The variation in background signal between pinhole substrates provides evidence that much of this background comes from x-rays produced in the pinhole substrate itself. A Monte Carlo electron transport code was used to model x-ray production from hot electrons interacting in the pinhole substrate, as well as to model measurements of x-rays from the irradiated side of the targets, recorded by a bremsstrahlung x-ray spectrometer. Inconsistencies in inferred hot electron distributions between the different pinhole substrate materials demonstrate that additional sources of hot electrons beyond those modeled may produce hard x-rays in the pinhole substrate.

  2. Propagation Effects of Wind and Temperature on Acoustic Ground Contour Levels

    NASA Technical Reports Server (NTRS)

    Heath, Stephanie L.; McAninch, Gerry L.

    2006-01-01

    Propagation characteristics for varying wind and temperature atmospheric conditions are identified using physically limiting propagation angles to define shadow-boundary regions. These angles are graphically illustrated for various wind and temperature cases using a newly developed ray-tracing propagation code.

  3. The estimation of background production by cosmic rays in high-energy gamma ray telescopes

    NASA Technical Reports Server (NTRS)

    Edwards, H. L.; Nolan, P. L.; Lin, Y. C.; Koch, D. G.; Bertsch, D. L.; Fichtel, C. E.; Hartman, R. C.; Hunter, S. D.; Kniffen, D. A.; Hughes, E. B.

    1991-01-01

    A calculational method for estimating instrumental background in high-energy gamma-ray telescopes, using the hadronic Monte Carlo code FLUKA87, is presented. The method is applied to the SAS-2 and EGRET telescope designs and is also used to explore the level of background to be expected for alternative configurations of the proposed GRITS telescope, which adapts the external fuel tank of a Space Shuttle as a gamma-ray telescope with a very large collecting area. The background produced in proton-beam tests of EGRET is much less than the predicted level. This discrepancy appears to be due to the inability of FLUKA87 to transport evaporation nucleons. It is predicted that the background in EGRET will be no more than 4-10 percent of the extragalactic diffuse gamma radiation.

  4. Simulation, optimization and testing of a novel high spatial resolution X-ray imager based on Zinc Oxide nanowires in Anodic Aluminium Oxide membrane using Geant4

    NASA Astrophysics Data System (ADS)

    Esfandi, F.; Saramad, S.

    2015-07-01

    In this work, a new generation of scintillator-based X-ray imagers using ZnO nanowires in an Anodized Aluminum Oxide (AAO) nanoporous template is characterized. The optical response of ordered ZnO nanowire arrays in a porous AAO template under low-energy X-ray illumination is simulated with the Geant4 Monte Carlo code and compared with experimental results. The results show that for 10 keV X-ray photons, by accounting for the light-guiding properties of zinc oxide inside the AAO template and with a suitable choice of detector thickness and pore diameter, a spatial resolution of less than one micrometer and a detection efficiency of 66% are achievable. This novel nanoscintillator detector may offer many advantages for medical applications in the future.

  5. In situ X-ray probing reveals fingerprints of surface platinum oxide.

    PubMed

    Friebel, Daniel; Miller, Daniel J; O'Grady, Christopher P; Anniyev, Toyli; Bargar, John; Bergmann, Uwe; Ogasawara, Hirohito; Wikfeldt, Kjartan Thor; Pettersson, Lars G M; Nilsson, Anders

    2011-01-07

    In situ X-ray absorption spectroscopy (XAS) at the Pt L3 edge is a useful probe for Pt-O interactions at polymer electrolyte membrane fuel cell (PEMFC) cathodes. We show that XAS using the high energy resolution fluorescence detection (HERFD) mode, applied to a well-defined monolayer Pt/Rh(111) sample where the bulk penetrating hard X-rays probe only surface Pt atoms, provides a unique sensitivity to structure and chemical bonding at the Pt-electrolyte interface. Ab initio multiple-scattering calculations using the FEFF code and complementary extended X-ray absorption fine structure (EXAFS) results indicate that the commonly observed large increase of the white-line at high electrochemical potentials on PEMFC cathodes originates from platinum oxide formation, whereas previously proposed chemisorbed oxygen-containing species merely give rise to subtle spectral changes.

  6. Multiwavelength and Statistical Research in Space Astrophysics

    NASA Technical Reports Server (NTRS)

    Feigelson, Eric D.

    1997-01-01

    The accomplishments in the following three research areas are summarized: multiwavelength study of active galactic nuclei; magnetic activity of young stellar objects; and statistical methodology for astronomical data analysis. The research is largely based on observations of the ROSAT and ASCA X-ray observatories, complemented by ground-based optical and radio studies. Major findings include: discovery of inverse Compton X-ray emission from radio galaxy lobes; creation of the largest and least biased available sample of BL Lac objects; characterization of X-ray and nonthermal radio emission from T Tauri stars; obtaining an improved census of young stars in a star forming region and modeling the star formation history and kinematics; discovery of X-ray emission from protostars; development of linear regression methods and codes for interpreting astronomical data; and organization of the first cross-disciplinary conferences for astronomers and statisticians.

  7. Performance of the x-ray free-electron laser oscillator with crystal cavity

    NASA Astrophysics Data System (ADS)

    Lindberg, R. R.; Kim, K.-J.; Shvyd'Ko, Yu.; Fawley, W. M.

    2011-01-01

    Simulations of the x-ray free-electron laser (FEL) oscillator are presented that include the frequency-dependent Bragg crystal reflectivity and the transverse diffraction and focusing using the two-dimensional FEL code GINGER. A review of the physics of Bragg crystal reflectors and the x-ray FEL oscillator is made, followed by a discussion of its numerical implementation in GINGER. The simulation results for a two-crystal cavity and realistic FEL parameters indicate ~10^9 photons in a nearly Fourier-limited, ps pulse. Compressing the electron beam to 100 A and 100 fs results in comparable x-ray characteristics for relaxed beam emittance, energy spread, and/or undulator parameters, albeit in a larger radiation bandwidth. Finally, preliminary simulation results indicate that the four-crystal FEL cavity can be tuned in energy over a range of a few percent.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Binotti, M.; Zhu, G.; Gray, A.

    An analytical approach, as an extension of one newly developed method -- First-principle OPTical Intercept Calculation (FirstOPTIC) -- is proposed to treat the geometrical impact of three-dimensional (3-D) effects on parabolic trough optical performance. The mathematical steps of this analytical approach are presented and implemented numerically as part of the suite of FirstOPTIC code. In addition, the new code has been carefully validated against ray-tracing simulation results and available numerical solutions. This new analytical approach to treating 3-D effects will facilitate further understanding and analysis of the optical performance of trough collectors as a function of incidence angle.

  9. Single Event Upset Rate Estimates for a 16-K CMOS (Complementary Metal Oxide Semiconductor) SRAM (Static Random Access Memory).

    DTIC Science & Technology

    1986-09-30

    TABLES: I. SA32~40 Single Event Upset Test, 1140-MeV Krypton, 9/18/84; II. CRUP Simulation... The cosmic ray interaction analyses described in the remainder of this report were calculated using the CRUP computer code modified for funneling. The... CRUP code requires, as inputs, the size of the depletion region, specified as a rectangular parallelepiped with dimensions a × b × c, and the effective funnel

  10. Skyshine radiation from a pressurized water reactor containment dome

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peng, W.H.

    1986-06-01

    The radiation dose rates resulting from airborne activity inside a post-accident pressurized water reactor containment are calculated by a combined discrete ordinates/Monte Carlo method. The calculated total dose rates and the skyshine component are presented as functions of distance from the containment at three elevations for various gamma-ray source energies. The one-dimensional discrete ordinates code ANISN is used to approximate the skyshine dose rates from the hemispherical dome, and the results compare favorably with more rigorous results calculated by a three-dimensional Monte Carlo code.

  11. A Radiation Shielding Code for Spacecraft and Its Validation

    NASA Technical Reports Server (NTRS)

    Shinn, J. L.; Cucinotta, F. A.; Singleterry, R. C.; Wilson, J. W.; Badavi, F. F.; Badhwar, G. D.; Miller, J.; Zeitlin, C.; Heilbronn, L.; Tripathi, R. K.

    2000-01-01

    The HZETRN code, which uses a deterministic approach pioneered at NASA Langley Research Center, has been developed over the past decade to evaluate the local radiation fields within sensitive materials (electronic devices and human tissue) on spacecraft in the space environment. The code describes the interactions of shield materials with the incident galactic cosmic rays, trapped protons, or energetic protons from solar particle events in free space and low Earth orbit. The content of incident radiations is modified by atomic and nuclear reactions with the spacecraft and radiation shield materials. High-energy heavy ions are fragmented into less massive reaction products, and reaction products are produced by direct knockout of shield constituents or from de-excitation products. An overview of the computational procedures and database which describe these interactions is given. Validation of the code with recent Monte Carlo benchmarks and with laboratory and flight measurements is also included.

  12. Simulations of Laboratory Astrophysics Experiments using the CRASH code

    NASA Astrophysics Data System (ADS)

    Trantham, Matthew; Kuranz, Carolyn; Manuel, Mario; Keiter, Paul; Drake, R. P.

    2014-10-01

    Computer simulations can assist in the design and analysis of laboratory astrophysics experiments. The Center for Radiative Shock Hydrodynamics (CRASH) at the University of Michigan developed a code that has been used to design and analyze high-energy-density experiments on OMEGA, NIF, and other large laser facilities. This Eulerian code uses block-adaptive mesh refinement (AMR) with implicit multigroup radiation transport, electron heat conduction and laser ray tracing. This poster/talk will demonstrate some of the experiments the CRASH code has helped design or analyze including: Kelvin-Helmholtz, Rayleigh-Taylor, imploding bubbles, and interacting jet experiments. This work is funded by the Predictive Sciences Academic Alliances Program in NNSA-ASC via Grant DEFC52-08NA28616, by the NNSA-DS and SC-OFES Joint Program in High-Energy-Density Laboratory Plasmas, Grant Number DE-NA0001840, and by the National Laser User Facility Program, Grant Number DE-NA0000850.

  13. High-resolution coded-aperture design for compressive X-ray tomography using low resolution detectors

    NASA Astrophysics Data System (ADS)

    Mojica, Edson; Pertuz, Said; Arguello, Henry

    2017-12-01

    One of the main challenges in Computed Tomography (CT) is obtaining accurate reconstructions of the imaged object while keeping a low radiation dose in the acquisition process. In order to solve this problem, several researchers have proposed the use of compressed sensing for reducing the amount of measurements required to perform CT. This paper tackles the problem of designing high-resolution coded apertures for compressed sensing computed tomography. In contrast to previous approaches, we aim at designing apertures to be used with low-resolution detectors in order to achieve super-resolution. The proposed method iteratively improves random coded apertures using a gradient descent algorithm subject to constraints in the coherence and homogeneity of the compressive sensing matrix induced by the coded aperture. Experiments with different test sets show consistent results for different transmittances, number of shots and super-resolution factors.
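    The design loop described above, improving a random coded aperture subject to constraints on the sensing matrix, can be caricatured with greedy single-entry flips that are kept only when the mutual coherence decreases. This toy stand-in replaces the paper's gradient-descent update and omits the homogeneity constraint entirely:

```python
import numpy as np

def mutual_coherence(A):
    """Maximum absolute normalized inner product between distinct columns of
    the sensing matrix -- the quantity the aperture design seeks to reduce."""
    norms = np.linalg.norm(A, axis=0, keepdims=True)
    An = A / np.maximum(norms, 1e-12)
    G = np.abs(An.T @ An)
    np.fill_diagonal(G, 0.0)
    return float(G.max())

def improve_aperture(A0, n_iters=200, seed=0):
    """Flip one binary aperture entry at a time, keeping the flip only if the
    mutual coherence of the induced sensing matrix decreases."""
    rng = np.random.default_rng(seed)
    A = A0.copy()
    best = mutual_coherence(A)
    for _ in range(n_iters):
        i, j = rng.integers(A.shape[0]), rng.integers(A.shape[1])
        A[i, j] = 1 - A[i, j]
        c = mutual_coherence(A)
        if c < best:
            best = c
        else:
            A[i, j] = 1 - A[i, j]  # revert the flip
    return A, best

rng = np.random.default_rng(1)
A0 = rng.integers(0, 2, size=(16, 32)).astype(float)  # toy 16x32 sensing matrix
A1, c1 = improve_aperture(A0)
print(mutual_coherence(A0), c1)  # coherence does not increase
```

    In the actual CT setting the sensing matrix is induced by the aperture pattern, the system geometry, and the detector downsampling, so each candidate update is far more expensive to evaluate than this direct column comparison.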

  14. The simulations of indirect-drive targets for ignition on megajoule lasers.

    NASA Astrophysics Data System (ADS)

    Lykov, Vladimir; Andreev, Eugene; Ardasheva, Ludmila; Avramenko, Michael; Chernyakov, Valerian; Chizhkov, Maxim; Karlykhanov, Nikalai; Kozmanov, Michael; Lebedev, Serge; Rykovanov, George; Seleznev, Vladimir; Sokolov, Lev; Timakova, Margaret; Shestakov, Alexander; Shushlebin, Aleksander

    2013-10-01

    The calculations were performed with radiation hydrodynamic codes developed at RFNC-VNIITF. An analysis of published calculations of indirect-drive targets for ignition on the NIF and LMJ lasers has shown that these targets have very low ignition margins: according to 1D-ERA code calculations, ignition fails if the thermonuclear reaction rate is reduced by even less than a factor of 2. The purpose of the new calculations is to search for indirect-drive targets with increased ignition margins. The calculations of target compression and thermonuclear burn are carried out for the X-ray flux asymmetry obtained in simulations of a Rugby hohlraum performed with the 2D-SINARA code. The requirements on target fabrication accuracy and irradiation symmetry were studied with the 2D-TIGR-OMEGA-3T code. This research is motivated by the construction of a megajoule laser in Russia.

  15. Parallelizing serial code for a distributed processing environment with an application to high frequency electromagnetic scattering

    NASA Astrophysics Data System (ADS)

    Work, Paul R.

    1991-12-01

    This thesis investigates the parallelization of existing serial programs in computational electromagnetics for use in a parallel environment. Existing algorithms for calculating the radar cross section of an object are covered, and a ray-tracing code is chosen for implementation on a parallel machine. Current parallel architectures are introduced and a suitable parallel machine is selected for the implementation of the chosen ray-tracing algorithm. The standard techniques for the parallelization of serial codes are discussed, including load balancing and decomposition considerations, and appropriate methods for the parallelization effort are selected. A load balancing algorithm is modified to increase the efficiency of the application, and a high level design of the structure of the serial program is presented. A detailed design of the modifications for the parallel implementation is also included, with both the high level and the detailed design specified in a high level design language called UNITY. The correctness of the design is proven using UNITY and standard logic operations. The theoretical and empirical results show that it is possible to achieve an efficient parallel application for a serial computational electromagnetic program where the characteristics of the algorithm and the target architecture critically influence the development of such an implementation.
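    The load-balancing considerations discussed in the thesis can be illustrated by comparing a static block decomposition of rays against a dynamic chunk queue. The per-ray cost model and all numbers below are hypothetical, chosen only to show why a naive contiguous split can be the worst case:

```python
import math

def static_blocks(n_rays, n_workers):
    """Static decomposition: one contiguous block of rays per worker."""
    size = math.ceil(n_rays / n_workers)
    return [range(i, min(i + size, n_rays)) for i in range(0, n_rays, size)]

def dynamic_schedule(n_rays, n_workers, chunk, cost):
    """Simulated dynamic balancing: the least-loaded worker takes the next
    small chunk of rays, so expensive regions of the image get spread out.
    Returns the total work assigned to each worker."""
    loads = [0.0] * n_workers
    for start in range(0, n_rays, chunk):
        w = loads.index(min(loads))  # next free worker takes the chunk
        loads[w] += sum(cost(r) for r in range(start, min(start + chunk, n_rays)))
    return loads

# Hypothetical per-ray cost that grows across the image -- the worst case
# for a naive static split into contiguous blocks.
cost = lambda r: 1.0 + r
imbalance = lambda loads: max(loads) / (sum(loads) / len(loads))

static_loads = [sum(cost(r) for r in blk) for blk in static_blocks(4096, 8)]
dynamic_loads = dynamic_schedule(4096, 8, 32, cost)
print(imbalance(static_loads), imbalance(dynamic_loads))  # dynamic stays much closer to 1.0
```

    The trade-off sketched here is the standard one: smaller chunks balance better but raise scheduling and communication overhead, which is why the thesis modifies the load-balancing algorithm rather than simply shrinking the work units.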

  16. Neutron production by cosmic-ray muons in various materials

    NASA Astrophysics Data System (ADS)

    Manukovsky, K. V.; Ryazhskaya, O. G.; Sobolevsky, N. M.; Yudin, A. V.

    2016-07-01

    The results obtained by studying the background of neutrons produced by cosmic-ray muons in underground experimental facilities intended for rare-event searches and in surrounding rock are presented. The types of this rock may include granite, sedimentary rock, gypsum, and rock salt. Neutron production and transfer were simulated using the Geant4 and SHIELD transport codes. These codes were tuned via a comparison of the results of calculations with experimental data—in particular, with data of the Artemovsk research station of the Institute for Nuclear Research (INR, Moscow, Russia)—as well as via an intercomparison of results of calculations with the Geant4 and SHIELD codes. It turns out that the atomic-number dependence of the production and yield of neutrons has an irregular character and does not allow a description in terms of a universal function of the atomic number. The parameters of this dependence are different for two groups of nuclei—nuclei consisting of alpha particles and all of the remaining nuclei. Moreover, there are manifest exceptions from a power-law dependence—for example, argon. This may entail important consequences both for the existing underground experimental facilities and for those under construction. Investigation of cosmic-ray-induced neutron production in various materials is of paramount importance for the interpretation of experiments conducted at large depths under the Earth's surface.

  17. Effects of target fragmentation on evaluation of LET spectra from space radiations: implications for space radiation protection studies

    NASA Technical Reports Server (NTRS)

    Cucinotta, F. A.; Wilson, J. W.; Shinn, J. L.; Badavi, F. F.; Badhwar, G. D.

    1996-01-01

    We present calculations of linear energy transfer (LET) spectra in low earth orbit from galactic cosmic rays and trapped protons using the HZETRN/BRYNTRN computer code. The emphasis of our calculations is on the analysis of the effects of secondary nuclei produced through target fragmentation in the spacecraft shield or detectors. Recent improvements in the HZETRN/BRYNTRN radiation transport computer code are described. Calculations show that at large values of LET (> 100 keV/micrometer) the LET spectra seen in free space and low earth orbit (LEO) are dominated by target fragments and not the primary nuclei. Although the evaluation of microdosimetric spectra is not considered here, calculations of LET spectra support that the large lineal energy (y) events are dominated by the target fragments. Finally, we discuss the situation for interplanetary exposures to galactic cosmic rays and show that current radiation transport codes predict that in the region of high LET values the LET spectra at significant shield depths (> 10 g/cm2 of Al) is greatly modified by target fragments. These results suggest that studies of track structure and biological response of space radiation should place emphasis on short tracks of medium charge fragments produced in the human body by high energy protons and neutrons.

  18. Mass-invariance of the iron enrichment in the hot haloes of massive ellipticals, groups, and clusters of galaxies

    NASA Astrophysics Data System (ADS)

    Mernier, F.; de Plaa, J.; Werner, N.; Kaastra, J. S.; Raassen, A. J. J.; Gu, L.; Mao, J.; Urdampilleta, I.; Truong, N.; Simionescu, A.

    2018-05-01

    X-ray measurements find systematically lower Fe abundances in the X-ray emitting haloes pervading groups (kT ≲ 1.7 keV) than in clusters of galaxies. These results have been difficult to reconcile with theoretical predictions. However, models using incomplete atomic data or the assumption of isothermal plasmas may have biased the best fit Fe abundance in groups and giant elliptical galaxies low. In this work, we take advantage of a major update of the atomic code in the spectral fitting package SPEX to re-evaluate the Fe abundance in 43 clusters, groups, and elliptical galaxies (the CHEERS sample) in a self-consistent analysis and within a common radius of 0.1r500. For the first time, we report a remarkably similar average Fe enrichment in all these systems. Unlike previous results, this strongly suggests that metals are synthesised and transported in these haloes with the same average efficiency across two orders of magnitude in total mass. We show that the previous metallicity measurements in low temperature systems were biased low due to incomplete atomic data in the spectral fitting codes. The reasons for such a code-related Fe bias, also implying previously unconsidered biases in the emission measure and temperature structure, are discussed.

  19. Code comparison for accelerator design and analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parsa, Z.

    1988-01-01

    We present a comparison between results obtained from standard accelerator physics codes used for the design and analysis of synchrotrons and storage rings, with the programs SYNCH, MAD, HARMON, PATRICIA, PATPET, BETA, DIMAD, MARYLIE and RACETRACK. In our analysis we have considered 5 lattices of various sizes with large and small bending angles, including the AGS Booster (10 degree bend), RHIC (2.24 degree), SXLS, XLS (XUV ring with 45 degree bend) and X-RAY rings. The differences in the integration methods used and the treatment of the fringe fields in these codes could lead to different results. The inclusion of nonlinear (e.g., dipole) terms may be necessary in these calculations, especially for a small ring. 12 refs., 6 figs., 10 tabs.

  20. Mitigation of Hot Electrons from Laser-Plasma Instabilities in Laser-Generated X-Ray Sources

    NASA Astrophysics Data System (ADS)

    Fein, Jeffrey R.

    This thesis describes experiments to understand and mitigate energetic or "hot" electrons from laser-plasma instabilities (LPIs) in an effort to improve radiographic techniques using laser-generated x-ray sources. Initial experiments on the OMEGA-60 laser show evidence of an underlying background generated by x-rays with energies over 10 keV on radiographs using backlit pinhole radiography, whose source is consistent with hard x-rays from LPI-generated hot electrons. Mitigating this background can dramatically reduce uncertainties in measured object densities from radiographs and may be achieved by eliminating the target components in which LPIs are most likely to grow. Experiments were performed on the OMEGA-EP laser to study hot electron production from laser-plasma instabilities in high-Z plasmas relevant to laser-generated x-ray sources. Measurements of hard x-rays show a dramatic reduction in hot-electron energy going from low-Z CH to high-Z Au targets, in a manner that is consistent with steepening electron density profiles that were also measured. The profile-steepening, we infer, increased thresholds of LPIs and contributed to the reduced hot-electron production at higher Z. Possible mechanisms for generating hot electrons include the two-plasmon decay and stimulated Raman scattering instabilities driven by multiple laser beams. Radiation hydrodynamic simulations using the CRASH code predict that both of these instabilities were above threshold with linear threshold parameters that decreased with increasing Z due to steepening length-scales, as well as enhanced laser absorption and increased collisional and Landau damping of electron plasma waves. Another set of experiments were performed on the OMEGA-60 laser to test whether hard x-ray background could be mitigated in backlit pinhole imagers by controlling laser-plasma instabilities. 
Based on the results above, we hypothesized that the LPIs and hot electrons that lead to hard x-ray background would be reduced by increasing the atomic number of the irradiated components in the pinhole imagers. Using higher-Z materials, we demonstrate a significant reduction in x-rays between 30 and 70 keV and a 70% increase in the signal-to-background ratio. Based on these results, a proposed backlighter and detector setup predicts a signal-to-background ratio of up to 4.5:1.

  1. Introducing DeBRa: a detailed breast model for radiological studies

    NASA Astrophysics Data System (ADS)

    Ma, Andy K. W.; Gunn, Spencer; Darambara, Dimitra G.

    2009-07-01

    Currently, x-ray mammography is the method of choice in breast cancer screening programmes. As the mammography technology moves from 2D imaging modalities to 3D, conventional computational phantoms do not have sufficient detail to support the studies of these advanced imaging systems. Studies of these 3D imaging systems call for a realistic and sophisticated computational model of the breast. DeBRa (Detailed Breast model for Radiological studies) is the most advanced, detailed, 3D computational model of the breast developed recently for breast imaging studies. A DeBRa phantom can be constructed to model a compressed breast, as in film/screen, digital mammography and digital breast tomosynthesis studies, or a non-compressed breast as in positron emission mammography and breast CT studies. Both the cranial-caudal and mediolateral oblique views can be modelled. The anatomical details inside the phantom include the lactiferous duct system, the Cooper ligaments and the pectoral muscle. The fibroglandular tissues are also modelled realistically. In addition, abnormalities such as microcalcifications, irregular tumours and spiculated tumours are inserted into the phantom. Existing sophisticated breast models require specialized simulation codes. Unlike its predecessors, DeBRa has elemental compositions and densities incorporated into its voxels including those of the explicitly modelled anatomical structures and the noise-like fibroglandular tissues. The voxel dimensions are specified as needed by any study and the microcalcifications are embedded into the voxels so that the microcalcification sizes are not limited by the voxel dimensions. Therefore, DeBRa works with general-purpose Monte Carlo codes. Furthermore, general-purpose Monte Carlo codes allow different types of imaging modalities and detector characteristics to be simulated with ease. DeBRa is a versatile and multipurpose model specifically designed for both x-ray and γ-ray imaging studies.

  2. Accuracy assessment and characterization of x-ray coded aperture coherent scatter spectral imaging for breast cancer classification

    PubMed Central

    Lakshmanan, Manu N.; Greenberg, Joel A.; Samei, Ehsan; Kapadia, Anuj J.

    2017-01-01

    Although transmission-based x-ray imaging is the most commonly used imaging approach for breast cancer detection, it exhibits false negative rates higher than 15%. To improve cancer detection accuracy, x-ray coherent scatter computed tomography (CSCT) has been explored to potentially detect cancer with greater consistency. However, the 10-min scan duration of CSCT limits its possible clinical applications. The coded aperture coherent scatter spectral imaging (CACSSI) technique has been shown to reduce scan time by enabling single-angle imaging while providing high detection accuracy. Here, we use Monte Carlo simulations to test analytical optimization studies of the CACSSI technique, specifically for detecting cancer in ex vivo breast samples. An anthropomorphic breast tissue phantom was modeled, a CACSSI imaging system was virtually simulated to image the phantom, a diagnostic voxel classification algorithm was applied to all reconstructed voxels in the phantom, and receiver operating characteristic (ROC) analysis of the voxel classification was used to evaluate and characterize the imaging system for a range of parameters that had been optimized in a prior analytical study. The results indicate that CACSSI is able to identify the distribution of cancerous and healthy tissues (i.e., fibroglandular, adipose, or a mix of the two) in tissue samples with a cancerous voxel identification area-under-the-curve of 0.94 through a scan lasting less than 10 s per slice. These results show that coded aperture scatter imaging has the potential to provide scatter images that automatically differentiate cancerous and healthy tissue within ex vivo samples. Furthermore, the results indicate potential CACSSI imaging system configurations for implementation in subsequent imaging development studies. PMID:28331884
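    The voxel-classification performance reported above is summarized by an area under the ROC curve (AUC) of 0.94. As a minimal illustration of how such a figure is computed, here is a generic rank-based AUC (equivalent to the Mann-Whitney U statistic); the scores and labels are toy data, not the study's:

```python
def roc_auc(scores, labels):
    """Rank-based AUC: probability that a randomly chosen positive sample
    scores above a randomly chosen negative one (ties count half)."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy voxel scores: higher means "more likely cancerous"; labels are truth.
scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3, 0.2, 0.1]
labels = [1,   1,   0,   1,   1,   0,   0,   0]
print(roc_auc(scores, labels))  # 0.875 for this toy data
```

    An AUC of 1.0 means perfect separation of cancerous from healthy voxels; 0.5 means chance performance.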

  3. Adaptive CT scanning system

    DOEpatents

    Sampayan, Stephen E.

    2016-11-22

    Apparatus, systems, and methods that provide an X-ray interrogation system having a plurality of stationary X-ray point sources arranged to substantially encircle an area or space to be interrogated. A plurality of stationary detectors are arranged to substantially encircle the area or space to be interrogated. A controller is adapted to control the stationary X-ray point sources to emit X-rays one at a time, and to control the stationary detectors to detect the X-rays emitted by the stationary X-ray point sources.
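    The acquisition pattern the patent describes, sources pulsed one at a time while every detector records, can be sketched as a simple controller loop. All names and the toy measurement model below are illustrative, not from the patent:

```python
def acquire(n_sources, n_detectors, measure):
    """Fire each stationary source once, in turn; all detectors record each
    exposure, yielding one projection per source: projections[source][detector]."""
    projections = []
    for s in range(n_sources):
        projections.append([measure(s, d) for d in range(n_detectors)])
    return projections

# Toy measurement: attenuation depends only on source-detector separation
# around the ring (a stand-in for a real forward model).
proj = acquire(8, 8, lambda s, d: abs(s - d) % 8)
print(len(proj), len(proj[0]))  # 8 8
```

    With stationary sources and detectors, the "rotation" of a conventional CT gantry is replaced entirely by this electronic firing sequence.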

  4. A concatenated coding scheme for error control

    NASA Technical Reports Server (NTRS)

    Lin, S.

    1985-01-01

    A concatenated coding scheme for error control in data communications was analyzed. The inner code is used for both error correction and detection, while the outer code is used only for error detection. A retransmission is requested if either the inner code decoder fails to make a successful decoding or the outer code decoder detects the presence of errors after the inner code decoding. The probability of undetected error of the proposed scheme is derived, and an efficient method for computing this probability is presented. The throughput efficiency of the proposed error control scheme, incorporated with a selective-repeat ARQ retransmission strategy, is analyzed.
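    The retransmission logic described above can be sketched with toy stand-in codes: a 3x-repetition inner code (correction) and a single parity bit as the outer detection code. These codes are chosen only for illustration; the report analyzes far stronger codes:

```python
import random

def bsc(bits, eps, rng):
    """Binary symmetric channel: flip each bit with probability eps."""
    return [b ^ (rng.random() < eps) for b in bits]

def inner_encode(bits):
    """Toy inner code: 3x repetition (corrects single bit errors by majority)."""
    return [b for b in bits for _ in range(3)]

def inner_decode(coded):
    """Majority-vote decoding of the repetition inner code."""
    return [1 if sum(coded[i:i + 3]) >= 2 else 0 for i in range(0, len(coded), 3)]

def outer_check(frame):
    """Toy outer code: the last bit is an even-parity check (detection only)."""
    return sum(frame) % 2 == 0

def send_frame(data, eps, rng, max_tries=10):
    """ARQ loop: retransmit until the outer code detects no errors."""
    frame = data + [sum(data) % 2]              # append outer parity bit
    for attempt in range(1, max_tries + 1):
        received = bsc(inner_encode(frame), eps, rng)
        decoded = inner_decode(received)
        if outer_check(decoded):                # outer code: detection only
            return decoded[:-1], attempt
    return None, max_tries

data = [1, 0, 1, 1, 0, 0, 1, 0]
recovered, tries = send_frame(data, eps=0.05, rng=random.Random(1))
print(recovered, tries)
```

    The undetected-error events the report analyzes are exactly the cases where residual errors slip past both the inner correction and the outer detection, here, an even number of residual bit errors fooling the parity check.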

  5. Observation of temperature trace, induced by changing of temperature inside the human body, on the human body skin using commercially available IR camera

    NASA Astrophysics Data System (ADS)

    Trofimov, Vyacheslav A.; Trofimov, Vladislav V.

    2015-05-01

    As is well known, the passive THz camera is a very promising tool for security applications: it can reveal concealed objects without contact and poses no danger to the person being screened. In previous papers we demonstrated a new possibility of using the passive THz camera to observe a temperature difference on the human skin when this difference is caused by different temperatures inside the body. To validate this claim, we performed a similar physical experiment using an IR camera. We show that a temperature trace appears on the human body skin when the temperature inside the body changes after drinking water. For processing the captured images we used both the software available for a commercially available IR camera manufactured by Flir Corp. and our own computer code. With both codes we clearly demonstrate the change in skin temperature induced by water drinking. This phenomenon is very important for detecting forbidden samples and substances concealed inside the human body by non-destructive means, without the use of X-rays; earlier we demonstrated a similar possibility using THz radiation. The experiments carried out are relevant to counter-terrorism applications. We also developed original filters for computer processing of images captured by IR cameras; applying them enhances the effective temperature resolution of the cameras.
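    The claim that filtering enhances temperature resolution can be illustrated with the simplest such filter, pixel-wise temporal averaging (a generic sketch, not the authors' original filters): averaging N frames reduces random thermal-sensor noise by roughly sqrt(N).

```python
import random

def average_frames(frames):
    """Pixel-wise temporal average of repeated IR frames: a simple filter
    that improves effective temperature resolution by about sqrt(N)."""
    n = len(frames)
    return [sum(col) / n for col in zip(*frames)]

rng = random.Random(42)
truth = [30.0, 30.2, 36.8, 30.1]      # degrees C along one image row (toy data)
# 100 noisy frames with 0.5 degC per-pixel sensor noise:
frames = [[t + rng.gauss(0, 0.5) for t in truth] for _ in range(100)]
avg = average_frames(frames)
print(max(abs(a - t) for a, t in zip(avg, truth)))
```

    After averaging 100 frames, the residual noise drops to about 0.05 degC, enough to resolve the sub-degree skin-temperature traces discussed above.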

  6. A low-noise wide-dynamic-range event-driven detector using SOI pixel technology for high-energy particle imaging

    NASA Astrophysics Data System (ADS)

    Shrestha, Sumeet; Kamehama, Hiroki; Kawahito, Shoji; Yasutomi, Keita; Kagawa, Keiichiro; Takeda, Ayaki; Tsuru, Takeshi Go; Arai, Yasuo

    2015-08-01

    This paper presents a low-noise wide-dynamic-range pixel design for a high-energy particle detector in astronomical applications. A silicon-on-insulator (SOI) based detector is used for the detection of a wide energy range of high-energy particles (mainly X-rays). The sensor has a thin layer of SOI CMOS readout circuitry and a thick layer of high-resistivity detector vertically stacked in a single chip. Pixel circuits are divided into two parts: a signal-sensing circuit and an event-detection circuit. The event-detection circuit, consisting of a comparator and logic circuits that detect the incidence of a high-energy particle, categorizes the incident photon into two energy groups using an appropriate energy threshold and generates a two-bit code for the event and its energy level. The code for the energy level is then used to select the gain of the in-pixel amplifier for the detected signal, providing high-dynamic-range signal measurement. The two-bit codes for the event and energy level are scanned in the event-scanning block, and only the signals from the hit pixels are read out. The variable-gain in-pixel amplifier uses a continuous integrator and integration-time control for the variable gain. The proposed design allows small-signal detection and a wide dynamic range thanks to the adaptive gain technique and the capability of correlated double sampling (CDS) for canceling the kTC noise of the charge detector.
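    A minimal sketch of the per-pixel event/energy logic described above; the threshold and gain values are hypothetical, chosen only to illustrate the two-bit code and the group-based gain selection:

```python
# Hypothetical thresholds and gains; the paper's actual values are not given.
EVENT_THRESHOLD = 50      # ADU: above this, a particle hit is registered
ENERGY_THRESHOLD = 400    # ADU: splits hits into low/high energy groups
GAIN = {0: 8.0, 1: 1.0}   # in-pixel amplifier gain per energy group

def pixel_event_logic(charge):
    """Return the two-bit (event, energy-group) code for one pixel sample."""
    event = 1 if charge > EVENT_THRESHOLD else 0
    group = 1 if charge > ENERGY_THRESHOLD else 0
    return event, group

def read_hit_pixels(frame):
    """Scan a frame and read out only the hit pixels, applying the
    group-selected gain to each detected signal."""
    out = []
    for addr, charge in enumerate(frame):
        event, group = pixel_event_logic(charge)
        if event:                       # event-driven readout: skip quiet pixels
            out.append((addr, group, charge * GAIN[group]))
    return out

frame = [3, 120, 900, 10, 470]
print(read_hit_pixels(frame))  # [(1, 0, 960.0), (2, 1, 900.0), (4, 1, 470.0)]
```

    High gain on the low-energy group keeps small signals above the noise floor, while unity gain on the high-energy group avoids saturation, which is the essence of the adaptive-gain dynamic-range extension.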

  7. Production code control system for hydrodynamics simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Slone, D.M.

    1997-08-18

    We describe how the Production Code Control System (PCCS), written in Perl, has been used to control and monitor the execution of a large hydrodynamics simulation code in a production environment. We have been able to integrate new, disparate, and often independent applications into the PCCS framework without the need to modify any of our existing application codes. Both users and code developers see a consistent interface to the simulation code and associated applications regardless of the physical platform, whether an MPP, SMP, server, or desktop workstation. We also describe our use of Perl to develop a configuration management system for the simulation code, as well as a code usage database and report generator. We used Perl to write a backplane that allows us to plug in preprocessors, the hydrocode, postprocessors, visualization tools, persistent storage requests, and other codes. We need only teach PCCS a minimal amount about any new tool or code to essentially plug it in and make it usable to the hydrocode. PCCS has made it easier to link together disparate codes, since using Perl has removed the need to learn the idiosyncrasies of system or RPC programming. The text handling in Perl makes it easy to teach PCCS about new codes, or changes to existing codes.

  8. A cascaded coding scheme for error control and its performance analysis

    NASA Technical Reports Server (NTRS)

    Lin, S.

    1986-01-01

    A coding scheme for error control in data communication systems is investigated. The scheme is obtained by cascading two error correcting codes, called the inner and the outer codes. The error performance of the scheme is analyzed for a binary symmetric channel with bit error rate epsilon < 1/2. It is shown that, if the inner and outer codes are chosen properly, extremely high reliability can be attained even for a high channel bit error rate. Various specific example schemes, with inner codes ranging from high rates to very low rates combined with Reed-Solomon outer codes, are considered, and their error probabilities are evaluated. They all provide extremely high reliability even for very high bit error rates, say 0.1 to 0.01. Several example schemes are being considered by NASA for satellite and spacecraft downlink error control.
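    The kind of reliability calculation described above can be sketched by cascading two binomial block-error sums on a binary symmetric channel. The (n=23, t=3) inner and (N=255, T=16) outer parameters below are illustrative stand-ins (Golay-like and Reed-Solomon-like sizes), not the report's actual schemes:

```python
from math import comb

def block_error(n, t, eps):
    """P(more than t symbol errors in an n-symbol block), each symbol
    independently in error with probability eps (binomial tail)."""
    return sum(comb(n, i) * eps**i * (1 - eps)**(n - i)
               for i in range(t + 1, n + 1))

eps = 0.01                                 # raw channel bit error rate
p_inner = block_error(23, 3, eps)          # residual error rate per inner block
p_cascade = block_error(255, 16, p_inner)  # outer decoding failure probability
print(p_inner, p_cascade)
```

    Even with a raw bit error rate of 1%, the inner code leaves a residual block error rate below 1e-4, and the outer code drives the overall failure probability to an astronomically small value, illustrating the "extremely high reliability" claim.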

  9. ACON: a multipurpose production controller for plasma physics codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snell, C.

    1983-01-01

    ACON is a BCON controller designed to run large production codes on the CTSS Cray-1 or the LTSS 7600 computers. ACON can also be operated interactively, with input from the user's terminal. The controller can run one code or a sequence of up to ten codes during the same job. Options are available to get and save Mass storage files, to perform Historian file updating operations, to compile and load source files, and to send out print and film files. Special features include ability to retry after Mass failures, backup options for saving files, startup messages for the various codes, and ability to reserve specified amounts of computer time after successive code runs. ACON's flexibility and power make it useful for running a number of different production codes.

  10. The Diffuse Soft X-ray Background: Trials and Tribulations

    NASA Astrophysics Data System (ADS)

    Ulmer, Melville P.

    2013-01-01

    I joined the University of Wisconsin-Madison sounding rocket group at its inception. It was an exciting time, as nobody knew what the X-ray sky looked like. Our group focused on the soft X-ray background and built proportional counters with super-thin (2 micron) windows. As the internal gas pressure of the counters was about 1 atmosphere, it was no mean feat to get the payload to launch without the window bursting. On top of that, we built all our own software, from modeling the detector response to unfolding the spectral data. For we did it then as now: our computer code modeled the detector response, folded various spectral shapes through the response, and compared the results with the raw data. As far as interpretation goes, here are examples of how one can get things wrong. The Berkeley group published a paper on the soft X-ray background that disagreed with ours. Why? It turned out they had **assumed** the galactic plane was completely opaque to soft X-rays and hence corrected for detector background that way. It turns out that the ISM emits in soft X-rays! Another example was the faux pas of the Calgary group. They didn't properly shield their detector from the sounding rocket telemetry. Thus they got an enormous signal, which to our amusement some (ambulance chaser) theoreticians tried to explain! So back then, as now, mistakes were made, but at least we all knew how our X-ray systems worked from soup (the detectors) to nuts (the data analysis code), whereas today “anybody” with a good idea but only a vague inkling of how detectors, mirrors and software work can be an X-ray astronomer. On the one hand, this has made the field accessible to all; on the other, errors in interpretation can be made when the X-ray telescope user falls prey to running black-box software.
Furthermore, with so much funding going into supporting observers, there is little left to make the necessary technology advances or keep the core expertise in place even to stay even with today’s observatories. We will need a newly launched facility (or two) or the field will eventually die.

  11. Dipyrone has no effects on bone healing of tibial fractures in rats

    PubMed Central

    Gali, Julio Cesar; Sansanovicz, Dennis; Ventin, Fernando Carvalho; Paes, Rodrigo Henrique; Quevedo, Francisco Carlos; Caetano, Edie Benedito

    2014-01-01

    OBJECTIVE: To evaluate the effect of dipyrone on the healing of tibial fractures in rats. METHODS: Forty-two Wistar rats were used, with a mean body weight of 280 g. After being anesthetized, they were submitted to closed fracture of the tibia and fibula of the right posterior paw through manual force. The rats were randomly divided into three groups: the control group, which received a daily intraperitoneal injection of saline solution; group D-40, which received a saline injection containing 40 mg/kg dipyrone; and group D-80, which received a saline injection containing 80 mg/kg dipyrone. After 28 days the rats were sacrificed and received a new label code that was known by only one researcher. The fractured limbs were then amputated and X-rayed. The tibias were disarticulated and subjected to mechanical, radiological and histological evaluation. For statistical analysis the Kruskal-Wallis test was used at a significance level of 5%. RESULTS: Dipyrone had no effect on the healing of the rats' tibial fractures in comparison with the control group. CONCLUSION: Dipyrone may be used safely for pain control in the treatment of fractures, without any interference with bone healing. Level of Evidence II, Controlled Laboratory Study. PMID:25246852

  12. Modeling Gamma Ray Bursts in the Magnetically Dominated Regime

    NASA Astrophysics Data System (ADS)

    Zhang, Bing

    Recent observations of broad-band prompt emission spectra of gamma-ray bursts (GRBs) by the Fermi Gamma-Ray Telescope suggest that they do not comply with the predictions of the standard fireball internal shock model. Several independent observations (including detections of high polarization degree of gamma-ray emission and early optical emission of some GRBs, as well as non-detection of PeV neutrinos from GRBs by IceCube) support or are consistent with the hypothesis that at least some GRBs have magnetically dominated jets. This calls for serious, detailed investigations of GRB models in the magnetically dominated regime, which interpret GRB emission as dissipation of strong magnetic fields entrained in the ejecta. On the other hand, because of their complexity, magnetic models are so far much less developed than the baryonic fireball models. Here we propose to tackle this difficult problem, aiming at making solid progress in this direction through a set of numerical investigations. Specifically, we propose to carry out the following simulations. (1) Using a relativistic MHD code, we will perform a global simulation to investigate whether efficient magnetic dissipation would occur when two high-σ magnetic blobs collide with a relativistic speed. (2) We will perform a local simulation of the relativistic collisions between two high-σ fluids, and track the evolution of magnetic field configuration in the colliding region and the interplay between magnetic reconnection and development of magnetic turbulence. (3) Through injecting test particles in the simulation box, we will study how electrons get accelerated in the turbulent reconnection regions. 
(4) Building on the above-mentioned numerical simulation results, along with a Monte Carlo code and a synchrotron radiation code developed in our group previously, we will develop a full numerical model to simulate lightcurves, time-dependent spectra, and polarization properties of GRB prompt emission within the framework of magnetically dominated jets. The results of this proposal will greatly advance our understanding of GRB physics in the magnetically dominated regime. The numerical simulations of collision-induced magnetic dissipation will also be relevant to many other astrophysical phenomena, such as active galactic nuclei, X-ray binary ``micro-quasars'', Crab nebula flares, and jets from tidal disruption events. The program conforms to NASA's Strategic Plan, and is highly relevant to the past and current NASA missions, such as CGRO/BATSE, Fermi, and Swift, as well as some future mission concepts, such as POET.

  13. Radiation transport around Kerr black holes

    NASA Astrophysics Data System (ADS)

    Schnittman, Jeremy David

    This Thesis describes the basic framework of a relativistic ray-tracing code for analyzing accretion processes around Kerr black holes. We begin in Chapter 1 with a brief historical summary of the major advances in black hole astrophysics over the past few decades. In Chapter 2 we present a detailed description of the ray-tracing code, which can be used to calculate the transfer function between the plane of the accretion disk and the detector plane, an important tool for modeling relativistically broadened emission lines. Observations from the Rossi X-Ray Timing Explorer have shown the existence of high frequency quasi-periodic oscillations (HFQPOs) in a number of black hole binary systems. In Chapter 3, we employ a simple "hot spot" model to explain the position and amplitude of these HFQPO peaks. The power spectrum of the periodic X-ray light curve consists of multiple peaks located at integral combinations of the black hole coordinate frequencies, with the relative amplitude of each peak determined by the orbital inclination, eccentricity, and hot spot arc length. In Chapter 4, we introduce additional features to the model to explain the broadening of the QPO peaks as well as the damping of higher frequency harmonics in the power spectrum. The complete model is used to fit the power spectra observed in XTE J1550-564, giving confidence limits on each of the model parameters. In Chapter 5 we present a description of the structure of a relativistic alpha- disk around a Kerr black hole. Given the surface temperature of the disk, the observed spectrum is calculated using the transfer function mentioned above. The features of this modified thermal spectrum may be used to infer the physical properties of the accretion disk and the central black hole. In Chapter 6 we develop a Monte Carlo code to calculate the detailed propagation of photons from a hot spot emitter scattering through a corona surrounding the black hole. 
The coronal scattering has two major observable effects: the inverse-Compton process alters the photon spectrum by adding a high energy power-law tail, and the random scattering of each photon effectively damps out the highest frequency modulations in the X-ray light curve.
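    The hot-spot model of Chapter 3 places the QPO peaks at integer combinations of the black hole coordinate frequencies. For circular equatorial Kerr orbits these frequencies have standard closed forms, sketched below (textbook expressions, not code from the thesis):

```python
from math import sqrt

def kerr_frequencies(r, a):
    """Coordinate frequencies of a circular equatorial orbit of radius r
    around a Kerr black hole with dimensionless spin a, in geometrized
    units (G = c = M = 1): azimuthal, radial-epicyclic, vertical-epicyclic."""
    omega_phi = 1.0 / (r**1.5 + a)
    kappa2 = omega_phi**2 * (1 - 6/r + 8*a*r**-1.5 - 3*a**2/r**2)
    omega_theta2 = omega_phi**2 * (1 - 4*a*r**-1.5 + 3*a**2/r**2)
    return omega_phi, sqrt(max(kappa2, 0.0)), sqrt(omega_theta2)

# In the hot-spot picture, QPO peaks appear at integer combinations of
# these frequencies, e.g. nu_phi, nu_phi - kappa, nu_phi + kappa, ...
nu_phi, kappa, nu_theta = kerr_frequencies(r=8.0, a=0.5)
print(nu_phi, nu_phi - kappa, nu_phi + kappa)
```

    For a = 0 the radial epicyclic frequency vanishes at r = 6M, recovering the familiar Schwarzschild ISCO.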

  14. GRay: A Massively Parallel GPU-based Code for Ray Tracing in Relativistic Spacetimes

    NASA Astrophysics Data System (ADS)

    Chan, Chi-kwan; Psaltis, Dimitrios; Özel, Feryal

    2013-11-01

    We introduce GRay, a massively parallel integrator designed to trace the trajectories of billions of photons in a curved spacetime. This graphics-processing-unit (GPU)-based integrator employs the stream processing paradigm, is implemented in CUDA C/C++, and runs on nVidia graphics cards. The peak performance of GRay using single-precision floating-point arithmetic on a single GPU exceeds 300 GFLOPS (or 1 ns per photon per time step). For a realistic problem, where the peak performance cannot be reached, GRay is two orders of magnitude faster than existing central-processing-unit-based ray-tracing codes. This performance enhancement allows more effective searches of large parameter spaces when comparing theoretical predictions of images, spectra, and light curves from the vicinities of compact objects to observations. GRay can also perform on-the-fly ray tracing within general relativistic magnetohydrodynamic algorithms that simulate accretion flows around compact objects. Making use of this algorithm, we calculate the properties of the shadows of Kerr black holes and the photon rings that surround them. We also provide accurate fitting formulae of their dependencies on black hole spin and observer inclination, which can be used to interpret upcoming observations of the black holes at the center of the Milky Way, as well as M87, with the Event Horizon Telescope.
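    As a toy illustration of the kind of integration a relativistic ray tracer performs, the sketch below integrates the equatorial Schwarzschild photon-orbit equation with a fixed-step RK4 scheme. GRay itself handles the full Kerr spacetime on GPUs, so this is only a minimal single-photon analogue:

```python
import math

def photon_geodesic(u0, du0, phi_max, n_steps):
    """Integrate the Schwarzschild photon-orbit equation
        d^2 u / d phi^2 = 3 u^2 - u     (u = M/r, geometrized units)
    with a fixed-step RK4 scheme, returning u(phi_max)."""
    h = phi_max / n_steps
    u, du = u0, du0

    def f(u, du):
        return du, 3*u*u - u          # (u', u'')

    for _ in range(n_steps):
        k1u, k1d = f(u, du)
        k2u, k2d = f(u + 0.5*h*k1u, du + 0.5*h*k1d)
        k3u, k3d = f(u + 0.5*h*k2u, du + 0.5*h*k2d)
        k4u, k4d = f(u + h*k3u, du + h*k3d)
        u  += h/6 * (k1u + 2*k2u + 2*k3u + k4u)
        du += h/6 * (k1d + 2*k2d + 2*k3d + k4d)
    return u

# The circular photon orbit sits at r = 3M (u = 1/3): the right-hand side
# vanishes there, so an exactly circular ray should stay circular.
print(photon_geodesic(1/3, 0.0, 2*math.pi, 1000))
```

    A production code like GRay integrates millions of such trajectories in parallel, one per image pixel, which is why the GPU stream-processing model fits the problem so well.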

  15. Gas Accretion onto a Supermassive Black Hole: A Step to Model AGN Feedback

    NASA Astrophysics Data System (ADS)

    Nagamine, K.; Barai, P.; Proga, D.

    2012-08-01

    We study gas accretion onto a supermassive black hole (SMBH) using the 3D SPH code GADGET-3 on scales of 0.1-200 pc. First we test our code with the spherically symmetric, adiabatic Bondi accretion problem. We find that our simulation can reproduce the expected Bondi accretion flow very well for a limited amount of time until the effect of the outer boundary starts to be visible. We also find artificial heating of gas near the inner accretion boundary due to the artificial viscosity of SPH. Second, we implement radiative cooling and heating due to X-rays, and examine the impact of thermal feedback by the central X-ray source. The accretion flow roughly follows the Bondi solution for low central X-ray luminosities; however, the flow starts to exhibit non-spherical fragmentation due to the thermal instability for a certain range of central LX, and a strong overall outflow develops for greater LX. The cold gas develops filamentary structures that fall into the central SMBH, whereas the hot gas tries to escape through the channels in between the cold filaments. Such fragmentation of accreting gas can assist in the formation of clouds around AGN, induce star-formation, and contribute to the observed variability of narrow-line regions.
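    The Bondi solution used to validate the code has a simple closed form. Below is a sketch in CGS units with illustrative parameters (a 1e8 solar-mass SMBH in hot gas; these are not the paper's actual setup):

```python
from math import pi

# CGS constants
G = 6.674e-8        # gravitational constant
M_SUN = 1.989e33    # solar mass, g
PC = 3.086e18       # parsec, cm
YR = 3.156e7        # year, s

def bondi(M, rho_inf, cs_inf, lam=0.25):
    """Bondi radius and accretion rate for a point mass M embedded in gas of
    ambient density rho_inf and sound speed cs_inf; lam is the dimensionless
    eigenvalue (1/4 for an adiabatic gamma = 5/3 gas)."""
    r_b = G * M / cs_inf**2
    mdot = 4 * pi * lam * (G * M)**2 * rho_inf / cs_inf**3
    return r_b, mdot

# Illustrative numbers: 1e8 M_sun SMBH, n ~ 1 cm^-3, cs ~ 100 km/s.
r_b, mdot = bondi(1e8 * M_SUN, rho_inf=1.67e-24, cs_inf=1e7)
print(r_b / PC, mdot * YR / M_SUN)   # roughly 43 pc and 0.015 M_sun/yr
```

    The Bondi radius of tens of parsecs for these parameters is why the 0.1-200 pc simulation domain quoted above brackets the sonic region of the flow.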

  16. Digital Controller For Emergency Beacon

    NASA Technical Reports Server (NTRS)

    Ivancic, William D.

    1990-01-01

    Prototype digital controller intended for use in 406-MHz emergency beacon. Undergoing development according to international specifications, 406-MHz emergency beacon system includes satellites providing worldwide monitoring of beacons, with Doppler tracking to locate each beacon within 5 km. Controller turns beacon on and off and generates binary codes identifying source (e.g., ship, aircraft, person, or vehicle on land). Codes transmitted by phase modulation. Knowing code, monitor attempts to communicate with user and uses code information to dispatch rescue team appropriate to type and location of carrier.

  17. Synchronization Control for a Class of Discrete-Time Dynamical Networks With Packet Dropouts: A Coding-Decoding-Based Approach.

    PubMed

    Wang, Licheng; Wang, Zidong; Han, Qing-Long; Wei, Guoliang

    2017-09-06

    The synchronization control problem is investigated for a class of discrete-time dynamical networks with packet dropouts via a coding-decoding-based approach. The data is transmitted through digital communication channels and only the sequence of finite coded signals is sent to the controller. A series of mutually independent Bernoulli distributed random variables is utilized to model the packet dropout phenomenon occurring in the transmissions of coded signals. The purpose of the addressed synchronization control problem is to design a suitable coding-decoding procedure for each node, based on which an efficient decoder-based control protocol is developed to guarantee that the closed-loop network achieves the desired synchronization performance. By applying a modified uniform quantization approach and the Kronecker product technique, criteria for ensuring the detectability of the dynamical network are established by means of the size of the coding alphabet, the coding period and the probability information of packet dropouts. Subsequently, by resorting to the input-to-state stability theory, the desired controller parameter is obtained in terms of the solutions to a certain set of inequality constraints which can be solved effectively via available software packages. Finally, two simulation examples are provided to demonstrate the effectiveness of the obtained results.
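    A minimal sketch of a coding-decoding loop over a lossy channel: a uniform quantizer (the finite coding alphabet) plus Bernoulli packet dropouts, with the decoder holding its last estimate on a dropout. The quantizer parameters and the hold strategy are illustrative assumptions, not the paper's protocol:

```python
import random

def encode(x, q_range, levels):
    """Uniform quantizer: map x in [-q_range, q_range] to a finite codeword."""
    x = max(-q_range, min(q_range, x))
    step = 2 * q_range / (levels - 1)
    return round((x + q_range) / step)

def decode(code, q_range, levels):
    """Map a codeword back to the center of its quantization cell."""
    step = 2 * q_range / (levels - 1)
    return code * step - q_range

def transmit(states, q_range=10.0, levels=16, p_drop=0.2, seed=0):
    """Send quantized states over a lossy channel; on a Bernoulli dropout
    the decoder holds its previous estimate (one simple compensation)."""
    rng = random.Random(seed)
    estimate, received = 0.0, []
    for x in states:
        if rng.random() >= p_drop:          # packet arrives
            estimate = decode(encode(x, q_range, levels), q_range, levels)
        received.append(estimate)           # else: hold last estimate
    return received

print(transmit([1.0, 2.5, -3.0, 7.7]))
```

    When no packet is lost, the decoder's estimate is within half a quantization step of the true state; the paper's detectability criteria relate exactly these quantities, alphabet size, coding period, and dropout probability, to achievable synchronization.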

  18. Nuclear Test Depth Determination with Synthetic Modelling: Global Analysis from PNEs to DPRK-2016

    NASA Astrophysics Data System (ADS)

    Rozhkov, Mikhail; Stachnik, Joshua; Baker, Ben; Epiphansky, Alexey; Bobrov, Dmitry

    2016-04-01

    Seismic event depth determination is critical for the event screening process at the International Data Center (IDC) of the CTBTO. A thorough determination of the event depth can mostly be conducted only through additional special analysis, because the IDC's Event Definition Criteria are based, in particular, on depth estimation uncertainties. This causes a large number of events in the Reviewed Event Bulletin to have their depth constrained to the surface, making the depth screening criterion inapplicable. Further, it may result in a heavier workload to manually distinguish between subsurface and deeper crustal events. Since the shape of the first few seconds of the signal of very shallow events is very sensitive to the depth phases, cross-correlation between observed and synthetic seismograms can provide a basis for the event depth estimation, and thus an extension of the screening process. We applied this approach mostly to events at teleseismic and, in part, regional distances. The approach was found to be efficient for the seismic event screening process, with certain caveats related mostly to poorly defined source and receiver crustal models, which can shift the depth estimate. An adjustable teleseismic attenuation model (t*) was used for the synthetics, since this characteristic is not known for most of the rays we studied. We studied a wide set of historical records of nuclear explosions, including so-called Peaceful Nuclear Explosions (PNE) with presumably known depths, and recent DPRK nuclear tests. The teleseismic synthetic approach is based on the stationary phase approximation with the hudson96 program, and the regional modelling was done with the generalized ray technique of Vlastislav Cerveny, modified to account for complex source topography. The software prototype is designed to be used for Expert Technical Analysis at the IDC. The design effectively reuses the NDC-in-a-Box code and can be comfortably utilized by NDC users.
    The package uses Geotool as a front-end for data retrieval and pre-processing. After the event database is compiled, control is passed to the driver software, which runs the external processing and plotting toolboxes, controls the final stage, and produces the final result. The modules are mostly Python-coded, with C-coded regional synthetics for complex topography (Raysynth3D) and FORTRAN-coded synthetics from the CPS330 software package by Robert Herrmann of Saint Louis University. An extension of this single-station depth determination method is under development; it uses joint information from all stations participating in processing and is based on simultaneous depth and moment tensor determination for both short- and long-period seismic phases. A novel approach recently developed for microseismic event location, utilizing only phase waveform information, was migrated to the global scale. It should provide faster computation, as it does not require intensive synthetic modelling, and might benefit the processing of noisy signals. A consistent depth estimate for all recent nuclear tests was produced using the large number of IMS stations (primary and auxiliary) involved in processing.
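    The single-station cross-correlation idea above (match the first seconds of the observed waveform against synthetics computed for trial depths, where the surface-reflection delay encodes depth) can be sketched with a toy model. The wavelet, velocity, reflection coefficient, and depth grid below are illustrative stand-ins, not the hudson96/CPS330 machinery:

```python
import math

def ricker(t, f=1.0):
    """Ricker wavelet with peak frequency f (Hz)."""
    a = (math.pi * f * t) ** 2
    return (1.0 - 2.0 * a) * math.exp(-a)

def synthetic(depth_km, times, vp=5.0):
    """Toy synthetic: direct P plus a surface reflection pP arriving
    ~2*depth/vp later with reduced, reversed amplitude."""
    delay = 2.0 * depth_km / vp
    return [ricker(t) - 0.8 * ricker(t - delay) for t in times]

def ncc(a, b):
    """Zero-lag normalized cross-correlation of two equal-length traces."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def best_depth(observed, times, trial_depths):
    """Trial depth whose synthetic best correlates with the observed trace."""
    return max(trial_depths, key=lambda d: ncc(observed, synthetic(d, times)))

times = [-2.0 + 0.01 * i for i in range(1001)]
obs = synthetic(1.5, times)                    # "observed" event at 1.5 km
trials = [0.5 + 0.25 * i for i in range(19)]   # trial depths 0.5 .. 5.0 km
print(best_depth(obs, times, trials))          # -> 1.5
```

    A real implementation would also search over lag and account for attenuation (t*) and crustal structure, which is where the caveats noted in the abstract enter.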

  19. Isotopic composition analysis and age dating of uranium samples by high resolution gamma ray spectrometry

    NASA Astrophysics Data System (ADS)

    Apostol, A. I.; Pantelica, A.; Sima, O.; Fugaru, V.

    2016-09-01

    Non-destructive methods were applied to determine the isotopic composition of nine uranium samples and the time elapsed since their last chemical purification. The applied methods are based on measuring the gamma and X radiations of the samples with a high-resolution, low-energy gamma spectrometric system with a planar high-purity germanium detector and a low-background gamma spectrometric system with a coaxial high-purity germanium detector. The "Multigroup γ-ray Analysis Method for Uranium" (MGAU) code was used for the precise determination of the samples' isotopic composition. The age of the samples was determined from the isotopic ratio 214Bi/234U. This ratio was calculated from the analyzed spectra of each uranium sample using relative detection efficiency. Special attention is paid to the coincidence summing corrections that have to be taken into account when performing this type of analysis. In addition, an alternative approach to the age determination, using full-energy-peak efficiencies obtained by Monte Carlo simulations with the GESPECOR code, is described.
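    The ingrowth-based age determination can be illustrated with a two-member Bateman sketch: the daughter/parent ratio grows with time since purification, so measuring it and inverting the ingrowth curve yields an age. The real 234U → 230Th → 226Ra → … → 214Bi chain has more members; the decay constants below are purely hypothetical:

```python
import math

def daughter_parent_ratio(t, lam1, lam2):
    """Two-member Bateman ingrowth: daughter atoms per initial parent atom
    at time t, for decay constants lam1 (parent) and lam2 (daughter)."""
    return lam1 / (lam2 - lam1) * (math.exp(-lam1 * t) - math.exp(-lam2 * t))

def age_from_ratio(ratio, lam1, lam2, t_hi):
    """Invert the ingrowth curve by bisection; t_hi must lie within the
    initial rising part of the curve."""
    lo, hi = 0.0, t_hi
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if daughter_parent_ratio(mid, lam1, lam2) < ratio:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

    With hypothetical constants lam1 = 1e-6/yr and lam2 = 1e-2/yr, a ratio measured 50 years after purification is inverted back to an age of 50 years.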

  20. Benchmark Analysis of Pion Contribution from Galactic Cosmic Rays

    NASA Technical Reports Server (NTRS)

    Aghara, Sukesh K.; Blattnig, Steve R.; Norbury, John W.; Singleterry, Robert C., Jr.

    2008-01-01

    Shielding strategies for extended stays in space must include a comprehensive resolution of the secondary radiation environment inside the spacecraft induced by the primary, external radiation. The distribution of absorbed dose and dose equivalent is a function of the type, energy and population of these secondary products. A systematic verification and validation effort is underway for HZETRN, a space radiation transport code currently used by NASA. It performs neutron, proton and heavy ion transport explicitly, but it does not take into account the production and transport of mesons, photons and leptons. The question naturally arises as to the contribution of these particles to space radiation. The pion has a production kinetic energy threshold of about 280 MeV. The galactic cosmic ray (GCR) spectra, coincidentally, reach their flux maxima in the hundreds-of-MeV range, corresponding to the pion production threshold. We present results from the Monte Carlo code MCNPX, showing the effect of lepton and meson physics when these particles are produced and transported explicitly in a GCR environment.

  1. Consideration of the Protection Curtain's Shielding Ability after Identifying the Source of Scattered Radiation in the Angiography.

    PubMed

    Sato, Naoki; Fujibuchi, Toshioh; Toyoda, Takatoshi; Ishida, Takato; Ohura, Hiroki; Miyajima, Ryuichi; Orita, Shinichi; Sueyoshi, Tomonari

    2017-06-15

    To decrease radiation exposure to medical staff performing angiography, the dose distribution in the angiography room was calculated using the Particle and Heavy Ion Transport code System (PHITS), which is based on a Monte Carlo code, and the source of scattered radiation was confirmed using a tungsten sheet by comparing the shielding performance of different sheet placements. Scattered radiation generated from the flat panel detector, X-ray tube and bed was calculated using PHITS. In this experiment, the source of scattered radiation was identified as the phantom or the acrylic window attached to the X-ray tube; thus, a protection curtain was placed on the bed to shield against scattered radiation at low positions. There was an average difference of 20% between the measured and calculated values. The H*(10) value decreased after placing the sheet on the right side of the phantom. Thus, the curtain could decrease scattered radiation. © Crown copyright 2016.

  2. Using AORSA to simulate helicon waves in DIII-D

    NASA Astrophysics Data System (ADS)

    Lau, C.; Jaeger, E. F.; Bertelli, N.; Berry, L. A.; Blazevski, D.; Green, D. L.; Murakami, M.; Park, J. M.; Pinsker, R. I.; Prater, R.

    2015-12-01

    Recent efforts have shown that helicon waves (fast waves at > 20ωci) may be an attractive option for driving efficient off-axis current drive during non-inductive tokamak operation in DIII-D, ITER and DEMO. For DIII-D scenarios, the ray tracing code GENRAY has been extensively used to study helicon current drive efficiency and location as a function of many plasma parameters. The full-wave code AORSA, which is applicable to arbitrary Larmor radius and can resolve arbitrary ion cyclotron harmonic order, has recently been used to validate the ray tracing technique at these high cyclotron harmonics. It will be shown that, if the SOL is ignored, the GENRAY- and AORSA-calculated current drive profiles are comparable for the envisioned high-beta advanced scenarios for DIII-D, where there is high single-pass absorption due to electron Landau damping and minimal ion damping. AORSA has also been used to estimate possible SOL effects on helicon current drive coupling and SOL absorption due to collisional and slow-wave effects.

  3. Estimations of Mo X-pinch plasma parameters on QiangGuang-1 facility by L-shell spectral analyses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Jian; Qiu, Aici; State Key Laboratory of Intense Pulsed Radiation Simulation and Effect, Northwest Institute of Nuclear Technology, Xi'an 710024

    2013-08-15

    Plasma parameters of molybdenum (Mo) X-pinches on the 1-MA QiangGuang-1 facility were estimated by L-shell spectral analysis. X-ray radiation from the X-pinches had a pulse width of 1 ns, and its spectra in the 2–3 keV range were measured with a time-integrated X-ray spectrometer. Relative intensities of spectral features were derived by correcting for the spectral sensitivity of the spectrometer. With the open-source atomic code FAC (Flexible Atomic Code), ion structures and various atomic radiative-collisional rates for O-, F-, Ne-, Na-, Mg-, and Al-like ionization stages were calculated, and synthetic spectra were constructed for given plasma parameters. By fitting the measured spectra with the modeled ones, the Mo X-pinch plasmas on the QiangGuang-1 facility were found to have an electron density of about 10²¹ cm⁻³ and an electron temperature of about 1.2 keV.

  4. Large-area PSPMT based gamma-ray imager with edge reclamation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ziock, K-P; Nakae, L

    2000-09-21

    We describe a coded-aperture gamma-ray imager which uses a CsI(Na) scintillator coupled to a Hamamatsu R3292 position-sensitive photomultiplier tube (PSPMT) as the position-sensitive detector. We have modified the normal resistor-divider readout of the PSPMT to allow use of nearly the full 10 cm diameter active area of the PSPMT with a single scintillator crystal one centimeter thick. This is a significant performance improvement over that obtained with the standard readout technique, where the linearity and position resolution start to degrade at radii as small as 3.5 cm with a crystal 0.75 cm thick. This represents a recovery of over 60% of the PSPMT active area. The performance increase allows the construction of an imager with a field of view 20 resolution elements in diameter with useful quantum efficiency from 60-700 keV. In this paper we describe the readout technique, its implementation in a coded aperture imager, and the performance of that imager.

  5. Cross sections and differential spectra for reactions of 2-20 MeV neutrons on ²⁷Al

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blann, M.; Komoto, T.T.

    1988-01-01

    This report summarizes product yields, secondary n, p and α spectra, and γ-ray spectra calculated for incident neutrons of 2-20 MeV on ²⁷Al targets. Results are all from the code ALICE, using the version ALISO, which weights results for targets that are a mix of isotopes. Where natural isotopic targets are involved, yields and n, p, α spectra will be reported weighted over isotopic yields. Gamma-ray spectra, however, will be reported for the most abundant isotope.
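    The ALISO-style isotopic weighting described above amounts to an abundance-weighted average of per-isotope results. A sketch with a hypothetical three-isotope element (²⁷Al itself is monoisotopic, so the weighting there is trivial; all numbers are illustrative):

```python
def natural_yield(per_isotope_yield, abundance):
    """Abundance-weighted yield for a natural-composition target:
    sum over isotopes of (natural abundance) * (per-isotope yield)."""
    return sum(abundance[iso] * y for iso, y in per_isotope_yield.items())

# Hypothetical three-isotope element (abundances sum to 1):
abundance = {24: 0.79, 25: 0.10, 26: 0.11}
yield_mb = {24: 2.0, 25: 1.0, 26: 0.5}      # per-isotope yields, millibarns
print(natural_yield(yield_mb, abundance))    # about 1.735 mb
```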

  6. Bottlenecks and Waiting Points in Nucleosynthesis in X-ray bursts and Novae

    NASA Astrophysics Data System (ADS)

    Smith, Michael S.; Sunayama, Tomomi; Hix, W. Raphael; Lingerfelt, Eric J.; Nesaraja, Caroline D.

    2010-08-01

    To better understand the energy generation and element synthesis occurring in novae and X-ray bursts, we give quantitative definitions to the concepts of "bottlenecks" and "waiting points" in the thermonuclear reaction flow. We use these criteria to search for bottlenecks and waiting points in post-processing element synthesis explosion simulations. We have incorporated these into the Computational Infrastructure for Nuclear Astrophysics, a suite of nuclear astrophysics codes available online at nucastrodata.org, so that anyone may perform custom searches for bottlenecks and waiting points.

  7. Bottlenecks and Waiting Points in Nucleosynthesis in X-ray bursts and Novae

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Michael S.; Hix, W. Raphael; Nesaraja, Caroline D.

    2010-08-12

    To better understand the energy generation and element synthesis occurring in novae and X-ray bursts, we give quantitative definitions to the concepts of "bottlenecks" and "waiting points" in the thermonuclear reaction flow. We use these criteria to search for bottlenecks and waiting points in post-processing element synthesis explosion simulations. We have incorporated these into the Computational Infrastructure for Nuclear Astrophysics, a suite of nuclear astrophysics codes available online at nucastrodata.org, so that anyone may perform custom searches for bottlenecks and waiting points.

  8. Galactic cosmic ray transport methods and radiation quality issues

    NASA Technical Reports Server (NTRS)

    Townsend, L. W.; Wilson, J. W.; Cucinotta, F. A.; Shinn, J. L.

    1992-01-01

    An overview of galactic cosmic ray (GCR) interaction and transport methods, as implemented in the Langley Research Center GCR transport code, is presented. Representative results for solar minimum, exo-magnetospheric GCR dose equivalents in water are presented on a component by component basis for various thicknesses of aluminum shielding. The impact of proposed changes to the currently used quality factors on exposure estimates and shielding requirements are quantified. Using the cellular track model of Katz, estimates of relative biological effectiveness (RBE) for the mixed GCR radiation fields are also made.

  9. Simulation of the Simbol-X telescope: imaging performance of a deformable x-ray telescope

    NASA Astrophysics Data System (ADS)

    Chauvin, Maxime; Roques, Jean-Pierre

    2009-08-01

    We have developed a simulation tool for a Wolter I telescope subject to deformations. The aim is to understand and predict the behavior of Simbol-X and other future missions (NuSTAR, Astro-H, IXO, ...). Our code, based on Monte Carlo ray tracing, computes the full photon trajectories up to the detector plane, along with the deformations. The degradation of the imaging system is corrected using metrology. This tool allows one to perform many analyses in order to optimize the configuration of any of these telescopes.
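    The effect of mirror deformations in such a Monte Carlo ray trace can be caricatured in one dimension: a surface slope error δ tilts a reflected ray by 2δ and displaces it by roughly 2δf at the focal plane. A sketch under that small-angle, single-reflection assumption (a real Wolter I trace follows two reflections and full 3-D geometry; all numbers are illustrative):

```python
import math, random

def focal_spot_rms(slope_err_rad, focal_len_mm, n=20000, seed=0):
    """Toy Monte Carlo: each ray samples a Gaussian mirror slope error delta;
    reflection tilts the ray by 2*delta, displacing it by ~2*delta*f at the
    focal plane (small-angle approximation, single reflection)."""
    rng = random.Random(seed)
    disp = [2.0 * rng.gauss(0.0, slope_err_rad) * focal_len_mm
            for _ in range(n)]
    return math.sqrt(sum(d * d for d in disp) / n)

# For a ~2-arcsec (1e-5 rad) slope error and a 10 m focal length, the RMS
# blur is about 2 * 1e-5 * 10000 mm = 0.2 mm.
```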

  10. Imaging Performance Analysis of Simbol-X with Simulations

    NASA Astrophysics Data System (ADS)

    Chauvin, M.; Roques, J. P.

    2009-05-01

    Simbol-X is an X-ray telescope operating in formation flight. This means that its optical performance will strongly depend on the drift of the two spacecraft and on its ability to measure these drifts for image reconstruction. We built a dynamical ray-tracing code to study the impact of these parameters on the optical performance of Simbol-X (see Chauvin et al., these proceedings). Using the simulation tool we have developed, we have conducted detailed analyses of the impact of different parameters on the imaging performance of the Simbol-X telescope.

  11. Analysis of the X-ray emission spectra of copper, germanium and rubidium plasmas produced at the Phelix laser facility

    NASA Astrophysics Data System (ADS)

    Comet, M.; Pain, J.-C.; Gilleron, F.; Piron, R.; Denis-Petit, D.; Méot, V.; Gosselin, G.; Morel, P.; Hannachi, F.; Gobet, F.; Tarisien, M.; Versteegen, M.

    2017-03-01

    We present the analysis of X-ray emission spectra of copper, germanium and rubidium plasmas measured at the Phelix laser facility. The laser intensity was around 6×10¹⁴ W cm⁻². The analysis is based on the hypothesis of a homogeneous plasma in local thermodynamic equilibrium, using an effective temperature. This temperature is deduced from hydrodynamic simulations and collisional-radiative computations. Spectra are then calculated using the LTE opacity codes OPAMCDF and SCO-RCG and compared to experimental data.

  12. Constraints on the Galactic Halo Dark Matter from Fermi-LAT Diffuse Measurements

    NASA Technical Reports Server (NTRS)

    Ackermann, M.; Ajello, M.; Atwood, W. B.; Baldini, L.; Barbiellini, G.; Bastieri, D.; Bechtol, K.; Bellazzini, R.; Blandford, R. D.; Bloom, E. D.; et al.

    2012-01-01

    We have performed an analysis of the diffuse gamma-ray emission with the Fermi Large Area Telescope (LAT) in the Milky Way halo region, searching for a signal from dark matter annihilation or decay. In the absence of a robust dark matter signal, constraints are presented. We consider both gamma rays produced directly in the dark matter annihilation/decay and produced by inverse Compton scattering of the e+/e- produced in the annihilation/decay. Conservative limits are derived requiring that the dark matter signal does not exceed the observed diffuse gamma-ray emission. A second set of more stringent limits is derived based on modeling the foreground astrophysical diffuse emission using the GALPROP code. Uncertainties in the height of the diffusive cosmic-ray halo, the distribution of the cosmic-ray sources in the Galaxy, the index of the injection cosmic-ray electron spectrum, and the column density of the interstellar gas are taken into account using a profile likelihood formalism, while the parameters governing the cosmic-ray propagation have been derived from fits to local cosmic-ray data. The resulting limits impact the range of particle masses over which dark matter thermal production in the early universe is possible, and challenge the interpretation of the PAMELA/Fermi-LAT cosmic ray anomalies as the annihilation of dark matter.

  13. High energy X-ray CT study on the central void formations and the fuel pin deformations of FBR fuel assemblies

    NASA Astrophysics Data System (ADS)

    Katsuyama, Kozo; Nagamine, Tsuyoshi; Matsumoto, Shin-ichiro; Sato, Seichi

    2007-02-01

    The central void formations and deformations of fuel pins were investigated in fuel assemblies irradiated to high burn-up, using a non-destructive X-ray CT (computed tomography) technique. In this X-ray CT, the effect of strong gamma-ray activity could be reduced to a negligible degree by using a pulsed high-energy X-ray source and detecting the intensity of the transmitted X-rays in synchronization with the generated X-rays. Clear cross-sectional images of fuel assemblies irradiated to high burn-up in a fast breeder reactor were successfully obtained, in which the wrapping wires, cladding, pellets and central voids could be distinctly seen. The diameter of a typical central void measured by X-ray CT agreed with that obtained by ceramography within an error of 0.1 mm. Based on this result, the dependence of the central void diameter on the linear heating rate was analyzed. In addition, the deformation behavior of a fuel pin along its axial direction could be analyzed from 20 stepwise X-ray cross-sectional images obtained at small intervals, and the results showed good agreement with the predictions calculated by two computer codes.

  14. Theoretical modeling of a portable x-ray tube based KXRF system to measure lead in bone

    PubMed Central

    Specht, Aaron J; Weisskopf, Marc G; Nie, Linda Huiling

    2017-01-01

    Objective: K-shell x-ray fluorescence (KXRF) techniques have been used to identify health effects resulting from exposure to metals for decades, but the equipment is bulky and requires significant maintenance and licensing procedures. A portable x-ray fluorescence (XRF) device was developed to overcome these disadvantages, but introduced a measurement dependency on soft tissue thickness. With recent advances in detector technology, an XRF device utilizing the advantages of both systems should be feasible. Approach: In this study, we used Monte Carlo simulations to test the feasibility of an XRF device with a high-energy x-ray tube and a detector operable at room temperature. Main results: We first validated the use of the Monte Carlo N-Particle transport code (MCNP) for x-ray tube simulations, and found good agreement between experimental and simulated results. We then optimized the x-ray tube settings and found the detection limit of the high-energy x-ray tube based XRF device for bone lead measurements to be 6.91 μg g⁻¹ bone mineral using a cadmium zinc telluride detector. Significance: In conclusion, this study validated the use of MCNP in simulations of x-ray tube physics and XRF applications, and demonstrated the feasibility of a high-energy x-ray tube based XRF device for metal exposure assessment. PMID:28169835

  15. Theoretical modeling of a portable x-ray tube based KXRF system to measure lead in bone.

    PubMed

    Specht, Aaron J; Weisskopf, Marc G; Nie, Linda Huiling

    2017-03-01

    K-shell x-ray fluorescence (KXRF) techniques have been used to identify health effects resulting from exposure to metals for decades, but the equipment is bulky and requires significant maintenance and licensing procedures. A portable x-ray fluorescence (XRF) device was developed to overcome these disadvantages, but introduced a measurement dependency on soft tissue thickness. With recent advances in detector technology, an XRF device utilizing the advantages of both systems should be feasible. In this study, we used Monte Carlo simulations to test the feasibility of an XRF device with a high-energy x-ray tube and a detector operable at room temperature. We first validated the use of the Monte Carlo N-Particle transport code (MCNP) for x-ray tube simulations, and found good agreement between experimental and simulated results. We then optimized the x-ray tube settings and found the detection limit of the high-energy x-ray tube based XRF device for bone lead measurements to be 6.91 µg g⁻¹ bone mineral using a cadmium zinc telluride detector. In conclusion, this study validated the use of MCNP in simulations of x-ray tube physics and XRF applications, and demonstrated the feasibility of a high-energy x-ray tube based XRF device for metal exposure assessment.

  16. An Integrated Model of Cognitive Control in Task Switching

    ERIC Educational Resources Information Center

    Altmann, Erik M.; Gray, Wayne D.

    2008-01-01

    A model of cognitive control in task switching is developed in which controlled performance depends on the system maintaining access to a code in episodic memory representing the most recently cued task. The main constraint on access to the current task code is proactive interference from old task codes. This interference and the mechanisms that…

  17. Controlling Energy Radiations of Electromagnetic Waves via Frequency Coding Metamaterials

    PubMed Central

    Wu, Haotian; Liu, Shuo; Wan, Xiang; Zhang, Lei; Wang, Dan; Li, Lianlin

    2017-01-01

    Metamaterials are artificial structures composed of subwavelength unit cells to control electromagnetic (EM) waves. The spatial coding representation of a metamaterial has the ability to describe the material in a digital way. Spatial coding metamaterials are typically constructed from unit cells that have similar shapes with fixed functionality. Here, the concept of the frequency coding metamaterial is proposed, which achieves different controls of EM energy radiation with a fixed spatial coding pattern as the frequency changes. In this case, not only different phase responses of the unit cells are considered, but also different phase sensitivities are required. Due to the different frequency sensitivities of unit cells, two units with the same phase response at the initial frequency may have different phase responses at a higher frequency. To describe the frequency coding property of a unit cell, a digitalized frequency sensitivity is proposed, in which the units are encoded with the digits "0" and "1" to represent low and high phase sensitivities, respectively. By this means, two degrees of freedom, spatial coding and frequency coding, are obtained to control EM energy radiation by a new class of frequency-spatial coding metamaterials. The above concepts and physical phenomena are confirmed by numerical simulations and experiments. PMID:28932671
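    The underlying 1-bit spatial-coding idea (digits "0" and "1" realized as 0 and π phase elements) can be sketched with a far-field array factor for a single row of elements; the element spacing and code strings below are illustrative, not from the paper:

```python
import cmath, math

def array_factor(bits, theta, d_over_lambda=0.5):
    """Far-field array factor magnitude of a 1-bit coding sequence:
    digit '0' -> 0 rad, digit '1' -> pi rad element phase; theta in radians."""
    k_d = 2.0 * math.pi * d_over_lambda
    return abs(sum(cmath.exp(1j * (math.pi * int(b) + k_d * n * math.sin(theta)))
                   for n, b in enumerate(bits)))
```

    A uniform code "000000" adds coherently at broadside (|AF| = 6), while the alternating code "010101" cancels there, redirecting the energy into split off-axis beams, which is the kind of pattern control the spatial coding digits provide.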

  18. Hartmann Testing of X-Ray Telescopes

    NASA Technical Reports Server (NTRS)

    Saha, Timo T.; Biskasch, Michael; Zhang, William W.

    2013-01-01

    Hartmann testing of x-ray telescopes is a simple test method to retrieve and analyze alignment errors and low-order circumferential errors of x-ray telescopes and their components. A narrow slit is scanned along the circumference of the telescope in front of the mirror and the centroids of the images are calculated. From the centroid data, alignment errors, radius variation errors, and cone-angle variation errors can be calculated. Mean cone angle, mean radial height (average radius), and the focal length of the telescope can also be estimated if the centroid data is measured at multiple focal plane locations. In this paper we present the basic equations that are used in the analysis process. These equations can be applied to full circumference or segmented x-ray telescopes. We use the Optical Surface Analysis Code (OSAC) to model a segmented x-ray telescope and show that the derived equations and accompanying analysis retrieves the alignment errors and low order circumferential errors accurately.

  19. Monte Carlo simulation of x-ray buildup factors of lead and its applications in shielding of diagnostic x-ray facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kharrati, Hedi; Agrebi, Amel; Karaoui, Mohamed-Karim

    2007-04-15

    X-ray buildup factors of lead in broad-beam geometry for energies from 15 to 150 keV are determined using the general-purpose Monte Carlo N-Particle radiation transport computer code (MCNP4C). The obtained buildup factor data are fitted to a modified three-parameter Archer et al. model for ease of calculating the broad-beam transmission by computer at any tube potential/filter combination in the diagnostic energy range. An example of their use to compute the broad-beam transmission at 70, 100, 120, and 140 kVp is given. The calculated broad-beam transmission is compared to data derived from the literature, showing good agreement. Therefore, the combination of the buildup factor data as determined and a mathematical model to generate x-ray spectra provides a computationally based solution to broad-beam transmission for lead barriers in shielding x-ray facilities.
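    The three-parameter Archer et al. model mentioned above gives broad-beam transmission in closed form, B(x) = [(1 + β/α)e^{αγx} − β/α]^{−1/γ}. A sketch with illustrative parameter values (the fitted α, β, γ for lead at each kVp must come from tabulations such as those derived in the paper):

```python
import math

def archer_transmission(x, alpha, beta, gamma):
    """Broad-beam transmission through a barrier of thickness x (mm) in the
    three-parameter Archer model:
    B(x) = [(1 + beta/alpha) * exp(alpha*gamma*x) - beta/alpha]**(-1/gamma)."""
    r = beta / alpha
    return ((1.0 + r) * math.exp(alpha * gamma * x) - r) ** (-1.0 / gamma)

# B(0) = 1 by construction, and B decreases monotonically with thickness.
```

    In shielding design the relation is typically inverted: given a required transmission, solve for the barrier thickness x.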

  20. Sensitivity of the Cherenkov Telescope Array to the Detection of Intergalactic Magnetic Fields

    NASA Astrophysics Data System (ADS)

    Meyer, Manuel; Conrad, Jan; Dickinson, Hugh

    2016-08-01

    Very high energy (VHE; energy E ≳ 100 GeV) γ-rays originating from extragalactic sources undergo pair production with low-energy photons of background radiation fields. These pairs can inverse-Compton scatter background photons, initiating an electromagnetic cascade. The spatial and temporal structure of this secondary γ-ray signal is altered as the e⁺e⁻ pairs are deflected in an intergalactic magnetic field (IGMF). We investigate how VHE observations with the future Cherenkov Telescope Array, with its high angular resolution and broad energy range, can potentially probe the IGMF. We identify promising sources and simulate γ-ray spectra over a wide range of values of the IGMF strength and coherence length using the publicly available ELMAG Monte Carlo code. Combining simulated observations in a joint likelihood approach, we find that current limits on the IGMF can be significantly improved. The projected sensitivity depends strongly on the time a source has been γ-ray active and on the maximum emitted γ-ray energy.
