Sample records for ray tracing algorithms

  1. Ray-tracing method for creeping waves on arbitrarily shaped nonuniform rational B-splines surfaces.

    PubMed

    Chen, Xi; He, Si-Yuan; Yu, Ding-Feng; Yin, Hong-Cheng; Hu, Wei-Dong; Zhu, Guo-Qiang

    2013-04-01

    An accurate creeping ray-tracing algorithm is presented in this paper to determine the tracks of creeping waves (or creeping rays) on arbitrarily shaped free-form parametric surfaces [nonuniform rational B-splines (NURBS) surfaces]. The main challenge in calculating the surface diffracted fields on NURBS surfaces is due to the difficulty in determining the geodesic paths along which the creeping rays propagate. On one single parametric surface patch, the geodesic paths need to be computed by solving the geodesic equations numerically. Furthermore, realistic objects are generally modeled as the union of several connected NURBS patches. Due to the discontinuity of the parameter between the patches, it is more complicated to compute geodesic paths on several connected patches than on one single patch. Thus, a creeping ray-tracing algorithm is presented in this paper to compute the geodesic paths of creeping rays on complex objects that are modeled as the combination of several NURBS surface patches. In the algorithm, the creeping ray tracing on each surface patch is performed by solving the geodesic equations with a Runge-Kutta method. When the creeping ray propagates from one patch to another, a transition method is developed to handle the transition of the creeping ray tracing across the border between the patches. This creeping ray-tracing algorithm can meet practical requirements because it can be applied to objects with complex shapes. The algorithm can also extend the applicability of NURBS for electromagnetic and optical applications. The validity and usefulness of the algorithm are verified by the numerical results.
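
    The numerical core described above, integrating the geodesic equations on a parametric patch with a Runge-Kutta method, can be sketched as follows. This is a minimal illustration, not the authors' code: the christoffel callback, which returns the surface's Christoffel symbols at a parameter point, is a hypothetical stand-in for quantities that would in practice be derived from the NURBS surface metric.

    ```python
    import numpy as np

    def geodesic_rhs(y, christoffel):
        """Right-hand side of the geodesic ODE y' = f(y), with
        y = (u, v, du, dv) in surface parameter space."""
        u, v, du, dv = y
        G = christoffel(u, v)          # G[k][i][j] = Christoffel symbol Gamma^k_ij
        vel = np.array([du, dv])
        acc = -np.einsum('kij,i,j->k', G, vel, vel)
        return np.array([du, dv, acc[0], acc[1]])

    def rk4_step(y, h, christoffel):
        """One fourth-order Runge-Kutta step along the geodesic."""
        k1 = geodesic_rhs(y, christoffel)
        k2 = geodesic_rhs(y + 0.5 * h * k1, christoffel)
        k3 = geodesic_rhs(y + 0.5 * h * k2, christoffel)
        k4 = geodesic_rhs(y + h * k3, christoffel)
        return y + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

    # Sanity check: on a plane all Christoffel symbols vanish, so the
    # traced geodesic is a straight line in parameter space.
    flat = lambda u, v: np.zeros((2, 2, 2))
    y = np.array([0.0, 0.0, 1.0, 0.5])     # start point and direction in (u, v)
    for _ in range(100):
        y = rk4_step(y, 0.01, flat)
    print(y)   # -> approximately [1.0, 0.5, 1.0, 0.5]
    ```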

  2. Technical Note: A direct ray-tracing method to compute integral depth dose in pencil beam proton radiography with a multilayer ionization chamber.

    PubMed

    Farace, Paolo; Righetto, Roberto; Deffet, Sylvain; Meijers, Arturs; Vander Stappen, Francois

    2016-12-01

    To introduce a fast ray-tracing algorithm in pencil beam proton radiography (PR) with a multilayer ionization chamber (MLIC) for in vivo range error mapping. Pencil beam PR was obtained by delivering spots uniformly positioned in a square (45 × 45 mm² field-of-view) of 9 × 9 spots capable of crossing the phantoms (210 MeV). The exit beam was collected by a MLIC to sample the integral depth dose (IDD_MLIC). PRs of an electron-density and of a head phantom were acquired by moving the couch to obtain multiple 45 × 45 mm² frames. To map the corresponding range errors, the two-dimensional set of IDD_MLIC was compared with (i) the integral depth dose computed by the treatment planning system (TPS) by both analytic (IDD_TPS) and Monte Carlo (IDD_MC) algorithms in a volume of water simulating the MLIC at the CT, and (ii) the integral depth dose directly computed by a simple ray-tracing algorithm (IDD_direct) through the same CT data. The exact spatial position of the spot pattern was numerically adjusted by testing different in-plane positions and selecting the one that minimized the range differences between IDD_direct and IDD_MLIC. Range error mapping was feasible by both the TPS and the ray-tracing methods, but very sensitive to even small misalignments. In homogeneous regions, the range errors computed by the direct ray-tracing algorithm matched the results obtained by both the analytic and the Monte Carlo algorithms. In both phantoms, lateral heterogeneities were better modeled by the ray-tracing and the Monte Carlo algorithms than by the analytic TPS computation. Accordingly, when the pencil beam crossed lateral heterogeneities, the range errors mapped by the direct algorithm matched the Monte Carlo maps better than those obtained by the analytic algorithm. Finally, the simplicity of the ray-tracing algorithm made it possible to implement a prototype procedure for automated spatial alignment. The ray-tracing algorithm can reliably replace the TPS method in MLIC PR for in vivo range verification, and it can be a key component in developing software tools for spatial alignment and correction of CT calibration.
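
    The "simple ray-tracing algorithm" referred to above accumulates an integral quantity along a straight pencil-beam path through CT data. A minimal sketch of that idea, assuming isotropic 1 mm voxels, nearest-voxel sampling, and a CT volume already converted to relative stopping power (all illustrative assumptions, not details from the paper):

    ```python
    import numpy as np

    def trace_depth(ct_rsp, start, direction, step=0.5):
        """Accumulate water-equivalent path length (WEPL) along a straight
        ray through a 3D grid of relative stopping powers (isotropic 1 mm
        voxels). `start` is in mm, `direction` is a unit vector."""
        pos = np.asarray(start, dtype=float)
        d = np.asarray(direction, dtype=float)
        d /= np.linalg.norm(d)
        wepl = 0.0
        while True:
            idx = tuple(np.floor(pos).astype(int))
            if any(i < 0 or i >= n for i, n in zip(idx, ct_rsp.shape)):
                break                      # ray has left the CT volume
            wepl += ct_rsp[idx] * step     # nearest-voxel sampling of RSP
            pos += d * step
        return wepl

    # 100 mm slab of water-equivalent material (RSP = 1) -> WEPL ~ 100 mm.
    phantom = np.ones((100, 100, 100))
    print(trace_depth(phantom, start=(0.0, 50.0, 50.0), direction=(1.0, 0.0, 0.0)))
    ```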

  3. Ray Tracing Through Non-Imaging Concentrators

    NASA Astrophysics Data System (ADS)

    Greynolds, Alan W.

    1984-01-01

    A generalized algorithm for tracing rays through both imaging and non-imaging radiation collectors is presented. A computer program based on the algorithm is then applied to analyzing various two-stage Winston concentrators.

  4. A data distributed parallel algorithm for ray-traced volume rendering

    NASA Technical Reports Server (NTRS)

    Ma, Kwan-Liu; Painter, James S.; Hansen, Charles D.; Krogh, Michael F.

    1993-01-01

    This paper presents a divide-and-conquer ray-traced volume rendering algorithm and a parallel image compositing method, along with their implementation and performance on the Connection Machine CM-5 and networked workstations. This algorithm distributes both the data and the computations to individual processing units to achieve fast, high-quality rendering of high-resolution data. The volume data, once distributed, is left intact. The processing nodes perform local ray tracing of their subvolumes concurrently. No communication between processing units is needed during this local ray-tracing process. A subimage is generated by each processing unit, and the final image is obtained by compositing the subimages in the proper order, which can be determined a priori. Test results on both the CM-5 and a group of networked workstations demonstrate the practicality of our rendering algorithm and compositing method.
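
    The compositing step described above relies on the fact that, for a ray-traced subvolume decomposition, the subimages can be blended in a fixed back-to-front order with the standard "over" operator. A minimal sketch, assuming premultiplied RGBA subimages and leaving out the paper's parallel communication scheme:

    ```python
    import numpy as np

    def composite_over(subimages):
        """Back-to-front 'over' compositing of premultiplied RGBA
        subimages. `subimages` is ordered back to front, the order that
        (as the paper notes) can be determined a priori."""
        out = np.zeros_like(subimages[0])
        for img in subimages:                 # img shape: (h, w, 4), values in [0, 1]
            alpha = img[..., 3:4]
            out = img + out * (1.0 - alpha)   # front layer over accumulated result
        return out

    a = np.zeros((2, 2, 4)); a[..., 0] = 0.5; a[..., 3] = 0.5   # translucent red, behind
    b = np.zeros((2, 2, 4)); b[..., 2] = 0.5; b[..., 3] = 0.5   # translucent blue, in front
    print(composite_over([a, b])[0, 0])       # blue over red: [0.25, 0, 0.5, 0.75]
    ```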

  5. Fast kinematic ray tracing of first- and later-arriving global seismic phases

    NASA Astrophysics Data System (ADS)

    Bijwaard, Harmen; Spakman, Wim

    1999-11-01

    We have developed a ray tracing algorithm that traces first- and later-arriving global seismic phases precisely (traveltime errors of the order of 0.1 s), and with great computational efficiency (15 rays s⁻¹). To achieve this, we have extended and adapted two existing ray tracing techniques: a graph method and a perturbation method. The two resulting algorithms are able to trace (critically) refracted, (multiply) reflected, some diffracted (Pdiff), and (multiply) converted seismic phases in a 3-D spherical geometry, thus including the largest part of seismic phases that are commonly observed on seismograms. We have tested and compared the two methods in 2-D and 3-D Cartesian and spherical models, for which both algorithms have yielded precise paths and traveltimes. These tests indicate that only the perturbation method is computationally efficient enough to perform 3-D ray tracing on global data sets of several million phases. To demonstrate its potential for non-linear tomography, we have applied the ray perturbation algorithm to a data set of 7.6 million P and pP phases used by Bijwaard et al. (1998) for linearized tomography. This showed that the expected heterogeneity within the Earth's mantle leads to significant non-linear effects on traveltimes for 10 per cent of the applied phases.

  6. Real time ray tracing based on shader

    NASA Astrophysics Data System (ADS)

    Gui, JiangHeng; Li, Min

    2017-07-01

    Ray tracing is a rendering algorithm that generates an image by tracing light rays through an image plane; it can simulate complicated optical phenomena such as refraction, depth of field, and motion blur. Compared with rasterization, ray tracing can achieve a more realistic rendering result, but at much greater computational cost: even simple scenes can take a long time to render. With improvements in GPU performance and the advent of the programmable rendering pipeline, complicated algorithms can now be implemented directly on shaders. This paper therefore proposes a new method that implements ray tracing directly on the fragment shader, covering surface intersection, importance sampling, and progressive rendering. With the help of the GPU's high throughput, it achieves real-time rendering of simple scenes.
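
    The per-pixel structure of such a progressive shader-based renderer, an intersection test plus a running average of jittered samples, is sketched below in Python rather than GLSL for readability; the scene (a single sphere) and the sampling scheme are illustrative assumptions.

    ```python
    import numpy as np

    def hit_sphere(origin, direction, center, radius):
        """Return the nearest positive ray parameter t of a ray-sphere
        intersection, or None if the ray misses (direction must be unit)."""
        oc = origin - center
        b = np.dot(oc, direction)
        c = np.dot(oc, oc) - radius * radius
        disc = b * b - c
        if disc < 0.0:
            return None
        t = -b - np.sqrt(disc)
        return t if t > 0.0 else None

    # Progressive rendering: the running mean of per-frame samples converges
    # to the true pixel value, mirroring how a shader accumulates frames.
    rng = np.random.default_rng(0)
    accum, frames = 0.0, 0
    for _ in range(64):
        jitter = rng.uniform(-0.5, 0.5, size=3) * 1e-3   # sub-pixel sampling
        t = hit_sphere(np.zeros(3) + jitter, np.array([0.0, 0.0, 1.0]),
                       np.array([0.0, 0.0, 5.0]), 1.0)
        sample = 1.0 if t is not None else 0.0
        frames += 1
        accum += (sample - accum) / frames               # incremental average
    print(accum)
    ```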

  7. Vertex shading of the three-dimensional model based on ray-tracing algorithm

    NASA Astrophysics Data System (ADS)

    Hu, Xiaoming; Sang, Xinzhu; Xing, Shujun; Yan, Binbin; Wang, Kuiru; Dou, Wenhua; Xiao, Liquan

    2016-10-01

    The ray tracing algorithm is one of the research hotspots in photorealistic graphics. It is an important light-and-shadow technique in many industries that work with three-dimensional (3D) structure, such as aerospace, games, and video. Unlike the traditional method of pixel shading based on ray tracing, a novel ray tracing algorithm is presented to color and render the vertices of the 3D model directly. Rendering results depend on the degree of subdivision of the 3D model. A good light-and-shade effect is achieved by using a quad-tree data structure to adaptively subdivide a triangle according to the brightness difference of its vertices. The uniform grid algorithm is adopted to improve the rendering efficiency. Besides, the rendering time is independent of the screen resolution. In theory, as long as the subdivision of a model is adequate, effects matching those of pixel shading will be obtained. In practice, a compromise can be struck between efficiency and rendering quality.

  8. Determining Hypocentral Parameters for Local Earthquakes in 1-D Using a Genetic Algorithm and Two-point ray tracing

    NASA Astrophysics Data System (ADS)

    Kim, W.; Hahm, I.; Ahn, S. J.; Lim, D. H.

    2005-12-01

    This paper introduces a powerful method for determining hypocentral parameters for local earthquakes in 1-D using a genetic algorithm (GA) and two-point ray tracing. Determining hypocentral parameters with existing algorithms is difficult, because these parameters can vary with the initial velocity models. We developed a new method to solve this problem by applying a GA to an existing algorithm, HYPO-71 (Lee and Lahr, 1975). The original HYPO-71 algorithm was modified by applying two-point ray tracing and a weighting factor with respect to the takeoff angle at the source to reduce errors from the ray path and hypocenter depth. Artificial data, without error, were generated by computer using two-point ray tracing in a true model, in which the velocity structure and hypocentral parameters were known. The accuracy of the calculated results was easily determined by comparing calculated and actual values. We examined the accuracy of this method for several cases by changing the true and modeled layer numbers and thicknesses. The computational results show that this method determines nearly exact hypocentral parameters without depending on initial velocity models. Furthermore, accurate and nearly unique hypocentral parameters were obtained, although the number of modeled layers and thicknesses differed from those in the true model. Therefore, this method can be a useful tool for determining hypocentral parameters in regions where reliable local velocity values are unknown. This method also provides basic a priori information for 3-D studies. KEYWORDS: hypocentral parameters, genetic algorithm (GA), two-point ray tracing
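
    The overall loop, a GA searching hypocenter space while a ray tracer supplies travel times, can be sketched as follows. The travel_time function here is a hypothetical stand-in (straight rays in a uniform half-space) for the paper's two-point ray tracing through a layered model:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    stations = rng.uniform(0, 50, size=(8, 2))            # station x, y (km)
    true_hypo = np.array([20.0, 30.0, 8.0])               # x, y, depth (km)

    def travel_time(hypo, sta, v=6.0):
        """Hypothetical stand-in for two-point ray tracing: straight-ray
        travel time in a uniform half-space with velocity v (km/s)."""
        dx = np.hypot(sta[:, 0] - hypo[0], sta[:, 1] - hypo[1])
        return np.sqrt(dx**2 + hypo[2]**2) / v

    observed = travel_time(true_hypo, stations)

    def misfit(hypo):
        return np.sqrt(np.mean((travel_time(hypo, stations) - observed) ** 2))

    # A very small genetic algorithm: selection, crossover, mutation.
    pop = rng.uniform([0, 0, 0], [50, 50, 30], size=(60, 3))
    for _ in range(80):
        pop = pop[np.argsort([misfit(p) for p in pop])]   # rank by RMS residual
        elite = pop[:20]
        parents = elite[rng.integers(0, 20, size=(40, 2))]
        w = rng.uniform(size=(40, 1))
        children = w * parents[:, 0] + (1 - w) * parents[:, 1]   # blend crossover
        children += rng.normal(0, 0.5, size=children.shape)      # mutation
        pop = np.vstack([elite, children])
    print(pop[0], misfit(pop[0]))   # best hypocentre estimate
    ```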

  9. Improved algorithm of ray tracing in ICF cryogenic targets

    NASA Astrophysics Data System (ADS)

    Zhang, Rui; Yang, Yongying; Ling, Tong; Jiang, Jiabin

    2016-10-01

    High-precision ray tracing inside inertial confinement fusion (ICF) cryogenic targets plays an important role in the reconstruction of the three-dimensional density distribution by the algebraic reconstruction technique (ART). The traditional Runge-Kutta method, which is restricted by the precision of the grid division and the step size of ray tracing, cannot make an accurate calculation in the case of refractive index saltation. In this paper, we propose an improved ray-tracing algorithm based on the Runge-Kutta method and Snell's law of refraction to achieve high tracing precision. On refractive index boundaries, we apply Snell's law of refraction and a contact-point search algorithm to ensure the accuracy of the simulation. Inside the cryogenic target, the Runge-Kutta method is combined with a self-adaptive step algorithm for the computation. The original refractive index data, which are used to mesh the target, can be obtained by experimental measurement or from an a priori refractive index distribution function. A finite-difference method is used to calculate the refractive index gradient at mesh nodes, and distance-weighted average interpolation is used to obtain the refractive index and its gradient at each point in space. In the simulation, we take an ideal ICF target, a Luneburg lens, and a graded-index rod as models to calculate the spot diagram and wavefront map. Comparing the simulation results with Zemax shows that the improved ray-tracing algorithm based on the fourth-order Runge-Kutta method and Snell's law of refraction exhibits high accuracy. The relative error of the spot diagram is 0.2%, and the peak-to-valley (PV) and root-mean-square (RMS) errors of the wavefront map are less than λ/35 and λ/100, respectively.
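
    At a refractive-index boundary the algorithm switches from Runge-Kutta stepping to Snell's law. The vector form of Snell's law with a total-internal-reflection check is standard and can be sketched as follows (illustrative code, not the authors' implementation):

    ```python
    import numpy as np

    def refract(d, n, n1, n2):
        """Vector form of Snell's law. `d` is the unit ray direction, `n`
        the unit surface normal pointing into the incident medium, n1/n2
        the refractive indices. Returns the refracted unit direction, or
        the reflected direction on total internal reflection."""
        eta = n1 / n2
        cos_i = -np.dot(n, d)
        k = 1.0 - eta * eta * (1.0 - cos_i * cos_i)
        if k < 0.0:                       # total internal reflection
            return d + 2.0 * cos_i * n
        return eta * d + (eta * cos_i - np.sqrt(k)) * n

    # 45-degree incidence from air (n = 1.0) into glass (n = 1.5).
    d = np.array([1.0, -1.0, 0.0]) / np.sqrt(2.0)
    print(refract(d, np.array([0.0, 1.0, 0.0]), 1.0, 1.5))
    ```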

  10. 2-Dimensional B-Spline Algorithms with Applications to Ray Tracing in Media of Spatially-Varying Refractive Index

    DTIC Science & Technology

    2007-08-01

    Photon trajectories are computed using a solution of the Eikonal equation (ray-tracing methods) rather than linear trajectories, within a multi-layer biological tissue model, and the radiative transport solution is coupled into heat transfer and damage models. Subject terms: B-Splines, Ray-Tracing, Eikonal Equation.

  11. Ray tracing through a hexahedral mesh in HADES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henderson, G L; Aufderheide, M B

    In this paper we describe a new ray tracing method targeted for inclusion in HADES. The algorithm tracks rays through three-dimensional tetrakis hexahedral mesh objects, like those used by the ARES code to model inertial confinement experiments.

  12. Synthetic Image Generator Model; Application of View Angle Dependent Reflectivity Components and Performance Evaluation in the Visible Region

    DTIC Science & Technology

    1993-02-01

    SIG modeling is dependent on proper treatment of reflectivity effects. Prior to reviewing reflectivity, a brief look is taken at methods of applying complex theoretical energy-propagation algorithms; two such methods are ray tracing and radiosity (Goral et al., 1984).

  13. Effective algorithm for ray-tracing simulations of lobster eye and similar reflective optical systems

    NASA Astrophysics Data System (ADS)

    Tichý, Vladimír; Hudec, René; Němcová, Šárka

    2016-06-01

    The algorithm presented is intended mainly for lobster eye optics. This type of optics (and some similar types) allows a simplification of the classical ray-tracing procedure, which requires a great many rays to be traced. The method presented simulates only a few rays and is therefore extremely efficient. Moreover, to simplify the equations, a specific mathematical formalism is used. Only a few simple equations are needed, so the program code can be simple as well. The paper also outlines how to apply the method to some other reflective optical systems.

  14. Internal wave scattering in continental slope canyons, part 1: Theory and development of a ray tracing algorithm

    NASA Astrophysics Data System (ADS)

    Nazarian, Robert H.; Legg, Sonya

    2017-10-01

    When internal waves interact with topography, such as continental slopes, they can transfer wave energy to local dissipation and diapycnal mixing. Submarine canyons comprise approximately ten percent of global continental slopes, and can enhance the local dissipation of internal wave energy, yet parameterizations of canyon mixing processes are currently missing from large-scale ocean models. As a first step in the development of such parameterizations, we conduct a parameter space study of M2 tidal-frequency, low-mode internal waves interacting with idealized V-shaped canyon topographies. Specifically, we examine the effects of varying the canyon mouth width, shape and slope of the thalweg (line of lowest elevation). This effort is divided into two parts. In the first part, presented here, we extend the theory of 3-dimensional internal wave reflection to a rotated coordinate system aligned with our idealized V-shaped canyons. Based on the updated linear internal wave reflection solution that we derive, we construct a ray tracing algorithm which traces a large number of rays (the discrete analog of a continuous wave) into the canyon region where they can scatter off topography. Although a ray tracing approach has been employed in other studies, we have, for the first time, used ray tracing to calculate changes in wavenumber and ray density which, in turn, can be used to calculate the Froude number (a measure of the likelihood of instability). We show that for canyons of intermediate aspect ratio, large spatial envelopes of instability can form in the presence of supercritical sidewalls. Additionally, the canyon height and length can modulate the Froude number. The second part of this study, a diagnosis of internal wave scattering in continental slope canyons using both numerical simulations and this ray tracing algorithm, as well as a test of robustness of the ray tracing, is presented in the companion article.

  15. A ray tracing model of gravity wave propagation and breakdown in the middle atmosphere

    NASA Technical Reports Server (NTRS)

    Schoeberl, M. R.

    1985-01-01

    Gravity wave ray tracing and wave packet theory is used to parameterize wave breaking in the mesosphere. Rays are tracked by solving the group velocity equations, and the interaction with the basic state is determined by considering the evolution of the packet wave action density. The ray tracing approach has a number of advantages over the steady state parameterization as the effects of gravity wave focussing and refraction, local dissipation, and wave response to rapid changes in the mean flow are more realistically considered; however, if steady state conditions prevail, the method gives identical results. The ray tracing algorithm is tested using both interactive and noninteractive models of the basic state. In the interactive model, gravity wave interaction with the polar night jet on a beta-plane is considered. The algorithm produces realistic polar night jet closure for weak topographic forcing of gravity waves. Planetary scale waves forced by local transfer of wave action into the basic flow in turn transfer their wave action into the zonal mean flow. Highly refracted rays are also found not to contribute greatly to the climatology of the mesosphere, as their wave action is severely reduced by dissipation during their lateral travel.

  16. Improved backward ray tracing with stochastic sampling

    NASA Astrophysics Data System (ADS)

    Ryu, Seung Taek; Yoon, Kyung-Hyun

    1999-03-01

    This paper presents a new technique that enhances diffuse interreflection within the framework of backward ray tracing. In this research, we have modeled the diffuse rays under the following conditions. First, as reflection from diffuse surfaces occurs in all directions, it is impossible to trace all of the reflected rays; we confined the diffuse rays by sampling a spherical angle around the normal vector out of the reflected rays. Second, the distance traveled by energy reflected from a diffuse surface depends on the object's properties and is comparatively short. Considering that rays created on diffuse surfaces affect a relatively small area, it is very inefficient to trace all of the sampled diffuse rays; therefore, we set a fixed distance as the critical distance, and all rays beyond this distance are ignored. Because the improved backward ray tracing can model illumination effects such as color bleeding, it can replace the radiosity algorithm in limited environments.

  17. SU-D-206-02: Evaluation of Partial Storage of the System Matrix for Cone Beam Computed Tomography Using a GPU Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matenine, D; Cote, G; Mascolo-Fortin, J

    2016-06-15

    Purpose: Iterative reconstruction algorithms in computed tomography (CT) require a fast method for computing the intersections between the photons’ trajectories and the object, also called ray-tracing or system matrix computation. This work evaluates different ways to store the system matrix, aiming to reconstruct dense image grids in reasonable time. Methods: We propose an optimized implementation of Siddon’s algorithm using graphics processing units (GPUs) with a novel data storage scheme. The algorithm computes a part of the system matrix on demand, typically for one projection angle. The proposed method was enhanced with accelerating options: storage of larger subsets of the system matrix, systematic reuse of data via geometric symmetries, an arithmetic-rich parallel code, and code configuration via machine learning. It was tested on geometries mimicking a cone beam CT acquisition of a human head. To realistically assess the execution time, the ray-tracing routines were integrated into a regularized Poisson-based reconstruction algorithm. The proposed scheme was also compared to a different approach, where the system matrix is fully pre-computed and loaded at reconstruction time. Results: Fast ray-tracing of realistic acquisition geometries, which often lack spatial symmetry properties, was enabled via the proposed method. Ray-tracing interleaved with projection and backprojection operations required significant additional time. In most cases, ray-tracing was shown to use about 66% of the total reconstruction time. In absolute terms, tracing times varied from 3.6 s to 7.5 min, depending on the problem size. The presence of geometrical symmetries allowed for non-negligible ray-tracing and reconstruction time reduction. Arithmetic-rich parallel code and machine learning permitted a modest reconstruction time reduction, on the order of 1%. Conclusion: Partial system matrix storage permitted the reconstruction of higher 3D image grid sizes and larger projection datasets at the cost of additional time, when compared to the fully pre-computed approach. This work was supported in part by the Fonds de recherche du Quebec - Nature et technologies (FRQ-NT). The authors acknowledge partial support by the CREATE Medical Physics Research Training Network grant of the Natural Sciences and Engineering Research Council of Canada (Grant No. 432290).
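
    For readers unfamiliar with Siddon's algorithm, the parametric plane-crossing idea behind the system-matrix rows can be sketched in 2D as follows; this minimal Python version (grid origin, spacing, and variable names are illustrative assumptions) returns the per-pixel intersection lengths that make up one row of the matrix:

    ```python
    import numpy as np

    def siddon_2d(p0, p1, nx, ny, dx=1.0, dy=1.0):
        """Compact 2D variant of Siddon's parametric ray tracing through a
        regular grid with origin at (0, 0): returns (ix, iy, length)
        triples, the nonzero entries of one system-matrix row."""
        p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
        delta = p1 - p0
        # Parametric values where the ray crosses each x- and y-plane.
        ax = ((np.arange(nx + 1) * dx - p0[0]) / delta[0]) if delta[0] else np.array([])
        ay = ((np.arange(ny + 1) * dy - p0[1]) / delta[1]) if delta[1] else np.array([])
        alphas = np.concatenate([[0.0], ax, ay, [1.0]])
        alphas = np.unique(alphas[(alphas >= 0.0) & (alphas <= 1.0)])
        ray_len = np.linalg.norm(delta)
        out = []
        for a0, a1 in zip(alphas[:-1], alphas[1:]):
            mid = p0 + 0.5 * (a0 + a1) * delta        # midpoint picks the pixel
            ix, iy = int(mid[0] // dx), int(mid[1] // dy)
            if 0 <= ix < nx and 0 <= iy < ny:
                out.append((ix, iy, (a1 - a0) * ray_len))
        return out

    # A diagonal ray through a 2x2 grid of unit pixels.
    for ix, iy, L in siddon_2d((0.0, 0.0), (2.0, 2.0), 2, 2):
        print(ix, iy, round(L, 4))   # two pixels, each sqrt(2) long
    ```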

  18. Combined visualization for noise mapping of industrial facilities based on ray-tracing and thin plate splines

    NASA Astrophysics Data System (ADS)

    Ovsiannikov, Mikhail; Ovsiannikov, Sergei

    2017-01-01

    The paper presents a combined approach to noise mapping and visualization of industrial facilities' sound pollution using a forward ray-tracing method and thin-plate spline interpolation. It is suggested to cluster the industrial area into separate zones with similar sound levels. An equivalent local source is defined for computing the extent of sanitary zones based on the ray-tracing algorithm. Computation of sound pressure levels within the clustered zones is based on two-dimensional spline interpolation of data measured on the perimeter and inside each zone.

  19. Simulated annealing two-point ray tracing

    NASA Astrophysics Data System (ADS)

    Velis, Danilo R.; Ulrych, Tadeusz J.

    We present a new method for solving the two-point seismic ray tracing problem based on Fermat's principle. The algorithm overcomes some well known difficulties that arise in standard ray shooting and bending methods. Problems related to (1) the selection of new take-off angles and (2) local minima in multipathing cases are overcome by using an efficient simulated annealing (SA) algorithm. At each iteration, the ray is propagated from the source by solving a standard initial value problem. The last portion of the raypath is then forced to pass through the receiver. Using SA, the total traveltime is then globally minimized by obtaining the initial conditions that produce the absolute minimum path. The procedure is suitable for tracing rays through 2D complex structures, although it can be extended to deal with 3D velocity media. Not only direct waves but also reflected waves and head waves can be incorporated into the scheme. One important advantage is its simplicity, inasmuch as any available or user-preferred initial value solver can be used. A number of clarifying examples of multipathing in 2D media are examined.
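
    The essential loop, simulated annealing over the initial conditions of a shooting method, can be sketched as follows. The shoot function is a hypothetical stand-in for the initial-value ray solver, returning the total traveltime for a given take-off angle:

    ```python
    import math, random

    def shoot(takeoff):
        """Hypothetical stand-in for an initial-value ray solver: returns
        the total traveltime, source to receiver, for a given take-off
        angle. Here: a smooth multimodal toy function."""
        return 2.0 + 0.5 * math.cos(3.0 * takeoff) + 0.1 * (takeoff - 0.8) ** 2

    random.seed(0)
    angle = random.uniform(0.0, math.pi)     # initial take-off angle
    best, t = angle, shoot(angle)
    temp = 1.0
    while temp > 1e-4:
        trial = angle + random.gauss(0.0, 0.3)
        trial = min(max(trial, 0.0), math.pi)
        t_trial = shoot(trial)
        # Metropolis rule: always accept improvements, sometimes accept worse.
        if t_trial < t or random.random() < math.exp((t - t_trial) / temp):
            angle, t = trial, t_trial
            if t_trial < shoot(best):
                best = trial
        temp *= 0.995                        # geometric cooling schedule
    print(best, shoot(best))
    ```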

  20. Improvements of the Ray-Tracing Based Method Calculating Hypocentral Loci for Earthquake Location

    NASA Astrophysics Data System (ADS)

    Zhao, A. H.

    2014-12-01

    Hypocentral loci are very useful for reliable and visual earthquake location. However, they can hardly be expressed analytically when the velocity model is complex. One method for calculating them numerically is based on a minimum traveltime tree algorithm for tracing rays: a focal locus is represented in terms of ray paths in its residual field from the minimum point (namely the initial point) to low residual points (referred to as reference points of the focal locus). The method has no restrictions on the complexity of the velocity model but lacks the ability to deal correctly with multi-segment loci. Additionally, it is rather laborious to set calculation parameters that yield loci with satisfactory completeness and fineness. In this study, we improve the ray-tracing based numerical method to overcome these shortcomings. (1) Reference points of a hypocentral locus are selected from the nodes of the model cells that it passes through, by means of a so-called peeling method. (2) The calculation domain of a hypocentral locus is defined as a low residual area whose connected regions each include one segment of the locus, and hence all the focal locus segments are calculated with the minimum traveltime tree ray-tracing algorithm by repeatedly assigning the minimum-residual reference point among those not yet traced as an initial point. (3) Short ray paths without branching are removed to make the calculated locus finer. Numerical tests show that the improved method is capable of efficiently calculating complete and fine hypocentral loci of earthquakes in a complex model.

  1. NVIDIA OptiX ray-tracing engine as a new tool for modelling medical imaging systems

    NASA Astrophysics Data System (ADS)

    Pietrzak, Jakub; Kacperski, Krzysztof; Cieślar, Marek

    2015-03-01

    The most accurate technique to model the X- and gamma-radiation path through a numerically defined object is Monte Carlo simulation, which follows single photons according to their interaction probabilities. A simplified and much faster approach, which just integrates total interaction probabilities along selected paths, is known as ray tracing. Both techniques are used in medical imaging for simulating real imaging systems and as projectors required in iterative tomographic reconstruction algorithms. These approaches are ready for massively parallel implementation, e.g., on graphics processing units (GPUs), which can greatly accelerate the computation time at a relatively low cost. In this paper we describe the application of the NVIDIA OptiX ray-tracing engine, popular in professional graphics and rendering applications, as a new powerful tool for X- and gamma-ray tracing in medical imaging. It allows the implementation of a variety of physical interactions of rays with pixel-, mesh- or NURBS-based objects, and the recording of any required quantities, such as path integrals, interaction sites, and deposited energies. Using the OptiX engine we have implemented a code for rapid Monte Carlo simulations of Single Photon Emission Computed Tomography (SPECT) imaging, as well as a ray-tracing projector, which can be used in reconstruction algorithms. The engine generates efficient, scalable and optimized GPU code, ready to run on multi-GPU heterogeneous systems. We have compared the results of our simulations with the GATE package. With the OptiX engine, the computation time of a Monte Carlo simulation can be reduced from days to minutes.

  2. Evaluation of simulation alternatives for the brute-force ray-tracing approach used in backlight design

    NASA Astrophysics Data System (ADS)

    Desnijder, Karel; Hanselaer, Peter; Meuret, Youri

    2016-04-01

    A key requirement for obtaining uniform luminance from a side-lit LED backlight is an optimised spatial pattern of the structures on the light guide that extract the light. Such a scatter pattern is usually generated by an iterative approach: in each iteration, the luminance distribution of the backlight with a particular scatter pattern is analysed. This is typically done with a brute-force ray-tracing algorithm, although that approach results in a time-consuming optimisation process. In this study, the Adding-Doubling method is explored as an alternative way of evaluating the luminance of a backlight. Due to the similarities between light propagating in a backlight with extraction structures and light scattering in a cloud of scatterers, the Adding-Doubling method, which is used to model the latter, can also be used to model the light distribution in a backlight. The backlight problem is translated into a form to which the Adding-Doubling method is directly applicable. The luminance calculated with the Adding-Doubling method for a simple uniform extraction pattern matches the luminance generated by a commercial ray tracer very well. Although the approach is successful, no clear computational advantage over ray tracers is realised. However, the description of light propagation in a light guide used by the Adding-Doubling method also makes it possible to enhance the efficiency of brute-force ray-tracing algorithms. The performance of this enhanced ray-tracing approach for the simulation of backlights is also evaluated against a typical brute-force ray-tracing approach.

  3. A nonvoxel-based dose convolution/superposition algorithm optimized for scalable GPU architectures.

    PubMed

    Neylon, J; Sheng, K; Yu, V; Chen, Q; Low, D A; Kupelian, P; Santhanam, A

    2014-10-01

    Real-time adaptive planning and treatment has been infeasible due in part to its high computational complexity. There have been many recent efforts to utilize graphics processing units (GPUs) to accelerate the computational performance and dose accuracy in radiation therapy. Data structure and memory access patterns are the key GPU factors that determine the computational performance and accuracy. In this paper, the authors present a nonvoxel-based (NVB) approach to maximize computational and memory access efficiency and throughput on the GPU. The proposed algorithm employs a ray-tracing mechanism to restructure the 3D data sets computed from the CT anatomy into a nonvoxel-based framework. In a process that takes only a few milliseconds of computing time, the algorithm restructured the data sets by ray-tracing through precalculated CT volumes to realign the coordinate system along the convolution direction, as defined by zenithal and azimuthal angles. During the ray-tracing step, the data were resampled according to radial sampling and parallel ray-spacing parameters, making the algorithm independent of the original CT resolution. The nonvoxel-based algorithm presented in this paper also demonstrated a trade-off in computational performance and dose accuracy for different coordinate system configurations. In order to find the best balance between the computed speedup and the accuracy, the authors employed an exhaustive parameter search on all sampling parameters that defined the coordinate system configuration: zenithal, azimuthal, and radial sampling of the convolution algorithm, as well as the parallel ray spacing during ray tracing. The angular sampling parameters were varied between 4 and 48 discrete angles, while both radial sampling and parallel ray spacing were varied from 0.5 to 10 mm. The gamma distribution analysis method (γ) was used to compare the dose distributions using 2% and 2 mm dose difference and distance-to-agreement criteria, respectively. Accuracy was investigated using three distinct phantoms with varied geometries and heterogeneities and on a series of 14 segmented lung CT data sets. Performance gains were calculated using three 256 mm cube homogeneous water phantoms, with isotropic voxel dimensions of 1, 2, and 4 mm. The nonvoxel-based GPU algorithm was independent of the data size and provided significant computational gains over the CPU algorithm for large CT data sizes. The parameter search analysis also showed that the ray combination of 8 zenithal and 8 azimuthal angles along with 1 mm radial sampling and 2 mm parallel ray spacing maintained dose accuracy with greater than 99% of voxels passing the γ test. Combining the acceleration obtained from GPU parallelization with the sampling optimization, the authors achieved a total performance improvement factor of >175 000 when compared to the authors' voxel-based ground truth CPU benchmark and a factor of 20 compared with a voxel-based GPU dose convolution method. The nonvoxel-based convolution method yielded substantial performance improvements over a generic GPU implementation, while maintaining accuracy as compared to a CPU computed ground truth dose distribution. Such an algorithm can be a key contribution toward developing tools for adaptive radiation therapy systems.

  4. A nonvoxel-based dose convolution/superposition algorithm optimized for scalable GPU architectures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neylon, J., E-mail: jneylon@mednet.ucla.edu; Sheng, K.; Yu, V.

    Purpose: Real-time adaptive planning and treatment has been infeasible due in part to its high computational complexity. There have been many recent efforts to utilize graphics processing units (GPUs) to accelerate the computational performance and dose accuracy in radiation therapy. Data structure and memory access patterns are the key GPU factors that determine the computational performance and accuracy. In this paper, the authors present a nonvoxel-based (NVB) approach to maximize computational and memory access efficiency and throughput on the GPU. Methods: The proposed algorithm employs a ray-tracing mechanism to restructure the 3D data sets computed from the CT anatomy into a nonvoxel-based framework. In a process that takes only a few milliseconds of computing time, the algorithm restructured the data sets by ray-tracing through precalculated CT volumes to realign the coordinate system along the convolution direction, as defined by zenithal and azimuthal angles. During the ray-tracing step, the data were resampled according to radial sampling and parallel ray-spacing parameters, making the algorithm independent of the original CT resolution. The nonvoxel-based algorithm presented in this paper also demonstrated a trade-off in computational performance and dose accuracy for different coordinate system configurations. In order to find the best balance between the computed speedup and the accuracy, the authors employed an exhaustive parameter search on all sampling parameters that defined the coordinate system configuration: zenithal, azimuthal, and radial sampling of the convolution algorithm, as well as the parallel ray spacing during ray tracing. The angular sampling parameters were varied between 4 and 48 discrete angles, while both radial sampling and parallel ray spacing were varied from 0.5 to 10 mm. The gamma distribution analysis method (γ) was used to compare the dose distributions using 2% and 2 mm dose difference and distance-to-agreement criteria, respectively. Accuracy was investigated using three distinct phantoms with varied geometries and heterogeneities and on a series of 14 segmented lung CT data sets. Performance gains were calculated using three 256 mm cube homogeneous water phantoms, with isotropic voxel dimensions of 1, 2, and 4 mm. Results: The nonvoxel-based GPU algorithm was independent of the data size and provided significant computational gains over the CPU algorithm for large CT data sizes. The parameter search analysis also showed that the ray combination of 8 zenithal and 8 azimuthal angles along with 1 mm radial sampling and 2 mm parallel ray spacing maintained dose accuracy with greater than 99% of voxels passing the γ test. Combining the acceleration obtained from GPU parallelization with the sampling optimization, the authors achieved a total performance improvement factor of >175 000 when compared to the authors' voxel-based ground truth CPU benchmark and a factor of 20 compared with a voxel-based GPU dose convolution method. Conclusions: The nonvoxel-based convolution method yielded substantial performance improvements over a generic GPU implementation, while maintaining accuracy as compared to a CPU computed ground truth dose distribution. Such an algorithm can be a key contribution toward developing tools for adaptive radiation therapy systems.

  5. Research on illumination uniformity of high-power LED array light source

    NASA Astrophysics Data System (ADS)

    Yu, Xiaolong; Wei, Xueye; Zhang, Ou; Zhang, Xinwei

    2018-06-01

    Uniform illumination is one of the most important problems that must be solved in the application of high-power LED arrays. A numerical optimization algorithm is applied to obtain the LED array arrangement for which the light intensity on the target surface is most evenly distributed. An evaluation function is set up from the standard deviation of the illuminance distribution, and the particle swarm optimization algorithm is then utilized to optimize different arrays. The light intensity distribution is obtained by optical ray tracing. Finally, a hybrid array is designed, and optical ray tracing is applied to simulate it. The simulation results, which are consistent with the traditional theoretical calculation, show that the algorithm introduced in this paper is reasonable and effective.
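
    A merit function of the kind such an optimization minimizes can be sketched as follows; the LED model (inverse-square falloff with a Lambertian-type cos^m factor), the target-plane geometry, and all parameter values are illustrative assumptions, and a particle swarm optimizer would adjust led_xy to minimize the returned value:

    ```python
    import numpy as np

    def uniformity(led_xy, z=0.1, m=1.0, n=41):
        """Relative standard deviation of illuminance on a square target
        plane at height z (meters), for Lambertian-type LEDs with
        intensity ~ cos^m(theta). Lower values mean more uniform light."""
        g = np.linspace(-0.1, 0.1, n)
        X, Y = np.meshgrid(g, g)                       # target-plane points
        E = np.zeros_like(X)
        for x0, y0 in led_xy:
            r2 = (X - x0) ** 2 + (Y - y0) ** 2 + z ** 2
            cos_t = z / np.sqrt(r2)
            E += cos_t ** m * cos_t / r2               # inverse-square + cosine
        return E.std() / E.mean()                      # merit to be minimized

    square = [(-0.05, -0.05), (-0.05, 0.05), (0.05, -0.05), (0.05, 0.05)]
    print(uniformity(square))
    ```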

  6. Computing the total atmospheric refraction for real-time optical imaging sensor simulation

    NASA Astrophysics Data System (ADS)

    Olson, Richard F.

    2015-05-01

    Fast and accurate computation of light path deviation due to atmospheric refraction is an important requirement for real-time simulation of optical imaging sensor systems. A large body of existing literature covers various methods for application of Snell's law to the light path ray tracing problem. This paper discusses the adaptation to real-time simulation of atmospheric refraction ray-tracing techniques used in mid-1980s LOWTRAN releases. The refraction ray-trace algorithm published in a LOWTRAN-6 technical report by Kneizys et al. has been coded in MATLAB for development, and in C for simulation use. To this published algorithm we have added tuning parameters for variable path segment lengths, and extensions for Earth-grazing and exoatmospheric "near Earth" ray paths. Model atmosphere properties used to exercise the refraction algorithm were obtained from tables published in another LOWTRAN-6 related report. The LOWTRAN-6 based refraction model is applicable to atmospheric propagation at wavelengths in the IR and visible bands of the electromagnetic spectrum. It has been used during the past two years by engineers at the U.S. Army Aviation and Missile Research, Development and Engineering Center (AMRDEC) in support of several advanced imaging sensor simulations. Recently, a faster (but sufficiently accurate) method using Gauss-Chebyshev quadrature integration to evaluate the refraction integral was adopted.

  7. Simultaneous cryo X-ray ptychographic and fluorescence microscopy of green algae

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deng, Junjing; Vine, David J.; Chen, Si

    Trace metals play important roles in normal and in disease-causing biological functions. X-ray fluorescence microscopy reveals trace elements with no dependence on binding affinities (unlike with visible light fluorophores) and with improved sensitivity relative to electron probes. However, X-ray fluorescence is not very sensitive for showing the light elements that comprise the majority of cellular material. Here we show that X-ray ptychography can be combined with fluorescence to image both cellular structure and trace element distribution in frozen-hydrated cells at cryogenic temperatures, with high structural and chemical fidelity. Ptychographic reconstruction algorithms deliver phase and absorption contrast images at a resolution beyond that of the illuminating lens or beam size. Using 5.2-keV X-rays, we have obtained sub-30-nm resolution structural images and ~90-nm-resolution fluorescence images of several elements in frozen-hydrated green algae. Finally, this combined approach offers a way to study the role of trace elements in their structural context.

  8. Simultaneous cryo X-ray ptychographic and fluorescence microscopy of green algae

    DOE PAGES

    Deng, Junjing; Vine, David J.; Chen, Si; ...

    2015-02-24

    Trace metals play important roles in normal and in disease-causing biological functions. X-ray fluorescence microscopy reveals trace elements with no dependence on binding affinities (unlike with visible light fluorophores) and with improved sensitivity relative to electron probes. However, X-ray fluorescence is not very sensitive for showing the light elements that comprise the majority of cellular material. Here we show that X-ray ptychography can be combined with fluorescence to image both cellular structure and trace element distribution in frozen-hydrated cells at cryogenic temperatures, with high structural and chemical fidelity. Ptychographic reconstruction algorithms deliver phase and absorption contrast images at a resolution beyond that of the illuminating lens or beam size. Using 5.2-keV X-rays, we have obtained sub-30-nm resolution structural images and ~90-nm-resolution fluorescence images of several elements in frozen-hydrated green algae. Finally, this combined approach offers a way to study the role of trace elements in their structural context.

  9. Simultaneous cryo X-ray ptychographic and fluorescence microscopy of green algae

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deng, Junjing; Vine, David J.; Chen, Si

    Trace metals play important roles in normal and in disease-causing biological functions. X-ray fluorescence microscopy reveals trace elements with no dependence on binding affinities (unlike with visible light fluorophores) and with improved sensitivity relative to electron probes. However, X-ray fluorescence is not very sensitive for showing the light elements that comprise the majority of cellular material. Here we show that X-ray ptychography can be combined with fluorescence to image both cellular structure and trace element distribution in frozen-hydrated cells at cryogenic temperatures, with high structural and chemical fidelity. Ptychographic reconstruction algorithms deliver phase and absorption contrast images at a resolution beyond that of the illuminating lens or beam size. Using 5.2-keV X-rays, we have obtained sub-30-nm resolution structural images and ~90-nm-resolution fluorescence images of several elements in frozen-hydrated green algae. This combined approach offers a way to study the role of trace elements in their structural context.

  10. Ray Tracing for Dispersive Tsunamis and Source Amplitude Estimation Based on Green's Law: Application to the 2015 Volcanic Tsunami Earthquake Near Torishima, South of Japan

    NASA Astrophysics Data System (ADS)

    Sandanbata, Osamu; Watada, Shingo; Satake, Kenji; Fukao, Yoshio; Sugioka, Hiroko; Ito, Aki; Shiobara, Hajime

    2018-04-01

    Ray tracing, which has been widely used for seismic waves, has also been applied to tsunamis to examine bathymetry effects during propagation, but it was limited to linear shallow-water waves. Green's law, which is based on the conservation of energy flux, has been used to estimate tsunami amplitude along ray paths. In this study, we first propose a new ray tracing method extended to dispersive tsunamis. By using an iterative algorithm to map two-dimensional tsunami velocity fields at different frequencies, ray paths at each frequency can be traced. We then show that Green's law is valid only outside the source region and that an extension of Green's law is needed for source amplitude estimation. As an application example, we analyzed tsunami waves generated by an earthquake that occurred at a submarine volcano, Smith Caldera, near Torishima, Japan, in 2015. The ray-tracing results reveal that the ray paths depend strongly on frequency, particularly in deep oceans. The validity of our frequency-dependent ray tracing is confirmed by comparison of arrival angles and travel times with those of observed tsunami waveforms at an array of ocean bottom pressure gauges. The tsunami amplitude at the source is nearly twice, or more than twice, the value just outside the source estimated from the array tsunami data by Green's law.
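
    Green's law itself is a one-line consequence of energy-flux conservation: along a ray, the amplitude scales with the water depth as (h1/h2)^(1/4). A minimal sketch with illustrative numbers:

    ```python
    def greens_law_amplitude(a1, h1, h2):
        """Green's law from conservation of energy flux: wave amplitude
        scales with water depth as (h1 / h2) ** 0.25 along a ray."""
        return a1 * (h1 / h2) ** 0.25

    # A 0.10 m tsunami amplitude at 4000 m depth shoals to 50 m depth:
    print(greens_law_amplitude(0.10, 4000.0, 50.0))   # ~0.30 m
    ```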

  11. Numerical simulation and comparison of nonlinear self-focusing based on iteration and ray tracing

    NASA Astrophysics Data System (ADS)

    Li, Xiaotong; Chen, Hao; Wang, Weiwei; Ruan, Wangchao; Zhang, Luwei; Cen, Zhaofeng

    2017-05-01

    Self-focusing is observed in nonlinear materials owing to the interaction between laser and matter as a laser beam propagates. Numerical simulation strategies such as the beam propagation method (BPM), based on the nonlinear Schrödinger equation, and ray tracing, based on Fermat's principle, have been applied to simulate the self-focusing process. In this paper we present an iterative nonlinear ray tracing method in which the nonlinear material is likewise cut into many slices, as in the existing approaches, but instead of the paraxial approximation and split-step Fourier transform, a large number of sampled real rays are traced step by step through the system, with the refractive index and laser intensity updated by iteration. In this process a smoothing treatment is employed to generate a laser density distribution at each slice to decrease the error caused by under-sampling. The characteristic of this method is that the nonlinear refractive indices of points on the current slice are calculated by iteration, which resolves the problem of unknown material parameters caused by the mutual dependence of laser intensity and nonlinear refractive index. Compared with the beam propagation method, this algorithm is more suitable for engineering applications, has lower time complexity, and can numerically simulate the self-focusing process in systems that include both linear and nonlinear optical media. If the sampled rays are traced with their complex amplitudes and light paths or phases, it will also be possible to simulate the superposition effects of different beams. At the end of the paper, the advantages and disadvantages of this algorithm are discussed.

  12. GRay: A Massively Parallel GPU-based Code for Ray Tracing in Relativistic Spacetimes

    NASA Astrophysics Data System (ADS)

    Chan, Chi-kwan; Psaltis, Dimitrios; Özel, Feryal

    2013-11-01

    We introduce GRay, a massively parallel integrator designed to trace the trajectories of billions of photons in a curved spacetime. This graphics-processing-unit (GPU)-based integrator employs the stream processing paradigm, is implemented in CUDA C/C++, and runs on nVidia graphics cards. The peak performance of GRay using single-precision floating-point arithmetic on a single GPU exceeds 300 GFLOP (or 1 ns per photon per time step). For a realistic problem, where the peak performance cannot be reached, GRay is two orders of magnitude faster than existing central-processing-unit-based ray-tracing codes. This performance enhancement allows more effective searches of large parameter spaces when comparing theoretical predictions of images, spectra, and light curves from the vicinities of compact objects to observations. GRay can also perform on-the-fly ray tracing within general relativistic magnetohydrodynamic algorithms that simulate accretion flows around compact objects. Making use of this algorithm, we calculate the properties of the shadows of Kerr black holes and the photon rings that surround them. We also provide accurate fitting formulae of their dependencies on black hole spin and observer inclination, which can be used to interpret upcoming observations of the black holes at the center of the Milky Way, as well as M87, with the Event Horizon Telescope.

  13. A computational approach for hypersonic nonequilibrium radiation utilizing space partition algorithm and Gauss quadrature

    NASA Astrophysics Data System (ADS)

    Shang, J. S.; Andrienko, D. A.; Huang, P. G.; Surzhikov, S. T.

    2014-06-01

    An efficient computational capability for nonequilibrium radiation simulation via the ray tracing technique has been accomplished. The radiative rate equation is iteratively coupled with the aerodynamic conservation laws, including nonequilibrium chemical and chemical-physical kinetic models. The spectral properties along tracing rays are determined by a space partition algorithm for the nearest-neighbor search process, and the numerical accuracy is further enhanced by local resolution refinement using the Gauss-Lobatto polynomial. The interdisciplinary governing equations are solved by an implicit delta formulation through the diminishing residual approach. The axisymmetric radiating flow fields over the reentry RAM-C II probe have been simulated and verified with flight data and previous solutions by traditional methods. A computational efficiency gain of nearly forty times is realized over the existing simulation procedures.

  14. A metal artifact reduction algorithm in CT using multiple prior images by recursive active contour segmentation

    PubMed Central

    Nam, Haewon

    2017-01-01

    We propose a novel metal artifact reduction (MAR) algorithm for CT images that completes a corrupted sinogram along the metal trace region. When metal implants are located inside the field of view, they create a barrier to the transmitted X-ray beam due to the high attenuation of metals, which significantly degrades image quality. To fill in the metal trace region efficiently, the proposed algorithm uses multiple prior images with residual error compensation in sinogram space. The multiple prior images are generated by applying a recursive active contour (RAC) segmentation algorithm to the pre-corrected image acquired by MAR with linear interpolation, where the number of prior images is controlled by RAC depending on the object complexity. A sinogram basis is then acquired by forward projection of the prior images. The metal trace region of the original sinogram is replaced by a linear combination of the prior-image sinograms. An additional correction in the metal trace region is then performed to compensate for the residual errors caused by non-ideal data acquisition conditions. The performance of the proposed MAR algorithm is compared with MAR with linear interpolation and the normalized MAR algorithm using simulated and experimental data. The results show that the proposed algorithm outperforms the other MAR algorithms, especially when the object is complex with multiple bone objects. PMID:28604794
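
    The baseline the proposed method builds on, MAR with linear interpolation, simply replaces the metal-trace bins of each projection row by values interpolated from the uncorrupted bins on either side. A minimal sketch of that baseline step (not of the proposed multi-prior algorithm):

    ```python
    import numpy as np

    def mar_linear_interp(sinogram, metal_mask):
        """Replace the metal-trace bins of each projection row by linear
        interpolation from the untouched bins on either side."""
        out = sinogram.copy()
        bins = np.arange(sinogram.shape[1])
        for row, mask in zip(out, metal_mask):
            if mask.any():
                row[mask] = np.interp(bins[mask], bins[~mask], row[~mask])
        return out

    # Toy sinogram: a smooth ramp corrupted over bins 40-60 of every view.
    sino = np.tile(np.linspace(0, 1, 100), (180, 1))
    sino[:, 40:60] += 5.0                               # metal shadow
    mask = np.zeros_like(sino, bool); mask[:, 40:60] = True
    clean = np.tile(np.linspace(0, 1, 100), (180, 1))
    print(np.allclose(mar_linear_interp(sino, mask), clean, atol=1e-6))
    ```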

  15. A Thermo-Optic Propagation Modeling Capability.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schrader, Karl; Akau, Ron

    2014-10-01

    A new theoretical basis is derived for tracing optical rays within a finite-element (FE) volume. The ray-trajectory equations are cast into the local element coordinate frame, and the full finite-element interpolation is used to determine the instantaneous index gradient for the ray-path integral equation. The FE methodology (FEM) is also used to interpolate local surface deformations and the surface normal vector for computing the refraction angle when launching rays into the volume, and again when rays exit the medium. The method is implemented in the Matlab(TM) environment and compared to closed-form gradient index models. A software architecture is also developed for implementing the algorithms in the Zemax(TM) commercial ray-trace application. A controlled thermal environment was constructed in the laboratory, and measured data was collected to validate the structural, thermal, and optical modeling methods.

  16. GRay: A MASSIVELY PARALLEL GPU-BASED CODE FOR RAY TRACING IN RELATIVISTIC SPACETIMES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chan, Chi-kwan; Psaltis, Dimitrios; Özel, Feryal

    We introduce GRay, a massively parallel integrator designed to trace the trajectories of billions of photons in a curved spacetime. This graphics-processing-unit (GPU)-based integrator employs the stream processing paradigm, is implemented in CUDA C/C++, and runs on nVidia graphics cards. The peak performance of GRay using single-precision floating-point arithmetic on a single GPU exceeds 300 GFLOP (or 1 ns per photon per time step). For a realistic problem, where the peak performance cannot be reached, GRay is two orders of magnitude faster than existing central-processing-unit-based ray-tracing codes. This performance enhancement allows more effective searches of large parameter spaces when comparing theoretical predictions of images, spectra, and light curves from the vicinities of compact objects to observations. GRay can also perform on-the-fly ray tracing within general relativistic magnetohydrodynamic algorithms that simulate accretion flows around compact objects. Making use of this algorithm, we calculate the properties of the shadows of Kerr black holes and the photon rings that surround them. We also provide accurate fitting formulae of their dependencies on black hole spin and observer inclination, which can be used to interpret upcoming observations of the black holes at the center of the Milky Way, as well as M87, with the Event Horizon Telescope.

  17. IonRayTrace: An HF Propagation Model for Communications and Radar Applications

    DTIC Science & Technology

    2014-12-01

    IonRayTrace supports modeling the impact of ionosphere variability on detection algorithms; modifications of its source code include flexible gridding. Reported quantities include plasma frequency (MHz) and ionospheric absorption (dB) along traced paths. IonRayTrace uses an ionosphere model for its environmental background [3], and its operation is summarized briefly in Section 3 of the report.

  18. CYBER 200 Applications Seminar

    NASA Technical Reports Server (NTRS)

    Gary, J. P. (Compiler)

    1984-01-01

    Applications suited for the CYBER 200 digital computer are discussed. Various areas of application are covered, including meteorology, algorithms, fluid dynamics, Monte Carlo methods, petroleum, electronic circuit simulation, biochemistry, lattice gauge theory, economics, and ray tracing.

  19. Effects of refractive index mismatch in optical CT imaging of polymer gel dosimeters.

    PubMed

    Manjappa, Rakesh; Makki S, Sharath; Kumar, Rajesh; Kanhirodan, Rajan

    2015-02-01

    An image reconstruction technique, algebraic reconstruction technique with refraction correction (ART-rc), is proposed. The method accounts for refractive index mismatches present in a gel dosimeter scanner at the boundary, and also corrects for interior ray refraction. Polymer gel dosimeters with high dose regions have higher refractive index and optical density compared to the background medium; these changes in refractive index at high dose result in interior ray bending. The inclusion of the effects of refraction is an important step in the reconstruction of optical density in gel dosimeters. The proposed ray tracing algorithm models the interior multiple refraction at the inhomogeneities. Jacob's ray tracing algorithm has been modified to calculate the pathlengths of the ray that traverses through the higher dose regions. The algorithm computes the length of the ray in each pixel along its path, which is used as the weight matrix. The algebraic reconstruction technique and pixel-based reconstruction algorithms are used for solving the reconstruction problem. The proposed method is tested with numerical phantoms for various noise levels, and experimental dosimetric results are also presented. The results show that the proposed scheme, ART-rc, is able to reconstruct the optical density inside the dosimeter better than filtered backprojection and conventional algebraic reconstruction approaches. The quantitative improvement using ART-rc is evaluated using the gamma-index. The refraction errors due to regions of different refractive indices are discussed, and the effects of modeling interior refraction in the dose region are presented. The errors propagated due to multiple refraction effects have been modeled, and the improvement in reconstruction using the proposed model is presented. The refractive index of the dosimeter has a mismatch with the surrounding medium (for dry air or water scanning). The algorithm reconstructs the dose profiles by estimating the refractive indices of multiple inhomogeneities, with different refractive indices and optical densities, embedded in the dosimeter. This is achieved by tracking the path of the ray that traverses the dosimeter. Extensive simulation studies have been carried out, and the results are found to match the experimental results.
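
    Once the refraction-corrected pathlengths are in hand as a weight matrix, the ART solver itself is a sequence of Kaczmarz row projections. A minimal sketch, with a toy 3-pixel system standing in for the dosimeter reconstruction:

    ```python
    import numpy as np

    def art(A, p, n_iter=200, lam=1.0):
        """Algebraic reconstruction technique (Kaczmarz sweeps): A holds
        the ray pathlength weights (one row per ray), p the measured
        projections, lam the relaxation parameter."""
        x = np.zeros(A.shape[1])
        row_norms = (A * A).sum(axis=1)
        for _ in range(n_iter):
            for a, pi, nrm in zip(A, p, row_norms):
                if nrm > 0.0:
                    x += lam * (pi - a @ x) / nrm * a   # project onto ray equation
        return x

    # Tiny consistent system: recover a 3-pixel "optical density" image.
    A = np.array([[1.0, 1.0, 0.0], [0.0, 1.0, 1.0], [1.0, 0.0, 1.0]])
    x_true = np.array([0.2, 0.5, 0.1])
    print(art(A, A @ x_true))   # -> approximately [0.2, 0.5, 0.1]
    ```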

  20. Effects of refractive index mismatch in optical CT imaging of polymer gel dosimeters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Manjappa, Rakesh; Makki S, Sharath; Kanhirodan, Rajan, E-mail: rajan@physics.iisc.ernet.in

    2015-02-15

    Purpose: We propose an image reconstruction technique, algebraic reconstruction technique-refraction correction (ART-rc). The proposed method takes care of the refractive index mismatch present at the boundary of the gel dosimeter scanner and also corrects for interior ray refraction. Polymer gel dosimeters with high-dose regions have a higher refractive index and optical density than the background medium; these changes in refractive index at high dose result in interior ray bending. Methods: The inclusion of the effects of refraction is an important step in the reconstruction of optical density in gel dosimeters. The proposed ray tracing algorithm models the interior multiple refraction at the inhomogeneities. Jacob’s ray tracing algorithm has been modified to calculate the pathlengths of the ray that traverses the higher dose regions. The algorithm computes the length of the ray in each pixel along its path, and these lengths are used as the weight matrix. Algebraic reconstruction technique and pixel-based reconstruction algorithms are used for solving the reconstruction problem. The proposed method is tested with numerical phantoms for various noise levels. The experimental dosimetric results are also presented. Results: The results show that the proposed scheme, ART-rc, is able to reconstruct the optical density inside the dosimeter better than filtered backprojection and conventional algebraic reconstruction approaches. The quantitative improvement using ART-rc is evaluated using the gamma-index. The refraction errors due to regions of different refractive indices are discussed. The effects of modeling interior refraction in the dose region are presented. Conclusions: The errors propagated due to multiple refraction effects have been modeled, and the improvements in reconstruction using the proposed model are presented. The refractive index of the dosimeter is mismatched with the surrounding medium (for dry air or water scanning). The algorithm reconstructs the dose profiles by estimating the refractive indices of multiple inhomogeneities with different refractive indices and optical densities embedded in the dosimeter. This is achieved by tracking the path of the ray that traverses the dosimeter. Extensive simulation studies have been carried out, and the results are found to match the experimental results.

  1. Automatic creation of object hierarchies for ray tracing

    NASA Technical Reports Server (NTRS)

    Goldsmith, Jeffrey; Salmon, John

    1987-01-01

    Various methods for evaluating generated trees are proposed. The use of the hierarchical extent method of Rubin and Whitted (1980) to find the objects that will be hit by a ray is examined. This method employs tree searching; the construction of a tree of bounding volumes in order to determine the objects that will be hit by a ray is discussed. A tree generation algorithm, which uses a heuristic tree search strategy, is described. The effects of shuffling and sorting the input data are investigated. The cost of inserting an object into the hierarchy during tree construction is estimated. The steps involved in estimating the number of intersection calculations are presented.
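    The incremental-insertion idea can be sketched as follows (an illustrative reconstruction under the usual surface-area cost heuristic, not the paper's code; the tree is seeded with a single leaf for the first object): each new object descends into the child whose bounding box would grow least in surface area, surface area being proportional to the chance that a random ray hits the box.

        import numpy as np

        def surface_area(lo, hi):
            d = hi - lo
            return 2.0 * (d[0]*d[1] + d[1]*d[2] + d[0]*d[2])

        class Node:
            def __init__(self, lo, hi, obj=None):
                self.lo, self.hi, self.obj = lo, hi, obj
                self.children = []

        def insert(node, lo, hi, obj):
            if node.obj is not None:
                # leaf: turn it into an interior node holding the old and new objects
                node.children = [Node(node.lo.copy(), node.hi.copy(), node.obj),
                                 Node(lo, hi, obj)]
                node.obj = None
            elif node.children:
                # descend into the child whose box grows least in surface area
                def growth(c):
                    glo, ghi = np.minimum(c.lo, lo), np.maximum(c.hi, hi)
                    return surface_area(glo, ghi) - surface_area(c.lo, c.hi)
                insert(min(node.children, key=growth), lo, hi, obj)
            # grow this node's box to enclose the new object
            node.lo = np.minimum(node.lo, lo)
            node.hi = np.maximum(node.hi, hi)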

  2. Semiannual Report, April 1, 1989 through September 30, 1989 (Institute for Computer Applications in Science and Engineering)

    DTIC Science & Technology

    1990-02-01

    noise. Tobias B. Orloff: Work began on developing a high quality rendering algorithm based on the radiosity method. The algorithm is similar to... previous progressive radiosity algorithms except for the following improvements: 1. At each iteration vertex radiosities are computed using a modified scan-line approach, thus eliminating the quadratic cost associated with a ray tracing computation of vertex radiosities. 2. At each iteration the scene is

  3. Assimilative model for ionospheric dynamics employing delay, Doppler, and direction of arrival measurements from multiple HF channels

    NASA Astrophysics Data System (ADS)

    Fridman, Sergey V.; Nickisch, L. J.; Hausman, Mark; Zunich, George

    2016-03-01

    We describe the development of new HF data assimilation capabilities for our ionospheric inversion algorithm called GPSII (GPS Ionospheric Inversion). Previously existing capabilities of this algorithm included assimilation of GPS total electron content data as well as assimilation of backscatter ionograms. In the present effort we concentrated on developing assimilation tools for data related to HF propagation channels. Measurements of propagation delay, angle of arrival, and the ionosphere-induced Doppler from any number of known propagation links can now be utilized by GPSII. The resulting ionospheric model is consistent with all assimilated measurements. This means that ray tracing simulations of the assimilated propagation links are guaranteed to agree with the measured data within the errors of measurement. The key theoretical element for assimilating HF data is the raypath response operator (RPRO), which describes the response of raypath parameters to infinitesimal variations of electron density in the ionosphere. We construct the RPRO out of the fundamental solution of linearized ray tracing equations for a dynamic magnetoactive plasma. We demonstrate the performance and internal consistency of the algorithm using propagation delay data from multiple oblique ionograms (courtesy of Defence Science and Technology Organisation, Australia) as well as with time series of near-vertical incidence sky wave data (courtesy of the Intelligence Advanced Research Projects Activity HFGeo Program Government team). In all cases GPSII produces electron density distributions which are smooth in space and in time. We simulate the assimilated propagation links by performing ray tracing through the GPSII-produced ionosphere and observe that the simulated data are indeed in agreement with the assimilated measurements.

  4. Unified algorithm of cone optics to compute solar flux on central receiver

    NASA Astrophysics Data System (ADS)

    Grigoriev, Victor; Corsi, Clotilde

    2017-06-01

    Analytical algorithms to compute the flux distribution on a central receiver are considered a faster alternative to ray tracing. Many variants of these algorithms exist, with HFLCAL and UNIZAR being the most widely recognized and verified. In this work, a generalized algorithm is presented which is valid for an arbitrary radially symmetric sun shape. Heliostat mirrors can have a nonrectangular profile, and the effects of shading and blocking, strong defocusing and astigmatism can be taken into account. The algorithm is suitable for parallel computing and can benefit from hardware acceleration of polygon texturing.

  5. F2-Ray: A new algorithm for efficient transport of ionizing radiation

    NASA Astrophysics Data System (ADS)

    Mao, Yi; Zhang, J.; Wandelt, B. D.; Shapiro, P. R.; Iliev, I. T.

    2014-04-01

    We present a new algorithm for the 3D transport of ionizing radiation, called F2-Ray (Fast Fourier Ray-tracing method). The transfer of ionizing radiation with a long mean free path in diffuse intergalactic gas poses a special challenge to standard numerical methods, which transport the radiation in position space. Standard methods usually trace each individual ray until it is fully absorbed by the intervening gas. If the mean free path is long, the computational cost and memory load are likely to be prohibitive. We have developed an algorithm that overcomes these limitations and is, therefore, significantly more efficient. The method calculates the transfer of radiation collectively, using the Fast Fourier Transform to convert radiation between position and Fourier spaces, so the computational cost does not increase with the number of ionizing sources. The method also automatically combines parallel rays of the same frequency at the same grid cell, thereby minimizing the memory requirement. The method is explicitly photon-conserving, i.e. the depletion of ionizing photons is guaranteed to equal the photoionizations they cause, and explicitly obeys the periodic boundary condition, i.e. the escape of ionizing photons from one side of the simulation volume is guaranteed to be compensated by the same number of photons entering the volume through the opposite side. Together, these features make it possible to numerically simulate the transfer of ionizing photons more efficiently than previous methods. Since ionizing radiation such as X-rays is responsible for heating the intergalactic gas when the first stars and quasars form at high redshifts, our method can be applied to simulate the thermal distribution, in addition to cosmic reionization, in three-dimensional inhomogeneous cosmological density fields.

  6. High frequency estimation of 2-dimensional cavity scattering

    NASA Astrophysics Data System (ADS)

    Dering, R. S.

    1984-12-01

    This thesis develops a simple ray tracing approximation for the high frequency scattering from a two-dimensional cavity. Whereas many other cavity scattering algorithms are very time consuming, this method is very swift. The analytical development of the ray tracing approach is performed in great detail, and it is shown how the radar cross section (RCS) depends on the cavity's length and width along with the radar wave's angle of incidence. This explains why the cavity's RCS oscillates as a function of incident angle. The RCS of a two dimensional cavity was measured experimentally, and these results were compared to computer calculations based on the high frequency ray tracing theory. The comparison was favorable in the sense that angular RCS minima and maxima were exactly predicted even though accuracy of the RCS magnitude decreased for incident angles far off-axis. Overall, once this method is extended to three dimensions, the technique shows promise as a fast first approximation of high frequency cavity scattering.

  7. Parallelizing serial code for a distributed processing environment with an application to high frequency electromagnetic scattering

    NASA Astrophysics Data System (ADS)

    Work, Paul R.

    1991-12-01

    This thesis investigates the parallelization of existing serial programs in computational electromagnetics for use in a parallel environment. Existing algorithms for calculating the radar cross section of an object are covered, and a ray-tracing code is chosen for implementation on a parallel machine. Current parallel architectures are introduced and a suitable parallel machine is selected for the implementation of the chosen ray-tracing algorithm. The standard techniques for the parallelization of serial codes are discussed, including load balancing and decomposition considerations, and appropriate methods for the parallelization effort are selected. A load balancing algorithm is modified to increase the efficiency of the application, and a high level design of the structure of the serial program is presented. A detailed design of the modifications for the parallel implementation is also included, with both the high level and the detailed design specified in a high level design language called UNITY. The correctness of the design is proven using UNITY and standard logic operations. The theoretical and empirical results show that it is possible to achieve an efficient parallel application for a serial computational electromagnetic program where the characteristics of the algorithm and the target architecture critically influence the development of such an implementation.

  8. TransFit: Finite element analysis data fitting software

    NASA Technical Reports Server (NTRS)

    Freeman, Mark

    1993-01-01

    The Advanced X-Ray Astrophysics Facility (AXAF) mission support team has made extensive use of geometric ray tracing to analyze the performance of AXAF developmental and flight optics. One important aspect of this performance modeling is the incorporation of finite element analysis (FEA) data into the surface deformations of the optical elements. TransFit is software designed for fitting FEA data of Wolter I optical surface distortions with a continuous surface description which can then be used by SAO's analytic ray tracing software, currently OSAC (Optical Surface Analysis Code). The improved capabilities of TransFit over previous methods include bicubic spline fitting of FEA data to accommodate higher spatial frequency distortions, fitted data visualization for assessing the quality of fit, the ability to accommodate input data from three FEA codes plus other standard formats, and options for alignment of the model coordinate system with the ray trace coordinate system. TransFit uses the AnswerGarden graphical user interface (GUI) to edit input parameters and then accesses routines written in PV-WAVE, C, and FORTRAN to allow the user to interactively create, evaluate, and modify the fit. The topics covered include an introduction to TransFit: requirements, design philosophy, and implementation; design specifics: modules, parameters, fitting algorithms, and data displays; a procedural example; verification of performance; future work; and appendices on online help and ray trace results of the verification section.
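    The bicubic-fitting step can be pictured with SciPy's smoothing spline on a gridded deformation map (a sketch with synthetic placeholder data; TransFit itself works on FEA output through PV-WAVE, C, and FORTRAN routines):

        import numpy as np
        from scipy.interpolate import RectBivariateSpline

        phi = np.linspace(0.0, 2.0 * np.pi, 64)   # azimuthal nodes of the mesh
        z = np.linspace(0.0, 100.0, 32)           # axial nodes (mm)
        # stand-in for FEA normal displacements on the phi x z grid
        dz = 1e-4 * np.cos(3.0 * phi)[:, None] * (1.0 + 0.01 * z)[None, :]

        spline = RectBivariateSpline(phi, z, dz, kx=3, ky=3, s=0.0)

        # the ray tracer can then query the deformation and its slopes anywhere
        d = spline(1.234, 42.0)[0, 0]
        dd_dphi = spline(1.234, 42.0, dx=1)[0, 0]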

  9. Quantitative X-ray fluorescence computed tomography for low-Z samples using an iterative absorption correction algorithm

    NASA Astrophysics Data System (ADS)

    Huang, Rong; Limburg, Karin; Rohtla, Mehis

    2017-05-01

    X-ray fluorescence computed tomography is often used to measure trace element distributions within low-Z samples, using algorithms capable of X-ray absorption correction when sample self-absorption is not negligible. Its reconstruction is more complicated than transmission tomography, and it is therefore not widely used. We describe in this paper a very practical iterative method that uses widely available transmission tomography reconstruction software for fluorescence tomography. With this method, sample self-absorption can be corrected not only for the absorption within the measured layer but also for the absorption by material beyond that layer. By combining tomography with analysis for scanning X-ray fluorescence microscopy, absolute concentrations of trace elements can be obtained. By using widely shared software, we not only minimized coding and took advantage of the computational efficiency of the fast Fourier transform in transmission tomography software, but also gained access to the well-developed data processing tools that come with well-known and reliable software packages. The convergence of the iterations was also carefully studied for fluorescence of different attenuation lengths. As an example, fish eye lenses can provide valuable information about fish life history and the environmental conditions endured. Given the lens's spherical shape and sometimes the short distance from sample to detector needed for detecting low-concentration trace elements, its tomography data are affected by absorption related to material beyond the measured layer but can be reconstructed well with our method. Fish eye lens tomography results are compared with sliced-lens 2D fluorescence mapping with good agreement, with tomography providing better spatial resolution.
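    The loop structure of such an iterative correction can be sketched with the transmission-tomography routines in scikit-image (an assumed stand-in for the "widely shared software"; the attenuation update here is deliberately crude, whereas the real scheme integrates absorption along incident and exit paths, including material beyond the measured layer):

        import numpy as np
        from skimage.transform import radon, iradon

        def reconstruct_fluorescence(sino, theta, mu_matrix, k=0.01, n_iter=5):
            # sino: fluorescence sinogram; mu_matrix: attenuation map of the bulk matrix
            est = iradon(sino, theta=theta)                   # uncorrected first pass
            for _ in range(n_iter):
                mu = mu_matrix + k * np.clip(est, 0.0, None)  # attenuation grows with concentration
                atten = np.exp(-radon(mu, theta=theta))       # crude per-ray transmission
                est = iradon(sino / np.maximum(atten, 1e-3), theta=theta)
            return est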

  10. On Gamma Ray Instrument On-Board Data Processing Real-Time Computational Algorithm for Cosmic Ray Rejection

    NASA Technical Reports Server (NTRS)

    Kizhner, Semion; Hunter, Stanley D.; Hanu, Andrei R.; Sheets, Teresa B.

    2016-01-01

    Richard O. Duda and Peter E. Hart of Stanford Research Institute described in [1] the recurring problem in computer image processing of detecting straight lines in digitized images. The problem is to detect the presence of groups of collinear or almost collinear figure points. It is clear that the problem can be solved to any desired degree of accuracy by testing the lines formed by all pairs of points. However, the computation required for an image of n = NxM points is approximately proportional to n^2, i.e. O(n^2), becoming prohibitive for large images or when the data processing cadence time is in milliseconds. Rosenfeld in [2] described an ingenious method due to Hough [3] for replacing the original problem of finding collinear points by a mathematically equivalent problem of finding concurrent lines. This method involves transforming each of the figure points into a straight line in a parameter space. Hough chose to use the familiar slope-intercept parameters, and thus his parameter space was the two-dimensional slope-intercept plane. A parallel Hough transform running on multi-core processors was elaborated in [4]. There are many other proposed methods of solving similar problems, such as the sampling-up-the-ramp algorithm (SUTR) [5] and algorithms involving artificial swarm intelligence techniques [6]. However, all state-of-the-art algorithms lack real-time performance: they are slow for large images that require a processing cadence of a few dozen milliseconds (50 ms). This problem arises in spaceflight applications such as near real-time analysis of gamma ray measurements contaminated by an overwhelming amount of cosmic ray (CR) traces. Future spaceflight instruments such as the Advanced Energetic Pair Telescope instrument (AdEPT) [7-9] for cosmic gamma ray surveys employ large detector readout planes registering multitudes of cosmic ray interference events and the projections of sparse science gamma ray event traces. The AdEPT science of interest is in the gamma ray events, and the problem is to detect and reject the much more voluminous cosmic ray projections so that the remaining science data can be telemetered to the ground over the constrained communication link. The state of the art in cosmic ray detection and rejection does not provide an adequate computational solution. This paper presents a novel approach to the AdEPT on-board data processing, which is burdened by the CR-detection bottleneck. The paper introduces the data processing object, demonstrates object segmentation and distribution for processing among many processing elements (PEs), and presents a solution algorithm for the processing bottleneck - the CR-Algorithm. The algorithm is based on the a priori knowledge that a CR pierces the entire instrument pressure vessel. This phenomenon is also the basis for a straightforward CR simulator, allowing performance testing of the CR-Algorithm. Parallel processing of the readout image's 2(N+M) - 4 peripheral voxels detects all CRs, resulting in O(n) computational complexity. The algorithm's near real-time performance makes AdEPT-class spaceflight instruments feasible.
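    The border test at the heart of this idea can be sketched in a few lines (a simplified single-plane analogue, not the AdEPT flight code): because a CR pierces the entire pressure vessel, its trace must reach the periphery of the readout image, so connected traces touching the border can be rejected in O(n).

        import numpy as np
        from scipy import ndimage

        def reject_cosmic_rays(image, threshold=0.0):
            hits = image > threshold
            labels, _ = ndimage.label(hits)                  # connected traces
            border = np.unique(np.concatenate([labels[0, :], labels[-1, :],
                                               labels[:, 0], labels[:, -1]]))
            cr_ids = border[border > 0]                      # traces touching the periphery
            keep = hits & ~np.isin(labels, cr_ids)
            return np.where(keep, image, 0.0)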

  11. Adaptive optics based non-null interferometry for optical free form surfaces test

    NASA Astrophysics Data System (ADS)

    Zhang, Lei; Zhou, Sheng; Li, Jingsong; Yu, Benli

    2018-03-01

    An adaptive optics based non-null interferometry (ANI) method is proposed for optical free-form surface testing, in which an open-loop deformable mirror (DM) is employed as a reflective compensator to flexibly compensate various low-order aberrations. The residual wavefront aberration is treated by the multi-configuration ray tracing (MCRT) algorithm. The MCRT algorithm is based on simultaneous ray tracing through multiple system models, in which each model has a different DM surface deformation. With the MCRT algorithm, the final figure error can be extracted, together with correction of the surface misalignment aberration, after the initial system calibration. Flexible testing of free-form surfaces is achieved with high accuracy, without an auxiliary device for monitoring the DM deformation. Experiments proving the feasibility, repeatability and high accuracy of the ANI were carried out on a bi-conic surface and a paraboloidal surface, with a highly stable ALPAO DM88. The accuracy of the final test result for the paraboloidal surface was better than λ/20 PV. This is a successful attempt in flexible optical free-form surface metrology and has enormous potential for future applications as DM technology develops.

  12. Simulating polarized light scattering in terrestrial snow based on bicontinuous random medium and Monte Carlo ray tracing

    NASA Astrophysics Data System (ADS)

    Xiong, Chuan; Shi, Jiancheng

    2014-01-01

    To date, light scattering models of snow take very little account of real snow microstructure. The idealized spherical or other single-shaped particle assumptions in previous snow light scattering models can cause errors in light scattering modeling of snow and further cause errors in remote sensing inversion algorithms. This paper builds a snow polarized reflectance model based on a bicontinuous medium, with which the real snow microstructure is considered. The accurate specific surface area of the bicontinuous medium can be analytically derived. The polarized Monte Carlo ray tracing technique is applied to the computer-generated bicontinuous medium. With proper algorithms, the snow surface albedo, bidirectional reflectance distribution function (BRDF) and polarized BRDF can be simulated. Validation of the model-predicted spectral albedo and bidirectional reflectance factor (BRF) against experimental data shows good results. The relationship between snow surface albedo and snow specific surface area (SSA) was predicted, and this relationship can be used for future improvement of SSA inversion algorithms. The model-predicted polarized reflectance is validated and proves accurate, and can be further applied in polarized remote sensing.

  13. Approaching the exa-scale: a real-world evaluation of rendering extremely large data sets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patchett, John M; Ahrens, James P; Lo, Li - Ta

    2010-10-15

    Extremely large scale analysis is becoming increasingly important as supercomputers and their simulations move from petascale to exascale. The lack of dedicated hardware acceleration for rendering on today's supercomputing platforms motivates our detailed evaluation of the possibility of interactive rendering on the supercomputer. In order to facilitate our understanding of rendering on the supercomputing platform, we focus on scalability of rendering algorithms and architecture envisioned for exascale datasets. To understand tradeoffs for dealing with extremely large datasets, we compare three different rendering algorithms for large polygonal data: software based ray tracing, software based rasterization and hardware accelerated rasterization. We present a case study of strong and weak scaling of rendering extremely large data on both GPU and CPU based parallel supercomputers using ParaView, a parallel visualization tool. We use three different data sets: two synthetic and one from a scientific application. At an extreme scale, algorithmic rendering choices make a difference and should be considered while approaching exascale computing, visualization, and analysis. We find software based ray-tracing offers a viable approach for scalable rendering of the projected future massive data sizes.

  14. SU-E-T-397: Evaluation of Planned Dose Distributions by Monte Carlo (0.5%) and Ray Tracing Algorithm for the Spinal Tumors with CyberKnife

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cho, H; Brindle, J; Hepel, J

    2015-06-15

    Purpose: To analyze and evaluate the dose distributions computed by the Ray Tracing (RT) and Monte Carlo (MC, 0.5% uncertainty) algorithms for a critical structure (the spinal cord), the gross target volume, and the planning target volume. Methods: Twenty-four spinal tumor patients were treated with stereotactic body radiotherapy (SBRT) by CyberKnife in 2013 and 2014. The MC algorithm with 0.5% uncertainty was used to recalculate the dose distributions of the patients' treatment plans using the same beams, beam directions, and monitor units (MUs). Results: The prescription doses are uniformly larger for the MC plans than for RT, except in one case. Dose differences of up to a factor of 1.19 for the 0.25 cc threshold volume and 1.14 for the 1.2 cc threshold volume are observed for the spinal cord. Conclusion: The MC recalculated dose distributions are larger than the original RT calculations for the spinal tumor cases. Based on the accuracy of the MC calculations, more radiation dose might be delivered to the tumor targets and spinal cords with the increased prescription dose.

  15. Transition zone structure beneath Ethiopia from 3-D fast marching pseudo-migration stacking

    NASA Astrophysics Data System (ADS)

    Benoit, M. H.; Lopez, A.; Levin, V.

    2008-12-01

    Several models for the origin of the Afar hotspot have been put forth over the last decade, but much ambiguity remains as to whether the hotspot tectonism found there is due to a shallow or deeply seated feature. Additionally, there has been much debate as to whether the hotspot owes its existence to a 'classic' mantle plume feature or whether it is part of the African Superplume complex. To further understand the origin of the hotspot, we employ a new receiver function stacking method that incorporates a fast-marching three-dimensional ray tracing algorithm to improve upon existing studies of mantle transition zone structure. Using teleseismic data from the Ethiopia Broadband Seismic Experiment and the EAGLE (Ethiopia Afar Grand Lithospheric Experiment) experiment, we stack receiver functions using a three-dimensional pseudo-migration technique to examine topography on the 410 and 660 km discontinuities. Previous methods of receiver function pseudo-migration incorporated ray tracing methods that were not able to trace rays through highly complicated 3-D structure, or ray tracing techniques that only produced 3-D time perturbations associated with 1-D rays in a 3-D velocity medium. These previous techniques yielded confusing and incomplete results when applied to the exceedingly complicated mantle structure beneath Ethiopia. Indeed, comparisons of the 1-D versus 3-D ray tracing techniques show that the 1-D technique mislocated structure laterally in the mantle by over 100 km. Preliminary results using our new technique show a shallower than average 410 km discontinuity and a deeper than average 660 km discontinuity over much of the region, suggesting that the hotspot has a deep-seated origin.

  16. Evolutionary algorithm for optimization of nonimaging Fresnel lens geometry.

    PubMed

    Yamada, N; Nishikawa, T

    2010-06-21

    In this study, an evolutionary algorithm (EA), which consists of genetic and immune algorithms, is introduced to design the optical geometry of a nonimaging Fresnel lens; this lens generates the uniform flux concentration required for a photovoltaic cell. Herein, a design procedure that incorporates a ray-tracing technique in the EA is described, and the validity of the design is demonstrated. The results show that the EA automatically generated a unique geometry of the Fresnel lens; the use of this geometry resulted in better uniform flux concentration with high optical efficiency.

  17. Baseline mathematics and geodetics for tracking operations

    NASA Technical Reports Server (NTRS)

    James, R.

    1981-01-01

    Various geodetic and mapping algorithms are analyzed as they apply to radar tracking systems and tested in extended BASIC computer language for real time computer applications. Closed-form approaches to the solution of converting Earth centered coordinates to latitude, longitude, and altitude are compared with classical approximations. A simplified approach to atmospheric refractivity called gradient refraction is compared with conventional ray tracing processes. An extremely detailed set of documentation which provides the theory, derivations, and application of algorithms used in the programs is included. Validation methods are also presented for testing the accuracy of the algorithms.
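    As an example of the closed-form conversions the report compares, here is a sketch of Bowring's single-pass ECEF-to-geodetic method (WGS84 constants assumed for illustration; the report's own ellipsoid parameters may differ):

        import math

        A = 6378137.0                        # WGS84 semi-major axis (m)
        F = 1.0 / 298.257223563              # flattening
        B = A * (1.0 - F)                    # semi-minor axis
        E2 = (A*A - B*B) / (A*A)             # first eccentricity squared
        EP2 = (A*A - B*B) / (B*B)            # second eccentricity squared

        def ecef_to_geodetic(x, y, z):
            p = math.hypot(x, y)
            theta = math.atan2(z * A, p * B)
            lat = math.atan2(z + EP2 * B * math.sin(theta)**3,
                             p - E2 * A * math.cos(theta)**3)
            lon = math.atan2(y, x)
            n = A / math.sqrt(1.0 - E2 * math.sin(lat)**2)  # prime vertical radius
            alt = p / math.cos(lat) - n
            return math.degrees(lat), math.degrees(lon), alt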

  18. Automatic segmentation of mandible in panoramic x-ray.

    PubMed

    Abdi, Amir Hossein; Kasaei, Shohreh; Mehdizadeh, Mojdeh

    2015-10-01

    As the panoramic x-ray is the most common extraoral radiography in dentistry, segmentation of its anatomical structures facilitates diagnosis and registration of dental records. This study presents a fast and accurate method for automatic segmentation of mandible in panoramic x-rays. In the proposed four-step algorithm, a superior border is extracted through horizontal integral projections. A modified Canny edge detector accompanied by morphological operators extracts the inferior border of the mandible body. The exterior borders of ramuses are extracted through a contour tracing method based on the average model of mandible. The best-matched template is fetched from the atlas of mandibles to complete the contour of left and right processes. The algorithm was tested on a set of 95 panoramic x-rays. Evaluating the results against manual segmentations of three expert dentists showed that the method is robust. It achieved an average performance of [Formula: see text] in Dice similarity, specificity, and sensitivity.

  19. Modeling UV Radiation Feedback from Massive Stars. I. Implementation of Adaptive Ray-tracing Method and Tests

    NASA Astrophysics Data System (ADS)

    Kim, Jeong-Gyu; Kim, Woong-Tae; Ostriker, Eve C.; Skinner, M. Aaron

    2017-12-01

    We present an implementation of an adaptive ray-tracing (ART) module in the Athena hydrodynamics code that accurately and efficiently handles the radiative transfer involving multiple point sources on a three-dimensional Cartesian grid. We adopt a recently proposed parallel algorithm that uses nonblocking, asynchronous MPI communications to accelerate transport of rays across the computational domain. We validate our implementation through several standard test problems, including the propagation of radiation in vacuum and the expansions of various types of H II regions. Additionally, scaling tests show that the cost of a full ray trace per source remains comparable to that of the hydrodynamics update on up to ∼10^3 processors. To demonstrate application of our ART implementation, we perform a simulation of star cluster formation in a marginally bound, turbulent cloud, finding that its star formation efficiency is 12% when both radiation pressure forces and photoionization by UV radiation are treated. We directly compare the radiation forces computed from the ART scheme with those from the M1 closure relation. Although the ART and M1 schemes yield similar results on large scales, the latter is unable to resolve the radiation field accurately near individual point sources.

  20. Light reflection models for computer graphics.

    PubMed

    Greenberg, D P

    1989-04-14

    During the past 20 years, computer graphic techniques for simulating the reflection of light have progressed so that today images of photorealistic quality can be produced. Early algorithms considered direct lighting only, but global illumination phenomena with indirect lighting, surface interreflections, and shadows can now be modeled with ray tracing, radiosity, and Monte Carlo simulations. This article describes the historical development of computer graphic algorithms for light reflection and pictorially illustrates what will be commonly available in the near future.

  1. The effect of anatomical modeling on space radiation dose estimates: a comparison of doses for NASA phantoms and the 5th, 50th, and 95th percentile male and female astronauts.

    PubMed

    Bahadori, Amir A; Van Baalen, Mary; Shavers, Mark R; Dodge, Charles; Semones, Edward J; Bolch, Wesley E

    2011-03-21

    The National Aeronautics and Space Administration (NASA) performs organ dosimetry and risk assessment for astronauts using model-normalized measurements of the radiation fields encountered in space. To determine the radiation fields in an organ or tissue of interest, particle transport calculations are performed using self-shielding distributions generated with the computer program CAMERA to represent the human body. CAMERA mathematically traces linear rays (or path lengths) through the computerized anatomical man (CAM) phantom, a computational stylized model developed in the early 1970s with organ and body profiles modeled using solid shapes and scaled to represent the body morphometry of the 1950 50th percentile (PCTL) Air Force male. With the increasing use of voxel phantoms in medical and health physics, a conversion from a mathematically based to a voxel-based ray-tracing algorithm is warranted. In this study, the voxel-based ray tracer (VoBRaT) is introduced to ray trace voxel phantoms using a modified version of the algorithm first proposed by Siddon (1985 Med. Phys. 12 252-5). After validation, VoBRaT is used to evaluate variations in body self-shielding distributions for NASA phantoms and six University of Florida (UF) hybrid phantoms, scaled to represent the 5th, 50th, and 95th PCTL male and female astronaut body morphometries, which have changed considerably since the inception of CAM. These body self-shielding distributions are used to generate organ dose equivalents and effective doses for five commonly evaluated space radiation environments. It is found that dosimetric differences among the phantoms are greatest for soft radiation spectra and light vehicular shielding.
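    The flavor of Siddon's parametric traversal is easy to convey in 2D (a simplified sketch, not VoBRaT itself): collect the parameter values where the ray crosses grid planes, and the gaps between consecutive values give the pathlength in each voxel.

        import numpy as np

        def siddon_2d(p0, p1, nx, ny, dx=1.0, dy=1.0):
            # Return (ix, iy, length) for voxels crossed by the segment p0 -> p1;
            # the grid spans [0, nx*dx] x [0, ny*dy].
            p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
            d = p1 - p0
            alphas = [0.0, 1.0]
            for n_planes, comp, step in ((nx, 0, dx), (ny, 1, dy)):
                if d[comp] != 0.0:
                    a = (np.arange(n_planes + 1) * step - p0[comp]) / d[comp]
                    alphas.extend(a[(a > 0.0) & (a < 1.0)])
            alphas = np.unique(np.clip(alphas, 0.0, 1.0))
            length = np.hypot(d[0], d[1])
            out = []
            for a0, a1 in zip(alphas[:-1], alphas[1:]):
                mid = p0 + 0.5 * (a0 + a1) * d        # midpoint identifies the voxel
                ix, iy = int(mid[0] // dx), int(mid[1] // dy)
                if 0 <= ix < nx and 0 <= iy < ny:
                    out.append((ix, iy, (a1 - a0) * length))
            return out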

  2. Two-dimensional fast marching for geometrical optics.

    PubMed

    Capozzoli, Amedeo; Curcio, Claudio; Liseno, Angelo; Savarese, Salvatore

    2014-11-03

    We develop an approach for the fast and accurate determination of geometrical optics solutions to Maxwell's equations in inhomogeneous 2D media and for TM polarized electric fields. The eikonal equation is solved by the fast marching method. Particular attention is paid to consistently discretizing the scatterers' boundaries and matching the discretization to that of the computational domain. The ray tracing is performed, in a direct and inverse way, by using a technique introduced in computer graphics for the fast and accurate generation of textured images from vector fields. The transport equation is solved by resorting only to its integral form, the transport of polarization being trivial for the considered geometry and polarization. Numerical results for the plane wave scattering of two perfectly conducting circular cylinders and for a Luneburg lens prove the accuracy of the algorithm. In particular, it is shown how the approach is capable of properly accounting for the multiple scattering occurring between the two metallic cylinders and how inverse ray tracing should be preferred to direct ray tracing in the case of the Luneburg lens.
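    The eikonal half of such a scheme can be reproduced with the scikit-fmm package (an assumption for illustration; the paper uses its own fast marching solver). The zero level set of phi marks the source, and travel_time solves |grad T| = 1/c; rays are then traced by descending the gradient of T. A Luneburg-lens speed profile ties the sketch to the test case above:

        import numpy as np
        import skfmm

        n = 401
        x, y = np.meshgrid(np.linspace(-2, 2, n), np.linspace(-2, 2, n))
        phi = np.hypot(x + 1.8, y) - 0.02                 # small circle as the source
        r2 = x**2 + y**2
        speed = np.where(r2 <= 1.0, 1.0 / np.sqrt(2.0 - r2), 1.0)  # Luneburg lens profile
        T = skfmm.travel_time(phi, speed, dx=4.0 / (n - 1))
        # ray tracing then follows -grad(T), e.g. via numpy.gradient(T)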

  3. Spectral radiation analyses of the GOES solar illuminated hexagonal cell scan mirror back

    NASA Technical Reports Server (NTRS)

    Fantano, Louis G.

    1993-01-01

    A ray tracing analytical tool has been developed for the simulation of spectral radiation exchange in complex systems. Algorithms are used to account for heat source spectral energy, surface directional radiation properties, and surface spectral absorptivity properties. This tool has been used to calculate the effective solar absorptivity of the geostationary operational environmental satellites (GOES) scan mirror in the calibration position. The development and design of Sounder and Imager instruments on board GOES is reviewed and the problem of calculating the effective solar absorptivity associated with the GOES hexagonal cell configuration is presented. The analytical methodology based on the Monte Carlo ray tracing technique is described and results are presented and verified by experimental measurements for selected solar incidence angles.

  4. Anisotropic ray trace

    NASA Astrophysics Data System (ADS)

    Lam, Wai Sze Tiffany

    Optical components made of anisotropic materials, such as crystal polarizers and crystal waveplates, are widely used in many complex optical systems, such as display systems, microlithography and biomedical imaging, and induce more complex aberrations than optical components made of isotropic materials. The goal of this dissertation is to accurately simulate the performance of optical systems with anisotropic materials using polarization ray tracing. This work extends the polarization ray tracing calculus to incorporate ray tracing through anisotropic materials, including uniaxial, biaxial and optically active materials. The 3D polarization ray tracing calculus is an invaluable tool for analyzing the polarization properties of an optical system. The 3x3 polarization ray tracing P matrix developed for anisotropic ray tracing assists in tracking the 3D polarization transformations along a ray path through a series of surfaces in an optical system. To better represent anisotropic light-matter interactions, the definition of the P matrix is generalized to incorporate not only the polarization change at a refraction/reflection interface, but also the optical phase accumulation induced as light propagates through the anisotropic medium. This enables realistic modeling of crystalline polarization elements, such as crystal waveplates and crystal polarizers. The wavefront and polarization aberrations of these anisotropic components are more complex than those of isotropic optical components and can be evaluated from the resultant P matrix for each eigen-wavefront as well as for the overall image. One incident ray refracting or reflecting into an anisotropic medium produces two eigenpolarizations or eigenmodes propagating in different directions. The associated ray parameters of these modes necessary for the anisotropic ray trace are described in Chapter 2. The algorithms to calculate the P matrix from these ray parameters for anisotropic ray tracing are described in Chapter 3. Chapter 4 presents the data reduction of the P matrix of a crystal waveplate. The diattenuation is embedded in the singular values of P. The retardance is divided into two parts: (A) the physical retardance induced by OPLs and surface interactions, and (B) the geometrical transformation induced by the geometry of a ray path, which is calculated by the geometrical transform Q matrix. The Q matrix of an anisotropic intercept is derived from the generalization of the s- and p-bases at the anisotropic intercept; the p basis is not confined to the plane of incidence because of the anisotropic refraction or reflection. Chapter 5 shows how the multiple P matrices associated with the eigenmodes resulting from propagation through multiple anisotropic surfaces can be combined into one P matrix when the multiple modes interfere in their overlapping regions. The resultant P matrix contains the diattenuation induced at each surface interaction as well as the retardance due to ray propagation and total internal reflections. The polarization aberrations of crystal waveplates and crystal polarizers are studied in Chapters 6 and 7. A wavefront simulated by a grid of rays is traced through the anisotropic system and the resultant grid of rays is analyzed. The analysis is complicated by the ray doubling effects and by the partially overlapping eigen-wavefronts propagating in various directions. The wavefront and polarization aberrations of each eigenmode can be evaluated from the electric field distributions.
The overall polarization at the plane of interest or the image quality at the image plane is affected by each of these eigen-wavefronts. Isotropic materials become anisotropic due to stress, strain, or applied electric or magnetic fields. In Chapter 8, the P matrix for anisotropic materials is extended to ray tracing in stress-birefringent materials, which are treated as spatially varying anisotropic materials. Such simulations can predict the spatial retardance variation throughout a stressed optical component and its effects on the point spread function and modulation transfer function for different incident polarizations. The anisotropic extension of the P matrix also applies to other anisotropic optical components, such as anisotropic diffractive optical elements and anisotropic thin films. It systematically keeps track of the polarization transformation, in 3D global Cartesian coordinates, of a ray propagating through a series of anisotropic and isotropic optical components with arbitrary orientations. The polarization ray tracing calculus with this generalized P matrix provides a powerful tool for optical ray tracing and allows comprehensive analysis of complex optical systems. (Abstract shortened by UMI.)
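    The data-reduction step for diattenuation can be sketched from the singular values of the cumulative P matrix (a minimal illustration of the calculus described above; assigning one unit singular value to the propagation direction is an assumption stated in the comments):

        import numpy as np

        def cumulative_p(matrices):
            # cascade the 3x3 P matrices surface by surface, in propagation order
            p = np.eye(3)
            for m in matrices:
                p = m @ p
            return p

        def diattenuation(p):
            s = np.sort(np.linalg.svd(p, compute_uv=False))[::-1]
            # assume the largest singular value (= 1) belongs to the propagation
            # direction; the remaining two describe the transverse field
            l1, l2 = s[1], s[2]
            return (l1**2 - l2**2) / (l1**2 + l2**2)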

  5. Simultaneous elastic parameter inversion in 2-D/3-D TTI medium combined later arrival times

    NASA Astrophysics Data System (ADS)

    Bai, Chao-ying; Wang, Tao; Yang, Shang-bei; Li, Xing-wang; Huang, Guo-jiao

    2016-04-01

    Traditional traveltime inversion for anisotropic media is, in general, based on a "weak" anisotropy assumption, which simplifies both the forward part (ray tracing is performed only once) and the inversion part (a linear inversion solver is possible). But for some real applications, a general (both "weak" and "strong") anisotropic medium should be considered. In such cases, one has to develop a ray tracing algorithm that can handle general (including "strong") anisotropic media and also design a non-linear inversion solver for the subsequent tomography. Meanwhile, it is constructive to investigate how much the tomographic resolution can be improved by introducing later arrivals. With this motivation, we combined our newly developed ray tracing algorithm (a multistage irregular shortest-path method) for general anisotropic media with a non-linear inversion solver (a damped minimum-norm, constrained least squares problem solved with a conjugate gradient approach) to formulate a non-linear traveltime inversion scheme for anisotropic media. This anisotropic traveltime inversion procedure is able to incorporate later (reflected) arrival times. Both 2-D/3-D synthetic inversion experiments and comparison tests show that (1) the proposed anisotropic traveltime inversion scheme is able to recover high-contrast anomalies and (2) it is possible to improve the tomographic resolution by introducing later (reflected) arrivals, though not as much as in the isotropic case, because the sensitivities (or derivatives) of the different velocities (qP, qSV and qSH) with respect to the different elastic parameters are not the same and also depend on the inclination angle.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Y; Liu, B; Liang, B

    Purpose: The current CyberKnife treatment planning system (TPS) provides two dose calculation algorithms: Ray-tracing and Monte Carlo. The Ray-tracing algorithm is fast but less accurate, and it cannot handle the irregular fields made possible by the multi-leaf collimator system recently introduced with the CyberKnife M6 system. The Monte Carlo method has well-known accuracy, but the current version still takes a long time to finish dose calculations. The purpose of this paper is to develop a GPU-based fast collapsed-cone convolution/superposition (C/S) dose engine for the CyberKnife system that achieves both accuracy and efficiency. Methods: The TERMA distribution from a poly-energetic source was calculated in the beam's-eye-view coordinate system, which is GPU friendly and has linear complexity. The dose distribution was then computed by inversely collecting the energy depositions from all TERMA points along 192 collapsed-cone directions. The EGSnrc user code was used to pre-calculate energy deposition kernels (EDKs) for a series of mono-energetic photons. The energy spectrum was reconstructed based on the measured tissue maximum ratio (TMR) curve, and the TERMA-averaged cumulative kernels were then calculated. Beam hardening parameters and intensity profiles were optimized based on measurement data from the CyberKnife system. Results: The differences between measured and calculated TMR are less than 1% for all collimators except in the build-up regions. The calculated profiles also showed good agreement with the measured doses, within 1% except in the penumbra regions. The developed C/S dose engine was also used to evaluate four clinical CyberKnife treatment plans; compared against the Monte Carlo method for heterogeneous cases, the results showed better dose calculation accuracy than the Ray-tracing algorithm. The dose calculation takes a few seconds per beam, depending on the collimator size and the dose calculation grid. Conclusion: A GPU-based C/S dose engine has been developed for the CyberKnife system, which was proven to be efficient and accurate for clinical purposes, and can be easily implemented in a TPS.
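    The TERMA step is simple enough to sketch directly (illustrative spectrum and approximate water attenuation values, not CyberKnife beam data): for a poly-energetic beam, TERMA at depth d is the sum over energy bins of fluence times mass attenuation, attenuated exponentially along the ray.

        import numpy as np

        depth = np.linspace(0.0, 30.0, 301)              # cm along the ray
        e_bins = np.array([1.0, 2.0, 4.0, 6.0])          # MeV, assumed spectral bins
        weights = np.array([0.30, 0.35, 0.25, 0.10])     # assumed fluence weights
        mu = np.array([0.0707, 0.0494, 0.0340, 0.0277])  # 1/cm, approx. water values
        mu_over_rho = mu / 1.0                           # water density 1 g/cm^3

        # TERMA(d) = sum_E w_E * (mu/rho)_E * E * exp(-mu_E * d), per unit fluence
        terma = ((weights * mu_over_rho * e_bins)[None, :]
                 * np.exp(-np.outer(depth, mu))).sum(axis=1)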

  7. The extended Beer-Lambert theory for ray tracing modeling of LED chip-scaled packaging application with multiple luminescence materials

    NASA Astrophysics Data System (ADS)

    Yuan, Cadmus C. A.

    2015-12-01

    Optical ray tracing models have applied the Beer-Lambert method to single luminescence material systems to model the white light pattern from a blue LED light source. This paper extends the algorithm to a mixed multiple luminescence material system by introducing the equivalent excitation and emission spectra of the individual luminescence materials. The quantum efficiencies of the individual materials and the self-absorption of the multiple luminescence material system are considered as well. With this combination, researchers are able to model the luminescence characteristics of LED chip-scaled packaging (CSP), which provides simple process steps and freedom in the geometrical dimensions of the luminescence material. The method is first validated against experimental results; a further parametric investigation is then conducted.
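    The bookkeeping of this extended Beer-Lambert step can be illustrated for one ray segment crossing a two-phosphor mixture (coefficients are placeholders, not measured material data): the transmitted excitation decays with the summed absorption, and each phosphor converts its share of the absorbed light with its own quantum efficiency.

        import numpy as np

        mu = np.array([12.0, 7.5])    # 1/mm absorption of phosphors A and B at 450 nm
        qe = np.array([0.90, 0.85])   # quantum efficiencies
        length = 0.2                  # mm traversed in the mixed layer

        transmitted = np.exp(-mu.sum() * length)       # surviving blue light
        absorbed = 1.0 - transmitted
        share = mu / mu.sum()                          # each phosphor's absorption share
        converted = qe * share * absorbed              # re-emitted fraction per phosphor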

  8. MC ray-tracing optimization of lobster-eye focusing devices with RESTRAX

    NASA Astrophysics Data System (ADS)

    Šaroun, Jan; Kulda, Jiří

    2006-11-01

    The enhanced functionalities of the latest version of the RESTRAX software, providing a high-speed Monte Carlo (MC) ray-tracing code to represent a virtual three-axis neutron spectrometer, include representation of parabolic and elliptic guide profiles and facilities for numerical optimization of parameter values, characterizing the instrument components. As examples, we present simulations of a doubly focusing monochromator in combination with cold neutron guides and lobster-eye supermirror devices, concentrating a monochromatic beam to small sample volumes. A Levenberg-Marquardt minimization algorithm is used to optimize simultaneously several parameters of the monochromator and lobster-eye guides. We compare the performance of optimized configurations in terms of monochromatic neutron flux and energy spread and demonstrate the effect of lobster-eye optics on beam transformations in real and momentum subspaces.

  9. Ray-tracing in three dimensions for calculation of radiation-dose calculations. Master's thesis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kennedy, D.R.

    1986-05-27

    This thesis addresses several methods of calculating the radiation-dose distribution for use by technicians or clinicians in radiation-therapy treatment planning. It specifically covers the calculation of the effective pathlength of the radiation beam for use in beam models representing the dose distribution. A two-dimensional method by Bentley and Milan is compared to the method of Strip Trees developed by Duda and Hart, and a three-dimensional algorithm is then built to perform the calculations in three dimensions. The use of PRISMS conforms easily to the obtained CT scans and provides a means of doing only two-dimensional ray-tracing while performing three-dimensional dose calculations. This method is already being applied and used in actual calculations.

  10. Ray-tracing 3D dust radiative transfer with DART-Ray: code upgrade and public release

    NASA Astrophysics Data System (ADS)

    Natale, Giovanni; Popescu, Cristina C.; Tuffs, Richard J.; Clarke, Adam J.; Debattista, Victor P.; Fischera, Jörg; Pasetto, Stefano; Rushton, Mark; Thirlwall, Jordan J.

    2017-11-01

    We present an extensively updated version of the purely ray-tracing 3D dust radiation transfer code DART-Ray. The new version includes five major upgrades: 1) a series of optimizations for the ray-angular density and the scattered radiation source function; 2) the implementation of several data and task parallelizations using hybrid MPI+OpenMP schemes; 3) the inclusion of dust self-heating; 4) the ability to produce surface brightness maps for observers within the models in HEALPix format; 5) the possibility to set the expected numerical accuracy already at the start of the calculation. We tested the updated code with benchmark models where the dust self-heating is not negligible. Furthermore, we performed a study of the extent of the source influence volumes, using galaxy models, which are critical in determining the efficiency of the DART-Ray algorithm. The new code is publicly available, documented for both users and developers, and accompanied by several programmes to create input grids for different model geometries and to import the results of N-body and SPH simulations. These programmes can be easily adapted to different input geometries, and for different dust models or stellar emission libraries.

  11. Single-step reinitialization and extending algorithms for level-set based multi-phase flow simulations

    NASA Astrophysics Data System (ADS)

    Fu, Lin; Hu, Xiangyu Y.; Adams, Nikolaus A.

    2017-12-01

    We propose efficient single-step formulations for reinitialization and extending algorithms, which are critical components of level-set based interface-tracking methods. The level-set field is reinitialized with a single-step (non iterative) "forward tracing" algorithm. A minimum set of cells is defined that describes the interface, and reinitialization employs only data from these cells. Fluid states are extrapolated or extended across the interface by a single-step "backward tracing" algorithm. Both algorithms, which are motivated by analogy to ray-tracing, avoid multiple block-boundary data exchanges that are inevitable for iterative reinitialization and extending approaches within a parallel-computing environment. The single-step algorithms are combined with a multi-resolution conservative sharp-interface method and validated by a wide range of benchmark test cases. We demonstrate that the proposed reinitialization method achieves second-order accuracy in conserving the volume of each phase. The interface location is invariant to reapplication of the single-step reinitialization. Generally, we observe smaller absolute errors than for standard iterative reinitialization on the same grid. The computational efficiency is higher than for the standard and typical high-order iterative reinitialization methods. We observe a 2- to 6-times efficiency improvement over the standard method for serial execution. The proposed single-step extending algorithm, which is commonly employed for assigning data to ghost cells with ghost-fluid or conservative interface interaction methods, shows about 10-times efficiency improvement over the standard method while maintaining same accuracy. Despite their simplicity, the proposed algorithms offer an efficient and robust alternative to iterative reinitialization and extending methods for level-set based multi-phase simulations.

  12. Fully automated laser ray tracing system to measure changes in the crystalline lens GRIN profile.

    PubMed

    Qiu, Chen; Maceo Heilman, Bianca; Kaipio, Jari; Donaldson, Paul; Vaghefi, Ehsan

    2017-11-01

    Measuring the lens gradient refractive index (GRIN) accurately and reliably has proven an extremely challenging technical problem. A fully automated laser ray tracing (LRT) system was built to address this issue. The LRT system captures images of multiple laser projections before and after traversing through an ex vivo lens. These LRT images, combined with accurate measurements of the lens geometry, are used to calculate the lens GRIN profile. Mathematically, this is an ill-conditioned problem; hence, it is essential to apply biologically relevant constraints to produce a feasible solution. The lens GRIN measurements were compared with previously published data. Our GRIN retrieval algorithm produces fast and accurate measurements of the lens GRIN profile. Experiments to study the optics of physiologically perturbed lenses are the future direction of this research.

  13. Fully automated laser ray tracing system to measure changes in the crystalline lens GRIN profile

    PubMed Central

    Qiu, Chen; Maceo Heilman, Bianca; Kaipio, Jari; Donaldson, Paul; Vaghefi, Ehsan

    2017-01-01

    Measuring the lens gradient refractive index (GRIN) accurately and reliably has proven an extremely challenging technical problem. A fully automated laser ray tracing (LRT) system was built to address this issue. The LRT system captures images of multiple laser projections before and after traversing through an ex vivo lens. These LRT images, combined with accurate measurements of the lens geometry, are used to calculate the lens GRIN profile. Mathematically, this is an ill-conditioned problem; hence, it is essential to apply biologically relevant constraints to produce a feasible solution. The lens GRIN measurements were compared with previously published data. Our GRIN retrieval algorithm produces fast and accurate measurements of the lens GRIN profile. Experiments to study the optics of physiologically perturbed lenses are the future direction of this research. PMID:29188093

  14. Ray tracing method for simulation of laser beam interaction with random packings of powders

    NASA Astrophysics Data System (ADS)

    Kovalev, O. B.; Kovaleva, I. O.; Belyaev, V. V.

    2018-03-01

    Selective laser sintering is a technology for the rapid manufacturing of free-form solid objects, created by selectively fusing successive layers of powder with a laser. The motivation for this study is the currently insufficient understanding of the processes and phenomena of selective laser melting of powders, whose time scales differ by orders of magnitude. To construct random packings of mono- and polydisperse solid spheres, a generation algorithm based on the discrete element method is used. A numerical ray tracing method is proposed and used to simulate the interaction of laser radiation with a random bulk packing of spherical particles and to predict the optical properties of the granular layer (the extinction and absorption coefficients) depending on the optical properties of the powder material.
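    The geometric core of such a tracer is the ray-sphere intersection test, looped over the packing (typically via a spatial grid) with reflected and refracted child rays weighted by Fresnel coefficients. A minimal sketch of the test itself:

        import numpy as np

        def ray_sphere(origin, direction, center, radius):
            # Smallest positive ray parameter t, or None on a miss;
            # 'direction' is assumed to be a unit vector.
            oc = origin - center
            b = np.dot(oc, direction)
            c = np.dot(oc, oc) - radius * radius
            disc = b * b - c
            if disc < 0.0:
                return None
            sq = np.sqrt(disc)
            for t in (-b - sq, -b + sq):
                if t > 1e-9:
                    return t
            return None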

  15. Method for rapid high-frequency seismogram calculation

    NASA Astrophysics Data System (ADS)

    Stabile, Tony Alfredo; De Matteis, Raffaella; Zollo, Aldo

    2009-02-01

    We present a method for rapid, high-frequency seismogram calculation that makes use of an algorithm to automatically generate an exhaustive set of seismic phases with appreciable amplitude on the seismogram. The method uses a hierarchical order of ray and seismic-phase generation, taking into account existing constraints on ray paths as well as physical constraints. To compute synthetic seismograms, the COMRAD code (from the Italian "COdice Multifase per il RAy-tracing Dinamico") uses a dynamic ray-tracing code as its core. To validate the code, we computed synthetic seismograms in a layered medium using both COMRAD and a code that computes the complete wave field by the discrete wavenumber method. The seismograms are compared according to a time-frequency misfit criterion based on the continuous wavelet transform of the signals. Although the number of phases is considerably reduced by the selection criteria, the results show that the loss in amplitude over the whole seismogram is negligible. Moreover, the time needed to compute the synthetics using the COMRAD code (truncating the ray series at the 10th generation) is 3-4-fold less than that needed by the AXITRA code (up to a frequency of 25 Hz).

  16. CALCLENS: Weak lensing simulations for large-area sky surveys and second-order effects in cosmic shear power spectra

    NASA Astrophysics Data System (ADS)

    Becker, Matthew Rand

    I present a new algorithm, CALCLENS, for efficiently computing weak gravitational lensing shear signals from large N-body light cone simulations over a curved sky. This new algorithm properly accounts for the sky curvature and boundary conditions, is able to produce redshift-dependent shear signals including corrections to the Born approximation by using multiple-plane ray tracing, and properly computes the lensed images of source galaxies in the light cone. The key feature of this algorithm is a new, computationally efficient Poisson solver for the sphere that combines spherical harmonic transform and multigrid methods. As a result, large areas of sky (~10,000 square degrees) can be ray traced efficiently at high-resolution using only a few hundred cores. Using this new algorithm and curved-sky calculations that only use a slower but more accurate spherical harmonic transform Poisson solver, I study the convergence, shear E-mode, shear B-mode and rotation mode power spectra. Employing full-sky E/B-mode decompositions, I confirm that the numerically computed shear B-mode and rotation mode power spectra are equal at high accuracy (≲1%) as expected from perturbation theory up to second order. Coupled with realistic galaxy populations placed in large N-body light cone simulations, this new algorithm is ideally suited for the construction of synthetic weak lensing shear catalogs to be used to test for systematic effects in data analysis procedures for upcoming large-area sky surveys. The implementation presented in this work, written in C and employing widely available software libraries to maintain portability, is publicly available at http://code.google.com/p/calclens.
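    The spherical-harmonic half of such a Poisson solver can be sketched with healpy (an assumed stand-in for illustration; CALCLENS itself is written in C): on the sphere the Laplacian eigenvalues are -l(l+1), so the potential follows from filtering the density harmonics by -1/(l(l+1)).

        import numpy as np
        import healpy as hp

        nside = 512
        density = np.random.standard_normal(hp.nside2npix(nside))  # stand-in overdensity

        lmax = 2 * nside
        alm = hp.map2alm(density, lmax=lmax)
        ell = np.arange(lmax + 1)
        fl = np.zeros(lmax + 1)
        fl[1:] = -1.0 / (ell[1:] * (ell[1:] + 1.0))   # drop the monopole
        potential = hp.alm2map(hp.almxfl(alm, fl), nside)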

  17. CALCLENS: weak lensing simulations for large-area sky surveys and second-order effects in cosmic shear power spectra

    NASA Astrophysics Data System (ADS)

    Becker, Matthew R.

    2013-10-01

    I present a new algorithm, Curved-sky grAvitational Lensing for Cosmological Light conE simulatioNS (CALCLENS), for efficiently computing weak gravitational lensing shear signals from large N-body light cone simulations over a curved sky. This new algorithm properly accounts for the sky curvature and boundary conditions, is able to produce redshift-dependent shear signals including corrections to the Born approximation by using multiple-plane ray tracing and properly computes the lensed images of source galaxies in the light cone. The key feature of this algorithm is a new, computationally efficient Poisson solver for the sphere that combines spherical harmonic transform and multigrid methods. As a result, large areas of sky (˜10 000 square degrees) can be ray traced efficiently at high resolution using only a few hundred cores. Using this new algorithm and curved-sky calculations that only use a slower but more accurate spherical harmonic transform Poisson solver, I study the convergence, shear E-mode, shear B-mode and rotation mode power spectra. Employing full-sky E/B-mode decompositions, I confirm that the numerically computed shear B-mode and rotation mode power spectra are equal at high accuracy (≲1 per cent) as expected from perturbation theory up to second order. Coupled with realistic galaxy populations placed in large N-body light cone simulations, this new algorithm is ideally suited for the construction of synthetic weak lensing shear catalogues to be used to test for systematic effects in data analysis procedures for upcoming large-area sky surveys. The implementation presented in this work, written in C and employing widely available software libraries to maintain portability, is publicly available at http://code.google.com/p/calclens.
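
    The harmonic-space core of such a spherical Poisson solver is compact enough to sketch. The example below assumes the healpy package and omits the multigrid acceleration that CALCLENS combines with the harmonic transform; it solves lap(phi) = rho on the unit sphere multipole by multipole, where each coefficient obeys -l(l+1)*phi_lm = rho_lm.

        import numpy as np
        import healpy as hp

        def sphere_poisson(rho_map, nside):
            """Pure spherical-harmonic Poisson solve on a HEALPix map."""
            lmax = 3 * nside - 1
            alm = hp.map2alm(rho_map, lmax=lmax)
            ell = np.arange(lmax + 1)
            fl = np.zeros(lmax + 1)
            fl[1:] = -1.0 / (ell[1:] * (ell[1:] + 1.0))  # monopole dropped
            return hp.alm2map(hp.almxfl(alm, fl), nside, lmax=lmax)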

  18. Skeleton-based tracing of curved fibers from 3D X-ray microtomographic imaging

    NASA Astrophysics Data System (ADS)

    Huang, Xiang; Wen, Donghui; Zhao, Yanwei; Wang, Qinghui; Zhou, Wei; Deng, Daxiang

    A skeleton-based fiber tracing algorithm is described and applied to a specific fibrous material, porous metal fiber sintered sheet (PMFSS), which features high porosity and curved fibers. The skeleton segments are first categorized according to the connectivity of the skeleton paths. Spurious segments, such as fiber bonds, are detected by making extensive use of the distance transform (DT) values. Single fibers are then traced and reconstructed by consecutively choosing the connecting skeleton segment pairs that show the most similar orientations and radii. Moreover, to reduce misconnections caused by the tracing order, a multilevel tracing strategy is proposed. The fibrous network is finally reconstructed by dilating single fibers according to the DT values. Based on the traced single fibers, morphological information on fiber length, radius, orientation, and tortuosity is quantitatively analyzed and compared with our previous results (Wang et al., 2013). In addition, the number of bonds per fiber is assessed for the first time. The methodology described in this paper can be extended to other fibrous materials with suitably adapted parameters.
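
    The pairing of a skeleton with distance-transform values that drives this pipeline can be sketched with standard tools. The example below assumes SciPy and scikit-image (recent versions of skeletonize accept 3D volumes) and a hypothetical binary input volume:

        import numpy as np
        from scipy.ndimage import distance_transform_edt
        from skimage.morphology import skeletonize

        def skeleton_with_radii(fiber_volume):
            """Return skeleton voxel coordinates together with a local
            fiber-radius estimate (the DT value at each skeleton voxel)."""
            skeleton = skeletonize(fiber_volume)       # 1-voxel centerlines
            dt = distance_transform_edt(fiber_volume)  # distance to background
            coords = np.argwhere(skeleton)             # (n, 3) voxel indices
            radii = dt[skeleton]                       # local radius proxy
            return coords, radii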

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schreiner, S.; Paschal, C.B.; Galloway, R.L.

    Four methods of producing maximum intensity projection (MIP) images were studied and compared. Three of the projection methods differ in the interpolation kernel used for ray tracing. The interpolation kernels include nearest neighbor interpolation, linear interpolation, and cubic convolution interpolation. The fourth projection method is a voxel projection method that is not explicitly a ray-tracing technique. The four algorithms' performance was evaluated using a computer-generated model of a vessel and using real MR angiography data. The evaluation centered on how well an algorithm transferred an object's width to the projection plane. The voxel projection algorithm does not suffer from artifacts associated with the nearest neighbor algorithm. Also, a speed-up in the calculation of the projection is seen with the voxel projection method. Linear interpolation dramatically improves the transfer of width information from the 3D MRA data set over both the nearest neighbor and voxel projection methods. Even though the cubic convolution interpolation kernel is theoretically superior to the linear kernel, it did not project widths more accurately than linear interpolation. A possible advantage of nearest neighbor interpolation is that the size of small vessels tends to be exaggerated in the projection plane, thereby increasing their visibility. The results confirm that the way in which an MIP image is constructed has a dramatic effect on the information contained in the projection. The construction method must be chosen with the knowledge that the clinical information in the 2D projections will in general differ from that contained in the original 3D data volume. 27 refs., 16 figs., 2 tabs.
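
    The kernel comparison can be reproduced with a generic resampling routine. In the sketch below the rays are taken parallel to the z axis for simplicity, and SciPy's order-3 spline stands in for the cubic convolution kernel of the study; both simplifications are assumptions of the example.

        import numpy as np
        from scipy.ndimage import map_coordinates

        def mip_along_z(volume, n_steps=256, order=1):
            """Maximum intensity projection along z, sampling each parallel
            ray at sub-voxel steps; order=0 nearest neighbor, 1 linear,
            3 cubic spline."""
            nz, ny, nx = volume.shape
            yy, xx = np.mgrid[0:ny, 0:nx].astype(float)
            mip = np.full((ny, nx), -np.inf)
            for zi in np.linspace(0.0, nz - 1.0, n_steps):
                coords = np.stack([np.full_like(yy, zi), yy, xx])
                plane = map_coordinates(volume, coords, order=order)
                np.maximum(mip, plane, out=mip)   # running per-ray maximum
            return mip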

  20. Reconstruction of 2D PET data with Monte Carlo generated system matrix for generalized natural pixels

    NASA Astrophysics Data System (ADS)

    Vandenberghe, Stefaan; Staelens, Steven; Byrne, Charles L.; Soares, Edward J.; Lemahieu, Ignace; Glick, Stephen J.

    2006-06-01

    In discrete detector PET, natural pixels are image basis functions calculated from the response of detector pairs. By using reconstruction with natural pixel basis functions, the discretization of the object into a predefined grid can be avoided. Here, we propose to use generalized natural pixel reconstruction. In this approach, the basis functions are not the detector sensitivity functions, as in the natural pixel case, but uniform parallel strips; the backprojection of the strip coefficients results in the reconstructed image. This paper proposes an easy and efficient way to generate the system matrix M directly by Monte Carlo simulation. Elements of the generalized natural pixel system matrix are formed by calculating the intersection of a parallel strip with the detector sensitivity function. These generalized natural pixels are easier to use than conventional natural pixels because the final step from solution to a square pixel representation is done by simple backprojection. Due to rotational symmetry in the PET scanner, the matrix M is block circulant and only the first block-row needs to be stored. Data were generated using a fast Monte Carlo simulator based on ray tracing. The proposed method was compared to a listmode MLEM algorithm that used ray tracing for forward and back projection. Comparison of the algorithms with different phantoms showed that an improved resolution can be obtained using generalized natural pixel reconstruction with accurate system modelling; in addition, for the same resolution a lower noise level is present in this reconstruction. A numerical observer study showed that the proposed method exhibited increased performance compared to a standard listmode EM algorithm. In another study, more realistic data were generated using the GATE Monte Carlo simulator. For these data, a more uniform contrast recovery and a better contrast-to-noise performance were observed. Major improvements in contrast recovery were obtained with MLEM when the correct system matrix was used instead of simple ray tracing; this correct modelling was the major cause of improved contrast for the same background noise. Less important factors were the choice of the algorithm (MLEM performed better than ART) and the basis functions (generalized natural pixels gave better results than pixels).
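
    For reference, the MLEM update used as the baseline here has a very small core. The dense-matrix sketch below is illustrative only; a realistic implementation would exploit the block-circulant structure of M rather than materialize the full matrix.

        import numpy as np

        def mlem(A, y, n_iter=50):
            """MLEM reconstruction: A is the (n_lors, n_basis) system
            matrix, y the measured counts per line of response."""
            x = np.ones(A.shape[1])      # flat initial image
            sens = A.sum(axis=0)         # sensitivity: backprojection of ones
            for _ in range(n_iter):
                proj = A @ x             # forward projection
                ratio = np.where(proj > 0.0,
                                 y / np.maximum(proj, 1e-12), 0.0)
                x *= (A.T @ ratio) / np.maximum(sens, 1e-12)
            return x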

  1. Gamma ray spectroscopy employing divalent europium-doped alkaline earth halides and digital readout for accurate histogramming

    DOEpatents

    Cherepy, Nerine Jane; Payne, Stephen Anthony; Drury, Owen B; Sturm, Benjamin W

    2014-11-11

    A scintillator radiation detector system according to one embodiment includes a scintillator; and a processing device for processing pulse traces corresponding to light pulses from the scintillator, wherein pulse digitization is used to improve energy resolution of the system. A scintillator radiation detector system according to another embodiment includes a processing device for fitting digitized scintillation waveforms to an algorithm based on identifying rise and decay times and performing a direct integration of fit parameters. A method according to yet another embodiment includes processing pulse traces corresponding to light pulses from a scintillator, wherein pulse digitization is used to improve energy resolution of the system. A method in a further embodiment includes fitting digitized scintillation waveforms to an algorithm based on identifying rise and decay times; and performing a direct integration of fit parameters. Additional systems and methods are also presented.
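
    The fit-then-integrate idea of the claims can be illustrated directly. The sketch below assumes a single-pulse model with one exponential rise and one exponential decay; the model, the initial guesses, and the closed-form integral amp*(tau_d - tau_r) belong to this illustrative model, not to the patented system.

        import numpy as np
        from scipy.optimize import curve_fit

        def pulse_model(t, amp, t0, tau_r, tau_d):
            """Scintillation pulse with exponential rise and decay times."""
            s = np.clip(t - t0, 0.0, None)
            return amp * (np.exp(-s / tau_d) - np.exp(-s / tau_r))

        def fit_and_integrate(t, trace):
            """Fit the digitized trace, then integrate the *fit* analytically:
            the integral of the model over s >= 0 is amp * (tau_d - tau_r)."""
            p0 = (trace.max(), t[np.argmax(trace)], 5.0, 500.0)  # rough guesses
            popt, _ = curve_fit(pulse_model, t, trace, p0=p0)
            amp, _, tau_r, tau_d = popt
            return amp * (tau_d - tau_r)   # energy proxy from fit parameters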

  2. Ionosphere Profile Estimation Using Ionosonde & GPS Data in an Inverse Refraction Calculation

    NASA Astrophysics Data System (ADS)

    Psiaki, M. L.

    2014-12-01

    A method has been developed to assimilate ionosonde virtual heights and GPS slant TEC data to estimate the parameters of a local ionosphere model, including estimates of the topside and of latitude and longitude variations. This effort seeks to better assimilate a variety of remote sensing data in order to characterize local (and eventually regional and global) ionosphere electron density profiles. The core calculations involve a forward refractive ray-tracing solution and a nonlinear optimal estimation algorithm that inverts the forward model. The ray-tracing calculations solve a nonlinear two-point boundary value problem for the curved ionosonde or GPS ray path through a parameterized electron density profile. The solver implements a full 3D solution that can handle the case of a tilted ionosphere. These calculations use Hamiltonian equivalents of the Appleton-Hartree magneto-plasma refraction index model. The current ionosphere parameterization is a modified Booker profile, augmented to include latitude and longitude dependencies. The forward ray-tracing solution yields a given signal's group delay and beat carrier phase observables. An auxiliary set of boundary value problem solutions determines the sensitivities of the ray paths and observables with respect to the parameters of the augmented Booker profile. The nonlinear estimation algorithm compares the measured ionosonde virtual-altitude observables and GPS slant-TEC observables to the corresponding values from the forward refraction model. It uses the parameter sensitivities of the model to iteratively improve its parameter estimates in a way that reduces the residual errors between the measurements and their modeled values. This method has been applied to data from HAARP in Gakona, AK and has produced good TEC and virtual height fits. It has been extended to characterize electron density perturbations caused by HAARP heating experiments through the use of GPS slant TEC data for an LOS through the heated zone. The next planned extension of the method is to estimate the parameters of a regional ionosphere profile. The input observables will be slant TEC from an array of GPS receivers and group delay and carrier phase observables from an array of high-frequency beacons. The beacon array will function as a sort of multi-static ionosonde.
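
    Stripped of its ray-tracing forward model, the estimator is a standard sensitivity-based update. In this sketch, forward(p) stands for the two-point boundary-value ray-trace solution and jacobian(p) for the auxiliary sensitivity solutions; both callbacks are assumed rather than implemented.

        import numpy as np

        def gauss_newton(p0, forward, jacobian, measured, n_iter=10):
            """Iteratively adjust profile parameters so that the modeled
            observables (virtual heights, slant TEC) match the data."""
            p = np.asarray(p0, dtype=float)
            for _ in range(n_iter):
                r = measured - forward(p)   # residuals
                J = jacobian(p)             # d(observables)/d(parameters)
                dp, *_ = np.linalg.lstsq(J, r, rcond=None)
                p = p + dp                  # Gauss-Newton step
            return p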

  3. ALGORITHMS AND PROGRAMS FOR STRONG GRAVITATIONAL LENSING IN KERR SPACE-TIME INCLUDING POLARIZATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Bin; Maddumage, Prasad; Kantowski, Ronald

    2015-05-15

    Active galactic nuclei (AGNs) and quasars are important astrophysical objects to understand. Recently, microlensing observations have constrained the size of the quasar X-ray emission region to be of the order of 10 gravitational radii of the central supermassive black hole. For distances within a few gravitational radii, light paths are strongly bent by the strong gravity field of the central black hole. If the central black hole has nonzero angular momentum (spin), then a photon’s polarization plane will be rotated by the gravitational Faraday effect. The observed X-ray flux and polarization will then be influenced significantly by the strong gravity field near the source. Consequently, linear gravitational lensing theory is inadequate for such extreme circumstances. We present simple algorithms computing the strong lensing effects of Kerr black holes, including the effects on polarization. Our algorithms are realized in a program “KERTAP” in two versions: MATLAB and Python. The key ingredients of KERTAP are a graphic user interface, a backward ray-tracing algorithm, a polarization propagator dealing with gravitational Faraday rotation, and algorithms computing observables such as flux magnification and polarization angles. Our algorithms can be easily realized in other programming languages such as FORTRAN, C, and C++. The MATLAB version of KERTAP is parallelized using the MATLAB Parallel Computing Toolbox and the Distributed Computing Server. The Python code was sped up using Cython and supports full implementation of MPI using the “mpi4py” package. As an example, we investigate the inclination angle dependence of the observed polarization and the strong lensing magnification of AGN X-ray emission. We conclude that it is possible to perform complex numerical-relativity related computations using interpreted languages such as MATLAB and Python.

  4. Algorithms and Programs for Strong Gravitational Lensing In Kerr Space-time Including Polarization

    NASA Astrophysics Data System (ADS)

    Chen, Bin; Kantowski, Ronald; Dai, Xinyu; Baron, Eddie; Maddumage, Prasad

    2015-05-01

    Active galactic nuclei (AGNs) and quasars are important astrophysical objects to understand. Recently, microlensing observations have constrained the size of the quasar X-ray emission region to be of the order of 10 gravitational radii of the central supermassive black hole. For distances within a few gravitational radii, light paths are strongly bent by the strong gravity field of the central black hole. If the central black hole has nonzero angular momentum (spin), then a photon’s polarization plane will be rotated by the gravitational Faraday effect. The observed X-ray flux and polarization will then be influenced significantly by the strong gravity field near the source. Consequently, linear gravitational lensing theory is inadequate for such extreme circumstances. We present simple algorithms computing the strong lensing effects of Kerr black holes, including the effects on polarization. Our algorithms are realized in a program “KERTAP” in two versions: MATLAB and Python. The key ingredients of KERTAP are a graphic user interface, a backward ray-tracing algorithm, a polarization propagator dealing with gravitational Faraday rotation, and algorithms computing observables such as flux magnification and polarization angles. Our algorithms can be easily realized in other programming languages such as FORTRAN, C, and C++. The MATLAB version of KERTAP is parallelized using the MATLAB Parallel Computing Toolbox and the Distributed Computing Server. The Python code was sped up using Cython and supports full implementation of MPI using the “mpi4py” package. As an example, we investigate the inclination angle dependence of the observed polarization and the strong lensing magnification of AGN X-ray emission. We conclude that it is possible to perform complex numerical-relativity related computations using interpreted languages such as MATLAB and Python.
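
    Backward ray tracing parallelizes naturally over image pixels, which is what makes the mpi4py route effective. Below is a minimal sketch of a static row decomposition; the image size and the trace_row body are placeholders, not KERTAP's actual interface.

        import numpy as np
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()
        NY, NX = 512, 512                   # image resolution (example values)

        def trace_row(iy):
            """Placeholder: trace one image row backward toward the disk."""
            return np.zeros(NX)             # stand-in for per-pixel flux

        # Rank r handles rows r, r + size, r + 2*size, ...
        local = {iy: trace_row(iy) for iy in range(rank, NY, size)}

        chunks = comm.gather(local, root=0)
        if rank == 0:
            image = np.zeros((NY, NX))
            for chunk in chunks:
                for iy, row in chunk.items():
                    image[iy] = row         # assemble the full lensed image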

  5. Artificial Neural Network as the FPGA Trigger in the Cyclone V based Front-End for a Detection of Neutrino-Origin Showers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Szadkowski, Zbigniew; Glas, Dariusz; Pytel, Krzysztof

    Neutrinos play a fundamental role in the understanding of the origin of ultra-high-energy cosmic rays. They interact through charged and neutral currents in the atmosphere, generating extensive air showers. However, the very low rate of events potentially generated by neutrinos is a significant challenge for any detection technique and requires both sophisticated algorithms and high-resolution hardware. A trigger based on an artificial neural network was implemented into the Cyclone® V E FPGA 5CEFA9F31I7, the heart of the prototype front-end boards developed for tests of new algorithms in the Pierre Auger surface detectors. Showers for muon and tau neutrino initiating particles at various altitudes, angles, and energies were simulated in the CORSIKA and Offline platforms, giving patterns of ADC traces in the Auger water Cherenkov detectors. The 3-layer 12-8-1 neural network was trained in MATLAB on the simulated ADC traces using the Levenberg-Marquardt algorithm. Results show that the probability of generating such ADC traces is very low due to the small neutrino cross-section. Nevertheless, the ADC traces that do occur for 1-10 EeV showers are relatively short and can be analyzed by a 16-point input algorithm. We optimized the coefficients from MATLAB to obtain a maximal range of potentially registered events and adapted them for fixed-point FPGA processing to minimize calculation errors. New sophisticated triggers implemented in Cyclone® V E FPGAs, with their large number of DSP blocks and embedded memory running at 120-160 MHz sampling, may support a discovery of neutrino events in the Pierre Auger Observatory. (authors)

  6. CUDA-Accelerated Geodesic Ray-Tracing for Fiber Tracking

    PubMed Central

    van Aart, Evert; Sepasian, Neda; Jalba, Andrei; Vilanova, Anna

    2011-01-01

    Diffusion Tensor Imaging (DTI) allows the diffusion of water in fibrous tissue to be measured noninvasively. By reconstructing the fibers from DTI data using a fiber-tracking algorithm, we can deduce the structure of the tissue. In this paper, we outline an approach to accelerating such a fiber-tracking algorithm using a Graphics Processing Unit (GPU). This algorithm, which is based on the calculation of geodesics, has shown promising results for both synthetic and real data, but is limited in its applicability by its high computational requirements. We present a solution which uses the parallelism offered by modern GPUs, in combination with the CUDA platform by NVIDIA, to significantly reduce the execution time of the fiber-tracking algorithm. Compared to a multithreaded CPU implementation of the same algorithm, our GPU mapping achieves a speedup of up to a factor of 40. PMID:21941525

  7. A Rotor Tip Vortex Tracing Algorithm for Image Post-Processing

    NASA Technical Reports Server (NTRS)

    Overmeyer, Austin D.

    2015-01-01

    A neurite tracing algorithm, originally developed for medical image processing, was used to trace the location of the rotor tip vortex in density-gradient flow visualization images. The tracing algorithm was applied to several representative test images to form case studies. The accuracy of the tracing algorithm was compared to that of two current methods: a manual point-and-click method and a cross-correlation template method. It is shown that the neurite tracing algorithm can reduce the post-processing time needed to trace the vortex by a factor of 10 to 15 without compromising the accuracy of the tip vortex location relative to the other methods presented in the literature.

  8. A novel Monte Carlo algorithm for simulating crystals with McStas

    NASA Astrophysics Data System (ADS)

    Alianelli, L.; Sánchez del Río, M.; Felici, R.; Andersen, K. H.; Farhi, E.

    2004-07-01

    We developed an original Monte Carlo algorithm for the simulation of Bragg diffraction by mosaic, bent, and gradient crystals. It has practical applications, as it can be used for simulating imperfect crystals (monochromators, analyzers, and perhaps samples) in neutron ray-tracing packages such as McStas. The code we describe here provides a detailed description of the particle interaction with the microscopic homogeneous regions composing the crystal; it can therefore also be used for the calculation of quantities of conceptual interest, such as multiple scattering, or for the interpretation of experiments aimed at characterizing crystals, such as diffraction topographs.

  9. SU-F-T-53: Treatment Planning with Inhomogeneity Correction for Intraoperative Radiotherapy Using KV X-Ray Beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Y; Ghaly, M; Souri, S

    Purpose: The current standard in dose calculation for intraoperative radiotherapy (IORT) using the ZEISS Intrabeam 50 kV x-ray system is based on depth dose measurements in water, and no heterogeneous tissue effect is taken into account. We propose an algorithm for pre-treatment planning including inhomogeneity correction based on depth dose measurements in various tissue phantoms for kV x-rays. Methods: Direct depth dose measurements were made in air, water, inner bone, and cortical bone phantoms for the Intrabeam 50 kV x-rays with a needle applicator. The data were modeled by a power-law function combined with exponentials with different parameters. The phantom slabs used in the measurements were scanned to obtain CT numbers. The x-ray beam initiated from the source isocenter is ray-traced through tissues, and the corresponding doses are deposited/assigned at different depths. At a boundary where the tissue/organ changes, the x-ray beam is re-traced in the new tissue/organ starting at an equivalent depth with the same dose. In principle, a volumetric dose distribution can be generated if enough directional beams are traced. In practice, tracing several typical rays may be adequate for estimating the maximum dose to the organ at risk and the minimum dose in the target volume. Results: Depth dose measurements and modeling are shown in Figure 1. The dose versus CT number is shown in Figure 2. A computer program has been written for Kypho-IORT planning using those data. A direct measurement through 2 mm solid water, 2 mm inner bone, and 1 mm solid water yields a dose rate of 7.7 Gy/min. Our calculation gives 8.1±0.4 Gy/min, consistent with the measurement within 5%. Conclusion: The proposed method can be used to calculate the dose more accurately by taking the heterogeneous effect into account. Further validation includes comparison with Monte Carlo simulation.

  10. Numerical modeling of Gaussian beam propagation and diffraction in inhomogeneous media based on the complex eikonal equation

    NASA Astrophysics Data System (ADS)

    Huang, Xingguo; Sun, Hui

    2018-05-01

    The Gaussian beam is an important complex geometrical-optics technique for modeling seismic wave propagation and diffraction in a subsurface with complex geological structure. Current methods for Gaussian beam modeling rely on dynamic ray tracing and evanescent wave tracking. However, the dynamic ray tracing method is based on the paraxial ray approximation, and the evanescent wave tracking method cannot describe strongly evanescent fields. This leads to inaccurate computed wave fields in regions with strongly inhomogeneous media. To address this problem, we compute Gaussian beam wave fields using the complex phase obtained by directly solving the complex eikonal equation. In this method, the fast marching method, which is widely used for phase calculation, is combined with a Gauss-Newton optimization algorithm to obtain the complex phase at the regular grid points. The main theoretical challenge in combining this method with Gaussian beam modeling is to address the irregular boundary near the curved central ray. To cope with this challenge, we present a non-uniform finite difference operator and a modified fast marching method. The numerical results confirm the proposed approach.
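
    The real-valued first-arrival part of such a scheme is available off the shelf. The sketch below assumes the scikit-fmm package and an illustrative velocity model; it computes a fast-marching travel-time field only, without the complex-phase and Gauss-Newton extensions of the paper.

        import numpy as np
        import skfmm  # scikit-fmm: fast marching on regular grids

        nx, nz, h = 201, 201, 10.0        # grid size and spacing (m)
        x, z = np.meshgrid(np.arange(nx) * h, np.arange(nz) * h,
                           indexing='ij')
        velocity = 2000.0 + 0.5 * z       # speed increasing with depth (m/s)

        # phi changes sign at the source; its zero level set seeds the march.
        phi = np.ones((nx, nz))
        phi[nx // 2, 0] = -1.0

        travel_time = skfmm.travel_time(phi, velocity, dx=h)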

  11. Development and evaluation of a LOR-based image reconstruction with 3D system response modeling for a PET insert with dual-layer offset crystal design.

    PubMed

    Zhang, Xuezhu; Stortz, Greg; Sossi, Vesna; Thompson, Christopher J; Retière, Fabrice; Kozlowski, Piotr; Thiessen, Jonathan D; Goertzen, Andrew L

    2013-12-07

    In this study we present a method of 3D system response calculation for analytical computer simulation and statistical image reconstruction for a magnetic resonance imaging (MRI) compatible positron emission tomography (PET) insert system that uses a dual-layer offset (DLO) crystal design. The general analytical system response functions (SRFs) for detector geometric and inter-crystal penetration of coincident crystal pairs are derived first. We implemented a 3D ray-tracing algorithm with 4π sampling for calculating the SRFs of coincident pairs of individual DLO crystals. The determination of which detector blocks are intersected by a gamma ray is made by calculating the intersection of the ray with virtual cylinders with radii just inside the inner surface and just outside the outer-edge of each crystal layer of the detector ring. For efficient ray-tracing computation, the detector block and ray to be traced are then rotated so that the crystals are aligned along the X-axis, facilitating calculation of ray/crystal boundary intersection points. This algorithm can be applied to any system geometry using either single-layer (SL) or multi-layer array design with or without offset crystals. For effective data organization, a direct lines of response (LOR)-based indexed histogram-mode method is also presented in this work. SRF calculation is performed on-the-fly in both forward and back projection procedures during each iteration of image reconstruction, with acceleration through use of eight-fold geometric symmetry and multi-threaded parallel computation. To validate the proposed methods, we performed a series of analytical and Monte Carlo computer simulations for different system geometry and detector designs. The full-width-at-half-maximum of the numerical SRFs in both radial and tangential directions are calculated and compared for various system designs. By inspecting the sinograms obtained for different detector geometries, it can be seen that the DLO crystal design can provide better sampling density than SL or dual-layer no-offset system designs with the same total crystal length. The results of the image reconstruction with SRFs modeling for phantom studies exhibit promising image recovery capability for crystal widths of 1.27-1.43 mm and top/bottom layer lengths of 4/6 mm. In conclusion, we have developed efficient algorithms for system response modeling of our proposed PET insert with DLO crystal arrays. This provides an effective method for both 3D computer simulation and quantitative image reconstruction, and will aid in the optimization of our PET insert system with various crystal designs.
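
    The virtual-cylinder test described above reduces to intersecting the ray with a circle in the transaxial plane. A minimal sketch, modeling each crystal-layer boundary as an infinite cylinder about the z axis (the axial clipping is left out of the example):

        import numpy as np

        def ray_cylinder_t(origin, direction, radius):
            """Ray parameters t where origin + t*direction crosses an
            infinite cylinder of given radius about the z axis."""
            ox, oy = origin[0], origin[1]
            dx, dy = direction[0], direction[1]
            a = dx * dx + dy * dy
            b = 2.0 * (ox * dx + oy * dy)
            c = ox * ox + oy * oy - radius * radius
            disc = b * b - 4.0 * a * c
            if a == 0.0 or disc < 0.0:
                return None               # ray parallel to the axis, or a miss
            sq = np.sqrt(disc)
            return (-b - sq) / (2.0 * a), (-b + sq) / (2.0 * a)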

  12. Three-dimensional ray tracing in spherical and elliptical generalized Luneburg lenses for application in the human eye lens.

    PubMed

    Gómez-Correa, J E; Coello, V; Garza-Rivera, A; Puente, N P; Chávez-Cerda, S

    2016-03-10

    Ray tracing in spherical Luneburg lenses has always been represented in 2D: all propagation planes in a 3D spherical Luneburg lens generate the same ray tracing, owing to its radial symmetry. A geometry without radial symmetry generates a different ray tracing. For this reason, a new method for 3D ray tracing through spherical and elliptical Luneburg lenses using 2D methods is proposed. The physics of the propagation is presented, which allows us to construct a ray tracing associated with a vortex beam. A 3D ray tracing in a composite modified Luneburg lens that represents the human eye lens is also presented.
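
    Inside a gradient-index lens the ray path obeys d/ds (n dr/ds) = grad n, which integrates cleanly as a first-order system. The sketch below traces one ray through the classical spherical Luneburg profile n(r) = sqrt(2 - r^2); the launch conditions are illustrative, and the composite eye-lens profile of the paper is not modeled.

        import numpy as np
        from scipy.integrate import solve_ivp

        def n_luneburg(r):
            """Classical Luneburg profile: n = sqrt(2 - r^2) for r <= 1,
            n = 1 outside the unit sphere."""
            return np.sqrt(np.maximum(2.0 - r * r, 1.0))

        def ray_rhs(s, state):
            """With p = n * dr/ds: dr/ds = p/n and dp/ds = grad n,
            where grad n = -r_vec / n inside the lens."""
            r_vec, p = state[:3], state[3:]
            r = np.linalg.norm(r_vec)
            n = n_luneburg(r)
            grad_n = -r_vec / n if r < 1.0 else np.zeros(3)
            return np.concatenate([p / n, grad_n])

        # Ray entering the unit lens parallel to the z axis (n = 1 at r = 1).
        r0 = np.array([0.5, 0.0, -np.sqrt(1.0 - 0.25)])
        p0 = np.array([0.0, 0.0, 1.0])
        sol = solve_ivp(ray_rhs, [0.0, 4.0], np.concatenate([r0, p0]),
                        max_step=0.01, rtol=1e-8)
        # sol.y[:3] holds the trajectory; a Luneburg lens focuses this
        # parallel ray to the diametrically opposite point of the sphere.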

  13. Non-null annular subaperture stitching interferometry for aspheric test

    NASA Astrophysics Data System (ADS)

    Zhang, Lei; Liu, Dong; Shi, Tu; Yang, Yongying; Chong, Shiyao; Miao, Liang; Huang, Wei; Shen, Yibing; Bai, Jian

    2015-10-01

    A non-null annular subaperture stitching interferometry (NASSI) method, combining the subaperture stitching idea with the non-null test method, is proposed for steep aspheric testing. Compared with standard annular subaperture stitching interferometry (ASSI), a partial null lens (PNL) is employed as an alternative to the transmission sphere, to generate different aspherical wavefronts as the references. The number of subapertures required for coverage is thus greatly reduced, because the aspherical wavefronts better match the local slope of aspheric surfaces. Instead of various mathematical stitching algorithms, a simultaneous reverse optimizing reconstruction (SROR) method based on system modeling and ray tracing is proposed for full-aperture figure error reconstruction. All the subaperture measurements are simulated simultaneously with a multi-configuration model in a ray-tracing program, including modeling of the interferometric system and of the subaperture misalignments. With the multi-configuration model, the full-aperture figure error is extracted in the form of Zernike polynomials from the subaperture wavefront data by the SROR method. This method concurrently accomplishes subaperture retrace error and misalignment correction, requiring neither complex mathematical algorithms nor subaperture overlaps. A numerical simulation comparing the performance of NASSI with standard ASSI demonstrates the high accuracy of NASSI in testing steep aspherics. Experimental results of NASSI are shown to be in good agreement with those of a Zygo® Verifire™ Asphere interferometer.

  14. Sampling solution traces for the problem of sorting permutations by signed reversals

    PubMed Central

    2012-01-01

    Background: Traditional algorithms to solve the problem of sorting by signed reversals output just one optimal solution, while the space of all optimal solutions can be huge. A so-called trace represents a group of solutions which share the same set of reversals that must be applied to sort the original permutation following a partial ordering. By using traces, we can therefore represent the set of optimal solutions in a more compact way. Algorithms for enumerating the complete set of traces of solutions have been developed. However, due to their exponential complexity, their practical use is limited to small permutations. A partial enumeration of traces is a sampling of the complete set of traces and can be an alternative for the study of distinct evolutionary scenarios of large permutations. Ideally, the sampling should be done uniformly from the space of all optimal solutions. This is, however, conjectured to be #P-complete. Results: We propose and evaluate three algorithms for producing a sampling of the complete set of traces that can instead be shown in practice to preserve some of the characteristics of the space of all solutions. The first algorithm (RA) performs the construction of traces through a random selection of reversals on the list of optimal 1-sequences. The second algorithm (DFALT) consists of a slight modification of an algorithm that performs the complete enumeration of traces. Finally, the third algorithm (SWA) is based on a sliding window strategy to improve the enumeration of traces. All proposed algorithms were able to enumerate traces for permutations with up to 200 elements. Conclusions: We analysed the distribution of the enumerated traces with respect to their height and average reversal length. Various works indicate that the reversal length can be an important aspect in genome rearrangements. The algorithms RA and SWA show a tendency to lose traces with high average reversal length. Such traces are however rare, and qualitatively our results show that, for testable-sized permutations, the algorithms DFALT and SWA produce distributions which approximate the reversal length distributions observed with a complete enumeration of the set of traces. PMID:22704580

  15. Ray-trace analysis of glancing-incidence X-ray optical systems

    NASA Technical Reports Server (NTRS)

    Foreman, J. W., Jr.; Cardone, J. M.

    1976-01-01

    The results of a ray-trace analysis of several glancing-incidence X-ray optical systems are presented. The object of the study was threefold. First, the vignetting characteristics of the S-056 X-ray telescope were calculated using experimental data to determine mirror reflectivities. Second, a small Wolter Type I X-ray telescope intended for possible use in the Geostationary Operational Environmental Satellite program was designed and ray traced. Finally, a ray-trace program was developed for a Wolter-Schwarzschild X-ray telescope.

  16. Ray tracing method for the evaluation of grazing incidence x-ray telescopes described by spatially sampled surfaces.

    PubMed

    Yu, Jun; Shen, Zhengxiang; Sheng, Pengfeng; Wang, Xiaoqiang; Hailey, Charles J; Wang, Zhanshan

    2018-03-01

    The nested grazing incidence telescope can achieve a large collecting area in x-ray astronomy with a large number of closely packed, thin conical mirrors. Exploiting surface metrological data, ray tracing used to reconstruct the shell surface topography and evaluate the imaging performance is a powerful tool for assisting iterative improvement in the fabrication process. However, current two-dimensional (2D) ray tracing codes, especially when utilized with densely sampled surface shape data, may not provide sufficient reconstruction accuracy and are computationally cumbersome. In particular, the 2D ray tracing currently employed considers coplanar rays and thus simulates only rays in the meridional plane. This captures axial figure errors but leaves other important errors, such as roundness errors, unaccounted for. We introduce a semianalytic, three-dimensional (3D) ray tracing approach for x-ray optics that overcomes these shortcomings and is both computationally fast and accurate. We first introduce the principles and the computational details of this 3D ray tracing method. We then present computer simulations comparing this approach to 2D ray tracing, using an ideal conic Wolter-I telescope for benchmarking. Finally, the present 3D ray tracing is used to evaluate the performance of a prototype x-ray telescope fabricated for the enhanced x-ray timing and polarization mission.

  17. Fast algorithm for the rendering of three-dimensional surfaces

    NASA Astrophysics Data System (ADS)

    Pritt, Mark D.

    1994-02-01

    It is often desirable to draw a detailed and realistic representation of surface data on a computer graphics display. One such representation is a 3D shaded surface. Conventional techniques for rendering shaded surfaces are slow, however, and require substantial computational power. Furthermore, many techniques suffer from aliasing effects, which appear as jagged lines and edges. This paper describes an algorithm for the fast rendering of shaded surfaces without aliasing effects. It is much faster than conventional ray tracing and polygon-based rendering techniques and is suitable for interactive use. On an IBM RISC System/6000™ workstation it renders a 1000 × 1000 surface in about 7 seconds.

  18. Immersive Molecular Visualization with Omnidirectional Stereoscopic Ray Tracing and Remote Rendering

    PubMed Central

    Stone, John E.; Sherman, William R.; Schulten, Klaus

    2016-01-01

    Immersive molecular visualization provides the viewer with intuitive perception of complex structures and spatial relationships that are of critical interest to structural biologists. The recent availability of commodity head mounted displays (HMDs) provides a compelling opportunity for widespread adoption of immersive visualization by molecular scientists, but HMDs pose additional challenges due to the need for low-latency, high-frame-rate rendering. State-of-the-art molecular dynamics simulations produce terabytes of data that can be impractical to transfer from remote supercomputers, necessitating routine use of remote visualization. Hardware-accelerated video encoding has profoundly increased frame rates and image resolution for remote visualization, however round-trip network latencies would cause simulator sickness when using HMDs. We present a novel two-phase rendering approach that overcomes network latencies with the combination of omnidirectional stereoscopic progressive ray tracing and high performance rasterization, and its implementation within VMD, a widely used molecular visualization and analysis tool. The new rendering approach enables immersive molecular visualization with rendering techniques such as shadows, ambient occlusion lighting, depth-of-field, and high quality transparency, that are particularly helpful for the study of large biomolecular complexes. We describe ray tracing algorithms that are used to optimize interactivity and quality, and we report key performance metrics of the system. The new techniques can also benefit many other application domains. PMID:27747138

  19. Do Peripheral Refraction and Aberration Profiles Vary with the Type of Myopia? - An Illustration Using a Ray-Tracing Approach

    PubMed Central

    Bakaraju, Ravi C.; Ehrmann, Klaus; Papas, Eric B.; Ho, Arthur

    2010-01-01

    Purpose: Myopia is considered to be the most common refractive error occurring in children and young adults around the world. Motivated to elucidate how the process of emmetropization is disrupted, potentially causing myopia and its progression, researchers have shown great interest in peripheral refraction. This study assessed the effect of the myopia type, either refractive or axial, on peripheral refraction and aberration profiles. Methods: Using customized schematic eye models for myopia in a ray tracing algorithm, peripheral aberrations, including the refractive error, were calculated as a function of myopia type. Results: In all the selected models, hyperopic shifts in the mean spherical equivalent (MSE) component were found, whose magnitude seemed to be largely dependent on the field angle. The MSE profiles showed larger hyperopic shifts for the axial type of myopic models than the refractive ones, evident in the -4 and -6 D prescriptions. Additionally, greater levels of the astigmatic component (J180) were seen in axial-length-dependent models, while refractive models showed higher levels of spherical aberration and coma. Conclusion: This study has indicated that myopic eyes with primarily an axial component may have a greater risk of progression than their refractive counterparts, albeit with the same degree of refractive error. This prediction emerges from the presented theoretical ray tracing model and therefore requires clinical confirmation.

  20. Gamma ray spectroscopy employing divalent europium-doped alkaline earth halides and digital readout for accurate histogramming

    DOEpatents

    Cherepy, Nerine Jane; Payne, Stephen Anthony; Drury, Owen B.; Sturm, Benjamin W.

    2016-02-09

    According to one embodiment, a scintillator radiation detector system includes a scintillator, and a processing device for processing pulse traces corresponding to light pulses from the scintillator, where the processing device is configured to: process each pulse trace over at least two temporal windows and to use pulse digitization to improve energy resolution of the system. According to another embodiment, a scintillator radiation detector system includes a processing device configured to: fit digitized scintillation waveforms to an algorithm, perform a direct integration of fit parameters, process multiple integration windows for each digitized scintillation waveform to determine a correction factor, and apply the correction factor to each digitized scintillation waveform.

  1. Scalar wave-optical reconstruction of plenoptic camera images.

    PubMed

    Junker, André; Stenau, Tim; Brenner, Karl-Heinz

    2014-09-01

    We investigate the reconstruction of plenoptic camera images in a scalar wave-optical framework. Previous publications relating to this topic numerically simulate light propagation on the basis of ray tracing. However, due to continuing miniaturization of hardware components it can be assumed that in combination with low-aperture optical systems this technique may not be generally valid. Therefore, we study the differences between ray- and wave-optical object reconstructions of true plenoptic camera images. For this purpose we present a wave-optical reconstruction algorithm, which can be run on a regular computer. Our findings show that a wave-optical treatment is capable of increasing the detail resolution of reconstructed objects.
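
    A standard building block of such a wave-optical reconstruction is free-space propagation of the sampled scalar field by the angular spectrum method. The sketch below is a generic FFT-based propagator, not the authors' algorithm; uniform sampling dx and suppression of evanescent components are assumptions of the example.

        import numpy as np

        def angular_spectrum(field, wavelength, dx, distance):
            """Propagate a sampled 2D complex field over 'distance' in free
            space using the angular spectrum transfer function."""
            ny, nx = field.shape
            fx = np.fft.fftfreq(nx, d=dx)
            fy = np.fft.fftfreq(ny, d=dx)
            FX, FY = np.meshgrid(fx, fy)
            arg = 1.0 / wavelength ** 2 - FX ** 2 - FY ** 2
            kz = 2.0 * np.pi * np.sqrt(np.maximum(arg, 0.0))
            H = np.exp(1j * kz * distance) * (arg > 0)  # drop evanescent part
            return np.fft.ifft2(np.fft.fft2(field) * H)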

  2. Methods for calculating the vergence of an astigmatic ray bundle in an optical system that contains a freeform surface

    NASA Astrophysics Data System (ADS)

    Shirayanagi, Moriyasu

    2016-10-01

    A method using the generalized Coddington equations enables calculating the vergence of an astigmatic ray bundle in the vicinity of a skew ray in an optical system containing a freeform surface. Because this method requires time-consuming calculations, however, there is still room for increasing the calculation speed. In addition, this method cannot be applied to optical systems containing a medium with a gradient index. Therefore, we propose two new calculation methods in this paper. The first method, using differential ray tracing, enables us to shorten computation time by using simpler algorithms than those used by conventional methods. The second method, using proximate rays, employs only the ray data obtained from the rays exiting an optical system. Therefore, this method can be applied to an optical system that contains a medium with a gradient index. We show some sample applications of these methods in the field of ophthalmic optics.

  3. Thin Lens Ray Tracing.

    ERIC Educational Resources Information Center

    Gatland, Ian R.

    2002-01-01

    Proposes a ray tracing approach to thin lens analysis based on a vector form of Snell's law for paraxial rays as an alternative to the usual approach in introductory physics courses. The ray tracing approach accommodates skew rays and thus provides a complete analysis. (Author/KHR)
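
    The vector form of Snell's law on which such an approach rests fits in a few lines. A minimal sketch for a single refraction, with unit incident direction d and unit surface normal n oriented against d (the paraxial restriction of the article is dropped here):

        import numpy as np

        def refract(d, n, n1, n2):
            """Refract unit direction d at a surface with unit normal n,
            going from index n1 to n2; None on total internal reflection."""
            eta = n1 / n2
            cos_i = -np.dot(d, n)
            sin2_t = eta * eta * (1.0 - cos_i * cos_i)
            if sin2_t > 1.0:
                return None                     # total internal reflection
            cos_t = np.sqrt(1.0 - sin2_t)
            return eta * d + (eta * cos_i - cos_t) * n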

  4. Image space subdivision for fast ray tracing

    NASA Astrophysics Data System (ADS)

    Yu, Billy T.; Yu, William W.

    1999-09-01

    Ray tracing is notorious for its computational requirements. A number of techniques have been proposed to speed up the process. However, a well-known statistic indicates that ray-object intersections occupy over 95% of the total image generation time, so it is most beneficial to work on this bottleneck. Ray-object intersection reduction techniques can be classified into three major categories: bounding volume hierarchies, space subdivision, and directional subdivision. This paper introduces a technique falling into the third category. To further speed up the process, it takes advantage of hierarchy by adopting an MX-CIF quadtree in the image space. This special kind of quadtree provides simple object allocation and ease of implementation. The text also includes a theoretical proof of the expected performance. For ray-polygon comparison, the technique reduces the order of complexity from linear to square-root, O(n) → O(2√n). Experiments with various shapes, sizes, and complexities were conducted to verify the expectation. Results showed that the computational improvement grew with the complexity of the scenery. The experimental improvement was more than 90%, and it agreed with the theoretical value when the number of polygons exceeded 3000. The more complex the scene, the more efficient the acceleration. The algorithm described was implemented at the polygon level; however, it could easily be enhanced and extended to the object or higher levels.

  5. Global optimization method based on ray tracing to achieve optimum figure error compensation

    NASA Astrophysics Data System (ADS)

    Liu, Xiaolin; Guo, Xuejia; Tang, Tianjin

    2017-02-01

    Figure error degrades the performance of an optical system. When predicting performance and performing system assembly, compensation by clocking optical components around the optical axis is a conventional but user-dependent method. Commercial optical software cannot optimize this clocking, while existing automatic figure-error balancing methods can introduce approximation errors, and building their optimization models is complex and time-consuming. To overcome these limitations, an accurate and automatic global optimization method for figure error balancing is proposed. This method is based on precise ray tracing, not approximate calculation, to evaluate the wavefront error for a given combination of element rotation angles. The root mean square (RMS) of the composite wavefront error acts as the cost function. A simulated annealing algorithm is used to seek the optimal combination of rotation angles of each optical element. This method can be applied to all rotationally symmetric optics. Optimization results show that this method is 49% better than the previous approximate analytical method.
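
    The global search itself is standard. The sketch below uses SciPy's dual_annealing over periodic clocking angles with a toy cost in which each element contributes one astigmatism-like figure-error term that rotates with clocking; the coefficients and the cost model are illustrative assumptions, whereas the paper evaluates the cost by exact ray tracing.

        import numpy as np
        from scipy.optimize import dual_annealing

        COEFFS = np.array([30.0, 22.0, 15.0, 9.0])   # per-element error (nm)

        def wavefront_rms(angles):
            """Toy cost: astigmatism-like terms add as 2-vectors whose
            orientation rotates at twice the clocking angle."""
            c = np.sum(COEFFS * np.cos(2.0 * angles))
            s = np.sum(COEFFS * np.sin(2.0 * angles))
            return np.hypot(c, s)

        bounds = [(0.0, 2.0 * np.pi)] * len(COEFFS)  # one angle per element
        result = dual_annealing(wavefront_rms, bounds, seed=0)
        best_angles = result.x                       # optimal clocking set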

  6. Temperature, Pressure, and Infrared Image Survey of an Axisymmetric Heated Exhaust Plume

    NASA Technical Reports Server (NTRS)

    Nelson, Edward L.; Mahan, J. Robert; Birckelbaw, Larry D.; Turk, Jeffrey A.; Wardwell, Douglas A.; Hange, Craig E.

    1996-01-01

    The focus of this research is to numerically predict an infrared image of a jet engine exhaust plume, given field variables such as temperature, pressure, and exhaust plume constituents as a function of spatial position within the plume, and to compare this predicted image directly with measured data. This work is motivated by the need to validate computational fluid dynamic (CFD) codes through infrared imaging. The technique of reducing the three-dimensional field variable domain to a two-dimensional infrared image invokes the use of an inverse Monte Carlo ray trace algorithm and an infrared band model for exhaust gases. This report describes an experiment in which the above-mentioned field variables were carefully measured. Results from this experiment, namely tables of measured temperature and pressure data, as well as measured infrared images, are given. The inverse Monte Carlo ray trace technique is described. Finally, experimentally obtained infrared images are directly compared to infrared images predicted from the measured field variables.

  7. Reverse ray tracing for transformation optics.

    PubMed

    Hu, Chia-Yu; Lin, Chun-Hung

    2015-06-29

    Ray tracing is an important technique for predicting optical system performance. In the field of transformation optics, the Hamiltonian equations of motion for ray tracing are well known, but their numerical solutions are affected by the complexities of the inhomogeneous and anisotropic indices of the optical device. To our knowledge, no previous work has addressed ray tracing for transformation optics with extreme inhomogeneity and anisotropy. In this study, we present the use of 3D reverse ray tracing in transformation optics. The reverse ray tracing is derived from Fermat's principle using a sweeping method instead of finding the full solution to ordinary differential equations. The sweeping method is employed to obtain the eikonal function, and the wave vectors are then obtained from the gradient of that eikonal function map in the transformed space to acquire the illuminance. Because only the rays at the points of interest have to be traced, reverse ray tracing provides an efficient approach to investigating the illuminance of a system. This approach is useful in any form of transformation optics where the material property tensor is a symmetric positive-definite matrix. The performance and analysis of three transformation-optics devices with inhomogeneous and anisotropic indices are explored; the ray trajectories and illuminances in these demonstration cases are successfully solved by the proposed reverse ray tracing method.

  8. Three dimensional ray tracing of the Jovian magnetosphere in the low frequency range

    NASA Technical Reports Server (NTRS)

    Menietti, J. D.

    1984-01-01

    Ray tracing studies of Jovian low-frequency emissions were performed. A comprehensive three-dimensional ray tracing computer code for the examination of model Jovian decametric (DAM) emission was developed. The improvements to the computer code are outlined and described. The results of the ray tracings of Jovian emissions will be presented in summary form.

  9. Computer programs simplify optical system analysis

    NASA Technical Reports Server (NTRS)

    1965-01-01

    The optical ray-trace computer program performs geometrical ray tracing. The energy-trace program calculates the relative monochromatic flux density on a specific target area. This program uses the ray-trace program as a subroutine to generate a representation of the optical system.

  10. Rapid Process to Generate Beam Envelopes for Optical System Analysis

    NASA Technical Reports Server (NTRS)

    Howard, Joseph; Seals, Lenward

    2012-01-01

    The task of evaluating obstructions in the optical throughput of an optical system requires the use of two disciplines, and hence two models: optical models for the details of optical propagation, and mechanical models for determining the actual structure that exists in the optical system. Previous methods for creating beam envelopes (or cones of light) for use in this obstruction analysis were found to be cumbersome to calculate and took significant time and resources to complete. A new process was developed that takes less time to complete beam envelope analysis, is more accurate and less dependent upon manual node tracking to create the beam envelopes, and eases the burden on the mechanical CAD (computer-aided design) designers to form the beam solids. This algorithm allows rapid generation of beam envelopes for optical system obstruction analysis. Ray trace information is taken from optical design software and used to generate CAD objects that represent the boundary of the beam envelopes for detailed analysis in mechanical CAD software. Matlab is used to call ray trace data from the optical model for all fields and entrance pupil points of interest. These are chosen at the edge of each space, so that the rays produce the bounding volume of the beam. The x and y global coordinate data are collected on the surface planes of interest, typically an image of the field and the entrance pupil internal to the optical system. These coordinates are then evaluated using a convex hull algorithm, which discards interior points that are unnecessary for producing the bounding volume of interest. At this point, tolerances can be applied to expand the size of either the field or the aperture, depending on the allocations. Once this minimum set of coordinates on the pupil and field is obtained, a new set of rays is generated between the field plane and the aperture plane (or vice versa). These rays are then evaluated at planes between the aperture and the field, at as many intermediate steps as are needed to build up the bounding volume or cone shape. At each plane, the ray coordinates are again evaluated using the convex hull algorithm to reduce the data to a minimal set. When all of the coordinates of interest are obtained for every plane of the propagation, the data are formatted into an xyz file suitable for import into FRED optical analysis software, which creates a STEP file of the data. This results in a spiral-like structure that is easily imported by mechanical CAD users, who can then use an automated algorithm to wrap a skin around it and create a solid that represents the beam.
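
    The per-plane reduction step maps directly onto an off-the-shelf convex hull routine. A minimal sketch using SciPy; the input layout is an assumption of the example, and each plane must contain at least three non-collinear ray hits.

        import numpy as np
        from scipy.spatial import ConvexHull

        def envelope_sections(points_per_plane):
            """For each sampling plane, keep only the ray hits that bound
            the beam footprint. Input: list of (n_rays, 2) x-y arrays."""
            sections = []
            for pts in points_per_plane:
                hull = ConvexHull(pts)
                sections.append(pts[hull.vertices])  # ordered boundary points
            return sections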

  11. Improved atmospheric 3D BSDF model in earthlike exoplanet using ray-tracing based method

    NASA Astrophysics Data System (ADS)

    Ryu, Dongok; Kim, Sug-Whan; Seong, Sehyun

    2012-10-01

    The studies on planetary radiative transfer computation have become important elements of the disk-averaged spectral characterization of potential exoplanets. In this paper, we report an improved ray-tracing-based atmospheric simulation model as part of a 3D Earth-like planet model with three principal sub-components: land, sea, and atmosphere. Any changes in ray paths and their characteristics, such as radiative power and direction, are computed as the rays experience reflection, refraction, transmission, absorption, and scattering. The improved atmospheric BSDF algorithm uses Q. Liu's combined Rayleigh and aerosol Henyey-Greenstein scattering phase function. The input cloud-free atmosphere model consists of 48 layers with vertical absorption profiles and a scattering layer, with their input characteristics taken from the GIOVANNI database. Total Solar Irradiance data are obtained from the Solar Radiation and Climate Experiment (SORCE) mission. Using the aerosol scattering computation, we first tested the atmospheric scattering effects with an imaging simulation of HRIV, EPOXI. We then examined the computational validity of the atmospheric model against measurements of global, direct, and diffuse radiation taken by NREL (National Renewable Energy Laboratory) pyranometers and pyrheliometers at a ground station, for cases of a single incident angle and of simultaneous multiple incident angles of the solar beam.
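
    In a Monte Carlo ray-tracing treatment, the Henyey-Greenstein phase function is usually handled by sampling the scattering-angle cosine in closed form. A minimal sketch (the asymmetry value used at the end is illustrative):

        import numpy as np

        def sample_hg_costheta(g, rng, n=1):
            """Draw cos(theta) from the Henyey-Greenstein phase function
            with asymmetry parameter g; g = 0 is the isotropic limit."""
            u = rng.random(n)
            if abs(g) < 1e-6:
                return 1.0 - 2.0 * u
            s = (1.0 - g * g) / (1.0 - g + 2.0 * g * u)
            return (1.0 + g * g - s * s) / (2.0 * g)

        rng = np.random.default_rng(0)
        mu = sample_hg_costheta(0.7, rng, n=100000)  # forward-peaked aerosol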

  12. Modeling and Simulation of Radiative Compressible Flows in Aerodynamic Heating Arc-Jet Facility

    NASA Technical Reports Server (NTRS)

    Bensassi, Khalil; Laguna, Alejandro A.; Lani, Andrea; Mansour, Nagi N.

    2016-01-01

    Numerical simulations of an arc-heated flow inside NASA's 20 MW Aerodynamic Heating Facility (AHF) are performed in order to investigate the three-dimensional swirling flow and the current distribution inside the wind tunnel. The plasma is considered in local thermodynamic equilibrium (LTE) and is composed of an air-argon gas mixture. The governing equations are the Navier-Stokes equations with source terms corresponding to Joule heating and radiative cooling. The former is obtained by solving an electric potential equation, while the latter is calculated using an innovative, massively parallel ray-tracing algorithm. The fully coupled system is closed by thermodynamic relations and transport properties obtained from the Chapman-Enskog method. A novel strategy was developed to enable the flow solver and the radiation calculation to be performed independently and simultaneously using different numbers of processors; a drastic reduction in computational cost was achieved with this strategy. Details of the numerical methods used for space discretization, time integration, and the ray-tracing algorithm are presented, and the effect of radiative cooling on the dynamics of the flow is investigated. The complete set of equations was implemented within the COOLFluiD framework. Fig. 1 shows the geometry of the anode and part of the constrictor of the AHF. Fig. 2 shows the velocity field distribution in the (x-y) plane and the streamlines in the (z-y) plane.

  13. High performance ultrasonic field simulation on complex geometries

    NASA Astrophysics Data System (ADS)

    Chouh, H.; Rougeron, G.; Chatillon, S.; Iehl, J. C.; Farrugia, J. P.; Ostromoukhov, V.

    2016-02-01

    Ultrasonic field simulation is a key ingredient for the design of new testing methods as well as a crucial step for NDT inspection simulation. As presented in a previous paper [1], CEA-LIST has worked on the acceleration of these simulations focusing on simple geometries (planar interfaces, isotropic materials). In this context, significant accelerations were achieved on multicore processors and GPUs (Graphics Processing Units), bringing the execution time of realistic computations in the 0.1 s range. In this paper, we present recent works that aim at similar performances on a wider range of configurations. We adapted the physical model used by the CIVA platform to design and implement a new algorithm providing a fast ultrasonic field simulation that yields nearly interactive results for complex cases. The improvements over the CIVA pencil-tracing method include adaptive strategies for pencil subdivisions to achieve a good refinement of the sensor geometry while keeping a reasonable number of ray-tracing operations. Also, interpolation of the times of flight was used to avoid time consuming computations in the impulse response reconstruction stage. To achieve the best performance, our algorithm runs on multi-core superscalar CPUs and uses high performance specialized libraries such as Intel Embree for ray-tracing, Intel MKL for signal processing and Intel TBB for parallelization. We validated the simulation results by comparing them to the ones produced by CIVA on identical test configurations including mono-element and multiple-element transducers, homogeneous, meshed 3D CAD specimens, isotropic and anisotropic materials and wave paths that can involve several interactions with interfaces. We show performance results on complete simulations that achieve computation times in the 1s range.

  14. Bendable X-ray Optics at the ALS: Design, Tuning, Performance and Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Advanced Light Source, Lawrence Berkeley National Laboratory; Yashchuk, Valeriy V.; Church, Matthew N.

    2008-09-08

    We review the development at the Advanced Light Source (ALS) of bendable x-ray optics widely used for focusing beams of soft and hard x-rays. Typically, the focusing is divided in the tangential and sagittal directions into two elliptically cylindrical reflecting elements, the so-called Kirkpatrick-Baez (KB) pair [1]. Because fabrication of elliptical surfaces is complicated, the cost of directly fabricated tangential elliptical cylinders is often prohibitive. This is in contrast to flat optics, which are simpler to manufacture and easier to measure by conventional interferometry. The figure of a flat substrate can be changed by placing torques (couples) at each end. Equal couples form a tangential cylinder, and unequal couples can approximate a tangential ellipse or parabola. We review the nature of the bending, requirements and approaches to the mechanical design, and describe a technique developed at the ALS Optical Metrology Laboratory (OML) for optimal tuning of bendable mirrors before installation in the beamline [2]. The tuning technique adapts a method previously used to adjust bendable mirrors on synchrotron radiation beamlines [3]. However, in our case, optimal tuning of a bendable mirror is based on surface slope trace data obtained with a slope-measuring instrument, in our case the long trace profiler (LTP). We show that, due to the near linearity of the bending problem, the minimal set of data necessary for tuning two benders consists of only three slope traces measured before and after a single adjustment of each bending couple. We provide an algorithm, used in dedicated software, for finding optimal settings of the mirror benders. The algorithm is based on regression analysis with experimentally found characteristic functions of the benders. The resulting approximation to the functional dependence of the desired slope shape provides nearly final settings for the benders. Moreover, the characteristic functions of the benders found in the course of tuning can be used for retuning the optics to a new desired shape without removing it from the beamline and re-measuring with the LTP. The result of practical use of the developed technique to precisely tune a KB mirror used at the ALS for micro-focusing is also presented. We also describe a simple ray trace using the profiler data, which shows the expected performance in the beamline, and compare the simulation with experimental data. Finally, we discuss the next steps in the systematic improvement of optical performance for the application of KB pairs in synchrotron beamlines at the ALS.
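
    The tuning procedure described above is, at its core, a small linear regression: each bender's characteristic function (slope change per unit couple adjustment) is measured once, and the couple settings that best cancel the residual slope error are then found by least squares. Below is a minimal sketch of that idea with synthetic stand-ins for the LTP traces; all names and numbers are illustrative, not the authors' software.

    ```python
    import numpy as np

    # Positions along the mirror (m) and a target slope trace for the desired ellipse.
    x = np.linspace(-0.05, 0.05, 101)
    s_desired = 2.0e-3 * x + 5.0e-2 * x**2        # placeholder target profile

    # Characteristic functions of the two benders: slope change per unit couple
    # adjustment, obtained from traces taken before/after a known adjustment.
    # These are synthetic stand-ins for LTP measurements.
    f1 = 1.0e-3 * (1.0 + x / 0.05)                # upstream bender sensitivity
    f2 = 1.0e-3 * (1.0 - x / 0.05)                # downstream bender sensitivity

    s_measured = s_desired + 0.3 * f1 - 0.2 * f2  # trace with detuned benders

    # Regression: couple corrections (c1, c2) that minimize the residual slope.
    A = np.column_stack([f1, f2])
    (c1, c2), *_ = np.linalg.lstsq(A, s_desired - s_measured, rcond=None)
    print(f"couple corrections: c1 = {c1:+.3f}, c2 = {c2:+.3f}")  # -0.300, +0.200
    ```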

  15. Virtual Ray Tracing as a Conceptual Tool for Image Formation in Mirrors and Lenses

    ERIC Educational Resources Information Center

    Heikkinen, Lasse; Savinainen, Antti; Saarelainen, Markku

    2016-01-01

    The ray tracing method is widely used in teaching geometrical optics at the upper secondary and university levels. However, using simple and straightforward examples may lead to a situation in which students use the model of ray tracing too narrowly. Previous studies show that students seem to use the ray tracing method too concretely instead of…

  16. SU-F-T-619: Dose Evaluation of Specific Patient Plans Based On Monte Carlo Algorithm for a CyberKnife Stereotactic Radiosurgery System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piao, J; PLA 302 Hospital, Beijing; Xu, S

    2016-06-15

    Purpose: This study uses Monte Carlo methods to simulate the CyberKnife system and aims to develop a third-party tool to evaluate the dose verification of specific patient plans in the TPS. Methods: By simulating the treatment head using the BEAMnrc and DOSXYZnrc software, calculated and measured data were compared to determine the beam parameters. The dose distributions calculated by the Ray-Tracing and Monte Carlo algorithms of the TPS (MultiPlan Ver. 4.0.2) and by the in-house Monte Carlo simulation method were analyzed for 30 patient plans (10 head, 10 lung and 10 liver cases). A γ analysis with the combined 3 mm/3% criteria was introduced to quantitatively evaluate the difference in accuracy between the three algorithms. Results: More than 90% of the global error points were less than 2% in the comparison of the PDD and OAR curves after determining the mean energy and FWHM, so a reasonably ideal Monte Carlo beam model was established. Based on the quantitative evaluation of dose accuracy for the three algorithms, the γ analysis shows that the passing rates of the PTV in the 30 plans between the Monte Carlo simulation and the TPS Monte Carlo algorithm were good (84.88 ± 9.67% for head, 98.83 ± 1.05% for liver, 98.26 ± 1.87% for lung). The passing rates of the PTV in head and liver plans between the Monte Carlo simulation and the TPS Ray-Tracing algorithm were also good (95.93 ± 3.12% and 99.84 ± 0.33%, respectively). However, the difference in DVHs for lung plans between the Monte Carlo simulation and the Ray-Tracing algorithm was obvious, and the γ passing rate (51.263 ± 38.964%) was poor. It is feasible to use Monte Carlo simulation for verifying the dose distribution of patient plans. Conclusion: The Monte Carlo simulation algorithm developed for the CyberKnife system in this study can be used as a reference for a third-party tool, which plays an important role in dose verification of patient plans. This work was supported in part by a grant from the Chinese Natural Science Foundation (Grant No. 11275105). Thanks to Accuray Corp. for their support.
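
    The 3 mm/3% γ test used above combines a dose-difference and a distance-to-agreement criterion into one pass/fail metric per point. The following is a minimal 1-D, global-normalization sketch of such an analysis; the brute-force search and the toy dose profiles are illustrative simplifications, not the study's implementation.

    ```python
    import numpy as np

    def gamma_pass_rate(dose_ref, dose_eval, coords, dta=3.0, dd=0.03):
        """Simplified global 1-D gamma analysis (dta in mm, dd as a fraction).
        For each reference point, take the minimum combined metric over all
        evaluated points; gamma <= 1 counts as a pass."""
        d_max = dose_ref.max()
        passed = 0
        for xr, dr in zip(coords, dose_ref):
            gamma_sq = ((coords - xr) / dta) ** 2 \
                     + ((dose_eval - dr) / (dd * d_max)) ** 2
            if np.sqrt(gamma_sq.min()) <= 1.0:
                passed += 1
        return 100.0 * passed / len(dose_ref)

    x = np.linspace(0, 100, 201)                          # positions (mm)
    d_tps = np.exp(-((x - 50) / 20) ** 2)                 # toy TPS profile
    d_mc = d_tps * (1 + 0.01 * np.random.randn(x.size))   # toy MC profile
    print(f"3 mm/3% pass rate: {gamma_pass_rate(d_tps, d_mc, x):.1f}%")
    ```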

  17. A deep learning-based reconstruction of cosmic ray-induced air showers

    NASA Astrophysics Data System (ADS)

    Erdmann, M.; Glombitza, J.; Walz, D.

    2018-01-01

    We describe a method of reconstructing air showers induced by cosmic rays using deep learning techniques. We simulate an observatory consisting of ground-based particle detectors with fixed locations on a regular grid. The detector's responses to traversing shower particles are signal amplitudes as a function of time, which provide information on transverse and longitudinal shower properties. In order to take advantage of convolutional network techniques specialized in local pattern recognition, we convert all information to the image-like grid of the detectors. In this way, multiple features, such as arrival times of the first particles and optimized characterizations of time traces, are processed by the network. The reconstruction quality of the cosmic ray arrival direction turns out to be competitive with an analytic reconstruction algorithm. The reconstructed shower direction, energy and shower depth show the expected improvement in resolution for higher cosmic ray energy.
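
    A minimal sketch of the kind of network described (image-like detector grid in, arrival direction out) is shown below using PyTorch; the grid size, input channels and layer widths are invented for illustration and are not the authors' architecture.

    ```python
    import torch
    from torch import nn

    # Toy stand-in for the observatory: a 9x9 grid of stations, two input
    # channels per station (first-particle arrival time, integrated signal).
    signals = torch.randn(8, 2, 9, 9)        # batch of 8 simulated showers

    model = nn.Sequential(
        nn.Conv2d(2, 16, kernel_size=3, padding=1), nn.ReLU(),
        nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
        nn.Flatten(),
        nn.Linear(32 * 9 * 9, 64), nn.ReLU(),
        nn.Linear(64, 3),                    # regress an arrival-direction vector
    )

    direction = model(signals)
    direction = direction / direction.norm(dim=1, keepdim=True)  # unit vectors
    print(direction.shape)                   # torch.Size([8, 3])
    ```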

  18. Quantum rendering

    NASA Astrophysics Data System (ADS)

    Lanzagorta, Marco O.; Gomez, Richard B.; Uhlmann, Jeffrey K.

    2003-08-01

    In recent years, computer graphics has emerged as a critical component of the scientific and engineering process, and it is recognized as an important computer science research area. Computer graphics are extensively used for a variety of aerospace and defense training systems and by Hollywood's special effects companies. All these applications require the computer graphics systems to produce high quality renderings of extremely large data sets in short periods of time. Much research has been done in "classical computing" toward the development of efficient methods and techniques to reduce the rendering time required for large datasets. Quantum Computing's unique algorithmic features offer the possibility of speeding up some of the known rendering algorithms currently used in computer graphics. In this paper we discuss possible implementations of quantum rendering algorithms. In particular, we concentrate on the implementation of Grover's quantum search algorithm for Z-buffering, ray-tracing, radiosity, and scene management techniques. We also compare the theoretical performance between the classical and quantum versions of the algorithms.

  19. MCViNE - An object-oriented Monte Carlo neutron ray tracing simulation package

    DOE PAGES

    Lin, J. Y. Y.; Smith, Hillary L.; Granroth, Garrett E.; ...

    2015-11-28

    MCViNE (Monte-Carlo VIrtual Neutron Experiment) is an open-source Monte Carlo (MC) neutron ray-tracing software package for performing computer modeling and simulations that mirror real neutron scattering experiments. We exploited the close similarity between how instrument components are designed and operated and how such components can be modeled in software. For example, we used object-oriented programming concepts to represent neutron scatterers and detector systems, and recursive algorithms to implement multiple scattering. Combining these features in MCViNE allows one to handle sophisticated neutron scattering problems in modern instruments, including, for example, neutron detection by complex detector systems, and single and multiple scattering events in a variety of samples and sample environments. In addition, MCViNE can use simulation components from linear-chain-based MC ray tracing packages, which facilitates porting instrument models from those codes. Furthermore, it allows for components written solely in Python, which expedites prototyping of new components. These developments have enabled detailed simulations of neutron scattering experiments, with non-trivial samples, for time-of-flight inelastic instruments at the Spallation Neutron Source. Examples of such simulations for powder and single-crystal samples with various scattering kernels, including kernels for phonon and magnon scattering, are presented. With simulations that closely reproduce experimental results, scattering mechanisms can be turned on and off to determine how they contribute to the measured scattering intensities, improving our understanding of the underlying physics.

  20. Ray Tracing with Virtual Objects.

    ERIC Educational Resources Information Center

    Leinoff, Stuart

    1991-01-01

    Introduces the method of ray tracing to analyze the refraction or reflection of real or virtual images from multiple optical devices. Discusses ray-tracing techniques for locating images using convex and concave lenses or mirrors. (MDH)
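
    As a concrete instance of what such ray diagrams encode, the thin-lens equation locates the image algebraically, and a negative image distance signals the virtual image that the extended ("virtual") rays in a diagram point back to. A small worked sketch:

    ```python
    def thin_lens_image(f_mm, d_object_mm):
        """Thin-lens equation 1/f = 1/d_o + 1/d_i. A negative image distance
        means a virtual image on the same side as the object."""
        d_image = 1.0 / (1.0 / f_mm - 1.0 / d_object_mm)
        magnification = -d_image / d_object_mm
        return d_image, magnification

    # Object inside the focal length of a converging lens: virtual, upright image.
    print(thin_lens_image(f_mm=100.0, d_object_mm=50.0))  # (-100.0, 2.0)
    ```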

  1. Parallel Ray Tracing Using the Message Passing Interface

    DTIC Science & Technology

    2007-09-01

    Ray-tracing software is available for lens design and for general optical systems modeling. It tends to be designed to run on a single processor and can be very... Keywords: National Aeronautics and Space Administration (NASA), optical ray tracing, parallel computing, parallel processing, prime numbers, ray tracing.

  2. Trajectory optimization for dynamic couch rotation during volumetric modulated arc radiotherapy

    NASA Astrophysics Data System (ADS)

    Smyth, Gregory; Bamber, Jeffrey C.; Evans, Philip M.; Bedford, James L.

    2013-11-01

    Non-coplanar radiation beams are often used in three-dimensional conformal and intensity modulated radiotherapy to reduce dose to organs at risk (OAR) by geometric avoidance. In volumetric modulated arc radiotherapy (VMAT) non-coplanar geometries are generally achieved by applying patient couch rotations to single or multiple full or partial arcs. This paper presents a trajectory optimization method for a non-coplanar technique, dynamic couch rotation during VMAT (DCR-VMAT), which combines ray tracing with a graph search algorithm. Four clinical test cases (partial breast, brain, prostate only, and prostate and pelvic nodes) were used to evaluate the potential OAR sparing for trajectory-optimized DCR-VMAT plans, compared with standard coplanar VMAT. In each case, ray tracing was performed and a cost map reflecting the number of OAR voxels intersected for each potential source position was generated. The least-cost path through the cost map, corresponding to an optimal DCR-VMAT trajectory, was determined using Dijkstra’s algorithm. Results show that trajectory optimization can reduce dose to specified OARs for plans otherwise comparable to conventional coplanar VMAT techniques. For the partial breast case, the mean heart dose was reduced by 53%. In the brain case, the maximum lens doses were reduced by 61% (left) and 77% (right) and the globes by 37% (left) and 40% (right). Bowel mean dose was reduced by 15% in the prostate only case. For the prostate and pelvic nodes case, the bowel V50 Gy and V60 Gy were reduced by 9% and 45% respectively. Future work will involve further development of the algorithm and assessment of its performance over a larger number of cases in site-specific cohorts.
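
    The search at the heart of the method treats each (control point, couch angle) pair as a graph node whose cost reflects the number of OAR voxels intersected from that source position, and extracts the least-cost trajectory with Dijkstra's algorithm. A toy sketch follows; the rule that the couch may move at most one angular bin between control points is an assumption made here for illustration.

    ```python
    import heapq

    def optimal_trajectory(cost):
        """Least-cost path through a cost map (rows: control points along the
        arc, columns: candidate couch angles) using Dijkstra's algorithm."""
        n_rows, n_cols = len(cost), len(cost[0])
        heap = [(cost[0][j], 0, j, (j,)) for j in range(n_cols)]
        settled = set()
        while heap:
            c, i, j, path = heapq.heappop(heap)
            if (i, j) in settled:
                continue
            settled.add((i, j))
            if i == n_rows - 1:
                return c, path                 # cheapest full trajectory
            for dj in (-1, 0, 1):              # smoothness constraint
                nj = j + dj
                if 0 <= nj < n_cols:
                    heapq.heappush(heap, (c + cost[i + 1][nj], i + 1, nj, path + (nj,)))

    cost_map = [[3, 1, 4], [1, 5, 9], [2, 6, 5]]   # toy OAR-voxel counts
    print(optimal_trajectory(cost_map))            # (4, (1, 0, 0))
    ```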

  3. Development and Application of New Algorithms for the Simulation of Viscous Compressible Flows with Moving Bodies in Three Dimensions.

    DTIC Science & Technology

    1996-12-01

    Examples ranging from academic to industrial demonstrate the utility of the proposed procedure for ab initio surface meshing from discrete data, such as... hypersonic reentry problems, where ray-tracing based... data input within industrial simulations. The original CAD dataset had over 500 surface patches, many... All of the surface grids shown were obtained...

  4. Frontal view reconstruction for iris recognition

    DOEpatents

    Santos-Villalobos, Hector J; Bolme, David S; Boehnen, Chris Bensing

    2015-02-17

    Iris recognition can be accomplished for a wide variety of eye images by correcting input images with an off-angle gaze. A variety of techniques can be employed, including limbus modeling, corneal refraction modeling, optical flows, genetic algorithms, aspherical eye modeling and ray tracing. Precomputed transforms can enhance performance for use in commercial applications. With application of these technologies, images with significantly unfavorable gaze angles can be successfully recognized.

  5. Multi-ray-based system matrix generation for 3D PET reconstruction

    NASA Astrophysics Data System (ADS)

    Moehrs, Sascha; Defrise, Michel; Belcari, Nicola; DelGuerra, Alberto; Bartoli, Antonietta; Fabbri, Serena; Zanetti, Gianluigi

    2008-12-01

    Iterative image reconstruction algorithms for positron emission tomography (PET) require a sophisticated system matrix (model) of the scanner. Our aim is to set up such a model offline for the YAP-(S)PET II small animal imaging tomograph in order to use it subsequently with standard ML-EM (maximum-likelihood expectation maximization) and OSEM (ordered subset expectation maximization) for fully three-dimensional image reconstruction. In general, the system model can be obtained analytically, via measurements or via Monte Carlo simulations. In this paper, we present the multi-ray method, which can be considered as a hybrid method to set up the system model offline. It incorporates accurate analytical (geometric) considerations as well as crystal depth and crystal scatter effects. At the same time, it has the potential to model seamlessly other physical aspects such as the positron range. The proposed method is based on multiple rays which are traced from/to the detector crystals through the image volume. Such a ray-tracing approach itself is not new; however, we derive a novel mathematical formulation of the approach and investigate the positioning of the integration (ray-end) points. First, we study single system matrix entries and show that the positioning and weighting of the ray-end points according to Gaussian integration give better results compared to equally spaced integration points (trapezoidal integration), especially if only a small number of integration points (rays) are used. Additionally, we show that, for a given variance of the single matrix entries, the number of rays (events) required to calculate the whole matrix is a factor of 20 larger when using a pure Monte-Carlo-based method. Finally, we analyse the quality of the model by reconstructing phantom data from the YAP-(S)PET II scanner.
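
    The paper's point about integration-point placement is easy to reproduce: for a smooth integrand along a ray, Gauss-Legendre abscissas converge much faster than equally spaced (trapezoidal) samples with the same number of points. A generic sketch, where the integrand is a smooth stand-in rather than the actual system-matrix kernel:

    ```python
    import math
    import numpy as np

    def line_integral(f, a, b, n, rule="gauss"):
        """Integrate f over [a, b] with n sample points, placed either at
        Gauss-Legendre abscissas or equally spaced (trapezoidal rule)."""
        if rule == "gauss":
            t, w = np.polynomial.legendre.leggauss(n)
            x = 0.5 * (b - a) * t + 0.5 * (a + b)
            return 0.5 * (b - a) * np.sum(w * f(x))
        x = np.linspace(a, b, n)
        return np.trapz(f(x), x)

    f = lambda x: np.exp(-x**2)                  # stand-in for a smooth kernel
    exact = math.sqrt(math.pi) * math.erf(2.0)   # integral of f over [-2, 2]
    for n in (3, 5, 9):
        print(n, abs(line_integral(f, -2, 2, n, "gauss") - exact),
                 abs(line_integral(f, -2, 2, n, "trap") - exact))
    ```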

  6. Comparison between ray-tracing and physical optics for the computation of light absorption in capillaries--the influence of diffraction and interference.

    PubMed

    Qin, Yuan; Michalowski, Andreas; Weber, Rudolf; Yang, Sen; Graf, Thomas; Ni, Xiaowu

    2012-11-19

    Ray-tracing is the commonly used technique to calculate the absorption of light in laser deep-penetration welding or drilling. Since new lasers with high brilliance enable small capillaries with high aspect ratios, diffraction might become important. To examine the applicability of the ray-tracing method, we studied the total absorptance and the absorbed intensity of polarized beams in several capillary geometries. The ray-tracing results are compared with more sophisticated simulations based on physical optics. The comparison shows that the simple ray-tracing is applicable to calculate the total absorptance in triangular grooves and in conical capillaries but not in rectangular grooves. To calculate the distribution of the absorbed intensity ray-tracing fails due to the neglected interference, diffraction, and the effects of beam propagation in the capillaries with sub-wavelength diameter. If diffraction is avoided e.g. with beams smaller than the entrance pupil of the capillary or with very shallow capillaries, the distribution of the absorbed intensity calculated by ray-tracing corresponds to the local average of the interference pattern found by physical optics.

  7. Modeling a 400 Hz Signal Transmission Through the South China Sea Basin

    DTIC Science & Technology

    2009-03-01

    General ray theory and the eikonal approximation; Hamiltonian ray tracing. In general, modeling acoustic propagation through the ocean necessitates... The eikonal represents the phase component of the solution. Since solutions of constant phase represent wave fronts, and rays travel in a direction...

  8. Application of ray-traced tropospheric slant delays to geodetic VLBI analysis

    NASA Astrophysics Data System (ADS)

    Hofmeister, Armin; Böhm, Johannes

    2017-08-01

    The correction of tropospheric influences via so-called path delays is critical for the analysis of observations from space geodetic techniques like the very long baseline interferometry (VLBI). In standard VLBI analysis, the a priori slant path delays are determined using the concept of zenith delays, mapping functions and gradients. The a priori use of ray-traced delays, i.e., tropospheric slant path delays determined with the technique of ray-tracing through the meteorological data of numerical weather models (NWM), serves as an alternative way of correcting the influences of the troposphere on the VLBI observations within the analysis. In the presented research, the application of ray-traced delays to the VLBI analysis of sessions in a time span of 16.5 years is investigated. Ray-traced delays have been determined with program RADIATE (see Hofmeister in Ph.D. thesis, Department of Geodesy and Geophysics, Faculty of Mathematics and Geoinformation, Technische Universität Wien. http://resolver.obvsg.at/urn:nbn:at:at-ubtuw:1-3444, 2016) utilizing meteorological data provided by NWM of the European Centre for Medium-Range Weather Forecasts (ECMWF). In comparison with a standard VLBI analysis, which includes the tropospheric gradient estimation, the application of the ray-traced delays to an analysis, which uses the same parameterization except for the a priori slant path delay handling and the used wet mapping factors for the zenith wet delay (ZWD) estimation, improves the baseline length repeatability (BLR) at 55.9% of the baselines at sub-mm level. If no tropospheric gradients are estimated within the compared analyses, 90.6% of all baselines benefit from the application of the ray-traced delays, which leads to an average improvement of the BLR of 1 mm. The effects of the ray-traced delays on the terrestrial reference frame are also investigated. A separate assessment of the RADIATE ray-traced delays is carried out by comparison to the ray-traced delays from the National Aeronautics and Space Administration Goddard Space Flight Center (NASA GSFC) (Eriksson and MacMillan in http://lacerta.gsfc.nasa.gov/tropodelays, 2016) with respect to the analysis performances in terms of BLR results. If tropospheric gradient estimation is included in the analysis, 51.3% of the baselines benefit from the RADIATE ray-traced delays at sub-mm difference level. If no tropospheric gradients are estimated within the analysis, the RADIATE ray-traced delays deliver a better BLR at 63% of the baselines compared to the NASA GSFC ray-traced delays.

  9. Computer program for optical systems ray tracing

    NASA Technical Reports Server (NTRS)

    Ferguson, T. J.; Konn, H.

    1967-01-01

    Program traces rays of light through optical systems consisting of up to 65 different optical surfaces and computes the aberrations. For design purposes, paraxial tracings with astigmatism and third-order tracings are provided.

  10. Evaluating progressive-rendering algorithms in appearance design tasks.

    PubMed

    Ou, Jiawei; Karlik, Ondrej; Křivánek, Jaroslav; Pellacini, Fabio

    2013-01-01

    Progressive rendering is becoming a popular alternative to precomputational approaches to appearance design. However, progressive algorithms create images exhibiting visual artifacts at early stages. A user study investigated these artifacts' effects on user performance in appearance design tasks. Novice and expert subjects performed lighting and material editing tasks with four algorithms: random path tracing, quasirandom path tracing, progressive photon mapping, and virtual-point-light rendering. Both the novices and experts strongly preferred path tracing to progressive photon mapping and virtual-point-light rendering. None of the participants preferred random path tracing to quasirandom path tracing or vice versa; the same situation held between progressive photon mapping and virtual-point-light rendering. The user workflow didn’t differ significantly with the four algorithms. The Web Extras include a video showing how four progressive-rendering algorithms converged (at http://youtu.be/ck-Gevl1e9s), the source code used, and other supplementary materials.

  11. Image recognition of clipped stigma traces in rice seeds

    NASA Astrophysics Data System (ADS)

    Cheng, F.; Ying, YB

    2005-11-01

    The objective of this research is to develop an algorithm to recognize clipped stigma traces in rice seeds using image processing. First, the micro-configuration of clipped stigma traces was observed with a scanning electron microscope. Then, images of rice seeds were acquired with a color machine vision system. A digital image-processing algorithm based on morphological operations and the Hough transform was developed to detect the occurrence of clipped stigma traces. Five varieties (Jinyou402, Shanyou10, Zhongyou207, Jiayou and You3207) were evaluated. The algorithm was implemented on all image sets as a Matlab 6.5 procedure. The results showed that the algorithm achieved an average accuracy of 96% and proved to be insensitive to the different rice seed varieties.

  12. CosApps: Simulate gravitational lensing through ray tracing and shear calculation

    NASA Astrophysics Data System (ADS)

    Coss, David

    2017-12-01

    Cosmology Applications (CosApps) provides tools to simulate gravitational lensing using two different techniques, ray tracing and shear calculation. The tool ray_trace_ellipse calculates deflection angles on a grid for light passing a deflecting mass distribution. Using MPI, ray_trace_ellipse may calculate deflections in parallel across network-connected computers, such as a cluster. The program physcalc calculates the gravitational lensing shear using the relationship of convergence and shear, described by a set of coupled partial differential equations.
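
    For a point-mass lens, the deflection such a grid calculation tabulates is alpha = 4GM / (c^2 b) at impact parameter b, directed along the impact vector. A small sketch of filling a deflection-angle grid; the mass and grid extent are arbitrary illustrative values, and this is not the CosApps implementation.

    ```python
    import numpy as np

    G, c = 6.674e-11, 2.998e8            # SI units
    M = 1.0e12 * 1.989e30                # hypothetical 1e12 solar-mass lens

    x = np.linspace(-1e21, 1e21, 257)    # impact positions in the lens plane (m)
    X, Y = np.meshgrid(x, x)
    b = np.hypot(X, Y)
    b[b == 0] = np.nan                   # mask the central singularity

    alpha = 4 * G * M / (c**2 * b)       # deflection magnitude (rad)
    alpha_x, alpha_y = alpha * X / b, alpha * Y / b   # components
    print(np.nanmax(alpha))
    ```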

  13. Computer ray tracing speeds.

    PubMed

    Robb, P; Pawlowski, B

    1990-05-01

    The results of measuring the ray-trace speed and compilation speed of thirty-nine computers in fifty-seven configurations, ranging from personal computers to supercomputers, are described. A correlation of ray-trace speed has been made with the LINPACK benchmark, which allows the ray-trace speed to be estimated using LINPACK performance data. The results indicate that the latest generation of workstations, using CPUs based on RISC (reduced instruction set computer) technology, are as fast as or faster than mainframe computers in compute-bound situations.

  14. RAPID COMMUNICATION: Optical distortion correction for liquid droplet visualization using the ray tracing method: further considerations

    NASA Astrophysics Data System (ADS)

    Minor, G.; Oshkai, P.; Djilali, N.

    2007-11-01

    The original work of Kang et al (2004 Meas. Sci. Technol. 15 1104-12) presents a scheme for correcting optical distortion caused by the curved surface of a droplet, and illustrates its application in PIV measurements of the velocity field inside evaporating liquid droplets. In this work we re-derive the correction algorithm and show that several terms in the original algorithm proposed by Kang et al are erroneous. This was not evident in the original work because the erroneous terms are negligible for droplets with approximately hemispherical shapes. However, for the more general situation of droplets with shapes closer to that of a sphere, with heights much larger than their contact-line radii, these errors become quite significant. The corrected algorithm is presented and its application illustrated in comparison with that of Kang et al.

  15. Optimizing heliostat positions with local search metaheuristics using a ray tracing optical model

    NASA Astrophysics Data System (ADS)

    Reinholz, Andreas; Husenbeth, Christof; Schwarzbözl, Peter; Buck, Reiner

    2017-06-01

    The life cycle costs of solar tower power plants are mainly determined by the investment costs of construction, a significant part of which goes into the heliostat field. Therefore, an optimized placement of the heliostats that maximizes annual power production has a direct impact on the ratio of life cycle costs to revenue. We present a two-level local search method, implemented in MATLAB, utilizing the Monte Carlo ray-tracing software STRAL [1] for the evaluation of the annual power output under a specific weighted annual time scheme. The algorithm was applied to a solar tower power plant (PS10) with 624 heliostats. Compared to the former work of Buck [2], we were able to improve both the runtime of the algorithm and the quality of the output solutions significantly. Using the same environment for both algorithms, we were able to reach Buck's best solution with a speed-up factor of about 20.
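
    The inner loop of such a method is simple: perturb one heliostat, re-evaluate the annual energy with the optical model, and keep the move only if the objective improves. A greedy sketch with a toy objective standing in for the ray-traced (STRAL) evaluation; all names and numbers are illustrative.

    ```python
    import random

    def local_search(layout, annual_energy, n_iter=1000, step=1.0):
        """Greedy first-improvement local search over heliostat positions.
        `annual_energy` stands in for the ray-traced annual power evaluation."""
        best = annual_energy(layout)
        for _ in range(n_iter):
            i = random.randrange(len(layout))
            old = layout[i]
            layout[i] = (old[0] + random.uniform(-step, step),
                         old[1] + random.uniform(-step, step))
            value = annual_energy(layout)
            if value > best:
                best = value             # keep the improving move
            else:
                layout[i] = old          # revert
        return layout, best

    # Toy objective: favor a ring 100 m from the tower (purely illustrative).
    energy = lambda ps: -sum(((x * x + y * y) ** 0.5 - 100.0) ** 2 for x, y in ps)
    layout = [(random.uniform(-150, 150), random.uniform(-150, 150)) for _ in range(20)]
    print(local_search(layout, energy)[1])
    ```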

  16. Ray Tracing and Modal Methods for Modeling Radio Propagation in Tunnels With Rough Walls

    PubMed Central

    Zhou, Chenming

    2017-01-01

    At the ultrahigh frequencies common to portable radios, tunnels such as mine entries are often modeled by hollow dielectric waveguides. The roughness condition of the tunnel walls has an influence on radio propagation, and therefore should be taken into account when an accurate power prediction is needed. This paper investigates how wall roughness affects radio propagation in tunnels, and presents a unified ray tracing and modal method for modeling radio propagation in tunnels with rough walls. First, general analytical formulas for modeling the influence of the wall roughness are derived, based on the modal method and the ray tracing method, respectively. Second, the equivalence of the ray tracing and modal methods in the presence of wall roughnesses is mathematically proved, by showing that the ray tracing-based analytical formula can converge to the modal-based formula through the Poisson summation formula. The derivation and findings are verified by simulation results based on ray tracing and modal methods. PMID:28935995

  17. Validation of Ray Tracing Code Refraction Effects

    NASA Technical Reports Server (NTRS)

    Heath, Stephanie L.; McAninch, Gerry L.; Smith, Charles D.; Conner, David A.

    2008-01-01

    NASA's current predictive capabilities using the ray tracing program (RTP) are validated using helicopter noise data taken at Eglin Air Force Base in 2007. By including refractive propagation effects due to wind and temperature, the ray tracing code is able to explain large variations in the data observed during the flight test.

  18. Comparing FDTD and Ray-Tracing Models in Numerical Simulation of HgCdTe LWIR Photodetectors

    NASA Astrophysics Data System (ADS)

    Vallone, Marco; Goano, Michele; Bertazzi, Francesco; Ghione, Giovanni; Schirmacher, Wilhelm; Hanna, Stefan; Figgemeier, Heinrich

    2016-09-01

    We present a simulation study of HgCdTe-based long-wavelength infrared detectors, focusing on methodological comparisons between the finite-difference time-domain (FDTD) and ray-tracing optical models. We performed three-dimensional simulations to determine the absorbed photon density distributions and the corresponding photocurrent and quantum efficiency spectra of isolated n-on-p uniform-composition pixels, systematically comparing the results obtained with FDTD and ray tracing. Since ray tracing is a classical optics approach, unable to describe interference effects, its applicability has been found to be strongly wavelength dependent, especially when reflections from metallic layers are relevant. Interesting cavity effects around the material cutoff wavelength are described, and the cases where ray tracing can be considered a viable approximation are discussed.

  19. Real-time dose computation: GPU-accelerated source modeling and superposition/convolution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacques, Robert; Wong, John; Taylor, Russell

    Purpose: To accelerate dose calculation to interactive rates using highly parallel graphics processing units (GPUs). Methods: The authors have extended their prior work in GPU-accelerated superposition/convolution with a modern dual-source model and have enhanced performance. The primary source algorithm supports both focused leaf ends and asymmetric rounded leaf ends. The extra-focal algorithm uses a discretized, isotropic area source and models multileaf collimator leaf height effects. The spectral and attenuation effects of static beam modifiers were integrated into each source's spectral function. The authors introduce the concepts of arc superposition and delta superposition. Arc superposition utilizes separate angular sampling for the total energy released per unit mass (TERMA) and superposition computations to increase accuracy and performance. Delta superposition allows single beamlet changes to be computed efficiently. The authors extended their concept of multi-resolution superposition to include kernel tilting. Multi-resolution superposition approximates solid angle ray-tracing, improving performance and scalability with a minor loss in accuracy. Superposition/convolution was implemented using the inverse cumulative-cumulative kernel and exact radiological path ray-tracing. The accuracy analyses were performed using multiple kernel ray samplings, both with and without kernel tilting and multi-resolution superposition. Results: Source model performance was <9 ms (data dependent) for a high resolution (400²) field using an NVIDIA (Santa Clara, CA) GeForce GTX 280. Computation of the physically correct multispectral TERMA attenuation was improved by a material centric approach, which increased performance by over 80%. Superposition performance was improved by ~24% to 0.058 and 0.94 s for 64³ and 128³ water phantoms; a speed-up of 101-144x over the highly optimized Pinnacle³ (Philips, Madison, WI) implementation. Pinnacle³ times were 8.3 and 94 s, respectively, on an AMD (Sunnyvale, CA) Opteron 254 (two cores, 2.8 GHz). Conclusions: The authors have completed a comprehensive, GPU-accelerated dose engine in order to provide a substantial performance gain over CPU based implementations. Real-time dose computation is feasible with the accuracy levels of the superposition/convolution algorithm.

  20. Studying the precision of ray tracing techniques with Szekeres models

    NASA Astrophysics Data System (ADS)

    Koksbang, S. M.; Hannestad, S.

    2015-07-01

    The simplest standard ray tracing scheme employing the Born and Limber approximations and neglecting lens-lens coupling is used for computing the convergence along individual rays in mock N-body data based on Szekeres swiss cheese and onion models. The results are compared with the exact convergence computed using the exact Szekeres metric combined with the Sachs formalism. A comparison is also made with an extension of the simple ray tracing scheme which includes the Doppler convergence. The exact convergence is reproduced very precisely as the sum of the gravitational and Doppler convergences along rays in Lemaitre-Tolman-Bondi swiss cheese and single void models. This is not the case when the swiss cheese models are based on nonsymmetric Szekeres models. For such models, there is a significant deviation between the exact and ray traced paths and hence also the corresponding convergences. There is also a clear deviation between the exact and ray tracing results obtained when studying both nonsymmetric and spherically symmetric Szekeres onion models.

  1. Computational methods for inverse problems in geophysics: inversion of travel time observations

    USGS Publications Warehouse

    Pereyra, V.; Keller, H.B.; Lee, W.H.K.

    1980-01-01

    General ways of solving various inverse problems are studied for given travel time observations between sources and receivers. These problems are separated into three components: (a) the representation of the unknown quantities appearing in the model; (b) the nonlinear least-squares problem; (c) the direct, two-point ray-tracing problem used to compute travel time once the model parameters are given. Novel software is described for (b) and (c), and some ideas given on (a). Numerical results obtained with artificial data and an implementation of the algorithm are also presented. © 1980.

  2. The vectorization of a ray tracing program for image generation

    NASA Technical Reports Server (NTRS)

    Plunkett, D. J.; Cychosz, J. M.; Bailey, M. J.

    1984-01-01

    Ray tracing is a widely used method for producing realistic computer-generated images. Ray tracing involves firing an imaginary ray from a view point, through a point on an image plane, into a three-dimensional scene. The intersections of the ray with the objects in the scene determine what is visible at that point on the image plane. This process must be repeated many times, once for each point (commonly called a pixel) in the image plane. A typical image contains more than a million pixels, making this process computationally expensive. A traditional ray tracing program processes one ray at a time; in such a serial approach, as much as ninety percent of the execution time is spent computing the intersection of a ray with the surfaces in the scene. With the CYBER 205, many rays can be intersected with all the bodies in the scene with a single series of vector operations, and vectorization of this intersection process results in large decreases in computation time. The CADLAB's interest in ray tracing stems from the need to produce realistic images of mechanical parts. A high-quality image of a part during the design process can increase the productivity of the designer by helping him visualize the results of his work. To be useful in the design process, these images must be produced in a reasonable amount of time. This discussion explains how the ray tracing process was vectorized and gives examples of the images obtained.
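
    The same idea maps directly onto modern array libraries: batch one ray per array row so the hot intersection loop becomes a few whole-array operations. A minimal NumPy sketch of vectorized ray-sphere intersection; the scene and ray bundle are made up.

    ```python
    import numpy as np

    def intersect_sphere(origins, directions, center, radius):
        """Intersect many rays with one sphere at once. Assumes unit-length
        directions; returns distance to the nearest hit, or inf on a miss."""
        oc = origins - center
        b = np.einsum('ij,ij->i', directions, oc)
        disc = b**2 - (np.einsum('ij,ij->i', oc, oc) - radius**2)
        t = -b - np.sqrt(np.where(disc >= 0, disc, np.inf))
        return np.where((disc >= 0) & (t > 0), t, np.inf)

    n = 100_000
    origins = np.zeros((n, 3))
    phi = np.linspace(-0.3, 0.3, n)      # a fan of rays toward the sphere
    directions = np.stack([np.sin(phi), np.zeros(n), np.cos(phi)], axis=1)
    print(intersect_sphere(origins, directions, np.array([0.0, 0.0, 10.0]), 1.0))
    ```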

  3. Reverse radiance: a fast accurate method for determining luminance

    NASA Astrophysics Data System (ADS)

    Moore, Kenneth E.; Rykowski, Ronald F.; Gangadhara, Sanjay

    2012-10-01

    Reverse ray tracing from a region of interest backward to the source has long been proposed as an efficient method of determining luminous flux. The idea is to trace rays only from where the final flux needs to be known back to the source, rather than tracing in the forward direction from the source outward to see where the light goes. Once the reverse ray reaches the source, the radiance the equivalent forward ray would have represented is determined and the resulting flux computed. Although reverse ray tracing is conceptually simple, the method critically depends upon an accurate source model in both the near and far field. An overly simplified source model, such as an ideal Lambertian surface substantially detracts from the accuracy and thus benefit of the method. This paper will introduce an improved method of reverse ray tracing that we call Reverse Radiance that avoids assumptions about the source properties. The new method uses measured data from a Source Imaging Goniometer (SIG) that simultaneously measures near and far field luminous data. Incorporating this data into a fast reverse ray tracing integration method yields fast, accurate data for a wide variety of illumination problems.

  4. Computation and analysis of backward ray-tracing in aero-optics flow fields.

    PubMed

    Xu, Liang; Xue, Deting; Lv, Xiaoyi

    2018-01-08

    A backward ray-tracing method is proposed for aero-optics simulation. Different from forward tracing, the backward tracing direction is from the internal sensor to the distant target. Along this direction, the tracing passes in turn through the internal gas region, the aero-optics flow field, and the freestream. The coordinate value, the density, and the refractive index are calculated at each tracing step. A stopping criterion is developed to ensure the tracing stops at the outer edge of the aero-optics flow field. As a demonstration, the analysis is carried out for a typical blunt-nosed vehicle. The backward tracing method and stopping criterion greatly simplify the ray-tracing computations in the aero-optics flow field, and they can be extended to our active laser illumination aero-optics study because of the reciprocity principle.
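
    In a flow field the ray bends continuously with the refractive-index gradient, so the trace is an ODE integration rather than a sequence of surface hits. A heavily simplified sketch of stepping a backward ray until it leaves the region of varying index follows; the Euler update, the Gaussian index bump and the spherical-edge stopping test are all assumptions made here for illustration.

    ```python
    import numpy as np

    def trace_backward(r0, d0, n_field, grad_n, ds=1e-3, r_edge=0.5):
        """Step a ray outward from the sensor through an index field n(r),
        bending it toward higher index; stop at the flow-field edge."""
        r, d = np.asarray(r0, float), np.asarray(d0, float)
        d /= np.linalg.norm(d)
        path = [r.copy()]
        while np.linalg.norm(r) <= r_edge:         # stopping criterion
            d += ds * grad_n(r) / n_field(r)       # crude bending update
            d /= np.linalg.norm(d)
            r = r + ds * d
            path.append(r.copy())
        return np.array(path)

    # Hypothetical Gaussian density bump mapped to an index field.
    n = lambda r: 1.0 + 1e-4 * np.exp(-16.0 * np.dot(r, r))
    gn = lambda r: -32.0 * 1e-4 * np.exp(-16.0 * np.dot(r, r)) * r
    print(trace_backward([0.0, 0.05, 0.0], [1.0, 0.0, 0.0], n, gn).shape)
    ```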

  5. SolarPILOT | Concentrating Solar Power | NREL

    Science.gov Websites

    Unlike exclusively ray-tracing tools, SolarPILOT runs an analytical simulation engine alongside a ray-tracing core for more detailed simulations; the ray-tracing core is based on the SolTrace simulation engine.

  6. Ray tracing of multiple transmitted/reflected/converted waves in 2-D/3-D layered anisotropic TTI media and application to crosswell traveltime tomography

    NASA Astrophysics Data System (ADS)

    Bai, Chao-Ying; Huang, Guo-Jiao; Li, Xiao-Ling; Zhou, Bing; Greenhalgh, Stewart

    2013-11-01

    To overcome the deficiency of some current grid-/cell-based ray tracing algorithms, which are only able to handle first arrivals or primary reflections (or conversions) in anisotropic media, we have extended the functionality of the multistage irregular shortest-path method to 2-D/3-D tilted transversely isotropic (TTI) media. The new approach is able to track multiple transmitted/reflected/converted arrivals composed of any kind of combinations of transmissions, reflections and mode conversions. The basic principle is that the seven parameters (five elastic parameters plus two polar angles defining the tilt of the symmetry axis) of the TTI media are sampled at primary nodes, and the group velocity values at secondary nodes are obtained by tri-linear interpolation of the primary nodes across each cell, from which the group velocities of the three wave modes (qP, qSV and qSH) are calculated. Finally, we conduct grid-/cell-based wave front expansion to trace multiple transmitted/reflected/converted arrivals from one region to the next. The results of calculations in uniform anisotropic media indicate that the numerical results agree with the analytical solutions except in directions of SV-wave triplications, at which only the lowest velocity value is selected at the singularity points by the multistage irregular shortest-path anisotropic ray tracing method. This verifies the accuracy of the methodology. Several simulation results show that the new method is able to efficiently and accurately approximate situations involving continuous velocity variations and undulating discontinuities, and that it is suitable for any combination of multiple transmitted/reflected/converted arrival tracking in TTI media of arbitrary strength and tilt. Crosshole synthetic traveltime tomographic tests have been performed, which highlight the importance of using such code when the medium is distinctly anisotropic.

  7. Improved electron probe microanalysis of trace elements in quartz

    USGS Publications Warehouse

    Donovan, John J.; Lowers, Heather; Rusk, Brian G.

    2011-01-01

    Quartz occurs in a wide range of geologic environments throughout the Earth's crust. The concentration and distribution of trace elements in quartz provide information such as temperature and other physical conditions of formation. Trace element analyses with modern electron-probe microanalysis (EPMA) instruments can achieve 99% confidence detection of ~100 ppm with fairly minimal effort for many elements in samples of low to moderate average atomic number such as many common oxides and silicates. However, trace element measurements below 100 ppm in many materials are limited, not only by the precision of the background measurement, but also by the accuracy with which background levels are determined. A new "blank" correction algorithm has been developed and tested on both Cameca and JEOL instruments, which applies a quantitative correction to the emitted X-ray intensities during the iteration of the sample matrix correction based on a zero level (or known trace) abundance calibration standard. This iterated blank correction, when combined with improved background fit models, and an "aggregate" intensity calculation utilizing multiple spectrometer intensities in software for greater geometric efficiency, yields a detection limit of 2 to 3 ppm for Ti and 6 to 7 ppm for Al in quartz at 99% t-test confidence with similar levels for absolute accuracy.

  8. Improved automated lumen contour detection by novel multifrequency processing algorithm with current intravascular ultrasound system.

    PubMed

    Kume, Teruyoshi; Kim, Byeong-Keuk; Waseda, Katsuhisa; Sathyanarayana, Shashidhar; Li, Wenguang; Teo, Tat-Jin; Yock, Paul G; Fitzgerald, Peter J; Honda, Yasuhiro

    2013-02-01

    The aim of this study was to evaluate a new fully automated lumen border tracing system based on a novel multifrequency processing algorithm. We developed the multifrequency processing method to enhance arterial lumen detection by exploiting the differential scattering characteristics of blood and arterial tissue. The implementation of the method can be integrated into current intravascular ultrasound (IVUS) hardware. This study was performed in vivo with conventional 40-MHz IVUS catheters (Atlantis SR Pro™, Boston Scientific Corp, Natick, MA) in 43 clinical patients with coronary artery disease. A total of 522 frames were randomly selected, and lumen areas were measured after automatically tracing lumen borders with the new tracing system and a commercially available tracing system (TraceAssist™) referred to as the "conventional tracing system." The data assessed by the two automated systems were compared with the results of manual tracings by experienced IVUS analysts. New automated lumen measurements showed better agreement with manual lumen area tracings compared with those of the conventional tracing system (correlation coefficient: 0.819 vs. 0.509). When compared against manual tracings, the new algorithm also demonstrated improved systematic error (mean difference: 0.13 vs. -1.02 mm²) and random variability (standard deviation of difference: 2.21 vs. 4.02 mm²) compared with the conventional tracing system. This preliminary study showed that the novel fully automated tracing system based on the multifrequency processing algorithm can provide more accurate lumen border detection than current automated tracing systems and thus offer a more reliable quantitative evaluation of lumen geometry.

  9. Majorization as a Tool for Optimizing a Class of Matrix Functions.

    ERIC Educational Resources Information Center

    Kiers, Henk A.

    1990-01-01

    General algorithms are presented that can be used for optimizing matrix trace functions subject to certain constraints on the parameters. The parameter set that minimizes the majorizing function also decreases the matrix trace function, providing a monotonically convergent algorithm for minimizing the matrix trace function iteratively. (SLD)

  10. The Alba ray tracing code: ART

    NASA Astrophysics Data System (ADS)

    Nicolas, Josep; Barla, Alessandro; Juanhuix, Jordi

    2013-09-01

    The Alba ray tracing code (ART) is a suite of Matlab functions and tools for the ray tracing simulation of x-ray beamlines. The code is structured in different layers, which allow its usage as part of optimization routines as well as an easy control from a graphical user interface. Additional tools for slope error handling and for grating efficiency calculations are also included. Generic characteristics of ART include the accumulation of rays to improve statistics without memory limitations, and still providing normalized values of flux and resolution in physically meaningful units.

  11. A novel calibration method of focused light field camera for 3-D reconstruction of flame temperature

    NASA Astrophysics Data System (ADS)

    Sun, Jun; Hossain, Md. Moinul; Xu, Chuan-Long; Zhang, Biao; Wang, Shi-Min

    2017-05-01

    This paper presents a novel geometric calibration method for a focused light field camera to trace the rays of flame radiance and to reconstruct the three-dimensional (3-D) temperature distribution of a flame. A calibration model is developed to calculate the corner points and their projections for the focused light field camera. The matching of main-lens and microlens f-numbers is used as an additional constraint for the calibration. Geometric parameters of the focused light field camera are then obtained using the Levenberg-Marquardt algorithm. Totally focused images, in which all points are in focus, are utilized to validate the proposed calibration method. Calibration results are presented and discussed in detail. The maximum mean relative error of the calibration is found to be less than 0.13%, indicating that the proposed method is capable of calibrating the focused light field camera successfully. The parameters obtained by the calibration are then utilized to trace the rays of flame radiance. A least-squares QR factorization algorithm with Planck's radiation law is used to reconstruct the 3-D temperature distribution of a flame. Experiments were carried out on an ethylene-air-fired combustion test rig to reconstruct the temperature distribution of flames. The flame temperature obtained by the proposed method was then compared with that obtained using a high-precision thermocouple; the difference between the two measurements was no greater than 6.7%. Experimental results demonstrated that the proposed calibration method and the applied measurement technique perform well in the reconstruction of the flame temperature.

  12. Eta Carinae: X-ray Line Variations during the 2003 X-ray Minimum, and the Orbit Orientation

    NASA Technical Reports Server (NTRS)

    Corcoran, M. F.; Henley, D.; Hamaguchi, K.; Khibashi, K.; Pittard, J. M.; Stevens, I. R.; Gull, T. R.

    2007-01-01

    The future evolution of Eta Carinae will be as a supernova (or hypernova) and black hole. The evolution is highly contingent on mass and angular momentum changes and instabilities. The presence of a companion can serve to trigger instabilities and provide pathways for mass and angular momentum exchange and loss. X-rays can be used as a key diagnostic tool: x-ray temperatures trace pre-shock wind velocities, periodic x-ray variability traces the orbit, and x-ray line variations trace the flow and orientation of shocked gas. This brief presentation highlights x-ray line variations from the HETG and presents a model of the colliding wind flow.

  13. Development and validation of real-time simulation of X-ray imaging with respiratory motion.

    PubMed

    Vidal, Franck P; Villard, Pierre-Frédéric

    2016-04-01

    We present a framework that combines evolutionary optimisation, soft tissue modelling and ray tracing on the GPU to simultaneously compute the respiratory motion and X-ray imaging in real time. Our aim is to provide validated building blocks with high fidelity to closely match both the human physiology and the physics of X-rays. A CPU-based set of algorithms is presented to model organ behaviour during respiration. Soft tissue deformation is computed with an extension of the ChainMail method. Rigid elements move according to kinematic laws. A GPU-based surface rendering method is proposed to compute the X-ray image using the Beer-Lambert law, and is provided as an open-source library. A quantitative validation study is provided to objectively assess the accuracy of both components: (i) the respiration against anatomical data, and (ii) the X-ray against the Beer-Lambert law and the results of Monte Carlo simulations. Our implementation can be used in various applications, such as interactive medical virtual environments to train percutaneous transhepatic cholangiography in interventional radiology, 2D/3D registration, computation of digitally reconstructed radiographs, and simulation of 4D sinograms to test tomography reconstruction tools.
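
    The attenuation model at the heart of such simulators is the Beer-Lambert law: intensity decays exponentially with the line integral of the linear attenuation coefficient along each ray. A minimal sketch over a voxelized mu volume; the phantom values and the uniform step length are invented for illustration.

    ```python
    import numpy as np

    def xray_transmission(mu_volume, ray_voxels, step_mm):
        """Beer-Lambert law along one ray: I/I0 = exp(-sum(mu_i * dl)).
        mu_volume holds linear attenuation coefficients (1/mm); ray_voxels
        are the (i, j, k) indices the ray crosses, each traversed for step_mm."""
        mu_path = np.array([mu_volume[v] for v in ray_voxels])
        return np.exp(-np.sum(mu_path * step_mm))

    mu = np.zeros((64, 64, 64))
    mu[20:40, 20:40, 20:40] = 0.02           # a 20 mm soft-tissue-like block
    ray = [(i, 30, 30) for i in range(64)]   # straight ray through the block
    print(xray_transmission(mu, ray, step_mm=1.0))   # exp(-0.4) ~ 0.67
    ```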

  14. Light ray tracing through a leaf cross section

    NASA Technical Reports Server (NTRS)

    Kumar, R.; Silva, L. F.

    1973-01-01

    A light ray, incident at about 5 deg to the normal, is geometrically plotted through the drawing of the cross section of a soybean leaf using Fresnel's equations and Snell's law. The optical mediums of the leaf considered for ray tracing are: air, cell sap, chloroplast, and cell wall. The ray is also drawn through the same leaf cross section with cell wall and air as the only optical mediums. The values of the reflection and transmission found from the ray tracing tests agree closely with the experimental results obtained using a Beckman Dk-2A Spectroreflector.
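
    The per-interface computation behind such a trace is Snell's law for the refracted direction plus the Fresnel equations for the reflected fraction. A compact sketch for a single interface; the 1.47 cell-wall index is a commonly assumed value, not taken from the paper.

    ```python
    import math

    def interface(n1, n2, theta_i_deg):
        """Snell's law plus unpolarized Fresnel reflectance at one interface,
        the computation repeated at every medium boundary along the ray."""
        ti = math.radians(theta_i_deg)
        s = n1 / n2 * math.sin(ti)
        if s > 1.0:
            return None, 1.0              # total internal reflection
        tt = math.asin(s)
        rs = ((n1 * math.cos(ti) - n2 * math.cos(tt)) /
              (n1 * math.cos(ti) + n2 * math.cos(tt))) ** 2
        rp = ((n1 * math.cos(tt) - n2 * math.cos(ti)) /
              (n1 * math.cos(tt) + n2 * math.cos(ti))) ** 2
        return math.degrees(tt), 0.5 * (rs + rp)

    # Air into a cell wall; ~5 deg incidence as in the described trace.
    print(interface(1.0, 1.47, 5.0))      # refracted ~3.4 deg, ~3.6% reflected
    ```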

  15. An Effective Algorithm Research of Scenario Voxelization Organization and Occlusion Culling

    NASA Astrophysics Data System (ADS)

    Lai, Guangling; Ding, Lu; Qin, Zhiyuan; Tong, Xiaochong

    2016-11-01

    Compared with traditional triangulation approaches, voxelized point cloud data can reduce both the scene sensitivity and the computational complexity. Scene organization can be accomplished on top of the point cloud data using fine voxels, but this increases memory consumption, so an effective voxel representation method is necessary; at present, voxel visualization algorithms have received little specific study. This paper improves the ray tracing algorithm by exploiting the characteristics of the voxel structure. First, the extent of the point cloud data determines the range of pixels on the screen. Then, the light vector emanating from each pixel is calculated. Finally, the rules of the voxel structure are used to find all voxels penetrated by each ray: the voxel closest to the viewpoint is marked visible, and the rest are considered occluded. Experiments showed that the method can efficiently realize voxelization organization and voxel occlusion culling of the scene, and that it increases rendering efficiency.
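
    The step "find all voxels penetrated by each ray, nearest first" is classically done with a 3-D digital differential analyzer in the style of Amanatides and Woo. A minimal sketch, assuming unit-sized voxels and no zero direction components (both simplifications):

    ```python
    def traverse_voxels(origin, direction, grid_size):
        """March a ray through a unit voxel grid (3-D DDA), yielding voxel
        indices in the order the ray pierces them. The first non-empty voxel
        per screen ray is visible; everything behind it is culled.
        Assumes all direction components are nonzero."""
        pos = [int(c) for c in origin]
        step = [1 if d > 0 else -1 for d in direction]
        t_max = [((pos[k] + (step[k] > 0)) - origin[k]) / direction[k]
                 for k in range(3)]
        t_delta = [abs(1.0 / d) for d in direction]
        voxels = []
        while all(0 <= pos[k] < grid_size[k] for k in range(3)):
            voxels.append(tuple(pos))
            axis = t_max.index(min(t_max))    # cross the nearest voxel boundary
            pos[axis] += step[axis]
            t_max[axis] += t_delta[axis]
        return voxels

    print(traverse_voxels((0.5, 0.5, 0.5), (1.0, 0.4, 0.2), (8, 8, 8)))
    ```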

  16. Robust approximation of image illumination direction in a segmentation-based crater detection algorithm for spacecraft navigation

    NASA Astrophysics Data System (ADS)

    Maass, Bolko

    2016-12-01

    This paper describes an efficient and easily implemented algorithmic approach to extracting an approximation to an image's dominant projected illumination direction, based on intermediary results from a segmentation-based crater detection algorithm (CDA), at a computational cost that is negligible in comparison to that of the prior stages of the CDA. Most contemporary CDAs built for spacecraft navigation use this illumination direction as a means of improving performance or even require it to function at all. Deducing the illumination vector from the image alone reduces the reliance on external information such as the accurate knowledge of the spacecraft inertial state, accurate time base and solar system ephemerides. Therefore, a method such as the one described in this paper is a prerequisite for true "Lost in Space" operation of a purely segmentation-based crater detecting and matching method for spacecraft navigation. The proposed method is verified using ray-traced lunar elevation model data, asteroid image data, and in a laboratory setting with a camera in the loop.

  17. A Kirchhoff approach to seismic modeling and prestack depth migration

    NASA Astrophysics Data System (ADS)

    Liu, Zhen-Yue

    1993-05-01

    The Kirchhoff integral provides a robust method for implementing seismic modeling and prestack depth migration, which can handle lateral velocity variation and turning waves. With a little extra computational cost, Kirchhoff-type migration can obtain multiple outputs that have the same phase but different amplitudes, compared with other migration methods; the ratio of these amplitudes is helpful in computing quantities such as the reflection angle. I develop a seismic modeling and prestack depth migration method based on the Kirchhoff integral that handles both laterally variant velocity and dips beyond 90 degrees. The method uses a finite-difference algorithm to calculate travel times and WKBJ amplitudes for the Kirchhoff integral. Compared to ray-tracing algorithms, the finite-difference algorithm gives an efficient implementation and single-valued quantities (first arrivals) on output. In my finite-difference algorithm, an upwind scheme is used to calculate travel times, and the Crank-Nicolson scheme is used to calculate amplitudes; moreover, interpolation is applied to save computation. The modeling and migration algorithms require a smooth velocity function, so I develop a velocity-smoothing technique based on damped least-squares to aid in obtaining a successful migration.
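
    Finite-difference travel-time computation of this kind solves the eikonal equation |grad T| = s(x), with s the slowness, using upwind differences. A compact sketch using Gauss-Seidel sweeps in alternating orders follows (a standard fast-sweeping variant, shown here as a generic stand-in for the paper's particular upwind scheme):

    ```python
    import numpy as np

    def sweep_eikonal(slowness, src, h=1.0, n_passes=4):
        """First-arrival travel times from |grad T| = s via first-order upwind
        updates and Gauss-Seidel sweeps in alternating orders."""
        ny, nx = slowness.shape
        T = np.full((ny, nx), np.inf)
        T[src] = 0.0
        for _ in range(n_passes):
            for yr in (range(ny), range(ny - 1, -1, -1)):
                for xr in (range(nx), range(nx - 1, -1, -1)):
                    for i in yr:
                        for j in xr:
                            a = min(T[i - 1, j] if i > 0 else np.inf,
                                    T[i + 1, j] if i < ny - 1 else np.inf)
                            b = min(T[i, j - 1] if j > 0 else np.inf,
                                    T[i, j + 1] if j < nx - 1 else np.inf)
                            if np.isinf(a) and np.isinf(b):
                                continue                  # no upwind info yet
                            f = slowness[i, j] * h
                            if abs(a - b) >= f:           # one-sided update
                                t_new = min(a, b) + f
                            else:                         # two-sided update
                                t_new = 0.5 * (a + b + np.sqrt(2 * f * f - (a - b) ** 2))
                            T[i, j] = min(T[i, j], t_new)
        return T

    s = np.full((50, 50), 0.5)               # uniform slowness (time per cell)
    print(sweep_eikonal(s, (0, 0))[-1, -1])  # ~ 0.5 * straight-ray distance
    ```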

  18. The Alvarez and Lohmann refractive lenses revisited.

    PubMed

    Barbero, Sergio

    2009-05-25

    Alvarez and Lohmann lenses are variable-focus optical devices based on lateral shifts of two lenses with cubic-type surfaces. I analyzed the optical performance of these types of lenses, computing the first-order optical properties (applying wavefront refraction and propagation) without the restriction of the thin-lens approximation, and the spot diagram using a ray-tracing algorithm. I propose an analytic and numerical method to select the optimal coefficients and the specific configuration of these lenses. The results show that the Lohmann composite lens is slightly superior to the Alvarez one because its overall thickness and optical aberrations are smaller.
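
    The first-order behaviour is easy to verify analytically: with element thickness t(x, y) = A(xy^2 + x^3/3), laterally shifting the complementary pair by +/-d makes the cubic terms cancel and leaves a quadratic, lens-like term, so the generated power grows linearly with the shift, P = 4A(n - 1)d in the thin-element approximation (the sign follows the shift direction). A worked sketch with illustrative values of A and n:

    ```python
    # Thickness of one cubic element: t(x, y) = A * (x*y**2 + x**3 / 3).
    # Shifting the complementary pair by +/-d in x leaves a combined thickness
    # proportional to (x**2 + y**2): a lens whose power is linear in d.

    def alvarez_power(A, d, n=1.5):
        """Paraxial power (diopters for A in 1/m^2 and d in m) generated by a
        lateral shift d, in the thin-element approximation."""
        return 4.0 * A * (n - 1.0) * d

    for d_mm in (0.0, 0.5, 1.0, 2.0):        # A = 500 1/m^2 is illustrative
        print(f"d = {d_mm} mm -> P = {alvarez_power(500.0, d_mm * 1e-3):.2f} D")
    ```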

  19. A Comparison of 3D3C Velocity Measurement Techniques

    NASA Astrophysics Data System (ADS)

    La Foy, Roderick; Vlachos, Pavlos

    2013-11-01

    The velocity measurement fidelity of several 3D3C PIV measurement techniques including tomographic PIV, synthetic aperture PIV, plenoptic PIV, defocusing PIV, and 3D PTV are compared in simulations. A physically realistic ray-tracing algorithm is used to generate synthetic images of a standard calibration grid and of illuminated particle fields advected by homogeneous isotropic turbulence. The simulated images for the tomographic, synthetic aperture, and plenoptic PIV cases are then used to create three-dimensional reconstructions upon which cross-correlations are performed to yield the measured velocity field. Particle tracking algorithms are applied to the images for the defocusing PIV and 3D PTV to directly yield the three-dimensional velocity field. In all cases the measured velocity fields are compared to one-another and to the true velocity field using several metrics.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liang, Bin; Li, Yongbao; Liu, Bo

    Purpose: The CyberKnife system was initially equipped with fixed circular cones for stereotactic radiosurgery. Two dose calculation algorithms, Ray-Tracing and Monte Carlo, are available in the supplied treatment planning system. A multileaf collimator was recently introduced in the latest generation of the system, making arbitrarily shaped treatment fields possible. The purpose of this study is to develop a model-based dose calculation algorithm to better handle the lateral scatter in an irregularly shaped small field for the CyberKnife system. Methods: A pencil beam dose calculation algorithm widely used in linac-based treatment planning systems was modified. The kernel parameters and intensity profile were systematically determined by fitting to the commissioning data. The model was tuned using only a subset of measured data (4 out of 12 cones) and applied to all fixed circular cones for evaluation. The root mean square (RMS) of the difference between the measured and calculated tissue-phantom ratios (TPRs) and off-center ratios (OCRs) was compared. Three cone-size correction techniques were developed to better fit the OCRs at the penumbra region, and these were further evaluated against the output factors (OFs). The pencil beam model was then validated against measurement data on the variable dodecagon-shaped Iris collimators and a half-beam-blocked field. Comparison with the Ray-Tracing and Monte Carlo methods was also performed on a lung SBRT case. Results: The RMS difference between the measured and calculated TPRs is 0.7% averaged over all cones, with the descending region at 0.5%. The RMS differences of the OCR at the infield and outfield regions are both 0.5%. The distance to agreement (DTA) at the OCR penumbra region is 0.2 mm. All three cone-size correction models achieve the same improvement in OCR agreement, with the effective source shift model (SSM) preferred due to its more accurate prediction of OF variation with source-to-axis distance (SAD). In the noncircular-field validation, the pencil beam results agreed well with film measurements of both the Iris collimators and the half-beam-blocked field, and fared much better than the Ray-Tracing calculation. Conclusions: The authors have developed a pencil beam dose calculation model for the CyberKnife system. The dose calculation accuracy is better than that of the standard linac-based system because the model parameters were specifically tuned to the CyberKnife system and its geometry correction factors. The model better handles lateral scatter and has the potential to be used for irregularly shaped fields. Comprehensive validation on MLC-equipped systems is necessary for its clinical implementation. It is also fast enough to be used during plan optimization.

  1. Comparing TID simulations using 3-D ray tracing and mirror reflection

    NASA Astrophysics Data System (ADS)

    Huang, X.; Reinisch, B. W.; Sales, G. S.; Paznukhov, V. V.; Galkin, I. A.

    2016-04-01

    Measuring the time variations of Doppler frequencies and angles of arrival (AoA) of ionospherically reflected HF waves has been proposed as a means of detecting the occurrence of traveling ionospheric disturbances (TIDs). Simulations are made using ray tracing through the International Reference Ionosphere (IRI) electron density model in an effort to reproduce measured signatures. The TID is represented by a wavelike perturbation of the 3-D electron density traveling horizontally in the ionosphere with an amplitude that varies sinusoidally with time. By judiciously selecting the TID parameters, the ray tracing simulation reproduces the observed Doppler frequencies and AoAs. Ray tracing in a realistic 3-D ionosphere is, however, excessively time-consuming given the homing procedures involved. It is shown that a carefully selected corrugated reflecting mirror can reproduce the time variations of the AoA and Doppler frequency. The results from the ray tracing through the IRI model ionosphere and the mirror-model reflections are compared to assess the applicability of the mirror-reflection model.

  2. Neurient: An Algorithm for Automatic Tracing of Confluent Neuronal Images to Determine Alignment

    PubMed Central

    Mitchel, J.A.; Martin, I.S.

    2013-01-01

    A goal of neural tissue engineering is the development and evaluation of materials that guide neuronal growth and alignment. However, the methods available to quantitatively evaluate the response of neurons to guidance materials are limited and/or expensive, and may require manual tracing to be performed by the researcher. We have developed an open source, automated Matlab-based algorithm, building on previously published methods, to trace and quantify alignment of fluorescent images of neurons in culture. The algorithm is divided into three phases: computation of a lookup table containing directional information for each image, location of a set of seed points that may lie along neurite centerlines, and tracing of neurites starting from each seed point by indexing into the lookup table. This method was used to obtain quantitative alignment data for complex images of densely cultured neurons. Complete automation of tracing allows for unsupervised processing of large numbers of images. Following image processing with our algorithm, available metrics to quantify neurite alignment include angular histograms, percent of neurite segments in a given direction, and mean neurite angle. The alignment information obtained from traced images can be used to compare the response of neurons to a range of conditions. This tracing algorithm is freely available to the scientific community under the name Neurient, and its implementation in Matlab allows a wide range of researchers to use a standardized, open source method to quantitatively evaluate the alignment of dense neuronal cultures. PMID:23384629

  3. SU-F-T-555: Accurate Stereotactic Cone TMRs Converted from PDDs Scanned with Ray Trace

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, H; Zhong, H; Qin, Y

    Purpose: To investigate whether the accuracy of TMRs for stereotactic cones converted from PDDs scanned with Ray Trace can be improved, compared against TMRs converted from traditional PDDs. Methods: The Ray Trace measurement mode of the Sun Nuclear 3D Scanner is designed for accurate scans of small-field PDDs. The system detects the center of the field at two depths, for example at 3 and 20 cm in our study, and then scans along the line passing through the two centers. With both Ray Trace and the traditional method, PDDs for conical cones of 4, 5, 7.5, 10, 12.5, 15, and 17.5 mm diameter (jaws set to 5×5 cm) were obtained for 6X FFF and 10X FFF energies on a Varian Edge linac, using Edge detectors. The formalism for converting PDD to TMR given in Khan's book (4th Edition, p. 161) was applied. Sp values at dmax were obtained by measuring cone Scp and Sc. Continuous direct measurement of TMR, by filling/draining water to/from the tank, and spot measurement, by moving the tank and detector, were also performed with the same equipment, using 100 cm SDD. Results: For the 6X FFF energy and all cones, TMRs converted from Ray Trace PDDs were very close to the continuous and spot measurements, while TMRs converted from traditional PDDs had larger deviations. Along the central axis beyond dmax, 1.7% of the TMR data points calculated from Ray Trace had more than 3% deviation from measurement, with a maximal deviation of 5.2%. In contrast, 34% of the TMR points calculated from traditional PDDs had more than 3% deviation, with a maximum of 5.7%. In this initial study, Ray Trace scans for the 10X FFF beam were noisy; further measurement is warranted. Conclusion: Ray Trace scanning can improve the accuracy of measured PDDs and of the calculated TMRs for stereotactic cones, which were within 3% of the measured TMRs.
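
    The PDD-to-TMR conversion applied above combines the depth-dose ratio with an inverse-square correction and a phantom-scatter ratio. A minimal sketch (our paraphrase of the standard textbook relation, not the authors' code; the default SSD f = 100 cm and dmax = 1.5 cm are illustrative, and the Sp ratio is left to the caller, e.g., derived from the measured cone Scp and Sc):

    ```python
    import numpy as np

    def pdd_to_tmr(pdd, d, f=100.0, d_max=1.5, sp_ratio=1.0):
        """Convert percent depth dose PDD(d) (measured at SSD f, depths in cm)
        to TMR, following the standard PDD-to-TMR relation. `sp_ratio` is the
        phantom-scatter correction Sp(r_dmax)/Sp(r_d), measured separately;
        set to 1.0 if unknown (a common approximation for very small cones)."""
        pdd = np.asarray(pdd, dtype=float)
        d = np.asarray(d, dtype=float)
        inverse_square = ((f + d) / (f + d_max)) ** 2   # SSD -> SAD geometry
        return (pdd / 100.0) * inverse_square * sp_ratio
    ```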

  4. Effects of a descending lithospheric slab on yield estimates of underground nuclear tests. Final technical report, 8 Mar 88-31 Aug 90

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cormier, V.F.; Kim, W.; Mandal, B.

    A method for computing seismic wavefields in a high-frequency approximation is proposed, based on the integration of the kinematic ray tracing equations and a new set of differential equations for the dynamic properties of the wavefront, which the authors call the vicinity ray tracing (VRT) equations. These equations are obtained directly from the Hamiltonian in ray-centered coordinates, using no paraxial approximations. The system is comparable to the standard dynamic ray tracing (DRT) system, but it is specified by fewer equations (four versus eight in 3-D) and only requires the specification of velocity and its first spatial derivative along a ray. The VRT equations describe the trajectory of a ray in the ray-centered coordinates of a reference ray. Quantities obtained from vicinity ray tracing can be used to determine wavefront curvature, geometric spreading, travel time to a receiver near the reference ray, and the KMAH index of the reference ray with greater numerical precision than is possible by differencing kinematically traced rays. Since second spatial derivatives of velocity are not required by the new technique, parameterization of the medium is simplified, and reflection and transmission of beams can be calculated by applying Snell's law to both vicinity and central rays. Conversion relations between VRT and DRT can be used to determine the paraxial vicinity of DRT, in which the errors of the paraxial approximations of DRT remain small. Because no paraxial approximations are made, the superposition of Gaussian beams defined from the vicinity rays should exhibit a much slower breakdown in accuracy as the scale length of the medium, given by v/Δv, approaches the beamwidth.

  5. High-efficiency photorealistic computer-generated holograms based on the backward ray-tracing technique

    NASA Astrophysics Data System (ADS)

    Wang, Yuan; Chen, Zhidong; Sang, Xinzhu; Li, Hui; Zhao, Linmin

    2018-03-01

    Holographic displays can provide the complete optical wave field of a three-dimensional (3D) scene, including depth perception. However, traditional computer-generated holograms (CGHs) often take a long time to compute, even without complex, photorealistic rendering. The backward ray-tracing technique is able to render photorealistic, high-quality images and noticeably reduces computation time thanks to its high degree of parallelism. Here, a high-efficiency photorealistic computer-generated hologram method is presented based on the backward ray-tracing technique. Rays are launched and traced in parallel under different illuminations and circumstances. Experimental results demonstrate the effectiveness of the proposed method. Compared with the traditional point-cloud CGH, the computation time is decreased to 24 s to reconstruct a 3D object of 100 × 100 rays with continuous depth change.

  6. A method to generate soft shadows using a layered depth image and warping.

    PubMed

    Im, Yeon-Ho; Han, Chang-Young; Kim, Lee-Sup

    2005-01-01

    We present an image-based method for propagating area light illumination through a Layered Depth Image (LDI) to generate soft shadows from opaque and nonrefractive transparent objects. In our approach, using the depth peeling technique, we render an LDI from a reference light sample on a planar light source. The light illumination of all pixels in the LDI is then determined for all the other sample points via warping, an image-based rendering technique that approximates ray tracing in our method. We use an image-warping equation and McMillan's warp ordering algorithm to find the intersections between rays and polygons and the order of those intersections. Experiments on opaque and nonrefractive transparent objects are presented. The results indicate that our approach generates soft shadows quickly and effectively. Advantages and disadvantages of the proposed method are also discussed.

  7. Low-dose x-ray tomography through a deep convolutional neural network

    DOE PAGES

    Yang, Xiaogang; De Andrade, Vincent; Scullin, William; ...

    2018-02-07

    Synchrotron-based X-ray tomography offers the potential of rapid large-scale reconstructions of the interiors of materials and biological tissue at fine resolution. However, for radiation-sensitive samples, there remain fundamental trade-offs between damaging samples during longer acquisition times and reducing signals with shorter acquisition times. We present a deep convolutional neural network (CNN) method that increases the acquired X-ray tomographic signal by at least a factor of 10 during low-dose fast acquisition by improving the quality of recorded projections. Short-exposure projections enhanced with the CNN show signal-to-noise ratios similar to those of long-exposure projections, and much lower noise and more structural information than low-dose fast acquisition without the CNN. We optimized this approach using simulated samples and further validated it on experimental nano-computed tomography data of radiation-sensitive mouse brains acquired with a transmission X-ray microscope. We demonstrate that automated algorithms can reliably trace brain structures in low-dose datasets enhanced with the CNN. This method can be applied to other tomographic or scanning-based X-ray imaging techniques and has great potential for studying faster dynamics in specimens.

  8. Low-dose x-ray tomography through a deep convolutional neural network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Xiaogang; De Andrade, Vincent; Scullin, William

    Synchrotron-based X-ray tomography offers the potential of rapid large-scale reconstructions of the interiors of materials and biological tissue at fine resolution. However, for radiation-sensitive samples, there remain fundamental trade-offs between damaging samples during longer acquisition times and reducing signals with shorter acquisition times. We present a deep convolutional neural network (CNN) method that increases the acquired X-ray tomographic signal by at least a factor of 10 during low-dose fast acquisition by improving the quality of recorded projections. Short-exposure projections enhanced with the CNN show signal-to-noise ratios similar to those of long-exposure projections, and much lower noise and more structural information than low-dose fast acquisition without the CNN. We optimized this approach using simulated samples and further validated it on experimental nano-computed tomography data of radiation-sensitive mouse brains acquired with a transmission X-ray microscope. We demonstrate that automated algorithms can reliably trace brain structures in low-dose datasets enhanced with the CNN. This method can be applied to other tomographic or scanning-based X-ray imaging techniques and has great potential for studying faster dynamics in specimens.

  9. Plenoptic camera image simulation for reconstruction algorithm verification

    NASA Astrophysics Data System (ADS)

    Schwiegerling, Jim

    2014-09-01

    Plenoptic cameras have emerged in recent years as a technology for capturing light field data in a single snapshot. A conventional digital camera can be modified with the addition of a lenslet array to create a plenoptic camera. Two distinct camera forms have been proposed in the literature. The first has the camera image focused onto the lenslet array; the lenslet array is placed over the camera sensor such that each lenslet forms an image of the exit pupil onto the sensor. The second plenoptic form has the lenslet array relaying the image formed by the camera lens to the sensor. We have developed a ray-tracing package that can simulate images formed by a generalized version of the plenoptic camera. Several rays from each sensor pixel are traced backwards through the system to define a cone of rays emanating from the entrance pupil of the camera lens. Objects that lie within this cone are integrated to yield a color and exposure level for that pixel. To speed processing, three-dimensional objects are approximated as a series of planes at different depths. Repeating this process for each pixel in the sensor leads to a simulated plenoptic image on which different reconstruction algorithms can be tested.

  10. A trace map comparison algorithm for the discrete fracture network models of rock masses

    NASA Astrophysics Data System (ADS)

    Han, Shuai; Wang, Gang; Li, Mingchao

    2018-06-01

    Discrete fracture networks (DFN) are widely used to build refined geological models. However, validating whether a refined model matches reality is a crucial problem, since it determines whether the model can be used for analysis. Current validation methods include numerical validation and graphical validation; the graphical validation, which estimates the similarity between a simulated trace map and the real trace map by visual observation, is subjective. In this paper, an algorithm for the graphical validation of DFN is developed. Four main indicators, including total gray, gray grade curve, characteristic direction, and gray density distribution curve, are presented to assess the similarity between two trace maps. A modified Radon transform and a loop cosine similarity are presented, based on the Radon transform and cosine similarity respectively. In addition, the use of Bézier curves to reduce the edge effect is described. Finally, a case study shows that the new algorithm can effectively distinguish which simulated trace map is more similar to the real trace map.
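
    A minimal sketch of the Radon-plus-cosine-similarity idea (our illustration only: the paper's modified Radon transform and loop cosine similarity are not specified here, so this uses scikit-image's plain Radon transform and reads the "loop" variant as a maximum over circular shifts of the angular signature):

    ```python
    import numpy as np
    from skimage.transform import radon

    def angular_signature(trace_map, angles=np.arange(180)):
        """Radon-transform energy per projection angle; peaks mark the
        characteristic fracture-set directions of a binary trace map."""
        sinogram = radon(trace_map.astype(float), theta=angles, circle=False)
        sig = (sinogram ** 2).sum(axis=0)
        return sig / np.linalg.norm(sig)

    def loop_cosine_similarity(sig_a, sig_b):
        """Cosine similarity maximized over circular shifts of the angular
        signature (our reading of a shift-tolerant 'loop' similarity, which
        respects the 180-degree periodicity of trace directions)."""
        return max(float(np.dot(np.roll(sig_a, k), sig_b))
                   for k in range(len(sig_a)))
    ```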

  11. Integration of airborne LiDAR data and voxel-based ray tracing to determine high-resolution solar radiation dynamics at the forest floor: implications for improving stand-scale distributed snowmelt models

    NASA Astrophysics Data System (ADS)

    Musselman, K. N.; Molotch, N. P.; Margulis, S. A.

    2012-12-01

    Forest architecture dictates sub-canopy solar irradiance, and the resulting patterns can vary seasonally and over short spatial distances. These radiation dynamics are thought to have significant implications for snowmelt processes, regional hydrology, and remote sensing signatures. The variability calls into question many assumptions inherent in traditional canopy models (e.g., Beer's law) when applied at high resolution (i.e., 1 m). We present a method of estimating solar canopy transmissivity using airborne LiDAR data. The canopy structure is represented in 3-D voxel space (i.e., a cubic discretization of a 3-D domain analogous to a pixel representation of a 2-D space). The solar direct beam canopy transmissivity (DBT) is estimated with a ray-tracing algorithm, and the diffuse component is estimated from LiDAR-derived effective LAI. Results from one year at five-minute temporal and 1 m spatial resolution are presented from Sequoia National Park. Compared to estimates from 28 hemispherical photos, the ray-tracing model estimated daily mean DBT with a 10% average error, while the errors from a Beer's-type DBT estimate exceeded 20%. Compared to the ray-tracing estimates, the Beer's-type transmissivity method was unable to resolve complex spatial patterns resulting from canopy gaps, individual tree canopies and boles, and steep variable terrain. The snowmelt model SNOWPACK was applied at locations of ultrasonic snow depth sensors. Two scenarios were tested: (1) a nominal case where canopy model parameters were obtained from hemispherical photographs, and (2) an explicit scenario where the model was modified to accept LiDAR-derived time-variant DBT. The bulk canopy treatment was generally unable to simulate the sub-canopy snowmelt dynamics observed at the depth sensor locations. The explicit treatment reduced the error in the snow disappearance date by one week, and both positive and negative melt-season SWE biases were reduced. The results highlight the utility of LiDAR canopy measurements and physically based snowmelt models for simulating spatially distributed stand- and slope-scale snowmelt dynamics at resolutions necessary to capture the inherent underlying variability. [Figure: LiDAR-derived solar direct beam canopy transmissivity computed as the daily average for March 1st and May 1st.]
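
    The core of a voxel-based DBT estimate is a march along the sun direction accumulating extinction. A minimal sketch (illustrative only, not the paper's ray tracer; it assumes 1 m voxels so positions index the grid directly, and the extinction grid `kappa`, step size, and step count are assumptions):

    ```python
    import numpy as np

    def direct_beam_transmissivity(kappa, origin, sun_dir, step=0.5, n_steps=200):
        """March a ray from `origin` toward the sun through a voxel grid of
        extinction coefficients kappa (1/m; zero for empty voxels) and return
        exp(-integral of kappa ds), a Beer-type transmissivity along the ray."""
        p = np.asarray(origin, float)
        d = np.asarray(sun_dir, float)
        d = d / np.linalg.norm(d)
        tau = 0.0
        for _ in range(n_steps):
            p = p + step * d
            idx = np.floor(p).astype(int)          # 1 m voxels assumed
            if np.any(idx < 0) or np.any(idx >= kappa.shape):
                break                              # ray left the canopy volume
            tau += kappa[tuple(idx)] * step
        return np.exp(-tau)
    ```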

  12. Dual-Energy Contrast-Enhanced Breast Tomosynthesis: Optimization of Beam Quality for Dose and Image Quality

    PubMed Central

    Samei, Ehsan; Saunders, Robert S.

    2014-01-01

    Dual-energy contrast-enhanced breast tomosynthesis is a promising technique to obtain three-dimensional functional information from the breast with high resolution and speed. To optimize this new method, this study searched for the beam quality that maximized image quality in terms of mass detection performance. A digital tomosynthesis system was modeled using a fast ray-tracing algorithm, which created simulated projection images by tracking photons through a voxelized anatomical breast phantom containing iodinated lesions. The single-energy images were combined into dual-energy images through a weighted log subtraction process. The weighting factor was optimized to minimize anatomical noise, while the dose distribution was chosen to minimize quantum noise. The dual-energy images were analyzed for the signal difference to noise ratio (SdNR) of iodinated masses. The fast ray tracing explored 523,776 dual-energy combinations to identify the one yielding the optimum mass SdNR. The ray-tracing results were verified using a Monte Carlo model of a breast tomosynthesis system with a selenium-based flat-panel detector. The projection images of our voxelized breast phantom were obtained at a constant total glandular dose, combined using weighted log subtraction, and reconstructed using commercial reconstruction software. The lesion SdNR was measured in the central reconstructed slice. The SdNR performance varied markedly across the kVp and filtration space. Ray-tracing results indicated that the mass SdNR was maximized with a high-energy tungsten beam at 49 kVp with 92.5 μm of copper filtration and a low-energy tungsten beam at 49 kVp with 95 μm of tin filtration. This result was consistent with the Monte Carlo findings. This mammographic technique led to a mass SdNR of 0.92 ± 0.03 in the projections and 3.68 ± 0.19 in the reconstructed slices, markedly higher than the values for non-optimized techniques. Our findings indicate that dual-energy breast tomosynthesis can be performed optimally at 49 kVp with alternating copper and tin filters, with reconstruction following weighted subtraction. The optimum technique provides the best visibility of iodine against a structured breast background in dual-energy contrast-enhanced breast tomosynthesis. PMID:21908902
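
    The weighted log subtraction at the heart of this method can be sketched in a few lines (our illustration, not the authors' code; here the weight is chosen as the least-squares slope over a user-supplied background region, one common way to minimize anatomical noise):

    ```python
    import numpy as np

    def dual_energy_subtract(im_low, im_high, background_mask):
        """Weighted log subtraction for dual-energy imaging: DE = ln(I_H) - w*ln(I_L).
        The weight w is the least-squares slope of ln(high) against ln(low)
        over background pixels, so the subtraction cancels the (anatomical)
        background as far as possible in the least-squares sense."""
        log_l, log_h = np.log(im_low), np.log(im_high)
        ll, lh = log_l[background_mask], log_h[background_mask]
        w = np.cov(lh, ll, bias=True)[0, 1] / np.var(ll)
        return log_h - w * log_l
    ```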

  13. Optimizing detector geometry for trace element mapping by X-ray fluorescence.

    PubMed

    Sun, Yue; Gleber, Sophie-Charlotte; Jacobsen, Chris; Kirz, Janos; Vogt, Stefan

    2015-05-01

    Trace metals play critical roles in a variety of systems, ranging from cells to photovoltaics. X-Ray Fluorescence (XRF) microscopy using X-ray excitation provides one of the highest sensitivities available for imaging the distribution of trace metals at sub-100 nm resolution. With the growing availability and increasing performance of synchrotron light source based instruments and X-ray nanofocusing optics, and with improvements in energy-dispersive XRF detectors, what are the factors that limit trace element detectability? To address this question, we describe an analytical model for the total signal incident on XRF detectors with various geometries, including the spectral response of energy dispersive detectors. This model agrees well with experimentally recorded X-ray fluorescence spectra, and involves much shorter calculation times than with Monte Carlo simulations. With such a model, one can estimate the signal when a trace element is illuminated with an X-ray beam, and when just the surrounding non-fluorescent material is illuminated. From this signal difference, a contrast parameter can be calculated and this can in turn be used to calculate the signal-to-noise ratio (S/N) for detecting a certain elemental concentration. We apply this model to the detection of trace amounts of zinc in biological materials, and to the detection of small quantities of arsenic in semiconductors. We conclude that increased detector collection solid angle is (nearly) always advantageous even when considering the scattered signal. However, given the choice between a smaller detector at 90° to the beam versus a larger detector at 180° (in a backscatter-like geometry), the 90° detector is better for trace element detection in thick samples, while the larger detector in 180° geometry is better suited to trace element detection in thin samples. Copyright © 2015. Published by Elsevier B.V.

  14. Optimizing detector geometry for trace element mapping by X-ray fluorescence

    PubMed Central

    Sun, Yue; Gleber, Sophie-Charlotte; Jacobsen, Chris; Kirz, Janos; Vogt, Stefan

    2016-01-01

    Trace metals play critical roles in a variety of systems, ranging from cells to photovoltaics. X-Ray Fluorescence (XRF) microscopy using X-ray excitation provides one of the highest sensitivities available for imaging the distribution of trace metals at sub-100 nm resolution. With the growing availability and increasing performance of synchrotron light source based instruments and X-ray nanofocusing optics, and with improvements in energy-dispersive XRF detectors, what are the factors that limit trace element detectability? To address this question, we describe an analytical model for the total signal incident on XRF detectors with various geometries, including the spectral response of energy dispersive detectors. This model agrees well with experimentally recorded X-ray fluorescence spectra, and involves much shorter calculation times than with Monte Carlo simulations. With such a model, one can estimate the signal when a trace element is illuminated with an X-ray beam, and when just the surrounding non-fluorescent material is illuminated. From this signal difference, a contrast parameter can be calculated and this can in turn be used to calculate the signal-to-noise ratio (S/N) for detecting a certain elemental concentration. We apply this model to the detection of trace amounts of zinc in biological materials, and to the detection of small quantities of arsenic in semiconductors. We conclude that increased detector collection solid angle is (nearly) always advantageous even when considering the scattered signal. However, given the choice between a smaller detector at 90° to the beam versus a larger detector at 180° (in a backscatter-like geometry), the 90° detector is better for trace element detection in thick samples, while the larger detector in 180° geometry is better suited to trace element detection in thin samples. PMID:25600825

  15. Optimizing detector geometry for trace element mapping by X-ray fluorescence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Yue; Gleber, Sophie-Charlotte; Jacobsen, Chris

    Trace metals play critical roles in a variety of systems, ranging from cells to photovoltaics. X-Ray Fluorescence (XRF) microscopy using X-ray excitation provides one of the highest sensitivities available for imaging the distribution of trace metals at sub-100 nm resolution. With the growing availability and increasing performance of synchrotron light source based instruments and X-ray nanofocusing optics, and with improvements in energy-dispersive XRF detectors, what are the factors that limit trace element detectability? To address this question, we describe an analytical model for the total signal incident on XRF detectors with various geometries, including the spectral response of energy dispersive detectors. This model agrees well with experimentally recorded X-ray fluorescence spectra, and involves much shorter calculation times than with Monte Carlo simulations. With such a model, one can estimate the signal when a trace element is illuminated with an X-ray beam, and when just the surrounding non-fluorescent material is illuminated. From this signal difference, a contrast parameter can be calculated and this can in turn be used to calculate the signal-to-noise ratio (S/N) for detecting a certain elemental concentration. We apply this model to the detection of trace amounts of zinc in biological materials, and to the detection of small quantities of arsenic in semiconductors. We conclude that increased detector collection solid angle is (nearly) always advantageous even when considering the scattered signal. However, given the choice between a smaller detector at 90° to the beam versus a larger detector at 180° (in a backscatter-like geometry), the 90° detector is better for trace element detection in thick samples, while the larger detector in 180° geometry is better suited to trace element detection in thin samples.

  16. Ray Tracing Methods in Seismic Emission Tomography

    NASA Astrophysics Data System (ADS)

    Chebotareva, I. Ya.

    2018-03-01

    Highly efficient approximate ray tracing techniques that can be used in seismic emission tomography and in other methods requiring a large number of raypaths are described. The techniques are applicable to gradient and plane-layered velocity models of the medium and to models with a complicated geometry of contrasting boundaries. Empirical results obtained with these ray tracing technologies in seismic emission tomography, together with numerical modeling results, are presented.
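
    For the plane-layered case, an approximate ray tracer reduces to applying Snell's law layer by layer at a fixed ray parameter. A minimal sketch (illustrative, not the paper's algorithm; velocities, thicknesses, and units are assumptions):

    ```python
    import numpy as np

    def trace_layered(p, v, dz):
        """Trace a ray with horizontal slowness p (s/km) down through plane
        layers of velocity v (km/s) and thickness dz (km). Snell's law keeps
        p = sin(theta)/v constant across interfaces; returns the horizontal
        offset and travel time, or None past the critical angle."""
        x = t = 0.0
        for vi, dzi in zip(v, dz):
            sin_theta = p * vi
            if sin_theta >= 1.0:               # post-critical: ray turns back
                return None
            cos_theta = np.sqrt(1.0 - sin_theta ** 2)
            x += dzi * sin_theta / cos_theta   # horizontal advance in layer
            t += dzi / (vi * cos_theta)        # path length / velocity
        return x, t
    ```

    Shooting over a fan of ray parameters and interpolating offsets then gives the many raypaths such tomography methods need at low cost.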

  17. Laser Ray Tracing in a Parallel Arbitrary Lagrangian-Eulerian Adaptive Mesh Refinement Hydrocode

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Masters, N D; Kaiser, T B; Anderson, R W

    2009-09-28

    ALE-AMR is a new hydrocode that we are developing as a predictive modeling tool for debris and shrapnel formation in high-energy laser experiments. In this paper we present our approach to implementing laser ray tracing in ALE-AMR. We give the equations of laser ray tracing, describe our approach to efficient traversal of the adaptive mesh hierarchy, in which computational rays are propagated through a virtual composite mesh consisting of the finest-resolution representation of the modeled space, and anticipate simulations that will be compared to experiments for code validation.

  18. Computer-based analysis of holography using ray tracing.

    PubMed

    Latta, J N

    1971-12-01

    The application of a ray-tracing methodology to holography is presented. Emphasis is placed on establishing a very general foundation from which to build a general computer-based implementation; as few restrictions as possible are placed on the recording and reconstruction geometry. The necessary equations are established from the construction and reconstruction parameters of the hologram. The aberrations are defined following H. H. Hopkins, and these aberration specification techniques are compared with those used previously to analyze holography. Two examples illustrate the flexibility of the ray-tracing approach. The first compares the answers from a wavefront-matching analysis and from the ray-tracing analysis in the case of aberration balancing to compensate for chromatic aberrations; the results are very close and establish the basic utility of aberration balancing. Further indicative of the power of ray tracing, a thick-media analysis is included in the computer programs and used to study the effects of hologram emulsion shrinkage and methods for its compensation. Compensating such holograms introduces aberrations, which are considered for both reflection and transmission holograms.

  19. Synthesizing Dynamic Programming Algorithms from Linear Temporal Logic Formulae

    NASA Technical Reports Server (NTRS)

    Rosu, Grigore; Havelund, Klaus

    2001-01-01

    The problem of testing a linear temporal logic (LTL) formula on a finite execution trace of events, generated by an executing program, occurs naturally in runtime analysis of software. We present an algorithm that takes an LTL formula and generates an efficient dynamic programming algorithm. The generated algorithm tests whether the LTL formula is satisfied by a finite trace of events given as input. It runs in time linear in the trace length, with a constant that depends on the size of the LTL formula; the memory needed is constant, also depending on the size of the formula.
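
    The dynamic-programming idea can be sketched directly (our illustration, not the paper's generated code): walk the trace backwards and, at each step, compute every subformula's truth value from the current state and the values at the next step, so only two "rows" of values are ever stored.

    ```python
    def subformulas(f, acc=None):
        """Post-order list of subformulas, so children are evaluated first."""
        if acc is None:
            acc = []
        for child in f[1:]:
            if isinstance(child, tuple):
                subformulas(child, acc)
        acc.append(f)
        return acc

    def evaluate(formula, trace):
        """Evaluate an LTL formula (nested tuples) on a finite trace (a list
        of sets of atomic propositions) in one backward pass: linear time in
        the trace length, memory independent of it."""
        subs = subformulas(formula)
        nxt = {}                                 # subformula values at step i+1
        for i in range(len(trace) - 1, -1, -1):
            last = i == len(trace) - 1
            now = {}
            for f in subs:
                op = f[0]
                if op == 'ap':
                    v = f[1] in trace[i]
                elif op == 'not':
                    v = not now[f[1]]
                elif op == 'and':
                    v = now[f[1]] and now[f[2]]
                elif op == 'next':               # one finite-trace convention:
                    v = (not last) and nxt[f[1]] # 'next' is false at the end
                elif op == 'eventually':
                    v = now[f[1]] or ((not last) and nxt[f])
                elif op == 'always':
                    v = now[f[1]] and (last or nxt[f])
                elif op == 'until':
                    v = now[f[2]] or (now[f[1]] and (not last) and nxt[f])
                now[f] = v
            nxt = now
        return nxt[formula]

    # Example: "p until q" holds on the trace p, p, q.
    assert evaluate(('until', ('ap', 'p'), ('ap', 'q')),
                    [{'p'}, {'p'}, {'q'}])
    ```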

  20. Ambient occlusion - A powerful algorithm to segment shell and skeletal intrapores in computed tomography data

    NASA Astrophysics Data System (ADS)

    Titschack, J.; Baum, D.; Matsuyama, K.; Boos, K.; Färber, C.; Kahl, W.-A.; Ehrig, K.; Meinel, D.; Soriano, C.; Stock, S. R.

    2018-06-01

    During the last decades, X-ray (micro-)computed tomography has gained increasing attention for the description of porous skeletal and shell structures of various organism groups. However, their quantitative analysis is often hampered by the difficulty of discriminating cavities and pores within the object from the surrounding region. Herein, we test the ambient occlusion (AO) algorithm and newly implemented optimisations for the segmentation of cavities (implemented in the software Amira). The segmentation accuracy is evaluated as a function of (i) changes in the ray length input variable and (ii) the use of the AO (scalar) field and other AO-derived (scalar) fields. The results clearly indicate that the AO field itself outperforms all other AO-derived fields in terms of segmentation accuracy and robustness against variations in the ray length input variable. The newly implemented optimisations improved the AO field-based segmentation only slightly, while the segmentations based on the AO-derived fields improved considerably. Additionally, we evaluated the potential of the AO field and AO-derived fields for the separation and classification of cavities as well as skeletal structures by comparing them with commonly used distance-map-based segmentations. For this, we tested the zooid separation within a bryozoan colony, the stereom classification of an ophiuroid tooth, the separation of bioerosion traces within a marble block, and the calice (central cavity)-pore separation within a dendrophyllid coral. The obtained results clearly indicate that the ideal input field depends on the three-dimensional morphology of the object of interest. The segmentations based on the AO-derived fields often provided cavity separations and skeleton classifications that were superior to, or impossible to obtain with, commonly used distance-map-based segmentations. The combined usage of various AO-derived fields by supervised or unsupervised segmentation algorithms might provide a promising target for future research to further improve this kind of high-end data segmentation and classification. Furthermore, the application of the developed segmentation algorithm is not restricted to X-ray (micro-)computed tomographic data but may potentially be useful for the segmentation of 3D volume data from other sources.

  1. Automatic Whistler Detector and Analyzer system: Implementation of the analyzer algorithm

    NASA Astrophysics Data System (ADS)

    Lichtenberger, JáNos; Ferencz, Csaba; Hamar, Daniel; Steinbach, Peter; Rodger, Craig J.; Clilverd, Mark A.; Collier, Andrew B.

    2010-12-01

    The full potential of whistlers for monitoring plasmaspheric electron density variations has not yet been realized. The primary reason is the vast human effort required for the analysis of whistler traces. Recently, the first part of a complete whistler analysis procedure was successfully automated, i.e., the automatic detection of whistler traces in the raw broadband VLF signal. This study describes a new algorithm developed to determine plasmaspheric electron density measurements from whistler traces, based on a Virtual (Whistler) Trace Transformation using a 2-D fast Fourier transform. This algorithm can be automated and can thus form the final step of a complete Automatic Whistler Detector and Analyzer (AWDA) system. In this second AWDA paper, the practical implementation of the Automatic Whistler Analyzer (AWA) algorithm is discussed and a feasible solution is presented. The implementation is able to track the variations of the plasmasphere in quasi-real time on a PC cluster with 100 CPU cores. The electron densities obtained by the AWA method can be used in investigations such as plasmasphere dynamics, ionosphere-plasmasphere coupling, or space weather models.

  2. Automatic Insall-Salvati ratio measurement on lateral knee x-ray images using model-guided landmark localization

    NASA Astrophysics Data System (ADS)

    Chen, Hsin-Chen; Lin, Chii-Jeng; Wu, Chia-Hsing; Wang, Chien-Kuo; Sun, Yung-Nien

    2010-11-01

    The Insall-Salvati ratio (ISR) is important for detecting two common clinical signs of knee disease: patella alta and patella baja. Furthermore, large inter-operator differences in ISR measurement make an objective measurement system necessary for better clinical evaluation. In this paper, we define three specific bony landmarks for determining the ISR and propose an x-ray image analysis system to localize these landmarks and measure the ISR. Because inherent artifacts in x-ray images, such as unevenly distributed intensities, make landmark localization difficult, we propose a registration-assisted active-shape model (RAASM) to localize these landmarks. We first construct a statistical model from a set of training images based on x-ray image intensity and patella shape. Since a knee x-ray image contains specific anatomical structures, we then design an algorithm, based on edge tracing, for patella feature extraction in order to automatically align the model to the patella image. After registration-assisted model fitting, the landmark locations and the ISR can be estimated. Our proposed method successfully overcomes drawbacks caused by x-ray image artifacts. Experimental results show close agreement between the ISRs measured by the proposed method and by orthopedic clinicians.

  3. Trisodium phosphate poisoning

    MedlinePlus

    ... the esophagus and the stomach. Chest x-ray ECG (electrocardiogram, or heart tracing) Fluids by IV (through ... in the airways and lungs. Chest x-ray ECG (electrocardiogram, or heart tracing) Fluids by IV (through ...

  4. The influence of leaf anatomy on the internal light environment and photosynthetic electron transport rate: exploration with a new leaf ray tracing model

    PubMed Central

    Xiao, Yi; Tholen, Danny; Zhu, Xin-Guang

    2016-01-01

    Leaf photosynthesis is determined by biochemical properties and anatomical features. Here we developed a three-dimensional leaf model that can be used to evaluate the internal light environment of a leaf and its implications for whole-leaf electron transport rates (J). This model includes (i) the basic components of a leaf, such as the epidermis, palisade and spongy tissues, as well as the physical dimensions and arrangements of cell walls, vacuoles and chloroplasts; and (ii) an efficient forward ray-tracing algorithm, predicting the internal light environment for light of wavelengths between 400 and 2500 nm. We studied the influence of leaf anatomy and ambient light on internal light conditions and J. The results show that (i) different chloroplasts can experience drastically different light conditions, even when they are located at the same distance from the leaf surface; (ii) bundle sheath extensions, which are strips of parenchyma, collenchyma or sclerenchyma cells connecting the vascular bundles with the epidermis, can influence photosynthetic light-use efficiency of leaves; and (iii) chloroplast positioning can also influence the light-use efficiency of leaves. Mechanisms underlying leaf internal light heterogeneity and implications of the heterogeneity for photoprotection and for the convexity of the light response curves are discussed. PMID:27702991

  5. Comparison of laser ray-tracing and skiascopic ocular wavefront-sensing devices

    PubMed Central

    Bartsch, D-UG; Bessho, K; Gomez, L; Freeman, WR

    2009-01-01

    Purpose To compare two wavefront-sensing devices based on different principles. Methods Thirty-eight healthy eyes of 19 patients were measured five times in the reproducibility study. Twenty eyes of 10 patients were measured in the comparison study. The Tracey Visual Function Analyzer (VFA), based on the ray-tracing principle, and the Nidek optical path difference (OPD)-Scan, based on the dynamic skiascopy principle, were compared. The standard deviation (SD) of root mean square (RMS) errors was compared to verify reproducibility. We evaluated RMS errors, Zernike terms, and conventional refractive indexes (Sph, Cyl, Ax, and spherical equivalent). Results In RMS error readings, both devices showed similar ratios of SD to the mean measurement value (VFA: 57.5±11.7%, OPD-Scan: 53.9±10.9%). Comparison on the same eye showed that almost all terms were significantly greater using the VFA than using the OPD-Scan. However, certain high-spatial-frequency aberrations (tetrafoil, pentafoil, and hexafoil) were consistently measured near zero with the OPD-Scan. Conclusion Both devices showed a similar level of reproducibility; however, there was considerable difference in the wavefront readings between the machines when measuring the same eye. Differences in the number of sample points, centration, and measurement algorithms between the two instruments may explain our results. PMID:17571088

  6. Application of the nudged elastic band method to the point-to-point radio wave ray tracing in IRI modeled ionosphere

    NASA Astrophysics Data System (ADS)

    Nosikov, I. A.; Klimenko, M. V.; Bessarab, P. F.; Zhbankov, G. A.

    2017-07-01

    Point-to-point ray tracing is an important problem in many fields of science. While direct variational methods, in which some trajectory is transformed into an optimal one, are routinely used in calculations of pathways of seismic waves, chemical reactions, diffusion processes, etc., this approach is not widely known in ionospheric point-to-point ray tracing. We apply the Nudged Elastic Band (NEB) method to a radio wave propagation problem. In the NEB method, a chain of points giving a discrete representation of the radio wave ray is adjusted iteratively to an optimal configuration satisfying Fermat's principle, while the endpoints of the trajectory are kept fixed according to the boundary conditions. Transverse displacements define the radio ray trajectory, while springs between the points control their distribution along the ray. The method is applied to a study of point-to-point ionospheric ray tracing, where the propagation medium is obtained with the International Reference Ionosphere model taking into account traveling ionospheric disturbances. A 2-dimensional representation of the optical path functional is developed and used to gain insight into the fundamental difference between high and low rays. We conclude that high and low rays are minima and saddle points of the optical path functional, respectively.
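
    A minimal 2-D sketch of the NEB idea applied to Fermat's principle (our illustration under a smooth refractive index n(x, z); step sizes, spring constant, and point count are illustrative). Plain descent like this converges to minima of the optical path, i.e., the low rays; locating the saddle-point high rays would require a climbing-image-style variant.

    ```python
    import numpy as np

    def path_length(pts, n_func):
        """Discrete optical path: sum of n(midpoint) * segment length."""
        seg = pts[1:] - pts[:-1]
        mid = 0.5 * (pts[1:] + pts[:-1])
        return float(np.sum(n_func(mid[:, 0], mid[:, 1])
                            * np.linalg.norm(seg, axis=1)))

    def neb_ray(n_func, start, end, n_pts=30, step=0.05, iters=500):
        """Relax a chain of points toward a stationary path of the optical
        length functional with fixed endpoints. The gradient is projected
        off the local tangent (the 'nudge'); a spring term along the tangent
        keeps the points evenly distributed along the ray."""
        pts = np.linspace(np.asarray(start, float), np.asarray(end, float), n_pts)
        eps = 1e-6
        for _ in range(iters):
            grad = np.zeros_like(pts)
            for k in (0, 1):                      # numerical gradient of L
                for i in range(1, n_pts - 1):
                    for s in (+1.0, -1.0):
                        p = pts.copy()
                        p[i, k] += s * eps
                        grad[i, k] += s * path_length(p, n_func) / (2 * eps)
            tang = pts[2:] - pts[:-2]             # tangent at interior points
            tang /= np.linalg.norm(tang, axis=1, keepdims=True)
            along = np.einsum('ij,ij->i', grad[1:-1], tang)
            grad[1:-1] -= along[:, None] * tang   # keep transverse force only
            spring = (np.linalg.norm(pts[2:] - pts[1:-1], axis=1)
                      - np.linalg.norm(pts[1:-1] - pts[:-2], axis=1))
            pts[1:-1] -= step * (grad[1:-1] - 0.1 * spring[:, None] * tang)
        return pts
    ```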

  7. AXAF FITS standard for ray trace interchange

    NASA Astrophysics Data System (ADS)

    Hsieh, Paul F.

    1993-07-01

    A standard data format for the archival and transport of x-ray events generated by ray trace models is described. Upon review and acceptance by the Advanced X-ray Astrophysics Facility (AXAF) Software Systems Working Group (SSWG), this standard shall become the official AXAF data format for ray trace events. The Flexible Image Transport System (FITS) is well suited to the purposes of the standard and was selected as its basis. FITS is both flexible and efficient and is widely used within the astronomical community for storage and transfer of data; in addition, software to read and write FITS format files is widely available. In selecting quantities to be included in the ray trace standard, the AXAF Mission Support team, Science Instruments team, and the other contractor teams were surveyed. From the results of this survey, the following requirements were established: (1) for scientific needs, each photon should have associated with it position, direction, energy, and statistical weight, and the standard must also accommodate path length (relative phase) and polarization; (2) a unique photon identifier is necessary for bookkeeping purposes; (3) a log of individuals, organizations, and software packages that have modified the data must be maintained in order to create an audit trail; (4) a mechanism for extensions to the basic kernel should be provided; and (5) the ray trace standard should integrate with future AXAF data product standards.
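
    As an illustration of how such an event list maps onto a FITS binary table, here is a minimal sketch using astropy (the column names, units, extension name, and HISTORY entry are hypothetical, not the AXAF standard's actual layout):

    ```python
    import numpy as np
    from astropy.io import fits

    n = 1000  # number of traced photons

    cols = [
        fits.Column(name='RAY_ID', format='J', array=np.arange(n)),   # unique photon id
        fits.Column(name='POS', format='3D', unit='mm',
                    array=np.zeros((n, 3))),                          # position
        fits.Column(name='DIR', format='3D',
                    array=np.tile([0.0, 0.0, 1.0], (n, 1))),          # direction cosines
        fits.Column(name='ENERGY', format='E', unit='keV',
                    array=np.full(n, 1.5, dtype='f4')),
        fits.Column(name='WEIGHT', format='E',
                    array=np.ones(n, dtype='f4')),                    # statistical weight
    ]
    hdu = fits.BinTableHDU.from_columns(cols)
    hdu.name = 'RAYEVENTS'
    # Requirement (3): record who/what modified the data, as an audit trail.
    hdu.header['HISTORY'] = 'traced by my_raytracer v0.1 (hypothetical)'
    hdu.writeto('rays.fits', overwrite=True)
    ```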

  8. AXAF FITS standard for ray trace interchange

    NASA Technical Reports Server (NTRS)

    Hsieh, Paul F.

    1993-01-01

    A standard data format for the archival and transport of x-ray events generated by ray trace models is described. Upon review and acceptance by the Advanced X-ray Astrophysics Facility (AXAF) Software Systems Working Group (SSWG), this standard shall become the official AXAF data format for ray trace events. The Flexible Image Transport System (FITS) is well suited to the purposes of the standard and was selected as its basis. FITS is both flexible and efficient and is widely used within the astronomical community for storage and transfer of data; in addition, software to read and write FITS format files is widely available. In selecting quantities to be included in the ray trace standard, the AXAF Mission Support team, Science Instruments team, and the other contractor teams were surveyed. From the results of this survey, the following requirements were established: (1) for scientific needs, each photon should have associated with it position, direction, energy, and statistical weight, and the standard must also accommodate path length (relative phase) and polarization; (2) a unique photon identifier is necessary for bookkeeping purposes; (3) a log of individuals, organizations, and software packages that have modified the data must be maintained in order to create an audit trail; (4) a mechanism for extensions to the basic kernel should be provided; and (5) the ray trace standard should integrate with future AXAF data product standards.

  9. Acceleration of Radiance for Lighting Simulation by Using Parallel Computing with OpenCL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zuo, Wangda; McNeil, Andrew; Wetter, Michael

    2011-09-06

    We report on the acceleration of annual daylighting simulations for fenestration systems in the Radiance ray-tracing program. The algorithm was optimized to reduce both redundant data input/output operations and floating-point operations. To further accelerate the simulation, the matrix multiplication was implemented using parallel computing on a graphics processing unit. We used OpenCL, a cross-platform parallel programming language. Numerical experiments show that the combination of these measures can speed up annual daylighting simulations by a factor of 101.7 or 28.6 when the sky vector has 146 or 2306 elements, respectively.
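
    The reason the annual simulation reduces to matrix multiplication, and hence parallelizes so well, can be sketched in a few lines (a numpy stand-in for the GPU kernel; the matrix names and sizes follow the general three-phase-style factorization, not Radiance's internal layout):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_sensor, n_window, n_sky, n_hours = 100, 145, 146, 8760

    V = rng.random((n_sensor, n_window))   # view matrix: window -> sensors
    T = rng.random((n_window, n_window))   # fenestration transmission (BSDF)
    D = rng.random((n_window, n_sky))      # daylight matrix: sky -> window
    S = rng.random((n_sky, n_hours))       # one 146-element sky vector per hour

    VTD = V @ T @ D    # ray-traced coefficients, computed once and reused
    E = VTD @ S        # all 8760 hourly sensor illuminances in one product
    ```

    Offloading the `VTD @ S` product to the GPU is what the OpenCL implementation accelerates: the coefficient matrices are fixed for the year, so the per-timestep work is pure dense linear algebra.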

  10. Effective emissivities of isothermal blackbody cavities calculated by the Monte Carlo method using the three-component bidirectional reflectance distribution function model.

    PubMed

    Prokhorov, Alexander

    2012-05-01

    This paper proposes a three-component bidirectional reflectance distribution function (3C BRDF) model consisting of diffuse, quasi-specular, and glossy components for calculation of effective emissivities of blackbody cavities and then investigates the properties of the new reflection model. The particle swarm optimization method is applied for fitting a 3C BRDF model to measured BRDFs. The model is incorporated into the Monte Carlo ray-tracing algorithm for isothermal cavities. Finally, the paper compares the results obtained using the 3C model and the conventional specular-diffuse model of reflection.
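
    A minimal Monte Carlo sketch of sampling from such a three-component reflection model (our generic stand-in lobes, not the paper's fitted 3C BRDF: a cosine hemisphere for the diffuse part, an ideal mirror for the quasi-specular part, and a Phong-style cos^p lobe for the glossy part; the weights and exponent are illustrative):

    ```python
    import numpy as np

    def sample_reflection(d, n, w_diff, w_spec, w_gloss,
                          gloss_exp=50.0, rng=None):
        """Sample a reflected direction for incident unit direction d at a
        surface with unit normal n: choose a component by its weight, then
        sample that lobe. Grazing glossy samples may dip below the surface;
        production code would reject or clamp those."""
        rng = rng or np.random.default_rng()
        u = rng.random() * (w_diff + w_spec + w_gloss)
        mirror = d - 2.0 * np.dot(d, n) * n        # ideal specular direction
        if u < w_spec:
            return mirror                          # quasi-specular component
        if u < w_spec + w_diff:
            axis, p = n, 1.0                       # diffuse: cosine lobe
        else:
            axis, p = mirror, gloss_exp            # glossy: narrow cos^p lobe
        # Inverse-CDF sampling of cos(theta) for pdf ~ cos^p(theta).
        cos_t = rng.random() ** (1.0 / (p + 1.0))
        sin_t = np.sqrt(1.0 - cos_t ** 2)
        phi = 2.0 * np.pi * rng.random()
        # Orthonormal frame about the lobe axis.
        a = np.array([1.0, 0.0, 0.0]) if abs(axis[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
        t1 = np.cross(axis, a)
        t1 /= np.linalg.norm(t1)
        t2 = np.cross(axis, t1)
        return cos_t * axis + sin_t * (np.cos(phi) * t1 + np.sin(phi) * t2)
    ```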

  11. The Use of Pro/Engineer CAD Software and Fishbowl Tool Kit in Ray-tracing Analysis

    NASA Technical Reports Server (NTRS)

    Nounu, Hatem N.; Kim, Myung-Hee Y.; Ponomarev, Artem L.; Cucinotta, Francis A.

    2009-01-01

    This document is a manual for a user who wants to operate Pro/ENGINEER (ProE) Wildfire 3.0 with the NASA Space Radiation Program's (SRP) custom-designed toolkit, called 'Fishbowl', for ray tracing of complex spacecraft geometries given by a ProE CAD model. The analysis of spacecraft geometry through ray tracing is a vital part of calculating health risks from space radiation. Space radiation poses severe risks of cancer, degenerative diseases, and acute radiation sickness during long-term exploration missions, and shielding optimization is an important component in the application of radiation risk models. Ray tracing is a technique in which 3-dimensional (3D) vehicle geometry is represented as the input for the space radiation transport code and subsequent risk calculations. In ray tracing, a number of rays (on the order of 1000) are used to calculate the equivalent thickness, say of aluminum, of the spacecraft geometry seen from a point of interest called the dose point. The rays originate at the dose point and terminate at a homogeneously distributed set of points lying on a sphere that circumscribes the spacecraft and has its center at the dose point. The distance a ray traverses in each material is converted to an aluminum (or other user-selected) equivalent thickness, and these equivalent thicknesses are summed for each ray. Since each ray points in a direction, the aluminum equivalent of each ray represents the shielding that the geometry provides to the dose point from that particular direction. This manual first lists contact information for help installing ProE and Fishbowl, along with notes on platform support and system requirements. It then shows how to use the software to ray trace a ProE-designed 3D assembly, and it serves as a reference for troubleshooting. The user is assumed to have previous knowledge of ProE and CAD modeling.
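
    The two computations the manual describes, uniform ray directions from the dose point and the per-ray aluminum equivalent, are small enough to sketch (our illustration of the general idea, not Fishbowl's implementation; equal areal density is the usual simple equivalence rule, and the Fibonacci lattice is one way to get near-uniform directions):

    ```python
    import numpy as np

    RHO_AL = 2.70  # density of aluminum, g/cm^3

    def fibonacci_directions(n=1000):
        """Nearly uniform ray directions over the unit sphere (Fibonacci lattice)."""
        i = np.arange(n)
        phi = np.pi * (3.0 - np.sqrt(5.0)) * i        # golden-angle increments
        z = 1.0 - 2.0 * (i + 0.5) / n
        r = np.sqrt(1.0 - z * z)
        return np.stack([r * np.cos(phi), r * np.sin(phi), z], axis=1)

    def aluminum_equivalent(segments):
        """Aluminum-equivalent thickness (cm) of one ray, given (length_cm,
        density_g_cm3) for every material segment the ray crosses: the
        segment areal densities are summed and divided by aluminum's density."""
        return sum(length * rho for length, rho in segments) / RHO_AL
    ```

    Evaluating `aluminum_equivalent` for each of the ~1000 directions yields the directional shielding distribution that feeds the radiation transport code.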

  12. Ray tracing a three-dimensional scene using a hierarchical data structure

    DOEpatents

    Wald, Ingo; Boulos, Solomon; Shirley, Peter

    2012-09-04

    Ray tracing a three-dimensional scene made up of geometric primitives that are spatially partitioned into a hierarchical data structure. One example embodiment is a method for ray tracing a three-dimensional scene made up of geometric primitives that are spatially partitioned into a hierarchical data structure. In this example embodiment, the hierarchical data structure includes at least a parent node and a corresponding plurality of child nodes. The method includes a first act of determining that a first active ray in the packet hits the parent node and a second act of descending to each of the plurality of child nodes.
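
    A minimal sketch of the descend rule the claims describe: test a packet of rays against the parent node's bounding box and descend to every child as soon as one active ray hits (the node attributes and slab test below are generic assumptions, not the patent's implementation):

    ```python
    import numpy as np

    def hit_aabb(origins, inv_dirs, lo, hi):
        """Slab test: per-ray booleans for a ray packet against one box.
        origins/inv_dirs are (n, 3); lo/hi are the box corners, shape (3,)."""
        t0 = (lo - origins) * inv_dirs
        t1 = (hi - origins) * inv_dirs
        tmin = np.minimum(t0, t1).max(axis=1)
        tmax = np.maximum(t0, t1).min(axis=1)
        return (tmin <= tmax) & (tmax >= 0.0)

    def traverse(node, origins, inv_dirs, active, hits):
        """If any active ray in the packet hits the parent node, descend to
        each child node; leaves collect (node, mask) pairs for the caller's
        ray-primitive tests. `node.lo/hi/is_leaf/children` are hypothetical."""
        mask = active & hit_aabb(origins, inv_dirs, node.lo, node.hi)
        if not mask.any():                 # first active hit decides descent
            return
        if node.is_leaf:
            hits.append((node, mask))
            return
        for child in node.children:
            traverse(child, origins, inv_dirs, mask, hits)
    ```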

  13. Refraction corrected calibration for aquatic locomotion research: application of Snell's law improves spatial accuracy.

    PubMed

    Henrion, Sebastian; Spoor, Cees W; Pieters, Remco P M; Müller, Ulrike K; van Leeuwen, Johan L

    2015-07-07

    Images of underwater objects are distorted by refraction at the water-glass-air interfaces, and these distortions can lead to substantial errors when reconstructing the objects' position and shape. So far, aquatic locomotion studies have minimized refraction in their experimental setups and used the direct linear transform (DLT) algorithm to reconstruct position information, which does not model refraction explicitly. Here we present a refraction-corrected ray-tracing algorithm (RCRT) that reconstructs position information using Snell's law. We validated this reconstruction by calculating the 3D reconstruction error, the difference between the actual and reconstructed position of a marker. We found that the reconstruction error is small (typically less than 1%). Compared with the DLT algorithm, the RCRT has overall lower reconstruction errors, especially outside the calibration volume, and the errors are essentially insensitive to camera position and orientation and to the number and position of the calibration points. To demonstrate the effectiveness of the RCRT, we tracked an anatomical marker on a seahorse recorded with four cameras to reconstruct the swimming trajectory for six different camera configurations. The RCRT algorithm is accurate and robust; it allows cameras to be oriented at large angles of incidence and facilitates the development of accurate tracking algorithms to quantify aquatic manoeuvres.
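
    The refraction step at each interface follows the vector form of Snell's law; a minimal sketch (a generic formula, not the authors' code):

    ```python
    import numpy as np

    def refract(d, n, n1, n2):
        """Refract unit direction d at an interface with unit normal n
        (pointing into the incident medium), refractive indices n1 -> n2.
        Returns the refracted unit direction, or None on total internal
        reflection."""
        eta = n1 / n2
        cos_i = -np.dot(d, n)
        k = 1.0 - eta ** 2 * (1.0 - cos_i ** 2)
        if k < 0.0:
            return None                    # total internal reflection
        return eta * d + (eta * cos_i - np.sqrt(k)) * n
    ```

    Chaining two such refractions (air to glass, then glass to water) back-projects a pixel ray into the tank; intersecting the back-projected rays from several cameras, e.g. in a least-squares sense, then gives the marker position.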

  14. Ray tracing: Experience at SRC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Severson, M.

    1996-09-01

    SHADOW [B. Lai and F. Cerrina, Nucl. Instrum. Methods A 246, 337 (1986)] is the primary ray-tracing program used at SRC. Ray tracing provides a tremendous amount of information regarding beamline layout, mirror sizes, resolution, alignment tolerances, and beam size at various locations. It also provides a way to check the beamline design for errors. Two recent designs have been ray traced extensively: an undulator-based, 4-meter, normal-incidence monochromator (NIM) [R. Reininger, M.C. Severson, R.W.C. Hansen, W.R. Winter, M.A. Green, and W.S. Trzeciak, Rev. Sci. Instrum. 66, 2194 (1995)] and an undulator-based, plane-grating monochromator (PGM) [R. Reininger, S.L. Crossley, M.A. Lagergren, M.C. Severson, and R.W.C. Hansen, Nucl. Instrum. Methods A 347, 304 (1994)]. © 1996 American Institute of Physics.

  15. Analysis and design of optical systems by use of sensitivity analysis of skew ray tracing

    NASA Astrophysics Data System (ADS)

    Lin, Psang Dain; Lu, Chia-Hung

    2004-02-01

    Optical systems are conventionally evaluated by ray-tracing techniques that extract performance quantities such as aberration and spot size. Current optical analysis software does not provide satisfactory analytical evaluation functions for the sensitivity of an optical system; furthermore, when functions oscillate strongly, the results are of low accuracy. This work therefore extends our earlier research on an advanced treatment of reflected or refracted rays, referred to as sensitivity analysis, in which differential changes of reflected or refracted rays are expressed in terms of differential changes of incident rays. The proposed sensitivity analysis methodology for skew ray tracing of reflected or refracted rays that cross spherical or flat boundaries is demonstrated and validated by application to the design of a cat's eye retroreflector and to the image orientation of a system with noncoplanar optical axes. The proposed sensitivity analysis is projected as the nucleus of other geometrical optical computations.

  16. Analysis and Design of Optical Systems by Use of Sensitivity Analysis of Skew Ray Tracing

    NASA Astrophysics Data System (ADS)

    Dain Lin, Psang; Lu, Chia-Hung

    2004-02-01

    Optical systems are conventionally evaluated by ray-tracing techniques that extract performance quantities such as aberration and spot size. Current optical analysis software does not provide satisfactory analytical evaluation functions for the sensitivity of an optical system; furthermore, when functions oscillate strongly, the results are of low accuracy. This work therefore extends our earlier research on an advanced treatment of reflected or refracted rays, referred to as sensitivity analysis, in which differential changes of reflected or refracted rays are expressed in terms of differential changes of incident rays. The proposed sensitivity analysis methodology for skew ray tracing of reflected or refracted rays that cross spherical or flat boundaries is demonstrated and validated by application to the design of a cat's eye retroreflector and to the image orientation of a system with noncoplanar optical axes. The proposed sensitivity analysis is projected as the nucleus of other geometrical optical computations.

  17. A Rewriting-Based Approach to Trace Analysis

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Rosu, Grigore; Clancy, Daniel (Technical Monitor)

    2002-01-01

    We present a rewriting-based algorithm for efficiently evaluating future-time Linear Temporal Logic (LTL) formulae on finite execution traces online. While the standard models of LTL are infinite traces, finite traces appear naturally when testing and/or monitoring real applications that only run for limited time periods. The presented algorithm is implemented in the Maude executable specification language and essentially consists of a set of equations establishing an executable semantics of LTL using a simple formula-transforming approach. The algorithm is further improved to build automata on-the-fly from formulae, using memoization. The result is a very efficient and small Maude program that can be used to monitor program executions. We furthermore present an alternative algorithm for synthesizing provably minimal observer finite state machines (or automata) from LTL formulae, which can be used to analyze execution traces without the need for a rewriting system, and can hence be used by observers written in conventional programming languages. The presented work is part of an ambitious runtime verification and monitoring project at NASA Ames, called PATHEXPLORER, and demonstrates that rewriting can be a tractable and attractive means for experimenting with and implementing program monitoring logics.
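
    The authors' implementation is a set of rewrite equations in Maude; as a toy rendering of the formula-transforming idea only, the sketch below progresses an LTL formula through a finite trace state by state and evaluates the residual formula at the end. The operator encoding, the simplification helpers, and the end-of-trace semantics are simplified assumptions, not the paper's definitions.

```python
# Formulas: True/False, atom (str), ('not', f), ('and', f, g), ('or', f, g),
# ('next', f), ('until', f, g), ('always', f), ('eventually', f).

def neg(a):
    return (not a) if isinstance(a, bool) else ('not', a)

def conj(a, b):
    if a is False or b is False: return False
    if a is True: return b
    if b is True: return a
    return ('and', a, b)

def disj(a, b):
    if a is True or b is True: return True
    if a is False: return b
    if b is False: return a
    return ('or', a, b)

def prog(f, state):
    """Rewrite ('progress') formula f through one state of the trace."""
    if isinstance(f, bool): return f
    if isinstance(f, str):  return f in state          # atomic proposition
    op = f[0]
    if op == 'not':        return neg(prog(f[1], state))
    if op == 'and':        return conj(prog(f[1], state), prog(f[2], state))
    if op == 'or':         return disj(prog(f[1], state), prog(f[2], state))
    if op == 'next':       return f[1]
    if op == 'until':      # f U g  =  g or (f and X(f U g))
        return disj(prog(f[2], state), conj(prog(f[1], state), f))
    if op == 'always':     return conj(prog(f[1], state), f)
    if op == 'eventually': return disj(prog(f[1], state), f)

def at_end(f):
    """Truth of the residual formula once the finite trace is exhausted."""
    if isinstance(f, bool): return f
    if isinstance(f, str):  return False
    op = f[0]
    if op == 'not': return not at_end(f[1])
    if op == 'and': return at_end(f[1]) and at_end(f[2])
    if op == 'or':  return at_end(f[1]) or at_end(f[2])
    return op == 'always'   # 'next', 'until', 'eventually' fail at the end

def holds(f, trace):
    for state in trace:
        f = prog(f, state)
    return at_end(f)

# always(p -> eventually q) over a finite trace of sets of atoms:
spec = ('always', ('or', ('not', 'p'), ('eventually', 'q')))
print(holds(spec, [{'p'}, set(), {'q'}]))   # True
```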

  18. A practical implementation of wave front construction for 3-D isotropic media

    NASA Astrophysics Data System (ADS)

    Chambers, K.; Kendall, J.-M.

    2008-06-01

    Wave front construction (WFC) methods are a useful tool for tracking wave fronts and are a natural extension to standard ray shooting methods. Here we describe and implement a simple WFC method that is used to interpolate wavefield properties throughout a 3-D heterogeneous medium. Our approach differs from previous 3-D WFC procedures primarily in the use of a ray interpolation scheme, based on approximating the wave front as a `locally spherical' surface, and a `first arrival mode', which reduces computation times when only first arrivals are required. Both of these features have previously been included in 2-D WFC algorithms; however, until now they have not been extended to 3-D systems. The wave front interpolation scheme allows rays to be traced from a nearly arbitrary distribution of take-off angles, and the calculation of derivatives with respect to take-off angles is not required for wave front interpolation. However, in regions of steep velocity gradient the locally spherical approximation is not valid, and it is necessary to backpropagate rays to a sufficiently homogeneous region before interpolation of the new ray. Our WFC technique is illustrated using a realistic velocity model, based on a North Sea oil reservoir. We examine wavefield quantities such as traveltimes, ray angles, source take-off angles and geometrical spreading factors, all of which are interpolated on to a regular grid. We compare geometrical spreading factors calculated using two methods: using the ray Jacobian and by taking the ratio of a triangular area of wave front to the corresponding solid angle at the source. The results show that care must be taken when using ray Jacobians to calculate geometrical spreading factors, as the poles of the source coordinate system produce unreliable values, which can be spread over a large area because only a few initial rays are traced in WFC. We also show that the use of the first arrival mode can reduce computation time by ~65 per cent, with the accuracy of the interpolated traveltimes, ray angles and source take-off angles largely unchanged. However, the first arrival mode does lead to inaccuracies in interpolated angles near caustic surfaces, as well as small variations in geometrical spreading factors for ray tubes that have passed through caustic surfaces.
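
    As a geometric illustration of the `locally spherical' assumption (not the authors' code), the following sketch inserts a new ray between two neighbouring wavefront rays by back-projecting both to a common virtual source and shooting the bisecting ray from it; all names are ours.

```python
import numpy as np

def interpolate_ray(x1, d1, x2, d2):
    """Insert a ray between two neighbouring wavefront rays (position x, unit
    direction d) under the locally spherical assumption: back-project both to
    a common virtual source (least-squares line intersection), then shoot the
    bisecting ray from it out to the mean wavefront radius."""
    P1 = np.eye(3) - np.outer(d1, d1)    # projector orthogonal to ray 1
    P2 = np.eye(3) - np.outer(d2, d2)
    A = np.vstack((P1, P2))
    b = np.concatenate((P1 @ x1, P2 @ x2))
    c = np.linalg.lstsq(A, b, rcond=None)[0]          # virtual source
    r = 0.5 * (np.linalg.norm(x1 - c) + np.linalg.norm(x2 - c))
    d_new = 0.5 * (x1 + x2) - c
    d_new /= np.linalg.norm(d_new)
    return c + r * d_new, d_new

# Two rays diverging from the origin, 0.1 rad apart:
d2 = np.array([np.sin(0.1), 0.0, np.cos(0.1)])
x_new, d_new = interpolate_ray(np.array([0.0, 0.0, 1.0]),
                               np.array([0.0, 0.0, 1.0]), d2.copy(), d2)
```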

  19. Ab initio simulation of diffractometer instrumental function for high-resolution X-ray diffraction

    PubMed Central

    Mikhalychev, Alexander; Benediktovitch, Andrei; Ulyanenkova, Tatjana; Ulyanenkov, Alex

    2015-01-01

    Modeling of the X-ray diffractometer instrumental function for a given optics configuration is important both for planning experiments and for the analysis of measured data. A fast and universal method for instrumental function simulation, suitable for fully automated computer realization and describing both coplanar and noncoplanar measurement geometries for any combination of X-ray optical elements, is proposed. The method can be identified as semi-analytical backward ray tracing and is based on the calculation of a detected signal as an integral of X-ray intensities for all the rays reaching the detector. The high speed of calculation is provided by the expressions for analytical integration over the spatial coordinates that describe the detection point. Consideration of the three-dimensional propagation of rays without restriction to the diffraction plane provides the applicability of the method for noncoplanar geometry and the accuracy for characterization of the signal from a two-dimensional detector. The correctness of the simulation algorithm is checked in the following two ways: by verifying the consistency of the calculated data with the patterns expected for certain simple limiting cases and by comparing measured reciprocal-space maps with the corresponding maps simulated by the proposed method for the same diffractometer configurations. Both kinds of tests demonstrate the agreement of the simulated instrumental function shape with the measured data. PMID:26089760

  20. Can We Trace "Arbitrary" Rays to Locate an Image Formed by a Thin Lens?

    ERIC Educational Resources Information Center

    Suppapittayaporn, Decha; Panijpan, Bhinyo; Emarat, Narumon

    2010-01-01

    After learning how to trace the principal rays [Fig. 1(i)] through a thin lens in order to form the image in the conventional way, students sometimes ask whether it is possible to use other rays emanating from the object to form exactly the same image--for example, the two arbitrary rays shown in Fig. 1(ii). The answer is a definite yes, and this…

  1. Photorealistic 3D omni-directional stereo simulator

    NASA Astrophysics Data System (ADS)

    Reiners, Dirk; Cruz-Neira, Carolina; Neumann, Carsten

    2015-03-01

    While a lot of areas in VR have made significant advances, visual rendering in VR is often not quite keeping up with the state of the art. There are many reasons for this, but one way to alleviate some of the issues is by using ray tracing instead of rasterization for image generation. Contrary to popular belief, ray tracing is a realistic, competitive technology nowadays. This paper looks at the pros and cons of using ray tracing and demonstrates the feasibility of employing it using the example of a helicopter flight simulator image generator.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sudhyadhom, A; McGuinness, C; Descovich, M

    Purpose: To develop a methodology for validation of a Monte-Carlo dose calculation model for robotic small-field SRS/SBRT deliveries. Methods: In a robotic treatment planning system, a Monte-Carlo model was iteratively optimized to match beam data. A two-part analysis was developed to verify this model. 1) The Monte-Carlo model was validated in a simulated water phantom against a Ray-Tracing calculation on a single-beam, collimator-by-collimator basis. 2) The Monte-Carlo model was validated to be accurate in the most challenging situation, lung, by acquiring in-phantom measurements. A plan was created and delivered in a CIRS lung phantom with a film insert. Separately, plans were delivered in an in-house created lung phantom with a PinPoint chamber insert within a lung-simulating material. For medium to large collimator sizes, a single beam was delivered to the phantom. For small collimators (10, 12.5, and 15 mm), a robotically delivered plan was created to generate a uniform dose field of irradiation over a 2×2 cm² area. Results: Dose differences in simulated water between Ray-Tracing and Monte-Carlo were all within 1% at dmax and deeper. Maximum dose differences occurred prior to dmax but were all within 3%. Film measurements in a lung phantom show high correspondence of over 95% gamma at the 2%/2mm level for Monte-Carlo. Ion chamber measurements for collimator sizes of 12.5 mm and above were within 3% of Monte-Carlo calculated values. Uniform irradiation involving the 10 mm collimator resulted in a dose difference of ∼8% for both Monte-Carlo and Ray-Tracing, indicating that there may be limitations with the dose calculation. Conclusion: We have developed a methodology to validate a Monte-Carlo model by verifying that it matches in water and, separately, that it corresponds well in lung-simulating materials. The Monte-Carlo model and algorithm tested may have more limited accuracy for 10 mm fields and smaller.

  3. Optimizing detector geometry for trace element mapping by X-ray fluorescence

    DOE PAGES

    Sun, Yue; Gleber, Sophie-Charlotte; Jacobsen, Chris; ...

    2015-01-01

    We report that trace metals play critical roles in a variety of systems, ranging from cells to photovoltaics. X-ray fluorescence (XRF) microscopy using X-ray excitation provides one of the highest sensitivities available for imaging the distribution of trace metals at sub-100 nm resolution. With the growing availability and increasing performance of synchrotron-light-source-based instruments and X-ray nanofocusing optics, and with improvements in energy-dispersive XRF detectors, what are the factors that limit trace element detectability? To address this question, we describe an analytical model for the total signal incident on XRF detectors with various geometries, including the spectral response of energy-dispersive detectors. This model agrees well with experimentally recorded X-ray fluorescence spectra, and involves much shorter calculation times than Monte Carlo simulations. With such a model, one can estimate the signal when a trace element is illuminated with an X-ray beam, and when just the surrounding non-fluorescent material is illuminated. From this signal difference, a contrast parameter can be calculated, and this can in turn be used to calculate the signal-to-noise ratio (S/N) for detecting a certain elemental concentration. We apply this model to the detection of trace amounts of zinc in biological materials, and to the detection of small quantities of arsenic in semiconductors. We conclude that increased detector collection solid angle is (nearly) always advantageous even when considering the scattered signal. However, given the choice between a smaller detector at 90° to the beam versus a larger detector at 180° (in a backscatter-like geometry), the 90° detector is better for trace element detection in thick samples, while the larger detector in 180° geometry is better suited to trace element detection in thin samples.
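
    As a toy rendering of the contrast argument sketched in the abstract (not the authors' analytical detector model), one can compare Poisson-limited counts with the beam on and off the trace element; the rates, dwell time and solid angles below are invented placeholders.

```python
import numpy as np

def detection_snr(rate_elem, rate_bg, dwell):
    """Toy contrast estimate: with the beam on the trace element the detector
    counts at rate_elem + rate_bg, off it at rate_bg alone; Poisson statistics
    then give the S/N of the difference for a given dwell time per pixel."""
    n_on = (rate_elem + rate_bg) * dwell
    n_off = rate_bg * dwell
    return (n_on - n_off) / np.sqrt(n_on + n_off)

# Both rates scale with the collection solid angle, so S/N grows ~ sqrt(omega),
# which is the sense in which a bigger detector (nearly) always helps:
for omega in (0.1, 0.5, 1.0):            # sr, hypothetical solid angles
    print(omega, detection_snr(50.0 * omega, 200.0 * omega, dwell=0.1))
```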

  5. Localization and cooperative communication methods for cognitive radio

    NASA Astrophysics Data System (ADS)

    Duval, Olivier

    We study localization of nearby nodes and cooperative communication for cognitive radios. Cognitive radios sensing their environment to estimate the channel gain between nodes can cooperate and adapt their transmission power to maximize the capacity of the communication between two nodes. We study the end-to-end capacity of a cooperative relaying scheme using orthogonal frequency-division multiplexing (OFDM), under power constraints for both the base station and the relay station. The relay uses amplify-and-forward and decode-and-forward cooperative relaying techniques to retransmit messages on a subset of the available subcarriers. The power used in the base station and relay station transmitters is allocated to maximize the overall system capacity. The subcarrier selection and power allocation are obtained from convex optimization formulations and an iterative algorithm. Additionally, decode-and-forward relaying schemes are allowed to pair source and relayed subcarriers to further increase the capacity of the system. The proposed techniques outperform non-selective relaying schemes over a range of relay power budgets. Cognitive radios can be used for opportunistic access of the radio spectrum by detecting spectrum holes left unused by licensed primary users. We introduce a spectrum-hole detection approach which combines blind modulation classification, angle-of-arrival estimation and number-of-sources detection. We perform eigenspace analysis to determine the number of sources, and estimate their angles of arrival (AOA). In addition, we classify detected sources as primary or secondary users by their distinct second-order one-conjugate cyclostationarity features. Extensive simulations indicate that the proposed system identifies and locates individual sources correctly, even at -4 dB signal-to-noise ratio (SNR). In environments with a high density of scatterers, several wireless channels experience non-line-of-sight (NLOS) conditions, increasing the localization error even when the AOA estimate is accurate. We present a real-time localization solver (RTLS) for time-of-arrival (TOA) estimates using ray-tracing methods on a map of the wall geometry and compare its performance with classical TOA trilateration localization methods. Extensive simulations and field trials in indoor environments show that our method increases the coverage area from 1.9% of the floor to 82.3% and the accuracy by a factor of 10 compared with trilateration. We implemented our ray-tracing model in C++ using the CGAL computational geometry algorithms library. We illustrate the real-time property of our RTLS, which performs most ray-tracing tasks in a preprocessing phase, with time and space complexity analyses and profiling of our software.

  6. Determination of equivalent sound speed profiles for ray tracing in near-ground sound propagation.

    PubMed

    Prospathopoulos, John M; Voutsinas, Spyros G

    2007-09-01

    The determination of appropriate sound speed profiles for ray-tracing models of near-ground sound propagation is investigated using a ray-tracing model capable of performing axisymmetric calculations of the sound field around an isolated source. Eigenrays are traced using an iterative procedure which integrates the trajectory equations for each ray launched from the source in a specific direction. The calculation of sound energy losses is made by introducing into the equations appropriate coefficients representing the effect of ground and atmospheric absorption and the interaction with atmospheric turbulence. The model is validated against analytical and numerical predictions of other methodologies for simple cases, as well as against measurements for nonrefractive atmospheric environments. A systematic investigation of near-ground propagation in downward and upward refractive atmospheres is made using experimental data. Guidelines for the suitable simulation of the wind velocity profile are derived by correlating predictions with measurements.
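
    A minimal sketch of that trajectory-integration step for a horizontally stratified sound speed profile, using a fourth-order Runge-Kutta scheme, might look as follows; the profile, launch angle and step sizes are illustrative assumptions, not the paper's model (which also tracks energy losses).

```python
import numpy as np

def trace_ray(theta0, z0, c, dcdz, ds=1.0, s_max=5000.0):
    """Integrate the 2-D ray trajectory equations through a horizontally
    stratified sound speed profile c(z) with fourth-order Runge-Kutta steps;
    theta is the ray elevation angle measured from the horizontal."""
    def rhs(y):
        x, z, th = y
        return np.array([np.cos(th), np.sin(th),
                         -np.cos(th) / c(z) * dcdz(z)])
    y = np.array([0.0, z0, theta0])
    path = [y.copy()]
    for _ in range(int(s_max / ds)):
        k1 = rhs(y)
        k2 = rhs(y + 0.5 * ds * k1)
        k3 = rhs(y + 0.5 * ds * k2)
        k4 = rhs(y + ds * k3)
        y = y + (ds / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4)
        path.append(y.copy())
    return np.array(path)

# Upward-refracting case: sound speed decreasing with height (toy profile).
path = trace_ray(np.radians(5.0), 1.5,
                 c=lambda z: 340.0 - 0.1 * z, dcdz=lambda z: -0.1)
```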

  7. Three-dimensional ray-tracing model for the study of advanced refractive errors in keratoconus.

    PubMed

    Schedin, Staffan; Hallberg, Per; Behndig, Anders

    2016-01-20

    We propose a numerical three-dimensional (3D) ray-tracing model for the analysis of advanced corneal refractive errors. The 3D modeling was based on measured corneal elevation data by means of Scheimpflug photography. A mathematical description of the measured corneal surfaces from a keratoconus (KC) patient was used for the 3D ray tracing, based on Snell's law of refraction. A model of a commercial intraocular lens (IOL) was included in the analysis. By modifying the posterior IOL surface, it was shown that the imaging quality could be significantly improved. The RMS values were reduced by approximately 50% close to the retina, both for on- and off-axis geometries. The 3D ray-tracing model can constitute a basis for simulation of customized IOLs that are able to correct the advanced, irregular refractive errors in KC.

  8. Fast solar radiation pressure modelling with ray tracing and multiple reflections

    NASA Astrophysics Data System (ADS)

    Li, Zhen; Ziebart, Marek; Bhattarai, Santosh; Harrison, David; Grey, Stuart

    2018-05-01

    Physics-based SRP (solar radiation pressure) models using ray-tracing methods are powerful tools when modelling the forces on complex real-world space vehicles. Currently, high-resolution (1 mm) ray tracing with secondary intersections is done on high-performance computers at UCL (University College London). This study introduces the BVH (bounding volume hierarchy) into the ray-tracing approach for physics-based SRP modelling and makes it possible to run high-resolution analysis on personal computers. The ray tracer is both general and efficient enough to cope with the complex shape of satellites and multiple reflections (three or more, with no upper limit). In this study, the traditional ray-tracing technique is introduced first, and the BVH is then integrated into the ray tracing. Four aspects of the ray tracer were tested to investigate its performance: runtime, accuracy, the effects of multiple reflections and the effects of pixel array resolution. Runtime tests on GPS IIR and Galileo IOV (In Orbit Validation) satellites show that the BVH can make the force model computation 30-50 times faster. The ray tracer has an absolute accuracy of several nanonewtons, established by comparing the test results for spheres and planes with analytical computations. The multiple reflection effects are investigated both in the intersection number and in the acceleration on GPS IIR, Galileo IOV and Sentinel-1 spacecraft. Considering the number of intersections, the 3rd reflection can capture 99.12%, 99.14%, and 91.34% of the total reflections for the GPS IIR and Galileo IOV satellite buses and the Sentinel-1 spacecraft, respectively. In terms of the multiple reflection effects on the acceleration, the secondary reflection effect for the Galileo IOV satellite and Sentinel-1 can reach 0.2 nm/s² and 0.4 nm/s², respectively. The error percentages in the acceleration magnitudes show that the 3rd reflection should be considered in order to keep the error below 0.035%. The pixel array resolution tests show that the dimensions of the components have to be considered when choosing the pixel spacing, so as not to miss some components of the satellite in the ray tracing. This paper presents the first systematic and quantitative study of the secondary and higher-order intersection effects. It shows conclusively that the effect is non-negligible for certain classes of mission.
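
    A minimal sketch of the BVH idea described above (not UCL's ray tracer): each node stores an axis-aligned bounding box, and whole subtrees are skipped when a ray misses the box, which is what makes the reported speed-up possible. The node layout and the `hit_tri` triangle test are hypothetical, and ray directions are assumed to have no zero components.

```python
import numpy as np

class Node:
    """BVH node: an axis-aligned box plus either two children or a leaf list."""
    def __init__(self, lo, hi, left=None, right=None, tris=None):
        self.lo, self.hi = lo, hi
        self.left, self.right, self.tris = left, right, tris

def hit_box(o, inv_d, lo, hi):
    """Slab test: does the ray o + t*d (inv_d = 1/d) enter the box for t >= 0?"""
    t1, t2 = (lo - o) * inv_d, (hi - o) * inv_d
    t_near = np.max(np.minimum(t1, t2))
    t_far = np.min(np.maximum(t1, t2))
    return t_far >= max(t_near, 0.0)

def traverse(root, o, d, hit_tri):
    """Collect triangles hit by the ray, skipping every subtree whose bounding
    box the ray misses; pruning whole subtrees this way is what turns an
    all-facets test into a logarithmic search."""
    inv_d = 1.0 / d                      # assumes no zero direction components
    stack, hits = [root], []
    while stack:
        n = stack.pop()
        if not hit_box(o, inv_d, n.lo, n.hi):
            continue
        if n.tris is not None:           # leaf: exact ray-triangle tests
            hits.extend(t for t in n.tris if hit_tri(o, d, t))
        else:
            stack.extend((n.left, n.right))
    return hits

# Tiny demo with a single leaf box and a stand-in triangle test:
leaf = Node(np.zeros(3), np.ones(3), tris=[0])
print(traverse(leaf, np.array([-1.0, 0.5, 0.5]),
               np.array([1.0, 1e-9, 1e-9]), hit_tri=lambda o, d, t: True))
```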

  9. Eikonal-Based Inversion of GPR Data from the Vaucluse Karst Aquifer

    NASA Astrophysics Data System (ADS)

    Yedlin, M. J.; van Vorst, D.; Guglielmi, Y.; Cappa, F.; Gaffet, S.

    2009-12-01

    In this paper, we present an easy-to-implement eikonal-based travel time inversion algorithm and apply it to borehole GPR measurement data obtained from a karst aquifer located in the Vaucluse in Provence. The boreholes are situated within a fault zone deep inside the aquifer, in the Laboratoire Souterrain à Bas Bruit (LSBB). The measurements were made using 250 MHz MALA RAMAC borehole GPR antennas. The inversion formulation is unique in its application of a fast-sweeping eikonal solver (Zhao [1]) to the minimization of an objective functional that is composed of a travel time misfit and a model-based regularization [2]. The solver is robust in the presence of large velocity contrasts, efficient, easy to implement, and does not require the use of a sorting algorithm. The computation of sensitivities, which are required for the inversion process, is achieved by tracing rays backward from receiver to source following the gradient of the travel time field [2]. A user wishing to implement this algorithm can opt to avoid the ray-tracing step and simply perturb the model to obtain the required sensitivities. Despite the obvious computational inefficiency of such an approach, it is acceptable for 2D problems. The relationship between travel time and the velocity profile is non-linear, requiring an iterative approach. At each iteration, a set of matrix equations is solved to determine the model update. As the inversion continues, the weighting of the regularization parameter is adjusted until an appropriate data misfit is obtained. The inversion results are consistent with previously obtained geological structure. Future work will look at improving inversion resolution and incorporating other measurement methodologies, with the goal of providing useful data for groundwater analysis. References: [1] H. Zhao, "A fast sweeping method for Eikonal equations," Mathematics of Computation, vol. 74, no. 250, pp. 603-627, 2004. [2] D. Aldridge and D. Oldenburg, "Two-dimensional tomographic inversion with finite-difference traveltimes," Journal of Seismic Exploration, vol. 2, pp. 257-274, 1993. [Figure: recovered permittivity profiles]
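
    For illustration, a compact 2-D version of the fast-sweeping eikonal update of Zhao [1], of the kind the inversion relies on (our simplified rendering, not the authors' solver); the grid size, slowness field and sweep count are placeholders.

```python
import numpy as np

def fast_sweep(s, h, src, n_passes=4, big=1e10):
    """2-D fast-sweeping solver for |grad T| = s(x, y): Gauss-Seidel updates
    repeated over the four alternating sweep orderings, with no heap or
    sorting required."""
    ny, nx = s.shape
    T = np.full((ny, nx), big)
    T[src] = 0.0
    orders = [(range(ny), range(nx)),
              (range(ny), range(nx - 1, -1, -1)),
              (range(ny - 1, -1, -1), range(nx)),
              (range(ny - 1, -1, -1), range(nx - 1, -1, -1))]
    for _ in range(n_passes):
        for rows, cols in orders:
            for i in rows:
                for j in cols:
                    a = min(T[max(i - 1, 0), j], T[min(i + 1, ny - 1), j])
                    b = min(T[i, max(j - 1, 0)], T[i, min(j + 1, nx - 1)])
                    if abs(a - b) >= s[i, j] * h:      # one-sided update
                        t = min(a, b) + s[i, j] * h
                    else:                              # two-sided (quadratic)
                        t = 0.5 * (a + b + np.sqrt(2.0 * (s[i, j] * h) ** 2
                                                   - (a - b) ** 2))
                    if t < T[i, j]:
                        T[i, j] = t
    return T

# Constant slowness: T should approach s * (distance from the source).
T = fast_sweep(np.ones((101, 101)), h=1.0, src=(50, 50))
```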

  10. Analysis of eight argonne premium coal samples by X-ray fluorescence spectrometry

    USGS Publications Warehouse

    Evans, J.R.; Sellers, G.A.; Johnson, R.G.; Vivit, D.V.; Kent, J.

    1990-01-01

    X-ray fluorescence spectrometric methods were used in the analysis of eight Argonne Premium Coal Samples. Trace elements (Cr, Ni, Cu, Zn, Rb, Sr, Y, Zr, Nb, Ba, La, and Ce) in coal ash were determined by energy-dispersive X-ray fluorescence spectrometry; major elements (Na, Mg, Al, Si, P, S, K, Ca, Ti, Mn, and Fe) in coal ash and trace elements (Cl and P) in whole coal were determined by wavelength-dispersive X-ray fluorescence spectrometry. The results of this study will be used in a geochemical database compiled for these materials from various analytical techniques. The experimental XRF methods and procedures used to determine these major and trace elements are described.

  11. Incorporating geometric ray tracing to generate initial conditions for intensity modulated arc therapy optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oliver, Mike; Gladwish, Adam; Craig, Jeff

    2008-07-15

    Purpose and background: Intensity-modulated arc therapy (IMAT) is a rotational variant of intensity-modulated radiation therapy (IMRT) that is achieved by allowing the multileaf collimator (MLC) positions to vary as the gantry rotates around the patient. This work describes a method to generate an IMAT plan through the use of a fast ray-tracing technique based on dosimetric and geometric information for setting initial MLC leaf positions prior to final IMAT optimization. Methods and materials: Three steps were used to generate an IMAT plan. The first step was to generate arcs based on anatomical contours. The second step was to generate ray importance factor (RIF) maps by ray tracing the dose distribution inside the planning target volume (PTV) to modify the MLC leaf positions of the anatomical arcs and reduce the maximum dose inside the PTV. The RIF maps were also segmented to create a new set of arcs to improve the dose to low-dose voxels within the PTV. In the third step, the MLC leaf positions from all arcs were put through a leaf position optimization (LPO) algorithm and brought into a fast Monte Carlo dose calculation engine for a final dose calculation. The method was applied to two phantom cases, a clinical prostate case and the Radiological Physics Center (RPC)'s head and neck phantom. The authors assessed the plan improvements achieved by each step and compared plans with and without using RIF. They also compared the IMAT plan with an IMRT plan for the RPC phantom. Results: All plans that incorporated RIF and LPO had lower objective function values than those that incorporated LPO only. The objective function value was reduced by about 15% after the generation of RIF arcs and by 52% after the generation of RIF arcs and leaf position optimization. The IMAT plan for the RPC phantom had similar dose coverage for PTV1 and PTV2 (the same dose volume histogram curves) but slightly lower dose to the normal tissues compared to a six-field IMRT plan. Conclusion: The use of a ray importance factor can generate initial IMAT arcs efficiently for further MLC leaf position optimization, yielding a more favorable IMAT plan.

  12. Trace Elements in Ovaries: Measurement and Physiology.

    PubMed

    Ceko, Melanie J; O'Leary, Sean; Harris, Hugh H; Hummitzsch, Katja; Rodgers, Raymond J

    2016-04-01

    Traditionally, research in the field of trace element biology and human and animal health has largely depended on epidemiological methods to demonstrate involvement in biological processes. These studies were typically followed by trace element supplementation trials or attempts at identification of the biochemical pathways involved. With the discovery of biological molecules that contain the trace elements, such as matrix metalloproteinases containing zinc (Zn), cytochrome P450 enzymes containing iron (Fe), and selenoproteins containing selenium (Se), much of the current research focuses on these molecules, and, hence, only indirectly on trace elements themselves. This review focuses largely on two synchrotron-based X-ray techniques, X-ray absorption spectroscopy and X-ray fluorescence imaging, that can be used to identify the in situ speciation and distribution of trace elements in tissues, drawing on our recent studies of bovine ovaries, where the distributions of Fe, Se, Zn, and bromine were determined. It also discusses the value of other techniques, such as inductively coupled plasma mass spectrometry, used to garner information about the concentrations and elemental state of the trace elements. These applications to measure trace elemental distributions in bovine ovaries at high resolution provide new insights into possible roles for trace elements in the ovary. © 2016 by the Society for the Study of Reproduction, Inc.

  13. In vivo ultrasound imaging of the bone cortex

    NASA Astrophysics Data System (ADS)

    Renaud, Guillaume; Kruizinga, Pieter; Cassereau, Didier; Laugier, Pascal

    2018-06-01

    Current clinical ultrasound scanners cannot be used to image the interior morphology of bones because these scanners fail to address the complicated physics involved for exact image reconstruction. Here, we show that if the physics is properly addressed, bone cortex can be imaged using a conventional transducer array and a programmable ultrasound scanner. We provide in vivo proof for this technique by scanning the radius and tibia of two healthy volunteers and comparing the thickness of the radius bone with high-resolution peripheral x-ray computed tomography. Our method assumes a medium that is composed of different homogeneous layers with unique elastic anisotropy and ultrasonic wave-speed values. The applicable values of these layers are found by optimizing image sharpness and intensity over a range of relevant values. In the algorithm of image reconstruction we take wave refraction between the layers into account using a ray-tracing technique. The estimated values of the ultrasonic wave-speed and anisotropy in cortical bone are in agreement with ex vivo studies reported in the literature. These parameters are of interest since they were proposed as biomarkers for cortical bone quality. In this paper we discuss the physics involved with ultrasound imaging of bone and provide an algorithm to successfully image the first segment of cortical bone.

  14. Experimental evaluation of radiosity for room sound-field prediction.

    PubMed

    Hodgson, Murray; Nosal, Eva-Marie

    2006-08-01

    An acoustical radiosity model was evaluated for how it performs in predicting real room sound fields. This was done by comparing radiosity predictions with experimental results for three existing rooms--a squash court, a classroom, and an office. Radiosity predictions were also compared with those by ray tracing--a "reference" prediction model--for both specular and diffuse surface reflection. Comparisons were made for detailed and discretized echograms, sound-decay curves, sound-propagation curves, and the variations with frequency of four room-acoustical parameters--EDT, RT, D50, and C80. In general, radiosity and diffuse ray tracing gave very similar predictions. Predictions by specular ray tracing were often very different. Radiosity agreed well with experiment in some cases, less well in others. Definitive conclusions regarding the accuracy with which the rooms were modeled, or the accuracy of the radiosity approach, were difficult to draw. The results suggest that radiosity predicts room sound fields with some accuracy, at least as well as diffuse ray tracing and, in general, better than specular ray tracing. The predictions of detailed echograms are less accurate, those of derived room-acoustical parameters more accurate. The results underline the need to develop experimental methods for accurately characterizing the absorptive and reflective characteristics of room surfaces, possibly including phase.

  15. Application of adaptive filters in denoising magnetocardiogram signals

    NASA Astrophysics Data System (ADS)

    Khan, Pathan Fayaz; Patel, Rajesh; Sengottuvel, S.; Saipriya, S.; Swain, Pragyna Parimita; Gireesan, K.

    2017-05-01

    Magnetocardiography (MCG) is the measurement of weak magnetic fields from the heart using Superconducting QUantum Interference Devices (SQUIDs). Though the measurements are performed inside magnetically shielded rooms (MSR) to reduce external electromagnetic disturbances, interference caused by sources inside the shielded room could not be attenuated. The work presented here reports the application of adaptive filters to denoise MCG signals. Two adaptive noise cancellation approaches, namely the least mean squares (LMS) algorithm and the recursive least squares (RLS) algorithm, are applied to denoise MCG signals and the results are compared. It is found that both algorithms effectively remove noisy wiggles from MCG traces, significantly improving the quality of the cardiac features in the traces. The calculated signal-to-noise ratio (SNR) for the denoised MCG traces is found to be slightly higher for the LMS algorithm than for the RLS algorithm. The results encourage the use of adaptive techniques to suppress noise at the power line frequency and its harmonics, which occurs frequently in biomedical measurements.
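
    For illustration, a minimal LMS noise canceller of the kind described (a sketch under simplified assumptions, not the instrument's code): a reference input correlated with the interference is filtered adaptively and subtracted from the primary channel. The filter order, step size and synthetic signals are placeholders.

```python
import numpy as np

def lms_denoise(primary, reference, order=32, mu=0.01):
    """LMS adaptive noise cancellation: filter the reference input so that it
    predicts the interference in the primary channel, then subtract; the
    weights follow the negative gradient of the squared error."""
    w = np.zeros(order)
    out = np.zeros_like(primary)
    for n in range(order, len(primary)):
        x = reference[n - order:n][::-1]   # most recent reference samples
        y = w @ x                          # current interference estimate
        e = primary[n] - y                 # cleaned sample doubles as the error
        w += 2.0 * mu * e * x              # LMS weight update
        out[n] = e
    return out

# Synthetic test: a slow cardiac-like wave plus 50 Hz pickup; the reference
# is the same hum with an arbitrary phase offset.
t = np.arange(0.0, 10.0, 1.0 / 1000.0)
noisy = np.sin(2 * np.pi * 1.2 * t) + 0.5 * np.sin(2 * np.pi * 50.0 * t)
denoised = lms_denoise(noisy, np.sin(2 * np.pi * 50.0 * t + 0.3))
```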

  16. Ray tracing a three dimensional scene using a grid

    DOEpatents

    Wald, Ingo; Ize, Santiago; Parker, Steven G; Knoll, Aaron

    2013-02-26

    Ray tracing a three-dimensional scene using a grid. One example embodiment is a method for ray tracing a three-dimensional scene using a grid. In this example method, the three-dimensional scene is made up of objects that are spatially partitioned into a plurality of cells that make up the grid. The method includes a first act of computing a bounding frustum of a packet of rays, and a second act of traversing the grid slice by slice along a major traversal axis. Each slice traversal includes a first act of determining one or more cells in the slice that are overlapped by the frustum and a second act of testing the rays in the packet for intersection with any objects at least partially bounded by the one or more cells overlapped by the frustum.
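
    The patent traverses the grid slice by slice with a frustum that bounds a whole packet of rays; the single-ray grid walk that such packet schemes generalize is the classic 3-D digital differential analyzer (DDA), sketched below for orientation. This is not the patented algorithm itself, and all names are ours.

```python
import numpy as np

def grid_cells(origin, direction, grid_min, cell_size, n_cells, t_max=1e9):
    """Enumerate the cells pierced by one ray (3-D DDA): repeatedly step into
    the neighbouring cell across whichever cell wall the ray reaches first."""
    cell = np.clip(((origin - grid_min) / cell_size).astype(int),
                   0, np.array(n_cells) - 1)
    step = np.where(direction >= 0.0, 1, -1)
    next_wall = grid_min + (cell + (step > 0)) * cell_size
    with np.errstate(divide='ignore', invalid='ignore'):
        t_next = np.where(direction != 0.0,
                          (next_wall - origin) / direction, np.inf)
        t_delta = np.where(direction != 0.0,
                           cell_size / np.abs(direction), np.inf)
    while np.all((cell >= 0) & (cell < n_cells)):
        yield tuple(int(c) for c in cell)
        axis = int(np.argmin(t_next))    # nearest wall determines the step axis
        if t_next[axis] > t_max:
            return
        cell[axis] += step[axis]
        t_next[axis] += t_delta[axis]

for c in grid_cells(np.zeros(3), np.array([1.0, 0.5, 0.25]),
                    np.zeros(3), 1.0, (8, 8, 8)):
    pass   # intersect the ray with objects stored in cell c here
```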

  17. Modeling a Miniaturized Scanning Electron Microscope Focusing Column - Lessons Learned in Electron Optics Simulation

    NASA Technical Reports Server (NTRS)

    Loyd, Jody; Gregory, Don; Gaskin, Jessica

    2016-01-01

    This presentation discusses work done to assess the design of a focusing column in a miniaturized Scanning Electron Microscope (SEM) developed at the NASA Marshall Space Flight Center (MSFC) for use in situ on the Moon, in particular for mineralogical analysis. The MSFC beam column design uses purely electrostatic fields for focusing, because of the severe constraints on mass and electrical power consumption imposed by the goals of lunar exploration and of spaceflight in general. The resolution of an SEM ultimately depends on the size of the focused spot of the scanning beam probe, for which the stated goal here is a diameter of 10 nanometers. Optical aberrations are the main challenge to this performance goal, because they blur the ideal geometrical optical image of the electron source, effectively widening the ideal spot size of the beam probe. In the present work the optical aberrations of the mini SEM focusing column were assessed using direct tracing of non-paraxial rays, as opposed to mathematical estimates of aberrations based on paraxial ray-traces. The geometrical ray-tracing employed here is completely analogous to ray-tracing as conventionally understood in the realm of photon optics, with the major difference being that in electron optics the lens is simply a smoothly varying electric field in vacuum, formed by precisely machined electrodes. Ray-tracing in this context, therefore, relies upon a model of the electrostatic field inside the focusing column to provide the mathematical description of the "lens" being traced. This work relied fundamentally on the boundary element method (BEM) for this electric field model. In carrying out this research the authors discovered that higher accuracy in the field model was essential if aberrations were to be reliably assessed using direct ray-tracing. This led to some work in testing alternative techniques for modeling the electrostatic field. Ultimately, the necessary accuracy was attained using a BEM/Fourier series hybrid approach. The presentation will give background remarks about the MSFC mini Lunar SEM concept and electron optics modeling, followed by a description of the alternate field modeling techniques that were tried, along with their incorporation into a ray-trace simulation. Next, the validation of this simulation against commercially available software will be discussed using an example lens as a test case. Then, the efficacy of aberration assessment using direct ray-tracing will be demonstrated, using this same validation case. The discussion will include practical error checks of the field solution. Finally, the ray-trace assessment of the MSFC mini Lunar SEM concept will be shown and discussed. The authors believe this presentation will be of general interest to practitioners of modeling and simulation, as well as those with a general optics background. Because electron optics and photon optics share many basic concepts (e.g., lenses, images, aberrations, etc.), the appeal of this presentation need not be restricted to just those interested in charged particle optics.

  18. Effect of the equivalent refractive index on intraocular lens power prediction with ray tracing after myopic laser in situ keratomileusis.

    PubMed

    Canovas, Carmen; van der Mooren, Marrie; Rosén, Robert; Piers, Patricia A; Wang, Li; Koch, Douglas D; Artal, Pablo

    2015-05-01

    To determine the impact of the equivalent refractive index (ERI) on intraocular lens (IOL) power prediction for eyes with previous myopic laser in situ keratomileusis (LASIK) using custom ray tracing. AMO B.V., Groningen, the Netherlands, and the Department of Ophthalmology, Baylor College of Medicine, Houston, Texas, USA. Retrospective data analysis. The ERI was calculated individually from the post-LASIK total corneal power. Two methods to account for the posterior corneal surface were tested; that is, calculation from pre-LASIK data or from post-LASIK data only. Four IOL power predictions were generated using a computer-based ray-tracing technique, including individual ERI results from both calculation methods, a mean ERI over the whole population, and the ERI for normal patients. For each patient, IOL power results calculated from the four predictions as well as those obtained with the Haigis-L were compared with the optimum IOL power calculated after cataract surgery. The study evaluated 25 patients. The mean and range of ERI values determined using post-LASIK data were similar to those determined from pre-LASIK data. Introducing individual or an average ERI in the ray-tracing IOL power calculation procedure resulted in mean IOL power errors that were not significantly different from zero. The ray-tracing procedure that includes an average ERI gave a greater percentage of eyes with an IOL power prediction error within ±0.5 diopter than the Haigis-L (84% versus 52%). For IOL power determination in post-LASIK patients, custom ray tracing including a modified ERI was an accurate procedure that exceeded the current standards for normal eyes. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.

  19. The Gaussian Laser Angular Distribution in HYDRA's 3D Laser Ray Trace Package

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sepke, Scott M.

    In this note, the angular distribution of rays launched by the 3D LZR ray trace package is derived for Gaussian beams (npower = 2) with bmodel = ±3. Beams with bmodel = +3 have a nearly flat distribution, and beams with bmodel = -3 have a nearly linear distribution when the spot size is large compared to the wavelength.

  1. Modeling ECM fiber formation: structure information extracted by analysis of 2D and 3D image sets

    NASA Astrophysics Data System (ADS)

    Wu, Jun; Voytik-Harbin, Sherry L.; Filmer, David L.; Hoffman, Christoph M.; Yuan, Bo; Chiang, Ching-Shoei; Sturgis, Jennis; Robinson, Joseph P.

    2002-05-01

    Recent evidence supports the notion that biological functions of extracellular matrix (ECM) are highly correlated to its structure. Understanding this fibrous structure is crucial in tissue engineering to develop the next generation of biomaterials for restoration of tissues and organs. In this paper, we integrate confocal microscopy imaging and image-processing techniques to analyze the structural properties of ECM. We describe a 2D fiber middle-line tracing algorithm and apply it via Euclidean distance maps (EDM) to extract accurate fibrous structure information, such as fiber diameter, length, orientation, and density, from single slices. Based on the 2D tracing algorithm, we extend our analysis to 3D tracing via Euclidean distance maps to extract 3D fibrous structure information. We use computer simulation to construct the 3D fibrous structure, which is subsequently used to test our tracing algorithms. After further image processing, these models are then applied to a variety of ECM constructions, from which the results of 2D and 3D traces are statistically analyzed.

  2. SolTrace | Concentrating Solar Power | NREL

    Science.gov Websites

    SolTrace is available as an NREL packaged distribution or from source code at the SolTrace open source project website. The code uses a Monte Carlo ray-tracing methodology.

  3. Enroute NASA/FAA low-frequency propfan test in Alabama (October 1987): A versatile atmospheric aircraft long-range noise prediction system

    NASA Astrophysics Data System (ADS)

    Tsouka, Despina G.

    In order to obtain a flight-to-static noise prediction for an advanced turboprop (propfan) aircraft, the FAA processed the data measured during a full-scale measurement program conducted by NASA and FAA/DOT/TSC in October 1987 in Alabama. The processing was based on simulating the aircraft as a point source, on an atmospheric two-dimensional noise model, on the American National Standard algorithm for the calculation of atmospheric absorption, and on the DOT/TSC convention for ground reflection effects. Using the data of the Alabama measurements, the present paper examines the development of a generalized, flexible and more accurate process for the evaluation of static and flight low-frequency long-range noise data. This paper also examines the applicability of the Integrated Noise Model's assumption of linear propagation, of the three-dimensional Hamiltonian ray-tracing model and of the Weyl-Van der Pol model. The model adopts some assumptions in order to increase the flexibility of the calculations without significant loss of accuracy. In addition, it proposes the use of the three-dimensional Hamiltonian ray-tracing model and the Weyl-Van der Pol model in order to increase the accuracy and to ensure the generality of the noise propagation prediction over grounds with variable impedance.

  4. The influence of leaf anatomy on the internal light environment and photosynthetic electron transport rate: exploration with a new leaf ray tracing model.

    PubMed

    Xiao, Yi; Tholen, Danny; Zhu, Xin-Guang

    2016-11-01

    Leaf photosynthesis is determined by biochemical properties and anatomical features. Here we developed a three-dimensional leaf model that can be used to evaluate the internal light environment of a leaf and its implications for whole-leaf electron transport rates (J). This model includes (i) the basic components of a leaf, such as the epidermis, palisade and spongy tissues, as well as the physical dimensions and arrangements of cell walls, vacuoles and chloroplasts; and (ii) an efficient forward ray-tracing algorithm, predicting the internal light environment for light of wavelengths between 400 and 2500 nm. We studied the influence of leaf anatomy and ambient light on internal light conditions and J. The results show that (i) different chloroplasts can experience drastically different light conditions, even when they are located at the same distance from the leaf surface; (ii) bundle sheath extensions, which are strips of parenchyma, collenchyma or sclerenchyma cells connecting the vascular bundles with the epidermis, can influence the photosynthetic light-use efficiency of leaves; and (iii) chloroplast positioning can also influence the light-use efficiency of leaves. Mechanisms underlying leaf internal light heterogeneity and the implications of this heterogeneity for photoprotection and for the convexity of the light response curves are discussed. © The Author 2016. Published by Oxford University Press on behalf of the Society for Experimental Biology.

  5. Light-field camera-based 3D volumetric particle image velocimetry with dense ray tracing reconstruction technique

    NASA Astrophysics Data System (ADS)

    Shi, Shengxian; Ding, Junfei; New, T. H.; Soria, Julio

    2017-07-01

    This paper presents a dense ray tracing reconstruction technique for single light-field camera-based particle image velocimetry. The new approach pre-determines the location of a particle through inverse dense ray tracing and reconstructs the voxel values using the multiplicative algebraic reconstruction technique (MART). Simulation studies were undertaken to identify the effects of iteration number, relaxation factor, particle density, voxel-pixel ratio and velocity gradient on the performance of the proposed dense ray tracing-based MART method (DRT-MART). The results demonstrate that the DRT-MART method achieves higher reconstruction resolution at significantly better computational efficiency than the MART method (4-50 times faster). Both the DRT-MART and MART approaches were applied to measure the velocity field of a low-speed jet flow, revealing that for the same computational cost the DRT-MART method accurately resolves the jet velocity field with improved precision, especially for the velocity component along the depth direction.
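
    For orientation, the MART voxel update shared by both methods can be sketched as follows (a dense, unoptimized toy version; the weight matrix, relaxation factor and iteration count are placeholders, and the DRT pre-selection of candidate voxels is only indicated in the comments):

```python
import numpy as np

def mart(W, b, n_vox, n_iter=5, mu=1.0):
    """Multiplicative ART: each voxel intensity is scaled by the ratio of the
    measured pixel value to its current projection, raised to the relaxation-
    weighted ray weight, so voxels with zero weight stay untouched.
    A DRT-style variant would first zero every voxel that inverse ray tracing
    rules out as a particle location, shrinking the effective problem."""
    x = np.ones(n_vox)
    for _ in range(n_iter):
        for i in range(len(b)):          # one multiplicative update per pixel
            proj = W[i] @ x
            if proj > 0.0:
                x *= (b[i] / proj) ** (mu * W[i])
    return x

# Two pixels, three voxels; W[i, j] is the weight of voxel j in pixel i:
W = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])
print(mart(W, b=np.array([2.0, 1.0]), n_vox=3))
```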

  6. Synchrotron-induced X-ray fluorescence from rat bone and lumbar vertebra of different age groups

    NASA Astrophysics Data System (ADS)

    Rao, Donepudi V.; Swapna, Medasani; Cesareo, Roberto; Brunetti, Antonio; Akatsuka, Tako; Yuasa, Tetsuya; Takeda, Tohoru; Tromba, Giuliana; Gigante, Giovanni E.

    2009-02-01

    The fluorescence spectra from rat bones of different age groups (8, 56 and 78 weeks) and lumbar vertebra were measured with 8, 10 and 12 keV synchrotron X-rays. We utilized the new hard X-ray micro-spectroscopy beamline facility, X27A, available at NSLS, with a primary beam spot size of the order of ˜10 μm. With this spatial resolution and high flux throughput, X-ray fluorescent intensities for Ca and other trace elements were measured using a liquid-nitrogen-cooled 13-element energy-dispersive high-purity germanium detector. For the lumbar vertebra, we acquired the fluorescence spectra from the left, right and middle portions, and calcium accumulation was evaluated and compared with the other samples. We identified the major trace elements Ca, Ni, Fe and Zn and the minor trace elements Ti, Cr and Mn in the samples. The percentages of scattered radiation and the trace element contributions from these samples were highlighted at different energies.

  7. Backward and forward Monte Carlo method for vector radiative transfer in a two-dimensional graded index medium

    NASA Astrophysics Data System (ADS)

    Qian, Lin-Feng; Shi, Guo-Dong; Huang, Yong; Xing, Yu-Ming

    2017-10-01

    In vector radiative transfer, backward ray tracing is seldom used. We present a backward and forward Monte Carlo method to simulate vector radiative transfer in a two-dimensional graded index medium, which is new and different from the conventional Monte Carlo method. The backward and forward Monte Carlo method divides the ray tracing into two processes: backward tracing and forward tracing. In multidimensional graded index media, the trajectory of a ray is usually a three-dimensional curve. During the transport of a polarization ellipse, the curved ray trajectory will induce geometrical effects and cause the Stokes parameters to change continuously. The solution processes for a non-scattering medium and an anisotropic scattering medium are analysed. We also analyse some parameters that influence the Stokes vector in two-dimensional graded index media. The research shows that the Q component of the Stokes vector cannot be ignored, whereas the U and V components are very small.
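
    One concrete piece of the bookkeeping described above is re-expressing the Stokes vector in a reference frame that twists along the curved ray; a standard Mueller rotation step, sketched here for illustration (our rendering, not the authors' scheme):

```python
import numpy as np

def rotate_stokes(S, psi):
    """Mueller rotation: re-express a Stokes vector (I, Q, U, V) in a frame
    rotated by angle psi about the propagation direction, the bookkeeping a
    curved ray trajectory forces at every path increment."""
    c, s = np.cos(2.0 * psi), np.sin(2.0 * psi)
    R = np.array([[1.0, 0.0, 0.0, 0.0],
                  [0.0,   c,   s, 0.0],
                  [0.0,  -s,   c, 0.0],
                  [0.0, 0.0, 0.0, 1.0]])
    return R @ S

S = np.array([1.0, 0.3, 0.0, 0.0])       # partially linearly polarized ray
print(rotate_stokes(S, np.radians(30)))  # Q redistributes into U
```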

  8. Trace element abundance determinations by Synchrotron X Ray Fluorescence (SXRF) on returned comet nucleus mineral grains

    NASA Technical Reports Server (NTRS)

    Flynn, G. J.; Sutton, S. R.

    1989-01-01

    Trace element analyses were performed on bulk cosmic dust particles by Proton Induced X Ray Emission (PIXE) and Synchrotron X Ray Fluorescence (SXRF). When present at or near chondritic abundances the trace elements K, Ti, Cr, Mn, Cu, Zn, Ga, Ge, Se, and Br are presently detectable by SXRF in particles of 20 micron diameter. Improvements to the SXRF analysis facility at the National Synchrotron Light Source presently underway should increase the range of detectable elements and permit the analysis of smaller samples. In addition the Advanced Photon Source will be commissioned at Argonne National Laboratory in 1995. This 7 to 8 GeV positron storage ring, specifically designed for high-energy undulator and wiggler insertion devices, will be an ideal source for an x ray microprobe with one micron spatial resolution and better than 100 ppb elemental sensitivity for most elements. Thus trace element analysis of individual micron-sized grains should be possible by the time of the comet nucleus sample return mission.

  9. Registration of pencil beam proton radiography data with X-ray CT.

    PubMed

    Deffet, Sylvain; Macq, Benoît; Righetto, Roberto; Vander Stappen, François; Farace, Paolo

    2017-10-01

    Proton radiography seems to be a promising tool for assessing the quality of the stopping power computation in proton therapy. However, range error maps obtained on the basis of proton radiographs are very sensitive to small misalignment between the planning CT and the proton radiography acquisitions. In order to be able to mitigate misalignment in postprocessing, the authors implemented a fast method for registration between pencil proton radiography data obtained with a multilayer ionization chamber (MLIC) and an X-ray CT acquired on a head phantom. The registration was performed by optimizing a cost function which performs a comparison between the acquired data and simulated integral depth-dose curves. Two methodologies were considered, one based on dual orthogonal projections and the other one on a single projection. For each methodology, the robustness of the registration algorithm with respect to three confounding factors (measurement noise, CT calibration errors, and spot spacing) was investigated by testing the accuracy of the method through simulations based on a CT scan of a head phantom. The present registration method showed robust convergence towards the optimal solution. For the level of measurement noise and the uncertainty in the stopping power computation expected in proton radiography using a MLIC, the accuracy appeared to be better than 0.3° for angles and 0.3 mm for translations by use of the appropriate cost function. The spot spacing analysis showed that a spacing larger than the 5 mm used by other authors for the investigation of a MLIC for proton radiography led to results with absolute accuracy better than 0.3° for angles and 1 mm for translations when orthogonal proton radiographs were fed into the algorithm. In the case of a single projection, 6 mm was the largest spot spacing presenting an acceptable registration accuracy. For registration of proton radiography data with X-ray CT, the use of a direct ray-tracing algorithm to compute sums of squared differences and corrections of range errors showed very good accuracy and robustness with respect to three confounding factors: measurement noise, calibration error, and spot spacing. It is therefore a suitable algorithm to use in the in vivo range verification framework, allowing one to separate in postprocessing the proton range uncertainty due to setup errors from the other sources of uncertainty. © 2017 American Association of Physicists in Medicine.

  10. Polarized reflectance and transmittance properties of windblown sea surfaces.

    PubMed

    Mobley, Curtis D

    2015-05-20

    Generation of random sea surfaces using wave variance spectra and Fourier transforms is formulated in a way that guarantees conservation of wave energy and fully resolves wave height and slope variances. Monte Carlo polarized ray tracing, which accounts for multiple scattering between light rays and wave facets, is used to compute effective Mueller matrices for reflection and transmission of air- or water-incident polarized radiance. Irradiance reflectances computed using a Rayleigh sky radiance distribution, sea surfaces generated with Cox-Munk statistics, and unpolarized ray tracing differ by 10%-18% compared with values computed using elevation- and slope-resolving surfaces and polarized ray tracing. Radiance reflectance factors, as used to estimate water-leaving radiance from measured upwelling and sky radiances, are shown to depend on sky polarization, and improved values are given.
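
    A one-dimensional toy version of the spectrum-to-surface step (the paper works in two dimensions with wind-wave variance spectra; the spectrum, scaling convention and parameters below are illustrative assumptions): random phases are attached to spectral amplitudes chosen so that the surface variance reproduces the integrated spectrum, and an inverse FFT yields the elevations.

```python
import numpy as np

def sea_surface_1d(S, L, N, seed=0):
    """Draw one random surface realization z(x) from a one-sided wave variance
    spectrum S(k) via an inverse FFT.  Amplitudes are scaled so that the
    variance of z equals the integral of S(k) dk (energy conservation)."""
    rng = np.random.default_rng(seed)
    k = 2.0 * np.pi * np.fft.rfftfreq(N, d=L / N)   # wavenumbers, rad per unit
    dk = k[1] - k[0]
    amp = np.sqrt(0.5 * S(k) * dk)       # half the bin variance per complex bin
    spec = amp * np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, k.size))
    spec[0] = 0.0                        # zero-mean surface elevation
    z = np.fft.irfft(spec, n=N) * N
    return np.linspace(0.0, L, N, endpoint=False), z

# Toy spectrum: k^-3 decay above a spectral peak (a placeholder, not a
# literature wind-wave spectrum):
x, z = sea_surface_1d(lambda k: 1e-4 * np.clip(k, 0.1, None) ** -3.0 * (k > 0.1),
                      L=200.0, N=1024)
```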

  11. Ray tracing on the MPP

    NASA Technical Reports Server (NTRS)

    Dorband, John E.

    1987-01-01

    Generating graphics to faithfully represent information can be a computationally intensive task. A way of using the Massively Parallel Processor to generate images by ray tracing is presented. This technique uses sort computation, a method of performing generalized routing interspersed with computation on a single-instruction-multiple-data (SIMD) computer.

  12. Publications - GMC 265 | Alaska Division of Geological & Geophysical

    Science.gov Websites

    DGGS GMC 265 Publication Details. Title: Clautice, K.H., 1996, X-ray fluorescence trace element data of the U.S. Bureau of Mines Idaho Gulch (Tofty

  13. Investigation of the validity of radiosity for sound-field prediction in cubic rooms

    NASA Astrophysics Data System (ADS)

    Nosal, Eva-Marie; Hodgson, Murray; Ashdown, Ian

    2004-12-01

    This paper explores acoustical (or time-dependent) radiosity using predictions made in four cubic enclosures. The methods and algorithms used are those presented in a previous paper by the same authors [Nosal, Hodgson, and Ashdown, J. Acoust. Soc. Am. 116(2), 970-980 (2004)]. First, the algorithm, methods, and conditions for convergence are investigated by comparison of numerous predictions for the four cubic enclosures. Here, variables and parameters used in the predictions are varied to explore the effect of absorption distribution, the necessary conditions for convergence of the numerical solution to the analytical solution, form-factor prediction methods, and the computational requirements. The predictions are also used to investigate the effect of absorption distribution on sound fields in cubic enclosures with diffusely reflecting boundaries. Acoustical radiosity is then compared to predictions made in the four enclosures by a ray-tracing model that can account for diffuse reflection. Comparisons are made of echograms, room-acoustical parameters, and discretized echograms.

  14. Investigation of the validity of radiosity for sound-field prediction in cubic rooms.

    PubMed

    Nosal, Eva-Marie; Hodgson, Murray; Ashdown, Ian

    2004-12-01

    This paper explores acoustical (or time-dependent) radiosity using predictions made in four cubic enclosures. The methods and algorithms used are those presented in a previous paper by the same authors [Nosal, Hodgson, and Ashdown, J. Acoust. Soc. Am. 116(2), 970-980 (2004)]. First, the algorithm, methods, and conditions for convergence are investigated by comparison of numerous predictions for the four cubic enclosures. Here, variables and parameters used in the predictions are varied to explore the effect of absorption distribution, the necessary conditions for convergence of the numerical solution to the analytical solution, form-factor prediction methods, and the computational requirements. The predictions are also used to investigate the effect of absorption distribution on sound fields in cubic enclosures with diffusely reflecting boundaries. Acoustical radiosity is then compared to predictions made in the four enclosures by a ray-tracing model that can account for diffuse reflection. Comparisons are made of echograms, room-acoustical parameters, and discretized echograms.
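
    For readers unfamiliar with time-dependent radiosity, a minimal two-patch sketch illustrates the delayed energy exchange that builds an echogram; the geometry, form factors, and absorption values are invented, and this is not the authors' algorithm:

        import numpy as np

        c = 343.0                     # speed of sound (m/s)
        dt = 1e-3                     # time step of the echogram (s)
        steps = 400
        d = 3.43                      # patch separation (m), i.e. a 10 ms delay
        delay = int(round(d / c / dt))
        alpha = [0.3, 0.1]            # absorption coefficient of each patch
        F = [[0.0, 1.0], [1.0, 0.0]]  # form factors: all energy reaches the other patch

        E = np.zeros((2, steps))      # energy re-radiated per patch per time step
        E[0, 0] = 1.0                 # an impulsive source illuminates patch 0 at t = 0

        for t in range(steps - delay):
            for i in range(2):
                for j in range(2):
                    # Energy leaving patch i reaches patch j one delay later,
                    # weighted by the form factor and reduced by absorption at j.
                    E[j, t + delay] += E[i, t] * F[i][j] * (1.0 - alpha[j])

        echogram = E.sum(axis=0)      # energy-time curve, cf. the compared echograms
        print(echogram[:5 * delay])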

  15. A Compressed Sensing-based Image Reconstruction Algorithm for Solar Flare X-Ray Observations

    NASA Astrophysics Data System (ADS)

    Felix, Simon; Bolzern, Roman; Battaglia, Marina

    2017-11-01

    One way of imaging X-ray emission from solar flares is to measure Fourier components of the spatial X-ray source distribution. We present a new compressed sensing-based algorithm named VIS_CS, which reconstructs the spatial distribution from such Fourier components. We demonstrate the application of the algorithm on synthetic and observed solar flare X-ray data from the Reuven Ramaty High Energy Solar Spectroscopic Imager satellite and compare its performance with existing algorithms. VIS_CS produces competitive results with accurate photometry and morphology, without requiring any algorithm- and X-ray-source-specific parameter tuning. Its robustness and performance make this algorithm ideally suited for the generation of quicklook images or large image cubes without user intervention, such as for imaging spectroscopy analysis.
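
    The underlying reconstruction problem, a sparse image observed through a subset of its Fourier components, can be illustrated with a generic iterative soft-thresholding (ISTA) loop on a toy 1-D signal. This is a stand-in sketch, not the VIS_CS algorithm; the sizes and threshold are invented:

        import numpy as np

        rng = np.random.default_rng(1)
        n, m = 128, 40                     # signal length, number of Fourier samples
        x_true = np.zeros(n)
        x_true[rng.choice(n, 5, replace=False)] = rng.normal(0.0, 1.0, 5)

        rows = rng.choice(n, m, replace=False)
        F = np.fft.fft(np.eye(n))[rows] / np.sqrt(n)   # partial DFT operator
        y = F @ x_true                                  # measured "visibilities"

        # ISTA: a gradient step on the data-fit term followed by soft thresholding
        # (the proximal operator of the sparsity-promoting l1 penalty).
        lam = 0.01
        x = np.zeros(n, dtype=complex)
        for _ in range(500):
            g = x + F.conj().T @ (y - F @ x)   # Lipschitz constant of F^H F is <= 1
            mag = np.abs(g)
            x = g * np.maximum(1.0 - lam / np.maximum(mag, 1e-12), 0.0)

        print("max reconstruction error:", np.abs(x.real - x_true).max())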

  16. A Compressed Sensing-based Image Reconstruction Algorithm for Solar Flare X-Ray Observations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Felix, Simon; Bolzern, Roman; Battaglia, Marina

    One way of imaging X-ray emission from solar flares is to measure Fourier components of the spatial X-ray source distribution. We present a new compressed sensing-based algorithm named VIS_CS, which reconstructs the spatial distribution from such Fourier components. We demonstrate the application of the algorithm on synthetic and observed solar flare X-ray data from the Reuven Ramaty High Energy Solar Spectroscopic Imager satellite and compare its performance with existing algorithms. VIS_CS produces competitive results with accurate photometry and morphology, without requiring any algorithm- and X-ray-source-specific parameter tuning. Its robustness and performance make this algorithm ideally suited for the generation of quicklook images or large image cubes without user intervention, such as for imaging spectroscopy analysis.

  17. Optimization of an FPGA Trigger Based on an Artificial Neural Network for the Detection of Neutrino-Induced Air Showers

    NASA Astrophysics Data System (ADS)

    Szadkowski, Zbigniew; Głas, Dariusz; Pytel, Krzysztof; Wiedeński, Michał

    2017-06-01

    Neutrinos play a fundamental role in the understanding of the origin of ultrahigh-energy cosmic rays. They interact through charged and neutral currents in the atmosphere, generating extensive air showers. However, the very low rate of events potentially generated by neutrinos is a significant challenge for detection techniques and requires both sophisticated algorithms and high-resolution hardware. Air showers initiated by protons and muon neutrinos at various altitudes, angles, and energies were simulated in CORSIKA and the Auger OffLine event reconstruction platforms, giving analog-to-digital converter (ADC) patterns in Auger water Cherenkov detectors on the ground. The proton interaction cross section is high, so proton “old” showers start their development early in the atmosphere. In contrast to this, neutrinos can generate “young” showers deep in the atmosphere, relatively close to the detectors. Differences between “old” proton and “young” neutrino showers are visible in the attenuation factors of the ADC waveforms. For the separation of “old” proton and “young” neutrino ADC traces, many three-layer artificial neural networks (ANNs) were tested. They were trained in MATLAB (in a dedicated way: only “old” proton and “young” neutrino showers were used as patterns) on simulated ADC traces according to the Levenberg-Marquardt algorithm. Unexpectedly, the recognition efficiency is found to be almost independent of the size of the networks. The ANN trigger based on a selected 8-6-1 network was tested in the Cyclone V E FPGA 5CEFA9F31I7, the heart of prototype front-end boards developed for testing new algorithms in the Pierre Auger surface detectors.
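
    A toy stand-in for the shower-classification step is sketched below: an 8-6-1 sigmoid network trained on synthetic fast-decay versus slow-decay traces. The paper trained its networks in MATLAB with the Levenberg-Marquardt algorithm; the plain gradient descent and synthetic features here are illustrative substitutions:

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic stand-ins for ADC-trace features: 8 attenuation-like samples.
        # "Old" proton showers (label 0) decay faster than "young" ones (label 1).
        def make_traces(n, decay):
            t = np.arange(8)
            return np.exp(-decay[:, None] * t) + 0.05 * rng.standard_normal((n, 8))

        X = np.vstack([make_traces(200, rng.uniform(0.8, 1.2, 200)),
                       make_traces(200, rng.uniform(0.1, 0.4, 200))])
        y = np.concatenate([np.zeros(200), np.ones(200)])

        # An 8-6-1 network with sigmoid activations, trained by gradient descent.
        W1 = rng.normal(0.0, 0.5, (8, 6)); b1 = np.zeros(6)
        W2 = rng.normal(0.0, 0.5, (6, 1)); b2 = np.zeros(1)
        sig = lambda z: 1.0 / (1.0 + np.exp(-z))

        for _ in range(2000):
            h = sig(X @ W1 + b1)                 # hidden layer (6 units)
            p = sig(h @ W2 + b2).ravel()         # output unit
            dz2 = (p - y)[:, None] / len(y)      # cross-entropy gradient at the output
            dz1 = (dz2 @ W2.T) * h * (1.0 - h)   # backpropagated to the hidden layer
            W2 -= h.T @ dz2;  b2 -= dz2.sum(0)
            W1 -= X.T @ dz1;  b1 -= dz1.sum(0)

        p = sig(sig(X @ W1 + b1) @ W2 + b2).ravel()
        print("training accuracy:", np.mean((p > 0.5) == y))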

  18. Active learning of neuron morphology for accurate automated tracing of neurites

    PubMed Central

    Gala, Rohan; Chapeton, Julio; Jitesh, Jayant; Bhavsar, Chintan; Stepanyants, Armen

    2014-01-01

    Automating the process of neurite tracing from light microscopy stacks of images is essential for large-scale or high-throughput quantitative studies of neural circuits. While the general layout of labeled neurites can be captured by many automated tracing algorithms, it is often not possible to differentiate reliably between the processes belonging to different cells. The reason is that some neurites in the stack may appear broken due to imperfect labeling, while others may appear fused due to the limited resolution of optical microscopy. Trained neuroanatomists routinely resolve such topological ambiguities during manual tracing tasks by combining information about distances between branches, branch orientations, intensities, calibers, tortuosities, colors, as well as the presence of spines or boutons. Likewise, to evaluate different topological scenarios automatically, we developed a machine learning approach that combines many of the above-mentioned features. A specifically designed confidence measure was used to actively train the algorithm during the user-assisted tracing procedure. Active learning significantly reduces the training time and makes it possible to obtain generalization error rates of less than 1% from only a few training examples. To evaluate the overall performance of the algorithm, a number of image stacks were reconstructed automatically, as well as manually by several trained users, making it possible to compare the automated traces to the baseline inter-user variability. Several geometrical and topological features of the traces were selected for the comparisons. These features include the total trace length, the total numbers of branch and terminal points, the affinity of corresponding traces, and the distances between corresponding branch and terminal points. Our results show that when the density of labeled neurites is sufficiently low, automated traces are not significantly different from manual reconstructions obtained by trained users. PMID:24904306
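
    The active-learning loop itself is generic and can be sketched with an off-the-shelf classifier; synthetic features and a least-confidence query rule stand in for the paper's specifically designed confidence measure:

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        # Synthetic stand-ins for branch-merge features (distances, orientations,
        # calibers, ...), with a linear rule plus noise defining the "true" labels.
        X = rng.standard_normal((1000, 5))
        y = (X @ np.array([1.5, -2.0, 0.5, 0.0, 1.0])
             + 0.3 * rng.standard_normal(1000)) > 0

        labeled = list(np.where(y)[0][:5]) + list(np.where(~y)[0][:5])  # seed set
        pool = [i for i in range(1000) if i not in labeled]

        for _ in range(20):                             # active-learning rounds
            clf = LogisticRegression().fit(X[labeled], y[labeled])
            prob = clf.predict_proba(X[pool])[:, 1]
            # Least-confidence query: ask the user to label the most ambiguous case.
            i = pool[int(np.argmin(np.abs(prob - 0.5)))]
            labeled.append(i)
            pool.remove(i)

        print("generalization error:", 1.0 - clf.score(X, y))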

  19. Performance characteristics of a perforated shadow band under clear sky conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brooks, Michael J.

    2010-12-15

    A perforated, non-rotating shadow band is described for separating global solar irradiance into its diffuse and direct normal components using a single pyranometer. Whereas shadow bands are normally solid so as to occult the sensor of a pyranometer throughout the day, the proposed band has apertures cut from its circumference to intermittently expose the instrument sensor at preset intervals. Under clear sky conditions the device produces a sawtooth waveform of irradiance data from which it is possible to reconstruct separate global and diffuse curves. The direct normal irradiance may then be calculated, giving a complete breakdown of the irradiance curves without need of a second instrument or rotating shadow band. This paper describes the principle of operation of the band and gives a mathematical model of its shading mask based on the results of an optical ray tracing study. An algorithm for processing the data from the perforated band system is described and evaluated. In an extended trial conducted at NREL's Solar Radiation Research Laboratory, the band coupled with a thermally corrected Eppley PSP produced independent curves for diffuse, global and direct normal irradiance with low mean bias errors of 5.6 W/m², 0.3 W/m² and -2.6 W/m² respectively, relative to collocated reference instruments. Random uncertainties were 9.7 W/m² (diffuse), 17.3 W/m² (global) and 19.0 W/m² (direct). When the data processing algorithm was modified to include the ray trace model of sensor exposure, uncertainties increased only marginally, confirming the effectiveness of the model. Deployment of the perforated band system can potentially increase the accuracy of data from ground stations in predominantly sunny areas where instrumentation is limited to a single pyranometer.

  20. Trace: a high-throughput tomographic reconstruction engine for large-scale datasets

    DOE PAGES

    Bicer, Tekin; Gursoy, Doga; Andrade, Vincent De; ...

    2017-01-28

    Here, synchrotron light source and detector technologies enable scientists to perform advanced experiments. These scientific instruments and experiments produce data at such scale and complexity that large-scale computation is required to unleash their full power. One of the widely used data acquisition techniques at light sources is computed tomography, which can generate tens of GB/s depending on the x-ray range. A large-scale tomographic dataset, such as a mouse brain, may require hours of computation time with a medium-sized workstation. In this paper, we present Trace, a data-intensive computing middleware we developed for the implementation and parallelization of iterative tomographic reconstruction algorithms. Trace provides fine-grained reconstruction of tomography datasets using both (thread-level) shared memory and (process-level) distributed memory parallelization. Trace utilizes a special data structure called the replicated reconstruction object to maximize application performance. We also present the optimizations we have done on the replicated reconstruction objects and evaluate them using a shale and a mouse brain sinogram. Our experimental evaluations show that the applied optimizations and parallelization techniques can provide 158x speedup (using 32 compute nodes) over a single-core configuration, which decreases the reconstruction time of a sinogram (with 4501 projections and 22400 detector resolution) from 12.5 hours to less than 5 minutes per iteration.
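
    The replicated-reconstruction-object idea (each worker accumulates its partial update privately, and the replicas are reduced afterwards) can be sketched for a SIRT-like iteration on a toy dense system. This is not the Trace code; the sizes, step size, and thread count are invented:

        import numpy as np
        from concurrent.futures import ThreadPoolExecutor

        rng = np.random.default_rng(0)
        n_pix, n_rays, n_workers = 256, 1024, 4
        A = rng.standard_normal((n_rays, n_pix))   # toy system matrix (ray/pixel weights)
        x_true = rng.random(n_pix)
        b = A @ x_true                              # measured sinogram values

        x = np.zeros(n_pix)
        step = 1.0 / np.linalg.norm(A, ord=2) ** 2  # safe gradient step size
        blocks = np.array_split(np.arange(n_rays), n_workers)

        def partial_update(rows):
            # Each worker accumulates into its own replica of the update, so no
            # synchronization is needed while it processes its block of rays.
            Ab, bb = A[rows], b[rows]
            return Ab.T @ (bb - Ab @ x)

        with ThreadPoolExecutor(n_workers) as ex:
            for _ in range(200):                    # SIRT-like iterations
                replicas = list(ex.map(partial_update, blocks))
                x = x + step * sum(replicas)        # reduce the replicas

        print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))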

  1. Developmental long trace profiler using optimally aligned mirror based pentaprism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barber, Samuel K; Morrison, Gregory Y.; Yashchuk, Valeriy V.

    2010-07-21

    A low-budget surface slope measuring instrument, the Developmental Long Trace Profiler (DLTP), was recently brought into operation at the Advanced Light Source Optical Metrology Laboratory [Nucl. Instr. and Meth. A 616, 212-223 (2010)]. The instrument is based on a precisely calibrated autocollimator and a movable pentaprism. The capability of the DLTP to achieve sub-microradian surface slope metrology has been verified via cross-comparison measurements with other high-performance slope measuring instruments when measuring the same high-quality test optics. In the present work, a further improvement of the DLTP is achieved by replacing the existing bulk pentaprism with a specially designed mirror-based pentaprism. A mirror-based pentaprism offers the possibility to eliminate systematic errors introduced by inhomogeneity of the optical material and fabrication imperfections of a bulk pentaprism. We provide the details of the mirror-based pentaprism design and describe an original experimental procedure for precision mutual alignment of the mirrors. The algorithm of the alignment procedure and its efficiency are verified with rigorous ray tracing simulations. Results of measurements of a spherically curved test mirror and a flat test mirror using the original bulk pentaprism are compared with measurements using the new mirror-based pentaprism, demonstrating the improved performance.

  2. Trace: a high-throughput tomographic reconstruction engine for large-scale datasets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bicer, Tekin; Gursoy, Doga; Andrade, Vincent De

    Here, synchrotron light source and detector technologies enable scientists to perform advanced experiments. These scientific instruments and experiments produce data at such scale and complexity that large-scale computation is required to unleash their full power. One of the widely used data acquisition techniques at light sources is computed tomography, which can generate tens of GB/s depending on the x-ray range. A large-scale tomographic dataset, such as a mouse brain, may require hours of computation time with a medium-sized workstation. In this paper, we present Trace, a data-intensive computing middleware we developed for the implementation and parallelization of iterative tomographic reconstruction algorithms. Trace provides fine-grained reconstruction of tomography datasets using both (thread-level) shared memory and (process-level) distributed memory parallelization. Trace utilizes a special data structure called the replicated reconstruction object to maximize application performance. We also present the optimizations we have done on the replicated reconstruction objects and evaluate them using a shale and a mouse brain sinogram. Our experimental evaluations show that the applied optimizations and parallelization techniques can provide 158x speedup (using 32 compute nodes) over a single-core configuration, which decreases the reconstruction time of a sinogram (with 4501 projections and 22400 detector resolution) from 12.5 hours to less than 5 minutes per iteration.

  3. RAY-RAMSES: a code for ray tracing on the fly in N-body simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barreira, Alexandre; Llinares, Claudio; Bose, Sownak

    2016-05-01

    We present a ray tracing code to compute integrated cosmological observables on the fly in AMR N-body simulations. Unlike conventional ray tracing techniques, our code takes full advantage of the time and spatial resolution attained by the N-body simulation by computing the integrals along the line of sight on a cell-by-cell basis through the AMR simulation grid. Moreover, since it runs on the fly in the N-body run, our code can produce maps of the desired observables without storing large (or any) amounts of data for post-processing. We implemented our routines in the RAMSES N-body code and tested the implementation using an example of a weak lensing simulation. We analyse basic statistics of lensing convergence maps and find good agreement with semi-analytical methods. The ray tracing methodology presented here can be used in several cosmological analyses, such as Sunyaev-Zel'dovich and integrated Sachs-Wolfe effect studies, as well as modified gravity. Our code can also be used in cross-checks of the more conventional methods, which can be important in tests of theory systematics in preparation for upcoming large scale structure surveys.
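
    In its simplest flat-sky form, the cell-by-cell line-of-sight integration reduces to accumulating a weighted density along a grid axis. The sketch below uses an invented density field and a simplified lensing kernel with all prefactors dropped:

        import numpy as np

        rng = np.random.default_rng(0)
        N, L = 64, 100.0                      # grid cells per side, box size (Mpc/h)
        delta = rng.normal(0, 1, (N, N, N))   # toy density-contrast field
        dchi = L / N                          # comoving path length per cell

        chi_s = L                             # source plane at the far side of the box
        def lensing_weight(chi):
            # Simplified kernel ~ chi * (chi_s - chi) / chi_s (prefactors dropped).
            return chi * (chi_s - chi) / chi_s

        # Integrate along the z-axis, cell by cell, for every (x, y) line of sight,
        # accumulating on the fly instead of storing the full 3-D field afterwards.
        kappa = np.zeros((N, N))
        for iz in range(N):
            chi = (iz + 0.5) * dchi
            kappa += lensing_weight(chi) * delta[:, :, iz] * dchi

        print("convergence map rms:", kappa.std())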

  4. The Laser Level as an Optics Laboratory Tool

    ERIC Educational Resources Information Center

    Kutzner, Mickey

    2013-01-01

    For decades now, the laser has been used as a handy device for performing ray traces in geometrical optics demonstrations and laboratories. For many ray-trace applications, I have found the laser level to be even more visually compelling and easier for students to use than the laser pointer.

  5. Chamber Optics for Testing Passive Remote Sensing Vapor Detectors

    DTIC Science & Technology

    1993-11-01

    Aberdeen Proving Ground, Maryland 21010-6423. ... were tried; ray tracing proved to be the most useful. Rays were iteratively traced through every element using paraxial equations.

  6. Seismic velocity estimation from time migration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cameron, Maria Kourkina

    2007-01-01

    This work is concerned with imaging and wave propagation in nonhomogeneous media, and includes a collection of computational techniques, such as level set methods with material transport, Dijkstra-like Hamilton-Jacobi solvers for first-arrival Eikonal equations, and techniques for data smoothing. The theoretical components include aspects of seismic ray theory, and the results rely on careful comparison with experiment and incorporation as input into large production-style geophysical processing codes. Producing an accurate image of the Earth's interior is a challenging aspect of oil recovery and earthquake analysis. The ultimate computational goal, which is to accurately produce a detailed interior map of the Earth's makeup on the basis of external soundings and measurements, is currently out of reach for several reasons. First, although vast amounts of data have been obtained in some regions, this has not been done uniformly, and the data contain noise and artifacts. Simply sifting through the data is a massive computational job. Second, the fundamental inverse problem, namely to deduce the local sound speeds of the earth that give rise to measured reflected signals, is exceedingly difficult: shadow zones and complex structures can make for ill-posed problems, and require vast computational resources. Nonetheless, seismic imaging is a crucial part of the oil and gas industry. Typically, one makes assumptions about the earth's substructure (such as laterally homogeneous layering), and then uses this model as input to an iterative procedure to build perturbations that more closely satisfy the measured data. Such models often break down when the material substructure is significantly complex: not surprisingly, this is often where the most interesting geological features lie. Data often come in a particular, somewhat non-physical coordinate system, known as time migration coordinates. The construction of substructure models from these data is less and less reliable as the earth becomes horizontally nonconstant. Even mild lateral velocity variations can significantly distort subsurface structures on the time migrated images. Conversely, depth migration provides the potential for more accurate reconstructions, since it can handle significant lateral variations. However, this approach requires good input data, known as a 'velocity model'. We address the problem of estimating seismic velocities inside the earth, i.e., the problem of constructing a velocity model, which is necessary for obtaining seismic images in regular Cartesian coordinates. The main goals are to develop algorithms to convert time-migration velocities to true seismic velocities, and to convert time-migrated images to depth images in regular Cartesian coordinates. Our main results are three-fold. First, we establish a theoretical relation between the true seismic velocities and the 'time migration velocities' using paraxial ray tracing. Second, we formulate an appropriate inverse problem describing the relation between time migration velocities and depth velocities, and show that this problem is mathematically ill-posed, i.e., unstable to small perturbations. Third, we develop numerical algorithms to solve regularized versions of these equations, which can be used to recover smoothed velocity variations. Our algorithms consist of efficient time-to-depth conversion algorithms, based on Dijkstra-like Fast Marching Methods, as well as level set and ray tracing algorithms for transforming Dix velocities into seismic velocities. Our algorithms are applied to both two-dimensional and three-dimensional problems, and we test them on a collection of both synthetic examples and field data.
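
    One concrete ingredient of such pipelines is the classical Dix relation between RMS (time-migration) velocities and interval velocities. The sketch below shows the textbook formula, not the paper's regularized algorithms, and hints at where the instability enters:

        import numpy as np

        # Dix inversion: interval velocity of layer n from RMS velocities v picked
        # at two-way times t:
        #   v_int_n^2 = (t_n v_n^2 - t_{n-1} v_{n-1}^2) / (t_n - t_{n-1}).
        t = np.array([0.2, 0.5, 0.9, 1.4])                   # two-way times (s)
        v_rms = np.array([1500.0, 1700.0, 2000.0, 2300.0])   # RMS velocities (m/s)

        num = np.diff(t * v_rms ** 2, prepend=0.0)
        den = np.diff(t, prepend=0.0)
        v_int = np.sqrt(num / den)   # noisy picks can drive `num` negative, one
                                     # face of the ill-posedness discussed above
        print(v_int)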

  7. Dual-energy fluorescent x-ray computed tomography system with a pinhole design: Use of K-edge discontinuity for scatter correction

    PubMed Central

    Sasaya, Tenta; Sunaguchi, Naoki; Thet-Lwin, Thet-; Hyodo, Kazuyuki; Zeniya, Tsutomu; Takeda, Tohoru; Yuasa, Tetsuya

    2017-01-01

    We propose a pinhole-based fluorescent x-ray computed tomography (p-FXCT) system with a 2-D detector and volumetric beam that can suppress the quality deterioration caused by scatter components. In the corresponding p-FXCT technique, projections are acquired at individual incident energies just above and below the K-edge of the imaged trace element; then, reconstruction is performed based on the two sets of projections using a maximum likelihood expectation maximization algorithm that incorporates the scatter components. We constructed a p-FXCT imaging system and performed a preliminary experiment using a physical phantom and an iodine (I) imaging agent. The proposed dual-energy p-FXCT improved the contrast-to-noise ratio by a factor of more than 2.5 compared to that attainable using mono-energetic p-FXCT for a 0.3 mg/ml iodine solution. We also imaged an excised rat’s liver infused with a barium (Ba) contrast agent to demonstrate the feasibility of imaging a biological sample. PMID:28272496

  8. Design and analysis of radiometric instruments using high-level numerical models and genetic algorithms

    NASA Astrophysics Data System (ADS)

    Sorensen, Ira Joseph

    A primary objective of the effort reported here is to develop a radiometric instrument modeling environment that provides complete end-to-end numerical models of radiometric instruments, integrating the optical, electrothermal, and electronic systems. The modeling environment consists of a Monte Carlo ray-trace (MCRT) model of the optical system coupled to a transient, three-dimensional finite-difference electrothermal model of the detector assembly with an analytic model of the signal-conditioning circuitry. The environment provides a complete simulation of the dynamic optical and electrothermal behavior of the instrument. The modeling environment is used to create an end-to-end model of the CERES scanning radiometer, and its performance is compared to the performance of an operational CERES total channel as a benchmark. A further objective of this effort is to formulate an efficient design environment for radiometric instruments. To this end, the modeling environment is combined with evolutionary search algorithms known as genetic algorithms (GAs) to develop a methodology for optimal instrument design using high-level radiometric instrument models. GAs are applied to the design of the optical system and detector system separately, and to both as an aggregate function, with positive results.

  9. Tensor completion for estimating missing values in visual data.

    PubMed

    Liu, Ji; Musialski, Przemyslaw; Wonka, Peter; Ye, Jieping

    2013-01-01

    In this paper, we propose an algorithm to estimate missing values in tensors of visual data. The values can be missing due to problems in the acquisition process or because the user manually identified unwanted outliers. Our algorithm works even with a small amount of samples and it can propagate structure to fill larger missing regions. Our methodology is built on recent studies about matrix completion using the matrix trace norm. The contribution of our paper is to extend the matrix case to the tensor case by proposing the first definition of the trace norm for tensors and then by building a working algorithm. First, we propose a definition for the tensor trace norm that generalizes the established definition of the matrix trace norm. Second, similarly to matrix completion, the tensor completion is formulated as a convex optimization problem. Unfortunately, the straightforward problem extension is significantly harder to solve than the matrix case because of the dependency among multiple constraints. To tackle this problem, we developed three algorithms: simple low rank tensor completion (SiLRTC), fast low rank tensor completion (FaLRTC), and high accuracy low rank tensor completion (HaLRTC). The SiLRTC algorithm is simple to implement and employs a relaxation technique to separate the dependent relationships and uses the block coordinate descent (BCD) method to achieve a globally optimal solution; the FaLRTC algorithm utilizes a smoothing scheme to transform the original nonsmooth problem into a smooth one and can be used to solve a general tensor trace norm minimization problem; the HaLRTC algorithm applies the alternating direction method of multipliers (ADMMs) to our problem. Our experiments show potential applications of our algorithms and the quantitative evaluation indicates that our methods are more accurate and robust than heuristic approaches. The efficiency comparison indicates that FaLRTC and HaLRTC are more efficient than SiLRTC and, between FaLRTC and HaLRTC, the former is more efficient at obtaining a low-accuracy solution and the latter is preferred if a high-accuracy solution is desired.
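
    A minimal SiLRTC-flavored iteration can be sketched with singular value thresholding on each mode unfolding. This is a simplified variant; the test tensor, threshold, and iteration count are invented:

        import numpy as np

        rng = np.random.default_rng(0)
        # Toy rank-1 tensor with 60% of its entries observed.
        T = np.einsum('i,j,k->ijk', rng.random(10), rng.random(12), rng.random(14))
        mask = rng.random(T.shape) < 0.6

        def svt(M, tau):
            # Singular value thresholding: the proximal operator of the trace norm.
            U, s, Vt = np.linalg.svd(M, full_matrices=False)
            return (U * np.maximum(s - tau, 0.0)) @ Vt

        X = np.where(mask, T, 0.0)
        for _ in range(200):
            # Threshold each mode-n unfolding, refold, and average the results
            # (the relaxation used to decouple the modes in SiLRTC-style methods).
            Y = np.zeros_like(X)
            for n in range(3):
                Xm = np.moveaxis(X, n, 0)
                M = svt(Xm.reshape(Xm.shape[0], -1), 0.01)
                Y += np.moveaxis(M.reshape(Xm.shape), 0, n)
            X = Y / 3.0
            X[mask] = T[mask]          # keep the observed entries fixed
        print("max abs completion error:", np.abs(X - T).max())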

  10. A new approach for reducing beam hardening artifacts in polychromatic X-ray computed tomography using more accurate prior image.

    PubMed

    Wang, Hui; Xu, Yanan; Shi, Hongli

    2018-03-15

    Metal artifacts severely degrade CT image quality in clinical diagnosis and are difficult to remove, especially the beam hardening artifacts. Metal artifact reduction (MAR) methods based on prior images are the most frequently used. However, most prior images contain considerable misclassification caused by the absence of prior information, such as the spectrum distribution of the X-ray beam source, especially when multiple or large metal objects are included. This work aims to obtain a more accurate prior image to improve image quality. The proposed method includes four steps. First, the metal image is segmented by thresholding an initial image, and the metal traces are identified in the initial projection data using the forward projection of the metal image. Second, the accurate absorbent model of the metal image is calculated according to the spectrum distribution of the X-ray beam source and the energy-dependent attenuation coefficients of the metal. Third, a new metal image is reconstructed by a general analytical reconstruction algorithm such as filtered back projection (FBP). The prior image is obtained by segmenting the difference image between the initial image and the new metal image into air, tissue, and bone. Fourth, the initial projection data are normalized by dividing them, pixel by pixel, by the projection data of the prior image. The final corrected image is obtained by interpolation, denormalization, and reconstruction. Several clinical images with dental fillings and knee prostheses were used to compare the proposed algorithm with the normalized metal artifact reduction (NMAR) and linear interpolation (LI) methods. The results demonstrate that the artifacts were reduced efficiently by the proposed method. The proposed method obtains an accurate prior image using prior information about the X-ray beam source and the energy-dependent attenuation coefficients of the metal. As a result, better performance in reducing beam hardening artifacts can be achieved. Moreover, the process is rather simple and requires little extra computational burden. It is superior to other algorithms when multiple and/or large implants are included.
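
    The normalize-interpolate-denormalize step shared by NMAR-style methods can be sketched in one dimension, with toy projections and an invented metal trace rather than the authors' implementation:

        import numpy as np

        rng = np.random.default_rng(0)
        # Toy 1-D projections: p is the measured projection, p_prior the forward
        # projection of a segmented prior image, and `metal` flags the metal trace.
        n = 180
        p_prior = 10.0 + 4.0 * np.sin(np.linspace(0, np.pi, n))
        p = p_prior + 0.1 * rng.standard_normal(n)
        metal = np.zeros(n, dtype=bool); metal[80:100] = True
        p[metal] += 25.0                              # corrupted metal trace

        # Normalize by the prior, interpolate across the trace, then denormalize.
        ratio = p / p_prior
        idx = np.arange(n)
        ratio[metal] = np.interp(idx[metal], idx[~metal], ratio[~metal])
        p_corr = ratio * p_prior

        print("residual on metal trace:", np.abs(p_corr[metal] - p_prior[metal]).max())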

  11. Development of Extended Ray-tracing method including diffraction, polarization and wave decay effects

    NASA Astrophysics Data System (ADS)

    Yanagihara, Kota; Kubo, Shin; Dodin, Ilya; Nakamura, Hiroaki; Tsujimura, Toru

    2017-10-01

    Geometrical optics ray-tracing is a reasonable numerical analytic approach for describing the electron cyclotron resonance wave (ECW) in slowly varying, spatially inhomogeneous plasma. It is well known that the results of this conventional method are adequate in most cases. However, in the case of helical fusion plasma, which has a complicated magnetic structure, strong magnetic shear combined with a large density scale length can cause mode coupling of waves outside the last closed flux surface, and a complicated absorption structure requires a strongly focused wave for ECH. Since the conventional ray equations describing ECW do not have any terms for diffraction, polarization, and wave decay effects, we cannot accurately describe mode coupling of waves, strongly focused waves, the behavior of waves in inhomogeneous absorption regions, and so on. As a fundamental solution to these problems, we consider an extension of the ray-tracing method. The specific process is planned as follows. First, calculate the reference ray by the conventional method and define the local ray-base coordinate system along the reference ray. Then, calculate the evolution of the distributions of amplitude and phase on the ray-base coordinates step by step. The progress of our extended method will be presented.

  12. Gold nanoparticle flow sensors designed for dynamic X-ray imaging in biofluids.

    PubMed

    Ahn, Sungsook; Jung, Sung Yong; Lee, Jin Pyung; Kim, Hae Koo; Lee, Sang Joon

    2010-07-27

    X-ray-based imaging is one of the most powerful and convenient imaging methods owing to its versatility in applicable energy and its high performance in use. Unlike conventional nuclear medicine imaging, X-ray imaging requires contrast agents, especially for effectively targeted and molecularly specific functions. Here, in contrast to the much-reported static accumulation of contrast agents in targeted organs, dynamic visualization in a living organism is successfully accomplished by particle-traced X-ray imaging for the first time. Flow phenomena across perforated end walls of xylem vessels in rice are monitored with gold nanoparticles (AuNPs) (approximately 20 nm in diameter) serving as flow-tracing sensors in nontransparent biofluids. AuNPs are surface-modified to control hydrodynamic properties such as hydrodynamic size (DH), zeta potential, and surface plasmonic properties in aqueous conditions. Transmission electron microscopy (TEM), scanning electron microscopy (SEM), X-ray nanoscopy (XN), and X-ray microscopy (XM) are used to correlate the interparticle interactions with X-ray absorption ability. Cluster formation and the X-ray contrast ability of the AuNPs are successfully modulated by controlling the interparticle interactions evaluated as flow-tracing sensors.

  13. Mid-Frequency Reverberation Measurements with Full Companion Environmental Support

    DTIC Science & Technology

    2014-12-30

    Acoustic modeling is based on measured stratification and observed wave amplitudes on the New Jersey shelf during the SWARM experiment. Ray tracing is ... the wave model then gives quantitative results for the clutter. 2. SWARM NLIW model and ray tracing: Nonlinear internal waves are very common on the ... receiver in order to give quantitative clutter to reverberation. To picture the mechanism, a set of rays was launched from a source at range zero and

  14. Quantitative ultrasonic testing of acoustically anisotropic materials with verification on austenitic and dissimilar weld joints

    NASA Astrophysics Data System (ADS)

    Boller, C.; Pudovikov, S.; Bulavinov, A.

    2012-05-01

    Austenitic stainless steel materials are widely used in a variety of industry sectors. In particular, the material is qualified to meet the design criteria of high quality in safety-related applications. For example, the primary loop of most of the nuclear power plants in the world is made of this material, owing to its high durability and corrosion resistance. Certain operating conditions may cause a range of changes in the integrity of the component, and therefore require nondestructive testing at reasonable intervals. These in-service inspections are often performed using ultrasonic techniques, in particular when cracking is of specific concern. However, the coarse, dendritic grain structure of the weld material, formed during the welding process, is extremely and unpredictably anisotropic. Such a structure is no longer direction-independent with respect to ultrasonic wave propagation; the ultrasonic beam is therefore deflected and redirected, and the wave front becomes distorted. Thus, the use of conventional ultrasonic testing techniques with fixed beam angles is very limited, and the application of ultrasonic phased array techniques becomes desirable. The "Sampling Phased Array" technique, invented and developed by Fraunhofer IZFP, allows the acquisition of time signals (A-scans) for each individual transducer element of the array, along with fast image reconstruction techniques based on synthetic focusing algorithms. The reconstruction considers the sound propagation from each image pixel to the individual sensor element. For anisotropic media, where the sound beam is deflected and the sound path is not known a priori, a novel phase adjustment technique called "Reverse Phase Matching" is implemented. By taking into account the anisotropy and inhomogeneity of the weld structure, a ray tracing algorithm for modeling the acoustic wave propagation and calculating the sound propagation time is applied. This technique can be utilized for 2D and 3D real-time image reconstruction. The "Gradient Elastic Constant Descent Method" (GECDM), an iterative algorithm, is implemented, which is essential for the examination of inhomogeneous anisotropic media with unknown properties (elastic constants). The Sampling Phased Array technique with Reverse Phase Matching, extended by the GECDM technique, determines unknown elastic constants and provides reliable and efficient quantitative flaw detection in austenitic welds. The validation of the ray-tracing algorithm and the GECDM method is performed by a number of experiments on test specimens with artificial as well as natural material flaws. A mechanized system for ultrasonic testing of stainless steel and dissimilar welds has been developed. The system works with both conventional and Sampling Phased Array techniques. The new front-end ultrasonic unit with an optical data link allows 3D visualization of the inspection results in real time.

  15. Verification technology of remote sensing camera satellite imaging simulation based on ray tracing

    NASA Astrophysics Data System (ADS)

    Gu, Qiongqiong; Chen, Xiaomei; Yang, Deyun

    2017-08-01

    Remote sensing satellite camera imaging simulation technology is broadly used to evaluate satellite imaging quality and to test data application systems, but the simulation precision is hard to verify. In this paper, we propose an experimental simulation verification method based on comparison under test parameter variation. For the simulation model based on ray tracing, the experiment verifies the model precision by changing the types of devices, which correspond to the parameters of the model. The experimental results show that the similarity between the imaging model based on ray tracing and the experimental image is 91.4%, indicating that the model can simulate the remote sensing satellite imaging system very well.

  16. The effects of atmospheric refraction on the accuracy of laser ranging systems

    NASA Technical Reports Server (NTRS)

    Zanter, D. L.; Gardner, C. S.; Rao, N. N.

    1976-01-01

    Correction formulas derived by Saastamoinen and Marini, and the ray traces through the refractivity profiles all assume a spherically symmetric refractivity profile. The errors introduced by this assumption were investigated by ray tracing through three-dimensional profiles. The results of this investigation indicate that the difference between ray traces through the spherically symmetric and three-dimensional profiles is approximately three centimeters at 10 deg and decreases to less than one half of a centimeter at 80 deg. If the accuracy desired in future laser ranging systems is less than a few centimeters, Saastamoinen and Marini's formulas must be altered to account for the fact that the refractivity profile is not spherically symmetric.

  17. Atmospheric oscillations comparison on long term tropospheric delay time series derived from ray-tracing and GPS

    NASA Astrophysics Data System (ADS)

    Nikolaidou, Thalia; Santos, Marcelo

    2017-04-01

    The time delay induced by the atmosphere on GNSS signals, the neutral atmosphere delay (NAD), depends primarily on the amount of atmosphere the signal traverses before reaching the Earth's surface, and it can exceed 20 m at low elevation angles (around 3 degrees). For a particular ray, i.e., a satellite/quasar-antenna link, the delay depends on the atmospheric parameters of total pressure, temperature, and the partial pressure of water vapor. Because of that, numerical weather models (NWMs) have already proven beneficial for atmospheric modelling and geodesy. By direct ray-tracing inside NWMs, the VMF1 and the University of New Brunswick VMF1 (UNB-VMF1) (Urquhart et al. 2011) access the 3D variation of the meteorological parameters that determine the delay, and they are thus the state-of-the-art mapping functions used today. The ray-tracing procedure is capable of providing NADs for any point on the Earth's surface. In this study we examine the impact of regional numerical weather models with high spatial and temporal resolution, namely 25 km and 6 h. These models have roughly 2.6 times better spatial resolution than the currently used NWMs. Ray-tracing through such NWMs, using the independent ray-tracing algorithm developed at UNB (Nievinski, 2009), we acquire superior-quality NADs with regional application. We ray-trace for the International GNSS Service (IGS) network stations over a time span of 11 years. Benchmarking against the IGS troposphere product is performed to assess the accuracy of our results. A periodicity analysis is conducted to examine the signature of atmospheric oscillations on the NAD time series. In order to identify the NAD periodicities, we compared our product against the GPS-derived IGS troposphere product. Systematic effects within each single technique are identified and the long-term NAD stability is assessed.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anthony, Stephen

    The Sandia hyperspectral upper-bound spectrum algorithm (hyper-UBS) is a cosmic ray despiking algorithm for hyperspectral data sets. When naturally-occurring, high-energy (gigaelectronvolt) cosmic rays impact the earth’s atmosphere, they create an avalanche of secondary particles which will register as a large, positive spike on any spectroscopic detector they hit. Cosmic ray spikes are therefore an unavoidable spectroscopic contaminant which can interfere with subsequent analysis. A variety of cosmic ray despiking algorithms already exist and can potentially be applied to hyperspectral data matrices, most notably the upper-bound spectrum data matrices (UBS-DM) algorithm by Dongmao Zhang and Dor Ben-Amotz which served as the basis for the hyper-UBS algorithm. However, the existing algorithms either cannot be applied to hyperspectral data, require information that is not always available, introduce undesired spectral bias, or have otherwise limited effectiveness for some experimentally relevant conditions. Hyper-UBS is more effective at removing a wider variety of cosmic ray spikes from hyperspectral data without introducing undesired spectral bias. In addition to the core algorithm the Sandia hyper-UBS software package includes additional source code useful in evaluating the effectiveness of the hyper-UBS algorithm. The accompanying source code includes code to generate simulated hyperspectral data contaminated by cosmic ray spikes, several existing despiking algorithms, and code to evaluate the performance of the despiking algorithms on simulated data.

  19. Investigations into the Properties, Conditions, and Effects of the Ionosphere

    DTIC Science & Technology

    1990-01-15

    ionogram database to be used in testing trace-identification algorithms; d. Development of automatic trace-identification algorithms and autoscaling ... Scaler (ARTIST) and improvement of the ARTIST software; g. Maintenance and upgrade of the digital ionosondes at Argentia, Newfoundland, and Goose Bay ... provided by the contractor; j. Upgrade of the ARTIST computer at the Danish Meteorological Institute/GL Qaanaaq site to provide digisonde tape-playback

  20. Molray--a web interface between O and the POV-Ray ray tracer.

    PubMed

    Harris, M; Jones, T A

    2001-08-01

    A publicly available web-based interface is presented for producing high-quality ray-traced images and movies from the molecular-modelling program O [Jones et al. (1991), Acta Cryst. A47, 110-119]. The interface allows the user to select O-plot files and set parameters to create standard input files for the popular ray-tracing renderer POV-Ray, which can then produce publication-quality still images or simple movies. To ensure ease of use, we have made this service available to the O user community via the World Wide Web. The public Molray server is available at http://xray.bmc.uu.se/molray.

  1. Flood inundation extent mapping based on block compressed tracing

    NASA Astrophysics Data System (ADS)

    Shen, Dingtao; Rui, Yikang; Wang, Jiechen; Zhang, Yu; Cheng, Liang

    2015-07-01

    Flood inundation extent, depth, and duration are important factors affecting flood hazard evaluation. At present, flood inundation analysis is based mainly on a seeded region-growing algorithm, which is an inefficient process because it requires excessive recursive computations and is incapable of processing massive datasets. To address this problem, we propose a block compressed tracing algorithm for mapping the flood inundation extent, which reads the DEM data in blocks before transferring them to raster compression storage. This allows a smaller computer memory to process a larger amount of data, which solves the problem of the regular seeded region-growing algorithm. In addition, the use of a raster boundary tracing technique allows the algorithm to avoid the time-consuming computations required by seeded region growing. Finally, we conduct a comparative evaluation in the Chin-sha River basin; the results show that the proposed method solves the problem of flood inundation extent mapping based on massive DEM datasets with higher computational efficiency than the original method, which makes it suitable for practical applications.
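
    For reference, the baseline the paper improves on, seeded region growing, can be written iteratively with an explicit queue instead of recursion; the toy DEM, water level, and seed below are invented:

        import numpy as np
        from collections import deque

        # Seeded region growing: mark cells connected to the seed whose
        # elevation lies below the water level.
        dem = np.array([[3, 3, 4, 5],
                        [2, 1, 4, 2],
                        [2, 1, 1, 4],
                        [5, 4, 1, 4]], dtype=float)
        water_level, seed = 2.5, (1, 1)

        flooded = np.zeros(dem.shape, dtype=bool)
        queue = deque([seed])
        while queue:
            r, c = queue.popleft()
            if flooded[r, c] or dem[r, c] >= water_level:
                continue
            flooded[r, c] = True
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):   # 4-connectivity
                rr, cc = r + dr, c + dc
                if 0 <= rr < dem.shape[0] and 0 <= cc < dem.shape[1]:
                    queue.append((rr, cc))
        print(flooded.astype(int))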

  2. Quantum Hamiltonian identification from measurement time traces.

    PubMed

    Zhang, Jun; Sarovar, Mohan

    2014-08-22

    Precise identification of parameters governing quantum processes is a critical task for quantum information and communication technologies. In this Letter, we consider a setting where system evolution is determined by a parametrized Hamiltonian, and the task is to estimate these parameters from temporal records of a restricted set of system observables (time traces). Based on the notion of system realization from linear systems theory, we develop a constructive algorithm that provides estimates of the unknown parameters directly from these time traces. We illustrate the algorithm and its robustness to measurement noise by applying it to a one-dimensional spin chain model with variable couplings.

  3. Integration of Monte-Carlo ray tracing with a stochastic optimisation method: application to the design of solar receiver geometry.

    PubMed

    Asselineau, Charles-Alexis; Zapata, Jose; Pye, John

    2015-06-01

    A stochastic optimisation method adapted to illumination and radiative heat transfer problems involving Monte-Carlo ray-tracing is presented. A solar receiver shape optimisation case study illustrates the advantages of the method and its potential: efficient receivers are identified using a moderate computational cost.

  4. Usage of CO2 microbubbles as flow-tracing contrast media in X-ray dynamic imaging of blood flows.

    PubMed

    Lee, Sang Joon; Park, Han Wook; Jung, Sung Yong

    2014-09-01

    X-ray imaging techniques have been employed to visualize various biofluid flow phenomena in a non-destructive manner. X-ray particle image velocimetry (PIV) was developed to measure velocity fields of blood flows to obtain hemodynamic information. A time-resolved X-ray PIV technique capable of measuring the velocity fields of blood flows under real physiological conditions was recently developed. However, technical limitations remained in the measurement of blood flows with high image contrast and sufficient biocompatibility. In this study, CO2 microbubbles were developed as flow-tracing contrast media for X-ray PIV measurements of biofluid flows. Human serum albumin and CO2 gas were mechanically agitated to fabricate CO2 microbubbles. The optimal fabrication conditions of CO2 microbubbles were found by comparing the size and amount of microbubbles fabricated under various operating conditions. The average size and quantity of CO2 microbubbles were measured by using a synchrotron X-ray imaging technique with a high spatial resolution. The quantity and size of the fabricated microbubbles decrease with increasing speed and operation time of the mechanical agitation. The feasibility of CO2 microbubbles as flow-tracing contrast media was checked for a 40% hematocrit blood flow. Particle images of the blood flow were consecutively captured by the time-resolved X-ray PIV system to obtain velocity field information of the flow. The experimental results were compared with a theoretical velocity profile. Results show that CO2 microbubbles can be used as effective flow-tracing contrast media in X-ray PIV experiments.

  5. Three-dimensional plant architecture and sunlit-shaded patterns: a stochastic model of light dynamics in canopies.

    PubMed

    Retkute, Renata; Townsend, Alexandra J; Murchie, Erik H; Jensen, Oliver E; Preston, Simon P

    2018-05-25

    Diurnal changes in solar position and intensity combined with the structural complexity of plant architecture result in highly variable and dynamic light patterns within the plant canopy. This affects productivity through the complex ways that photosynthesis responds to changes in light intensity. Current methods to characterize light dynamics, such as ray-tracing, are able to produce data with excellent spatio-temporal resolution but are computationally intensive and the resulting data are complex and high-dimensional. This necessitates development of more economical models for summarizing the data and for simulating realistic light patterns over the course of a day. High-resolution reconstructions of field-grown plants are assembled in various configurations to form canopies, and a forward ray-tracing algorithm is applied to the canopies to compute light dynamics at high (1 min) temporal resolution. From the ray-tracer output, the sunlit or shaded state for each patch on the plants is determined, and these data are used to develop a novel stochastic model for the sunlit-shaded patterns. The model is designed to be straightforward to fit to data using maximum likelihood estimation, and fast to simulate from. For a wide range of contrasting 3-D canopies, the stochastic model is able to summarize, and replicate in simulations, key features of the light dynamics. When light patterns simulated from the stochastic model are used as input to a model of photoinhibition, the predicted reduction in carbon gain is similar to that from calculations based on the (extremely costly) ray-tracer data. The model provides a way to summarize highly complex data in a small number of parameters, and a cost-effective way to simulate realistic light patterns. Simulations from the model will be particularly useful for feeding into larger-scale photosynthesis models for calculating how light dynamics affects the photosynthetic productivity of canopies.
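
    At its simplest, a stochastic model of sunlit-shaded dynamics is a two-state Markov chain sampled at 1-minute resolution. The sketch below, with invented transition probabilities and far simpler than the model in the paper, shows a simulation and the implied long-run sunlit fraction:

        import numpy as np

        rng = np.random.default_rng(0)
        # Two-state (sunlit/shaded) Markov chain at 1-minute resolution: a simple
        # stochastic stand-in for ray-traced light patterns on a canopy patch.
        p_stay_sun, p_stay_shade = 0.95, 0.90    # invented transition probabilities
        minutes = 600
        state = np.empty(minutes, dtype=int)     # 1 = sunlit, 0 = shaded
        state[0] = 1
        for t in range(1, minutes):
            stay = p_stay_sun if state[t - 1] == 1 else p_stay_shade
            state[t] = state[t - 1] if rng.random() < stay else 1 - state[t - 1]

        # Long-run sunlit fraction implied by the chain vs. the simulated fraction.
        q_sun = (1 - p_stay_shade) / ((1 - p_stay_sun) + (1 - p_stay_shade))
        print(state.mean(), q_sun)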

  6. Spectral correction algorithm for multispectral CdTe x-ray detectors

    NASA Astrophysics Data System (ADS)

    Christensen, Erik D.; Kehres, Jan; Gu, Yun; Feidenhans'l, Robert; Olsen, Ulrik L.

    2017-09-01

    Compared to the dual-energy scintillator detectors widely used today, pixelated multispectral X-ray detectors show the potential to improve material identification in various radiography and tomography applications used for industrial and security purposes. However, detector effects, such as charge sharing and photon pileup, distort the measured spectra in high-flux pixelated multispectral detectors. These effects significantly reduce the detectors' capability to be used for material identification, which requires accurate spectral measurements. We have developed a semi-analytical computational algorithm for multispectral CdTe X-ray detectors which corrects the measured spectra for severe spectral distortions caused by the detector. The algorithm is developed for the Multix ME100 CdTe X-ray detector, but could potentially be adapted for any pixelated multispectral CdTe detector. The calibration of the algorithm is based on simple attenuation measurements of commercially available materials using standard laboratory sources, making the algorithm applicable in any X-ray setup. The validation of the algorithm has been done using experimental data acquired with both standard lab equipment and synchrotron radiation. The experiments show that the algorithm is fast, reliable even at X-ray fluxes up to 5 Mphotons/s/mm2, and greatly improves the accuracy of the measured X-ray spectra, making the algorithm very useful for both security and industrial applications where multispectral detectors are used.

  7. Control of broadband optically generated ultrasound pulses using binary amplitude holograms.

    PubMed

    Brown, Michael D; Jaros, Jiri; Cox, Ben T; Treeby, Bradley E

    2016-04-01

    In this work, the use of binary amplitude holography is investigated as a mechanism to focus broadband acoustic pulses generated by high peak-power pulsed lasers. Two algorithms are described for the calculation of the binary holograms; one using ray-tracing, and one using an optimization based on direct binary search. It is shown using numerical simulations that when a binary amplitude hologram is excited by a train of laser pulses at its design frequency, the acoustic field can be focused at a pre-determined distribution of points, including single and multiple focal points, and line and square foci. The numerical results are validated by acoustic field measurements from binary amplitude holograms, excited by a high peak-power laser.

  8. Accuracy control in Monte Carlo radiative calculations

    NASA Technical Reports Server (NTRS)

    Almazan, P. Planas

    1993-01-01

    The general accuracy law that rules the Monte Carlo ray-tracing algorithms commonly used for the calculation of radiative entities in the thermal analysis of spacecraft is presented. These entities involve transfer of radiative energy either from a single source to a target (e.g., the configuration factors), or from several sources to a target (e.g., the absorbed heat fluxes). In fact, the former is just a particular case of the latter. The accuracy model is later applied to the calculation of some specific radiative entities. Furthermore, some issues related to the implementation of such a model in a software tool are discussed. Although only the relative error is considered throughout the discussion, similar results can be derived for the absolute error.
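
    The 1/sqrt(N) behaviour behind such an accuracy law is easy to exhibit on the simplest radiative entity, a configuration factor from a differential element to a coaxial parallel disk, for which an analytic value exists; the geometry below is invented and the error formula is the standard binomial one:

        import numpy as np

        rng = np.random.default_rng(0)
        # Monte Carlo estimate of a configuration (view) factor: diffuse
        # (cosine-weighted) rays leave a differential element at the origin;
        # count the fraction hitting a coaxial parallel disk above it.
        N = 100_000
        h, R = 1.0, 0.5                             # disk height and radius

        u1, u2 = rng.random(N), rng.random(N)
        theta, phi = np.arcsin(np.sqrt(u1)), 2 * np.pi * u2  # cosine-weighted
        dx = np.sin(theta) * np.cos(phi)
        dy = np.sin(theta) * np.sin(phi)
        dz = np.cos(theta)

        t = h / dz                                  # ray parameter at the disk plane
        hits = (dx * t) ** 2 + (dy * t) ** 2 <= R ** 2

        F = hits.mean()                             # view-factor estimate
        rel_err = np.sqrt((1 - F) / (F * N))        # binomial relative standard error
        exact = R ** 2 / (h ** 2 + R ** 2)          # analytic element-to-disk factor
        print(F, exact, rel_err)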

  9. Optimization of the design of Gas Cherenkov Detectors for ICF diagnosis

    NASA Astrophysics Data System (ADS)

    Liu, Bin; Hu, Huasi; Han, Hetong; Lv, Huanwen; Li, Lan

    2018-07-01

    A design method combining a genetic algorithm (GA) with Monte Carlo simulation is established and applied to two different types of Cherenkov detectors, namely the Gas Cherenkov Detector (GCD) and the Gamma Reaction History (GRH) detector. To accelerate the optimization program, Open MPI (Message Passing Interface) is used in the Geant4 simulation. Compared with the traditional optical ray-tracing method, the performance of these detectors has been improved with the optimization method. The efficiency of the GCD system, with a threshold of 6.3 MeV, is enhanced by ∼20% and the time response is improved by ∼7.2%. For the GRH system, with a threshold of 10 MeV, the efficiency is enhanced by ∼76% in comparison with previously published results.

  10. Error analysis of speed of sound reconstruction in ultrasound limited angle transmission tomography.

    PubMed

    Jintamethasawat, Rungroj; Lee, Won-Mean; Carson, Paul L; Hooi, Fong Ming; Fowlkes, J Brian; Goodsitt, Mitchell M; Sampson, Richard; Wenisch, Thomas F; Wei, Siyuan; Zhou, Jian; Chakrabarti, Chaitali; Kripfgans, Oliver D

    2018-04-07

    We have investigated limited angle transmission tomography to estimate speed of sound (SOS) distributions for breast cancer detection. That requires both accurate delineations of major tissues, in this case by segmentation of prior B-mode images, and calibration of the relative positions of the opposed transducers. Experimental sensitivity evaluation of the reconstructions with respect to segmentation and calibration errors is difficult with our current system. Therefore, parametric studies of SOS errors in our bent-ray reconstructions were simulated. They included mis-segmentation of an object of interest or a nearby object, and miscalibration of relative transducer positions in 3D. Close correspondence of reconstruction accuracy was verified in the simplest case, a cylindrical object in homogeneous background with induced segmentation and calibration inaccuracies. Simulated mis-segmentation in object size and lateral location produced maximum SOS errors of 6.3% within 10 mm diameter change and 9.1% within 5 mm shift, respectively. Modest errors in assumed transducer separation produced the maximum SOS error from miscalibrations (57.3% within 5 mm shift), still, correction of this type of error can easily be achieved in the clinic. This study should aid in designing adequate transducer mounts and calibration procedures, and in specification of B-mode image quality and segmentation algorithms for limited angle transmission tomography relying on ray tracing algorithms. Copyright © 2018 Elsevier B.V. All rights reserved.

  11. Testing Linear Temporal Logic Formulae on Finite Execution Traces

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Rosu, Grigore; Norvig, Peter (Technical Monitor)

    2001-01-01

    We present an algorithm for efficiently testing Linear Temporal Logic (LTL) formulae on finite execution traces. The standard models of LTL are infinite traces, reflecting the behavior of reactive and concurrent systems which conceptually may be continuously alive. In most past applications of LTL, theorem provers and model checkers have been used to formally prove that down-scaled models satisfy such LTL specifications. Our goal is instead to use LTL for up-scaled testing of real software applications. Such tests correspond to analyzing the conformance of finite traces against LTL formulae. We first describe what it means for a finite trace to satisfy an LTL property. We then suggest an optimized algorithm based on transforming LTL formulae. The work is done using the Maude rewriting system, which turns out to provide a perfect notation and an efficient rewriting engine for performing these experiments.
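
    Finite-trace LTL semantics can be sketched directly as a recursive evaluator. This is a naive reference implementation, not the optimized formula-transformation algorithm or its Maude encoding:

        # Formulae are tuples: ('p', name), ('not', f), ('and', f, g),
        # ('next', f), ('until', f, g); 'eventually'/'always' derive from 'until'.

        def holds(f, trace, i=0):
            """Evaluate formula f on the suffix of `trace` starting at state i."""
            op = f[0]
            if op == 'p':
                return f[1] in trace[i]            # each state is a set of atoms
            if op == 'not':
                return not holds(f[1], trace, i)
            if op == 'and':
                return holds(f[1], trace, i) and holds(f[2], trace, i)
            if op == 'next':                       # strong next: needs a successor
                return i + 1 < len(trace) and holds(f[1], trace, i + 1)
            if op == 'until':                      # f1 U f2 within the finite trace
                return any(holds(f[2], trace, k) and
                           all(holds(f[1], trace, j) for j in range(i, k))
                           for k in range(i, len(trace)))
            raise ValueError(f"unknown operator {op!r}")

        trace = [{'start'}, {'busy'}, {'busy'}, {'done'}]
        print(holds(('until', ('not', ('p', 'done')), ('p', 'done')), trace))  # True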

  12. Non-singular acoustic cloak derived by the ray tracing method with rotationally symmetric transformations

    PubMed Central

    Wu, Linzhi

    2016-01-01

    Recently, the ray tracing method has been used to derive non-singular cylindrical invisibility cloaks for out-of-plane shear waves, which is impossible via the transformation method directly owing to the singular push-forward mapping. In this paper, the method is adopted to design a kind of non-singular acoustic cloak. Based on Hamilton's equations of motion, the eikonal equation, and pre-designed ray equations, we derive several constraint equations for the bulk modulus and density tensor. On the premise that the perfect matching conditions are satisfied, a series of non-singular physical profiles can be obtained by arranging the singular terms reasonably. The physical profiles derived by the ray tracing method degenerate to the transformation-based solutions when the transport equation is taken into consideration. This illuminates the essence of the newly designed cloaks: they are actually so-called eikonal cloaks, which accurately control the paths of energy flux but introduce a small disturbance in the energy distribution along those paths. The near-perfect invisible performance has been demonstrated by the numerical ray tracing results and the pressure distribution snapshots. Finally, a kind of reduced cloak is conceived, and its good invisible performance has been measured quantitatively by the normalized scattering width. PMID:27118884

  13. HARPA: A versatile three-dimensional Hamiltonian ray-tracing program for acoustic waves in the atmosphere above irregular terrain

    NASA Astrophysics Data System (ADS)

    Jones, R. M.; Riley, J. P.; Georges, T. M.

    1986-08-01

    The modular FORTRAN 77 computer program traces the three-dimensional paths of acoustic rays through continuous model atmospheres by numerically integrating Hamilton's equations (a differential expression of Fermat's principle). The user specifies an atmospheric model by writing closed-form formulas for its three-dimensional wind and temperature (or sound speed) distribution, and by defining the height of the reflecting terrain vs. geographic latitude and longitude. Some general-purpose models are provided, or users can readily design their own. In addition to computing the geometry of each raypath, HARPA can calculate pulse travel time, phase time, Doppler shift (if the medium varies in time), absorption, and geometrical path length. The program prints a step-by-step account of a ray's progress. The 410-page documentation describes the ray-tracing equations and the structure of the program, and provides complete instructions, illustrated by a sample case.
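
    The integration step HARPA automates can be illustrated in miniature. The following sketch (ours, not HARPA's FORTRAN 77 source) integrates the acoustic ray equations dx/dt = c^2 p and dp/dt = -grad(c)/c, for slowness vector p, with a fourth-order Runge-Kutta step through a toy two-dimensional sound-speed profile; winds, terrain, and the travel-time and absorption integrals are omitted:

    ```python
    # A minimal 2D sketch of Hamiltonian acoustic ray tracing with RK4.

    import numpy as np

    def c(x, z):                           # toy atmosphere: sound speed rises with height
        return 340.0 + 0.1 * z

    def grad_c(x, z, h=1e-3):              # finite-difference gradient of c
        return np.array([(c(x + h, z) - c(x - h, z)) / (2 * h),
                         (c(x, z + h) - c(x, z - h)) / (2 * h)])

    def rhs(y):                            # y = [x, z, px, pz], p is the slowness vector
        x, z, px, pz = y
        cc = c(x, z)
        return np.concatenate([cc**2 * np.array([px, pz]), -grad_c(x, z) / cc])

    def rk4_step(y, dt):
        k1 = rhs(y); k2 = rhs(y + 0.5 * dt * k1)
        k3 = rhs(y + 0.5 * dt * k2); k4 = rhs(y + dt * k3)
        return y + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

    theta = np.radians(10.0)               # launch 10 degrees above horizontal
    c0 = c(0.0, 0.0)                       # on the ray, |p| = 1/c
    y = np.array([0.0, 0.0, np.cos(theta) / c0, np.sin(theta) / c0])
    for _ in range(2000):                  # sound speed rising with height refracts
        y = rk4_step(y, 0.01)              # the ray back downward (no terrain here)
    print(f"ray endpoint after 20 s: x = {y[0]:.0f} m, z = {y[1]:.0f} m")
    ```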

  14. Software to model AXAF-I image quality

    NASA Technical Reports Server (NTRS)

    Ahmad, Anees; Feng, Chen

    1995-01-01

    A modular user-friendly computer program for the modeling of grazing-incidence type x-ray optical systems has been developed. This comprehensive computer software GRAZTRACE covers the manipulation of input data, ray tracing with reflectivity and surface deformation effects, convolution with x-ray source shape, and x-ray scattering. The program also includes the capabilities for image analysis, detector scan modeling, and graphical presentation of the results. A number of utilities have been developed to interface the predicted Advanced X-ray Astrophysics Facility-Imaging (AXAF-I) mirror structural and thermal distortions with the ray-trace. This software is written in FORTRAN 77 and runs on a SUN/SPARC station. An interactive command mode version and a batch mode version of the software have been developed.

  15. Multilayer X-ray imaging systems

    NASA Astrophysics Data System (ADS)

    Shealy, D. L.; Hoover, R. B.; Gabardi, D. R.

    1986-01-01

    An assessment of the imaging properties of multilayer X-ray imaging systems with spherical surfaces has been made. A ray trace analysis was performed to investigate the effects of using spherical substrates (rather than the conventional paraboloidal/hyperboloidal contours) for doubly reflecting Cassegrain telescopes. These investigations were carried out for mirrors designed to operate at selected soft X-ray/XUV wavelengths that are of significance for studies of the solar corona/transition region from the Stanford/MSFC Rocket X-Ray Telescope. The effects of changes in separation of the primary and secondary elements were also investigated. These theoretical results are presented as well as the results of ray trace studies to establish the resolution and vignetting effects as a function of field angle and system parameters.

  16. Publications - GMC 264 | Alaska Division of Geological & Geophysical Surveys

    Science.gov Websites

    DGGS GMC 264 publication details. Title: X-ray fluorescence trace element data of U.S. Bureau of Mines hard rock mineral pulp samples from the Colville mining district: West Kivliktort. Bibliographic reference: Werdon, M.B., 1996. See the DGGS website for more information.

  17. Inverse scattering and refraction corrected reflection for breast cancer imaging

    NASA Astrophysics Data System (ADS)

    Wiskin, J.; Borup, D.; Johnson, S.; Berggren, M.; Robinson, D.; Smith, J.; Chen, J.; Parisky, Y.; Klock, John

    2010-03-01

    Reflection ultrasound (US) has been utilized as an adjunct imaging modality for over 30 years. TechniScan, Inc. has developed unique transmission and concomitant reflection algorithms which are used to reconstruct images from data gathered during a tomographic breast scanning process called Warm Bath Ultrasound (WBU™). The transmission algorithm yields high-resolution, 3D, attenuation and speed of sound (SOS) images. The reflection algorithm is based on canonical ray tracing utilizing refraction correction via the SOS and attenuation reconstructions. The refraction-corrected reflection algorithm allows 360 degree compounding, resulting in the reflection image. The requisite data are collected by scanning the entire breast in a 33 °C water bath, on average in 8 minutes. This presentation explains how the data are collected and processed by the 3D transmission and reflection imaging mode algorithms. The processing is carried out using two NVIDIA® Tesla™ GPU processors, accessing data on a 4-TeraByte RAID. The WBU™ images are displayed in a DICOM viewer that allows registration of all three modalities. Several representative cases are presented to demonstrate potential diagnostic capability, including a cyst, a fibroadenoma, and a carcinoma. WBU™ images (SOS, attenuation, and reflection modalities) are shown along with their respective mammograms and standard ultrasound images. In addition, anatomical studies are shown comparing WBU™ images and MRI images of a cadaver breast. This innovative technology is designed to provide additional tools in the armamentarium for the diagnosis of breast disease.

  18. Biological X-ray absorption spectroscopy (BioXAS): a valuable tool for the study of trace elements in the life sciences.

    PubMed

    Strange, Richard W; Feiters, Martin C

    2008-10-01

    Using X-ray absorption spectroscopy (XAS) the binding modes (type and number of ligands, distances and geometry) and oxidation states of metals and other trace elements in crystalline as well as non-crystalline samples can be revealed. The method may be applied to biological systems as a 'stand-alone' technique, but it is particularly powerful when used alongside other X-ray and spectroscopic techniques and computational approaches. In this review, we highlight how biological XAS is being used in concert with crystallography, spectroscopy and computational chemistry to study metalloproteins in crystals, and report recent applications on relatively rare trace elements utilised by living organisms and metals involved in neurodegenerative diseases.

  19. Electromagnetic ray tracing model for line structures.

    PubMed

    Tan, C B; Khoh, A; Yeo, S H

    2008-03-17

    In this paper, a model for the electromagnetic scattering of line structures is established based on a high-frequency approximation approach, ray tracing. This electromagnetic ray tracing (ERT) model gives the advantage of identifying each physical field that contributes to the total solution of the scattering phenomenon. Besides the geometrical optics field, different diffracted fields associated with the line structures are also discussed and formulated. A step-by-step addition of each electromagnetic field is given to elucidate the causes of a disturbance in the amplitude profile. The accuracy of the ERT model is also discussed by comparison with a reference finite difference time domain (FDTD) solution, which shows a promising result for a single polysilicon line structure with a width as narrow as 0.4 wavelength.

  20. Optical design and optimization of parabolic dish solar concentrator with a cavity hybrid receiver

    NASA Astrophysics Data System (ADS)

    Blázquez, R.; Carballo, J.; Silva, M.

    2016-05-01

    One of the main goals of the BIOSTIRLING-4SKA project, funded by the European Commission, is the development of a hybrid Dish-Stirling system based on a hybrid solar-gas receiver, which has been designed by the Swedish company Cleanergy. A ray tracing study, which is part of the design of this parabolic dish system, is presented in this paper. The study pursues the optimization of the concentrator and receiver cavity geometry according to the requirements of flux distribution on the receiver walls set by the designer of the hybrid receiver. The ray-tracing analysis has been performed with the open source software Tonatiuh, a ray-tracing tool specifically oriented to the modeling of solar concentrators.

  1. X-ray mask and method for providing same

    DOEpatents

    Morales, Alfredo M [Pleasanton, CA; Skala, Dawn M [Fremont, CA

    2004-09-28

    The present invention describes a method for fabricating an x-ray mask tool which can achieve pattern features having a lateral dimension of less than 1 micron. The process uses a thin photoresist and a standard lithographic mask to transfer a trace image pattern into the surface of a silicon wafer by exposing and developing the resist. The exposed portion of the silicon substrate is then anisotropically etched to provide an etched image of the trace image pattern consisting of a series of channels in the silicon having a high depth-to-width aspect ratio. These channels are then filled by depositing a metal such as gold to provide an inverse image of the trace image, thereby providing a robust x-ray mask tool.

  2. X-ray mask and method for providing same

    DOEpatents

    Morales, Alfredo M.; Skala, Dawn M.

    2002-01-01

    The present invention describes a method for fabricating an x-ray mask tool which can achieve pattern features having a lateral dimension of less than 1 micron. The process uses a thin photoresist and a standard lithographic mask to transfer a trace image pattern into the surface of a silicon wafer by exposing and developing the resist. The exposed portion of the silicon substrate is then anisotropically etched to provide an etched image of the trace image pattern consisting of a series of channels in the silicon having a high depth-to-width aspect ratio. These channels are then filled by depositing a metal such as gold to provide an inverse image of the trace image, thereby providing a robust x-ray mask tool.

  3. Comparison of matrix method and ray tracing in the study of complex optical systems

    NASA Astrophysics Data System (ADS)

    Anterrieu, Eric; Perez, Jose-Philippe

    2000-06-01

    In the context of the classical study of optical systems within the geometrical Gauss approximation, the cardinal elements are efficiently obtained with the aid of the transfer matrix between the input and output planes of the system. In order to take into account the geometrical aberrations, a ray tracing approach, using the Snell-Descartes laws, has been implemented in an interactive software package. Both methods are applied to determining the correction to be applied to a human eye suffering from ametropia. This software may be used by optometrists and ophthalmologists for solving the problems encountered when considering this pathology. The ray tracing approach gives a significant improvement and could be very helpful for a better understanding of a possible surgical procedure.
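
    For reference, the transfer-matrix side of the comparison reduces to 2×2 ray-transfer (ABCD) matrix products in the Gauss approximation. A minimal sketch with hypothetical numbers, verifying the thin-lens imaging condition (the B element of the object-to-image matrix vanishes):

    ```python
    # Paraxial matrix method: propagate a ray (height y, angle u) through
    # translation and thin-lens matrices and check the imaging condition.

    import numpy as np

    def translation(d):                    # free-space propagation over distance d
        return np.array([[1.0, d], [0.0, 1.0]])

    def thin_lens(f):                      # thin lens of focal length f
        return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

    f, s = 50.0, 75.0                      # mm: focal length, object distance
    d_img = s * f / (s - f)                # Gaussian image distance: 150 mm
    M = translation(d_img) @ thin_lens(f) @ translation(s)
    print("B element (0 at an image plane):", round(M[0, 1], 12))
    print("transverse magnification (A element):", M[0, 0])   # -2.0
    ```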

  4. OPTICAL TRANSCRIBING OSCILLOSCOPE

    DOEpatents

    Kerns, Q.A.

    1961-09-26

    A device is designed for producing accurate graphed waveforms of very fast electronic pulses. The fast pulse is slowly tracked on a cathode ray tube, and a pair of photomultiplier tubes, exposed to the pulse trace, view separate vertical portions thereof at each side of a fixed horizontal reference. Each phototube produces an output signal indicative of vertical movement of the exposed trace, and these simultaneous signals are compared in a difference amplifier. The amplifier produces a difference signal which, when applied to the cathode ray tube, maintains the trace on the reference. A graphic recorder receives the amplified difference signal at an x-axis input, while a y-axis input is synchronized with the tracking time of the cathode ray tube and therefore graphs the enlarged waveshape.

  5. Ray tracing analysis of overlapping objects in refraction contrast imaging.

    PubMed

    Hirano, Masatsugu; Yamasaki, Katsuhito; Okada, Hiroshi; Sakurai, Takashi; Kondoh, Takeshi; Katafuchi, Tetsuro; Sugimura, Kazuro; Kitazawa, Sohei; Kitazawa, Riko; Maeda, Sakan; Tamura, Shinichi

    2005-08-01

    We simulated refraction contrast imaging of overlapping objects using the ray tracing method. We calculated the simplest case, in which two columnar objects (blood vessels) with a density of 1.0 g/cm³ run at right angles in air. For absorption, we performed the simulation using Snell's law applied at the object boundaries. A pair of bright and dark spots results from the interference of refracted X-rays where the blood vessels cross. This may increase the visibility of the image.

  6. Simulation and optimization of volume holographic imaging systems in Zemax.

    PubMed

    Wissmann, Patrick; Oh, Se Baek; Barbastathis, George

    2008-05-12

    We present a new methodology for ray-tracing analysis of volume holographic imaging (VHI) systems. Using the k-sphere formulation, we apply geometrical relationships to describe the volumetric diffraction effects imposed on rays passing through a volume hologram. We explain the k-sphere formulation in conjunction with the ray-tracing process and describe its implementation in a Zemax UDS (User Defined Surface). We conclude with examples of simulation and optimization results and show proof of the consistency and usefulness of the proposed model.

  7. Effects of urban microcellular environments on ray-tracing-based coverage predictions.

    PubMed

    Liu, Zhongyu; Guo, Lixin; Guan, Xiaowei; Sun, Jiejing

    2016-09-01

    The ray-tracing (RT) algorithm, which is based on geometrical optics and the uniform theory of diffraction, has become a typical deterministic approach to studying wave-propagation characteristics. Under urban microcellular environments, the RT method depends highly on detailed environmental information. The aim of this paper is to provide help in selecting the appropriate level of accuracy required in building databases to achieve good tradeoffs between database costs and prediction accuracy. After familiarization with the operating procedures of the RT-based prediction model, this study focuses on the effect of errors in environmental information on prediction results. The environmental information consists of two parts, namely, geometric and electrical parameters. The geometric information can be obtained from a digital map of a city. To study the effects of inaccuracies in geometric information (building layout) on RT-based coverage prediction, two different artificial erroneous maps are generated based on the original digital map, and systematic analysis is performed by comparing the predictions with the erroneous maps and measurements or the predictions with the original digital map. To make the conclusions more persuasive, the influence of random errors on RMS delay spread results is investigated. Furthermore, given the electrical parameters' effect on the accuracy of the predicted results of the RT model, the dielectric constant and conductivity of building materials are set to different values. The path loss and RMS delay spread under the same circumstances are simulated by the RT prediction model.

  8. Study on the influence of X-ray tube spectral distribution on the analysis of bulk samples and thin films: Fundamental parameters method and theoretical coefficient algorithms

    NASA Astrophysics Data System (ADS)

    Sitko, Rafał

    2008-11-01

    Knowledge of X-ray tube spectral distribution is necessary in theoretical methods of matrix correction, i.e. in both fundamental parameter (FP) methods and theoretical influence coefficient algorithms. Thus, the influence of X-ray tube distribution on the accuracy of the analysis of thin films and bulk samples is presented. The calculations are performed using experimental X-ray tube spectra taken from the literature and theoretical X-ray tube spectra evaluated by three different algorithms proposed by Pella et al. (X-Ray Spectrom. 14 (1985) 125-135), Ebel (X-Ray Spectrom. 28 (1999) 255-266), and Finkelshtein and Pavlova (X-Ray Spectrom. 28 (1999) 27-32). In this study, the Fe-Cr-Ni system is selected as an example and the calculations are performed for X-ray tubes commonly applied in X-ray fluorescence analysis (XRF), i.e., Cr, Mo, Rh and W. The influence of X-ray tube spectra on FP analysis is evaluated when quantification is performed using various types of calibration samples. FP analysis of bulk samples is performed using pure-element bulk standards and multielement bulk standards similar to the analyzed material, whereas for FP analysis of thin films, the bulk and thin pure-element standards are used. For the evaluation of the influence of X-ray tube spectra on XRF analysis performed by theoretical influence coefficient methods, two algorithms for bulk samples are selected, i.e. the Claisse-Quintin (Can. Spectrosc. 12 (1967) 129-134) and COLA algorithms (G.R. Lachance, Paper Presented at the International Conference on Industrial Inorganic Elemental Analysis, Metz, France, June 3, 1981), and two algorithms (constant and linear coefficients) for thin films recently proposed by Sitko (X-Ray Spectrom. 37 (2008) 265-272).

  9. Spatial imaging in the soft x-ray region (20-304 Å) utilizing the astigmatism of a grazing incidence concave grating

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nudelfuden, A.; Solanki, R.; Moos, H.W.

    1985-03-15

    Soft x-ray (20-304 Å) astigmatic line shapes were measured in order to evaluate the spatial imaging properties of a Rowland-mounted concave grating in grazing incidence. The practicability of coarse 1-D spatial imaging in the soft x-ray region is demonstrated. Spatial resolution equivalent to approximately 4 cm at a source distance of 2 m can be achieved with practical parameters (e.g., sensitivity and time resolution) for a fusion diagnostic spectrograph. The results are compared to computer-generated ray tracings and found to be in good agreement. The ray tracing program which models the grazing incidence optics is discussed.

  10. Best-next-view algorithm for three-dimensional scene reconstruction using range images

    NASA Astrophysics Data System (ADS)

    Banta, J. E.; Zhien, Yu; Wang, X. Z.; Zhang, G.; Smith, M. T.; Abidi, Mongi A.

    1995-10-01

    The primary focus of the research detailed in this paper is to develop an intelligent sensing module capable of automatically determining the optimal next sensor position and orientation during scene reconstruction. To facilitate a solution to this problem, we have assembled a system for reconstructing a 3D model of an object or scene from a sequence of range images. Candidates for the best-next-view position are determined by detecting and measuring occlusions to the range camera's view in an image. Ultimately, the candidate which will reveal the greatest amount of unknown scene information is selected as the best-next-view position. Our algorithm uses ray tracing to determine how much new information a given sensor perspective will reveal. We have tested our algorithm successfully on several synthetic range data streams and found the system's results to be consistent with an intuitive human search. The models recovered by our system from range data compared well with the ideal models. Essentially, we have shown that range information of physical objects can be employed to automatically reconstruct a satisfactory dynamic 3D computer model at minimal computational expense. This has obvious implications in the contexts of robot navigation, manufacturing, and hazardous materials handling. The algorithm we developed requires no a priori information to find the best-next-view position.
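
    The scoring idea, casting rays from a candidate viewpoint and counting how much unknown volume they would uncover, can be sketched in two dimensions. Grid contents, candidate poses, and parameters below are hypothetical, and the real system works on range images rather than an occupancy grid:

    ```python
    # A minimal 2D sketch of best-next-view scoring: cast rays from each
    # candidate pose into a partially known occupancy grid and count the
    # UNKNOWN cells a sensor there would reveal.

    import numpy as np

    FREE, OCCUPIED, UNKNOWN = 0, 1, 2
    grid = np.full((40, 40), UNKNOWN)
    grid[:, :20] = FREE                          # left half already observed
    grid[10:30, 20] = OCCUPIED                   # a wall casting an occlusion shadow

    def revealed(grid, origin, n_rays=90, max_range=60):
        seen = set()
        for ang in np.linspace(0, 2 * np.pi, n_rays, endpoint=False):
            d = np.array([np.cos(ang), np.sin(ang)])
            for t in range(1, max_range):
                r, c = (origin + t * d).astype(int)
                if not (0 <= r < grid.shape[0] and 0 <= c < grid.shape[1]):
                    break
                if grid[r, c] == OCCUPIED:       # the ray stops at the first hit
                    break
                if grid[r, c] == UNKNOWN:
                    seen.add((r, c))
        return len(seen)

    candidates = [np.array([20.0, 5.0]), np.array([5.0, 19.0]), np.array([35.0, 19.0])]
    best = max(candidates, key=lambda p: revealed(grid, p))
    print("best-next-view candidate:", best, "reveals", revealed(grid, best), "cells")
    ```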

  11. 4D ultrasound speckle tracking of intra-fraction prostate motion: a phantom-based comparison with x-ray fiducial tracking using CyberKnife

    NASA Astrophysics Data System (ADS)

    O'Shea, Tuathan P.; Garcia, Leo J.; Rosser, Karen E.; Harris, Emma J.; Evans, Philip M.; Bamber, Jeffrey C.

    2014-04-01

    This study investigates the use of a mechanically-swept 3D ultrasound (3D-US) probe for soft-tissue displacement monitoring during prostate irradiation, with emphasis on quantifying the accuracy relative to CyberKnife® x-ray fiducial tracking. A US phantom implanted with x-ray fiducial markers was placed on a motion platform and translated in 3D using five real prostate motion traces acquired using the Calypso system. Motion traces were representative of all types of motion as classified by studying Calypso data for 22 patients. The phantom was imaged using a 3D swept linear-array probe (to mimic trans-perineal imaging) and, subsequently, the kV x-ray imaging system on CyberKnife. A 3D cross-correlation block-matching algorithm was used to track speckle in the ultrasound data. Fiducial and US data were each compared with known phantom displacement. Trans-perineal 3D-US imaging could track superior-inferior (SI) and anterior-posterior (AP) motion to ≤0.81 mm root-mean-square error (RMSE) at a 1.7 Hz volume rate. The maximum kV x-ray tracking RMSE was 0.74 mm; however, the prostate motion was sampled at a significantly lower imaging rate (mean: 0.04 Hz). Initial elevational (right-left, RL) US displacement estimates showed reduced accuracy but could be improved (RMSE <2.0 mm) using a correlation threshold in the ultrasound tracking code to remove erroneous inter-volume displacement estimates. Mechanically-swept 3D-US can track the major components of intra-fraction prostate motion accurately but exhibits some limitations. The largest US RMSE was for elevational (RL) motion. For the AP and SI axes, accuracy was sub-millimetre. It may be feasible to track prostate motion in 2D only. 3D-US also has the potential to achieve high tracking accuracy for all motion types. It would be advisable to use US in conjunction with a small (~2.0 mm) centre-of-mass displacement threshold, in which case it would be possible to take full advantage of the accuracy and high imaging rate capability.
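
    The core estimator, 3D block matching by normalized cross-correlation, is easy to state in brute-force form. This sketch verifies itself on a synthetic speckle volume with a known shift; a practical tracker would add search pruning and sub-voxel interpolation:

    ```python
    # A minimal sketch of 3D block matching by normalized cross-correlation (NCC).

    import numpy as np

    def ncc(a, b):
        a = a - a.mean(); b = b - b.mean()
        denom = np.sqrt((a * a).sum() * (b * b).sum())
        return (a * b).sum() / denom if denom else 0.0

    def track_block(vol0, vol1, corner, size=8, search=4):
        """Estimate the displacement of the block at `corner` between volumes."""
        z, y, x = corner
        ref = vol0[z:z+size, y:y+size, x:x+size]
        best, best_d = -2.0, (0, 0, 0)
        for dz in range(-search, search + 1):
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    cand = vol1[z+dz:z+dz+size, y+dy:y+dy+size, x+dx:x+dx+size]
                    if cand.shape != ref.shape:
                        continue                 # candidate window left the volume
                    score = ncc(ref, cand)
                    if score > best:
                        best, best_d = score, (dz, dy, dx)
        return best_d, best

    rng = np.random.default_rng(0)
    vol0 = rng.standard_normal((32, 32, 32))             # synthetic speckle
    vol1 = np.roll(vol0, shift=(2, -1, 3), axis=(0, 1, 2))  # known displacement
    print(track_block(vol0, vol1, (12, 12, 12)))         # -> ((2, -1, 3), ~1.0)
    ```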

  12. OceanRoute: Vessel Mobility Data Processing and Analyzing Model Based on MapReduce

    NASA Astrophysics Data System (ADS)

    Liu, Chao; Liu, Yingjian; Guo, Zhongwen; Jing, Wei

    2018-06-01

    Network coverage is a major problem in ocean communication, and there is no low-cost solution in the short term. Based on the concepts of Mobile Delay Tolerant Networks (MDTNs), the mobility of vessels can create opportunities for end-to-end communication. The mobility pattern of vessels is one of the key metrics for an ocean MDTN. Because of the high cost, few experiments have so far focused on vessel mobility patterns. In this paper, we study the traces of more than 4000 fishing and freight vessels. Firstly, to solve the data noise and sparsity problems, we design two algorithms to filter the noise and complement the missing data based on each vessel's turning feature. Secondly, after studying the traces of the vessels, we observe that the traces are confined by invisible boundaries. Thirdly, by defining a distance between traces, we design the MR-Similarity algorithm to find the mobility pattern of vessels. Finally, we implement our algorithm on a cluster and evaluate its performance and accuracy. Our results can provide guidelines for the design of data routing protocols on ocean MDTNs.
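
    The paper defines its own distance between traces for the MR-Similarity step; as a generic stand-in, the sketch below resamples two vessel polylines to a common number of points by arc length and takes the mean pointwise distance. All coordinates are hypothetical:

    ```python
    # A minimal sketch of a trace-to-trace distance (not the paper's definition).

    import numpy as np

    def resample(trace, n=50):
        """Linearly resample a (k, 2) polyline to n points by arc length."""
        trace = np.asarray(trace, float)
        seg = np.linalg.norm(np.diff(trace, axis=0), axis=1)
        s = np.concatenate([[0.0], np.cumsum(seg)])
        u = np.linspace(0.0, s[-1], n)
        return np.column_stack([np.interp(u, s, trace[:, i]) for i in range(2)])

    def trace_distance(a, b, n=50):
        ra, rb = resample(a, n), resample(b, n)
        return np.linalg.norm(ra - rb, axis=1).mean()

    t1 = [(0, 0.0), (1, 0.1), (2, 0.0), (3, 0.2)]
    t2 = [(0, 0.3), (1.5, 0.4), (3, 0.5)]
    print(f"mean trace distance: {trace_distance(t1, t2):.3f}")
    ```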

  13. Negative dysphotopsia: Causes and rationale for prevention and treatment.

    PubMed

    Holladay, Jack T; Simpson, Michael J

    2017-02-01

    To determine the cause of negative dysphotopsia using standard ray-tracing techniques and identify the primary and secondary causative factors. Department of Ophthalmology, Baylor College of Medicine, Houston, Texas, USA. Experimental study. Zemax ray-tracing software was used to evaluate pseudophakic and phakic eye models to show the location of retinal field images from various visual field objects. Phakic retinal field angles (RFAs) were used as a reference for the perceived field locations for retinal images in pseudophakic eyes. In a nominal acrylic pseudophakic eye model with a 2.5 mm diameter pupil, the maximum RFA from rays refracted by the intraocular lens (IOL) was 85.7 degrees and the minimum RFA for rays missing the optic of the IOL was 88.3 degrees, leaving a dark gap (shadow) of 2.6 degrees in the extreme temporal field. The width of the shadow was more prominent for a smaller pupil, a larger angle kappa, an equi-biconvex or plano-convex IOL shape, and a smaller axial distance from iris to IOL and with the anterior capsule overlying the nasal IOL. Secondary factors included IOL edge design, material, diameter, decentration, tilt, and aspheric surfaces. Standard ray-tracing techniques showed that a shadow is present when there is a gap between the retinal images formed by rays missing the optic of the IOL and rays refracted by the IOL. Primary and secondary factors independently affected the width and location of the gap (or overlap). The ray tracing also showed a constriction and double retinal imaging in the extreme temporal visual field. Copyright © 2017 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.

  14. (U) Second-Order Sensitivity Analysis of Uncollided Particle Contributions to Radiation Detector Responses Using Ray-Tracing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Favorite, Jeffrey A.

    The Second-Level Adjoint Sensitivity System (2nd-LASS) that yields the second-order sensitivities of a response of uncollided particles with respect to isotope densities, cross sections, and source emission rates is derived in Refs. 1 and 2. In Ref. 2, we solved problems for the uncollided leakage from a homogeneous sphere and a multiregion cylinder using the PARTISN multigroup discrete-ordinates code. In this memo, we derive solutions of the 2nd-LASS for the particular case when the response is a flux or partial current density computed at a single point on the boundary, and the inner products are computed using ray-tracing. Both the PARTISN approach and the ray-tracing approach are implemented in a computer code, SENSPG. The next section of this report presents the equations of the 1st- and 2nd-LASS for uncollided particles and the first- and second-order sensitivities that use the solutions of the 1st- and 2nd-LASS. Section III presents solutions of the 1st- and 2nd-LASS equations for the case of ray-tracing from a detector point. Section IV presents specific solutions of the 2nd-LASS and derives the ray-trace form of the inner products needed for second-order sensitivities. Numerical results for the total leakage from a homogeneous sphere are presented in Sec. V and for the leakage from one side of a two-region slab in Sec. VI. Section VII is a summary and conclusions.

  15. Analysis of the Effect of Electron Density Perturbations Generated by Gravity Waves on HF Communication Links

    NASA Astrophysics Data System (ADS)

    Fagre, M.; Elias, A. G.; Chum, J.; Cabrera, M. A.

    2017-12-01

    In the present work, ray tracing of high frequency (HF) signals under disturbed ionospheric conditions is analyzed, particularly in the presence of electron density perturbations generated by gravity waves (GWs). The three-dimensional numerical ray tracing code by Jones and Stephenson, based on Hamilton's equations and commonly used to study radio propagation through the ionosphere, is employed. An electron density perturbation model is implemented in this code based upon the consideration of atmospheric GWs generated at a height of 150 km in the thermosphere and propagating up into the ionosphere. The motion of the neutral gas at these altitudes induces disturbances in the background plasma which affect HF signal propagation. To obtain a realistic model of GWs for the analysis of the propagation and dispersion characteristics, a GW ray tracing method with kinematic viscosity and thermal diffusivity was applied. The IRI-2012, HWM14 and NRLMSISE-00 models were incorporated to assess the electron density, wind velocities, neutral temperature and total mass density needed for the ray tracing codes. Preliminary results of gravity wave effects on ground range and reflection height are presented for the low- and mid-latitude ionosphere.

  16. Total-reflection X-ray fluorescence studies of trace elements in biomedical samples

    NASA Astrophysics Data System (ADS)

    Kubala-Kukuś, A.; Braziewicz, J.; Pajek, M.

    2004-08-01

    Application of the total-reflection X-ray fluorescence (TXRF) analysis in studies of trace element contents in biomedical samples is discussed in the following aspects: (i) the nature of trace element concentration distributions, (ii) a censoring approach to the detection limits, and (iii) the comparison of two sets of censored data. The paper summarizes recent results on these topics, in particular the lognormal, or more generally logstable, nature of the concentration distributions of trace elements, the random left-censoring and the Kaplan-Meier approach accounting for detection limits, and, finally, the application of the logrank test to compare the censored concentrations measured for two groups. These aspects, which are of importance for applications of TXRF in different fields, are discussed here in the context of TXRF studies of trace elements in various samples of medical interest.

  17. New Developments in Hard X-ray Fluorescence Microscopy for In-situ Investigations of Trace Element Distributions in Aqueous Systems of Soil Colloids

    NASA Astrophysics Data System (ADS)

    Gleber, Sophie-Charlotte; Weinhausen, Britta; Köster, Sarah; Ward, Jesse; Vine, David; Finney, Lydia; Vogt, Stefan

    2013-10-01

    The distribution, binding and release of trace elements on soil colloids determine matter transport through the soil matrix; their study necessitates an aqueous environment and short length and time scales. However, not many microscopy techniques allow for that. We previously demonstrated the capability of hard x-ray fluorescence microscopy to image aqueous colloidal soil samples [1]. As this technique provides attogram sensitivity for transition elements like Cu, Zn, and other geochemically relevant trace elements at sub-micrometer spatial resolution (currently down to 150 nm at 2-ID-E [2]; below 50 nm at the Bionanoprobe, cf. G. Woloschak et al., this volume), combined with the capability to penetrate tens of micrometers of water, it is ideally suited for imaging the elemental content of soil colloids. To address the question of binding and release processes of trace elements on the surface of soil colloids, we developed a microfluidics-based XRF flow cytometer and expanded the applied methods of hard x-ray fluorescence microscopy towards three-dimensional imaging. Here, we show (a) the 2-D imaged distributions of Si, K and Fe on soil colloids of Pseudogley samples; (b) how the trace element distribution is a dynamic, pH-dependent process; and (c) x-ray tomographic applications to render the trace elemental distributions in 3-D. We conclude that the approach presented here shows remarkable potential to image and quantitate elemental distributions from samples within their natural aqueous microenvironment, which is particularly important in the environmental, medical, and biological sciences.

  18. Ray-tracing of shape metrology data of grazing incidence x-ray astronomy mirrors

    NASA Astrophysics Data System (ADS)

    Zocchi, Fabio E.; Vernani, Dervis

    2008-07-01

    A number of future X-ray astronomy missions (e.g. Simbol-X, eROSITA) plan to utilize high throughput grazing incidence optics with very lightweight mirrors. The severe mass specifications require a further optimization of the existing technology with the consequent need of proper optical numerical modeling capabilities for both the masters and the mirrors. A ray tracing code has been developed for the simulation of the optical performance of type I Wolter masters and mirrors starting from 2D and 3D metrology data. In particular, in the case of 2D measurements, a 3D data set is reconstructed on the basis of dimensional references and used for the optical analysis by ray tracing. In this approach, the actual 3D shape is used for the optical analysis, thus avoiding the need of combining the separate contributions of different 2D measurements that require the knowledge of their interactions which is not normally available. The paper describes the proposed approach and presents examples of application on a prototype engineering master in the frame of ongoing activities carried out for present and future X-ray missions.

  19. Efficient calculation of luminance variation of a luminaire that uses LED light sources

    NASA Astrophysics Data System (ADS)

    Goldstein, Peter

    2007-09-01

    Many luminaires have an array of LEDs that illuminate a lenslet-array diffuser in order to create the appearance of a single, extended source with a smooth luminance distribution. Designing such a system is challenging because luminance calculations for a lenslet array generally involve tracing millions of rays per LED, which is computationally intensive and time-consuming. This paper presents a technique for calculating an on-axis luminance distribution by tracing only one ray per LED per lenslet. A multiple-LED system is simulated with this method, and with Monte Carlo ray-tracing software for comparison. Accuracy improves, and computation time decreases by at least five orders of magnitude with this technique, which has applications in LED-based signage, displays, and general illumination.

  20. Determination by ray-tracing of the regions where mid-latitude whistlers exit from the lower ionosphere

    NASA Astrophysics Data System (ADS)

    Strangeways, H. J.

    1981-03-01

    The size and position of the regions in the bottomside ionosphere through which downcoming whistlers emerge are estimated using ray-tracing calculations in both summer day and winter night models of the magnetospheric plasma. Consideration is given to the trapping of upgoing whistler-mode waves through both the base and the side of ducts. It is found that for downcoming rays which were trapped in the duct in the summer day model, the limited range of wave-normal angles which can be transmitted from the lower ionosphere to free space below causes the size of the exit point to be considerably smaller than the region of incidence. The exit point is found to be approximately 100 km in size, which agrees with ground-based observations of fairly narrow trace whistlers. For rays trapped in the duct in the winter night model, it is found that the size of the exit point is more nearly the same as the range of final latitudes of the downcoming rays in the lower ionosphere.

  1. Rapid simulation of X-ray transmission imaging for baggage inspection via GPU-based ray-tracing

    NASA Astrophysics Data System (ADS)

    Gong, Qian; Stoian, Razvan-Ionut; Coccarelli, David S.; Greenberg, Joel A.; Vera, Esteban; Gehm, Michael E.

    2018-01-01

    We present a pipeline that rapidly simulates X-ray transmission imaging for arbitrary system architectures using GPU-based ray-tracing techniques. The purpose of the pipeline is to enable statistical analysis of threat detection in the context of airline baggage inspection. As a faster alternative to Monte Carlo methods, we adopt a deterministic approach for simulating photoelectric absorption-based imaging. The highly-optimized NVIDIA OptiX API is used to implement the ray-tracing, greatly speeding code execution. In addition, we implement the first hierarchical representation structure to determine the interaction path length of rays traversing heterogeneous media described by layered polygons. The accuracy of the pipeline has been validated by comparing simulated data with experimental data collected using a heterogeneous phantom and a laboratory X-ray imaging system. On a single computer, our approach allows us to generate over 400 2D transmission projections (125 × 125 pixels per frame) per hour for a bag packed with hundreds of everyday objects. By implementing our approach on cloud-based GPU computing platforms, we find that the same 2D projections of approximately 3.9 million bags can be obtained in a single day using 400 GPU instances, at a cost of only 0.001 per bag.
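
    Downstream of the ray tracer, the deterministic photoelectric-absorption model is just Beer-Lambert attenuation over the per-material path lengths each ray accumulates. A minimal sketch with hypothetical attenuation coefficients:

    ```python
    # Beer-Lambert attenuation of one transmission ray, given per-material
    # path lengths such as a ray tracer would report. Coefficients are
    # hypothetical single-energy values, not the pipeline's actual data.

    import numpy as np

    mu = {"water": 0.02, "aluminum": 0.06, "steel": 0.25}   # 1/mm

    def transmitted(i0, path_lengths_mm):
        """path_lengths_mm: {material: traversed length in mm} for one ray."""
        tau = sum(mu[m] * L for m, L in path_lengths_mm.items())
        return i0 * np.exp(-tau)

    # One ray through a bag: 80 mm of water-like material and a 3 mm steel plate.
    print(f"relative intensity: {transmitted(1.0, {'water': 80.0, 'steel': 3.0}):.3f}")
    ```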

  2. Using recurrence plot analysis for software execution interpretation and fault detection

    NASA Astrophysics Data System (ADS)

    Mosdorf, M.

    2015-09-01

    This paper shows a method targeted at software execution interpretation and fault detection using recurrence plot analysis. In the proposed approach, recurrence plot analysis is applied to a software execution trace that contains the executed assembly instructions. The results of this analysis are further processed with principal component analysis (PCA), which reduces the number of coefficients used for software execution classification. The method was used for the analysis of five algorithms: Bubble Sort, Quick Sort, Median Filter, FIR, and SHA-1. Results show that some of the collected traces could be easily assigned to particular algorithms (logs from the Bubble Sort and FIR algorithms) while others are more difficult to distinguish.
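
    A recurrence plot itself is straightforward to compute: embed the scalar trace with time delays and threshold the pairwise distances between embedded points. The sketch below uses a synthetic periodic series in place of an instruction trace; the resulting recurrence rate is one example of a coefficient that could feed the PCA stage:

    ```python
    # A minimal sketch of building a recurrence plot from a numeric trace.

    import numpy as np

    def recurrence_plot(series, dim=3, delay=1, eps=0.1):
        x = np.asarray(series, float)
        n = len(x) - (dim - 1) * delay
        # Time-delay embedding: each row is (x[t], x[t+delay], ..., x[t+(dim-1)*delay]).
        emb = np.column_stack([x[i * delay : i * delay + n] for i in range(dim)])
        d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
        return (d < eps).astype(np.uint8)   # 1 where the trajectory revisits itself

    trace = np.sin(np.linspace(0, 8 * np.pi, 200))   # stand-in periodic "trace"
    rp = recurrence_plot(trace, eps=0.2)
    print(rp.shape, rp.mean())                       # recurrence rate as one coefficient
    ```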

  3. Compton scattering artifacts in electron excited X-ray spectra measured with a silicon drift detector.

    PubMed

    Ritchie, Nicholas W M; Newbury, Dale E; Lindstrom, Abigail P

    2011-12-01

    Artifacts are the nemesis of trace element analysis in electron-excited energy dispersive X-ray spectrometry. Peaks that result from nonideal behavior in the detector or sample can fool even an experienced microanalyst into believing that they have trace amounts of an element that is not present. Many artifacts, such as the Si escape peak, absorption edges, and coincidence peaks, can be traced to the detector. Others, such as secondary fluorescence peaks and scatter peaks, can be traced to the sample. We have identified a new sample-dependent artifact that we attribute to Compton scattering of energetic X-rays generated in a small feature and subsequently scattered from a low atomic number matrix. It seems likely that this artifact has not previously been reported because it only occurs under specific conditions and represents a relatively small signal. However, with the advent of silicon drift detectors and their utility for trace element analysis, we anticipate that more people will observe it and possibly misidentify it. Though small, the artifact is not inconsequential. Under some conditions, it is possible to mistakenly identify the Compton scatter artifact as approximately 1% of an element that is not present.

  4. Understanding GRETINA using angular correlation method

    NASA Astrophysics Data System (ADS)

    Austin, Madeline

    2015-10-01

    The ability to trace the path of gamma rays through germanium is not only necessary for taking full advantage of GRETINA but also a promising possibility for homeland security defense against nuclear threats. This research tested the current tracking algorithm using the angular correlation method by comparing results from raw and tracked data to the theoretical model for Co-60. It was found that the current tracking method is unsuccessful in reproducing angular correlation. Variations to the tracking algorithm were made in the FM value, tracking angle, number of angles of separation observed, and window of coincidence in an attempt to improve correlation results. From these variations it was observed that having a larger FM improved results, that reducing the number of observational angles worsened correlation, and that overall larger tracking angles improved with larger windows of coincidence and vice versa. Future research would refine the angle of measurement for raw data and explore the possibility of an energy dependence by testing other elements. This work is supported by the United States Department of Energy, Office of Science, under Contract Number DE-AC02-06CH11357.

  5. Effects of Combined Stellar Feedback on Star Formation in Stellar Clusters

    NASA Astrophysics Data System (ADS)

    Wall, Joshua Edward; McMillan, Stephen; Pellegrino, Andrew; Mac Low, Mordecai; Klessen, Ralf; Portegies Zwart, Simon

    2018-01-01

    We present results of hybrid MHD+N-body simulations of star cluster formation and evolution including self-consistent feedback from the stars in the form of radiation, winds, and supernovae from all stars more massive than 7 solar masses. The MHD is modeled with the adaptive mesh refinement code FLASH, while the N-body computations are done with a direct algorithm. Radiation is modeled using ray tracing along long characteristics in directions distributed using the HEALPIX algorithm, and causes ionization and momentum deposition, while winds and supernovae conserve momentum and energy during injection. Stellar evolution is followed using power-law fits to evolution models in SeBa. We use a gravity bridge within the AMUSE framework to couple the N-body dynamics of the stars to the gas dynamics in FLASH. Feedback from the massive stars alters the structure of young clusters as gas ejection occurs. We diagnose this behavior by distinguishing between fractal distribution and central clustering using a Q parameter computed from the minimum spanning tree of each model cluster. Global effects of feedback in our simulations will also be discussed.

  6. Simultaneous optimization of micro-heliostat geometry and field layout using a genetic algorithm

    NASA Astrophysics Data System (ADS)

    Lazardjani, Mani Yousefpour; Kronhardt, Valentina; Dikta, Gerhard; Göttsche, Joachim

    2016-05-01

    A new optimization tool for micro-heliostat (MH) geometry and field layout is presented. The method aims at simultaneous performance improvement and cost reduction through iteration of heliostat geometry and field-layout parameters. The tool was developed primarily for the optimization of a novel micro-heliostat concept devised at Solar-Institut Jülich (SIJ); however, the underlying optimization approach can be used for any heliostat type. During the optimization, performance is calculated using the ray-tracing tool SolCal. The costs of the heliostats are calculated by means of a detailed cost function. A genetic algorithm is used to change heliostat geometry and field layout in an iterative process. Starting from an initial setup, the optimization tool generates several configurations of heliostat geometries and field layouts. For each configuration a cost-performance ratio is calculated. Based on that, the best geometry and field layout can be selected in each optimization step. In order to find the best configuration, this step is repeated until no significant improvement in the results is observed.
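
    The optimization loop described above, selection on a cost-performance ratio with crossover and mutation, can be sketched generically. The objective below is a stand-in; in the actual tool each candidate's performance comes from SolCal ray tracing and its cost from the detailed cost function:

    ```python
    # A minimal genetic-algorithm sketch with a hypothetical two-parameter
    # objective standing in for the ray-traced cost-performance ratio.

    import numpy as np
    rng = np.random.default_rng(1)

    def cost_performance(p):                 # toy objective; minimum near (2, 5)
        return (p[0] - 2.0) ** 2 + (p[1] - 5.0) ** 2 + 1.0

    pop = rng.uniform(0, 10, size=(30, 2))   # initial population of candidates
    for gen in range(50):
        scores = np.array([cost_performance(p) for p in pop])
        parents = pop[np.argsort(scores)[:10]]             # selection: keep best third
        kids = []
        for _ in range(len(pop) - len(parents)):
            a, b = parents[rng.integers(10, size=2)]       # crossover: blend two parents
            kids.append(0.5 * (a + b) + rng.normal(0, 0.3, 2))  # mutation: Gaussian noise
        pop = np.vstack([parents, kids])
    best = pop[np.argmin([cost_performance(p) for p in pop])]
    print("best parameters:", best.round(2))
    ```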

  7. High-resolution computed tomography of single breast cancer microcalcifications in vivo.

    PubMed

    Inoue, Kazumasa; Liu, Fangbing; Hoppin, Jack; Lunsford, Elaine P; Lackas, Christian; Hesterman, Jacob; Lenkinski, Robert E; Fujii, Hirofumi; Frangioni, John V

    2011-08-01

    Microcalcification is a hallmark of breast cancer and a key diagnostic feature for mammography. We recently described the first robust animal model of breast cancer microcalcification. In this study, we hypothesized that high-resolution computed tomography (CT) could potentially detect the genesis of a single microcalcification in vivo and quantify its growth over time. Using a commercial CT scanner, we systematically optimized acquisition and reconstruction parameters. Two ray-tracing image reconstruction algorithms were tested: a voxel-driven "fast" cone beam algorithm (FCBA) and a detector-driven "exact" cone beam algorithm (ECBA). By optimizing acquisition and reconstruction parameters, we were able to achieve a resolution of 104 μm full width at half-maximum (FWHM). At an optimal detector sampling frequency, the ECBA provided a 28 μm (21%) FWHM improvement in resolution over the FCBA. In vitro, we were able to image a single 300 μm × 100 μm hydroxyapatite crystal. In a syngeneic rat model of breast cancer, we were able to detect the genesis of a single microcalcification in vivo and follow its growth longitudinally over weeks. Taken together, this study provides an in vivo "gold standard" for the development of calcification-specific contrast agents and a model system for studying the mechanism of breast cancer microcalcification.

  8. Simulating optoelectronic systems for remote sensing with SENSOR

    NASA Astrophysics Data System (ADS)

    Boerner, Anko

    2003-04-01

    The consistent end-to-end simulation of airborne and spaceborne remote sensing systems is an important task and sometimes the only way for the adaptation and optimization of a sensor and its observation conditions, the choice and test of algorithms for data processing, error estimation and the evaluation of the capabilities of the whole sensor system. The presented software simulator SENSOR (Software ENvironment for the Simulation of Optical Remote sensing systems) includes a full model of the sensor hardware, the observed scene, and the atmosphere in between. It allows the simulation of a wide range of optoelectronic systems for remote sensing. The simulator consists of three parts. The first part describes the geometrical relations between scene, sun, and the remote sensing system using a ray tracing algorithm. The second part of the simulation environment considers the radiometry. It calculates the at-sensor radiance using a pre-calculated multidimensional lookup-table taking the atmospheric influence on the radiation into account. Part three consists of an optical and an electronic sensor model for the generation of digital images. Using SENSOR for an optimization requires the additional application of task-specific data processing algorithms. The principle of the end-to-end-simulation approach is explained, all relevant concepts of SENSOR are discussed, and examples of its use are given. The verification of SENSOR is demonstrated.

  9. Chapter 13. Exploring Use of the Reserved Core

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holmen, John; Humphrey, Alan; Berzins, Martin

    2015-07-29

    In this chapter, we illustrate the benefits of thinking in terms of thread management techniques when using a centralized scheduler model along with interoperability of MPI and PThreads. This is facilitated through an exploration of thread placement strategies for an algorithm modeling radiative heat transfer, with special attention to the 61st core. This algorithm plays a key role within the Uintah Computational Framework (UCF) and current efforts taking place at the University of Utah to model next-generation, large-scale clean coal boilers. In such simulations, this algorithm models the dominant form of heat transfer and consumes a large portion of compute time. Exemplified by a real-world example, this chapter presents our early efforts in porting a key portion of a scalability-centric codebase to the Intel Xeon Phi coprocessor. Specifically, this chapter presents results from our experiments profiling the native execution of a reverse Monte-Carlo ray tracing-based radiation model on a single coprocessor. These results demonstrate that our fastest run configurations utilized the 61st core and that performance was not profoundly impacted when explicitly oversubscribing the coprocessor operating system thread. Additionally, this chapter presents a portion of the radiation model source code, a MIC-centric UCF cross-compilation example, and a less conventional thread management technique for developers utilizing the PThreads threading model.

  10. The XMM Cluster Survey: X-ray analysis methodology

    NASA Astrophysics Data System (ADS)

    Lloyd-Davies, E. J.; Romer, A. Kathy; Mehrtens, Nicola; Hosmer, Mark; Davidson, Michael; Sabirli, Kivanc; Mann, Robert G.; Hilton, Matt; Liddle, Andrew R.; Viana, Pedro T. P.; Campbell, Heather C.; Collins, Chris A.; Dubois, E. Naomi; Freeman, Peter; Harrison, Craig D.; Hoyle, Ben; Kay, Scott T.; Kuwertz, Emma; Miller, Christopher J.; Nichol, Robert C.; Sahlén, Martin; Stanford, S. A.; Stott, John P.

    2011-11-01

    The XMM Cluster Survey (XCS) is a serendipitous search for galaxy clusters using all publicly available data in the XMM-Newton Science Archive. Its main aims are to measure cosmological parameters and trace the evolution of X-ray scaling relations. In this paper we describe the data processing methodology applied to the 5776 XMM observations used to construct the current XCS source catalogue. A total of 3675 > 4σ cluster candidates with >50 background-subtracted X-ray counts are extracted from a total non-overlapping area suitable for cluster searching of 410 deg2. Of these, 993 candidates are detected with >300 background-subtracted X-ray photon counts, and we demonstrate that robust temperature measurements can be obtained down to this count limit. We describe in detail the automated pipelines used to perform the spectral and surface brightness fitting for these candidates, as well as to estimate redshifts from the X-ray data alone. A total of 587 (122) X-ray temperatures to a typical accuracy of <40 (<10) per cent have been measured to date. We also present the methodology adopted for determining the selection function of the survey, and show that the extended source detection algorithm is robust to a range of cluster morphologies by inserting mock clusters derived from hydrodynamical simulations into real XMM images. These tests show that a simple isothermal β-profile is sufficient to capture the essential details of the cluster population detected in the archival XMM observations. The redshift follow-up of the XCS cluster sample is presented in a companion paper, together with a first data release of 503 optically confirmed clusters.

  11. Trace gas detection in hyperspectral imagery using the wavelet packet subspace

    NASA Astrophysics Data System (ADS)

    Salvador, Mark A. Z.

    This dissertation describes research into a new remote sensing method to detect trace gases in hyperspectral and ultra-spectral data. This new method is based on the wavelet packet transform. It attempts to improve both the computational tractability and the detection of trace gases in airborne and spaceborne spectral imagery. Atmospheric trace gas research supports various Earth science disciplines, including climatology, vulcanology, pollution monitoring, natural disasters, and intelligence and military applications. Hyperspectral and ultra-spectral data significantly increase the data glut of existing Earth science data sets. Spaceborne spectral data in particular significantly increase spectral resolution while performing daily global collections of the Earth. Application of the wavelet packet transform to the spectral space of hyperspectral and ultra-spectral imagery data potentially improves remote sensing detection algorithms. It also facilitates the parallelization of these methods for high performance computing. This research pursues two science goals: (1) developing a new spectral imagery detection algorithm, and (2) facilitating the parallelization of trace gas detection in spectral imagery data.

  12. Exploiting Data Similarity to Reduce Memory Footprints

    DTIC Science & Technology

    2011-01-01

    [Fragmentary indexing excerpt] The record lists benchmark applications: leslie3d, a Fortran computational fluid dynamics (CFD) application; 122.tachyon, a C parallel ray-tracing application; and 128.GAPgeofem, a C and Fortran simulation code. The surviving text contrasts an application that benefits most from SBLLmalloc, LAMMPS, which shows moderate similarity from primarily zero pages, and 122.tachyon, an image-rendering application whose pages show similarity across MPI tasks; they are primarily zero pages, although a small fraction (≈10%) are non-zero pages.

  13. Flowfield computer graphics

    NASA Technical Reports Server (NTRS)

    Desautel, Richard

    1993-01-01

    The objectives of this research include supporting the Aerothermodynamics Branch's research by developing graphical visualization tools for both the branch's adaptive grid code and flow field ray tracing code. The completed research for the reporting period includes development of a graphical user interface (GUI) and its implementation into the NAS Flowfield Analysis Software Tool kit (FAST), for both the adaptive grid code (SAGE) and the flow field ray tracing code (CISS).

  14. MALBEC: a new CUDA-C ray-tracer in general relativity

    NASA Astrophysics Data System (ADS)

    Quiroga, G. D.

    2018-06-01

    A new CUDA-C code for tracing orbits around non-charged black holes is presented. This code, named MALBEC, takes advantage of graphics processing units and the CUDA platform for tracking null and timelike test particles in Schwarzschild and Kerr spacetimes. Also, a new general set of equations that describes the closed circular orbits of any timelike test particle in the equatorial plane is derived. These equations are extremely important in order to compare the analytical behavior of the orbits with the numerical results and verify the correct implementation of the Runge-Kutta algorithm in MALBEC. Finally, other numerical tests are performed, demonstrating that MALBEC is able to reproduce some well-known results in these metrics in a faster and more efficient way than a conventional CPU implementation.
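
    A CPU-side illustration of the kind of check MALBEC's circular-orbit verification performs: the equatorial null geodesics of Schwarzschild obey d²u/dφ² = 3Mu² - u with u = 1/r (G = c = 1), and the photon sphere at r = 3M should be preserved by a correct Runge-Kutta integration. This sketch is ours, not MALBEC's CUDA-C code:

    ```python
    # RK4 integration of the Schwarzschild photon orbit equation.

    import numpy as np

    M = 1.0

    def rhs(y):                        # y = [u, du/dphi]
        u, v = y
        return np.array([v, 3.0 * M * u * u - u])

    def rk4(y, h):
        k1 = rhs(y); k2 = rhs(y + 0.5 * h * k1)
        k3 = rhs(y + 0.5 * h * k2); k4 = rhs(y + h * k3)
        return y + h / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

    # The photon sphere r = 3M is the closed circular null orbit (u'' = 0 there).
    y = np.array([1.0 / (3.0 * M), 0.0])
    for _ in range(int(2 * np.pi / 1e-3)):   # one full revolution in phi
        y = rk4(y, 1e-3)
    print(f"r after one orbit: {1.0 / y[0]:.6f} (expected 3M = {3 * M})")
    ```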

  15. FPGA based charge acquisition algorithm for soft x-ray diagnostics system

    NASA Astrophysics Data System (ADS)

    Wojenski, A.; Kasprowicz, G.; Pozniak, K. T.; Zabolotny, W.; Byszuk, A.; Juszczyk, B.; Kolasinski, P.; Krawczyk, R. D.; Zienkiewicz, P.; Chernyshova, M.; Czarski, T.

    2015-09-01

    Soft X-ray (SXR) measurement systems working in tokamaks or with laser-generated plasma can expect high photon fluxes. It is therefore necessary to focus on data processing algorithms with the best possible efficiency in terms of processed photon events per second. This paper describes a recently designed algorithm and data flow for the implementation of charge data acquisition in an FPGA. The algorithms are currently at the implementation stage for the soft X-ray diagnostics system. Besides the charge processing algorithm, the paper also describes the general firmware overview, data storage methods, and other key components of the measurement system. The simulation section presents algorithm performance and the expected maximum photon rate.

  16. Novel trace chemical detection algorithms: a comparative study

    NASA Astrophysics Data System (ADS)

    Raz, Gil; Murphy, Cara; Georgan, Chelsea; Greenwood, Ross; Prasanth, R. K.; Myers, Travis; Goyal, Anish; Kelley, David; Wood, Derek; Kotidis, Petros

    2017-05-01

    Algorithms for standoff detection and estimation of trace chemicals in hyperspectral images in the IR band are a key component for a variety of applications relevant to law-enforcement and the intelligence communities. Performance of these methods is impacted by spectral signature variability due to the presence of contaminants, surface roughness, and nonlinear dependence on abundances, as well as by operational limitations on the compute platforms. In this work we provide a comparative performance and complexity analysis of several classes of algorithms as a function of noise levels, error distribution, scene complexity, and spatial degrees of freedom. The algorithm classes we analyze and test include the adaptive cosine estimator (ACE and modifications to it), compressive/sparse methods, Bayesian estimation, and machine learning. We explicitly call out the conditions under which each algorithm class is optimal or near optimal, as well as their built-in limitations and failure modes.
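
    Among the classes compared, the adaptive cosine estimator (ACE) is the most compact to state: the squared cosine between a mean-removed pixel spectrum and the target signature after whitening by the background covariance. A self-contained sketch on synthetic data (all spectra are hypothetical):

    ```python
    # A minimal sketch of the ACE detection statistic on synthetic spectra.

    import numpy as np

    def ace_scores(cube, target):
        """cube: (n_pixels, n_bands) spectra; target: (n_bands,) signature."""
        mu = cube.mean(axis=0)
        X = cube - mu
        cov = np.cov(X, rowvar=False) + 1e-6 * np.eye(X.shape[1])  # regularized
        cov_inv = np.linalg.inv(cov)
        s = target - mu
        num = (X @ cov_inv @ s) ** 2
        den = (s @ cov_inv @ s) * np.einsum('ij,jk,ik->i', X, cov_inv, X)
        return num / den                 # squared cosine in whitened space

    rng = np.random.default_rng(0)
    cube = rng.standard_normal((500, 30))   # synthetic background pixels
    sig = rng.standard_normal(30)           # hypothetical plume signature
    cube[0] += sig                          # implant one target pixel
    scores = ace_scores(cube, sig)
    print("implanted pixel is top-ranked:", scores.argmax() == 0)
    ```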

  17. Fuzzy Sarsa with Focussed Replacing Eligibility Traces for Robust and Accurate Control

    NASA Astrophysics Data System (ADS)

    Kamdem, Sylvain; Ohki, Hidehiro; Sueda, Naomichi

    Several methods of reinforcement learning in continuous state and action spaces that utilize fuzzy logic have been proposed in recent years. This paper introduces Fuzzy Sarsa(λ), an on-policy algorithm for fuzzy learning that relies on a novel way of computing replacing eligibility traces to accelerate the policy evaluation. It is tested against several temporal difference learning algorithms: Sarsa(λ), Fuzzy Q(λ), an earlier fuzzy version of Sarsa, and an actor-critic algorithm. We perform detailed evaluations on two benchmark problems: a maze domain and the cart pole. Results of various tests highlight the strengths and weaknesses of these algorithms and show that Fuzzy Sarsa(λ) outperforms all other algorithms tested for a larger granularity of design and under noisy conditions. It is a highly competitive method of learning in realistic noisy domains where a denser fuzzy design over the state space is needed for more precise control.
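
    The "replacing" mechanic that Fuzzy Sarsa(λ) builds on is easiest to see in the tabular case: the trace of the visited state-action pair is reset to 1 rather than incremented, and all traces then decay by γλ. A minimal sketch on a toy chain task with a purely exploratory behavior policy (this is plain tabular Sarsa(λ), not the paper's fuzzy, continuous-space algorithm):

    ```python
    # Tabular Sarsa(lambda) with replacing eligibility traces on a chain task.

    import numpy as np
    rng = np.random.default_rng(0)

    n_states, n_actions = 10, 2                 # chain; action 1 moves right
    Q = np.zeros((n_states, n_actions))
    alpha, gamma, lam = 0.1, 0.95, 0.9

    def step(s, a):                             # reward 1 on reaching the right end
        s2 = min(s + 1, n_states - 1) if a == 1 else max(s - 1, 0)
        return s2, float(s2 == n_states - 1)

    for _ in range(300):                        # episodes, random behavior policy
        E = np.zeros_like(Q)                    # eligibility traces
        s, a = 0, rng.integers(n_actions)
        for _ in range(200):
            s2, r = step(s, a)
            a2 = rng.integers(n_actions)        # exploratory on-policy choice
            delta = r + gamma * Q[s2, a2] - Q[s, a]
            E[s, a] = 1.0                       # replacing trace (accumulating would be += 1)
            Q += alpha * delta * E              # credit every recently visited pair
            E *= gamma * lam
            s, a = s2, a2
            if r == 1.0:
                break
    print("greedy actions along the chain:", Q[:-1].argmax(axis=1))  # mostly 1 (right)
    ```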

  18. User's Guide for Computer Program that Routes Signal Traces

    NASA Technical Reports Server (NTRS)

    Hedgley, David R., Jr.

    2000-01-01

    This disk contains both a FORTRAN computer program and the corresponding user's guide, which facilitates both the program's incorporation into your system and its use. The computer program implements an efficient algorithm that routes signal traces on the layers of a printed circuit with both through-pins and surface mounts. It is an implementation of the ideas presented in the theoretical paper "A Formal Algorithm for Routing Signal Traces on a Printed Circuit Board", NASA TP-3639, published in 1996. The computer program in the "connects" file can be read with a FORTRAN compiler and readily integrated into software unique to each particular environment where it might be used.

  19. Ray tracing simulation of aero-optical effect using multiple gradient index layer

    NASA Astrophysics Data System (ADS)

    Yang, Seul Ki; Seong, Sehyun; Ryu, Dongok; Kim, Sug-Whan; Kwon, Hyeuknam; Jin, Sang-Hun; Jeong, Ho; Kong, Hyun Bae; Lim, Jae Wan; Choi, Jong Hwa

    2016-10-01

    We present a new ray tracing simulation of the aero-optical effect through anisotropic inhomogeneous media, such as the supersonic flow field surrounding a projectile. The new method uses multiple gradient-index (GRIN) layers to construct the anisotropic inhomogeneous media for ray tracing simulation. The cone-shaped projectile studied has a 19° semi-vertex angle; a sapphire window is parallel to the cone angle; and the optical system of the projectile was assumed via paraxial optics and an infrared image detector. The steady-state computational fluid dynamics (CFD) conditions were Mach 4 and Mach 6 in speed, 25 km altitude, and 0° angle of attack (AoA). The grid refractive index of the flow field, obtained via CFD analysis and the Gladstone-Dale relation, was discretized into equally spaced layers parallel to the projectile's window. Each layer was modeled as a 2D polynomial fitted to the refractive index distribution. The light source generated a set of 3,228 rays for lines of sight (LOS) varying from 10° to 40°. The ray tracing simulation applied Snell's law in 3D to compute the paths of skew rays in the GRIN layers. The results show that the optical path difference (OPD) and boresight error (BSE) decrease exponentially as the LOS increases. As the speed of the flow field increases, the variation of the refractive index decreases; the OPD and its rate of decay at Mach 6 are somewhat larger than at Mach 4. Compared with the ray equation method at Mach 4 and 10° LOS, the new method shows good agreement, with a relative root-mean-square (RMS) OPD difference of 0.33% and a relative BSE difference of 0.22%. Moreover, the simulation with the new method was more than 20,000 times faster than with the conventional ray equation method. The technical details of the new method and simulation are presented with results and implications.
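
    The per-layer refraction driving this GRIN-layer method is ordinary 3D vector refraction; a minimal Python sketch under that reading follows, with illustrative indices and directions (the 2D polynomial index fitting and CFD coupling are not reproduced here).

        import numpy as np

        def refract(d, n, n1, n2):
            """Snell's law in 3D. d: unit ray direction; n: unit surface normal facing
            the incoming ray; n1, n2: refractive indices. Returns the refracted unit
            direction, or None on total internal reflection."""
            d = d / np.linalg.norm(d)
            n = n / np.linalg.norm(n)
            cos_i = -np.dot(n, d)
            eta = n1 / n2
            k = 1.0 - eta**2 * (1.0 - cos_i**2)
            if k < 0.0:
                return None                    # total internal reflection
            return eta * d + (eta * cos_i - np.sqrt(k)) * n

        d_in = np.array([0.0, -np.sin(0.3), -np.cos(0.3)])   # incoming test ray
        normal = np.array([0.0, 0.0, 1.0])                   # layer interface normal
        print(refract(d_in, normal, 1.000293, 1.000150))     # air-like GRIN layer step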

  20. Solar Proton Transport Within an ICRU Sphere Surrounded by a Complex Shield: Ray-trace Geometry

    NASA Technical Reports Server (NTRS)

    Slaba, Tony C.; Wilson, John W.; Badavi, Francis F.; Reddell, Brandon D.; Bahadori, Amir A.

    2015-01-01

    A computationally efficient 3DHZETRN code with enhanced neutron and light ion (Z ≤ 2) propagation was recently developed for complex, inhomogeneous shield geometry described by combinatorial objects. Comparisons were made between 3DHZETRN results and Monte Carlo (MC) simulations at locations within the combinatorial geometry, and it was shown that 3DHZETRN agrees with the MC codes to the extent they agree with each other. In the present report, the 3DHZETRN code is extended to enable analysis in ray-trace geometry. This latest extension enables the code to be used within current engineering design practices utilizing fully detailed vehicle and habitat geometries. Through convergence testing, it is shown that fidelity in an actual shield geometry can be maintained in the discrete ray-trace description by systematically increasing the number of discrete rays used. It is also shown that this fidelity is carried into transport procedures and resulting exposure quantities without sacrificing computational efficiency.

  1. Solar proton exposure of an ICRU sphere within a complex structure part II: Ray-trace geometry.

    PubMed

    Slaba, Tony C; Wilson, John W; Badavi, Francis F; Reddell, Brandon D; Bahadori, Amir A

    2016-06-01

    A computationally efficient 3DHZETRN code with enhanced neutron and light ion (Z ≤ 2) propagation was recently developed for complex, inhomogeneous shield geometry described by combinatorial objects. Comparisons were made between 3DHZETRN results and Monte Carlo (MC) simulations at locations within the combinatorial geometry, and it was shown that 3DHZETRN agrees with the MC codes to the extent they agree with each other. In the present report, the 3DHZETRN code is extended to enable analysis in ray-trace geometry. This latest extension enables the code to be used within current engineering design practices utilizing fully detailed vehicle and habitat geometries. Through convergence testing, it is shown that fidelity in an actual shield geometry can be maintained in the discrete ray-trace description by systematically increasing the number of discrete rays used. It is also shown that this fidelity is carried into transport procedures and resulting exposure quantities without sacrificing computational efficiency. Published by Elsevier Ltd.

  2. Agent-Based Chemical Plume Tracing Using Fluid Dynamics

    NASA Technical Reports Server (NTRS)

    Zarzhitsky, Dimitri; Spears, Diana; Thayer, David; Spears, William

    2004-01-01

    This paper presents a rigorous evaluation of a novel, distributed chemical plume tracing algorithm. The algorithm is a combination of the best aspects of the two most popular predecessors for this task. Furthermore, it is based on solid, formal principles from the field of fluid mechanics. The algorithm is applied by a network of mobile sensing agents (e.g., robots or micro-air vehicles) that sense the ambient fluid velocity and chemical concentration, and calculate derivatives. The algorithm drives the robotic network to the source of the toxic plume, where measures can be taken to disable the source emitter. This work is part of a much larger effort in research and development of a physics-based approach to developing networks of mobile sensing agents for monitoring, tracking, reporting and responding to hazardous conditions.

  3. Volumetric visualization algorithm development for an FPGA-based custom computing machine

    NASA Astrophysics Data System (ADS)

    Sallinen, Sami J.; Alakuijala, Jyrki; Helminen, Hannu; Laitinen, Joakim

    1998-05-01

    Rendering volumetric medical images is a burdensome computational task for contemporary computers due to the large size of the data sets. Custom designed reconfigurable hardware could considerably speed up volume visualization if an algorithm suitable for the platform is used. We present an algorithm and speedup techniques for visualizing volumetric medical CT and MR images with a custom-computing machine based on a Field Programmable Gate Array (FPGA). We also present simulated performance results of the proposed algorithm calculated with a software implementation running on a desktop PC. Our algorithm is capable of generating perspective projection renderings of single and multiple isosurfaces with transparency, simulated X-ray images, and Maximum Intensity Projections (MIP). Although more speedup techniques exist for parallel projection than for perspective projection, we have constrained ourselves to perspective viewing, because of its importance in the field of radiotherapy. The algorithm we have developed is based on ray casting, and the rendering is sped up by three different methods: shading speedup by gradient precalculation, a new generalized version of Ray-Acceleration by Distance Coding (RADC), and background ray elimination by speculative ray selection.
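
    As a toy illustration of the ray-casting core behind one of the projection modes named above, the following Python sketch renders a small Maximum Intensity Projection (MIP) with perspective rays; the random volume, camera geometry, and nearest-neighbour sampling are illustrative, and none of the record's FPGA speedup techniques are modeled.

        import numpy as np

        vol = np.random.default_rng(2).random((64, 64, 64)).astype(np.float32)
        eye = np.array([32.0, 32.0, -80.0])           # camera in front of the volume

        def cast_mip(px, py, n_samples=128):
            d = np.array([px, py, 0.0]) - eye         # ray through a pixel on the near face
            d /= np.linalg.norm(d)
            best = 0.0
            for t in np.linspace(80.0, 80.0 + 64.0 * np.sqrt(3.0), n_samples):
                i, j, k = (int(round(c)) for c in eye + t * d)   # nearest-neighbour sample
                if 0 <= i < 64 and 0 <= j < 64 and 0 <= k < 64:
                    best = max(best, float(vol[i, j, k]))
            return best

        image = np.array([[cast_mip(x, y) for x in range(0, 64, 8)]
                          for y in range(0, 64, 8)])  # coarse 8x8 MIP rendering
        print(image.shape, image.max())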

  4. An optimization of the FPGA trigger based on the artificial neural network for a detection of neutrino-origin showers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Szadkowski, Zbigniew; Glas, Dariusz; Pytel, Krzysztof

    Observations of ultra-high energy neutrinos have become a priority in experimental astroparticle physics. Up to now, the Pierre Auger Observatory has not found any candidate neutrino event. This imposes competitive limits on the diffuse flux of ultra-high energy neutrinos in the EeV range and above. The very low rate of events potentially generated by neutrinos is a significant challenge for a detection technique and requires both sophisticated algorithms and high-resolution hardware. A trigger based on an artificial neural network was implemented in the Cyclone V E FPGA 5CEFA9F31I7. The prototype front-end boards for Auger-Beyond-2015 with Cyclone V E can test the neural network algorithm in real pampas conditions in 2015. Showers for muon- and tau-neutrino initiating particles at various altitudes, angles, and energies were simulated on the CORSIKA and Offline platforms, giving patterns of ADC traces in the Auger water Cherenkov detectors. The 3-layer 12-10-1 neural network was trained in MATLAB on the simulated ADC traces using the Levenberg-Marquardt algorithm. Results show that the probability of generating ADC traces is very low due to the small neutrino cross-section. Nevertheless, the ADC traces that do occur for 1-10 EeV showers are relatively short and can be analyzed by a 16-point input algorithm. For the 100 EeV range, traces are much longer but have significantly higher amplitudes, which can be detected by standard threshold algorithms. We optimized the coefficients from MATLAB to obtain a maximal range of potentially registered events and adapted them for fixed-point FPGA processing to minimize calculation errors. The currently used front-end boards, based on no-longer-produced ACEX PLDs and obsolete Cyclone FPGAs, allow an implementation of only relatively simple threshold trigger algorithms. A new, sophisticated trigger implemented in Cyclone V E FPGAs, with a large number of DSP blocks and embedded memory running at 120-160 MHz sampling, may help the Pierre Auger Observatory discover neutrino events.

  5. The Search for Efficiency in Arboreal Ray Tracing Applications

    NASA Astrophysics Data System (ADS)

    van Leeuwen, M.; Disney, M.; Chen, J. M.; Gomez-Dans, J.; Kelbe, D.; van Aardt, J. A.; Lewis, P.

    2016-12-01

    Forest structure significantly impacts a range of abiotic conditions, including humidity and the radiation regime, all of which affect the rate of net and gross primary productivity. Current forest productivity models typically consider abstract media to represent the transfer of radiation within the canopy. Examples include the representation of forest structure via a layered canopy model, where leaf area and inclination angles are stratified with canopy depth, or as turbid media where leaves are randomly distributed within space or within confined geometric solids such as blocks, spheres, or cones. While these abstract models are known to produce accurate estimates of primary productivity at the stand level, their limited geometric resolution restricts applicability at fine spatial scales, such as the cell, leaf, or shoot levels, thereby not addressing the full potential of assimilating data from laboratory and field measurements with that of remote sensing technology. Recent research efforts have explored the use of laser scanning to capture detailed tree morphology at millimeter accuracy. These data can subsequently be used to combine ray tracing with primary productivity models, providing an ability to explore trade-offs among different morphological traits or to assimilate data from spatial scales spanning the leaf to the stand level. Ray tracing has the major advantage of allowing the most accurate structural description of the canopy, and can directly exploit new 3D structural measurements, e.g., from laser scanning. However, the biggest limitation of ray tracing models is their high computational cost, which currently limits their use for large-scale applications. In this talk, we explore ways to more efficiently exploit ray tracing simulations and capture this information in a readily computable form for future evaluation, thus potentially enabling large-scale first-principles forest growth modelling applications.

  6. A wavelet based method for automatic detection of slow eye movements: a pilot study.

    PubMed

    Magosso, Elisa; Provini, Federica; Montagna, Pasquale; Ursino, Mauro

    2006-11-01

    Electro-oculographic (EOG) activity during the wake-sleep transition is characterized by the appearance of slow eye movements (SEM). The present work describes an algorithm for the automatic localisation of SEM events in EOG recordings. The algorithm is based on a wavelet multiresolution analysis of the difference between the right and left EOG tracings, and includes three main steps: (i) wavelet decomposition down to 10 detail levels (i.e., 10 scales), using the Daubechies order-4 wavelet; (ii) computation of energy in 0.5 s time steps at each level of decomposition; (iii) construction of a non-linear discriminant function expressing the energy of high-scale details relative to both high- and low-scale details. The main assumption is that the value of the discriminant function rises above a given threshold during SEM episodes due to energy redistribution toward higher scales. Ten EOG recordings from ten male patients with obstructive sleep apnea syndrome were used. All tracings included a period from pre-sleep wakefulness to stage 2 sleep. Two experts inspected the tracings separately to score SEMs, and a reference set of SEMs (gold standard) was obtained by joint examination by both experts. The parameters of the discriminant function were assigned on three tracings (design set) to minimize the disagreement between the system classification and the classification by the two experts; the algorithm was then tested on the remaining seven tracings (test set). Results show that the agreement between the algorithm and the gold standard was 80.44 ± 4.09%, the sensitivity of the algorithm was 67.2 ± 7.37%, and the selectivity 83.93 ± 8.65%. However, most errors were not caused by an inability of the system to distinguish intervals with SEM activity from NON-SEM intervals, but by a different localisation of the beginning and end of some SEM episodes. The proposed method may be a valuable tool for computerized EOG analysis.
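
    The three steps map naturally onto the PyWavelets package; the sketch below is a minimal reading of them, assuming pywt, a synthetic stand-in for the right-minus-left EOG trace, and an illustrative sampling rate and threshold. For brevity it computes whole-trace energies and a simplified linear discriminant rather than the paper's fitted non-linear one.

        import numpy as np
        import pywt

        fs = 128                                              # sampling rate (Hz), assumed
        eog = np.random.default_rng(3).normal(size=fs * 60)   # stand-in R-L EOG difference

        coeffs = pywt.wavedec(eog, 'db4', level=10)           # step (i): [cA10, cD10, ..., cD1]
        details = coeffs[1:]                                  # detail levels 10 (coarse) .. 1 (fine)
        energy = np.array([np.sum(c ** 2) for c in details])  # step (ii), whole-trace version

        high_scale = energy[:4].sum()                         # coarse scales: slow eye movements
        discriminant = high_scale / energy.sum()              # step (iii), simplified linear form
        print("possible SEM activity" if discriminant > 0.8 else "no SEM")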

  7. Semi-automatic delineation of the spino-laminar junction curve on lateral x-ray radiographs of the cervical spine

    NASA Astrophysics Data System (ADS)

    Narang, Benjamin; Phillips, Michael; Knapp, Karen; Appelboam, Andy; Reuben, Adam; Slabaugh, Greg

    2015-03-01

    Assessment of the cervical spine using x-ray radiography is an important task when providing emergency room care to trauma patients suspected of a cervical spine injury. In routine clinical practice, a physician inspects the alignment of the cervical spine vertebrae by mentally tracing three alignment curves along the anterior and posterior sides of the cervical vertebral bodies, as well as one along the spinolaminar junction. In this paper, we propose an algorithm to semi-automatically delineate the spinolaminar junction curve, given a single reference point and the corners of each vertebral body. From the reference point, our method extracts a region of interest and performs template matching using normalized cross-correlation to find matching regions along the spinolaminar junction. The matching points are then fit to a third-order spline, producing an interpolating curve. Experiments show promising results: on average, the method produces a modified Hausdorff distance of 1.8 mm, validated on a dataset of 29 patients including those with degenerative change, retrolisthesis, and fracture.
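
    The matching step can be sketched with scikit-image's normalized cross-correlation; in the snippet below the radiograph and template are synthetic stand-ins (the template is simply cut from the image, so the peak location is known), and the subsequent spline fit over multiple matches is omitted.

        import numpy as np
        from skimage.feature import match_template

        radiograph = np.random.default_rng(4).random((256, 256))
        template = radiograph[100:120, 80:100].copy()   # ROI seeded by the reference point

        ncc = match_template(radiograph, template)      # normalized cross-correlation map
        ij = np.unravel_index(np.argmax(ncc), ncc.shape)
        print("best match at row/col:", ij, "score:", round(float(ncc[ij]), 3))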

  8. Time-Domain Receiver Function Deconvolution using Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Moreira, L. P.

    2017-12-01

    Receiver functions (RF) are a well-known method for crust modelling using passive seismological signals. Many different techniques have been developed to calculate the RF traces, applying a deconvolution calculation to the radial and vertical seismogram components. A popular method uses a spectral division of the two components, which requires human intervention to apply the water-level procedure and avoid instabilities from division by small numbers. One of the most widely used methods is an iterative procedure that estimates the RF peaks, convolves them with the vertical-component seismogram, and compares the result with the radial component. This method is suitable for automatic processing; however, several RF traces are invalid due to peak estimation failure. In this work, a deconvolution algorithm is proposed that uses a Genetic Algorithm (GA) to estimate the RF peaks. The method operates entirely in the time domain, avoiding the time-to-frequency calculations (and vice versa), and is fully suitable for automatic processing. The estimated peaks can be used to generate RF traces in a seismogram format for visualization. The RF trace quality is similar for high-magnitude events, but there are fewer failures in the RF calculation for smaller events, increasing the overall performance for stations with a high number of events.
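
    A minimal GA deconvolution sketch in this spirit appears below: each genome is a candidate RF spike train, and fitness compares its convolution with the vertical component against the radial component, entirely in the time domain. The component traces, population size, and operator rates are all illustrative, not the paper's settings.

        import numpy as np

        rng = np.random.default_rng(5)
        n_lags, pop_size, n_gen = 50, 60, 300
        vertical = rng.normal(size=200)                    # stand-in vertical component
        true_rf = np.zeros(n_lags); true_rf[[5, 20]] = [1.0, 0.4]
        radial = np.convolve(vertical, true_rf)[:200]      # synthetic radial component

        def fitness(g):                                    # time-domain misfit only
            return -np.mean((np.convolve(vertical, g)[:200] - radial) ** 2)

        pop = rng.normal(scale=0.1, size=(pop_size, n_lags))
        for _ in range(n_gen):
            order = np.argsort([fitness(g) for g in pop])[::-1]
            parents = pop[order[:pop_size // 2]]           # truncation selection
            cuts = rng.integers(1, n_lags, size=pop_size // 2)
            kids = np.array([np.concatenate((parents[i % len(parents)][:c],
                                             parents[(i + 1) % len(parents)][c:]))
                             for i, c in enumerate(cuts)])    # one-point crossover
            kids += rng.normal(scale=0.02, size=kids.shape)   # Gaussian mutation
            pop = np.vstack((parents, kids))

        best = pop[int(np.argmax([fitness(g) for g in pop]))]
        print("recovered peak lags:", sorted(np.argsort(np.abs(best))[-2:].tolist()))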

  9. Patellar segmentation from 3D magnetic resonance images using guided recursive ray-tracing for edge pattern detection

    NASA Astrophysics Data System (ADS)

    Cheng, Ruida; Jackson, Jennifer N.; McCreedy, Evan S.; Gandler, William; Eijkenboom, J. J. F. A.; van Middelkoop, M.; McAuliffe, Matthew J.; Sheehan, Frances T.

    2016-03-01

    The paper presents an automatic segmentation methodology for the patellar bone, based on 3D gradient recalled echo and gradient recalled echo with fat suppression magnetic resonance images. Constricted search space outlines are incorporated into recursive ray-tracing to segment the outer cortical bone. A statistical analysis based on the dependence of information in adjacent slices is used to limit the search in each image to between an outer and an inner search region. A section-based recursive ray-tracing mechanism is used to skip inner noise regions and detect the edge boundary. The proposed method achieves higher segmentation accuracy (0.23 mm) than current state-of-the-art methods, with an average Dice similarity coefficient of 96.0% (SD 1.3%) between the auto-segmented and ground-truth surfaces.

  10. Ray tracing study of rising tone EMIC-triggered emissions

    NASA Astrophysics Data System (ADS)

    Hanzelka, Miroslav; Santolík, Ondřej; Grison, Benjamin; Cornilleau-Wehrlin, Nicole

    2017-04-01

    ElectroMagnetic Ion Cyclotron (EMIC) triggered emissions have been the subject of extensive theoretical and experimental research in recent years. These emissions are characterized by high coherence values and a frequency range of 0.5-2.0 Hz, close to the local helium gyrofrequency. We perform ray tracing case studies of rising tone EMIC-triggered emissions observed by the Cluster spacecraft in both the nightside and dayside regions off the equatorial plane. By comparing simulated and measured wave properties, namely the wave vector orientation, group velocity, dispersion, and ellipticity of polarization, we determine possible source locations. A diffusive equilibrium density model and other, semi-empirical models are used, with the ion composition inferred from cross-over frequencies. Ray tracing simulations are done in the cold plasma approximation with the inclusion of Landau and cyclotron damping. Various widths, locations, and profiles of the plasmapause are tested.

  11. An analysis of options available for developing a common laser ray tracing package for Ares and Kull code frameworks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weeratunga, S K

    Ares and Kull are mature code frameworks that support ALE hydrodynamics for a variety of HEDP applications at LLNL, using two widely different meshing approaches. While Ares is based on a 2-D/3-D block-structured mesh database, Kull is designed to support unstructured, arbitrary polygonal/polyhedral meshes. In addition, both frameworks are capable of running applications on large, distributed-memory parallel machines. Currently, both frameworks separately support assorted collections of physics packages related to HEDP, including one for energy deposition by laser/ion-beam ray tracing. This study analyzes the options available for developing a common laser/ion-beam ray tracing package that can be easily shared between these two code frameworks and concludes with a set of recommendations for its development.

  12. Calculated X-ray Intensities Using Monte Carlo Algorithms: A Comparison to Experimental EPMA Data

    NASA Technical Reports Server (NTRS)

    Carpenter, P. K.

    2005-01-01

    Monte Carlo (MC) modeling has been used extensively to simulate electron scattering and x-ray emission from complex geometries. Here we present comparisons between MC results, experimental electron-probe microanalysis (EPMA) measurements, and phi(rho-z) correction algorithms. Experimental EPMA measurements made on NIST SRM 481 (AgAu) and 482 (CuAu) alloys, at a range of accelerating potentials and instrument take-off angles, represent a formal microanalysis data set that has been widely used to develop phi(rho-z) correction algorithms. X-ray intensity data produced by MC simulations represent an independent test of both the experimental data and the phi(rho-z) correction algorithms. The alpha-factor method has previously been used to evaluate systematic errors in the analysis of semiconductors and silicate minerals, and is used here to compare the accuracy of the experimental and MC-calculated x-ray data. X-ray intensities calculated by MC are used to generate alpha-factors from the certified compositions of the CuAu binary relative to pure Cu and Au standards. MC simulations are obtained using the NIST, WinCasino, and WinXray algorithms; the derived x-ray intensities have a built-in atomic number correction and are further corrected for absorption and characteristic fluorescence using the PAP phi(rho-z) correction algorithm. The Penelope code additionally simulates both characteristic and continuum x-ray fluorescence and thus requires no further correction for use in calculating alpha-factors.

  13. A novel tracing method for the segmentation of cell wall networks.

    PubMed

    De Vylder, Jonas; Rooms, Filip; Dhondt, Stijn; Inze, Dirk; Philips, Wilfried

    2013-01-01

    Cell wall networks are a common subject of research in biology and are important for plant growth analysis, organ studies, etc. In order to automate the detection of individual cells in such cell wall networks, we propose a new segmentation algorithm. The proposed method is a network tracing algorithm that exploits prior knowledge of the network structure. The method is applicable to multiple microscopy modalities such as fluorescence, as well as to images captured using noninvasive microscopes such as differential interference contrast (DIC) microscopes.

  14. Phase and amplitude wave front sensing and reconstruction with a modified plenoptic camera

    NASA Astrophysics Data System (ADS)

    Wu, Chensheng; Ko, Jonathan; Nelson, William; Davis, Christopher C.

    2014-10-01

    A plenoptic camera is a camera that can retrieve the direction and intensity distribution of the light rays it collects and allows for multiple reconstruction functions, such as refocusing at a different depth and 3D microscopy. Its principle is to add a micro-lens array to a traditional high-resolution camera to form a semi-camera array that preserves redundant intensity distributions of the light field and facilitates back-tracing of rays through geometric knowledge of its optical components. Though designed to process incoherent images, we found that the plenoptic camera shows high potential in solving coherent illumination cases, such as sensing both the amplitude and phase information of a distorted laser beam. Based on our earlier introduction of a prototype modified plenoptic camera, we have developed the complete algorithm to reconstruct the wavefront of the incident light field. In this paper the algorithm and experimental results are demonstrated, and an improved version of this modified plenoptic camera is discussed. As a result, our modified plenoptic camera can serve as an advanced wavefront sensor compared with traditional Shack-Hartmann sensors in handling complicated cases such as coherent illumination in strong turbulence, where interference and discontinuity of wavefronts are common. Especially in wave propagation through atmospheric turbulence, this camera should provide a much more precise description of the light field, which would guide adaptive optics systems to make intelligent analyses and corrections.

  15. Sinogram-based adaptive iterative reconstruction for sparse view x-ray computed tomography

    NASA Astrophysics Data System (ADS)

    Trinca, D.; Zhong, Y.; Wang, Y.-Z.; Mamyrbayev, T.; Libin, E.

    2016-10-01

    With the availability of more powerful computing processors, iterative reconstruction algorithms have recently been successfully implemented as an approach to achieving significant dose reduction in X-ray CT. In this paper, we propose an adaptive iterative reconstruction algorithm for X-ray CT, that is shown to provide results comparable to those obtained by proprietary algorithms, both in terms of reconstruction accuracy and execution time. The proposed algorithm is thus provided for free to the scientific community, for regular use, and for possible further optimization.

  16. A fully-automatic fast segmentation of the sub-basal layer nerves in corneal images.

    PubMed

    Guimarães, Pedro; Wigdahl, Jeff; Poletti, Enea; Ruggeri, Alfredo

    2014-01-01

    Corneal nerve changes have been linked to damage caused by surgical interventions or prolonged contact lens wear. Furthermore, nerve tortuosity has been shown to correlate with the severity of diabetic neuropathy. For these reasons there has been an increasing interest in the analysis of these structures. In this work we propose a novel, robust, and fast fully automatic algorithm capable of tracing the sub-basal plexus nerves in human corneal confocal images. We resort to log-Gabor filters and support vector machines to trace the corneal nerves. The proposed algorithm traced most of the corneal nerves correctly (sensitivity of 0.88 ± 0.06 and false discovery rate of 0.08 ± 0.06). The displayed performance is comparable to that of a human grader. We believe that the achieved processing time (0.661 ± 0.07 s) and tracing quality are major advantages for daily clinical practice.

  17. Source-independent full waveform inversion of seismic data

    DOEpatents

    Lee, Ki Ha

    2006-02-14

    A set of seismic trace data is collected in an input data set that is first Fourier transformed in its entirety into the frequency domain. A normalized wavefield is obtained for each trace of the input data set in the frequency domain. Normalization is done with respect to the frequency response of a reference trace selected from the set of seismic trace data. The normalized wavefield is source independent, complex, and dimensionless. The normalized wavefield is shown to be uniquely defined as the normalized impulse response, provided that a certain condition is met for the source. This property allows construction of the inversion algorithm disclosed herein without any source or source-coupling information. The algorithm minimizes the error between the data-normalized wavefield and the model-normalized wavefield. The methodology is applicable to any 3-D seismic problem, and damping may easily be included in the process.
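
    The normalization step lends itself to a very small numpy sketch: transform every trace, then divide by the spectrum of a chosen reference trace, which yields the complex, dimensionless, source-independent wavefield the patent builds on. The gather and the regularization constant below are illustrative stand-ins.

        import numpy as np

        rng = np.random.default_rng(6)
        traces = rng.normal(size=(24, 512))        # stand-in gather: 24 traces x 512 samples
        spectra = np.fft.rfft(traces, axis=1)      # whole data set into the frequency domain
        ref = spectra[0]                           # reference trace spectrum

        eps = 1e-8 * np.abs(ref).max()             # guard against spectral nulls (illustrative)
        normalized = spectra / (ref + eps)         # source-independent normalized wavefield
        print(normalized.shape, normalized.dtype)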

  18. Modeling, Materials, and Metrics: The Three-m Approach to FCS Signature Solutions

    DTIC Science & Technology

    2002-05-07

    calculations. These multiple levels will be incorporated into the MuSES software. The four levels are described as follows:
    * Radiosity - Deterministic ... view-factor-based, all-diffuse solution. Very fast. Independent of user position.
    * Directional Reflectivity - Radiosity with directional incident ... target and environment facets (view factor with BRDF). Last ray-cast bounce = radiosity solution.
    * Multi-bounce path trace - Rays traced from observer

  19. Ray tracing the Wigner distribution function for optical simulations

    NASA Astrophysics Data System (ADS)

    Mout, Marco; Wick, Michael; Bociort, Florian; Petschulat, Joerg; Urbach, Paul

    2018-01-01

    We study a simulation method that uses the Wigner distribution function to incorporate wave optical effects in an established framework based on geometrical optics, i.e., a ray tracing engine. We use the method to calculate point spread functions and show that it is accurate for paraxial systems but produces unphysical results in the presence of aberrations. The cause of these anomalies is explained using an analytical model.

  20. Multiscale optical simulation settings: challenging applications handled with an iterative ray-tracing FDTD interface method.

    PubMed

    Leiner, Claude; Nemitz, Wolfgang; Schweitzer, Susanne; Kuna, Ladislav; Wenzl, Franz P; Hartmann, Paul; Satzinger, Valentin; Sommer, Christian

    2016-03-20

    We show that with an appropriate combination of two optical simulation techniques-classical ray-tracing and the finite difference time domain method-an optical device containing multiple diffractive and refractive optical elements can be accurately simulated in an iterative simulation approach. We compare the simulation results with experimental measurements of the device to discuss the applicability and accuracy of our iterative simulation procedure.

  1. Reflectance model of a plant leaf

    NASA Technical Reports Server (NTRS)

    Kumar, R.; Silva, L.

    1973-01-01

    A light ray, incident at 5 deg to the normal, is geometrically plotted through a drawing of the cross section of a soybean leaf using Fresnel's equations and Snell's law. The optical media of the leaf considered for ray tracing are: air, cell sap, chloroplast, and cell wall. The above ray is also drawn through the same leaf cross section considering cell wall and air as the only optical media. The values of reflection and transmission found from ray tracing agree closely with the experimental results obtained using a Beckman DK-2A spectroreflectometer. Similarly, a light ray, incident at about 60 deg to the normal, is drawn through the palisade cells of a soybean leaf to illustrate the pathway of light, incident at an oblique angle, through the palisade cells.

  2. Robust adaptive 3-D segmentation of vessel laminae from fluorescence confocal microscope images and parallel GPU implementation.

    PubMed

    Narayanaswamy, Arunachalam; Dwarakapuram, Saritha; Bjornsson, Christopher S; Cutler, Barbara M; Shain, William; Roysam, Badrinath

    2010-03-01

    This paper presents robust 3-D algorithms to segment vasculature that is imaged by labeling laminae, rather than the lumenal volume. The signal is weak, sparse, noisy, nonuniform, and low-contrast, and exhibits gaps and spectral artifacts, so adaptive thresholding and Hessian filtering based methods are not effective. The structure deviates from a tubular geometry, so tracing algorithms are not effective. We propose a four-step approach. The first step detects candidate voxels using a robust hypothesis test based on a model that assumes Poisson noise and locally planar geometry. The second step performs an adaptive region growth to extract weakly labeled and fine vessels while rejecting spectral artifacts. The third step enables interactive visualization and estimation of features such as statistical confidence, local curvature, local thickness, and local normal: we construct an accurate mesh representation using marching tetrahedra, volume-preserving smoothing, and adaptive decimation algorithms. The final step enables topological analysis and efficient validation: we estimate vessel centerlines using a ray casting and vote accumulation algorithm. Our algorithm lends itself to parallel processing and yielded an 8× speedup on a graphics processor (GPU). On synthetic data, our meshes had average error per face (EPF) values of 0.1-1.6 voxels per mesh face for peak signal-to-noise ratios from 110 down to 28 dB. Separately, after decimating the mesh to less than 1% of its original size, the EPF was less than 1 voxel per face. When validated on real datasets, the average recall and precision values were found to be 94.66% and 94.84%, respectively.

  3. Measurement of Trace Constituents by Electron-Excited X-Ray Microanalysis with Energy-Dispersive Spectrometry.

    PubMed

    Newbury, Dale E; Ritchie, Nicholas W M

    2016-06-01

    Electron-excited X-ray microanalysis performed with scanning electron microscopy and energy-dispersive spectrometry (EDS) has been used to measure trace elemental constituents of complex multielement materials, where "trace" refers to constituents present at concentrations below 0.01 (mass fraction). High count spectra measured with silicon drift detector EDS were quantified using the standards/matrix correction protocol embedded in the NIST DTSA-II software engine. Robust quantitative analytical results for trace constituents were obtained from concentrations as low as 0.000500 (mass fraction), even in the presence of significant peak interferences from minor (concentration 0.01≤C≤0.1) and major (C>0.1) constituents. Limits of detection as low as 0.000200 were achieved in the absence of peak interference.

  4. A novel material detection algorithm based on 2D GMM-based power density function and image detail addition scheme in dual energy X-ray images.

    PubMed

    Pourghassem, Hossein

    2012-01-01

    Material detection is a vital need for dual energy X-ray luggage inspection systems at airports and other security-critical sites. In this paper, a novel material detection algorithm is proposed, based on statistical trainable models using the 2-dimensional power density function (PDF) of three material categories in dual energy X-ray images. In this algorithm, the PDF of each material category, as a statistical model, is estimated from the transmission measurement values of the low- and high-energy X-ray images by Gaussian Mixture Models (GMM). The material label of each pixel of an object is determined from the probability of its low- and high-energy transmission measurement values under the PDFs of the three material categories (metallic, organic, and mixed materials). The performance of the material detection algorithm is improved by a majority voting scheme over an image neighborhood as a post-processing stage. As pre-processing, the high- and low-energy X-ray images are enhanced by background removal and denoising stages. To improve the discrimination capability of the proposed material detection algorithm, the details of the low- and high-energy X-ray images are added to the constructed color image, which uses three colors (orange, blue, and green) to represent the organic, metallic, and mixed materials. The proposed algorithm is evaluated on real images captured from a commercial dual energy X-ray luggage inspection system. The results show that the proposed algorithm is effective in detecting metallic, organic, and mixed materials with acceptable accuracy.
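
    A compact scikit-learn reading of the classification core is sketched below: one Gaussian mixture per material category fitted on (low, high) transmission pairs, with pixels labeled by the highest class likelihood. The training clusters, component count, and test pixels are synthetic stand-ins, and the neighborhood voting and image enhancement stages are omitted.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(7)
        train = {                                   # stand-in (low, high) transmission pairs
            "organic":  rng.normal([0.70, 0.80], 0.05, size=(300, 2)),
            "metallic": rng.normal([0.20, 0.50], 0.05, size=(300, 2)),
            "mixed":    rng.normal([0.45, 0.65], 0.05, size=(300, 2)),
        }
        models = {m: GaussianMixture(n_components=3, random_state=0).fit(x)
                  for m, x in train.items()}

        pixels = rng.normal([0.21, 0.52], 0.05, size=(5, 2))   # pixels to label
        log_lik = np.column_stack([models[m].score_samples(pixels) for m in models])
        labels = np.array(list(models))[np.argmax(log_lik, axis=1)]
        print(labels)   # a majority vote over a neighborhood would follow as post-processing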

  5. Ionospheric Tomography Using Faraday Rotation of Automatic Dependant Surveillance Broadcast UHF Signals

    NASA Astrophysics Data System (ADS)

    Cushley, A. C.

    2013-12-01

    The proposed launch of a satellite carrying the first space-borne ADS-B receiver by the Royal Military College of Canada (RMCC) will create a unique opportunity to study the modification of 1090 MHz radio waves propagating through the ionosphere from the transmitting aircraft to the passive satellite receiver(s). Experimental work successfully demonstrated that ADS-B data can be used to reconstruct two-dimensional (2D) electron density maps of the ionosphere using computerized tomography (CT). The goal of this work is to evaluate the feasibility of CT reconstruction. The data are modelled using ray-tracing techniques, which allow us to determine the characteristics of individual waves, including the wave path and the state of polarization at the satellite receiver. The modelled Faraday rotation (FR) is determined and converted to total electron content (TEC) along the ray paths. The resulting TEC is used as input for computerized ionospheric tomography (CIT) using the algebraic reconstruction technique (ART). This study concentrated on meso-scale structures 100-1000 km in horizontal extent. The primary scientific interest of this thesis was to show the feasibility of a new method to image the ionosphere and obtain a better understanding of magneto-ionic wave propagation. [Figure: multiple-feature input electron density profile supplied to the ray-tracing program; top: reconstructed relative electron density map of the ray-trace input using TEC measurements and line-of-sight paths; bottom: reconstructed electron density map of the ray-trace input using a quiet-background a priori estimate.]

  6. Automated method for tracing leading and trailing processes of migrating neurons in confocal image sequences

    NASA Astrophysics Data System (ADS)

    Kerekes, Ryan A.; Gleason, Shaun S.; Trivedi, Niraj; Solecki, David J.

    2010-03-01

    Segmentation, tracking, and tracing of neurons in video imagery are important steps in many neuronal migration studies and can be inaccurate and time-consuming when performed manually. In this paper, we present an automated method for tracing the leading and trailing processes of migrating neurons in time-lapse image stacks acquired with a confocal fluorescence microscope. In our approach, we first locate and track the soma of the cell of interest by smoothing each frame and tracking the local maxima through the sequence. We then trace the leading process in each frame by starting at the center of the soma and stepping repeatedly in the most likely direction of the leading process. This direction is found at each step by examining second derivatives of fluorescent intensity along curves of constant radius around the current point. Tracing terminates after a fixed number of steps or when fluorescent intensity drops below a fixed threshold. We evolve the resulting trace to form an improved trace that more closely follows the approximate centerline of the leading process. We apply a similar algorithm to the trailing process of the cell by starting the trace in the opposite direction. We demonstrate our algorithm on two time-lapse confocal video sequences of migrating cerebellar granule neurons (CGNs). We show that the automated traces closely approximate ground truth traces to within 1 or 2 pixels on average. Additionally, we compute line intensity profiles of fluorescence along the automated traces and quantitatively demonstrate their similarity to manually generated profiles in terms of fluorescence peak locations.
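
    The directional stepping rule reads as a short loop: sample the image on an arc of constant radius around the current point and step toward the strongest response, stopping when fluorescence falls below a threshold. The Python sketch below follows that reading on a stand-in image (the paper examines second derivatives of intensity; plain intensity is used here for brevity), with illustrative radius, field of view, and threshold.

        import numpy as np
        from scipy.ndimage import map_coordinates

        img = np.random.default_rng(8).random((128, 128)).astype(np.float32)

        def trace(start, heading, radius=4.0, n_steps=40, fov=np.pi / 2, stop=0.05):
            path, p = [np.array(start, float)], np.array(start, float)
            for _ in range(n_steps):
                angles = heading + np.linspace(-fov / 2, fov / 2, 31)
                ys = p[0] + radius * np.sin(angles)        # arc of constant radius
                xs = p[1] + radius * np.cos(angles)
                vals = map_coordinates(img, [ys, xs], order=1, mode='nearest')
                if vals.max() < stop:                      # intensity-based termination
                    break
                heading = angles[int(np.argmax(vals))]     # most likely process direction
                p = p + radius * np.array([np.sin(heading), np.cos(heading)])
                path.append(p.copy())
            return np.array(path)

        print(trace((64.0, 64.0), heading=0.0).shape)      # soma center, initial direction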

  7. A Fully GPU-Based Ray-Driven Backprojector via a Ray-Culling Scheme with Voxel-Level Parallelization for Cone-Beam CT Reconstruction.

    PubMed

    Park, Hyeong-Gyu; Shin, Yeong-Gil; Lee, Ho

    2015-12-01

    A ray-driven backprojector is based on ray-tracing, which computes the length of the intersection between the ray paths and each voxel to be reconstructed. To reduce the computational burden caused by these exhaustive intersection tests, we propose a fully graphics processing unit (GPU)-based ray-driven backprojector in conjunction with a ray-culling scheme that enables straightforward parallelization without compromising the high computing performance of a GPU. The purpose of the ray-culling scheme is to reduce the number of ray-voxel intersection tests by excluding rays irrelevant to a specific voxel computation. This rejection step is based on an axis-aligned bounding box (AABB) enclosing a region of voxel projection, where the eight vertices of each voxel are projected onto the detector plane. The range of the rectangular-shaped AABB is determined by min/max operations on the coordinates in the region. Using the indices of pixels inside the AABB, the rays passing through the voxel can be identified, and the voxel is weighted by the length of intersection between the voxel and the ray. This procedure makes it possible to reflect voxel-level parallelization, allowing an independent calculation at each voxel, which is feasible for a GPU implementation. To eliminate redundant calculations during ray-culling, a shared-memory optimization is applied to exploit the GPU memory hierarchy. In experimental results using real measurement data with phantoms, the proposed GPU-based ray-culling scheme reconstructed a volume of resolution 280 × 280 × 176 in 77 seconds from 680 projections of resolution 1024 × 768, which is 26 times and 7.5 times faster than standard CPU-based and GPU-based ray-driven backprojectors, respectively. Qualitative and quantitative analyses showed that the ray-driven backprojector provides high-quality reconstruction images when compared with those generated by the Feldkamp-Davis-Kress algorithm using a pixel-driven backprojector, with an average of 2.5 times higher contrast-to-noise ratio, 1.04 times higher universal quality index, and 1.39 times higher normalized mutual information. © The Author(s) 2014.
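
    The culling test itself is small: project a voxel's eight corners onto the detector and take per-axis min/max to form the AABB; only detector pixels inside it need the exact ray-voxel intersection. A minimal Python sketch with an illustrative cone-beam projection follows (the GPU kernel and shared-memory optimization are not modeled).

        import numpy as np

        def project(p, src_z=-500.0, det_z=500.0):        # illustrative cone-beam model
            scale = (det_z - src_z) / (p[2] - src_z)      # point source on the z axis
            return np.array([p[0] * scale, p[1] * scale])

        def detector_aabb(vmin, vmax):
            corners = np.array([[x, y, z]
                                for x in (vmin[0], vmax[0])
                                for y in (vmin[1], vmax[1])
                                for z in (vmin[2], vmax[2])])
            uv = np.array([project(c) for c in corners])  # eight detector coordinates
            return uv.min(axis=0), uv.max(axis=0)         # AABB via min/max operations

        lo, hi = detector_aabb(np.array([10.0, 10.0, 10.0]), np.array([11.0, 11.0, 11.0]))
        print("rays worth testing fall in u,v range:", lo, hi)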

  8. An Algorithm of an X-ray Hit Allocation to a Single Pixel in a Cluster and Its Test-Circuit Implementation

    DOE PAGES

    Deptuch, Grzegorz W.; Fahim, Farah; Grybos, Pawel; ...

    2017-06-28

    An on-chip implementable algorithm for the allocation of an X-ray photon imprint, called a hit, to a single pixel in the presence of charge sharing in a highly segmented pixel detector is described. Its proof-of-principle implementation is also given, supported by the results of tests using a highly collimated X-ray photon beam from a synchrotron source. The algorithm handles asynchronous arrivals of X-ray photons. Activation of groups of pixels, comparisons of peak amplitudes of pulses within an active neighborhood, and latching of the results of these comparisons constitute the three procedural steps of the algorithm. The actuators of the algorithm are the grouping of pixels into one virtual pixel, which recovers composite signals, and event-driven strobes, which control comparisons of fractional signals between neighboring pixels. The circuitry necessary to implement the algorithm requires an extensive inter-pixel connection grid of analog and digital signals that are exchanged between pixels. A test-circuit implementation of the algorithm was achieved with a small array of 32 × 32 pixels, and the device was exposed to an 8 keV X-ray beam highly collimated to a diameter of 3 μm. The results of these tests are given in this paper, assessing the physical implementation of the algorithm.

  9. TIM, a ray-tracing program for METATOY research and its dissemination

    NASA Astrophysics Data System (ADS)

    Lambert, Dean; Hamilton, Alasdair C.; Constable, George; Snehanshu, Harsh; Talati, Sharvil; Courtial, Johannes

    2012-03-01

    TIM (The Interactive METATOY) is a ray-tracing program specifically tailored towards our research in METATOYs, which are optical components that appear to be able to create wave-optically forbidden light-ray fields. For this reason, TIM possesses features not found in other ray-tracing programs. TIM can either be used interactively or by modifying the openly available source code; in both cases, it can easily be run as an applet embedded in a web page. Here we describe the basic structure of TIM's source code and how to extend it, and we give examples of how we have used TIM in our own research.
    Program summary:
    Program title: TIM
    Catalogue identifier: AEKY_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEKY_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: GNU General Public License
    No. of lines in distributed program, including test data, etc.: 124 478
    No. of bytes in distributed program, including test data, etc.: 4 120 052
    Distribution format: tar.gz
    Programming language: Java
    Computer: Any computer capable of running the Java Virtual Machine (JVM) 1.6
    Operating system: Any; developed under Mac OS X Version 10.6
    RAM: Typically 145 MB (interactive version running under Mac OS X Version 10.6)
    Classification: 14, 18
    External routines: JAMA [1] (source code included)
    Nature of problem: Visualisation of scenes that include scene objects that create wave-optically forbidden light-ray fields.
    Solution method: Ray tracing.
    Unusual features: Specifically designed to visualise wave-optically forbidden light-ray fields; can visualise ray trajectories; can visualise geometric optic transformations; can create anaglyphs (for viewing with coloured "3D glasses") and random-dot autostereograms of the scene; integrable into web pages.
    Running time: Problem-dependent; typically seconds for a simple scene.

  10. Application of relativistic electrons for the quantitative analysis of trace elements

    NASA Astrophysics Data System (ADS)

    Hoffmann, D. H. H.; Brendel, C.; Genz, H.; Löw, W.; Richter, A.

    1984-04-01

    Particle-induced X-ray emission methods (PIXE) have been extended to relativistic electrons to induce X-ray emission (REIXE) for quantitative trace-element analysis. The electron beam (20 ≤ E0 ≤ 70 MeV) was supplied by the Darmstadt electron linear accelerator DALINAC. Systematic measurements of absolute K-, L- and M-shell ionization cross sections revealed a scaling behaviour of inner-shell ionization cross sections from which X-ray production cross sections can be deduced for any element of interest for a quantitative sample investigation. Using a multielemental monazite mineral sample from Malaysia, the sensitivity of REIXE is compared to well-established methods of trace-element analysis such as proton- and X-ray-induced X-ray fluorescence analysis. The achievable detection limit for very heavy elements amounts to about 100 ppm for the REIXE method. As an example of an application, the investigation of a sample prepared from manganese nodules collected from the Pacific deep sea is discussed; it showed the expected high mineral content of Fe, Ni, Cu and Ti, although the search for traces of Pt did not show any measurable content within an upper limit of 250 ppm.

  11. Efficient Geometric Sound Propagation Using Visibility Culling

    NASA Astrophysics Data System (ADS)

    Chandak, Anish

    2011-07-01

    Simulating propagation of sound can improve the sense of realism in interactive applications such as video games and can lead to better designs in engineering applications such as architectural acoustics. In this thesis, we present geometric sound propagation techniques which are faster than prior methods and map well to upcoming parallel multi-core CPUs. We model specular reflections by using the image-source method and model finite-edge diffraction by using the well-known Biot-Tolstoy-Medwin (BTM) model. We accelerate the computation of specular reflections by applying novel visibility algorithms, FastV and AD-Frustum, which compute visibility from a point. We accelerate finite-edge diffraction modeling by applying a novel visibility algorithm which computes visibility from a region. Our visibility algorithms are based on frustum tracing and exploit recent advances in fast ray-hierarchy intersections, data-parallel computations, and scalable, multi-core algorithms. The AD-Frustum algorithm adapts its computation to the scene complexity and allows small errors in computing specular reflection paths for higher computational efficiency. FastV and our visibility algorithm from a region are general, object-space, conservative visibility algorithms that together significantly reduce the number of image sources compared to other techniques while preserving the same accuracy. Our geometric propagation algorithms are an order of magnitude faster than prior approaches for modeling specular reflections and two to ten times faster for modeling finite-edge diffraction. Our algorithms are interactive, scale almost linearly on multi-core CPUs, and can handle large, complex, and dynamic scenes. We also compare the accuracy of our sound propagation algorithms with other methods. Once sound propagation is performed, it is desirable to listen to the propagated sound in interactive and engineering applications. We can generate smooth, artifact-free output audio signals by applying efficient audio-processing algorithms. We also present the first efficient audio-processing algorithm for scenarios with a simultaneously moving source and moving receiver (MS-MR), which incurs less than 25% overhead compared to the static source and moving receiver (SS-MR) or moving source and static receiver (MS-SR) scenarios.

  12. National Ignition Facility main laser stray light analysis and control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    English, R E; Miller, J L; Peterson, G

    1998-06-26

    Stray light analysis has been carried out for the main laser section of the National Ignition Facility using a comprehensive non-sequential ray trace model supplemented with additional ray trace and diffraction propagation modeling. This paper describes the analysis and control methodology; gives examples of ghost paths and the required tilted lenses, baffles, absorbers, and beam dumps; and discusses the analysis of stray light "pencil beams" in the system.

  13. Mathematic models for a ray tracing method and its applications in wireless optical communications.

    PubMed

    Zhang, Minglun; Zhang, Yangan; Yuan, Xueguang; Zhang, Jinnan

    2010-08-16

    This paper presents a new ray tracing method, comprising a complete set of mathematical models, whose validity is verified by simulations. In addition, both theoretical analysis and simulation results show that the computational complexity of the method is much lower than that of previous ones. Therefore, the method can be used to rapidly calculate the impulse response of wireless optical channels for complicated systems.

  14. Thermal radiation characteristics of nonisothermal cylindrical enclosures using a numerical ray tracing technique

    NASA Technical Reports Server (NTRS)

    Baumeister, Joseph F.

    1990-01-01

    Analysis of energy emitted from simple or complex cavity designs can lead to intricate solutions due to nonuniform radiosity and irradiation within a cavity. A numerical ray tracing technique was applied to simulate radiation propagating within and from various cavity designs. To obtain the energy balance relationships between isothermal and nonisothermal cavity surfaces and space, the computer code NEVADA was utilized for its statistical technique applied to numerical ray tracing. The analysis method was validated by comparing results with known theoretical and limiting solutions, and the electrical resistance network method. In general, for nonisothermal cavities the performance (apparent emissivity) is a function of cylinder length-to-diameter ratio, surface emissivity, and cylinder surface temperatures. The extent of nonisothermal conditions in a cylindrical cavity significantly affects the overall cavity performance. Results are presented over a wide range of parametric variables for use as a possible design reference.

  15. Fast estimation of first-order scattering in a medical x-ray computed tomography scanner using a ray-tracing technique.

    PubMed

    Liu, Xin

    2014-01-01

    This study describes a deterministic method for simulating the first-order scattering in a medical computed tomography scanner. The method was developed based on a physics model of x-ray photon interactions with matter and a ray tracing technique. The results from simulated scattering were compared to the ones from an actual scattering measurement. Two phantoms with homogeneous and heterogeneous material distributions were used in the scattering simulation and measurement. It was found that the simulated scatter profile was in agreement with the measurement result, with an average difference of 25% or less. Finally, tomographic images with artifacts caused by scatter were corrected based on the simulated scatter profiles. The image quality improved significantly.

  16. X-ray Pulsar Navigation Algorithms and Testbed for SEXTANT

    NASA Technical Reports Server (NTRS)

    Winternitz, Luke M. B.; Hasouneh, Monther A.; Mitchell, Jason W.; Valdez, Jennifer E.; Price, Samuel R.; Semper, Sean R.; Yu, Wayne H.; Ray, Paul S.; Wood, Kent S.; Arzoumanian, Zaven

    2015-01-01

    The Station Explorer for X-ray Timing and Navigation Technology (SEXTANT) is a NASA-funded technology demonstration. SEXTANT will, for the first time, demonstrate real-time, on-board X-ray Pulsar-based Navigation (XNAV), a significant milestone in the quest to establish a GPS-like navigation capability available throughout our Solar System and beyond. This paper describes the basic design of the SEXTANT system with a focus on core models and algorithms, and the design and continued development of the GSFC X-ray Navigation Laboratory Testbed (GXLT) with its dynamic pulsar emulation capability. We also present early results from GXLT modeling of the combined NICER X-ray timing instrument hardware and SEXTANT flight software algorithms.

  17. Trace elements in natural azurite pigments found in illuminated manuscript leaves investigated by synchrotron x-ray fluorescence and diffraction mapping

    NASA Astrophysics Data System (ADS)

    Smieska, Louisa M.; Mullett, Ruth; Ferri, Laurent; Woll, Arthur R.

    2017-07-01

    We present trace-element and composition analysis of azurite pigments in six illuminated manuscript leaves, dating from the thirteenth to sixteenth century, using synchrotron-based, large-area x-ray fluorescence (SR-XRF) and diffraction (SR-XRD) mapping. SR-XRF mapping reveals several trace elements correlated with azurite, including arsenic, zirconium, antimony, barium, and bismuth, that appear in multiple manuscripts but were not always detected by point XRF. Within some manuscript leaves, variations in the concentration of trace elements associated with azurite coincide with distinct regions of the illuminations, suggesting systematic differences in azurite preparation or purification. Variations of the trace element concentrations in azurite are greater among different manuscript leaves than the variations within each individual leaf, suggesting the possibility that such impurities reflect distinct mineralogical/geologic sources. SR-XRD maps collected simultaneously with the SR-XRF maps confirm the identification of azurite regions and are consistent with impurities found in natural mineral sources of azurite. In general, our results suggest the feasibility of using azurite trace element analysis for provenance studies of illuminated manuscript fragments, and demonstrate the value of XRF mapping in non-destructive determination of trace element concentrations within a single pigment.

  18. MARXS: A Modular Software to Ray-trace X-Ray Instrumentation

    NASA Astrophysics Data System (ADS)

    Günther, Hans Moritz; Frost, Jason; Theriault-Shay, Adam

    2017-12-01

    To obtain the best possible scientific result, astronomers must understand the properties of the available instrumentation well. This is important both when designing new instruments and when using existing instruments close to the limits of their specified capabilities or beyond. Ray-tracing is a technique for numerical simulations in which the paths of many light rays are followed through the system to understand how individual system components influence the observed properties, such as the shape of the point-spread function. In instrument design, such simulations can be used to optimize the performance. For observations with existing instruments, this helps to discern instrumental artefacts from a true signal. Here, we describe MARXS, a new Python package designed to simulate X-ray instruments on satellites and sounding rockets. MARXS uses probability tracking of photons and has polarimetric capabilities.

  19. A multiple reader scoring system for Nasal Potential Difference parameters.

    PubMed

    Solomon, George M; Liu, Bo; Sermet-Gaudelus, Isabelle; Fajac, Isabelle; Wilschanski, Michael; Vermeulen, Francois; Rowe, Steven M

    2017-09-01

    Nasal Potential Difference (NPD) is a biomarker of CFTR activity used to diagnose CF and monitor experimental therapies. Limited studies have been performed to assess agreement between expert readers of NPD interpretation using a scoring algorithm. We developed a standardized scoring algorithm for "interpretability" and "confidence" for PD (potential difference) measures, and sought to determine the degree of agreement on NPD parameters between trained readers. There was excellent agreement for interpretability between NPD readers for CF and fair agreement for normal tracings, but only slight agreement of interpretability in indeterminate tracings. Amongst interpretable tracings, excellent correlation of mean scores for Ringer's baseline PD, Δamiloride, and ΔCl-free+isoproterenol was observed. There was slight agreement regarding confidence of the interpretable PD tracings, resulting in divergence of the Ringer's, Δamiloride, and ΔCl-free+isoproterenol PDs between "high" and "low" confidence CF tracings. A multi-reader process with adjudication is important for scoring NPDs for diagnosis and in monitoring of CF clinical trials. Copyright © 2017 European Cystic Fibrosis Society. Published by Elsevier B.V. All rights reserved.

  20. Atomic Detail Visualization of Photosynthetic Membranes with GPU-Accelerated Ray Tracing

    PubMed Central

    Vandivort, Kirby L.; Barragan, Angela; Singharoy, Abhishek; Teo, Ivan; Ribeiro, João V.; Isralewitz, Barry; Liu, Bo; Goh, Boon Chong; Phillips, James C.; MacGregor-Chatwin, Craig; Johnson, Matthew P.; Kourkoutis, Lena F.; Hunter, C. Neil

    2016-01-01

    The cellular process responsible for providing energy for most life on Earth, namely photosynthetic light-harvesting, requires the cooperation of hundreds of proteins across an organelle, involving length and time scales spanning several orders of magnitude over quantum and classical regimes. Simulation and visualization of this fundamental energy conversion process pose many unique methodological and computational challenges. We present, in two accompanying movies, light-harvesting in the photosynthetic apparatus found in purple bacteria, the so-called chromatophore. The movies are the culmination of three decades of modeling efforts, featuring the collaboration of theoretical, experimental, and computational scientists. We describe the techniques that were used to build, simulate, analyze, and visualize the structures shown in the movies, and we highlight cases where scientific needs spurred the development of new parallel algorithms that efficiently harness GPU accelerators and petascale computers. PMID:27274603

  1. Analytical approach of laser beam propagation in the hollow polygonal light pipe.

    PubMed

    Zhu, Guangzhi; Zhu, Xiao; Zhu, Changhong

    2013-08-10

    An analytical method is developed for determining the light distribution on the output end of a hollow n-sided polygonal light pipe illuminated by a light source with a Gaussian distribution. Mirror transformation matrices and a special algorithm for removing void virtual images are created to acquire the location and direction vector of each effective virtual image on the entrance plane. The analytical method is verified by Monte Carlo ray tracing. At the same time, four typical cases are discussed. The analytical results indicate that the uniformity of the light distribution varies with the structural and optical parameters of the hollow n-sided polygonal light pipe and of the Gaussian light source. The analytical approach will be useful for designing and choosing hollow n-sided polygonal light pipes, especially for high-power laser beam homogenization techniques.
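    The virtual-image idea can be illustrated in a reduced setting, a 2-D slab pipe (two parallel mirrors), where the method of images needs no mirror transformation matrices or void-image removal: the output irradiance is the mirror-free Gaussian profile summed over its images. All parameter values below are assumptions.

```python
import numpy as np

# 2-D "unfolding" sketch of the virtual-image method, reduced to a slab light
# pipe (mirrors at y = 0 and y = h). The n-sided polygonal case adds the
# mirror-transformation matrices and the removal of void virtual images.

h = 10.0            # pipe height, mm
sigma = 8.0         # width of the Gaussian irradiance the diverging source
                    # would paint on the output plane with no mirrors, mm
y0 = 5.0            # source height inside the pipe, mm
n_images = 50       # number of mirror images kept on each side

def free_space_profile(y_u):
    """Gaussian irradiance on the output plane if the mirrors were absent."""
    return np.exp(-((y_u - y0) ** 2) / (2.0 * sigma**2))

y = np.linspace(0.0, h, 400)           # points across the physical output face
intensity = np.zeros_like(y)
for k in range(-n_images, n_images + 1):
    # Unfolded coordinates that the mirrors fold back onto output point y:
    intensity += free_space_profile(2 * k * h + y)   # even reflections
    intensity += free_space_profile(2 * k * h - y)   # odd reflections

print(f"output uniformity (min/max): {intensity.min() / intensity.max():.3f}")
```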

  2. A Formal Algorithm for Routing Traces on a Printed Circuit Board

    NASA Technical Reports Server (NTRS)

    Hedgley, David R., Jr.

    1996-01-01

    This paper addresses the classical problem of printed circuit board routing: that is, the problem of automatic routing by a computer other than by brute force that causes the execution time to grow exponentially as a function of the complexity. Most of the present solutions are either inexpensive but not efficient and fast, or efficient and fast but very costly. Many solutions are proprietary, so not much is written or known about the actual algorithms upon which these solutions are based. This paper presents a formal algorithm for routing traces on a printed circuit board. The solution presented is very fast and efficient and for the first time speaks to the question eloquently by way of symbolic statements.

  3. Thermal Characterization of the Air Force Institute of Technology Solar Simulation Thermal Vacuum Chamber

    DTIC Science & Technology

    2014-03-27

    mass and surface area, Equation 12 demonstrates an energy balance for the material, assuming the rest of the surfaces of the material are isothermal...radiation in order to dissipate heat from the spacecraft [8]. As discussed in the system thermal energy balance defined previously, emission of IR...energy balance calculations will be utilized. The Monte Carlo/Ray Trace Radiation Method The Monte Carlo/Ray Trace method is utilized in order to

  4. Modeling of laser interactions with composite materials

    DOE PAGES

    Rubenchik, Alexander M.; Boley, Charles D.

    2013-05-07

    In this study, we develop models of laser interactions with composite materials consisting of fibers embedded within a matrix. A ray-trace model is shown to determine the absorptivity, absorption depth, and optical power enhancement within the material, as well as the angular distribution of the reflected light. We also develop a macroscopic model, which provides physical insight and overall results. We show that the parameters in this model can be determined from the ray trace model.

  5. ROSAT EUV and soft X-ray studies of atmospheric composition and structure in G191-B2B

    NASA Technical Reports Server (NTRS)

    Barstow, M. A.; Fleming, T. A.; Finley, D. S.; Koester, D.; Diamond, C. J.

    1993-01-01

    Previous studies of the hot DA white dwarf G191-B2B have been unable to determine whether the observed soft X-ray and EUV opacity arises from a stratified hydrogen and helium atmosphere or from the presence of trace metals in the photosphere. New EUV and soft X-ray photometry of this star, made with the ROSAT observatory, when analyzed in conjunction with the earlier data, shows that the stratified models cannot account for the observed fluxes. Consequently, we conclude that trace metals must be a substantial source of opacity in the photosphere of G191-B2B.

  6. Optimal cost design of water distribution networks using a decomposition approach

    NASA Astrophysics Data System (ADS)

    Lee, Ho Min; Yoo, Do Guen; Sadollah, Ali; Kim, Joong Hoon

    2016-12-01

    Water distribution network decomposition, an engineering approach, is adopted to increase the efficiency of obtaining the optimal cost design of a water distribution network using an optimization algorithm. This study applied the source tracing tool in EPANET, a hydraulic and water quality analysis model, to the decomposition of a network to improve the efficiency of the optimal design process. The proposed approach was tested by carrying out the optimal cost design of two water distribution networks, and the results were compared with optimal cost designs derived from previously proposed optimization algorithms. The proposed decomposition approach using the source tracing technique enables the efficient decomposition of an actual large-scale network, and the results can be combined with the optimal cost design process using an optimization algorithm. The results show that the final design in this study is better than those obtained with other previously proposed optimization algorithms.

  7. Agreement between total corneal astigmatism calculated by vector summation and total corneal astigmatism measured by ray tracing using Galilei double Scheimpflug analyzer.

    PubMed

    Feizi, Sepehr; Delfazayebaher, Siamak; Ownagh, Vahid; Sadeghpour, Fatemeh

    To evaluate the agreement between total corneal astigmatism calculated by vector summation of anterior and posterior corneal astigmatism (TCAVec) and total corneal astigmatism measured by ray tracing (TCARay). This study enrolled a total of 204 right eyes of 204 normal subjects. The eyes were measured using a Galilei double Scheimpflug analyzer. The measured parameters included simulated keratometric astigmatism using the keratometric index, anterior corneal astigmatism using the corneal refractive index, posterior corneal astigmatism, and TCARay. TCAVec was derived by vector summation of the astigmatism on the anterior and posterior corneal surfaces. The magnitudes and axes of TCAVec and TCARay were compared. The Pearson correlation coefficient and Bland-Altman plots were used to assess the relationship and agreement between TCAVec and TCARay, respectively. The mean TCAVec and TCARay magnitudes were 0.76±0.57D and 1.00±0.78D, respectively (P<0.001). The mean axis orientations were 85.12±30.26° and 89.67±36.76°, respectively (P=0.02). Strong correlations were found between the TCAVec and TCARay magnitudes (r=0.96, P<0.001). Moderate associations were observed between the TCAVec and TCARay axes (r=0.75, P<0.001). Bland-Altman plots produced 95% limits of agreement for the TCAVec and TCARay magnitudes from -0.33 to 0.82D. The 95% limits of agreement between the TCAVec and TCARay axes were -43.0 to 52.1°. The magnitudes and axes of astigmatism measured by the vector summation and ray tracing methods cannot be used interchangeably. There was a systematic error between the TCAVec and TCARay magnitudes. Copyright © 2017 Spanish General Council of Optometry. Published by Elsevier España, S.L.U. All rights reserved.

  8. SENSOR: a tool for the simulation of hyperspectral remote sensing systems

    NASA Astrophysics Data System (ADS)

    Börner, Anko; Wiest, Lorenz; Keller, Peter; Reulke, Ralf; Richter, Rolf; Schaepman, Michael; Schläpfer, Daniel

    The consistent end-to-end simulation of airborne and spaceborne earth remote sensing systems is an important task, and sometimes the only way for the adaptation and optimisation of a sensor and its observation conditions, the choice and test of algorithms for data processing, error estimation and the evaluation of the capabilities of the whole sensor system. The presented software simulator SENSOR (Software Environment for the Simulation of Optical Remote sensing systems) includes a full model of the sensor hardware, the observed scene, and the atmosphere in between. The simulator consists of three parts. The first part describes the geometrical relations between scene, sun, and the remote sensing system using a ray-tracing algorithm. The second part of the simulation environment considers the radiometry. It calculates the at-sensor radiance using a pre-calculated multidimensional lookup-table taking the atmospheric influence on the radiation into account. The third part consists of an optical and an electronic sensor model for the generation of digital images. Using SENSOR for an optimisation requires the additional application of task-specific data processing algorithms. The principle of the end-to-end-simulation approach is explained, all relevant concepts of SENSOR are discussed, and first examples of its use are given. The verification of SENSOR is demonstrated. This work is closely related to the Airborne PRISM Experiment (APEX), an airborne imaging spectrometer funded by the European Space Agency.

  9. Adaptive mapping functions to the azimuthal anisotropy of the neutral atmosphere

    NASA Astrophysics Data System (ADS)

    Gegout, P.; Biancale, R.; Soudarin, L.

    2011-10-01

    The anisotropy of propagation of radio waves used by global navigation satellite systems is investigated using high-resolution observational data assimilations produced by the European Centre for Medium-range Weather Forecast. The geometry and the refractivity of the neutral atmosphere are built by introducing accurate geodetic heights and continuous formulations of the refractivity and its gradient. Hence the realistic ellipsoidal shape of the refractivity field above the topography is properly represented. Atmospheric delays are obtained by ray-tracing through the refractivity field, integrating the eikonal differential system. Ray-traced delays reveal the anisotropy of the atmosphere. With the aim of preserving the classical mapping function strategy, mapping functions can evolve to adapt to high-frequency atmospheric fluctuations and to account for the anisotropy of propagation by fitting, at each site and time, the zenith delays and the mapping function coefficients. Adaptive mapping functions (AMF) are designed with coefficients of the continued fraction form which depend on azimuth. The basic idea is to expand the azimuthal dependency of the coefficients in Fourier series, introducing a multi-scale azimuthal decomposition which slightly changes the elevation functions with the azimuth. AMF are used to approximate thousands of atmospheric ray-traced delays using a few tens of coefficients. Generic recursive definitions of the AMF and their partial derivatives lead to the observation that truncating the continued fraction form at the third term and the azimuthal Fourier series at the fourth term is sufficient in usual meteorological conditions. Mapping functions for delays and elevations make it possible to store and retrieve the ray-tracing results and to solve the parallax problem at the observation level. AMF are suitable for fitting the time-variable isotropic and anisotropic parts of the ray-traced delays at each site at each time step and for providing GPS range corrections at the measurement level with millimeter accuracy at low elevation. AMF to the azimuthal anisotropy of the neutral atmosphere are designed to adapt to complex weather conditions by adaptively changing their truncations.
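    A minimal sketch of such a mapping function, assuming the classical three-term continued fraction and an azimuthal Fourier expansion of each coefficient as described; the coefficient values below are placeholders, not fitted values.

```python
import numpy as np

# Sketch of an adaptive mapping function: the classical three-term continued
# fraction, with each coefficient expanded in an azimuthal Fourier series.

def fourier_coeff(az, c0, cos_terms, sin_terms):
    """Evaluate c(az) = c0 + sum_k [A_k cos(k az) + B_k sin(k az)]."""
    k = np.arange(1, len(cos_terms) + 1)
    return (c0
            + np.sum(cos_terms * np.cos(k * az))
            + np.sum(sin_terms * np.sin(k * az)))

def amf(elev, az, coeffs):
    """Three-term continued-fraction mapping function, azimuth-dependent."""
    a = fourier_coeff(az, *coeffs["a"])
    b = fourier_coeff(az, *coeffs["b"])
    c = fourier_coeff(az, *coeffs["c"])
    s = np.sin(elev)
    num = 1.0 + a / (1.0 + b / (1.0 + c))
    den = s + a / (s + b / (s + c))
    return num / den

# Placeholder coefficients (c0, cosine harmonics 1..4, sine harmonics 1..4):
coeffs = {
    "a": (1.2e-3, np.array([1e-5, 0, 0, 0]), np.array([2e-5, 0, 0, 0])),
    "b": (2.9e-3, np.zeros(4), np.zeros(4)),
    "c": (6.2e-2, np.zeros(4), np.zeros(4)),
}
print(amf(np.radians(10.0), np.radians(45.0), coeffs))  # slant factor at 10 deg
```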

  10. Monitoring Programs Using Rewriting

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Rosu, Grigore; Lan, Sonie (Technical Monitor)

    2001-01-01

    We present a rewriting algorithm for efficiently testing future time Linear Temporal Logic (LTL) formulae on finite execution traces. The standard models of LTL are infinite traces, reflecting the behavior of reactive and concurrent systems which conceptually may be continuously alive. In most past applications of LTL, theorem provers and model checkers have been used to formally prove that down-scaled models satisfy such LTL specifications. Our goal is instead to use LTL for up-scaled testing of real software applications, corresponding to analyzing the conformance of finite traces against LTL formulae. We first describe what it means for a finite trace to satisfy an LTL property and then suggest an optimized algorithm based on transforming LTL formulae. We use the Maude rewriting logic, which turns out to be a good notation that is supported by an efficient rewriting engine for performing these experiments. The work constitutes part of the Java PathExplorer (JPAX) project, the purpose of which is to develop a flexible tool for monitoring Java program executions.
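    The rewriting idea can be sketched generically: the formula is "progressed" through the trace one state at a time, each step returning the rewritten obligation for the rest of the trace. The encoding below illustrates standard formula progression in Python; it is not the JPAX/Maude rule set.

```python
# Formulas are tuples: ("ap", name), ("not", f), ("and", f, g), ("or", f, g),
# ("next", f), ("until", f, g), or the booleans True/False.

def simplify_and(l, r):
    if l is False or r is False: return False
    if l is True: return r
    if r is True: return l
    return ("and", l, r)

def simplify_or(l, r):
    if l is True or r is True: return True
    if l is False: return r
    if r is False: return l
    return ("or", l, r)

def prog(f, state):
    """Rewrite formula f against one state (a set of atomic propositions)."""
    if isinstance(f, bool): return f
    tag = f[0]
    if tag == "ap": return f[1] in state
    if tag == "not":
        r = prog(f[1], state)
        return (not r) if isinstance(r, bool) else ("not", r)
    if tag == "and": return simplify_and(prog(f[1], state), prog(f[2], state))
    if tag == "or":  return simplify_or(prog(f[1], state), prog(f[2], state))
    if tag == "next": return f[1]             # obligation moves to next state
    if tag == "until":                        # f U g -> g' or (f' and f U g)
        return simplify_or(prog(f[2], state),
                           simplify_and(prog(f[1], state), f))
    raise ValueError(f"unknown connective {tag}")

def check(formula, trace):
    """Evaluate a formula over a finite trace (a list of sets of propositions)."""
    for state in trace:
        formula = prog(formula, state)
        if isinstance(formula, bool):
            return formula
    # End of trace: under this finite-trace semantics, leftover obligations fail.
    return formula is True

# Example: "eventually shutdown" == true U shutdown
eventually_shutdown = ("until", True, ("ap", "shutdown"))
print(check(eventually_shutdown, [{"init"}, {"run"}, {"shutdown"}]))  # True
```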

  11. AXAF-1 high-resolution mirror assembly image model and comparison with x-ray ground-test image

    NASA Astrophysics Data System (ADS)

    Zissa, David E.

    1999-09-01

    The completed High Resolution Mirror Assembly (HRMA) of the Advanced X-ray Astrophysics Facility - Imaging (AXAF-I) was tested at the X-ray Calibration Facility (XRCF) at the NASA Marshall Space Flight Center (MSFC) in 1997. The MSFC image model was developed during the development of AXAF-I. The MSFC model is a detailed ray-trace model of the as-built HRMA optics and the XRCF test conditions. The image encircled-energy distributions from the model are found to agree well in general with the XRCF test data and the preliminary Smithsonian Astrophysical Observatory (SAO) model. MSFC model effective-area results generally agree with those of the preliminary SAO model. Preliminary model effective-area results were reported by SAO to be approximately 5-13 percent above initial XRCF test results. The XRCF test conditions are removed from the MSFC ray-trace model to derive an on-orbit prediction of the HRMA image.

  12. FormTracer. A mathematica tracing package using FORM

    NASA Astrophysics Data System (ADS)

    Cyrol, Anton K.; Mitter, Mario; Strodthoff, Nils

    2017-10-01

    We present FormTracer, a high-performance, general purpose, easy-to-use Mathematica tracing package which uses FORM. It supports arbitrary space and spinor dimensions as well as an arbitrary number of simple compact Lie groups. While keeping the usability of the Mathematica interface, it relies on the efficiency of FORM. An additional performance gain is achieved by a decomposition algorithm that avoids redundant traces in the product tensor spaces. FormTracer supports a wide range of syntaxes, which endows it with high flexibility. Mathematica notebooks that automatically install the package and guide the user through performing standard traces in space-time, spinor and gauge-group spaces are provided. Program Files doi: http://dx.doi.org/10.17632/7rd29h4p3m.1 Licensing provisions: GPLv3 Programming language: Mathematica and FORM Nature of problem: Efficiently compute traces of large expressions Solution method: The expression to be traced is decomposed into its subspaces by a recursive Mathematica expansion algorithm. The result is subsequently translated to a FORM script that takes the traces. After FORM is executed, the final result is either imported into Mathematica or exported as optimized C/C++/Fortran code. Unusual features: The outstanding features of FormTracer are the simple interface, the capability to efficiently handle an arbitrary number of Lie groups in addition to Dirac and Lorentz tensors, and a customizable input-syntax.

  13. Distance measurement based on light field geometry and ray tracing.

    PubMed

    Chen, Yanqin; Jin, Xin; Dai, Qionghai

    2017-01-09

    In this paper, we propose a geometric optical model to measure the distances of object planes in a light field image. The proposed geometric optical model is composed of two sub-models based on ray tracing: an object space model and an image space model. The two theoretic sub-models are derived for on-axis point light sources. In the object space model, light rays propagate into the main lens and refract inside it following the refraction theorem. In the image space model, light rays exit from emission positions on the main lens and subsequently impinge on the image sensor with different imaging diameters. The relationships between the imaging diameters of objects and their corresponding emission positions on the main lens are investigated by utilizing refocusing and the similar triangle principle. By combining the two sub-models and tracing light rays back to the object space, the relationships between objects' imaging diameters and the corresponding distances of object planes are derived. The performance of the proposed geometric optical model is compared with existing approaches using different configurations of hand-held plenoptic 1.0 cameras, and real experiments are conducted using a preliminary imaging system. Results demonstrate that the proposed model outperforms existing approaches in terms of accuracy and exhibits good performance at general imaging range.
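    The similar-triangles step can be illustrated with an ordinary thin-lens model (a simplification: the paper's sub-models trace rays through the main lens of a plenoptic camera and use refocusing, which is omitted here). The sketch below computes the blur ("imaging") diameter of an on-axis point and inverts it for the object distance; all numbers are illustrative.

```python
# Thin-lens sketch of the similar-triangles relation: a point at object
# distance o focuses at i, and a sensor at distance s sees a blur disc whose
# diameter scales with the focus error.

def image_distance(f, o):
    """Thin-lens equation: 1/f = 1/o + 1/i."""
    return 1.0 / (1.0 / f - 1.0 / o)

def blur_diameter(f, aperture, sensor_dist, o):
    """Similar triangles between the aperture and the defocused cone."""
    i = image_distance(f, o)
    return aperture * abs(sensor_dist - i) / i

def distance_from_blur(f, aperture, sensor_dist, b, far_side=True):
    """Invert the blur relation for the object distance (two-branch solution)."""
    # far side  (object beyond focus plane, i < s):  b = D*(s - i)/i
    # near side (object inside focus plane, i > s):  b = D*(i - s)/i
    i = aperture * sensor_dist / (aperture + b if far_side else aperture - b)
    return 1.0 / (1.0 / f - 1.0 / i)

f, D, s = 50.0, 25.0, 52.0            # mm; focused object plane is 1300 mm away
o_true = 2000.0
b = blur_diameter(f, D, s, o_true)
print(f"blur {b:.4f} mm -> recovered distance "
      f"{distance_from_blur(f, D, s, b):.1f} mm")
```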

  14. 3D Laser Imprint Using a Smoother Ray-Traced Power Deposition Method

    NASA Astrophysics Data System (ADS)

    Schmitt, Andrew J.

    2017-10-01

    Imprinting of laser nonuniformities in directly driven ICF targets is a challenging problem to accurately simulate with large radiation-hydro codes. One of the most challenging aspects is the proper construction of the complex and rapidly changing laser interference structure driving the imprint using the reduced laser propagation models (usually ray-tracing) found in these codes. We have upgraded the modelling capability in our massively-parallel fastrad3d code by adding a more realistic EM-wave interference structure. This interference model adds an axial laser speckle to the previous transverse-only laser structure, and can be impressed on our improved smoothed 3D raytrace package. This latter package, which connects rays to form bundles and performs power deposition calculations on the bundles, is intended to decrease ray-trace noise (which can mask or add to imprint) while using fewer rays. We apply this improved model to 3D simulations of recent imprint experiments performed on the Omega-EP laser and the Nike laser that examined the reduction of imprinting due to very thin high-Z target coatings. We report on the conditions in which this new model makes a significant impact on the development of laser imprint. Supported by US DoE/NNSA.

  15. Laser absorption of carbon fiber reinforced polymer with randomly distributed carbon fibers

    NASA Astrophysics Data System (ADS)

    Hu, Jun; Xu, Hebing; Li, Chao

    2018-03-01

    Laser processing of carbon fiber reinforced polymer (CFRP) is a non-traditional machining method which has many prospective applications. The laser absorption characteristics of CFRP are analyzed in this paper. A ray tracing model describing the interaction of the laser spot with CFRP is established. The material model contains randomly distributed carbon fibers which are generated using an improved carbon fiber placement method. It was found that CFRP has good laser absorption due to multiple reflections of the light rays in the material’s microstructure. The randomly distributed carbon fibers make the absorptivity of the light rays change randomly in the laser spot. Meanwhile, the average absorptivity fluctuation is obvious during movement of the laser. The experimental measurements agree well with the values predicted by the ray tracing model.

  16. Tomography and the Herglotz-Wiechert inverse formulation

    NASA Astrophysics Data System (ADS)

    Nowack, Robert L.

    1990-04-01

    In this paper, linearized tomography and the Herglotz-Wiechert inverse formulation are compared. Tomographic inversions for 2-D or 3-D velocity structure use line integrals along rays and can be written in terms of Radon transforms. For radially concentric structures, Radon transforms are shown to reduce to Abel transforms. Therefore, for straight ray paths, the Abel transform of travel-time is a tomographic algorithm specialized to a one-dimensional radially concentric medium. The Herglotz-Wiechert formulation uses seismic travel-time data to invert for one-dimensional earth structure and is derived using exact ray trajectories by applying an Abel transform. This is of historical interest since it would imply that a specialized tomographic-like algorithm has been used in seismology since the early part of the century (see Herglotz, 1907; Wiechert, 1910). Numerical examples are performed comparing the Herglotz-Wiechert algorithm and linearized tomography along straight rays. Since the Herglotz-Wiechert algorithm is applicable under specific conditions, (the absence of low velocity zones) to non-straight ray paths, the association with tomography may prove to be useful in assessing the uniqueness of tomographic results generalized to curved ray geometries.
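    For reference, the inversion itself in its standard textbook forms (the symbols p for ray parameter, X for range, and Δ for epicentral distance follow common seismological usage rather than the paper's own notation); the no-low-velocity-zone condition enters through the requirement that p decrease monotonically with distance:

```latex
% Flat-medium form: the turning depth z of the ray with ray parameter p_1,
% given the observed travel-time slope p(x) = dT/dX, is
z(p_1) = \frac{1}{\pi} \int_{0}^{X(p_1)} \cosh^{-1}\!\left( \frac{p(x)}{p_1} \right) dx .

% Spherical-earth version (radius a, turning radius r_1):
\ln\frac{a}{r_1} = \frac{1}{\pi} \int_{0}^{\Delta_1} \cosh^{-1}\!\left( \frac{p(\Delta)}{p_1} \right) d\Delta .
```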

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kieselmann, J; Bartzsch, S; Oelfke, U

    Purpose: Microbeam Radiation Therapy is a preclinical method in radiation oncology that modulates radiation fields on a micrometre scale. Dose calculation is challenging due to the arising dose gradients and therapeutically important dose ranges. Monte Carlo (MC) simulations, often used as the gold standard, are computationally expensive and hence too slow for the optimisation of treatment parameters in future clinical applications. On the other hand, conventional kernel based dose calculation leads to inaccurate results close to material interfaces. The purpose of this work is to overcome these inaccuracies while keeping computation times low. Methods: A point kernel superposition algorithm is modified to account for tissue inhomogeneities. Instead of conventional ray tracing approaches, methods from differential geometry are applied and the space around the primary photon interaction is locally warped. The performance of this approach is compared to MC simulations and a simple convolution algorithm (CA) for two different phantoms and photon spectra. Results: While peak doses of all dose calculation methods agreed within less than 4% deviations, the proposed approach surpassed a simple convolution algorithm in accuracy by a factor of up to 3 in the scatter dose. In a treatment geometry similar to possible future clinical situations, differences between Monte Carlo and the differential geometry algorithm were less than 3%. At the same time the calculation time did not exceed 15 minutes. Conclusion: With the developed method it was possible to improve the dose calculation based on the CA method with respect to accuracy, especially at sharp tissue boundaries. While the calculation is more extensive than for the CA method and depends on field size, the typical calculation time for a 20×20 mm² field on a 3.4 GHz processor with 8 GB RAM remained below 15 minutes. Parallelisation and optimisation of the algorithm could lead to further significant calculation time reductions.

  18. An adaptive replacement algorithm for paged-memory computer systems.

    NASA Technical Reports Server (NTRS)

    Thorington, J. M., Jr.; Irwin, J. D.

    1972-01-01

    A general class of adaptive replacement schemes for use in paged memories is developed. One such algorithm, called SIM, is simulated using a probability model that generates memory traces, and the results of the simulation of this adaptive scheme are compared with those obtained using the best nonlookahead algorithms. A technique for implementing this type of adaptive replacement algorithm with state of the art digital hardware is also presented.
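    As a concrete illustration of this kind of study, the sketch below replays a synthetic reference trace against an LRU-managed frame pool and reports fault rates. LRU stands in for the "best nonlookahead" baseline; the adaptive SIM scheme itself is not specified in the abstract, so it is not reproduced. The trace generator and its parameters are assumptions.

```python
from collections import OrderedDict
import random

def synthetic_trace(n_refs=50_000, n_pages=200, locality=0.9, seed=1):
    """Markov-style trace: with probability `locality` reference the current
    working set, otherwise jump to a random page and drift the working set."""
    rng = random.Random(seed)
    working_set = list(range(10))
    trace = []
    for _ in range(n_refs):
        if rng.random() < locality:
            trace.append(rng.choice(working_set))
        else:
            page = rng.randrange(n_pages)
            trace.append(page)
            working_set[rng.randrange(len(working_set))] = page  # drift
    return trace

def lru_fault_rate(trace, n_frames):
    frames = OrderedDict()                 # insertion order == recency order
    faults = 0
    for page in trace:
        if page in frames:
            frames.move_to_end(page)       # mark most recently used
        else:
            faults += 1
            if len(frames) >= n_frames:
                frames.popitem(last=False) # evict least recently used
            frames[page] = None
    return faults / len(trace)

trace = synthetic_trace()
for n_frames in (5, 10, 20, 40):
    print(n_frames, f"{lru_fault_rate(trace, n_frames):.3f}")
```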

  19. Polarization Considerations for the Laser Interferometer Space Antenna

    NASA Technical Reports Server (NTRS)

    Waluschka, Eugene; Pedersen, Tracy R.; McNamara, Paul

    2005-01-01

    A polarization ray trace model of the Laser Interferometer Space Antenna's (LISA) optical path is being created. The model will be able to assess the effects of various polarizing elements and the optical coatings on the required, very long path length, picometer level dynamic interferometry. The computational steps are described. This should eliminate any ambiguities associated with polarization ray tracing of interferometers, provide a basis for determining the computer model's limitations, and serve as a clearly defined starting point for future work.

  20. Sequentially reweighted TV minimization for CT metal artifact reduction.

    PubMed

    Zhang, Xiaomeng; Xing, Lei

    2013-07-01

    Metal artifact reduction has long been an important topic in x-ray CT image reconstruction. In this work, the authors propose an iterative method that sequentially minimizes a reweighted total variation (TV) of the image and produces substantially artifact-reduced reconstructions. A sequentially reweighted TV minimization algorithm is proposed to fully exploit the sparseness of image gradients (IG). The authors first formulate a constrained optimization model that minimizes a weighted TV of the image, subject to the constraint that the estimated projection data are within a specified tolerance of the available projection measurements, with image non-negativity enforced. The authors then solve a sequence of weighted TV minimization problems where the weights used for the next iteration are computed from the current solution. Using the complete projection data, the algorithm first reconstructs an image from which a binary metal image can be extracted. Forward projection of the binary image identifies metal traces in the projection space. The metal-free background image is then reconstructed from the metal-trace-excluded projection data by employing a different set of weights. Each minimization problem is solved using a gradient method that alternates projection-onto-convex-sets and steepest descent. A series of simulation and experimental studies are performed to evaluate the proposed approach. Our study shows that the sequentially reweighted scheme, by altering a single parameter in the weighting function, flexibly controls the sparsity of the IG and reconstructs artifact-free images in a two-stage process. It successfully produces images with significantly reduced streak artifacts, suppressed noise and well-preserved contrast and edge properties. The sequentially reweighted TV minimization provides a systematic approach for suppressing CT metal artifacts. The technique can also be generalized to other "missing data" problems in CT image reconstruction.
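    The core of the reweighting loop can be sketched on a toy 1-D denoising problem: an inner gradient descent solves a weighted-TV problem, and the weights for the next outer pass are recomputed from the current solution so that small gradients (noise) are penalized more strongly than true edges. This only illustrates the reweighting scheme; the paper's projection-onto-convex-sets data constraint is replaced here by a quadratic fidelity term, an assumption for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)
truth = np.concatenate([np.zeros(50), np.ones(50), 0.4 * np.ones(50)])
y = truth + 0.08 * rng.standard_normal(truth.size)     # noisy measurement

lam, delta, eps, step = 0.5, 1e-6, 1e-2, 0.1
u = y.copy()
w = np.ones(y.size - 1)                                # first pass: plain TV

for outer in range(4):                                 # reweighting passes
    for inner in range(200):                           # weighted-TV descent
        g = np.diff(u)                                 # image gradients (IG)
        dabs = g / np.sqrt(g**2 + delta)               # smoothed d|g|/dg
        tv_grad = np.zeros_like(u)
        tv_grad[:-1] -= w * dabs                       # adjoint of np.diff
        tv_grad[1:] += w * dabs
        u -= step * (2.0 * (u - y) + lam * tv_grad)
    w = 1.0 / (np.abs(np.diff(u)) + eps)               # reweight from solution
    w /= w.max()                                       # keep weights in (0, 1]

print("RMSE vs truth:", float(np.sqrt(np.mean((u - truth) ** 2))))
```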

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Naseri, M; Rajabi, H; Wang, J

    Purpose: Respiration causes lesion smearing, image blurring and quality degradation, affecting lesion contrast and the ability to define correct lesion size. The spatial resolution of current multi-pinhole SPECT (MPHS) scanners is sub-millimeter. Therefore, the effect of motion is more noticeable than in conventional SPECT scanners. Gated imaging aims to reduce motion artifacts. A major issue in gating is the lack of statistics, and individual reconstructed frames are noisy. The increased noise in each frame deteriorates the quantitative accuracy of the MPHS images. The objective of this work is to enhance the image quality in 4D-MPHS imaging by 4D image reconstruction. Methods: The new algorithm requires deformation vector fields (DVFs) that are calculated by non-rigid Demons registration. The algorithm is based on the motion-incorporated version of the ordered subset expectation maximization (OSEM) algorithm. This iterative algorithm can make full use of all projections to reconstruct each individual frame. To evaluate the performance of the proposed algorithm, a simulation study was conducted. A fast ray tracing method was used to generate MPHS projections of a 4D digital mouse phantom with a small tumor in the liver in eight different respiratory phases. To evaluate the 4D-OSEM algorithm's potential, the tumor to liver activity ratio was compared with other image reconstruction methods including 3D-MPHS and post-reconstruction registration with Demons-derived DVFs. Results: Image quality of 4D-MPHS is greatly improved by the 4D-OSEM algorithm. When all projections are used to reconstruct a 3D-MPHS, motion blurring artifacts are present, leading to overestimation of the tumor size and 24% tumor contrast underestimation. This error is reduced to 16% and 10% for post-reconstruction registration methods and 4D-OSEM, respectively. Conclusion: The 4D-OSEM method can be used for motion correction in 4D-MPHS. The statistics and quantification are improved since all projection data are combined together to update the image.

  2. Tolerancing the alignment of large-core optical fibers, fiber bundles and light guides using a Fourier approach.

    PubMed

    Sawyer, Travis W; Petersburg, Ryan; Bohndiek, Sarah E

    2017-04-20

    Optical fiber technology is found in a wide variety of applications to flexibly relay light between two points, enabling information transfer across long distances and allowing access to hard-to-reach areas. Large-core optical fibers and light guides find frequent use in illumination and spectroscopic applications, for example, endoscopy and high-resolution astronomical spectroscopy. Proper alignment is critical for maximizing throughput in optical fiber coupling systems; however, there currently are no formal approaches to tolerancing the alignment of a light-guide coupling system. Here, we propose a Fourier alignment sensitivity (FAS) algorithm to determine the optimal tolerances on the alignment of a light guide by computing the alignment sensitivity. The algorithm shows excellent agreement with both simulated and experimentally measured values and improves on the computation time of equivalent ray-tracing simulations by two orders of magnitude. We then apply FAS to tolerance and fabricate a coupling system, which is shown to meet specifications, thus validating FAS as a tolerancing technique. These results indicate that FAS is a flexible and rapid means to quantify the alignment sensitivity of a light guide, widely informing the design and tolerancing of coupling systems.

  3. Multiscale Simulation of Gas Film Lubrication During Liquid Droplet Collision

    NASA Astrophysics Data System (ADS)

    Chen, Xiaodong; Khare, Prashant; Ma, Dongjun; Yang, Vigor

    2012-02-01

    Droplet collision plays an elementary role in the dense spray combustion process. When two droplets approach each other, a gas film forms in between, and the pressure generated within the film resists the motion of the approaching droplets. This is fluid film lubrication, which occurs when opposing bearing surfaces are completely separated by a fluid film. The lubrication flow in the gas film decides the collision outcome: coalescence or bouncing. The present study focuses on the gas film drainage process over a wide range of Weber numbers during equal- and unequal-sized droplet collisions. The formulation is based on the complete set of conservation equations for both the liquid and surrounding gas phases. An improved volume-of-fluid technique, augmented by an adaptive mesh refinement algorithm, is used to track liquid/gas interfaces. A unique thickness-based refinement algorithm based on the topology of the interfacial flow is developed and implemented to efficiently resolve the multiscale problem. The grid size on the interface is up to O(10-4) of the droplet size, with a maximum resolution of 0.015 μm. An advanced visualization technique using ray tracing is used to gain direct insight into the detailed physics. Theories are established by analyzing the characteristics of shape change and flow evolution.

  4. A Simple Geometrical Model for Calculation of the Effective Emissivity in Blackbody Cylindrical Cavities

    NASA Astrophysics Data System (ADS)

    De Lucas, Javier

    2015-03-01

    A simple geometrical model for calculating the effective emissivity in blackbody cylindrical cavities has been developed. The back ray tracing technique and the Monte Carlo method have been employed, making use of a suitable set of coordinates and auxiliary planes. In these planes, the trajectories of individual photons in the successive reflections between the cavity points are followed in detail. The theoretical model is implemented by using simple numerical tools, programmed in Microsoft Visual Basic for Application and Excel. The algorithm is applied to isothermal and non-isothermal diffuse cylindrical cavities with a lid; however, the basic geometrical structure can be generalized to a cylindro-conical shape and specular reflection. Additionally, the numerical algorithm and the program source code can be used, with minor changes, for determining the distribution of the cavity points, where photon absorption takes place. This distribution could be applied to the study of the influence of thermal gradients on the effective emissivity profiles, for example. Validation is performed by analyzing the convergence of the Monte Carlo method as a function of the number of trials and by comparison with published results of different authors.
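    A drastically simplified version of the Monte Carlo part can convey the idea: assume each wall bounce absorbs the photon with probability equal to the wall emissivity eps, and that a reflected photon escapes through the aperture with a fixed probability f (a stand-in for the actual cylinder view factors and back ray tracing). Under that assumption the chain has a closed-form answer, which the simulation reproduces.

```python
import random

def effective_emissivity(eps, f, n_photons=200_000, seed=42):
    """Monte Carlo estimate of the absorption probability of an entering
    photon, which equals the effective emissivity by reciprocity."""
    rng = random.Random(seed)
    absorbed = 0
    for _ in range(n_photons):
        while True:
            if rng.random() < eps:        # absorbed at the wall
                absorbed += 1
                break
            if rng.random() < f:          # reflected out through the aperture
                break
    return absorbed / n_photons

eps, f = 0.7, 0.05                        # assumed wall emissivity, escape prob.
mc = effective_emissivity(eps, f)
closed_form = eps / (eps + f * (1.0 - eps))   # same model solved analytically
print(f"Monte Carlo: {mc:.4f}   analytic: {closed_form:.4f}")
```

    The convergence of the estimate with the number of trials mirrors the validation strategy described in the abstract, although the real calculation replaces the constant escape probability with the cavity geometry.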

  5. Assessing the Suitability of the ClOud Reflection Algorithm (CORA) in Modelling the Evolution of an Artificial Plasma Cloud in the Ionosphere

    NASA Astrophysics Data System (ADS)

    Jackson-Booth, N.

    2016-12-01

    Artificial Ionospheric Modification (AIM) attempts to modify the ionosphere in order to alter the propagation environment. It can be achieved through injecting the ionosphere with aerosols, chemicals or radio signals. The effects of any such release can be detected through the deployment of sensors, including ground based high frequency (HF) sounders. During the Metal Oxide Space Clouds (MOSC) experiment (undertaken in April/May 2013 in the Kwajalein Atoll, part of the Marshall Islands) several oblique ionograms were recorded from a ground based HF system. These ionograms were collected over multiple geometries and allowed the effects on the HF propagation environment to be understood. These ionograms have subsequently been used in the ClOud Reflection Algorithm (CORA) to attempt to model the evolution of the cloud following release. This paper describes the latest validation results for CORA, both from testing against ionograms and from comparison with other independent models of cloud evolution from MOSC. In all tests the various cloud models (including that generated by CORA) were incorporated into a background ionosphere through which a 3D numerical ray trace was run to produce synthetic ionograms that could be compared with the ionograms recorded during MOSC.

  6. Tolerancing the alignment of large-core optical fibers, fiber bundles and light guides using a Fourier approach

    PubMed Central

    Sawyer, Travis W.; Petersburg, Ryan; Bohndiek, Sarah E.

    2017-01-01

    Optical fiber technology is found in a wide variety of applications to flexibly relay light between two points, enabling information transfer across long distances and allowing access to hard-to-reach areas. Large-core optical fibers and light guides find frequent use in illumination and spectroscopic applications, for example endoscopy and high-resolution astronomical spectroscopy. Proper alignment is critical for maximizing throughput in optical fiber coupling systems; however, there currently are no formal approaches to tolerancing the alignment of a light guide coupling system. Here, we propose a Fourier Alignment Sensitivity (FAS) algorithm to determine the optimal tolerances on the alignment of a light guide by computing the alignment sensitivity. The algorithm shows excellent agreement with both simulated and experimentally measured values and improves on the computation time of equivalent ray tracing simulations by two orders of magnitude. We then apply FAS to tolerance and fabricate a coupling system, which is shown to meet specifications, thus validating FAS as a tolerancing technique. These results indicate that FAS is a flexible and rapid means to quantify the alignment sensitivity of a light guide, widely informing the design and tolerancing of coupling systems. PMID:28430250

  7. Lateral Penumbra Modelling Based Leaf End Shape Optimization for Multileaf Collimator in Radiotherapy.

    PubMed

    Zhou, Dong; Zhang, Hui; Ye, Peiqing

    2016-01-01

    Lateral penumbra of the multileaf collimator plays an important role in radiotherapy treatment planning. Growing evidence has revealed that, for a single-focused multileaf collimator, lateral penumbra width is leaf position dependent and largely attributed to the leaf end shape. In our study, an analytical method for leaf end induced lateral penumbra modelling is formulated using Tangent Secant Theory. Compared with Monte Carlo simulation and a ray tracing algorithm, our model provides cost-efficient penumbra evaluation. Leaf ends represented in parametric forms of circular arc, elliptical arc, Bézier curve, and B-spline are implemented. With a biobjective function of penumbra mean and variance introduced, a genetic algorithm is carried out for approximating the Pareto frontier. Results show that for the circular arc leaf end the objective function is convex, and convergence to the optimal solution is guaranteed using a gradient based iterative method. It is found that the optimal leaf end in the shape of a Bézier curve achieves minimal standard deviation, while using a B-spline the minimum of penumbra mean is obtained. For treatment modalities in clinical application, optimized leaf ends are in close agreement with actual shapes. Taken together, the method that we propose can provide insight into leaf end shape design of multileaf collimators.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santos-Villalobos, Hector J; Barstow, Del R; Karakaya, Mahmut

    Iris recognition has been proven to be an accurate and reliable biometric. However, the recognition of non-ideal iris images, such as off-angle images, is still an unsolved problem. We propose a new biometric targeted eye model and a method to reconstruct the off-axis eye to its frontal view, allowing for recognition using existing methods and algorithms. This allows existing enterprise-level algorithms and approaches to remain largely unmodified by using our work as a pre-processor to improve performance. In addition, we describe the `Limbus effect' and its importance for an accurate segmentation of off-axis irides. Our method uses an anatomically accurate human eye model and ray-tracing techniques to compute a transformation function, which reconstructs the iris to its frontal, non-refracted state. Then, the same eye model is used to render a frontal view of the reconstructed iris. The proposed method is fully described and results from synthetic data are shown to establish an upper limit on performance improvement and establish the importance of the proposed approach over traditional linear elliptical unwrapping methods. Our results with synthetic data demonstrate the ability to perform accurate iris recognition with an image taken as much as 70 degrees off-axis.

  9. Portable TXRF Spectrometer with 10⁻¹¹ g Detection Limit and Portable XRF Spectromicroscope with Sub-mm Spatial Resolution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kunimura, Shinsuke; Hatakeyama, So; Sasaki, Nobuharu

    A portable total reflection X-ray fluorescence (TXRF) spectrometer that we have developed is applied to trace elemental analysis of water solutions. Although a 5 W X-ray tube is used in the portable TXRF spectrometer, detection limits of several ppb are achieved for 3d transition metal elements, and trace elements in a leaching solution of soils, a leaching solution of solder, and alcoholic beverages are detected. Portable X-ray fluorescence (XRF) spectromicroscopes with a 1 W X-ray tube and an 8 W X-ray tube are also presented. Using the portable XRF spectromicroscope with the 1 W X-ray tube, 93 ppm of Cr is detected with a spatial resolution of about 700 μm. Spatially resolved elemental analysis of a mug painted with blue, red, green, and white is performed using the two portable spectromicroscopes, and the difference in elemental composition at each paint is detected.

  10. Advancing X-ray scattering metrology using inverse genetic algorithms.

    PubMed

    Hannon, Adam F; Sunday, Daniel F; Windover, Donald; Kline, R Joseph

    2016-01-01

    We compare the speed and effectiveness of two genetic optimization algorithms to the results of statistical sampling via a Markov chain Monte Carlo algorithm to find which is the most robust method for determining real space structure in periodic gratings measured using critical dimension small angle X-ray scattering. Both a covariance matrix adaptation evolutionary strategy and differential evolution algorithm are implemented and compared using various objective functions. The algorithms and objective functions are used to minimize differences between diffraction simulations and measured diffraction data. These simulations are parameterized with an electron density model known to roughly correspond to the real space structure of our nanogratings. The study shows that for X-ray scattering data, the covariance matrix adaptation coupled with a mean-absolute error log objective function is the most efficient combination of algorithm and goodness of fit criterion for finding structures with little foreknowledge about the underlying fine scale structure features of the nanograting.
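    A toy version of this inverse approach, using SciPy's differential evolution and the mean-absolute-log-error criterion mentioned above; the one-dimensional "grating" model and all parameter values are stand-ins for the real electron-density model.

```python
import numpy as np
from scipy.optimize import differential_evolution

q = np.linspace(0.05, 2.0, 120)                      # scattering vector, 1/nm

def simulate(params, q):
    """Toy scattering simulation: slit form factor times a roughness damping."""
    width, roughness = params
    form = np.sinc(q * width / (2 * np.pi)) ** 2     # sin(qw/2)/(qw/2), squared
    return 1e4 * form * np.exp(-(roughness * q) ** 2) + 1.0

truth = (30.0, 0.8)
rng = np.random.default_rng(3)
measured = simulate(truth, q) * rng.lognormal(0.0, 0.05, q.size)

def mean_abs_log_error(params):
    """Mean-absolute error of log intensities, the objective named above."""
    return np.mean(np.abs(np.log(simulate(params, q)) - np.log(measured)))

result = differential_evolution(mean_abs_log_error,
                                bounds=[(5.0, 60.0), (0.1, 3.0)],
                                seed=7, tol=1e-8)
print("recovered (width, roughness):", result.x)
```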

  11. A framework for modeling connections between hydraulics, water surface roughness, and surface reflectance in open channel flows

    USGS Publications Warehouse

    Legleiter, Carl; Mobley, Curtis D.; Overstreet, Brandon

    2017-01-01

    This paper introduces a framework for examining connections between the flow field, the texture of the air-water interface, and the reflectance of the water surface and thus evaluating the potential to infer hydraulic information from remotely sensed observations of surface reflectance. We used a spatial correlation model describing water surface topography to illustrate the application of our framework. Nondimensional relations between model parameters and flow intensity were established based on a prior flume study. Expressing the model in the spatial frequency domain allowed us to use an efficient Fourier transform-based algorithm for simulating water surfaces. Realizations for both flume and field settings had water surface slope distributions positively correlated with velocity and water surface roughness. However, most surface facets were gently sloped and thus unlikely to yield strong specular reflections; the model exaggerated the extent of water surface features, leading to underestimation of facet slopes. A ray tracing algorithm indicated that reflectance was greatest when solar and view zenith angles were equal and the sensor scanned toward the Sun to capture specular reflections of the solar beam. Reflected energy was concentrated in a small portion of the sky, but rougher water surfaces reflected rays into a broader range of directions. Our framework facilitates flight planning to avoid surface-reflected radiance while mapping other river attributes, or to maximize this component to exploit relationships between hydraulics and surface reflectance. This initial analysis also highlighted the need for improved models of water surface topography in natural rivers.

  12. Using portable X-ray fluorescence spectrometry and GIS to assess environmental risk and identify sources of trace metals in soils of peri-urban areas in the Yangtze Delta region, China.

    PubMed

    Ran, Jing; Wang, Dejian; Wang, Can; Zhang, Gang; Yao, Lipeng

    2014-08-01

    Portable X-ray fluorescence (PXRF) spectrometry may be very suitable for a fast and effective environmental assessment and source identification of trace metals in soils. In this study, topsoils (0-10 cm) at 139 sites were in situ scanned for total trace metals (Cr, Cu, Ni, Pb and Zn) and arsenic concentrations by PXRF in a typical town in Yangtze Delta region of Jiangsu province, China. To validate the utility of PXRF, 53 samples were collected from the scanning sites for the determination of selected trace metals using conventional methods. Based on trace metal concentrations detected by in situ PXRF, the contamination extent and sources of trace metals were studied via geo-accumulation index, multivariate analysis and geostatistics. The trace metal concentrations determined by PXRF were similar to those obtained via conventional chemical analysis. The median concentration of As, Cr, Cu, Ni, Pb and Zn in soils were 10.8, 56.4, 41.5, 43.5, 33.5, and 77.7 mg kg(-1), respectively. The distribution patterns of Cr, Cu, Ni, Pb, and Zn were mostly affected by anthropogenic sources, while As was mainly derived from lithogenic sources. Overall, PXRF has been successfully applied to contamination assessment and source identification of trace metals in soils.

  13. Evaluating the effect of online data compression on the disk cache of a mass storage system

    NASA Technical Reports Server (NTRS)

    Pentakalos, Odysseas I.; Yesha, Yelena

    1994-01-01

    A trace driven simulation of the disk cache of a mass storage system was used to evaluate the effect of an online compression algorithm on various performance measures. Traces from the system at NASA's Center for Computational Sciences were used to run the simulation and disk cache hit ratios, number of files and bytes migrating to tertiary storage were measured. The measurements were performed for both an LRU and a size based migration algorithm. In addition to seeing the effect of online data compression on the disk cache performance measure, the simulation provided insight into the characteristics of the interactive references, suggesting that hint based prefetching algorithms are the only alternative for any future improvements to the disk cache hit ratio.
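    The sketch below illustrates the experimental setup in miniature: the same synthetic file-request trace is replayed against an LRU-managed disk cache with and without a uniform 2:1 online compression ratio, and the byte hit ratio is compared. The trace statistics, file sizes, and the compression ratio are assumptions.

```python
from collections import OrderedDict
import random

def run_cache(trace, sizes, capacity, compression=1.0):
    """Replay a file-request trace against an LRU cache; compression shrinks
    each file's footprint on the cache disk, so more files fit."""
    cache, used, hit_bytes, total_bytes = OrderedDict(), 0.0, 0.0, 0.0
    for f in trace:
        stored = sizes[f] * compression
        total_bytes += sizes[f]
        if f in cache:
            cache.move_to_end(f)
            hit_bytes += sizes[f]
            continue
        while used + stored > capacity and cache:
            _, evicted = cache.popitem(last=False)   # LRU eviction to tape
            used -= evicted
        cache[f] = stored
        used += stored
    return hit_bytes / total_bytes

rng = random.Random(9)
sizes = {f: rng.lognormvariate(3.0, 1.0) for f in range(2000)}      # MB
trace = [min(int(rng.expovariate(1 / 300.0)), 1999) for _ in range(30000)]

for ratio, label in ((1.0, "uncompressed"), (0.5, "2:1 compression")):
    print(label, f"byte hit ratio = {run_cache(trace, sizes, 5000.0, ratio):.3f}")
```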

  14. Analysis of stray radiation for infrared optical system

    NASA Astrophysics Data System (ADS)

    Li, Yang; Zhang, Tingcheng; Liao, Zhibo; Mu, Shengbo; Du, Jianxiang; Wang, Xiangdong

    2016-10-01

    Based on the theory of radiation energy transfer in the infrared optical system, two methods are proposed for analysing the stray radiation caused by interior thermal radiation in an infrared optical system: one is an importance sampling technique using forward ray tracing, the other an integral computation method using reverse ray tracing. The two methods are discussed in detail. A concrete infrared optical system is provided as an example. LightTools is used to simulate the passage of radiation from the mirrors and mounts, and absolute values of internal irradiance on the detector are obtained. The results show that the main part of the energy on the detector is due to the critical objects, consistent with the critical objects obtained by reverse ray tracing; the mirror self-emission contribution is about 87.5% of the total energy. Correspondingly, the irradiances on the detector calculated by the two methods are in good agreement, which proves the validity and rationality of both methods.

  15. Method for positron emission mammography image reconstruction

    DOEpatents

    Smith, Mark Frederick

    2004-10-12

    An image reconstruction method comprising accepting coincidence data from either a data file or in real time from a pair of detector heads, culling event data that is outside a desired energy range, optionally saving the desired data for each detector position or for each pair of detector pixels on the two detector heads, and then reconstructing the image either by backprojection image reconstruction or by iterative image reconstruction. In the backprojection image reconstruction mode, rays are traced between centers of lines of response (LORs), counts are then either allocated by nearest pixel interpolation or allocated by an overlap method and then corrected for geometric effects and attenuation, and the data file is updated. If the iterative image reconstruction option is selected, one implementation is to compute a Siddon ray tracing on a grid, and to perform maximum likelihood expectation maximization (MLEM) computed by either: a) tracing parallel rays between subpixels on opposite detector heads; or b) tracing rays between randomized endpoint locations on opposite detector heads.
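    The Siddon-style grid trace used in the iterative mode can be sketched in two dimensions: parametric crossings of the ray with the grid lines are merged and sorted, and the length of each resulting segment is assigned to the pixel containing its midpoint. The grid layout and detector coordinates below are assumptions for illustration.

```python
import numpy as np

def siddon_2d(p0, p1, nx, ny, d):
    """Siddon-style trace of the ray p0 -> p1 across an nx-by-ny grid of
    pixels with spacing d and corner at the origin. Returns (ix, iy, length)
    triples, one per pixel the ray crosses."""
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    delta = p1 - p0
    ray_len = np.hypot(*delta)
    alphas = [0.0, 1.0]
    for axis, n in ((0, nx), (1, ny)):
        if delta[axis] != 0.0:
            planes = d * np.arange(n + 1)             # grid-line coordinates
            a = (planes - p0[axis]) / delta[axis]     # parametric crossings
            alphas.extend(a[(a > 0.0) & (a < 1.0)])
    alphas = np.unique(alphas)
    out = []
    for a0, a1 in zip(alphas[:-1], alphas[1:]):
        mid = p0 + 0.5 * (a0 + a1) * delta            # midpoint of segment
        ix, iy = int(mid[0] // d), int(mid[1] // d)
        if 0 <= ix < nx and 0 <= iy < ny:             # skip segments off-grid
            out.append((ix, iy, (a1 - a0) * ray_len)) # intersection length
    return out

# Line of response between two detector positions across a 4x4 grid of 1-mm pixels:
for ix, iy, length in siddon_2d((-0.5, 0.3), (4.5, 3.7), 4, 4, 1.0):
    print(ix, iy, round(length, 3))
```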

  16. Three dimensional ray tracing Jovian magnetosphere in the low frequency range

    NASA Technical Reports Server (NTRS)

    Menietti, J. D.

    1982-01-01

    Ray tracing of the Jovian magnetosphere in the low frequency range (1-40 MHz) has resulted in a new understanding of the source mechanism for Io-dependent decametric radiation (DAM). Our three dimensional ray tracing computer code has provided model DAM arcs at 10 deg. intervals of Io longitude source positions for the full 360 deg of Jovian system III longitude. In addition, particularly interesting arcs were singled out for detailed study and modelling. Io-dependent decametric radiation arcs are categorized according to curvature--the higher curvature arcs are apparently due to wave stimulation at a nonconstant wave normal angle, psi. The psi(f) relationship has a signature that is common to most of the higher curvature arcs. The low curvature arcs, on the other hand, are adequately modelled with a constant wave normal angle of close to 90 deg. These results imply that for the higher curvature arcs observed far from Jupiter (to diminish spacecraft motion effects), the electrons providing the gyroemission are relativistically beamed.

  17. Detection of fruit-fly infestation in olives using X-ray imaging: Algorithm development and prospects

    USDA-ARS?s Scientific Manuscript database

    An algorithm using a Bayesian classifier was developed to automatically detect olive fruit fly infestations in x-ray images of olives. The data set consisted of 249 olives with various degrees of infestation and 161 non-infested olives. Each olive was x-rayed on film and digital images were acquired...
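    A minimal Gaussian naive-Bayes classifier of the kind described can be sketched as follows; the two features and the synthetic class statistics are hypothetical stand-ins for whatever descriptors were extracted from the olive X-ray images (only the class sizes, 249 and 161, come from the abstract).

```python
import numpy as np

rng = np.random.default_rng(5)
# Hypothetical features: mean gray level and dark-spot area (synthetic data).
infested = rng.normal([0.45, 30.0], [0.08, 8.0], size=(249, 2))
clean = rng.normal([0.60, 8.0], [0.08, 5.0], size=(161, 2))

def fit(X, y):
    """Per-class feature means, variances, and priors."""
    return {c: (X[y == c].mean(0), X[y == c].var(0) + 1e-9, np.mean(y == c))
            for c in np.unique(y)}

def predict(stats, x):
    """Pick the class with the highest Gaussian log-posterior."""
    def log_post(c):
        mu, var, prior = stats[c]
        return (np.log(prior)
                - 0.5 * np.sum(np.log(2 * np.pi * var))
                - 0.5 * np.sum((x - mu) ** 2 / var))
    return max(stats, key=log_post)

X = np.vstack([infested, clean])
y = np.array([1] * len(infested) + [0] * len(clean))
stats = fit(X, y)
acc = np.mean([predict(stats, xi) == yi for xi, yi in zip(X, y)])
print(f"training accuracy: {acc:.2f}")
```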

  18. The Abundance of Large Arcs From CLASH

    NASA Astrophysics Data System (ADS)

    Xu, Bingxiao; Postman, Marc; Meneghetti, Massimo; Coe, Dan A.; Clash Team

    2015-01-01

    We have developed an automated arc-finding algorithm to perform a rigorous comparison of the observed and simulated abundance of large lensed background galaxies (a.k.a. arcs). We use images from the CLASH program to derive our observed arc abundance. Simulated CLASH images are created by performing ray tracing through mock clusters generated by the N-body simulation calibrated tool -- MOKA, and N-body/hydrodynamic simulations -- MUSIC, over the same mass and redshift range as the CLASH X-ray selected sample. We derive a lensing efficiency of 15 ± 3 arcs per cluster for the X-ray selected CLASH sample and 4 ± 2 arcs per cluster for the simulated sample. The marginally significant difference (3.0 σ) between the results for the observations and the simulations can be explained by the systematically smaller area with magnification larger than 3 (by a factor of ~4) in both the MOKA and MUSIC mass models relative to those derived from the CLASH data. Accounting for this difference brings the observed and simulated arc statistics into full agreement. We find that the source redshift distribution does not have a big impact on the arc abundance, but the arc abundance is very sensitive to the concentration of the dark matter halos. Our results suggest that the solution to the "arc statistics problem" lies primarily in matching the cluster dark matter distribution.

  19. Ultrasonic transmission at solid-liquid interfaces

    NASA Astrophysics Data System (ADS)

    Wadley, Haydn N. G.; Queheillalt, Douglas T.; Lu, Yichi

    1996-11-01

    New non-invasive solid-liquid interface sensing technologies are a key element in the development of improved Bridgman growth techniques for synthesizing single crystal semiconductor materials. Laser generated and optically detected ultrasonic techniques have the potential to satisfy this need. Using an anisotropic 3D ray tracing methodology combined with elastic constant data measured near the melting point, ultrasonic propagation in cylindrical single crystal bodies containing either a convex, flat, or concave solid-liquid interface has been simulated. Ray paths, wavefronts and the time-of-flight (TOF) of rays that travel from a source to an arbitrarily positioned receiver have all been calculated. Experimentally measured TOF data have been collected using laser generated, optically detected ultrasound on model systems with independently known interface shapes. Both numerically simulated and experimental data have shown that the solidification region can be easily identified from transmission TOF measurements because the velocity of the liquid is much smaller than that of the solid. Since convex and concave solid-liquid interfaces result in distinctively different TOF data profiles, the interface shape can also be readily determined from the TOF data. When TOF data collected in the diametral plane are used in conjunction with a nonlinear least squares algorithm, the interface geometry has been successfully reconstructed and ultrasonic velocities of both the solid and liquid obtained with reconstruction errors less than 5 percent.

  20. 49 CFR Appendix A to Part 1511 - Aviation Security Infrastructure Fee

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... final acceptance testing. This includes such equipment as Metal Detection Devices, Hand Wands, X-ray... such equipment as Metal Detection Devices, Hand Wands, X-ray screening machines, Explosives Trace... as test objects and X-ray radiation surveys, electricity costs and maintenance contract costs...

  1. CORFIG- CORRECTOR SURFACE DESIGN SOFTWARE

    NASA Technical Reports Server (NTRS)

    Dantzler, A.

    1994-01-01

    Corrector Surface Design Software, CORFIG, calculates the optimum figure of a corrector surface for an optical system based on real ray traces. CORFIG generates the corrector figure in the form of a spline data point table and/or a list of polynomial coefficients. The number of spline data points as well as the number of coefficients is user specified. First, the optical system's parameters (thickness, radii of curvature, etc.) are entered. CORFIG will trace the outermost axial real ray through the uncorrected system to determine approximate radial limits for all rays. Then, several real rays are traced backwards through the system from the image to the surface that originally followed the object, within these radial limits. At this first surface, the local curvature is adjusted on a small scale to direct the rays toward the object, thus removing any accumulated aberrations. For each ray traced, this adjustment will be different, so that at the end of this process the resultant surface is made up of many local curvatures. The equations that describe these local surfaces, expressed as high order polynomials, are then solved simultaneously to yield the final surface figure, from which data points are extracted. Finally, a spline table or list of polynomial coefficients is extracted from these data points. CORFIG is intended to be used in the late stages of optical design. The system's design must have at least a good paraxial foundation. Preferably, the design should be at a stage where traditional methods of Seidel aberration correction will not bring about the required image spot size specification. CORFIG will read the system parameters of such a design and calculate the optimum figure for the first surface such that all of the original parameters remain unchanged. Depending upon the system, CORFIG can reduce the RMS image spot radius by a factor of 5 to 25. The original parameters (magnification, back focal length, etc.) are maintained because all rays upon which the corrector figure is based are traced within the bounds of the original system's outermost ray. For this reason the original system must have a certain degree of integrity. CORFIG optimizes the corrector surface figure for on-axis images at a single wavelength only. However, it has been demonstrated many times that CORFIG's method also significantly improves the quality of field images and images formed from wavelengths other than the center wavelength. CORFIG is written completely in VAX FORTRAN. It has been implemented on a DEC VAX series computer under VMS with a central memory requirement of 55 K bytes. This program was developed in 1986.
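
    The two output forms CORFIG produces can be mimicked in a few lines: given sampled sag points of the corrected surface, fit even-order polynomial coefficients and build a spline table. The sag data below are synthetic placeholders, not CORFIG output.

        import numpy as np
        from scipy.interpolate import CubicSpline

        # Hypothetical (radius, sag) samples of a rotationally symmetric
        # corrector figure; CORFIG derives these from backward ray traces.
        r = np.linspace(0.0, 10.0, 41)
        z = 1e-3 * r**2 + 2e-6 * r**4

        # Polynomial coefficients in powers of r^2 (even surface) ...
        coeffs = np.polynomial.polynomial.polyfit(r**2, z, deg=3)
        # ... and a spline data-point table as the alternative output form.
        spline = CubicSpline(r, z)

        print("coefficients of (r^2)^k:", coeffs)
        print("spline sag at r = 5.5:", float(spline(5.5)))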

  2. Plane-Based Sampling for Ray Casting Algorithm in Sequential Medical Images

    PubMed Central

    Lin, Lili; Chen, Shengyong; Shao, Yan; Gu, Zichun

    2013-01-01

    This paper proposes a plane-based sampling method to improve the traditional Ray Casting Algorithm (RCA) for the fast reconstruction of a three-dimensional biomedical model from sequential images. In the novel method, the optical properties of all sampling points depend on the intersection points when a ray travels through an equidistant parallel plane cluster of the volume dataset. The results show that the method improves the rendering speed by over three times compared with the conventional algorithm, while image quality is well preserved. PMID:23424608
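
    A minimal sketch of the core geometric step, under the assumption that the plane cluster is perpendicular to the z axis (the names are illustrative, not from the paper):

        import numpy as np

        def plane_samples(origin, direction, z0, dz, n_planes):
            """Intersection points of one ray with planes z = z0 + k*dz."""
            o, d = np.asarray(origin, float), np.asarray(direction, float)
            if abs(d[2]) < 1e-12:            # ray parallel to the cluster
                return np.empty((0, 3))
            t = (z0 + np.arange(n_planes) * dz - o[2]) / d[2]
            t = t[t >= 0.0]                  # keep hits in front of the origin
            return o + t[:, None] * d

        pts = plane_samples((0, 0, 0), (0.3, 0.1, 1.0), z0=1.0, dz=0.5, n_planes=8)
        print(pts)                           # sampling locations along the ray

    The optical properties at these intersection points replace the fixed-step samples of conventional RCA, which is where the speedup comes from.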

  3. TOMO3D: 3-D joint refraction and reflection traveltime tomography parallel code for active-source seismic data—synthetic test

    NASA Astrophysics Data System (ADS)

    Meléndez, A.; Korenaga, J.; Sallarès, V.; Miniussi, A.; Ranero, C. R.

    2015-10-01

    We present a new 3-D traveltime tomography code (TOMO3D) for the modelling of active-source seismic data that uses the arrival times of both refracted and reflected seismic phases to derive the velocity distribution and the geometry of reflecting boundaries in the subsurface. This code is based on its popular 2-D version TOMO2D from which it inherited the methods to solve the forward and inverse problems. The traveltime calculations are done using a hybrid ray-tracing technique combining the graph and bending methods. The LSQR algorithm is used to perform the iterative regularized inversion to improve the initial velocity and depth models. In order to cope with an increased computational demand due to the incorporation of the third dimension, the forward problem solver, which takes most of the run time (˜90 per cent in the test presented here), has been parallelized with a combination of multi-processing and message passing interface standards. This parallelization distributes the ray-tracing and traveltime calculations among available computational resources. The code's performance is illustrated with a realistic synthetic example, including a checkerboard anomaly and two reflectors, which simulates the geometry of a subduction zone. The code is designed to invert for a single reflector at a time. A data-driven layer-stripping strategy is proposed for cases involving multiple reflectors, and it is tested for the successive inversion of the two reflectors. Layers are bound by consecutive reflectors, and an initial velocity model for each inversion step incorporates the results from previous steps. This strategy poses simpler inversion problems at each step, allowing the recovery of strong velocity discontinuities that would otherwise be smoothened.
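
    The regularized inversion step can be sketched with SciPy's LSQR; the system matrix below is a random stand-in for the sensitivity of traveltimes to model updates, not TOMO3D's actual kernel.

        import numpy as np
        from scipy.sparse import random as sparse_random
        from scipy.sparse.linalg import lsqr

        rng = np.random.default_rng(0)
        G = sparse_random(200, 500, density=0.02, random_state=0)  # rays x cells
        m_true = rng.normal(size=500)                 # hypothetical model update
        d = G @ m_true + 0.01 * rng.normal(size=200)  # traveltime residuals

        # Damped least squares: minimize ||G m - d||^2 + damp^2 ||m||^2
        m_est, istop, itn = lsqr(G, d, damp=0.1)[:3]
        print("stop flag", istop, "after", itn, "iterations")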

  4. Emerging spatial curvature can resolve the tension between high-redshift CMB and low-redshift distance ladder measurements of the Hubble constant

    NASA Astrophysics Data System (ADS)

    Bolejko, Krzysztof

    2018-05-01

    The measurements of the Hubble constant reveal a tension between high-redshift (CMB) and low-redshift (distance ladder) constraints. So far neither observational systematics nor new physics has been successfully implemented to explain away this tension. This paper presents a new solution to the Hubble constant problem. The solution is based on the Simsilun simulation (relativistic simulation of the large scale structure of the Universe) with the ray-tracing algorithm implemented. The initial conditions for the Simsilun simulation were set up as perturbations around the ΛCDM model. However, unlike in the standard cosmological model (i.e., ΛCDM model + perturbations), within the Simsilun simulation relativistic and nonlinear evolution of cosmic structures leads to the phenomenon of emerging spatial curvature, where the mean spatial curvature evolves from the spatial flatness of the early Universe towards the slightly curved present-day Universe. Consequently, the present-day expansion rate is slightly faster compared to the spatially flat ΛCDM model. The results of the ray-tracing analysis show that a Universe which starts with initial conditions consistent with the Planck constraints should have the Hubble constant H0 = 72.5 ± 2.1 km s⁻¹ Mpc⁻¹. When the Simsilun simulation was rerun with no inhomogeneities imposed, the Hubble constant inferred within such a homogeneous simulation was H0 = 68.1 ± 2.0 km s⁻¹ Mpc⁻¹. Thus, the inclusion of nonlinear relativistic evolution that leads to the emergence of spatial curvature can explain why the low-redshift measurements favor higher values compared to the high-redshift constraints and alleviate the tension between the CMB and distance ladder measurements of the Hubble constant.

  5. Solar Thermal Concept Evaluation

    NASA Technical Reports Server (NTRS)

    Hawk, Clark W.; Bonometti, Joseph A.

    1995-01-01

    Concentrated solar thermal energy can be utilized in a variety of high temperature applications for both terrestrial and space environments. In each application, knowledge of the collector and absorber's heat exchange interaction is required. To understand this coupled mechanism, various concentrator types and geometries, as well as their relationship to the physical absorber mechanics, were investigated. To conduct experimental tests, various parts of a 5,000-watt thermal concentrator facility were built and evaluated, in anticipation of a larger NASA facility proposed for construction. Although much of the work centered on solar thermal propulsion for an upper stage (less than one pound thrust range), the information generated and the facility's capabilities are applicable to material processing, power generation and similar uses. The numerical calculations used to design the laboratory mirror and the procedure for evaluating other solar collectors are presented here. The mirror design is based on a hexagonal faceted system, which uses a spherical approximation to the parabolic surface. The work began with a few two-dimensional estimates and continued with a full three-dimensional numerical algorithm written in FORTRAN. This was compared to a full-geometry ray trace program, BEAM 4, which optimizes the curvatures based on purely optical considerations. Based on the numerical results, the characteristics of a faceted concentrator were determined, and the numerical methodologies themselves were evaluated and categorized. As a result, the three-dimensional FORTRAN code was the method chosen to construct the mirrors, due to its overall accuracy and superior results relative to the ray trace program. This information is being used to fabricate and, subsequently, laser map the actual mirror surfaces. Evaluation of concentrator mirrors, thermal applications, and scaling of the results from the 10-foot-diameter mirror to a much larger concentrator were studied. Evaluations, recommendations and pitfalls regarding the structure, materials and facility design are presented.
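
    The spherical approximation rests on a small-sag expansion (quoted here as the textbook relation, not from the report). A paraboloid of focal length f has sag z = r^2/(4f), while a sphere of radius R has

        z_{\mathrm{sph}} = R - \sqrt{R^2 - r^2} \approx \frac{r^2}{2R} + \frac{r^4}{8R^3},

    so choosing R = 2f matches the paraboloid to leading order, leaving a residual of roughly r^4/(64 f^3) that bounds how large each facet may be before the approximation degrades.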

  6. EAGLE Monitors by Collecting Facts and Generating Obligations

    NASA Technical Reports Server (NTRS)

    Barrnger, Howard; Goldberg, Allen; Havelund, Klaus; Sen, Koushik

    2003-01-01

    We present a rule-based framework, called EAGLE, that has been shown to be capable of defining and implementing a range of finite trace monitoring logics, including future and past time temporal logic, extended regular expressions, real-time and metric temporal logics, interval logics, forms of quantified temporal logics, and so on. A monitor for an EAGLE formula checks if a finite trace of states satisfies the given formula. We present, in detail, an algorithm for the synthesis of monitors for EAGLE. The algorithm is implemented as a Java application and involves novel techniques for rule definition, manipulation and execution. Monitoring is achieved on a state-by-state basis, avoiding any need to store the input trace of states. Our initial experiments have been successful, as EAGLE detected a previously unknown bug while testing a planetary rover controller.
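
    The state-by-state flavor of such monitors is easy to sketch. The following checks a simple past-time property ("an alarm must be preceded by a fault") over a finite trace without storing it; this illustrates the monitoring style only, not EAGLE's rule engine.

        # Summary-of-the-past monitoring: one boolean is carried forward,
        # so memory use is independent of trace length.
        def monitor(states):
            seen_fault = False
            for i, s in enumerate(states):
                seen_fault = seen_fault or s.get("fault", False)
                if s.get("alarm", False) and not seen_fault:
                    return "violated at state %d" % i
            return "satisfied"

        trace = [{"fault": False}, {"fault": True}, {"alarm": True}]
        print(monitor(trace))   # -> satisfied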

  7. Statistical Models for Averaging of the Pump–Probe Traces: Example of Denoising in Terahertz Time-Domain Spectroscopy

    NASA Astrophysics Data System (ADS)

    Skorobogatiy, Maksim; Sadasivan, Jayesh; Guerboukha, Hichem

    2018-05-01

    In this paper, we first discuss the main types of noise in a typical pump-probe system, and then focus specifically on terahertz time-domain spectroscopy (THz-TDS) setups. We then introduce four statistical models for the noisy pulses obtained in such systems, and detail rigorous mathematical algorithms to de-noise such traces, find the proper averages and characterise various types of experimental noise. Finally, we perform a comparative analysis of the performance, advantages and limitations of the algorithms by testing them on experimental data collected using a particular THz-TDS system available in our laboratories. We conclude that using advanced statistical models for trace averaging results in fitting errors that are significantly smaller than those obtained when only a simple statistical average is used.
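
    As a toy version of the idea, the sketch below replaces a plain mean with an inverse-variance weighted average, estimating each trace's noise from a signal-free baseline window; the window position and weighting scheme are illustrative assumptions, not the paper's statistical models.

        import numpy as np

        def weighted_average(traces, baseline=32):
            traces = np.asarray(traces)              # (n_traces, n_samples)
            noise_var = traces[:, :baseline].var(axis=1, ddof=1)
            w = 1.0 / np.maximum(noise_var, 1e-12)   # inverse-variance weights
            return (w[:, None] * traces).sum(axis=0) / w.sum()

        rng = np.random.default_rng(1)
        pulse = np.exp(-np.linspace(-3.0, 3.0, 256) ** 2)
        noisy = [pulse + rng.normal(0.0, s, 256) for s in (0.05, 0.05, 0.5)]
        err = np.abs(weighted_average(noisy) - pulse).mean()
        print(err)    # smaller than the error of a plain mean over the set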

  8. Directional impulse response of a large cavity inside a sonic crystal.

    PubMed

    Spiousas, Ignacio; Eguia, Manuel C

    2012-10-01

    Both temporal and directional responses of a cavity inside a two-dimensional sonic crystal are investigated. The size of the cavity is large compared to the lattice parameter and the wavelength for the frequency range of interest. Hence, a hybrid method to compute the response is proposed, combining multiscattering theory for the calculation of the reflective properties of the sonic crystal with a modified ray-tracing algorithm for the sound propagation within the cavity. The response of this enclosure displays resonances for certain frequency bands that depend on the geometry of the lattice and the cavity. When a full band gap exists in the sonic crystal, rays cannot propagate through the medium and total reflection occurs for all incidence angles, leading to strong resonances with an isotropic intensity field inside the cavity. When only some propagation directions are forbidden, total reflection occurs for certain ranges of incidence angles, and resonances can also be elicited but with a highly anisotropic intensity field. The spectrum of resonances of the cavity is strongly affected by changes in the lattice geometry, suggesting that they can be tailored to some extent, a feature that can lead to potential applications in architectural acoustics.

  9. Ray Tracing through the Edge Focusing of Rectangular Benders and an Improved Model for the Los Alamos Proton Storage Ring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kolski, Jeffrey S.; Barlow, David B.; Macek, Robert J.

    2011-01-01

    Particle ray tracing through simulated 3D magnetic fields was executed to investigate the effective quadrupole strength of the edge focusing of the rectangular bending magnets in the Los Alamos Proton Storage Ring (PSR). The particle rays receive a kick in the edge field of the rectangular dipole. A focal length may be calculated from the particle tracking and related to the fringe field integral (FINT) model parameter. This tech note introduces the baseline lattice model of the PSR and motivates the need for an improvement in the baseline model's vertical tune prediction, which differs from measurement by 0.05. An improved model of the PSR is created by modifying the fringe field integral parameter to that suggested by the ray tracing investigation. This improved model is then verified against measurement at the nominal PSR operating set point and at set points far from the nominal operating conditions. Lastly, Linear Optics from Closed Orbits (LOCO) is employed in an orbit response matrix method for model improvement to verify the quadrupole strengths of the improved model.
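
    In the usual hard-edge convention (as implemented in codes such as MAD), the vertical focal length of a pole face rotated by angle \beta depends on FINT through

        \frac{1}{f_y} = -\frac{\tan(\beta - \psi)}{\rho},
        \qquad
        \psi = \frac{g}{\rho}\,\mathrm{FINT}\,\frac{1 + \sin^2\beta}{\cos\beta},

    where \rho is the bending radius and g the full gap, so extracting f_y from the tracked rays fixes FINT. These are the standard first-order formulas, quoted for orientation rather than from the tech note itself.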

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deptuch, G. W.; Fahim, F.; Grybos, P.

    An on-chip implementable algorithm for allocation of an X-ray photon imprint, called a hit, to a single pixel in the presence of charge sharing in a highly segmented pixel detector is described. Its proof-of-principle implementation is also given, supported by the results of tests using a highly collimated X-ray photon beam from a synchrotron source. The algorithm handles asynchronous arrivals of X-ray photons. Activation of groups of pixels, comparisons of peak amplitudes of pulses within an active neighborhood, and finally latching of the results of these comparisons constitute the three procedural steps of the algorithm. A grouping of pixels to one virtual pixel that recovers composite signals, and event-driven strobes to control comparisons of fractional signals between neighboring pixels, are the actuators of the algorithm. The circuitry necessary to implement the algorithm requires an extensive inter-pixel connection grid of analog and digital signals that are exchanged between pixels. A test-circuit implementation of the algorithm was achieved with a small array of 32 × 32 pixels, and the device was exposed to an 8 keV X-ray beam highly collimated to a diameter of 3 μm. The results of these tests are given in the paper, assessing the physical implementation of the algorithm.

  12. Ray-tracing critical-angle transmission gratings for the X-ray Surveyor and Explorer-size missions

    NASA Astrophysics Data System (ADS)

    Günther, Hans M.; Bautz, Marshall W.; Heilmann, Ralf K.; Huenemoerder, David P.; Marshall, Herman L.; Nowak, Michael A.; Schulz, Norbert S.

    2016-07-01

    We study a critical-angle transmission (CAT) grating spectrograph that delivers a spectral resolution significantly above that of any X-ray spectrograph ever flown. This new technology will allow us to resolve kinematic components in absorption and emission lines of galactic and extragalactic matter down to unprecedented dispersion levels. We perform ray-trace simulations to characterize the performance of the spectrograph in the context of an X-ray Surveyor or Arcus-like layout (two mission concepts currently under study). Our newly developed ray-trace code is a tool suite to simulate the performance of X-ray observatories. The simulator code is written in Python, because the use of a high-level scripting language allows modifications of the simulated instrument design in very few lines of code. This is especially important in the early phase of mission development, when the performances of different configurations are contrasted. To reduce the run time and allow for simulations of a few million photons in a few minutes on a desktop computer, the simulator code uses tabulated input (from theoretical models or laboratory measurements of samples) for grating efficiencies and mirror reflectivities. We find that the grating facet alignment tolerances required to maintain at least 90% of the resolving power that the spectrometer has with perfect alignment are (i) translation parallel to the optical axis below 0.5 mm, (ii) rotation around the optical axis or the groove direction below a few arcminutes, and (iii) constancy of the grating period to 1:10⁵. Translations along and rotations around the remaining axes can be significantly larger than this without impacting the performance.

  13. Gravity Waves and Mesospheric Clouds in the Summer Middle Atmosphere: A Comparison of Lidar Measurements and Ray Modeling of Gravity Waves Over Sondrestrom, Greenland

    NASA Technical Reports Server (NTRS)

    Gerrard, Andrew J.; Kane, Timothy J.; Eckermann, Stephen D.; Thayer, Jeffrey P.

    2004-01-01

    We conducted gravity wave ray-tracing experiments within an atmospheric region centered near the ARCLITE lidar system at Sondrestrom, Greenland (67N, 310 deg E), in efforts to understand lidar observations of both upper stratospheric gravity wave activity and mesospheric clouds during August 1996 and the summer of 2001. The ray model was used to trace gravity waves through realistic three-dimensional daily-varying background atmospheres in the region, based on forecasts and analyses in the troposphere and stratosphere and climatologies higher up. Reverse ray tracing based on upper stratospheric lidar observations at Sondrestrom was also used to try to objectively identify wave source regions in the troposphere. A source spectrum specified by reverse ray tracing experiments in early August 1996 (when atmospheric flow patterns produced enhanced transmission of waves into the upper stratosphere) yielded model results throughout the remainder of August 1996 that agreed best with the lidar observations. The model also simulated increased vertical group propagation of waves between 40 km and 80 km due to intensifying mean easterlies, which allowed many of the gravity waves observed at 40 km over Sondrestrom to propagate quasi-vertically from 40-80 km and then interact with any mesospheric clouds at 80 km near Sondrestrom, supporting earlier experimentally-inferred correlations between upper stratospheric gravity wave activity and mesospheric cloud backscatter from Sondrestrom lidar observations. A pilot experiment of real-time runs with the model in 2001 using weather forecast data as a low-level background produced less agreement with lidar observations. We believe this is due to limitations in our specified tropospheric source spectrum, the use of climatological winds and temperatures in the upper stratosphere and mesosphere, and missing lidar data from important time periods.

  14. Topics in polarization ray tracing for image projectors

    NASA Astrophysics Data System (ADS)

    Rosenbluth, Alan E.; Gallatin, Gregg; Lai, Kafai; Seong, Nakgeuon; Singh, Rama N.

    2005-08-01

    Many subtle effects arise when tracing polarization along rays that converge or diverge to form an image. This paper concentrates on a few examples that are notable for the challenges they pose in properly analyzing vector imaging problems. A striking example is the Fedorov-Imbert shift, in which coating phase-shifts cause a reflected beam to actually be deviated "sideways" out of the plane of incidence. A second example involving groups of coated surfaces is the correction of contrast loss from skew-angle depolarization in the optics of data projectors that use reflective polarization-modulating light valves. We show that phase-controlled coatings can collectively correct the contrast loss by exploiting a symmetry that arises when the coatings are operated in double-pass (due to the use of reflective light valves). In lowest order, this symmetry causes any ellipticity that the coatings may introduce in the polarization of illuminating skew rays to cancel in the return pass from the light valve back through the optics. Even beyond this first-order reversibility result, we have shown elsewhere that, for NA less than about 0.2, the computation involved in calculating beam contrast can be reduced to the equivalent of tracing a single ray. We show here that the Fedorov-Imbert shift can be derived in a straightforward way using this formalism. Even a non-polarizing system will show vector effects when the numerical aperture is sufficiently high, as in photolithographic lenses. Wavefront quality in these deep-UV lenses is of order λ/100, and simulations to account for the complexities of the image transfer steps during IC manufacture must be accurate to better than a part in 10² or 10³; hence small polarization distortions in the superposed image rays become very significant. An interesting source of such distortions is spatial dispersion in CaF2 lens elements, which gives rise to intrinsic birefringence at the ppm level. Polarization ray tracing must then contend with the phenomenon of double refraction, wherein a given ray splits into two rays each time it passes through an element, giving rise in principle to an exponentially extended family of rays in the exit pupil. However, we show that it is possible to merge each coherent family of rays into a single plane-wave component of the image (joint work with colleagues at Carl Zeiss SMT). Generalizing beyond the analysis of birefringence, such a plane-wave component can be identified with the particular subset of rays that are converged through a common pupil point and transferred to the image after diffracting from the object points within an isoplanatic patch. Thin-film amplitude transfer coefficients implicitly take into account the prismatic change in beam width that occurs when such a ray bundle refracts through a lens surface, but these coefficients do not include the focusing effect arising from power in the surfaces; hence polarization ray tracing by sequential application of thin-film transfer coefficients does not by itself provide the correct amplitude distribution over the pupil.

  15. Cosmic-ray tracing

    NASA Astrophysics Data System (ADS)

    Becker Tjus, Julia

    2018-04-01

    Active galactic nuclei are firm favourites to be revealed as the source of cosmic rays, but solid evidence has proven elusive. A model taking both local and global nuclei propagation into account may help to close the deal.

  16. The Birth of Elementary-Particle Physics.

    ERIC Educational Resources Information Center

    Brown, Laurie M.; Hoddeson, Lillian

    1982-01-01

    Traces the origin and development of particle physics, concentrating on the roles of cosmic rays and theory. Includes charts highlighting significant events in the development of cosmic-ray physics and quantum field theory. (SK)

  17. Adaptive protection algorithm and system

    DOEpatents

    Hedrick, Paul [Pittsburgh, PA; Toms, Helen L [Irwin, PA; Miller, Roger M [Mars, PA

    2009-04-28

    An adaptive protection algorithm and system for protecting electrical distribution systems traces the flow of power through a distribution system, assigns a value (or rank) to each circuit breaker in the system and then determines the appropriate trip set points based on the assigned rank.
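
    A toy rendering of that idea (the topology, ranks, and set points below are all hypothetical, not from the patent):

        from collections import deque

        # Feeder tree from the source breaker; each breaker's rank is its depth.
        feeder = {"main": ["feeder_a", "feeder_b"],
                  "feeder_a": ["branch_a1"], "feeder_b": [], "branch_a1": []}

        rank, queue = {"main": 0}, deque(["main"])
        while queue:
            b = queue.popleft()
            for child in feeder[b]:
                rank[child] = rank[b] + 1
                queue.append(child)

        base_trip = 1200.0   # amperes at the source breaker (made up)
        trip = {b: base_trip / (2 ** r) for b, r in rank.items()}
        print(trip)   # downstream breakers get lower set points, so they trip first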

  18. A Computational Framework for High-Throughput Isotopic Natural Abundance Correction of Omics-Level Ultra-High Resolution FT-MS Datasets

    PubMed Central

    Carreer, William J.; Flight, Robert M.; Moseley, Hunter N. B.

    2013-01-01

    New metabolomics applications of ultra-high resolution and accuracy mass spectrometry can provide thousands of detectable isotopologues, with the number of potentially detectable isotopologues increasing exponentially with the number of stable isotopes used in newer isotope tracing methods like stable isotope-resolved metabolomics (SIRM) experiments. This huge increase in usable data requires software capable of correcting the large number of isotopologue peaks resulting from SIRM experiments in a timely manner. We describe the design of a new algorithm and software system capable of handling these high volumes of data, while including quality control methods for maintaining data quality. We validate this new algorithm against a previous single isotope correction algorithm in a two-step cross-validation. Next, we demonstrate the algorithm and correct for the effects of natural abundance for both 13C and 15N isotopes on a set of raw isotopologue intensities of UDP-N-acetyl-D-glucosamine derived from a 13C/15N-tracing experiment. Finally, we demonstrate the algorithm on a full omics-level dataset. PMID:24404440
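
    The single-isotope core of such a correction is compact: build the lower-triangular matrix of binomial natural-abundance probabilities and solve it against the observed intensities. The molecule size, abundance value, and intensities below are illustrative only.

        import numpy as np
        from math import comb

        def correction_matrix(n_atoms, p=0.0107):
            """C[i, j]: probability that a species with j labeled atoms is
            observed as isotopologue i, from natural 13C (abundance p) in
            the n_atoms - j unlabeled positions."""
            C = np.zeros((n_atoms + 1, n_atoms + 1))
            for j in range(n_atoms + 1):
                for i in range(j, n_atoms + 1):
                    k = i - j
                    C[i, j] = comb(n_atoms - j, k) * p**k * (1 - p)**(n_atoms - j - k)
            return C

        observed = np.array([80.0, 12.0, 40.0, 5.0, 1.0, 0.2, 0.0])  # hypothetical
        corrected = np.linalg.solve(correction_matrix(6), observed)
        print(corrected.round(2))

    Multi-isotope correction (e.g., simultaneous 13C/15N) generalizes this to products of such per-element matrices, which is where the computational burden the paper addresses comes from.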

  19. Simulation of an active underwater imaging through a wavy sea surface

    NASA Astrophysics Data System (ADS)

    Gholami, Ali; Saghafifar, Hossein

    2018-06-01

    A numerical simulation of underwater imaging through a wavy sea surface has been performed. We use a common approach to model the sea surface elevation and its slopes as an important source of image disturbance. The simulation algorithm is based on a combination of ray tracing and optical propagation, which takes two different approaches for the downwelling and upwelling beams. The randomly focusing and defocusing nature of surface waves causes a fluctuating irradiance distribution as the illuminating source of the immersed object, while it gives rise to a great disturbance in the image through a coordinate change of image pixels. We have also used a modulation transfer function based on Wells' small-angle approximation to account for the effect of the underwater optical properties on the transfer of the image. As expected, absorption reduces the light intensity and scattering decreases image contrast by blurring the image.
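
    The surface interaction at the heart of the downwelling step is just the vector form of Snell's law applied to each wave facet; the sketch below refracts one ray at a tilted facet (the slope value and indices are illustrative).

        import numpy as np

        def refract(d, n_hat, n1, n2):
            """Refract unit ray d at a surface with unit normal n_hat
            (pointing back into medium 1); returns None on total internal
            reflection."""
            cos_i = -np.dot(d, n_hat)
            r = n1 / n2
            k = 1.0 - r * r * (1.0 - cos_i * cos_i)
            if k < 0.0:
                return None
            return r * d + (r * cos_i - np.sqrt(k)) * n_hat

        d = np.array([0.0, 0.0, -1.0])                # downwelling ray
        slope_x = 0.1                                 # local wave slope (made up)
        n_hat = np.array([-slope_x, 0.0, 1.0]) / np.hypot(slope_x, 1.0)
        print(refract(d, n_hat, 1.0, 1.34))           # air -> water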

  20. Atomic detail visualization of photosynthetic membranes with GPU-accelerated ray tracing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stone, John E.; Sener, Melih; Vandivort, Kirby L.

    The cellular process responsible for providing energy for most life on Earth, namely, photosynthetic light-harvesting, requires the cooperation of hundreds of proteins across an organelle, involving length and time scales spanning several orders of magnitude over quantum and classical regimes. Simulation and visualization of this fundamental energy conversion process pose many unique methodological and computational challenges. In this paper, we present, in two accompanying movies, light-harvesting in the photosynthetic apparatus found in purple bacteria, the so-called chromatophore. The movies are the culmination of three decades of modeling efforts, featuring the collaboration of theoretical, experimental, and computational scientists. Finally, we describe the techniques that were used to build, simulate, analyze, and visualize the structures shown in the movies, and we highlight cases where scientific needs spurred the development of new parallel algorithms that efficiently harness GPU accelerators and petascale computers.

  3. Design of a novel freeform lens for LED uniform illumination and conformal phosphor coating.

    PubMed

    Hu, Run; Luo, Xiaobing; Zheng, Huai; Qin, Zong; Gan, Zhiqiang; Wu, Bulong; Liu, Sheng

    2012-06-18

    A conformal phosphor coating can realize a phosphor layer with uniform thickness, which can enhance the angular color uniformity (ACU) of light-emitting diode (LED) packaging. In this study, a novel freeform lens was designed for the simultaneous realization of LED uniform illumination and conformal phosphor coating. The detailed algorithm of the design method, which involves an extended light source and double refractions, is presented, along with the packaging configuration of the LED modules and the modeling of the light-conversion process. Monte Carlo ray-tracing simulations were conducted to validate the design method by comparison with a conventional freeform lens. It is demonstrated that for the LED module with the present freeform lens, the illumination uniformity and ACU were 0.89 and 0.9283, respectively. The present freeform lens realizes equivalent illumination uniformity, while the angular color uniformity is enhanced by 282.3% when compared with the conventional freeform lens.

  4. Modeling of Pixelated Detector in SPECT Pinhole Reconstruction.

    PubMed

    Feng, Bing; Zeng, Gengsheng L

    2014-04-10

    A challenge for the pixelated detector is that the detector response to a gamma-ray photon varies with the incident angle and the incident location within a crystal. The normalization map obtained by measuring the flood of a point-source at a large distance can lead to artifacts in reconstructed images. In this work, we investigated a method of generating normalization maps by ray-tracing through the pixelated detector based on the imaging geometry and the photo-peak energy of the specific isotope. The normalization is defined for each pinhole as the normalized detector response for a point-source placed at the focal point of the pinhole. Ray-tracing is used to generate the ideal flood image for a point-source. Each crystal pitch area on the back of the detector is divided into 60 × 60 sub-pixels. Lines are obtained by connecting a point-source to the centers of the sub-pixels inside each crystal pitch area. For each line, ray-tracing starts from the entrance point at the detector face and ends at the center of a sub-pixel on the back of the detector. Only the attenuation by NaI(Tl) crystals along each ray is assumed to contribute directly to the flood image. The attenuation by the silica (SiO2) reflector is also included in the ray-tracing. To calculate the normalization for a pinhole, we need to calculate the ideal flood for a point-source at 360 mm distance (where the point-source was placed for the regular flood measurement) and the ideal flood image for the point-source at the pinhole focal point, together with the flood measurement at 360 mm distance. The normalizations are incorporated in the iterative OSEM reconstruction as a component of the projection matrix. Applications to single-pinhole and multi-pinhole imaging showed that this method greatly reduced the reconstruction artifacts.
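
    The attenuation kernel of the ideal-flood calculation reduces, per sub-pixel ray, to a chord length through the crystal. A stripped-down version (the attenuation coefficient and geometry are placeholders, and the SiO2 reflector term is omitted):

        import numpy as np

        MU_NAI = 2.2   # 1/cm, hypothetical linear attenuation at the photopeak

        def detected_fraction(entry, exit_pt, mu=MU_NAI):
            """Photopeak detection probability 1 - exp(-mu * L) for a ray
            with chord length L inside the NaI(Tl) crystal."""
            L = np.linalg.norm(np.asarray(exit_pt, float) - np.asarray(entry, float))
            return 1.0 - np.exp(-mu * L)

        # Normal-incidence vs. oblique ray through a 1 cm thick crystal:
        print(detected_fraction((0, 0, 0), (0.0, 0, 1.0)))
        print(detected_fraction((0, 0, 0), (0.6, 0, 1.0)))

    Summing such fractions over the 60 × 60 sub-pixel rays of each crystal pitch reproduces the angle-dependent flood response that the normalization map is meant to capture.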

  5. XCAT/DRASIM: a realistic CT/human-model simulation package

    NASA Astrophysics Data System (ADS)

    Fung, George S. K.; Stierstorfer, Karl; Segars, W. Paul; Taguchi, Katsuyuki; Flohr, Thomas G.; Tsui, Benjamin M. W.

    2011-03-01

    The aim of this research is to develop a complete CT/human-model simulation package by integrating the 4D eXtended CArdiac-Torso (XCAT) phantom, a computer-generated, NURBS-surface-based phantom that provides a realistic model of human anatomy and of respiratory and cardiac motions, with the DRASIM (Siemens Healthcare) CT-data simulation program. Unlike other CT simulation tools, which are based on simple mathematical primitives or voxelized phantoms, this new simulation package has the advantages of utilizing a realistic model of human anatomy and physiological motions without voxelization and with accurate modeling of the characteristics of clinical Siemens CT systems. First, we incorporated the 4D XCAT anatomy and motion models into DRASIM by implementing a new library which consists of functions to read in the NURBS surfaces of anatomical objects and their overlapping order and material properties in the XCAT phantom. Second, we incorporated an efficient ray-tracing algorithm for line integral calculation in DRASIM by computing the intersection points of the rays cast from the x-ray source to the detector elements through the NURBS surfaces of the multiple XCAT anatomical objects along the ray paths. Third, we evaluated the integrated simulation package by performing a number of sample simulations of multiple x-ray projections from different views followed by image reconstruction. The initial simulation results were found to be promising by qualitative evaluation. In conclusion, we have developed a unique CT/human-model simulation package which has great potential as a tool in the design and optimization of CT scanners, and in the development of scanning protocols and image reconstruction methods for improving CT image quality and reducing radiation dose.
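
    The line-integral step reduces to bookkeeping over ray-surface intersection points: take the entry/exit parameters of each object along the ray and accumulate mu-weighted interval lengths. A minimal sketch with hypothetical intervals (the real package obtains them from NURBS intersections and resolves overlap order, which is omitted here):

        # Each object contributes (t_enter, t_exit, mu) along one ray;
        # overlap-order resolution between nested objects is omitted.
        def line_integral(intervals):
            return sum(mu * (t1 - t0) for t0, t1, mu in intervals)

        organs = [(0.0, 12.0, 0.02), (3.0, 5.0, 0.04)]   # cm, 1/cm (made up)
        print(line_integral(organs))   # attenuation path integral for the ray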

  6. Lost in Virtual Reality: Pathfinding Algorithms Detect Rock Fractures and Contacts in Point Clouds

    NASA Astrophysics Data System (ADS)

    Thiele, S.; Grose, L.; Micklethwaite, S.

    2016-12-01

    UAV-based photogrammetric and LiDAR techniques provide high resolution 3D point clouds and ortho-rectified photomontages that can capture surface geology in outstanding detail over wide areas. Automated and semi-automated methods are vital to extract full value from these data in practical time periods, though the nuances of geological structures and materials (natural variability in colour and geometry, soft and hard linkage, shadows and multiscale properties) make this a challenging task. We present a novel method for computer assisted trace detection in dense point clouds, using a lowest cost path solver to "follow" fracture traces and lithological contacts between user defined end points. This is achieved by defining a local neighbourhood network where each point in the cloud is linked to its neighbours, and then using a least-cost path algorithm to search this network and estimate the trace of the fracture or contact. A variety of different algorithms can then be applied to calculate the best fit plane, produce a fracture network, or map properties such as roughness, curvature and fracture intensity. Our prototype of this method (Fig. 1) suggests the technique is feasible and remarkably good at following traces under non-optimal conditions such as variable-shadow, partial occlusion and complex fracturing. Furthermore, if a fracture is initially mapped incorrectly, the user can easily provide further guidance by defining intermediate waypoints. Future development will include optimization of the algorithm to perform well on large point clouds and modifications that permit the detection of features such as step-overs. We also plan on implementing this approach in an interactive graphical user environment.
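
    A compact stand-in for the core machinery (k-nearest-neighbour graph plus shortest path) can be built from SciPy; a real cost function would also penalize colour and normal changes along candidate edges, which this sketch omits.

        import numpy as np
        from scipy.spatial import cKDTree
        from scipy.sparse import csr_matrix
        from scipy.sparse.csgraph import dijkstra

        def trace_path(points, start, end, k=8):
            """Least-cost path between two picked points of a point cloud."""
            tree = cKDTree(points)
            dist, idx = tree.query(points, k=k + 1)   # column 0 is the point itself
            rows = np.repeat(np.arange(len(points)), k)
            graph = csr_matrix((dist[:, 1:].ravel(), (rows, idx[:, 1:].ravel())),
                               shape=(len(points), len(points)))
            _, pred = dijkstra(graph, directed=False, indices=start,
                               return_predecessors=True)
            path, node = [end], pred[end]
            while node not in (-9999, start):         # -9999: no predecessor
                path.append(node)
                node = pred[node]
            return [start] + path[::-1]

        pts = np.random.default_rng(2).random((500, 3))
        print(trace_path(pts, 0, 499)[:10])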

  7. Massively parallel algorithms for trace-driven cache simulations

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Greenberg, Albert G.; Lubachevsky, Boris D.

    1991-01-01

    Trace-driven cache simulation is central to computer design. A trace is a very long sequence of reference lines from main memory. At the t-th instant, reference x_t is hashed into a set of cache locations, the contents of which are then compared with x_t. If at the t-th instant x_t is not present in the cache, then it is said to be a miss, and is loaded into the cache set, possibly forcing the replacement of some other memory line, and making x_t present for the (t+1)-st instant. The problem of parallel simulation of a subtrace of N references directed to a C-line cache set is considered, with the aim of determining which references are misses and related statistics. A simulation method is presented for the Least Recently Used (LRU) policy which, regardless of the set size C, runs in time O(log N) using N processors on the exclusive-read, exclusive-write (EREW) parallel model. A simpler LRU simulation algorithm is given that runs in O(C log N) time using N/log N processors. Timings are presented of the second algorithm's implementation on the MasPar MP-1, a machine with 16384 processors. A broad class of reference-based line replacement policies is considered, which includes LRU as well as the Least Frequently Used and Random replacement policies. A simulation method is presented for any such policy that, on any trace of length N directed to a C-line set, runs in O(C log N) time with high probability using N processors on the EREW model. The algorithms are simple, have very little space overhead, and are well suited for SIMD implementation.
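
    For orientation, the sequential reference computation that the parallel algorithms accelerate looks like this (a plain LRU set; policies like LFU swap only the eviction rule):

        from collections import OrderedDict

        def lru_outcomes(trace, set_size):
            """Hit/miss outcome of each reference for one C-line LRU set."""
            cache = OrderedDict()           # least- to most-recently used
            outcome = []
            for x in trace:
                hit = x in cache
                if hit:
                    cache.move_to_end(x)    # refresh recency
                else:
                    if len(cache) == set_size:
                        cache.popitem(last=False)   # evict the LRU line
                    cache[x] = True
                outcome.append(hit)
            return outcome

        print(lru_outcomes([1, 2, 3, 1, 4, 2], set_size=3))
        # -> [False, False, False, True, False, False]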

  8. Combining ray tracing and CFD in the thermal analysis of a parabolic dish tubular cavity receiver

    NASA Astrophysics Data System (ADS)

    Craig, Ken J.; Marsberg, Justin; Meyer, Josua P.

    2016-05-01

    This paper describes the numerical evaluation of a tubular receiver used in a dish Brayton cycle. In previous work considering the use of Computational Fluid Dynamics (CFD) to perform the calculation of the absorbed radiation from the parabolic dish into the cavity as well as the resulting conjugate heat transfer, it was shown that an axi-symmetric model of the dish and receiver absorbing surfaces was useful in reducing the computational cost required for a full 3-D discrete ordinates solution, but concerns remained about its accuracy. To increase the accuracy, the Monte Carlo ray tracer SolTrace is used to perform the calculation of the absorbed radiation profile to be used in the conjugate heat transfer CFD simulation. The paper describes an approach for incorporating a complex geometry like a tubular receiver generated using CFD software into SolTrace. The results illustrate the variation of CFD mesh density that translates into the number of elements in SolTrace as well as the number of rays used in the Monte Carlo approach and their effect on obtaining a resolution-independent solution. The conjugate heat transfer CFD simulation illustrates the effect of applying the SolTrace surface heat flux profile solution as a volumetric heat source to heat up the air inside the tube. Heat losses due to convection and thermal re-radiation are also determined as a function of different tube absorptivities.

  9. Probing the local environment of the supernova remnant HESS J1731-347 with CO and CS observations

    NASA Astrophysics Data System (ADS)

    Maxted, N.; Burton, M.; Braiding, C.; Rowell, G.; Sano, H.; Voisin, F.; Capasso, M.; Pühlhofer, G.; Fukui, Y.

    2018-02-01

    The shell-type supernova remnant HESS J1731-347 emits TeV gamma-rays, and is a key object for the study of the cosmic-ray acceleration potential of supernova remnants. We use 0.5-1 arcmin Mopra CO/CS(1-0) data in conjunction with H I data to calculate column densities towards the HESS J1731-347 region. We trace gas within at least four Galactic arms, typically tracing total (atomic+molecular) line-of-sight H column densities of 2-3 × 10²² cm⁻². Assuming standard X-factor values and that most of the H I/CO emission seen towards HESS J1731-347 is on the near side of the Galaxy, X-ray absorption column densities are consistent with H I+CO-derived column densities foreground to, but not beyond, the Scutum-Crux Galactic arm, suggesting a kinematic distance of ˜3.2 kpc for HESS J1731-347. At this kinematic distance, we also find dense, infrared-dark gas traced by CS(1-0) emission coincident with the north of HESS J1731-347, the nearby H II region G353.43-0.37 and the nearby unidentified gamma-ray source HESS J1729-345. This dense gas lends weight to the idea that HESS J1729-345 and HESS J1731-347 are connected, perhaps via escaping cosmic rays.

  10. Impact of large-scale atmospheric refractive structures on optical wave propagation

    NASA Astrophysics Data System (ADS)

    Nunalee, Christopher G.; He, Ping; Basu, Sukanta; Vorontsov, Mikhail A.; Fiorino, Steven T.

    2014-10-01

    Conventional techniques used to model optical wave propagation through the Earth's atmosphere typically assume flow fields based on various empirical relationships. Unfortunately, these synthetic refractive index fields do not take into account the influence of transient macroscale and mesoscale (i.e., larger than turbulent microscale) atmospheric phenomena. Nevertheless, a number of atmospheric structures that are characterized by various spatial and temporal scales exist which have the potential to significantly impact refractive index fields, thereby resulting in dramatic impacts on optical wave propagation characteristics. In this paper, we analyze a subset of spatio-temporal dynamics found to strongly affect optical waves propagating through these atmospheric structures. Analysis of wave propagation was performed in the geometrical optics approximation using a standard ray tracing technique. Using a numerical weather prediction (NWP) approach, we simulate multiple realistic atmospheric events (e.g., island wakes, low-level jets, etc.), and estimate the associated refractivity fields prior to performing ray tracing simulations. By coupling NWP model output with ray tracing simulations, we demonstrate the ability to quantitatively assess the potential impacts of coherent atmospheric phenomena on optical ray propagation. Our results show a strong impact of the spatio-temporal characteristics of the refractive index field on optical ray trajectories. Such correlations validate the effectiveness of NWP models, as they offer a more comprehensive representation of atmospheric refractivity fields compared to conventional methods based on the assumption of horizontal homogeneity.
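
    In the geometrical-optics regime used here, rays obey d/ds (n dr/ds) = grad n. A bare-bones 2-D integrator over an analytic stand-in for an NWP-derived refractivity field (the profile and step sizes are illustrative):

        import numpy as np

        def trace_ray(n_func, grad_n, r0, t0, ds=10.0, steps=2000):
            """Euler integration of the ray equation for a unit tangent t:
            dt/ds = (grad n - (grad n . t) t) / n."""
            r, t = np.asarray(r0, float), np.asarray(t0, float)
            pts = [r.copy()]
            for _ in range(steps):
                g = grad_n(r)
                t = t + ds * (g - np.dot(g, t) * t) / n_func(r)
                t /= np.linalg.norm(t)
                r = r + ds * t
                pts.append(r.copy())
            return np.array(pts)

        # Hypothetical exponentially decaying refractivity, height z in metres:
        n = lambda r: 1.0 + 3e-4 * np.exp(-r[1] / 8000.0)
        gn = lambda r: np.array([0.0, -3e-4 / 8000.0 * np.exp(-r[1] / 8000.0)])
        path = trace_ray(n, gn, r0=(0.0, 100.0), t0=(1.0, 0.0))
        print(path[-1])     # end point after 20 km of propagation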

  11. Measurement techniques for trace metals in coal-plant effluents: A brief review

    NASA Technical Reports Server (NTRS)

    Singh, J. J.

    1979-01-01

    The strengths and limitations of techniques for determining trace elements in aerosols emitted from coal plants are discussed. Techniques reviewed include atomic absorption spectroscopy, charged particle scattering and activation, instrumental neutron activation analysis, gas/liquid chromatography, gas chromatographic/mass spectrometric methods, X-ray fluorescence, and charged-particle-induced X-ray emission. The latter two methods are emphasized: they provide simultaneous, sensitive multielement analyses and lend themselves readily to depth profiling. It is recommended that, whenever feasible, two or more complementary techniques be used for analyzing environmental samples.

  12. Infrasonic ray tracing applied to mesoscale atmospheric structures: refraction by hurricanes.

    PubMed

    Bedard, Alfred J; Jones, R Michael

    2013-11-01

    A ray-tracing program is used to estimate the refraction of infrasound by the temperature structure of the atmosphere and by hurricanes represented by a Rankine-combined vortex wind plus a temperature perturbation. Refraction by the hurricane winds is significant, giving rise to regions of focusing, defocusing, and virtual sources. The refraction of infrasound by the temperature anomaly associated with a hurricane is small, probably no larger than that from uncertainties in the wind field. The results are pertinent to interpreting ocean wave generated infrasound in the vicinities of tropical cyclones.

  13. Quantitative Electron Probe Microanalysis: State of the Art

    NASA Technical Reports Server (NTRS)

    Carpernter, P. K.

    2005-01-01

    Quantitative electron-probe microanalysis (EPMA) has improved due to better instrument design and X-ray correction methods. Design improvement of the electron column and X-ray spectrometer has resulted in measurement precision that exceeds analytical accuracy. Wavelength-dispersive spectrometers (WDS) have layered-dispersive diffraction crystals with improved light-element sensitivity. Newer energy-dispersive spectrometers (EDS) have Si-drift detector elements, thin window designs, and digital processing electronics with X-ray throughput approaching that of WDS systems. Using these systems, digital X-ray mapping coupled with spectrum imaging is a powerful compositional mapping tool. Improvements in analytical accuracy are due to better X-ray correction algorithms, mass absorption coefficient data sets, and analysis methods for complex geometries. ZAF algorithms have been superseded by phi(rho-z) algorithms that better model the depth distribution of primary X-ray production. Complex thin-film and particle geometries are treated using phi(rho-z) algorithms, and results agree well with Monte Carlo simulations. For geological materials, X-ray absorption dominates the corrections and depends on the accuracy of mass absorption coefficient (MAC) data sets. However, few MACs have been experimentally measured, and the use of fitted coefficients continues due to the general success of the analytical technique. A polynomial formulation of the Bence-Albee alpha-factor technique, calibrated using phi(rho-z) algorithms, is used to critically evaluate accuracy issues; accuracy approaches 2% relative and is limited by measurement precision for ideal cases, but for many elements the analytical accuracy is unproven. The EPMA technique has improved to the point where it is frequently used instead of the petrographic microscope for reconnaissance work. Examples of stagnant research areas are WDS detector design, characterization of calibration standards, and the need for a more complete treatment of the continuum X-ray fluorescence correction.

  14. Automated segmentation and feature extraction of product inspection items

    NASA Astrophysics Data System (ADS)

    Talukder, Ashit; Casasent, David P.

    1997-03-01

    X-ray film and linescan images of pistachio nuts on conveyor trays for product inspection are considered. The final objective is the categorization of pistachios into good, blemished and infested nuts. A crucial step before classification is the separation of touching products and the extraction of features essential for classification. This paper addresses new detection and segmentation algorithms to isolate touching or overlapping items. These algorithms employ a new filter, a new watershed algorithm, and morphological processing to produce nutmeat-only images. Tests on a large database of x-ray film and real-time x-ray linescan images of around 2900 small, medium and large nuts showed excellent segmentation results. A new technique to detect and segment dark regions in nutmeat images is also presented and tested on approximately 300 x-ray film and approximately 300 real-time linescan x-ray images with 95-97 percent detection and correct segmentation. New algorithms are described that determine nutmeat fill ratio and locate splits in nutmeat. The techniques formulated in this paper are of general use in many different product inspection and computer vision problems.

  15. Validation of Ionosonde Electron Density Reconstruction Algorithms with IONOLAB-RAY in Central Europe

    NASA Astrophysics Data System (ADS)

    Gok, Gokhan; Mosna, Zbysek; Arikan, Feza; Arikan, Orhan; Erdem, Esra

    2016-07-01

    Ionospheric observation is essentially accomplished by specialized radar systems called ionosondes. The time delay between the transmitted and received signals versus frequency is measured by the ionosondes, and the received signals are processed to generate ionogram plots, which show the time delay or reflection height of signals with respect to the transmitted frequency. The critical frequencies of ionospheric layers and the virtual heights, which provide useful information about ionospheric structure, can be extracted from ionograms. Ionograms also indicate the amount of variability or disturbance in the ionosphere. With special inversion algorithms and tomographic methods, electron density profiles can also be estimated from ionograms. Although structural pictures of the ionosphere in the vertical direction can be observed from ionosonde measurements, some errors may arise due to inaccuracies in signal propagation modeling, data processing and tomographic reconstruction algorithms. Recently the IONOLAB group (www.ionolab.org) developed a new algorithm for effective and accurate extraction of ionospheric parameters and reconstruction of electron density profiles from ionograms. The electron density reconstruction algorithm applies advanced optimization techniques to calculate the parameters of any existing analytical function which defines electron density with respect to height, using ionogram measurement data. The process of reconstructing electron density with respect to height is known as ionogram scaling or true height analysis. The IONOLAB-RAY algorithm is a tool to investigate the propagation path and parameters of HF waves in the ionosphere. The algorithm models the wave propagation using ray representation under the geometrical optics approximation. In the algorithm, the structural ionospheric characteristics are represented as realistically as possible, including anisotropy, inhomogeneity and time dependence, in a 3-D voxel structure. The algorithm is also used for various purposes including calculation of actual height and generation of ionograms. In this study, the performance of the electron density reconstruction algorithm of the IONOLAB group and the standard electron density profile algorithms of ionosondes are compared with IONOLAB-RAY wave propagation simulation at near-vertical incidence. The electron density reconstruction and parameter extraction algorithms of ionosondes are validated against the IONOLAB-RAY results both for quiet and disturbed ionospheric states in Central Europe, using ionosonde stations such as Pruhonice and Juliusruh. It is observed that the IONOLAB ionosonde parameter extraction and electron density reconstruction algorithm performs significantly better compared to standard algorithms, especially for disturbed ionospheric conditions. IONOLAB-RAY provides an efficient and reliable tool to investigate and validate ionosonde electron density reconstruction algorithms, especially in the determination of the reflection height (true height) of signals and critical parameters of the ionosphere. This study is supported by TUBITAK 114E541, 115E915 and joint TUBITAK 114E092 and AS CR 14/001 projects.

  16. Applications of Quantum Cascade Laser Scanners for Remote Detection of Chemical and Biological Threats and Weapons of Mass Destruction

    DTIC Science & Technology

    2014-07-09

    Rivera, "Highly Sensitive Filter Paper Substrate for SERS Trace Explosives Detection," International Journal of Spectroscopy, (09 2012). doi: 10.1155... "Highly Sensitive Filter Paper Substrate for SERS Field Detection of Trace Threat Chemicals," PITTCON-2013: Forensic Analysis in the Lab and Crime Scene... the surface. In addition, built-in algorithms were used for nearly real-time sample detection. Trace and bulk concentrations of the other substances...

  17. Modified Hyperspheres Algorithm to Trace Homotopy Curves of Nonlinear Circuits Composed by Piecewise Linear Modelled Devices

    PubMed Central

    Vazquez-Leal, H.; Jimenez-Fernandez, V. M.; Benhammouda, B.; Filobello-Nino, U.; Sarmiento-Reyes, A.; Ramirez-Pinero, A.; Marin-Hernandez, A.; Huerta-Chua, J.

    2014-01-01

    We present a homotopy continuation method (HCM) for finding multiple operating points of nonlinear circuits composed of devices modelled by using piecewise linear (PWL) representations. We propose an adaptation of the modified spheres path tracking algorithm to trace the homotopy trajectories of PWL circuits. In order to assess the benefits of this proposal, four nonlinear circuits composed of piecewise linear modelled devices are analysed to determine their multiple operating points. The results show that HCM can find multiple solutions within a single homotopy trajectory. Furthermore, we take advantage of the fact that homotopy trajectories are PWL curves to replace the multidimensional interpolation and fine-tuning stages of the path tracking algorithm with a simple and highly accurate procedure based on the parametric straight-line equation. PMID:25184157
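
    Schematically, such methods embed the circuit equations f(x) = 0 in a one-parameter family, for instance the Newton homotopy

        H(x, \lambda) = \lambda f(x) + (1 - \lambda)(x - x_0),

    traced from the trivial solution (x_0, \lambda = 0); every crossing of the \lambda = 1 hyperplane is an operating point. For PWL-modelled devices the trajectory itself is piecewise linear, so each segment can be written parametrically as x(t) = x_a + t (x_b - x_a), which is the straight-line shortcut the authors exploit. (This is the generic form; the exact homotopy used in the paper may differ.)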

  18. There Are (super)Giants in the Sky: Searching for Misidentified Massive Stars in Algorithmically-Selected Quasar Catalogs

    NASA Astrophysics Data System (ADS)

    Dorn-Wallenstein, Trevor Z.; Levesque, Emily

    2017-11-01

    Thanks to incredible advances in instrumentation, surveys like the Sloan Digital Sky Survey have been able to find and catalog billions of objects, ranging from local M dwarfs to distant quasars. Machine learning algorithms have greatly aided in the effort to classify these objects; however, there are regimes where these algorithms fail, where interesting oddities may be found. We present here an X-ray bright quasar misidentified as a red supergiant/X-ray binary, and a subsequent search of the SDSS quasar catalog for X-ray bright stars misidentified as quasars.

  19. Determination of trace metals in spirits by total reflection X-ray fluorescence spectrometry

    NASA Astrophysics Data System (ADS)

    Siviero, G.; Cinosi, A.; Monticelli, D.; Seralessandri, L.

    2018-06-01

    Eight spirituous samples were analyzed for trace metal content with the Horizon Total Reflection X-Ray Fluorescence (TXRF) Spectrometer. The expected single-metal amount is at the ng/g level in a mixed aqueous/organic matrix, thus requiring a sample preparation method capable of achieving suitable limits of detection. On-site enrichment and Atmospheric Pressure-Vapor Phase Decomposition allowed the detection of Cr, Mn, Fe, Co, Ni, Cu, Zn, As, Sr and Pb with detection limits ranging from 0.1 ng/g to 4.6 ng/g. These results highlight how the synergy between instrument and sample preparation strategy may foster the use of TXRF as a fast and reliable technique for the determination of trace elements in spirituous samples, for either quality control or risk assessment purposes.

  20. Analytical approximations to the Hotelling trace for digital x-ray detectors

    NASA Astrophysics Data System (ADS)

    Clarkson, Eric; Pineda, Angel R.; Barrett, Harrison H.

    2001-06-01

    The Hotelling trace is the signal-to-noise ratio for the ideal linear observer in a detection task. We provide an analytical approximation for this figure of merit when the signal is known exactly, the background is generated by a stationary random process, and the imaging system is an ideal digital x-ray detector. This approximation is based on assuming that the detector is infinite in extent. We test this approximation for finite-size detectors by comparing it to exact calculations using matrix inversion of the data covariance matrix. After verifying the validity of the approximation under a variety of circumstances, we use it to generate plots of the Hotelling trace as a function of pairs of parameters of the system, the signal and the background.
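
    For reference, the exact calculation the approximation is tested against reduces to a quadratic form in the inverse data covariance matrix, SNR^2 = s^T K^{-1} s; a minimal sketch with a hypothetical exponentially correlated background:

```python
import numpy as np

def hotelling_snr(signal, cov):
    """Ideal linear observer SNR for a signal-known-exactly task:
    SNR^2 = s^T K^{-1} s, with K the data covariance matrix."""
    template = np.linalg.solve(cov, signal)       # Hotelling template K^{-1} s
    return float(np.sqrt(signal @ template))

# toy 1-D detector: stationary (Toeplitz) background plus white noise
n = 64
lag = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
cov = np.exp(-lag / 5.0) + 1e-3 * np.eye(n)
signal = np.exp(-0.5 * ((np.arange(n) - n / 2) / 2.0) ** 2)  # small Gaussian blob
print(hotelling_snr(signal, cov))
```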

  1. Trace Norm Regularized CANDECOMP/PARAFAC Decomposition With Missing Data.

    PubMed

    Liu, Yuanyuan; Shang, Fanhua; Jiao, Licheng; Cheng, James; Cheng, Hong

    2015-11-01

    In recent years, low-rank tensor completion (LRTC) problems have received a significant amount of attention in computer vision, data mining, and signal processing. The existing trace norm minimization algorithms for iteratively solving LRTC problems involve multiple singular value decompositions of very large matrices at each iteration. Therefore, they suffer from high computational cost. In this paper, we propose a novel trace norm regularized CANDECOMP/PARAFAC decomposition (TNCP) method for simultaneous tensor decomposition and completion. We first formulate a factor matrix rank minimization model by deducing the relation between the rank of each factor matrix and the mode-n rank of a tensor. Then, we introduce a tractable relaxation of our rank function and thereby obtain a convex problem involving much smaller-scale matrix trace norm minimization. Finally, we develop an efficient algorithm based on the alternating direction method of multipliers to solve our problem. The promising experimental results on synthetic and real-world data validate the effectiveness of our TNCP method. Moreover, TNCP is significantly faster than the state-of-the-art methods and scales to larger problems.
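
    The basic building block of such trace norm minimization is singular value soft-thresholding (the proximal operator of the trace norm), which TNCP applies to the much smaller factor matrices; a generic sketch:

```python
import numpy as np

def trace_norm_prox(M, tau):
    """Proximal operator of tau*||.||_*: soft-threshold the singular values.
    This is the subproblem solved repeatedly inside ADMM-style solvers;
    TNCP's gain is applying it to small factor matrices, not huge unfoldings."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

X = trace_norm_prox(np.random.default_rng(0).normal(size=(6, 4)), tau=0.8)
```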

  2. X-radiography of trace fossils in limestones and dolostones from the Jurassic Smackover Formation, south Alabama

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Esposito, R.A.; Castleman, S.P.; King, D.T. Jr.

    X-radiography has been useful in studying biogenic sedimentary structures in unconsolidated sediments, but the technique has not often been applied to the study of hard carbonate rock. The authors have applied x-radiography to the study of the lower part of the Smackover to enhance the complete petrologic description of the rock. The lower Smackover has many dense micrite intervals and intervals of monotonous, thin graded beds. Parts of the lower Smackover are also dolomitized. None of the above rocks contains significant amounts of skeletal debris, and trace fossils are not generally obvious in an etched slab of core. In limestone, however, they have detected well-preserved trace fossils by x-radiography. The dolostones show no traces using this method. In limestones, the traces are marked by minute amounts of finely divided iron sulfides. This causes a slight density difference resulting in greater x-ray absorption. They recognize two main trace-fossil types: a Thalassinoides best seen in slabs cut parallel to bedding and a Zoophycos best seen in slabs cut perpendicular to bedding. The technique requires a slab cut 8 mm thick with parallel flat surfaces and a medical x-ray unit using accelerating voltages of 66 kV and 10 mAs. Traces are most successfully imaged on industrial-quality films.

  3. Use of portable X-ray fluorescence spectroscopy and geostatistics for health risk assessment.

    PubMed

    Yang, Meng; Wang, Cheng; Yang, Zhao-Ping; Yan, Nan; Li, Feng-Ying; Diao, Yi-Wei; Chen, Min-Dong; Li, Hui-Ming; Wang, Jin-Hua; Qian, Xin

    2018-05-30

    Laboratory analysis of trace metals using inductively coupled plasma (ICP) spectroscopy is not cost effective, and the complex spatial distribution of soil trace metals makes their spatial analysis and prediction problematic. Thus, for the health risk assessment of exposure to trace metals in soils, portable X-ray fluorescence (PXRF) spectroscopy was used to replace ICP spectroscopy for metal analysis, and robust geostatistical methods were used to identify spatial outliers in trace metal concentrations and to map trace metal distributions. A case study was carried out around an industrial area in Nanjing, China. The results showed that PXRF spectroscopy provided results for trace metal (Cu, Ni, Pb and Zn) levels comparable to ICP spectroscopy. The results of the health risk assessment showed that Ni posed a higher non-carcinogenic risk than Cu, Pb and Zn, indicating a higher priority of concern than the other elements. Sampling locations associated with adverse health effects were identified as 'hotspots', and high-risk areas were delineated from risk maps. These 'hotspots' and high-risk areas were in close proximity to and downwind from petrochemical plants, indicating the dominant role of industrial activities as the major sources of trace metals in soils. The approach used in this study could be adopted as a cost-effective methodology for screening 'hotspots' and priority areas of concern for cost-efficient health risk management. Copyright © 2018 Elsevier Inc. All rights reserved.

  4. X-Ray Radiography of Gas Turbine Ceramics.

    DTIC Science & Technology

    1979-10-20

    Microfocus X-ray equipment; the definition of equipment concepts for a computer assisted tomography (CAT) system; and the development of a CAT ...were obtained from these test coupons using Microfocus X-ray and image enhancement techniques. A Computer Assisted Tomography (CAT) design concept...monitor. Computer reconstruction algorithms were investigated with respect to CAT and a preferred approach was determined. An appropriate CAT algorithm

  5. Trace of totally positive algebraic integers and integer transfinite diameter

    NASA Astrophysics Data System (ADS)

    Flammang, V.

    2009-06-01

    Explicit auxiliary functions can be used in the "Schur-Siegel-Smyth trace problem". In previous works, these functions were constructed only with polynomials having all their roots positive. Here, we use several polynomials with complex roots, which are found with Wu's algorithm, and we improve the known lower bounds for the absolute trace of totally positive algebraic integers. This improvement has a consequence for the search for Salem numbers that have a negative trace. The same method also gives a small improvement of the upper bound for the integer transfinite diameter of [0,1].

  6. Viewer Makes Radioactivity "Visible"

    NASA Technical Reports Server (NTRS)

    Yin, L. I.

    1983-01-01

    A battery-operated viewer demonstrates the feasibility of generating three-dimensional visible-light simulations of objects that emit X-rays or gamma rays. Ray paths are traced for two pinhole positions to show the location of the reconstructed image. Images formed by the pinholes are converted to intensified visible-light images. Applications range from radioactive contamination surveys to monitoring radioisotope absorption in tumors.

  7. Reliability model generator specification

    NASA Technical Reports Server (NTRS)

    Cohen, Gerald C.; Mccann, Catherine

    1990-01-01

    The Reliability Model Generator (RMG), a program which produces reliability models from block diagrams for ASSIST, the interface for the reliability evaluation tool SURE, is described. An account is given of the motivation for RMG, and the implemented algorithms are discussed. The appendices contain the algorithms and two detailed traces of examples.

  8. RAY-UI: A powerful and extensible user interface for RAY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baumgärtel, P., E-mail: peter.baumgaertel@helmholtz-berlin.de; Erko, A.; Schäfers, F.

    2016-07-27

    The RAY-UI project started as a proof-of-concept for an interactive and graphical user interface (UI) for the well-known ray tracing software RAY [1]. In the meantime, it has evolved into a powerful enhanced version of RAY that will serve as the platform for future development and improvement of associated tools. The software as of today supports nearly all sophisticated simulation features of RAY. Furthermore, it delivers very significant usability and work efficiency improvements. Beamline elements can be quickly added or removed in the interactive sequence view. Parameters of any selected element can be accessed directly and in arbitrary order. With a single click, parameter changes can be tested and new simulation results can be obtained. All analysis results can be explored interactively right after ray tracing by means of powerful integrated image viewing and graphing tools. Unlimited image planes can be positioned anywhere in the beamline, and bundles of image planes can be created for moving the plane along the beam to identify the focus position with live updates of the simulated results. In addition to showing the features and workflow of RAY-UI, we will give an overview of the underlying software architecture as well as examples for use and an outlook for future developments.

  9. A computer program to trace seismic ray distribution in complex two-dimensional geological models

    USGS Publications Warehouse

    Yacoub, Nazieh K.; Scott, James H.

    1970-01-01

    A computer program has been developed to trace seismic rays and their amplitudes and energies through complex two-dimensional geological models, for which boundaries between elastic units are defined by a series of digitized X-, Y-coordinate values. Input data for the program include problem identification, control parameters, model coordinates and elastic parameters for the elastic units. The program evaluates the partitioning of ray amplitude and energy at elastic boundaries and computes the total travel time, total travel distance and other parameters for rays arriving at the earth's surface. Instructions are given for punching program control cards and data cards, and for arranging input card decks. An example of printer output for a simple problem is presented. The program is written in FORTRAN IV language. The listing of the program is shown in the Appendix, with an example output from a CDC-6600 computer.

  10. Surface scanning through a cylindrical tank of coupling fluid for clinical microwave breast imaging exams

    PubMed Central

    Pallone, Matthew J.; Meaney, Paul M.; Paulsen, Keith D.

    2012-01-01

    Purpose: Microwave tomographic image quality can be improved significantly with prior knowledge of the breast surface geometry. The authors have developed a novel laser scanning system, residing completely external to the tank (and the aqueous environment), capable of accurately recovering surface renderings of breast-shaped phantoms immersed within a cylindrical tank of coupling fluid; it overcomes the challenges associated with the optical distortions caused by refraction at the air, tank wall, and liquid bath interfaces. Methods: The scanner utilizes two laser line generators and a small CCD camera mounted concentrically on a gantry rotating about the microwave imaging tank. Various calibration methods were considered for optimizing the accuracy of the scanner in the presence of the optical distortions, including traditional ray tracing and image registration approaches. In this paper, the authors describe the construction and operation of the laser scanner, compare the efficacy of several calibration methods—including analytical ray tracing and piecewise linear, polynomial, locally weighted mean, and thin-plate-spline (TPS) image registrations—and report outcomes from preliminary phantom experiments. Results: The results show that errors in calibrating camera angles and position prevented analytical ray tracing from achieving submillimeter accuracy in the surface renderings obtained from our scanner configuration. Conversely, calibration by image registration reliably attained mean surface errors of less than 0.5 mm depending on the geometric complexity of the object scanned. While each of the image registration approaches outperformed the ray tracing strategy, the authors found global polynomial methods produced the best compromise between average surface error and scanner robustness. Conclusions: The laser scanning system provides a fast and accurate method of three-dimensional surface capture in the aqueous environment commonly found in microwave breast imaging. Optical distortions imposed by the imaging tank and coupling bath diminished the effectiveness of the ray tracing approach; however, calibration through image registration techniques reliably produced scans of submillimeter accuracy. Tests of the system with breast-shaped phantoms demonstrated the successful implementation of the scanner for the intended application. PMID:22755695
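
    As an illustration of calibration by image registration, SciPy's thin-plate-spline interpolator can learn the mapping from distorted to true coordinates from control points; the quadratic-plus-sinusoidal distortion below is a synthetic stand-in for the refraction effects described above:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(0)
distort = lambda p: p + 0.02 * p**2 / 100.0 + 0.5 * np.sin(p / 10.0)  # stand-in

# control points: where calibration features appear (distorted) vs. where they are
true_pts = rng.uniform(0.0, 100.0, size=(50, 2))     # e.g., mm on a target
observed = distort(true_pts)

# thin-plate-spline mapping from observed (distorted) back to true coordinates
tps = RBFInterpolator(observed, true_pts, kernel='thin_plate_spline')

probe = rng.uniform(0.0, 100.0, size=(10, 2))
corrected = tps(distort(probe))
print(np.abs(corrected - probe).max())               # small for a smooth distortion
```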

  11. Advanced ultrasonic techniques for nondestructive testing of austenitic and dissimilar welds in nuclear facilities

    NASA Astrophysics Data System (ADS)

    Juengert, Anne; Dugan, Sandra; Homann, Tobias; Mitzscherling, Steffen; Prager, Jens; Pudovikov, Sergey; Schwender, Thomas

    2018-04-01

    Austenitic stainless steel welds as well as dissimilar metal welds with nickel alloy filler material, used in safety-relevant parts of nuclear power plants, still challenge ultrasonic inspection. The weld material forms large oriented grains that lead, on the one hand, to high sound scattering and, on the other hand, to inhomogeneity and acoustic anisotropy of the weld structure. The ultrasonic wave fronts do not propagate linearly, as in ferritic weld joints, but along curves that depend on the specific grain structure of the weld. Due to these phenomena, it is difficult to analyze the inspection results and to classify the ultrasonic indications, which could originate both from the weld geometry and from material defects. Correct flaw sizing is not possible. In an ongoing research project, different techniques to improve the reliability of ultrasonic testing of these kinds of welds are investigated. In a first step (in the previous research project), two ultrasonic inspection techniques were developed and validated on plane test specimens with artificial and realistic flaws. In the ongoing project, these techniques are applied to circumferential pipe welds with longitudinal and transverse flaws. The technique developed at the Federal Institute for Materials Research and Testing (BAM) in Germany uses a combination of ray tracing and the synthetic aperture focusing technique (SAFT). To investigate the unknown grain structure, the velocity distribution of weld-transmitting ultrasound waves is measured and used to model the weld by ray tracing. The second technique, developed at the Fraunhofer Institute for Nondestructive Testing (IZFP) in Germany, uses Sampling Phased Array (Full Matrix Capture) combined with reverse phase matching (RPM) and the gradient elastic constant descent algorithm (GECDM). This inspection method is able to estimate the elastic constants of the columnar grains in the weld and improves the reliability of ultrasonic testing through the correction of the sound field distortion. The unknown inhomogeneity and anisotropy are investigated using a reference indication and a special optimization algorithm. Both reconstruction techniques give quantitative inspection results and allow defect sizing. They have been compared to conventional ultrasonic testing with techniques that are state of the art for components in nuclear power plants. The improvement will be quantified by comparison of the probability of detection (POD) of each technique.

  12. Ray-tracing analysis of intraocular lens power in situ.

    PubMed

    Olsen, Thomas; Funding, Mikkel

    2012-04-01

    To describe a method for back-solving the power of an intraocular lens (IOL) in situ based on laser biometry and ray-tracing analysis of the pseudophakic eye. University Eye Clinic, Aarhus Hospital, Aarhus, Denmark. Evaluation of diagnostic test or technology. This study comprised pseudophakic eyes with an IOL power ranging from -2.00 to +36.00 diopters (D). Preoperatively, the corneal radius was measured with conventional autokeratometry and the axial length (AL) with optical biometry. After surgery, the position of the IOL was recorded using laser interferometry. Based on the postoperative refraction and the biometric measurements, a ray-tracing analysis was performed to back-solve for the power of the IOL in situ. The analysis was performed assuming pupil diameters from 0.0 to 8.0 mm, with and without correction for the Stiles-Crawford effect. The study evaluated 767 pseudophakic eyes (583 patients). Assuming a 3.0 mm pupil, the mean prediction error between the labeled and the calculated IOL power was -0.26 ± 0.65 D (mean ± 1 standard deviation; range -2.4 to +1.8 D). The prediction error showed no bias with IOL power or with AL. The calculated IOL power depended on the assumed pupil size and the Stiles-Crawford effect. However, the latter had a modulatory effect on the prediction error for large pupil diameters (>5.0 mm) only. The optics of the pseudophakic eye can be accurately described using exact ray tracing and modern biometric techniques. Copyright © 2012 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.

  13. Analysis of elemental concentration censored distributions in breast malignant and breast benign neoplasm tissues

    NASA Astrophysics Data System (ADS)

    Kubala-Kukuś, A.; Banaś, D.; Braziewicz, J.; Góźdź, S.; Majewska, U.; Pajek, M.

    2007-07-01

    The total reflection X-ray fluorescence method was applied to study trace element concentrations in human breast malignant and breast benign neoplasm tissues taken from women who were patients of the Holycross Cancer Centre in Kielce (Poland). These investigations were mainly focused on the development of new possibilities for cancer diagnosis and therapy monitoring. This systematic comparative study was based on a relatively large population (~100 samples), namely 26 samples of breast malignant and 68 samples of breast benign neoplasm tissues. The concentrations, ranging from a few ppb to 0.1%, were determined for thirteen elements (from P to Pb). The results were carefully analysed to investigate the concentration distribution of trace elements in the studied samples. The measurement of trace element concentrations by total reflection X-ray fluorescence was limited, however, by the detection limit of the method. It was observed that for more than 50% of the elements determined, the concentrations were not measured in all samples. These incomplete measurements were treated within the statistical concept called left-random censoring, and for the estimation of the mean value and median of censored concentration distributions, the Kaplan-Meier estimator was used. For comparison of concentrations in two populations, the log-rank test was applied, which allows comparison of censored total reflection X-ray fluorescence data. The statistically significant differences found are discussed in more detail. It is noted that the described data analysis procedures should be the standard tool for analyzing censored concentrations of trace elements measured by X-ray fluorescence methods.

  14. SENSOR++: Simulation of Remote Sensing Systems from Visible to Thermal Infrared

    NASA Astrophysics Data System (ADS)

    Paproth, C.; Schlüßler, E.; Scherbaum, P.; Börner, A.

    2012-07-01

    During the development process of a remote sensing system, the optimization and verification of the sensor system are important tasks. To support these tasks, simulation of the sensor and its output is valuable. This enables developers to test algorithms, estimate errors, and evaluate the capabilities of the whole sensor system before the final remote sensing system is available and produces real data. The presented simulation concept, SENSOR++, consists of three parts. The first part is the geometric simulation, which calculates where the sensor looks by using a ray tracing algorithm; this also determines whether the observed part of the scene is shadowed or not. The second part describes the radiometry and results in the spectral at-sensor radiance from the visible spectrum to the thermal infrared, according to the simulated sensor type. In the case of earth remote sensing, it also includes a model of the radiative transfer through the atmosphere. The final part uses the at-sensor radiance to generate digital images by using an optical and an electronic sensor model. Using SENSOR++ for an optimization requires the additional application of task-specific data processing algorithms. The principle of the simulation approach is explained, all relevant concepts of SENSOR++ are discussed, and first examples of its use are given, for example a camera simulation for a moon lander. Finally, the verification of SENSOR++ is demonstrated.

  15. An automatic scaling method for obtaining the trace and parameters from oblique ionogram based on hybrid genetic algorithm

    NASA Astrophysics Data System (ADS)

    Song, Huan; Hu, Yaogai; Jiang, Chunhua; Zhou, Chen; Zhao, Zhengyu; Zou, Xianjian

    2016-12-01

    Scaling the oblique ionogram plays an important role in obtaining the ionospheric structure at the midpoint of an oblique sounding path. This paper proposes an automatic scaling method to extract the trace and parameters of the oblique ionogram based on a hybrid genetic algorithm (HGA). The 10 extracted parameters come from the F2 and Es layers, such as the maximum observed frequency, critical frequency, and virtual height. The method adopts the quasi-parabolic (QP) model to describe the F2 layer's electron density profile, which is used to synthesize the trace. It utilizes the secant theorem, Martyn's equivalent path theorem, image processing technology, and the characteristics of the echoes to determine the best-fit values of seven parameters and the initial values of the three QP-model parameters, which set up the search spaces needed as input by the HGA. The HGA then searches these spaces for the best-fit values of the three parameters based on the fitness between the synthesized trace and the real trace. In order to verify the performance of the method, 240 oblique ionograms were scaled and the results compared with manual scaling results and with the inversion results of the corresponding vertical ionograms. The comparison shows that the scaling results are accurate, or at least adequate, 60-90% of the time.
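
    A minimal sketch of the quasi-parabolic (QP) layer used to synthesize traces, with hypothetical F2-layer parameters (the full method also needs the secant and equivalent-path machinery described above):

```python
import numpy as np

def qp_electron_density(r, nm, rm, ym):
    """Quasi-parabolic layer: Ne(r) = Nm*(1 - ((r-rm)/ym)^2 * (rb/r)^2),
    with rb = rm - ym the layer base; zero below the base and where negative."""
    rb = rm - ym
    ne = nm * (1.0 - ((r - rm) / ym) ** 2 * (rb / r) ** 2)
    return np.where((r >= rb) & (ne > 0.0), ne, 0.0)

# hypothetical F2 layer: peak 1e12 m^-3 at 300 km, semi-thickness 100 km
r_earth = 6371.0  # km
h = np.linspace(150.0, 450.0, 7)
print(qp_electron_density(r_earth + h, 1e12, r_earth + 300.0, 100.0))
```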

  16. Root Gravitropism: Quantification, Challenges, and Solutions.

    PubMed

    Muller, Lukas; Bennett, Malcolm J; French, Andy; Wells, Darren M; Swarup, Ranjan

    2018-01-01

    Better understanding of root traits such as root angle and root gravitropism will be crucial for development of crops with improved resource use efficiency. This chapter describes a high-throughput, automated image analysis method to trace Arabidopsis (Arabidopsis thaliana) seedling roots grown on agar plates. The method combines a "particle-filtering algorithm with a graph-based method" to trace the center line of a root and can be adopted for the analysis of several root parameters such as length, curvature, and stimulus from original root traces.

  17. A Methodology and Software Environment for Testing Process Model’s Sequential Predictions with Protocols

    DTIC Science & Technology

    1992-12-21

    in preparation). Foundations of artificial intelligence. Cambridge, MA: MIT Press. O'Reilly, R. C. (1991). X3DNet: An X-Based Neural Network ...2.2.3 Trace based protocol analysis 19 2.2.4 Summary of important data features 21 2.3 Tools related to process model testing 23 2.3.1 Tools for building...algorithm 57 3. Requirements for testing process models using trace based protocol 59 analysis 3.1 Definition of trace based protocol analysis (TBPA) 59

  18. Moment expansion for ionospheric range error

    NASA Technical Reports Server (NTRS)

    Mallinckrodt, A.; Reich, R.; Parker, H.; Berbert, J.

    1972-01-01

    On a plane earth, the ionospheric or tropospheric range error depends only on the total refractivity content, or zeroth moment, of the refracting layer and the elevation angle. On a spherical earth, however, the dependence is more complex, so for more accurate results it has been necessary to resort to complex ray-tracing calculations. A simple, high-accuracy alternative to the ray-tracing calculation is presented. By appropriate expansion of the angular dependence in the ray-tracing integral in a power series in height, an expression is obtained for the range error in terms of a simple function of the elevation angle, E, at the expansion height and of the mth moment of the refractivity, N, distribution about the expansion height. The rapidity of convergence depends heavily on the choice of expansion height. For expansion heights in the neighborhood of the centroid of the layer (300-490 km), the expansion to m = 2 (three terms) gives results accurate to about 0.4% at E = 10 deg. As an analytic tool, the expansion affords some insight into the influence of layer shape on range errors in special problems.
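
    The plane-earth statement corresponds to the familiar first-order ionospheric group delay, which depends only on the zeroth moment of the layer (the total electron content) and the elevation angle; a minimal sketch:

```python
import numpy as np

def plane_earth_range_error(tec, f_hz, elev_deg):
    """First-order ionospheric group range error over a plane earth:
    delta_R = 40.3 * TEC / f^2 / sin(E)  [m], TEC in electrons/m^2.
    Only the zeroth moment (TEC) and the elevation angle enter."""
    return 40.3 * tec / f_hz**2 / np.sin(np.radians(elev_deg))

print(plane_earth_range_error(20e16, 2e9, 10.0))  # 20 TECU at 2 GHz, E = 10 deg
```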

  19. Elimination of 'ghost'-effect-related systematic error in metrology of X-ray optics with a long trace profiler

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yashchuk, Valeriy V.; Irick, Steve C.; MacDowell, Alastair A.

    2005-04-28

    A data acquisition technique and relevant program for suppression of one of the systematic effects, namely the "ghost" effect, of a second generation long trace profiler (LTP) is described. The "ghost" effect arises when there is an unavoidable cross-contamination of the LTP sample and reference signals into one another, leading to a systematic perturbation in the recorded interference patterns and, therefore, a systematic variation of the measured slope trace. Perturbations of about 1-2 µrad have been observed with a cylindrically shaped X-ray mirror. Even stronger "ghost" effects show up in an LTP measurement with a mirror having a toroidal surface figure. The developed technique employs separate measurement of the "ghost"-effect-related interference patterns in the sample and the reference arms and then subtraction of the "ghost" patterns from the sample and the reference interference patterns. The procedure preserves the advantage of simultaneously measuring the sample and reference signals. The effectiveness of the technique is illustrated with LTP metrology of a variety of X-ray mirrors.

  20. Reviewed approach to defining the Active Interlock Envelope for Front End ray tracing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seletskiy, S.; Shaftan, T.

    To protect the NSLS-II Storage Ring (SR) components from damage from synchrotron radiation produced by insertion devices (IDs), the Active Interlock (AI) keeps the electron beam within a safe envelope (a.k.a. the Active Interlock Envelope or AIE) in the transverse phase space. The beamline Front Ends (FEs) are designed under the assumption that above a certain beam current (typically 2 mA) the ID synchrotron radiation (IDSR) fan is produced by the interlocked e-beam. These assumptions also define how the ray tracing for the FE is done. To simplify the FE ray tracing for a typical uncanted ID, it was decided to provide the Mechanical Engineering group with a single set of numbers (x, x', y, y') for the AIE at the center of the long (or short) ID straight section. Such a unified approach to the design of the beamline Front Ends will accelerate the design process and save valuable human resources. In this paper we describe our new approach to defining the AI envelope and provide the resulting numbers required for the design of the typical Front End.

  1. System matrix computation vs storage on GPU: A comparative study in cone beam CT.

    PubMed

    Matenine, Dmitri; Côté, Geoffroi; Mascolo-Fortin, Julia; Goussard, Yves; Després, Philippe

    2018-02-01

    Iterative reconstruction algorithms in computed tomography (CT) require a fast method for computing the intersection distances between the trajectories of photons and the object, also called ray tracing or system matrix computation. This work, focused on the thin-ray model, is aimed at comparing different system matrix handling strategies using graphical processing units (GPUs). In this work, the system matrix is modeled by thin rays intersecting a regular grid of box-shaped voxels, known to be an accurate representation of the forward projection operator in CT. However, an uncompressed system matrix exceeds the random access memory (RAM) capacities of typical computers by one order of magnitude or more. Considering the RAM limitations of GPU hardware, several system matrix handling methods were compared: full storage of a compressed system matrix, on-the-fly computation of its coefficients, and partial storage of the system matrix with partial on-the-fly computation. These methods were tested on geometries mimicking a cone beam CT (CBCT) acquisition of a human head. Execution times of three routines of interest were compared: forward projection, backprojection, and ordered-subsets convex (OSC) iteration. A fully stored system matrix yielded the shortest backprojection and OSC iteration times, with a 1.52× acceleration for OSC when compared to the on-the-fly approach. Nevertheless, the maximum problem size was bound by the available GPU RAM and geometrical symmetries. On-the-fly coefficient computation did not require symmetries and was shown to be the fastest for forward projection. It also offered reasonable execution times of about 176.4 ms per view per OSC iteration for a detector of 512 × 448 pixels and a volume of 384³ voxels, using commodity GPU hardware. Partial system matrix storage showed performance similar to the on-the-fly approach while still relying on symmetries, and was shown to yield the lowest relative performance overall. On-the-fly ray tracing was shown to be the most flexible method, yielding reasonable execution times. A fully stored system matrix allowed for the lowest backprojection and OSC iteration times and may be of interest for certain performance-oriented applications. © 2017 American Association of Physicists in Medicine.
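
    A compact, unoptimized sketch of the thin-ray model (one row of the system matrix): a Siddon-style computation of the intersection lengths of a ray with a regular voxel grid. The paper's GPU kernels implement this far more efficiently:

```python
import numpy as np

def ray_voxel_lengths(p0, p1, grid_min, voxel, shape):
    """(voxel index, intersection length) pairs for the segment p0 -> p1
    crossing a regular grid: one row of the thin-ray system matrix."""
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    grid_min, voxel = np.asarray(grid_min, float), np.asarray(voxel, float)
    d = p1 - p0
    length = np.linalg.norm(d)
    alphas = [np.array([0.0, 1.0])]          # parametric entry/exit of the segment
    for ax in range(3):                      # crossings of every grid plane
        if d[ax] != 0.0:
            planes = grid_min[ax] + voxel[ax] * np.arange(shape[ax] + 1)
            a = (planes - p0[ax]) / d[ax]
            alphas.append(a[(a > 0.0) & (a < 1.0)])
    a = np.unique(np.concatenate(alphas))
    row = []
    for lo, hi in zip(a[:-1], a[1:]):
        mid = p0 + 0.5 * (lo + hi) * d       # midpoint identifies the voxel
        idx = np.floor((mid - grid_min) / voxel).astype(int)
        if np.all(idx >= 0) and np.all(idx < shape):
            row.append((tuple(idx), (hi - lo) * length))
    return row

print(ray_voxel_lengths([-1, 0.5, 0.5], [2, 0.5, 0.5], [0, 0, 0], [1, 1, 1], (2, 1, 1)))
```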

  2. X-ray simulation algorithms used in ISP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sullivan, John P.

    ISP is a simulation code which is sometimes used in the USNDS program. ISP is maintained by Sandia National Lab. However, the X-ray simulation algorithm used by ISP was written by scientists at LANL – mainly by Ed Fenimore with some contributions from John Sullivan and George Neuschaefer and probably others. In an email to John Sullivan on July 25, 2016, Jill Rivera, ISP project lead, said "ISP uses the function xdosemeters_sim from the xgen library." This is a Fortran subroutine which is also used to simulate the X-ray response in consim (a descendant of xgen). Therefore, no separate documentation of the X-ray simulation algorithms in ISP has been written – the documentation for the consim simulation can be used.

  3. Gamma ray energy tracking in GRETINA

    NASA Astrophysics Data System (ADS)

    Lee, I. Y.

    2011-10-01

    The next generation of stable and exotic beam accelerators will provide physics opportunities to study nuclei farther away from the line of stability. However, these experiments will be more demanding on instrumentation performance. The demands come from the lower production rates for more exotic beams, worse beam impurities, and the large beam velocities from fragmentation and inverse reactions. Gamma-ray spectroscopy will be one of the most effective tools to study exotic nuclei; however, to fully exploit the physics reach provided by these new facilities, better gamma-ray detectors will be needed. In the last 10 years, a new concept, the gamma-ray energy tracking array, was developed. Tracking arrays will increase the detection sensitivity by factors of several hundred compared to current arrays used in nuclear physics research. In particular, the capability of reconstructing the position of the interaction with millimeter resolution is needed to correct the Doppler broadening of gamma rays emitted from high-velocity nuclei. GRETINA is a gamma-ray tracking array which uses 28 Ge crystals, each with 36 segments, to cover 1/4 of the 4π solid angle. The gamma-ray tracking technique requires detailed pulse shape information from each of the segments. These pulses are digitized using 14-bit 100 MHz flash ADCs, and digital signal analysis algorithms implemented in the on-board FPGAs provide energy, time, and selection of pulse traces. A digital trigger system provides flexible trigger functions, including a fast trigger output, and also allows complicated trigger decisions to be made up to 20 microseconds later. Further analysis, carried out in a computer cluster, determines the energy, time, and three-dimensional positions of all gamma-ray interactions in the array. This information is then utilized, together with the characteristics of the Compton scattering and pair-production processes, to track the scattering sequences of the gamma rays. GRETINA construction was completed in March 2011, and extensive engineering runs were carried out using radioactive sources and beams from the 88-Inch Cyclotron at LBNL. The data obtained will be used to optimize its performance. The first scientific campaign will start in March 2012 at NSCL MSU.
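
    The tracking step leans on the Compton scattering formula: for each candidate ordering of the interaction points, the angle implied by the deposited energies is compared with the geometric angle, and the ordering with the best agreement is kept. A minimal sketch of that figure of merit (actual GRETINA tracking also handles pair production, clustering, and cross-section weighting):

```python
import numpy as np

MEC2 = 511.0  # electron rest energy, keV

def ordering_fom(e_gamma, source, hits):
    """Figure of merit for one candidate ordering of interactions.
    hits: [(position, deposited energy), ...]; smaller is better.
    Tracking evaluates this over permutations of the hits and keeps the best."""
    pts = [np.asarray(source, float)] + [np.asarray(p, float) for p, _ in hits]
    deps = [e for _, e in hits]
    e_in, fom = e_gamma, 0.0
    for i in range(1, len(pts) - 1):
        u, v = pts[i] - pts[i - 1], pts[i + 1] - pts[i]
        cos_geo = (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
        e_out = e_in - deps[i - 1]
        cos_compton = 1.0 - MEC2 * (1.0 / e_out - 1.0 / e_in)  # Compton formula
        fom += (cos_geo - cos_compton) ** 2
        e_in = e_out
    return fom

hits = [([10.0, 0.0, 0.0], 400.0), ([12.0, 3.0, 0.0], 350.0), ([11.0, 5.0, 1.0], 250.0)]
print(ordering_fom(1000.0, [0.0, 0.0, 0.0], hits))
```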

  4. Nitrogen dioxide observations from the Geostationary Trace gas and Aerosol Sensor Optimization (GeoTASO) airborne instrument: Retrieval algorithm and measurements during DISCOVER-AQ Texas 2013

    EPA Science Inventory

    The Geostationary Trace gas and Aerosol Sensor Optimization (GeoTASO) airborne instrument is a test bed for upcoming air quality satellite instruments that will measure backscattered ultraviolet, visible and near-infrared light from geostationary orbit. GeoTASO flew on the NASA F...

  5. Recursion Removal as an Instructional Method to Enhance the Understanding of Recursion Tracing

    ERIC Educational Resources Information Center

    Velázquez-Iturbide, J. Ángel; Castellanos, M. Eugenia; Hijón-Neira, Raquel

    2016-01-01

    Recursion is one of the most difficult programming topics for students. In this paper, an instructional method is proposed to enhance students' understanding of recursion tracing. The proposal is based on the use of rules to translate linear recursion algorithms into equivalent, iterative ones. The paper has two main contributions: the…

  6. Generation algorithm of craniofacial structure contour in cephalometric images

    NASA Astrophysics Data System (ADS)

    Mondal, Tanmoy; Jain, Ashish; Sardana, H. K.

    2010-02-01

    Anatomical structure tracing on cephalograms is a significant step in cephalometric analysis. Computerized cephalometric analysis involves both manual and automatic approaches; the manual approach is limited in accuracy and repeatability. In this paper we have attempted to develop and test a novel method for automatic localization of craniofacial structure based on the detected edges in the region of interest. Based on the grey-scale features of the different regions of cephalometric images, an algorithm for obtaining tissue contours is put forward. Using edge detection with a specific threshold, an improved bidirectional contour tracing approach is proposed: after an interactive selection of the starting edge pixels, the tracking process searches repeatedly for an edge pixel in the neighborhood of the previously found edge pixel to segment the image, and the craniofacial structures are then obtained. The effectiveness of the algorithm is demonstrated by the preliminary experimental results obtained with the proposed method.

  7. [X-ray endoscopic semiotics and diagnostic algorithm of radiation studies of preneoplastic gastric mucosa changes].

    PubMed

    Akberov, R F; Gorshkov, A N

    1997-01-01

    The X-ray endoscopic semiotics of precancerous gastric mucosal changes (epithelial dysplasia, intestinal epithelial rearrangement) was examined based on the results of 1574 gastric examinations. A diagnostic algorithm was developed for radiation studies in the diagnosis of the above pathology.

  8. Accurate Ray-tracing of Realistic Neutron Star Atmospheres for Constraining Their Parameters

    NASA Astrophysics Data System (ADS)

    Vincent, Frederic H.; Bejger, Michał; Różańska, Agata; Straub, Odele; Paumard, Thibaut; Fortin, Morgane; Madej, Jerzy; Majczyna, Agnieszka; Gourgoulhon, Eric; Haensel, Paweł; Zdunik, Leszek; Beldycki, Bartosz

    2018-03-01

    Thermal-dominated X-ray spectra of neutron stars in quiescent, transient X-ray binaries and neutron stars that undergo thermonuclear bursts are sensitive to mass and radius. The mass–radius relation of neutron stars depends on the equation of state (EoS) that governs their interior. Constraining this relation accurately is therefore of fundamental importance to understand the nature of dense matter. In this context, we introduce a pipeline to calculate realistic model spectra of rotating neutron stars with hydrogen and helium atmospheres. An arbitrarily fast-rotating neutron star with a given EoS generates the spacetime in which the atmosphere emits radiation. We use the LORENE/NROTSTAR code to compute the spacetime numerically and the ATM24 code to solve the radiative transfer equations self-consistently. Emerging specific intensity spectra are then ray-traced through the neutron star's spacetime from the atmosphere to a distant observer with the GYOTO code. Here, we present and test our fully relativistic numerical pipeline. To discuss and illustrate the importance of realistic atmosphere models, we compare our model spectra to simpler models like the commonly used isotropic color-corrected blackbody emission. We highlight the importance of considering realistic model-atmosphere spectra together with relativistic ray-tracing to obtain accurate predictions. We also insist upon the crucial impact of the star's rotation on the observables. Finally, we close a controversy that has been ongoing in the literature in recent years, regarding the validity of the ATM24 code.

  9. Spacecraft observations of man-made whistler-mode signals near the electron gyrofrequency

    NASA Technical Reports Server (NTRS)

    Dunckel, N.; Helliwell, R. A.

    1977-01-01

    The reported investigation extends the range of whistler-mode wave observations to a wave frequency/electron gyrofrequency ratio of about 0.9, where an abrupt cutoff is observed. This cutoff can be explained entirely in terms of accessibility and hence, if there is damping, it must be limited to normalized frequencies above 0.9. In connection with a study of the behavior of the signal intensity, ray tracings were carried out at 80 kHz. The ray-tracing calculations were carried out with the aid of a computer program written by Walter (1969) and modified by Angerami (1970).

  10. Ray tracing evaluation of a technique for correcting the refraction errors in satellite tracking data

    NASA Technical Reports Server (NTRS)

    Gardner, C. S.; Rowlett, J. R.; Hendrickson, B. E.

    1978-01-01

    Errors may be introduced in satellite laser ranging data by atmospheric refractivity. Ray tracing data have indicated that horizontal refractivity gradients may introduce nearly 3-cm rms error when satellites are near 10-degree elevation. A correction formula to compensate for the horizontal gradients has been developed. Its accuracy is evaluated by comparing it to refractivity profiles. It is found that if both spherical and gradient correction formulas are employed in conjunction with meteorological measurements, a range resolution of one cm or less is feasible for satellite elevation angles above 10 degrees.

  11. Ray tracing for inhomogeneous media applied to the human eye

    NASA Astrophysics Data System (ADS)

    Diaz-Gonzalez, G.; Iturbe-Castillo, M. D.; Juarez-Salazar, R.

    2017-08-01

    Inhomogeneous or gradient-index media exhibit a refractive index that varies with position. Such media are very interesting because they can be found both in synthetic devices and in real-life optics such as the human lens. In this work we present the development of a computational tool for ray tracing in refractive optical systems. In particular, the human eye is used as the optical system under study. An inhomogeneous medium with characteristics similar to the human lens is introduced and modeled by the so-called slices method. The usefulness of our proposal is illustrated by several graphical results.
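
    A minimal sketch of the slices method for a one-dimensional index profile: the gradient-index medium is approximated by thin homogeneous layers and Snell's law is applied at each interface. The index values are loosely inspired by reported human-lens cortex/core values and are illustrative only:

```python
import numpy as np

def trace_through_slices(theta0_deg, n_profile):
    """Apply Snell's law slice by slice; the GRIN medium is approximated by
    thin homogeneous layers with indices n_profile[i]."""
    theta = np.radians(theta0_deg)
    angles = [theta]
    for n1, n2 in zip(n_profile[:-1], n_profile[1:]):
        s = n1 * np.sin(theta) / n2
        if abs(s) >= 1.0:        # total internal reflection; stop the sketch here
            break
        theta = np.arcsin(s)
        angles.append(theta)
    return np.degrees(angles)

# index rising toward the core and falling again (illustrative values only)
profile = np.concatenate([np.linspace(1.386, 1.406, 10),
                          np.linspace(1.406, 1.386, 10)])
print(trace_through_slices(30.0, profile))
```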

  12. A complete ray-trace analysis of the Mirage toy

    NASA Astrophysics Data System (ADS)

    Adhya, Sriya; Noé, John W.

    2007-06-01

    The `Mirage' (Opti-Gone International) is a well-known optics demonstration (PIRA index number 6A20.35) that uses two opposed concave mirrors to project a real image of a small object into space. We studied image formation in the Mirage by standard 2x2 matrix methods and by exact ray tracing, with particular attention to additional real images that can be observed when the mirror separation is increased beyond one focal length. We find that the three readily observed secondary images correspond to 4, 6, or 8 reflections, respectively, contrary to previous reports.
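
    The 2x2 (ray-transfer matrix) analysis is easy to reproduce; a minimal sketch with hypothetical numbers (unit focal length, mirror separation of one focal length), not the authors' exact configuration:

```python
import numpy as np

def mirror(f):
    # ray-transfer matrix of a concave mirror, f = R/2; rays as [height, angle]
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

def space(d):
    # free propagation over distance d
    return np.array([[1.0, d], [0.0, 1.0]])

f, d = 1.0, 1.0                                 # separation of one focal length
round_trip = space(d) @ mirror(f) @ space(d) @ mirror(f)
ray = np.array([0.01, 0.0])                     # paraxial ray leaving the object
print(round_trip @ ray)
```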

  13. Pentacam Scheimpflug quantitative imaging of the crystalline lens and intraocular lens.

    PubMed

    Rosales, Patricia; Marcos, Susana

    2009-05-01

    To implement geometrical and optical distortion correction methods for anterior segment Scheimpflug images obtained with a commercially available system (Pentacam, Oculus Optikgeräte GmbH). Ray tracing algorithms were implemented to obtain corrected ocular surface geometry from the original images captured by the Pentacam's CCD camera. As details of the optical layout were not fully provided by the manufacturer, an iterative procedure (based on imaging of calibrated spheres) was developed to estimate the camera lens specifications. The correction procedure was tested on Scheimpflug images of a physical water cell model eye (with polymethylmethacrylate cornea and a commercial IOL of known dimensions) and of a normal human eye previously measured with a Scheimpflug camera corrected for optical and geometrical distortion (Topcon SL-45 [Topcon Medical Systems Inc] from the Vrije University, Amsterdam, Holland). Uncorrected Scheimpflug images show flatter surfaces and thinner lenses than in reality. The application of geometrical and optical distortion correction algorithms improves the accuracy of the estimated anterior lens radii of curvature by 30% to 40% and of the estimated posterior lens radii by 50% to 100%. The average error in the retrieved radii was 0.37 and 0.46 mm for the anterior and posterior lens radii of curvature, respectively, and 0.048 mm for lens thickness. The Pentacam Scheimpflug system can be used to obtain quantitative information on the geometry of the crystalline lens, provided that geometrical and optical distortion correction algorithms are applied, within the accuracy of state-of-the-art phakometry and biometry. The techniques could improve with exact knowledge of the technical specifications of the instrument, improved edge detection algorithms, consideration of aspheric and non-rotationally symmetrical surfaces, and introduction of a crystalline gradient index.

  14. LOR-interleaving image reconstruction for PET imaging with fractional-crystal collimation

    NASA Astrophysics Data System (ADS)

    Li, Yusheng; Matej, Samuel; Karp, Joel S.; Metzler, Scott D.

    2015-01-01

    Positron emission tomography (PET) has become an important modality in medical and molecular imaging. However, in most PET applications, the resolution is still mainly limited by the physical crystal sizes or the detector’s intrinsic spatial resolution. To achieve images with better spatial resolution in a central region of interest (ROI), we have previously proposed using collimation in PET scanners. The collimator is designed to partially mask detector crystals to detect lines of response (LORs) within fractional crystals. A sequence of collimator-encoded LORs is measured with different collimation configurations. This novel collimated scanner geometry makes the reconstruction problem challenging, as both detector and collimator effects need to be modeled to reconstruct high-resolution images from collimated LORs. In this paper, we present a LOR-interleaving (LORI) algorithm, which incorporates these effects and has the advantage of reusing existing reconstruction software, to reconstruct high-resolution images for PET with fractional-crystal collimation. We also develop a 3D ray-tracing model incorporating both the collimator and crystal penetration for simulations and reconstructions of the collimated PET. By registering the collimator-encoded LORs with the collimator configurations, high-resolution LORs are restored based on the modeled transfer matrices using the non-negative least-squares method and EM algorithm. The resolution-enhanced images are then reconstructed from the high-resolution LORs using the MLEM or OSEM algorithm. For validation, we applied the LORI method to a small-animal PET scanner, A-PET, with a specially designed collimator. We demonstrate through simulated reconstructions with a hot-rod phantom and MOBY phantom that the LORI reconstructions can substantially improve spatial resolution and quantification compared to the uncollimated reconstructions. The LORI algorithm is crucial to improve overall image quality of collimated PET, which can have significant implications in preclinical and clinical ROI imaging applications.
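
    The restoration step described above, solving for high-resolution LORs under a non-negativity constraint, can be sketched with SciPy's NNLS solver on a toy transfer matrix (the real matrices are modeled from the collimator and crystal-penetration physics):

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(1)
n_hi, n_meas = 8, 12
A = rng.uniform(0.0, 1.0, size=(n_meas, n_hi))   # toy transfer matrix
x_true = rng.uniform(0.0, 5.0, size=n_hi)        # high-resolution LOR intensities
y = A @ x_true + rng.normal(0.0, 0.01, size=n_meas)  # collimator-encoded data

x_hat, _ = nnls(A, y)                            # non-negativity enforced
print(np.round(x_hat - x_true, 2))
```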

  15. Design of the soft x-ray tomography beamline at Taiwan photon source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Su, Yi-Jr, E-mail: su.yj@nsrrc.org.tw; Fu, Huang-Wen; Chung, Shih-Chun

    2016-07-27

    The optical design of the varied-line-spacing plane-grating monochromator for transmission full-field imaging of frozen-hydrated biological samples at NSRRC is presented. This monochromator consists of a plane mirror and three interchangeable gratings with groove densities 600, 1200 and 2400 l/mm to cover the energy range 260 – 2600 eV. The groove parameters of the varied-line-spacing plane gratings are designed to minimize the effect of coma and spherical aberration to maintain the exit slit in focus for any value of incident angle. All parameters of optical components at the beamline are verified with a ray-tracing method. In the beamline design, the calculated results from the ray-tracing codes and the expected performances are discussed.

  16. Observations of acoustic ray detection by aircraft wake vortices

    DOT National Transportation Integrated Search

    1972-03-15

    Acoustic ray deflection by aircraft wake vortex flow has been observed during landing operations of large aircraft. The phenomenon has been used to detect and locate vortex traces in a plane perpendicular to the runway centerline. The maximum deflect...

  17. Characterization With Scanning Electron Microscopy/Energy-Dispersive X-ray Spectrometry of Microtraces From the Ligature Mean in Hanging Mechanical Asphyxia: A Series of Forensic Cases.

    PubMed

    Maghin, Francesca; Andreola, Salvatore Ambrogio; Boracchi, Michele; Gentile, Guendalina; Maciocco, Francesca; Zoja, Riccardo

    2018-03-01

    The authors applied scanning electron microscopy with energy-dispersive x-ray spectrometry to the furrow derived from the hanging means. The study was conducted with the purpose of detecting possible extraneous microtraces, deriving from the ligature, that could have interacted with the cutaneous biological matrix, thanks to a transfer mechanism, in the proximity of the lesion. Fifteen cutaneous samples of the furrow and an equal number of fragments of graphite tape, positioned directly on the lesion produced by the ligature means and used as a "conductor" of possible traces, were analyzed using scanning electron microscopy with energy-dispersive x-ray spectrometry. The search for microscopic traces on the furrow using this technique highlighted extraneous traces falling into 3 main categories: natural fabrics, and synthetic and metallic materials, excluding possible environmental pollutants. The analysis, run on 7 hanging deaths made available by the judicial authority, found a morphological and compositional compatibility with the traces found on the cutaneous furrow produced during hanging. The technique used in this study is innovative in the pathological-forensic field and can be considered useful in clarifying and studying this typology of asphyxia, linking it to a specific ligature material when the ligature is missing, or attributing the cause of death to hanging when the furrow is not macroscopically obvious.

  18. Characterizing X-ray Attenuation of Containerized Cargo

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Birrer, N.; Divin, C.; Glenn, S.

    X-ray inspection systems can be used to detect radiological and nuclear threats in imported cargo. In order to better understand performance of these systems, the attenuation characteristics of imported cargo need to be determined. This project focused on developing image processing algorithms for segmenting cargo and using x-ray attenuation to quantify equivalent steel thickness to determine cargo density. These algorithms were applied to over 450 cargo radiographs. The results are summarized in this report.

  19. Miniature modified Faraday cup for micro electron beams

    DOEpatents

    Teruya, Alan T.; Elmer, John W.; Palmer, Todd A.; Walton, Chris C.

    2008-05-27

    A micro beam Faraday cup assembly includes a refractory metal layer with an odd number of thin, radially positioned traces. Some of the radially positioned traces are located at the edge of the micro modified Faraday cup body and some are located in the central portion of the body. Each set of traces is connected to a separate data acquisition channel to form multiple independent diagnostic networks. The data obtained from the two diagnostic networks are combined and input into a computed tomography algorithm to reconstruct the beam shape, size, and power density distribution.

  20. Account of an optical beam spreading caused by turbulence for the problem of partially coherent wavefield propagation through inhomogeneous absorbing media

    NASA Astrophysics Data System (ADS)

    Dudorov, Vadim V.; Kolosov, Valerii V.

    2003-04-01

    The propagation problem for partially coherent wave fields in inhomogeneous media is considered in this work. The influence of refraction, of the inhomogeneity of gain-medium properties, and of refraction-parameter fluctuations on the target characteristics of the radiation is taken into consideration. Such problems arise in the study of laser propagation along atmospheric paths, in the investigation of directional radiation pattern formation for lasers whose gain media are characterized by strong fluctuations of the dielectric constant, and for lasers whose resonators include a region of atmosphere. A ray-tracing technique that enables effective algorithms for modeling the propagation of a partially coherent wave field through inhomogeneous random media is presented for the case when the influence of optical-wave refraction, the influence of inhomogeneous amplification or absorption of the radiation, and the influence of refraction-parameter fluctuations on the target radiation parameters are dominant. The novelty of the technique consists in accounting for the additional refraction caused by the inhomogeneity of the gain, and in the method of accounting for turbulent distortions of a beam with arbitrary initial coherence, which allows the construction of efficient numerical algorithms. The technique is based on the solution of the equation for the second-order coherence function.

  1. Lateral Penumbra Modelling Based Leaf End Shape Optimization for Multileaf Collimator in Radiotherapy

    PubMed Central

    Zhou, Dong; Zhang, Hui; Ye, Peiqing

    2016-01-01

    The lateral penumbra of a multileaf collimator plays an important role in radiotherapy treatment planning. Growing evidence has revealed that, for a single-focused multileaf collimator, the lateral penumbra width is leaf-position dependent and largely attributable to the leaf end shape. In our study, an analytical method for modelling the leaf-end-induced lateral penumbra is formulated using Tangent Secant Theory. Compared with Monte Carlo simulation and a ray tracing algorithm, our model serves well the purpose of cost-efficient penumbra evaluation. Leaf ends represented in parametric forms of circular arc, elliptical arc, Bézier curve, and B-spline are implemented. With a biobjective function of penumbra mean and variance introduced, a genetic algorithm is carried out to approximate the Pareto frontier. Results show that for the circular-arc leaf end the objective function is convex, and convergence to the optimal solution is guaranteed using a gradient-based iterative method. It is found that the optimal leaf end in the shape of a Bézier curve achieves the minimal standard deviation, while with a B-spline the minimum penumbra mean is obtained. For treatment modalities in clinical application, the optimized leaf ends are in close agreement with actual shapes. Taken together, the method we propose can provide insight into the leaf end shape design of multileaf collimators. PMID:27110274
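
    The multi-objective search relies on repeatedly filtering a population down to its non-dominated members; a minimal sketch of that Pareto filter for simultaneous minimization of penumbra mean and variance (not the authors' genetic algorithm itself):

```python
import numpy as np

def pareto_front(points):
    """Non-dominated subset for simultaneous minimization of both objectives
    (here: penumbra mean, penumbra variance)."""
    pts = np.asarray(points, float)
    keep = []
    for i, p in enumerate(pts):
        dominated = np.any(np.all(pts <= p, axis=1) & np.any(pts < p, axis=1))
        if not dominated:
            keep.append(i)
    return pts[keep]

pop = np.array([[1.0, 5.0], [2.0, 2.0], [3.0, 3.0], [4.0, 1.0]])
print(pareto_front(pop))   # [3, 3] is dominated by [2, 2] and is filtered out
```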

  2. Hyperspectral imaging simulation of object under sea-sky background

    NASA Astrophysics Data System (ADS)

    Wang, Biao; Lin, Jia-xuan; Gao, Wei; Yue, Hui

    2016-10-01

    Remote sensing image simulation plays an important role in spaceborne/airborne payload demonstration and algorithm development. Hyperspectral imaging is valuable in marine monitoring, search and rescue. To meet the demand for spectral imaging of objects in complex sea scenes, a physics-based simulation method for spectral images of objects in sea scenes is proposed. By developing an imaging simulation model that considers the object, background, atmospheric conditions, and sensor, it is possible to examine the influence of wind speed, atmospheric conditions, and other environmental factors on spectral image quality in complex sea scenes. Firstly, the sea scattering model is established based on the Phillips sea spectrum model, rough-surface scattering theory, and the volume scattering characteristics of water. The measured bidirectional reflectance distribution function (BRDF) data of objects are fitted to a statistical model. MODTRAN software is used to obtain the solar illumination on the sea, the sky brightness, the atmospheric transmittance from sea to sensor, and the atmospheric backscattered radiance, and a Monte Carlo ray tracing method is used to calculate the sea surface-object composite scattering and the spectral image. Finally, the object spectrum is acquired by spatial transformation, radiometric degradation, and the addition of noise. The model connects the spectral image with the environmental parameters, the object parameters, and the sensor parameters, providing a tool for payload demonstration and algorithm development.

  3. Automated isotope identification algorithm using artificial neural networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamuda, Mark; Stinnett, Jacob; Sullivan, Clair

    2017-04-12

    There is a need to develop an algorithm that can determine the relative activities of radio-isotopes in a large dataset of low-resolution gamma-ray spectra that contain mixtures of many radio-isotopes. Low-resolution gamma-ray spectra that contain mixtures of radio-isotopes often exhibit feature overlap, requiring algorithms that can analyze these features when overlap occurs. While machine learning and pattern recognition algorithms have shown promise for the problem of radio-isotope identification, their ability to identify and quantify mixtures of radio-isotopes has not been studied. Because machine learning algorithms use abstract features of the spectrum, such as the shape of overlapping peaks and the Compton continuum, they are a natural choice for analyzing radio-isotope mixtures. An artificial neural network (ANN) has been trained to calculate the relative activities of 32 radio-isotopes in a spectrum. Furthermore, the ANN is trained with simulated gamma-ray spectra, allowing easy expansion of the library of target radio-isotopes. In this paper we present our initial algorithms based on an ANN and evaluate them against a series of measured and simulated spectra.
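
    A minimal sketch of the kind of network described, mapping a measured spectrum to relative activities, is given below; the layer sizes and the untrained random weights are purely illustrative, and the softmax output sums to one, matching the relative-activity interpretation.

      import numpy as np

      rng = np.random.default_rng(0)

      # Hypothetical sizes: 1024-channel spectrum in, 32 relative activities out.
      W1 = rng.normal(0.0, 0.01, (1024, 128)); b1 = np.zeros(128)
      W2 = rng.normal(0.0, 0.01, (128, 32));   b2 = np.zeros(32)

      def relative_activities(spectrum):
          """One-hidden-layer MLP; training on simulated spectra (as in the
          abstract) would fit W1, b1, W2, b2."""
          h = np.maximum(spectrum @ W1 + b1, 0.0)       # ReLU hidden layer
          z = h @ W2 + b2
          e = np.exp(z - z.max())                       # numerically stable softmax
          return e / e.sum()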

  5. Efficient implementation of the 3D-DDA ray traversal algorithm on GPU and its application in radiation dose calculation.

    PubMed

    Xiao, Kai; Chen, Danny Z; Hu, X Sharon; Zhou, Bo

    2012-12-01

    The three-dimensional digital differential analyzer (3D-DDA) algorithm is a widely used ray traversal method, which is also at the core of many convolution/superposition (C/S) dose calculation approaches. However, porting existing C/S dose calculation methods onto the graphics processing unit (GPU) has made it challenging to retain the efficiency of this algorithm. In particular, a straightforward implementation of the original 3D-DDA algorithm introduces substantial branch divergence, which conflicts with the GPU programming model and leads to suboptimal performance. In this paper, an efficient GPU implementation of the 3D-DDA algorithm is proposed, which effectively reduces such branch divergence and improves the performance of C/S dose calculation programs running on the GPU. The main idea of the proposed method is to convert a number of conditional statements in the original 3D-DDA algorithm into a set of simple operations (e.g., arithmetic, comparison, and logic) that are better supported by the GPU architecture. To verify and demonstrate the performance improvement, this ray traversal method was integrated into a GPU-based collapsed cone convolution/superposition (CCCS) dose calculation program. The proposed method has been tested using a water phantom and various clinical cases on an NVIDIA GTX570 GPU. The CCCS dose calculation program based on the efficient 3D-DDA ray traversal implementation runs 1.42× to 2.67× faster than the one based on the original 3D-DDA implementation, without losing any accuracy. The results show that the proposed method can effectively reduce branch divergence in the original 3D-DDA ray traversal algorithm and improve the performance of the CCCS program running on the GPU. Considering the wide utilization of the 3D-DDA algorithm, various applications can benefit from this implementation method.
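
    The core trick, replacing the nested if/else axis selection of 3D-DDA with comparisons and arithmetic, can be illustrated as in the CPU-side sketch below; this shows only the idea, not the paper's GPU kernel. On a GPU, such comparisons compile to predicated or select instructions rather than divergent branches.

      import numpy as np

      def dda_step_branchless(voxel, step, t_max, t_delta):
          """One 3D-DDA traversal step with the stepping axis chosen by
          comparisons and arithmetic instead of nested conditionals.
          voxel/step: integer triples; t_max/t_delta: per-axis ray
          parameters of the next grid crossing and the crossing spacing."""
          # m[k] = 1 for the axis with the smallest t_max (ties: lowest index)
          x_min = (t_max[0] <= t_max[1]) & (t_max[0] <= t_max[2])
          y_min = ~x_min & (t_max[1] <= t_max[2])
          m = np.array([x_min, y_min, ~x_min & ~y_min], dtype=int)
          return voxel + m * step, t_max + m * t_delta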

  6. Microseismic response characteristics modeling and locating of underground water supply pipe leak

    NASA Astrophysics Data System (ADS)

    Wang, J.; Liu, J.

    2015-12-01

    In traditional methods of pipeline leak location, geophones must be placed on the pipe wall; if the exact location of the pipeline is unknown, leaks cannot be identified accurately. To solve this problem, taking into account the characteristics of pipeline leaks, we propose a continuous random seismic source model and construct geological models to investigate the proposed method for locating underground pipeline leaks. Based on two-dimensional (2D) viscoacoustic equations and a staggered-grid finite-difference (FD) algorithm, the microseismic wave field generated by a leaking pipe is modeled. Cross-correlation analysis and the simulated annealing (SA) algorithm were used to obtain the time differences and the leak location. We also analyze and discuss the effect of the number of recorded traces, the survey layout, and the offset and interval of the traces on the accuracy of the estimated location. The preliminary results of the simulation and a field experiment indicate that (1) a continuous random source can realistically represent the leak microseismic wave field in a simulation using 2D viscoacoustic equations and a staggered-grid FD algorithm; (2) the cross-correlation method is effective for calculating the time difference of the direct wave relative to the reference trace, although outside the refraction blind zone the accuracy of the time difference is reduced by the effects of the refracted wave; and (3) the time-difference acquisition method based on microseismic theory and the SA algorithm has great potential for locating leaks in underground pipelines from an array located on the ground surface. Keywords: viscoacoustic finite-difference simulation; continuous random source; simulated annealing algorithm; pipeline leak location
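
    The cross-correlation step, estimating the arrival-time difference of a recorded trace relative to a reference trace from the peak of their cross-correlation, can be sketched in a few lines (illustrative only; the function name is hypothetical):

      import numpy as np

      def time_delay(ref, trace, dt):
          """Delay of `trace` relative to `ref` (negative means an earlier
          arrival), from the peak of the full cross-correlation."""
          xc = np.correlate(trace, ref, mode="full")
          lag = int(np.argmax(xc)) - (len(ref) - 1)   # lag in samples
          return lag * dt

    The set of pairwise delays obtained this way is what the simulated annealing search then fits to recover the leak coordinates.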

  7. Two Automated Techniques for Carotid Lumen Diameter Measurement: Regional versus Boundary Approaches.

    PubMed

    Araki, Tadashi; Kumar, P Krishna; Suri, Harman S; Ikeda, Nobutaka; Gupta, Ajay; Saba, Luca; Rajan, Jeny; Lavra, Francesco; Sharma, Aditya M; Shafique, Shoaib; Nicolaides, Andrew; Laird, John R; Suri, Jasjit S

    2016-07-01

    The degree of stenosis in the carotid artery can be predicted using the automated carotid lumen diameter (LD) measured from B-mode ultrasound images. Systolic velocity-based methods for measuring LD are subjective. With the advancement of high-resolution imaging, image-based methods have started to emerge; however, they require robust image analysis for accurate LD measurement. This paper presents two different algorithms for automated segmentation of the lumen borders in carotid ultrasound images. Both algorithms are modeled as a two-stage process. Stage one consists of a global model using a scale-space framework for extraction of the region of interest, and is common to both algorithms. Stage two is modeled using a local strategy that extracts the lumen interfaces: algorithm-1 is modeled as a region-based strategy using a classification framework, whereas algorithm-2 is modeled as a boundary-based approach that uses the level set framework. Two databases (DB), a Japan DB (JDB) (202 patients, 404 images) and a Hong Kong DB (HKDB) (50 patients, 300 images), were used in this study. Two trained neuroradiologists performed manual LD tracings. The mean automated LD was 6.35 ± 0.95 mm for JDB and 6.20 ± 1.35 mm for HKDB. The precision-of-merit was 97.4% and 98.0% with respect to the two manual tracings for JDB, and 99.7% and 97.9% with respect to the two manual tracings for HKDB. Statistical tests such as ANOVA, chi-squared, t-test, and the Mann-Whitney test were conducted to show the stability and reliability of the automated techniques.

  8. Chandra imaging of the kpc extended outflow in 1H 0419-577

    NASA Astrophysics Data System (ADS)

    Di Gesu, L.; Costantini, E.; Piconcelli, E.; Kaastra, J. S.; Mehdipour, M.; Paltani, S.

    2017-12-01

    The Seyfert 1 galaxy 1H 0419-577 hosts a kpc-extended outflow that is evident in the [O III] image and that is also detected as a warm absorber in the UV/X-ray spectrum. Here, we analyze a 30 ks Chandra-ACIS X-ray image with the aim of resolving the diffuse extranuclear X-ray emission and investigating its relationship with the galactic outflow. Thanks to its sub-arcsecond spatial resolution, Chandra resolves the circumnuclear X-ray emission, which extends out to a projected distance of at least 16 kpc from the center. The morphology of the diffuse X-ray emission is spherically symmetrical. We could not recover a morphological resemblance between the soft X-ray emission and the ionization bicone that is traced by the [O III] outflow. Our spectral analysis indicates that one possible explanation for the extended emission is thermal emission from a low-density (nH ≈ 10^-3 cm^-3) hot plasma (Te ≈ 0.22 keV). If this is the case, we may be witnessing the cooling of a shock-heated wind bubble. In this scenario, the [O III] emission line and the X-ray/UV absorption lines may trace cooler clumps that are entrained in the hot outflow. Alternatively, the extended emission could be due to a blend of emission lines from a photoionized gas component with a hydrogen column density of NH ≈ 2.1 × 10^22 cm^-2 and an ionization parameter of log ξ ≈ 1.3. Because the source is viewed almost edge-on, we argue that the photoionized gas nebula must be distributed mostly along the polar directions, outside our line of sight. In this geometry, the X-ray/UV warm absorber must trace a different gas component, physically disconnected from the emitting gas and located closer to the equatorial plane.

  9. A sparse differential clustering algorithm for tracing cell type changes via single-cell RNA-sequencing data

    PubMed Central

    Barron, Martin; Zhang, Siyuan

    2018-01-01

    Cell types in cell populations change as the condition changes: some cell types die out, new cell types may emerge and surviving cell types evolve to adapt to the new condition. Using single-cell RNA-sequencing data that measure the gene expression of cells before and after the condition change, we propose an algorithm, SparseDC, which identifies cell types, traces their changes across conditions and identifies genes which are marker genes for these changes. By solving a unified optimization problem, SparseDC completes all three tasks simultaneously. SparseDC is highly computationally efficient and demonstrates its accuracy on both simulated and real data. PMID:29140455

  10. Experimental investigation of a moving averaging algorithm for motion perpendicular to the leaf travel direction in dynamic MLC target tracking.

    PubMed

    Yoon, Jai-Woong; Sawant, Amit; Suh, Yelin; Cho, Byung-Chul; Suh, Tae-Suk; Keall, Paul

    2011-07-01

    In dynamic multileaf collimator (MLC) motion tracking with complex intensity-modulated radiation therapy (IMRT) fields, target motion perpendicular to the MLC leaf travel direction can cause beam holds, which increase beam delivery time by up to a factor of 4. As a means to balance delivery efficiency and accuracy, a moving average algorithm was incorporated into a dynamic MLC motion tracking system (i.e., moving average tracking) to account for target motion perpendicular to the MLC leaf travel direction. An experimental investigation of the moving average algorithm compared with real-time tracking and no-compensation beam delivery is described. The properties of the moving average algorithm were measured and compared with those of real-time tracking (dynamic MLC motion tracking accounting for target motion both parallel and perpendicular to the leaf travel direction) and no-compensation beam delivery. The algorithm was investigated using a synthetic motion trace with a baseline drift and four patient-measured 3D tumor motion traces representing regular and irregular motions with varying baseline drifts. Each motion trace was reproduced by a moving platform. The delivery efficiency, geometric accuracy, and dosimetric accuracy were evaluated for conformal, step-and-shoot IMRT, and dynamic sliding window IMRT treatment plans using the synthetic and patient motion traces. The dosimetric accuracy was quantified via a gamma-test with a 3%/3 mm criterion. The delivery efficiency ranged from 89% to 100% for moving average tracking, 26% to 100% for real-time tracking, and 100% (by definition) for no compensation. The root-mean-square geometric error ranged from 3.2 to 4.0 mm for moving average tracking, 0.7 to 1.1 mm for real-time tracking, and 3.7 to 7.2 mm for no compensation. The percentage of dosimetric points failing the gamma-test ranged from 4% to 30% for moving average tracking, 0% to 23% for real-time tracking, and 10% to 47% for no compensation. The delivery efficiency of moving average tracking was up to four times higher than that of real-time tracking and approached the efficiency of no compensation for all cases. The geometric and dosimetric accuracy of the moving average algorithm fell between those of real-time tracking and no compensation, with approximately half the percentage of dosimetric points failing the gamma-test compared with no compensation.
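
    The smoothing idea itself, replacing the instantaneous perpendicular target position with a trailing average before it is passed to the MLC, can be sketched as below; the trailing-window form and its length are hypothetical stand-ins for the paper's filter.

      import numpy as np

      def moving_average_positions(perp_positions, window=20):
          """Trailing moving average of the perpendicular target positions,
          trading geometric fidelity for fewer beam holds."""
          x = np.asarray(perp_positions, dtype=float)
          out = np.empty_like(x)
          for i in range(len(x)):
              lo = max(0, i - window + 1)
              out[i] = x[lo:i + 1].mean()   # average over the trailing window
          return out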

  11. Tracing contacts of TB patients in Malaysia: costs and practicality.

    PubMed

    Atif, Muhammad; Sulaiman, Syed Azhar Syed; Shafie, Asrul Akmal; Ali, Irfhan; Asif, Muhammad

    2012-01-01

    Tuberculin skin testing (TST) and chest X-ray are the conventional methods used for tracing suspected tuberculosis (TB) patients. The purpose of the study was to calculate the cost incurred by Penang General Hospital in performing one contact tracing procedure, using an activity-based costing approach. Contact tracing records (including the demographic profile of contacts and the outcome of the contact tracing procedure) from March 2010 until February 2011 were retrospectively obtained from the TB contact tracing record book. The human resource cost was calculated by multiplying the mean time spent (in minutes) by employees on a specific activity by their per-minute salaries. The costs of consumables, Purified Protein Derivative vials and clinical equipment were obtained from the procurement section of the Pharmacy and Radiology Departments. The cost of the building was calculated by multiplying the area of space used by the facility by the unit cost of the public building department. Straight-line depreciation with a discount rate of 3% was assumed for the calculation of equivalent annual costs for the building and machines. Out of 1024 contact tracing procedures, TST was positive (≥10 mm) in 38 suspects; however, chemoprophylaxis was started in none. The yield of contact tracing (active tuberculosis) was as low as 0.5%. The total unit cost of chest X-ray and TST was MYR 9.23 (USD 2.90) and MYR 11.80 (USD 3.70), respectively. The total cost incurred for a single contact tracing procedure was MYR 21.03 (USD 6.60). Our findings suggest that the yield of contact tracing was very low, which may be attributed to an inappropriate prioritization process. TST may be replaced with more accurate and specific methods (interferon gamma release assay, IGRA) in highly prioritized contacts, or TST-positive contacts should be administered 6H therapy (provided that chest radiography excludes TB) in accordance with standard protocols. The unit cost of contact tracing can be significantly reduced if radiological examination is done only in TST- or IGRA-positive contacts.
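
    The equivalent-annual-cost step follows the standard annuity formula; a sketch with hypothetical numbers (the study's actual capital costs and lifetimes are not reproduced):

      def equivalent_annual_cost(capital_cost, useful_life_years, rate=0.03):
          """Spread a capital cost (building, machine) over its useful life
          using the annuity factor at the stated 3% discount rate."""
          annuity_factor = (1.0 - (1.0 + rate) ** -useful_life_years) / rate
          return capital_cost / annuity_factor

      # e.g. a hypothetical MYR 50,000 machine over 10 years:
      # equivalent_annual_cost(50_000, 10) -> about MYR 5,861 per year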

  12. Ray Casting of Large Multi-Resolution Volume Datasets

    NASA Astrophysics Data System (ADS)

    Lux, C.; Fröhlich, B.

    2009-04-01

    High-quality volume visualization through ray casting on graphics processing units (GPUs) has become an important approach in many application domains. We present a GPU-based, multi-resolution ray casting technique for the interactive visualization of massive volume data sets commonly found in the oil and gas industry. Large volume data sets are represented as a multi-resolution hierarchy based on an octree data structure. The original volume data is decomposed into small bricks of a fixed size acting as the leaf nodes of the octree; these nodes are the highest resolution of the volume. Coarser resolutions are represented through inner nodes of the hierarchy, which are generated by downsampling eight neighboring nodes on a finer level. Due to the limited memory resources of current desktop workstations and graphics hardware, only a limited working set of bricks can be maintained locally for a frame to be displayed. This working set is chosen to represent the whole volume at different local resolution levels depending on the current viewer position, transfer function and distinct areas of interest. At runtime the working set of bricks is maintained in CPU and GPU memory and is adaptively updated by asynchronously fetching data from external sources like hard drives or a network; the CPU memory thereby acts as a second-level cache for these sources, from which the GPU representation is updated. Our volume ray casting algorithm is based on a 3D texture atlas in GPU memory. This texture atlas contains the complete working set of bricks of the current multi-resolution representation of the volume, which enables the volume ray casting algorithm to access the whole working set through a single 3D texture. For traversing rays through the volume, information about the locations and resolution levels of the visited bricks is required for correct compositing computations. We encode this information into a small 3D index texture which represents the current octree subdivision at its finest level and spatially organizes the bricked data. This approach allows us to render a bricked multi-resolution volume data set in a single rendering pass with no loss of compositing precision. In contrast, most state-of-the-art volume rendering systems handle the bricked data as individual 3D textures, which are rendered one at a time while the results are composited into a lower-precision frame buffer. Furthermore, our method enables us to integrate advanced volume rendering techniques like empty-space skipping, adaptive sampling and preintegrated transfer functions in a very straightforward manner with virtually no extra cost. Our interactive implementation allows high-quality visualizations of massive volume data sets of tens of gigabytes in size on standard desktop workstations.
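
    The single-lookup addressing this enables can be sketched with arrays standing in for the GPU textures; the index-volume layout below (an atlas origin plus a resolution scale per finest-level cell) is a hypothetical simplification of the scheme described.

      import numpy as np

      def sample_multires(pos, index_vol, atlas, brick_size):
          """Fetch one sample from a bricked multi-resolution volume.
          pos: position in [0,1)^3; index_vol: (Nx,Ny,Nz,4) array giving,
          per finest-level cell, the brick's atlas origin (3 values) and
          its resolution scale; atlas: 3D array holding all bricks."""
          shape = np.asarray(index_vol.shape[:3])
          cell = np.clip((np.asarray(pos) * shape).astype(int), 0, shape - 1)
          origin = index_vol[tuple(cell)][:3]      # brick origin in the atlas
          scale = index_vol[tuple(cell)][3]        # bricks per axis at this level
          local = (np.asarray(pos) * scale) % 1.0  # position within the brick
          voxel = (origin + local * brick_size).astype(int)
          return atlas[tuple(voxel)]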

  13. Thermal particle image velocity estimation of fire plume flow

    Treesearch

    Xiangyang Zhou; Lulu Sun; Shankar Mahalingam; David R. Weise

    2003-01-01

    For the purpose of studying wildfire spread in living vegetation such as chaparral in California, a thermal particle image velocity (TPIV) algorithm for nonintrusively measuring flame gas velocities through thermal infrared (IR) imagery was developed. By tracing thermal particles in successive digital IR images, the TPIV algorithm can estimate the velocity field in a...

  14. Automatic Program Synthesis Reports.

    ERIC Educational Resources Information Center

    Biermann, A. W.; And Others

    Some of the major results and future goals of an automatic program synthesis project are described in the two papers that comprise this document. The first paper gives a detailed algorithm for synthesizing a computer program from a trace of its behavior. Since the algorithm involves a search, the length of time required to do the synthesis of…

  15. Dynamic Group Formation Based on a Natural Phenomenon

    ERIC Educational Resources Information Center

    Zedadra, Amina; Lafifi, Yacine; Zedadra, Ouarda

    2016-01-01

    This paper presents a new approach to grouping learners in collaborative learning systems. The grouping process is based on traces left by learners, with the goal of circular, dynamic grouping for carrying out collaborative projects. The proposed approach consists of two main algorithms: (1) the circular grouping algorithm and (2) the dynamic grouping…

  16. Self-Cohering Airborne Distributed Array

    DTIC Science & Technology

    1988-06-01

    …algorithms under consideration (including the newly developed algorithms). The algorithms are classified both according to the type of processing and… 4.1 RADIO CAMERA DATA FORMAT AND PROCEDURES (FROM C-23): The range trace delivered by each antenna element is stored as a row of complex numbers.

  17. Transient electromagnetic scattering by a radially uniaxial dielectric sphere: Debye series, Mie series and ray tracing methods

    NASA Astrophysics Data System (ADS)

    Yazdani, Mohsen

    Transient electromagnetic scattering by a radially uniaxial dielectric sphere is explored using three well-known methods: the Debye series, the Mie series, and ray tracing theory. In the first approach, the general solutions for the impulse and step responses of a uniaxial sphere are evaluated using the inverse Laplace transformation of the generalized Mie series solution. Following the high-frequency scattering solution of a large uniaxial sphere, the Mie series summation is split into high-frequency (HF) and low-frequency terms, where the HF term is replaced by its asymptotic expression, allowing a significant reduction in the computation time of the numerical Bromwich integral. In the second approach, the generalized Debye series for a radially uniaxial dielectric sphere is introduced and the Mie series coefficients are replaced by their equivalent Debye series formulations. The results are then applied to examine the transient response of each individual Debye term, allowing the identification of impulse returns in the transient response of the uniaxial sphere. In the third approach, ray tracing theory in a uniaxial sphere is investigated to evaluate the propagation path as well as the arrival time of the ordinary and extraordinary returns in the transient response of the uniaxial sphere. This is achieved by extracting the reflection and transmission angles of a plane wave obliquely incident on the radially oriented air-uniaxial and uniaxial-air boundaries, and expressing the phase velocities as well as the refractive indices of the ordinary and extraordinary waves in terms of the incident angle, optic axis and propagation direction. The results indicate satisfactory agreement among the Debye series, Mie series and ray tracing methods.
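
    For the ordinary wave, whose refractive index does not depend on propagation direction, the internal ray geometry reduces to standard Snell refraction at the sphere surface; the sketch below gives the internal path length and transit time of a once-transmitted ray. The extraordinary ray, with its angle-dependent index, is the part the paper actually develops and is not reproduced here.

      import numpy as np

      C0 = 299792458.0  # speed of light in vacuum, m/s

      def ordinary_ray_transit(a, n_o, b):
          """Internal chord and transit time for the once-transmitted
          ordinary ray through a sphere of radius a and ordinary index n_o,
          at impact parameter b (0 <= b < a)."""
          theta_i = np.arcsin(b / a)                  # incidence angle
          theta_t = np.arcsin(np.sin(theta_i) / n_o)  # refraction angle (Snell)
          chord = 2.0 * a * np.cos(theta_t)           # internal path length
          return chord, n_o * chord / C0              # chord and time inside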

  18. Viscosity in the thermosphere: Evidence from gravity wave, neutral wind and direct lab measurements that the standard viscosity coefficients are too large in the thermosphere; and implication for gravity wave propagation in the thermosphere

    NASA Astrophysics Data System (ADS)

    Vadas, Sharon; Crowley, Geoff

    2017-04-01

    In this paper, we review measurements of (1) gravity waves (GWs) observed as traveling ionospheric disturbances (TIDs) at z ≈ 283 km by the TIDDBIT sounder on 30 October 2007, and (2) simultaneous rocket measurements of in-situ neutral winds at z ≈ 320-385 km. The neutral wind contains a 100 m/s peak at z ≈ 325 km in the same direction as the GWs, but oppositely directed to the diurnal tides. We hypothesize that several of the TIDDBIT GWs propagated upwards and created this neutral wind peak. Using an anelastic GW ray trace model which includes thermospheric dissipation from molecular viscosity and thermal conductivity, with mu proportional to the temperature to the power of 0.7, we forward ray trace the GWs from z_i = 220 km. Surprisingly, the GWs dissipate below z ≈ 260 km, well below the altitude at which they were observed; furthermore, none of the GWs could have propagated high enough to create the neutral wind peak. In our opinion, this constitutes a significant discrepancy between observations and GW dissipative theory. We perform sensitivity experiments to rule out background temperature and wind effects as the cause. We propose a modification to the formula for mu, and show that this yields ray trace results that agree reasonably well with the observations. We examine papers and reports on laboratory experiments which measured mu at low pressures, and find similar results. We conclude that the standard formulas for mu routinely used in thermospheric models must be modified in the thermosphere to account for this important effect. We also show preliminary GW ray trace results using this modified formula for mu, and compare with previous theoretical results.
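
    The power-law viscosity the ray trace model starts from is simply the following; the reference values here are hypothetical, and the paper's proposed thermospheric modification is not reproduced.

      def dynamic_viscosity(T, mu_ref=1.8e-5, T_ref=273.0, p=0.7):
          """Molecular viscosity mu = mu_ref * (T / T_ref)**p, with p = 0.7
          as used in the ray trace model described above."""
          return mu_ref * (T / T_ref) ** p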

  19. Determination of major elements by wavelength-dispersive X-ray fluorescence spectrometry and trace elements by inductively coupled plasma mass spectrometry in igneous rocks from the same fused sample (110 mg)

    NASA Astrophysics Data System (ADS)

    Amosova, Alena A.; Panteeva, Svetlana V.; Chubarov, Victor M.; Finkelshtein, Alexandr L.

    2016-08-01

    A fusion technique is proposed for the simultaneous determination of 35 elements from the same sample. Only 110 mg of rock sample was used to obtain fused glasses for the quantitative determination of 10 major elements by wavelength-dispersive X-ray fluorescence analysis, and of 16 rare earth elements and some other trace elements by inductively coupled plasma mass spectrometry analysis. Fusion was performed with 1.1 g of lithium metaborate and LiBr solution as the releasing agent, in a platinum crucible in an electric furnace at 1100 °C. Certified reference materials of ultramafic, mafic, intermediate and felsic igneous rocks were used to obtain the calibration curves for the determination of rock-forming oxides (Na2O, MgO, Al2O3, SiO2, P2O5, K2O, CaO, TiO2, MnO, Fe2O3) and some trace elements (Ba, Sr, Zr) by X-ray fluorescence analysis. The repeatability does not exceed the allowable standard deviation for a wide range of concentrations; in most cases the relative standard deviation was less than 5%. The obtained glasses were used for the further determination of rare earth (La, Ce, Pr, Nd, Sm, Eu, Gd, Tb, Dy, Ho, Er, Tm, Yb, Lu) and some other (Ba, Sr, Zr, Rb, Cs, Y, Nb, Hf, Ta, Th and U) trace elements by inductively coupled plasma mass spectrometry analysis, with the same certified reference materials employed. The results can mostly be regarded as satisfactory. The proposed procedure substantially reduces expenses in comparison with separate sample preparation for inductively coupled plasma mass spectrometry and X-ray fluorescence analysis.

  20. Sub-basalt Imaging of Hydrocarbon-Bearing Mesozoic Sediments Using Ray-Trace Inversion of First-Arrival Seismic Data and Elastic Finite-Difference Full-Wave Modeling Along Sinor-Valod Profile of Deccan Syneclise, India

    NASA Astrophysics Data System (ADS)

    Talukdar, Karabi; Behera, Laxmidhar

    2018-03-01

    Imaging below basalt for hydrocarbon exploration is a global problem because of poor penetration and significant loss of seismic energy due to scattering, attenuation, absorption and mode conversion when the seismic waves encounter a highly heterogeneous and rugose basalt layer. The conventional (short-offset) seismic data acquisition, processing and modeling techniques adopted by the oil industry generally fail to image hydrocarbon-bearing sub-trappean Mesozoic sediments hidden below the basalt, which is considered a serious problem for hydrocarbon exploration worldwide. To overcome this difficulty of sub-basalt imaging, we have generated dense synthetic seismic data by elastic finite-difference full-wave modeling using a staggered-grid scheme, for the model derived from ray-trace inversion of sparse wide-angle seismic data acquired along the Sinor-Valod profile in the Deccan Volcanic Province of India. The synthetic full-wave seismic data were processed and imaged using conventional seismic data processing techniques with Kirchhoff pre-stack time and depth migrations. The seismic image obtained correlates with all the structural features of the model obtained through ray-trace inversion of the wide-angle seismic data, validating the effectiveness of the robust elastic finite-difference full-wave modeling approach for imaging below thick basalts. Using full-wave modeling also allows us to decipher small-scale heterogeneities imposed on the model as a measure of the rugose basalt interfaces, which could not be handled by ray-trace inversion. Furthermore, we were able to accurately image thin low-velocity hydrocarbon-bearing Mesozoic sediments sandwiched between and hidden below two thick sequences of high-velocity basalt layers lying above the basement.
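
    The time stepping at the heart of such full-wave modeling can be illustrated with a deliberately simplified second-order acoustic update on a regular grid; the paper's staggered-grid elastic scheme is more involved, and this sketch has periodic edges (via np.roll) and no absorbing boundaries.

      import numpy as np

      def acoustic_fd_step(p, p_prev, vel, dt, dx):
          """Advance the 2D acoustic wave equation p_tt = vel^2 * lap(p) by
          one time step with a second-order finite-difference stencil."""
          lap = (np.roll(p, 1, 0) + np.roll(p, -1, 0) +
                 np.roll(p, 1, 1) + np.roll(p, -1, 1) - 4.0 * p) / dx**2
          return 2.0 * p - p_prev + (vel * dt) ** 2 * lap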
