Data analysis in emission tomography using emission-count posteriors
NASA Astrophysics Data System (ADS)
Sitek, Arkadiusz
2012-11-01
A novel approach to the analysis of emission tomography data using the posterior probability of the number of emissions per voxel (emission count) conditioned on acquired tomographic data is explored. The posterior is derived from the prior and the Poisson likelihood of the emission-count data by marginalizing voxel activities. Based on emission-count posteriors, examples of Bayesian analysis including estimation and classification tasks in emission tomography are provided. The application of the method to computer simulations of 2D tomography is demonstrated. In particular, the minimum-mean-square-error point estimator of the emission count is demonstrated. The process of finding this estimator can be considered as a tomographic image reconstruction technique since the estimates of the number of emissions per voxel divided by voxel sensitivities and acquisition time are the estimates of the voxel activities. As an example of a classification task, a hypothesis stating that some region of interest (ROI) emitted at least or at most r-times the number of events in some other ROI is tested. The ROIs are specified by the user. The analysis described in this work provides new quantitative statistical measures that can be used in decision making in diagnostic imaging using emission tomography.
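The abstract's link between estimated emission counts and voxel activities can be sketched directly: dividing the count estimate by voxel sensitivity and acquisition time yields the activity estimate. The function name and toy numbers below are illustrative, not taken from the paper.

```python
import numpy as np

def activity_from_counts(n_hat, sensitivity, t_acq):
    """Convert estimated emissions per voxel to activity estimates.

    n_hat       : estimated number of emissions per voxel (e.g. the MMSE estimate)
    sensitivity : probability that an emission in the voxel is detected
    t_acq       : acquisition time in seconds
    """
    return np.asarray(n_hat, dtype=float) / (np.asarray(sensitivity, dtype=float) * t_acq)

# Toy example: 3 voxels, 60 s acquisition
n_hat = np.array([1200.0, 300.0, 0.0])
sens = np.array([0.02, 0.01, 0.05])
lam = activity_from_counts(n_hat, sens, 60.0)  # emissions per second
```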
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hou, Z; Terry, N; Hubbard, S S
2013-02-12
In this study, we evaluate the possibility of monitoring soil moisture variation using tomographic ground penetrating radar travel time data through Bayesian inversion, which is integrated with an entropy memory function and pilot point concepts, as well as efficient sampling approaches. It is critical to accurately estimate soil moisture content and variations in vadose zone studies. Many studies have illustrated the promise and value of GPR tomographic data for estimating soil moisture and associated changes; however, challenges still exist in the inversion of GPR tomographic data in a manner that quantifies input and predictive uncertainty, incorporates multiple data types, handles non-uniqueness and nonlinearity, and honors time-lapse tomograms collected in a series. To address these challenges, we develop a minimum relative entropy (MRE)-Bayesian based inverse modeling framework that non-subjectively defines prior probabilities, incorporates information from multiple sources, and quantifies uncertainty. The framework enables us to estimate dielectric permittivity at pilot point locations distributed within the tomogram, as well as the spatial correlation range. In the inversion framework, MRE is first used to derive prior probability density functions (pdfs) of dielectric permittivity based on prior information obtained from a straight-ray GPR inversion. The probability distributions are then sampled using a Quasi-Monte Carlo (QMC) approach, and the sample sets provide inputs to a sequential Gaussian simulation (SGSim) algorithm that constructs a highly resolved permittivity/velocity field for evaluation with a curved-ray GPR forward model. The likelihood functions are computed as a function of misfits, and posterior pdfs are constructed using a Gaussian kernel. Inversion of subsequent time-lapse datasets combines the Bayesian estimates from the previous inversion (as a memory function) with new data.
The memory function and pilot point design take advantage of the spatial-temporal correlation of the state variables. We first apply the inversion framework to a static synthetic example and then to a time-lapse GPR tomographic dataset collected during a dynamic experiment conducted at the Hanford Site in Richland, WA. We demonstrate that the MRE-Bayesian inversion enables us to merge various data types, quantify uncertainty, evaluate nonlinear models, and produce more detailed and better resolved estimates than straight-ray based inversion; therefore, it has the potential to improve estimates of inter-wellbore dielectric permittivity and soil moisture content and to monitor their temporal dynamics more accurately.
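The sampling-and-weighting step described above can be sketched in miniature: draw quasi-Monte Carlo (Sobol) samples of permittivity, score each by a Gaussian likelihood of the travel-time misfit, and form a posterior estimate. The one-parameter straight-ray forward model, prior bounds, and noise level below are invented for illustration; the paper's framework operates on pilot-point fields with SGSim and a curved-ray solver.

```python
import numpy as np
from scipy.stats import qmc

# Toy stand-in for the GPR forward model: travel time t = L * sqrt(eps) / c,
# since the EM velocity is v = c / sqrt(eps). All numbers are illustrative.
C = 0.3            # m/ns, speed of light
L = 5.0            # m, ray path length
eps_true = 9.0
t_obs = L * np.sqrt(eps_true) / C
sigma = 0.2        # ns, assumed travel-time noise

def forward(eps):
    return L * np.sqrt(eps) / C

# Quasi-Monte Carlo (Sobol) samples from a uniform prior on [4, 16]
sampler = qmc.Sobol(d=1, scramble=True, seed=0)
eps_samples = qmc.scale(sampler.random_base2(m=10), 4.0, 16.0).ravel()

# Gaussian likelihood from the travel-time misfit, then normalized weights
misfit = forward(eps_samples) - t_obs
w = np.exp(-0.5 * (misfit / sigma) ** 2)
w /= w.sum()
eps_post = np.sum(w * eps_samples)   # weighted posterior mean estimate
```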
NASA Astrophysics Data System (ADS)
Bai, Bing
2012-03-01
There has been much recent work on total variation (TV) regularized tomographic image reconstruction, much of it using gradient-based optimization algorithms with a differentiable approximation of the TV functional. In this paper we apply TV regularization to Positron Emission Tomography (PET) image reconstruction. We reconstruct the PET image in a Bayesian framework, using a Poisson noise model and a TV prior functional. The original optimization problem is transformed to an equivalent problem with inequality constraints by adding auxiliary variables. Then we use an interior point method with logarithmic barrier functions to solve the constrained optimization problem. In this method, a series of points approaching the solution from inside the feasible region is found by solving a sequence of subproblems characterized by an increasing positive parameter. We use a preconditioned conjugate gradient (PCG) algorithm to solve the subproblems directly. The nonnegativity constraint is enforced by a bend line search. The exact expression of the TV functional is used in our calculations. Simulation results show that the algorithm converges quickly and that the convergence is insensitive to the values of the regularization and reconstruction parameters.
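The objective the abstract describes (Poisson log-likelihood plus exact TV, with a logarithmic barrier keeping the image positive) can be sketched on a tiny 1-D problem. This toy uses plain (sub)gradient steps rather than the paper's interior-point/PCG scheme, and all sizes, weights, and step lengths are made up.

```python
import numpy as np

rng = np.random.default_rng(1)

# Tiny 1-D toy problem; sizes and weights are illustrative only
n, m = 16, 24
A = rng.uniform(0.0, 1.0, size=(m, n))            # system matrix
x_true = np.ones(n); x_true[5:10] = 4.0           # piecewise-constant object
y = rng.poisson(A @ x_true).astype(float)         # Poisson data

beta, mu = 0.05, 1e-2                             # TV weight, barrier weight

def objective(x):
    ax = A @ x
    tv = np.abs(np.diff(x)).sum()                 # exact (anisotropic) TV
    return (ax - y * np.log(ax)).sum() + beta * tv - mu * np.log(x).sum()

def gradient(x):
    ax = A @ x
    g = A.T @ (1.0 - y / ax)                      # Poisson negative log-likelihood part
    s = np.sign(np.diff(x))                       # TV subgradient
    g_tv = np.zeros(n); g_tv[:-1] -= s; g_tv[1:] += s
    return g + beta * g_tv - mu / x               # log-barrier keeps x > 0

x0 = np.ones(n); f0 = objective(x0)
x = x0.copy()
for _ in range(200):                              # plain (sub)gradient steps;
    x = np.maximum(x - 1e-3 * gradient(x), 1e-6)  # the paper solves subproblems with PCG
```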
SSULI/SSUSI UV Tomographic Images of Large-Scale Plasma Structuring
NASA Astrophysics Data System (ADS)
Hei, M. A.; Budzien, S. A.; Dymond, K.; Paxton, L. J.; Schaefer, R. K.; Groves, K. M.
2015-12-01
We present a new technique that creates tomographic reconstructions of atmospheric ultraviolet emission based on data from the Special Sensor Ultraviolet Limb Imager (SSULI) and the Special Sensor Ultraviolet Spectrographic Imager (SSUSI), both flown on the Defense Meteorological Satellite Program (DMSP) Block 5D3 series satellites. Until now, the data from these two instruments have been used independently of each other. The new algorithm combines SSULI/SSUSI measurements of 135.6 nm emission using the tomographic technique; the resultant data product - whole-orbit reconstructions of atmospheric volume emission within the satellite orbital plane - is substantially improved over the original data sets. Tests using simulated atmospheric emission verify that the algorithm performs well in a variety of situations, including daytime, nighttime, and even in the challenging terminator regions. A comparison with ALTAIR radar data validates that the volume emission reconstructions can be inverted to yield maps of electron density. The algorithm incorporates several innovative features, including the use of both SSULI and SSUSI data to create tomographic reconstructions, the use of an inversion algorithm (Richardson-Lucy; RL) that explicitly accounts for the Poisson statistics inherent in optical measurements, and a pseudo-diffusion based regularization scheme implemented between iterations of the RL code. The algorithm also explicitly accounts for extinction due to absorption by molecular oxygen.
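The Richardson-Lucy iteration the algorithm builds on is a standard multiplicative update for Poisson-distributed measurements. A minimal sketch follows; the toy 3×3 system is invented for illustration and has nothing to do with the SSULI/SSUSI geometry.

```python
import numpy as np

def richardson_lucy(A, y, n_iter=2000):
    """Richardson-Lucy (MLEM-type) iteration for Poisson data y ~ Poisson(A x)."""
    x = np.ones(A.shape[1])
    sens = np.maximum(A.sum(axis=0), 1e-12)     # column sums (sensitivities)
    for _ in range(n_iter):
        ratio = y / np.maximum(A @ x, 1e-12)    # measured / predicted counts
        x *= (A.T @ ratio) / sens               # multiplicative update, stays >= 0
    return x

# Noiseless toy check: the iteration recovers a consistent emission profile
A = np.array([[1.0, 0.5, 0.0],
              [0.0, 1.0, 0.5],
              [0.5, 0.0, 1.0]])
x_true = np.array([2.0, 1.0, 3.0])
x_hat = richardson_lucy(A, A @ x_true)
```

The multiplicative form preserves nonnegativity automatically, which is one reason RL suits photon-counting data.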
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ceglio, N.M.; George, E.V.; Brooks, K.M.
The first successful demonstration of high resolution, tomographic imaging of a laboratory plasma using coded imaging techniques is reported. ZPCI has been used to image the x-ray emission from laser compressed DT filled microballoons. The zone plate camera viewed an x-ray spectral window extending from below 2 keV to above 6 keV. It exhibited a resolution of approximately 8 μm, a magnification factor of approximately 13, and subtended a radiation collection solid angle at the target of approximately 10^-2 sr. X-ray images using ZPCI were compared with those taken using a grazing incidence reflection x-ray microscope. The agreement was excellent. In addition, the zone plate camera produced tomographic images. The nominal tomographic resolution was approximately 75 μm. This allowed three dimensional viewing of target emission from a single shot in planar "slices". In addition to its tomographic capability, the great advantage of the coded imaging technique lies in its applicability to hard (greater than 10 keV) x-ray and charged particle imaging. Experiments involving coded imaging of the suprathermal x-ray and high energy alpha particle emission from laser compressed microballoon targets are discussed.
Tomographic inversion of satellite photometry. II
NASA Technical Reports Server (NTRS)
Solomon, S. C.; Hays, P. B.; Abreu, V. J.
1985-01-01
A method for combining nadir observations of emission features in the upper atmosphere with the result of a tomographic inversion of limb brightness measurements is presented. Simulated and actual results are provided, and error sensitivity is investigated.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pablant, N. A.; Bell, R. E.; Bitter, M.
2014-11-15
Accurate tomographic inversion is important for diagnostic systems on stellarators and tokamaks which rely on measurements of line integrated emission spectra. A tomographic inversion technique based on spline optimization with enforcement of constraints is described that can produce unique and physically relevant inversions even in situations with noisy or incomplete input data. This inversion technique is routinely used in the analysis of data from the x-ray imaging crystal spectrometer (XICS) installed at the Large Helical Device. The XICS diagnostic records a 1D image of line integrated emission spectra from impurities in the plasma. Through the use of Doppler spectroscopy and tomographic inversion, XICS can provide profile measurements of the local emissivity, temperature, and plasma flow. Tomographic inversion requires the assumption that these measured quantities are flux surface functions, and that a known plasma equilibrium reconstruction is available. In the case of low signal levels or partial spatial coverage of the plasma cross-section, standard inversion techniques utilizing matrix inversion and linear regularization often cannot produce unique and physically relevant solutions. The addition of physical constraints, such as parameter ranges, derivative directions, and boundary conditions, allows unique solutions to be reliably found. The constrained inversion technique described here utilizes a modified Levenberg-Marquardt optimization scheme, which introduces a condition avoidance mechanism by selective reduction of search directions. The constrained inversion technique also allows for the addition of more complicated parameter dependencies, for example, geometrical dependence of the emissivity due to asymmetries in the plasma density arising from fast rotation. The accuracy of this constrained inversion technique is discussed, with an emphasis on its applicability to systems with limited plasma coverage.
Tomographic Image Compression Using Multidimensional Transforms.
ERIC Educational Resources Information Center
Villasenor, John D.
1994-01-01
Describes a method for compressing tomographic images obtained using Positron Emission Tomography (PET) and Magnetic Resonance (MR) imaging by applying transform compression across all available dimensions. This takes maximum advantage of redundancy in the data, allowing significant increases in compression efficiency and performance. (13 references) (KRN)
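The idea of exploiting redundancy across all dimensions can be sketched with a 3-D discrete cosine transform: transform the whole volume, keep only the largest coefficients, and invert. The 16³ Gaussian-blob volume and the 2% retention rate below are illustrative choices, not from the article.

```python
import numpy as np
from scipy.fft import dctn, idctn

# Toy volumetric "image": a smooth 3-D blob standing in for a PET/MR volume
z, y, x = np.mgrid[0:16, 0:16, 0:16]
vol = np.exp(-((x - 8.0) ** 2 + (y - 8.0) ** 2 + (z - 8.0) ** 2) / 20.0)

# Transform compression across all three dimensions: keep roughly the
# largest 2% of 3-D DCT coefficients and zero the rest
coef = dctn(vol, norm="ortho")
thresh = np.quantile(np.abs(coef), 0.98)
coef_c = np.where(np.abs(coef) >= thresh, coef, 0.0)
recon = idctn(coef_c, norm="ortho")

rel_err = np.linalg.norm(recon - vol) / np.linalg.norm(vol)
```

For smooth volumes the transform concentrates energy into few coefficients, so the relative error stays small even at high compression ratios.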
Inference of emission rates from multiple sources using Bayesian probability theory.
Yee, Eugene; Flesch, Thomas K
2010-03-01
The determination of atmospheric emission rates from multiple sources using inversion (regularized least-squares or best-fit technique) is known to be very susceptible to measurement and model errors in the problem, rendering the solution unusable. In this paper, a new perspective is offered for this problem: namely, it is argued that the problem should be addressed as one of inference rather than inversion. Towards this objective, Bayesian probability theory is used to estimate the emission rates from multiple sources. The posterior probability distribution for the emission rates is derived, accounting fully for the measurement errors in the concentration data and the model errors in the dispersion model used to interpret the data. The Bayesian inferential methodology for emission rate recovery is validated against real dispersion data, obtained from a field experiment involving various source-sensor geometries (scenarios) consisting of four synthetic area sources and eight concentration sensors. The recovery of discrete emission rates from three different scenarios obtained using Bayesian inference and singular value decomposition inversion are compared and contrasted.
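With a linear source-receptor model and Gaussian errors, the Bayesian posterior over the emission rates has a closed form, which is the backbone of the inference the abstract describes. The coupling matrix, noise level, and rates below are invented to mirror the four-source, eight-sensor geometry.

```python
import numpy as np

# Linear source-receptor model: concentrations y = C q + noise, where C holds
# dispersion-model coupling coefficients (all numbers here are illustrative)
rng = np.random.default_rng(3)
n_src, n_sen = 4, 8
C = rng.uniform(0.1, 1.0, size=(n_sen, n_src))
q_true = np.array([2.0, 0.5, 1.0, 3.0])         # emission rates, g/s
sigma = 0.01                                    # assumed concentration noise
y = C @ q_true + rng.normal(0.0, sigma, n_sen)

# Gaussian likelihood + broad Gaussian prior N(0, tau^2 I) gives a Gaussian
# posterior over q with closed-form mean and covariance
tau = 100.0
P_inv = np.eye(n_src) / tau ** 2
post_cov = np.linalg.inv(C.T @ C / sigma ** 2 + P_inv)
post_mean = post_cov @ (C.T @ y / sigma ** 2)
```

The posterior covariance quantifies how measurement and model errors propagate into the recovered rates, which is the advantage over a bare least-squares inversion.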
NASA Technical Reports Server (NTRS)
Mcdade, Ian C.
1991-01-01
Techniques were developed for recovering two-dimensional distributions of auroral volume emission rates from rocket photometer measurements made in a tomographic spin scan mode. These tomographic inversion procedures are based upon an algebraic reconstruction technique (ART) and utilize two different iterative relaxation techniques for solving the problems associated with noise in the observational data. One of the inversion algorithms is based upon a least squares method and the other on a maximum probability approach. The performance of the inversion algorithms, and the limitations of the rocket tomography technique, were critically assessed using various factors such as (1) statistical and non-statistical noise in the observational data, (2) rocket penetration of the auroral form, (3) background sources of emission, (4) smearing due to the photometer field of view, and (5) temporal variations in the auroral form. These tests show that the inversion procedures may be successfully applied to rocket observations made in medium intensity aurora with standard rocket photometer instruments. The inversion procedures have been used to recover two-dimensional distributions of auroral emission rates and ionization rates from an existing set of N2+ 3914 Å rocket photometer measurements which were made in a tomographic spin scan mode during the ARIES auroral campaign. The two-dimensional distributions of the 3914 Å volume emission rates recovered from the inversion of the rocket data compare very well with the distributions that were inferred from ground-based measurements using triangulation-tomography techniques, and the N2 ionization rates derived from the rocket tomography results are in very good agreement with the in situ particle measurements that were made during the flight. Three pre-prints describing the tomographic inversion techniques and the tomographic analysis of the ARIES rocket data are included as appendices.
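The ART family the abstract refers to is the Kaczmarz row-action method: cycle through the rays and project the current estimate onto each ray's hyperplane, with a relaxation factor to damp noise. The toy 4×3 system and the nonnegativity clamp below are illustrative additions.

```python
import numpy as np

def art(A, y, n_sweeps=50, relax=0.5):
    """Algebraic reconstruction technique (Kaczmarz) with relaxation.

    Cycles through the rays, projecting the estimate onto each ray's
    hyperplane; relax < 1 damps the updates to suppress noise.
    """
    x = np.zeros(A.shape[1])
    for _ in range(n_sweeps):
        for a_i, y_i in zip(A, y):
            x += relax * (y_i - a_i @ x) / (a_i @ a_i) * a_i
        x = np.maximum(x, 0.0)           # emission rates are nonnegative
    return x

A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0],
              [1.0, 1.0, 1.0]])
x_true = np.array([1.0, 2.0, 0.5])
x_hat = art(A, A @ x_true)               # noiseless consistent data
```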
DOE Office of Scientific and Technical Information (OSTI.GOV)
Craciunescu, Teddy, E-mail: teddy.craciunescu@jet.uk; Tiseanu, Ion; Zoita, Vasile
The Joint European Torus (JET) neutron profile monitor ensures 2D coverage of the gamma and neutron emissive region that enables tomographic reconstruction. Due to the availability of only two projection angles and to the coarse sampling, tomographic inversion is a limited data set problem. Several techniques have been developed for tomographic reconstruction of the 2-D gamma and neutron emissivity on JET, but the problem of evaluating the errors associated with the reconstructed emissivity profile is still open. The reconstruction technique based on the maximum likelihood principle, which has already proved to be a powerful tool for JET tomography, has been used to develop a method for the numerical evaluation of the statistical properties of the uncertainties in gamma and neutron emissivity reconstructions. The image covariance calculation takes into account the additional techniques introduced in the reconstruction process for tackling the limited data set (projection resampling, smoothness regularization depending on magnetic field). The method has been validated by numerical simulations and applied to JET data. Different sources of artefacts that may significantly influence the quality of reconstructions and the accuracy of variance calculation have been identified.
NASA Astrophysics Data System (ADS)
Tornai, Martin P.; Bowsher, James E.; Archer, Caryl N.; Peter, Jörg; Jaszczak, Ronald J.; MacDonald, Lawrence R.; Patt, Bradley E.; Iwanczyk, Jan S.
2003-01-01
A novel tomographic gantry was designed, built and initially evaluated for single photon emission imaging of metabolically active lesions in the pendant breast and near chest wall. Initial emission imaging measurements with breast lesions of various uptake ratios are presented. Methods: A prototype tomograph was constructed utilizing a compact gamma camera having a field-of-view of <13×13 cm² with arrays of 2×2×6 mm³ quantized NaI(Tl) scintillators coupled to position sensitive PMTs. The camera was mounted on a radially oriented support with 6 cm variable radius-of-rotation. This unit is further mounted on a goniometric cradle providing polar motion, and in turn mounted on an azimuthal rotation stage capable of indefinite rotation about the central vertical rotation axis (RA). Initial measurements with isotopic Tc-99m (140 keV) to evaluate the system included acquisitions with various polar tilt angles about the RA. Tomographic measurements were made of a frequency and resolution cold-rod phantom filled with aqueous Tc-99m. Tomographic and planar measurements of 0.6 and 1.0 cm diameter fillable spheres in an available ~950 ml hemi-ellipsoidal (uncompressed) breast phantom attached to a life-size anthropomorphic torso phantom with lesion:breast-and-body:cardiac-and-liver activity concentration ratios of 11:1:19 were compared. Various photopeak energy windows of 10-30% widths were obtained, along with a 35% scatter window below a 15% photopeak window from the list mode data. Projections with all photopeak window and camera tilt conditions were reconstructed with an ordered subsets expectation maximization (OSEM) algorithm capable of reconstructing arbitrary tomographic orbits. Results: As iteration number increased for the tomographically measured data at all polar angles, contrasts increased while signal-to-noise ratios (SNRs) decreased in the expected way with OSEM reconstruction.
The rollover between contrast improvement and SNR degradation of the lesion occurred at two to three iterations. The reconstructed tomographic data yielded SNRs with or without scatter correction that were >9 times better than the planar scans. There was up to a factor of ~2.5 increase in total primary and scatter contamination in the photopeak window with increasing tilt angle from 15° to 45°, consistent with a more direct line-of-sight of myocardial and liver activity with increased camera polar angle. Conclusion: This new, ultra-compact, dedicated tomographic imaging system has the potential of providing valuable, fully 3D functional information about small, otherwise indeterminate breast lesions as an adjunct to diagnostic mammography.
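The OSEM reconstruction used above differs from plain MLEM by updating with one subset of projections at a time, which accelerates early iterations. A minimal sketch follows; the 8×3 toy system, subset count, and iteration numbers are invented for illustration.

```python
import numpy as np

def osem(A, y, n_subsets=4, n_iter=200):
    """Ordered-subsets EM for Poisson data y ~ Poisson(A x): each sub-iteration
    applies an MLEM-type update using only one subset of the projections."""
    m, n = A.shape
    x = np.ones(n)
    subsets = [np.arange(s, m, n_subsets) for s in range(n_subsets)]
    for _ in range(n_iter):
        for idx in subsets:
            As = A[idx]
            ratio = y[idx] / np.maximum(As @ x, 1e-12)
            x *= (As.T @ ratio) / np.maximum(As.sum(axis=0), 1e-12)
    return x

rng = np.random.default_rng(5)
A = rng.uniform(0.2, 1.0, size=(8, 3))   # toy system matrix
x_true = np.array([1.0, 3.0, 2.0])
x_hat = osem(A, A @ x_true)              # noiseless projections
```

The contrast-versus-SNR rollover reported above is the usual consequence of running such multiplicative updates further: resolution and contrast keep improving while noise amplifies.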
On the uncertainty in single molecule fluorescent lifetime and energy emission measurements
NASA Technical Reports Server (NTRS)
Brown, Emery N.; Zhang, Zhenhua; Mccollom, Alex D.
1995-01-01
Time-correlated single photon counting has recently been combined with mode-locked picosecond pulsed excitation to measure the fluorescent lifetimes and energy emissions of single molecules in a flow stream. Maximum likelihood (ML) and least squares methods agree and are optimal when the number of detected photons is large; however, in single molecule fluorescence experiments the number of detected photons can be less than 20, 67% of those can be noise, and the detection time is restricted to 10 nanoseconds. Under the assumption that the photon signal and background noise are two independent inhomogeneous Poisson processes, we derive the exact joint arrival time probability density of the photons collected in a single counting experiment performed in the presence of background noise. The model obviates the need to bin experimental data for analysis, and makes it possible to analyze formally the effect of background noise on the photon detection experiment using both ML and Bayesian methods. For both methods we derive the joint and marginal probability densities of the fluorescent lifetime and fluorescent emission. The ML and Bayesian methods are compared in an analysis of simulated single molecule fluorescence experiments of Rhodamine 110 using different combinations of expected background noise and expected fluorescence emission. While both the ML and Bayesian procedures perform well for analyzing fluorescence emissions, the Bayesian methods provide more realistic measures of uncertainty in the fluorescent lifetimes. The Bayesian methods would be especially useful for measuring uncertainty in fluorescent lifetime estimates in current single molecule flow stream experiments where the expected fluorescence emission is low. Both the ML and Bayesian algorithms can be automated for applications in molecular biology.
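The unbinned likelihood idea can be sketched with a simple arrival-time density: a truncated exponential decay mixed with uniform background on the detection window, evaluated photon by photon. The window length, background fraction, and the eight arrival times below are made-up illustration values, and the grid search stands in for the paper's exact ML and Bayesian machinery.

```python
import numpy as np

# Arrival-time density on [0, T]: truncated exponential decay (lifetime tau)
# mixed with a uniform background of assumed fraction b
T = 10.0                       # ns detection window
b = 0.3                        # assumed background fraction

def log_lik(tau, times):
    decay = np.exp(-times / tau) / (tau * (1.0 - np.exp(-T / tau)))
    return np.sum(np.log((1.0 - b) * decay + b / T))

times = np.array([0.2, 0.5, 0.7, 1.1, 1.4, 2.3, 3.0, 6.8])   # ns, 8 photons

taus = np.linspace(0.1, 8.0, 800)
lls = np.array([log_lik(t, times) for t in taus])
tau_ml = taus[np.argmax(lls)]                  # ML estimate (grid search)

post = np.exp(lls - lls.max())                 # flat prior on the grid
post /= post.sum()
tau_bayes = float(np.sum(post * taus))         # Bayesian posterior mean
```

Working directly with arrival times, as here, avoids binning the handful of detected photons; the spread of the normalized posterior also gives the uncertainty measure the abstract credits to the Bayesian approach.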
New Possibilities of Positron-Emission Tomography
NASA Astrophysics Data System (ADS)
Volobuev, A. N.
2018-01-01
The reasons for the emergence of the angular distribution of photons generated as a result of annihilation of an electron and a positron in a positron-emission tomograph are investigated. It is shown that the angular distribution of the radiation intensity (i.e., the probability of photon emission at different angles) is a consequence of the Doppler effect in the center-of-mass reference system of the electron and the positron. In the reference frame attached to the electron, the angular distribution of the number of emitted photons does not exist but is replaced by the Doppler shift of the frequency of the photons. The results obtained in this study make it possible to extend the capabilities of the positron-emission tomograph in the diagnostics of diseases and to obtain additional mechanical characteristics of human tissues, such as density and viscosity.
Markov chain Monte Carlo estimation of quantum states
NASA Astrophysics Data System (ADS)
Diguglielmo, James; Messenger, Chris; Fiurášek, Jaromír; Hage, Boris; Samblowski, Aiko; Schmidt, Tabea; Schnabel, Roman
2009-03-01
We apply a Bayesian data analysis scheme known as the Markov chain Monte Carlo to the tomographic reconstruction of quantum states. This method yields a vector, known as the Markov chain, which contains the full statistical information concerning all reconstruction parameters including their statistical correlations with no a priori assumptions as to the form of the distribution from which it has been obtained. From this vector we can derive, e.g., the marginal distributions and uncertainties of all model parameters, and also of other quantities such as the purity of the reconstructed state. We demonstrate the utility of this scheme by reconstructing the Wigner function of phase-diffused squeezed states. These states possess non-Gaussian statistics and therefore represent a nontrivial case of tomographic reconstruction. We compare our results to those obtained through pure maximum-likelihood and Fisher information approaches.
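The sampler behind this scheme is typically a random-walk Metropolis algorithm: propose a perturbed parameter value, accept or reject it by the posterior ratio, and read marginal means and uncertainties directly off the chain. The one-parameter Gaussian target below is a toy stand-in for the full quantum-state posterior.

```python
import numpy as np

def metropolis(log_post, x0, n_samples=20000, step=0.5, seed=0):
    """Random-walk Metropolis sampler. The returned chain carries the full
    statistical information about the parameter, including its uncertainty."""
    rng = np.random.default_rng(seed)
    chain = np.empty(n_samples)
    x, lp = x0, log_post(x0)
    for i in range(n_samples):
        x_new = x + step * rng.normal()
        lp_new = log_post(x_new)
        if np.log(rng.uniform()) < lp_new - lp:   # accept/reject step
            x, lp = x_new, lp_new
        chain[i] = x
    return chain

# Toy posterior: Gaussian with mean 1.2 and standard deviation 0.4
chain = metropolis(lambda x: -0.5 * ((x - 1.2) / 0.4) ** 2, x0=0.0)
burn = chain[5000:]                               # discard burn-in
est_mean, est_std = burn.mean(), burn.std()
```

Because any function of the parameters can be evaluated along the chain, derived quantities (such as the state purity mentioned above) inherit their uncertainties for free.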
Positron Emission Tomography: Principles, Technology, and Recent Developments
NASA Astrophysics Data System (ADS)
Ziegler, Sibylle I.
2005-04-01
Positron emission tomography (PET) is a nuclear medical imaging technique for quantitative measurement of physiologic parameters in vivo (an overview of principles and applications can be found in [P.E. Valk, et al., eds. Positron Emission Tomography. Basic Science and Clinical Practice. 2003, Springer: Heidelberg]), based on the detection of small amounts of posi-tron-emitter-labelled biologic molecules. Various radiotracers are available for neuro-logical, cardiological, and oncological applications in the clinic and in research proto-cols. This overview describes the basic principles, technology, and recent develop-ments in PET, followed by a section on the development of a tomograph with ava-lanche photodiodes dedicated for small animal imaging as an example of efforts in the domain of high resolution tomographs.
Classification of JET Neutron and Gamma Emissivity Profiles
NASA Astrophysics Data System (ADS)
Craciunescu, T.; Murari, A.; Kiptily, V.; Vega, J.; Contributors, JET
2016-05-01
In thermonuclear plasmas, emission tomography uses integrated measurements along lines of sight (LOS) to determine the two-dimensional (2-D) spatial distribution of the volume emission intensity. Due to the availability of only a limited number of views and to the coarse sampling of the LOS, the tomographic inversion is a limited data set problem. Several techniques have been developed for tomographic reconstruction of the 2-D gamma and neutron emissivity on JET. In specific experimental conditions the availability of LOSs is restricted to a single view. In this case an explicit reconstruction of the emissivity profile is no longer possible. However, machine learning classification methods can be used to derive the type of the distribution. In the present approach the classification is developed using the theory of belief functions, which provides the support to fuse the results of independent clustering and supervised classification. The method makes it possible to represent the uncertainty of the results provided by different independent techniques, to combine them, and to manage possible conflicts.
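The fusion step in belief-function theory is commonly done with Dempster's rule of combination, which multiplies masses over intersecting focal elements and renormalizes away the conflicting mass. The two mass functions and the class labels below are invented for illustration; the paper's fusion of clustering and supervised classifiers may use a different combination rule.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions whose focal
    elements are frozensets; conflicting (empty-intersection) mass is
    removed and the remainder renormalized."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    k = 1.0 - conflict
    return {s: w / k for s, w in combined.items()}

# Two classifiers assign mass to hypothetical profile classes P and H,
# with some mass left on "either" to express their uncertainty
P, H = frozenset({"peaked"}), frozenset({"hollow"})
either = P | H
m1 = {P: 0.6, either: 0.4}            # e.g. a clustering result
m2 = {P: 0.5, H: 0.3, either: 0.2}    # e.g. a supervised classifier
fused = dempster_combine(m1, m2)
```

The renormalization constant 1 - k measures the conflict between the two sources, which is exactly the quantity one monitors when "managing possible conflicts".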
Bayesian Estimation of Fugitive Methane Point Source Emission Rates from a Single Downwind High-Frequency Gas Sensor
With the tremendous advances in onshore oil and gas exploration and production (E&P) capability comes the realization that new tools are needed to support env...
Kwak, Sehyun; Svensson, J; Brix, M; Ghim, Y-C
2016-02-01
A Bayesian model of the emission spectrum of the JET lithium beam has been developed to infer the intensity of the Li I (2p-2s) line radiation and associated uncertainties. The detected spectrum for each channel of the lithium beam emission spectroscopy system is here modelled by a single Li line modified by an instrumental function, Bremsstrahlung background, instrumental offset, and interference filter curve. Both the instrumental function and the interference filter curve are modelled with non-parametric Gaussian processes. All free parameters of the model, the intensities of the Li line, Bremsstrahlung background, and instrumental offset, are inferred using Bayesian probability theory with a Gaussian likelihood for photon statistics and electronic background noise. The prior distributions of the free parameters are chosen as Gaussians. Given these assumptions, the intensity of the Li line and corresponding uncertainties are analytically available using a Bayesian linear inversion technique. The proposed approach makes it possible to extract the intensity of Li line without doing a separate background subtraction through modulation of the Li beam.
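The analytic inference described above is a standard property of linear-Gaussian models: with a Gaussian prior and a Gaussian likelihood for a linear forward model, the posterior is Gaussian with closed-form mean and covariance. Below is a generic sketch of that Bayesian linear inversion; the forward matrix and noise levels are illustrative stand-ins, not the JET spectrometer model.

```python
# Minimal sketch of Bayesian linear inversion: for data d = G x + noise
# with Gaussian prior N(m0, C0) on x and Gaussian noise N(0, Cn), the
# posterior over x is Gaussian and available analytically.
import numpy as np

def gaussian_linear_inversion(G, d, prior_mean, prior_cov, noise_cov):
    """Return the posterior mean and covariance of x given d = G x + e."""
    C0_inv = np.linalg.inv(prior_cov)
    Cn_inv = np.linalg.inv(noise_cov)
    post_cov = np.linalg.inv(C0_inv + G.T @ Cn_inv @ G)
    post_mean = post_cov @ (C0_inv @ prior_mean + G.T @ Cn_inv @ d)
    return post_mean, post_cov

rng = np.random.default_rng(0)
G = rng.standard_normal((20, 3))         # stand-in forward model (e.g. filter response)
x_true = np.array([2.0, -1.0, 0.5])      # stand-in "line intensity" parameters
d = G @ x_true + 0.1 * rng.standard_normal(20)
mean, cov = gaussian_linear_inversion(
    G, d, np.zeros(3), 10.0 * np.eye(3), 0.01 * np.eye(20))
```

The diagonal of `cov` gives the uncertainties on the inferred intensities "for free", which is why the abstract can report uncertainties without extra sampling.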
Representation of photon limited data in emission tomography using origin ensembles
NASA Astrophysics Data System (ADS)
Sitek, A.
2008-06-01
Representation and reconstruction of data obtained by emission tomography scanners are challenging due to high noise levels in the data. Typically, images obtained using tomographic measurements are represented using grids. In this work, we define images as sets of origins of events detected during tomographic measurements; we call these origin ensembles (OEs). A state in the ensemble is characterized by a vector of 3N parameters Y, where the parameters are the coordinates of origins of detected events in a three-dimensional space and N is the number of detected events. The 3N-dimensional probability density function (PDF) for that ensemble is derived, and we present an algorithm for OE image estimation from tomographic measurements. A displayable image (e.g. a grid-based image) is derived from the OE formulation by calculating ensemble expectations based on the PDF using the Markov chain Monte Carlo method. The approach was applied to computer-simulated 3D list-mode positron emission tomography data. The reconstruction errors for a simulated 10 000 000-event acquisition ranged from 0.1% to 34.8%, depending on object size and sampling density. The method was also applied to experimental data, and the results of the OE method were consistent with those obtained by a standard maximum-likelihood approach. The method is a new approach to representation and reconstruction of data obtained by photon-limited emission tomography measurements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kramar, M.; Lin, H.; Tomczyk, S., E-mail: kramar@cua.edu, E-mail: lin@ifa.hawaii.edu, E-mail: tomczyk@ucar.edu
We present the first direct “observation” of the global-scale, 3D coronal magnetic fields of Carrington Rotation (CR) Cycle 2112 using vector tomographic inversion techniques. The vector tomographic inversion uses measurements of the Fe XIII 10747 Å Hanle effect polarization signals by the Coronal Multichannel Polarimeter (CoMP) and 3D coronal density and temperature derived from scalar tomographic inversion of Solar Terrestrial Relations Observatory (STEREO)/Extreme Ultraviolet Imager (EUVI) coronal emission lines (CELs) intensity images as inputs to derive a coronal magnetic field model that best reproduces the observed polarization signals. While independent verifications of the vector tomography results cannot be performed, we compared the tomography inverted coronal magnetic fields with those constructed by magnetohydrodynamic (MHD) simulations based on observed photospheric magnetic fields of CR 2112 and 2113. We found that the MHD model for CR 2112 is qualitatively consistent with the tomography inverted result for most of the reconstruction domain except for several regions. Particularly, for one of the most noticeable regions, we found that the MHD simulation for CR 2113 predicted a model that more closely resembles the vector tomography inverted magnetic fields. In another case, our tomographic reconstruction predicted an open magnetic field at a region where a coronal hole can be seen directly from a STEREO-B/EUVI image. We discuss the utilities and limitations of the tomographic inversion technique, and present ideas for future developments.
Recent Developments in Positron Emission Tomography (PET) Instrumentation
DOE R&D Accomplishments Database
Derenzo, S. E.; Budinger, T. F.
1986-04-01
This paper presents recent detector developments and perspectives for positron emission tomography (PET) instrumentation used for medical research, as well as the physical processes in positron annihilation, photon scattering and detection, tomograph design considerations, and the potentials for new advances in detectors.
The Tomographic Ionized-Carbon Mapping Experiment (TIME) CII Imaging Spectrometer
NASA Astrophysics Data System (ADS)
Staniszewski, Z.; Bock, J. J.; Bradford, C. M.; Brevik, J.; Cooray, A.; Gong, Y.; Hailey-Dunsheath, S.; O'Brient, R.; Santos, M.; Shirokoff, E.; Silva, M.; Zemcov, M.
2014-09-01
The Tomographic Ionized-Carbon Mapping Experiment (TIME) and TIME-Pilot are proposed imaging spectrometers to measure reionization and large-scale structure at redshifts 5-9. We seek to exploit the 158 μm rest-frame emission of [C II], which becomes measurable at 200-300 GHz at reionization redshifts. Here we describe the scientific motivation, give an overview of the proposed instrument, and highlight key technological developments underway to enable these measurements.
The auroral 6300 A emission - Observations and modeling
NASA Technical Reports Server (NTRS)
Solomon, Stanley C.; Hays, Paul B.; Abreu, Vincent J.
1988-01-01
A tomographic inversion is used to analyze measurements of the auroral atomic oxygen emission line at 6300 Å made by the Atmosphere Explorer Visible Airglow Experiment. A comparison is made between emission altitude profiles and the results from an electron transport and chemical reaction model. Measurements of the energetic electron flux, neutral composition, ion composition, and electron density are incorporated in the model.
5D-intravital tomography as a novel tool for non-invasive in-vivo analysis of human skin
NASA Astrophysics Data System (ADS)
König, Karsten; Weinigel, Martin; Breunig, Hans G.; Gregory, Axel; Fischer, Peter; Kellner-Höfer, Marcel; Bückle, Rainer; Schwarz, Martin; Riemann, Iris; Stracke, Frank; Huck, Volker; Gorzelanny, Christian; Schneider, Stefan W.
2010-02-01
Some years ago, CE-marked clinical multiphoton systems for 3D imaging of human skin with subcellular resolution were launched. These tomographs provide optical biopsies with submicron resolution based on two-photon excited autofluorescence (NAD(P)H, flavoproteins, keratin, elastin, melanin, porphyrins) and second harmonic generation by collagen. The 3D tomograph has now been extended into a 5D imaging system by the additional detection of the emission spectrum and the fluorescence lifetime, based on spatially and spectrally resolved time-resolved single photon counting. The novel 5D intravital tomograph (5D-IVT) was employed for the early detection of atopic dermatitis and the analysis of treatment effects.
Yamatsuji, Tomoki; Ishida, Naomasa; Takaoka, Munenori; Hayashi, Jiro; Yoshida, Kazuhiro; Shigemitsu, Kaori; Urakami, Atsushi; Haisa, Minoru; Naomoto, Yoshio
2017-01-01
Of 129 esophagectomies at our institute from June 2010 to March 2015, we encountered three preoperative positron emission tomography-computed tomography (PET/CT) false positives. Bone metastasis was originally suspected in two cases, but both were later found to be negative after a preoperative bone biopsy and clinical course observation. The remaining case, suspected of mediastinal lymph node metastasis, was diagnosed as inflammatory lymphadenopathy by pathological examination of the removed lymph nodes. PET/CT is useful when diagnosing esophageal cancer metastasis, but we need to be aware of the possibility of false positives. Therapeutic decisions should be made based on appropriate and accurate diagnoses, with pathological diagnosis actively introduced if necessary. PMID:28469502
Imaging of turbulent structures and tomographic reconstruction of TORPEX plasma emissivity
NASA Astrophysics Data System (ADS)
Iraji, D.; Furno, I.; Fasoli, A.; Theiler, C.
2010-12-01
In the TORPEX [A. Fasoli et al., Phys. Plasmas 13, 055902 (2006)], a simple magnetized plasma device, low frequency electrostatic fluctuations associated with interchange waves are routinely measured by means of extensive sets of Langmuir probes. To complement the electrostatic probe measurements of plasma turbulence and to study plasma structures smaller than the spatial resolution of the probe array, a nonperturbative direct imaging system has been developed on TORPEX, including a fast framing Photron-APX-RS camera and an image intensifier unit. From the line-integrated camera images, we compute the poloidal emissivity profile of the plasma by applying a tomographic reconstruction technique using a pixel method and solving an overdetermined set of equations by singular value decomposition. This allows comparing statistical, spectral, and spatial properties of visible light radiation with electrostatic fluctuations. The shape and position of the time-averaged reconstructed plasma emissivity are observed to be similar to those of the ion saturation current profile. In the core plasma, excluding the electron cyclotron and upper hybrid resonant layers, the mean value of the plasma emissivity is observed to vary with (T_e)^α (n_e)^β, where α = 0.25-0.7 and β = 0.8-1.4, in agreement with a collisional radiative model. The tomographic reconstruction is applied to the fast camera movie acquired at 50 kframes/s with 2 μs exposure time to obtain the temporal evolution of the emissivity fluctuations. Conditional average sampling is also applied to visualize and measure sizes of structures associated with the interchange mode. The ω-time and the two-dimensional k-space Fourier analysis of the reconstructed emissivity fluctuations show the same interchange mode that is detected in the ω and k spectra of the ion saturation current fluctuations measured by probes.
Small scale turbulent plasma structures can be detected and tracked in the reconstructed emissivity movies with the spatial resolution down to 2 cm, well beyond the spatial resolution of the probe array.
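The pixel-method inversion described above amounts to solving an overdetermined linear system by singular value decomposition. The sketch below shows that step in isolation: a stand-in geometry matrix of chord weights, a toy phantom, and a truncated-SVD pseudo-inverse; the grid size, chord count, and truncation threshold are all illustrative choices, not TORPEX values.

```python
# Sketch of SVD inversion of an overdetermined tomography system s = T e,
# where T holds the weight of each pixel along each viewing chord and
# e is the unknown emissivity. Geometry and phantom are illustrative.
import numpy as np

def svd_invert(T, s, rcond=1e-3):
    """Least-squares emissivity from chord signals via a truncated-SVD pseudo-inverse."""
    U, w, Vt = np.linalg.svd(T, full_matrices=False)
    # Damp tiny singular values, which amplify noise in ill-conditioned systems
    w_inv = np.where(w > rcond * w.max(), 1.0 / w, 0.0)
    return Vt.T @ (w_inv * (U.T @ s))

rng = np.random.default_rng(1)
n_pix, n_chords = 16, 64                  # overdetermined: more chords than pixels
T = rng.random((n_chords, n_pix))         # stand-in geometry (chord-weight) matrix
e_true = rng.random(n_pix)                # stand-in phantom emissivity
s = T @ e_true                            # noise-free line-integrated signals
e_rec = svd_invert(T, s)                  # recovers e_true in the noise-free case
```

With noisy camera data the `rcond` truncation level becomes the regularization knob: discarding small singular values trades spatial resolution for noise suppression.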
1987-01-01
... Single Words Using Positron Emission Tomographic Measurements of Cerebral Blood Flow Change
Petersen, Steven E.; Fox, Peter T.; Posner, Michael I.; Raichle, Marcus E.
McDonnell Center for Studies of Higher Brain Function
Language is an essential characteristic of the human
Bayesian Redshift Classification of Emission-line Galaxies with Photometric Equivalent Widths
NASA Astrophysics Data System (ADS)
Leung, Andrew S.; Acquaviva, Viviana; Gawiser, Eric; Ciardullo, Robin; Komatsu, Eiichiro; Malz, A. I.; Zeimann, Gregory R.; Bridge, Joanna S.; Drory, Niv; Feldmeier, John J.; Finkelstein, Steven L.; Gebhardt, Karl; Gronwall, Caryl; Hagen, Alex; Hill, Gary J.; Schneider, Donald P.
2017-07-01
We present a Bayesian approach to the redshift classification of emission-line galaxies when only a single emission line is detected spectroscopically. We consider the case of surveys for high-redshift Lyα-emitting galaxies (LAEs), which have traditionally been classified via an inferred rest-frame equivalent width (EW) W_Lyα greater than 20 Å. Our Bayesian method relies on known prior probabilities in measured emission-line luminosity functions and EW distributions for the galaxy populations, and returns the probability that an object in question is an LAE given the characteristics observed. This approach will be directly relevant for the Hobby-Eberly Telescope Dark Energy Experiment (HETDEX), which seeks to classify ~10^6 emission-line galaxies into LAEs and low-redshift [O II] emitters. For a simulated HETDEX catalog with realistic measurement noise, our Bayesian method recovers 86% of LAEs missed by the traditional W_Lyα > 20 Å cutoff over 2 < z < 3, outperforming the EW cut in both contamination and incompleteness. This is due to the method’s ability to trade off between the two types of binary classification error by adjusting the stringency of the probability requirement for classifying an observed object as an LAE. In our simulations of HETDEX, this method reduces the uncertainty in cosmological distance measurements by 14% with respect to the EW cut, equivalent to recovering 29% more cosmological information. Rather than using binary object labels, this method enables the use of classification probabilities in large-scale structure analyses. It can be applied to narrowband emission-line surveys as well as upcoming large spectroscopic surveys including Euclid and WFIRST.
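The two-class posterior at the heart of this approach can be sketched in a few lines. Everything numerical here is a toy stand-in: the priors and the exponential EW distributions are invented for illustration and do not reproduce the paper's measured luminosity functions.

```python
# Sketch of Bayesian two-class classification: the posterior probability
# that a detected line belongs to an LAE rather than a low-z [O II]
# emitter, from class priors and per-class likelihoods of the observed
# equivalent width. Priors and EW scales below are illustrative only.
import math

def p_lae(ew_obs, prior_lae=0.3, ew_scale_lae=60.0, ew_scale_oii=10.0):
    """P(LAE | EW) assuming exponential EW distributions for each class."""
    like_lae = math.exp(-ew_obs / ew_scale_lae) / ew_scale_lae
    like_oii = math.exp(-ew_obs / ew_scale_oii) / ew_scale_oii
    num = prior_lae * like_lae
    return num / (num + (1.0 - prior_lae) * like_oii)

# A large observed EW favours the LAE class, a small EW the [O II] class:
hi, lo = p_lae(50.0), p_lae(5.0)
```

Classifying on `p_lae(ew) > threshold` instead of a hard EW cut is what lets the method trade contamination against incompleteness by moving the threshold.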
NASA Astrophysics Data System (ADS)
Yatsishina, E. B.; Kovalchuk, M. V.; Loshak, M. D.; Vasilyev, S. V.; Vasilieva, O. A.; Dyuzheva, O. P.; Pojidaev, V. M.; Ushakov, V. L.
2018-05-01
Nine ancient Egyptian mummies (dated preliminarily to the period from the 1st mill. BCE to the first centuries CE) from the collection of the State Pushkin Museum of Fine Arts have been studied at the National Research Centre "Kurchatov Institute" (NRC KI) on the base of the complex of NBICS technologies. Tomographic scanning is performed using a magneto-resonance tomograph (3 T) and a hybrid positron emission tomography/computed tomography (PET-CT) scanner. Three-dimensional reconstructions of mummies and their anthropological measurements are carried out. Some medical conclusions are drawn based on the tomographic data. In addition, the embalming composition and tissue of one of the mummies are preliminarily analyzed.
Concurrent Ultrasonic Tomography and Acoustic Emission in Solid Materials
NASA Astrophysics Data System (ADS)
Chow, Thomas M.
A series of experiments were performed to detect stress induced changes in the elastic properties of various solid materials. A technique was developed where these changes were monitored concurrently by two methods, ultrasonic tomography and acoustic emission monitoring. This thesis discusses some experiments in which acoustic emission (AE) and ultrasonic tomography were performed on various samples of solid materials including rocks, concrete, metals, and fibre reinforced composites. Three separate techniques were used to induce stress in these samples. Disk shaped samples were subject to stress via diametral loading using an indirect tensile test geometry. Cylindrical samples of rocks and concrete were subject to hydraulic fracture tests, and rectangular samples of fibre reinforced composite were subject to direct tensile loading. The majority of the samples were elastically anisotropic. Full waveform acoustic emission and tomographic data were collected while these samples were under load to give information concerning changes in the structure of the material as it was undergoing stress change and/or failure. Analysis of this data indicates that AE and tomographic techniques mutually complement each other to give a view of the stress induced elastic changes in the tested samples.
STARBLADE: STar and Artefact Removal with a Bayesian Lightweight Algorithm from Diffuse Emission
NASA Astrophysics Data System (ADS)
Knollmüller, Jakob; Frank, Philipp; Ensslin, Torsten A.
2018-05-01
STARBLADE (STar and Artefact Removal with a Bayesian Lightweight Algorithm from Diffuse Emission) separates superimposed point-like sources from a diffuse background by imposing physically motivated models as prior knowledge. The algorithm can also be used on noisy and convolved data, though performing a proper reconstruction including a deconvolution prior to the application of the algorithm is advised; the algorithm could also be used within a denoising imaging method. STARBLADE learns the correlation structure of the diffuse emission and takes it into account to determine the occurrence and strength of a superimposed point source.
Downscaling Smooth Tomographic Models: Separating Intrinsic and Apparent Anisotropy
NASA Astrophysics Data System (ADS)
Bodin, Thomas; Capdeville, Yann; Romanowicz, Barbara
2016-04-01
In recent years, a number of tomographic models based on full waveform inversion have been published. Due to computational constraints, the fitted waveforms are low pass filtered, which results in an inability to map features smaller than half the shortest wavelength. However, these tomographic images are not a simple spatial average of the true model, but rather an effective, apparent, or equivalent model that provides a similar 'long-wave' data fit. For example, it can be shown that a series of horizontal isotropic layers will be seen by a 'long wave' as a smooth anisotropic medium. In this way, the observed anisotropy in tomographic models is a combination of intrinsic anisotropy produced by lattice-preferred orientation (LPO) of minerals, and apparent anisotropy resulting from the incapacity of mapping discontinuities. Interpreting the observed anisotropy (e.g. in terms of mantle flow) therefore requires separating its intrinsic and apparent components. The "up-scaling" relations that link elastic properties of a rapidly varying medium to elastic properties of the effective medium as seen by long waves are strongly non-linear and their inverse highly non-unique. That is, a smooth homogenized effective model is equivalent to a large number of models with discontinuities. In the 1D case, Capdeville et al (GJI, 2013) recently showed that a tomographic model which results from the inversion of low pass filtered waveforms is a homogenized model, i.e. the same as the model computed by upscaling the true model. Here we propose a stochastic method to sample the ensemble of layered models equivalent to a given tomographic profile. We use a transdimensional formulation where the number of layers is variable. Furthermore, each layer may be either isotropic (1 parameter) or intrinsically anisotropic (2 parameters). The parsimonious character of the Bayesian inversion gives preference to models with the fewest parameters (i.e. the fewest layers and the maximum number of isotropic layers). The non-uniqueness of the problem can be addressed by adding high frequency data such as receiver functions, able to map first order discontinuities. We show with synthetic tests that this method enables us to distinguish between intrinsic and apparent anisotropy in tomographic models, as layers with intrinsic anisotropy are only present when required by the data. A real data example is presented based on the latest global model produced at Berkeley.
NASA Astrophysics Data System (ADS)
Dymond, K.; Nicholas, A. C.; Budzien, S. A.; Stephan, A. W.; Coker, C.; Hei, M. A.; Groves, K. M.
2015-12-01
The Special Sensor Ultraviolet Limb Imager (SSULI) instruments are ultraviolet limb scanning sensors flying on the Defense Meteorological Satellite Program (DMSP) satellites. The SSULIs observe the 80-170 nanometer wavelength range covering emissions at 91 and 136 nm, which are produced by radiative recombination of the ionosphere. We invert these emissions tomographically using newly developed algorithms that include optical depth effects due to pure absorption and resonant scattering. We present the details of our approach including how the optimal altitude and along-track sampling were determined and the newly developed approach we are using for regularizing the SSULI tomographic inversions. Finally, we conclude with validations of the SSULI inversions against ALTAIR incoherent scatter radar measurements and demonstrate excellent agreement between the measurements.
NASA Astrophysics Data System (ADS)
Rathore, Kavita; Bhattacharjee, Sudeep; Munshi, Prabhat
2017-06-01
A tomographic method based on the Fourier transform is used for characterizing a microwave plasma in a multicusp (MC), in order to obtain 2D distribution of plasma emissions, plasma (electron) density (N_e) and temperature (T_e). The microwave plasma in the MC is characterized as a function of microwave power, gas pressure, and axial distance. The experimentally obtained 2D emission profiles show that the plasma emissions are generated in a circular ring shape. There are usually two bright rings, one at the plasma core and another near the boundary. The experimental results are validated using a numerical code that solves Maxwell's equations inside a waveguide filled with a plasma in a magnetic field, with collisions included. It is inferred that the dark and bright circular ring patterns are a result of superposition of Bessel modes (TE11 and TE21) of the wave electric field inside the plasma-filled MC, which are in reasonable agreement with the plasma emission profiles. The tomographically obtained N_e and T_e profiles indicate higher densities in the plasma core (~10^10 cm^-3) and enhanced electron temperature in the ECR region (~13 eV), which are in agreement with earlier results using a Langmuir probe and optical emission spectroscopy (OES) diagnostics.
Bayesian reconstruction and use of anatomical a priori information for emission tomography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bowsher, J.E.; Johnson, V.E.; Turkington, T.G.
1996-10-01
A Bayesian method is presented for simultaneously segmenting and reconstructing emission computed tomography (ECT) images and for incorporating high-resolution, anatomical information into those reconstructions. The anatomical information is often available from other imaging modalities such as computed tomography (CT) or magnetic resonance imaging (MRI). The Bayesian procedure models the ECT radiopharmaceutical distribution as consisting of regions, such that radiopharmaceutical activity is similar throughout each region. It estimates the number of regions, the mean activity of each region, and the region classification and mean activity of each voxel. Anatomical information is incorporated by assigning higher prior probabilities to ECT segmentations in which each ECT region stays within a single anatomical region. This approach is effective because anatomical tissue type often strongly influences radiopharmaceutical uptake. The Bayesian procedure is evaluated using physically acquired single-photon emission computed tomography (SPECT) projection data and MRI for the three-dimensional (3-D) Hoffman brain phantom. A clinically realistic count level is used. A cold lesion within the brain phantom is created during the SPECT scan but not during the MRI to demonstrate that the estimation procedure can detect ECT structure that is not present anatomically.
Functional-Lesion Investigation of Developmental Stuttering with Positron Emission Tomography.
ERIC Educational Resources Information Center
Ingham, Roger J.; And Others
1996-01-01
Analysis of use of positron emission tomographic measurements of resting-state regional cerebral blood flow in 29 men, 10 of whom stuttered, did not support the idea that developmental stuttering is associated with abnormalities of blood flow at rest. Findings did suggest an essentially normal functional brain terrain with a small number of minor…
Bayesian estimation of optical properties of the human head via 3D structural MRI
NASA Astrophysics Data System (ADS)
Barnett, Alexander H.; Culver, Joseph P.; Sorensen, A. Gregory; Dale, Anders M.; Boas, David A.
2003-10-01
Knowledge of the baseline optical properties of the tissues of the human head is essential for absolute cerebral oximetry, and for quantitative studies of brain activation. In this work we numerically model the utility of signals from a small 6-optode time-resolved diffuse optical tomographic apparatus for inferring baseline scattering and absorption coefficients of the scalp, skull and brain, when complete geometric information is available from magnetic resonance imaging (MRI). We use an optical model where MRI-segmented tissues are assumed homogeneous. We introduce a noise model capturing both photon shot noise and forward model numerical accuracy, and use Bayesian inference to predict error bars and correlations on the measurements. We also sample from the full posterior distribution using Markov chain Monte Carlo. We conclude that ~10^6 detected photons are sufficient to measure the brain's scattering and absorption to a few percent. We present preliminary results using a fast multi-layer slab model, comparing the case when layer thicknesses are known versus unknown.
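Sampling a posterior with Markov chain Monte Carlo, as done above, can be sketched with the basic Metropolis algorithm. The 1-D log-posterior below is a toy Gaussian stand-in for the diffusion forward model, and the target value 0.1 is an invented absorption coefficient, not a value from the study.

```python
# Sketch of the Metropolis algorithm: random-walk proposals accepted with
# probability min(1, posterior ratio), yielding samples from the posterior.
import math
import random

def metropolis(log_post, x0, n_steps, step=0.5, seed=0):
    """Draw n_steps correlated samples from exp(log_post) via random-walk Metropolis."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n_steps):
        xp = x + rng.gauss(0.0, step)
        lpp = log_post(xp)
        if math.log(rng.random()) < lpp - lp:  # accept with min(1, ratio)
            x, lp = xp, lpp
        samples.append(x)                       # rejected proposals repeat x
    return samples

# Toy Gaussian posterior centred on an assumed coefficient value of 0.1:
log_post = lambda x: -0.5 * ((x - 0.1) / 0.02) ** 2
samples = metropolis(log_post, x0=0.0, n_steps=20000, step=0.05)
burn = samples[5000:]                           # discard burn-in
mean = sum(burn) / len(burn)                    # posterior-mean estimate
```

The same machinery scales to the multi-parameter scalp/skull/brain problem; only the log-posterior (and a vector-valued proposal) changes.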
Yoon, Seo Yeon; Kim, Je-Kyung; An, Young-Sil; Kim, Yong Wook
2015-01-01
Aphasia is one of the most common neurologic deficits occurring after stroke. Although speech-language therapy is the mainstream option for poststroke aphasia, pharmacotherapy is recently being tried to modulate different neurotransmitter systems. However, the efficacy of those treatments is still controversial. We present a case of a 53-year-old female patient with Wernicke aphasia, after an old infarction in the territory of the left middle cerebral artery 8 years earlier and a recent infarction in the right middle cerebral artery 4 months earlier. On the initial evaluation, the Aphasia Quotient in the Korean version of the Western Aphasia Battery was 25.6 of 100. Baseline brain F-18 fluorodeoxyglucose positron emission tomographic images demonstrated a decreased cerebral metabolism in the left temporoparietal area and right temporal lobe. Donepezil hydrochloride, a reversible acetylcholinesterase inhibitor, was orally administered at 5 mg/d for 6 weeks after the initial evaluation and was increased to 10 mg/d for the following 6 weeks. After the donepezil treatment, the patient showed improvement in language function, scoring 51.0 of 100 on the Aphasia Quotient. A subtraction analysis of the brain F-18 fluorodeoxyglucose positron emission tomographic images after donepezil medication demonstrated increased uptake in both middle temporal gyri, extended to the occipital area and the left cerebellum. Thus, we suggest that donepezil can be an effective therapeutic choice for the treatment of Wernicke aphasia.
Reducing uncertainties in decadal variability of the global carbon budget with multiple datasets
Li, Wei; Ciais, Philippe; Wang, Yilong; Peng, Shushi; Broquet, Grégoire; Ballantyne, Ashley P.; Canadell, Josep G.; Cooper, Leila; Friedlingstein, Pierre; Le Quéré, Corinne; Myneni, Ranga B.; Peters, Glen P.; Piao, Shilong; Pongratz, Julia
2016-01-01
Conventional calculations of the global carbon budget infer the land sink as a residual between emissions, atmospheric accumulation, and the ocean sink. Thus, the land sink accumulates the errors from the other flux terms and bears the largest uncertainty. Here, we present a Bayesian fusion approach that combines multiple observations in different carbon reservoirs to optimize the land (B) and ocean (O) carbon sinks, land use change emissions (L), and indirectly fossil fuel emissions (F) from 1980 to 2014. Compared with the conventional approach, Bayesian optimization decreases the uncertainties in B by 41% and in O by 46%. The L uncertainty decreases by 47%, whereas F uncertainty is marginally improved through the knowledge of natural fluxes. Both ocean and net land uptake (B + L) rates have positive trends of 29 ± 8 and 37 ± 17 Tg C y⁻² since 1980, respectively. Our Bayesian fusion of multiple observations reduces uncertainties, thereby allowing us to isolate important variability in global carbon cycle processes. PMID:27799533
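Why fusing independent observations shrinks uncertainty can be seen in the simplest Gaussian case: the inverse-variance weighted combination of two estimates has a variance smaller than either input. The numbers below are illustrative stand-ins, not the paper's carbon-budget values.

```python
# Sketch of precision-weighted (inverse-variance) fusion of two
# independent Gaussian estimates of the same quantity. The fused
# variance 1/(1/v1 + 1/v2) is always below min(v1, v2).
def fuse(mu1, var1, mu2, var2):
    """Inverse-variance weighted combination of two Gaussian estimates."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    var = 1.0 / (w1 + w2)
    mu = var * (w1 * mu1 + w2 * mu2)
    return mu, var

# e.g. a residual-budget land-sink estimate fused with an independent
# constraint (values invented for illustration):
mu, var = fuse(2.0, 0.8 ** 2, 2.6, 1.0 ** 2)
```

The fused mean lands between the two inputs, pulled toward the more precise one, which is the mechanism behind the 41-47% uncertainty reductions quoted above.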
NASA Astrophysics Data System (ADS)
Zhou, X.; Albertson, J. D.
2016-12-01
Natural gas is considered a bridge fuel towards clean energy due to its potentially lower greenhouse gas emissions compared with other fossil fuels. Despite numerous efforts, an efficient and cost-effective approach to monitoring fugitive methane emissions along the natural gas production-supply chain has not yet been developed. Recently, mobile methane measurement has been introduced, which applies a Bayesian approach to probabilistically infer methane emission rates and update estimates recursively as new measurements become available. However, the likelihood function, especially the error term that determines the shape of the estimate uncertainty, has not been rigorously defined and evaluated with field data. To address this issue, we performed a series of near-source (< 30 m) controlled methane release experiments using a specialized vehicle mounted with fast-response methane analyzers and a GPS unit. Methane concentrations were measured at two different heights along mobile traversals downwind of the sources, and concurrent wind and temperature data were recorded by nearby 3-D sonic anemometers. With known methane release rates, the measurements were used to determine the functional form and the parameterization of the likelihood function in the Bayesian inference scheme under different meteorological conditions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rathore, Kavita, E-mail: kavira@iitk.ac.in; Munshi, Prabhat, E-mail: pmunshi@iitk.ac.in; Bhattacharjee, Sudeep, E-mail: sudeepb@iitk.ac.in
A new non-invasive diagnostic system is developed for a Microwave Induced Plasma (MIP) to reconstruct tomographic images of a 2D emission profile. Compact MIP systems have wide application in industry as well as in research, for example as thrusters for space propulsion, sources of high-current ion beams, and sources of negative ions for heating of fusion plasmas. The emission profile depends on two crucial parameters, namely the electron temperature and density (over the entire spatial extent) of the plasma system. Emission tomography provides a basic understanding of plasmas and is very useful for monitoring the internal structure of plasma phenomena without disturbing the actual processes. This paper presents the development of a compact, modular, and versatile Optical Emission Tomography (OET) tool for a cylindrical, magnetically confined MIP system. It has eight slit-hole cameras, each consisting of a complementary metal-oxide-semiconductor linear image sensor for light detection. Optical noise is reduced by using an aspheric lens and interference band-pass filters in each camera. The entire cylindrical plasma can be scanned with an automated sliding-ring mechanism arranged in a fan-beam data collection geometry. The camera design includes the possibility of incorporating different filters to select particular wavelengths of light from the plasma. This OET system includes band-pass filters for the argon emission lines at 750 nm, 772 nm, and 811 nm and for the hydrogen emission Hα (656 nm) and Hβ (486 nm) lines. A convolution back projection algorithm is used to obtain the tomographic images of the plasma emission lines. The paper mainly focuses on (a) the design of the OET system in detail and (b) a study of the emission profile for the 750 nm argon emission line to validate the system design.
Singh, Parmanand; Emami, Hamed; Subramanian, Sharath; Maurovich-Horvat, Pal; Marincheva-Savcheva, Gergana; Medina, Hector M; Abdelbaky, Amr; Alon, Achilles; Shankar, Sudha S; Rudd, James H F; Fayad, Zahi A; Hoffmann, Udo; Tawakol, Ahmed
2016-12-01
Nonobstructive coronary plaques manifesting high-risk morphology (HRM) are associated with an increased risk of adverse clinical cardiovascular events. We sought to test the hypothesis that statins have a greater anti-inflammatory effect within coronary plaques containing HRM. In this prospective multicenter study, 55 subjects with or at high risk for atherosclerosis underwent 18F-fluorodeoxyglucose positron emission tomographic/computed tomographic imaging at baseline and after 12 weeks of treatment with atorvastatin. Coronary arterial inflammation (18F-fluorodeoxyglucose uptake, expressed as target-to-background ratio) was assessed in the left main coronary artery (LMCA). While blinded to the PET findings, contrast-enhanced computed tomographic angiography was performed to characterize the presence of HRM (defined as noncalcified or partially calcified plaques) in the LMCA. Arterial inflammation (target-to-background ratio) was higher in LMCA segments with HRM than in those without HRM (mean±SEM: 1.95±0.43 versus 1.67±0.32 for LMCA with versus without HRM, respectively; P=0.04). Moreover, atorvastatin treatment for 12 weeks reduced the target-to-background ratio more in LMCA segments with HRM than in those without HRM (12-week minus baseline Δtarget-to-background ratio [95% confidence interval]: -0.18 [-0.35 to -0.004] versus 0.09 [-0.06 to 0.26]; P=0.02). Furthermore, this relationship between coronary plaque morphology and change in LMCA inflammatory activity remained significant after adjusting for baseline low-density lipoprotein and statin dose (β=-0.27; P=0.038). In this first study to evaluate the impact of statins on coronary inflammation, we observed that the anti-inflammatory impact of statins is substantially greater within coronary plaques that contain HRM features. These findings suggest an additional mechanism by which statins disproportionately benefit individuals with more advanced atherosclerotic disease. URL: http://www.clinicaltrials.gov.
Unique identifier: NCT00703261. © 2016 The Authors.
Overview of the Benzene and Other Toxics Exposure (BEE-TEX) Field Study
Olaguer, Eduardo P.
2015-01-01
The Benzene and other Toxics Exposure (BEE-TEX) field study was an experimental campaign designed to demonstrate novel methods for measuring ambient concentrations of hazardous air pollutants (HAPs) in real time and to attribute these concentrations to quantified releases from specific emission points in industrial facilities while operating outside facility fence lines. BEE-TEX was conducted in February 2015 at three neighboring communities in the Houston Ship Channel of Texas, where a large number of petrochemical facilities are concentrated. The novel technologies deployed during BEE-TEX included: (1) tomographic remote sensing based on differential optical absorption spectroscopy; (2) real-time broadcasting of ambient air monitoring data over the World Wide Web; (3) real-time source attribution and quantification of HAP emissions based on either tomographic or mobile measurement platforms; and (4) the use of cultured human lung cells in vitro as portable indicators of HAP exposure. PMID:26549972
Feasibility of hydrogen density estimation from tomographic sensing of Lyman alpha emission
NASA Astrophysics Data System (ADS)
Waldrop, L.; Kamalabadi, F.; Ren, D.
2015-12-01
In this work, we describe the scientific motivation, basic principles, and feasibility of a new approach to the estimation of neutral hydrogen (H) density in the terrestrial exosphere based on the 3-D tomographic sensing of optically thin H emission at 121.6 nm (Lyman alpha). In contrast to existing techniques, Lyman alpha tomography allows for model-independent reconstruction of the underlying H distribution in support of investigations regarding the origin and time-dependent evolution of exospheric structure. We quantitatively describe the trade-off space between the measurement sampling rate, viewing geometry, and the spatial and temporal resolution of the reconstruction that is supported by the data. We demonstrate that this approach is feasible from either earth-orbiting satellites such as the stereoscopic NASA TWINS mission or from a CubeSat platform along a trans-exosphere trajectory such as that enabled by the upcoming Exploration Mission 1 launch.
Ultrasoft x-ray imaging system for the National Spherical Torus Experiment
NASA Astrophysics Data System (ADS)
Stutman, D.; Finkenthal, M.; Soukhanovskii, V.; May, M. J.; Moos, H. W.; Kaita, R.
1999-01-01
A spectrally resolved ultrasoft x-ray imaging system, consisting of arrays of high resolution (<2 Å) and throughput (⩾tens of kHz) miniature monochromators, and based on multilayer mirrors and absolute photodiodes, is being designed for the National Spherical Torus Experiment. Initially, three poloidal arrays of diodes filtered for C 1s-np emission will be implemented for fast tomographic imaging of the colder start-up plasmas. Later on, mirrors tuned to the C Lyα emission will be added in order to enable the arrays to "see" the periphery through the hot core and to study magnetohydrodynamic activity and impurity transport in this region. We also discuss possible core diagnostics, based on tomographic imaging of the Lyα emission from the plume of recombined, low Z impurity ions left by neutral beams or fueling pellets. The arrays can also be used for radiated power measurements and to map the distribution of high Z impurities injected for transport studies. The performance of the proposed system is illustrated with results from test channels on the CDX-U spherical torus at Princeton Plasma Physics Laboratory.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bielecki, J.; Scholz, M.; Drozdowicz, K.
A method for tomographic reconstruction of the neutron emissivity in the poloidal cross section of the Joint European Torus (JET, Culham, UK) tokamak was developed. Owing to the very limited data set (two projection angles, only 19 lines of sight) provided by the neutron emission profile monitor (KN3 neutron camera), the reconstruction is an ill-posed inverse problem. This work aims to contribute to the development of reliable plasma tomography reconstruction methods that could be routinely used at the JET tokamak. The proposed method is based on Phillips-Tikhonov regularization and incorporates a priori knowledge of the shape of the normalized neutron emissivity profile. For the purpose of optimal selection of the regularization parameters, the shape of the normalized neutron emissivity profile is approximated by the shape of the normalized electron density profile measured by the LIDAR or high-resolution Thomson scattering JET diagnostics. In contrast with some previously developed methods for the ill-posed plasma tomography reconstruction problem, the developed algorithms do not include any post-processing of the obtained solution, and the physical constraints on the solution are imposed during the regularization process. The accuracy of the method is first evaluated by several tests with synthetic data based on various plasma neutron emissivity models (phantoms). The method is then applied to the neutron emissivity reconstruction for JET D plasma discharge #85100. It is demonstrated that the method shows good performance and reliability and can be routinely used for plasma neutron emissivity reconstruction at JET.
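The Phillips-Tikhonov regularization named above can be illustrated on a toy few-view problem. The dimensions, the random projection matrix, and the second-difference smoothness operator below are hypothetical stand-ins; the actual JET algorithm additionally constrains the profile shape with measured electron density profiles:

```python
import numpy as np

# Toy Phillips-Tikhonov regularization: solve min ||A f - g||^2 + lam ||L f||^2
# where L is a second-difference (smoothness) operator, stabilizing an
# under-determined few-view reconstruction like the 19-chord problem above.
n = 50
x = np.linspace(0.0, 1.0, n)
f_true = np.exp(-((x - 0.5) / 0.15) ** 2)        # peaked "emissivity" profile

rng = np.random.default_rng(1)
A = rng.uniform(0.0, 1.0, (19, n))               # 19 hypothetical lines of sight
g = A @ f_true + rng.normal(0.0, 0.01, 19)       # noisy line-integral data

L = np.diff(np.eye(n), n=2, axis=0)              # second-difference operator

lam = 1e-2                                       # regularization strength
f_rec = np.linalg.solve(A.T @ A + lam * (L.T @ L), A.T @ g)

rel_err = np.linalg.norm(f_rec - f_true) / np.linalg.norm(f_true)
```

Because the normal equations are augmented with `lam * L.T @ L`, the solve is well-posed even though 19 measurements cannot determine 50 unknowns on their own.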
Photoacoustic imaging of fluorophores using pump-probe excitation
Märk, Julia; Schmitt, Franz-Josef; Theiss, Christoph; Dortay, Hakan; Friedrich, Thomas; Laufer, Jan
2015-01-01
A pump-probe technique for the detection of fluorophores in tomographic PA images is introduced. It is based on inducing stimulated emission in fluorescent molecules, which in turn modulates the amount of thermalized energy, and hence the PA signal amplitude. A theoretical model of the PA signal generation in fluorophores is presented and experimentally validated on cuvette measurements made in solutions of Rhodamine 6G, a fluorophore of known optical and molecular properties. The application of this technique to deep tissue tomographic PA imaging is demonstrated by determining the spatial distribution of a near-infrared fluorophore in a tissue phantom. PMID:26203378
DOE Office of Scientific and Technical Information (OSTI.GOV)
Laughlin, J.S.; Benua, R.S.; Tilbury, R.S.
1978-09-30
Progress is reported on biomedical studies using cyclotron-produced ¹⁸F, ¹⁵O, ¹¹C, ¹³N, ⁵²Fe, ³⁸K, ²⁰⁶Bi, ⁷³Se, ⁵³Co, and ⁴³K. The following research projects are described: tumor detection and diagnosis; neurological studies; radiopharmaceutical development; ³⁸K as an indicator of blood flow to the myocardium; dosimetry for internally deposited isotopes in animals and man; cyclotron development; positron tomographic imaging with the TOKIM System; and a review of positron emission transaxial tomograph instruments. (HLW)
Tomographic capabilities of the new GEM based SXR diagnostic of WEST
NASA Astrophysics Data System (ADS)
Jardin, A.; Mazon, D.; O'Mullane, M.; Mlynar, J.; Loffelmann, V.; Imrisek, M.; Chernyshova, M.; Czarski, T.; Kasprowicz, G.; Wojenski, A.; Bourdelle, C.; Malard, P.
2016-07-01
The tokamak WEST (Tungsten Environment in Steady-State Tokamak) will start operating by the end of 2016 as a test bed for the ITER divertor components in long-pulse operation. In this context, radiative cooling by heavy impurities like tungsten (W) in the soft X-ray (SXR) range [0.1 keV; 20 keV] is a critical issue for plasma core performance. Reliable tools are thus required to monitor the local impurity density and avoid W accumulation. The WEST SXR diagnostic will be equipped with two new GEM (Gas Electron Multiplier) based poloidal cameras allowing 2D tomographic reconstructions to be performed in tunable energy bands. In this paper, the tomographic capabilities of the Minimum Fisher Information (MFI) algorithm developed for Tore Supra and upgraded for WEST are investigated, in particular through a set of emissivity phantoms and the standard WEST scenario, including reconstruction errors, the influence of noise, and computational time.
A novel Bayesian approach to acoustic emission data analysis.
Agletdinov, E; Pomponi, E; Merson, D; Vinogradov, A
2016-12-01
The acoustic emission (AE) technique is a popular tool for materials characterization and non-destructive testing. Originating from the stochastic motion of defects in solids, AE is a random process by nature. A challenging problem arises whenever an attempt is made to identify specific points corresponding to changes in the trends of the fluctuating AE time series. A general Bayesian framework is proposed for the analysis of AE time series, aimed at automatically finding the breakpoints that signal a crossover in the dynamics of the underlying AE sources. Copyright © 2016 Elsevier B.V. All rights reserved.
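A minimal sketch of the kind of breakpoint search such a framework automates, on synthetic data: a single mean shift with known variance, scored by a profiled log-likelihood over candidate split points. This is a deliberate simplification of the paper's full Bayesian treatment:

```python
import numpy as np

# Single-breakpoint search: for each candidate split t, score the hypothesis
# that segments [0, t) and [t, n) have different means (unit variance known).
rng = np.random.default_rng(2)
series = np.concatenate([rng.normal(0.0, 1.0, 200),
                         rng.normal(1.5, 1.0, 300)])   # true breakpoint at 200

def split_log_score(y, t):
    """Profiled log-likelihood of a mean shift at index t."""
    a, b = y[:t], y[t:]
    return -0.5 * (np.sum((a - a.mean()) ** 2) + np.sum((b - b.mean()) ** 2))

n = len(series)
cands = np.arange(10, n - 10)            # avoid degenerate tiny segments
scores = np.array([split_log_score(series, t) for t in cands])
post = np.exp(scores - scores.max())
post /= post.sum()                       # normalized score over breakpoint location
t_hat = cands[np.argmax(post)]           # peaks near the true crossover
```

The normalized score plays the role of a posterior over the breakpoint location (under a flat prior), so one obtains not just a point estimate but a spread that reflects how sharply the data pin down the crossover.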
Lithographed spectrometers for tomographic line mapping of the Epoch of Reionization
NASA Astrophysics Data System (ADS)
O'Brient, R.; Bock, J. J.; Bradford, C. M.; Crites, A.; Duan, R.; Hailey-Dunsheath, S.; Hunacek, J.; LeDuc, R.; Shirokoff, E.; Staniszewski, Z.; Turner, A.; Zemcov, M.
2014-08-01
The Tomographic Ionized carbon Mapping Experiment (TIME) is a multi-phased experiment that will tomographically map [CII] emission from the Epoch of Reionization. We are developing lithographed spectrometers that couple to TES bolometers in anticipation of the second-generation instrument. Our design intentionally mirrors many features of the parallel SuperSpec project, inductively coupling power from a trunk-line microstrip onto half-wave resonators. The resonators couple to rat-race hybrids that feed the TES bolometers. Our 25-channel prototype shows spectrally positioned lines roughly matching the design, with a receiver optical efficiency of 15-20%, a level that is dominated by loss in components outside the spectrometer.
Dynamic Positron Emission Tomography [PET] in Man Using Small Bismuth Germanate Crystals
DOE R&D Accomplishments Database
Derenzo, S. E.; Budinger, T. F.; Huesman, R. H.; Cahoon, J. L.
1982-04-01
Primary considerations for the design of positron emission tomographs for medical studies in humans are the need for high imaging sensitivity, whole organ coverage, good spatial resolution, high maximum data rates, adequate spatial sampling with minimum mechanical motion, shielding against out of plane activity, pulse height discrimination against scattered photons, and timing discrimination against accidental coincidences. We discuss the choice of detectors, sampling motion, shielding, and electronics to meet these objectives.
A 3D tomographic reconstruction method to analyze Jupiter's electron-belt emission observations
NASA Astrophysics Data System (ADS)
Santos-Costa, Daniel; Girard, Julien; Tasse, Cyril; Zarka, Philippe; Kita, Hajime; Tsuchiya, Fuminori; Misawa, Hiroaki; Clark, George; Bagenal, Fran; Imai, Masafumi; Becker, Heidi N.; Janssen, Michael A.; Bolton, Scott J.; Levin, Steve M.; Connerney, John E. P.
2017-04-01
Multi-dimensional reconstruction techniques for Jupiter's synchrotron radiation from radio-interferometric observations were first developed by Sault et al. [Astron. Astrophys., 324, 1190-1196, 1997]. The tomographic-like technique introduced 20 years ago permitted the first 3-dimensional mapping of the brightness distribution around the planet. This technique has the advantage of being only weakly dependent on planetary field models. It also does not require any knowledge of the energy and spatial distributions of the radiating electrons. On the downside, it assumes that the volume emissivity of any point source around the planet is isotropic. This assumption becomes incorrect when mapping the brightness distribution for non-equatorial point sources or for any point sources from Juno's perspective. In this paper, we present our modeling effort to bypass the isotropy issue. Our approach is to use radio-interferometric observations and determine the 3-D brightness distribution in a cylindrical coordinate system. For each set (z, r), we constrain the longitudinal distribution with a Fourier series, and the anisotropy is addressed with a simple periodic function when possible. We develop this new method over a wide range of frequencies using past VLA and LOFAR observations of Jupiter. We plan to test this reconstruction method with observations of Jupiter that are currently being carried out with LOFAR and GMRT in support of the Juno mission. We describe how this new 3D tomographic reconstruction method provides new model constraints on the energy and spatial distributions of Jupiter's ultra-relativistic electrons close to the planet and can be used to interpret Juno MWR observations of Jupiter's electron-belt emission and to assist in evaluating the background noise from the radiation environment in the atmospheric measurements.
Bayesian ionospheric multi-instrument 3D tomography
NASA Astrophysics Data System (ADS)
Norberg, Johannes; Vierinen, Juha; Roininen, Lassi
2017-04-01
The tomographic reconstruction of ionospheric electron densities is an inverse problem that cannot be solved without relatively strong regularising additional information. The vertical electron density profile, especially, is determined predominantly by the regularisation. Despite its crucial role, the regularisation is often hidden in the algorithm as a numerical procedure without physical understanding. The Bayesian methodology provides an interpretable approach to the problem, as the regularisation can be given as a physically meaningful and quantifiable prior probability distribution. The prior distribution can be based on ionospheric physics and on other available ionospheric measurements and their statistics. Updating the prior with measurements results in the posterior distribution, which carries all the available information combined. From the posterior distribution, the most probable state of the ionosphere can then be solved together with the corresponding probability intervals. Altogether, the Bayesian methodology provides an understanding of how strong the given regularisation is, what information is gained from the measurements, and how reliable the final result is. In addition, the combination of different measurements and the temporal development can be taken into account in a very intuitive way. However, a direct implementation of the Bayesian approach requires the inversion of large covariance matrices, which is computationally infeasible. In the presented method, Gaussian Markov random fields are used to form sparse matrix approximations of the covariances. This approach makes the problem computationally feasible while retaining the probabilistic and physical interpretation. Here, the Bayesian method with Gaussian Markov random fields is applied to ionospheric 3D tomography over Northern Europe. Multi-instrument measurements are utilised from the TomoScand receiver network for low Earth orbit beacon satellite signals, from GNSS receiver networks, and from EISCAT ionosondes and incoherent scatter radars. The performance is demonstrated in the three-dimensional spatial domain, with temporal development also taken into account.
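The key computational trick here, replacing dense covariance inversion with sparse GMRF precision matrices, can be sketched on a 1-D linear-Gaussian toy problem. All dimensions, the second-difference precision, and the random observation matrix are hypothetical stand-ins for the real 3-D geometry:

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Linear-Gaussian tomography with a GMRF prior: y = H x + e.
# The prior precision Q is sparse (Markov property: second differences),
# so the posterior precision Q + H^T H / s2 stays sparse and cheap to solve,
# avoiding any dense covariance inversion.
n = 400
main = 2.0 * np.ones(n); main[0] = main[-1] = 1.0
Q = sp.diags([main, -np.ones(n - 1), -np.ones(n - 1)], [0, -1, 1], format="csc")
Q = Q + 1e-3 * sp.identity(n)                   # make the prior proper

rng = np.random.default_rng(3)
m = 80                                          # sparse "ray" measurements
H = sp.random(m, n, density=0.05, random_state=3, format="csc")
x_true = np.sin(np.linspace(0.0, 3.0 * np.pi, n))
s2 = 0.01                                       # measurement noise variance
y = H @ x_true + rng.normal(0.0, np.sqrt(s2), m)

post_prec = (Q + (H.T @ H) / s2).tocsc()        # sparse posterior precision
x_mean = spla.spsolve(post_prec, (H.T @ y) / s2)  # posterior mean via sparse solve
```

The posterior mean comes from one sparse linear solve; posterior variances, if needed, follow from selected entries of the inverse of `post_prec` rather than a full dense inverse.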
NASA Astrophysics Data System (ADS)
Tugendhat, Tim M.; Schäfer, Björn Malte
2018-05-01
We investigate a physical, composite alignment model for both spiral and elliptical galaxies and its impact on cosmological parameter estimation from weak lensing for a tomographic survey. Ellipticity correlation functions and angular ellipticity spectra for spiral and elliptical galaxies are derived on the basis of tidal interactions with the cosmic large-scale structure and compared to the tomographic weak-lensing signal. We find that elliptical galaxies contribute to the weak-lensing dominated ellipticity correlation on intermediate angular scales between ℓ ≃ 40 and ℓ ≃ 400, before the contribution of spiral galaxies dominates on higher multipoles. The predominant term on intermediate scales is the negative cross-correlation between intrinsic alignments and weak gravitational lensing (GI-alignment). We simulate parameter inference from weak gravitational lensing with intrinsic alignments unaccounted for; the bias induced by ignoring intrinsic alignments in a survey like Euclid is shown to be several times larger than the statistical error and can lead to faulty conclusions when comparing to other observations. The biases generally point in different directions in parameter space, such that in some cases one can observe a partial cancellation effect. Furthermore, it is shown that the biases increase with the number of tomographic bins used for the parameter estimation process. We quantify this parameter estimation bias in units of the statistical error and compute the loss of Bayesian evidence for a model due to the presence of systematic errors, as well as the Kullback-Leibler divergence to quantify the distance between the true model and the wrongly inferred one.
On the regularization for nonlinear tomographic absorption spectroscopy
NASA Astrophysics Data System (ADS)
Dai, Jinghang; Yu, Tao; Xu, Lijun; Cai, Weiwei
2018-02-01
Tomographic absorption spectroscopy (TAS) has attracted increased research effort recently due to developments in both hardware and new imaging concepts such as nonlinear tomography and compressed sensing. Nonlinear TAS is one of the emerging modalities based on the concept of nonlinear tomography and has been successfully demonstrated both numerically and experimentally. However, all previous demonstrations were realized using only two orthogonal projections, simply for ease of implementation. In this work, we examine the performance of nonlinear TAS using other beam arrangements and test the effectiveness of the beam optimization technique that was developed for linear TAS. In addition, so far only a smoothness prior has been adopted in nonlinear TAS. There are, however, other useful priors, such as sparseness and model-based priors, which have not yet been investigated. This work aims to show how these priors can be implemented and included in the reconstruction process. Regularization through a Bayesian formulation is introduced specifically for this purpose, and a method for determining a proper regularization factor is proposed. The comparative studies performed with different beam arrangements and regularization schemes on a few representative phantoms suggest that the beam optimization method developed for linear TAS also works for its nonlinear counterpart, and that the regularization scheme should be selected according to the a priori information available in the specific application scenario so as to achieve the best reconstruction fidelity. Though this work is conducted in the context of nonlinear TAS, it can also provide useful insights for other tomographic modalities.
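One common way to determine a regularization factor, the Morozov discrepancy principle, can be sketched as follows. This is not necessarily the method proposed in the paper, and it is shown on a linear toy problem even though the paper's modality is nonlinear; all dimensions and noise levels are hypothetical:

```python
import numpy as np

# Discrepancy-principle selection of the regularization factor: scan lambda
# and keep the value whose data residual best matches the expected noise norm.
rng = np.random.default_rng(4)
n, m, sigma = 60, 30, 0.05
A = rng.normal(0.0, 1.0, (m, n)) / np.sqrt(n)      # under-determined projections
x_true = np.exp(-(np.linspace(-1, 1, n) / 0.3) ** 2)
b = A @ x_true + rng.normal(0.0, sigma, m)

def solve(lam):
    """Ridge-regularized least squares for a given lambda."""
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

target = sigma * np.sqrt(m)                        # expected residual norm
lams = np.logspace(-6, 1, 50)
resids = np.array([np.linalg.norm(A @ solve(l) - b) for l in lams])
lam_star = lams[np.argmin(np.abs(resids - target))]
x_rec = solve(lam_star)
```

The intuition: too small a lambda fits the noise (residual below the noise level), too large a lambda over-smooths (residual above it); the chosen lambda leaves a residual consistent with the known noise statistics.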
Bayesian Analysis of a Reduced-Form Air Quality Model
Numerical air quality models are being used for assessing emission control strategies for improving ambient pollution levels across the globe. This paper applies probabilistic modeling to evaluate the effectiveness of emission reduction scenarios aimed at lowering ground-level oz...
Tomographic inversion of satellite photometry
NASA Technical Reports Server (NTRS)
Solomon, S. C.; Hays, P. B.; Abreu, V. J.
1984-01-01
An inversion algorithm capable of reconstructing the volume emission rate of thermospheric airglow features from satellite photometry has been developed. The accuracy and resolution of this technique are investigated using simulated data, and the inversions of several sets of observations taken by the Visible Airglow Experiment are presented.
ERIC Educational Resources Information Center
Damasio, Antonio R.; Damasio, Hanna
1992-01-01
Discusses the advances made in understanding the brain structures responsible for language. Presents findings made using magnetic resonance imaging (MRI) and positron emission tomographic (PET) scans to study brain activity. These findings map the structures in the brain that manipulate concepts and those that turn concepts into words. (MCO)
21 CFR 212.1 - What are the meanings of the technical terms used in these regulations?
Code of Federal Regulations, 2014 CFR
2014-04-01
..., numbers, or symbols from which the complete history of the production, processing, packing, holding, and... is used for providing dual photon positron emission tomographic diagnostic images. The definition.... Production means the manufacturing, compounding, processing, packaging, labeling, reprocessing, repacking...
21 CFR 212.1 - What are the meanings of the technical terms used in these regulations?
Code of Federal Regulations, 2012 CFR
2012-04-01
..., numbers, or symbols from which the complete history of the production, processing, packing, holding, and... is used for providing dual photon positron emission tomographic diagnostic images. The definition.... Production means the manufacturing, compounding, processing, packaging, labeling, reprocessing, repacking...
21 CFR 212.1 - What are the meanings of the technical terms used in these regulations?
Code of Federal Regulations, 2013 CFR
2013-04-01
..., numbers, or symbols from which the complete history of the production, processing, packing, holding, and... is used for providing dual photon positron emission tomographic diagnostic images. The definition.... Production means the manufacturing, compounding, processing, packaging, labeling, reprocessing, repacking...
NASA Astrophysics Data System (ADS)
Nakada, Tomohiro; Takadama, Keiki; Watanabe, Shigeyoshi
This paper proposes a classification method that uses Bayesian analysis to classify time series data from an agent-based simulation of the international emissions trading market, and compares it with a discrete Fourier transform analysis. The purpose is to demonstrate analytical methods for mapping time series data such as market prices. These analytical methods reveal the following results: (1) the classification methods indicate the distance of the mapping from the time series data, which makes the data easier to understand and draw inferences from than the raw time series; (2) the methods can analyze uncertain time series data, including both stationary and non-stationary processes, using this distance via agent-based simulation; and (3) the Bayesian analytical method can resolve a 1% difference in the emission reduction targets of agents.
Acoustic emission based damage localization in composites structures using Bayesian identification
NASA Astrophysics Data System (ADS)
Kundu, A.; Eaton, M. J.; Al-Jumali, S.; Sikdar, S.; Pullin, R.
2017-05-01
Acoustic emission based damage detection in composite structures relies on detecting ultra-high-frequency packets of acoustic waves emitted from damage sources (such as fibre breakage and fatigue fracture, amongst others) with a network of distributed sensors. This non-destructive monitoring scheme requires solving an inverse problem in which the measured signals are linked back to the location of the source, which in turn enables rapid deployment of mitigative measures. The significant uncertainty associated with the operating conditions and measurements makes the problem of damage identification quite challenging. The uncertainties stem from the fact that the measured signals are affected by irregular geometries, manufacturing imprecision, imperfect boundary conditions, and existing damage or structural degradation, amongst others. This work aims to tackle these uncertainties within a framework of automated probabilistic damage detection. The method trains a probabilistic model of the parametrized inputs and outputs of the acoustic emission system with experimental data to give probabilistic descriptors of damage locations. A response surface modelling the acoustic emission as a function of parametrized damage signals collected from the sensors is calibrated with a training dataset using Bayesian inference and is then used to deduce damage locations in the online monitoring phase. During online monitoring, the spatially correlated time data are utilized in conjunction with the calibrated acoustic emission model to infer a probabilistic description of the acoustic emission source within a hierarchical Bayesian inference framework. The methodology is tested on a composite structure consisting of a carbon fibre panel with stiffeners, and damage source behaviour is experimentally simulated using standard H-N sources.
The methodology would be applicable in its current form to structural damage detection under varying operational loads, which will be investigated in future studies.
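A toy version of probabilistic source localization illustrates how a posterior over source position arises from time-difference-of-arrival likelihoods. The panel geometry, wave speed, and Gaussian timing noise below are hypothetical; the paper's hierarchical response-surface calibration is considerably richer:

```python
import numpy as np

# Toy Bayesian localization of an acoustic emission source on a 1 m panel:
# grid posterior over position from time-difference-of-arrival likelihoods.
sensors = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
c = 5.0e3                                    # assumed wave speed (m/s)
src = np.array([0.3, 0.7])                   # "unknown" source position

rng = np.random.default_rng(5)
toa = np.linalg.norm(sensors - src, axis=1) / c
toa += rng.normal(0.0, 2e-6, len(sensors))   # 2 us timing noise
dtoa = toa[1:] - toa[0]                      # differences remove unknown origin time

gx = np.linspace(0.0, 1.0, 101)
X, Y = np.meshgrid(gx, gx)
pts = np.stack([X.ravel(), Y.ravel()], axis=1)
d = np.linalg.norm(pts[:, None, :] - sensors[None, :, :], axis=2) / c
model_dtoa = d[:, 1:] - d[:, [0]]
loglike = -0.5 * np.sum(((model_dtoa - dtoa) / (np.sqrt(2) * 2e-6)) ** 2, axis=1)
post = np.exp(loglike - loglike.max())
post /= post.sum()                           # posterior over grid positions
xy_hat = pts[np.argmax(post)]                # posterior mode near src
```

Because the full grid posterior is retained rather than only the mode, the spread of `post` directly quantifies the localization uncertainty that the abstract emphasizes.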
Normal-pressure hydrocephalus and the saga of the treatable dementias
DOE Office of Scientific and Technical Information (OSTI.GOV)
Friedland, R.P.
1989-11-10
A case study of a 74-year-old woman is presented which illustrates the difficulty of understanding dementing illnesses. A diagnosis of normal-pressure hydrocephalus (NPH) was made because of the development of abnormal gait, with urinary incontinence and severe, diffuse, white matter lesions on the MRI scan. Computed tomographic and MRI scans and positron emission tomographic images of glucose use are presented. The treatable dementias are a large, multifaceted group of illnesses, of which NPH is one. The author proposes a new term for the disorder commonly known as NPH because the problem with the term normal-pressure hydrocephalus is that the cerebrospinal fluid pressure is not always normal in the disease.
Single-photon tomographic determination of regional cerebral blood flow in epilepsy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bonte, F.J.; Devous, M.D. Sr.; Stokely, E.M.
Using a single-photon emission computed tomographic scanner (SPECT), the authors determined regional cerebral blood flow (rCBF) with inhaled xenon-133, a noninvasive procedure. Studies were performed in 40 normal individuals, and these were compared with rCBF determinations in 51 patients with seizure disorders. Although positive results were obtained in 15 of 16 patients with mass lesions, the group of principal interest comprised 25 patients suffering from "temporal lobe" epilepsy. Only one of these had a positive x-ray computed tomogram, but 16 had positive findings on the rCBF study. These findings included increased local blood flow in the ictal state and reduced flow interictally.
Gaussian process tomography for soft x-ray spectroscopy at WEST without equilibrium information
NASA Astrophysics Data System (ADS)
Wang, T.; Mazon, D.; Svensson, J.; Li, D.; Jardin, A.; Verdoolaege, G.
2018-06-01
Gaussian process tomography (GPT) is a recently developed tomography method based on Bayesian probability theory [J. Svensson, JET Internal Report EFDA-JET-PR(11)24, 2011 and Li et al., Rev. Sci. Instrum. 84, 083506 (2013)]. By modeling the soft X-ray (SXR) emissivity field in a poloidal cross section as a Gaussian process, Bayesian SXR tomography can be carried out in a robust and extremely fast way. Owing to the short execution time of the algorithm, GPT is an important candidate for providing real-time reconstructions with a view to impurity transport and fast magnetohydrodynamic control. In addition, the Bayesian formalism allows quantifying uncertainty on the inferred parameters. In this paper, the GPT technique is validated using a synthetic data set expected from the WEST tokamak, and results of its application to the reconstruction of SXR emissivity profiles measured on Tore Supra are shown. The method is compared with the standard algorithm based on minimization of the Fisher information.
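The "robust and extremely fast" behaviour follows from a structural fact: with a Gaussian process prior and a linear line-integral forward model, the posterior is Gaussian and available in closed form, with no iterations. A 1D toy version of that update (sizes, kernel, and noise level invented for the sketch; a squared-exponential kernel stands in for whatever covariance the paper uses):

```python
import numpy as np

# 1D toy of the analytic GPT update: emissivity has a Gaussian-process prior,
# chords are linear functionals, so the posterior is Gaussian in closed form.
rng = np.random.default_rng(1)
n = 50
x = np.linspace(0.0, 1.0, n)
truth = np.exp(-0.5 * ((x - 0.5) / 0.12) ** 2)      # peaked emissivity profile

# Forward matrix: each "chord" averages 10 adjacent cells.
K = np.zeros((n - 9, n))
for i in range(n - 9):
    K[i, i:i + 10] = 0.1
sig_n = 0.01
y = K @ truth + rng.normal(0.0, sig_n, K.shape[0])  # noisy line integrals

# Squared-exponential GP prior on the field.
ell, sig_f = 0.1, 1.0
Sigma = sig_f**2 * np.exp(-0.5 * ((x[:, None] - x[None, :]) / ell) ** 2)

# Closed-form Gaussian posterior -- no iterations, hence the speed of GPT.
G = K @ Sigma @ K.T + sig_n**2 * np.eye(K.shape[0])
mean_post = Sigma @ K.T @ np.linalg.solve(G, y)
cov_post = Sigma - Sigma @ K.T @ np.linalg.solve(G, K @ Sigma)
```

The diagonal of `cov_post` is the per-cell uncertainty that the Bayesian formalism provides for free, which is what makes the method attractive for real-time control.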
NASA Astrophysics Data System (ADS)
Olaguer, Eduardo P.; Stutz, Jochen; Erickson, Matthew H.; Hurlock, Stephen C.; Cheung, Ross; Tsai, Catalina; Colosimo, Santo F.; Festa, James; Wijesinghe, Asanga; Neish, Bradley S.
2017-02-01
During the Benzene and other Toxics Exposure (BEE-TEX) study, a remote sensing network based on long path Differential Optical Absorption Spectroscopy (DOAS) was set up in the Manchester neighborhood beside the Ship Channel of Houston, Texas in order to perform Computer Aided Tomography (CAT) scans of hazardous air pollutants. On 18-19 February 2015, the CAT scan network detected large nocturnal plumes of toluene and xylenes most likely associated with railcar loading and unloading operations at Ship Channel petrochemical facilities. The presence of such plumes during railcar operations was confirmed by a mobile laboratory equipped with a Proton Transfer Reaction-Mass Spectrometer (PTR-MS), which measured transient peaks of toluene and C2-benzenes of 50 ppb and 57 ppb respectively around 4 a.m. LST on 19 February 2015. Plume reconstruction and source attribution were performed using the 4D variational data assimilation technique and a 3D micro-scale forward and adjoint air quality model based on both tomographic and PTR-MS data. Inverse model estimates of fugitive emissions associated with railcar transfer emissions ranged from 2.0 to 8.2 kg/hr for toluene and from 2.2 to 3.5 kg/hr for xylenes in the early morning of 19 February 2015.
NASA Astrophysics Data System (ADS)
Craciunescu, Teddy; Peluso, Emmanuele; Murari, Andrea; Gelfusa, Michela; JET Contributors
2018-05-01
The total emission of radiation is a crucial quantity for calculating power balances and for understanding the physics of any tokamak. Bolometric systems are the main tool for measuring this important physical quantity, through quite sophisticated tomographic inversion methods. On the Joint European Torus, the coverage of the bolometric diagnostic, with basically only two projection angles available, is quite limited, rendering the inversion a very ill-posed mathematical problem. A new approach, based on maximum likelihood, has therefore been developed and implemented to alleviate one of the major weaknesses of traditional tomographic techniques: the difficulty of routinely determining confidence intervals for the results. The method has been validated by numerical simulations with phantoms to assess the quality of the results and to optimise the configuration of the parameters for the main types of emissivity encountered experimentally. The typical levels of statistical errors, which may significantly influence the quality of the reconstructions, have been identified. The systematic tests with phantoms indicate that the errors in the reconstructions are quite limited and their effect on the total radiated power remains well below 10%. A comparison with other approaches to the inversion and to the regularization has also been performed.
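The abstract does not spell out the maximum-likelihood algorithm; as a representative sketch, the classic MLEM fixed-point update for Poisson-distributed line integrals is shown below. The geometry matrix and emissivity profile are invented, and the real JET bolometer geometry is not modelled.

```python
import numpy as np

# Toy maximum-likelihood (MLEM) inversion for Poisson line-integral data.
rng = np.random.default_rng(2)
n = 30
truth = np.zeros(n)
truth[10:20] = 100.0                     # emissivity profile (arbitrary units)
A = rng.uniform(0.0, 1.0, (12, n))       # toy chord/geometry matrix
y = rng.poisson(A @ truth)               # Poisson-distributed measurements

lam = np.ones(n)                         # strictly positive starting point
sens = A.sum(axis=0)                     # per-cell sensitivity
for _ in range(500):                     # MLEM multiplicative updates
    lam *= (A.T @ (y / np.maximum(A @ lam, 1e-12))) / sens

total_power = lam.sum()                  # estimate of total emission
```

The multiplicative update keeps the estimate non-negative by construction, which matters for emissivities; confidence intervals (the paper's focus) require additional machinery such as the Fisher information of the Poisson likelihood.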
Ambient Noise Tomography of central Java, with Transdimensional Bayesian Inversion
NASA Astrophysics Data System (ADS)
Zulhan, Zulfakriza; Saygin, Erdinc; Cummins, Phil; Widiyantoro, Sri; Nugraha, Andri Dian; Luehr, Birger-G.; Bodin, Thomas
2014-05-01
Delineating the crustal structure of central Java is crucial for understanding its complex tectonic setting. However, seismic imaging of the strong heterogeneity typical of such a tectonically active region can be challenging, particularly in the upper crust, where velocity contrasts are strongest and steep body-wave ray paths provide poor resolution. We have applied ambient noise cross-correlation to station pairs in central Java, Indonesia, using the MERapi Amphibious EXperiment (MERAMEX) dataset. The data were collected between May and October 2004. We used 120 of 134 temporary seismic stations, covering central Java, for about 150 days of observation. More than 5000 Rayleigh wave Green's functions were extracted by cross-correlating the noise recorded simultaneously at available station pairs. We applied a fully nonlinear 2D Bayesian inversion technique to the retrieved travel times. Features in the derived tomographic images correlate well with previous studies, and some shallow structures that were not evident in previous studies are clearly imaged with ambient noise tomography. The Kendeng Basin and several active volcanoes appear with very low group velocities, and anomalies with relatively high velocities can be interpreted in terms of crustal sutures and/or surface geological features.
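The key step — retrieving an inter-station travel time by cross-correlating ambient noise — can be demonstrated in a few lines. This is an idealised illustration with invented numbers (sampling rate, delay, noise level); real processing adds band-pass filtering, spectral whitening, and long-term stacking.

```python
import numpy as np
from scipy.signal import correlate, correlation_lags

# Idealised Green's-function retrieval: a random wavefield recorded at one
# station reappears at a second station after `delay` samples.
rng = np.random.default_rng(3)
fs = 100.0                     # assumed sampling rate, Hz
N, delay = 4000, 150           # true inter-station delay, in samples
src = rng.normal(size=N + delay)
rec_a = src[delay:]                               # station A (wave arrives first)
rec_b = src[:N] + rng.normal(scale=0.5, size=N)   # station B: delayed copy + noise

cc = correlate(rec_b, rec_a, mode="full", method="fft")
lags = correlation_lags(len(rec_b), len(rec_a), mode="full")
lag = lags[np.argmax(cc)]      # peak lag recovers the propagation delay
travel_time = lag / fs         # seconds
```

The peak of the cross-correlogram sits at the true delay even with substantial uncorrelated noise, which is why months of continuous recordings can yield stable surface-wave travel times between station pairs.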
Bayesian statistical ionospheric tomography improved by incorporating ionosonde measurements
NASA Astrophysics Data System (ADS)
Norberg, Johannes; Virtanen, Ilkka I.; Roininen, Lassi; Vierinen, Juha; Orispää, Mikko; Kauristie, Kirsti; Lehtinen, Markku S.
2016-04-01
We validate two-dimensional ionospheric tomography reconstructions against EISCAT incoherent scatter radar measurements. Our tomography method is based on Bayesian statistical inversion with prior distribution given by its mean and covariance. We employ ionosonde measurements for the choice of the prior mean and covariance parameters and use the Gaussian Markov random fields as a sparse matrix approximation for the numerical computations. This results in a computationally efficient tomographic inversion algorithm with clear probabilistic interpretation. We demonstrate how this method works with simultaneous beacon satellite and ionosonde measurements obtained in northern Scandinavia. The performance is compared with results obtained with a zero-mean prior and with the prior mean taken from the International Reference Ionosphere 2007 model. In validating the results, we use EISCAT ultra-high-frequency incoherent scatter radar measurements as the ground truth for the ionization profile shape. We find that in comparison to the alternative prior information sources, ionosonde measurements improve the reconstruction by adding accurate information about the absolute value and the altitude distribution of electron density. With an ionosonde at continuous disposal, the presented method enhances stand-alone near-real-time ionospheric tomography for the given conditions significantly.
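The computational point in the abstract — that a sparse GMRF precision matrix makes the Bayesian update cheap — can be sketched with a toy 1D profile. Everything below is illustrative (sizes, the second-difference precision, the sinusoidal prior mean standing in for an ionosonde-derived profile), not the paper's actual parameterisation.

```python
import numpy as np
from scipy import sparse

# Linear-Gaussian update with a Gaussian Markov random field (GMRF) prior:
# the sparse precision matrix Q plays the role of the inverse prior covariance.
rng = np.random.default_rng(4)
n = 100
x = np.linspace(0.0, 1.0, n)
mu = np.sin(np.pi * x)                       # prior mean profile
truth = mu + 0.3 * np.exp(-0.5 * ((x - 0.3) / 0.05) ** 2)

# Second-difference GMRF precision: sparse (banded), penalises curvature.
D = sparse.diags([1.0, -2.0, 1.0], [0, 1, 2], shape=(n - 2, n))
Q = 100.0 * (D.T @ D) + 0.01 * sparse.eye(n)

# A few integrated (TEC-like) measurements of the true profile.
A = np.zeros((20, n))
for i in range(20):
    j = rng.integers(0, n - 15)
    A[i, j:j + 15] = 1.0
sig = 0.05
y = A @ truth + rng.normal(0.0, sig, 20)

# Posterior mean solves (Q + A^T A / sig^2) m = Q mu + A^T y / sig^2.
lhs = Q.toarray() + A.T @ A / sig**2        # kept dense here for brevity
rhs = Q @ mu + A.T @ y / sig**2
m = np.linalg.solve(lhs, rhs)
```

In a production setting the solve stays sparse (e.g. a banded Cholesky factorisation), which is what gives the "computationally efficient tomographic inversion algorithm with clear probabilistic interpretation".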
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nakajima, K.; Bunko, H.; Tada, A.
1984-01-01
Phase analysis has been applied to Wolff-Parkinson-White syndrome (WPW) to detect the site of the accessory conduction pathway (ACP); however, planar phase analysis is limited in its ability to estimate the precise location of the ACP. In this study, the authors applied phase analysis to gated blood pool tomography. Twelve patients with WPW who underwent epicardial mapping and surgical division of the ACP were studied by both gated emission computed tomography (GECT) and a routine gated blood pool study (GBPS). The GBPS was performed with Tc-99m red blood cells in multiple projections: modified left anterior oblique, right anterior oblique, and/or left lateral views. In GECT, short-axis, horizontal and vertical long-axis blood pool images were reconstructed. Phase analysis was performed using the fundamental frequency of the Fourier transform in both GECT and GBPS images, and abnormal initial contractions on both planar and tomographic phase analysis were compared with the location of surgically confirmed ACPs. In planar phase analysis, an abnormal initial phase was identified in 7 of 12 (58%) patients, while in tomographic phase analysis, the localization of the ACP was predicted in 11 of 12 (92%) patients. Tomographic phase analysis was superior to planar phase imaging for estimating the location of the ACP in 8 of 12 patients. Phase analysis by GECT avoids overlap of the blood pools in the cardiac chambers and has the advantage of identifying phase propagation three-dimensionally. Tomographic phase analysis is a good adjunctive method for estimating the site of the ACP in patients with WPW.
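The "fundamental frequency of the Fourier transform" step reduces each voxel's time-activity curve over the cardiac cycle to the phase of its first harmonic; regions that contract early (as near an accessory pathway) show a phase lead. A minimal illustration with two synthetic curves (invented amplitudes and frame count, not real blood-pool data):

```python
import numpy as np

# Per-voxel phase from the fundamental Fourier frequency of a gated study.
nframes = 16
t = np.arange(nframes) / nframes                     # fraction of the R-R cycle

# Two "voxels": normal contraction, and one contracting 90 degrees early,
# mimicking pre-excitation near an accessory conduction pathway.
normal = 100 + 20 * np.cos(2 * np.pi * t)
early = 100 + 20 * np.cos(2 * np.pi * t + np.pi / 2)

def first_harmonic_phase(curve):
    spec = np.fft.fft(curve)
    return np.angle(spec[1])                         # fundamental frequency bin

dphase = first_harmonic_phase(early) - first_harmonic_phase(normal)
```

Applied voxel-by-voxel to tomographic slices instead of planar projections, this is what lets the propagation of phase be followed three-dimensionally without chamber overlap.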
J-PET: A New Technology for the Whole-body PET Imaging
NASA Astrophysics Data System (ADS)
Niedźwiecki, S.; Białas, P.; Curceanu, C.; Czerwiński, E.; Dulski, K.; Gajos, A.; Głowacz, B.; Gorgol, M.; Hiesmayr, B. C.; Jasińska, B.; Kapłon, Ł.; Kisielewska-Kamińska, D.; Korcyl, G.; Kowalski, P.; Kozik, T.; Krawczyk, N.; Krzemień, W.; Kubicz, E.; Mohammed, M.; Pawlik-Niedźwiecka, M.; Pałka, M.; Raczyński, L.; Rudy, Z.; Sharma, N. G.; Sharma, S.; Shopa, R. Y.; Silarski, M.; Skurzok, M.; Wieczorek, A.; Wiślicki, W.; Zgardzińska, B.; Zieliński, M.; Moskal, P.
The Jagiellonian Positron Emission Tomograph (J-PET) is the first PET scanner built from plastic scintillators. The J-PET prototype consists of 192 detection modules arranged axially in three layers, forming a cylindrical diagnostic chamber with an inner diameter of 85 cm and an axial field of view of 50 cm. The axial arrangement of long strips of plastic scintillators, their small light attenuation, superior timing properties, and the relative ease of increasing the axial field of view open promising perspectives for the cost-effective construction of a whole-body PET scanner, as well as the construction of MR- and CT-compatible PET inserts. The present status of the development of the J-PET tomograph is presented and discussed.
Kotasidis, F A; Mehranian, A; Zaidi, H
2016-05-07
Kinetic parameter estimation in dynamic PET suffers from reduced accuracy and precision when parametric maps are estimated using kinetic modelling following image reconstruction of the dynamic data. Direct approaches to parameter estimation attempt to directly estimate the kinetic parameters from the measured dynamic data within a unified framework. Such image reconstruction methods have been shown to generate parametric maps of improved precision and accuracy in dynamic PET. However, due to the interleaving between the tomographic and kinetic modelling steps, any tomographic or kinetic modelling errors in certain regions or frames, tend to spatially or temporally propagate. This results in biased kinetic parameters and thus limits the benefits of such direct methods. Kinetic modelling errors originate from the inability to construct a common single kinetic model for the entire field-of-view, and such errors in erroneously modelled regions could spatially propagate. Adaptive models have been used within 4D image reconstruction to mitigate the problem, though they are complex and difficult to optimize. Tomographic errors in dynamic imaging on the other hand, can originate from involuntary patient motion between dynamic frames, as well as from emission/transmission mismatch. Motion correction schemes can be used, however, if residual errors exist or motion correction is not included in the study protocol, errors in the affected dynamic frames could potentially propagate either temporally, to other frames during the kinetic modelling step or spatially, during the tomographic step. In this work, we demonstrate a new strategy to minimize such error propagation in direct 4D image reconstruction, focusing on the tomographic step rather than the kinetic modelling step, by incorporating time-of-flight (TOF) within a direct 4D reconstruction framework. 
Using ever improving TOF resolutions (580 ps, 440 ps, 300 ps and 160 ps), we demonstrate that direct 4D TOF image reconstruction can substantially prevent kinetic parameter error propagation either from erroneous kinetic modelling, inter-frame motion or emission/transmission mismatch. Furthermore, we demonstrate the benefits of TOF in parameter estimation when conventional post-reconstruction (3D) methods are used and compare the potential improvements to direct 4D methods. Further improvements could possibly be achieved in the future by combining TOF direct 4D image reconstruction with adaptive kinetic models and inter-frame motion correction schemes.
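As context for the TOF resolutions quoted above (580 ps down to 160 ps), a one-line calculation converts a coincidence timing FWHM dt into the positional FWHM along the line of response, dx = c * dt / 2:

```python
# Back-of-envelope check of what the quoted TOF resolutions mean spatially.
C = 299_792_458.0  # speed of light, m/s

def tof_fwhm_cm(dt_ps: float) -> float:
    """Positional FWHM (cm) along the LOR for a timing FWHM dt_ps (ps)."""
    return C * dt_ps * 1e-12 / 2.0 * 100.0

for dt in (580, 440, 300, 160):
    print(f"{dt} ps -> {tof_fwhm_cm(dt):.1f} cm FWHM along the LOR")
    # prints 8.7, 6.6, 4.5 and 2.4 cm respectively
```

Tighter spatial kernels mean each event's contribution is confined to a shorter segment of the LOR, which is the mechanism by which TOF limits the spatial propagation of tomographic errors in the direct 4D reconstruction.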
Methane emissions in East Asia for 2000-2011 estimated using an atmospheric Bayesian inversion
NASA Astrophysics Data System (ADS)
Thompson, R. L.; Stohl, A.; Zhou, L. X.; Dlugokencky, E.; Fukuyama, Y.; Tohjima, Y.; Kim, S.-Y.; Lee, H.; Nisbet, E. G.; Fisher, R. E.; Lowry, D.; Weiss, R. F.; Prinn, R. G.; O'Doherty, S.; Young, D.; White, J. W. C.
2015-05-01
We present methane (CH4) emissions for East Asia from a Bayesian inversion of CH4 mole fraction and stable isotope (δ13C-CH4) measurements. Emissions were estimated at monthly resolution from 2000 to 2011. A posteriori, the total emission for East Asia increased from 43 ± 4 to 59 ± 4 Tg yr-1 between 2000 and 2011, owing largely to the increase in emissions from China, from 39 ± 4 to 54 ± 4 Tg yr-1, while emissions in other East Asian countries remained relatively stable. For China, South Korea, and Japan, the total emissions were smaller than the prior estimates (i.e., Emission Database for Global Atmospheric Research 4.2 FT2010 for anthropogenic emissions) by an average of 29%, 20%, and 23%, respectively. For Mongolia, Taiwan, and North Korea, the total emission was less than 2 Tg yr-1 and was not significantly different from the prior. The largest reductions in emissions, compared to the prior, occurred in summer in regions important for rice agriculture suggesting that this source is overestimated in the prior. Furthermore, an analysis of the isotope data suggests that the prior underestimates emissions from landfills and ruminant animals for winter 2010 to spring 2011 (no data available for other times). The inversion also found a lower average emission trend for China, 1.2 Tg yr-1 compared to 2.8 Tg yr-1 in the prior. This trend was not constant, however, and increased significantly after 2005, up to 2.0 Tg yr-1. Overall, the changes in emissions from China explain up to 40% of the increase in global emissions in the 2000s.
Covariance specification and estimation to improve top-down Green House Gas emission estimates
NASA Astrophysics Data System (ADS)
Ghosh, S.; Lopez-Coto, I.; Prasad, K.; Whetstone, J. R.
2015-12-01
The National Institute of Standards and Technology (NIST) operates the North-East Corridor (NEC) project and the Indianapolis Flux Experiment (INFLUX) in order to develop measurement methods that quantify sources of greenhouse gas (GHG) emissions, as well as their uncertainties, in urban domains using a top-down inversion method. Top-down inversion updates prior knowledge using observations in a Bayesian way. One primary consideration in a Bayesian inversion framework is the covariance structure of (1) the emission prior residuals and (2) the observation residuals (i.e., the difference between observations and model-predicted observations). These covariance matrices are referred to as the prior covariance matrix and the model-data mismatch covariance matrix, respectively. It is known that the choice of these covariances can have a large effect on estimates. The main objective of this work is to determine the impact of different covariance models on inversion estimates and their associated uncertainties in urban domains. We use a pseudo-data Bayesian inversion framework with footprints (i.e., sensitivities of tower measurements of GHGs to surface emissions) and emission priors (based on the Hestia project to quantify fossil-fuel emissions) to estimate posterior emissions under different covariance schemes. The posterior emission estimates and uncertainties are compared to the hypothetical truth. We find that, if we correctly specify the spatial variability in the prior covariance and the spatio-temporal variability in the model-data mismatch covariance, we can compute more accurate posterior estimates. We discuss a few covariance models that introduce space-time interacting mismatches, along with estimation of the involved parameters. We then compare several candidate prior spatial covariance models from the Matern covariance class and estimate their parameters with specified mismatches. We find that best-fitted prior covariances are not always best at recovering the truth.
To achieve accuracy, we perform a sensitivity study to further tune covariance parameters. Finally, we introduce a shrinkage based sample covariance estimation technique for both prior and mismatch covariances. This technique allows us to achieve similar accuracy nonparametrically in a more efficient and automated way.
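The shrinkage idea mentioned at the end — regularising a rank-deficient sample covariance by blending it with a structured target — can be shown in a few lines. This is the minimal Ledoit-Wolf-style linear shrinkage toward a scaled identity; the NIST study's estimator may differ in detail, and the fixed intensity `alpha` here is an assumption of the sketch (Ledoit-Wolf estimates it from the data).

```python
import numpy as np

# Minimal linear shrinkage of a sample covariance toward a scaled identity.
rng = np.random.default_rng(5)
p, nobs = 40, 25                       # more dimensions than samples
L = rng.normal(size=(p, p))
true_cov = L @ L.T / p + np.eye(p)
X = rng.multivariate_normal(np.zeros(p), true_cov, size=nobs)

S = np.cov(X, rowvar=False)            # sample covariance: rank-deficient
target = np.trace(S) / p * np.eye(p)   # scaled-identity shrinkage target
alpha = 0.3                            # shrinkage intensity (assumed fixed)
S_shrunk = (1.0 - alpha) * S + alpha * target
```

The shrunk estimate is always positive definite and hence invertible, which is exactly what a Bayesian inversion needs from both the prior and the model-data mismatch covariance.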
Bayesian Inference for Source Reconstruction: A Real-World Application
Yee, Eugene; Hoffman, Ian; Ungar, Kurt
2014-01-01
This paper applies a Bayesian probabilistic inferential methodology for the reconstruction of the location and emission rate from an actual contaminant source (emission from the Chalk River Laboratories medical isotope production facility) using a small number of activity concentration measurements of a noble gas (Xenon-133) obtained from three stations that form part of the International Monitoring System radionuclide network. The sampling of the resulting posterior distribution of the source parameters is undertaken using a very efficient Markov chain Monte Carlo technique that utilizes a multiple-try differential evolution adaptive Metropolis algorithm with an archive of past states. It is shown that the principal difficulty in the reconstruction lay in the correct specification of the model errors (both scale and structure) for use in the Bayesian inferential methodology. In this context, two different measurement models for incorporation of the model error of the predicted concentrations are considered. The performance of both of these measurement models with respect to their accuracy and precision in the recovery of the source parameters is compared and contrasted. PMID:27379292
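The structure of this inference — a posterior over source position and emission rate sampled by MCMC — can be reduced to a toy sketch. Everything below is invented for illustration: a 1D domain, a made-up 1/(1+r^2) dilution law in place of a real atmospheric dispersion model, and plain random-walk Metropolis in place of the paper's multiple-try differential evolution adaptive Metropolis sampler.

```python
import numpy as np

# Toy source reconstruction: unknown 1D source position and emission rate.
rng = np.random.default_rng(6)
stations = np.array([-3.0, 1.0, 4.0])       # receptor locations (arbitrary units)
true_x, true_q = 0.7, 5.0                   # source position and rate

def forward(x, q):
    return q / (1.0 + (stations - x) ** 2)  # toy dilution model

sig = 0.1                                   # assumed observation error
obs = forward(true_x, true_q) + rng.normal(0.0, sig, stations.size)

def logpost(x, q):                          # flat prior on a box
    if not (-10.0 < x < 10.0 and 0.0 < q < 100.0):
        return -np.inf
    return -0.5 * np.sum((obs - forward(x, q)) ** 2) / sig**2

cur = np.array([stations[np.argmax(obs)], obs.max()])   # crude first guess
lp = logpost(*cur)
chain = []
for _ in range(20000):
    prop = cur + rng.normal(0.0, 0.1, 2)
    lp_prop = logpost(*prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        cur, lp = prop, lp_prop
    chain.append(cur.copy())
post = np.array(chain[5000:])               # discard burn-in
x_hat, q_hat = post.mean(axis=0)
```

The paper's "principal difficulty" — correctly specifying the scale and structure of model errors — enters here through `sig` and the Gaussian form of `logpost`; misspecifying either distorts both the accuracy and the stated precision of the recovered source parameters.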
NASA Astrophysics Data System (ADS)
Hu, L.; Montzka, S. A.; Miller, B.; Andrews, A. E.; Miller, J. B.; Lehman, S.; Sweeney, C.; Miller, S. M.; Thoning, K. W.; Siso, C.; Atlas, E. L.; Blake, D. R.; De Gouw, J. A.; Gilman, J.; Dutton, G. S.; Elkins, J. W.; Hall, B. D.; Chen, H.; Fischer, M. L.; Mountain, M. E.; Nehrkorn, T.; Biraud, S.; Tans, P. P.
2015-12-01
Global atmospheric observations suggest substantial ongoing emissions of carbon tetrachloride (CCl4) despite a 100% phase-out of production for dispersive uses since 1996 in developed countries and 2010 in other countries. Little progress has been made in understanding the causes of these ongoing emissions or identifying their contributing sources. In this study, we employed multiple inverse modeling techniques (i.e., Bayesian and geostatistical inversions) to assimilate CCl4 mole fractions observed from the National Oceanic and Atmospheric Administration (NOAA) flask-air sampling network over the US, and to quantify its national and regional emissions during 2008-2012. Average national total emissions of CCl4 between 2008 and 2012 determined from these observations and an ensemble of inversions range between 2.1 and 6.1 Gg yr-1. This is substantially larger than the mean of 0.06 Gg yr-1 reported to the US EPA Toxics Release Inventory over these years, suggesting that under-reported emissions or non-reporting sources make up the bulk of CCl4 emissions from the US. While the inventory does not account for the magnitude of observationally derived CCl4 emissions, the regional distribution of derived and inventory emissions is similar. Furthermore, when considered relative to the distribution of uncapped landfills or population, the variability in measured mole fractions was most consistent with the distribution of industrial sources (i.e., those from the Toxics Release Inventory). Our results suggest that emissions from the US account for only a small fraction of the ongoing global emissions of CCl4 (30-80 Gg yr-1 over this period). Finally, to ascertain the importance of the US emissions relative to the unaccounted global emission rate, we considered multiple approaches to extrapolate our results to other countries and the globe.
Single photon emission tomography using 99mTc-HM-PAO in the investigation of dementia.
Neary, D; Snowden, J S; Shields, R A; Burjan, A W; Northen, B; MacDermott, N; Prescott, M C; Testa, H J
1987-01-01
Single photon emission tomographic imaging of the brain using 99mTc HM-PAO was carried out in patients with a clinical diagnosis of Alzheimer's disease, non-Alzheimer frontal-lobe dementia, and progressive supranuclear palsy. Independent assessment of reductions in uptake revealed posterior hemisphere abnormalities in the majority of the Alzheimer group, and selective anterior hemisphere abnormalities in both other groups. The findings were consistent with observed patterns of mental impairment. The imaging technique has potential value in the differential diagnosis of primary cerebral atrophy. PMID:3499484
Brain Correlates of Stuttering and Syllable Production: Gender Comparison and Replication.
ERIC Educational Resources Information Center
Ingham, Roger J.; Fox, Peter T.; Ingham, Janis C.; Xiong, Jinhu; Zamarripa, Frank; Hardies, L. Jean; Lancaster, Jack L.
2004-01-01
This article reports a gender replication study of the P. T. Fox et al. (2000) performance correlation analysis of neural systems that distinguish between normal and stuttered speech in adult males. Positron-emission tomographic (PET) images of cerebral blood flow (CBF) were correlated with speech behavior scores obtained during PET imaging for 10…
Sparse Bayesian Inference and the Temperature Structure of the Solar Corona
DOE Office of Scientific and Technical Information (OSTI.GOV)
Warren, Harry P.; Byers, Jeff M.; Crump, Nicholas A.
Measuring the temperature structure of the solar atmosphere is critical to understanding how it is heated to high temperatures. Unfortunately, the temperature of the upper atmosphere cannot be observed directly, but must be inferred from spectrally resolved observations of individual emission lines that span a wide range of temperatures. Such observations are "inverted" to determine the distribution of plasma temperatures along the line of sight. This inversion is ill posed and, in the absence of regularization, tends to produce wildly oscillatory solutions. We introduce the application of sparse Bayesian inference to the problem of inferring the temperature structure of the solar corona. Within a Bayesian framework, a preference for solutions that utilize a minimum number of basis functions can be encoded into the prior, and many ad hoc assumptions can be avoided. We demonstrate the efficacy of the Bayesian approach by considering a test library of 40 assumed temperature distributions.
Multiphoton tomography of intratissue tattoo nanoparticles
NASA Astrophysics Data System (ADS)
König, Karsten
2012-02-01
Most of today's intratissue tattoo pigments are unknown nanoparticles. So far, there has been no real control of their use due to the absence of regulations. Some tattoo pigments contain carcinogenic amines, e.g. the azo pigment Red 22. The European Union is now starting to control the administration of tattoo pigments. There is interest in obtaining information on the intratissue distribution of the pigments, their interaction with living cells and the extracellular matrix, and the mechanisms behind laser tattoo removal. Multiphoton tomographs are novel biosafety and imaging tools that can provide such information non-invasively and without further labeling. When using the spectral FLIM module, spatially resolved emission spectra, excitation spectra, and fluorescence lifetimes can be provided. Multiphoton tomographs are used by all major cosmetic companies to test the biosafety of sunscreen nanoparticles.
Murder, insanity, and medical expert witnesses.
Ciccone, J R
1992-06-01
Recent advances in the ability to study brain anatomy and function, and attempts to link these findings with human behavior, have captured the attention of the legal system. This has led to the increasing use of the "neurological defense" to support a plea of not guilty by reason of insanity. This article explores the history of the insanity defense and examines the role of medical expert witnesses in integrating clinical and laboratory findings, e.g., computed tomographic scans, magnetic resonance scans, and single-photon emission computed tomographic scans. Three cases involving murder and brain dysfunction are discussed: the first involves a subarachnoid hemorrhage resulting in visual perceptual and memory impairment; the second, a diagnosis of Alzheimer's disease; and the third, the controverted diagnosis of complex partial seizures in a serial killer.
Three-dimensional Image Reconstruction in J-PET Using Filtered Back-projection Method
NASA Astrophysics Data System (ADS)
Shopa, R. Y.; Klimaszewski, K.; Kowalski, P.; Krzemień, W.; Raczyński, L.; Wiślicki, W.; Białas, P.; Curceanu, C.; Czerwiński, E.; Dulski, K.; Gajos, A.; Głowacz, B.; Gorgol, M.; Hiesmayr, B.; Jasińska, B.; Kisielewska-Kamińska, D.; Korcyl, G.; Kozik, T.; Krawczyk, N.; Kubicz, E.; Mohammed, M.; Pawlik-Niedźwiecka, M.; Niedźwiecki, S.; Pałka, M.; Rudy, Z.; Sharma, N. G.; Sharma, S.; Silarski, M.; Skurzok, M.; Wieczorek, A.; Zgardzińska, B.; Zieliński, M.; Moskal, P.
We present a method and preliminary results of image reconstruction in the Jagiellonian PET tomograph. Using GATE (Geant4 Application for Tomographic Emission), interactions of 511 keV photons with a cylindrical detector were generated. Pairs of such photons, flying back-to-back, originate from e+e- annihilations inside a 1-mm spherical source. Spatial and temporal coordinates of hits were smeared using the experimental resolutions of the detector. We incorporated the algorithm of 3D filtered back-projection, implemented in the STIR and TomoPy software packages, which differ in approximation methods. Consistent results for the point spread functions of ~5/7 mm and ~9/20 mm were obtained using STIR, for the transverse and longitudinal directions respectively, with no time-of-flight information included.
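The essence of filtered back-projection — ramp-filter each projection in frequency space, then smear it back across the image at its acquisition angle — fits in a short 2D sketch. This is a pedagogical stand-in for the 3D STIR/TomoPy implementations discussed above; the phantom, image size, and angle set are invented.

```python
import numpy as np
from scipy.ndimage import rotate

# Minimal parallel-beam 2D filtered back-projection on a synthetic phantom.
def radon(img, angles_deg):
    # Sinogram: rotate the image, then sum along columns.
    return np.array([rotate(img, -a, reshape=False, order=1).sum(axis=0)
                     for a in angles_deg])

def fbp(sino, angles_deg):
    n = sino.shape[1]
    ramp = np.abs(np.fft.fftfreq(n))              # ramp filter in frequency
    recon = np.zeros((n, n))
    for proj, a in zip(sino, angles_deg):
        filt = np.real(np.fft.ifft(np.fft.fft(proj) * ramp))
        recon += rotate(np.tile(filt, (n, 1)), a, reshape=False, order=1)
    return recon * np.pi / (2 * len(angles_deg))

n = 64
yy, xx = np.mgrid[:n, :n]
phantom = (((xx - 40) ** 2 + (yy - 28) ** 2) < 36).astype(float)  # small disc
angles = np.linspace(0.0, 180.0, 90, endpoint=False)
recon = fbp(radon(phantom, angles), angles)
peak = np.unravel_index(np.argmax(recon), recon.shape)
```

Without the ramp filter the back-projection alone would produce the familiar 1/r blur; the filter is what makes the inversion (approximately) exact, and point spread functions like those quoted above are measured by running exactly this pipeline on a point-like source.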
Regional brain hematocrit in stroke by single photon emission computed tomography imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Loutfi, I.; Frackowiak, R.S.; Myers, M.J.
1987-01-01
Nineteen studies on 18 subjects were performed by single photon emission computed tomography (SPECT) of the head after the successive intravenous administration of a plasma label (99mTc-human serum albumin (HSA)) and 99mTc-labeled autologous red blood cells (RBC). Two sets of cerebral tomographic sections were generated: one for cerebral 99mTc-HSA alone and one for combined 99mTc-HSA and 99mTc-RBC. By relating counts in regions of interest from the cerebral tomograms to counts from blood samples obtained during each tomographic acquisition, regional cerebral haematocrit (Hct) was calculated by the application of a simple formula. Results show 1) lower cerebral Hct than venous Hct (ratio of brain Hct to venous Hct 0.65-0.90) in all subjects, and 2) no significant difference between right- and left-hemisphere Hct in 3/3 normal subjects, 6/6 patients with transient ischaemic attacks, and 3/8 patients with stroke. However, in 3/8 patients with stroke (the most recent strokes), significant differences were found, with the higher Hct value corresponding to the affected side.
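The "simple formula" itself is not reproduced in the abstract. One plausible reconstruction, stated here purely as an assumption, uses the count ratio between the combined (HSA+RBC) and HSA-only acquisitions: each ratio equals 1 plus a constant times the plasma-to-red-cell odds Hct/(1-Hct), and taking the same ratio in a brain region and in a venous sample cancels the constant.

```python
def regional_hct(ratio_brain, ratio_venous, hct_venous):
    """Hypothetical reconstruction of the abstract's 'simple formula'.

    ratio_*: counts(HSA+RBC) / counts(HSA) in a brain ROI and in venous blood.
    Assumes each ratio = 1 + k * Hct / (1 - Hct) with a common constant k,
    so k cancels between the brain and venous measurements.
    """
    odds_venous = hct_venous / (1.0 - hct_venous)
    odds_brain = (ratio_brain - 1.0) / (ratio_venous - 1.0) * odds_venous
    return odds_brain / (1.0 + odds_brain)
```

Under this assumed form, a brain count ratio whose excess over 1 is about 70% of the venous one, with a venous Hct of 0.40, gives a regional Hct near 0.32, i.e. a brain-to-venous ratio of about 0.80, inside the 0.65-0.90 range reported above.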
Hackstadt, Amber J; Peng, Roger D
2014-11-01
Time series studies have suggested that air pollution can negatively impact health. These studies have typically focused on the total mass of fine particulate matter air pollution or the individual chemical constituents that contribute to it, and not source-specific contributions to air pollution. Source-specific contribution estimates are useful from a regulatory standpoint by allowing regulators to focus limited resources on reducing emissions from sources that are major contributors to air pollution and are also desired when estimating source-specific health effects. However, researchers often lack direct observations of the emissions at the source level. We propose a Bayesian multivariate receptor model to infer information about source contributions from ambient air pollution measurements. The proposed model incorporates information from national databases containing data on both the composition of source emissions and the amount of emissions from known sources of air pollution. The proposed model is used to perform source apportionment analyses for two distinct locations in the United States (Boston, Massachusetts and Phoenix, Arizona). Our results mirror previous source apportionment analyses that did not utilize the information from national databases and provide additional information about uncertainty that is relevant to the estimation of health effects.
NASA Astrophysics Data System (ADS)
Tang, Yuping; Wang, Daniel; Wilson, Grant; Gutermuth, Robert; Heyer, Mark
2018-01-01
We present the AzTEC/LMT survey of dust continuum at 1.1 mm of the central ˜200 pc (CMZ) of our Galaxy. A joint SED analysis of all existing dust continuum surveys of the CMZ, from 160 µm to 1.1 mm, is performed. Our analysis follows an MCMC sampling strategy incorporating knowledge of the PSFs of the different maps, which provides unprecedented spatial resolution on the distributions of dust temperature, column density and emissivity index. The dense clumps in the CMZ typically show low dust temperatures (≲20 K), with no significant sign of buried star formation, and a weak trend of higher emissivity index toward dense peaks. A new model is proposed, allowing for varying dust temperature inside a cloud and self-shielding of dust emission, which leads to similar conclusions on dust temperature and grain properties. We further apply a hierarchical Bayesian analysis to infer the column density probability distribution function (N-PDF), while simultaneously removing the Galactic foreground and background emission. The N-PDF shows a steep power-law profile with α > 3, indicating that the formation of dense structures is suppressed.
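The power-law slope of an N-PDF tail can be characterized with a standard maximum-likelihood estimator. The sketch below uses synthetic column densities, not the survey's data or its hierarchical pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic column densities drawn from a power-law PDF ~ N^-alpha above
# n_min, via inverse-CDF (Pareto) sampling: P(X > x) = (x/n_min)^-(alpha-1).
alpha_true, n_min = 3.5, 1.0
u = rng.uniform(size=20000)
n_col = n_min * (1.0 - u) ** (-1.0 / (alpha_true - 1.0))

# Maximum-likelihood estimator for the power-law index of the tail.
alpha_hat = 1.0 + len(n_col) / np.sum(np.log(n_col / n_min))
```

With 20,000 samples the estimator recovers the input slope to a couple of percent; a full hierarchical treatment like the paper's would additionally propagate per-pixel noise and foreground uncertainty into the slope.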
Estimating National-scale Emissions using Dense Monitoring Networks
NASA Astrophysics Data System (ADS)
Ganesan, A.; Manning, A.; Grant, A.; Young, D.; Oram, D.; Sturges, W. T.; Moncrieff, J. B.; O'Doherty, S.
2014-12-01
The UK's DECC (Deriving Emissions linked to Climate Change) network consists of four greenhouse gas measurement stations that are situated to constrain emissions from the UK and Northwest Europe. These four stations are located in Mace Head (West Coast of Ireland), and on telecommunication towers at Ridge Hill (Western England), Tacolneston (Eastern England) and Angus (Eastern Scotland). With the exception of Angus, which currently only measures carbon dioxide (CO2) and methane (CH4), the remaining sites are additionally equipped to monitor nitrous oxide (N2O). We present an analysis of the network's CH4 and N2O observations from 2011-2013 and compare derived top-down regional emissions with bottom-up inventories, including a recently produced high-resolution inventory (UK National Atmospheric Emissions Inventory). As countries are moving toward national-level emissions estimation, we also address some of the considerations that need to be made when designing these national networks. One of the novel aspects of this work is that we use a hierarchical Bayesian inversion framework. This methodology, which has newly been applied to greenhouse gas emissions estimation, is designed to estimate temporally and spatially varying model-measurement uncertainties and correlation scales, in addition to fluxes. Through this analysis, we demonstrate the importance of characterizing these covariance parameters in order to properly use data from high-density monitoring networks. This UK case study highlights the ways in which this new inverse framework can be used to address some of the limitations of traditional Bayesian inverse methods.
NASA Astrophysics Data System (ADS)
Moskal, P.; Alfs, D.; Bednarski, T.; Białas, P.; Curceanu, C.; Czerwiński, E.; Dulski, K.; Gajos, A.; Głowacz, B.; Gupta-Sharma, N.; Gorgol, M.; Hiesmayr, B. C.; Jasińska, B.; Kamińska, D.; Khreptak, O.; Korcyl, G.; Kowalski, P.; Krzemień, W.; Krawczyk, N.; Kubicz, E.; Mohammed, M.; Niedźwiecki, Sz.; Pawlik-Niedźwiecka, M.; Raczyński, L.; Rudy, Z.; Silarski, M.; Smyrski, J.; Wieczorek, A.; Wiślicki, W.; Zgardzińska, B.; Zieliński, M.
2016-11-01
Discrete symmetries such as parity (P), charge conjugation (C) and time reversal (T) are of fundamental importance in physics and cosmology. Breaking of charge conjugation symmetry (C) and of its combination with parity (CP) constitutes a necessary condition for the existence of the asymmetry between matter and antimatter in the observed Universe. The presently known sources of discrete symmetry violation can account for only a tiny fraction of the excess of matter over antimatter. So far, violations of the CP and T symmetries have been observed only in systems involving quarks; they have never been reported for purely leptonic objects. In this article we briefly describe an experimental proposal for testing discrete symmetries in the decays of the positronium atom, which is made exclusively of leptons. The experiments are conducted by means of the Jagiellonian Positron Emission Tomograph (J-PET), which is constructed from strips of plastic scintillators enabling registration of photons from positronium annihilation. The J-PET tomograph, together with the positronium target system, enables the measurement of expectation values of discrete-symmetry-odd operators constructed from (i) the spin vector of the ortho-positronium atom, (ii) the momentum vectors of photons originating from the decay of positronium, and (iii) the linear polarization direction of the annihilation photons. Linearly polarized positronium will be produced in highly porous aerogel or polymer targets, exploiting longitudinally polarized positrons emitted by the 22Na isotope. Information about the polarization vector of ortho-positronium will be available on an event-by-event basis, reconstructed from the known position of the positron source and the reconstructed position of the ortho-positronium annihilation. The first tests and calibration runs are planned for 2016, and high-statistics data collection will commence in 2017.
HFSB-seeding for large-scale tomographic PIV in wind tunnels
NASA Astrophysics Data System (ADS)
Caridi, Giuseppe Carlo Alp; Ragni, Daniele; Sciacchitano, Andrea; Scarano, Fulvio
2016-12-01
A new system for large-scale tomographic particle image velocimetry in low-speed wind tunnels is presented. The system relies upon the use of sub-millimetre helium-filled soap bubbles as flow tracers, which scatter light with intensity several orders of magnitude higher than micron-sized droplets. With respect to a single bubble generator, the system increases the bubble emission rate by means of transient accumulation and rapid release. The governing parameters of the system are identified and discussed, namely the bubble production rate, the accumulation and release times, and the size of the bubble injector and its location with respect to the wind tunnel contraction. The relations between the above parameters, the resulting spatial concentration of tracers and the achievable dynamic spatial range of the measurement are obtained and discussed. Large-scale experiments are carried out in a large low-speed wind tunnel with a 2.85 × 2.85 m2 test section, where a vertical-axis wind turbine of 1 m diameter is operated. Time-resolved tomographic PIV measurements are taken over a measurement volume of 40 × 20 × 15 cm3, allowing the quantitative analysis of the tip-vortex structure and its dynamical evolution.
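The scaling between the governing parameters can be sketched with a back-of-the-envelope estimate (an illustrative relation, not the authors' exact analysis): accumulation boosts the effective emission rate by the ratio of accumulation to release time, and dividing by the volume flow rate of the seeded stream tube gives a tracer concentration.

```python
def tracer_concentration(production_rate, accumulation_time, release_time,
                         free_stream_velocity, seeded_area):
    """Mean bubble concentration (bubbles/m^3) in the seeded stream tube.

    Illustrative estimate: bubbles accumulated for t_acc and released
    over t_rel boost the effective emission rate by t_acc / t_rel;
    dividing by the seeded stream tube's volume flow rate (U * A)
    gives bubbles per unit volume.
    """
    effective_rate = production_rate * accumulation_time / release_time
    volume_flow = free_stream_velocity * seeded_area
    return effective_rate / volume_flow

# Hypothetical numbers: 30,000 bubbles/s accumulated for 10 s, released
# over 1 s into a 0.04 m^2 stream tube at 10 m/s.
c = tracer_concentration(30000.0, 10.0, 1.0, 10.0, 0.04)
```

The estimate makes the trade-off explicit: longer accumulation raises the instantaneous seeding density, and hence the attainable dynamic spatial range, at the cost of intermittent measurement windows.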
Bayesian soft X-ray tomography using non-stationary Gaussian Processes
NASA Astrophysics Data System (ADS)
Li, Dong; Svensson, J.; Thomsen, H.; Medina, F.; Werner, A.; Wolf, R.
2013-08-01
In this study, a Bayesian non-stationary Gaussian Process (GP) method for the inference of the soft X-ray emissivity distribution, along with its associated uncertainties, has been developed. For the investigation of equilibrium conditions and fast magnetohydrodynamic behavior in nuclear fusion plasmas, it is important to infer spatially resolved soft X-ray profiles, especially in the plasma center, from a limited number of noisy line-integral measurements. For this ill-posed inversion problem, Bayesian probability theory can provide a posterior probability distribution over all possible solutions under given model assumptions. Specifically, the use of a non-stationary GP to model the emission allows the model to adapt to the varying length scales of the underlying diffusion process. In contrast to conventional methods, the prior regularization is expressed in probabilistic form, which enhances the capability for uncertainty analysis; scientists concerned with the reliability of their results will benefit from this. Under the assumption of normally distributed noise, the posterior distribution evaluated at a discrete number of points becomes a multivariate normal distribution whose mean and covariance are analytically available, making inversions and the calculation of uncertainty fast. Additionally, the hyper-parameters embedded in the model assumptions can be optimized through a Bayesian Occam's razor formalism, thereby automatically adjusting the model complexity. This method is shown to produce convincing reconstructions and good agreement with independently calculated results from the Maximum Entropy and Equilibrium-Based Iterative Tomography Algorithm methods.
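Because the forward model (line integrals) is linear and the noise Gaussian, the posterior mean and covariance are analytically available, as the abstract notes. A minimal sketch with a toy 1D geometry follows; the covariance is kept stationary for brevity, whereas the paper's non-stationary GP would make the length scale position-dependent, and the chord geometry here is hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, sigma = 50, 20, 0.01            # grid points, line integrals, noise std
x = np.linspace(0.0, 1.0, n)

# Squared-exponential prior covariance over the emissivity profile.
ell = 0.1
K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / ell ** 2)

A = rng.uniform(size=(m, n)) / n      # toy line-integral (chord) geometry
e_true = np.exp(-0.5 * (x - 0.5) ** 2 / 0.02)   # synthetic emissivity
d = A @ e_true + sigma * rng.normal(size=m)

# Gaussian likelihood + GP prior => closed-form multivariate-normal posterior.
S = A @ K @ A.T + sigma ** 2 * np.eye(m)
mu = K @ A.T @ np.linalg.solve(S, d)             # posterior mean
cov = K - K @ A.T @ np.linalg.solve(S, A @ K)    # posterior covariance
std = np.sqrt(np.clip(np.diag(cov), 0.0, None))  # pointwise uncertainty
```

The two `solve` calls are the entire inversion: no iterative optimization is needed, which is what makes both the reconstruction and its uncertainty bars fast to evaluate.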
NASA Astrophysics Data System (ADS)
Ghosh, S.; Lopez-Coto, I.; Prasad, K.; Karion, A.; Mueller, K.; Gourdji, S.; Martin, C.; Whetstone, J. R.
2017-12-01
The National Institute of Standards and Technology (NIST) supports the North-East Corridor Baltimore/Washington (NEC-B/W) project and the Indianapolis Flux Experiment (INFLUX), which aim to quantify sources of greenhouse gas (GHG) emissions as well as their uncertainties. These projects employ different flux estimation methods, including top-down inversion approaches. The traditional Bayesian inversion method estimates emission distributions by updating prior information using atmospheric GHG observations coupled to an atmospheric transport and dispersion model. The magnitude of the update depends on the observed enhancement along with the assumed errors, such as those associated with the prior information and the atmospheric transport and dispersion model. These errors are specified within the inversion covariance matrices, and their assumed structure and magnitude can have a large impact on the emission estimates from the inversion. The main objective of this work is to build a data-adaptive model for these covariance matrices. We construct a synthetic data experiment using a Kalman filter inversion framework (Lopez et al., 2017) employing different configurations of the transport and dispersion model and an assumed prior. Unlike previous traditional Bayesian approaches, we estimate posterior emissions using regularized sample covariance matrices associated with prior errors, to investigate whether the structure of the matrices helps to better recover our hypothetical true emissions. To incorporate transport model error, we use an ensemble of transport models combined with a space-time analytical covariance to construct a covariance that accounts for errors in space and time. A Kalman filter is then run using these covariances along with maximum likelihood estimates (MLE) of the involved parameters. Preliminary results indicate that specifying spatio-temporally varying errors in the error covariances can improve the flux estimates and uncertainties.
We also demonstrate that differences between the modeled and observed meteorology can be used to predict uncertainties associated with atmospheric transport and dispersion modeling which can help improve the skill of an inversion at urban scales.
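The "regularized sample covariance" idea can be sketched with a simple diagonal-shrinkage estimator (an illustrative choice; the study's actual regularization may differ). When an error ensemble has fewer members than dimensions, the raw sample covariance is rank-deficient, and blending it with its own diagonal restores positive definiteness:

```python
import numpy as np

def shrink_covariance(samples, shrinkage):
    """Regularize a sample covariance by shrinking toward its diagonal.

    Illustrative stand-in for a regularized sample covariance:
    'shrinkage' in [0, 1] blends the raw sample covariance with its
    diagonal, guaranteeing a well-conditioned, positive-definite matrix.
    """
    s = np.cov(samples, rowvar=False)     # raw (possibly singular) estimate
    target = np.diag(np.diag(s))          # diagonal shrinkage target
    return (1.0 - shrinkage) * s + shrinkage * target

rng = np.random.default_rng(2)
draws = rng.normal(size=(30, 100))        # 30 ensemble members, 100 dimensions
c = shrink_covariance(draws, 0.5)
```

With 30 members in 100 dimensions the raw estimate has at least 70 zero eigenvalues; the shrunk matrix is invertible and can be used directly inside a Kalman filter update.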
Radial anisotropy of Northeast Asia inferred from Bayesian inversions of ambient noise data
NASA Astrophysics Data System (ADS)
Lee, S. J.; Kim, S.; Rhie, J.
2017-12-01
The eastern margin of the Eurasian plate exhibits complex tectonic settings due to interactions with the subducting Pacific and Philippine Sea plates and the colliding Indian plate. The distributed extensional basins and intraplate volcanoes in the region, and their heterogeneous features, are not easily explained by a simple mechanism. Observations of radial anisotropy in the entire lithosphere and part of the asthenosphere provide the most effective evidence for the deformation of the lithosphere and the associated variation of the lithosphere-asthenosphere boundary (LAB). To infer crustal and upper-mantle anisotropic structures in this region, radial anisotropy is measured using ambient noise data. In a continuation of a previous Rayleigh-wave tomography study in Northeast Asia, we conduct Love-wave tomography to determine radial anisotropy using Bayesian inversion techniques. Continuous seismic noise recordings from 237 broad-band stations are used, and more than 55,000 fundamental-mode group and phase velocities are measured for periods of 5-60 s. A total of eight types of dispersion maps, from Love waves in this study (periods 10-60 s), Rayleigh waves from a previous tomographic study (Kim et al., 2016; periods 8-70 s) and longer-period data (periods 70-200 s) from a global model (Ekstrom, 2011), are jointly inverted using a hierarchical and transdimensional Bayesian technique. For each grid node, boundary depths, velocities and anisotropy parameters of the layers are sampled simultaneously under the assumption of a layered half-space model. The constructed 3-D radial anisotropy model provides much more detail about the crustal and upper-mantle anisotropic structures and about the complex undulation of the LAB.
3D tomographic reconstruction using geometrical models
NASA Astrophysics Data System (ADS)
Battle, Xavier L.; Cunningham, Gregory S.; Hanson, Kenneth M.
1997-04-01
We address the issue of reconstructing an object of constant interior density in the context of 3D tomography where there is prior knowledge about the unknown shape. We explore the direct estimation of the parameters of a chosen geometrical model from a set of radiographic measurements, rather than performing operations (segmentation for example) on a reconstructed volume. The inverse problem is posed in the Bayesian framework. A triangulated surface describes the unknown shape and the reconstruction is computed with a maximum a posteriori (MAP) estimate. The adjoint differentiation technique computes the derivatives needed for the optimization of the model parameters. We demonstrate the usefulness of the approach and emphasize the techniques of designing forward and adjoint codes. We use the system response of the University of Arizona Fast SPECT imager to illustrate this method by reconstructing the shape of a heart phantom.
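The adjoint differentiation step can be illustrated on a linearized forward model (a toy stand-in, not the actual radiographic projector): the gradient of the data-misfit term needed by the MAP optimizer is obtained by applying the transpose (adjoint) of the forward operator to the residual, and can be checked against a finite difference.

```python
import numpy as np

rng = np.random.default_rng(5)
m, p = 30, 4                     # measurements, shape parameters
J = rng.normal(size=(m, p))      # Jacobian of a (linearized) forward projector
data = rng.normal(size=m)        # synthetic radiographic measurements

def forward(theta):
    return J @ theta             # predicted measurements for parameters theta

def misfit(theta):
    r = forward(theta) - data
    return 0.5 * float(r @ r)    # half sum-of-squares data misfit

def misfit_gradient(theta):
    # Adjoint differentiation: propagate the residual backward through the
    # forward code; for a linear projector the adjoint is simply J^T.
    r = forward(theta) - data
    return J.T @ r

theta = rng.normal(size=p)
g = misfit_gradient(theta)

# Cross-check one gradient component against a central finite difference.
eps = 1e-6
e0 = np.zeros(p); e0[0] = eps
fd = (misfit(theta + e0) - misfit(theta - e0)) / (2 * eps)
```

The appeal of the adjoint approach is that the gradient costs about as much as one extra forward evaluation, regardless of how many surface parameters describe the shape.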
Scarmeas, Nikolaos; Zarahn, Eric; Anderson, Karen E.; Honig, Lawrence S.; Park, Aileen; Hilton, John; Flynn, Joseph; Sackeim, Harold A.; Stern, Yaakov
2011-01-01
Background Cognitive reserve (CR) is the ability of an individual to cope with advancing brain pathological abnormalities so that he or she remains free of symptoms. Epidemiological data and evidence from positron emission tomography suggest that it may be mediated through education or IQ. Objective To investigate CR-mediated differential brain activation in Alzheimer disease (AD) subjects compared with healthy elderly persons. Participants Using radioactive water positron emission tomography, we scanned 12 AD patients and 17 healthy elderly persons while they performed a serial recognition memory task for nonverbalizable shapes under 2 conditions: low demand, in which one shape was presented in each study trial, and titrated demand, in which the study list length was adjusted so that each subject recognized shapes at approximately 75% accuracy. Positron emission tomographic scan acquisition included the encoding and recognition phases. A CR factor score that summarized years of education, National Adult Reading Test estimated IQ, and Wechsler Adult Intelligence Scale–Revised vocabulary subtest score (explaining 71% of the total variance) was used as an index of CR. Voxel-wise multiple regression analyses were performed with the “activation” difference (titrated demand–low demand) as the dependent variable and the CR factor score as the independent variable. Brain regions where regression slopes differed between the 2 groups were identified. Results The slopes were significantly more positive for the AD patients in the left precentral gyrus and in the left hippocampus and significantly more negative in the right fusiform, right middle occipital, left superior occipital, and left middle temporal gyri. Conclusion Brain regions where systematic relationships (slopes) between subjects’ education-IQ and brain activation differ as a function of disease status may mediate the differential ability to cope with (ie, delay or modify) clinical manifestations of AD. PMID:14732623
Schmidt, Sergio L; Schmidt, Juliana J; Tolentino, Julio C; Ferreira, Carlos G; de Almeida, Sergio A; Alvarenga, Regina P; Simoes, Eunice N; Schmidt, Guilherme J; Canedo, Nathalie H S; Chimelli, Leila
2016-07-20
Limbic encephalitis was originally described as a rare clinical neuropathological entity involving seizures and neuropsychological disturbances. In this report, we describe cerebral patterns visualized by positron emission tomography in a patient with limbic encephalitis and cholangiocarcinoma. To our knowledge, there is no other description in the literature of cerebral positron emission tomography findings in the setting of limbic encephalitis and subsequent diagnosis of cholangiocarcinoma. We describe a case of a 77-year-old Caucasian man who exhibited persistent cognitive changes 2 years before his death. A cerebral scan obtained at that time by 2-deoxy-2-[fluorine-18]fluoro-D-glucose integrated with computed tomography-positron emission tomography showed low radiotracer uptake in the frontal and temporal lobes. Cerebrospinal fluid analysis indicated the presence of voltage-gated potassium channel antibodies. Three months before the patient's death, a lymph node biopsy indicated a cholangiocarcinoma, and a new cerebral scan obtained by 2-deoxy-2-[fluorine-18]fluoro-D-glucose integrated with computed tomography-positron emission tomography showed an increment in the severity of metabolic deficit in the frontal and parietal lobes, as well as hypometabolism involving the temporal lobes. Two months before the patient's death, cerebral metastases were detected on a contrast-enhanced computed tomographic scan. Postmortem examination revealed a cholangiocarcinoma with multiple metastases including the lungs and lymph nodes. The patient's brain weighed 1300 g, and mild cortical atrophy, ex vacuo dilation of the ventricles, and mild focal thickening of the cerebellar leptomeninges, which were infiltrated by neoplastic epithelial cells, were observed. These findings support the need for continued vigilance in malignancy surveillance in patients with limbic encephalitis and early cerebral positron emission tomographic scan abnormalities.
The difficulty in early diagnosis of small tumors, such as a cholangiocarcinoma, is discussed in the context of the clinical utility of early cerebral hypometabolism detected by 2-deoxy-2-[fluorine-18]fluoro-D-glucose integrated with computed tomography-positron emission tomography in patients with rapidly progressive dementia.
Source Partitioning of Methane Emissions and its Seasonality in the U.S. Midwest
NASA Astrophysics Data System (ADS)
Chen, Zichong; Griffis, Timothy J.; Baker, John M.; Millet, Dylan B.; Wood, Jeffrey D.; Dlugokencky, Edward J.; Andrews, Arlyn E.; Sweeney, Colm; Hu, Cheng; Kolka, Randall K.
2018-02-01
The methane (CH4) budget and its source partitioning are poorly constrained in the Midwestern United States. We used tall tower (185 m) aerodynamic flux measurements and atmospheric scale factor Bayesian inversions to constrain the monthly budget and to partition the total budget into natural (e.g., wetlands) and anthropogenic (e.g., livestock, waste, and natural gas) sources for the period June 2016 to September 2017. Aerodynamic flux observations indicated that the landscape was a CH4 source with a mean annual CH4 flux of +13.7 ± 0.34 nmol m-2 s-1 and was rarely a net sink. The scale factor Bayesian inversion analyses revealed a mean annual source of +12.3 ± 2.1 nmol m-2 s-1. Flux partitioning revealed that the anthropogenic source (7.8 ± 1.6 Tg CH4 yr-1) was 1.5 times greater than the bottom-up gridded United States Environmental Protection Agency inventory, in which livestock and oil/gas sources were underestimated by 1.8-fold and 1.3-fold, respectively. Wetland emissions (4.0 ± 1.2 Tg CH4 yr-1) were the second largest source, accounting for 34% of the total budget. The temporal variability of total CH4 emissions was dominated by wetlands with peak emissions occurring in August. In contrast, emissions from oil/gas and other anthropogenic sources showed relatively weak seasonality.
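A scale factor Bayesian inversion admits a compact sketch when both the prior and the observation errors are taken as Gaussian. The numbers and sensitivities below are synthetic and hypothetical (not the study's transport-model footprints); the point is that the posterior scalings of each sector's prior budget are available in closed form:

```python
import numpy as np

rng = np.random.default_rng(3)
n_obs, n_src = 200, 3
# Hypothetical sensitivity of each observation to each source sector
# (footprint x prior flux), standing in for the real transport model.
K = rng.uniform(0.0, 2.0, size=(n_obs, n_src))
lam_true = np.array([1.8, 1.0, 0.7])           # 'true' prior-budget scalings
y = K @ lam_true + 0.5 * rng.normal(size=n_obs)

# Scale-factor inversion: prior lambda ~ N(1, sigma_p^2 I) and Gaussian
# observation error give a closed-form Gaussian posterior over lambda.
sigma_p, sigma_o = 1.0, 0.5
P = np.linalg.inv(K.T @ K / sigma_o**2 + np.eye(n_src) / sigma_p**2)
lam_post = P @ (K.T @ y / sigma_o**2 + np.ones(n_src) / sigma_p**2)
```

A posterior scaling above 1 for a sector corresponds directly to findings like the 1.8-fold livestock underestimate: the data demand more emissions from that sector than the bottom-up prior supplies.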
Estimating methane emissions in California's urban and rural regions using multitower observations
Jeong, Seongeun; Newman, Sally; Zhang, Jingsong; ...
2016-11-05
Here, we present an analysis of methane (CH4) emissions using atmospheric observations from thirteen sites in California during June 2013 – May 2014. A hierarchical Bayesian inversion method is used to estimate CH4 emissions for spatial regions (0.3° pixels for major regions) by comparing measured CH4 mixing ratios with transport model (WRF-STILT) predictions based on seasonally varying California-specific CH4 prior emission models. The transport model is assessed using a combination of meteorological and carbon monoxide (CO) measurements coupled with the gridded California Air Resources Board (CARB) CO emission inventory. Hierarchical Bayesian inversion suggests that state annual anthropogenic CH4 emissions are 2.42 ± 0.49 Tg CH4/yr (at 95% confidence, including transport bias uncertainty), higher (1.2 - 1.8 times) than the current CARB inventory (1.64 Tg CH4/yr in 2013). We note that the estimated CH4 emissions drop to 1.0 - 1.6 times the CARB inventory if we correct for the 10% median bias found in the CO analysis, assuming that bias is also applicable to CH4. The CH4 emissions from the Central Valley and urban regions (San Francisco Bay and South Coast Air Basins) account for ~58% and 26% of the total posterior emissions, respectively. This study suggests that the livestock sector is likely the major contributor to the state total CH4 emissions, in agreement with CARB's inventory. Attribution to source sectors for sub-regions of California using additional trace gas species would further improve the quantification of California's CH4 emissions and mitigation efforts towards the California Global Warming Solutions Act of 2006 (AB-32).
System Performance Simulations of the RatCAP Awake Rat Brain Scanner
NASA Astrophysics Data System (ADS)
Shokouhi, S.; Vaska, P.; Schlyer, D. J.; Stoll, S. P.; Villanueva, A.; Kriplani, A.; Woody, C. L.
2005-10-01
The capability to create high-quality images from data acquired by the Rat Conscious Animal PET tomograph (RatCAP) has been evaluated using modified versions of the PET Monte Carlo code Simulation System for Emission Tomography (SimSET). The proposed tomograph consists of lutetium oxyorthosilicate (LSO) crystals arranged in twelve 4 × 8 blocks. The effects of the RatCAP's small ring diameter (∼40 mm) and its block detector geometry on image quality for small-animal studies have been investigated. Since the field of view will be almost as large as the ring diameter, radial elongation artifacts due to parallax error are expected to degrade the spatial resolution, and thus the image quality, at the edge of the field of view. In addition to Monte Carlo simulations, some preliminary results of experimentally acquired images in both two-dimensional (2-D) and 3-D modes are presented.
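The parallax (radial elongation) effect follows from simple geometry; the sketch below is an illustrative estimate, not part of the SimSET simulation. A gamma ray from a source at radius r crosses the crystals at an angle, so an unknown depth of interaction over the crystal length smears the measured radial position:

```python
def radial_blur(ring_radius_mm, crystal_length_mm, source_radius_mm):
    """Approximate radial elongation (parallax) blur for a PET ring.

    Simple geometric estimate: a line of response tangent to radius r
    crosses the crystal at angle theta with sin(theta) = r / R, so an
    unknown depth of interaction over crystal length L smears the
    measured radial position by roughly L * sin(theta).
    """
    sin_theta = source_radius_mm / ring_radius_mm
    return crystal_length_mm * sin_theta

# Blur grows toward the edge of a small-bore field of view
# (numbers are illustrative, not the RatCAP's exact dimensions):
center = radial_blur(20.0, 10.0, 0.0)   # no parallax at the ring center
edge = radial_blur(20.0, 10.0, 15.0)    # strong elongation near the edge
```

Because the blur scales with r/R, a small ring whose field of view nearly fills the bore suffers the strongest edge degradation, which is exactly why the abstract flags it for the RatCAP geometry.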
NASA Astrophysics Data System (ADS)
Munoz Burgos, J. M.; Brooks, N. H.; Fenstermacher, M. E.; Meyer, W. H.; Unterberg, E. A.; Schmitz, O.; Loch, S. D.; Balance, C. P.
2011-10-01
We apply new atomic modeling techniques to helium and deuterium for diagnostics in the divertor and scrape-off layer regions. Analysis of tomographically inverted images is useful for validating detachment prediction models and power balances in the divertor. We apply tomographic image inversion to fast tangential camera images of helium and Dα emission at the divertor in order to obtain 2D profiles of Te, Ne, and ND (neutral deuterium density). The accuracy of the atomic models for He I will be cross-checked against Thomson scattering measurements of Te and Ne. This work summarizes several current developments and applications of atomic modeling in diagnostics at the DIII-D tokamak. Supported in part by the US DOE under DE-AC05-06OR23100, DE-FC02-04ER54698, DE-AC52-07NA27344, and DE-AC05-00OR22725.
NASA Technical Reports Server (NTRS)
Ter-Pogossian, M. M.; Hoffman, E. J.; Weiss, E. S.; Coleman, R. E.; Phelps, M. E.; Welch, M. J.; Sobel, B. E.
1975-01-01
A positron emission transverse tomograph device was developed which provides transaxial sectional images of the distribution of positron-emitting radionuclides in the heart. The images provide a quantitative three-dimensional map of the distribution of activity unencumbered by the superimposition of activity originating from regions overlying and underlying the plane of interest. PETT is used primarily with the cyclotron-produced radionuclides oxygen-15, nitrogen-13 and carbon-11. Because of the participation of these atoms in metabolism, they can be used to label metabolic substrates and intermediary molecules incorporated in myocardial metabolism.
Cardiac metastases of Ewing sarcoma detected by 18F-FDG PET/CT.
Coccia, Paola; Ruggiero, Antonio; Rufini, Vittoria; Maurizi, Palma; Attinà, Giorgio; Marano, Riccardo; Natale, Luigi; Leccisotti, Lucia; Calcagni, Maria L; Riccardi, Riccardo
2012-04-01
Positron emission tomography (PET) is widely used in the diagnostic evaluation and staging of different malignant tumors. The role of PET/computed tomographic scanning in detecting distant metastases in the workup of Ewing sarcoma in children and young adults is less well defined. We report the case of a boy affected by metastatic Ewing sarcoma with an asymptomatic cardiac metastasis detected by 18F-FDG PET/computed tomography.
Ionospheric-thermospheric UV tomography: 2. Comparison with incoherent scatter radar measurements
NASA Astrophysics Data System (ADS)
Dymond, K. F.; Nicholas, A. C.; Budzien, S. A.; Stephan, A. W.; Coker, C.; Hei, M. A.; Groves, K. M.
2017-03-01
The Special Sensor Ultraviolet Limb Imager (SSULI) instruments are ultraviolet limb scanning sensors that fly on the Defense Meteorological Satellite Program F16-F19 satellites. The SSULIs cover the 80-170 nm wavelength range which contains emissions at 91 and 136 nm, which are produced by radiative recombination of the ionosphere. We invert the 91.1 nm emission tomographically using a newly developed algorithm that includes optical depth effects due to pure absorption and resonant scattering. We present the details of our approach including how the optimal altitude and along-track sampling were determined and the newly developed approach we are using for regularizing the SSULI tomographic inversions. Finally, we conclude with validations of the SSULI inversions against Advanced Research Project Agency Long-range Tracking and Identification Radar (ALTAIR) incoherent scatter radar measurements and demonstrate excellent agreement between the measurements. As part of this study, we include the effects of pure absorption by O2, N2, and O in the inversions and find that best agreement between the ALTAIR and SSULI measurements is obtained when only O2 and O are included, but the agreement degrades when N2 absorption is included. This suggests that the absorption cross section of N2 needs to be reinvestigated near 91.1 nm wavelengths.
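A regularized limb inversion of this kind can be sketched with a toy onion-peeling geometry and second-difference (Tikhonov) smoothing; the actual SSULI algorithm, which also handles pure absorption and resonant scattering, is considerably more involved. All numbers below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 40                                    # altitude shells
# Toy limb-viewing path-length matrix: chord lengths of each tangent ray
# through concentric spherical shells (radii in arbitrary km units).
radii = np.linspace(100.0, 500.0, n + 1)
A = np.zeros((n, n))
for i in range(n):                        # tangent-altitude index
    for j in range(i, n):                 # shells at or above the tangent point
        A[i, j] = 2.0 * (np.sqrt(radii[j + 1]**2 - radii[i]**2)
                         - np.sqrt(max(radii[j]**2 - radii[i]**2, 0.0)))

x_true = np.exp(-0.5 * (radii[:-1] - 300.0)**2 / 50.0**2)   # layer-like profile
y = A @ x_true + 0.5 * rng.normal(size=n)                   # noisy limb scans

# Second-difference operator penalizes rough profiles (Tikhonov smoothing).
L = np.diff(np.eye(n), n=2, axis=0)
lam = 1.0
x_hat = np.linalg.solve(A.T @ A + lam**2 * L.T @ L, A.T @ y)
```

The regularization term is what tames the noise amplification inherent in peeling the outer shells off first; the choice of lam trades resolution against stability, which is the "newly developed approach ... for regularizing" design question the abstract refers to.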
DOE Office of Scientific and Technical Information (OSTI.GOV)
Choong, W. -S.; Abu-Nimeh, F.; Moses, W. W.
Here, we present a 16-channel front-end readout board for the OpenPET electronics system. A major task in developing a nuclear medical imaging system, such as a positron emission computed tomograph (PET) or a single-photon emission computed tomograph (SPECT), is the electronics system. While there are a wide variety of detector and camera design concepts, the relatively simple nature of the acquired data allows for a common set of electronics requirements that can be met by a flexible, scalable, and high-performance OpenPET electronics system. The analog signals from the different types of detectors used in medical imaging share similar characteristics, which allows for common analog signal processing. The OpenPET electronics processes the analog signals with Detector Boards. Here we report on the development of a 16-channel Detector Board. Each signal is digitized by a continuously sampled analog-to-digital converter (ADC), which is processed by a field programmable gate array (FPGA) to extract pulse height information. A leading edge discriminator creates a timing edge that is "time stamped" by a time-to-digital converter (TDC) implemented inside the FPGA. This digital information from each channel is sent to an FPGA that services 16 analog channels, and information from multiple channels is then processed by this FPGA to perform logic for crystal lookup, DOI calculation, calibration, etc.
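The per-channel trigger path (leading-edge discriminator plus fine time stamp, and pulse-height extraction) can be modeled in a few lines. This is an illustrative software sketch, not the board's FPGA firmware:

```python
def leading_edge_timestamp(samples, threshold, dt_ns):
    """Leading-edge discriminator with linear interpolation between samples.

    Illustrative model of the trigger path: scan the continuously sampled
    waveform for the first upward threshold crossing and interpolate the
    crossing time, mimicking the fine time stamp produced by the TDC.
    """
    for i in range(1, len(samples)):
        if samples[i - 1] < threshold <= samples[i]:
            frac = (threshold - samples[i - 1]) / (samples[i] - samples[i - 1])
            return (i - 1 + frac) * dt_ns
    return None  # waveform never crossed threshold: no trigger

def pulse_height(samples, baseline):
    """Pulse height as the baseline-subtracted waveform maximum."""
    return max(samples) - baseline

wave = [0, 0, 1, 4, 9, 14, 12, 8, 4, 1]        # toy digitized pulse
t0 = leading_edge_timestamp(wave, 5.0, 5.0)    # 5 ns sampling period
```

A known limitation of leading-edge discrimination, visible even in this toy model, is time walk: larger pulses cross the fixed threshold earlier, which real systems correct using the extracted pulse height.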
Koukourakis, G; Maravelis, G; Koukouraki, S; Padelakos, P; Kouloulias, V
2009-01-01
The concept of emission and transmission tomography was introduced by David Kuhl and Roy Edwards in the late 1950s. Their work later led to the design and construction of several tomographic instruments at the University of Pennsylvania. Tomographic imaging techniques were further developed by Michel Ter-Pogossian, Michael E. Phelps and others at the Washington University School of Medicine. Positron emission tomography (PET) is a nuclear medicine imaging technique that produces a 3-dimensional image or map of functional processes in the body. The system detects pairs of gamma rays emitted indirectly by a positron-emitting radionuclide (tracer), which is introduced into the body on a biologically active molecule. Images of tracer concentration in 3-dimensional space within the body are then reconstructed by computer analysis. In modern scanners, this reconstruction is often accomplished with the aid of a CT X-ray scan performed on the patient during the same session, in the same machine. If the biologically active molecule chosen for PET is 18F-fluorodeoxyglucose (FDG), an analogue of glucose, the imaged tracer concentrations give tissue metabolic activity in terms of regional glucose uptake. Although this tracer yields the most common type of PET scan, other tracer molecules are used in PET to image the tissue concentration of many other molecules of interest. The main aim of this article is to analyse the available types of radiopharmaceuticals used in PET-CT, along with the principles underlying their clinical and technical considerations.
GATE - Geant4 Application for Tomographic Emission: a simulation toolkit for PET and SPECT
Jan, S.; Santin, G.; Strul, D.; Staelens, S.; Assié, K.; Autret, D.; Avner, S.; Barbier, R.; Bardiès, M.; Bloomfield, P. M.; Brasse, D.; Breton, V.; Bruyndonckx, P.; Buvat, I.; Chatziioannou, A. F.; Choi, Y.; Chung, Y. H.; Comtat, C.; Donnarieix, D.; Ferrer, L.; Glick, S. J.; Groiselle, C. J.; Guez, D.; Honore, P.-F.; Kerhoas-Cavata, S.; Kirov, A. S.; Kohli, V.; Koole, M.; Krieguer, M.; van der Laan, D. J.; Lamare, F.; Largeron, G.; Lartizien, C.; Lazaro, D.; Maas, M. C.; Maigne, L.; Mayet, F.; Melot, F.; Merheb, C.; Pennacchio, E.; Perez, J.; Pietrzyk, U.; Rannou, F. R.; Rey, M.; Schaart, D. R.; Schmidtlein, C. R.; Simon, L.; Song, T. Y.; Vieira, J.-M.; Visvikis, D.; Van de Walle, R.; Wieërs, E.; Morel, C.
2012-01-01
Monte Carlo simulation is an essential tool in emission tomography that can assist in the design of new medical imaging devices, the optimization of acquisition protocols, and the development or assessment of image reconstruction algorithms and correction techniques. GATE, the Geant4 Application for Tomographic Emission, encapsulates the Geant4 libraries to achieve a modular, versatile, scripted simulation toolkit adapted to the field of nuclear medicine. In particular, GATE allows the description of time-dependent phenomena such as source or detector movement, and source decay kinetics. This feature makes it possible to simulate time curves under realistic acquisition conditions and to test dynamic reconstruction algorithms. This paper gives a detailed description of the design and development of GATE by the OpenGATE collaboration, whose continuing objective is to improve, document, and validate GATE by simulating commercially available imaging systems for PET and SPECT. A large effort is also invested in the ability and flexibility to model novel detection systems or systems still under design. A public release of GATE licensed under the GNU Lesser General Public License can be downloaded at http://www-lphe.epfl.ch/GATE/. Two benchmarks developed for PET and SPECT to test the installation of GATE and to serve as a tutorial for users are presented. Extensive validation of the GATE simulation platform has been started, comparing simulations and measurements on commercially available acquisition systems. References to those results are listed. Future prospects toward the gridification of GATE and its extension to other domains such as dosimetry are also discussed. PMID:15552416
NASA Astrophysics Data System (ADS)
Cui, Y.; Brioude, J. F.; Angevine, W. M.; McKeen, S. A.; Henze, D. K.; Bousserez, N.; Liu, Z.; McDonald, B.; Peischl, J.; Ryerson, T. B.; Frost, G. J.; Trainer, M.
2016-12-01
Production of unconventional natural gas grew rapidly in the US during the past ten years, leading to an increase in emissions of methane (CH4) and, depending on the shale region, nitrogen oxides (NOx). In terms of radiative forcing, CH4 is the second most important greenhouse gas after CO2. NOx is a precursor of ozone (O3) in the troposphere and of nitrate particles, both of which are regulated by the US Clean Air Act. Emission estimates of CH4 and NOx from the shale regions are still highly uncertain. We present top-down estimates of CH4 and NOx surface fluxes from the Haynesville and Fayetteville shale production regions using aircraft data collected during the Southeast Nexus of Climate Change and Air Quality (SENEX) field campaign (June-July 2013) and the Shale Oil and Natural Gas Nexus (SONGNEX) field campaign (March-May 2015) within a mesoscale inversion framework. The inversion method is based on a mesoscale Bayesian inversion system using multiple transport models. EPA's 2011 National CH4 and NOx Emission Inventories are used as prior information to optimize CH4 and NOx emissions. Furthermore, the posterior CH4 emission estimates are used to constrain NOx emission estimates using a flux ratio inversion technique. Sensitivity of the posterior estimates to the use of off-diagonal terms in the error covariance matrices, to the transport models, and to the prior estimates is discussed. Compared to ground-based in-situ observations, the optimized CH4 and NOx inventories improve the ground-level CH4 and O3 concentrations calculated by the Weather Research and Forecasting mesoscale model coupled with chemistry (WRF-Chem).
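The core Bayesian update behind such mesoscale flux inversions has a closed form when the prior and observation errors are Gaussian and the transport operator is linear. The sketch below is a generic illustration with invented dimensions and covariances, not the SENEX/SONGNEX system; `H` stands in for a source-receptor (footprint) matrix.

```python
# Generic Gaussian Bayesian flux update; H, covariances, and fluxes are toy
# assumptions, not the actual SENEX/SONGNEX inversion inputs.
import numpy as np

def bayesian_update(x_prior, C_prior, H, y, R):
    """Posterior mean/covariance for y = H x + noise, Gaussian prior and errors."""
    S = H @ C_prior @ H.T + R                   # innovation covariance
    K = C_prior @ H.T @ np.linalg.inv(S)        # Kalman-type gain
    x_post = x_prior + K @ (y - H @ x_prior)
    C_post = C_prior - K @ H @ C_prior
    return x_post, C_post

rng = np.random.default_rng(1)
n_flux, n_obs = 5, 50
x_true = np.array([2.0, 1.0, 3.0, 0.5, 1.5])    # "true" fluxes (arbitrary units)
H = rng.uniform(0, 1, (n_obs, n_flux))          # stand-in source-receptor matrix
y = H @ x_true + rng.normal(0, 0.1, n_obs)      # synthetic aircraft observations
x_post, C_post = bayesian_update(np.ones(n_flux), np.eye(n_flux),
                                 H, y, 0.01 * np.eye(n_obs))
print(x_post)
```

The shrinking of the diagonal of `C_post` relative to the prior is the formal uncertainty reduction the observations provide; off-diagonal terms in `R` or `C_prior`, as discussed above, would change both the gain and that reduction.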
NASA Astrophysics Data System (ADS)
Meillier, Céline; Chatelain, Florent; Michel, Olivier; Bacon, Roland; Piqueras, Laure; Bacher, Raphael; Ayasso, Hacheme
2016-04-01
We present SELFI, the Source Emission Line FInder, a new Bayesian method optimized for detection of faint galaxies in Multi Unit Spectroscopic Explorer (MUSE) deep fields. MUSE is the new panoramic integral field spectrograph at the Very Large Telescope (VLT) that has unique capabilities for spectroscopic investigation of the deep sky. It has provided data cubes with 324 million voxels over a single 1 arcmin² field of view. To address the challenge of faint-galaxy detection in these large data cubes, we developed a new method that processes 3D data either for modeling or for estimation and extraction of source configurations. This object-based approach yields a natural sparse representation of the sources in massive data fields, such as MUSE data cubes. In the Bayesian framework, the parameters that describe the observed sources are considered random variables. The Bayesian model leads to a general and robust algorithm where the parameters are estimated in a fully data-driven way. This detection algorithm was applied to the MUSE observation of Hubble Deep Field-South. With 27 h total integration time, these observations provide a catalog of 189 sources of various categories with secure redshifts. The algorithm retrieved 91% of the galaxies with a false-detection rate of only 9%. This method also allowed the discovery of three new Lyα emitters and one [OII] emitter, all without any Hubble Space Telescope counterpart. We analyzed the reasons for failure for some targets, and found that the most important limitation of the method arises when faint sources are located in the vicinity of bright spatially resolved galaxies that cannot be approximated by the Sérsic elliptical profile. The software and its documentation are available on the MUSE science web service (muse-vlt.eu/science).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jacobsson-Svard, Staffan; Smith, Leon E.; White, Timothy
The potential for gamma emission tomography (GET) to detect partial defects within a spent nuclear fuel assembly has been assessed within the IAEA Support Program project JNT 1955, phase I, which was completed and reported to the IAEA in October 2016. Two safeguards verification objectives were identified in the project: (1) independent determination of the number of active pins that are present in a measured assembly, in the absence of a priori information about the assembly; and (2) quantitative assessment of pin-by-pin properties, for example the activity of key isotopes or pin attributes such as cooling time and relative burnup, under the assumption that basic fuel parameters (e.g., assembly type and nominal fuel composition) are known. The efficacy of GET to meet these two verification objectives was evaluated across a range of fuel types, burnups and cooling times, while targeting a total interrogation time of less than 60 minutes. The evaluations were founded on a modelling and analysis framework applied to existing and emerging GET instrument designs. Monte Carlo models of different fuel types were used to produce simulated tomographer responses to large populations of "virtual" fuel assemblies. The simulated instrument response data were then processed using a variety of tomographic-reconstruction and image-processing methods, and scoring metrics were defined and used to evaluate the performance of the methods. This paper describes the analysis framework and metrics used to predict tomographer performance. It also presents the design of a "universal" GET (UGET) instrument intended to support the full range of verification scenarios envisioned by the IAEA. Finally, it gives examples of the expected partial-defect detection capabilities for some fuels and diversion scenarios, and it provides a comparison of predicted performance for the notional UGET design and an optimized variant of an existing IAEA instrument.
NASA Astrophysics Data System (ADS)
Gu, Z.; Bao, Q.; Taschereau, R.; Wang, H.; Bai, B.; Chatziioannou, A. F.
2014-06-01
Small animal positron emission tomography (PET) systems are often designed with close-geometry configurations. Due to the different characteristics caused by geometrical factors, these tomographs require data acquisition protocols that differ from those optimized for conventional large-diameter ring systems. In this work we optimized the energy window for data acquisitions with PETbox4, a 50 mm detector separation (box-like geometry) pre-clinical PET scanner, using the Geant4 Application for Tomographic Emission (GATE). The fractions of different types of events were estimated using a voxelized phantom including a mouse as well as its supporting chamber, mimicking a realistic mouse imaging environment. Separate code was developed to extract additional information about the gamma interactions for more accurate event type classification. Three types of detector backscatter events were identified in addition to the trues, phantom scatters and randoms. The energy window was optimized based on the noise equivalent count rate (NECR) and scatter fraction (SF), with lower-level discriminators (LLD) corresponding to energies from 150 keV to 450 keV. The results were validated based on the calculated image uniformity, spillover ratio (SOR) and recovery coefficient (RC) from physical measurements using the National Electrical Manufacturers Association (NEMA) NU-4 image quality phantom. These results indicate that when PETbox4 is operated with a narrower energy window (350-650 keV), detector backscatter rejection is unnecessary. For the NEMA NU-4 image quality phantom, the SOR for the water chamber decreases by about 45% from 15.1% to 8.3%, and the SOR for the air chamber decreases by 31% from 12.0% to 8.3% at LLDs of 150 and 350 keV, without obvious change in uniformity, further supporting the simulation-based optimization.
The optimization described in this work is not limited to PETbox4 but should also be applicable to other scanners with small inner-diameter geometries.
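The figures of merit used above follow standard definitions: NECR = T²/(T+S+R) and SF = S/(T+S). The snippet below applies them to made-up count rates at three candidate LLD settings; the numbers are placeholders, not PETbox4 data, and only illustrate why a higher LLD can raise NECR while lowering SF.

```python
# NECR and scatter fraction across candidate lower-level discriminator (LLD)
# settings. Count rates are invented placeholders, not PETbox4 measurements;
# only the figure-of-merit formulas follow the standard definitions.
def necr(trues, scatters, randoms):
    total = trues + scatters + randoms
    return trues ** 2 / total          # NECR = T^2 / (T + S + R)

def scatter_fraction(trues, scatters):
    return scatters / (trues + scatters)   # SF = S / (T + S)

# Hypothetical (trues, scatters, randoms) in counts/s at three LLDs in keV:
# raising the LLD rejects scatter and randoms faster than it loses trues.
rates = {150: (9000.0, 6000.0, 3000.0),
         250: (8500.0, 3500.0, 1500.0),
         350: (7800.0, 1800.0, 700.0)}
best = max(rates, key=lambda lld: necr(*rates[lld]))
print(best, necr(*rates[best]), scatter_fraction(*rates[best][:2]))
```

With these toy rates the 350 keV LLD maximizes NECR despite the lower total count rate, mirroring the trade-off the abstract describes.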
Bayesian tomography and integrated data analysis in fusion diagnostics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Dong, E-mail: lid@swip.ac.cn; Dong, Y. B.; Deng, Wei
2016-11-15
In this article, a Bayesian tomography method using a non-stationary Gaussian process prior is introduced. The Bayesian formalism allows quantities which bear uncertainty to be expressed in probabilistic form, so that the uncertainty of a final solution can be fully resolved from the confidence interval of a posterior probability. Moreover, a consistency check of that solution can be performed by checking whether the misfits between predicted and measured data are reasonably within an assumed data error. In particular, the accuracy of reconstructions is significantly improved by using the non-stationary Gaussian process, which can adapt to the varying smoothness of the emission distribution. This method has been applied to a soft X-ray diagnostic on HL-2A to explore relevant physics in equilibrium and MHD instability modes. This project is carried out within a large-scale inference framework, aiming at an integrated analysis of heterogeneous diagnostics.
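With a Gaussian process prior on the emissivity and a linear chord-integration operator, the tomographic posterior is available in closed form. The sketch below uses a simple stationary RBF kernel and an invented geometry, so it illustrates only the GP-prior machinery, not the non-stationary kernel or the HL-2A soft X-ray setup described above.

```python
# GP-prior tomography sketch: posterior mean of an emission profile for
# d = G x + noise with x ~ GP(0, kernel). Kernel, geometry, and noise level
# are toy assumptions, not the HL-2A diagnostic.
import numpy as np

def gp_tomography_posterior(G, d, kernel, noise_var):
    """Closed-form posterior mean over the reconstruction grid."""
    n = G.shape[1]
    xs = np.linspace(0.0, 1.0, n)
    C = kernel(xs[:, None], xs[None, :])             # prior covariance on the grid
    S = G @ C @ G.T + noise_var * np.eye(G.shape[0]) # data covariance
    return C @ G.T @ np.linalg.solve(S, d)

rbf = lambda a, b: np.exp(-0.5 * (a - b) ** 2 / 0.1 ** 2)  # stationary RBF kernel
rng = np.random.default_rng(2)
n = 50
xs = np.linspace(0.0, 1.0, n)
x_true = np.exp(-0.5 * ((xs - 0.5) / 0.1) ** 2)      # peaked emission profile
G = rng.uniform(0, 1, (80, n)) / n                   # stand-in chord weights
d = G @ x_true + rng.normal(0, 0.001, 80)            # synthetic chord signals
x_post = gp_tomography_posterior(G, d, rbf, 0.001 ** 2)
print(float(np.max(np.abs(x_post - x_true))))
```

A non-stationary kernel, as in the article, would replace the fixed 0.1 lengthscale with one that varies across the grid, letting the prior be smooth in flat regions and flexible near steep gradients.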
Detection of isolated cerebrovascular beta-amyloid with Pittsburgh compound B.
Greenberg, Steven M; Grabowski, Thomas; Gurol, M Edip; Skehan, Maureen E; Nandigam, R N Kaveer; Becker, John A; Garcia-Alloza, Monica; Prada, Claudia; Frosch, Matthew P; Rosand, Jonathan; Viswanathan, Anand; Smith, Eric E; Johnson, Keith A
2008-11-01
Imaging of cerebrovascular beta-amyloid (cerebral amyloid angiopathy) is complicated by the nearly universal overlap of this pathology with Alzheimer's pathology. We performed positron emission tomographic imaging with Pittsburgh Compound B on a 42-year-old man with early manifestations of Iowa-type hereditary cerebral amyloid angiopathy, a form of the disorder with little or no plaque deposits of fibrillar beta-amyloid. The results demonstrated increased Pittsburgh Compound B retention selectively in occipital cortex, sparing regions typically labeled in Alzheimer's disease. These results offer compelling evidence that Pittsburgh Compound B positron emission tomography can noninvasively detect isolated cerebral amyloid angiopathy before overt signs of tissue damage such as hemorrhage or white matter lesions.
Application of Monte Carlo algorithms to the Bayesian analysis of the Cosmic Microwave Background
NASA Technical Reports Server (NTRS)
Jewell, J.; Levin, S.; Anderson, C. H.
2004-01-01
Power spectrum estimation and evaluation of associated errors in the presence of incomplete sky coverage; nonhomogeneous, correlated instrumental noise; and foreground emission are problems of central importance for the extraction of cosmological information from the cosmic microwave background (CMB).
A front-end readout Detector Board for the OpenPET electronics system
NASA Astrophysics Data System (ADS)
Choong, W.-S.; Abu-Nimeh, F.; Moses, W. W.; Peng, Q.; Vu, C. Q.; Wu, J.-Y.
2015-08-01
We present a 16-channel front-end readout board for the OpenPET electronics system. A major task in developing a nuclear medical imaging system, such as a positron emission computed tomograph (PET) or a single-photon emission computed tomograph (SPECT), is the electronics system. While there are a wide variety of detector and camera design concepts, the relatively simple nature of the acquired data allows for a common set of electronics requirements that can be met by a flexible, scalable, and high-performance OpenPET electronics system. The analog signals from the different types of detectors used in medical imaging share similar characteristics, which allows for a common analog signal processing. The OpenPET electronics processes the analog signals with Detector Boards. Here we report on the development of a 16-channel Detector Board. Each signal is digitized by a continuously sampled analog-to-digital converter (ADC), which is processed by a field programmable gate array (FPGA) to extract pulse height information. A leading edge discriminator creates a timing edge that is "time stamped" by a time-to-digital converter (TDC) implemented inside the FPGA. This digital information from each channel is sent to an FPGA that services 16 analog channels, and then information from multiple channels is processed by this FPGA to perform logic for crystal lookup, DOI calculation, calibration, etc.
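The leading-edge time pickoff described above can be sketched in software: find the first ADC sample crossing a fixed threshold and interpolate a sub-sample "time stamp", as the TDC in the FPGA would. The pulse shape, units, and function name below are illustrative, not OpenPET firmware.

```python
# Software sketch of a leading-edge discriminator time stamp; the waveform
# and threshold are invented, and real firmware works on integer ADC codes.
def leading_edge_timestamp(samples, threshold):
    """Return the threshold-crossing time in sample units via linear
    interpolation between adjacent samples, or None if never crossed."""
    for i in range(1, len(samples)):
        s0, s1 = samples[i - 1], samples[i]
        if s0 < threshold <= s1:
            return (i - 1) + (threshold - s0) / (s1 - s0)
    return None

# Ramp-like rising edge sampled at integer times.
pulse = [0.0, 2.0, 4.0, 6.0, 8.0, 7.0, 5.0, 3.0]
t = leading_edge_timestamp(pulse, threshold=5.0)
print(t)   # crosses 5.0 between samples 2 (value 4.0) and 3 (value 6.0) -> 2.5
```

A hardware TDC quantizes this crossing time to its bin width instead of interpolating, but the leading-edge principle is the same.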
Soh, Shui-Boon; Pham, Alan; O'Hehir, Robyn E; Cherk, Martin; Topliss, Duncan J
2013-09-01
A 42-year-old woman presented with a rapidly enlarging right-sided thyroid mass and underwent hemithyroidectomy. Riedel's thyroiditis was only diagnosed upon surgical decompression of the right carotid artery 2 years later. She became more symptomatic as Riedel's thyroiditis progressed, and mediastinal fibrosclerosis developed over the next 12 months. Oral prednisolone failed to improve her condition, and she was commenced on tamoxifen. Despite initial improvement, her symptoms recurred 2 years later, mainly arising from compression of the trachea and esophagus at the thoracic inlet. Fluorodeoxyglucose positron emission tomographic scan showed locally advanced active invasive fibrosclerosis in the neck and mediastinum. An elevated activin-A level of 218 pg/mL was consistent with active inflammation. IgG subtypes (including IgG4) were normal. Two courses of i.v. methylprednisolone were given but produced only transient improvement. Subsequently, the patient received 3 doses of i.v. rituximab at monthly intervals and had prompt, sustained symptomatic improvement. The activin-A level decreased to 122 pg/mL 10 months after rituximab therapy. Fluorodeoxyglucose positron emission tomographic scan 6 weeks after therapy showed reduction in inflammation. A further scan at 10 months demonstrated ongoing response to rituximab. This is a case of refractory Riedel's thyroiditis with symptomatic, biochemical, and radiological improvement that has persisted 14 months after rituximab. The likelihood and duration of response to rituximab in Riedel's thyroiditis require further study.
Giacomelli, L; Conroy, S; Gorini, G; Horton, L; Murari, A; Popovichev, S; Syme, D B
2014-02-01
The Joint European Torus (JET, Culham, UK) is the largest tokamak in the world devoted to nuclear fusion experiments of magnetic confined Deuterium (D)/Deuterium-Tritium (DT) plasmas. Neutrons produced in these plasmas are measured using various types of neutron detectors and spectrometers. Two of these instruments on JET make use of organic liquid scintillator detectors. The neutron emission profile monitor implements 19 liquid scintillation counters to detect the 2.45 MeV neutron emission from D plasmas. A new compact neutron spectrometer is operational at JET since 2010 to measure the neutron energy spectra from both D and DT plasmas. Liquid scintillation detectors are sensitive to both neutron and gamma radiation but give light responses of different decay time such that pulse shape discrimination techniques can be applied to identify the neutron contribution of interest from the data. The most common technique consists of integrating the radiation pulse shapes within different ranges of their rising and/or trailing edges. In this article, a step forward in this type of analysis is presented. The method applies a tomographic analysis of the 3-dimensional neutron and gamma pulse shape and pulse height distribution data obtained from liquid scintillation detectors such that n/γ discrimination can be improved to lower energies and additional information can be gained on neutron contributions to the gamma events and vice versa.
Greenhouse Gas Source Attribution: Measurements Modeling and Uncertainty Quantification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Zhen; Safta, Cosmin; Sargsyan, Khachik
2014-09-01
In this project we have developed atmospheric measurement capabilities and a suite of atmospheric modeling and analysis tools that are well suited for verifying emissions of greenhouse gases (GHGs) on an urban-through-regional scale. We have for the first time applied the Community Multiscale Air Quality (CMAQ) model to simulate atmospheric CO2. This will allow for the examination of regional-scale transport and distribution of CO2 along with air pollutants traditionally studied using CMAQ at relatively high spatial and temporal resolution, with the goal of leveraging emissions verification efforts for both air quality and climate. We have developed a bias-enhanced Bayesian inference approach that can remedy the well-known problem of transport model errors in atmospheric CO2 inversions. We have tested the approach using data and model outputs from the TransCom3 global CO2 inversion comparison project. We have also performed two prototyping studies on inversion approaches in the generalized convection-diffusion context. One of these studies employed Polynomial Chaos Expansion to accelerate the evaluation of a regional transport model and enable efficient Markov Chain Monte Carlo sampling of the posterior for Bayesian inference. The other approach uses deterministic inversion of a convection-diffusion-reaction system in the presence of uncertainty. These approaches should, in principle, be applicable to realistic atmospheric problems with moderate adaptation. We outline a regional greenhouse gas source inference system that integrates (1) two approaches to atmospheric dispersion simulation and (2) a class of Bayesian inference and uncertainty quantification algorithms. We use two different and complementary approaches to simulate atmospheric dispersion. Specifically, we use a Eulerian chemical transport model, CMAQ, and a Lagrangian particle dispersion model, FLEXPART-WRF.
These two models share the same WRF-assimilated meteorology fields, making it possible to perform a hybrid simulation in which the Eulerian model (CMAQ) can be used to compute the initial condition needed by the Lagrangian model, while the source-receptor relationships for a large state vector can be efficiently computed using the Lagrangian model in its backward mode. In addition, CMAQ has a complete treatment of the atmospheric chemistry of a suite of traditional air pollutants, many of which could help attribute GHGs from different sources. The inference of emission sources using atmospheric observations is cast as a Bayesian model calibration problem, which is solved using a variety of Bayesian techniques, such as the bias-enhanced Bayesian inference algorithm, which accounts for the intrinsic model deficiency; Polynomial Chaos Expansion to accelerate model evaluation and Markov Chain Monte Carlo sampling; and Karhunen-Loève (KL) Expansion to reduce the dimensionality of the state space. We have established an atmospheric measurement site in Livermore, CA and are collecting continuous measurements of CO2, CH4 and other species that are typically co-emitted with these GHGs. Measurements of co-emitted species can assist in attributing the GHGs to different emissions sectors. Automatic calibrations using traceable standards are performed routinely for the gas-phase measurements. We are also collecting standard meteorological data at the Livermore site as well as planetary boundary height measurements using a ceilometer. The location of the measurement site is well suited to sample air transported between the San Francisco Bay area and the California Central Valley.
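The Bayesian model-calibration step described above can be reduced to a minimal example: infer a scalar emission rate from noisy concentration observations, with a linear footprint model, via Metropolis MCMC. All numbers, the footprint values, and the priors below are invented; a real system would add a transport-model bias term and higher-dimensional state.

```python
# Toy Bayesian source-calibration via Metropolis MCMC. The footprint values,
# noise level, and prior are illustrative assumptions, not project data.
import numpy as np

rng = np.random.default_rng(3)
footprint = rng.uniform(0.5, 2.0, 30)        # stand-in source-receptor sensitivities
q_true = 4.0
obs = footprint * q_true + rng.normal(0, 0.2, 30)

def log_post(q, sigma=0.2, prior_mean=1.0, prior_sd=10.0):
    """Gaussian likelihood for obs = footprint * q + noise, weak Gaussian prior."""
    if q <= 0:
        return -np.inf                       # emission rate must be positive
    ll = -0.5 * np.sum((obs - footprint * q) ** 2) / sigma ** 2
    lp = -0.5 * ((q - prior_mean) / prior_sd) ** 2
    return ll + lp

q, chain = 1.0, []
for _ in range(5000):
    prop = q + rng.normal(0, 0.1)            # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(q):
        q = prop                             # Metropolis accept
    chain.append(q)
q_hat = float(np.mean(chain[1000:]))         # posterior mean after burn-in
print(q_hat)
```

This is the sampling pattern that Polynomial Chaos surrogates accelerate: each `log_post` call would otherwise require a full dispersion-model run.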
Objectified quantification of uncertainties in Bayesian atmospheric inversions
NASA Astrophysics Data System (ADS)
Berchet, A.; Pison, I.; Chevallier, F.; Bousquet, P.; Bonne, J.-L.; Paris, J.-D.
2015-05-01
Classical Bayesian atmospheric inversions process atmospheric observations and prior emissions, the two being connected by an observation operator representing mainly the atmospheric transport. These inversions rely on prescribed errors in the observations, the prior emissions and the observation operator. When data are sparse, inversion results are very sensitive to the prescribed error distributions, which are not accurately known. The classical Bayesian framework experiences difficulties in quantifying the impact of mis-specified error distributions on the optimized fluxes. In order to cope with this issue, we rely on recent research results to enhance the classical Bayesian inversion framework through a marginalization over a large set of plausible errors that can be prescribed in the system. The marginalization consists of computing inversions for all possible error distributions, weighted by the probability of occurrence of each error distribution. The posterior distribution of the fluxes calculated by the marginalization is not explicitly describable. As a consequence, we carry out a Monte Carlo sampling based on an approximation of the probability of occurrence of the error distributions. This approximation is deduced from the well-tested method of maximum likelihood estimation. Thus, the marginalized inversion relies on an automatic, objectified diagnosis of the error statistics, without any prior knowledge about the matrices. It robustly accounts for the uncertainties in the error distributions, contrary to what is classically done with frozen expert-knowledge error statistics. Some expert knowledge is still used in the method for the choice of an emission aggregation pattern and of a sampling protocol in order to reduce the computation cost. The relevance and robustness of the method are tested on a case study: the inversion of methane surface fluxes at the mesoscale with virtual observations on a realistic network in Eurasia.
Observing system simulation experiments are carried out with different transport patterns, flux distributions and total prior amounts of emitted methane. The method proves to consistently reproduce the known "truth" in most cases, with satisfactory tolerance intervals. Additionally, the method explicitly provides influence scores and posterior correlation matrices. An in-depth interpretation of the inversion results is then possible. The more objective quantification of the influence of the observations on the fluxes proposed here allows us to evaluate the impact of the observation network on the characterization of the surface fluxes. The explicit correlations between emission aggregates reveal the mis-separated regions, hence the typical temporal and spatial scales the inversion can analyse. These scales are consistent with the chosen aggregation patterns.
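The marginalization idea above can be illustrated in miniature: instead of freezing one observation-error variance, run conjugate Gaussian inversions over a set of plausible variances and weight them by their Gaussian marginal likelihood. The dimensions, grid, and prior below are toy assumptions, not the Eurasian methane setup.

```python
# Sketch of marginalizing an inversion over candidate error variances, each
# weighted by its marginal likelihood. All quantities are invented toy values.
import numpy as np

rng = np.random.default_rng(4)
H = rng.uniform(0, 1, (40, 3))                    # stand-in observation operator
x_true = np.array([1.0, 2.0, 0.5])
y = H @ x_true + rng.normal(0, 0.2, 40)           # true error variance is 0.04

def inversion(r_var, prior_var=4.0):
    """Posterior mean (zero prior mean) and Gaussian log marginal likelihood."""
    C = prior_var * np.eye(3)
    S = H @ C @ H.T + r_var * np.eye(40)
    x_post = C @ H.T @ np.linalg.solve(S, y)
    sign, logdet = np.linalg.slogdet(S)
    log_ml = -0.5 * (logdet + y @ np.linalg.solve(S, y))
    return x_post, log_ml

r_grid = [0.01, 0.04, 0.16, 0.64]                 # candidate error variances
posts, log_w = zip(*(inversion(r) for r in r_grid))
w = np.exp(np.array(log_w) - max(log_w))
w /= w.sum()                                      # normalized occurrence weights
x_marg = sum(wi * xi for wi, xi in zip(w, posts)) # marginalized flux estimate
print(x_marg, dict(zip(r_grid, np.round(w, 3))))
```

The weights concentrate on the variance most consistent with the data, which is how the maximum-likelihood approximation in the article replaces frozen expert-chosen error matrices.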
Water vapour tomography using GPS phase observations: Results from the ESCOMPTE experiment
NASA Astrophysics Data System (ADS)
Nilsson, T.; Gradinarsky, L.; Elgered, G.
2007-10-01
Global Positioning System (GPS) tomography is a technique for estimating the 3-D structure of atmospheric water vapour using data from a dense local network of GPS receivers. Several current methods utilize estimates of slant wet delays between the GPS satellites and the receivers on the ground, which are difficult to obtain with millimetre accuracy from the GPS observations. We present results of applying a new tomographic method to GPS data from the Expérience sur Site pour COntraindre les Modèles de Pollution atmosphérique et de Transport d'Emissions (ESCOMPTE) experiment in southern France. This method does not rely on any slant wet delay estimates; instead, it uses the GPS phase observations directly. We show that the wet refractivity profiles estimated by this method are as accurate as, or more accurate than, those from other tomographic methods. The results are in agreement with earlier simulations; for example, the profile information is limited above 4 km.
AN EXAMINATION OF THE CMAQ SIMULATIONS OF THE WET DEPOSITION OF AMMONIUM FROM A BAYESIAN PERSPECTIVE
The objective of this study is to ascertain the effects of precipitation simulations and emissions on CMAQ simulations of deposition. In both seasons, CMAQ tends to underpredict the deposition amounts. Based on the co-located measurements of ammonium wet deposition and precipita...
Source partitioning of methane emissions and its seasonality in the U.S. Midwest
USDA-ARS?s Scientific Manuscript database
The methane (CH4) budget and its source partitioning are poorly constrained in the Midwestern, United States. We used tall tower (185 m) aerodynamic flux measurements and atmospheric scale factor Bayesian inversions (SFBI) to constrain the monthly budget and to partition the total budget into natura...
Recent global methane trends: an investigation using hierarchical Bayesian methods
NASA Astrophysics Data System (ADS)
Rigby, M. L.; Stavert, A.; Ganesan, A.; Lunt, M. F.
2014-12-01
Following a decade with little growth, methane concentrations began to increase across the globe in 2007, and have continued to rise ever since. The reasons for this renewed growth are currently the subject of much debate. Here, we discuss the recent observed trends, and highlight some of the strengths and weaknesses in current "inverse" methods for quantifying fluxes using observations. In particular, we focus on the outstanding problems of accurately quantifying uncertainties in inverse frameworks. We examine to what extent the recent methane changes can be explained by the current generation of flux models and inventories. We examine the major modes of variability in wetland models along with the Global Fire Emissions Database (GFED) and the Emissions Database for Global Atmospheric Research (EDGAR). Using the Model for Ozone and Related Tracers (MOZART), we determine whether the spatial and temporal atmospheric trends predicted using these emissions can be brought into consistency with in situ atmospheric observations. We use a novel hierarchical Bayesian methodology in which scaling factors applied to the principal components of the flux fields are estimated simultaneously with the uncertainties associated with the a priori fluxes and with model representations of the observations. Using this method, we examine the predictive power of methane flux models for explaining recent fluctuations.
Quantifying methane and nitrous oxide emissions from the UK using a dense monitoring network
NASA Astrophysics Data System (ADS)
Ganesan, A. L.; Manning, A. J.; Grant, A.; Young, D.; Oram, D. E.; Sturges, W. T.; Moncrieff, J. B.; O'Doherty, S.
2015-01-01
The UK is one of several countries around the world that has enacted legislation to reduce its greenhouse gas emissions. Monitoring of emissions has been done through a detailed sectoral-level bottom-up inventory (UK National Atmospheric Emissions Inventory, NAEI), from which national totals are submitted yearly to the United Nations Framework Convention on Climate Change. In parallel, the UK government has funded four atmospheric monitoring stations to infer emissions through top-down methods that assimilate atmospheric observations. In this study, we present top-down emissions of methane (CH4) and nitrous oxide (N2O) for the UK and Ireland over the period August 2012 to August 2014. We used a hierarchical Bayesian inverse framework to infer fluxes as well as a set of covariance parameters that describe uncertainties in the system. We inferred average UK emissions of 2.08 (1.72-2.47) Tg yr-1 CH4 and 0.105 (0.087-0.127) Tg yr-1 N2O and found our derived estimates to be generally lower than the inventory. We used sectoral distributions from the NAEI to determine whether these discrepancies can be attributed to specific source sectors. Because of the distinct distributions of the two dominant CH4 emissions sectors in the UK, agriculture and waste, we found that the inventory may be overestimated in agricultural CH4 emissions. We also found that N2O fertilizer emissions from the NAEI may be overestimated, and we derived a significant seasonal cycle in emissions. This seasonality is likely due to seasonality in fertilizer application and in environmental drivers such as temperature and rainfall, which are not reflected in the annual-resolution inventory. Through the hierarchical Bayesian inverse framework, we quantified uncertainty covariance parameters and emphasized their importance for high-resolution emissions estimation.
We inferred average model errors of approximately 20 and 0.4 ppb and correlation timescales of 1.0 (0.72-1.43) and 2.6 (1.9-3.9) days for CH4 and N2O, respectively. These errors are a combination of transport model errors as well as errors due to unresolved emissions processes in the inventory. We found the largest CH4 errors at the Tacolneston station in eastern England, possibly due to sporadic emissions from landfills and offshore gas in the North Sea.
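The inferred model errors and correlation timescales quoted above correspond, in a typical hierarchical treatment, to parameters of an observation-error covariance. A minimal sketch, assuming an exponentially decaying temporal covariance (the specific functional form used in the study is not stated here) with the CH4 values of roughly 20 ppb and a 1.0-day timescale:

```python
import numpy as np

def model_error_covariance(times, sigma, tau):
    """Exponentially decaying covariance, R_ij = sigma^2 * exp(-|t_i - t_j| / tau)."""
    dt = np.abs(times[:, None] - times[None, :])
    return sigma**2 * np.exp(-dt / tau)

# Hypothetical hourly CH4 residuals over 3 days, with sigma ~ 20 ppb model
# error and tau ~ 1.0 day correlation timescale (the paper's inferred values).
times = np.arange(0, 3, 1 / 24.0)               # sample times in days
R = model_error_covariance(times, sigma=20.0, tau=1.0)

def log_likelihood(resid, R):
    # Gaussian log-likelihood of a residual vector under covariance R.
    sign, logdet = np.linalg.slogdet(R)
    return -0.5 * (resid @ np.linalg.solve(R, resid)
                   + logdet + resid.size * np.log(2 * np.pi))

print(log_likelihood(np.zeros(times.size), R))  # perfect fit, for illustration
```

In the hierarchical framework, sigma and tau are themselves assigned priors and inferred alongside the fluxes.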
Dental optical tomography with upconversion nanoparticles—a feasibility study
Long, Feixiao; Intes, Xavier
2017-01-01
Abstract. Upconversion nanoparticles (UCNPs) have the unique ability to emit multiple colors upon excitation by near-infrared (NIR) light. Herein, we investigate the potential use of UCNPs as contrast agents for dental optical tomography, with a focus on monitoring the status of fillings after dental restoration. The potential of performing tomographic imaging using UCNP emission of visible or NIR light is established. This in silico and ex vivo study paves the way toward employing UCNPs as theranostic agents for dental applications. PMID:28586852
1983-02-23
Annihilation Techniques SONIA MIIAN S., R. ZANA, J.CH. ABBE, and G. DUPLATRE - xxviii P-80 Study of Microemulsion Systems by Positron Annihilation... California, Berkeley CA 94720, U.S.A. Primary considerations for the design of positron emission tomographs for medical studies in humans are high... imaging system for medical applications is to produce an image in as short a time as possible which represents as accurately as possible the
Commissioning of the J-PET Detector for Studies of Decays of Positronium Atoms
NASA Astrophysics Data System (ADS)
Czerwiński, E.; Dulski, K.; Białas, P.; Curceanu, C.; Gajos, A.; Głowacz, B.; Gorgol, M.; Hiesmayr, B. C.; Jasińska, B.; Kisielewska, D.; Korcyl, G.; Kowalski, P.; Kozik, T.; Krawczyk, N.; Krzemień, W.; Kubicz, E.; Mohammed, M.; Niedźwiecki, Sz.; Pałka, M.; Pawlik-Niedźwiecka, M.; Raczyński, L.; Rudy, Z.; Sharma, N. G.; Sharma, S.; Shopa, R. Y.; Silarski, M.; Skurzok, M.; Wieczorek, A.; Wiślicki, W.; Zgardzińska, B.; Zieliński, M.; Moskal, P.
The Jagiellonian Positron Emission Tomograph (J-PET) is a detector for medical imaging of the whole human body as well as for physics studies involving detection of electron-positron annihilation into photons. J-PET has high angular and time resolution and allows for measurement of spin of the positronium and the momenta and polarization vectors of annihilation quanta. In this article, we present the potential of the J-PET system for background rejection in the decays of positronium atoms.
Hybrid Gamma Emission Tomography (HGET): FY16 Annual Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Erin A.; Smith, Leon E.; Wittman, Richard S.
2017-02-01
Current International Atomic Energy Agency (IAEA) methodologies for the verification of fresh low-enriched uranium (LEU) and mixed oxide (MOX) fuel assemblies are volume-averaging methods that lack sensitivity to individual pins. Further, as fresh fuel assemblies become more and more complex (e.g., heavy gadolinium loading, high degrees of axial and radial variation in fissile concentration), the accuracy of current IAEA instruments degrades and measurement time increases. Particularly in light of the fact that no special tooling is required to remove individual pins from modern fuel assemblies, the IAEA needs new capabilities for the verification of unirradiated (i.e., fresh LEU and MOX) assemblies to ensure that fissile material has not been diverted. Passive gamma emission tomography has demonstrated potential to provide pin-level verification of spent fuel, but gamma-ray emission rates from unirradiated fuel are significantly lower, precluding purely passive tomography methods. The work presented here introduces the concept of Hybrid Gamma Emission Tomography (HGET) for verification of unirradiated fuels, in which a neutron source is used to actively interrogate the fuel assembly and the resulting gamma-ray emissions are imaged using tomographic methods to provide pin-level verification of fissile material concentration.
Hu, X H; Li, Y P; Huang, G H; Zhuang, X W; Ding, X W
2016-05-01
In this study, a Bayesian-based two-stage inexact optimization (BTIO) method is developed for supporting water quality management by coupling Bayesian analysis with interval two-stage stochastic programming (ITSP). The BTIO method is capable of addressing uncertainties caused by insufficient inputs to the water quality model as well as uncertainties expressed as probabilistic distributions and interval numbers. The BTIO method is applied to a real case of water quality management for the Xiangxi River basin in the Three Gorges Reservoir region to seek optimal water quality management schemes under various uncertainties. Interval solutions for production patterns under a range of probabilistic water quality constraints have been generated. Results obtained demonstrate compromises between the system benefit and the system failure risk due to inherent uncertainties that exist in various system components. Moreover, information about pollutant emissions is obtained, which would help managers adjust production patterns of regional industry and local policies considering the interactions of water quality requirements, economic benefit, and industry structure.
Experimental adaptive quantum tomography of two-qubit states
NASA Astrophysics Data System (ADS)
Struchalin, G. I.; Pogorelov, I. A.; Straupe, S. S.; Kravtsov, K. S.; Radchenko, I. V.; Kulik, S. P.
2016-01-01
We report an experimental realization of adaptive Bayesian quantum state tomography for two-qubit states. Our implementation is based on the adaptive experimental design strategy proposed by Huszár and Houlsby [F. Huszár and N. M. T. Houlsby, Phys. Rev. A 85, 052120 (2012)] and provides an optimal measurement approach in terms of the information gain. We address the practical questions which one faces in any experimental application: the influence of technical noise and the behavior of the tomographic algorithm for an easy-to-implement class of factorized measurements. In an experiment with polarization states of entangled photon pairs, we observe a lower instrumental noise floor and superior reconstruction accuracy for nearly pure states with the adaptive protocol compared to a nonadaptive one. At the same time, we show that for mixed states the restriction to factorized measurements results in no advantage for adaptive measurements, so general measurements have to be used.
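The adaptive strategy of choosing each measurement to maximize expected information gain can be illustrated on a toy one-parameter analogue. The hypothesis grid, candidate axes, and measurement count below are all hypothetical; the actual protocol operates on two-qubit polarization states with factorized measurements.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy analogue: estimate the Bloch angle theta of a pure qubit state from
# +/-1 projective measurements along candidate axes, choosing each axis to
# maximize the expected information gain (entropy reduction) of the posterior.
thetas = np.linspace(0.0, np.pi, 181)   # discretized hypothesis grid
axes = np.linspace(0.0, np.pi, 7)       # easy-to-implement measurement set
theta_true = 1.1

def p_up(theta, axis):
    # Probability of the +1 outcome for a measurement along `axis`.
    return np.cos((theta - axis) / 2.0) ** 2

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def expected_info_gain(post, axis):
    pu = p_up(thetas, axis)
    m = np.sum(post * pu)               # predictive probability of +1
    post_up = post * pu / m
    post_dn = post * (1.0 - pu) / (1.0 - m)
    return entropy(post) - m * entropy(post_up) - (1.0 - m) * entropy(post_dn)

post = np.full(thetas.size, 1.0 / thetas.size)
for _ in range(200):
    axis = max(axes, key=lambda a: expected_info_gain(post, a))
    outcome = rng.random() < p_up(theta_true, axis)
    post = post * (p_up(thetas, axis) if outcome else 1.0 - p_up(thetas, axis))
    post /= post.sum()

theta_hat = thetas[np.argmax(post)]
print(theta_hat)
```

A nonadaptive protocol would instead cycle through the axes in a fixed order; the adaptive choice concentrates the posterior faster for nearly pure states, mirroring the experimental finding.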
New method to analyze internal disruptions with tomographic reconstructions
NASA Astrophysics Data System (ADS)
Tanzi, C. P.; de Blank, H. J.
1997-03-01
Sawtooth crashes have been investigated on the Rijnhuizen Tokamak Project (RTP) [N. J. Lopes Cardozo et al., Proceedings of the 14th International Conference on Plasma Physics and Controlled Nuclear Fusion Research, Würzburg, 1992 (International Atomic Energy Agency, Vienna, 1993), Vol. 1, p. 271]. Internal disruptions in tokamak plasmas often exhibit an m=1 poloidal mode structure prior to the collapse which can be clearly identified by means of multicamera soft x-ray diagnostics. In this paper tomographic reconstructions of such m=1 modes are analyzed with a new method, based on magnetohydrodynamic (MHD) invariants computed from the two-dimensional emissivity profiles, which quantifies the amount of profile flattening not only after the crash but also during the precursor oscillations. The results are interpreted by comparing them with two models which simulate the measurements of the m=1 redistribution of soft x-ray emissivity prior to the sawtooth crash. One model is based on the magnetic reconnection model of Kadomtsev. The other involves ideal MHD motion only. In cases where differences in magnetic topology between the two models cannot be seen in the tomograms, the analysis of profile flattening has an advantage. The analysis shows that in RTP the clearly observed m=1 displacement of some sawteeth requires the presence of convective ideal MHD motion, whereas other precursors are consistent with magnetic reconnection of up to 75% of the magnetic flux within the q=1 surface. The possibility of ideal interchange combined with enhanced cross-field transport is not excluded.
Simulation of Medical Imaging Systems: Emission and Transmission Tomography
NASA Astrophysics Data System (ADS)
Harrison, Robert L.
Simulation is an important tool in medical imaging research. In patient scans the true underlying anatomy and physiology are unknown. We have no way of knowing in a given scan how various factors are confounding the data: statistical noise, biological variability, patient motion, scattered radiation, dead time, and other data contaminants. Simulation allows us to isolate a single factor of interest, for instance when researchers perform multiple simulations of the same imaging situation to determine the effect of statistical noise or biological variability. Simulations are also increasingly used as a design optimization tool for tomographic scanners. This article gives an overview of the mechanics of emission and transmission tomography simulation, reviews some of the publicly available simulation tools, and discusses trade-offs between the accuracy and efficiency of simulations.
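The idea of isolating statistical noise by repeating simulations of the same imaging situation can be sketched with a toy Poisson projection model. The phantom and system matrix below are invented, not drawn from any of the simulation tools the article reviews:

```python
import numpy as np

rng = np.random.default_rng(2)

# Noise-free projection data through a toy system matrix, then independent
# Poisson count noise per detector bin; redrawing the noise while holding the
# phantom fixed isolates the effect of statistical noise on the data.
n_pix, n_bins = 64, 40
activity = np.zeros(n_pix)
activity[20:36] = 5.0                   # 1-D "hot rod" phantom

# Hypothetical system matrix: each bin integrates activity over a random strip.
A = (rng.random((n_bins, n_pix)) < 0.2).astype(float)

mean_counts = A @ activity              # noise-free projections
scan_1 = rng.poisson(mean_counts)       # two noise realizations of one scan
scan_2 = rng.poisson(mean_counts)
print(scan_1.sum(), scan_2.sum())
```

Comparing reconstructions of `scan_1` and `scan_2` against a reconstruction of `mean_counts` would show the noise contribution alone, since anatomy, motion, and scatter are held fixed by construction.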
NASA Astrophysics Data System (ADS)
Liu, Shuangquan; Zhang, Bin; Wang, Xin; Li, Lin; Chen, Yan; Liu, Xin; Liu, Fei; Shan, Baoci; Bai, Jing
2011-02-01
A dual-modality imaging system for simultaneous fluorescence molecular tomography (FMT) and positron emission tomography (PET) of small animals has been developed. The system consists of a noncontact 360°-projection FMT module and a PET module based on a pair of flat-panel detectors, which are mounted orthogonally to eliminate cross interference. The FMT images and PET data are acquired simultaneously using a dynamic sampling mode. Phantom experiments, in which the localization and range of radioactive and fluorescent probes are exactly indicated, have been carried out to verify the feasibility of the system. A tumor-bearing mouse was also scanned using the dual-modality simultaneous imaging system; the preliminary fluorescence tomographic images and PET images demonstrate the in vivo performance of the presented dual-modality system.
Dulski, Kamil; Niedźwiecki, Szymon; Alfs, Dominika; Białas, Piotr; Curceanu, Catalina; Czerwiński, Eryk; Danel, Andrzej; Gajos, Aleksander; Głowacz, Bartosz; Gorgol, Marek; Hiesmayr, Beatrix; Jasińska, Bożena; Kacprzak, Krzysztof; Kamińska, Daria; Kapłon, Łukasz; Kochanowski, Andrzej; Korcyl, Grzegorz; Kowalski, Paweł; Kozik, Tomasz; Krzemień, Wojciech; Kubicz, Ewelina; Kucharek, Mateusz; Mohammed, Muhsin; Pawlik-Niedźwiecka, Monika; Pałka, Marek; Raczyński, Lech; Rudy, Zbigniew; Rundel, Oleksandr; Sharma, Neha G.; Silarski, Michał; Uchacz, Tomasz; Wiślicki, Wojciech; Zgardzińska, Bożena; Zieliński, Marcin; Moskal, Paweł
2017-01-01
A novel plastic scintillator has been developed for application in digital positron emission tomography (PET). The novelty of the concept lies in the application of 2-(4-styrylphenyl)benzoxazole as a wavelength shifter; the substance has not been used as a scintillator dopant before. The dopant shifts the scintillation spectrum towards longer wavelengths, making it more suitable for scintillators with long-strip geometry and for light detection with digital silicon photomultipliers. These features open perspectives for the construction of a cost-effective and MRI-compatible PET scanner with a large field of view. In this article we present the synthesis method and characterize the performance of the elaborated scintillator by determining its light emission spectrum, light emission efficiency, the rise and decay times of the scintillation pulses, and the resulting timing resolution when applied in positron emission tomography. The optimal concentration of the novel wavelength shifter was established by maximizing the light output and was found to be 0.05 ‰ for a cuboidal scintillator with dimensions of 14 mm × 14 mm × 20 mm. PMID:29176834
NASA Astrophysics Data System (ADS)
Salas-García, Irene; Fanjul-Vélez, Félix; Arce-Diego, José Luis
2012-03-01
The development of Photodynamic Therapy (PDT) predictive models has become a valuable tool for optimal treatment planning, monitoring, and dosimetry adjustment. A few attempts have achieved a fairly complete characterization of the complex photochemical and photophysical processes involved, even taking into account superficial fluorescence in the target tissue. The present work is devoted to the application of a predictive PDT model to obtain fluorescence tomography information during PDT applied to a skin disease. The model takes into account the optical radiation distribution, a non-homogeneous topical photosensitizer distribution, the time-dependent photochemical interaction and the photosensitizer fluorescence emission. The results show the spatial evolution of the photosensitizer fluorescence emission and the amount of singlet oxygen produced during PDT. The depth-dependent photosensitizer fluorescence emission obtained is essential to estimate the spatial photosensitizer concentration and its degradation due to photobleaching. As a consequence, the proposed approach could be used to predict photosensitizer fluorescence tomographic measurements during PDT. The singlet oxygen prediction could also be employed as a valuable tool to predict the short-term treatment outcome.
Clinical applications with the HIDAC positron camera
NASA Astrophysics Data System (ADS)
Frey, P.; Schaller, G.; Christin, A.; Townsend, D.; Tochon-Danguy, H.; Wensveen, M.; Donath, A.
1988-06-01
A high density avalanche chamber (HIDAC) positron camera has been used for positron emission tomographic (PET) imaging in three different human studies, including patients presenting with: (I) thyroid diseases (124 cases); (II) clinically suspected malignant tumours of the pharynx or larynx (ENT) region (23 cases); and (III) clinically suspected primary malignant and metastatic tumours of the liver (9 cases, 19 PET scans). The positron-emitting radiopharmaceuticals used for the three studies were Na 124I (4.2 d half-life) for the thyroid, 55Co-bleomycin (17.5 h half-life) for the ENT region and 68Ga-colloid (68 min half-life) for the liver. Tomographic imaging was performed: (I) 24 h after oral Na 124I administration to the thyroid patients, (II) 18 h after intravenous administration of 55Co-bleomycin to the ENT patients and (III) 20 min following the intravenous injection of 68Ga-colloid to the liver tumour patients. Three different imaging protocols were used with the HIDAC positron camera to perform appropriate tomographic imaging in each patient study. Promising results were obtained in all three studies, particularly in tomographic thyroid imaging, where the PET technique makes possible a significant clinical contribution to diagnosis and therapy planning. In the other two PET studies encouraging results were obtained for the detection and precise localisation of malignant tumour disease, including an estimate of the functional liver volume based on the reticulo-endothelial system (RES) of the liver, obtained in vivo, and the three-dimensional display of liver PET data using shaded graphics techniques. The clinical significance of the overall results obtained in both the ENT and the liver PET study, however, is still uncertain, and the respective role of PET as a new imaging modality in these applications is not yet clearly established.
The clinical impact made by PET in liver and ENT malignant tumour staging needs further investigation, and more detailed data on a larger number of clinical and experimental PET scans will be necessary for definitive evaluation. Nevertheless, the HIDAC positron camera may be used for clinical PET imaging in well-defined patient cases, particularly in situations where high spatial resolution is desired in the reconstructed image of the examined pathological condition and at the same time "static" PET imaging is adequate, as is the case in thyroid, ENT and liver tomographic imaging using the HIDAC positron camera.
NASA Astrophysics Data System (ADS)
Kopacz, Monika; Jacob, Daniel J.; Henze, Daven K.; Heald, Colette L.; Streets, David G.; Zhang, Qiang
2009-02-01
We apply the adjoint of an atmospheric chemical transport model (GEOS-Chem CTM) to constrain Asian sources of carbon monoxide (CO) with 2° × 2.5° spatial resolution using Measurement of Pollution in the Troposphere (MOPITT) satellite observations of CO columns in February-April 2001. Results are compared to the more common analytical method for solving the same Bayesian inverse problem and applied to the same data set. The analytical method is more exact but because of computational limitations it can only constrain emissions over coarse regions. We find that the correction factors to the a priori CO emission inventory from the adjoint inversion are generally consistent with those of the analytical inversion when averaged over the large regions of the latter. The adjoint solution reveals fine-scale variability (cities, political boundaries) that the analytical inversion cannot resolve, for example, in the Indian subcontinent or between Korea and Japan, and some of that variability is of opposite sign which points to large aggregation errors in the analytical solution. Upward correction factors to Chinese emissions from the prior inventory are largest in central and eastern China, consistent with a recent bottom-up revision of that inventory, although the revised inventory also sees the need for upward corrections in southern China where the adjoint and analytical inversions call for downward correction. Correction factors for biomass burning emissions derived from the adjoint and analytical inversions are consistent with a recent bottom-up inventory on the basis of MODIS satellite fire data.
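The analytical method referred to above is, for a linear-Gaussian problem, a closed-form posterior-mean update. A minimal sketch with invented dimensions and error levels (the real inversion uses a 2° × 2.5° emission state vector and MOPITT CO-column operators):

```python
import numpy as np

rng = np.random.default_rng(3)

# Closed-form posterior mean for a linear-Gaussian inverse problem:
# x_post = x_prior + B K^T (K B K^T + R)^(-1) (y - K x_prior).
n_state, n_obs = 12, 30
K = rng.random((n_obs, n_state))        # toy Jacobian of columns w.r.t. emissions
x_true = 1.0 + 0.5 * rng.standard_normal(n_state)
x_prior = np.ones(n_state)              # a priori emission scaling
B = 0.5**2 * np.eye(n_state)            # prior error covariance
R = 0.05**2 * np.eye(n_obs)             # observation error covariance
y = K @ x_true + 0.05 * rng.standard_normal(n_obs)

gain = B @ K.T @ np.linalg.inv(K @ B @ K.T + R)
x_post = x_prior + gain @ (y - K @ x_prior)
print(x_post)
```

The exactness of this update is what makes the analytical method a useful benchmark; its cost, which grows with the state dimension, is what forces the coarse-region aggregation that the adjoint method avoids.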
Optimization-Based Approach for Joint X-Ray Fluorescence and Transmission Tomographic Inversion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Di, Zichao; Leyffer, Sven; Wild, Stefan M.
2016-01-01
Fluorescence tomographic reconstruction, based on the detection of photons coming from fluorescent emission, can be used for revealing the internal elemental composition of a sample. On the other hand, conventional X-ray transmission tomography can be used for reconstructing the spatial distribution of the absorption coefficient inside a sample. In this work, we integrate both X-ray fluorescence and X-ray transmission data modalities and formulate a nonlinear optimization-based approach for reconstruction of the elemental composition of a given object. This model provides a simultaneous reconstruction of both the quantitative spatial distribution of all elements and the absorption effect in the sample. Mathematically speaking, we show that compared with the single-modality inversion (i.e., the X-ray transmission or fluorescence alone), the joint inversion provides a better-posed problem, which implies a better recovery. Therefore, the challenges in X-ray fluorescence tomography arising mainly from the effects of self-absorption in the sample are partially mitigated. The use of this technique is demonstrated on the reconstruction of several synthetic samples.
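The claim that the joint inversion is better posed than either single modality can be made concrete with a tiny linear example: each operator alone is rank-deficient, but their stack is invertible. The operators below are hypothetical stand-ins, not the paper's fluorescence and transmission physics:

```python
import numpy as np

# Toy state: 3 unknown composition values. Each modality measures only a
# couple of linear combinations, so neither alone determines the state.
A_trans = np.array([[1.0, 1.0, 0.0],
                    [0.0, 1.0, 1.0]])   # "transmission": 2 ray sums
A_fluor = np.array([[1.0, 0.0, 1.0]])   # "fluorescence": 1 emission sum
x_true = np.array([2.0, 1.0, 3.0])

y = np.concatenate([A_trans @ x_true, A_fluor @ x_true])
A_joint = np.vstack([A_trans, A_fluor])

# Each modality alone is rank-deficient; the joint operator has full rank,
# so least squares recovers the state uniquely.
x_hat, *_ = np.linalg.lstsq(A_joint, y, rcond=None)
print(np.linalg.matrix_rank(A_trans), np.linalg.matrix_rank(A_joint), x_hat)
```

The actual problem is nonlinear because self-absorption couples the unknowns into the forward model, but the better-posedness argument has the same flavor: the stacked data constrain directions that each modality leaves unresolved.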
NASA Astrophysics Data System (ADS)
Gammaldi, S.; Amoroso, O.; D'Auria, L.; Zollo, A.
2018-05-01
A multi-2D imaging of the Solfatara Crater inside the Campi Flegrei Caldera was obtained by the joint interpretation of geophysical evidence and a new active seismic dataset acquired during the RICEN experiment (EU project MEDSUV) in 2014. We used a total of 17,894 first P-wave arrival times manually picked on pre-processed waveforms recorded along two 1D profiles criss-crossing the inner Solfatara crater, and performed a tomographic inversion based on a multi-scale strategy and a Bayesian estimation of velocity parameters. The resulting tomographic images provide evidence for a low-velocity (500-1500 m/s), water-saturated deeper layer to the west, near the outcrop of the Fangaia, contrasted by a high-velocity (2000-3200 m/s) layer correlated with a consolidated tephra deposit. A layer in the transition velocity range (1500-2000 m/s) suggests the possible presence of a gas-rich accumulation volume. From the resulting P-wave velocity model, we infer a detailed image of the gas migration path to the Earth's surface. The gases coming from the deep hydrothermal plume accumulate in the central and most depressed area of the Solfatara, trapped by the meteoric-water-saturated layer. The gases are then transmitted through the buried fault toward the eastern part of the crater, where the ring faults facilitate their release, as confirmed by the fumaroles. Given the surface evidence of gas release at the Bocca Grande and Bocca Nuova fumaroles to the east and the presence of the deeper central plume, we suggest that a fault in the central part of the crater acts as the main buried conduit between them and plays a key role.
NASA Astrophysics Data System (ADS)
Evangeliou, Nikolaos; Thompson, Rona; Stohl, Andreas; Shevchenko, Vladimir P.
2016-04-01
Black carbon (BC) is the main light-absorbing aerosol species and it has important impacts on air quality, weather and climate. The major source of BC is incomplete combustion of fossil fuels and the burning of biomass or bio-fuels (soot). Therefore, to understand to what extent BC affects climate change and pollutant dynamics, accurate knowledge of the emissions, distribution and variation of BC is required. Most commonly, BC emission inventory datasets are built by "bottom up" approaches based on activity data and emission factors, but these methods are considered to have large uncertainty (Cao et al., 2006). In this study, we have used a Bayesian inversion to estimate spatially resolved BC emissions. Emissions are estimated monthly for 2014 over the domain from 180°W to 180°E and 50°N to 90°N. Atmospheric transport is modeled using the Lagrangian Particle Dispersion Model, FLEXPART (Stohl et al., 1998; 2005), and the inversion framework, FLEXINVERT, developed by Thompson and Stohl (2014). The study domain is of particular interest concerning the identification and estimation of BC sources. In contrast to Europe and North America, where BC sources are comparatively well documented as a result of intense monitoring, only one station recording BC concentrations exists in the whole of Siberia. In addition, emissions from gas flaring by the oil industry have been geographically misplaced in most emission inventories and may be an important source of BC at high latitudes, since a significant proportion of the total gas flared occurs at these latitudes (Stohl et al., 2013). Our results show large differences with the existing BC inventories, and the estimated fluxes improve modeled BC concentrations with respect to observations. References Cao, G. et al. Atmos. Environ., 40, 6516-6527, 2006. Stohl, A. et al. Atmos. Environ., 32(24), 4245-4264, 1998. Stohl, A. et al. Atmos. Chem. Phys., 5(9), 2461-2474, 2005. Stohl, A. et al. Atmos. Chem. Phys., 13, 8833-8855, 2013. Thompson, R. L., and Stohl, A. Geosci. Model Dev., 7, 2223-2242, 2014.
Source partitioning of methane emissions and its seasonality in the U.S. Midwest
Zichong Chen; Timothy J. Griffis; John M. Baker; Dylan B. Millet; Jeffrey D. Wood; Edward J. Dlugokencky; Arlyn E. Andrews; Colm Sweeney; Cheng Hu; Randall K. Kolka
2018-01-01
The methane (CH4) budget and its source partitioning are poorly constrained in the Midwestern United States. We used tall tower (185 m) aerodynamic flux measurements and atmospheric scale factor Bayesian inversions to constrain the monthly budget and to partition the total budget into natural (e.g., wetlands) and anthropogenic (e.g., livestock,...
J-PET detector system for studies of the electron-positron annihilations
NASA Astrophysics Data System (ADS)
Pawlik-Niedźwiecka, M.; Khreptak, O.; Gajos, A.; Wieczorek, A.; Alfs, D.; Bednarski, T.; Białas, P.; Curceanu, C.; Czerwiński, E.; Dulski, K.; Głowacz, B.; Gupta-Sharma, N.; Gorgol, M.; Hiesmayr, B. C.; Jasińska, B.; Kamińska, D.; Korcyl, G.; Kowalski, P.; Krzmień, W.; Krawczyk, N.; Kubicz, E.; Mohammed, M.; Niedźwiecki, Sz.; Raczyński, L.; Rudy, Z.; Silarski, M.; Wiślicki, W.; Zgardzińska, B.; Zieliński, M.; Moskal, P.
2016-11-01
The Jagiellonian Positron Emission Tomograph (J-PET) has recently been constructed at the Jagiellonian University as a prototype of a cost-effective scanner for metabolic imaging of the whole human body. The J-PET detector is optimized for the measurement of momentum and polarization of photons from electron-positron annihilations. It is built out of strips of plastic scintillators, forming three cylindrical layers. As a detector of gamma quanta it will be used for studies of discrete symmetries and multiparticle entanglement of photons originating from the decays of ortho-positronium atoms.
Counts, Sarah J; Kim, Anthony W
2017-08-01
Modalities to detect and characterize lung cancer are generally divided into those that are invasive [endobronchial ultrasound (EBUS), esophageal ultrasound (EUS), and electromagnetic navigational bronchoscopy (ENMB)] and those that are noninvasive [chest radiography (CXR), computed tomography (CT), positron emission tomography (PET), and magnetic resonance imaging (MRI)]. This chapter describes these modalities and the literature supporting their use, and delineates which tests best evaluate the patient with lung cancer. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Ruggeri, Paolo; Irving, James; Holliger, Klaus
2015-08-01
We critically examine the performance of sequential geostatistical resampling (SGR) as a model proposal mechanism for Bayesian Markov-chain-Monte-Carlo (MCMC) solutions to near-surface geophysical inverse problems. Focusing on a series of simple yet realistic synthetic crosshole georadar tomographic examples characterized by different numbers of data, levels of data error and degrees of model parameter spatial correlation, we investigate the efficiency of three different resampling strategies with regard to their ability to generate statistically independent realizations from the Bayesian posterior distribution. Quite importantly, our results show that, no matter what resampling strategy is employed, many of the examined test cases require an unreasonably high number of forward model runs to produce independent posterior samples, meaning that the SGR approach as currently implemented will not be computationally feasible for a wide range of problems. Although use of a novel gradual-deformation-based proposal method can help to alleviate these issues, it does not offer a full solution. Further, the nature of the SGR proposal strongly influences MCMC performance; however, no clear rule exists as to what set of inversion parameters and/or overall proposal acceptance rate will allow for the most efficient implementation. We conclude that although the SGR methodology is highly attractive, as it allows for the consideration of complex geostatistical priors as well as conditioning to hard and soft data, further developments are necessary in the context of novel or hybrid MCMC approaches for it to be considered generally suitable for near-surface geophysical inversions.
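The SGR proposal mechanism examined above resimulates a subset of model cells from the conditional prior, so the prior and proposal densities cancel and the Metropolis ratio reduces to a likelihood ratio. A minimal sketch on a 1-D Gaussian random field, with grid size, covariance, data operator, and resampling fraction all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)

# 1-D Gaussian field prior with exponential covariance; data are a few noisy
# linear functionals of the field. SGR proposal: resimulate a random subset
# of cells from the conditional prior, then accept on the likelihood ratio.
n = 30
idx = np.arange(n)
C = np.exp(-np.abs(idx[:, None] - idx[None, :]) / 5.0)  # prior covariance
L = np.linalg.cholesky(C + 1e-10 * np.eye(n))

m_true = L @ rng.standard_normal(n)
G = rng.standard_normal((10, n)) / np.sqrt(n)           # toy forward operator
sigma = 0.1
y = G @ m_true + sigma * rng.standard_normal(10)

def loglik(m):
    r = y - G @ m
    return -0.5 * np.sum(r**2) / sigma**2

m = L @ rng.standard_normal(n)                          # start from a prior draw
ll = loglik(m)
accepted = 0
for _ in range(2000):
    S = rng.choice(n, size=5, replace=False)            # cells to resimulate
    keep = np.setdiff1d(idx, S)
    # Conditional prior N(mu, Sig) of m[S] given m[keep].
    Ckk = C[np.ix_(keep, keep)]
    Csk = C[np.ix_(S, keep)]
    mu = Csk @ np.linalg.solve(Ckk, m[keep])
    Sig = C[np.ix_(S, S)] - Csk @ np.linalg.solve(Ckk, Csk.T)
    prop = m.copy()
    prop[S] = mu + np.linalg.cholesky(Sig + 1e-10 * np.eye(5)) @ rng.standard_normal(5)
    ll_prop = loglik(prop)
    if np.log(rng.random()) < ll_prop - ll:             # prior/proposal cancel
        m, ll, accepted = prop, ll_prop, accepted + 1

print(accepted / 2000.0, ll)
```

The efficiency trade-off the abstract describes lives in the subset size: resimulating many cells gives bold but rarely accepted proposals, while resimulating few gives high acceptance but strongly correlated chain states.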
NASA Astrophysics Data System (ADS)
Alsing, Justin; Heavens, Alan; Jaffe, Andrew H.
2017-04-01
We apply two Bayesian hierarchical inference schemes to infer shear power spectra, shear maps and cosmological parameters from the Canada-France-Hawaii Telescope (CFHTLenS) weak lensing survey - the first application of this method to data. In the first approach, we sample the joint posterior distribution of the shear maps and power spectra by Gibbs sampling, with minimal model assumptions. In the second approach, we sample the joint posterior of the shear maps and cosmological parameters, providing a new, accurate and principled approach to cosmological parameter inference from cosmic shear data. As a first demonstration on data, we perform a two-bin tomographic analysis to constrain cosmological parameters and investigate the possibility of photometric redshift bias in the CFHTLenS data. Under the baseline ΛCDM (Λ cold dark matter) model, we constrain S_8 = σ_8(Ω_m/0.3)^{0.5} = 0.67^{+0.03}_{-0.03} (68 per cent), consistent with previous CFHTLenS analyses but in tension with Planck. Adding neutrino mass as a free parameter, we are able to constrain ∑m_ν < 4.6 eV (95 per cent) using CFHTLenS data alone. Including a linear redshift-dependent photo-z bias Δz = p_2(z - p_1), we find p_1 = -0.25^{+0.53}_{-0.60} and p_2 = -0.15^{+0.17}_{-0.15}, and tension with Planck is only alleviated under very conservative prior assumptions. Neither the non-minimal neutrino mass nor the photo-z bias model is significantly preferred by the CFHTLenS (two-bin tomography) data.
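The Gibbs-sampling strategy above, alternating exact draws from conditional distributions, can be sketched on a two-variable stand-in problem. This is only the alternation pattern, not the lensing model itself:

```python
import numpy as np

rng = np.random.default_rng(8)

# Gibbs sampling on a stand-in target: a correlated bivariate normal
# (rho = 0.8). Alternate exact draws from the two full conditionals.
rho = 0.8
x, y = 0.0, 0.0
samples = []
for _ in range(20000):
    x = rng.normal(rho * y, np.sqrt(1.0 - rho**2))   # draw from p(x | y)
    y = rng.normal(rho * x, np.sqrt(1.0 - rho**2))   # draw from p(y | x)
    samples.append((x, y))

chain = np.array(samples[1000:])      # discard burn-in
corr = np.corrcoef(chain.T)[0, 1]
print(f"sample correlation: {corr:.3f}")
```

The sample correlation approaches the target's value of 0.8, showing that the alternating conditional draws explore the joint posterior; in the paper the two "variables" are the shear maps and the power spectra (or cosmological parameters).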
NASA Astrophysics Data System (ADS)
Kim, Junhan; Marrone, Daniel P.; Chan, Chi-Kwan; Medeiros, Lia; Özel, Feryal; Psaltis, Dimitrios
2016-12-01
The Event Horizon Telescope (EHT) is a millimeter-wavelength, very-long-baseline interferometry (VLBI) experiment that is capable of observing black holes with horizon-scale resolution. Early observations have revealed variable horizon-scale emission in the Galactic Center black hole, Sagittarius A* (Sgr A*). Comparing such observations to time-dependent general relativistic magnetohydrodynamic (GRMHD) simulations requires statistical tools that explicitly consider the variability in both the data and the models. We develop here a Bayesian method to compare time-resolved simulation images to variable VLBI data, in order to infer model parameters and perform model comparisons. We use mock EHT data based on GRMHD simulations to explore the robustness of this Bayesian method and contrast it to approaches that do not consider the effects of variability. We find that time-independent models lead to offset values of the inferred parameters with artificially reduced uncertainties. Moreover, neglecting the variability in the data and the models often leads to erroneous model selections. We finally apply our method to the early EHT data on Sgr A*.
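The variability-aware comparison described above can be sketched by scoring an observation against an ensemble of simulation snapshots, averaging the per-snapshot likelihoods rather than comparing to a single time-averaged model. The toy scalars below stand in for images or visibilities; none of the numbers come from EHT data:

```python
import numpy as np

rng = np.random.default_rng(9)

# Variability-aware likelihood: average the Gaussian likelihood of the
# observation over an ensemble of model snapshots (toy scalar "images").
def log_like_ensemble(obs, snapshots, sigma):
    ll = -0.5 * ((obs - snapshots) / sigma) ** 2 \
         - np.log(sigma * np.sqrt(2.0 * np.pi))
    m = ll.max()                       # log-sum-exp for numerical stability
    return m + np.log(np.mean(np.exp(ll - m)))

snaps_a = rng.normal(1.0, 0.5, size=200)   # snapshots of variable model A
snaps_b = rng.normal(3.0, 0.5, size=200)   # snapshots of variable model B
obs = 1.2                                   # mock observed value

print("model A preferred:",
      log_like_ensemble(obs, snaps_a, 0.1) > log_like_ensemble(obs, snaps_b, 0.1))
```

Comparing instead to each model's mean snapshot would understate the spread of model A and could flip such a selection, which is the failure mode the abstract attributes to time-independent analyses.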
Quantification of Methane Source Locations and Emissions in an Urban Setting
NASA Astrophysics Data System (ADS)
Crosson, E.; Richardson, S.; Tan, S. M.; Whetstone, J.; Bova, T.; Prasad, K. R.; Davis, K. J.; Phillips, N. G.; Turnbull, J. C.; Shepson, P. B.; Cambaliza, M. L.
2011-12-01
The regulation of methane emissions from urban sources such as landfills and waste-water treatment facilities is currently a highly debated topic in the US and in Europe. This interest is fueled, in part, by recent measurements indicating that urban emissions are a significant source of methane (CH4) and in fact may be substantially higher than current inventory estimates (1). As a result, developing methods for locating and quantifying emissions from urban methane sources is of great interest to stakeholders such as landfill and wastewater-treatment-facility owners, watchdog groups, and the governmental agencies seeking to evaluate or enforce regulations. In an attempt to identify major methane source locations and emissions in Boston, Indianapolis, and the Bay Area, systematic measurements of CH4 concentrations and meteorological data were made at street level using a vehicle-mounted cavity ringdown analyzer. A number of discrete sources were detected at concentration levels in excess of 15 times background. Using Gaussian plume models as well as tomographic techniques, methane source locations and emission rates will be presented. In addition, flux chamber measurements of discrete sources such as those found in natural gas leaks will also be presented. (1) Wunch, D., P.O. Wennberg, G.C. Toon, G. Keppel-Aleks, and Y.G. Yavin, Emissions of Greenhouse Gases from a North American Megacity, Geophysical Research Letters, Vol. 36, L15810, doi:10.1029/2009GL039825, 2009.
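A minimal sketch of the Gaussian plume forward model mentioned above: because the downwind concentration is linear in the source rate, one hypothetical measurement suffices to back out an emission estimate. All numbers and the measurement itself are illustrative, not field data:

```python
import numpy as np

def gaussian_plume(q, x, y, z, u, sigma_y, sigma_z, h=0.0):
    """Ground-reflected Gaussian plume concentration (g/m^3).

    q: emission rate (g/s); u: wind speed (m/s); x downwind, y crosswind,
    z height (m); sigma_y, sigma_z: dispersion widths at distance x (m).
    """
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    vertical = (np.exp(-(z - h)**2 / (2 * sigma_z**2))
                + np.exp(-(z + h)**2 / (2 * sigma_z**2)))
    return q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# Hypothetical single downwind measurement; solve for q by linearity.
c_meas = 2.5e-4            # measured excess concentration (g/m^3), mock value
c_unit = gaussian_plume(1.0, x=100.0, y=5.0, z=2.0, u=3.0,
                        sigma_y=8.0, sigma_z=5.0)
q_est = c_meas / c_unit
print(f"estimated source rate: {q_est:.3f} g/s")
```

In practice many transects and wind realizations are combined, but each observation enters the inversion through a forward model of exactly this shape.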
Transdimensional Bayesian tomography of the lowermost mantle from shear waves
NASA Astrophysics Data System (ADS)
Richardson, C.; Mousavi, S. S.; Tkalcic, H.; Masters, G.
2017-12-01
The lowermost layer of the mantle, known as D'', is a complex region that contains significant heterogeneities on different spatial scales and a wide range of physical and chemical features such as partial melting, seismic anisotropy, and variations in thermal and chemical composition. The most powerful tools we have to probe this region are seismic waves and corresponding imaging techniques such as tomography. Recently, we developed compressional velocity tomograms of D'' using a transdimensional Bayesian inversion, where the model parameterization is not explicit and regularization is not required. This has produced a far more nuanced P-wave velocity model of D'' than that from traditional S-wave tomography. We also note that P-wave models of D'' vary much more significantly among various research groups than the corresponding S-wave models. This study therefore seeks to develop a new S-wave velocity model of D'' underneath Australia, using predominantly ScS-S differential travel times measured through waveform correlation and Bayesian transdimensional inversion, to further understand and characterize heterogeneities in D''. We used events with focal depths greater than 200 km and magnitudes between 6.0 and 6.7, recorded at epicentral distances between 45 and 75 degrees from stations in Australia. Because of globally incomplete coverage of station and earthquake locations, a major limitation of deep earth tomography has been the explicit parameterization of the region of interest. Explicit parameterization has been foundational in most studies, but faces inherent problems of either over-smoothing the data or allowing for too much noise. To avoid this, we use spherical Voronoi polygons, which allow for a high level of flexibility, as the polygons can grow, shrink, or be altogether deleted throughout a sequence of iterations. Our technique also yields highly desired model parameter uncertainties.
While there is little doubt that D'' is heterogeneous, there is still much that is unclear about the extent and spatial distribution of different heterogeneous domains, as there are open questions about their dynamics and chemical interactions in the context of the surrounding mantle and outer core. In this context, our goal is also to quantify and understand the differences between S-wave and P-wave velocity tomographic models.
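The Voronoi parameterization at the heart of this approach can be sketched in two dimensions: velocity is constant within each cell (the value attached to the nearest node), and a predicted travel time is the discretized path integral of slowness along a ray. Nodes, values, and the ray below are random illustrative stand-ins, not the study's model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Voronoi parameterization: the velocity at any point equals the value
# attached to the nearest node (cell site).
nodes = rng.uniform(0.0, 1.0, size=(8, 2))   # cell sites in a unit square
values = rng.uniform(3.9, 4.1, size=8)       # velocity per cell (km/s, toy)

def velocity(points):
    # Nearest-node lookup for an (N, 2) array of points.
    d = np.linalg.norm(points[:, None, :] - nodes[None, :, :], axis=2)
    return values[np.argmin(d, axis=1)]

# Travel time of a straight ray, discretized: t = sum(ds / v).
ray = np.linspace([0.0, 0.2], [1.0, 0.8], 200)
ds = np.linalg.norm(ray[1] - ray[0])
t = np.sum(ds / velocity(ray))
print(f"predicted travel time: {t:.4f}")
```

In the transdimensional inversion, the number of nodes, their positions, and their values are all sampled, so cells appear, move, and vanish as the chain runs.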
NASA Astrophysics Data System (ADS)
Graziosi, F.; Arduini, J.; Furlani, F.; Giostra, U.; Kuijpers, L. J. M.; Montzka, S. A.; Miller, B. R.; O'Doherty, S. J.; Stohl, A.; Bonasoni, P.; Maione, M.
2015-07-01
HCFC-22 (CHClF2), a stratospheric ozone-depleting substance and a powerful greenhouse gas, is the third most abundant anthropogenic halocarbon in the atmosphere. Primarily used in refrigeration and air conditioning systems, its global production and consumption have increased during the last 60 years, with the global increases in the last decade mainly attributable to developing countries. In 2007, an adjustment to the Montreal Protocol on Substances that Deplete the Ozone Layer called for an accelerated phase-out of HCFCs, implying a 75% reduction (base year 1989) of HCFC production and consumption by 2010 in developed countries, against the previous 65% reduction. In Europe, HCFC-22 is continuously monitored at the two sites of Mace Head (Ireland) and Monte Cimone (Italy). Combining atmospheric observations with a Bayesian inversion technique, we estimated fluxes of HCFC-22 from Europe and from eight macro-areas within it over an 11-year period from January 2002 to December 2012, during which the accelerated restrictions on HCFC production and consumption entered into force. According to our study, emissions over the entire domain peaked in 2003 (38.2 ± 4.7 Gg yr-1) and reached a minimum in 2012 (12.1 ± 2.0 Gg yr-1); emissions decreased continuously between these years, except for secondary maxima in 2008 and 2010. Despite this decrease in regional emissions, background values of HCFC-22 measured at the two European stations over 2002-2012 are still increasing as a consequence of global emissions, in part from developing countries, with an average trend of ca. 7.0 ppt yr-1. However, the observations at the two European stations also show that the global growth rate has decreased since 2008. In general, our European emission estimates are in good agreement with those reported by previous studies that used different techniques.
Since the currently dominant source of HCFC-22 emissions is banks, we assess the banks' size and their contribution to total European emissions up to 2030, and we project a fast decrease approaching negligible emissions in the last five years of the considered period. Finally, inversions conducted over three-month periods showed evidence for a seasonal cycle in emissions in regions of the Mediterranean basin, but not outside it. Emissions derived from regions in the Mediterranean basin were ca. 25% higher in warmer months than in colder months.
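The linear-Gaussian flavor of Bayesian flux inversion described above (observations = transport sensitivities × regional fluxes + noise, with Gaussian prior and error statistics) has a closed-form posterior mean. The sketch below uses synthetic data and a made-up transport matrix; it is the generic textbook update, not the study's actual inversion system:

```python
import numpy as np

# Linear Gaussian inversion: y = H x + e, prior x ~ N(x_a, B), e ~ N(0, R).
# Posterior mean: x_hat = x_a + B H^T (H B H^T + R)^{-1} (y - H x_a).
rng = np.random.default_rng(2)
n_obs, n_src = 50, 8
H = rng.uniform(0.0, 1.0, size=(n_obs, n_src))   # mock transport sensitivities
x_true = rng.uniform(1.0, 5.0, size=n_src)       # "true" regional fluxes
y = H @ x_true + 0.1 * rng.normal(size=n_obs)    # synthetic observations

x_a = np.full(n_src, 3.0)                        # prior flux guess
B = np.eye(n_src) * 4.0                          # prior covariance
R = np.eye(n_obs) * 0.1**2                       # observation-error covariance

K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)     # Kalman-style gain
x_hat = x_a + K @ (y - H @ x_a)
print("max abs flux error:", np.abs(x_hat - x_true).max())
```

With 50 well-distributed observations the posterior mean recovers the synthetic fluxes closely; real inversions differ mainly in how H is built (from a transport model) and in how B and R are specified.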
NASA Astrophysics Data System (ADS)
Seidel, Anne; Wagner, Steven; Dreizler, Andreas; Ebert, Volker
2014-05-01
One of the most intricate effects in climate modelling is the role of permafrost thawing during global warming. Soil that formerly never completely lost its ice cover now emits climate gases due to melting processes [1]. For a better prediction of climate development and possible feedback mechanisms, insight into the underlying physical processes (e.g. gas emission from underground reservoirs) is required [2]. Therefore, a long-term quantification of greenhouse gas concentrations (and, further on, fluxes) is necessary, and the structures responsible for emission need to be identified. In particular, the spatial heterogeneity of soils caused by soil-internal structures (e.g. changes in soil composition or surface cracks) or by surface modifications (e.g. plant growth) creates considerable complexity and difficulty for local measurements, for example with soil chambers. For such situations, which often cannot be avoided, a spatially resolved 2D measurement is needed to identify and quantify the gas emission from the structured soil and to better understand the influence of the soil sub-structures on the emission behaviour. We therefore designed a spatially scanning laser absorption spectrometer to determine 2D gas-concentration maps in the soil-air boundary layer. The setup is designed to cover surfaces on the order of square metres in a horizontal plane above the soil under investigation. Existing field instruments for gas-concentration or flux measurements are based on point-wise measurements, so structure identification is very tedious or even impossible. For this reason, we have developed a tomographic in-situ instrument based on TDLAS (tunable diode laser absorption spectroscopy) that delivers absolute gas-concentration distributions over areas of 0.8 m × 0.8 m, without any need for reference measurements with a calibration gas.
It is a simple and robust device based on a combination of scanning mirrors and reflecting foils, so that only minimal optical alignment is necessary in the field. The measurement rate for a complete 2D field is presently up to 2.5 Hz. The measurement field size is currently limited only by laboratory conditions and could easily be extended to several metres, as previous tests have confirmed [3]. A fast laser tuning rate of more than 5 kHz leads to a high measurement path density, and overall more than 70% of a square-shaped field area is covered. With this instrument, measurements of H2O and CH4 concentration distributions have been performed so far. We discuss the instrument setup and the spectroscopic performance, and present numerical studies of the tomographic reconstruction quality as well as first 2D reconstructions in the laboratory. The applicability to 2D CO2 detection and the improvement of frame rate and reconstruction quality using faster laser tuning will also be discussed. [1] K. M. Walter Anthony, P. Anthony, G. Grosse, and J. Chanton, 'Geologic methane seeps along boundaries of Arctic permafrost thaw and melting glaciers,' Nat. Geosci., vol. 5, no. 6, pp. 419-426, May 2012. [2] B. Elberling, A. Michelsen, C. Schädel, E. A. G. Schuur, H. H. Christiansen, L. Tamstorf, M. P. Berg, and C. Sigsgaard, 'Long-term CO2 production following permafrost thaw,' Nat. Clim. Chang., vol. 3, no. 10, pp. 890-894, 2013. [3] A. Seidel, S. Wagner, and V. Ebert, 'TDLAS-based open-path laser hygrometer using simple reflective foils as scattering targets,' Appl. Phys. B, vol. 109, no. 3, pp. 497-504, Oct. 2012.
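Retrieving a 2D concentration field from path-integrated absorbances of this kind is a classical algebraic tomography problem; one standard solver is the ART (Kaczmarz) iteration. The cell grid, beam geometry, and path lengths below are illustrative stand-ins, not the instrument's actual scan pattern:

```python
import numpy as np

rng = np.random.default_rng(3)

# Discretize a 0.8 m x 0.8 m plane into 8 x 8 cells. Each beam's measured
# absorbance is the sum over cells of path length times concentration.
n = 8
x_true = np.zeros((n, n))
x_true[2:5, 3:6] = 1.0                     # a localized emission "hot spot"
x_true = x_true.ravel()

# Random beam/cell intersection lengths as a stand-in for the real scan
# geometry: each beam crosses ~15% of the cells with a 0.1 m path each.
n_beams = 120
A = (rng.uniform(size=(n_beams, n * n)) < 0.15) * 0.1
b = A @ x_true                             # noiseless path-integrated data

# ART (Kaczmarz): cycle through beams, projecting onto each constraint.
x = np.zeros(n * n)
for _ in range(200):
    for a_i, b_i in zip(A, b):
        nrm = a_i @ a_i
        if nrm > 0.0:
            x += (b_i - a_i @ x) / nrm * a_i

rel_err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
print(f"relative reconstruction error: {rel_err:.3f}")
```

The >70% path coverage quoted in the abstract plays the role of the row density of A here: denser coverage makes the system better conditioned and the iteration converge faster.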
NASA Astrophysics Data System (ADS)
Santos-Costa, D.; Bolton, S. J.; Adumitroaie, V.; Janssen, M.; Levin, S.; Sault, R. J.; De Pater, I.; Tao, C.
2015-12-01
The Juno spacecraft will go into polar orbit after it arrives at Jupiter in mid-2016. Between November 2016 and March 2017, its six MicroWave Radiometers (MWR) will collect information on Jupiter's atmosphere and electron belt. Here we present simulations of MWR observations of the electron-belt synchrotron emission, and discuss the features and dynamical behavior of this emission when observations are carried out from inside the radiation zone. We first present our computation method. We combine a three-dimensional tomographic reconstruction of Earth-based observations with a theoretical model of Jupiter's electron belt to constrain calculations of the volume emissivity of the synchrotron radiation for any frequency, location in the Jovian inner magnetosphere (radial distance < 4 Rj), and observational direction. Values of the computed emissivity are incorporated into a synchrotron simulator to predict Juno MWR measurements (full-sky maps and temperatures) at any time of the mission. Samples of simulated MWR observations are presented and examined for different segments of the Juno trajectory. We also present results of our ongoing investigation of the radiation-zone distribution around the planet and the sources of variation on different time-scales. We show that a better understanding of the spatial distribution and variability of the electron belt is key to realistically forecasting Juno MWR measurements.
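Turning a volume-emissivity model into a predicted measurement amounts to integrating the emissivity along each line of sight. A toy optically-thin sketch follows; the belt profile, impact parameters, and units are illustrative, not a Jupiter model:

```python
import numpy as np

# Optically thin brightness along a line of sight: I = integral of the
# emissivity over path length. The belt shape here is a made-up Gaussian
# shell peaked near 1.5 Rj, purely for illustration.
def emissivity(r):
    return np.exp(-((r - 1.5) ** 2) / (2 * 0.2 ** 2))

s = np.linspace(-3.0, 3.0, 1201)     # path coordinate through the belt (Rj)
impact = 1.2                          # impact parameter of the sightline (Rj)
r = np.sqrt(s ** 2 + impact ** 2)
I = np.sum(emissivity(r)) * (s[1] - s[0])   # rectangle-rule integration
print(f"relative brightness: {I:.3f}")
```

Sightlines that thread the belt (small impact parameter) pick up far more emission than those that miss it, which is why vantage point inside the radiation zone changes the observed morphology so strongly.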
Intravascular lymphomatosis presenting as acute hemispheric dysfunction.
Hwang, Woo Sub; Jung, Chul Won; Ko, Young Hye; Seo, Sang Won; Na, Duk L
2012-11-01
Intravascular lymphomatosis (IVL) is known to affect both hemispheres of the brain and manifests clinically as seizures or dementia. To our knowledge, there have been no reported cases in which IVL manifested as acute hemispheric dysfunction. We present a 54-year-old man who showed steroid-responsive acute hemispheric dysfunction. A technetium-99m ethyl cysteinate dimer single-photon emission computed tomographic scan of the brain revealed hypoperfusion in the right hemisphere. A bone marrow biopsy specimen confirmed malignant lymphoid cells within vessels, which suggested IVL. Our case illustrates the diversity of clinical manifestations of IVL.
NASA Astrophysics Data System (ADS)
Zulfakriza, Z.; Saygin, E.; Cummins, P. R.; Widiyantoro, S.; Nugraha, A. D.; Lühr, B.-G.; Bodin, T.
2014-04-01
Delineating the crustal structure of central Java is crucial for understanding its complex tectonic setting. However, seismic imaging of the strong heterogeneity typical of such a tectonically active region can be challenging, particularly in the upper crust where velocity contrasts are strongest and steep body wave ray paths provide poor resolution. To overcome these difficulties, we apply the technique of ambient noise tomography (ANT) to data collected during the Merapi Amphibious Experiment (MERAMEX), which covered central Java with a temporary deployment of over 120 seismometers during 2004 May-October. More than 5000 Rayleigh wave Green's functions were extracted by cross-correlating the noise simultaneously recorded at available station pairs. We applied a fully non-linear 2-D Bayesian probabilistic inversion technique to the retrieved traveltimes. Features in the derived tomographic images correlate well with previous studies, and some shallow structures that were not evident in previous studies are clearly imaged with ANT. The Kendeng Basin and several active volcanoes appear with very low group velocities, and anomalies with relatively high velocities can be interpreted in terms of crustal sutures and/or surface geological features.
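The core ambient-noise step, retrieving an inter-station traveltime from the cross-correlation of simultaneously recorded noise, can be sketched with synthetic records in which station B receives the same diffuse field as station A, delayed by a known lag:

```python
import numpy as np

rng = np.random.default_rng(4)

fs = 100.0                       # sampling rate (Hz)
delay = 15                       # inter-station traveltime in samples
noise = rng.normal(size=20000)   # diffuse ambient-noise source
rec_a = noise[delay:]            # station A record
rec_b = noise[:-delay]           # station B: same field, delayed by 15 samples

# Cross-correlate over trial lags; the peak lag estimates the traveltime
# of the inter-station Green's function.
max_lag = 50
lags = np.arange(-max_lag, max_lag + 1)
cc = [np.dot(rec_b[max_lag:-max_lag], np.roll(rec_a, L)[max_lag:-max_lag])
      for L in lags]
best = int(lags[int(np.argmax(cc))])
print("estimated traveltime:", best / fs, "s")
```

Real processing adds spectral whitening, one-bit normalization, and long stacking windows, but the peak of the stacked cross-correlation carries the surface-wave traveltime in exactly this way.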
Transdimensional Seismic Tomography
NASA Astrophysics Data System (ADS)
Bodin, T.; Sambridge, M.
2009-12-01
In seismic imaging the degree of model complexity is usually determined by manually tuning damping parameters within a fixed parameterization chosen in advance. Here we present an alternative methodology for seismic travel time tomography where the model complexity is controlled automatically by the data. In particular, we use a variable parameterization consisting of Voronoi cells with mobile geometry, shape and number, all treated as unknowns in the inversion. The reversible jump algorithm is used to sample the transdimensional model space within a Bayesian framework, which avoids global damping procedures and the need to tune regularization parameters. The method is an ensemble inference approach, as many potential solutions are generated with variable numbers of cells. Information is extracted from the ensemble as a whole by performing Monte Carlo integration to produce the expected Earth model. The ensemble of models can also be used to produce velocity uncertainty estimates, and experiments with synthetic data suggest they represent actual uncertainty surprisingly well. In a transdimensional approach, the level of data uncertainty directly determines the model complexity needed to satisfy the data. Intriguingly, the Bayesian formulation can be extended to the case where the data uncertainty is itself uncertain. Experiments show that it is possible to recover data noise estimates while at the same time controlling model complexity in an automated fashion. The method is tested on synthetic data in a 2-D application and compared with a more standard matrix-based inversion scheme. The method has also been applied to real data obtained from cross-correlation of ambient noise, where little is known about the size of the errors associated with the travel times. As an example, a tomographic image of Rayleigh wave group velocity for the Australian continent is constructed for 5 s period data, together with uncertainty estimates.
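A stripped-down 1-D version of the transdimensional idea can be sketched with birth, death, and perturb moves on Voronoi nuclei. The acceptance rule below is simplified to the likelihood ratio; a full reversible-jump sampler also includes prior and proposal terms, so this is a schematic of the mechanism, not the published algorithm:

```python
import numpy as np

rng = np.random.default_rng(5)

# Noisy 1-D "profile" with two plateaus; sigma is the known data noise.
xs = np.linspace(0.0, 1.0, 100)
sigma = 0.1
data = np.where(xs < 0.5, 1.0, 2.0) + sigma * rng.normal(size=xs.size)

def predict(nuclei, vals):
    # 1-D Voronoi: each x takes the value of its nearest nucleus.
    idx = np.abs(xs[:, None] - np.array(nuclei)[None, :]).argmin(axis=1)
    return np.array(vals)[idx]

def log_like(nuclei, vals):
    r = data - predict(nuclei, vals)
    return -0.5 * np.dot(r, r) / sigma**2

nuclei, vals = [0.5], [1.5]            # start with a single cell
ll = log_like(nuclei, vals)
n_cells = []
for _ in range(4000):
    move = rng.choice(["birth", "death", "perturb"])
    new_n, new_v = list(nuclei), list(vals)
    if move == "birth":
        new_n.append(rng.uniform())            # add a nucleus
        new_v.append(rng.uniform(0.0, 3.0))
    elif move == "death" and len(new_n) > 1:
        i = int(rng.integers(len(new_n)))      # remove a nucleus
        new_n.pop(i)
        new_v.pop(i)
    else:
        i = int(rng.integers(len(new_n)))      # nudge one value
        new_v[i] += 0.1 * rng.normal()
    ll_new = log_like(new_n, new_v)
    if np.log(rng.uniform()) < ll_new - ll:    # simplified acceptance
        nuclei, vals, ll = new_n, new_v, ll_new
    n_cells.append(len(nuclei))

print("posterior mean number of cells:", np.mean(n_cells[2000:]))
```

The chain settles on a small number of cells because extra cells are only kept when the data demand them, which is the sense in which the data, through their noise level, control model complexity.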
Design and initial performance of PlanTIS: a high-resolution positron emission tomograph for plants
NASA Astrophysics Data System (ADS)
Beer, S.; Streun, M.; Hombach, T.; Buehler, J.; Jahnke, S.; Khodaverdi, M.; Larue, H.; Minwuyelet, S.; Parl, C.; Roeb, G.; Schurr, U.; Ziemons, K.
2010-02-01
Positron emitters such as 11C, 13N and 18F and their labelled compounds are widely used in clinical diagnosis and animal studies, but can also be used to study metabolic and physiological functions in plants dynamically and in vivo. A very particular tracer molecule is 11CO2 since it can be applied to a leaf as a gas. We have developed a Plant Tomographic Imaging System (PlanTIS), a high-resolution PET scanner for plant studies. Detectors, front-end electronics and data acquisition architecture of the scanner are based on the ClearPET™ system. The detectors consist of LSO and LuYAP crystals in phoswich configuration which are coupled to position-sensitive photomultiplier tubes. Signals are continuously sampled by free running ADCs, and data are stored in a list mode format. The detectors are arranged in a horizontal plane to allow the plants to be measured in the natural upright position. Two groups of four detector modules stand face-to-face and rotate around the field-of-view. This special system geometry requires dedicated image reconstruction and normalization procedures. We present the initial performance of the detector system and first phantom and plant measurements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lassen, N.A.; Henriksen, L.; Holm, S.
1983-01-01
Tomographic maps of local cerebral blood flow (CBF) were obtained with xenon-133 and with isopropyl-amphetamine-iodine-123 (IMP) in 11 subjects: one normal, two tumor cases, and eight cerebrovascular cases. A highly sensitive four-face, rapidly rotating, single-photon emission tomograph was used. The Xe-133 flow maps are essentially based on the average Xe-133 concentration over the initial 2 min during and after an inhalation of the inert gas lasting 1 min. These maps agreed very well with the early IMP maps obtained over the initial 10 min following an i.v. bolus injection. The subsequent IMP tomograms showed a slight decrease in contrast, amounting to approximately five percentage points in the CBF ratio between diseased and contralateral areas. It is concluded that Xe-133 is more practical: low cost, available on a 7-day basis, easily repeatable, quantifiable without the need for arterial sampling, and with low radiation exposure to patient and personnel. On the other hand, IMP gives an image of slightly higher resolution. It also introduces a new class of iodinated brain-seeking compounds allowing, perhaps, imaging of other functions more important than mere blood flow.
Cerebral blood flow tomography with xenon-133
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lassen, N.A.
1985-10-01
Cerebral blood flow (CBF) can be measured tomographically by inhalation of xenon-133. The calculation is based on taking a sequence of tomograms during the wash-in and wash-out phases of the tracer. Due to the dynamic nature of the process, a highly sensitive and fast-moving single photon emission computed tomograph (SPECT) is required. Two brain-dedicated SPECT systems designed for this purpose are mentioned, and the method is described with special reference to the limitations inherent in the soft energy of the 133Xe primary photons. CBF tomography can be used for a multitude of clinical and investigative purposes. This article discusses in particular its use for the selection of patients with carotid occlusion for extracranial/intracranial bypass surgery, for detection of severe arterial spasm after aneurysm bleeding, and for detection of low-flow areas during severe migraine attacks. The use of other tracers for CBF tomography using SPECT is summarized, with emphasis on the 99mTc chelates that freely pass the intact blood-brain barrier. The highly sensitive brain-dedicated SPECT systems described are a prerequisite for achieving high-resolution tomograms with such tracers.
NASA Astrophysics Data System (ADS)
Lunt, Mark; Rigby, Matt; Manning, Alistair; O'Doherty, Simon; Stavert, Ann; Stanley, Kieran; Young, Dickon; Pitt, Joseph; Bauguitte, Stephane; Allen, Grant; Helfter, Carole; Palmer, Paul
2017-04-01
The Greenhouse gAs Uk and Global Emissions (GAUGE) project aims to quantify the magnitude and uncertainty of key UK greenhouse gas emissions more robustly than previously achieved. Measurements of methane have been taken from a number of tall-tower and surface sites as well as mobile measurement platforms such as a research aircraft and a ferry providing regular transects off the east coast of the UK. Using the UK Met Office's atmospheric transport model, NAME, and a novel Bayesian inversion technique we present estimates of methane emissions from the UK from a number of different combinations of sites to show the robustness of the UK total emissions to network configuration. The impact on uncertainties will be discussed, focusing on the usefulness of the various measurement platforms for constraining UK emissions. We will examine the effects of observation selection and how a priori assumptions about model uncertainty can affect the emission estimates, even within a data-driven hierarchical inversion framework. Finally, we will show the impact of the resolution of the meteorology used to drive the NAME model on emissions estimates, and how to rationalise our understanding of the ability of transport models to represent reality.
Tomographic reconstruction of tokamak plasma light emission using wavelet-vaguelette decomposition
NASA Astrophysics Data System (ADS)
Schneider, Kai; Nguyen van Yen, Romain; Fedorczak, Nicolas; Brochard, Frederic; Bonhomme, Gerard; Farge, Marie; Monier-Garbet, Pascale
2012-10-01
Images acquired by cameras installed in tokamaks are difficult to interpret because the three-dimensional structure of the plasma is flattened in a non-trivial way. Nevertheless, taking advantage of the slow variation of the fluctuations along magnetic field lines, the optical transformation may be approximated by a generalized Abel transform, for which we proposed in Nguyen van yen et al., Nucl. Fus., 52 (2012) 013005, an inversion technique based on the wavelet-vaguelette decomposition. After validation of the new method using an academic test case and numerical data obtained with the Tokam 2D code, we present an application to an experimental movie obtained in the tokamak Tore Supra. A comparison with a classical regularization technique for ill-posed inverse problems, the singular value decomposition, allows us to assess the efficiency. The superiority of the wavelet-vaguelette technique is reflected in preserving local features, such as blobs and fronts, in the denoised emissivity map.
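The singular value decomposition used as the benchmark regularization can be sketched on a generic ill-posed deblurring problem: truncating small singular values trades noise amplification against loss of sharp features such as fronts. The kernel, profile, and noise level below are illustrative, not the tokamak data:

```python
import numpy as np

rng = np.random.default_rng(6)

# Ill-posed toy problem: a Gaussian blur smears a sharp front; inverting
# it naively amplifies noise, so small singular values must be damped.
n = 64
t = np.linspace(0.0, 1.0, n)
K = np.exp(-((t[:, None] - t[None, :]) ** 2) / (2 * 0.05 ** 2))
K /= K.sum(axis=1, keepdims=True)            # row-normalized blur kernel
f_true = (t > 0.5).astype(float)             # front-like emissivity profile
g = K @ f_true + 1e-3 * rng.normal(size=n)   # blurred, noisy data

U, s, Vt = np.linalg.svd(K)

def tsvd(k):
    # Truncated-SVD inverse: keep only the k largest singular values.
    return Vt[:k].T @ ((U[:, :k].T @ g) / s[:k])

errs = {k: np.linalg.norm(tsvd(k) - f_true) for k in (5, 15, 40)}
print(errs)
```

Too few modes over-smooths the front and too many modes let inverted noise dominate; the wavelet-vaguelette approach in the abstract targets exactly this weakness of global SVD truncation by damping noise while keeping localized features.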
Compact conscious animal positron emission tomography scanner
Schyler, David J.; O'Connor, Paul; Woody, Craig; Junnarkar, Sachin Shrirang; Radeka, Veljko; Vaska, Paul; Pratte, Jean-Francois; Volkow, Nora
2006-10-24
A method of serially transferring annihilation information in a compact positron emission tomography (PET) scanner includes generating a time signal for an event, generating an address signal representing a detecting channel, generating a detector channel signal including the time and address signals, and generating a composite signal including the channel signal and similarly generated signals. The composite signal includes events from detectors in a block and is serially output. An apparatus that serially transfers annihilation information from a block includes time signal generators for detectors in a block and an address and channel signal generator. The PET scanner includes a ring tomograph that mounts onto a portion of an animal, which includes opposing block pairs. Each of the blocks in a block pair includes a scintillator layer, detection array, front-end array, and a serial encoder. The serial encoder includes time signal generators and an address signal and channel signal generator.
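The serial-encoding idea, combining an event's timestamp and detector-channel address into one composite word, can be sketched with bit packing. The field widths below are hypothetical illustrations, not the patent's actual format:

```python
# Pack an event's timestamp and detector-channel address into a single
# composite word. TIME_BITS/ADDR_BITS are illustrative widths only.
TIME_BITS, ADDR_BITS = 20, 7

def encode(timestamp, channel):
    # Channel address occupies the high bits, timestamp the low bits.
    assert timestamp < (1 << TIME_BITS) and channel < (1 << ADDR_BITS)
    return (channel << TIME_BITS) | timestamp

def decode(word):
    # Recover (timestamp, channel) from a composite word.
    return word & ((1 << TIME_BITS) - 1), word >> TIME_BITS

word = encode(timestamp=123456, channel=42)
print(decode(word))   # (123456, 42)
```

In hardware the per-block encoder concatenates such words from all detectors in a block into one serial stream, which is what allows a single output line per block.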
Two-dimensional AXUV-based radiated power density diagnostics on NSTX-U
NASA Astrophysics Data System (ADS)
Faust, I.; Delgado-Aparicio, L.; Bell, R. E.; Tritz, K.; Diallo, A.; Gerhardt, S. P.; LeBlanc, B.; Kozub, T. A.; Parker, R. R.; Stratton, B. C.
2014-11-01
A new set of radiated-power-density diagnostics for the National Spherical Torus Experiment Upgrade (NSTX-U) tokamak has been designed to measure the two-dimensional poloidal structure of the total photon emissivity profile in order to perform power balance, impurity transport, and magnetohydrodynamic studies. Multiple AXUV-diode based pinhole cameras will be installed at the same toroidal angle at various poloidal locations. The local emissivity will be obtained from several types of tomographic reconstructions. The layout and response expected for the new radially viewing poloidal arrays will be shown for different impurity concentrations to characterize the diagnostic sensitivity. The radiated power profile inverted from the array data will also be used for estimates of power losses during transitions from various divertor configurations in NSTX-U. The effect of in-out and top/bottom asymmetries in the core radiation from high-Z impurities will be addressed.
SXR measurement and W transport survey using GEM tomographic system on WEST
NASA Astrophysics Data System (ADS)
Mazon, D.; Jardin, A.; Malard, P.; Chernyshova, M.; Coston, C.; O'Mullane, M.; Czarski, T.; Malinowski, K.; Faisse, F.; Ferlay, F.; Verger, J. M.; Bec, A.; Larroque, S.; Kasprowicz, G.; Wojenski, A.; Pozniak, K.
2017-11-01
Measuring soft x-ray (SXR) radiation (0.1-20 keV) from fusion plasmas is a standard way of accessing valuable information on particle transport. Since heavy impurities like tungsten (W) can degrade plasma core performance and cause radiative collapses, new diagnostics must be developed to monitor the impurity distribution in harsh fusion environments like ITER. A gaseous detector with energy discrimination would be a very good candidate for this purpose. The design and implementation of a new SXR diagnostic developed for the WEST project, based on a triple Gas Electron Multiplier (GEM) detector, is presented. This detector works in photon-counting mode and offers energy-discrimination capabilities. The SXR system is composed of two 1D cameras (vertical and horizontal views, respectively) located in the same poloidal cross section to allow for tomographic reconstruction. Each array (20 cm × 2 cm) consists of up to 128 detectors placed behind a beryllium pinhole (equipped with a 1 mm diameter diaphragm) inserted at about 50 cm depth inside a cooled thimble in order to obtain a wide plasma view. Acquisition of the low-energy part of the spectrum is ensured by a helium buffer installed between the pinhole and the detector. Complementary water-cooling systems maintain a constant temperature (25 °C) inside the thimble. Finally, a real-time automatic extraction system has been developed to protect the diagnostic during baking phases or any unwanted overheating events. Preliminary simulations of plasma emissivity and W distribution have been performed for WEST using a recently developed synthetic diagnostic coupled to a tomographic algorithm based on the minimum Fisher information (MFI) inversion method. First GEM acquisitions are presented, together with an estimate of the effect of transport, in the presence of ICRH, on the W-density reconstruction capabilities of the GEM.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stratton, J.R.; Ritchie, J.L.
Aspirin plus dipyridamole reduces platelet accumulation on short-term Dacron vascular grafts in man. To determine whether drug inhibition of platelet deposition is sustained on older grafts, we studied 18 men aged 41 to 87 years who had Dacron aortic bifurcation grafts in place a mean of 43.4 months (range 9.8 to 121.0) before and during short-term therapy with aspirin (325 mg tid) plus dipyridamole (75 mg tid). During both the baseline and drug studies, indium-111 ((111)In) platelet deposition was quantitated by two techniques: standard planar imaging performed at 24, 48, and 72 hr after injection of platelets, and single-photon emission computed tomographic imaging performed at 24 and 72 hr after injection. All analyses were performed in a blinded fashion. On both the planar and tomographic images, platelet accumulation on the graft was quantitated by a graft/blood ratio that compared activity in the graft to simultaneously collected whole-blood (111)In platelet activity. Aspirin plus dipyridamole reduced the tomographic graft/blood ratio at 24 hr (20.6 +/- 3.5 vs 17.3 +/- 2.5) (+/-SEM) and at 72 hr (29.0 +/- 4.8 vs 25.0 +/- 4.1) after injection of platelets (p = .02). Similarly, the planar graft/blood ratio was reduced at 24 hr (2.7 +/- 0.5 vs 2.4 +/- 0.5), 48 hr (3.7 +/- 0.9 vs 3.1 +/- 0.7), and 72 hr (4.0 +/- 0.9 vs 3.6 +/- 0.8) (p = .04). We conclude that aspirin (325 mg tid) plus dipyridamole (75 mg tid) reduces platelet accumulation on long-term Dacron vascular grafts.
Tomographic diagnostics of nonthermal plasmas
NASA Astrophysics Data System (ADS)
Denisova, Natalia
2009-10-01
In previous work [1], we discussed the ``technology'' of the tomographic method and the relations between tomographic diagnostics in thermal (equilibrium) and nonthermal (nonequilibrium) plasma sources. The conclusion was that tomographic reconstruction in thermal plasma sources is at present a standard procedure, which can provide much useful information on the plasma structure and its evolution in time, while tomographic reconstruction of nonthermal plasma has great potential to contribute to understanding the fundamental problem of the behavior of matter under strongly nonequilibrium conditions. Using medical terminology, one could say that tomographic diagnostics of equilibrium plasma sources studies their ``anatomic'' structure, while reconstruction of nonequilibrium plasma is similar to a ``physiological'' examination: it aims to study the physical mechanisms and processes. The present work is focused on nonthermal plasma research. The tomographic diagnostics aims to study spatial structures formed in gas-discharge plasmas under the influence of electric and gravitational fields. The ways of plasma ``self-organization'' in changing and extreme conditions are analyzed. The analysis is based on examples from our practical tomographic diagnostics of nonthermal plasma sources, such as low-pressure capacitive and inductive discharges. [1] Denisova N., ``Plasma diagnostics using computed tomography method,'' IEEE Trans. Plasma Sci. 37(4), 502 (2009).
Hybrid-coded 3D structured illumination imaging with Bayesian estimation (Conference Presentation)
NASA Astrophysics Data System (ADS)
Chen, Hsi-Hsun; Luo, Yuan; Singh, Vijay R.
2016-03-01
Light-induced fluorescence microscopy has long been used to observe and understand objects at the microscale, such as cellular samples. However, the transfer function of a lens-based imaging system limits the resolution, so the fine, detailed structure of a sample cannot be identified clearly. Resolution-enhancement techniques aim to break this limit for a given objective. Over the past decades, resolution-enhanced imaging has been pursued through a variety of strategies, including photoactivated localization microscopy (PALM), stochastic optical reconstruction microscopy (STORM), stimulated emission depletion (STED), and structured illumination microscopy (SIM). Among these methods, only SIM intrinsically improves the resolution limit of a system without relying on the structural properties of the object. In this paper, we develop a SIM method combined with Bayesian estimation and with optical-sectioning capability provided by HiLo processing, yielding high resolution throughout a 3D volume. This 3D SIM provides optical sectioning and resolution enhancement, and is robust to noise owing to the proposed data-driven Bayesian estimation reconstruction. To validate the 3D SIM, we show simulation results for the algorithm and experimental results demonstrating the 3D resolution enhancement.
A Bayesian analysis of HAT-P-7b using the EXONEST algorithm
DOE Office of Scientific and Technical Information (OSTI.GOV)
Placek, Ben; Knuth, Kevin H.
2015-01-13
The study of exoplanets (planets orbiting other stars) is revolutionizing the way we view our universe. High-precision photometric data provided by the Kepler Space Telescope (Kepler) enable not only the detection of such planets but also their characterization. This presents a unique opportunity to apply Bayesian methods to better characterize the multitude of previously confirmed exoplanets. This paper focuses on applying the EXONEST algorithm to characterize the transiting short-period hot Jupiter HAT-P-7b (also referred to as Kepler-2b). EXONEST evaluates a suite of exoplanet photometric models by applying Bayesian model selection, implemented with the MultiNest algorithm. These models take into account planetary effects, such as reflected light and thermal emission, as well as the effects of planetary motion on the host star, such as Doppler beaming (or boosting) of light from the reflex motion of the host star and photometric variations due to the planet-induced ellipsoidal shape of the host star. By calculating model evidences, one can determine which model best describes the observed data, thus identifying which effects dominate the planetary system. Presented are parameter estimates and model evidences for HAT-P-7b.
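Model evidences of the kind EXONEST computes with MultiNest can be illustrated on a toy one-parameter problem by brute-force marginalization over the prior; the data, noise level, and prior range below are invented for illustration:

```python
import math

# Toy Bayesian model selection by direct evidence integration.  EXONEST
# uses MultiNest for this; here a brute-force grid marginalization
# illustrates the same quantity (the evidence Z) on a one-parameter model.
# The data and priors below are invented for illustration.

data = [0.8, 1.2, 0.9, 1.1, 1.0]
sigma = 0.2  # assumed known noise level

def log_like(mu):
    return sum(-0.5 * ((d - mu) / sigma) ** 2
               - math.log(sigma * math.sqrt(2 * math.pi)) for d in data)

# Model 1: unknown offset mu with a uniform prior on [-2, 2];
# Z1 = (1 / prior width) * integral of the likelihood over the prior.
lo, hi, n = -2.0, 2.0, 4001
step = (hi - lo) / (n - 1)
Z1 = sum(math.exp(log_like(lo + i * step)) for i in range(n)) * step / (hi - lo)

# Model 0: mu fixed at 0 (no free parameter): the evidence is the likelihood.
Z0 = math.exp(log_like(0.0))

bayes_factor = Z1 / Z0  # > 1 favors the offset model
```

With the invented data centered near 1, the offset model wins decisively despite the Occam penalty from its wide prior.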
DOE Office of Scientific and Technical Information (OSTI.GOV)
Konomi, Bledar A.; Karagiannis, Georgios; Sarkar, Avik
2014-05-16
Computer experiments (numerical simulations) are widely used in scientific research to study and predict the behavior of complex systems, which usually have responses consisting of a set of distinct outputs. High-resolution simulations are often computationally expensive and become impractical for parametric studies at different input values. To overcome these difficulties, we develop a Bayesian treed multivariate Gaussian process (BTMGP) as an extension of the Bayesian treed Gaussian process (BTGP) in order to model and evaluate a multivariate process. A suitable choice of covariance function and prior distributions facilitates the different Markov chain Monte Carlo (MCMC) moves. We utilize this model to sequentially sample the input space for the most informative values, taking into account model uncertainty and expertise gained. A simulation study demonstrates the use of the proposed method and compares it with alternative approaches. We apply the sequential sampling technique and the BTMGP to model the multiphase flow in a full-scale regenerator of a carbon capture unit. The application presented in this paper is an important tool for research into carbon dioxide emissions from thermal power plants.
NASA Astrophysics Data System (ADS)
Taylor, Stephen R.; Simon, Joseph; Sampson, Laura
2017-01-01
The final parsec of supermassive black-hole binary evolution is subject to the complex interplay of stellar loss-cone scattering, circumbinary disk accretion, and gravitational-wave emission, with binary eccentricity affected by all of these. The strain spectrum of gravitational-waves in the pulsar-timing band thus encodes rich information about the binary population's response to these various environmental mechanisms. Current spectral models have heretofore followed basic analytic prescriptions, and attempt to investigate these final-parsec mechanisms in an indirect fashion. Here we describe a new technique to directly probe the environmental properties of supermassive black-hole binaries through "Bayesian model-emulation". We perform black-hole binary population synthesis simulations at a restricted set of environmental parameter combinations, compute the strain spectra from these, then train a Gaussian process to learn the shape of the spectrum at any point in parameter space. We describe this technique, demonstrate its efficacy with a program of simulated datasets, then illustrate its power by directly constraining final-parsec physics in a Bayesian analysis of the NANOGrav 5-year dataset. The technique is fast, flexible, and robust.
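The "train a Gaussian process on a restricted set of simulator runs" idea above can be sketched in a few lines. This is a minimal one-parameter toy: an invented `simulator` stands in for a population-synthesis run, and the RBF kernel and its length scale are assumptions, not the paper's emulator.

```python
import math

# Minimal Gaussian-process emulator: train on a few "simulator" runs at
# selected parameter values, then predict the output at an untried point.
# The 1-D toy simulator and kernel hyperparameters are illustrative.

def rbf(a, b, length=1.0):
    return math.exp(-0.5 * ((a - b) / length) ** 2)

def solve(M, b):
    """Gaussian elimination with partial pivoting (small dense systems)."""
    n = len(b)
    M = [row[:] + [b_i] for row, b_i in zip(M, b)]  # augmented matrix
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def simulator(theta):   # stand-in for an expensive population-synthesis run
    return math.sin(theta)

X = [0.0, 1.0, 2.0, 3.0]                  # training parameter values
y = [simulator(t) for t in X]
K = [[rbf(a, b) + (1e-10 if i == j else 0.0)   # tiny jitter on the diagonal
      for j, b in enumerate(X)] for i, a in enumerate(X)]
alpha = solve(K, y)                        # alpha = K^{-1} y

def predict(theta):
    """GP posterior mean: k(theta, X) @ K^{-1} y."""
    return sum(rbf(theta, xi) * a for xi, a in zip(X, alpha))
```

The emulator reproduces the training runs exactly (up to jitter) and smooths between them, which is the property exploited to interpolate strain spectra across environmental-parameter space.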
NASA Astrophysics Data System (ADS)
Taylor, Stephen; Ellis, Justin; Gair, Jonathan
2014-11-01
We describe several new techniques which accelerate Bayesian searches for continuous gravitational-wave emission from supermassive black-hole binaries using pulsar-timing arrays. These techniques mitigate the problematic increase of search dimensionality with the size of the pulsar array which arises from having to include an extra parameter per pulsar as the array is expanded. This extra parameter corresponds to searching over the phase of the gravitational wave as it propagates past each pulsar so that we can coherently include the pulsar term in our search strategies. Our techniques make the analysis tractable with powerful evidence-evaluation packages like MultiNest. We find good agreement of our techniques with the parameter-estimation and Bayes factor evaluation performed with full signal templates and conclude that these techniques make excellent first-cut tools for detection and characterization of continuous gravitational-wave signals with pulsar-timing arrays. Crucially, at low to moderate signal-to-noise ratios the factor by which the analysis is sped up can be ≳100 , permitting rigorous programs of systematic injection and recovery of signals to establish robust detection criteria within a Bayesian formalism.
Bayesian Multiscale Analysis of X-Ray Jet Features in High Redshift Quasars
NASA Astrophysics Data System (ADS)
McKeough, Kathryn; Siemiginowska, A.; Kashyap, V.; Stein, N.
2014-01-01
X-ray emission of powerful quasar jets may be a result of the inverse Compton (IC) process, in which Cosmic Microwave Background (CMB) photons gain energy through interactions with the jet's relativistic electrons. However, there is no definite evidence that the IC/CMB process is responsible for the observed X-ray emission of large-scale jets. A step toward understanding the X-ray emission process is to study the radio and X-ray morphologies of the jet. We apply a sophisticated Bayesian image analysis program, Low-count Image Reconstruction and Analysis (LIRA) (Esch et al. 2004; Connors & van Dyk 2007), to analyze jet features in 11 Chandra images of high-redshift quasars (z ~ 2 - 4.8). Out of the 36 regions where knots are visible in the radio jets, nine showed detectable X-ray emission. We measured the ratios of the X-ray and radio luminosities of the detected features and found that they are consistent with the IC/CMB relationship. We derived a range of the bulk Lorentz factor (Γ) for detected jet features under the IC/CMB jet emission model. There is no discernible trend of Γ with redshift within the sample. The efficiency of the X-ray emission between the detected jet features and the corresponding quasar also shows no correlation with redshift. This work is supported in part by the National Science Foundation REU and the Department of Defense ASSURE programs under NSF Grant no. 1262851 and by the Smithsonian Institution, and by NASA Contract NAS8-39073 to the Chandra X-ray Center (CXC). This research has made use of data obtained from the Chandra Data Archive and Chandra Source Catalog, and software provided by the CXC in the application packages CIAO, ChIPS, and Sherpa. We thank Teddy Cheung for providing the VLA radio images. Connors, A., & van Dyk, D. A. 2007, Statistical Challenges in Modern Astronomy IV, 371, 101. Esch, D. N., Connors, A., Karovska, M., & van Dyk, D. A. 2004, ApJ, 610, 1213.
NASA Astrophysics Data System (ADS)
Susiluoto, Jouni; Raivonen, Maarit; Backman, Leif; Laine, Marko; Makela, Jarmo; Peltola, Olli; Vesala, Timo; Aalto, Tuula
2018-03-01
Estimating methane (CH4) emissions from natural wetlands is complex, and the estimates contain large uncertainties. The models used for the task are typically heavily parameterized, and the parameter values are not well known. In this study, we perform a Bayesian model calibration for a new wetland CH4 emission model to improve the quality of the predictions and to understand the limitations of such models. The detailed process model that we analyze contains descriptions of CH4 production from anaerobic respiration, CH4 oxidation, and gas transport by diffusion, ebullition, and the aerenchyma cells of vascular plants. The processes are controlled by several tunable parameters. We use a hierarchical statistical model to describe the parameters and obtain the posterior distributions of the parameters and uncertainties in the processes with adaptive Markov chain Monte Carlo (MCMC), importance resampling, and time series analysis techniques. For the estimation, the analysis utilizes measurement data from the Siikaneva flux measurement site in southern Finland. The uncertainties related to the parameters and the modeled processes are described quantitatively. At the process level, the flux measurement data are able to constrain the CH4 production processes, methane oxidation, and the different gas transport processes. The posterior covariance structures explain how the parameters and the processes are related. Additionally, the flux and flux component uncertainties are analyzed at both the annual and daily levels. The parameter posterior densities obtained provide information regarding the importance of the different processes, which is also useful for the development of wetland methane emission models other than the square root HelsinkI Model of MEthane buiLd-up and emIssion for peatlands (sqHIMMELI). The hierarchical modeling allows us to assess the effects of some of the parameters on an annual basis.
The results of the calibration and the cross validation suggest that the early spring net primary production could be used to predict parameters affecting the annual methane production. Even though the calibration is specific to the Siikaneva site, the hierarchical modeling approach is well suited for larger-scale studies, and the results of the estimation pave the way for a regional- or global-scale Bayesian calibration of wetland emission models.
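The MCMC calibration step described above can be sketched, much reduced, as a plain random-walk Metropolis sampler for a single parameter. The toy "flux model" (identity), the synthetic observations, and the prior bounds are assumptions for illustration, not the sqHIMMELI process model or its adaptive sampler:

```python
import math, random

# Random-walk Metropolis sampling of a one-parameter posterior, as a
# reduced sketch of Bayesian model calibration against flux data.
# All numbers here are invented for illustration.

random.seed(42)
obs = [2.1, 1.9, 2.2, 2.0]   # synthetic flux observations
noise = 0.2                   # assumed measurement noise (same units)

def log_post(theta):
    # Flat prior on [0, 10]; Gaussian likelihood around model output theta.
    if not 0.0 <= theta <= 10.0:
        return -math.inf
    return sum(-0.5 * ((o - theta) / noise) ** 2 for o in obs)

chain, theta = [], 5.0
for _ in range(5000):
    prop = theta + random.gauss(0.0, 0.5)          # random-walk proposal
    if math.log(random.random()) < log_post(prop) - log_post(theta):
        theta = prop                                # accept
    chain.append(theta)                             # else keep current value

posterior_mean = sum(chain[1000:]) / len(chain[1000:])  # discard burn-in
```

The posterior mean settles near the data mean (~2.05) with a posterior standard deviation of roughly noise/sqrt(4) = 0.1, which is the behavior a full hierarchical calibration generalizes to many parameters.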
Tomographic image of a seismically active volcano: Mammoth Mountain, California
Dawson, Phillip B.; Chouet, Bernard A.; Pitt, Andrew M.
2016-01-01
High-resolution tomographic P wave, S wave, and VP/VS velocity structure models are derived for Mammoth Mountain, California, using phase data from the Northern California Seismic Network and a temporary deployment of broadband seismometers. An anomalous volume (5.1 × 109 to 5.9 × 1010m3) of low P and low S wave velocities is imaged beneath Mammoth Mountain, extending from near the surface to a depth of ∼2 km below sea level. We infer that the reduction in seismic wave velocities is due to the presence of CO2 distributed in oblate spheroid pores with mean aspect ratio α = 1.6 × 10−3 to 7.9 × 10−3 (crack-like pores) and mean gas volume fraction ϕ = 8.1 × 10−4 to 3.4 × 10−3. The pore density parameter κ = 3ϕ/(4πα) = na3=0.11, where n is the number of pores per cubic meter and a is the mean pore equatorial radius. The total mass of CO2 is estimated to be 4.6 × 109 to 1.9 × 1011 kg. The local geological structure indicates that the CO2 contained in the pores is delivered to the surface through fractures controlled by faults and remnant foliation of the bedrock beneath Mammoth Mountain. The total volume of CO2 contained in the reservoir suggests that given an emission rate of 500 tons day−1, the reservoir could supply the emission of CO2 for ∼25–1040 years before depletion. Continued supply of CO2 from an underlying magmatic system would significantly prolong the existence of the reservoir.
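The quoted numbers can be checked directly from the definitions in the abstract (assuming metric tons for the 500 tons/day emission rate):

```python
import math

# Reproducing the back-of-the-envelope numbers quoted in the abstract.
phi_lo, phi_hi = 8.1e-4, 3.4e-3      # mean gas volume fraction range
alpha_lo, alpha_hi = 1.6e-3, 7.9e-3  # mean pore aspect ratio range

# Pore density parameter kappa = 3*phi / (4*pi*alpha); pairing the low
# ends and the high ends of the quoted ranges gives roughly the same
# value, consistent with the quoted kappa = 0.11.
kappa_lo = 3 * phi_lo / (4 * math.pi * alpha_lo)   # ~0.12
kappa_hi = 3 * phi_hi / (4 * math.pi * alpha_hi)   # ~0.10

# Depletion time at 500 metric tons of CO2 per day (5e5 kg/day).
rate = 500 * 1000.0                  # kg/day
years_lo = 4.6e9 / rate / 365.25     # ~25 years
years_hi = 1.9e11 / rate / 365.25    # ~1040 years
```

Both checks land on the abstract's figures (κ ≈ 0.11 and a ~25 to ~1040 year supply).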
Tomographic Image of a Seismically Active Volcano: Mammoth Mountain, California
NASA Astrophysics Data System (ADS)
Dawson, P. B.; Chouet, B. A.; Pitt, A. M.
2015-12-01
High-resolution tomographic P wave, S wave, and VP /VS velocity structure models are derived for Mammoth Mountain, California, using phase data from the Northern California Seismic Network and a temporary deployment of broadband seismometers. An anomalous volume (˜50 km3) of low P and low S wave velocities is imaged beneath Mammoth Mountain, extending from near the surface to a depth of ˜2 km below sea level. We infer that the reduction in seismic wave velocities is primarily due to the presence of CO2 distributed in oblate-spheroid pores with mean aspect ratio α ˜8 x 10-4 (crack-like pores) and gas volume fraction φ ˜4 x 10-4. The pore density parameter κ = 3φ / (4πα) = na3 = 0.12, where n is the number of pores per cubic meter and a is the mean pore equatorial radius. The total mass of CO2 is estimated to range up to ˜1.6 x 1010 kg if the pores exclusively contain CO2, although the presence of an aqueous phase may lower this estimate by up to one order of magnitude. The local geological structure indicates that the CO2 contained in the pores is delivered to the surface through fractures controlled by faults and remnant foliation of the bedrock beneath Mammoth Mountain. The total volume of CO2 contained in the reservoir suggests that given an emission rate of 5 x 105 kg day-1, the reservoir could supply the emission of CO2 for ˜8 to ˜90 years before depletion. Continued supply of CO2 from an underlying magmatic system would significantly prolong the existence of the reservoir.
NASA Astrophysics Data System (ADS)
Camera, S.; Fornasa, M.; Fornengo, N.; Regis, M.
2015-06-01
We recently proposed to cross-correlate the diffuse extragalactic γ-ray background with the gravitational lensing signal of cosmic shear. This represents a novel and promising strategy to search for annihilating or decaying particle dark matter (DM) candidates. In the present work, we demonstrate the potential of a tomographic-spectral approach: measuring the cross-correlation in separate bins of redshift and energy significantly improves the sensitivity to a DM signal. Indeed, the technique proposed here takes advantage of the different scaling of the astrophysical and DM components with redshift and, simultaneously, of their different energy spectra and different angular extensions. The sensitivity to a particle DM signal is extremely promising even when the DM-induced emission is quite faint. We first quantify the prospects of detecting DM by cross-correlating the Fermi Large Area Telescope (LAT) diffuse γ-ray background with the cosmic shear expected from the Dark Energy Survey. Under the hypothesis of a significant subhalo boost, such a measurement can deliver a 5σ detection of DM if the DM particle is lighter than 300 GeV and has a thermal annihilation rate. We then forecast the capability of the European Space Agency Euclid satellite (whose launch is planned for 2020), in combination with a hypothetical future γ-ray detector with slightly improved specifications compared to current telescopes. We predict that the cross-correlation of their data will allow a measurement of the DM mass with an uncertainty of a factor of 1.5-2, even for moderate subhalo boosts, for DM masses up to a few hundred GeV and thermal annihilation rates.
CTPPL: A Continuous Time Probabilistic Programming Language
2009-07-01
In recent years there has been a flurry of interest in continuous time models, mostly focused on continuous time Bayesian networks (CTBNs) [Nodelman, 2007] … CTBNs are built on homogeneous Markov processes. A homogeneous Markov process is a finite state, continuous time process, consisting of an initial … Some state transitions can produce emissions. In a CTBN, each variable has a conditional intensity matrix Qu for every combination of …
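A homogeneous Markov process with an intensity matrix, the building block of CTBNs named in the snippet above, can be simulated in a few lines; the two-state rate matrix here is an invented example, not taken from the CTPPL paper:

```python
import random

# A homogeneous continuous-time Markov process: the intensity matrix Q
# has off-diagonal entries equal to transition rates and rows summing
# to zero.  This toy two-state chain is an illustrative assumption.

Q = [[-0.5, 0.5],
     [ 2.0, -2.0]]

def simulate(state, horizon, rng):
    """Sample a trajectory [(time, state), ...] up to the time horizon."""
    t, path = 0.0, [(0.0, state)]
    while True:
        rate = -Q[state][state]               # total leaving rate
        t += rng.expovariate(rate)            # exponential sojourn time
        if t >= horizon:
            return path
        # Choose the next state with probability proportional to its rate.
        weights = [Q[state][j] if j != state else 0.0 for j in range(len(Q))]
        state = rng.choices(range(len(Q)), weights=weights)[0]
        path.append((t, state))

rng = random.Random(7)
path = simulate(0, 100.0, rng)
# Long-run fraction of time in state 0 tends to 2.0 / (0.5 + 2.0) = 0.8.
```

In a CTBN, each variable carries one such intensity matrix per configuration of its parents (the "Qu" of the snippet); the simulation step is the same.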
Tomographic Imaging of a Forested Area By Airborne Multi-Baseline P-Band SAR.
Frey, Othmar; Morsdorf, Felix; Meier, Erich
2008-09-24
In recent years, various attempts have been undertaken to obtain information about the structure of forested areas from multi-baseline synthetic aperture radar data. Tomographic processing of such data has been demonstrated for airborne L-band data, but the quality of the focused tomographic images is limited by several factors. In particular, the common Fourier-based focusing methods are susceptible to irregular and sparse sampling, two problems that are unavoidable in the case of multi-pass, multi-baseline SAR data acquired by an airborne system. In this paper, a tomographic focusing method based on the time-domain back-projection algorithm is proposed, which maintains the geometric relationship between the original sensor positions and the imaged target and is therefore able to cope with irregular sampling without introducing any approximations with respect to the geometry. The tomographic focusing quality is assessed by analysing the impulse response of simulated point targets and an in-scene corner reflector. In particular, several tomographic slices of a volume representing a forested area are given. The respective P-band tomographic data set, consisting of eleven flight tracks, was acquired by the airborne E-SAR sensor of the German Aerospace Center (DLR).
Offodile, Anaeze C; Chatterjee, Abhishek; Vallejo, Sergio; Fisher, Carla S; Tchou, Julia C; Guo, Lifei
2015-04-01
Computed tomographic angiography is a diagnostic tool increasingly used for preoperative vascular mapping in abdomen-based perforator flap breast reconstruction. This study compared the use of computed tomographic angiography and the conventional practice of Doppler ultrasonography only in postmastectomy reconstruction using a cost-utility model. Following a comprehensive literature review, a decision analytic model was created using the three most clinically relevant health outcomes in free autologous breast reconstruction with computed tomographic angiography versus Doppler ultrasonography only. Cost and utility estimates for each health outcome were used to derive the quality-adjusted life-years and the incremental cost-utility ratio. One-way sensitivity analysis was performed to scrutinize the robustness of the authors' results. Six studies and 782 patients were identified. Cost-utility analysis revealed a baseline cost savings of $3179 and a gain in quality-adjusted life-years of 0.25. This yielded an incremental cost-utility ratio of -$12,716, implying a dominant choice favoring preoperative computed tomographic angiography. Sensitivity analysis revealed that computed tomographic angiography was costlier when the operative time difference between the two techniques was less than 21.3 minutes. However, the clinical advantage of computed tomographic angiography over Doppler ultrasonography only showed that computed tomographic angiography would still remain the cost-effective option even if it offered no additional operating-time advantage. The authors' results show that computed tomographic angiography is a cost-effective technology for identifying lower abdominal perforators for autologous breast reconstruction. Although the perfect study would be a randomized controlled trial of the two approaches with true cost accrual, the authors' results represent the best available evidence.
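The incremental cost-utility ratio reported above follows directly from the baseline cost saving and QALY gain:

```python
# Reproducing the incremental cost-utility ratio from the abstract:
# ICUR = (cost_CTA - cost_DUS) / (QALY_CTA - QALY_DUS).
cost_saving = 3179.0   # CT angiography is cheaper by this amount (dollars)
qaly_gain = 0.25       # and gains this many quality-adjusted life-years

icur = -cost_saving / qaly_gain   # negative numerator: CTA costs less
```

A negative ratio with positive QALY gain is the "dominant" case: the strategy is both cheaper and more effective, so no willingness-to-pay threshold is needed.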
Subpercent-Scale Control of 3D Low Modes of Targets Imploded in Direct-Drive Configuration on OMEGA
NASA Astrophysics Data System (ADS)
Michel, D. T.; Igumenshchev, I. V.; Davis, A. K.; Edgell, D. H.; Froula, D. H.; Jacobs-Perkins, D. W.; Goncharov, V. N.; Regan, S. P.; Shvydky, A.; Campbell, E. M.
2018-03-01
Multiple self-emission x-ray images are used to tomographically measure target modes 1, 2, and 3 up to the end of the target acceleration in direct-drive implosions on OMEGA. Results show that the modes consist of two components: the first varies linearly with the laser beam-energy balance, and the second is static and results from physical effects including beam mistiming, mispointing, and uncertainty in beam energies. This is used to reduce the target low modes of low-adiabat implosions from 2.2% to 0.8% by adjusting the beam-energy balance to compensate for these static modes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hill, K. W.; Bitter, M. L.; Scott, S. D.
2009-03-24
A new spatially resolving x-ray crystal spectrometer capable of measuring continuous spatial profiles of high-resolution spectra (λ/dλ > 6000) of He-like and H-like Ar Kα lines with good spatial (~1 cm) and temporal (~10 ms) resolution has been installed on the Alcator C-Mod tokamak. Two spherically bent crystals image the spectra onto four two-dimensional Pilatus II pixel detectors. Tomographic inversion enables inference of the local line emissivity, ion temperature (Ti), and toroidal plasma rotation velocity (vφ) from the line Doppler widths and shifts. The data analysis technique …
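The inference of Ti and vφ from Doppler widths and shifts rests on two standard relations: the thermal width satisfies σλ/λ0 = sqrt(kTi/mc²) and the line-center shift gives v = cΔλ/λ0. A sketch with an approximate rest wavelength for the He-like Ar w line (the numerical inputs are illustrative, not C-Mod data):

```python
import math

# Standard Doppler relations behind the spectral inference described
# above.  The wavelength and input values are illustrative assumptions.
c = 2.998e8                 # speed of light, m/s
m_Ar = 39.95 * 1.661e-27    # argon ion mass, kg
lam0 = 3.9494e-10           # He-like Ar w line rest wavelength, m (approx.)

def ion_temperature(sigma_lam):
    """Ti in keV from the Gaussian line width: sigma/lam0 = sqrt(kTi/mc^2)."""
    Ti_joules = m_Ar * (c * sigma_lam / lam0) ** 2
    return Ti_joules / 1.602e-16   # joules -> keV

def rotation_velocity(delta_lam):
    """Toroidal velocity in m/s from the line-center Doppler shift."""
    return c * delta_lam / lam0

# e.g. a width of ~6.5e-14 m corresponds to ~1 keV, and a shift of
# ~1.3e-15 m to ~1 km/s, illustrating the precision the crystal
# spectrometer must deliver.
```
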
Multi-pinhole SPECT Imaging with Silicon Strip Detectors
Peterson, Todd E.; Shokouhi, Sepideh; Furenlid, Lars R.; Wilson, Donald W.
2010-01-01
Silicon double-sided strip detectors offer outstanding intrinsic spatial resolution with reasonable detection efficiency for iodine-125 emissions. This spatial resolution allows for multiple-pinhole imaging at low magnification, minimizing the problem of multiplexing. We have conducted imaging studies using a prototype system that utilizes a detector of 300-micrometer thickness and 50-micrometer strip pitch together with a 23-pinhole collimator. These studies include an investigation of the synthetic-collimator imaging approach, which combines multiple-pinhole projections acquired at multiple magnifications to obtain tomographic reconstructions from limited-angle data using the ML-EM algorithm. Sub-millimeter spatial resolution was obtained, demonstrating the basic validity of this approach. PMID:20953300
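The ML-EM reconstruction mentioned above iterates a multiplicative update of the voxel activities; a toy sketch of one such loop (the system matrix and problem sizes are invented for illustration):

```python
import numpy as np

# ML-EM update for emission tomography:
#   lam <- lam / (A^T 1) * A^T (y / (A lam))
# where A maps voxel activities to expected projection counts.
rng = np.random.default_rng(0)
A = rng.random((30, 10))          # toy system matrix (projections x voxels)
lam_true = rng.random(10) + 0.5   # "true" voxel activities
y = rng.poisson(A @ lam_true)     # Poisson projection data

def mlem(A, y, n_iter=50):
    lam = np.ones(A.shape[1])
    sens = A.sum(axis=0)                       # voxel sensitivities A^T 1
    for _ in range(n_iter):
        ratio = y / np.maximum(A @ lam, 1e-12) # guard against divide-by-zero
        lam *= (A.T @ ratio) / sens            # multiplicative EM update
    return lam

lam_hat = mlem(A, y)
```

The multiplicative form keeps the estimate nonnegative at every iteration, which is why ML-EM is standard for Poisson count data.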
Brain activation during human male ejaculation revisited.
Georgiadis, Janniko R; Reinders, A A T Simone; Van der Graaf, Ferdinand H C E; Paans, Anne M J; Kortekaas, Rudie
2007-04-16
In a prior [15O]-H2O positron emission tomographic study we reported brain regions involved in human male ejaculation. Here, we used another, more recently acquired data set to evaluate the methodological approach of this previous study, and discovered that part of the reported activation pattern was not related to ejaculation. With a new analysis of these ejaculation data, we now demonstrate ejaculation-related activations in the deep cerebellar nuclei (dentate nucleus), anterior vermis, pons, and ventrolateral thalamus, and, most importantly, ejaculation-related deactivations throughout the prefrontal cortex. This revision offers a new and more accurate insight into the brain regions involved in human male ejaculation.
Stochastic volatility of the futures prices of emission allowances: A Bayesian approach
NASA Astrophysics Data System (ADS)
Kim, Jungmu; Park, Yuen Jung; Ryu, Doojin
2017-01-01
Understanding the stochastic nature of the spot volatility of emission allowances is crucial for risk management in emissions markets. In this study, by adopting a stochastic volatility model with or without jumps to represent the dynamics of European Union Allowances (EUA) futures prices, we estimate the daily volatilities and model parameters by using the Markov Chain Monte Carlo method for stochastic volatility (SV), stochastic volatility with return jumps (SVJ) and stochastic volatility with correlated jumps (SVCJ) models. Our empirical results reveal four important features of emissions markets. First, the data presented herein suggest that EUA futures prices exhibit significant stochastic volatility. Second, the leverage effect is noticeable regardless of whether or not jumps are included. Third, the inclusion of jumps has a significant impact on the estimation of the volatility dynamics. Finally, the market becomes very volatile and large jumps occur at the beginning of a new phase. These findings are important for policy makers and regulators.
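As a sketch of the model class involved, the basic SV specification (without jumps) can be simulated in a few lines; the parameter values are illustrative, not the paper's estimates:

```python
import numpy as np

# Basic stochastic-volatility model: h_t is log-variance following an AR(1),
# r_t is the return scaled by the time-varying volatility exp(h_t / 2).
rng = np.random.default_rng(6)
mu_h, phi, sigma_h, n = -2.0, 0.95, 0.2, 2000  # illustrative parameters

h = np.empty(n)
h[0] = mu_h
for t in range(1, n):
    h[t] = mu_h + phi * (h[t - 1] - mu_h) + sigma_h * rng.normal()

r = np.exp(h / 2) * rng.normal(size=n)  # returns with stochastic volatility
```

The SVJ and SVCJ variants add jump terms to the return and volatility equations; the persistence parameter phi controls volatility clustering.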
NASA Astrophysics Data System (ADS)
Yee, Eugene
2007-04-01
Although a great deal of research effort has been focused on the forward prediction of the dispersion of contaminants (e.g., chemical and biological warfare agents) released into the turbulent atmosphere, much less work has been directed toward the inverse prediction of agent source location and strength from the measured concentration, even though the importance of this problem for a number of practical applications is obvious. In general, the inverse problem of source reconstruction is ill-posed and unsolvable without additional information. It is demonstrated that a Bayesian probabilistic inferential framework provides a natural and logically consistent method for source reconstruction from a limited number of noisy concentration data. In particular, the Bayesian approach permits one to incorporate prior knowledge about the source as well as additional information regarding both model and data errors. The latter enables a rigorous determination of the uncertainty in the inference of the source parameters (e.g., spatial location, emission rate, release time, etc.), hence extending the potential of the methodology as a tool for quantitative source reconstruction. A model (or, source-receptor relationship) that relates the source distribution to the concentration data measured by a number of sensors is formulated, and Bayesian probability theory is used to derive the posterior probability density function of the source parameters. A computationally efficient methodology for determination of the likelihood function for the problem, based on an adjoint representation of the source-receptor relationship, is described. Furthermore, we describe the application of efficient stochastic algorithms based on Markov chain Monte Carlo (MCMC) for sampling from the posterior distribution of the source parameters, the latter of which is required to undertake the Bayesian computation. 
The Bayesian inferential methodology for source reconstruction is validated against real dispersion data for two cases involving contaminant dispersion in highly disturbed flows over urban and complex environments where the idealizations of horizontal homogeneity and/or temporal stationarity in the flow cannot be applied to simplify the problem. Furthermore, the methodology is applied to the case of reconstruction of multiple sources.
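A stripped-down version of the MCMC sampling step can be sketched with a toy 1-D source-receptor model; the forward model, flat priors, and noise level are assumptions for illustration, not the dispersion model used above:

```python
import numpy as np

# Metropolis sampling of a source posterior. Toy source-receptor relation:
#   C(x_r) = q * exp(-(x_r - x_s)^2 / (2 L^2)),  unknowns x_s (location), q (rate).
rng = np.random.default_rng(1)
L, sigma = 2.0, 0.05
receptors = np.linspace(-5.0, 5.0, 20)

def forward(xs, q):
    return q * np.exp(-(receptors - xs) ** 2 / (2 * L ** 2))

data = forward(1.0, 0.8) + rng.normal(0, sigma, receptors.size)  # synthetic data

def log_post(theta):
    xs, q = theta
    if q <= 0 or abs(xs) > 5:          # flat priors with hard bounds
        return -np.inf
    r = data - forward(xs, q)
    return -0.5 * np.sum(r ** 2) / sigma ** 2

theta = np.array([0.0, 0.5])
lp = log_post(theta)
samples = []
for _ in range(5000):
    prop = theta + rng.normal(0, [0.2, 0.05])   # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:    # Metropolis accept/reject
        theta, lp = prop, lp_prop
    samples.append(theta.copy())
post = np.array(samples)[1000:]                 # discard burn-in
```

The posterior samples give not just point estimates but full marginal distributions for location and emission rate, which is the payoff of the Bayesian formulation.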
Top-down Estimates of Biomass Burning Emissions of Black Carbon in the Western United States
NASA Astrophysics Data System (ADS)
Mao, Y.; Li, Q.; Randerson, J. T.; Liou, K.
2011-12-01
We apply a Bayesian linear inversion to derive top-down estimates of biomass burning emissions of black carbon (BC) in the western United States (WUS) for May-November 2006 by inverting surface BC concentrations from the IMPROVE network using the GEOS-Chem chemical transport model. Model simulations are conducted at both 2°×2.5° (globally) and 0.5°×0.667° (nested over North America) horizontal resolutions. We first improve the spatial distributions and seasonal and interannual variations of the BC emissions from the Global Fire Emissions Database (GFEDv2) using MODIS 8-day active fire counts from 2005-2007. The GFEDv2 emissions in N. America are adjusted for three zones: boreal N. America, temperate N. America, and Mexico plus Central America. The resulting emissions are then used as a priori for the inversion. The a posteriori emissions are 2-5 times higher than the a priori in California and the Rockies. Model surface BC concentrations using the a posteriori estimate provide better agreement with IMPROVE observations (~20% increase in the Taylor skill score), including improved ability to capture the observed variability especially during June-July. However, model surface BC concentrations are still biased low by ~30%. Comparisons with the Fire Locating and Modeling of Burning Emissions (FLAMBE) are included.
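A Bayesian linear inversion of this kind has a closed-form solution in the standard optimal-estimation form; a minimal sketch with a synthetic Jacobian (all sizes, covariances, and values are illustrative):

```python
import numpy as np

# Analytic Bayesian linear inversion:
#   x_hat = x_a + S_a K^T (K S_a K^T + S_o)^{-1} (y - K x_a)
# K maps emissions x to observed concentrations y.
rng = np.random.default_rng(2)
n_emis, n_obs = 3, 50
K = rng.random((n_obs, n_emis))          # synthetic Jacobian
x_true = np.array([2.0, 1.0, 3.0])       # "true" emissions
x_a = np.ones(n_emis)                    # a priori emissions
S_a = np.eye(n_emis) * 4.0               # prior error covariance
S_o = np.eye(n_obs) * 0.01               # observation error covariance
y = K @ x_true + rng.normal(0, 0.1, n_obs)

G = S_a @ K.T @ np.linalg.inv(K @ S_a @ K.T + S_o)  # gain matrix
x_hat = x_a + G @ (y - K @ x_a)                     # a posteriori emissions
S_hat = (np.eye(n_emis) - G @ K) @ S_a              # posterior covariance
```

The posterior covariance S_hat quantifies the uncertainty reduction relative to the prior, the diagnostic used to compare retrieval constraints.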
Top-down Estimates of Biomass Burning Emissions of Black Carbon in the Western United States
NASA Astrophysics Data System (ADS)
Mao, Y.; Li, Q.; Randerson, J. T.; CHEN, D.; Zhang, L.; Liou, K.
2012-12-01
We apply a Bayesian linear inversion to derive top-down estimates of biomass burning emissions of black carbon (BC) in the western United States (WUS) for May-November 2006 by inverting surface BC concentrations from the IMPROVE network using the GEOS-Chem chemical transport model. Model simulations are conducted at both 2°×2.5° (globally) and 0.5°×0.667° (nested over North America) horizontal resolutions. We first improve the spatial distributions and seasonal and interannual variations of the BC emissions from the Global Fire Emissions Database (GFEDv2) using MODIS 8-day active fire counts from 2005-2007. The GFEDv2 emissions in N. America are adjusted for three zones: boreal N. America, temperate N. America, and Mexico plus Central America. The resulting emissions are then used as a priori for the inversion. The a posteriori emissions are 2-5 times higher than the a priori in California and the Rockies. Model surface BC concentrations using the a posteriori estimate provide better agreement with IMPROVE observations (~50% increase in the Taylor skill score), including improved ability to capture the observed variability especially during June-September. However, model surface BC concentrations are still biased low by ~30%. Comparisons with the Fire Locating and Modeling of Burning Emissions (FLAMBE) are included.
On the capability of IASI measurements to inform about CO surface emissions
NASA Astrophysics Data System (ADS)
Fortems-Cheiney, A.; Chevallier, F.; Pison, I.; Bousquet, P.; Carouge, C.; Clerbaux, C.; Coheur, P.-F.; George, M.; Hurtmans, D.; Szopa, S.
2009-03-01
Between July and November 2008, simultaneous observations were conducted by several orbiting instruments that monitor carbon monoxide in the atmosphere, among them the Infrared Atmospheric Sounding Instrument (IASI) and Measurements Of Pollution In The Troposphere (MOPITT). In this paper, the concentration retrievals at about 700 hPa from these two instruments are successively used in a variational Bayesian system to infer the global distribution of CO emissions. Our posterior estimate of CO emissions using IASI retrievals gives a total of 793 Tg for the considered period, which is 40% higher than the global budget calculated with the MOPITT data (566 Tg). Over six continental regions (Eurasian Boreal, South Asia, South East Asia, North American Boreal, Northern Africa and South American Temperate), and thanks to a better observation density, the theoretical uncertainty reduction obtained with the IASI retrievals is better than or similar to that obtained with MOPITT. For the other continental regions, IASI constrains the emissions less than MOPITT because of its lower sensitivity in the lower troposphere. These first results indicate that IASI may play a major role in the quantification of CO emissions.
Thermal Aging of Oceanic Asthenosphere
NASA Astrophysics Data System (ADS)
Paulson, E.; Jordan, T. H.
2013-12-01
To investigate the depth extent of mantle thermal aging beneath ocean basins, we project 3D Voigt-averaged S-velocity variations from an ensemble of global tomographic models onto a 1x1 degree age-based regionalization and average over bins delineated by equal increments in the square-root of crustal age. From comparisons among the bin-averaged S-wave profiles, we estimate age-dependent convergence depths (minimum depths where the age variations become statistically insignificant) as well as S travel times from these depths to a shallow reference surface. Using recently published techniques (Jordan & Paulson, JGR, doi:10.1002/jgrb.50263, 2013), we account for the aleatory variability in the bin-averaged S-wave profiles using the angular correlation functions of the individual tomographic models, we correct the convergence depths for vertical-smearing bias using their radial correlation functions, and we account for epistemic uncertainties through Bayesian averaging over the tomographic model ensemble. From this probabilistic analysis, we can assert with 90% confidence that the age-correlated variations in Voigt-averaged S velocities persist to depths greater than 170 km; i.e., more than 100 km below the mean depth of the G discontinuity (~70 km). Moreover, the S travel time above the convergence depth decays almost linearly with the square-root of crustal age out to 200 Ma, consistent with a half-space cooling model. Given the strong evidence that the G discontinuity approximates the lithosphere-asthenosphere boundary (LAB) beneath ocean basins, we conclude that the upper (and probably weakest) part of the oceanic asthenosphere, like the oceanic lithosphere, participates in the cooling that forms the kinematic plates, or tectosphere. In other words, the thermal boundary layer of a mature oceanic plate appears to be more than twice the thickness of its mechanical boundary layer. 
We do not discount the possibility that small-scale convection creates heterogeneities in the oceanic upper mantle; however, the large-scale flow evidently advects these small-scale heterogeneities along with the plates, allowing the upper part of the asthenosphere to continue cooling with lithospheric age. The dominance of this large-scale horizontal flow may be related to the high stresses associated with its channelization in a thin (~100 km) asthenosphere, as well as the possible focusing of the subtectospheric strain in a low-viscosity channel immediately above the 410-km discontinuity. These speculations aside, the observed thermal aging of oceanic asthenosphere is inconsistent with a tenet of plate tectonics, the LAB hypothesis, which states that lithospheric plates are decoupled from deeper mantle flow by a shear zone in the upper part of the asthenosphere.
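The square-root-of-age scaling invoked above is the signature of half-space cooling, in which the thermal boundary layer grows as ~sqrt(kappa t); a quick illustration (the prefactor 2.32 corresponds to the conventional definition of the layer base at 90% of ambient temperature):

```python
import math

# Half-space cooling: thermal boundary-layer thickness ~ 2.32 * sqrt(kappa * t).
KAPPA = 1.0e-6         # thermal diffusivity, m^2/s (typical mantle value)
SEC_PER_MA = 3.156e13  # seconds per million years

def tbl_thickness_km(age_ma):
    """Thermal boundary-layer thickness (km) at a given crustal age (Ma)."""
    return 2.32 * math.sqrt(KAPPA * age_ma * SEC_PER_MA) / 1.0e3

# At ~100 Ma this gives roughly 130 km, comfortably below the ~70 km depth of
# the G discontinuity, consistent with the deep convergence depths reported.
d100 = tbl_thickness_km(100.0)
```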
Tomographic Reconstruction from a Few Views: A Multi-Marginal Optimal Transport Approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abraham, I., E-mail: isabelle.abraham@cea.fr; Abraham, R., E-mail: romain.abraham@univ-orleans.fr; Bergounioux, M., E-mail: maitine.bergounioux@univ-orleans.fr
2017-02-15
In this article, we focus on tomographic reconstruction. The problem is to determine the shape of an interior interface using a tomographic approach when only very few X-ray radiographs are available. We use a multi-marginal optimal transport approach. Preliminary numerical results are presented.
Bayesian analysis of X-ray jet features of the high redshift quasar jets observed with Chandra
NASA Astrophysics Data System (ADS)
McKeough, Kathryn; Siemiginowska, Aneta; Kashyap, Vinay; Stein, Nathan; Cheung, Chi C.
2015-01-01
X-ray emission of powerful quasar jets may be a result of the inverse Compton (IC) process in which Cosmic Microwave Background (CMB) photons gain energy through interactions with the jet's relativistic electrons. However, there is no definitive evidence that the IC/CMB process is responsible for the observed X-ray emission of large-scale jets. A step toward understanding the X-ray emission process is to study the radio and X-ray morphologies of the jet. Results from Chandra X-ray and multi-frequency VLA imaging observations of a sample of 11 high-redshift (z > 2) quasars with kilo-parsec scale radio jets are reported. The sample consists of a set of four z ≥ 3.6 flat-spectrum radio quasars, and seven intermediate redshift (z = 2.1 - 2.9) quasars comprised of four sources with integrated steep radio spectra and three with flat radio spectra. We implement a Bayesian image analysis program, Low-count Image Reconstruction and Analysis (LIRA), to analyze jet features in the X-ray images of the high redshift quasars. Out of the 36 regions where knots are visible in the radio jets, nine showed detectable X-ray emission; significant detections are established with an upper-bound p-value test derived from LIRA simulations. The combined X-ray and radio properties of this sample are examined and compared to lower-redshift samples. This work is supported in part by the National Science Foundation REU and the Department of Defense ASSURE programs under NSF Grant no. 1262851 and by the Smithsonian Institution, and by NASA Contract NAS8-39073 to the Chandra X-ray Center (CXC). This research has made use of data obtained from the Chandra Data Archive and Chandra Source Catalog, and software provided by the CXC in the application packages CIAO, ChIPS, and Sherpa. Work is also supported by the Chandra grant GO4-15099X.
Preliminary studies of a simultaneous PET/MRI scanner based on the RatCAP small animal tomograph
NASA Astrophysics Data System (ADS)
Woody, C.; Schlyer, D.; Vaska, P.; Tomasi, D.; Solis-Najera, S.; Rooney, W.; Pratte, J.-F.; Junnarkar, S.; Stoll, S.; Master, Z.; Purschke, M.; Park, S.-J.; Southekal, S.; Kriplani, A.; Krishnamoorthy, S.; Maramraju, S.; O'Connor, P.; Radeka, V.
2007-02-01
We are developing a scanner that will allow simultaneous acquisition of high resolution anatomical data using magnetic resonance imaging (MRI) and quantitative physiological data using positron emission tomography (PET). The approach is based on the technology used for the RatCAP conscious small animal PET tomograph which utilizes block detectors consisting of pixelated arrays of LSO crystals read out with matching arrays of avalanche photodiodes and a custom-designed ASIC. The version of this detector used for simultaneous PET/MRI imaging will be constructed out of all nonmagnetic materials and will be situated inside the MRI field. We have demonstrated that the PET detector and its electronics can be operated inside the MRI, and have obtained MRI images with various detector components located inside the MRI field. The MRI images show minimal distortion in this configuration even where some components still contain traces of certain magnetic materials. We plan to improve on the image quality in the future using completely non-magnetic components and by tuning the MRI pulse sequences. The combined result will be a highly compact, low mass PET scanner that can operate inside an MRI magnet without distorting the MRI image, and can be retrofitted into existing MRI instruments.
Progress In The Development Of A Tomographic SPECT System For Online Dosimetry In BNCT
NASA Astrophysics Data System (ADS)
Minsky, D. M.; Valda, A.; Kreiner, A. J.; Burlon, A. A.; Green, S.; Wojnecki, C.; Ghani, Z.
2010-08-01
In boron neutron capture therapy (BNCT) the delivered dose to the patient depends both on the neutron beam characteristics and on the 10B body distribution which, in turn, is governed by the tumor specificity of the 10B drug-carrier. BNCT dosimetry is a complex matter due to the several interactions that neutrons can undergo with the different nuclei present in tissue. However, the boron capture reaction 10B(n,α)7Li accounts for about 80 % of the total dose in a tumor with a 40 ppm 10B concentration. Present dosimetric methods are indirect, based on drug biodistribution statistical data, and subject to inter- and intra-patient variability. In order to overcome the consequences of the concomitant high dosimetric uncertainties, we propose a SPECT (Single Photon Emission Tomography) approach based on the detection of the prompt gamma ray (478 keV) emitted in 94 % of the cases from 7Li. For this purpose we designed, built and tested a prototype based on LaBr3(Ce) scintillators. Measurements on a head and tumor phantom were performed in the accelerator-based BNCT facility of the University of Birmingham (UK), yielding the first tomographic image of the 10B capture distribution obtained in a BNCT facility.
NASA Astrophysics Data System (ADS)
Rajaona, Harizo; Septier, François; Armand, Patrick; Delignon, Yves; Olry, Christophe; Albergel, Armand; Moussafir, Jacques
2015-12-01
In the eventuality of an accidental or intentional atmospheric release, the reconstruction of the source term using measurements from a set of sensors is an important and challenging inverse problem. A rapid and accurate estimation of the source allows faster and more efficient action for first-response teams, in addition to providing better damage assessment. This paper presents a Bayesian probabilistic approach to estimate the location and the temporal emission profile of a pointwise source. The release rate is evaluated analytically by using a Gaussian assumption on its prior distribution, and is enhanced with a positivity constraint to improve the estimation. The source location is obtained by means of an advanced iterative Monte-Carlo technique called Adaptive Multiple Importance Sampling (AMIS), which uses a recycling process at each iteration to accelerate its convergence. The proposed methodology is tested using synthetic and real concentration data in the framework of the Fusion Field Trials 2007 (FFT-07) experiment. The quality of the obtained results is comparable to those coming from the Markov Chain Monte Carlo (MCMC) algorithm, a popular Bayesian method used for source estimation. Moreover, the adaptive processing of the AMIS provides a better sampling efficiency by reusing all the generated samples.
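In the spirit of AMIS, a minimal adaptive importance sampler that refits its Gaussian proposal and recycles all past samples can be sketched as follows. This simplified version omits AMIS's deterministic-mixture reweighting, and the 1-D Gaussian target is a stand-in for the source posterior:

```python
import numpy as np

rng = np.random.default_rng(3)

def log_target(x):
    # Unnormalized log posterior; toy stand-in: N(2.0, sd 0.5).
    return -0.5 * (x - 2.0) ** 2 / 0.25

mu, sig = 0.0, 3.0  # initial (deliberately broad) Gaussian proposal
xs, lws = [], []
for _ in range(5):
    x = rng.normal(mu, sig, 500)
    # log importance weight = log target - log proposal (constants dropped)
    lw = log_target(x) + 0.5 * ((x - mu) / sig) ** 2 + np.log(sig)
    xs.append(x)
    lws.append(lw)
    # recycle ALL past samples when refitting the proposal
    lw_all = np.concatenate(lws)
    w = np.exp(lw_all - lw_all.max())
    w /= w.sum()
    x_all = np.concatenate(xs)
    mu = float(np.sum(w * x_all))                           # refit mean
    sig = float(np.sqrt(np.sum(w * (x_all - mu) ** 2))) + 1e-6  # refit sd

post_mean = mu
```

Recycling means no sample is wasted: each iteration's proposal is fitted to the full weighted history, which is the source of AMIS's efficiency gain over plain importance sampling.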
NASA Astrophysics Data System (ADS)
Nguyen, Emmanuel; Antoni, Jerome; Grondin, Olivier
2009-12-01
In the automotive industry, the necessary reduction of pollutant emissions for new Diesel engines requires the control of combustion events. This control is efficient provided that combustion parameters such as combustion occurrence and combustion energy are relevant. Combustion parameters are traditionally measured with cylinder pressure sensors. However, this kind of sensor is expensive and has a limited lifetime. This paper therefore proposes to use only one cylinder pressure sensor on a multi-cylinder engine and to extract the combustion parameters of the other cylinders from low-cost knock sensors. Knock sensors measure the vibration circulating in the engine block; hence they not only contain information on the combustion processes but are also contaminated by other mechanical noises that corrupt the signal. The question is how to combine the information coming from one cylinder pressure sensor and the knock sensors to obtain the most relevant combustion parameters in all engine cylinders. In this paper, the issue is addressed through the Bayesian inference formalism. In the cylinder where a pressure sensor is mounted, combustion parameters are measured directly; in the other cylinders, they are inferred via Bayesian inference. Experimental results obtained on a four-cylinder Diesel engine demonstrate the effectiveness of the proposed algorithm.
Liu, Yixin; Zhou, Kai; Lei, Yu
2015-01-01
High temperature gas sensors have been highly demanded for combustion process optimization and toxic emissions control, but they usually suffer from poor selectivity. In order to solve this selectivity issue and to identify unknown reducing gas species (CO, CH4, and C3H8) and their concentrations, a high temperature resistive sensor array data set was built in this study based on 5 reported sensors. Since each sensor showed specific responses toward different types of reducing gas at given concentrations, calibration curves were fitted to these responses, providing a benchmark sensor array response database. A Bayesian inference framework was then utilized to process the sensor array data and build a sample selection program that simultaneously identifies gas species and concentration, by formulating a proper likelihood between the measured sensor array response pattern of an unknown gas and each sampled sensor array response pattern in the benchmark database. The algorithm shows good robustness: it can accurately identify gas species and predict gas concentration with an error of less than 10% based on a limited amount of experimental data. These features indicate that the Bayesian probabilistic approach is a simple and efficient way to process sensor array data, one that can significantly reduce the required computational overhead and training data.
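The identification step can be sketched as a posterior over database entries under a Gaussian likelihood; the 3-sensor calibration "database" and noise level below are synthetic, invented purely for illustration:

```python
import numpy as np

# Bayesian gas identification: posterior over (species, concentration)
# entries in a benchmark sensor-array response database.
rng = np.random.default_rng(4)
concs = [50.0, 100.0, 200.0]  # ppm grid of synthetic calibration curves
benchmark = {("CO", c): np.array([0.8, 0.3, 0.5]) * np.log1p(c) for c in concs}
benchmark.update(
    {("CH4", c): np.array([0.2, 0.9, 0.4]) * np.log1p(c) for c in concs})

def identify(measured, noise=0.1):
    """Most probable (species, concentration) under a Gaussian likelihood."""
    keys = list(benchmark)
    loglik = np.array(
        [-0.5 * np.sum((measured - benchmark[k]) ** 2) / noise ** 2
         for k in keys])
    post = np.exp(loglik - loglik.max())
    post /= post.sum()  # flat prior over database entries
    return keys[int(np.argmax(post))]

# An unknown sample: the CO @ 100 ppm pattern plus small measurement noise.
obs = benchmark[("CO", 100.0)] + rng.normal(0.0, 0.05, 3)
best = identify(obs)
```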
NASA Astrophysics Data System (ADS)
Haas, Edwin; Santabarbara, Ignacio; Kiese, Ralf; Butterbach-Bahl, Klaus
2017-04-01
Numerical simulation models are increasingly used to estimate greenhouse gas emissions at site to regional/national scale and are outlined as the most advanced methodology (Tier 3) in the framework of UNFCCC reporting. Process-based models incorporate the major processes of the carbon and nitrogen cycles of terrestrial ecosystems and are thus thought to be widely applicable under various conditions and at various spatial scales. Process-based modelling requires high-spatial-resolution input data on soil properties, climate drivers and management information. The acceptance of model-based inventory calculations depends on the assessment of the inventory's uncertainty (model-, input data- and parameter-induced uncertainties). In this study we fully quantify the uncertainty in modelling soil N2O and NO emissions from arable, grassland and forest soils using the biogeochemical model LandscapeDNDC. We address model-induced uncertainty (MU) by contrasting two different soil biogeochemistry modules within LandscapeDNDC. The parameter-induced uncertainty (PU) was assessed by using joint parameter distributions for key parameters describing microbial C and N turnover processes, as obtained by different Bayesian calibration studies for each model configuration. Input data-induced uncertainty (DU) was addressed by Bayesian calibration of soil properties, climate drivers and agricultural management practice data. For MU, DU and PU we performed several hundred simulations each to contribute to the individual uncertainty assessment. For the overall uncertainty quantification we assessed the model prediction probability over sampled sets of input data and parameter distributions. Statistical analysis of the simulation results has been used to quantify the overall uncertainty of the modelling approach. With this study we can contrast the variation in model results with the different sources of uncertainty for each ecosystem.
Furthermore, we have been able to perform a full uncertainty analysis for modelling N2O and NO emissions from arable, grassland and forest soils, which is necessary for the interpretation of modelling results. We have applied the methodology to a regional inventory to assess the overall modelling uncertainty of a regional N2O and NO emissions inventory for the state of Saxony, Germany.
System and method for generating motion corrected tomographic images
Gleason, Shaun S [Knoxville, TN; Goddard, Jr., James S.
2012-05-01
A method and related system for generating motion corrected tomographic images includes the steps of illuminating a region of interest (ROI) to be imaged, the ROI being part of an unrestrained live subject and having at least three spaced apart optical markers thereon. Simultaneous images of the markers are acquired from different angles by a first and a second camera. Motion data comprising the 3D position and orientation of the markers relative to an initial reference position is then calculated. Motion corrected tomographic data is then obtained from the ROI using the motion data, from which motion corrected tomographic images are generated.
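The 3D position-and-orientation step can be illustrated with the standard Kabsch/Procrustes solution for the rigid motion of a marker set relative to its reference position; the marker coordinates below are invented, and at least three non-collinear markers are needed, as in the method above:

```python
import numpy as np

def rigid_motion(ref, cur):
    """Least-squares rotation R and translation t with cur ≈ ref @ R.T + t."""
    ref_c, cur_c = ref - ref.mean(0), cur - cur.mean(0)   # center both sets
    U, _, Vt = np.linalg.svd(cur_c.T @ ref_c)
    D = np.diag([1.0, 1.0, np.linalg.det(U @ Vt)])        # avoid reflections
    R = U @ D @ Vt
    t = cur.mean(0) - R @ ref.mean(0)
    return R, t

# Toy check: rotate reference markers 10 degrees about z and shift them.
th = np.radians(10.0)
R_true = np.array([[np.cos(th), -np.sin(th), 0.0],
                   [np.sin(th),  np.cos(th), 0.0],
                   [0.0,         0.0,        1.0]])
ref = np.array([[0.0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10]])
cur = ref @ R_true.T + np.array([1.0, -2.0, 0.5])
R_est, t_est = rigid_motion(ref, cur)
```

With noise-free markers the recovered pose is exact; with real stereo-camera data the same least-squares solution gives the best-fit rigid motion.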
Herrera, Ronald; Radon, Katja; von Ehrenstein, Ondine S; Cifuentes, Stella; Muñoz, Daniel Moraga; Berger, Ursula
2016-06-07
In a community in northern Chile, explosive procedures are used by two local industrial mines (gold, copper). We hypothesized that the prevalence of asthma and rhinoconjunctivitis in the community may be associated with air pollution emissions generated by the mines. A cross-sectional study of 288 children (aged 6-15 years) was conducted in a community in northern Chile using a validated questionnaire in 2009. The proximity between each child's place of residence and the mines was assessed as indicator of exposure to mining related air pollutants. Logistic regression, semiparametric models and spatial Bayesian models with a parametric form for distance were used to calculate odds ratios and 95 % confidence intervals. The prevalence of asthma and rhinoconjunctivitis was 24 and 34 %, respectively. For rhinoconjunctivitis, the odds ratio for average distance between both mines and child's residence was 1.72 (95 % confidence interval 1.00, 3.04). The spatial Bayesian models suggested a considerable increase in the risk for respiratory diseases closer to the mines, and only beyond a minimum distance of more than 1800 m the health impact was considered to be negligible. The findings indicate that air pollution emissions related to industrial gold or copper mines mainly occurring in rural Chilean communities might increase the risk of respiratory diseases in children.
A Bayesian analysis of redshifted 21-cm H I signal and foregrounds: simulations for LOFAR
NASA Astrophysics Data System (ADS)
Ghosh, Abhik; Koopmans, Léon V. E.; Chapman, E.; Jelić, V.
2015-09-01
Observations of the epoch of reionization (EoR) using the 21-cm hyperfine emission of neutral hydrogen (H I) promise to open an entirely new window on the formation of the first stars, galaxies and accreting black holes. In order to characterize the weak 21-cm signal, we need to develop imaging techniques that can reconstruct the extended emission very precisely. Here, we present an inversion technique for LOw Frequency ARray (LOFAR) baselines at the North Celestial Pole (NCP), based on a Bayesian formalism with optimal spatial regularization, which is used to reconstruct the diffuse foreground map directly from the simulated visibility data. We notice that the spatial regularization de-noises the images to a large extent, allowing one to recover the 21-cm power spectrum over a considerable k⊥-k∥ space in the range 0.03 Mpc-1 < k⊥ < 0.19 Mpc-1 and 0.14 Mpc-1 < k∥ < 0.35 Mpc-1 without subtracting the noise power spectrum. We find that, in combination with using generalized morphological component analysis (GMCA), a non-parametric foreground removal technique, we can mostly recover the spherical average power spectrum within 2σ statistical fluctuations for an input Gaussian random root-mean-square noise level of 60 mK in the maps after 600 h of integration over a 10-MHz bandwidth.
Toward a probabilistic acoustic emission source location algorithm: A Bayesian approach
NASA Astrophysics Data System (ADS)
Schumacher, Thomas; Straub, Daniel; Higgins, Christopher
2012-09-01
Acoustic emissions (AE) are stress waves initiated by sudden strain releases within a solid body. These can be caused by internal mechanisms such as crack opening or propagation, crushing, or rubbing of crack surfaces. One application for the AE technique in the field of Structural Engineering is Structural Health Monitoring (SHM). With piezoelectric sensors mounted to the surface of the structure, stress waves can be detected, recorded, and stored for later analysis. An important step in quantitative AE analysis is the estimation of the stress wave source locations. Commonly, source location results are presented in a rather deterministic manner as spatial and temporal points, excluding information about uncertainties and errors. Due to variability in the material properties and uncertainty in the mathematical model, measures of uncertainty are needed beyond best-fit point solutions for source locations. This paper introduces a novel holistic framework for the development of a probabilistic source location algorithm. Bayesian analysis methods with Markov Chain Monte Carlo (MCMC) simulation are employed, where all source location parameters are described with posterior probability density functions (PDFs). The proposed methodology is applied to an example employing data collected from a realistic section of a reinforced concrete bridge column. The selected approach is general and has the advantage that it can be extended and refined efficiently. Results are discussed and future steps to improve the algorithm are suggested.
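As an illustration of posterior (rather than point) source location, arrival-time differences can be inverted on a grid; the sensor layout, wave speed, and timing noise are illustrative assumptions (the paper itself uses MCMC rather than grid evaluation):

```python
import numpy as np

# Posterior over a 2-D AE source location from sensor arrival times.
rng = np.random.default_rng(5)
sensors = np.array([[0.0, 0], [1, 0], [0, 1], [1, 1]])  # sensor positions, m
v, sigma_t = 4000.0, 2e-6                               # wave speed m/s, timing sd s
src_true = np.array([0.3, 0.7])

def arrivals(src):
    return np.linalg.norm(sensors - src, axis=1) / v

t_obs = arrivals(src_true) + rng.normal(0, sigma_t, 4)  # noisy arrival times

# Use arrival-time DIFFERENCES to sensor 0 so the unknown origin time cancels.
gx = gy = np.linspace(0, 1, 101)
X, Y = np.meshgrid(gx, gy)
logp = np.zeros_like(X)
for i in range(X.shape[0]):
    for j in range(X.shape[1]):
        t = arrivals(np.array([X[i, j], Y[i, j]]))
        r = (t - t[0]) - (t_obs - t_obs[0])
        logp[i, j] = -0.5 * np.sum(r ** 2) / sigma_t ** 2

post = np.exp(logp - logp.max())
post /= post.sum()                          # normalized posterior PDF on the grid
ix = np.unravel_index(np.argmax(post), post.shape)
src_map = np.array([X[ix], Y[ix]])          # maximum a posteriori location
```

The full grid `post` carries the location uncertainty that a single best-fit point discards, which is the point of the probabilistic formulation.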
Optical tomographic memories: algorithms for the efficient information readout
NASA Astrophysics Data System (ADS)
Pantelic, Dejan V.
1990-07-01
Tomographic algorithms are modified in order to reconstruct information previously stored by focusing laser radiation in a volume of photosensitive media. A priori information about the positions of the bits of information is used. 1. THE PRINCIPLES OF TOMOGRAPHIC MEMORIES. Tomographic principles can be used to store and reconstruct information artificially stored in the bulk of a photosensitive medium. The information is stored by changing some characteristic of the memory material (e.g., its refractive index). Radiation from two independent light sources (e.g., lasers) is focused inside the memory material; in this way the intensity of the light is above the writing threshold only at the localized point where the light rays intersect. By scanning the material, the information can be stored in binary or n-ary format. Once the information is stored, it can be read by tomographic methods. However, the situation is quite different from the classical tomographic problem: here a great deal of a priori information is present regarding the positions of the bits of information, the profile representing a single bit, and the mode of operation (binary or n-ary). 2. ALGORITHMS FOR THE READOUT OF THE TOMOGRAPHIC MEMORIES. A priori information enables efficient reconstruction of the memory contents. In this paper a few methods for the information readout, together with simulation results, are presented. Special attention is given to noise considerations. Two different
Extreme ultraviolet diagnostic upgrades for kink mode control on the HBT-EP tokamak
NASA Astrophysics Data System (ADS)
Levesque, J. P.; Brooks, J. W.; Desanto, S.; Mauel, M. E.; Navratil, G. A.; Page, J. W.; Hansen, C. J.; Delgado-Aparicio, L.
2016-10-01
Optical diagnostics can provide non-invasive measurements of tokamak equilibria and the internal characteristics of MHD mode activity. We present research plans and ongoing progress on upgrading extreme ultraviolet (EUV) diagnostics in the HBT-EP tokamak. Four sets of 16 poloidal views will allow tomographic reconstruction of plasma emissivity and internal kink mode structure. Emission characteristics of naturally-occurring m/n = 2/1, 3/2, and 3/1 tearing and kink modes will be compared with expectations from a synthetic diagnostic. Coupling between internal and external modes leading up to disruptions is studied. The internal plasma response to external magnetic perturbations is investigated, and compared with magnetic response measurements. Correlation between internal emissivity and external magnetic measurements provides a global picture of long-wavelength MHD instabilities. Measurements are input to HBT-EP's GPU-based feedback system, allowing active feedback for kink modes using only optical sensors and both magnetic and edge current actuators. A separate two-color, 16-chord tangential system will be installed next year to allow reconstruction of temperature profiles and their fluctuations versus time. Supported by U.S. DOE Grant DE-FG02-86ER53222.
DOE R&D Accomplishments Database
Phelps, M. E.; Hoffman, E. J.; Huang, S. C.; Schelbert, H. R.; Kuhl, D. E.
1978-01-01
Emission computed tomography (ECT) can provide a quantitative in vivo measurement of regional tissue radionuclide tracer concentrations. This facility, when combined with physiologic models and radioactively labeled physiologic tracers that behave in a predictable manner, allows measurement of a wide variety of physiologic variables. This integrated technique has been referred to as Physiologic Tomography (PT). PT requires labeled compounds which trace physiologic processes in a known and predictable manner, and physiologic models which are appropriately formulated and validated to derive physiologic variables from ECT data. In order to effectively achieve this goal, PT requires an ECT system that is capable of performing truly quantitative or analytical measurements of tissue tracer concentrations and that has been well characterized in terms of spatial resolution, sensitivity, and signal-to-noise ratios in the tomographic image. This paper illustrates the capabilities of emission computed tomography and provides examples of physiologic tomography for the regional measurement of cerebral and myocardial metabolic rate for glucose, regional measurement of cerebral blood volume, gated cardiac blood pools, and capillary perfusion in brain and heart. Studies on patients with stroke and myocardial ischemia are also presented.
Development of an oximeter for neurology
NASA Astrophysics Data System (ADS)
Aleinik, A.; Serikbekova, Z.; Zhukova, N.; Zhukova, I.; Nikitina, M.
2016-06-01
Cerebral desaturation can occur during surgical manipulation while other parameters vary insignificantly. Prolonged intervals of cerebral anoxia can cause serious damage to the nervous system. A commonly used method for measuring cerebral blood flow relies on invasive catheters. Other techniques include single photon emission computed tomography (SPECT), positron emission tomography (PET), and magnetic resonance imaging (MRI). Tomographic methods frequently require isotope administration, which may result in anaphylactic reactions to contrast media and associated nerve diseases. Moreover, the high cost and the need for continuous monitoring make it difficult to apply these techniques in clinical practice. Cerebral oximetry is a method for measuring oxygen saturation using infrared spectrometry. In addition, reflection pulse oximetry can detect sudden changes in sympathetic tone. For this purpose, a reflectance pulse oximeter for use in neurology was developed. A reflectance oximeter has a definite advantage in that it can be used to measure oxygen saturation in any part of the body. Preliminary results indicate that the device has good resolution and high reliability. The modern circuit schematics applied here improve the device characteristics compared with existing designs.
NASA Astrophysics Data System (ADS)
Thorek, Daniel L. J.; Ulmert, David; Diop, Ndeye-Fatou M.; Lupu, Mihaela E.; Doran, Michael G.; Huang, Ruimin; Abou, Diane S.; Larson, Steven M.; Grimm, Jan
2014-01-01
The invasion status of tumour-draining lymph nodes (LNs) is a critical indicator of cancer stage and is important for treatment planning. Clinicians currently use planar scintigraphy and single-photon emission computed tomography (SPECT) with 99mTc-radiocolloid to guide biopsy and resection of LNs. However, emerging multimodality approaches such as positron emission tomography combined with magnetic resonance imaging (PET/MRI) detect sites of disease with higher sensitivity and accuracy. Here we present a multimodal nanoparticle, 89Zr-ferumoxytol, for the enhanced detection of LNs with PET/MRI. For genuine translational potential, we leverage a clinical iron oxide formulation, altered with minimal modification for radiolabelling. Axillary drainage in naive mice and from healthy and tumour-bearing prostates was investigated. We demonstrate that 89Zr-ferumoxytol can be used for high-resolution tomographic studies of lymphatic drainage in preclinical disease models. This nanoparticle platform has significant translational potential to improve preoperative planning for nodal resection and tumour staging.
Mazon, D; Vezinet, D; Pacella, D; Moreau, D; Gabelieri, L; Romano, A; Malard, P; Mlynar, J; Masset, R; Lotte, P
2012-06-01
This paper focuses on the soft x-ray (SXR) tomography system installed at Tore Supra (DTOMOX) and the recent developments made to automatically extract precise information about plasma features from inverted data. The first part describes the main aspects of the tomographic inversion optimization process. Several observations are made using this new tool, and a set of shape factors is defined to help characterize the emissivity field in a real-time perspective. The second part presents a detailed off-line analysis comparing the position of the magnetic axis obtained from a magnetic equilibrium solver with the position of the maximum of the reconstructed emissivity field for ohmic and heated pulses. A systematic discrepancy of about 5 cm is found in both cases, and it is shown that this discrepancy increases during sawtooth crashes. Finally, evidence of radially localized tungsten accumulation with an in-out asymmetry during a lower hybrid current drive pulse is provided to illustrate the DTOMOX capabilities for precise observation of local phenomena.
Martin, Niall P D; Bishop, Justin D K; Boies, Adam M
2017-03-07
While the UK has committed to reduce CO2 emissions by 80% relative to 1990 levels by 2050, transport accounts for nearly a fourth of all emissions and the degree to which decarbonization can occur is highly uncertain. We present a new methodology using vehicle and powertrain parameters within a Bayesian framework to determine the impact of engineering vehicle improvements on fuel consumption and CO2 emissions. Our results show how design changes in vehicle parameters (e.g., mass, engine size, and compression ratio) result in fuel consumption improvements from a fleet-wide mean of 5.6 L/100 km in 2014 to 3.0 L/100 km by 2030. The change in vehicle efficiency coupled with increases in vehicle numbers and fleet-wide activity results in a total fleet-wide reduction of 41 ± 10% in 2030, relative to 2012. Concerted internal combustion engine improvements result in a 48 ± 10% reduction of CO2 emissions, while efforts to increase the number of diesel vehicles within the fleet have little additional effect. Increasing plug-in and all-electric vehicles reduces CO2 emissions by less (42 ± 10% reduction) than concerted internal combustion engine improvements. However, if the grid decarbonizes, electric vehicles reduce emissions by 45 ± 9%, with further reduction potential to 2050.
NASA Astrophysics Data System (ADS)
Hei, Matthew A.; Budzien, Scott A.; Dymond, Kenneth F.; Nicholas, Andrew C.; Paxton, Larry J.; Schaefer, Robert K.; Groves, Keith M.
2017-07-01
We present the Volume Emission Rate Tomography (VERT) technique for inverting satellite-based, multisensor limb and nadir measurements of atmospheric ultraviolet emission to create whole-orbit reconstructions of atmospheric volume emission rate. The VERT approach is more general than previous ionospheric tomography methods because it can reconstruct the volume emission rate field irrespective of the particular excitation mechanisms (e.g., radiative recombination, photoelectron impact excitation, and energetic particle precipitation in auroras); physical models are then applied to interpret the airglow. The technique was developed and tested using data from the Special Sensor Ultraviolet Limb Imager and Special Sensor Ultraviolet Spectrographic Imager instruments aboard the Defense Meteorological Satellite Program F-18 spacecraft, and is planned for use with upcoming remote sensing missions. The technique incorporates several features to optimize the tomographic solutions, such as the use of a nonnegative algorithm (Richardson-Lucy, RL) that explicitly accounts for the Poisson statistics inherent in optical measurements, the capability to include extinction effects due to resonant scattering and absorption of the photons from the lines of sight, a pseudodiffusion-based regularization scheme implemented between iterations of the RL code to produce smoother solutions, and the capability to estimate error bars on the solutions. Tests using simulated atmospheric emissions verify that the technique performs well in a variety of situations, including daytime, nighttime, and even in the challenging terminator regions. Lastly, we consider ionospheric nightglow and validate reconstructions of the nighttime electron density against Advanced Research Projects Agency (ARPA) Long-Range Tracking and Instrumentation Radar (ALTAIR) incoherent scatter radar data.
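The nonnegative Richardson-Lucy step at the core of the VERT inversion can be sketched for a generic discrete linear emission model. This is an illustrative RL/EM iteration for Poisson-distributed counts, not the VERT code itself: the geometry matrix, problem sizes, and iteration count below are arbitrary assumptions. The multiplicative update keeps the solution nonnegative and, at convergence, matches the total predicted counts to the total observed counts.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy linear system: A maps a nonnegative emission-rate vector to expected
# counts along each ray (a stand-in for line-of-sight geometry; all sizes
# here are illustrative assumptions).
n_rays, n_vox = 40, 10
A = rng.uniform(0.0, 1.0, size=(n_rays, n_vox))
x_true = rng.uniform(1.0, 5.0, size=n_vox)
b = rng.poisson(A @ x_true).astype(float)   # Poisson-noisy measurements

def richardson_lucy(A, b, n_iter=500):
    """Multiplicative RL/EM update; stays nonnegative if started positive."""
    x = np.ones(A.shape[1])
    s = A.sum(axis=0)                   # per-voxel sensitivity, A^T 1
    for _ in range(n_iter):
        pred = np.maximum(A @ x, 1e-12) # guard against divide-by-zero
        x = x * (A.T @ (b / pred)) / s  # RL update
    return x

x_hat = richardson_lucy(A, b)
print(np.round(x_hat, 2))
```

A useful sanity check on any RL implementation is that the summed predicted counts equal the summed observed counts after the first full update, a property that follows directly from the multiplicative form.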
Meijer, Tineke W H; de Geus-Oei, Lioe-Fee; Visser, Eric P; Oyen, Wim J G; Looijen-Salamon, Monika G; Visvikis, Dimitris; Verhagen, Ad F T M; Bussink, Johan; Vriens, Dennis
2017-05-01
Purpose To assess whether dynamic fluorine 18 (18F) fluorodeoxyglucose (FDG) positron emission tomography (PET) has added value over static 18F-FDG PET for tumor delineation in non-small cell lung cancer (NSCLC) radiation therapy planning by using pathology volumes as the reference standard and to compare pharmacokinetic rate constants of 18F-FDG metabolism, including regional variation, between NSCLC histologic subtypes. Materials and Methods The study was approved by the institutional review board. Patients gave written informed consent. In this prospective observational study, 1-hour dynamic 18F-FDG PET/computed tomographic examinations were performed in 35 patients (36 resectable NSCLCs) between 2009 and 2014. Static and parametric images of glucose metabolic rate were obtained to determine lesion volumes by using three delineation strategies. Pathology volume was calculated from three orthogonal dimensions (n = 32). Whole-tumor and regional rate constants and blood volume fraction (VB) were computed by using compartment modeling. Results Pathology volumes were larger than PET volumes (median difference, 8.7-25.2 cm3; Wilcoxon signed rank test, P < .001). Static fuzzy locally adaptive Bayesian (FLAB) volumes corresponded best with pathology volumes (intraclass correlation coefficient, 0.72; P < .001). Bland-Altman analyses showed the highest precision and accuracy for static FLAB volumes. Glucose metabolic rate and 18F-FDG phosphorylation rate were higher in squamous cell carcinoma (SCC) than in adenocarcinoma (AC), whereas VB was lower (Mann-Whitney U test or t test, P = .003, P = .036, and P = .019, respectively). Glucose metabolic rate, 18F-FDG phosphorylation rate, and VB were less heterogeneous in AC than in SCC (Friedman analysis of variance). Conclusion Parametric images are not superior to static images for NSCLC delineation.
FLAB-based segmentation on static 18F-FDG PET images is in best agreement with pathology volume and could be useful for NSCLC autocontouring. Differences in glycolytic rate and VB between SCC and AC are relevant for research in targeting agents and radiation therapy dose escalation. © RSNA, 2016 Online supplemental material is available for this article.
NASA Astrophysics Data System (ADS)
Menendez, H. M.; Thurber, C. H.
2011-12-01
Eastern California's Long Valley Caldera (LVC) and the Mono-Inyo Crater volcanic systems have been active for the past ~3.6 million years. Long Valley is known to produce very large silicic eruptions, the last of which resulted in the formation of a 17 km by 32 km wide, east-west trending caldera. Relatively recent unrest began between 1978 and 1980 with five ML ≥ 5.7 non-double-couple (NDC) earthquakes and associated aftershock swarms. Similar shallow seismic swarms have continued south of the resurgent dome and beneath Mammoth Mountain, surrounding sites of increased CO2 gas emissions. Nearly two decades of increased volcanic activity led to the 1997 installation of a temporary three-component array of 69 seismometers. This network, deployed by Durham University, the USGS, and Duke University, recorded over 4,000 high-frequency events from May to September. A local tomographic inversion of 283 events surrounding Mammoth Mountain yielded a velocity structure with low Vp and Vp/Vs anomalies at 2-3 km bsl beneath the resurgent dome and Casa Diablo hot springs. These anomalies were interpreted to be CO2 reservoirs (Foulger et al., 2003). Several teleseismic and regional tomography studies have also imaged low Vp anomalies beneath the caldera at ~5-15 km depth, interpreted to be the underlying magma reservoir (Dawson et al., 1990; Weiland et al., 1995; Thurber et al., 2009). This study aims to improve the resolution of the LVC regional velocity model by performing tomographic inversions using the local events from 1997 in conjunction with regional events recorded by the Northern California Seismic Network (NCSN) between 1980 and 2010 and available refraction data. Initial tomographic inversions reveal a low-velocity zone at ~2 to 6 km depth beneath the caldera. This structure may simply represent the caldera fill. Further iterations and the incorporation of teleseismic data may better resolve the overall shape and size of the underlying magma reservoir.
Advanced Ionospheric Sensing using GROUP-C and LITES aboard the ISS
NASA Astrophysics Data System (ADS)
Budzien, S. A.; Stephan, A. W.; Chakrabarti, S.; Finn, S. C.; Cook, T.; Powell, S. P.; O'Hanlon, B.; Bishop, R. L.
2015-12-01
The GPS Radio Occultation and Ultraviolet Photometer Co-located (GROUP-C) and Limb-imaging Ionospheric and Thermospheric Extreme-ultraviolet Spectrograph (LITES) experiments are manifested for flight aboard the International Space Station (ISS) in 2016 as part of the Space Test Program Houston #5 payload. The two experiments provide technical development and risk-reduction for future DoD space weather sensors suitable for ionospheric specification, space situational awareness, and data products for global ionosphere assimilative models. In addition, the combined instrument complement of these two experiments offers a unique opportunity to study structures of the nighttime ionosphere. GROUP-C includes an advanced GPS receiver providing ionospheric electron density profiles and scintillation measurements and a high-sensitivity far-ultraviolet photometer measuring horizontal ionospheric gradients. LITES is an imaging spectrograph that spans 60-140 nm and will obtain high-cadence limb profiles of the ionosphere and thermosphere from 150-350 km altitude. In the nighttime ionosphere, recombination of O+ and electrons produces optically thin emissions at 91.1 and 135.6 nm that can be used to tomographically reconstruct the two-dimensional plasma distribution in the orbital plane below ISS altitudes. Ionospheric irregularities, such as plasma bubbles and blobs, are transient features of the low and middle latitude ionosphere with important implications for operational systems. Irregularity structures have been studied primarily using ground-based systems, though some space-based remote and in-situ sensing has been performed. An ionospheric observatory aboard the ISS would provide new capability to study low- and mid-latitude ionospheric structures on a global scale.
By combining for the first time high-sensitivity in-track photometry, vertical ionospheric airglow spectrographic imagery, and recent advancements in UV tomography, high-fidelity tomographic reconstruction of nighttime structures can be performed from the ISS. We discuss the tomographic approach, simulated reconstructions, and value added by including complementary ground-based observations. Acknowledgements: This work is supported by NRL Work Unit 76-1C09-05.
New developments in multimodal clinical multiphoton tomography
NASA Astrophysics Data System (ADS)
König, Karsten
2011-03-01
Eighty years ago, the PhD student Maria Goeppert predicted two-photon effects in her thesis in Goettingen, Germany. It took 30 years to prove her theory, and another three decades to realize the first two-photon microscope. With the beginning of this millennium, the first clinical multiphoton tomographs started operation in research institutions, hospitals, and the cosmetic industry. The multiphoton tomograph MPTflexTM, with its miniaturized flexible scan head, won the 2010 Prism Award in the category Life Sciences. Multiphoton tomographs, with their superior submicron spatial resolution, can be upgraded to 5D imaging tools by adding spectral time-correlated single photon counting units. Furthermore, multimodal hybrid tomographs provide chemical fingerprinting and fast wide-field imaging. The world's first clinical CARS studies were performed with a hybrid multimodal multiphoton tomograph in spring 2010. In particular, nonfluorescent lipids and water, as well as mitochondrial fluorescent NAD(P)H, fluorescent elastin, keratin, and melanin, as well as SHG-active collagen, have been imaged in patients with dermatological disorders. Further multimodal approaches include the combination of multiphoton tomographs with low-resolution imaging tools such as ultrasound, optoacoustic, OCT, and dermoscopy systems. Multiphoton tomographs are currently employed in Australia, Japan, the US, and several European countries for early diagnosis of skin cancer (malignant melanoma), optimization of treatment strategies (wound healing, dermatitis), and cosmetic research including long-term biosafety tests of ZnO sunscreen nanoparticles and the measurement of the stimulated biosynthesis of collagen by anti-ageing products.
Rose, Michael; Rubal, Bernard; Hulten, Edward; Slim, Jennifer N; Steel, Kevin; Furgerson, James L; Villines, Todd C
2014-01-01
Background: The correlation between normal cardiac chamber linear dimensions measured during retrospective coronary computed tomographic angiography as compared to transthoracic echocardiography using the American Society of Echocardiography guidelines is not well established. Methods: We performed a review from January 2005 to July 2011 to identify subjects with retrospective electrocardiogram-gated coronary computed tomographic angiography scans for chest pain and transthoracic echocardiography with normal cardiac structures performed within 90 days. Dimensions were manually calculated in both imaging modalities in accordance with the American Society of Echocardiography published guidelines. Left ventricular ejection fraction was calculated on echocardiography manually using the Simpson’s formula and by coronary computed tomographic angiography using the end-systolic and end-diastolic volumes. Results: We reviewed 532 studies, rejected 412 and had 120 cases for review with a median time between studies of 7 days (interquartile range (IQR25,75) = 0–22 days) with no correlation between the measurements made by coronary computed tomographic angiography and transthoracic echocardiography using Bland–Altman analysis. We generated coronary computed tomographic angiography cardiac dimension reference ranges for both genders for our population. Conclusion: Our findings represent a step towards generating cardiac chamber dimensions’ reference ranges for coronary computed tomographic angiography as compared to transthoracic echocardiography in patients with normal cardiac morphology and function using the American Society of Echocardiography guideline measurements that are commonly used by cardiologists. PMID:26770706
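The Bland-Altman analysis used above to compare coronary CT angiography and echocardiographic dimensions reduces to a bias (mean paired difference) and 95% limits of agreement. A minimal sketch follows; the paired measurements are synthetic illustrative values, not data from the study.

```python
import numpy as np

# Illustrative paired measurements (mm) of one cardiac dimension by two
# modalities; these values are synthetic, not from the study.
ccta = np.array([48.0, 51.5, 45.2, 53.1, 49.8, 47.4, 50.2, 52.6])
tte  = np.array([47.1, 52.3, 44.0, 54.0, 48.5, 48.2, 49.0, 53.5])

def bland_altman(a, b):
    """Return bias (mean difference) and 95% limits of agreement."""
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)          # sample standard deviation of differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

bias, (lo, hi) = bland_altman(ccta, tte)
print(round(bias, 2), round(lo, 2), round(hi, 2))
```

"No correlation" in the abstract corresponds to wide limits of agreement relative to the clinically acceptable difference, which is why the authors generated modality-specific reference ranges instead of converting between modalities.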
Studies of asteroids, comets, and Jupiter's outer satellites
NASA Technical Reports Server (NTRS)
Bowell, Edward
1991-01-01
Observational, theoretical, and computational research was performed, mainly on asteroids. Two principal areas of research, centering on astrometry and photometry, are interrelated in their aim to study the overall structure of the asteroid belt and the physical and orbital properties of individual asteroids. Two highlights are: detection of CN emission from Chiron; and realization that 1990 MB is the first known Trojan type asteroid of a planet other than Jupiter. A new method of asteroid orbital error analysis, based on Bayesian theory, was developed.
NASA Astrophysics Data System (ADS)
Lester, R.; Zhai, Y.; Corr, C.; Howard, J.
2016-02-01
This paper describes a coherence imaging system designed for spectroscopic Doppler measurements of ion light in a low-temperature (Te < 10 eV) helicon-produced argon plasma. Observation of the very small Doppler broadening of the Ar II 488 nm emission line requires very high spectral resolution, or equivalently, very large interferometric optical path delay (comparable with the coherence length of the emission line). For these polarization interferometers, this can only be achieved using large thicknesses (100 mm) of birefringent crystal. This poses special design challenges including the application of field-widening techniques and the development of passive thermal stabilization of the optical phase offset. We discuss the measurement principles and the optical design of these systems and present measurements of the line-integrated emissivity, and ion flow and ion temperatures along with tomographic reconstructions of the local values, for a cylindrical low temperature helicon discharge in a linear magnetized device with downstream magnetic mirror. Key results reveal a hollow edge-peaked temperature profile (central temperature ∼0.1 eV) and sheared rigid-body rotational flows and axial flows which are comparable with the ion thermal speed. The emission line brightness, ion temperature and azimuthal ion flows are all found to increase with increased mirror magnetic field strength.
Dust in a compact, cold, high-velocity cloud: A new approach to removing foreground emission
NASA Astrophysics Data System (ADS)
Lenz, D.; Flöer, L.; Kerp, J.
2016-02-01
Context. Because isolated high-velocity clouds (HVCs) are found at great distances from the Galactic radiation field and because they have subsolar metallicities, there have been no detections of dust in these structures. A key problem in this search is the removal of foreground dust emission. Aims: Using the Effelsberg-Bonn H I Survey and the Planck far-infrared data, we investigate a bright, cold, and clumpy HVC. This cloud apparently undergoes an interaction with the ambient medium and thus has great potential to form dust. Methods: To remove the local foreground dust emission we used a regularised, generalised linear model and we show the advantages of this approach with respect to other methods. To estimate the dust emissivity of the HVC, we set up a simple Bayesian model with mildly informative priors to perform the line fit instead of an ordinary linear least-squares approach. Results: We find that the foreground can be modelled accurately and robustly with our approach and is limited mostly by the cosmic infrared background. Despite this improvement, we did not detect any significant dust emission from this promising HVC. The 3σ-equivalent upper limit to the dust emissivity is an order of magnitude below the typical values for the Galactic interstellar medium.
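The foreground-removal idea above, fitting a regularised linear model of far-infrared emission on the local-velocity H I column and then testing the residual against the HVC column, can be sketched on synthetic pixels. The coefficients, noise level, and ridge penalty below are illustrative assumptions, not values from the paper; no HVC dust is injected into the synthetic data, so the residual slope should be consistent with zero, mirroring the paper's non-detection.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic sky pixels: FIR emission = a * local HI + offset + noise.
# All coefficients are illustrative assumptions.
n_pix = 500
hi_local = rng.uniform(1.0, 5.0, n_pix)   # local-velocity HI column (arb. units)
hi_hvc = rng.uniform(0.0, 2.0, n_pix)     # HVC HI column (arb. units)
fir = 0.8 * hi_local + 0.3 + rng.normal(0.0, 0.05, n_pix)  # no HVC dust injected

# Regularised linear model (identity link, ridge penalty):
# solve (X^T X + lam I) beta = X^T y for the foreground coefficients.
X = np.column_stack([hi_local, np.ones(n_pix)])
lam = 1e-3
beta = np.linalg.solve(X.T @ X + lam * np.eye(2), X.T @ fir)
residual = fir - X @ beta                  # foreground-subtracted map

# Regressing the residual on the HVC column estimates the HVC dust
# emissivity; here it should be consistent with zero.
slope = np.polyfit(hi_hvc, residual, 1)[0]
print(np.round(beta, 3), round(slope, 4))
```

In the paper the residual noise floor is set by the cosmic infrared background rather than the fit itself, which is what drives the quoted 3σ upper limit on the HVC emissivity.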
Templin, Christian; Zweigerdt, Robert; Schwanke, Kristin; Olmer, Ruth; Ghadri, Jelena-Rima; Emmert, Maximilian Y; Müller, Ennio; Küest, Silke M; Cohrs, Susan; Schibli, Roger; Kronen, Peter; Hilbe, Monika; Reinisch, Andreas; Strunk, Dirk; Haverich, Axel; Hoerstrup, Simon; Lüscher, Thomas F; Kaufmann, Philipp A; Landmesser, Ulf; Martin, Ulrich
2012-07-24
Evaluation of novel cellular therapies in large-animal models and patients is currently hampered by the lack of imaging approaches that allow for long-term monitoring of viable transplanted cells. In this study, sodium iodide symporter (NIS) transgene imaging was evaluated as an approach to follow in vivo survival, engraftment, and distribution of human-induced pluripotent stem cell (hiPSC) derivatives in a pig model of myocardial infarction. Transgenic hiPSC lines stably expressing a fluorescent reporter and NIS (NIS(pos)-hiPSCs) were established. Iodide uptake, efflux, and viability of NIS(pos)-hiPSCs were assessed in vitro. Ten (±2) days after induction of myocardial infarction by transient occlusion of the left anterior descending artery, catheter-based intramyocardial injection of NIS(pos)-hiPSCs guided by 3-dimensional NOGA mapping was performed. Dual-isotope single photon emission computed tomographic/computed tomographic imaging was applied with the use of (123)I to follow donor cell survival and distribution and with the use of (99m)TC-tetrofosmin for perfusion imaging. In vitro, iodide uptake in NIS(pos)-hiPSCs was increased 100-fold above that of nontransgenic controls. In vivo, viable NIS(pos)-hiPSCs could be visualized for up to 15 weeks. Immunohistochemistry demonstrated that hiPSC-derived endothelial cells contributed to vascularization. Up to 12 to 15 weeks after transplantation, no teratomas were detected. This study describes for the first time the feasibility of repeated long-term in vivo imaging of viability and tissue distribution of cellular grafts in large animals. Moreover, this is the first report demonstrating vascular differentiation and long-term engraftment of hiPSCs in a large-animal model of myocardial infarction. NIS(pos)-hiPSCs represent a valuable tool to monitor and improve current cellular treatment strategies in clinically relevant animal models.
Wiley, Clayton A.; Lopresti, Brian J.; Venneti, Sriram; Price, Julie; Klunk, William E.; DeKosky, Steven T.; Mathis, Chester A.
2009-01-01
Background Alzheimer disease (AD) is defined neuropathologically by the presence of neurofibrillary tangles and plaques associated with tau and β-amyloid protein deposition. The colocalization of microglia and β-amyloid plaques has been widely reported in pathological examination of AD and suggests that neuroinflammation may play a role in pathogenesis and/or progression. Because postmortem histopathological analyses are limited to single end-stage assessment, the time course and nature of this relationship are not well understood. Objective To image microglial activation and β-amyloid deposition in the brains of subjects with and without AD. Design, Setting, and Participants Using two carbon 11 ([11C])–labeled positron emission tomographic imaging agents, Pittsburgh Compound B (PiB) and (R)-PK11195, we examined the relationship between amyloid deposition and microglial activation in different stages of AD using 5 control subjects, 6 subjects diagnosed with mild cognitive impairment, and 6 patients with mild to moderate AD. Results Consistent with prior reports, subjects with a clinical diagnosis of probable AD showed significantly greater levels of [11C]PiB retention than control subjects, whereas patients with mild cognitive impairment spanned a range from control-like to AD-like levels of [11C]PiB retention. Additionally, 2 asymptomatic control subjects also exhibited evidence of elevated PiB retention in regions associated with the early emergence of plaques in AD and may represent prodromal cases of AD. We observed no differences in brain [11C](R)-PK11195 retention when subjects were grouped by clinical diagnosis or the presence or absence of β-amyloid pathological findings as indicated by analyses of [11C]PiB retention. Conclusions These findings suggest that either microglial activation is limited to later stages of severe AD or [11C](R)-PK11195 is too insensitive to detect the level of microglial activation associated with mild to moderate AD. PMID:19139300
NASA Astrophysics Data System (ADS)
Akolkar, A.; Petrasch, J.; Finck, S.; Rahmatian, N.
2018-02-01
An inverse analysis of the phosphor layer of a commercially available, conformally coated, white LED is done based on tomographic and spectrometric measurements. The aim is to determine the radiative transfer coefficients of the phosphor layer from the measurements of the finished device, with minimal assumptions regarding the composition of the phosphor layer. These results can be used for subsequent opto-thermal modelling and optimization of the device. For this purpose, multiple integrating sphere and gonioradiometric measurements are done to obtain statistical bounds on spectral radiometric values and angular color distributions for ten LEDs belonging to the same color bin of the product series. Tomographic measurements of the LED package are used to generate a tetrahedral grid of the 3D LED geometry. A radiative transfer model using Monte Carlo Ray Tracing in the tetrahedral grid is developed. Using a two-wavelength model consisting of a blue emission wavelength and a yellow, Stokes-shifted re-emission wavelength, the angular color distribution of the LED is simulated over wide ranges of the absorption and scattering coefficients of the phosphor layer, for the blue and yellow wavelengths. Using a two-step, iterative space search, combinations of the radiative transfer coefficients are obtained for which the simulations are consistent with the integrating sphere and gonioradiometric measurements. The results show an inverse relationship between the scattering and absorption coefficients of the phosphor layer for blue light. Scattering of yellow light acts as a distribution and loss mechanism for yellow light and affects the shape of the angular color distribution significantly, especially at larger viewing angles. The spread of feasible coefficients indicates that measured optical behavior of the LEDs may be reproduced using a range of combinations of radiative coefficients. 
Given that coefficients predicted by the Mie theory usually must be corrected in order to reproduce experimental results, these results indicate that a more complete model of radiative transfer in phosphor layers is required.
NASA Astrophysics Data System (ADS)
Fourmaux, Sylvain; Kieffer, Jean-Claude; Krol, Andrzej
2017-03-01
We are developing an ultrahigh-spatial-resolution (FWHM < 2 μm), high-brilliance x-ray source for rapid in vivo tomographic microvasculature imaging, i.e. micro-CT angiography (μCTA), in small animal models using optimized contrast agent. It exploits the Laser Wakefield Accelerator (LWFA) betatron x-ray emission phenomenon. An ultrashort high-intensity laser pulse interacting with a supersonic gas jet produces an ion cavity ("bubble") in the plasma in the wake of the laser pulse. Electrons that are injected into this bubble gain energy, perform wiggler-like oscillations, and generate bursts of incoherent x-rays with characteristic duration comparable to the laser pulse duration, a continuous synchrotron-like spectral distribution that may extend to hundreds of keV, very high brilliance, a very small focal spot, and highly directional emission in cone-beam geometry. The LWFA betatron x-ray source created in our lab produced 10²¹-10²³ photons·shot⁻¹·mrad⁻²·mm⁻²/0.1%bw with mean critical energy in the 12-30 keV range. The x-ray source size for a single laser shot was FWHM = 1.7 μm, the x-ray beam divergence was 20-30 mrad, and the effective focal spot size for multiple shots was FWHM = 2 μm. Projection images of simple phantoms and complex biological objects including insects and mice were obtained in single laser shots. We conclude that ultrahigh-spatial-resolution μCTA (FWHM 2 μm) requiring thousands of projection images could be accomplished using LWFA betatron x-ray radiation in approximately 40 s with our existing 220 TW laser, and in under a second with the next generation of ultrafast lasers and x-ray detectors, as opposed to the several hours required using conventional microfocal x-ray tubes. Thus, subsecond ultrahigh-resolution in vivo microtomographic microvasculature imaging (in both absorption and phase contrast mode) in small animal models of cancer and vascular diseases will be feasible with the LWFA betatron x-ray source.
NASA Astrophysics Data System (ADS)
Thampi, Smitha V.; Yamamoto, Mamoru
2010-03-01
A chain of newly designed GNU (GNU is not UNIX) Radio Beacon Receivers (GRBR) has recently been established over Japan, primarily for tomographic imaging of the ionosphere over this region. Receivers installed at Shionomisaki (33.45°N, 135.8°E), Shigaraki (34.8°N, 136.1°E), and Fukui (36°N, 136°E) continuously track low earth orbiting satellites (LEOS), mainly OSCAR, Cosmos, and FORMOSAT-3/COSMIC, to obtain simultaneous total electron content (TEC) data from these three locations, which are then used for the tomographic reconstruction of ionospheric electron densities. This is the first GRBR network established for TEC observations, and the first beacon-based tomographic imaging in Japanese longitudes. The first tomographic images revealed the temporal evolution of all the major features of the ionospheric electron density distribution over Japan. A comparison of the tomographically reconstructed electron densities with the foF2 data from Kokubunji (35°N, 139°E) revealed good agreement between the datasets. These first results show the potential of GRBR and its network for making continuous, unattended ionospheric TEC measurements and for tomographic imaging of the ionosphere.
NASA Astrophysics Data System (ADS)
Boxx, I.; Carter, C. D.; Meier, W.
2014-08-01
Tomographic particle image velocimetry (tomographic-PIV) is a recently developed measurement technique used to acquire volumetric velocity field data in liquid and gaseous flows. The technique relies on line-of-sight reconstruction of the rays between a 3D particle distribution and a multi-camera imaging system. In a turbulent flame, however, index-of-refraction variations resulting from local heat-release may inhibit reconstruction and thereby render the technique infeasible. The objective of this study was to test the efficacy of tomographic-PIV in a turbulent flame. An additional goal was to determine the feasibility of acquiring usable tomographic-PIV measurements in a turbulent flame at multi-kHz acquisition rates with current-generation laser and camera technology. To this end, a setup consisting of four complementary metal oxide semiconductor cameras and a dual-cavity Nd:YAG laser was implemented to test the technique in a lifted turbulent jet flame. While the cameras were capable of kHz-rate image acquisition, the laser operated at a pulse repetition rate of only 10 Hz. However, use of this laser allowed exploration of the required pulse energy and thus power for a kHz-rate system. The imaged region was 29 × 28 × 2.7 mm in size. The tomographic reconstruction of the 3D particle distributions was accomplished using the multiplicative algebraic reconstruction technique. The results indicate that volumetric velocimetry via tomographic-PIV is feasible with pulse energies of 25 mJ, which is within the capability of current-generation kHz-rate diode-pumped solid-state lasers.
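The multiplicative algebraic reconstruction technique (MART) used in the study above can be sketched in a few lines. This is a generic, illustrative implementation under assumed conventions (a dense ray-weight matrix `A`, a relaxation factor, and per-ray normalized exponents), not the authors' actual reconstruction code:

```python
import numpy as np

def mart(A, p, n_iter=200, relax=1.0, eps=1e-12):
    """Multiplicative algebraic reconstruction technique (MART).

    A : (n_rays, n_voxels) weight matrix (ray/voxel intersection weights)
    p : (n_rays,) measured line-of-sight projections
    Returns a non-negative voxel intensity estimate.
    """
    x = np.ones(A.shape[1])  # multiplicative updates need a positive start
    for _ in range(n_iter):
        for i in range(A.shape[0]):  # update one ray at a time
            proj = A[i] @ x
            if proj > eps:
                # scale each voxel by the measured/computed projection ratio,
                # damped by the relaxation factor and the normalized weight
                x *= (p[i] / proj) ** (relax * A[i] / (A[i].max() + eps))
    return x
```

For consistent data the iteration converges to a positive solution whose reprojections match the measurements, which is why MART is popular for the sparse-view, non-negative intensity fields of tomographic PIV.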
Large-scale tomographic particle image velocimetry using helium-filled soap bubbles
NASA Astrophysics Data System (ADS)
Kühn, Matthias; Ehrenfried, Klaus; Bosbach, Johannes; Wagner, Claus
2011-04-01
To measure large-scale flow structures in air, a tomographic particle image velocimetry (tomographic PIV) system for measurement volumes of the order of one cubic metre is developed, which employs helium-filled soap bubbles (HFSBs) as tracer particles. The technique has several specific characteristics compared to most conventional tomographic PIV systems, which are usually applied to small measurement volumes. One of them is the spot lights on the HFSB tracers, which slightly change their position when the direction of observation is altered. Further issues are the large particle-to-voxel ratio and the short focal length of the camera lenses used, which result in a noticeable variation of the magnification factor in the volume depth direction. Taking the specific characteristics of the HFSBs into account, the feasibility of our large-scale tomographic PIV system is demonstrated by showing that the calibration errors can be reduced to 0.1 pixels, as required. Further, an accurate and fast implementation of the multiplicative algebraic reconstruction technique, which calculates the weighting coefficients when needed instead of storing them, is discussed. The tomographic PIV system is applied to measure forced convection in a convection cell at a Reynolds number of 530, based on the inlet channel height and the mean inlet velocity. The sizes of the measurement volume and the interrogation volumes amount to 750 mm × 450 mm × 165 mm and 48 mm × 48 mm × 24 mm, respectively. Validation of the tomographic PIV technique employing HFSBs is further provided by comparing profiles of the mean velocity and of the root mean square velocity fluctuations to respective planar PIV data.
Vanderperren, K; Bergman, H J; Spoormakers, T J P; Pille, F; Duchateau, L; Puchalski, S M; Saunders, J H
2014-07-01
Lysis of the axial aspect of equine proximal sesamoid bones (PSBs) is a rare condition reported to have septic or traumatic origins, and limited information exists regarding imaging of nonseptic axial osteitis of a PSB. The objective of this retrospective clinical study was to report the clinical, radiographic, ultrasonographic, computed tomographic and intra-arterial contrast-enhanced computed tomographic abnormalities in horses with axial nonseptic osteitis of a PSB. Eighteen horses diagnosed with nonseptic osteitis of the axial border of a PSB between 2007 and 2012 were reviewed retrospectively. Case details and clinical, radiographic, ultrasonographic, computed tomographic and intra-arterial/intra-articular contrast-enhanced computed tomographic features were recorded, when available. Radiographic, ultrasonographic and computed tomographic evaluations of the fetlock region had been performed on 18, 15 and 9 horses, respectively. The effect of the degree of lysis on the grade and duration of lameness was determined. All horses had chronic unilateral lameness, 4 with forelimb and 14 with hindlimb signs. On radiographs, lysis was identified in both PSBs in 14 horses and in one PSB in 3 horses; in one horse no lysis was identified. The degree of osteolysis was variable. Ultrasonography identified variably sized irregularities of the bone surface and alterations in the echogenicity of the palmar/plantar ligament (PL). All horses undergoing computed tomographic examination (n = 9) had biaxial lysis. The lesions were significantly longer and deeper on computed tomographic images than on radiographic images. Intra-arterial contrast-enhanced computed tomography may reveal moderate to marked contrast enhancement of the PL. There was no significant effect of the degree of lysis on the grade or duration of lameness. Lesions of nonseptic axial osteitis of a PSB can be identified using a combination of radiography and ultrasonography.
Computed tomography provides additional information regarding the extent of the pathology.
Progress In The Development Of A Tomographic SPECT System For Online Dosimetry In BNCT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Minsky, D. M.; Kreiner, A. J.; ECyT, UNSAM
2010-08-04
In boron neutron capture therapy (BNCT) the delivered dose to the patient depends both on the neutron beam characteristics and on the ¹⁰B body distribution which, in turn, is governed by the tumor specificity of the ¹⁰B drug-carrier. BNCT dosimetry is a complex matter due to the several interactions that neutrons can undergo with the different nuclei present in tissue. However, the boron capture reaction ¹⁰B(n,α)⁷Li accounts for about 80% of the total dose in a tumor with a 40 ppm ¹⁰B concentration. Present dosimetric methods are indirect, based on drug biodistribution statistical data, and subject to inter- and intra-patient variability. In order to overcome the consequences of the concomitant high dosimetric uncertainties, we propose a SPECT (single photon emission computed tomography) approach based on the detection of the prompt gamma ray (478 keV) emitted in 94% of the cases from ⁷Li. For this purpose we designed, built and tested a prototype based on LaBr₃(Ce) scintillators. Measurements on a head and tumor phantom were performed at the accelerator-based BNCT facility of the University of Birmingham (UK). They resulted in the first tomographic image of the ¹⁰B capture distribution obtained in a BNCT facility.
Tomographic digital subtraction angiography for lung perfusion estimation in rodents.
Badea, Cristian T; Hedlund, Laurence W; De Lin, Ming; Mackel, Julie S Boslego; Samei, Ehsan; Johnson, G Allan
2007-05-01
In vivo measurements of perfusion present a challenge to existing small animal imaging techniques such as magnetic resonance microscopy, micro-computed tomography, micro-positron emission tomography, and micro-SPECT, due to combined requirements for high spatial and temporal resolution. We demonstrate the use of tomographic digital subtraction angiography (TDSA) for estimation of perfusion in small animals. TDSA augments conventional digital subtraction angiography (DSA) by providing three-dimensional spatial information using tomosynthesis algorithms. TDSA is based on the novel paradigm that the same time density curves can be reproduced in a number of consecutive injections of μL volumes of contrast at a series of different angles of rotation. The capabilities of TDSA are established in studies on lung perfusion in rats. Using an imaging system developed in-house, we acquired data for four-dimensional (4D) imaging with temporal resolution of 140 ms, in-plane spatial resolution of 100 μm, and slice thickness on the order of millimeters. Based on a structured experimental approach, we optimized TDSA imaging providing a good trade-off between slice thickness, the number of injections, contrast to noise, and immunity to artifacts. Both DSA and TDSA images were used to create parametric maps of perfusion. TDSA imaging has potential application in a number of areas where functional perfusion measurements in 4D can provide valuable insight into animal models of disease and response to therapeutics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
PELT, DANIEL
2017-04-21
Small Python package to compute tomographic reconstructions using a reconstruction method published in: Pelt, D.M., & De Andrade, V. (2017). Improved tomographic reconstruction of large-scale real-world data by filter optimization. Advanced Structural and Chemical Imaging 2: 17; and Pelt, D. M., & Batenburg, K. J. (2015). Accurately approximating algebraic tomographic reconstruction by filtered backprojection. In Proceedings of The 13th International Meeting on Fully Three-Dimensional Image Reconstruction in Radiology and Nuclear Medicine (pp. 158-161).
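The filtered-backprojection structure that the cited Pelt et al. methods build on can be illustrated with a minimal parallel-beam sketch. The geometry, nearest-neighbour interpolation, and analytic ramp filter below are illustrative assumptions only; the cited papers' contribution is to replace the ramp with a filter optimized to mimic algebraic reconstruction:

```python
import numpy as np

def ramp_filter(sino):
    """Apply the standard ramp filter to each row (view) of a sinogram."""
    n = sino.shape[1]
    freqs = np.abs(np.fft.fftfreq(n))  # |f| ramp; an optimized filter would go here
    return np.real(np.fft.ifft(np.fft.fft(sino, axis=1) * freqs, axis=1))

def backproject(sino, angles, size):
    """Unfiltered backprojection onto a size x size grid (nearest-neighbour)."""
    recon = np.zeros((size, size))
    c = (size - 1) / 2.0
    ys, xs = np.mgrid[0:size, 0:size]
    for row, theta in zip(sino, angles):
        # detector coordinate of each pixel for this view angle
        t = (xs - c) * np.cos(theta) + (ys - c) * np.sin(theta) + c
        idx = np.clip(np.round(t).astype(int), 0, sino.shape[1] - 1)
        recon += row[idx]
    return recon * np.pi / (2 * len(angles))

# FBP = backprojection of the filtered sinogram:
#   recon = backproject(ramp_filter(sino), angles, size)
```

Replacing `freqs` with a data- or method-derived filter keeps FBP's speed while approximating the behavior of slower algebraic methods, which is the idea behind the package.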
Simulation results of a veto counter for the ClearPEM
NASA Astrophysics Data System (ADS)
Trummer, J.; Auffray, E.; Lecoq, P.
2009-04-01
The Crystal Clear Collaboration (CCC) has built a prototype of a novel positron emission tomograph dedicated to functional breast imaging, the ClearPEM. The ClearPEM uses the common radiopharmaceutical FDG for imaging cancer. As FDG is a rather non-specific radiotracer, it accumulates not only in cancer cells but in all cells with high energy consumption, such as those of the heart and liver. This poses a problem especially in breast imaging, where the proximity of the heart and other organs to the breast leads to a high background noise level in the scanner. In this work, a veto counter to reduce this background is described. Different configurations and their effectiveness were studied using the GATE simulation package.
NASA Astrophysics Data System (ADS)
Huh, C.; Bolch, W. E.
2003-10-01
Two classes of anatomic models currently exist for use in both radiation protection and radiation dose reconstruction: stylized mathematical models and tomographic voxel models. The former utilize 3D surface equations to represent internal organ structure and external body shape, while the latter are based on segmented CT or MR images of a single individual. While tomographic models are clearly more anthropomorphic than stylized models, a given model's characterization as being anthropometric is dependent upon the reference human to which the model is compared. In the present study, data on total body mass, standing/sitting heights and body mass index are collected and reviewed for the US population covering the time interval from 1971 to 2000. These same anthropometric parameters are then assembled for the ORNL series of stylized models, the GSF series of tomographic models (Golem, Helga, Donna, etc), the adult male Zubal tomographic model and the UF newborn tomographic model. The stylized ORNL models of the adult male and female are found to be fairly representative of present-day average US males and females, respectively, in terms of both standing and sitting heights for ages between 20 and 60-80 years. While the ORNL adult male model provides a reasonably close match to the total body mass of the average US 21-year-old male (within ~5%), present-day 40-year-old males have an average total body mass that is ~16% higher. For radiation protection purposes, the use of the larger 73.7 kg adult ORNL stylized hermaphrodite model provides a much closer representation of average present-day US females at ages ranging from 20 to 70 years. In terms of the adult tomographic models from the GSF series, only Donna (40-year-old F) closely matches her age-matched US counterpart in terms of average body mass. Regarding standing heights, the better matches to US age-correlated averages belong to Irene (32-year-old F) for the females and Golem (38-year-old M) for the males. 
Both Helga (27-year-old F) and Donna, however, provide good matches to average US sitting heights for adult females, while Golem and Otoko (male of unknown age) yield sitting heights that are slightly below US adult male averages. Finally, Helga is seen as the only GSF tomographic female model that yields a body mass index in line with her average US female counterpart at age 26. In terms of dose reconstruction activities, however, all current tomographic voxel models are valuable assets in attempting to cover the broad distribution of individual anthropometric parameters representative of the current US population. It is highly recommended that similar attempts to create a broad library of tomographic models be initiated in the United States and elsewhere to complement and extend the limited number of tomographic models presently available for these efforts.
Extreme ultraviolet and Soft X-ray diagnostic upgrade on the HBT-EP tokamak: Progress and Results
NASA Astrophysics Data System (ADS)
Desanto, S.; Levesque, J. P.; Battey, A.; Brooks, J. W.; Mauel, M. E.; Navratil, G. A.; Hansen, C. J.
2017-10-01
In order to understand internal MHD mode structure in a tokamak plasma, it is helpful to understand temperature and density fluctuations within that plasma. In the HBT-EP tokamak, the plasma emits bremsstrahlung radiation in the extreme ultraviolet (EUV) and soft x-ray (SXR) regimes, and the emitted power is primarily related to electron density and temperature. This radiation is detected by photodiode arrays located at several different angular positions near the plasma's edge, each array making several views through a poloidal slice of plasma. From these measurements a 2D emissivity profile of that slice can be reconstructed with tomographic algorithms. This profile cannot directly tell us whether the emissivity is due to electron density, temperature, line emission, or charge recombination; however, when combined with information from other diagnostics, it can provide strong evidence of the type of internal mode or modes depending on the temporal-spatial context. We present ongoing progress and results on the installation of a new system that will eventually consist of four arrays of 16 views each and a separate two-color, 16-chord tangential system, which will provide an improved understanding of the internal structure of HBT-EP plasmas. Supported by U.S. DOE Grant DE-FG02-86ER5322.
Anthropomorphic thorax phantom for cardio-respiratory motion simulation in tomographic imaging
NASA Astrophysics Data System (ADS)
Bolwin, Konstantin; Czekalla, Björn; Frohwein, Lynn J.; Büther, Florian; Schäfers, Klaus P.
2018-02-01
Patient motion during medical imaging using techniques such as computed tomography (CT), magnetic resonance imaging (MRI), positron emission tomography (PET), or single emission computed tomography (SPECT) is well known to degrade images, leading to blurring effects or severe artifacts. Motion correction methods try to overcome these degrading effects. However, they need to be validated under realistic conditions. In this work, a sophisticated anthropomorphic thorax phantom is presented that combines several aspects of a simulator for cardio-respiratory motion. The phantom allows us to simulate various types of cardio-respiratory motions inside a human-like thorax, including features such as inflatable lungs, beating left ventricular myocardium, respiration-induced motion of the left ventricle, moving lung lesions, and moving coronary artery plaques. The phantom is constructed to be MR-compatible. This means that we can not only perform studies in PET, SPECT and CT, but also inside an MRI system. The technical features of the anthropomorphic thorax phantom Wilhelm are presented with regard to simulating motion effects in hybrid emission tomography and radiotherapy. This is supplemented by a study on the detectability of small coronary plaque lesions in PET/CT under the influence of cardio-respiratory motion, and a study on the accuracy of left ventricular blood volumes.
A comparison of newborn stylized and tomographic models for dose assessment in paediatric radiology
NASA Astrophysics Data System (ADS)
Staton, R. J.; Pazik, F. D.; Nipper, J. C.; Williams, J. L.; Bolch, W. E.
2003-04-01
Establishment of organ doses from diagnostic and interventional examinations is a key component to quantifying the radiation risks from medical exposures and for formulating corresponding dose-reduction strategies. Radiation transport models of human anatomy provide a convenient method for simulating radiological examinations. At present, two classes of models exist: stylized mathematical models and tomographic voxel models. In the present study, organ dose comparisons are made for projection radiographs of both a stylized and a tomographic model of the newborn patient. Sixteen separate radiographs were simulated for each model at x-ray technique factors typical of newborn examinations: chest, abdomen, thorax and head views in the AP, PA, left LAT and right LAT projection orientations. For AP and PA radiographs of the torso (chest, abdomen and thorax views), the effective dose assessed for the tomographic model exceeds that for the stylized model, with per cent differences ranging from 19% (AP abdominal view) to 43% (AP chest view). In contrast, the effective dose for the stylized model exceeds that for the tomographic model for all eight lateral views including those of the head, with per cent differences ranging from 9% (LLAT chest view) to 51% (RLAT thorax view). While organ positioning differences do exist between the models, a major factor contributing to differences in effective dose is the models' exterior trunk shape. In the tomographic model, a more elliptical shape is seen, providing less tissue shielding for internal organs in the AP and PA directions, with correspondingly increased tissue shielding in the lateral directions. This observation is opposite to that seen in comparisons of stylized and tomographic models of the adult.
2017-07-31
Report: High-Energy, High-Pulse-Rate Light Sources for Enhanced Time-Resolved Tomographic PIV of Unsteady & Turbulent Flows
Image processing pipeline for synchrotron-radiation-based tomographic microscopy.
Hintermüller, C; Marone, F; Isenegger, A; Stampanoni, M
2010-07-01
With synchrotron-radiation-based tomographic microscopy, three-dimensional structures down to the micrometer level can be visualized. Tomographic data sets typically consist of 1000 to 1500 projections of 1024 × 1024 to 2048 × 2048 pixels and are acquired in 5-15 min. A processing pipeline has been developed to handle this large amount of data efficiently and to reconstruct the tomographic volume within a few minutes after the end of a scan. Just a few seconds after the raw data have been acquired, a selection of reconstructed slices is accessible through a web interface for preview and for fine-tuning the reconstruction parameters. The same interface allows initiation and control of the reconstruction process on the computer cluster. By integrating all programs and tools required for tomographic reconstruction into the pipeline, the necessary user interaction is reduced to a minimum. The modularity of the pipeline allows functionality to be added for new scan protocols, such as an extended field of view, or for new physical signals such as phase-contrast or dark-field imaging.
Tomographic imaging of OH laser-induced fluorescence in laminar and turbulent jet flames
NASA Astrophysics Data System (ADS)
Li, Tao; Pareja, Jhon; Fuest, Frederik; Schütte, Manuel; Zhou, Yihui; Dreizler, Andreas; Böhm, Benjamin
2018-01-01
In this paper, a new approach for 3D flame structure diagnostics using tomographic laser-induced fluorescence (Tomo-LIF) of the OH radical was evaluated. The approach combined volumetric illumination with a multi-camera detection system of eight views. Single-shot measurements were performed in a methane/air premixed laminar flame and in a non-premixed turbulent methane jet flame. 3D OH fluorescence distributions in the flames were reconstructed using the simultaneous multiplicative algebraic reconstruction technique. The tomographic measurements were compared and validated against results of OH-PLIF in the laminar flame. The effects of the experimental setup of the detection system and of the size of the volumetric illumination on the quality of the tomographic reconstructions were evaluated. Results revealed that Tomo-LIF is suitable for volumetric reconstruction of flame structures with acceptable spatial resolution and uncertainty. It was found that the number of views and their angular orientation have a strong influence on the quality and accuracy of the tomographic reconstruction, while the illumination volume thickness influences mainly the spatial resolution.
Optimal joule heating of the subsurface
Berryman, James G.; Daily, William D.
1994-01-01
A method for simultaneously heating the subsurface and imaging the effects of the heating. The method combines tomographic imaging (electrical resistance tomography, or ERT) of the underground electrical resistivity distribution with joule heating by electrical currents injected into the ground. A potential distribution is established on a series of buried electrodes, resulting in energy deposition underground that is a function of the resistivity and the injected current density. Measurement of the voltages and currents also permits a tomographic reconstruction of the resistivity distribution. Using this tomographic information, the current injection pattern on the driving electrodes can be adjusted to change the current density distribution and thus optimize the heating. As the heating changes subsurface conditions, the applied current pattern can be repeatedly adjusted (based on updated resistivity tomographs) to effect real-time control of the heating.
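The feedback loop described above amounts to ranking candidate injection patterns by the joule power q = ρ|J|² they deposit inside a target region, using current-density fields predicted from the latest ERT image. A hypothetical sketch (the per-cell arrays, pattern representation, and ROI mask are illustrative assumptions, not the patented method's data structures):

```python
import numpy as np

def joule_power_density(resistivity, current_density):
    """Local heating rate q = rho * |J|^2 (W/m^3) for each cell.

    resistivity     : scalar or (n_cells,) array of rho values
    current_density : (..., n_cells, dim) vector field J per cell
    """
    J = np.asarray(current_density)
    return np.asarray(resistivity) * np.sum(J**2, axis=-1)

def best_injection_pattern(patterns, resistivity, roi_mask):
    """Pick the candidate electrode drive pattern that deposits the most
    joule power inside the region of interest (boolean roi_mask)."""
    powers = [joule_power_density(resistivity, J)[roi_mask].sum()
              for J in patterns]
    return int(np.argmax(powers))
```

In the method's loop, each updated resistivity tomograph would refresh `resistivity` (and the predicted `patterns`), and the selected pattern would be re-applied to the driving electrodes.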
Siontis, George CM; Mavridis, Dimitris; Greenwood, John P; Coles, Bernadette; Nikolakopoulou, Adriani; Jüni, Peter; Salanti, Georgia
2018-01-01
Abstract
Objective: To evaluate differences in downstream testing, coronary revascularisation, and clinical outcomes following non-invasive diagnostic modalities used to detect coronary artery disease.
Design: Systematic review and network meta-analysis.
Data sources: Medline, Medline In-Process, Embase, Cochrane Library for clinical trials, PubMed, Web of Science, SCOPUS, WHO International Clinical Trials Registry Platform, and ClinicalTrials.gov.
Eligibility criteria for selecting studies: Diagnostic randomised controlled trials comparing non-invasive diagnostic modalities in patients presenting with symptoms suggestive of low risk acute coronary syndrome or stable coronary artery disease.
Data synthesis: A random effects network meta-analysis synthesised available evidence from trials evaluating the effect of non-invasive diagnostic modalities on downstream testing and patient oriented outcomes in patients with suspected coronary artery disease. Modalities included exercise electrocardiograms, stress echocardiography, single photon emission computed tomography-myocardial perfusion imaging, real time myocardial contrast echocardiography, coronary computed tomographic angiography, and cardiovascular magnetic resonance. Unpublished outcome data were obtained from 11 trials.
Results: 18 trials of patients with low risk acute coronary syndrome (n=11 329) and 12 trials of those with suspected stable coronary artery disease (n=22 062) were included. Among patients with low risk acute coronary syndrome, stress echocardiography, cardiovascular magnetic resonance, and exercise electrocardiograms resulted in fewer invasive referrals for coronary angiography than coronary computed tomographic angiography (odds ratio 0.28 (95% confidence interval 0.14 to 0.57), 0.32 (0.15 to 0.71), and 0.53 (0.28 to 1.00), respectively). There was no effect on the subsequent risk of myocardial infarction, but estimates were imprecise. Heterogeneity and inconsistency were low. In patients with suspected stable coronary artery disease, an initial diagnostic strategy of stress echocardiography or single photon emission computed tomography-myocardial perfusion imaging resulted in fewer downstream tests than coronary computed tomographic angiography (0.24 (0.08 to 0.74) and 0.57 (0.37 to 0.87), respectively). However, exercise electrocardiograms yielded the highest downstream testing rate. Estimates for death and myocardial infarction were imprecise, without clear discrimination between strategies.
Conclusions: For patients with low risk acute coronary syndrome, an initial diagnostic strategy of stress echocardiography or cardiovascular magnetic resonance is associated with fewer referrals for invasive coronary angiography and revascularisation procedures than non-invasive anatomical testing, without apparent impact on the future risk of myocardial infarction. For suspected stable coronary artery disease, there was no clear discrimination between diagnostic strategies regarding the subsequent need for invasive coronary angiography, and differences in the risk of myocardial infarction cannot be ruled out.
Systematic review registration: PROSPERO registry no. CRD42016049442.
PMID: 29467161
NASA Astrophysics Data System (ADS)
Lauvaux, Thomas; Miles, Natasha L.; Deng, Aijun; Richardson, Scott J.; Cambaliza, Maria O.; Davis, Kenneth J.; Gaudet, Brian; Gurney, Kevin R.; Huang, Jianhua; O'Keefe, Darragh; Song, Yang; Karion, Anna; Oda, Tomohiro; Patarasuk, Risa; Razlivanov, Igor; Sarmiento, Daniel; Shepson, Paul; Sweeney, Colm; Turnbull, Jocelyn; Wu, Kai
2016-05-01
Based on a uniquely dense network of surface towers continuously measuring the atmospheric concentrations of greenhouse gases (GHGs), we developed the first comprehensive monitoring system of CO2 emissions at high resolution over the city of Indianapolis. The urban inversion evaluated over the 2012-2013 dormant season showed a statistically significant increase of about 20% (from 4.5 to 5.7 MtC ± 0.23 MtC) compared to the Hestia CO2 emission estimate, a state-of-the-art building-level emission product. Spatial structures in prior emission errors, mostly undetermined, appeared to affect the spatial pattern in the inverse solution and the total carbon budget over the entire area by up to 15%, while the inverse solution remained fairly insensitive to the CO2 boundary inflow and to the different prior emissions (i.e., ODIAC). Preceding the surface emission optimization, we improved the atmospheric simulations using a meteorological data assimilation system, which also informed our Bayesian inversion system through updated observation error variances. Finally, we estimated the uncertainties associated with undetermined parameters using an ensemble of inversions. The total CO2 emissions based on the ensemble mean and quartiles (5.26-5.91 MtC) were statistically different from the prior total emissions (4.1 to 4.5 MtC). Considering the relatively small sensitivity to the different parameters, we conclude that atmospheric inversions are potentially able to constrain the carbon budget of the city, assuming sufficient data to measure the inflow of GHG over the city, but additional information on prior emission error structures is required to determine the spatial structures of urban emissions at high resolution.
Search for C II Emission on Cosmological Scales at Redshift z ∼ 2.6
NASA Astrophysics Data System (ADS)
Pullen, Anthony R.; Serra, Paolo; Chang, Tzu-Ching; Doré, Olivier; Ho, Shirley
2018-05-01
We present a search for C II emission over cosmological scales at high redshifts. The C II line is a prime candidate to be a tracer of star formation over large-scale structure, since it is one of the brightest emission lines from galaxies. Redshifted C II emission appears in the submillimeter regime, meaning it could potentially be present in the higher-frequency intensity data from the Planck satellite used to measure the cosmic infrared background (CIB). We search for C II emission over redshifts z = 2-3.2 in the Planck 545 GHz intensity map by cross-correlating the three highest-frequency Planck maps with spectroscopic quasars and CMASS galaxies from the Sloan Digital Sky Survey III (SDSS-III), which we then use to jointly fit for C II intensity, CIB parameters, and thermal Sunyaev-Zel'dovich (SZ) emission. We report a measurement of an anomalous emission I_ν = 6.6^{+5.0}_{-4.8} × 10^4 Jy/sr at 95% confidence, which could be explained by C II emission, favoring collisional excitation models of C II emission that tend to be more optimistic than models based on C II luminosity scaling relations from local measurements; however, a comparison of Bayesian information criteria reveals that this model and the CIB & SZ only model are equally plausible. Thus, more sensitive measurements will be needed to confirm the existence of large-scale C II emission at high redshifts. Finally, we forecast that intensity maps from Planck cross-correlated with quasars from the Dark Energy Spectroscopic Instrument (DESI) would increase our sensitivity to C II emission by a factor of 5, while the proposed Primordial Inflation Explorer (PIXIE) could increase the sensitivity further.
Conjugate-gradient preconditioning methods for shift-variant PET image reconstruction.
Fessler, J A; Booth, S D
1999-01-01
Gradient-based iterative methods often converge slowly for tomographic image reconstruction and image restoration problems, but can be accelerated by suitable preconditioners. Diagonal preconditioners offer some improvement in convergence rate, but do not incorporate the structure of the Hessian matrices in imaging problems. Circulant preconditioners can provide remarkable acceleration for inverse problems that are approximately shift-invariant, i.e., for those with approximately block-Toeplitz or block-circulant Hessians. However, in applications with nonuniform noise variance, such as arises from Poisson statistics in emission tomography and in quantum-limited optical imaging, the Hessian of the weighted least-squares objective function is quite shift-variant, and circulant preconditioners perform poorly. Additional shift-variance is caused by edge-preserving regularization methods based on nonquadratic penalty functions. This paper describes new preconditioners that approximate more accurately the Hessian matrices of shift-variant imaging problems. Compared to diagonal or circulant preconditioning, the new preconditioners lead to significantly faster convergence rates for the unconstrained conjugate-gradient (CG) iteration. We also propose a new efficient method for the line-search step required by CG methods. Applications to positron emission tomography (PET) illustrate the method.
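The preconditioned conjugate-gradient iteration underlying the abstract above can be sketched generically. This sketch shows only the simple diagonal preconditioner the paper uses as a baseline (the paper's contribution, preconditioners tailored to shift-variant Hessians, is not implemented here); the dense matrix `H` and the stopping rule are illustrative assumptions:

```python
import numpy as np

def pcg(H, b, M_inv_diag, n_iter=50, tol=1e-10):
    """Preconditioned conjugate gradient for H x = b, H symmetric positive
    definite, with a diagonal preconditioner supplied as the vector M^{-1}."""
    x = np.zeros_like(b)
    r = b - H @ x            # residual
    z = M_inv_diag * r       # preconditioned residual
    p = z.copy()             # search direction
    for _ in range(n_iter):
        Hp = H @ p
        alpha = (r @ z) / (p @ Hp)   # exact line search for quadratics
        x += alpha * p
        r_new = r - alpha * Hp
        if np.linalg.norm(r_new) < tol:
            return x
        z_new = M_inv_diag * r_new
        beta = (r_new @ z_new) / (r @ z)
        p = z_new + beta * p
        r, z = r_new, z_new
    return x
```

In the weighted least-squares setting of the paper, `H` would be the Hessian A^T W A (plus a regularization term), and a better preconditioner replaces `M_inv_diag` with an operator that more closely approximates H^{-1}.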
Kubo, N
1995-04-01
To improve the quality of single-photon emission computed tomographic (SPECT) images, a restoration filter has been developed. This filter was designed according to practical "least-squares filter" theory, which requires knowledge of the object power spectrum and the noise power spectrum. The object power spectrum is estimated from the power spectrum of a projection, provided the high-frequency portion of the projection power spectrum is adequately approximated by a polynomial-exponential expression. Restoration with the filter based on the projection power spectrum was studied and compared with "Butterworth" filtering (cut-off frequency of 0.15 cycles/pixel) and "Wiener" filtering (with a constant signal-to-noise power spectrum ratio). Normalized mean-squared errors (NMSE) were evaluated on a phantom consisting of two line sources located in a 99mTc-filled cylinder. The NMSE of the "Butterworth" filter, the "Wiener" filter, and the filter based on the projection power spectrum were 0.77, 0.83, and 0.76, respectively. Clinically, brain SPECT images utilizing this new restoration filter showed improved contrast. Thus, this filter may be useful in the diagnostic interpretation of SPECT images.
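The "practical least-squares" (Wiener-type) restoration idea can be sketched in one dimension: estimate the noise power from the flat high-frequency tail of the periodogram, take the remainder as object power, and filter with P_s/(P_s + P_n). The synthetic projection, noise level, and tail-median estimator below are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 256
# Synthetic "projection": a smooth object plus white noise.
x = np.linspace(0, 1, n)
obj = np.exp(-((x - 0.5) / 0.05) ** 2)
noisy = obj + 0.05 * rng.standard_normal(n)

F = np.fft.fft(noisy)
P_total = np.abs(F) ** 2 / n            # periodogram of the projection

# Noise power estimated from the flat high-frequency tail; object
# power is whatever remains (clipped at zero).
P_noise = np.median(P_total[n // 4 : n // 2])
P_obj = np.clip(P_total - P_noise, 0.0, None)

# Least-squares (Wiener-type) restoration filter: P_s / (P_s + P_n).
H = P_obj / (P_obj + P_noise)
restored = np.real(np.fft.ifft(H * F))

def nmse(est, ref):
    """Normalized mean-squared error, as used for the phantom study."""
    return np.sum((est - ref) ** 2) / np.sum(ref ** 2)

print(nmse(noisy, obj), nmse(restored, obj))
```

The restored NMSE comes out below the unfiltered one, which is the figure of merit the abstract reports for the phantom comparison.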
Tomographic imaging of subducted lithosphere below northwest Pacific island arcs
Van Der Hilst, R.; Engdahl, R.; Spakman, W.; Nolet, G.
1991-01-01
The seismic tomography problem does not have a unique solution, and published tomographic images have been equivocal with regard to the deep structure of subducting slabs. An improved tomographic method, using a more realistic background Earth model and surface-reflected as well as direct seismic phases, shows that slabs beneath the Japan and Izu Bonin island arcs are deflected at the boundary between upper and lower mantle, whereas those beneath the northern Kuril and Mariana arcs sink into the lower mantle.
Business aspects of cardiovascular computed tomography: tackling the challenges.
Bateman, Timothy M
2008-01-01
The purpose of this article is to provide a comprehensive understanding of the business issues surrounding provision of dedicated cardiovascular computed tomographic imaging. Some of the challenges include high up-front costs, current low utilization relative to scanner capability, and inadequate payments. Cardiovascular computed tomographic imaging is a valuable clinical modality that should be offered by cardiovascular centers-of-excellence. With careful consideration of the business aspects, moderate-to-large size cardiology programs should be able to implement an economically viable cardiovascular computed tomographic service.
NASA Astrophysics Data System (ADS)
Karagiannis, Georgios
2017-03-01
This work led to a new method named 3D spectracoustic tomographic mapping imaging. Current and future work concerns the fabrication of a combined acoustic-microscopy transducer and infrared-illumination probe permitting the simultaneous acquisition of the spectroscopic and the tomographic information. This probe provides high-fidelity, precisely registered information from the combined modalities, termed spectracoustic information.
2016-04-28
Single-shot, volumetrically illuminated, three-dimensional, tomographic laser-induced-fluorescence imaging in a gaseous free jet
Halls, Benjamin R.
Single-shot tomographic imaging of the three-dimensional concentration field is demonstrated in a turbulent gaseous free jet in co-flow.
Zhang, T; Godavarthi, C; Chaumet, P C; Maire, G; Giovannini, H; Talneau, A; Prada, C; Sentenac, A; Belkebir, K
2015-02-15
Tomographic diffractive microscopy is a marker-free optical digital imaging technique in which three-dimensional samples are reconstructed from a set of holograms recorded under different angles of incidence. We show experimentally that, by processing the holograms with singular value decomposition, it is possible to image objects in a noisy background that are invisible with classical wide-field microscopy and conventional tomographic reconstruction procedure. The targets can be further characterized with a selective quantitative inversion.
A spatially resolved radio spectral index study of the dwarf irregular galaxy NGC 1569
NASA Astrophysics Data System (ADS)
Westcott, Jonathan; Brinks, Elias; Hindson, Luke; Beswick, Robert; Heesen, Volker
2018-04-01
We study the resolved radio continuum spectral energy distribution of the dwarf irregular galaxy NGC 1569 on a beam-by-beam basis to isolate and study its spatially resolved radio emission characteristics. Utilizing high-quality NRAO Karl G. Jansky Very Large Array observations that densely sample the 1-34 GHz frequency range, we adopt a Bayesian fitting procedure, where we use Hα emission that has not been corrected for extinction as a prior, to produce maps of how the separated thermal emission, non-thermal emission, and non-thermal spectral index vary across NGC 1569's main disc. We find a higher thermal fraction at 1 GHz than is found in spiral galaxies (26^{+2}_{-3} per cent) and find an average non-thermal spectral index α = -0.53 ± 0.02, suggesting that a young population of cosmic ray electrons is responsible for the observed non-thermal emission. By comparing our recovered map of the thermal radio emission with literature Hα maps, we estimate the total reddening along the line of sight to NGC 1569 to be E(B - V) = 0.49 ± 0.05, which is in good agreement with other literature measurements. Spatial variations in the reddening indicate that a significant portion of the total reddening is due to internal extinction within NGC 1569.
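The thermal/non-thermal separation described above can be illustrated with a simplified two-component SED fit: a thermal power law with fixed index -0.1 plus a non-thermal power law with free index, gridding the index and solving for the two amplitudes linearly. The paper's actual method is a Bayesian fit with an Hα prior; the frequencies and component amplitudes below are invented (chosen so the thermal fraction at 1 GHz is 26 per cent, matching the abstract).

```python
import numpy as np

# Noiseless two-component radio SED: thermal (index -0.1) plus
# non-thermal power law with free spectral index alpha.
nu = np.array([1.5, 3.0, 6.0, 10.0, 15.0, 22.0, 33.0])   # GHz
alpha_true, S_th, S_nt = -0.53, 0.26, 0.74                # amplitudes at 1 GHz
S = S_th * nu ** -0.1 + S_nt * nu ** alpha_true

# Grid the non-thermal index; for each candidate, the amplitudes enter
# linearly and are found by least squares.
best = None
for alpha in np.arange(-1.2, -0.15, 0.001):
    X = np.column_stack([nu ** -0.1, nu ** alpha])
    coef, *_ = np.linalg.lstsq(X, S, rcond=None)
    resid = np.sum((X @ coef - S) ** 2)
    if best is None or resid < best[0]:
        best = (resid, alpha, coef)

_, alpha_fit, (a_th, a_nt) = best
print(alpha_fit, a_th, a_nt)
```

With noiseless data the grid search recovers the generating index and the thermal amplitude; with real data, the Bayesian prior on the thermal component is what breaks the near-degeneracy between the two power laws.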
Allowable carbon emissions lowered by multiple climate targets.
Steinacher, Marco; Joos, Fortunat; Stocker, Thomas F
2013-07-11
Climate targets are designed to inform policies that would limit the magnitude and impacts of climate change caused by anthropogenic emissions of greenhouse gases and other substances. The target that is currently recognized by most world governments places a limit of two degrees Celsius on the global mean warming since preindustrial times. This would require large sustained reductions in carbon dioxide emissions during the twenty-first century and beyond. Such a global temperature target, however, is not sufficient to control many other quantities, such as transient sea level rise, ocean acidification and net primary production on land. Here, using an Earth system model of intermediate complexity (EMIC) in an observation-informed Bayesian approach, we show that allowable carbon emissions are substantially reduced when multiple climate targets are set. We take into account uncertainties in physical and carbon cycle model parameters, radiative efficiencies, climate sensitivity and carbon cycle feedbacks along with a large set of observational constraints. Within this framework, we explore a broad range of economically feasible greenhouse gas scenarios from the integrated assessment community to determine the likelihood of meeting a combination of specific global and regional targets under various assumptions. For any given likelihood of meeting a set of such targets, the allowable cumulative emissions are greatly reduced from those inferred from the temperature target alone. Therefore, temperature targets alone are unable to comprehensively limit the risks from anthropogenic emissions.
Lung Morphometry with Hyperpolarized 129Xe: Theoretical Background
Sukstanskii, A.L.; Yablonskiy, D.A.
2011-01-01
The 3He lung morphometry technique, based on MRI measurements of hyperpolarized 3He gas diffusion in lung airspaces, provides unique information on the lung microstructure at the alveolar level. In vivo 3D tomographic images of standard morphological parameters (airspace chord length, lung parenchyma surface-to-volume ratio, number of alveoli per unit volume) can be generated from a rather short (several seconds) MRI scan. The technique is based on a theory of gas diffusion in lung acinar airways and experimental measurements of the diffusion-attenuated MRI signal. The present work aims at developing the theoretical background for a similar technique based on hyperpolarized 129Xe gas. As the diffusion coefficient and gyromagnetic ratio of 129Xe are substantially different from those of 3He, the specific details of the theory and of the experimental measurements must be adapted. We establish phenomenological relationships between acinar airway geometrical parameters and the diffusion-attenuated MR signal for human and small-animal lungs, both normal lungs and lungs with mild emphysema. Optimal diffusion times are shown to be about 5 ms for humans and 1.3 ms for small animals. The expected uncertainties in measuring the main morphometric parameters of the lungs are estimated in the framework of Bayesian probability theory. PMID:21713985
NASA Astrophysics Data System (ADS)
Thompson, R. L.; Gerbig, C.; Roedenbeck, C.; Heimann, M.
2009-04-01
The nitrous oxide (N2O) mixing ratio has been increasing in the atmosphere since the industrial revolution, from 270 ppb in 1750 to 320 ppb in 2007, with a steady growth rate of around 0.26% per year since the early 1980s. The increase in N2O is worrisome for two main reasons. First, it is a greenhouse gas: its atmospheric increase translates to an enhancement in radiative forcing of 0.16 ± 0.02 W m-2, making it currently the fourth most important long-lived greenhouse gas, and it is predicted to soon overtake CFCs to become the third most important. Second, it plays an important role in stratospheric ozone chemistry. Human activities are the primary cause of the atmospheric N2O increase. The largest anthropogenic source of N2O is the use of N-fertilizers in agriculture, but fossil fuel combustion and industrial processes, such as adipic and nitric acid production, are also important. We present a Bayesian inversion approach for estimating N2O fluxes over central and western Europe using high-frequency in-situ concentration data from the Ochsenkopf tall tower (50°01′N, 11°48′E, 1022 m a.s.l.). For the inversion, we employ a Lagrangian-type transport model, STILT, which provides source-receptor relationships at 10 km resolution using ECMWF meteorological data. The a priori flux estimates were from IER for anthropogenic fluxes and GEIA for natural fluxes. N2O fluxes were retrieved monthly at 2 x 2 degree spatial resolution for 2007. The retrieved N2O fluxes showed significantly more spatial heterogeneity than the a priori field and considerable seasonal variability. The timing of peak emissions differed between regions, but in general the months with the strongest emissions were May and August. Overall, the retrieved flux (anthropogenic and natural) was lower than the a priori field.
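A linear-Gaussian Bayesian flux inversion of this general kind has a closed-form posterior, which can be sketched as follows. The matrix H below is a random stand-in for the STILT source-receptor relationships, and the dimensions, prior covariance, and noise level are illustrative assumptions, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(2)
n_flux, n_obs = 10, 40

H = rng.uniform(0.0, 1.0, (n_obs, n_flux))   # source-receptor matrix (footprints)
x_true = rng.uniform(0.5, 2.0, n_flux)       # "true" fluxes (arbitrary units)
x_a = np.ones(n_flux)                        # a priori flux estimate
B = 0.5 ** 2 * np.eye(n_flux)                # prior-error covariance
R = 0.1 ** 2 * np.eye(n_obs)                 # observation-error covariance

y = H @ x_true + 0.1 * rng.standard_normal(n_obs)   # simulated tower data

# Analytic Gaussian posterior (Bayes' rule for a linear model):
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
x_post = x_a + K @ (y - H @ x_a)
P_post = (np.eye(n_flux) - K @ H) @ B        # posterior covariance

print(np.abs(x_post - x_true).mean(), np.abs(x_a - x_true).mean())
```

The posterior mean pulls the a priori fluxes toward the observations in proportion to the footprint strength and the relative error covariances; `P_post` quantifies the remaining uncertainty.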
Lung Cancer Risk and Past Exposure to Emissions from a Large Steel Plant
Ameling, Caroline; van de Kassteele, Jan; Lijzen, Johannes; Oosterlee, Arie; Keuken, Rinske; Visser, Otto; van Wiechen, Carla
2013-01-01
We studied the spatial distribution of cancer incidence rates around a large steel plant and its association with historical exposure. The study population was close to 600,000. The incidence data was collected for 1995–2006. From historical emission data the air pollution concentrations for polycyclic aromatic hydrocarbons (PAH) and metals were modelled. Data were analyzed using Bayesian hierarchical Poisson regression models. The standardized incidence ratio (SIR) for lung cancer was up to 40% higher than average in postcodes located in two municipalities adjacent to the industrial area. Increased incidence rates could partly be explained by differences in socioeconomic status (SES). In the highest exposure category (approximately 45,000 inhabitants) a statistically significant increased relative risk (RR) of 1.21 (1.01–1.43) was found after adjustment for SES. The elevated RRs were similar for men and women. Additional analyses in a subsample of the population with personal smoking data from a recent survey suggested that the observed association between lung cancer and plant emission, after adjustment for SES, could still be caused by residual confounding. Therefore, we cannot indisputably conclude that past emissions from the steel plant have contributed to the increased risk of lung cancer. PMID:24324501
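The standardized incidence ratio at the core of this analysis is observed cases divided by expected cases, with a Poisson confidence interval. A minimal sketch using Byar's approximation follows; the case counts are hypothetical (chosen to give an SIR of 1.4, i.e. 40% above average as in the abstract), not data from the study.

```python
import math

def sir(observed, expected, z=1.96):
    """Standardized incidence ratio with an approximate 95% CI
    (Byar's approximation to the exact Poisson interval)."""
    s = observed / expected
    lo = observed * (1 - 1 / (9 * observed)
                     - z / (3 * math.sqrt(observed))) ** 3 / expected
    o1 = observed + 1
    hi = o1 * (1 - 1 / (9 * o1) + z / (3 * math.sqrt(o1))) ** 3 / expected
    return s, lo, hi

# Hypothetical postcode: 70 lung cancer cases observed vs 50 expected
# from reference rates -> SIR = 1.4, significantly above 1.
print(sir(70, 50.0))
```

In the study itself, such ratios are smoothed and adjusted (for SES, etc.) within a Bayesian hierarchical Poisson regression rather than computed postcode by postcode in isolation.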
An Anticipatory Model of Cavitation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allgood, G.O.; Dress, W.B., Jr.; Hylton, J.O.
1999-04-05
The Anticipatory System (AS) formalism developed by Robert Rosen provides some insight into the problem of embedding intelligent behavior in machines. An AS emulates the anticipatory behavior of biological systems: it bases its behavior on its expectations about the near future, and those expectations are modified as the system gains experience. The expectation is based on an internal model that is drawn from an appeal to physical reality. To be adaptive, the model must be able to update itself. To be practical, the model must run faster than real time. The need for a physical model and the requirement that the model execute at extreme speeds have held back the application of AS to practical problems. Two recent advances make it possible to consider the use of AS for practical intelligent sensors. First, advances in transducer technology make it possible to obtain previously unavailable data from which a model can be derived. For example, acoustic emissions (AE) can be fed into a Bayesian system identifier that enables the separation of a weak characterizing signal, such as the signature of pump cavitation precursors, from a strong masking signal, such as a pump vibration feature. The second advance is the development of extremely fast but inexpensive digital signal processing hardware on which it is possible to run an adaptive Bayesian-derived model faster than real time. This paper reports the investigation of an AS using a model of cavitation based on hydrodynamic principles and Bayesian analysis of data from high-performance AE sensors.
Towards national-scale greenhouse gas emissions evaluation with robust uncertainty estimates
NASA Astrophysics Data System (ADS)
Rigby, Matthew; Swallow, Ben; Lunt, Mark; Manning, Alistair; Ganesan, Anita; Stavert, Ann; Stanley, Kieran; O'Doherty, Simon
2016-04-01
Through the Deriving Emissions related to Climate Change (DECC) network and the Greenhouse gAs Uk and Global Emissions (GAUGE) programme, the UK's greenhouse gases are now monitored by instruments mounted on telecommunications towers and churches, on a ferry that performs regular transects of the North Sea, on-board a research aircraft and from space. When combined with information from high-resolution chemical transport models such as the Met Office Numerical Atmospheric dispersion Modelling Environment (NAME), these measurements are allowing us to evaluate emissions more accurately than has previously been possible. However, it has long been appreciated that current methods for quantifying fluxes using atmospheric data suffer from uncertainties, primarily relating to the chemical transport model, that have been largely ignored to date. Here, we use novel model reduction techniques for quantifying the influence of a set of potential systematic model errors on the outcome of a national-scale inversion. This new technique has been incorporated into a hierarchical Bayesian framework, which can be shown to reduce the influence of subjective choices on the outcome of inverse modelling studies. Using estimates of the UK's methane emissions derived from DECC and GAUGE tall-tower measurements as a case study, we will show that such model systematic errors have the potential to significantly increase the uncertainty on national-scale emissions estimates. Therefore, we conclude that these factors must be incorporated in national emissions evaluation efforts, if they are to be credible.
Optimal joule heating of the subsurface
Berryman, J.G.; Daily, W.D.
1994-07-05
A method for simultaneously heating the subsurface and imaging the effects of the heating is disclosed. This method combines the use of tomographic imaging (electrical resistance tomography, or ERT) to image the electrical resistivity distribution underground with joule heating by electrical currents injected into the ground. A potential distribution is established on a series of buried electrodes, resulting in energy deposition underground that is a function of the resistivity and injection current density. Measurement of the voltages and currents also permits a tomographic reconstruction of the resistivity distribution. Using this tomographic information, the current injection pattern on the driving electrodes can be adjusted to change the current density distribution and thus optimize the heating. As the heating changes conditions, the applied current pattern can be repeatedly adjusted (based on updated resistivity tomographs) to effect real-time control of the heating.
Wang, Dengjiang; Zhang, Weifang; Wang, Xiangyu; Sun, Bo
2016-01-01
This study presents a novel monitoring method for hole-edge corrosion damage in plate structures based on Lamb wave tomographic imaging techniques. An experimental procedure with a cross-hole layout using 16 piezoelectric transducers (PZTs) was designed. The A0 mode of the Lamb wave was selected, which is sensitive to thickness-loss damage. The iterative algebraic reconstruction technique (ART) method was used to locate and quantify the corrosion damage at the edge of the hole. Hydrofluoric acid with a concentration of 20% was used to corrode the specimen artificially. To estimate the effectiveness of the proposed method, the real corrosion damage was compared with the predicted corrosion damage based on the tomographic method. The results show that the Lamb-wave-based tomographic method can be used to monitor the hole-edge corrosion damage accurately. PMID:28774041
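The iterative ART used for this kind of cross-hole reconstruction is essentially Kaczmarz's method: cycle through the ray-sum equations, projecting the current estimate onto each in turn. A toy sketch on a 4x4 grid follows; the ray layout and the "damage" cell are invented for illustration (a real Lamb-wave setup uses many more transducer paths), but the update rule is the standard ART iteration.

```python
import numpy as np

def art(A, b, n_sweeps=200, relax=1.0):
    """Algebraic reconstruction technique (Kaczmarz iterations):
    repeatedly project the estimate onto each ray equation in turn."""
    x = np.zeros(A.shape[1])
    for _ in range(n_sweeps):
        for a_i, b_i in zip(A, b):
            x += relax * (b_i - a_i @ x) / (a_i @ a_i) * a_i
    return x

# Toy cross-hole layout: a 4x4 grid probed by straight rays along all
# rows, all columns, and the two diagonals (10 ray-sum measurements).
n = 4
field = np.zeros((n, n))
field[1, 2] = 1.0                          # anomalous "damage" cell
rays = []
for i in range(n):                          # horizontal rays
    r = np.zeros((n, n)); r[i, :] = 1.0; rays.append(r.ravel())
for j in range(n):                          # vertical rays
    r = np.zeros((n, n)); r[:, j] = 1.0; rays.append(r.ravel())
rays.append(np.eye(n).ravel())              # main diagonal
rays.append(np.fliplr(np.eye(n)).ravel())   # anti-diagonal
A = np.array(rays)
b = A @ field.ravel()                       # simulated ray sums

recon = art(A, b).reshape(n, n)
print(np.unravel_index(np.argmax(recon), recon.shape))  # damage located
```

Starting from zero, Kaczmarz converges to the minimum-norm consistent solution, so even this underdetermined system localizes the anomaly at the intersection of the three anomalous ray sums.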
Atwood, Robert C.; Bodey, Andrew J.; Price, Stephen W. T.; Basham, Mark; Drakopoulos, Michael
2015-01-01
Tomographic datasets collected at synchrotrons are becoming very large and complex, and, therefore, need to be managed efficiently. Raw images may have high pixel counts, and each pixel can be multidimensional and associated with additional data such as those derived from spectroscopy. In time-resolved studies, hundreds of tomographic datasets can be collected in sequence, yielding terabytes of data. Users of tomographic beamlines are drawn from various scientific disciplines, and many are keen to use tomographic reconstruction software that does not require a deep understanding of reconstruction principles. We have developed Savu, a reconstruction pipeline that enables users to rapidly reconstruct data to consistently create high-quality results. Savu is designed to work in an ‘orthogonal’ fashion, meaning that data can be converted between projection and sinogram space throughout the processing workflow as required. The Savu pipeline is modular and allows processing strategies to be optimized for users' purposes. In addition to the reconstruction algorithms themselves, it can include modules for identification of experimental problems, artefact correction, general image processing and data quality assessment. Savu is open source, open licensed and ‘facility-independent’: it can run on standard cluster infrastructure at any institution. PMID:25939626
Risk assessment by dynamic representation of vulnerability, exploitation, and impact
NASA Astrophysics Data System (ADS)
Cam, Hasan
2015-05-01
Assessing and quantifying cyber risk accurately in real-time is essential to providing security and mission assurance in any system and network. This paper presents a modeling and dynamic analysis approach to assessing cyber risk of a network in real-time by representing dynamically its vulnerabilities, exploitations, and impact using integrated Bayesian network and Markov models. Given the set of vulnerabilities detected by a vulnerability scanner in a network, this paper addresses how its risk can be assessed by estimating in real-time the exploit likelihood and impact of vulnerability exploitation on the network, based on real-time observations and measurements over the network. The dynamic representation of the network in terms of its vulnerabilities, sensor measurements, and observations is constructed dynamically using the integrated Bayesian network and Markov models. The transition rates of outgoing and incoming links of states in hidden Markov models are used in determining exploit likelihood and impact of attacks, whereas emission rates help quantify the attack states of vulnerabilities. Simulation results show the quantification and evolving risk scores over time for individual and aggregated vulnerabilities of a network.
Korving, H; Clemens, F
2002-01-01
In recent years, decision analysis has become an important technique in many disciplines. It provides a methodology for rational decision-making allowing for uncertainties in the outcome of several possible actions to be undertaken. An example in urban drainage is the situation in which an engineer has to decide upon a major reconstruction of a system in order to prevent pollution of receiving waters due to CSOs. This paper describes the possibilities of Bayesian decision-making in urban drainage. In particular, the utility of monitoring prior to deciding on the reconstruction of a sewer system to reduce CSO emissions is studied. Our concern is with deciding whether a price should be paid for new information and which source of information is the best choice given the expected uncertainties in the outcome. The influence of specific uncertainties (sewer system data and model parameters) on the probability of CSO volumes is shown to be significant. Using Bayes' rule, to combine prior impressions with new observations, reduces the risks linked with the planning of sewer system reconstructions.
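The core calculation behind "whether a price should be paid for new information" is the expected value of information. A minimal two-state, two-action sketch follows; the prior probability and consequence costs are invented for illustration, not values from the study.

```python
# Two states of nature: CSO volumes exceed the standard ("bad") or not.
p_bad = 0.3                     # prior probability from model studies
cost = {                        # consequence matrix (arbitrary units):
    ("rebuild", "bad"): 10,     # reconstruction cost, problem solved
    ("rebuild", "ok"): 10,      # ...paid even though it was not needed
    ("wait", "bad"): 25,        # fines / environmental damage
    ("wait", "ok"): 0,
}

def expected_cost(action, p):
    """Expected cost of an action under probability p of the bad state."""
    return p * cost[(action, "bad")] + (1 - p) * cost[(action, "ok")]

# Optimal action (minimum expected cost) under the prior alone:
prior_cost = min(expected_cost(a, p_bad) for a in ("rebuild", "wait"))

# With perfect monitoring information we choose knowing the state:
perfect_cost = (p_bad * min(cost[("rebuild", "bad")], cost[("wait", "bad")])
                + (1 - p_bad) * min(cost[("rebuild", "ok")], cost[("wait", "ok")]))

evpi = prior_cost - perfect_cost   # max rational price for monitoring
print(prior_cost, perfect_cost, evpi)
```

Real monitoring is imperfect, so its value (computed by updating the prior with Bayes' rule for each possible observation) is bounded above by this EVPI; monitoring is worth buying only if it costs less than the expected reduction in decision cost.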
NASA Astrophysics Data System (ADS)
Hopcroft, Peter O.; Valdes, Paul J.; Kaplan, Jed O.
2018-04-01
The observed rise in atmospheric methane (CH4) from 375 ppbv during the Last Glacial Maximum (LGM: 21,000 years ago) to 680 ppbv during the late preindustrial era is not well understood. Atmospheric chemistry considerations implicate an increase in CH4 sources, but process-based estimates fail to reproduce the required amplitude. CH4 stable isotopes provide complementary information that can help constrain the underlying causes of the increase. We combine Earth System model simulations of the late preindustrial and LGM CH4 cycles, including process-based estimates of the isotopic discrimination of vegetation, in a box model of atmospheric CH4 and its isotopes. Using a Bayesian approach, we show how model-based constraints and ice core observations may be combined in a consistent probabilistic framework. The resultant posterior distributions point to a strong reduction in wetland and other biogenic CH4 emissions during the LGM, with a modest increase in the geological source, or potentially natural or anthropogenic fires, accounting for the observed enrichment of δ13CH4.
Bayesian Analysis of the Cosmic Microwave Background
NASA Technical Reports Server (NTRS)
Jewell, Jeffrey
2007-01-01
There is a wealth of cosmological information encoded in the spatial power spectrum of temperature anisotropies of the cosmic microwave background! Experiments designed to map the microwave sky are returning a flood of data (time streams of instrument response as a beam is swept over the sky) at several different frequencies (from 30 to 900 GHz), all with different resolutions and noise properties. The resulting analysis challenge is to estimate, and quantify our uncertainty in, the spatial power spectrum of the cosmic microwave background given the complexities of "missing data", foreground emission, and complicated instrumental noise. A Bayesian formulation of this problem allows consistent treatment of many complexities, including complicated instrumental noise and foregrounds, and can be numerically implemented with Gibbs sampling. Gibbs sampling has now been validated as an efficient, statistically exact, and practically useful method for low-resolution analyses (as demonstrated on WMAP 1-year and 3-year temperature and polarization data). Development continues for Planck; the goal is to exploit the unique capabilities of Gibbs sampling to directly propagate uncertainties in both foreground and instrument models through to the total uncertainty in cosmological parameters.
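The Gibbs sampling idea, alternating a draw of the sky signal given the power spectrum with a draw of the power spectrum given the signal, can be shown on a one-dimensional toy analogue: data d = s + n with unknown signal variance C standing in for the power spectrum. All numbers and the Jeffreys-like prior are illustrative assumptions, not the CMB pipeline.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy analogue of CMB Gibbs sampling: d = s + n, signal variance C
# unknown (the one-number analogue of a power spectrum), noise known.
true_C, noise_var, n = 4.0, 1.0, 2000
s_true = rng.normal(0.0, np.sqrt(true_C), n)
d = s_true + rng.normal(0.0, np.sqrt(noise_var), n)

C = 1.0                                    # deliberately poor start
C_samples = []
for it in range(3000):
    # Draw s | C, d : Gaussian (Wiener-filter mean plus fluctuation).
    var_s = 1.0 / (1.0 / C + 1.0 / noise_var)
    s = var_s * d / noise_var + np.sqrt(var_s) * rng.standard_normal(n)
    # Draw C | s : inverse-gamma under a Jeffreys-like 1/C prior,
    # sampled as (sum of squares) / chi-square draw.
    C = np.sum(s ** 2) / rng.chisquare(n)
    if it >= 500:                          # discard burn-in
        C_samples.append(C)

print(np.mean(C_samples))                  # posterior mean, near true_C
```

Each conditional draw is exact, so the chain samples the exact joint posterior of (s, C); the same two-step structure scales up to sky maps and full power spectra.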
NASA Astrophysics Data System (ADS)
Elshall, A. S.; Ye, M.; Niu, G. Y.; Barron-Gafford, G.
2015-12-01
Models in biogeoscience involve uncertainties in observation data, model inputs, model structure, model processes, and modeling scenarios. To accommodate different sources of uncertainty, multimodel analyses such as model combination, model selection, model elimination, and model discrimination are becoming more popular. To illustrate the theoretical and practical challenges of multimodel analysis, we use an example of microbial soil respiration modeling. Global soil respiration releases more than ten times as much carbon dioxide to the atmosphere as all anthropogenic emissions. Thus, improving our understanding of microbial soil respiration is essential for improving climate change models. This study focuses on a poorly understood phenomenon: soil microbial respiration pulses in response to episodic rainfall pulses (the "Birch effect"). We hypothesize that the "Birch effect" is generated by three mechanisms. To test our hypothesis, we developed and assessed five evolving microbial-enzyme models against field measurements from a semiarid savannah that is characterized by pulsed precipitation. These five models evolve stepwise such that the first model includes none of the three mechanisms, while the fifth model includes all three. The basic component of Bayesian multimodel analysis is the estimation of the marginal likelihood, used to rank the candidate models based on their overall likelihood with respect to the observation data. The first part of the study focuses on using this Bayesian scheme to discriminate between the five candidate models. The second part discusses some theoretical and practical challenges, mainly the effect of the choice of likelihood function and of the marginal likelihood estimation method on both model ranking and Bayesian model averaging.
The study shows that making valid inference from scientific data is not a trivial task, since we are not only uncertain about the candidate scientific models, but also about the statistical methods that are used to discriminate between these models.
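Marginal-likelihood ranking can be illustrated with a brute-force Monte Carlo estimate of the evidence p(y | M) = ∫ p(y | θ) p(θ) dθ for two nested regression models. The data, priors, and noise level below are illustrative assumptions, and prior sampling is the crudest possible evidence estimator (real applications need better ones, which is exactly the practical challenge the abstract raises).

```python
import numpy as np

rng = np.random.default_rng(4)

# Data from a quadratic trend; compare linear (degree 1) vs quadratic
# (degree 2) models via Monte Carlo marginal likelihoods.
x = np.linspace(-1.0, 1.0, 30)
y = 0.5 * x + 0.8 * x ** 2 + 0.1 * rng.standard_normal(30)

def log_like(theta, degree):
    pred = sum(t * x ** k for k, t in enumerate(theta[: degree + 1]))
    return -0.5 * np.sum((y - pred) ** 2) / 0.1 ** 2

def log_marginal(degree, n_draws=20000):
    # Independent N(0, 1) priors on the coefficients (an assumption):
    # the evidence is the prior-averaged likelihood.
    thetas = rng.standard_normal((n_draws, degree + 1))
    ll = np.array([log_like(t, degree) for t in thetas])
    m = ll.max()
    return m + np.log(np.mean(np.exp(ll - m)))   # log-sum-exp for stability

lm1, lm2 = log_marginal(1), log_marginal(2)
print(lm1, lm2)   # the quadratic model wins (higher log-evidence)
```

The evidence automatically penalizes the extra parameter through the prior-averaging, so the quadratic model wins only because the data genuinely demand the extra term; changing the likelihood function or the estimator changes the numerical values, which is the sensitivity the study examines.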
Developing a new Bayesian Risk Index for risk evaluation of soil contamination.
Albuquerque, M T D; Gerassis, S; Sierra, C; Taboada, J; Martín, J E; Antunes, I M H R; Gallego, J R
2017-12-15
Industrial and agricultural activities heavily constrain soil quality. Potentially Toxic Elements (PTEs) are a threat to public health and the environment alike. In this regard, the identification of areas that require remediation is crucial. In this research, a geochemical dataset (230 samples) comprising 14 elements (Cu, Pb, Zn, Ag, Ni, Mn, Fe, As, Cd, V, Cr, Ti, Al and S) was gathered throughout eight different zones distinguished by their main activity, namely recreational, agriculture/livestock and heavy industry, in the Avilés Estuary (north of Spain). A stratified systematic sampling method was then used at short, medium, and long distances from each zone to obtain a representative picture of the total variability of the selected attributes. The information was then combined into four risk classes (Low, Moderate, High, Remediation) following reference values from several sediment quality guidelines (SQGs). A Bayesian analysis, inferred for each zone, allowed the characterization of PTE correlations, with the unsupervised learning network technique proving to be the best fit. Based on the Bayesian network structure obtained, Pb, As and Mn were selected as key contamination parameters. For these three elements, the conditional probability obtained was allocated to each observed point, and a simple, direct index (Bayesian Risk Index, BRI) was constructed as a linear rating of the pre-defined risk classes weighted by the previously obtained probability. Finally, the BRI underwent geostatistical modeling. One hundred Sequential Gaussian Simulations (SGS) were computed. The mean image and standard deviation maps were obtained, allowing the definition of high/low risk clusters (local G clustering) and the computation of spatial uncertainty. High-risk clusters are mainly distributed within the area with the highest altitude (agriculture/livestock), showing an associated low spatial uncertainty and clearly indicating the need for remediation.
Atmospheric emissions, mainly derived from the metallurgical industry, contribute to soil contamination by PTEs.
What Can Galaxies Tell Us About The Epoch of Reionization?
NASA Astrophysics Data System (ADS)
Mason, Charlotte; GLASS, BoRG
2018-01-01
The reionization of neutral hydrogen in the intergalactic medium (IGM) in the universe's first billion years (z>6) was likely driven by the first stars and galaxies, and its history encodes information about their properties. But the timeline of reionization is not well-measured, and it is still unclear whether galaxies alone can produce the required ionizing photons. I will describe two key ways in which galaxies at our current observational frontiers can constrain reionization. One tool is the UV luminosity function (LF), which traces the evolution of star-forming galaxies and their ionizing photons. I will describe a Bayesian technique to account for gravitational lensing magnification bias in galaxy surveys to produce accurate LFs. I will then describe a simple but powerful model for LF evolution and its implications for reionization and z>10 galaxy surveys with JWST. Secondly, Lyman alpha (Lya) emission from galaxies is a potential probe of the IGM ionization state, as Lya photons are strongly attenuated by neutral hydrogen, but using it requires disentangling physics on pc to Gpc scales. I will introduce a new forward-modeling Bayesian framework which combines cosmological IGM simulations with models of interstellar medium conditions to infer the IGM neutral fraction from observations of Lya emission. I will present our new measurement of the neutral fraction at z~7 and place it in the context of other constraints on the reionization history. I will describe ongoing efforts to build larger samples of Lya-emitting galaxies for more accurate measurements with the HST survey GLASS, and will describe future prospects with JWST.
Development of microwave rainfall retrieval algorithm for climate applications
NASA Astrophysics Data System (ADS)
KIM, J. H.; Shin, D. B.
2014-12-01
With satellite datasets accumulated over decades, satellite-based data can contribute to sustained climate applications. Level-3 products from microwave sensors for climate applications can be obtained from several algorithms. For example, the Microwave Emission brightness Temperature Histogram (METH) algorithm produces level-3 rainfalls directly, whereas the Goddard profiling (GPROF) algorithm first generates instantaneous rainfalls and then a temporal and spatial averaging process leads to level-3 products. The rainfall algorithm developed in this study follows a similar approach of averaging instantaneous rainfalls. However, the algorithm is designed to produce instantaneous rainfalls at an optimal resolution showing reduced non-linearity in the brightness temperature (TB)-rain rate (R) relations. It is found that this resolution tends to effectively utilize emission channels, whose footprints are relatively larger than those of scattering channels. The algorithm is mainly composed of a priori databases (DBs) and a Bayesian inversion module. The DB contains massive pairs of simulated microwave TBs and rain rates, obtained by WRF (version 3.4) and RTTOV (version 11.1) simulations. To improve the accuracy and efficiency of the retrieval process, a data mining technique is additionally considered. The entire DB is classified into eight types based on Köppen climate classification criteria using reanalysis data. Among these sub-DBs, the one presenting the most similar physical characteristics is selected by considering the thermodynamics of the input data. When the Bayesian inversion is applied to the selected DB, instantaneous rain rates at 6-hour intervals are retrieved. The retrieved monthly mean rainfalls are statistically compared with CMAP and GPCP, respectively.
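The Bayesian inversion step, estimating rain rate as a likelihood-weighted average over an a priori database, can be sketched as follows. The TB-rain relation, noise levels, and database distribution below are synthetic stand-ins for the WRF/RTTOV-simulated database; note that the database itself plays the role of the prior.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy a priori database of (brightness temperature, rain rate) pairs.
# Emission-channel TB is taken to decrease with rain rate (synthetic).
n_db = 5000
rain_db = rng.gamma(shape=1.2, scale=3.0, size=n_db)          # mm/h
tb_db = (280.0 - 8.0 * np.log1p(rain_db)
         + 1.5 * rng.standard_normal(n_db))                   # K

def retrieve(tb_obs, sigma=1.5):
    """Bayesian retrieval: Gaussian likelihood in TB space, with the
    database distribution acting as the prior; the estimate is the
    likelihood-weighted mean of the database rain rates."""
    logw = -0.5 * ((tb_db - tb_obs) / sigma) ** 2
    w = np.exp(logw - logw.max())          # stabilized weights
    return np.sum(w * rain_db) / np.sum(w)

# Observe the TB corresponding to a 5 mm/h rain rate (noise-free here).
tb_obs = 280.0 - 8.0 * np.log1p(5.0)
print(retrieve(tb_obs))
```

In the real algorithm the observation is a multi-channel TB vector, the likelihood is multivariate, and only the climatologically matched sub-DB is searched, but the weighted-average structure is the same.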
NASA Astrophysics Data System (ADS)
Mason, Charlotte A.; Treu, Tommaso; Dijkstra, Mark; Mesinger, Andrei; Trenti, Michele; Pentericci, Laura; de Barros, Stephane; Vanzella, Eros
2018-03-01
We present a new flexible Bayesian framework for directly inferring the fraction of neutral hydrogen in the intergalactic medium (IGM) during the Epoch of Reionization (EoR, z ∼ 6-10) from detections and non-detections of Lyman Alpha (Lyα) emission from Lyman Break galaxies (LBGs). Our framework combines sophisticated reionization simulations with empirical models of the interstellar medium (ISM) radiative transfer effects on Lyα. We assert that the Lyα line profile emerging from the ISM has an important impact on the resulting transmission of photons through the IGM, and that these line profiles depend on galaxy properties. We model this effect by considering the peak velocity offset of Lyα lines from host galaxies' systemic redshifts, which are empirically correlated with UV luminosity and redshift (or halo mass at fixed redshift). We use our framework on the sample of LBGs presented in Pentericci et al. and infer a global neutral fraction at z ∼ 7 of $\overline{x}_{\rm HI} = 0.59^{+0.11}_{-0.15}$, consistent with other robust probes of the EoR and confirming that reionization is ongoing ∼700 Myr after the Big Bang. We show that using the full distribution of Lyα equivalent width detections and upper limits from LBGs places tighter constraints on the evolving IGM than the standard Lyα emitter fraction, and that larger samples are within reach of deep spectroscopic surveys of gravitationally lensed fields and James Webb Space Telescope NIRSpec.
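The use of both detections and upper limits can be sketched with a toy likelihood: detections contribute the probability density of their measured equivalent widths (EWs), while non-detections contribute the cumulative probability below their limits. The EW model here (a mean damped linearly by the neutral fraction), the scatter, and all numbers are illustrative assumptions, not the paper's actual simulation-based model.

```python
import numpy as np
from scipy.stats import norm

def log_likelihood(xhi, ew_det, ew_lim, mu0=50.0, sigma=15.0):
    """Toy log-likelihood for the IGM neutral fraction x_HI from Lya EWs.

    Detections contribute the pdf of their EWs; non-detections contribute
    the cdf below their upper limits. The IGM is modelled (illustratively)
    as damping the mean EW by a factor (1 - x_HI).
    """
    mu = mu0 * (1.0 - xhi)  # mean transmitted EW shrinks as the IGM gets more neutral
    ll = np.sum(norm.logpdf(ew_det, mu, sigma))   # detected lines
    ll += np.sum(norm.logcdf(ew_lim, mu, sigma))  # non-detections below their limits
    return ll

# Grid posterior over x_HI with a flat prior
grid = np.linspace(0.0, 1.0, 101)
ew_det = np.array([20.0, 15.0])         # detections (Angstrom)
ew_lim = np.array([10.0, 10.0, 25.0])   # upper limits (Angstrom)
logp = np.array([log_likelihood(x, ew_det, ew_lim) for x in grid])
post = np.exp(logp - logp.max())        # unnormalized posterior
xhi_map = grid[np.argmax(post)]
```

With mostly faint detections and low upper limits, the toy posterior peaks at a substantially neutral IGM, mirroring how non-detections drive the inference toward higher neutral fractions.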
NASA Astrophysics Data System (ADS)
McGarry, Meghan B.
An innovative soft x-ray (SXR) diagnostic has been developed for the Madison Symmetric Torus that provides measurements of tomographic emissivity and electron temperature (Te) via the double-foil technique. Two measurements of electron temperature from SXR emission are available, one from the ratio of the emissivities through thin and thick filters as mapped onto magnetic flux surfaces, and the other directly from the ratio of two foils sharing a single line of sight. The SXR measurements have been benchmarked against Thomson scattering electron temperature during high-current, improved-confinement discharges, and show excellent agreement. The SXR diagnostic has been used to investigate the source of emissive structures seen during high-current improved-confinement discharges. Although the emissivity structures are correlated with the magnetic configuration of the discharges, direct-brightness Te measurements do not typically show a clear Te structure, indicating a general upper limit of ˜ 15--20% on any possible localized increase in Te. In most shots, the flux-surface-reconstructed Te shows no indication of Te structure. However, in one discharge with a very large tearing mode amplitude (15 Gauss), measurements and modeling indicate that the structure has a localized increase of 20-180 eV in Te. The structure cannot be explained by a localized enhancement of electron density. A second case study with a multiple-helicity magnetic spectrum indicates that a ring of enhanced SXR emission at 0.4 normalized radius is caused by an impurity accumulation of up to 58% that of the core region. For the first time, the SXR diagnostic has also been combined with Al11+ impurity measurements to normalize the aluminum contribution to the SXR emission spectrum and demonstrate that the filter thicknesses used for the diagnostic do not pass aluminum line radiation. 
The new SXR Te and tomography diagnostic will continue to provide insight into the relationship between magnetic structures and electron temperature in improved confinement plasmas.
NASA Astrophysics Data System (ADS)
Petit, C.; Le Louarn, M.; Fusco, T.; Madec, P.-Y.
2011-09-01
Various tomographic control solutions have been proposed over the last decades to ensure efficient or even optimal closed-loop correction for tomographic Adaptive Optics (AO) concepts such as Laser Tomographic AO (LTAO) and Multi-Conjugate AO (MCAO). The optimal solution, based on the Linear Quadratic Gaussian (LQG) approach, as well as suboptimal but efficient solutions such as Pseudo-Open Loop Control (POLC), requires multiple Matrix Vector Multiplications (MVMs). Whatever their respective performance, these control solutions thus exhibit a strong increase in on-line complexity, and their implementation may become difficult in demanding cases. Two such cases are of particular interest. First, the system's Real-Time Computer (RTC) architecture and implementation may be derived from past or present solutions that do not support multiple MVMs. This is the case for the Adaptive Optics Facility, whose RTC architecture is derived from the SPARTA platform and inherits its simple single-MVM architecture, which does not accommodate LTAO control solutions, for instance. Second, for future systems such as Extremely Large Telescopes, the number of degrees of freedom is twenty to one hundred times larger than in present systems. Under these conditions, tomographic control solutions can hardly be used in their standard form, and optimized implementations must be considered. Single-MVM tomographic control solutions are a potential answer, and straightforward approaches such as Virtual Deformable Mirrors have already been proposed for LTAO, but with tuning issues. In this paper we investigate the possibility of deriving, from tomographic control solutions such as POLC or LQG, simplified control solutions that ensure a single-MVM architecture and could thus be implemented on current systems or future complex systems. We theoretically derive various solutions and analyze their respective performance on various systems through numerical simulation. 
We discuss the optimization of their performance and stability issues with respect to classic control solutions. We finally discuss off-line computation and implementation constraints.
On the V-Line Radon Transform and Its Imaging Applications
Morvidone, M.; Nguyen, M. K.; Truong, T. T.; Zaidi, H.
2010-01-01
Radon transforms defined on smooth curves are well known and extensively studied in the literature. In this paper, we consider a Radon transform defined on a discontinuous curve formed by a pair of half-lines shaped like the letter V. While the classical two-dimensional Radon transform has served as a workhorse for tomographic transmission and/or emission imaging, we show that this V-line Radon transform is the backbone of scattered radiation imaging in two dimensions. We establish its analytic inverse formula as well as a corresponding filtered back-projection reconstruction procedure. These theoretical results allow the reconstruction of two-dimensional images from Compton-scattered radiation collected on a one-dimensional collimated camera. We illustrate the working principles of this imaging modality by presenting numerical simulation results. PMID:20706545
F-18 fluorodeoxyglucose: Its potential in differentiating between stress fracture and neoplasia
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paul, R.; Ahonen, A.; Virtama, P.
1989-12-01
F-18 fluorodeoxyglucose (FDG) accumulates in regions of enhanced glucose uptake and metabolism such as the brain, heart, and malignant tumors. The clinical usefulness of this positron-emitting radiopharmaceutical is illustrated in a case where the clinical picture and CT indicated a malignant bone lesion in the clavicle. Histologically, a stress fracture was found secondary to chronic strain on the clavicle. On follow-up the lesion's course was benign. Planar imaging with F-18 FDG was performed twice during follow-up, and on both occasions there was no accumulation of radioactivity over the suspicious area, indicating normal glucose consumption. This case demonstrates the differential diagnostic potential of F-18 FDG and shows that clinically useful information may be obtained without a positron emission tomograph.
Hill, K W; Bitter, M L; Scott, S D; Ince-Cushman, A; Reinke, M; Rice, J E; Beiersdorfer, P; Gu, M-F; Lee, S G; Broennimann, Ch; Eikenberry, E F
2008-10-01
A new spatially resolving x-ray crystal spectrometer capable of measuring continuous spatial profiles of high-resolution spectra (λ/Δλ > 6000) of He-like and H-like Ar Kα lines with good spatial (approximately 1 cm) and temporal (approximately 10 ms) resolution has been installed on the Alcator C-Mod tokamak. Two spherically bent crystals image the spectra onto four two-dimensional Pilatus II pixel detectors. Tomographic inversion enables inference of the local line emissivity, ion temperature (T_i), and toroidal plasma rotation velocity (v_φ) from the line Doppler widths and shifts. The data analysis techniques, T_i and v_φ profiles, analysis of fusion-neutron background, and predictions of performance on other tokamaks, including ITER, will be presented.
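The inference of T_i and v_φ from Doppler widths and shifts rests on two standard relations: the thermal Doppler broadening of a line at wavelength λ0 has Gaussian standard deviation σ_λ/λ0 = sqrt(k T_i / (m c²)), and a bulk velocity v shifts the line centroid by Δλ/λ0 = v/c. A minimal sketch follows (not the spectrometer's actual analysis code; the example numbers are only order-of-magnitude plausible for He-like Ar):

```python
def ion_temperature_eV(lambda0, sigma_lambda, mass_amu):
    """Ion temperature from the thermal Doppler width of a line.

    T_i = m c^2 (sigma_lambda / lambda0)^2, with sigma_lambda the Gaussian
    standard deviation of the line profile (same units as lambda0).
    """
    mc2_eV = mass_amu * 931.494e6  # ion rest energy in eV
    return mc2_eV * (sigma_lambda / lambda0) ** 2

def rotation_velocity_kms(lambda0, shift):
    """Bulk flow velocity from the Doppler shift of the line centroid."""
    c_kms = 2.998e5  # speed of light, km/s
    return c_kms * shift / lambda0

# He-like Ar line near 3.9494 Angstrom: a ~1 mA Gaussian width corresponds
# to a keV-range ion temperature; a ~0.5 mA shift to tens of km/s rotation.
ti = ion_temperature_eV(3.9494, 0.001, 40.0)
v = rotation_velocity_kms(3.9494, 0.0005)
```

Note that the width convention matters: using the FWHM or 1/e half-width instead of the Gaussian standard deviation changes the formula by a constant factor.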
Assessing the resolution-dependent utility of tomograms for geostatistics
Day-Lewis, F. D.; Lane, J.W.
2004-01-01
Geophysical tomograms are used increasingly as auxiliary data for geostatistical modeling of aquifer and reservoir properties. The correlation between tomographic estimates and hydrogeologic properties is commonly based on laboratory measurements, co-located measurements at boreholes, or petrophysical models. The inferred correlation is assumed uniform throughout the interwell region; however, tomographic resolution varies spatially due to acquisition geometry, regularization, data error, and the physics underlying the geophysical measurements. Blurring and inversion artifacts are expected in regions traversed by few or only low-angle raypaths. In the context of radar traveltime tomography, we derive analytical models for (1) the variance of tomographic estimates, (2) the spatially variable correlation with a hydrologic parameter of interest, and (3) the spatial covariance of tomographic estimates. Synthetic examples demonstrate that tomograms of qualitative value may have limited utility for geostatistics; moreover, the imprint of regularization may preclude inference of meaningful spatial statistics from tomograms.
NASA Astrophysics Data System (ADS)
Flynn, Brendan P.; D'Souza, Alisha V.; Kanick, Stephen C.; Maytin, Edward; Hasan, Tayyaba; Pogue, Brian W.
2013-03-01
Aminolevulinic acid (ALA)-induced Protoporphyrin IX (PpIX)-based photodynamic therapy (PDT) is an effective treatment for skin cancers including basal cell carcinoma (BCC). Topically applied ALA promotes PpIX production preferentially in tumors, and many strategies have been developed to increase PpIX production; however, PpIX distribution and PDT treatment efficacy at depths > 1 mm are not fully understood. While surface imaging techniques provide useful diagnosis, dosimetry, and efficacy information for superficial tumors, these methods cannot interrogate deeper tumors to provide in situ insight into spatial PpIX distributions. We have developed an ultrasound-guided, white-light-informed, tomographic spectroscopy system for the spatial measurement of subsurface PpIX. Detailed imaging system specifications, methodology, and optical-phantom-based characterization will be presented separately. Here we evaluate preliminary in vivo results using both full tomographic reconstruction and plots of individual tomographic source-detector pair data against US images.
NASA Astrophysics Data System (ADS)
Silva, B. Marta; Zaroubi, Saleem; Kooistra, Robin; Cooray, Asantha
2018-04-01
The Hα line emission is an important probe of a number of fundamental quantities in galaxies, including their number density, star formation rate (SFR), and overall gas content. A new generation of low-resolution intensity mapping (IM) probes, e.g. SPHEREx and CDIM, will observe galaxies in Hα emission over a large fraction of the sky from the local Universe out to redshifts of z ∼ 6 and 10, respectively. Hα will also be the target line for observations by the high-resolution Euclid and WFIRST instruments in the z ∼ 0.7-2 redshift range. In this paper, we estimate the intensity and power spectra of the Hα line in the z ∼ 0-5 redshift range using observed line luminosity functions (LFs) where possible, and simulations otherwise. We estimate the significance of our predictions by accounting for the modelling uncertainties (e.g. SFR, extinction) and observational contamination. We find that IM surveys can make a statistical detection of the full Hα emission between z ∼ 0.8 and 5. Moreover, we find that the high frequency resolution and sensitivity of the planned CDIM surveys allow for the separation of Hα emission from several interloping lines. We explore ways to use the combination of these line intensities to probe galaxy properties. As expected, our study indicates that galaxy surveys will only detect bright galaxies, which contribute up to a few per cent of the overall Hα intensity. However, these surveys will provide important constraints on the high end of the Hα LF and put strong constraints on the active galactic nucleus LF.
Emissions of carbon tetrachloride from Europe
NASA Astrophysics Data System (ADS)
Graziosi, Francesco; Arduini, Jgor; Bonasoni, Paolo; Furlani, Francesco; Giostra, Umberto; Manning, Alistair J.; McCulloch, Archie; O'Doherty, Simon; Simmonds, Peter G.; Reimann, Stefan; Vollmer, Martin K.; Maione, Michela
2016-10-01
Carbon tetrachloride (CCl4) is a long-lived radiatively active compound with the ability to destroy stratospheric ozone. Due to its inclusion in the Montreal Protocol on Substances that Deplete the Ozone Layer (MP), the last two decades have seen a sharp decrease in its large-scale emissive use with a consequent decline in its atmospheric mole fractions. However, the MP restrictions do not apply to the use of carbon tetrachloride as feedstock for the production of other chemicals, implying the risk of fugitive emissions from the industry sector. The occurrence of such unintended emissions is suggested by a significant discrepancy between global emissions as derived from reported production and feedstock usage (bottom-up emissions) and those based on atmospheric observations (top-down emissions). In order to better constrain the atmospheric budget of carbon tetrachloride, several studies based on a combination of atmospheric observations and inverse modelling have been conducted in recent years in various regions of the world. This study is focused on the European scale and based on long-term high-frequency observations at three European sites, combined with a Bayesian inversion methodology. We estimated that average European emissions for 2006-2014 were 2.2 (± 0.8) Gg yr-1, with an average decreasing trend of 6.9 % per year. Our analysis identified France as the main source of emissions over the whole study period, with an average contribution to total European emissions of approximately 26 %. The inversion also localised emission "hot spots" in the domain, with major source areas in southern France, central England (UK), and Benelux (Belgium, the Netherlands, Luxembourg), where most industrial-scale production of basic organic chemicals is located. According to our results, European emissions correspond, on average, to 4.0 % of global emissions for 2006-2012. 
Together with other regional studies, our results allow a better constraint of the global budget of carbon tetrachloride and a better quantification of the gap between top-down and bottom-up estimates.
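The Bayesian inversion of observations into regional emissions can be sketched in its simplest linear-Gaussian form: observed mole-fraction enhancements y relate to emissions x through a source-receptor matrix H supplied by a transport model, y = Hx + noise, and with Gaussian prior and observation errors the posterior is analytic. The matrix and all numbers below are synthetic illustrations; the study's actual inversion is considerably more elaborate.

```python
import numpy as np

def bayesian_emission_inversion(H, y, x_prior, P_prior, R):
    """Posterior mean and covariance of emissions for y = H x + noise,
    with Gaussian prior N(x_prior, P_prior) and observation error N(0, R)."""
    S = H @ P_prior @ H.T + R             # innovation covariance
    K = P_prior @ H.T @ np.linalg.inv(S)  # gain matrix
    x_post = x_prior + K @ (y - H @ x_prior)
    P_post = P_prior - K @ H @ P_prior
    return x_post, P_post

# Two source regions observed at three sites (all numbers synthetic)
H = np.array([[0.8, 0.1],
              [0.3, 0.6],
              [0.1, 0.9]])
x_true = np.array([2.0, 1.0])  # "true" emissions, Gg/yr
y = H @ x_true                 # noise-free synthetic observations

x_post, P_post = bayesian_emission_inversion(
    H, y, x_prior=np.array([1.0, 1.0]),
    P_prior=np.eye(2), R=0.01 * np.eye(3))
```

With small observation error relative to the prior, the posterior mean pulls strongly from the prior toward the emissions that explain the data, while the posterior covariance quantifies the remaining uncertainty per region.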
NASA Astrophysics Data System (ADS)
Kostencka, Julianna; Kozacki, Tomasz; Hennelly, Bryan; Sheridan, John T.
2017-06-01
Holographic tomography (HT) allows noninvasive, quantitative 3D imaging of transparent microobjects, such as living biological cells and fiber-optic elements. The technique is based on the acquisition of multiple scattered fields for various sample perspectives using digital holographic microscopy. The captured data is then processed with one of the tomographic reconstruction algorithms, which enables 3D reconstruction of the refractive index distribution. In our recent works we addressed the issue of the spatially variant accuracy of HT reconstructions, which results from the insufficient model of diffraction applied in the widely used tomographic reconstruction algorithms based on the Rytov approximation. In the present study, we continue investigating the spatially variant properties of HT imaging; however, we now focus on the limited spatial size of the holograms as a source of this problem. Using the Wigner distribution representation and the Ewald sphere approach, we show that the limited size of the holograms results in decreased quality of tomographic imaging in off-center regions of the HT reconstructions. This is because the finite detector extent becomes a limiting aperture that prevents acquisition of full information about the diffracted fields coming from the out-of-focus structures of a sample. The incompleteness of the data results in an effective truncation of the tomographic transfer function for the off-center regions of the tomographic image. In this paper, the described effect is quantitatively characterized for three types of tomographic systems: configurations with 1) object rotation, 2) scanning of the illumination direction, and 3) a hybrid HT solution combining both approaches.
Search for gravitational waves associated with the August 2006 timing glitch of the Vela pulsar
NASA Astrophysics Data System (ADS)
Abadie, J.; Abbott, B. P.; Abbott, R.; Adhikari, R.; Ajith, P.; Allen, B.; Allen, G.; Amador Ceron, E.; Amin, R. S.; Anderson, S. B.; Anderson, W. G.; Arain, M. A.; Araya, M.; Aso, Y.; Aston, S.; Aufmuth, P.; Aulbert, C.; Babak, S.; Baker, P.; Ballmer, S.; Barker, D.; Barr, B.; Barriga, P.; Barsotti, L.; Barton, M. A.; Bartos, I.; Bassiri, R.; Bastarrika, M.; Behnke, B.; Benacquista, M.; Bennett, M. F.; Betzwieser, J.; Beyersdorf, P. T.; Bilenko, I. A.; Billingsley, G.; Biswas, R.; Black, E.; Blackburn, J. K.; Blackburn, L.; Blair, D.; Bland, B.; Bock, O.; Bodiya, T. P.; Bondarescu, R.; Bork, R.; Born, M.; Bose, S.; Brady, P. R.; Braginsky, V. B.; Brau, J. E.; Breyer, J.; Bridges, D. O.; Brinkmann, M.; Britzger, M.; Brooks, A. F.; Brown, D. A.; Bullington, A.; Buonanno, A.; Burmeister, O.; Byer, R. L.; Cadonati, L.; Cain, J.; Camp, J. B.; Cannizzo, J.; Cannon, K. C.; Cao, J.; Capano, C.; Cardenas, L.; Caudill, S.; Cavaglià, M.; Cepeda, C.; Chalermsongsak, T.; Chalkley, E.; Charlton, P.; Chatterji, S.; Chelkowski, S.; Chen, Y.; Christensen, N.; Chua, S. S. Y.; Chung, C. T. Y.; Clark, D.; Clark, J.; Clayton, J. H.; Conte, R.; Cook, D.; Corbitt, T. R. C.; Cornish, N.; Coward, D.; Coyne, D. C.; Creighton, J. D. E.; Creighton, T. D.; Cruise, A. M.; Culter, R. M.; Cumming, A.; Cunningham, L.; Dahl, K.; Danilishin, S. L.; Danzmann, K.; Daudert, B.; Davies, G.; Daw, E. J.; Dayanga, T.; Debra, D.; Degallaix, J.; Dergachev, V.; Desalvo, R.; Dhurandhar, S.; Díaz, M.; Donovan, F.; Dooley, K. L.; Doomes, E. E.; Drever, R. W. P.; Driggers, J.; Dueck, J.; Duke, I.; Dumas, J.-C.; Edgar, M.; Edwards, M.; Effler, A.; Ehrens, P.; Etzel, T.; Evans, M.; Evans, T.; Fairhurst, S.; Faltas, Y.; Fan, Y.; Fazi, D.; Fehrmann, H.; Finn, L. S.; Flasch, K.; Foley, S.; Forrest, C.; Fotopoulos, N.; Frede, M.; Frei, M.; Frei, Z.; Freise, A.; Frey, R.; Fricke, T. T.; Friedrich, D.; Fritschel, P.; Frolov, V. V.; Fulda, P.; Fyffe, M.; Garofoli, J. A.; Ghosh, S.; Giaime, J. 
A.; Giampanis, S.; Giardina, K. D.; Goetz, E.; Goggin, L. M.; González, G.; Goßler, S.; Grant, A.; Gras, S.; Gray, C.; Greenhalgh, R. J. S.; Gretarsson, A. M.; Grosso, R.; Grote, H.; Grunewald, S.; Gustafson, E. K.; Gustafson, R.; Hage, B.; Hallam, J. M.; Hammer, D.; Hammond, G. D.; Hanna, C.; Hanson, J.; Harms, J.; Harry, G. M.; Harry, I. W.; Harstad, E. D.; Haughian, K.; Hayama, K.; Hayler, T.; Heefner, J.; Heng, I. S.; Heptonstall, A.; Hewitson, M.; Hild, S.; Hirose, E.; Hoak, D.; Hodge, K. A.; Holt, K.; Hosken, D. J.; Hough, J.; Howell, E.; Hoyland, D.; Hughey, B.; Husa, S.; Huttner, S. H.; Ingram, D. R.; Isogai, T.; Ivanov, A.; Johnson, W. W.; Jones, D. I.; Jones, G.; Jones, R.; Ju, L.; Kalmus, P.; Kalogera, V.; Kandhasamy, S.; Kanner, J.; Katsavounidis, E.; Kawabe, K.; Kawamura, S.; Kawazoe, F.; Kells, W.; Keppel, D. G.; Khalaidovski, A.; Khalili, F. Y.; Khan, R.; Khazanov, E.; Kim, H.; King, P. J.; Kissel, J. S.; Klimenko, S.; Kokeyama, K.; Kondrashov, V.; Kopparapu, R.; Koranda, S.; Kozak, D.; Kringel, V.; Krishnan, B.; Kuehn, G.; Kullman, J.; Kumar, R.; Kwee, P.; Lam, P. K.; Landry, M.; Lang, M.; Lantz, B.; Lastzka, N.; Lazzarini, A.; Leaci, P.; Lei, M.; Leindecker, N.; Leonor, I.; Lin, H.; Lindquist, P. E.; Littenberg, T. B.; Lockerbie, N. A.; Lodhia, D.; Lormand, M.; Lu, P.; Lubiński, M.; Lucianetti, A.; Lück, H.; Lundgren, A.; Machenschalk, B.; Macinnis, M.; Mageswaran, M.; Mailand, K.; Mak, C.; Mandel, I.; Mandic, V.; Márka, S.; Márka, Z.; Markosyan, A.; Markowitz, J.; Maros, E.; Martin, I. W.; Martin, R. M.; Marx, J. N.; Mason, K.; Matichard, F.; Matone, L.; Matzner, R. A.; Mavalvala, N.; McCarthy, R.; McClelland, D. E.; McGuire, S. C.; McIntyre, G.; McKechan, D. J. A.; Mehmet, M.; Melatos, A.; Melissinos, A. C.; Mendell, G.; Menéndez, D. F.; Mercer, R. A.; Merrill, L.; Meshkov, S.; Messenger, C.; Meyer, M. S.; Miao, H.; Miller, J.; Mino, Y.; Mitra, S.; Mitrofanov, V. P.; Mitselmakher, G.; Mittleman, R.; Miyakawa, O.; Moe, B.; Mohanty, S. 
D.; Mohapatra, S. R. P.; Moreno, G.; Mors, K.; Mossavi, K.; Mowlowry, C.; Mueller, G.; Müller-Ebhardt, H.; Mukherjee, S.; Mullavey, A.; Munch, J.; Murray, P. G.; Nash, T.; Nawrodt, R.; Nelson, J.; Newton, G.; Nishida, E.; Nishizawa, A.; O'Dell, J.; O'Reilly, B.; O'Shaughnessy, R.; Ochsner, E.; Ogin, G. H.; Oldenburg, R.; Ottaway, D. J.; Ottens, R. S.; Overmier, H.; Owen, B. J.; Page, A.; Pan, Y.; Pankow, C.; Papa, M. A.; Patel, P.; Pathak, D.; Pedraza, M.; Pekowsky, L.; Penn, S.; Peralta, C.; Perreca, A.; Pickenpack, M.; Pinto, I. M.; Pitkin, M.; Pletsch, H. J.; Plissi, M. V.; Postiglione, F.; Principe, M.; Prix, R.; Prokhorov, L.; Puncken, O.; Quetschke, V.; Raab, F. J.; Rabeling, D. S.; Radkins, H.; Raffai, P.; Raics, Z.; Rakhmanov, M.; Raymond, V.; Reed, C. M.; Reed, T.; Rehbein, H.; Reid, S.; Reitze, D. H.; Riesen, R.; Riles, K.; Roberts, P.; Robertson, N. A.; Robinson, C.; Robinson, E. L.; Roddy, S.; Röver, C.; Rollins, J.; Romano, J. D.; Romie, J. H.; Rowan, S.; Rüdiger, A.; Ryan, K.; Sakata, S.; Sammut, L.; Sancho de La Jordana, L.; Sandberg, V.; Sannibale, V.; Santamaría, L.; Santostasi, G.; Saraf, S.; Sarin, P.; Sathyaprakash, B. S.; Sato, S.; Satterthwaite, M.; Saulson, P. R.; Savage, R.; Schilling, R.; Schnabel, R.; Schofield, R.; Schulz, B.; Schutz, B. F.; Schwinberg, P.; Scott, J.; Scott, S. M.; Searle, A. C.; Seifert, F.; Sellers, D.; Sengupta, A. S.; Sergeev, A.; Shapiro, B.; Shawhan, P.; Shoemaker, D. H.; Sibley, A.; Siemens, X.; Sigg, D.; Sintes, A. M.; Skelton, G.; Slagmolen, B. J. J.; Slutsky, J.; Smith, J. R.; Smith, M. R.; Smith, N. D.; Somiya, K.; Sorazu, B.; Speirits, F.; Stein, A. J.; Stein, L. C.; Steplewski, S.; Stochino, A.; Stone, R.; Strain, K. A.; Strigin, S.; Stroeer, A.; Stuver, A. L.; Summerscales, T. Z.; Sung, M.; Susmithan, S.; Sutton, P. J.; Szokoly, G. P.; Talukder, D.; Tanner, D. B.; Tarabrin, S. P.; Taylor, J. R.; Taylor, R.; Thorne, K. A.; Thorne, K. S.; Thüring, A.; Titsler, C.; Tokmakov, K. V.; Torres, C.; Torrie, C. 
I.; Traylor, G.; Trias, M.; Turner, L.; Ugolini, D.; Urbanek, K.; Vahlbruch, H.; Vallisneri, M.; van den Broeck, C.; van der Sluys, M. V.; van Veggel, A. A.; Vass, S.; Vaulin, R.; Vecchio, A.; Veitch, J.; Veitch, P. J.; Veltkamp, C.; Villar, A.; Vorvick, C.; Vyachanin, S. P.; Waldman, S. J.; Wallace, L.; Wanner, A.; Ward, R. L.; Wei, P.; Weinert, M.; Weinstein, A. J.; Weiss, R.; Wen, L.; Wen, S.; Wessels, P.; West, M.; Westphal, T.; Wette, K.; Whelan, J. T.; Whitcomb, S. E.; Whiting, B. F.; Wilkinson, C.; Willems, P. A.; Williams, H. R.; Williams, L.; Willke, B.; Wilmut, I.; Winkelmann, L.; Winkler, W.; Wipf, C. C.; Wiseman, A. G.; Woan, G.; Wooley, R.; Worden, J.; Yakushin, I.; Yamamoto, H.; Yamamoto, K.; Yeaton-Massey, D.; Yoshida, S.; Zanolin, M.; Zhang, L.; Zhang, Z.; Zhao, C.; Zotov, N.; Zucker, M. E.; Zweizig, J.; Buchner, S.
2011-02-01
The physical mechanisms responsible for pulsar timing glitches are thought to excite quasinormal mode oscillations in their parent neutron star that couple to gravitational-wave emission. In August 2006, a timing glitch was observed in the radio emission of PSR B0833-45, the Vela pulsar. At the time of the glitch, the two colocated Hanford gravitational-wave detectors of the Laser Interferometer Gravitational-wave Observatory (LIGO) were operational and taking data as part of the fifth LIGO science run (S5). We present the first direct search for the gravitational-wave emission associated with oscillations of the fundamental quadrupole mode excited by a pulsar timing glitch. No gravitational-wave detection candidate was found. We place Bayesian 90% confidence upper limits of 6.3×10^-21 to 1.4×10^-20 on the peak intrinsic strain amplitude of gravitational-wave ring-down signals, depending on which spherical harmonic mode is excited. The corresponding range of energy upper limits is 5.0×10^44 to 1.3×10^45 erg.
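A Bayesian 90% confidence upper limit of the kind quoted above is, in its simplest form, the amplitude below which 90% of the posterior probability lies. A toy sketch follows, with an assumed half-normal posterior for a non-negative strain amplitude; the noise scale is arbitrary for illustration and is not LIGO's actual sensitivity.

```python
import numpy as np

def bayesian_upper_limit(posterior_samples, credibility=0.90):
    """Credible upper limit on a non-negative amplitude: the value below
    which the requested fraction of the posterior probability lies."""
    return np.quantile(posterior_samples, credibility)

# Toy posterior: no signal present, so the amplitude posterior is (roughly)
# a half-normal set by the detector noise level sigma.
rng = np.random.default_rng(0)
sigma = 5e-21  # assumed strain-noise scale, illustrative only
samples = np.abs(rng.normal(0.0, sigma, size=100_000))
h90 = bayesian_upper_limit(samples)
```

For a half-normal posterior the 90% limit sits at about 1.64 times the noise scale, which is why noise-limited upper limits track detector sensitivity so closely.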
Search for Gravitational Waves Associated with the August 2006 Timing Glitch of the Vela Pulsar
NASA Technical Reports Server (NTRS)
Camp, J. B.; Cannizzo, J.; Stroeer, A.
2011-01-01
The physical mechanisms responsible for pulsar timing glitches are thought to excite quasinormal mode oscillations in their parent neutron star that couple to gravitational-wave emission. In August 2006, a timing glitch was observed in the radio emission of PSR B0833-45, the Vela pulsar. At the time of the glitch, the two colocated Hanford gravitational-wave detectors of the Laser Interferometer Gravitational-wave Observatory (LIGO) were operational and taking data as part of the fifth LIGO science run (S5). We present the first direct search for the gravitational-wave emission associated with oscillations of the fundamental quadrupole mode excited by a pulsar timing glitch. No gravitational-wave detection candidate was found. We place Bayesian 90% confidence upper limits of 6.3 x 10(exp -21) to 1.4 x 10(exp -20) on the peak intrinsic strain amplitude of gravitational-wave ring-down signals, depending on which spherical harmonic mode is excited. The corresponding range of energy upper limits is 5.0 x 10(exp 44) to 1.3 x 10(exp 45) erg.
Beating the Spin-down Limit on Gravitational Wave Emission from the Vela Pulsar
NASA Astrophysics Data System (ADS)
Abadie, J.; Abbott, B. P.; Abbott, R.; Abernathy, M.; Accadia, T.; Acernese, F.; Adams, C.; Adhikari, R.; Affeldt, C.; Allen, B.; Allen, G. S.; Amador Ceron, E.; Amariutei, D.; Amin, R. S.; Anderson, S. B.; Anderson, W. G.; Antonucci, F.; Arai, K.; Arain, M. A.; Araya, M. C.; Aston, S. M.; Astone, P.; Atkinson, D.; Aufmuth, P.; Aulbert, C.; Aylott, B. E.; Babak, S.; Baker, P.; Ballardin, G.; Ballmer, S.; Barker, D.; Barnum, S.; Barone, F.; Barr, B.; Barriga, P.; Barsotti, L.; Barsuglia, M.; Barton, M. A.; Bartos, I.; Bassiri, R.; Bastarrika, M.; Basti, A.; Bauchrowitz, J.; Bauer, Th. S.; Behnke, B.; Bejger, M.; Beker, M. G.; Bell, A. S.; Belletoile, A.; Belopolski, I.; Benacquista, M.; Bertolini, A.; Betzwieser, J.; Beveridge, N.; Beyersdorf, P. T.; Bilenko, I. A.; Billingsley, G.; Birch, J.; Birindelli, S.; Biswas, R.; Bitossi, M.; Bizouard, M. A.; Black, E.; Blackburn, J. K.; Blackburn, L.; Blair, D.; Bland, B.; Blom, M.; Bock, O.; Bodiya, T. P.; Bogan, C.; Bondarescu, R.; Bondu, F.; Bonelli, L.; Bonnand, R.; Bork, R.; Born, M.; Boschi, V.; Bose, S.; Bosi, L.; Bouhou, B.; Boyle, M.; Braccini, S.; Bradaschia, C.; Brady, P. R.; Braginsky, V. B.; Brau, J. E.; Breyer, J.; Bridges, D. O.; Brillet, A.; Brinkmann, M.; Brisson, V.; Britzger, M.; Brooks, A. F.; Brown, D. A.; Brummit, A.; Budzyński, R.; Bulik, T.; Bulten, H. J.; Buonanno, A.; Burguet-Castell, J.; Burmeister, O.; Buskulic, D.; Buy, C.; Byer, R. L.; Cadonati, L.; Cagnoli, G.; Cain, J.; Calloni, E.; Camp, J. B.; Campagna, E.; Campsie, P.; Cannizzo, J.; Cannon, K.; Canuel, B.; Cao, J.; Capano, C.; Carbognani, F.; Caride, S.; Caudill, S.; Cavaglià, M.; Cavalier, F.; Cavalieri, R.; Cella, G.; Cepeda, C.; Cesarini, E.; Chaibi, O.; Chalermsongsak, T.; Chalkley, E.; Charlton, P.; Chassande-Mottin, E.; Chelkowski, S.; Chen, Y.; Chincarini, A.; Christensen, N.; Chua, S. S. Y.; Chung, C. T. Y.; Chung, S.; Clara, F.; Clark, D.; Clark, J.; Clayton, J. H.; Cleva, F.; Coccia, E.; Colacino, C. 
N.; Colas, J.; Colla, A.; Colombini, M.; Conte, R.; Cook, D.; Corbitt, T. R.; Cornish, N.; Corsi, A.; Costa, C. A.; Coughlin, M.; Coulon, J.-P.; Coward, D. M.; Coyne, D. C.; Creighton, J. D. E.; Creighton, T. D.; Cruise, A. M.; Culter, R. M.; Cumming, A.; Cunningham, L.; Cuoco, E.; Dahl, K.; Danilishin, S. L.; Dannenberg, R.; D'Antonio, S.; Danzmann, K.; Das, K.; Dattilo, V.; Daudert, B.; Daveloza, H.; Davier, M.; Davies, G.; Daw, E. J.; Day, R.; Dayanga, T.; De Rosa, R.; DeBra, D.; Debreczeni, G.; Degallaix, J.; del Prete, M.; Dent, T.; Dergachev, V.; DeRosa, R.; DeSalvo, R.; Dhurandhar, S.; Di Fiore, L.; Di Lieto, A.; Di Palma, I.; Emilio, M. Di Paolo; Di Virgilio, A.; Díaz, M.; Dietz, A.; Donovan, F.; Dooley, K. L.; Dorsher, S.; Douglas, E. S. D.; Drago, M.; Drever, R. W. P.; Driggers, J. C.; Dumas, J.-C.; Dwyer, S.; Eberle, T.; Edgar, M.; Edwards, M.; Effler, A.; Ehrens, P.; Engel, R.; Etzel, T.; Evans, M.; Evans, T.; Factourovich, M.; Fafone, V.; Fairhurst, S.; Fan, Y.; Farr, B. F.; Fazi, D.; Fehrmann, H.; Feldbaum, D.; Ferrante, I.; Fidecaro, F.; Finn, L. S.; Fiori, I.; Flaminio, R.; Flanigan, M.; Foley, S.; Forsi, E.; Forte, L. A.; Fotopoulos, N.; Fournier, J.-D.; Franc, J.; Frasca, S.; Frasconi, F.; Frede, M.; Frei, M.; Frei, Z.; Freise, A.; Frey, R.; Fricke, T. T.; Friedrich, D.; Fritschel, P.; Frolov, V. V.; Fulda, P.; Fyffe, M.; Galimberti, M.; Gammaitoni, L.; Garcia, J.; Garofoli, J. A.; Garufi, F.; Gáspár, M. E.; Gemme, G.; Genin, E.; Gennai, A.; Ghosh, S.; Giaime, J. A.; Giampanis, S.; Giardina, K. D.; Giazotto, A.; Gill, C.; Goetz, E.; Goggin, L. M.; González, G.; Gorodetsky, M. L.; Goßler, S.; Gouaty, R.; Graef, C.; Granata, M.; Grant, A.; Gras, S.; Gray, C.; Greenhalgh, R. J. S.; Gretarsson, A. M.; Greverie, C.; Grosso, R.; Grote, H.; Grunewald, S.; Guidi, G. M.; Guido, C.; Gupta, R.; Gustafson, E. K.; Gustafson, R.; Hage, B.; Hallam, J. M.; Hammer, D.; Hammond, G.; Hanks, J.; Hanna, C.; Hanson, J.; Harms, J.; Harry, G. M.; Harry, I. 
W.; Harstad, E. D.; Hartman, M. T.; Haughian, K.; Hayama, K.; Hayau, J.-F.; Hayler, T.; Heefner, J.; Heitmann, H.; Hello, P.; Hendry, M. A.; Heng, I. S.; Heptonstall, A. W.; Herrera, V.; Hewitson, M.; Hild, S.; Hoak, D.; Hodge, K. A.; Holt, K.; Hong, T.; Hooper, S.; Hosken, D. J.; Hough, J.; Howell, E. J.; Huet, D.; Hughey, B.; Husa, S.; Huttner, S. H.; Ingram, D. R.; Inta, R.; Isogai, T.; Ivanov, A.; Jaranowski, P.; Johnson, W. W.; Jones, D. I.; Jones, G.; Jones, R.; Ju, L.; Kalmus, P.; Kalogera, V.; Kandhasamy, S.; Kanner, J. B.; Katsavounidis, E.; Katzman, W.; Kawabe, K.; Kawamura, S.; Kawazoe, F.; Kells, W.; Kelner, M.; Keppel, D. G.; Khalaidovski, A.; Khalili, F. Y.; Khazanov, E. A.; Kim, H.; Kim, N.; King, P. J.; Kinzel, D. L.; Kissel, J. S.; Klimenko, S.; Kondrashov, V.; Kopparapu, R.; Koranda, S.; Korth, W. Z.; Kowalska, I.; Kozak, D.; Kringel, V.; Krishnamurthy, S.; Krishnan, B.; Królak, A.; Kuehn, G.; Kumar, R.; Kwee, P.; Landry, M.; Lantz, B.; Lastzka, N.; Lazzarini, A.; Leaci, P.; Leong, J.; Leonor, I.; Leroy, N.; Letendre, N.; Li, J.; Li, T. G. F.; Liguori, N.; Lindquist, P. E.; Lockerbie, N. A.; Lodhia, D.; Lorenzini, M.; Loriette, V.; Lormand, M.; Losurdo, G.; Lu, P.; Luan, J.; Lubinski, M.; Lück, H.; Lundgren, A. P.; Macdonald, E.; Machenschalk, B.; MacInnis, M.; Mageswaran, M.; Mailand, K.; Majorana, E.; Maksimovic, I.; Man, N.; Mandel, I.; Mandic, V.; Mantovani, M.; Marandi, A.; Marchesoni, F.; Marion, F.; Márka, S.; Márka, Z.; Maros, E.; Marque, J.; Martelli, F.; Martin, I. W.; Martin, R. M.; Marx, J. N.; Mason, K.; Masserot, A.; Matichard, F.; Matone, L.; Matzner, R. A.; Mavalvala, N.; McCarthy, R.; McClelland, D. E.; McGuire, S. C.; McIntyre, G.; McKechan, D. J. A.; Meadors, G.; Mehmet, M.; Meier, T.; Melatos, A.; Melissinos, A. C.; Mendell, G.; Mercer, R. A.; Merill, L.; Meshkov, S.; Messenger, C.; Meyer, M. S.; Miao, H.; Michel, C.; Milano, L.; Miller, J.; Minenkov, Y.; Mino, Y.; Mitrofanov, V. 
P.; Mitselmakher, G.; Mittleman, R.; Miyakawa, O.; Moe, B.; Moesta, P.; Mohan, M.; Mohanty, S. D.; Mohapatra, S. R. P.; Moraru, D.; Moreno, G.; Morgado, N.; Morgia, A.; Mosca, S.; Moscatelli, V.; Mossavi, K.; Mours, B.; Mow-Lowry, C. M.; Mueller, G.; Mukherjee, S.; Mullavey, A.; Müller-Ebhardt, H.; Munch, J.; Murray, P. G.; Nash, T.; Nawrodt, R.; Nelson, J.; Neri, I.; Newton, G.; Nishida, E.; Nishizawa, A.; Nocera, F.; Nolting, D.; Ochsner, E.; O'Dell, J.; Ogin, G. H.; Oldenburg, R. G.; O'Reilly, B.; O'Shaughnessy, R.; Osthelder, C.; Ott, C. D.; Ottaway, D. J.; Ottens, R. S.; Overmier, H.; Owen, B. J.; Page, A.; Pagliaroli, G.; Palladino, L.; Palomba, C.; Pan, Y.; Pankow, C.; Paoletti, F.; Papa, M. A.; Parameswaran, A.; Pardi, S.; Parisi, M.; Pasqualetti, A.; Passaquieti, R.; Passuello, D.; Patel, P.; Pathak, D.; Pedraza, M.; Pekowsky, L.; Penn, S.; Peralta, C.; Perreca, A.; Persichetti, G.; Phelps, M.; Pichot, M.; Pickenpack, M.; Piergiovanni, F.; Pietka, M.; Pinard, L.; Pinto, I. M.; Pitkin, M.; Pletsch, H. J.; Plissi, M. V.; Podkaminer, J.; Poggiani, R.; Pöld, J.; Postiglione, F.; Prato, M.; Predoi, V.; Price, L. R.; Prijatelj, M.; Principe, M.; Privitera, S.; Prix, R.; Prodi, G. A.; Prokhorov, L.; Puncken, O.; Punturo, M.; Puppo, P.; Quetschke, V.; Raab, F. J.; Rabeling, D. S.; Rácz, I.; Radkins, H.; Raffai, P.; Rakhmanov, M.; Ramet, C. R.; Rankins, B.; Rapagnani, P.; Raymond, V.; Re, V.; Redwine, K.; Reed, C. M.; Reed, T.; Regimbau, T.; Reid, S.; Reitze, D. H.; Ricci, F.; Riesen, R.; Riles, K.; Roberts, P.; Robertson, N. A.; Robinet, F.; Robinson, C.; Robinson, E. L.; Rocchi, A.; Roddy, S.; Rolland, L.; Rollins, J.; Romano, J. D.; Romano, R.; Romie, J. H.; Rosińska, D.; Röver, C.; Rowan, S.; Rüdiger, A.; Ruggi, P.; Ryan, K.; Sakata, S.; Sakosky, M.; Salemi, F.; Salit, M.; Sammut, L.; Sancho de la Jordana, L.; Sandberg, V.; Sannibale, V.; Santamaría, L.; Santiago-Prieto, I.; Santostasi, G.; Saraf, S.; Sassolas, B.; Sathyaprakash, B. 
S.; Sato, S.; Satterthwaite, M.; Saulson, P. R.; Savage, R.; Schilling, R.; Schlamminger, S.; Schnabel, R.; Schofield, R. M. S.; Schulz, B.; Schutz, B. F.; Schwinberg, P.; Scott, J.; Scott, S. M.; Searle, A. C.; Seifert, F.; Sellers, D.; Sengupta, A. S.; Sentenac, D.; Sergeev, A.; Shaddock, D. A.; Shaltev, M.; Shapiro, B.; Shawhan, P.; Shihan Weerathunga, T.; Shoemaker, D. H.; Sibley, A.; Siemens, X.; Sigg, D.; Singer, A.; Singer, L.; Sintes, A. M.; Skelton, G.; Slagmolen, B. J. J.; Slutsky, J.; Smith, J. R.; Smith, M. R.; Smith, N. D.; Smith, R.; Somiya, K.; Sorazu, B.; Soto, J.; Speirits, F. C.; Sperandio, L.; Stefszky, M.; Stein, A. J.; Steinlechner, J.; Steinlechner, S.; Steplewski, S.; Stochino, A.; Stone, R.; Strain, K. A.; Strigin, S.; Stroeer, A. S.; Sturani, R.; Stuver, A. L.; Summerscales, T. Z.; Sung, M.; Susmithan, S.; Sutton, P. J.; Swinkels, B.; Szokoly, G. P.; Tacca, M.; Talukder, D.; Tanner, D. B.; Tarabrin, S. P.; Taylor, J. R.; Taylor, R.; Thomas, P.; Thorne, K. A.; Thorne, K. S.; Thrane, E.; Thüring, A.; Titsler, C.; Tokmakov, K. V.; Toncelli, A.; Tonelli, M.; Torre, O.; Torres, C.; Torrie, C. I.; Tournefier, E.; Travasso, F.; Traylor, G.; Trias, M.; Tseng, K.; Turner, L.; Ugolini, D.; Urbanek, K.; Vahlbruch, H.; Vaishnav, B.; Vajente, G.; Vallisneri, M.; van den Brand, J. F. J.; Van Den Broeck, C.; van der Putten, S.; van der Sluys, M. V.; van Veggel, A. A.; Vass, S.; Vasuth, M.; Vaulin, R.; Vavoulidis, M.; Vecchio, A.; Vedovato, G.; Veitch, J.; Veitch, P. J.; Veltkamp, C.; Verkindt, D.; Vetrano, F.; Viceré, A.; Villar, A. E.; Vinet, J.-Y.; Vocca, H.; Vorvick, C.; Vyachanin, S. P.; Waldman, S. J.; Wallace, L.; Wanner, A.; Ward, R. L.; Was, M.; Wei, P.; Weinert, M.; Weinstein, A. J.; Weiss, R.; Wen, L.; Wen, S.; Wessels, P.; West, M.; Westphal, T.; Wette, K.; Whelan, J. T.; Whitcomb, S. E.; White, D.; Whiting, B. F.; Wilkinson, C.; Willems, P. A.; Williams, H. R.; Williams, L.; Willke, B.; Winkelmann, L.; Winkler, W.; Wipf, C. C.; Wiseman, A. 
G.; Woan, G.; Wooley, R.; Worden, J.; Yablon, J.; Yakushin, I.; Yamamoto, H.; Yamamoto, K.; Yang, H.; Yeaton-Massey, D.; Yoshida, S.; Yu, P.; Yvert, M.; Zanolin, M.; Zhang, L.; Zhang, Z.; Zhao, C.; Zotov, N.; Zucker, M. E.; Zweizig, J.; LIGO Scientific Collaboration; Virgo Collaboration; Buchner, S.; Hotan, A.; Palfreyman, J.
2011-08-01
We present direct upper limits on continuous gravitational wave emission from the Vela pulsar using data from the Virgo detector's second science run. These upper limits have been obtained using three independent methods that assume the gravitational wave emission follows the radio timing. Two of the methods produce frequentist upper limits for an assumed known orientation of the star's spin axis and value of the wave polarization angle of, respectively, 1.9 × 10^-24 and 2.2 × 10^-24, with 95% confidence. The third method, under the same hypothesis, produces a Bayesian upper limit of 2.1 × 10^-24, with 95% degree of belief. These limits are below the indirect spin-down limit of 3.3 × 10^-24 for the Vela pulsar, defined by the energy loss rate inferred from the observed decrease in Vela's spin frequency, and correspond to a limit on the star's ellipticity of ~10^-3. Slightly less stringent results, but still well below the spin-down limit, are obtained assuming the star's spin axis inclination and the wave polarization angle are unknown.
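A 95% Bayesian upper limit like the one quoted is read off the cumulative distribution of the posterior over the strain amplitude. A minimal sketch on a toy posterior (the grid, scale, and functional form below are illustrative stand-ins, not the collaboration's actual analysis):

```python
import numpy as np

# Hypothetical posterior over strain amplitude h0 on a grid
h0 = np.linspace(0, 1e-23, 10_000)            # amplitude grid
posterior = h0 * np.exp(-(h0 / 2e-24) ** 2)   # toy unnormalized posterior
dh = h0[1] - h0[0]
posterior /= posterior.sum() * dh             # normalize to unit area

# 95% upper limit: smallest amplitude where the CDF reaches 0.95
cdf = np.cumsum(posterior) * dh
h95 = h0[np.searchsorted(cdf, 0.95)]
print(f"95% credible upper limit: {h95:.2e}")
```

The same recipe works whether the posterior comes from a grid, as here, or from MCMC samples (where one would sort the samples and take the 95th percentile).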
Bayesian data analysis for newcomers.
Kruschke, John K; Liddell, Torrin M
2018-02-01
This article explains the foundational concepts of Bayesian data analysis using virtually no mathematical notation. Bayesian ideas already match your intuitions from everyday reasoning and from traditional data analysis. Simple examples of Bayesian data analysis are presented that illustrate how the information delivered by a Bayesian analysis can be directly interpreted. Bayesian approaches to null-value assessment are discussed. The article clarifies misconceptions about Bayesian methods that newcomers might have acquired elsewhere. We discuss prior distributions and explain how they are not a liability but an important asset. We discuss the relation of Bayesian data analysis to Bayesian models of mind, and we briefly discuss what methodological problems Bayesian data analysis is not meant to solve. After you have read this article, you should have a clear sense of how Bayesian data analysis works and the sort of information it delivers, and why that information is so intuitive and useful for drawing conclusions from data.
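The core mechanics the article describes, multiplying a prior by a likelihood and normalizing, fit in a few lines. A toy coin-flip example (all numbers illustrative), computed on a grid so no special functions are needed:

```python
import numpy as np

# Beliefs about a success rate theta, updated by 7 successes in 10 trials.
theta = np.linspace(0, 1, 1001)
prior = np.ones_like(theta)                 # flat prior: all rates equally credible
likelihood = theta**7 * (1 - theta)**3      # binomial likelihood (constant dropped)
posterior = prior * likelihood              # Bayes' rule, up to normalization
posterior /= posterior.sum()                # normalize to a probability distribution

print("posterior mean:", (theta * posterior).sum())
```

With this flat prior the grid result matches the analytic Beta(8, 4) posterior, whose mean is 8/12 ≈ 0.667.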
eGSM: An Extended Sky Model of Diffuse Radio Emission
NASA Astrophysics Data System (ADS)
Kim, Doyeon; Liu, Adrian; Switzer, Eric
2018-01-01
Both cosmic microwave background and 21cm cosmology observations must contend with astrophysical foreground contaminants in the form of diffuse radio emission. For precise cosmological measurements, these foregrounds must be accurately modeled over the entire sky. Ideally, such full-sky models ought to be primarily motivated by observations. Yet in practice, these observations are limited, with data sets that are observed not only in a heterogeneous fashion, but also over limited frequency ranges. Previously, the Global Sky Model (GSM) took some steps towards solving the problem of incomplete observational data by interpolating over multi-frequency maps using principal component analysis (PCA). In this poster, we present an extended version of GSM (called eGSM) that includes the following improvements: 1) better zero-level calibration; 2) incorporation of non-uniform survey resolutions and sky coverage; 3) the ability to quantify uncertainties in sky models; and 4) the ability to optimally select spectral models using Bayesian evidence techniques.
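The PCA interpolation idea behind the GSM can be sketched with synthetic maps: stack maps over frequency, decompose, and observe that a handful of components capture the spectral behavior. Everything below (frequencies, spectra, amplitudes) is made up for illustration and is not the GSM's actual input data:

```python
import numpy as np

rng = np.random.default_rng(0)
n_freq, n_pix = 10, 500
freqs = np.logspace(np.log10(10), np.log10(5000), n_freq)  # MHz, illustrative

# Two synthetic emission components with power-law spectra
spec = np.vstack([freqs**-2.5, freqs**-2.1])      # (2, n_freq) spectral shapes
amps = rng.lognormal(size=(2, n_pix))             # per-pixel component amplitudes
maps = spec.T @ amps                              # (n_freq, n_pix) sky maps

# PCA via SVD of the log-maps (emission is positive and spans decades)
log_maps = np.log10(maps)
mean = log_maps.mean(axis=1, keepdims=True)
U, S, Vt = np.linalg.svd(log_maps - mean, full_matrices=False)

# A few components capture essentially all of the variance
explained = (S**2) / (S**2).sum()
print(explained[:3])
```

Once the components `U` are known from well-observed frequencies, a map at a new frequency can be predicted from its component amplitudes, which is the interpolation step the abstract refers to.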
RESOLVE: A new algorithm for aperture synthesis imaging of extended emission in radio astronomy
NASA Astrophysics Data System (ADS)
Junklewitz, H.; Bell, M. R.; Selig, M.; Enßlin, T. A.
2016-02-01
We present resolve, a new algorithm for radio aperture synthesis imaging of extended and diffuse emission in total intensity. The algorithm is derived using Bayesian statistical inference techniques, estimating the surface brightness in the sky assuming a priori log-normal statistics. resolve estimates the measured sky brightness in total intensity, and the spatial correlation structure in the sky, which is used to guide the algorithm to an optimal reconstruction of extended and diffuse sources. During this process, the algorithm succeeds in deconvolving the effects of the radio interferometric point spread function. Additionally, resolve provides a map with an uncertainty estimate of the reconstructed surface brightness. Furthermore, with resolve we introduce a new, optimal visibility weighting scheme that can be viewed as an extension to robust weighting. In tests using simulated observations, the algorithm shows improved performance against two standard imaging approaches for extended sources, Multiscale-CLEAN and the Maximum Entropy Method.
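The role of the log-normal prior can be illustrated with a toy 1D deconvolution: optimize s = log(brightness) so that the reconstruction exp(s) is positive by construction. This is a simplified gradient-descent sketch with a synthetic blur standing in for the interferometric point spread function, not the actual resolve algorithm:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 64
x = np.arange(n)
true_s = np.exp(-((x - 32) / 6.0) ** 2)               # smooth log-brightness signal
R = np.exp(-((x[:, None] - x[None, :]) / 3.0) ** 2)   # blur standing in for the PSF
R /= R.sum(axis=1, keepdims=True)
data = R @ np.exp(true_s) + 0.01 * rng.standard_normal(n)

def loss(s, lam=1e-3):
    # Negative log-posterior: data misfit + simple Gaussian prior on s
    return 0.5 * np.sum((R @ np.exp(s) - data) ** 2) + 0.5 * lam * np.sum(s**2)

s = np.zeros(n)
lam, lr = 1e-3, 0.05
start = loss(s)
for _ in range(3000):
    resid = R @ np.exp(s) - data
    grad = np.exp(s) * (R.T @ resid) + lam * s        # chain rule through exp(s)
    s -= lr * grad

recon = np.exp(s)                                     # positive by construction
print(recon.min() > 0, loss(s) < start)
```

The exponential parameterization is what enforces positivity of the surface brightness; resolve additionally infers the spatial correlation structure, which this sketch omits.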
NASA Astrophysics Data System (ADS)
Kosch, Michael; Bristow, Bill; Gustavsson, Bjorn; Heinselman, Craig; Hughes, John; Isham, Brett; Mutiso, Charles; Nielsen, Kim; Pedersen, Todd; Wang, Weiyuan; Wong, Alfred
We report results from a unique experiment performed at the HIPAS ionospheric modification facility in Alaska. High power radio waves at 2.85 MHz, which corresponds to the second electron gyroharmonic at 240 km altitude, were transmitted into the nighttime ionosphere. Diagnostics included optical equipment at HIPAS and HAARP, 288 km to the south-east, the PFISR radar at Poker Flat, 32 km to the north-west, and the Kodiak SuperDARN radar, 856 km to the south-west. Camera observations of the stimulated optical emissions at 557.7 nm (O(¹S), threshold 4.2 eV) and 630 nm (O(¹D), threshold 2 eV) were made, allowing tomographic reconstruction of the volume emission. The first observations of pump-induced 732 nm (O⁺, threshold 18.6 eV) emissions are reported. Kodiak radar backscatter, which is a proxy for upper-hybrid resonance, shows strong production of striations without a minimum on the second gyroharmonic, confirming previous results. PFISR analysis shows clear evidence of electron temperature enhancements, consistent with previous EISCAT results, maximizing when the pump frequency matches the second gyroharmonic and when double resonance occurs, i.e. when the upper-hybrid resonance frequency matches the second gyroharmonic. This is consistent with the optical observations. From the above data, we are able to infer the efficiency of different groups of electron-accelerating mechanisms.
TOmographic Remote Observer of Ionospheric Disturbances
2007-11-15
ionosphere. The proposed spacecraft was an evolutionary design from the USUSat, Combat Sentinel, and USUSat II programs whose histories are shown in... Figure 1. The primary science instrument, TOROID for TOmographic Remote Observer of Ionospheric Disturbances, is a photometer for measuring the
Acoustic representation of tomographic data
NASA Astrophysics Data System (ADS)
Wampler, Cheryl; Zahrt, John D.; Hotchkiss, Robert S.; Zahrt, Rebecca; Kust, Mark
1993-04-01
Tomographic data and tomographic reconstructions are naturally periodic in the angle of rotation of the turntable and the polar angle of the coordinates in the object, respectively. Similarly, acoustic waves are periodic and have amplitude and wavelength as free parameters that can be fit to another representation. Work has been in progress for some time on bringing the acoustic senses to bear on large data sets rather than just the visual sense. We will provide several different acoustic representations of both raw data and density maps. Rather than a graphical portrayal of the data and reconstructions, you will be presented with various 'tone poems.'
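One simple way to realize such a "tone poem" is to map each sample of a projection row to the amplitude of a short sine tone, so scanning the data becomes an audible sweep. The sketch below uses synthetic data and arbitrary mapping choices (one semitone per detector bin, 50 ms per tone); it is an illustration of the idea, not the authors' mapping:

```python
import numpy as np

rate = 8000                                        # audio samples per second
row = np.abs(np.sin(np.linspace(0, np.pi, 32)))    # stand-in for one projection row

tones = []
for i, amplitude in enumerate(row):
    freq = 220.0 * 2 ** (i / 12.0)                 # pitch rises one semitone per bin
    t = np.arange(int(0.05 * rate)) / rate         # 50 ms of samples per tone
    tones.append(amplitude * np.sin(2 * np.pi * freq * t))

audio = np.concatenate(tones)                      # ready to write out as a WAV file
print(audio.shape, float(np.abs(audio).max()))
```

Because tomographic data are periodic in the rotation angle, playing successive rows in sequence yields a sound that repeats once per revolution, which is the periodicity the abstract highlights.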
Optical tomograph optimized for tumor detection inside highly absorbent organs
NASA Astrophysics Data System (ADS)
Boutet, Jérôme; Koenig, Anne; Hervé, Lionel; Berger, Michel; Dinten, Jean-Marc; Josserand, Véronique; Coll, Jean-Luc
2011-05-01
This paper presents a tomograph for small animal fluorescence imaging. The compact and cost-effective system described in this article was designed to address the problem of tumor detection inside highly absorbent heterogeneous organs, such as lungs. To validate the tomograph's ability to detect cancerous nodules inside lungs, in vivo tumor growth was studied on seven cancerous mice bearing murine mammary tumors marked with Alexa Fluor 700. They were successively imaged 10, 12, and 14 days after the primary tumor implantation. The fluorescence maps were compared over this time period. As expected, the reconstructed fluorescence increases with the tumor growth stage.
Lamb wave tomographic imaging system for aircraft structural health assessment
NASA Astrophysics Data System (ADS)
Schwarz, Willi G.; Read, Michael E.; Kremer, Matthew J.; Hinders, Mark K.; Smith, Barry T.
1999-01-01
A tomographic imaging system using ultrasonic Lamb waves for the nondestructive inspection of aircraft components such as wings and fuselage is being developed. The computer-based system provides large-area inspection capability by electronically scanning an array of transducers that can be easily attached to flat and curved surfaces, without moving parts. Images of the inspected area are produced in near real time employing a tomographic reconstruction method adapted from seismological applications. Changes in material properties caused by structural flaws such as disbonds, corrosion, and fatigue cracks can be effectively detected and characterized utilizing this fast NDE technique.
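A seismology-style reconstruction of this kind can be illustrated with a toy straight-ray travel-time problem solved by Kaczmarz/ART iterations: recover a slowness map from path-integrated travel times. The grid, ray matrix, and "flaw" below are synthetic stand-ins, not the authors' transducer geometry:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 8                                        # 8x8 pixel grid
true = np.zeros((n, n))
true[3:5, 3:6] = 1.0                         # "flaw": locally increased slowness
x_true = true.ravel()

# Random straight rays: each row gives the length of a ray inside every pixel
A = (rng.random((200, n * n)) < 0.15).astype(float)
t = A @ x_true                               # noise-free travel times along each ray

# Kaczmarz sweeps: project the estimate onto one ray constraint at a time
x = np.zeros(n * n)
for _ in range(100):
    for a, ti in zip(A, t):
        nrm = a @ a
        if nrm > 0:
            x += a * (ti - a @ x) / nrm

rel_err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
print(rel_err)
```

In the real system the rows of `A` come from the known transducer-pair ray paths, and slowness changes flag disbonds or corrosion.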
Sodankylä ionospheric tomography dataset 2003-2014
NASA Astrophysics Data System (ADS)
Norberg, J.; Roininen, L.; Kero, A.; Raita, T.; Ulich, T.; Markkanen, M.; Juusola, L.; Kauristie, K.
2015-12-01
Sodankylä Geophysical Observatory has been operating a tomographic receiver network and collecting the produced data since 2003. The collected dataset consists of phase difference curves measured from Russian COSMOS dual-frequency (150/400 MHz) low-Earth-orbit satellite signals, and tomographic electron density reconstructions obtained from these measurements. In this study vertical total electron content (VTEC) values are integrated from the reconstructed electron densities to make a qualitative and quantitative analysis to validate the long-term performance of the tomographic system. During the observation period, 2003-2014, there were three to five operational stations in the Fenno-Scandinavian sector. Altogether the analysis consists of around 66 000 overflights, but to ensure the quality of the reconstructions, the examination is limited to cases with descending (north to south) overflights and maximum elevation over 60°. These constraints limit the number of overflights to around 10 000. Based on this dataset, one solar cycle of ionospheric vertical total electron content estimates is constructed. The measurements are compared against the International Reference Ionosphere (IRI-2012) model, the F10.7 solar flux index and sunspot number data. Qualitatively the tomographic VTEC estimates correspond to the reference data very well, but the IRI-2012 model values are on average 40% higher than the tomographic results.
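The VTEC integration step amounts to integrating electron density over altitude and expressing the result in TEC units (1 TECU = 10^16 electrons/m^2). A sketch with an illustrative Chapman-like profile standing in for a tomographic reconstruction column (peak density, peak height, and scale height are all invented):

```python
import numpy as np

alt = np.linspace(80e3, 1000e3, 500)              # altitude grid [m]
z = (alt - 300e3) / 80e3                          # reduced height: peak 300 km, H = 80 km
ne = 5e11 * np.exp(0.5 * (1 - z - np.exp(-z)))    # Chapman-like density [el/m^3]

# Trapezoidal integration over altitude, converted to TEC units
vtec_tecu = np.sum(0.5 * (ne[1:] + ne[:-1]) * np.diff(alt)) / 1e16
print(f"VTEC ≈ {vtec_tecu:.1f} TECU")
```

Repeating this column integration over every reconstructed profile in an overflight yields the VTEC time series that the study compares against IRI-2012.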
Late Holocene volcanic activity and environmental change in Highland Guatemala
NASA Astrophysics Data System (ADS)
Lohse, Jon C.; Hamilton, W. Derek; Brenner, Mark; Curtis, Jason; Inomata, Takeshi; Morgan, Molly; Cardona, Karla; Aoyama, Kazuo; Yonenobu, Hitoshi
2018-07-01
We present a record of late Holocene volcanic eruptions with elemental data for a sequence of sampled tephras from Lake Amatitlan in Highland Guatemala. Our tephrochronology is anchored by a Bayesian P_Sequence age-depth model based on multiple AMS radiocarbon dates. We compare our record against a previously published study from the same area to understand the record of volcanism and environmental changes. This work has implications for understanding the effects of climate and other environmental changes that may be related to the emission of volcanic aerosols at local, regional and global scales.
Transmission imaging for integrated PET-MR systems.
Bowen, Spencer L; Fuin, Niccolò; Levine, Michael A; Catana, Ciprian
2016-08-07
Attenuation correction for PET-MR systems continues to be a challenging problem, particularly for body regions outside the head. The simultaneous acquisition of transmission scan based μ-maps and MR images on integrated PET-MR systems may significantly increase the performance of and offer validation for new MR-based μ-map algorithms. For the Biograph mMR (Siemens Healthcare), however, use of conventional transmission schemes is not practical as the patient table and relatively small diameter scanner bore significantly restrict radioactive source motion and limit source placement. We propose a method for emission-free coincidence transmission imaging on the Biograph mMR. The intended application is not for routine subject imaging, but rather to improve and validate MR-based μ-map algorithms; particularly for patient implant and scanner hardware attenuation correction. In this study we optimized source geometry and assessed the method's performance with Monte Carlo simulations and phantom scans. We utilized a Bayesian reconstruction algorithm, which directly generates μ-map estimates from multiple bed positions, combined with a robust scatter correction method. For simulations with a pelvis phantom a single torus produced peak noise equivalent count rates (34.8 kcps) dramatically larger than a full axial length ring (11.32 kcps) and conventional rotating source configurations. Bias in reconstructed μ-maps for head and pelvis simulations was ⩽4% for soft tissue and ⩽11% for bone ROIs. An implementation of the single torus source was filled with ¹⁸F-fluorodeoxyglucose and the proposed method was quantified for several test cases alone or in comparison with CT-derived μ-maps. A volume average of 0.095 cm⁻¹ was recorded for an experimental uniform cylinder phantom scan, while a bias of <2% was measured for the cortical bone equivalent insert of the multi-compartment phantom.
Single torus μ-maps of a hip implant phantom showed significantly fewer artifacts and improved dynamic range, and differed greatly from CT results for highly attenuating materials in the case of the patient table. Use of a fixed torus geometry, in combination with translation of the patient table to perform complete tomographic sampling, generated highly quantitative measured μ-maps and is expected to produce images with significantly higher SNR than competing fixed geometries at matched total acquisition time.
The Development of Bayesian Theory and Its Applications in Business and Bioinformatics
NASA Astrophysics Data System (ADS)
Zhang, Yifei
2018-03-01
Bayesian Theory originated from an essay by the British mathematician Thomas Bayes, published in 1763, and after its development in the 20th century, Bayesian Statistics has taken a significant part in statistical study across all fields. Due to the recent breakthrough in high-dimensional integration, Bayesian Statistics has been improved and perfected, and now it can be used to solve problems that Classical Statistics failed to solve. This paper summarizes Bayesian Statistics' history, concepts and applications, which are illustrated in five parts: the history of Bayesian Statistics, the weakness of Classical Statistics, Bayesian Theory, and its development and applications. The first two parts make a comparison between Bayesian Statistics and Classical Statistics in a macroscopic aspect. The last three parts focus on Bayesian Theory specifically, from introducing particular Bayesian concepts to describing their development and finally their applications.
Bayesian demography 250 years after Bayes
Bijak, Jakub; Bryant, John
2016-01-01
Bayesian statistics offers an alternative to classical (frequentist) statistics. It is distinguished by its use of probability distributions to describe uncertain quantities, which leads to elegant solutions to many difficult statistical problems. Although Bayesian demography, like Bayesian statistics more generally, is around 250 years old, only recently has it begun to flourish. The aim of this paper is to review the achievements of Bayesian demography, address some misconceptions, and make the case for wider use of Bayesian methods in population studies. We focus on three applications: demographic forecasts, limited data, and highly structured or complex models. The key advantages of Bayesian methods are the ability to integrate information from multiple sources and to describe uncertainty coherently. Bayesian methods also allow for including additional (prior) information next to the data sample. As such, Bayesian approaches are complementary to many traditional methods, which can be productively re-expressed in Bayesian terms. PMID:26902889
An LUR/BME framework to estimate PM2.5 explained by on road mobile and stationary sources.
Reyes, Jeanette M; Serre, Marc L
2014-01-01
Knowledge of concentrations of particulate matter <2.5 μm in diameter (PM2.5) across the United States is limited due to sparse monitoring across space and time. Epidemiological studies need accurate exposure estimates in order to properly investigate potential morbidity and mortality. Previous works have used geostatistics and land use regression (LUR) separately to quantify exposure. This work combines both methods by incorporating a large area variability LUR model that accounts for on road mobile emissions and stationary source emissions, along with data that take into account the incompleteness of PM2.5 monitors, into the modern geostatistical Bayesian Maximum Entropy (BME) framework to estimate PM2.5 across the United States from 1999 to 2009. A cross-validation was done to determine the improvement of the estimate due to the LUR incorporation into BME. These results were applied to known diseases to determine predicted mortality coming from total PM2.5 as well as PM2.5 explained by major contributing sources. This method showed a mean squared error reduction of over 21.89% over simple kriging. PM2.5 explained by on road mobile emissions and stationary emissions contributed to nearly 568,090 and 306,316 deaths, respectively, across the United States from 1999 to 2007.
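The benefit of adding an LUR mean to a purely geostatistical interpolator can be demonstrated in miniature with leave-one-out cross-validation. The sketch below uses inverse-distance weighting as a crude stand-in for kriging/BME and entirely synthetic monitor data; the percentage it prints has nothing to do with the paper's 21.89% figure:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
xy = rng.random((n, 2)) * 100                  # synthetic monitor locations [km]
emissions = rng.random(n)                      # LUR covariate (e.g. road density)
pm25 = 8 + 6 * emissions + np.sin(xy[:, 0] / 15) + 0.3 * rng.standard_normal(n)

def idw(xy_train, z_train, xy_test, p=2):
    """Inverse-distance-weighted interpolation (stand-in for kriging)."""
    d = np.linalg.norm(xy_train[None, :, :] - xy_test[:, None, :], axis=2) + 1e-9
    w = 1 / d**p
    return (w * z_train).sum(axis=1) / w.sum(axis=1)

# Leave-one-out: interpolation alone vs LUR mean + interpolated residuals
err_geo, err_lur = [], []
for i in range(n):
    m = np.ones(n, bool)
    m[i] = False
    z_geo = idw(xy[m], pm25[m], xy[i:i+1])[0]
    beta = np.polyfit(emissions[m], pm25[m], 1)      # fit the LUR trend
    trend = np.polyval(beta, emissions)
    z_lur = trend[i] + idw(xy[m], (pm25 - trend)[m], xy[i:i+1])[0]
    err_geo.append((z_geo - pm25[i]) ** 2)
    err_lur.append((z_lur - pm25[i]) ** 2)

reduction = 100 * (1 - np.mean(err_lur) / np.mean(err_geo))
print(f"MSE reduction from adding the LUR mean: {reduction:.1f}%")
```

The interpolator alone cannot predict the emissions-driven part of the field, so modeling the trend first and interpolating only the residuals reduces the cross-validated error, which is the same logic as the paper's LUR-into-BME comparison.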
The Mathematics of Four or More N-Localizers for Stereotactic Neurosurgery.
Brown, Russell A
2015-10-13
The mathematics that were originally developed for the N-localizer apply to three N-localizers that produce three sets of fiducials in a tomographic image. Some applications of the N-localizer use four N-localizers that produce four sets of fiducials; however, the mathematics that apply to three sets of fiducials do not apply to four sets of fiducials. This article presents mathematics that apply to four or more sets of fiducials that all lie within one planar tomographic image. In addition, these mathematics are extended to apply to four or more fiducials that do not all lie within one planar tomographic image, as may be the case with magnetic resonance (MR) imaging where a volume is imaged instead of a series of planar tomographic images. Whether applied to a planar image or a volume image, the mathematics of four or more N-localizers provide a statistical measure of the quality of the image data that may be influenced by factors, such as the nonlinear distortion of MR images.
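With four or more fiducials the mapping from image coordinates to frame coordinates is overdetermined, so a least-squares fit yields both the transformation and a residual that serves as the quality measure mentioned above. A sketch with hypothetical coordinates and a plain affine model, which is a simplification of the full N-localizer mathematics:

```python
import numpy as np

rng = np.random.default_rng(4)
img = rng.random((6, 3)) * 100                      # 6 fiducials in image coordinates
A_true = np.array([[0.9, 0.1, 0.0],
                   [-0.1, 0.9, 0.0],
                   [0.0, 0.0, 1.0]])
# Frame coordinates: transformed, shifted, plus small "distortion" noise
frame = img @ A_true.T + np.array([5.0, -3.0, 10.0]) + 0.05 * rng.standard_normal((6, 3))

# Solve frame ≈ [img, 1] @ M for the 4x3 affine parameter matrix M
X = np.hstack([img, np.ones((6, 1))])
M, _res, _rank, _ = np.linalg.lstsq(X, frame, rcond=None)

# RMS fiducial residual: a statistical measure of image/distortion quality
rms = np.sqrt(np.mean((X @ M - frame) ** 2))
print(f"RMS fiducial residual: {rms:.3f}")
```

With exactly three N-localizers the fit is exact and the residual is zero by construction; it is the redundancy of four or more that makes the residual informative about MR distortion.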
TomoBank: a tomographic data repository for computational x-ray science
De Carlo, Francesco; Gürsoy, Doğa; Ching, Daniel J.; ...
2018-02-08
There is a widening gap between the fast advancement of computational methods for tomographic reconstruction and their successful implementation in production software at various synchrotron facilities. This is due in part to the lack of readily available instrument datasets and phantoms representative of real materials for validation and comparison of new numerical methods. Recent advancements in detector technology made sub-second and multi-energy tomographic data collection possible [1], but also increased the demand to develop new reconstruction methods able to handle in-situ [2] and dynamic systems [3] that can be quickly incorporated in beamline production software [4]. The X-ray Tomography Data Bank, tomoBank, provides a repository of experimental and simulated datasets with the aim to foster collaboration among computational scientists, beamline scientists, and experimentalists and to accelerate the development and implementation of tomographic reconstruction methods for synchrotron facility production software by providing easy access to challenging datasets and their descriptors.
High-resolution multimodal clinical multiphoton tomography of skin
NASA Astrophysics Data System (ADS)
König, Karsten
2011-03-01
This review focuses on multimodal multiphoton tomography based on near infrared femtosecond lasers. Clinical multiphoton tomographs for 3D high-resolution in vivo imaging were placed on the market several years ago. The second generation of this Prism-Award-winning high-tech skin imaging tool (MPTflex) was introduced in 2010. The same year, the world's first clinical CARS studies were performed with a hybrid multimodal multiphoton tomograph. In particular, non-fluorescent lipids and water, fluorescent mitochondrial NAD(P)H, fluorescent elastin, keratin, and melanin, as well as SHG-active collagen have been imaged with submicron resolution in patients suffering from psoriasis. Further multimodal approaches include the combination of multiphoton tomographs with low-resolution wide-field systems such as ultrasound, optoacoustic, OCT, and dermoscopy systems. Multiphoton tomographs are currently employed in Australia, Japan, the US, and several European countries for early diagnosis of skin cancer, optimization of treatment strategies, and cosmetic research, including long-term testing of sunscreen nanoparticles as well as anti-aging products.
Nuclear medicine in clinical neurology: an update
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oldendorf, W.H.
1981-01-01
Isotope scanning using technetium 99m pertechnetate has fallen into disuse since the advent of x-ray computerized tomography. Regional brain blood flow studies have been pursued on a research basis. Increased regional blood flow during focal seizure activity has been demonstrated and is of use in localizing such foci. Cisternography as a predictive tool in normal pressure hydrocephalus is falling into disuse. Positron tomographic scanning is a potent research tool that can demonstrate both regional glycolysis and blood flow. Unfortunately, it is extremely expensive and complex to apply in a clinical setting. With support from the National Institutes of Health, seven extramural centers have been funded to develop positron tomographic capabilities, and they will greatly advance our knowledge of stroke pathophysiology, seizure disorders, brain tumors, and various degenerative diseases. Nuclear magnetic resonance imaging is a potentially valuable tool since it creates tomographic images representing the distribution of brain water. No tissue ionization is produced, and images comparable to second-generation computerized tomographic scans are already being produced in humans.
NASA Astrophysics Data System (ADS)
Cui, Y.; Falk, M.; Chen, Y.; Herner, J.; Croes, B. E.; Vijayan, A.
2017-12-01
Methane (CH4) is an important short-lived climate pollutant (SLCP) and the second most important greenhouse gas (GHG) in California, accounting for 9% of the statewide GHG emissions inventory. Over the years, California has enacted several ambitious climate change mitigation goals, including the California Global Warming Solutions Act of 2006, which requires ARB to reduce statewide GHG emissions to the 1990 emission level by 2020, as well as Assembly Bill 1383, which requires implementation of a climate mitigation program to reduce statewide methane emissions by 40% below the 2013 levels. In order to meet these requirements, ARB has proposed a comprehensive SLCP Strategy with goals to reduce oil and gas related emissions and capture methane emissions from dairy operations and organic waste. Achieving these goals will require an accurate understanding of the sources of CH4 emissions. Since direct monitoring of CH4 emission sources at large spatial and temporal scales is challenging and resource intensive, we developed an inverse technique that combines an atmospheric three-dimensional (3D) transport model with atmospheric observations of CH4 concentrations from a regional tower network and aircraft measurements to gain insights into emission sources in California. In this study, we develop a comprehensive inversion estimate using available aircraft measurements from the CalNex airborne campaigns (May-June 2010) and three years of hourly continuous measurements from the ARB Statewide GHG Monitoring Network (2014-2016). The inversion analysis is conducted using two independent 3D Lagrangian models (WRF-STILT and WRF-FLEXPART), with a variety of bottom-up prior inputs from national and regional inventories, as well as two different probability density functions (Gaussian and lognormal). Altogether, our analysis provides a detailed picture of the spatially resolved CH4 emission sources and their temporal variation over a multi-year period.
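Inversion machinery of this kind, a prior emissions inventory updated by concentration observations through a transport operator, reduces in the linear-Gaussian case to a standard posterior-mean formula. A synthetic sketch (the footprint matrix, source values, and covariances below are invented, not California data, and real studies use transport-model footprints in place of random numbers):

```python
import numpy as np

rng = np.random.default_rng(5)
n_obs, n_src = 50, 5
H = rng.random((n_obs, n_src))             # sensitivity of each obs to each source
x_true = np.array([10.0, 4.0, 7.0, 1.0, 3.0])
y = H @ x_true + 0.5 * rng.standard_normal(n_obs)   # noisy concentration data

x_prior = np.full(n_src, 5.0)              # bottom-up inventory guess
S_prior = np.eye(n_src) * 9.0              # prior covariance (uncertainty of inventory)
S_obs = np.eye(n_obs) * 0.25               # observation-error covariance

# Posterior mean: x = x_p + S_p H^T (H S_p H^T + S_o)^-1 (y - H x_p)
K = S_prior @ H.T @ np.linalg.inv(H @ S_prior @ H.T + S_obs)
x_post = x_prior + K @ (y - H @ x_prior)
print(np.round(x_post, 1))
```

The posterior pulls the inventory toward what the observations demand, in proportion to the relative prior and observation uncertainties; lognormal priors, as in the study, require iterating or transforming to log space, which this sketch omits.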
Moosavi Tayebi, Rohollah; Wirza, Rahmita; Sulaiman, Puteri S B; Dimon, Mohd Zamrin; Khalid, Fatimah; Al-Surmi, Aqeel; Mazaheri, Samaneh
2015-04-22
Computerized tomographic angiography (3D data representing the coronary arteries) and X-ray angiography (2D X-ray image sequences providing information about coronary arteries and their stenosis) are standard and popular assessment tools utilized for medical diagnosis of coronary artery diseases. At present, the results of both modalities are individually analyzed by specialists and it is difficult for them to mentally connect the details of these two techniques. The aim of this work is to assist medical diagnosis by providing specialists with the relationship between computerized tomographic angiography and X-ray angiography. In this study, coronary arteries from two modalities are registered in order to create a 3D reconstruction of the stenosis position. The proposed method starts with coronary artery segmentation and labeling for both modalities. Then, stenosis and relevant labeled artery in X-ray angiography image are marked by a specialist. Proper control points for the marked artery in both modalities are automatically detected and normalized. Then, a geometrical transformation function is computed using these control points. Finally, this function is utilized to register the marked artery from the X-ray angiography image on the computerized tomographic angiography and get the 3D position of the stenosis lesion. The result is a 3D informative model consisting of stenosis and coronary arteries' information from the X-ray angiography and computerized tomographic angiography modalities. The results of the proposed method for coronary artery segmentation, labeling and 3D reconstruction are evaluated and validated on the dataset containing both modalities. 
The advantage of this method is that it aids specialists in determining a visual relationship between corresponding coronary arteries from the two modalities and in establishing a connection between stenosis points from an X-ray angiography and their 3D positions on the coronary arteries from computerized tomographic angiography. Moreover, another benefit of this work is that the medical acquisition standards remain unchanged, meaning that no calibration of the acquisition devices is required. It can be applied to most computerized tomographic angiography and angiography devices.
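The registration step described above fits a geometrical transformation to corresponding control points and then applies it to map the marked 2D artery onto the 3D model. As a minimal illustrative sketch (not the authors' implementation; point sets and sizes are hypothetical), a least-squares affine mapping between control points can be written as:

```python
import numpy as np

def fit_affine_3d(src, dst):
    """Least-squares affine transform mapping 2D control points to 3D.

    src: (N, 2) control points, e.g. on the marked X-ray artery.
    dst: (N, 3) corresponding points, e.g. on the CTA centerline.
    Returns a (3, 3) matrix A such that dst ~= [src, 1] @ A.
    """
    n = src.shape[0]
    X = np.hstack([src, np.ones((n, 1))])      # homogeneous 2D coordinates
    A, *_ = np.linalg.lstsq(X, dst, rcond=None)
    return A

def apply_affine(A, pts):
    """Apply the fitted transform to new 2D points."""
    pts = np.atleast_2d(pts)
    X = np.hstack([pts, np.ones((pts.shape[0], 1))])
    return X @ A
```

In practice the paper normalizes the control points first and may use a richer transformation class; the least-squares fit above only conveys the general control-point idea.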
Tomographic findings of acute pulmonary toxoplasmosis in immunocompetent patients.
de Souza Giassi, Karina; Costa, Andre Nathan; Apanavicius, Andre; Teixeira, Fernando Bin; Fernandes, Caio Julio Cesar; Helito, Alfredo Salim; Kairalla, Ronaldo Adib
2014-11-25
Toxoplasmosis is one of the most common human zoonoses and is generally benign in most individuals. Pulmonary involvement is common in immunocompromised subjects but very rare in immunocompetent patients, and there are scarce reports of tomographic findings in the literature. The aim of this study is to describe three immunocompetent patients diagnosed with acute pulmonary toxoplasmosis and their respective thoracic tomographic findings. Acute toxoplasmosis was diagnosed according to serological tests suggestive of recent primary infection and the absence of an alternative etiology. From 2009 to 2013, three patients were diagnosed with acute respiratory failure secondary to acute toxoplasmosis. The patients, two women and one man, were 38, 56 and 36 years old. All presented with a two-week febrile illness and progressive dyspnea before admission. Laboratory tests demonstrated lymphocytosis, slight changes in liver enzymes and high inflammatory markers. Tomographic findings were bilateral smooth septal and peribronchovascular thickening (100%), ground-glass opacities (100%), atelectasis (33%), random nodules (33%), lymph node enlargement (33%) and pleural effusion (66%). All patients' symptoms improved after treatment, and complete resolution of tomographic findings was found at follow-up. These cases provide a unique description of the presentation and evolution of pulmonary tomographic manifestations of toxoplasmosis in immunocompetent patients. Toxoplasma pneumonia manifests with fever, dyspnea and a non-productive cough that may result in respiratory failure. In animal models, the changes have been described as interstitial pneumonitis with focal neutrophilic infiltrates that can evolve into a pattern of diffuse alveolar damage with focal necrosis.
The tomographic findings are characterized by ground-glass opacities and smooth septal and marked peribronchovascular thickening, and may mimic pulmonary congestion, lymphangitis, atypical pneumonia and pneumocystosis. This is the largest series of CT findings of acute toxoplasmosis in immunocompetent hosts, and the diagnosis should be considered in patients who present with acute respiratory failure in the context of a subacute febrile illness with bilateral, diffuse interstitial infiltrates and marked peribronchovascular thickening. If promptly treated, pulmonary toxoplasmosis can result in complete clinical and radiological recovery in immunocompetent hosts.
Verbal fluency and positron emission tomographic mapping of regional cerebral glucose metabolism.
Boivin, M J; Giordani, B; Berent, S; Amato, D A; Lehtinen, S; Koeppe, R A; Buchtel, H A; Foster, N L; Kuhl, D E
1992-06-01
Impairment in verbal fluency (VF) has been a consistently reported clinical feature of focal cerebral deficits in frontal and temporal regions. More recent behavioral activation studies of healthy control subjects using positron emission tomography (PET), however, have noted a negative correlation between performance on verbal fluency tasks and regional cortical activity. To see whether this negative relationship extends to steady-state, non-activation PET measures, thirty-three healthy adults were given a VF task within a day of their 18F-2-fluoro-2-deoxy-D-glucose PET scan. VF was found to correlate positively with metabolic activity in the left temporal cortical region but negatively with right and left frontal activity. VF was not correlated significantly with right temporal cortical metabolic activity. Some previous studies of normal subjects using behavioral activation paradigms and PET have reported negative correlations between metabolic activity and cognitive performance similar to those reported here. An explanation for the disparate relationships observed between frontal and temporal brain areas and VF might be found in the mediation of different task demands by these separate locations, i.e., task planning and/or initiation by frontal regions and verbal memory by the left temporal area.
Evaluation of positron emission tomography as a method to visualize subsurface microbial processes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kinsella K.; Schlyer D.; Kinsella, K.
2012-01-18
Positron emission tomography (PET) provides spatiotemporal monitoring in a nondestructive manner and has higher sensitivity and resolution relative to other tomographic methods. Therefore, this technology was evaluated for its application to monitoring in situ subsurface bacterial activity. To date, however, it has not been used to monitor or image soil microbial processes. In this study, PET imaging was applied as a proof-of-principle method to assess the feasibility of visualizing a radiotracer-labeled subsurface bacterial strain (Rahnella sp. Y9602), previously isolated from uranium-contaminated soils and shown to promote uranium phosphate precipitation. Soil columns packed with acid-purified simulated mineral soils were seeded with 2-deoxy-2-[(18)F]fluoro-D-glucose ((18)FDG)-labeled Rahnella sp. Y9602. In our soil column studies, [(18)F]fluoride ion proved applicable as a tracer for measuring hydraulic conductivity, and (18)FDG as a tracer for identifying metabolically active subsurface bacteria. Our findings indicate that positron-emitting isotopes can be utilized for studies aimed at elucidating subsurface microbiology and geochemical processes important in contaminant remediation.
Derenzo, Stephen E.; Budinger, Thomas F.
1984-01-01
In brief, the invention is a tomograph modified to have a clamshell configuration so that the ring or rings may be moved to multiple sampling positions. The tomograph includes an array of detectors arranged in successive adjacent relative locations along a closed curve in a first position in a selected plane, and means for securing the detectors in the relative locations in a first sampling position. The securing means is movable in the plane in two sections and pivotable at one point. The U.S. Government has rights in this invention pursuant to Contract No. W-7405-ENG-48 between the U.S. Department of Energy and the University of California.
Center-of-Mass Tomography and Wigner Function for Multimode Photon States
NASA Astrophysics Data System (ADS)
Dudinets, Ivan V.; Man'ko, Vladimir I.
2018-06-01
The tomographic probability representation of multimode electromagnetic-field states in the scheme of center-of-mass tomography is reviewed. Both the connection of the field-state Wigner function and observable Weyl symbols with the center-of-mass tomograms and the connection of the Grönewold kernel with the center-of-mass tomographic kernel determining the noncommutative product of the tomograms are obtained. The dual center-of-mass tomograms of photon states are constructed and the dual tomographic kernel is obtained. Models of other generalized center-of-mass tomographies are discussed. An example of two-mode even and odd Schrödinger cat states is presented in detail.
In vivo evaluation of (64)Cu-labeled magnetic nanoparticles as a dual-modality PET/MR imaging agent.
Glaus, Charles; Rossin, Raffaella; Welch, Michael J; Bao, Gang
2010-04-21
A novel nanoparticle-based dual-modality positron emission tomography/magnetic resonance imaging (PET/MRI) contrast agent was developed. The probe consisted of a superparamagnetic iron oxide (SPIO) core coated with PEGylated phospholipids. The chelator 1,4,7,10-tetraazacyclododecane-1,4,7,10-tetraacetic acid (DOTA) was conjugated to the PEG termini to allow labeling with positron-emitting (64)Cu. Radiolabeling with (64)Cu at high yield and high purity was readily achieved. The (64)Cu-SPIO probes produced strong MR and PET signals and were stable in mouse serum for 24 h at 37 °C. Biodistribution and in vivo PET/CT imaging studies of the probes showed a circulation half-life of 143 min and high initial blood retention with moderate liver uptake, making them an attractive contrast agent for disease studies.
Statistical reconstruction for cosmic ray muon tomography.
Schultz, Larry J; Blanpied, Gary S; Borozdin, Konstantin N; Fraser, Andrew M; Hengartner, Nicolas W; Klimenko, Alexei V; Morris, Christopher L; Orum, Chris; Sossong, Michael J
2007-08-01
Highly penetrating cosmic-ray muons constantly shower the earth at a rate of about 1 muon per cm2 per minute. We have developed a technique that exploits the multiple Coulomb scattering of these particles to perform nondestructive inspection without the use of artificial radiation. In prior work [1]-[3], we described heuristic methods for processing muon data to create reconstructed images. In this paper, we present a maximum-likelihood/expectation-maximization tomographic reconstruction algorithm designed for the technique. This algorithm borrows much from techniques used in medical imaging, particularly emission tomography, but the statistics of muon scattering dictate differences. We describe the statistical model for multiple scattering, derive the reconstruction algorithm, and present simulated examples. We also propose methods to improve the robustness of the algorithm to experimental errors and to events departing from the statistical model.
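The paper's algorithm adapts the maximum-likelihood/expectation-maximization (MLEM) method of emission tomography to muon-scattering statistics. The generic emission-tomography MLEM update it builds on can be sketched in a few lines (a standard illustration for Poisson count data, not the authors' muon-specific model):

```python
import numpy as np

def mlem(A, y, n_iter=100):
    """Generic MLEM update for Poisson data y ~ Poisson(A @ x).

    A: (M, N) nonnegative system matrix; y: (M,) measured counts.
    The muon-scattering statistics in the paper modify this model,
    but the multiplicative update below is the common starting point.
    """
    x = np.ones(A.shape[1])
    sens = np.maximum(A.sum(axis=0), 1e-12)   # per-voxel sensitivity
    for _ in range(n_iter):
        proj = np.maximum(A @ x, 1e-12)       # forward projection
        x *= (A.T @ (y / proj)) / sens        # multiplicative EM step
    return x
```

The multiplicative form keeps the estimate nonnegative at every iteration, a property MLEM inherits from the Poisson likelihood.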
Initial evaluation of discrete orthogonal basis reconstruction of ECT images
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moody, E.B.; Donohue, K.D.
1996-12-31
Discrete orthogonal basis restoration (DOBR) is a linear, non-iterative, and robust method for solving inverse problems for systems characterized by shift-variant transfer functions. This simulation study evaluates the feasibility of using DOBR to reconstruct emission computed tomographic (ECT) images. The imaging system model uses typical SPECT parameters and incorporates the effects of attenuation, spatially variant PSF, and Poisson noise in the projection process. Sample reconstructions and statistical error analyses for a class of digital phantoms compare DOBR performance for Hartley and Walsh basis functions. Test results confirm that DOBR with either basis set produces images with good statistical properties. No problems were encountered with reconstruction instability. The flexibility of the DOBR method and its consistent performance warrant further investigation of DOBR as a means of ECT image reconstruction.
Nyquist, Jonathan E.; Toran, Laura; Fang, Allison C.; Ryan, Robert J.; Rosenberry, Donald O.
2010-01-01
Characterization of the hyporheic zone is of critical importance for understanding stream ecology, contaminant transport, and groundwater-surface water interaction. A salt water tracer test was used to probe the hyporheic zone of a recently re-engineered portion of Crabby Creek, a stream located near Philadelphia, PA. The tracer solution was tracked through a 13.5 meter segment of the stream using both a network of 25 wells sampled every 5-15 minutes and time-lapse electrical resistivity tomographs collected every 11 minutes for six hours, with additional tomographs collected every 100 minutes for an additional 16 hours. The comparison of tracer monitoring methods is of keen interest because tracer tests are one of the few techniques available for characterizing this dynamic zone, and logistically it is far easier to collect resistivity tomographs than to install and monitor a dense network of wells. Our results show that resistivity monitoring captured the essential shape of the breakthrough curve and may indicate portions of the stream where the tracer lingered in the hyporheic zone. Time-lapse resistivity measurements, however, represent time averages over the period required to collect a tomographic data set, and spatial averages over a volume larger than that captured by a well sample. Smoothing by the resistivity data inversion algorithm further blurs the resulting tomograph; consequently, resistivity monitoring underestimates the degree of fine-scale heterogeneity in the hyporheic zone.
Rapid tomographic reconstruction based on machine learning for time-resolved combustion diagnostics
NASA Astrophysics Data System (ADS)
Yu, Tao; Cai, Weiwei; Liu, Yingzheng
2018-04-01
Optical tomography has recently attracted a surge of research effort due to progress in both imaging concepts and sensor and laser technologies. The high spatial and temporal resolutions achievable by these methods provide an unprecedented opportunity for the diagnosis of complicated turbulent combustion. However, due to the high data throughput and the inefficiency of the prevailing iterative methods, the tomographic reconstructions, which are typically conducted off-line, are computationally formidable. In this work, we propose an efficient inversion method based on a machine learning algorithm, which extracts useful information from previous reconstructions and builds efficient neural networks to serve as a surrogate model that rapidly predicts reconstructions. An extreme learning machine is used here for demonstration purposes simply due to its ease of implementation, fast learning speed, and good generalization performance. Extensive numerical studies were performed, and the results show that the new method can dramatically reduce the computational time compared with classical iterative methods. This technique is expected to be an alternative to existing methods when sufficient training data are available. Although this work is discussed in the context of tomographic absorption spectroscopy, we expect it to be useful also for other high-speed tomographic modalities, such as volumetric laser-induced fluorescence and tomographic laser-induced incandescence, which have been demonstrated for combustion diagnostics.
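The extreme learning machine invoked above owes its fast training to a fixed random hidden layer: only the output weights are solved, in one closed-form least-squares step. A minimal sketch of the idea (all names and sizes hypothetical, not the authors' network):

```python
import numpy as np

rng = np.random.default_rng(0)

class ELM:
    """Minimal extreme learning machine: a random, untrained hidden
    layer; only the output weights are fitted, in closed form."""

    def __init__(self, n_in, n_hidden):
        self.W = rng.normal(size=(n_in, n_hidden))   # fixed random weights
        self.b = rng.normal(size=n_hidden)           # fixed random biases

    def _hidden(self, X):
        return np.tanh(X @ self.W + self.b)

    def fit(self, X, Y):
        # Training is a single pseudo-inverse solve -- the source of
        # the "fast learning speed" cited in the abstract.
        self.beta = np.linalg.pinv(self._hidden(X)) @ Y
        return self

    def predict(self, X):
        return self._hidden(X) @ self.beta
```

In the tomography setting, X would hold measured projection data and Y the corresponding reconstructions from a conventional iterative solver; the fitted network then predicts new reconstructions at negligible cost.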
Estimating crustal heterogeneity from double-difference tomography
Got, J.-L.; Monteiller, V.; Virieux, J.; Okubo, P.
2006-01-01
Seismic velocity parameters in limited but heterogeneous volumes can be inferred using a double-difference tomographic algorithm, but to obtain meaningful results accuracy must be maintained at every step of the computation. Monteiller et al. (2005) devised a double-difference tomographic algorithm that takes full advantage of the accuracy of cross-spectral time delays of large correlated event sets. This algorithm performs an accurate computation of theoretical travel-time delays in heterogeneous media and applies a suitable inversion scheme based on optimization theory. When applied to Kilauea Volcano, in Hawaii, the double-difference tomography approach shows significant and coherent changes to the velocity model in the well-resolved volumes beneath the Kilauea caldera and the upper east rift. In this paper, we first compare the results obtained using Monteiller et al.'s algorithm with those obtained using the classic travel-time tomographic approach. Then, we evaluate the effect of using data series of different accuracies, such as handpicked arrival-time differences ("picking differences"), on the results produced by double-difference tomographic algorithms. We show that picking differences have a non-Gaussian probability density function (pdf). Using a hyperbolic secant pdf instead of a Gaussian pdf improves the double-difference tomographic result when using picking-difference data. We complete our study by investigating the use of spatially discontinuous time-delay data. © Birkhäuser Verlag, Basel, 2006.
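The advantage of the hyperbolic secant pdf over the Gaussian shows up in the misfit penalty: the Gaussian negative log-likelihood grows quadratically with the residual, while the sech penalty grows only linearly, so outlying picking differences carry far less weight in the inversion. A standalone illustration (not the authors' code; unit scale parameter assumed):

```python
import numpy as np

def neglog_gauss(r, s=1.0):
    """Gaussian misfit: quadratic in the residual, so outliers dominate."""
    return 0.5 * (r / s) ** 2

def neglog_sech(r, s=1.0):
    """Hyperbolic-secant pdf f(r) = sech(pi*r/(2*s)) / (2*s); its
    negative log grows ~linearly (slope pi/2) for large |r|,
    downweighting outlying picking differences."""
    return -np.log(np.cosh(np.pi * r / (2 * s)) ** -1 / (2 * s))
```

For a residual of 10 standard scales, the Gaussian penalty is 50 while the sech penalty is only about 5π, which is why the heavy-tailed pdf is the more robust choice for non-Gaussian picking errors.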
Field-portable lensfree tomographic microscope†
Isikman, Serhan O.; Bishara, Waheb; Sikora, Uzair; Yaglidere, Oguzhan; Yeah, John; Ozcan, Aydogan
2011-01-01
We present a field-portable lensfree tomographic microscope, which can achieve sectional imaging of a large volume (~20 mm3) on a chip with an axial resolution of <7 μm. In this compact tomographic imaging platform (weighing only ~110 grams), 24 light-emitting diodes (LEDs) that are each butt-coupled to a fibre-optic waveguide are controlled through a cost-effective micro-processor to sequentially illuminate the sample from different angles to record lensfree holograms of the sample that is placed on the top of a digital sensor array. In order to generate pixel super-resolved (SR) lensfree holograms and hence digitally improve the achievable lateral resolution, multiple sub-pixel shifted holograms are recorded at each illumination angle by electromagnetically actuating the fibre-optic waveguides using compact coils and magnets. These SR projection holograms obtained over an angular range of ~50° are rapidly reconstructed to yield projection images of the sample, which can then be back-projected to compute tomograms of the objects on the sensor-chip. The performance of this compact and light-weight lensfree tomographic microscope is validated by imaging micro-beads of different dimensions as well as a Hymenolepis nana egg, which is an infectious parasitic flatworm. Achieving a decent three-dimensional spatial resolution, this field-portable on-chip optical tomographic microscope might provide a useful toolset for telemedicine and high-throughput imaging applications in resource-poor settings. PMID:21573311
Compressive sensing reconstruction of 3D wet refractivity based on GNSS and InSAR observations
NASA Astrophysics Data System (ADS)
Heublein, Marion; Alshawaf, Fadwa; Erdnüß, Bastian; Zhu, Xiao Xiang; Hinz, Stefan
2018-06-01
In this work, the reconstruction quality of an approach for neutrospheric water vapor tomography based on Slant Wet Delays (SWDs) obtained from Global Navigation Satellite Systems (GNSS) and Interferometric Synthetic Aperture Radar (InSAR) is investigated. The novelties of this approach are (1) the use of both absolute GNSS and absolute InSAR SWDs for tomography and (2) the solution of the tomographic system by means of compressive sensing (CS). The tomographic reconstruction is performed based on (i) a synthetic SWD dataset generated using wet refractivity information from the Weather Research and Forecasting (WRF) model and (ii) a real dataset using GNSS and InSAR SWDs. Thus, the validation of the achieved results focuses (i) on a comparison of the refractivity estimates with the input WRF refractivities and (ii) on a comparison with radiosonde profiles. In the case of the synthetic dataset, the results show that the CS approach yields a more accurate and more precise solution than least squares (LSQ). In addition, the benefit of adding synthetic InSAR SWDs to the tomographic system is analyzed. When applying CS, adding synthetic InSAR SWDs to the tomographic system improves the solution both in magnitude and in scattering. When solving the tomographic system by means of LSQ, no clear behavior is observed. In the case of the real dataset, the estimated refractivities of both methodologies show consistent behavior, although the LSQ and CS solution strategies differ.
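Compressive sensing solves the underdetermined tomographic system by adding a sparsity prior. One standard formulation is the l1-regularized least-squares problem, solvable by iterative shrinkage-thresholding; the sketch below is a generic illustration of that formulation, not the paper's solver (matrix sizes and the regularization weight are hypothetical):

```python
import numpy as np

def soft(v, t):
    """Soft-thresholding: the proximal operator of the l1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, y, lam=0.1, n_iter=500):
    """Iterative shrinkage-thresholding for
    min_x 0.5*||A x - y||^2 + lam*||x||_1,
    one standard way to pose a compressive-sensing reconstruction.
    A: observation (ray) matrix, y: slant wet delays, x: refractivities
    in a basis where the field is sparse."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft(x - A.T @ (A @ x - y) / L, lam / L)
    return x
```

With fewer rays than unknowns the plain least-squares system is rank deficient; the l1 term is what lets CS pick out a stable, sparse solution where LSQ needs explicit smoothing constraints.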
NASA Astrophysics Data System (ADS)
Hart, V. P.; Taylor, M. J.; Doyle, T. E.; Zhao, Y.; Pautet, P.-D.; Carruth, B. L.; Rusch, D. W.; Russell, J. M.
2018-01-01
This research presents the first application of tomographic techniques for investigating gravity wave structures in polar mesospheric clouds (PMCs) imaged by the Cloud Imaging and Particle Size instrument on the NASA AIM satellite. Albedo data comprising consecutive PMC scenes were used to tomographically reconstruct a 3-D layer using the Partially Constrained Algebraic Reconstruction Technique algorithm and a previously developed "fanning" technique. For this pilot study, a large region (760 × 148 km) of the PMC layer (altitude 83 km) was sampled with a 2 km horizontal resolution, and an intensity weighted centroid technique was developed to create novel 2-D surface maps, characterizing the individual gravity waves as well as their altitude variability. Spectral analysis of seven selected wave events observed during the Northern Hemisphere 2007 PMC season exhibited dominant horizontal wavelengths of 60-90 km, consistent with previous studies. These tomographic analyses have enabled a broad range of new investigations. For example, a clear spatial anticorrelation was observed between the PMC albedo and wave-induced altitude changes, with higher-albedo structures aligning well with wave troughs, while low-intensity regions aligned with wave crests. This result appears to be consistent with current theories of PMC development in the mesopause region. This new tomographic imaging technique also provides valuable wave amplitude information enabling further mesospheric gravity wave investigations, including quantitative analysis of their hemispheric and interannual characteristics and variations.
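The intensity-weighted centroid used above to build the 2-D altitude surface maps reduces each reconstructed vertical column to a single representative altitude. In essence (variable names hypothetical, not the authors' code):

```python
import numpy as np

def intensity_weighted_centroid(z, w):
    """Altitude centroid of a reconstructed PMC column:
    sum(w * z) / sum(w), with w the albedo/intensity at each altitude z."""
    z = np.asarray(z, dtype=float)
    w = np.asarray(w, dtype=float)
    return float((w * z).sum() / w.sum())
```

Applying this to every horizontal sample of the reconstructed 3-D layer yields the 2-D surface map whose undulations trace the wave-induced altitude changes.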
KiDS-450: the tomographic weak lensing power spectrum and constraints on cosmological parameters
NASA Astrophysics Data System (ADS)
Köhlinger, F.; Viola, M.; Joachimi, B.; Hoekstra, H.; van Uitert, E.; Hildebrandt, H.; Choi, A.; Erben, T.; Heymans, C.; Joudaki, S.; Klaes, D.; Kuijken, K.; Merten, J.; Miller, L.; Schneider, P.; Valentijn, E. A.
2017-11-01
We present measurements of the weak gravitational lensing shear power spectrum based on 450 deg^2 of imaging data from the Kilo Degree Survey. We employ a quadratic estimator in two and three redshift bins and extract band powers of redshift autocorrelation and cross-correlation spectra in the multipole range 76 ≤ ℓ ≤ 1310. The cosmological interpretation of the measured shear power spectra is performed in a Bayesian framework assuming a ΛCDM model with spatially flat geometry, while accounting for small residual uncertainties in the shear calibration and redshift distributions as well as marginalizing over intrinsic alignments, baryon feedback and an excess-noise power model. Moreover, massive neutrinos are included in the modelling. The main cosmological result is expressed in terms of the parameter combination S_8 ≡ σ_8 √(Ω_m/0.3), yielding S_8 = 0.651 ± 0.058 (three z-bins), confirming the recently reported tension in this parameter with constraints from Planck at 3.2σ (three z-bins). We cross-check the results of the three z-bin analysis with the weaker constraints from the two z-bin analysis and find them to be consistent. The high-level data products of this analysis, such as the band power measurements, covariance matrices, redshift distributions and likelihood evaluation chains, are available at http://kids.strw.leidenuniv.nl.
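The parameter combination S_8 ≡ σ_8 √(Ω_m/0.3) quoted above is a simple function of the two parameters lensing constrains best; as a trivial sketch:

```python
import math

def s8(sigma_8, omega_m):
    """S_8 = sigma_8 * sqrt(Omega_m / 0.3), the combination best
    constrained by the lensing power spectrum."""
    return sigma_8 * math.sqrt(omega_m / 0.3)
```

The square-root scaling means the lensing data constrain a degeneracy direction in the (σ_8, Ω_m) plane rather than either parameter individually.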
Tomographic reflection modelling of quasi-periodic oscillations in the black hole binary H 1743-322
NASA Astrophysics Data System (ADS)
Ingram, Adam; van der Klis, Michiel; Middleton, Matthew; Altamirano, Diego; Uttley, Phil
2017-01-01
Accreting stellar-mass black holes (BHs) routinely exhibit Type-C quasi-periodic oscillations (QPOs). These are often interpreted as Lense-Thirring precession of the inner accretion flow, a relativistic effect whereby the spin of the BH distorts the surrounding space-time, inducing nodal precession. The best evidence for the precession model is the recent discovery, using a long joint XMM-Newton and NuSTAR observation of H 1743-322, that the centroid energy of the iron fluorescence line changes systematically with QPO phase. This was interpreted as the inner flow illuminating different azimuths of the accretion disc as it precesses, giving rise to a blueshifted/redshifted iron line when the approaching/receding disc material is illuminated. Here, we develop a physical model for this interpretation, including a self-consistent reflection continuum, and fit this to the same H 1743-322 data. We use an analytic function to parametrize the asymmetric illumination pattern on the disc surface that would result from inner flow precession, and find that the data are well described if two bright patches rotate about the disc surface. This model is preferred to alternatives considering an oscillating disc ionization parameter, disc inner radius and radial emissivity profile. We find that the reflection fraction varies with QPO phase (3.5σ), adding to the now formidable body of evidence that Type-C QPOs are a geometric effect. This is the first example of tomographic QPO modelling, initiating a powerful new technique that utilizes QPOs in order to map the dynamics of accreting material close to the BH.
Hyperspectral and multispectral bioluminescence optical tomography for small animal imaging.
Chaudhari, Abhijit J; Darvas, Felix; Bading, James R; Moats, Rex A; Conti, Peter S; Smith, Desmond J; Cherry, Simon R; Leahy, Richard M
2005-12-07
For bioluminescence imaging studies in small animals, it is important to be able to accurately localize the three-dimensional (3D) distribution of the underlying bioluminescent source. The spectrum of light produced by the source that escapes the subject varies with the depth of the emission source because of the wavelength-dependence of the optical properties of tissue. Consequently, multispectral or hyperspectral data acquisition should help in the 3D localization of deep sources. In this paper, we describe a framework for fully 3D bioluminescence tomographic image acquisition and reconstruction that exploits spectral information. We describe regularized tomographic reconstruction techniques that use semi-infinite slab or FEM-based diffusion approximations of photon transport through turbid media. Singular value decomposition analysis was used for data dimensionality reduction and to illustrate the advantage of using hyperspectral rather than achromatic data. Simulation studies in an atlas-mouse geometry indicated that sub-millimeter resolution may be attainable given accurate knowledge of the optical properties of the animal. A fixed arrangement of mirrors and a single CCD camera were used for simultaneous acquisition of multispectral imaging data over most of the surface of the animal. Phantom studies conducted using this system demonstrated our ability to accurately localize deep point-like sources and show that a resolution of 1.5 to 2.2 mm for depths up to 6 mm can be achieved. We also include an in vivo study of a mouse with a brain tumour expressing firefly luciferase. Co-registration of the reconstructed 3D bioluminescent image with magnetic resonance images indicated good anatomical localization of the tumour.
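The singular value decomposition analysis mentioned above gauges how much independent information a forward model carries, which is how hyperspectral acquisition can be shown to beat achromatic acquisition. A minimal, generic illustration of that idea (not the authors' analysis; the tolerance and matrices are hypothetical):

```python
import numpy as np

def effective_rank(A, tol=1e-6):
    """Count singular values above tol * s_max: a crude measure of how
    many independent pieces of information a forward model A carries.
    Stacking additional spectral bands adds rows and can raise it."""
    s = np.linalg.svd(A, compute_uv=False)
    return int((s > tol * s[0]).sum())
```

In the paper's setting, the rows of A correspond to surface measurements; appending rows for extra wavelength bands increases the effective rank of the system and hence the attainable source localization accuracy.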
A flexible, small positron emission tomography prototype for resource-limited laboratories
NASA Astrophysics Data System (ADS)
Miranda-Menchaca, A.; Martínez-Dávalos, A.; Murrieta-Rodríguez, T.; Alva-Sánchez, H.; Rodríguez-Villafuerte, M.
2015-05-01
Modern small-animal PET scanners typically consist of a large number of detectors along with complex electronics to provide tomographic images for research in the preclinical sciences using animal models. These systems can be expensive, especially for resource-limited educational and academic institutions in developing countries. In this work we show that a small-animal PET scanner can be built on a relatively reduced budget while, at the same time, achieving relatively high performance. The prototype consists of four detector modules, each composed of LYSO pixelated crystal arrays (individual crystal elements of dimensions 1 × 1 × 10 mm3) coupled to position-sensitive photomultiplier tubes. Tomographic images are obtained by rotating the subject to complete enough projections for image reconstruction. Image quality was evaluated for different reconstruction algorithms, including filtered back-projection and iterative reconstruction with maximum likelihood-expectation maximization and maximum a posteriori methods. The system matrix was computed both with geometric considerations and by Monte Carlo simulations. Prior to image reconstruction, Fourier data rebinning was used to increase the number of lines of response used. The system was evaluated for energy resolution at 511 keV (best 18.2%), system sensitivity (0.24%), spatial resolution (best 0.87 mm), scatter fraction (4.8%) and noise-equivalent count rate. The system can be scaled up to include up to 8 detector modules, increasing detection efficiency, and its price may be reduced as newer solid-state detectors become available to replace the traditional photomultiplier tubes. Prototypes like this may prove to be very valuable for educational, training, preclinical and other biological research purposes.
Iskandrian, A S; Powers, J; Cave, V; Wasserleben, V; Cassell, D; Heo, J
1995-01-01
This study examined the ability of dynamic 123I-labeled iodophenylpentadecanoic acid (IPPA) imaging to detect myocardial viability in patients with left ventricular (LV) dysfunction caused by coronary artery disease. Serial 180-degree single-photon emission computed tomographic (SPECT) images (five sets, 8 minutes each) were obtained starting 4 minutes after injection of 2 to 6 mCi 123I at rest in 21 patients with LV dysfunction (ejection fraction [EF] 34% +/- 11%). The segmental uptake was compared with that of rest-redistribution 201Tl images (20 segments/study). The number of perfusion defects (reversible and fixed) was similar by IPPA and thallium (11 +/- 5 vs 10 +/- 5 segments/patient; difference not significant). There was agreement between IPPA and thallium for presence or absence (kappa = 0.78 +/- 0.03) and nature (reversible, mild fixed, or severe fixed) of perfusion defects (kappa = 0.54 +/- 0.04). However, there were more reversible IPPA defects than reversible thallium defects (7 +/- 4 vs 3 +/- 4 segments/patient; p = 0.001). In 14 patients the EF (by gated pool imaging) improved after coronary revascularization from 33% +/- 11% to 39% +/- 12% (p = 0.002). The number of reversible IPPA defects was greater in the seven patients who had improvement in EF than in the patients without such improvement (10 +/- 4 vs 5 +/- 4 segments/patient; p = 0.075). 123I-labeled IPPA SPECT imaging is a promising new technique for assessment of viability. Reversible defects predict recovery of LV dysfunction after coronary revascularization.
Application Of Iterative Reconstruction Techniques To Conventional Circular Tomography
NASA Astrophysics Data System (ADS)
Ghosh Roy, D. N.; Kruger, R. A.; Yih, B. C.; Del Rio, S. P.; Power, R. L.
1985-06-01
Two "point-by-point" iteration procedures, namely the Iterative Least-Squares Technique (ILST) and the Simultaneous Iterative Reconstruction Technique (SIRT), were applied to classical circular tomographic reconstruction. The technique of tomosynthetic DSA was used in forming the tomographic images. Reconstructions of a dog's renal and neck anatomy are presented.
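SIRT, one of the two iterative schemes named above, updates every pixel simultaneously from the current residual. A hedged sketch of the standard row/column-normalized form follows (illustrative, not the 1985 implementation):

```python
import numpy as np

def sirt(A, b, n_iter=100, relax=1.0):
    """Simultaneous Iterative Reconstruction Technique.
    A: (n_rays, n_pixels) projection matrix; b: (n_rays,) ray sums."""
    R = 1.0 / np.maximum(A.sum(axis=1), 1e-12)  # inverse row sums
    C = 1.0 / np.maximum(A.sum(axis=0), 1e-12)  # inverse column sums
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        # back-project the normalized residual onto all pixels at once
        x = x + relax * C * (A.T @ (R * (b - A @ x)))
    return x
```

For consistent data the iterates converge to a solution of the ray equations; the relaxation factor trades convergence speed against stability.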
21 CFR 892.1740 - Tomographic x-ray system.
Code of Federal Regulations, 2013 CFR
2013-04-01
Title 21, Food and Drugs, Section 892.1740: Tomographic x-ray system. Food and Drug Administration, Department of Health and Human Services. ... This generic type of device may include signal analysis and display equipment, patient and equipment...
ECAT: A New Computerized Tomographic Imaging System for Positron-Emitting Radiopharmaceuticals
DOE R&D Accomplishments Database
Phelps, M. E.; Hoffman, E. J.; Huang, S. C.; Kuhl, D. E.
1977-01-01
The ECAT was designed and developed as a complete computerized positron radionuclide imaging system capable of providing high contrast, high resolution, quantitative images in 2 dimensional and tomographic formats. Flexibility, in its various image mode options, allows it to be used for a wide variety of imaging problems.
Model Diagnostics for Bayesian Networks
ERIC Educational Resources Information Center
Sinharay, Sandip
2006-01-01
Bayesian networks are frequently used in educational assessments, primarily for learning about students' knowledge and skills. There is a lack of work on assessing the fit of Bayesian networks. This article employs the posterior predictive model checking method, a popular Bayesian model checking tool, to assess the fit of simple Bayesian networks. A…
A Gentle Introduction to Bayesian Analysis: Applications to Developmental Research
van de Schoot, Rens; Kaplan, David; Denissen, Jaap; Asendorpf, Jens B; Neyer, Franz J; van Aken, Marcel AG
2014-01-01
Bayesian statistical methods are becoming ever more popular in applied and fundamental research. In this study a gentle introduction to Bayesian analysis is provided. It is shown under what circumstances it is attractive to use Bayesian estimation, and how to interpret properly the results. First, the ingredients underlying Bayesian methods are introduced using a simplified example. Thereafter, the advantages and pitfalls of the specification of prior knowledge are discussed. To illustrate Bayesian methods explained in this study, in a second example a series of studies that examine the theoretical framework of dynamic interactionism are considered. In the Discussion the advantages and disadvantages of using Bayesian statistics are reviewed, and guidelines on how to report on Bayesian statistics are provided. PMID:24116396
Estimating Biases for Regional Methane Fluxes using Co-emitted Tracers
NASA Astrophysics Data System (ADS)
Bambha, R.; Safta, C.; Michelsen, H. A.; Cui, X.; Jeong, S.; Fischer, M. L.
2017-12-01
Methane is a powerful greenhouse gas, and the development and improvement of emissions models rely on understanding the flux of methane released from anthropogenic sources relative to releases from other sources. Increasing production of shale oil and gas in the mid-latitudes and associated fugitive emissions are suspected to be dominant contributors to the global methane increase. Landfills, sewage treatment, and other sources may be dominant sources in some parts of the U.S. Large discrepancies between emissions models present a great challenge to reconciling atmospheric measurements with inventory-based estimates for various emissions sectors. Current approaches for measuring regional emissions yield highly uncertain estimates because of the sparsity of measurement sites and the presence of multiple simultaneous sources. Satellites can provide wide spatial coverage at the expense of much lower measurement precision compared to ground-based instruments. Methods for effective assimilation of data from a variety of sources are critically needed to perform regional GHG attribution with existing measurements and to determine how to structure future measurement systems, including satellites. We present a hierarchical Bayesian framework to estimate surface methane fluxes based on atmospheric concentration measurements and a Lagrangian transport model (Weather Research and Forecasting and Stochastic Time-Inverted Lagrangian Transport). Structural errors in the transport model are estimated with the help of co-emitted tracer species with well-defined decay rates. We conduct the analyses at regional scales that are based on similar geographical and meteorological conditions. For regions where data are informative, we further refine flux estimates by emissions sector and infer spatially and temporally varying biases parameterized as spectral random field representations.
Monitoring fossil fuel sources of methane in Australia
NASA Astrophysics Data System (ADS)
Loh, Zoe; Etheridge, David; Luhar, Ashok; Hibberd, Mark; Thatcher, Marcus; Noonan, Julie; Thornton, David; Spencer, Darren; Gregory, Rebecca; Jenkins, Charles; Zegelin, Steve; Leuning, Ray; Day, Stuart; Barrett, Damian
2017-04-01
CSIRO has been active in identifying and quantifying methane emissions from a range of fossil fuel sources in Australia over the past decade. We present here a history of the development of our work in this domain. While we have principally focused on optimising the use of long-term, fixed-location, high-precision monitoring, paired with both forward and inverse modelling techniques suitable for either local or regional scales, we have also incorporated mobile ground surveys and flux calculations from plumes in some contexts. We initially developed leak detection methodologies for geological carbon storage at a local scale using a Bayesian probabilistic approach coupled to a backward Lagrangian particle dispersion model (Luhar et al., JGR, 2014), and single-point monitoring with sector analysis (Etheridge et al., in prep.). We have since expanded our modelling techniques to regional scales using both forward and inverse approaches to constrain methane emissions from coal mining and coal seam gas (CSG) production. The Surat Basin (Queensland, Australia) is a region of rapidly expanding CSG production, in which we have established a pair of carefully located, well-intercalibrated monitoring stations. These data sets provide an almost continuous record of (i) background air arriving at the Surat Basin, and (ii) the signal resulting from methane emissions within the Basin, i.e. total downwind methane concentration (comprising emissions including natural geological seeps, agricultural and biogenic sources and fugitive emissions from CSG production) minus background or upwind concentration. We will present our latest results on monitoring from the Surat Basin and their application to estimating methane emissions.
Bartlett, Jonathan W; Keogh, Ruth H
2018-06-01
Bayesian approaches for handling covariate measurement error are well established and yet arguably are still relatively little used by researchers. For some this is likely due to unfamiliarity or disagreement with the Bayesian inferential paradigm. For others a contributory factor is the inability of standard statistical packages to perform such Bayesian analyses. In this paper, we first give an overview of the Bayesian approach to handling covariate measurement error, and contrast it with regression calibration, arguably the most commonly adopted approach. We then argue why the Bayesian approach has a number of statistical advantages compared to regression calibration and demonstrate that implementing the Bayesian approach is usually quite feasible for the analyst. Next, we describe the closely related maximum likelihood and multiple imputation approaches and explain why we believe the Bayesian approach to generally be preferable. We then empirically compare the frequentist properties of regression calibration and the Bayesian approach through simulation studies. The flexibility of the Bayesian approach to handle both measurement error and missing data is then illustrated through an analysis of data from the Third National Health and Nutrition Examination Survey.
Estimating methane emissions from dairies in the Los Angeles Basin
NASA Astrophysics Data System (ADS)
Viatte, C.; Lauvaux, T.; Hedelius, J.; Parker, H. A.; Chen, J.; Jones, T.; Franklin, J.; Deng, A.; Gaudet, B.; Duren, R. M.; Verhulst, K. R.; Wunch, D.; Roehl, C. M.; Dubey, M. K.; Wofsy, S.; Wennberg, P. O.
2015-12-01
Inventory estimates of methane (CH4) emissions among the individual sources (mainly agriculture, energy production, and waste management) remain highly uncertain at regional and urban scales. Accurate atmospheric measurements can provide independent estimates to evaluate bottom-up inventories, especially in urban regions, where many different CH4 sources are often confined to relatively small areas. Among these sources, livestock emissions, mainly from dairy cows, accounted for ~55% of total CH4 emissions in California in 2013. This study aims to rigorously estimate the amount of CH4 emitted by the largest dairies in the Southern California region by combining measurements from four mobile ground-based spectrometers (EM27/SUN), in situ isotopic methane measurements from a CRDS analyzer (Picarro), and a high-resolution atmospheric transport model (the Weather Research and Forecasting model) in Large-Eddy Simulation mode. The remote sensing spectrometers measure the total column-averaged dry-air mole fractions of CH4 and CO2 (XCH4 and XCO2) in the near-infrared region, providing information about total emissions of the dairies. Gradients measured by the four EM27s ranged from 0.2 to 22 ppb and from 0.7 to 3 ppm for XCH4 and XCO2, respectively. To assess the fluxes of the dairies, measurements of these gradients are used in conjunction with the local atmospheric dynamics simulated at 111 m resolution. Inverse modelling from WRF-LES is employed to resolve the spatial distribution of CH4 emissions in the domain. A Bayesian inversion and a Monte Carlo approach were used to provide the CH4 emissions over the dairies with their associated uncertainties. The isotopic δ13C sampled at different locations in the area ranges from -40 ‰ to -55 ‰, indicating a mixture of anthropogenic and biogenic sources.
Uncertainty loops in travel-time tomography from nonlinear wave physics.
Galetti, Erica; Curtis, Andrew; Meles, Giovanni Angelo; Baptie, Brian
2015-04-10
Estimating image uncertainty is fundamental to guiding the interpretation of geoscientific tomographic maps. We reveal novel uncertainty topologies (loops) which indicate that while the speeds of both low- and high-velocity anomalies may be well constrained, their locations tend to remain uncertain. The effect is widespread: loops dominate around a third of United Kingdom Love wave tomographic uncertainties, changing the nature of interpretation of the observed anomalies. Loops exist due to second- and higher-order aspects of wave physics; hence, although such structures must exist in many tomographic studies in the physical sciences and medicine, they are unobservable using standard linearized methods. Higher-order methods might fruitfully be adopted.
Tomographic diffractive microscopy with a wavefront sensor.
Ruan, Y; Bon, P; Mudry, E; Maire, G; Chaumet, P C; Giovannini, H; Belkebir, K; Talneau, A; Wattellier, B; Monneret, S; Sentenac, A
2012-05-15
Tomographic diffractive microscopy is a recent imaging technique that reconstructs quantitatively the three-dimensional permittivity map of a sample with a resolution better than that of conventional wide-field microscopy. Its main drawbacks lie in the complexity of the setup and in the slowness of the image recording as both the amplitude and the phase of the field scattered by the sample need to be measured for hundreds of successive illumination angles. In this Letter, we show that, using a wavefront sensor, tomographic diffractive microscopy can be implemented easily on a conventional microscope. Moreover, the number of illuminations can be dramatically decreased if a constrained reconstruction algorithm is used to recover the sample map of permittivity.
NASA Technical Reports Server (NTRS)
Yin, L. I.; Trombka, J. I.; Bielefeld, M. J.; Seltzer, S. M.
1984-01-01
The results of two computer simulations demonstrate the feasibility of using the nonoverlapping redundant array (NORA) to form three-dimensional images of objects with X-rays. Pinholes admit the X-rays to nonoverlapping points on a detector. The object is reconstructed in the analog mode by optical correlation and in the digital mode by tomographic computations. Trials were run with a stick-figure pyramid and extended objects with out-of-focus backgrounds. Substitution of spherical optical lenses for the pinholes increased the light transmission sufficiently that objects could be easily viewed in a dark room. Out-of-focus aberrations in tomographic reconstruction could be eliminated using Chang's (1976) algorithm.
Tomographic phase microscopy: principles and applications in bioimaging [Invited
Jin, Di; Zhou, Renjie; Yaqoob, Zahid; So, Peter T. C.
2017-01-01
Tomographic phase microscopy (TPM) is an emerging optical microscopic technique for bioimaging. TPM uses digital holographic measurements of complex scattered fields to reconstruct three-dimensional refractive index (RI) maps of cells with diffraction-limited resolution by solving inverse scattering problems. In this paper, we review the developments of TPM from the fundamental physics to its applications in bioimaging. We first provide a comprehensive description of the tomographic reconstruction physical models used in TPM. The RI map reconstruction algorithms and various regularization methods are discussed. Selected TPM applications for cellular imaging, particularly in hematology, are reviewed. Finally, we examine the limitations of current TPM systems, propose future solutions, and envision promising directions in biomedical research. PMID:29386746
Light radiation pressure upon an optically orthotropic surface
NASA Astrophysics Data System (ADS)
Nerovny, Nikolay A.; Lapina, Irina E.; Grigorjev, Anton S.
2017-11-01
In this paper, we discuss the problem of determining the light radiation pressure force upon an anisotropic surface. The optical parameters of such a surface are considered to have major and minor axes, so the model is called an orthotropic model. We derive the equations for force components from emission, absorption, and reflection, utilizing a modified Maxwell specular-diffuse model. The proposed model can be used to model a flat solar sail with wrinkles. By performing Bayesian analysis on the example of a wrinkled surface, we show that there are cases in which an orthotropic model of the optical parameters of a surface may be more accurate than an isotropic model.
Mittleman, D M; Hunsche, S; Boivin, L; Nuss, M C
1997-06-15
We demonstrate tomographic T-ray imaging, using the timing information present in terahertz (THz) pulses in a reflection geometry. THz pulses are reflected from refractive-index discontinuities inside an object, and the time delays of these pulses are used to determine the positions of the discontinuities along the propagation direction. In this fashion a tomographic image can be constructed.
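The depth of each discontinuity follows directly from the round-trip delay of the reflected THz pulse. A sketch, assuming a homogeneous medium of refractive index n between discontinuities (illustrative, not the authors' processing code):

```python
C_LIGHT = 299_792_458.0  # speed of light in vacuum, m/s

def reflection_depth(delay_s, n_index):
    """Depth of a refractive-index discontinuity from the round-trip
    time delay of a THz pulse reflected inside a medium of index n.
    The factor of 2 accounts for the pulse travelling down and back."""
    return C_LIGHT * delay_s / (2.0 * n_index)
```

For example, a 10 ps delay in a medium with n = 1.5 corresponds to a discontinuity roughly 1 mm below the surface.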
Computed tomographic findings of cerebral fat embolism following multiple bone fractures.
Law, Huong Ling; Wong, Siong Lung; Tan, Suzet
2013-02-01
Fat embolism to the lungs and brain is an uncommon complication following fractures. Few reports with descriptions of computed tomographic (CT) findings of emboli to the brain or cerebral fat embolism are available. We report a case of cerebral fat embolism following multiple skeletal fractures and present its CT findings here.
Quantum-tomographic cryptography with a semiconductor single-photon source
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kaszlikowski, D.; Yang, L.J.; Yong, L.S.
2005-09-15
We analyze the security of so-called quantum-tomographic cryptography with the source producing entangled photons via an experimental scheme proposed by Fattal et al. [Phys. Rev. Lett. 92, 37903 (2004)]. We determine the range of the experimental parameters for which the protocol is secure against the most general incoherent attacks.
Current developments in clinical multiphoton tomography
NASA Astrophysics Data System (ADS)
König, Karsten; Weinigel, Martin; Breunig, Hans Georg; Gregory, Axel; Fischer, Peter; Kellner-Höfer, Marcel; Bückle, Rainer
2010-02-01
Two-photon microscopy was introduced in 1990 [1]. Thirteen years later, CE-marked clinical multiphoton systems for 3D imaging of human skin with subcellular resolution were launched by the JenLab company with the tomograph DermaInspectTM. In 2010, the second generation of clinical multiphoton tomographs was introduced. The novel mobile multiphoton tomograph MPTflexTM, equipped with a flexible articulated optical arm, provides increased flexibility and accessibility, especially for clinical and cosmetic examinations. The multiphoton excitation of fluorescent biomolecules such as NAD(P)H, flavins, porphyrins, elastin, and melanin, as well as the second harmonic generation of collagen, is induced by picojoule femtosecond laser pulses from a tunable turn-key near-infrared laser system. The ability for rapid high-quality image acquisition, the user-friendly operation of the system, and the compact and flexible design qualify this system for melanoma detection, diagnostics of dermatological disorders, cosmetic research, and skin-aging measurements, as well as in situ drug monitoring and animal research. So far, more than 1,000 patients and volunteers have been investigated with the multiphoton tomographs in Europe, Asia, and Australia.
Tomography and the Herglotz-Wiechert inverse formulation
NASA Astrophysics Data System (ADS)
Nowack, Robert L.
1990-04-01
In this paper, linearized tomography and the Herglotz-Wiechert inverse formulation are compared. Tomographic inversions for 2-D or 3-D velocity structure use line integrals along rays and can be written in terms of Radon transforms. For radially concentric structures, Radon transforms are shown to reduce to Abel transforms. Therefore, for straight ray paths, the Abel transform of travel-time is a tomographic algorithm specialized to a one-dimensional radially concentric medium. The Herglotz-Wiechert formulation uses seismic travel-time data to invert for one-dimensional earth structure and is derived using exact ray trajectories by applying an Abel transform. This is of historical interest since it would imply that a specialized tomographic-like algorithm has been used in seismology since the early part of the century (see Herglotz, 1907; Wiechert, 1910). Numerical examples are performed comparing the Herglotz-Wiechert algorithm and linearized tomography along straight rays. Since the Herglotz-Wiechert algorithm is applicable under specific conditions, (the absence of low velocity zones) to non-straight ray paths, the association with tomography may prove to be useful in assessing the uniqueness of tomographic results generalized to curved ray geometries.
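The reduction of the Radon transform to an Abel transform for radially concentric media can be checked numerically. Below is a sketch of the forward Abel transform, validated against the known transform pair f(r) = exp(-r^2) with F(y) = sqrt(pi)*exp(-y^2) (illustrative, not the paper's code):

```python
import numpy as np

def abel_transform(f, y, r_max=10.0, n=4000):
    """Forward Abel transform
        F(y) = 2 * integral_y^r_max f(r) * r / sqrt(r^2 - y^2) dr.
    The substitution r = sqrt(y^2 + s^2) (so r dr = s ds) removes the
    integrable endpoint singularity, leaving
        F(y) = 2 * integral_0^s_max f(sqrt(y^2 + s^2)) ds."""
    s = np.linspace(0.0, np.sqrt(r_max**2 - y**2), n)
    return 2.0 * np.trapz(f(np.sqrt(y**2 + s**2)), s)
```

In the tomographic analogy, f plays the role of a radial slowness perturbation and F(y) the straight-ray travel-time anomaly at offset y.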
A detailed comparison of single-camera light-field PIV and tomographic PIV
NASA Astrophysics Data System (ADS)
Shi, Shengxian; Ding, Junfei; Atkinson, Callum; Soria, Julio; New, T. H.
2018-03-01
This paper presents a comprehensive comparison between single-camera light-field particle image velocimetry (LF-PIV) and multi-camera tomographic particle image velocimetry (Tomo-PIV). Simulation studies were first performed using synthetic light-field and tomographic particle images, which extensively examine the difference between these two techniques by varying key parameters such as the pixel to microlens ratio (PMR), the light-field camera to Tomo-camera pixel ratio (LTPR), particle seeding density and tomographic camera number. Simulation results indicate that single-camera LF-PIV can achieve accuracy consistent with that of multi-camera Tomo-PIV, but requires the use of an overall greater number of pixels. Experimental studies were then conducted by simultaneously measuring a low-speed jet flow with single-camera LF-PIV and four-camera Tomo-PIV systems. Experiments confirm that, given a sufficiently high pixel resolution, a single-camera LF-PIV system can indeed deliver volumetric velocity field measurements for an equivalent field of view with a spatial resolution commensurate with that of a multi-camera Tomo-PIV system, enabling accurate 3D measurements in applications where optical access is limited.
NASA Astrophysics Data System (ADS)
Lew, E. J.; Butenhoff, C. L.; Karmakar, S.; Rice, A. L.; Khalil, A. K.
2017-12-01
Methane is the second most important greenhouse gas after carbon dioxide. In efforts to control emissions, a careful examination of the methane budget and source strengths is required. To determine methane surface fluxes, Bayesian methods are often used to provide top-down constraints. Inverse modeling derives unknown fluxes using observed methane concentrations, a chemical transport model (CTM) and prior information. The Bayesian inversion reduces prior flux uncertainties by exploiting the information content in the data. While the Bayesian formalism produces internal error estimates of source fluxes, systematic or external errors that arise from user choices in the inversion scheme are often much larger. Here we examine model sensitivity and uncertainty of our inversion under different observation data sets and CTM grid resolutions. We compare posterior surface fluxes using the data product GLOBALVIEW-CH4 against the event-level molar mixing ratio data available from NOAA. GLOBALVIEW-CH4 is a collection of CH4 concentration estimates from 221 sites, collected by 12 laboratories, that have been interpolated and extracted to provide weekly records from 1984-2008. In contrast, the event-level NOAA data set records field measurements of methane mixing ratios from 102 sites, with sampling-frequency irregularities and gaps in time. Furthermore, the sampling platform types used by the data sets may influence the posterior flux estimates, namely fixed surface, tower, ship and aircraft sites. To explore the sensitivity of the posterior surface fluxes to the observation network geometry, inversions composed of all sites, only aircraft, only ship, only tower and only fixed surface sites are performed and compared. Also, we investigate the sensitivity of the error reduction associated with the resolution of the GEOS-Chem simulation (4°×5° vs 2°×2.5°) used to calculate the response matrix.
Using a higher-resolution grid decreased the model-data error at most sites, thereby increasing the information content at those sites. These different inversions (event-level vs. interpolated data, higher vs. lower resolution) are compared using an ensemble of descriptive and comparative statistics. Analyzing the sensitivity of the inverse model leads to more accurate estimates of the methane source category uncertainty.
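In the linear-Gaussian case, the Bayesian flux inversion described above has a closed-form posterior. A minimal sketch (the transport operator `H`, priors, and covariances are illustrative placeholders, not the GEOS-Chem response matrix):

```python
import numpy as np

def bayesian_inversion(H, y, x_prior, S_prior, R):
    """Linear-Gaussian Bayesian inversion: posterior mean and covariance
    of fluxes x given observations y = H x + noise.
    H: (n_obs, n_flux) transport operator; S_prior: prior flux covariance;
    R: observation-error covariance."""
    K = S_prior @ H.T @ np.linalg.inv(H @ S_prior @ H.T + R)  # gain matrix
    x_post = x_prior + K @ (y - H @ x_prior)
    S_post = S_prior - K @ H @ S_prior
    return x_post, S_post
```

The error reduction discussed in the abstract corresponds to the shrinkage of the posterior variances (diagonal of `S_post`) relative to the prior variances.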
Nishimura, T; Uehara, T; Shimonagata, T; Nagata, S; Haze, K
1994-01-01
This study was undertaken to evaluate the relationships between myocardial perfusion and metabolism. Simultaneous beta-methyl-p(123I)iodophenylpentadecanoic acid (123I-BMIPP) and thallium 201 myocardial single-photon emission computed tomography (SPECT) were performed in 25 patients with myocardial infarction (group A) and 16 patients with hypertrophic cardiomyopathy (group B). The severity scores of 123I-BMIPP and 201Tl myocardial SPECT images were evaluated semiquantitatively by segmental analysis. In group A, dissociations between thallium- and 123I-BMIPP-imaged defects were frequently observed in patients with successful reperfusion compared with those with no reperfusion and those with reinfarction. In four patients with successful reperfusion, repeated 123I-BMIPP and 201Tl myocardial SPECT showed gradual improvement of the 123I-BMIPP severity score compared with the thallium severity score. In group B, dissociations between thallium- and 123I-BMIPP-imaged defects were also demonstrated in hypertrophic myocardium. In addition, nonhypertrophic myocardium also had decreased 123I-BMIPP uptake. In groups A and B, 123I-BMIPP severity scores correlated well with left ventricular function compared with thallium severity scores. These findings indicate that 123I-BMIPP is a suitable agent for the assessment of functional integrity, because left ventricular wall motion is energy dependent and 123I-BMIPP may reflect an aspect of myocardial energy production. This agent may be useful for the early detection and patient management of various heart diseases as an alternative to positron emission tomographic study.
Diaz, Alejandro A; Estépar, Raul San José; Washko, George R
2016-01-01
Computed tomographic measures of central airway morphology have been used in clinical, epidemiologic, and genetic investigation as an inference of the presence and severity of small-airway disease in smokers. Although several association studies have brought us to believe that these computed tomographic measures reflect airway remodeling, a careful review of such data and more recent evidence may reveal underappreciated complexity to these measures and limitations that prompt us to question that belief. This Perspective offers a review of seminal papers and alternative explanations of their data in the light of more recent evidence. The relationships between airway morphology and lung function are observed in subjects who never smoked, implying that native airway structure indeed contributes to lung function; computed tomographic measures of central airways such as wall area, lumen area, and total bronchial area are smaller in smokers with chronic obstructive pulmonary disease versus those without chronic obstructive pulmonary disease; and the airways are smaller as disease severity increases. The observations suggest that (1) native airway morphology likely contributes to the relationships between computed tomographic measures of airways and lung function; and (2) the presence of smaller airways in those with chronic obstructive pulmonary disease versus those without chronic obstructive pulmonary disease as well as their decrease with disease severity suggests that smokers with chronic obstructive pulmonary disease may simply have smaller airways to begin with, which put them at greater risk for the development of smoking-related disease.
Hierarchical multimodal tomographic x-ray imaging at a superbend
NASA Astrophysics Data System (ADS)
Stampanoni, M.; Marone, F.; Mikuljan, G.; Jefimovs, K.; Trtik, P.; Vila-Comamala, J.; David, C.; Abela, R.
2008-08-01
Over the last decade, synchrotron-based X-ray tomographic microscopy has established itself as a fundamental tool for non-invasive, quantitative investigations of a broad variety of samples, with applications ranging from space research and materials science to biology and medicine. Thanks to the brilliance of modern third-generation sources, voxel sizes in the micrometer range are routinely achieved by the major X-ray microtomography devices around the world, while the isotropic 100 nm barrier is reached and surpassed by only a few instruments. The beamline for TOmographic Microscopy and Coherent rAdiology experiments (TOMCAT) of the Swiss Light Source at the Paul Scherrer Institut operates a multimodal endstation which offers tomographic capabilities in the micrometer range, in absorption contrast as well as phase-contrast imaging. Recently, the beamline has been equipped with a full-field, hard X-ray microscope with a theoretical pixel size down to 30 nm and a field of view of 50 microns. The nanoscope performs well at X-ray energies between 8 and 12 keV, selected from the white beam of a 2.9 T superbend by a [Ru/C]100 fixed-exit multilayer monochromator. In this work we illustrate the experimental setup dedicated to the nanoscope, in particular the ad hoc designed X-ray optics needed to produce a homogeneous, square illumination of the sample imaging plane as well as the magnifying zone plate. Tomographic reconstructions at 60 nm voxel size will be shown and discussed.
Field-portable lensfree tomographic microscope.
Isikman, Serhan O; Bishara, Waheb; Sikora, Uzair; Yaglidere, Oguzhan; Yeah, John; Ozcan, Aydogan
2011-07-07
We present a field-portable lensfree tomographic microscope, which can achieve sectional imaging of a large volume (∼20 mm(3)) on a chip with an axial resolution of <7 μm. In this compact tomographic imaging platform (weighing only ∼110 grams), 24 light-emitting diodes (LEDs) that are each butt-coupled to a fibre-optic waveguide are controlled through a cost-effective micro-processor to sequentially illuminate the sample from different angles to record lensfree holograms of the sample that is placed on the top of a digital sensor array. In order to generate pixel super-resolved (SR) lensfree holograms and hence digitally improve the achievable lateral resolution, multiple sub-pixel shifted holograms are recorded at each illumination angle by electromagnetically actuating the fibre-optic waveguides using compact coils and magnets. These SR projection holograms obtained over an angular range of ±50° are rapidly reconstructed to yield projection images of the sample, which can then be back-projected to compute tomograms of the objects on the sensor-chip. The performance of this compact and light-weight lensfree tomographic microscope is validated by imaging micro-beads of different dimensions as well as a Hymenolepis nana egg, which is an infectious parasitic flatworm. Achieving a decent three-dimensional spatial resolution, this field-portable on-chip optical tomographic microscope might provide a useful toolset for telemedicine and high-throughput imaging applications in resource-poor settings. This journal is © The Royal Society of Chemistry 2011
Relative arrival-time upper-mantle tomography and the elusive background mean
NASA Astrophysics Data System (ADS)
Bastow, Ian D.
2012-08-01
The interpretation of seismic tomographic images of upper-mantle seismic wave speed structure is often a matter of considerable debate because the observations can usually be explained by a range of hypotheses, including variable temperature, composition, anisotropy, and the presence of partial melt. An additional problem, often overlooked in tomographic studies using relative as opposed to absolute arrival-times, is the issue of the resulting velocity model's zero mean. In shield areas, for example, relative arrival-time analysis strips off a background mean velocity structure that is markedly fast compared to the global average. Conversely, in active areas, the background mean is often markedly slow compared to the global average. Appreciation of this issue is vital when interpreting seismic tomographic images: 'high' and 'low' velocity anomalies should not necessarily be interpreted, respectively, as 'fast' and 'slow' compared to 'normal mantle'. This issue has been discussed in detail in the seismological literature over the years, yet subsequent tomography studies have still fallen into the trap of misinterpreting their velocity models. I highlight here some recent examples of this and provide a simple strategy to address the problem using constraints from a recent global tomographic model, and insights from catalogues of absolute travel-time anomalies. Consultation of such absolute measures of seismic wave speed should be routine during regional tomographic studies, if only for the benefit of the broader Earth Science community, who readily follow the red = hot and slow, blue = cold and fast rule of thumb when interpreting the images for themselves.
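The loss of the background mean under relative arrival-time analysis can be illustrated with a toy calculation (all numbers below are invented for illustration): demeaning removes any velocity perturbation shared by every station, which is exactly the shield-wide fast (or rift-wide slow) signal the abstract warns about.

```python
import numpy as np

# Hypothetical absolute travel-time residuals (s) for one event recorded at
# five stations over a shield region: every arrival is early ("fast")
# relative to a global reference model.
absolute_residuals = np.array([-1.2, -1.0, -0.8, -1.1, -0.9])

# Relative arrival-time analysis removes the event mean, i.e. the background
# term common to all stations.
background_mean = absolute_residuals.mean()          # about -1.0 s
relative_residuals = absolute_residuals - background_mean

# The relative residuals are zero-mean by construction: stations now appear
# "fast" or "slow" only with respect to each other, not to the global average.
```

Comparing `relative_residuals` from two such regions would make them look alike even when their `background_mean` values differ by two seconds, which is why consulting absolute travel-time catalogues matters.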
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blanc, Guillermo A.; Kewley, Lisa; Vogt, Frédéric P. A.
2015-01-10
We present a new method for inferring the metallicity (Z) and ionization parameter (q) of H II regions and star-forming galaxies using strong nebular emission lines (SELs). We use Bayesian inference to derive the joint and marginalized posterior probability density functions for Z and q given a set of observed line fluxes and an input photoionization model. Our approach allows the use of arbitrary sets of SELs and the inclusion of flux upper limits. The method provides a self-consistent way of determining the physical conditions of ionized nebulae that is not tied to the arbitrary choice of a particular SEL diagnostic and uses all the available information. Unlike theoretically calibrated SEL diagnostics, the method is flexible and not tied to a particular photoionization model. We describe our algorithm, validate it against other methods, and present a tool that implements it called IZI. Using a sample of nearby extragalactic H II regions, we assess the performance of commonly used SEL abundance diagnostics. We also use a sample of 22 local H II regions having both direct and recombination line (RL) oxygen abundance measurements in the literature to study discrepancies in the abundance scale between different methods. We find that oxygen abundances derived through Bayesian inference using currently available photoionization models in the literature can be in good (∼30%) agreement with RL abundances, although some models perform significantly better than others. We also confirm that abundances measured using the direct method are typically ∼0.2 dex lower than both RL and photoionization-model-based abundances.
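A minimal sketch of the grid-based Bayesian inference described above. The linear `model_flux` below is a purely illustrative stand-in for a photoionization model grid (IZI uses real model grids), and the observed ratio and error are invented:

```python
import numpy as np

# Toy stand-in for a photoionization model: predicted line ratio as a
# (made-up) linear function of metallicity Z and ionization parameter q.
def model_flux(Z, q):
    return 0.5 * Z - 0.2 * q

Z_grid = np.linspace(7.5, 9.5, 81)          # 12 + log(O/H)
q_grid = np.linspace(6.5, 8.5, 81)          # log q
ZZ, QQ = np.meshgrid(Z_grid, q_grid, indexing="ij")

observed, sigma = 2.7, 0.1                  # observed ratio and its error

# Gaussian likelihood on the grid with a flat prior; normalizing gives the
# joint posterior p(Z, q | flux).
log_like = -0.5 * ((model_flux(ZZ, QQ) - observed) / sigma) ** 2
post = np.exp(log_like - log_like.max())
post /= post.sum()

# Marginalized posteriors, as reported separately for Z and q.
pZ = post.sum(axis=1)
pq = post.sum(axis=0)
```

With several lines, the log-likelihoods simply add before normalization, which is how arbitrary sets of SELs (and, with a modified likelihood term, upper limits) enter the same framework.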
Natanegara, Fanni; Neuenschwander, Beat; Seaman, John W; Kinnersley, Nelson; Heilmann, Cory R; Ohlssen, David; Rochester, George
2014-01-01
Bayesian applications in medical product development have recently gained popularity. Despite many advances in Bayesian methodology and computation, the increase in application across the various areas of medical product development has been modest. The DIA Bayesian Scientific Working Group (BSWG), which includes representatives from industry, regulatory agencies, and academia, has adopted the vision to ensure Bayesian methods are well understood, accepted more broadly, and appropriately utilized to improve decision making and enhance patient outcomes. As Bayesian applications in medical product development are wide ranging, several sub-teams were formed to focus on various topics such as patient safety, non-inferiority, prior specification, comparative effectiveness, joint modeling, program-wide decision making, analytical tools, and education. The focus of this paper is on the recent effort of the BSWG Education sub-team to administer a Bayesian survey to statisticians across 17 organizations involved in medical product development. We summarize results of this survey, from which we provide recommendations on how to accelerate progress in Bayesian applications throughout medical product development. The survey results support findings from the literature and provide additional insight on regulatory acceptance of Bayesian methods and information on the need for a Bayesian infrastructure within an organization. The survey findings support the claim that only modest progress in areas of education and implementation has been made recently, despite substantial progress in Bayesian statistical research and software availability. Copyright © 2013 John Wiley & Sons, Ltd.
On the Adequacy of Bayesian Evaluations of Categorization Models: Reply to Vanpaemel and Lee (2012)
ERIC Educational Resources Information Center
Wills, Andy J.; Pothos, Emmanuel M.
2012-01-01
Vanpaemel and Lee (2012) argued, and we agree, that the comparison of formal models can be facilitated by Bayesian methods. However, Bayesian methods neither precede nor supplant our proposals (Wills & Pothos, 2012), as Bayesian methods can be applied both to our proposals and to their polar opposites. Furthermore, the use of Bayesian methods to…
Moving beyond qualitative evaluations of Bayesian models of cognition.
Hemmer, Pernille; Tauber, Sean; Steyvers, Mark
2015-06-01
Bayesian models of cognition provide a powerful way to understand the behavior and goals of individuals from a computational point of view. Much of the focus in the Bayesian cognitive modeling approach has been on qualitative model evaluations, where predictions from the models are compared to data that is often averaged over individuals. In many cognitive tasks, however, there are pervasive individual differences. We introduce an approach to directly infer individual differences related to subjective mental representations within the framework of Bayesian models of cognition. In this approach, Bayesian data analysis methods are used to estimate cognitive parameters and motivate the inference process within a Bayesian cognitive model. We illustrate this integrative Bayesian approach on a model of memory. We apply the model to behavioral data from a memory experiment involving the recall of heights of people. A cross-validation analysis shows that the Bayesian memory model with inferred subjective priors predicts withheld data better than a Bayesian model where the priors are based on environmental statistics. In addition, the model with inferred priors at the individual subject level led to the best overall generalization performance, suggesting that individual differences are important to consider in Bayesian models of cognition.
NASA Astrophysics Data System (ADS)
Ars, S.; Broquet, G.; Yver-Kwok, C.; Wu, L.; Bousquet, P.; Roustan, Y.
2015-12-01
Greenhouse gas (GHG) concentrations have been increasing in the atmosphere since the industrial revolution. Methane (CH4) is the second most important anthropogenic GHG after carbon dioxide (CO2). Its sources and sinks are now well identified, but their relative contributions remain uncertain. Industry and waste treatment emit a substantial share of anthropogenic methane that is difficult to quantify because the sources are fugitive and discontinuous. A better estimate of methane emissions could help industries adapt their mitigation policies and encourage them to install methane recovery systems, reducing their emissions while saving money. Several methods exist to quantify methane emissions. Among them is the tracer release method, which consists in releasing a tracer gas near the methane source at a well-known rate and measuring both concentrations in the emission plume. The methane emission rate is calculated from the ratio of the methane and tracer concentrations and the release rate of the tracer. A good estimate of the methane emissions requires a good separation of the methane actually emitted by the site from the background concentration level, as well as a good knowledge of the source distribution over the site. For this purpose, a Gaussian plume model is used in addition to the tracer release method to assess the calculated emission rates. In a first step, the tracer data obtained during a field campaign are used to tune the model; different model parameterizations were tested to find the best representation of the atmospheric dispersion conditions. Once these parameters are set, methane emissions are estimated from the measured methane concentrations by Bayesian inversion, which adjusts the position and emission rate of the different methane sources on the site and removes the methane background concentration.
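The core ratio calculation of the tracer release method can be sketched as follows. The choice of acetylene as tracer and all concentrations are illustrative assumptions, not values from the campaign:

```python
# Tracer release method: a tracer (acetylene here, an assumed choice) is
# released next to the CH4 source at a known rate; with both plumes well
# mixed, the CH4 emission rate follows from the ratio of the
# background-corrected concentration enhancements, converted from molar
# mixing ratios to mass via the molar-mass ratio.
M_CH4, M_C2H2 = 16.04, 26.04      # g/mol

def ch4_emission_rate(q_tracer_kg_h, ch4_ppb, ch4_bg_ppb,
                      tracer_ppb, tracer_bg_ppb):
    """CH4 emission rate (kg/h) from the enhancement ratio to the tracer."""
    ratio = (ch4_ppb - ch4_bg_ppb) / (tracer_ppb - tracer_bg_ppb)
    return q_tracer_kg_h * ratio * (M_CH4 / M_C2H2)

# Illustrative numbers: 1 kg/h tracer release, 200 ppb CH4 enhancement over
# a 2000 ppb background, 50 ppb tracer enhancement over a negligible one.
q_ch4 = ch4_emission_rate(1.0, 2200.0, 2000.0, 50.0, 0.0)
```

The background subtraction in the two numerators is exactly the separation between site emissions and ambient methane that the abstract identifies as critical.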
Bayesian structural equation modeling in sport and exercise psychology.
Stenling, Andreas; Ivarsson, Andreas; Johnson, Urban; Lindwall, Magnus
2015-08-01
Bayesian statistics is on the rise in mainstream psychology, but applications in sport and exercise psychology research are scarce. In this article, the foundations of Bayesian analysis are introduced, and we will illustrate how to apply Bayesian structural equation modeling in a sport and exercise psychology setting. More specifically, we contrasted a confirmatory factor analysis on the Sport Motivation Scale II estimated with the most commonly used estimator, maximum likelihood, and a Bayesian approach with weakly informative priors for cross-loadings and correlated residuals. The results indicated that the model with Bayesian estimation and weakly informative priors provided a good fit to the data, whereas the model estimated with a maximum likelihood estimator did not produce a well-fitting model. The reasons for this discrepancy between maximum likelihood and Bayesian estimation are discussed as well as potential advantages and caveats with the Bayesian approach.
A New Approach to X-ray Analysis of SNRs
NASA Astrophysics Data System (ADS)
Frank, Kari A.; Burrows, David; Dwarkadas, Vikram
2016-06-01
We present preliminary results of applying a novel analysis method, Smoothed Particle Inference (SPI), to XMM-Newton observations of SNR RCW 103 and Tycho. SPI is a Bayesian modeling process that fits a population of gas blobs (”smoothed particles”) such that their superposed emission reproduces the observed spatial and spectral distribution of photons. Emission-weighted distributions of plasma properties, such as abundances and temperatures, are then extracted from the properties of the individual blobs. This technique has important advantages over analysis techniques which implicitly assume that remnants are two-dimensional objects in which each line of sight encompasses a single plasma. By contrast, SPI allows superposition of as many blobs of plasma as are needed to match the spectrum observed in each direction, without the need to bin the data spatially. The analyses of RCW 103 and Tycho are part of a pilot study for the larger SPIES (Smoothed Particle Inference Exploration of SNRs) project, in which SPI will be applied to a sample of 12 bright SNRs.
Smoothed Particle Inference Analysis of SNR RCW 103
NASA Astrophysics Data System (ADS)
Frank, Kari A.; Burrows, David N.; Dwarkadas, Vikram
2016-04-01
We present preliminary results of applying a novel analysis method, Smoothed Particle Inference (SPI), to an XMM-Newton observation of SNR RCW 103. SPI is a Bayesian modeling process that fits a population of gas blobs ("smoothed particles") such that their superposed emission reproduces the observed spatial and spectral distribution of photons. Emission-weighted distributions of plasma properties, such as abundances and temperatures, are then extracted from the properties of the individual blobs. This technique has important advantages over analysis techniques which implicitly assume that remnants are two-dimensional objects in which each line of sight encompasses a single plasma. By contrast, SPI allows superposition of as many blobs of plasma as are needed to match the spectrum observed in each direction, without the need to bin the data spatially. This RCW 103 analysis is part of a pilot study for the larger SPIES (Smoothed Particle Inference Exploration of SNRs) project, in which SPI will be applied to a sample of 12 bright SNRs.
The PoGO+ view on Crab off-pulse hard X-ray polarization
NASA Astrophysics Data System (ADS)
Chauvin, M.; Florén, H.-G.; Friis, M.; Jackson, M.; Kamae, T.; Kataoka, J.; Kawano, T.; Kiss, M.; Mikhalev, V.; Mizuno, T.; Tajima, H.; Takahashi, H.; Uchida, N.; Pearce, M.
2018-06-01
The linear polarization fraction (PF) and angle of the hard X-ray emission from the Crab provide unique insight into high-energy radiation mechanisms, complementing the usual imaging, timing, and spectroscopic approaches. Results have recently been presented by two missions operating in partially overlapping energy bands, PoGO+ (18-160 keV) and AstroSat CZTI (100-380 keV). We previously reported PoGO+ results on the polarization parameters integrated across the light curve and for the entire nebula-dominated off-pulse region. We now introduce finer phase binning, in light of the AstroSat CZTI claim that the PF varies across the off-pulse region. Since both missions are operating in a regime where errors on the reconstructed polarization parameters are non-Gaussian, we adopt a Bayesian approach to compare results from each mission. We find no statistically significant variation in off-pulse polarization parameters, neither when considering the mission data separately nor when they are combined. This supports expectations from standard high-energy emission models.
Inverse Estimation of California Methane Emissions and Their Uncertainties using FLEXPART-WRF
NASA Astrophysics Data System (ADS)
Cui, Y.; Brioude, J. F.; Angevine, W. M.; McKeen, S. A.; Peischl, J.; Nowak, J. B.; Henze, D. K.; Bousserez, N.; Fischer, M. L.; Jeong, S.; Liu, Z.; Michelsen, H. A.; Santoni, G.; Daube, B. C.; Kort, E. A.; Frost, G. J.; Ryerson, T. B.; Wofsy, S. C.; Trainer, M.
2015-12-01
Methane (CH4) has a large global warming potential and mediates global tropospheric chemistry. In California, CH4 emissions estimates derived from "top-down" methods based on atmospheric observations have been found to be greater than expected from "bottom-up" population-apportioned national and state inventories. Differences between bottom-up and top-down estimates suggest that the understanding of California's CH4 sources is incomplete, leading to uncertainty in the application of regulations to mitigate regional CH4 emissions. In this study, we use airborne measurements from the California research at the Nexus of Air Quality and Climate Change (CalNex) campaign in 2010 to estimate CH4 emissions in the South Coast Air Basin (SoCAB), which includes California's largest metropolitan area (Los Angeles), and in the Central Valley, California's main agricultural and livestock management area. Measurements from 12 daytime flights, prior information from national and regional official inventories (e.g. US EPA's National Emission Inventory, the California Air Resources Board inventories, the Liu et al. Hybrid Inventory, and the California Greenhouse Gas Emissions Measurement dataset), and the FLEXPART-WRF transport model are used in our mesoscale Bayesian inverse system. We compare our optimized posterior CH4 inventory to the prior bottom-up inventories in terms of total emissions (Mg CH4/hr) and the spatial distribution of the emissions (0.1 degree), and quantify uncertainties in our posterior estimates. Our inversions show that the oil and natural gas industry (extraction, processing and distribution) is the main source accounting for the gap between top-down and bottom-up inventories over the SoCAB, while dairy farms are the largest CH4 source in the Central Valley. CH4 emissions of dairy farms in the San Joaquin Valley and variations of CH4 emissions in the rice-growing regions of Sacramento Valley are quantified and discussed. 
We also estimate CO and NH3 surface fluxes and use their observed correlation with CH4 mixing ratio to further evaluate our CH4 total emission estimates, and understand the spatial distribution of CH4 emissions.
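The Bayesian inverse step in such mesoscale systems can be sketched, under linear-Gaussian assumptions, with the standard posterior-mean update; the toy footprint matrix and all covariances below are invented for illustration and are not the CalNex configuration:

```python
import numpy as np

# Linear-Gaussian Bayesian inversion: y = H x + noise, prior x ~ N(x_prior, B),
# observation error covariance R. Posterior mean and covariance:
#   x_post = x_prior + K (y - H x_prior),  K = B H^T (H B H^T + R)^-1
def bayesian_inversion(H, y, x_prior, B, R):
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # gain matrix
    x_post = x_prior + K @ (y - H @ x_prior)
    P_post = B - K @ H @ B                          # posterior uncertainty
    return x_post, P_post

# Invented toy problem: two emission sources, three observations; H plays
# the role of the transport-model footprint (source-receptor) matrix.
H = np.array([[1.0, 0.2],
              [0.5, 0.5],
              [0.1, 1.0]])
x_true = np.array([3.0, 1.0])
y = H @ x_true                    # noise-free synthetic observations
x_prior = np.array([1.0, 1.0])
B = 4.0 * np.eye(2)               # prior (bottom-up inventory) uncertainty
R = 0.01 * np.eye(3)              # observation error

x_post, P_post = bayesian_inversion(H, y, x_prior, B, R)
```

The shrinkage of the posterior variances relative to the prior diagonal of `B` is the "quantified uncertainty in posterior estimates" referred to above.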
NASA Astrophysics Data System (ADS)
Arnold, Tim; Manning, Alistair; Li, Shanlan; Kim, Jooil; Park, Sunyoung; Muhle, Jens; Weiss, Ray
2017-04-01
The fluorinated species carbon tetrafluoride (CF4; PFC-14), nitrogen trifluoride (NF3) and trifluoromethane (CHF3; HFC-23) are potent greenhouse gases with 100-year global warming potentials of 6,630, 16,100 and 12,400, respectively. Unlike the majority of CFC replacements, which are emitted from fugitive and mobile sources, these gases are mostly emitted from large single point sources - semiconductor manufacturing facilities (all three), aluminium smelting plants (CF4) and chlorodifluoromethane (HCFC-22) factories (HFC-23). In this work we show that atmospheric measurements can serve as a basis to calculate emissions of these gases and to highlight emission 'hotspots'. We use measurements from one of the Advanced Global Atmospheric Gases Experiment (AGAGE) long-term monitoring sites, at Gosan on Jeju Island in the Republic of Korea. This site measures CF4, NF3 and HFC-23, alongside a suite of greenhouse and stratospheric ozone depleting gases, every two hours using automated in situ gas-chromatography mass-spectrometry instrumentation. We couple each measurement to an analysis of air history using the regional atmospheric transport model NAME (Numerical Atmospheric dispersion Modelling Environment) driven by 3D meteorology from the Met Office's Unified Model, and use a Bayesian inverse method (InTEM - Inversion Technique for Emission Modelling) to calculate yearly emission changes over seven years between 2008 and 2015. We show that our 'top-down' emission estimates for NF3 and CF4 are significantly larger than 'bottom-up' estimates in the EDGAR emissions inventory (edgar.jrc.ec.europa.eu). For example, we calculate South Korean emissions of CF4 in 2010 to be 0.29±0.04 Gg/yr, significantly larger than the EDGAR prior estimate of 0.07 Gg/yr. Further, inversions for several separate years indicate that emission hotspots can be found without prior spatial information.
At present these gases make only a small contribution to global radiative forcing; however, given that the impact of these long-lived gases could rise significantly and that point sources of such gases can be mitigated, atmospheric monitoring could be an important tool for aiding emissions reduction policy.
Arima, E. Y.
2016-01-01
Tropical forests are now at the center stage of climate mitigation policies worldwide given their roles as sources of carbon emissions resulting from deforestation and forest degradation. Although the international community has created mechanisms such as REDD+ to reduce those emissions, developing tropical countries continue to invest in infrastructure development in an effort to spur economic growth. Construction of roads in particular is known to be an important driver of deforestation. This article simulates the impact of road construction on deforestation in Western Amazonia, Peru, and quantifies the amount of carbon emissions associated with projected deforestation. To accomplish this objective, the article adopts a Bayesian probit land change model in which spatial dependencies are defined between regions or groups of pixels instead of between individual pixels, thereby reducing computational requirements. It also compares and contrasts the patterns of deforestation predicted by both spatial and non-spatial probit models. The spatial model replicates complex patterns of deforestation whereas the non-spatial model fails to do so. In terms of policy, both models suggest that road construction will increase deforestation by a modest amount, between 200 and 300 km². This translates into aboveground carbon emissions of between 1.36 and 1.85 × 10⁶ tons. However, recent introduction of palm oil in the region serves as a cautionary example that the models may be underestimating the impact of roads. PMID:27010739
The spectral energy distribution of powerful starburst galaxies - I. Modelling the radio continuum
NASA Astrophysics Data System (ADS)
Galvin, T. J.; Seymour, N.; Marvil, J.; Filipović, M. D.; Tothill, N. F. H.; McDermid, R. M.; Hurley-Walker, N.; Hancock, P. J.; Callingham, J. R.; Cook, R. H.; Norris, R. P.; Bell, M. E.; Dwarakanath, K. S.; For, B.; Gaensler, B. M.; Hindson, L.; Johnston-Hollitt, M.; Kapińska, A. D.; Lenc, E.; McKinley, B.; Morgan, J.; Offringa, A. R.; Procopio, P.; Staveley-Smith, L.; Wayth, R. B.; Wu, C.; Zheng, Q.
2018-02-01
We have acquired radio-continuum data between 70 MHz and 48 GHz for a sample of 19 southern starburst galaxies at moderate redshifts (0.067 < z < 0.227) with the aim of separating synchrotron and free-free emission components. Using a Bayesian framework, we find the radio continuum is rarely characterized well by a single power law, instead often exhibiting low-frequency turnovers below 500 MHz, steepening at mid to high frequencies, and a flattening at high frequencies where free-free emission begins to dominate over the synchrotron emission. These higher order curvature components may be attributed to free-free absorption across multiple regions of star formation with varying optical depths. The decomposed synchrotron and free-free emission components in our sample of galaxies form strong correlations with the total-infrared bolometric luminosities. Finally, we find that without accounting for free-free absorption with turnovers between 90 and 500 MHz the radio continuum at low frequency (ν < 200 MHz) could be overestimated by upwards of a factor of 12 if a simple power-law extrapolation is used from higher frequencies. The mean synchrotron spectral index of our sample is constrained to be α = -1.06, which is steeper than the canonical value of -0.8 for normal galaxies. We suggest this may be caused by an intrinsically steeper cosmic ray distribution.
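The low-frequency overestimate from ignoring free-free absorption can be illustrated with a toy SED model; only the synchrotron spectral index matches the paper, while the normalizations and the assumed 200 MHz turnover are invented:

```python
import numpy as np

# Toy SED: free-free-absorbed synchrotron power law plus optically thin
# free-free emission; tau follows the free-free opacity scaling nu^-2.1.
def sed(nu_ghz, a_sync=1.0, alpha=-1.06, nu_turn=0.2, a_ff=0.05):
    tau = (nu_ghz / nu_turn) ** -2.1
    return np.exp(-tau) * a_sync * nu_ghz ** alpha + a_ff * nu_ghz ** -0.1

nu = 0.15                                  # 150 MHz, below the assumed turnover
power_law_only = 1.0 * nu ** -1.06         # naive extrapolation, no absorption
overestimate = power_law_only / sed(nu)    # factor by which flux is overpredicted
```

Below the turnover the exponential opacity term suppresses the synchrotron component, so a pure power-law extrapolation from higher frequencies overshoots by a large factor, qualitatively reproducing the effect described above.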
Siberian Arctic black carbon sources constrained by model and observation
Andersson, August; Eckhardt, Sabine; Stohl, Andreas; Semiletov, Igor P.; Dudarev, Oleg V.; Charkin, Alexander; Shakhova, Natalia; Klimont, Zbigniew; Heyes, Chris; Gustafsson, Örjan
2017-01-01
Black carbon (BC) in haze and deposited on snow and ice can have strong effects on the radiative balance of the Arctic. There is a geographic bias in Arctic BC studies toward the Atlantic sector, with lack of observational constraints for the extensive Russian Siberian Arctic, spanning nearly half of the circum-Arctic. Here, 2 y of observations at Tiksi (East Siberian Arctic) establish a strong seasonality in both BC concentrations (8 ng⋅m−3 to 302 ng⋅m−3) and dual-isotope–constrained sources (19 to 73% contribution from biomass burning). Comparisons between observations and a dispersion model, coupled to an anthropogenic emissions inventory and a fire emissions inventory, give mixed results. In the European Arctic, this model has proven to simulate BC concentrations and source contributions well. However, the model is less successful in reproducing BC concentrations and sources for the Russian Arctic. Using a Bayesian approach, we show that, in contrast to earlier studies, contributions from gas flaring (6%), power plants (9%), and open fires (12%) are relatively small, with the major sources instead being domestic (35%) and transport (38%). The observation-based evaluation of reported emissions identifies errors in spatial allocation of BC sources in the inventory and highlights the importance of improving emission distribution and source attribution, to develop reliable mitigation strategies for efficient reduction of BC impact on the Russian Arctic, one of the fastest-warming regions on Earth. PMID:28137854
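The dual-isotope source constraint rests on linear mixing between endmember signatures; a minimal radiocarbon-only sketch (the endmember values are illustrative assumptions, not those of the study, which combines two isotopes in a Bayesian framework):

```python
# Two-endmember radiocarbon mass balance: fossil carbon is 14C-free
# (Delta14C = -1000 per mil) while biomass-burning carbon is contemporary.
# Endmember values below are illustrative assumptions.
def biomass_fraction(d14c_sample, d14c_biomass=225.0, d14c_fossil=-1000.0):
    """Fraction of BC from biomass burning via linear isotope mixing."""
    return (d14c_sample - d14c_fossil) / (d14c_biomass - d14c_fossil)

f_bb = biomass_fraction(-500.0)   # a sample leaning toward fossil sources
```

A Bayesian treatment would place priors on the endmember values and propagate their uncertainty into `f_bb`, which is how seasonal source fractions with credible intervals are obtained.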
System for plotting subsoil structure and method therefor
NASA Technical Reports Server (NTRS)
Narasimhan, K. Y.; Nathan, R.; Parthasarathy, S. P. (Inventor)
1980-01-01
Data for use in producing a tomograph of subsoil structure between boreholes are derived by placing spaced geophones in one borehole (or on the Earth's surface if desired) and by producing a sequence of shots at spaced-apart locations in the other borehole. The signals detected by each of the geophones from the various shots are processed either on a time-of-arrival basis or on the basis of signal amplitude to provide information on the characteristics of a large number of incremental areas between the boreholes. This information is usable to produce a tomograph of the subsoil structure between the boreholes. By processing signals of relatively high frequencies, e.g., up to 100 Hz, and by closely spacing the geophones, a high-resolution tomograph can be produced.
Robust statistical reconstruction for charged particle tomography
Schultz, Larry Joe; Klimenko, Alexei Vasilievich; Fraser, Andrew Mcleod; Morris, Christopher; Orum, John Christopher; Borozdin, Konstantin N; Sossong, Michael James; Hengartner, Nicolas W
2013-10-08
Systems and methods for charged particle detection, including statistical reconstruction of object volume scattering density profiles from charged particle tomographic data, which determine the probability distribution of charged particle scattering using a statistical multiple-scattering model and determine a substantially maximum-likelihood estimate of the object volume scattering density using an expectation maximization (ML/EM) algorithm to reconstruct the object volume scattering density. The presence and/or type of object occupying the volume of interest can be identified from the reconstructed volume scattering density profile. The charged particle tomographic data can be cosmic-ray muon tomographic data from a muon tracker for scanning packages, containers, vehicles, or cargo. The method can be implemented using a computer program which is executable on a computer.
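The ML/EM iteration referenced above has a compact standard form for Poisson data. The sketch below uses an invented 2-voxel, 3-ray system, not the patent's muon scattering-density likelihood, to show the multiplicative update itself:

```python
import numpy as np

# Standard ML/EM update for a Poisson tomographic model y ~ Poisson(A @ lam):
#   lam_j <- (lam_j / s_j) * sum_i a_ij * y_i / (A @ lam)_i,  s_j = sum_i a_ij
def mlem(A, y, n_iter=200):
    lam = np.ones(A.shape[1])              # flat initial image
    sens = A.sum(axis=0)                   # per-voxel sensitivity s_j
    for _ in range(n_iter):
        lam *= (A.T @ (y / (A @ lam))) / sens
    return lam

# Invented 2-voxel, 3-ray system matrix; noise-free synthetic data.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
y = A @ np.array([2.0, 5.0])
x_hat = mlem(A, y)                         # converges toward [2, 5]
```

The update is multiplicative, so a non-negative starting image stays non-negative, one reason EM is attractive for density-style reconstructions.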
Data-processing strategies for nano-tomography with elemental specification
NASA Astrophysics Data System (ADS)
Liu, Yijin; Cats, Korneel H.; Nelson Weker, Johanna; Andrews, Joy C.; Weckhuysen, Bert M.; Pianetta, Piero
2013-10-01
Combining the energy tunability provided by synchrotron X-ray sources with transmission X-ray microscopy, the morphology of materials can be resolved in 3D with elemental/chemical specification at a spatial resolution down to 30 nm. In order to study the energy dependence of the absorption coefficient over the investigated volume, the tomographic reconstruction and image registration (before and/or after the tomographic reconstruction) are critical. In this paper we compare two different data-processing strategies and conclude that the signal-to-noise ratio (S/N) in the final result can be improved by performing the tomographic reconstruction prior to evaluating the energy dependence. Our result echoes the dose fractionation theorem and is particularly helpful when the element of interest has a low concentration.
NASA Astrophysics Data System (ADS)
Kim, Y.; Nishina, K.; Chae, N.; Park, S. J.; Yoon, Y. J.; Lee, B. Y.
2014-10-01
The tundra ecosystem is quite vulnerable to drastic climate change in the Arctic, and the quantification of carbon dynamics is of particular importance with regard to thawing permafrost, changes to the snow-covered period and the extent of snow and shrub communities, and the decline of sea ice in the Arctic. Here, CO2 efflux measurements using a manual chamber system within a 40 m × 40 m (5 m interval; 81 total points) plot were conducted within dominant tundra vegetation on the Seward Peninsula of Alaska during the growing seasons of 2011 and 2012, for the assessment of driving parameters of CO2 efflux. We applied a hierarchical Bayesian (HB) model - a function of soil temperature, soil moisture, vegetation type, and thaw depth - to quantify the effects of environmental factors on CO2 efflux and to estimate growing-season CO2 emissions. Our results showed that average CO2 efflux in 2011 was 1.4 times higher than in 2012, resulting from the distinct difference in soil moisture between the 2 years. Tussock-dominated CO2 efflux is 1.4 to 2.3 times higher than that measured in lichen and moss communities, revealing tussock as a significant CO2 source in the Arctic, with a wide area distribution on the circumpolar scale. CO2 efflux followed soil temperature nearly exponentially in both the observed data and the posterior medians of the HB model. This reveals that soil temperature regulates the seasonal variation of CO2 efflux and that soil moisture contributes to the interannual variation of CO2 efflux for the two growing seasons in question. Obvious changes in soil moisture during the growing seasons of 2011 and 2012 resulted in an explicit difference between CO2 effluxes - 742 and 539 g CO2 m-2 period-1 for 2011 and 2012, respectively - suggesting the 2012 CO2 emission was reduced by 27% (95% credible interval: 17-36%) relative to the 2011 emission, due to higher soil moisture from severe rain.
The estimated growing season CO2 emission rate ranged from 0.86 Mg CO2 in 2012 to 1.20 Mg CO2 in 2011 within the 40 m × 40 m plot, corresponding to 86% and 80% of the annual CO2 emission rates within the western Alaska tundra ecosystem, as estimated from the temperature dependence of CO2 efflux. Therefore, this HB model can be readily applied to observed CO2 efflux, as it demands only four environmental factors, and it can also be effective for quantitatively assessing the driving parameters of CO2 efflux.
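The near-exponential temperature response described above is often summarized by a Q10 coefficient. As a minimal sketch (with made-up numbers, not the study's measurements), an exponential efflux-temperature relation E = a·exp(b·T) can be fitted by log-linear least squares:

```python
import numpy as np

# Hypothetical soil temperatures (degC) and CO2 effluxes following an
# exponential temperature response, E = a * exp(b * T). Values are
# illustrative only, not data from the study above.
temp = np.array([2.0, 5.0, 8.0, 11.0, 14.0])
a_true, b_true = 0.8, 0.12
efflux = a_true * np.exp(b_true * temp)

# Log-linear least squares: ln(E) = ln(a) + b * T
b_hat, ln_a_hat = np.polyfit(temp, np.log(efflux), 1)
a_hat = np.exp(ln_a_hat)

# Q10: factor by which efflux rises per 10 degC of warming
q10 = np.exp(10.0 * b_hat)
```

A full hierarchical Bayesian treatment would additionally model soil moisture, vegetation type, and thaw depth with hierarchical priors; this sketch illustrates only the temperature term.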
DOE Office of Scientific and Technical Information (OSTI.GOV)
Groth, M; Ellis, R; Brooks, N
A video camera system is described that measures the spatial distribution of visible line emission emitted from the main scrape-off layer (SOL) of plasmas in the DIII-D tokamak. A wide-angle lens installed on an equatorial port and an in-vessel mirror which intercepts part of the lens view provide simultaneous tangential views of the SOL on the low-field and high-field sides of the plasma's equatorial plane. Tomographic reconstruction techniques are used to calculate the 2-D poloidal profiles from the raw data, and 1-D poloidal profiles simulating chordal views of other optical diagnostics from the 2-D profiles. The 2-D profiles can be compared with SOL plasma simulations; the 1-D profiles with measurements from spectroscopic diagnostics. Sample results are presented which elucidate carbon transport in plasmas with toroidally uniform injection of methane, and argon transport in disruption mitigation experiments with massive gas jet injection.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seo, Dongcheol; Peterson, B. J.; Lee, Seung Hun
Resistive bolometers have been successfully installed at the midplane L-port of the Korea Superconducting Tokamak Advanced Research (KSTAR) device. The spatial and temporal resolutions, 4.5 cm and ~1 kHz, respectively, enable measurement of the radial profile of the total power radiated, through radiation and neutral particles, from the magnetically confined high-temperature plasma. The radiated power was measured in all shots; even at low plasma current, the bolometer signal was detectable. Electron cyclotron resonance heating (ECH) has been used in the tokamak for ECH-assisted start-up and for plasma control by local heating and current drive. The detectors of the resistive bolometer near the ECH antenna are affected by the electron cyclotron wave. Tomographic reconstruction, using the Phillips-Tikhonov regularization method, will be carried out to obtain the major-radius profile of the radiation emissivity of the circular cross-section plasma.
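Phillips-Tikhonov regularization, as mentioned for the planned reconstruction, stabilizes the inversion of chord-integrated bolometer signals by penalizing the curvature of the recovered profile. A minimal sketch with an illustrative geometry matrix (not the KSTAR viewing geometry):

```python
import numpy as np

# Sketch of Phillips-Tikhonov regularized inversion: recover an emissivity
# profile x from chord-integrated measurements b = A @ x + noise. The chord
# matrix, profile, and noise level are all invented for illustration.
rng = np.random.default_rng(0)
n = 30
A = rng.random((40, n))                               # hypothetical chord/geometry matrix
x_true = np.exp(-((np.arange(n) - 15) ** 2) / 20.0)   # smooth emissivity profile
b = A @ x_true + 0.01 * rng.standard_normal(40)

# Second-difference (curvature) operator, as in Phillips' method
L = np.diff(np.eye(n), n=2, axis=0)

lam = 1.0
# Solve the regularized normal equations (A^T A + lam^2 L^T L) x = A^T b
x_hat = np.linalg.solve(A.T @ A + lam**2 * (L.T @ L), A.T @ b)
```

The regularization parameter `lam` trades data misfit against profile smoothness; in practice it is chosen by criteria such as the L-curve or generalized cross-validation.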
In vivo biodistribution of ginkgolide B, a constituent of Ginkgo biloba, visualized by MicroPET.
Suehiro, Makiko; Simpson, Norman R; Underwood, Mark D; Castrillon, John; Nakanishi, Koji; van Heertum, Ronald
2005-07-01
The in vivo dynamic behavior of ginkgolide B (GB), a terpene lactone constituent of Ginkgo biloba extracts, in the living animal was visualized by positron emission tomographic (PET) imaging using a GB analogue labeled with the positron emitter ¹⁸F. The in vivo imaging studies, combined with ex vivo dissection experiments, reveal that GB exists in two forms in the body: the original GB with its lactone rings closed, and a second form with one of the rings open. The original GB in plasma is taken up rapidly by various organs including the liver, the intestine, and possibly the stomach. Consequently, in plasma, the proportion of the ionized form of GB increases dramatically with time. Thereafter the ratio between the two forms appears to shift slowly towards equilibrium. The results suggest that more attention needs to be focused on the in vivo dynamics between the two forms of GB.
Caliyurt, Okan; Vardar, Erdal; Tuglu, Cengiz
2004-01-01
We report a case of Cotard's syndrome associated with psychotic symptoms. A 27-year-old man was admitted to hospital with the diagnosis of schizophreniform disorder. His presenting symptoms, which had started 1 month before hospital admission, were somatic delusions of gastrointestinal and cardiovascular malfunction and the absence of a stomach, which resulted in a decrease in weight from 75 kg to 63 kg in 1 month. Cranial computed tomographic images showed dilatation of the lateral and third ventricles, whereas magnetic resonance imaging revealed central atrophy and lateral ventricle dilatation. Single-photon emission computed tomography demonstrated left temporal, left frontal and left parietal hypoperfusion. The patient did not respond to antipsychotic therapies, but he was successfully treated with electroconvulsive therapy. This report emphasizes that Cotard's syndrome may be accompanied by lesions of the left hemisphere and that electroconvulsive therapy could be the first-line therapy in such patients with psychotic disorder. PMID:15069468
Arenberg, Douglas
2011-02-01
This review focuses on aspects of bronchioloalveolar carcinoma (BAC) in which it differs in important ways from other forms of non-small-cell lung cancer. BAC is a form of adenocarcinoma with unique clinical, radiological, and epidemiological features. With the notable exception of a lower likelihood of a positive positron-emission tomographic (PET) scan in BAC, staging, diagnosis, and treatment are largely the same as for other histological subtypes of lung cancer. However, additional treatment options exist that are equivalent, if not more effective, for many patients with BAC. The diagnosis of BAC should be reserved for those tumors meeting the 1999/2004 criteria set forth by the World Health Organization. Revised nomenclature proposed by an expert consensus panel may change how this disease is viewed. Additional clinical trials are needed in patients with BAC, employing strict definitions and enrollment criteria to allow the results to be applied to appropriate patient populations. © Thieme Medical Publishers.
In vivo fluorescence lifetime tomography of a FRET probe expressed in mouse
McGinty, James; Stuckey, Daniel W.; Soloviev, Vadim Y.; Laine, Romain; Wylezinska-Arridge, Marzena; Wells, Dominic J.; Arridge, Simon R.; French, Paul M. W.; Hajnal, Joseph V.; Sardini, Alessandro
2011-01-01
Förster resonance energy transfer (FRET) is a powerful biological tool for reading out cell signaling processes. In vivo use of FRET is challenging because of the scattering properties of bulk tissue. By combining diffuse fluorescence tomography with fluorescence lifetime imaging (FLIM), implemented using wide-field time-gated detection of fluorescence excited by ultrashort laser pulses in a tomographic imaging system and applying inverse scattering algorithms, we can reconstruct the three-dimensional spatial localization of fluorescence quantum efficiency and lifetime. We demonstrate in vivo spatial mapping of FRET between genetically expressed fluorescent proteins in live mice read out using FLIM. Following transfection by electroporation, mouse hind leg muscles were imaged in vivo, and the emission of free donor (eGFP) in the presence of free acceptor (mCherry) could be clearly distinguished from the fluorescence of the donor when directly linked to the acceptor in a tandem (eGFP-mCherry) FRET construct. PMID:21750768
Lockwood, Alan H; Weissenborn, Karin; Bokemeyer, Martin; Tietge, U; Burchert, Wolfgang
2002-03-01
Many cirrhotics have abnormal neuropsychological test scores. To define the anatomical-physiological basis for encephalopathy in nonalcoholic cirrhotics, we performed resting-state fluorodeoxyglucose positron emission tomographic scans and administered a neuropsychological test battery to 18 patients and 10 controls. Statistical parametric mapping correlated changes in regional glucose metabolism with performance on the individual tests and a composite battery score. In patients without overt encephalopathy, poor performance correlated with reductions in metabolism in the anterior cingulate. In all patients, poor performance on the battery was positively correlated (p < 0.001) with glucose metabolism in bifrontal and biparietal regions of the cerebral cortex and negatively correlated with metabolism in hippocampal, lingual, and fusiform gyri and the posterior putamen. Similar patterns of abnormal metabolism were found when comparing the patients to 10 controls. Metabolic abnormalities in the anterior attention system and association cortices mediating executive and integrative function form the pathophysiological basis for mild hepatic encephalopathy.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuhl, D.E.; Phelps, M.E.; Engel, J. Jr.
1980-01-01
The ECAT Positron Tomograph was used to scan normal control subjects, stroke patients at various times during recovery, and patients with partial epilepsy during EEG monitoring. ¹⁸F-fluorodeoxyglucose (¹⁸FDG) and ¹³N-ammonia (¹³NH₃) were used as indicators of abnormalities in local cerebral glucose utilization (LCMRglc) and relative perfusion, respectively. Hypometabolism, due to deactivation or minimal damage, was demonstrated with the ¹⁸FDG scan in deep structures and broad zones of cerebral cortex which appeared normal on x-ray CT (XCT) and ⁹⁹ᵐTc pertechnetate scans. In patients with partial epilepsy, who had unilateral or focal electrical abnormalities, interictal ¹⁸FDG scan patterns clearly showed localized regions of decreased (20 to 50%) LCMRglc, which correlated anatomically with the eventual EEG localization.
Coronal Polarization of Pseudostreamers and the Solar Polar Field Reversal
NASA Technical Reports Server (NTRS)
Rachmeler, L. A.; Guennou, C.; Seaton, D. B.; Gibson, S. E.; Auchere, F.
2016-01-01
The reversal of the solar polar magnetic field is notoriously hard to pin down due to the extreme viewing angle of the pole. In Cycle 24, the southern polar field reversal can be pinpointed with high accuracy thanks to a large-scale pseudostreamer that formed over the pole and persisted for approximately a year. We tracked the size and shape of this structure with multiple observations and analysis techniques including PROBA2/SWAP EUV images, AIA EUV images, CoMP polarization data, and 3D tomographic reconstructions. We find that the heliospheric field reversed polarity in February 2014, whereas in the photosphere, the last vestiges of the previous polar field polarity remained until March 2015. We present here the evolution of the structure and describe its identification in the Fe XIII 1074.7 nm coronal emission line, which is sensitive to the Hanle effect in the corona.
Bayesian Inference for Functional Dynamics Exploring in fMRI Data.
Guo, Xuan; Liu, Bing; Chen, Le; Chen, Guantao; Pan, Yi; Zhang, Jing
2016-01-01
This paper reviews state-of-the-art Bayesian-inference-based methods applied to functional magnetic resonance imaging (fMRI) data. In particular, we focus on one long-standing challenge in the computational modeling of fMRI datasets: how to effectively explore typical functional interactions from fMRI time series, along with the corresponding boundaries of temporal segments. Bayesian inference is a method of statistical inference that has been shown to be a powerful tool for encoding dependence relationships among variables under uncertainty. Here we provide an introduction to a group of Bayesian-inference-based methods for fMRI data analysis, which were designed to detect magnitude or functional connectivity change points and to infer the functional interaction patterns based on the corresponding temporal boundaries. We also provide a comparison of three popular Bayesian models, namely the Bayesian Magnitude Change Point Model (BMCPM), the Bayesian Connectivity Change Point Model (BCCPM), and the Dynamic Bayesian Variable Partition Model (DBVPM), and give a summary of their applications. We envision that more refined Bayesian inference models will emerge and play increasingly important roles in modeling brain functions in the years to come.
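As a toy illustration of magnitude change-point detection in the spirit of BMCPM (not the authors' actual model), one can place a uniform prior over change-point locations and score each candidate location with a profile-likelihood plug-in for the segment means:

```python
import numpy as np

def changepoint_posterior(y, sigma=1.0):
    """Posterior over change-point location for a mean-shift model.
    Uses a profile-likelihood plug-in for the two segment means and a
    uniform prior over locations (a toy stand-in for BMCPM-style models)."""
    n = len(y)
    logp = np.full(n, -np.inf)
    for k in range(2, n - 1):                  # candidate boundary: y[:k] | y[k:]
        left, right = y[:k], y[k:]
        rss = ((left - left.mean())**2).sum() + ((right - right.mean())**2).sum()
        logp[k] = -rss / (2 * sigma**2)
    logp -= logp.max()                         # stabilize before exponentiating
    p = np.exp(logp)
    return p / p.sum()

# Synthetic "fMRI-like" series with a magnitude change at t = 40
rng = np.random.default_rng(1)
y = np.concatenate([rng.normal(0.0, 1.0, 40), rng.normal(1.5, 1.0, 40)])
post = changepoint_posterior(y)
k_map = int(np.argmax(post))                   # maximum a posteriori boundary
```

Connectivity change-point models replace the per-segment mean with a per-segment covariance or network structure, but the posterior-over-boundaries idea is the same.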
NASA Astrophysics Data System (ADS)
Riffel, Katharina; Sebastian, Donner; Shaiganfar, Reza; Wagner, Thomas; Dörner, Steffen
2016-04-01
The MAX-DOAS method (Multi-AXis Differential Optical Absorption Spectroscopy) is used to analyze several trace gases (e.g. NO2, SO2, HCHO) at the same time and to determine their vertical column densities (vertically integrated concentrations). In summer 2015 we performed car-MAX-DOAS measurements in Romania during the AROMAT2 campaign. We encircled Bucharest under different weather conditions and at different times of the day. Afterwards the total NOx emissions were derived from the mobile MAX-DOAS observations in combination with wind data. In Germany we performed the same measurement procedure in fall/winter/spring 2015/2016 by encircling the cities of Mainz and Frankfurt. For this setup we mounted two MAX-DOAS instruments with different viewing directions (forward and backward) on the roof of a car. One instrument is a commercial mini MAX-DOAS built by the German company Hoffmann Messtechnik. The second was built at the MPI in Mainz. This so-called Tube MAX-DOAS uses an AVANTES spectrometer with better optical characteristics than Hoffmann's mini MAX-DOAS. The advantage of two instruments working at the same time is (besides redundancy) that localized emission plumes can be measured from different directions at different locations. Thus, especially for emission plumes from power plants, tomographic methods can be applied to derive information about the plume altitude. Car-MAX-DOAS observations can cover large areas in a short time with reasonable resolution (depending on the speed of the car and the instrument's integration time). These measurements are therefore well suited to validate satellite observations. This work will show the first AROMAT2 results of NOx emissions derived in Romania and in the Rhein-Main region, which is one of the most polluted areas in Germany.
NASA Astrophysics Data System (ADS)
De Landro, Grazia; Gammaldi, Sergio; Serlenga, Vincenzo; Amoroso, Ortensia; Russo, Guido; Festa, Gaetano; D'Auria, Luca; Bruno, Pier Paolo; Gresse, Marceau; Vandemeulebrouck, Jean; Zollo, Aldo
2017-04-01
Seismic tomography can be used to image the spatial variation of rock properties within complex geological media such as volcanoes. Solfatara is a volcano located within the still-active Campi Flegrei caldera, characterized by periodic episodes of extended, low-rate ground subsidence and uplift called bradyseism, accompanied by intense seismic and geochemical activity. In particular, Solfatara exhibits diffuse degassing of impressive magnitude, which underlines the relevance of fluid and heat transport at the crater and has prompted further research to improve the understanding of the hydrothermal system feeding the surface phenomenon. To this end, an active seismic experiment, Repeated Induced Earthquake and Noise (RICEN) (EU Project MEDSUV), was carried out between September 2013 and November 2014 to provide time-varying high-resolution images of the structure of Solfatara. In this study we used the datasets provided by two different acquisition geometries: (a) a 2D array covering an area of 90 × 115 m², sampled by a regular grid of 240 vertical sensors deployed at the crater surface; (b) two orthogonal 1D seismic arrays deployed along NE-SW and NW-SE directions crossing the 400 m crater surface, sampled with a regular line of 240 receivers and 116 shots. We present 2D and 3D tomographic high-resolution P-wave velocity images obtained using two different tomographic methods adopting a multiscale strategy. The 3D image of the shallow (30-35 m) central part of the Solfatara crater is obtained through iterative, linearized tomographic inversion of P-wave first-arrival times. The 2D P-wave velocity sections (60-70 m) are obtained using a non-linear travel-time tomography method based on the evaluation of the a posteriori probability density with a Bayesian approach. The retrieved 3D images, integrated with a resistivity section and with temperature and CO2 flux measurements, define the following characteristics: 1.
A depth-dependent P-wave velocity layer down to 14 m, with Vp < 700 m/s, typical of poorly consolidated tephra and affected by CO2 degassing; 2. An intermediate layer, deepening towards the mineralized liquid-saturated area (Fangaia), interpreted as permeable deposits saturated with condensed water; 3. A deep, confined high-velocity anomaly associated with a CO2 reservoir. With the 2D profiles we can image down to around 70 m depth: the first 30 m are characterized by features and velocities comparable to those of the 3D image; deeper, between 40 and 60 m depth, two low-velocity anomalies were found that probably indicate a preferential pathway for fluid degassing. These features are the expression of an area located between the Fangaia, which is water-saturated and replenished from deep aquifers, and the main fumaroles, which are the surface expression of a deep rising CO2 flux. Thus, changes in the outgassing rate greatly affect the shallow hydrothermal system, which can be used as a near-surface "mirror" of fluid-migration processes occurring at greater depths.
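The 2D sections above rest on evaluating a posterior probability density over velocity models. A deliberately tiny illustration of that idea, with a single homogeneous velocity, straight rays, and invented numbers (nothing from the Solfatara survey):

```python
import numpy as np

# Toy posterior-density evaluation for travel-time inversion: a single
# homogeneous P-wave velocity v is estimated from straight-ray travel
# times t = L / v with Gaussian picking errors. All values are invented.
lengths = np.array([20.0, 35.0, 50.0, 65.0])     # ray path lengths (m)
v_true, sigma_t = 600.0, 0.002                   # true velocity (m/s), pick error (s)
rng = np.random.default_rng(2)
t_obs = lengths / v_true + sigma_t * rng.standard_normal(4)

v_grid = np.linspace(300.0, 1000.0, 701)         # uniform prior support
misfit = np.array([((t_obs - lengths / v) ** 2).sum() for v in v_grid])
post = np.exp(-misfit / (2 * sigma_t**2))        # Gaussian likelihood x flat prior
post /= post.sum()

v_map = v_grid[np.argmax(post)]                  # maximum a posteriori velocity
```

Real nonlinear tomography evaluates such densities over thousands of model parameters with curved-ray forward solvers, but the Bayesian bookkeeping is the same.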
NASA Astrophysics Data System (ADS)
Li, L.; Xu, C.-Y.; Engeland, K.
2012-04-01
With respect to model calibration, parameter estimation and analysis of uncertainty sources, different approaches have been used in hydrological modeling. The Bayesian method is one of the most widely used for uncertainty assessment of hydrological models; it incorporates different sources of information into a single analysis through Bayes' theorem. However, such applications have not treated well the uncertainty in the extreme flows of hydrological simulations. This study proposes a Bayesian modularization approach to the uncertainty assessment of conceptual hydrological models that takes extreme flows into account. It includes a comprehensive comparison and evaluation of uncertainty assessments by the new Bayesian modularization approach and by traditional Bayesian approaches using the Metropolis-Hastings (MH) algorithm with the daily hydrological model WASMOD. Three likelihood functions are used with the traditional Bayesian approach: the AR(1) plus Normal, time-period-independent model (Model 1); the AR(1) plus Normal, time-period-dependent model (Model 2); and the AR(1) plus multi-normal model (Model 3). The results reveal that (1) the simulations derived from Bayesian modularization are more accurate, with the highest Nash-Sutcliffe efficiency value, and (2) the Bayesian modularization approach performs best in uncertainty estimates of the entire flow range and in terms of application and computational efficiency. The study thus introduces a new approach for reducing the effect of extreme flows on the Bayesian uncertainty assessment of discharge in hydrological models. Keywords: extreme flow, uncertainty assessment, Bayesian modularization, hydrological model, WASMOD
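A generic random-walk Metropolis-Hastings sampler of the kind referred to above can be sketched in a few lines; the target below is a simple Gaussian stand-in, not the WASMOD likelihood functions:

```python
import numpy as np

def metropolis_hastings(log_post, x0, n_steps, step=0.5, seed=0):
    """Random-walk Metropolis-Hastings sampler (generic sketch, not the
    WASMOD-specific likelihoods discussed in the abstract)."""
    rng = np.random.default_rng(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n_steps):
        prop = x + step * rng.standard_normal()      # symmetric proposal
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:      # accept/reject
            x, lp = prop, lp_prop
        samples.append(x)
    return np.array(samples)

# Example: sample a Gaussian posterior N(2, 0.5^2)
draws = metropolis_hastings(lambda x: -(x - 2.0)**2 / (2 * 0.25), 0.0, 20000)
posterior_mean = draws[5000:].mean()                 # discard burn-in
```

In a hydrological application, `log_post` would evaluate the chosen likelihood (e.g. an AR(1)-plus-Normal error model) on simulated versus observed discharge, and `x` would be the model's parameter vector.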
A multiresolution inversion for imaging the ionosphere
NASA Astrophysics Data System (ADS)
Yin, Ping; Zheng, Ya-Nan; Mitchell, Cathryn N.; Li, Bo
2017-06-01
Ionospheric tomography has been widely employed in imaging large-scale ionospheric structures at both quiet and storm times. However, the tomographic algorithms to date have not been very effective in imaging medium- and small-scale ionospheric structures, owing to limitations of uneven ground-based data distributions and of the algorithms themselves. Further, the effect of the density and quantity of Global Navigation Satellite Systems data on the tomographic results of a given algorithm remains unclear in much of the literature. In this paper, a new multipass tomographic algorithm is proposed to conduct the inversion using dense ground GPS observation data and is demonstrated over the U.S. West Coast during the period of 16-18 March 2015, which includes an ionospheric storm. The characteristics of the multipass inversion algorithm are analyzed by comparing tomographic results with independent ionosonde data and Center for Orbit Determination in Europe total electron content estimates. Then, several ground data sets with different data distributions are grouped from the same data source in order to investigate the impact of ground-station density on ionospheric tomography results. Finally, it is concluded that the multipass inversion approach offers an improvement. Ground data density can affect tomographic results, but only offers improvements up to a density of around one receiver every 150 to 200 km. When only GPS satellites are tracked there is no clear advantage in increasing the density of receivers beyond this level, although this may change if multiple constellations are monitored from each receiving station in the future.
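Tomographic inversion of slant total electron content (TEC) data is often illustrated with row-action solvers. A minimal Kaczmarz (ART) sketch on synthetic line integrals follows; it is a generic textbook method, not the paper's multipass algorithm, and all sizes are invented:

```python
import numpy as np

# Illustrative ART (Kaczmarz) solver for tomographic line integrals:
# TEC_i = sum_j A_ij * Ne_j, iterated ray by ray. Geometry and densities
# are synthetic, chosen only to show the update rule.
rng = np.random.default_rng(3)
n_vox, n_rays = 25, 60
A = rng.random((n_rays, n_vox))      # hypothetical ray-path weights
ne_true = rng.random(n_vox)          # "true" electron densities
tec = A @ ne_true                    # consistent slant-TEC observations

x = np.zeros(n_vox)
for sweep in range(200):             # repeated passes over all rays
    for i in range(n_rays):
        a = A[i]
        x += a * (tec[i] - a @ x) / (a @ a)   # project onto ray constraint

rel_err = np.linalg.norm(x - ne_true) / np.linalg.norm(ne_true)
```

With real data the system is noisy and underdetermined, so relaxation factors, regularization, and background ionosphere models are added on top of this basic iteration.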
Computed tomographic contrast tenography of the digital flexor tendon sheath of the equine hindlimb.
Agass, Rachel; Dixon, Jonathon; Fraser, Barny
2018-05-01
Pre-surgical investigation of digital flexor tendon sheath pathology remains challenging with current standard imaging techniques. The aim of this prospective, anatomical, pilot study was to describe the anatomy of the equine hind limb digital flexor tendon sheath using a combination of computed tomography (CT) and computed tomographic contrast tenography in clinically normal cadaver limbs. Ten pairs of hind limbs with no external abnormalities were examined from the level of the tarsometatarsal joint distally. Limbs initially underwent non-contrast CT examination using 120 kVp, 300 mAs, and 1.5 mm slice thickness. Sixty millilitres of ioversol iodinated contrast medium diluted with saline (final concentration 100 mg/ml) was injected using a basilar sesamoidean approach. The computed tomographic contrast tenography examination was then repeated, before dissection of the specimens to compare gross and imaging findings. The combined CT and computed tomographic contrast tenography examinations provided excellent anatomical detail of intra-thecal structures. The borders of the superficial and deep digital flexor tendons and the manica flexoria were consistently identifiable in all limbs. Detailed anatomy, including that of the mesotenons, two of which were previously undescribed, and of the plantar annular ligament, was also consistently identifiable. Dissection of all 10 pairs of limbs revealed no pathology, in accordance with the imaging findings. In conclusion, the combination of CT and computed tomographic contrast tenography may be a useful adjunctive diagnostic technique to define digital flexor tendon sheath pathology prior to surgical exploration in horses. © 2017 American College of Veterinary Radiology.
NASA Astrophysics Data System (ADS)
Henne, Stephan; Leuenberger, Markus; Steinbacher, Martin; Eugster, Werner; Meinhardt, Frank; Bergamaschi, Peter; Emmenegger, Lukas; Brunner, Dominik
2017-04-01
Similar to other Western European countries, agricultural sources dominate the methane (CH4) emission budget in Switzerland. 'Bottom-up' estimates of these emissions still carry relatively large uncertainties, due to considerable variability and uncertainty in the observed emission factors of the underlying processes (e.g., enteric fermentation, manure management). Here, we present a regional-scale (~300 × 200 km²) atmospheric inversion study of CH4 emissions in Switzerland making use of the recently established CarboCount-CH network of four stations on the Swiss Plateau, as well as the neighbouring mountain-top sites Jungfraujoch and Schauinsland (Germany). Continuous observations from all CarboCount-CH sites are available since 2013. We use a high-resolution (7 × 7 km²) Lagrangian particle dispersion model (FLEXPART-COSMO) in connection with two different inversion systems (Bayesian and extended Kalman filter) to estimate spatially and temporally resolved CH4 emissions for the Swiss domain over the period 2013 to 2016. An extensive set of sensitivity inversions is used to assess the overall uncertainty of our inverse approach. In general, we find good agreement between our 'top-down' estimate of total Swiss CH4 emissions and the national 'bottom-up' reporting. In addition, a robust emission seasonality, with reduced wintertime values, can be seen in all years. No significant trend or year-to-year variability was observed for the analysed four-year period, again in agreement with a very small downward trend in the national 'bottom-up' reporting. Special attention is given to the influence of boundary conditions as taken from different global-scale model simulations (TM5, FLEXPART) and remote observations. We find that uncertainties in the boundary conditions can induce large offsets in the national total emissions. However, spatial emission patterns are less sensitive to the choice of boundary condition.
Furthermore, to demonstrate the validity of our approach, we present a series of inversion runs using synthetic observations generated from 'true' emissions, in combination with various sources of uncertainty.
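For a linear transport operator and Gaussian errors, the Bayesian emission estimate used in systems like the one described has a closed form: the posterior mean pulls the prior ('bottom-up') emissions toward the observations through a gain matrix. A generic sketch with illustrative sizes and values, not the FLEXPART-COSMO setup:

```python
import numpy as np

# Analytic Gaussian Bayesian inversion for linear transport, y = H @ x + e:
# posterior mean x_hat = x_a + G (y - H x_a). Everything below is invented
# for illustration (6 source regions, 40 concentration observations).
rng = np.random.default_rng(4)
n_src, n_obs = 6, 40
H = rng.random((n_obs, n_src))                 # source-receptor sensitivities
x_true = np.array([3.0, 1.0, 2.0, 0.5, 4.0, 1.5])
y = H @ x_true + 0.05 * rng.standard_normal(n_obs)

x_a = np.ones(n_src)                           # prior ('bottom-up') estimate
S_a = 4.0 * np.eye(n_src)                      # prior error covariance
S_o = 0.05**2 * np.eye(n_obs)                  # observation error covariance

G = S_a @ H.T @ np.linalg.inv(H @ S_a @ H.T + S_o)   # gain matrix
x_hat = x_a + G @ (y - H @ x_a)                # posterior-mean emissions
```

Synthetic-observation tests like those mentioned in the abstract amount to generating `y` from known `x_true` (as above) and checking how well `x_hat` recovers it under each assumed uncertainty source.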
NASA Astrophysics Data System (ADS)
Kim, Y.; Nishina, K.; Chae, N.; Park, S.; Yoon, Y.; Lee, B.
2014-04-01
The tundra ecosystem is quite vulnerable to drastic climate change in the Arctic, and the quantification of carbon dynamics is of significant importance in response to thawing permafrost, changes in the snow-covered period and snow and shrub community extent, and the decline of sea ice in the Arctic. Here, CO2 efflux measurements using a manual chamber system within a 40 m × 40 m (5 m interval; 81 total points) plot were conducted in dominant tundra vegetation on the Seward Peninsula of Alaska during the growing seasons of 2011 and 2012, for the assessment of the driving parameters of CO2 efflux. We applied a hierarchical Bayesian (HB) model - a function of soil temperature, soil moisture, vegetation type and thaw depth - to quantify the effect of environmental parameters on CO2 efflux and to estimate growing season CO2 emission. Our results showed that average CO2 efflux in 2011 was 1.4-fold higher than in 2012, resulting from the distinct difference in soil moisture between the two years. Tussock-dominated CO2 efflux was 1.4 to 2.3 times higher than that measured in lichen and moss communities, reflecting tussock as a significant CO2 source in the Arctic, with wide area distribution on a circumpolar scale. CO2 efflux followed soil temperature nearly exponentially in both the observed data and the posterior medians of the HB model. This reveals soil temperature as the most important parameter in regulating CO2 efflux, rather than soil moisture and thaw depth. Obvious changes in soil moisture during the growing seasons of 2011 and 2012 resulted in an explicit difference in CO2 efflux - 742 and 539 g CO2 m-2 period-1 in 2011 and 2012, respectively - suggesting that the 2012 CO2 emission rate was reduced by 27% (95% credible interval: 17-36%) compared to 2011, due to higher soil moisture from severe rain.
The estimated growing season CO2 emission rate ranged from 0.86 Mg CO2 period-1 in 2012 to 1.2 Mg CO2 period-1 in 2011 within the 40 m × 40 m plot, corresponding to 86% and 80% of the annual CO2 emission rates within the western Alaska tundra ecosystem. Therefore, the HB model can be readily applied to observed CO2 efflux, as it demands only four environmental parameters, and it can also be effective for quantitatively assessing the driving parameters of CO2 efflux.
Sparsity Aware Adaptive Radar Sensor Imaging in Complex Scattering Environments
2015-06-15
...while meeting the requirement on the peak to average power ratio. Third, we study the impact of waveform encoding on nonlinear electromagnetic tomographic... Related papers: ...Enyue Lu. Time Domain Electromagnetic Tomography Using Propagation and Backpropagation Method, IEEE International Conference on Image Processing...; Yuanwei Jin, Chengdon Dong, Enyue Lu. Waveform Encoding for Nonlinear Electromagnetic Tomographic Imaging, IEEE Global...
An Analysis for Capital Expenditure Decisions at a Naval Regional Medical Center.
1981-12-01
Service ranking: 1. Portable defibrilator and cardioscope; 2. ECG cart; 3. Gas system sterilizer; 4. Automated blood cell counter; 5. Computed tomographic scanner. Equipment Review Committee ranking: 1. Computed tomographic scanner; 2. Automated blood cell counter; 3. Gas system sterilizer; 4. Portable defibrilator and cardioscope; 5. ECG cart. ...dictating and automated typing) systems. e. Filing equipment. f. Automatic data processing equipment, including data communications equipment. g
Analysis of 21-cm tomographic data
NASA Astrophysics Data System (ADS)
Mellema, Garrelt; Giri, Sambit; Ghara, Raghuna
2018-05-01
The future SKA1-Low radio telescope will be powerful enough to produce tomographic images of the 21-cm signal from the Epoch of Reionization. Here we address how to identify ionized regions in such data sets, taking into account the resolution and noise levels associated with SKA1-Low. We describe three methods of which one, superpixel oversegmentation, consistently performs best.
1986-03-10
...and P. Frangos, "Inverse Scattering for Dielectric Media", Annual OSA Meeting, Wash. D.C., Oct. 1985. Invited Presentations: 1. N. Farhat, "Tomographic...Optical Computing", DARPA Briefing, April 1985. Theses: P.V. Frangos, "The Electromagnetic
Volume Segmentation and Ghost Particles
NASA Astrophysics Data System (ADS)
Ziskin, Isaac; Adrian, Ronald
2011-11-01
Volume Segmentation Tomographic PIV (VS-TPIV) is a type of tomographic PIV in which images of particles in a relatively thick volume are segmented into images on a set of much thinner volumes that may be approximated as planes, as in 2D planar PIV. The planes of images can be analysed by standard mono-PIV, and the volume of flow vectors can be recreated by assembling the planes of vectors. The interrogation process is similar to a Holographic PIV analysis, except that the planes of image data are extracted from two-dimensional camera images of the volume of particles instead of three-dimensional holographic images. Like the tomographic PIV method using the MART algorithm, Volume Segmentation requires at least two cameras and works best with three or four. Unlike the MART method, Volume Segmentation does not require reconstruction of individual particle images one pixel at a time and it does not require an iterative process, so it operates much faster. As in all tomographic reconstruction strategies, ambiguities known as ghost particles are produced in the segmentation process. The effect of these ghost particles on the PIV measurement is discussed. This research was supported by Contract 79419-001-09, Los Alamos National Laboratory.
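For contrast with Volume Segmentation, the MART reconstruction mentioned above updates each voxel multiplicatively, one projection at a time. A toy version on a small synthetic system (illustrative sizes and weights, not a real camera model):

```python
import numpy as np

# Sketch of the MART multiplicative update for a tiny reconstruction
# problem y = A @ x with nonnegative intensities x. Each projection i
# rescales the voxels it sees by the ratio of measured to predicted signal.
rng = np.random.default_rng(5)
A = rng.random((12, 8))              # weights in [0, 1): pixel-voxel coupling
x_true = rng.random(8) + 0.1         # strictly positive "true" intensities
y = A @ x_true                       # consistent projections

x = np.ones(8)                       # strictly positive starting guess
mu = 0.5                             # relaxation parameter
for it in range(500):                # sweeps over all projections
    for i in range(12):
        proj = A[i] @ x
        x *= (y[i] / proj) ** (mu * A[i])
```

The iterative, voxel-by-voxel nature of this loop is exactly the cost that Volume Segmentation avoids by never reconstructing individual particle images.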
Creating three-dimensional tooth models from tomographic images.
Lima da Silva, Isaac Newton; Barbosa, Gustavo Frainer; Soares, Rodrigo Borowski Grecco; Beltrao, Maria Cecilia Gomes; Spohr, Ana Maria; Mota, Eduardo Golcalves; Oshima, Hugo Mitsuo Silva; Burnett, Luiz Henrique
2008-01-01
The use of Finite Element Analysis (FEA) is becoming very frequent in Dentistry. However, most of the three-dimensional models presented by the literature for teeth are limited in terms of geometry. Discrepancy in shape and dimensions can cause wrong results to occur. Sharp cusps and faceted contour can produce stress concentrations, which are incoherent with the reality. The aim of this study was the processing of tomographic images in order to develop an advanced three-dimensional reconstruction of the anatomy of a molar tooth and the integration of the resulting solid with commercially available CAD/CAE software. Computed tomographic images were obtained from 0.5 mm thick slices of mandibular molar and transferred to commercial cad software. Once the point cloud data have been generated, the work on these points started to get to the solid model of the tooth with Pro/Engineer software. The obtained tooth model showed very accurate shape and dimensions, as it was obtained from real tooth data with error of 0.0 to -0.8 mm. The methodology presented was efficient for creating a biomodel of a tooth from tomographic images that realistically represented its anatomy.
Experimental demonstration of laser tomographic adaptive optics on a 30-meter telescope at 800 nm
NASA Astrophysics Data System (ADS)
Ammons, S., Mark; Johnson, Luke; Kupke, Renate; Gavel, Donald T.; Max, Claire E.
2010-07-01
A critical goal in the next decade is to develop techniques that will extend Adaptive Optics correction to visible wavelengths on Extremely Large Telescopes (ELTs). We demonstrate in the laboratory the highly accurate atmospheric tomography necessary to defeat the cone effect on ELTs, an essential milestone on the path to this capability. We simulate a high-order Laser Tomographic AO System for a 30-meter telescope with the LTAO/MOAO testbed at UCSC. Eight Sodium Laser Guide Stars (LGSs) are sensed by 99x99 Shack-Hartmann wavefront sensors over 75". The AO system is diffraction-limited at a science wavelength of 800 nm (S ~ 6-9%) over a field of regard of 20" diameter. Open-loop WFS systematic error is observed to be proportional to the total input atmospheric disturbance and is nearly the dominant error budget term (81 nm RMS), exceeded only by tomographic wavefront estimation error (92 nm RMS). The total residual wavefront error for this experiment is comparable to that expected for wide-field tomographic adaptive optics systems of similar wavefront sensor order and LGS constellation geometry planned for Extremely Large Telescopes.
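Independent error-budget terms like the two quoted above combine in quadrature (root sum of squares). A minimal sketch of that arithmetic, using the two dominant terms from the abstract and treating them as independent, which is an assumption:

```python
import math

def rss(*terms_nm):
    """Combine independent wavefront-error terms in quadrature (RMS, nm)."""
    return math.sqrt(sum(t * t for t in terms_nm))

# the two dominant terms quoted in the abstract, in nm RMS
total = rss(81.0, 92.0)   # other, smaller terms are omitted here
```

With just these two terms the combined residual is roughly 123 nm RMS; the experiment's full error budget would include additional, smaller contributions.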
Kubo, S; Nakata, H; Sugauchi, Y; Yokota, N; Yoshimine, T
2000-05-01
The preoperative localization of superficial intracranial lesions is often necessary for accurate burr hole placement or craniotomy siting. It is not always easy, however, to localize the lesions over the scalp working only from computed tomographic images. We developed a simple method for such localization using a laser pointer during the preoperative computed tomographic examination. The angle of incidence, extending from a point on the scalp to the center of the computed tomographic image, is measured by the software included with the scanner. In the gantry, at the same angle as on the image, a laser is beamed from a handmade projector onto the patient's scalp toward the center of the gantry. The point illuminated on the patient's head corresponds to that on the image. The device and the method are described in detail herein. We applied this technique to mark the area for the craniotomy before surgery in five patients with superficial brain tumors. At the time of surgery, it was confirmed that the tumors were circumscribed precisely. The technique is easy to perform and useful in the preoperative planning for a craniotomy. In addition, the device is easily constructed and inexpensive.
Optical coherence tomography using images of hair structure and dyes penetrating into the hair.
Tsugita, Tetsuya; Iwai, Toshiaki
2014-11-01
Hair dyes are commonly evaluated by the appearance of the hair after dyeing. However, this approach cannot simultaneously assess how deep the dye has penetrated into the hair. For simultaneous assessment of the appearance and the interior of hair, we developed a visible-range red-green-blue (RGB, the three primary colors) optical coherence tomography (OCT) system using an RGB LED light source. We then evaluated a phantom model based on the assumption that the sample's absorbability in the vertical direction affects the tomographic imaging. Consistent with theory, our device showed higher resolution than conventional OCT with far-red light. In the experiment on the phantom model, we confirmed that the tomographic imaging is affected by absorbability unique to the sample. Furthermore, we verified that permeability can be estimated from this tomographic image. We also identified for the first time the relationship between penetration of the dye into hair and characteristics of wavelength by tomographic imaging of dyed hair. We thus simultaneously assessed the appearance of dyed hair and the inward penetration of the dye without preparing hair sections. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
TomoBank: a tomographic data repository for computational x-ray science
NASA Astrophysics Data System (ADS)
De Carlo, Francesco; Gürsoy, Doğa; Ching, Daniel J.; Joost Batenburg, K.; Ludwig, Wolfgang; Mancini, Lucia; Marone, Federica; Mokso, Rajmund; Pelt, Daniël M.; Sijbers, Jan; Rivers, Mark
2018-03-01
There is a widening gap between the fast advancement of computational methods for tomographic reconstruction and their successful implementation in production software at various synchrotron facilities. This is due in part to the lack of readily available instrument datasets and phantoms representative of real materials for validation and comparison of new numerical methods. Recent advancements in detector technology have made sub-second and multi-energy tomographic data collection possible (Gibbs et al 2015 Sci. Rep. 5 11824), but have also increased the demand to develop new reconstruction methods able to handle in situ (Pelt and Batenburg 2013 IEEE Trans. Image Process. 22 5238-51) and dynamic systems (Mohan et al 2015 IEEE Trans. Comput. Imaging 1 96-111) that can be quickly incorporated in beamline production software (Gürsoy et al 2014 J. Synchrotron Radiat. 21 1188-93). The x-ray tomography data bank, tomoBank, provides a repository of experimental and simulated datasets with the aim to foster collaboration among computational scientists, beamline scientists, and experimentalists and to accelerate the development and implementation of tomographic reconstruction methods for synchrotron facility production software by providing easy access to challenging datasets and their descriptors.
Tomographic imaging using poissonian detector data
Aspelmeier, Timo; Ebel, Gernot; Hoeschen, Christoph
2013-10-15
An image reconstruction method for reconstructing a tomographic image (f_j) of a region of investigation within an object (1) comprises the steps of providing detector data (y_i) comprising Poisson random values measured at an i-th of a plurality of different positions, e.g. i=(k,l) with pixel index k on a detector device and angular index l referring to both the angular position (α_l) and the rotation radius (r_l) of the detector device (10) relative to the object (1), providing a predetermined system matrix A_ij assigning a j-th voxel of the object (1) to the i-th detector data (y_i), and reconstructing the tomographic image (f_j) based on the detector data (y_i), said reconstructing step including a procedure of minimizing a functional F(f) depending on the detector data (y_i) and the system matrix A_ij and additionally including a sparse or compressive representation of the object (1) in an orthobasis T, wherein the tomographic image (f_j) represents the global minimum of the functional F(f). Furthermore, an imaging method and an imaging device using the image reconstruction method are described.
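The functional minimized here is built on the standard Poisson emission model y ~ Poisson(A f). For reference, a minimal NumPy sketch of the classical MLEM update for that model (this is the textbook algorithm, not the patent's method; the sparsity term in the orthobasis T is omitted, and the tiny system matrix is invented for illustration):

```python
import numpy as np

def mlem(A, y, n_iter=500):
    """Maximum-likelihood EM reconstruction for Poisson data y ~ Poisson(A f).

    A : (n_detectors, n_voxels) system matrix
    y : (n_detectors,) measured counts
    """
    f = np.ones(A.shape[1])               # flat initial image
    sens = A.sum(axis=0)                  # voxel sensitivities, sum_i A_ij
    for _ in range(n_iter):
        proj = A @ f                      # forward projection (A f)_i
        ratio = y / np.maximum(proj, 1e-12)
        f *= (A.T @ ratio) / np.maximum(sens, 1e-12)  # multiplicative update
    return f

# toy 2-voxel, 3-detector system for illustration only
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.5, 0.5]])
f_true = np.array([4.0, 2.0])
y = A @ f_true                            # noiseless "measured" counts
f_hat = mlem(A, y)
```

On noiseless data the multiplicative update converges to the true activities; the patent's approach instead seeks the global minimum of a functional combining this likelihood with a sparse representation.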
Bayesian data analysis in population ecology: motivations, methods, and benefits
Dorazio, Robert
2016-01-01
During the 20th century ecologists largely relied on the frequentist system of inference for the analysis of their data. However, in the past few decades ecologists have become increasingly interested in the use of Bayesian methods of data analysis. In this article I provide guidance to ecologists who would like to decide whether Bayesian methods can be used to improve their conclusions and predictions. I begin by providing a concise summary of Bayesian methods of analysis, including a comparison of differences between Bayesian and frequentist approaches to inference when using hierarchical models. Next I provide a list of problems where Bayesian methods of analysis may arguably be preferred over frequentist methods. These problems are usually encountered in analyses based on hierarchical models of data. I describe the essentials required for applying modern methods of Bayesian computation, and I use real-world examples to illustrate these methods. I conclude by summarizing what I perceive to be the main strengths and weaknesses of using Bayesian methods to solve ecological inference problems.
A Bayesian-frequentist two-stage single-arm phase II clinical trial design.
Dong, Gaohong; Shih, Weichung Joe; Moore, Dirk; Quan, Hui; Marcella, Stephen
2012-08-30
It is well-known that both frequentist and Bayesian clinical trial designs have their own advantages and disadvantages. To inherit better properties from both types of design, we developed a Bayesian-frequentist two-stage single-arm phase II clinical trial design. This design allows both early acceptance and early rejection of the null hypothesis (H₀). Measures of the design properties (for example, the probability of early trial termination and the expected sample size) are derived under both frequentist and Bayesian settings. Moreover, under the Bayesian setting, the upper and lower boundaries are determined from the predictive probability of trial success. Given a beta prior and a sample size for stage I, based on the marginal distribution of the responses at stage I, we derived Bayesian Type I and Type II error rates. By controlling both frequentist and Bayesian error rates, the Bayesian-frequentist two-stage design has special features compared with other two-stage designs. Copyright © 2012 John Wiley & Sons, Ltd.
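The predictive probability underlying such boundaries follows from the beta-binomial distribution of future responses. A hedged sketch, assuming a Beta(a, b) prior and a simplified "at least r total responses" success criterion (the paper's actual boundary rules are more elaborate; all parameter names here are illustrative):

```python
import math

def _log_beta(a, b):
    return math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)

def betabinom_pmf(y, n, a, b):
    """Beta-binomial pmf, computed via log-gamma for numerical stability."""
    return math.exp(
        math.lgamma(n + 1) - math.lgamma(y + 1) - math.lgamma(n - y + 1)
        + _log_beta(y + a, n - y + b) - _log_beta(a, b)
    )

def predictive_success(x, n1, n2, r, a=1.0, b=1.0):
    """Predictive probability of reaching r total responses by trial end.

    x    : responses observed among n1 stage-I patients
    n2   : stage-II sample size
    r    : total responses required to declare success
    a, b : parameters of the Beta(a, b) prior on the response rate
    """
    need = r - x                          # additional responses required
    if need <= 0:
        return 1.0
    # posterior after stage I is Beta(a + x, b + n1 - x); the count of
    # stage-II responses is then beta-binomial
    return sum(betabinom_pmf(y, n2, a + x, b + n1 - x)
               for y in range(need, n2 + 1))
```

With a uniform Beta(1, 1) prior and no stage-I data, future responses are uniform over 0..n2, which gives an easy sanity check on the implementation.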
NASA Astrophysics Data System (ADS)
Maseda, Michael V.; van der Wel, Arjen; Rix, Hans-Walter; Momcheva, Ivelina; Brammer, Gabriel B.; Franx, Marijn; Lundgren, Britt F.; Skelton, Rosalind E.; Whitaker, Katherine E.
2018-02-01
The multiplexing capability of slitless spectroscopy is a powerful asset in creating large spectroscopic data sets, but issues such as spectral confusion make the interpretation of the data challenging. Here we present a new method to search for emission lines in the slitless spectroscopic data from the 3D-HST survey utilizing the Wide-Field Camera 3 on board the Hubble Space Telescope. Using a novel statistical technique, we can detect compact (extended) emission lines at 90% completeness down to fluxes of 1.5 (3.0) × 10⁻¹⁷ erg s⁻¹ cm⁻², close to the noise level of the grism exposures, for objects detected in the deep ancillary photometric data. Unlike previous methods, the Bayesian nature allows for probabilistic line identifications, namely redshift estimates, based on secondary emission line detections and/or photometric redshift priors. As a first application, we measure the comoving number density of Extreme Emission Line Galaxies (restframe [O III] λ5007 equivalent widths in excess of 500 Å). We find that these galaxies are nearly 10× more common above z ∼ 1.5 than at z ≲ 0.5. With upcoming large grism surveys such as Euclid and WFIRST, as well as grisms featured prominently on the NIRISS and NIRCam instruments on the James Webb Space Telescope, methods like the one presented here will be crucial for constructing emission line redshift catalogs in an automated and well-understood manner. This work is based on observations taken by the 3D-HST Treasury Program and the CANDELS Multi-Cycle Treasury Program with the NASA/ESA HST, which is operated by the Association of Universities for Research in Astronomy, Inc., under NASA contract NAS5-26555.
LITES and GROUP-C Mission Update: Ionosphere and Thermosphere Sensing from the ISS
NASA Astrophysics Data System (ADS)
Stephan, A. W.; Budzien, S. A.; Chakrabarti, S.; Hysell, D. L.; Powell, S. P.; Finn, S. C.; Cook, T.; Bishop, R. L.
2016-12-01
The Limb-imaging Ionospheric and Thermospheric Extreme-ultraviolet Spectrograph (LITES) and GPS Radio Occultation and Ultraviolet Photometer Co-located (GROUP-C) experiments are scheduled for launch to the International Space Station (ISS) in November 2016 as part of the Space Test Program Houston #5 payload (STP-H5). The two experiments provide technical development and risk-reduction for future space weather sensors suitable for ionospheric specification, space situational awareness, and data products for global ionosphere assimilative models. The combined instrument suite of these experiments offers a unique capability to study spatial and temporal variability of the thermosphere and ionosphere using multi-sensor and tomographic approaches. LITES is an imaging spectrograph that spans 60-140 nm and continuously acquires limb profiles of the ionosphere and thermosphere from 150-350 km altitude. GROUP-C includes a high-sensitivity far-ultraviolet photometer measuring horizontal ionospheric gradients and an advanced GPS receiver providing ionospheric electron density profiles and scintillation measurements. High-cadence limb images and nadir photometry from GROUP-C/LITES are combined to tomographically reconstruct high-fidelity two-dimensional volume emission rates within the ISS orbital plane. The GPS occultation receiver provides independent measurements to calibrate and validate advanced daytime ionospheric algorithms and nighttime tomography. The vantage from the ISS on the lower portion of the thermosphere and ionosphere will yield measurements complementary to the NASA GOLD and ICON missions which are expected to fly during the STP-H5 mission. We present a mission status update and available early orbit observations, and the opportunities for using these new data to help address questions regarding the complex and dynamic features of the low and middle latitude ionosphere-thermosphere system that have important implications for operational systems.
Min, James K; Shaw, Leslee J; Berman, Daniel S; Gilmore, Amanda; Kang, Ning
2008-09-15
Multidetector coronary computed tomographic angiography (CCTA) demonstrates high accuracy for the detection and exclusion of coronary artery disease (CAD) and predicts adverse prognosis. To date, opportunity costs relating the clinical and economic outcomes of CCTA compared with other methods of diagnosing CAD, such as myocardial perfusion single-photon emission computed tomography (SPECT), remain unknown. An observational, multicenter, patient-level analysis of patients without known CAD who underwent CCTA or SPECT was performed. Patients who underwent CCTA (n = 1,938) were matched to those who underwent SPECT (n = 7,752) on 8 demographic and clinical characteristics and 2 summary measures of cardiac medications and co-morbidities and were evaluated for 9-month expenditures and clinical outcomes. Adjusted total health care and CAD expenditures were 27% (p <0.001) and 33% (p <0.001) lower, respectively, for patients who underwent CCTA compared with those who underwent SPECT, by an average of $467 (95% confidence interval $99 to $984) for CAD expenditures per patient. Despite lower total health care expenditures for CCTA, no differences were observed for rates of adverse cardiovascular events, including CAD hospitalizations (4.2% vs 4.1%, p = NS), CAD outpatient visits (17.4% vs 13.3%, p = NS), myocardial infarction (0.4% vs 0.6%, p = NS), and new-onset angina (3.0% vs 3.5%, p = NS). Patients without known CAD who underwent CCTA, compared with matched patients who underwent SPECT, incurred lower overall health care and CAD expenditures while experiencing similarly low rates of CAD hospitalization, outpatient visits, myocardial infarction, and angina. In conclusion, these data suggest that CCTA may be a cost-efficient alternative to SPECT for the initial coronary evaluation of patients without known CAD.
Post-processing methods of rendering and visualizing 3-D reconstructed tomographic images
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wong, S.T.C.
The purpose of this presentation is to discuss the computer processing techniques of tomographic images, after they have been generated by imaging scanners, for volume visualization. Volume visualization is concerned with the representation, manipulation, and rendering of volumetric data. Since the first digital images were produced from computed tomography (CT) scanners in the mid 1970s, applications of visualization in medicine have expanded dramatically. Today, three-dimensional (3D) medical visualization has expanded from using CT data, the first inherently digital source of 3D medical data, to using data from various medical imaging modalities, including magnetic resonance scanners, positron emission scanners, digital ultrasound, electronic and confocal microscopy, and other medical imaging modalities. We have advanced from rendering anatomy to aid diagnosis and visualize complex anatomic structures to planning and assisting surgery and radiation treatment. New, more accurate and cost-effective procedures for clinical services and biomedical research have become possible by integrating computer graphics technology with medical images. This trend is particularly noticeable in the current market-driven health care environment. For example, interventional imaging, image-guided surgery, and stereotactic and visualization techniques are now entering surgical practice. In this presentation, we discuss only computer-display-based approaches to volumetric medical visualization. That is, we assume that the display device available is two-dimensional (2D) in nature and all analysis of multidimensional image data is to be carried out via the 2D screen of the device. There are technologies, such as holography and virtual reality, that do provide a "true 3D screen". To confine the scope, this presentation will not discuss such approaches.
Wu, Abraham J; Bosch, Walter R; Chang, Daniel T; Hong, Theodore S; Jabbour, Salma K; Kleinberg, Lawrence R; Mamon, Harvey J; Thomas, Charles R; Goodman, Karyn A
2015-07-15
Current guidelines for esophageal cancer contouring are derived from traditional 2-dimensional fields based on bony landmarks, and they do not provide sufficient anatomic detail to ensure consistent contouring for more conformal radiation therapy techniques such as intensity modulated radiation therapy (IMRT). Therefore, we convened an expert panel with the specific aim to derive contouring guidelines and generate an atlas for the clinical target volume (CTV) in esophageal or gastroesophageal junction (GEJ) cancer. Eight expert academically based gastrointestinal radiation oncologists participated. Three sample cases were chosen: a GEJ cancer, a distal esophageal cancer, and a mid-upper esophageal cancer. Uniform computed tomographic (CT) simulation datasets and accompanying diagnostic positron emission tomographic/CT images were distributed to each expert, and the expert was instructed to generate gross tumor volume (GTV) and CTV contours for each case. All contours were aggregated and subjected to quantitative analysis to assess the degree of concordance between experts and to generate draft consensus contours. The panel then refined these contours to generate the contouring atlas. The κ statistics indicated substantial agreement between panelists for each of the 3 test cases. A consensus CTV atlas was generated for the 3 test cases, each representing common anatomic presentations of esophageal cancer. The panel agreed on guidelines and principles to facilitate the generalizability of the atlas to individual cases. This expert panel successfully reached agreement on contouring guidelines for esophageal and GEJ IMRT and generated a reference CTV atlas. This atlas will serve as a reference for IMRT contours for clinical practice and prospective trial design. Subsequent patterns of failure analyses of clinical datasets using these guidelines may require modification in the future. Copyright © 2015 Elsevier Inc. All rights reserved.
Critical examination of the uniformity requirements for single-photon emission computed tomography.
O'Connor, M K; Vermeersch, C
1991-01-01
It is generally recognized that single-photon emission computed tomography (SPECT) imposes very stringent requirements on gamma camera uniformity to prevent the occurrence of ring artifacts. The purpose of this study was to examine the relationship between nonuniformities in the planar data and the magnitude of the consequential ring artifacts in the transaxial data, and how the perception of these artifacts is influenced by factors such as reconstruction matrix size, reconstruction filter, and image noise. The study indicates that the relationship between ring artifact magnitude and image noise is essentially independent of the acquisition or reconstruction matrix sizes, but is strongly dependent upon the type of smoothing filter applied during the reconstruction process. Furthermore, the degree to which a ring artifact can be perceived above image noise is dependent on the size and location of the nonuniformity in the planar data, with small nonuniformities (1-2 pixels wide) close to the center of rotation being less perceptible than those further out (8-20 pixels). Small defects or nonuniformities close to the center of rotation are thought to cause the greatest potential corruption to tomographic data. The study indicates that such may not be the case. Hence the uniformity requirements for SPECT may be less demanding than was previously thought.
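The ring-artifact mechanism examined above can be illustrated numerically: a constant error in one detector bin, backprojected over all angles, accumulates on a circle whose radius equals the bin's distance from the center of rotation. A minimal sketch using unfiltered backprojection (image size, bin offset, and angle count are all illustrative):

```python
import numpy as np

def backproject_bin_error(n=65, bin_offset=10, n_angles=180):
    """Backproject a unit error in a single detector bin over all angles.

    A gain error in a bin at distance `bin_offset` from the centre of
    rotation smears into a ring of that radius in the reconstruction.
    """
    img = np.zeros((n, n))
    c = n // 2
    ys, xs = np.mgrid[0:n, 0:n]
    for theta in np.linspace(0.0, np.pi, n_angles, endpoint=False):
        # signed distance of each pixel from the ray through the faulty bin
        t = (xs - c) * np.cos(theta) + (ys - c) * np.sin(theta)
        img[np.abs(t - bin_offset) < 0.5] += 1.0   # accumulate the error
    return img / n_angles

ring = backproject_bin_error()
```

The accumulated error is zero at the center of rotation and concentrates near the ring radius, consistent with the abstract's observation that the perceptibility of a given nonuniformity depends on its distance from the center of rotation.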
Schneider, Florian R; Mann, Alexander B; Konorov, Igor; Delso, Gaspar; Paul, Stephan; Ziegler, Sibylle I
2012-06-01
A one-day laboratory course on positron emission tomography (PET) for the education of physics students and PhD students in medical physics has been set up. In the course, the physical background and the principles of a PET scanner are introduced. Course attendees set the system in operation, calibrate it using a ²²Na point source and reconstruct different source geometries filled with ¹⁸F. The PET scanner features an individual channel read-out of 96 lutetium oxyorthosilicate (LSO) scintillator crystals coupled to avalanche photodiodes (APD). The analog data of each APD are digitized by fast sampling analog-to-digital converters (SADC) and processed within field programmable gate arrays (FPGA) to extract amplitudes and time stamps. All SADCs are continuously sampling with a precise rate of 80 MHz, which is synchronous for the whole system. The data is transmitted via USB to a Linux PC, where further processing and the image reconstruction are performed. The course attendees get an insight into detector techniques, modern read-out electronics, data acquisition and PET image reconstruction. In addition, a short introduction to some common software applications used in particle and high energy physics is part of the course. Copyright © 2011. Published by Elsevier GmbH.
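With synchronous time stamps from every channel, the coincidence sorting at the heart of PET data processing reduces to pairing events on different channels that fall within a short time window. A minimal sketch of this general idea (not the course's actual software; the 10 ns window and the event list are illustrative):

```python
def find_coincidences(timestamps, window=10e-9):
    """Pair events whose time stamps fall within a coincidence window.

    timestamps : list of (time_s, channel) tuples, sorted in time
    Returns a list of ((time, ch), (time, ch)) pairs on different channels.
    """
    pairs = []
    i = 0
    while i < len(timestamps) - 1:
        t0, ch0 = timestamps[i]
        t1, ch1 = timestamps[i + 1]
        if t1 - t0 <= window and ch0 != ch1:
            pairs.append((timestamps[i], timestamps[i + 1]))
            i += 2                     # consume both events of the pair
        else:
            i += 1                     # single event, move on
    return pairs

events = [(0.0, 0), (2e-9, 1),        # two channels within 2 ns: coincidence
          (1e-6, 0),                  # isolated single, no partner
          (2e-6, 1), (3e-6, 1)]       # far apart in time: no pair
pairs = find_coincidences(events)
```

Real systems additionally handle multiple coincidences and delayed windows for randoms estimation, which this sketch omits.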
Measuring the Epoch of Reionization using [CII] Intensity Mapping with TIME-Pilot
NASA Astrophysics Data System (ADS)
Crites, Abigail; Bock, James; Bradford, Matt; Bumble, Bruce; Chang, Tzu-Ching; Cheng, Yun-Ting; Cooray, Asantha R.; Hailey-Dunsheath, Steve; Hunacek, Jonathon; Li, Chao-Te; O'Brient, Roger; Shirokoff, Erik; Staniszewski, Zachary; Shiu, Corwin; Uzgil, Bade; Zemcov, Michael B.; Sun, Guochao
2017-01-01
TIME-Pilot (the Tomographic Ionized carbon Intensity Mapping Experiment) is a new instrument designed to probe the epoch of reionization (EoR) by measuring the 158 μm ionized carbon emission line [CII] from redshift 5-9. TIME-Pilot will also probe the molecular gas content of the universe during the epoch spanning the peak of star formation (z ~ 1-3) by making an intensity mapping measurement of the CO transitions in the TIME-Pilot band (CO(3-2), CO(4-3), CO(5-4), and CO(6-5)). I will describe the instrument we are building, a spectrometer with resolving power R ~ 100 sensitive to radiation at 200-300 GHz. The camera is designed to measure the line emission from galaxies using an intensity mapping technique. This instrument will allow us to detect the [CII] clustering fluctuations from faint galaxies during the EoR and compare these measurements to predicted [CII] amplitudes from current models. The CO measurements will allow us to constrain models for galaxies at lower redshift. The [CII] intensity mapping measurements that will be made with TIME-Pilot, and detailed measurements made with future, more sensitive mm-wavelength spectrometers, are complementary to 21-cm measurements of the EoR and to direct detections of high-redshift galaxies with HST, ALMA, and, in the future, JWST.
Harjula, A; Järvinen, A; Mattila, S; Porkka, L
1985-01-01
Single photon emission computerized tomography (SPECT) was performed three times in ten patients undergoing open-heart surgery: preoperatively and 2 and 12 weeks postoperatively. The operations were done for ischemic heart disease (5), aortic valvular stenosis (2), aortic valvular insufficiency (1), leaking mitral prosthetic valve (1) and combined aortic and mitral valvular stenosis and insufficiency (1). The healing process in the longitudinally divided sternum was evaluated from the SPECT study. Four conventional static images in two dimensions were registered in anteroposterior, posteroanterior and left and right lateral projections. A tomographic study was done, and quantitative analyses were performed. The ratio of the sternal counts to the counts from a thoracic vertebra was calculated for use as a reference. The activity ratios showed a similar pattern in six cases, with an initial increase and a slight decrease at 12 weeks compared with the preoperative values. In two cases the activity was still increasing after 12 postoperative weeks. One patient, who had also undergone sternotomy one year previously, showed only slightly increased activity. The activity at the areas of the sternal wires was increased in six cases. The study thus revealed differing patterns of isotope uptake, although recovery was uneventful in all patients. The differences may reflect the possibility that the operative course and the preoperative clinical status can influence the healing mechanisms.
Preliminary results of a prototype C-shaped PET designed for an in-beam PET system
NASA Astrophysics Data System (ADS)
Kim, Hyun-Il; Chung, Yong Hyun; Lee, Kisung; Kim, Kyeong Min; Kim, Yongkwon; Joung, Jinhun
2016-06-01
Positron emission tomography (PET) can be utilized in particle beam therapy to verify the dose distribution of the target volume as well as the accuracy of the treatment. We present an in-beam PET scanner that can be integrated into a particle beam therapy system. The proposed PET scanner consisted of 14 detector modules arranged in a C-shape to avoid blockage of the particle beam line by the detector modules. Each detector module was composed of a 9×9 array of 4.0 mm×4.0 mm×20.0 mm LYSO crystals optically coupled to four 29-mm-diameter PMTs using the photomultiplier-quadrant-sharing (PQS) technique. In this study, a Geant4 Application for Tomographic Emission (GATE) simulation study was conducted to design a C-shaped PET scanner and then experimental evaluation of the proposed design was performed. The spatial resolution and sensitivity were measured according to NEMA NU2-2007 standards and were 6.1 mm and 5.61 cps/kBq, respectively, which is in good agreement with our simulation, with an error rate of 12.0%. Taken together, our results demonstrate the feasibility of the proposed C-shaped in-beam PET system, which we expect will be useful for measuring dose distribution in particle therapy.
Image quality phantom and parameters for high spatial resolution small-animal SPECT
NASA Astrophysics Data System (ADS)
Visser, Eric P.; Harteveld, Anita A.; Meeuwis, Antoi P. W.; Disselhorst, Jonathan A.; Beekman, Freek J.; Oyen, Wim J. G.; Boerman, Otto C.
2011-10-01
At present, generally accepted standards to characterize small-animal single photon emission tomographs (SPECT) do not exist. Whereas for small-animal positron emission tomography (PET), the NEMA NU 4-2008 guidelines are available, such standards are still lacking for small-animal SPECT. More specifically, a dedicated image quality (IQ) phantom and corresponding IQ parameters are absent. The structures of the existing PET IQ phantom are too large to fully characterize the sub-millimeter spatial resolution of modern multi-pinhole SPECT scanners, and its diameter will not fit into all scanners when operating in high spatial resolution mode. We therefore designed and constructed an adapted IQ phantom with smaller internal structures and external diameter, and a facility to guarantee complete filling of the smallest rods. The associated IQ parameters were adapted from NEMA NU 4. An additional parameter, effective whole-body sensitivity, was defined since this was considered relevant in view of the variable size of the field of view and the use of multiple bed positions as encountered in modern small-animal SPECT scanners. The usefulness of the phantom was demonstrated for 99mTc in a USPECT-II scanner operated in whole-body scanning mode using a multi-pinhole mouse collimator with 0.6 mm pinhole diameter.
NASA Astrophysics Data System (ADS)
Pablico-Lansigan, Michele H.; Situ, Shu F.; Samia, Anna Cristina S.
2013-05-01
Magnetic particle imaging (MPI) is an emerging biomedical imaging technology that allows the direct quantitative mapping of the spatial distribution of superparamagnetic iron oxide nanoparticles. MPI's increased sensitivity and short image acquisition times foster the creation of tomographic images with high temporal and spatial resolution. The contrast and sensitivity of MPI is envisioned to transcend those of other medical imaging modalities presently used, such as magnetic resonance imaging (MRI), X-ray scans, ultrasound, computed tomography (CT), positron emission tomography (PET) and single photon emission computed tomography (SPECT). In this review, we present an overview of the recent advances in the rapidly developing field of MPI. We begin with a basic introduction of the fundamentals of MPI, followed by some highlights over the past decade of the evolution of strategies and approaches used to improve this new imaging technique. We also examine the optimization of iron oxide nanoparticle tracers used for imaging, underscoring the importance of size homogeneity and surface engineering. Finally, we present some future research directions for MPI, emphasizing the novel and exciting opportunities that it offers as an important tool for real-time in vivo monitoring. All these opportunities and capabilities that MPI presents are now seen as potential breakthrough innovations in timely disease diagnosis, implant monitoring, and image-guided therapeutics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Giacomelli, L.; Department of Physics, Università degli Studi di Milano-Bicocca, Milano; Conroy, S.
The Joint European Torus (JET, Culham, UK) is the largest tokamak in the world devoted to nuclear fusion experiments with magnetically confined deuterium (D) and deuterium-tritium (DT) plasmas. Neutrons produced in these plasmas are measured using various types of neutron detectors and spectrometers. Two of these instruments on JET make use of organic liquid scintillator detectors. The neutron emission profile monitor implements 19 liquid scintillation counters to detect the 2.45 MeV neutron emission from D plasmas. A new compact neutron spectrometer has been operational at JET since 2010 to measure the neutron energy spectra from both D and DT plasmas. Liquid scintillation detectors are sensitive to both neutron and gamma radiation but give light responses with different decay times, such that pulse shape discrimination techniques can be applied to identify the neutron contribution of interest in the data. The most common technique consists of integrating the radiation pulse shapes within different ranges of their rising and/or trailing edges. In this article, a step forward in this type of analysis is presented. The method applies a tomographic analysis to the 3-dimensional neutron and gamma pulse shape and pulse height distribution data obtained from liquid scintillation detectors, such that n/γ discrimination can be improved at lower energies and additional information can be gained on neutron contributions to the gamma events and vice versa.
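The "most common technique" this abstract builds on is charge comparison: the fraction of a pulse's integrated charge arriving in its tail separates slow-decaying neutron pulses from fast gamma pulses. A hedged sketch with entirely synthetic pulses and an arbitrary tail window:

```python
import numpy as np

# Hedged sketch of baseline charge-comparison pulse shape discrimination:
# neutrons excite a larger slow scintillation component, so their tail
# fraction is larger. Decay constants and the window index are invented.
def tail_to_total(pulse, tail_start):
    return np.sum(pulse[tail_start:]) / np.sum(pulse)

t = np.arange(50, dtype=float)
gamma_pulse = np.exp(-t / 5.0)                              # fast decay only
neutron_pulse = np.exp(-t / 5.0) + 0.5 * np.exp(-t / 20.0)  # extra slow tail
# a cut on tail_to_total(pulse, 20) would tag neutron candidates
```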
NASA Astrophysics Data System (ADS)
Raczyński, L.; Moskal, P.; Kowalski, P.; Wiślicki, W.; Bednarski, T.; Białas, P.; Czerwiński, E.; Kapłon, Ł.; Kochanowski, A.; Korcyl, G.; Kowal, J.; Kozik, T.; Krzemień, W.; Kubicz, E.; Molenda, M.; Moskal, I.; Niedźwiecki, Sz.; Pałka, M.; Pawlik-Niedźwiecka, M.; Rudy, Z.; Salabura, P.; Sharma, N. G.; Silarski, M.; Słomski, A.; Smyrski, J.; Strzelecki, A.; Wieczorek, A.; Zieliński, M.; Zoń, N.
2014-11-01
Currently, inorganic scintillator detectors are used in all commercial Time of Flight Positron Emission Tomography (TOF-PET) devices. The J-PET collaboration investigates the possibility of constructing a PET scanner from plastic scintillators, which would allow single-bed imaging of the whole human body. This paper describes a novel method of hit-position reconstruction based on sampled signals, and an example application of the method to a single module with a 30 cm long plastic strip read out on both ends by Hamamatsu R4998 photomultipliers. A sampling scheme is introduced that generates a vector of samples of a PET event waveform with respect to four user-defined amplitudes. The experimental setup provides irradiation of a chosen position in the plastic scintillator strip with annihilation gamma quanta of 511 keV energy. A statistical test for a multivariate normal (MVN) distribution of the measured vectors at a given position is developed, and it is shown that signals sampled at four thresholds in the voltage domain are approximately normally distributed variables. With the presented analysis of vectors built from waveform samples acquired at four thresholds, we obtain a spatial resolution of about 1 cm and a timing resolution of about 80 ps (σ).
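Under the MVN model established in this abstract, one natural way to exploit calibration data is to assign a measured sample vector to the calibration position with the smallest Mahalanobis distance. A hedged sketch, not the J-PET code: the two-component vectors, reference means, and covariances below are synthetic placeholders standing in for measured threshold-sample distributions.

```python
import numpy as np

# Hedged sketch: minimum-Mahalanobis-distance position assignment under
# per-position multivariate normal models (all reference data invented).
def classify_position(v, refs):
    """refs: iterable of (position_cm, mean_vector, covariance_matrix)."""
    best, best_d2 = None, np.inf
    for pos, mu, cov in refs:
        d = v - mu
        d2 = d @ np.linalg.solve(cov, d)     # squared Mahalanobis distance
        if d2 < best_d2:
            best, best_d2 = pos, d2
    return best

refs = [(0.0, np.zeros(2), np.eye(2)),
        (10.0, np.full(2, 5.0), np.eye(2))]
pos = classify_position(np.array([4.8, 5.2]), refs)   # → 10.0
```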
Dual-Modality PET/Ultrasound imaging of the Prostate
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huber, Jennifer S.; Moses, William W.; Pouliot, Jean
2005-11-11
Functional imaging with positron emission tomography (PET) will detect malignant tumors in the prostate and/or prostate bed, as well as possibly help determine tumor ''aggressiveness''. However, the relative uptake in a prostate tumor can be so great that few other anatomical landmarks are visible in a PET image. Ultrasound imaging with a transrectal probe provides anatomical detail in the prostate region that can be co-registered with the sensitive functional information from the PET imaging. Imaging the prostate with both PET and transrectal ultrasound (TRUS) will help determine the location of any cancer within the prostate region. This dual-modality imaging should help provide better detection and treatment of prostate cancer. LBNL has built a high performance positron emission tomograph optimized to image the prostate. Compared to a standard whole-body PET camera, our prostate-optimized PET camera has the same sensitivity and resolution, lower background, and lower cost. We plan to develop the hardware and software tools needed for a validated dual PET/TRUS prostate imaging system. We also plan to develop dual prostate imaging with PET and external transabdominal ultrasound, in case the TRUS system is too uncomfortable for some patients. We present the design and intended clinical uses for these dual imaging systems.
Numerical study on simultaneous emission and transmission tomography in the MRI framework
NASA Astrophysics Data System (ADS)
Gjesteby, Lars; Cong, Wenxiang; Wang, Ge
2017-09-01
Multi-modality imaging methods are instrumental for advanced diagnosis and therapy. Specifically, a hybrid system that combines computed tomography (CT), nuclear imaging, and magnetic resonance imaging (MRI) will be a Holy Grail of medical imaging, delivering complementary structural/morphological, functional, and molecular information for precision medicine. A novel imaging method was recently demonstrated that takes advantage of radiotracer polarization to combine MRI principles with nuclear imaging. This approach allows the concentration of a polarized γ-ray emitting radioisotope to be imaged with MRI resolution potentially outperforming the standard nuclear imaging mode at a sensitivity significantly higher than that of MRI. In our work, we propose to acquire MRI-modulated nuclear data for simultaneous image reconstruction of both emission and transmission parameters, suggesting the potential for simultaneous CT-SPECT-MRI. The synchronized diverse datasets allow excellent spatiotemporal registration and unique insight into physiological and pathological features. Here we describe the methodology involving the system design with emphasis on the formulation for tomographic images, even when significant radiotracer signals are limited to a region of interest (ROI). Initial numerical results demonstrate the feasibility of our approach for reconstructing concentration and attenuation images through a head phantom with various radio-labeled ROIs. Additional considerations regarding the radioisotope characteristics are also discussed.
Single photon emission computed tomography-guided Cerenkov luminescence tomography
NASA Astrophysics Data System (ADS)
Hu, Zhenhua; Chen, Xueli; Liang, Jimin; Qu, Xiaochao; Chen, Duofang; Yang, Weidong; Wang, Jing; Cao, Feng; Tian, Jie
2012-07-01
Cerenkov luminescence tomography (CLT) has become a valuable tool for preclinical imaging because of its ability to reconstruct the three-dimensional distribution and activity of radiopharmaceuticals. However, it is still far from a mature technology and suffers from relatively low spatial resolution due to the ill-posed inverse problem of the tomographic reconstruction. In this paper, we present a single photon emission computed tomography (SPECT)-guided reconstruction method for CLT, in which a priori information on the permissible source region (PSR) from SPECT imaging results is incorporated to effectively reduce the ill-posedness of the inverse reconstruction problem. The performance of the method was first validated with experimental reconstructions of an adult athymic nude mouse implanted with a Na131I radioactive source and of an adult athymic nude mouse that received an intravenous tail injection of Na131I. A tissue-mimicking phantom based experiment was then conducted to illustrate the ability of the proposed method to resolve double sources. Compared with the traditional PSR strategy, in which the PSR is determined by the surface flux distribution, the proposed method obtained much more accurate and encouraging localization and resolution results. Preliminary results showed that the proposed SPECT-guided reconstruction method was insensitive to the choice of regularization method and could ignore the heterogeneity of tissues, which avoids the organ segmentation procedure.
Han, Hyemin; Park, Joonsuk
2018-01-01
Recent debates about the conventional threshold used in the fields of neuroscience and psychology, namely P < 0.05, have spurred researchers to consider alternative ways to analyze fMRI data. A group of methodologists and statisticians have considered Bayesian inference as a candidate methodology. However, few previous studies have attempted to provide end users of fMRI analysis tools, such as SPM 12, with practical guidelines about how to conduct Bayesian inference. In the present study, we aim to demonstrate how to utilize Bayesian inference, Bayesian second-level inference in particular, implemented in SPM 12 by analyzing fMRI data available to the public via NeuroVault. In addition, to help end users understand how Bayesian inference actually works in SPM 12, we examine outcomes from Bayesian second-level inference implemented in SPM 12 by comparing them with those from classical second-level inference. Finally, we provide practical guidelines about how to set the parameters for Bayesian inference and how to interpret the results, such as Bayes factors, from the inference. We also discuss the practical and philosophical benefits of Bayesian inference and directions for future research. PMID:29456498
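The Bayes factors this abstract mentions can be illustrated without SPM 12: a common rough approximation derives a Bayes factor from the BIC values of two competing models. A hedged sketch with invented log-likelihoods, not the study's actual data or SPM's machinery:

```python
import math

# Hedged illustration: BF01 ≈ exp((BIC_1 - BIC_0) / 2), the standard
# unit-information-prior approximation. All numbers below are invented.
def bic(log_likelihood, n_params, n_obs):
    return n_params * math.log(n_obs) - 2.0 * log_likelihood

def approx_bf01(loglik0, k0, loglik1, k1, n):
    return math.exp((bic(loglik1, k1, n) - bic(loglik0, k0, n)) / 2.0)

bf01 = approx_bf01(loglik0=-105.0, k0=1, loglik1=-100.0, k1=2, n=50)
# bf01 < 1 here, i.e. the data favour the richer model H1
```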
An introduction to Bayesian statistics in health psychology.
Depaoli, Sarah; Rus, Holly M; Clifton, James P; van de Schoot, Rens; Tiemensma, Jitske
2017-09-01
The aim of the current article is to provide a brief introduction to Bayesian statistics within the field of health psychology. Bayesian methods are increasing in prevalence in applied fields, and they have been shown in simulation research to improve the estimation accuracy of structural equation models, latent growth curve (and mixture) models, and hierarchical linear models. Likewise, Bayesian methods can be used with small sample sizes since they do not rely on large sample theory. In this article, we discuss several important components of Bayesian statistics as they relate to health-based inquiries. We discuss the incorporation and impact of prior knowledge into the estimation process and the different components of the analysis that should be reported in an article. We present an example implementing Bayesian estimation in the context of blood pressure changes after participants experienced an acute stressor. We conclude with final thoughts on the implementation of Bayesian statistics in health psychology, including suggestions for reviewing Bayesian manuscripts and grant proposals. We have also included an extensive amount of online supplementary material to complement the content presented here, including Bayesian examples using many different software programmes and an extensive sensitivity analysis examining the impact of priors.
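The prior-to-posterior updating this introduction describes is easiest to see in a conjugate example. A hedged illustration in the spirit of the article's blood-pressure scenario, with invented numbers and a known sampling variance (not the article's actual analysis):

```python
# Hedged illustration: conjugate normal-normal update for the mean
# blood-pressure change after an acute stressor (all numbers invented).
def posterior_normal(mu0, tau0_sq, ybar, sigma_sq, n):
    prec = 1.0 / tau0_sq + n / sigma_sq          # posterior precision
    mean = (mu0 / tau0_sq + n * ybar / sigma_sq) / prec
    return mean, 1.0 / prec                      # posterior mean, variance

# prior: change ~ N(0, 10^2) mmHg; data: n = 25, ybar = 8 mmHg, sigma = 12
mean, var = posterior_normal(0.0, 100.0, 8.0, 144.0, 25)
# the posterior mean is pulled slightly from 8 toward the prior mean 0
```

The same mechanics underlie the prior sensitivity analysis the article recommends: rerunning the update with different (mu0, tau0_sq) shows how strongly the prior drives the conclusion.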
Farhate, Camila Viana Vieira; Souza, Zigomar Menezes de; Oliveira, Stanley Robson de Medeiros; Tavares, Rose Luiza Moraes; Carvalho, João Luís Nunes
2018-01-01
Soil CO2 emissions are regarded as one of the largest flows of the global carbon cycle, and small changes in their magnitude can have a large effect on the CO2 concentration in the atmosphere. Thus, a better understanding of this attribute would enable the identification of promoters and the development of strategies to mitigate the risks of climate change. Therefore, our study aimed at using data mining techniques to predict the soil CO2 emission induced by crop management in sugarcane areas in Brazil. To do so, we used different variable selection methods (correlation, chi-square, wrapper) and classification methods (decision tree, Bayesian models, neural networks, support vector machine, bagging with logistic regression), and finally we tested the efficiency of the different approaches through the Receiver Operating Characteristic (ROC) curve. The original dataset consisted of 19 variables (18 independent variables and one dependent, or response, variable). The combination of cover crops and minimum tillage is an effective strategy to promote the mitigation of soil CO2 emissions, with average CO2 emissions of 63 kg ha-1 day-1. The variables soil moisture, soil temperature (Ts), rainfall, pH, and organic carbon were the most frequently selected for soil CO2 emission classification across the different attribute selection methods. According to the results of the ROC curve, the best approaches for soil CO2 emission classification were the following: (I) the Multilayer Perceptron classifier with attribute selection through the wrapper method, which presented a false-positive rate of 13.50%, a true-positive rate of 94.20%, and an area under the curve (AUC) of 89.90%; and (II) the Bagging classifier with logistic regression and attribute selection through the chi-square method, which presented the same false-positive rate (13.50%), true-positive rate (94.20%), and AUC (89.90%).
However, approach (I) stands out relative to (II) for its higher positive-class accuracy (high CO2 emission) and lower computational cost.
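The AUC figure used to rank these classifiers has a simple probabilistic reading: it is the probability that a randomly chosen positive ("high emission") case scores above a randomly chosen negative one (the Mann-Whitney formulation). A hedged sketch with invented scores, not the study's data:

```python
# Hedged sketch of the ROC evaluation step: AUC computed directly from
# the Mann-Whitney win count over positive/negative score pairs.
def roc_auc(scores, labels):
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

auc = roc_auc([0.9, 0.8, 0.3, 0.4], [1, 0, 0, 1])  # → 0.75
```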
Cui, Yu Yan; Brioude, Jerome; McKeen, Stuart A.; ...
2015-07-28
Methane (CH4) is the primary component of natural gas and has a larger global warming potential than CO2. Some recent top-down studies based on observations showed CH4 emissions in California's South Coast Air Basin (SoCAB) were greater than those expected from population-apportioned bottom-up state inventories. In this study, we quantify CH4 emissions with an advanced mesoscale inverse modeling system at a resolution of 8 km × 8 km, using aircraft measurements in the SoCAB during the 2010 Nexus of Air Quality and Climate Change campaign to constrain the inversion. To simulate atmospheric transport, we use the FLEXible PARTicle-Weather Research and Forecasting (FLEXPART-WRF) Lagrangian particle dispersion model driven by three configurations of the Weather Research and Forecasting (WRF) mesoscale model. We determine surface fluxes of CH4 using a Bayesian least squares method in a four-dimensional inversion. Simulated CH4 concentrations with the posterior emission inventory achieve much better correlations with the measurements (R2 = 0.7) than with the prior inventory (U.S. Environmental Protection Agency's National Emission Inventory 2005, R2 = 0.5). The emission estimates for CH4 in the posterior, 46.3 ± 9.2 Mg CH4/h, are consistent with published observation-based estimates. Changes in the spatial distribution of CH4 emissions in the SoCAB between the prior and posterior inventories are discussed. Missing or underestimated emissions from dairies, the oil/gas system, and landfills in the SoCAB seem to explain the differences between the prior and posterior inventories. Furthermore, we estimate that dairies contributed 5.9 ± 1.7 Mg CH4/h and that the two sectors of the oil and gas industries (production and downstream) and landfills together contributed 39.6 ± 8.1 Mg CH4/h in the SoCAB.
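A Bayesian least-squares inversion of the kind this abstract describes updates prior fluxes by a gain times the observation-minus-model mismatch, x_hat = x_a + B H^T (H B H^T + R)^-1 (y - H x_a). A hedged two-flux toy; the observation operator H and all covariances and numbers below are illustrative only, not the study's footprints or data:

```python
import numpy as np

# Hedged toy Bayesian least-squares flux inversion (Kalman-type update).
def bayesian_inversion(x_a, B, H, y, R):
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # gain matrix
    x_hat = x_a + K @ (y - H @ x_a)
    return x_hat, (np.eye(len(x_a)) - K @ H) @ B   # posterior mean, covariance

x_a = np.array([1.0, 1.0])                  # prior fluxes
H = np.array([[1.0, 0.0], [1.0, 1.0]])      # toy observation operator
B, R = 0.25 * np.eye(2), 0.01 * np.eye(2)   # prior / observation error covs
x_hat, P = bayesian_inversion(x_a, B, H, np.array([2.0, 3.5]), R)
# with accurate observations the posterior moves close to [2, 1.5]
```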
Historical emissions critical for mapping decarbonization pathways
NASA Astrophysics Data System (ADS)
Majkut, J.; Kopp, R. E.; Sarmiento, J. L.; Oppenheimer, M.
2016-12-01
Policymakers have set a goal of limiting temperature increase from human influence on the climate. This motivates the identification of decarbonization pathways to stabilize atmospheric concentrations of CO2. In this context, the future behavior of CO2 sources and sinks define the CO2 emissions necessary to meet warming thresholds with specified probabilities. We adopt a simple model of the atmosphere-land-ocean carbon balance to reflect uncertainty in how natural CO2 sinks will respond to increasing atmospheric CO2 and temperature. Bayesian inversion is used to estimate the probability distributions of selected parameters of the carbon model. Prior probability distributions are chosen to reflect the behavior of CMIP5 models. We then update these prior distributions by running historical simulations of the global carbon cycle and inverting with observationally-based inventories and fluxes of anthropogenic carbon in the ocean and atmosphere. The result is a best-estimate of historical CO2 sources and sinks and a model of how CO2 sources and sinks will vary in the future under various emissions scenarios, with uncertainty. By linking the carbon model to a simple climate model, we calculate emissions pathways and carbon budgets consistent with meeting specific temperature thresholds and identify key factors that contribute to remaining uncertainty. In particular, we show how the assumed history of CO2 emissions from land use change (LUC) critically impacts estimates of the strength of the land CO2 sink via CO2 fertilization. Different estimates of historical LUC emissions taken from the literature lead to significantly different parameterizations of the carbon system. High historical CO2 emissions from LUC lead to a more robust CO2 fertilization effect, significantly lower future atmospheric CO2 concentrations, and an increased amount of CO2 that can be emitted to satisfy temperature stabilization targets. 
Thus, in our model, historical LUC emissions have a significant impact on allowable carbon budgets under temperature targets.
NASA Astrophysics Data System (ADS)
White, Emily; Rigby, Matt; O'Doherty, Simon; Stavert, Ann; Lunt, Mark; Nemitz, Eiko; Helfter, Carole; Allen, Grant; Pitt, Joe; Bauguitte, Stéphane; Levy, Pete; van Oijen, Marcel; Williams, Mat; Smallman, Luke; Palmer, Paul
2016-04-01
Having a comprehensive understanding, on a countrywide scale, of both biogenic and anthropogenic CO2 emissions is essential for knowing how best to reduce anthropogenic emissions and for understanding how the terrestrial biosphere is responding to global fossil fuel emissions. Whilst anthropogenic CO2 flux estimates are fairly well constrained, fluxes from biogenic sources are not. This work will help to verify existing anthropogenic emissions inventories and give a better understanding of biosphere-atmosphere CO2 exchange. Using an innovative top-down inversion scheme, a hierarchical Bayesian Markov chain Monte Carlo approach with reversible-jump "trans-dimensional" basis function selection, we aim to find emissions estimates for biogenic and anthropogenic sources simultaneously. Our approach allows flux uncertainties to be derived more comprehensively than previous methods, and allows the resolved spatial scales in the solution to be determined using the data. We use atmospheric CO2 mole fraction data from the UK Deriving Emissions related to Climate Change (DECC) and Greenhouse gAs UK and Global Emissions (GAUGE) projects. The network comprises six tall-tower sites, flight campaigns, and a ferry transect along the east coast, and enables us to derive high-resolution monthly flux estimates across the UK and Ireland for the period 2013-2015. We have derived UK total fluxes of 675 ± 78 Tg/yr during January 2014 (seasonal maximum) and 23 ± 96 Tg/yr during May 2014 (seasonal minimum). Our disaggregated anthropogenic and biogenic flux estimates are compared to a new high-resolution, time-resolved anthropogenic inventory that will underpin future UNFCCC reports by the UK, and to the DALEC carbon cycle model. This allows us to identify where significant differences exist between these "bottom-up" and "top-down" flux estimates and suggest reasons for discrepancies.
We will highlight the strengths and limitations of the UK's CO2 emissions verification infrastructure at present and outline improvements that could be made in the future.
Prior approval: the growth of Bayesian methods in psychology.
Andrews, Mark; Baguley, Thom
2013-02-01
Within the last few years, Bayesian methods of data analysis in psychology have proliferated. In this paper, we briefly review the history of the Bayesian approach to statistics, and consider the implications that Bayesian methods have for the theory and practice of data analysis in psychology.
Using artificial neural networks (ANN) for open-loop tomography
NASA Astrophysics Data System (ADS)
Osborn, James; De Cos Juez, Francisco Javier; Guzman, Dani; Butterley, Timothy; Myers, Richard; Guesalaga, Andres; Laine, Jesus
2011-09-01
The next generation of adaptive optics (AO) systems require tomographic techniques in order to correct for atmospheric turbulence along lines of sight separated from the guide stars. Multi-object adaptive optics (MOAO) is one such technique. Here, we present a method which uses an artificial neural network (ANN) to reconstruct the target phase given off-axis reference sources. This method does not require any input of the turbulence profile and is therefore less susceptible to changing conditions than some existing methods. We compare our ANN method with a standard least-squares-type matrix multiplication method (MVM) in simulation and find that its tomographic error is similar to that of the MVM method. In changing conditions, the tomographic error increases for MVM but remains constant with the ANN model, and no large matrix inversions are required.
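The core idea, a network learning the mapping from off-axis slope measurements to the on-axis target, can be sketched in a few lines. A minimal illustration, not the authors' network or training data: the linear ground-truth map T stands in for the simulated atmosphere, and the layer sizes and learning rate are arbitrary.

```python
import numpy as np

# Hedged sketch: one-hidden-layer network trained by full-batch gradient
# descent on synthetic (off-axis, on-axis) slope pairs.
rng = np.random.default_rng(0)
n_in, n_hid, n_out = 8, 16, 4
T = rng.normal(size=(n_out, n_in)) / n_in      # assumed "true" tomographic map
X = rng.normal(size=(2000, n_in))              # off-axis measurements
Y = X @ T.T                                    # on-axis targets

W1 = 0.1 * rng.normal(size=(n_in, n_hid))
W2 = 0.1 * rng.normal(size=(n_hid, n_out))
mse0 = np.mean((np.tanh(X @ W1) @ W2 - Y) ** 2)
for _ in range(500):
    Hid = np.tanh(X @ W1)
    err = Hid @ W2 - Y
    gW2 = Hid.T @ err / len(X)                 # gradients of mean squared error
    gW1 = X.T @ ((err @ W2.T) * (1 - Hid ** 2)) / len(X)
    W2 -= 0.05 * gW2
    W1 -= 0.05 * gW1
mse = np.mean((np.tanh(X @ W1) @ W2 - Y) ** 2)
```

Once trained, applying the network is only two matrix products and an elementwise tanh, which is the point of the comparison with MVM: no large matrix inversion is needed at run time.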
NASA Astrophysics Data System (ADS)
Massambone de Oliveira, Rafael; Salomão Helou, Elias; Fontoura Costa, Eduardo
2016-11-01
We present a method for non-smooth convex minimization which is based on subgradient directions and string-averaging techniques. In this approach, the set of available data is split into sequences (strings) and a given iterate is processed independently along each string, possibly in parallel, by an incremental subgradient method (ISM). The end-points of all strings are averaged to form the next iterate. The method is useful to solve sparse and large-scale non-smooth convex optimization problems, such as those arising in tomographic imaging. A convergence analysis is provided under realistic, standard conditions. Numerical tests are performed in a tomographic image reconstruction application, showing good performance for the convergence speed when measured as the decrease ratio of the objective function, in comparison to classical ISM.
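The string-averaging scheme described above can be sketched on a tiny non-smooth problem, f(x) = sum_i |a_i . x - b_i|: split the terms into strings, run an incremental subgradient pass along each string from the current iterate, then average the strings' end points. The data below form a hedged toy consistent system, not a tomography problem:

```python
import numpy as np

# Hedged sketch of string-averaging incremental subgradient minimization.
def sa_ism(A, b, strings, x0, n_iters=300):
    x = x0.astype(float)
    for k in range(n_iters):
        t = 1.0 / (k + 1)                      # diminishing step size
        ends = []
        for s in strings:                      # strings could run in parallel
            z = x.copy()
            for i in s:                        # incremental subgradient pass
                z -= t * np.sign(A[i] @ z - b[i]) * A[i]
            ends.append(z)
        x = np.mean(ends, axis=0)              # string averaging
    return x

A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
b = np.array([1.0, 2.0, 3.0])                  # consistent: minimizer is [1, 2]
x = sa_ism(A, b, strings=[[0, 1], [2]], x0=np.zeros(2))
```

The independence of the inner loops over strings is what makes the method attractive for the large, separable objectives arising in tomographic reconstruction.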
A local approach for focussed Bayesian fusion
NASA Astrophysics Data System (ADS)
Sander, Jennifer; Heizmann, Michael; Goussev, Igor; Beyerer, Jürgen
2009-04-01
Local Bayesian fusion approaches aim to reduce the high storage and computational costs of Bayesian fusion that is not tied to fixed modeling assumptions. Using the small world formalism, we argue why this approach conforms to Bayesian theory. Then, we concentrate on the realization of local Bayesian fusion by focussing the fusion process solely on local regions that are task relevant with a high probability. The resulting local models then correspond to restricted versions of the original one. In a previous publication, we used bounds on the probability of misleading evidence to show the validity of the pre-evaluation of task-specific knowledge and prior information which we perform to build local models. In this paper, we prove the validity of this approach using information-theoretic arguments. For additional efficiency, local Bayesian fusion can be realized in a distributed manner, in which several local Bayesian fusion tasks are evaluated and unified after the actual fusion process. For the practical realization of distributed local Bayesian fusion, software agents are a natural fit. There is a natural analogy between the resulting agent-based architecture and criminal investigations in real life. We show how this analogy can be used to further improve the efficiency of distributed local Bayesian fusion. Using a landscape model, we present an experimental study of distributed local Bayesian fusion in the field of reconnaissance, which highlights its high potential.
Vemmer, T; Steinbüchel, C; Bertram, J; Eschner, W; Kögler, A; Luig, H
1997-03-01
The purpose of this study was to determine whether data acquisition in list mode and iterative tomographic reconstruction would render feasible cardiac phase-synchronized thallium-201 single-photon emission tomography (SPET) of the myocardium under routine conditions, without modifications to the tracer dose, acquisition time, or number of steps of the gamma camera. Seventy non-selected patients underwent 201Tl SPET imaging according to a routine protocol (74 MBq/2 mCi 201Tl, 180 degrees rotation of the gamma camera, 32 steps, 30 min). Gamma camera data, the ECG, and a time signal were recorded in list mode. The cardiac cycle was divided into eight phases, the end-diastolic phase encompassing the QRS complex and the end-systolic phase the T wave. Both phase- and non-phase-synchronized tomograms based on the same list mode data were reconstructed iteratively. Phase-synchronized and non-synchronized images were compared. Patients were divided into two groups depending on whether or not coronary artery disease had been definitely diagnosed prior to SPET imaging. The numbers of patients in both groups demonstrating defects visible on the phase-synchronized but not on the non-synchronized images were compared. It was found that both postexercise and redistribution phase tomograms were suited for interpretation. The changes from end-diastolic to end-systolic images allowed a comparative assessment of regional wall motility and tracer uptake. End-diastolic tomograms provided the best definition of defects. Additional defects not apparent on non-synchronized images were visible in 40 patients, six of whom did not show any defect on the non-synchronized images. Of 42 patients in whom coronary artery disease had been definitely diagnosed, 19 had additional defects not visible on the non-synchronized images, in comparison to 21 of 28 in whom coronary artery disease was suspected (P < 0.02; chi-square test).
It is concluded that cardiac phase-synchronized 201Tl SPET of the myocardium was made feasible by list mode data acquisition and iterative reconstruction. The additional findings on the phase-synchronized tomograms, not visible on the non-synchronized ones, represented genuine defects. Cardiac phase-synchronized 201Tl SPET is advantageous in allowing simultaneous assessment of regional wall motion and tracer uptake, and in visualizing smaller defects.
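The gating step this study relies on, assigning each list-mode event to one of eight cardiac phases by its fractional position within the enclosing R-R interval, can be sketched as follows. Timestamps (in ms) and the event stream are invented for illustration:

```python
import bisect

# Hedged sketch of cardiac phase gating of a list-mode event stream.
def gate_events(event_times, r_peaks, n_phases=8):
    gated = [[] for _ in range(n_phases)]
    for t in event_times:
        i = bisect.bisect_right(r_peaks, t)
        if 0 < i < len(r_peaks):               # only complete R-R intervals
            frac = (t - r_peaks[i - 1]) / (r_peaks[i] - r_peaks[i - 1])
            gated[int(frac * n_phases)].append(t)
    return gated

phases = gate_events([100, 450, 790, 900], r_peaks=[0, 800, 1600])
# events at 100 and 900 ms land in phase 1 of their respective beats
```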
NASA Astrophysics Data System (ADS)
Lanusse, F.; Rassat, A.; Starck, J.-L.
2015-06-01
Context. Upcoming spectroscopic galaxy surveys are extremely promising to help in addressing the major challenges of cosmology, in particular in understanding the nature of the dark universe. The strength of these surveys, naturally described in spherical geometry, comes from their unprecedented depth and width, but an optimal extraction of their three-dimensional information is of utmost importance to best constrain the properties of the dark universe. Aims: Although there is theoretical motivation and novel tools to explore these surveys using the 3D spherical Fourier-Bessel (SFB) power spectrum of galaxy number counts C_ℓ(k,k'), most survey optimisations and forecasts are based on the tomographic spherical harmonics power spectrum C_ℓ^(ij). The goal of this paper is to perform a new investigation of the information that can be extracted from these two analyses in the context of planned stage IV wide-field galaxy surveys. Methods: We compared tomographic and 3D SFB techniques by comparing the forecast cosmological parameter constraints obtained from a Fisher analysis. The comparison was made possible by careful and coherent treatment of non-linear scales in the two analyses, which makes this study the first to compare 3D SFB and tomographic constraints on an equal footing. Nuisance parameters related to a scale- and redshift-dependent galaxy bias were also included in the computation of the 3D SFB and tomographic power spectra for the first time. Results: Tomographic and 3D SFB methods can recover similar constraints in the absence of systematics. This requires choosing an optimal number of redshift bins for the tomographic analysis, which we computed to be N = 26 for zmed ≃ 0.4, N = 30 for zmed ≃ 1.0, and N = 42 for zmed ≃ 1.7. When marginalising over nuisance parameters related to the galaxy bias, the forecast 3D SFB constraints are less affected by this source of systematics than the tomographic constraints.
In addition, the rate of increase of the figure of merit as a function of median redshift is higher for the 3D SFB method than for the 2D tomographic method. Conclusions: Constraints from the 3D SFB analysis are less sensitive to unavoidable systematics stemming from a redshift- and scale-dependent galaxy bias. Even for surveys that are optimised with tomography in mind, a 3D SFB analysis is more powerful. In addition, for survey optimisation, the figure of merit for the 3D SFB method increases more rapidly with redshift, especially at higher redshifts, suggesting that the 3D SFB method should be preferred for designing and analysing future wide-field spectroscopic surveys. CosmicPy, the Python package developed for this paper, is freely available at https://cosmicpy.github.io. Appendices are available in electronic form at http://www.aanda.org
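The Fisher forecasting used for this comparison reduces to simple linear algebra: marginalised 1-sigma parameter errors are the square roots of the diagonal of the inverse Fisher matrix, and a common figure of merit is sqrt(det F). A hedged two-parameter toy; the matrix below is illustrative only, not a survey forecast:

```python
import numpy as np

# Hedged toy of a Fisher-matrix forecast summary.
def fisher_summary(F):
    cov = np.linalg.inv(F)                       # parameter covariance
    return np.sqrt(np.diag(cov)), np.sqrt(np.linalg.det(F))

F = np.array([[400.0, 120.0],
              [120.0, 100.0]])
sigmas, fom = fisher_summary(F)   # sigmas → [0.0625, 0.125], fom → 160.0
```

A larger figure of merit means a smaller allowed parameter-space area, which is the quantity whose growth with median redshift is compared between the two methods.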
A Bayesian Nonparametric Approach to Test Equating
ERIC Educational Resources Information Center
Karabatsos, George; Walker, Stephen G.
2009-01-01
A Bayesian nonparametric model is introduced for score equating. It is applicable to all major equating designs, and has advantages over previous equating models. Unlike the previous models, the Bayesian model accounts for positive dependence between distributions of scores from two tests. The Bayesian model and the previous equating models are…
Bayesian Model Averaging for Propensity Score Analysis
ERIC Educational Resources Information Center
Kaplan, David; Chen, Jianshen
2013-01-01
The purpose of this study is to explore Bayesian model averaging in the propensity score context. Previous research on Bayesian propensity score analysis does not take into account model uncertainty. In this regard, an internally consistent Bayesian framework for model building and estimation must also account for model uncertainty. The…
An attempt at estimating Paris area CO2 emissions from atmospheric concentration measurements
NASA Astrophysics Data System (ADS)
Bréon, F. M.; Broquet, G.; Puygrenier, V.; Chevallier, F.; Xueref-Rémy, I.; Ramonet, M.; Dieudonné, E.; Lopez, M.; Schmidt, M.; Perrussel, O.; Ciais, P.
2014-04-01
Atmospheric concentration measurements are used to adjust the daily to monthly budget of CO2 emissions from the AirParif inventory of the Paris agglomeration. We use 5 atmospheric monitoring sites including one at the top of the Eiffel tower. The atmospheric inversion is based on a Bayesian approach, and relies on an atmospheric transport model with a spatial resolution of 2 km with boundary conditions from a global coarse grid transport model. The inversion tool adjusts the CO2 fluxes (anthropogenic and biogenic) with a temporal resolution of 6 h, assuming temporal correlation of emissions uncertainties within the daily cycle and from day to day, while keeping the a priori spatial distribution from the emission inventory. The inversion significantly improves the agreement between measured and modelled concentrations. However, the amplitude of the atmospheric transport errors is often large compared to the CO2 gradients between the sites that are used to estimate the fluxes, in particular for the Eiffel tower station. In addition, we sometimes observe large model-measurement differences upwind from the Paris agglomeration, which confirms the large and poorly constrained contribution from distant sources and sinks included in the prescribed CO2 boundary conditions. These results suggest that (i) the Eiffel measurements at 300 m above ground cannot be used with the current system and (ii) the inversion should rely on the measured upwind-downwind gradients rather than the raw mole fraction measurements. With such a setup, realistic emissions are retrieved for two 30-day periods. Similar inversions over longer periods are necessary for a proper evaluation of the results.
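The Gaussian Bayesian update underlying this kind of flux inversion can be sketched as follows. The transport operator H, the prior covariance B with temporal correlation, and every numerical value are invented placeholders for illustration, not the AirParif system.

```python
import numpy as np

# Generic sketch of the linear-Gaussian Bayesian update used in atmospheric
# flux inversion: fluxes x with prior covariance B (here with temporal
# correlation between 6-h periods), observations y = Hx + noise (cov. R).
def bayesian_inversion(x_prior, B, H, R, y):
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # Kalman-type gain
    x_post = x_prior + K @ (y - H @ x_prior)       # posterior mean fluxes
    A_post = B - K @ H @ B                         # posterior covariance
    return x_post, A_post

n = 8                                              # eight 6-h flux periods
t = np.arange(n)
corr = np.exp(-np.abs(t[:, None] - t[None, :]) / 4.0)  # temporal correlation
B = 4.0 * corr                                     # prior flux error covariance
H = np.ones((1, n)) / n                            # one obs of the mean signal
R = np.array([[0.1]])                              # observation error
x_prior = np.full(n, 10.0)
y = np.array([12.0])                               # measured signal above prior

x_post, A_post = bayesian_inversion(x_prior, B, H, R, y)
# The correlated fluxes all move toward the observation, and the posterior
# variances shrink relative to the prior.
assert np.all(x_post > x_prior)
assert np.all(np.diag(A_post) < np.diag(B))
```

The temporal-correlation structure in B is what lets a single aggregate observation constrain every 6-h flux period, mirroring the correlation assumptions described in the abstract.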
Astrocytic tracer dynamics estimated from [1-¹¹C]-acetate PET measurements.
Arnold, Andrea; Calvetti, Daniela; Gjedde, Albert; Iversen, Peter; Somersalo, Erkki
2015-12-01
We address the problem of estimating the unknown parameters of a model of tracer kinetics from sequences of positron emission tomography (PET) scan data using a statistical sequential algorithm for the inference of magnitudes of dynamic parameters. The method, based on Bayesian statistical inference, is a modification of a recently proposed particle filtering and sequential Monte Carlo algorithm, where instead of preassigning the accuracy in the propagation of each particle, we fix the time step and account for the numerical errors in the innovation term. We apply the algorithm to PET images of [1-¹¹C]-acetate-derived tracer accumulation, estimating the transport rates in a three-compartment model of astrocytic uptake and metabolism of the tracer for a cohort of 18 volunteers from 3 groups, corresponding to healthy control individuals, cirrhotic liver and hepatic encephalopathy patients. The distribution of the parameters for the individuals and for the groups presented within the Bayesian framework support the hypothesis that the parameters for the hepatic encephalopathy group follow a significantly different distribution than the other two groups. The biological implications of the findings are also discussed. © The Authors 2014. Published by Oxford University Press on behalf of the Institute of Mathematics and its Applications. All rights reserved.
NASA Astrophysics Data System (ADS)
Fan, Lulu; Han, Yunkun; Nikutta, Robert; Drouart, Guillaume; Knudsen, Kirsten K.
2016-06-01
We utilize a Bayesian approach to fit the observed mid-IR-to-submillimeter/millimeter spectral energy distributions (SEDs) of 22 WISE-selected and submillimeter-detected, hyperluminous hot dust-obscured galaxies (Hot DOGs), with spectroscopic redshift ranging from 1.7 to 4.6. We compare the Bayesian evidence of a torus plus graybody (Torus+GB) model with that of a torus-only (Torus) model and find that the Torus+GB model has higher Bayesian evidence for all 22 Hot DOGs than the torus-only model, which presents strong evidence in favor of the Torus+GB model. By adopting the Torus+GB model, we decompose the observed IR SEDs of Hot DOGs into torus and cold dust components. The main results are as follows. (1) Hot DOGs in our submillimeter-detected sample are hyperluminous (L_IR ≥ 10^13 L_⊙), with torus emission dominating the IR energy output. However, cold dust emission is non-negligible, contributing on average ~24% of the total IR luminosity. (2) Compared to QSO and starburst SED templates, the median SED of Hot DOGs shows the highest luminosity ratio between mid-IR and submillimeter at rest frame, while it is very similar to that of QSOs at ~10-50 μm, suggesting that the heating sources of Hot DOGs should be buried AGNs. (3) Hot DOGs have high dust temperatures (T_dust ~ 72 K) and high IR luminosity of cold dust. The T_dust-L_IR relation of Hot DOGs suggests that the increase in IR luminosity for Hot DOGs is mostly due to the increase of the dust temperature, rather than dust mass. Hot DOGs have lower dust masses than submillimeter galaxies (SMGs) and QSOs within a similar redshift range. Both the high IR luminosity of cold dust and the relatively low dust mass in Hot DOGs can be expected from their relatively high dust temperatures. (4) Hot DOGs have high dust-covering factors (CFs), which deviate from the previously proposed trend of the dust CF decreasing with increasing bolometric luminosity.
Finally, we can reproduce the observed properties in Hot DOGs by employing a physical model of galaxy evolution. This result suggests that Hot DOGs may lie at or close to peaks of both star formation and black hole growth histories, and represent a transit phase during the evolutions of massive galaxies, transforming them from the dusty starburst-dominated phase to the optically bright QSO phase.
Semisupervised learning using Bayesian interpretation: application to LS-SVM.
Adankon, Mathias M; Cheriet, Mohamed; Biem, Alain
2011-04-01
Bayesian reasoning provides an ideal basis for representing and manipulating uncertain knowledge, with the result that many interesting algorithms in machine learning are based on Bayesian inference. In this paper, we use the Bayesian approach with one and two levels of inference to model the semisupervised learning problem and apply it to the successful kernel classifier support vector machine (SVM) and its variant, the least-squares SVM (LS-SVM). Taking advantage of the Bayesian interpretation of the LS-SVM, we develop a semisupervised learning algorithm for the Bayesian LS-SVM using our approach based on two levels of inference. Experimental results on both artificial and real pattern recognition problems show the utility of our method.
Tomographic Image Reconstruction Using an Interpolation Method for Tree Decay Detection
Hailin Feng; Guanghui Li; Sheng Fu; Xiping Wang
2014-01-01
Stress wave velocity has traditionally been regarded as an indicator of the extent of damage inside wood. This paper aimed to detect internal decay of urban trees by reconstructing a tomographic image of the cross section of a tree trunk. A grid model covering the cross-sectional area of a tree trunk was defined with some assumptions. Stress wave data were processed...
A theoretical framework to predict the most likely ion path in particle imaging.
Collins-Fekete, Charles-Antoine; Volz, Lennart; Portillo, Stephen K N; Beaulieu, Luc; Seco, Joao
2017-03-07
In this work, a generic rigorous Bayesian formalism is introduced to predict the most likely path of any ion crossing a medium between two detection points. The path is predicted based on a combination of the particle scattering in the material and measurements of its initial and final position, direction and energy. The path estimate's precision is compared to the Monte Carlo simulated path. Every ion from hydrogen to carbon is simulated in two scenarios, (1) where the range is fixed and (2) where the initial velocity is fixed. In the scenario where the range is kept constant, the maximal root-mean-square error between the estimated path and the Monte Carlo path drops significantly between the proton path estimate (0.50 mm) and the helium path estimate (0.18 mm), but less so up to the carbon path estimate (0.09 mm). However, this scenario is identified as the configuration that maximizes the dose while minimizing the path resolution. In the scenario where the initial velocity is fixed, the maximal root-mean-square error between the estimated path and the Monte Carlo path drops significantly between the proton path estimate (0.29 mm) and the helium path estimate (0.09 mm) but increases for heavier ions up to carbon (0.12 mm). As a result, helium is found to be the particle with the most accurate path estimate for the lowest dose, potentially leading to tomographic images of higher spatial resolution.
Bayesian Regression with Network Prior: Optimal Bayesian Filtering Perspective
Qian, Xiaoning; Dougherty, Edward R.
2017-01-01
The recently introduced intrinsically Bayesian robust filter (IBRF) provides fully optimal filtering relative to a prior distribution over an uncertainty class of joint random process models, whereas formerly the theory was limited to model-constrained Bayesian robust filters, for which optimization was limited to the filters that are optimal for models in the uncertainty class. This paper extends the IBRF theory to the situation where there are both a prior on the uncertainty class and sample data. The result is optimal Bayesian filtering (OBF), where optimality is relative to the posterior distribution derived from the prior and the data. The IBRF theories for effective characteristics and canonical expansions extend to the OBF setting. A salient focus of the present work is to demonstrate the advantages of Bayesian regression within the OBF setting over the classical Bayesian approach in the context of linear Gaussian models. PMID:28824268
An introduction to using Bayesian linear regression with clinical data.
Baldwin, Scott A; Larson, Michael J
2017-11-01
Statistical training in psychology focuses on frequentist methods. Bayesian methods are an alternative to standard frequentist methods. This article provides researchers with an introduction to fundamental ideas in Bayesian modeling. We use data from an electroencephalogram (EEG) and anxiety study to illustrate Bayesian models. Specifically, the models examine the relationship between error-related negativity (ERN), a particular event-related potential, and trait anxiety. Methodological topics covered include: how to set up a regression model in a Bayesian framework, specifying priors, examining convergence of the model, visualizing and interpreting posterior distributions, interval estimates, expected and predicted values, and model comparison tools. We also discuss situations where Bayesian methods can outperform frequentist methods as well as how to specify more complicated regression models. Finally, we conclude with recommendations about reporting guidelines for those using Bayesian methods in their own research. We provide data and R code for replicating our analyses. Copyright © 2017 Elsevier Ltd. All rights reserved.
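As a minimal sketch of the kind of Bayesian regression the article introduces, the following uses a conjugate Gaussian prior with known noise variance; the ERN/anxiety variables are synthetic stand-ins for the article's EEG data (the article itself uses R and richer models).

```python
import numpy as np

# Minimal Bayesian linear regression sketch: conjugate Gaussian prior
# beta ~ N(0, tau2*I) with known noise variance sigma2. The data below are
# simulated placeholders for an ERN-vs-trait-anxiety style analysis.
def posterior(X, y, sigma2=1.0, tau2=10.0):
    """Return the posterior mean and covariance of the coefficients."""
    p = X.shape[1]
    S = np.linalg.inv(X.T @ X / sigma2 + np.eye(p) / tau2)
    m = S @ X.T @ y / sigma2
    return m, S

rng = np.random.default_rng(42)
n = 200
anxiety = rng.normal(size=n)
X = np.column_stack([np.ones(n), anxiety])      # intercept + trait anxiety
ern = 1.0 + 0.5 * anxiety + rng.normal(scale=1.0, size=n)  # true slope 0.5

m, S = posterior(X, ern)
# Posterior mean of the slope should land near the simulated value, and
# the posterior variance gives an interval estimate directly.
assert abs(m[1] - 0.5) < 0.3
assert S[1, 1] > 0
```

A 95% credible interval for the slope is then simply `m[1] ± 1.96*sqrt(S[1,1])`, which is the "interval estimates" idea the abstract lists.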
A SAS Interface for Bayesian Analysis with WinBUGS
ERIC Educational Resources Information Center
Zhang, Zhiyong; McArdle, John J.; Wang, Lijuan; Hamagami, Fumiaki
2008-01-01
Bayesian methods are becoming very popular despite some practical difficulties in implementation. To assist in the practical application of Bayesian methods, we show how to implement Bayesian analysis with WinBUGS as part of a standard set of SAS routines. This implementation procedure is first illustrated by fitting a multiple regression model…
BMDS: A Collection of R Functions for Bayesian Multidimensional Scaling
ERIC Educational Resources Information Center
Okada, Kensuke; Shigemasu, Kazuo
2009-01-01
Bayesian multidimensional scaling (MDS) has attracted a great deal of attention because: (1) it provides a better fit than do classical MDS and ALSCAL; (2) it provides estimation errors of the distances; and (3) the Bayesian dimension selection criterion, MDSIC, provides a direct indication of optimal dimensionality. However, Bayesian MDS is not…
A Two-Step Bayesian Approach for Propensity Score Analysis: Simulations and Case Study
ERIC Educational Resources Information Center
Kaplan, David; Chen, Jianshen
2012-01-01
A two-step Bayesian propensity score approach is introduced that incorporates prior information in the propensity score equation and outcome equation without the problems associated with simultaneous Bayesian propensity score approaches. The corresponding variance estimators are also provided. The two-step Bayesian propensity score is provided for…
Bayesian inference for psychology. Part II: Example applications with JASP.
Wagenmakers, Eric-Jan; Love, Jonathon; Marsman, Maarten; Jamil, Tahira; Ly, Alexander; Verhagen, Josine; Selker, Ravi; Gronau, Quentin F; Dropmann, Damian; Boutin, Bruno; Meerhoff, Frans; Knight, Patrick; Raj, Akash; van Kesteren, Erik-Jan; van Doorn, Johnny; Šmíra, Martin; Epskamp, Sacha; Etz, Alexander; Matzke, Dora; de Jong, Tim; van den Bergh, Don; Sarafoglou, Alexandra; Steingroever, Helen; Derks, Koen; Rouder, Jeffrey N; Morey, Richard D
2018-02-01
Bayesian hypothesis testing presents an attractive alternative to p value hypothesis testing. Part I of this series outlined several advantages of Bayesian hypothesis testing, including the ability to quantify evidence and the ability to monitor and update this evidence as data come in, without the need to know the intention with which the data were collected. Despite these and other practical advantages, Bayesian hypothesis tests are still reported relatively rarely. An important impediment to the widespread adoption of Bayesian tests is arguably the lack of user-friendly software for the run-of-the-mill statistical problems that confront psychologists for the analysis of almost every experiment: the t-test, ANOVA, correlation, regression, and contingency tables. In Part II of this series we introduce JASP (http://www.jasp-stats.org), an open-source, cross-platform, user-friendly graphical software package that allows users to carry out Bayesian hypothesis tests for standard statistical problems. JASP is based in part on the Bayesian analyses implemented in Morey and Rouder's BayesFactor package for R. Armed with JASP, the practical advantages of Bayesian hypothesis testing are only a mouse click away.
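JASP computes its default Bayes factors internally (via the BayesFactor package's priors). As a rough, hedged stand-in for the idea of weighing H0 against H1, the following sketch uses the well-known BIC approximation to a Bayes factor for a one-sample comparison; it is not JASP's method, only an illustration of evidence quantification.

```python
import numpy as np

# BIC approximation to a Bayes factor for H0: mu = 0 vs H1: mu free,
# BF01 ~ exp((BIC1 - BIC0) / 2). This is a crude stand-in for the default
# Bayes factors JASP reports, used here purely for illustration.
def bic_bayes_factor(x):
    n = len(x)
    rss0 = np.sum(x ** 2)                 # residuals under H0 (mean = 0)
    rss1 = np.sum((x - x.mean()) ** 2)    # residuals under H1 (mean free)
    bic0 = n * np.log(rss0 / n)           # H0 has no free mean parameter
    bic1 = n * np.log(rss1 / n) + np.log(n)
    return np.exp((bic1 - bic0) / 2.0)    # BF01: evidence for H0 over H1

rng = np.random.default_rng(1)
null_data = rng.normal(0.0, 1.0, 100)     # generated under H0
effect_data = rng.normal(0.8, 1.0, 100)   # generated with a real effect

# Data from H0 should favour H0 more strongly than data with an effect,
# and a real effect should push BF01 well below 1.
assert bic_bayes_factor(null_data) > bic_bayes_factor(effect_data)
assert bic_bayes_factor(effect_data) < 1
```

Unlike a p value, BF01 can accumulate evidence *for* the null as data come in, which is the monitoring-and-updating advantage the abstract emphasizes.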
Applying Bayesian statistics to the study of psychological trauma: A suggestion for future research.
Yalch, Matthew M
2016-03-01
Several contemporary researchers have noted the virtues of Bayesian methods of data analysis. Although debates continue about whether conventional or Bayesian statistics is the "better" approach for researchers in general, there are reasons why Bayesian methods may be well suited to the study of psychological trauma in particular. This article describes how Bayesian statistics offers practical solutions to the problems of data non-normality, small sample size, and missing data common in research on psychological trauma. After a discussion of these problems and the effects they have on trauma research, this article explains the basic philosophical and statistical foundations of Bayesian statistics and how it provides solutions to these problems using an applied example. Results of the literature review and the accompanying example indicate the utility of Bayesian statistics in addressing problems common in trauma research. Bayesian statistics provides a set of methodological tools and a broader philosophical framework that is useful for trauma researchers. Methodological resources are also provided so that interested readers can learn more. © 2016 APA, all rights reserved.
Bayesian analyses of time-interval data for environmental radiation monitoring.
Luo, Peng; Sharp, Julia L; DeVol, Timothy A
2013-01-01
Time-interval (time difference between two consecutive pulses) analysis based on the principles of Bayesian inference was investigated for online radiation monitoring. Using experimental and simulated data, Bayesian analysis of time-interval data [Bayesian (ti)] was compared with Bayesian and a conventional frequentist analysis of counts in a fixed count time [Bayesian (cnt) and single interval test (SIT), respectively]. The performances of the three methods were compared in terms of average run length (ARL) and detection probability for several simulated detection scenarios. Experimental data were acquired with a DGF-4C system in list mode. Simulated data were obtained using Monte Carlo techniques to obtain a random sampling of the Poisson distribution. All statistical algorithms were developed using the R Project for statistical computing. Bayesian analysis of time-interval information provided a detection probability similar to that of Bayesian analysis of count information, but the authors were able to make a decision with fewer pulses at relatively higher radiation levels. In addition, for cases with a very short presence of the source (< count time), time-interval information is more sensitive for detecting a change than count information, since the source contribution is averaged with the background over the entire count time. The relationships of the source time, change points, and modifications to the Bayesian approach for increasing detection probability are presented.
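The core idea behind Bayesian time-interval analysis can be sketched as follows: each inter-pulse interval is exponentially distributed with the current count rate, so the posterior probability that a source is present can be updated pulse by pulse. The rates, priors, and sample sizes below are illustrative placeholders, not the paper's experimental values.

```python
import numpy as np

# Pulse-by-pulse Bayesian update: compare the exponential likelihood of each
# inter-pulse interval under "background only" (rate_bg) versus "background
# plus source" (rate_bg + rate_src). Values are invented for illustration.
def update_source_probability(intervals, rate_bg, rate_src, p_prior=0.5):
    p = p_prior
    for t in intervals:
        like_bg = rate_bg * np.exp(-rate_bg * t)
        like_sb = (rate_bg + rate_src) * np.exp(-(rate_bg + rate_src) * t)
        p = p * like_sb / (p * like_sb + (1 - p) * like_bg)
    return p

rng = np.random.default_rng(7)
rate_bg, rate_src = 1.0, 5.0
bg_only = rng.exponential(1.0 / rate_bg, 50)             # background pulses
with_src = rng.exponential(1.0 / (rate_bg + rate_src), 50)  # source present

# Short intervals (high rate) drive the source probability up; long
# intervals drive it down.
assert update_source_probability(with_src, rate_bg, rate_src) > 0.99
assert update_source_probability(bg_only, rate_bg, rate_src) < 0.01
```

Because the decision updates after every pulse rather than after a fixed count time, an alarm can trigger with few pulses at high rates, matching the abstract's observation.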
Embedding the results of focussed Bayesian fusion into a global context
NASA Astrophysics Data System (ADS)
Sander, Jennifer; Heizmann, Michael
2014-05-01
Bayesian statistics offers a well-founded and powerful fusion methodology also for the fusion of heterogeneous information sources. However, except in special cases, the needed posterior distribution is not analytically derivable. As a consequence, Bayesian fusion may cause unacceptably high computational and storage costs in practice. Local Bayesian fusion approaches aim at reducing the complexity of the Bayesian fusion methodology significantly. This is done by concentrating the actual Bayesian fusion on the potentially most task-relevant parts of the domain of the Properties of Interest. Our research on these approaches is motivated by an analogy to criminal investigations, where criminalists also pursue clues only locally. This publication follows previous publications on a special local Bayesian fusion technique called focussed Bayesian fusion, in which the actual calculation of the posterior distribution is completely restricted to a suitably chosen local context. As a result, the global posterior distribution is not completely determined, and strategies for appropriately using the results of a focussed Bayesian analysis are needed. In this publication, we primarily contrast different ways of embedding the results of focussed Bayesian fusion explicitly into a global context. To obtain a unique global posterior distribution, we analyze the application of the Maximum Entropy Principle, which has been shown to be successfully applicable in metrology and in several other areas. To address the special need for making further decisions subsequent to the actual fusion task, we further analyze criteria for decision making under partial information.
Bayesian flood forecasting methods: A review
NASA Astrophysics Data System (ADS)
Han, Shasha; Coulibaly, Paulin
2017-08-01
Over the past few decades, floods have been among the most common and most widely distributed natural disasters in the world. If floods could be accurately forecasted in advance, their negative impacts could be greatly minimized. It is widely recognized that quantification and reduction of the uncertainty associated with hydrologic forecasts is of great importance for flood estimation and rational decision making. The Bayesian forecasting system (BFS) offers an ideal theoretical framework for uncertainty quantification that can be developed for probabilistic flood forecasting via any deterministic hydrologic model. It provides a suitable theoretical structure, empirically validated models and reasonable analytic-numerical computation methods, and can be developed into various Bayesian forecasting approaches. This paper presents a comprehensive review of Bayesian forecasting approaches applied to flood forecasting from 1999 to the present. The review starts with an overview of the fundamentals of BFS and recent advances in BFS, followed by BFS applications in river stage forecasting and real-time flood forecasting; it then moves to a critical analysis evaluating the advantages and limitations of Bayesian forecasting methods and other predictive uncertainty assessment approaches in flood forecasting, and finally discusses future research directions in Bayesian flood forecasting. Results show that the Bayesian flood forecasting approach is an effective and advanced way to perform flood estimation: it considers all sources of uncertainty and produces a predictive distribution of the river stage, river discharge or runoff, and thus gives more accurate and reliable flood forecasts. Some emerging Bayesian forecasting methods (e.g. the ensemble Bayesian forecasting system and Bayesian multi-model combination) were shown to overcome the limitations of a single model or fixed model weights and to effectively reduce predictive uncertainty.
In recent years, various Bayesian flood forecasting approaches have been developed and widely applied, but there is still room for improvement. Future research in the context of Bayesian flood forecasting should focus on the assimilation of newly available sources of information and on the improvement of predictive performance assessment methods.
Enhanced Combined Tomography and Biomechanics Data for Distinguishing Forme Fruste Keratoconus.
Luz, Allan; Lopes, Bernardo; Hallahan, Katie M; Valbon, Bruno; Ramos, Isaac; Faria-Correia, Fernando; Schor, Paulo; Dupps, William J; Ambrósio, Renato
2016-07-01
To evaluate the performance of the Ocular Response Analyzer (ORA) (Reichert Ophthalmic Instruments, Depew, NY) variables and Pentacam HR (Oculus Optikgeräte GmbH, Wetzlar, Germany) tomographic parameters in differentiating forme fruste keratoconus (FFKC) from normal corneas, and to assess a combined biomechanical and tomographic parameter to improve outcomes. Seventy-six eyes of 76 normal patients and 21 eyes of 21 patients with FFKC were included in the study. Fifteen variables were derived from exported ORA signals to characterize putative indicators of biomechanical behavior, and 37 ORA waveform parameters were tested. Sixteen tomographic parameters from the Pentacam HR were tested. Logistic regression was used to produce a combined biomechanical and tomographic linear model. Differences between groups were assessed by the Mann-Whitney U test. The area under the receiver operating characteristic curve (AUROC) was used to compare diagnostic performance. No statistically significant differences were found in age, thinnest point, central corneal thickness, and maximum keratometry between groups. Twenty-one parameters showed significant differences between the FFKC and control groups. Among the ORA waveform measurements, the best parameters were those related to the area under the first peak, p1area1 (AUROC, 0.717 ± 0.065). Among the investigator-derived ORA variables, a measure incorporating the pressure-deformation relationship of the entire response cycle was the best predictor (hysteresis loop area; AUROC, 0.688 ± 0.068). Among tomographic parameters, the Belin/Ambrósio display showed the highest predictive value (AUROC, 0.91 ± 0.057). A combination of parameters showed the best result (AUROC, 0.953 ± 0.024), outperforming individual parameters. Tomographic and biomechanical parameters demonstrated the ability to differentiate FFKC from normal eyes. A combination of both types of information further improved predictive value. [J Refract Surg. 2016;32(7):479-485.].
Copyright 2016, SLACK Incorporated.
Sporns, Peter B; Schwake, Michael; Schmidt, Rene; Kemmling, André; Minnerup, Jens; Schwindt, Wolfram; Cnyrim, Christian; Zoubi, Tarek; Heindel, Walter; Niederstadt, Thomas; Hanning, Uta
2017-01-01
Significant early hematoma growth in patients with intracerebral hemorrhage is an independent predictor of poor functional outcome. Recently, the novel blend sign (BS) has been introduced as a new imaging sign for predicting hematoma growth in noncontrast computed tomography. Another parameter predicting increasing hematoma size is the well-established spot sign (SS) visible in computed tomographic angiography. We, therefore, aimed to clarify the association between established SS and novel BS and their values predicting a secondary neurological deterioration. Retrospective study inclusion criteria were (1) spontaneous intracerebral hemorrhage confirmed on noncontrast computed tomography and (2) noncontrast computed tomography and computed tomographic angiography performed on admission within 6 hours after onset of symptoms. We defined a binary outcome (secondary neurological deterioration versus no secondary deterioration). As secondary neurological deterioration, we defined (1) early hemicraniectomy under standardized criteria or (2) secondary decrease of Glasgow Coma Scale of >3 points, both within the first 48 hours after symptom onset. Of 182 patients with spontaneous intracerebral hemorrhage, 37 (20.3%) presented with BS and 39 (21.4%) with SS. Of the 81 patients with secondary deterioration, 31 (38.3%) had BS and SS on admission. Multivariable logistic regression analysis identified hematoma volume (odds ratio, 1.07 per mL; P≤0.001), intraventricular hemorrhage (odds ratio, 3.08; P=0.008), and the presence of BS (odds ratio, 11.47; P≤0.001) as independent predictors of neurological deterioration. The BS, which is obtainable in noncontrast computed tomography, shows a high correlation with the computed tomographic angiography SS and is a reliable predictor of secondary neurological deterioration after spontaneous intracerebral hemorrhage. © 2016 American Heart Association, Inc.
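The multivariable logistic-regression step this abstract reports can be sketched on synthetic data: model deterioration from hematoma volume, intraventricular hemorrhage (IVH), and blend sign, then read odds ratios off the exponentiated coefficients. All effect sizes below are invented for illustration and are not the study's estimates.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Simulate a cohort with three predictors of deterioration. The intercept
# and coefficients in `logit` are arbitrary illustrative choices.
rng = np.random.default_rng(3)
n = 2000
volume = rng.uniform(5, 60, n)                  # hematoma volume in mL
ivh = rng.integers(0, 2, n).astype(float)       # intraventricular hemorrhage
blend = rng.integers(0, 2, n).astype(float)     # blend sign present
logit = -4.0 + 0.07 * volume + 1.1 * ivh + 2.4 * blend
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

X = np.column_stack([volume, ivh, blend])
model = LogisticRegression(C=1e6, max_iter=2000).fit(X, y)  # ~unpenalised
odds_ratios = np.exp(model.coef_[0])            # OR per mL, per IVH, per BS

# With these simulated effects, the blend sign comes out as the strongest
# independent predictor, qualitatively like the study's finding.
assert odds_ratios[2] > odds_ratios[1] > odds_ratios[0]
```

Note the per-mL odds ratio for a continuous predictor is naturally close to 1 even when the predictor matters, which is why the study reports it "per mL".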
NASA Astrophysics Data System (ADS)
Moeck, Jonas P.; Bourgouin, Jean-François; Durox, Daniel; Schuller, Thierry; Candel, Sébastien
2013-04-01
Swirl flows with vortex breakdown are widely used in industrial combustion systems for flame stabilization. This type of flow is known to sustain a hydrodynamic instability with a rotating helical structure, one common manifestation of it being the precessing vortex core. The role of this unsteady flow mode in combustion is not well understood, and its interaction with combustion instabilities and flame stabilization remains unclear. It is therefore important to assess the structure of the perturbation in the flame that is induced by this helical mode. Based on principles of tomographic reconstruction, a method is presented to determine the 3-D distribution of the heat release rate perturbation associated with the helical mode. Since this flow instability is rotating, a phase-resolved sequence of projection images of light emitted from the flame is identical to the Radon transform of the light intensity distribution in the combustor volume and thus can be used for tomographic reconstruction. This is achieved with one stationary camera only, a vast reduction in experimental and hardware requirements compared to a multi-camera setup or camera repositioning, which is typically required for tomographic reconstruction. Different approaches to extract the coherent part of the oscillation from the images are discussed. Two novel tomographic reconstruction algorithms specifically tailored to the structure of the heat release rate perturbations related to the helical mode are derived. The reconstruction techniques are first applied to an artificial field to illustrate the accuracy. High-speed imaging data acquired in a turbulent swirl-stabilized combustor setup with strong helical mode oscillations are then used to reconstruct the 3-D structure of the associated perturbation in the flame.
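The paper's central observation, that phase-resolved views of a rotating structure form a sinogram, can be illustrated with a toy reconstruction. This sketch uses a plain unfiltered backprojection on an invented 2-D emission field, not the paper's tailored algorithms; the geometry and blob are arbitrary.

```python
import numpy as np
from scipy.ndimage import rotate

# Toy emission field with an off-centre structure, standing in for the
# helical heat-release perturbation. All values are illustrative.
img = np.zeros((64, 64))
img[20:30, 35:50] = 1.0

def project(image, angle_deg):
    """One camera view at a given rotation phase of the structure:
    rotate the field, then integrate along the line of sight."""
    return rotate(image, angle_deg, reshape=False, order=1).sum(axis=0)

# Phase-resolved views of the rotating structure = multi-angle sinogram.
angles = np.arange(0.0, 180.0, 4.0)
sinogram = np.array([project(img, a) for a in angles])

# Unfiltered backprojection: smear each projection back along its angle.
recon = np.zeros_like(img)
for proj, a in zip(sinogram, angles):
    recon += rotate(np.tile(proj, (64, 1)), -a, reshape=False, order=1)
recon /= len(angles)

r, c = np.unravel_index(np.argmax(recon), recon.shape)
# The reconstruction should peak at (or very near) the emitting blob.
assert 18 <= r <= 32 and 33 <= c <= 52
```

A single "camera" direction suffices here precisely because the object rotates past it, which is the hardware saving the abstract highlights.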
Li, Qiao; Gao, Xinyi; Yao, Zhenwei; Feng, Xiaoyuan; He, Huijin; Xue, Jing; Gao, Peiyi; Yang, Lumeng; Cheng, Xin; Chen, Weijian; Yang, Yunjun
2017-09-01
Permeability surface (PS) on computed tomographic perfusion reflects blood-brain barrier permeability and is related to hemorrhagic transformation (HT). HT of deep middle cerebral artery (MCA) territory can occur after recanalization of proximal large-vessel occlusion. We aimed to determine the relationship between HT and PS of deep MCA territory. We retrospectively reviewed 70 consecutive acute ischemic stroke patients presenting with occlusion of the distal internal carotid artery or M1 segment of the MCA. All patients underwent computed tomographic perfusion within 6 hours after symptom onset. Computed tomographic perfusion data were postprocessed to generate maps of different perfusion parameters. Risk factors were identified for increased deep MCA territory PS. Receiver operating characteristic curve analysis was performed to calculate the optimal PS threshold to predict HT of deep MCA territory. Increased PS was associated with HT of deep MCA territory. After adjustments for age, sex, onset time to computed tomographic perfusion, and baseline National Institutes of Health Stroke Scale, poor collateral status (odds ratio, 7.8; 95% confidence interval, 1.67-37.14; P=0.009) and proximal MCA-M1 occlusion (odds ratio, 4.12; 95% confidence interval, 1.03-16.52; P=0.045) were independently associated with increased deep MCA territory PS. Relative PS most accurately predicted HT of deep MCA territory (area under the curve, 0.94; optimal threshold, 2.89). Increased PS can predict HT of deep MCA territory after recanalization therapy for cerebral proximal large-vessel occlusion. Proximal MCA-M1 complete occlusion and distal internal carotid artery occlusion in conjunction with poor collaterals elevate deep MCA territory PS. © 2017 American Heart Association, Inc.
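The ROC-threshold step this abstract describes can be sketched as follows: find the cut on a perfusion-style parameter that best separates HT from non-HT cases via Youden's J statistic. The synthetic "relative PS" distributions below are invented for illustration and are unrelated to the study's measurements.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# Simulated relative-PS values for two groups; means/SDs are placeholders.
rng = np.random.default_rng(5)
ps_no_ht = rng.normal(1.5, 0.6, 120)        # patients without HT
ps_ht = rng.normal(3.2, 0.8, 40)            # patients with HT
scores = np.concatenate([ps_no_ht, ps_ht])
labels = np.concatenate([np.zeros(120), np.ones(40)])

fpr, tpr, thresholds = roc_curve(labels, scores)
best = np.argmax(tpr - fpr)                 # Youden's J = sens + spec - 1
optimal_threshold = thresholds[best]
auc = roc_auc_score(labels, scores)

# Well-separated groups yield a high AUC and an optimal cut lying between
# the two group means.
assert auc > 0.85
assert 1.5 < optimal_threshold < 3.2
```

Youden's J is one common criterion for an "optimal" ROC threshold; whether the study used this or another criterion is not stated in the abstract.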
NASA Astrophysics Data System (ADS)
Sun, G.; Moncelsi, L.; Viero, M. P.; Silva, M. B.; Bock, J.; Bradford, C. M.; Chang, T.-C.; Cheng, Y.-T.; Cooray, A. R.; Crites, A.; Hailey-Dunsheath, S.; Uzgil, B.; Hunacek, J. R.; Zemcov, M.
2018-04-01
Intensity mapping provides a unique means to probe the epoch of reionization (EoR), when the neutral intergalactic medium was ionized by energetic photons emitted from the first galaxies. The [C II] 158 μm fine-structure line is typically one of the brightest emission lines of star-forming galaxies and thus a promising tracer of the global EoR star formation activity. However, [C II] intensity maps at 6 ≲ z ≲ 8 are contaminated by interloping CO rotational line emission (3 ≤ J_upp ≤ 6) from lower-redshift galaxies. Here we present a strategy to remove the foreground contamination in upcoming [C II] intensity mapping experiments, guided by a model of CO emission from foreground galaxies. The model is based on empirical measurements of the mean and scatter of the total infrared luminosities of galaxies at z < 3 and with stellar masses M_* > 10^8 M_⊙ selected in the K-band from the COSMOS/UltraVISTA survey, which can be converted to CO line strengths. For a mock field of the Tomographic Ionized-carbon Mapping Experiment, we find that masking out the “voxels” (spectral-spatial elements) containing foreground galaxies identified using an optimized CO flux threshold results in a z-dependent criterion m_K(AB) ≲ 22 (or M_* ≳ 10^9 M_⊙) at z < 1 and makes a [C II]/CO_tot power ratio of ≳10 at k = 0.1 h/Mpc achievable, at the cost of a moderate ≲8% loss of total survey volume.
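The voxel-masking strategy can be illustrated with a toy cube: discard the spectral-spatial voxels whose foreground brightness exceeds a flux threshold, and check what happens to the signal-to-foreground power and the surviving survey volume. All intensities, distributions, and the threshold choice below are arbitrary illustrative stand-ins for the paper's modeled CO emission.

```python
import numpy as np

# Toy intensity cube: a smooth target signal plus a bright-tailed
# foreground, both drawn from placeholder lognormal distributions.
rng = np.random.default_rng(11)
shape = (32, 32, 64)                          # (x, y, frequency) voxels
cii = rng.lognormal(0.0, 0.3, shape)          # target [C II]-like signal
co = rng.lognormal(0.0, 1.5, shape) * 0.5     # bright-tailed CO-like foreground

threshold = np.quantile(co, 0.95)             # mask the brightest 5% of voxels
mask = co < threshold

ratio_before = cii.var() / co.var()           # crude signal/foreground power
ratio_after = cii[mask].var() / co[mask].var()
lost_volume = 1.0 - mask.mean()

# Masking the bright foreground tail boosts the signal-to-foreground power
# ratio at a modest cost in survey volume.
assert ratio_after > ratio_before
assert lost_volume < 0.08
```

The effectiveness of this trick rests on the foreground's bright tail: a small fraction of voxels carries a large fraction of the contaminating variance, just as a few bright CO emitters dominate the interloper power.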
Less than 2 °C warming by 2100 unlikely
NASA Astrophysics Data System (ADS)
Raftery, Adrian E.; Zimmer, Alec; Frierson, Dargan M. W.; Startz, Richard; Liu, Peiran
2017-09-01
The recently published Intergovernmental Panel on Climate Change (IPCC) projections to 2100 give likely ranges of global temperature increase in four scenarios for population, economic growth and carbon use. However, these projections are not based on a fully statistical approach. Here we use a country-specific version of Kaya's identity to develop a statistically based probabilistic forecast of CO2 emissions and temperature change to 2100. Using data for 1960-2010, including the UN's probabilistic population projections for all countries, we develop a joint Bayesian hierarchical model for Gross Domestic Product (GDP) per capita and carbon intensity. We find that the 90% interval for cumulative CO2 emissions includes the IPCC's two middle scenarios but not the extreme ones. The likely range of global temperature increase is 2.0-4.9 °C, with median 3.2 °C and a 5% (1%) chance that it will be less than 2 °C (1.5 °C). Population growth is not a major contributing factor. Our model is not a `business as usual' scenario, but rather is based on data which already show the effect of emission mitigation policies. Achieving the goal of less than 1.5 °C warming will require carbon intensity to decline much faster than in the recent past.
Particle filtering based structural assessment with acoustic emission sensing
NASA Astrophysics Data System (ADS)
Yan, Wuzhao; Abdelrahman, Marwa; Zhang, Bin; Ziehl, Paul
2017-02-01
Nuclear structures are designed to withstand severe loading events under various stresses. Over time, aging of structural systems constructed with concrete and steel will occur. This deterioration may reduce service life of nuclear facilities and/or lead to unnecessary or untimely repairs. Therefore, online monitoring of structures in nuclear power plants and waste storage has drawn significant attention in recent years. Of the many existing non-destructive evaluation and structural monitoring approaches, acoustic emission is promising for assessment of structural damage because it is non-intrusive and is sensitive to corrosion and crack growth in reinforced concrete elements. To provide a rapid, actionable, and graphical means for interpretation, Intensity Analysis plots have been developed. This approach provides a means for classification of damage. Since the acoustic emission measurement is only an indirect indicator of structural damage, potentially corrupted by non-genuine data, it is more suitable to estimate the states of corrosion and cracking in a Bayesian estimation framework. In this paper, we will utilize the accelerated corrosion data from a specimen at the University of South Carolina to develop a particle filtering-based diagnosis and prognosis algorithm. Promising features of the proposed algorithm are described in terms of corrosion state estimation and prediction of degradation over time to a predefined threshold.
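A minimal bootstrap particle filter of the kind the abstract describes might look like the sketch below. The linear growth model, the noise levels, and the reduction of acoustic-emission features to a single scalar measurement are all simplifying assumptions for illustration, not the authors' actual damage model:

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_pf(measurements, n=1000, growth=0.1, q=0.05, r=0.2):
    """Bootstrap particle filter: a hidden damage state grows by `growth`
    per step with process noise std `q`; each (AE-derived) measurement
    observes the state with Gaussian noise std `r`."""
    particles = rng.normal(0.0, 0.1, n)          # initial belief about damage
    estimates = []
    for z in measurements:
        particles = particles + growth + rng.normal(0.0, q, n)   # predict
        w = np.exp(-0.5 * ((z - particles) / r) ** 2)            # likelihood weights
        w /= w.sum()
        particles = particles[rng.choice(n, n, p=w)]             # resample
        estimates.append(float(particles.mean()))                # state estimate
    return estimates
```

Prognosis to a predefined threshold then amounts to propagating the resampled particles forward through the growth model without measurement updates and recording when each crosses the threshold.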
Metric on the space of quantum states from relative entropy. Tomographic reconstruction
NASA Astrophysics Data System (ADS)
Man'ko, Vladimir I.; Marmo, Giuseppe; Ventriglia, Franco; Vitale, Patrizia
2017-08-01
In the framework of quantum information geometry, we derive, from quantum relative Tsallis entropy, a family of quantum metrics on the space of full-rank, N-level quantum states, by means of a suitably defined coordinate-free differential calculus. The cases N=2, N=3 are discussed in detail and notable limits are analyzed. The radial limit procedure has been used to recover quantum metrics for lower-rank states, such as pure states. By using the tomographic picture of quantum mechanics we have obtained the Fisher-Rao metric for the space of quantum tomograms and derived a reconstruction formula of the quantum metric of density states out of the tomographic one. A new inequality obtained for probabilities of three spin-1/2 projections in three perpendicular directions is proposed to be checked in experiments with superconducting circuits.
The Psychology of Bayesian Reasoning
2014-10-21
The psychology of Bayesian reasoning David R. Mandel* Socio-Cognitive Systems Section, Defence Research and Development Canada and Department...belief revision, subjective probability, human judgment, psychological methods. Most psychological research on Bayesian reasoning since the 1970s has...attention to some important problems with the conventional approach to studying Bayesian reasoning in psychology that has been dominant since the
Bayesian Just-So Stories in Psychology and Neuroscience
ERIC Educational Resources Information Center
Bowers, Jeffrey S.; Davis, Colin J.
2012-01-01
According to Bayesian theories in psychology and neuroscience, minds and brains are (near) optimal in solving a wide range of tasks. We challenge this view and argue that more traditional, non-Bayesian approaches are more promising. We make 3 main arguments. First, we show that the empirical evidence for Bayesian theories in psychology is weak.…
Teaching Bayesian Statistics in a Health Research Methodology Program
ERIC Educational Resources Information Center
Pullenayegum, Eleanor M.; Thabane, Lehana
2009-01-01
Despite the appeal of Bayesian methods in health research, they are not widely used. This is partly due to a lack of courses in Bayesian methods at an appropriate level for non-statisticians in health research. Teaching such a course can be challenging because most statisticians have been taught Bayesian methods using a mathematical approach, and…
Bayesian inference based on dual generalized order statistics from the exponentiated Weibull model
NASA Astrophysics Data System (ADS)
Al Sobhi, Mashail M.
2015-02-01
Bayesian estimation for the two parameters and the reliability function of the exponentiated Weibull model are obtained based on dual generalized order statistics (DGOS). Also, Bayesian prediction bounds for future DGOS from exponentiated Weibull model are obtained. The symmetric and asymmetric loss functions are considered for Bayesian computations. The Markov chain Monte Carlo (MCMC) methods are used for computing the Bayes estimates and prediction bounds. The results have been specialized to the lower record values. Comparisons are made between Bayesian and maximum likelihood estimators via Monte Carlo simulation.
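The MCMC machinery described above (Bayes estimates under symmetric and asymmetric loss) can be illustrated with a deliberately simpler model than the paper's exponentiated Weibull/DGOS setup: a random-walk Metropolis sampler for an exponential rate with a flat prior. The data values are made up for the example. The posterior mean is the Bayes estimate under squared-error loss; the log-of-expected-exponential formula gives the estimate under the asymmetric LINEX loss:

```python
import math
import random

random.seed(42)

def metropolis(log_post, x0, n=20000, step=0.2):
    """1-D random-walk Metropolis sampler; returns the second half of the chain."""
    x, lp, samples = x0, log_post(x0), []
    for _ in range(n):
        y = x + random.gauss(0.0, step)
        lpy = log_post(y)
        if math.log(random.random()) < lpy - lp:   # accept with prob min(1, ratio)
            x, lp = y, lpy
        samples.append(x)
    return samples[n // 2:]                        # discard burn-in

# Hypothetical lifetime data; flat prior on rate lam > 0 gives
# log-posterior ∝ n*log(lam) - lam*sum(data).
data = [0.8, 1.1, 0.4, 2.0, 1.3]
def log_post(lam):
    return -math.inf if lam <= 0 else len(data) * math.log(lam) - lam * sum(data)

samples = metropolis(log_post, 1.0)
sel_estimate = sum(samples) / len(samples)                      # squared-error loss
a = 1.0                                                          # LINEX shape parameter
linex_estimate = -math.log(
    sum(math.exp(-a * s) for s in samples) / len(samples)) / a   # asymmetric loss
```

By Jensen's inequality the LINEX estimate with a > 0 is always below the posterior mean, which is the sense in which the asymmetric loss penalizes overestimation.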
Kruschke, John K; Liddell, Torrin M
2018-02-01
In the practice of data analysis, there is a conceptual distinction between hypothesis testing, on the one hand, and estimation with quantified uncertainty on the other. Among frequentists in psychology, a shift of emphasis from hypothesis testing to estimation has been dubbed "the New Statistics" (Cumming 2014). A second conceptual distinction is between frequentist methods and Bayesian methods. Our main goal in this article is to explain how Bayesian methods achieve the goals of the New Statistics better than frequentist methods. The article reviews frequentist and Bayesian approaches to hypothesis testing and to estimation with confidence or credible intervals. The article also describes Bayesian approaches to meta-analysis, randomized controlled trials, and power analysis.
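The credible intervals contrasted with confidence intervals above are computed directly from posterior draws; a minimal equal-tailed sketch (function name and interface are illustrative):

```python
def credible_interval(samples, mass=0.95):
    """Equal-tailed Bayesian credible interval from posterior samples:
    the central `mass` fraction of the sorted draws."""
    s = sorted(samples)
    n = len(s)
    lo = s[int((1.0 - mass) / 2.0 * n)]        # 2.5th percentile for mass=0.95
    hi = s[int((1.0 + mass) / 2.0 * n) - 1]    # 97.5th percentile
    return lo, hi
```

Unlike a frequentist confidence interval, this interval admits the direct reading "the parameter lies here with 95% posterior probability", which is the interpretive advantage the article discusses.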
BATSE gamma-ray burst line search. 2: Bayesian consistency methodology
NASA Technical Reports Server (NTRS)
Band, D. L.; Ford, L. A.; Matteson, J. L.; Briggs, M.; Paciesas, W.; Pendleton, G.; Preece, R.; Palmer, D.; Teegarden, B.; Schaefer, B.
1994-01-01
We describe a Bayesian methodology to evaluate the consistency between the reported Ginga and Burst and Transient Source Experiment (BATSE) detections of absorption features in gamma-ray burst spectra. Currently no features have been detected by BATSE, but this methodology will still be applicable if and when such features are discovered. The Bayesian methodology permits the comparison of hypotheses regarding the two detectors' observations and makes explicit the subjective aspects of our analysis (e.g., the quantification of our confidence in detector performance). We also present non-Bayesian consistency statistics. Based on preliminary calculations of line detectability, we find that both the Bayesian and non-Bayesian techniques show that the BATSE and Ginga observations are consistent given our understanding of these detectors.
Application of Bayesian Approach in Cancer Clinical Trial
Bhattacharjee, Atanu
2014-01-01
The Bayesian approach to clinical trials is increasingly useful compared with classical methods, offering benefits from the design phase through to analysis. Bayesian analysis allows straightforward statements to be made about the drug treatment effect, and complex computational problems are simpler to handle with Bayesian techniques. However, the technique is feasible only in the presence of prior information about the data, and inference is established through posterior estimates. Some limitations are also present in this method. The objective of this work was to explore the several merits and demerits of the Bayesian approach in cancer research. This review of the technique will be helpful for clinical researchers in oncology wishing to understand the limitations and power of Bayesian techniques. PMID:29147387
Tomographic Validation of the AWSoM Model of the Inner Corona During Solar Minima
NASA Astrophysics Data System (ADS)
Manchester, W.; Vásquez, A. M.; Lloveras, D. G.; Mac Cormack, C.; Nuevo, F.; Lopez-Fuentes, M.; Frazin, R. A.; van der Holst, B.; Landi, E.; Gombosi, T. I.
2017-12-01
Continuous improvement of MHD three-dimensional (3D) models of the global solar corona, such as the Alfven Wave Solar Model (AWSoM) of the Space Weather Modeling Framework (SWMF), requires testing their ability to reproduce observational constraints at a global scale. To that end, solar rotational tomography based on EUV image time-series can be used to reconstruct the 3D distribution of the electron density and temperature in the inner solar corona (r < 1.25 Rsun). The tomographic results, combined with a global coronal magnetic model, can further provide constraints on the energy input flux required at the coronal base to maintain stable structures. In this work, tomographic reconstructions are used to validate steady-state 3D MHD simulations of the inner corona using the latest version of the AWSoM model. We perform the study for selected rotations representative of solar minimum conditions, when the global structure of the corona is more axisymmetric. We analyse in particular the ability of the MHD simulation to match the tomographic results across the boundary region between the equatorial streamer belt and the surrounding coronal holes. The region is of particular interest as the plasma flow from that zone is thought to be related to the origin of the slow component of the solar wind.
Isaacson, Brandon; Kutz, Joe Walter; Mendelsohn, Dianne; Roland, Peter S
2009-04-01
To demonstrate the use of computed tomographic (CT) venography in selecting a surgical approach for cholesterol granulomas. Retrospective case review. Tertiary referral center. Three patients presented with symptomatic petrous apex cholesterol granulomas with extensive bone erosion involving the jugular fossa. Computed tomographic venography was performed on each patient before selecting a surgical approach for drainage. Localization of the jugular bulb in relation to the petrous carotid artery and basal turn of the cochlea was ascertained in each subject. Three patients with large symptomatic cholesterol granulomas were identified. Conventional CT demonstrated extensive bone erosion involving the jugular fossa in each patient. The location of the jugular bulb and its proximity to the petrous carotid artery and basal turn of the cochlea could not be determined with conventional temporal bone CT and magnetic resonance imaging. Computed tomographic venography provided the exact location of the jugular bulb in all 3 patients. The favorable position of the jugular bulb in all 3 cases permitted drainage of these lesions using an infracochlear approach. Computed tomographic venography provided invaluable information in 3 patients with large symptomatic cholesterol granulomas. All 3 patients were previously thought to be unsuitable candidates for an infracochlear or infralabyrinthine approach because of the unknown location of the jugular bulb.
Sodankylä ionospheric tomography data set 2003-2014
NASA Astrophysics Data System (ADS)
Norberg, Johannes; Roininen, Lassi; Kero, Antti; Raita, Tero; Ulich, Thomas; Markkanen, Markku; Juusola, Liisa; Kauristie, Kirsti
2016-07-01
Sodankylä Geophysical Observatory has been operating a receiver network for ionospheric tomography and collecting the produced data since 2003. The collected data set consists of phase difference curves measured from COSMOS navigation satellites from the Russian Parus network (Wood and Perry, 1980) and tomographic electron density reconstructions obtained from these measurements. In this study, vertical total electron content (VTEC) values are integrated from the reconstructed electron densities to make a qualitative and quantitative analysis to validate the long-term performance of the tomographic system. During the observation period, 2003-2014, there were three to five operational stations in the Fennoscandian sector. Altogether the analysis consists of around 66 000 overflights, but to ensure the quality of the reconstructions, the examination is limited to cases with descending (north to south) overflights and maximum elevation over 60°. These constraints limit the number of overflights to around 10 000. Based on this data set, one solar cycle of ionospheric VTEC estimates is constructed. The measurements are compared against the International Reference Ionosphere (IRI)-2012 model, F10.7 solar flux index and sunspot number data. Qualitatively the tomographic VTEC estimate corresponds to the reference data very well, but the IRI-2012 model results are on average 40% higher than the tomographic results.
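Integrating VTEC from a reconstructed electron-density profile is a vertical quadrature; a simple trapezoidal sketch (the 1 TECU = 10^16 electrons/m² convention is standard, but the function itself is illustrative, not the observatory's processing code):

```python
def vtec_tecu(altitudes_km, ne_per_m3):
    """Vertical total electron content in TECU, by trapezoidal integration
    of electron density [el/m^3] over altitude [km]."""
    total = 0.0
    for i in range(len(altitudes_km) - 1):
        dz_m = (altitudes_km[i + 1] - altitudes_km[i]) * 1e3   # km -> m
        total += 0.5 * (ne_per_m3[i] + ne_per_m3[i + 1]) * dz_m
    return total / 1e16                                        # el/m^2 -> TECU
```

For example, a constant density of 10^11 el/m³ over a 100 km slab integrates to exactly 1 TECU.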
Bayesian just-so stories in psychology and neuroscience.
Bowers, Jeffrey S; Davis, Colin J
2012-05-01
According to Bayesian theories in psychology and neuroscience, minds and brains are (near) optimal in solving a wide range of tasks. We challenge this view and argue that more traditional, non-Bayesian approaches are more promising. We make 3 main arguments. First, we show that the empirical evidence for Bayesian theories in psychology is weak. This weakness relates to the many arbitrary ways that priors, likelihoods, and utility functions can be altered in order to account for the data that are obtained, making the models unfalsifiable. It further relates to the fact that Bayesian theories are rarely better at predicting data compared with alternative (and simpler) non-Bayesian theories. Second, we show that the empirical evidence for Bayesian theories in neuroscience is weaker still. There are impressive mathematical analyses showing how populations of neurons could compute in a Bayesian manner but little or no evidence that they do. Third, we challenge the general scientific approach that characterizes Bayesian theorizing in cognitive science. A common premise is that theories in psychology should largely be constrained by a rational analysis of what the mind ought to do. We question this claim and argue that many of the important constraints come from biological, evolutionary, and processing (algorithmic) considerations that have no adaptive relevance to the problem per se. In our view, these factors have contributed to the development of many Bayesian "just so" stories in psychology and neuroscience; that is, mathematical analyses of cognition that can be used to explain almost any behavior as optimal. 2012 APA, all rights reserved.
Greenhouse gas mitigation can reduce sea-ice loss and increase polar bear persistence.
Amstrup, Steven C; Deweaver, Eric T; Douglas, David C; Marcot, Bruce G; Durner, George M; Bitz, Cecilia M; Bailey, David A
2010-12-16
On the basis of projected losses of their essential sea-ice habitats, a United States Geological Survey research team concluded in 2007 that two-thirds of the world's polar bears (Ursus maritimus) could disappear by mid-century if business-as-usual greenhouse gas emissions continue. That projection, however, did not consider the possible benefits of greenhouse gas mitigation. A key question is whether temperature increases lead to proportional losses of sea-ice habitat, or whether sea-ice cover crosses a tipping point and irreversibly collapses when temperature reaches a critical threshold. Such a tipping point would mean future greenhouse gas mitigation would confer no conservation benefits to polar bears. Here we show, using a general circulation model, that substantially more sea-ice habitat would be retained if greenhouse gas rise is mitigated. We also show, with Bayesian network model outcomes, that increased habitat retention under greenhouse gas mitigation means that polar bears could persist throughout the century in greater numbers and more areas than in the business-as-usual case. Our general circulation model outcomes did not reveal thresholds leading to irreversible loss of ice; instead, a linear relationship between global mean surface air temperature and sea-ice habitat substantiated the hypothesis that sea-ice thermodynamics can overcome albedo feedbacks proposed to cause sea-ice tipping points. Our outcomes indicate that rapid summer ice losses in models and observations represent increased volatility of a thinning sea-ice cover, rather than tipping-point behaviour. Mitigation-driven Bayesian network outcomes show that previously predicted declines in polar bear distribution and numbers are not unavoidable. 
Because polar bears are sentinels of the Arctic marine ecosystem and trends in their sea-ice habitats foreshadow future global changes, mitigating greenhouse gas emissions to improve polar bear status would have conservation benefits throughout and beyond the Arctic.
Bayesian models: A statistical primer for ecologists
Hobbs, N. Thompson; Hooten, Mevin B.
2015-01-01
Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This textbook provides a comprehensive and accessible introduction to the latest Bayesian methods—in language ecologists can understand. Unlike other books on the subject, this one emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach. Bayesian Models is an essential primer for non-statisticians. It begins with a definition of probability and develops a step-by-step sequence of connected ideas, including basic distribution theory, network diagrams, hierarchical models, Markov chain Monte Carlo, and inference from single and multiple models. This unique book places less emphasis on computer coding, favoring instead a concise presentation of the mathematical statistics needed to understand how and why Bayesian analysis works. It also explains how to write out properly formulated hierarchical Bayesian models and use them in computing, research papers, and proposals. This primer enables ecologists to understand the statistical principles behind Bayesian modeling and apply them to research, teaching, policy, and management. It presents the mathematical and statistical foundations of Bayesian modeling in language accessible to non-statisticians; covers basic distribution theory, network diagrams, hierarchical models, Markov chain Monte Carlo, and more; deemphasizes computer coding in favor of basic principles; and explains how to write out properly factored statistical expressions representing Bayesian models.
ERIC Educational Resources Information Center
Griffiths, Thomas L.; Chater, Nick; Norris, Dennis; Pouget, Alexandre
2012-01-01
Bowers and Davis (2012) criticize Bayesian modelers for telling "just so" stories about cognition and neuroscience. Their criticisms are weakened by not giving an accurate characterization of the motivation behind Bayesian modeling or the ways in which Bayesian models are used and by not evaluating this theoretical framework against specific…
Global and regional emissions estimates for N2O
NASA Astrophysics Data System (ADS)
Saikawa, E.; Prinn, R. G.; Dlugokencky, E.; Ishijima, K.; Dutton, G. S.; Hall, B. D.; Langenfelds, R.; Tohjima, Y.; Machida, T.; Manizza, M.; Rigby, M.; O'Doherty, S.; Patra, P. K.; Harth, C. M.; Weiss, R. F.; Krummel, P. B.; van der Schoot, M.; Fraser, P. J.; Steele, L. P.; Aoki, S.; Nakazawa, T.; Elkins, J. W.
2014-05-01
We present a comprehensive estimate of nitrous oxide (N2O) emissions using observations and models from 1995 to 2008. High-frequency records of tropospheric N2O are available from measurements at Cape Grim, Tasmania; Cape Matatula, American Samoa; Ragged Point, Barbados; Mace Head, Ireland; and at Trinidad Head, California using the Advanced Global Atmospheric Gases Experiment (AGAGE) instrumentation and calibrations. The Global Monitoring Division of the National Oceanic and Atmospheric Administration/Earth System Research Laboratory (NOAA/ESRL) has also collected discrete air samples in flasks and in situ measurements from remote sites across the globe and analyzed them for a suite of species including N2O. In addition to these major networks, we include in situ and aircraft measurements from the National Institute of Environmental Studies (NIES) and flask measurements from the Tohoku University and Commonwealth Scientific and Industrial Research Organization (CSIRO) networks. All measurements show increasing atmospheric mole fractions of N2O, with a varying growth rate of 0.1-0.7% per year, resulting in a 7.4% increase in the background atmospheric mole fraction between 1979 and 2011. Using existing emission inventories as well as bottom-up process modeling results, we first create globally gridded a priori N2O emissions over the 37 years since 1975. We then use the three-dimensional chemical transport model, Model for Ozone and Related Chemical Tracers version 4 (MOZART v4), and a Bayesian inverse method to estimate global as well as regional annual emissions for five source sectors from 13 regions in the world. This is the first time that all of these measurements from multiple networks have been combined to determine emissions. Our inversion indicates that global and regional N2O emissions have an increasing trend between 1995 and 2008. 
Despite large uncertainties, a significant increase is seen from the Asian agricultural sector in recent years, most likely due to an increase in the use of nitrogenous fertilizers, as has been suggested by previous studies.
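The Bayesian inverse step in such atmospheric inversions is, in its simplest Gaussian and linear form, the standard update below. This is a generic sketch, not the MOZART-based system of the study: H stands in for the full transport model, and all matrices here are tiny placeholders.

```python
import numpy as np

def gaussian_inversion(H, y, x_prior, P, R):
    """Posterior mean and covariance of emissions x given observations
    y = H x + e, with Gaussian prior (x_prior, P) and observation error
    covariance R: the classic Bayesian update used in flux inversions."""
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # gain matrix
    x_post = x_prior + K @ (y - H @ x_prior)       # update toward the data
    P_post = P - K @ H @ P                         # reduced uncertainty
    return x_post, P_post
```

With equal prior and observation uncertainty the posterior lands halfway between prior and data, which is a quick sanity check on any implementation.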
Oates, Lawrence G.; Duncan, David S.; Gelfand, Ilya; ...
2015-05-14
Greenhouse gas (GHG) emissions from soils are a key sustainability metric of cropping systems. During crop establishment, disruptive land-use change is known to be a critical, but under-reported, period for determining GHG emissions. We measured soil N2O emissions and potential environmental drivers of these fluxes from a three-year establishment-phase bioenergy cropping systems experiment replicated in south-central Wisconsin (ARL) and southwestern Michigan (KBS). Cropping systems treatments were annual monocultures (continuous corn, corn–soybean–canola rotation), perennial monocultures (switchgrass, miscanthus, and poplar), and perennial polycultures (native grass mixture, early successional community, and restored prairie), all grown using best management practices specific to the system. Cumulative three-year N2O emissions from annuals were 142% higher than from perennials, with fertilized perennials 190% higher than unfertilized perennials. Emissions ranged from 3.1 to 19.1 kg N2O-N ha⁻¹ yr⁻¹ for the annuals with continuous corn > corn–soybean–canola rotation and 1.1 to 6.3 kg N2O-N ha⁻¹ yr⁻¹ for perennials. Nitrous oxide peak fluxes typically were associated with precipitation events that closely followed fertilization. Bayesian modeling of N2O fluxes based on measured environmental factors explained 33% of variability across all systems. Models trained on single systems performed well in most monocultures (e.g., R² = 0.52 for poplar) but notably worse in polycultures (e.g., R² = 0.17 for early successional, R² = 0.06 for restored prairie), indicating that simulation models that include N2O emissions should be parameterized specific to particular plant communities. These results indicate that perennial bioenergy crops in their establishment phase emit less N2O than annual crops, especially when not fertilized. These findings should be considered further alongside yield and other metrics contributing to important ecosystem services.
NASA Astrophysics Data System (ADS)
Dal Ferro, Nicola; Quinn, Claire Helen; Morari, Francesco
2017-04-01
A key challenge for soil scientists is predicting agricultural management scenarios that combine crop production with high standards of environmental quality. In this context, reversing the soil organic carbon (SOC) decline in croplands is required for maintaining soil fertility and contributes to mitigating GHG emissions. Bayesian belief networks (BBN) are probabilistic models able to accommodate uncertainty and variability in predicting the impacts of management and environmental changes. By linking multiple qualitative and quantitative variables in cause-and-effect relationships, BBNs can be used as a decision support system at different spatial scales to find the best management strategies in agroecosystems. In this work we built a BBN to model SOC dynamics (0-30 cm layer) in the low-lying plain of the Veneto region, north-eastern Italy, and to define best practices leading to SOC accumulation and GHG (CO2-equivalent) emission reduction. Regional pedo-climatic, land use and management information was combined with experimental and modelled data on soil C dynamics as the natural and anthropic key drivers affecting SOC stock change. Moreover, utility nodes were introduced to determine optimal decisions for mitigating GHG emissions from croplands, also considering three different IPCC climate scenarios. The network was finally validated with real field data in terms of SOC stock change. Results showed that the BBN was able to model real SOC stock changes, although validation slightly overestimated SOC reduction (+5%) at the expense of its accumulation. At the regional level, probability distributions showed a 50% probability of SOC loss and only 17% of accumulation. However, the greatest losses (34%) were associated with low reduction rates (100-500 kg C ha⁻¹ y⁻¹), followed by 33% of stabilized conditions (-100 < SOC < 100 kg ha⁻¹ y⁻¹). 
Land use management (especially tillage operations and soil cover) played a primary role in affecting SOC stock change, while climate conditions were only slightly involved in C regulation within the 0-30 cm layer. The proposed BBN framework was flexible enough to perform both field-scale validation and regional-scale predictions. Moreover, the BBN provided guidelines for improved land management strategies under climate change scenarios, although further validation, including a broader set of experimental data, is needed to strengthen the outcomes across the Veneto region.
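Inference in a discrete BBN of this kind reduces to enumerating parent states against conditional probability tables; a toy sketch with hypothetical node names and CPT values (not the paper's actual network or probabilities):

```python
def p_soc_loss(p_conv_till, p_low_cover, cpt):
    """Marginal P(SOC loss) in a toy two-parent network
    Tillage -> SOC_loss <- Cover, by summing over parent states.
    cpt[(tillage, cover)] = P(loss | tillage, cover) -- hypothetical values."""
    total = 0.0
    for till, pt in (("conventional", p_conv_till), ("reduced", 1.0 - p_conv_till)):
        for cover, pc in (("low", p_low_cover), ("high", 1.0 - p_low_cover)):
            total += pt * pc * cpt[(till, cover)]   # law of total probability
    return total
```

Scenario analysis then amounts to shifting the parent distributions (e.g. more reduced tillage) and re-reading the marginal; utility nodes extend the same enumeration with a payoff per outcome.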
Estimate of main local sources to ambient ultrafine particle number concentrations in an urban area
NASA Astrophysics Data System (ADS)
Rahman, Md Mahmudur; Mazaheri, Mandana; Clifford, Sam; Morawska, Lidia
2017-09-01
Quantifying and apportioning the contribution of a range of sources to ultrafine particles (UFPs, D < 100 nm) is a challenge due to the complex nature of urban environments. Although vehicular emissions have long been considered one of the major sources of ultrafine particles in urban areas, the contribution of other major urban sources is not yet fully understood. This paper aims to determine and quantify the contribution of local ground traffic, nucleated particle (NP) formation and distant non-traffic (e.g. airport, oil refineries, and seaport) sources to the total ambient particle number concentration (PNC) in a busy, inner-city area in Brisbane, Australia using Bayesian statistical modelling and other exploratory tools. The Bayesian model was trained on PNC data from days when NP formation was known not to have occurred, hourly traffic counts, solar radiation data, and a smooth daily trend. The model was applied to apportion and quantify the contribution of NP formation and local traffic and non-traffic sources to UFPs. The data analysis incorporated long-term measured time-series of total PNC (D ≥ 6 nm), particle number size distributions (PSD, D = 8 to 400 nm), PM2.5, PM10, NOx, CO, meteorological parameters and traffic counts at a stationary monitoring site. The developed Bayesian model showed reliable predictive performance in quantifying the contribution of NP formation events to UFPs (up to 4 × 10⁴ particles cm⁻³), with significant day-to-day variability. The model identified potential NP formation and non-formation days based on PNC data and quantified the source contributions to UFPs. Exploratory statistical analyses show that total mean PNC during the middle of the day was up to 32% higher than during peak morning and evening traffic periods, which was associated with NP formation events. The majority of UFPs measured during the peak traffic and NP formation periods were between 30-100 nm and smaller than 30 nm, respectively. 
To date, this is the first application of a Bayesian model to apportioning the contributions of different sources to UFPs, and therefore the importance of this study lies not only in its modelling outcomes but in demonstrating the applicability and advantages of this statistical approach to air pollution studies.
Wang, Jiali; Zhang, Qingnian; Ji, Wenfeng
2014-01-01
Computing an objective Bayesian network requires a large amount of data, which is often hard to obtain in practice. This paper improves the calculation method of the Bayesian network to obtain a fuzzy-precise Bayesian network, which is then used for reasoning in the Bayesian network model when data are limited. The safety of passengers during shipping is affected by various factors and is hard to predict and control. An index system capturing the factors that affect passenger safety during shipping was established on the basis of multifield coupling theory, and the fuzzy-precise Bayesian network was applied to monitor passenger safety during the shipping process. The model was applied to monitoring passenger safety during shipping at a shipping company in Hainan, and its effectiveness was examined. This work provides guidance for safeguarding passengers during shipping.
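The abstract does not reproduce the fuzzy-precise construction in detail. As a minimal, hypothetical sketch of the general pattern — defuzzify expert-elicited triangular fuzzy probabilities to point values, then run exact inference in a small Bayesian network — consider the following (the two-node network, all numbers, and the centroid defuzzification rule are illustrative assumptions, not the paper's actual method):

```python
def defuzzify(tri):
    """Centroid of a triangular fuzzy number (low, mode, high)."""
    a, b, c = tri
    return (a + b + c) / 3.0

# Expert-elicited fuzzy conditional probabilities for a two-node network:
# Weather (bad/good) -> Incident (yes/no) during shipping.
p_bad_weather = defuzzify((0.15, 0.20, 0.25))
p_inc_given_bad = defuzzify((0.25, 0.30, 0.35))
p_inc_given_good = defuzzify((0.02, 0.05, 0.08))

# Exact inference by enumeration: marginal incident probability ...
p_inc = (p_bad_weather * p_inc_given_bad
         + (1 - p_bad_weather) * p_inc_given_good)

# ... and the diagnostic posterior P(bad weather | incident) via Bayes' rule
p_bad_given_inc = p_bad_weather * p_inc_given_bad / p_inc

print(round(p_inc, 3), round(p_bad_given_inc, 3))   # 0.1 0.6
```

The point of such a hybrid is that expert judgement (fuzzy numbers) substitutes for the conditional-probability tables that would otherwise require large amounts of incident data to estimate.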
Bayesian model reduction and empirical Bayes for group (DCM) studies
Friston, Karl J.; Litvak, Vladimir; Oswal, Ashwini; Razi, Adeel; Stephan, Klaas E.; van Wijk, Bernadette C.M.; Ziegler, Gabriel; Zeidman, Peter
2016-01-01
This technical note describes some Bayesian procedures for the analysis of group studies that use nonlinear models at the first (within-subject) level – e.g., dynamic causal models – and linear models at subsequent (between-subject) levels. Its focus is on using Bayesian model reduction to finesse the inversion of multiple models of a single dataset or a single (hierarchical or empirical Bayes) model of multiple datasets. These applications of Bayesian model reduction allow one to consider parametric random effects and make inferences about group effects very efficiently (in a few seconds). We provide the relatively straightforward theoretical background to these procedures and illustrate their application using a worked example. This example uses a simulated mismatch negativity study of schizophrenia. We illustrate the robustness of Bayesian model reduction to violations of the (commonly used) Laplace assumption in dynamic causal modelling and show how its recursive application can facilitate both classical and Bayesian inference about group differences. Finally, we consider the application of these empirical Bayesian procedures to classification and prediction. PMID:26569570
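Bayesian model reduction has a particularly simple closed form when priors and posteriors are Gaussian: the evidence for any reduced model (one with a modified prior) can be scored from the full posterior alone, without refitting. The sketch below illustrates this on a generic linear-Gaussian toy model — it is not the DCM implementation, and all names and numbers are illustrative — and checks the analytic shortcut against a brute-force evidence calculation:

```python
import numpy as np

def bmr_delta_logev(mu, P, eta, P0, eta_r, P0_r):
    """Change in log evidence when the full Gaussian prior N(eta, inv(P0)) is
    replaced by a reduced prior N(eta_r, inv(P0_r)), computed from the full
    posterior N(mu, inv(P)) alone -- no refitting of the reduced model."""
    P_r = P + P0_r - P0                                  # reduced posterior precision
    mu_r = np.linalg.solve(P_r, P @ mu + P0_r @ eta_r - P0 @ eta)
    ld = lambda A: np.linalg.slogdet(A)[1]
    return (0.5 * (ld(P) + ld(P0_r) - ld(P0) - ld(P_r))
            + 0.5 * (mu_r @ P_r @ mu_r - mu @ P @ mu
                     - eta_r @ P0_r @ eta_r + eta @ P0 @ eta))

# Linear-Gaussian toy model: y = X @ theta + noise, theta[1] truly zero
rng = np.random.default_rng(0)
n, d, sigma2 = 40, 3, 0.5
X = rng.normal(size=(n, d))
y = X @ np.array([1.0, 0.0, -0.5]) + rng.normal(scale=np.sqrt(sigma2), size=n)

eta, P0 = np.zeros(d), np.eye(d)                         # full prior N(0, I)
P = P0 + X.T @ X / sigma2                                # full posterior precision
mu = np.linalg.solve(P, P0 @ eta + X.T @ y / sigma2)     # full posterior mean

# Reduced model: clamp theta[1] near zero via a very precise prior
P0_r = P0.copy(); P0_r[1, 1] = 1e6
dF = bmr_delta_logev(mu, P, eta, P0, eta, P0_r)

# Brute-force check: marginal likelihood y ~ N(X @ eta, sigma2*I + X S0 X.T)
def logev(P0_p):
    C = sigma2 * np.eye(n) + X @ np.linalg.inv(P0_p) @ X.T
    L = np.linalg.cholesky(C)
    z = np.linalg.solve(L, y - X @ eta)
    return -0.5 * (n * np.log(2 * np.pi) + 2 * np.log(np.diag(L)).sum() + z @ z)

dF_direct = logev(P0_r) - logev(P0)
print(np.isclose(dF, dF_direct))    # the analytic shortcut matches
```

This is what makes scoring many reduced models "very efficient (in a few seconds)": each candidate costs a few small matrix operations rather than a full model inversion.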
A study of finite mixture model: Bayesian approach on financial time series data
NASA Astrophysics Data System (ADS)
Phoong, Seuk-Yen; Ismail, Mohd Tahir
2014-07-01
Recently, statisticians have emphasized fitting finite mixture models using Bayesian methods. A finite mixture model represents a statistical distribution as a mixture of component distributions, and the Bayesian method is used to fit the mixture model. The Bayesian method is widely used because its asymptotic properties yield remarkable results; it also exhibits consistency, meaning that the parameter estimates are close to the predictive distributions. In the present paper, the number of components for the mixture model is selected using the Bayesian Information Criterion. Identifying the number of components is important because an incorrect choice may lead to invalid results. The Bayesian method is then used to fit the k-component mixture model in order to explore the relationship between rubber prices and stock market prices for Malaysia, Thailand, the Philippines and Indonesia. The results show a negative relationship between rubber prices and stock market prices for all selected countries.
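As a minimal sketch of the component-selection step (with synthetic two-regime data standing in for the price series, which the abstract does not reproduce), one can fit k-component Gaussian mixtures by EM and compare BIC values across k:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic 1-D data with two regimes (illustrative, not the actual series)
x = np.concatenate([rng.normal(-2.0, 0.7, 300), rng.normal(1.5, 0.5, 200)])
n = x.size

def fit_gmm(x, k, iters=200):
    """EM for a k-component 1-D Gaussian mixture; returns the log-likelihood."""
    w = np.full(k, 1.0 / k)
    mu = np.quantile(x, (np.arange(k) + 0.5) / k)        # spread-out init
    var = np.full(k, x.var())
    for _ in range(iters):
        # E-step: responsibilities of each component for each point
        dens = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) \
                 / np.sqrt(2 * np.pi * var)
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: update weights, means, variances
        nk = r.sum(axis=0)
        w, mu = nk / x.size, (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-9
    dens = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) \
             / np.sqrt(2 * np.pi * var)
    return np.log(dens.sum(axis=1)).sum()

# BIC = -2 log L + p log n, with p = 3k - 1 free parameters (k means,
# k variances, k-1 weights)
bics = {k: -2 * fit_gmm(x, k) + (3 * k - 1) * np.log(n) for k in (1, 2, 3)}
best_k = min(bics, key=bics.get)
print(best_k)     # the two-regime structure should be recovered
```

BIC penalizes the extra parameters of larger mixtures, so the third component's marginal likelihood gain is outweighed by its 3·log(n) penalty on well-separated two-regime data.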
Philosophy and the practice of Bayesian statistics
Gelman, Andrew; Shalizi, Cosma Rohilla
2015-01-01
A substantial school in the philosophy of science identifies Bayesian inference with inductive inference and even rationality as such, and seems to be strengthened by the rise and practical success of Bayesian statistics. We argue that the most successful forms of Bayesian statistics do not actually support that particular philosophy but rather accord much better with sophisticated forms of hypothetico-deductivism. We examine the actual role played by prior distributions in Bayesian models, and the crucial aspects of model checking and model revision, which fall outside the scope of Bayesian confirmation theory. We draw on the literature on the consistency of Bayesian updating and also on our experience of applied work in social science. Clarity about these matters should benefit not just philosophy of science, but also statistical practice. At best, the inductivist view has encouraged researchers to fit and compare models without checking them; at worst, theorists have actively discouraged practitioners from performing model checking because it does not fit into their framework. PMID:22364575
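The model-checking practice the authors defend can be illustrated with a posterior predictive check in the style they advocate. The sketch below (made-up data, a conjugate normal model, and a tail statistic chosen for illustration) flags misfit when replicated data rarely reproduce an observed feature:

```python
import numpy as np

rng = np.random.default_rng(7)

# Data a normal model fits badly: mostly N(0,1) plus a heavy-tailed minority
y = np.concatenate([rng.normal(0, 1, 90), rng.normal(0, 10, 10)])
n = y.size

# Conjugate fit of y ~ Normal(mu, sigma2) with flat priors: posterior draws
# via the standard normal / scaled-inverse-chi-squared forms
ybar, s2 = y.mean(), y.var(ddof=1)
sigma2_draws = (n - 1) * s2 / rng.chisquare(n - 1, size=4000)
mu_draws = rng.normal(ybar, np.sqrt(sigma2_draws / n))

# Posterior predictive check on a tail statistic T(y) = max |y|
t_obs = np.abs(y).max()
t_rep = np.array([np.abs(rng.normal(m, np.sqrt(v), n)).max()
                  for m, v in zip(mu_draws, sigma2_draws)])
ppp = (t_rep >= t_obs).mean()      # posterior predictive p-value
print(ppp)                         # near 0 or 1 signals model misfit
```

This is exactly the kind of check that falls outside Bayesian confirmation theory: it compares the fitted model against the data rather than against a rival model, and a failed check motivates model revision.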
Bayesian statistics in medicine: a 25 year review.
Ashby, Deborah
2006-11-15
This review examines the state of Bayesian thinking as Statistics in Medicine was launched in 1982, reflecting particularly on its applicability and uses in medical research. It then looks at each subsequent five-year epoch, with a focus on papers appearing in Statistics in Medicine, putting these in the context of major developments in Bayesian thinking and computation with reference to important books, landmark meetings and seminal papers. It charts the growth of Bayesian statistics as it is applied to medicine and makes predictions for the future. From sparse beginnings, where Bayesian statistics was barely mentioned, Bayesian statistics has now permeated all the major areas of medical statistics, including clinical trials, epidemiology, meta-analyses and evidence synthesis, spatial modelling, longitudinal modelling, survival modelling, molecular genetics and decision-making in respect of new technologies.
With or without you: predictive coding and Bayesian inference in the brain
Aitchison, Laurence; Lengyel, Máté
2018-01-01
Two theoretical ideas have emerged recently with the ambition to provide a unifying functional explanation of neural population coding and dynamics: predictive coding and Bayesian inference. Here, we describe the two theories and their combination into a single framework: Bayesian predictive coding. We clarify how the two theories can be distinguished, despite sharing core computational concepts and addressing an overlapping set of empirical phenomena. We argue that predictive coding is an algorithmic/representational motif that can serve several different computational goals of which Bayesian inference is but one. Conversely, while Bayesian inference can utilize predictive coding, it can also be realized by a variety of other representations. We critically evaluate the experimental evidence supporting Bayesian predictive coding and discuss how to test it more directly. PMID:28942084
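A toy linear-Gaussian example makes the claimed relationship concrete: a predictive-coding update rule (gradient ascent driven by precision-weighted prediction errors) converges to the exact Bayesian posterior mean. All numbers below are illustrative, not taken from the paper:

```python
# One latent cause, one sensory channel: prior N(0, 4), likelihood N(obs | mu, 1)
prior_mean, prior_var = 0.0, 4.0
obs, obs_var = 2.0, 1.0

mu = prior_mean                    # current estimate of the latent cause
lr = 0.05
for _ in range(2000):
    eps_sensory = (obs - mu) / obs_var         # bottom-up prediction error
    eps_prior = (prior_mean - mu) / prior_var  # top-down prediction error
    mu += lr * (eps_sensory + eps_prior)       # predictive-coding update

# Analytic Bayes: precision-weighted average of prior mean and observation
posterior_mean = ((prior_mean / prior_var + obs / obs_var)
                  / (1 / prior_var + 1 / obs_var))
print(round(mu, 4), round(posterior_mean, 4))  # both 1.6
```

The fixed point of the error-driven dynamics is exactly the posterior mean, which is the sense in which predictive coding can "serve" Bayesian inference; the same update motif could equally serve non-Bayesian objectives with different precisions or targets.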