Sample records for PET probabilistic map

  1. On the accuracy and reproducibility of a novel probabilistic atlas-based generation for calculation of head attenuation maps on integrated PET/MR scanners.

    PubMed

    Chen, Kevin T; Izquierdo-Garcia, David; Poynton, Clare B; Chonde, Daniel B; Catana, Ciprian

    2017-03-01

To propose an MR-based method for generating continuous-valued head attenuation maps and to assess its accuracy and reproducibility. Demonstrating that novel MR-based photon attenuation correction methods are both accurate and reproducible is essential prior to using them routinely in research and clinical studies on integrated PET/MR scanners. Continuous-valued linear attenuation coefficient maps ("μ-maps") were generated by combining atlases that provided the prior probability of voxel positions belonging to a certain tissue class (air, soft tissue, or bone) with an MR intensity-based likelihood classifier to produce posterior probability maps of tissue classes. These probabilities were used as weights to generate the μ-maps. The accuracy of this probabilistic atlas-based continuous-valued μ-map ("PAC-map") generation method was assessed by calculating the voxel-wise absolute relative change (RC) between the MR-based and scaled CT-based attenuation-corrected PET images. To assess reproducibility, we performed pair-wise comparisons of the RC values obtained from the PET images reconstructed using the μ-maps generated from the data acquired at three time points. The proposed method produced continuous-valued μ-maps that qualitatively reflected the variable anatomy in patients with brain tumors and agreed well with the scaled CT-based μ-maps. The absolute RC comparing the resulting PET volumes was 1.76 ± 2.33%, quantitatively demonstrating that the method is accurate. We also showed that the method is highly reproducible: the mean RC value for the PET images reconstructed using the μ-maps obtained at the three visits was 0.65 ± 0.95%. Accurate and highly reproducible continuous-valued head μ-maps can be generated from MR data using a probabilistic atlas-based approach.
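The two computations this abstract describes can be sketched compactly (illustrative only, not the authors' code): posterior tissue probabilities weight per-class attenuation coefficients, and accuracy is scored by voxel-wise absolute RC. The LAC values below are typical 511 keV figures assumed for the sketch.

```python
import numpy as np

# Assumed, typical linear attenuation coefficients at 511 keV (cm^-1).
LAC = {"air": 0.0, "soft_tissue": 0.096, "bone": 0.151}

def continuous_mu_map(posteriors):
    """posteriors: dict of class -> probability volume (summing to 1 per voxel).
    Returns the posterior-probability-weighted continuous-valued mu-map."""
    mu = np.zeros_like(next(iter(posteriors.values())), dtype=float)
    for tissue, prob in posteriors.items():
        mu += prob * LAC[tissue]  # posterior probabilities act as weights
    return mu

def absolute_relative_change(pet_mr, pet_ct, eps=1e-9):
    """Voxel-wise absolute relative change (%) between MR-based and
    CT-based attenuation-corrected PET images."""
    return 100.0 * np.abs(pet_mr - pet_ct) / (np.abs(pet_ct) + eps)
```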

  2. Probabilistic atlas-based segmentation of combined T1-weighted and DUTE MRI for calculation of head attenuation maps in integrated PET/MRI scanners.

    PubMed

    Poynton, Clare B; Chen, Kevin T; Chonde, Daniel B; Izquierdo-Garcia, David; Gollub, Randy L; Gerstner, Elizabeth R; Batchelor, Tracy T; Catana, Ciprian

    2014-01-01

We present a new MRI-based attenuation correction (AC) approach for integrated PET/MRI systems that combines both segmentation- and atlas-based methods by incorporating dual-echo ultra-short echo-time (DUTE) and T1-weighted (T1w) MRI data and a probabilistic atlas. Segmented atlases were constructed from CT training data using a leave-one-out framework and combined with T1w, DUTE, and CT data to train a classifier that computes the probability of air/soft tissue/bone at each voxel. This classifier was applied to segment the MRI of the subject of interest and attenuation maps (μ-maps) were generated by assigning specific linear attenuation coefficients (LACs) to each tissue class. The μ-maps generated with this "Atlas-T1w-DUTE" approach were compared to those obtained from DUTE data using a previously proposed method. For validation of the segmentation results, segmented CT μ-maps were considered the "silver standard"; the segmentation accuracy was assessed qualitatively and quantitatively through calculation of the Dice similarity coefficient (DSC). Relative change (RC) maps between the CT and MRI-based attenuation corrected PET volumes were also calculated for a global voxel-wise assessment of the reconstruction results. The μ-maps obtained using the Atlas-T1w-DUTE classifier agreed well with those derived from CT; the mean DSCs for the Atlas-T1w-DUTE-based μ-maps across all subjects were higher than those for DUTE-based μ-maps; the atlas-based μ-maps also showed a lower percentage of misclassified voxels across all subjects. RC maps from the atlas-based technique also demonstrated improvement in the PET data compared to the DUTE method, both globally as well as regionally.
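The Dice similarity coefficient used here for segmentation validation is a standard overlap measure; a minimal sketch (not the study's implementation):

```python
import numpy as np

def dice_similarity(seg_a, seg_b):
    """Dice similarity coefficient between two binary masks:
    DSC = 2|A ∩ B| / (|A| + |B|); 1 = perfect overlap, 0 = no overlap."""
    a, b = np.asarray(seg_a, bool), np.asarray(seg_b, bool)
    denom = a.sum() + b.sum()
    # Two empty masks are conventionally treated as perfectly overlapping.
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0
```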

  3. Probabilistic atlas-based segmentation of combined T1-weighted and DUTE MRI for calculation of head attenuation maps in integrated PET/MRI scanners

    PubMed Central

    Poynton, Clare B; Chen, Kevin T; Chonde, Daniel B; Izquierdo-Garcia, David; Gollub, Randy L; Gerstner, Elizabeth R; Batchelor, Tracy T; Catana, Ciprian

    2014-01-01

We present a new MRI-based attenuation correction (AC) approach for integrated PET/MRI systems that combines both segmentation- and atlas-based methods by incorporating dual-echo ultra-short echo-time (DUTE) and T1-weighted (T1w) MRI data and a probabilistic atlas. Segmented atlases were constructed from CT training data using a leave-one-out framework and combined with T1w, DUTE, and CT data to train a classifier that computes the probability of air/soft tissue/bone at each voxel. This classifier was applied to segment the MRI of the subject of interest and attenuation maps (μ-maps) were generated by assigning specific linear attenuation coefficients (LACs) to each tissue class. The μ-maps generated with this “Atlas-T1w-DUTE” approach were compared to those obtained from DUTE data using a previously proposed method. For validation of the segmentation results, segmented CT μ-maps were considered the “silver standard”; the segmentation accuracy was assessed qualitatively and quantitatively through calculation of the Dice similarity coefficient (DSC). Relative change (RC) maps between the CT and MRI-based attenuation corrected PET volumes were also calculated for a global voxel-wise assessment of the reconstruction results. The μ-maps obtained using the Atlas-T1w-DUTE classifier agreed well with those derived from CT; the mean DSCs for the Atlas-T1w-DUTE-based μ-maps across all subjects were higher than those for DUTE-based μ-maps; the atlas-based μ-maps also showed a lower percentage of misclassified voxels across all subjects. RC maps from the atlas-based technique also demonstrated improvement in the PET data compared to the DUTE method, both globally as well as regionally. PMID:24753982

  4. Probabilistic Air Segmentation and Sparse Regression Estimated Pseudo CT for PET/MR Attenuation Correction

    PubMed Central

    Chen, Yasheng; Juttukonda, Meher; Su, Yi; Benzinger, Tammie; Rubin, Brian G.; Lee, Yueh Z.; Lin, Weili; Shen, Dinggang; Lalush, David

    2015-01-01

Purpose To develop a positron emission tomography (PET) attenuation correction method for brain PET/magnetic resonance (MR) imaging by estimating pseudo computed tomographic (CT) images from T1-weighted MR and atlas CT images. Materials and Methods In this institutional review board–approved and HIPAA-compliant study, PET/MR/CT images were acquired in 20 subjects after obtaining written consent. A probabilistic air segmentation and sparse regression (PASSR) method was developed for pseudo CT estimation. Air segmentation was performed with assistance from a probabilistic air map. For nonair regions, the pseudo CT numbers were estimated via sparse regression by using atlas MR patches. The mean absolute percentage error (MAPE) on PET images was computed as the normalized mean absolute difference in PET signal intensity between a method and the reference standard continuous CT attenuation correction method. Friedman analysis of variance and Wilcoxon matched-pairs tests were performed for statistical comparison of MAPE between the PASSR method and Dixon segmentation, CT segmentation, and population averaged CT atlas (mean atlas) methods. Results The PASSR method yielded a mean MAPE ± standard deviation of 2.42% ± 1.0, 3.28% ± 0.93, and 2.16% ± 1.75, respectively, in the whole brain, gray matter, and white matter, which were significantly lower than the Dixon, CT segmentation, and mean atlas values (P < .01). Moreover, 68.0% ± 16.5, 85.8% ± 12.9, and 96.0% ± 2.5 of whole-brain volume had percentage errors within ±2%, ±5%, and ±10%, respectively, by using PASSR, which was significantly higher than with the other methods (P < .01). Conclusion PASSR outperformed the Dixon, CT segmentation, and mean atlas methods by reducing PET error owing to attenuation correction. © RSNA, 2014 PMID:25521778

  5. Validation of Simple Quantification Methods for (18)F-FP-CIT PET Using Automatic Delineation of Volumes of Interest Based on Statistical Probabilistic Anatomical Mapping and Isocontour Margin Setting.

    PubMed

    Kim, Yong-Il; Im, Hyung-Jun; Paeng, Jin Chul; Lee, Jae Sung; Eo, Jae Seon; Kim, Dong Hyun; Kim, Euishin E; Kang, Keon Wook; Chung, June-Key; Lee, Dong Soo

    2012-12-01

    (18)F-FP-CIT positron emission tomography (PET) is an effective imaging method for dopamine transporters. In usual clinical practice, (18)F-FP-CIT PET is analyzed visually or quantified using manual delineation of a volume of interest (VOI) for the striatum. In this study, we proposed and validated two simple quantitative methods based on automatic VOI delineation using statistical probabilistic anatomical mapping (SPAM) and isocontour margin setting. Seventy-five (18)F-FP-CIT PET images acquired in routine clinical practice were used for this study. A study-specific image template was made and the subject images were normalized to the template. Afterwards, uptakes in the striatal regions and cerebellum were quantified using probabilistic VOI based on SPAM. A quantitative parameter, QSPAM, was calculated to simulate binding potential. Additionally, the functional volume of each striatal region and its uptake were measured in automatically delineated VOI using isocontour margin setting. Uptake-volume product (QUVP) was calculated for each striatal region. QSPAM and QUVP were compared with visual grading and the influence of cerebral atrophy on the measurements was tested. Image analyses were successful in all the cases. Both the QSPAM and QUVP were significantly different according to visual grading (P < 0.001). The agreements of QUVP or QSPAM with visual grading were slight to fair for the caudate nucleus (κ = 0.421 and 0.291, respectively) and good to perfect for the putamen (κ = 0.663 and 0.607, respectively). Also, QSPAM and QUVP had a significant correlation with each other (P < 0.001). Cerebral atrophy made a significant difference in QSPAM and QUVP of the caudate nuclei regions with decreased (18)F-FP-CIT uptake. Simple quantitative measurements of QSPAM and QUVP showed acceptable agreement with visual grading. Although QSPAM in some groups may be influenced by cerebral atrophy, these simple methods are expected to be effective in the quantitative analysis of (18)F-FP-CIT PET in usual clinical practice.
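The abstract does not spell out the formulas. A common way to simulate binding potential with a cerebellar reference region is the specific-to-nonspecific ratio, and the uptake-volume product is stated directly; a hypothetical sketch under those assumptions:

```python
def q_spam(striatal_uptake, cerebellar_uptake):
    """Assumed form: specific binding ratio relative to the cerebellum,
    (target - reference) / reference, a common surrogate for binding potential."""
    return (striatal_uptake - cerebellar_uptake) / cerebellar_uptake

def q_uvp(functional_volume_ml, mean_uptake):
    """Uptake-volume product for an isocontour-delineated striatal region."""
    return functional_volume_ml * mean_uptake
```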

  6. Uncertainty Quantification of Evapotranspiration and Infiltration from Modeling and Historic Time Series at the Savannah River F-Area

    NASA Astrophysics Data System (ADS)

    Faybishenko, B.; Flach, G. P.

    2012-12-01

The objectives of this presentation are: (a) to illustrate the application of Monte Carlo and fuzzy-probabilistic approaches for uncertainty quantification (UQ) in predictions of potential evapotranspiration (PET), actual evapotranspiration (ET), and infiltration (I), using uncertain hydrological or meteorological time series data, and (b) to compare the results of these calculations with those from field measurements at the U.S. Department of Energy Savannah River Site (SRS), near Aiken, South Carolina, USA. The UQ calculations include the evaluation of aleatory (parameter uncertainty) and epistemic (model) uncertainties. The effect of aleatory uncertainty is expressed by assigning the probability distributions of input parameters, using historical monthly averaged data from the meteorological station at the SRS. The combined effect of aleatory and epistemic uncertainties on the UQ of PET, ET, and I is then expressed by aggregating the results of calculations from multiple models using a p-box and fuzzy numbers. The uncertainty in PET is calculated using the Bair-Robertson, Blaney-Criddle, Caprio, Hargreaves-Samani, Hamon, Jensen-Haise, Linacre, Makkink, Priestley-Taylor, Penman, Penman-Monteith, Thornthwaite, and Turc models. Then, ET is calculated from the modified Budyko model, followed by calculations of I from the water balance equation. We show that probabilistic and fuzzy-probabilistic calculations using multiple models generate the PET, ET, and I distributions, which are well within the range of field measurements. We also show that a selection of a subset of models can be used to constrain the uncertainty quantification of PET, ET, and I.
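As one concrete example of the thirteen PET models listed, the Hargreaves-Samani equation is a simple temperature-and-radiation formula; a sketch (standard form, not the presentation's code):

```python
import math

def pet_hargreaves_samani(t_mean, t_max, t_min, ra):
    """Hargreaves-Samani potential evapotranspiration (mm/day):
    PET = 0.0023 * Ra * (Tmean + 17.8) * sqrt(Tmax - Tmin),
    with temperatures in deg C and Ra the extraterrestrial radiation
    expressed in mm/day of equivalent evaporation."""
    return 0.0023 * ra * (t_mean + 17.8) * math.sqrt(max(t_max - t_min, 0.0))
```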

  7. Arterial spin labeling-based Z-maps have high specificity and positive predictive value for neurodegenerative dementia compared to FDG-PET.

    PubMed

    Fällmar, David; Haller, Sven; Lilja, Johan; Danfors, Torsten; Kilander, Lena; Tolboom, Nelleke; Egger, Karl; Kellner, Elias; Croon, Philip M; Verfaillie, Sander C J; van Berckel, Bart N M; Ossenkoppele, Rik; Barkhof, Frederik; Larsson, Elna-Marie

    2017-10-01

    Cerebral perfusion analysis based on arterial spin labeling (ASL) MRI has been proposed as an alternative to FDG-PET in patients with neurodegenerative disease. Z-maps show normal distribution values relating an image to a database of controls. They are routinely used for FDG-PET to demonstrate disease-specific patterns of hypometabolism at the individual level. This study aimed to compare the performance of Z-maps based on ASL to FDG-PET. Data were combined from two separate sites, each cohort consisting of patients with Alzheimer's disease (n = 18 + 7), frontotemporal dementia (n = 12 + 8) and controls (n = 9 + 29). Subjects underwent pseudocontinuous ASL and FDG-PET. Z-maps were created for each subject and modality. Four experienced physicians visually assessed the 166 Z-maps in random order, blinded to modality and diagnosis. Discrimination of patients versus controls using ASL-based Z-maps yielded high specificity (84%) and positive predictive value (80%), but significantly lower sensitivity compared to FDG-PET-based Z-maps (53% vs. 96%, p < 0.001). Among true-positive cases, correct diagnoses were made in 76% (ASL) and 84% (FDG-PET) (p = 0.168). ASL-based Z-maps can be used for visual assessment of neurodegenerative dementia with high specificity and positive predictive value, but with inferior sensitivity compared to FDG-PET. • ASL-based Z-maps yielded high specificity and positive predictive value in neurodegenerative dementia. • ASL-based Z-maps had significantly lower sensitivity compared to FDG-PET-based Z-maps. • FDG-PET might be reserved for ASL-negative cases where clinical suspicion persists. • Findings were similar at two study sites.

  8. Reproducibility of MR-Based Attenuation Maps in PET/MRI and the Impact on PET Quantification in Lung Cancer.

    PubMed

    Olin, Anders; Ladefoged, Claes N; Langer, Natasha H; Keller, Sune H; Löfgren, Johan; Hansen, Adam E; Kjær, Andreas; Langer, Seppo W; Fischer, Barbara M; Andersen, Flemming L

    2018-06-01

    Quantitative PET/MRI is dependent on reliable and reproducible MR-based attenuation correction (MR-AC). In this study, we evaluated the quality of current vendor-provided thoracic MR-AC maps and further investigated the reproducibility of their impact on 18F-FDG PET quantification in patients with non-small cell lung cancer. Methods: Eleven patients with inoperable non-small cell lung cancer underwent 2-5 thoracic PET/MRI scan-rescan examinations within 22 d. 18F-FDG PET data were acquired along with 2 Dixon MR-AC maps for each examination. Two PET images (PET-A and PET-B) were reconstructed using identical PET emission data but with MR-AC from these intrasubject repeated attenuation maps. In total, 90 MR-AC maps were evaluated visually for quality and the occurrence of categorized artifacts by 2 PET/MRI-experienced physicians. Each tumor was outlined by a volume of interest (40% isocontour of maximum) on PET-A, which was then projected onto the corresponding PET-B. SUVmean and SUVmax were assessed from the PET images. Within-examination coefficients of variation and Bland-Altman analyses were conducted for the assessment of SUV variations between PET-A and PET-B. Results: Image artifacts were observed in 86% of the MR-AC maps, and 30% of the MR-AC maps were subjectively expected to affect the tumor SUV. SUVmean and SUVmax resulted in coefficients of variation of 5.6% and 6.6%, respectively, and scan-rescan SUV variations were within ±20% in 95% of the cases. Substantial SUV variations were seen mainly for scan-rescan examinations affected by respiratory motion. Conclusion: Artifacts occur frequently in standard thoracic MR-AC maps, affecting the reproducibility of PET/MRI. These, in combination with other well-known sources of error associated with PET/MRI examinations, lead to inconsistent SUV measurements in serial studies, which may affect the reliability of therapy response assessment. A thorough visual inspection of the thoracic MR-AC map and the Dixon images from which it is derived remains crucial for the detection of MR-AC artifacts that may influence the reliability of SUV. © 2018 by the Society of Nuclear Medicine and Molecular Imaging.
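The abstract does not give its exact coefficient-of-variation estimator; one common choice for paired scan-rescan measurements is the paired-difference (root-mean-square) form, sketched here under that assumption:

```python
import numpy as np

def scan_rescan_cv(suv_a, suv_b):
    """Within-examination coefficient of variation (%) for paired
    scan-rescan SUVs, assumed paired-difference form:
    CV = 100 * sqrt(mean(d^2) / 2), with d = (a - b) / pair mean."""
    a, b = np.asarray(suv_a, float), np.asarray(suv_b, float)
    d = (a - b) / ((a + b) / 2.0)
    return 100.0 * np.sqrt(np.mean(d ** 2) / 2.0)
```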

  9. First USGS urban seismic hazard maps predict the effects of soils

    USGS Publications Warehouse

    Cramer, C.H.; Gomberg, J.S.; Schweig, E.S.; Waldron, B.A.; Tucker, K.

    2006-01-01

Probabilistic and scenario urban seismic hazard maps have been produced for Memphis, Shelby County, Tennessee, covering a six-quadrangle area of the city. The nine probabilistic maps are for peak ground acceleration and 0.2 s and 1.0 s spectral acceleration and for 10%, 5%, and 2% probability of being exceeded in 50 years. Six scenario maps for these three ground motions have also been generated for both an M7.7 and an M6.2 event on the southwest arm of the New Madrid seismic zone ending at Marked Tree, Arkansas. All maps include the effect of local geology. Relative to the national seismic hazard maps, the effect of the thick sediments beneath Memphis is to decrease 0.2 s probabilistic ground motions by 0-30% and increase 1.0 s probabilistic ground motions by ~100%. Probabilistic peak ground accelerations remain at levels similar to the national maps, although the ground motion gradient across Shelby County is reduced and ground motions are more uniform within the county. The M7.7 scenario maps show ground motions similar to the 5%-in-50-year probabilistic maps. As an effect of local geology, both M7.7 and M6.2 scenario maps show a more uniform seismic ground-motion hazard across Shelby County than scenario maps with constant site conditions (i.e., NEHRP B/C boundary).
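The "10%, 5%, and 2% probability of being exceeded in 50 years" levels correspond to annual exceedance rates under the usual Poisson assumption of probabilistic seismic hazard analysis; a small sketch of that conversion (not USGS code):

```python
import math

def exceedance_probability(annual_rate, years):
    """Poisson probability of at least one exceedance in `years`,
    given the annual exceedance rate: P = 1 - exp(-rate * years)."""
    return 1.0 - math.exp(-annual_rate * years)

def annual_rate_for(prob, years):
    """Annual exceedance rate matching a target probability in `years`,
    e.g. 2% in 50 years (return period ~2475 years)."""
    return -math.log(1.0 - prob) / years
```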

  10. PET-Tool: a software suite for comprehensive processing and managing of Paired-End diTag (PET) sequence data.

    PubMed

    Chiu, Kuo Ping; Wong, Chee-Hong; Chen, Qiongyu; Ariyaratne, Pramila; Ooi, Hong Sain; Wei, Chia-Lin; Sung, Wing-Kin Ken; Ruan, Yijun

    2006-08-25

We recently developed the Paired End diTag (PET) strategy for efficient characterization of mammalian transcriptomes and genomes. The paired end nature of short PET sequences derived from long DNA fragments raised a new set of bioinformatics challenges, including how to extract PETs from raw sequence reads, and correctly yet efficiently map PETs to reference genome sequences. To accommodate and streamline data analysis of the large volume of PET sequences generated from each PET experiment, an automated PET data-processing pipeline is desirable. We designed an integrated computation program package, PET-Tool, to automatically process PET sequences and map them to the genome sequences. The Tool was implemented as a web-based application composed of four modules: the Extractor module for PET extraction; the Examiner module for analytic evaluation of PET sequence quality; the Mapper module for locating PET sequences in the genome sequences; and the Project Manager module for data organization. The performance of PET-Tool was evaluated through the analyses of 2.7 million PET sequences. It was demonstrated that PET-Tool is accurate and efficient in extracting PET sequences and removing artifacts from large-volume datasets. Using optimized mapping criteria, over 70% of quality PET sequences were mapped specifically to the genome sequences. With a 2.4 GHz LINUX machine, it takes approximately six hours to process one million PETs from extraction to mapping. The speed, accuracy, and comprehensiveness have proved that PET-Tool is an important and useful component in PET experiments, and can be extended to accommodate other related analyses of paired-end sequences. The Tool also provides user-friendly functions for data quality check and system for multi-layer data management.

  11. Probabilistic drug connectivity mapping

    PubMed Central

    2014-01-01

    Background The aim of connectivity mapping is to match drugs using drug-treatment gene expression profiles from multiple cell lines. This can be viewed as an information retrieval task, with the goal of finding the most relevant profiles for a given query drug. We infer the relevance for retrieval by data-driven probabilistic modeling of the drug responses, resulting in probabilistic connectivity mapping, and further consider the available cell lines as different data sources. We use a special type of probabilistic model to separate what is shared and specific between the sources, in contrast to earlier connectivity mapping methods that have intentionally aggregated all available data, neglecting information about the differences between the cell lines. Results We show that the probabilistic multi-source connectivity mapping method is superior to alternatives in finding functionally and chemically similar drugs from the Connectivity Map data set. We also demonstrate that an extension of the method is capable of retrieving combinations of drugs that match different relevant parts of the query drug response profile. Conclusions The probabilistic modeling-based connectivity mapping method provides a promising alternative to earlier methods. Principled integration of data from different cell lines helps to identify relevant responses for specific drug repositioning applications. PMID:24742351

  12. Sparsity-constrained PET image reconstruction with learned dictionaries

    NASA Astrophysics Data System (ADS)

    Tang, Jing; Yang, Bao; Wang, Yanhua; Ying, Leslie

    2016-09-01

    PET imaging plays an important role in scientific and clinical measurement of biochemical and physiological processes. Model-based PET image reconstruction such as the iterative expectation maximization algorithm seeking the maximum likelihood solution leads to increased noise. The maximum a posteriori (MAP) estimate removes divergence at higher iterations. However, a conventional smoothing prior or a total-variation (TV) prior in a MAP reconstruction algorithm causes over smoothing or blocky artifacts in the reconstructed images. We propose to use dictionary learning (DL) based sparse signal representation in the formation of the prior for MAP PET image reconstruction. The dictionary to sparsify the PET images in the reconstruction process is learned from various training images including the corresponding MR structural image and a self-created hollow sphere. Using simulated and patient brain PET data with corresponding MR images, we study the performance of the DL-MAP algorithm and compare it quantitatively with a conventional MAP algorithm, a TV-MAP algorithm, and a patch-based algorithm. The DL-MAP algorithm achieves improved bias and contrast (or regional mean values) at comparable noise to what the other MAP algorithms acquire. The dictionary learned from the hollow sphere leads to similar results as the dictionary learned from the corresponding MR image. Achieving robust performance in various noise-level simulation and patient studies, the DL-MAP algorithm with a general dictionary demonstrates its potential in quantitative PET imaging.
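The iterative expectation-maximization baseline the abstract builds on is the classic MLEM update; a toy sketch with a dense system matrix (illustrative only: real PET uses sparse projectors, and the dictionary-learning MAP prior of the paper is not shown here):

```python
import numpy as np

def mlem(A, y, n_iters=50, x0=None):
    """Plain MLEM for Poisson PET data, seeking the ML solution:
    x <- x / (A^T 1) * A^T (y / (A x)).
    A: system matrix (detector bins x voxels); y: measured sinogram."""
    x = np.ones(A.shape[1]) if x0 is None else x0.astype(float).copy()
    sens = A.T @ np.ones(A.shape[0])  # sensitivity image A^T 1
    for _ in range(n_iters):
        proj = A @ x                               # forward projection
        x = x / sens * (A.T @ (y / np.maximum(proj, 1e-12)))
    return x
```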

  13. Subject-specific bone attenuation correction for brain PET/MR: can ZTE-MRI substitute CT scan accurately?

    PubMed

    Khalifé, Maya; Fernandez, Brice; Jaubert, Olivier; Soussan, Michael; Brulon, Vincent; Buvat, Irène; Comtat, Claude

    2017-09-21

In brain PET/MR applications, accurate attenuation maps are required for accurate PET image quantification. An implemented attenuation correction (AC) method for brain imaging is the single-atlas approach that estimates an AC map from an averaged CT template. As an alternative, we propose to use a zero echo time (ZTE) pulse sequence to segment bone, air and soft tissue. A linear relationship between histogram normalized ZTE intensity and measured CT density in Hounsfield units (HU) in bone has been established using a CT-MR database of 16 patients. Continuous AC maps were computed based on the segmented ZTE by setting a fixed linear attenuation coefficient (LAC) to air and soft tissue and by using the linear relationship to generate continuous μ values for the bone. Additionally, for the purpose of comparison, four other AC maps were generated: a ZTE derived AC map with a fixed LAC for the bone, an AC map based on the single-atlas approach as provided by the PET/MR manufacturer, a soft-tissue only AC map and, finally, the CT derived attenuation map used as the gold standard (CTAC). All these AC maps were used with different levels of smoothing for PET image reconstruction with and without time-of-flight (TOF). The subject-specific AC map generated by combining ZTE-based segmentation and linear scaling of the normalized ZTE signal into HU was found to be a good substitute for the measured CTAC map in brain PET/MR when used with a Gaussian smoothing kernel of 4 mm corresponding to the PET scanner intrinsic resolution. As expected, TOF reduces AC error regardless of the AC method. The continuous ZTE-AC performed better than the other alternative MR derived AC methods, reducing the quantification error between the MRAC corrected PET image and the reference CTAC corrected PET image.

  14. Subject-specific bone attenuation correction for brain PET/MR: can ZTE-MRI substitute CT scan accurately?

    NASA Astrophysics Data System (ADS)

    Khalifé, Maya; Fernandez, Brice; Jaubert, Olivier; Soussan, Michael; Brulon, Vincent; Buvat, Irène; Comtat, Claude

    2017-10-01

In brain PET/MR applications, accurate attenuation maps are required for accurate PET image quantification. An implemented attenuation correction (AC) method for brain imaging is the single-atlas approach that estimates an AC map from an averaged CT template. As an alternative, we propose to use a zero echo time (ZTE) pulse sequence to segment bone, air and soft tissue. A linear relationship between histogram normalized ZTE intensity and measured CT density in Hounsfield units (HU) in bone has been established thanks to a CT-MR database of 16 patients. Continuous AC maps were computed based on the segmented ZTE by setting a fixed linear attenuation coefficient (LAC) to air and soft tissue and by using the linear relationship to generate continuous μ values for the bone. Additionally, for the purpose of comparison, four other AC maps were generated: a ZTE derived AC map with a fixed LAC for the bone, an AC map based on the single-atlas approach as provided by the PET/MR manufacturer, a soft-tissue only AC map and, finally, the CT derived attenuation map used as the gold standard (CTAC). All these AC maps were used with different levels of smoothing for PET image reconstruction with and without time-of-flight (TOF). The subject-specific AC map generated by combining ZTE-based segmentation and linear scaling of the normalized ZTE signal into HU was found to be a good substitute for the measured CTAC map in brain PET/MR when used with a Gaussian smoothing kernel of 4 mm corresponding to the PET scanner intrinsic resolution. As expected, TOF reduces AC error regardless of the AC method. The continuous ZTE-AC performed better than the other alternative MR derived AC methods, reducing the quantification error between the MRAC corrected PET image and the reference CTAC corrected PET image.

  15. A Probabilistic Feature Map-Based Localization System Using a Monocular Camera.

    PubMed

    Kim, Hyungjin; Lee, Donghwa; Oh, Taekjun; Choi, Hyun-Taek; Myung, Hyun

    2015-08-31

Image-based localization is one of the most widely researched localization techniques in the robotics and computer vision communities. As enormous image data sets are provided through the Internet, many studies on estimating a location with a pre-built image-based 3D map have been conducted. Most research groups use numerous image data sets that contain sufficient features. In contrast, this paper focuses on image-based localization in the case of insufficient images and features. A more accurate localization method is proposed based on a probabilistic map using 3D-to-2D matching correspondences between a map and a query image. The probabilistic feature map is generated in advance by probabilistic modeling of the sensor system as well as the uncertainties of camera poses. Using the conventional PnP algorithm, an initial camera pose is estimated on the probabilistic feature map. The proposed algorithm is optimized from the initial pose by minimizing Mahalanobis distance errors between features from the query image and the map to improve accuracy. To verify that the localization accuracy is improved, the proposed algorithm is compared with the conventional algorithm in both simulated and real environments.
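The Mahalanobis distance error minimized during pose refinement weights each feature residual by its uncertainty covariance; a minimal sketch of that error term (illustrative, with hypothetical argument names):

```python
import numpy as np

def mahalanobis_sq(feature_obs, feature_map, cov):
    """Squared Mahalanobis distance between an observed feature and a
    map feature with uncertainty covariance `cov`:
    d^2 = (x - m)^T cov^-1 (x - m)."""
    d = np.asarray(feature_obs, float) - np.asarray(feature_map, float)
    return float(d @ np.linalg.solve(cov, d))
```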

  16. A Probabilistic Feature Map-Based Localization System Using a Monocular Camera

    PubMed Central

    Kim, Hyungjin; Lee, Donghwa; Oh, Taekjun; Choi, Hyun-Taek; Myung, Hyun

    2015-01-01

Image-based localization is one of the most widely researched localization techniques in the robotics and computer vision communities. As enormous image data sets are provided through the Internet, many studies on estimating a location with a pre-built image-based 3D map have been conducted. Most research groups use numerous image data sets that contain sufficient features. In contrast, this paper focuses on image-based localization in the case of insufficient images and features. A more accurate localization method is proposed based on a probabilistic map using 3D-to-2D matching correspondences between a map and a query image. The probabilistic feature map is generated in advance by probabilistic modeling of the sensor system as well as the uncertainties of camera poses. Using the conventional PnP algorithm, an initial camera pose is estimated on the probabilistic feature map. The proposed algorithm is optimized from the initial pose by minimizing Mahalanobis distance errors between features from the query image and the map to improve accuracy. To verify that the localization accuracy is improved, the proposed algorithm is compared with the conventional algorithm in both simulated and real environments. PMID:26404284

  17. Development of Probabilistic Flood Inundation Mapping For Flooding Induced by Dam Failure

    NASA Astrophysics Data System (ADS)

    Tsai, C.; Yeh, J. J. J.

    2017-12-01

A primary function of flood inundation mapping is to forecast flood hazards and assess potential losses. However, uncertainties limit the reliability of inundation hazard assessments. Major sources of uncertainty should be taken into consideration by an optimal flood management strategy. This study focuses on the 20 km reach downstream of the Shihmen Reservoir in Taiwan. A dam-failure-induced flood provides the upstream boundary conditions of flood routing. The two major sources of uncertainty considered in the hydraulic model and the flood inundation mapping are uncertainty in the dam break model and uncertainty in the roughness coefficient. The perturbance moment method is applied to a dam break model and the hydro system model to develop probabilistic flood inundation mapping. Various numbers of uncertain variables can be considered in these models and the variability of outputs can be quantified. The probabilistic flood inundation mapping for dam-break-induced floods can be developed with consideration of the variability of output using a commonly used HEC-RAS model. Different probabilistic flood inundation mappings are discussed and compared. Probabilistic flood inundation maps are expected to provide new physical insights in support of the evaluation of reservoir flooded areas of concern.

  18. The combination of preoperative PET/CT and sentinel lymph node biopsy in the surgical management of early-stage cervical cancer.

    PubMed

    Papadia, Andrea; Gasparri, Maria Luisa; Genoud, Sophie; Bernd, Klaeser; Mueller, Michael D

    2017-11-01

    The aim of the study was to evaluate the use of PET/CT and/or SLN mapping, alone or in combination, in cervical cancer patients. Data on stage IA1-IIA cervical cancer patients undergoing PET/CT and SLN mapping were retrospectively collected. The sensitivity, specificity, positive predictive value (PPV) and negative predictive value (NPV) of PET/CT and SLN mapping, alone or in combination, in identifying cervical cancer patients with lymph node metastases were calculated. Sixty patients met the inclusion criteria. PET/CT showed a sensitivity of 68%, a specificity of 84%, a PPV of 61% and an NPV of 88% in detecting lymph node metastases. SLN mapping showed a sensitivity of 93%, a specificity of 100%, a PPV of 100% and an NPV of 97%. The combination of PET/CT and SLN mapping showed a sensitivity of 100%, a specificity of 86%, a PPV of 72% and an NPV of 100%. For patients with tumors >2 cm in diameter, PET/CT showed a sensitivity of 68%, a specificity of 72%, a PPV of 61% and an NPV of 86%. SLN mapping showed a sensitivity of 93%, a specificity of 100%, a PPV of 100% and an NPV of 95%. The combination of PET/CT and SLN mapping showed a sensitivity of 100%, a specificity of 76%, a PPV of 72% and an NPV of 100%. PET/CT represents a "safety net" that helps the surgeon identify metastatic lymph nodes, especially in patients with larger tumors.
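All four reported figures derive from a 2x2 confusion matrix. A minimal sketch, with hypothetical counts chosen only to be roughly consistent with the reported SLN numbers (14 node-positive and 46 node-negative patients are an assumption, not the paper's data):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 confusion matrix."""
    return {
        "sensitivity": tp / (tp + fn),   # true positives among diseased
        "specificity": tn / (tn + fp),   # true negatives among healthy
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

# Hypothetical counts: 13 of 14 metastatic patients detected, no false positives.
m = diagnostic_metrics(tp=13, fp=0, fn=1, tn=46)
print({k: round(v, 2) for k, v in m.items()})
```

With these counts, sensitivity is 13/14 (about 93%) and specificity 100%, matching the SLN pattern reported above.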

  19. 18F-FDG PET/MRI fusion in characterizing pancreatic tumors: comparison to PET/CT.

    PubMed

    Tatsumi, Mitsuaki; Isohashi, Kayako; Onishi, Hiromitsu; Hori, Masatoshi; Kim, Tonsok; Higuchi, Ichiro; Inoue, Atsuo; Shimosegawa, Eku; Takeda, Yutaka; Hatazawa, Jun

    2011-08-01

    To demonstrate that positron emission tomography (PET)/magnetic resonance imaging (MRI) fusion is feasible for characterizing pancreatic tumors (PTs), comparing MRI and computed tomography (CT) as mapping images for fusion with PET, as well as the fused PET/MRI and PET/CT images. We retrospectively reviewed 47 sets of (18)F-fluorodeoxyglucose ((18)F-FDG) PET/CT and MRI examinations performed to evaluate suspected or known pancreatic cancer. To assess their suitability as mapping images for fusion with PET, CT (of PET/CT) and T1- and T2-weighted (w) MR images (all non-contrast) were graded for the visibility of the PT (5-point confidence scale). Fused PET/CT, PET/T1-w and PET/T2-w MR images of the upper abdomen were evaluated to determine whether the mapping images provided diagnostic information additional to PET alone (3-point scale). The overall diagnostic quality of the PET/CT and PET/MRI sets was also assessed (3-point scale). These PET/MRI-related scores were compared with the PET/CT-related scores, and the accuracy in characterizing PTs was compared. Forty-three PTs were visualized on CT or MRI, including 30 with abnormal FDG uptake and 13 without. The confidence score for the visibility of the PT was significantly higher on T1-w MRI than on CT. The scores for additional diagnostic information and for overall diagnostic quality were significantly higher for the PET/T1-w MRI set than for the PET/CT set. The diagnostic accuracy was higher for PET/T1-w and PET/T2-w MRI (93.0% and 90.7%, respectively) than for PET/CT (88.4%), although statistical significance was not reached. PET/MRI fusion, especially PET with T1-w MRI, was demonstrated to be superior to PET/CT in characterizing PTs, offering better mapping and fusion image quality.

  20. Construction and evaluation of quantitative small-animal PET probabilistic atlases for [¹⁸F]FDG and [¹⁸F]FECT functional mapping of the mouse brain.

    PubMed

    Casteels, Cindy; Vunckx, Kathleen; Aelvoet, Sarah-Ann; Baekelandt, Veerle; Bormans, Guy; Van Laere, Koen; Koole, Michel

    2013-01-01

    Automated voxel-based or pre-defined volume-of-interest (VOI) analysis of small-animal PET data in mice is necessary for optimal information usage, as the number of available resolution elements is limited. We have mapped metabolic ([(18)F]FDG) and dopamine transporter ([(18)F]FECT) small-animal PET data onto a 3D Magnetic Resonance Microscopy (MRM) mouse brain template and aligned them in space to the Paxinos co-ordinate system. In this way, ligand-specific templates for sensitive analysis and accurate anatomical localization were created. Next, using a pre-defined VOI approach, test-retest and intersubject variability of various quantification methods were evaluated. Also, the feasibility of mouse brain statistical parametric mapping (SPM) was explored for [(18)F]FDG and [(18)F]FECT imaging of 6-hydroxydopamine-lesioned (6-OHDA) mice. Twenty-three adult C57BL6 mice were scanned with [(18)F]FDG and [(18)F]FECT. Registrations and affine spatial normalizations were performed using SPM8. [(18)F]FDG data were quantified using (1) an image-derived input function obtained from the liver (cMRglc), (2) standardized uptake values (SUVglc) corrected for blood glucose levels and (3) counts normalized to whole-brain uptake. Parametric [(18)F]FECT binding images were constructed by reference to the cerebellum. Registration accuracy was determined using random simulated misalignments and vectorial mismatch determination. Registration accuracy was between 0.21 and 1.11 mm. Regional intersubject variabilities of cMRglc ranged from 15.4% to 19.2%, while test-retest values were between 5.0% and 13.0%. For [(18)F]FECT uptake in the caudate-putamen, these values were 13.0% and 10.3%, respectively. Regional values of cMRglc correlated positively with SUVglc measured within the 45-60 min time frame (Spearman r = 0.71). Next, SPM analysis of 6-OHDA-lesioned mice showed hypometabolism in the bilateral caudate-putamen and cerebellum, and a unilateral striatal decrease in DAT availability. MRM-based small-animal PET templates facilitate accurate assessment and spatial localization of mouse brain function using VOI or voxel-based analysis. Regional intersubject and test-retest variations indicate that, for these targets, accuracy comparable to human studies can be achieved.
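For readers unfamiliar with the uptake measures compared here, a minimal sketch of SUV computation follows. It assumes a tissue density of 1 g/ml, and the glucose normalization to a 100 mg/dl reference is a common convention rather than necessarily this paper's exact formula; all numeric values are hypothetical.

```python
def suv(tissue_kbq_per_ml, injected_dose_mbq, body_weight_g):
    """Standardized uptake value: tissue activity concentration divided
    by injected dose per gram of body weight (assumes 1 g/ml density)."""
    dose_kbq_per_g = injected_dose_mbq * 1000.0 / body_weight_g
    return tissue_kbq_per_ml / dose_kbq_per_g

def suv_glucose_corrected(suv_value, plasma_glucose_mg_dl, reference_mg_dl=100.0):
    """Glucose-corrected SUV: scale by plasma glucose relative to a
    reference level (a common SUVglc convention, assumed here)."""
    return suv_value * plasma_glucose_mg_dl / reference_mg_dl

# Hypothetical mouse scan: 250 kBq/ml uptake, 10 MBq injected, 25 g body weight.
s = suv(tissue_kbq_per_ml=250.0, injected_dose_mbq=10.0, body_weight_g=25.0)
print(round(s, 3), round(suv_glucose_corrected(s, 120.0), 3))  # 0.625 0.75
```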

  1. Phase transitions in coupled map lattices and in associated probabilistic cellular automata.

    PubMed

    Just, Wolfram

    2006-10-01

    Analytical tools are applied to investigate piecewise linear coupled map lattices in terms of probabilistic cellular automata. The so-called disorder condition of probabilistic cellular automata is closely related to attracting sets in coupled map lattices. The importance of this condition for the suppression of phase transitions is illustrated by spatially one-dimensional systems. Invariant densities and temporal correlations are calculated explicitly. Ising-type phase transitions are found for one-dimensional coupled map lattices acting on repelling sets and for a spatially two-dimensional Miller-Huse-like system with stable long-time dynamics. Critical exponents are calculated within a finite-size scaling approach. The relevance of detailed balance of the resulting probabilistic cellular automaton to the critical behavior is pointed out.
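A diffusively coupled lattice of piecewise linear (Bernoulli) maps, of the general kind analyzed here, can be iterated in a few lines. This is a generic illustration, not the paper's specific model or parameters.

```python
import numpy as np

def bernoulli(x):
    """Piecewise linear chaotic local map on the unit interval."""
    return (2.0 * x) % 1.0

def cml_step(x, eps):
    """One step of a diffusively coupled map lattice, periodic boundaries:
    x_i(t+1) = (1-eps) f(x_i) + (eps/2) [f(x_{i-1}) + f(x_{i+1})]."""
    fx = bernoulli(x)
    return (1.0 - eps) * fx + 0.5 * eps * (np.roll(fx, 1) + np.roll(fx, -1))

# Iterate a random initial state; the convex coupling keeps it in [0, 1).
x = np.random.default_rng(0).random(64)
for _ in range(1000):
    x = cml_step(x, eps=0.3)
print(bool((x >= 0.0).all() and (x < 1.0).all()))  # True
```

The probabilistic cellular automaton viewpoint then coarse-grains each site's state (e.g. which half of the unit interval it occupies) and studies the induced stochastic update rule.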

  2. A teaching-learning sequence about weather map reading

    NASA Astrophysics Data System (ADS)

    Mandrikas, Achilleas; Stavrou, Dimitrios; Skordoulis, Constantine

    2017-07-01

    In this paper a teaching-learning sequence (TLS) introducing pre-service elementary teachers (PETs) to weather map reading, with emphasis on wind assignment, is presented. The TLS includes activities on recognizing wind symbols, assigning wind direction and wind speed on a weather map, and identifying wind characteristics in a weather forecast. The capabilities and difficulties of sixty PETs in understanding weather maps were investigated using inquiry-based learning activities. The results show that most PETs became more capable of reading weather maps and assigning wind direction and speed on them. Our results also show that PETs could be guided to understand meteorology concepts useful in everyday life and in teaching their future students.

  3. Sci—Thur AM: YIS - 08: Constructing an Attenuation map for a PET/MR Breast coil

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patrick, John C.; Lawson Health Research Institute, London, ON; London Regional Cancer Program, London, ON

    2014-08-15

    In 2013, around 23,000 Canadian women and 200 Canadian men were diagnosed with breast cancer. An estimated 5100 women and 55 men died from the disease. Using the sensitivity of MRI with the selectivity of PET, PET/MRI combines anatomical and functional information within the same scan and could help with early detection in high-risk patients. MRI requires radiofrequency coils for transmitting energy and receiving signal, but the breast coil attenuates the PET signal. To correct for this PET attenuation, a 3-dimensional map of linear attenuation coefficients (μ-map) of the breast coil must be created and incorporated into the PET reconstruction process. Several approaches have been proposed for building hardware μ-maps, some of which include the use of conventional kVCT and dual-energy CT. These methods can produce high-resolution images based on the electron densities of materials that can be converted into μ-maps. However, imaging hardware containing metal components with photons in the kV range is susceptible to metal artifacts. These artifacts can compromise the accuracy of the resulting μ-map and PET reconstruction; therefore, high-Z components should be removed. We propose a method for calculating μ-maps without removing coil components, based on megavoltage (MV) imaging with a linear accelerator that has been detuned for imaging at 1.0 MeV. Containers of known geometry filled with ¹⁸F were placed in the breast coil for imaging. A comparison between reconstructions based on the different μ-map construction methods was made. PET reconstructions with our method show a maximum of 6% difference relative to the existing kVCT-based reconstructions.

  4. Robust identification of polyethylene terephthalate (PET) plastics through Bayesian decision.

    PubMed

    Zulkifley, Mohd Asyraf; Mustafa, Mohd Marzuki; Hussain, Aini; Mustapha, Aouache; Ramli, Suzaimah

    2014-01-01

    Recycling is one of the most efficient methods for environmentally friendly waste management. Among municipal wastes, plastics are the most common material that can be easily recycled, and polyethylene terephthalate (PET) is one of the major types. PET is used in consumer goods packaging such as drinking bottles, toiletry containers, food packaging and many more. Usually, a recycling process is tailored to a specific material for optimal purification and decontamination to obtain high-grade recyclable material. The quantity and quality of the sorting process are limited by the capacity of human workers, who suffer from fatigue and boredom. Several automated sorting systems have been proposed in the literature, using chemical, proximity and vision sensors. The main advantages of vision-based sensors are their environmentally friendly approach, non-intrusive detection and high-throughput capability. However, existing methods rely heavily on deterministic approaches, which makes them less accurate, as the variation in the appearance of PET plastic waste is too high. We propose a probabilistic approach to modeling PET material by analyzing the reflection region and its surroundings. Three parameters are modeled by Gaussian and exponential distributions: the color, size and distance of the reflection region. The final classification is made through a supervised likelihood ratio test. The main novelty of the proposed method is the probabilistic approach to integrating various PET material signatures that are contaminated by stains under continual lighting changes. The system is evaluated using four performance metrics: precision, recall, accuracy and error. Our system performed the best in all evaluation metrics compared to the benchmark methods. The system can be further improved by fusing all neighborhood information in decision making and by implementing the system on a graphics processing unit for faster processing.
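The likelihood ratio classification step can be sketched as follows, with hypothetical trained Gaussian and exponential parameters standing in for the paper's learned feature models (only two of the three features are shown, and all numbers are made up for illustration).

```python
import math

def gauss_logpdf(x, mu, sigma):
    """Log-density of a Gaussian, e.g. for the reflection-region size."""
    return -0.5 * math.log(2 * math.pi * sigma ** 2) - (x - mu) ** 2 / (2 * sigma ** 2)

def expon_logpdf(x, scale):
    """Log-density of an exponential, e.g. for the reflection distance."""
    return -math.log(scale) - x / scale

def log_likelihood_ratio(size, distance, pet, other):
    """Sum of per-feature log-likelihood ratios; positive -> classify as PET.

    `pet` and `other` are hypothetical trained parameter dicts."""
    llr = gauss_logpdf(size, *pet["size"]) - gauss_logpdf(size, *other["size"])
    llr += expon_logpdf(distance, pet["dist_scale"]) \
        - expon_logpdf(distance, other["dist_scale"])
    return llr

pet = {"size": (40.0, 8.0), "dist_scale": 5.0}     # (mean, std) and scale
other = {"size": (15.0, 8.0), "dist_scale": 20.0}

print(log_likelihood_ratio(38.0, 3.0, pet, other) > 0)   # PET-like sample: True
```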

  5. Robust Identification of Polyethylene Terephthalate (PET) Plastics through Bayesian Decision

    PubMed Central

    Zulkifley, Mohd Asyraf; Mustafa, Mohd Marzuki; Hussain, Aini; Mustapha, Aouache; Ramli, Suzaimah

    2014-01-01

    Recycling is one of the most efficient methods for environmentally friendly waste management. Among municipal wastes, plastics are the most common material that can be easily recycled, and polyethylene terephthalate (PET) is one of the major types. PET is used in consumer goods packaging such as drinking bottles, toiletry containers, food packaging and many more. Usually, a recycling process is tailored to a specific material for optimal purification and decontamination to obtain high-grade recyclable material. The quantity and quality of the sorting process are limited by the capacity of human workers, who suffer from fatigue and boredom. Several automated sorting systems have been proposed in the literature, using chemical, proximity and vision sensors. The main advantages of vision-based sensors are their environmentally friendly approach, non-intrusive detection and high-throughput capability. However, existing methods rely heavily on deterministic approaches, which makes them less accurate, as the variation in the appearance of PET plastic waste is too high. We propose a probabilistic approach to modeling PET material by analyzing the reflection region and its surroundings. Three parameters are modeled by Gaussian and exponential distributions: the color, size and distance of the reflection region. The final classification is made through a supervised likelihood ratio test. The main novelty of the proposed method is the probabilistic approach to integrating various PET material signatures that are contaminated by stains under continual lighting changes. The system is evaluated using four performance metrics: precision, recall, accuracy and error. Our system performed the best in all evaluation metrics compared to the benchmark methods. The system can be further improved by fusing all neighborhood information in decision making and by implementing the system on a graphics processing unit for faster processing.
PMID:25485630

  6. PET attenuation correction for flexible MRI surface coils in hybrid PET/MRI using a 3D depth camera

    NASA Astrophysics Data System (ADS)

    Frohwein, Lynn J.; Heß, Mirco; Schlicher, Dominik; Bolwin, Konstantin; Büther, Florian; Jiang, Xiaoyi; Schäfers, Klaus P.

    2018-01-01

    PET attenuation correction for flexible MRI radio frequency surface coils in hybrid PET/MRI is still a challenging task, as the position and shape of these coils are subject to large inter-patient variability. The purpose of this feasibility study is to develop a novel method for incorporating attenuation information about flexible surface coils into PET reconstruction using the Microsoft Kinect V2 depth camera. The depth information is used to determine a dense point cloud of the coil's surface, representing the shape of the coil. From a CT template, acquired once in advance, surface information of the coil is likewise extracted and converted into a point cloud. The two point clouds are then registered using a combination of an iterative closest point (ICP) method and a partially rigid registration step. Using the transformation derived from the point clouds, the CT template is warped and thereby adapted to the PET/MRI scan setup. The transformed CT template is then converted from Hounsfield units into linear attenuation coefficients to form an attenuation map. The resulting fitted attenuation map is integrated into the patient-specific MRI-based DIXON attenuation map of the actual PET/MRI scan. A reconstruction of phantom PET data acquired with the coil present in the field-of-view (FoV), but without the corresponding coil attenuation map, shows large artifacts in regions close to the coil. The overall count loss is determined to be around 13% compared to a PET scan without the coil present in the FoV. A reconstruction using the new μ-map resulted in strongly reduced artifacts as well as increased overall PET intensities, with a remaining relative difference of about 1% from a PET scan without the coil in the FoV.
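The rigid core of each ICP iteration is a least-squares (Kabsch) alignment between matched points. Below is a minimal sketch with known correspondences and synthetic data; real ICP alternates this step with nearest-neighbour matching, and the paper adds a partially rigid refinement on top.

```python
import numpy as np

def kabsch(src, dst):
    """Least-squares rigid transform (R, t) mapping src points onto dst;
    this is the alignment step inside each ICP iteration."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_s).T @ (dst - mu_d)       # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, mu_d - R @ mu_s

# Synthetic "CT template" surface points and a rotated/shifted "depth" cloud.
rng = np.random.default_rng(1)
template = rng.random((50, 3))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([0.1, -0.2, 0.05])
depth = template @ R_true.T + t_true

R, t = kabsch(template, depth)
print(np.allclose(R, R_true) and np.allclose(t, t_true))  # True
```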

  7. Anatomy assisted PET image reconstruction incorporating multi-resolution joint entropy

    NASA Astrophysics Data System (ADS)

    Tang, Jing; Rahmim, Arman

    2015-01-01

    A promising approach in PET image reconstruction is to incorporate high resolution anatomical information (measured from MR or CT), using anato-functional similarity measures such as mutual information or joint entropy (JE) as the prior. These similarity measures only classify voxels based on intensity values, while neglecting structural spatial information. In this work, we developed an anatomy-assisted maximum a posteriori (MAP) reconstruction algorithm wherein the JE measure is supplied with spatial information generated using wavelet multi-resolution analysis. The proposed wavelet-based JE (WJE) MAP algorithm involves calculation of derivatives of the subband JE measures with respect to individual PET image voxel intensities, which we have shown can be computed very similarly to how the inverse wavelet transform is implemented. We performed a simulation study with the BrainWeb phantom, creating PET data corresponding to different noise levels. Realistically simulated T1-weighted MR images provided by BrainWeb modeling were applied in the anatomy-assisted reconstruction with the WJE-MAP algorithm and the intensity-only JE-MAP algorithm. Quantitative analysis showed that the WJE-MAP algorithm performed similarly to the JE-MAP algorithm at the low noise level in the gray matter (GM) and white matter (WM) regions in terms of the noise versus bias tradeoff. When noise increased to the medium level in the simulated data, the WJE-MAP algorithm started to surpass the JE-MAP algorithm in the GM region, which is less uniform with smaller isolated structures compared to the WM region. In the high noise level simulation, the WJE-MAP algorithm presented clear improvement over the JE-MAP algorithm in both the GM and WM regions. In addition to the simulation study, we applied the reconstruction algorithms to real patient studies involving DPA-713 PET data and Florbetapir PET data with corresponding T1-weighted MPRAGE MR images.
Compared to the intensity-only JE-MAP algorithm, the WJE-MAP algorithm resulted in comparable regional mean values to those from the maximum likelihood algorithm while reducing noise. Achieving robust performance in various noise-level simulation and patient studies, the WJE-MAP algorithm demonstrates its potential in clinical quantitative PET imaging.

  8. Correlation between resting state fMRI total neuronal activity and PET metabolism in healthy controls and patients with disorders of consciousness.

    PubMed

    Soddu, Andrea; Gómez, Francisco; Heine, Lizette; Di Perri, Carol; Bahri, Mohamed Ali; Voss, Henning U; Bruno, Marie-Aurélie; Vanhaudenhuyse, Audrey; Phillips, Christophe; Demertzi, Athena; Chatelle, Camille; Schrouff, Jessica; Thibaut, Aurore; Charland-Verville, Vanessa; Noirhomme, Quentin; Salmon, Eric; Tshibanda, Jean-Flory Luaba; Schiff, Nicholas D; Laureys, Steven

    2016-01-01

    The mildly invasive 18F-fluorodeoxyglucose positron emission tomography (FDG-PET) is a well-established imaging technique to measure 'resting state' cerebral metabolism. This technique has made it possible to assess changes in metabolic activity in clinical applications, such as the study of severe brain injury and disorders of consciousness. We assessed the possibility of creating functional MRI activity maps that could estimate the relative levels of activity in FDG-PET cerebral metabolic maps. Although no absolute metabolic measures can be extracted, our approach may still be of clinical use in centers without access to FDG-PET. It also overcomes the problem of recognizing individual networks during independent component selection in functional magnetic resonance imaging (fMRI) resting state analysis. We extracted resting state fMRI functional connectivity maps using independent component analysis and combined only components of neuronal origin. To assess whether components were of neuronal origin, a classifier based on a support vector machine (SVM) was used. We compared the generated maps with the FDG-PET maps in 16 healthy controls, 11 vegetative state/unresponsive wakefulness syndrome patients and four locked-in patients. The results show a significant similarity between the FDG-PET and the fMRI-based maps, with ρ = 0.75 ± 0.05 for healthy controls and ρ = 0.58 ± 0.09 for vegetative state/unresponsive wakefulness syndrome patients. FDG-PET, fMRI neuronal maps, and the conjunction analysis show decreases in frontoparietal and medial regions in vegetative patients with respect to controls. Subsequent analysis in locked-in syndrome patients also produced maps consistent with those of healthy controls. The constructed resting state fMRI functional connectivity map points toward the possibility of using resting state fMRI to estimate relative levels of activity in a metabolic map.

  9. Noninvasive Assessment of Oxygen Extraction Fraction in Chronic Ischemia Using Quantitative Susceptibility Mapping at 7 Tesla.

    PubMed

    Uwano, Ikuko; Kudo, Kohsuke; Sato, Ryota; Ogasawara, Kuniaki; Kameda, Hiroyuki; Nomura, Jun-Ichi; Mori, Futoshi; Yamashita, Fumio; Ito, Kenji; Yoshioka, Kunihiro; Sasaki, Makoto

    2017-08-01

    The oxygen extraction fraction (OEF) is an effective metric for evaluating metabolic reserve in chronic ischemia. However, OEF is considered to be accurately measured only by positron emission tomography (PET). Thus, we investigated whether OEF maps generated by magnetic resonance quantitative susceptibility mapping (QSM) at 7 Tesla enabled detection of OEF changes when compared with those obtained with PET. Forty-one patients with chronic stenosis/occlusion of the unilateral internal carotid artery or middle cerebral artery were examined using 7 Tesla MRI and PET scanners. QSM images were obtained from 3-dimensional T2*-weighted images using a multiple dipole-inversion algorithm. OEF maps were generated based on susceptibility differences between venous structures and brain tissue on the QSM images. OEF ratios of the ipsilateral middle cerebral artery territory against the contralateral side were calculated on the QSM-OEF and PET-OEF images using an anatomic template. The OEF ratio in the middle cerebral artery territory showed significant correlations between the QSM-OEF and PET-OEF maps (r = 0.69; P < 0.001), especially in patients with a substantial increase in the PET-OEF ratio of >1.09 (r = 0.79; P = 0.004), albeit with significant systematic biases in the agreement. An increased QSM-OEF ratio of >1.09, as determined by receiver operating characteristic analysis, showed a sensitivity and specificity of 0.82 and 0.86, respectively, for a substantial increase in the PET-OEF ratio. Absolute QSM-OEF values were significantly correlated with PET-OEF values in the patients with increased PET-OEF. OEF ratios on QSM-OEF images at 7 Tesla showed a good correlation with those on PET-OEF images in patients with unilateral steno-occlusive internal carotid artery/middle cerebral artery lesions, suggesting that noninvasive OEF measurement by MRI can be a substitute for PET. © 2017 American Heart Association, Inc.

  10. Mixture model based joint-MAP reconstruction of attenuation and activity maps in TOF-PET

    NASA Astrophysics Data System (ADS)

    Hemmati, H.; Kamali-Asl, A.; Ghafarian, P.; Ay, M. R.

    2018-06-01

    A challenge in obtaining quantitative positron emission tomography (PET) images is providing an accurate and patient-specific photon attenuation correction. In PET/MR scanners, the nature of MR signals and hardware limitations make extraction of the attenuation map a real challenge. Except for a constant scaling factor, the activity and attenuation maps on a TOF-PET system can be determined from emission data alone by the maximum likelihood reconstruction of attenuation and activity (MLAA) approach. The aim of the present study is to constrain the joint estimation of activity and attenuation using a mixture model prior based on the attenuation map histogram. This novel prior enforces non-negativity, and its hyperparameters can be estimated by a mixture decomposition step applied to the current estimate of the attenuation map. The proposed method can also help solve the scaling problem, and it is capable of assigning predefined regional attenuation coefficients, with some degree of confidence, to the attenuation map, similar to segmentation-based attenuation correction approaches. The performance of the algorithm was studied with numerical and Monte Carlo simulations and a phantom experiment, and was compared with the MLAA algorithm with and without a smoothing prior. The results demonstrate that the proposed algorithm is capable of producing cross-talk-free activity and attenuation images from emission data. The proposed approach has the potential to be a practical and competitive method for joint reconstruction of activity and attenuation maps from emission data on PET/MR and can be integrated with other methods.
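The mixture-decomposition step on the attenuation-map histogram can be illustrated with a minimal 1D expectation-maximization fit. This is a generic sketch, not the paper's prior: two Gaussian classes stand in for air and soft tissue, with made-up attenuation values around 0 and 0.0975 cm⁻¹ (a typical soft-tissue value at 511 keV).

```python
import numpy as np

def em_gmm_1d(x, mu, sigma, pi, iters=50):
    """Minimal 1D EM for a Gaussian mixture, standing in for the
    mixture-decomposition step on the attenuation-map histogram."""
    for _ in range(iters):
        # E-step: per-sample class responsibilities (softmax of log-probs)
        d = x[:, None] - mu[None, :]
        logp = -0.5 * (d / sigma) ** 2 - np.log(sigma) + np.log(pi)
        logp -= logp.max(axis=1, keepdims=True)
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: update mixture weights, means and standard deviations
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk) + 1e-9
    return mu, sigma, pi

rng = np.random.default_rng(0)
# Synthetic mu-map voxels: air (~0) and soft tissue (~0.0975 1/cm)
x = np.concatenate([rng.normal(0.0, 0.002, 4000),
                    rng.normal(0.0975, 0.004, 6000)])
mu, sigma, pi = em_gmm_1d(x, mu=np.array([0.01, 0.08]),
                          sigma=np.array([0.01, 0.01]),
                          pi=np.array([0.5, 0.5]))
print(np.sort(mu))   # recovers the air and soft-tissue class means
```

The fitted class means and weights would then parameterize the prior that pulls voxels toward their most probable tissue coefficients.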

  11. Towards improved hardware component attenuation correction in PET/MR hybrid imaging

    NASA Astrophysics Data System (ADS)

    Paulus, D. H.; Tellmann, L.; Quick, H. H.

    2013-11-01

    In positron emission tomography/computed tomography (PET/CT) hybrid imaging attenuation correction (AC) of the patient tissue and patient table is performed by converting the CT-based Hounsfield units (HU) to linear attenuation coefficients (LAC) of PET. When applied to the new field of hardware component AC in PET/magnetic resonance (MR) hybrid imaging, this conversion method may result in local overcorrection of PET activity values. The aim of this study thus was to optimize the conversion parameters for CT-based AC of hardware components in PET/MR. Systematic evaluation and optimization of the HU to LAC conversion parameters has been performed for the hardware component attenuation map (µ-map) of a flexible radiofrequency (RF) coil used in PET/MR imaging. Furthermore, spatial misregistration of this RF coil to its µ-map was simulated by shifting the µ-map in different directions and the effect on PET quantification was evaluated. Measurements of a PET NEMA standard emission phantom were performed on an integrated hybrid PET/MR system. Various CT parameters were used to calculate different µ-maps for the flexible RF coil and to evaluate the impact on the PET activity concentration. A 511 keV transmission scan of the local RF coil was used as standard of reference to adapt the slope of the conversion from HUs to LACs at 511 keV. The average underestimation of the PET activity concentration due to the non-attenuation corrected RF coil in place was calculated to be 5.0% in the overall phantom. When considering attenuation only in the upper volume of the phantom, the average difference to the reference scan without RF coil is 11.0%. When the PET/CT conversion is applied, an average overestimation of 3.1% (without extended CT scale) and 4.2% (with extended CT scale) is observed in the top volume of the NEMA phantom. 
Using the adapted conversion resulting from this study, the deviation in the top volume of the phantom is reduced to -0.5% and shows the lowest standard deviation inside the phantom in comparison to all other conversions. Simulation of a µ-map misregistration shows acceptable results for shifts below 5 mm for the flexible surface RF coil. The adapted conversion from HUs to LAC at 511 keV within this study can improve hardware component AC in PET/MR hybrid imaging as shown for a flexible RF surface coil. Furthermore, these results have a direct impact on the improvement of the hardware component AC of the examined flexible RF coil in conjunction with position determination.
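The HU-to-LAC conversion whose slope is adapted in this study is typically a bilinear mapping. A sketch follows, with illustrative coefficients (water at 0.096 cm⁻¹ and a high-HU slope of 5.1e-5 are assumptions for the example, not the study's fitted values).

```python
def hu_to_lac_511kev(hu, high_slope=5.1e-5, break_hu=0.0):
    """Bilinear HU -> linear attenuation coefficient (1/cm) at 511 keV.

    Below the break, scale linearly so that -1000 HU (air) -> 0 and
    0 HU (water) -> 0.096 1/cm; above it, apply an adjustable slope of
    the kind tuned here against a 511 keV transmission scan."""
    mu_water = 0.096
    if hu <= break_hu:
        return max(0.0, mu_water * (hu + 1000.0) / 1000.0)
    return mu_water + (hu - break_hu) * high_slope

print(round(hu_to_lac_511kev(-1000), 4))  # air -> 0.0
print(round(hu_to_lac_511kev(0), 4))      # water -> 0.096
print(round(hu_to_lac_511kev(1000), 4))   # dense, bone-like -> 0.147
```

Lowering `high_slope` is the kind of adaptation that reduces the overcorrection reported for coil hardware containing dense materials.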

  12. High-Resolution Underwater Mapping Using Side-Scan Sonar

    PubMed Central

    2016-01-01

    The goal of this study is to generate high-resolution sea-floor maps using a side-scan sonar (SSS). This is achieved by explicitly taking the SSS operation into account as follows. First, the raw sensor data is corrected by means of a physics-based SSS model. Second, the data is projected onto the sea floor. The errors involved in this projection are thoroughly analysed. Third, a probabilistic SSS model is defined and used to estimate the probability of each sea-floor region being observed. This probabilistic information is then used to weight the contribution of each SSS measurement to the map. Thanks to these models, arbitrary map resolutions can be achieved, even beyond the sensor resolution. Finally, a geometric map building method is presented and combined with the probabilistic approach. The resulting map is composed of two layers. The echo intensity layer holds the most likely echo intensities at each point on the sea floor. The probabilistic layer contains information about how confident the user or the higher control layers can be about the echo intensity layer data. Experiments were conducted in a large subsea region. PMID:26821379
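The probability-weighted fusion of measurements into the two map layers can be sketched as follows; this is a minimal illustration with a hypothetical grid and made-up measurements, not the paper's implementation.

```python
import numpy as np

def fuse_measurements(grid_shape, cells, intensities, probs):
    """Probability-weighted fusion of side-scan echo intensities.

    Each measurement contributes to its grid cell with a weight equal
    to the probability that the cell was actually observed; the map
    keeps an intensity layer and a confidence (total-weight) layer."""
    weight = np.zeros(grid_shape)
    accum = np.zeros(grid_shape)
    for (i, j), z, p in zip(cells, intensities, probs):
        accum[i, j] += p * z
        weight[i, j] += p
    intensity = np.divide(accum, weight, out=np.zeros_like(accum),
                          where=weight > 0)
    return intensity, weight

# Two overlapping measurements of cell (0,0) and one of cell (1,1).
cells = [(0, 0), (0, 0), (1, 1)]
inten, conf = fuse_measurements((2, 2), cells,
                                intensities=[10.0, 20.0, 5.0],
                                probs=[0.9, 0.1, 0.5])
print(inten[0, 0], conf[0, 0])   # 11.0 1.0
```

The high-probability measurement dominates the fused intensity, while the confidence layer records how much total observation weight each cell received.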

  13. Quality control for quantitative multicenter whole-body PET/MR studies: A NEMA image quality phantom study with three current PET/MR systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boellaard, Ronald, E-mail: r.boellaard@vumc.nl; European Association of Nuclear Medicine Research Ltd., Vienna 1060; European Association of Nuclear Medicine Physics Committee, Vienna 1060

    2015-10-15

    Purpose: Integrated positron emission tomography/magnetic resonance (PET/MR) systems derive the PET attenuation correction (AC) from dedicated MR sequences. While MR-AC performs reasonably well in clinical patient imaging, it may fail for phantom-based quality control (QC). The authors assess the applicability of different protocols for PET QC in multicenter PET/MR imaging. Methods: The National Electrical Manufacturers Association NU 2 2007 image quality phantom was imaged on three combined PET/MR systems: a Philips Ingenuity TF PET/MR, a Siemens Biograph mMR, and a GE SIGNA PET/MR (prototype) system. The phantom was filled according to the EANM FDG-PET/CT guideline 1.0 and scanned for 5 min over 1 bed position. Two MR-AC imaging protocols were tested: standard clinical procedures and a dedicated protocol for phantom tests. Depending on the system, the dedicated phantom protocol employs a two-class (water and air) segmentation of the MR data or a CT-based template. Differences in attenuation and SUV recovery coefficients (RC) are reported. PET/CT-based simulations were performed to reproduce the various artifacts seen in the AC maps (μ-maps) and their impact on the accuracy of phantom-based QC. Results: Clinical MR-AC protocols caused substantial errors and artifacts in the AC maps, resulting in underestimations of the reconstructed PET activity of up to 27%, depending on the PET/MR system. Using dedicated phantom MR-AC protocols, PET bias was reduced to −8%. Mean and max SUV RC met EARL multicenter PET performance specifications for most contrast objects, but only when using the dedicated phantom protocol. Simulations confirmed that the bias in the experimental data was caused by incorrect AC maps resulting from the use of clinical MR-AC protocols. Conclusions: Phantom-based quality control of PET/MR systems in a multicenter, multivendor setting may be performed with sufficient accuracy, but only when dedicated phantom acquisition and processing protocols are used for attenuation correction.

  14. Quantitatively Mapping Cellular Viscosity with Detailed Organelle Information via a Designed PET Fluorescent Probe

    PubMed Central

    Liu, Tianyu; Liu, Xiaogang; Spring, David R.; Qian, Xuhong; Cui, Jingnan; Xu, Zhaochao

    2014-01-01

    Viscosity is a fundamental physical parameter that influences diffusion in biological processes. The distribution of intracellular viscosity is highly heterogeneous, and it is challenging to obtain a full map of cellular viscosity with detailed organelle information. In this work, we report probe 1, the first fluorescent viscosity probe able to quantitatively map cellular viscosity with detailed organelle information based on the PET mechanism. This probe exhibited a significant ratiometric fluorescence intensity enhancement as solvent viscosity increased. The intensity increase was attributed to the combined effects of the inhibition of PET due to restricted conformational access (favorable for FRET, but not for PET) and the decreased PET efficiency caused by viscosity-dependent twisted intramolecular charge transfer (TICT). A full map of subcellular viscosity was successfully constructed via fluorescent ratiometric detection and fluorescence lifetime imaging; lysosomal regions were found to possess the highest viscosity in the cell, followed by mitochondrial regions. PMID:24957323

  15. Probabilistic BPRRC: Robust Change Detection against Illumination Changes and Background Movements

    NASA Astrophysics Data System (ADS)

    Yokoi, Kentaro

    This paper presents Probabilistic Bi-polar Radial Reach Correlation (PrBPRRC), a change detection method that is robust against illumination changes and background movements. Most traditional change detection methods are robust against either illumination changes or background movements; BPRRC is one of the illumination-robust change detection methods. We introduce a probabilistic background texture model into BPRRC and add robustness against background movements, including foreground invasions such as moving cars, walking people, swaying trees, and falling snow. We show the superiority of PrBPRRC in environments with illumination changes and background movements using three public datasets and one private dataset: ATON Highway data, Karlsruhe traffic sequence data, PETS 2007 data, and Walking-in-a-room data.

  16. Automated segmentation of the prostate in 3D MR images using a probabilistic atlas and a spatially constrained deformable model.

    PubMed

    Martin, Sébastien; Troccaz, Jocelyne; Daanen, Vincent

    2010-04-01

    The authors present a fully automatic algorithm for the segmentation of the prostate in three-dimensional magnetic resonance (MR) images. The approach requires an anatomical atlas, which is built by computing transformation fields mapping a set of manually segmented images to a common reference. These transformation fields are then applied to the manually segmented structures of the training set in order to obtain a probabilistic map on the atlas. The segmentation is then realized through a two-stage procedure. In the first stage, the processed image is registered to the probabilistic atlas. Subsequently, a probabilistic segmentation is obtained by mapping the probabilistic map of the atlas to the patient's anatomy. In the second stage, a deformable surface evolves toward the prostate boundaries by merging information coming from the probabilistic segmentation, an image feature model, and a statistical shape model. During the evolution of the surface, the probabilistic segmentation introduces a spatial constraint that prevents the deformable surface from leaking into an unlikely configuration. The proposed method is evaluated on 36 exams that were manually segmented by a single expert. A median Dice similarity coefficient of 0.86 and an average surface error of 2.41 mm are achieved. By merging prior knowledge, the presented method achieves a robust and completely automatic segmentation of the prostate in MR images. Results show that the use of a spatial constraint increases the robustness of the deformable model compared to a deformable surface driven only by an image appearance model.
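    The Dice similarity coefficient used to evaluate the segmentations is a standard overlap measure, 2|A∩B| / (|A| + |B|). A minimal NumPy sketch with toy masks:

```python
import numpy as np

def dice_coefficient(mask_a, mask_b):
    """Dice similarity coefficient between two binary segmentation masks:
    2 * |A intersect B| / (|A| + |B|)."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Toy 1D example: two 5-voxel masks sharing 4 voxels -> Dice = 0.8
auto = np.array([1, 1, 1, 1, 1, 0, 0], dtype=bool)
manual = np.array([0, 1, 1, 1, 1, 1, 0], dtype=bool)
```

    In practice the masks are 3D label volumes, but the formula is identical.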

  17. Improving accuracy of simultaneously reconstructed activity and attenuation maps using deep learning.

    PubMed

    Hwang, Donghwi; Kim, Kyeong Yun; Kang, Seung Kwan; Seo, Seongho; Paeng, Jin Chul; Lee, Dong Soo; Lee, Jae Sung

    2018-02-15

    Simultaneous reconstruction of activity and attenuation using the maximum likelihood reconstruction of activity and attenuation (MLAA) augmented by time-of-flight (TOF) information is a promising method for positron emission tomography (PET) attenuation correction. However, it still suffers from several problems, including crosstalk artifacts, slow convergence speed, and noisy attenuation maps (μ-maps). In this work, we developed deep convolutional neural networks (CNNs) to overcome these MLAA limitations, and we verified their feasibility using a clinical brain PET data set. Methods: We applied the proposed method to one of the most challenging PET cases for simultaneous image reconstruction (18F-FP-CIT PET scans with highly specific binding to the striatum of the brain). Three different CNN architectures (convolutional autoencoder [CAE], U-net, and a hybrid of CAE and U-net) were designed and trained to learn the x-ray computed tomography (CT)-derived μ-map (μ-CT) from the MLAA-generated activity distribution and μ-map (μ-MLAA). PET/CT data of 40 patients with suspected Parkinson's disease were employed for five-fold cross-validation. For the training of the CNNs, 800,000 transverse PET slices and CTs augmented from 32 patient data sets were used. The similarity to μ-CT of the CNN-generated μ-maps (μ-CAE, μ-Unet, and μ-Hybrid) and μ-MLAA was compared using Dice similarity coefficients. In addition, we compared the activity concentration of specific (striatum) and non-specific binding regions (cerebellum and occipital cortex) and the binding ratios in the striatum in the PET activity images reconstructed using those μ-maps. Results: The CNNs generated less noisy and more uniform μ-maps than the original μ-MLAA. Moreover, air cavities and bones were better resolved in the proposed CNN outputs. In addition, the proposed deep learning approach was useful for mitigating the crosstalk problem in the MLAA reconstruction. 
The hybrid network of CAE and U-net yielded the μ-maps most similar to μ-CT (Dice similarity coefficient in the whole head = 0.79 in bone and 0.72 in air cavities), resulting in only approximately 5% errors in activity and binding ratio quantification. Conclusion: The proposed deep learning approach is promising for accurate attenuation correction of activity distribution in TOF PET systems. Copyright © 2018 by the Society of Nuclear Medicine and Molecular Imaging, Inc.

  18. Attenuation correction in 4D-PET using a single-phase attenuation map and rigidity-adaptive deformable registration

    PubMed Central

    Kalantari, Faraz; Wang, Jing

    2017-01-01

    Purpose Four-dimensional positron emission tomography (4D-PET) imaging is a potential solution to the respiratory motion effect in the thoracic region. Computed tomography (CT)-based attenuation correction (AC) is an essential step toward quantitative imaging for PET. However, due to the temporal difference between 4D-PET and the single attenuation map from CT typically available in routine clinical scanning, motion artifacts are observed in the attenuation-corrected PET images, leading to errors in tumor shape and uptake. We introduced a practical method to align the single-phase CT with all other 4D-PET phases for AC. Methods A penalized non-rigid Demons registration between individual 4D-PET frames without AC provides the motion vectors used for warping the single-phase attenuation map. The non-rigid Demons registration was used to derive deformation vector fields (DVFs) between the PET frame matched to the CT phase and the other 4D-PET images. While the non-attenuation-corrected PET images provide useful data for organ borders such as those of the lung and the liver, tumors cannot be distinguished from the background due to loss of contrast. To preserve the tumor shape in different phases, an ROI covering the tumor was excluded from the non-rigid transformation; instead, the mean DVF of the central region of the tumor was assigned to all voxels in the ROI. This process mimics a rigid transformation of the tumor along with a non-rigid transformation of the other organs. A 4D-XCAT phantom with spherical lung tumors, with diameters ranging from 10 to 40 mm, was used to evaluate the algorithm. The performance of the proposed hybrid method for attenuation map estimation was compared to (1) Demons non-rigid registration alone and (2) a single attenuation map, based on quantitative parameters in individual PET frames. Results Motion-related artifacts were significantly reduced in the attenuation-corrected 4D-PET images. 
When a single attenuation map was used for all individual PET frames, the normalized root mean square error (NRMSE) values in the tumor region were 49.3% (STD: 8.3%), 50.5% (STD: 9.3%), 51.8% (STD: 10.8%) and 51.5% (STD: 12.1%) for 10-mm, 20-mm, 30-mm and 40-mm tumors, respectively. These errors were reduced to 11.9% (STD: 2.9%), 13.6% (STD: 3.9%), 13.8% (STD: 4.8%), and 16.7% (STD: 9.3%) by our proposed method for deforming the attenuation map. The relative errors in total lesion glycolysis (TLG) values were −0.25% (STD: 2.87%) and 3.19% (STD: 2.35%) for 30-mm and 40-mm tumors, respectively, with the proposed method; the corresponding values for the Demons method were 25.22% (STD: 14.79%) and 18.42% (STD: 7.06%). Our proposed hybrid method outperforms the Demons method especially for larger tumors; for tumors smaller than 20 mm, the non-rigid transformation alone could also provide quantitative results. Conclusion Although non-AC 4D-PET frames contain little anatomical information, they are still useful for estimating the DVFs needed to align the attenuation map for accurate AC. The proposed hybrid method can mitigate AC-related artifacts and provide quantitative AC-PET images. PMID:27987223
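    The NRMSE figures above can be computed with a short helper; the normalization convention (here, the mean of the reference values) is an assumption, since the abstract does not define it:

```python
import numpy as np

def nrmse_percent(estimate, reference):
    """Normalized root-mean-square error in percent, normalized by the
    mean of the reference values (normalization convention assumed)."""
    estimate = np.asarray(estimate, dtype=float)
    reference = np.asarray(reference, dtype=float)
    rmse = np.sqrt(np.mean((estimate - reference) ** 2))
    return 100.0 * rmse / reference.mean()
```

    In the study this would be evaluated over the voxels of the tumor region, comparing each corrected frame against the ground-truth phantom activity.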

  19. Investigating the state-of-the-art in whole-body MR-based attenuation correction: an intra-individual, inter-system, inventory study on three clinical PET/MR systems.

    PubMed

    Beyer, Thomas; Lassen, Martin L; Boellaard, Ronald; Delso, Gaspar; Yaqub, Maqsood; Sattler, Bernhard; Quick, Harald H

    2016-02-01

    We assess inter- and intra-subject variability of magnetic resonance (MR)-based attenuation maps (MRμMaps) of human subjects for state-of-the-art positron emission tomography (PET)/MR imaging systems. Four healthy male subjects underwent repeated MR imaging with a Siemens Biograph mMR, a Philips Ingenuity TF, and a GE SIGNA PET/MR system, using product-specific MR sequences and image processing algorithms for generating MRμMaps. Total lung volumes and mean attenuation values in nine thoracic reference regions were calculated. Linear regression was used for comparing lung volumes on MRμMaps. Intra- and inter-system variability was investigated using a mixed effects model. Intra-system variability was seen for the lung volume of some subjects (p = 0.29). Mean attenuation values across subjects were significantly different (p < 0.001) due to different segmentations of the trachea. Differences in the attenuation values caused noticeable intra-individual and inter-system differences that translated into a subsequent bias of the corrected PET activity values, as verified by independent simulations. Significant differences of MRμMaps generated for the same subjects but different PET/MR systems resulted in differences in attenuation correction factors, particularly in the thorax. These differences currently limit the quantitative use of PET/MR in multi-center imaging studies.

  20. Evaluation of Atlas-Based Attenuation Correction for Integrated PET/MR in Human Brain: Application of a Head Atlas and Comparison to True CT-Based Attenuation Correction.

    PubMed

    Sekine, Tetsuro; Buck, Alfred; Delso, Gaspar; Ter Voert, Edwin E G W; Huellner, Martin; Veit-Haibach, Patrick; Warnock, Geoffrey

    2016-02-01

    Attenuation correction (AC) for integrated PET/MR imaging in the human brain is still an open problem. In this study, we evaluated a simplified atlas-based AC (Atlas-AC) by comparing (18)F-FDG PET data corrected using either Atlas-AC or true CT data (CT-AC). We enrolled 8 patients (median age, 63 y). All patients underwent clinically indicated whole-body (18)F-FDG PET/CT for staging, restaging, or follow-up of malignant disease. All patients volunteered for an additional PET/MR of the head (no additional tracer was injected). For each patient, 2 AC maps were generated: an Atlas-AC map, registered to a patient-specific liver accelerated volume acquisition (LAVA)-Flex MR sequence using a vendor-provided head atlas generated from multiple CT head images, and a CT-based AC map. For comparison, the CT-AC map generated from PET/CT was coregistered to the Atlas-AC map. PET images were reconstructed from the list-mode raw data from the PET/MR imaging scanner using each AC map. All PET images were normalized to the SPM5 PET template, and (18)F-FDG accumulation was quantified in 67 volumes of interest (VOIs; automated anatomic labeling atlas). The relative difference (%diff) between images based on Atlas-AC and CT-AC was calculated, and averaged difference images were generated. (18)F-FDG uptake in all VOIs was compared using Bland-Altman analysis. The range of error in all 536 VOIs was -3.0%-7.3%. Whole-brain (18)F-FDG uptake based on Atlas-AC was slightly underestimated (%diff = 2.19% ± 1.40%). The underestimation was most pronounced in the regions below the anterior/posterior commissure line, such as the cerebellum, temporal lobe, and central structures (%diff = 3.69% ± 1.43%, 3.25% ± 1.42%, and 3.05% ± 1.18%, respectively), suggesting that Atlas-AC tends to underestimate the attenuation values of the skull base bone. When compared with the gold-standard CT-AC, errors introduced using Atlas-AC did not exceed 8% in any brain region investigated. 
Underestimation of (18)F-FDG uptake was minor (<4%) but significant in regions near the skull base. © 2016 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
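    The per-VOI relative difference (%diff) is a simple ratio; the sign convention below (positive = Atlas-AC underestimates uptake relative to CT-AC) is assumed from the reported underestimations:

```python
import numpy as np

def percent_diff(uptake_ct_ac, uptake_atlas_ac):
    """Relative difference (%) per VOI between CT-AC and Atlas-AC uptake.
    Sign convention (assumed): positive means Atlas-AC underestimates."""
    ct = np.asarray(uptake_ct_ac, dtype=float)
    atlas = np.asarray(uptake_atlas_ac, dtype=float)
    return 100.0 * (ct - atlas) / ct
```

    Applied to 67 VOIs per patient, this yields the per-region error distributions summarized in the abstract.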

  1. Effects of ferumoxytol on quantitative PET measurements in simultaneous PET/MR whole-body imaging: a pilot study in a baboon model.

    PubMed

    Borra, Ronald Jh; Cho, Hoon-Sung; Bowen, Spencer L; Attenberger, Ulrike; Arabasz, Grae; Catana, Ciprian; Josephson, Lee; Rosen, Bruce R; Guimaraes, Alexander R; Hooker, Jacob M

    2015-12-01

    Simultaneous PET/MR imaging depends on MR-derived attenuation maps (μ-maps) for accurate attenuation correction of PET data. Currently, these maps are derived from gradient-echo-based MR sequences, which are sensitive to susceptibility changes. Iron oxide magnetic nanoparticles have been used in the measurement of blood volume, tumor microvasculature, and tumor-associated macrophages, and in characterizing lymph nodes. Our aim in this study was to assess whether the susceptibility effects associated with iron oxide nanoparticles can potentially affect measured (18)F-FDG PET standardized uptake values (SUV) through effects on MR-derived attenuation maps. The study protocol was approved by the Institutional Animal Care and Use Committee. Using a Siemens Biograph mMR PET/MR scanner, we evaluated the effects of increasing concentrations of ferumoxytol and ferumoxytol aggregates on MR-derived μ-maps using an agarose phantom. In addition, we performed a baboon experiment evaluating the effects of a single i.v. ferumoxytol dose (10 mg/kg) on liver, spleen, and pancreas (18)F-FDG SUV at baseline (ferumoxytol-naïve), within the first hour, and at 1, 3, 5, and 11 weeks. Phantom experiments showed μ-map artifacts starting at ferumoxytol aggregate concentrations of 10 to 20 mg/kg. The in vivo baboon data demonstrated a 53% decrease of observed (18)F-FDG SUV compared to baseline within the first hour in the liver, persisting at least 11 weeks. A single ferumoxytol dose can affect measured SUV for at least 3 months, which should be taken into account when administering ferumoxytol in patients needing sequential PET/MR scans. Advances in knowledge: 1. Ferumoxytol aggregates, but not ferumoxytol alone, produce significant artifacts in MR-derived attenuation correction maps at approximate clinical dose levels of 10 mg/kg. 2. 
When performing simultaneous whole-body (18)F-FDG PET/MR, a single dose of ferumoxytol can result in observed SUV decreases of up to 53%, depending on the amount of ferumoxytol aggregates in the studied tissue. Implications for patient care: Administration of a single, clinically relevant dose of ferumoxytol can potentially result in changes in observed SUV for a prolonged period of time in the setting of simultaneous PET/MR. These potential changes should be considered in particular when administering ferumoxytol to patients with expected future PET/MR studies, as ferumoxytol-induced SUV changes might interfere with therapy assessment.

  2. Multi-Atlas-Based Attenuation Correction for Brain 18F-FDG PET Imaging Using a Time-of-Flight PET/MR Scanner: Comparison with Clinical Single-Atlas- and CT-Based Attenuation Correction.

    PubMed

    Sekine, Tetsuro; Burgos, Ninon; Warnock, Geoffrey; Huellner, Martin; Buck, Alfred; Ter Voert, Edwin E G W; Cardoso, M Jorge; Hutton, Brian F; Ourselin, Sebastien; Veit-Haibach, Patrick; Delso, Gaspar

    2016-08-01

    In this work, we assessed the feasibility of attenuation correction (AC) based on a multi-atlas-based method (m-Atlas) by comparing it with a clinical AC method (single-atlas-based method [s-Atlas]) on a time-of-flight (TOF) PET/MRI scanner. We enrolled 15 patients. The median patient age was 59 y (age range, 31-80 y). All patients underwent clinically indicated whole-body (18)F-FDG PET/CT for staging, restaging, or follow-up of malignant disease. All patients volunteered for an additional PET/MRI scan of the head (no additional tracer was injected). For each patient, 3 AC maps were generated. Both the s-Atlas and m-Atlas AC maps were generated from the same patient-specific LAVA-Flex T1-weighted images, acquired by default on the PET/MRI scanner during the first 18 s of the PET scan. The s-Atlas AC map was extracted by the PET/MRI scanner, and the m-Atlas AC map was created using a Web service tool that automatically generates m-Atlas pseudo-CT images. For comparison, the AC map generated by PET/CT was registered and used as a gold standard. PET images were reconstructed from raw data on the TOF PET/MRI scanner using each AC map. All PET images were normalized to the SPM5 PET template, and (18)F-FDG accumulation was quantified in 67 volumes of interest (VOIs; automated anatomic labeling atlas). Relative (%diff) and absolute (|%diff|) differences between images based on each atlas AC and CT-AC were calculated. (18)F-FDG uptake in all VOIs and generalized merged VOIs was compared using the paired t test and Bland-Altman analysis. The range of error on m-Atlas in all 1,005 VOIs was -4.99% to 4.09%. The |%diff| on the m-Atlas was improved by about 20% compared with s-Atlas (s-Atlas vs. m-Atlas: 1.49% ± 1.06% vs. 1.21% ± 0.89%, P < 0.01). In generalized VOIs, %diff on m-Atlas in the temporal lobe and cerebellum was significantly smaller (s-Atlas vs. m-Atlas: temporal lobe, 1.49% ± 1.37% vs. -0.37% ± 1.41%, P < 0.01; cerebellum, 1.55% ± 1.97% vs. 
-1.15% ± 1.72%, P < 0.01). The errors introduced using either s-Atlas or m-Atlas did not exceed 5% in any brain region investigated. When compared with the clinical s-Atlas, m-Atlas is more accurate, especially in regions close to the skull base. © 2016 by the Society of Nuclear Medicine and Molecular Imaging, Inc.

  3. Vision 20/20: Magnetic resonance imaging-guided attenuation correction in PET/MRI: Challenges, solutions, and opportunities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mehranian, Abolfazl; Arabi, Hossein; Zaidi, Habib, E-mail: habib.zaidi@hcuge.ch

    Attenuation correction is an essential component of the long chain of data correction techniques required to achieve the full potential of quantitative positron emission tomography (PET) imaging. The development of combined PET/magnetic resonance imaging (MRI) systems mandated the widespread interest in developing novel strategies for deriving accurate attenuation maps with the aim to improve the quantitative accuracy of these emerging hybrid imaging systems. The attenuation map in PET/MRI should ideally be derived from anatomical MR images; however, MRI intensities reflect proton density and relaxation time properties of biological tissues rather than their electron density and photon attenuation properties. Therefore, in contrast to PET/computed tomography, there is a lack of standardized global mapping between the intensities of MRI signal and linear attenuation coefficients at 511 keV. Moreover, in standard MRI sequences, bones and lung tissues do not produce measurable signals owing to their low proton density and short transverse relaxation times. MR images are also inevitably subject to artifacts that degrade their quality, thus compromising their applicability for the task of attenuation correction in PET/MRI. MRI-guided attenuation correction strategies can be classified in three broad categories: (i) segmentation-based approaches, (ii) atlas-registration and machine learning methods, and (iii) emission/transmission-based approaches. This paper summarizes past and current state-of-the-art developments and latest advances in PET/MRI attenuation correction. The advantages and drawbacks of each approach for addressing the challenges of MR-based attenuation correction are comprehensively described. The opportunities brought by both MRI and PET imaging modalities for deriving accurate attenuation maps and improving PET quantification will be elaborated. 
Future prospects and potential clinical applications of these techniques and their integration in commercial systems will also be discussed.

  4. Vision 20/20: Magnetic resonance imaging-guided attenuation correction in PET/MRI: Challenges, solutions, and opportunities.

    PubMed

    Mehranian, Abolfazl; Arabi, Hossein; Zaidi, Habib

    2016-03-01

    Attenuation correction is an essential component of the long chain of data correction techniques required to achieve the full potential of quantitative positron emission tomography (PET) imaging. The development of combined PET/magnetic resonance imaging (MRI) systems mandated the widespread interest in developing novel strategies for deriving accurate attenuation maps with the aim to improve the quantitative accuracy of these emerging hybrid imaging systems. The attenuation map in PET/MRI should ideally be derived from anatomical MR images; however, MRI intensities reflect proton density and relaxation time properties of biological tissues rather than their electron density and photon attenuation properties. Therefore, in contrast to PET/computed tomography, there is a lack of standardized global mapping between the intensities of MRI signal and linear attenuation coefficients at 511 keV. Moreover, in standard MRI sequences, bones and lung tissues do not produce measurable signals owing to their low proton density and short transverse relaxation times. MR images are also inevitably subject to artifacts that degrade their quality, thus compromising their applicability for the task of attenuation correction in PET/MRI. MRI-guided attenuation correction strategies can be classified in three broad categories: (i) segmentation-based approaches, (ii) atlas-registration and machine learning methods, and (iii) emission/transmission-based approaches. This paper summarizes past and current state-of-the-art developments and latest advances in PET/MRI attenuation correction. The advantages and drawbacks of each approach for addressing the challenges of MR-based attenuation correction are comprehensively described. The opportunities brought by both MRI and PET imaging modalities for deriving accurate attenuation maps and improving PET quantification will be elaborated. 
Future prospects and potential clinical applications of these techniques and their integration in commercial systems will also be discussed.

  5. Attenuation of dopamine-modulated prefrontal value signals underlies probabilistic reward learning deficits in old age

    PubMed Central

    Axelsson, Jan; Riklund, Katrine; Nyberg, Lars; Dayan, Peter; Bäckman, Lars

    2017-01-01

    Probabilistic reward learning is characterised by individual differences that become pronounced in aging. This may be due to age-related dopamine (DA) decline affecting neural processing in the striatum, prefrontal cortex, or both. We examined this by administering a probabilistic reward learning task to younger and older adults, combining computational modelling of behaviour with fMRI and PET measurements of DA D1 receptor availability. We found that anticipatory value signals in ventromedial prefrontal cortex (vmPFC) were attenuated in older adults. The strength of this signal predicted performance beyond age and was modulated by D1 availability in the nucleus accumbens. These results reveal that a value-anticipation mechanism in vmPFC declines in aging and that this mechanism is associated with DA D1 receptor availability. PMID:28870286

  6. Speech processing using maximum likelihood continuity mapping

    DOEpatents

    Hogden, John E.

    2000-01-01

    Speech processing is obtained that, given a probabilistic mapping between static speech sounds and pseudo-articulator positions, allows sequences of speech sounds to be mapped to smooth sequences of pseudo-articulator positions. In addition, a method for learning a probabilistic mapping between static speech sounds and pseudo-articulator position is described. The method for learning the mapping between static speech sounds and pseudo-articulator position uses a set of training data composed only of speech sounds. The said speech processing can be applied to various speech analysis tasks, including speech recognition, speaker recognition, speech coding, speech synthesis, and voice mimicry.

  7. Speech processing using maximum likelihood continuity mapping

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hogden, J.E.

    Speech processing is obtained that, given a probabilistic mapping between static speech sounds and pseudo-articulator positions, allows sequences of speech sounds to be mapped to smooth sequences of pseudo-articulator positions. In addition, a method for learning a probabilistic mapping between static speech sounds and pseudo-articulator position is described. The method for learning the mapping between static speech sounds and pseudo-articulator position uses a set of training data composed only of speech sounds. The said speech processing can be applied to various speech analysis tasks, including speech recognition, speaker recognition, speech coding, speech synthesis, and voice mimicry.

  8. The effect of respiratory induced density variations on non-TOF PET quantitation in the lung.

    PubMed

    Holman, Beverley F; Cuplov, Vesna; Hutton, Brian F; Groves, Ashley M; Thielemans, Kris

    2016-04-21

    Accurate PET quantitation requires a matched attenuation map. Obtaining matched CT attenuation maps in the thorax is difficult due to the respiratory cycle which causes both motion and density changes. Unlike with motion, little attention has been given to the effects of density changes in the lung on PET quantitation. This work aims to explore the extent of the errors caused by pulmonary density attenuation map mismatch on dynamic and static parameter estimates. Dynamic XCAT phantoms were utilised using clinically relevant (18)F-FDG and (18)F-FMISO time activity curves for all organs within the thorax to estimate the expected parameter errors. The simulations were then validated with PET data from 5 patients suffering from idiopathic pulmonary fibrosis who underwent PET/Cine-CT. The PET data were reconstructed with three gates obtained from the Cine-CT and the average Cine-CT. The lung TACs clearly displayed differences between true and measured curves with error depending on global activity distribution at the time of measurement. The density errors from using a mismatched attenuation map were found to have a considerable impact on PET quantitative accuracy. Maximum errors due to density mismatch were found to be as high as 25% in the XCAT simulation. Differences in patient derived kinetic parameter estimates and static concentration between the extreme gates were found to be as high as 31% and 14%, respectively. Overall our results show that respiratory associated density errors in the attenuation map affect quantitation throughout the lung, not just regions near boundaries. The extent of this error is dependent on the activity distribution in the thorax and hence on the tracer and time of acquisition. Consequently there may be a significant impact on estimated kinetic parameters throughout the lung.
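    The mechanism behind these density errors can be illustrated with the attenuation correction factor (ACF) for a uniform path, exp(μ·L): a respiratory change in lung density changes μ and hence scales the corrected activity multiplicatively. A sketch with illustrative values (the μ and path-length numbers below are not taken from the study):

```python
import math

def acf_uniform(mu_per_cm, path_cm):
    """Attenuation correction factor along a uniform path: exp(mu * L)."""
    return math.exp(mu_per_cm * path_cm)

# Illustrative lung values at 511 keV: ~0.030 /cm inflated vs. ~0.045 /cm
# at a denser respiratory phase, over a 20 cm path through the thorax.
true_acf = acf_uniform(0.045, 20.0)   # attenuation actually experienced
used_acf = acf_uniform(0.030, 20.0)   # correction applied from mismatched map
bias_pct = 100.0 * (used_acf / true_acf - 1.0)  # about -26%: under-correction
```

    Because the ACF is exponential in the line integral of μ, even modest density mismatches produce quantitation errors of the magnitude reported above, throughout the lung and not only near boundaries.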

  9. The effect of respiratory induced density variations on non-TOF PET quantitation in the lung

    NASA Astrophysics Data System (ADS)

    Holman, Beverley F.; Cuplov, Vesna; Hutton, Brian F.; Groves, Ashley M.; Thielemans, Kris

    2016-04-01

    Accurate PET quantitation requires a matched attenuation map. Obtaining matched CT attenuation maps in the thorax is difficult due to the respiratory cycle which causes both motion and density changes. Unlike with motion, little attention has been given to the effects of density changes in the lung on PET quantitation. This work aims to explore the extent of the errors caused by pulmonary density attenuation map mismatch on dynamic and static parameter estimates. Dynamic XCAT phantoms were utilised using clinically relevant 18F-FDG and 18F-FMISO time activity curves for all organs within the thorax to estimate the expected parameter errors. The simulations were then validated with PET data from 5 patients suffering from idiopathic pulmonary fibrosis who underwent PET/Cine-CT. The PET data were reconstructed with three gates obtained from the Cine-CT and the average Cine-CT. The lung TACs clearly displayed differences between true and measured curves with error depending on global activity distribution at the time of measurement. The density errors from using a mismatched attenuation map were found to have a considerable impact on PET quantitative accuracy. Maximum errors due to density mismatch were found to be as high as 25% in the XCAT simulation. Differences in patient derived kinetic parameter estimates and static concentration between the extreme gates were found to be as high as 31% and 14%, respectively. Overall our results show that respiratory associated density errors in the attenuation map affect quantitation throughout the lung, not just regions near boundaries. The extent of this error is dependent on the activity distribution in the thorax and hence on the tracer and time of acquisition. Consequently there may be a significant impact on estimated kinetic parameters throughout the lung.

  10. A geomorphic approach to 100-year floodplain mapping for the Conterminous United States

    NASA Astrophysics Data System (ADS)

    Jafarzadegan, Keighobad; Merwade, Venkatesh; Saksena, Siddharth

    2018-06-01

    Floodplain mapping using hydrodynamic models is difficult in data scarce regions. Additionally, using hydrodynamic models to map floodplain over large stream network can be computationally challenging. Some of these limitations of floodplain mapping using hydrodynamic modeling can be overcome by developing computationally efficient statistical methods to identify floodplains in large and ungauged watersheds using publicly available data. This paper proposes a geomorphic model to generate probabilistic 100-year floodplain maps for the Conterminous United States (CONUS). The proposed model first categorizes the watersheds in the CONUS into three classes based on the height of the water surface corresponding to the 100-year flood from the streambed. Next, the probability that any watershed in the CONUS belongs to one of these three classes is computed through supervised classification using watershed characteristics related to topography, hydrography, land use and climate. The result of this classification is then fed into a probabilistic threshold binary classifier (PTBC) to generate the probabilistic 100-year floodplain maps. The supervised classification algorithm is trained by using the 100-year Flood Insurance Rated Maps (FIRM) from the U.S. Federal Emergency Management Agency (FEMA). FEMA FIRMs are also used to validate the performance of the proposed model in areas not included in the training. Additionally, HEC-RAS model generated flood inundation extents are used to validate the model performance at fifteen sites that lack FEMA maps. Validation results show that the probabilistic 100-year floodplain maps, generated by proposed model, match well with both FEMA and HEC-RAS generated maps. On average, the error of predicted flood extents is around 14% across the CONUS. 
The high accuracy of the validation results shows the reliability of the geomorphic model as an alternative approach for fast and cost effective delineation of 100-year floodplains for the CONUS.
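The PTBC step described above can be illustrated with a toy sketch. This is hypothetical code, not the authors' implementation: each cell's height above the nearest stream is compared against a class-specific 100-year water-surface threshold, and the per-class flood indicators are weighted by the watershed's class-membership probabilities.

```python
# Hypothetical sketch of the probabilistic threshold binary classifier (PTBC)
# idea; function and variable names are illustrative, not the authors' code.
def ptbc_flood_probability(cell_height, class_probs, class_thresholds):
    """Probability that a DEM cell lies in the 100-year floodplain.

    cell_height      -- height of the cell above the nearest stream (m)
    class_probs      -- P(watershed belongs to class c), one per class
    class_thresholds -- 100-year flood water-surface height per class (m)
    """
    # A cell floods under class c if it sits at or below that class's
    # water-surface height; weight each outcome by the class probability.
    return sum(p for p, t in zip(class_probs, class_thresholds)
               if cell_height <= t)

# A low-lying cell floods under every class hypothesis:
print(ptbc_flood_probability(1.0, [0.25, 0.5, 0.25], [2.0, 5.0, 9.0]))  # 1.0
# A higher cell floods only under the deepest-flood class:
print(ptbc_flood_probability(6.0, [0.25, 0.5, 0.25], [2.0, 5.0, 9.0]))  # 0.25
```

The resulting per-cell value is a probability rather than a binary flood/no-flood label, which is what makes the output map probabilistic.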

  11. Positron Emission Tomography (PET)

    DOE R&D Accomplishments Database

    Welch, M. J.

    1990-01-01

Positron emission tomography (PET) assesses biochemical processes in the living subject, producing images of function rather than form. Using PET, physicians are able to obtain not the anatomical information provided by other medical imaging techniques, but pictures of physiological activity. In metaphoric terms, traditional imaging methods supply a map of the body's roadways, its anatomy; PET shows the traffic along those paths, its biochemistry. This document discusses the principles of PET, the radiopharmaceuticals in PET, PET research, clinical applications of PET, the cost of PET, training of individuals for PET, the role of the United States Department of Energy in PET, and the future of PET.

  12. MRI-guided attenuation correction in whole-body PET/MR: assessment of the effect of bone attenuation.

    PubMed

    Akbarzadeh, A; Ay, M R; Ahmadian, A; Alam, N Riahi; Zaidi, H

    2013-02-01

    Hybrid PET/MRI presents many advantages in comparison with its counterpart PET/CT in terms of improved soft-tissue contrast, decrease in radiation exposure, and truly simultaneous and multi-parametric imaging capabilities. However, the lack of well-established methodology for MR-based attenuation correction is hampering further development and wider acceptance of this technology. We assess the impact of ignoring bone attenuation and using different tissue classes for generation of the attenuation map on the accuracy of attenuation correction of PET data. This work was performed using simulation studies based on the XCAT phantom and clinical input data. For the latter, PET and CT images of patients were used as input for the analytic simulation model using realistic activity distributions where CT-based attenuation correction was utilized as reference for comparison. For both phantom and clinical studies, the reference attenuation map was classified into various numbers of tissue classes to produce three (air, soft tissue and lung), four (air, lungs, soft tissue and cortical bones) and five (air, lungs, soft tissue, cortical bones and spongeous bones) class attenuation maps. The phantom studies demonstrated that ignoring bone increases the relative error by up to 6.8% in the body and up to 31.0% for bony regions. Likewise, the simulated clinical studies showed that the mean relative error reached 15% for lesions located in the body and 30.7% for lesions located in bones, when neglecting bones. These results demonstrate an underestimation of about 30% of tracer uptake when neglecting bone, which in turn imposes substantial loss of quantitative accuracy for PET images produced by hybrid PET/MRI systems. Considering bones in the attenuation map will considerably improve the accuracy of MR-guided attenuation correction in hybrid PET/MR to enable quantitative PET imaging on hybrid PET/MR technologies.
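The class-based attenuation maps evaluated in this study amount to a lookup from tissue label to linear attenuation coefficient. A minimal sketch follows; the coefficient values are approximate textbook figures at 511 keV, not the paper's exact numbers.

```python
# Illustrative sketch: nominal linear attenuation coefficients at 511 keV
# (cm^-1). Values are approximate; exact figures vary across the literature.
MU_511KEV = {
    "air": 0.0,
    "lung": 0.03,
    "soft_tissue": 0.096,
    "bone": 0.13,
}

def make_mu_map(label_image):
    """Turn a 2D array of tissue-class labels into an attenuation map."""
    return [[MU_511KEV[label] for label in row] for row in label_image]

labels = [["air", "soft_tissue"],
          ["bone", "lung"]]
mu_map = make_mu_map(labels)
print(mu_map[1][0])  # 0.13
```

Collapsing bone into soft tissue, as in the three-class map, substitutes roughly 0.096 for 0.13 in bony voxels, which is the source of the underestimation the authors quantify.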

  13. Assessment of Coastal Communities' Vulnerability to Hurricane Surge under Climate Change via Probabilistic Map - A Case Study of the Southwest Coast of Florida

    NASA Astrophysics Data System (ADS)

    Feng, X.; Shen, S.

    2014-12-01

The US coastline has, over the past few years, been overwhelmed by major storms including Hurricanes Katrina (2005), Ike (2008), Irene (2011), and Sandy (2012). Supported by a growing and extensive body of evidence, a majority of research agrees that hurricane activity has been enhanced by climate change. However, precise prediction of hurricane-induced inundation remains a challenge. This study proposed a probabilistic inundation map based on a Statistically Modeled Storm Database (SMSD) to assess the probabilistic coastal inundation risk of Southwest Florida for a near-future (20-year) scenario considering climate change. This map was produced using the Joint Probability Method with Optimal Sampling (JPM-OS), developed by Condon and Sheng in 2012, together with the high-resolution storm surge modeling system CH3D-SSMS. The probabilistic inundation map shows a 25.5-31.2% increase in spatially averaged inundation height compared to an inundation map for the present-day scenario. To estimate climate change impacts on coastal communities, socioeconomic analyses were conducted using both the SMSD-based probabilistic inundation map and the present-day inundation map. Combined with 2010 census data and 2012 parcel data from the Florida Geographic Data Library, the differences in economic loss between the near-future and present-day scenarios were used to generate an economic exposure map at the census block group level, reflecting coastal communities' exposure to climate change. The results show that climate-change-induced inundation increase has significant economic impacts. Moreover, the impacts are not equally distributed among social groups, given their differing social vulnerability to hazards. Social vulnerability indices at the census block group level were obtained from the Hazards and Vulnerability Research Institute. The demographic and economic variables in the index represent a community's adaptability to hazards. 
Local Moran's I was calculated to identify clusters of highly exposed and vulnerable communities. The economic-exposure cluster map was overlaid with the social-vulnerability cluster map to identify communities with low adaptive capability but high exposure. The result provides decision makers with an intuitive tool for identifying the communities most in need of adaptation.
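Local Moran's I, the clustering statistic used above, can be computed for a vector of values and a spatial weight matrix as in this minimal sketch (hypothetical code, not tied to the study's GIS workflow):

```python
def local_morans_i(values, weights):
    """Local Moran's I_i = (z_i / m2) * sum_j w_ij * z_j,
    where z are deviations from the mean and m2 = sum(z^2) / n.
    Large positive I_i marks an area that resembles its neighbours
    (a high-high or low-low cluster)."""
    n = len(values)
    mean = sum(values) / n
    z = [v - mean for v in values]
    m2 = sum(d * d for d in z) / n
    return [z[i] / m2 * sum(weights[i][j] * z[j] for j in range(n))
            for i in range(n)]

# Four areas on a line with row-standardised adjacency weights; the first
# two areas form a high-value cluster, the last two a low-value cluster.
w = [[0, 1, 0, 0],
     [0.5, 0, 0.5, 0],
     [0, 0.5, 0, 0.5],
     [0, 0, 1, 0]]
print(local_morans_i([10, 10, 0, 0], w))
```

The end areas, whose only neighbour matches them, score 1.0, while the boundary areas, whose neighbours cancel out, score 0.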

  14. Probabilistic self-organizing maps for continuous data.

    PubMed

    Lopez-Rubio, Ezequiel

    2010-10-01

    The original self-organizing feature map did not define any probability distribution on the input space. However, the advantages of introducing probabilistic methodologies into self-organizing map models were soon evident. This has led to a wide range of proposals which reflect the current emergence of probabilistic approaches to computational intelligence. The underlying estimation theories behind them derive from two main lines of thought: the expectation maximization methodology and stochastic approximation methods. Here, we present a comprehensive view of the state of the art, with a unifying perspective of the involved theoretical frameworks. In particular, we examine the most commonly used continuous probability distributions, self-organization mechanisms, and learning schemes. Special emphasis is given to the connections among them and their relative advantages depending on the characteristics of the problem at hand. Furthermore, we evaluate their performance in two typical applications of self-organizing maps: classification and visualization.
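The expectation-maximization line of work mentioned above treats each map unit as a mixture component; the E-step computes how "responsible" each unit is for a sample. A minimal 1-D sketch under an equal-prior, shared-variance Gaussian assumption (illustrative only, not any specific model from the review):

```python
import math

def responsibilities(x, unit_means, var):
    """E-step: posterior probability that sample x was generated by each
    map unit, assuming equal priors and a shared isotropic Gaussian
    variance. The normalising constants cancel, so only the exponential
    terms are needed."""
    dens = [math.exp(-(x - m) ** 2 / (2 * var)) for m in unit_means]
    total = sum(dens)
    return [d / total for d in dens]

# A sample near 0 is claimed almost entirely by the unit centred at 0:
r = responsibilities(0.1, [0.0, 4.0], 1.0)
print(r)
```

In a probabilistic SOM the M-step would then move unit means toward the samples they are responsible for, smoothed by the neighbourhood function.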

  15. H-SLAM: Rao-Blackwellized Particle Filter SLAM Using Hilbert Maps.

    PubMed

    Vallicrosa, Guillem; Ridao, Pere

    2018-05-01

Occupancy grid maps provide a probabilistic representation of space that is important for a variety of robotic applications such as path planning and autonomous manipulation. In this paper, a SLAM (Simultaneous Localization and Mapping) framework capable of obtaining this representation online is presented. H-SLAM (Hilbert Maps SLAM) is based on the Hilbert map representation and uses a particle filter to represent the robot state. Hilbert maps offer a continuous probabilistic representation with a small memory footprint. We present a series of experiments carried out both in simulation and with real AUVs (Autonomous Underwater Vehicles). These results demonstrate that our approach represents the environment more consistently while remaining capable of running online.
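A Hilbert map scores occupancy with a logistic model over kernel features of the query point, which is what gives the continuous, low-memory representation. A toy 2-D sketch with hypothetical, hand-picked weights (a real H-SLAM learns them online):

```python
import math

def occupancy_probability(point, centers, weights, bias, gamma=1.0):
    """p(occupied | x) = sigmoid(w . phi(x) + b), with RBF features
    phi_k(x) = exp(-gamma * ||x - c_k||^2). The map is queried at any
    continuous coordinate; no dense grid is stored."""
    feats = [math.exp(-gamma * sum((p - c) ** 2 for p, c in zip(point, ck)))
             for ck in centers]
    score = bias + sum(w * f for w, f in zip(weights, feats))
    return 1.0 / (1.0 + math.exp(-score))

# One feature centre marking an obstacle at the origin:
centers, weights, bias = [(0.0, 0.0)], [4.0], -2.0
print(occupancy_probability((0.0, 0.0), centers, weights, bias) > 0.5)  # True
print(occupancy_probability((5.0, 5.0), centers, weights, bias) > 0.5)  # False
```

Only the feature centres and weight vector need to be stored, which is the memory advantage over an occupancy grid.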

  16. Extracting a respiratory signal from raw dynamic PET data that contain tracer kinetics.

    PubMed

    Schleyer, P J; Thielemans, K; Marsden, P K

    2014-08-07

    Data driven gating (DDG) methods provide an alternative to hardware based respiratory gating for PET imaging. Several existing DDG approaches obtain a respiratory signal by observing the change in PET-counts within specific regions of acquired PET data. Currently, these methods do not allow for tracer kinetics which can interfere with the respiratory signal and introduce error. In this work, we produced a DDG method for dynamic PET studies that exhibit tracer kinetics. Our method is based on an existing approach that uses frequency-domain analysis to locate regions within raw PET data that are subject to respiratory motion. In the new approach, an optimised non-stationary short-time Fourier transform was used to create a time-varying 4D map of motion affected regions. Additional processing was required to ensure that the relationship between the sign of the respiratory signal and the physical direction of movement remained consistent for each temporal segment of the 4D map. The change in PET-counts within the 4D map during the PET acquisition was then used to generate a respiratory curve. Using 26 min dynamic cardiac NH3 PET acquisitions which included a hardware derived respiratory measurement, we show that tracer kinetics can severely degrade the respiratory signal generated by the original DDG method. In some cases, the transition of tracer from the liver to the lungs caused the respiratory signal to invert. The new approach successfully compensated for tracer kinetics and improved the correlation between the data-driven and hardware based signals. On average, good correlation was maintained throughout the PET acquisitions.
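The frequency-domain step above — locating voxels whose count rate oscillates at respiratory frequencies — can be sketched with a naive DFT peak search. This is illustrative only; the paper's method uses a non-stationary short-time Fourier transform over 4D data.

```python
import math

def dominant_frequency(signal, fs, band=(0.1, 0.5)):
    """Return the frequency (Hz) within `band` whose DFT magnitude is
    largest -- a crude way to detect a respiratory oscillation in a
    voxel's count-rate time series. Naive O(n^2) DFT for clarity."""
    n = len(signal)
    best_f, best_mag = None, -1.0
    for k in range(1, n // 2):
        f = k * fs / n
        if not band[0] <= f <= band[1]:
            continue
        re = sum(s * math.cos(2 * math.pi * k * i / n)
                 for i, s in enumerate(signal))
        im = sum(s * math.sin(2 * math.pi * k * i / n)
                 for i, s in enumerate(signal))
        mag = math.hypot(re, im)
        if mag > best_mag:
            best_f, best_mag = f, mag
    return best_f

# A synthetic 0.25 Hz "breathing" trace sampled at 4 Hz:
trace = [math.sin(2 * math.pi * 0.25 * i / 4.0) for i in range(64)]
print(dominant_frequency(trace, 4.0))  # 0.25
```

Applying such a search in a single window is what breaks down when tracer kinetics shift the counts over time, which motivates the paper's time-varying 4D map.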

  17. New ShakeMaps for Georgia Resulting from Collaboration with EMME

    NASA Astrophysics Data System (ADS)

    Kvavadze, N.; Tsereteli, N. S.; Varazanashvili, O.; Alania, V.

    2015-12-01

Correct assessment of probabilistic seismic hazard and risk maps is the first step in advance planning and action to reduce seismic risk. Seismic hazard maps for Georgia were calculated based on a modern approach developed in the frame of the EMME (Earthquake Model of the Middle East region) project. EMME was one of GEM's successful endeavors at the regional level. With EMME and GEM assistance, regional models were analyzed to identify the information and additional work needed for the preparation of national hazard models. A probabilistic seismic hazard (PSH) map provides the critical basis for improved building codes and construction. The most serious deficiency in PSH assessment for the territory of Georgia is the lack of high-quality ground motion data. Due to this, an initial hybrid empirical ground motion model was developed for PGA and SA at selected periods, and its coefficients were used in the probabilistic seismic hazard assessment. The resulting seismic hazard maps show that there were gaps in previous seismic hazard assessments and that the present normative seismic hazard map needs a careful recalculation.

  18. A pilot study on the correlation between fat fraction values and glucose uptake values in supraclavicular fat by simultaneous PET/MRI.

    PubMed

    McCallister, Andrew; Zhang, Le; Burant, Alex; Katz, Laurence; Branca, Rosa Tamara

    2017-11-01

To assess the spatial correlation between MRI and 18F-fludeoxyglucose positron emission tomography (FDG-PET) maps of human brown adipose tissue (BAT) and to measure differences in fat fraction (FF) between glucose avid and non-avid regions of the supraclavicular fat depot using a hybrid FDG-PET/MR scanner. In 16 healthy volunteers, mean age of 30 and body mass index of 26, FF, R2*, and FDG uptake maps were acquired simultaneously using a hybrid PET/MR system while employing an individualized cooling protocol to maximally stimulate BAT. Fourteen of the 16 volunteers reported BAT-positive FDG-PET scans. MR FF maps of BAT correlate well with combined FDG-PET/MR maps of BAT only in subjects with intense glucose uptake. The results indicate that the extent of the spatial correlation positively correlates with maximum FDG uptake in the supraclavicular fat depot. No consistent, significant differences were found in FF or R2* between FDG avid and non-avid supraclavicular fat regions. In a few FDG-positive subjects, a small but significant linear decrease in BAT FF was observed during BAT stimulation. MR FF, when used in conjunction with FDG uptake maps, can be seen as a valuable, radiation-free alternative to CT and can be used to measure tissue hydration and lipid consumption in some subjects. Magn Reson Med 78:1922-1932, 2017. © 2017 International Society for Magnetic Resonance in Medicine.

  19. MR-compatibility of a high-resolution small animal PET insert operating inside a 7 T MRI.

    PubMed

    Thiessen, J D; Shams, E; Stortz, G; Schellenberg, G; Bishop, D; Khan, M S; Kozlowski, P; Retière, F; Sossi, V; Thompson, C J; Goertzen, A L

    2016-11-21

A full-ring PET insert consisting of 16 PET detector modules was designed and constructed to fit within the 114 mm diameter gradient bore of a Bruker 7 T MRI. The individual detector modules contain two silicon photomultiplier (SiPM) arrays, dual-layer offset LYSO crystal arrays, and high-definition multimedia interface (HDMI) cables for both signal and power transmission. Several different RF shielding configurations were assessed prior to construction of a fully assembled PET insert using a combination of carbon fibre and copper foil for RF shielding. MR-compatibility measurements included field mapping of the static magnetic field (B0) and the time-varying excitation field (B1) as well as acquisitions with multiple pulse sequences: spin echo (SE), rapid imaging with refocused echoes (RARE), fast low angle shot (FLASH) gradient echo, and echo planar imaging (EPI). B0 field maps revealed a small degradation in the mean homogeneity (+0.1 ppm) when the PET insert was installed and operating. No significant change was observed in the B1 field maps or the image homogeneity of various MR images, with a 9% decrease in the signal-to-noise ratio (SNR) observed only in EPI images acquired with the PET insert installed and operating. PET detector flood histograms, photopeak amplitudes, and energy resolutions were unchanged in individual PET detector modules when acquired during MRI operation. There was a small baseline shift on the PET detector signals due to the switching amplifiers used to power MRI gradient pulses. This baseline shift was observable when measured with an oscilloscope and varied as a function of the gradient duty cycle, but had no noticeable effect on the performance of the PET detector modules. Compact front-end electronics and effective RF shielding led to minimal cross-interference between the PET and MRI systems. Both PET detector and MRI performance was excellent, whether operating as a standalone system or a hybrid PET/MRI.

  20. MR-compatibility of a high-resolution small animal PET insert operating inside a 7 T MRI

    NASA Astrophysics Data System (ADS)

    Thiessen, J. D.; Shams, E.; Stortz, G.; Schellenberg, G.; Bishop, D.; Khan, M. S.; Kozlowski, P.; Retière, F.; Sossi, V.; Thompson, C. J.; Goertzen, A. L.

    2016-11-01

A full-ring PET insert consisting of 16 PET detector modules was designed and constructed to fit within the 114 mm diameter gradient bore of a Bruker 7 T MRI. The individual detector modules contain two silicon photomultiplier (SiPM) arrays, dual-layer offset LYSO crystal arrays, and high-definition multimedia interface (HDMI) cables for both signal and power transmission. Several different RF shielding configurations were assessed prior to construction of a fully assembled PET insert using a combination of carbon fibre and copper foil for RF shielding. MR-compatibility measurements included field mapping of the static magnetic field (B0) and the time-varying excitation field (B1) as well as acquisitions with multiple pulse sequences: spin echo (SE), rapid imaging with refocused echoes (RARE), fast low angle shot (FLASH) gradient echo, and echo planar imaging (EPI). B0 field maps revealed a small degradation in the mean homogeneity (+0.1 ppm) when the PET insert was installed and operating. No significant change was observed in the B1 field maps or the image homogeneity of various MR images, with a 9% decrease in the signal-to-noise ratio (SNR) observed only in EPI images acquired with the PET insert installed and operating. PET detector flood histograms, photopeak amplitudes, and energy resolutions were unchanged in individual PET detector modules when acquired during MRI operation. There was a small baseline shift on the PET detector signals due to the switching amplifiers used to power MRI gradient pulses. This baseline shift was observable when measured with an oscilloscope and varied as a function of the gradient duty cycle, but had no noticeable effect on the performance of the PET detector modules. Compact front-end electronics and effective RF shielding led to minimal cross-interference between the PET and MRI systems. Both PET detector and MRI performance was excellent, whether operating as a standalone system or a hybrid PET/MRI.

  1. Efficient robust reconstruction of dynamic PET activity maps with radioisotope decay constraints.

    PubMed

    Gao, Fei; Liu, Huafeng; Shi, Pengcheng

    2010-01-01

Dynamic PET imaging performs a sequence of data acquisitions in order to provide visualization and quantification of physiological changes in specific tissues and organs. Reconstruction of activity maps is generally the first step in dynamic PET. State-space H-infinity approaches have proven robust for PET image reconstruction; however, they do not consider temporal constraints during the reconstruction process. In addition, state-space strategies for PET image reconstruction have been computationally prohibitive for practical use because of the need for matrix inversion. In this paper, we present a minimax formulation of the dynamic PET imaging problem in which a radioisotope decay model is employed as a physics-based temporal constraint on the photon counts. Furthermore, a robust steady-state H-infinity filter is developed to significantly improve computational efficiency with minimal loss of accuracy. Experiments are conducted on Monte Carlo simulated image sequences for quantitative analysis and validation.
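The physics-based temporal constraint is mono-exponential radioisotope decay of the expected activity, which a short sketch makes concrete (illustrative helper, not the paper's filter):

```python
import math

def decay_constrained_activity(a0, half_life, frame_times):
    """Expected activity under the radioisotope decay model
    A(t) = A0 * exp(-lambda * t), with lambda = ln(2) / T_half.
    In a dynamic reconstruction this ties successive frames together
    instead of treating each frame independently."""
    lam = math.log(2.0) / half_life
    return [a0 * math.exp(-lam * t) for t in frame_times]

# F-18 (half-life ~109.8 min): activity at 0, 1, and 2 half-lives
print(decay_constrained_activity(100.0, 109.8, [0.0, 109.8, 219.6]))
```

Frames sampled one half-life apart should carry half the expected counts of their predecessor; deviations from this curve are what the temporal constraint penalizes.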

  2. Positron Emission Tomography (PET)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Welch, M.J.

Positron emission tomography (PET) assesses biochemical processes in the living subject, producing images of function rather than form. Using PET, physicians are able to obtain not the anatomical information provided by other medical imaging techniques, but pictures of physiological activity. In metaphoric terms, traditional imaging methods supply a map of the body's roadways, its anatomy; PET shows the traffic along those paths, its biochemistry. This document discusses the principles of PET, the radiopharmaceuticals in PET, PET research, clinical applications of PET, the cost of PET, training of individuals for PET, the role of the United States Department of Energy in PET, and the future of PET. 22 figs.

  3. Direct parametric reconstruction in dynamic PET myocardial perfusion imaging: in vivo studies.

    PubMed

    Petibon, Yoann; Rakvongthai, Yothin; El Fakhri, Georges; Ouyang, Jinsong

    2017-05-07

Dynamic PET myocardial perfusion imaging (MPI) used in conjunction with tracer kinetic modeling enables the quantification of absolute myocardial blood flow (MBF). However, MBF maps computed using the traditional indirect method (i.e. post-reconstruction voxel-wise fitting of kinetic model to PET time-activity-curves-TACs) suffer from poor signal-to-noise ratio (SNR). Direct reconstruction of kinetic parameters from raw PET projection data has been shown to offer parametric images with higher SNR compared to the indirect method. The aim of this study was to extend and evaluate the performance of a direct parametric reconstruction method using in vivo dynamic PET MPI data for the purpose of quantifying MBF. Dynamic PET MPI studies were performed on two healthy pigs using a Siemens Biograph mMR scanner. List-mode PET data for each animal were acquired following a bolus injection of ~7-8 mCi of 18F-flurpiridaz, a myocardial perfusion agent. Fully-3D dynamic PET sinograms were obtained by sorting the coincidence events into 16 temporal frames covering ~5 min after radiotracer administration. Additionally, eight independent noise realizations of both scans-each containing 1/8th of the total number of events-were generated from the original list-mode data. Dynamic sinograms were then used to compute parametric maps using the conventional indirect method and the proposed direct method. For both methods, a one-tissue compartment model accounting for spillover from the left and right ventricle blood-pools was used to describe the kinetics of 18F-flurpiridaz. An image-derived arterial input function obtained from a TAC taken in the left ventricle cavity was used for tracer kinetic analysis. 
For the indirect method, frame-by-frame images were estimated using two fully-3D reconstruction techniques: the standard ordered subset expectation maximization (OSEM) reconstruction algorithm on one side, and the one-step late maximum a posteriori (OSL-MAP) algorithm on the other side, which incorporates a quadratic penalty function. The parametric images were then calculated using voxel-wise weighted least-square fitting of the reconstructed myocardial PET TACs. For the direct method, parametric images were estimated directly from the dynamic PET sinograms using a maximum a posteriori (MAP) parametric reconstruction algorithm which optimizes an objective function comprised of the Poisson log-likelihood term, the kinetic model and a quadratic penalty function. Maximization of the objective function with respect to each set of parameters was achieved using a preconditioned conjugate gradient algorithm with a specifically developed pre-conditioner. The performance of the direct method was evaluated by comparing voxel- and segment-wise estimates of K1, the tracer transport rate (ml · min⁻¹ · ml⁻¹), to those obtained using the indirect method applied to both OSEM and OSL-MAP dynamic reconstructions. The proposed direct reconstruction method produced K1 maps with visibly lower noise than the indirect method based on OSEM and OSL-MAP reconstructions. At normal count levels, the direct method was shown to outperform the indirect method based on OSL-MAP in the sense that at matched level of bias, reduced regional noise levels were obtained. At lower count levels, the direct method produced K1 estimates with significantly lower standard deviation across noise realizations than the indirect method based on OSL-MAP at matched bias level. In all cases, the direct method yielded lower noise and standard deviation than the indirect method based on OSEM. 
Overall, the proposed direct reconstruction offered a better bias-variance tradeoff than the indirect method applied to either OSEM and OSL-MAP. Direct parametric reconstruction as applied to in vivo dynamic PET MPI data is therefore a promising method for producing MBF maps with lower variance.
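Both the indirect and direct methods above rest on the same kinetic model. A discrete sketch of the one-tissue compartment TAC with a spillover term (hypothetical code; a real fit would also handle frame durations and decay correction):

```python
import math

def one_tissue_tac(cp, dt, k1, k2, v_spill=0.0):
    """Tissue time-activity curve for a one-tissue compartment model,
    C_T(t) = K1 * (Cp convolved with exp(-k2 * t)) + v_spill * Cp(t),
    approximated by a discrete convolution on a uniform time grid.

    cp      -- arterial input function samples (e.g. from the LV cavity)
    dt      -- time step between samples
    k1, k2  -- transport and washout rate constants
    v_spill -- blood-pool spillover fraction
    """
    tac = []
    for i in range(len(cp)):
        conv = sum(cp[j] * math.exp(-k2 * (i - j) * dt) * dt
                   for j in range(i + 1))
        tac.append(k1 * conv + v_spill * cp[i])
    return tac

# With k2 = 0 the model reduces to K1 times the integrated input function:
print(one_tissue_tac([1.0, 1.0, 1.0], dt=1.0, k1=0.5, k2=0.0))  # [0.5, 1.0, 1.5]
```

The indirect method fits k1, k2, and v_spill to each voxel's reconstructed TAC; the direct method folds this same model into the reconstruction objective itself.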

  4. Direct parametric reconstruction in dynamic PET myocardial perfusion imaging: in-vivo studies

    PubMed Central

    Petibon, Yoann; Rakvongthai, Yothin; Fakhri, Georges El; Ouyang, Jinsong

    2017-01-01

    Dynamic PET myocardial perfusion imaging (MPI) used in conjunction with tracer kinetic modeling enables the quantification of absolute myocardial blood flow (MBF). However, MBF maps computed using the traditional indirect method (i.e. post-reconstruction voxel-wise fitting of kinetic model to PET time-activity-curves -TACs) suffer from poor signal-to-noise ratio (SNR). Direct reconstruction of kinetic parameters from raw PET projection data has been shown to offer parametric images with higher SNR compared to the indirect method. The aim of this study was to extend and evaluate the performance of a direct parametric reconstruction method using in-vivo dynamic PET MPI data for the purpose of quantifying MBF. Dynamic PET MPI studies were performed on two healthy pigs using a Siemens Biograph mMR scanner. List-mode PET data for each animal were acquired following a bolus injection of ~7-8 mCi of 18F-flurpiridaz, a myocardial perfusion agent. Fully-3D dynamic PET sinograms were obtained by sorting the coincidence events into 16 temporal frames covering ~5 min after radiotracer administration. Additionally, eight independent noise realizations of both scans - each containing 1/8th of the total number of events - were generated from the original list-mode data. Dynamic sinograms were then used to compute parametric maps using the conventional indirect method and the proposed direct method. For both methods, a one-tissue compartment model accounting for spillover from the left and right ventricle blood-pools was used to describe the kinetics of 18F-flurpiridaz. An image-derived arterial input function obtained from a TAC taken in the left ventricle cavity was used for tracer kinetic analysis. 
For the indirect method, frame-by-frame images were estimated using two fully-3D reconstruction techniques: the standard Ordered Subset Expectation Maximization (OSEM) reconstruction algorithm on one side, and the One-Step Late Maximum a Posteriori (OSL-MAP) algorithm on the other side, which incorporates a quadratic penalty function. The parametric images were then calculated using voxel-wise weighted least-square fitting of the reconstructed myocardial PET TACs. For the direct method, parametric images were estimated directly from the dynamic PET sinograms using a maximum a posteriori (MAP) parametric reconstruction algorithm which optimizes an objective function comprised of the Poisson log-likelihood term, the kinetic model and a quadratic penalty function. Maximization of the objective function with respect to each set of parameters was achieved using a preconditioned conjugate gradient algorithm with a specifically developed pre-conditioner. The performance of the direct method was evaluated by comparing voxel- and segment-wise estimates of K1, the tracer transport rate (mL.min−1.mL−1), to those obtained using the indirect method applied to both OSEM and OSL-MAP dynamic reconstructions. The proposed direct reconstruction method produced K1 maps with visibly lower noise than the indirect method based on OSEM and OSL-MAP reconstructions. At normal count levels, the direct method was shown to outperform the indirect method based on OSL-MAP in the sense that at matched level of bias, reduced regional noise levels were obtained. At lower count levels, the direct method produced K1 estimates with significantly lower standard deviation across noise realizations than the indirect method based on OSL-MAP at matched bias level. In all cases, the direct method yielded lower noise and standard deviation than the indirect method based on OSEM. 
Overall, the proposed direct reconstruction offered a better bias-variance tradeoff than the indirect method applied to either OSEM and OSL-MAP. Direct parametric reconstruction as applied to in-vivo dynamic PET MPI data is therefore a promising method for producing MBF maps with lower variance. PMID:28379843

  5. Direct parametric reconstruction in dynamic PET myocardial perfusion imaging: in vivo studies

    NASA Astrophysics Data System (ADS)

    Petibon, Yoann; Rakvongthai, Yothin; El Fakhri, Georges; Ouyang, Jinsong

    2017-05-01

    Dynamic PET myocardial perfusion imaging (MPI) used in conjunction with tracer kinetic modeling enables the quantification of absolute myocardial blood flow (MBF). However, MBF maps computed using the traditional indirect method (i.e. post-reconstruction voxel-wise fitting of kinetic model to PET time-activity-curves-TACs) suffer from poor signal-to-noise ratio (SNR). Direct reconstruction of kinetic parameters from raw PET projection data has been shown to offer parametric images with higher SNR compared to the indirect method. The aim of this study was to extend and evaluate the performance of a direct parametric reconstruction method using in vivo dynamic PET MPI data for the purpose of quantifying MBF. Dynamic PET MPI studies were performed on two healthy pigs using a Siemens Biograph mMR scanner. List-mode PET data for each animal were acquired following a bolus injection of ~7-8 mCi of 18F-flurpiridaz, a myocardial perfusion agent. Fully-3D dynamic PET sinograms were obtained by sorting the coincidence events into 16 temporal frames covering ~5 min after radiotracer administration. Additionally, eight independent noise realizations of both scans—each containing 1/8th of the total number of events—were generated from the original list-mode data. Dynamic sinograms were then used to compute parametric maps using the conventional indirect method and the proposed direct method. For both methods, a one-tissue compartment model accounting for spillover from the left and right ventricle blood-pools was used to describe the kinetics of 18F-flurpiridaz. An image-derived arterial input function obtained from a TAC taken in the left ventricle cavity was used for tracer kinetic analysis. 
For the indirect method, frame-by-frame images were estimated using two fully-3D reconstruction techniques: the standard ordered subset expectation maximization (OSEM) reconstruction algorithm on one side, and the one-step late maximum a posteriori (OSL-MAP) algorithm on the other side, which incorporates a quadratic penalty function. The parametric images were then calculated using voxel-wise weighted least-square fitting of the reconstructed myocardial PET TACs. For the direct method, parametric images were estimated directly from the dynamic PET sinograms using a maximum a posteriori (MAP) parametric reconstruction algorithm which optimizes an objective function comprised of the Poisson log-likelihood term, the kinetic model and a quadratic penalty function. Maximization of the objective function with respect to each set of parameters was achieved using a preconditioned conjugate gradient algorithm with a specifically developed pre-conditioner. The performance of the direct method was evaluated by comparing voxel- and segment-wise estimates of K1, the tracer transport rate (ml · min⁻¹ · ml⁻¹), to those obtained using the indirect method applied to both OSEM and OSL-MAP dynamic reconstructions. The proposed direct reconstruction method produced K1 maps with visibly lower noise than the indirect method based on OSEM and OSL-MAP reconstructions. At normal count levels, the direct method was shown to outperform the indirect method based on OSL-MAP in the sense that at matched level of bias, reduced regional noise levels were obtained. At lower count levels, the direct method produced K1 estimates with significantly lower standard deviation across noise realizations than the indirect method based on OSL-MAP at matched bias level. In all cases, the direct method yielded lower noise and standard deviation than the indirect method based on OSEM. 
Overall, the proposed direct reconstruction offered a better bias-variance tradeoff than the indirect method applied to either OSEM and OSL-MAP. Direct parametric reconstruction as applied to in vivo dynamic PET MPI data is therefore a promising method for producing MBF maps with lower variance.

  6. Multi-atlas attenuation correction supports full quantification of static and dynamic brain PET data in PET-MR

    NASA Astrophysics Data System (ADS)

    Mérida, Inés; Reilhac, Anthonin; Redouté, Jérôme; Heckemann, Rolf A.; Costes, Nicolas; Hammers, Alexander

    2017-04-01

In simultaneous PET-MR, attenuation maps are not directly available. Essential for absolute radioactivity quantification, they need to be derived from MR or PET data to correct for gamma photon attenuation by the imaged object. We evaluate a multi-atlas attenuation correction method for brain imaging (MaxProb) on static [18F]FDG PET and, for the first time, on dynamic PET, using the serotoninergic tracer [18F]MPPF. A database of 40 MR/CT image pairs (atlases) was used. The MaxProb method synthesises subject-specific pseudo-CTs by registering each atlas to the target subject space. Atlas CT intensities are then fused via label propagation and majority voting. Here, we compared these pseudo-CTs with the real CTs in a leave-one-out design, contrasting the MaxProb approach with a simplified single-atlas method (SingleAtlas). We evaluated the impact of pseudo-CT accuracy on reconstructed PET images, compared to PET data reconstructed with real CT, at the regional and voxel levels for the following: radioactivity images; time-activity curves; and kinetic parameters (non-displaceable binding potential, BPND). On static [18F]FDG, the mean bias for MaxProb ranged between 0 and 1% for 73 out of 84 regions assessed, and exceptionally peaked at 2.5% for only one region. Statistical parametric map analysis of MaxProb-corrected PET data showed significant differences in less than 0.02% of the brain volume, whereas SingleAtlas-corrected data showed significant differences in 20% of the brain volume. On dynamic [18F]MPPF, most regional errors on BPND ranged from -1 to +3% (maximum bias 5%) for the MaxProb method. With SingleAtlas, errors were larger and had higher variability in most regions. PET quantification bias increased over the duration of the dynamic scan for SingleAtlas, but not for MaxProb. We show that this effect is due to the interaction between the change over time in the spatial heterogeneity of the tracer distribution and the accuracy of the attenuation maps.
This work demonstrates that inaccuracies in attenuation maps can induce bias in dynamic brain PET studies. Multi-atlas attenuation correction with MaxProb enables quantification on hybrid PET-MR scanners, eschewing the need for CT.
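The label-propagation-with-majority-voting fusion that MaxProb uses to turn registered atlas CTs into a pseudo-CT can be sketched per voxel as follows. The array shapes and the rule of averaging only the atlases that agree with the vote are illustrative assumptions, not the published implementation.

```python
import numpy as np

def fuse_pseudo_ct(atlas_labels, atlas_cts):
    """Fuse registered atlases into a pseudo-CT.

    atlas_labels, atlas_cts: (n_atlases, n_voxels) arrays of propagated
    tissue labels and registered CT intensities in subject space.
    """
    n_vox = atlas_labels.shape[1]
    # Majority vote over atlas labels at each voxel.
    majority = np.array([np.bincount(atlas_labels[:, v]).argmax()
                         for v in range(n_vox)])
    # Average the CT intensities of the atlases that agree with the vote.
    agree = atlas_labels == majority[None, :]
    pseudo_ct = (atlas_cts * agree).sum(axis=0) / agree.sum(axis=0)
    return majority, pseudo_ct

labels = np.array([[0, 1], [0, 1], [1, 1]])                     # 3 atlases, 2 voxels
cts = np.array([[10.0, 100.0], [20.0, 110.0], [500.0, 120.0]])  # paired CTs
maj, pct = fuse_pseudo_ct(labels, cts)
```

In the toy example the outlier atlas (CT value 500 at voxel 0) is excluded from the fused intensity because its label loses the vote.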

  7. Multi-atlas attenuation correction supports full quantification of static and dynamic brain PET data in PET-MR.

    PubMed

    Mérida, Inés; Reilhac, Anthonin; Redouté, Jérôme; Heckemann, Rolf A; Costes, Nicolas; Hammers, Alexander

    2017-04-07

In simultaneous PET-MR, attenuation maps are not directly available. Essential for absolute radioactivity quantification, they need to be derived from MR or PET data to correct for gamma photon attenuation by the imaged object. We evaluate a multi-atlas attenuation correction method for brain imaging (MaxProb) on static [18F]FDG PET and, for the first time, on dynamic PET, using the serotoninergic tracer [18F]MPPF. A database of 40 MR/CT image pairs (atlases) was used. The MaxProb method synthesises subject-specific pseudo-CTs by registering each atlas to the target subject space. Atlas CT intensities are then fused via label propagation and majority voting. Here, we compared these pseudo-CTs with the real CTs in a leave-one-out design, contrasting the MaxProb approach with a simplified single-atlas method (SingleAtlas). We evaluated the impact of pseudo-CT accuracy on reconstructed PET images, compared to PET data reconstructed with real CT, at the regional and voxel levels for the following: radioactivity images; time-activity curves; and kinetic parameters (non-displaceable binding potential, BPND). On static [18F]FDG, the mean bias for MaxProb ranged between 0 and 1% for 73 out of 84 regions assessed, and exceptionally peaked at 2.5% for only one region. Statistical parametric map analysis of MaxProb-corrected PET data showed significant differences in less than 0.02% of the brain volume, whereas SingleAtlas-corrected data showed significant differences in 20% of the brain volume. On dynamic [18F]MPPF, most regional errors on BPND ranged from -1 to +3% (maximum bias 5%) for the MaxProb method. With SingleAtlas, errors were larger and had higher variability in most regions. PET quantification bias increased over the duration of the dynamic scan for SingleAtlas, but not for MaxProb.
We show that this effect is due to the interaction between the change over time in the spatial heterogeneity of the tracer distribution and the accuracy of the attenuation maps. This work demonstrates that inaccuracies in attenuation maps can induce bias in dynamic brain PET studies. Multi-atlas attenuation correction with MaxProb enables quantification on hybrid PET-MR scanners, eschewing the need for CT.

  8. Whole-body hybrid imaging concept for the integration of PET/MR into radiation therapy treatment planning.

    PubMed

    Paulus, Daniel H; Oehmigen, Mark; Grüneisen, Johannes; Umutlu, Lale; Quick, Harald H

    2016-05-07

Modern radiation therapy (RT) treatment planning is based on multimodality imaging. With the recent availability of whole-body PET/MR hybrid imaging, new opportunities arise to improve target volume delineation in RT treatment planning. This, however, requires dedicated RT equipment for reproducible patient positioning on the PET/MR system, which has to be compatible with MR and PET imaging. A prototype flat RT table overlay, radiofrequency (RF) coil holders for head imaging, and RF body bridges for body imaging were developed and tested for PET/MR system integration. Attenuation correction (AC) of all individual RT components was performed by generating 3D CT-based template models. A custom-built program for μ-map generation assembles all AC templates depending on the presence and position of each RT component. All RT devices were evaluated in phantom experiments with regard to MR and PET imaging compatibility, attenuation correction, PET quantification, and position accuracy. The entire RT setup was then evaluated in a first PET/MR patient study on five patients at different body regions. All tested devices are PET/MR compatible and do not produce visible artifacts or disturb image quality. The RT components showed a repositioning accuracy of better than 2 mm. Photon attenuation of -11.8% in the top part of the phantom was observed, which was reduced to -1.7% with AC using the μ-map generator. Active lesions of 3 subjects were evaluated in terms of SUVmean, and underestimations of -10.0% and -2.4% were calculated without and with AC of the RF body bridges, respectively. The new dedicated RT equipment for hybrid PET/MR imaging enables acquisitions in all body regions. It is compatible with PET/MR imaging, and all hardware components can be corrected for in the AC by using the suggested μ-map generator. These developments provide the technical and methodological basis for integration of PET/MR hybrid imaging into RT planning.

  9. Satellite-map position estimation for the Mars rover

    NASA Technical Reports Server (NTRS)

    Hayashi, Akira; Dean, Thomas

    1989-01-01

    A method for locating the Mars rover using an elevation map generated from satellite data is described. In exploring its environment, the rover is assumed to generate a local rover-centered elevation map that can be used to extract information about the relative position and orientation of landmarks corresponding to local maxima. These landmarks are integrated into a stochastic map which is then matched with the satellite map to obtain an estimate of the robot's current location. The landmarks are not explicitly represented in the satellite map. The results of the matching algorithm correspond to a probabilistic assessment of whether or not the robot is located within a given region of the satellite map. By assigning a probabilistic interpretation to the information stored in the satellite map, researchers are able to provide a precise characterization of the results computed by the matching algorithm.
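The probabilistic assessment described here, whether the rover lies within a given region of the satellite map, amounts to a Bayes update of region probabilities by landmark-match likelihoods. A minimal sketch; the prior and the match scores are invented for illustration.

```python
import numpy as np

def region_posterior(prior, likelihood):
    """Posterior probability of each candidate map region, given the
    likelihood of the observed landmark configuration under each region."""
    unnorm = prior * likelihood
    return unnorm / unnorm.sum()

prior = np.array([0.25, 0.25, 0.25, 0.25])   # four candidate regions, no prior preference
match = np.array([0.9, 0.3, 0.05, 0.05])     # hypothetical landmark match scores
post = region_posterior(prior, match)
```

After the update, most of the probability mass concentrates on the region whose terrain best explains the observed landmark configuration.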

  10. Whole-tumor histogram analysis of the cerebral blood volume map: tumor volume defined by 11C-methionine positron emission tomography image improves the diagnostic accuracy of cerebral glioma grading.

    PubMed

    Wu, Rongli; Watanabe, Yoshiyuki; Arisawa, Atsuko; Takahashi, Hiroto; Tanaka, Hisashi; Fujimoto, Yasunori; Watabe, Tadashi; Isohashi, Kayako; Hatazawa, Jun; Tomiyama, Noriyuki

    2017-10-01

This study aimed to compare tumor volume definitions from conventional magnetic resonance (MR) and 11C-methionine positron emission tomography (MET/PET) images for differentiating pre-operative glioma grade using whole-tumor histogram analysis of normalized cerebral blood volume (nCBV) maps. Thirty-four patients with histopathologically proven primary brain low-grade gliomas (n = 15) and high-grade gliomas (n = 19) underwent pre-operative or pre-biopsy MET/PET, fluid-attenuated inversion recovery, dynamic susceptibility contrast perfusion-weighted magnetic resonance imaging, and contrast-enhanced T1-weighted imaging at 3.0 T. The histogram distribution derived from the nCBV maps was obtained by co-registering the whole tumor volume delineated on conventional MR or MET/PET images, and eight histogram parameters were assessed. The mean nCBV value had the highest AUC value (0.906) based on MET/PET images. Diagnostic accuracy significantly improved when the tumor volume was measured from MET/PET images compared with conventional MR images for the mean, 50th-percentile, and 75th-percentile nCBV values (p = 0.0246, 0.0223, and 0.0150, respectively). Whole-tumor histogram analysis of the CBV map provides more valuable histogram parameters and increases diagnostic accuracy in the differentiation of pre-operative cerebral gliomas when the tumor volume is derived from MET/PET images.
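The histogram parameters compared in this record (mean, 50th and 75th percentile of nCBV within the tumor volume) reduce to masked percentile computations. A minimal sketch with an arbitrary toy map and mask:

```python
import numpy as np

def histogram_params(ncbv, tumor_mask):
    """Whole-tumor histogram parameters from an nCBV map and a binary mask."""
    vals = ncbv[tumor_mask.astype(bool)]
    return {"mean": float(vals.mean()),
            "p50": float(np.percentile(vals, 50)),
            "p75": float(np.percentile(vals, 75))}

ncbv = np.arange(9.0).reshape(3, 3)     # toy nCBV map
mask = np.ones((3, 3), dtype=int)       # whole image treated as "tumor"
params = histogram_params(ncbv, mask)
```

In practice the mask would come from the tumor volume delineated on MR or MET/PET images, which is exactly the variable the study manipulates.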

  11. PET Mapping for Brain-Computer Interface Stimulation of the Ventroposterior Medial Nucleus of the Thalamus in Rats with Implanted Electrodes.

    PubMed

    Zhu, Yunqi; Xu, Kedi; Xu, Caiyun; Zhang, Jiacheng; Ji, Jianfeng; Zheng, Xiaoxiang; Zhang, Hong; Tian, Mei

    2016-07-01

    Brain-computer interface (BCI) technology has great potential for improving the quality of life for neurologic patients. This study aimed to use PET mapping for BCI-based stimulation in a rat model with electrodes implanted in the ventroposterior medial (VPM) nucleus of the thalamus. PET imaging studies were conducted before and after stimulation of the right VPM. Stimulation induced significant orienting performance. (18)F-FDG uptake increased significantly in the paraventricular thalamic nucleus, septohippocampal nucleus, olfactory bulb, left crus II of the ansiform lobule of the cerebellum, and bilaterally in the lateral septum, amygdala, piriform cortex, endopiriform nucleus, and insular cortex, but it decreased in the right secondary visual cortex, right simple lobule of the cerebellum, and bilaterally in the somatosensory cortex. This study demonstrated that PET mapping after VPM stimulation can identify specific brain regions associated with orienting performance. PET molecular imaging may be an important approach for BCI-based research and its clinical applications. © 2016 by the Society of Nuclear Medicine and Molecular Imaging, Inc.

  12. Probabilistic flood extent estimates from social media flood observations

    NASA Astrophysics Data System (ADS)

    Brouwer, Tom; Eilander, Dirk; van Loenen, Arnejan; Booij, Martijn J.; Wijnberg, Kathelijne M.; Verkade, Jan S.; Wagemaker, Jurjen

    2017-05-01

    The increasing number and severity of floods, driven by phenomena such as urbanization, deforestation, subsidence and climate change, create a growing need for accurate and timely flood maps. In this paper we present and evaluate a method to create deterministic and probabilistic flood maps from Twitter messages that mention locations of flooding. A deterministic flood map created for the December 2015 flood in the city of York (UK) showed good performance (F(2) = 0.69; a statistic ranging from 0 to 1, with 1 expressing a perfect fit with validation data). The probabilistic flood maps we created showed that, in the York case study, the uncertainty in flood extent was mainly induced by errors in the precise locations of flood observations as derived from Twitter data. Errors in the terrain elevation data or in the parameters of the applied algorithm contributed less to flood extent uncertainty. Although these maps tended to overestimate the actual probability of flooding, they gave a reasonable representation of flood extent uncertainty in the area. This study illustrates that inherently uncertain data from social media can be used to derive information about flooding.
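The F2 statistic quoted above is the F-beta score with beta = 2, which weights recall more heavily than precision. From pixel-wise true positives, false positives and false negatives it reads:

```python
def f_beta(tp, fp, fn, beta=2.0):
    """F-beta score from contingency counts; beta = 2 gives the F2 value
    used to validate the deterministic flood map against observations."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    b2 = beta ** 2
    return (1 + b2) * precision * recall / (b2 * precision + recall)
```

With beta = 2, trading precision for recall raises the score, which suits flood mapping where missed inundated pixels are costlier than false alarms.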

  13. Develop Probabilistic Tsunami Design Maps for ASCE 7

    NASA Astrophysics Data System (ADS)

    Wei, Y.; Thio, H. K.; Chock, G.; Titov, V. V.

    2014-12-01

A national standard for engineering design for tsunami effects has not existed before, and this significant risk is mostly ignored in engineering design. The American Society of Civil Engineers (ASCE) 7 Tsunami Loads and Effects Subcommittee is completing a chapter for the 2016 edition of the ASCE/SEI 7 Standard. Chapter 6, Tsunami Loads and Effects, would become the first national tsunami design provisions. These provisions will apply to essential facilities and critical infrastructure, and to designs undertaken as part of tsunami preparedness. They will also have significance as a post-tsunami recovery tool for planning and evaluating reconstruction. Maps of 2,500-year probabilistic tsunami inundation for Alaska, Washington, Oregon, California, and Hawaii need to be developed for use with the ASCE design provisions. These new tsunami design zone maps will define the coastal zones where structures of greater importance would be designed for tsunami resistance and community resilience. The NOAA Center for Tsunami Research (NCTR) has developed 75 tsunami inundation models as part of the operational tsunami model forecast capability for the U.S. coastline. NCTR, UW, and URS are collaborating with ASCE to develop the 2,500-year tsunami design maps for the Pacific states using these tsunami models, ensuring that probabilistic criteria are established in ASCE's tsunami design maps. URS established a Probabilistic Tsunami Hazard Assessment approach consisting of a large number of tsunami scenarios that include both epistemic uncertainty and aleatory variability (Thio et al., 2010). Their study provides 2,500-year offshore tsunami heights at the 100-m water depth, along with the disaggregated earthquake sources. NOAA's tsunami models are used to identify a group of sources that produce these 2,500-year tsunami heights.
The tsunami inundation limits and runup heights derived from these sources establish the tsunami design map for the study site. ASCE's Energy Grade Line Analysis then uses these modeling constraints to derive hydrodynamic forces for structures within the tsunami design zone. The probabilistic tsunami design maps will be validated by comparison to state inundation maps under the coordination of the National Tsunami Hazard Mitigation Program.

  14. Effect of Time-of-Flight Information on PET/MR Reconstruction Artifacts: Comparison of Free-breathing versus Breath-hold MR-based Attenuation Correction.

    PubMed

    Delso, Gaspar; Khalighi, Mohammed; Ter Voert, Edwin; Barbosa, Felipe; Sekine, Tetsuro; Hüllner, Martin; Veit-Haibach, Patrick

    2017-01-01

Purpose: To evaluate the magnitude and anatomic extent of the artifacts introduced on positron emission tomographic (PET)/magnetic resonance (MR) images by respiratory state mismatch in the attenuation map. Materials and Methods: The method was tested on 14 patients referred for an oncologic examination who underwent PET/MR imaging. The acquisition included standard PET and MR series for each patient, and an additional attenuation correction series was acquired by using breath hold. PET data were reconstructed with and without time-of-flight (TOF) information, first by using the standard free-breathing attenuation map and then again by using the additional breath-hold map. Two-tailed paired t testing and linear regression with 0 intercept was performed on TOF versus non-TOF and free-breathing versus breath-hold data for all detected lesions. Results: Fluorodeoxyglucose-avid lesions were found in eight of the 14 patients included in the study. The uptake differences (maximum standardized uptake values) between PET reconstructions with free-breathing versus breath-hold attenuation ranged, for non-TOF reconstructions, from -18% to 26%. The corresponding TOF reconstructions yielded differences from -15% to 18%. Conclusion: TOF information was shown to reduce the artifacts caused at PET/MR by respiratory mismatch between emission and attenuation data. © RSNA, 2016 Online supplemental material is available for this article.

  15. Spatial planning using probabilistic flood maps

    NASA Astrophysics Data System (ADS)

    Alfonso, Leonardo; Mukolwe, Micah; Di Baldassarre, Giuliano

    2015-04-01

Probabilistic flood maps account for uncertainty in flood inundation modelling and convey a degree of certainty in the outputs. Major sources of uncertainty include input data, topographic data, model structure, observation data and parametric uncertainty. Decision makers prefer less ambiguous information from modellers; this implies that uncertainty is suppressed to yield binary flood maps. However, suppressing information may lead to surprises or misleading decisions. Including uncertain information in the decision-making process is therefore desirable and makes the process transparent. To this end, we use prospect theory and information from a probabilistic flood map to evaluate potential decisions. Consequences of the decisions were evaluated using flood risk analysis. Prospect theory explains how choices are made given options for which probabilities of occurrence are known, and accounts for decision makers' characteristics such as loss aversion and risk seeking. Our results show that decision making is most pronounced when there are high gains and losses, implying higher payoffs and penalties and therefore a higher gamble. The methodology may thus be appropriately considered when making decisions based on uncertain information.
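The prospect-theory machinery invoked above can be sketched with the standard value function: concave for gains, convex and loss-averse for losses. The parameter values are the commonly cited Tversky-Kahneman estimates and the flood-mitigation prospect is invented, both purely for illustration.

```python
def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Prospect-theory value function: x^alpha for gains,
    -lam * (-x)^beta for losses (lam > 1 encodes loss aversion)."""
    return x ** alpha if x >= 0 else -lam * (-x) ** beta

def prospect_score(outcomes, probs):
    """Subjective value of a prospect. For simplicity the probabilities are
    used directly; the full theory also applies a probability weighting."""
    return sum(p * prospect_value(x) for x, p in zip(outcomes, probs))

# A risky choice: 5% chance of a large flood loss vs a certain mitigation cost.
gamble = prospect_score([0.0, -100.0], [0.95, 0.05])
sure_cost = prospect_score([-6.0], [1.0])
```

Here the risky prospect scores above the certain cost, illustrating the risk-seeking-in-losses behaviour that prospect theory predicts and that binary flood maps cannot capture.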

  16. Prediction of CT Substitutes from MR Images Based on Local Diffeomorphic Mapping for Brain PET Attenuation Correction.

    PubMed

    Wu, Yao; Yang, Wei; Lu, Lijun; Lu, Zhentai; Zhong, Liming; Huang, Meiyan; Feng, Yanqiu; Feng, Qianjin; Chen, Wufan

    2016-10-01

Attenuation correction is important for PET reconstruction. In PET/MR, MR intensities are not directly related to the attenuation coefficients needed in PET imaging. The attenuation coefficient map can be derived from CT images; therefore, prediction of CT substitutes from MR images is desired for attenuation correction in PET/MR. This study presents a patch-based method for CT prediction from MR images, generating attenuation maps for PET reconstruction. Because no global relation exists between MR and CT intensities, we propose local diffeomorphic mapping (LDM) for CT prediction. In LDM, we assume that MR and CT patches are located on two nonlinear manifolds, and that the mapping from the MR manifold to the CT manifold approximates a diffeomorphism under a local constraint. Locality is central to LDM and is enforced by the following techniques. The first is local dictionary construction, wherein, for each patch in the testing MR image, a local search window is used to extract patches from training MR/CT pairs to construct MR and CT dictionaries; k-nearest neighbors and an outlier detection strategy then constrain the locality in the MR and CT dictionaries. The second is local linear representation, wherein local anchor embedding is used to solve the MR dictionary coefficients when representing the MR testing sample. Under these local constraints, dictionary coefficients are linearly transferred from the MR manifold to the CT manifold and used to combine CT training samples to generate CT predictions. Our dataset contains 13 healthy subjects, each with T1- and T2-weighted MR and CT brain images. The method provides CT predictions with a mean absolute error of 110.1 Hounsfield units, Pearson linear correlation of 0.82, peak signal-to-noise ratio of 24.81 dB, and Dice coefficient in bone regions of 0.84 as compared with real CTs. CT substitute-based PET reconstruction has a regression slope of 1.0084 and R2 of 0.9903 compared with real CT-based PET.
In this method, no image segmentation or accurate registration is required. Our method demonstrates superior performance in CT prediction and PET reconstruction compared with competing methods. © 2016 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
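The core of such patch-based CT prediction, a local MR/CT dictionary queried by patch similarity, can be sketched with plain distance-weighted k-nearest neighbours. This stands in for, but is not, the local anchor embedding solver used in the paper; all patch values are toy data.

```python
import numpy as np

def predict_ct_patch(mr_patch, dict_mr, dict_ct, k=2):
    """Predict a CT value for one MR patch from a local dictionary of
    MR/CT training pairs, via distance-weighted k-nearest neighbours
    in MR patch space."""
    d = np.linalg.norm(dict_mr - mr_patch, axis=1)
    nn = np.argsort(d)[:k]                 # the k most similar MR patches
    w = 1.0 / (d[nn] + 1e-12)              # closer patches weigh more
    w /= w.sum()
    return float(w @ dict_ct[nn])          # combine the paired CT samples

dict_mr = np.array([[0.0, 0.0], [1.0, 1.0], [10.0, 10.0]])  # toy MR patches
dict_ct = np.array([0.0, 100.0, 1000.0])                    # paired CT values
pred = predict_ct_patch(np.array([0.1, 0.1]), dict_mr, dict_ct, k=2)
```

Restricting the dictionary to a local spatial search window, as the paper does, would simply shrink `dict_mr`/`dict_ct` to patches drawn near the query location.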

  17. Can target-to-pons ratio be used as a reliable method for the analysis of [11C]PIB brain scans?

    PubMed

    Edison, P; Hinz, R; Ramlackhansingh, A; Thomas, J; Gelosa, G; Archer, H A; Turkheimer, F E; Brooks, D J

    2012-04-15

[(11)C]PIB is the most widely used PET imaging marker for amyloid in dementia studies. In the majority of studies the cerebellum has been used as a reference region. However, cerebellar amyloid may be present in genetic Alzheimer's disease (AD), cerebral amyloid angiopathy and prion diseases. Therefore, we investigated whether the pons could be used as an alternative reference region for the analysis of [(11)C]PIB binding in AD. The aims of the study were to: 1) evaluate the pons as a reference region using an arterial plasma input function and Logan graphical analysis of binding; 2) assess the power of target-to-pons ratios to discriminate controls from AD subjects; 3) determine the test-retest reliability in AD subjects; 4) demonstrate the application of the target-to-pons ratio in subjects with elevated cerebellar [(11)C]PIB binding. Twelve sporadic AD subjects aged 65 ± 4.5 yrs with a mean MMSE of 21.4 ± 4 and 10 age-matched control subjects had [(11)C]PIB PET with arterial blood sampling. Three additional subjects (two pre-symptomatic presenilin-1 mutation carriers and one probable familial AD case) were also studied. Object maps were created by segmenting individual MRIs, spatially transforming the gray matter images into standard stereotaxic MNI space and then superimposing a probabilistic atlas. Cortical [(11)C]PIB binding was assessed with an ROI (region of interest) analysis. Parametric maps of the volume of distribution (V(T)) were generated with Logan analysis. Additionally, parametric maps of the 60-90 min target-to-cerebellar ratio (RATIO(CER)) and the 60-90 min target-to-pons ratio (RATIO(PONS)) were computed. All three approaches were able to differentiate AD from controls (p<0.0001, nonparametric Wilcoxon rank sum test) in the target regions, with RATIO(CER) and RATIO(PONS) differences higher than those for V(T) with an arterial input function.
All methods had good reproducibility (intraclass correlation coefficient >0.83); RATIO(CER) performed best, closely followed by RATIO(PONS). The two subjects with presenilin-1 mutations and the probable familial AD case showed no significant differences in cortical binding using RATIO(CER), but the RATIO(PONS) approach revealed higher [(11)C]PIB binding in cortex and cerebellum. This study established 60-90 min target-to-pons ratios as a reliable method of analysis in [(11)C]PIB PET studies where the cerebellum is not an appropriate reference region. Copyright © 2012 Elsevier Inc. All rights reserved.
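The 60-90 min target-to-reference ratio used in this record is simply the mean uptake in a target ROI divided by that in the reference ROI (pons or cerebellum) on the summed late image. A minimal masked-mean sketch with toy voxel values:

```python
import numpy as np

def target_to_reference_ratio(summed_image, target_mask, ref_mask):
    """Ratio of mean late-frame uptake in a target ROI to that in a
    reference ROI (e.g. pons), both given as binary masks."""
    target = summed_image[target_mask.astype(bool)].mean()
    reference = summed_image[ref_mask.astype(bool)].mean()
    return float(target / reference)

img = np.array([2.0, 2.2, 1.0, 1.1])      # toy summed 60-90 min voxel values
ratio = target_to_reference_ratio(img,
                                  np.array([1, 1, 0, 0]),   # "target" voxels
                                  np.array([0, 0, 1, 1]))   # "reference" voxels
```

Computing this voxel-wise against a single reference-region mean is what produces the RATIO parametric maps described above.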

  18. Optimized MLAA for quantitative non-TOF PET/MR of the brain

    NASA Astrophysics Data System (ADS)

    Benoit, Didier; Ladefoged, Claes N.; Rezaei, Ahmadreza; Keller, Sune H.; Andersen, Flemming L.; Højgaard, Liselotte; Hansen, Adam E.; Holm, Søren; Nuyts, Johan

    2016-12-01

For quantitative tracer distribution in positron emission tomography, attenuation correction is essential. In a hybrid PET/CT system the CT images serve as a basis for generation of the attenuation map, but in PET/MR, the MR images do not have a similarly simple relationship with the attenuation map; hence attenuation correction in PET/MR systems is more challenging. Typically, one of two MR sequences is used: the Dixon or the ultra-short echo time (UTE) technique. However, these sequences have some well-known limitations. In this study, a reconstruction technique based on a modified and optimized non-TOF MLAA is proposed for PET/MR brain imaging. The idea is to tune the parameters of the MLTR using information from an attenuation image computed from the UTE sequences and a T1w MR image. In this MLTR algorithm, an {αj} parameter is introduced and optimized in order to drive the algorithm to a final attenuation map most consistent with the emission data. Because the non-TOF MLAA is used, a technique to reduce the cross-talk effect is proposed. The proposed algorithm is compared to common reconstruction methods: OSEM using a CT attenuation map, considered as the reference, and OSEM using the Dixon and UTE attenuation maps. To show the robustness and the reproducibility of the proposed algorithm, a set of 204 [18F]FDG patients, 35 [11C]PiB patients and 1 [18F]FET patient is used. The results show that, by choosing an optimized value of {αj} in MLTR, the proposed algorithm improves on the standard MR-based attenuation correction methods (i.e. OSEM using the Dixon or the UTE attenuation maps), and the cross-talk and scale problems are limited.

  19. Recent advances in parametric neuroreceptor mapping with dynamic PET: basic concepts and graphical analyses.

    PubMed

    Seo, Seongho; Kim, Su Jin; Lee, Dong Soo; Lee, Jae Sung

    2014-10-01

Tracer kinetic modeling in dynamic positron emission tomography (PET) has been widely used to investigate the characteristic distribution patterns or dysfunctions of neuroreceptors in brain diseases. Its practical goal has progressed from regional data quantification to parametric mapping, which produces images of kinetic-model parameters by fully exploiting the spatiotemporal information in dynamic PET data. Graphical analysis (GA) is a major parametric mapping technique that is independent of any compartmental model configuration, robust to noise, and computationally efficient. In this paper, we provide an overview of recent advances in the parametric mapping of neuroreceptor binding based on GA methods. The associated basic concepts in tracer kinetic modeling are presented, including commonly used compartment models and major parameters of interest. Technical details of GA approaches for reversible and irreversible radioligands are described, considering both plasma input and reference tissue input models. Their statistical properties are discussed in view of parametric imaging.
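For a reversible tracer with plasma input, the best-known graphical analysis is the Logan plot: the total distribution volume VT is the late-time slope of (∫C dt)/C(t) versus (∫Cp dt)/C(t). A self-contained sketch on a synthetic one-tissue-compartment TAC (K1 = 0.5, k2 = 0.25, so VT = K1/k2 = 2; the input function and t* are illustrative choices):

```python
import numpy as np

def cumtrapz0(t, y):
    """Cumulative trapezoidal integral with a leading zero."""
    return np.concatenate([[0.0], np.cumsum(np.diff(t) * (y[1:] + y[:-1]) / 2)])

def logan_vt(t, ct, cp, t_star=10.0):
    """Logan plot slope over the late linear phase (t >= t_star)."""
    x = cumtrapz0(t, cp)
    y = cumtrapz0(t, ct)
    keep = (t >= t_star) & (ct > 0)
    slope, _ = np.polyfit(x[keep] / ct[keep], y[keep] / ct[keep], 1)
    return slope

# Synthetic one-tissue-compartment data with plasma input Cp(t) = exp(-0.1 t):
# the tissue TAC then has the closed form below, and VT = K1 / k2.
K1, k2, lam = 0.5, 0.25, 0.1
t = np.linspace(0.0, 60.0, 601)
cp = np.exp(-lam * t)
ct = K1 / (k2 - lam) * (np.exp(-lam * t) - np.exp(-k2 * t))
vt = logan_vt(t, ct, cp)
```

Applied voxel by voxel, this slope estimate is what turns a dynamic scan into a VT parametric map; its noise-induced bias is one of the statistical properties the review discusses.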

  20. Magnetic Tunnel Junction Mimics Stochastic Cortical Spiking Neurons

    NASA Astrophysics Data System (ADS)

    Sengupta, Abhronil; Panda, Priyadarshini; Wijesinghe, Parami; Kim, Yusung; Roy, Kaushik

    2016-07-01

Brain-inspired computing architectures attempt to mimic the computations performed by the neurons and synapses in the human brain in order to achieve its efficiency in learning and cognitive tasks. In this work, we demonstrate the mapping of the probabilistic spiking nature of pyramidal neurons in the cortex to the stochastic switching behavior of a Magnetic Tunnel Junction (MTJ) in the presence of thermal noise. We present results illustrating the efficiency of neuromorphic systems based on such probabilistic neurons for pattern recognition tasks in the presence of lateral inhibition and homeostasis. Such stochastic MTJ neurons can also potentially provide a direct mapping to the probabilistic computing elements in Belief Networks for performing regenerative tasks.
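The stochastic neuron abstraction described here can be sketched by modelling the thermally assisted MTJ switching probability as a sigmoid of the synaptic drive; the neuron then spikes with that probability. This is a behavioural sketch with an arbitrary scale parameter, not a device-level magnetization model.

```python
import numpy as np

def mtj_neuron_spike(inputs, weights, rng, scale=1.0):
    """Stochastic spiking neuron: switching probability is a sigmoid of
    the weighted input drive; the spike is a Bernoulli draw at that rate."""
    drive = float(np.dot(inputs, weights))
    p_switch = 1.0 / (1.0 + np.exp(-scale * drive))
    return rng.random() < p_switch, p_switch

rng = np.random.default_rng(0)
spike_hi, p_hi = mtj_neuron_spike(np.array([1.0]), np.array([50.0]), rng)
spike_lo, p_lo = mtj_neuron_spike(np.array([1.0]), np.array([-50.0]), rng)
```

Strong positive drive makes switching near-certain and strong negative drive near-impossible; intermediate drives give the graded firing probabilities exploited for pattern recognition.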

  1. Quantitative Evaluation of 2 Scatter-Correction Techniques for 18F-FDG Brain PET/MRI in Regard to MR-Based Attenuation Correction.

    PubMed

    Teuho, Jarmo; Saunavaara, Virva; Tolvanen, Tuula; Tuokkola, Terhi; Karlsson, Antti; Tuisku, Jouni; Teräs, Mika

    2017-10-01

In PET, corrections for photon scatter and attenuation are essential for visual and quantitative consistency. MR attenuation correction (MRAC) is generally conducted by image segmentation and assignment of discrete attenuation coefficients, which offer limited accuracy compared with CT attenuation correction. Potential inaccuracies in MRAC may affect scatter correction, because the attenuation image (μ-map) is used in single scatter simulation (SSS) to calculate the scatter estimate. We assessed the impact of MRAC on scatter correction using 2 scatter-correction techniques and 3 μ-maps for MRAC. Methods: The tail-fitted SSS (TF-SSS) and a Monte Carlo-based single scatter simulation (MC-SSS) algorithm implementations on the Philips Ingenuity TF PET/MR were used with 1 CT-based and 2 MR-based μ-maps. Data from 7 subjects were used in the clinical evaluation, and a phantom study using an anatomic brain phantom was conducted. Scatter-correction sinograms were evaluated for each scatter-correction method and μ-map. Absolute image quantification was investigated with the phantom data. Quantitative assessment of PET images was performed by volume-of-interest and ratio image analysis. Results: MRAC did not result in large differences in scatter algorithm performance, especially with TF-SSS. Scatter sinograms and scatter fractions did not reveal large differences regardless of the μ-map used. TF-SSS showed slightly higher absolute quantification. The differences in volume-of-interest analysis between TF-SSS and MC-SSS were 3% at maximum in the phantom and 4% in the patient study. Both algorithms showed excellent correlation with each other, with no visual differences between PET images. MC-SSS showed a slight dependency on the μ-map used, with a difference of 2% on average and 4% at maximum when a μ-map without bone was used.
Conclusion: The effect of different MR-based μ-maps on the performance of scatter correction was minimal in non-time-of-flight 18F-FDG PET/MR brain imaging. The SSS algorithm was not affected significantly by MRAC. The performance of the MC-SSS algorithm is comparable but not superior to TF-SSS, warranting further investigations of algorithm optimization and performance with different radiotracers and time-of-flight imaging. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.

  2. Mapping visual cortex in monkeys and humans using surface-based atlases

    NASA Technical Reports Server (NTRS)

    Van Essen, D. C.; Lewis, J. W.; Drury, H. A.; Hadjikhani, N.; Tootell, R. B.; Bakircioglu, M.; Miller, M. I.

    2001-01-01

    We have used surface-based atlases of the cerebral cortex to analyze the functional organization of visual cortex in humans and macaque monkeys. The macaque atlas contains multiple partitioning schemes for visual cortex, including a probabilistic atlas of visual areas derived from a recent architectonic study, plus summary schemes that reflect a combination of physiological and anatomical evidence. The human atlas includes a probabilistic map of eight topographically organized visual areas recently mapped using functional MRI. To facilitate comparisons between species, we used surface-based warping to bring functional and geographic landmarks on the macaque map into register with corresponding landmarks on the human map. The results suggest that extrastriate visual cortex outside the known topographically organized areas is dramatically expanded in human compared to macaque cortex, particularly in the parietal lobe.

  3. Development of Maximum Considered Earthquake Ground Motion Maps

    USGS Publications Warehouse

    Leyendecker, E.V.; Hunt, R.J.; Frankel, A.D.; Rukstales, K.S.

    2000-01-01

    The 1997 NEHRP Recommended Provisions for Seismic Regulations for New Buildings use a design procedure that is based on spectral response acceleration rather than the traditional peak ground acceleration, peak ground velocity, or zone factors. The spectral response accelerations are obtained from maps prepared following the recommendations of the Building Seismic Safety Council's (BSSC) Seismic Design Procedures Group (SDPG). The SDPG-recommended maps, the Maximum Considered Earthquake (MCE) Ground Motion Maps, are based on the U.S. Geological Survey (USGS) probabilistic hazard maps with additional modifications incorporating deterministic ground motions in selected areas and the application of engineering judgement. The MCE ground motion maps included with the 1997 NEHRP Provisions also serve as the basis for the ground motion maps used in the seismic design portions of the 2000 International Building Code and the 2000 International Residential Code. Additionally the design maps prepared for the 1997 NEHRP Provisions, combined with selected USGS probabilistic maps, are used with the 1997 NEHRP Guidelines for the Seismic Rehabilitation of Buildings.

  4. Daytime variation in ambient temperature affects skin temperatures and blood pressure: Ambulatory winter/summer comparison in healthy young women.

    PubMed

    Martinez-Nicolas, Antonio; Meyer, Martin; Hunkler, Stefan; Madrid, Juan Antonio; Rol, Maria Angeles; Meyer, Andrea H; Schötzau, Andy; Orgül, Selim; Kräuchi, Kurt

    2015-10-01

    It is widely accepted that cold exposure increases peripheral vascular resistance and arterial blood pressure (BP) and, hence, increases cardiovascular risk primarily in the elderly. However, there is a lack of concomitant longitudinal recordings, at the personal level, of environmental temperature (PET) and cardiophysiological variables together with skin temperatures (STs, the “interface-variable” between the body core and ambient temperature). To investigate the intra-individual temporal relationships between PET, STs and BP, 60 healthy young women (52 completed the entire study) were prospectively studied in a winter/summer design for 26 h under real-life conditions. The main hypothesis tested was whether distal ST (Tdist) mediates the effect of PET changes on mean arterial BP (MAP). Diurnal profiles of cardiophysiological variables (including BP), STs and PET were ambulatory recorded. Daytime variations between 0930 and 2030 h were analyzed in detail by intra-individual longitudinal path analysis. Additionally, time segments before, during and after outdoor exposure were separately analyzed. In both seasons short-term variations in PET were positively associated with short-term changes in Tdist (not proximal ST, Tprox) and negatively with those in MAP. However, long-term seasonal differences in daytime mean levels were observed in STs but not in BP, leading to a non-significant inter-individual correlation between STs and BP. Additionally, higher individual body mass index (BMI) was significantly associated with lower daytime mean levels of Tprox and higher MAP, suggesting Tprox as a potential mediator variable for the association of BMI with MAP. In healthy young women, the thermoregulatory and BP-regulatory systems are closely linked with respect to short-term, but not long-term, changes in PET.
One hypothetical explanation is offered by recent findings that thermogenesis in brown adipose tissue is activated in a cool environment, which could be responsible for the counter-regulation of the cold-induced increase of BP in winter, leading to no seasonal differences in MAP. Our findings suggest that the assessment of diurnal patterns of STs and PET, in addition to conventional ambulatory BP monitoring, might improve individual cardiovascular risk prediction.

  5. Crystal identification for a dual-layer-offset LYSO based PET system via Lu-176 background radiation and mean shift algorithm

    NASA Astrophysics Data System (ADS)

    Wei, Qingyang; Ma, Tianyu; Xu, Tianpeng; Zeng, Ming; Gu, Yu; Dai, Tiantian; Liu, Yaqiang

    2018-01-01

    Modern positron emission tomography (PET) detectors are made from pixelated scintillation crystal arrays and read out using Anger logic. The interaction position of the gamma-ray should be assigned to a crystal using a crystal position map or look-up table. Crystal identification is a critical procedure for pixelated PET systems. In this paper, we propose a novel crystal identification method for a dual-layer-offset LYSO-based animal PET system via Lu-176 background radiation and the mean shift algorithm. Single photon event data of the Lu-176 background radiation are acquired in list-mode for 3 h to generate a single photon flood map (SPFM). Coincidence events are obtained from the same data using time information to generate a coincidence flood map (CFM). The CFM is used to identify the peaks of the inner layer using the mean shift algorithm. The response of the inner layer is removed from the SPFM by subtracting the CFM. Then, the peaks of the outer layer are also identified using the mean shift algorithm. The automatically identified peaks are manually inspected by a graphical user interface program. Finally, a crystal position map is generated using a distance criterion based on these peaks. The proposed method is verified on the animal PET system with 48 detector blocks on a laptop with an Intel i7-5500U processor. The total runtime for whole-system peak identification is 67.9 s. Results show that the automatic crystal identification has 99.98% and 99.09% accuracy for the peaks of the inner and outer layers of the whole system, respectively. In conclusion, the proposed method is suitable for the dual-layer-offset lutetium-based PET system to perform crystal identification instead of using external radiation sources.
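    The peak-identification step can be sketched generically: mean shift moves each seed to the average of nearby flood-map events until it settles on a local density peak. This is a minimal illustration with made-up 2D event clusters and an invented bandwidth, not the authors' implementation:

```python
import math

def mean_shift_peaks(points, bandwidth=1.0, tol=1e-4, max_iter=100):
    """Shift each seed toward the local mean of nearby points until
    convergence; converged positions cluster around density peaks."""
    modes = []
    for seed in points:
        x, y = seed
        for _ in range(max_iter):
            nbr = [(px, py) for px, py in points
                   if math.hypot(px - x, py - y) <= bandwidth]
            nx = sum(p[0] for p in nbr) / len(nbr)
            ny = sum(p[1] for p in nbr) / len(nbr)
            if math.hypot(nx - x, ny - y) < tol:
                break
            x, y = nx, ny
        modes.append((x, y))
    # merge modes closer than the bandwidth into a single peak
    peaks = []
    for m in modes:
        if all(math.hypot(m[0] - q[0], m[1] - q[1]) > bandwidth for q in peaks):
            peaks.append(m)
    return peaks

# two well-separated clusters of hypothetical "crystal" events
cluster_a = [(0.0, 0.0), (0.1, 0.1), (-0.1, 0.05), (0.05, -0.1)]
cluster_b = [(5.0, 5.0), (5.1, 4.9), (4.9, 5.05), (5.05, 5.1)]
peaks = mean_shift_peaks(cluster_a + cluster_b, bandwidth=1.0)
print(len(peaks))  # -> 2, one peak per crystal
```

A real flood map would seed the algorithm from thousands of events and tune the bandwidth to the crystal pitch.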

  6. Automatic delineation of tumor volumes by co-segmentation of combined PET/MR data

    NASA Astrophysics Data System (ADS)

    Leibfarth, S.; Eckert, F.; Welz, S.; Siegel, C.; Schmidt, H.; Schwenzer, N.; Zips, D.; Thorwarth, D.

    2015-07-01

    Combined PET/MRI may be highly beneficial for radiotherapy treatment planning in terms of tumor delineation and characterization. To standardize tumor volume delineation, an automatic algorithm for the co-segmentation of head and neck (HN) tumors based on PET/MR data was developed. Ten HN patient datasets acquired in a combined PET/MR system were available for this study. The proposed algorithm uses both the anatomical T2-weighted MR and FDG-PET data. For both imaging modalities tumor probability maps were derived, assigning each voxel a probability of being cancerous based on its signal intensity. A combination of these maps was subsequently segmented using a threshold level set algorithm. To validate the method, tumor delineations from three radiation oncologists were available. Inter-observer variabilities and variabilities between the algorithm and each observer were quantified by means of the Dice similarity index and a distance measure. Inter-observer variabilities and variabilities between observers and algorithm were found to be comparable, suggesting that the proposed algorithm is adequate for PET/MR co-segmentation. Moreover, taking into account combined PET/MR data resulted in more consistent tumor delineations compared to MR information only.
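    The map-combination idea can be sketched as a voxel-wise weighted average of the two tumour probability maps followed by a plain threshold (the paper uses a threshold level set algorithm for the final segmentation; the probabilities, weight, and threshold below are hypothetical):

```python
def combine_and_segment(p_pet, p_mr, w_pet=0.5, threshold=0.5):
    """Voxel-wise weighted combination of two tumour-probability maps,
    followed by a simple threshold (stand-in for the level-set step)."""
    combined = [w_pet * a + (1.0 - w_pet) * b for a, b in zip(p_pet, p_mr)]
    return [1 if p >= threshold else 0 for p in combined]

# hypothetical per-voxel tumour probabilities derived from PET and MR intensities
p_pet = [0.9, 0.8, 0.4, 0.1]
p_mr = [0.7, 0.3, 0.8, 0.2]
mask = combine_and_segment(p_pet, p_mr)
print(mask)  # -> [1, 1, 1, 0]
```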

  7. Optimal monochromatic color combinations for fusion imaging of FDG-PET and diffusion-weighted MR images.

    PubMed

    Kamei, Ryotaro; Watanabe, Yuji; Sagiyama, Koji; Isoda, Takuro; Togao, Osamu; Honda, Hiroshi

    2018-05-23

    To investigate the optimal monochromatic color combination for fusion imaging of FDG-PET and diffusion-weighted MR images (DW) regarding lesion conspicuity of each image. Six linear monochromatic color-maps of red, blue, green, cyan, magenta, and yellow were assigned to each of the FDG-PET and DW images. Total perceptual color differences of the lesions were calculated based on the lightness and chromaticity measured with a photometer. Visual lesion conspicuity was also compared among the PET-only, DW-only and PET-DW-double positive portions with mean conspicuity scores. Statistical analysis was performed with a one-way analysis of variance and Spearman's rank correlation coefficient. Among all the 12 possible monochromatic color-map combinations, the 3 combinations of red/cyan, magenta/green, and red/green produced the highest conspicuity scores. Total color differences between PET-positive and double-positive portions correlated with conspicuity scores (ρ = 0.2933, p < 0.005). Lightness differences showed a significant negative correlation with conspicuity scores between the PET-only and DW-only positive portions. Chromaticity differences showed a marginally significant correlation with conspicuity scores between DW-positive and double-positive portions. Monochromatic color combinations can facilitate the visual evaluation of FDG-uptake and diffusivity as well as registration accuracy on the FDG-PET/DW fusion images, when red- and green-colored elements are assigned to FDG-PET and DW images, respectively.
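    A "total perceptual color difference" combining a lightness term and chromaticity terms can be sketched as a CIE76-style ΔE, i.e. the Euclidean distance in L*a*b* space; the study's exact metric and the coordinates below are assumptions for illustration:

```python
import math

def delta_e(lab1, lab2):
    """CIE76-style total color difference: Euclidean distance in L*a*b*.
    The first component is lightness; a* and b* carry the chromaticity."""
    return math.sqrt(sum((c1 - c2) ** 2 for c1, c2 in zip(lab1, lab2)))

# hypothetical Lab coordinates for a PET-only and a double-positive lesion
pet_only = (60.0, 55.0, 30.0)    # reddish
double_pos = (55.0, -40.0, 35.0)  # greenish
de = delta_e(pet_only, double_pos)
print(round(de, 1))  # -> 95.3, a large (easily visible) difference
```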

  8. Simultaneous maximum a posteriori longitudinal PET image reconstruction

    NASA Astrophysics Data System (ADS)

    Ellis, Sam; Reader, Andrew J.

    2017-09-01

    Positron emission tomography (PET) is frequently used to monitor functional changes that occur over extended time scales, for example in longitudinal oncology PET protocols that include routine clinical follow-up scans to assess the efficacy of a course of treatment. In these contexts PET datasets are currently reconstructed into images using single-dataset reconstruction methods. Inspired by recently proposed joint PET-MR reconstruction methods, we propose to reconstruct longitudinal datasets simultaneously by using a joint penalty term in order to exploit the high degree of similarity between longitudinal images. We achieved this by penalising voxel-wise differences between pairs of longitudinal PET images in a one-step-late maximum a posteriori (MAP) fashion, resulting in the MAP simultaneous longitudinal reconstruction (SLR) method. The proposed method reduced reconstruction errors and visually improved images relative to standard maximum likelihood expectation-maximisation (ML-EM) in simulated 2D longitudinal brain tumour scans. In reconstructions of split real 3D data with inserted simulated tumours, noise across images reconstructed with MAP-SLR was reduced to levels equivalent to doubling the number of detected counts when using ML-EM. Furthermore, quantification of tumour activities was largely preserved over a variety of longitudinal tumour changes, including changes in size and activity, with larger changes inducing larger biases relative to standard ML-EM reconstructions. Similar improvements were observed for a range of counts levels, demonstrating the robustness of the method when used with a single penalty strength. The results suggest that longitudinal regularisation is a simple but effective method of improving reconstructed PET images without using resolution degrading priors.
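    The one-step-late MAP update with a quadratic penalty on voxel-wise differences between two longitudinal images can be sketched on a toy problem. The sketch assumes an identity system model (each voxel seen by exactly one detector bin) so the ML-EM ratio step collapses to y/λ, and uses a small penalty strength, since the one-step-late scheme can diverge for large β; none of the numbers come from the paper:

```python
def map_slr_osl(y1, y2, beta=0.02, n_iter=200):
    """Toy MAP-SLR: simultaneous one-step-late MAP-EM for two longitudinal
    scans with a quadratic penalty on voxel-wise differences.  The system
    model is the identity, which keeps the sketch tiny; real PET uses a
    full projection matrix and sensitivity image."""
    lam1 = [1.0] * len(y1)
    lam2 = [1.0] * len(y2)
    for _ in range(n_iter):
        new1, new2 = [], []
        for j in range(len(y1)):
            # one-step-late: penalty gradient evaluated at the current estimate
            d1 = 1.0 + beta * (lam1[j] - lam2[j])
            d2 = 1.0 + beta * (lam2[j] - lam1[j])
            new1.append(lam1[j] / d1 * (y1[j] / lam1[j]))  # = y1[j] / d1
            new2.append(lam2[j] / d2 * (y2[j] / lam2[j]))
        lam1, lam2 = new1, new2
    return lam1, lam2

# two noisy "scans" of the same underlying voxel values
y1 = [10.0, 5.0, 2.0]
y2 = [12.0, 4.0, 2.0]
l1, l2 = map_slr_osl(y1, y2)
# the penalty pulls differing voxels toward each other; equal voxels are unchanged
print([round(v, 2) for v in l1], [round(v, 2) for v in l2])
```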

  9. Improved UTE-based attenuation correction for cranial PET-MR using dynamic magnetic field monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aitken, A. P.; Giese, D.; Tsoumpas, C.

    2014-01-15

    Purpose: Ultrashort echo time (UTE) MRI has been proposed as a way to produce segmented attenuation maps for PET, as it provides contrast between bone, air, and soft tissue. However, UTE sequences require samples to be acquired during rapidly changing gradient fields, which makes the resulting images prone to eddy current artifacts. In this work it is demonstrated that this can lead to misclassification of tissues in segmented attenuation maps (AC maps) and that these effects can be corrected for by measuring the true k-space trajectories using a magnetic field camera. Methods: The k-space trajectories during a dual-echo UTE sequence were measured using a dynamic magnetic field camera. UTE images were reconstructed using nominal trajectories and again using the measured trajectories. A numerical phantom was used to demonstrate the effect of reconstructing with incorrect trajectories. Images of an ovine leg phantom were reconstructed and segmented and the resulting attenuation maps were compared to a segmented map derived from a CT scan of the same phantom, using the Dice similarity measure. The feasibility of the proposed method was demonstrated in in vivo cranial imaging in five healthy volunteers. Simulated PET data were generated for one volunteer to show the impact of misclassifications on the PET reconstruction. Results: Images of the numerical phantom exhibited blurring and edge artifacts on the bone–tissue and air–tissue interfaces when nominal k-space trajectories were used, leading to misclassification of soft tissue as bone and misclassification of bone as air. Images of the tissue phantom and the in vivo cranial images exhibited the same artifacts. The artifacts were greatly reduced when the measured trajectories were used. For the tissue phantom, the Dice coefficient for bone in MR relative to CT was 0.616 using the nominal trajectories and 0.814 using the measured trajectories.
The Dice coefficients for soft tissue were 0.933 and 0.934 for the nominal and measured cases, respectively. For air the corresponding figures were 0.991 and 0.993. Compared to an unattenuated reference image, the mean error in simulated PET uptake in the brain was 9.16% when AC maps derived from nominal trajectories were used, with errors in the SUVmax for simulated lesions in the range of 7.17%–12.19%. Corresponding figures when AC maps derived from measured trajectories were used were 0.34% (mean error) and −0.21% to +1.81% (lesions). Conclusions: Eddy current artifacts in UTE imaging can be corrected for by measuring the true k-space trajectories during a calibration scan and using them in subsequent image reconstructions. This improves the accuracy of segmented PET attenuation maps derived from UTE sequences and subsequent PET reconstruction.
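    The Dice similarity measure used to compare the segmented maps is twice the overlap divided by the combined size of the two masks; a minimal sketch with hypothetical flattened bone masks:

```python
def dice(a, b):
    """Dice similarity coefficient between two binary masks (flat lists):
    2 * |A intersect B| / (|A| + |B|)."""
    inter = sum(1 for x, y in zip(a, b) if x and y)
    return 2.0 * inter / (sum(a) + sum(b))

# hypothetical bone masks from MR-based and CT-based segmentation
mr_bone = [1, 1, 1, 0, 0, 1]
ct_bone = [1, 1, 0, 0, 1, 1]
d = dice(mr_bone, ct_bone)
print(d)  # -> 0.75
```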

  10. Systems and methods that generate height map models for efficient three dimensional reconstruction from depth information

    DOEpatents

    Frahm, Jan-Michael; Pollefeys, Marc Andre Leon; Gallup, David Robert

    2015-12-08

    Methods of generating a three-dimensional representation of an object in a reference plane from a depth map that includes distances from a reference point to pixels in an image of the object. Weights are assigned to respective voxels in a three-dimensional grid along rays extending from the reference point through the pixels in the image, based on the distances in the depth map from the reference point to the respective pixels, and a height map including an array of height values in the reference plane is formed based on the assigned weights. An n-layer height map may be constructed by generating a probabilistic occupancy grid for the voxels and forming an n-dimensional height map comprising an array of layer height values in the reference plane based on the probabilistic occupancy grid.
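    The collapse from a probabilistic occupancy grid to a height map can be sketched as a per-column scan over the voxels above each cell of the reference plane; the grid, voxel height, and threshold below are hypothetical:

```python
def height_map_from_occupancy(occupancy, voxel_height=1.0, threshold=0.5):
    """Collapse a probabilistic occupancy grid into a height map: for each
    (x, y) column, the height is the top of the highest voxel whose
    occupancy probability exceeds the threshold."""
    heights = []
    for column in occupancy:          # one column of voxels per (x, y) cell
        h = 0.0
        for z, p in enumerate(column):  # z = 0 sits on the reference plane
            if p >= threshold:
                h = (z + 1) * voxel_height
        heights.append(h)
    return heights

# hypothetical occupancy probabilities for three columns, four voxels tall
grid = [
    [0.9, 0.8, 0.1, 0.0],  # occupied up to z = 1 -> height 2.0
    [0.9, 0.9, 0.9, 0.7],  # fully occupied      -> height 4.0
    [0.2, 0.1, 0.0, 0.0],  # mostly empty        -> height 0.0
]
hm = height_map_from_occupancy(grid)
print(hm)  # -> [2.0, 4.0, 0.0]
```

An n-layer variant would record several (bottom, top) intervals per column instead of a single height.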

  11. Deep-learning-based classification of FDG-PET data for Alzheimer's disease categories

    NASA Astrophysics Data System (ADS)

    Singh, Shibani; Srivastava, Anant; Mi, Liang; Caselli, Richard J.; Chen, Kewei; Goradia, Dhruman; Reiman, Eric M.; Wang, Yalin

    2017-11-01

    Fluorodeoxyglucose (FDG) positron emission tomography (PET) measures the decline in the regional cerebral metabolic rate for glucose, offering a reliable metabolic biomarker even in presymptomatic Alzheimer's disease (AD) patients. PET scans provide functional information that is unique and unavailable using other types of imaging. However, the computational efficacy of FDG-PET data alone, for the classification of various AD diagnostic categories, has not been well studied. This motivates us to correctly discriminate various AD diagnostic categories using FDG-PET data. Deep learning has improved state-of-the-art classification accuracies in the areas of speech, signal, image, video, text mining and recognition. We propose novel methods that involve probabilistic principal component analysis on max-pooled data and mean-pooled data for dimensionality reduction, and a multilayer feed-forward neural network that performs binary classification. Our experimental dataset consists of baseline data of subjects including 186 cognitively unimpaired (CU) subjects, 336 mild cognitive impairment (MCI) subjects with 158 late MCI and 178 early MCI, and 146 AD patients from the Alzheimer's Disease Neuroimaging Initiative (ADNI) dataset. We measured F1-measure, precision, recall, and negative and positive predictive values with a 10-fold cross-validation scheme. Our results indicate that our designed classifiers achieve competitive results, with max pooling achieving better classification performance than mean-pooled features. Our deep-model-based research may advance FDG-PET analysis by demonstrating its potential as an effective imaging biomarker of AD.
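    The max-pooling step applied before dimensionality reduction keeps the strongest response in each non-overlapping block; a minimal sketch on a hypothetical 4 x 4 patch of uptake values:

```python
def max_pool(image, k=2):
    """Non-overlapping k x k max-pooling: keeps the strongest response in
    each block, shrinking the feature grid by a factor of k per axis."""
    rows, cols = len(image), len(image[0])
    return [[max(image[r + dr][c + dc] for dr in range(k) for dc in range(k))
             for c in range(0, cols, k)]
            for r in range(0, rows, k)]

# hypothetical 4 x 4 patch of regional FDG-uptake values
patch = [
    [0.2, 0.5, 0.1, 0.3],
    [0.4, 0.9, 0.2, 0.1],
    [0.7, 0.3, 0.6, 0.8],
    [0.1, 0.2, 0.4, 0.5],
]
pooled = max_pool(patch)
print(pooled)  # -> [[0.9, 0.3], [0.7, 0.8]]
```

Mean pooling would replace `max(...)` with the block average; the abstract reports max pooling performing better.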

  12. PET image reconstruction using multi-parametric anato-functional priors

    NASA Astrophysics Data System (ADS)

    Mehranian, Abolfazl; Belzunce, Martin A.; Niccolini, Flavia; Politis, Marios; Prieto, Claudia; Turkheimer, Federico; Hammers, Alexander; Reader, Andrew J.

    2017-08-01

    In this study, we investigate the application of multi-parametric anato-functional (MR-PET) priors for the maximum a posteriori (MAP) reconstruction of brain PET data in order to address the limitations of the conventional anatomical priors in the presence of PET-MR mismatches. In addition to partial volume correction benefits, the suitability of these priors for reconstruction of low-count PET data is also introduced and demonstrated, comparing to standard maximum-likelihood (ML) reconstruction of high-count data. The conventional local Tikhonov and total variation (TV) priors and current state-of-the-art anatomical priors including the Kaipio, non-local Tikhonov prior with Bowsher and Gaussian similarity kernels are investigated and presented in a unified framework. The Gaussian kernels are calculated using both voxel- and patch-based feature vectors. To cope with PET and MR mismatches, the Bowsher and Gaussian priors are extended to multi-parametric priors. In addition, we propose a modified joint Burg entropy prior that by definition exploits all parametric information in the MAP reconstruction of PET data. The performance of the priors was extensively evaluated using 3D simulations and two clinical brain datasets of [18F]florbetaben and [18F]FDG radiotracers. For simulations, several anato-functional mismatches were intentionally introduced between the PET and MR images, and furthermore, for the FDG clinical dataset, two PET-unique active tumours were embedded in the PET data. Our simulation results showed that the joint Burg entropy prior far outperformed the conventional anatomical priors in terms of preserving PET unique lesions, while still reconstructing functional boundaries with corresponding MR boundaries. In addition, the multi-parametric extension of the Gaussian and Bowsher priors led to enhanced preservation of edge and PET unique features and also an improved bias-variance performance. 
In agreement with the simulation results, the clinical results also showed that the Gaussian prior with voxel-based feature vectors, the Bowsher and the joint Burg entropy priors were the best performing priors. However, for the FDG dataset with simulated tumours, the TV and proposed priors were capable of preserving the PET-unique tumours. Finally, an important outcome was the demonstration that the MAP reconstruction of a low-count FDG PET dataset using the proposed joint entropy prior can lead to comparable image quality to a conventional ML reconstruction with up to 5 times more counts. In conclusion, multi-parametric anato-functional priors provide a solution to address the pitfalls of the conventional priors and are therefore likely to increase the diagnostic confidence in MR-guided PET image reconstructions.

  13. Effects of shipping on marine acoustic habitats in Canadian Arctic estimated via probabilistic modeling and mapping.

    PubMed

    Aulanier, Florian; Simard, Yvan; Roy, Nathalie; Gervaise, Cédric; Bandet, Marion

    2017-12-15

    Canadian Arctic and Subarctic regions experience a rapid decrease of sea ice accompanied by increasing shipping traffic. The resulting time-space changes in shipping noise are studied for four key regions of this pristine environment, for 2013 traffic conditions and a hypothetical tenfold traffic increase. A probabilistic modeling and mapping framework, called Ramdam, which integrates the intrinsic variability and uncertainties of shipping noise and its effects on marine habitats, is developed and applied. A substantial transformation of soundscapes is observed in areas where shipping noise changes from a present occasional-transient contributor to a dominant noise source. Examination of impacts on low-frequency mammals within ecologically and biologically significant areas reveals that shipping noise has the potential to trigger behavioral responses and masking in the future, although no risk of temporary or permanent hearing threshold shifts is noted. Such probabilistic modeling and mapping is strategic in marine spatial planning for this emerging noise issue. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.

  14. PET/MRI in the Presence of Metal Implants: Completion of the Attenuation Map from PET Emission Data.

    PubMed

    Fuin, Niccolo; Pedemonte, Stefano; Catalano, Onofrio A; Izquierdo-Garcia, David; Soricelli, Andrea; Salvatore, Marco; Heberlein, Keith; Hooker, Jacob M; Van Leemput, Koen; Catana, Ciprian

    2017-05-01

    We present a novel technique for accurate whole-body attenuation correction in the presence of metallic endoprostheses, on integrated non-time-of-flight (non-TOF) PET/MRI scanners. The proposed implant PET-based attenuation map completion (IPAC) method performs a joint reconstruction of radioactivity and attenuation from the emission data to determine the position, shape, and linear attenuation coefficient (LAC) of metallic implants. Methods: The initial estimate of the attenuation map was obtained using the MR Dixon method currently available on the Siemens Biograph mMR scanner. The attenuation coefficients in the area of the MR image subjected to metal susceptibility artifacts are then reconstructed from the PET emission data using the IPAC algorithm. The method was tested on 11 subjects presenting 13 different metallic implants, who underwent CT and PET/MR scans. Relative mean LACs and Dice similarity coefficients were calculated to determine the accuracy of the reconstructed attenuation values and the shape of the metal implant, respectively. The reconstructed PET images were compared with those obtained using the reference CT-based approach and the Dixon-based method. Absolute relative change (aRC) images were generated in each case, and voxel-based analyses were performed. Results: The error in implant LAC estimation, using the proposed IPAC algorithm, was 15.7% ± 7.8%, which was significantly smaller than the Dixon- (100%) and CT- (39%) derived values. A mean Dice similarity coefficient of 73% ± 9% was obtained when comparing the IPAC- with the CT-derived implant shape. The voxel-based analysis of the reconstructed PET images revealed quantification errors (aRC) of 13.2% ± 22.1% for the IPAC- with respect to CT-corrected images. The Dixon-based method performed substantially worse, with a mean aRC of 23.1% ± 38.4%.
Conclusion: We have presented a non-TOF emission-based approach for estimating the attenuation map in the presence of metallic implants, to be used for whole-body attenuation correction in integrated PET/MR scanners. The Graphics Processing Unit implementation of the algorithm will be included in the open-source reconstruction toolbox Occiput.io. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.

  15. MAP Reconstruction for Fourier Rebinned TOF-PET Data

    PubMed Central

    Bai, Bing; Lin, Yanguang; Zhu, Wentao; Ren, Ran; Li, Quanzheng; Dahlbom, Magnus; DiFilippo, Frank; Leahy, Richard M.

    2014-01-01

    Time-of-flight (TOF) information improves the signal-to-noise ratio in Positron Emission Tomography (PET). Computation cost in processing TOF-PET sinograms is substantially higher than for nonTOF data because the data in each line of response is divided among multiple time-of-flight bins. This additional cost has motivated research into methods for rebinning TOF data into lower dimensional representations that exploit redundancies inherent in TOF data. We have previously developed approximate Fourier methods that rebin TOF data into either 3D nonTOF or 2D nonTOF formats. We refer to these methods respectively as FORET-3D and FORET-2D. Here we describe maximum a posteriori (MAP) estimators for use with FORET rebinned data. We first derive approximate expressions for the variance of the rebinned data. We then use these results to rescale the data so that the variance and mean are approximately equal, allowing us to use the Poisson likelihood model for MAP reconstruction. MAP reconstruction from these rebinned data uses a system matrix in which the detector response model accounts for the effects of rebinning. Using these methods we compare performance of FORET-2D and 3D with TOF and nonTOF reconstructions using phantom and clinical data. Our phantom results show a small loss in contrast recovery at matched noise levels using FORET compared to reconstruction from the original TOF data. Clinical examples show FORET images that are qualitatively similar to those obtained from the original TOF-PET data but with a small increase in variance at matched resolution. Reconstruction time is reduced by a factor of 5 and 30 using FORET-3D+MAP and FORET-2D+MAP respectively compared to 3D TOF MAP, which makes these methods attractive for clinical applications. PMID:24504374
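    The rescaling step, which makes the variance approximately equal to the mean so the Poisson likelihood applies, can be sketched as follows: scaling each bin by s = mean/variance gives a new mean of m²/v and a new variance of s²·v = m²/v, which match. The data and moment estimates below are hypothetical:

```python
def rescale_for_poisson(data, mean_est, var_est):
    """Rescale rebinned data so variance ~= mean, as the Poisson likelihood
    assumes.  With s = m / v the rescaled mean is s*m = m^2/v and the
    rescaled variance is s^2 * v = m^2/v, so the two coincide."""
    out = []
    for y, m, v in zip(data, mean_est, var_est):
        s = m / v if v > 0 else 1.0
        out.append(y * s)
    return out

# hypothetical rebinned bin with estimated mean 8 and variance 4
data, mean_est, var_est = [7.5], [8.0], [4.0]
scaled = rescale_for_poisson(data, mean_est, var_est)
print(scaled)  # -> [15.0]
```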

  16. Reproducibility of Quantitative Brain Imaging Using a PET-Only and a Combined PET/MR System

    PubMed Central

    Lassen, Martin L.; Muzik, Otto; Beyer, Thomas; Hacker, Marcus; Ladefoged, Claes Nøhr; Cal-González, Jacobo; Wadsak, Wolfgang; Rausch, Ivo; Langer, Oliver; Bauer, Martin

    2017-01-01

    The purpose of this study was to test the feasibility of migrating a quantitative brain imaging protocol from a positron emission tomography (PET)-only system to an integrated PET/MR system. Potential differences in both absolute radiotracer concentration as well as in the derived kinetic parameters as a function of PET system choice have been investigated. Five healthy volunteers underwent dynamic (R)-[11C]verapamil imaging on the same day using a GE-Advance (PET-only) and a Siemens Biograph mMR system (PET/MR). PET emission data were reconstructed using a transmission-based attenuation correction (AC) map (PET-only), whereas a standard MR-DIXON as well as a low-dose CT AC map was applied to PET/MR emission data. Kinetic modeling based on arterial blood sampling was performed using a 1-tissue-2-rate-constant compartment model, yielding kinetic parameters (K1 and k2) and distribution volume (VT). Differences in parametric values obtained with the PET-only and PET/MR systems were analyzed using a two-way analysis of variance (ANOVA). Comparison of DIXON-based AC (PET/MR) with emission data derived from the PET-only system revealed average inter-system differences of −33 ± 14% (p < 0.05) for the K1 parameter and −19 ± 9% (p < 0.05) for k2. Using a CT-based AC for PET/MR resulted in slightly lower systematic differences of −16 ± 18% for K1 and −9 ± 10% for k2. The average differences in VT were −18 ± 10% (p < 0.05) for DIXON- and −8 ± 13% for CT-based AC. Significant systematic differences were observed for kinetic parameters derived from emission data obtained from PET/MR and PET-only imaging due to different standard AC methods employed. Therefore, a transfer of imaging protocols from PET-only to PET/MR systems is not straightforward without application of proper correction methods. Clinical Trial Registration: www.clinicaltrialsregister.eu, identifier 2013-001724-19 PMID:28769742
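    The 1-tissue-2-rate-constant model solves dC_T/dt = K1·C_p − k2·C_T, with distribution volume V_T = K1/k2; a minimal Euler-integration sketch with a hypothetical constant input function (real analyses fit K1 and k2 to measured arterial and tissue curves):

```python
def one_tissue_model(cp, dt, K1, k2):
    """1-tissue-2-rate-constant compartment model,
    dC_T/dt = K1*C_p - k2*C_T, integrated with a simple Euler step."""
    ct = [0.0]
    for i in range(1, len(cp)):
        ct.append(ct[-1] + dt * (K1 * cp[i - 1] - k2 * ct[-1]))
    return ct

# hypothetical constant arterial input; at equilibrium C_T -> C_p * K1/k2
K1, k2, dt = 0.3, 0.1, 0.1
cp = [1.0] * 2000                  # constant plasma concentration
ct = one_tissue_model(cp, dt, K1, k2)
vt = K1 / k2                        # distribution volume V_T
print(round(ct[-1], 2), round(vt, 2))  # tissue curve approaches 3.0
```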

  17. Probabilistic Flood Maps to support decision-making: Mapping the Value of Information

    NASA Astrophysics Data System (ADS)

    Alfonso, L.; Mukolwe, M. M.; Di Baldassarre, G.

    2016-02-01

    Floods are one of the most frequent and disruptive natural hazards that affect man. Annually, significant flood damage is documented worldwide. Flood mapping is a common pre-impact flood hazard mitigation measure, for which advanced methods and tools (such as flood inundation models) are used to estimate potential flood extent maps that are used in spatial planning. However, these tools are affected, largely to an unknown degree, by both epistemic and aleatory uncertainty. Over the past few years, advances in uncertainty analysis with respect to flood inundation modeling show that it is appropriate to adopt Probabilistic Flood Maps (PFM) to account for uncertainty. However, the following question arises: how can probabilistic flood hazard information be incorporated into spatial planning? Thus, a consistent framework to incorporate PFMs into decision-making is required. In this paper, a novel methodology based on Decision-Making under Uncertainty theories, in particular Value of Information (VOI), is proposed. Specifically, the methodology entails the use of a PFM to generate a VOI map, which highlights floodplain locations where additional information is valuable with respect to available floodplain management actions and their potential consequences. The methodology is illustrated with a simplified example and also applied to a real case study in the South of France, where a VOI map is analyzed on the basis of historical land use change decisions over a period of 26 years. Results show that uncertain flood hazard information encapsulated in PFMs can aid decision-making in floodplain planning.
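    For a single floodplain cell, the value-of-information idea can be sketched as the expected value of perfect information for a binary protect/do-nothing decision; the paper's VOI formulation is richer, and the probability and costs below are invented:

```python
def value_of_information(p_flood, protect_cost, flood_loss):
    """Expected value of perfect information for one floodplain location:
    the gap between the expected cost of the best action chosen under the
    prior flood probability and the expected cost when the flood outcome
    is known before acting."""
    # best action under the uncertain (prior) flood probability
    cost_without_info = min(protect_cost, p_flood * flood_loss)
    # with perfect information: pay to protect only when a flood will occur
    cost_with_info = p_flood * min(protect_cost, flood_loss)
    return cost_without_info - cost_with_info

# hypothetical cell from a probabilistic flood map
voi = value_of_information(p_flood=0.3, protect_cost=40.0, flood_loss=100.0)
print(round(voi, 2))  # -> 18.0: extra information is worth up to 18 cost units here
```

Mapping this quantity over every cell of a PFM yields a VOI map: cells with VOI near zero are already decided, while high-VOI cells are where extra survey or monitoring effort pays off.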

  18. Quantitative analysis of MRI-guided attenuation correction techniques in time-of-flight brain PET/MRI.

    PubMed

    Mehranian, Abolfazl; Arabi, Hossein; Zaidi, Habib

    2016-04-15

    In quantitative PET/MR imaging, attenuation correction (AC) of PET data is markedly challenged by the need of deriving accurate attenuation maps from MR images. A number of strategies have been developed for MRI-guided attenuation correction with different degrees of success. In this work, we compare the quantitative performance of three generic AC methods, including standard 3-class MR segmentation-based, advanced atlas-registration-based and emission-based approaches in the context of brain time-of-flight (TOF) PET/MRI. Fourteen patients referred for diagnostic MRI and (18)F-FDG PET/CT brain scans were included in this comparative study. For each study, PET images were reconstructed using four different attenuation maps derived from CT-based AC (CTAC) serving as reference, standard 3-class MR-segmentation, atlas-registration and emission-based AC methods. To generate 3-class attenuation maps, T1-weighted MRI images were segmented into background air, fat and soft-tissue classes followed by assignment of constant linear attenuation coefficients of 0, 0.0864 and 0.0975 cm(-1) to each class, respectively. A robust atlas-registration based AC method was developed for pseudo-CT generation using local weighted fusion of atlases based on their morphological similarity to target MR images. Our recently proposed MRI-guided maximum likelihood reconstruction of activity and attenuation (MLAA) algorithm was employed to estimate the attenuation map from TOF emission data. The performance of the different AC algorithms in terms of prediction of bones and quantification of PET tracer uptake was objectively evaluated with respect to reference CTAC maps and CTAC-PET images. Qualitative evaluation showed that the MLAA-AC method could sparsely estimate bones and accurately differentiate them from air cavities. It was found that the atlas-AC method can accurately predict bones with variable errors in defining air cavities. 
Quantitative assessment of bone extraction accuracy based on Dice similarity coefficient (DSC) showed that MLAA-AC and atlas-AC resulted in DSC mean values of 0.79 and 0.92, respectively, in all patients. The MLAA-AC and atlas-AC methods predicted mean linear attenuation coefficients of 0.107 and 0.134 cm(-1), respectively, for the skull compared to the reference CTAC mean value of 0.138 cm(-1). The evaluation of the relative change in tracer uptake within 32 distinct regions of the brain with respect to CTAC PET images showed that the 3-class MRAC, MLAA-AC and atlas-AC methods resulted in quantification errors of -16.2 ± 3.6%, -13.3 ± 3.3% and 1.0 ± 3.4%, respectively. Linear regression and Bland-Altman concordance plots showed that both 3-class MRAC and MLAA-AC methods result in a significant systematic bias in PET tracer uptake, while the atlas-AC method results in a negligible bias. The standard 3-class MRAC method significantly underestimated cerebral PET tracer uptake. While current state-of-the-art MLAA-AC methods look promising, they were unable to noticeably reduce quantification errors in the context of brain imaging. Conversely, the proposed atlas-AC method provided the most accurate attenuation maps, and thus the lowest quantification bias. Copyright © 2016 Elsevier Inc. All rights reserved.
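
    The Dice similarity coefficient (DSC) used above to score bone extraction is straightforward to compute; as a minimal sketch (a generic illustration, not the authors' implementation), it compares two binary masks:

```python
import numpy as np

def dice(mask_a, mask_b):
    """Dice similarity coefficient: 2|A ∩ B| / (|A| + |B|)."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0  # convention: two empty masks agree perfectly
    return 2.0 * np.logical_and(a, b).sum() / denom

# Toy 2D "bone" masks (hypothetical data)
a = np.array([[1, 1, 0],
              [1, 0, 0]])
b = np.array([[1, 1, 1],
              [0, 0, 0]])
print(dice(a, b))  # 2*2 / (3+3) = 0.666...
```

    A DSC of 0.92 (atlas-AC) versus 0.79 (MLAA-AC) thus reflects substantially better voxel-wise overlap with the CT-derived bone mask.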

  19. One registration multi-atlas-based pseudo-CT generation for attenuation correction in PET/MRI.

    PubMed

    Arabi, Hossein; Zaidi, Habib

    2016-10-01

    The outcome of a detailed assessment of various strategies for atlas-based whole-body bone segmentation from magnetic resonance imaging (MRI) was exploited to select the optimal parameters and setting, with the aim of proposing a novel one-registration multi-atlas (ORMA) pseudo-CT generation approach. The proposed approach consists of only one online registration between the target and reference images, regardless of the number of atlas images (N), while for the remaining atlas images, the pre-computed transformation matrices to the reference image are used to align them to the target image. The performance characteristics of the proposed method were evaluated and compared with conventional atlas-based attenuation map generation strategies (direct registration of the entire atlas images followed by voxel-wise weighting (VWW) and arithmetic averaging atlas fusion). To this end, four different positron emission tomography (PET) attenuation maps were generated via arithmetic averaging and VWW scheme using both direct registration and ORMA approaches as well as the 3-class attenuation map obtained from the Philips Ingenuity TF PET/MRI scanner commonly used in the clinical setting. The evaluation was performed based on the accuracy of extracted whole-body bones by the different attenuation maps and by quantitative analysis of resulting PET images compared to CT-based attenuation-corrected PET images serving as reference. The comparison of validation metrics regarding the accuracy of extracted bone using the different techniques demonstrated the superiority of the VWW atlas fusion algorithm achieving a Dice similarity measure of 0.82 ± 0.04 compared to arithmetic averaging atlas fusion (0.60 ± 0.02), which uses conventional direct registration. Application of the ORMA approach modestly compromised the accuracy, yielding a Dice similarity measure of 0.76 ± 0.05 for ORMA-VWW and 0.55 ± 0.03 for ORMA-averaging. 
The results of quantitative PET analysis followed the same trend with less significant differences in terms of SUV bias, whereas massive improvements were observed compared to PET images corrected for attenuation using the 3-class attenuation map. The maximum absolute bias achieved by VWW and VWW-ORMA methods was 6.4 ± 5.5 in the lung and 7.9 ± 4.8 in the bone, respectively. The proposed algorithm is capable of generating decent attenuation maps. The quantitative analysis revealed a good correlation between PET images corrected for attenuation using the proposed pseudo-CT generation approach and the corresponding CT images. The computational time is reduced by a factor of N at the expense of a modest decrease in quantitative accuracy, thus allowing us to achieve a reasonable compromise between computing time and quantitative performance.
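
    The computational saving in ORMA comes from composing each precomputed atlas-to-reference transform with the single online target registration, instead of registering all N atlases directly. Using 4 × 4 affine matrices as a stand-in for the deformable transforms actually used (a simplified sketch with hypothetical helper names):

```python
import numpy as np

def affine(scale=1.0, tx=0.0, ty=0.0, tz=0.0):
    """Hypothetical helper: a simple scale-plus-translation 4x4 affine."""
    m = np.eye(4)
    m[0, 0] = m[1, 1] = m[2, 2] = scale
    m[:3, 3] = [tx, ty, tz]
    return m

# Precomputed offline, once per atlas: atlas_i -> reference
atlas_to_ref = [affine(tx=float(i)) for i in range(5)]

# The single online registration: reference -> target
ref_to_target = affine(scale=1.1, ty=2.0)

# Each atlas is mapped to the target by composition, with no further
# registration: T(atlas_i -> target) = T(ref -> target) @ T(atlas_i -> ref)
atlas_to_target = [ref_to_target @ t for t in atlas_to_ref]

p = np.array([1.0, 0.0, 0.0, 1.0])   # homogeneous point
print(atlas_to_target[3] @ p)        # atlas 3's point in target space
```

    With N atlases, only `ref_to_target` must be estimated online; the N matrix products are essentially free, which is the source of the reported reduction in computing time.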

  20. SU-E-CAMPUS-I-06: Y90 PET/CT for the Instantaneous Determination of Both Target and Non-Target Absorbed Doses Following Hepatic Radioembolization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pasciak, A; Kao, J

    2014-06-15

    Purpose: The process of converting Yttrium-90 (Y90) PET/CT images into 3D absorbed dose maps will be explained. The simple methods presented will allow the medical physicist to analyze Y90 PET images following radioembolization and determine the absorbed dose to tumor, normal liver parenchyma, and other areas of interest, without application of Monte Carlo radiation transport or dose-point-kernel (DPK) convolution. Methods: Absorbed dose can be computed from Y90 PET/CT images based on the premise that radioembolization is a permanent implant with a constant relative activity distribution after infusion. Many Y90 PET/CT publications have used DPK convolution to obtain 3D absorbed dose maps. However, this method requires specialized software, limiting clinical utility. The Local Deposition method, an alternative to DPK convolution, can be used to obtain absorbed dose and requires no additional computer processing. Pixel values from regions of interest drawn on Y90 PET/CT images can be converted to absorbed dose (Gy) by multiplication with a scalar constant. Results: There is evidence suggesting that the Local Deposition method may be more accurate than DPK convolution, and it has been successfully used in a recent Y90 PET/CT publication. We analytically compared dose-volume histograms (DVH) for phantom hot-spheres to determine the difference between the DPK and Local Deposition methods as a function of the PET scanner point-spread function for Y90. We found that for PET/CT systems with a FWHM greater than 3.0 mm when imaging Y90, the Local Deposition method provides a more accurate representation of the DVH than DPK convolution, regardless of target size. Conclusion: Using the Local Deposition method, post-radioembolization Y90 PET/CT images can be transformed into 3D absorbed dose maps of the liver. 
An interventional radiologist or a medical physicist can perform this transformation in a clinical setting, allowing for rapid prediction of treatment efficacy by comparison to published tumoricidal thresholds.
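
    The scalar constant of the Local Deposition method follows from the permanent-implant premise: every decay deposits its mean beta energy where the activity sits. A back-of-the-envelope derivation using published Y90 decay data (the values below are standard physical constants, not numbers taken from this abstract):

```python
import math

# Y-90 physical data (approximate published values)
HALF_LIFE_S = 64.05 * 3600        # half-life, ~64.05 h, in seconds
MEAN_E_MEV = 0.9337               # mean beta energy per decay, MeV
MEV_TO_J = 1.602e-13              # joules per MeV

# Permanent implant: total decays per Bq of initial activity = 1/lambda
tau = HALF_LIFE_S / math.log(2)   # ~3.3e5 s

# Absorbed dose per unit activity concentration, assuming local deposition:
# Gy per (MBq/g) = (decays per MBq) * (J per decay) / (kg per g)
gy_per_mbq_per_g = 1e6 * tau * MEAN_E_MEV * MEV_TO_J / 1e-3
print(round(gy_per_mbq_per_g, 1))  # ~49.8 Gy per MBq/g
```

    Multiplying a PET-derived activity-concentration map (MBq/g at infusion time) by a constant of this order yields the absorbed dose map in Gy.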

  1. Probabilistic seismic hazard estimates incorporating site effects - An example from Indiana, U.S.A

    USGS Publications Warehouse

    Hasse, J.S.; Park, C.H.; Nowack, R.L.; Hill, J.R.

    2010-01-01

    The U.S. Geological Survey (USGS) has published probabilistic earthquake hazard maps for the United States based on current knowledge of past earthquake activity and geological constraints on earthquake potential. These maps for the central and eastern United States assume standard site conditions with S-wave velocities of 760 m/s in the top 30 m. For urban and infrastructure planning and long-term budgeting, the public is interested in similar probabilistic seismic hazard maps that take into account near-surface geological materials. We have implemented a probabilistic method for incorporating site effects into the USGS seismic hazard analysis that takes into account the first-order effects of the surface geologic conditions. The thicknesses of sediments, which play a large role in amplification, were derived from a P-wave refraction database with over 13,000 profiles, and a preliminary geology-based velocity model was constructed from available information on S-wave velocities. An interesting feature of the preliminary hazard maps incorporating site effects is an approximate factor-of-two increase in the 1-Hz spectral acceleration with 2 percent probability of exceedance in 50 years for parts of the greater Indianapolis metropolitan region and surrounding parts of central Indiana. This effect is primarily due to the relatively thick sequence of sediments infilling ancient bedrock topography that has been deposited since the Pleistocene Epoch. As expected, the Late Pleistocene and Holocene depositional systems of the Wabash and Ohio Rivers produce additional amplification in the southwestern part of Indiana. Ground motions decrease, as would be expected, toward the bedrock units in south-central Indiana, where motions are significantly lower than the values on the USGS maps.
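
    The "2 percent probability of exceedance in 50 years" level quoted above corresponds to a fixed return period under the usual Poisson assumption; a quick generic check (not code from this study):

```python
import math

def return_period(p_exceed, years):
    """Return period T such that P(at least one exceedance in `years`) equals
    p_exceed, assuming a Poisson process: p = 1 - exp(-years / T)."""
    return -years / math.log(1.0 - p_exceed)

print(round(return_period(0.02, 50)))  # ~2475 years
```

    This is why 2%-in-50-years ground motions are often described as the "2475-year" hazard level.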

  2. An SPM8-based approach for attenuation correction combining segmentation and nonrigid template formation: application to simultaneous PET/MR brain imaging.

    PubMed

    Izquierdo-Garcia, David; Hansen, Adam E; Förster, Stefan; Benoit, Didier; Schachoff, Sylvia; Fürst, Sebastian; Chen, Kevin T; Chonde, Daniel B; Catana, Ciprian

    2014-11-01

    We present an approach for head MR-based attenuation correction (AC) based on the Statistical Parametric Mapping 8 (SPM8) software, which combines segmentation- and atlas-based features to provide a robust technique to generate attenuation maps (μ maps) from MR data in integrated PET/MR scanners. Coregistered anatomic MR and CT images of 15 glioblastoma subjects were used to generate the templates. The MR images from these subjects were first segmented into 6 tissue classes (gray matter, white matter, cerebrospinal fluid, bone, soft tissue, and air), which were then nonrigidly coregistered using a diffeomorphic approach. A similar procedure was used to coregister the anatomic MR data for a new subject to the template. Finally, the CT-like images obtained by applying the inverse transformations were converted to linear attenuation coefficients to be used for AC of PET data. The method was validated on 16 new subjects with brain tumors (n = 12) or mild cognitive impairment (n = 4) who underwent CT and PET/MR scans. The μ maps and corresponding reconstructed PET images were compared with those obtained using the gold standard CT-based approach and the Dixon-based method available on the Biograph mMR scanner. Relative change (RC) images were generated in each case, and voxel- and region-of-interest-based analyses were performed. The leave-one-out cross-validation analysis of the data from the 15 atlas-generation subjects showed small errors in brain linear attenuation coefficients (RC, 1.38% ± 4.52%) compared with the gold standard. Similar results (RC, 1.86% ± 4.06%) were obtained from the analysis of the atlas-validation datasets. The voxel- and region-of-interest-based analysis of the corresponding reconstructed PET images revealed quantification errors of 3.87% ± 5.0% and 2.74% ± 2.28%, respectively. The Dixon-based method performed substantially worse (the mean RC values were 13.0% ± 10.25% and 9.38% ± 4.97%, respectively). 
Areas closer to the skull showed the largest improvement. We have presented an SPM8-based approach for deriving the head μ map from MR data to be used for PET AC in integrated PET/MR scanners. Its implementation is straightforward and requires only the morphologic data acquired with a single MR sequence. The method is accurate and robust, combining the strengths of both segmentation- and atlas-based approaches while minimizing their drawbacks. © 2014 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
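
    The relative change (RC) images used for validation are a simple voxel-wise operation on co-registered volumes; a minimal sketch (assuming NumPy arrays, not the authors' pipeline):

```python
import numpy as np

def relative_change_map(pet_test, pet_ref, eps=1e-6):
    """Voxel-wise relative change (%) of a test PET volume versus the
    CT-based reference, left undefined (NaN) where the reference is ~zero."""
    pet_test = np.asarray(pet_test, dtype=float)
    pet_ref = np.asarray(pet_ref, dtype=float)
    rc = np.full_like(pet_ref, np.nan)
    valid = np.abs(pet_ref) > eps
    rc[valid] = 100.0 * (pet_test[valid] - pet_ref[valid]) / pet_ref[valid]
    return rc

ref = np.array([[10.0, 20.0], [0.0, 5.0]])
test = np.array([[11.0, 19.0], [1.0, 5.0]])
rc = relative_change_map(test, ref)
print(rc)  # +10% and -5% where the reference is valid; NaN where it is ~0
```

    Summary statistics (mean ± SD) over the valid voxels then give figures like the RC = 1.38% ± 4.52% reported above.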

  3. An SPM8-based Approach for Attenuation Correction Combining Segmentation and Non-rigid Template Formation: Application to Simultaneous PET/MR Brain Imaging

    PubMed Central

    Izquierdo-Garcia, David; Hansen, Adam E.; Förster, Stefan; Benoit, Didier; Schachoff, Sylvia; Fürst, Sebastian; Chen, Kevin T.; Chonde, Daniel B.; Catana, Ciprian

    2014-01-01

    We present an approach for head MR-based attenuation correction (MR-AC) based on the Statistical Parametric Mapping (SPM8) software that combines segmentation- and atlas-based features to provide a robust technique to generate attenuation maps (µ-maps) from MR data in integrated PET/MR scanners. Methods: Coregistered anatomical MR and CT images acquired in 15 glioblastoma subjects were used to generate the templates. The MR images from these subjects were first segmented into 6 tissue classes (gray and white matter, cerebro-spinal fluid, bone and soft tissue, and air), which were then non-rigidly coregistered using a diffeomorphic approach. A similar procedure was used to coregister the anatomical MR data for a new subject to the template. Finally, the CT-like images obtained by applying the inverse transformations were converted to linear attenuation coefficients (LACs) to be used for AC of PET data. The method was validated on sixteen new subjects with brain tumors (N=12) or mild cognitive impairment (N=4) who underwent CT and PET/MR scans. The µ-maps and corresponding reconstructed PET images were compared to those obtained using the gold standard CT-based approach and the Dixon-based method available on the Siemens Biograph mMR scanner. Relative change (RC) images were generated in each case and voxel- and region of interest (ROI)-based analyses were performed. Results: The leave-one-out cross-validation analysis of the data from the 15 atlas-generation subjects showed small errors in brain LACs (RC = 1.38% ± 4.52%) compared to the gold standard. Similar results (RC = 1.86% ± 4.06%) were obtained from the analysis of the atlas-validation datasets. The voxel- and ROI-based analysis of the corresponding reconstructed PET images revealed quantification errors of 3.87% ± 5.0% and 2.74% ± 2.28%, respectively. The Dixon-based method performed substantially worse (the mean RC values were 13.0% ± 10.25% and 9.38% ± 4.97%, respectively). 
Areas closer to the skull showed the largest improvement. Conclusion: We have presented an SPM8-based approach for deriving the head µ-map from MR data to be used for PET AC in integrated PET/MR scanners. Its implementation is straightforward and only requires the morphological data acquired with a single MR sequence. The method is very accurate and robust, combining the strengths of both segmentation- and atlas-based approaches while minimizing their drawbacks. PMID:25278515

  4. A flood map based DOI decoding method for block detector: a GATE simulation study.

    PubMed

    Shi, Han; Du, Dong; Su, Zhihong; Peng, Qiyu

    2014-01-01

    Positron Emission Tomography (PET) systems using detectors with Depth of Interaction (DOI) capabilities can achieve higher spatial resolution and better image quality than those without DOI. To date, however, most DOI methods developed are not cost-efficient for a whole-body PET system. In this paper, we present a DOI decoding method based on the flood map for low-cost conventional block detectors with four-PMT readout. Using this method, the DOI information can be directly extracted from the DOI-related crystal spot deformation in the flood map. GATE simulations are then carried out to validate the method, confirming a DOI sorting accuracy of 85.27%. We therefore conclude that this method has the potential to be applied in conventional detectors to achieve a reasonable DOI measurement without dramatically increasing the complexity and cost of an entire PET system.

  5. Diagnostic performance of fluorodeoxyglucose positron emission tomography/magnetic resonance imaging fusion images of gynecological malignant tumors: comparison with positron emission tomography/computed tomography.

    PubMed

    Nakajo, Kazuya; Tatsumi, Mitsuaki; Inoue, Atsuo; Isohashi, Kayako; Higuchi, Ichiro; Kato, Hiroki; Imaizumi, Masao; Enomoto, Takayuki; Shimosegawa, Eku; Kimura, Tadashi; Hatazawa, Jun

    2010-02-01

    We compared the diagnostic accuracy of fluorodeoxyglucose positron emission tomography/computed tomography (FDG PET/CT) and PET/magnetic resonance imaging (MRI) fusion images for gynecological malignancies. A total of 31 patients with gynecological malignancies were enrolled. FDG-PET images were fused to CT, T1- and T2-weighted images (T1WI, T2WI). PET-MRI fusion was performed semiautomatically. We performed three types of evaluation to demonstrate the usefulness of PET/MRI fusion images in comparison with that of inline PET/CT, as follows: depiction of the uterus and the ovarian lesions on CT or MRI mapping images (first evaluation); additional information for lesion localization with PET and mapping images (second evaluation); and the image quality of fusion on interpretation (third evaluation). For the first evaluation, the score for T2WI (4.68 +/- 0.65) was significantly higher than that for CT (3.54 +/- 1.02) or T1WI (3.71 +/- 0.97) (P < 0.01). For the second evaluation, the scores for the localization of FDG accumulation showed that T2WI (2.74 +/- 0.57) provided significantly more additional information for the identification of anatomical sites of FDG accumulation than did CT (2.06 +/- 0.68) or T1WI (2.23 +/- 0.61) (P < 0.01). For the third evaluation, the three-point rating scale for the patient group as a whole demonstrated that PET/T2WI (2.72 +/- 0.54) localized the lesion significantly more convincingly than PET/CT (2.23 +/- 0.50) or PET/T1WI (2.29 +/- 0.53) (P < 0.01). PET/T2WI fusion images are superior for the detection and localization of gynecological malignancies.

  6. Effect of filters and reconstruction algorithms on I-124 PET in Siemens Inveon PET scanner

    NASA Astrophysics Data System (ADS)

    Ram Yu, A.; Kim, Jin Su

    2015-10-01

    Purpose: To assess the effects of filtering and reconstruction on Siemens I-124 PET data. Methods: A Siemens Inveon PET was used. Spatial resolution of I-124 was measured up to a transverse offset of 50 mm from the center. FBP, 2D ordered-subset expectation maximization (OSEM2D), 3D re-projection (3DRP), and maximum a posteriori (MAP) reconstruction methods were tested. Non-uniformity (NU), recovery coefficient (RC), and spillover ratio (SOR) parameterized image quality. Mini deluxe phantom data of I-124 were also assessed. Results: Volumetric resolution was 7.3 mm3 at the transverse FOV center when the FBP reconstruction algorithm with a ramp filter was used. MAP yielded minimal NU with β = 1.5. OSEM2D yielded maximal RC. SOR was below 4% for FBP with ramp, Hamming, Hanning, or Shepp-Logan filters. Based on the mini deluxe phantom results, FBP with Hanning or Parzen filters, or 3DRP with a Hanning filter, yielded feasible I-124 PET data. Conclusions: Reconstruction algorithms and filters were compared. FBP with Hanning or Parzen filters, or 3DRP with a Hanning filter, yielded feasible data for quantifying I-124 PET.
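
    The figures of merit used here (NU, RC, SOR) reduce to ratios of ROI statistics. The sketch below uses simplified textbook-style definitions, which may differ in detail from the NEMA NU 4-2008 procedures typically applied to Inveon data:

```python
import numpy as np

def recovery_coefficient(roi_mean, true_activity):
    """RC: measured ROI mean relative to the known true activity."""
    return roi_mean / true_activity

def spillover_ratio(cold_roi_mean, hot_background_mean):
    """SOR: counts spilling into a cold insert, relative to the hot background."""
    return cold_roi_mean / hot_background_mean

def non_uniformity(uniform_roi):
    """NU: percent standard deviation over a nominally uniform region."""
    roi = np.asarray(uniform_roi, dtype=float)
    return 100.0 * roi.std() / roi.mean()

hot = np.array([98.0, 102.0, 100.0])      # hypothetical uniform-region samples
print(recovery_coefficient(85.0, 100.0))  # 0.85
print(spillover_ratio(3.5, hot.mean()))   # 0.035
print(round(non_uniformity(hot), 2))      # 1.63
```

    Lower NU and SOR and an RC near 1.0 indicate better quantitative fidelity, which is the basis of the comparisons in this abstract.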

  7. Probabilistic, Seismically-Induced Landslide Hazard Mapping of Western Oregon

    NASA Astrophysics Data System (ADS)

    Olsen, M. J.; Sharifi Mood, M.; Gillins, D. T.; Mahalingam, R.

    2015-12-01

    Earthquake-induced landslides can generate significant damage within urban communities by damaging structures, obstructing lifeline connection routes and utilities, generating various environmental impacts, and possibly resulting in loss of life. Reliable hazard and risk maps are important to assist agencies in efficiently allocating and managing limited resources to prepare for such events. This research presents a new methodology in order to communicate site-specific landslide hazard assessments in a large-scale, regional map. Implementation of the proposed methodology results in seismic-induced landslide hazard maps that depict the probabilities of exceeding landslide displacement thresholds (e.g. 0.1, 0.3, 1.0 and 10 meters). These maps integrate a variety of data sources including: recent landslide inventories, LIDAR and photogrammetric topographic data, geology map, mapped NEHRP site classifications based on available shear wave velocity data in each geologic unit, and USGS probabilistic seismic hazard curves. Soil strength estimates were obtained by evaluating slopes present along landslide scarps and deposits for major geologic units. Code was then developed to integrate these layers to perform a rigid, sliding block analysis to determine the amount and associated probabilities of displacement based on each bin of peak ground acceleration in the seismic hazard curve at each pixel. The methodology was applied to western Oregon, which contains weak, weathered, and often wet soils at steep slopes. Such conditions have a high landslide hazard even without seismic events. A series of landslide hazard maps highlighting the probabilities of exceeding the aforementioned thresholds were generated for the study area. These output maps were then utilized in a performance based design framework enabling them to be analyzed in conjunction with other hazards for fully probabilistic-based hazard evaluation and risk assessment. 
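
    At each pixel, the displacement-exceedance probability is obtained by weighting a sliding-block displacement model by the seismic hazard curve. The sketch below is generic; the displacement function and the rates are hypothetical placeholders, not the regression or hazard data actually used:

```python
import math

def prob_exceed_in_period(pga_bins, annual_rates, displacement_fn,
                          threshold_m, years=50.0):
    """Sum the annual rate of PGA bins whose predicted sliding-block
    displacement exceeds the threshold, then convert the total rate to a
    probability of at least one exceedance in `years` (Poisson model)."""
    rate = sum(r for pga, r in zip(pga_bins, annual_rates)
               if displacement_fn(pga) > threshold_m)
    return 1.0 - math.exp(-rate * years)

def toy_displacement(pga_g, yield_acc=0.2):
    """Hypothetical displacement model: displacement (m) grows with PGA
    above a yield acceleration of 0.2 g (for illustration only)."""
    return 0.0 if pga_g <= yield_acc else 2.0 * (pga_g - yield_acc) ** 2

pga_bins = [0.1, 0.2, 0.4, 0.8]          # g (illustrative bin centers)
annual_rates = [1e-2, 4e-3, 1e-3, 2e-4]  # illustrative occurrence rates per bin

p = prob_exceed_in_period(pga_bins, annual_rates, toy_displacement, 0.1)
print(round(p, 5))
```

    Evaluating this at every pixel, for each displacement threshold (0.1, 0.3, 1.0, 10 m), yields the map layers described above.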

  8. Automatic correction of dental artifacts in PET/MRI

    PubMed Central

    Ladefoged, Claes N.; Andersen, Flemming L.; Keller, Sune. H.; Beyer, Thomas; Law, Ian; Højgaard, Liselotte; Darkner, Sune; Lauze, Francois

    2015-01-01

    A challenge when using current magnetic resonance (MR)-based attenuation correction in positron emission tomography/MR imaging (PET/MRI) is that the MRIs can have a signal void around the dental fillings that is segmented as artificial air-regions in the attenuation map. For artifacts connected to the background, we propose an extension to an existing active contour algorithm to delineate the outer contour using the nonattenuation-corrected PET image and the original attenuation map. We propose a combination of two different methods for differentiating the artifacts within the body from the anatomical air-regions by first using a template of artifact regions, and second, representing the artifact regions with a combination of active shape models and k-nearest-neighbors. The accuracy of the combined method has been evaluated using 25 18F-fluorodeoxyglucose PET/MR patients. Results showed that the approach was able to correct an average of 97±3% of the artifact areas. PMID:26158104

  9. Quantitative performance evaluation of 124I PET/MRI lesion dosimetry in differentiated thyroid cancer

    NASA Astrophysics Data System (ADS)

    Wierts, R.; Jentzen, W.; Quick, H. H.; Wisselink, H. J.; Pooters, I. N. A.; Wildberger, J. E.; Herrmann, K.; Kemerink, G. J.; Backes, W. H.; Mottaghy, F. M.

    2018-01-01

    The aim was to investigate the quantitative performance of 124I PET/MRI for pre-therapy lesion dosimetry in differentiated thyroid cancer (DTC). Phantom measurements were performed on a PET/MRI system (Biograph mMR, Siemens Healthcare) using 124I and 18F. The PET calibration factor and the influence of radiofrequency coil attenuation were determined using a cylindrical phantom homogeneously filled with radioactivity. The calibration factor was 1.00 ± 0.02 for 18F and 0.88 ± 0.02 for 124I. Near the radiofrequency surface coil, an underestimation of less than 5% in radioactivity concentration was observed. Soft-tissue sphere recovery coefficients were determined using the NEMA IEC body phantom. Recovery coefficients were systematically higher for 18F than for 124I. In addition, the six spheres of the phantom were segmented using a PET-based iterative segmentation algorithm. For all 124I measurements, the deviations in segmented lesion volume and mean radioactivity concentration relative to the actual values were smaller than 15% and 25%, respectively. The effect of MR-based attenuation correction (three- and four-segment µ-maps) on bone lesion quantification was assessed using radioactive spheres filled with a K2HPO4 solution mimicking bone lesions. The four-segment µ-map resulted in an underestimation of the imaged radioactivity concentration of up to 15%, whereas the three-segment µ-map resulted in an overestimation of up to 10%. For twenty lesions identified in six patients, a comparison of 124I PET/MRI to PET/CT was performed with respect to segmented lesion volume and radioactivity concentration. The interclass correlation coefficients showed excellent agreement in segmented lesion volume and radioactivity concentration (0.999 and 0.95, respectively). In conclusion, accurate quantitative 124I PET/MRI is feasible and could be used to perform pre-therapy radioiodine lesion dosimetry in DTC.

  10. A probabilistic estimate of maximum acceleration in rock in the contiguous United States

    USGS Publications Warehouse

    Algermissen, Sylvester Theodore; Perkins, David M.

    1976-01-01

    This paper presents a probabilistic estimate of the maximum ground acceleration to be expected from earthquakes occurring in the contiguous United States. It is based primarily upon the historic seismic record, which ranges from very incomplete before 1930 to moderately complete after 1960. Geologic data, primarily the distribution of faults, have been employed only to a minor extent, because most such data have not yet been interpreted with earthquake hazard evaluation in mind. The map provides a preliminary estimate of the relative hazard in various parts of the country. The report provides a method for evaluating the relative importance of the many parameters and assumptions in hazard analysis. The map and methods of evaluation described reflect the current state of understanding and are intended to be useful for engineering purposes in reducing the effects of earthquakes on buildings and other structures. Studies are underway on improved methods for evaluating the relative earthquake hazard of different regions. Comments on this paper are invited to help guide future research and revisions of the accompanying map. The earthquake hazard in the United States has been estimated in a variety of ways since the initial effort by Ulrich (see Roberts and Ulrich, 1950). In general, the earlier maps provided an estimate of the severity of ground shaking or damage, but the frequency of occurrence of the shaking or damage was not given. Ulrich's map showed the distribution of expected damage in terms of no damage (zone 0), minor damage (zone 1), moderate damage (zone 2), and major damage (zone 3). The zones were not defined further and the frequency of occurrence of damage was not suggested. Richter (1959) and Algermissen (1969) estimated the ground motion in terms of maximum Modified Mercalli intensity. 
Richter used the terms "occasional" and "frequent" to characterize intensity IX shaking, and Algermissen included recurrence curves for various parts of the country in the paper accompanying his map. The first probabilistic hazard maps covering portions of the United States were by Milne and Davenport (1969a). Recently, Wiggins, Hirshberg and Bronowicki (1974) prepared a probabilistic map of maximum particle velocity and Modified Mercalli intensity for the entire United States. The maps are based on an analysis of the historical seismicity. In general, geological data were not incorporated into the development of the maps.

  11. Parametric mapping using spectral analysis for 11C-PBR28 PET reveals neuroinflammation in mild cognitive impairment subjects.

    PubMed

    Fan, Zhen; Dani, Melanie; Femminella, Grazia D; Wood, Melanie; Calsolaro, Valeria; Veronese, Mattia; Turkheimer, Federico; Gentleman, Steve; Brooks, David J; Hinz, Rainer; Edison, Paul

    2018-07-01

    Neuroinflammation and microglial activation play an important role in amnestic mild cognitive impairment (MCI) and Alzheimer's disease. In this study, we investigated the spatial distribution of neuroinflammation in MCI subjects, using spectral analysis (SA) to generate parametric maps and quantify 11C-PBR28 PET, and compared these with compartmental and other kinetic models of quantification. Thirteen MCI and nine healthy controls were enrolled in this study. Subjects underwent 11C-PBR28 PET scans with arterial cannulation. Spectral analysis with an arterial plasma input function was used to generate 11C-PBR28 parametric maps. These maps were then compared with regional 11C-PBR28 VT (volume of distribution) using a two-tissue compartment model and Logan graphic analysis. Amyloid load was also assessed with 18F-Flutemetamol PET. With SA, three component peaks were identified in addition to blood volume. The 11C-PBR28 impulse response function (IRF) at 90 min produced the lowest coefficient of variation. Single-subject analysis using this IRF demonstrated microglial activation in five out of seven amyloid-positive MCI subjects. IRF parametric maps of 11C-PBR28 uptake revealed a group-wise significant increase in neuroinflammation in amyloid-positive MCI subjects versus healthy controls in multiple cortical association areas, and particularly in the temporal lobe. Interestingly, compartmental analysis detected a group-wise increase in 11C-PBR28 binding in the thalamus of amyloid-positive MCI subjects, while Logan parametric maps did not perform well. This study demonstrates for the first time that spectral analysis can be used to generate parametric maps of 11C-PBR28 uptake, and is able to detect microglial activation in amyloid-positive MCI subjects. IRF parametric maps of 11C-PBR28 uptake allow voxel-wise single-subject analysis and could be used to evaluate microglial activation in individual subjects.

  12. 4D ML reconstruction as a tool for volumetric PET-based treatment verification in ion beam radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Bernardi, E., E-mail: elisabetta.debernardi@unimib.it; Ricotti, R.; Riboldi, M.

    2016-02-15

    Purpose: An innovative strategy to improve the sensitivity of positron emission tomography (PET)-based treatment verification in ion beam radiotherapy is proposed. Methods: Low counting statistics PET images acquired during or shortly after the treatment (Measured PET) and a Monte Carlo estimate of the same PET images derived from the treatment plan (Expected PET) are considered as two frames of a 4D dataset. A 4D maximum likelihood reconstruction strategy was adapted to iteratively estimate the annihilation events distribution in a reference frame and the deformation motion fields that map it in the Expected PET and Measured PET frames. The outputs generated by the proposed strategy are as follows: (1) an estimate of the Measured PET with an image quality comparable to the Expected PET and (2) an estimate of the motion field mapping Expected PET to Measured PET. The details of the algorithm are presented and the strategy is preliminarily tested on analytically simulated datasets. Results: The algorithm demonstrates (1) robustness against noise, even in the worst conditions, where 1.5 × 10^4 true coincidences and a random fraction of 73% are simulated; (2) a proper sensitivity to different kinds and grades of mismatch ranging between 1 and 10 mm; (3) robustness against bias due to incorrect washout modeling in the Monte Carlo simulation up to 1/3 of the original signal amplitude; and (4) an ability to describe the mismatch even in the presence of complex annihilation distributions such as those induced by two perpendicular superimposed ion fields. Conclusions: The promising results obtained in this work suggest the applicability of the method as a quantification tool for PET-based treatment verification in ion beam radiotherapy. An extensive assessment of the proposed strategy on real treatment verification data is planned.

  13. Disease quantification on PET/CT images without object delineation

    NASA Astrophysics Data System (ADS)

    Tong, Yubing; Udupa, Jayaram K.; Odhner, Dewey; Wu, Caiyun; Fitzpatrick, Danielle; Winchell, Nicole; Schuster, Stephen J.; Torigian, Drew A.

    2017-03-01

    The derivation of quantitative information from images to make quantitative radiology (QR) clinically practical continues to face a major image analysis hurdle because of image segmentation challenges. This paper presents a novel approach to disease quantification (DQ) via positron emission tomography/computed tomography (PET/CT) images that explores how to decouple DQ methods from explicit dependence on object segmentation through the use of only object recognition results to quantify disease burden. The concept of an object-dependent disease map is introduced to express disease severity without performing explicit delineation and partial volume correction of either objects or lesions. The parameters of the disease map are estimated from a set of training image data sets. The idea is illustrated on 20 lung lesions and 20 liver lesions derived from 18F-2-fluoro-2-deoxy-D-glucose (FDG)-PET/CT scans of patients with various types of cancers and also on 20 NEMA PET/CT phantom data sets. Our preliminary results show that, on phantom data sets, "disease burden" can be estimated to within 2% of known absolute true activity. Notwithstanding the difficulty in establishing true quantification on patient PET images, our results achieve 8% deviation from "true" estimates, with slightly larger deviations for small and diffuse lesions, where establishing ground truth becomes questionable, and smaller deviations for larger lesions, where ground truth setup becomes more reliable. We are currently exploring extensions of the approach to include fully automated body-wide DQ, extensions to just CT or magnetic resonance imaging (MRI) alone, to PET/CT performed with radiotracers other than FDG, and other functional forms of disease maps.

  14. Errors in MR-based attenuation correction for brain imaging with PET/MR scanners

    NASA Astrophysics Data System (ADS)

    Rota Kops, Elena; Herzog, Hans

    2013-02-01

    Aim: Attenuation correction of PET data acquired by hybrid MR/PET scanners remains a challenge, even though several methods for brain and whole-body measurements have been developed recently. A template-based attenuation correction for brain imaging proposed by our group is easy to handle and delivers reliable attenuation maps in a short time. However, some potential error sources are analyzed in this study. We investigated the choice of the template reference head among all available data (error A), and possible skull anomalies of the specific patient, such as discontinuities due to surgery (error B). Materials and methods: An anatomical MR measurement and a 2-bed-position transmission scan covering the whole head and neck region were performed in eight normal subjects (4 females, 4 males). Error A: Taking each of the eight heads in turn as reference, eight different templates were created by nonlinearly registering the images to the reference and calculating the average. Eight patients (4 females, 4 males; 4 with brain lesions, 4 without) were measured in the Siemens BrainPET/MR scanner. The eight templates were used to generate the patients' attenuation maps required for reconstruction. ROI- and VOI-atlas-based comparisons were performed on all the reconstructed images. Error B: CT-based attenuation maps of two volunteers were manipulated by manually inserting several skull lesions and filling a nasal cavity. The corresponding attenuation coefficients were substituted with that of water (0.096/cm). Results: Error A: The mean SUVs over the eight template pairs for all eight patients and all VOIs did not differ significantly from each other. Standard deviations of up to 1.24% were found. 
Error B: After reconstruction of the volunteers' BrainPET data with the CT-based attenuation maps without and with skull anomalies, a VOI-atlas analysis revealed very little influence of the skull lesions (less than 3%), while the filled nasal cavity yielded an overestimation in the cerebellum of up to 5%. Conclusions: The present error analysis confirms that our template-based attenuation method provides reliable attenuation correction for PET brain imaging on PET/MR scanners.

  15. Direct Reconstruction of CT-Based Attenuation Correction Images for PET With Cluster-Based Penalties

    NASA Astrophysics Data System (ADS)

    Kim, Soo Mee; Alessio, Adam M.; De Man, Bruno; Kinahan, Paul E.

    2017-03-01

    Extremely low-dose (LD) CT acquisitions used for PET attenuation correction have high levels of noise and potential bias artifacts due to photon starvation. This paper explores the use of a priori knowledge for iterative image reconstruction of the CT-based attenuation map. We investigate a maximum a posteriori framework with a cluster-based multinomial penalty for direct iterative coordinate descent (dICD) reconstruction of the PET attenuation map. The objective function for direct iterative attenuation map reconstruction used a Poisson log-likelihood data-fit term and evaluated two image penalty terms, based on spatial and mixture distributions. The spatial regularization is based on a quadratic penalty. For the mixture penalty, we assumed that the attenuation map may consist of four material clusters: air + background, lung, soft tissue, and bone. Using simulated noisy sinogram data, dICD reconstruction was performed with different strengths of the spatial and mixture penalties. The combined spatial and mixture penalties roughly halved the root mean squared error (RMSE) compared with weighted-least-squares and filtered-backprojection reconstructions of the CT images. The combined spatial and mixture penalties resulted in only slightly lower RMSE than a spatial quadratic penalty alone. For direct PET attenuation map reconstruction from ultra-LD CT acquisitions, the combination of spatial and mixture penalties offers regularization of both variance and bias and is a potential method for reconstructing attenuation maps with negligible patient dose. The presented results, obtained using a best-case histogram, suggest that the mixture penalty does not offer a substantive benefit over conventional quadratic regularization, which diminishes enthusiasm for future application of the mixture penalty.
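In our own notation (the abstract does not give the paper's symbols, so every symbol below is an assumption), the penalized objective described above has the general MAP form:

```latex
% Poisson log-likelihood data-fit term
% (y_i: measured counts, \bar{y}_i(\mu): expected counts given attenuation map \mu)
L(y \mid \mu) = \sum_{i} \bigl[\, y_i \log \bar{y}_i(\mu) - \bar{y}_i(\mu) \,\bigr]

% Quadratic spatial penalty over neighboring voxel pairs
R_{\mathrm{sp}}(\mu) = \sum_{j} \sum_{k \in \mathcal{N}_j} w_{jk}\,(\mu_j - \mu_k)^2

% Four-cluster mixture penalty (air + background, lung, soft tissue, bone)
R_{\mathrm{mix}}(\mu) = -\sum_{j} \log \sum_{c=1}^{4} \pi_c\, \mathcal{N}(\mu_j;\, m_c, \sigma_c^2)

% Combined MAP estimate, maximized by coordinate descent
\hat{\mu} = \arg\max_{\mu}\; L(y \mid \mu) \;-\; \beta_s R_{\mathrm{sp}}(\mu) \;-\; \beta_m R_{\mathrm{mix}}(\mu)
```

The two penalty weights β_s and β_m correspond to the "different strengths of the spatial and mixture penalties" swept in the simulations.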

  16. Mixture Modeling for Background and Sources Separation in x-ray Astronomical Images

    NASA Astrophysics Data System (ADS)

    Guglielmetti, Fabrizia; Fischer, Rainer; Dose, Volker

    2004-11-01

    A probabilistic technique for the joint estimation of background and sources in high-energy astrophysics is described. Bayesian probability theory is applied to gain insight into the coexistence of background and sources through a probabilistic two-component mixture model, which provides consistent uncertainties for background and sources. The present analysis is applied to ROSAT PSPC data (0.1-2.4 keV) in Survey Mode. The background map is modelled using a thin-plate spline. Source probability maps are obtained for each pixel (45 arcsec) independently and for larger correlation lengths, revealing faint and extended sources. We demonstrate that the described probabilistic method improves the detection of faint, extended celestial sources compared with the Standard Analysis Software System (SASS) used for the production of the ROSAT All-Sky Survey (RASS) catalogues.
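The per-pixel logic of a two-component mixture model can be sketched as a Bayes update between "background only" and "background plus source" Poisson hypotheses. This is a simplification of the method above: the thin-plate-spline background is replaced by a fixed local rate, and the prior, rates, and function names are our illustrative assumptions.

```python
import math

def poisson_log_pmf(k, lam):
    # log P(K = k) for a Poisson distribution with mean lam (lgamma for stability)
    return k * math.log(lam) - lam - math.lgamma(k + 1)

def source_probability(counts, bg_rate, src_rate, prior=0.01):
    """Posterior probability that a pixel contains a source, given its observed
    counts, a local background rate, and a hypothesized source rate."""
    log_bg = poisson_log_pmf(counts, bg_rate)
    log_src = poisson_log_pmf(counts, bg_rate + src_rate)
    num = prior * math.exp(log_src)
    return num / (num + (1.0 - prior) * math.exp(log_bg))
```

Evaluating this over all pixels yields a source probability map; pooling counts over larger correlation lengths before the update is what lets faint extended sources emerge.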

  17. Water availability predicts forest canopy height at the global scale.

    PubMed

    Klein, Tamir; Randin, Christophe; Körner, Christian

    2015-12-01

    The tendency of trees to grow taller with increasing water availability is common knowledge. Yet a robust, universal relationship between the spatial distribution of water availability and forest canopy height (H) is lacking. Here, we created a global water availability map by calculating an annual budget as the difference between precipitation (P) and potential evapotranspiration (PET) at a 1-km spatial resolution, and in turn correlated it with a global H map of the same resolution. Across forested areas over the globe, Hmean increased with P-PET, roughly: Hmean (m) = 19.3 + 0.077*(P-PET). Maximum forest canopy height also increased gradually from ~ 5 to ~ 50 m, saturating at ~ 45 m for P-PET > 500 mm. Forests were far from their maximum height potential in cold, boreal regions and in disturbed areas. The strong association between forest height and P-PET provides a useful tool when studying future forest dynamics under climate change, and in quantifying anthropogenic forest disturbance. © 2015 John Wiley & Sons Ltd/CNRS.
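The reported regression is simple enough to state as a one-line function. The coefficients are taken directly from the abstract; note the fit is only meaningful over the range of forested climates the authors sampled, and says nothing outside it.

```python
def mean_canopy_height(p_minus_pet_mm):
    """Mean forest canopy height (m) from the annual water budget P - PET (mm),
    using Hmean = 19.3 + 0.077 * (P - PET) as reported in the abstract."""
    return 19.3 + 0.077 * p_minus_pet_mm
```

For example, a site with P - PET = 100 mm gives a predicted mean canopy height of about 27 m.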

  18. Toward Probabilistic Risk Analyses - Development of a Probabilistic Tsunami Hazard Assessment of Crescent City, CA

    NASA Astrophysics Data System (ADS)

    González, F. I.; Leveque, R. J.; Hatheway, D.; Metzger, N.

    2011-12-01

    Risk is defined in many ways, but most are consistent with Crichton's [1999] definition based on the "risk triangle" concept and the explicit identification of three risk elements: "Risk is the probability of a loss, and this depends on three elements: hazard, vulnerability, and exposure. If any of these three elements in risk increases or decreases, then the risk increases or decreases respectively." The World Meteorological Organization, for example, cites Crichton [1999] and then defines risk as [WMO, 2008] Risk = function (Hazard x Vulnerability x Exposure), while the Asian Disaster Reduction Center adopts the more general expression [ADRC, 2005] Risk = function (Hazard, Vulnerability, Exposure). In practice, probabilistic concepts are invariably invoked, and at least one of the three factors is specified as probabilistic in nature. The Vulnerability and Exposure factors are defined in multiple ways in the relevant literature; but the Hazard factor, which is the focus of our presentation, is generally understood to deal only with the physical aspects of the phenomena and, in particular, with the ability of the phenomena to inflict harm [Thywissen, 2006]. A Hazard factor can be estimated by a methodology known as Probabilistic Tsunami Hazard Assessment (PTHA) [González, et al., 2009]. We will describe the PTHA methodology and provide an example: the results of a previous application to Seaside, OR. We will also present preliminary results for a PTHA of Crescent City, CA, a pilot project and coastal modeling/mapping effort funded by the Federal Emergency Management Agency (FEMA) Region IX office as part of the new California Coastal Analysis and Mapping Project (CCAMP). CCAMP and the PTHA of Crescent City are being conducted under the nationwide FEMA Risk Mapping, Assessment, and Planning (Risk MAP) Program, which focuses on providing communities with flood information and tools they can use to enhance their mitigation plans and better protect their citizens.

  19. Probabilistic flood inundation mapping at ungauged streams due to roughness coefficient uncertainty in hydraulic modelling

    NASA Astrophysics Data System (ADS)

    Papaioannou, George; Vasiliades, Lampros; Loukas, Athanasios; Aronica, Giuseppe T.

    2017-04-01

    Probabilistic flood inundation mapping is performed and analysed for the ungauged Xerias stream reach, Volos, Greece. The study evaluates the uncertainty that roughness coefficient values introduce into hydraulic modelling and flood inundation mapping. The well-established one-dimensional (1-D) hydraulic model HEC-RAS is selected and linked to Monte Carlo simulations of hydraulic roughness. Terrestrial laser scanner data have been used to produce a high-quality DEM, minimising input-data uncertainty and improving the accuracy of the stream-channel topography required by the hydraulic model. Initial Manning's n roughness coefficient values are based on pebble-count field surveys and empirical formulas. Various theoretical probability distributions are fitted and evaluated for their accuracy in representing the estimated roughness values. Finally, Latin hypercube sampling is used to generate different sets of Manning roughness values, and flood inundation probability maps are created by Monte Carlo simulation. Historical flood extent data from an extreme historical flash-flood event are used to validate the method. The calibration process is based on binary wet-dry reasoning, using the median absolute percentage error as the evaluation metric. The results show that the proposed procedure supports probabilistic flood hazard mapping at ungauged rivers and provides water resources managers with valuable information for planning and implementing flood risk mitigation strategies.
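The sampling-and-counting loop behind such a probability map can be sketched compactly. This is not the authors' pipeline: the lognormal roughness distribution, its parameters, and the toy stage model standing in for HEC-RAS are all illustrative assumptions; only the Latin hypercube stratification and the wet-fraction probability map follow the method described.

```python
import math
import random
from statistics import NormalDist

def latin_hypercube(n, rng):
    """One stratified uniform(0, 1) draw per equal-probability stratum."""
    u = [(i + rng.random()) / n for i in range(n)]
    rng.shuffle(u)
    return u

def sample_manning_n(n_samples, median=0.04, sigma_log=0.3, seed=42):
    """Draw Manning's n via LHS from a lognormal distribution
    (distribution family and parameters are illustrative, not fitted)."""
    rng = random.Random(seed)
    normal = NormalDist(math.log(median), sigma_log)
    return [math.exp(normal.inv_cdf(u)) for u in latin_hypercube(n_samples, rng)]

def inundation_probability(cell_elevations, manning_samples):
    """Stand-in for the hydraulic model: water stage rises monotonically with
    roughness (stage = 2 + 10 n, purely illustrative). The probability map is
    the fraction of Monte Carlo runs in which each cell is wet."""
    probs = []
    for z in cell_elevations:
        wet = sum(1 for n in manning_samples if 2.0 + 10.0 * n > z)
        probs.append(wet / len(manning_samples))
    return probs

samples = sample_manning_n(200)
prob_map = inundation_probability([2.1, 2.5, 3.0], samples)  # low to high ground
```

In a real application each Monte Carlo run is a full HEC-RAS simulation on the DEM, and the wet-dry outcome is recorded per grid cell rather than per scalar elevation.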

  20. The effect of metal artefact reduction on CT-based attenuation correction for PET imaging in the vicinity of metallic hip implants: a phantom study.

    PubMed

    Harnish, Roy; Prevrhal, Sven; Alavi, Abass; Zaidi, Habib; Lang, Thomas F

    2014-07-01

    To determine if metal artefact reduction (MAR) combined with a priori knowledge of prosthesis material composition can be applied to obtain CT-based attenuation maps with sufficient accuracy for quantitative assessment of (18)F-fluorodeoxyglucose uptake in lesions near metallic prostheses. A custom hip prosthesis phantom with a lesion-sized cavity filled with 0.2 ml (18)F-FDG solution having an activity of 3.367 MBq adjacent to a prosthesis bore was imaged twice with a chrome-cobalt steel hip prosthesis and a plastic replica, respectively. Scanning was performed on a clinical hybrid PET/CT system equipped with an additional external (137)Cs transmission source. PET emission images were reconstructed from both phantom configurations with CT-based attenuation correction (CTAC) and with CT-based attenuation correction using MAR (MARCTAC). To compare results with the attenuation-correction method extant prior to the advent of PET/CT, we also carried out attenuation correction with (137)Cs transmission-based attenuation correction (TXAC). CTAC and MARCTAC images were scaled to attenuation coefficients at 511 keV using a trilinear function that mapped the highest CT values to the prosthesis alloy attenuation coefficient. Accuracy and spatial distribution of the lesion activity was compared between the three reconstruction schemes. Compared to the reference activity of 3.37 MBq, the estimated activity quantified from the PET image corrected by TXAC was 3.41 MBq. The activity estimated from PET images corrected by MARCTAC was similar in accuracy at 3.32 MBq. CTAC corrected PET images resulted in nearly 40 % overestimation of lesion activity at 4.70 MBq. Comparison of PET images obtained with the plastic and metal prostheses in place showed that CTAC resulted in a marked distortion of the (18)F-FDG distribution within the lesion, whereas application of MARCTAC and TXAC resulted in lesion distributions similar to those observed with the plastic replica. 
MAR combined with a trilinear CT number mapping for PET attenuation correction resulted in estimates of lesion activity comparable in accuracy to that obtained with (137)Cs transmission-based attenuation correction, and far superior to estimates made without attenuation correction or with a standard CT attenuation map. The ability to use CT images for attenuation correction is a potentially important development because it obviates the need for a (137)Cs transmission source, which entails extra scan time, logistical complexity and expense.
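The "trilinear function" above is a piecewise-linear CT-number-to-attenuation mapping with the highest CT values pinned to the prosthesis alloy. The sketch below shows the shape of such a mapping; the knot positions and attenuation values are plausible round numbers chosen for illustration, not those used in the study.

```python
# Illustrative piecewise-linear ("trilinear") mapping from CT numbers to linear
# attenuation coefficients at 511 keV. Knots are assumptions, NOT study values.
KNOTS = [(-1000.0, 0.0),    # air
         (0.0, 0.096),      # water / soft tissue
         (2000.0, 0.17),    # dense bone
         (8000.0, 0.40)]    # highest CT values -> prosthesis alloy

def hu_to_mu_511(hu):
    """Linear attenuation coefficient (cm^-1) at 511 keV for a CT number,
    by linear interpolation between the knots, clamped at both ends."""
    if hu <= KNOTS[0][0]:
        return KNOTS[0][1]
    if hu >= KNOTS[-1][0]:
        return KNOTS[-1][1]
    for (h0, m0), (h1, m1) in zip(KNOTS, KNOTS[1:]):
        if hu <= h1:
            return m0 + (m1 - m0) * (hu - h0) / (h1 - h0)
```

The key design point is the third segment: without it, a standard bilinear CT conversion saturates far below the alloy's true attenuation, which is one source of the ~40% overestimation seen with plain CTAC.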

  1. Covariance approximation for fast and accurate computation of channelized Hotelling observer statistics

    NASA Astrophysics Data System (ADS)

    Bonetto, P.; Qi, Jinyi; Leahy, R. M.

    2000-08-01

    Describes a method for computing linear observer statistics for maximum a posteriori (MAP) reconstructions of PET images. The method is based on a theoretical approximation for the mean and covariance of MAP reconstructions. In particular, the authors derive a closed form for the channelized Hotelling observer (CHO) statistic applied to 2D MAP images. The theoretical analysis models both the Poisson statistics of PET data and the inhomogeneity of tracer uptake. The authors show reasonably good correspondence between these theoretical results and Monte Carlo studies. The accuracy and low computational cost of the approximation allow the authors to analyze observer performance over a wide range of operating conditions and parameter settings for the MAP reconstruction algorithm.
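The CHO statistic itself (as distinct from the paper's closed-form covariance approximation, which is not reproduced here) is computable from sample channel outputs; the names and synthetic check below are ours.

```python
import numpy as np

def cho_snr(channels_signal, channels_noise):
    """Channelized Hotelling observer detectability from channel outputs.
    Rows are image realizations, columns are channel responses."""
    dv = channels_signal.mean(axis=0) - channels_noise.mean(axis=0)
    # Average intra-class channel covariance
    s = 0.5 * (np.cov(channels_signal, rowvar=False)
               + np.cov(channels_noise, rowvar=False))
    # SNR^2 = dv^T S^{-1} dv; the Hotelling template is w = S^{-1} dv
    return float(np.sqrt(dv @ np.linalg.solve(s, dv)))

# Synthetic check: two 4-channel Gaussian classes separated by 2 units along
# one channel, with unit covariance, should give SNR near 2.
rng = np.random.default_rng(0)
noise = rng.normal(0.0, 1.0, size=(4000, 4))
signal = rng.normal(0.0, 1.0, size=(4000, 4))
signal[:, 0] += 2.0
snr = cho_snr(signal, noise)
```

The paper's contribution is precisely to replace the sample means and covariance above with theoretical predictions, avoiding the Monte Carlo reconstructions this sketch requires.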

  2. First Volcanological-Probabilistic Pyroclastic Density Current and Fallout Hazard Map for Campi Flegrei and Somma Vesuvius Volcanoes.

    NASA Astrophysics Data System (ADS)

    Mastrolorenzo, G.; Pappalardo, L.; Troise, C.; Panizza, A.; de Natale, G.

    2005-05-01

    An integrated volcanological-probabilistic approach has been used to simulate pyroclastic density currents and fallout and to produce hazard maps for the Campi Flegrei and Somma-Vesuvius areas. On the basis of an analysis of all types of pyroclastic flows, surges, secondary pyroclastic density currents, and fallout events that occurred in the volcanological history of the two volcanic areas, and an evaluation of the probability of each type of event, matrices of input parameters for numerical simulation were constructed. The multi-dimensional input matrices include the main parameters controlling pyroclast transport, deposition, and dispersion, as well as the set of possible eruptive vents used in the simulation program. The probabilistic hazard maps provide, for each point of the Campanian area, the yearly probability of being affected by a given event of a given intensity, and the resulting damage. Probabilities of a few events per thousand years are typical of most areas within roughly 10 km of the volcanoes, including Naples. The results provide constraints for the emergency plans of the Neapolitan area.
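The "yearly probability" framing can be made concrete with the standard Poisson-process identity for recurrence; the rate below is only one illustrative reading of "a few events per thousand years", not a value from the study.

```python
import math

def prob_at_least_one_event(annual_rate, years=1.0):
    """Probability of at least one event in a time window, assuming events
    arrive as a Poisson process with the given mean annual rate."""
    return 1.0 - math.exp(-annual_rate * years)

# e.g. 2 events per 1000 years -> an annual rate of about 0.002/yr:
p_1yr = prob_at_least_one_event(0.002)        # yearly probability
p_50yr = prob_at_least_one_event(0.002, 50)   # over a 50-year exposure
```

For small rates the yearly probability is essentially the rate itself, but over multi-decade planning horizons the exposure probability becomes non-negligible, which is what the emergency-planning constraints rest on.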

  3. User perception and interpretation of tornado probabilistic hazard information: Comparison of four graphical designs.

    PubMed

    Miran, Seyed M; Ling, Chen; James, Joseph J; Gerard, Alan; Rothfusz, Lans

    2017-11-01

    Effective design for presenting severe weather information is important to reduce the devastating consequences of severe weather. The Probabilistic Hazard Information (PHI) system for severe weather is being developed by the NOAA National Severe Storms Laboratory (NSSL) to communicate probabilistic hazardous weather information. This study investigates the effects of four PHI graphical designs for tornado threat, namely "four-color", "red-scale", "grayscale", and "contour", on users' perception, interpretation, and reaction to threat information. PHI is presented on either a map background or a radar background. Analysis showed that accuracy was significantly higher and response time faster when PHI was displayed on the map background as compared to the radar background, due to better contrast. When displayed on a radar background, the "grayscale" design resulted in higher response accuracy. Possibly due to familiarity, participants reported the four-color design as their favorite, and it also yielded the fastest recognition of probability levels on both backgrounds. Our study shows the importance of using intuitive color-coding and sufficient contrast in conveying probabilistic threat information via graphical design. We also found that users follow a rational perceiving, judging, feeling, and acting sequence in processing probabilistic hazard information for tornadoes. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. A novel visualisation tool for climate services: a case study of temperature extremes and human mortality in Europe

    NASA Astrophysics Data System (ADS)

    Lowe, R.; Ballester, J.; Robine, J.; Herrmann, F. R.; Jupp, T. E.; Stephenson, D.; Rodó, X.

    2013-12-01

    Users of climate information often require probabilistic information on which to base their decisions. However, communicating information contained within a probabilistic forecast presents a challenge. In this paper we demonstrate a novel visualisation technique to display ternary probabilistic forecasts on a map in order to inform decision making. In this method, ternary probabilistic forecasts, which assign probabilities to a set of three outcomes (e.g. low, medium, and high risk), are considered as a point in a triangle of barycentric coordinates. This allows a unique colour to be assigned to each forecast from a continuum of colours defined on the triangle. Colour saturation increases with information gain relative to the reference forecast (i.e. the long-term average). This provides additional information to decision makers compared with conventional methods used in seasonal climate forecasting, where one colour is used to represent one forecast category on a forecast map (e.g. red = 'dry'). We use the tool to present climate-related mortality projections across Europe. Temperature and humidity are related to human mortality via location-specific transfer functions calculated using historical data. Daily mortality data at the NUTS2 level for 16 countries in Europe were obtained for 1998-2005. Transfer functions were calculated for 54 aggregations in Europe, defined using criteria related to population and climatological similarities. Aggregations are restricted to fall within political boundaries to avoid problems related to varying adaptation policies between countries. A statistical model is fitted to the cold and warm tails to estimate future mortality from forecast temperatures, in a Bayesian probabilistic framework. Using predefined categories of temperature-related mortality risk, we present maps of probabilistic projections for human mortality at seasonal to decadal time scales. We demonstrate the information gained from using this technique compared with more traditional methods of displaying ternary probabilistic forecasts. The technique allows decision makers to identify areas where the model confidently predicts area-specific heat waves or cold snaps, in order to target resources effectively to the areas most at risk in a given season or year. It is hoped that this visualisation tool will facilitate the interpretation of probabilistic forecasts not only for public health decision makers but also within a multi-sectoral climate service framework.
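The barycentric colour assignment can be sketched as follows. The vertex colours, the use of KL divergence against the uniform climatology as the "information gain", and the fade-to-white saturation rule are our illustrative choices, not necessarily the paper's exact construction.

```python
import math

# Vertex colours for the three forecast categories (illustrative RGB choices)
VERTICES = {"low": (0.2, 0.4, 1.0), "medium": (1.0, 0.8, 0.2), "high": (1.0, 0.2, 0.2)}

def ternary_color(p_low, p_med, p_high):
    """Map a ternary probability forecast to one colour: hue from barycentric
    mixing of the vertex colours, saturation from the information gain (KL
    divergence) relative to the uniform climatology (1/3, 1/3, 1/3)."""
    probs = (p_low, p_med, p_high)
    # Barycentric mix: the forecast's position in the triangle picks the hue
    mix = [sum(p * c[i] for p, c in zip(probs, VERTICES.values())) for i in range(3)]
    kl = sum(p * math.log(3.0 * p) for p in probs if p > 0.0)  # info gain, nats
    sat = max(0.0, min(1.0, kl / math.log(3.0)))  # 0 at climatology, 1 at a point mass
    return tuple(1.0 * (1.0 - sat) + m * sat for m in mix)  # fade to white
```

A forecast equal to climatology carries no information and renders white, while a confident forecast renders a fully saturated category colour; everything in between is a continuous blend.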

  5. Dental artifacts in the head and neck region: implications for Dixon-based attenuation correction in PET/MR.

    PubMed

    Ladefoged, Claes N; Hansen, Adam E; Keller, Sune H; Fischer, Barbara M; Rasmussen, Jacob H; Law, Ian; Kjær, Andreas; Højgaard, Liselotte; Lauze, Francois; Beyer, Thomas; Andersen, Flemming L

    2015-12-01

    In the absence of CT or traditional transmission sources in combined clinical positron emission tomography/magnetic resonance (PET/MR) systems, MR images are used for MR-based attenuation correction (MR-AC). The susceptibility effects due to metal implants challenge MR-AC in the neck region of patients with dental implants. The purpose of this study was to assess the frequency and magnitude of subsequent PET image distortions following MR-AC. A total of 148 PET/MR patients with clear visual signal voids on the attenuation map in the dental region were included in this study. Patients were injected with [(18)F]-FDG, [(11)C]-PiB, [(18)F]-FET, or [(64)Cu]-DOTATATE. The PET/MR data were acquired over a single-bed position of 25.8 cm covering the head and neck. MR-AC was based on either the standard MR-AC(Dixon) map or an MR-AC(inpainted) map in which the susceptibility-induced signal voids were substituted with soft-tissue information. Our inpainting algorithm delineates the outer contour of signal voids breaching the anatomical volume using the non-attenuation-corrected PET image and classifies the inner air regions based on an aligned template of likely dental artifact areas. The reconstructed PET images were evaluated visually and quantitatively using regions of interest in reference regions. The volume of the artifacts and the computed relative differences in mean and maximum standardized uptake value (SUV) between the two PET images are reported. The MR-based volume of the susceptibility-induced signal voids on the MR-AC attenuation maps was between 1.6 and 520.8 mL. The resulting bias of the reconstructed tracer distribution was localized mainly in the area of the signal void. The mean and maximum SUVs averaged across all patients increased after inpainting by 52% (± 11%) and 28% (± 11%), respectively, in the corrected region. 
SUV underestimation decreased with the distance to the signal void and correlated with the volume of the susceptibility artifact on the MR-AC attenuation map. Metallic dental work may cause severe MR signal voids. The resulting PET/MR artifacts may exceed the actual volume of the dental fillings. The subsequent bias in PET is severe in regions in and near the signal voids and may affect the conspicuity of lesions in the mandibular region.

  6. Practical guide for implementing hybrid PET/MR clinical service: lessons learned from our experience

    PubMed Central

    Parikh, Nainesh; Friedman, Kent P.; Shah, Shetal N.; Chandarana, Hersh

    2015-01-01

    Positron emission tomography (PET) and magnetic resonance imaging, until recently, have been performed on separate PET and MR systems with varying temporal delay between the two acquisitions. The interpretation of these two separately acquired studies requires cognitive fusion by radiologists/nuclear medicine physicians or dedicated and challenging post-processing. Recent advances in hardware and software with introduction of hybrid PET/MR systems have made it possible to acquire the PET and MR images simultaneously or near simultaneously. This review article serves as a road-map for clinical implementation of hybrid PET/MR systems and briefly discusses hardware systems, the personnel needs, safety and quality issues, and reimbursement topics based on experience at NYU Langone Medical Center and Cleveland Clinic. PMID:25985966

  7. Probabilistic choice between symmetric disparities in motion stereo matching for a lateral navigation system

    NASA Astrophysics Data System (ADS)

    Ershov, Egor; Karnaukhov, Victor; Mozerov, Mikhail

    2016-02-01

    Two consecutive frames of a lateral navigation camera video sequence can be considered an appropriate approximation to epipolar stereo. To overcome edge-aware inaccuracy caused by occlusion, we propose a model that matches the current frame to both the next and the previous one. The positive disparity of matching to the previous frame has a symmetric negative disparity to the next frame. The proposed algorithm performs a probabilistic choice, for each matched pixel, between the positive disparity cost and its symmetric disparity cost. A disparity map obtained by optimization over the cost volume composed of the proposed probabilistic choice is more accurate than the traditional left-to-right and right-to-left disparity map cross-check. Also, our algorithm needs half as many computational operations per pixel as the cross-check technique. The effectiveness of our approach is demonstrated on synthetic data and real video sequences with ground-truth values.
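One plausible reading of the probabilistic choice is a softmin between the forward and backward matching costs at each candidate disparity; when occlusion inflates one cost, the combined cost stays close to the reliable one. The soft weighting, temperature, and names below are our assumptions, and the paper's exact selection rule may differ.

```python
import math

def combined_cost(cost_fwd, cost_bwd, temp=1.0):
    """Soft probabilistic choice between the cost of matching the current frame
    to the next frame at disparity +d (cost_fwd) and to the previous frame at
    the symmetric disparity -d (cost_bwd)."""
    w = math.exp(-cost_fwd / temp)
    v = math.exp(-cost_bwd / temp)
    p = w / (w + v)  # probability assigned to the forward match
    return p * cost_fwd + (1.0 - p) * cost_bwd

def best_disparity(costs_fwd, costs_bwd, temp=1.0):
    """Winner-take-all over the combined cost volume for a single pixel."""
    combined = [combined_cost(f, b, temp) for f, b in zip(costs_fwd, costs_bwd)]
    return min(range(len(combined)), key=combined.__getitem__)
```

Unlike a left-right cross-check, which computes and compares two full disparity maps, this fuses the two directions inside a single cost volume, which is where the factor-of-two saving in per-pixel operations comes from.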

  8. Clinical Evaluation of Zero-Echo-Time Attenuation Correction for Brain 18F-FDG PET/MRI: Comparison with Atlas Attenuation Correction.

    PubMed

    Sekine, Tetsuro; Ter Voert, Edwin E G W; Warnock, Geoffrey; Buck, Alfred; Huellner, Martin; Veit-Haibach, Patrick; Delso, Gaspar

    2016-12-01

    Accurate attenuation correction (AC) on PET/MR is still challenging. The purpose of this study was to evaluate the clinical feasibility of AC based on fast zero-echo-time (ZTE) MRI by comparing it with the default atlas-based AC on a clinical PET/MR scanner. We recruited 10 patients with malignant diseases not involving the brain. In all patients, a clinically indicated whole-body 18F-FDG PET/CT scan was acquired. In addition, a head PET/MR scan was obtained voluntarily. For each patient, 2 AC maps were generated from the MR images. One was atlas-AC, derived from T1-weighted liver acquisition with volume acceleration flex images (the clinical standard). The other was ZTE-AC, derived from proton-density-weighted ZTE images by applying tissue segmentation and assigning continuous attenuation values to bone. The AC map generated by PET/CT was used as a silver standard. On the basis of each AC map, PET images were reconstructed from identical raw data on the PET/MR scanner. All PET images were normalized to the SPM5 PET template. These images were then evaluated visually and quantified in 67 volumes of interest (VOIs; automated anatomical labeling atlas). Relative differences and absolute relative differences between the PET images based on each AC were calculated. 18F-FDG uptake in all 670 VOIs and in generalized merged VOIs was compared using a paired t test. Qualitative analysis showed that ZTE-AC was robust to patient variability. Nevertheless, misclassification of air and bone in the mastoid and nasal areas led to overestimation of PET in the temporal lobe and cerebellum (%diff of ZTE-AC, 2.46% ± 1.19% and 3.31% ± 1.70%, respectively). The |%diff| across all 670 VOIs was improved by approximately 25% with ZTE compared with atlas-AC (ZTE-AC vs. atlas-AC, 1.77% ± 1.41% vs. 2.44% ± 1.63%, P < 0.01). In 2 of 7 generalized VOIs, |%diff| for ZTE-AC was significantly smaller than for atlas-AC (ZTE-AC vs. atlas-AC: insula and cingulate, 1.06% ± 0.67% vs. 
2.22% ± 1.10%, P < 0.01; central structure, 1.03% ± 0.99% vs. 2.54% ± 1.20%, P < 0.05). The ZTE-AC could provide more accurate AC than clinical atlas-AC by improving the estimation of head-skull attenuation. The misclassification in mastoid and nasal areas must be addressed to prevent the overestimation of PET in regions near the skull base. © 2016 by the Society of Nuclear Medicine and Molecular Imaging, Inc.

  9. Visual Assessment of Brain Perfusion MRI Scans in Dementia: A Pilot Study.

    PubMed

    Fällmar, David; Lilja, Johan; Velickaite, Vilma; Danfors, Torsten; Lubberink, Mark; Ahlgren, André; van Osch, Matthias J P; Kilander, Lena; Larsson, Elna-Marie

    2016-05-01

    Functional imaging is becoming increasingly important for the detection of neurodegenerative disorders. Perfusion MRI with arterial spin labeling (ASL) has been reported to provide promising diagnostic possibilities but is not yet widely used in routine clinical work. The aim of this study was to compare, in a clinical setting, the visual assessment of subtracted ASL CBF maps with and without additional smoothing, to FDG-PET data. Ten patients with a clinical diagnosis of dementia and 11 age-matched cognitively healthy controls were examined with pseudo-continuous ASL (pCASL) and 18F-Fluorodeoxyglucose positron emission tomography (FDG-PET). Three diagnostic physicians visually assessed the pCASL maps after subtraction only, and after postprocessing using Gaussian smoothing and GLM-based beta estimate functions. The assessment scores were compared to FDG PET values. Furthermore, the ability to discriminate patients from healthy elderly controls was assessed. Smoothing improved the correlation between visually assessed regional ASL perfusion scores and the FDG PET SUV-r values from the corresponding regions. However, subtracted pCASL maps discriminated patients from healthy controls better than smoothed maps. Smoothing increased the number of false-positive patient identifications. Application of beta estimate functions had only a marginal effect. Spatial smoothing of ASL images increased false positive results in the discrimination of hypoperfusion conditions from healthy elderly. It also decreased interreader agreement. However, regional characterization and subjective perception of image quality was improved. Copyright © 2015 by the American Society of Neuroimaging.

  10. SAMPEX/PET model of the low altitude trapped proton environment

    NASA Astrophysics Data System (ADS)

    Heynderickx, D.; Looper, M. D.; Blake, J. B.

    The low-altitude trapped proton population exhibits strong time variations related to geomagnetic secular variation and neutral atmosphere conditions. The flux measurements of the Proton Electron Telescope (PET) onboard the polar satellite SAMPEX constitute an adequate data set to distinguish different time scales and to characterise the respective variations. As a first step towards building a dynamic model of the low-altitude proton environment, we binned the 1995-1996 PET data into a model map with functional dependencies of the proton fluxes on the F10.7 solar radio flux and on the time of year, representing variations on the time scale of the solar cycle as well as seasonal variations. Now that a full solar cycle of SAMPEX/PET data is available, the preliminary model could be extended. The secular variation of the geomagnetic field is included in the model, as it is constructed using Kaufmann's K = I√B instead of McIlwain's L as a map coordinate.

  11. Quantitative Evaluation of Segmentation- and Atlas-Based Attenuation Correction for PET/MR on Pediatric Patients.

    PubMed

    Bezrukov, Ilja; Schmidt, Holger; Gatidis, Sergios; Mantlik, Frédéric; Schäfer, Jürgen F; Schwenzer, Nina; Pichler, Bernd J

    2015-07-01

    Pediatric imaging is regarded as a key application for combined PET/MR imaging systems. Because existing MR-based attenuation-correction methods were not designed specifically for pediatric patients, we assessed the impact of 2 potentially influential factors: inter- and intrapatient variability of attenuation coefficients and anatomic variability. Furthermore, we evaluated the quantification accuracy of 3 methods for MR-based attenuation correction without (SEGbase) and with bone prediction using an adult and a pediatric atlas (SEGwBONEad and SEGwBONEpe, respectively) on PET data of pediatric patients. The variability of attenuation coefficients between and within pediatric (5-17 y, n = 17) and adult (27-66 y, n = 16) patient collectives was assessed on volumes of interest (VOIs) in CT datasets for different tissue types. Anatomic variability was assessed on SEGwBONEad/pe attenuation maps by computing mean differences to CT-based attenuation maps for regions of bone tissue, lungs, and soft tissue. PET quantification was evaluated on VOIs with physiologic uptake and on 80% isocontour VOIs with elevated uptake in the thorax and abdomen/pelvis. Inter- and intrapatient variability of the bias was assessed for each VOI group and method. Statistically significant differences in mean VOI Hounsfield unit values and linear attenuation coefficients between adult and pediatric collectives were found in the lungs and femur. The prediction of attenuation maps using the pediatric atlas showed a reduced error in bone tissue and better delineation of bone structure. Evaluation of PET quantification accuracy showed statistically significant mean errors in mean standardized uptake values of -14% ± 5% and -23% ± 6% in bone marrow and femur-adjacent VOIs with physiologic uptake for SEGbase, which could be reduced to 0% ± 4% and -1% ± 5% using SEGwBONEpe attenuation maps. Bias in soft-tissue VOIs was less than 5% for all methods. 
Lung VOIs showed high SDs in the range of 15% for all methods. For VOIs with elevated uptake, mean and SD were less than 5% except in the thorax. The use of a dedicated atlas for the pediatric patient collective resulted in improved attenuation map prediction in osseous regions and reduced interpatient bias variation in femur-adjacent VOIs. For the lungs, in which intrapatient variation was higher for the pediatric collective, a patient- or group-specific attenuation coefficient might improve attenuation map accuracy. Mean errors of -14% and -23% in bone marrow and femur-adjacent VOIs can affect PET quantification in these regions when bone tissue is ignored. © 2015 by the Society of Nuclear Medicine and Molecular Imaging, Inc.

  12. St. Louis area earthquake hazards mapping project; seismic and liquefaction hazard maps

    USGS Publications Warehouse

    Cramer, Chris H.; Bauer, Robert A.; Chung, Jae-won; Rogers, David; Pierce, Larry; Voigt, Vicki; Mitchell, Brad; Gaunt, David; Williams, Robert; Hoffman, David; Hempen, Gregory L.; Steckel, Phyllis; Boyd, Oliver; Watkins, Connor M.; Tucker, Kathleen; McCallister, Natasha

    2016-01-01

We present probabilistic and deterministic seismic and liquefaction hazard maps for the densely populated St. Louis metropolitan area that account for the expected effects of surficial geology on earthquake ground shaking. Hazard calculations were based on a map grid of 0.005°, or about every 500 m, and are thus higher in resolution than any earlier studies. To estimate ground motions at the surface of the model (e.g., site amplification), we used a new detailed near‐surface shear‐wave velocity model in a 1D equivalent‐linear response analysis. When compared with the 2014 U.S. Geological Survey (USGS) National Seismic Hazard Model, which uses a uniform firm‐rock‐site condition, the new probabilistic seismic‐hazard estimates document much more variability. Hazard levels for upland sites (consisting of bedrock and weathered bedrock overlain by loess‐covered till and drift deposits) show up to twice the ground‐motion values for peak ground acceleration (PGA), and similar ground‐motion values for 1.0 s spectral acceleration (SA). Probabilistic ground‐motion levels for lowland alluvial floodplain sites (generally the 20–40‐m‐thick modern Mississippi and Missouri River floodplain deposits overlying bedrock) exhibit up to twice the ground‐motion levels for PGA, and up to three times the ground‐motion levels for 1.0 s SA. Liquefaction probability curves were developed from available standard penetration test data assuming typical lowland and upland water table levels. A simplified liquefaction hazard map was created from the 5%‐in‐50‐year probabilistic ground‐shaking model. The liquefaction hazard ranges from low in the uplands to high (more than 60% of the area expected to liquefy) in the lowlands. Because many transportation routes, power and gas transmission lines, and population centers exist in or on the highly susceptible lowland alluvium, these areas in the St. Louis region are at significant potential risk from seismically induced liquefaction and associated ground deformation.
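As a quick sanity check on the quoted grid resolution, 0.005° of latitude on a spherical Earth works out to roughly 556 m, consistent with the abstract's "about every 500 m" (a minimal sketch; the function name and spherical-Earth approximation are illustrative assumptions, not from the study):

```python
import math

def grid_spacing_m(delta_deg: float, earth_radius_m: float = 6_371_000.0) -> float:
    """Ground distance subtended by delta_deg of latitude on a spherical Earth."""
    return earth_radius_m * math.radians(delta_deg)

# 0.005 degrees of latitude is roughly 556 m, i.e. "about every 500 m".
spacing = grid_spacing_m(0.005)
```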

  13. Direct Evaluation of MR-Derived Attenuation Correction Maps for PET/MR of the Mouse Myocardium

    NASA Astrophysics Data System (ADS)

    Evans, Eleanor; Buonincontri, Guido; Hawkes, Rob C.; Ansorge, Richard E.; Carpenter, T. Adrian; Sawiak, Stephen J.

    2016-02-01

    Attenuation correction (AC) must be applied to provide accurate measurements of PET tracer activity concentrations. Due to the limited space available in PET/MR scanners, MR-derived AC (MRAC) is used as a substitute for transmission source scanning. In preclinical PET/MR, there has been limited exploration of MRAC, as the magnitude of AC in murine imaging is much smaller than that required in clinical scans. We investigated whether a simple 2-class (air and tissue) segmentation-based MRAC approach could provide adequate AC for mouse PET imaging. To construct the default MRAC μ maps, MR images were thresholded and segmented using ASIPRO software (Siemens Molecular Imaging), which defined the mouse body region as tissue with a uniform linear attenuation coefficient (μ) of 0.095 cm⁻¹, and the background and lungs as air, with a μ value of 0 cm⁻¹. To correct for the misassignment of the lungs as air, two further MRAC μ maps were tested: 1) the MRAC (tissue) approach, which changed the lung region designation from air to tissue (μ = 0.095 cm⁻¹), and 2) the MRAC (lung) approach, which treated the lungs as an additional tissue class with a μ value of 0.032 cm⁻¹. All μ maps were then forward projected to create attenuation sinograms for image reconstruction. Standardized uptake value (SUV) maps of the myocardium were derived for 10 mice with and without AC applied using gold-standard transmission scans (TXAC), the 3 MRAC methods, and PET emission scans (EmAC). All AC methods produced myocardial SUVs significantly different from those produced without AC when compared across the mouse group. Similar SUVs were derived with all AC methods, with the best agreement to TXAC achieved using the MRAC (tissue) method, giving a mean difference of 0.9±2.4% in myocardial SUV when compared across all mice. SUV differences of up to 40%, however, were seen in areas adjacent to the RF coil in images produced using all AC methods, except for TXAC. 
A 2-class MRAC approach can therefore provide acceptable AC for myocardial imaging in mice, although additional CT templates of coils and animal beds would be recommended to further improve image quantification.
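The segmentation logic behind the three MRAC variants can be sketched in a few lines of NumPy (illustrative only: the masks here are toy inputs, whereas the study derived them by thresholding MR images in ASIPRO, and `mrac_mu_map` is a hypothetical helper name):

```python
import numpy as np

# Linear attenuation coefficients (cm^-1) from the study's MRAC variants.
MU_TISSUE = 0.095
MU_LUNG = 0.032
MU_AIR = 0.0

def mrac_mu_map(mr_image, body_mask, lung_mask, lung_mode="air"):
    """Build a segmentation-based mu-map.

    lung_mode: "air"    -> default MRAC (lungs treated as air)
               "tissue" -> MRAC (tissue): lungs assigned the soft-tissue mu
               "lung"   -> MRAC (lung): lungs given their own tissue class
    """
    mu = np.full(mr_image.shape, MU_AIR)
    mu[body_mask] = MU_TISSUE
    lung_mu = {"air": MU_AIR, "tissue": MU_TISSUE, "lung": MU_LUNG}[lung_mode]
    mu[lung_mask] = lung_mu
    return mu
```

In the study the resulting μ maps were then forward projected into attenuation sinograms for reconstruction; that step is scanner-specific and omitted here.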

  14. WE-H-207A-02: Attenuation Correction in 4D-PET Using a Single-Phase Attenuation Map

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kalantari, F; Wang, J

    2016-06-15

    Purpose: 4D-PET imaging has been proposed as a potential solution to the respiratory motion effect in the thoracic region. CT-based attenuation correction (AC) is an essential step toward quantitative PET imaging. However, due to the temporal difference between 4D-PET and a single breath-hold CT, motion artifacts are observed in the attenuation-corrected PET images that can lead to errors in tumor shape and uptake. We introduce a practical method for aligning single-phase CT to all other 4D-PET phases using a penalized non-rigid demons registration. Methods: Individual 4D-PET frames were reconstructed without AC. Non-rigid demons registration was used to derive deformation vector fields (DVFs) between the PET frame matched to the CT phase and the other 4D-PET images. While uncorrected PET images provide enough useful data for organ borders such as the lung and liver, tumors are not distinguishable from background due to loss of contrast. To preserve tumor shape across phases, an ROI covering the tumor in the CT image was excluded from the non-rigid transformation, and the mean DVF of the central region of the tumor was assigned to all voxels in the ROI. This mimics a rigid transformation of the tumor alongside a non-rigid transformation of the other organs. A 4D XCAT phantom with spherical lung tumors 10-40 mm in diameter was used to evaluate the algorithm. Results: Motion-related artifacts in attenuation-corrected 4D-PET images were significantly reduced. For tumors smaller than 20 mm, the non-rigid transformation alone provided quantitative results; for larger tumors, where tumor self-attenuation is considerable, our combined method yielded superior results. Conclusion: We introduced a practical method for deforming a single CT to match all 4D-PET images for accurate AC. Although 4D-PET data contain little anatomical information, we showed that they are still useful for estimating DVFs to align the attenuation map for accurate AC.
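The hybrid transformation described above (rigid tumor motion embedded in a non-rigid field) can be sketched as follows, assuming the demons-derived DVF and the ROI masks are already available; `hybrid_dvf` is a hypothetical helper name and the demons registration itself is not implemented:

```python
import numpy as np

def hybrid_dvf(dvf, tumor_roi, core_roi):
    """Replace the DVF inside the tumor ROI with the mean DVF of its core.

    dvf:       (H, W, 2) deformation vector field from non-rigid registration
    tumor_roi: (H, W) bool mask excluded from the non-rigid transform
    core_roi:  (H, W) bool mask of the tumor's central region
    The mean vector over the core is assigned to every ROI voxel, which
    mimics a rigid (translation-only) motion of the tumor.
    """
    out = dvf.copy()
    mean_vec = dvf[core_roi].mean(axis=0)
    out[tumor_roi] = mean_vec
    return out
```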

  15. Revision of Time-Independent Probabilistic Seismic Hazard Maps for Alaska

    USGS Publications Warehouse

    Wesson, Robert L.; Boyd, Oliver S.; Mueller, Charles S.; Bufe, Charles G.; Frankel, Arthur D.; Petersen, Mark D.

    2007-01-01

    We present here time-independent probabilistic seismic hazard maps of Alaska and the Aleutians for peak ground acceleration (PGA) and 0.1, 0.2, 0.3, 0.5, 1.0 and 2.0 second spectral acceleration at probability levels of 2 percent in 50 years (annual probability of 0.000404), 5 percent in 50 years (annual probability of 0.001026) and 10 percent in 50 years (annual probability of 0.0021). These maps represent a revision of existing maps based on newly obtained data and assumptions reflecting best current judgments about methodology and approach. These maps have been prepared following the procedures and assumptions made in the preparation of the 2002 National Seismic Hazard Maps for the lower 48 States. A significant improvement relative to the 2002 methodology is the ability to include variable slip rate along a fault where appropriate. These maps incorporate new data, the responses to comments received at workshops held in Fairbanks and Anchorage, Alaska, in May, 2005, and comments received after draft maps were posted on the National Seismic Hazard Mapping Web Site. These maps will be proposed for adoption in future revisions to the International Building Code. In this documentation we describe the maps and in particular explain and justify changes that have been made relative to the 1999 maps. We are also preparing a series of experimental maps of time-dependent hazard that will be described in future documents.
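The annual probabilities quoted above follow from a Poisson (time-independent) hazard model: if p is the probability of at least one exceedance in T years, the annual rate is -ln(1-p)/T. A minimal sketch that reproduces the abstract's numbers:

```python
import math

def annual_probability(p_exceed: float, years: float = 50.0) -> float:
    """Annual exceedance probability for a Poisson process, given the
    probability of at least one exceedance in `years` years."""
    return -math.log(1.0 - p_exceed) / years

# Reproduces the values quoted in the abstract:
#   2% in 50 y -> ~0.000404, 5% -> ~0.001026, 10% -> ~0.0021
```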

  16. Modelling and simulation of [18F]fluoromisonidazole dynamics based on histology-derived microvessel maps

    NASA Astrophysics Data System (ADS)

    Mönnich, David; Troost, Esther G. C.; Kaanders, Johannes H. A. M.; Oyen, Wim J. G.; Alber, Markus; Thorwarth, Daniela

    2011-04-01

    Hypoxia can be assessed non-invasively by positron emission tomography (PET) using radiotracers such as [18F]fluoromisonidazole (Fmiso) accumulating in poorly oxygenated cells. Typical features of dynamic Fmiso PET data are high signal variability in the first hour after tracer administration and slow formation of a consistent contrast. The purpose of this study is to investigate whether these characteristics can be explained by the current conception of the underlying microscopic processes and to identify fundamental effects. This is achieved by modelling and simulating tissue oxygenation and tracer dynamics on the microscopic scale. In simulations, vessel structures on histology-derived maps act as sources and sinks for oxygen as well as tracer molecules. Molecular distributions in the extravascular space are determined by reaction-diffusion equations, which are solved numerically using a two-dimensional finite element method. Simulated Fmiso time activity curves (TACs), though not directly comparable to PET TACs, reproduce major characteristics of clinical curves, indicating that the microscopic model and the parameter values are adequate. Evidence for dependence of the early PET signal on the vascular fraction is found. Further, possible effects leading to late contrast formation and potential implications on the quantification of Fmiso PET data are discussed.
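The reaction-diffusion dynamics can be illustrated with a simple explicit finite-difference step (a stand-in for the paper's two-dimensional finite element solver; the linear uptake term, periodic boundaries via `np.roll`, and the parameter names are simplifying assumptions):

```python
import numpy as np

def diffuse_react_step(c, vessel_mask, c_vessel, D, k, dx, dt):
    """One explicit finite-difference step of a reaction-diffusion model
    dc/dt = D * laplacian(c) - k*c, with vessel voxels clamped so that
    the vessel structures act as sources/sinks."""
    lap = (np.roll(c, 1, 0) + np.roll(c, -1, 0) +
           np.roll(c, 1, 1) + np.roll(c, -1, 1) - 4.0 * c) / dx**2
    c_new = c + dt * (D * lap - k * c)
    c_new[vessel_mask] = c_vessel  # clamp vessels to the intravascular value
    return c_new
```

For stability of this explicit scheme, dt should satisfy dt ≤ dx²/(4D).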

  17. Seismic Hazard Maps for Seattle, Washington, Incorporating 3D Sedimentary Basin Effects, Nonlinear Site Response, and Rupture Directivity

    USGS Publications Warehouse

    Frankel, Arthur D.; Stephenson, William J.; Carver, David L.; Williams, Robert A.; Odum, Jack K.; Rhea, Susan

    2007-01-01

    This report presents probabilistic seismic hazard maps for Seattle, Washington, based on over 500 3D simulations of ground motions from scenario earthquakes. These maps include 3D sedimentary basin effects and rupture directivity. Nonlinear site response for soft-soil sites of fill and alluvium was also applied in the maps. The report describes the methodology for incorporating source and site dependent amplification factors into a probabilistic seismic hazard calculation. 3D simulations were conducted for the various earthquake sources that can affect Seattle: Seattle fault zone, Cascadia subduction zone, South Whidbey Island fault, and background shallow and deep earthquakes. The maps presented in this document used essentially the same set of faults and distributed-earthquake sources as in the 2002 national seismic hazard maps. The 3D velocity model utilized in the simulations was validated by modeling the amplitudes and waveforms of observed seismograms from five earthquakes in the region, including the 2001 M6.8 Nisqually earthquake. The probabilistic seismic hazard maps presented here depict 1 Hz response spectral accelerations with 10%, 5%, and 2% probabilities of exceedance in 50 years. The maps are based on determinations of seismic hazard for 7236 sites with a spacing of 280 m. The maps show that the most hazardous locations for this frequency band (around 1 Hz) are soft-soil sites (fill and alluvium) within the Seattle basin and along the inferred trace of the frontal fault of the Seattle fault zone. The next highest hazard is typically found for soft-soil sites in the Duwamish Valley south of the Seattle basin. In general, stiff-soil sites in the Seattle basin exhibit higher hazard than stiff-soil sites outside the basin. Sites with shallow bedrock outside the Seattle basin have the lowest estimated hazard for this frequency band.

  18. Kinetic modeling in PET imaging of hypoxia

    PubMed Central

    Li, Fan; Joergensen, Jesper T; Hansen, Anders E; Kjaer, Andreas

    2014-01-01

    Tumor hypoxia is associated with increased therapeutic resistance leading to poor treatment outcome. Therefore the ability to detect and quantify intratumoral oxygenation could play an important role in future individual personalized treatment strategies. Positron Emission Tomography (PET) can be used for non-invasive mapping of tissue oxygenation in vivo and several hypoxia specific PET tracers have been developed. Evaluation of PET data in the clinic is commonly based on visual assessment together with semiquantitative measurements e.g. standard uptake value (SUV). However, dynamic PET contains additional valuable information on the temporal changes in tracer distribution. Kinetic modeling can be used to extract relevant pharmacokinetic parameters of tracer behavior in vivo that reflects relevant physiological processes. In this paper, we review the potential contribution of kinetic analysis for PET imaging of hypoxia. PMID:25250200
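As a minimal illustration of kinetic modeling, a one-tissue compartment model predicts the tissue time-activity curve as the plasma input convolved with an exponential impulse response, C_T(t) = K1·e^(−k2·t) ⊗ C_p(t). This is a generic textbook sketch, not one of the specific hypoxia-tracer models reviewed in the paper:

```python
import numpy as np

def one_tissue_model(t, c_plasma, K1, k2):
    """Tissue time-activity curve for a one-tissue compartment model:
    C_T(t) = K1 * exp(-k2*t) convolved with the plasma input C_p(t)."""
    dt = t[1] - t[0]                 # assumes a uniform time grid
    irf = K1 * np.exp(-k2 * t)       # impulse response function
    return np.convolve(c_plasma, irf)[:len(t)] * dt
```

Fitting K1 and k2 to a measured dynamic PET curve (e.g. by least squares) is what yields the pharmacokinetic parameters discussed above.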

  19. Multi-atlas based segmentation using probabilistic label fusion with adaptive weighting of image similarity measures.

    PubMed

    Sjöberg, C; Ahnesjö, A

    2013-06-01

    Label fusion multi-atlas approaches for image segmentation can give better segmentation results than single atlas methods. We present a multi-atlas label fusion strategy based on probabilistic weighting of distance maps. Relationships between image similarities and segmentation similarities are estimated in a learning phase and used to derive fusion weights that are proportional to the probability for each atlas to improve the segmentation result. The method was tested using a leave-one-out strategy on a database of 21 pre-segmented prostate patients for different image registrations combined with different image similarity scorings. The probabilistic weighting yields results that are equal or better compared to both fusion with equal weights and results using the STAPLE algorithm. Results from the experiments demonstrate that label fusion by weighted distance maps is feasible, and that probabilistic weighted fusion improves segmentation quality more the stronger the individual atlas segmentation quality depends on the corresponding registered image similarity. The regions used for evaluation of the image similarity measures were found to be more important than the choice of similarity measure. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
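The fusion step itself (weighted averaging of signed distance maps, thresholded at zero) can be sketched as below; deriving the probabilistic weights from the learning phase is omitted, and `fuse_labels` is a hypothetical name:

```python
import numpy as np

def fuse_labels(distance_maps, weights):
    """Fuse atlas segmentations given per-atlas signed distance maps
    (negative inside the structure) and fusion weights.

    distance_maps: (n_atlases, H, W) signed distance maps
    weights:       (n_atlases,) non-negative fusion weights
    Returns the fused binary segmentation (True inside the structure).
    """
    maps = np.asarray(distance_maps, dtype=float)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                              # normalise the weights
    fused = np.tensordot(w, maps, axes=1)        # weighted mean distance map
    return fused < 0.0
```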

  20. Dose mapping: validation in 4D dosimetry with measurements and application in radiotherapy follow-up evaluation.

    PubMed

    Zhang, Geoffrey G; Huang, Tzung-Chi; Forster, Ken M; Lin, Kang-Ping; Stevens, Craig; Harris, Eleanor; Guerrero, Thomas

    2008-04-01

    The purpose of this paper is to validate a dose mapping program based on an optical flow method (OFM) and to demonstrate its application in radiotherapy follow-up evaluation. For validation, the deformation matrices between four-dimensional (4D) CT data of different simulated respiration phases of a phantom were calculated using the OFM. The matrices were then used to map the doses of all phases onto a single-phase image, which were summed with equal time weighting. The calculated dose should closely represent the dose delivered to the moving phantom if the deformation matrices are accurate. The measured point doses agreed with the OFM calculations to better than 2% at isocenters, and dose distributions agreed to better than 1 mm for the 50% isodose line. To demonstrate proof of concept for the use of deformable image registration in dose mapping for treatment evaluation, the treatment-planning CT was registered with the post-treatment CT image from the positron emission tomography (PET)/CT, yielding a deformation matrix. The dose distribution from the treatment plan was then mapped onto the restaging PET/CT using this matrix. Two cases of patients with thoracic malignancies are presented. Each patient had CT-based treatment planning for radiotherapy and restaging fluorodeoxyglucose (FDG)-PET/CT imaging 4-6 weeks after completion of treatment. Areas of pneumonitis and recurrence were identified radiographically on both PET and CT restaging images. Local dose and standardized uptake values for pneumonitis and recurrence were studied as a demonstration of this method. By comparing the deformably mapped dose to measurement, the treatment evaluation method introduced in this manuscript proved accurate. It thus provides a more accurate analysis than rigid or linear dose-image registration when studying treatment outcome versus dose.
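The 4D dose-accumulation step (map each phase dose onto a reference phase with its deformation field, then sum with equal time weighting) can be sketched as follows; nearest-neighbour lookup stands in for the OFM-based interpolation, and the function names are illustrative:

```python
import numpy as np

def warp_dose(dose, dvf):
    """Map a phase dose onto the reference grid by pulling values along
    the DVF (nearest-neighbour lookup; real OFM code would interpolate)."""
    h, w = dose.shape
    yy, xx = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    ys = np.clip(np.rint(yy + dvf[..., 0]).astype(int), 0, h - 1)
    xs = np.clip(np.rint(xx + dvf[..., 1]).astype(int), 0, w - 1)
    return dose[ys, xs]

def accumulate_4d_dose(phase_doses, phase_dvfs):
    """Equal-time-weighted average of phase doses mapped to one reference phase."""
    warped = [warp_dose(d, v) for d, v in zip(phase_doses, phase_dvfs)]
    return np.mean(warped, axis=0)
```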

  1. Assessing spatial associations between thermal stress and mortality in Hong Kong: a small-area ecological study.

    PubMed

    Thach, Thuan-Quoc; Zheng, Qishi; Lai, Poh-Chin; Wong, Paulina Pui-Yun; Chau, Patsy Yuen-Kwan; Jahn, Heiko J; Plass, Dietrich; Katzschner, Lutz; Kraemer, Alexander; Wong, Chit-Ming

    2015-01-01

    Physiological equivalent temperature (PET) is a widely used index to assess thermal comfort of the human body. Evidence on how thermal stress-related health effects vary with small geographical areas is limited. The objectives of this study are (i) to explore whether there were significant patterns of geographical clustering of thermal stress as measured by PET and mortality and (ii) to assess the association between PET and mortality in small geographical areas. A small area ecological cross-sectional study was conducted at tertiary planning units (TPUs) level. Age-standardized mortality rates (ASMR) and monthly deaths at TPUs level for 2006 were calculated for cause-specific diseases. A PET map with 100 m × 100 m resolution for the same period was derived from Hong Kong Urban Climatic Analysis Map data and the annual and monthly averages of PET for each TPU were computed. Global Moran's I and local indicator of spatial association (LISA) analyses were performed. A generalized linear mixed model was used to model monthly deaths against PET adjusted for socio-economic deprivation. We found positive spatial autocorrelation between PET and ASMR. There were spatial correlations between PET and ASMR, particularly in the north of Hong Kong Island, most parts of Kowloon, and across New Territories. A 1°C change in PET was associated with an excess risk (%) of 2.99 (95% CI: 0.50-5.48) for all natural causes, 4.75 (1.14-8.36) for cardiovascular, 7.39 (4.64-10.10) for respiratory diseases in the cool season, and 4.31 (0.12 to 8.50) for cardiovascular diseases in the warm season. Variations between TPUs in PET had an important influence on cause-specific mortality, especially in the cool season. PET may have an impact on the health of socio-economically deprived population groups. Our results suggest that targeting policy interventions at high-risk areas may be a feasible option for reducing PET-related mortality. Copyright © 2014 Elsevier B.V. All rights reserved.
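For readers relating such figures back to model output: in a log-linear (Poisson-family) model, a coefficient β per 1°C of PET converts to an excess risk of (e^β − 1)×100%. A minimal sketch (generic, not the study's GLMM code):

```python
import math

def excess_risk_percent(beta: float) -> float:
    """Excess risk (%) per unit exposure from a log-linear model coefficient."""
    return (math.exp(beta) - 1.0) * 100.0

def beta_from_excess_risk(er_percent: float) -> float:
    """Inverse: recover the model coefficient from a reported excess risk (%)."""
    return math.log(1.0 + er_percent / 100.0)
```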

  2. Neo-Deterministic and Probabilistic Seismic Hazard Assessments: a Comparative Analysis

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Magrin, Andrea; Nekrasova, Anastasia; Kossobokov, Vladimir; Panza, Giuliano F.

    2016-04-01

    Objective testing is the key issue for any reliable seismic hazard assessment (SHA). Earthquake hazard maps must demonstrate their capability to anticipate ground shaking from future strong earthquakes before they can be appropriately used for purposes such as engineering design, insurance, and emergency management. Quantitative assessment of map performance is also an essential step in the scientific process of their revision and improvement. Cross-checking probabilistic models against available observations and independent physics-based models is recognized as a major validation procedure. The existing maps from classical probabilistic seismic hazard analysis (PSHA), as well as those from neo-deterministic analysis (NDSHA), which have already been developed for several regions worldwide (including Italy, India and North Africa), are considered to exemplify how cross-comparative analysis can reveal the limits and advantages of the different methods. Where the data permit, a comparative analysis against documented seismic activity is carried out, showing how observations of past earthquakes can help assess the performance of the different methods. Neo-deterministic refers to a scenario-based approach that considers a wide range of possible earthquake sources as the starting point for scenarios constructed via full-waveform modeling. The method does not use empirical attenuation models (i.e., Ground Motion Prediction Equations, GMPEs) and naturally supplies realistic time series of ground shaking (i.e., complete synthetic seismograms), readily applicable to complete engineering analysis and other mitigation actions. The standard NDSHA maps provide reliable envelope estimates of maximum seismic ground motion from a wide set of possible scenario earthquakes, including the largest deterministically or historically defined credible earthquake. 
In addition, the flexibility of NDSHA allows for generation of ground shaking maps at specified long-term return times, which may permit a straightforward comparison between NDSHA and PSHA maps in terms of average rates of exceedance for specified time windows. The comparison of NDSHA and PSHA maps, particularly for very long recurrence times, may indicate to what extent probabilistic ground shaking estimates are consistent with those from physical models of seismic wave propagation. A systematic comparison over the territory of Italy is carried out exploiting the uniqueness of the Italian earthquake catalogue, a data set covering more than a millennium (a time interval about ten times longer than that available in most regions worldwide) with a satisfactory completeness level for M>5, which supports the reliability of the analysis. By analysing seismicity in the Vrancea region in some detail, we show that well-constrained macroseismic field information for individual earthquakes may provide useful information about the reliability of ground shaking estimates. Finally, in order to generalise the observations, the comparative analysis is extended to further regions where both standard NDSHA and PSHA maps are available (e.g. the State of Gujarat, India). The final Global Seismic Hazard Assessment Program (GSHAP) results and the most recent version of the Seismic Hazard Harmonization in Europe (SHARE) project maps, along with other national-scale probabilistic maps, all obtained by PSHA, are considered for this comparative analysis.

  3. Optimized statistical parametric mapping for partial-volume-corrected amyloid positron emission tomography in patients with Alzheimer's disease and Lewy body dementia

    NASA Astrophysics Data System (ADS)

    Oh, Jungsu S.; Kim, Jae Seung; Chae, Sun Young; Oh, Minyoung; Oh, Seung Jun; Cha, Seung Nam; Chang, Ho-Jong; Lee, Chong Sik; Lee, Jae Hong

    2017-03-01

    We present an optimized voxelwise statistical parametric mapping (SPM) of partial-volume (PV)-corrected positron emission tomography (PET) of 11C Pittsburgh Compound B (PiB), incorporating the anatomical precision of magnetic resonance imaging (MRI) and the amyloid-β (Aβ) burden-specificity of PiB PET. First, we applied region-based partial-volume correction (PVC), termed the geometric transfer matrix (GTM) method, to PiB PET, creating MRI-based lobar parcels filled with mean PiB uptakes. Then, we conducted a voxelwise PVC by multiplying the original PET by the ratio of a GTM-based PV-corrected PET to a 6-mm-smoothed PV-corrected PET. Finally, we conducted spatial normalizations of the PV-corrected PETs onto the study-specific template. As such, we increased the accuracy of the SPM normalization and the tissue specificity of the SPM results. Moreover, lobar smoothing (instead of whole-brain smoothing) was applied to increase the signal-to-noise ratio in the image without degrading the tissue specificity. Thereby, we could optimize a voxelwise group comparison between subjects with high and normal Aβ burdens (from 10 patients with Alzheimer's disease, 30 patients with Lewy body dementia, and 9 normal controls). Our SPM framework outperformed the conventional one in terms of the accuracy of the spatial normalization (85% of maximum likelihood tissue classification volume) and the tissue specificity (larger gray matter and smaller cerebrospinal fluid volume fractions in the SPM results). Our SPM framework optimized the SPM of a PV-corrected Aβ PET in terms of anatomical precision, normalization accuracy, and tissue specificity, resulting in better detection and localization of Aβ burdens in patients with Alzheimer's disease and Lewy body dementia.
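The voxelwise correction step described above (multiplying the original PET by the ratio of the GTM-based synthetic image to its smoothed version, akin to the region-based voxelwise method) can be sketched as below; the smoothing callable stands in for the 6-mm Gaussian PSF model, and `voxelwise_pvc` is a hypothetical name:

```python
import numpy as np

def voxelwise_pvc(pet, synthetic, smooth):
    """Voxelwise partial-volume correction step: multiply the original PET
    by the ratio of the GTM-based synthetic image to its PSF-smoothed
    version (the study smoothed with a 6-mm Gaussian).

    `smooth` is a callable modelling the scanner PSF."""
    smoothed = smooth(synthetic)
    ratio = np.ones_like(synthetic, dtype=float)
    # Avoid division by zero outside the head: leave the ratio at 1 there.
    np.divide(synthetic, smoothed, out=ratio, where=smoothed > 0)
    return pet * ratio
```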

  4. 18F-FDG PET radiomics approaches: comparing and clustering features in cervical cancer.

    PubMed

    Tsujikawa, Tetsuya; Rahman, Tasmiah; Yamamoto, Makoto; Yamada, Shizuka; Tsuyoshi, Hideaki; Kiyono, Yasushi; Kimura, Hirohiko; Yoshida, Yoshio; Okazawa, Hidehiko

    2017-11-01

    The aims of our study were to find the textural features on 18F-FDG PET/CT which reflect the different histological architectures between cervical cancer subtypes and to make a visual assessment of the associations among 18F-FDG PET textural features in cervical cancer. Eighty-three cervical cancer patients [62 squamous cell carcinomas (SCCs) and 21 non-SCCs (NSCCs)] who had undergone pretreatment 18F-FDG PET/CT were enrolled. A texture analysis was performed on PET/CT images, from which 18 PET radiomics features were extracted, including first-order features such as standardized uptake value (SUV), metabolic tumor volume (MTV), and total lesion glycolysis (TLG), and second- and high-order textural features using the SUV histogram, normalized gray-level co-occurrence matrix (NGLCM), and neighborhood gray-tone difference matrix, respectively. These features were compared between SCC and NSCC using a Bonferroni-adjusted P value threshold of 0.0028 (0.05/18). To assess the associations between PET features, a heat map analysis with hierarchical clustering, one of the radiomics approaches, was performed. Among the 18 PET features, correlation, a second-order textural feature derived from the NGLCM, was a stable parameter and the only feature that showed a robust trend toward a significant difference between SCC and NSCC. Cervical SCC showed a higher correlation (0.70 ± 0.07) than NSCC (0.64 ± 0.07, P = 0.0030). The other PET features did not show any significant differences between SCC and NSCC. A higher correlation in SCC might reflect higher structural integrity and a stronger spatial/linear relationship of cancer cells compared with NSCC. A heat map with a PET feature dendrogram clearly showed 5 distinct clusters, where correlation belonged to a cluster including MTV and TLG; however, the association between correlation and MTV/TLG was not strong. Correlation was a relatively independent PET feature in cervical cancer. 
18F-FDG PET textural features might reflect the differences in histological architecture between cervical cancer subtypes. PET radiomics approaches reveal the associations between PET features and will be useful for finding a single feature or a combination of features leading to precise diagnoses, potential prognostic models, and effective therapeutic strategies.
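The correlation feature highlighted above comes from the normalized gray-level co-occurrence matrix; a self-contained NumPy sketch for a single horizontal offset (real radiomics analyses average several distances and angles, and quantize SUVs into more levels):

```python
import numpy as np

def glcm_correlation(img, levels):
    """GLCM correlation feature for a horizontal offset of one pixel
    (symmetric, normalized co-occurrence matrix). `img` holds integer
    gray levels in [0, levels)."""
    a, b = img[:, :-1].ravel(), img[:, 1:].ravel()
    glcm = np.zeros((levels, levels))
    np.add.at(glcm, (a, b), 1.0)
    glcm = glcm + glcm.T                 # make the matrix symmetric
    p = glcm / glcm.sum()                # normalize to joint probabilities
    i = np.arange(levels)
    pi, pj = p.sum(axis=1), p.sum(axis=0)
    mi, mj = (i * pi).sum(), (i * pj).sum()
    si = np.sqrt(((i - mi) ** 2 * pi).sum())
    sj = np.sqrt(((i - mj) ** 2 * pj).sum())
    return (np.outer(i - mi, i - mj) * p).sum() / (si * sj)
```

Values near +1 indicate linearly related neighbouring gray levels (the "structural integrity" interpretation in the abstract); constant images have undefined correlation (zero variance).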

  5. Ground mapping resolution accuracy of a scanning radiometer from a geostationary satellite.

    PubMed

    Stremler, F G; Khalil, M A; Parent, R J

    1977-06-01

    Measures of the spatial and spatial rate (frequency) mapping of scanned visual imagery from an earth reference system to a spin-scan geostationary satellite are examined. Mapping distortions and coordinate inversions to correct for these distortions are formulated in terms of geometric transformations between earth and satellite frames of reference. Probabilistic methods are used to develop relations for obtainable mapping resolution when coordinate inversions are employed.

  6. H2(15)O or 13NH3 PET and electromagnetic tomography (LORETA) during partial status epilepticus.

    PubMed

    Zumsteg, D; Wennberg, R A; Treyer, V; Buck, A; Wieser, H G

    2005-11-22

    The authors evaluated the feasibility and source localization utility of H2(15)O or 13NH3 PET and low-resolution electromagnetic tomography (LORETA) in three patients with partial status epilepticus (SE). Results were correlated with findings from intraoperative electrocorticographic recordings and surgical outcomes. PET studies of cerebral blood flow and noninvasive source modeling with LORETA using statistical nonparametric mapping provided useful information for localizing the ictal activity in patients with partial SE.

  7. Brain Mapping of Language and Auditory Perception in High-Functioning Autistic Adults: A PET Study.

    ERIC Educational Resources Information Center

    Muller, R-A.; Behen, M. E.; Rothermel, R. D.; Chugani, D. C.; Muzik, O.; Mangner, T. J.; Chugani, H. T.

    1999-01-01

    A study used positron emission tomography (PET) to study patterns of brain activation during auditory processing in five high-functioning adults with autism. Results found that participants showed reversed hemispheric dominance during the verbal auditory stimulation and reduced activation of the auditory cortex and cerebellum. (CR)

  8. Direct mapping of 19F in 19FDG-6P in brain tissue at subcellular resolution using soft X-ray fluorescence

    NASA Astrophysics Data System (ADS)

    Poitry-Yamate, C.; Gianoncelli, A.; Kourousias, G.; Kaulich, B.; Lepore, M.; Gruetter, R.; Kiskinova, M.

    2013-10-01

    Low energy x-ray fluorescence (LEXRF) detection was optimized for imaging cerebral glucose metabolism by mapping the fluorine LEXRF signal of 19F in 19FDG, trapped as intracellular 19F-deoxyglucose-6-phosphate (19FDG-6P), at 1 μm spatial resolution from 3 μm-thick brain slices. 19FDG metabolism was evaluated in brain structures closely resembling the general cerebral cytoarchitecture following formalin fixation of brain slices and their inclusion in an epon matrix. 2-dimensional distribution maps of 19FDG-6P were placed in a cytoarchitectural and morphological context by simultaneous LEXRF mapping of N and O, and scanning transmission x-ray (STXM) imaging. A disproportionately high uptake and metabolism of glucose was found in neuropil relative to intracellular domains of the cell body of hypothalamic neurons, showing directly that neurons, like glial cells, also metabolize glucose. As 19F-deoxyglucose-6P is structurally identical to 18F-deoxyglucose-6P, LEXRF of subcellular 19F provides a link to in vivo 18FDG PET, forming a novel basis for understanding the physiological mechanisms underlying the 18FDG PET image, and the contribution of neurons and glia to the PET signal.

  9. SU-F-J-224: Impact of 4D PET/CT On PERCIST Classification of Lung and Liver Metastases in NSCLC and Colorectal Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meier, J; Lopez, B; Mawlawi, O

    2016-06-15

    Purpose: To quantify the impact of 4D PET/CT on PERCIST metrics in lung and liver tumors in NSCLC and colorectal cancer patients. Methods: 32 patients presenting lung or liver tumors of 1–3 cm size affected by respiratory motion were scanned on a GE Discovery 690 PET/CT. The bed position with lesion(s) affected by motion was acquired in a 12-minute PET list mode and unlisted into 8 bins with respiratory gating. Three different CT maps were used for attenuation correction: a clinical helical CT (CT-clin), an average CT (CT-ave), and an 8-phase 4D CINE CT (CT-cine). All reconstructions were 3D OSEM, 2 iterations, 24 subsets, 6.4 mm Gaussian filtration, 192×192 matrix, non-TOF, and non-PSF. Reconstructions using CT-clin and CT-ave used only 3 out of the 12 minutes of the data (clinical protocol); all 12 minutes were used for the CT-cine reconstruction. The percent change of SUVbw-peak and SUVbw-max was calculated between PET-CTclin and PET-CTave. The same percent change was also calculated between PET-CTclin and PET-CTcine in each of the 8 bins and in the average of all bins. A 30% difference from PET-CTclin classified lesions as progressive metabolic disease (PMD), using the maximum bin value and the average of the eight bin values. Results: 30 lesions in 25 patients were evaluated. Using the bin with maximum SUVbw-peak and SUVbw-max difference, 4 and 13 lesions were classified as PMD, respectively. Using the average bin values for SUVbw-peak and SUVbw-max, 3 and 6 lesions were classified as PMD, respectively. Using PET-CTave values for SUVbw-peak and SUVbw-max, 4 and 3 lesions were classified as PMD, respectively. Conclusion: These results suggest that response evaluation in 4D PET/CT is dependent on the SUV measurement (SUVpeak vs. SUVmax), the number of bins (single or average), and the CT map used for attenuation correction.
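    The bin-wise percent-change comparison behind this PERCIST-style classification can be sketched as follows; the 30% threshold comes from the abstract, while the function names and sample values are illustrative:

```python
def percent_change(suv_ref, suv_new):
    """Percent change of an SUV value relative to the reference reconstruction."""
    return 100.0 * (suv_new - suv_ref) / suv_ref

def classify_pmd(suv_ref, suv_bins, threshold=30.0):
    """Classify a lesion as progressive metabolic disease (PMD) when the
    percent difference from the reference exceeds the threshold, using
    either the maximum-difference bin or the average over all bins."""
    changes = [percent_change(suv_ref, s) for s in suv_bins]
    max_change = max(changes, key=abs)
    avg_change = percent_change(suv_ref, sum(suv_bins) / len(suv_bins))
    return {"max_bin_pmd": abs(max_change) >= threshold,
            "avg_bin_pmd": abs(avg_change) >= threshold}
```

    As the abstract's numbers illustrate, the two rules can disagree: a single motion-affected bin may cross the 30% line while the bin average stays below it.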

  10. NEMA image quality phantom measurements and attenuation correction in integrated PET/MR hybrid imaging.

    PubMed

    Ziegler, Susanne; Jakoby, Bjoern W; Braun, Harald; Paulus, Daniel H; Quick, Harald H

    2015-12-01

    In integrated PET/MR hybrid imaging, the evaluation of PET performance characteristics according to the NEMA standard NU 2-2007 is challenging because of incomplete MR-based attenuation correction (AC) for phantom imaging. In this study, a strategy for CT-based AC of the NEMA image quality (IQ) phantom is assessed. The method is systematically evaluated in NEMA IQ phantom measurements on an integrated PET/MR system. NEMA IQ measurements were performed on the integrated 3.0 Tesla PET/MR hybrid system (Biograph mMR, Siemens Healthcare). AC of the NEMA IQ phantom was realized by an MR-based and by a CT-based method. The suggested CT-based AC uses a template μ-map of the NEMA IQ phantom and a phantom holder for exact repositioning of the phantom on the system's patient table. The PET image quality parameters contrast recovery, background variability, and signal-to-noise ratio (SNR) were determined and compared for both phantom AC methods. Reconstruction parameters of an iterative 3D OP-OSEM reconstruction were optimized for the highest lesion SNR in NEMA IQ phantom imaging. Using a CT-based NEMA IQ phantom μ-map on the PET/MR system is straightforward and allowed accurate NEMA IQ measurements on the hybrid system. MR-based AC was determined to be insufficient for PET quantification in the tested NEMA IQ phantom because only the photon attenuation caused by the MR-visible phantom filling, but not that of the phantom housing, is considered. Using the suggested CT-based AC, the highest SNR in this phantom experiment for small lesions (≤ 13 mm) was obtained with 3 iterations, 21 subsets, and 4 mm Gaussian filtering. This study suggests CT-based AC for the NEMA IQ phantom when performing PET NEMA IQ measurements on an integrated PET/MR hybrid system. The superiority of CT-based AC for this phantom is demonstrated by comparison to measurements using MR-based AC. Furthermore, optimized PET image reconstruction parameters are provided for the highest lesion SNR in NEMA IQ phantom measurements.

  11. Regional potential evapotranspiration in arid climates based on temperature, topography and calculated solar radiation

    NASA Astrophysics Data System (ADS)

    Shevenell, Lisa

    1999-03-01

    Values of evapotranspiration are required for a variety of water planning activities in arid and semi-arid climates, yet data requirements are often large, and it is costly to obtain this information. This work presents a method in which only a few readily available data (temperature, elevation) are required to estimate potential evapotranspiration (PET). A method using measured temperature and the calculated ratio of total to vertical radiation (after the work of Behnke and Maxey, 1969) to estimate monthly PET was applied for the months of April–October and compared with pan evaporation measurements. The test area used in this work was Nevada, which has 124 weather stations that record sufficient amounts of temperature data. The calculated PET values were found to be well correlated (R2 = 0.940–0.983, slopes near 1.0) with mean monthly pan evaporation measurements at eight weather stations. In order to extrapolate these calculated PET values to areas without temperature measurements and to sites at differing elevations, the state was divided into five regions based on latitude, and linear regressions of PET versus elevation were calculated for each of these regions. These extrapolated PET values generally compare well with the pan evaporation measurements (R2 = 0.926–0.988, slopes near 1.0). The estimated values are generally somewhat lower than the pan measurements, in part because the effects of wind are not explicitly considered in the calculations, and near-freezing temperatures result in a calculated PET of zero at higher elevations in the spring months. The calculated PET values for April–October are 84–100% of the measured pan evaporation values. Using digital elevation models in a geographical information system, calculated values were adjusted for slope and aspect, and the data were used to construct a series of maps of monthly PET. The resultant maps show a realistic distribution of regional variations in PET throughout Nevada that inversely mimics topography. The general methods described here could be used to estimate regional PET in other arid western states (e.g., New Mexico, Arizona, Utah) and arid regions worldwide (e.g., parts of Africa).
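    The per-region extrapolation step, a linear regression of PET against elevation, can be sketched as below; the station data are hypothetical, not values from the study:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit y = a + b*x (here: monthly PET vs. elevation)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

# Hypothetical stations in one latitude region: elevation (m) vs. July PET (mm).
elev = [1200, 1500, 1800, 2100]
pet = [210, 195, 180, 165]
a, b = fit_line(elev, pet)

# Extrapolate PET to an ungauged site at 1650 m in the same region.
pet_at_1650 = a + b * 1650
```

    In the study, one such regression per latitude region, combined with a digital elevation model, yields PET estimates for every map cell.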

  12. Quantitative assessment of human and pet exposure to Salmonella associated with dry pet foods.

    PubMed

    Lambertini, Elisabetta; Buchanan, Robert L; Narrod, Clare; Ford, Randall M; Baker, Robert C; Pradhan, Abani K

    2016-01-04

    Recent Salmonella outbreaks associated with dry pet foods and treats highlight the importance of these foods as previously overlooked exposure vehicles for both pets and humans. In the last decade efforts have been made to raise the safety of this class of products, for instance by upgrading production equipment, cleaning protocols, and finished product testing. However, no comprehensive or quantitative risk profile is available for pet foods, thus limiting the ability to establish safety standards and assess the effectiveness of current and proposed Salmonella control measures. This study sought to develop an ingredients-to-consumer quantitative microbial exposure assessment model to: 1) estimate pet and human exposure to Salmonella via dry pet food, and 2) assess the impact of industry- and household-level mitigation strategies on exposure. Data on the prevalence and concentration of Salmonella in pet food ingredients, production process parameters, bacterial ecology, and contact transfer in the household were obtained through literature review, industry data, and targeted research. A probabilistic Monte Carlo modeling framework was developed to simulate the production process and basic household exposure routes. Under the range of assumptions adopted in this model, human exposure due to handling pet food is null to minimal if contamination occurs exclusively before extrusion. Exposure increases considerably if recontamination occurs post-extrusion during coating with fat, although mean ingested doses remain modest even at high fat contamination levels, due to the low percentage of fat in the finished product. Exposure is highly variable, with the distribution of doses ingested by adult pet owners spanning 3 log CFU per exposure event. Child exposure due to ingestion of 1 g of pet food leads to significantly higher doses than adult doses associated with handling the food. Recontamination after extrusion and coating, e.g., via dust or equipment surfaces, may also lead to exposure due to the absence of pathogen reduction steps after extrusion or at consumer households. Exposure is potentially highest when Salmonella is transferred to human food that is left at growth-promoting conditions. This model can be applied to evaluate the impact of alternative Salmonella control measures during production, risk communication to consumers, and regulatory standards. Copyright © 2015 Elsevier B.V. All rights reserved.
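    The household exposure route can be illustrated with a toy Monte Carlo sketch in the spirit of the framework described above; all parameter values and distributional choices here are hypothetical, not the study's calibrated inputs:

```python
import random

def simulate_exposure(n=2000, p_contam=0.01, mean_log_conc=-1.0,
                      sd_log_conc=1.0, serving_g=1.0, seed=1):
    """Toy Monte Carlo exposure model: each handling/ingestion event involves
    contaminated product with probability p_contam; if contaminated, the
    log10 concentration (CFU/g) is drawn from a normal distribution.
    Returns the ingested dose (CFU) for each simulated event."""
    rng = random.Random(seed)
    doses = []
    for _ in range(n):
        if rng.random() < p_contam:
            conc_cfu_per_g = 10.0 ** rng.gauss(mean_log_conc, sd_log_conc)
            doses.append(conc_cfu_per_g * serving_g)
        else:
            doses.append(0.0)
    return doses
```

    The lognormal concentration term is why simulated doses span several log CFU across events, matching the wide variability the abstract reports.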

  13. Teaching Air Pollution in an Authentic Context

    NASA Astrophysics Data System (ADS)

    Mandrikas, Achilleas; Stavrou, Dimitrios; Skordoulis, Constantine

    2017-04-01

    This paper describes a teaching-learning sequence (TLS) about air pollution and the findings resulting from its implementation by pre-service elementary teachers (PET), currently undergraduate students of the Department of Primary Education at the National and Kapodistrian University of Athens, Greece. The TLS focused on the relation of air pollution to wind and topography under local conditions. An authentic context was provided to the students, based on daily up-to-date meteorological data from the Internet, in order to estimate air pollution. The results are encouraging given that the PET could correlate wind and the concentration of air pollutants by reading specialized angular diagrams and weather maps, could recognize the influence of topography on the concentration of air pollutants, and could describe temperature inversion. However, the PET demonstrated clear difficulties in orientation, in naming winds, and in interpreting symbols on a weather map. Finally, the implications for teaching air pollution are discussed.

  14. Methods for the correction of vascular artifacts in PET O-15 water brain-mapping studies

    NASA Astrophysics Data System (ADS)

    Chen, Kewei; Reiman, E. M.; Lawson, M.; Yun, Lang-sheng; Bandy, D.; Palant, A.

    1996-12-01

    While positron emission tomographic (PET) measurements of regional cerebral blood flow (rCBF) can be used to map brain regions that are involved in normal and pathological human behaviors, measurements in the anteromedial temporal lobe can be confounded by the combined effects of radiotracer activity in neighboring arteries and partial-volume averaging. The authors now describe two simple methods to address this vascular artifact. One method utilizes the early frames of a dynamic PET study, while the other utilizes a coregistered magnetic resonance image (MRI) to characterize the vascular region of interest (VROI). Both methods subsequently assign a common value to each pixel in the VROI for the control (baseline) scan and the activation scan. To study the vascular artifact and to demonstrate the ability of the proposed methods to correct it, four dynamic PET scans were performed in a single subject during the same behavioral state. For each of the four scans, a vascular scan containing vascular activity was computed as the summation of the images acquired 0-60 s after radiotracer administration, and a control scan containing minimal vascular activity was computed as the summation of the images acquired 20-80 s after radiotracer administration. t-score maps calculated from the four pairs of vascular and control scans were used to characterize regional blood flow differences related to vascular activity before and after the application of each vascular artifact correction method. Both methods eliminated the observed differences in vascular activity, as well as the vascular artifact observed in the anteromedial temporal lobes. Using PET data from a study of normal human emotion, these methods permitted the authors to identify rCBF increases in the anteromedial temporal lobe free from the potentially confounding, combined effects of vascular activity and partial-volume averaging.
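    The common-value substitution step can be sketched on flat (1-D) images; the choice of the common value (here, the mean of the control scan over the VROI) is illustrative, as the abstract does not specify it:

```python
def apply_vroi(image, mask, value):
    """Return a copy of a flat image with every VROI voxel set to `value`."""
    return [value if m else v for v, m in zip(image, mask)]

def correct_pair(control, activation, mask):
    """Assign one common value to all VROI voxels in BOTH the control and
    activation images, so that paired (e.g. subtraction or t-score)
    comparisons cancel exactly inside the vascular region."""
    vroi_vals = [v for v, m in zip(control, mask) if m]
    common = sum(vroi_vals) / len(vroi_vals)
    return apply_vroi(control, mask, common), apply_vroi(activation, mask, common)
```

    Because both scans receive the same value inside the VROI, any apparent activation there is removed, which is the point of the correction.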

  15. Positron emission tomography (PET) advances in neurological applications

    NASA Astrophysics Data System (ADS)

    Sossi, V.

    2003-09-01

    Positron Emission Tomography (PET) is a functional imaging modality used in brain research to map in vivo neurotransmitter and receptor activity and to investigate glucose utilization or blood flow patterns both in healthy and disease states. Such research is made possible by the wealth of radiotracers available for PET, by the fact that metabolic and kinetic parameters of particular processes can be extracted from PET data and by the continuous development of imaging techniques. In recent years great advancements have been made in the areas of PET instrumentation, data quantification and image reconstruction that allow for more detailed and accurate biological information to be extracted from PET data. It is now possible to quantitatively compare data obtained either with different tracers or with the same tracer under different scanning conditions. These sophisticated imaging approaches enable detailed investigation of disease mechanisms and system response to disease and/or therapy.

  16. Correlation of intra-tumor 18F-FDG uptake heterogeneity indices with perfusion CT derived parameters in colorectal cancer.

    PubMed

    Tixier, Florent; Groves, Ashley M; Goh, Vicky; Hatt, Mathieu; Ingrand, Pierre; Le Rest, Catherine Cheze; Visvikis, Dimitris

    2014-01-01

    Thirty patients with proven colorectal cancer prospectively underwent integrated 18F-FDG PET/DCE-CT to assess the metabolic-flow phenotype. Both CT blood flow parametric maps and PET images were analyzed. Correlations between PET heterogeneity and perfusion CT were assessed by Spearman's rank correlation analysis. Blood flow as visualized by DCE-CT was significantly correlated with the 18F-FDG PET metabolically active tumor volume, as well as with uptake heterogeneity, for patients with stage III/IV tumors (|ρ| = 0.66 to 0.78; p < 0.02). The positive correlation found with tumor blood flow indicates that intra-tumor heterogeneity of 18F-FDG PET accumulation reflects to some extent tracer distribution, and consequently that 18F-FDG PET intra-tumor heterogeneity may be associated with physiological processes such as tumor vascularization.
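    Spearman's rank correlation, the statistic used above, is simply the Pearson correlation of the ranks. A minimal sketch for tie-free data (tied ranks would need averaged ranks, omitted here for brevity):

```python
def spearman_rho(xs, ys):
    """Spearman's rank correlation for tie-free samples:
    Pearson correlation computed on the ranks of the values."""
    def ranks(values):
        order = sorted(range(len(values)), key=lambda i: values[i])
        r = [0.0] * len(values)
        for rank, i in enumerate(order):
            r[i] = float(rank + 1)
        return r

    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den
```

    Because it depends only on ranks, |ρ| captures any monotone association between the perfusion and heterogeneity indices, not just linear ones.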

  17. A Teaching-Learning Sequence about Weather Map Reading

    ERIC Educational Resources Information Center

    Mandrikas, Achilleas; Stavrou, Dimitrios; Skordoulis, Constantine

    2017-01-01

    In this paper a teaching-learning sequence (TLS) introducing pre-service elementary teachers (PET) to weather map reading, with emphasis on wind assignment, is presented. The TLS includes activities about recognition of wind symbols, assignment of wind direction and wind speed on a weather map and identification of wind characteristics in a…

  18. Influence of region-of-interest designs on quantitative measurement of multimodal imaging of MR non-enhancing gliomas.

    PubMed

    Takano, Koji; Kinoshita, Manabu; Arita, Hideyuki; Okita, Yoshiko; Chiba, Yasuyoshi; Kagawa, Naoki; Watanabe, Yoshiyuki; Shimosegawa, Eku; Hatazawa, Jun; Hashimoto, Naoya; Fujimoto, Yasunori; Kishima, Haruhiko

    2018-05-01

    A number of studies have revealed the usefulness of multimodal imaging in gliomas. Although the results have been heavily affected by the method used for region of interest (ROI) design, the most discriminatory method for setting the ROI remains unclear. The aim of the present study was to determine the most suitable ROI design for 18F-fluorodeoxyglucose (FDG) and 11C-methionine (MET) positron emission tomography (PET), the apparent diffusion coefficient (ADC), and fractional anisotropy (FA) obtained by diffusion tensor imaging (DTI), from the viewpoint of grading non-enhancing gliomas. A total of 31 consecutive patients with newly diagnosed, histologically confirmed magnetic resonance (MR) non-enhancing gliomas who underwent FDG-PET, MET-PET, and DTI were retrospectively investigated. Quantitative measurements were performed using four different ROIs: hotspot/tumor center and whole tumor, constructed in either two dimensions (2D) or three dimensions (3D). Histopathological grading of the tumor was considered as empirical truth, and the quantitative measurements obtained from each ROI were correlated with the grade of the tumor. The most discriminating ROI for non-enhancing glioma grading differed according to the imaging modality. The 2D-hotspot/center ROI was most discriminating for FDG-PET (P=0.087), the ADC map (P=0.0083), and the FA map (P=0.25), whereas the 3D-whole tumor ROI was best for MET-PET (P=0.0050). In the majority of scenarios, 2D-ROIs performed better than 3D-ROIs. Results from image analysis using FDG-PET, MET-PET, ADC, and FA may be affected by ROI design, and the most discriminating ROI for non-enhancing glioma grading differed according to the imaging modality.

  19. Radioembolization and the Dynamic Role of 90Y PET/CT

    PubMed Central

    Pasciak, Alexander S.; Bourgeois, Austin C.; McKinney, J. Mark; Chang, Ted T.; Osborne, Dustin R.; Acuff, Shelley N.; Bradley, Yong C.

    2014-01-01

    Before the advent of tomographic imaging, it was postulated that decay of 90Y to the 0+ excited state of 90Zr may result in emission of a positron–electron pair. While the branching ratio for pair production is small (~32 × 10−6), PET has been successfully used to image 90Y in numerous recent patient and phantom studies. 90Y PET imaging has been performed on a variety of PET/CT systems, with and without time-of-flight (TOF) and/or resolution recovery capabilities, as well as on both bismuth-germanate- and lutetium yttrium orthosilicate (LYSO)-based scanners. On all systems, resolution and contrast superior to bremsstrahlung SPECT have been reported. The intrinsic radioactivity present in LYSO-based PET scanners is a potential limitation associated with accurate quantification of 90Y. However, intrinsic radioactivity has been shown to have a negligible effect at the high activity concentrations common in 90Y radioembolization. Accurate quantification is possible on a variety of PET scanner models, with or without TOF, although TOF improves accuracy at lower activity concentrations. Quantitative 90Y PET images can be transformed into 3-dimensional (3D) maps of absorbed dose based on the premise that the 90Y activity distribution does not change after infusion. This transformation has been accomplished in several ways, although the most common is the use of 3D dose-point-kernel convolution. From a clinical standpoint, 90Y PET provides a superior post-infusion evaluation of treatment technical success owing to its improved resolution. Absorbed dose maps generated from quantitative PET data can be used to predict treatment efficacy and manage patient follow-up. For patients who receive multiple treatments, this information can also be used to provide patient-specific treatment planning for successive therapies, potentially improving response. The broad utilization of 90Y PET has the potential to provide a wealth of dose–response information, which may lead to the development of improved radioembolization treatment-planning models in the future. PMID:24579065
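    Dose-point-kernel convolution can be sketched with a naive direct summation: each active voxel scatters a small dose kernel into its neighborhood. Production implementations use FFT-based convolution and measured kernels; the nested-list layout and values below are purely illustrative:

```python
def convolve3d(activity, kernel):
    """Direct-summation dose-point-kernel convolution.

    activity: nested lists [z][y][x], e.g. decays per voxel from the 90Y PET.
    kernel:   small odd-sized nested list, dose deposited at each offset
              per decay at the center (assumed symmetric).
    Returns the absorbed-dose map with the same shape as `activity`."""
    nz, ny, nx = len(activity), len(activity[0]), len(activity[0][0])
    kz, ky, kx = len(kernel), len(kernel[0]), len(kernel[0][0])
    cz, cy, cx = kz // 2, ky // 2, kx // 2
    dose = [[[0.0] * nx for _ in range(ny)] for _ in range(nz)]
    for z in range(nz):
        for y in range(ny):
            for x in range(nx):
                a = activity[z][y][x]
                if a == 0.0:
                    continue  # skip empty voxels for speed
                for dz in range(kz):
                    for dy in range(ky):
                        for dx in range(kx):
                            zz, yy, xx = z + dz - cz, y + dy - cy, x + dx - cx
                            if 0 <= zz < nz and 0 <= yy < ny and 0 <= xx < nx:
                                dose[zz][yy][xx] += a * kernel[dz][dy][dx]
    return dose
```

    The "activity distribution does not change after infusion" premise is what lets a single post-infusion PET image stand in for the time-integrated decay map.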

  20. Earthquake scenario and probabilistic ground-shaking hazard maps for the Albuquerque-Belen-Santa Fe, New Mexico, corridor

    USGS Publications Warehouse

    Wong, I.; Olig, S.; Dober, M.; Silva, W.; Wright, D.; Thomas, P.; Gregor, N.; Sanford, A.; Lin, K.-W.; Love, D.

    2004-01-01

    These maps are not intended to be a substitute for site-specific studies for engineering design nor to replace standard maps commonly referenced in building codes. Rather, we hope that these maps will be used as a guide by government agencies; the engineering, urban planning, emergency preparedness, and response communities; and the general public as part of an overall program to reduce earthquake risk and losses in New Mexico.

  1. Using the Logarithm of Odds to Define a Vector Space on Probabilistic Atlases

    PubMed Central

    Pohl, Kilian M.; Fisher, John; Bouix, Sylvain; Shenton, Martha; McCarley, Robert W.; Grimson, W. Eric L.; Kikinis, Ron; Wells, William M.

    2007-01-01

    The Logarithm of the Odds ratio (LogOdds) is frequently used in areas such as artificial neural networks, economics, and biology, as an alternative representation of probabilities. Here, we use LogOdds to place probabilistic atlases in a linear vector space. This representation has several useful properties for medical imaging. For example, it not only encodes the shape of multiple anatomical structures but also captures some information concerning uncertainty. We demonstrate that the resulting vector space operations of addition and scalar multiplication have natural probabilistic interpretations. We discuss several examples for placing label maps into the space of LogOdds. First, we relate signed distance maps, a widely used implicit shape representation, to LogOdds and compare it to an alternative that is based on smoothing by spatial Gaussians. We find that the LogOdds approach better preserves shapes in a complex multiple object setting. In the second example, we capture the uncertainty of boundary locations by mapping multiple label maps of the same object into the LogOdds space. Third, we define a framework for non-convex interpolations among atlases that capture different time points in the aging process of a population. We evaluate the accuracy of our representation by generating a deformable shape atlas that captures the variations of anatomical shapes across a population. The deformable atlas is the result of a principal component analysis within the LogOdds space. This atlas is integrated into an existing segmentation approach for MR images. We compare the performance of the resulting implementation in segmenting 20 test cases to a similar approach that uses a more standard shape model that is based on signed distance maps. On this data set, the Bayesian classification model with our new representation outperformed the other approaches in segmenting subcortical structures. PMID:17698403
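    For the binary (one-structure) case, the LogOdds representation is the familiar logit map; the paper's multi-structure version generalizes this, but the key vector-space property, that addition of LogOdds has a probabilistic meaning, already shows up here. A minimal sketch (the evidence-combination reading of addition is one standard interpretation, stated here as an assumption):

```python
import math

def prob_to_logodds(p):
    """Map a probability in (0, 1) to the real line via the log-odds (logit)."""
    return math.log(p / (1.0 - p))

def logodds_to_prob(x):
    """Inverse map back to (0, 1): the logistic (sigmoid) function."""
    return 1.0 / (1.0 + math.exp(-x))

# Addition in LogOdds space: combining two probability estimates for the
# same voxel multiplies their odds, e.g. 0.8 and 0.6 combine to 6/7.
p1, p2 = 0.8, 0.6
combined = logodds_to_prob(prob_to_logodds(p1) + prob_to_logodds(p2))
```

    Because the logit is a bijection onto the reals, label-map probabilities become unconstrained vectors, which is what makes PCA and interpolation on atlases well-defined in this space.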

  2. Enhancement of PET Images

    NASA Astrophysics Data System (ADS)

    Davis, Paul B.; Abidi, Mongi A.

    1989-05-01

    PET is the only imaging modality that provides doctors with early analytic and quantitative biochemical assessment and precise localization of pathology. In PET images, boundary information as well as local pixel intensity are both crucial for manual and/or automated feature tracing, extraction, and identification. Unfortunately, present PET technology does not provide the necessary image quality from which such precise analytic and quantitative measurements can be made. PET images suffer from significantly high levels of radial noise, present in the form of streaks, caused by the inexactness of the models used in image reconstruction. In this paper, our objective is to model PET noise and remove it without altering dominant features in the image. The ultimate goal is to enhance these dominant features to allow automatic computer interpretation and classification of PET images by developing techniques that take into consideration PET signal characteristics, data collection, and data reconstruction. We have modeled the noise streaks in PET images in both rectangular and polar representations and have shown, both analytically and through computer simulation, that the noise exhibits consistent mapping patterns. A class of filters was designed and applied successfully. Visual inspection of the filtered images shows clear enhancement over the original images.

  3. A fluvial and pluvial probabilistic flood hazard analysis for Can Tho city, Vietnam

    NASA Astrophysics Data System (ADS)

    Apel, Heiko; Martinez, Oriol; Thi Chinh, Do; Viet Dung, Nguyen

    2014-05-01

    Can Tho city is the largest city and the economic heart of the Mekong Delta, Vietnam. Due to its economic importance and envisaged development goals, the city has grown rapidly in population size and extent over the last two decades. Large parts of the city are located in flood-prone areas, and the central parts of the city have recently experienced an increasing number of flood events, both of fluvial and pluvial nature. As the economic power and asset values are constantly increasing, this poses a considerable risk for the city. The aim of this study is to perform a flood hazard analysis considering both fluvial and pluvial floods and to derive probabilistic flood hazard maps. This requires, as a first step, an understanding of the typical flood mechanisms. Fluvial floods are triggered by a coincidence of high water levels during the annual flood period in the Mekong Delta with high tidal levels, which in combination cause short-term inundations in Can Tho. Pluvial floods are triggered by typical tropical convective rain storms during the monsoon season. These two flood pathways are essentially independent in their sources and can thus be treated accordingly in the hazard analysis. For the fluvial hazard analysis we propose a bivariate frequency analysis of the Mekong flood characteristics: the annual maximum flood discharge Q and the annual flood volume V at the upper boundary of the Mekong Delta, the gauging station Kratie. This defines probabilities of exceedance of different Q-V pairs, which are transferred into synthetic flood hydrographs. The synthetic hydrographs are routed through a quasi-2D hydrodynamic model of the entire Mekong Delta in order to provide boundary conditions for a detailed hazard mapping of Can Tho. This downscaling step is necessary because the huge complexity of the river and channel network does not allow for a proper definition of boundary conditions for Can Tho city by gauge data alone.
In addition, the available gauge data around Can Tho are too short for a meaningful frequency analysis. The detailed hazard mapping is performed by a 2D hydrodynamic model of Can Tho city. As the scenarios are derived in a Monte Carlo framework, the final flood hazard maps are probabilistic, i.e., they show the median flood hazard along with uncertainty estimates for each defined level of probability of exceedance. For the pluvial flood hazard, a frequency analysis of the hourly rain gauge data of Can Tho is performed implementing a peak-over-threshold procedure. Based on this frequency analysis, synthetic rain storms are generated in a Monte Carlo framework for the same probabilities of exceedance as in the fluvial flood hazard analysis. Probabilistic flood hazard maps are then generated with the same 2D hydrodynamic model of the city. In a last step, the fluvial and pluvial scenarios are combined, assuming independence of the events. These combined scenarios are also transferred into hazard maps by the 2D hydrodynamic model, finally yielding combined fluvial-pluvial probabilistic flood hazard maps for Can Tho. The derived set of maps may be used for improved city planning or flood risk analysis.
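    Two pieces of the workflow above are compact enough to sketch: the independence assumption when combining pathways, and the Monte Carlo summary (median plus uncertainty band) behind a probabilistic hazard map. This is a toy illustration, not the study's hydrodynamic model; depth values are arbitrary:

```python
import random

def combined_exceedance(p_fluvial, p_pluvial):
    """Annual exceedance probability of 'fluvial OR pluvial flood',
    under the study's assumption that the two pathways are independent."""
    return 1.0 - (1.0 - p_fluvial) * (1.0 - p_pluvial)

def monte_carlo_hazard(depth_scenarios, n=1000, seed=0):
    """Summarize simulated flood depths at one map cell as a median and a
    90% uncertainty band, as a probabilistic hazard map does per cell."""
    rng = random.Random(seed)
    sample = sorted(rng.choice(depth_scenarios) for _ in range(n))
    return (sample[int(0.05 * n)],   # 5th percentile
            sample[int(0.50 * n)],   # median hazard
            sample[int(0.95 * n)])   # 95th percentile
```

    For example, independent 1% fluvial and 2% pluvial exceedance probabilities combine to 2.98%, slightly less than their sum because joint events are counted once.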

  4. Chapter 8: US geological survey Circum-Arctic Resource Appraisal (CARA): Introduction and summary of organization and methods

    USGS Publications Warehouse

    Charpentier, R.R.; Gautier, D.L.

    2011-01-01

    The USGS has assessed undiscovered petroleum resources in the Arctic through geological mapping, basin analysis, and quantitative assessment. The new map compilation provided the base from which geologists subdivided the Arctic for burial history modelling and quantitative assessment. The CARA was a probabilistic, geologically based study that used existing USGS methodology, modified somewhat for the circumstances of the Arctic. The assessment relied heavily on analogue modelling, with numerical input as lognormal distributions of sizes and numbers of undiscovered accumulations. Probabilistic results for individual assessment units were statistically aggregated taking geological dependencies into account. Fourteen papers in this Geological Society volume present summaries of various aspects of the CARA. © 2011 The Geological Society of London.

  5. Nonequilibrium Probabilistic Dynamics of the Logistic Map at the Edge of Chaos

    NASA Astrophysics Data System (ADS)

    Borges, Ernesto P.; Tsallis, Constantino; Añaños, Garín F.; de Oliveira, Paulo Murilo

    2002-12-01

    We consider nonequilibrium probabilistic dynamics in logistic-like maps x_{t+1} = 1 − a|x_t|^z (z > 1) at their chaos threshold: we first introduce many initial conditions within one among W ≫ 1 intervals partitioning the phase space and focus on the unique value q_sen < 1 for which the entropic form S_q ≡ (1 − Σ_{i=1}^{W} p_i^q)/(q − 1) linearly increases with time. We then verify that S_{q_sen}(t) − S_{q_sen}(∞) vanishes like t^{−1/[q_rel(W)−1]} [q_rel(W) > 1]. We finally exhibit a new finite-size scaling, q_rel(∞) − q_rel(W) ∼ W^{−|q_sen|}. This establishes quantitatively, for the first time, a long-pursued relation between sensitivity to initial conditions and relaxation, concepts which play central roles in nonextensive statistical mechanics.
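    The numerical experiment described above can be sketched directly: spread an ensemble inside one cell of a W-cell partition, iterate the map, histogram the ensemble, and evaluate S_q. The parameter value below is the standard z = 2 chaos-threshold value; ensemble sizes and step counts are illustrative:

```python
def q_entropy(probs, q):
    """Entropic form S_q = (1 - sum_i p_i^q) / (q - 1), with k = 1."""
    return (1.0 - sum(p ** q for p in probs if p > 0.0)) / (q - 1.0)

def spread(a=1.40115519, z=2.0, W=50, n0=2000, steps=3):
    """Place n0 initial conditions inside the first cell of a W-cell
    partition of [-1, 1], iterate x -> 1 - a*|x|**z, and return the
    occupation probabilities p_i of the partition cells."""
    width = 2.0 / W
    xs = [-1.0 + (i + 0.5) * width / n0 for i in range(n0)]  # all in cell 0
    for _ in range(steps):
        xs = [1.0 - a * abs(x) ** z for x in xs]
    counts = [0] * W
    for x in xs:
        counts[min(W - 1, max(0, int((x + 1.0) / width)))] += 1
    return [c / n0 for c in counts]
```

    Tracking q_entropy(spread(steps=t), q) against t for various q reproduces the experiment's key question: for which q does S_q grow linearly at the chaos threshold.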

  6. A wavelet-based estimator of the degrees of freedom in denoised fMRI time series for probabilistic testing of functional connectivity and brain graphs.

    PubMed

    Patel, Ameera X; Bullmore, Edward T

    2016-11-15

    Connectome mapping using techniques such as functional magnetic resonance imaging (fMRI) has become a focus of systems neuroscience. There remain many statistical challenges in analysis of functional connectivity and network architecture from BOLD fMRI multivariate time series. One key statistic for any time series is its (effective) degrees of freedom, df, which will generally be less than the number of time points (or nominal degrees of freedom, N). If we know the df, then probabilistic inference on other fMRI statistics, such as the correlation between two voxel or regional time series, is feasible. However, we currently lack good estimators of df in fMRI time series, especially after the degrees of freedom of the "raw" data have been modified substantially by denoising algorithms for head movement. Here, we used a wavelet-based method both to denoise fMRI data and to estimate the (effective) df of the denoised process. We show that seed voxel correlations corrected for locally variable df could be tested for false positive connectivity with better control over Type I error and greater specificity of anatomical mapping than probabilistic connectivity maps using the nominal degrees of freedom. We also show that wavelet despiked statistics can be used to estimate all pairwise correlations between a set of regional nodes, assign a P value to each edge, and then iteratively add edges to the graph in order of increasing P. These probabilistically thresholded graphs are likely more robust to regional variation in head movement effects than comparable graphs constructed by thresholding correlations. Finally, we show that time-windowed estimates of df can be used for probabilistic connectivity testing or dynamic network analysis so that apparent changes in the functional connectome are appropriately corrected for the effects of transient noise bursts. 
Wavelet despiking is both an algorithm for fMRI time series denoising and an estimator of the (effective) df of denoised fMRI time series. Accurate estimation of df offers many potential advantages for probabilistically thresholding functional connectivity and network statistics tested in the context of spatially variant and non-stationary noise. Code for wavelet despiking, seed correlational testing and probabilistic graph construction is freely available to download as part of the BrainWavelet Toolbox at www.brainwavelet.org. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
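The df-corrected inference described above can be illustrated with a standard Fisher z-test of a correlation, substituting the effective df for the nominal one. This is a minimal sketch with invented numbers, not the BrainWavelet Toolbox implementation (which uses a t-based test on wavelet-despiked data):

```python
import math

def normal_sf(z):
    """Upper-tail probability of the standard normal distribution."""
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def corr_pvalue(r, df):
    """Two-sided p-value for a Pearson correlation r, tested with `df`
    degrees of freedom via the Fisher z-transform."""
    z = math.atanh(r) * math.sqrt(df - 3)
    return 2.0 * normal_sf(abs(z))

r = 0.25
p_nominal = corr_pvalue(r, df=200)   # nominal df: one per time point
p_effective = corr_pvalue(r, df=60)  # effective df after denoising
# The same correlation is less significant under the (smaller) effective df,
# which is how df correction reduces false positive connectivity.
```

The key point of the abstract is visible here: substituting the effective df for the nominal N inflates the p-value, so fewer spurious edges survive thresholding.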

  7. Postmortem 3-D brain hemisphere cortical tau and amyloid-β pathology mapping and quantification as a validation method of neuropathology imaging.

    PubMed

    Smid, Lojze M; Kepe, Vladimir; Vinters, Harry V; Bresjanac, Mara; Toyokuni, Tatsushi; Satyamurthy, Nagichettiar; Wong, Koon-Pong; Huang, Sung-Cheng; Silverman, Daniel H S; Miller, Karen; Small, Gary W; Barrio, Jorge R

    2013-01-01

This work is aimed at correlating pre-mortem [18F]FDDNP positron emission tomography (PET) scan results in a patient with dementia with Lewy bodies (DLB) with the cortical neuropathology distribution determined postmortem in three physical dimensions in whole brain coronal sections. Analysis of total amyloid-β (Aβ) distribution in frontal cortex and posterior cingulate gyrus confirmed its statistically significant correlation with cortical [18F]FDDNP PET binding values (distribution volume ratios, DVR) (p < 0.001, R = 0.97, R² = 0.94). Neurofibrillary tangle (NFT) distribution correlated significantly with cortical [18F]FDDNP PET DVR in the temporal lobe (p < 0.001, R = 0.87, R² = 0.76). A linear combination of Aβ and NFT densities was highly predictive of [18F]FDDNP PET DVR across all analyzed regions of interest (p < 0.0001, R = 0.92, R² = 0.85), and both densities contributed significantly to the model. Lewy bodies were present at a much lower level than either Aβ or NFTs and did not contribute significantly to the in vivo signal. [18F]FDG PET scan results in this patient were consistent with the distinctive DLB pattern of hypometabolism. This work offers a brain mapping model applicable to all imaging probes for verification of imaging results against Aβ and/or tau neuropathology brain distribution using immunohistochemistry, fluorescence microscopy, and autoradiography.
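The "linear combination of Aβ and NFT densities" is an ordinary least-squares fit with two predictors. A self-contained sketch via the normal equations follows; all density and DVR values below are hypothetical, not the study's data:

```python
def ols(X, y):
    """Ordinary least squares via the normal equations (X^T X) b = X^T y,
    solved by Gaussian elimination with partial pivoting.
    Each row of X is [1, x1, x2] (intercept plus two predictors)."""
    k = len(X[0])
    A = [[sum(X[r][i] * X[r][j] for r in range(len(X))) for j in range(k)]
         for i in range(k)]
    b = [sum(X[r][i] * y[r] for r in range(len(X))) for i in range(k)]
    for i in range(k):                       # forward elimination
        p = max(range(i, k), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        b[i], b[p] = b[p], b[i]
        for r in range(i + 1, k):
            f = A[r][i] / A[i][i]
            for c in range(i, k):
                A[r][c] -= f * A[i][c]
            b[r] -= f * b[i]
    coef = [0.0] * k                          # back substitution
    for i in reversed(range(k)):
        coef[i] = (b[i] - sum(A[i][c] * coef[c]
                              for c in range(i + 1, k))) / A[i][i]
    return coef

# Hypothetical densities (arbitrary units) and DVR values, for illustration only.
abeta = [0.10, 0.30, 0.50, 0.70, 0.90]
nft   = [0.05, 0.10, 0.40, 0.20, 0.60]
dvr   = [1.02, 1.10, 1.25, 1.22, 1.40]
X = [[1.0, a, n] for a, n in zip(abeta, nft)]
intercept, b_abeta, b_nft = ols(X, dvr)
```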

  8. Surface feature-guided mapping of cerebral metabolic changes in cognitively normal and mildly impaired elderly.

    PubMed

    Apostolova, Liana G; Thompson, Paul M; Rogers, Steve A; Dinov, Ivo D; Zoumalan, Charleen; Steiner, Calen A; Siu, Erin; Green, Amity E; Small, Gary W; Toga, Arthur W; Cummings, Jeffrey L; Phelps, Michael E; Silverman, Daniel H

    2010-04-01

The aim of this study was to investigate longitudinal positron emission tomography (PET) metabolic changes in the elderly. Nineteen nondemented subjects (mean Mini-Mental Status Examination 29.4 +/- 0.7 SD) underwent two detailed neuropsychological evaluations and resting 2-deoxy-2-[F-18]fluoro-D-glucose (FDG)-PET scans (interval 21.7 +/- 3.7 months), baseline structural 3T magnetic resonance (MR) imaging, and apolipoprotein E4 genotyping. Cortical PET metabolic changes were analyzed in 3-D using the cortical pattern matching technique. Baseline vs. follow-up whole-group comparison revealed significant metabolic decline bilaterally in the posterior temporal, parietal, and occipital lobes and the left lateral frontal cortex. The declining group demonstrated 10-15% decline in bilateral posterior cingulate/precuneus, posterior temporal, parietal, and occipital cortices. The cognitively stable group showed 2.5-5% similarly distributed decline. ApoE4-positive individuals underwent 5-15% metabolic decline in the posterior association cortices. Using 3-D surface-based MR-guided FDG-PET mapping, significant metabolic changes were seen in five posterior regions and the left lateral frontal region. The changes were more pronounced for the declining relative to the cognitively stable group.

  9. Improved quantitation and reproducibility in multi-PET/CT lung studies by combining CT information.

    PubMed

    Holman, Beverley F; Cuplov, Vesna; Millner, Lynn; Endozo, Raymond; Maher, Toby M; Groves, Ashley M; Hutton, Brian F; Thielemans, Kris

    2018-06-05

Matched attenuation maps are vital for obtaining accurate and reproducible kinetic and static parameter estimates from PET data. With increased interest in PET/CT imaging of diffuse lung diseases for assessing disease progression and treatment effectiveness, understanding the extent of the effect of respiratory motion and establishing methods for correction are becoming more important. In a previous study, we have shown that using the wrong attenuation map leads to large errors due to density mismatches in the lung, especially in dynamic PET scans. Here, we extend this work to the case where the study is sub-divided into several scans, e.g. for patient comfort, each with its own CT (cine-CT and 'snap shot' CT). A method to combine multi-CT information into a combined-CT has then been developed, which averages the CT information from each study section to produce composite CT images with the lung density more representative of that in the PET data. This combined-CT was applied to nine patients with idiopathic pulmonary fibrosis, imaged with dynamic 18F-FDG PET/CT to determine the improvement in the precision of the parameter estimates. Using XCAT simulations, errors in the influx rate constant were found to be as high as 60% in multi-PET/CT studies. Analysis of patient data identified displacements between study sections in the time-activity curves, which led to an average standard error in the estimates of the influx rate constant of 53% with conventional methods. This reduced to within 5% after use of combined-CTs for attenuation correction of the study sections. Use of combined-CTs to reconstruct the sections of a multi-PET/CT study, as opposed to using the individually acquired CTs at each study stage, produces more precise parameter estimates and may improve discrimination between diseased and normal lung.
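The combined-CT construction described above amounts to a voxel-wise average of the co-registered CT sections. A minimal sketch (the Hounsfield-unit values are invented for illustration):

```python
def combine_cts(ct_volumes):
    """Voxel-wise average of co-registered CT volumes (flattened lists of HU
    values): a sketch of building a combined-CT whose lung density is closer
    to the time-averaged density seen over the whole PET acquisition."""
    n = len(ct_volumes)
    return [sum(v[i] for v in ct_volumes) / n
            for i in range(len(ct_volumes[0]))]

# Hypothetical lung voxels (HU) from three study sections acquired at
# different respiratory states; values are illustrative only.
sections = [[-850.0, -700.0], [-800.0, -720.0], [-780.0, -680.0]]
combined = combine_cts(sections)  # [-810.0, -700.0]
```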

  10. Imaging Bone–Cartilage Interactions in Osteoarthritis Using [18F]-NaF PET-MRI

    PubMed Central

    Pedoia, Valentina; Seo, Youngho; Yang, Jaewon; Bucknor, Matt; Franc, Benjamin L.; Majumdar, Sharmila

    2016-01-01

Purpose: Simultaneous positron emission tomography–magnetic resonance imaging (PET-MRI) is an emerging technology providing both anatomical and functional images without increasing the scan time. Compared to traditional PET/computed tomography imaging, it also exposes the patient to significantly less radiation and provides better anatomical images, as MRI provides superior soft tissue characterization. Using PET-MRI, we aim to study interactions between cartilage composition and bone function simultaneously, in knee osteoarthritis (OA). Procedures: In this article, bone turnover and remodeling were studied using [18F]-sodium fluoride (NaF) PET data. Quantitative MR-derived T1ρ relaxation times characterized the biochemical cartilage degeneration. Sixteen participants with early signs of OA of the knee received intravenous injections of [18F]-NaF at the onset of PET-MR image acquisition. Regions of interest were identified, and kinetic analysis of dynamic PET data provided the rate of uptake (Ki) and the normalized uptake (standardized uptake value) of [18F]-NaF in the bone. Morphological MR images and quantitative voxel-based T1ρ maps of cartilage were obtained using an atlas-based registration technique to segment cartilage automatically. Voxel-by-voxel statistical parameter mapping was used to investigate the relationship between bone and cartilage. Results: Increases in cartilage T1ρ, indicating degenerative changes, were associated with increased turnover in the adjoining bone but reduced turnover in the nonadjoining compartments. Associations between pain and increased bone uptake were seen in the absence of morphological lesions in cartilage, but the relationship was reversed in the presence of incident cartilage lesions. Conclusion: This study shows significant cartilage and bone interactions in OA of the knee joint using simultaneous [18F]-NaF PET-MR, the first in-human study. 
These observations highlight the complex biomechanical and biochemical interactions in the whole knee joint in OA, which potentially could help assess therapeutic targets in treating OA. PMID:28654417
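The "normalized uptake (standardized uptake value)" reported above follows the standard body-weight SUV definition; a sketch (all input numbers are hypothetical):

```python
def suv_bw(activity_conc_kbq_ml, injected_dose_mbq, body_weight_kg):
    """Body-weight-normalised standardized uptake value (SUV), assuming a
    tissue density of 1 g/mL: SUV = C / (injected dose / body weight)."""
    dose_kbq = injected_dose_mbq * 1000.0   # MBq -> kBq
    weight_g = body_weight_kg * 1000.0      # kg  -> g (~mL at 1 g/mL)
    return activity_conc_kbq_ml / (dose_kbq / weight_g)

suv = suv_bw(activity_conc_kbq_ml=5.0, injected_dose_mbq=200.0,
             body_weight_kg=70.0)
# 5.0 / (200000 / 70000) = 1.75
```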

  11. Solving graph data issues using a layered architecture approach with applications to web spam detection.

    PubMed

    Scarselli, Franco; Tsoi, Ah Chung; Hagenbuchner, Markus; Noi, Lucia Di

    2013-12-01

This paper proposes the combination of two state-of-the-art algorithms for processing graph input data, viz., the probabilistic mapping graph self-organizing map, an unsupervised learning approach, and the graph neural network, a supervised learning approach. We organize these two algorithms in a cascade architecture containing a probabilistic mapping graph self-organizing map and a graph neural network. We show that this combined approach helps us to limit the long-term dependency problem that exists when training the graph neural network, resulting in an overall improvement in performance. This is demonstrated in an application to a benchmark problem requiring the detection of spam in a relatively large set of web sites. It is found that the proposed method produces results which reach the state of the art when compared with some of the best results obtained by others using quite different approaches. A particular strength of our method is its applicability to any input domain which can be represented as a graph. Copyright © 2013 Elsevier Ltd. All rights reserved.

  12. Probabilistic self-localisation on a qualitative map based on occlusions

    NASA Astrophysics Data System (ADS)

    Santos, Paulo E.; Martins, Murilo F.; Fenelon, Valquiria; Cozman, Fabio G.; Dee, Hannah M.

    2016-09-01

Spatial knowledge plays an essential role in human reasoning, permitting tasks such as locating objects in the world (including oneself), reasoning about everyday actions and describing perceptual information. This is also the case in the field of mobile robotics, where one of the most basic (and essential) tasks is the autonomous determination of the pose of a robot with respect to a map, given its perception of the environment. This is the problem of robot self-localisation (or simply the localisation problem). This paper presents a probabilistic algorithm for robot self-localisation that is based on a topological map constructed from the observation of spatial occlusion. Distinct locations on the map are defined by means of a classical formalism for qualitative spatial reasoning, whose base definitions are closer to the human categorisation of space than traditional, numerical, localisation procedures. The approach proposed herein was systematically evaluated through experiments using a mobile robot equipped with an RGB-D sensor. The results obtained show that the localisation algorithm is successful in locating the robot in qualitatively distinct regions.

  13. Learning Probabilistic Features for Robotic Navigation Using Laser Sensors

    PubMed Central

    Aznar, Fidel; Pujol, Francisco A.; Pujol, Mar; Rizo, Ramón; Pujol, María-José

    2014-01-01

SLAM is a popular task used by robots and autonomous vehicles to build a map of an unknown environment and, at the same time, to determine their location within the map. This paper describes a SLAM-based, probabilistic robotic system able to learn the essential features of different parts of its environment. Some previous SLAM implementations had computational complexities ranging from O(N log N) to O(N²), where N is the number of map features. Unlike these methods, our approach reduces the computational complexity to O(N) by using a model to fuse the information from the sensors after applying the Bayesian paradigm. Once the training process is completed, the robot identifies and locates those areas that potentially match the sections that have been previously learned. After the training, the robot navigates and extracts a three-dimensional map of the environment using a single laser sensor. Thus, it perceives different sections of its world. In addition, in order to make our system able to be used in a low-cost robot, low-complexity algorithms that can be easily implemented on embedded processors or microcontrollers are used. PMID:25415377

  14. Learning probabilistic features for robotic navigation using laser sensors.

    PubMed

    Aznar, Fidel; Pujol, Francisco A; Pujol, Mar; Rizo, Ramón; Pujol, María-José

    2014-01-01

SLAM is a popular task used by robots and autonomous vehicles to build a map of an unknown environment and, at the same time, to determine their location within the map. This paper describes a SLAM-based, probabilistic robotic system able to learn the essential features of different parts of its environment. Some previous SLAM implementations had computational complexities ranging from O(N log N) to O(N²), where N is the number of map features. Unlike these methods, our approach reduces the computational complexity to O(N) by using a model to fuse the information from the sensors after applying the Bayesian paradigm. Once the training process is completed, the robot identifies and locates those areas that potentially match the sections that have been previously learned. After the training, the robot navigates and extracts a three-dimensional map of the environment using a single laser sensor. Thus, it perceives different sections of its world. In addition, in order to make our system able to be used in a low-cost robot, low-complexity algorithms that can be easily implemented on embedded processors or microcontrollers are used.
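The O(N) Bayesian-fusion idea can be illustrated with the classic independent-observation log-odds update, in which each map feature is updated in constant time per observation, so a whole scan of N features costs O(N). This is a generic sketch, not the authors' algorithm; the probabilities are invented:

```python
import math

def logodds(p):
    """Log-odds of a probability."""
    return math.log(p / (1.0 - p))

def fuse(prior_logodds, sensor_probs):
    """Bayesian fusion of conditionally independent sensor readings for one
    map feature: each observation simply adds its log-odds, a constant-time
    update per feature."""
    l = prior_logodds + sum(logodds(p) for p in sensor_probs)
    return 1.0 / (1.0 + math.exp(-l))  # back to a probability

# Two laser readings each suggest the feature is present (p = 0.8 and 0.7),
# starting from an uninformative prior of 0.5.
p = fuse(logodds(0.5), [0.8, 0.7])  # ~0.903
```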

  15. Assignment of functional activations to probabilistic cytoarchitectonic areas revisited.

    PubMed

    Eickhoff, Simon B; Paus, Tomas; Caspers, Svenja; Grosbras, Marie-Helene; Evans, Alan C; Zilles, Karl; Amunts, Katrin

    2007-07-01

Probabilistic cytoarchitectonic maps in standard reference space provide a powerful tool for the analysis of structure-function relationships in the human brain. While these microstructurally defined maps have already been successfully used in the analysis of somatosensory, motor or language functions, several conceptual issues in the analysis of structure-function relationships still demand further clarification. In this paper, we demonstrate the principal approaches for anatomical localisation of functional activations based on probabilistic cytoarchitectonic maps by exemplary analysis of an anterior parietal activation evoked by visual presentation of hand gestures. After consideration of the conceptual basis and implementation of volume or local maxima labelling, we comment on some potential interpretational difficulties, limitations and caveats that could be encountered. Extending these methods, we then propose a complementary approach for quantification of structure-function correspondences based on distribution analysis. This approach relates the cytoarchitectonic probabilities observed at a particular functionally defined location to the area-specific null distribution of probabilities across the whole brain (i.e., the full probability map). Importantly, this method avoids the need for a unique classification of voxels to a single cortical area and may increase the comparability between results obtained for different areas. Moreover, as distribution-based labelling quantifies the "central tendency" of an activation with respect to anatomical areas, it will, in combination with the established methods, allow an advanced characterisation of the anatomical substrates of functional activations. Finally, the advantages and disadvantages of the various methods are discussed, focussing on the question of which approach is most appropriate for a particular situation.
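The distribution-based labelling proposed here compares the probability observed at the activation site with the area's whole-brain distribution of probabilities. One way to sketch that comparison is as a percentile rank; the toy probability map below is invented for illustration:

```python
def percentile_rank(observed_p, full_map_probs):
    """Fraction of whole-brain voxels whose cytoarchitectonic probability for
    a given area lies below the probability observed at the activation site --
    a sketch of relating an observation to the areal null distribution."""
    below = sum(1 for p in full_map_probs if p < observed_p)
    return below / len(full_map_probs)

# Hypothetical probability map for one area: most voxels have zero
# probability of belonging to the area, a few have high probability.
full_map = [0.0] * 90 + [0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 0.95, 1.0]
rank = percentile_rank(0.6, full_map)  # 94 of 100 voxels lie below 0.6
```

A high rank says the activation sits in voxels unusually likely to belong to the area, without forcing a unique winner-take-all assignment.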

  16. Quantifying volcanic hazard at Campi Flegrei caldera (Italy) with uncertainty assessment: 2. Pyroclastic density current invasion maps

    NASA Astrophysics Data System (ADS)

    Neri, Augusto; Bevilacqua, Andrea; Esposti Ongaro, Tomaso; Isaia, Roberto; Aspinall, Willy P.; Bisson, Marina; Flandoli, Franco; Baxter, Peter J.; Bertagnini, Antonella; Iannuzzi, Enrico; Orsucci, Simone; Pistolesi, Marco; Rosi, Mauro; Vitale, Stefano

    2015-04-01

Campi Flegrei (CF) is an example of an active caldera containing densely populated settlements at very high risk of pyroclastic density currents (PDCs). We present here an innovative method for assessing background spatial PDC hazard in a caldera setting with probabilistic invasion maps conditional on the occurrence of an explosive event. The method encompasses the probabilistic assessment of potential vent opening positions, derived in the companion paper, combined with inferences about the spatial density distribution of PDC invasion areas from a simplified flow model, informed by reconstruction of deposits from eruptions in the last 15 ka. The flow model describes the PDC kinematics and accounts for main effects of topography on flow propagation. Structured expert elicitation is used to incorporate certain sources of epistemic uncertainty, and a Monte Carlo approach is adopted to produce a set of probabilistic hazard maps for the whole CF area. Our findings show that, in the event of an eruption, almost the entire caldera is exposed to invasion with a mean probability of at least 5%, with peaks greater than 50% in some central areas. Some areas outside the caldera are also exposed to this danger, with mean probabilities of invasion of the order of 5-10%. Our analysis suggests that these probability estimates have location-specific uncertainties which can be substantial. The results prove to be robust with respect to alternative elicitation models and allow the influence on hazard mapping of different sources of uncertainty, and of theoretical and numerical assumptions, to be quantified.
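The Monte Carlo construction of conditional invasion-probability maps can be sketched on a toy 1-D grid: sample a vent location per simulation, mark the cells the flow reaches, and average over simulations. Uniform vent sampling and a fixed run-out are simplifying assumptions; the actual study samples both from elicited distributions and uses a topography-aware flow model:

```python
import random

def invasion_probability(n_cells, n_sims, runout, seed=0):
    """Monte Carlo sketch of a conditional PDC invasion map: sample a vent
    cell uniformly, mark every cell within `runout` cells of it as invaded,
    and report per-cell invasion frequency over all simulations."""
    rng = random.Random(seed)
    counts = [0] * n_cells
    for _ in range(n_sims):
        vent = rng.randrange(n_cells)
        for c in range(max(0, vent - runout), min(n_cells, vent + runout + 1)):
            counts[c] += 1
    return [c / n_sims for c in counts]

pmap = invasion_probability(n_cells=50, n_sims=2000, runout=5)
# Interior cells can be reached from more vent positions than edge cells,
# so their invasion probability is higher.
```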

  17. Effect of time dependence on probabilistic seismic-hazard maps and deaggregation for the central Apennines, Italy

    USGS Publications Warehouse

    Akinci, A.; Galadini, F.; Pantosti, D.; Petersen, M.; Malagnini, L.; Perkins, D.

    2009-01-01

We produce probabilistic seismic-hazard assessments for the central Apennines, Italy, using time-dependent models that are characterized using a Brownian passage time recurrence model. Using aperiodicity parameters (α) of 0.3, 0.5, and 0.7, we examine the sensitivity of the probabilistic ground motion and its deaggregation to this parameter. For the seismic source model we incorporate both smoothed historical seismicity over the area and geological information on faults. We use the maximum magnitude model for the fault sources together with a uniform probability of rupture along the fault (floating fault model), and we model fictitious faults to account for earthquakes that cannot be correlated with known geologic structural segmentation.
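The Brownian passage time model is the inverse-Gaussian distribution with mean recurrence interval μ and aperiodicity α (shape parameter λ = μ/α²); the time-dependent conditional probability of rupture in a coming window follows directly from its CDF. A sketch with illustrative numbers, not the paper's fault parameters:

```python
import math

def norm_cdf(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def bpt_cdf(t, mean, alpha):
    """CDF of the Brownian passage time (inverse Gaussian) distribution with
    mean recurrence `mean` and aperiodicity `alpha` (shape = mean / alpha**2)."""
    lam = mean / alpha ** 2
    a = math.sqrt(lam / t)
    return (norm_cdf(a * (t / mean - 1.0))
            + math.exp(2.0 * lam / mean) * norm_cdf(-a * (t / mean + 1.0)))

def conditional_prob(elapsed, window, mean, alpha):
    """P(rupture within the next `window` years | quiet for `elapsed` years)."""
    f_t = bpt_cdf(elapsed, mean, alpha)
    return (bpt_cdf(elapsed + window, mean, alpha) - f_t) / (1.0 - f_t)

# Illustrative fault only: 1400-yr mean recurrence, alpha = 0.5,
# 700 yr since the last event, 50-yr forecast window.
p = conditional_prob(elapsed=700.0, window=50.0, mean=1400.0, alpha=0.5)
```

Lower α makes the fault more clock-like, concentrating the conditional probability near the mean recurrence time, which is exactly the sensitivity the abstract examines.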

  18. SU-F-I-58: Image Quality Comparisons of Different Motion Magnitudes and TR Values in MR-PET

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patrick, J; Thompson, R; Tavallaei, M

    2016-06-15

Purpose: The aim of this work is to evaluate the accuracy and sensitivity of a respiratory-triggered MR-PET protocol in detecting four different-sized lesions at two different magnitudes of motion, with two different TR values, using a novel PET-MR-CT compatible respiratory motion phantom. Methods: The eight-compartment torso phantom was set up adjacent to the motion stage, which moved four spherical compartments (28, 22, 17, 10 mm diameter) in two separate (1 and 2 cm) linear motion profiles, simulating a 3.5 second respiratory cycle. Scans were acquired on a 3T MR-PET system (Biograph mMR; Siemens Medical Solutions, Germany). MR measurements were taken with: 1) Respiratory-triggered T2-weighted turbo spin echo (BLADE) sequence in coronal orientation, and 2) Real-time balanced steady-state gradient echo sequence (TrueFISP) in coronal and sagittal planes. PET was acquired simultaneously with MR. Sphere geometries and motion profiles were measured and compared with ground truths for T2 BLADE-TSE acquisitions and real-time TrueFISP images. PET quantification and geometry measurements were taken using standardized uptake values and voxel intensity plots, were compared with known values, and were examined alongside MR-based attenuation maps. Contrast and signal-to-noise ratios were also compared for each of the acquisitions as functions of motion range and TR. Results: Comparison of lesion diameters indicates the respiratory-triggered T2 BLADE-TSE was able to maintain geometry within −2 mm for 1 cm motion for both TR values, and within −3.1 mm for TR = 2000 ms at 2 cm motion. Sphere measurements in respiratory-triggered PET images were accurate within +/− 5 mm for both ranges of motion for 28, 22, and 17 mm diameter spheres. Conclusion: Hybrid MR-PET systems show promise in imaging lung cancer in non-compliant patients, with their ability to acquire both modalities simultaneously. 
However, MR-based attenuation maps are still susceptible to motion-derived artifacts, which can affect PET accuracy.

  19. Glucose Metabolic Profile by Visual Assessment Combined with Statistical Parametric Mapping Analysis in Pediatric Patients with Epilepsy.

    PubMed

    Zhu, Yuankai; Feng, Jianhua; Wu, Shuang; Hou, Haifeng; Ji, Jianfeng; Zhang, Kai; Chen, Qing; Chen, Lin; Cheng, Haiying; Gao, Liuyan; Chen, Zexin; Zhang, Hong; Tian, Mei

    2017-08-01

PET with 18F-FDG has been used for presurgical localization of epileptogenic foci; however, in nonsurgical patients, the correlation between cerebral glucose metabolism and clinical severity has not been fully understood. The aim of this study was to evaluate the glucose metabolic profile using 18F-FDG PET/CT imaging in patients with epilepsy. Methods: One hundred pediatric epilepsy patients who underwent 18F-FDG PET/CT, MRI, and electroencephalography examinations were included. Fifteen age-matched controls were also included. 18F-FDG PET images were analyzed by visual assessment combined with statistical parametric mapping (SPM) analysis. The absolute asymmetry index (|AI|) was calculated in patients with regional abnormal glucose metabolism. Results: Visual assessment combined with SPM analysis of 18F-FDG PET images detected more patients with abnormal glucose metabolism than visual assessment only. The |AI| significantly positively correlated with seizure frequency (P < 0.01) but negatively correlated with the time since last seizure (P < 0.01) in patients with abnormal glucose metabolism. The only significant contributing variable to the |AI| was the time since last seizure, in patients both with hypometabolism (P = 0.001) and with hypermetabolism (P = 0.005). For patients with either hypometabolism (P < 0.01) or hypermetabolism (P = 0.209), higher |AI| values were found in those with drug resistance than with seizure remission. In the post-1-y follow-up PET studies, a significant change of |AI| (%) was found in patients with clinical improvement compared with those with persistence or progression (P < 0.01). Conclusion: 18F-FDG PET imaging with visual assessment combined with SPM analysis could provide cerebral glucose metabolic profiles in nonsurgical epilepsy patients. |AI| might be used for evaluation of clinical severity and progress in these patients. 
Patients with a prolonged period of seizure freedom may have more subtle (or no) metabolic abnormalities on PET. The clinical value of PET might be enhanced by timing the scan closer to clinical seizures. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.
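The absolute asymmetry index can be computed as follows, assuming the common definition |AI| = |L − R| / ((L + R)/2) × 100; the paper may use a variant, and the uptake values below are invented:

```python
def asymmetry_index(left, right):
    """Absolute asymmetry index (%) between homologous regional uptake
    values: |L - R| normalised by the mean of the two, times 100."""
    return abs(left - right) / ((left + right) / 2.0) * 100.0

# Hypothetical regional uptake values (arbitrary units).
ai = asymmetry_index(left=8.0, right=10.0)  # 2 / 9 * 100 ~= 22.2%
```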

  20. Confocal Raman microscopy of morphological changes in poly(ethylene terephthalate) film induced by supercritical CO(2).

    PubMed

    Fleming, Oliver S; Kazarian, Sergei G

    2004-04-01

Poly(ethylene terephthalate) (PET) film was exposed to supercritical (sc) CO(2), and confocal Raman microscopy was used to investigate the morphological changes induced. The study evaluates the use of oil and dry objectives in confocal mode to obtain depth profiles of PET film. These results were compared with the data obtained by mapping of the film cross-section. A significant gradient in the degree of crystallinity normal to the surface of the PET film, down to a depth of 60 μm, was observed. The gradients of these morphological changes are functions of exposure time and pressure.

  1. 18F-Choline Positron Emission Tomography/Computed Tomography and Multiparametric Magnetic Resonance Imaging for the Detection of Early Local Recurrence of Prostate Cancer Initially Treated by Radiation Therapy: Comparison With Systematic 3-Dimensional Transperineal Mapping Biopsy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kanoun, Salim, E-mail: Salim.kanoun@gmail.com; LE2I UMR6306, Centre national de la recherche scientifique, Arts et Métiers, Université Bourgogne Franche-Comté, Dijon; MRI Unit, Centre Hospitalier Régional Universitaire, Hôpital François Mitterrand, Dijon

    Purpose: To compare the diagnostic performance of 18F-fluorocholine positron emission tomography/computed tomography (FCH-PET/CT), multiparametric prostate magnetic resonance imaging (mpMRI), and a combination of both techniques for the detection of local recurrence of prostate cancer initially treated by radiation therapy. Methods and Materials: This was a retrospective, single-institution study of 32 patients with suspected prostate cancer recurrence who underwent both FCH-PET/CT and 3T mpMRI within 3 months of one another for the detection of recurrence. All included patients had to be cleared for metastatic recurrence. The reference procedure was systematic 3-dimensional (3D)-transperineal prostate biopsy for the final assessment of local recurrence. Both imaging modalities were analyzed by 2 experienced readers blinded to clinical data. The analysis was performed per patient and per segment using a 4-segment model. Results: The median prostate-specific antigen value at the time of imaging was 2.92 ng/mL. The mean prostate-specific antigen doubling time was 14 months. Of the 32 patients, 31 had a positive 3D-transperineal mapping biopsy for a local relapse. On a patient-based analysis, the detection rate was 71% (22 of 31) for mpMRI and 74% (23 of 31) for FCH-PET/CT. On a segment-based analysis, the sensitivity and specificity were, respectively, 32% and 87% for mpMRI, 34% and 87% for FCH-PET/CT, and 43% and 83% for the combined analysis of both techniques. Accuracy was 64%, 65%, and 66%, respectively. The interobserver agreement was κ = 0.92 for FCH-PET/CT and κ = 0.74 for mpMRI. Conclusions: Both mpMRI and FCH-PET/CT show limited sensitivity but good specificity for the detection of local cancer recurrence after radiation therapy, when compared with 3D-transperineal mapping biopsy. Prostate biopsy still seems to be mandatory to diagnose local relapse and select patients who could benefit from local salvage therapy.
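The per-segment sensitivity, specificity and accuracy figures, and the inter-observer κ, come from standard 2 × 2 contingency tables. A sketch with hypothetical counts (not the study's data):

```python
def diagnostic_stats(tp, fp, fn, tn):
    """Sensitivity, specificity and accuracy from a 2x2 table of
    true/false positives and negatives against the reference standard."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    acc = (tp + tn) / (tp + fp + fn + tn)
    return sens, spec, acc

def cohen_kappa(both_pos, only_a, only_b, both_neg):
    """Cohen's kappa: agreement between two readers beyond chance."""
    n = both_pos + only_a + only_b + both_neg
    po = (both_pos + both_neg) / n  # observed agreement
    pe = ((both_pos + only_a) * (both_pos + only_b)
          + (only_b + both_neg) * (only_a + both_neg)) / n ** 2
    return (po - pe) / (1.0 - pe)

# Hypothetical per-segment counts, for illustration only.
sens, spec, acc = diagnostic_stats(tp=30, fp=10, fn=58, tn=66)
```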

  2. Toward standardized mapping for left atrial analysis and cardiac ablation guidance

    NASA Astrophysics Data System (ADS)

    Rettmann, M. E.; Holmes, D. R.; Linte, C. A.; Packer, D. L.; Robb, R. A.

    2014-03-01

In catheter-based cardiac ablation, the pulmonary vein ostia are important landmarks for guiding the ablation procedure, and for this reason, have been the focus of many studies quantifying their size, structure, and variability. Analysis of pulmonary vein structure, however, has been limited by the lack of a standardized reference space for population-based studies. Standardized maps are important tools for characterizing anatomic variability across subjects with the goal of separating normal inter-subject variability from abnormal variability associated with disease. In this work, we describe a novel technique for computing flat maps of left atrial anatomy in a standardized space. A flat map of left atrial anatomy is created by casting a single ray through the volume and systematically rotating the camera viewpoint to obtain the entire field of view. The technique is validated by assessing preservation of relative surface areas and distances between the original 3D geometry and the flat map geometry. The proposed methodology is demonstrated on 10 subjects, whose maps are subsequently combined to form a probabilistic map of anatomic location for each of the pulmonary vein ostia and the boundary of the left atrial appendage. The probabilistic map demonstrates that the location of the inferior ostia have higher variability than the superior ostia and the variability of the left atrial appendage is similar to the superior pulmonary veins. This technique could also have potential application in mapping electrophysiology data, radio-frequency ablation burns, or treatment planning in cardiac ablation therapy.

  3. A probabilistic seismic model for the European Arctic

    NASA Astrophysics Data System (ADS)

    Hauser, Juerg; Dyer, Kathleen M.; Pasyanos, Michael E.; Bungum, Hilmar; Faleide, Jan I.; Clark, Stephen A.; Schweitzer, Johannes

    2011-01-01

    The development of three-dimensional seismic models for the crust and upper mantle has traditionally focused on finding one model that provides the best fit to the data while observing some regularization constraints. In contrast to this, the inversion employed here fits the data in a probabilistic sense and thus provides a quantitative measure of model uncertainty. Our probabilistic model is based on two sources of information: (1) prior information, which is independent from the data, and (2) different geophysical data sets, including thickness constraints, velocity profiles, gravity data, surface wave group velocities, and regional body wave traveltimes. We use a Markov chain Monte Carlo (MCMC) algorithm to sample models from the prior distribution, the set of plausible models, and test them against the data to generate the posterior distribution, the ensemble of models that fit the data with assigned uncertainties. While being computationally more expensive, such a probabilistic inversion provides a more complete picture of solution space and allows us to combine various data sets. The complex geology of the European Arctic, encompassing oceanic crust, continental shelf regions, rift basins and old cratonic crust, as well as the nonuniform coverage of the region by data with varying degrees of uncertainty, makes it a challenging setting for any imaging technique and, therefore, an ideal environment for demonstrating the practical advantages of a probabilistic approach. Maps of depth to basement and depth to Moho derived from the posterior distribution are in good agreement with previously published maps and interpretations of the regional tectonic setting. The predicted uncertainties, which are as important as the absolute values, correlate well with the variations in data coverage and quality in the region. A practical advantage of our probabilistic model is that it can provide estimates for the uncertainties of observables due to model uncertainties. 
We will demonstrate how this can be used for the formulation of earthquake location algorithms that take model uncertainties into account when estimating location uncertainties.
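Sampling models from the prior and weighting them by data fit, as in the MCMC inversion described above, can be sketched with a minimal Metropolis sampler on a toy one-parameter "Moho depth" posterior. All numbers are illustrative; the actual study samples high-dimensional crustal models against multiple data sets:

```python
import math
import random

def metropolis(log_post, x0, step, n, seed=0):
    """Minimal Metropolis sampler: propose Gaussian perturbations and accept
    with probability min(1, posterior ratio), yielding an ensemble of models
    distributed according to the posterior rather than a single best fit."""
    rng = random.Random(seed)
    samples, x, lp = [], x0, log_post(x0)
    for _ in range(n):
        xp = x + rng.gauss(0.0, step)
        lpp = log_post(xp)
        if rng.random() < math.exp(min(0.0, lpp - lp)):
            x, lp = xp, lpp
        samples.append(x)
    return samples

# Toy posterior: Gaussian "data fit" around a Moho depth of 35 km
# (sigma 2 km) with a flat prior; purely illustrative.
log_post = lambda d: -0.5 * ((d - 35.0) / 2.0) ** 2
chain = metropolis(log_post, x0=30.0, step=1.5, n=5000)
mean_depth = sum(chain[1000:]) / len(chain[1000:])  # discard burn-in
```

The spread of the post-burn-in chain plays the role of the predicted uncertainty: it is what lets maps of depth-to-Moho carry error bars, and what an uncertainty-aware earthquake-location algorithm would propagate.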

  4. Subcortical structure segmentation using probabilistic atlas priors

    NASA Astrophysics Data System (ADS)

    Gouttard, Sylvain; Styner, Martin; Joshi, Sarang; Smith, Rachel G.; Cody Hazlett, Heather; Gerig, Guido

    2007-03-01

    The segmentation of the subcortical structures of the brain is required for many forms of quantitative neuroanatomic analysis. The volumetric and shape parameters of structures such as the lateral ventricles, putamen, caudate, hippocampus, pallidus and amygdala are employed to characterize a disease or its evolution. This paper presents a fully automatic segmentation of these structures via non-rigid registration of a probabilistic atlas prior, alongside a comprehensive validation. Our approach is based on an unbiased diffeomorphic atlas with probabilistic spatial priors built from a training set of MR images with corresponding manual segmentations. The atlas building computes an average image along with transformation fields mapping each training case to the average image. These transformation fields are applied to the manually segmented structures of each case in order to obtain a probabilistic map on the atlas. When applying the atlas for automatic structural segmentation, an MR image is first intensity inhomogeneity corrected, skull stripped and intensity calibrated to the atlas. Then the atlas image is registered to the image using an affine followed by a deformable registration matching the gray-level intensity. Finally, the registration transformation is applied to the probabilistic map of each structure, which is then thresholded at 0.5 probability. Using manual segmentations for comparison, measures of volumetric differences show high correlation with our results. Furthermore, the Dice coefficient, which quantifies the volumetric overlap, is higher than 62% for all structures and is close to 80% for the basal ganglia. The intraclass correlation coefficient computed on these same datasets shows a good inter-method correlation of the volumetric measurements. Using a dataset of a single patient scanned 10 times on 5 different scanners, reliability is shown with a coefficient of variation of less than 2 percent over the whole dataset. 
Overall, these validation and reliability studies show that our method accurately and reliably segments almost all structures. Only the hippocampus and amygdala segmentations exhibit relatively low correlation with the manual segmentation in at least one of the validation studies, though they still show acceptable Dice overlap coefficients.
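    As an illustration (not code from the paper), the final step of such a pipeline — thresholding a registered probability map at 0.5 and scoring the result with the Dice coefficient — can be sketched in a minimal 1-D numpy form; function names here are hypothetical:

```python
import numpy as np

def segment_from_prob_map(warped_prob, threshold=0.5):
    """Binarize a registered probabilistic structure map (the pipeline's last step)."""
    return warped_prob >= threshold

def dice(a, b):
    """Dice overlap between two binary masks, as used in the validation."""
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

# toy 1-D example: warped probability map vs. a manual mask
prob = np.array([0.1, 0.4, 0.6, 0.9, 0.7, 0.2])
auto = segment_from_prob_map(prob)            # [F, F, T, T, T, F]
manual = np.array([0, 1, 1, 1, 1, 0], bool)
print(dice(auto, manual))                     # 0.857...
```

    In practice the thresholding is applied per structure to 3-D maps after the affine plus deformable registration described above.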

  5. Analysis of the French insurance market exposure to floods: a stochastic model combining river overflow and surface runoff

    NASA Astrophysics Data System (ADS)

    Moncoulon, D.; Labat, D.; Ardon, J.; Onfroy, T.; Leblois, E.; Poulard, C.; Aji, S.; Rémy, A.; Quantin, A.

    2013-07-01

    The analysis of flood exposure at a national scale for the French insurance market must combine the generation of a probabilistic event set of all possible but not yet occurred flood situations with hazard and damage modeling. In this study, hazard and damage models are calibrated on a 1995-2012 historical event set, both for hazard results (river flow, flooded areas) and loss estimations. Thus, uncertainties in the deterministic estimation of a single event loss are known before simulating a probabilistic event set. To account for at least 90% of the insured flood losses, the probabilistic event set must combine river overflow (small and large catchments) with the surface runoff due to heavy rainfall on the slopes of the watershed. Indeed, internal studies of the CCR claims database have shown that approximately 45% of the insured flood losses are located inside the floodplains and 45% outside; the remaining 10% are due to sea-surge floods and groundwater rise. In this approach, two independent probabilistic methods are combined to create a single flood loss distribution: generation of synthetic river flows based on the historical records of the river gauge network, and generation of synthetic rain fields on small catchments, calibrated on the 1958-2010 Météo-France rain database SAFRAN. All the events in the probabilistic event sets are simulated with the deterministic model. This hazard and damage distribution is used to simulate the flood losses at the national scale for an insurance company (MACIF) and to generate flood areas associated with hazard return periods. The flood maps cover river overflow and surface water runoff. Validation of these maps is conducted by comparison with address-located claim data on a small catchment (downstream Argens).

  6. Seismic Sources and Recurrence Rates as Adopted by USGS Staff for the Production of the 1982 and 1990 Probabilistic Ground Motion Maps for Alaska and the Conterminous United States

    USGS Publications Warehouse

    Hanson, Stanley L.; Perkins, David M.

    1995-01-01

    The construction of a probabilistic ground-motion hazard map for a region follows a sequence of analyses beginning with the selection of an earthquake catalog and ending with the mapping of calculated probabilistic ground-motion values (Hanson and others, 1992). An integral part of this process is the creation of sources used for the calculation of earthquake recurrence rates and ground motions. These sources consist of areas and lines that are representative of geologic or tectonic features and faults. After the design of the sources, it is necessary to arrange the coordinate points in a particular order compatible with the input format for the SEISRISK-III program (Bender and Perkins, 1987). Source zones are usually modeled as a point-rupture source. Where applicable, linear rupture sources are modeled with articulated lines, representing known faults, or a field of parallel lines, representing a generalized distribution of hypothetical faults. Based on the distribution of earthquakes throughout the individual source zones (or a collection of several sources), earthquake recurrence rates are computed for each of the sources, and a minimum and maximum magnitude is assigned. From 1978 to 1980, several conferences were held by the USGS to solicit information on regions of the United States for the purpose of creating source zones for computation of probabilistic ground motions (Thenhaus, 1983). As a result of these regional meetings and previous work in the Pacific Northwest (Perkins and others, 1980), the California continental shelf (Thenhaus and others, 1980), and the Eastern outer continental shelf (Perkins and others, 1979), a consensus set of source zones was agreed upon and subsequently used to produce a national ground motion hazard map for the United States (Algermissen and others, 1982). 
In this report and on the accompanying disk we provide a complete list of source areas and line sources as used for the 1982 and later 1990 seismic hazard maps for the conterminous U.S. and Alaska. These source zones are represented in the input form required for the hazard program SEISRISK-III, and they include the attenuation table and several other input parameter lines normally found at the beginning of an input data set for SEISRISK-III.

  7. Patterns-of-failure guided biological target volume definition for head and neck cancer patients: FDG-PET and dosimetric analysis of dose escalation candidate subregions.

    PubMed

    Mohamed, Abdallah S R; Cardenas, Carlos E; Garden, Adam S; Awan, Musaddiq J; Rock, Crosby D; Westergaard, Sarah A; Brandon Gunn, G; Belal, Abdelaziz M; El-Gowily, Ahmed G; Lai, Stephen Y; Rosenthal, David I; Fuller, Clifton D; Aristophanous, Michalis

    2017-08-01

    To identify the radio-resistant subvolumes in pretreatment FDG-PET by mapping the spatial location of the origin of tumor recurrence after IMRT for head-and-neck squamous cell cancer to the pretreatment FDG-PET/CT. Patients with local/regional recurrence after IMRT with available FDG-PET/CT and post-failure CT were included. For each patient, both pre-therapy PET/CT and recurrence CT were co-registered with the planning CT (pCT). A 4-mm radius was added to the centroid of mapped recurrence growth target volumes (rGTVs) to create recurrence nidus-volumes (NVs). The overlap between boost-tumor-volumes (BTVs) representing different SUV threshold/margin combinations and NVs was measured. Forty-seven patients were eligible. Forty-two (89.4%) had type A central high-dose failure. Twenty-six (48%) of type A rGTVs were at the primary site and 28 (52%) were at the nodal site. The mean dose of type A rGTVs was 71 Gy. The BTV consisting of 50% of the maximum SUV plus a 10 mm margin was the best subvolume for dose boosting due to high coverage of primary-site NVs (92.3%), low average relative volume to CTV1 (41%), and the least average percent of voxels outside CTV1 (19%). The majority of loco-regional recurrences originate in the regions of central high dose. When correlated with pretreatment FDG-PET, the majority of recurrences originated in an area that would be covered by an additional 10 mm margin on the volume of 50% of the maximum FDG uptake. Copyright © 2017 Elsevier B.V. All rights reserved.
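    The coverage statistic reported above — the fraction of nidus volumes captured by a candidate boost volume — can be sketched as follows. This is an illustrative simplification, not the study's code: here an NV counts as covered if any of its voxels fall inside the BTV, whereas the study's exact criterion may differ.

```python
import numpy as np

def nv_coverage(btv_mask, nv_masks):
    """Percentage of nidus volumes (NVs) that overlap the boost target volume
    (BTV); an NV is 'covered' if any of its voxels lie inside the BTV
    (a simplifying assumption)."""
    covered = sum(1 for nv in nv_masks if np.logical_and(btv_mask, nv).any())
    return 100.0 * covered / len(nv_masks)

# toy 1-D example: one NV inside the BTV, one outside
btv = np.array([1, 1, 1, 0, 0], bool)
nvs = [np.array([0, 1, 0, 0, 0], bool), np.array([0, 0, 0, 0, 1], bool)]
print(nv_coverage(btv, nvs))  # 50.0
```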

  8. Assessment of thermal comfort level at pedestrian level in high-density urban area of Hong Kong

    NASA Astrophysics Data System (ADS)

    Ma, J.; Ng, E.; Yuan, C.; Lai, A.

    2015-12-01

    Hong Kong is a subtropical city that is very hot and humid in the summer. Pedestrians commonly experience thermal discomfort. Various studies have shown that tall, bulky buildings intensify the urban heat island effect and reduce urban air ventilation. However, relatively few studies have focused on modeling the thermal load at pedestrian level (~ 2 m). This study assesses the thermal comfort level, quantified by PET (Physiological Equivalent Temperature), using a GIS-based simulation approach. A thermal comfort level map shows the PET value for a typical summer afternoon in the high-building-density area. For example, the average PET in Sheung Wan is about 41 degrees Celsius on a clear day and 38 degrees Celsius on a cloudy day. This map shows where walkways, colonnades, and greening are most needed. In addition, given a start point, an end point, and weather data, we generate the most comfortable walking routes weighted by PET. In the simulation, shortwave irradiance is calculated using the topographic radiation model (Fu and Rich, 1999) under various cloud cover scenarios; longwave irradiance is calculated based on the radiative transfer equation (Swinbank, 1963). Combining these two factors, Tmrt (mean radiant temperature) is solved. In some cases, Tmrt differs by more than 40 degrees Celsius between areas under the sun and under shade. Considering thermal load and wind information, we found that shading from buildings has a stronger effect on PET than the poor air ventilation resulting from dense buildings. We predict that pedestrians would feel more comfortable (lower PET) on a hot summer afternoon when walking in the higher-building-density area.
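    The step of solving Tmrt from the combined shortwave and longwave fluxes can be illustrated with the standard Stefan-Boltzmann inversion. This is a generic sketch, not the study's radiation model: the emissivity value and the lumping of all absorbed fluxes into one term are simplifying assumptions.

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def mean_radiant_temperature(shortwave_abs, longwave_abs, emissivity=0.97):
    """Invert the Stefan-Boltzmann law to get Tmrt in degrees Celsius from the
    shortwave and longwave radiant flux densities absorbed by the body (W/m^2).
    The emissivity default is a typical assumption, not a value from the study."""
    total = shortwave_abs + longwave_abs
    return (total / (emissivity * SIGMA)) ** 0.25 - 273.15

# a sunlit spot receives extra shortwave irradiance, raising Tmrt sharply
print(mean_radiant_temperature(600.0, 400.0))  # sun
print(mean_radiant_temperature(0.0, 400.0))    # shade
```

    Even this toy case reproduces the qualitative finding: the sun/shade difference in Tmrt dwarfs typical air-temperature differences.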

  9. Evaluation of two methods for using MR information in PET reconstruction

    NASA Astrophysics Data System (ADS)

    Caldeira, L.; Scheins, J.; Almeida, P.; Herzog, H.

    2013-02-01

    Using magnetic resonance (MR) information in maximum a posteriori (MAP) algorithms for positron emission tomography (PET) image reconstruction has been investigated in recent years. Recently, three methods to introduce this information were evaluated and the Bowsher prior was considered the best. Its main advantage is that it does not require image segmentation. Another method that has been widely used for incorporating MR information relies on boundaries obtained by segmentation. This method has also shown improvements in image quality. In this paper, these two methods for incorporating MR information in PET reconstruction are compared. After a Bayes parameter optimization, the reconstructed images were compared using the mean squared error (MSE) and the coefficient of variation (CV). MSE values are 3% lower with the Bowsher prior than with boundaries; CV values are 10% lower with the Bowsher prior than with boundaries. Both methods performed better in terms of MSE and CV than using no prior, that is, maximum likelihood expectation maximization (MLEM) or MAP without anatomic information. In conclusion, incorporating MR information using the Bowsher prior gives better results in terms of MSE and CV than using boundaries. MAP algorithms again proved effective in noise reduction and convergence, especially when MR information is incorporated. The robustness of the priors with respect to noise and inhomogeneities in the MR image, however, still remains to be assessed.
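    The core idea of the Bowsher prior — encourage PET smoothness only across the spatial neighbours whose MR intensity is most similar, so anatomical edges are preserved without segmentation — can be sketched in 1-D. The quadratic potential below is an assumption for illustration; several variants are used in practice.

```python
import numpy as np

def bowsher_neighbours(mr, num_keep=2):
    """Bowsher selection on a 1-D image: for each voxel, keep the `num_keep`
    nearby voxels whose MR intensity is most similar."""
    n = len(mr)
    keep = []
    for i in range(n):
        nbrs = [j for j in (i - 2, i - 1, i + 1, i + 2) if 0 <= j < n]
        nbrs.sort(key=lambda j: abs(mr[j] - mr[i]))
        keep.append(nbrs[:num_keep])
    return keep

def bowsher_penalty(pet, keep):
    """Quadratic smoothing penalty restricted to the selected neighbours."""
    return sum((pet[i] - pet[j]) ** 2 for i, nbrs in enumerate(keep) for j in nbrs)

# a sharp MR edge between voxels 2 and 3: neighbours are chosen within each
# side of the edge, so the PET penalty never smooths across it
mr = [1.0, 1.0, 1.0, 5.0, 5.0, 5.0]
keep = bowsher_neighbours(mr)
print(keep[2])  # [0, 1] -- voxel 2's MR-similar neighbours lie left of the edge
```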

  10. Absolute Cerebral Blood Flow Infarction Threshold for 3-Hour Ischemia Time Determined with CT Perfusion and 18F-FFMZ-PET Imaging in a Porcine Model of Cerebral Ischemia

    PubMed Central

    Cockburn, Neil; Kovacs, Michael

    2016-01-01

    CT Perfusion (CTP)-derived cerebral blood flow (CBF) thresholds have been proposed as the optimal parameter for distinguishing the infarct core prior to reperfusion. Previous threshold-derivation studies have been limited by uncertainties introduced by infarct expansion between the acute phase of stroke and follow-up imaging, or DWI lesion reversibility. In this study, a model is proposed for determining the infarction CBF threshold at 3 h ischemia time by comparing contemporaneously acquired CTP-derived CBF maps to 18F-FFMZ-PET imaging. Endothelin-1 (ET-1) was injected into the brain of Duroc-Cross pigs (n = 11) through a burr hole in the skull. CTP images were acquired 10 and 30 minutes post ET-1 injection and then every 30 minutes for 150 minutes. 370 MBq of 18F-FFMZ was injected ~120 minutes post ET-1 injection and PET images were acquired for 25 minutes starting ~155–180 minutes post ET-1 injection. CBF maps from each CTP acquisition were co-registered and converted into a median CBF map. The median CBF map was co-registered to blood volume maps for vessel exclusion, an average CT image for grey/white matter segmentation, and 18F-FFMZ-PET images for infarct delineation. Logistic regression and ROC analysis were performed on infarcted and non-infarcted pixel CBF values for each animal that developed infarction. Six of the eleven animals developed infarction. The mean CBF value corresponding to the optimal operating point of the ROC curves for the 6 animals was 12.6 ± 2.8 mL·min-1·100g-1 for infarction after 3 hours of ischemia. The porcine ET-1 model of cerebral ischemia is easier to implement than other large animal models of stroke, and performs similarly as long as CBF is monitored using CTP to prevent reperfusion. PMID:27347877
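    One common way to define the "optimal operating point" of an ROC curve, as used for threshold derivation here, is to maximise Youden's J = sensitivity + specificity - 1. The sketch below is illustrative (the study's exact operating-point definition is not stated in the abstract); infarcted tissue is assumed to have low CBF.

```python
def cbf_threshold_youden(infarct_cbf, healthy_cbf):
    """Pick the CBF cut-off (mL/min/100 g) maximising Youden's J, one common
    definition of the ROC optimal operating point."""
    best_t, best_j = None, -1.0
    for t in sorted(set(infarct_cbf) | set(healthy_cbf)):
        sens = sum(c <= t for c in infarct_cbf) / len(infarct_cbf)  # low CBF = infarct
        spec = sum(c > t for c in healthy_cbf) / len(healthy_cbf)
        if sens + spec - 1 > best_j:
            best_t, best_j = t, sens + spec - 1
    return best_t

# toy pixel samples (mL/min/100 g): infarcted vs. non-infarcted tissue
print(cbf_threshold_youden([8, 10, 12, 14], [18, 22, 30, 35]))  # 14
```

    In the study this per-animal threshold search was paired with logistic regression, and the thresholds were averaged across the six animals that infarcted.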

  11. Fast algorithm for probabilistic bone edge detection (FAPBED)

    NASA Astrophysics Data System (ADS)

    Scepanovic, Danilo; Kirshtein, Joshua; Jain, Ameet K.; Taylor, Russell H.

    2005-04-01

    The registration of preoperative CT to intra-operative reality systems is a crucial step in Computer Assisted Orthopedic Surgery (CAOS). The intra-operative sensors include 3D digitizers, fiducials, X-rays and Ultrasound (US). FAPBED is designed to process CT volumes for registration to tracked US data. Tracked US is advantageous because it is real time, noninvasive, and non-ionizing, but it is also known to have inherent inaccuracies, which create the need to develop a framework that is robust to various uncertainties and can be useful in US-CT registration. Furthermore, conventional registration methods depend on accurate and absolute segmentation. Our proposed probabilistic framework addresses the segmentation-registration duality, wherein exact segmentation is not a prerequisite to achieve accurate registration. In this paper, we develop a method for fast and automatic probabilistic bone surface (edge) detection in CT images. Various features that influence the likelihood of the surface at each spatial coordinate are combined using a simple probabilistic framework, which strikes a fair balance between a high-level understanding of features in an image and the low-level number crunching of standard image processing techniques. The algorithm evaluates different features for detecting the probability of a bone surface at each voxel, and compounds the results of these methods to yield a final, low-noise, probability map of bone surfaces in the volume. Such a probability map can then be used in conjunction with a similar map from tracked intra-operative US to achieve accurate registration. Eight sample pelvic CT scans were used to extract feature parameters and validate the final probability maps. An unoptimized, fully automatic Matlab implementation runs in five minutes per CT volume on average and was validated against hand-segmented gold standards. 
The mean probability assigned to nonzero surface points was 0.8, while nonzero non-surface points had a mean value of 0.38, indicating clear identification of surface points on average. The segmentation was also sufficiently crisp, with a full width at half maximum (FWHM) value of 1.51 voxels.
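    The "compounding" of per-feature probabilities into one low-noise map can be illustrated with a weighted geometric mean, one plausible combination rule; the paper's exact scheme may differ, and the feature names below are hypothetical.

```python
import numpy as np

def combine_features(feature_probs, weights=None):
    """Compound per-feature bone-surface probability maps into a single map
    via a weighted geometric mean (an illustrative assumption, not the
    paper's stated rule)."""
    probs = np.stack(feature_probs)
    if weights is None:
        weights = np.full(len(feature_probs), 1.0 / len(feature_probs))
    w = np.asarray(weights)[:, None]
    # clip avoids log(0); a voxel any feature rules out stays near zero
    return np.exp((w * np.log(np.clip(probs, 1e-9, 1.0))).sum(axis=0))

# two hypothetical features agreeing on a strong surface at the third voxel
intensity_grad = np.array([0.1, 0.2, 0.9, 0.3])
laplacian_resp = np.array([0.2, 0.1, 0.8, 0.2])
print(combine_features([intensity_grad, laplacian_resp]))
```

    A geometric mean is conservative: a voxel only scores high when all features agree, which is one way to obtain the low-noise behavior described above.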

  12. Quantitative Susceptibility Mapping of Amyloid-β Aggregates in Alzheimer's Disease with 7T MR.

    PubMed

    Tiepolt, Solveig; Schäfer, Andreas; Rullmann, Michael; Roggenhofer, Elisabeth; Gertz, Hermann-Josef; Schroeter, Matthias L; Patt, Marianne; Bazin, Pierre-Louis; Jochimsen, Thies H; Turner, Robert; Sabri, Osama; Barthel, Henryk

    2018-05-28

    PET imaging is an established technique to detect cerebral amyloid-β (Aβ) plaques in vivo. Some preclinical and postmortem data report an accumulation of redox-active iron near Aβ plaques. Quantitative susceptibility mapping (QSM) at high-field MRI enables iron deposits to be depicted with high spatial resolution. The aim of this study was to examine whether iron and Aβ plaque accumulation are related and, thus, whether 7T MRI might be an additive diagnostic tool to Aβ PET imaging. Postmortem human Alzheimer's disease (AD) and healthy control (HC) frontal gray matter (GM) was imaged with 7T MRI, yielding T1 maps and QSM. Aβ plaque load was determined by histopathology. In vivo, 10 Aβ PET-positive AD patients (74.1 ± 6.0 years) and 10 Aβ PET-negative HCs (67.1 ± 4.4 years) underwent 7T MR examination and QSM maps were analyzed. Severity of cognitive deficits was determined by MMSE. Postmortem, the susceptibility values of Aβ plaque-containing GM were higher than those of Aβ plaque-free GM (0.011±0.002 versus -0.008±0.003 ppm, p < 0.001). In vivo, only the bilateral globus pallidus showed significantly higher susceptibility in AD patients compared to HCs (right: 0.277±0.018 versus -0.009±0.009 ppm; left: 0.293±0.014 versus -0.007±0.012 ppm, p < 0.0001). The pallidal QSM values were negatively correlated with those of the MMSE (r = -0.69, p = 0.001). The postmortem study revealed significant susceptibility differences between the Aβ plaque-containing and Aβ plaque-free GM, whereas in vivo only the QSM values of the globus pallidus differed significantly between the AD and HC groups. The pallidal QSM values correlated with the severity of cognitive deficits. These findings encourage efforts to optimize the 7T-QSM methodology.

  13. Spatial forecasting of disease risk and uncertainty

    USGS Publications Warehouse

    De Cola, L.

    2002-01-01

    Because maps typically represent the value of a single variable over 2-dimensional space, cartographers must simplify the display of multiscale complexity, temporal dynamics, and underlying uncertainty. A choropleth disease risk map based on data for polygonal regions might depict incidence (cases per 100,000 people) within each polygon for a year but ignore the uncertainty that results from finer-scale variation, generalization, misreporting, small numbers, and future unknowns. In response to such limitations, this paper reports on the bivariate mapping of data "quantity" and "quality" of Lyme disease forecasts for states of the United States. Historical state data for 1990-2000 are used in an autoregressive model to forecast 2001-2010 disease incidence and a probability index of confidence, each of which is then kriged to provide two spatial grids representing continuous values over the nation. A single bivariate map is produced from the combination of the incidence grid (using a blue-to-red hue spectrum), and a probabilistic confidence grid (used to control the saturation of the hue at each grid cell). The resultant maps are easily interpretable, and the approach may be applied to such problems as detecting unusual disease occurrences, visualizing past and future incidence, and assembling a consistent regional disease atlas showing patterns of forecasted risks in light of probabilistic confidence.
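    The bivariate color scheme described above — incidence on a blue-to-red hue axis, confidence controlling saturation — maps naturally onto HSV color space. A minimal sketch (the paper's actual color ramp parameters are not given in the abstract):

```python
import colorsys

def bivariate_color(incidence, confidence, inc_max):
    """Map incidence to a blue-to-red hue and forecast confidence to
    saturation, returning an RGB triple in [0, 1]."""
    frac = min(max(incidence / inc_max, 0.0), 1.0)
    hue = (1.0 - frac) * (2.0 / 3.0)   # 2/3 = blue ... 0 = red
    sat = min(max(confidence, 0.0), 1.0)
    return colorsys.hsv_to_rgb(hue, sat, 1.0)

# high incidence + high confidence -> saturated red;
# low confidence washes the same cell out toward white
print(bivariate_color(10.0, 1.0, 10.0))  # (1.0, 0.0, 0.0)
print(bivariate_color(10.0, 0.0, 10.0))  # (1.0, 1.0, 1.0)
```

    Desaturating low-confidence cells is what makes the uncertainty readable at a glance: a vivid cell is both high-risk and well-supported by the forecast model.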

  14. A Software Tool for Quantitative Seismicity Analysis - ZMAP

    NASA Astrophysics Data System (ADS)

    Wiemer, S.; Gerstenberger, M.

    2001-12-01

    Earthquake catalogs are probably the most basic product of seismology, and remain arguably the most useful for tectonic studies. Modern seismograph networks can locate up to 100,000 earthquakes annually, providing a continuous and sometimes overwhelming stream of data. ZMAP is a set of tools driven by a graphical user interface (GUI), designed to help seismologists analyze catalog data. ZMAP is primarily a research tool suited to the evaluation of catalog quality and to addressing specific hypotheses; however, it can also be useful in routine network operations. Examples of ZMAP features include catalog quality assessment (artifacts, completeness, explosion contamination), interactive data exploration, mapping transients in seismicity (rate changes, b-values, p-values), fractal dimension analysis and stress tensor inversions. Roughly 100 scientists worldwide have used the software at least occasionally, and about 30 peer-reviewed publications have made use of ZMAP. The ZMAP code is open source, written in Matlab (The MathWorks), a commercial language widely used in the natural sciences. ZMAP was first published in 1994 and has continued to grow over the past 7 years; recently, we released ZMAP v.6. The poster will introduce the features of ZMAP, focusing specifically on those related to time-dependent probabilistic hazard assessment. We are currently implementing a ZMAP-based system that computes probabilistic hazard maps, combining the stationary background hazard with aftershock and foreshock hazard into a comprehensive time-dependent probabilistic hazard map. These maps will be displayed in near real time on the Internet. This poster is also intended as a forum for ZMAP users to provide feedback and discuss the future of ZMAP.
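    The b-value mapping mentioned above rests on the Gutenberg-Richter relation; the standard maximum-likelihood estimator (Aki, 1965) is a one-liner. This sketch omits the magnitude-binning correction that production codes such as ZMAP typically apply:

```python
import math

def b_value(magnitudes, mc):
    """Aki's maximum-likelihood Gutenberg-Richter b-value for events at or
    above the completeness magnitude mc: b = log10(e) / (mean(M) - mc).
    No binning correction is applied here (a simplification)."""
    mags = [m for m in magnitudes if m >= mc]
    return math.log10(math.e) / (sum(mags) / len(mags) - mc)

# a catalog whose mean magnitude sits ~0.43 units above mc has b close to 1
print(round(b_value([2.1, 2.3, 2.5, 2.8, 1.5], mc=2.0), 2))  # 1.02
```

    Mapping this estimator over a sliding spatial window of catalog events is the essence of ZMAP's b-value transient maps.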

  15. Probabilistic Hazards Outlook

    Science.gov Websites

    Probabilistic Hazards Outlook graphics are issued alongside Categorical Outlooks for Days 3-7 and 8-14, with static maps providing the most up-to-date graphics. Synopsis (May 25, 2018): The summer season is expected to move in quickly for much of the contiguous United States.

  16. GIS data for the Seaside, Oregon, Tsunami Pilot Study to modernize FEMA flood hazard maps

    USGS Publications Warehouse

    Wong, Florence L.; Venturato, Angie J.; Geist, Eric L.

    2007-01-01

    A Tsunami Pilot Study was conducted for the area surrounding the coastal town of Seaside, Oregon, as part of the Federal Emergency Management Agency's (FEMA) Flood Insurance Rate Map Modernization Program (Tsunami Pilot Study Working Group, 2006). The Cascadia subduction zone extends from Cape Mendocino, California, to Vancouver Island, Canada. The Seaside area was chosen because it is typical of many coastal communities subject to tsunamis generated by far- and near-field (Cascadia) earthquakes. Two goals of the pilot study were to develop probabilistic 100-year and 500-year tsunami inundation maps using Probabilistic Tsunami Hazard Analysis (PTHA) and to provide recommendations for improving tsunami hazard assessment guidelines for FEMA and state and local agencies. The study was an interagency effort by the National Oceanic and Atmospheric Administration, U.S. Geological Survey, and FEMA, in collaboration with the University of Southern California, Middle East Technical University, Portland State University, Horning Geoscience, Northwest Hydraulics Consultants, and the Oregon Department of Geological and Mineral Industries. The pilot study model data and results are published separately as a geographic information systems (GIS) data report (Wong and others, 2006). The flood maps and GIS data are briefly described here.

  17. Quantifying the uncertainty in site amplification modeling and its effects on site-specific seismic-hazard estimation in the upper Mississippi embayment and adjacent areas

    USGS Publications Warehouse

    Cramer, C.H.

    2006-01-01

    The Mississippi embayment, located in the central United States, and its thick deposits of sediments (over 1 km in places) have a large effect on earthquake ground motions. Several previous studies have addressed how these thick sediments might modify probabilistic seismic-hazard maps. The high seismic hazard associated with the New Madrid seismic zone makes it particularly important to quantify the uncertainty in modeling site amplification to better represent earthquake hazard in seismic-hazard maps. The methodology of the Memphis urban seismic-hazard-mapping project (Cramer et al., 2004) is combined with the reference profile approach of Toro and Silva (2001) to better estimate seismic hazard in the Mississippi embayment. Improvements over previous approaches include using the 2002 national seismic-hazard model, fully probabilistic hazard calculations, calibration of site amplification with improved nonlinear soil-response estimates, and estimates of uncertainty. Comparisons are made with the results of several previous studies, and estimates of uncertainty inherent in site-amplification modeling for the upper Mississippi embayment are developed. I present new seismic-hazard maps for the upper Mississippi embayment with the effects of site geology incorporating these uncertainties.

  18. Mapping flood hazards under uncertainty through probabilistic flood inundation maps

    NASA Astrophysics Data System (ADS)

    Stephens, T.; Bledsoe, B. P.; Miller, A. J.; Lee, G.

    2017-12-01

    Changing precipitation, rapid urbanization, and population growth interact to create unprecedented challenges for flood mitigation and management. Standard methods for estimating risk from flood inundation maps generally involve simulations of floodplain hydraulics for an established regulatory discharge of specified frequency. Hydraulic model results are then geospatially mapped and depicted as a discrete boundary of flood extents and a binary representation of the probability of inundation (in or out) that is assumed constant over a project's lifetime. Consequently, existing methods utilized to define flood hazards and assess risk management are hindered by deterministic approaches that assume stationarity in a nonstationary world, failing to account for spatio-temporal variability of climate and land use as they translate to hydraulic models. This presentation outlines novel techniques for portraying flood hazards and the results of multiple flood inundation maps spanning hydroclimatic regions. Flood inundation maps generated through modeling of floodplain hydraulics are probabilistic reflecting uncertainty quantified through Monte-Carlo analyses of model inputs and parameters under current and future scenarios. The likelihood of inundation and range of variability in flood extents resulting from Monte-Carlo simulations are then compared with deterministic evaluations of flood hazards from current regulatory flood hazard maps. By facilitating alternative approaches of portraying flood hazards, the novel techniques described in this presentation can contribute to a shifting paradigm in flood management that acknowledges the inherent uncertainty in model estimates and the nonstationary behavior of land use and climate.
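    The move from a binary in/out flood boundary to a probabilistic inundation map reduces, in its simplest form, to counting how often each cell floods across Monte-Carlo realizations of the hydraulic model. A minimal numpy sketch (depth grids and the wet/dry threshold are illustrative):

```python
import numpy as np

def inundation_probability(depth_runs, threshold=0.0):
    """Per-cell probability of inundation across Monte-Carlo runs: the
    fraction of simulated depth grids in which each cell's water depth
    exceeds `threshold`."""
    stack = np.stack(depth_runs)          # shape (n_runs, rows, cols)
    return (stack > threshold).mean(axis=0)

# three hydraulic-model realizations of a 1 x 3 floodplain (depths in m)
runs = [np.array([[0.0, 0.4, 1.2]]),
        np.array([[0.0, 0.0, 0.8]]),
        np.array([[0.1, 0.6, 1.5]])]
print(inundation_probability(runs))  # [[0.333..., 0.666..., 1.0]]
```

    A regulatory map would draw a single contour through this field; the probabilistic map instead exposes the gradient of likelihood that the Monte-Carlo analysis quantifies.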

  19. Quantitative assessment of dynamic PET imaging data in cancer imaging.

    PubMed

    Muzi, Mark; O'Sullivan, Finbarr; Mankoff, David A; Doot, Robert K; Pierce, Larry A; Kurland, Brenda F; Linden, Hannah M; Kinahan, Paul E

    2012-11-01

    Clinical imaging in positron emission tomography (PET) is often performed using single-time-point estimates of tracer uptake or static imaging that provides a spatial map of regional tracer concentration. However, dynamic tracer imaging can provide considerably more information about in vivo biology by delineating both the temporal and spatial pattern of tracer uptake. In addition, several potential sources of error that occur in static imaging can be mitigated. This review focuses on the application of dynamic PET imaging to measuring regional cancer biologic features and especially on using dynamic PET imaging for quantitative therapeutic response monitoring in cancer clinical trials. Dynamic PET imaging output parameters, particularly transport (flow) and overall metabolic rate, have provided imaging end points for clinical trials at single-center institutions for years. However, dynamic imaging poses many challenges for multicenter clinical trial implementation, ranging from cross-center calibration to the lack of a common informatics infrastructure. Underlying principles and methodology of PET dynamic imaging are first reviewed, followed by an examination of current approaches to dynamic PET image analysis with a specific case example of dynamic fluorothymidine imaging to illustrate the approach. Copyright © 2012 Elsevier Inc. All rights reserved.
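    One standard way the "overall metabolic rate" is extracted from dynamic frames is Patlak graphical analysis, where the slope of a linearized plot estimates the net uptake rate Ki. The sketch below is a generic illustration (one approach among the kinetic methods such reviews cover), assuming irreversible trapping and frames in the linear regime:

```python
import numpy as np

def patlak_ki(t, ct, cp):
    """Patlak analysis: regress C_t/C_p against (integral of C_p)/C_p; the
    slope estimates the net uptake rate Ki. Assumes irreversible trapping
    and that all frames lie in the linear regime (a simplification)."""
    # running trapezoidal integral of the plasma input function
    cum_cp = np.concatenate(([0.0], np.cumsum(np.diff(t) * (cp[1:] + cp[:-1]) / 2.0)))
    x = cum_cp / cp
    y = ct / cp
    slope, _ = np.polyfit(x, y, 1)
    return slope

# synthetic frames with constant plasma input: C_t = Ki*t + Vb with Ki = 0.1
t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
cp = np.ones_like(t)
print(round(patlak_ki(t, 0.1 * t + 0.5, cp), 3))  # 0.1
```

    Voxel-wise application of such a fit yields the parametric flux maps that static SUV imaging cannot provide.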

  20. An algorithm for longitudinal registration of PET/CT images acquired during neoadjuvant chemotherapy in breast cancer: preliminary results.

    PubMed

    Li, Xia; Abramson, Richard G; Arlinghaus, Lori R; Chakravarthy, Anuradha Bapsi; Abramson, Vandana; Mayer, Ingrid; Farley, Jaime; Delbeke, Dominique; Yankeelov, Thomas E

    2012-11-16

    By providing estimates of tumor glucose metabolism, 18F-fluorodeoxyglucose positron emission tomography (FDG-PET) can potentially characterize the response of breast tumors to treatment. To assess therapy response, serial measurements of FDG-PET parameters (derived from static and/or dynamic images) can be obtained at different time points during the course of treatment. However, most studies track the changes in average parameter values obtained from the whole tumor, thereby discarding all spatial information manifested in tumor heterogeneity. Here, we propose a method whereby serially acquired FDG-PET breast data sets can be spatially co-registered to enable the spatial comparison of parameter maps at the voxel level. The goal is to optimally register normal tissues while simultaneously preventing tumor distortion. In order to accomplish this, we constructed a PET support device to enable PET/CT imaging of the breasts of ten patients in the prone position and applied a mutual information-based rigid body registration followed by a non-rigid registration. The non-rigid registration algorithm extended the adaptive bases algorithm (ABA) by incorporating a tumor volume-preserving constraint, which computed the Jacobian determinant over the tumor regions as outlined on the PET/CT images, into the cost function. We tested this approach on ten breast cancer patients undergoing neoadjuvant chemotherapy. 
By both qualitative and quantitative evaluation, our constrained algorithm yielded significantly less tumor distortion than the unconstrained algorithm: considering the tumor volume determined from standard uptake value maps, the post-registration median tumor volume changes (25th and 75th percentiles in parentheses) were 3.42% (0%, 13.39%) and 16.93% (9.21%, 49.93%) for the constrained and unconstrained algorithms, respectively (p = 0.002), while the bending energy (a measure of the smoothness of the deformation) was 0.0015 (0.0005, 0.012) and 0.017 (0.005, 0.044), respectively (p = 0.005). The results indicate that the constrained ABA algorithm can accurately align prone breast FDG-PET images acquired at different time points while keeping the tumor from being substantially compressed or distorted. Trial registration: NCT00474604.
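    The quantity behind the volume-preserving constraint is the Jacobian determinant of the deformation: values near 1 over the tumor mean the registration neither compresses nor expands it. A 2-D finite-difference sketch (the paper works with 3-D fields inside the ABA cost function; this standalone check is illustrative):

```python
import numpy as np

def mean_jacobian(def_field, mask):
    """Mean Jacobian determinant of a 2-D deformation field over a region
    mask. def_field is a (2, rows, cols) array of mapped coordinates (not
    displacements); a mean near 1 indicates the region's volume is preserved."""
    d0_r, d0_c = np.gradient(def_field[0])   # derivatives of the row coordinate
    d1_r, d1_c = np.gradient(def_field[1])   # derivatives of the column coordinate
    jac = d0_r * d1_c - d0_c * d1_r
    return jac[mask].mean()

# the identity transform changes no volume: mean Jacobian determinant is 1
rows, cols = np.mgrid[0:5, 0:5].astype(float)
identity = np.stack([rows, cols])
print(mean_jacobian(identity, np.ones((5, 5), bool)))  # 1.0
```

    Penalizing the deviation of this determinant from 1 over the tumor region, while leaving normal tissue unconstrained, is the idea the constrained cost function implements.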

  1. A generative probabilistic model and discriminative extensions for brain lesion segmentation – with application to tumor and stroke

    PubMed Central

    Menze, Bjoern H.; Van Leemput, Koen; Lashkari, Danial; Riklin-Raviv, Tammy; Geremia, Ezequiel; Alberts, Esther; Gruber, Philipp; Wegener, Susanne; Weber, Marc-André; Székely, Gabor; Ayache, Nicholas; Golland, Polina

    2016-01-01

    We introduce a generative probabilistic model for segmentation of brain lesions in multi-dimensional images that generalizes the EM segmenter, a common approach for modelling brain images using Gaussian mixtures and a probabilistic tissue atlas that employs expectation-maximization (EM) to estimate the label map for a new image. Our model augments the probabilistic atlas of the healthy tissues with a latent atlas of the lesion. We derive an estimation algorithm with closed-form EM update equations. The method extracts a latent atlas prior distribution and the lesion posterior distributions jointly from the image data. It delineates lesion areas individually in each channel, allowing for differences in lesion appearance across modalities, an important feature of many brain tumor imaging sequences. We also propose discriminative model extensions to map the output of the generative model to arbitrary labels with semantic and biological meaning, such as “tumor core” or “fluid-filled structure”, but without a one-to-one correspondence to the hypo- or hyper-intense lesion areas identified by the generative model. We test the approach in two image sets: the publicly available BRATS set of glioma patient scans, and multimodal brain images of patients with acute and subacute ischemic stroke. We find the generative model that has been designed for tumor lesions to generalize well to stroke images, and the generative-discriminative model to be one of the top ranking methods in the BRATS evaluation. PMID:26599702
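    The "EM segmenter" idea this model generalizes — posteriors proportional to an atlas prior times a Gaussian intensity likelihood, alternated with parameter updates — can be sketched for a single channel. This is a toy illustration, not the paper's algorithm: variances and atlas registration are held fixed, and only the means are updated.

```python
import numpy as np

def em_step(intensities, atlas_priors, means, var):
    """One iteration of an atlas-driven Gaussian mixture segmenter.
    E-step: label posteriors proportional to atlas prior times Gaussian
    likelihood. M-step: posterior-weighted update of the class means.
    atlas_priors: (n_classes, n_voxels); means: (n_classes,)."""
    lik = np.exp(-0.5 * (intensities[None, :] - means[:, None]) ** 2 / var)
    post = atlas_priors * lik
    post = post / post.sum(axis=0, keepdims=True)                     # E-step
    new_means = (post * intensities).sum(axis=1) / post.sum(axis=1)   # M-step
    return post, new_means

# two tissue classes, a flat atlas prior, and well-separated intensities
intens = np.array([0.0, 0.1, 9.9, 10.0])
priors = np.full((2, 4), 0.5)
post, means = em_step(intens, priors, np.array([1.0, 9.0]), var=1.0)
print(means)  # approx [0.05, 9.95]
```

    The paper's contribution sits on top of this loop: a latent lesion atlas estimated jointly with the posteriors, with closed-form EM updates across channels.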

  2. A Generative Probabilistic Model and Discriminative Extensions for Brain Lesion Segmentation--With Application to Tumor and Stroke.

    PubMed

    Menze, Bjoern H; Van Leemput, Koen; Lashkari, Danial; Riklin-Raviv, Tammy; Geremia, Ezequiel; Alberts, Esther; Gruber, Philipp; Wegener, Susanne; Weber, Marc-Andre; Szekely, Gabor; Ayache, Nicholas; Golland, Polina

    2016-04-01

    We introduce a generative probabilistic model for segmentation of brain lesions in multi-dimensional images that generalizes the EM segmenter, a common approach for modelling brain images using Gaussian mixtures and a probabilistic tissue atlas that employs expectation-maximization (EM), to estimate the label map for a new image. Our model augments the probabilistic atlas of the healthy tissues with a latent atlas of the lesion. We derive an estimation algorithm with closed-form EM update equations. The method extracts a latent atlas prior distribution and the lesion posterior distributions jointly from the image data. It delineates lesion areas individually in each channel, allowing for differences in lesion appearance across modalities, an important feature of many brain tumor imaging sequences. We also propose discriminative model extensions to map the output of the generative model to arbitrary labels with semantic and biological meaning, such as "tumor core" or "fluid-filled structure", but without a one-to-one correspondence to the hypo- or hyper-intense lesion areas identified by the generative model. We test the approach in two image sets: the publicly available BRATS set of glioma patient scans, and multimodal brain images of patients with acute and subacute ischemic stroke. We find the generative model that has been designed for tumor lesions to generalize well to stroke images, and the extended generative-discriminative model to be one of the top ranking methods in the BRATS evaluation.
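
    As background for the EM segmenter this model generalizes, the following minimal sketch (not the authors' lesion model; the flattened image, class count, and simplified updates are assumptions for illustration) runs closed-form EM updates for a Gaussian mixture weighted by a voxel-wise atlas prior:

```python
import numpy as np

def em_segment(intensities, atlas_prior, n_iter=50):
    """Minimal EM tissue classifier: a Gaussian mixture whose class
    probabilities are weighted voxel-wise by a spatial atlas prior.
    `intensities`: (n_voxels,) array; `atlas_prior`: (n_voxels, n_classes)
    array of prior tissue probabilities. Illustrative sketch only."""
    n_vox, n_classes = atlas_prior.shape
    # Initialize class means spread across the intensity range
    mu = np.linspace(intensities.min(), intensities.max(), n_classes)
    var = np.full(n_classes, intensities.var() + 1e-6)
    for _ in range(n_iter):
        # E-step: posterior p(class | intensity) ∝ atlas prior × Gaussian likelihood
        lik = np.exp(-0.5 * (intensities[:, None] - mu) ** 2 / var) \
              / np.sqrt(2 * np.pi * var)
        post = atlas_prior * lik
        post /= post.sum(axis=1, keepdims=True)
        # M-step: closed-form updates of class means and variances
        w = post.sum(axis=0)
        mu = (post * intensities[:, None]).sum(axis=0) / w
        var = (post * (intensities[:, None] - mu) ** 2).sum(axis=0) / w + 1e-6
    return post.argmax(axis=1)
```

    With a spatially uniform prior this reduces to a plain Gaussian mixture; the paper's contribution is, in effect, to replace the fixed healthy-tissue prior with a latent lesion atlas estimated jointly from the data.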

  3. Fusion of multi-tracer PET images for dose painting.

    PubMed

    Lelandais, Benoît; Ruan, Su; Denœux, Thierry; Vera, Pierre; Gardin, Isabelle

    2014-10-01

    PET imaging with FluoroDesoxyGlucose (FDG) tracer is clinically used for the definition of Biological Target Volumes (BTVs) for radiotherapy. Recently, new tracers, such as FLuoroThymidine (FLT) or FluoroMisonidazol (FMiso), have been proposed. They provide complementary information for the definition of BTVs. Our aim is to fuse multi-tracer PET images to obtain a good BTV definition and to help the radiation oncologist in dose painting. Noise and the partial volume effect introduce, respectively, uncertainty and imprecision into PET images, making their segmentation and fusion difficult. In this paper, a framework based on Belief Function Theory (BFT) is proposed for the segmentation of BTV from multi-tracer PET images. The first step is based on an extension of the Evidential C-Means (ECM) algorithm, taking advantage of neighboring voxels for dealing with uncertainty and imprecision in each mono-tracer PET image. Then, imprecision and uncertainty are, respectively, reduced using prior knowledge related to defects in the acquisition system and neighborhood information. Finally, a multi-tracer PET image fusion is performed. The results are represented by a set of parametric maps that provide important information for dose painting. Performance is evaluated on PET phantoms and data from patients with lung cancer. Quantitative results show good performance of our method compared with other methods. Copyright © 2014 Elsevier B.V. All rights reserved.
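
    The fusion step rests on Belief Function Theory. As generic background (this is textbook Dempster combination, not the paper's ECM extension), two mass functions over the same frame of discernment can be combined as follows, with masses represented as dicts from focal sets to weights:

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions over the same
    frame. Each mass function is a dict mapping frozenset (focal element)
    to mass. Conflicting mass (empty intersections) is renormalized away.
    Generic BFT sketch, not the paper's evidential clustering variant."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb  # mass assigned to the empty set
    k = 1.0 - conflict
    return {s: v / k for s, v in combined.items()}
```

    For example, combining evidence from two tracers about a voxel belonging to "tumor" (T) or "background" (B) redistributes the conflicting mass and sharpens the belief in T when both sources lean that way.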

  4. Multimodal partial volume correction: Application to [11C]PIB PET/MRI myelin imaging in multiple sclerosis.

    PubMed

    Grecchi, Elisabetta; Veronese, Mattia; Bodini, Benedetta; García-Lorenzo, Daniel; Battaglini, Marco; Stankoff, Bruno; Turkheimer, Federico E

    2017-12-01

    The [11C]PIB PET tracer, originally developed for amyloid imaging, has been recently repurposed to quantify demyelination and remyelination in multiple sclerosis (MS). Myelin PET imaging, however, is limited by its low resolution that deteriorates the quantification accuracy of white matter (WM) lesions. Here, we introduce a novel partial volume correction (PVC) method called Multiresolution-Multimodal Resolution-Recovery (MM-RR), which uses the wavelet transform and a synergistic statistical model to exploit MRI structural images to improve the resolution of [11C]PIB PET myelin imaging. MM-RR performance was tested on a phantom acquisition and in a dataset comprising [11C]PIB PET and MR T1- and T2-weighted images of 8 healthy controls and 20 MS patients. For the control group, the MM-RR PET images showed an average increase of 5.7% in WM uptake while the grey-matter (GM) uptake remained constant, resulting in +31% WM/GM contrast. Furthermore, MM-RR PET binding maps correlated significantly with the mRNA expressions of the most represented proteins in the myelin sheath (R² = 0.57 ± 0.09). In the patient group, MM-RR PET images showed sharper lesion contours and significant improvement in normal-appearing tissue/WM-lesion contrast compared to standard PET (contrast improvement > +40%). These results were consistent with MM-RR performances in phantom experiments.

  5. A statistical method for lung tumor segmentation uncertainty in PET images based on user inference.

    PubMed

    Zheng, Chaojie; Wang, Xiuying; Feng, Dagan

    2015-01-01

    PET has been widely accepted as an effective imaging modality for lung tumor diagnosis and treatment. However, standard criteria for delineating tumor boundary from PET have yet to be developed, largely due to the relatively low quality of PET images, uncertain tumor boundary definition, and the variety of tumor characteristics. In this paper, we propose a statistical solution to segmentation uncertainty on the basis of user inference. We first define the segmentation uncertainty band on the basis of a segmentation probability map constructed with the Random Walks (RW) algorithm; and then, based on the extracted features of the user inference, we use Principal Component Analysis (PCA) to formulate the statistical model for labeling the uncertainty band. We validated our method on 10 lung PET-CT phantom studies from the public RIDER collections [1] and 16 clinical PET studies where tumors were manually delineated by two experienced radiologists. The methods were validated using the Dice similarity coefficient (DSC) to measure the spatial volume overlap. Our method achieved an average DSC of 0.878 ± 0.078 on phantom studies and 0.835 ± 0.039 on clinical studies.
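
    Two of the building blocks mentioned above are easy to sketch: carving an uncertainty band out of a segmentation probability map, and the Dice coefficient used for validation. The band thresholds below are illustrative assumptions, not values from the paper:

```python
import numpy as np

def uncertainty_band(prob_map, lo=0.2, hi=0.8):
    """Split a segmentation probability map (e.g. from Random Walks) into
    confident foreground, confident background, and an uncertainty band
    in between. Thresholds are illustrative, not the paper's values."""
    fg = prob_map >= hi          # confidently tumor
    bg = prob_map <= lo          # confidently background
    band = ~(fg | bg)            # uncertain voxels to be relabeled
    return fg, bg, band

def dice(a, b):
    """Dice similarity coefficient between two binary masks."""
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())
```

    The paper's contribution lies in how the uncertainty band is then labeled (PCA over user-inference features); the band extraction itself is just thresholding of the probability map.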

  6. Direct Parametric Reconstruction With Joint Motion Estimation/Correction for Dynamic Brain PET Data.

    PubMed

    Jiao, Jieqing; Bousse, Alexandre; Thielemans, Kris; Burgos, Ninon; Weston, Philip S J; Schott, Jonathan M; Atkinson, David; Arridge, Simon R; Hutton, Brian F; Markiewicz, Pawel; Ourselin, Sebastien

    2017-01-01

    Direct reconstruction of parametric images from raw photon counts has been shown to improve the quantitative analysis of dynamic positron emission tomography (PET) data. However, it suffers from subject motion, which is inevitable during the typical acquisition time of 1-2 hours. In this work we propose a framework to jointly estimate subject head motion and reconstruct the motion-corrected parametric images directly from raw PET data, so that the effects of distorted tissue-to-voxel mapping due to subject motion can be reduced in reconstructing the parametric images with motion-compensated attenuation correction and spatially aligned temporal PET data. The proposed approach is formulated within the maximum likelihood framework, and efficient solutions are derived for estimating subject motion and kinetic parameters from raw PET photon count data. Results from evaluations on simulated [11C]raclopride data using the Zubal brain phantom and real clinical [18F]florbetapir data of a patient with Alzheimer's disease show that the proposed joint direct parametric reconstruction and motion correction approach can improve the accuracy of quantifying dynamic PET data with large subject motion.

  7. Reconstruction of metabolic pathways by combining probabilistic graphical model-based and knowledge-based methods

    PubMed Central

    2014-01-01

    Automatic reconstruction of metabolic pathways for an organism from genomics and transcriptomics data has been a challenging and important problem in bioinformatics. Traditionally, known reference pathways can be mapped into organism-specific ones based on its genome annotation and protein homology. However, this simple knowledge-based mapping method might produce incomplete pathways and generally cannot predict new relations and reactions. In contrast, ab initio metabolic network construction methods can predict novel reactions and interactions, but their accuracy tends to be low, leading to many false positives. Here we combine existing pathway knowledge and a new ab initio Bayesian probabilistic graphical model together in a novel fashion to improve automatic reconstruction of metabolic networks. Specifically, we built a knowledge database containing known, individual gene/protein interactions and metabolic reactions extracted from existing reference pathways. Known reactions and interactions were then used as constraints for Bayesian network learning methods to predict metabolic pathways. Using individual reactions and interactions extracted from different pathways of many organisms to guide pathway construction is new and improves both the coverage and accuracy of metabolic pathway construction. We applied this probabilistic knowledge-based approach to construct the metabolic networks from yeast gene expression data and compared its results with 62 known metabolic networks in the KEGG database. The experiment showed that the method improved the coverage of metabolic network construction over the traditional reference pathway mapping method and was more accurate than pure ab initio methods. PMID:25374614

  8. Using Concept Mapping and Paraphrasing for Reading Comprehension

    ERIC Educational Resources Information Center

    Marashi, Hamid; Bagheri, Nazanin

    2015-01-01

    This study investigated the comparative impact of two types of teaching techniques, namely concept mapping and paraphrasing, on the reading comprehension of EFL learners. For this purpose, 60 learners of a total number of 90 intermediate learners studying at a language school in Karaj, Iran, were chosen through taking a piloted PET for…

  9. Comparison of the sensitivity and specificity of 5 image sets of dual-energy computed tomography for detecting first-pass myocardial perfusion defects compared with positron emission tomography.

    PubMed

    Li, Wenhuan; Zhu, Xiaolian; Li, Jing; Peng, Cheng; Chen, Nan; Qi, Zhigang; Yang, Qi; Gao, Yan; Zhao, Yang; Sun, Kai; Li, Kuncheng

    2014-12-01

    The sensitivity and specificity of 5 different image sets of dual-energy computed tomography (DECT) for the detection of first-pass myocardial perfusion defects have not systematically been compared using positron emission tomography (PET) as a reference standard. Forty-nine consecutive patients with known or strongly suspected coronary artery disease were prospectively enrolled in our study. Cardiac DECT was performed at rest using a second-generation 128-slice dual-source CT. The DECT data were reconstructed to iodine maps, monoenergetic images, 100 kV images, nonlinearly blended images, and linearly blended images by different postprocessing techniques. The myocardial perfusion defects on DECT images were visually assessed by 5 observers, using the standard 17-segment model. Diagnostic accuracy of the 5 image sets was assessed using nitrogen-13 ammonia PET as the gold standard. Discrimination was quantified using the area under the receiver operating characteristic curve (AUC), and AUCs were compared using the method of DeLong. The DECT and PET examinations were successfully completed in 30 patients, and a total of 90 territories and 510 segments were analyzed. Cardiac PET revealed myocardial perfusion defects in 56 territories (62%) and 209 segments (41%). The AUCs of iodine maps, monoenergetic images, 100 kV images, nonlinearly blended images, and linearly blended images were 0.986, 0.934, 0.913, 0.881, and 0.871, respectively, on a per-territory basis. These values were 0.922, 0.813, 0.779, 0.763, and 0.728, respectively, on a per-segment basis. DECT iodine maps show high sensitivity and specificity and are superior to other DECT image sets for the detection of myocardial perfusion defects in first-pass myocardial perfusion imaging.
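
    The AUC figures above can be understood through the rank-based (Mann-Whitney) formulation of the ROC area: the probability that a randomly chosen positive case scores higher than a randomly chosen negative one. A minimal sketch (generic; the study's DeLong comparison of correlated AUCs is a separate, more involved procedure):

```python
import numpy as np

def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the fraction of (positive, negative) pairs in which the positive
    case receives the higher score, counting ties as half."""
    scores = np.asarray(scores, float)
    labels = np.asarray(labels, bool)
    pos, neg = scores[labels], scores[~labels]
    wins = (pos[:, None] > neg).sum() + 0.5 * (pos[:, None] == neg).sum()
    return wins / (len(pos) * len(neg))
```

    Perfect separation of defect and normal segments yields an AUC of 1.0; a score with no discriminating power yields 0.5.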

  10. Demonstration of accuracy and clinical versatility of mutual information for automatic multimodality image fusion using affine and thin-plate spline warped geometric deformations.

    PubMed

    Meyer, C R; Boes, J L; Kim, B; Bland, P H; Zasadny, K R; Kison, P V; Koral, K; Frey, K A; Wahl, R L

    1997-04-01

    This paper applies and evaluates an automatic mutual information-based registration algorithm across a broad spectrum of multimodal volume data sets. The algorithm requires little or no pre-processing, minimal user input, and easily implements either affine (i.e., linear) or thin-plate spline (TPS) warped registrations. We have evaluated the algorithm in phantom studies as well as in selected cases where few other algorithms could perform as well, if at all, to demonstrate the value of this new method. Pairs of multimodal gray-scale volume data sets were registered by iteratively changing registration parameters to maximize mutual information. Quantitative registration errors were assessed in registrations of a thorax phantom using PET/CT and in the National Library of Medicine's Visible Male using MRI T2-/T1-weighted acquisitions. Registrations of diverse clinical data sets were demonstrated including rotate-translate mapping of PET/MRI brain scans with significant missing data, full affine mapping of thoracic PET/CT and rotate-translate mapping of abdominal SPECT/CT. A five-point TPS-warped registration of thoracic PET/CT is also demonstrated. The registration algorithm converged in times ranging between 3.5 and 31 min for affine clinical registrations and 57 min for TPS warping. Mean error vector lengths for rotate-translate registrations were measured to be subvoxel in phantoms. More importantly, the rotate-translate algorithm performs well even with missing data. The demonstrated clinical fusions are qualitatively excellent at all levels. We conclude that such automatic, rapid, robust algorithms significantly increase the likelihood that multimodality registrations will be routinely used to aid clinical diagnoses and post-therapeutic assessment in the near future.
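
    The objective being maximized is the mutual information of the two images' intensity distributions, typically estimated from a joint histogram. A minimal sketch of that estimator (the bin count and histogram approach are generic assumptions; the paper's optimizer and interpolation details are not reproduced here):

```python
import numpy as np

def mutual_information(img_a, img_b, bins=32):
    """Mutual information of two spatially aligned images, estimated from
    their joint intensity histogram: I(A;B) = sum p(a,b) log[p(a,b)/(p(a)p(b))].
    Registration maximizes this over transform parameters."""
    hist, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    p = hist / hist.sum()                      # joint probability table
    pa, pb = p.sum(axis=1), p.sum(axis=0)      # marginals
    nz = p > 0                                 # avoid log(0)
    return (p[nz] * np.log(p[nz] / (pa[:, None] * pb[None, :])[nz])).sum()
```

    Well-aligned image pairs produce a concentrated joint histogram and high mutual information; misalignment disperses the joint histogram and lowers it, which is what drives the iterative parameter search.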

  11. Mapping of transcription factor binding regions in mammalian cells by ChIP: Comparison of array- and sequencing-based technologies

    PubMed Central

    Euskirchen, Ghia M.; Rozowsky, Joel S.; Wei, Chia-Lin; Lee, Wah Heng; Zhang, Zhengdong D.; Hartman, Stephen; Emanuelsson, Olof; Stolc, Viktor; Weissman, Sherman; Gerstein, Mark B.; Ruan, Yijun; Snyder, Michael

    2007-01-01

    Recent progress in mapping transcription factor (TF) binding regions can largely be credited to chromatin immunoprecipitation (ChIP) technologies. We compared strategies for mapping TF binding regions in mammalian cells using two different ChIP schemes: ChIP with DNA microarray analysis (ChIP-chip) and ChIP with DNA sequencing (ChIP-PET). We first investigated parameters central to obtaining robust ChIP-chip data sets by analyzing STAT1 targets in the ENCODE regions of the human genome, and then compared ChIP-chip to ChIP-PET. We devised methods for scoring and comparing results among various tiling arrays and examined parameters such as DNA microarray format, oligonucleotide length, hybridization conditions, and the use of competitor Cot-1 DNA. The best performance was achieved with high-density oligonucleotide arrays, oligonucleotides ≥50 bases (b), the presence of competitor Cot-1 DNA and hybridizations conducted in microfluidics stations. When target identification was evaluated as a function of array number, 80%–86% of targets were identified with three or more arrays. Comparison of ChIP-chip with ChIP-PET revealed strong agreement for the highest ranked targets with less overlap for the low ranked targets. With advantages and disadvantages unique to each approach, we found that ChIP-chip and ChIP-PET are frequently complementary in their relative abilities to detect STAT1 targets for the lower ranked targets; each method detected validated targets that were missed by the other method. The most comprehensive list of STAT1 binding regions is obtained by merging results from ChIP-chip and ChIP-sequencing. Overall, this study provides information for robust identification, scoring, and validation of TF targets using ChIP-based technologies. PMID:17568005

  12. An open tool for input function estimation and quantification of dynamic PET FDG brain scans.

    PubMed

    Bertrán, Martín; Martínez, Natalia; Carbajal, Guillermo; Fernández, Alicia; Gómez, Álvaro

    2016-08-01

    Positron emission tomography (PET) analysis of clinical studies is mostly restricted to qualitative evaluation. Quantitative analysis of PET studies is highly desirable to be able to compute an objective measurement of the process of interest in order to evaluate treatment response and/or compare patient data. But implementation of quantitative analysis generally requires the determination of the input function: the arterial blood or plasma activity which indicates how much tracer is available for uptake in the brain. The purpose of our work was to share with the community an open software tool that can assist in the estimation of this input function, and the derivation of a quantitative map from the dynamic PET study. Arterial blood sampling during the PET study is the gold standard method to get the input function, but is uncomfortable and risky for the patient so it is rarely used in routine studies. To overcome the lack of a direct input function, different alternatives have been devised and are available in the literature. These alternatives derive the input function from the PET image itself (image-derived input function) or from data gathered from previous similar studies (population-based input function). In this article, we present ongoing work that includes the development of a software tool that integrates several methods with novel strategies for the segmentation of blood pools and parameter estimation. The tool is available as an extension to the 3D Slicer software. Tests on phantoms were conducted in order to validate the implemented methods. We evaluated the segmentation algorithms over a range of acquisition conditions and vasculature size. Input function estimation algorithms were evaluated against ground truth of the phantoms, as well as on their impact over the final quantification map. End-to-end use of the tool yields quantification maps with [Formula: see text] relative error in the estimated influx versus ground truth on phantoms. 
The main contribution of this article is the development of an open-source, free to use tool that encapsulates several well-known methods for the estimation of the input function and the quantification of dynamic PET FDG studies. Some alternative strategies are also proposed and implemented in the tool for the segmentation of blood pools and parameter estimation. The tool was tested on phantoms with encouraging results that suggest that even bloodless estimators could provide a viable alternative to blood sampling for quantification using graphical analysis. The open tool is a promising opportunity for collaboration among investigators and further validation on real studies.
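
    For dynamic FDG studies, the graphical analysis mentioned above is commonly a Patlak plot, whose slope after a settling time t* estimates the influx constant Ki from the tissue curve and the input function. The following is a generic sketch of that idea (the function name, trapezoidal integration, and t* value are illustrative assumptions, not the tool's implementation):

```python
import numpy as np

def patlak_ki(t, tissue, plasma, t_star=10.0):
    """Estimate the influx constant Ki by Patlak graphical analysis:
    for t >= t*, tissue(t)/plasma(t) is approximately linear in
    (integral of plasma from 0 to t)/plasma(t), with slope Ki.
    `t`: sample times; `tissue`, `plasma`: activity curves."""
    # cumulative integral of the plasma (input function) by trapezoid rule
    cum = np.concatenate(
        [[0.0], np.cumsum(np.diff(t) * 0.5 * (plasma[1:] + plasma[:-1]))])
    x = cum / plasma
    y = tissue / plasma
    m = t >= t_star                  # use only the linear late phase
    ki, _intercept = np.polyfit(x[m], y[m], 1)
    return ki
```

    In the tool's setting the plasma curve would come from the estimated input function (image-derived or population-based) rather than arterial sampling.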

  13. Pulmonary imaging using respiratory motion compensated simultaneous PET/MR

    PubMed Central

    Dutta, Joyita; Huang, Chuan; Li, Quanzheng; El Fakhri, Georges

    2015-01-01

    Purpose: Pulmonary positron emission tomography (PET) imaging is confounded by blurring artifacts caused by respiratory motion. These artifacts degrade both image quality and quantitative accuracy. In this paper, the authors present a complete data acquisition and processing framework for respiratory motion compensated image reconstruction (MCIR) using simultaneous whole body PET/magnetic resonance (MR) and validate it through simulation and clinical patient studies. Methods: The authors have developed an MCIR framework based on maximum a posteriori or MAP estimation. For fast acquisition of high quality 4D MR images, the authors developed a novel Golden-angle RAdial Navigated Gradient Echo (GRANGE) pulse sequence and used it in conjunction with sparsity-enforcing k-t FOCUSS reconstruction. The authors use a 1D slice-projection navigator signal encapsulated within this pulse sequence along with a histogram-based gate assignment technique to retrospectively sort the MR and PET data into individual gates. The authors compute deformation fields for each gate via nonrigid registration. The deformation fields are incorporated into the PET data model as well as utilized for generating dynamic attenuation maps. The framework was validated using simulation studies on the 4D XCAT phantom and three clinical patient studies that were performed on the Biograph mMR, a simultaneous whole body PET/MR scanner. Results: The authors compared MCIR (MC) results with ungated (UG) and one-gate (OG) reconstruction results. The XCAT study revealed contrast-to-noise ratio (CNR) improvements for MC relative to UG in the range of 21%–107% for 14 mm diameter lung lesions and 39%–120% for 10 mm diameter lung lesions. A strategy for regularization parameter selection was proposed, validated using XCAT simulations, and applied to the clinical studies. 
The authors’ results show that the MC image yields 19%–190% increase in the CNR of high-intensity features of interest affected by respiratory motion relative to UG and a 6%–51% increase relative to OG. Conclusions: Standalone MR is not the traditional choice for lung scans due to the low proton density, high magnetic susceptibility, and low T2∗ relaxation time in the lungs. By developing and validating this PET/MR pulmonary imaging framework, the authors show that simultaneous PET/MR, unique in its capability of combining structural information from MR with functional information from PET, shows promise in pulmonary imaging. PMID:26133621

  14. Pulmonary imaging using respiratory motion compensated simultaneous PET/MR.

    PubMed

    Dutta, Joyita; Huang, Chuan; Li, Quanzheng; El Fakhri, Georges

    2015-07-01

    Pulmonary positron emission tomography (PET) imaging is confounded by blurring artifacts caused by respiratory motion. These artifacts degrade both image quality and quantitative accuracy. In this paper, the authors present a complete data acquisition and processing framework for respiratory motion compensated image reconstruction (MCIR) using simultaneous whole body PET/magnetic resonance (MR) and validate it through simulation and clinical patient studies. The authors have developed an MCIR framework based on maximum a posteriori or MAP estimation. For fast acquisition of high quality 4D MR images, the authors developed a novel Golden-angle RAdial Navigated Gradient Echo (GRANGE) pulse sequence and used it in conjunction with sparsity-enforcing k-t FOCUSS reconstruction. The authors use a 1D slice-projection navigator signal encapsulated within this pulse sequence along with a histogram-based gate assignment technique to retrospectively sort the MR and PET data into individual gates. The authors compute deformation fields for each gate via nonrigid registration. The deformation fields are incorporated into the PET data model as well as utilized for generating dynamic attenuation maps. The framework was validated using simulation studies on the 4D XCAT phantom and three clinical patient studies that were performed on the Biograph mMR, a simultaneous whole body PET/MR scanner. The authors compared MCIR (MC) results with ungated (UG) and one-gate (OG) reconstruction results. The XCAT study revealed contrast-to-noise ratio (CNR) improvements for MC relative to UG in the range of 21%-107% for 14 mm diameter lung lesions and 39%-120% for 10 mm diameter lung lesions. A strategy for regularization parameter selection was proposed, validated using XCAT simulations, and applied to the clinical studies. 
    The authors' results show that the MC image yields 19%-190% increase in the CNR of high-intensity features of interest affected by respiratory motion relative to UG and a 6%-51% increase relative to OG. Standalone MR is not the traditional choice for lung scans due to the low proton density, high magnetic susceptibility, and low T2∗ relaxation time in the lungs. By developing and validating this PET/MR pulmonary imaging framework, the authors show that simultaneous PET/MR, unique in its capability of combining structural information from MR with functional information from PET, shows promise in pulmonary imaging.

  15. Impact of time-of-flight PET on quantification errors in MR imaging-based attenuation correction.

    PubMed

    Mehranian, Abolfazl; Zaidi, Habib

    2015-04-01

    Time-of-flight (TOF) PET/MR imaging is an emerging imaging technology with great capabilities offered by TOF to improve image quality and lesion detectability. We assessed, for the first time, the impact of TOF image reconstruction on PET quantification errors induced by MR imaging-based attenuation correction (MRAC) using simulation and clinical PET/CT studies. Standard 4-class attenuation maps were derived by segmentation of CT images of 27 patients undergoing PET/CT examinations into background air, lung, soft-tissue, and fat tissue classes, followed by the assignment of predefined attenuation coefficients to each class. For each patient, 4 PET images were reconstructed: non-TOF and TOF both corrected for attenuation using reference CT-based attenuation correction and the resulting 4-class MRAC maps. The relative errors between non-TOF and TOF MRAC reconstructions were compared with their reference CT-based attenuation correction reconstructions. The bias was locally and globally evaluated using volumes of interest (VOIs) defined on lesions and normal tissues and CT-derived tissue classes containing all voxels in a given tissue, respectively. The impact of TOF on reducing the errors induced by metal-susceptibility and respiratory-phase mismatch artifacts was also evaluated using clinical and simulation studies. Our results show that TOF PET can remarkably reduce attenuation correction artifacts and quantification errors in the lungs and bone tissues. Using classwise analysis, it was found that the non-TOF MRAC method results in an error of -3.4% ± 11.5% in the lungs and -21.8% ± 2.9% in bones, whereas its TOF counterpart reduced the errors to -2.9% ± 7.1% and -15.3% ± 2.3%, respectively. The VOI-based analysis revealed that the non-TOF and TOF methods resulted in an average overestimation of 7.5% and 3.9% in or near lung lesions (n = 23) and underestimation of less than 5% for soft tissue and in or near bone lesions (n = 91). 
Simulation results showed that as TOF resolution improves, artifacts and quantification errors are substantially reduced. TOF PET substantially reduces artifacts and improves significantly the quantitative accuracy of standard MRAC methods. Therefore, MRAC should be less of a concern on future TOF PET/MR scanners with improved timing resolution. © 2015 by the Society of Nuclear Medicine and Molecular Imaging, Inc.

  16. Whole-body FDG PET-MR oncologic imaging: pitfalls in clinical interpretation related to inaccurate MR-based attenuation correction.

    PubMed

    Attenberger, Ulrike; Catana, Ciprian; Chandarana, Hersh; Catalano, Onofrio A; Friedman, Kent; Schonberg, Stefan A; Thrall, James; Salvatore, Marco; Rosen, Bruce R; Guimaraes, Alexander R

    2015-08-01

    Simultaneous data collection for positron emission tomography and magnetic resonance imaging (PET/MR) is now a reality. While the full benefits of concurrently acquiring PET and MR data and the potential added clinical value are still being evaluated, initial studies have identified several important potential pitfalls in the interpretation of fluorodeoxyglucose (FDG) PET/MRI in oncologic whole-body imaging, the majority of which are related to errors in the attenuation maps created from the MR data. The purpose of this article was to present such pitfalls and artifacts using case examples, describe their etiology, and discuss strategies to overcome them. Using a case-based approach, we will illustrate artifacts related to (1) inaccurate bone tissue segmentation; (2) inaccurate air cavity segmentation; (3) motion-induced misregistration; (4) RF coils in the PET field of view; (5) B0 field inhomogeneity; (6) B1 field inhomogeneity; (7) metallic implants; (8) MR contrast agents.

  17. Prototype design of singles processing unit for the small animal PET

    NASA Astrophysics Data System (ADS)

    Deng, P.; Zhao, L.; Lu, J.; Li, B.; Dong, R.; Liu, S.; An, Q.

    2018-05-01

    Positron Emission Tomography (PET) is an advanced diagnostic imaging technique in clinical nuclear medicine. Small animal PET is increasingly used for studying animal models of disease, new drugs, and new therapies. A prototype Singles Processing Unit (SPU) for a small animal PET system was designed to obtain the time, energy, and position information. The energy and position are calculated through high-precision charge measurement, based on amplification, shaping, A/D conversion, and area calculation in the digital signal processing domain. Analysis and simulations were also conducted to optimize the key parameters in the system design. Initial tests indicate that the charge and time precision are better than 3‰ FWHM and 350 ps FWHM, respectively, while the position resolution is better than 3.5‰ FWHM. Combined tests of the SPU prototype with the PET detector indicate that the system time precision is better than 2.5 ns, while the flood map and energy spectra agreed well with expectations.

  18. An integrated 3-Dimensional Genome Modeling Engine for data-driven simulation of spatial genome organization.

    PubMed

    Szałaj, Przemysław; Tang, Zhonghui; Michalski, Paul; Pietal, Michal J; Luo, Oscar J; Sadowski, Michał; Li, Xingwang; Radew, Kamen; Ruan, Yijun; Plewczynski, Dariusz

    2016-12-01

    ChIA-PET is a high-throughput mapping technology that reveals long-range chromatin interactions and provides insights into the basic principles of spatial genome organization and gene regulation mediated by specific protein factors. Recently, we showed that a single ChIA-PET experiment provides information at all genomic scales of interest, from the high-resolution locations of binding sites and enriched chromatin interactions mediated by specific protein factors, to the low resolution of nonenriched interactions that reflect topological neighborhoods of higher-order chromosome folding. This multilevel nature of ChIA-PET data offers an opportunity to use multiscale 3D models to study structural-functional relationships at multiple length scales, but doing so requires a structural modeling platform. Here, we report the development of 3D-GNOME (3-Dimensional Genome Modeling Engine), a complete computational pipeline for 3D simulation using ChIA-PET data. 3D-GNOME consists of three integrated components: a graph-distance-based heat map normalization tool, a 3D modeling platform, and an interactive 3D visualization tool. Using ChIA-PET and Hi-C data derived from human B-lymphocytes, we demonstrate the effectiveness of 3D-GNOME in building 3D genome models at multiple levels, including the entire genome, individual chromosomes, and specific segments at megabase (Mb) and kilobase (kb) resolutions of single average and ensemble structures. Further incorporation of CTCF-motif orientation and high-resolution looping patterns in 3D simulation provided additional reliability of potential biologically plausible topological structures. © 2016 Szałaj et al.; Published by Cold Spring Harbor Laboratory Press.

  19. A Framework for the Validation of Probabilistic Seismic Hazard Analysis Maps Using Strong Ground Motion Data

    NASA Astrophysics Data System (ADS)

    Bydlon, S. A.; Beroza, G. C.

    2015-12-01

    Recent debate on the efficacy of Probabilistic Seismic Hazard Analysis (PSHA), and the utility of hazard maps (e.g., Stein et al., 2011; Hanks et al., 2012), has prompted a need for validation of such maps using recorded strong ground motion data. Unfortunately, strong motion records are limited spatially and temporally relative to the area and time windows hazard maps encompass. We develop a framework to test the predictive power of PSHA maps that is flexible with respect to a map's specified probability of exceedance and time window, and the strong motion receiver coverage. Using a combination of recorded and interpolated strong motion records produced through the ShakeMap environment, we compile a record of ground motion intensity measures for California from 2002 to the present. We use this information to perform an area-based test of California PSHA maps inspired by the work of Ward (1995). Though this framework is flexible in that it can be applied to seismically active areas where ShakeMap-like ground shaking interpolations have been or can be produced, this testing procedure is limited by the relatively short lifetime of strong motion recordings and by the desire to only test with data collected after the development of the PSHA map under scrutiny. To account for this, we use the assumption that PSHA maps are time independent to adapt the testing procedure for periods of recorded data shorter than the lifetime of a map. We note that the accuracy of this testing procedure will only improve as more data are collected, or as the time horizon of interest is reduced, as has been proposed for maps of areas experiencing induced seismicity. We believe that this procedure can be used to determine whether PSHA maps are accurately portraying seismic hazard and whether discrepancies are localized or systemic.
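
    The area-based logic can be sketched numerically: for a map whose specified exceedance probability over its time window is p, the fraction of independent sites whose observed shaking exceeded the mapped level should be close to p. The sketch below uses entirely synthetic "observations" drawn to be consistent with a hypothetical map (our illustration, not the authors' framework; p_map, n_sites and the 3-sigma tolerance are assumptions).

```python
import numpy as np

rng = np.random.default_rng(7)
p_map = 0.10      # map's exceedance probability over its time window (assumed)
n_sites = 5000    # hypothetical independent sites with ShakeMap coverage

# Synthetic "observed" exceedances drawn to be consistent with the map;
# with real data this would come from comparing recorded intensities
# against the mapped ground-motion level at each site.
exceeded = rng.random(n_sites) < p_map
observed_fraction = exceeded.mean()

# A map passes this (simplified) test if the observed exceedance fraction
# lies within ~3 binomial standard errors of the mapped probability.
tolerance = 3.0 * np.sqrt(p_map * (1.0 - p_map) / n_sites)
consistent = abs(observed_fraction - p_map) < tolerance
```

    With real records, the main subtleties the abstract raises remain: spatial correlation between sites and observation windows shorter than the map's time horizon.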

  20. Probabilistic classification learning with corrective feedback is associated with in vivo striatal dopamine release in the ventral striatum, while learning without feedback is not

    PubMed Central

    Wilkinson, Leonora; Tai, Yen Foung; Lin, Chia Shu; Lagnado, David Albert; Brooks, David James; Piccini, Paola; Jahanshahi, Marjan

    2014-01-01

    The basal ganglia (BG) mediate certain types of procedural learning, such as probabilistic classification learning on the ‘weather prediction task’ (WPT). Patients with Parkinson's disease (PD), who have BG dysfunction, are impaired at WPT-learning, but it remains unclear what component of the WPT is important for learning to occur. We tested the hypothesis that learning through processing of corrective feedback is the essential component and is associated with release of striatal dopamine. We employed two WPT paradigms, involving learning either via processing of corrective feedback (FB) or in a paired associate manner (PA). To test the prediction that learning on the FB but not the PA paradigm would be associated with dopamine release in the striatum, we used serial 11C-raclopride (RAC) positron emission tomography (PET) to investigate striatal dopamine release during FB and PA WPT-learning in healthy individuals. Two groups, FB (n = 7) and PA (n = 8), underwent RAC PET twice, once while performing the WPT and once during a control task. Based on a region-of-interest approach, striatal RAC-binding potentials were reduced by 13–17% in the right ventral striatum when performing the FB compared to the control task, indicating release of synaptic dopamine. In contrast, right ventral striatal RAC binding non-significantly increased by 9% during the PA task. While differences between the FB and PA versions of the WPT in effort and decision-making are also relevant, we conclude that striatal dopamine is released during FB-based WPT-learning, implicating the striatum and its dopamine connections in mediating learning with FB. PMID:24777947

  1. The spatial distribution of pet dogs and pet cats on the island of Ireland

    PubMed Central

    2011-01-01

    Background There is considerable international research regarding the link between human demographics and pet ownership. In several international studies, pet ownership was associated with household demographics including: the presence of children in the household, urban/rural location, level of education and age/family structure. What is lacking across all these studies, however, is an understanding of how these pets are spatially distributed throughout the regions under study. This paper describes the spatial distribution of pet dog and pet cat owning households on the island of Ireland. Results In 2006, there were an estimated 640,620 pet dog owning households and 215,542 pet cat owning households in Ireland. These estimates are derived from logistic regression modelling, based on household composition to determine pet dog ownership and the type of house to determine pet cat ownership. Results are presented using choropleth maps. There is a higher density of pet dog owning households in the east of Ireland and in the cities than in the west of Ireland and rural areas. However, in urban districts there is a lower proportion of households owning pet dogs than in rural districts. There are more households with cats in the urban areas, but the proportion of households with cats is greater in rural areas. Conclusions The difference in spatial distribution of dog ownership is a reflection of a generally higher density of households in the east of Ireland and in major cities. The higher proportion of ownership in the west is understandable given the higher proportion of farmers and rural dwellings in this area. Spatial representation allows us to visualise the impact of human household distribution on the density of both pet dogs and pet cats on the island of Ireland. This information can be used when analysing risk of disease spread, for market research and for instigating veterinary care. PMID:21663606

  2. The spatial distribution of pet dogs and pet cats on the island of Ireland.

    PubMed

    Downes, Martin J; Clegg, Tracy A; Collins, Daniel M; McGrath, Guy; More, Simon J

    2011-06-10

    There is considerable international research regarding the link between human demographics and pet ownership. In several international studies, pet ownership was associated with household demographics including: the presence of children in the household, urban/rural location, level of education and age/family structure. What is lacking across all these studies, however, is an understanding of how these pets are spatially distributed throughout the regions under study. This paper describes the spatial distribution of pet dog and pet cat owning households on the island of Ireland. In 2006, there were an estimated 640,620 pet dog owning households and 215,542 pet cat owning households in Ireland. These estimates are derived from logistic regression modelling, based on household composition to determine pet dog ownership and the type of house to determine pet cat ownership. Results are presented using choropleth maps. There is a higher density of pet dog owning households in the east of Ireland and in the cities than in the west of Ireland and rural areas. However, in urban districts there is a lower proportion of households owning pet dogs than in rural districts. There are more households with cats in the urban areas, but the proportion of households with cats is greater in rural areas. The difference in spatial distribution of dog ownership is a reflection of a generally higher density of households in the east of Ireland and in major cities. The higher proportion of ownership in the west is understandable given the higher proportion of farmers and rural dwellings in this area. Spatial representation allows us to visualise the impact of human household distribution on the density of both pet dogs and pet cats on the island of Ireland. This information can be used when analysing risk of disease spread, for market research and for instigating veterinary care.

  3. Value of a Dixon-based MR/PET attenuation correction sequence for the localization and evaluation of PET-positive lesions.

    PubMed

    Eiber, Matthias; Martinez-Möller, Axel; Souvatzoglou, Michael; Holzapfel, Konstantin; Pickhard, Anja; Löffelbein, Dennys; Santi, Ivan; Rummeny, Ernst J; Ziegler, Sibylle; Schwaiger, Markus; Nekolla, Stephan G; Beer, Ambros J

    2011-09-01

    In this study, the potential contribution of Dixon-based MR imaging with a rapid low-resolution breath-hold sequence, which is a technique used for MR-based attenuation correction (AC) for MR/positron emission tomography (PET), was evaluated for anatomical correlation of PET-positive lesions on a 3T clinical scanner compared to low-dose CT. This technique is also used in a recently installed fully integrated whole-body MR/PET system. Thirty-five patients routinely scheduled for oncological staging underwent (18)F-fluorodeoxyglucose (FDG) PET/CT and a 2-point Dixon 3-D volumetric interpolated breath-hold examination (VIBE) T1-weighted MR sequence on the same day. Two PET data sets reconstructed using attenuation maps from low-dose CT (PET(AC_CT)) or simulated MR-based segmentation (PET(AC_MR)) were evaluated for focal PET-positive lesions. The certainty for the correlation with anatomical structures was judged in the low-dose CT and Dixon-based MRI on a 4-point scale (0-3). In addition, the standardized uptake values (SUVs) for PET(AC_CT) and PET(AC_MR) were compared. Statistically, no significant difference could be found concerning anatomical localization for all 81 PET-positive lesions in low-dose CT compared to Dixon-based MR (mean 2.51 ± 0.85 and 2.37 ± 0.87, respectively; p = 0.1909). CT tended to be superior for small lymph nodes, bone metastases and pulmonary nodules, while Dixon-based MR proved advantageous for soft tissue pathologies like head/neck tumours and liver metastases. For the PET(AC_CT)- and PET(AC_MR)-based SUVs (mean 6.36 ± 4.47 and 6.31 ± 4.52, respectively) a nearly complete concordance with a highly significant correlation was found (r = 0.9975, p < 0.0001). Dixon-based MR imaging for MR AC allows for anatomical allocation of PET-positive lesions similar to low-dose CT in conventional PET/CT. Thus, this approach appears to be useful for future MR/PET for body regions not fully covered by diagnostic MRI due to potential time constraints.

  4. Probabilistic brain tissue segmentation in neonatal magnetic resonance imaging.

    PubMed

    Anbeek, Petronella; Vincken, Koen L; Groenendaal, Floris; Koeman, Annemieke; van Osch, Matthias J P; van der Grond, Jeroen

    2008-02-01

    A fully automated method has been developed for segmentation of four different structures in the neonatal brain: white matter (WM), central gray matter (CEGM), cortical gray matter (COGM), and cerebrospinal fluid (CSF). The segmentation algorithm is based on information from T2-weighted (T2-w) and inversion recovery (IR) scans. The method uses a K nearest neighbor (KNN) classification technique with features derived from spatial information and voxel intensities. Probabilistic segmentations of each tissue type were generated. By applying thresholds on these probability maps, binary segmentations were obtained. These final segmentations were evaluated by comparison with a gold standard. The sensitivity, specificity, and Dice similarity index (SI) were calculated for quantitative validation of the results. High sensitivity and specificity with respect to the gold standard were reached: sensitivity >0.82 and specificity >0.9 for all tissue types. Tissue volumes were calculated from the binary and probabilistic segmentations. The probabilistic segmentation volumes of all tissue types accurately estimated the gold standard volumes. The KNN approach offers valuable ways for neonatal brain segmentation. The probabilistic outcomes provide a useful tool for accurate volume measurements. The described method is based on routine diagnostic magnetic resonance imaging (MRI) and is suitable for large population studies.
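
    The core of the probabilistic KNN idea can be sketched in a few lines: a voxel's tissue probability is the fraction of its K nearest training voxels (in feature space) belonging to that tissue, and a binary segmentation follows by thresholding that probability. This is our illustration with synthetic 2-D features, not the paper's T2-w/IR intensity and spatial features; all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic training voxels: two feature dimensions (e.g. two intensities),
# 50 voxels per tissue class, well separated for the sake of the example.
train_x = np.vstack([rng.normal(0.0, 0.5, (50, 2)),   # class 0 features
                     rng.normal(3.0, 0.5, (50, 2))])  # class 1 features
train_y = np.repeat([0, 1], 50)

def knn_prob(x, k=15):
    """P(class 1 | features) = fraction of the k nearest training voxels
    that belong to class 1."""
    d = np.linalg.norm(train_x - x, axis=1)   # distance to every training voxel
    return float(train_y[np.argsort(d)[:k]].mean())

p = knn_prob(np.array([2.8, 3.1]))   # query voxel deep in class-1 territory
binary = p >= 0.5                    # threshold the probability map
```

    Summing the per-voxel probabilities (instead of the binary labels) gives the probabilistic volume estimates the abstract reports as more accurate.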

  5. 68Ga-PSMA-11 PET/CT Mapping of Prostate Cancer Biochemical Recurrence After Radical Prostatectomy in 270 Patients with a PSA Level of Less Than 1.0 ng/mL: Impact on Salvage Radiotherapy Planning.

    PubMed

    Calais, Jeremie; Czernin, Johannes; Cao, Minsong; Kishan, Amar U; Hegde, John V; Shaverdian, Narek; Sandler, Kiri; Chu, Fang-I; King, Chris R; Steinberg, Michael L; Rauscher, Isabel; Schmidt-Hegemann, Nina-Sophie; Poeppel, Thorsten; Hetkamp, Philipp; Ceci, Francesco; Herrmann, Ken; Fendler, Wolfgang P; Eiber, Matthias; Nickols, Nicholas G

    2018-02-01

    Target volume delineations for prostate cancer (PCa) salvage radiotherapy (SRT) after radical prostatectomy are usually drawn in the absence of visibly recurrent disease. (68)Ga-labeled prostate-specific membrane antigen (PSMA-11) PET/CT detects recurrent PCa with sensitivity superior to standard-of-care imaging at serum prostate-specific antigen (PSA) values low enough to affect target volume delineations for routine SRT. Our objective was to map the recurrence pattern of PCa early biochemical recurrence (BCR) after radical prostatectomy with (68)Ga-PSMA-11 PET/CT in patients with serum PSA levels of less than 1 ng/mL, determine how often consensus clinical target volumes (CTVs) based on the Radiation Therapy Oncology Group (RTOG) guidelines cover (68)Ga-PSMA-11 PET/CT-defined disease, and assess the potential impact of (68)Ga-PSMA-11 PET/CT on SRT. Methods: This was a post hoc analysis of an intention-to-treat population of 270 patients who underwent (68)Ga-PSMA-11 PET/CT at 4 institutions for BCR after prostatectomy without prior radiotherapy at a PSA level of less than 1 ng/mL. RTOG consensus CTVs that included both the prostate bed and the pelvic lymph nodes were contoured on the CT dataset of the PET/CT image by a radiation oncologist masked to the PET component. (68)Ga-PSMA-11 PET/CT images were analyzed by a nuclear medicine physician. (68)Ga-PSMA-11-positive lesions not covered by planning volumes based on the consensus CTVs were considered to have a potential major impact on treatment planning. Results: The median PSA level at the time of (68)Ga-PSMA-11 PET/CT was 0.48 ng/mL (range, 0.03-1 ng/mL). One hundred thirty-two of 270 patients (49%) had a positive (68)Ga-PSMA-11 PET/CT result. Fifty-two of 270 (19%) had at least one PSMA-11-positive lesion not covered by the consensus CTVs. Thirty-three of 270 (12%) had extrapelvic PSMA-11-positive lesions, and 19 of 270 (7%) had PSMA-11-positive lesions within the pelvis but not covered by the consensus CTVs. The 2 most common (68)Ga-PSMA-11-positive lesion locations outside the consensus CTVs were bone (23/52, 44%) and perirectal lymph nodes (16/52, 31%). Conclusion: Post hoc analysis of (68)Ga-PSMA-11 PET/CT implied a major impact on SRT planning in 52 of 270 patients (19%) with PCa early BCR (PSA < 1.0 ng/mL). This finding justifies a randomized imaging trial of SRT with or without (68)Ga-PSMA-11 PET/CT investigating its potential benefit on clinical outcome. © 2018 by the Society of Nuclear Medicine and Molecular Imaging.

  6. Advancing Precision Nuclear Medicine and Molecular Imaging for Lymphoma.

    PubMed

    Wright, Chadwick L; Maly, Joseph J; Zhang, Jun; Knopp, Michael V

    2017-01-01

    PET with (18)F-fluorodeoxyglucose ((18)F FDG-PET) is a meaningful biomarker for the detection, targeted biopsy, and treatment of lymphoma. This article reviews the evolution of (18)F FDG-PET as a putative biomarker for lymphoma and addresses the current capabilities, challenges, and opportunities to enable precision medicine practices for lymphoma. Precision nuclear medicine is driven by new imaging technologies and methodologies to more accurately detect malignant disease. Although quantitative assessment of response is currently limited, such technologies will enable more precise metabolic mapping with much higher definition image detail and thus may make FDG-PET a robust and valid quantitative response assessment methodology. Copyright © 2016 Elsevier Inc. All rights reserved.

  7. Probabilistic mapping of flood-induced backscatter changes in SAR time series

    NASA Astrophysics Data System (ADS)

    Schlaffer, Stefan; Chini, Marco; Giustarini, Laura; Matgen, Patrick

    2017-04-01

    The information content of flood extent maps can be increased considerably by including information on the uncertainty of the flood area delineation. This additional information can be of benefit in flood forecasting and monitoring. Furthermore, flood probability maps can be converted to binary maps showing flooded and non-flooded areas by applying a threshold probability value pF = 0.5. In this study, a probabilistic change detection approach for flood mapping based on synthetic aperture radar (SAR) time series is proposed. For this purpose, conditional probability density functions (PDFs) for land and open water surfaces were estimated from ENVISAT ASAR Wide Swath (WS) time series containing >600 images using a reference mask of permanent water bodies. A pixel-wise harmonic model was used to account for seasonality in backscatter from land areas caused by soil moisture and vegetation dynamics. The approach was evaluated for a large-scale flood event along the River Severn, United Kingdom. The retrieved flood probability maps were compared to a reference flood mask derived from high-resolution aerial imagery by means of reliability diagrams. The obtained performance measures indicate both high reliability and confidence although there was a slight under-estimation of the flood extent, which may in part be attributed to topographically induced radar shadows along the edges of the floodplain. Furthermore, the results highlight the importance of local incidence angle for the separability between flooded and non-flooded areas as specular reflection properties of open water surfaces increase with a more oblique viewing geometry.
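
    The thresholding step described above is straightforward to illustrate (a minimal sketch with made-up probabilities, not the authors' code): a per-pixel flood probability map is converted to a binary flood/non-flood mask at pF = 0.5.

```python
import numpy as np

# Hypothetical per-pixel flood probabilities from the change-detection step.
p_flood = np.array([[0.02, 0.10, 0.55],
                    [0.48, 0.73, 0.91],
                    [0.05, 0.50, 0.88]])

# Convert the probability map to a binary flood mask by thresholding at
# p_F = 0.5: a pixel is flagged as flooded when its flood probability is
# at least as large as its non-flood probability.
P_F = 0.5
flood_mask = p_flood >= P_F

n_flooded = int(flood_mask.sum())   # flooded-pixel count for this toy map
```

    Keeping the underlying probabilities, rather than only the mask, is what enables the reliability-diagram evaluation the abstract describes.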

  8. A Probabilistic Strategy for Understanding Action Selection

    PubMed Central

    Kim, Byounghoon; Basso, Michele A.

    2010-01-01

    Brain regions involved in transforming sensory signals into movement commands are the likely sites where decisions are formed. Once formed, a decision must be read out from the activity of populations of neurons to produce a choice of action. How this occurs remains unresolved. We recorded from four superior colliculus (SC) neurons simultaneously while monkeys performed a target selection task. We implemented three models to gain insight into the computational principles underlying population coding of action selection. We compared the population vector average (PVA), winner-takes-all (WTA) and a Bayesian model, the maximum a posteriori (MAP) estimate, to determine which predicted choices most often. The probabilistic model predicted more trials correctly than both the WTA and the PVA. The MAP model predicted 81.88% of trials, whereas the WTA predicted 71.11% and the PVA/OLE predicted the fewest, at 55.71 and 69.47%. Recovering MAP estimates using simulated, non-uniform priors that correlated with monkeys’ choice performance improved the accuracy of the model by 2.88%. A dynamic analysis revealed that the MAP estimate evolved over time and the posterior probability of the saccade choice reached a maximum at the time of the saccade. MAP estimates also scaled with choice performance accuracy. Although there was overlap in the prediction abilities of all the models, we conclude that movement choice from populations of neurons may be best understood by considering frameworks based on probability. PMID:20147560
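
    The contrast between a WTA and a MAP readout can be sketched on simulated data (our illustration, not the study's analysis; the tuning rates and spike counts are invented, and we assume independent Poisson spiking with a uniform prior over two targets, omitting the PVA, which addresses continuous direction readout).

```python
import numpy as np

# Rows = four simulated SC neurons; columns = each neuron's expected
# spike count for target 0 and target 1 (hypothetical tuning).
rates = np.array([[12.0,  3.0],
                  [10.0,  4.0],
                  [ 2.0,  9.0],
                  [ 3.0, 11.0]])

counts = np.array([14, 9, 3, 4])   # observed spike counts on one trial

# Winner-takes-all: choose the target preferred by the most active neuron.
wta_choice = int(np.argmax(rates[np.argmax(counts)]))

# MAP with a uniform prior and independent Poisson spiking: the posterior
# is proportional to the product of Poisson likelihoods, so we compare
# log-likelihoods (the factorial term is constant across targets).
log_like = (counts[:, None] * np.log(rates) - rates).sum(axis=0)
map_choice = int(np.argmax(log_like))
```

    On this trial both readouts agree; the paper's point is that across many trials the probabilistic readout, which weights every neuron's evidence, predicts the animal's choice more often than WTA.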

  9. FDG-PET improves accuracy in distinguishing frontotemporal dementia and Alzheimer's disease.

    PubMed

    Foster, Norman L; Heidebrink, Judith L; Clark, Christopher M; Jagust, William J; Arnold, Steven E; Barbas, Nancy R; DeCarli, Charles S; Turner, R Scott; Koeppe, Robert A; Higdon, Roger; Minoshima, Satoshi

    2007-10-01

    Distinguishing Alzheimer's disease (AD) and frontotemporal dementia (FTD) currently relies on a clinical history and examination, but positron emission tomography with [(18)F] fluorodeoxyglucose (FDG-PET) shows different patterns of hypometabolism in these disorders that might aid differential diagnosis. Six dementia experts with variable FDG-PET experience made independent, forced choice, diagnostic decisions in 45 patients with pathologically confirmed AD (n = 31) or FTD (n = 14) using five separate methods: (1) review of clinical summaries, (2) a diagnostic checklist alone, (3) summary and checklist, (4) transaxial FDG-PET scans and (5) FDG-PET stereotactic surface projection (SSP) metabolic and statistical maps. In addition, we evaluated the effect of the sequential review of a clinical summary followed by SSP. Visual interpretation of SSP images was superior to clinical assessment and had the best inter-rater reliability (mean kappa = 0.78) and diagnostic accuracy (89.6%). It also had the highest specificity (97.6%) and sensitivity (86%), and positive likelihood ratio for FTD (36.5). The addition of FDG-PET to clinical summaries increased diagnostic accuracy and confidence for both AD and FTD. It was particularly helpful when raters were uncertain in their clinical diagnosis. Visual interpretation of FDG-PET after brief training is more reliable and accurate in distinguishing FTD from AD than clinical methods alone. FDG-PET adds important information that appropriately increases diagnostic confidence, even among experienced dementia specialists.
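
    The summary statistics reported above follow from a simple confusion-matrix calculation. The sketch below uses hypothetical single-rater counts chosen only to illustrate the definitions (the study pooled six raters, so these are not its actual counts), treating FTD as the "positive" diagnosis.

```python
# Hypothetical single-rater confusion matrix (NOT the study's pooled data).
tp, fn = 12, 2      # FTD cases correctly called FTD / missed
tn, fp = 30, 1      # AD cases correctly called AD / mislabelled as FTD

sensitivity = tp / (tp + fn)                      # P(call FTD | FTD)
specificity = tn / (tn + fp)                      # P(call AD | AD)
lr_positive = sensitivity / (1.0 - specificity)   # positive likelihood ratio
accuracy = (tp + tn) / (tp + fn + tn + fp)
```

    The large positive likelihood ratio reported for SSP images (36.5) reflects exactly this arithmetic: high sensitivity combined with very high specificity.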

  10. Myocardial perfusion quantification using simultaneously acquired (13)NH3-ammonia PET and dynamic contrast-enhanced MRI in patients at rest and stress.

    PubMed

    Kunze, Karl P; Nekolla, Stephan G; Rischpler, Christoph; Zhang, Shelley HuaLei; Hayes, Carmel; Langwieser, Nicolas; Ibrahim, Tareq; Laugwitz, Karl-Ludwig; Schwaiger, Markus

    2018-04-19

    Systematic differences with respect to myocardial perfusion quantification exist between DCE-MRI and PET. Using the potential of integrated PET/MRI, this study was conceived to compare perfusion quantification on the basis of simultaneously acquired (13)NH3-ammonia PET and DCE-MRI data in patients at rest and stress. Twenty-nine patients were examined on a 3T PET/MRI scanner. DCE-MRI was implemented in a dual-sequence design with additional T1 mapping for signal normalization. Four different deconvolution methods, including a modified version of the Fermi technique, were compared against (13)NH3-ammonia results. Cohort-average flow comparison yielded higher resting flows for DCE-MRI than for PET and, therefore, significantly lower DCE-MRI perfusion ratios under the common assumption of equal arterial and tissue hematocrit. Absolute flow values were strongly correlated in both slice-average (R(2) = 0.82) and regional (R(2) = 0.7) evaluations. Different DCE-MRI deconvolution methods yielded similar flow results, with the exception of an unconstrained Fermi method that exhibited outliers at high flows when compared with PET. Thresholds for ischemia classification may not be directly transferable between PET and MRI flow values. Differences in perfusion ratios between PET and DCE-MRI may be lifted by using stress/rest-specific hematocrit conversion. Proper physiological constraints are advised in model-constrained deconvolution. © 2018 International Society for Magnetic Resonance in Medicine.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kalligiannaki, Evangelia, E-mail: ekalligian@tem.uoc.gr; Harmandaris, Vagelis, E-mail: harman@uoc.gr; Institute of Applied and Computational Mathematics

    Using the probabilistic language of conditional expectations, we reformulate the force matching method for coarse-graining of molecular systems as a projection onto spaces of coarse observables. A practical outcome of this probabilistic description is the link of the force matching method with thermodynamic integration. This connection provides a way to systematically construct a local mean force and to optimally approximate the potential of mean force through force matching. We introduce a generalized force matching condition for the local mean force in the sense that it allows the approximation of the potential of mean force under both linear and non-linear coarse-graining mappings (e.g., reaction coordinates, end-to-end length of chains). Furthermore, we study the equivalence of force matching with relative entropy minimization, which we derive for general non-linear coarse-graining maps. We present in detail the generalized force matching condition through applications to specific examples in molecular systems.
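
    The projection view of force matching can be illustrated with a toy 1-D example (our sketch, far simpler than the paper's conditional-expectation formulation for general coarse-graining maps): approximate the mean force on a coarse coordinate by least-squares projection of sampled microscopic forces onto a basis.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic data: coarse coordinate samples q and noisy "microscopic"
# forces whose true mean force is -4*q (a harmonic potential of mean force).
q = rng.uniform(-1.0, 1.0, 500)
f = -4.0 * q + rng.normal(0.0, 0.3, 500)

# Basis functions for the CG force field: F(q) ≈ c0 + c1*q.
Phi = np.column_stack([np.ones_like(q), q])

# Force matching = least-squares projection of the sampled forces
# onto the span of the basis.
coef, *_ = np.linalg.lstsq(Phi, f, rcond=None)
c0, c1 = coef   # c1 should recover the true mean-force slope -4
```

    In the paper's language, the least-squares solution approximates the conditional expectation of the microscopic force given the coarse variable, i.e. the local mean force.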

  12. A partially reflecting random walk on spheres algorithm for electrical impedance tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maire, Sylvain, E-mail: maire@univ-tln.fr; Simon, Martin, E-mail: simon@math.uni-mainz.de

    2015-12-15

    In this work, we develop a probabilistic estimator for the voltage-to-current map arising in electrical impedance tomography. This novel so-called partially reflecting random walk on spheres estimator enables Monte Carlo methods to compute the voltage-to-current map in an embarrassingly parallel manner, which is an important issue with regard to the corresponding inverse problem. Our method uses the well-known random walk on spheres algorithm inside subdomains where the diffusion coefficient is constant and employs replacement techniques motivated by finite difference discretization to deal with both mixed boundary conditions and interface transmission conditions. We analyze the global bias and the variance of the new estimator both theoretically and experimentally. Subsequently, the variance of the new estimator is considerably reduced via a novel control variate conditional sampling technique which yields a highly efficient hybrid forward solver coupling probabilistic and deterministic algorithms.
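
    For readers unfamiliar with the underlying algorithm, here is a plain walk-on-spheres sketch for the Laplace equation on the unit disk (our illustration only: the paper's estimator additionally handles partial reflection at mixed boundaries and interface transmission, which are omitted here).

```python
import math
import random

def walk_on_spheres(x, y, g, eps=1e-3, rng=random):
    """One walk-on-spheres sample of the harmonic function with boundary
    data g, started at (x, y) inside the unit disk."""
    while True:
        r = 1.0 - math.hypot(x, y)       # distance to the disk boundary
        if r < eps:                      # close enough: stop the walk and
            n = math.hypot(x, y)         # evaluate g on the nearest
            return g(x / n, y / n)       # boundary point
        theta = rng.uniform(0.0, 2.0 * math.pi)
        x += r * math.cos(theta)         # jump uniformly on the largest
        y += r * math.sin(theta)         # circle inscribed at (x, y)

rng = random.Random(0)
g = lambda x, y: x                       # boundary data; exact solution u = x
est = sum(walk_on_spheres(0.3, 0.2, g, rng=rng) for _ in range(20000)) / 20000
```

    Each walk is independent, which is exactly what makes the method embarrassingly parallel; the estimate above converges to u(0.3, 0.2) = 0.3.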

  13. Robust Depth Image Acquisition Using Modulated Pattern Projection and Probabilistic Graphical Models

    PubMed Central

    Kravanja, Jaka; Žganec, Mario; Žganec-Gros, Jerneja; Dobrišek, Simon; Štruc, Vitomir

    2016-01-01

    Depth image acquisition with structured light approaches in outdoor environments is a challenging problem due to external factors, such as ambient sunlight, which commonly affect the acquisition procedure. This paper presents a novel structured light sensor designed specifically for operation in outdoor environments. The sensor exploits a modulated sequence of structured light projected onto the target scene to counteract environmental factors and estimate a spatial distortion map in a robust manner. The correspondence between the projected pattern and the estimated distortion map is then established using a probabilistic framework based on graphical models. Finally, the depth image of the target scene is reconstructed using a number of reference frames recorded during the calibration process. We evaluate the proposed sensor on experimental data in indoor and outdoor environments and present comparative experiments with other existing methods, as well as commercial sensors. PMID:27775570

  14. Analysis of flood hazard under consideration of dike breaches

    NASA Astrophysics Data System (ADS)

    Vorogushyn, S.; Apel, H.; Lindenschmidt, K.-E.; Merz, B.

    2009-04-01

    The study focuses on the development and application of a new modelling system which allows a comprehensive flood hazard assessment along diked river reaches under consideration of dike failures. The proposed Inundation Hazard Assessment Model (IHAM) represents a hybrid probabilistic-deterministic model. It comprises three models interactively coupled at runtime. These are: (1) 1D unsteady hydrodynamic model of river channel and floodplain flow between dikes, (2) probabilistic dike breach model which determines possible dike breach locations, breach widths and breach outflow discharges, and (3) 2D raster-based diffusion wave storage cell model of the hinterland areas behind the dikes. Due to the unsteady nature of the 1D and 2D coupled models, the dependence between hydraulic load at various locations along the reach is explicitly considered. The probabilistic dike breach model describes dike failures due to three failure mechanisms: overtopping, piping and slope instability caused by the seepage flow through the dike core (micro-instability). Dike failures for each mechanism are simulated based on fragility functions. The probability of breach is conditioned by the uncertainty in geometrical and geotechnical dike parameters. The 2D storage cell model driven by the breach outflow boundary conditions computes an extended spectrum of flood intensity indicators such as water depth, flow velocity, impulse, inundation duration and rate of water rise. IHAM is embedded in a Monte Carlo simulation in order to account for the natural variability of the flood generation processes reflected in the form of input hydrographs and for the randomness of dike failures given by breach locations, times and widths. The scenario calculations for the developed synthetic input hydrographs for the main river and tributary were carried out for floods with return periods of T = 100; 200; 500; 1000 a. 
Based on the modelling results, probabilistic dike hazard maps could be generated that indicate the failure probability of each discretised dike section for every scenario magnitude. Besides the binary inundation patterns that indicate the probability of raster cells being inundated, IHAM generates probabilistic flood hazard maps. These maps display spatial patterns of the considered flood intensity indicators and their associated return periods. The probabilistic nature of IHAM allows for the generation of percentile flood hazard maps that indicate the median and uncertainty bounds of the flood intensity indicators. The uncertainty results from the natural variability of the flow hydrographs and randomness of dike breach processes. The same uncertainty sources determine the uncertainty in the flow hydrographs along the study reach. The simulations showed that the dike breach stochasticity has an increasing impact on hydrograph uncertainty in downstream direction. Whereas in the upstream part of the reach the hydrograph uncertainty is mainly stipulated by the variability of the flood wave form, the dike failures strongly shape the uncertainty boundaries in the downstream part of the reach. Finally, scenarios of polder deployment for the extreme floods with T = 200; 500; 1000 a were simulated with IHAM. The results indicate a rather weak reduction of the mean and median flow hydrographs in the river channel. However, the capping of the flow peaks resulted in a considerable reduction of the overtopping failures downstream of the polder with a simultaneous slight increase of the piping and slope micro-instability frequencies explained by a more durable average impoundment. The developed IHAM simulation system represents a new scientific tool for studying fluvial inundation dynamics under extreme conditions incorporating effects of technical flood protection measures. 
With its major outputs in form of novel probabilistic inundation and dike hazard maps, the IHAM system has a high practical value for decision support in flood management.
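
    The fragility-function sampling at the heart of the probabilistic dike breach model can be sketched as follows (our illustration, not IHAM itself: the lognormal curve shape and all parameter values are assumptions, and only one failure mechanism at a fixed load is shown).

```python
import math
import numpy as np

def fragility(load, mu=3.0, beta=0.25):
    """Lognormal fragility curve: P(dike section fails | hydraulic load in m).
    Shape parameters are invented for illustration."""
    return 0.5 * (1.0 + math.erf(math.log(load / mu) / (beta * math.sqrt(2.0))))

rng = np.random.default_rng(42)
# One Bernoulli draw per Monte Carlo run at a fixed 3.5 m water level;
# in IHAM the load itself varies per run with the sampled input hydrograph.
fails = rng.random(10_000) < fragility(3.5)
p_breach = float(fails.mean())   # empirical breach probability over the runs
```

    Aggregating such per-section breach indicators over many runs is what produces the probabilistic dike hazard maps described above.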

  15. WE-AB-202-05: Validation of Lung Stress Maps for CT-Ventilation Imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cazoulat, G; Jolly, S; Matuszak, M

    Purpose: To date, lung CT-ventilation imaging has been based on quantification of local breathing-induced changes in Hounsfield Units (HU) or volume. This work investigates the use of a stress map resulting from a biomechanical deformable image registration (DIR) algorithm as a metric of the ventilation function. Method: Eight lung cancer patients presenting different kinds of ventilation defects were retrospectively analyzed. In addition to the 4DCT acquired for radiotherapy planning, five of them had PET and three had SPECT imaging following inhalation of Ga-68 and Tc-99m, respectively. For each patient, the inhale phase of the 4DCT was registered to the exhale phase using Morfeus, a biomechanical DIR algorithm based on the determination of boundary conditions on the lung surfaces and vessel tree. To take into account the heterogeneity of the tissue stiffness in the stress map estimation, each tetrahedral element of the finite-element model was assigned a Young’s modulus ranging from 60 kPa to 12 MPa, as a function of the HU in the inhale CT. The node displacements and element stresses resulting from the numerical simulation were used to generate three CT-ventilation maps based on: (i) volume changes (Jacobian determinant), (ii) changes in HU, (iii) the maximum principal stress. The voxel-wise correlation between each CT-ventilation map and the PET or SPECT V image was computed in a lung mask. Results: For patients with PET, the mean (min-max) Spearman correlation coefficients r were: 0.33 (0.19–0.45), 0.36 (0.16–0.51) and 0.42 (0.21–0.59) considering the Jacobian, changes in HU and maximum principal stress, respectively. For patients with SPECT V, the mean r values were: 0.12 (−0.12–0.43), 0.29 (0.22–0.45) and 0.33 (0.25–0.39). Conclusion: The maximum principal stress maps showed a stronger correlation with the ventilation images than the previously proposed Jacobian or change in HU maps. This metric thus appears promising for CT-ventilation imaging.
This work was funded in part by NIH P01CA059827.
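The voxel-wise correlation step described above can be sketched as follows. This is a minimal illustration on synthetic arrays (the function name and toy data are ours, not the authors' pipeline): rank-correlate two ventilation surrogates over the voxels of a lung mask.

```python
import numpy as np
from scipy.stats import spearmanr

def ventilation_correlation(ct_vent_map, pet_v_image, lung_mask):
    """Voxel-wise Spearman rank correlation between a CT-ventilation
    surrogate and a PET/SPECT ventilation image, inside a lung mask."""
    rho, _ = spearmanr(ct_vent_map[lung_mask], pet_v_image[lung_mask])
    return float(rho)

# Synthetic illustration: two noisy, monotonically related "maps".
rng = np.random.default_rng(0)
truth = rng.random((20, 20, 20))
noisy = truth + 0.1 * rng.standard_normal(truth.shape)
mask = np.ones_like(truth, dtype=bool)
r = ventilation_correlation(truth, noisy, mask)
```

Spearman (rather than Pearson) correlation is the natural choice here because the two modalities measure ventilation on different, nonlinearly related scales.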

  16. Great Balls of Fire: A probabilistic approach to quantify the hazard related to ballistics - A case study at La Fossa volcano, Vulcano Island, Italy

    NASA Astrophysics Data System (ADS)

    Biass, Sébastien; Falcone, Jean-Luc; Bonadonna, Costanza; Di Traglia, Federico; Pistolesi, Marco; Rosi, Mauro; Lestuzzi, Pierino

    2016-10-01

    We present a probabilistic approach to quantify the hazard posed by volcanic ballistic projectiles (VBP) and their potential impact on the built environment. A model named Great Balls of Fire (GBF) is introduced to describe ballistic trajectories of VBPs accounting for a variable drag coefficient and topography. It relies on input parameters easily identifiable in the field and is designed to model large numbers of VBPs stochastically. Associated functions come with the GBF code to post-process model outputs into a comprehensive probabilistic hazard assessment for VBP impacts. Outcomes include probability maps to exceed given thresholds of kinetic energy at impact, hazard curves and probabilistic isoenergy maps. Probabilities are calculated either on equally-sized pixels or on zones of interest. The approach is calibrated, validated and applied to La Fossa volcano, Vulcano Island (Italy). We constructed a generic eruption scenario based on stratigraphic studies and numerical inversions of the 1888-1890 long-lasting Vulcanian cycle of La Fossa. Results suggest a ~10⁻² % probability of occurrence of VBP impacts with kinetic energies ≤ 10⁴ J at the touristic locality of Porto. In parallel, the vulnerability to roof perforation was estimated by combining field observations and published literature, allowing for a first estimate of the potential impact of VBPs during future Vulcanian eruptions. Results indicate a high physical vulnerability to the VBP hazard, with half of the building stock having a ≥ 2.5 × 10⁻³ % probability of roof perforation.
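The per-pixel exceedance probabilities described above can be estimated from stochastic model outputs roughly as follows. This is a hedged sketch with invented variable names and toy data, not the GBF post-processing code: bin simulated impact locations whose kinetic energy exceeds a threshold, and normalize by the number of simulated projectiles.

```python
import numpy as np

def exceedance_map(x, y, energy, extent, shape, threshold, n_sim):
    """Per-pixel estimate of the probability that a simulated VBP impact
    lands in the pixel with kinetic energy above `threshold` (J)."""
    hits = energy > threshold
    H, _, _ = np.histogram2d(x[hits], y[hits], bins=shape, range=extent)
    return H / n_sim

rng = np.random.default_rng(0)
n = 10_000                             # simulated VBPs
x, y = rng.random(n), rng.random(n)    # impact coordinates (arbitrary units)
energy = rng.uniform(0.0, 2e4, n)      # impact kinetic energies (J)
pmap = exceedance_map(x, y, energy, [[0, 1], [0, 1]], (50, 50), 1e4, n)
```

Summing the map recovers the overall fraction of simulated impacts above the threshold, which is a quick sanity check on the binning.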

  17. Combining a wavelet transform with a channelized Hotelling observer for tumor detection in 3D PET oncology imaging

    NASA Astrophysics Data System (ADS)

    Lartizien, Carole; Tomei, Sandrine; Maxim, Voichita; Odet, Christophe

    2007-03-01

    This study evaluates new observer models for 3D whole-body Positron Emission Tomography (PET) imaging based on a wavelet sub-band decomposition and compares them with the classical constant-Q CHO model. Our final goal is to develop an original method that performs guided detection of abnormal activity foci in PET oncology imaging based on these new observer models. Such a computer-aided detection method would be of great benefit to clinicians for diagnosis and to biologists for large-scale screening of rodent populations in molecular imaging. Method: We have previously shown good correlation of the channelized Hotelling observer (CHO) using a constant-Q model with human observer performance for 3D PET oncology imaging. We propose an alternative method that combines a CHO observer with a wavelet sub-band decomposition of the image, and we compare it to the standard CHO implementation. This method performs an undecimated transform using a biorthogonal B-spline 4/4 wavelet basis to extract the feature set for input to the Hotelling observer. This work is based on simulated 3D PET images of an extended MCAT phantom with randomly located lesions. We compare three evaluation criteria: classification performance using the signal-to-noise ratio (SNR), computational efficiency, and visual quality of the derived 3D maps of the decision variable λ. The SNR is estimated on a series of test images for a variable number of training images for both observers. Results: Results show that the maximum SNR is higher with the constant-Q CHO observer, especially for targets located in the liver, and that it is reached with a smaller number of training images. However, preliminary analysis indicates that the visual quality of the 3D maps of the decision variable λ is higher with the wavelet-based CHO, and the computation time to derive a 3D λ-map is about 350 times shorter than for the standard CHO.
This suggests that the wavelet-CHO observer is a good candidate for use in our guided detection method.
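Whichever channel set is used (constant-Q bands or wavelet sub-bands), the Hotelling step reduces each image to a few channel outputs and computes an SNR from their means and pooled covariance. A minimal numpy sketch on synthetic channel outputs (the feature matrices and names are illustrative, not the authors' code):

```python
import numpy as np

def cho_snr(feat_signal, feat_noise):
    """Hotelling SNR from (n_images, n_channels) channel-output matrices
    for signal-present and signal-absent classes."""
    d = feat_signal.mean(0) - feat_noise.mean(0)              # mean difference
    S = 0.5 * (np.cov(feat_signal.T) + np.cov(feat_noise.T))  # pooled covariance
    return float(np.sqrt(d @ np.linalg.solve(S, d)))

rng = np.random.default_rng(1)
noise = rng.standard_normal((200, 4))              # 200 "training" images, 4 channels
signal = noise + np.array([1.0, 0.5, 0.0, 0.0])    # additive signal in 2 channels
snr = cho_snr(signal, noise)                       # theoretical value sqrt(1.25)
```

As the abstract notes, the quality of this estimate depends on the number of training images available to estimate the channel covariance S.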

  18. Comparison of reconstruction methods and quantitative accuracy in Siemens Inveon PET scanner

    NASA Astrophysics Data System (ADS)

    Ram Yu, A.; Kim, Jin Su; Kang, Joo Hyun; Moo Lim, Sang

    2015-04-01

    PET reconstruction is key to the quantification of PET data. To our knowledge, no comparative study of reconstruction methods has been performed to date. In this study, we compared reconstruction methods with various filters in terms of their spatial resolution, non-uniformities (NU), recovery coefficients (RCs), and spillover ratios (SORs). In addition, the linearity between measured and true radioactivity concentrations was also assessed. A Siemens Inveon PET scanner was used in this study. Spatial resolution was measured according to the NEMA standard using a 1 mm³ 18F point source. Image quality was assessed in terms of NU, RC and SOR. To measure the effect of reconstruction algorithms and filters, data were reconstructed using FBP, the 3D reprojection algorithm (3DRP), ordered subset expectation maximization 2D (OSEM 2D), and maximum a posteriori (MAP) with various filters or smoothing factors (β). To assess the linearity of reconstructed radioactivity, an image quality phantom filled with 18F was imaged and reconstructed using FBP, OSEM 2D and MAP (β = 1.5 and 5 × 10⁻⁵). The highest achievable volumetric resolution was 2.31 mm³ and the highest RCs were obtained when OSEM 2D was used. SOR was 4.87% for air and 3.97% for water when OSEM 2D reconstruction was used. The measured radioactivity of the reconstructed image was proportional to the injected one for radioactivity below 16 MBq/ml when FBP or OSEM 2D reconstruction methods were used. By contrast, when the MAP reconstruction method was used, the activity of the reconstructed image increased proportionally, regardless of the amount of injected radioactivity. When OSEM 2D or FBP were used, the measured radioactivity concentration was reduced by 53% compared with the true injected radioactivity for radioactivity above 16 MBq/ml.
The OSEM 2D reconstruction method provides the highest achievable volumetric resolution and the highest RC among all the tested methods, and yields a linear relation between the measured and true concentrations for radioactivity below 16 MBq/ml. Our data collectively show that the OSEM 2D reconstruction method provides quantitatively accurate reconstructed PET data.

  19. Evidence of Probabilistic Behaviour in Protein Interaction Networks

    DTIC Science & Technology

    2008-01-31

    Evidence of degree-weighted connectivity in nine PPI networks. a, Homo sapiens (human); b, Drosophila melanogaster (fruit fly); c-e, Saccharomyces...illustrates maps for the networks of Homo sapiens and Drosophila melanogaster, while maps for the remaining networks are provided in Additional file 2. As...protein-protein interaction networks. a, Homo sapiens; b, Drosophila melanogaster. Distances shown as average shortest path lengths L(k1, k2) between

  20. Probabilistic safety analysis for urgent situations following the accidental release of a pollutant in the atmosphere

    NASA Astrophysics Data System (ADS)

    Armand, P.; Brocheton, F.; Poulet, D.; Vendel, F.; Dubourg, V.; Yalamas, T.

    2014-10-01

    This paper is an original contribution to uncertainty quantification in atmospheric transport & dispersion (AT&D) at the local scale (1-10 km). It is proposed to account for the imprecise knowledge of the meteorological and release conditions in the case of an accidental hazardous atmospheric emission. The aim is to produce probabilistic risk maps instead of a deterministic toxic load map, in order to help stakeholders make their decisions. Given the urgency of such situations, the proposed methodology is able to produce such maps in a limited amount of time. It resorts to a Lagrangian particle dispersion model (LPDM) using wind fields interpolated from a pre-established database that collects the results from a computational fluid dynamics (CFD) model. This enables a decoupling of the CFD simulations from the dispersion analysis, and thus a considerable saving of computational time. In order to make the Monte-Carlo-sampling-based estimation of the probability field even faster, it is also proposed to resort to a vector Gaussian process surrogate model together with high performance computing (HPC) resources. The Gaussian process (GP) surrogate modelling technique is coupled with a probabilistic principal component analysis (PCA) for reducing the number of GP predictors to fit, store and predict. The design of experiments (DOE) from which the surrogate model is built is run over a cluster of PCs to make the total production time as short as possible. The use of GP predictors is validated by comparing the results produced by this technique with those obtained by crude Monte Carlo sampling.
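The surrogate idea above, replacing an expensive model evaluation with a GP prediction trained on a design of experiments, can be sketched in plain numpy. This is a toy 1D version under our own assumptions (a sine stands in for one dispersion-model output; the real study uses a vector GP coupled with PCA):

```python
import numpy as np

def rbf(a, b, ell=1.0):
    """Squared-exponential kernel between two 1D point sets."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

def gp_predict(x_train, y_train, x_test, jitter=1e-8):
    """GP posterior mean at x_test, with a small jitter for conditioning."""
    K = rbf(x_train, x_train) + jitter * np.eye(x_train.size)
    return rbf(x_test, x_train) @ np.linalg.solve(K, y_train)

x = np.linspace(0.0, 5.0, 15)   # design-of-experiments (DOE) inputs
y = np.sin(x)                   # outputs of the "expensive" model at the DOE
pred = gp_predict(x, y, np.array([2.5]))   # cheap surrogate evaluation
```

Once trained, each surrogate evaluation is a small matrix-vector product, which is what makes Monte Carlo sampling over meteorological and release uncertainties affordable.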

  1. Methamphetamine-sensitized rats show augmented dopamine release to methylphenidate stimulation: a positron emission tomography using [18F]fallypride.

    PubMed

    Ota, Miho; Ogawa, Shintaro; Kato, Koichi; Wakabayashi, Chisato; Kunugi, Hiroshi

    2015-04-30

    Previous studies demonstrated that patients with schizophrenia show greater sensitivity to psychostimulants than healthy subjects. Sensitization to psychostimulants and the resultant alteration of dopaminergic neurotransmission in rodents have been suggested as a useful model of schizophrenia. This study aimed to examine the use of methylphenidate as a psychostimulant to induce dopamine release and that of [18F]fallypride as a radioligand to estimate the release in a rat model of schizophrenia. Six rats were scanned by positron emission tomography (PET) twice, before and after methylphenidate challenge, to evaluate dopamine release. After the scans, these rats were sensitized by repeated methamphetamine (MAP) administration. Then, they were re-scanned twice, again before and after methylphenidate challenge, to evaluate whether MAP-sensitized rats show greater sensitivity to methylphenidate. We observed main effects of both MAP pretreatment and methylphenidate challenge. We found that the % change of distribution volume ratio after repeated administration of MAP was greater than that before sensitization. These results suggest that methylphenidate-induced striatal dopamine release increased after sensitization to MAP. PET scanning with [18F]fallypride under methylphenidate challenge may provide a biological marker for schizophrenia and be useful in its diagnosis. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  2. A tesselated probabilistic representation for spatial robot perception and navigation

    NASA Technical Reports Server (NTRS)

    Elfes, Alberto

    1989-01-01

    The ability to recover robust spatial descriptions from sensory information and to efficiently utilize these descriptions in appropriate planning and problem-solving activities are crucial requirements for the development of more powerful robotic systems. Traditional approaches to sensor interpretation, with their emphasis on geometric models, are of limited use for autonomous mobile robots operating in and exploring unknown and unstructured environments. Here, researchers present a new approach to robot perception that addresses such scenarios using a probabilistic tesselated representation of spatial information called the Occupancy Grid. The Occupancy Grid is a multi-dimensional random field that maintains stochastic estimates of the occupancy state of each cell in the grid. The cell estimates are obtained by interpreting incoming range readings using probabilistic models that capture the uncertainty in the spatial information provided by the sensor. A Bayesian estimation procedure allows the incremental updating of the map using readings taken from several sensors over multiple points of view. An overview of the Occupancy Grid framework is given, and its application to a number of problems in mobile robot mapping and navigation are illustrated. It is argued that a number of robotic problem-solving activities can be performed directly on the Occupancy Grid representation. Some parallels are drawn between operations on Occupancy Grids and related image processing operations.
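The Bayesian cell update at the heart of the Occupancy Grid is commonly written in log-odds form, so that successive sensor readings add rather than multiply. A minimal sketch (the inverse-sensor probabilities below are made-up placeholders, not Elfes' range-sensor models):

```python
import numpy as np

def update_cell(logodds, p_occ_given_reading):
    """Incremental Bayesian update of one cell's occupancy, in log-odds."""
    return logodds + np.log(p_occ_given_reading / (1.0 - p_occ_given_reading))

def probability(logodds):
    """Convert log-odds back to P(cell occupied)."""
    return 1.0 / (1.0 + np.exp(-logodds))

l = 0.0                      # prior P(occupied) = 0.5
for p in [0.7, 0.7, 0.6]:    # three readings suggesting occupancy
    l = update_cell(l, p)
print(round(probability(l), 3))   # → 0.891
```

Because the update is a simple sum per cell, readings from several sensors and viewpoints can be fused incrementally, exactly the property the abstract emphasizes.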

  3. Integrating statistical and process-based models to produce probabilistic landslide hazard at regional scale

    NASA Astrophysics Data System (ADS)

    Strauch, R. L.; Istanbulluoglu, E.

    2017-12-01

    We develop a landslide hazard modeling approach that integrates a data-driven statistical model and a probabilistic process-based shallow landslide model for mapping probability of landslide initiation, transport, and deposition at regional scales. The empirical model integrates the influence of seven site attribute (SA) classes: elevation, slope, curvature, aspect, land use-land cover, lithology, and topographic wetness index, on over 1,600 observed landslides using a frequency ratio (FR) approach. A susceptibility index is calculated by adding FRs for each SA on a grid-cell basis. Using landslide observations, we relate the susceptibility index to an empirically-derived probability of landslide impact. This probability is combined with results from a physically-based model to produce an integrated probabilistic map. Slope was key in landslide initiation, while deposition was linked to lithology and elevation. Vegetation transition from forest to alpine vegetation and barren land cover with lower root cohesion leads to higher frequency of initiation. Aspect effects are likely linked to differences in root cohesion and moisture controlled by solar insolation and snow. We demonstrate the model in the North Cascades of Washington, USA and identify locations of high and low probability of landslide impacts that can be used by land managers in their design, planning, and maintenance.
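The frequency-ratio step can be sketched as follows for a single site attribute; the toy grid and class labels are ours. FR of a class is the landslide density observed in that class relative to the density expected if landslides were spread uniformly, so FR > 1 marks over-represented classes:

```python
import numpy as np

def frequency_ratio(attr_class, landslide, n_classes):
    """FR per class: (% of landslide cells in class) / (% of all cells in class)."""
    fr = np.zeros(n_classes)
    for c in range(n_classes):
        in_c = attr_class == c
        pct_slides = landslide[in_c].sum() / landslide.sum()
        pct_cells = in_c.sum() / attr_class.size
        fr[c] = pct_slides / pct_cells
    return fr

# Toy 100-cell grid: class 1 (steep slopes, say) holds most observed landslides.
attr = np.array([0] * 80 + [1] * 20)
slides = np.array([0] * 75 + [1] * 5 + [0] * 5 + [1] * 15)
fr = frequency_ratio(attr, slides, 2)
```

The susceptibility index of the abstract is then the per-cell sum of such FRs over all seven site attributes.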

  4. A Probabilistic Tsunami Hazard Study of the Auckland Region, Part I: Propagation Modelling and Tsunami Hazard Assessment at the Shoreline

    NASA Astrophysics Data System (ADS)

    Power, William; Wang, Xiaoming; Lane, Emily; Gillibrand, Philip

    2013-09-01

    Regional source tsunamis represent a potentially devastating threat to coastal communities in New Zealand, yet are infrequent events for which little historical information is available. It is therefore essential to develop robust methods for quantitatively estimating the hazards posed, so that effective mitigation measures can be implemented. We develop a probabilistic model for the tsunami hazard posed to the Auckland region of New Zealand from the Kermadec Trench and the southern New Hebrides Trench subduction zones. An innovative feature of our model is the systematic analysis of uncertainty regarding the magnitude-frequency distribution of earthquakes in the source regions. The methodology is first used to estimate the tsunami hazard at the coastline, and then used to produce a set of scenarios that can be applied to produce probabilistic maps of tsunami inundation for the study region; the production of these maps is described in part II. We find that the 2,500 year return period regional source tsunami hazard for the densely populated east coast of Auckland is dominated by events originating in the Kermadec Trench, while the equivalent hazard to the sparsely populated west coast is approximately equally due to events on the Kermadec Trench and the southern New Hebrides Trench.

  5. Ultra low-dose CT attenuation correction in PET SPM

    NASA Astrophysics Data System (ADS)

    Wang, Shyh-Jen; Yang, Bang-Hung; Tsai, Chia-Jung; Yang, Ching-Ching; Lee, Jason J. S.; Wu, Tung-Hsin

    2010-07-01

    The use of CT images for attenuation correction (CTAC) allows significantly shorter scanning times and a high-quality, noise-free attenuation map compared with a conventional germanium-68 transmission scan, because at least 10⁴ times greater photon flux is generated by a CT scan under standard operating conditions. However, the CTAC technique potentially introduces more radiation risk to patients owing to the higher radiation exposure from the CT scan. Statistical parametric mapping (SPM) is a prominent technique in the nuclear medicine community for the analysis of brain imaging data. The purpose of this study is to assess the feasibility of low-dose CT (LDCT) and ultra-low-dose CT (UDCT) in PET SPM applications. The study was divided into two parts. The first part evaluated the tracer uptake distribution pattern and quantitative analysis using a striatal phantom, to initially assess the feasibility of AC for clinical purposes. The second part examined group SPM analysis using the Hoffman brain phantom. The phantom studies simulate the human brain and reduce the experimental uncertainty of real subjects. The initial studies show that the results of PET SPM analysis have no significant differences between LDCT and UDCT compared with the currently used default CTAC. Moreover, the dose of the LDCT is lower than that of the default CT by a factor of 9, and UDCT can even yield a 42-fold dose reduction. We have demonstrated that the SPM results obtained using LDCT and UDCT for PET AC are comparable to those using the default CT setting, suggesting their feasibility in PET SPM applications. In addition, the necessity of UDCT in PET SPM studies to avoid excess radiation dose is also evident, since most of the subjects involved are non-cancer patients or children, and some normal subjects even serve as a comparison group in the experiment. 
It is our belief that additional attempts to decrease the radiation dose would be valuable, especially for children and normal volunteers, to work towards ALARA (as low as reasonably achievable) concept for PET SPM studies.

  6. HIGH-RESOLUTION L(Y)SO DETECTORS USING PMT-QUADRANT-SHARING FOR HUMAN & ANIMAL PET CAMERAS

    PubMed Central

    Ramirez, Rocio A.; Liu, Shitao; Liu, Jiguo; Zhang, Yuxuan; Kim, Soonseok; Baghaei, Hossain; Li, Hongdi; Wang, Yu; Wong, Wai-Hoi

    2009-01-01

    We developed high resolution L(Y)SO detectors for human and animal PET applications using photomultiplier-quadrant-sharing (PQS) technology. The crystal sizes were 1.27 × 1.27 × 10 mm³ for the animal PQS blocks and 3.25 × 3.25 × 20 mm³ for the human ones. Polymer mirror film patterns (PMR) were placed between crystals as reflector. The blocks were assembled using optical grease and wrapped with Teflon tape. The blocks were coupled to regular round PMTs of 19/51 mm in the PQS configuration. List-mode data of a Ga-68 source (511 keV) were acquired with our high yield pileup-event recovery (HYPER) electronics and data acquisition software. The high voltage bias was 1100 V. Crystal decoding maps and individual crystal energy resolutions were extracted from the data. To investigate the potential imaging resolution of PET cameras built from these blocks, we used the GATE (Geant4 Application for Tomographic Emission) simulation package. GATE is a GEANT4-based software toolkit for realistic simulation of PET and SPECT systems. The packing fractions of these blocks were found to be 95.6% and 98.2%. From the decoding maps, all 196 and 225 crystals were clearly identified. The average energy resolutions were 14.0% and 15.6%. For the small animal PET system, the detector ring diameter was 16.5 cm with an axial field of view (AFOV) of 11.8 cm. The simulation data suggest that a reconstructed radial (tangential) spatial resolution of 1.24 (1.25) mm near the center is potentially achievable. For the whole-body human PET system, the detector ring diameter was 86 cm. The simulation data suggest that a reconstructed radial (tangential) spatial resolution of 3.09 (3.38) mm near the center is potentially achievable. From this study we conclude that the PQS design can achieve high spatial resolution and excellent energy resolution in human and animal PET systems with substantially lower production costs and inexpensive readout devices. PMID:19946463

  7. Impacts of potential seismic landslides on lifeline corridors.

    DOT National Transportation Integrated Search

    2015-02-01

    This report presents a fully probabilistic method for regional seismically induced landslide hazard analysis and : mapping. The method considers the most current predictions for strong ground motions and seismic sources : through use of the U.S.G.S. ...

  8. Long-term multi-hazard assessment for El Misti volcano (Peru)

    NASA Astrophysics Data System (ADS)

    Sandri, Laura; Thouret, Jean-Claude; Constantinescu, Robert; Biass, Sébastien; Tonini, Roberto

    2014-02-01

    We propose a long-term probabilistic multi-hazard assessment for El Misti Volcano, a composite cone located <20 km from Arequipa, the second-largest Peruvian city, a rapidly expanding economic centre classified by UNESCO as a World Heritage site. We apply the Bayesian Event Tree code for Volcanic Hazard (BET_VH) to produce probabilistic hazard maps for the predominant volcanic phenomena that may affect the c. 900,000 people living around the volcano. The methodology accounts for the natural variability displayed by volcanoes in their eruptive behaviour, such as different types/sizes of eruptions and possible vent locations. For this purpose, we treat probabilistically several model runs for some of the main hazardous phenomena (lahars, pyroclastic density currents (PDCs), tephra fall and ballistic ejecta) and data from past eruptions at El Misti (tephra fall, PDCs and lahars) and at other volcanoes (PDCs). The hazard maps, although neglecting possible interactions among phenomena or cascade effects, have been produced with a homogeneous method and refer to a common time window of 1 year. The probability maps reveal that only the north and east suburbs of Arequipa are exposed to all volcanic threats except for ballistic ejecta, which are limited to the uninhabited but touristic summit cone. The probability of pyroclastic density currents reaching the recently expanding urban areas and the city along ravines is around 0.05 %/year, similar to the probability obtained for roof-critical tephra loading during the rainy season. Lahars represent by far the most probable threat (around 10 %/year) because at least four radial drainage channels can convey them approximately 20 km away from the volcano across the entire city area during heavy rain episodes, even without an eruption. The Río Chili Valley represents the major concern for city safety owing to the probable cascading effect of combined threats: PDCs and rockslides, dammed-lake break-outs and subsequent lahars or floods. 
Although this study does not intend to replace the current El Misti hazard map, the quantitative results of this probabilistic multi-hazard assessment can be incorporated into a multi-risk analysis, to support decision makers in any future improvement of the current hazard evaluation, such as further land-use planning and possible emergency management.

  9. Analysis of the French insurance market exposure to floods: a stochastic model combining river overflow and surface runoff

    NASA Astrophysics Data System (ADS)

    Moncoulon, D.; Labat, D.; Ardon, J.; Leblois, E.; Onfroy, T.; Poulard, C.; Aji, S.; Rémy, A.; Quantin, A.

    2014-09-01

    The analysis of flood exposure at a national scale for the French insurance market must combine the generation of a probabilistic event set of possible, but not yet observed, flood situations with hazard and damage modeling. In this study, hazard and damage models are calibrated on a 1995-2010 historical event set, both for hazard results (river flow, flooded areas) and loss estimations. Thus, uncertainties in the deterministic estimation of a single event loss are known before simulating a probabilistic event set. To take into account at least 90 % of the insured flood losses, the probabilistic event set must combine river overflow (small and large catchments) with surface runoff due to heavy rainfall on the slopes of the watershed. Indeed, internal studies of the CCR (Caisse Centrale de Réassurance) claim database have shown that approximately 45 % of the insured flood losses are located inside the floodplains and 45 % outside. Another 10 % is due to sea surge floods and groundwater rise. In this approach, two independent probabilistic methods are combined to create a single flood loss distribution: a generation of fictive river flows based on the historical records of the river gauge network, and a generation of fictive rain fields on small catchments calibrated on the 1958-2010 Météo-France rain database SAFRAN. All the events in the probabilistic event sets are simulated with the deterministic model. This hazard and damage distribution is used to simulate the flood losses at the national scale for an insurance company (Macif) and to generate flood areas associated with hazard return periods. The flood maps concern river overflow and surface water runoff. Validation of these maps is conducted by comparison with address-located claim data on a small catchment (downstream Argens).

  10. Nonparametric Residue Analysis of Dynamic PET Data With Application to Cerebral FDG Studies in Normals.

    PubMed

    O'Sullivan, Finbarr; Muzi, Mark; Spence, Alexander M; Mankoff, David M; O'Sullivan, Janet N; Fitzgerald, Niall; Newman, George C; Krohn, Kenneth A

    2009-06-01

    Kinetic analysis is used to extract metabolic information from dynamic positron emission tomography (PET) uptake data. The theory of indicator dilutions, developed in the seminal work of Meier and Zierler (1954), provides a probabilistic framework for representation of PET tracer uptake data in terms of a convolution between an arterial input function and a tissue residue. The residue is a scaled survival function associated with tracer residence in the tissue. Nonparametric inference for the residue, a deconvolution problem, provides a novel approach to kinetic analysis, critically one that is not reliant on specific compartmental modeling assumptions. A practical computational technique based on regularized cubic B-spline approximation of the residence time distribution is proposed. Nonparametric residue analysis allows formal statistical evaluation of specific parametric models to be considered. This analysis needs to properly account for the increased flexibility of the nonparametric estimator. The methodology is illustrated using data from a series of cerebral studies with PET and fluorodeoxyglucose (FDG) in normal subjects. Comparisons are made between key functionals of the residue (tracer flux, flow, etc.) resulting from a parametric analysis (the standard two-compartment model of Phelps et al., 1979) and a nonparametric analysis. Strong statistical evidence against the compartment model is found. Primarily these differences relate to the representation of the early temporal structure of the tracer residence, largely a function of the vascular supply network. There are convincing physiological arguments against the representations implied by the compartmental approach, but this is the first time that a rigorous statistical confirmation using PET data has been reported. The compartmental analysis produces suspect values for flow but, notably, the impact on the metabolic flux, though statistically significant, is limited to deviations on the order of 3%-4%. 
The general advantage of the nonparametric residue analysis is the ability to provide a valid kinetic quantitation in the context of studies where there may be heterogeneity or other uncertainty about the accuracy of a compartmental model approximation of the tissue residue.
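The convolution representation underlying the residue analysis can be sketched numerically as follows. The one-compartment residue below is a toy stand-in chosen by us for illustration, not the paper's nonparametric B-spline estimator; the rate constants are invented:

```python
import numpy as np

def tissue_curve(cp, residue, dt):
    """Discretised indicator-dilution model: tissue activity is the
    convolution of the arterial input Cp with the residue R."""
    return np.convolve(cp, residue)[: cp.size] * dt

dt = 0.5
t = np.arange(0.0, 60.0, dt)       # minutes
cp = np.exp(-t / 5.0)              # toy arterial input function
K1, k2 = 0.1, 0.2                  # toy rate constants (1/min), illustrative only
residue = K1 * np.exp(-k2 * t)     # residue as a scaled survival function
ct = tissue_curve(cp, residue, dt)
```

Nonparametric residue analysis replaces the exponential `residue` above with a regularized B-spline estimate and recovers it from `ct` and `cp` by deconvolution.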

  11. Evaluation of Horizontal Seismic Hazard of Shahrekord, Iran

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amiri, G. Ghodrati; Dehkordi, M. Raeisi; Amrei, S. A. Razavian

    2008-07-08

    This paper presents a probabilistic horizontal seismic hazard assessment of Shahrekord, Iran. It displays the probabilistic estimate of Peak Ground Horizontal Acceleration (PGHA) for return periods of 75, 225, 475 and 2475 years. The output of the probabilistic seismic hazard analysis is based on peak ground acceleration (PGA), which is the most common criterion in the design of buildings. A catalogue of seismic events that includes both historical and instrumental events was developed and covers the period from 840 to 2007. The seismic sources that affect the hazard in Shahrekord were identified within a radius of 150 km, and the recurrence relationships of these sources were generated. Finally, four maps have been prepared to indicate the earthquake hazard of Shahrekord in the form of iso-acceleration contour lines for different hazard levels by using the SEISRISK III software.

  12. Reconstructing cerebrovascular networks under local physiological constraints by integer programming

    DOE PAGES

    Rempfler, Markus; Schneider, Matthias; Ielacqua, Giovanna D.; ...

    2015-04-23

    We introduce a probabilistic approach to vessel network extraction that enforces physiological constraints on the vessel structure. The method accounts for both image evidence and geometric relationships between vessels by solving an integer program, which is shown to yield the maximum a posteriori (MAP) estimate of the probabilistic model. Starting from an over-connected network, it prunes vessel stumps and spurious connections by evaluating the local geometry and the global connectivity of the graph. We utilize a high-resolution micro computed tomography (µCT) dataset of a cerebrovascular corrosion cast to obtain a reference network and learn the prior distributions of our probabilistic model. Finally, we perform experiments on micro magnetic resonance angiography (µMRA) images of mouse brains and discuss properties of the networks obtained under different tracking and pruning approaches.

  13. Evaluation of attenuation and scatter correction requirements in small animal PET and SPECT imaging

    NASA Astrophysics Data System (ADS)

    Konik, Arda Bekir

    Positron emission tomography (PET) and single photon emission tomography (SPECT) are two nuclear emission-imaging modalities that rely on the detection of high-energy photons emitted from radiotracers administered to the subject. The majority of these photons are attenuated (absorbed or scattered) in the body, resulting in count losses or deviations from true detection, which in turn degrades the accuracy of images. In clinical emission tomography, sophisticated correction methods are often required employing additional x-ray CT or radionuclide transmission scans. Having proven their potential in both clinical and research areas, both PET and SPECT are being adapted for small animal imaging. However, despite the growing interest in small animal emission tomography, little scientific information exists about the accuracy of these correction methods on smaller size objects, and what level of correction is required. The purpose of this work is to determine the role of attenuation and scatter corrections as a function of object size through simulations. The simulations were performed using Interactive Data Language (IDL) and a Monte Carlo based package, Geant4 application for emission tomography (GATE). In IDL simulations, PET and SPECT data acquisition were modeled in the presence of attenuation. A mathematical emission and attenuation phantom approximating a thorax slice and slices from real PET/CT data were scaled to 5 different sizes (i.e., human, dog, rabbit, rat and mouse). The simulated emission data collected from these objects were reconstructed. The reconstructed images, with and without attenuation correction, were compared to the ideal (i.e., non-attenuated) reconstruction. 
Next, using GATE, scatter fraction values (the ratio of scattered counts to total counts) of PET and SPECT scanners were measured for various sizes of NEMA (cylindrical phantoms representing small animals and humans), MOBY (realistic mouse/rat model) and XCAT (realistic human model) digital phantoms. In addition, PET projection files for different sizes of MOBY phantoms were reconstructed under 6 different conditions, including attenuation and scatter corrections. Selected regions were analyzed for these different reconstruction conditions and object sizes. Finally, real mouse data acquired on the physical counterpart of the small animal PET scanner modeled in our simulations were analyzed under similar reconstruction conditions. Both our IDL and GATE simulations showed that, for small animal PET and SPECT, even the smallest objects (˜2 cm diameter) showed ˜15% error when neither attenuation nor scatter was corrected. However, a simple attenuation correction using a uniform attenuation map and an object boundary obtained from emission data significantly reduces this error in non-lung regions (˜1% for the smallest size and ˜6% for the largest size). In the lungs, emission values were overestimated when only attenuation correction was performed. In addition, we did not observe any significant difference between the use of a uniform and the actual attenuation map (e.g., only ˜0.5% for the largest size in PET studies). Scatter correction was not significant for smaller objects, but became increasingly important for larger ones. These results suggest that for all mouse sizes and most rat sizes, uniform attenuation correction can be performed using emission data only. For smaller sizes up to ˜4 cm, scatter correction is not required even in lung regions. For larger sizes, if accurate quantitation is needed, an additional transmission scan may be required to estimate an accurate attenuation map for both attenuation and scatter corrections.
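The scatter fraction quoted throughout the paragraph above is simply the ratio of scattered to total detected counts; a minimal sketch with invented counts:

```python
# Scatter fraction as defined in the text: scattered counts divided by
# total detected counts (true + scattered). The numbers are made up.

def scatter_fraction(scatter_counts, true_counts):
    total = scatter_counts + true_counts
    return scatter_counts / total

# e.g. a phantom where 4,000 of 14,000 detected counts are scatter:
print(round(scatter_fraction(4000, 10000), 3))  # -> 0.286
```

Larger objects give larger scatter fractions, which is why the abstract finds scatter correction increasingly important with object size.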

  14. Comparison and validation of shallow landslides susceptibility maps generated by bi-variate and multi-variate linear probabilistic GIS-based techniques. A case study from Ribeira Quente Valley (S. Miguel Island, Azores)

    NASA Astrophysics Data System (ADS)

    Marques, R.; Amaral, P.; Zêzere, J. L.; Queiroz, G.; Goulart, C.

    2009-04-01

    Slope instability research and susceptibility mapping is a fundamental component of hazard assessment and is of extreme importance for risk mitigation, land-use management and emergency planning. Landslide susceptibility zonation has been actively pursued during the last two decades and several methodologies are still being improved. Among all the methods presented in the literature, indirect quantitative probabilistic methods have been extensively used. In this work different linear probabilistic methods, both bi-variate and multi-variate (Informative Value, Fuzzy Logic, Weights of Evidence and Logistic Regression), were used for the computation of the spatial probability of landslide occurrence, using the pixel as mapping unit. The methods used are based on linear relationships between landslides and 9 conditioning factors (altimetry, slope angle, exposition, curvature, distance to streams, wetness index, contribution area, lithology and land-use). It was assumed that future landslides will be conditioned by the same factors as past landslides in the study area. The work was developed for Ribeira Quente Valley (S. Miguel Island, Azores), a study area of 9.5 km2, mainly composed of volcanic deposits (ash and pumice lapilli) produced by explosive eruptions in Furnas Volcano. These materials, combined with the steepness of the slopes (38.9% of the area has slope angles higher than 35°, reaching a maximum of 87.5°), make the area very prone to landslide activity. A total of 1,495 shallow landslides were mapped (at 1:5,000 scale) and included in a GIS database. The total affected area is 401,744 m2 (4.5% of the study area). Most slope movements are translational slides frequently evolving into debris flows. The landslides are elongated, with maximum length generally equivalent to the slope extent, and their width normally does not exceed 25 m. The failure depth rarely exceeds 1.5 m and the volume is usually smaller than 700 m3. 
For modelling purposes, the landslides were randomly divided into two sub-datasets: a modelling dataset with 748 events (2.2% of the study area) and a validation dataset with 747 events (2.3% of the study area). The susceptibility models obtained with the different probabilistic techniques were rated individually using success rate and prediction rate curves. The best model performance was obtained with logistic regression, although the results from the different methods show no significant differences in either success or prediction rate curves. This evidence revealed that: (1) the modelling landslide dataset is representative of the characteristics of the entire landslide population; and (2) the increase in complexity and robustness of the probabilistic methodology did not produce a significant increase in success or prediction rates. Therefore, it was concluded that the resolution and quality of the input variables are much more important than the probabilistic model chosen to assess landslide susceptibility. This work was developed within the framework of the VOLCSOILRISK project (Volcanic Soils Geotechnical Characterization for Landslide Risk Mitigation), supported by Direcção Regional da Ciência e Tecnologia - Governo Regional dos Açores.
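Success-rate curves of the kind used to rate the models can be sketched as follows: rank mapping units (pixels) by predicted susceptibility and accumulate the fraction of landslide pixels captured as more of the area is included. The function and data are illustrative, not the study's implementation:

```python
# Sketch of a success-rate curve: sort pixels from most to least
# susceptible and track which fraction of landslide pixels is captured
# within the top x fraction of the area. Toy data, not the study's.

def success_rate_curve(susceptibility, is_landslide):
    order = sorted(range(len(susceptibility)),
                   key=lambda i: susceptibility[i], reverse=True)
    total_slides = sum(is_landslide)
    n = len(order)
    curve, captured = [], 0
    for rank, i in enumerate(order, start=1):
        captured += is_landslide[i]
        curve.append((rank / n, captured / total_slides))
    return curve

# 4 pixels, 2 of them landslide pixels, perfectly ranked by the model:
print(success_rate_curve([0.9, 0.2, 0.7, 0.1], [1, 0, 1, 0]))
# -> [(0.25, 0.5), (0.5, 1.0), (0.75, 1.0), (1.0, 1.0)]
```

A prediction-rate curve is computed the same way, but against the held-out validation landslide dataset rather than the landslides used to fit the model.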

  15. Joint explorative analysis of neuroreceptor subsystems in the human brain: application to receptor-transporter correlation using PET data.

    PubMed

    Cselényi, Zsolt; Lundberg, Johan; Halldin, Christer; Farde, Lars; Gulyás, Balázs

    2004-10-01

    Positron emission tomography (PET) has proved to be a highly successful technique for the qualitative and quantitative exploration of the human brain's neurotransmitter-receptor systems. In recent years, the number of PET radioligands targeted to different neuroreceptor systems of the human brain has increased considerably. This development paves the way for a simultaneous analysis of different receptor systems and subsystems in the same individual. The detailed exploration of the versatility of neuroreceptor systems requires novel technical approaches capable of operating on huge parametric image datasets. An initial step of such explorative data processing and analysis should be the development of novel exploratory data-mining tools to gain insight into the "structure" of complex multi-individual, multi-receptor data sets. For practical reasons, a possible and feasible starting point of multi-receptor research is the analysis of the pre- and post-synaptic binding sites of the same neurotransmitter. In the present study, we propose an unsupervised, unbiased data-mining tool for this task and demonstrate its usefulness using quantitative receptor maps, obtained with positron emission tomography from five healthy subjects, of (pre-synaptic) serotonin transporters (5-HTT or SERT) and (post-synaptic) 5-HT(1A) receptors. Major components of the proposed technique include the projection of the input receptor maps to a feature space, the quasi-clustering and classification of projected data (neighbourhood formation), trans-individual analysis of neighbourhood properties (trajectory analysis), and the back-projection of the results of trajectory analysis to normal space (creation of multi-receptor maps). The resulting multi-receptor maps suggest that complex relationships and tendencies between pre- and post-synaptic transporter-receptor systems can be revealed and classified using this method. As an example, we demonstrate the regional correlation of the serotonin transporter-receptor systems. These parameter-specific multi-receptor maps can usefully guide researchers in formulating models of multi-receptor interactions and changes in the human brain.

  16. Specification and estimation of sources of bias affecting neurological studies in PET/MR with an anatomical brain phantom

    NASA Astrophysics Data System (ADS)

    Teuho, J.; Johansson, J.; Linden, J.; Saunavaara, V.; Tolvanen, T.; Teräs, M.

    2014-01-01

    The selection of reconstruction parameters affects image quantification in PET, with an additional contribution from the scanner-specific attenuation correction method. For achieving comparable results in inter- and intra-center comparisons, any existing quantitative differences should be identified and compensated for. In this study, a comparison between PET, PET/CT and PET/MR is performed using an anatomical brain phantom, to identify and measure the amount of bias caused by differences in reconstruction and attenuation correction methods, especially in PET/MR. Differences were estimated using visual, qualitative and quantitative analysis. The qualitative analysis consisted of a line profile analysis measuring the reproduction of anatomical structures and the contribution of the number of iterations to image contrast. The quantitative analysis consisted of the measurement and comparison of 10 anatomical VOIs, with the HRRT considered as the reference. All scanners reproduced the main anatomical structures of the phantom adequately, although the image contrast on the PET/MR was inferior when using a default clinical brain protocol. Image contrast was improved by increasing the number of iterations from 2 to 5 while using 33 subsets. Furthermore, a PET/MR-specific bias was detected, which resulted in underestimation of the activity values in anatomical structures closest to the skull, due to the MR-derived attenuation map ignoring bone. Thus, further improvements to the PET/MR reconstruction and attenuation correction could be achieved by optimizing RAMLA-specific reconstruction parameters and including bone in the attenuation template.

  17. A new methodological approach for PET implementation in radiotherapy treatment planning.

    PubMed

    Bellan, Elena; Ferretti, Alice; Capirci, Carlo; Grassetto, Gaia; Gava, Marcello; Chondrogiannis, Sotirios; Virdis, Graziella; Marzola, Maria Cristina; Massaro, Arianna; Rubello, Domenico; Nibale, Otello

    2012-05-01

    In this paper, a new methodological approach to using PET information in radiotherapy treatment planning is discussed. Computed tomography (CT) represents the primary modality for planning personalized radiation treatment, because it provides the basic electron density map for correct dose calculation. If PET scanning is also performed, it is typically coregistered with the CT study. This operation can be executed automatically by a hybrid PET/CT scanner or, if the PET and CT imaging sets have been acquired on different equipment, by a dedicated module of the radiotherapy treatment planning system. Both approaches have disadvantages: in the first case, the bore of a PET/CT system generally used in clinical practice often does not allow the use of certain bulky devices for patient immobilization in radiotherapy, whereas in the second case the result could be affected by limitations in window/level visualization of two different image modalities, and the displayed PET volumes may not correspond to the actual uptake in the patient. To overcome these problems, a specific procedure has been studied at our centre and tested in 30 patients, yielding precise target contouring. The process consists of segmentation of the biological target volume on a dedicated PET/CT console and its export to a dedicated radiotherapy system, where an image registration between the CT images acquired by the PET/CT scanner and a large-bore CT is performed. The planning target volume is contoured only on the large-bore CT and is used for virtual simulation, to identify permanent skin markers on the patient.

  18. Joint reconstruction of dynamic PET activity and kinetic parametric images using total variation constrained dictionary sparse coding

    NASA Astrophysics Data System (ADS)

    Yu, Haiqing; Chen, Shuhang; Chen, Yunmei; Liu, Huafeng

    2017-05-01

    Dynamic positron emission tomography (PET) is capable of providing both spatial and temporal information on radiotracers in vivo. In this paper, we present a novel joint estimation framework to reconstruct temporal sequences of dynamic PET images and the coefficients characterizing the system impulse response function, from which the associated parametric images of the system macro parameters for tracer kinetics can be estimated. The proposed algorithm, which combines statistical data measurement and tracer kinetic models, integrates dictionary sparse coding (DSC) into a total variation minimization based algorithm for simultaneous reconstruction of the activity distribution and parametric map from measured emission sinograms. DSC, based on compartmental theory, provides biologically meaningful regularization, and total variation regularization is incorporated to provide edge-preserving guidance. We rely on minimization techniques (the alternating direction method of multipliers) to first generate the estimated activity distributions with sub-optimal kinetic parameter estimates, and then recover the parametric maps given these activity estimates. These coupled iterative steps are repeated as necessary until convergence. Experiments with synthetic Monte Carlo-generated data and real patient data have been conducted, and the results are very promising.
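The "coupled iterative steps repeated until convergence" pattern is plain alternating minimization: fix one block of variables, minimize over the other, and swap. A toy sketch on an arbitrary smooth objective (nothing here is the actual PET likelihood, dictionary sparse coding, or ADMM machinery):

```python
# Toy alternating minimization in the spirit of the coupled updates
# described above (activity estimate, then parametric map, repeat).
# The objective is an arbitrary smooth function, not the PET model:
#   f(x, y) = (x - 2*y)**2 + (y - 3)**2, minimized at (x, y) = (6, 3).

def alternating_minimize(iters=200):
    x = y = 0.0
    for _ in range(iters):
        x = 2 * y            # exact minimizer of f in x for fixed y
        y = 0.4 * x + 0.6    # exact minimizer of f in y for fixed x
    return x, y

x, y = alternating_minimize()
print(round(x, 6), round(y, 6))  # -> 6.0 3.0
```

Here each block update has a closed form; in the paper's setting each sub-problem (activity reconstruction, then kinetic parameter recovery) is itself solved iteratively before alternating.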

  19. Transmission imaging for integrated PET-MR systems.

    PubMed

    Bowen, Spencer L; Fuin, Niccolò; Levine, Michael A; Catana, Ciprian

    2016-08-07

    Attenuation correction for PET-MR systems continues to be a challenging problem, particularly for body regions outside the head. The simultaneous acquisition of transmission scan based μ-maps and MR images on integrated PET-MR systems may significantly increase the performance of, and offer validation for, new MR-based μ-map algorithms. For the Biograph mMR (Siemens Healthcare), however, the use of conventional transmission schemes is not practical, as the patient table and relatively small diameter scanner bore significantly restrict radioactive source motion and limit source placement. We propose a method for emission-free coincidence transmission imaging on the Biograph mMR. The intended application is not routine subject imaging, but rather improving and validating MR-based μ-map algorithms, particularly for patient implant and scanner hardware attenuation correction. In this study we optimized the source geometry and assessed the method's performance with Monte Carlo simulations and phantom scans. We utilized a Bayesian reconstruction algorithm, which directly generates μ-map estimates from multiple bed positions, combined with a robust scatter correction method. For simulations with a pelvis phantom, a single torus produced peak noise equivalent count rates (34.8 kcps) dramatically larger than a full axial length ring (11.32 kcps) and conventional rotating source configurations. Bias in reconstructed μ-maps for head and pelvis simulations was ⩽4% for soft tissue and ⩽11% for bone ROIs. An implementation of the single torus source was filled with (18)F-fluorodeoxyglucose, and the proposed method was quantitatively evaluated for several test cases, alone or in comparison with CT-derived μ-maps. A volume average of 0.095 cm(-1) was recorded for an experimental uniform cylinder phantom scan, while a bias of <2% was measured for the cortical bone equivalent insert of the multi-compartment phantom. Single torus μ-maps of a hip implant phantom showed significantly fewer artifacts and improved dynamic range compared to CT results, and differed greatly from them for highly attenuating materials in the case of the patient table. The use of a fixed torus geometry, in combination with translation of the patient table to perform complete tomographic sampling, generated highly quantitative measured μ-maps and is expected to produce images with significantly higher SNR than competing fixed geometries at matched total acquisition time.

  1. Relationship between regional cerebral metabolism and consciousness disturbance in traumatic diffuse brain injury without large focal lesions: an FDG-PET study with statistical parametric mapping analysis.

    PubMed

    Nakayama, N; Okumura, A; Shinoda, J; Nakashima, T; Iwama, T

    2006-07-01

    The cerebral metabolism of patients in the chronic stage of traumatic diffuse brain injury (TDBI) has not been fully investigated. To study the relationship between regional cerebral metabolism (rCM) and consciousness disturbance in patients with TDBI, 52 patients with chronic-stage TDBI without large focal lesions were enrolled, and rCM was evaluated by fluorine-18-fluorodeoxyglucose positron emission tomography (FDG-PET) with statistical parametric mapping (SPM). All the patients were found to have disturbed consciousness or cognitive function and were divided into the following three groups: group A (n = 22), patients with higher brain dysfunction; group B (n = 13), patients in a minimally conscious state; and group C (n = 17), patients in a vegetative state. rCM patterns on FDG-PET among these groups were evaluated and compared with those of normal control subjects on statistical parametric maps. Hypometabolism was consistently indicated bilaterally in the medial prefrontal regions, the medial frontobasal regions, the cingulate gyrus and the thalamus. Hypometabolism in these regions was the most widespread and prominent in group C, and that in group B was more widespread and prominent than that in group A. Bilateral hypometabolism in the medial prefrontal regions, the medial frontobasal regions, the cingulate gyrus and the thalamus may reflect the clinical deterioration of TDBI, which is due to functional and structural disconnections of neural networks rather than to direct focal cerebral contusion.

  2. Comparison of different methods for the assessment of the urban heat island in Stuttgart, Germany.

    PubMed

    Ketterer, Christine; Matzarakis, Andreas

    2015-09-01

    This study of the urban heat island (UHI) aims to support planning authorities by going beyond the traditional approach to urban heat island studies. Therefore, air temperature as well as the physiologically equivalent temperature (PET) were applied to take into account the effect of the thermal environment on city dwellers. The analysis of the urban heat island phenomenon of Stuttgart, Germany, includes a long-term frequency analysis using data from four urban and one rural meteorological stations. A high-resolution map of the UHI intensity and PET was created using stepwise multiple linear regression based on data from car traverses as well as spatial data. The mapped conditions were classified according to the long-term frequency analysis. Regarding climate change, the need for adaptation measures such as urban greening is obvious. Therefore, a spatial quantification of two scenarios for a chosen study area was carried out by applying a micro-scale model. The nocturnal UHI of Stuttgart exceeds 4 K in the city center during 15 % of summer nights, while daytime heat stress occurs during 40 % of summer days. A typical summer condition is mapped using a statistical approach to point out the most strained areas in Stuttgart center and west. According to the model results, increasing the number of trees in a chosen area (Olga hospital) can decrease PET by 0.5 K at 22:00 CET and by up to 27 K at 14:00 CET.

  3. Displaying uncertainty: investigating the effects of display format and specificity.

    PubMed

    Bisantz, Ann M; Marsiglio, Stephanie Schinzing; Munch, Jessica

    2005-01-01

    We conducted four studies regarding the representation of probabilistic information. Experiments 1 through 3 compared performance on a simulated stock purchase task, in which information regarding stock profitability was probabilistic. Two variables were manipulated: display format for probabilistic information (blurred and colored icons, linguistic phrases, numeric expressions, and combinations) and specificity level (in which the number and size of discrete steps into which the probabilistic information was mapped differed). Results indicated few performance differences attributable to display format; however, performance did improve with greater specificity. Experiment 4, in which participants generated membership functions corresponding to three display formats, found a high degree of similarity in functions across formats and participants and a strong relationship between the shape of the membership function and the intended meaning of the representation. These results indicate that participants can successfully interpret nonnumeric representations of uncertainty and can use such representations in a manner similar to the way numeric expressions are used in a decision-making task. Actual or potential applications of this research include the use of graphical representations of uncertainty in systems such as command and control and situation displays.
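The "specificity" manipulation described above can be sketched as binning the same probability into coarser or finer sets of discrete display levels; the bin counts and phrases below are invented, not the experiments' actual stimuli:

```python
# Sketch of the specificity manipulation: map a probability onto one of
# `levels` equal-width bins before display. Phrases are illustrative.

def to_level(p, levels):
    """Map probability p in [0, 1] onto a bin index 0 .. levels-1."""
    return min(int(p * levels), levels - 1)

coarse = ["unlikely", "even chance", "likely"]
fine = ["very unlikely", "unlikely", "even chance", "likely", "very likely"]

p = 0.72
print(coarse[to_level(p, 3)])  # -> likely
print(fine[to_level(p, 5)])    # -> likely
```

The same binning could drive a blurred-icon or color display instead of a linguistic phrase; the study's finding was that performance improved with more bins (greater specificity), largely independent of format.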

  4. MR signal-fat-fraction analysis and T2* weighted imaging measure BAT reliably on humans without cold exposure.

    PubMed

    Holstila, Milja; Pesola, Marko; Saari, Teemu; Koskensalo, Kalle; Raiko, Juho; Borra, Ronald J H; Nuutila, Pirjo; Parkkola, Riitta; Virtanen, Kirsi A

    2017-05-01

    Brown adipose tissue (BAT) is compositionally distinct from white adipose tissue (WAT) in terms of triglyceride and water content. In adult humans, the most significant BAT depot is localized in the supraclavicular area. Our aim was to differentiate brown adipose tissue from white adipose tissue using fat T2* relaxation time mapping and signal-fat-fraction (SFF) analysis based on a commercially available modified 2-point-Dixon (mDixon) water-fat separation method. We hypothesized that magnetic resonance (MR) imaging can reliably measure BAT regardless of cold-induced metabolic activation, with BAT having a significantly higher water and iron content compared to WAT. The supraclavicular area of 13 volunteers was studied on a 3T PET-MRI scanner using T2* relaxation time and SFF mapping, both during cold exposure and at ambient temperature, and with 18F-FDG PET during cold exposure. Volumes of interest (VOIs) were defined semiautomatically in the supraclavicular fat depot, subcutaneous WAT and muscle. The supraclavicular fat depot (assumed to contain BAT) had a significantly lower SFF and fat T2* relaxation time compared to subcutaneous WAT. Cold exposure did not significantly affect the MR-based measurements. SFF and T2* values measured during cold exposure and at ambient temperature correlated inversely with the glucose uptake measured by 18F-FDG PET. Human BAT can be reliably and safely assessed using MRI without cold activation and PET-related radiation exposure. Copyright © 2017 Elsevier Inc. All rights reserved.
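The signal-fat-fraction used here is the fat signal over the total (fat + water) Dixon signal; a sketch with invented signal values that only reflects the direction of the BAT/WAT difference described above (BAT: more water, lower fat fraction):

```python
# Signal-fat-fraction from a two-point Dixon water/fat separation:
# SFF = fat signal / (fat + water signal). Signal values are invented.

def signal_fat_fraction(fat, water):
    return fat / (fat + water)

wat_sff = signal_fat_fraction(fat=90.0, water=10.0)   # white fat VOI
bat_sff = signal_fat_fraction(fat=60.0, water=40.0)   # brown fat VOI
print(wat_sff, bat_sff)  # -> 0.9 0.6
```

In practice the computation is done voxel-wise on the separated water and fat images, and the VOI mean is reported.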

  5. Neural correlates of the popular music phenomenon: evidence from functional MRI and PET imaging.

    PubMed

    Chen, Qiaozhen; Zhang, Ying; Hou, Haifeng; Du, Fenglei; Wu, Shuang; Chen, Lin; Shen, Yehua; Chao, Fangfang; Chung, June-Key; Zhang, Hong; Tian, Mei

    2017-06-01

    Music can induce different emotions. However, its neural mechanism remains unknown. The aim of this study was to use functional magnetic resonance imaging (fMRI) and positron emission tomography (PET) imaging to map the neural changes induced by highly popular music in healthy volunteers. Blood-oxygen-level-dependent (BOLD) fMRI and monoamine receptor PET imaging with 11C-N-methylspiperone (11C-NMSP) were conducted under the popular music Gangnam Style and the light music A Comme Amour in healthy subjects. PET and fMRI images were analyzed using the Statistical Parametric Mapping (SPM) software. Significantly increased fMRI BOLD signals were found in the bilateral superior temporal cortices, left cerebellum, left putamen and right thalamus. Monoamine receptor availability was increased significantly in the left superior temporal gyrus and left putamen, but decreased in the bilateral superior occipital cortices, under Gangnam Style compared with the light music condition. A significant positive correlation was found between 11C-NMSP binding and fMRI BOLD signals in the left temporal cortex. Furthermore, increased 11C-NMSP binding in the left putamen was positively correlated with the mood arousal level score under the Gangnam Style condition. The popular music Gangnam Style can arouse pleasure experience and a strong emotional response. Our results revealed characteristic patterns of brain activity associated with Gangnam Style, and may also provide more general insights into music-induced emotional processing.

  6. Cytoarchitectonic identification and probabilistic mapping of two distinct areas within the anterior ventral bank of the human intraparietal sulcus

    PubMed Central

    Choi, Hi-Jae; Zilles, Karl; Mohlberg, Hartmut; Schleicher, Axel; Fink, Gereon R.; Armstrong, Este; Amunts, Katrin

    2008-01-01

    Anatomical studies in the macaque cortex and functional imaging studies in humans have demonstrated the existence of different cortical areas within the IntraParietal Sulcus (IPS). Such functional segregation, however, does not correlate with presently available architectonic maps of the human brain. This is particularly true for the classical Brodmann map, which is still widely used as an anatomical reference in functional imaging studies. The aim of this cytoarchitectonic mapping study was to use previously defined algorithms to determine whether consistent regions and borders can be found within the cortex of the anterior IPS in a population of ten postmortem human brains. Two areas, the human IntraParietal area 1 (hIP1) and the human IntraParietal area 2 (hIP2), were delineated in serial histological sections of the anterior, lateral bank of the human IPS. Area hIP1 is located posterior and medial to hIP2 and lies always within the depths of the IPS, whereas hIP2 sometimes reaches the free surface of the superior parietal lobule. The delineations were registered to standard reference space, and probabilistic maps were calculated, thereby quantifying the intersubject variability in the location and extent of both areas. In the future, they can serve as a tool for analyzing structure-function relationships and as a basis for determining degrees of homology in the IPS among anthropoid primates. We conclude that the human intraparietal sulcus has a finer-grained parcellation than shown in Brodmann's map. PMID:16432904
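The probabilistic maps in this record assign each voxel the fraction of the ten brains in which that voxel was labelled as belonging to the delineated area. A toy sketch (three voxels, ten subjects, invented labels):

```python
# A cytoarchitectonic probabilistic map: per voxel, the fraction of
# subjects in which the voxel carries the area label. Toy binary labels
# stand in for registered delineations from ten postmortem brains.

def probabilistic_map(label_volumes):
    """label_volumes: one binary voxel list per subject."""
    n = len(label_volumes)
    return [sum(vol[i] for vol in label_volumes) / n
            for i in range(len(label_volumes[0]))]

# 3 voxels, 10 subjects: voxel 0 labelled in all brains, voxel 1 in half,
# voxel 2 in none -- mimicking decreasing overlap toward the area border.
subjects = [[1, 1, 0], [1, 0, 0], [1, 1, 0], [1, 0, 0], [1, 1, 0],
            [1, 0, 0], [1, 1, 0], [1, 0, 0], [1, 1, 0], [1, 0, 0]]
print(probabilistic_map(subjects))  # -> [1.0, 0.5, 0.0]
```

High values mark the area's consistent core across subjects; low values quantify the intersubject variability in its location and extent.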

  7. Probabilistic tsunami inundation map based on stochastic earthquake source model: A demonstration case in Macau, the South China Sea

    NASA Astrophysics Data System (ADS)

    Li, Linlin; Switzer, Adam D.; Wang, Yu; Chan, Chung-Han; Qiu, Qiang; Weiss, Robert

    2017-04-01

    Current tsunami inundation maps are commonly generated using deterministic scenarios, either for real-time forecasting or based on hypothetical "worst-case" events. Such maps are mainly used for emergency response and evacuation planning and do not include return period information. In practice, however, probabilistic tsunami inundation maps are required in a wide variety of applications, such as land-use planning, engineering design and insurance purposes. In this study, we present a method to develop a probabilistic tsunami inundation map using a stochastic earthquake source model. To demonstrate the methodology, we take Macau, a coastal city in the South China Sea, as an example. Two major advances of this method are that it incorporates the most up-to-date information on seismic tsunamigenic sources along the Manila megathrust, and that it integrates a stochastic source model into a Monte Carlo-type simulation in which a broad range of slip distribution patterns is generated for large numbers of synthetic earthquake events. By aggregating the large number of inundation simulation results, we analyze the uncertainties associated with the variability of earthquake rupture location and slip distribution. We also explore how the tsunami hazard in Macau evolves in the context of sea level rise. Our results suggest Macau faces moderate tsunami risk due to its low-lying elevation, extensive land reclamation, high coastal population and major infrastructure density. Macau consists of four districts: Macau Peninsula, Taipa Island, Coloane Island and the Cotai Strip. Of these, Macau Peninsula is the most vulnerable to tsunamis due to its low elevation and its exposure to direct waves, refracted waves from the offshore region and reflected waves from the mainland. Earthquakes with magnitude larger than Mw 8.0 in the northern Manila trench would likely cause hazardous inundation in Macau. 
Using a stochastic source model, we are able to derive a spread of potential tsunami impacts for earthquakes of the same magnitude. The diversity is caused by both random rupture locations and heterogeneous slip distributions. Adding the sea level rise component, the inundation depth caused by 1 m of sea level rise is equivalent to that caused by the 90th percentile of an ensemble of Mw 8.4 earthquakes.
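The closing statement (a depth matched to the 90th percentile of Mw 8.4 scenarios) rests on simple percentile computation over the stochastic ensemble; a nearest-rank sketch with invented depths:

```python
# Nearest-rank percentile over an ensemble of inundation depths, one per
# stochastic rupture scenario of the same magnitude. Depths are invented.

def percentile(values, q):
    """Nearest-rank percentile (q in 0..100) of a list of values."""
    s = sorted(values)
    k = max(0, min(len(s) - 1, int(round(q / 100 * len(s))) - 1))
    return s[k]

# 10 hypothetical Mw 8.4 scenarios, depth at one coastal site (metres):
depths_m = [0.2, 0.4, 0.5, 0.7, 0.8, 0.9, 1.1, 1.3, 1.6, 2.0]
print(percentile(depths_m, 90))  # -> 1.6
```

Real probabilistic maps repeat this per grid cell over thousands of scenarios, and weight scenarios by their occurrence rates to attach return periods.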

  8. MR/PET quantification tools: Registration, segmentation, classification, and MR-based attenuation correction

    PubMed Central

    Fei, Baowei; Yang, Xiaofeng; Nye, Jonathon A.; Aarsvold, John N.; Raghunath, Nivedita; Cervo, Morgan; Stark, Rebecca; Meltzer, Carolyn C.; Votaw, John R.

    2012-01-01

    Purpose: Combined MR/PET is a relatively new, hybrid imaging modality. A human MR/PET prototype system consisting of a Siemens 3T Trio MR and brain PET insert was installed and tested at our institution. Its present design does not offer measured attenuation correction (AC) using traditional transmission imaging. This study describes the development of quantification tools, including MR-based AC, for combined MR/PET brain imaging. Methods: The developed quantification tools include image registration, segmentation, classification, and MR-based AC. These components were integrated into a single scheme for processing MR/PET data. The segmentation method is multiscale and based on the Radon transform of brain MR images. It was developed to segment the skull on T1-weighted MR images. A modified fuzzy C-means classification scheme was developed to classify brain tissue into gray matter, white matter, and cerebrospinal fluid. Classified tissue is assigned an attenuation coefficient so that AC factors can be generated. PET emission data are then reconstructed using a three-dimensional ordered-sets expectation maximization method with the MR-based AC map. Ten subjects had separate MR and PET scans. The PET data with [11C]PIB were acquired using a high-resolution research tomograph (HRRT). MR-based AC was compared with transmission (TX)-based AC on the HRRT. Seventeen volumes of interest were drawn manually on each subject image to compare the PET activities between the MR-based and TX-based AC methods. Results: For skull segmentation, the overlap ratio between our segmented results and the ground truth is 85.2 ± 2.6%. Attenuation correction results from the ten subjects show that the difference between the MR- and TX-based methods was <6.5%. Conclusions: MR-based AC compared favorably with conventional transmission-based AC. 
Quantitative tools including registration, segmentation, classification, and MR-based AC have been developed for use in combined MR/PET. PMID:23039679
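As an illustration of the tissue-classification step, below is a minimal, generic fuzzy C-means on scalar intensities. It is not the authors' modified scheme (which adds further constraints), and the intensity values are hypothetical.

```python
def fuzzy_c_means(xs, c=3, m=2.0, iters=50):
    """Generic fuzzy C-means on scalar intensities.

    Returns (centers, u) where u[i][k] is the degree to which voxel i
    belongs to class k (rows sum to 1); m > 1 is the fuzziness exponent.
    """
    lo, hi = min(xs), max(xs)
    centers = [lo + (hi - lo) * k / (c - 1) for k in range(c)]  # spread init
    u = []
    for _ in range(iters):
        # Membership update: inverse-distance weighting to each center
        u = []
        for x in xs:
            d = [abs(x - ck) + 1e-12 for ck in centers]
            u.append([1.0 / sum((dk / dj) ** (2.0 / (m - 1.0)) for dj in d)
                      for dk in d])
        # Center update: membership-weighted means
        for k in range(c):
            w = [row[k] ** m for row in u]
            centers[k] = sum(wi * x for wi, x in zip(w, xs)) / sum(w)
    return centers, u

# Hypothetical normalized intensities forming three tissue-like clusters
xs = ([0.10 + 0.01 * i for i in range(10)] +   # e.g. CSF-like
      [0.50 + 0.01 * i for i in range(10)] +   # e.g. gray-matter-like
      [0.90 + 0.01 * i for i in range(10)])    # e.g. white-matter-like
centers, u = fuzzy_c_means(xs)
```

Each converged class can then be assigned a linear attenuation coefficient to build the AC map, as described in the abstract.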

  9. Optimization of PET-MR Registrations for Nonhuman Primates Using Mutual Information Measures: A Multi-Transform Method (MTM)

    PubMed Central

    Sandiego, Christine M.; Weinzimmer, David; Carson, Richard E.

    2012-01-01

    An important step in PET brain kinetic analysis is the registration of functional data to an anatomical MR image. Typically, PET-MR registrations in nonhuman primate neuroreceptor studies use PET images acquired early post-injection (e.g., 0–10 min), which closely resemble the subject's MR image. However, a substantial fraction of these registrations (~25%) fail due to differences in kinetics and distribution across radiotracer studies and conditions (e.g., blocking studies). The Multi-Transform Method (MTM) was developed to improve the success of registrations between PET and MR images. Two algorithms were evaluated, MTM-I and MTM-II. The approach involves creating multiple transformations by registering PET images of different time intervals, from a dynamic study, to a single reference (i.e., the MR image) (MTM-I) or to multiple reference images (i.e., the MR image and PET images pre-registered to the MR) (MTM-II). Normalized mutual information was used to compute the similarity between the transformed PET images and the reference image(s) and to choose the optimal transformation. This final transformation is used to map the dynamic dataset into the animal's anatomical MR space, as required for kinetic analysis. The transformations chosen by MTM-I and MTM-II were evaluated using visual rating scores to assess the quality of spatial alignment between the resliced PET and the reference. One hundred twenty PET datasets involving eleven different tracers from 3 different scanners were used to evaluate the MTM algorithms. Studies were performed with baboons and rhesus monkeys on the HR+, HRRT, and Focus-220. Successful transformations increased from 77.5% to 85.8% to 96.7% using the 0–10 min method, MTM-I, and MTM-II, respectively, based on visual rating scores. The Multi-Transform Method proved to be a robust technique for PET-MR registrations across a wide range of PET studies. PMID:22926293
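The ranking criterion can be sketched with a histogram-based normalized mutual information (NMI). This is the standard (H(A)+H(B))/H(A,B) formulation computed on toy 1-D "images", not the authors' implementation, and the bin count is arbitrary.

```python
import math
import random
from collections import Counter

def normalized_mutual_information(a, b, bins=16):
    """NMI = (H(A) + H(B)) / H(A,B) for two equal-length intensity lists
    with values in [0, 1); higher NMI indicates better alignment."""
    qa = [min(int(x * bins), bins - 1) for x in a]
    qb = [min(int(x * bins), bins - 1) for x in b]
    n = len(qa)
    def entropy(counts):
        return -sum(c / n * math.log(c / n) for c in counts.values())
    h_ab = entropy(Counter(zip(qa, qb)))  # joint entropy
    return (entropy(Counter(qa)) + entropy(Counter(qb))) / h_ab if h_ab else 2.0

# Toy check: a perfectly aligned copy scores the maximum (2.0),
# while a scrambled pairing scores lower.
a = [(i % 89) / 89 for i in range(500)]
shuffled = a[:]
random.Random(1).shuffle(shuffled)
nmi_identical = normalized_mutual_information(a, a)
nmi_shuffled = normalized_mutual_information(a, shuffled)
```

In MTM, each candidate transformation's resliced PET would play the role of `a`, the reference the role of `b`, and the transformation with the highest NMI is kept.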

  10. PET guidance for liver radiofrequency ablation: an evaluation

    NASA Astrophysics Data System (ADS)

    Lei, Peng; Dandekar, Omkar; Mahmoud, Faaiza; Widlus, David; Malloy, Patrick; Shekhar, Raj

    2007-03-01

    Radiofrequency ablation (RFA) is emerging as the primary mode of treatment of unresectable malignant liver tumors. With current intraoperative imaging modalities, quick, precise, and complete localization of lesions remains a challenge for liver RFA. Fusion of intraoperative CT and preoperative PET images, which relies on PET and CT registration, can produce a new image with complementary metabolic and anatomic data and thus greatly improve targeting accuracy. Unlike neurological images, alignment of abdominal images by a combined PET/CT scanner is prone to errors as a result of large nonrigid misalignment in abdominal images. Our normalized mutual information-based 3D nonrigid registration technique has proven powerful for whole-body PET and CT registration. We demonstrate here that this technique is capable of acceptable abdominal PET and CT registration as well. In five clinical cases, both qualitative and quantitative validation showed that the registration is robust and accurate. Quantitative accuracy was evaluated by comparing the results from the algorithm with those of clinical experts. The registration error is well within the allowable margin for liver RFA. Study findings show the technique's potential to enable the augmentation of intraoperative CT with preoperative PET to reduce procedure time, avoid repeated procedures, provide clinicians with complementary functional/anatomic maps, avoid omitting dispersed small lesions, and improve the accuracy of tumor targeting in liver RFA.

  11. PET Imaging - from Physics to Clinical Molecular Imaging

    NASA Astrophysics Data System (ADS)

    Majewski, Stan

    2008-03-01

    From its beginnings many years ago in a few physics laboratories, and its first applications as a research brain-function imager, PET has lately become a leading molecular imaging modality used in the diagnosis, staging, and therapy monitoring of cancer, and it is seeing increased use in assessing brain function (early diagnosis of Alzheimer's disease, etc.) and cardiac function. To provide an anatomical structure map and assist with absorption correction, CT is often paired with PET in a combined system. Growing interest over the last 5-10 years in dedicated organ-specific PET imagers (breast, prostate, brain, etc.) again presents an opportunity for the particle physics instrumentation community to contribute to the important field of medical imaging. In addition to the bulky standard ring structures, compact, economical, and high-performance mobile imagers are being proposed and built. The latest development in standard PET imaging is the introduction of the well-known time-of-flight (TOF) concept, enabling clearer tomographic pictures of patient organs. The development and availability of novel photodetectors such as silicon photomultipliers, which are immune to magnetic fields, offer an exciting opportunity to use PET in conjunction with MRI and fMRI. As before with avalanche photodiodes, the particle physics community plays a leading role in developing these devices. The presentation will mostly focus on present and future opportunities for better PET designs based on new technologies and methods: new scintillators, photodetectors, readout, and software.

  12. USGS National Seismic Hazard Maps

    USGS Publications Warehouse

    Frankel, A.D.; Mueller, C.S.; Barnhard, T.P.; Leyendecker, E.V.; Wesson, R.L.; Harmsen, S.C.; Klein, F.W.; Perkins, D.M.; Dickman, N.C.; Hanson, S.L.; Hopper, M.G.

    2000-01-01

    The U.S. Geological Survey (USGS) recently completed new probabilistic seismic hazard maps for the United States, including Alaska and Hawaii. These hazard maps form the basis of the probabilistic component of the design maps used in the 1997 edition of the NEHRP Recommended Provisions for Seismic Regulations for New Buildings and Other Structures, prepared by the Building Seismic Safety Council and published by FEMA. The hazard maps depict peak horizontal ground acceleration and spectral response at 0.2, 0.3, and 1.0 sec periods, with 10%, 5%, and 2% probabilities of exceedance in 50 years, corresponding to return times of about 500, 1000, and 2500 years, respectively. In this paper we outline the methodology used to construct the hazard maps. There are three basic components to the maps. First, we use spatially smoothed historic seismicity as one portion of the hazard calculation. In this model, we apply the general observation that moderate and large earthquakes tend to occur near areas of previous small or moderate events, with some notable exceptions. Second, we consider large background source zones based on broad geologic criteria to quantify hazard in areas with little or no historic seismicity, but with the potential for generating large events. Third, we include the hazard from specific fault sources. We use about 450 faults in the western United States (WUS) and derive recurrence times from either geologic slip rates or the dating of pre-historic earthquakes from trenching of faults or other paleoseismic methods. Recurrence estimates for large earthquakes in New Madrid and Charleston, South Carolina, were taken from recent paleoliquefaction studies. We used logic trees to incorporate different seismicity models, fault recurrence models, Cascadia great earthquake scenarios, and ground-motion attenuation relations. 
We present disaggregation plots showing the contribution to hazard at four cities from potential earthquakes with various magnitudes and distances.
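The mapping between exceedance probability and return time quoted above follows from a Poisson (memoryless) occurrence model, in which P(exceedance in t years) = 1 - exp(-t/T). A quick numeric check of the cited levels:

```python
import math

def return_period(p_exceed, t_years):
    """Return period T implied by exceedance probability p in t years,
    assuming Poisson occurrences: p = 1 - exp(-t / T)."""
    return -t_years / math.log(1.0 - p_exceed)

# The three USGS map levels: 10%, 5%, and 2% in 50 years give roughly
# the "about 500, 1000, and 2500 years" return times cited in the text.
levels = {p: return_period(p, 50) for p in (0.10, 0.05, 0.02)}
```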

  13. U.S. Geological Survey Oil and Gas Resource Assessment of the Russian Arctic

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Donald Gautier; Timothy Klett

    2008-12-31

    The U.S. Geological Survey (USGS) recently completed a study of undiscovered petroleum resources in the Russian Arctic as a part of its Circum-Arctic Resource Appraisal (CARA), which comprised three broad areas of work: geological mapping, basin analysis, and quantitative assessment. The CARA was a probabilistic, geologically based study that used existing USGS methodology, modified somewhat for the circumstances of the Arctic. New map compilation was used to identify assessment units. The CARA relied heavily on geological analysis and analog modeling, with numerical input consisting of lognormal distributions of sizes and numbers of undiscovered accumulations. Probabilistic results for individual assessment units were statistically aggregated, taking geological dependencies into account. The U.S. Department of Energy (DOE) funds were used to support the purchase of crucial seismic data collected in the Barents Sea, East Siberian Sea, and Chukchi Sea for use by USGS in its assessment of the Russian Arctic. DOE funds were also used to purchase a commercial study, which interpreted seismic data from the northern Kara Sea, and for geographic information system (GIS) support of USGS mapping of geological features, province boundaries, total petroleum systems, and assessment units used in the USGS assessment.

  14. FPGA-based RF interference reduction techniques for simultaneous PET-MRI.

    PubMed

    Gebhardt, P; Wehner, J; Weissler, B; Botnar, R; Marsden, P K; Schulz, V

    2016-05-07

    The combination of positron emission tomography (PET) and magnetic resonance imaging (MRI) as a multi-modal imaging technique is considered very promising and powerful with regard to in vivo disease progression examination, therapy response monitoring and drug development. However, PET-MRI system design enabling simultaneous operation with unaffected intrinsic performance of both modalities is challenging. As one of the major issues, both the PET detectors and the MRI radio-frequency (RF) subsystem are exposed to electromagnetic (EM) interference, which may lead to PET and MRI signal-to-noise ratio (SNR) deteriorations. Early digitization of electronic PET signals within the MRI bore helps to preserve PET SNR, but comes at the expense of an increased amount of PET electronics inside the MRI and the associated RF field emissions. This raises the likelihood of PET-related MRI interference, as unwanted spurious signals couple into the MRI RF coil as RF noise, degrading MRI SNR and causing MR image artefacts. RF shielding of PET detectors is a commonly used technique to reduce PET-related RF interference, but it can introduce eddy-current-related MRI disturbances and hinder tight system integration. In this paper, we present RF interference reduction methods which rely on the EM field coupling-decoupling principles of RF receive coils rather than on suppressing emitted fields. By modifying clock frequencies and changing clock phase relations of digital circuits, the resulting RF field emission is optimised with regard to a lower field coupling into the MRI RF coil, thereby increasing the RF silence of the PET detectors. Our methods are demonstrated by performing FPGA-based clock frequency and phase shifting of the digital silicon photomultipliers (dSiPMs) used in the PET modules of our MR-compatible Hyperion II D PET insert. 
We present simulations and magnetic-field map scans visualising the impact of altered clock phase patterns on the spatial RF field distribution, followed by MRI noise and SNR scans performed with an operating PET module using different clock frequencies and phase patterns. The methods were implemented via firmware design changes without any hardware modifications. This introduces new flexibility by enabling adaptive RF interference reduction optimisations in the field, e.g. when using a PET insert with different MRI systems or when different MRI RF coil types are to be operated with the same PET detector.

  15. FPGA-based RF interference reduction techniques for simultaneous PET-MRI

    NASA Astrophysics Data System (ADS)

    Gebhardt, P.; Wehner, J.; Weissler, B.; Botnar, R.; Marsden, P. K.; Schulz, V.

    2016-05-01

    The combination of positron emission tomography (PET) and magnetic resonance imaging (MRI) as a multi-modal imaging technique is considered very promising and powerful with regard to in vivo disease progression examination, therapy response monitoring and drug development. However, PET-MRI system design enabling simultaneous operation with unaffected intrinsic performance of both modalities is challenging. As one of the major issues, both the PET detectors and the MRI radio-frequency (RF) subsystem are exposed to electromagnetic (EM) interference, which may lead to PET and MRI signal-to-noise ratio (SNR) deteriorations. Early digitization of electronic PET signals within the MRI bore helps to preserve PET SNR, but comes at the expense of an increased amount of PET electronics inside the MRI and the associated RF field emissions. This raises the likelihood of PET-related MRI interference, as unwanted spurious signals couple into the MRI RF coil as RF noise, degrading MRI SNR and causing MR image artefacts. RF shielding of PET detectors is a commonly used technique to reduce PET-related RF interference, but it can introduce eddy-current-related MRI disturbances and hinder tight system integration. In this paper, we present RF interference reduction methods which rely on the EM field coupling-decoupling principles of RF receive coils rather than on suppressing emitted fields. By modifying clock frequencies and changing clock phase relations of digital circuits, the resulting RF field emission is optimised with regard to a lower field coupling into the MRI RF coil, thereby increasing the RF silence of the PET detectors. Our methods are demonstrated by performing FPGA-based clock frequency and phase shifting of the digital silicon photomultipliers (dSiPMs) used in the PET modules of our MR-compatible Hyperion II D PET insert. 
We present simulations and magnetic-field map scans visualising the impact of altered clock phase patterns on the spatial RF field distribution, followed by MRI noise and SNR scans performed with an operating PET module using different clock frequencies and phase patterns. The methods were implemented via firmware design changes without any hardware modifications. This introduces new flexibility by enabling adaptive RF interference reduction optimisations in the field, e.g. when using a PET insert with different MRI systems or when different MRI RF coil types are to be operated with the same PET detector.

  16. Attenuation-emission alignment in cardiac PET∕CT based on consistency conditions

    PubMed Central

    Alessio, Adam M.; Kinahan, Paul E.; Champley, Kyle M.; Caldwell, James H.

    2010-01-01

    Purpose: In cardiac PET and PET∕CT imaging, misaligned transmission and emission images are a common problem due to respiratory and cardiac motion. This misalignment leads to erroneous attenuation correction and can cause errors in perfusion mapping and quantification. This study develops and tests a method for automated alignment of attenuation and emission data. Methods: The CT-based attenuation map is iteratively transformed until the attenuation-corrected emission data minimize an objective function based on the Radon consistency conditions. The alignment process is derived from previous work by Welch et al. ["Attenuation correction in PET using consistency information," IEEE Trans. Nucl. Sci. 45, 3134–3141 (1998)] for stand-alone PET imaging. The process was evaluated with simulated data and measured patient data from multiple cardiac ammonia PET∕CT exams. The alignment procedure was applied to simulations at five different noise levels with three different initial attenuation maps. For the measured patient data, the alignment procedure was applied to eight attenuation-emission combinations with initially acceptable alignment and eight combinations with unacceptable alignment. The initially acceptable alignment studies were forced out of alignment by a known amount and quantitatively evaluated for alignment and perfusion accuracy. The initially unacceptable studies were compared to the proposed aligned images in a blinded side-by-side review. Results: The proposed automatic alignment procedure reduced errors in the simulated data and iteratively approached global-minimum solutions with the patient data. In simulations, the alignment procedure reduced the root mean square error to less than 5 mm and the axial translation error to less than 1 mm. In patient studies, the procedure reduced the translation error by >50% and resolved perfusion artifacts after a known misalignment for the eight initially acceptable patient combinations. 
The side-by-side review of the proposed aligned attenuation-emission maps and initially misaligned attenuation-emission maps revealed that reviewers preferred the proposed aligned maps in all cases, except one inconclusive case. Conclusions: The proposed alignment procedure offers an automatic method to reduce attenuation correction artifacts in cardiac PET∕CT and provides a viable supplement to subjective manual realignment tools. PMID:20384256

  17. Fast, noise-free memory for photon synchronization at room temperature.

    PubMed

    Finkelstein, Ran; Poem, Eilon; Michel, Ohad; Lahad, Ohr; Firstenberg, Ofer

    2018-01-01

    Future quantum photonic networks require coherent optical memories for synchronizing quantum sources and gates of probabilistic nature. We demonstrate a fast ladder memory (FLAME) mapping the optical field onto the superposition between electronic orbitals of rubidium vapor. Using a ladder-level system of orbital transitions with nearly degenerate frequencies simultaneously enables high bandwidth, low noise, and long memory lifetime. We store and retrieve 1.7-ns-long pulses, containing 0.5 photons on average, and observe a short-time external efficiency of 25%, a memory lifetime (1/e) of 86 ns, and fewer than 10^-4 added noise photons. Consequently, coupling this memory to a probabilistic source would enhance the on-demand photon generation probability by a factor of 12, the highest number yet reported for a noise-free, room-temperature memory. This paves the way toward the controlled production of large quantum states of light from probabilistic photon sources.

  18. Modular analysis of the probabilistic genetic interaction network.

    PubMed

    Hou, Lin; Wang, Lin; Qian, Minping; Li, Dong; Tang, Chao; Zhu, Yunping; Deng, Minghua; Li, Fangting

    2011-03-15

    Epistatic Miniarray Profiles (EMAP) have enabled the mapping of large-scale genetic interaction networks; however, the quantitative information gained from EMAP cannot be fully exploited since the data are usually interpreted as a discrete network based on an arbitrary hard threshold. To address this limitation, we adopted a mixture-modeling procedure to construct a probabilistic genetic interaction network and then implemented a Bayesian approach to identify densely interacting modules in the probabilistic network. Mixture modeling has been demonstrated to be an effective soft-threshold technique for EMAP measures. The Bayesian approach was applied to an EMAP dataset studying the early secretory pathway in Saccharomyces cerevisiae. Twenty-seven modules were identified, and 14 of those were enriched for gold-standard functional gene sets. We also conducted a detailed comparison with state-of-the-art algorithms, hierarchical clustering and Markov clustering. The experimental results show that the Bayesian approach outperforms the others in efficiently recovering biologically significant modules.
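The soft-thresholding idea (replacing a hard cutoff with a posterior probability of interaction) can be sketched with a two-component Gaussian mixture fitted by EM. This is a generic illustration, not the paper's exact mixture model, and the scores below are synthetic.

```python
import math

def gaussian_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def em_two_gaussians(xs, iters=100):
    """EM for a two-component 1-D Gaussian mixture. Returns the fitted
    parameters and P(component 2 | x), usable as a soft interaction
    score instead of a hard threshold on x."""
    mu = [min(xs), max(xs)]
    sigma = [1.0, 1.0]
    pi = [0.5, 0.5]
    r = []
    for _ in range(iters):
        # E-step: responsibilities of each component for each point
        r = []
        for x in xs:
            w = [pi[k] * gaussian_pdf(x, mu[k], sigma[k]) for k in range(2)]
            s = sum(w) + 1e-300  # guard against underflow
            r.append([wk / s for wk in w])
        # M-step: update mixing weights, means, and variances
        for k in range(2):
            nk = sum(ri[k] for ri in r)
            pi[k] = nk / len(xs)
            mu[k] = sum(ri[k] * x for ri, x in zip(r, xs)) / nk
            var = sum(ri[k] * (x - mu[k]) ** 2 for ri, x in zip(r, xs)) / nk
            sigma[k] = max(math.sqrt(var), 1e-3)
    posterior = [ri[1] for ri in r]
    return (pi, mu, sigma), posterior

# Synthetic EMAP-like scores: a null cluster near 0 and an
# "interaction" cluster near 3 (values invented for illustration).
scores = [-0.1, 0.0, 0.1, 0.05, -0.05, 2.9, 3.0, 3.1, 2.95, 3.05]
params, posterior = em_two_gaussians(scores)
```

A module-finding step would then operate on these posterior edge weights rather than on a thresholded 0/1 network.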

  19. Cost-Effectiveness of Florbetapir-PET in Alzheimer's Disease: A Spanish Societal Perspective.

    PubMed

    Hornberger, John; Michalopoulos, Steven; Dai, Minghan; Andrade, Paula; Dilla, Tatiana; Happich, Michael

    2015-06-01

    The rising prevalence of Alzheimer's disease (AD) and other diseases associated with dementia imposes a significant burden on the various stakeholders who care for the elderly. Management of AD is complicated by multiple factors, including disease-specific features that make it difficult to diagnose accurately during milder stages. Florbetapir F18 positron emission tomography (florbetapir-PET) is an approved imaging tool used to capture beta-amyloid neuritic plaque density in the brains of cognitively impaired adults undergoing evaluation for AD and other causes of cognitive impairment. It has the potential to improve healthcare outcomes, as it may help clinicians identify patients with AD early so that treatments are initiated when most effective. Objective: to evaluate the potential long-term clinical and economic outcomes of adopting florbetapir-PET--adjunctive to standard clinical evaluation (SCE)--versus SCE alone in the diagnostic assessment of cognitively impaired patients with suspected AD. A decision analysis with a ten-year time horizon was developed in compliance with Good Research Practices and CHEERS guidelines. The target population comprised Spanish patients undergoing initial assessment for cognitive impairment (Mini-Mental State Examination [MMSE] score=20). Diagnostic accuracy, rate of cognitive decline, effect of drugs on cognition and dwelling status, economic burden (direct and indirect costs), and quality of life (QoL) were based on relevant clinical studies and published literature. Scenario analysis was applied to explore outcomes under different conditions, including: (i) use of florbetapir-PET earlier in disease progression (MMSE score=22); and (ii) the addition of fluorodeoxyglucose (FDG)-PET to SCE. Adjunctive florbetapir-PET increased quality-adjusted life years (QALYs) by 0.008 years and increased costs by €36 compared to SCE alone (incremental cost-effectiveness ratio [ICER], €4,769). 
Use of florbetapir-PET was dominant in the alternate scenarios. Sensitivity analyses indicated that rates of institutionalization (by MMSE) and MMSE score upon initiation of acetylcholinesterase inhibitor (AChEI) treatment most influenced the primary outcome (ICER) in the base-case scenario. Over 82% of probabilistic simulations were cost-effective using the Spanish threshold (€30,000/QALY). The addition of florbetapir-PET to SCE is expected to improve the accuracy of AD diagnoses for patients experiencing cognitive impairment; it is cost-effective due to decreased healthcare costs and caregiver burden. Prospective studies of the clinical utility of florbetapir-PET are necessary to evaluate the long-term implications of adopting florbetapir-PET on clinical outcomes and costs in real-world settings. Florbetapir-PET is expected to improve decision-making regarding appropriate and sufficient care for cognitively impaired patients with suspected AD, while remaining cost-effective. Earlier and more accurate diagnosis of AD may help improve patients' health status and reduce treatment costs by effectively allocating healthcare resources and maximizing the benefit of treatments and supportive services. Use of florbetapir-PET may help accurately identify patients with AD. The development of novel therapeutics for use with companion diagnostics may provide additional benefits by slowing or halting progressive cognitive decline in AD, increasing QoL, and prolonging survival.

  20. Seismic probabilistic tsunami hazard: from regional to local analysis and use of geological and historical observations

    NASA Astrophysics Data System (ADS)

    Tonini, R.; Lorito, S.; Orefice, S.; Graziani, L.; Brizuela, B.; Smedile, A.; Volpe, M.; Romano, F.; De Martini, P. M.; Maramai, A.; Selva, J.; Piatanesi, A.; Pantosti, D.

    2016-12-01

    Site-specific probabilistic tsunami hazard analyses demand very high computational effort, which is often reduced by introducing approximations in the tsunami sources and/or the tsunami modeling. On one hand, the large variability of source parameters implies the definition of a huge number of potential tsunami scenarios, whose omission could easily introduce important bias into the analysis. On the other hand, detailed inundation maps computed by tsunami numerical simulations require very long running times. When tsunami effects are calculated at regional scale, a common practice is to propagate tsunami waves in deep water (up to 50-100 m depth), neglecting non-linear effects and using coarse bathymetric meshes; maximum wave heights on the coast are then empirically extrapolated, saving a significant amount of computational time. Moving to local scale, however, such assumptions no longer hold, and tsunami modeling requires much greater computational resources. In this work, we perform a local Seismic Probabilistic Tsunami Hazard Analysis (SPTHA) for the 50 km long coastal segment between Augusta and Siracusa, a tourist and commercial area along the coast of south-eastern Sicily, Italy. The procedure consists of using the outcomes of a regional SPTHA as input to a two-step filtering method that selects and substantially reduces the number of scenarios contributing to the specific target area. These selected scenarios are modeled using high-resolution topo-bathymetry to produce detailed inundation maps. Results are presented as probabilistic hazard curves and maps, with the goal of analyzing, comparing, and highlighting the different results provided by regional and local hazard assessments. Moreover, the analysis is enriched by the use of locally observed tsunami data, both geological and historical. Indeed, the tsunami datasets available for the selected target areas are particularly rich compared with the scarce and heterogeneous datasets usually available elsewhere. 
Therefore, they can represent valuable benchmarks for testing and strengthening the results of such studies. The work is funded by the Italian Flagship Project RITMARE, the two EC FP7 projects ASTARTE (Grant agreement 603839) and STREST (Grant agreement 603389), and the INGV-DPC Agreement.

  1. A High Performance Computing Approach to Tree Cover Delineation in 1-m NAIP Imagery Using a Probabilistic Learning Framework

    NASA Technical Reports Server (NTRS)

    Basu, Saikat; Ganguly, Sangram; Michaelis, Andrew; Votava, Petr; Roy, Anshuman; Mukhopadhyay, Supratik; Nemani, Ramakrishna

    2015-01-01

    Tree cover delineation is a useful instrument in deriving Above Ground Biomass (AGB) density estimates from Very High Resolution (VHR) airborne imagery data. Numerous algorithms have been designed to address this problem, but most of them do not scale to these datasets, which are of the order of terabytes. In this paper, we present a semi-automated probabilistic framework for the segmentation and classification of 1-m National Agriculture Imagery Program (NAIP) for tree-cover delineation for the whole of Continental United States, using a High Performance Computing Architecture. Classification is performed using a multi-layer Feedforward Backpropagation Neural Network and segmentation is performed using a Statistical Region Merging algorithm. The results from the classification and segmentation algorithms are then consolidated into a structured prediction framework using a discriminative undirected probabilistic graphical model based on Conditional Random Field, which helps in capturing the higher order contextual dependencies between neighboring pixels. Once the final probability maps are generated, the framework is updated and re-trained by relabeling misclassified image patches. This leads to a significant improvement in the true positive rates and reduction in false positive rates. The tree cover maps were generated for the whole state of California, spanning a total of 11,095 NAIP tiles covering a total geographical area of 163,696 sq. miles. The framework produced true positive rates of around 88% for fragmented forests and 74% for urban tree cover areas, with false positive rates lower than 2% for both landscapes. Comparative studies with the National Land Cover Data (NLCD) algorithm and the LiDAR canopy height model (CHM) showed the effectiveness of our framework for generating accurate high-resolution tree-cover maps.
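The contextual (pairwise) term that the Conditional Random Field contributes can be illustrated with a toy Potts model smoothed by iterated conditional modes (ICM). This is only a stand-in for the paper's CRF inference, and the 3x3 probability patch below is invented.

```python
import math

def icm_smooth(prob, width, height, beta=1.0, iters=5):
    """ICM on a grid Potts model: each pixel's label trades off its own
    class probabilities (unary term) against agreement with its
    4-neighbors (pairwise term, weight beta).

    prob[i][k] = classifier probability of class k at pixel i (row-major).
    """
    n_classes = len(prob[0])
    labels = [max(range(n_classes), key=lambda k: p[k]) for p in prob]
    def neighbors(i):
        x, y = i % width, i // width
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nx, ny = x + dx, y + dy
            if 0 <= nx < width and 0 <= ny < height:
                yield ny * width + nx
    for _ in range(iters):
        for i in range(width * height):
            def energy(k):
                unary = -math.log(prob[i][k] + 1e-12)   # data term
                pairwise = beta * sum(1 for j in neighbors(i) if labels[j] != k)
                return unary + pairwise
            labels[i] = min(range(n_classes), key=energy)
    return labels

# 3x3 patch: every pixel strongly favors class 0 except the center,
# which weakly favors class 1; the pairwise term flips the center.
prob = [[0.9, 0.1]] * 4 + [[0.45, 0.55]] + [[0.9, 0.1]] * 4
labels = icm_smooth(prob, 3, 3)                  # -> all pixels labeled 0
labels_unary = icm_smooth(prob, 3, 3, beta=0.0)  # no smoothing: center stays 1
```

The same trade-off, learned rather than hand-weighted, is what lets the CRF in the paper suppress isolated misclassified pixels between relabeling rounds.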

  2. Analysis of lesions in patients with unilateral tactile agnosia using cytoarchitectonic probabilistic maps.

    PubMed

    Hömke, Lars; Amunts, Katrin; Bönig, Lutz; Fretz, Christian; Binkofski, Ferdinand; Zilles, Karl; Weder, Bruno

    2009-05-01

    We propose a novel methodical approach to lesion analyses involving high-resolution MR images in combination with probabilistic cytoarchitectonic maps. 3D-MR images of the whole brain and the manually segmented lesion mask are spatially normalized to the reference brain of a stereotaxic probabilistic cytoarchitectonic atlas using a multiscale registration algorithm based on an elastic model. The procedure is demonstrated in three patients suffering from aperceptive tactile agnosia of the right hand due to chronic infarction of the left parietal cortex. Patient 1 presents a lesion in areas of the postcentral sulcus, Patient 3 in areas of the superior parietal lobule and adjacent intraparietal sulcus, and Patient 2 lesions in both regions. On the basis of neurobehavioral data, we conjectured degradation of sequential elementary sensory information processing within the postcentral gyrus, impeding texture recognition in Patients 1 and 2, and disturbed kinaesthetic information processing in the posterior parietal lobe, causing degraded shape recognition in Patients 2 and 3. The involvement of Brodmann areas 4a, 4p, 3a, 3b, 1, 2, and areas IP1 and IP2 of the intraparietal sulcus was assessed in terms of the voxel overlap between the spatially transformed lesion masks and the 50%-isocontours of the cytoarchitectonic maps. The disruption of the critical cytoarchitectonic areas and the impaired subfunctions, texture and shape recognition, relate as conjectured above. We conclude that the proposed method represents a promising approach to hypothesis-driven lesion analyses, yielding lesion-function correlates based on a cytoarchitectonic model. Finally, the lesion-function correlates are validated by functional imaging reference data. (c) 2008 Wiley-Liss, Inc.
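The voxel-overlap measure can be sketched as set arithmetic between the spatially normalized lesion mask and the 50%-isocontour of a cytoarchitectonic probability map. The coordinates and probabilities below are invented for illustration.

```python
def lesion_area_overlap(lesion_voxels, prob_map, threshold=0.5):
    """Fraction of a cytoarchitectonic area (voxels whose map probability
    is >= threshold, i.e. inside the 50%-isocontour by default) that is
    covered by the lesion mask.

    lesion_voxels: iterable of voxel coordinates inside the lesion
    prob_map: dict mapping voxel coordinate -> area probability
    """
    area = {v for v, p in prob_map.items() if p >= threshold}
    if not area:
        return 0.0
    return len(area & set(lesion_voxels)) / len(area)

# Toy example: a 2-voxel "area" of which the lesion covers one voxel
toy_map = {(0, 0, 0): 0.6, (0, 0, 1): 0.7, (0, 1, 0): 0.4}
coverage = lesion_area_overlap({(0, 0, 0)}, toy_map)  # -> 0.5
```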

  3. A High Performance Computing Approach to Tree Cover Delineation in 1-m NAIP Imagery using a Probabilistic Learning Framework

    NASA Astrophysics Data System (ADS)

    Basu, S.; Ganguly, S.; Michaelis, A.; Votava, P.; Roy, A.; Mukhopadhyay, S.; Nemani, R. R.

    2015-12-01

    Tree cover delineation is a useful instrument in deriving Above Ground Biomass (AGB) density estimates from Very High Resolution (VHR) airborne imagery data. Numerous algorithms have been designed to address this problem, but most of them do not scale to datasets of this kind, which are on the order of terabytes. In this paper, we present a semi-automated probabilistic framework for the segmentation and classification of 1-m National Agriculture Imagery Program (NAIP) imagery for tree-cover delineation across the whole of the continental United States, using a High Performance Computing architecture. Classification is performed using a multi-layer feedforward backpropagation neural network and segmentation is performed using a Statistical Region Merging algorithm. The results from the classification and segmentation algorithms are then consolidated into a structured prediction framework using a discriminative undirected probabilistic graphical model, a Conditional Random Field, which helps capture the higher-order contextual dependencies between neighboring pixels. Once the final probability maps are generated, the framework is updated and re-trained by relabeling misclassified image patches, which leads to a significant improvement in true positive rates and a reduction in false positive rates. The tree cover maps were generated for the whole state of California, spanning 11,095 NAIP tiles and covering a geographical area of 163,696 sq. miles. The framework produced true positive rates of around 88% for fragmented forests and 74% for urban tree cover areas, with false positive rates lower than 2% for both landscapes. Comparative studies with the National Land Cover Data (NLCD) algorithm and the LiDAR canopy height model (CHM) showed the effectiveness of our framework for generating accurate high-resolution tree-cover maps.
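
    The true/false positive rates quoted above can be computed directly from a predicted binary tree-cover map and a reference map. A minimal sketch (the toy arrays are hypothetical, not NAIP data):

```python
import numpy as np

def rates(pred, ref):
    """True and false positive rates of a binary tree-cover map
    against a reference map of the same shape."""
    pred, ref = pred.astype(bool), ref.astype(bool)
    tpr = (pred & ref).sum() / max(int(ref.sum()), 1)
    fpr = (pred & ~ref).sum() / max(int((~ref).sum()), 1)
    return float(tpr), float(fpr)

pred = np.array([[1, 1, 0, 0], [1, 0, 0, 0]])
ref  = np.array([[1, 1, 0, 0], [0, 0, 0, 0]])
print(rates(pred, ref))  # tpr 1.0, fpr 1/6
```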

  4. A new axial smoothing method based on elastic mapping

    NASA Astrophysics Data System (ADS)

    Yang, J.; Huang, S. C.; Lin, K. P.; Czernin, J.; Wolfenden, P.; Dahlbom, M.; Hoh, C. K.; Phelps, M. E.

    1996-12-01

    New positron emission tomography (PET) scanners have higher axial and in-plane spatial resolutions, but at the expense of reduced per-plane sensitivity, which prevents the higher resolution from being fully realized. Normally, Gaussian-weighted interplane axial smoothing is used to reduce noise. In this study, the authors developed a new algorithm that first elastically maps adjacent planes onto each other and then smooths the mapped images axially to reduce the image noise level. Compared to those obtained by the conventional axial-directional smoothing method, the images produced by the new method have an improved signal-to-noise ratio. To quantify the signal-to-noise improvement, both simulated and real cardiac PET images were studied. Hanning reconstruction filters with cutoff frequencies of 0.5, 0.7, and 1.0 × the Nyquist frequency, as well as a ramp filter, were tested on simulated images. Effective in-plane resolution was measured by the effective global Gaussian resolution (EGGR), and noise reduction was evaluated by the cross-correlation coefficient. Results showed that the new method was robust to various noise levels and yielded larger noise reduction or better image feature preservation (i.e., smaller EGGR) than the conventional method.
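
    The conventional baseline the authors compare against, Gaussian-weighted interplane smoothing, amounts to filtering a PET volume only along the plane axis, leaving in-plane resolution untouched. A minimal sketch using SciPy (the sigma value and volume shape are assumptions):

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def axial_smooth(volume, sigma_planes=1.0):
    """Gaussian-weighted inter-plane (axial) smoothing: filter only
    along axis 0 of a (n_planes, ny, nx) volume."""
    return gaussian_filter1d(volume, sigma=sigma_planes, axis=0)

# on i.i.d. noise, axial smoothing lowers the standard deviation
noisy = np.random.default_rng(0).normal(size=(32, 8, 8))
print(noisy.std(), axial_smooth(noisy).std())
```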

  5. Working towards a clearer and more helpful hazard map: investigating the influence of hazard map design on hazard communication

    NASA Astrophysics Data System (ADS)

    Thompson, M. A.; Lindsay, J. M.; Gaillard, J.

    2015-12-01

    Globally, geological hazards are communicated using maps. In traditional hazard mapping practice, scientists analyse data about a hazard and then display the results on a map for stakeholder and public use. However, this one-way, top-down approach to hazard communication is not necessarily effective or reliable. The messages people take away depend on how they read, interpret, and understand the map, a facet of hazard communication that has been relatively unexplored. Decades of cartographic studies suggest that variables in the visual representation of data on maps, such as colour and symbology, can have a powerful effect on how people understand map content. In practice, however, there is little guidance or consistency in how hazard information is expressed and represented on maps. Accordingly, decisions are often made based on subjective preference rather than research-backed principles. Here we present the results of a study in which we explore how hazard map design features can influence hazard map interpretation, and we propose a number of considerations for hazard map design. A series of hazard maps were generated, each showing the same probabilistic volcanic ashfall dataset but using different verbal and visual variables (e.g., different colour schemes, data classifications, probabilistic formats). Following a short pilot study, these maps were used in an online survey of 110 stakeholders and scientists in New Zealand. Participants answered 30 open-ended and multiple-choice questions about ashfall hazard based on the different maps. Results suggest that hazard map design can have a significant influence on the messages readers take away. For example, diverging colour schemes were associated with concepts of "risk" and decision-making more than sequential schemes, and participants made more precise estimates of hazard with isarithmic data classifications compared to binned or gradational shading.
Based on such findings, we make a number of suggestions for communicating hazard using maps. Most importantly, we emphasise that multiple meanings may be taken away from a map, and this may have important implications in a crisis. We propose that engaging with map audiences in a two-way dialogue in times of peace may help prevent miscommunications in the event of a crisis.

  6. Modeling Uncertainties in EEG Microstates: Analysis of Real and Imagined Motor Movements Using Probabilistic Clustering-Driven Training of Probabilistic Neural Networks.

    PubMed

    Dinov, Martin; Leech, Robert

    2017-01-01

    Part of the process of EEG microstate estimation involves clustering EEG channel data at the global field power (GFP) maxima, very commonly using a modified K-means (KM) approach. Clustering has also typically been done deterministically, despite there being uncertainties in multiple stages of the microstate analysis, including the GFP peak definition, the clustering itself, and the post-clustering assignment of microstates back onto the EEG timecourse of interest. We perform fully probabilistic microstate clustering and labeling to account for these sources of uncertainty, using the closest probabilistic analog to KM, Fuzzy C-means (FCM). We train softmax multi-layer perceptrons (MLPs) using the KM- and FCM-inferred cluster assignments as target labels, to then allow probabilistic labeling of the full EEG data instead of the usual correlation-based deterministic microstate label assignment. We assess the merits of the probabilistic analysis vs. the deterministic approaches in EEG data recorded while participants perform real or imagined motor movements, from a publicly available data set of 109 subjects. Though we found FCM group template maps that are almost topographically identical to those from KM, there is considerable uncertainty in the subsequent assignment of microstate labels. In general, imagined motor movements are less predictable on a time point-by-time point basis, possibly reflecting the more exploratory nature of the brain state during imagined, compared to real, motor movements. We find that some relationships may be more evident using FCM than KM, and propose that future microstate analysis should preferably be performed probabilistically rather than deterministically, especially in situations such as brain-computer interfaces, where both training and applying models of microstates need to account for uncertainty.
Probabilistic neural network-driven microstate assignment has a number of advantages that we have discussed, which are likely to be further developed and exploited in future studies. In conclusion, probabilistic clustering and a probabilistic neural network-driven approach to microstate analysis is likely to better model and reveal details and the variability hidden in current deterministic and binarized microstate assignment and analyses.
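
    The probabilistic clustering step can be illustrated with a generic Fuzzy C-means routine applied to channel topographies sampled at GFP maxima. This is a minimal sketch of the standard FCM update equations, not the authors' pipeline (which additionally trains MLPs on the cluster assignments); the fake EEG array and the choice of k=4 templates are assumptions:

```python
import numpy as np

def gfp(eeg):
    """Global field power: spatial standard deviation across channels
    at each time point. eeg: array of shape (n_channels, n_times)."""
    return eeg.std(axis=0)

def fuzzy_cmeans(X, k, m=2.0, n_iter=100, seed=0):
    """Plain Fuzzy C-means (Bezdek): the soft counterpart of K-means.
    X: (n_samples, n_features). Returns centres C (k, n_features) and
    a soft membership matrix U (n_samples, k) whose rows sum to 1."""
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(k), size=len(X))       # random soft init
    for _ in range(n_iter):
        W = U ** m                                   # fuzzified weights
        C = (W.T @ X) / W.sum(axis=0)[:, None]       # weighted centres
        d = np.linalg.norm(X[:, None, :] - C[None], axis=2) + 1e-12
        U = d ** (-2.0 / (m - 1.0))                  # standard FCM update
        U /= U.sum(axis=1, keepdims=True)
    return C, U

# usage sketch: soft-cluster topographies at local GFP maxima
eeg = np.random.default_rng(1).normal(size=(32, 500))   # fake EEG
g = gfp(eeg)
peaks = np.flatnonzero((g[1:-1] > g[:-2]) & (g[1:-1] > g[2:])) + 1
C, U = fuzzy_cmeans(eeg[:, peaks].T, k=4)   # U = soft microstate labels
```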

  7. Modeling Uncertainties in EEG Microstates: Analysis of Real and Imagined Motor Movements Using Probabilistic Clustering-Driven Training of Probabilistic Neural Networks

    PubMed Central

    Dinov, Martin; Leech, Robert

    2017-01-01

    Part of the process of EEG microstate estimation involves clustering EEG channel data at the global field power (GFP) maxima, very commonly using a modified K-means (KM) approach. Clustering has also typically been done deterministically, despite there being uncertainties in multiple stages of the microstate analysis, including the GFP peak definition, the clustering itself, and the post-clustering assignment of microstates back onto the EEG timecourse of interest. We perform fully probabilistic microstate clustering and labeling to account for these sources of uncertainty, using the closest probabilistic analog to KM, Fuzzy C-means (FCM). We train softmax multi-layer perceptrons (MLPs) using the KM- and FCM-inferred cluster assignments as target labels, to then allow probabilistic labeling of the full EEG data instead of the usual correlation-based deterministic microstate label assignment. We assess the merits of the probabilistic analysis vs. the deterministic approaches in EEG data recorded while participants perform real or imagined motor movements, from a publicly available data set of 109 subjects. Though we found FCM group template maps that are almost topographically identical to those from KM, there is considerable uncertainty in the subsequent assignment of microstate labels. In general, imagined motor movements are less predictable on a time point-by-time point basis, possibly reflecting the more exploratory nature of the brain state during imagined, compared to real, motor movements. We find that some relationships may be more evident using FCM than KM, and propose that future microstate analysis should preferably be performed probabilistically rather than deterministically, especially in situations such as brain-computer interfaces, where both training and applying models of microstates need to account for uncertainty.
Probabilistic neural network-driven microstate assignment has a number of advantages that we have discussed, which are likely to be further developed and exploited in future studies. In conclusion, probabilistic clustering and a probabilistic neural network-driven approach to microstate analysis is likely to better model and reveal details and the variability hidden in current deterministic and binarized microstate assignment and analyses. PMID:29163110

  8. A Probabilistic Model of Illegal Drug Trafficking Operations in the Eastern Pacific and Caribbean Sea

    DTIC Science & Technology

    2013-09-01

    … partner agencies and nations, detects, tracks, and interdicts illegal drug-trafficking in this region. In this thesis, we develop a probability model based on intelligence inputs to generate a spatial temporal heat map specifying the … complement and vet such complicated simulation by developing more analytically tractable models. We develop probability models to generate a heat map …

  9. PET attenuation coefficients from CT images: experimental evaluation of the transformation of CT into PET 511-keV attenuation coefficients.

    PubMed

    Burger, C; Goerres, G; Schoenes, S; Buck, A; Lonn, A H R; Von Schulthess, G K

    2002-07-01

    The CT data acquired in combined PET/CT studies provide a fast and essentially noiseless source for the correction of photon attenuation in PET emission data. To this end, the CT values relating to attenuation of photons in the range of 40-140 keV must be transformed into linear attenuation coefficients at the PET energy of 511 keV. As attenuation depends on photon energy and the absorbing material, an accurate theoretical relation cannot be devised. The transformation implemented in the Discovery LS PET/CT scanner (GE Medical Systems, Milwaukee, Wis.) uses a bilinear function based on the attenuation of water and cortical bone at the CT and PET energies. The purpose of this study was to compare this transformation with experimental CT values and corresponding PET attenuation coefficients. In 14 patients, quantitative PET attenuation maps were calculated from germanium-68 transmission scans, and resolution-matched CT images were generated. A total of 114 volumes of interest were defined and the average PET attenuation coefficients and CT values measured. From the CT values the predicted PET attenuation coefficients were calculated using the bilinear transformation. When the transformation was based on the narrow-beam attenuation coefficient of water at 511 keV (0.096 cm(-1)), the predicted attenuation coefficients were higher in soft tissue than the measured values. This bias was reduced by replacing 0.096 cm(-1) in the transformation by the linear attenuation coefficient of 0.093 cm(-1) obtained from germanium-68 transmission scans. An analysis of the corrected emission activities shows that the resulting transformation is essentially equivalent to the transmission-based attenuation correction for human tissue. For non-human material, however, it may assign inaccurate attenuation coefficients which will also affect the correction in neighbouring tissue.
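
    The bilinear transformation described above can be sketched as a piecewise-linear function of the CT number. The soft-tissue segment scales the water coefficient, using the study's transmission-derived value of 0.093 cm⁻¹ rather than the narrow-beam 0.096 cm⁻¹; the bone-segment parameters (`mu_bone`, `hu_bone`) below are illustrative assumptions, not the scanner's calibrated values:

```python
def hu_to_mu_511(hu, mu_water=0.093, mu_bone=0.172, hu_bone=1000.0):
    """Bilinear mapping from a CT number (HU) to a 511-keV linear
    attenuation coefficient (cm^-1): one linear segment from air to
    water, a second from water towards cortical bone."""
    if hu <= 0:
        # air (-1000 HU, mu = 0) up to water (0 HU, mu = mu_water)
        return mu_water * (hu + 1000.0) / 1000.0
    # water towards cortical bone (slope set by the bone parameters)
    return mu_water + hu * (mu_bone - mu_water) / hu_bone

print(hu_to_mu_511(-1000), hu_to_mu_511(0))  # air -> 0.0, water -> 0.093
```

The function is continuous at 0 HU, so soft tissue near water density maps smoothly onto the transmission-measured water coefficient.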

  10. Role of FDG-PET/MRI, FDG-PET/CT, and Dynamic Susceptibility Contrast Perfusion MRI in Differentiating Radiation Necrosis from Tumor Recurrence in Glioblastomas.

    PubMed

    Hojjati, Mojgan; Badve, Chaitra; Garg, Vasant; Tatsuoka, Curtis; Rogers, Lisa; Sloan, Andrew; Faulhaber, Peter; Ros, Pablo R; Wolansky, Leo J

    2018-01-01

    To compare the utility of quantitative PET/MRI, dynamic susceptibility contrast (DSC) perfusion MRI (pMRI), and PET/CT in differentiating radiation necrosis (RN) from tumor recurrence (TR) in patients with treated glioblastoma multiforme (GBM). The study included 24 patients with GBM treated with surgery, radiotherapy, and temozolomide who presented with progression on imaging follow-up. All patients underwent PET/MRI and pMRI during a single examination. Additionally, 19 of 24 patients underwent PET/CT on the same day. Diagnosis was established by pathology in 17 of 24 and by clinical/radiologic consensus in 7 of 24. For the quantitative PET/MRI and PET/CT analysis, a region of interest (ROI) was drawn around each lesion and within the contralateral white matter. Lesion to contralateral white matter ratios for relative maximum, mean, and median were calculated. For pMRI, lesion ROI was drawn on the cerebral blood volume (CBV) maps and histogram metrics were calculated. Diagnostic performance for each metric was assessed using receiver operating characteristic curve analysis and area under curve (AUC) was calculated. In 24 patients, 28 lesions were identified. For PET/MRI, relative mean ≥ 1.31 resulted in AUC of .94 with both sensitivity and negative predictive values (NPVs) of 100%. For pMRI, CBV max ≥3.32 yielded an AUC of .94 with both sensitivity and NPV measuring 100%. The joint model utilizing r-mean (PET/MRI) and CBV mode (pMRI) resulted in AUC of 1.0. Our study demonstrates that quantitative PET/MRI parameters in combination with DSC pMRI provide the best diagnostic utility in distinguishing RN from TR in treated GBMs. © 2017 The Authors. Journal of Neuroimaging published by Wiley Periodicals, Inc. on behalf of American Society of Neuroimaging.
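
    The ROC analysis over lesion-to-contralateral ratio metrics can be sketched with the rank-based (Mann-Whitney) formulation of AUC. The ratio values below are hypothetical, not the study's data:

```python
import numpy as np

def auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney statistic: the
    probability that a tumor-recurrence lesion scores higher than a
    radiation-necrosis lesion (ties count half)."""
    s_p = np.asarray(scores_pos, float)[:, None]
    s_n = np.asarray(scores_neg, float)[None, :]
    return float(((s_p > s_n).sum() + 0.5 * (s_p == s_n).sum())
                 / (s_p.size * s_n.size))

# hypothetical relative-mean uptake ratios (lesion / contralateral WM)
recurrence = [1.6, 1.9, 2.2, 1.4]
necrosis   = [0.9, 1.2, 1.3, 1.0]
print(auc(recurrence, necrosis))  # 1.0 on this perfectly separated toy data
```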

  11. A probabilistic, distributed, recursive mechanism for decision-making in the brain

    PubMed Central

    Gurney, Kevin N.

    2018-01-01

    Decision formation recruits many brain regions, but the procedure they jointly execute is unknown. Here we characterize its essential composition, using as a framework a novel recursive Bayesian algorithm that makes decisions based on spike-trains with the statistics of those in sensory cortex (MT). Using it to simulate the random-dot-motion task, we demonstrate it quantitatively replicates the choice behaviour of monkeys, whilst predicting losses of otherwise usable information from MT. Its architecture maps to the recurrent cortico-basal-ganglia-thalamo-cortical loops, whose components are all implicated in decision-making. We show that the dynamics of its mapped computations match those of neural activity in the sensorimotor cortex and striatum during decisions, and forecast those of basal ganglia output and thalamus. This also predicts which aspects of neural dynamics are and are not part of inference. Our single-equation algorithm is probabilistic, distributed, recursive, and parallel. Its success at capturing anatomy, behaviour, and electrophysiology suggests that the mechanism implemented by the brain has these same characteristics. PMID:29614077
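
    The paper's single-equation algorithm is not reproduced here, but the core idea of recursive Bayesian evidence accumulation from spike counts can be sketched with a Poisson likelihood. The hypothesis rates and observed counts below are invented for illustration:

```python
import numpy as np

def recursive_bayes(counts, rates):
    """Sequential Bayesian update over candidate hypotheses (e.g. motion
    directions). counts: observed spike counts per time step; rates:
    expected Poisson rate under each hypothesis."""
    log_p = np.zeros(len(rates))                 # flat prior, log space
    for c in counts:
        # Poisson log-likelihood up to a hypothesis-independent constant
        log_p += c * np.log(rates) - rates
        log_p -= log_p.max()                     # numerical stabilisation
    p = np.exp(log_p)
    return p / p.sum()

# counts drawn as if the true rate were ~8, versus an alternative of 2
post = recursive_bayes(counts=[7, 9, 8, 6], rates=np.array([2.0, 8.0]))
print(post)  # posterior mass concentrates on the second hypothesis
```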

  12. Prevention of Unwanted Free-Declaration of Static Obstacles in Probability Occupancy Grids

    NASA Astrophysics Data System (ADS)

    Krause, Stefan; Scholz, M.; Hohmann, R.

    2017-10-01

    Obstacle detection and avoidance are major research fields in unmanned aviation. Map-based obstacle detection approaches often use discrete world representations, such as probabilistic grid maps, to fuse incremental environment data from different views or sensors into a comprehensive representation. The integration of continuous measurements into a discrete representation can introduce rounding errors, which in turn lead to differences between the artificial model and the real environment. The cause of these deviations is a spatial resolution of the world representation that is low in comparison to the sensor data used. Differences between the real world and the artificial representations used for path planning or obstacle avoidance can lead to unexpected behavior, up to and including collisions with unmapped obstacles. This paper presents three approaches to the treatment of errors that can occur during the integration of continuous laser measurements into a discrete probabilistic grid. Further, the quality of the error prevention and the processing performance are compared using real sensor data.
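
    The failure mode the paper addresses can be seen in a basic log-odds beam integration: continuous beam coordinates are truncated to cell indices, so with a coarse grid a cell only partially covered by an obstacle can wrongly receive the "free" update. The sketch below shows this baseline update, not one of the paper's three mitigation approaches; the log-odds increments and sampling-based traversal are assumptions:

```python
import numpy as np

L_FREE, L_OCC = -0.4, 0.85   # assumed log-odds increments

def integrate_beam(grid, x0, y0, x1, y1, cell=1.0):
    """Update a log-odds occupancy grid with one laser beam (assumes
    non-negative coordinates). Cells the beam traverses are made more
    'free'; the cell containing the hit is made more 'occupied'. The
    int() truncation to cell indices is where rounding artefacts can
    wrongly free a partially covered obstacle cell."""
    steps = int(np.ceil(max(abs(x1 - x0), abs(y1 - y0)) / cell)) * 2 + 1
    cells = {(int((x0 + t * (x1 - x0)) / cell),
              (int((y0 + t * (y1 - y0)) / cell)))
             for t in np.linspace(0.0, 1.0, steps)}
    hit = (int(x1 / cell), int(y1 / cell))
    for c in cells - {hit}:
        grid[c] += L_FREE     # beam passed through: more likely free
    grid[hit] += L_OCC        # beam ended here: more likely occupied
    return grid

grid = np.zeros((10, 10))
integrate_beam(grid, 0.5, 0.5, 5.5, 0.5)   # beam along one row of cells
```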

  13. Why is Probabilistic Seismic Hazard Analysis (PSHA) still used?

    NASA Astrophysics Data System (ADS)

    Mulargia, Francesco; Stark, Philip B.; Geller, Robert J.

    2017-03-01

    Even though it has never been validated by objective testing, Probabilistic Seismic Hazard Analysis (PSHA) has been widely used for almost 50 years by governments and industry in applications with lives and property hanging in the balance, such as deciding safety criteria for nuclear power plants, making official national hazard maps, developing building code requirements, and determining earthquake insurance rates. PSHA rests on assumptions now known to conflict with earthquake physics; many damaging earthquakes, including the 1988 Spitak, Armenia, event and the 2011 Tohoku, Japan, event, have occurred in regions rated relatively low-risk by PSHA hazard maps. No extant method, including PSHA, produces reliable estimates of seismic hazard. Earthquake hazard mitigation should be recognized as inherently political, involving a tradeoff between uncertain costs and uncertain risks. Earthquake scientists, engineers, and risk managers can make important contributions to the hard problem of allocating limited resources wisely, but government officials and stakeholders must take responsibility for the risks of accidents due to natural events that exceed the adopted safety criteria.

  14. Progress report on the Worldwide Earthquake Risk Management (WWERM) Program

    USGS Publications Warehouse

    Algermissen, S.T.; Hays, Walter W.; Krumpe, Paul R.

    1992-01-01

    Considerable progress has been made in the Worldwide Earthquake Risk Management (WWERM) Program since its initiation in late 1989 as a cooperative program of the Agency for International Development (AID), Office of U.S. Foreign Disaster Assistance (OFDA), and the U.S. Geological Survey. Probabilistic peak acceleration and peak Modified Mercalli intensity (MMI) maps have been prepared for Chile and for Sulawesi province in Indonesia. Earthquake risk (loss) studies for dwellings in Gorontalo, North Sulawesi, have been completed and risk studies for dwellings in selected areas of central Chile are underway. A special study of the effect of site response on earthquake ground motion estimation in central Chile has also been completed and indicates that site response may modify the ground shaking by as much as plus or minus two units of MMI. A program for the development of national probabilistic ground motion maps for the Philippines is now underway and pilot studies of earthquake ground motion and risk are being planned for Morocco.

  15. Probabilistic Tsunami Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Thio, H. K.; Ichinose, G. A.; Somerville, P. G.; Polet, J.

    2006-12-01

    The recent tsunami disaster caused by the 2004 Sumatra-Andaman earthquake has focused our attention on the hazard posed by large earthquakes that occur under water, in particular subduction zone earthquakes, and the tsunamis that they generate. Even though these kinds of events are rare, the very large loss of life and material destruction caused by this earthquake warrant a significant effort towards the mitigation of the tsunami hazard. For ground motion hazard, Probabilistic Seismic Hazard Analysis (PSHA) has become standard practice in the evaluation and mitigation of seismic hazard to populations, in particular with respect to structures, infrastructure and lifelines. Its ability to condense the complexities and variability of seismic activity into a manageable set of parameters greatly facilitates the design of effective seismic-resistant buildings as well as the planning of infrastructure projects. Probabilistic Tsunami Hazard Analysis (PTHA) achieves the same goal for hazards posed by tsunami. There are great advantages to implementing such a method to evaluate the total risk (seismic and tsunami) to coastal communities. The method that we have developed is based on traditional PSHA and is therefore completely consistent with standard seismic practice. Because of the strong dependence of tsunami wave heights on bathymetry, we use full tsunami waveform computations in lieu of the attenuation relations that are common in PSHA. By pre-computing and storing the tsunami waveforms at points along the coast generated for sets of subfaults that comprise larger earthquake faults, we can efficiently synthesize tsunami waveforms for any slip distribution on those faults by summing the individual subfault tsunami waveforms (weighted by their slip).
    This efficiency makes it feasible to use Green's function summation in lieu of attenuation relations to provide very accurate estimates of tsunami height for probabilistic calculations, where one typically computes thousands of earthquake scenarios. We have carried out preliminary tsunami hazard calculations for different return periods for western North America and Hawaii, based on thousands of earthquake scenarios around the Pacific rim and along the coast of North America. We will present tsunami hazard maps for several return periods and also discuss how to use these results for probabilistic inundation and runup mapping. Our knowledge of certain types of tsunami sources is very limited (e.g., submarine landslides), but a probabilistic framework for tsunami hazard evaluation can include even such sources and their uncertainties and present the overall hazard in a meaningful and consistent way.
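
    The Green's function summation described above reduces to a slip-weighted sum of precomputed unit-slip subfault waveforms, eta(t) = sum_i slip_i * G_i(t). A minimal sketch with toy numbers (two subfaults, four time samples):

```python
import numpy as np

def tsunami_waveform(subfault_green, slip):
    """Synthesize the tsunami waveform at one coastal point for an
    arbitrary slip distribution: a slip-weighted sum of precomputed
    unit-slip subfault Green's functions.
    subfault_green: (n_subfaults, n_times); slip: (n_subfaults,)."""
    return np.asarray(slip, float) @ np.asarray(subfault_green, float)

# precomputed unit-slip waveforms for two subfaults (toy numbers)
G = np.array([[0.0, 0.2, 0.5, 0.3],
              [0.0, 0.1, 0.4, 0.6]])
print(tsunami_waveform(G, slip=[2.0, 1.0]))  # approx [0, 0.5, 1.4, 1.2]
```

Because the expensive wave propagation is done once per subfault, each of the thousands of scenario waveforms costs only a matrix-vector product.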

  16. Seismotectonic Map of Afghanistan and Adjacent Areas

    USGS Publications Warehouse

    Wheeler, Russell L.; Rukstales, Kenneth S.

    2007-01-01

    Introduction This map is part of an assessment of Afghanistan's geology, natural resources, and natural hazards. One of the natural hazards is from earthquake shaking. One of the tools required to address the shaking hazard is a probabilistic seismic-hazard map, which was made separately. The information on this seismotectonic map has been used in the design and computation of the hazard map. A seismotectonic map like this one shows geological, seismological, and other information that previously had been scattered among many sources. The compilation can show spatial relations that might not have been seen by comparing the original sources, and it can suggest hypotheses that might not have occurred to persons who studied those scattered sources. The main map shows faults and earthquakes of Afghanistan. Plate convergence drives the deformations that cause the earthquakes. Accordingly, smaller maps and text explain the modern plate-tectonic setting of Afghanistan and its evolution, and relate both to patterns of faults and earthquakes.

  17. Motion-corrected whole-heart PET-MR for the simultaneous visualisation of coronary artery integrity and myocardial viability: an initial clinical validation.

    PubMed

    Munoz, Camila; Kunze, Karl P; Neji, Radhouene; Vitadello, Teresa; Rischpler, Christoph; Botnar, René M; Nekolla, Stephan G; Prieto, Claudia

    2018-05-12

    Cardiac PET-MR has shown potential for the comprehensive assessment of coronary heart disease. However, image degradation due to physiological motion remains a challenge that could hinder the adoption of this technology in clinical practice. The purpose of this study was to validate a recently proposed respiratory motion-corrected PET-MR framework for the simultaneous visualisation of myocardial viability (18F-FDG PET) and coronary artery anatomy (coronary MR angiography, CMRA) in patients with chronic total occlusion (CTO). A cohort of 14 patients was scanned with the proposed PET-CMRA framework. PET and CMRA images were reconstructed with and without the proposed motion correction approach for comparison purposes. Metrics of image quality including visible vessel length and sharpness were obtained from CMRA for both the right and left anterior descending coronary arteries (RCA, LAD), and the relative increase in 18F-FDG PET signal after motion correction was computed for standard 17-segment polar maps. The resulting coronary anatomy by CMRA and myocardial integrity by PET were visually compared against X-ray angiography and conventional Late Gadolinium Enhancement (LGE) MRI, respectively. Motion correction increased CMRA visible vessel length by 49.9% and 32.6% (RCA, LAD) and vessel sharpness by 12.3% and 18.9% (RCA, LAD) on average compared to uncorrected images. Coronary lumen delineation on motion-corrected CMRA images was in good agreement with X-ray angiography findings. For PET, motion correction resulted in an average 8% increase in 18F-FDG signal in the inferior and inferolateral segments of the myocardial wall. An improved delineation of myocardial viability defects and reduced noise in the 18F-FDG PET images were observed, improving correspondence to subendocardial LGE-MRI findings compared to uncorrected images.
    The feasibility of the PET-CMRA framework for simultaneous cardiac PET-MR imaging in a short and predictable scan time (~11 min) has been demonstrated in 14 patients with CTO. Motion correction increased the visible length and sharpness of the coronary arteries by CMRA and improved delineation of the myocardium by 18F-FDG PET, resulting in good agreement with X-ray angiography and LGE-MRI.

  18. Effect of spatial smoothing on t-maps: arguments for going back from t-maps to masked contrast images.

    PubMed

    Reimold, Matthias; Slifstein, Mark; Heinz, Andreas; Mueller-Schauenburg, Wolfgang; Bares, Roland

    2006-06-01

    Voxelwise statistical analysis has become popular in explorative functional brain mapping with fMRI or PET. Usually, results are presented as voxelwise levels of significance (t-maps), and for clusters that survive correction for multiple testing the coordinates of the maximum t-value are reported. Before calculating a voxelwise statistical test, spatial smoothing is required to achieve reasonable statistical power. Little attention is given to the fact that smoothing has a nonlinear effect on the voxel variances, and thus on the local characteristics of a t-map, which becomes most evident after smoothing over different types of tissue. We investigated the related artifacts, for example, white matter peaks whose positions depend on the relative variance (variance over contrast) of the surrounding regions, and suggest improving spatial precision with 'masked contrast images': color codes are attributed to the voxelwise contrast, and significant clusters (e.g., detected with statistical parametric mapping, SPM) are enlarged by including contiguous voxels with a contrast above the mean contrast of the original cluster, provided they satisfy P < 0.05. The potential benefit is demonstrated with simulations and data from a [11C]carfentanil PET study. We conclude that spatial smoothing may lead to critical, sometimes counterintuitive artifacts in t-maps, especially in subcortical brain regions. If significant clusters are detected, for example with SPM, the suggested method is one way to improve spatial precision and may give the investigator a more direct sense of the underlying data. Its simplicity and the fact that no further assumptions are needed make it a useful complement to standard methods of statistical mapping.
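
    The cluster-extension rule can be sketched as a region-growing pass: starting from a significant cluster, add face-connected neighbours whose contrast exceeds the mean contrast of the original cluster and whose p-value is below 0.05. This is an illustrative sketch assuming voxelwise contrast and uncorrected p-value arrays, not the authors' SPM implementation:

```python
import numpy as np
from collections import deque

def grow_cluster(cluster, contrast, pvals, alpha=0.05):
    """'Masked contrast image' cluster extension: enlarge a significant
    cluster by contiguous (face-connected) voxels whose contrast exceeds
    the mean contrast of the original cluster and whose p-value < alpha."""
    thr = contrast[cluster].mean()           # mean contrast of seed cluster
    out = cluster.copy()
    queue = deque(map(tuple, np.argwhere(cluster)))
    while queue:
        idx = queue.popleft()
        for ax in range(cluster.ndim):       # face-connected neighbours
            for d in (-1, 1):
                nb = list(idx); nb[ax] += d; nb = tuple(nb)
                if any(k < 0 or k >= s for k, s in zip(nb, cluster.shape)):
                    continue
                if not out[nb] and contrast[nb] > thr and pvals[nb] < alpha:
                    out[nb] = True
                    queue.append(nb)
    return out

# toy 1x6 image: voxel 3 qualifies (contrast 7 > 5.5, p < .05);
# voxel 4 has high contrast but p = 0.2, so it is excluded
contrast = np.array([[0.0, 5.0, 6.0, 7.0, 6.0, 0.0]])
pvals    = np.array([[0.9, 0.001, 0.001, 0.01, 0.2, 0.9]])
cluster  = np.array([[False, True, True, False, False, False]])
print(grow_cluster(cluster, contrast, pvals))
```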

  19. The value of FDG positron emission tomography/computerised tomography (PET/CT) in pre-operative staging of colorectal cancer: a systematic review and economic evaluation.

    PubMed

    Brush, J; Boyd, K; Chappell, F; Crawford, F; Dozier, M; Fenwick, E; Glanville, J; McIntosh, H; Renehan, A; Weller, D; Dunlop, M

    2011-09-01

    In the UK, colorectal cancer (CRC) is the third most common malignancy (behind lung and breast cancer), with 37,514 cases registered in 2006: around two-thirds (23,384) in the colon and one-third (14,130) in the rectum. Treatment of cancers of the colon can vary considerably, but surgical resection is the mainstay of treatment for curative intent. Following surgical resection, there is a comprehensive assessment of the tumour, its invasion characteristics and spread (tumour staging). A number of imaging modalities are used in the pre-operative staging of CRCs, including computerised tomography (CT), magnetic resonance imaging, ultrasound imaging and positron emission tomography (PET). This report examines the role of CT in combination with PET scanning (PET/CT 'hybrid' scan). The research objectives are: to evaluate the diagnostic accuracy and therapeutic impact of fluorine-18-deoxyglucose (FDG) PET/CT for the pre-operative staging of primary, recurrent and metastatic cancer using systematic review methods; to undertake probabilistic decision-analytic modelling (using Monte Carlo simulation); and to conduct a value of information analysis to help inform whether or not there is potential worth in undertaking further research. For each aspect of the research - the systematic review, the handsearch study and the economic evaluation - a database was assembled from a comprehensive search for published and unpublished studies, which included database searches, reference list searches and contact with experts. In the systematic review, prospective and retrospective patient series (diagnostic cohort) and randomised controlled trials (RCTs) were eligible for inclusion. Both consecutive series and series not explicitly reported as consecutive were included. Two reviewers extracted all data and applied the criteria independently, resolving disagreements by discussion.
Data to populate 2 × 2 contingency tables consisting of the number of true positives, true negatives, false positives and false negatives, using the studies' own definitions, were extracted, as were data relating to changes in management. Fourteen items from the Quality Assessment of Diagnostic Accuracy Studies checklist were used to assess the methodological quality of the included studies. Patient-level data were used to calculate sensitivity and specificity with confidence intervals (CIs). Data were plotted graphically in forest plots. For the economic evaluation, economic models were designed for each of the disease states: primary, recurrent and metastatic. These were developed and populated based on a variety of information sources (in particular published data sources) and literature, and in consultation with clinical experts. The review found 30 studies that met the eligibility criteria. Only two small studies evaluated the use of FDG PET/CT in primary CRC, and there is insufficient evidence to support its routine use at this time. For the detection of recurrent disease, data from five retrospective studies yielded a pooled sensitivity of 91% (95% CI 87% to 95%) and specificity of 91% (95% CI 85% to 95%). Pooled accuracy data from patients undergoing staging for suspected metastatic disease showed FDG PET/CT to have a pooled sensitivity of 91% (95% CI 87% to 94%) and a specificity of 76% (95% CI 58% to 88%), but the poor quality of the studies means the validity of the data may be compromised by several biases. The separate handsearch study did not yield any additional unique studies relevant to FDG PET/CT. Models for recurrent disease demonstrated an incremental cost-effectiveness ratio of £21,409 per quality-adjusted life-year (QALY) for rectal cancer, £6189 per QALY for colon cancer and £21,434 per QALY for metastatic disease. 
The value of handsearching to identify studies of less clearly defined or reported diagnostic tests is still to be investigated. The systematic review found insufficient evidence to support the routine use of FDG PET/CT in primary CRC and only a small amount of evidence supporting its use in the pre-operative staging of recurrent and metastatic CRC, and, although FDG PET/CT was shown to change patient management, the data are divergent and the quality of research is generally poor. The handsearch to identify studies of less clearly defined or reported diagnostic tests did not find additional studies. The primary limitations in the economic evaluations were due to uncertainty and lack of available evidence from the systematic reviews for key parameters in each of the five models. To address this, a conservative approach was adopted in choosing diagnostic test accuracy (DTA) estimates for the model parameters. Probabilistic analyses were undertaken for each of the models, incorporating wide levels of uncertainty, particularly for the DTA estimates. None of the economic models reported cost savings, but the approach adopted was conservative in order to obtain more reliable results given the lack of current information. The economic evaluations conclude that FDG PET/CT as an add-on imaging device is cost-effective in the pre-operative staging of recurrent colon, recurrent rectal and metastatic disease but not in primary colon or rectal cancers. There would be value in undertaking an RCT with a concurrent economic evaluation to evaluate the therapeutic impact and cost-effectiveness of FDG PET/CT compared with conventional imaging (without PET) for the pre-operative staging of recurrent and metastatic CRC.
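
The probabilistic decision-analytic modelling described above can be illustrated with a minimal Monte Carlo sketch. The cost and QALY distributions and the £30,000/QALY threshold below are illustrative assumptions, not parameters taken from the report:

```python
import random

def icer_samples(n=10000, seed=1):
    """Probabilistic sensitivity analysis sketch: sample incremental
    costs and QALYs from assumed distributions and form ICERs."""
    rng = random.Random(seed)
    icers = []
    for _ in range(n):
        # Hypothetical incremental cost (GBP) and incremental QALYs of
        # adding PET/CT to conventional staging -- assumed values only.
        d_cost = rng.gauss(900.0, 150.0)
        d_qaly = rng.gauss(0.042, 0.015)
        if d_qaly > 0:  # an ICER is only meaningful for a health gain
            icers.append(d_cost / d_qaly)
    return icers

samples = icer_samples()
mean_icer = sum(samples) / len(samples)
# Share of simulations deemed cost-effective at a 30,000 GBP/QALY threshold.
p_ce = sum(1 for s in samples if s < 30000.0) / len(samples)
```

In a full analysis, every model parameter (accuracy, costs, utilities) would be sampled jointly and the threshold varied to trace a cost-effectiveness acceptability curve.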

  20. An integrated approach to flood hazard assessment on alluvial fans using numerical modeling, field mapping, and remote sensing

    USGS Publications Warehouse

    Pelletier, J.D.; Mayer, L.; Pearthree, P.A.; House, P.K.; Demsey, K.A.; Klawon, J.K.; Vincent, K.R.

    2005-01-01

    Millions of people in the western United States live near the dynamic, distributary channel networks of alluvial fans where flood behavior is complex and poorly constrained. Here we test a new comprehensive approach to alluvial-fan flood hazard assessment that uses four complementary methods: two-dimensional raster-based hydraulic modeling, satellite-image change detection, field-based mapping of recent flood inundation, and surficial geologic mapping. Each of these methods provides spatial detail lacking in the standard method and each provides critical information for a comprehensive assessment. Our numerical model simultaneously solves the continuity equation and Manning's equation (Chow, 1959) using an implicit numerical method. It provides a robust numerical tool for predicting flood flows using the large, high-resolution Digital Elevation Models (DEMs) necessary to resolve the numerous small channels on the typical alluvial fan. Inundation extents and flow depths of historic floods can be reconstructed with the numerical model and validated against field- and satellite-based flood maps. A probabilistic flood hazard map can also be constructed by modeling multiple flood events with a range of specified discharges. This map can be used in conjunction with a surficial geologic map to further refine floodplain delineation on fans. To test the accuracy of the numerical model, we compared model predictions of flood inundation and flow depths against field- and satellite-based flood maps for two recent extreme events on the southern Tortolita and Harquahala piedmonts in Arizona. Model predictions match the field- and satellite-based maps closely. Probabilistic flood hazard maps based on the 10 yr, 100 yr, and maximum floods were also constructed for the study areas using stream gage records and paleoflood deposits. The resulting maps predict spatially complex flood hazards that strongly reflect small-scale topography and are consistent with surficial geology. 
In contrast, FEMA Flood Insurance Rate Maps (FIRMs) based on the FAN model predict uniformly high flood risk across the study areas without regard for small-scale topography and surficial geology. © 2005 Geological Society of America.
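
The construction of a probabilistic flood hazard map from several modeled events can be sketched as follows. The nesting assumption (each larger flood inundates at least the cells of the smaller one) and the toy grid are illustrative, not part of the published method:

```python
def hazard_map(event_masks, aeps):
    """Combine binary inundation masks from modeled floods of increasing
    magnitude into a per-cell annual inundation probability.
    Assumes events are nested, so a cell's probability is the annual
    exceedance probability (AEP) of the smallest flood that wets it."""
    rows, cols = len(event_masks[0]), len(event_masks[0][0])
    prob = [[0.0] * cols for _ in range(rows)]
    for mask, aep in zip(event_masks, aeps):
        for r in range(rows):
            for c in range(cols):
                if mask[r][c]:
                    prob[r][c] = max(prob[r][c], aep)
    return prob

# Toy 2x3 grid: the 10 yr flood (AEP 0.1) wets the channel cells only,
# while the 100 yr flood (AEP 0.01) also wets the adjacent column.
ten_yr  = [[1, 0, 0], [1, 0, 0]]
hundred = [[1, 1, 0], [1, 1, 0]]
p = hazard_map([ten_yr, hundred], [0.1, 0.01])
# p[0] == [0.1, 0.01, 0.0]
```

The resulting probability raster can then be intersected with a surficial geologic map to refine floodplain delineation, as the abstract describes.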

  1. A depth-of-interaction PET detector using a stair-shaped reflector arrangement and a single-ended scintillation light readout.

    PubMed

    Son, Jeong-Whan; Lee, Min Sun; Lee, Jae Sung

    2017-01-21

    Positron emission tomography (PET) detectors with the ability to encode depth-of-interaction (DOI) information allow us to simultaneously improve the spatial resolution and sensitivity of PET scanners. In this study, we propose a DOI PET detector based on a stair-pattern reflector arrangement inserted between pixelated crystals and a single-ended scintillation light readout. The main advantage of the proposed method is its simplicity; DOI information is decoded from a flood map and the data can be simply acquired by using a single-ended readout system. Another potential advantage is that the two-step DOI detectors can provide the largest peak-position distance in a flood map because two-dimensional peak positions can be evenly distributed. We conducted a Monte Carlo simulation and obtained flood maps. Then, we conducted experimental studies using two-step DOI arrays of 5 × 5 Lu1.9Y0.1SiO5:Ce crystals with a cross-section of 1.7 × 1.7 mm² and different detector configurations: an unpolished single-layer (US) array, a polished single-layer (PS) array and a polished stacked two-layer (PT) array. For each detector configuration, both air gaps and room-temperature-vulcanization (RTV) silicone gaps were tested. Detectors US and PT showed good peak separation in each scintillator, with an average peak-to-valley ratio (PVR) and distance-to-width ratio (DWR) of 2.09 and 1.53, respectively. Detector PS-RTV showed lower PVR and DWR (1.65 and 1.34, respectively). The configuration of detector PT-Air is preferable for the construction of time-of-flight DOI detectors because its timing resolution was degraded by only about 40 ps compared with that of a non-DOI detector. The performance of detectors US-Air and PS-RTV was lower than that of a non-DOI detector, and thus these designs are favorable when manufacturing cost is more important than timing performance. 
The results demonstrate that the proposed DOI-encoding method is a promising candidate for PET scanners that require high resolution and sensitivity and operate with conventional acquisition systems.

  2. Mapping Radiation Injury and Recovery in Bone Marrow Using 18F-FLT PET/CT and USPIO MRI in a Rat Model.

    PubMed

    Rendon, David A; Kotedia, Khushali; Afshar, Solmaz F; Punia, Jyotinder N; Sabek, Omaima M; Shirkey, Beverly A; Zawaski, Janice A; Gaber, M Waleed

    2016-02-01

    We present and test the use of multimodality imaging as a topological tool to map the amount of the body exposed to ionizing radiation and the location of exposure, which are important indicators of survival and recovery. To achieve our goal, PET/CT imaging with 3'-deoxy-3'-(18)F-fluorothymidine ((18)F-FLT) was used to measure cellular proliferation in bone marrow (BM), whereas MRI using ultra-small superparamagnetic iron oxide (USPIO) particles provided noninvasive information on radiation-induced vascular damage. Animals were x-ray-irradiated at a dose of 7.5 Gy with 1 of 3 radiation schemes (whole-body irradiation, half-body shielding [HBS], or 1-leg shielding [1LS]) and imaged repeatedly. The spatial information from the CT scan was used to segment the region corresponding to BM from the PET scan using algorithms developed in-house, allowing for quantification of proliferating cells, and BM blood volume was estimated by measuring the changes in the T2 relaxation rates (ΔR2) collected from MR scans. (18)F-FLT PET/CT imaging differentiated irradiated from unirradiated BM regions. Two days after irradiation, proliferation of 1LS animals was significantly lower than sham (P = 0.0001, femurs; P < 0.0001, tibias) and returned to sham levels by day 10 (P = 0.6344, femurs; P = 0.3962, tibias). The degree of shielding affected proliferation recovery, showing an increase in the irradiated BM of the femurs, but not the tibias, of HBS animals when compared with 1LS (P = 0.0310, femurs; P = 0.5832, tibias). MRI of irradiated spines detected radiation-induced BM vascular damage, measured by the significant increase in ΔR2 seen 2 d after whole-body irradiation (P = 0.0022) and HBS (P = 0.0003), with values trending downward and returning to levels close to baseline over 10 d. Our data were corroborated using γ-counting and histopathology. 
We demonstrated that (18)F-FLT PET/CT and USPIO MRI are valuable tools in mapping regional radiation exposure and the effects of radiation on BM. Analysis of the (18)F-FLT signal allowed for a clear demarcation of exposed BM regions and elucidated the kinetics of BM recovery, whereas USPIO MRI was used to assess vascular damage and recovery. © 2016 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
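
The ΔR2 blood-volume metric used above follows directly from the transverse relaxation rate R2 = 1/T2. A minimal sketch, with assumed T2 values rather than measurements from the study:

```python
def delta_r2(t2_pre_ms, t2_post_ms):
    """Change in transverse relaxation rate after USPIO injection:
    dR2 = 1/T2(post) - 1/T2(pre), in s^-1. T2 is given in ms, so
    1000/T2 converts to s^-1. Larger dR2 indicates more contrast agent
    in the voxel, i.e. greater blood volume."""
    return 1000.0 / t2_post_ms - 1000.0 / t2_pre_ms

# Assumed example: USPIO shortens T2 from 80 ms to 50 ms
dr2 = delta_r2(80.0, 50.0)   # 20.0 - 12.5 = 7.5 s^-1
```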

  3. TU-AB-202-07: A Novel Method for Registration of Mid-Treatment PET/CT Images Under Conditions of Tumor Regression for Patients with Locally Advanced Lung Cancers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharifi, Hoda; Department of Physics, Oakland University, Rochester, MI; Zhang, Hong

    Purpose: In PET-guided adaptive radiotherapy (RT), changes in the metabolic activity at individual voxels cannot be derived until the during-treatment CT images are appropriately registered to the pre-treatment CT images. However, deformable image registration (DIR) usually does not preserve tumor volume, which may introduce errors when metabolic changes in the target are compared. The aim of this study was to develop a DIR-integrated mechanical modeling technique to track radiation-induced metabolic changes on PET images. Methods: Three patients with non-small cell lung cancer (NSCLC) were treated with adaptive radiotherapy under RTOG 1106. Two PET/CT image sets were acquired 2 weeks before RT and 18 fractions after the start of treatment. DIR was performed to register the during-RT CT to the pre-RT CT using a B-spline algorithm, and the resultant displacements in the region of the tumor were remodeled using a hybrid finite element method (FEM). Gross tumor volume (GTV) was delineated on the during-RT PET/CT image sets and deformed using the 3D deformation vector fields generated by the CT-based registrations. Metabolic tumor volume (MTV) was calculated using the pre- and during-RT image sets. The quality of the PET mapping was evaluated based on the constancy of the mapped MTV and on landmark comparison. Results: The B-spline-based registrations changed MTVs by 7.3%, 4.6% and −5.9% for the 3 patients; the corresponding changes for the hybrid FEM method were −2.9%, 1.0% and 6.3%, respectively. Landmark comparisons were used to evaluate the rigid, B-spline, and hybrid FEM registrations, with mean errors of 10.1 ± 1.6 mm, 4.4 ± 0.4 mm, and 3.6 ± 0.4 mm for the three patients. The hybrid FEM method outperforms the B-spline-only registration for patients with tumor regression. Conclusion: The hybrid FEM modeling technique improves the B-spline registrations in tumor regions. This technique may help compare metabolic activities between two PET/CT images with regressing tumors. 
The author gratefully acknowledges the financial support from the National Institutes of Health Grant.
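
The landmark-based evaluation reported above reduces to the mean Euclidean distance between corresponding points before and after mapping through a registration. A minimal sketch with hypothetical landmark coordinates:

```python
import math

def mean_landmark_error(fixed_pts, mapped_pts):
    """Mean Euclidean distance (mm) between landmarks on the reference
    image and the corresponding points mapped through a registration;
    this is the metric used to compare rigid, B-spline and hybrid FEM
    registrations."""
    dists = [math.dist(f, m) for f, m in zip(fixed_pts, mapped_pts)]
    return sum(dists) / len(dists)

# Toy landmarks in mm (hypothetical): residual errors of 3 mm and 4 mm
fixed  = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0)]
mapped = [(3.0, 0.0, 0.0), (10.0, 4.0, 0.0)]
err = mean_landmark_error(fixed, mapped)   # (3 + 4) / 2 = 3.5 mm
```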

  4. Flutriciclamide (18F-GE180) PET: First-in-Human PET Study of Novel Third-Generation In Vivo Marker of Human Translocator Protein.

    PubMed

    Fan, Zhen; Calsolaro, Valeria; Atkinson, Rebecca A; Femminella, Grazia D; Waldman, Adam; Buckley, Christopher; Trigg, William; Brooks, David J; Hinz, Rainer; Edison, Paul

    2016-11-01

    Neuroinflammation is associated with neurodegenerative disease. PET radioligands targeting the 18-kDa translocator protein (TSPO) have been used as in vivo markers of neuroinflammation, but there is an urgent need for novel probes with improved signal-to-noise ratio. Flutriciclamide (18F-GE180) is a recently developed third-generation TSPO ligand. In this first study, we evaluated the optimum scan duration and kinetic modeling strategies for 18F-GE180 PET in (older) healthy controls. Ten healthy controls, 6 TSPO high-affinity binders, and 4 mixed-affinity binders were recruited. All subjects underwent detailed neuropsychologic tests, MRI, and a 210-min 18F-GE180 dynamic PET/CT scan using a metabolite-corrected arterial plasma input function. We evaluated 5 different kinetic models: irreversible and reversible 2-tissue-compartment models, a reversible 1-tissue model, and 2 models with an extra irreversible vascular compartment. The minimal scan duration was established using the 210-min scan data. The feasibility of generating parametric maps was also investigated using graphical analysis. 18F-GE180 concentration was higher in plasma than in whole blood during the entire scan duration. The volume of distribution (VT) was 0.17 in high-affinity binders and 0.12 in mixed-affinity binders using the kinetic model. The model that best represented brain 18F-GE180 kinetics across regions was the reversible 2-tissue-compartment model (2TCM4k), and 90 min was the optimum scan length required to obtain stable estimates. Logan graphical analysis with an arterial input function gave a VT highly consistent with the VT from the kinetic model, and could be used for voxelwise analysis. 
We report for the first time, to our knowledge, the kinetic properties of the novel third-generation TSPO PET ligand 18F-GE180 in humans: 2TCM4k is the optimal method to quantify brain uptake, 90 min is the optimal scan length, and the Logan approach can be used to generate parametric maps. Although these control subjects showed relatively low VT, the methodology presented here forms the basis for quantification in future PET studies using 18F-GE180 in different pathologies. © 2016 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
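
Logan graphical analysis, used above to generate parametric VT maps, can be sketched on synthetic one-tissue-compartment data. The rate constants and the constant plasma input below are illustrative assumptions chosen so that the true VT = K1/k2 = 0.2:

```python
def logan_vt(t, ct, cp, t_star):
    """Logan plot sketch: regress the integrated tissue curve against the
    integrated plasma curve (both normalised by the instantaneous tissue
    concentration); the late-time slope estimates VT."""
    def cumtrapz(y):  # cumulative trapezoidal integral on grid t
        out = [0.0]
        for i in range(1, len(y)):
            out.append(out[-1] + 0.5 * (y[i] + y[i - 1]) * (t[i] - t[i - 1]))
        return out
    ict, icp = cumtrapz(ct), cumtrapz(cp)
    xs, ys = [], []
    for i in range(len(t)):
        if t[i] >= t_star and ct[i] > 0:
            xs.append(icp[i] / ct[i])
            ys.append(ict[i] / ct[i])
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# Euler simulation of dCt/dt = K1*Cp - k2*Ct with constant Cp = 1
dt, K1, k2 = 0.01, 0.1, 0.5
t, ct, cp = [0.0], [0.0], [1.0]
for i in range(1, 9001):
    t.append(i * dt)
    ct.append(ct[-1] + dt * (K1 * cp[-1] - k2 * ct[-1]))
    cp.append(1.0)
vt = logan_vt(t, ct, cp, t_star=30.0)   # approaches K1/k2 = 0.2
```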

  5. Crystal Identification in Dual-Layer-Offset DOI-PET Detectors Using Stratified Peak Tracking Based on SVD and Mean-Shift Algorithm

    NASA Astrophysics Data System (ADS)

    Wei, Qingyang; Dai, Tiantian; Ma, Tianyu; Liu, Yaqiang; Gu, Yu

    2016-10-01

    An Anger-logic-based pixelated PET detector block requires a crystal position map (CPM) to assign the position of each detected event to a most probable crystal index. Accurate assignments are crucial to PET imaging performance. In this paper, we present a novel automatic approach to generate the CPMs for dual-layer-offset (DLO) PET detectors using a stratified peak-tracking method, in which the top and bottom layers are distinguished by their intensity difference, and the peaks of the top and bottom layers are tracked using a singular value decomposition (SVD) and a mean-shift algorithm in succession. The CPM is created by classifying each pixel to its nearest peak and assigning it the crystal index of that peak. A MATLAB-based graphical user interface program was developed, including the automatic algorithm and a manual interaction procedure. The algorithm was tested on three DLO PET detector blocks. Results show that the proposed method exhibits good performance as well as robustness for all three blocks. Compared to existing methods, our approach can directly distinguish the layer and crystal indices using the intensity information and the offset grid pattern.
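
The final CPM step, classifying each flood-map pixel to its nearest tracked peak, amounts to a Voronoi-style labelling. A minimal sketch with hypothetical peak positions (the SVD and mean-shift peak tracking itself is not reproduced here):

```python
def crystal_position_map(shape, peaks):
    """Assign every flood-map pixel the index of its nearest peak; each
    peak index stands for a (layer, crystal) identity found by the
    tracking stage."""
    rows, cols = shape
    cpm = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            cpm[r][c] = min(range(len(peaks)),
                            key=lambda k: (peaks[k][0] - r) ** 2 +
                                          (peaks[k][1] - c) ** 2)
    return cpm

# Two hypothetical peaks on a 4x4 flood map: pixels split by proximity
cpm = crystal_position_map((4, 4), [(0, 0), (3, 3)])
# cpm[0][0] == 0, cpm[3][3] == 1
```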

  6. Characterization of an In-Beam PET Prototype for Proton Therapy With Different Target Compositions

    NASA Astrophysics Data System (ADS)

    Attanasi, Francesca; Belcari, Nicola; Moehrs, Sascha; Rosso, Valeria; Vecchio, Sara; Cirrone, G. A. Pablo; Cuttone, Giacomo; Lojacono, Piero; Romano, Francesco; Lanconelli, Nico; Del Guerra, Alberto

    2010-06-01

    At the University of Pisa, the DoPET (Dosimetry with a Positron Emission Tomograph) project has focused on the development and characterization of an ad hoc, scalable, dual-head PET prototype for in-beam treatment-plan verification in proton therapy. In this paper we report the first results obtained with our current prototype, consisting of two opposing lutetium yttrium orthosilicate (LYSO) detectors, each covering an area of 4.5 × 4.5 cm². We measured the β+ activation induced by 62 MeV proton beams at the CATANA facility (LNS, Catania, Italy) in several plastic phantoms. Experiments were performed to evaluate the possibility of extracting accurate phantom geometrical information from the reconstructed PET images. The PET prototype proved its capability of locating small air cavities in homogeneous PMMA phantoms with submillimetric accuracy and of distinguishing materials with different 16O and 12C content by back-mapping the phantom geometry through separation of the isotope contributions. This could be very useful in clinical practice as a tool to highlight anatomical or physiological organ variations among different treatment sessions and to discriminate different tissue types, thus providing feedback on the accuracy of dose deposition.

  7. FPGA-based RF interference reduction techniques for simultaneous PET–MRI

    PubMed Central

    Gebhardt, P; Wehner, J; Weissler, B; Botnar, R; Marsden, P K; Schulz, V

    2016-01-01

    The combination of positron emission tomography (PET) and magnetic resonance imaging (MRI) as a multi-modal imaging technique is considered very promising and powerful with regard to in vivo disease progression examination, therapy response monitoring and drug development. However, PET–MRI system design enabling simultaneous operation with unaffected intrinsic performance of both modalities is challenging. As one of the major issues, both the PET detectors and the MRI radio-frequency (RF) subsystem are exposed to electromagnetic (EM) interference, which may lead to PET and MRI signal-to-noise ratio (SNR) deterioration. Early digitization of electronic PET signals within the MRI bore helps to preserve PET SNR, but comes at the expense of an increased amount of PET electronics inside the MRI and the associated RF field emissions. This raises the likelihood of PET-related MRI interference through the coupling of unwanted spurious signals (effectively RF noise) into the MRI RF coil, which degrades MRI SNR and results in MR image artefacts. RF shielding of PET detectors is a commonly used technique to reduce PET-related RF interference, but it can introduce eddy-current-related MRI disturbances and hinder a high degree of system integration. In this paper, we present RF interference reduction methods which rely on the EM field coupling–decoupling principles of RF receive coils rather than on suppressing emitted fields. By modifying clock frequencies and changing clock phase relations of digital circuits, the resulting RF field emission is optimised for lower field coupling into the MRI RF coil, thereby increasing the RF silence of the PET detectors. Our methods are demonstrated by performing FPGA-based clock frequency and phase shifting of the digital silicon photomultipliers (dSiPMs) used in the PET modules of our MR-compatible Hyperion IID PET insert. 
We present simulations and magnetic-field map scans visualising the impact of altered clock phase pattern on the spatial RF field distribution, followed by MRI noise and SNR scans performed with an operating PET module using different clock frequencies and phase patterns. The methods were implemented via firmware design changes without any hardware modifications. This introduces new means of flexibility by enabling adaptive RF interference reduction optimisations in the field, e.g. when using a PET insert with different MRI systems or when different MRI RF coil types are to be operated with the same PET detector. PMID:27049898

  8. A unified Fourier theory for time-of-flight PET data

    PubMed Central

    Li, Yusheng; Matej, Samuel; Metzler, Scott D

    2016-01-01

    Fully 3D time-of-flight (TOF) PET scanners offer the potential of previously unachievable image quality in clinical PET imaging. TOF measurements add another degree of redundancy for cylindrical PET scanners and make photon-limited TOF-PET imaging more robust than non-TOF PET imaging. The data space for 3D TOF-PET data is five-dimensional with two degrees of redundancy. Previously, consistency equations were used to characterize the redundancy of TOF-PET data. In this paper, we first derive two Fourier consistency equations and the Fourier-John equation for 3D TOF PET based on the generalized projection-slice theorem; these three partial differential equations (PDEs) are the dual of the sinogram consistency equations and John's equation. We then solve the three PDEs using the method of characteristics. The two degrees of entangled redundancy of the TOF-PET data can be explicitly elicited and exploited by the solutions of the PDEs along the characteristic curves, which gives a complete understanding of the rich structure of the 3D X-ray transform with TOF measurement. Fourier rebinning equations and other mapping equations among different types of PET data are special cases of the general solutions. We also obtain new Fourier rebinning and consistency equations (FORCEs) from other special cases of the general solutions, and thus we obtain a complete scheme to convert among different types of PET data: 3D TOF, 3D non-TOF, 2D TOF and 2D non-TOF data. The new FORCEs can be used as new Fourier-based rebinning algorithms for TOF-PET data reduction, inverse rebinnings for designing fast projectors, or consistency conditions for estimating missing data. Further, we give a geometric interpretation of the general solutions: the two families of characteristic curves can be obtained by respectively changing the azimuthal and co-polar angles of the biorthogonal coordinates in Fourier space. 
We conclude the unified Fourier theory by showing that the Fourier consistency equations are necessary and sufficient for 3D X-ray transform with TOF measurement. Finally, we give numerical examples of inverse rebinning for a 3D TOF PET and Fourier-based rebinning for a 2D TOF PET using the FORCEs to show the efficacy of the unified Fourier solutions. PMID:26689836
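
The generalized projection-slice theorem underlying the derivation can be checked numerically in the simplest (non-TOF, 2D, zero-angle) discrete case: the 1D DFT of a parallel projection equals the zero-vertical-frequency slice of the image's 2D DFT. A small pure-Python sketch:

```python
import cmath

def dft1(x):
    """Naive 1D discrete Fourier transform (O(n^2), fine for tiny n)."""
    n = len(x)
    return [sum(x[j] * cmath.exp(-2j * cmath.pi * k * j / n)
                for j in range(n)) for k in range(n)]

img = [[1, 2, 0, 3],
       [0, 1, 4, 1],
       [2, 0, 1, 0],
       [1, 1, 0, 2]]

# Parallel projection along the rows (angle 0): column sums.
proj = [sum(img[r][c] for r in range(4)) for c in range(4)]
slice_from_proj = dft1(proj)

# Zero-vertical-frequency slice of the 2D DFT: by linearity, the sum of
# the row-wise 1D DFTs.
rows_dft = [dft1(row) for row in img]
slice_2d = [sum(rows_dft[r][k] for r in range(4)) for k in range(4)]

ok = all(abs(a - b) < 1e-9 for a, b in zip(slice_from_proj, slice_2d))
```

The TOF generalization in the paper extends this relation to the five-dimensional TOF data space, where the redundancy is captured by the consistency PDEs rather than a single slice identity.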

  10. Bias atlases for segmentation-based PET attenuation correction using PET-CT and MR.

    PubMed

    Ouyang, Jinsong; Chun, Se Young; Petibon, Yoann; Bonab, Ali A; Alpert, Nathaniel; Fakhri, Georges El

    2013-10-01

    The aim of this study was to obtain voxel-wise PET accuracy and precision when using tissue segmentation for attenuation correction. We applied multiple thresholds to the CTs of 23 patients to classify tissues. For six of the 23 patients, MR images were also acquired. The MR fat/in-phase ratio images were used for fat segmentation. Segmented tissue classes were used to create attenuation maps, which were used for attenuation correction in PET reconstruction. PET bias images were then computed using the PET reconstructed with the original CT as the reference. We registered the CTs for all the patients and transformed the corresponding bias images accordingly. We then obtained the mean and standard deviation bias atlases using all the registered bias images. Our CT-based study shows that four-class segmentation (air, lungs, fat, other tissues), which is available on most PET-MR scanners, yields 15.1%, 4.1%, 6.6%, and 12.9% RMSE bias in lungs, fat, non-fat soft tissues, and bones, respectively. Accurate fat identification is achievable using fat/in-phase MR images. Furthermore, we found that three-class segmentation (air, lungs, other tissues) yields less than 5% standard deviation of bias within the heart, liver, and kidneys. This implies that three-class segmentation can be sufficient to achieve small variation of bias when imaging these three organs. Finally, we found that inter- and intra-patient lung density variations contribute almost equally to the overall standard deviation of bias within the lungs.
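
The per-tissue RMSE bias figures quoted above reduce to a simple aggregation over a voxel-wise bias map and a matching tissue-label map. A minimal sketch with toy values:

```python
import math

def rmse_bias_by_class(bias, labels):
    """RMSE of the voxel-wise PET bias (%) within each tissue class
    (e.g. lungs / fat / soft tissue / bone), given flattened per-voxel
    bias values and class labels of equal length."""
    sums, counts = {}, {}
    for b, lab in zip(bias, labels):
        sums[lab] = sums.get(lab, 0.0) + b * b
        counts[lab] = counts.get(lab, 0) + 1
    return {lab: math.sqrt(sums[lab] / counts[lab]) for lab in sums}

# Toy flattened maps: two lung voxels and two fat voxels
bias   = [3.0, -4.0, 1.0, -1.0]
labels = ["lung", "lung", "fat", "fat"]
r = rmse_bias_by_class(bias, labels)
# r["lung"] == sqrt((9 + 16) / 2) ~= 3.54, r["fat"] == 1.0
```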

  11. Self-Organizing Map Neural Network-Based Nearest Neighbor Position Estimation Scheme for Continuous Crystal PET Detectors

    NASA Astrophysics Data System (ADS)

    Wang, Yonggang; Li, Deng; Lu, Xiaoming; Cheng, Xinyi; Wang, Liwei

    2014-10-01

    Continuous-crystal-based positron emission tomography (PET) detectors could be an ideal alternative to current high-resolution pixelated PET detectors if the issues of high-performance γ-interaction position estimation and its real-time implementation are solved. Unfortunately, existing position estimators are not very feasible for implementation on a field-programmable gate array (FPGA). In this paper, we propose a new self-organizing map neural network-based nearest neighbor (SOM-NN) positioning scheme aiming not only at providing high performance, but also at being realistic for FPGA implementation. Benefitting from the SOM feature mapping mechanism, the large set of input reference events at each calibration position is approximated by a small set of prototypes, and the computation of the nearest-neighbor search for unknown events is greatly reduced. Using our experimental data, the scheme was evaluated, optimized and compared with the smoothed k-NN method. The full-width-at-half-maximum (FWHM) spatial resolutions of the two methods, averaged over the center axis of the detector, were 1.87 ± 0.17 mm and 1.92 ± 0.09 mm, respectively. The test results show that the SOM-NN scheme has positioning performance equivalent to the smoothed k-NN method, but its amount of computation is only about one-tenth that of the smoothed k-NN method. In addition, the algorithm structure of the SOM-NN scheme is more feasible for implementation on an FPGA. It has the potential to realize real-time position estimation on an FPGA with a high event-processing throughput.
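
The prototype idea behind the SOM-NN scheme, replacing the full reference set at each calibration position with a few representatives before the nearest-neighbor search, can be sketched as follows. The chunk-mean prototype reduction and the toy events are illustrative stand-ins for the paper's SOM training:

```python
def make_prototypes(events_by_pos, k=2):
    """Reduce each calibration position's reference events to at most k
    prototypes (here: simple chunk means, standing in for SOM training)."""
    protos = []
    for pos, events in events_by_pos.items():
        chunk = max(1, len(events) // k)
        for i in range(0, len(events), chunk):
            grp = events[i:i + chunk]
            dim = len(grp[0])
            mean = tuple(sum(e[d] for e in grp) / len(grp) for d in range(dim))
            protos.append((mean, pos))
    return protos

def classify(event, protos):
    """Nearest-prototype search: far fewer distance evaluations than
    searching every stored reference event."""
    best = min(protos,
               key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], event)))
    return best[1]

# Toy 2D light-distribution features for two calibration positions
events_by_pos = {
    "A": [(1.0, 1.1), (0.9, 1.0), (1.1, 0.9), (1.0, 1.0)],
    "B": [(3.0, 3.1), (2.9, 3.0), (3.1, 2.9), (3.0, 3.0)],
}
protos = make_prototypes(events_by_pos)
pos = classify((1.05, 0.95), protos)   # "A"
```

With a handful of prototypes per position instead of hundreds of raw events, the distance computations map naturally onto a fixed FPGA pipeline.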

  12. SU-F-R-36: Validating Quantitative Radiomic Texture Features for Oncologic PET: A Digital Phantom Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, F; Yang, Y; Young, L

    Purpose: Radiomic texture features derived from oncologic PET have recently come under intense investigation in the context of patient stratification and treatment-outcome prediction in a variety of cancer types; however, their validity has not yet been examined. This work aimed to validate radiomic PET texture metrics through the use of realistic simulations in a ground-truth setting. Methods: Simulation of FDG-PET was conducted by applying the Zubal phantom as an attenuation map to the SimSET software package, which employs Monte Carlo techniques to model the physical process of emission imaging. A total of 15 irregularly shaped lesions featuring heterogeneous activity distributions were simulated. For each simulated lesion, 28 texture features in relation to the intensity histograms (GLIH), grey-level co-occurrence matrices (GLCOM), neighborhood difference matrices (GLNDM), and zone size matrices (GLZSM) were evaluated and compared with their respective values extracted from the ground-truth activity map. Results: In reference to the values from the ground-truth images, texture parameters on the simulated data varied with ranges of 0.73–3026.2% for GLIH-based, 0.02–100.1% for GLCOM-based, 1.11–173.8% for GLNDM-based, and 0.35–66.3% for GLZSM-based features. For the majority of the examined texture metrics (16/28), values on the simulated data differed significantly from those from the ground-truth images (P-values ranging from <0.0001 to 0.04). Features not exhibiting significant differences comprised GLIH-based standard deviation; GLCOM-based energy and entropy; GLNDM-based coarseness and contrast; and GLZSM-based low gray-level zone emphasis, high gray-level zone emphasis, short zone low gray-level emphasis, long zone low gray-level emphasis, long zone high gray-level emphasis, and zone size nonuniformity. Conclusion: The extent to which PET imaging disturbs texture appearance is feature-dependent and could be substantial. 
It is thus advised that the use of PET texture parameters for predictive and prognostic measurements in the oncologic setting awaits further systematic and critical evaluation.
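The stability of GLCOM-based energy and entropy reported above can be reproduced in miniature. A minimal, numpy-only sketch; the 8-level quantization, single pixel offset, and random test image are illustrative assumptions, not the paper's settings:

```python
import numpy as np

def glcm(img, levels=8, offset=(0, 1)):
    """Normalized grey-level co-occurrence matrix for one pixel offset."""
    q = np.minimum((img * levels).astype(int), levels - 1)  # quantize to bins
    dr, dc = offset
    m = np.zeros((levels, levels))
    rows, cols = q.shape
    for r in range(rows - dr):
        for c in range(cols - dc):
            m[q[r, c], q[r + dr, c + dc]] += 1
    return m / m.sum()

def energy_entropy(p):
    """GLCOM-based energy and entropy, two of the features reported stable."""
    nz = p[p > 0]
    return float((p ** 2).sum()), float(-(nz * np.log2(nz)).sum())

rng = np.random.default_rng(0)
lesion = rng.random((32, 32))     # stand-in for a lesion activity map
energy, entropy = energy_entropy(glcm(lesion))
```

Comparing such values between a ground-truth activity map and its simulated PET rendition is the validation step the abstract describes.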

  13. A Curve Fitting Approach Using ANN for Converting CT Number to Linear Attenuation Coefficient for CT-based PET Attenuation Correction

    NASA Astrophysics Data System (ADS)

    Lai, Chia-Lin; Lee, Jhih-Shian; Chen, Jyh-Cheng

    2015-02-01

    Energy-mapping, the conversion of linear attenuation coefficients (μ) calculated at the effective computed tomography (CT) energy to those corresponding to 511 keV, is an important step in CT-based attenuation correction (CTAC) for positron emission tomography (PET) quantification. The aim of this study was to implement the energy-mapping step by using the curve-fitting ability of an artificial neural network (ANN). Eleven digital phantoms simulated by the Geant4 application for tomographic emission (GATE) and 12 physical phantoms composed of various volume concentrations of iodine contrast were used to generate energy-mapping curves by acquiring the average CT values and linear attenuation coefficients at 511 keV of these phantoms. The curves were built with the ANN toolbox in MATLAB. To evaluate the effectiveness of the proposed method, another two digital phantoms (liver and spine-bone) and three physical phantoms (volume concentrations of 3%, 10% and 20%) were used to compare the energy-mapping curves built by the ANN and by bilinear transformation, and a semi-quantitative analysis was performed by injecting 0.5 mCi of FDG into a Sprague-Dawley (SD) rat for micro-PET scanning. The results showed that the percentage relative difference (PRD) values of the digital liver and spine-bone phantoms were 5.46% and 1.28% based on the ANN, and 19.21% and 1.87% based on bilinear transformation. For the 3%, 10% and 20% physical phantoms, the PRD values of the ANN curve were 0.91%, 0.70% and 3.70%, and those of the bilinear transformation were 3.80%, 1.44% and 4.30%, respectively. Both digital and physical phantoms indicated that the ANN curve achieves better performance than bilinear transformation. The semi-quantitative analysis of rat PET images showed that the ANN curve reduced the inaccuracy caused by the attenuation effect from 13.75% to 4.43% in brain tissue, and from 23.26% to 9.41% in heart tissue. 
On the other hand, the inaccuracy remained at 6.47% and 11.51% in brain and heart tissue, respectively, when the bilinear transformation was used. Overall, it can be concluded that the bilinear transformation method resulted in considerable bias and that the newly proposed calibration curve built by the ANN achieved better results with acceptable accuracy.
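For context, the bilinear transformation that the ANN curve is benchmarked against maps CT numbers (HU) to 511 keV attenuation coefficients with two linear segments. A hedged sketch with illustrative breakpoint and bone-segment slope (scanner-specific calibrations differ), together with the percentage relative difference (PRD) used for scoring:

```python
import numpy as np

MU_WATER_511 = 0.096   # cm^-1, linear attenuation of water at 511 keV

def bilinear_hu_to_mu(hu, breakpoint=0.0, bone_slope=0.0000510):
    """Piecewise-linear (bilinear) HU -> mu(511 keV) conversion.
    Breakpoint and slopes are illustrative, not a calibrated scanner curve."""
    hu = np.asarray(hu, dtype=float)
    soft = MU_WATER_511 * (hu + 1000.0) / 1000.0   # air-to-water segment
    bone = MU_WATER_511 + bone_slope * hu          # shallower bone segment
    return np.where(hu <= breakpoint, soft, bone)

def prd(estimate, reference):
    """Percentage relative difference used to score the curves."""
    return 100.0 * abs(estimate - reference) / reference

mu = bilinear_hu_to_mu([-1000, 0, 1000])   # air, water, dense bone
```

The ANN replaces this fixed two-segment curve with a smooth fitted function, which is what reduces the PRD figures quoted above.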

  14. Evansville Area Earthquake Hazards Mapping Project (EAEHMP) - Progress Report, 2008

    USGS Publications Warehouse

    Boyd, Oliver S.; Haase, Jennifer L.; Moore, David W.

    2009-01-01

    Maps of surficial geology, deterministic and probabilistic seismic hazard, and liquefaction potential index have been prepared by various members of the Evansville Area Earthquake Hazard Mapping Project for seven quadrangles in the Evansville, Indiana, and Henderson, Kentucky, metropolitan areas. The surficial geologic maps feature 23 types of surficial geologic deposits, artificial fill, and undifferentiated bedrock outcrop and include alluvial and lake deposits of the Ohio River valley. Probabilistic and deterministic seismic hazard and liquefaction hazard mapping is made possible by drawing on a wealth of information including surficial geologic maps, water well logs, and in-situ testing profiles using the cone penetration test, standard penetration test, down-hole shear wave velocity tests, and seismic refraction tests. These data were compiled and collected with contributions from the Indiana Geological Survey, Kentucky Geological Survey, Illinois State Geological Survey, United States Geological Survey, and Purdue University. Hazard map products are in progress and are expected to be completed by the end of 2009, with a public rollout in early 2010. Preliminary results suggest that there is a 2 percent probability that peak ground accelerations of about 0.3 g will be exceeded in much of the study area within 50 years, similar to the firm-rock site value in the 2002 USGS National Seismic Hazard Maps. Accelerations as high as 0.4-0.5 g may be exceeded along the edge of the Ohio River basin. Most of the region outside of the river basin has a low liquefaction potential index (LPI), where the probability that LPI is greater than 5 (that is, there is a high potential for liquefaction) for a M7.7 New Madrid type event is only 20-30 percent. Within the river basin, most of the region has high LPI, where the probability that LPI is greater than 5 for a New Madrid type event is 80-100 percent.

  15. MrLavaLoba: A new probabilistic model for the simulation of lava flows as a settling process

    NASA Astrophysics Data System (ADS)

    de'Michieli Vitturi, Mattia; Tarquini, Simone

    2018-01-01

    A new code to simulate lava flow spread, MrLavaLoba, is presented. In the code, erupted lava is itemized in parcels having an elliptical shape and a prescribed volume. New parcels bud from existing ones according to a probabilistic law influenced by the local steepest-slope direction and by tunable input settings. MrLavaLoba belongs among the probabilistic codes for the simulation of lava flows: it is not intended to mimic the actual process of flowing or to directly provide the progression of the flow field with time, but rather to estimate the most probable inundated area and the final thickness of the lava deposit. The code's flexibility allows it to produce variable lava flow spread and emplacement according to different dynamics (e.g. pahoehoe or channelized 'a'ā). For a given scenario, it is shown that model outputs converge, in probabilistic terms, towards a single solution. The code is applied to real cases in Hawaii and at Mt. Etna, and the obtained maps are shown. The model is written in Python and the source code is available at http://demichie.github.io/MrLavaLoba/.
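The budding law described above can be caricatured in a few lines. A minimal sketch assuming a uniform steepest slope along +x and a von Mises draw standing in for the code's tunable probabilistic law; both are assumptions, not MrLavaLoba's actual settings:

```python
import numpy as np

rng = np.random.default_rng(1)

def bud_azimuth(steepest_azimuth, concentration=4.0):
    """Sample a budding direction biased toward the steepest descent azimuth."""
    return rng.vonmises(steepest_azimuth, concentration)

def simulate(n_parcels=2000, step=1.0):
    """Settle n_parcels parcel centres downslope from a vent at (0, 0)."""
    pts = np.zeros((n_parcels, 2))
    for i in range(1, n_parcels):
        parent = pts[rng.integers(i)]   # new parcel buds from an existing one
        az = bud_azimuth(0.0)           # steepest slope assumed along +x
        pts[i] = parent + step * np.array([np.cos(az), np.sin(az)])
    return pts

field = simulate()
```

Rasterizing many such realizations yields the probabilistic inundation maps the abstract refers to; the parcel cloud drifts downslope on average while preserving lateral spread.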

  16. Grid occupancy estimation for environment perception based on belief functions and PCR6

    NASA Astrophysics Data System (ADS)

    Moras, Julien; Dezert, Jean; Pannetier, Benjamin

    2015-05-01

    In this contribution, we propose to improve the grid-map occupancy estimation method developed so far based on belief function modeling and the classical Dempster's rule of combination. A grid map offers a useful representation of the perceived world for mobile robotics navigation. It will play a major role in the security (obstacle avoidance) of the next generations of terrestrial vehicles, as well as in future autonomous navigation systems. In a grid map, the occupancy of each cell, representing a small piece of the surrounding area of the robot, must first be estimated from sensor measurements (typically LIDAR or camera), and then it must also be classified into different classes in order to get a complete and precise perception of the dynamic environment in which the robot moves. So far, the estimation and the grid-map updating have been done using fusion techniques based on the probabilistic framework, or on the classical belief function framework through an inverse model of the sensors, mainly because the latter offers an interesting management of uncertainties when the quality of the available information is low and when the sources of information appear to conflict. To improve the performance of the grid-map estimation, we propose in this paper to replace Dempster's rule of combination with the PCR6 rule (Proportional Conflict Redistribution rule #6) proposed in DSmT (Dezert-Smarandache Theory). As an illustrating scenario, we consider a platform moving in a dynamic area and compare our new realistic simulation results (based on a LIDAR sensor) with those obtained by the probabilistic and the classical belief-based approaches.
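The difference between Dempster's rule and PCR6 for a single grid cell can be sketched on the frame {occupied (O), free (F)} with ignorance mass on O∪F ('OF'). For two sources PCR6 coincides with PCR5: each partial conflict is redistributed back to its originating focal elements in proportion to their masses. The sensor masses below are illustrative:

```python
def conjunctive(m1, m2):
    """Conjunctive combination on {O, F, OF}; returns masses and conflict K."""
    inter = {('O', 'O'): 'O', ('O', 'OF'): 'O', ('OF', 'O'): 'O',
             ('F', 'F'): 'F', ('F', 'OF'): 'F', ('OF', 'F'): 'F',
             ('OF', 'OF'): 'OF'}
    c, k = {'O': 0.0, 'F': 0.0, 'OF': 0.0}, 0.0
    for x, mx in m1.items():
        for y, my in m2.items():
            key = inter.get((x, y))
            if key is None:          # O intersect F is empty: conflicting mass
                k += mx * my
            else:
                c[key] += mx * my
    return c, k

def dempster(m1, m2):
    """Dempster's rule: normalize out the total conflict K."""
    c, k = conjunctive(m1, m2)
    return {a: v / (1.0 - k) for a, v in c.items()}

def pcr6(m1, m2):
    """PCR6 (= PCR5 for two sources): redistribute each partial conflict
    proportionally to the masses that generated it."""
    c, _ = conjunctive(m1, m2)
    for a, b in (('O', 'F'), ('F', 'O')):
        x = m1[a] * m2[b]
        if x > 0:
            c[a] += m1[a] * x / (m1[a] + m2[b])
            c[b] += m2[b] * x / (m1[a] + m2[b])
    return c

m_lidar = {'O': 0.7, 'F': 0.1, 'OF': 0.2}    # illustrative inverse-model outputs
m_camera = {'O': 0.1, 'F': 0.7, 'OF': 0.2}
```

With two strongly conflicting sources, Dempster's normalization spreads the conflict globally, while PCR6 keeps it local to the elements involved, which is the behavior the paper exploits for highly conflicting cells.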

  17. Challenges in making a seismic hazard map for Alaska and the Aleutians

    USGS Publications Warehouse

    Wesson, R.L.; Boyd, O.S.; Mueller, C.S.; Frankel, A.D.; Freymueller, J.T.

    2008-01-01

    We present a summary of the data and analyses leading to the revision of the time-independent probabilistic seismic hazard maps of Alaska and the Aleutians. These maps represent a revision of existing maps based on newly obtained data, and reflect best current judgments about methodology and approach. They have been prepared following the procedures and assumptions made in the preparation of the 2002 National Seismic Hazard Maps for the lower 48 States, and will be proposed for adoption in future revisions to the International Building Code. We present example maps for peak ground acceleration, 0.2 s spectral amplitude (SA), and 1.0 s SA at a probability level of 2% in 50 years (annual probability of 0.000404). In this summary, we emphasize issues encountered in preparation of the maps that motivate or require future investigation and research.
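The quoted correspondence between the 2%-in-50-years probability level and the annual probability of 0.000404 follows from the time-independent (Poisson) hazard assumption:

```python
import math

def annual_rate(prob, years):
    """Annual exceedance rate implied by an exceedance probability over
    `years`, under the Poisson model: P = 1 - exp(-rate * years)."""
    return -math.log(1.0 - prob) / years

rate = annual_rate(0.02, 50)     # 2% probability of exceedance in 50 years
return_period = 1.0 / rate       # mean return period in years (about 2475)
```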

  18. Zero-Echo-Time and Dixon Deep Pseudo-CT (ZeDD CT): Direct Generation of Pseudo-CT Images for Pelvic PET/MRI Attenuation Correction Using Deep Convolutional Neural Networks with Multiparametric MRI.

    PubMed

    Leynes, Andrew P; Yang, Jaewon; Wiesinger, Florian; Kaushik, Sandeep S; Shanbhag, Dattesh D; Seo, Youngho; Hope, Thomas A; Larson, Peder E Z

    2018-05-01

    Accurate quantification of uptake on PET images depends on accurate attenuation correction in reconstruction. Current MR-based attenuation correction methods for body PET use a fat and water map derived from a 2-echo Dixon MRI sequence in which bone is neglected. Ultrashort-echo-time or zero-echo-time (ZTE) pulse sequences can capture bone information. We propose the use of patient-specific multiparametric MRI consisting of Dixon MRI and proton-density-weighted ZTE MRI to directly synthesize pseudo-CT images with a deep learning model: we call this method ZTE and Dixon deep pseudo-CT (ZeDD CT). Methods: Twenty-six patients were scanned using an integrated 3-T time-of-flight PET/MRI system. Helical CT images of the patients were acquired separately. A deep convolutional neural network was trained to transform ZTE and Dixon MR images into pseudo-CT images. Ten patients were used for model training, and 16 patients were used for evaluation. Bone and soft-tissue lesions were identified, and the SUVmax was measured. The root-mean-squared error (RMSE) was used to compare the MR-based attenuation correction with the ground-truth CT attenuation correction. Results: In total, 30 bone lesions and 60 soft-tissue lesions were evaluated. The RMSE in PET quantification was reduced by a factor of 4 for bone lesions (10.24% for Dixon PET and 2.68% for ZeDD PET) and by a factor of 1.5 for soft-tissue lesions (6.24% for Dixon PET and 4.07% for ZeDD PET). Conclusion: ZeDD CT produces natural-looking and quantitatively accurate pseudo-CT images and reduces error in pelvic PET/MRI attenuation correction compared with standard methods. © 2018 by the Society of Nuclear Medicine and Molecular Imaging.

  19. Dopamine D3 Receptor Availability Is Associated with Inflexible Decision Making.

    PubMed

    Groman, Stephanie M; Smith, Nathaniel J; Petrullli, J Ryan; Massi, Bart; Chen, Lihui; Ropchan, Jim; Huang, Yiyun; Lee, Daeyeol; Morris, Evan D; Taylor, Jane R

    2016-06-22

    Dopamine D2/3 receptor signaling is critical for flexible adaptive behavior; however, it is unclear whether D2, D3, or both receptor subtypes modulate precise signals of feedback and reward history that underlie optimal decision making. Here, PET with the radioligand [(11)C]-(+)-PHNO was used to quantify individual differences in putative D3 receptor availability in rodents trained on a novel three-choice spatial acquisition and reversal-learning task with probabilistic reinforcement. Binding of [(11)C]-(+)-PHNO in the midbrain was negatively related to the ability of rats to adapt to changes in rewarded locations, but not to the initial learning. Computational modeling of choice behavior in the reversal phase indicated that [(11)C]-(+)-PHNO binding in the midbrain was related to the learning rate and sensitivity to positive, but not negative, feedback. Administration of a D3-preferring agonist likewise impaired reversal performance by reducing the learning rate and sensitivity to positive feedback. These results demonstrate a previously unrecognized role for D3 receptors in select aspects of reinforcement learning and suggest that individual variation in midbrain D3 receptors influences flexible behavior. Our combined neuroimaging, behavioral, pharmacological, and computational approach implicates the dopamine D3 receptor in decision-making processes that are altered in psychiatric disorders. Flexible decision-making behavior is dependent upon dopamine D2/3 signaling in corticostriatal brain regions. However, the role of D3 receptors in adaptive, goal-directed behavior has not been thoroughly investigated. By combining PET imaging with the D3-preferring radioligand [(11)C]-(+)-PHNO, pharmacology, a novel three-choice probabilistic discrimination and reversal task and computational modeling of behavior in rats, we report that naturally occurring variation in [(11)C]-(+)-PHNO receptor availability relates to specific aspects of flexible decision making. 
We confirm these relationships using a D3-preferring agonist, thus identifying a unique role of midbrain D3 receptors in decision-making processes. Copyright © 2016 the authors 0270-6474/16/366732-10$15.00/0.
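The computational modeling step, in which separate learning rates for positive and negative feedback drive reversal performance, can be sketched as a toy simulation. Task parameters, softmax temperature, and trial counts here are illustrative assumptions, not the authors' fitted model:

```python
import math
import random

def simulate_reversal(alpha_pos, alpha_neg, beta=5.0, trials=400, seed=0):
    """Three-choice probabilistic reversal task. Separate learning rates
    scale updates after rewarded (alpha_pos) and unrewarded (alpha_neg)
    choices; returns the fraction of correct post-reversal choices."""
    rng = random.Random(seed)
    q = [0.0, 0.0, 0.0]
    best = 0                                  # location rewarded with p = 0.8
    correct = 0
    for t in range(trials):
        if t == trials // 2:
            best = 1                          # reversal: rewarded location moves
        w = [math.exp(beta * v) for v in q]   # softmax action selection
        draw, acc, choice = rng.random() * sum(w), 0.0, 0
        for i, wi in enumerate(w):
            acc += wi
            if draw <= acc:
                choice = i
                break
        reward = 1.0 if rng.random() < (0.8 if choice == best else 0.2) else 0.0
        rate = alpha_pos if reward else alpha_neg
        q[choice] += rate * (reward - q[choice])
        if t >= trials // 2 and choice == best:
            correct += 1
    return correct / (trials - trials // 2)

flexible = simulate_reversal(alpha_pos=0.5, alpha_neg=0.5)
blunted = simulate_reversal(alpha_pos=0.05, alpha_neg=0.5)  # low reward sensitivity
```

Lowering the positive-feedback learning rate slows re-acquisition after the reversal, which is the qualitative signature the study relates to midbrain D3 receptor availability.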

  20. Seismically induced landslides: current research by the US Geological Survey.

    USGS Publications Warehouse

    Harp, E.L.; Wilson, R.C.; Keefer, D.K.; Wieczorek, G.F.

    1986-01-01

    We have produced a regional seismic slope-stability map and a probabilistic prediction of landslide distribution from a postulated earthquake. For liquefaction-induced landslides, in situ measurements of seismically induced pore-water pressures have been used to establish an elastic model of pore pressure generation. -from Authors

  1. Feasibility of Computed Tomography-Guided Methods for Spatial Normalization of Dopamine Transporter Positron Emission Tomography Image.

    PubMed

    Kim, Jin Su; Cho, Hanna; Choi, Jae Yong; Lee, Seung Ha; Ryu, Young Hoon; Lyoo, Chul Hyoung; Lee, Myung Sik

    2015-01-01

    Spatial normalization is a prerequisite step for analyzing positron emission tomography (PET) images, both by using a volume-of-interest (VOI) template and by voxel-based analysis. Magnetic resonance (MR) or ligand-specific PET templates are currently used for spatial normalization of PET images. We used computed tomography (CT) images acquired with a PET/CT scanner for the spatial normalization of [18F]-N-3-fluoropropyl-2-betacarboxymethoxy-3-beta-(4-iodophenyl) nortropane (FP-CIT) PET images and compared target-to-cerebellar standardized uptake value ratio (SUVR) values with those obtained from MR- or PET-guided spatial normalization methods in healthy controls and patients with Parkinson's disease (PD). We included 71 healthy controls and 56 patients with PD who underwent [18F]-FP-CIT PET scans with a PET/CT scanner and T1-weighted MR scans. Spatial normalization of MR images was done with a conventional spatial normalization tool (cvMR) and with the DARTEL toolbox (dtMR) in statistical parametric mapping software. The CT images were modified in two ways: skull-stripping (ssCT) and intensity transformation (itCT). We normalized PET images with cvMR-, dtMR-, ssCT-, itCT-, and PET-guided methods by using specific templates for each modality and measured striatal SUVR with a VOI template. The SUVR values measured with FreeSurfer-generated VOIs (FSVOI) overlaid on the original PET images were also used as a gold standard for comparison. The SUVR values derived from all four structure-guided spatial normalization methods were highly correlated with those measured with FSVOI (P < 0.0001). Putaminal SUVR values were highly effective for discriminating PD patients from controls. However, the PET-guided method excessively overestimated striatal SUVR values in the PD patients by more than 30% in the caudate and putamen, thereby spoiling the linearity between the striatal SUVR values across all subjects, and showed lower disease discrimination ability. 
The two CT-guided methods showed capability comparable to that of the MR-guided methods in separating PD patients from controls, and showed better correlation between putaminal SUVR values and parkinsonian motor severity than the PET-guided method. CT-guided spatial normalization methods provided reliable striatal SUVR values comparable to those obtained with MR-guided methods. CT-guided methods can be useful for analyzing dopamine transporter PET images when MR images are unavailable.
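The target-to-cerebellar SUVR at the center of this comparison is a simple ratio of mean uptakes over the target and reference VOIs. A minimal sketch with illustrative uptake values (not the study's data):

```python
import numpy as np

def suvr(target_voxels, cerebellum_voxels):
    """Target-to-cerebellar standardized uptake value ratio (SUVR)."""
    return float(np.mean(target_voxels) / np.mean(cerebellum_voxels))

# Illustrative putamen and cerebellum VOI uptake values (not study data):
control_suvr = suvr([8.2, 8.5, 7.9], [2.1, 2.0, 2.2])
pd_suvr = suvr([4.0, 3.8, 4.1], [2.1, 2.0, 2.2])
```

Because the ratio depends entirely on which voxels the spatially normalized VOI captures, errors in normalization (as with the PET-guided method above) propagate directly into the SUVR.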

  2. Feasibility of Computed Tomography-Guided Methods for Spatial Normalization of Dopamine Transporter Positron Emission Tomography Image

    PubMed Central

    Kim, Jin Su; Cho, Hanna; Choi, Jae Yong; Lee, Seung Ha; Ryu, Young Hoon; Lyoo, Chul Hyoung; Lee, Myung Sik

    2015-01-01

    Background Spatial normalization is a prerequisite step for analyzing positron emission tomography (PET) images, both by using a volume-of-interest (VOI) template and by voxel-based analysis. Magnetic resonance (MR) or ligand-specific PET templates are currently used for spatial normalization of PET images. We used computed tomography (CT) images acquired with a PET/CT scanner for the spatial normalization of [18F]-N-3-fluoropropyl-2-betacarboxymethoxy-3-beta-(4-iodophenyl) nortropane (FP-CIT) PET images and compared target-to-cerebellar standardized uptake value ratio (SUVR) values with those obtained from MR- or PET-guided spatial normalization methods in healthy controls and patients with Parkinson’s disease (PD). Methods We included 71 healthy controls and 56 patients with PD who underwent [18F]-FP-CIT PET scans with a PET/CT scanner and T1-weighted MR scans. Spatial normalization of MR images was done with a conventional spatial normalization tool (cvMR) and with the DARTEL toolbox (dtMR) in statistical parametric mapping software. The CT images were modified in two ways: skull-stripping (ssCT) and intensity transformation (itCT). We normalized PET images with cvMR-, dtMR-, ssCT-, itCT-, and PET-guided methods by using specific templates for each modality and measured striatal SUVR with a VOI template. The SUVR values measured with FreeSurfer-generated VOIs (FSVOI) overlaid on the original PET images were also used as a gold standard for comparison. Results The SUVR values derived from all four structure-guided spatial normalization methods were highly correlated with those measured with FSVOI (P < 0.0001). Putaminal SUVR values were highly effective for discriminating PD patients from controls. However, the PET-guided method excessively overestimated striatal SUVR values in the PD patients by more than 30% in the caudate and putamen, thereby spoiling the linearity between the striatal SUVR values across all subjects, and showed lower disease discrimination ability. 
The two CT-guided methods showed capability comparable to that of the MR-guided methods in separating PD patients from controls, and showed better correlation between putaminal SUVR values and parkinsonian motor severity than the PET-guided method. Conclusion CT-guided spatial normalization methods provided reliable striatal SUVR values comparable to those obtained with MR-guided methods. CT-guided methods can be useful for analyzing dopamine transporter PET images when MR images are unavailable. PMID:26147749

  3. Pig brain stereotaxic standard space: mapping of cerebral blood flow normative values and effect of MPTP-lesioning.

    PubMed

    Andersen, Flemming; Watanabe, Hideaki; Bjarkam, Carsten; Danielsen, Erik H; Cumming, Paul

    2005-07-15

    The analysis of physiological processes in the brain by positron emission tomography (PET) is facilitated when images are spatially normalized to a standard coordinate system. Thus, PET activation studies of human brain frequently employ the common stereotaxic coordinates of Talairach. We have developed an analogous stereotaxic coordinate system for the brain of the Göttingen miniature pig, based on automatic co-registration of magnetic resonance (MR) images obtained in 22 male pigs. The origin of the pig brain stereotaxic space (0, 0, 0) was arbitrarily placed in the centroid of the pineal gland as identified on the average MRI template. The orthogonal planes were imposed using the line between stereotaxic zero and the optic chiasm. A series of mean MR images in the coronal, sagittal and horizontal planes were generated. To test the utility of the common coordinate system for functional imaging studies of minipig brain, we calculated cerebral blood flow (CBF) maps from normal minipigs and from minipigs with a syndrome of parkinsonism induced by 1-methyl-4-phenyl-1,2,3,6-tetrahydropyridine (MPTP) poisoning. These maps were transformed from the native space into the common stereotaxic space. After global normalization of these maps, an undirected search for differences between the groups was then performed using statistical parametric mapping. Using this method, we detected a statistically significant focal increase in CBF in the left cerebellum of the MPTP-lesioned group. We expect the present approach to be of general use in the statistical parametric mapping of CBF and other physiological parameters in living pig brain.
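Per subject, the normalization into the common coordinate system reduces to applying an estimated affine transform; placing the pineal centroid at the origin then amounts to a translation. A minimal sketch with an illustrative pineal location (the real pipeline estimates the affine by MR co-registration):

```python
import numpy as np

def to_stereotaxic(points_native, affine):
    """Map native-space coordinates into the common stereotaxic space
    via a 4x4 affine (rotation/scale/translation)."""
    pts = np.asarray(points_native, float)
    homo = np.c_[pts, np.ones(len(pts))]       # homogeneous coordinates
    return (homo @ affine.T)[:, :3]

# Pure translation placing an illustrative pineal centroid at the origin:
pineal = np.array([12.0, -4.0, 30.0])
affine = np.eye(4)
affine[:3, 3] = -pineal
origin = to_stereotaxic([pineal], affine)[0]
```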

  4. Assessing and reporting uncertainties in dietary exposure analysis: Mapping of uncertainties in a tiered approach.

    PubMed

    Kettler, Susanne; Kennedy, Marc; McNamara, Cronan; Oberdörfer, Regina; O'Mahony, Cian; Schnabel, Jürgen; Smith, Benjamin; Sprong, Corinne; Faludi, Roland; Tennant, David

    2015-08-01

    Uncertainty analysis is an important component of dietary exposure assessments in order to correctly understand the strengths and limits of their results. Often, standard screening procedures are applied in a first step, which results in conservative estimates. If those screening procedures indicate a potential exceedance of health-based guidance values, more refined models are applied within the tiered approach. However, the sources and types of uncertainties in deterministic and probabilistic models can vary or differ. A key objective of this work has been the mapping of different sources and types of uncertainties to better understand how to best use uncertainty analysis to generate a more realistic comprehension of dietary exposure. In dietary exposure assessments, uncertainties can be introduced by knowledge gaps about the exposure scenario, the parameters, and the model itself. With this mapping, general and model-independent uncertainties have been identified and described, as well as those which can be introduced and influenced by the specific model during the tiered approach. This analysis identifies general uncertainties common to point estimates (screening or deterministic methods) and probabilistic exposure assessment methods. To provide further clarity, general sources of uncertainty affecting many dietary exposure assessments should be separated from model-specific uncertainties. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
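The tiered-approach contrast described above, a conservative screening point estimate versus a refined probabilistic estimate, can be sketched as follows; the intake and concentration distributions are purely illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)

def point_estimate(p95_intake_g, max_conc_mg_per_g, bw_kg=60.0):
    """Tier-1 screening: high-percentile intake times maximum concentration."""
    return p95_intake_g * max_conc_mg_per_g / bw_kg

def probabilistic_p95(n=100_000, bw_kg=60.0):
    """Refined tier: sample intake and concentration distributions (assumed
    lognormal here purely for illustration) and report the exposure P95."""
    intake = rng.lognormal(mean=np.log(50), sigma=0.5, size=n)   # g/day
    conc = rng.lognormal(mean=np.log(0.02), sigma=0.4, size=n)   # mg/g
    return float(np.percentile(intake * conc / bw_kg, 95))

screen = point_estimate(p95_intake_g=120.0, max_conc_mg_per_g=0.08)  # mg/kg/day
refined = probabilistic_p95()
```

The screening estimate stacks conservative inputs and therefore exceeds the probabilistic P95, illustrating why an indicated exceedance at the screening tier triggers refinement rather than a conclusion.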

  5. A Bayesian framework based on a Gaussian mixture model and radial-basis-function Fisher discriminant analysis (BayGmmKda V1.1) for spatial prediction of floods

    NASA Astrophysics Data System (ADS)

    Tien Bui, Dieu; Hoang, Nhat-Duc

    2017-09-01

    In this study, a probabilistic model, named BayGmmKda, is proposed for flood susceptibility assessment in a study area in central Vietnam. The new model is a Bayesian framework constructed by a combination of a Gaussian mixture model (GMM), radial-basis-function Fisher discriminant analysis (RBFDA), and a geographic information system (GIS) database. In the Bayesian framework, the GMM is used for modeling the data distribution of flood-influencing factors in the GIS database, whereas RBFDA is utilized to construct a latent variable that aims at enhancing the model performance. As a result, the posterior probabilistic output of the BayGmmKda model is used as a flood susceptibility index. Experimental results showed that the proposed hybrid framework is superior to other benchmark models, including the adaptive neuro-fuzzy inference system and the support vector machine. To facilitate the model implementation, a software program for BayGmmKda has been developed in MATLAB. The BayGmmKda program can accurately establish a flood susceptibility map for the study region. Accordingly, local authorities can overlay this susceptibility map onto various land-use maps for the purpose of land-use planning or management.
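A stripped-down version of the Bayesian idea, the posterior class probability of flooding used as a susceptibility index, can be sketched with one Gaussian per class standing in for the paper's full GMM-plus-RBFDA pipeline (all parameter values illustrative):

```python
import numpy as np

def gauss_pdf(x, mu, var):
    """Univariate Gaussian density."""
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

def flood_posterior(x, params, priors=(0.5, 0.5)):
    """Posterior P(flood | factor value x) via Bayes' rule from
    class-conditional densities; one Gaussian per class stands in
    for the paper's mixture model."""
    (mu_f, var_f), (mu_n, var_n) = params
    lf = priors[0] * gauss_pdf(x, mu_f, var_f)   # flood-class likelihood
    ln = priors[1] * gauss_pdf(x, mu_n, var_n)   # non-flood likelihood
    return lf / (lf + ln)

# Illustrative single factor (elevation in m): low elevation favors flooding.
susceptibility = flood_posterior(np.array([5.0, 50.0]),
                                 params=((10.0, 25.0), (60.0, 400.0)))
```

Evaluating such a posterior over every grid cell of the GIS database is what produces the susceptibility map described above.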

  6. Precision Medicine and PET/Computed Tomography: Challenges and Implementation.

    PubMed

    Subramaniam, Rathan M

    2017-01-01

    Precision Medicine is about selecting the right therapy for the right patient, at the right time, specific to the molecular targets expressed by the disease or tumor, in the context of the patient's environment and lifestyle. Some of the challenges for the delivery of precision medicine in oncology include biomarkers for patient selection for enrichment (precision diagnostics), mapping out the tumor heterogeneity that contributes to therapy failures, and early therapy assessment to identify resistance to therapies. PET/computed tomography offers solutions in these important areas of challenge and facilitates the implementation of precision medicine. Copyright © 2016 Elsevier Inc. All rights reserved.

  7. Simultaneous quantitative susceptibility mapping and Flutemetamol-PET suggests local correlation of iron and β-amyloid as an indicator of cognitive performance at high age.

    PubMed

    van Bergen, J M G; Li, X; Quevenco, F C; Gietl, A F; Treyer, V; Meyer, R; Buck, A; Kaufmann, P A; Nitsch, R M; van Zijl, P C M; Hock, C; Unschuld, P G

    2018-03-13

    The accumulation of β-amyloid plaques is a hallmark of Alzheimer's disease (AD), and recently published data suggest that increased brain iron burden may reflect pathologies that synergistically contribute to the development of cognitive dysfunction. While preclinical disease stages are considered most promising for therapeutic intervention, the link between emerging AD pathology and the earliest clinical symptoms remains largely unclear. In the current study we therefore investigated local correlations between iron and β-amyloid plaques, and their possible association with cognitive performance in healthy older adults. A total of 116 older adults (mean age 75 ± 7.4 years) received neuropsychological testing to calculate a composite cognitive score of performance in episodic memory, executive functioning, attention, language and communication. All participants were scanned on a combined PET-MRI instrument, with T1-weighted sequences acquired for anatomical mapping, quantitative susceptibility mapping (QSM) for assessing iron, and 18F-Flutemetamol-PET for estimating β-amyloid plaque load. Biological parametric mapping (BPM) was used to generate masks indicating voxels with significant (p < 0.05) correlation between susceptibility and 18F-Flutemetamol-SUVR. We found a bilateral pattern of clusters characterized by a statistical relationship between magnetic susceptibility and 18F-Flutemetamol-SUVR, indicating local correlations between iron and β-amyloid plaque deposition. For two bilateral clusters, located in the frontal and temporal cortex, significant relationships (p < 0.05) between local β-amyloid and the composite cognitive performance score could be observed. No relationship between whole-cortex β-amyloid plaque load and cognitive performance was observed. Our data suggest that the local correlation of β-amyloid plaque load and iron deposition may provide relevant information regarding cognitive performance of healthy older adults. 
Further studies are needed to clarify pathological correlates of the local interaction of β-amyloid, iron and other causes of altered magnetic susceptibility. Copyright © 2018 Elsevier Inc. All rights reserved.

  8. Comparison of manual and automatic techniques for substriatal segmentation in 11C-raclopride high-resolution PET studies.

    PubMed

    Johansson, Jarkko; Alakurtti, Kati; Joutsa, Juho; Tohka, Jussi; Ruotsalainen, Ulla; Rinne, Juha O

    2016-10-01

    The striatum is the primary target in regional 11C-raclopride-PET studies, and despite its small volume, it contains several functional and anatomical subregions. The outcome of a quantitative dopamine receptor study using 11C-raclopride-PET depends heavily on the quality of the region-of-interest (ROI) definition of these subregions. The aim of this study was to evaluate subregional analysis techniques because new approaches have emerged but have not yet been compared directly. In this paper, we compared manual ROI delineation with several automatic methods. The automatic methods used either direct clustering of the PET image or individualization of chosen brain atlases on the basis of MRI or PET image normalization. State-of-the-art normalization methods and atlases were applied, including those provided in the FreeSurfer, Statistical Parametric Mapping 8, and FSL software packages. Evaluation of the automatic methods was based on voxel-wise congruity with the manual delineations and on the test-retest variability and reliability of the outcome measures, using data from seven healthy male participants who were scanned twice with 11C-raclopride-PET on the same day. The results show that both manual and automatic methods can be used to define striatal subregions. Although most of the methods performed well with respect to the test-retest variability and reliability of binding potential, the smallest average test-retest variability and SEM were obtained using a connectivity-based atlas and PET normalization (test-retest variability = 4.5%, SEM = 0.17). The current state-of-the-art automatic ROI methods can be considered good alternatives to subjective and laborious manual segmentation in 11C-raclopride-PET studies.

  9. An experimental phantom study of the effect of gadolinium-based MR contrast agents on PET attenuation coefficients and PET quantification in PET-MR imaging: application to cardiac studies.

    PubMed

    O'Doherty, Jim; Schleyer, Paul

    2017-12-01

    Simultaneous cardiac perfusion studies are an increasing trend in PET-MR imaging. During dynamic PET imaging, the introduction of gadolinium-based MR contrast agents (GBCA) at high concentrations during a dual injection of GBCA and PET radiotracer may cause increased attenuation effects of the PET signal, and thus errors in quantification of PET images. We thus aimed to calculate the change in linear attenuation coefficient (LAC) of a mixture of PET radiotracer and increasing concentrations of GBCA in solution and furthermore, to investigate if this change in LAC produced a measurable effect on the image-based PET activity concentration when attenuation corrected by three different AC strategies. We performed simultaneous PET-MR imaging of a phantom in a static scenario using a fixed activity of 40 MBq [18F]-NaF, water, and an increasing GBCA concentration from 0 to 66 mM (based on an assumed maximum possible concentration of GBCA in the left ventricle in a clinical study). This simulated a range of clinical concentrations of GBCA. We investigated two methods to calculate the LAC of the solution mixture at 511 keV: (1) a mathematical mixture rule and (2) CT imaging of each concentration step and subsequent conversion to LAC at 511 keV. This comparison showed that the ranges of LAC produced by both methods are equivalent, with an increase in LAC of the mixed solution of approximately 2% over the range of 0-66 mM. We then employed three different attenuation correction methods to the PET data: (1) each PET scan at a specific millimolar concentration of GBCA corrected by its corresponding CT scan, (2) each PET scan corrected by a CT scan with no GBCA present (i.e., at 0 mM GBCA), and (3) a manually generated attenuation map, whereby all CT voxels in the phantom at 0 mM were replaced by LAC = 0.1 cm⁻¹. 
All attenuation correction methods (1-3) were accurate to the true measured activity concentration within 5%, and there were no trends in image-based activity concentrations upon increasing the GBCA concentration of the solution. The presence of high GBCA concentration (representing a worst-case scenario in dynamic cardiac studies) in solution with PET radiotracer produces a minimal effect on attenuation-corrected PET quantification.
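The mixture rule in method (1) is, in essence, a weighted combination of the component attenuation coefficients. A minimal Python sketch, assuming simple volume-fraction weighting and an illustrative (hypothetical) LAC for the GBCA stock; the paper's exact formulation may differ:

```python
def mixture_lac(volume_fractions, lacs):
    """Volume-fraction-weighted mixture rule for the linear attenuation
    coefficient (LAC) of a solution at 511 keV. A sketch only; the paper's
    exact mixture rule (e.g. mass-fraction weighting) may differ."""
    if abs(sum(volume_fractions) - 1.0) > 1e-9:
        raise ValueError("volume fractions must sum to 1")
    return sum(v * mu for v, mu in zip(volume_fractions, lacs))

mu_water = 0.096  # approximate LAC of water at 511 keV, in cm^-1
mu_gbca = 0.30    # hypothetical LAC of a concentrated GBCA stock solution
mu_mix = mixture_lac([0.99, 0.01], [mu_water, mu_gbca])
```

Even a small admixture of a denser contrast solution raises the mixture LAC only slightly, consistent with the ~2% change reported over 0-66 mM.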

  10. Central US earthquake catalog for hazard maps of Memphis, Tennessee

    USGS Publications Warehouse

    Wheeler, R.L.; Mueller, C.S.

    2001-01-01

    An updated version of the catalog that was used for the current national probabilistic seismic-hazard maps would suffice for production of large-scale hazard maps of the Memphis urban area. Deaggregation maps provide guidance as to the area that a catalog for calculating Memphis hazard should cover. For the future, the Nuttli and local network catalogs could be examined for earthquakes not presently included in the catalog. Additional work on aftershock removal might reduce hazard uncertainty. Graphs of decadal and annual earthquake rates suggest completeness at and above magnitude 3 for the last three or four decades. Any additional work on completeness should consider the effects of rapid, local population changes during the Nation's westward expansion. © 2001 Elsevier Science B.V. All rights reserved.

  11. Probabilistic Seismic Hazard Maps for Ecuador

    NASA Astrophysics Data System (ADS)

    Mariniere, J.; Beauval, C.; Yepes, H. A.; Laurence, A.; Nocquet, J. M.; Alvarado, A. P.; Baize, S.; Aguilar, J.; Singaucho, J. C.; Jomard, H.

    2017-12-01

    A probabilistic seismic hazard study is conducted for Ecuador, a country facing high seismic hazard, both from megathrust subduction earthquakes and from shallow crustal moderate-to-large earthquakes. Building on the knowledge produced in recent years on historical seismicity, earthquake catalogs, active tectonics, geodynamics, and geodesy, several alternative earthquake recurrence models are developed. An area source model is first proposed, based on the seismogenic crustal and inslab sources defined in Yepes et al. (2016). A slightly different segmentation is proposed for the subduction interface with respect to Yepes et al. (2016). Three earthquake catalogs are used to account for the numerous uncertainties in the modeling of frequency-magnitude distributions. The hazard maps obtained highlight several source zones enclosing fault systems that exhibit low seismic activity, not representative of the geological and/or geodetic slip rates. Consequently, a fault model is derived, including faults with an earthquake recurrence model inferred from geological and/or geodetic slip rate estimates. The geodetic slip rates on the set of simplified faults are estimated from a GPS horizontal velocity field (Nocquet et al. 2014). Assumptions on the aseismic component of the deformation are required. Combining these alternative earthquake models in a logic tree, and using a set of selected ground-motion prediction equations adapted to Ecuador's different tectonic contexts, a mean hazard map is obtained. Hazard maps corresponding to the 16th and 84th percentiles are also derived, highlighting the zones where uncertainties on the hazard are highest.
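The logic-tree combination described above, a weighted mean plus 16th/84th-percentile maps, can be sketched per site as follows. The branch curves, weights, and the use of simple unweighted percentiles are illustrative assumptions, not the study's actual inputs:

```python
import numpy as np

def combine_hazard_curves(curves, weights):
    """Weighted-mean hazard curve plus 16th/84th-percentile curves across
    logic-tree branches (minimal sketch; full PSHA codes typically use
    weighted percentiles, omitted here for brevity)."""
    curves = np.asarray(curves, dtype=float)   # (n_branches, n_intensity_levels)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                            # normalize branch weights
    mean = w @ curves                          # weighted mean exceedance rates
    p16 = np.percentile(curves, 16, axis=0)
    p84 = np.percentile(curves, 84, axis=0)
    return mean, p16, p84
```

Repeating this at every grid point yields the mean and percentile hazard maps.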

  12. Probabilistic mapping of urban flood risk: Application to extreme events in Surat, India

    NASA Astrophysics Data System (ADS)

    Ramirez, Jorge; Rajasekar, Umamaheshwaran; Coulthard, Tom; Keiler, Margreth

    2016-04-01

    Surat, India is a coastal city that lies on the banks of the river Tapti, downstream of the Ukai dam. Given Surat's geographic location, its population of five million is repeatedly exposed to flooding caused by high tide combined with large emergency dam releases into the Tapti river. In 2006 such a flood event occurred, when intense rainfall in the Tapti catchment caused a dam release near 25,000 m3 s-1 and flooded 90% of the city. A first step towards strengthening resilience in Surat requires a robust method for mapping potential flood risk that considers the uncertainty in future dam releases. In this study we develop many combinations of dam release magnitude and duration for the Ukai dam. We then use these dam releases to drive a two-dimensional flood model (CAESAR-Lisflood) of Surat that also considers tidal effects. Our flood model of Surat utilizes fine-spatial-resolution (30 m) topography produced from an extensive differential global positioning system survey and measurements of river cross-sections. Within the city we have modelled scenarios that include extreme conditions with near-maximum dam release levels (e.g., a 1:250-year flood) and high tides. Results from all scenarios have been summarized into probabilistic flood risk maps for Surat. These maps are currently being integrated into the city disaster management plan to support both mitigation and adaptation measures for different flooding scenarios.
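Summarizing an ensemble of scenario runs into a probabilistic flood map amounts to a weighted per-cell average of binary inundation outcomes. A minimal sketch, with hypothetical scenario weights:

```python
import numpy as np

def flood_probability_map(flood_maps, scenario_weights):
    """Per-cell flood probability from an ensemble of binary inundation
    maps (1 = flooded), weighted by scenario likelihood. A generic sketch,
    not the study's implementation."""
    maps = np.asarray(flood_maps, dtype=float)      # (n_scenarios, ny, nx)
    w = np.asarray(scenario_weights, dtype=float)
    w = w / w.sum()                                 # normalize weights
    return np.tensordot(w, maps, axes=1)            # (ny, nx) probabilities

# Two equally likely 2x2 scenario maps -> per-cell flood probabilities.
prob = flood_probability_map([[[1, 0], [1, 1]],
                              [[0, 0], [1, 0]]], [0.5, 0.5])
```

Cells flooded in every scenario get probability 1; cells flooded in none get 0.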

  13. Probabilistic Tsunami Hazard Assessment: the Seaside, Oregon Pilot Study

    NASA Astrophysics Data System (ADS)

    Gonzalez, F. I.; Geist, E. L.; Synolakis, C.; Titov, V. V.

    2004-12-01

    A pilot study of Seaside, Oregon is underway, to develop methodologies for probabilistic tsunami hazard assessments that can be incorporated into Flood Insurance Rate Maps (FIRMs) developed by FEMA's National Flood Insurance Program (NFIP). Current NFIP guidelines for tsunami hazard assessment rely on the science, technology and methodologies developed in the 1970s; although generally regarded as groundbreaking and state-of-the-art for its time, this approach is now superseded by modern methods that reflect substantial advances in tsunami research achieved in the last two decades. In particular, post-1990 technical advances include: improvements in tsunami source specification; improved tsunami inundation models; better computational grids by virtue of improved bathymetric and topographic databases; a larger database of long-term paleoseismic and paleotsunami records and short-term, historical earthquake and tsunami records that can be exploited to develop improved probabilistic methodologies; better understanding of earthquake recurrence and probability models. The NOAA-led U.S. National Tsunami Hazard Mitigation Program (NTHMP), in partnership with FEMA, USGS, NSF and Emergency Management and Geotechnical agencies of the five Pacific States, incorporates these advances into site-specific tsunami hazard assessments for coastal communities in Alaska, California, Hawaii, Oregon and Washington. NTHMP hazard assessment efforts currently focus on developing deterministic, "credible worst-case" scenarios that provide valuable guidance for hazard mitigation and emergency management. The NFIP focus, on the other hand, is on actuarial needs that require probabilistic hazard assessments such as those that characterize 100- and 500-year flooding events. There are clearly overlaps in NFIP and NTHMP objectives. 
NTHMP worst-case scenario assessments that include an estimated probability of occurrence could benefit the NFIP; NFIP probabilistic assessments of 100- and 500-yr events could benefit the NTHMP. The joint NFIP/NTHMP pilot study at Seaside, Oregon is organized into three closely related components: Probabilistic, Modeling, and Impact studies. Probabilistic studies (Geist, et al., this session) are led by the USGS and include the specification of near- and far-field seismic tsunami sources and their associated probabilities. Modeling studies (Titov, et al., this session) are led by NOAA and include the development and testing of a Seaside tsunami inundation model and an associated database of computed wave height and flow velocity fields. Impact studies (Synolakis, et al., this session) are led by USC and include the computation and analyses of indices for the categorization of hazard zones. The results of each component study will be integrated to produce a Seaside tsunami hazard map. This presentation will provide a brief overview of the project and an update on progress, while the above-referenced companion presentations will provide details on the methods used and the preliminary results obtained by each project component.

  14. Quick probabilistic binary image matching: changing the rules of the game

    NASA Astrophysics Data System (ADS)

    Mustafa, Adnan A. Y.

    2016-09-01

    A Probabilistic Matching Model for Binary Images (PMMBI) is presented that predicts the probability of matching binary images with any level of similarity. The model relates the number of mappings, the amount of similarity between the images and the detection confidence. We show the advantage of using a probabilistic approach to matching in similarity space as opposed to a linear search in size space. With PMMBI a complete model is available to predict the quick detection of dissimilar binary images. Furthermore, the similarity between the images can be measured to a good degree if the images are highly similar. PMMBI shows that only a few pixels need to be compared to detect dissimilarity between images, as low as two pixels in some cases. PMMBI is image size invariant; images of any size can be matched at the same quick speed. Near-duplicate images can also be detected without much difficulty. We present tests on real images that show the prediction accuracy of the model.
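The core idea, that a handful of random pixel comparisons suffices to detect dissimilarity, can be illustrated with a simple independence assumption (each compared pixel pair matches with probability equal to the image similarity). This is a sketch of the intuition, not the paper's exact model:

```python
import math

def pixels_needed(similarity, confidence):
    """Number of random pixel comparisons needed to detect, with the given
    confidence, that two binary images differ, assuming each comparison
    independently matches with probability `similarity`. Detection
    probability after k comparisons is 1 - similarity**k."""
    if not (0.0 < similarity < 1.0 and 0.0 < confidence < 1.0):
        raise ValueError("similarity and confidence must lie in (0, 1)")
    return math.ceil(math.log(1.0 - confidence) / math.log(similarity))
```

For images that are only 50% similar, seven comparisons already give 99% confidence of detecting a mismatch; highly dissimilar images need only a couple, matching the paper's "as low as two pixels" observation.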

  15. Probabilistic Hazard Estimation at a Densely Urbanised Area: the Neaples Volcanoes

    NASA Astrophysics Data System (ADS)

    de Natale, G.; Mastrolorenzo, G.; Panizza, A.; Pappalardo, L.; Claudia, T.

    2005-12-01

    The Neapolitan volcanic area (Southern Italy), including Vesuvius, the Campi Flegrei caldera, and Ischia island, is the highest-risk volcanic area in the world: more than 2 million people live within about 10 km of an active volcanic vent. Such extreme risk calls for accurate methodologies to quantify it probabilistically, considering all the available volcanological information as well as modelling results. Simple hazard maps based on the observation of deposits from past eruptions suffer from the major problem that eruptive history generally samples a very limited number of possible outcomes, and are thus of little use for estimating event probabilities in the area. This work describes a methodology making the best use (from a Bayesian point of view) of volcanological data and modelling results to compute probabilistic hazard maps for multi-vent explosive eruptions. The method, which follows an approach recently developed by the same authors for pyroclastic flow hazard, has been improved and extended here to compute fall-out hazard as well. The application of the method to the Neapolitan volcanic area, including the densely populated city of Naples, yields, for the first time, a global picture of the areal distribution of the main hazards from multi-vent explosive eruptions. From a joint consideration of the hazard contributions of all three volcanic areas, new insight into the volcanic hazard distribution emerges, with strong implications for urban and emergency planning in the area.
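Jointly considering several sources can be illustrated by the standard combination rule for independent sources, an independence assumption that the paper's Bayesian treatment refines:

```python
def combined_exceedance(p_sources):
    """Probability that at least one of several volcanic sources produces
    a hazard exceedance at a site, assuming independence between sources:
    P = 1 - prod(1 - p_i). An illustrative simplification only."""
    q = 1.0
    for p in p_sources:
        q *= 1.0 - p
    return 1.0 - q

# e.g. three sources contributing exceedance probabilities at one map cell
p_total = combined_exceedance([0.1, 0.2])
```

Applying this cell by cell to per-source hazard maps gives a combined multi-source map.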

  16. A Framework for Quantitative Assessment of Impacts Related to Energy and Mineral Resource Development

    DOE PAGES

    Haines, Seth S.; Diffendorfer, Jay E.; Balistrieri, Laurie; ...

    2013-05-15

    Natural resource planning at all scales demands methods for assessing the impacts of resource development and use, and in particular it requires standardized methods that yield robust and unbiased results. Building from existing probabilistic methods for assessing the volumes of energy and mineral resources, we provide an algorithm for consistent, reproducible, quantitative assessment of resource development impacts. The approach combines probabilistic input data with Monte Carlo statistical methods to determine probabilistic outputs that convey the uncertainties inherent in the data. For example, one can utilize our algorithm to combine data from a natural gas resource assessment with maps of sage grouse leks and piñon-juniper woodlands in the same area to estimate possible future habitat impacts due to possible future gas development. As another example: one could combine geochemical data and maps of lynx habitat with data from a mineral deposit assessment in the same area to determine possible future mining impacts on water resources and lynx habitat. The approach can be applied to a broad range of positive and negative resource development impacts, such as water quantity or quality, economic benefits, or air quality, limited only by the availability of necessary input data and quantified relationships among geologic resources, development alternatives, and impacts. In conclusion, the framework enables quantitative evaluation of the trade-offs inherent in resource management decision-making, including cumulative impacts, to address societal concerns and policy aspects of resource development.
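The probabilistic-inputs, Monte Carlo, probabilistic-outputs pipeline can be sketched generically as follows; the samplers, trial count, and percentile convention are hypothetical, not the authors' assessment inputs:

```python
import random

def monte_carlo_impact(n_trials, sample_volume, sample_impact_per_unit, seed=0):
    """Monte Carlo propagation of probabilistic inputs (e.g. a resource
    volume distribution and an impact-per-unit distribution) into a
    probabilistic impact estimate. A generic sketch of the algorithm."""
    rng = random.Random(seed)
    draws = sorted(sample_volume(rng) * sample_impact_per_unit(rng)
                   for _ in range(n_trials))
    def percentile(p):
        return draws[min(n_trials - 1, int(p / 100.0 * n_trials))]
    return {"P10": percentile(10), "P50": percentile(50), "P90": percentile(90)}

# Hypothetical uniform input distributions, for illustration only.
result = monte_carlo_impact(
    10000,
    lambda rng: rng.uniform(50.0, 150.0),   # resource volume
    lambda rng: rng.uniform(0.01, 0.03))    # impact per unit volume
```

The output percentiles convey the input uncertainties rather than a single deterministic impact number.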

  17. A framework for quantitative assessment of impacts related to energy and mineral resource development

    USGS Publications Warehouse

    Haines, Seth S.; Diffendorfer, James; Balistrieri, Laurie S.; Berger, Byron R.; Cook, Troy A.; Gautier, Donald L.; Gallegos, Tanya J.; Gerritsen, Margot; Graffy, Elisabeth; Hawkins, Sarah; Johnson, Kathleen; Macknick, Jordan; McMahon, Peter; Modde, Tim; Pierce, Brenda; Schuenemeyer, John H.; Semmens, Darius; Simon, Benjamin; Taylor, Jason; Walton-Day, Katherine

    2013-01-01

    Natural resource planning at all scales demands methods for assessing the impacts of resource development and use, and in particular it requires standardized methods that yield robust and unbiased results. Building from existing probabilistic methods for assessing the volumes of energy and mineral resources, we provide an algorithm for consistent, reproducible, quantitative assessment of resource development impacts. The approach combines probabilistic input data with Monte Carlo statistical methods to determine probabilistic outputs that convey the uncertainties inherent in the data. For example, one can utilize our algorithm to combine data from a natural gas resource assessment with maps of sage grouse leks and piñon-juniper woodlands in the same area to estimate possible future habitat impacts due to possible future gas development. As another example: one could combine geochemical data and maps of lynx habitat with data from a mineral deposit assessment in the same area to determine possible future mining impacts on water resources and lynx habitat. The approach can be applied to a broad range of positive and negative resource development impacts, such as water quantity or quality, economic benefits, or air quality, limited only by the availability of necessary input data and quantified relationships among geologic resources, development alternatives, and impacts. The framework enables quantitative evaluation of the trade-offs inherent in resource management decision-making, including cumulative impacts, to address societal concerns and policy aspects of resource development.

  18. Multimodal correlation of dynamic [18F]-AV-1451 perfusion PET and neuronal hypometabolism in [18F]-FDG PET.

    PubMed

    Hammes, Jochen; Leuwer, Isabel; Bischof, Gérard N; Drzezga, Alexander; van Eimeren, Thilo

    2017-12-01

    Cerebral glucose metabolism measured with [18F]-FDG PET is a well-established marker of neuronal dysfunction in neurodegeneration. The tau-protein PET tracer [18F]-AV-1451 is currently under evaluation and shows promising results. Here, we assess the feasibility of early perfusion imaging with AV-1451 as a substitute for FDG PET in assessing neuronal injury. Twenty patients with suspected neurodegeneration underwent FDG and early-phase AV-1451 PET imaging. Ten one-minute timeframes were acquired after application of 200 MBq AV-1451. FDG images were acquired on a different date according to clinical protocol. Early AV-1451 timeframes were coregistered to individual FDG scans and spatially normalized. Voxel-wise intermodal correlations were calculated at the within-subject level for every possible time window. The window with the highest pooled correlation was considered optimal. Z-transformed deviation maps (ZMs) were created from both FDG and early AV-1451 images by comparison against FDG images of healthy controls. Regional patterns and extent of perfusion deficits were highly comparable to metabolic deficits. Best results were observed in a time window from 60 to 360 s (r = 0.86). Correlation strength ranged from r = 0.96 (subcortical gray matter) to 0.83 (frontal lobe) in regional analysis. ZMs of early AV-1451 and FDG images were highly similar. Perfusion imaging with AV-1451 is a valid biomarker for the assessment of neuronal dysfunction in neurodegenerative diseases. Radiation exposure and the complexity of the diagnostic workup could be reduced significantly by routine acquisition of early AV-1451 images, sparing an additional FDG PET.
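A Z-transformed deviation map of this kind is computed voxel-wise against the control group's mean and standard deviation. A minimal sketch, assuming spatial normalization and intensity scaling are already done:

```python
import numpy as np

def z_deviation_map(subject_img, control_imgs):
    """Voxel-wise Z-transformed deviation map of a subject image against
    a group of control images: z = (subject - mean) / sd."""
    controls = np.asarray(control_imgs, dtype=float)
    mu = controls.mean(axis=0)
    sd = controls.std(axis=0, ddof=1)   # sample standard deviation
    return (np.asarray(subject_img, dtype=float) - mu) / sd
```

Voxels with large negative z indicate hypoperfusion (or hypometabolism) relative to controls.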

  19. Activity-based costing evaluation of a [(18)F]-fludeoxyglucose positron emission tomography study.

    PubMed

    Krug, Bruno; Van Zanten, Annie; Pirson, Anne-Sophie; Crott, Ralph; Borght, Thierry Vander

    2009-10-01

    The aim of the study is to use the activity-based costing approach to gain better insight into the actual cost structure of a positron emission tomography procedure (FDG-PET) by defining its constituent components and by simulating the impact of possible resource or practice changes. The cost data were obtained from the hospital administration, personnel and vendor interviews, as well as from structured questionnaires. A process map separates the process into 16 patient- and non-patient-related activities, to which the detailed cost data are related. One-way sensitivity analyses show the degree to which uncertainty in the different parameters affects the individual cost, and evaluate the impact of possible resource or practice changes such as the acquisition of a hybrid PET/CT device, the patient throughput, or the sales price of a 370 MBq (18)F-FDG patient dose. The PET centre spends 73% of its time on clinical activities, and the resting time after injection of the tracer (42%) is the single largest departmental cost element. The tracer cost and the operational time have the most influence on the cost per procedure. The analysis shows a total cost per FDG-PET ranging from 859 Euro for a BGO PET camera to 1142 Euro for a 16-slice PET/CT system, with a distribution of the resource costs in decreasing order: materials (44%), equipment (24%), wages (16%), hospital overhead (10%), and space (6%). The cost of FDG-PET is mainly influenced by the cost of the radiopharmaceutical. Therefore, the latter rather than the operational time should be reduced in order to improve its cost-effectiveness.
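The cost roll-up behind such an analysis can be sketched as follows; the resource figures and procedure volume below are illustrative only, chosen to echo the reported percentage shares, and the full method first traces costs through the 16 activities:

```python
def cost_per_procedure(annual_resource_costs, n_procedures):
    """Activity-based costing rollup: total annual resource cost divided
    by annual procedure volume, plus each resource's share of the total.
    A simplified sketch of the final aggregation step."""
    total = float(sum(annual_resource_costs.values()))
    shares = {k: v / total for k, v in annual_resource_costs.items()}
    return total / n_procedures, shares

# Hypothetical annual costs (Euro) mirroring the reported share ordering.
unit_cost, shares = cost_per_procedure(
    {"materials": 440000, "equipment": 240000, "wages": 160000,
     "overhead": 100000, "space": 60000},
    n_procedures=1000)
```

Sensitivity analysis then amounts to re-running this rollup with one input varied at a time.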

  20. Preliminary research on 1-(4-bromo-2-nitroimidazol-1-yl)-3-[(18)F]fluoropropan-2-ol as a novel brain hypoxia PET tracer in a rodent model of stroke.

    PubMed

    Nieto, Elena; Delgado, Mercedes; Sobrado, Mónica; de Ceballos, María L; Alajarín, Ramón; García-García, Luis; Kelly, James; Lizasoain, Ignacio; Pozo, Miguel A; Álvarez-Builla, Julio

    2015-08-28

    The synthesis of the new radiotracer precursor 4-Br-NITTP and the radiolabeling of the new tracer 1-(4-bromo-2-nitroimidazol-1-yl)-3-[(18)F]fluoropropan-2-ol (4-Br-[(18)F]FMISO) are reported. The cyclic voltammetry behaviour, neuronal cell toxicity, transport through a brain endothelial cell monolayer, in vivo PET imaging, and preliminary calculations of tracer uptake in a rodent model of stroke were studied for the new compound, and the results were compared to those obtained with [(18)F]FMISO, the current gold-standard PET hypoxia tracer. The new PET brain hypoxia tracer is more easily reduced, has a higher CLogP than [(18)F]FMISO, and diffuses more rapidly through brain endothelial cells. The new compound is non-toxic to neuronal cells and allows the in vivo mapping of stroke in mice with higher sensitivity. 4-Br-[(18)F]FMISO is a good candidate for further development in ischemic stroke. Copyright © 2015 Elsevier Masson SAS. All rights reserved.

  1. Partial volume correction and image analysis methods for intersubject comparison of FDG-PET studies

    NASA Astrophysics Data System (ADS)

    Yang, Jun

    2000-12-01

    The partial volume effect is an artifact mainly due to limited imaging sensor resolution. It biases the measured activity in small structures and around tissue boundaries. In brain FDG-PET studies, and especially in Alzheimer's disease studies where there is serious gray matter atrophy, accurate estimation of the cerebral metabolic rate of glucose is even more problematic because of the large partial volume effect. In this dissertation, we developed a framework enabling inter-subject comparison of partial-volume-corrected brain FDG-PET studies. The framework is composed of the following image processing steps: (1) MRI segmentation, (2) MR-PET registration, (3) MR-based PVE correction, and (4) MR 3D inter-subject elastic mapping. Through simulation studies, we showed that the newly developed partial volume correction methods, whether pixel based or ROI based, performed better than previous methods. By applying this framework to a real Alzheimer's disease study, we demonstrated that the partial-volume-corrected glucose rates vary significantly among the control, at-risk, and disease patient groups, and that this framework is a promising tool for assisting early identification of Alzheimer's patients.
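ROI-based partial volume correction is often formulated as inverting a mixing (geometric transfer) matrix: observed ROI means are a resolution-blurred mixture of the true regional activities. A minimal sketch of that general idea, not the dissertation's specific method:

```python
import numpy as np

def gtm_pvc(observed_roi_means, gtm):
    """ROI-based partial volume correction in the spirit of the geometric
    transfer matrix (GTM) approach: observed = GTM @ true, so the true
    activities are recovered by solving the linear system."""
    return np.linalg.solve(np.asarray(gtm, dtype=float),
                           np.asarray(observed_roi_means, dtype=float))

# Illustrative 2-ROI mixing matrix: 20% of ROI-2 signal spills into ROI-1, etc.
gtm = [[0.8, 0.2],
       [0.1, 0.9]]
corrected = gtm_pvc([8.4, 2.8], gtm)   # recovers true activities [10, 2]
```

In practice the GTM entries are derived from the segmented MRI convolved with the scanner's point-spread function.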

  2. Activity of Tachykinin1-Expressing Pet1 Raphe Neurons Modulates the Respiratory Chemoreflex

    PubMed Central

    Corcoran, Andrea E.; Brust, Rachael D.; Chang, YoonJeung; Nattie, Eugene E.

    2017-01-01

    Homeostatic control of breathing, heart rate, and body temperature relies on circuits within the brainstem modulated by the neurotransmitter serotonin (5-HT). Mounting evidence points to specialized neuronal subtypes within the serotonergic neuronal system, borne out in functional studies, for the modulation of distinct facets of homeostasis. Such functional differences, read out at the organismal level, are likely subserved by differences among 5-HT neuron subtypes at the cellular and molecular levels, including differences in the capacity to coexpress other neurotransmitters such as glutamate, GABA, thyrotropin releasing hormone, and substance P encoded by the Tachykinin-1 (Tac1) gene. Here, we characterize in mice a 5-HT neuron subtype identified by expression of Tac1 and the serotonergic transcription factor gene Pet1, referred to as the Tac1-Pet1 neuron subtype. Transgenic cell labeling showed Tac1-Pet1 soma resident largely in the caudal medulla. Chemogenetic [clozapine-N-oxide (CNO)-hM4Di] perturbation of Tac1-Pet1 neuron activity blunted the ventilatory response of the respiratory CO2 chemoreflex, which normally augments ventilation in response to hypercapnic acidosis to restore normal pH and PCO2. Tac1-Pet1 axonal boutons were found localized to brainstem areas implicated in respiratory modulation, with highest density in motor regions. These findings demonstrate that the activity of a Pet1 neuron subtype with the potential to release both 5-HT and substance P is necessary for normal respiratory dynamics, perhaps via motor outputs that engage muscles of respiration and maintain airway patency. These Tac1-Pet1 neurons may act downstream of Egr2-Pet1 serotonergic neurons, which were previously established in respiratory chemoreception, but do not innervate respiratory motor nuclei. SIGNIFICANCE STATEMENT Serotonin (5-HT) neurons modulate physiological processes and behaviors as diverse as body temperature, respiration, aggression, and mood. 
Using genetic tools, we characterize a 5-HT neuron subtype defined by expression of Tachykinin1 and Pet1 (Tac1-Pet1 neurons), mapping soma localization to the caudal medulla primarily and axonal projections to brainstem motor nuclei most prominently, and, when silenced, observed blunting of the ventilatory response to inhaled CO2. Tac1-Pet1 neurons thus appear distinct from and contrast previously described Egr2-Pet1 neurons, which project primarily to chemosensory integration centers and are themselves chemosensitive. PMID:28073937

  3. Probabilistic mapping of descriptive health status responses onto health state utilities using Bayesian networks: an empirical analysis converting SF-12 into EQ-5D utility index in a national US sample.

    PubMed

    Le, Quang A; Doctor, Jason N

    2011-05-01

    As quality-adjusted life years have become the standard metric in health economic evaluations, mapping health-profile or disease-specific measures onto preference-based measures to obtain quality-adjusted life years has become a solution when health utilities are not directly available. However, current mapping methods are limited due to their predictive validity, reliability, and/or other methodological issues. We employ probability theory together with a graphical model, called a Bayesian network, to convert health-profile measures into preference-based measures and to compare the results to those estimated with current mapping methods. A sample of 19,678 adults who completed both the 12-item Short Form Health Survey (SF-12v2) and EuroQoL 5D (EQ-5D) questionnaires from the 2003 Medical Expenditure Panel Survey was split into training and validation sets. Bayesian networks were constructed to explore the probabilistic relationships between each EQ-5D domain and 12 items of the SF-12v2. The EQ-5D utility scores were estimated on the basis of the predicted probability of each response level of the 5 EQ-5D domains obtained from the Bayesian inference process using the following methods: Monte Carlo simulation, expected utility, and most-likely probability. Results were then compared with current mapping methods including multinomial logistic regression, ordinary least squares, and censored least absolute deviations. The Bayesian networks consistently outperformed other mapping models in the overall sample (mean absolute error=0.077, mean square error=0.013, and R overall=0.802), in different age groups, number of chronic conditions, and ranges of the EQ-5D index. Bayesian networks provide a new robust and natural approach to map health status responses into health utility measures for health economic evaluations.
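The expected-utility step, converting predicted response-level probabilities into a utility score, can be sketched with an additive decrement table. The decrement values and domains below are illustrative only; real EQ-5D tariffs cover five domains and include interaction terms:

```python
def expected_eq5d_utility(level_probs, decrements, full_health=1.0):
    """Expected-utility scoring from Bayesian-network outputs: predicted
    probabilities of each response level in each domain weight a table of
    utility decrements. Additive scoring is a simplification."""
    u = full_health
    for domain, probs in level_probs.items():
        u -= sum(p * d for p, d in zip(probs, decrements[domain]))
    return u

# Hypothetical predicted level probabilities and decrement tables.
u = expected_eq5d_utility(
    {"mobility": [0.7, 0.2, 0.1], "pain": [0.5, 0.4, 0.1]},
    {"mobility": [0.0, 0.069, 0.314], "pain": [0.0, 0.123, 0.386]})
```

The Monte Carlo and most-likely-probability estimators mentioned in the abstract differ only in how the predicted level probabilities are turned into a score.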

  4. Positron emission tomography assessment of 8-OH-DPAT-mediated changes in an index of cerebral glucose metabolism in female marmosets

    PubMed Central

    Converse, Alexander K.; Aubert, Yves; Farhoud, Mohammed; Weichert, Jamey P.; Rowland, Ian J.; Ingrisano, Nicole M.; Allers, Kelly A.; Sommer, Bernd; Abbott, David H.

    2013-01-01

    As part of a larger experiment investigating serotonergic regulation of female marmoset sexual behavior, this study was designed to (1) advance methods for PET imaging of common marmoset monkey brain, (2) measure normalized FDG uptake as an index of local cerebral metabolic rates for glucose, and (3) study changes induced in this index of cerebral glucose metabolism by chronic treatment of female marmosets with a serotonin 1A receptor (5-HT1A) agonist. We hypothesized that chronic treatment with the 5-HT1A agonist 8-OH-DPAT would alter the glucose metabolism index in dorsal raphe (DR), medial prefrontal cortex (mPFC), medial preoptic area of hypothalamus (mPOA), ventromedial nucleus of hypothalamus (VMH), and field CA1 of hippocampus. Eight adult ovariectomized female common marmosets (Callithrix jacchus) were studied with and without estradiol replacement. In a crossover design, each subject was treated daily with 8-OH-DPAT (0.1 mg/kg SC daily) or saline. After 42–49 days of treatment, the glucose metabolism radiotracer FDG was administered to each female immediately prior to 30 min of interaction with her male pairmate, after which the subject was anesthetized and imaged by PET. Whole brain normalized PET images were analyzed with anatomically defined regions of interest (ROI). Whole brain voxelwise mapping was also used to explore treatment effects and correlations between alterations in the glucose metabolism index and pairmate interactions. The rank order of normalized FDG uptake was VMH/mPOA>DR>mPFC/CA1 in both conditions. 8-OH-DPAT did not induce alterations in the glucose metabolism index in ROIs. Voxelwise mapping showed a significant reduction in normalized FDG uptake in response to 8-OH-DPAT in a cluster in medial occipital cortex as well as a significant correlation between increased rejection of mount attempts and reduced normalized FDG uptake in an overlapping cluster. 
In conclusion, PET imaging has been used to measure FDG uptake relative to whole brain in marmoset monkeys. Voxelwise mapping shows that 8-OH-DPAT reduces this index of glucose metabolism in medial occipital cortex, consistent with alterations in female sexual behavior. PMID:22233732

  5. Spectral Clustering Predicts Tumor Tissue Heterogeneity Using Dynamic 18F-FDG PET: A Complement to the Standard Compartmental Modeling Approach.

    PubMed

    Katiyar, Prateek; Divine, Mathew R; Kohlhofer, Ursula; Quintanilla-Martinez, Leticia; Schölkopf, Bernhard; Pichler, Bernd J; Disselhorst, Jonathan A

    2017-04-01

    In this study, we described and validated an unsupervised segmentation algorithm for the assessment of tumor heterogeneity using dynamic 18F-FDG PET. The aim of our study was to objectively evaluate the proposed method and make comparisons with compartmental modeling parametric maps and SUV segmentations using simulations of clinically relevant tumor tissue types. Methods: An irreversible 2-tissue-compartmental model was implemented to simulate clinical and preclinical 18F-FDG PET time-activity curves using population-based arterial input functions (80 clinical and 12 preclinical) and the kinetic parameter values of 3 tumor tissue types. The simulated time-activity curves were corrupted with different levels of noise and used to calculate the tissue-type misclassification errors of spectral clustering (SC), parametric maps, and SUV segmentation. The utility of the inverse noise variance- and Laplacian score-derived frame weighting schemes before SC was also investigated. Finally, the SC scheme with the best results was tested on a dynamic 18F-FDG measurement of a mouse bearing subcutaneous colon cancer and validated using histology. Results: In the preclinical setup, the inverse noise variance-weighted SC exhibited the lowest misclassification errors (8.09%-28.53%) at all noise levels, in contrast to the Laplacian score-weighted SC (16.12%-31.23%), unweighted SC (25.73%-40.03%), parametric maps (28.02%-61.45%), and SUV (45.49%-45.63%) segmentation. The classification efficacy of both weighted SC schemes in the clinical case was comparable to the unweighted SC. When applied to the dynamic 18F-FDG measurement of colon cancer, the proposed algorithm accurately identified densely vascularized regions from the rest of the tumor. In addition, the segmented regions and clusterwise average time-activity curves showed excellent correlation with the tumor histology.
Conclusion: The promising results of SC mark its position as a robust tool for quantification of tumor heterogeneity using dynamic PET studies. Because SC tumor segmentation is based on the intrinsic structure of the underlying data, it can be easily applied to other cancer types as well. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.
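The frame-weighting idea, scaling each timeframe before clustering (e.g. by inverse noise variance), can be sketched with a minimal two-cluster spectral step. The affinity construction, weights, and parameters here are illustrative assumptions, and the published pipeline handles more clusters and a richer workflow:

```python
import numpy as np

def weighted_spectral_bipartition(tacs, frame_weights, sigma=1.0):
    """Two-cluster spectral clustering of time-activity curves (TACs)
    with per-frame weighting. Minimal sketch: Gaussian affinity,
    unnormalized graph Laplacian, sign of the Fiedler vector."""
    X = np.asarray(tacs, dtype=float) * np.asarray(frame_weights, dtype=float)
    sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    affinity = np.exp(-sq_dists / (2.0 * sigma ** 2))
    laplacian = np.diag(affinity.sum(axis=1)) - affinity
    _, eigvecs = np.linalg.eigh(laplacian)      # eigenvalues ascending
    fiedler = eigvecs[:, 1]                     # second-smallest eigenvector
    return (fiedler > 0).astype(int)            # cluster labels 0/1

# Four voxel TACs forming two kinetic groups; the last frame is downweighted
# as if it were noisier (hypothetical inverse-noise-variance weights).
labels = weighted_spectral_bipartition(
    [[0.0, 0.0, 0.0], [0.2, 0.0, 0.0], [2.0, 2.0, 2.0], [2.0, 2.0, 2.2]],
    [1.0, 1.0, 0.5])
```

Downweighting noisy frames shrinks their contribution to the pairwise distances, which is the mechanism behind the reported error reduction.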

  6. PET image reconstruction: a robust state space approach.

    PubMed

    Liu, Huafeng; Tian, Yi; Shi, Pengcheng

    2005-01-01

    Statistical iterative reconstruction algorithms have shown improved image quality over conventional nonstatistical methods in PET by using accurate system response models and measurement noise models. Strictly speaking, however, PET measurements, pre-corrected for accidental coincidences, are neither Poisson nor Gaussian distributed and thus do not meet the basic assumptions of these algorithms. In addition, the difficulty in determining the proper system response model also greatly affects the quality of the reconstructed images. In this paper, we explore the use of state space principles for the estimation of the activity map in tomographic PET imaging. The proposed strategy formulates the organ activity distribution through tracer kinetics models and the photon-counting measurements through observation equations, thus making it possible to unify the dynamic and static reconstruction problems into a general framework. Further, it coherently treats the uncertainties of the statistical model of the imaging system and the noisy nature of the measurement data. Since the H(infinity) filter seeks minimax-error estimates without any assumptions on the system and data noise statistics, it is particularly suited for PET image reconstruction, where the statistical properties of the measurement data and the system model are very complicated. The performance of the proposed framework is evaluated using Shepp-Logan simulated phantom data and real phantom data, with favorable results.

  7. "Magnetic resonance imaging negative positron emission tomography positive" temporal lobe epilepsy: FDG-PET pattern differs from mesial temporal lobe epilepsy.

    PubMed

    Carne, R P; Cook, M J; MacGregor, L R; Kilpatrick, C J; Hicks, R J; O'Brien, T J

    2007-01-01

Some patients with temporal lobe epilepsy (TLE) lack evidence of hippocampal sclerosis (HS) on MRI (HS-ve). We hypothesized that this group would have a different pattern of 2-deoxy-2-[F-18]fluoro-D-glucose (FDG)-positron emission tomography (PET) hypometabolism than typical mesial TLE/HS patients with evidence of hippocampal atrophy on magnetic resonance imaging (MRI) (HS+ve), with a lateral temporal neocortical rather than mesial focus. Thirty consecutive HS-ve patients and 30 age- and sex-matched HS+ve patients with well-lateralized EEG were identified. FDG-PET was performed on 28 HS-ve patients and 24 HS+ve patients. Both groups were compared using statistical parametric mapping (SPM), directly and against FDG-PET from 20 healthy controls. Both groups showed lateralized temporal hypometabolism compared to controls. In HS+ve, this was antero-infero-mesial (T = 17.13); in HS-ve, the main clustering was inferolateral (T = 17.63). When directly compared, HS+ve had greater hypometabolism in mesial temporal/hippocampal regions (T = 4.86), whereas HS-ve had greater inferolateral temporal hypometabolism (T = 4.18). These data support the hypothesis that focal hypometabolism primarily involves lateral neocortical rather than mesial temporal structures in 'MRI-negative PET-positive' TLE.

  8. Simultaneous reconstruction of emission activity and attenuation coefficient distribution from TOF data, acquired with external transmission source

    NASA Astrophysics Data System (ADS)

    Panin, V. Y.; Aykac, M.; Casey, M. E.

    2013-06-01

    The simultaneous PET data reconstruction of emission activity and attenuation coefficient distribution is presented, where the attenuation image is constrained by exploiting an external transmission source. Data are acquired in time-of-flight (TOF) mode, allowing in principle for separation of emission and transmission data. Nevertheless, here all data are reconstructed at once, eliminating the need to trace the position of the transmission source in sinogram space. Contamination of emission data by the transmission source and vice versa is naturally modeled. Attenuated emission activity data also provide additional information about object attenuation coefficient values. The algorithm alternates between attenuation and emission activity image updates. We also proposed a method of estimation of spatial scatter distribution from the transmission source by incorporating knowledge about the expected range of attenuation map values. The reconstruction of experimental data from the Siemens mCT scanner suggests that simultaneous reconstruction improves attenuation map image quality, as compared to when data are separated. In the presented example, the attenuation map image noise was reduced and non-uniformity artifacts that occurred due to scatter estimation were suppressed. On the other hand, the use of transmission data stabilizes attenuation coefficient distribution reconstruction from TOF emission data alone. The example of improving emission images by refining a CT-based patient attenuation map is presented, revealing potential benefits of simultaneous CT and PET data reconstruction.

  9. Automatic Detection of Lung and Liver Lesions in 3-D Positron Emission Tomography Images: A Pilot Study

    NASA Astrophysics Data System (ADS)

    Lartizien, Carole; Marache-Francisco, Simon; Prost, Rémy

    2012-02-01

    Positron emission tomography (PET) using fluorine-18 deoxyglucose (18F-FDG) has become an increasingly recommended tool in clinical whole-body oncology imaging for the detection, diagnosis, and follow-up of many cancers. One way to improve the diagnostic utility of PET oncology imaging is to assist physicians facing difficult cases of residual or low-contrast lesions. This study aimed at evaluating different schemes of computer-aided detection (CADe) systems for the guided detection and localization of small and low-contrast lesions in PET. These systems are based on two supervised classifiers, linear discriminant analysis (LDA) and the nonlinear support vector machine (SVM). The image feature sets that serve as input data consisted of the coefficients of an undecimated wavelet transform. An optimization study was conducted to select the best combination of parameters for both the SVM and the LDA. Different false-positive reduction (FPR) methods were evaluated to reduce the number of false-positive detections per image (FPI). This includes the removal of small detected clusters and the combination of the LDA and SVM detection maps. The different CAD schemes were trained and evaluated based on a simulated whole-body PET image database containing 250 abnormal cases with 1230 lesions and 250 normal cases with no lesion. The detection performance was measured on a separate series of 25 testing images with 131 lesions. The combination of the LDA and SVM score maps was shown to produce very encouraging detection performance for both the lung lesions, with 91% sensitivity and 18 FPIs, and the liver lesions, with 94% sensitivity and 10 FPIs. Comparison with human performance indicated that the different CAD schemes significantly outperformed human detection sensitivities, especially regarding the low-contrast lesions.
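The score-map combination step in this CADe study can be sketched with off-the-shelf classifiers. In the toy example below, synthetic features stand in for the undecimated-wavelet coefficients, and the standardized decision scores of an LDA and an RBF-kernel SVM are averaged before thresholding; the data, parameters, and thresholds are invented for illustration, not those of the study.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic two-class data standing in for wavelet-coefficient features
# around lesion vs. non-lesion voxels.
X, y = make_classification(n_samples=600, n_features=10, n_informative=5,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Train the two supervised classifiers used in the study's CADe schemes.
lda = LinearDiscriminantAnalysis().fit(X_tr, y_tr)
svm = SVC(kernel="rbf", gamma="scale").fit(X_tr, y_tr)

def zscore(s):
    return (s - s.mean()) / s.std()

# Combine the two "score maps" by averaging standardized decision scores.
combined = 0.5 * (zscore(lda.decision_function(X_te)) +
                  zscore(svm.decision_function(X_te)))
pred = (combined > 0).astype(int)
accuracy = (pred == y_te).mean()
print(f"combined accuracy: {accuracy:.2f}")
```

In the study the analogous combination is applied voxel-wise to produce a joint detection map, followed by false-positive reduction such as removing small detected clusters.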

  10. Altered striatal circuits underlie characteristic personality traits in Parkinson's disease.

    PubMed

    Ishii, Toru; Sawamoto, Nobukatsu; Tabu, Hayato; Kawashima, Hidekazu; Okada, Tomohisa; Togashi, Kaori; Takahashi, Ryosuke; Fukuyama, Hidenao

    2016-09-01

Patients with Parkinson's disease (PD) have been suggested to share personality traits characterised by low novelty-seeking and high harm-avoidance. Although a link between novelty-seeking and dopamine has been hypothesised, it is not fully supported by 6-[(18)F]fluoro-L-dopa positron emission tomography (PET) studies. Meanwhile, tractography studies with magnetic resonance imaging (MRI) link personality to the connectivity of the striatum in healthy subjects. Here, we investigated neurochemical and anatomical correlates of characteristic personality traits in PD. Sixteen PD patients and 28 healthy controls were assessed using the Temperament and Character Inventory. All patients and 17 randomly selected controls were scanned with 2β-carbomethoxy-3β-(4-fluorophenyl)-[N-(11)C-methyl]tropane ([(11)C]CFT) PET to measure striatal dopamine transporter availability. All subjects were scanned with MRI to evaluate the connectivity of the striatum using probabilistic tractography. PET findings revealed no correlation of novelty-seeking or harm-avoidance with [(11)C]CFT uptake in patients or controls. Novelty-seeking correlated positively with the connectivity strength of the striatum with the hippocampus and amygdala in both patients and controls. Harm-avoidance correlated with the fibre connectivity strength of the striatum (including its ventral area) with the amygdala negatively in patients and positively in controls, a difference that was significant between the groups. Our data support the notion that the fibre connectivity of the striatum with limbic and frontal areas underlies the personality profile. Furthermore, our findings suggest that higher harm-avoidance in PD is linked to alterations of the network including the nucleus accumbens and amygdala.

  11. Simultaneous PET/MR imaging of the brain: feasibility of cerebral blood flow measurements with FAIR-TrueFISP arterial spin labeling MRI.

    PubMed

    Stegger, Lars; Martirosian, Petros; Schwenzer, Nina; Bisdas, Sotirios; Kolb, Armin; Pfannenberg, Christina; Claussen, Claus D; Pichler, Bernd; Schick, Fritz; Boss, Andreas

    2012-11-01

Hybrid positron emission tomography/magnetic resonance imaging (PET/MRI) with simultaneous data acquisition promises a comprehensive evaluation of cerebral pathophysiology on a molecular, anatomical, and functional level. Considering the necessary changes to the MR scanner design, the feasibility of arterial spin labeling (ASL) is unclear. The aim was to evaluate whether cerebral blood flow imaging with ASL is feasible using a prototype PET/MRI device. ASL imaging of the brain with flow-sensitive alternating inversion recovery (FAIR) spin preparation and true fast imaging with steady-state precession (TrueFISP) data readout was performed in eight healthy volunteers sequentially on a prototype PET/MRI scanner and a stand-alone MR scanner with 128 × 128 and 192 × 192 matrix sizes. Cerebral blood flow values for gray matter, signal-to-noise and contrast-to-noise ratios, and relative signal change were compared. Additionally, the feasibility of ASL as part of a clinical hybrid PET/MRI protocol was demonstrated in five patients with intracerebral tumors. Blood flow maps showed good delineation of gray and white matter with no discernible artifacts. The mean blood flow values of the eight volunteers on the PET/MR system were 51 ± 9 and 51 ± 7 mL/100 g/min for the 128 × 128 and 192 × 192 matrices (stand-alone MR: 57 ± 2 and 55 ± 5, not significant). The signal-to-noise ratio (SNR) was significantly higher for the PET/MRI system at the 192 × 192 matrix size (P < 0.01), whereas the relative signal change (ΔS) was significantly lower for the 192 × 192 matrix size (P = 0.02). ASL imaging as part of a clinical hybrid PET/MRI protocol was successfully accomplished in all patients with diagnostic image quality. ASL brain imaging is feasible with a prototype hybrid PET/MRI scanner, thus adding to the value of this novel imaging technique.

  12. PET/MRI of metabolic activity in osteoarthritis: A feasibility study.

    PubMed

    Kogan, Feliks; Fan, Audrey P; McWalter, Emily J; Oei, Edwin H G; Quon, Andrew; Gold, Garry E

    2017-06-01

To evaluate positron emission tomography/magnetic resonance imaging (PET/MRI) knee imaging to detect and characterize osseous metabolic abnormalities and to correlate PET radiotracer uptake with osseous abnormalities and cartilage degeneration observed on MRI. Both knees of 22 subjects with knee pain or injury were scanned at one timepoint, without gadolinium, on a hybrid 3.0T PET/MRI system following injection of 18F-fluoride or 18F-fluorodeoxyglucose (FDG). A musculoskeletal radiologist identified volumes of interest (VOIs) around bone abnormalities on MR images and scored bone marrow lesions (BMLs) and osteophytes using the MOAKS scoring system. Cartilage appearance adjacent to bone abnormalities was graded with MRI-modified Outerbridge classifications. On PET standardized uptake value (SUV) maps, VOIs with SUV greater than 5 times the SUV in normal-appearing bone were identified as high-uptake VOIs (VOIhigh). Differences in 18F-fluoride uptake between bone abnormality, BML, and osteophyte grades and adjacent cartilage grades on MRI were identified using Mann-Whitney U-tests. SUVmax in all subchondral bone lesions (BML, osteophytes, sclerosis) was significantly higher than that of normal-appearing bone on MRI (P < 0.001 for all). Of the 172 high-uptake regions on 18F-fluoride PET, 63 (37%) corresponded to normal-appearing subchondral bone on MRI. Furthermore, many small grade 1 osteophytes (40 of 82 [49%]), often described as the earliest signs of osteoarthritis (OA), did not show high uptake. Lastly, PET SUVmax in subchondral bone adjacent to grade 0 cartilage was significantly lower compared to that of grades 1-2 (P < 0.05) and grades 3-4 cartilage (P < 0.001). PET/MRI can simultaneously assess multiple early metabolic and morphologic markers of knee OA across multiple tissues in the joint. Our findings suggest that PET/MRI may detect metabolic abnormalities in subchondral bone that appears normal on MRI. Level of Evidence: 2. Technical Efficacy: Stage 1. J. Magn. Reson. Imaging 2017;45:1736-1745. © 2016 International Society for Magnetic Resonance in Medicine.

  13. The Role of Probability in Developing Learners' Models of Simulation Approaches to Inference

    ERIC Educational Resources Information Center

    Lee, Hollylynne S.; Doerr, Helen M.; Tran, Dung; Lovett, Jennifer N.

    2016-01-01

    Repeated sampling approaches to inference that rely on simulations have recently gained prominence in statistics education, and probabilistic concepts are at the core of this approach. In this approach, learners need to develop a mapping among the problem situation, a physical enactment, computer representations, and the underlying randomization…

  14. Functional Topography of Early Periventricular Brain Lesions in Relation to Cytoarchitectonic Probabilistic Maps

    ERIC Educational Resources Information Center

    Staudt, Martin; Ticini, Luca F.; Grodd, Wolfgang; Krageloh-Mann, Ingeborg; Karnath, Hans-Otto

    2008-01-01

    Early periventricular brain lesions can not only cause cerebral palsy, but can also induce a reorganization of language. Here, we asked whether these different functional consequences can be attributed to topographically distinct portions of the periventricular white matter damage. Eight patients with pre- and perinatally acquired left-sided…

  15. A Diffusion MRI Tractography Connectome of the Mouse Brain and Comparison with Neuronal Tracer Data

    PubMed Central

    Calabrese, Evan; Badea, Alexandra; Cofer, Gary; Qi, Yi; Johnson, G. Allan

    2015-01-01

Interest in structural brain connectivity has grown with the understanding that abnormal neural connections may play a role in neurologic and psychiatric diseases. Small animal connectivity mapping techniques are particularly important for identifying aberrant connectivity in disease models. Diffusion magnetic resonance imaging tractography can provide nondestructive, 3D, brain-wide connectivity maps, but has historically been limited by low spatial resolution, low signal-to-noise ratio, and the difficulty of estimating multiple fiber orientations within a single image voxel. Small animal diffusion tractography can be substantially improved through the combination of ex vivo MRI with exogenous contrast agents, advanced diffusion acquisition and reconstruction techniques, and probabilistic fiber tracking. Here, we present a comprehensive, probabilistic tractography connectome of the mouse brain at microscopic resolution, and a comparison of these data with neuronal tracer-based connectivity data from the Allen Brain Atlas. This work serves as a reference database for future tractography studies in the mouse brain, and demonstrates the fundamental differences between tractography and neuronal tracer data. PMID:26048951

  16. Robust fitting for neuroreceptor mapping.

    PubMed

    Chang, Chung; Ogden, R Todd

    2009-03-15

    Among many other uses, positron emission tomography (PET) can be used in studies to estimate the density of a neuroreceptor at each location throughout the brain by measuring the concentration of a radiotracer over time and modeling its kinetics. There are a variety of kinetic models in common usage and these typically rely on nonlinear least-squares (LS) algorithms for parameter estimation. However, PET data often contain artifacts (such as uncorrected head motion) and so the assumptions on which the LS methods are based may be violated. Quantile regression (QR) provides a robust alternative to LS methods and has been used successfully in many applications. We consider fitting various kinetic models to PET data using QR and study the relative performance of the methods via simulation. A data adaptive method for choosing between LS and QR is proposed and the performance of this method is also studied.
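The LS-versus-QR comparison described here can be sketched on a toy "kinetic" model: a mono-exponential curve y(t) = A·exp(-kt) with one corrupted frame standing in for an artifact such as uncorrected head motion. Median (0.5-quantile) regression minimizes absolute rather than squared residuals; the model, data, and starting values below are invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize

# Toy time-activity curve with a single outlier frame.
t = np.linspace(0.0, 10.0, 30)
y_corrupt = 2.0 * np.exp(-0.5 * t)   # true parameters: A = 2.0, k = 0.5
y_corrupt[10] += 3.0                 # artifact in one frame

def model(p):
    return p[0] * np.exp(-p[1] * t)

# Least-squares fit: minimizes sum of squared residuals (outlier-sensitive).
ls = minimize(lambda p: np.sum((y_corrupt - model(p)) ** 2),
              x0=[1.0, 0.3], method="Nelder-Mead").x
# Median (0.5-quantile) regression: minimizes sum of absolute residuals.
qr = minimize(lambda p: np.sum(np.abs(y_corrupt - model(p))),
              x0=[1.0, 0.3], method="Nelder-Mead").x

print(f"true k = 0.50, LS k = {ls[1]:.3f}, QR k = {qr[1]:.3f}")
```

The single corrupted frame pulls the least-squares estimate of the decay constant away from the truth, while the quantile-regression estimate stays close to it, mirroring the robustness argument of the abstract.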

  17. PET/MRI for Oncologic Brain Imaging: A Comparison of Standard MR-Based Attenuation Corrections with a Model-Based Approach for the Siemens mMR PET/MR System.

    PubMed

    Rausch, Ivo; Rischka, Lucas; Ladefoged, Claes N; Furtner, Julia; Fenchel, Matthias; Hahn, Andreas; Lanzenberger, Rupert; Mayerhoefer, Marius E; Traub-Weidinger, Tatjana; Beyer, Thomas

    2017-09-01

The aim of this study was to compare attenuation-correction (AC) approaches for PET/MRI in clinical neurooncology. Methods: Forty-nine PET/MRI brain scans were included: brain tumor studies using 18F-fluoro-ethyl-tyrosine (18F-FET) (n = 31) and 68Ga-DOTANOC (n = 7) and studies of healthy subjects using 18F-FDG (n = 11). For each subject, MR-based AC maps (MR-AC) were acquired using the standard DIXON- and ultrashort echo time (UTE)-based approaches. A third MR-AC was calculated using a model-based, postprocessing approach to account for bone attenuation values (BD, noncommercial prototype software by Siemens Healthcare). As a reference, AC maps were derived from patient-specific CT images (CTref). PET data were reconstructed using standard settings after AC with all 4 AC methods. We report changes in diagnosis for all brain tumor patients and the following relative difference values (RDs [%]) with regard to AC-CTref: for 18F-FET (A), SUVs as well as volumes of interest (VOIs) defined by a 70% threshold of all segmented lesions and lesion-to-background ratios; for 68Ga-DOTANOC (B), SUVs as well as VOIs defined by a 50% threshold for all lesions and the pituitary gland; and for 18F-FDG (C), RDs of SUVs of the whole brain and 10 anatomic regions segmented on MR images. Results: For brain tumor imaging (A and B), the standard PET-based diagnosis was not affected by any of the 3 MR-AC methods. For A, the average RDs of SUVmean were -10%, -4%, and -3% and of the VOIs 1%, 2%, and 7% for DIXON, UTE, and BD, respectively. Lesion-to-background ratios for all MR-AC methods were similar to those of CTref. For B, average RDs of SUVmean were -11%, -11%, and -3% and of the VOIs 1%, -4%, and -3%, respectively. In the case of 18F-FDG PET/MRI (C), RDs for the whole brain were -11%, -8%, and -5% for DIXON, UTE, and BD, respectively. Conclusion: The diagnostic reading of PET/MR patients with brain tumors did not change with the chosen AC method. Quantitative accuracy of SUVs was clinically acceptable for UTE- and BD-AC in group A, whereas for group B, BD was in accordance with CTref. Nevertheless, for the quantification of individual lesions, large deviations from CTref can be observed independent of the MR-AC method used. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.
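The relative-difference metric reported throughout this abstract can be written voxel-wise as RD[%] = 100 · (PET_MRAC − PET_CTref) / PET_CTref. The toy volumes below are random stand-ins for reconstructed images, used only to illustrate the computation.

```python
import numpy as np

# Toy reconstructed volumes: a CT-reference reconstruction and an MR-AC
# reconstruction that underestimates it by up to ~15% (invented numbers).
rng = np.random.default_rng(1)
pet_ctref = rng.uniform(1.0, 2.0, size=(4, 4, 4))
pet_mrac = pet_ctref * rng.uniform(0.85, 1.05, size=pet_ctref.shape)

# Voxel-wise relative difference with respect to the CT reference.
rd = 100.0 * (pet_mrac - pet_ctref) / pet_ctref
print(f"mean RD = {rd.mean():.1f}%")
```

In the study, such RDs are summarized over lesion VOIs or segmented anatomic regions rather than over the whole volume.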

  18. Climatological attribution of wind power ramp events in East Japan and their probabilistic forecast based on multi-model ensembles downscaled by analog ensemble using self-organizing maps

    NASA Astrophysics Data System (ADS)

    Ohba, Masamichi; Nohara, Daisuke; Kadokura, Shinji

    2016-04-01

Severe storms or other extreme weather events can interrupt the spin of wind turbines on a large scale, causing unexpected "wind ramp events". In this study, we present an application of self-organizing maps (SOMs) for climatological attribution of wind ramp events and their probabilistic prediction. The SOM is an automatic data-mining clustering technique that allows us to summarize a high-dimensional data space in terms of a set of reference vectors. The SOM is applied to analyze and connect the relationship between atmospheric patterns over Japan and wind power generation. The SOM is employed on sea level pressure derived from the JRA-55 reanalysis over the target area (the Tohoku region in Japan), whereby a two-dimensional lattice of weather patterns (WPs) classified over the 1977-2013 period is obtained. For comparison with the atmospheric data, long-term wind power generation is reconstructed by using the high-resolution surface observation network AMeDAS (Automated Meteorological Data Acquisition System) in Japan. Our analysis extracts seven typical WPs that are linked to frequent occurrences of wind ramp events. Probabilistic forecasts of wind power generation and ramps are conducted by using the obtained SOM. The probabilities are derived from the multiple SOM lattices based on the matching of output from the TIGGE multi-model global forecast to the WPs on the lattices. Since this method effectively accounts for the empirical uncertainties in the historical data, wind power generation and ramps are probabilistically forecasted from the forecasts of global models. The forecasts of wind power generation and ramp events show relatively good skill scores under the downscaling technique. It is expected that the results of this study will provide better guidance to the user community and contribute to the future development of a system operation model for the transmission grid operator.
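A minimal SOM sketch may help make the clustering step concrete: a small lattice of reference vectors is fitted to 2-D samples standing in for sea-level-pressure patterns. The lattice size, learning rate, and neighborhood schedule below are illustrative, not those of the study.

```python
import numpy as np

# Synthetic 2-D samples standing in for (dimensionality-reduced)
# sea-level-pressure fields.
rng = np.random.default_rng(0)
data = rng.normal(size=(500, 2))

rows, cols = 4, 4
weights = rng.normal(size=(rows, cols, 2))           # reference vectors
grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols),
                            indexing="ij"), axis=-1)  # lattice coordinates

def quantization_error(w):
    d = np.linalg.norm(data[:, None, None, :] - w[None], axis=-1)
    return d.reshape(len(data), -1).min(axis=1).mean()

err_before = quantization_error(weights)
for epoch in range(20):
    lr = 0.5 * (1 - epoch / 20)            # decaying learning rate
    sigma = 2.0 * (1 - epoch / 20) + 0.5   # decaying neighborhood width
    for x in data:
        d = np.linalg.norm(weights - x, axis=-1)
        bmu = np.unravel_index(d.argmin(), d.shape)    # best-matching unit
        dist2 = ((grid - np.array(bmu)) ** 2).sum(axis=-1)
        h = np.exp(-dist2 / (2 * sigma ** 2))          # Gaussian neighborhood
        weights += lr * h[..., None] * (x - weights)

err_after = quantization_error(weights)
print(f"quantization error: {err_before:.3f} -> {err_after:.3f}")
```

After training, each lattice node corresponds to a weather pattern; forecast fields are then matched to their best-matching node to derive occurrence probabilities, as the abstract describes.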

  19. Brain and Language.

    ERIC Educational Resources Information Center

Damasio, Antonio R.; Damasio, Hanna

    1992-01-01

    Discusses the advances made in understanding the brain structures responsible for language. Presents findings made using magnetic resonance imaging (MRI) and positron emission tomographic (PET) scans to study brain activity. These findings map the structures in the brain that manipulate concepts and those that turn concepts into words. (MCO)

  20. Probabilistic Flood Mapping using Volunteered Geographical Information

    NASA Astrophysics Data System (ADS)

    Rivera, S. J.; Girons Lopez, M.; Seibert, J.; Minsker, B. S.

    2016-12-01

Flood extent maps are widely used by decision makers and first responders to provide critical information that prevents economic impacts and the loss of human lives. These maps are usually obtained from sensor data and/or hydrologic models, which often have limited coverage in space and time. Recent developments in social media and communication technology have created a wealth of near-real-time, user-generated content during flood events in many urban areas, such as flooded locations, pictures of flooding extent and height, etc. These data could improve decision-making and response operations as events unfold. However, the integration of these data sources has been limited by the need for methods that can extract and translate the data into useful information for decision-making. This study presents an approach that uses volunteered geographic information (VGI) and non-traditional data sources (i.e., Twitter, Flickr, YouTube, and 911 and 311 calls) to generate/update flood extent maps in areas where no models and/or gauge data are operational. The approach combines Web-crawling and computer vision techniques to gather information about the location, extent, and water height of the flood from unstructured textual data, images, and videos. These estimates are then used to provide an updated flood extent map for areas surrounding the geo-coordinates of the VGI through the application of a Hydro Growing Region Algorithm (HGRA). HGRA combines hydrologic and image segmentation concepts to estimate a probabilistic flooding extent along the corresponding creeks. Results obtained for a case study in Austin, TX (i.e., the 2015 Memorial Day flood) were comparable to those obtained by a calibrated hydrologic model and had good spatial correlation with flooding extents estimated by the Federal Emergency Management Agency (FEMA).
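The growing-region idea can be sketched in a simplified form: starting from a geo-located report (seed cell), grow the flooded region over an elevation grid, keeping neighboring cells at or below the reported water level. This is only a stand-in for the HGRA, whose hydrologic constraints are not reproduced; the grid, seed, and water level are invented.

```python
from collections import deque

# Tiny elevation grid (invented); lower numbers are lower ground.
elevation = [
    [3, 2, 2, 4],
    [3, 1, 1, 4],
    [4, 1, 2, 4],
    [4, 4, 3, 4],
]
water_level = 2
seed = (1, 1)   # cell where a flooded-location report was geo-coded

def grow_flood(elev, seed, level):
    """Breadth-first region growing: keep 4-connected cells <= water level."""
    rows, cols = len(elev), len(elev[0])
    flooded, queue = {seed}, deque([seed])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and (nr, nc) not in flooded and elev[nr][nc] <= level):
                flooded.add((nr, nc))
                queue.append((nr, nc))
    return flooded

print(sorted(grow_flood(elevation, seed, water_level)))
```

Repeating the growth for multiple water-level estimates (each report carries uncertainty) and overlaying the resulting masks would yield a probabilistic extent in the spirit of the study.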

  1. A Moore's cellular automaton model to get probabilistic seismic hazard maps for different magnitude releases: A case study for Greece

    NASA Astrophysics Data System (ADS)

    Jiménez, A.; Posadas, A. M.

    2006-09-01

Cellular automata are simple mathematical idealizations of natural systems that supply useful models for many investigations in natural science. Examples include sandpile models, forest fire models, and the slider-block models used in seismology. In the present paper, they are used to establish temporal relations between the energy releases of seismic events that occurred in neighboring parts of the crust. The catalogue is divided into time intervals, and the region is divided into cells which are declared active or inactive by means of a threshold energy-release criterion. Thus, a pattern of active and inactive cells which evolves over time is determined. A stochastic cellular automaton is constructed from these patterns in order to simulate their spatio-temporal evolution, assuming a Moore's-neighborhood interaction between the cells. The best model is chosen by maximizing the mutual information between the past and the future states. Finally, a Probabilistic Seismic Hazard Map is given for the different energy releases considered. The method has been applied to the Greece catalogue from 1900 to 1999. The Probabilistic Seismic Hazard Maps for energies corresponding to m = 4 and m = 5 are close to the real seismicity recorded in the area after the catalogue period, and they correspond to a background seismicity over the whole area. This background seismicity seems to cover the whole area over periods of around 25-50 years. The optimum cell size is in agreement with other studies; for m > 6 the optimum area increases according to the threshold of clear spatial resolution, and the active cells are not so clustered. The results are coherent with other hazard studies in the zone and with the seismicity recorded after the data set, and they provide an interaction model which points to the large-scale nature of earthquake occurrence.
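The model-selection criterion used here (maximizing mutual information between past and future states) can be sketched for a single cell with binary active/inactive states, estimated from a 2×2 joint histogram. The synthetic sequences below are invented for illustration.

```python
import numpy as np

def mutual_information(past, future):
    """Mutual information (bits) between two binary state sequences."""
    joint = np.zeros((2, 2))
    for a, b in zip(past, future):
        joint[a, b] += 1
    joint /= joint.sum()
    px, py = joint.sum(axis=1), joint.sum(axis=0)
    mi = 0.0
    for i in range(2):
        for j in range(2):
            if joint[i, j] > 0:
                mi += joint[i, j] * np.log2(joint[i, j] / (px[i] * py[j]))
    return mi

rng = np.random.default_rng(0)
past = rng.integers(0, 2, 200)
dependent = past.copy()                  # future fully determined by past
independent = rng.integers(0, 2, 200)    # future unrelated to past

print(mutual_information(past, dependent), mutual_information(past, independent))
```

A candidate automaton whose predicted future patterns carry more information about the observed ones would be preferred under this criterion.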

  2. Probabilistic liver atlas construction.

    PubMed

    Dura, Esther; Domingo, Juan; Ayala, Guillermo; Marti-Bonmati, Luis; Goceri, E

    2017-01-13

    Anatomical atlases are 3D volumes or shapes representing an organ or structure of the human body. They contain either the prototypical shape of the object of interest together with other shapes representing its statistical variations (statistical atlas) or a probability map of belonging to the object (probabilistic atlas). Probabilistic atlases are mostly built with simple estimations only involving the data at each spatial location. A new method for probabilistic atlas construction that uses a generalized linear model is proposed. This method aims to improve the estimation of the probability to be covered by the liver. Furthermore, all methods to build an atlas involve previous coregistration of the sample of shapes available. The influence of the geometrical transformation adopted for registration in the quality of the final atlas has not been sufficiently investigated. The ability of an atlas to adapt to a new case is one of the most important quality criteria that should be taken into account. The presented experiments show that some methods for atlas construction are severely affected by the previous coregistration step. We show the good performance of the new approach. Furthermore, results suggest that extremely flexible registration methods are not always beneficial, since they can reduce the variability of the atlas and hence its ability to give sensible values of probability when used as an aid in segmentation of new cases.
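The "simple estimation only involving the data at each spatial location" that this abstract improves upon can be sketched directly: after coregistration, the voxel-wise mean of binary organ masks gives the probability that each voxel belongs to the organ. The toy masks below are invented; the paper's contribution replaces this estimate with a generalized linear model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Five toy "registered" binary liver masks on a 5x5 grid: a fixed core
# plus random variation along one boundary.
core = np.zeros((5, 5), dtype=int)
core[1:4, 1:4] = 1
masks = []
for _ in range(5):
    m = core.copy()
    m[0, 1:4] = rng.integers(0, 2, 3)   # variable upper boundary
    masks.append(m)

# Voxel-wise mean of the masks = probability of belonging to the organ.
atlas = np.mean(masks, axis=0)
print(atlas)
```

Voxels inside the common core get probability 1, voxels never covered get 0, and boundary voxels get intermediate values, which is exactly the information a probabilistic atlas contributes to segmentation of new cases.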

  3. Comparison of Control Approaches in Genetic Regulatory Networks by Using Stochastic Master Equation Models, Probabilistic Boolean Network Models and Differential Equation Models and Estimated Error Analyzes

    NASA Astrophysics Data System (ADS)

    Caglar, Mehmet Umut; Pal, Ranadip

    2011-03-01

The central dogma of molecular biology states that "information cannot be transferred back from protein to either protein or nucleic acid". However, this assumption is not exactly correct in most cases: there are many feedback loops and interactions between different levels of these systems. Such interactions are hard to analyze due to the lack of cell-level data and the probabilistic, nonlinear nature of the interactions. Several models are widely used to analyze and simulate these types of nonlinear interactions. Stochastic Master Equation (SME) models capture the probabilistic nature of the interactions in a detailed manner, at a high computational cost. On the other hand, Probabilistic Boolean Network (PBN) models give a coarse-scale picture of the stochastic processes at a lower computational cost. Differential Equation (DE) models give the time evolution of the mean values of the processes in a highly cost-effective way. Understanding the relations between the predictions of these models is important for assessing the reliability of simulations of genetic regulatory networks. In this work, the success of the mapping between SME, PBN, and DE models is analyzed, and the accuracy and effectiveness of the control policies generated by using PBN and DE models are compared.

  4. Evaluation of (68)Ga-DOTA-TOC PET/CT for the detection of duodenopancreatic neuroendocrine tumors in patients with MEN1.

    PubMed

    Morgat, Clément; Vélayoudom-Céphise, Fritz-Line; Schwartz, Paul; Guyot, Martine; Gaye, Delphine; Vimont, Delphine; Schulz, Jürgen; Mazère, Joachim; Nunes, Marie-Laure; Smith, Denis; Hindié, Elif; Fernandez, Philippe; Tabarin, Antoine

    2016-07-01

Somatostatin receptor scintigraphy with (111)In-pentetreotide (SRS) is used to detect duodenopancreatic neuroendocrine tumors (dpNETs) in multiple endocrine neoplasia type 1 (MEN1). However, SRS has limited sensitivity for this purpose. Positron emission tomography/computed tomography (PET/CT) with (68)Ga-DOTA-TOC has a higher rate of sporadic dpNET detection than SRS, but there are few data on dpNET detection in MEN1. The aim was to compare the performance of (68)Ga-DOTA-TOC PET/CT, SRS, and contrast-enhanced computed tomography (CE-CT) for diagnosing dpNETs in MEN1 in a single-institution prospective comparative study. Nineteen consecutive MEN1 patients (aged 47 ± 13 years) underwent (68)Ga-DOTA-TOC PET/CT, SRS, and CE-CT within 2 months in random order. Blinded readings of images were performed separately by experienced physicians. Unblinded analysis of CE-CT, combined with additional magnetic resonance imaging, endoscopic ultrasound, (18)F-2-fluoro-deoxy-D-glucose ((18)F-FDG) PET/CT, or histopathology results, served as the reference standard for dpNET diagnosis. The sensitivity of (68)Ga-DOTA-TOC PET/CT, SRS, and CE-CT was 76, 20, and 60%, respectively (p < 0.0001). All the true-positive lesions detected by SRS were also depicted on (68)Ga-DOTA-TOC PET/CT. (68)Ga-DOTA-TOC PET/CT detected lesions of smaller size than SRS (10.7 ± 7.6 and 15.2 ± 5.9 mm, respectively, p < 0.03). False negatives of (68)Ga-DOTA-TOC PET/CT included small dpNETs (<10 mm) and (18)F-FDG PET/CT-positive aggressive dpNETs. No false positives were recorded. In addition, whole-body mapping with (68)Ga-DOTA-TOC PET/CT identified extra-abdominal MEN1-related tumors, including one neuroendocrine thymic carcinoma identified by all three imaging procedures, one bronchial carcinoid undetected by CE-CT, and three meningiomas undetected by SRS. Owing to its higher diagnostic performance, (68)Ga-DOTA-TOC PET/CT (or alternative (68)Ga-labeled somatostatin analogues) should replace (111)In-pentetreotide in the investigation of MEN1 patients.

  5. An osteogenesis/angiogenesis-stimulation artificial ligament for anterior cruciate ligament reconstruction.

    PubMed

    Li, Hong; Li, Jinyan; Jiang, Jia; Lv, Fang; Chang, Jiang; Chen, Shiyi; Wu, Chengtie

    2017-05-01

    To solve the poor healing of polyethylene terephthalate (PET) artificial ligament in bone tunnel, copper-containing bioactive glass (Cu-BG) nanocoatings on PET artificial ligaments were successfully prepared by pulsed laser deposition (PLD). It was hypothesized that Cu-BG coated PET (Cu-BG/PET) grafts could enhance the in vitro osteogenic and angiogenic differentiation of rat bone marrow mesenchymal stem cells (rBMSCs) and in vivo graft-bone healing after anterior cruciate ligament (ACL) reconstruction in a goat model. Scanning electron microscope and EDS mapping analysis revealed that the prepared nanocoatings had uniform element distribution (Cu, Ca, Si and P) and nanostructure. The surface hydrophilicity of PET grafts was significantly improved after depositing Cu-BG nanocoatings. The in vitro study displayed that the Cu-BG/PET grafts supported the attachment and proliferation of rBMSCs, and significantly promoted the expression of HIF-1α gene, which up-regulated the osteogenesis-related genes (S100A10, BMP2, OCN) and angiogenesis-related genes (VEGF) in comparison with PET or BG coated PET (BG/PET) grafts which do not contain Cu element. Meanwhile, Cu-BG/PET grafts promoted the bone regeneration at the graft-host bone interface and decreased graft-bone interface width, thus enhancing the bonding strength as well as angiogenesis (as indicated by CD31 expression) in the goat model as compared with BG/PET and pure PET grafts. The study demonstrates that the Cu-containing biomaterials significantly promote osteogenesis and angiogenesis in the repair of bone defects of large animals and thus offering a promising method for ACL reconstruction by using Cu-containing nanobioglass modified PET grafts. It remains a significant challenge to develop an artificial graft with distinct osteogenetic/angiogenetic activity to enhance graft-bone healing for ligament reconstruction. 
    To solve these problems, copper-containing bioactive glass (Cu-BG) nanocoatings on PET artificial ligaments were successfully prepared by pulsed laser deposition (PLD). It was found that the prepared Cu-BG/PET grafts significantly stimulated the proliferation and osteogenic/angiogenic differentiation of bone marrow stromal cells (BMSCs) through activation of the HIF-1α/S100A10/Ca2+ signaling pathway. Most importantly, the in vivo bone-forming ability of Cu-containing biomaterials was, for the first time, elucidated in a large animal model, revealing enhanced osteogenesis and angiogenesis with the incorporation of bioactive Cu. The results suggest that copper-containing biomaterials significantly promote osteogenesis and angiogenesis in large-animal defects and thus offer a promising method for ACL reconstruction using Cu-containing nanobioglass-modified PET grafts, paving the way for applying Cu-containing biomaterials in tissue engineering and regenerative medicine. Copyright © 2017 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.

  6. Heterogeneous response of cardiac sympathetic function to cardiac resynchronization therapy in heart failure documented by [11C]-hydroxyephedrine PET/CT.

    PubMed

    Capitanio, Selene; Nanni, Cristina; Marini, Cecilia; Bonfiglioli, Rachele; Martignani, Cristian; Dib, Bassam; Fuccio, Chiara; Boriani, Giuseppe; Picori, Lorena; Boschi, Stefano; Morbelli, Silvia; Fanti, Stefano; Sambuceti, Gianmario

    2015-11-01

    Cardiac resynchronization therapy (CRT) is an accepted treatment in patients with end-stage heart failure. PET permits the absolute quantification of global and regional homogeneity in cardiac sympathetic innervation. We evaluated the variation of cardiac adrenergic activity in patients with idiopathic heart failure (IHF) disease (NYHA III-IV) after CRT using (11)C-hydroxyephedrine (HED) PET/CT. Ten IHF patients (mean age = 68; range = 55-81; average left ventricular ejection fraction 26 ± 4%) implanted with a resynchronization device underwent three HED PET/CT studies: PET 1, one week after inactive device implantation; PET 2, one week after PET 1 under stimulated rhythm; PET 3, at 3 months under active CRT. Dedicated software (PMOD version 3.4) was used to estimate global and regional cardiac uptake of HED through 17-segment polar maps. At baseline, HED uptake was heterogeneously distributed throughout the left ventricle with a variation coefficient of 18 ± 5%. This variable markedly decreased after three months of CRT (12 ± 5%, p < 0.01). Interestingly, subdividing the 170 myocardial segments (17 segments per patient multiplied by the number of patients) into two groups, according to the median value of tracer uptake expressed as % of maximal myocardial uptake (76%), we observed a different behaviour depending on baseline innervation: HED uptake significantly increased only in segments with "impaired innervation" (SUV 2.61 ± 0.92 at PET 1 and 3.05 ± 1.67 at three months, p < 0.01). As shown by HED PET/CT uptake and distribution, the improvement in homogeneity of myocardial neuronal function reflected a selective improvement of tracer uptake in regions with more severe neuronal damage. These findings support the presence of regional variability in the response of the cardiac sympathetic system to CRT and a systemic response involving remote tissues with rich adrenergic innervation. 
This work might contribute to identifying imaging parameters that could predict the response to CRT. Copyright © 2015 Elsevier Inc. All rights reserved.
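    The homogeneity measure used above (a variation coefficient over the 17-segment polar map) can be sketched as follows; the segment uptake values are hypothetical, not taken from the study.

```python
import numpy as np

def variation_coefficient(segment_uptake):
    """Coefficient of variation (%) of tracer uptake across polar-map segments.

    A lower value indicates more homogeneous sympathetic innervation.
    """
    u = np.asarray(segment_uptake, dtype=float)
    return 100.0 * u.std(ddof=1) / u.mean()

# Hypothetical 17-segment HED uptake values (SUV) for one patient
baseline = np.array([3.1, 2.9, 3.3, 2.5, 2.2, 3.0, 3.2, 2.1, 2.8,
                     3.0, 2.6, 2.4, 3.1, 2.9, 2.7, 2.3, 2.6])
cv = variation_coefficient(baseline)  # percent
```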

  7. Comparison of O-(2-18F-Fluoroethyl)-L-Tyrosine Positron Emission Tomography and Perfusion-Weighted Magnetic Resonance Imaging in the Diagnosis of Patients with Progressive and Recurrent Glioma: A Hybrid Positron Emission Tomography/Magnetic Resonance Study.

    PubMed

    Verger, Antoine; Filss, Christian P; Lohmann, Philipp; Stoffels, Gabriele; Sabel, Michael; Wittsack, Hans-J; Kops, Elena Rota; Galldiks, Norbert; Fink, Gereon R; Shah, Nadim J; Langen, Karl-Josef

    2018-05-01

    To compare the diagnostic performance of O-(2-18F-fluoroethyl)-L-tyrosine (18F-FET) positron emission tomography (PET) and perfusion-weighted magnetic resonance imaging (PWI) for the diagnosis of progressive or recurrent glioma. Thirty-two pretreated gliomas (25 progressive or recurrent tumors, 7 treatment-related changes) were investigated with 18F-FET PET and PWI using a hybrid PET/magnetic resonance scanner. Volumes of interest with a diameter of 16 mm were centered on the maximum of abnormality in the tumor area in PET and PWI maps (relative cerebral blood volume, relative cerebral blood flow, mean transit time) and the contralateral unaffected hemisphere. Mean and maximum tumor-to-brain ratios as well as dynamic data for 18F-FET uptake were calculated. Diagnostic accuracies were evaluated by receiver operating characteristic analyses, calculating the area under the curve. 18F-FET PET showed significantly greater sensitivity in detecting abnormalities in pretreated gliomas than PWI (76% vs. 52%, P = 0.03). The maximum tumor-to-brain ratio of 18F-FET PET was the only parameter that discriminated treatment-related changes from progressive or recurrent gliomas (area under the curve, 0.78; P = 0.03; best cut-off 2.61; sensitivity 80%, specificity 86%, accuracy 81%). Among patients with signal abnormality in both modalities, 75% revealed spatially incongruent local hot spots. This pilot study suggests that 18F-FET PET is superior to PWI for diagnosing progressive or recurrent glioma. Copyright © 2018 Elsevier Inc. All rights reserved.
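    A minimal sketch of the two quantities evaluated above: the tumor-to-brain ratio from VOI voxel values and the ROC area under the curve, computed here via the Mann-Whitney statistic. All voxel values and scores are hypothetical.

```python
import numpy as np

def tumor_to_brain_ratio(tumor_voxels, background_voxels):
    """Mean and maximum TBR from tumor-VOI voxels relative to the
    mean of a contralateral background VOI."""
    t = np.asarray(tumor_voxels, float)
    b_mean = float(np.mean(background_voxels))
    return t.mean() / b_mean, t.max() / b_mean

def roc_auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    fraction of (positive, negative) pairs ranked correctly, ties count 0.5."""
    pos = np.asarray(scores_pos, float)[:, None]
    neg = np.asarray(scores_neg, float)[None, :]
    wins = (pos > neg).sum() + 0.5 * (pos == neg).sum()
    return wins / (pos.size * neg.size)
```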

  8. Region specific optimization of continuous linear attenuation coefficients based on UTE (RESOLUTE): application to PET/MR brain imaging

    NASA Astrophysics Data System (ADS)

    Ladefoged, Claes N.; Benoit, Didier; Law, Ian; Holm, Søren; Kjær, Andreas; Højgaard, Liselotte; Hansen, Adam E.; Andersen, Flemming L.

    2015-10-01

    The reconstruction of PET brain data in a PET/MR hybrid scanner is challenging in the absence of transmission sources, where MR images are used for MR-based attenuation correction (MR-AC). The main challenge of MR-AC is to separate bone and air, as neither has a signal in traditional MR images, and to assign the correct linear attenuation coefficient to bone. The ultra-short echo time (UTE) MR sequence was proposed as a basis for MR-AC as this sequence shows a small signal in bone. The purpose of this study was to develop a new clinically feasible MR-AC method with patient-specific continuous-valued linear attenuation coefficients in bone that provides accurate reconstructed PET image data. A total of 164 [18F]FDG PET/MR patients were included in this study, of which 10 were used for training. MR-AC was based on either standard CT (reference), UTE or our method (RESOLUTE). The reconstructed PET images were evaluated in the whole brain, as well as regionally using an ROI-based analysis. Our method segments air, brain, cerebrospinal fluid, and soft tissue voxels on the unprocessed UTE TE images, and uses a mapping of R2* values to CT Hounsfield Units (HU) to measure the density in bone voxels. The average error of our method in the brain was 0.1% and less than 1.2% in any region of the brain. On average, 95% of the brain was within ±10% of the CT-based PET reference, compared with 72% when using UTE. The proposed method is clinically feasible, reducing both the global and local errors on the reconstructed PET images, as well as limiting the number and extent of the outliers.
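    The bone-density step above maps R2* values to CT Hounsfield units and then to linear attenuation coefficients at 511 keV. The sketch below assumes a hypothetical linear R2*-to-HU mapping (the actual RESOLUTE mapping is calibrated on co-registered UTE/CT training data) together with the common bilinear CT-to-μ scaling.

```python
import numpy as np

MU_WATER_511 = 0.096  # cm^-1, linear attenuation coefficient of water at 511 keV

def r2star_to_hu(r2star, slope=1.3, intercept=100.0):
    """Hypothetical linear mapping from R2* (1/s) to Hounsfield units in bone.
    Slope and intercept are illustrative placeholders, not the published fit."""
    return slope * np.asarray(r2star, float) + intercept

def hu_to_mu_511(hu):
    """Bilinear CT-to-mu conversion at 511 keV (standard PET/CT-style scaling):
    water-like scaling below 0 HU, a reduced slope above (bone)."""
    hu = np.asarray(hu, float)
    mu = np.where(hu <= 0,
                  MU_WATER_511 * (1.0 + hu / 1000.0),
                  MU_WATER_511 * (1.0 + 0.5 * hu / 1000.0))
    return np.clip(mu, 0.0, None)  # attenuation cannot be negative
```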

  9. Respiratory-gated CT as a tool for the simulation of breathing artifacts in PET and PET/CT.

    PubMed

    Hamill, J J; Bosmans, G; Dekker, A

    2008-02-01

    Respiratory motion in PET and PET/CT blurs the images and can cause attenuation-related errors in quantitative parameters such as standard uptake values. In rare instances, this problem even causes localization errors and the disappearance of tumors that should be detectable. Attenuation errors are severe near the diaphragm and can be enhanced when the attenuation correction is based on a CT series acquired during a breath-hold. To quantify the errors and identify the parameters associated with them, the authors performed a simulated PET scan based on respiratory-gated CT studies of five lung cancer patients. Diaphragmatic motion ranged from 8 to 25 mm in the five patients. The CT series were converted to 511-keV attenuation maps which were forward-projected and exponentiated to form sinograms of PET attenuation factors at each phase of respiration. The CT images were also segmented to form a PET object, moving with the same motion as the CT series. In the moving PET object, spherical 20 mm mobile tumors were created in the vicinity of the dome of the liver and immobile 20 mm tumors in the midchest region. The moving PET objects were forward-projected and attenuated, then reconstructed in several ways: phase-matched PET and CT, gated PET with ungated CT, ungated PET with gated CT, and conventional PET. Spatial resolution and statistical noise were not modeled. In each case, tumor uptake recovery factor was defined by comparing the maximum reconstructed pixel value with the known correct value. Mobile 10 and 30 mm tumors were also simulated in the case of a patient with 11 mm of breathing motion. Phase-matched gated PET and CT gave essentially perfect PET reconstructions in the simulation. Gated PET with ungated CT gave tumors of the correct shape, but recovery was too large by an amount that depended on the extent of the motion, as much as 90% for mobile tumors and 60% for immobile tumors. 
Gated CT with ungated PET resulted in blurred tumors and caused recovery errors between -50% and +75%. Recovery in clinical scans would be 0%-20% lower than stated because spatial resolution was not included in the simulation. Mobile tumors near the dome of the liver were subject to the largest errors in either case. Conventional PET for 20 mm tumors was quantitative in cases of motion less than 15 mm because of canceling errors in blurring and attenuation, but the recovery factors were too low by as much as 30% in cases of motion greater than 15 mm. The 10 mm tumors were blurred by motion to a greater extent, causing a greater SUV underestimation than in the case of 20 mm tumors, and the 30 mm tumors were blurred less. Quantitative PET imaging near the diaphragm requires proper matching of attenuation information to the emission information. The problem of missed tumors near the diaphragm can be reduced by acquiring attenuation-correction information near end expiration. A simple PET/CT protocol requiring no gating equipment also addresses this problem.
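    The simulation above converts CT images into 511 keV attenuation maps, forward-projects them, and exponentiates to form sinograms of PET attenuation factors. A toy version for horizontal lines of response (parallel projection along one axis; the voxel size is an assumed value):

```python
import numpy as np

def attenuation_factors(mu_map, voxel_cm=0.4):
    """Photon survival probability per horizontal line of response:
    exp(-line integral of mu). The attenuated emission sinogram is the
    true projection multiplied by these factors."""
    line_integrals = mu_map.sum(axis=1) * voxel_cm  # projection along x
    return np.exp(-line_integrals)

# 10 cm of water (25 voxels of 0.4 cm at mu = 0.096 cm^-1), three LOR rows
mu = np.full((3, 25), 0.096)
af = attenuation_factors(mu)
```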

  10. Preliminary Seismic Probabilistic Tsunami Hazard Map for Italy

    NASA Astrophysics Data System (ADS)

    Lorito, Stefano; Selva, Jacopo; Basili, Roberto; Grezio, Anita; Molinari, Irene; Piatanesi, Alessio; Romano, Fabrizio; Tiberti, Mara Monica; Tonini, Roberto; Bonini, Lorenzo; Michelini, Alberto; Macias, Jorge; Castro, Manuel J.; González-Vida, José Manuel; de la Asunción, Marc

    2015-04-01

    We present a preliminary release of the first seismic probabilistic tsunami hazard map for Italy. The map aims to become an important tool for the Italian Department of Civil Protection (DPC), as well as a support tool for the NEAMTWS Tsunami Service Provider, the Centro Allerta Tsunami (CAT) at INGV, Rome. The map shows the offshore maximum tsunami elevation expected for several average return periods. Both crustal and subduction earthquakes are considered. The probability for each scenario (location, depth, mechanism, source size, magnitude and temporal rate) is defined on a uniform grid covering the entire Mediterranean for crustal earthquakes and on the plate interface for subduction earthquakes. Activity rates are assigned from seismic catalogues and based on a tectonic regionalization of the Mediterranean area. The methodology explores the associated aleatory uncertainty through the innovative application of an Event Tree. Main sources of epistemic uncertainty are also addressed, although in a preliminary way. The whole procedure relies on a database of pre-calculated Gaussian-shaped Green's functions for the sea level elevation, to be used also as a real-time hazard assessment tool by CAT. Tsunami simulations are performed using the non-linear shallow water multi-GPU code HySEA, over a 30 arcsec bathymetry (from the SRTM30+ dataset); the maximum elevations are stored at the 50-meter isobath and then extrapolated to 1 m depth using Green's law. This work is partially funded by project ASTARTE - Assessment, Strategy And Risk Reduction for Tsunamis in Europe - FP7-ENV2013 6.4-3, Grant 603839, and by the Italian flagship project RITMARE.
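    The Green's law extrapolation mentioned above scales an amplitude stored at the 50 m isobath to shallower water as (h1/h2)^(1/4), from energy-flux conservation in linear shallow-water theory; the amplitude value below is illustrative.

```python
def greens_law(amplitude, depth_from, depth_to):
    """Shoal a tsunami amplitude between water depths via Green's law:
    eta2 = eta1 * (h1 / h2) ** 0.25."""
    return amplitude * (depth_from / depth_to) ** 0.25

# Hypothetical 0.5 m maximum elevation at the 50 m isobath, extrapolated to 1 m depth
eta_1m = greens_law(0.5, 50.0, 1.0)
```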

  11. Predicting the Location of Human Perirhinal Cortex, Brodmann's area 35, from MRI

    PubMed Central

    Augustinack, Jean C.; Huber, Kristen E.; Stevens, Allison A.; Roy, Michelle; Frosch, Matthew P.; van der Kouwe, André J.W.; Wald, Lawrence L.; Van Leemput, Koen; McKee, Ann; Fischl, Bruce

    2012-01-01

    The perirhinal cortex (Brodmann's area 35) is a multimodal area that is important for normal memory function. Specifically, perirhinal cortex is involved in detection of novel objects and manifests neurofibrillary tangles in Alzheimer's disease very early in disease progression. We scanned ex vivo brain hemispheres at standard resolution (1 mm × 1 mm × 1 mm) to construct pial/white matter surfaces in FreeSurfer and scanned again at high resolution (120 μm × 120 μm × 120 μm) to determine cortical architectural boundaries. After labeling perirhinal area 35 in the high resolution images, we mapped the high resolution labels to the surface models to localize area 35 in fourteen cases. We validated the area boundaries determined using histological Nissl staining. To test the accuracy of the probabilistic mapping, we measured the Hausdorff distance between the predicted and true labels and found that the median Hausdorff distance was 4.0 mm for left hemispheres (n = 7) and 3.2 mm for right hemispheres (n = 7) across subjects. To show the utility of perirhinal localization, we mapped our labels to a subset of the Alzheimer's Disease Neuroimaging Initiative dataset and found decreased cortical thickness measures in mild cognitive impairment and Alzheimer's disease compared to controls in the predicted perirhinal area 35. Our ex vivo probabilistic mapping of perirhinal cortex provides histologically validated, automated and accurate labeling of architectonic regions in the medial temporal lobe, and facilitates the analysis of atrophic changes in a large dataset for earlier detection and diagnosis. PMID:22960087
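    The accuracy metric above is the Hausdorff distance between predicted and true labels. For small sets of boundary coordinates it can be computed directly; this is a brute-force sketch, not the FreeSurfer implementation.

```python
import numpy as np

def hausdorff(a, b):
    """Symmetric Hausdorff distance between two point sets
    (e.g. label boundary coordinates in mm)."""
    a = np.asarray(a, float)
    b = np.asarray(b, float)
    # Pairwise Euclidean distances between every point in a and every point in b
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    # Worst-case nearest-neighbour distance, taken in both directions
    return max(d.min(axis=1).max(), d.min(axis=0).max())
```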

  12. Integrating geophysical data for mapping the contamination of industrial sites by polycyclic aromatic hydrocarbons: A geostatistical approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Colin, P.; Nicoletis, S.; Froidevaux, R.

    1996-12-31

    A case study is presented of building a map showing the probability that the concentration of polycyclic aromatic hydrocarbons (PAH) exceeds a critical threshold. This assessment is based on existing PAH sample data (direct information) and on an electrical resistivity survey (indirect information). Simulated annealing is used to build a model of the range of possible values for PAH concentrations and of the bivariate relationship between PAH concentrations and electrical resistivity. The geostatistical technique of simple indicator kriging is then used, together with the probabilistic model, to infer, at each node of a grid, the range of possible values which the PAH concentration can take. The risk map is then extracted from this characterization of the local uncertainty. The difference between this risk map and a traditional iso-concentration map is then discussed in terms of decision-making.
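    Simple indicator kriging estimates the probability of exceeding the critical threshold at each grid node as a weighted combination of neighbouring indicator data and the global prior probability. In this sketch the kriging weights are supplied directly rather than solved from an indicator variogram, and all values are hypothetical.

```python
import numpy as np

def indicator_transform(values, threshold):
    """Indicator coding: 1 where the concentration exceeds the threshold."""
    return (np.asarray(values, float) > threshold).astype(float)

def sik_probability(indicators, weights, global_prob):
    """Simple indicator kriging estimate of P(Z > threshold) at one node:
    weighted sum of neighbouring indicators plus the complement of the
    weights applied to the global prior (the simple-kriging mean)."""
    w = np.asarray(weights, float)
    i = np.asarray(indicators, float)
    return float(w @ i + (1.0 - w.sum()) * global_prob)
```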

  13. Seismic hazard maps for Haiti

    USGS Publications Warehouse

    Frankel, Arthur; Harmsen, Stephen; Mueller, Charles; Calais, Eric; Haase, Jennifer

    2011-01-01

    We have produced probabilistic seismic hazard maps of Haiti for peak ground acceleration and response spectral accelerations that include the hazard from the major crustal faults, subduction zones, and background earthquakes. The hazard from the Enriquillo-Plantain Garden, Septentrional, and Matheux-Neiba fault zones was estimated using fault slip rates determined from GPS measurements. The hazard from the subduction zones along the northern and southeastern coasts of Hispaniola was calculated from slip rates derived from GPS data and the overall plate motion. Hazard maps were made for a firm-rock site condition and for a grid of shallow shear-wave velocities estimated from topographic slope. The maps show substantial hazard throughout Haiti, with the highest hazard in Haiti along the Enriquillo-Plantain Garden and Septentrional fault zones. The Matheux-Neiba Fault exhibits high hazard in the maps for 2% probability of exceedance in 50 years, although its slip rate is poorly constrained.
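    Hazard levels such as "2% probability of exceedance in 50 years" translate to annual rates and mean return periods under the standard Poisson assumption, as sketched below.

```python
import math

def exceedance_probability(annual_rate, years):
    """Poisson probability of at least one exceedance in a time window."""
    return 1.0 - math.exp(-annual_rate * years)

def rate_for_probability(prob, years):
    """Annual rate whose Poisson exceedance probability over `years` is `prob`."""
    return -math.log(1.0 - prob) / years

# 2% in 50 years corresponds to a mean return period of roughly 2475 years
rp = 1.0 / rate_for_probability(0.02, 50.0)
```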

  14. XID+: Next generation XID development

    NASA Astrophysics Data System (ADS)

    Hurley, Peter

    2017-04-01

    XID+ is a prior-based source extraction tool that carries out photometry in Herschel SPIRE (Spectral and Photometric Imaging Receiver) maps at the positions of known sources. It uses a probabilistic Bayesian framework that naturally incorporates prior information, and employs the Bayesian inference tool Stan to obtain the full posterior probability distribution of the flux estimates.

  15. Statistical techniques for sampling and monitoring natural resources

    Treesearch

    Hans T. Schreuder; Richard Ernst; Hugo Ramirez-Maldonado

    2004-01-01

    We present the statistical theory of inventory and monitoring from a probabilistic point of view. We start with the basics and show the interrelationships between designs and estimators illustrating the methods with a small artificial population as well as with a mapped realistic population. For such applications, useful open source software is given in Appendix 4....

  16. Development of a novel depth of interaction PET detector using highly multiplexed G-APD cross-strip encoding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kolb, A., E-mail: armin.kolb@med.uni-tuebingen.de; Parl, C.; Liu, C. C.

    Purpose: The aim of this study was to develop a prototype PET detector module for a combined small animal positron emission tomography and magnetic resonance imaging (PET/MRI) system. The most important factor for small animal imaging applications is the detection sensitivity of the PET camera, which can be optimized by utilizing longer scintillation crystals. At the same time, small animal PET systems must yield a high spatial resolution. The measured object is very close to the PET detector because the bore diameter of a high-field animal MR scanner is limited. When used in combination with long scintillation crystals, these small-bore PET systems generate parallax errors that ultimately lead to a decreased spatial resolution. Thus, we developed a depth of interaction (DoI) encoding PET detector module that has a uniform spatial resolution across the whole field of view (FOV), high detection sensitivity, compactness, and insensitivity to magnetic fields. Methods: The approach was based on Geiger mode avalanche photodiode (G-APD) detectors with cross-strip encoding. The number of readout channels was reduced by a factor of 36 for the chosen block elements. Two 12 × 2 G-APD strip arrays (25 μm cells) were placed perpendicularly on each face of a 12 × 12 lutetium oxyorthosilicate crystal block with a crystal size of 1.55 × 1.55 × 20 mm. The strip arrays were multiplexed into two channels and used to calculate the x, y coordinates for each array and the deposited energy. The DoI was measured in step sizes of 1.8 mm by a collimated 18F source. The coincidence resolving time (CRT) was analyzed at all DoI positions by acquiring the waveform for each event and applying a digital leading edge discriminator. Results: All 144 crystals were well resolved in the crystal flood map. The average full width half maximum (FWHM) energy resolution of the detector was 12.8% ± 1.5% with a FWHM CRT of 1.14 ± 0.02 ns. 
The average FWHM DoI resolution over 12 crystals was 2.90 ± 0.15 mm. Conclusions: The novel DoI PET detector, which is based on strip G-APD arrays, yielded a DoI resolution of 2.9 mm and excellent timing and energy resolution. Its high multiplexing factor reduces the number of electronic channels. Thus, this cross-strip approach enables low-cost, high-performance PET detectors for dedicated small animal PET and PET/MRI and potentially clinical PET/MRI systems.
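    A highly simplified sketch of dual-ended cross-strip readout: the in-plane position from a charge-weighted strip centroid, and the depth of interaction from the light-sharing ratio between the two crystal faces. The linear DoI model and the strip pitch are illustrative assumptions; the actual detector is calibrated empirically.

```python
import numpy as np

def strip_centroid(strip_signals, pitch_mm=1.6):
    """Position along one axis from a multiplexed strip array:
    charge-weighted centroid of the strip signals (hypothetical pitch)."""
    s = np.asarray(strip_signals, float)
    idx = np.arange(s.size)
    return pitch_mm * (idx * s).sum() / s.sum()

def doi_from_ratio(signal_top, signal_bottom, crystal_len_mm=20.0):
    """Depth-of-interaction estimate from the light split between the two
    detector faces (simplified linear model, not the published calibration)."""
    r = signal_top / (signal_top + signal_bottom)
    return r * crystal_len_mm
```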

  17. Activity of Tachykinin1-Expressing Pet1 Raphe Neurons Modulates the Respiratory Chemoreflex.

    PubMed

    Hennessy, Morgan L; Corcoran, Andrea E; Brust, Rachael D; Chang, YoonJeung; Nattie, Eugene E; Dymecki, Susan M

    2017-02-15

    Homeostatic control of breathing, heart rate, and body temperature relies on circuits within the brainstem modulated by the neurotransmitter serotonin (5-HT). Mounting evidence points to specialized neuronal subtypes within the serotonergic neuronal system, borne out in functional studies, for the modulation of distinct facets of homeostasis. Such functional differences, read out at the organismal level, are likely subserved by differences among 5-HT neuron subtypes at the cellular and molecular levels, including differences in the capacity to coexpress other neurotransmitters such as glutamate, GABA, thyrotropin releasing hormone, and substance P, encoded by the Tachykinin-1 (Tac1) gene. Here, we characterize in mice a 5-HT neuron subtype identified by expression of Tac1 and the serotonergic transcription factor gene Pet1, referred to as the Tac1-Pet1 neuron subtype. Transgenic cell labeling showed Tac1-Pet1 somata residing largely in the caudal medulla. Chemogenetic [clozapine-N-oxide (CNO)-hM4Di] perturbation of Tac1-Pet1 neuron activity blunted the ventilatory response of the respiratory CO2 chemoreflex, which normally augments ventilation in response to hypercapnic acidosis to restore normal pH and PCO2. Tac1-Pet1 axonal boutons were found localized to brainstem areas implicated in respiratory modulation, with the highest density in motor regions. These findings demonstrate that the activity of a Pet1 neuron subtype with the potential to release both 5-HT and substance P is necessary for normal respiratory dynamics, perhaps via motor outputs that engage muscles of respiration and maintain airway patency. These Tac1-Pet1 neurons may act downstream of Egr2-Pet1 serotonergic neurons, which were previously established in respiratory chemoreception but do not innervate respiratory motor nuclei. SIGNIFICANCE STATEMENT Serotonin (5-HT) neurons modulate physiological processes and behaviors as diverse as body temperature, respiration, aggression, and mood. 
Using genetic tools, we characterize a 5-HT neuron subtype defined by expression of Tachykinin1 and Pet1 (Tac1-Pet1 neurons), mapping soma localization primarily to the caudal medulla and axonal projections most prominently to brainstem motor nuclei; when these neurons were silenced, we observed blunting of the ventilatory response to inhaled CO2. Tac1-Pet1 neurons thus appear distinct from previously described Egr2-Pet1 neurons, which project primarily to chemosensory integration centers and are themselves chemosensitive. Copyright © 2017 the authors.

  18. Construction and comparative evaluation of different activity detection methods in brain FDG-PET.

    PubMed

    Buchholz, Hans-Georg; Wenzel, Fabian; Gartenschläger, Martin; Thiele, Frank; Young, Stewart; Reuss, Stefan; Schreckenberger, Mathias

    2015-08-18

    We constructed and evaluated reference brain FDG-PET databases for use with three software programs (Computer-aided diagnosis for dementia (CAD4D), Statistical Parametric Mapping (SPM) and NEUROSTAT), which allow user-independent detection of dementia-related hypometabolism in patients' brain FDG-PET. Thirty-seven healthy volunteers were scanned in order to construct brain FDG reference databases, which reflect the normal, age-dependent glucose consumption of the human brain, with each software package. The databases were compared to each other to assess the impact of the different stereotactic normalization algorithms used by the software packages. In addition, the performance of the new reference databases in detecting altered glucose consumption in patients' brains was evaluated by calculating statistical maps of regional hypometabolism in FDG-PET of 20 patients with confirmed Alzheimer's dementia (AD) and of 10 non-AD patients. The extent (hypometabolic volume, referred to as cluster size) and magnitude (peak z-score) of detected hypometabolism were statistically analyzed. Differences between the reference databases built by CAD4D, SPM or NEUROSTAT were observed. Due to the different normalization methods, altered spatial FDG patterns were found. When analyzing patient data with the reference databases created using CAD4D, SPM or NEUROSTAT, similar characteristic clusters of hypometabolism were found in the same brain regions in the AD group with each software package. However, larger z-scores were observed with CAD4D and NEUROSTAT than those reported by SPM. Better concordance with CAD4D and NEUROSTAT was achieved using the spatially normalized images of SPM and an independent z-score calculation. The three software packages identified the peak z-scores in the same brain region in 11 of 20 AD cases, and there was concordance between CAD4D and SPM in 16 AD subjects. 
The clinical evaluation of brain FDG-PET of 20 AD patients with either CAD4D-, SPM- or NEUROSTAT-generated databases from an identical reference dataset showed similar patterns of hypometabolism in the brain regions known to be involved in AD. The extent of hypometabolism and peak z-score appeared to be influenced by the calculation method used in each software package rather than by different spatial normalization parameters.
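    The statistical maps above compare a spatially normalized patient scan against voxel-wise mean and standard-deviation images from the reference database. A generic sketch, with the sign convention that positive z marks hypometabolism (each package's exact intensity normalization differs):

```python
import numpy as np

def zscore_map(patient, ref_mean, ref_std, eps=1e-6):
    """Voxel-wise z-scores of a spatially normalized patient FDG-PET
    against a healthy-control reference database. Positive values indicate
    uptake below the reference mean (hypometabolism)."""
    patient = np.asarray(patient, float)
    return (ref_mean - patient) / np.maximum(ref_std, eps)  # eps guards division
```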

  19. Use of the FLUKA Monte Carlo code for 3D patient-specific dosimetry on PET-CT and SPECT-CT images*

    PubMed Central

    Botta, F; Mairani, A; Hobbs, R F; Vergara Gil, A; Pacilio, M; Parodi, K; Cremonesi, M; Coca Pérez, M A; Di Dia, A; Ferrari, M; Guerriero, F; Battistoni, G; Pedroli, G; Paganelli, G; Torres Aroche, L A; Sgouros, G

    2014-01-01

    Patient-specific absorbed dose calculation for nuclear medicine therapy is a topic of increasing interest. 3D dosimetry at the voxel level is one of the major improvements for the development of more accurate calculation techniques, as compared to the standard dosimetry at the organ level. This study aims to use the FLUKA Monte Carlo code to perform patient-specific 3D dosimetry through direct Monte Carlo simulation on PET-CT and SPECT-CT images. To this aim, dedicated routines were developed in the FLUKA environment. Two sets of simulations were performed on model and phantom images. Firstly, the correct handling of PET and SPECT images was tested under the assumption of homogeneous water medium by comparing FLUKA results with those obtained with the voxel kernel convolution method and with other Monte Carlo-based tools developed for the same purpose (the EGS-based 3D-RD software and the MCNP5-based MCID). Afterwards, the correct integration of the PET/SPECT and CT information was tested, performing direct simulations on PET/CT images for both homogeneous (water) and non-homogeneous (water with air, lung and bone inserts) phantoms. Comparison was performed with the other Monte Carlo tools performing direct simulation as well. The absorbed dose maps were compared at the voxel level. In the case of homogeneous water, by simulating 10^8 primary particles a 2% average difference with respect to the kernel convolution method was achieved; such difference was lower than the statistical uncertainty affecting the FLUKA results. The agreement with the other tools was within 3–4%, partially ascribable to the differences among the simulation algorithms. Including the CT-based density map, the average difference was always within 4% irrespective of the medium (water, air, bone), except for a maximum 6% value when comparing FLUKA and 3D-RD in air. 
The results confirmed that the routines were properly developed, opening the way for the use of FLUKA for patient-specific, image-based dosimetry in nuclear medicine. PMID:24200697
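    Voxel-level comparison of two absorbed-dose maps, as done above between FLUKA and the other tools, can be summarized by a mean absolute relative difference over the dosed region; this is a generic sketch with hypothetical values, not the study's exact figure of merit.

```python
import numpy as np

def percent_difference(dose_a, dose_b, mask=None):
    """Mean absolute voxel-wise relative difference (%) between two
    absorbed-dose maps, restricted to a mask (default: voxels with
    non-zero reference dose)."""
    a = np.asarray(dose_a, float)
    b = np.asarray(dose_b, float)
    if mask is None:
        mask = b > 0  # avoid division by zero outside the dosed region
    rel = 100.0 * (a[mask] - b[mask]) / b[mask]
    return float(np.abs(rel).mean())
```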

  20. Fatal mechanical asphyxia induces changes in energy utilization in the rat brain: An (18)F-FDG-PET study.

    PubMed

    Ma, Suhua; You, Shengzhong; Hao, Li; Zhang, Dongchuan; Quan, Li

    2015-07-01

    This study was designed to evaluate changes in brain glucose metabolism in rats following ligature strangulation. Thirteen male Wistar rats were used in the present study, divided into a control group (n=7) and an asphyxia group (n=6, ligature strangulation). Positron emission tomography (PET) with 2-deoxy-2-[(18)F]fluoro-D-glucose ((18)F-FDG) was used to evaluate brain glucose metabolism. Rats underwent PET-CT scanning, and the image data were co-registered with a T2WI MRI template using SPM8 software. ImageJ was employed to draw regions of interest (ROIs) from the MRI template and acquire ROI activity information from the PET images. In the asphyxia group vs. controls, (18)F-FDG uptake (FU) was decreased in the substantia nigra (25.26%, p<0.001), rhombencephalon (pons/medulla oblongata, 13.92%, p<0.01), hypothalamus (22.06%, p<0.01), ventral tegmentum (10.12%, p<0.05) and amygdala (12.74%, p<0.05); however, FU was increased in the motor (18.21%, p<0.05) and visual cortices (19.2%, p<0.05). The glucose metabolism distribution maps of the asphyxiated rat brains were substantially changed versus controls. PET with (18)F-FDG can demonstrate excitement and inhibition of different brain areas even in cases of ligature strangulation. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  1. Assessment of cardiac sympathetic neuronal function using PET imaging.

    PubMed

    Bengel, Frank M; Schwaiger, Markus

    2004-01-01

    The autonomic nervous system plays a key role for regulation of cardiac performance, and the importance of alterations of innervation in the pathophysiology of various heart diseases has been increasingly emphasized. Nuclear imaging techniques have been established that allow for global and regional investigation of the myocardial nervous system. The guanethidine analog iodine 123 metaiodobenzylguanidine (MIBG) has been introduced for scintigraphic mapping of presynaptic sympathetic innervation and is available today for imaging on a broad clinical basis. Not much later than MIBG, positron emission tomography (PET) has also been established for characterizing the cardiac autonomic nervous system. Although PET is methodologically demanding and less widely available, it provides substantial advantages. High spatial and temporal resolution along with routinely available attenuation correction allows for detailed definition of tracer kinetics and makes noninvasive absolute quantification a reality. Furthermore, a series of different radiolabeled catecholamines, catecholamine analogs, and receptor ligands are available. Those are often more physiologic than MIBG and well understood with regard to their tracer physiologic properties. PET imaging of sympathetic neuronal function has been successfully applied to gain mechanistic insights into myocardial biology and pathology. Available tracers allow dissection of processes of presynaptic and postsynaptic innervation contributing to cardiovascular disease. This review summarizes characteristics of currently available PET tracers for cardiac neuroimaging along with the major findings derived from their application in health and disease.

  2. SU-G-IeP4-13: PET Image Noise Variability and Its Consequences for Quantifying Tumor Hypoxia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kueng, R; Radiation Medicine Program, Princess Margaret Cancer Centre, University Health Network, Toronto, Ontario; Manser, P

    Purpose: The values in a PET image, which represent activity concentrations of a radioactive tracer, are influenced by a large number of parameters including patient conditions as well as image acquisition and reconstruction. This work investigates noise characteristics in PET images for various image acquisition and image reconstruction parameters. Methods: Different phantoms with homogeneous activity distributions were scanned using several acquisition parameters and reconstructed with numerous sets of reconstruction parameters. Images from six PET scanners from different vendors were analyzed and compared with respect to quantitative noise characteristics. Local noise metrics, which give rise to a threshold value defining the metric of hypoxic fraction, as well as global noise measures in terms of noise power spectra (NPS), were computed. In addition to variability due to different reconstruction parameters, the spatial variability of the activity distribution and its noise metrics was investigated. Patient data from clinical trials were mapped onto phantom scans to explore the impact of the scanner's intrinsic noise variability on quantitative clinical analysis. Results: Local noise metrics showed substantial variability, up to an order of magnitude, for different reconstruction parameters. Investigation of the corresponding NPS revealed reconstruction-dependent structural noise characteristics. For the acquisition parameters, noise metrics were governed by Poisson statistics. Large spatial non-uniformity of the noise was observed in both the axial and radial directions of a PET image. In addition, activity concentrations in PET images of homogeneous phantom scans showed intriguing spatial fluctuations for most scanners. The clinical metric of the hypoxic fraction was shown to be considerably influenced by the PET scanner's spatial noise characteristics.
    Conclusion: We showed that a hypoxic fraction metric based on noise characteristics requires careful consideration of the various dependencies in order to justify its quantitative validity. This work may result in recommendations for harmonizing QA of PET imaging for multi-institutional clinical trials.
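    The threshold-based hypoxic-fraction metric discussed above can be sketched as follows. The mean + k·SD threshold, the default k = 3, and all names are illustrative assumptions, not the trial's actual definition:

```python
import numpy as np

def hypoxic_fraction(tumor_voxels, bg_mean, bg_sd, k=3.0):
    """Fraction of tumor voxels above a noise-derived threshold.

    The threshold bg_mean + k * bg_sd is an illustrative choice;
    a real analysis would define it per protocol."""
    threshold = bg_mean + k * bg_sd
    return float(np.mean(np.asarray(tumor_voxels) > threshold))

rng = np.random.default_rng(0)
# homogeneous phantom region: noise around a uniform activity of 100
background = rng.normal(100.0, 5.0, size=10_000)
mu, sd = background.mean(), background.std()

# simulated tumor: 20% of voxels genuinely elevated
tumor = np.concatenate([rng.normal(100.0, 5.0, 8_000),
                        rng.normal(140.0, 5.0, 2_000)])
hf = hypoxic_fraction(tumor, mu, sd)
```

    Because the threshold inherits the background noise level, a change in reconstruction noise shifts the threshold and therefore the measured hypoxic fraction, which is exactly the dependence the abstract warns about.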

  3. Molecular imaging of malignant tumor metabolism: whole-body image fusion of DWI/CT vs. PET/CT.

    PubMed

    Reiner, Caecilia S; Fischer, Michael A; Hany, Thomas; Stolzmann, Paul; Nanz, Daniel; Donati, Olivio F; Weishaupt, Dominik; von Schulthess, Gustav K; Scheffel, Hans

    2011-08-01

    To prospectively investigate the technical feasibility and performance of image fusion of whole-body diffusion-weighted imaging (wbDWI) and computed tomography (CT) for detecting metastases, using hybrid positron emission tomography/computed tomography (PET/CT) as the reference standard. Fifty-two patients (60 ± 14 years; 18 women) with various malignant tumors examined by PET/CT for clinical reasons consented to undergo additional wbDWI at 1.5 Tesla. WbDWI was performed using diffusion-weighted single-shot echo-planar imaging during free breathing. Images at b = 0 s/mm(2) and b = 700 s/mm(2) were acquired and apparent diffusion coefficient (ADC) maps were generated. Image fusion of wbDWI and CT (from the PET/CT scan) was performed, yielding wbDWI/CT fused image data. One radiologist rated the success of image fusion and diagnostic image quality. The presence or absence of metastases on wbDWI/CT fused images was evaluated, together with the separate wbDWI and CT images, by two independent radiologists blinded to the PET/CT results. The detection rate and positive predictive value for diagnosing metastases were calculated, with the PET/CT examinations as the reference standard. PET/CT identified 305 malignant lesions in 39 of 52 (75%) patients. WbDWI/CT image fusion was technically successful and yielded diagnostic image quality in 73% and 92% of patients, respectively. Interobserver agreement for the evaluation of wbDWI/CT images was κ = 0.78. WbDWI/CT identified 270 metastases in 43 of 52 (83%) patients. The overall detection rate and positive predictive value of wbDWI/CT were 89% (95% CI, 0.85-0.92) and 94% (95% CI, 0.92-0.97), respectively. WbDWI/CT image fusion is technically feasible in a clinical setting and allows the diagnostic assessment of metastatic tumor disease, detecting nine of ten lesions identified by PET/CT. Copyright © 2011 AUR. Published by Elsevier Inc. All rights reserved.
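    The ADC maps mentioned above follow from the standard mono-exponential diffusion model, S_b = S_0 · exp(-b · ADC), so the two acquisitions (b = 0 and b = 700 s/mm(2)) determine ADC voxel-wise. A minimal sketch assuming that model (function and variable names are illustrative):

```python
import numpy as np

def adc_map(s0, sb, b=700.0, eps=1e-6):
    """Voxel-wise apparent diffusion coefficient (mm^2/s) from the
    mono-exponential model S_b = S_0 * exp(-b * ADC)."""
    s0 = np.maximum(np.asarray(s0, float), eps)   # guard against /0
    sb = np.clip(np.asarray(sb, float), eps, None)
    return np.log(s0 / sb) / b

# a voxel with an assumed true ADC of 1.0e-3 mm^2/s
s0, true_adc, b = 1000.0, 1.0e-3, 700.0
sb = s0 * np.exp(-b * true_adc)   # simulated b = 700 signal
adc = adc_map(s0, sb, b)          # recovers true_adc
```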

  4. Ontological security and connectivity provided by pets: a study in the self-management of the everyday lives of people diagnosed with a long-term mental health condition.

    PubMed

    Brooks, Helen; Rushton, Kelly; Walker, Sandra; Lovell, Karina; Rogers, Anne

    2016-12-09

    Despite evidence that connecting people to relevant wellbeing-related resources brings therapeutic benefit, there is limited understanding, in the context of mental health recovery, of the potential value and contribution of pet ownership to personal support networks for self-management. This study aimed to explore the role of pets in the support and management activities within the personal networks of people with long-term mental health problems. Semi-structured interviews centred on 'ego' network mapping were conducted in two locations (in the North West and in the South of England) with 54 participants with a diagnosis of a long-term mental health problem. Interviews explored the day-to-day experience of living with a mental illness, informed by the notion of illness work undertaken by social network members within personal networks. Narratives were elicited that explored the relationship, value, utility and meaning of pets in the context of the social support and management provided by other network members. Interviews were recorded and transcribed verbatim before being analysed using a framework analysis. The majority of pets were placed in the central, most valued circle of support within the network diagrams. Pets were implicated in relational work through the provision of secure and intimate relationships not available elsewhere. Pets constituted a valuable source of illness work, managing feelings through distraction from symptoms and upsetting experiences, and provided a form of encouragement for activity. Pets were of enhanced salience where relationships with other network members were limited or difficult. Despite these benefits, pets were unanimously neither considered nor incorporated into individual mental health care plans.
    Drawing on a conceptual framework built on Corbin and Strauss's notion of illness 'work', and on the notion of a personal workforce of support operating within individuals' whole networks, this study contributes to our understanding of the role of pets in the daily management of long-term mental health problems. Pets should be considered a main rather than a marginal source of support in the management of long-term mental health problems, and this has implications for the planning and delivery of mental health services.

  5. Striatal dopaminergic modulation of reinforcement learning predicts reward-oriented behavior in daily life.

    PubMed

    Kasanova, Zuzana; Ceccarini, Jenny; Frank, Michael J; Amelsvoort, Thérèse van; Booij, Jan; Heinzel, Alexander; Mottaghy, Felix; Myin-Germeys, Inez

    2017-07-01

    Much human behavior is driven by rewards. Preclinical neurophysiological and clinical positron emission tomography (PET) studies have implicated striatal phasic dopamine (DA) release as a primary modulator of reward processing. However, the relationship between experimental reward-induced striatal DA release and responsiveness to naturalistic rewards, and therefore the functional relevance of these findings, has been elusive. We therefore combined, for the first time, DA D2/3 receptor [18F]fallypride PET during a probabilistic reinforcement learning (RL) task with six days of ecological momentary assessment (EMA) of reward-related behavior in the everyday life of 16 healthy volunteers. We detected significant reward-induced DA release in the bilateral putamen, caudate nucleus and ventral striatum, the extent of which was associated with better behavioral performance on the RL task across all regions. Furthermore, individual variability in the extent of reward-induced DA release in the right caudate nucleus and ventral striatum modulated the tendency to remain actively engaged in a behavior if the engagement had previously been deemed enjoyable. This study suggests a link between striatal reward-related DA release and ecologically relevant reward-oriented behavior, opening an avenue for inquiry into the DAergic basis of optimal and impaired motivational drive. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Verbal fluency and positron emission tomographic mapping of regional cerebral glucose metabolism.

    PubMed

    Boivin, M J; Giordani, B; Berent, S; Amato, D A; Lehtinen, S; Koeppe, R A; Buchtel, H A; Foster, N L; Kuhl, D E

    1992-06-01

    Impairment in verbal fluency (VF) has been a consistently reported clinical feature of focal cerebral deficits in frontal and temporal regions. More recent behavioral activation studies of healthy control subjects using positron emission tomography (PET), however, have noted a negative correlation between performance on verbal fluency tasks and regional cortical activity. To see whether this negative relationship extends to steady-state, non-activation PET measures, thirty-three healthy adults were given a VF task within a day of their 18F-2-fluoro-2-deoxy-D-glucose PET scan. VF was found to correlate positively with metabolic activity in the left temporal cortical region but negatively with right and left frontal activity. VF was not correlated significantly with right temporal cortical metabolic activity. Some previous studies of healthy subjects using behavioral activation paradigms and PET have reported negative correlations between metabolic activity and cognitive performance similar to those reported here. An explanation for the disparate relationships observed between frontal and temporal brain areas and VF might be found in the mediation of different task demands by these separate locations, i.e., task planning and/or initiation by frontal regions and verbal memory by the left temporal area.

  7. Comparison of the Cardiac MicroPET Images Obtained Using [(18)F]FPTP and [(13)N]NH3 in Rat Myocardial Infarction Models.

    PubMed

    Kim, Dong-Yeon; Kim, Hyeon Sik; Jang, Hwa Youn; Kim, Ju Han; Bom, Hee-Seung; Min, Jung-Joon

    2014-10-09

    The short half-life of current positron emission tomography (PET) cardiac tracers limits their widespread clinical use. We previously developed a (18)F-labeled phosphonium cation, [(18)F]FPTP, that demonstrated sharply defined myocardial defects in a corresponding infarcted myocardium. The aim of this study was to compare the image properties of PET scans obtained using [(18)F]FPTP with those obtained using [(13)N]NH3 in rat myocardial infarction models. Perfusion abnormality was analyzed in 17 segments of polar map images. The myocardium-to-liver and myocardium-to-lung ratios of [(18)F]FPTP were 10.48 and 2.65 times higher, respectively, than those of [(13)N]NH3 in images acquired 30 min after tracer injection. The myocardial defect size measured by [(18)F]FPTP correlated more closely with the hypoperfused area measured by quantitative 2,3,5-triphenyltetrazolium chloride staining (r = 0.89, P < 0.01) than did [(13)N]NH3 (r = 0.84, P < 0.01). [(18)F]FPTP might be useful as a replacement for the myocardial agent [(13)N]NH3 in cardiac PET/CT applications.

  8. Integrated PET/MR breast cancer imaging: Attenuation correction and implementation of a 16-channel RF coil

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oehmigen, Mark, E-mail: mark.oehmigen@uni-due.de

    Purpose: This study aims to develop, implement, and evaluate a 16-channel radiofrequency (RF) coil for integrated positron emission tomography/magnetic resonance (PET/MR) imaging of breast cancer. The RF coil is designed for optimized MR imaging performance and PET transparency, and attenuation correction (AC) is applied for accurate PET quantification. Methods: A 16-channel breast array RF coil was designed for integrated PET/MR hybrid imaging of breast cancer lesions. The RF coil features a lightweight rigid design and is positioned with a spacer at a defined position on the patient table of an integrated PET/MR system. Attenuation correction is performed by generating and applying a dedicated 3D CT-based template attenuation map. Repositioning accuracy of the RF coil on the system patient table while using the positioning frame was tested in repeated measurements using MR-visible markers. The MR, PET, and PET/MR imaging performance was systematically evaluated using modular breast phantoms. Attenuation correction of the RF coil was evaluated with difference measurements of the active breast phantoms filled with radiotracer in the PET detector with and without the RF coil in place, serving as a standard of reference. The overall PET/MR imaging performance and PET quantification accuracy of the new 16-channel RF coil and its AC were then evaluated in initial clinical examinations of ten patients with local breast cancer. Results: The RF breast array coil provides excellent signal-to-noise ratio and signal homogeneity across the volume of the breast phantoms in MR imaging and visualizes small structures in the phantoms down to 0.4 mm in plane. Difference measurements with PET revealed a global loss, and thus attenuation, of counts by 13% (mean value across the whole phantom volume) when the RF coil is placed in the PET detector.
    Local attenuation ranging from 0% in the middle of the phantoms up to 24% was detected in the peripheral regions of the phantoms at positions closer to attenuating hardware structures of the RF coil. The positioning accuracy of the RF coil on the patient table when using the positioning frame was determined to be well below 1 mm in all three spatial dimensions. This ensures an accurate position match between the RF coil and its three-dimensional attenuation template during the PET data reconstruction process. When applying the CT-based AC of the RF coil, the global attenuation bias was compensated to within ±0.5% across the entire breast imaging volume. The patient study yielded high-quality MR, PET, and combined PET/MR imaging of breast cancer. Quantitative activity measurements in all 11 breast cancer lesions of the ten patients showed a mean difference in SUVmax of 11.8% (minimum 3.2%; maximum 23.2%) between non-AC images and images with AC of the RF breast coil applied. This supports the quantitative results of the phantom study as well as successful attenuation correction of the RF coil. Conclusions: A 16-channel breast RF coil was designed for optimized MR imaging performance and PET transparency and was successfully integrated, with its dedicated attenuation correction template, into a whole-body PET/MR system. Systematic PET/MR imaging evaluation with phantoms and an initial study on patients with breast cancer provided excellent MR and PET image quality and accurate PET quantification.
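    The reported global ~13% count loss and its correction can be illustrated with a voxel-wise relative-change computation. This is a simplified sketch; the exact formula used in the study is not reproduced here, and all names are assumptions:

```python
import numpy as np

def relative_change(corrected, uncorrected):
    """Voxel-wise relative change (%) between attenuation-corrected
    and uncorrected activity images."""
    corrected = np.asarray(corrected, float)
    uncorrected = np.asarray(uncorrected, float)
    return 100.0 * (corrected - uncorrected) / corrected

# the coil attenuates counts by ~13% globally (phantom study figure)
true_activity = np.full(1000, 10.0)      # uniform phantom activity
measured = 0.87 * true_activity          # uncorrected reconstruction
rc = relative_change(true_activity, measured)
```

    Applying the coil's attenuation template in the reconstruction drives this relative change toward zero, which is what the ±0.5% residual bias above quantifies.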

  9. Probabilistic biological network alignment.

    PubMed

    Todor, Andrei; Dobra, Alin; Kahveci, Tamer

    2013-01-01

    Interactions between molecules are probabilistic events. An interaction may or may not happen with some probability, depending on a variety of factors such as the size, abundance, or proximity of the interacting molecules. In this paper, we consider the problem of aligning two biological networks. Unlike existing methods, we allow one of the two networks to contain probabilistic interactions. Allowing interaction probabilities makes the alignment more biologically relevant at the expense of explosive growth in the number of alternative topologies that may arise from different subsets of interactions that take place. We develop a novel method that efficiently and precisely characterizes this massive search space. We represent the topological similarity between pairs of aligned molecules (i.e., proteins) with the help of random variables and compute their expected values. We validate our method by showing that, without sacrificing running time performance, it can produce novel alignments. Our results also demonstrate that our method identifies biologically meaningful mappings under a comprehensive set of criteria used in the literature as well as the statistical coherence measure that we developed to analyze the statistical significance of the similarity of the functions of the aligned protein pairs.
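    A simplified view of the expected-value idea: if the probabilistic interactions are treated as independent, linearity of expectation reduces the expected number of conserved edges under a fixed alignment to a simple sum. The paper's actual similarity model is richer; the networks and names below are hypothetical:

```python
def expected_conserved_edges(prob_edges, g2_edges, mapping):
    """Expected number of conserved interactions under an alignment.

    prob_edges: {(u, v): probability} for the probabilistic network G1
    g2_edges:   set of frozensets, the deterministic network G2
    mapping:    node of G1 -> node of G2
    By linearity of expectation this is the sum of edge probabilities
    over G1 pairs whose images are connected in G2."""
    total = 0.0
    for (u, v), p in prob_edges.items():
        if frozenset((mapping[u], mapping[v])) in g2_edges:
            total += p
    return total

# toy example: edge (a, c) has no counterpart under the mapping
g1 = {("a", "b"): 0.9, ("b", "c"): 0.4, ("a", "c"): 0.5}
g2 = {frozenset(("A", "B")), frozenset(("B", "C"))}
mapping = {"a": "A", "b": "B", "c": "C"}
score = expected_conserved_edges(g1, g2, mapping)  # 0.9 + 0.4
```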

  10. Three-dimensional Probabilistic Earthquake Location Applied to 2002-2003 Mt. Etna Eruption

    NASA Astrophysics Data System (ADS)

    Mostaccio, A.; Tuve', T.; Zuccarello, L.; Patane', D.; Saccorotti, G.; D'Agostino, M.

    2005-12-01

    Seismicity recorded at Mt. Etna volcano during the 2002-2003 eruption has been relocated using a probabilistic, non-linear earthquake location approach. We used the software package NonLinLoc (Lomax et al., 2000), adopting the 3D velocity model obtained by Cocina et al., 2005. We processed our data with three different algorithms: (1) grid search; (2) Metropolis-Gibbs sampling; and (3) Oct-tree search. The Oct-tree algorithm gives an efficient, fast, and accurate mapping of the PDF (probability density function) of the earthquake location problem. More than 300 seismic events were analyzed in order to compare the non-linear location results with those obtained using a traditional, linearized earthquake location algorithm (Hypoellipse) and a 3D linearized inversion (Thurber, 1983). Moreover, we compare 38 focal mechanisms, chosen following strict selection criteria, with those obtained from the 3D and 1D results. Although the approach presented here is mostly a traditional relocation application, probabilistic earthquake location could also be used in routine monitoring.
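    The Oct-tree idea (recursively subdividing the cell that carries the most probability mass) can be sketched as follows. This is a minimal illustration of a NonLinLoc-style search, not its actual implementation; the Gaussian location PDF and all names are assumptions:

```python
import heapq
import itertools
import math

def oct_tree_search(pdf, lo, hi, n_iter=400):
    """Greedy oct-tree exploration of a 3-D PDF: repeatedly split the
    cell with the largest pdf(center) * volume into its 8 octants and
    return the highest-pdf center evaluated."""
    counter = itertools.count()   # tie-breaker so the heap never compares lists
    best_p, best_c = -1.0, None

    def push(heap, clo, chi):
        nonlocal best_p, best_c
        c = [(a + b) / 2 for a, b in zip(clo, chi)]
        p = pdf(c)
        if p > best_p:
            best_p, best_c = p, c
        vol = math.prod(b - a for a, b in zip(clo, chi))
        heapq.heappush(heap, (-p * vol, next(counter), clo, chi))

    heap = []
    push(heap, lo, hi)
    for _ in range(n_iter):
        _, _, clo, chi = heapq.heappop(heap)
        mid = [(a + b) / 2 for a, b in zip(clo, chi)]
        # split the popped cell into its 8 octants
        for octant in itertools.product(*[[(a, m), (m, b)]
                                          for a, m, b in zip(clo, mid, chi)]):
            push(heap, [s[0] for s in octant], [s[1] for s in octant])
    return best_c

# Gaussian "location PDF" peaked at an assumed hypocenter (2, -1, 4)
true_xyz = (2.0, -1.0, 4.0)
pdf = lambda p: math.exp(-sum((a - b) ** 2 for a, b in zip(p, true_xyz)))
est = oct_tree_search(pdf, [-10.0, -10.0, 0.0], [10.0, 10.0, 20.0])
```

    The heap concentrates refinement where the PDF mass is, which is why the method maps the PDF faster than an exhaustive grid search of the same resolution.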

  11. Probabilistic Common Spatial Patterns for Multichannel EEG Analysis

    PubMed Central

    Chen, Zhe; Gao, Xiaorong; Li, Yuanqing; Brown, Emery N.; Gao, Shangkai

    2015-01-01

    Common spatial patterns (CSP) is a well-known spatial filtering algorithm for multichannel electroencephalogram (EEG) analysis. In this paper, we cast the CSP algorithm in a probabilistic modeling setting. Specifically, probabilistic CSP (P-CSP) is proposed as a generic EEG spatio-temporal modeling framework that subsumes the CSP and regularized CSP algorithms. The proposed framework enables us to resolve the overfitting issue of CSP in a principled manner. We derive statistical inference algorithms that can alleviate the issue of local optima. In particular, an efficient algorithm based on eigendecomposition is developed for maximum a posteriori (MAP) estimation in the case of isotropic noise. For more general cases, a variational algorithm is developed for group-wise sparse Bayesian learning for the P-CSP model and for automatically determining the model size. The two proposed algorithms are validated on a simulated data set. Their practical efficacy is also demonstrated by successful applications to single-trial classifications of three motor imagery EEG data sets and by the spatio-temporal pattern analysis of one EEG data set recorded in a Stroop color naming task. PMID:26005228
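    For reference, the classic (non-probabilistic) CSP that the P-CSP framework subsumes can be computed by whitening the composite covariance and eigendecomposing the class-1 covariance in the whitened space. A minimal numpy sketch on toy covariances, not the paper's P-CSP algorithm:

```python
import numpy as np

def csp_filters(c1, c2):
    """Classic CSP via whitening plus eigendecomposition.

    Returns spatial filters W (as rows, sorted by decreasing
    eigenvalue) such that W @ c1 @ W.T is diagonal and
    W @ (c1 + c2) @ W.T is the identity; the first filter maximizes
    class-1 variance relative to the composite covariance."""
    evals, evecs = np.linalg.eigh(c1 + c2)
    p = evecs @ np.diag(evals ** -0.5) @ evecs.T   # whitening transform
    s1 = p @ c1 @ p.T
    d, b = np.linalg.eigh(s1)                      # ascending eigenvalues
    order = np.argsort(d)[::-1]
    return (b[:, order].T @ p), d[order]

# toy class covariances: class 1 strong on channel 0, class 2 on channel 1
c1 = np.diag([4.0, 1.0, 1.0])
c2 = np.diag([1.0, 4.0, 1.0])
w, lam = csp_filters(c1, c2)   # first filter weights channel 0 most
```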

  12. Application of a time probabilistic approach to seismic landslide hazard estimates in Iran

    NASA Astrophysics Data System (ADS)

    Rajabi, A. M.; Del Gaudio, V.; Capolongo, D.; Khamehchiyan, M.; Mahdavifar, M. R.

    2009-04-01

    Iran is a country located in a tectonically active belt and is prone to earthquakes and related phenomena. In recent years, several earthquakes have caused many fatalities and much damage to facilities, e.g. the Manjil (1990), Avaj (2002), Bam (2003) and Firuzabad-e-Kojur (2004) earthquakes. These earthquakes generated many landslides. For instance, catastrophic landslides triggered by the Manjil earthquake (Ms = 7.7) in 1990 buried the village of Fatalak, killed more than 130 people, and cut many important roads and other lifelines, resulting in major economic disruption. In general, earthquakes in Iran have been concentrated in two major zones with different seismicity characteristics: one is the region of Alborz and Central Iran and the other is the Zagros Orogenic Belt. Understanding where seismically induced landslides are most likely to occur is crucial in reducing property damage and loss of life in future earthquakes. For this purpose, a time-probabilistic approach for earthquake-induced landslide hazard at regional scale, proposed by Del Gaudio et al. (2003), has been applied to the whole Iranian territory to provide the basis of hazard estimates. This method consists of evaluating the recurrence of seismically induced slope failure conditions inferred from Newmark's model. First, by adopting Arias intensity to quantify seismic shaking and using different Arias attenuation relations for Alborz - Central Iran and Zagros, well-established methods of seismic hazard assessment, based on the Cornell (1968) method, were employed to obtain the occurrence probabilities for different levels of seismic shaking in a time interval of interest (50 years). Then, following Jibson (1998), empirical formulae specifically developed for Alborz - Central Iran and Zagros were used to represent, according to Newmark's model, the relation linking Newmark's displacement Dn to Arias intensity Ia and to the slope critical acceleration ac.
    These formulae were employed to evaluate the slope critical acceleration (Ac)x for which a prefixed probability exists that seismic shaking would result in a Dn value equal to a threshold x whose exceedance would cause landslide triggering. The obtained ac values represent the minimum slope resistance required to keep the probability of seismic landslide triggering within the prefixed value. In particular, we calculated the spatial distribution of (Ac)x for x thresholds of 10 and 2 cm, representing triggering conditions for coherent slides (e.g., slumps, block slides, slow earth flows) and disrupted slides (e.g., rock falls, rock slides, rock avalanches), respectively. We then produced a probabilistic national map showing the spatial distribution of (Ac)10 and (Ac)2 for a 10% probability of exceedance in 50 years, a hazard level equal to that commonly used for building codes. The spatial distribution of the calculated (Ac)x values can be compared with the actual in situ ac values of specific slopes to estimate whether these slopes have a significant probability of failing under seismic action in the future. As an example of a possible application of this kind of time-probabilistic map to hazard estimates, we compared the values obtained for the Manjil region with a GIS map providing the spatial distribution of estimated ac values in the same region. The spatial distribution of slopes characterized by ac < (Ac)10 was then compared with the spatial distribution of the major landslides of coherent type triggered by the Manjil earthquake. This comparison provides indications on the potential, problems, and limits of the tested approach for the study area. References: Cornell, C.A., 1968: Engineering seismic risk analysis, Bull. Seism. Soc. Am., 58, 1583-1606. Del Gaudio, V., Wasowski, J., & Pierri, P., 2003: An approach to time-probabilistic evaluation of seismically-induced landslide hazard, Bull. Seism. Soc. Am., 93, 557-569. Jibson, R.W., Harp, E.L., & Michael, J.A., 1998: A method for producing digital probabilistic seismic landslide hazard maps: an example from the Los Angeles, California, area, U.S. Geological Survey Open-File Report 98-113, Golden, Colorado, 17 pp.
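    As an illustration of the Newmark screening step, the widely used Jibson et al. (1998) regression, log10(Dn) = 1.521 log10(Ia) - 1.993 log10(ac) - 1.546 (Dn in cm, Ia in m/s, ac in g), can be inverted to give the critical acceleration at a displacement threshold. The region-specific formulae developed for Alborz - Central Iran and Zagros are not reproduced in the abstract, so this sketch uses the generic coefficients for illustration only:

```python
import math

def newmark_displacement(ia, ac):
    """Predicted Newmark displacement Dn (cm) from the Jibson et al.
    (1998) regression; ia in m/s, ac in g."""
    return 10 ** (1.521 * math.log10(ia) - 1.993 * math.log10(ac) - 1.546)

def critical_acceleration(ia, dn_threshold):
    """Invert the regression: minimum ac (g) that keeps Dn at the
    threshold displacement dn_threshold (cm)."""
    return 10 ** ((1.521 * math.log10(ia) - 1.546
                   - math.log10(dn_threshold)) / 1.993)

ia = 2.0                                  # assumed Arias intensity (m/s)
ac10 = critical_acceleration(ia, 10.0)    # (Ac)10: coherent slides
ac2 = critical_acceleration(ia, 2.0)      # (Ac)2: disrupted slides
```

    As expected physically, the 2 cm threshold demands a larger critical acceleration than the 10 cm threshold, i.e. (Ac)2 > (Ac)10, matching the ordering implied by the map described above.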

  13. Regional climate model data used within the SWURVE project - 1: projected changes in seasonal patterns and estimation of PET

    NASA Astrophysics Data System (ADS)

    Ekström, M.; Jones, P. D.; Fowler, H. J.; Lenderink, G.; Buishand, T. A.; Conway, D.

    2007-04-01

    Climate data for studies within the SWURVE (Sustainable Water: Uncertainty, Risk and Vulnerability in Europe) project, which assessed the risk posed by future climatic change to various hydrological and hydraulic systems, were obtained from the regional climate model HadRM3H, developed at the Hadley Centre of the UK Met Office. This paper gives some background to HadRM3H; it also presents anomaly maps of the projected future changes in European temperature, rainfall and potential evapotranspiration (PET, estimated using a variant of the Penman formula). The future simulations of temperature and rainfall, following the SRES A2 emissions scenario, suggest that most of Europe will experience warming in all seasons, with heavier precipitation in winter in much of western Europe (except for central and northern parts of the Scandinavian mountains) and drier summers in most parts of western and central Europe (except for the north-west and the eastern part of the Baltic Sea). Particularly large temperature anomalies (>6°C) are projected for north-east Europe in winter and for southern Europe, Asia Minor and parts of Russia in summer. The projected PET displayed very large increases in summer for a region extending from southern France to Russia. These unrealistically large values could be the result of an enhanced hydrological cycle in HadRM3H, affecting several of the inputs to the PET calculation. To avoid problems with hydrological modelling schemes, PET was re-calculated using empirical relationships derived from observational values of temperature and PET.

  14. Topography of brain glucose hypometabolism and epileptic network in glucose transporter 1 deficiency.

    PubMed

    Akman, Cigdem Inan; Provenzano, Frank; Wang, Dong; Engelstad, Kristin; Hinton, Veronica; Yu, Julia; Tikofsky, Ronald; Ichise, Masanori; De Vivo, Darryl C

    2015-02-01

    (18)F fluorodeoxyglucose positron emission tomography ((18)F FDG-PET) facilitates examination of glucose metabolism. Previously, we described regional cerebral glucose hypometabolism using (18)F FDG-PET in patients with Glucose transporter 1 Deficiency Syndrome (Glut1 DS). We now extend this observation in Glut1 DS using quantitative image analysis to identify the epileptic network based on the regional distribution of glucose hypometabolism. (18)F FDG-PET scans of 16 Glut1 DS patients and 7 healthy participants were examined using Statistical Parametric Mapping (SPM). Summed images were preprocessed for statistical analysis using MATLAB 7.1 and SPM 2 software. Region of interest (ROI) analysis was performed to validate the SPM results. Visual analysis of the (18)F FDG-PET images demonstrated prominent regional glucose hypometabolism bilaterally in the thalamus, neocortical regions and cerebellum. Group comparison using SPM analysis confirmed that regional glucose hypometabolism was present in the thalamus, cerebellum, temporal cortex and central lobule. Two mildly affected patients without epilepsy had hypometabolism in the cerebellum, inferior frontal cortex, and temporal lobe, but not the thalamus. Glucose hypometabolism did not correlate with age at the time of PET imaging, head circumference, CSF glucose concentration at the time of diagnosis, RBC glucose uptake, or CNS score. Quantitative analysis of (18)F FDG-PET imaging in Glut1 DS patients confirmed that hypometabolism was present symmetrically in the thalamus, cerebellum, and frontal and temporal cortex. The hypometabolism in the thalamus correlated with a clinical history of epilepsy. Copyright © 2014. Published by Elsevier B.V.

  15. Cerebral glucose uptake in patients with chronic mental and cognitive sequelae following a single blunt mild TBI without visible brain lesions.

    PubMed

    Komura, Akifumi; Kawasaki, Tomohiro; Yamada, Yuichi; Uzuyama, Shiho; Asano, Yoshitaka; Shinoda, Jun

    2018-06-19

    The aim of this study was to investigate glucose uptake on FDG-PET in patients with chronic mental and cognitive symptoms following a single blunt mild traumatic brain injury (TBI) and without visible brain lesions on CT/MRI. Eighty-nine consecutive patients (mean age 43.8±10.75 years) who had sustained a single blunt mild TBI in a traffic accident and were suffering from chronic mental and cognitive symptoms without visible brain lesions on CT/MRI were enrolled in the study. Patients underwent FDG-PET imaging; the mean interval between the TBI and FDG-PET was 50.0 months. Wechsler Adult Intelligence Scale version III testing was performed within one month of the FDG-PET. A control group consisting of 93 healthy adult volunteers (mean age 42.2±14.3 years) also underwent FDG-PET. The glucose uptake pattern on FDG-PET in the patient group was compared to that of the normal controls using statistical parametric mapping. Glucose uptake was significantly decreased in the bilateral prefrontal area and significantly increased around the limbic system in the patient group compared to normal controls. This topographical pattern of glucose uptake is different from that reported previously in patients with diffuse axonal injury (DAI), but may be similar to that seen in patients with major depressive disorder. These results suggest that the pathological mechanism causing chronic mental and cognitive symptoms in patients with a single blunt mild TBI and without visible brain lesions might differ from that due to primary axonopathy in patients with DAI.

  16. Longitudinal Changes in Serum Glucose Levels are Associated with Metabolic Changes in Alzheimer's Disease Related Brain Regions.

    PubMed

    Burns, Christine M; Kaszniak, Alfred W; Chen, Kewei; Lee, Wendy; Bandy, Daniel J; Caselli, Richard J; Reiman, Eric M

    2018-01-01

    The association between longitudinal changes in serum glucose levels and longitudinal changes in [18F] fluorodeoxyglucose PET (FDG PET) measurements of Alzheimer's disease (AD) risk is unknown. We investigated whether variation in serum glucose levels across time is associated with changes in FDG PET measurements of the cerebral metabolic rate for glucose (rCMRgl) in brain regions preferentially affected by AD. Participants were a subset of a prospective cohort study investigating FDG PET, apolipoprotein E (APOE) ɛ4, and risk for AD, which includes data from baseline, interim, and follow-up visits over 4.4±1.0 years. An automated brain-mapping algorithm was used to characterize and compare associations between longitudinal changes in serum glucose levels and longitudinal changes in rCMRgl. This study included 80 adults aged 61.5±5 years, including 38 carriers and 42 non-carriers of the APOE ɛ4 allele. Longitudinal increases in serum glucose levels were associated with longitudinal rCMRgl decline in the vicinity of parietotemporal, precuneus/posterior cingulate, and prefrontal brain regions preferentially affected by AD (p < 0.05, corrected for multiple comparisons). Findings remained significant when controlling for APOE ɛ4 status, baseline age, and advancing age. Additional studies are needed to clarify and confirm the relationship between longitudinal changes in peripheral glucose and FDG PET measurements of AD risk. Future findings will set the stage for the use of FDG PET in the evaluation of possible interventions that target risk factors for the development of AD.

  17. Topological mappings of video and audio data.

    PubMed

    Fyfe, Colin; Barbakh, Wesam; Ooi, Wei Chuan; Ko, Hanseok

    2008-12-01

    We review a new form of self-organizing map which is based on a nonlinear projection of latent points into data space, identical to that performed in the Generative Topographic Mapping (GTM).(1) But whereas the GTM is an extension of a mixture of experts, this model is an extension of a product of experts.(2) We show visualisation and clustering results on a data set composed of video data of lips uttering 5 Korean vowels. Finally, we note that we may dispense with the probabilistic underpinnings of the product of experts and derive the same algorithm as a minimisation of mean squared error between the prototypes and the data. This leads us to suggest a new algorithm which incorporates local and global information in the clustering. Both of the new algorithms achieve better results than the standard Self-Organizing Map.
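
    The prototype-based view in the last sentences (minimising mean squared error between prototypes and data while preserving latent neighbourhood structure) can be sketched as a small batch self-organising map. This is a generic illustration on synthetic data, not the authors' algorithm:

```python
import numpy as np

rng = np.random.default_rng(1)

# synthetic 2-D data from three tight clusters
data = np.vstack([rng.normal(c, 0.1, (50, 2)) for c in ((0, 0), (1, 0), (0, 1))])

n_proto = 10
grid = np.linspace(0.0, 1.0, n_proto)        # latent points on a 1-D grid
protos = rng.normal(0.5, 0.1, (n_proto, 2))  # prototypes live in data space

for epoch in range(50):
    sigma = 0.3 * (1 - epoch / 50) + 0.02    # shrinking neighbourhood width
    # squared distance from every datum to every prototype
    d2 = ((data[:, None, :] - protos[None, :, :]) ** 2).sum(-1)
    winners = d2.argmin(1)                   # best-matching prototype per datum
    # neighbourhood weights on the latent grid around each datum's winner
    h = np.exp(-(grid[None, :] - grid[winners][:, None]) ** 2 / (2 * sigma**2))
    # batch update: each prototype moves to the weighted mean of its data
    protos = (h[:, :, None] * data[:, None, :]).sum(0) / (h.sum(0)[:, None] + 1e-12)

# quantisation error (mean squared distance to the nearest prototype)
d2 = ((data[:, None, :] - protos[None, :, :]) ** 2).sum(-1)
mse = d2.min(1).mean()
```

    As the neighbourhood width shrinks, the update approaches a plain minimisation of the prototype-to-data mean squared error, mirroring the observation in the abstract.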

  18. Long-range hazard assessment of volcanic ash dispersal for a Plinian eruptive scenario at Popocatépetl volcano (Mexico): implications for civil aviation safety

    USGS Publications Warehouse

    Bonasia, Rosanna; Scaini, Chiara; Capra, Lucia; Nathenson, Manuel; Siebe, Claus; Arana-Salinas, Lilia; Folch, Arnau

    2013-01-01

    Popocatépetl is one of Mexico’s most active volcanoes threatening a densely populated area that includes Mexico City with more than 20 million inhabitants. The destructive potential of this volcano is demonstrated by its Late Pleistocene–Holocene eruptive activity, which has been characterized by recurrent Plinian eruptions of large magnitude, the last two of which destroyed human settlements in pre-Hispanic times. Popocatépetl’s reawakening in 1994 produced a crisis that culminated with the evacuation of two villages on the northeastern flank of the volcano. Shortly after, a monitoring system and a civil protection contingency plan based on a hazard zone map were implemented. The current volcanic hazards map considers the potential occurrence of different volcanic phenomena, including pyroclastic density currents and lahars. However, no quantitative assessment of the tephra hazard, especially related to atmospheric dispersal, has been performed. The presence of airborne volcanic ash at low and jet-cruise atmospheric levels compromises the safety of aircraft operations and forces re-routing of aircraft to prevent encounters with volcanic ash clouds. Given the high number of important airports in the surroundings of Popocatépetl volcano and considering the potential threat posed to civil aviation in Mexico and adjacent regions in case of a Plinian eruption, a hazard assessment for tephra dispersal is required. In this work, we present the first probabilistic tephra dispersal hazard assessment for Popocatépetl volcano. We compute probabilistic hazard maps for critical thresholds of airborne ash concentrations at different flight levels, corresponding to the situation defined in Europe during 2010, and still under discussion. Tephra dispersal modeling is performed using the FALL3D numerical model. Probabilistic hazard maps are built for a Plinian eruptive scenario defined on the basis of geological field data for the “Ochre Pumice” Plinian eruption (4965 14C yr BP).
FALL3D model input eruptive parameters are constrained through an inversion method carried out with the semi-analytical HAZMAP model and are varied by sampling them using probability density functions. We analyze the influence of seasonal variations on ash dispersal and estimate the average persistence of critical ash concentrations at relevant locations and airports. This study assesses the impact that a Plinian eruption similar to the Ochre Pumice eruption would have on the main airports of Mexico and adjacent areas. The hazard maps presented here can support long-term planning that would help minimize the impacts of such an eruption on civil aviation.
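
    The probabilistic hazard maps described here are, in essence, per-cell exceedance frequencies over an ensemble of dispersal simulations whose eruptive parameters are sampled from probability density functions. A minimal sketch, with a random field standing in for FALL3D outputs and a hypothetical concentration threshold:

```python
import numpy as np

rng = np.random.default_rng(2)
n_runs, ny, nx = 200, 40, 40  # ensemble size and grid, both hypothetical
threshold = 2.0               # hypothetical critical ash-concentration threshold

# stand-in for an ensemble of dispersal runs, each with eruptive parameters
# sampled from probability density functions (here just a lognormal field)
conc = rng.lognormal(mean=0.0, sigma=1.0, size=(n_runs, ny, nx))

# probabilistic hazard map: per-cell fraction of runs exceeding the threshold
hazard_map = (conc > threshold).mean(axis=0)
```

    Each cell of `hazard_map` is the estimated probability that the critical concentration is exceeded there, which is what the flight-level hazard maps in the study encode.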

  19. Long-range hazard assessment of volcanic ash dispersal for a Plinian eruptive scenario at Popocatépetl volcano (Mexico): implications for civil aviation safety

    NASA Astrophysics Data System (ADS)

    Bonasia, Rosanna; Scaini, Chiara; Capra, Lucia; Nathenson, Manuel; Siebe, Claus; Arana-Salinas, Lilia; Folch, Arnau

    2014-01-01

    Popocatépetl is one of Mexico's most active volcanoes threatening a densely populated area that includes Mexico City with more than 20 million inhabitants. The destructive potential of this volcano is demonstrated by its Late Pleistocene-Holocene eruptive activity, which has been characterized by recurrent Plinian eruptions of large magnitude, the last two of which destroyed human settlements in pre-Hispanic times. Popocatépetl's reawakening in 1994 produced a crisis that culminated with the evacuation of two villages on the northeastern flank of the volcano. Shortly after, a monitoring system and a civil protection contingency plan based on a hazard zone map were implemented. The current volcanic hazards map considers the potential occurrence of different volcanic phenomena, including pyroclastic density currents and lahars. However, no quantitative assessment of the tephra hazard, especially related to atmospheric dispersal, has been performed. The presence of airborne volcanic ash at low and jet-cruise atmospheric levels compromises the safety of aircraft operations and forces re-routing of aircraft to prevent encounters with volcanic ash clouds. Given the high number of important airports in the surroundings of Popocatépetl volcano and considering the potential threat posed to civil aviation in Mexico and adjacent regions in case of a Plinian eruption, a hazard assessment for tephra dispersal is required. In this work, we present the first probabilistic tephra dispersal hazard assessment for Popocatépetl volcano. We compute probabilistic hazard maps for critical thresholds of airborne ash concentrations at different flight levels, corresponding to the situation defined in Europe during 2010, and still under discussion. Tephra dispersal modeling is performed using the FALL3D numerical model. Probabilistic hazard maps are built for a Plinian eruptive scenario defined on the basis of geological field data for the "Ochre Pumice" Plinian eruption (4965 14C yr BP).
FALL3D model input eruptive parameters are constrained through an inversion method carried out with the semi-analytical HAZMAP model and are varied by sampling them using probability density functions. We analyze the influence of seasonal variations on ash dispersal and estimate the average persistence of critical ash concentrations at relevant locations and airports. This study assesses the impact that a Plinian eruption similar to the Ochre Pumice eruption would have on the main airports of Mexico and adjacent areas. The hazard maps presented here can support long-term planning that would help minimize the impacts of such an eruption on civil aviation.

  20. Impact of motion and partial volume effects correction on PET myocardial perfusion imaging using simultaneous PET-MR

    NASA Astrophysics Data System (ADS)

    Petibon, Yoann; Guehl, Nicolas J.; Reese, Timothy G.; Ebrahimi, Behzad; Normandin, Marc D.; Shoup, Timothy M.; Alpert, Nathaniel M.; El Fakhri, Georges; Ouyang, Jinsong

    2017-01-01

    PET is an established modality for myocardial perfusion imaging (MPI) which enables quantification of absolute myocardial blood flow (MBF) using dynamic imaging and kinetic modeling. However, heart motion and partial volume effects (PVE) significantly limit the spatial resolution and quantitative accuracy of PET MPI. Simultaneous PET-MR offers a solution to the motion problem in PET by enabling MR-based motion correction of PET data. The aim of this study was to develop a motion and PVE correction methodology for PET MPI using simultaneous PET-MR, and to assess its impact on both static and dynamic PET MPI using 18F-Flurpiridaz, a novel 18F-labeled perfusion tracer. Two dynamic 18F-Flurpiridaz MPI scans were performed on healthy pigs using a PET-MR scanner. Cardiac motion was tracked using a dedicated tagged-MRI (tMR) sequence. Motion fields were estimated using non-rigid registration of tMR images and used to calculate motion-dependent attenuation maps. Motion correction of PET data was achieved by incorporating tMR-based motion fields and motion-dependent attenuation coefficients into image reconstruction. Dynamic and static PET datasets were created for each scan. Each dataset was reconstructed as (i) Ungated, (ii) Gated (end-diastolic phase), and (iii) Motion-Corrected (MoCo), each without and with point spread function (PSF) modeling for PVE correction. Myocardium-to-blood concentration ratios (MBR) and apparent wall thickness were calculated to assess image quality for static MPI. For dynamic MPI, segment- and voxel-wise MBF values were estimated by non-linear fitting of a 2-tissue compartment model to tissue time-activity curves. MoCo and Gating respectively decreased mean apparent wall thickness by 15.1% and 14.4% and increased MBR by 20.3% and 13.6% compared to Ungated images (P  <  0.01).
Combined motion and PSF correction (MoCo-PSF) yielded 30.9% (15.7%) lower wall thickness and 82.2% (20.5%) higher MBR compared to Ungated data reconstructed without (with) PSF modeling (P  <  0.01). For dynamic PET, mean MBF values across all segments were comparable for MoCo (0.72  ±  0.21 ml/min/ml) and Gating (0.69  ±  0.18 ml/min/ml). Ungated data yielded significantly lower mean MBF (0.59  ±  0.16 ml/min/ml). Mean MBF for MoCo-PSF was 0.80  ±  0.22 ml/min/ml, which was 37.9% (25.0%) higher than that obtained from Ungated data without (with) PSF correction (P  <  0.01). The developed methodology holds promise to improve the image quality and sensitivity of PET MPI studies performed using PET-MR.
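
    The kinetic-modeling step (non-linear fitting of a compartment model to tissue time-activity curves) can be illustrated with a simplified one-tissue model; the study used a two-tissue model, and the input function, parameters, and noise level below are all hypothetical:

```python
import numpy as np
from scipy.optimize import curve_fit

t = np.linspace(0.0, 10.0, 60)   # minutes
cp = t * np.exp(-t)              # hypothetical arterial input function

def one_tissue(t, K1, k2):
    """One-tissue model: C_T = (K1 * exp(-k2 t)) convolved with C_p."""
    dt = t[1] - t[0]
    kernel = K1 * np.exp(-k2 * t)
    return np.convolve(cp, kernel)[: len(t)] * dt

# simulate a noisy tissue time-activity curve, then refit its parameters
true_K1, true_k2 = 0.7, 0.15
rng = np.random.default_rng(3)
tac = one_tissue(t, true_K1, true_k2) + rng.normal(0.0, 0.005, len(t))

(K1_hat, k2_hat), _ = curve_fit(one_tissue, t, tac, p0=(0.5, 0.1))
```

    In perfusion imaging, the fitted uptake-rate parameter (here K1) is what carries the blood-flow information.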

  1. Historical gridded reconstruction of potential evapotranspiration for the UK

    NASA Astrophysics Data System (ADS)

    Tanguy, Maliko; Prudhomme, Christel; Smith, Katie; Hannaford, Jamie

    2018-06-01

    Potential evapotranspiration (PET) is a necessary input for most hydrological models and is often needed at a daily time step. An accurate estimation of PET requires many input climate variables which are, in most cases, not available prior to the 1960s for the UK, nor indeed most parts of the world. Therefore, when applying hydrological models to earlier periods, modellers have to rely on PET estimations derived from simplified methods. Given that only monthly observed temperature data is readily available for the late 19th and early 20th century at a national scale for the UK, the objective of this work was to derive the best possible UK-wide gridded PET dataset from the limited data available. To that end, firstly, a combination of (i) seven temperature-based PET equations, (ii) four different calibration approaches and (iii) seven input temperature datasets was evaluated. For this evaluation, a gridded daily PET product based on the physically based Penman-Monteith equation (the CHESS PET dataset) was used, the rationale being that this provides a reliable ground-truth PET dataset for evaluation purposes, given that no directly observed, distributed PET datasets exist. The performance of the models was also compared to a naïve method, which is defined as the simplest possible estimation of PET in the absence of any available climate data. The naïve method used in this study is the CHESS PET daily long-term average (the period from 1961 to 1990 was chosen), or CHESS-PET daily climatology. The analysis revealed that the type of calibration and the input temperature dataset had only a minor effect on the accuracy of the PET estimations at catchment scale. From the seven equations tested, only the calibrated version of the McGuinness-Bordne equation was able to outperform the naïve method and was therefore used to derive the gridded, reconstructed dataset.
The equation was calibrated using 43 catchments across Great Britain. The dataset produced is a 5 km gridded PET dataset for the period 1891 to 2015, using the Met Office 5 km monthly gridded temperature data available for that time period as input data for the PET equation. The dataset includes daily and monthly PET grids and is complemented with a suite of mapped performance metrics to help users assess the quality of the data spatially. This dataset is expected to be particularly valuable as input to hydrological models for any catchment in the UK. The data can be accessed at https://doi.org/10.5285/17b9c4f7-1c30-4b6f-b2fe-f7780159939c.
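
    The McGuinness-Bordne equation used for the reconstruction is a simple temperature-and-radiation formula. A sketch of its standard form is below; the calibrated coefficient values used in the study are not given in the abstract, so the defaults here are the original (uncalibrated) ones:

```python
import numpy as np

LAMBDA = 2.45  # latent heat of vaporisation, MJ kg^-1

def mcguinness_bordne(temp_c, ext_rad, a=5.0, b=68.0):
    """McGuinness-Bordne potential evapotranspiration, mm/day.

    temp_c  : mean air temperature (deg C)
    ext_rad : extraterrestrial radiation (MJ m^-2 day^-1)
    a, b    : a=5, b=68 is the original form; a calibrated version fits
              them against a reference PET dataset (per-catchment values
              are not stated in the abstract).
    """
    return (ext_rad / LAMBDA) * (temp_c + a) / b

pet = mcguinness_bordne(np.array([5.0, 10.0, 15.0]), np.array([10.0, 20.0, 30.0]))
```

    Because only temperature varies in the historical record, extraterrestrial radiation (a function of latitude and day of year alone) supplies the seasonal cycle.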

  2. MO-G-17A-09: Quantitative Autoradiography of Biopsy Specimens Extracted Under PET/CT Guidance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fanchon, L; Carlin, S; Schmidtlein, C

    2014-06-15

    Purpose: To develop a procedure for accurate determination of PET tracer concentration with high spatial accuracy in situ by performing Quantitative Autoradiography of Biopsy Specimens (QABS) extracted under PET/CT guidance. Methods: Autoradiography (ARG) standards were produced from a gel loaded with a known concentration of FDG biopsied with 18G and 20G biopsy needles. Specimens obtained with these needles are generally cylindrical: up to 18 mm in length and about 0.8 and 0.6 mm in diameter, respectively. These standards, with shape and density similar to biopsy specimens, were used to generate ARG calibration curves. Quantitative ARG was performed to measure the activity concentration in biopsy specimens extracted from ten patients. The biopsy sites were determined according to PET/CT images obtained in the operating room. Additional CT scans were acquired with the needles in place to confirm correct needle placements. The ARG images were aligned with the needle tip in the PET/CT images using the open source CERR software. The mean SUVs calculated from the specimen activities (SUVarg) were compared to those from PET (SUVpet) at the needle locations. Results: Calibration curves show that the relation between ARG signal and activity concentration in those standards is linear for the investigated range (up to 150 kBq/ml). The correlation coefficient of SUVarg with SUVpet is 0.74. Discrepancies between SUVarg and SUVpet can be attributed to the small size of the biopsy specimens compared to PET resolution. Conclusion: The calibration procedure using surrogate biopsy specimens provided a method for quantifying the activity within the biopsy cores obtained under FDG-PET guidance. QABS allows mapping the activity concentration in such biopsy specimens with a resolution of about 1 mm. QABS is a promising tool for verification of biopsy adequacy by comparing specimen activity to that expected from the PET image.
A portion of this research was funded by a research grant from Biospace Lab, 13 rue Georges Auric 75019 Paris, FRANCE.

  3. Deaggregation of Probabilistic Ground Motions in the Central and Eastern United States

    USGS Publications Warehouse

    Harmsen, S.; Perkins, D.; Frankel, A.

    1999-01-01

    Probabilistic seismic hazard analysis (PSHA) is a technique for estimating the annual rate of exceedance of a specified ground motion at a site due to known and suspected earthquake sources. The relative contributions of the various sources to the total seismic hazard are determined as a function of their occurrence rates and their ground-motion potential. The separation of the exceedance contributions into bins whose base dimensions are magnitude and distance is called deaggregation. We have deaggregated the hazard analyses for the new USGS national probabilistic ground-motion hazard maps (Frankel et al., 1996). For points on a 0.2° grid in the central and eastern United States (CEUS), we show color maps of the geographical variation of mean and modal magnitudes (M̄, M̂) and distances (D̄, D̂) for ground motions having a 2% chance of exceedance in 50 years. These maps are displayed for peak horizontal acceleration and for spectral response accelerations at periods of 0.2, 0.3, and 1.0 sec. We tabulate M̄, D̄, M̂, and D̂ for 49 CEUS cities for 0.2- and 1.0-sec response. Thus, these maps and tables are PSHA-derived estimates of the potential earthquakes that dominate seismic hazard at short and intermediate periods in the CEUS. The contribution to hazard of the New Madrid and Charleston sources dominates over much of the CEUS; for 0.2-sec response, over 40% of the area; for 1.0-sec response, over 80% of the area. For 0.2-sec response, D̄ ranges from 20 to 200 km; for 1.0 sec, 30 to 600 km. For sites influenced by New Madrid or Charleston, D̄ is less than the distance to these sources, and M̄ is less than the characteristic magnitude of these sources, because averaging takes into account the effect of smaller magnitude and closer sources. On the other hand, D̂ is directly the distance to New Madrid or Charleston and M̂ for 0.2- and 1.0-sec response corresponds to the dominating source over much of the CEUS.
For some cities in the North Atlantic states, short-period seismic hazard is apt to be controlled by local seismicity, whereas intermediate period (1.0 sec) hazard is commonly controlled by regional seismicity, such as that of the Charlevoix seismic zone.
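
    Given a deaggregation into magnitude-distance bins, the mean values are contribution-weighted averages over all bins, while the modal values come from the single bin that contributes most. A sketch with hypothetical contributions:

```python
import numpy as np

# hypothetical deaggregation: % contribution to hazard in (magnitude, distance) bins
mags = np.array([5.0, 5.5, 6.0, 6.5, 7.0, 7.5])   # bin-centre magnitudes
dists = np.array([25.0, 75.0, 150.0, 300.0])       # bin-centre distances, km
contrib = np.array([                               # rows: magnitude, cols: distance
    [2.0, 1.0, 0.0, 0.0],
    [5.0, 3.0, 1.0, 0.0],
    [8.0, 6.0, 2.0, 1.0],
    [7.0, 10.0, 4.0, 2.0],
    [3.0, 6.0, 15.0, 9.0],
    [1.0, 2.0, 5.0, 7.0],
])
w = contrib / contrib.sum()

# mean magnitude/distance: contribution-weighted averages over all bins
m_mean = (w.sum(axis=1) * mags).sum()
d_mean = (w.sum(axis=0) * dists).sum()

# modal magnitude/distance: the single bin contributing most to the hazard
i, j = np.unravel_index(w.argmax(), w.shape)
m_mode, d_mode = mags[i], dists[j]
```

    In this synthetic example the modal pair (7.0, 150 km) points directly at one dominating source, while the mean pair (6.6, 125 km) is pulled toward smaller, closer contributors, mirroring the distinction drawn in the abstract.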

  4. Probabilistic tsunami hazard assessment at Seaside, Oregon, for near- and far-field seismic sources

    USGS Publications Warehouse

    Gonzalez, F.I.; Geist, E.L.; Jaffe, B.; Kanoglu, U.; Mofjeld, H.; Synolakis, C.E.; Titov, V.V.; Arcas, D.; Bellomo, D.; Carlton, D.; Horning, T.; Johnson, J.; Newman, J.; Parsons, T.; Peters, R.; Peterson, C.; Priest, G.; Venturato, A.; Weber, J.; Wong, F.; Yalciner, A.

    2009-01-01

    The first probabilistic tsunami flooding maps have been developed. The methodology, called probabilistic tsunami hazard assessment (PTHA), integrates tsunami inundation modeling with methods of probabilistic seismic hazard assessment (PSHA). Application of the methodology to Seaside, Oregon, has yielded estimates of the spatial distribution of 100- and 500-year maximum tsunami amplitudes, i.e., amplitudes with 1% and 0.2% annual probability of exceedance. The 100-year tsunami is generated most frequently by far-field sources in the Alaska-Aleutian Subduction Zone and is characterized by maximum amplitudes that do not exceed 4 m, with an inland extent of less than 500 m. In contrast, the 500-year tsunami is dominated by local sources in the Cascadia Subduction Zone and is characterized by maximum amplitudes in excess of 10 m and an inland extent of more than 1 km. The primary sources of uncertainty in these results include those associated with interevent time estimates, modeling of background sea level, and accounting for temporal changes in bathymetry and topography. Nonetheless, PTHA represents an important contribution to tsunami hazard assessment techniques; viewed in the broader context of risk analysis, PTHA provides a method for quantifying estimates of the likelihood and severity of the tsunami hazard, which can then be combined with vulnerability and exposure to yield estimates of tsunami risk. Copyright 2009 by the American Geophysical Union.
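
    The "100-year" and "500-year" tsunamis correspond to 1% and 0.2% annual probabilities of exceedance. Assuming Poissonian occurrence, the chance of seeing such an event within a given exposure time follows directly:

```python
import math

def prob_exceed(annual_rate, years):
    """Chance of at least one exceedance in `years`, assuming Poissonian
    occurrence with the given annual exceedance rate."""
    return 1.0 - math.exp(-annual_rate * years)

# the 100-year and 500-year tsunamis correspond to annual exceedance
# probabilities of 1% and 0.2% (rates of about 0.01/yr and 0.002/yr)
p100_in_50 = prob_exceed(0.01, 50)   # ~0.39
p500_in_50 = prob_exceed(0.002, 50)  # ~0.10
```

    So even the "rare" 500-year tsunami has roughly a one-in-ten chance of occurring during a 50-year planning horizon, which is why such maps matter for long-term land use.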

  5. Probabilistic tsunami hazard analysis: Multiple sources and global applications

    USGS Publications Warehouse

    Grezio, Anita; Babeyko, Andrey; Baptista, Maria Ana; Behrens, Jörn; Costa, Antonio; Davies, Gareth; Geist, Eric L.; Glimsdal, Sylfest; González, Frank I.; Griffin, Jonathan; Harbitz, Carl B.; LeVeque, Randall J.; Lorito, Stefano; Løvholt, Finn; Omira, Rachid; Mueller, Christof; Paris, Raphaël; Parsons, Thomas E.; Polet, Jascha; Power, William; Selva, Jacopo; Sørensen, Mathilde B.; Thio, Hong Kie

    2017-01-01

    Applying probabilistic methods to infrequent but devastating natural events is intrinsically challenging. For tsunami analyses, a suite of geophysical assessments should be in principle evaluated because of the different causes generating tsunamis (earthquakes, landslides, volcanic activity, meteorological events, and asteroid impacts) with varying mean recurrence rates. Probabilistic Tsunami Hazard Analyses (PTHAs) are conducted in different areas of the world at global, regional, and local scales with the aim of understanding tsunami hazard to inform tsunami risk reduction activities. PTHAs enhance knowledge of the potential tsunamigenic threat by estimating the probability of exceeding specific levels of tsunami intensity metrics (e.g., run-up or maximum inundation heights) within a certain period of time (exposure time) at given locations (target sites); these estimates can be summarized in hazard maps or hazard curves. This discussion presents a broad overview of PTHA, including (i) sources and mechanisms of tsunami generation, emphasizing the variety and complexity of the tsunami sources and their generation mechanisms, (ii) developments in modeling the propagation and impact of tsunami waves, and (iii) statistical procedures for tsunami hazard estimates that include the associated epistemic and aleatoric uncertainties. Key elements in understanding the potential tsunami hazard are discussed, in light of the rapid development of PTHA methods during the last decade and the globally distributed applications, including the importance of considering multiple sources, their relative intensities, probabilities of occurrence, and uncertainties in an integrated and consistent probabilistic framework.
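
    The hazard-curve estimate at a target site reduces to exceedance probabilities over simulated intensities, converted to a probability for the chosen exposure time. A sketch with synthetic annual maxima, assuming independent years (the real analyses integrate multiple source types and uncertainties):

```python
import numpy as np

rng = np.random.default_rng(4)
# synthetic catalogue: maximum run-up (m) at a target site in each of
# 10,000 simulated years (Gumbel is a common extreme-value choice)
runup = rng.gumbel(loc=1.0, scale=0.8, size=10_000)

levels = np.linspace(0.5, 6.0, 12)
# hazard curve: per-year probability of exceeding each intensity level
annual_prob = np.array([(runup > z).mean() for z in levels])

# probability of at least one exceedance in a 50-year exposure time,
# assuming independent years
p_50yr = 1.0 - (1.0 - annual_prob) ** 50
```

    Plotting `p_50yr` against `levels` gives the hazard curve for the site; repeating the calculation over a grid of sites gives a hazard map for a fixed level.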

  6. Probabilistic Tsunami Hazard Analysis: Multiple Sources and Global Applications

    NASA Astrophysics Data System (ADS)

    Grezio, Anita; Babeyko, Andrey; Baptista, Maria Ana; Behrens, Jörn; Costa, Antonio; Davies, Gareth; Geist, Eric L.; Glimsdal, Sylfest; González, Frank I.; Griffin, Jonathan; Harbitz, Carl B.; LeVeque, Randall J.; Lorito, Stefano; Løvholt, Finn; Omira, Rachid; Mueller, Christof; Paris, Raphaël.; Parsons, Tom; Polet, Jascha; Power, William; Selva, Jacopo; Sørensen, Mathilde B.; Thio, Hong Kie

    2017-12-01

    Applying probabilistic methods to infrequent but devastating natural events is intrinsically challenging. For tsunami analyses, a suite of geophysical assessments should be in principle evaluated because of the different causes generating tsunamis (earthquakes, landslides, volcanic activity, meteorological events, and asteroid impacts) with varying mean recurrence rates. Probabilistic Tsunami Hazard Analyses (PTHAs) are conducted in different areas of the world at global, regional, and local scales with the aim of understanding tsunami hazard to inform tsunami risk reduction activities. PTHAs enhance knowledge of the potential tsunamigenic threat by estimating the probability of exceeding specific levels of tsunami intensity metrics (e.g., run-up or maximum inundation heights) within a certain period of time (exposure time) at given locations (target sites); these estimates can be summarized in hazard maps or hazard curves. This discussion presents a broad overview of PTHA, including (i) sources and mechanisms of tsunami generation, emphasizing the variety and complexity of the tsunami sources and their generation mechanisms, (ii) developments in modeling the propagation and impact of tsunami waves, and (iii) statistical procedures for tsunami hazard estimates that include the associated epistemic and aleatoric uncertainties. Key elements in understanding the potential tsunami hazard are discussed, in light of the rapid development of PTHA methods during the last decade and the globally distributed applications, including the importance of considering multiple sources, their relative intensities, probabilities of occurrence, and uncertainties in an integrated and consistent probabilistic framework.

  7. Dynamic neurotransmitter interactions measured with PET

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schiffer, W.K.; Dewey, S.L.

    2001-04-02

    Positron emission tomography (PET) has become a valuable interdisciplinary tool for understanding physiological, biochemical and pharmacological functions at a molecular level in living humans, whether in a healthy or diseased state. The utility of tracing chemical activity through the body transcends the fields of cardiology, oncology, neurology and psychiatry. In this, PET techniques span radiochemistry and radiopharmaceutical development to instrumentation, image analysis, anatomy and modeling. PET has made substantial contributions in each of these fields by providing a venue for mapping dynamic functions of healthy and unhealthy human anatomy. As diverse as the disciplines it bridges, PET has provided insight into an equally significant variety of psychiatric disorders. Using the unique quantitative ability of PET, researchers are now better able to non-invasively characterize normally occurring neurotransmitter interactions in the brain. With the knowledge that these interactions provide the fundamental basis for brain response, many investigators have recently focused their efforts on an examination of the communication between these chemicals in both healthy volunteers and individuals suffering from diseases classically defined as neurotransmitter-specific in nature. In addition, PET can measure the biochemical dynamics of acute and sustained drug abuse. Thus, PET studies of neurotransmitter interactions enable investigators to describe a multitude of specific functional interactions in the human brain. This information can then be applied to understanding side effects that occur in response to acute and chronic drug therapy, and to designing new drugs that target multiple systems as opposed to single receptor types. Knowledge derived from PET studies can be applied to drug discovery, research and development (for review, see (Fowler et al., 1999) and (Burns et al., 1999)).
Here, we will cover the most substantial contributions of PET to understanding biologically distinct neurochemical systems that interact to produce a variety of behaviors and disorders. Neurotransmitters are neither static nor isolated in their distribution. In fact, it is through interactions with other neurochemically distinct systems that the central nervous system (CNS) performs its vital role in sustaining life. Exclusive quantitative capabilities intrinsic to PET make this technology a suitable experimental tool to measure not only the regional distribution of specific receptors and their subtypes, but also the dynamic properties of neuroreceptors and their inherent influence on related neurotransmitter pathways. The ability to investigate dynamic properties in a non-invasive and reproducible manner provides a powerful tool that can extend our current knowledge of these interactions. Coupled with innovative paradigms including pharmacologic manipulations, physiologic models and reconstruction theories, knowledge derived from PET studies can greatly advance our understanding of normal and abnormal brain function.

  8. Developmental Change in Feedback Processing as Reflected by Phasic Heart Rate Changes

    ERIC Educational Resources Information Center

    Crone, Eveline A.; Jennings, J. Richard; Van der Molen, Maurits W.

    2004-01-01

    Heart rate was recorded from 3 age groups (8-10, 12, and 20-26 years) while they performed a probabilistic learning task. Stimuli had to be sorted by pressing a left versus right key, followed by positive or negative feedback. Adult heart rate slowed following negative feedback when stimuli were consistently mapped onto the left or right key…

  9. A comparison of geospatially modeled fire behavior and fire management utility of three data sources in the southeastern United States

    Treesearch

    LaWen T. Hollingsworth; Laurie L. Kurth; Bernard R. Parresol; Roger D. Ottmar; Susan J. Prichard

    2012-01-01

    Landscape-scale fire behavior analyses are important to inform decisions on resource management projects that meet land management objectives and protect values from adverse consequences of fire. Deterministic and probabilistic geospatial fire behavior analyses are conducted with various modeling systems including FARSITE, FlamMap, FSPro, and Large Fire Simulation...

  10. Generating Continuous Surface Probability Maps from Airborne Video Using Two Sampling Intensities Along the Video Transect

    Treesearch

    Dennis M. Jacobs; William H. Cooke

    2000-01-01

    Airborne videography can be an effective tool for assessing the effects of catastrophic events on forest conditions. However, there is some question about the appropriate sampling intensity to use, especially when trying to develop correlations with probabilistic data sets such as are assembled through the Forest Inventory and Analysis (FIA) surveys. We used airborne...

  11. Multibeam 3D Underwater SLAM with Probabilistic Registration.

    PubMed

    Palomer, Albert; Ridao, Pere; Ribas, David

    2016-04-20

    This paper describes a pose-based underwater 3D Simultaneous Localization and Mapping (SLAM) framework using a multibeam echosounder to produce highly consistent underwater maps. The proposed algorithm compounds swath profiles of the seafloor with dead reckoning localization to build surface patches (i.e., point clouds). An Iterative Closest Point (ICP) with a probabilistic implementation is then used to register the point clouds, taking into account their uncertainties. The registration process is divided into two steps: (1) point-to-point association for coarse registration and (2) point-to-plane association for fine registration. The point clouds of the surfaces to be registered are sub-sampled to decrease both the computation time and the potential of falling into local minima during the registration. In addition, a heuristic is used to decrease the complexity of the association step of the ICP from O(n²) to O(n). The performance of the SLAM framework is tested using two real-world datasets: first, a 2.5D bathymetric dataset obtained with the usual down-looking multibeam sonar configuration, and second, a full 3D underwater dataset acquired with a multibeam sonar mounted on a pan and tilt unit.
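
    The registration core is an ICP loop. A minimal 2-D point-to-point variant (the coarse-registration step only, with brute-force O(n²) association rather than the paper's O(n) heuristic, and no uncertainty weighting) can be sketched as:

```python
import numpy as np

def icp_point_to_point(src, dst, iters=20):
    """Minimal 2-D point-to-point ICP: associate each source point with its
    nearest destination point, solve for the best rigid transform
    (Kabsch / SVD), apply it, and repeat."""
    src = src.copy()
    R_total, t_total = np.eye(2), np.zeros(2)
    for _ in range(iters):
        # nearest-neighbour association (brute force, O(n^2))
        d2 = ((src[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        matched = dst[d2.argmin(1)]
        # Kabsch: optimal rotation/translation for the current pairing
        sc, mc = src.mean(0), matched.mean(0)
        H = (src - sc).T @ (matched - mc)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:  # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mc - R @ sc
        src = src @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total

# recover a small known rotation + translation on a synthetic cloud
rng = np.random.default_rng(5)
cloud = rng.uniform(0.0, 1.0, (100, 2))
angle = 0.1
R_true = np.array([[np.cos(angle), -np.sin(angle)],
                   [np.sin(angle),  np.cos(angle)]])
moved = cloud @ R_true.T + np.array([0.05, -0.03])
R_est, t_est = icp_point_to_point(cloud, moved)
```

    The paper's probabilistic implementation additionally weights each correspondence by the points' uncertainties, and its fine stage swaps the point-to-point error for a point-to-plane one.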

  12. Earthquake parametrics based protection for microfinance disaster management in Indonesia

    NASA Astrophysics Data System (ADS)

    Sedayo, M. H.; Damanik, R.

    2017-07-01

    Financial institutions, including microfinance institutions that lend money to people, also face risk when a catastrophic event hits their operating area. Liquidity risk can hit their cash flow when withdrawal amounts and Non-Performing Loans (NPL) rise sharply at the same time. There are products in the market that provide backup funds for this kind of situation. Microfinance institutions need a guideline to make contingency plans in their disaster management programs. We developed a probabilistic seismic hazard, index, and zonation map as a tool to help in making financial disaster-impact-reduction programs for microfinance in Indonesia. A GMPE was used to estimate PGA for each Kabupaten point. PGA-to-MMI conversion was done by applying an empirical relationship. We used loan distribution data from the Financial Services Authority and Bank Indonesia as exposure in the indexing. The index level from this study can be used to rank urgency. The probabilistic hazard map was used to price two backup scenarios and to make a zonation. We propose three zones with annual average costs of 0.0684‰, 0.4236‰, and 1.4064‰ for the first scenario and 0.3588‰, 2.6112‰, and 6.0816‰ for the second scenario.
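
    The PGA-to-MMI step applies an empirical relationship the abstract does not name; one widely used option (Wald et al., 1999, derived for California data) serves as a stand-in here:

```python
import math

def pga_to_mmi(pga_cm_s2):
    """Empirical PGA-to-MMI conversion of Wald et al. (1999), derived for
    California and valid roughly for MMI V-VIII; a stand-in for the
    (unnamed) relationship used in the study."""
    return 3.66 * math.log10(pga_cm_s2) - 1.66

mmi_at_01g = pga_to_mmi(100.0)  # 100 cm/s^2, i.e. ~0.1 g
```

    Converting the hazard map's PGA values to intensity this way lets loan exposure be indexed against a shaking-severity scale that relates more directly to damage.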

  13. Seismic Hazard Assessment of Tehran Based on Arias Intensity

    NASA Astrophysics Data System (ADS)

    Amiri, G. Ghodrati; Mahmoodi, H.; Amrei, S. A. Razavian

    2008-07-01

    In this paper, a probabilistic seismic hazard assessment of Tehran for the Arias intensity parameter is performed. Tehran is the capital and most populous city of Iran. From economic, political, and social points of view, Tehran is the most significant city of Iran. Since catastrophic earthquakes have occurred in Tehran and its vicinity in previous centuries, a probabilistic seismic hazard assessment of this city for the Arias intensity parameter is useful. Iso-intensity contour line maps of Tehran on the basis of different attenuation relationships for different return periods are plotted. Maps of iso-intensity points in the Tehran region are presented using proportional attenuation relationships for rock and soil beds for two hazard levels of 10% and 2% probability of exceedance in 50 years. Seismicity parameters are calculated, using two methods, on the basis of historical and instrumental earthquakes for a time period that begins in the 4th century BC and ends at the present time. For the calculation of seismicity parameters, an earthquake catalogue with a radius of 200 km around Tehran has been used. The SEISRISK III software has been employed. The effects of different parameters such as seismicity parameters, fault rupture length relationships, and attenuation relationships are considered using a logic tree.
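
    Seismicity parameters of the Gutenberg-Richter kind can be estimated from an earthquake catalogue by maximum likelihood. The abstract does not state which estimators were used, so the classic Aki (1965) b-value formula is sketched here on a synthetic catalogue:

```python
import numpy as np

def gr_b_value(mags, m_min):
    """Aki (1965) maximum-likelihood estimate of the Gutenberg-Richter
    b-value from a catalogue complete above magnitude m_min."""
    m = np.asarray(mags)
    m = m[m >= m_min]
    return np.log10(np.e) / (m.mean() - m_min)

# synthetic catalogue: magnitude excesses above m_min are exponentially
# distributed, as implied by the Gutenberg-Richter law with b = 1
rng = np.random.default_rng(6)
m_min = 4.0
mags = m_min + rng.exponential(scale=np.log10(np.e) / 1.0, size=5000)
b_hat = gr_b_value(mags, m_min)
```

    The b-value (with the corresponding activity rate) is exactly the kind of seismicity parameter that feeds, via the attenuation relationships and logic-tree weights, into hazard software such as SEISRISK III.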

  14. Seismic Hazard Assessment of Tehran Based on Arias Intensity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amiri, G. Ghodrati; Mahmoodi, H.; Amrei, S. A. Razavian

    2008-07-08

    In this paper, a probabilistic seismic hazard assessment of Tehran for the Arias intensity parameter is performed. Tehran is the capital and most populated city of Iran and, from economic, political, and social points of view, its most significant city. Since catastrophic earthquakes have occurred in Tehran and its vicinity over previous centuries, a probabilistic seismic hazard assessment of the city in terms of Arias intensity is useful. Iso-intensity contour maps of Tehran, based on different attenuation relationships for different earthquake periods, are plotted. Maps of iso-intensity points in the Tehran region are presented using proportional attenuation relationships for rock and soil beds at the two hazard levels of 10% and 2% probability of exceedance in 50 years. Seismicity parameters, based on historical and instrumental earthquakes over a period beginning in the 4th century BC and ending at the present time, are calculated using two methods. For the calculation of seismicity parameters, an earthquake catalogue within a radius of 200 km around Tehran has been used, and the SEISRISK III software has been employed. The effects of different inputs, such as seismicity parameters, fault rupture length relationships, and attenuation relationships, are accounted for using a logic tree.

  15. Probabilistic model for quick detection of dissimilar binary images

    NASA Astrophysics Data System (ADS)

    Mustafa, Adnan A. Y.

    2015-09-01

    We present a quick method to detect dissimilar binary images. The method is based on a "probabilistic matching model" for image matching, which predicts the probability of occurrence of distinct-dissimilar image pairs (completely different images) when matching one image to another. Based on this model, distinct-dissimilar images can be detected by matching only a few points between two images with high confidence: 11 points suffice for a 99.9% successful detection rate. For image pairs that are dissimilar but not distinct-dissimilar, more points need to be mapped. The number of points required to attain a given detection rate or confidence depends on the amount of similarity between the compared images: as this similarity increases, more points are required. For example, images that differ by 1% can be detected by mapping fewer than 70 points on average. More importantly, the model is image-size invariant, so images of any size will produce high confidence levels with a limited number of matched points. As a result, this method does not suffer from the image-size handicap that impedes current methods. We report on extensive tests conducted on real images of different sizes.
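The quoted 11-point figure is consistent with a simple independence model: if each sampled point disagrees between the two images with probability p, the chance that k sampled points all happen to agree is (1 - p)^k. A sketch under that assumption (a simplification, not the authors' exact model):

```python
def detection_rate(p_mismatch: float, k_points: int) -> float:
    """Probability that at least one of k independently sampled
    points differs between the two compared images."""
    return 1.0 - (1.0 - p_mismatch) ** k_points

# Distinct-dissimilar binary images disagree at roughly half their pixels,
# so 11 sampled points already give better than 99.9% detection:
print(detection_rate(0.5, 11))  # 0.99951171875
```

For less dissimilar pairs (small p_mismatch) the same formula shows why many more sampled points are needed to reach the same confidence.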

  16. Propagation of the velocity model uncertainties to the seismic event location

    NASA Astrophysics Data System (ADS)

    Gesret, A.; Desassis, N.; Noble, M.; Romary, T.; Maisons, C.

    2015-01-01

    Earthquake hypocentre locations are crucial in many domains of application (academic and industrial), as seismic event location maps are commonly used to delineate faults or fractures. The interpretation of these maps depends on location accuracy and on the reliability of the associated uncertainties. The largest contribution to location and uncertainty errors arises because velocity model errors are usually not correctly taken into account. We propose a new Bayesian formulation that properly integrates knowledge of the velocity model into the probabilistic earthquake location. In this work, the velocity model uncertainties are first estimated with a Bayesian tomography of active shot data. We implement a Monte Carlo sampling algorithm to generate velocity models distributed according to the posterior distribution. In a second step, we propagate the velocity model uncertainties to the seismic event location in a probabilistic framework. This makes it possible to obtain more reliable hypocentre locations, together with uncertainties that account for both picking and velocity model errors. We illustrate the tomography results and the gain in accuracy of earthquake location for two synthetic examples and one real data case study in the context of induced microseismicity.
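The propagation step can be illustrated with a deliberately tiny toy: draw velocities from an assumed posterior, relocate the event for each draw, and read off the induced location spread. Everything below (1-D geometry, two-station setup, Gaussian velocity posterior) is illustrative, not the authors' Bayesian tomography:

```python
import random
import statistics

def locate(dt: float, v: float, span: float = 100.0) -> float:
    """1-D event location between two stations `span` km apart, from the
    arrival-time difference dt = t1 - t2 = (2x - span) / v."""
    return 0.5 * (span + v * dt)

random.seed(0)
dt_obs = 4.0                                               # s, observed difference
v_samples = [random.gauss(5.0, 0.2) for _ in range(5000)]  # km/s, sampled models
x_samples = [locate(dt_obs, v) for v in v_samples]

print(statistics.mean(x_samples))   # ≈ 60 km
print(statistics.stdev(x_samples))  # ≈ 0.4 km: spread induced by velocity uncertainty
```

Even in this caricature, the hypocentre uncertainty inherits the width of the velocity posterior rather than being reported as if the model were exact.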

  17. Seismic hazards in Thailand: a compilation and updated probabilistic analysis

    NASA Astrophysics Data System (ADS)

    Pailoplee, Santi; Charusiri, Punya

    2016-06-01

    A probabilistic seismic hazard analysis (PSHA) for Thailand was performed and compared to those of previous works. This PSHA was based upon (1) the most up-to-date paleoseismological data (slip rates), (2) the seismic source zones, (3) the seismicity parameters ( a and b values), and (4) the strong ground-motion attenuation models suggested as being suitable for Thailand. For the PSHA mapping, both the ground shaking and the probability of exceedance (POE) were analyzed and mapped using various methods of presentation. In addition, site-specific PSHAs were demonstrated for ten major provinces within Thailand. For instance, ground shaking of 0.1-0.4 g and 0.1-0.2 g at 2 and 10 % POE in the next 50 years, respectively, was found for western Thailand, defining this area as the most earthquake-prone region evaluated in Thailand. Among the ten selected provinces, Kanchanaburi and Tak had comparatively high seismic hazard, and effective mitigation plans should therefore be made for these areas. Although Bangkok was found to have low seismic hazard in this PSHA, further study of seismic wave amplification due to the soft soil beneath Bangkok is required.

  18. Perspective: Stochastic magnetic devices for cognitive computing

    NASA Astrophysics Data System (ADS)

    Roy, Kaushik; Sengupta, Abhronil; Shim, Yong

    2018-06-01

    Stochastic switching of nanomagnets can potentially enable probabilistic cognitive hardware consisting of noisy neural and synaptic components. Furthermore, computational paradigms inspired by the Ising computing model require stochasticity to achieve near-optimality in solutions to various types of combinatorial optimization problems, such as the Graph Coloring Problem or the Travelling Salesman Problem. Achieving optimal solutions to such problems is computationally exhaustive, and natural annealing is required to arrive at near-optimal solutions. Stochastic switching of devices also finds use in applications involving Deep Belief Networks and Bayesian inference. In this article, we provide a multi-disciplinary perspective across the stack of devices, circuits, and algorithms to illustrate how the stochastic switching dynamics of spintronic devices in the presence of thermal noise can provide a direct mapping to the computational units of such probabilistic intelligent systems.
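The basic computational unit described here, a device whose switching is probabilistic rather than deterministic, can be caricatured in software as a binary stochastic neuron whose firing probability is a sigmoid of its input (the sigmoid standing in for thermally assisted switching bias; a sketch, not a device model):

```python
import math
import random

def stochastic_neuron(drive: float, rng: random.Random) -> int:
    """Binary stochastic 'neuron': fires with sigmoidal probability,
    a software stand-in for thermally assisted nanomagnet switching."""
    p = 1.0 / (1.0 + math.exp(-drive))
    return 1 if rng.random() < p else 0

rng = random.Random(42)
rate = sum(stochastic_neuron(1.0, rng) for _ in range(100_000)) / 100_000
print(rate)  # fluctuates around sigmoid(1.0) ≈ 0.731
```

Networks of such units, with couplings encoding an Ising energy, are the software analogue of the annealing hardware the abstract describes.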

  19. Accurate segmenting of cervical tumors in PET imaging based on similarity between adjacent slices.

    PubMed

    Chen, Liyuan; Shen, Chenyang; Zhou, Zhiguo; Maquilan, Genevieve; Thomas, Kimberly; Folkert, Michael R; Albuquerque, Kevin; Wang, Jing

    2018-06-01

    Because cervical tumors in PET imaging lie close to the bladder, which accumulates a large amount of the excreted 18F-FDG tracer, conventional intensity-based segmentation methods often misclassify the bladder as tumor. Based on the observation that tumor position and area do not change dramatically from slice to slice, we propose a two-stage scheme to facilitate segmentation. In the first stage, a graph-cut-based algorithm obtains an initial tumor contour from local similarity information between voxels, starting from a manual contour of the cervical tumor on one slice. In the second stage, the initial contours are fine-tuned into a more accurate segmentation by incorporating similarity information on tumor shape and position among adjacent slices, encoded in an intensity-spatial-distance map. Experimental results illustrate that the proposed two-stage algorithm segments cervical tumors in 3D 18F-FDG PET images more effectively than the benchmark methods used for comparison. Copyright © 2018 Elsevier Ltd. All rights reserved.

  20. Non-invasive mapping of deep-tissue lymph nodes in live animals using a multimodal PET/MRI nanoparticle

    NASA Astrophysics Data System (ADS)

    Thorek, Daniel L. J.; Ulmert, David; Diop, Ndeye-Fatou M.; Lupu, Mihaela E.; Doran, Michael G.; Huang, Ruimin; Abou, Diane S.; Larson, Steven M.; Grimm, Jan

    2014-01-01

    The invasion status of tumour-draining lymph nodes (LNs) is a critical indicator of cancer stage and is important for treatment planning. Clinicians currently use planar scintigraphy and single-photon emission computed tomography (SPECT) with 99mTc-radiocolloid to guide biopsy and resection of LNs. However, emerging multimodality approaches such as positron emission tomography combined with magnetic resonance imaging (PET/MRI) detect sites of disease with higher sensitivity and accuracy. Here we present a multimodal nanoparticle, 89Zr-ferumoxytol, for the enhanced detection of LNs with PET/MRI. For genuine translational potential, we leverage a clinical iron oxide formulation, altered with minimal modification for radiolabelling. Axillary drainage in naive mice and from healthy and tumour-bearing prostates was investigated. We demonstrate that 89Zr-ferumoxytol can be used for high-resolution tomographic studies of lymphatic drainage in preclinical disease models. This nanoparticle platform has significant translational potential to improve preoperative planning for nodal resection and tumour staging.

  1. Diversifying sunflower germplasm by integration and mapping of a novel male fertility restoration gene

    USDA-ARS's Scientific Manuscript database

    The combination of a single cytoplasmic male-sterile (CMS) PET-1, originating from wild Helianthus petiolaris subsp. petiolaris Nutt., and the corresponding fertility restoration gene Rf1, has been used for commercial sunflower hybrid seed production worldwide since the early 1970s. A new CMS line 5...

  2. Correlation of simultaneously acquired diffusion-weighted imaging and 2-deoxy-[18F] fluoro-2-D-glucose positron emission tomography of pulmonary lesions in a dedicated whole-body magnetic resonance/positron emission tomography system.

    PubMed

    Schmidt, Holger; Brendle, Cornelia; Schraml, Christina; Martirosian, Petros; Bezrukov, Ilja; Hetzel, Jürgen; Müller, Mark; Sauter, Alexander; Claussen, Claus D; Pfannenberg, Christina; Schwenzer, Nina F

    2013-05-01

    Hybrid whole-body magnetic resonance/positron emission tomography (MR/PET) systems are a new diagnostic tool enabling the simultaneous acquisition of morphologic and multiple functional data, thus allowing for a diversified characterization of oncological diseases. The aim of this study was to investigate the image and alignment quality of MR/PET in patients with pulmonary lesions and to compare the congruency of the 2 functional measurements of diffusion-weighted imaging (DWI) in MR imaging and 2-deoxy-[18F] fluoro-2-D-glucose (FDG) uptake in PET. A total of 15 patients were examined with a routine positron emission tomography/computed tomography (PET/CT) protocol and, subsequently, in a whole-body MR/PET scanner allowing for simultaneous PET and MR data acquisition. The PET and MR image quality was assessed visually using a 4-point score (1, insufficient; 4, excellent). The alignment quality of the rigidly registered PET/CT and MR/PET data sets was investigated on the basis of multiple anatomic landmarks of the lung using a scoring system from 1 (no alignment) to 4 (very good alignment). In addition, the alignment quality of the tumor lesions in PET/CT and MR/PET, as well as for retrospective fusion of PET from PET/CT and MR images, was assessed quantitatively and compared between lesions strongly or less influenced by respiratory motion. The correlation of the simultaneously acquired DWI and FDG uptake in the pulmonary masses was analyzed using the minimum and mean apparent diffusion coefficients (ADCmin and ADCmean) as well as the maximum and mean standardized uptake values (SUVmax and SUVmean), respectively. In addition, the correlation of SUVmax from the PET/CT data was investigated as well. On lesions 3 cm or greater, a voxelwise analysis of ADC and SUV was performed. 
The visual evaluation revealed excellent image quality of the PET images (mean [SD] score, 3.6 [0.5]) and overall good image quality of DWI (mean [SD] scores of 2.5 [0.5] for ADC maps and 2.7 [0.5] for diffusion-weighted images, respectively). The alignment quality of the data sets was very good in both MR/PET and PET/CT without significant differences (overall mean [SD] score of MR/PET, 3.8 [0.4]; PET/CT, 3.6 [0.5]). The alignment quality of the tumor lesions also showed no significant differences between PET/CT and MR/PET (mean cumulative misalignment of MR/PET, 7.7 mm; PET/CT, 7.0 mm; P = 0.705), but both modalities differed significantly from retrospective fusion (mean cumulative misalignment, 17.1 mm; P = 0.002 and P = 0.008 for PET/CT and MR/PET, respectively). Likewise, the comparison of lesions strongly or less influenced by respiratory motion showed significant differences only for the retrospective fusion (21.3 mm vs 11.5 mm, respectively; P = 0.043). ADCmin and SUVmax, as measures of cell density and glucose metabolism, showed a significant inverse correlation (r = -0.80; P = 0.0006). No significant correlation was found between ADCmean and SUVmean (r = -0.42; P = 0.1392). SUVmax from the PET/CT data also showed a significant inverse correlation with ADCmin (r = -0.62; P = 0.019). The voxelwise analysis of 5 pulmonary lesions each showed a weak but significant negative correlation between ADC and SUV. Examinations of pulmonary lesions in a simultaneous whole-body MR/PET system provide diagnostic image quality in both modalities. Although DWI and FDG-PET reflect different tissue properties, there may well be an association between the measures of the two methods, most probably reflecting the increased cellularity and glucose metabolism of FDG-avid pulmonary lesions. A voxelwise DWI and FDG-PET correlation might provide a more sophisticated spatial characterization of pulmonary lesions.
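The r values above summarize linear association between the two functional measures; for reference, a self-contained Pearson correlation sketch on hypothetical ADC/SUV pairs (the data below are invented for illustration, not taken from the study):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

# A perfectly inverse linear toy pair gives r = -1:
adc = [0.6, 0.8, 1.0, 1.2, 1.4]     # hypothetical ADCmin values
suv = [12.0, 10.0, 8.0, 6.0, 4.0]   # hypothetical SUVmax values
print(round(pearson_r(adc, suv), 2))  # -1.0
```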

  3. Automation and uncertainty analysis of a method for in-vivo range verification in particle therapy.

    PubMed

    Frey, K; Unholtz, D; Bauer, J; Debus, J; Min, C H; Bortfeld, T; Paganetti, H; Parodi, K

    2014-10-07

    We introduce an automated calculation of range differences deduced from particle-irradiation-induced β+-activity distributions with the so-called most-likely-shift approach, and evaluate its reliability via the monitoring of algorithm- and patient-specific uncertainty factors. The calculation of the range deviation is based on the minimization of the absolute profile differences in the distal part of two activity depth profiles shifted against each other. Depending on the workflow of positron emission tomography (PET)-based range verification, the two profiles under evaluation can correspond to measured and simulated distributions, or to measured data from different treatment sessions. In comparison to previous work, the proposed approach automatically identifies the distal region of interest for each pair of PET depth profiles, taking the planned dose distribution into consideration, and yields the optimal shift distance. Moreover, it introduces an estimate of the uncertainty associated with the identified shift, which is then used as a weighting factor to 'red flag' problematically large range differences. Furthermore, additional patient-specific uncertainty factors are calculated using available computed tomography (CT) data to support the range analysis. The performance of the new method for in-vivo treatment verification in clinical routine is investigated with in-room PET images for proton therapy as well as with offline PET images for proton and carbon ion therapy. Measured PET activity distributions are compared with predictions obtained by Monte Carlo simulations or with measurements from previous treatment fractions. For this purpose, a total of 15 patient datasets were analyzed, which were acquired at Massachusetts General Hospital and the Heidelberg Ion-Beam Therapy Center with in-room PET and offline PET/CT scanners, respectively. 
Calculated range differences between the compared activity distributions are reported in a 2D map in beam's-eye view. In comparison to previously proposed approaches, the new most-likely-shift method gives more robust results when assessing the range in vivo from strongly varying PET distributions caused by differing patient geometries, ion beam species, beam delivery techniques, PET imaging concepts and counting statistics. The additional visualization of the uncertainties and the dedicated weighting strategy contribute to understanding the reliability of observed range differences and the complexity of predicting activity distributions. The proposed method promises to be a feasible technique for routine clinical PET-based range verification.
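The core of the most-likely-shift idea, choosing the profile shift that minimizes the absolute difference between two activity depth profiles, can be sketched in a simplified 1-D form (no distal-region restriction or uncertainty weighting, both of which the actual method adds):

```python
def most_likely_shift(measured, predicted, shifts=range(-5, 6)):
    """Shift (in depth bins) minimising the mean absolute difference
    between two activity depth profiles over their overlap."""
    def cost(shift):
        pairs = [(measured[i], predicted[i + shift])
                 for i in range(len(measured))
                 if 0 <= i + shift < len(predicted)]
        return sum(abs(a - b) for a, b in pairs) / len(pairs)
    return min(shifts, key=cost)

# A predicted profile and a measurement whose peak sits 3 bins deeper
# (the sign convention here is purely illustrative):
predicted = [0, 1, 4, 9, 10, 9, 4, 1, 0, 0, 0, 0, 0]
measured  = [0, 0, 0, 0, 1, 4, 9, 10, 9, 4, 1, 0, 0]
print(most_likely_shift(measured, predicted))  # -3
```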

  4. Clinical evaluation of respiration-induced attenuation uncertainties in pulmonary 3D PET/CT.

    PubMed

    Kruis, Matthijs F; van de Kamer, Jeroen B; Vogel, Wouter V; Belderbos, José Sa; Sonke, Jan-Jakob; van Herk, Marcel

    2015-12-01

    In contemporary positron emission tomography (PET)/computed tomography (CT) scanners, PET attenuation correction is performed by means of a CT-based attenuation map. Respiratory motion can, however, induce offsets between the PET and CT data, and studies have demonstrated that these offsets can cause errors in quantitative PET measures. The purpose of this study is to quantify the effects of respiration-induced CT differences on the attenuation correction of pulmonary 18F-fluorodeoxyglucose (FDG) 3D PET/CT in a patient population and to investigate contributing factors. For 32 lung cancer patients, 3D-CT, 4D-PET and 4D-CT data were acquired. The 4D FDG PET data were attenuation corrected (AC) using a free-breathing 3D-CT (3D-AC), the end-inspiration CT (EI-AC), the end-expiration CT (EE-AC) or phase-by-phase (P-AC). After reconstruction and AC, the 4D-PET data were averaged. In the 4Davg data, we measured the maximum standardised uptake value (SUVmax) in the tumour and the mean SUV (SUVmean) in a lung volume of interest (VOI) and in a muscle VOI. On the 4D-CT, we measured the lung volume differences and CT number changes between inhale and exhale in the lung VOI. Compared to P-AC, we found 2.3% (range -9.7% to 1.2%) lower tumour SUVmax with EI-AC and 2.0% (range -0.9% to 9.5%) higher SUVmax with EE-AC. No differences in the muscle SUV were found. The use of 3D-AC led to respiration-induced SUVmax differences of up to 20% compared to the use of P-AC. SUVmean differences in the lung VOI between EI-AC and EE-AC correlated with average CT differences in this region (ρ = 0.83). SUVmax differences in the tumour correlated with the volume changes of the lungs (ρ = -0.55) and the motion amplitude of the tumour (ρ = 0.53), both as measured on the 4D-CT. Respiration-induced CT variations in clinical data can, in extreme cases, lead to SUV effects larger than 10% on PET attenuation correction. These differences were case specific and correlated with differences in CT number in the lungs.
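The SUV values being compared are body-weight-normalised uptake values; for reference, the standard definition as a sketch (the unit conversion assumes 1 g of tissue is roughly 1 ml; the numbers are illustrative):

```python
def suv(activity_kbq_per_ml: float, injected_dose_mbq: float,
        body_weight_kg: float) -> float:
    """Body-weight-normalised standardised uptake value:
    SUV = tissue concentration / (injected dose / body weight)."""
    dose_kbq = injected_dose_mbq * 1000.0     # MBq -> kBq
    weight_g = body_weight_kg * 1000.0        # kg -> g (~ml of tissue)
    return activity_kbq_per_ml / (dose_kbq / weight_g)

# e.g. 5 kBq/ml uptake after a 370 MBq injection in a 74 kg patient:
print(suv(5.0, 370.0, 74.0))  # 1.0
```

Because SUV is directly proportional to the attenuation-corrected activity concentration, any error in the attenuation map propagates one-to-one into the reported SUV.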

  5. Automation and uncertainty analysis of a method for in-vivo range verification in particle therapy

    NASA Astrophysics Data System (ADS)

    Frey, K.; Unholtz, D.; Bauer, J.; Debus, J.; Min, C. H.; Bortfeld, T.; Paganetti, H.; Parodi, K.

    2014-10-01

    We introduce an automated calculation of range differences deduced from particle-irradiation-induced β+-activity distributions with the so-called most-likely-shift approach, and evaluate its reliability via the monitoring of algorithm- and patient-specific uncertainty factors. The calculation of the range deviation is based on the minimization of the absolute profile differences in the distal part of two activity depth profiles shifted against each other. Depending on the workflow of positron emission tomography (PET)-based range verification, the two profiles under evaluation can correspond to measured and simulated distributions, or to measured data from different treatment sessions. In comparison to previous work, the proposed approach automatically identifies the distal region of interest for each pair of PET depth profiles, taking the planned dose distribution into consideration, and yields the optimal shift distance. Moreover, it introduces an estimate of the uncertainty associated with the identified shift, which is then used as a weighting factor to 'red flag' problematically large range differences. Furthermore, additional patient-specific uncertainty factors are calculated using available computed tomography (CT) data to support the range analysis. The performance of the new method for in-vivo treatment verification in clinical routine is investigated with in-room PET images for proton therapy as well as with offline PET images for proton and carbon ion therapy. Measured PET activity distributions are compared with predictions obtained by Monte Carlo simulations or with measurements from previous treatment fractions. For this purpose, a total of 15 patient datasets were analyzed, which were acquired at Massachusetts General Hospital and the Heidelberg Ion-Beam Therapy Center with in-room PET and offline PET/CT scanners, respectively. 
Calculated range differences between the compared activity distributions are reported in a 2D map in beam's-eye view. In comparison to previously proposed approaches, the new most-likely-shift method gives more robust results when assessing the range in vivo from strongly varying PET distributions caused by differing patient geometries, ion beam species, beam delivery techniques, PET imaging concepts and counting statistics. The additional visualization of the uncertainties and the dedicated weighting strategy contribute to understanding the reliability of observed range differences and the complexity of predicting activity distributions. The proposed method promises to be a feasible technique for routine clinical PET-based range verification.

  6. Toward uniform probabilistic seismic hazard assessments for Southeast Asia

    NASA Astrophysics Data System (ADS)

    Chan, C. H.; Wang, Y.; Shi, X.; Ornthammarath, T.; Warnitchai, P.; Kosuwan, S.; Thant, M.; Nguyen, P. H.; Nguyen, L. M.; Solidum, R., Jr.; Irsyam, M.; Hidayati, S.; Sieh, K.

    2017-12-01

    Although most Southeast Asian countries have seismic hazard maps, varying methodologies and quality result in appreciable mismatches at national boundaries. We aim to conduct a uniform assessment across the region through standardized earthquake and fault databases, ground-shaking scenarios, and regional hazard maps. Our earthquake database contains earthquake parameters obtained from global and national seismic networks, harmonized by removal of duplicate events and the use of moment magnitude. Our active-fault database includes fault parameters from previous studies and from the databases compiled for national seismic hazard maps. Another crucial input for seismic hazard assessment is a proper evaluation of ground-shaking attenuation. Since few ground-motion prediction equations (GMPEs) have used local observations from this region, we evaluated attenuation by comparing instrumental observations and felt intensities for recent earthquakes with the ground shaking predicted by published GMPEs. We then incorporate the best-fitting GMPEs and site conditions into our seismic hazard assessments. Based on these databases and the selected GMPEs, we have constructed regional probabilistic seismic hazard maps. The assessment shows the highest seismic hazard levels near faults with high slip rates, including the Sagaing Fault in central Myanmar, the Sumatran Fault in Sumatra, the Palu-Koro, Matano and Lawanopo Faults in Sulawesi, and the Philippine Fault across several islands of the Philippines. In addition, our assessment demonstrates that regions with low earthquake probability may well have a higher aggregate probability of future earthquakes, since they encompass much larger areas than the areas of high probability. The significant irony, then, is that in areas of low to moderate probability, where building codes usually provide less seismic resilience, seismic risk is likely to be greater. 
Infrastructural damage in East Malaysia during the 2015 Sabah earthquake offers a case in point.

  7. HELP: XID+, the probabilistic de-blender for Herschel SPIRE maps

    NASA Astrophysics Data System (ADS)

    Hurley, P. D.; Oliver, S.; Betancourt, M.; Clarke, C.; Cowley, W. I.; Duivenvoorden, S.; Farrah, D.; Griffin, M.; Lacey, C.; Le Floc'h, E.; Papadopoulos, A.; Sargent, M.; Scudder, J. M.; Vaccari, M.; Valtchanov, I.; Wang, L.

    2017-01-01

    We have developed a new prior-based source extraction tool, XID+, to carry out photometry in the Herschel SPIRE (Spectral and Photometric Imaging Receiver) maps at the positions of known sources. XID+ is built on a probabilistic Bayesian framework that provides a natural way to include prior information, and it uses the Bayesian inference tool Stan to obtain the full posterior probability distribution of the flux estimates. In this paper, we describe the details of XID+ and demonstrate its basic capabilities and performance by running it on simulated SPIRE maps resembling the COSMOS field and comparing it to the current prior-based source extraction tool DESPHOT. Not only does XID+ perform better on metrics such as flux accuracy and flux-uncertainty accuracy, but we also illustrate how obtaining the posterior probability distribution can help overcome some of the issues inherent in maximum-likelihood-based source extraction routines. We run XID+ on the COSMOS SPIRE maps from the Herschel Multi-Tiered Extragalactic Survey, using a 24-μm catalogue as a positional prior and a uniform flux prior ranging from 0.01 to 1000 mJy. We show the marginalized SPIRE colour-colour plot and the marginalized contribution to the cosmic infrared background at the SPIRE wavelengths. XID+ is a core tool arising from the Herschel Extragalactic Legacy Project (HELP), and we discuss how additional work within HELP providing prior information on fluxes can and will be utilized. The software is available at https://github.com/H-E-L-P/XID_plus. We also provide the data product for COSMOS. We believe this is the first time that the full posterior probability of galaxy photometry has been provided as a data product.
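The essence of prior-based de-blending, fitting fluxes at known source positions through the instrument PSF, can be sketched with a noise-free 1-D toy solved by least squares (XID+ itself is fully Bayesian via Stan; the PSF width, positions and fluxes below are invented for illustration):

```python
import math

def psf(x: float, x0: float, fwhm: float = 2.0) -> float:
    """Gaussian PSF response of a source at x0 evaluated at pixel x."""
    sigma = fwhm / 2.3548
    return math.exp(-0.5 * ((x - x0) / sigma) ** 2)

positions = [4.0, 6.0]               # known source positions (the prior)
pixels = [float(i) for i in range(11)]
true_flux = [10.0, 5.0]

# Simulate a blended 1-D "map" (noise-free, for clarity):
A = [[psf(x, p) for p in positions] for x in pixels]
m = [sum(a * f for a, f in zip(row, true_flux)) for row in A]

# De-blend by solving the 2x2 normal equations A^T A f = A^T m:
ata = [[sum(A[i][r] * A[i][c] for i in range(len(pixels))) for c in range(2)]
       for r in range(2)]
atm = [sum(A[i][r] * m[i] for i in range(len(pixels))) for r in range(2)]
det = ata[0][0] * ata[1][1] - ata[0][1] * ata[1][0]
f0 = (atm[0] * ata[1][1] - atm[1] * ata[0][1]) / det
f1 = (ata[0][0] * atm[1] - ata[1][0] * atm[0]) / det
print(round(f0, 3), round(f1, 3))  # recovers 10.0 5.0
```

With noise added, the same linear model becomes the likelihood inside XID+'s posterior sampling, and priors on the fluxes regularize the heavily blended cases.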

  8. Probabilistic map of critical functional regions of the human cerebral cortex: Broca's area revisited.

    PubMed

    Tate, Matthew C; Herbet, Guillaume; Moritz-Gasser, Sylvie; Tate, Joseph E; Duffau, Hugues

    2014-10-01

    The organization of basic functions of the human brain, particularly in the right hemisphere, remains poorly understood. Recent advances in functional neuroimaging have improved our understanding of cortical organization but do not allow for direct interrogation or determination of essential (versus participatory) cortical regions. Direct cortical stimulation represents a unique opportunity to provide novel insights into the functional distribution of critical epicentres. Direct cortical stimulation (bipolar, 60 Hz, 1-ms pulse) was performed in 165 consecutive patients undergoing awake mapping for resection of low-grade gliomas. Tasks included motor, sensory, counting, and picture naming. Stimulation sites eliciting positive (sensory/motor) or negative (speech arrest, dysarthria, anomia, phonological and semantic paraphasias) findings were recorded and mapped onto a standard Montreal Neurological Institute brain atlas. Montreal Neurological Institute-space functional data were subjected to cluster analysis algorithms (K-means, partition around medoids, hierarchical Ward) to elucidate crucial network epicentres. Sensorimotor function was observed in the pre/post-central gyri as expected. Articulation epicentres were also found within the pre/post-central gyri. However, speech arrest localized to ventral premotor cortex, not the classical Broca's area. Anomia/paraphasia data demonstrated foci not only within classical Wernicke's area but also within the middle and inferior frontal gyri. We report the first bilateral probabilistic map of crucial cortical epicentres of human brain function in the right and left hemispheres, including sensory, motor, and language (speech, articulation, phonology and semantics). These data challenge classical theories of brain organization (e.g. Broca's area as the speech output region) and provide a distributed framework for future studies of neural networks. © The Author (2014). 
Published by Oxford University Press on behalf of the Guarantors of Brain. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  9. Vision Based Localization in Urban Environments

    NASA Technical Reports Server (NTRS)

    McHenry, Michael; Cheng, Yang; Matthies, Larry

    2005-01-01

    As part of DARPA's MARS2020 program, the Jet Propulsion Laboratory developed a vision-based system for localization in urban environments that requires neither GPS nor active sensors. System hardware consists of a pair of small FireWire cameras and a standard Pentium-based computer. The inputs to the software system are: 1) a crude grid-based map describing the positions of buildings, 2) an initial estimate of robot location and 3) the video streams produced by each camera. At each step during the traverse, the system captures new image data, finds image features hypothesized to lie on the outside of a building, computes the range to those features, determines an estimate of the robot's motion since the previous step and combines that data with the map to update a probabilistic representation of the robot's location. This probabilistic representation allows the system to represent multiple possible locations simultaneously. For our testing, we derived the a priori map manually from non-orthorectified overhead imagery, although this process could be automated. The software system consists of two primary components. The first is the vision system, which uses binocular stereo ranging together with a set of heuristics to identify features likely to be part of building exteriors and to compute an estimate of the robot's motion since the previous step. The resulting visual features and the associated range measurements are then fed to the second primary software component, a particle-filter-based localization system, which uses the map and the most recent results from the vision system to update the estimate of the robot's location. This report summarizes the design of both the hardware and software and includes the results of applying the system to the global localization of a robot over an approximately half-kilometer traverse across JPL's Pasadena campus.
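The second component can be sketched as a generic 1-D particle filter: predict each particle with noisy odometry, weight by the likelihood of a range measurement to a known landmark, then resample. All numbers and the wall-landmark setup below are illustrative, not JPL's implementation:

```python
import math
import random

def pf_step(particles, motion, z, wall_x,
            motion_sigma=0.5, meas_sigma=1.0, rng=random):
    """One predict-weight-resample cycle of a 1-D particle filter.
    z is the observed range to a wall at known position wall_x."""
    # Predict: apply odometry with additive noise
    moved = [p + motion + rng.gauss(0.0, motion_sigma) for p in particles]
    # Weight: Gaussian likelihood of the observed range
    weights = [math.exp(-0.5 * (((wall_x - p) - z) / meas_sigma) ** 2)
               for p in moved]
    # Resample proportionally to weight
    return rng.choices(moved, weights=weights, k=len(moved))

random.seed(3)
WALL = 40.0
true_x = 5.0
particles = [random.uniform(0.0, 50.0) for _ in range(2000)]  # global uncertainty
for _ in range(10):
    true_x += 1.0                               # robot advances 1 m per step
    z = WALL - true_x + random.gauss(0.0, 0.5)  # noisy range to the wall
    particles = pf_step(particles, 1.0, z, WALL)
estimate = sum(particles) / len(particles)
print(estimate)  # the particle cloud collapses near the true position, x = 15
```

The same predict/weight/resample loop, with stereo-derived building features replacing the toy wall measurement, is what lets the full system maintain multiple location hypotheses until the evidence disambiguates them.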

  10. Risk-targeted versus current seismic design maps for the conterminous United States

    USGS Publications Warehouse

    Luco, Nicolas; Ellingwood, Bruce R.; Hamburger, Ronald O.; Hooper, John D.; Kimball, Jeffrey K.; Kircher, Charles A.

    2007-01-01

    The probabilistic portions of the seismic design maps in the NEHRP Provisions (FEMA, 2003/2000/1997), in the International Building Code (ICC, 2006/2003/2000), and in ASCE Standard 7-05 (ASCE, 2005a) provide ground motion values from the USGS that have a 2% probability of being exceeded in 50 years. Under the assumption that the collapse capacity of structures designed for these "uniform-hazard" ground motions is equal to, without uncertainty, the corresponding mapped value at the location of the structure, the probability of collapse in 50 years is also uniform. This is not the case, however, when it is recognized that there is, in fact, uncertainty in the structural capacity. In that case, site-to-site variability in the shape of ground motion hazard curves results in a lack of uniformity. This paper explains the basis for proposed adjustments to the uniform-hazard portions of the seismic design maps currently in the NEHRP Provisions that result in uniform estimated collapse probability. For the seismic design of nuclear facilities, analogous but specialized adjustments have recently been defined in ASCE Standard 43-05 (ASCE, 2005b). In support of the 2009 update of the NEHRP Provisions currently being conducted by the Building Seismic Safety Council (BSSC), we provide examples of the adjusted ground motions for a selected target collapse probability (or target risk). Relative to the probabilistic MCE ground motions currently in the NEHRP Provisions, the risk-targeted ground motions for design are smaller (by as much as about 30%) in the New Madrid Seismic Zone, near Charleston, South Carolina, and in the coastal region of Oregon, with relatively little (<15%) change almost everywhere else in the conterminous U.S.
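The adjustment rests on the risk integral: convolving a site's hazard curve with an uncertain collapse fragility instead of assuming a deterministic capacity. A numerical sketch with hypothetical numbers (the lognormal fragility shape, beta = 0.6, and the power-law hazard curve are illustrative assumptions, not values from this paper):

```python
import math

def lognorm_cdf(x, median, beta):
    """Lognormal collapse fragility: P(collapse | ground motion x)."""
    return 0.5 * (1.0 + math.erf(math.log(x / median) / (beta * math.sqrt(2.0))))

def annual_collapse_freq(hazard, fragility_median, beta=0.6):
    """Numerically integrate lambda_c = sum of P(collapse | a) * |dH(a)|
    over a tabulated hazard curve [(a, annual exceedance frequency), ...]."""
    lam = 0.0
    for (a0, h0), (a1, h1) in zip(hazard, hazard[1:]):
        lam += lognorm_cdf(0.5 * (a0 + a1), fragility_median, beta) * (h0 - h1)
    return lam

# Hypothetical power-law hazard curve: H(a) = 1e-2 * (a / 0.1 g)^-2
hazard = [(0.1 * i, 1e-2 * i ** -2.0) for i in range(1, 31)]
lam = annual_collapse_freq(hazard, fragility_median=1.0)
print(1.0 - math.exp(-50.0 * lam))  # implied collapse probability in 50 years
```

Because the shape of the hazard curve enters this integral, two sites with identical 2%-in-50-year motions can imply different collapse risks, which is exactly the non-uniformity the risk-targeted maps correct.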

  11. High-resolution imaging of the large non-human primate brain using microPET: a feasibility study

    NASA Astrophysics Data System (ADS)

    Naidoo-Variawa, S.; Hey-Cunningham, A. J.; Lehnert, W.; Kench, P. L.; Kassiou, M.; Banati, R.; Meikle, S. R.

    2007-11-01

    The neuroanatomy and physiology of the baboon brain closely resemble those of the human brain, making the baboon well suited for evaluating promising new radioligands in non-human primates by PET and SPECT prior to their use in humans. These studies are commonly performed on clinical scanners with 5 mm spatial resolution at best, resulting in sub-optimal images for quantitative analysis. This study assessed the feasibility of using a microPET animal scanner to image the brains of large non-human primates, i.e., Papio hamadryas (baboon), at high resolution. Factors affecting image accuracy, including scatter, attenuation and spatial resolution, were measured under conditions approximating a baboon brain and using different reconstruction strategies. Scatter fraction measured 32% at the centre of a 10 cm diameter phantom. Scatter correction increased image contrast by up to 21% but reduced the signal-to-noise ratio. Volume resolution was superior and more uniform using maximum a posteriori (MAP) reconstructed images (3.2-3.6 mm3 FWHM from centre to 4 cm offset) compared to both 3D ordered subsets expectation maximization (OSEM) (5.6-8.3 mm3) and 3D reprojection (3DRP) (5.9-9.1 mm3). A pilot 18F-2-fluoro-2-deoxy-d-glucose ([18F]FDG) scan was performed on a healthy female adult baboon. The pilot study demonstrated the ability to adequately resolve cortical and sub-cortical grey matter structures in the baboon brain and improved contrast when images were corrected for attenuation and scatter and reconstructed by MAP. We conclude that high resolution imaging of the baboon brain with microPET is feasible with appropriate choices of reconstruction strategy and corrections for degrading physical effects. Further work to develop suitable correction algorithms for high-resolution large primate imaging is warranted.

  12. MR-guided dynamic PET reconstruction with the kernel method and spectral temporal basis functions

    NASA Astrophysics Data System (ADS)

    Novosad, Philip; Reader, Andrew J.

    2016-06-01

    Recent advances in dynamic positron emission tomography (PET) reconstruction have demonstrated that it is possible to achieve markedly improved end-point kinetic parameter maps by incorporating a temporal model of the radiotracer directly into the reconstruction algorithm. In this work we have developed a highly constrained, fully dynamic PET reconstruction algorithm incorporating both spectral analysis temporal basis functions and spatial basis functions derived from the kernel method applied to a co-registered T1-weighted magnetic resonance (MR) image. The dynamic PET image is modelled as a linear combination of spatial and temporal basis functions, and a maximum likelihood estimate for the coefficients can be found using the expectation-maximization (EM) algorithm. Following reconstruction, kinetic fitting using any temporal model of interest can be applied. Based on a BrainWeb T1-weighted MR phantom, we performed a realistic dynamic [18F]FDG simulation study with two noise levels, and investigated the quantitative performance of the proposed reconstruction algorithm, comparing it with reconstructions incorporating either spectral analysis temporal basis functions alone or kernel spatial basis functions alone, as well as with conventional frame-independent reconstruction. Compared to the other reconstruction algorithms, the proposed algorithm achieved superior performance, offering a decrease in spatially averaged pixel-level root-mean-square-error on post-reconstruction kinetic parametric maps in the grey/white matter, as well as in the tumours when they were present on the co-registered MR image. When the tumours were not visible in the MR image, reconstruction with the proposed algorithm performed similarly to reconstruction with spectral temporal basis functions and was superior to both conventional frame-independent reconstruction and frame-independent reconstruction with kernel spatial basis functions. 
Furthermore, we demonstrate that a joint spectral/kernel model can also be used for effective post-reconstruction denoising, through the use of an EM-like image-space algorithm. Finally, we applied the proposed algorithm to reconstruction of real high-resolution dynamic [11C]SCH23390 data, showing promising results.
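
    The image-space variant mentioned at the end (EM-like denoising with the joint spectral/kernel model) can be sketched in a few lines: the dynamic data are modelled as X = K·A·B, with K a row-normalized Gaussian kNN kernel built from MR intensities, B a set of decaying-exponential spectral basis functions, and A coefficients updated multiplicatively under a Poisson model. Everything below (the 1-D "image", neighbour count, theta grid) is an illustrative toy, not the authors' implementation:

```python
import numpy as np

def kernel_matrix(mr, n_neighbors=5, sigma=1.0):
    """Row-normalized Gaussian kNN kernel from MR intensities (1-D toy)."""
    d2 = (mr[:, None] - mr[None, :]) ** 2
    K = np.zeros_like(d2)
    for j, row in enumerate(d2):
        nn = np.argsort(row)[:n_neighbors]          # nearest neighbours in MR feature space
        K[j, nn] = np.exp(-row[nn] / (2 * sigma ** 2))
    return K / K.sum(axis=1, keepdims=True)

def spectral_basis(t, thetas):
    """Spectral-analysis temporal basis: decaying exponentials exp(-theta * t)."""
    return np.exp(-np.outer(thetas, t))             # shape (n_basis, n_frames)

def em_denoise(Y, K, B, n_iter=100):
    """EM-like multiplicative update for X = K @ A @ B under a Poisson model."""
    A = np.ones((K.shape[1], B.shape[0]))
    sens = K.T @ np.ones(Y.shape) @ B.T             # sensitivity (normalization) term
    for _ in range(n_iter):
        X = K @ A @ B
        A *= (K.T @ (Y / np.maximum(X, 1e-12)) @ B.T) / sens
    return K @ A @ B
```

    Because the fit is confined to the span of the kernel and spectral bases, noise outside that subspace is suppressed, which is the sense in which the joint model denoises.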

  13. MR-guided dynamic PET reconstruction with the kernel method and spectral temporal basis functions.

    PubMed

    Novosad, Philip; Reader, Andrew J

    2016-06-21

    Recent advances in dynamic positron emission tomography (PET) reconstruction have demonstrated that it is possible to achieve markedly improved end-point kinetic parameter maps by incorporating a temporal model of the radiotracer directly into the reconstruction algorithm. In this work we have developed a highly constrained, fully dynamic PET reconstruction algorithm incorporating both spectral analysis temporal basis functions and spatial basis functions derived from the kernel method applied to a co-registered T1-weighted magnetic resonance (MR) image. The dynamic PET image is modelled as a linear combination of spatial and temporal basis functions, and a maximum likelihood estimate for the coefficients can be found using the expectation-maximization (EM) algorithm. Following reconstruction, kinetic fitting using any temporal model of interest can be applied. Based on a BrainWeb T1-weighted MR phantom, we performed a realistic dynamic [(18)F]FDG simulation study with two noise levels, and investigated the quantitative performance of the proposed reconstruction algorithm, comparing it with reconstructions incorporating either spectral analysis temporal basis functions alone or kernel spatial basis functions alone, as well as with conventional frame-independent reconstruction. Compared to the other reconstruction algorithms, the proposed algorithm achieved superior performance, offering a decrease in spatially averaged pixel-level root-mean-square-error on post-reconstruction kinetic parametric maps in the grey/white matter, as well as in the tumours when they were present on the co-registered MR image. When the tumours were not visible in the MR image, reconstruction with the proposed algorithm performed similarly to reconstruction with spectral temporal basis functions and was superior to both conventional frame-independent reconstruction and frame-independent reconstruction with kernel spatial basis functions. 
Furthermore, we demonstrate that a joint spectral/kernel model can also be used for effective post-reconstruction denoising, through the use of an EM-like image-space algorithm. Finally, we applied the proposed algorithm to reconstruction of real high-resolution dynamic [(11)C]SCH23390 data, showing promising results.

  14. Voxel-based mapping of grey matter volume and glucose metabolism profiles in amyotrophic lateral sclerosis.

    PubMed

    Buhour, M-S; Doidy, F; Mondou, A; Pélerin, A; Carluer, L; Eustache, F; Viader, F; Desgranges, B

    2017-12-01

    Amyotrophic lateral sclerosis (ALS) is a rapidly progressive disease of the nervous system involving both upper and lower motor neurons. The patterns of structural and metabolic brain alterations are still unclear. Several studies using anatomical MRI yielded a number of discrepancies in their results, and a few PET studies investigated the effect of ALS on cerebral glucose metabolism. The aim of this study was threefold: to highlight the patterns of grey matter (GM) atrophy, hypometabolism and hypermetabolism in patients with ALS, then to understand the neurobehavioral significance of hypermetabolism and, finally, to investigate the regional differences between the morphologic and functional changes in ALS patients, using a specially designed voxel-based method. Thirty-seven patients with ALS and 37 age- and sex-matched healthy individuals underwent both structural MRI and [18F]-fluorodeoxyglucose (FDG) PET examinations. PET data were corrected for partial volume effects. Structural and metabolic abnormalities were examined in ALS patients compared with control subjects using two-sample t tests in statistical parametric mapping (SPM). Then, we extracted the metabolic values of clusters presenting hypermetabolism to correlate with selected cognitive scores. Finally, GM atrophy and hypometabolism patterns were directly compared with a paired t test in SPM. We found GM atrophy as well as hypometabolism in motor and extra motor regions and hypermetabolism in medial temporal lobe and cerebellum. We observed negative correlations between the metabolism of the right and left parahippocampal gyri and episodic memory and between the metabolism of right temporal pole and cognitive theory of mind. GM atrophy predominated in the temporal pole, left hippocampus and right thalamus, while hypometabolism predominated in a single cluster in the left frontal superior medial cortex.
Our findings provide direct evidence of regional variations in the hierarchy and relationships between GM atrophy and hypometabolism in ALS. Moreover, the [18F]FDG-PET investigation suggests that cerebral hypermetabolism is deleterious to cognitive function in ALS.
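
    The two-sample t contrast that SPM computes voxel-wise (patients versus controls) reduces, per voxel, to the standard pooled-variance formula. A minimal NumPy sketch of that per-voxel statistic (SPM itself fits a general linear model and corrects for multiple comparisons, which this omits):

```python
import numpy as np

def voxelwise_two_sample_t(patients, controls):
    """Voxel-wise two-sample t statistic (pooled, equal-variance form),
    for arrays of shape (n_subjects, n_voxels)."""
    n1, n2 = len(patients), len(controls)
    m1, m2 = patients.mean(axis=0), controls.mean(axis=0)
    v1, v2 = patients.var(axis=0, ddof=1), controls.var(axis=0, ddof=1)
    pooled = ((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2)  # pooled variance
    return (m1 - m2) / np.sqrt(pooled * (1.0 / n1 + 1.0 / n2))
```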

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Ching-Ching, E-mail: cyang@tccn.edu.tw; Liu, Shu-Hsin; Mok, Greta S. P.

    Purpose: This study aimed to tailor the CT imaging protocols for pediatric patients undergoing whole-body PET/CT examinations with appropriate attention to radiation exposure while maintaining adequate image quality for anatomic delineation of PET findings and attenuation correction of PET emission data. Methods: The measurements were made by using three anthropomorphic phantoms representative of 1-, 5-, and 10-year-old children with tube voltages of 80, 100, and 120 kVp, tube currents of 10, 40, 80, and 120 mA, and exposure time of 0.5 s at 1.75:1 pitch. Radiation dose estimates were derived from the dose-length product and were used to calculate risk estimates for radiation-induced cancer. The influence of image noise on image contrast and attenuation map for CT scans were evaluated based on Pearson's correlation coefficient and covariance, respectively. Multiple linear regression methods were used to investigate the effects of patient age, tube voltage, and tube current on radiation-induced cancer risk and image noise for CT scans. Results: The effective dose obtained using three anthropomorphic phantoms and 12 combinations of kVp and mA ranged from 0.09 to 4.08 mSv. Based on our results, CT scans acquired with 80 kVp/60 mA, 80 kVp/80 mA, and 100 kVp/60 mA could be performed on 1-, 5-, and 10-year-old children, respectively, to minimize cancer risk due to CT scans while maintaining the accuracy of attenuation map and CT image contrast. The effective doses of the proposed protocols for 1-, 5- and 10-year-old children were 0.65, 0.86, and 1.065 mSv, respectively. Conclusions: Low-dose pediatric CT protocols were proposed to balance the tradeoff between radiation-induced cancer risk and image quality for patients ranging in age from 1 to 10 years old undergoing whole-body PET/CT examinations.
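
    The dose estimate used above, effective dose derived from the dose-length product, is a one-line calculation E = k × DLP with an age-dependent conversion coefficient k. The coefficients below are rough illustrative values for pediatric trunk CT, not necessarily the ones used in the study:

```python
# Illustrative DLP-to-effective-dose conversion coefficients, k in mSv/(mGy*cm),
# keyed by patient age in years (approximate pediatric trunk values, for illustration only).
K_TRUNK = {1: 0.026, 5: 0.018, 10: 0.013}

def effective_dose(dlp_mgy_cm, age_years):
    """Effective dose estimate E = k * DLP, using the nearest tabulated age."""
    nearest = min(K_TRUNK, key=lambda a: abs(a - age_years))
    return K_TRUNK[nearest] * dlp_mgy_cm
```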

  16. Zero TE-based pseudo-CT image conversion in the head and its application in PET/MR attenuation correction and MR-guided radiation therapy planning.

    PubMed

    Wiesinger, Florian; Bylund, Mikael; Yang, Jaewon; Kaushik, Sandeep; Shanbhag, Dattesh; Ahn, Sangtae; Jonsson, Joakim H; Lundman, Josef A; Hope, Thomas; Nyholm, Tufve; Larson, Peder; Cozzini, Cristina

    2018-02-18

    To describe a method for converting Zero TE (ZTE) MR images into X-ray attenuation information in the form of pseudo-CT images and demonstrate its performance for (1) attenuation correction (AC) in PET/MR and (2) dose planning in MR-guided radiation therapy planning (RTP). Proton density-weighted ZTE images were acquired as input for MR-based pseudo-CT conversion, providing (1) efficient capture of short-lived bone signals, (2) flat soft-tissue contrast, and (3) fast and robust 3D MR imaging. After bias correction and normalization, the images were segmented into bone, soft-tissue, and air by means of thresholding and morphological refinements. Fixed Hounsfield replacement values were assigned for air (-1000 HU) and soft-tissue (+42 HU), whereas continuous linear mapping was used for bone. The obtained ZTE-derived pseudo-CT images accurately resembled the true CT images (i.e., Dice coefficient for bone overlap of 0.73 ± 0.08 and mean absolute error of 123 ± 25 HU evaluated over the whole head, including errors from residual registration mismatches in the neck and mouth regions). The linear bone mapping accounted for bone density variations. Averaged across five patients, ZTE-based AC demonstrated a PET error of -0.04 ± 1.68% relative to CT-based AC. Similarly, for RTP assessed in eight patients, the absolute dose difference over the target volume was found to be 0.23 ± 0.42%. The described method enables MR to pseudo-CT image conversion for the head in an accurate, robust, and fast manner without relying on anatomical prior knowledge. Potential applications include PET/MR-AC, and MR-guided RTP. © 2018 International Society for Magnetic Resonance in Medicine.
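
    The HU assignment step can be sketched as follows. The fixed values for air (-1000 HU) and soft tissue (+42 HU) and the continuous linear mapping for bone come from the abstract; the intensity thresholds and the slope/offset of the bone line are illustrative assumptions (the actual method also relies on bias correction, normalization, and morphological refinements to separate air from bone, which a pure intensity threshold cannot do):

```python
import numpy as np

def zte_to_pseudo_ct(zte, air_thr=0.2, soft_thr=0.75):
    """Toy pseudo-CT conversion from a bias-corrected, normalized ZTE image
    (soft tissue ~ 1): air darkest, bone intermediate, soft tissue brightest."""
    hu = np.where(zte >= soft_thr, 42.0, -1000.0)   # soft tissue vs air/bone default
    bone = (zte >= air_thr) & (zte < soft_thr)
    hu[bone] = -2000.0 * zte[bone] + 2000.0         # illustrative line: denser bone -> lower ZTE signal -> higher HU
    return hu
```

    The linear bone segment is what lets the pseudo-CT reflect bone-density variations instead of assigning a single HU to all bone.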

  17. Acquisition setting optimization and quantitative imaging for 124I studies with the Inveon microPET-CT system.

    PubMed

    Anizan, Nadège; Carlier, Thomas; Hindorf, Cecilia; Barbet, Jacques; Bardiès, Manuel

    2012-02-13

    Noninvasive multimodality imaging is essential for preclinical evaluation of the biodistribution and pharmacokinetics of radionuclide therapy and for monitoring tumor response. Imaging with nonstandard positron-emission tomography [PET] isotopes such as 124I is promising in that context but requires accurate activity quantification. The decay scheme of 124I implies an optimization of both acquisition settings and correction processing. The PET scanner investigated in this study was the Inveon PET/CT system dedicated to small animal imaging. The noise equivalent count rate [NECR], the scatter fraction [SF], and the gamma-prompt fraction [GF] were used to determine the best acquisition parameters for mouse- and rat-sized phantoms filled with 124I. An image-quality phantom as specified by the National Electrical Manufacturers Association NU 4-2008 protocol was acquired and reconstructed with two-dimensional filtered back projection, 2D ordered-subset expectation maximization [2DOSEM], and 3DOSEM with maximum a posteriori [3DOSEM/MAP] algorithms, with and without attenuation correction, scatter correction, and gamma-prompt correction (weighted uniform distribution subtraction). Optimal energy windows were established for the rat phantom (390 to 550 keV) and the mouse phantom (400 to 590 keV) by combining the NECR, SF, and GF results. The coincidence time window had no significant impact regarding the NECR curve variation. Activity concentration of 124I measured in the uniform region of an image-quality phantom was underestimated by 9.9% for the 3DOSEM/MAP algorithm with attenuation and scatter corrections, and by 23% with the gamma-prompt correction. Attenuation, scatter, and gamma-prompt corrections decreased the residual signal in the cold insert. The optimal energy windows were chosen with the NECR, SF, and GF evaluation. 
Nevertheless, an image quality and an activity quantification assessment were required to establish the most suitable reconstruction algorithm and corrections for 124I small animal imaging.
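
    The figures of merit used above have simple definitions; a minimal sketch (count rates in counts/s):

```python
def necr(trues, scatters, randoms, k=1.0):
    """Noise-equivalent count rate, NECR = T^2 / (T + S + k*R);
    k = 2 is commonly used when randoms come from a noisy delayed-window estimate."""
    total = trues + scatters + k * randoms
    return trues ** 2 / total if total > 0 else 0.0

def scatter_fraction(scatters, trues):
    """Scatter fraction SF = S / (S + T), e.g. the 32% reported at the phantom centre."""
    return scatters / (scatters + trues)
```

    Sweeping the energy window and evaluating NECR, SF, and (for 124I) the gamma-prompt fraction at each setting is how the optimal windows above were chosen.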

  18. Fully automatized renal parenchyma volumetry using a support vector machine based recognition system for subject-specific probability map generation in native MR volume data

    NASA Astrophysics Data System (ADS)

    Gloger, Oliver; Tönnies, Klaus; Mensel, Birger; Völzke, Henry

    2015-11-01

    In epidemiological studies as well as in clinical practice the amount of produced medical image data strongly increased in the last decade. In this context organ segmentation in MR volume data gained increasing attention for medical applications. Especially in large-scale population-based studies organ volumetry is highly relevant requiring exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automatized methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework of a two-stepped probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are refined subsequently by using several, extended segmentation strategies. We present a three class-based support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high quality subject-specific parenchyma probability maps. Several refinement strategies including a final shape-based 3D level set segmentation technique are used in subsequent processing modules to segment renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from parenchymal volume, which is important to analyze renal functions. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches.
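
    Fourier descriptors of the kind used here as SVM shape features can be computed from a closed boundary in a few lines. This is the generic construction (complex boundary signal, FFT, magnitude normalization), not necessarily the exact variant used in the framework:

```python
import numpy as np

def fourier_descriptors(contour_xy, n_desc=10):
    """Translation- and scale-invariant Fourier descriptors of a closed 2-D contour,
    given as an (n_points, 2) array of (x, y) boundary coordinates."""
    z = contour_xy[:, 0] + 1j * contour_xy[:, 1]   # boundary as a complex signal
    F = np.fft.fft(z)
    F[0] = 0.0                                     # drop DC term -> translation invariance
    mag = np.abs(F)                                # magnitudes -> rotation/start-point invariance
    mag /= mag[1]                                  # divide by first harmonic -> scale invariance
    return mag[1:1 + n_desc]
```

    The resulting fixed-length vectors can be fed to any classifier, here a three-class SVM, regardless of contour length or pose.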

  19. Fully automatized renal parenchyma volumetry using a support vector machine based recognition system for subject-specific probability map generation in native MR volume data.

    PubMed

    Gloger, Oliver; Tönnies, Klaus; Mensel, Birger; Völzke, Henry

    2015-11-21

    In epidemiological studies as well as in clinical practice the amount of produced medical image data strongly increased in the last decade. In this context organ segmentation in MR volume data gained increasing attention for medical applications. Especially in large-scale population-based studies organ volumetry is highly relevant requiring exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automatized methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework of a two-stepped probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are refined subsequently by using several, extended segmentation strategies. We present a three class-based support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high quality subject-specific parenchyma probability maps. Several refinement strategies including a final shape-based 3D level set segmentation technique are used in subsequent processing modules to segment renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from parenchymal volume, which is important to analyze renal functions. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches.

  20. Synthesis and characterization of transition metal clusters: From the isolation of ligand-stabilized solid fragments to the tuning of magnetic anisotropy and host-guest selectivity, and, Approaches to science teaching: Development of an observation instrument with a measurement model based on item response theory

    NASA Astrophysics Data System (ADS)

    Hee, Allan George

    Part I. The work presented herein describes efforts to develop general techniques for the synthesis of transition metal clusters and the manipulation of their properties. In Chapter 2, it is demonstrated that a modified metal atom reactor allows for the vaporization, passivation, and isolation of metal-chalcogenide clusters from their parent binary solids. Among the clusters produced by this method were Cr6S8(PEt3)6, Fe4S 4(PEt3)4, Co6S8(PEt 3)6, Cu6S4(PEt3)6, Cu12S6(PEt3)8, and Cu26Se 13(PEt3)14. To create single-molecule magnets with higher demagnetization barriers, we are developing metal-cyanide systems which exhibit highly adjustable magnetic behavior. Chapter 3 reports an attempt to introduce magnetic anisotropy into a MnCr6 cluster. Replacement of CrIII with Mo III resulted in the assembly of K[(Me3tacn)6MnMo 6(CN)18](ClO4)3 (Me3tacn = N,N',N″ -trimethyl-1,4,7-triazacyclononane)---the first well-documented example of a cyano-bridged single-molecule magnet. Recently, it was demonstrated that replacing Me3tacn with the less sterically hindering tach (tach = cis,cis-1,3,5-triaminocyclohexane) in the face-centered cubic cluster [(tach)8Cr8Ni 6(CN)24]Br12 provides greater access to the cluster cavity. Chapter 4 describes my efforts to probe the selectivity of this cluster toward inclusion of various guests. Part II. Successful implementation of student-centered curricula reforms requires the creation of a measurement instrument for monitoring whether the curricula are being used as intended. The creation and development of an observation instrument would greatly contribute to this effort. To develop a theoretically sound construct map, it is necessary to review the literature and conduct our own investigations of approaches to science teaching. Chapter 2 presents the findings of these investigations and their contributions to our understanding of the construct. 
Using these findings, the Science Teaching Observation Protocol (STOP) was created and designed to measure two subconstructs: intentions and strategies. Chapter 3 details the first pilot test of STOP and analysis of the collected data. In Chapter 4, the theoretical shortcomings of the instrument are analyzed and discussed. Modified versions of the intention and strategy subconstruct maps are presented.

  1. PyBetVH: A Python tool for probabilistic volcanic hazard assessment and for generation of Bayesian hazard curves and maps

    NASA Astrophysics Data System (ADS)

    Tonini, Roberto; Sandri, Laura; Thompson, Mary Anne

    2015-06-01

    PyBetVH is a completely new, free, open-source and cross-platform software implementation of the Bayesian Event Tree for Volcanic Hazard (BET_VH), a tool for estimating the probability of any magmatic hazardous phenomenon occurring in a selected time frame, accounting for all the uncertainties. New capabilities of this implementation include the ability to calculate hazard curves which describe the distribution of the exceedance probability as a function of intensity (e.g., tephra load) on a grid of points covering the target area. The computed hazard curves are (i) absolute (accounting for the probability of eruption in a given time frame, and for all the possible vent locations and eruptive sizes) and (ii) Bayesian (computed at different percentiles, in order to quantify the epistemic uncertainty). Such curves allow representation of the full information contained in the probabilistic volcanic hazard assessment (PVHA) and are well suited to become a main input to quantitative risk analyses. PyBetVH allows for interactive visualization of both the computed hazard curves, and the corresponding Bayesian hazard/probability maps. PyBetVH is designed to minimize the efforts of end users, making PVHA results accessible to people who may be less experienced in probabilistic methodologies, e.g. decision makers. The broad compatibility of Python language has also allowed PyBetVH to be installed on the VHub cyber-infrastructure, where it can be run online or downloaded at no cost. PyBetVH can be used to assess any type of magmatic hazard from any volcano. Here we illustrate how to perform a PVHA through PyBetVH using the example of analyzing tephra fallout from the Okataina Volcanic Centre (OVC), New Zealand, and highlight the range of outputs that the tool can generate.
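
    The core outputs (hazard curves reported at epistemic percentiles) can be sketched from Monte-Carlo samples of the intensity at one grid point. This is a generic empirical construction for illustration, not PyBetVH's actual Bayesian event-tree computation:

```python
import numpy as np

def hazard_curves(load_samples, thresholds, percentiles=(5, 50, 95)):
    """Empirical hazard curves from Monte-Carlo intensity samples.
    load_samples: (n_epistemic, n_aleatory) array of, e.g., tephra loads at one point.
    Returns, per threshold, the epistemic percentiles of the exceedance probability."""
    # exceedance probability per epistemic sample (rows), per threshold (columns)
    exc = np.array([[np.mean(row >= thr) for thr in thresholds] for row in load_samples])
    return {p: np.percentile(exc, p, axis=0) for p in percentiles}
```

    Fixing a threshold and mapping one percentile of the exceedance probability over the grid yields exactly the kind of Bayesian hazard/probability map the tool visualizes.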

  2. From Sensory Signals to Modality-Independent Conceptual Representations: A Probabilistic Language of Thought Approach

    PubMed Central

    Erdogan, Goker; Yildirim, Ilker; Jacobs, Robert A.

    2015-01-01

    People learn modality-independent, conceptual representations from modality-specific sensory signals. Here, we hypothesize that any system that accomplishes this feat will include three components: a representational language for characterizing modality-independent representations, a set of sensory-specific forward models for mapping from modality-independent representations to sensory signals, and an inference algorithm for inverting forward models—that is, an algorithm for using sensory signals to infer modality-independent representations. To evaluate this hypothesis, we instantiate it in the form of a computational model that learns object shape representations from visual and/or haptic signals. The model uses a probabilistic grammar to characterize modality-independent representations of object shape, uses a computer graphics toolkit and a human hand simulator to map from object representations to visual and haptic features, respectively, and uses a Bayesian inference algorithm to infer modality-independent object representations from visual and/or haptic signals. Simulation results show that the model infers identical object representations when an object is viewed, grasped, or both. That is, the model’s percepts are modality invariant. We also report the results of an experiment in which different subjects rated the similarity of pairs of objects in different sensory conditions, and show that the model provides a very accurate account of subjects’ ratings. Conceptually, this research significantly contributes to our understanding of modality invariance, an important type of perceptual constancy, by demonstrating how modality-independent representations can be acquired and used. Methodologically, it provides an important contribution to cognitive modeling, particularly an emerging probabilistic language-of-thought approach, by showing how symbolic and statistical approaches can be combined in order to understand aspects of human perception. PMID:26554704
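
    The third component, inverting a forward model with Bayes' rule, can be illustrated in a toy discrete setting; the shapes, signal labels, and probabilities below are invented purely for illustration:

```python
def infer_representation(prior, likelihood, observation):
    """Posterior over modality-independent representations given one sensory signal:
    p(shape | signal) is proportional to p(signal | shape) * p(shape)."""
    unnorm = {s: prior[s] * likelihood[s].get(observation, 0.0) for s in prior}
    z = sum(unnorm.values())
    return {s: v / z for s, v in unnorm.items()}
```

    Modality invariance falls out of this scheme: visual and haptic signals use different likelihoods (forward models) but the posterior lives over the same shape representations.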

  3. BA3b and BA1 activate in a serial fashion after median nerve stimulation: direct evidence from combining source analysis of evoked fields and cytoarchitectonic probabilistic maps.

    PubMed

    Papadelis, Christos; Eickhoff, Simon B; Zilles, Karl; Ioannides, Andreas A

    2011-01-01

    This study combines source analysis imaging data for early somatosensory processing and the probabilistic cytoarchitectonic maps (PCMs). Human somatosensory evoked fields (SEFs) were recorded by stimulating left and right median nerves. Filtering the recorded responses in different frequency ranges identified the most responsive frequency band. The short-latency averaged SEFs were analyzed using a single equivalent current dipole (ECD) model and magnetic field tomography (MFT). The identified foci of activity were superimposed with PCMs. Two major components of opposite polarity were prominent around 21 and 31 ms. A weak component around 25 ms was also identified. For the most responsive frequency band (50-150 Hz), ECD and MFT revealed one focal source at the contralateral Brodmann area 3b (BA3b) at the peak of N20. The component at ~25 ms was localised in Brodmann area 1 (BA1) in the 50-150 Hz band. Using ECD, focal generators around 28-30 ms were located initially in BA3b and, 2 ms later, in BA1. MFT also revealed two focal sources for these latencies - one in BA3b and one in BA1. Our results provide direct evidence that the earliest cortical response after median nerve stimulation is generated within the contralateral BA3b. BA1 activation a few milliseconds later indicates a serial mode of somatosensory processing within cytoarchitectonic SI subdivisions. Analysis of non-invasive magnetoencephalography (MEG) data and the use of PCMs allow unambiguous and quantitative (probabilistic) interpretation of the cytoarchitectonic identity of activated areas following median nerve stimulation, even with the simple ECD model, but only when the model fits the data extremely well. Copyright © 2010 Elsevier Inc. All rights reserved.

  4. Influence of chronic nicotine administration on cerebral type 1 cannabinoid receptor binding: an in vivo micro-PET study in the rat using [18F]MK-9470.

    PubMed

    Gérard, Nathalie; Ceccarini, Jenny; Bormans, Guy; Vanbilloen, Bert; Casteels, Cindy; Goffin, Karolien; Bosier, Barbara; Lambert, Didier M; Van Laere, Koen

    2010-10-01

    Several lines of evidence suggest a functional interaction between central nicotinic and endocannabinoid systems. Furthermore, type 1 cannabinoid receptor (CB1R) antagonism is evaluated as antismoking therapy, and nicotine usage can be an important confound in positron emission tomography (PET) imaging studies of the CB1R. We evaluated CB1R binding in the rat brain using the PET radioligand [(18)F]MK-9470 after chronic administration of nicotine. Twelve female Wistar rats were scanned at baseline and after chronic administration of either nicotine (1 mg/kg; 2 weeks daily intraperitoneal (IP)) or saline as control. In vivo micro-PET images of CB1R binding were anatomically standardized and analyzed by voxel-based statistical parametric mapping and a predefined volume-of-interest approach. We did not observe changes in [(18)F]MK-9470 binding (p(height) < 0.001, uncorrected) on a group basis in either condition. Only at a less stringent threshold of p(height) < 0.005 (uncorrected) was a modest increase observed in tracer binding in the cerebellum for nicotine (peak voxel value +6.8%, p(cluster) = 0.002, corrected). In conclusion, chronic IP administration of nicotine does not produce major cerebral changes in CB1R binding of [(18)F]MK-9470 in the rat. These results also suggest that chronic nicotine usage is unlikely to interfere with human PET imaging using this radioligand.

  5. EnviroAtlas - Potential Evapotranspiration 1950 - 2099 for the Conterminous United States

    EPA Pesticide Factsheets

    The EnviroAtlas Climate Scenarios were generated from NASA Earth Exchange (NEX) Downscaled Climate Projections (NEX-DCP30) ensemble averages (the average of over 30 available climate models) for each of the four representative concentration pathways (RCP) for the contiguous U.S. at 30 arc-second (approx. 800 m2) spatial resolution. In addition to the three climate variables provided by the NEX-DCP30 dataset (minimum monthly temperature, maximum monthly temperature, and precipitation) a corresponding estimate of potential evapotranspiration (PET) was developed to match the spatial and temporal scales of the input dataset. PET represents the cumulative amount of water returned to the atmosphere due to evaporation from Earth's surface and plant transpiration under ideal circumstances (i.e., a vegetated surface shading the ground and unlimited water supply). PET was calculated using the Hamon PET equation (Hamon, 1961) and CBM model for daylength (Forsythe et al. 1995) for the 4 RCPs (2.6, 4.5, 6.0, 8.5) and organized by season (Winter, Spring, Summer, and Fall) and annually for the years 2006-2099. Additionally, PET was calculated for the ensemble average of all historic runs and organized similarly for the years 1950-2005. This dataset was produced by the US EPA to support research and online mapping activities related to EnviroAtlas. EnviroAtlas (https://www.epa.gov/enviroatlas) allows the user to interact with a web-based, easy-to-use mapping application.
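
    The Hamon (1961) formulation referenced above reduces to a short function of mean temperature and daylength. This is one commonly quoted form of the equation, so treat the constants as indicative rather than as the exact coefficients used for the dataset:

```python
import math

def hamon_pet(temp_c, daylength_hr):
    """Daily potential evapotranspiration (mm/day), Hamon form:
    PET = 0.1651 * (daylength / 12 h) * saturated vapour density."""
    e_sat = 6.108 * math.exp(17.27 * temp_c / (temp_c + 237.3))  # sat. vapour pressure, hPa
    rho_sat = 216.7 * e_sat / (temp_c + 273.3)                   # sat. vapour density, g/m^3
    return 0.1651 * (daylength_hr / 12.0) * rho_sat
```

    Daylength is the only latitude-dependent input, which is why a daylength model (here, the CBM model) is needed to apply the equation on a spatial grid.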

  6. Biometric recognition via fixation density maps

    NASA Astrophysics Data System (ADS)

    Rigas, Ioannis; Komogortsev, Oleg V.

    2014-05-01

    This work introduces and evaluates a novel eye movement-driven biometric approach that employs eye fixation density maps for person identification. The proposed feature offers a dynamic representation of the biometric identity, storing rich information regarding the behavioral and physical eye movement characteristics of individuals. The innate ability of fixation density maps to capture the spatial layout of eye movements, in conjunction with their probabilistic nature, makes them a particularly suitable option as an eye movement biometric trait in cases when free-viewing stimuli are presented. To demonstrate the effectiveness of the proposed approach, the method is evaluated on three different datasets containing a wide gamut of stimulus types, such as static images, video, and text segments. The obtained results indicate a minimum EER (Equal Error Rate) of 18.3%, suggesting the potential of fixation density maps as an enhancing biometric cue in identification scenarios in dynamic visual environments.
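    Since the record above reports performance as an EER, here is a minimal sketch of how an equal error rate can be computed from matcher scores; the score distributions below are invented, not the paper's data.

```python
import numpy as np

def equal_error_rate(genuine, impostor):
    """Find the operating point where false accept rate ~= false reject rate.

    `genuine`/`impostor` are similarity scores (higher = more match-like).
    Returns the EER estimate at the threshold with the smallest FAR-FRR gap.
    """
    thresholds = np.sort(np.concatenate([genuine, impostor]))
    best_gap, eer = 1.0, 0.0
    for t in thresholds:
        far = np.mean(impostor >= t)  # impostors wrongly accepted
        frr = np.mean(genuine < t)    # genuine users wrongly rejected
        if abs(far - frr) < best_gap:
            best_gap, eer = abs(far - frr), (far + frr) / 2.0
    return eer

rng = np.random.default_rng(0)
genuine = rng.normal(1.0, 0.5, 500)   # hypothetical matcher scores
impostor = rng.normal(0.0, 0.5, 500)
eer = equal_error_rate(genuine, impostor)
```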

  7. The cyanobacterial cytochrome b6f subunit PetP adopts an SH3 fold in solution.

    PubMed

    Veit, Sebastian; Nagadoi, Aritaka; Rögner, Matthias; Rexroth, Sascha; Stoll, Raphael; Ikegami, Takahisa

    2016-06-01

    PetP is a peripheral subunit of the cytochrome b(6)f complex (b(6)f) present in both cyanobacteria and red algae. It is bound to the cytoplasmic surface of this membrane protein complex, where it greatly affects the efficiency of the linear photosynthetic electron flow although it is not directly involved in the electron transfer reactions. Despite the availability of crystal structures of the b(6)f core complex, structural information for the transient regulatory b(6)f subunits is still missing. Here we present the first structure of PetP at atomic resolution as determined by solution NMR. The protein adopts an SH3 fold, which is a common protein motif in eukaryotes but comparatively rare in prokaryotes. The structure of PetP enabled the identification of the potential interaction site for b(6)f binding by conservation mapping. The interaction surface is mainly formed by two large loop regions and one short 3(10) helix, which also exhibit increased flexibility as indicated by heteronuclear steady-state {(1)H}-(15)N NOE and random coil index parameters. The properties of this potential b(6)f binding site differ greatly from the canonical peptide binding site that is highly conserved in eukaryotic SH3 domains. Interestingly, three other proteins of the photosynthetic electron transport chain share this SH3 fold with PetP: NdhS of the photosynthetic NADH dehydrogenase-like complex (NDH-1), PsaE of photosystem I, and subunit α of the ferredoxin-thioredoxin reductase have, similar to PetP, a great impact on the photosynthetic electron transport. Finally, a model is presented to illustrate how SH3 domains modulate the photosynthetic electron transport processes in cyanobacteria. Copyright © 2016 Elsevier B.V. All rights reserved.

  8. Stereotaxic 18F-FDG PET and MRI templates with three-dimensional digital atlas for statistical parametric mapping analysis of tree shrew brain.

    PubMed

    Huang, Qi; Nie, Binbin; Ma, Chen; Wang, Jing; Zhang, Tianhao; Duan, Shaofeng; Wu, Shang; Liang, Shengxiang; Li, Panlong; Liu, Hua; Sun, Hua; Zhou, Jiangning; Xu, Lin; Shan, Baoci

    2018-01-01

    Tree shrews are proposed as an alternative animal model to nonhuman primates due to their close affinity to primates. Neuroimaging techniques are widely used to study brain functions and structures of humans and animals. However, tree shrews are rarely used in the neuroimaging field, partly due to the lack of available species-specific analysis methods. In this study, 10 PET/CT and 10 MRI images of tree shrew brain were used to construct PET and MRI templates; based on a histological atlas, we reconstructed a three-dimensional digital atlas with 628 structures delineated; then the digital atlas and templates were aligned into a stereotaxic space. Finally, we integrated the digital atlas and templates into a toolbox for tree shrew brain spatial normalization, statistical analysis, and results localization. We validated the feasibility of the toolbox using simulated data with lesions in the laterodorsal thalamic nucleus (LD). The lesion volumes of simulated PET and MRI images were (12.97 ± 3.91) mm(3) and (7.04 ± 0.84) mm(3). Statistical results at p < 0.005 showed the lesion volumes of PET and MRI were 13.18 mm(3) and 8.06 mm(3) in LD. To our knowledge, we report the first PET template and digital atlas of the tree shrew brain. Compared to the existing MRI templates, our MRI template was aligned into stereotaxic space, and the toolbox is the first software dedicated to tree shrew brain analysis. The templates and digital atlas of the tree shrew brain, as well as the toolbox, facilitate the use of tree shrews in the neuroimaging field. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Estimating oxygen distribution from vasculature in three-dimensional tumour tissue

    PubMed Central

    Kannan, Pavitra; Warren, Daniel R.; Markelc, Bostjan; Bates, Russell; Muschel, Ruth; Partridge, Mike

    2016-01-01

    Regions of tissue which are well oxygenated respond better to radiotherapy than hypoxic regions by up to a factor of three. If these volumes could be accurately estimated, then it might be possible to selectively boost dose to radio-resistant regions, a concept known as dose-painting. While imaging modalities such as 18F-fluoromisonidazole positron emission tomography (PET) allow identification of hypoxic regions, they are intrinsically limited by the physics of such systems to the millimetre domain, whereas tumour oxygenation is known to vary over a micrometre scale. Mathematical modelling of microscopic tumour oxygen distribution therefore has the potential to complement and enhance macroscopic information derived from PET. In this work, we develop a general method of estimating oxygen distribution in three dimensions from a source vessel map. The method is applied analytically to line sources and quasi-linear idealized line source maps, and also applied to full three-dimensional vessel distributions through a kernel method and compared with oxygen distribution in tumour sections. The model outlined is flexible and stable, and can readily be applied to estimating likely microscopic oxygen distribution from any source geometry. We also investigate the problem of reconstructing three-dimensional oxygen maps from histological and confocal two-dimensional sections, concluding that two-dimensional histological sections are generally inadequate representations of the three-dimensional oxygen distribution. PMID:26935806
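    The kernel method described above can be sketched in a toy 2D form: convolve a binary vessel (source) map with a radially decaying kernel to obtain an estimated oxygen field. The exponential kernel and its length scale below are illustrative assumptions, not the paper's fitted source kernel.

```python
import numpy as np

# Toy 2D domain: 1 where a vessel (oxygen source) sits, 0 elsewhere.
grid = np.zeros((64, 64))
grid[32, 10:54] = 1.0  # an idealized line source

# Hypothetical kernel: oxygen tension decays with distance r from a
# source; the length scale `l` is illustrative, not fitted to data.
y, x = np.indices(grid.shape)
r = np.hypot(x - 32, y - 32)
l = 5.0
kernel = np.exp(-r / l)
kernel /= kernel.sum()  # normalize so each source contributes weight 1

# Periodic FFT convolution stands in for the paper's kernel method:
# the estimated field is the sum of kernel contributions from all sources.
oxygen = np.real(
    np.fft.ifft2(np.fft.fft2(grid) * np.fft.fft2(np.fft.ifftshift(kernel)))
)
```

Points on the line source end up with high estimated oxygen, corners far from any vessel with nearly none, mimicking the hypoxic-region structure the abstract discusses.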

  10. Estimating oxygen distribution from vasculature in three-dimensional tumour tissue.

    PubMed

    Grimes, David Robert; Kannan, Pavitra; Warren, Daniel R; Markelc, Bostjan; Bates, Russell; Muschel, Ruth; Partridge, Mike

    2016-03-01

    Regions of tissue which are well oxygenated respond better to radiotherapy than hypoxic regions by up to a factor of three. If these volumes could be accurately estimated, then it might be possible to selectively boost dose to radio-resistant regions, a concept known as dose-painting. While imaging modalities such as 18F-fluoromisonidazole positron emission tomography (PET) allow identification of hypoxic regions, they are intrinsically limited by the physics of such systems to the millimetre domain, whereas tumour oxygenation is known to vary over a micrometre scale. Mathematical modelling of microscopic tumour oxygen distribution therefore has the potential to complement and enhance macroscopic information derived from PET. In this work, we develop a general method of estimating oxygen distribution in three dimensions from a source vessel map. The method is applied analytically to line sources and quasi-linear idealized line source maps, and also applied to full three-dimensional vessel distributions through a kernel method and compared with oxygen distribution in tumour sections. The model outlined is flexible and stable, and can readily be applied to estimating likely microscopic oxygen distribution from any source geometry. We also investigate the problem of reconstructing three-dimensional oxygen maps from histological and confocal two-dimensional sections, concluding that two-dimensional histological sections are generally inadequate representations of the three-dimensional oxygen distribution. © 2016 The Authors.

  11. Adaptive Decision Making Using Probabilistic Programming and Stochastic Optimization

    DTIC Science & Technology

    2018-01-01

    world optimization problems (and hence … Pred. demand (uncertain; discrete … simplify the setting, we further assume that the demands are discrete, taking on values d1, . . . , dk with probabilities (conditional on x) (pθ)i ≡ p … Tyrrell Rockafellar. Implicit functions and solution mappings. Springer Monogr. Math., 2009. Anthony V Fiacco and Yo Ishizuka. Sensitivity and stability

  12. Satellite Based Probabilistic Snow Cover Extent Mapping (SCE) at Hydro-Québec

    NASA Astrophysics Data System (ADS)

    Teasdale, Mylène; De Sève, Danielle; Angers, Jean-François; Perreault, Luc

    2016-04-01

    Over 40% of Canada's water resources are in Quebec, and Hydro-Québec is one of the largest producers of hydroelectricity in the world, with a total installed capacity of 36,643 MW. Hydro-Québec's generating fleet includes 27 large reservoirs with a combined storage capacity of 176 TWh, along with 668 dams and 98 control structures. Over 98% of all electricity used to supply the domestic market comes from water resources, and the excess output is sold on the wholesale markets. Efficient management of water resources is therefore needed, and it is based primarily on good river flow estimation supported by appropriate hydrological data. Snow on the ground is one of the most significant variables, representing 30% to 40% of the annual energy reserve. More specifically, information on snow cover extent (SCE) and snow water equivalent (SWE) is crucial for hydrological forecasting, particularly in northern regions, since the snowmelt provides the water that fills the reservoirs and is subsequently used for hydropower generation. For several years, Hydro-Québec's research institute (IREQ) has developed algorithms to map SCE and SWE; so far all of these methods have been deterministic. However, given the need to maximize the efficient use of all resources while ensuring reliability, the electrical systems must now be managed taking all risks into account. Since snow cover estimation is based on limited spatial information, it is important to quantify and handle its uncertainty in the hydrological forecasting system. This paper presents the first results of a probabilistic algorithm for mapping SCE, combining a Bayesian mixture of probability distributions and multiple logistic regression models applied to passive microwave data. This approach assigns to each grid point probabilities for the mutually exclusive discrete outcomes "snow" and "no snow". Its performance was evaluated using the Brier score, since it is particularly appropriate for measuring the accuracy of probabilistic discrete predictions. The scores were measured by comparing the snow probabilities produced by our models with Hydro-Québec's snow ground data.
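    For reference, the Brier score used above is simply the mean squared difference between forecast probabilities and binary outcomes (0 is perfect, higher is worse); a minimal sketch with invented values:

```python
import numpy as np

# Forecast P(snow) per grid point, and the observed outcome (1 = snow).
# Both arrays are illustrative only, not Hydro-Québec data.
forecast_prob = np.array([0.9, 0.8, 0.3, 0.1, 0.6])
observed = np.array([1, 1, 0, 0, 1])

# Brier score: mean squared difference between probability and outcome.
brier = np.mean((forecast_prob - observed) ** 2)
```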

  13. 18F-FLT uptake kinetics in head and neck squamous cell carcinoma: a PET imaging study.

    PubMed

    Liu, Dan; Chalkidou, Anastasia; Landau, David B; Marsden, Paul K; Fenwick, John D

    2014-04-01

    To analyze the kinetics of 3'-deoxy-3'-[(18)F]fluorothymidine (18F-FLT) uptake by head and neck squamous cell carcinomas and involved nodes imaged using positron emission tomography (PET). Two- and three-tissue compartment models were fitted to 12 tumor time-activity curves (TACs) obtained for 6 structures (tumors or involved nodes) imaged in ten dynamic PET studies of 1 h duration, carried out for five patients. The ability of the models to describe the data was assessed using a runs test, the Akaike information criterion (AIC), and leave-one-out cross-validation. To generate parametric maps the models were also fitted to TACs of individual voxels. Correlations between maps of different parameters were characterized using Pearson's r coefficient; in particular the phosphorylation rate-constants k3-2tiss and k5 of the two- and three-tissue models were studied alongside the flux parameters KFLT-2tiss and KFLT of these models, and standardized uptake values (SUV). A methodology based on expectation-maximization clustering and the Bayesian information criterion ("EM-BIC clustering") was used to distil the information from noisy parametric images. Fits of two-tissue models 2C3K and 2C4K and three-tissue models 3C5K and 3C6K, comprising three, four, five, and six rate-constants, respectively, pass the runs test for 4, 8, 10, and 11 of 12 tumor TACs. The three-tissue models have lower AIC and cross-validation scores for nine of the 12 tumors. Overall the 3C6K model has the lowest AIC and cross-validation scores, and its fitted parameter values are of the same orders of magnitude as literature estimates. Maps of KFLT and KFLT-2tiss are strongly correlated (r = 0.85) and also correlate closely with SUV maps (r = 0.72 for KFLT-2tiss, 0.64 for KFLT). 
Phosphorylation rate-constant maps are moderately correlated with flux maps (r = 0.48 for k3-2tiss vs KFLT-2tiss and r = 0.68 for k5 vs KFLT); however, neither phosphorylation rate-constant correlates significantly with SUV. EM-BIC clustering reduces the parametric maps to a small number of levels (on average 5.8, 3.5, 3.4, and 1.4 for KFLT-2tiss, KFLT, k3-2tiss, and k5). This large simplification is potentially useful for radiotherapy dose-painting, but demonstrates the high noise in some maps. Statistical simulations show that voxel-level noise degrades TACs generated from the 3C6K model sufficiently that the average AIC score, parameter bias, and total uncertainty of 2C4K model fits are similar to those of 3C6K fits, whereas at the whole-tumor level the scores are lower for 3C6K fits. For the patients studied here, whole-tumor FLT uptake time-courses are represented better overall by a three-tissue than by a two-tissue model. EM-BIC clustering simplifies noisy parametric maps, providing the best description of the underlying information they contain, and is potentially useful for radiotherapy dose-painting. However, the clustering highlights the large degree of noise present in maps of the phosphorylation rate-constants k5 and k3-2tiss, which are conceptually tightly linked to cellular proliferation. Methods must be found to make these maps more robust, either by constraining other model parameters or by modifying dynamic imaging protocols. © 2014 American Association of Physicists in Medicine.
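    As a side note on the AIC comparison used above, a Gaussian-error AIC for nested fits with different parameter counts can be sketched as follows; the residual sums of squares, sample size, and parameter counts are illustrative, not the paper's fitted compartment models.

```python
import numpy as np

def aic(rss, n_points, n_params):
    """Gaussian-error AIC up to an additive constant: lower is better.

    Rewards fit quality (smaller RSS) but penalizes extra parameters.
    """
    return n_points * np.log(rss / n_points) + 2 * n_params

n = 60  # hypothetical number of time-activity-curve samples
aic_2c4k = aic(rss=4.0, n_points=n, n_params=4)  # four rate-constants
aic_3c6k = aic(rss=3.0, n_points=n, n_params=6)  # six rate-constants
preferred = "3C6K" if aic_3c6k < aic_2c4k else "2C4K"
```

With these made-up numbers the fit improvement outweighs the two extra parameters, so the six-rate-constant model is preferred.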

  14. Application of Generative Topographic Mapping to Gear Failures Monitoring

    NASA Astrophysics Data System (ADS)

    Liao, Guanglan; Li, Weihua; Shi, Tielin; Rao, Raj B. K. N.

    2002-07-01

    The Generative Topographic Mapping (GTM) model was introduced as a probabilistic reformulation of the self-organizing map and has already been used in a variety of applications. This paper presents a study of the GTM in industrial gear failure monitoring. Vibration signals are analyzed using the GTM model, and the results show that gear feature data sets can be projected into a two-dimensional space and clustered in different areas according to their conditions, clearly classifying and identifying gear conditions with a cracked or broken tooth relative to the normal condition. By tracing the image points in the two-dimensional space, the variation of gear working conditions can be observed visually; therefore, the occurrence and trend of gear failures can be monitored in time.

  15. Modeling Political Populations with Bacteria

    NASA Astrophysics Data System (ADS)

    Cleveland, Chris; Liao, David

    2011-03-01

    Results from lattice-based simulations of micro-environments with heterogeneous nutrient resources reveal that competition between wild-type and GASP rpoS819 strains of E. coli offers mutual benefit, particularly in nutrient-deprived regions. Our computational model spatially maps bacteria populations and energy sources onto a set of 3D lattices that collectively resemble the topology of North America. By implementing Wright-Fisher reproduction into a probabilistic leap-frog scheme, we observe that populations of wild-type and GASP rpoS819 cells compete for resources and yet aid each other's long-term survival. The connection to how spatial political ideologies map in a similar way is discussed.

  16. Landslide Hazard from Coupled Inherent and Dynamic Probabilities

    NASA Astrophysics Data System (ADS)

    Strauch, R. L.; Istanbulluoglu, E.; Nudurupati, S. S.

    2015-12-01

    Landslide hazard research has typically been conducted independently from hydroclimate research. We sought to unify these two lines of research to provide regional scale landslide hazard information for risk assessments and resource management decision-making. Our approach couples an empirical inherent landslide probability, based on a frequency ratio analysis, with a numerical dynamic probability, generated by combining subsurface water recharge and surface runoff from the Variable Infiltration Capacity (VIC) macro-scale land surface hydrologic model with a finer resolution probabilistic slope stability model. Landslide hazard mapping is advanced by combining static and dynamic models of stability into a probabilistic measure of geohazard prediction in both space and time. This work will aid resource management decision-making in current and future landscape and climatic conditions. The approach is applied as a case study in North Cascade National Park Complex in northern Washington State.

  17. A probabilistic cellular automata model for the dynamics of a population driven by logistic growth and weak Allee effect

    NASA Astrophysics Data System (ADS)

    Mendonça, J. R. G.

    2018-04-01

    We propose and investigate a one-parameter probabilistic mixture of one-dimensional elementary cellular automata under the guise of a model for the dynamics of a single-species unstructured population with nonoverlapping generations in which individuals have smaller probability of reproducing and surviving in a crowded neighbourhood but also suffer from isolation and dispersal. Remarkably, the first-order mean field approximation to the dynamics of the model yields a cubic map containing terms representing both logistic and weak Allee effects. The model has a single absorbing state devoid of individuals, but depending on the reproduction and survival probabilities can achieve a stable population. We determine the critical probability separating these two phases and find that the phase transition between them is in the directed percolation universality class of critical behaviour.
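    A toy version of the mean-field picture described above: a cubic map whose logistic term is augmented by a density-dependent boost, so low starting densities decay to the empty absorbing state while moderate densities settle on a stable population. The coefficients are hypothetical; the paper derives its specific cubic map from the automaton's rules.

```python
def cubic_map(x, a=0.8, b=3.0):
    """Illustrative cubic mean-field map on the density x in [0, 1].

    a*x*(1-x) is a logistic term; b*x**2*(1-x) boosts per-capita growth
    at moderate densities (a weak-Allee-like effect). Both coefficients
    are made-up values chosen so a stable nonzero fixed point exists.
    """
    return a * x * (1.0 - x) + b * x * x * (1.0 - x)

x = 0.5  # moderate initial density
for _ in range(500):
    x = cubic_map(x)
# x has settled on the nontrivial stable fixed point (a surviving
# population). Densities near 0 instead decay to the empty absorbing
# state, since per-capita growth there is only a < 1.
```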

  18. SU-D-201-05: Phantom Study to Determine Optimal PET Reconstruction Parameters for PET/MR Imaging of Y-90 Microspheres Following Radioembolization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maughan, N; Conti, M; Parikh, P

    2015-06-15

    Purpose: Imaging Y-90 microspheres with PET/MRI following hepatic radioembolization has the potential for predicting treatment outcome and, in turn, improving patient care. The positron decay branching ratio, however, is very small (32 ppm), yielding images with poor statistics even when therapy doses are used. Our purpose is to find PET reconstruction parameters that maximize the PET recovery coefficients and minimize noise. Methods: An initial 7.5 GBq of Y-90 chloride solution was used to fill an ACR phantom for measurements with a PET/MRI scanner (Siemens Biograph mMR). Four hot cylinders and a warm background activity volume of the phantom were filled with a 10:1 ratio. Phantom attenuation maps were derived from scaled CT images of the phantom and included the MR phased array coil. The phantom was imaged at six time points between 7.5-1.0 GBq total activity over a period of eight days. PET images were reconstructed via OP-OSEM with 21 subsets and varying iteration number (1-5), post-reconstruction filter size (5-10 mm), and either absolute or relative scatter correction. Recovery coefficients, SNR, and noise were measured as well as total activity in the phantom. Results: For the 120 different reconstructions, recovery coefficients ranged from 0.1-0.6 and improved with increasing iteration number and reduced post-reconstruction filter size. SNR, however, improved substantially with lower iteration numbers and larger post-reconstruction filters. From the phantom data, we found that performing 2 iterations, 21 subsets, and applying a 5 mm Gaussian post-reconstruction filter provided optimal recovery coefficients at a moderate noise level for a wide range of activity levels. Conclusion: The choice of reconstruction parameters for Y-90 PET images greatly influences both the accuracy of measurements and image quality. We have found reconstruction parameters that provide optimal recovery coefficients with minimized noise. Future work will include the effects of the body matrix coil and off-center measurements.

  19. Utilization of a hybrid finite-element based registration method to quantify heterogeneous tumor response for adaptive treatment for lung cancer patients

    NASA Astrophysics Data System (ADS)

    Sharifi, Hoda; Zhang, Hong; Bagher-Ebadian, Hassan; Lu, Wei; Ajlouni, Munther I.; Jin, Jian-Yue; Kong, Feng-Ming (Spring); Chetty, Indrin J.; Zhong, Hualiang

    2018-03-01

    Tumor response to radiation treatment (RT) can be evaluated from changes in metabolic activity between two positron emission tomography (PET) images. Activity changes at individual voxels in pre-treatment PET images (PET1), however, cannot be derived until their associated PET-CT (CT1) images are appropriately registered to during-treatment PET-CT (CT2) images. This study aimed to investigate the feasibility of using deformable image registration (DIR) techniques to quantify radiation-induced metabolic changes on PET images. Five patients with non-small-cell lung cancer (NSCLC) treated with adaptive radiotherapy were considered. PET-CTs were acquired two weeks before RT and 18 fractions after the start of RT. DIR was performed from CT1 to CT2 using B-Spline and diffeomorphic Demons algorithms. The resultant displacements in the tumor region were then corrected using a hybrid finite element method (FEM). Bitmap masks generated from gross tumor volumes (GTVs) in PET1 were deformed using the four different displacement vector fields (DVFs). The conservation of total lesion glycolysis (TLG) in GTVs was used as a criterion to evaluate the quality of these registrations. The deformed masks were united to form a large mask which was then partitioned into multiple layers from center to border. The averages of SUV changes over all the layers were 1.0 ± 1.3, 1.0 ± 1.2, 0.8 ± 1.3, and 1.1 ± 1.5 for the B-Spline, B-Spline + FEM, Demons, and Demons + FEM algorithms, respectively. TLG changes before and after mapping using B-Spline, Demons, hybrid-B-Spline, and hybrid-Demons registrations were 20.2%, 28.3%, 8.7%, and 2.2% on average, respectively. Compared to image intensity-based DIR algorithms, the hybrid FEM modeling technique is better at preserving TLG and could be useful for evaluation of tumor response for patients with regressing tumors.
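    The TLG criterion mentioned above can be sketched as the mean SUV within the GTV mask times the mask volume; the arrays and voxel size below are synthetic stand-ins, not the study's data.

```python
import numpy as np

# Hypothetical (4 mm)^3 voxels -> 0.064 ml per voxel (assumed geometry).
voxel_volume_ml = 0.064

# Toy PET SUV volume and a toy GTV mask; real data would come from the
# registered PET-CT images described in the record.
suv = np.full((10, 10, 10), 2.5)
mask = np.zeros_like(suv, dtype=bool)
mask[3:7, 3:7, 3:7] = True  # 4x4x4-voxel "tumor"

# TLG = mean SUV inside the GTV * GTV volume; conservation of this value
# before and after deforming the mask scores the registration.
tlg = suv[mask].mean() * mask.sum() * voxel_volume_ml
```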

  20. SU-E-I-88: Realistic Pathological Simulations of the NCAT and Zubal Anthropomorphic Models, Based on Clinical PET/CT Data.

    PubMed

    Papadimitroulas, P; Loudos, G; Le Maitre, A; Efthimiou, N; Visvikis, D; Nikiforidis, G; Kagadis, G C

    2012-06-01

    In the present study a patient-specific dataset of realistic PET simulations was created, taking into account the variability of clinical oncology data. Tumor variability was tested in the simulated results. A comparison of the produced simulated data was performed against clinical PET/CT data for the validation and evaluation of the procedure. Clinical PET/CT data of oncology patients were used as the basis of the simulated variability, inserting patient-specific characteristics into the NCAT and Zubal anthropomorphic phantoms. The GATE Monte Carlo toolkit was used for simulating a commercial PET scanner. The standard computational anthropomorphic phantoms were adapted to the CT data (organ shapes) using a fitting algorithm. The activity map was derived from PET images. Patient tumors were segmented and inserted in the phantom using different activity distributions. The produced simulated data were reconstructed using the open-source STIR software and compared to the original clinical data. The accuracy of the procedure was tested in four different oncology cases. Each pathological situation was illustrated by simulating (a) a healthy body, (b) insertion of the clinical tumor with homogeneous activity, and (c) insertion of the clinical tumor with variable activity (voxel-by-voxel) based on the clinical PET data. The accuracy of the presented dataset was compared to the original PET/CT data. Partial Volume Correction (PVC) was also applied to the simulated data. In this study patient-specific characteristics were used in computational anthropomorphic models to simulate realistic pathological patients. Voxel-by-voxel activity distribution with PVC within the tumor gives the most accurate results. Radiotherapy applications can utilize the benefits of accurate realistic imaging simulations, using the anatomical and biological information of each patient. 
Further work will incorporate the development of analytical anthropomorphic models with motion and cardiac correction, combined with pathological patients to achieve high accuracy in tumor imaging. This research was supported by the Joint Research and Technology Program between Greece and France; 2009-2011 (protocol ID: 09FR103). © 2012 American Association of Physicists in Medicine.

  1. Correction of MRI-induced geometric distortions in whole-body small animal PET-MRI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frohwein, Lynn J., E-mail: frohwein@uni-muenster.de; Schäfers, Klaus P.; Hoerr, Verena

    Purpose: The fusion of positron emission tomography (PET) and magnetic resonance imaging (MRI) data can be a challenging task in whole-body PET-MRI. The quality of the registration between these two modalities in large field-of-views (FOV) is often degraded by geometric distortions of the MRI data. The distortions at the edges of large FOVs mainly originate from MRI gradient nonlinearities. This work describes a method to measure and correct for these kinds of geometric distortions in small animal MRI scanners to improve the registration accuracy of PET and MRI data. Methods: The authors have developed a geometric phantom which allows the measurement of geometric distortions in all spatial axes via control points. These control points are detected semiautomatically in both PET and MRI data with subpixel accuracy. The spatial transformation between PET and MRI data is determined with these control points via 3D thin-plate splines (3D TPS). The transformation derived from the 3D TPS is finally applied to real MRI mouse data, which were acquired with the same scan parameters used in the phantom data acquisitions. Additionally, the influence of the phantom material on the homogeneity of the magnetic field is determined via field mapping. Results: The spatial shift attributable to the magnetic field homogeneity caused by the phantom material was determined to be 0.1 mm on average. The results of the correction show that distortion with a maximum error of 4 mm could be reduced to less than 1 mm with the proposed correction method. Furthermore, the control point-based registration of PET and MRI data showed improved congruence after correction. Conclusions: The developed phantom has been shown to have no considerable negative effect on the homogeneity of the magnetic field. The proposed method yields an appropriate correction of the measured MRI distortion and is able to improve the PET and MRI registration. Furthermore, the method is applicable to whole-body small animal imaging routines including different standard MRI sequences.

  2. Quantifying diffusion MRI tractography of the corticospinal tract in brain tumors with deterministic and probabilistic methods☆

    PubMed Central

    Bucci, Monica; Mandelli, Maria Luisa; Berman, Jeffrey I.; Amirbekian, Bagrat; Nguyen, Christopher; Berger, Mitchel S.; Henry, Roland G.

    2013-01-01

    Introduction Diffusion MRI tractography has been increasingly used to delineate white matter pathways in vivo, for which the leading clinical application is presurgical mapping of eloquent regions. However, there are few opportunities to quantify the accuracy or sensitivity of these approaches to delineate white matter fiber pathways in vivo due to the lack of a gold standard. Intraoperative electrical stimulation (IES) provides a gold standard for the location and existence of functional motor pathways that can be used to determine the accuracy and sensitivity of fiber tracking algorithms. In this study we used intraoperative stimulation from brain tumor patients as a gold standard to estimate the sensitivity and accuracy of diffusion tensor MRI (DTI) and q-ball models of diffusion with deterministic and probabilistic fiber tracking algorithms for delineation of motor pathways. Methods We used preoperative high angular resolution diffusion MRI (HARDI) data (55 directions, b = 2000 s/mm(2)) acquired in a clinically feasible time frame from 12 patients who underwent a craniotomy for resection of a cerebral glioma. The corticospinal fiber tracts were delineated with DTI and q-ball models using deterministic and probabilistic algorithms. We used cortical and white matter IES sites as a gold standard for the presence and location of functional motor pathways. Sensitivity was defined as the true positive rate of delineating fiber pathways based on cortical IES stimulation sites. For accuracy and precision of the course of the fiber tracts, we measured the distance between the subcortical stimulation sites and the tractography result. The positive predictive rate of the delineated tracts was assessed by comparison of subcortical IES motor function (upper extremity, lower extremity, face) with the connection of the tractography pathway in the motor cortex. Results We obtained 21 cortical and 8 subcortical IES sites from intraoperative mapping of motor pathways. 
Probabilistic q-ball had the best sensitivity (79%) as determined from cortical IES, compared to deterministic q-ball (50%), probabilistic DTI (36%), and deterministic DTI (10%). The sensitivity using the q-ball algorithm (65%) was significantly higher than using DTI (23%) (p < 0.001), and the probabilistic algorithms (58%) were more sensitive than deterministic approaches (30%) (p = 0.003). Probabilistic q-ball fiber tracks had the smallest offset to the subcortical stimulation sites. The offsets between diffusion fiber tracks and subcortical IES sites increased significantly in those cases where the diffusion fiber tracks were visibly thinner than expected. There was perfect concordance between the subcortical IES function (e.g. hand stimulation) and the cortical connection of the nearest diffusion fiber track (e.g. upper extremity cortex). Discussion This study highlights the tremendous utility of intraoperative stimulation sites to provide a gold standard from which to evaluate diffusion MRI fiber tracking methods and has provided an objective standard for evaluation of different diffusion models and approaches to fiber tracking. The probabilistic q-ball fiber tractography was significantly better than DTI methods in terms of sensitivity and accuracy of the course through the white matter. The commonly used DTI fiber tracking approach was shown to have very poor sensitivity (as low as 10% for deterministic DTI fiber tracking) for delineation of the lateral aspects of the corticospinal tract in our study. Effects of the tumor/edema resulted in significantly larger offsets between the subcortical IES and the preoperative fiber tracks. The provided data show that probabilistic HARDI tractography is the most objective and reproducible analysis, but given the small sample and number of stimulation points, generalizations from our results should be made with caution. 
Nonetheless, our results inform the capabilities of preoperative diffusion fiber tracking and indicate that such data should be used carefully when making pre-surgical and intra-operative management decisions. PMID:24273719
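
    The two evaluation metrics defined in the Methods (true-positive rate over cortical IES sites, and nearest-point offsets to subcortical IES sites) can be sketched as follows. The function names and toy coordinates are illustrative, not the study's data:

```python
import numpy as np

def sensitivity(reached):
    """True-positive rate: fraction of cortical IES sites at which the
    fiber-tracking result reached the stimulated cortex."""
    return float(np.asarray(reached, dtype=bool).mean())

def offsets_to_tract(ies_sites, tract_points):
    """Euclidean distance (mm) from each subcortical IES site to the
    nearest point of the reconstructed tract."""
    ies = np.asarray(ies_sites, dtype=float)       # (n, 3)
    tract = np.asarray(tract_points, dtype=float)  # (m, 3)
    d = np.linalg.norm(ies[:, None, :] - tract[None, :, :], axis=2)
    return d.min(axis=1)

# Toy numbers: 21 cortical sites, 16 reached by the delineated tract.
sens = sensitivity([True] * 16 + [False] * 5)
offs = offsets_to_tract([[0, 0, 0], [10, 0, 0]],
                        [[1, 0, 0], [12, 0, 0], [5, 5, 0]])
```

    A per-algorithm comparison (deterministic vs. probabilistic, DTI vs. q-ball) then reduces to computing these two quantities for each set of delineated tracts.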

  3. Enhancing Flood Prediction Reliability Using Bayesian Model Averaging

    NASA Astrophysics Data System (ADS)

    Liu, Z.; Merwade, V.

    2017-12-01

    Uncertainty analysis is an indispensable part of modeling the hydrology and hydrodynamics of non-idealized environmental systems. Compared to relying on the prediction from a single model simulation, using an ensemble of predictions that considers uncertainty from different sources is more reliable. In this study, Bayesian model averaging (BMA) is applied to the Black River watershed in Arkansas and Missouri by combining multi-model simulations to obtain reliable deterministic water stage and probabilistic inundation extent predictions. The simulation ensemble is generated from 81 LISFLOOD-FP subgrid model configurations that include uncertainty from channel shape, channel width, channel roughness, and discharge. Model simulation outputs are trained with observed water stage data during one flood event, and BMA prediction ability is validated for another flood event. Results from this study indicate that BMA does not always outperform all members in the ensemble, but it provides relatively robust deterministic flood stage predictions across the basin. Station-based BMA (BMA_S) water stage prediction performs better than global BMA (BMA_G) prediction, which in turn is superior to the ensemble mean prediction. Additionally, the high-probability flood inundation extent (probability greater than 60%) in the BMA_G probabilistic map is more accurate than the probabilistic flood inundation extent based on equal weights.
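
    The core BMA step described above can be sketched in miniature. This is a hedged simplification, not the LISFLOOD-FP workflow: member weights come from a Gaussian likelihood of training-period stage errors, and the same weights produce both a deterministic stage forecast and a probabilistic wet/dry map. The error scale sigma and all numbers are illustrative:

```python
import numpy as np

def bma_weights(preds, obs, sigma):
    """Posterior member weights from a Gaussian likelihood of the
    training-period stage errors (sigma is an assumed error scale)."""
    sse = ((preds - obs) ** 2).sum(axis=1)
    logw = -0.5 * sse / sigma**2
    w = np.exp(logw - logw.max())    # subtract max for numerical stability
    return w / w.sum()

# Toy ensemble: 3 members, 3 training times.
train = np.array([[1.0, 2.0, 3.0],
                  [1.1, 2.1, 3.1],
                  [2.0, 3.0, 4.0]])
obs = np.array([1.0, 2.0, 3.0])
w = bma_weights(train, obs, sigma=0.5)

stage_forecast = float(w @ np.array([3.2, 3.3, 4.2]))  # deterministic stage
wet = np.array([[1, 0],
                [1, 1],
                [0, 0]])               # member wet/dry maps (2 cells)
p_inundation = w @ wet                 # probabilistic inundation extent
```

    Members that matched the training observations poorly (the third row here) receive near-zero weight, so the combined forecast leans on the better-calibrated configurations.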

  4. Probabilistic TSUnami Hazard MAPS for the NEAM Region: The TSUMAPS-NEAM Project

    NASA Astrophysics Data System (ADS)

    Basili, R.; Babeyko, A. Y.; Baptista, M. A.; Ben Abdallah, S.; Canals, M.; El Mouraouah, A.; Harbitz, C. B.; Ibenbrahim, A.; Lastras, G.; Lorito, S.; Løvholt, F.; Matias, L. M.; Omira, R.; Papadopoulos, G. A.; Pekcan, O.; Nmiri, A.; Selva, J.; Yalciner, A. C.

    2016-12-01

    As global awareness of tsunami hazard and risk grows, the North-East Atlantic, the Mediterranean, and connected Seas (NEAM) region still lacks a thorough probabilistic tsunami hazard assessment. The TSUMAPS-NEAM project aims to fill this gap in the NEAM region by 1) producing the first region-wide long-term homogeneous Probabilistic Tsunami Hazard Assessment (PTHA) from earthquake sources, and by 2) triggering a common tsunami risk management strategy. The specific objectives of the project are tackled by the following four consecutive actions: 1) Conduct a state-of-the-art, standardized, and updatable PTHA with full uncertainty treatment; 2) Review the entire process with international experts; 3) Produce the PTHA database, with documentation of the entire hazard assessment process; and 4) Publicize the results through an awareness raising and education phase, and a capacity building phase. This presentation will illustrate the project layout, summarize its current status of advancement and prospective results, and outline its connections with similar initiatives in the international context. The TSUMAPS-NEAM Project (http://www.tsumaps-neam.eu/) is co-financed by the European Union Civil Protection Mechanism, Agreement Number: ECHO/SUB/2015/718568/PREV26.

  5. A spatio-temporal model for probabilistic seismic hazard zonation of Tehran

    NASA Astrophysics Data System (ADS)

    Hashemi, Mahdi; Alesheikh, Ali Asghar; Zolfaghari, Mohammad Reza

    2013-08-01

    A precondition for all disaster management steps, building damage prediction, and construction code development is a hazard assessment that shows the exceedance probabilities of different ground motion levels at a site, considering different near- and far-field earthquake sources. The seismic sources are usually categorized as time-independent area sources and time-dependent fault sources. While the former incorporates small and medium events, the latter takes into account only the large characteristic earthquakes. In this article, a probabilistic approach is proposed to aggregate the effects of time-dependent and time-independent sources on seismic hazard. The methodology is then applied to generate three probabilistic seismic hazard maps of Tehran for 10%, 5%, and 2% exceedance probabilities in 50 years. The results indicate an increase in peak ground acceleration (PGA) values toward the southeastern part of the study area, and the PGA variations are mostly controlled by the shear wave velocities across the city. In addition, the implementation of the methodology takes advantage of GIS capabilities, especially raster-based analyses and representations. During the estimation of the PGA exceedance rates, the emphasis has been placed on incorporating the effects of different attenuation relationships and seismic source models by using a logic tree.
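
    The aggregation step described above can be illustrated numerically: annual exceedance rates from independent sources add, and the Poisson assumption converts a rate into an exceedance probability over a design life. All rate and PGA values below are invented for illustration, not Tehran's hazard curve:

```python
import numpy as np

pga = np.array([0.05, 0.1, 0.2, 0.4])                # g, illustrative levels
rate_area = np.array([0.05, 0.02, 0.005, 0.001])     # time-independent sources
rate_fault = np.array([0.01, 0.006, 0.002, 0.0004])  # characteristic events

rate_total = rate_area + rate_fault                  # aggregated hazard curve

T = 50.0
p_exceed = 1.0 - np.exp(-rate_total * T)             # Poisson, 50-year window

# PGA with 10% exceedance probability in 50 years, interpolated on the
# hazard curve in log-rate space:
target_rate = -np.log(1.0 - 0.10) / T
pga_10in50 = float(np.interp(np.log(target_rate),
                             np.log(rate_total[::-1]), pga[::-1]))
```

    Repeating the calculation for 5% and 2% in 50 years yields the other two map levels; in a full PSHA the rates themselves would be weighted over the logic-tree branches.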

  6. Probabilistic TSUnami Hazard MAPS for the NEAM Region: The TSUMAPS-NEAM Project

    NASA Astrophysics Data System (ADS)

    Basili, Roberto; Babeyko, Andrey Y.; Hoechner, Andreas; Baptista, Maria Ana; Ben Abdallah, Samir; Canals, Miquel; El Mouraouah, Azelarab; Bonnevie Harbitz, Carl; Ibenbrahim, Aomar; Lastras, Galderic; Lorito, Stefano; Løvholt, Finn; Matias, Luis Manuel; Omira, Rachid; Papadopoulos, Gerassimos A.; Pekcan, Onur; Nmiri, Abdelwaheb; Selva, Jacopo; Yalciner, Ahmet C.; Thio, Hong K.

    2017-04-01

    As global awareness of tsunami hazard and risk grows, the North-East Atlantic, the Mediterranean, and connected Seas (NEAM) region still lacks a thorough probabilistic tsunami hazard assessment. The TSUMAPS-NEAM project aims to fill this gap in the NEAM region by 1) producing the first region-wide long-term homogeneous Probabilistic Tsunami Hazard Assessment (PTHA) from earthquake sources, and by 2) triggering a common tsunami risk management strategy. The specific objectives of the project are tackled by the following four consecutive actions: 1) Conduct a state-of-the-art, standardized, and updatable PTHA with full uncertainty treatment; 2) Review the entire process with international experts; 3) Produce the PTHA database, with documentation of the entire hazard assessment process; and 4) Publicize the results through an awareness raising and education phase, and a capacity building phase. This presentation will illustrate the project layout, summarize its current status of advancement including the first preliminary release of the assessment, and outline its connections with similar initiatives in the international context. The TSUMAPS-NEAM Project (http://www.tsumaps-neam.eu/) is co-financed by the European Union Civil Protection Mechanism, Agreement Number: ECHO/SUB/2015/718568/PREV26.

  7. Probing the Small-scale Structure in Strongly Lensed Systems via Transdimensional Inference

    NASA Astrophysics Data System (ADS)

    Daylan, Tansu; Cyr-Racine, Francis-Yan; Diaz Rivero, Ana; Dvorkin, Cora; Finkbeiner, Douglas P.

    2018-02-01

    Strong lensing is a sensitive probe of the small-scale density fluctuations in the Universe. We implement a pipeline to model strongly lensed systems using probabilistic cataloging, which is a transdimensional, hierarchical, and Bayesian framework to sample from a metamodel (a union of models with different dimensionality) consistent with observed photon count maps. Probabilistic cataloging allows one to robustly characterize modeling covariances within and across lens models with different numbers of subhalos. Unlike traditional cataloging of subhalos, it does not require model subhalos to improve the goodness of fit above the detection threshold. Instead, it allows the exploitation of all information contained in the photon count maps, for instance, when constraining the subhalo mass function. Using a simulated Hubble Space Telescope data set, we show that the subhalo mass function can be probed even when many subhalos in the sample catalogs are individually below the detection threshold and would be absent in a traditional catalog. We further show that, by not including these small subhalos in the lens model, fixed-dimensional inference methods can significantly mismodel the data. The implemented software, Probabilistic Cataloger (PCAT), is made publicly available at https://github.com/tdaylan/pcat.

  8. A Max-Flow Based Algorithm for Connected Target Coverage with Probabilistic Sensors

    PubMed Central

    Shan, Anxing; Xu, Xianghua; Cheng, Zongmao; Wang, Wensheng

    2017-01-01

    Coverage is a fundamental issue in the research field of wireless sensor networks (WSNs). Connected target coverage addresses sensor placement that guarantees the needs of both coverage and connectivity. Existing works largely rely on the Boolean disk model, which is only a coarse approximation of the practical sensing model. In this paper, we focus on the connected target coverage issue based on the probabilistic sensing model, which can characterize the quality of coverage more accurately. In the probabilistic sensing model, sensors are only able to detect a target with a certain probability. We study the collaborative detection probability of a target under multiple sensors. Armed with the analysis of collaborative detection probability, we further formulate the minimum ϵ-connected target coverage problem, aiming to minimize the number of sensors satisfying the requirements of both coverage and connectivity. We map it into a flow graph and present an approximation algorithm called the minimum vertices maximum flow algorithm (MVMFA) with provable time complexity and approximation ratio. To evaluate our design, we analyze the performance of MVMFA theoretically and also conduct extensive simulation studies to demonstrate the effectiveness of our proposed algorithm. PMID:28587084
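
    The collaborative detection idea can be sketched directly. The exponential sensing model below is an assumed illustrative form, not necessarily the paper's: each sensor detects a target with a distance-dependent probability, sensors miss independently, so the collaborative detection probability is one minus the joint miss probability:

```python
import math

def detect_prob(dist, r_ref=10.0, lam=0.5):
    """Illustrative exponential sensing model: detection probability
    decays smoothly with sensor-target distance."""
    return math.exp(-lam * dist / r_ref)

def collaborative_detect(dists):
    """Joint detection by independent sensors: one minus the
    probability that every sensor misses the target."""
    miss = 1.0
    for d in dists:
        miss *= 1.0 - detect_prob(d)
    return 1.0 - miss

p_single = detect_prob(10.0)                        # one sensor, 10 m away
p_joint = collaborative_detect([10.0, 10.0, 10.0])  # three such sensors
covered = p_joint >= 0.9                            # epsilon-coverage test
```

    A target counts as ϵ-covered when the collaborative probability reaches the threshold ϵ, which is exactly the constraint the minimum ϵ-connected target coverage problem places on each target.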

  9. A Max-Flow Based Algorithm for Connected Target Coverage with Probabilistic Sensors.

    PubMed

    Shan, Anxing; Xu, Xianghua; Cheng, Zongmao; Wang, Wensheng

    2017-05-25

    Coverage is a fundamental issue in the research field of wireless sensor networks (WSNs). Connected target coverage addresses sensor placement that guarantees the needs of both coverage and connectivity. Existing works largely rely on the Boolean disk model, which is only a coarse approximation of the practical sensing model. In this paper, we focus on the connected target coverage issue based on the probabilistic sensing model, which can characterize the quality of coverage more accurately. In the probabilistic sensing model, sensors are only able to detect a target with a certain probability. We study the collaborative detection probability of a target under multiple sensors. Armed with the analysis of collaborative detection probability, we further formulate the minimum ϵ-connected target coverage problem, aiming to minimize the number of sensors satisfying the requirements of both coverage and connectivity. We map it into a flow graph and present an approximation algorithm called the minimum vertices maximum flow algorithm (MVMFA) with provable time complexity and approximation ratio. To evaluate our design, we analyze the performance of MVMFA theoretically and also conduct extensive simulation studies to demonstrate the effectiveness of our proposed algorithm.

  10. MLAA-based RF surface coil attenuation estimation in hybrid PET/MR imaging

    NASA Astrophysics Data System (ADS)

    Heußer, Thorsten; Rank, Christopher M.; Freitag, Martin T.; Kachelrieß, Marc

    2017-03-01

    Attenuation correction (AC) for both patient and hardware attenuation of the 511 keV annihilation photons is required for accurate PET quantification. In hybrid PET/MR imaging, AC for stationary hardware components such as the patient table and MR head coil is performed using CT-derived attenuation templates. AC for flexible hardware components such as MR radiofrequency (RF) surface coils is more challenging. Registration-based approaches, aligning scaled CT-derived attenuation templates with the current patient position, have been proposed but are not used in clinical routine. Ignoring RF coil attenuation has been shown to result in regional activity underestimation of up to 18%. We propose to employ a modified version of the maximum-likelihood reconstruction of attenuation and activity (MLAA) algorithm to obtain an estimate of the RF coil attenuation. Starting with an initial attenuation map not including the RF coil, the attenuation update of MLAA is applied outside the body outline only, allowing RF coil attenuation to be estimated without changing the patient attenuation map. Hence, the proposed method is referred to as external MLAA (xMLAA). In this work, xMLAA for RF surface coil attenuation estimation is investigated using phantom and patient data acquired with a Siemens Biograph mMR. For the phantom data, average activity errors compared to the ground truth were reduced from -8.1% to +0.8% when using the proposed method. Patient data revealed an average activity underestimation of -6.1% for the abdominal region and -5.3% for the thoracic region when ignoring RF coil attenuation.
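
    The masked-update idea behind xMLAA can be shown schematically. This is an illustrative sketch, not the Biograph mMR implementation: an MLAA-style attenuation update (here simplified to an additive step with toy values) is applied only outside the body mask, so the patient attenuation map stays frozen while coil attenuation is estimated:

```python
import numpy as np

def xmlaa_attenuation_step(mu, delta, body_mask):
    """Apply an (illustrative, additive) attenuation update outside the
    body mask only; voxels inside the patient are left untouched."""
    outside = ~body_mask
    mu_new = mu.copy()
    mu_new[outside] = np.maximum(mu[outside] + delta[outside], 0.0)
    return mu_new

mu = np.array([[0.096, 0.0], [0.096, 0.0]])      # 1/cm; coil region starts at 0
delta = np.array([[0.01, 0.004], [0.01, 0.002]]) # toy MLAA-style update
body = np.array([[True, False], [True, False]])  # left column = patient
mu_new = xmlaa_attenuation_step(mu, delta, body)
```

    Iterating such masked attenuation steps alternately with activity updates would build up coil attenuation outside the body outline while the soft-tissue values inside remain exactly as initialized.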

  11. Gender differences in cerebral metabolism for color processing in mice: A PET/MRI Study.

    PubMed

    Njemanze, Philip C; Kranz, Mathias; Amend, Mario; Hauser, Jens; Wehrl, Hans; Brust, Peter

    2017-01-01

    Color processing is a central component of mammalian vision. Gender-related differences in color processing revealed by non-invasive functional transcranial Doppler ultrasound suggested a right hemisphere pattern for blue/yellow chromatic opponency in men, and a left hemisphere pattern in women. The present study measured the accumulation of [18F]fluorodeoxyglucose ([18F]FDG) in mouse brain using small-animal positron emission tomography and magnetic resonance imaging (PET/MRI) with statistical parametric mapping (SPM) during light stimulation with blue and yellow filters, compared to a darkness condition. Relative to the dark condition, PET revealed a pattern reversed from previous human studies: male mice presented with left visual cortex dominance for blue through the right eye, while female mice presented with right visual cortex dominance for blue through the left eye. We applied SPM to examine gender differences in activated architectonic areas within the orbital and medial prefrontal cortex and related cortical and sub-cortical areas that lead to the striatum, medial thalamus, and other brain areas. The metabolic connectivity of the orbital and medial prefrontal cortex evoked by blue stimulation spread through a wide range of brain structures implicated in viscerosensory and visceromotor systems, in the left intra-hemispheric regions in male mice but in the right-to-left inter-hemispheric regions in female mice. Color functional ocular dominance plasticity was noted in the right eye in male mice but in the left eye in female mice. This animal model of color processing could be applied to study the role of gender differences in brain disease.

  12. Differentiation of Central Lung Cancer from Atelectasis: Comparison of Diffusion-Weighted MRI with PET/CT

    PubMed Central

    Yang, Rui-Meng; Li, Long; Wei, Xin-Hua; Guo, Yong-Mei; Huang, Yun-Hai; Lai, Li-Sha; Chen, A-Mei; Liu, Guo-Shun; Xiong, Wei-Feng; Luo, Liang-Ping; Jiang, Xin-Qing

    2013-01-01

    Objective To prospectively assess the performance of diffusion-weighted magnetic resonance imaging (DW-MRI) for differentiation of central lung cancer from atelectasis. Materials and Methods Thirty-eight consecutive lung cancer patients (26 males, 12 females; age range: 28–71 years; mean age: 49 years) who were referred for thoracic MR imaging examinations were enrolled. MR examinations were performed using a 1.5-T clinical scanner and scanning sequences of T1WI, T2WI, and DWI. Cancers and atelectasis were measured by mapping of the apparent diffusion coefficients (ADCs) obtained with a b-value of 500 s/mm2. Results PET/CT and DW-MRI allowed differentiation of tumor and atelectasis in all 38 cases, but T2WI did not allow differentiation in 9 cases. Comparison of conventional T2WI and DW-MRI indicated a higher contrast-to-noise ratio for the central lung carcinoma relative to the atelectasis on DW-MRI. ADC maps indicated a significantly lower mean ADC in the central lung carcinoma than in the atelectasis (1.83±0.58 vs. 2.90±0.26 ×10^-3 mm2/s, p<0.0001). ADC values of small cell lung carcinoma were significantly greater than those of squamous cell carcinoma and adenocarcinoma (p<0.0001 for both). Conclusions DW-MR imaging provides valuable information not obtained by conventional MR and may be useful for differentiation of central lung carcinoma from atelectasis. Future developments may allow DW-MR imaging to be used as an alternative to PET/CT in imaging of patients with lung cancer. PMID:23593186

  13. The role of necrosis, acute hypoxia and chronic hypoxia in 18F-FMISO PET image contrast: a computational modelling study

    NASA Astrophysics Data System (ADS)

    Warren, Daniel R.; Partridge, Mike

    2016-12-01

    Positron emission tomography (PET) using 18F-fluoromisonidazole (FMISO) is a promising technique for imaging tumour hypoxia, and a potential target for radiotherapy dose-painting. However, the relationship between FMISO uptake and oxygen partial pressure (P_O2) is yet to be quantified fully. Tissue oxygenation varies over distances much smaller than clinical PET resolution (<100 μm versus ~4 mm), and cyclic variations in tumour perfusion have been observed on timescales shorter than typical FMISO PET studies (~20 min versus a few hours). Furthermore, tracer uptake may be decreased in voxels containing some degree of necrosis. This work develops a computational model of FMISO uptake in millimetre-scale tumour regions. Coupled partial differential equations govern the evolution of oxygen and FMISO distributions, and a dynamic vascular source map represents temporal variations in perfusion. Local FMISO binding capacity is modulated by the necrotic fraction. Outputs include spatiotemporal maps of P_O2 and tracer accumulation, enabling calculation of tissue-to-blood ratios (TBRs) and time-activity curves (TACs) as a function of mean tissue oxygenation. The model is characterised using experimental data, finding half-maximal FMISO binding at local P_O2 of 1.4 mmHg (95% CI: 0.3-2.6 mmHg) and half-maximal necrosis at 1.2 mmHg (0.1-4.9 mmHg). Simulations predict a non-linear, non-monotonic relationship between FMISO activity (4 h post-injection) and mean tissue P_O2: tracer uptake rises sharply from negligible levels in avascular tissue, peaking at ~5 mmHg and declining towards blood activity in well-oxygenated conditions. Greater temporal variation in perfusion increases peak TBRs (range 2.20-5.27) as a result of smaller predicted necrotic fraction, rather than fundamental differences in FMISO accumulation under acute hypoxia. Identical late FMISO uptake can occur in regions with differing P_O2 and necrotic fraction, but simulated TACs indicate that additional early-phase information may allow discrimination of hypoxic and necrotic signals. We conclude that a robust approach to FMISO interpretation (and dose-painting prescription) is likely to be based on dynamic PET analysis.
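
    Two of the model ingredients named above can be combined in a toy reconstruction. Only the two half-maximum values (1.4 and 1.2 mmHg) come from the abstract; the functional forms and the necrosis steepness are assumptions. Their product reproduces the non-monotonic shape described: negligible uptake in avascular tissue, a peak at a few mmHg, and decline at high oxygen:

```python
import math

P_BIND = 1.4   # mmHg, half-maximal FMISO binding (from the abstract)
P_NECR = 1.2   # mmHg, half-maximal necrosis (from the abstract)

def binding_rate(po2, k_max=1.0):
    """Binding falls as oxygen rises; half-maximal at P_BIND."""
    return k_max * P_BIND / (P_BIND + po2)

def necrotic_fraction(po2, steepness=2.0):
    """Sigmoid necrosis model; half-maximal at P_NECR (steepness assumed)."""
    return 1.0 / (1.0 + (po2 / P_NECR) ** steepness)

def effective_uptake(po2):
    """Only viable tissue binds tracer, so uptake vanishes in fully
    necrotic (avascular) tissue and stays low in well-oxygenated tissue."""
    return binding_rate(po2) * (1.0 - necrotic_fraction(po2))
```

    Evaluating `effective_uptake` over a P_O2 grid gives the rising-then-falling curve that makes late FMISO uptake ambiguous between hypoxic and necrotic tissue.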

  14. The role of necrosis, acute hypoxia and chronic hypoxia in 18F-FMISO PET image contrast: a computational modelling study.

    PubMed

    Warren, Daniel R; Partridge, Mike

    2016-12-21

    Positron emission tomography (PET) using 18F-fluoromisonidazole (FMISO) is a promising technique for imaging tumour hypoxia, and a potential target for radiotherapy dose-painting. However, the relationship between FMISO uptake and oxygen partial pressure (P_O2) is yet to be quantified fully. Tissue oxygenation varies over distances much smaller than clinical PET resolution (<100 μm versus ~4 mm), and cyclic variations in tumour perfusion have been observed on timescales shorter than typical FMISO PET studies (~20 min versus a few hours). Furthermore, tracer uptake may be decreased in voxels containing some degree of necrosis. This work develops a computational model of FMISO uptake in millimetre-scale tumour regions. Coupled partial differential equations govern the evolution of oxygen and FMISO distributions, and a dynamic vascular source map represents temporal variations in perfusion. Local FMISO binding capacity is modulated by the necrotic fraction. Outputs include spatiotemporal maps of P_O2 and tracer accumulation, enabling calculation of tissue-to-blood ratios (TBRs) and time-activity curves (TACs) as a function of mean tissue oxygenation. The model is characterised using experimental data, finding half-maximal FMISO binding at local P_O2 of 1.4 mmHg (95% CI: 0.3-2.6 mmHg) and half-maximal necrosis at 1.2 mmHg (0.1-4.9 mmHg). Simulations predict a non-linear, non-monotonic relationship between FMISO activity (4 h post-injection) and mean tissue P_O2: tracer uptake rises sharply from negligible levels in avascular tissue, peaking at ~5 mmHg and declining towards blood activity in well-oxygenated conditions. Greater temporal variation in perfusion increases peak TBRs (range 2.20-5.27) as a result of smaller predicted necrotic fraction, rather than fundamental differences in FMISO accumulation under acute hypoxia. Identical late FMISO uptake can occur in regions with differing P_O2 and necrotic fraction, but simulated TACs indicate that additional early-phase information may allow discrimination of hypoxic and necrotic signals. We conclude that a robust approach to FMISO interpretation (and dose-painting prescription) is likely to be based on dynamic PET analysis.

  15. Probabilistic somatotopy of the spinothalamic pathway at the ventroposterolateral nucleus of the thalamus in the human brain.

    PubMed

    Hong, J H; Kwon, H G; Jang, S H

    2011-08-01

    The spinothalamic pathway (STP) has been regarded as the most plausible neural tract responsible for the pathogenesis of central poststroke pain. The ventral posterolateral (VPL) nucleus has been a target for neurosurgical procedures for control of central poststroke pain. However, to our knowledge, no diffusion tensor imaging (DTI) studies have been conducted to investigate the somatotopic location of the STP at the VPL nucleus of the thalamus. In the current study, we attempted to investigate this location in the human brain by using a probabilistic tractography technique of DTI. DTI was performed at 1.5T using a Synergy-L SENSE head coil. STPs for both the hand and leg were obtained by selecting fibers passing through 2 regions of interest (the area of the spinothalamic tract in the posterolateral medulla and the postcentral gyrus) in 41 healthy volunteers. Somatotopic mapping was obtained from the highest probabilistic location at the ACPC level. The highest probabilistic locations for the hand and leg were an average of 16.86 and 16.37 mm lateral to the ACPC line and 7.53 and 8.71 mm posterior to the midpoint of the ACPC line, respectively. Somatotopic locations for the hand and leg differed in the anteroposterior direction (P < .05); however, no difference was observed in the mediolateral direction (P > .05). We found the somatotopic locations for the hand and leg of the STP at the VPL nucleus; these somatotopies were arranged in the anteroposterior direction.

  16. Applicability of a neuroprobabilistic integral risk index for the environmental management of polluted areas: a case study.

    PubMed

    Nadal, Martí; Kumar, Vikas; Schuhmacher, Marta; Domingo, José L

    2008-04-01

    Recently, we developed a GIS-Integrated Integral Risk Index (IRI) to assess human health risks in areas with environmental pollutants. Contaminants were first ranked by applying a self-organizing map (SOM) to their characteristics of persistence, bioaccumulation, and toxicity in order to obtain the Hazard Index (HI). In the present study, the original IRI was substantially improved by allowing the entry of probabilistic data. A neuroprobabilistic HI was developed by combining SOM and Monte Carlo analysis. In general terms, the deterministic and probabilistic HIs followed a similar pattern: polychlorinated biphenyls (PCBs) and light polycyclic aromatic hydrocarbons (PAHs) were the pollutants showing the highest and lowest values of HI, respectively. However, the bioaccumulation value of heavy metals notably increased after considering a probability density function to describe the bioaccumulation factor. To check its applicability, a case study was investigated. The probabilistic integral risk was calculated in the chemical/petrochemical industrial area of Tarragona (Catalonia, Spain), where an environmental program has been carried out since 2002. The risk change between 2002 and 2005 was evaluated on the basis of probabilistic data on the levels of various pollutants in soils. The results indicated that the risks of the chemicals under study did not follow a homogeneous trend. However, the current levels of pollution do not represent a relevant source of health risks for the local population. Moreover, the neuroprobabilistic HI seems to be an adequate tool to take into account in risk assessment processes.
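
    The Monte Carlo ingredient of the neuroprobabilistic HI can be sketched as follows. This is a heavily simplified stand-in: a plain geometric mean replaces the SOM-derived score, and the lognormal distribution for bioaccumulation is an assumed choice, not the paper's fitted pdf:

```python
import numpy as np

rng = np.random.default_rng(0)

def probabilistic_hi(persistence, bioacc_median, bioacc_logsd, toxicity,
                     n=10000):
    """Monte Carlo HI: bioaccumulation drawn from a lognormal pdf,
    persistence and toxicity kept deterministic; a geometric mean
    stands in for the SOM-derived combination."""
    bioacc = rng.lognormal(np.log(bioacc_median), bioacc_logsd, size=n)
    return (persistence * bioacc * toxicity) ** (1.0 / 3.0)

hi_samples = probabilistic_hi(persistence=5.0, bioacc_median=3.0,
                              bioacc_logsd=0.5, toxicity=4.0)
hi_median = float(np.median(hi_samples))
hi_p95 = float(np.percentile(hi_samples, 95))
```

    The output is a distribution of HI values rather than a single number, so upper percentiles (here the 95th) can feed conservative risk decisions the way the abstract describes.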

  17. Quantifying heterogeneity of lesion uptake in dynamic contrast enhanced MRI for breast cancer diagnosis

    NASA Astrophysics Data System (ADS)

    Karahaliou, A.; Vassiou, K.; Skiadopoulos, S.; Kanavou, T.; Yiakoumelos, A.; Costaridou, L.

    2009-07-01

    The current study investigates whether texture features extracted from lesion kinetics feature maps can be used for breast cancer diagnosis. Fifty-five women with 57 breast lesions (27 benign, 30 malignant) were subjected to dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) on a 1.5 T system. A linear-slope model was fitted pixel-wise to a representative lesion slice time series, and the fitted parameters were used to create three kinetic maps (wash-out, time to peak enhancement, and peak enhancement). Twenty-eight grey-level co-occurrence matrix features were extracted from each lesion kinetic map. The ability of the texture features from each map to discriminate malignant from benign lesions was investigated using a Probabilistic Neural Network classifier. Additional classification was performed by combining the classification outputs of the most discriminating feature subsets from the three maps via majority voting. The combined scheme outperformed classification based on individual maps, achieving an area under the receiver operating characteristic curve of 0.960±0.029. The results suggest that the heterogeneity of breast lesion kinetics, as quantified by texture analysis, may contribute to computer-assisted tissue characterization in DCE-MRI.
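
    The two steps described, a co-occurrence texture feature per kinetic map and a majority vote over the per-map classifier outputs, can be sketched in miniature. The single contrast feature, the quantization, and the toy maps below are illustrative, not the study's 28-feature setup:

```python
import numpy as np

def glcm_contrast(img, levels=8):
    """Contrast from a horizontal-offset grey-level co-occurrence matrix."""
    q = np.clip((img * levels).astype(int), 0, levels - 1)
    glcm = np.zeros((levels, levels))
    for i, j in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        glcm[i, j] += 1
    glcm /= glcm.sum()
    ii, jj = np.indices(glcm.shape)
    return float(((ii - jj) ** 2 * glcm).sum())

def majority_vote(labels):
    """Combine per-map predictions (1 = malignant, 0 = benign)."""
    return int(sum(labels) > len(labels) / 2)

flat = np.full((4, 4), 0.5)                     # homogeneous map
checker = np.indices((4, 4)).sum(0) % 2 * 1.0   # maximally heterogeneous map
c_flat, c_check = glcm_contrast(flat), glcm_contrast(checker)
decision = majority_vote([1, 1, 0])   # e.g. wash-out, time-to-peak, peak maps
```

    A heterogeneous kinetic map yields high contrast while a homogeneous one yields zero, which is the sense in which texture quantifies lesion-kinetics heterogeneity; the vote then fuses the three per-map classifiers as in the combined scheme.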

  18. The comparison between a ground based and a space based probabilistic landslide susceptibility assessment

    NASA Astrophysics Data System (ADS)

    Reichenbach, P.; Mondini, A.; Guzzetti, F.; Rossi, M.; Ardizzone, F.; Cardinali, M.

    2009-04-01

    Probabilistic landslide susceptibility assessments attempt to predict the location and threat posed by known landslides. Under the assumption that landslides will occur in the future under the same conditions that produced them in the past, geomorphologists use susceptibility assessments to predict the location of future landslides. We present an attempt to exploit satellite data to prepare a landslide susceptibility zonation for the Collazzone area, which extends for 79 sq km in the Umbria region, Central Italy. For the study area we prepared a map of the Normalized Difference Vegetation Index (NDVI), obtained by processing raw NIR and RED channels (b3 and b2 bands) at 15 m × 15 m resolution from an image acquired by the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), on board the TERRA satellite, and a map of Land Surface Temperature (LST), obtained by processing raw TIR channels (b11 to b15 bands) at 90 m × 90 m resolution from the same image. Both maps, in general proxies for soil moisture, were obtained through standard algorithms. As expected, there is a strong correspondence between NDVI and LST but, where NDVI does not vary, elevation and other effects dominate LST. For the Collazzone area we prepared two different susceptibility models. The first was prepared through multivariate analysis of thematic data (including morphometry, lithology, structure, and land use) obtained through traditional methods, primarily the interpretation of aerial photographs and field work. The second susceptibility model was prepared using terrain morphology and information obtained by processing the satellite data. The two models were compared in terms of model fit and model performance, and were validated using landslide inventories not used to build the models. The two susceptibility models are very similar from a geographic and a classification point of view. This is good news, as it tells us that, for landslide susceptibility, thematic maps obtained by processing satellite data can be an effective alternative to maps prepared using more traditional, ground-based methods.
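
    The NDVI map mentioned above follows the standard definition, (NIR - RED) / (NIR + RED); this small sketch assumes co-registered reflectance arrays for the ASTER red and NIR bands:

```python
import numpy as np

def ndvi(nir, red, eps=1e-12):
    """Normalized Difference Vegetation Index; eps guards against
    division by zero over dark pixels."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

# Toy reflectances: dense vegetation vs sparse cover.
veg = ndvi(nir=[0.5, 0.4], red=[0.1, 0.3])
```

    Values near +1 indicate dense vegetation and values near 0 or below indicate bare or wet ground, which is why NDVI can serve as a rough soil-moisture proxy in the susceptibility model.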

  19. Radiation Source Mapping with Bayesian Inverse Methods

    DOE PAGES

    Hykes, Joshua M.; Azmy, Yousry Y.

    2017-03-22

    In this work, we present a method to map the spectral and spatial distributions of radioactive sources using a limited number of detectors. Locating and identifying radioactive materials is important for border monitoring, for accounting of special nuclear material in processing facilities, and in cleanup operations following a radioactive material spill. Most methods to analyze these types of problems make restrictive assumptions about the distribution of the source. In contrast, the source mapping method presented here allows an arbitrary three-dimensional distribution in space and a gamma peak distribution in energy. To apply the method, the problem is cast as an inverse problem where the system's geometry and material composition are known and fixed, while the radiation source distribution is sought. A probabilistic Bayesian approach is used to solve the resulting inverse problem, since the system of equations is ill-posed. The posterior is maximized with a Newton optimization method. The probabilistic approach also provides estimates of the confidence in the final source map prediction. A set of adjoint, discrete ordinates flux solutions, obtained in this work with the Denovo code, is required to efficiently compute detector responses from a candidate source distribution. These adjoint fluxes form the linear mapping from the state space to the response space. The method was tested by simultaneously locating a set of 137Cs and 60Co gamma sources in a room, using experimental measurements that we collected for this purpose. Because of the weak sources available for use in the experiment, some of the expected photopeaks were not distinguishable from the Compton continuum. However, by replacing 14 flawed measurements (out of a total of 69) with synthetic responses computed by MCNP, the proof-of-principle source mapping was successful. The locations of the sources were predicted within 25 cm for two of the sources and 90 cm for the third, in a room with an approximately 4 m × 4 m floor plan. Finally, the predicted source intensities were within a factor of ten of their true values.
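
    The inverse step can be reduced to a toy linear-Gaussian version. This is a hedged simplification of the approach described: the adjoint fluxes give a linear map A from source voxels to detector responses, and with Gaussian noise and prior the MAP source is a regularized least-squares solution (the paper's Poisson likelihood and Newton solver are replaced by a one-step solve; A, alpha, and all values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

# Sensitivity matrix standing in for the adjoint-flux mapping
# (6 detector responses, 4 source voxels; values illustrative).
A = np.array([[1., 0., 0., 0.],
              [0., 1., 0., 0.],
              [0., 0., 1., 0.],
              [0., 0., 0., 1.],
              [1., 1., 0., 0.],
              [0., 0., 1., 1.]])
s_true = np.array([0.0, 2.0, 0.0, 1.0])     # two active source voxels
d = A @ s_true + rng.normal(0.0, 0.01, 6)   # noisy detector responses

# MAP estimate under a Gaussian prior (ridge regularisation);
# alpha is an assumed prior-strength tuning value.
alpha = 1e-3
s_map = np.linalg.solve(A.T @ A + alpha * np.eye(4), A.T @ d)

# Posterior covariance: the origin of the confidence estimates the
# abstract mentions (diagonal entries ~ per-voxel variance).
cov = 0.01**2 * np.linalg.inv(A.T @ A + alpha * np.eye(4))
```

    The regularization term is what tames the ill-posedness noted above; in the full method that role is played by the Bayesian prior, and the Newton iterations handle the non-Gaussian (counting) likelihood.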

  20. Performance of USGS one-year earthquake hazard map for natural and induced seismicity in the central and eastern United States

    NASA Astrophysics Data System (ADS)

    Brooks, E. M.; Stein, S.; Spencer, B. D.; Salditch, L.; Petersen, M. D.; McNamara, D. E.

    2017-12-01

    Seismicity in the central United States has dramatically increased since 2008 due to the injection of wastewater produced by oil and gas extraction. In response, the USGS created a one-year probabilistic hazard model and map for 2016 to describe the increased hazard posed to the central and eastern United States. Using the intensity of shaking reported to the "Did You Feel It?" system during 2016, we assess the performance of this model. Assessing the performance of earthquake hazard maps for natural and induced seismicity is conceptually similar but has practical differences. Maps that have return periods of hundreds or thousands of years, as commonly used for natural seismicity, can be assessed using historical intensity data that also span hundreds or thousands of years. Several different features stand out when assessing the USGS 2016 seismic hazard model for the central and eastern United States from induced and natural earthquakes. First, the model can be assessed as a forecast in one year, because event rates are sufficiently high to permit evaluation with one year of data. Second, because these models are projections from the previous year's seismicity, implicitly assuming that fluid injection rates remain the same, misfit may reflect changes in human activity. Our results suggest that the model was very successful by the metric implicit in probabilistic seismic hazard assessment: namely, that the fraction of sites at which the maximum shaking exceeded the mapped value is comparable to that expected. The model also did well by a misfit metric that compares the spatial patterns of predicted and maximum observed shaking. This was true for both the central and eastern United States as a whole, and for the region within it with the highest seismicity, Oklahoma and its surrounding area. 
The model performed least well in northern Texas, over-stating hazard, presumably because lower oil and gas prices and regulatory action reduced the water injection volume relative to the previous year. These results imply that such hazard maps have the potential to be valuable tools for policy makers and regulators in managing the seismic risks associated with unconventional oil and gas production.

  1. A feasibility study comparing UK older adult mental health inpatient wards which use protected engagement time with other wards which do not: study protocol.

    PubMed

    Nolan, Fiona M; Fox, Chris; Cheston, Richard; Turner, David; Clark, Allan; Dodd, Emily; Khoo, Mary-Ellen; Gray, Richard

    2016-01-01

    Protected engagement time (PET) is a concept of managing staff time on mental health inpatient wards with the aim of increasing staff and patient interaction. Despite apparent widespread use of PET, there remains a dearth of evidence as to how it is implemented and whether it carries benefits for staff or patients. This protocol describes a study which is being carried out on mental health wards caring for older adults (aged over 65) in England. The study shares a large proportion of the procedures, measures and study team membership of a recently completed investigation of the impact of PET in adult acute mental health wards. The study aims to identify prevalence and components of PET to construct a model for the intervention, in addition to testing the feasibility of the measures and procedures in preparation for a randomised trial. The study comprises four modules and uses a mixed methods approach. Module 1 involves mapping all inpatient wards in England which provide care for older adults, including those with dementia, ascertaining how many of these provide PET and in what way. Module 2 uses a prospective cohort method to compare five older adult mental health wards that use PET with five that do not across three National Health Service (NHS) Foundation Trust sites. The comparison comprises questionnaires, observation tools and routinely collected clinical service data and combines validated measures with questions developed specifically for the study. Module 3 entails an in-depth case study evaluation of three of the participating PET wards (one from each NHS Trust site) using semi-structured interviews with patients, carers and staff. Module 4 describes the development of a model and fidelity scale for PET using the information derived from the other modules with a working group of patients, carers and staff. 
This is a feasibility study to test the application of the measures and methods in inpatient wards for older adults and develop a draft model for the intervention. The next stage will prospectively involve testing of the model and fidelity scale in randomised conditions to provide evidence for the effectiveness of PET as an intervention. ISRCTN31919196.

  2. FDG-PET and CSF biomarker accuracy in prediction of conversion to different dementias in a large multicentre MCI cohort.

    PubMed

    Caminiti, Silvia Paola; Ballarini, Tommaso; Sala, Arianna; Cerami, Chiara; Presotto, Luca; Santangelo, Roberto; Fallanca, Federico; Vanoli, Emilia Giovanna; Gianolli, Luigi; Iannaccone, Sandro; Magnani, Giuseppe; Perani, Daniela

    2018-01-01

    In this multicentre study in clinical settings, we assessed the accuracy of optimized procedures for FDG-PET brain metabolism and CSF classifications in predicting or excluding the conversion to Alzheimer's disease (AD) dementia and non-AD dementias. We included 80 MCI subjects with neurological and neuropsychological assessments, FDG-PET scan and CSF measures at entry, all with clinical follow-up. FDG-PET data were analysed with a validated voxel-based SPM method. Resulting single-subject SPM maps were classified by five imaging experts according to the disease-specific patterns, as "typical-AD", "atypical-AD" (i.e. posterior cortical atrophy, asymmetric logopenic AD variant, frontal-AD variant), "non-AD" (i.e. behavioural variant FTD, corticobasal degeneration, semantic variant FTD, dementia with Lewy bodies) or "negative" patterns. To perform the statistical analyses, the individual patterns were grouped either as "AD dementia vs. non-AD dementia (all diseases)" or as "FTD vs. non-FTD (all diseases)". Aβ42, total and phosphorylated Tau CSF levels were classified dichotomously, and using the Erlangen Score algorithm. Multivariate logistic models tested the prognostic accuracy of FDG-PET-SPM and CSF dichotomous classifications. The accuracy of the Erlangen Score, alone and aided by FDG-PET SPM classification, was evaluated. The multivariate logistic model identified FDG-PET "AD" SPM classification (Expβ = 19.35, 95% C.I. 4.8-77.8, p < 0.001) and CSF Aβ42 (Expβ = 6.5, 95% C.I. 1.64-25.43, p < 0.05) as the best predictors of conversion from MCI to AD dementia. The "FTD" SPM pattern significantly predicted conversion to FTD dementias at follow-up (Expβ = 14, 95% C.I. 3.1-63, p < 0.001). Overall, FDG-PET-SPM classification was the most accurate biomarker, able to correctly differentiate the MCI subjects who converted to either AD or FTD dementias from those who remained stable or reverted to normal cognition (Expβ = 17.9, 95% C.I. 4.55-70.46, p < 0.001). 
Our results support the relevant role of FDG-PET-SPM classification in predicting progression to different dementia conditions in prodromal MCI phase, and in the exclusion of progression, outperforming CSF biomarkers.

  3. Extensible Probabilistic Repository Technology (XPRT)

    DTIC Science & Technology

    2004-10-01

    projects, such as Centaurus, Evidence Data Base (EDB), etc.; others were fabricated, such as INS and FED, while others contain data from the open...Google Web Report Unlimited SOAP API News BBC News Unlimited WEB RSS 1.0 Centaurus Person Demographics 204,402 people from 240 countries...objects of the domain ontology map to the various simulated data-sources. For example, the PersonDemographics are stored in the Centaurus database, while

  4. Assisted Perception, Planning and Control for Remote Mobility and Dexterous Manipulation

    DTIC Science & Technology

    2017-04-01

    on unmanned aerial vehicles (UAVs). The underlying algorithm is based on an Extended Kalman Filter (EKF) that simultaneously estimates robot state...and sensor biases. The filter developed provided a probabilistic fusion of sensor data from many modalities to produce a single consistent position...estimation for a walking humanoid. Given a prior map using a Gaussian particle filter, the LIDAR based system is able to provide a drift-free

  5. Accounting for pH heterogeneity and variability in modelling human health risks from cadmium in contaminated land.

    PubMed

    Gay, J Rebecca; Korre, Anna

    2009-07-01

    The authors have previously published a methodology which combines quantitative probabilistic human health risk assessment and spatial statistical methods (geostatistics) to produce an assessment, incorporating uncertainty, of risks to human health from exposure to contaminated land. The model assumes a constant soil to plant concentration factor (CF(veg)) when calculating intake of contaminants. This model is modified here to enhance its use in a situation where CF(veg) varies according to soil pH, as is the case for cadmium. The original methodology uses sequential indicator simulation (SIS) to map soil concentration estimates for one contaminant across a site. A real, age-stratified population is mapped across the contaminated area, and intake of soil contaminants by individuals is calculated probabilistically using an adaptation of the Contaminated Land Exposure Assessment (CLEA) model. The proposed improvement involves not only the geostatistical estimation of the contaminant concentration, but also that of soil pH, which in turn leads to a variable CF(veg) estimate which influences the human intake results. The results presented demonstrate that taking pH into account can influence the outcome of the risk assessment greatly. It is proposed that a similar adaptation could be used for other combinations of soil variables which influence CF(veg).
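The coupling described above, where a geostatistically simulated pH field drives a variable soil-to-plant concentration factor and hence the intake distribution, can be illustrated with a short Monte Carlo sketch. Everything here is a placeholder: the lognormal/normal fields stand in for SIS realizations, and the `cf_veg` relationship and exposure parameters are invented for illustration, not the CLEA model or the published Cd uptake curve.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative per-location realizations, standing in for geostatistical
# (SIS) simulations of soil Cd concentration (mg/kg) and soil pH.
n_sim = 10_000
soil_cd = rng.lognormal(mean=0.5, sigma=0.4, size=n_sim)   # mg/kg
soil_ph = rng.normal(6.0, 0.5, size=n_sim)

def cf_veg(ph):
    """Hypothetical soil-to-plant concentration factor for Cd.

    Plant uptake of Cd increases as pH falls; this log-linear form and
    its coefficients are placeholders, not the published relationship.
    """
    return 10 ** (0.7 - 0.25 * ph)

# Simple probabilistic intake: soil ingestion + vegetable consumption,
# with assumed (illustrative) exposure parameters.
soil_ingest = 50e-6      # kg soil/day
veg_intake = 0.3         # kg vegetables/day
body_weight = 70.0       # kg

daily_dose = (soil_cd * soil_ingest
              + soil_cd * cf_veg(soil_ph) * veg_intake) / body_weight

# With a variable CF_veg, the low-pH tail drives the upper dose
# percentiles -- which is why assuming a constant CF_veg can
# misrepresent the risk, as the abstract argues.
p95 = np.percentile(daily_dose, 95)
```

Replacing `cf_veg(soil_ph)` with a constant evaluated at the mean pH collapses the upper tail of `daily_dose`, reproducing the effect the study quantifies.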

  6. The Manhattan Frame Model-Manhattan World Inference in the Space of Surface Normals.

    PubMed

    Straub, Julian; Freifeld, Oren; Rosman, Guy; Leonard, John J; Fisher, John W

    2018-01-01

    Objects and structures within man-made environments typically exhibit a high degree of organization in the form of orthogonal and parallel planes. Traditional approaches utilize these regularities via the restrictive, and rather local, Manhattan World (MW) assumption which posits that every plane is perpendicular to one of the axes of a single coordinate system. The aforementioned regularities are especially evident in the surface normal distribution of a scene where they manifest as orthogonally-coupled clusters. This motivates the introduction of the Manhattan-Frame (MF) model which captures the notion of an MW in the surface normals space, the unit sphere, and two probabilistic MF models over this space. First, for a single MF we propose novel real-time MAP inference algorithms, evaluate their performance and their use in drift-free rotation estimation. Second, to capture the complexity of real-world scenes at a global scale, we extend the MF model to a probabilistic mixture of Manhattan Frames (MMF). For MMF inference we propose a simple MAP inference algorithm and an adaptive Markov-Chain Monte-Carlo sampling algorithm with Metropolis-Hastings split/merge moves that let us infer the unknown number of mixture components. We demonstrate the versatility of the MMF model and inference algorithm across several scales of man-made environments.

  7. Meditation effects within the hippocampal complex revealed by voxel-based morphometry and cytoarchitectonic probabilistic mapping

    PubMed Central

    Luders, Eileen; Kurth, Florian; Toga, Arthur W.; Narr, Katherine L.; Gaser, Christian

    2013-01-01

    Scientific studies addressing anatomical variations in meditators' brains have emerged rapidly over the last few years, where significant links are most frequently reported with respect to gray matter (GM). To advance prior work, this study examined GM characteristics in a large sample of 100 subjects (50 meditators, 50 controls), where meditators have been practicing close to 20 years, on average. A standard, whole-brain voxel-based morphometry approach was applied and revealed significant meditation effects in the vicinity of the hippocampus, showing more GM in meditators than in controls as well as positive correlations with the number of years practiced. However, the hippocampal complex is regionally segregated by architecture, connectivity, and functional relevance. Thus, to establish differential effects within the hippocampal formation (cornu ammonis, fascia dentata, entorhinal cortex, subiculum) as well as the hippocampal-amygdaloid transition area, we utilized refined cytoarchitectonic probabilistic maps of (peri-) hippocampal subsections. Significant meditation effects were observed within the subiculum specifically. Since the subiculum is known to play a key role in stress regulation and meditation is an established form of stress reduction, these GM findings may reflect neuronal preservation in long-term meditators—perhaps due to an attenuated release of stress hormones and decreased neurotoxicity. PMID:23847572

  8. How well can we test probabilistic seismic hazard maps?

    NASA Astrophysics Data System (ADS)

    Vanneste, Kris; Stein, Seth; Camelbeeck, Thierry; Vleminckx, Bart

    2017-04-01

    Recent large earthquakes that gave rise to shaking much stronger than shown in probabilistic seismic hazard (PSH) maps have stimulated discussion about how well these maps forecast future shaking. These discussions have brought home the fact that although the maps are designed to achieve certain goals, we know little about how well they actually perform. As for any other forecast, this question involves verification and validation. Verification involves assessing how well the algorithm used to produce hazard maps implements the conceptual PSH model ("have we built the model right?"). Validation asks how well the model forecasts the shaking that actually occurs ("have we built the right model?"). We explore the verification issue by simulating shaking histories for an area with assumed uniform distribution of earthquakes, Gutenberg-Richter magnitude-frequency relation, Poisson temporal occurrence model, and ground-motion prediction equation (GMPE). We compare the maximum simulated shaking at many sites over time with that predicted by a hazard map generated for the same set of parameters. The Poisson model predicts that the fraction of sites at which shaking will exceed that of the hazard map is p = 1 - exp(-t/T), where t is the duration of observations and T is the map's return period. Exceedance is typically associated with infrequent large earthquakes, as observed in real cases. The ensemble of simulated earthquake histories yields distributions of fractional exceedance with mean equal to the predicted value. Hence, the PSH algorithm appears to be internally consistent and can be regarded as verified for this set of simulations. However, simulated fractional exceedances show a large scatter about the mean value that decreases with increasing t/T, increasing observation time and increasing Gutenberg-Richter a-value (combining intrinsic activity rate and surface area), but is independent of GMPE uncertainty. 
This scatter is due to the variability of earthquake recurrence, and so decreases as the largest earthquakes occur in more simulations. Our results are important for evaluating the performance of a hazard map based on misfits in fractional exceedance, and for assessing whether such misfit arises by chance or reflects a bias in the map. More specifically, we determined for a broad range of Gutenberg-Richter a-values theoretical confidence intervals on allowed misfits in fractional exceedance and on the percentage of hazard-map bias that can thus be detected by comparison with observed shaking histories. Given that in the real world we only have one shaking history for an area, these results indicate that even if a hazard map does not fit the observations, it is very difficult to assess its veracity, especially for low-to-moderate-seismicity regions. Because our model is a simplified version of reality, any additional uncertainty or complexity will tend to widen these confidence intervals.
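The internal-consistency check described above can be reproduced in miniature. In this deliberately simplified sketch each site sees exceedances of its mapped shaking level as an independent Poisson process with rate 1/T, so the fraction of sites exceeded after t years should scatter around p = 1 - exp(-t/T); the parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

T = 475.0        # map return period (years), illustrative
t = 50.0         # observation window (years)
n_sites = 1000
n_histories = 500

# Number of exceedances at each site in t years, for an ensemble of
# simulated shaking histories; a site "exceeds the map" if >= 1 event.
exceed_counts = rng.poisson(t / T, size=(n_histories, n_sites))
frac_exceeded = (exceed_counts > 0).mean(axis=1)

# Poisson-model prediction for the fractional exceedance.
p_expected = 1.0 - np.exp(-t / T)

# The ensemble mean matches the prediction (verification), but single
# histories scatter around it -- the difficulty the abstract highlights,
# since the real world provides only one shaking history per area.
mean_frac = frac_exceeded.mean()
```

Note this sketch treats sites as independent; in reality one large earthquake produces correlated exceedances at many sites, which widens the scatter well beyond the binomial case, consistent with the abstract's finding that scatter is controlled by the recurrence of the largest earthquakes.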

  9. In-situ determination of residual specific activity in activated concrete walls of a PET-cyclotron room

    NASA Astrophysics Data System (ADS)

    Matsumura, H.; Toyoda, A.; Masumoto, K.; Yoshida, G.; Yagishita, T.; Nakabayashi, T.; Sasaki, H.; Matsumura, K.; Yamaya, Y.; Miyazaki, Y.

    2018-06-01

    In the decommissioning work for concrete walls of PET-cyclotron rooms, an in-situ measurement is expected to be useful for obtaining a contour map of the specific activity on the walls without destroying the structure. In this study, specific activities of γ-ray-emitting radionuclides in concrete walls were determined by using an in-situ measurement method employing a portable Ge semiconductor detector, and compared with the specific activity obtained using the sampling measurement method, at the Medical and Pharmacological Research Center Foundation in Hakui, Ishikawa, Japan. Accordingly, the specific activity could be determined by the in-situ determination method. Since there is a clear correlation between the total specific activity of γ-ray-emitting radionuclides and contact dose rate, the specific activity can be determined approximately by contact dose-rate measurement using a NaI scintillation survey meter. The specific activity of each γ-ray-emitting radionuclide can also be estimated from the contact dose rate using a NaI scintillation survey meter. The in-situ measurement method is a powerful tool for the decommissioning of the PET cyclotron room.

  10. Cerebral glucose metabolic prediction from amnestic mild cognitive impairment to Alzheimer's dementia: a meta-analysis.

    PubMed

    Ma, Hai Rong; Sheng, Li Qin; Pan, Ping Lei; Wang, Gen Di; Luo, Rong; Shi, Hai Cun; Dai, Zhen Yu; Zhong, Jian Guo

    2018-01-01

    Brain 18F-fluorodeoxyglucose positron emission tomography (FDG-PET) has been utilized to monitor disease conversion from amnestic mild cognitive impairment (aMCI) to Alzheimer's dementia (AD). However, the conversion patterns of FDG-PET metabolism across studies are not conclusive. We conducted a voxel-wise meta-analysis using Seed-based d Mapping that included 10 baseline voxel-wise FDG-PET comparisons between 93 aMCI converters and 129 aMCI non-converters from nine longitudinal studies. The most robust and reliable metabolic alterations that predicted conversion from aMCI to AD were localized in the left posterior cingulate cortex (PCC)/precuneus. Furthermore, meta-regression analyses indicated that baseline mean age and severity of cognitive impairment, and follow-up duration were significant moderators for metabolic alterations in aMCI converters. Our study revealed hypometabolism in the left PCC/precuneus as an early feature in the development of AD. This finding has important implications in understanding the neural substrates for AD conversion and could serve as a potential imaging biomarker for early detection of AD as well as for tracking disease progression at the predementia stage.

  11. Patient-dependent count-rate adaptive normalization for PET detector efficiency with delayed-window coincidence events

    NASA Astrophysics Data System (ADS)

    Niu, Xiaofeng; Ye, Hongwei; Xia, Ting; Asma, Evren; Winkler, Mark; Gagnon, Daniel; Wang, Wenli

    2015-07-01

    Quantitative PET imaging is widely used in clinical diagnosis in oncology and neuroimaging. Accurate normalization correction for the efficiency of each line-of-response is essential for accurate quantitative PET image reconstruction. In this paper, we propose a normalization calibration method by using the delayed-window coincidence events from the scanning phantom or patient. The proposed method could dramatically reduce the ‘ring’ artifacts caused by mismatched system count-rates between the calibration and phantom/patient datasets. Moreover, a modified algorithm for mean detector efficiency estimation is proposed, which could generate crystal efficiency maps with more uniform variance. Both phantom and real patient datasets are used for evaluation. The results show that the proposed method could lead to better uniformity in reconstructed images by removing ring artifacts, and more uniform axial variance profiles, especially around the axial edge slices of the scanner. The proposed method also has the potential benefit to simplify the normalization calibration procedure, since the calibration can be performed using the on-the-fly acquired delayed-window dataset.
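Why delayed-window events are usable for normalization can be shown with a toy model, which is not the paper's algorithm: random (delayed-window) coincidences on a line of response factorize approximately as the product of the two crystal efficiencies, independent of the object, so the per-crystal efficiencies can be recovered by a damped fan-sum-style iteration. The ring size, rates, and known baseline rate `lam` are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy ring of crystals with unknown relative efficiencies.
n = 64
eps_true = 1.0 + 0.2 * rng.standard_normal(n)

# Delayed-window (randoms) counts factorize approximately as
# C_ij ~ Poisson(lam * eps_i * eps_j) for i != j, independent of the
# scanned object -- the property that makes them usable for calibration.
lam = 200.0
mean = lam * np.outer(eps_true, eps_true)
np.fill_diagonal(mean, 0.0)
C = rng.poisson(mean)

# Damped fan-sum iteration: each crystal's efficiency is estimated from
# its total delayed counts divided by the summed efficiencies of its
# partners; averaging with the previous iterate avoids scale oscillation.
eps = np.ones(n)
for _ in range(50):
    update = C.sum(axis=1) / (lam * (eps.sum() - eps))
    eps = 0.5 * (eps + update)

# Only relative efficiencies matter, so fix the overall scale.
eps *= eps_true.mean() / eps.mean()
```

Because the model is object-independent, the same estimate can in principle be formed from the on-the-fly delayed-window data of a patient scan, which is the simplification of the calibration workflow the abstract points to.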

  12. Noise correlation in PET, CT, SPECT and PET/CT data evaluated using autocorrelation function: a phantom study on data, reconstructed using FBP and OSEM.

    PubMed

    Razifar, Pasha; Sandström, Mattias; Schnieder, Harald; Långström, Bengt; Maripuu, Enn; Bengtsson, Ewert; Bergström, Mats

    2005-08-25

    Positron Emission Tomography (PET), Computed Tomography (CT), PET/CT and Single Photon Emission Tomography (SPECT) are non-invasive imaging tools used for creating two dimensional (2D) cross section images of three dimensional (3D) objects. PET and SPECT have the potential of providing functional or biochemical information by measuring distribution and kinetics of radiolabelled molecules, whereas CT visualizes X-ray density in tissues in the body. PET/CT provides fused images representing both functional and anatomical information with better precision in localization than PET alone. Images generated by these types of techniques are generally noisy, thereby impairing the imaging potential and affecting the precision in quantitative values derived from the images. It is crucial to explore and understand the properties of noise in these imaging techniques. Here we used the autocorrelation function (ACF) specifically to describe noise correlation and its non-isotropic behaviour in experimentally generated images of PET, CT, PET/CT and SPECT. Experiments were performed using phantoms with different shapes. In PET and PET/CT studies, data were acquired in 2D acquisition mode and reconstructed by both analytical filtered back projection (FBP) and iterative, ordered subsets expectation maximisation (OSEM) methods. In the PET/CT studies, different magnitudes of X-ray dose in the transmission were employed by using different mA settings for the X-ray tube. In the CT studies, data were acquired using different slice thickness with and without applied dose reduction function and the images were reconstructed by FBP. SPECT studies were performed in 2D, reconstructed using FBP and OSEM, with post-3D filtering. ACF images were generated from the primary images, and profiles across the ACF images were used to describe the noise correlation in different directions. The variance of noise across the images was visualised as images and with profiles across these images. 
The most important finding was that the pattern of noise correlation is rotation symmetric or isotropic, independent of object shape, in PET and PET/CT images reconstructed using the iterative method. This is, however, not the case in FBP images when the shape of the phantom is not circular. CT images reconstructed using FBP also show the same non-isotropic pattern, independent of slice thickness and use of the dose-reduction function. SPECT images show an isotropic correlation of the noise independent of object shape or applied reconstruction algorithm. Noise in PET/CT images was identical regardless of the applied X-ray dose in the transmission part (CT), indicating that the noise from transmission with the applied doses does not propagate into the PET images, showing that the noise from the emission part is dominant. The results indicate that in human studies it is possible to utilize a low dose in the transmission part while maintaining the noise behaviour and the quality of the images. The combined effect of noise correlation for asymmetric objects and a varying noise variance across the image field significantly complicates the interpretation of the images when statistical methods are used, such as with statistical estimates of precision in average values, use of statistical parametric mapping methods and principal component analysis. Hence it is recommended that iterative reconstruction methods be used for such applications. However, it is possible to calculate the noise analytically in images reconstructed by FBP, while it is not possible to do the same calculation in images reconstructed by iterative methods. Therefore, for statistical analyses that depend on knowing the noise, FBP would be preferred.
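The ACF analysis described above can be sketched with synthetic noise images. This illustration uses the Wiener-Khinchin relation (ACF as the inverse FFT of the power spectrum) on invented white and directionally smoothed noise fields, to show how profiles through the ACF image distinguish isotropic from non-isotropic correlation; it is not the study's processing pipeline.

```python
import numpy as np

rng = np.random.default_rng(4)

def acf_image(noise):
    """2D autocorrelation of a zero-mean noise image via Wiener-Khinchin:
    ACF = IFFT(|FFT(noise)|^2), shifted so zero lag sits at the centre
    and normalized so the zero-lag value is 1."""
    noise = noise - noise.mean()
    power = np.abs(np.fft.fft2(noise)) ** 2
    acf = np.real(np.fft.ifft2(power))
    acf /= acf.flat[0]                 # zero-lag normalization
    return np.fft.fftshift(acf)

# White noise: ACF is a single central spike (uncorrelated, isotropic).
white = rng.standard_normal((128, 128))
acf_w = acf_image(white)

# Smoothing along one axis introduces directional (non-isotropic)
# correlation -- the kind of signature FBP leaves for non-circular objects.
kernel = np.ones(9) / 9.0
smeared = np.apply_along_axis(lambda r: np.convolve(r, kernel, "same"), 1, white)
acf_s = acf_image(smeared)

c = 64  # centre index (zero lag) for a 128x128 image after fftshift
# acf_s[c, c+1] (lag along the smoothed axis) is large, while
# acf_s[c+1, c] (lag across rows) stays near zero: anisotropy.
```

Profiles taken through the ACF image in different directions, as in the study, then quantify whether the correlation is rotationally symmetric.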

  13. Overview of positron emission tomography chemistry: clinical and technical considerations and combination with computed tomography.

    PubMed

    Koukourakis, G; Maravelis, G; Koukouraki, S; Padelakos, P; Kouloulias, V

    2009-01-01

    The concept of emission and transmission tomography was introduced by David Kuhl and Roy Edwards in the late 1950s. Their work later led to the design and construction of several tomographic instruments at the University of Pennsylvania. Tomographic imaging techniques were further developed by Michel Ter-Pogossian, Michael E. Phelps and others at the Washington University School of Medicine. Positron emission tomography (PET) is a nuclear medicine imaging technique which produces a 3-dimensional image or map of functional processes in the body. The system detects pairs of gamma rays emitted indirectly by a positron-emitting radionuclide (tracer), which is introduced into the body on a biologically active molecule. Images of tracer concentration in 3-dimensional space within the body are then reconstructed by computer analysis. In modern scanners, this reconstruction is often accomplished with the aid of a CT X-ray scan performed on the patient during the same session, in the same machine. If the biologically active molecule chosen for PET is 18F-fluorodeoxyglucose (FDG), an analogue of glucose, the concentrations of tracer imaged give tissue metabolic activity in terms of regional glucose uptake. Although use of this tracer results in the most common type of PET scan, other tracer molecules are used in PET to image the tissue concentration of many other types of molecules of interest. The main role of this article was to analyse the available types of radiopharmaceuticals used in PET-CT along with the principles of its clinical and technical considerations.

  14. Depth profiling of mechanical degradation of PV backsheets after UV exposure

    NASA Astrophysics Data System (ADS)

    Gu, Xiaohong; Krommenhoek, Peter J.; Lin, Chiao-Chi; Yu, Li-Chieh; Nguyen, Tinh; Watson, Stephanie S.

    2015-09-01

    Polymeric multilayer backsheets protect photovoltaic modules from moisture and ultraviolet (UV) damage while providing electrical insulation. Due to the multilayer structures, the properties of the inner layers of the backsheets, including their interfaces, during weathering are not well known. In this study, a commercial PPE (polyethylene terephthalate (PET)/PET/ethylene vinyl acetate (EVA)) backsheet film was selected as a model system for a depth profiling study of mechanical properties during UV exposure. The NIST SPHERE (Simulated Photodegradation via High Energy Radiant Exposure) was used for the accelerated laboratory exposure of the materials with UV at 85°C and two relative humidities (RH) of 5 % (dry) and 60 % (humid). Cryomicrotomy was used to obtain cross-sectional PPE samples. Mechanical depth profiling of the cross-sections of aged and unaged samples was conducted by nanoindentation, and a peak-force based quantitative nanomechanical atomic force microscopy (QNM-AFM) mapping technique was used to investigate the microstructure and adhesion properties of the adhesive tie layers. The nanoindentation results show an increase in elastic modulus in the PET outer and pigmented EVA layers. From QNM-AFM, the microstructures and adhesion properties of the adhesive layers between the PET outer and core layers and between the PET core and EVA inner layers are revealed and found to degrade significantly after aging in the humid environment. The results from mechanical depth profiling of the PPE backsheet are further related to the previous chemical depth profiling of the same material, providing new insights into the effects of accelerated UV and humidity on the degradation of multilayer backsheets.

  15. INCIDENCE OF ABNORMAL POSITRON EMISSION TOMOGRAPHY IN PATIENTS WITH UNEXPLAINED CARDIOMYOPATHY AND VENTRICULAR ARRHYTHMIAS

    PubMed Central

    Tung, Roderick; Bauer, Brenton; Schelbert, Heinrich; Lynch, Joseph; Auerbach, Martin; Gupta, Pawan; Schiepers, Christiaan; Chan, Samantha; Ferris, Julie; Barrio, Martin; Ajijola, Olujimi; Bradfield, Jason; Shivkumar, Kalyanam

    2015-01-01

    Background The incidence of myocardial inflammation in patients with unexplained cardiomyopathy referred for ventricular arrhythmias (VA) is unknown. Objective To report fasting PET scan findings in consecutive patients referred with unexplained cardiomyopathy and VA. Methods 18-FDG PET/CT scans with a >16 hour fasting protocol were prospectively ordered for patients referred for VA and unexplained cardiomyopathy (EF<55%). Patients with focal myocardial FDG uptake were labeled as arrhythmogenic inflammatory cardiomyopathy (AIC) and classified into four groups based on the presence of lymph node uptake (AIC+) and perfusion abnormalities (early vs late stage). Results Over a 3-year period, 103 PET scans were performed with 49% (AIC+=17, AIC=33) exhibiting focal FDG uptake. The mean age was 52±12 years with an EF of 36±16%. Patients with AIC were more likely to have a history of pacemaker (32% vs 6%, p=0.002) compared to those with normal PET. When biopsy was performed, histologic diagnosis revealed non-granulomatous inflammation in 6 patients and sarcoidosis in 18 patients. 90% of patients with AIC/AIC+ were prescribed immunosuppressive therapy and 58% underwent ablation. Correlation between areas of perfusion abnormalities and FDG uptake with electro-anatomic mapping was observed in 79% of patients, while MRI findings matched in only 33%. Conclusions Nearly 50% of patients referred with unexplained cardiomyopathy and VA demonstrate ongoing focal myocardial inflammation on FDG PET. These data suggest that a significant proportion of patients labeled “idiopathic” may have occult arrhythmogenic inflammatory cardiomyopathy, which may benefit from early detection and immunosuppressive medical therapy. PMID:26272522

  16. Antimicrobial (BN/PE) film combined with modified atmosphere packaging extends the shelf life of minimally processed fresh-cut iceberg lettuce.

    PubMed

    Kang, Sun-Chul; Kim, Min-Jeong; Park, In-Sik; Choi, Ung-Kyu

    2008-03-01

    This study was conducted to investigate the effect of modified atmosphere packaging (MAP) in combination with BN/PE film on the shelf life and quality of fresh-cut iceberg lettuce during cold storage. The total mesophilic population in the sample packed in BN/PE film under MAP conditions was dramatically reduced in comparison with that of PE film, PE film under MAP conditions, and BN/PE film. The O2 concentration in the BN/PE film under MAP conditions decreased slightly as the storage period progressed. The coloration of the iceberg lettuce progressed the slowest when it was packaged in BN/PE film under MAP conditions, followed by BN/PE film, PE film, and PE film under MAP conditions. The shelf life of fresh-cut iceberg lettuce packaged in BN/PE film under MAP conditions was extended by more than 2 days at 10 degrees compared with BN/PE film alone, which in turn extended shelf life by more than 2 days relative to PE, PET, and OPP films.

  17. Calculating Higher-Order Moments of Phylogenetic Stochastic Mapping Summaries in Linear Time.

    PubMed

    Dhar, Amrit; Minin, Vladimir N

    2017-05-01

    Stochastic mapping is a simulation-based method for probabilistically mapping substitution histories onto phylogenies according to continuous-time Markov models of evolution. This technique can be used to infer properties of the evolutionary process on the phylogeny and, unlike parsimony-based mapping, conditions on the observed data to randomly draw substitution mappings that do not necessarily require the minimum number of events on a tree. Most stochastic mapping applications simulate substitution mappings only to estimate the mean and/or variance of two commonly used mapping summaries: the number of particular types of substitutions (labeled substitution counts) and the time spent in a particular group of states (labeled dwelling times) on the tree. Fast, simulation-free algorithms for calculating the mean of stochastic mapping summaries exist. Importantly, these algorithms scale linearly in the number of tips/leaves of the phylogenetic tree. However, to our knowledge, no such algorithm exists for calculating higher-order moments of stochastic mapping summaries. We present one such simulation-free dynamic programming algorithm that calculates prior and posterior mapping variances and scales linearly in the number of phylogeny tips. Our procedure suggests a general framework that can be used to efficiently compute higher-order moments of stochastic mapping summaries without simulations. We demonstrate the usefulness of our algorithm by extending previously developed statistical tests for rate variation across sites and for detecting evolutionarily conserved regions in genomic sequences.
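    The simulation-based approach that the paper's simulation-free algorithm replaces can be illustrated on a single branch. The sketch below is a minimal, hypothetical example: a 2-state rate matrix Q, a branch of fixed length, and simple rejection sampling to condition on the observed endpoint states, from which the mean and variance of the substitution count are estimated empirically (the paper's contribution is computing such moments exactly, without this simulation).

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical 2-state rate matrix Q for a continuous-time Markov chain.
    Q = np.array([[-1.0, 1.0],
                  [1.5, -1.5]])

    def simulate_branch(start, t, rng):
        """Simulate a CTMC path of duration t; return (end state, substitution count)."""
        state, elapsed, count = start, 0.0, 0
        while True:
            wait = rng.exponential(1.0 / -Q[state, state])
            if elapsed + wait >= t:
                return state, count
            elapsed += wait
            state = 1 - state   # only one alternative state in a 2-state chain
            count += 1

    def mapped_count_moments(start, end, t, rng, n=5000):
        """Rejection-sample paths conditioned on the observed endpoint states,
        then estimate the mean and variance of the substitution count."""
        counts = []
        while len(counts) < n:
            s, c = simulate_branch(start, t, rng)
            if s == end:
                counts.append(c)
        counts = np.asarray(counts, dtype=float)
        return counts.mean(), counts.var()

    mean_c, var_c = mapped_count_moments(start=0, end=0, t=0.5, rng=rng)
    ```

    On a full phylogeny the same conditioning is done at every branch given the tip data, which is exactly why a linear-time analytic recursion for these moments is valuable.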

  18. Calculating Higher-Order Moments of Phylogenetic Stochastic Mapping Summaries in Linear Time

    PubMed Central

    Dhar, Amrit

    2017-01-01

    Stochastic mapping is a simulation-based method for probabilistically mapping substitution histories onto phylogenies according to continuous-time Markov models of evolution. This technique can be used to infer properties of the evolutionary process on the phylogeny and, unlike parsimony-based mapping, conditions on the observed data to randomly draw substitution mappings that do not necessarily require the minimum number of events on a tree. Most stochastic mapping applications simulate substitution mappings only to estimate the mean and/or variance of two commonly used mapping summaries: the number of particular types of substitutions (labeled substitution counts) and the time spent in a particular group of states (labeled dwelling times) on the tree. Fast, simulation-free algorithms for calculating the mean of stochastic mapping summaries exist. Importantly, these algorithms scale linearly in the number of tips/leaves of the phylogenetic tree. However, to our knowledge, no such algorithm exists for calculating higher-order moments of stochastic mapping summaries. We present one such simulation-free dynamic programming algorithm that calculates prior and posterior mapping variances and scales linearly in the number of phylogeny tips. Our procedure suggests a general framework that can be used to efficiently compute higher-order moments of stochastic mapping summaries without simulations. We demonstrate the usefulness of our algorithm by extending previously developed statistical tests for rate variation across sites and for detecting evolutionarily conserved regions in genomic sequences. PMID:28177780

  19. Probabilistic seismic hazard analysis (PSHA) for Ethiopia and the neighboring region

    NASA Astrophysics Data System (ADS)

    Ayele, Atalay

    2017-10-01

    Seismic hazard calculation is carried out for the Horn of Africa region (0°-20° N and 30°-50° E) based on the probabilistic seismic hazard analysis (PSHA) method. The earthquake catalogue data obtained from different sources were compiled, homogenized to the Mw magnitude scale, and declustered to remove dependent events, as required by the Poisson earthquake source model. The seismotectonic map of the study area available from recent studies is used for area source zonation. For assessing the seismic hazard, the study area was divided into small grids of size 0.5° × 0.5°, and the hazard parameters were calculated at the center of each of these grid cells by considering contributions from all seismic sources. Peak Ground Acceleration (PGA) values corresponding to 10% and 2% probability of exceedance in 50 years were calculated for all the grid points using a generic rock site with Vs = 760 m/s. The obtained values vary from 0.0 to 0.18 g and from 0.0 to 0.35 g for the 475- and 2475-year return periods, respectively. The corresponding contour maps showing the spatial variation of PGA values for the two return periods are presented here. Uniform hazard response spectra (UHRS) for 10% and 2% probability of exceedance in 50 years and hazard curves for PGA and 0.2 s spectral acceleration (Sa), all at rock site, are developed for the city of Addis Ababa. The hazard map of this study corresponding to the 475-year return period has already been used to update and produce the 3rd generation building code of Ethiopia.
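    The correspondence between exceedance probabilities and return periods quoted in the abstract follows from the standard Poisson occurrence model, P = 1 - exp(-t / T_R). A short sketch of that conversion (the function name is illustrative):

    ```python
    import math

    def return_period(p_exceed, t_years):
        """Return period T_R consistent with P(>=1 exceedance in t_years) = p_exceed,
        under a Poisson occurrence model: p = 1 - exp(-t / T_R)."""
        return -t_years / math.log(1.0 - p_exceed)

    rp_10pct = return_period(0.10, 50)   # ~475 years
    rp_2pct = return_period(0.02, 50)    # ~2475 years
    ```

    This is why 10% and 2% probability of exceedance in 50 years are conventionally reported as 475- and 2475-year return periods.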

  20. Future trends in flood risk in Indonesia - A probabilistic approach

    NASA Astrophysics Data System (ADS)

    Muis, Sanne; Guneralp, Burak; Jongman, Brenden; Ward, Philip

    2014-05-01

    Indonesia is one of the 10 most populous countries in the world and is highly vulnerable to (river) flooding. Catastrophic floods occur on a regular basis; total estimated damages were US$ 0.8 bn in 2010 and US$ 3 bn in 2013. Large parts of Greater Jakarta, the capital city, are annually subject to flooding. Flood risks (i.e. the product of hazard, exposure and vulnerability) are increasing due to rapid increases in exposure, such as strong population growth and ongoing economic development. The increase in risk may also be amplified by increasing flood hazards, such as increasing flood frequency and intensity due to climate change and land subsidence. The implementation of adaptation measures, such as the construction of dykes and strategic urban planning, may counteract these increasing trends. However, despite its importance for adaptation planning, a comprehensive assessment of current and future flood risk in Indonesia is lacking. This contribution addresses this issue and aims to provide insight into how socio-economic trends and climate change projections may shape future flood risks in Indonesia. Flood risks were calculated using an adapted version of the GLOFRIS global flood risk assessment model. Using this approach, we produced probabilistic maps of flood risks (i.e. annual expected damage) at a resolution of 30"x30" (ca. 1 km x 1 km at the equator). To represent flood exposure, we produced probabilistic projections of urban growth in a Monte Carlo fashion based on probability density functions of projected population and GDP values for 2030. To represent flood hazard, inundation maps were computed using the hydrological-hydraulic component of GLOFRIS. These maps show flood inundation extent and depth for several return periods and were produced for several combinations of GCMs and future socioeconomic scenarios.
Finally, the implementation of different adaptation strategies was incorporated into the model to explore to what extent adaptation may be able to decrease future risks. Preliminary results show that the urban extent in Indonesia is projected to increase by 211% to 351% over the period 2000-2030 (5th and 95th percentiles). Mainly driven by this rapid urbanization, potential flood losses in Indonesia increase rapidly and are primarily concentrated on the island of Java. The results reveal the large risk-reducing potential of adaptation measures. Since much of the urban development between 2000 and 2030 takes place in flood-prone areas, strategic urban planning (i.e. building in safe areas) may significantly reduce the urban population and infrastructure exposed to flooding. We conclude that a probabilistic risk approach in future flood risk assessment is vital; the drivers behind risk trends (exposure, hazard, vulnerability) should be understood to develop robust and efficient adaptation pathways.
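    The "annual expected damage" metric combined with Monte Carlo exposure sampling described above can be sketched as follows. All numbers here are hypothetical stand-ins (invented damage-per-return-period values and a lognormal growth multiplier in place of the sampled population/GDP projections); the point is the mechanics: integrate damage over annual exceedance probability, then propagate an exposure-growth distribution through it.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical damage estimates (US$ bn) for one region, one value per
    # modelled return period, as an inundation model might produce.
    return_periods = np.array([2.0, 5.0, 10.0, 25.0, 50.0, 100.0, 250.0, 500.0, 1000.0])
    damages = np.array([0.0, 0.2, 0.5, 1.1, 1.8, 2.6, 3.5, 4.2, 5.0])

    def expected_annual_damage(rps, dmg):
        """Trapezoidal integration of damage over annual exceedance probability."""
        p = 1.0 / rps                 # exceedance probabilities, descending
        widths = p[:-1] - p[1:]       # positive, since p is descending
        return float(np.sum(0.5 * (dmg[:-1] + dmg[1:]) * widths))

    ead_baseline = expected_annual_damage(return_periods, damages)

    # Monte Carlo exposure growth to 2030: a lognormal multiplier standing in
    # for sampled urban-growth projections driven by population and GDP.
    growth = rng.lognormal(mean=np.log(2.8), sigma=0.15, size=10_000)
    ead_2030 = ead_baseline * growth
    low, high = np.percentile(ead_2030, [5, 95])
    ```

    Reporting the 5th and 95th percentiles of `ead_2030` rather than a single value is what makes the risk projection probabilistic, mirroring the 211-351% urban-growth range quoted in the abstract.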

Top