Sample records for unannotated two-dimensional histogram

  1. Structure Size Enhanced Histogram

    NASA Astrophysics Data System (ADS)

    Wesarg, Stefan; Kirschner, Matthias

    Direct volume visualization requires the definition of transfer functions (TFs) for the assignment of opacity and color. Multi-dimensional TFs are based on at least two image properties, and are specified by means of 2D histograms. In this work we propose a new type of a 2D histogram which combines gray value with information about the size of the structures. This structure size enhanced (SSE) histogram is an intuitive approach for representing anatomical features. Clinicians — the users we are focusing on — are much more familiar with selecting features by their size than by their gradient magnitude value. As a proof of concept, we employ the SSE histogram for the definition of two-dimensional TFs for the visualization of 3D MRI and CT image data.

  2. Analysis of dose heterogeneity using a subvolume-DVH

    NASA Astrophysics Data System (ADS)

    Said, M.; Nilsson, P.; Ceberg, C.

    2017-11-01

    The dose-volume histogram (DVH) is universally used in radiation therapy for its highly efficient way of summarizing three-dimensional dose distributions. An apparent limitation that is inherent to standard histograms is the loss of spatial information, e.g. it is no longer possible to tell where low- and high-dose regions are, and whether they are connected or disjoint. Two methods for overcoming the spatial fragmentation of low- and high-dose regions are presented, both based on the gray-level size zone matrix, which is a two-dimensional histogram describing the frequencies of connected regions of similar intensities. The first approach is a quantitative metric which can be likened to a homogeneity index. The large cold spot metric (LCS) is here defined to emphasize large contiguous regions receiving too low a dose; emphasis is put on both size, and deviation from the prescribed dose. In contrast, the subvolume-DVH (sDVH) is an extension to the standard DVH and allows for a qualitative evaluation of the degree of dose heterogeneity. The information retained from the two-dimensional histogram is overlaid on top of the DVH and the two are presented simultaneously. Both methods gauge the underlying heterogeneity in ways that the DVH alone cannot, and both have their own merits—the sDVH being more intuitive and the LCS being quantitative.
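    For illustration, a gray-level size zone matrix of the kind described above can be sketched with NumPy and SciPy's connected-component labeling. This is a minimal sketch under assumed inputs (a 3D dose or intensity array), not the authors' implementation; the level count and size cap are arbitrary.

      import numpy as np
      from scipy import ndimage

      def size_zone_matrix(volume, n_levels=16, max_zone_size=1000):
          """2D histogram: rows are quantized gray levels, columns are sizes (in voxels)
          of connected zones sharing that level."""
          lo, hi = float(volume.min()), float(volume.max())
          levels = np.minimum(((volume - lo) / (hi - lo + 1e-12) * n_levels).astype(int),
                              n_levels - 1)
          glszm = np.zeros((n_levels, max_zone_size + 1), dtype=int)
          for g in range(n_levels):
              labeled, n_zones = ndimage.label(levels == g)   # connected regions at level g
              if n_zones == 0:
                  continue
              sizes = np.bincount(labeled.ravel())[1:]        # voxel count of each zone
              for s in np.minimum(sizes, max_zone_size):
                  glszm[g, s] += 1
          return glszm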

  3. Histogram deconvolution - An aid to automated classifiers

    NASA Technical Reports Server (NTRS)

    Lorre, J. J.

    1983-01-01

    It is shown that N-dimensional histograms are convolved by the addition of noise in the picture domain. Three methods are described which provide the ability to deconvolve such noise-affected histograms. The purpose of the deconvolution is to provide automated classifiers with a higher quality N-dimensional histogram from which to obtain classification statistics.

  4. PDB-UF: database of predicted enzymatic functions for unannotated protein structures from structural genomics.

    PubMed

    von Grotthuss, Marcin; Plewczynski, Dariusz; Ginalski, Krzysztof; Rychlewski, Leszek; Shakhnovich, Eugene I

    2006-02-06

    The number of protein structures deposited in the Protein Data Bank (PDB) by structural genomics centers is increasing dramatically. Many of these structures are functionally unannotated because they have no sequence similarity to proteins of known function. However, it is possible to successfully infer function using only structural similarity. Here we present the PDB-UF database, a web-accessible collection of predictions of enzymatic properties based on structure-function relationships. The assignments were made for three-dimensional protein structures of unknown function that come from structural genomics initiatives. We show that four hypothetical proteins (PDB accession codes 1VH0, 1NS5, 1O6D, and 1TO0), for which standard BLAST tools such as PSI-BLAST or RPS-BLAST failed to assign any function, are probably methyltransferase enzymes. We suggest that structure-based prediction of an EC number should be conducted with a different similarity score cutoff for different protein folds. Moreover, performing the annotation with two different algorithms can reduce the rate of false positive assignments. We believe that the presented web-based repository will help decrease the number of protein structures whose functions are marked as "unknown" in the PDB file. http://paradox.harvard.edu/PDB-UF and http://bioinfo.pl/PDB-UF.

  5. Predicting protein functions from redundancies in large-scale protein interaction networks

    NASA Technical Reports Server (NTRS)

    Samanta, Manoj Pratim; Liang, Shoudan

    2003-01-01

    Interpreting data from large-scale protein interaction experiments has been a challenging task because of the widespread presence of random false positives. Here, we present a network-based statistical algorithm that overcomes this difficulty and allows us to derive functions of unannotated proteins from large-scale interaction data. Our algorithm uses the insight that if two proteins share a significantly larger number of common interaction partners than expected at random, they have close functional associations. Analysis of publicly available data from Saccharomyces cerevisiae reveals >2,800 reliable functional associations, 29% of which involve at least one unannotated protein. By further analyzing these associations, we derive tentative functions for 81 unannotated proteins with high certainty. Our method is not overly sensitive to the false positives present in the data. Even after adding 50% randomly generated interactions to the measured data set, we are able to recover almost all (approximately 89%) of the original associations.

  6. Differentially Private Synthesization of Multi-Dimensional Data using Copula Functions

    PubMed Central

    Li, Haoran; Xiong, Li; Jiang, Xiaoqian

    2014-01-01

    Differential privacy has recently emerged in private statistical data release as one of the strongest privacy guarantees. Most of the existing techniques that generate differentially private histograms or synthetic data only work well for single dimensional or low-dimensional histograms. They become problematic for high dimensional and large domain data due to increased perturbation error and computation complexity. In this paper, we propose DPCopula, a differentially private data synthesization technique using Copula functions for multi-dimensional data. The core of our method is to compute a differentially private copula function from which we can sample synthetic data. Copula functions are used to describe the dependence between multivariate random vectors and allow us to build the multivariate joint distribution using one-dimensional marginal distributions. We present two methods for estimating the parameters of the copula functions with differential privacy: maximum likelihood estimation and Kendall’s τ estimation. We present formal proofs for the privacy guarantee as well as the convergence property of our methods. Extensive experiments using both real datasets and synthetic datasets demonstrate that DPCopula generates highly accurate synthetic multi-dimensional data with significantly better utility than state-of-the-art techniques. PMID:25405241
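    The copula step can be illustrated with a plain, non-private Gaussian-copula sketch: estimate pairwise Kendall's τ, map it to a correlation via sin(πτ/2), sample correlated normals, and push them through the empirical marginals. The differential-privacy perturbation that DPCopula adds on top is deliberately omitted here, and all names are illustrative.

      import numpy as np
      from scipy import stats

      def gaussian_copula_synthesize(data, n_samples, seed=0):
          """data: (n, d) array of real-valued records. Returns (n_samples, d) synthetic rows
          drawn from a Gaussian copula fitted with Kendall's tau (no privacy noise)."""
          rng = np.random.default_rng(seed)
          n, d = data.shape
          corr = np.eye(d)
          for i in range(d):
              for j in range(i + 1, d):
                  tau, _ = stats.kendalltau(data[:, i], data[:, j])
                  corr[i, j] = corr[j, i] = np.sin(0.5 * np.pi * tau)
          z = rng.multivariate_normal(np.zeros(d), corr, size=n_samples)  # correlated normals
          u = stats.norm.cdf(z)                                           # copula uniforms
          # Invert each empirical marginal (quantile transform) column by column.
          return np.column_stack([np.quantile(data[:, k], u[:, k]) for k in range(d)])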

  7. An Apparent Diffusion Coefficient Histogram Method Versus a Traditional 2-Dimensional Measurement Method for Identifying Non-Puerperal Mastitis From Breast Cancer at 3.0 T.

    PubMed

    Tang, Qi; Li, Qiang; Xie, Dong; Chu, Ketao; Liu, Lidong; Liao, Chengcheng; Qin, Yunying; Wang, Zheng; Su, Danke

    2018-05-21

    This study aimed to investigate the utility of a volumetric apparent diffusion coefficient (ADC) histogram method for distinguishing non-puerperal mastitis (NPM) from breast cancer (BC) and to compare this method with a traditional 2-dimensional measurement method. Pretreatment diffusion-weighted imaging data at 3.0 T were obtained for 80 patients (NPM, n = 27; BC, n = 53) and were retrospectively assessed. Two readers measured ADC values according to 2 distinct region-of-interest (ROI) protocols. The first protocol included the generation of ADC histograms for each lesion, and various parameters were examined. In the second protocol, 3 freehand (TF) ROIs for local lesions were generated to obtain a mean ADC value (defined as ADC-ROITF). All of the ADC values were compared by an independent-samples t test or the Mann-Whitney U test. Receiver operating characteristic curves and a leave-one-out cross-validation method were also used to assess the diagnostic performance of the significant parameters. The ADC values for NPM were characterized by significantly higher mean, 5th to 95th percentiles, and maximum and mode ADCs compared with the corresponding ADCs for BC (all P < 0.05). However, the minimum, skewness, and kurtosis ADC values, as well as ADC-ROITF, did not significantly differ between the NPM and BC cases. Thus, the generation of volumetric ADC histograms seems to be a superior method to the traditional 2-dimensional method that was examined, and it also seems to represent a promising image analysis method for distinguishing NPM from BC.
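    The whole-lesion histogram parameters reported here (mean, 5th and 95th percentiles, minimum, maximum, mode, skewness, kurtosis) are simple to compute once the lesion voxels are available; the sketch below assumes NumPy arrays for the ADC map and the lesion mask and is illustrative only.

      import numpy as np
      from scipy import stats

      def adc_histogram_features(adc_map, lesion_mask, n_bins=100):
          """Whole-lesion ADC histogram statistics from a masked 3D ADC map."""
          vals = adc_map[lesion_mask > 0].astype(float)
          counts, edges = np.histogram(vals, bins=n_bins)
          peak = np.argmax(counts)                      # mode = center of the fullest bin
          return {
              "mean": vals.mean(),
              "min": vals.min(),
              "max": vals.max(),
              "p5": np.percentile(vals, 5),
              "p95": np.percentile(vals, 95),
              "mode": 0.5 * (edges[peak] + edges[peak + 1]),
              "skewness": stats.skew(vals),
              "kurtosis": stats.kurtosis(vals),
          }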

  8. AHIMSA - Ad hoc histogram information measure sensing algorithm for feature selection in the context of histogram inspired clustering techniques

    NASA Technical Reports Server (NTRS)

    Dasarathy, B. V.

    1976-01-01

    An algorithm is proposed for dimensionality reduction in the context of clustering techniques based on histogram analysis. The approach is based on an evaluation of the hills and valleys in the unidimensional histograms along the different features and provides an economical means of assessing the significance of the features in a nonparametric unsupervised data environment. The method has relevance to remote sensing applications.

  9. Content based Image Retrieval based on Different Global and Local Color Histogram Methods: A Survey

    NASA Astrophysics Data System (ADS)

    Suhasini, Pallikonda Sarah; Sri Rama Krishna, K.; Murali Krishna, I. V.

    2017-02-01

    Different global and local color histogram methods for content based image retrieval (CBIR) are investigated in this paper. The color histogram is a widely used descriptor for CBIR. The conventional method of extracting a color histogram is global: it misses spatial content, is less invariant to deformation and viewpoint changes, and results in a very large three-dimensional histogram corresponding to the color space used. To address these deficiencies, different global and local histogram methods have been proposed in recent research, including ways of extracting local histograms to preserve spatial correspondence, invariant color histograms to add deformation and viewpoint invariance, and fuzzy linking methods to reduce the size of the histogram. The color space and the distance metric used are vital in obtaining a color histogram. In this paper, the performance of CBIR based on different global and local color histograms is surveyed in three color spaces (RGB, HSV, and L*a*b*) and with three distance measures (Euclidean, quadratic, and histogram intersection), in order to choose an appropriate method for future research.
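    As a minimal illustration of the building blocks surveyed here, the sketch below computes a normalized global HSV color histogram with OpenCV (assumed available) and compares two histograms with Euclidean distance and histogram intersection; the quadratic-form distance, which needs a bin-similarity matrix, is omitted and the bin counts are arbitrary.

      import numpy as np
      import cv2  # OpenCV, assumed available

      def global_hsv_histogram(image_bgr, bins=(8, 8, 8)):
          """Normalized global 3D color histogram in HSV space, flattened to a vector."""
          hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
          hist = cv2.calcHist([hsv], [0, 1, 2], None, list(bins), [0, 180, 0, 256, 0, 256])
          hist = hist.ravel()
          return hist / (hist.sum() + 1e-12)

      def euclidean_distance(h1, h2):
          return float(np.linalg.norm(h1 - h2))

      def histogram_intersection(h1, h2):
          # Similarity in [0, 1] for normalized histograms; use 1 - value as a distance.
          return float(np.minimum(h1, h2).sum())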

  10. Exploring gravitational lensing model variations in the Frontier Fields galaxy clusters

    NASA Astrophysics Data System (ADS)

    Harris James, Nicholas John; Raney, Catie; Brennan, Sean; Keeton, Charles

    2018-01-01

    Multiple groups have been working on modeling the mass distributions of the six lensing galaxy clusters in the Hubble Space Telescope Frontier Fields data set. The magnification maps produced from these mass models will be important for the future study of the lensed background galaxies, but there exists significant variation in the different groups’ models and magnification maps. We explore the use of two-dimensional histograms as a tool for visualizing these magnification map variations. Using a number of simple, one- or two-halo singular isothermal sphere models, we explore the features that are produced in 2D histogram model comparisons when parameters such as halo mass, ellipticity, and location are allowed to vary. Our analysis demonstrates the potential of 2D histograms as a means of observing the full range of differences between the Frontier Fields groups’ models. This work has been supported by funding from National Science Foundation grants PHY-1560077 and AST-1211385, and from the Space Telescope Science Institute.
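    A pixel-by-pixel joint histogram of two magnification maps is one concrete way to build the kind of 2D-histogram comparison discussed above; the following NumPy sketch (with illustrative array names and a log-spaced magnification range) shows the idea. Perfect agreement piles up along the diagonal, while model differences appear as off-diagonal structure.

      import numpy as np

      def magnification_comparison_histogram(mu_a, mu_b, bins=100, mu_range=(0.5, 100.0)):
          """Joint 2D histogram of two magnification maps sampled on the same pixel grid."""
          edges = np.logspace(np.log10(mu_range[0]), np.log10(mu_range[1]), bins + 1)
          hist, x_edges, y_edges = np.histogram2d(mu_a.ravel(), mu_b.ravel(),
                                                  bins=[edges, edges])
          return hist, x_edges, y_edges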

  11. Bennett's acceptance ratio and histogram analysis methods enhanced by umbrella sampling along a reaction coordinate in configurational space.

    PubMed

    Kim, Ilsoo; Allen, Toby W

    2012-04-28

    Free energy perturbation, a method for computing the free energy difference between two states, is often combined with non-Boltzmann biased sampling techniques in order to accelerate the convergence of free energy calculations. Here we present a new extension of the Bennett acceptance ratio (BAR) method by combining it with umbrella sampling (US) along a reaction coordinate in configurational space. In this approach, which we call Bennett acceptance ratio with umbrella sampling (BAR-US), the conditional histogram of energy difference (a mapping of the 3N-dimensional configurational space via a reaction coordinate onto 1D energy difference space) is weighted for marginalization with the associated population density along a reaction coordinate computed by US. This procedure produces marginal histograms of energy difference, from forward and backward simulations, with higher overlap in energy difference space, rendering free energy difference estimations using BAR statistically more reliable. In addition to BAR-US, two histogram analysis methods, termed Bennett overlapping histograms with US (BOH-US) and Bennett-Hummer (linear) least square with US (BHLS-US), are employed as consistency and convergence checks for free energy difference estimation by BAR-US. The proposed methods (BAR-US, BOH-US, and BHLS-US) are applied to a 1-dimensional asymmetric model potential, as has been used previously to test free energy calculations from non-equilibrium processes. We then consider the more stringent test of a 1-dimensional strongly (but linearly) shifted harmonic oscillator, which exhibits no overlap between two states when sampled using unbiased Brownian dynamics. We find that the efficiency of the proposed methods is enhanced over the original Bennett's methods (BAR, BOH, and BHLS) through fast uniform sampling of energy difference space via US in configurational space. We apply the proposed methods to the calculation of the electrostatic contribution to the absolute solvation free energy (excess chemical potential) of water. We then address the controversial issue of ion selectivity in the K(+) ion channel, KcsA. We have calculated the relative binding affinity of K(+) over Na(+) within a binding site of the KcsA channel for which different, though adjacent, K(+) and Na(+) configurations exist, ideally suited to these US-enhanced methods. Our studies demonstrate that the significant improvements in free energy calculations obtained using the proposed methods can have serious consequences for elucidating biological mechanisms and for the interpretation of experimental data.
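    For reference, the plain BAR estimate (without the umbrella-sampling weighting that the paper introduces) solves a self-consistency condition on the forward and backward energy differences. The sketch below uses the textbook form in units of kT and illustrates only the baseline estimator, not BAR-US.

      import numpy as np
      from scipy.optimize import brentq

      def fermi(x):
          return 1.0 / (1.0 + np.exp(x))

      def bar_free_energy(du_forward, du_reverse):
          """Bennett acceptance ratio estimate of Delta F (in kT).

          du_forward: U1 - U0 evaluated on samples drawn from state 0.
          du_reverse: U0 - U1 evaluated on samples drawn from state 1."""
          du_forward = np.asarray(du_forward, dtype=float)
          du_reverse = np.asarray(du_reverse, dtype=float)
          n_f, n_r = len(du_forward), len(du_reverse)

          def imbalance(c):
              # Monotone in c, so the root (the optimal shift constant) is unique.
              return fermi(du_forward - c).sum() - fermi(du_reverse + c).sum()

          span = max(np.abs(du_forward).max(), np.abs(du_reverse).max()) + 50.0
          c_opt = brentq(imbalance, -span, span)
          return c_opt - np.log(n_r / n_f)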

  12. Secondary iris recognition method based on local energy-orientation feature

    NASA Astrophysics Data System (ADS)

    Huo, Guang; Liu, Yuanning; Zhu, Xiaodong; Dong, Hongxing

    2015-01-01

    This paper proposes a secondary iris recognition method based on local features. First, an energy-orientation feature (EOF) is extracted from the iris with a two-dimensional Gabor filter and used for a first recognition pass based on a similarity threshold, which divides the iris database into two categories: a correctly recognized class and a class still to be recognized. The former are accepted, while the latter are transformed by histogram into an energy-orientation histogram feature (EOHF) and subjected to a second recognition pass using the chi-square distance. Experiments show that, owing to its higher correct recognition rate, the proposed method is among the most efficient and effective of comparable iris recognition algorithms.

  13. Three-dimensional volumetric gray-scale uterine cervix histogram prediction of days to delivery in full term pregnancy.

    PubMed

    Kim, Ji Youn; Kim, Hai-Joong; Hahn, Meong Hi; Jeon, Hye Jin; Cho, Geum Joon; Hong, Sun Chul; Oh, Min Jeong

    2013-09-01

    Our aim was to determine whether the volumetric gray-scale histogram difference between the anterior and posterior cervix can indicate the extent of cervical consistency. We collected data from 95 patients at 36 to 37 weeks of gestational age who were suitable for vaginal delivery between September 2010 and October 2011 in the Department of Obstetrics and Gynecology, Korea University Ansan Hospital. Patients were excluded if they had any of the following: Cesarean section, labor induction, or premature rupture of membranes. Thirty-four patients were finally enrolled. The patients underwent evaluation of the cervix by Bishop score, cervical length, cervical volume, and three-dimensional (3D) cervical volumetric gray-scale histogram. The interval in days from the cervix evaluation to the delivery day was counted. We compared the 3D cervical volumetric gray-scale histogram, Bishop score, cervical length, and cervical volume with the interval in days from the evaluation of the cervix to delivery. The gray-scale histogram difference between the anterior and posterior cervix was significantly correlated with days to delivery; its correlation coefficient (R) was 0.500 (P = 0.003). The cervical length was also significantly related to days to delivery, with a correlation coefficient (R) of 0.421 and a P-value of 0.013. However, the anterior lip histogram, posterior lip histogram, total cervical volume, and Bishop score were not associated with days to delivery (P > 0.05). Both the gray-scale histogram difference between the anterior and posterior cervix and the cervical length correlated with days to delivery; these measures may help predict cervical consistency.

  14. Aggregating and Predicting Sequence Labels from Crowd Annotations

    PubMed Central

    Nguyen, An T.; Wallace, Byron C.; Li, Junyi Jessy; Nenkova, Ani; Lease, Matthew

    2017-01-01

    Despite sequences being core to NLP, scant work has considered how to handle noisy sequence labels from multiple annotators for the same text. Given such annotations, we consider two complementary tasks: (1) aggregating sequential crowd labels to infer a best single set of consensus annotations; and (2) using crowd annotations as training data for a model that can predict sequences in unannotated text. For aggregation, we propose a novel Hidden Markov Model variant. To predict sequences in unannotated text, we propose a neural approach using Long Short Term Memory. We evaluate a suite of methods across two different applications and text genres: Named-Entity Recognition in news articles and Information Extraction from biomedical abstracts. Results show improvement over strong baselines. Our source code and data are available online. PMID:29093611

  15. Pattern-histogram-based temporal change detection using personal chest radiographs

    NASA Astrophysics Data System (ADS)

    Ugurlu, Yucel; Obi, Takashi; Hasegawa, Akira; Yamaguchi, Masahiro; Ohyama, Nagaaki

    1999-05-01

    Accurate and reliable detection of temporal changes from a pair of images is of considerable interest in medical science. Traditional registration and subtraction techniques can be applied to extract temporal differences when the object is rigid or corresponding points are obvious. However, in radiological imaging, the loss of depth information, the elasticity of the object, the absence of clearly defined landmarks, and three-dimensional positioning differences constrain the performance of conventional registration techniques. In this paper, we propose a new method to detect interval changes accurately without using an image registration technique. The method is based on the construction of a so-called pattern histogram and a comparison procedure. The pattern histogram is a graphic representation of the frequency counts of all allowable patterns in the multi-dimensional pattern vector space. The K-means algorithm is employed to partition the pattern vector space successively. Any differences in the pattern histograms imply that different patterns are involved in the scenes. In our experiment, a pair of chest radiographs of pneumoconiosis is employed and the changing histogram bins are visualized on both images. We found that the method can be used as an alternative way of detecting temporal change, particularly when precise image registration is not available.

  16. Wildfire Detection Using a Multi-Dimensional Histogram in Boreal Forest

    NASA Astrophysics Data System (ADS)

    Honda, K.; Kimura, K.; Honma, T.

    2008-12-01

    Early detection of wildfires is important for reducing damage to the environment and to humans. There have been several attempts to detect wildfires using satellite imagery, mainly classified into three approaches: the Dozier method (1981-), the threshold method (1986-), and the contextual method (1994-). However, the accuracy of these methods is not sufficient: the detected results include both commission and omission errors. In addition, analyzing satellite imagery with high accuracy is difficult because of insufficient ground truth data. Kudoh and Hosoi (2003) developed a detection method using a three-dimensional (3D) histogram built from past fire data in NOAA-AVHRR imagery, but their method is impractical because it depends on manually picking past fire data out of a huge data set. The purpose of this study is therefore to collect fire points as hot spots efficiently from satellite imagery and to improve the method for detecting wildfires with the collected data. In our method, we collect past fire data using the Alaska Fire History data obtained from the Alaska Fire Service (AFS). We select points that are expected to be wildfires and pick up the points inside the fire areas of the AFS data. Next, we build a 3D histogram from the past fire data, using Bands 1, 21, and 32 of MODIS, and calculate the likelihood of wildfire from this three-dimensional histogram. As a result, we can select wildfires effectively with the 3D histogram and detect a toroidally spreading wildfire, which indicates good detection performance. However, areas surrounding glaciers tend to show elevated brightness temperatures, producing false alarms; burnt areas and bare ground are also sometimes flagged as false alarms, so the method needs further improvement. Additionally, we are trying various combinations of MODIS bands to detect wildfires more effectively. To adapt our method to other areas, we are applying it to tropical forest in Kalimantan, Indonesia and around Chiang Mai, Thailand, but the ground truth data in these areas are sparser than in Alaska, and our method needs a large amount of accurately observed data from the same area to build a multi-dimensional histogram. In this study, we present a system for selecting wildfire data efficiently from satellite imagery. Furthermore, building a multi-dimensional histogram from past fire data makes it possible to detect wildfires accurately.
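    The detection step described here boils down to building a multi-dimensional histogram of band values at known past-fire pixels and evaluating new pixels against it; a minimal NumPy sketch under assumed inputs (band values already co-registered and scaled), with illustrative bin counts:

      import numpy as np

      def build_fire_histogram(fire_pixels, bins=32, ranges=None):
          """fire_pixels: (n, 3) array of band values (e.g. MODIS bands 1, 21, 32) at known
          past-fire locations. Returns a normalized 3D histogram and its bin edges."""
          hist, edges = np.histogramdd(fire_pixels, bins=bins, range=ranges, density=True)
          return hist, edges

      def fire_likelihood(pixels, hist, edges):
          """Empirical 'fire-likeness' of new (n, 3) pixels: the histogram density of the
          bin each pixel falls into (clipped to the histogram boundaries)."""
          idx = []
          for k in range(pixels.shape[1]):
              i = np.digitize(pixels[:, k], edges[k]) - 1
              idx.append(np.clip(i, 0, hist.shape[k] - 1))
          return hist[tuple(idx)]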

  17. A CMOS VLSI IC for Real-Time Opto-Electronic Two-Dimensional Histogram Generation

    DTIC Science & Technology

    1993-12-01

    Keywords: VLSI (very large scale integration) design; MAGIC; CMOS; optics; image processing.

  18. Phase transitions and thermodynamic properties of antiferromagnetic Ising model with next-nearest-neighbor interactions on the Kagomé lattice

    NASA Astrophysics Data System (ADS)

    Ramazanov, M. K.; Murtazaev, A. K.; Magomedov, M. A.; Badiev, M. K.

    2018-06-01

    We study phase transitions and thermodynamic properties of the two-dimensional antiferromagnetic Ising model with next-nearest-neighbor interactions on a Kagomé lattice by Monte Carlo simulations. A histogram data analysis shows that a second-order transition occurs in the model. From the analysis of the obtained data, we conclude that next-nearest-neighbor ferromagnetic interactions in the two-dimensional antiferromagnetic Ising model on a Kagomé lattice induce a second-order transition and unusual temperature dependence of the thermodynamic properties.

  19. Dose evaluation of organs at risk (OAR) cervical cancer using dose volume histogram (DVH) on brachytherapy

    NASA Astrophysics Data System (ADS)

    Arif Wibowo, R.; Haris, Bambang; Inganatul Islamiyah

    2017-05-01

    Brachytherapy is one way to treat cervical cancer. It works by placing a radioactive source near the tumor. However, some healthy tissues or organs at risk (OAR), such as the bladder and rectum, receive radiation as well. This study aims to evaluate the radiation dose to the bladder and rectum. Twelve sets of total radiation dose data for the bladder and rectum were obtained from patients’ brachytherapy. The dose to the cervix for all patients was 6 Gy. The two-dimensional calculation of the radiation dose was based on the International Commission on Radiation Units and Measurements (ICRU) reference points (denoted DICRU), while the three-dimensional calculation was derived from the Dose Volume Histogram (DVH) for a volume of 2 cc (D2cc). The radiation doses to the bladder and rectum from both methods were analysed using an independent t test. The mean DICRU of the bladder was 4.33730 Gy and its D2cc was 4.78090 Gy; DICRU and D2cc for the bladder did not differ significantly (p = 0.144). The mean DICRU of the rectum was 3.57980 Gy and its D2cc was 4.58670 Gy; the mean DICRU of the rectum differed significantly from its D2cc (p = 0.000). The three-dimensional radiation dose to the bladder and rectum was higher than the two-dimensional dose, with ratios of 1.10227 for the bladder and 1.28127 for the rectum. The radiation doses to the bladder and rectum were still below the tolerance dose. The two-dimensional calculation of the bladder and rectum dose was lower than the three-dimensional calculation, which is more accurate because it is computed over the whole volume of the organs.
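    For reference, the D2cc metric used above (the minimum dose received by the most-irradiated 2 cc of an organ) can be computed directly from per-voxel doses and volumes; a minimal sketch with assumed array inputs:

      import numpy as np

      def d_cc(voxel_doses_gy, voxel_volumes_cc, volume_cc=2.0):
          """Minimum dose received by the hottest `volume_cc` of the organ (e.g. D2cc)."""
          order = np.argsort(voxel_doses_gy)[::-1]        # hottest voxels first
          doses = np.asarray(voxel_doses_gy)[order]
          cum_volume = np.cumsum(np.asarray(voxel_volumes_cc)[order])
          k = np.searchsorted(cum_volume, volume_cc)      # first index reaching the target volume
          return float(doses[min(k, len(doses) - 1)])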

  20. Multispectral histogram normalization contrast enhancement

    NASA Technical Reports Server (NTRS)

    Soha, J. M.; Schwartz, A. A.

    1979-01-01

    A multispectral histogram normalization or decorrelation enhancement which achieves effective color composites by removing interband correlation is described. The enhancement procedure employs either linear or nonlinear transformations to equalize principal component variances. An additional rotation to any set of orthogonal coordinates is thus possible, while full histogram utilization is maintained by avoiding the reintroduction of correlation. For the three-dimensional case, the enhancement procedure may be implemented with a lookup table. An application of the enhancement to Landsat multispectral scanning imagery is presented.
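    The core of the enhancement is a rotation into principal components, an equalization of the component variances, and a rotation back to band space; a minimal linear sketch (array shapes and the target standard deviation are illustrative):

      import numpy as np

      def decorrelation_stretch(bands, target_sigma=50.0):
          """bands: (n_pixels, n_bands) array. Returns a decorrelated, variance-equalized copy."""
          mean = bands.mean(axis=0)
          x = bands - mean
          cov = np.cov(x, rowvar=False)
          eigval, eigvec = np.linalg.eigh(cov)            # principal component directions
          # Scale each principal component to the same variance, then rotate back.
          scale = target_sigma / np.sqrt(np.maximum(eigval, 1e-12))
          transform = eigvec @ np.diag(scale) @ eigvec.T
          return x @ transform + mean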

  1. Inferring the Functions of Proteins from the Interrelationships between Functional Categories.

    PubMed

    Taha, Kamal

    2018-01-01

    This study proposes a new method to determine the functions of an unannotated protein. The proteins and amino acid residues mentioned in biomedical texts associated with an unannotated protein can be considered characteristic terms for that protein, which are highly predictive of its potential functions. Similarly, proteins and amino acid residues mentioned in biomedical texts associated with proteins annotated with a functional category can be considered characteristic terms of that category. We introduce in this paper an information extraction system called IFP_IFC that predicts the functions of an unannotated protein by representing the protein and each functional category by a vector of weights. Each weight reflects the degree of association between a characteristic term and the protein (or between a characteristic term and a functional category). First, IFP_IFC constructs a network whose nodes represent the different functional categories and whose edges represent the interrelationships between the nodes. Then, it determines the functions of the unannotated protein by employing random walks with restarts on this network. The walker is the protein's weight vector. Finally, the protein is assigned to the functional categories of the nodes in the network that are visited most by the walker. We evaluated the quality of IFP_IFC by comparing it experimentally with two other systems. Results showed marked improvement.

  2. Graphical and Numerical Descriptive Analysis: Exploratory Tools Applied to Vietnamese Data

    ERIC Educational Resources Information Center

    Haughton, Dominique; Phong, Nguyen

    2004-01-01

    This case study covers several exploratory data analysis ideas, the histogram and boxplot, kernel density estimates, the recently introduced bagplot--a two-dimensional extension of the boxplot--as well as the violin plot, which combines a boxplot with a density shape plot. We apply these ideas and demonstrate how to interpret the output from these…

  3. Fast and straightforward analysis approach of charge transport data in single molecule junctions.

    PubMed

    Zhang, Qian; Liu, Chenguang; Tao, Shuhui; Yi, Ruowei; Su, Weitao; Zhao, Cezhou; Zhao, Chun; Dappe, Yannick J; Nichols, Richard J; Yang, Li

    2018-08-10

    In this study, we introduce an efficient data sorting algorithm, including filters for noisy signals, conductance mapping for analyzing the most dominant conductance group and sub-population groups. The capacity of our data analysis process has also been corroborated on real experimental data sets of Au-1,6-hexanedithiol-Au and Au-1,8-octanedithiol-Au molecular junctions. The fully automated and unsupervised program requires less than one minute on a standard PC to sort the data and generate histograms. The resulting one-dimensional and two-dimensional log histograms give conductance values in good agreement with previous studies. Our algorithm is a straightforward, fast and user-friendly tool for single molecule charge transport data analysis. We also analyze the data in a form of a conductance map which can offer evidence for diversity in molecular conductance. The code for automatic data analysis is openly available, well-documented and ready to use, thereby offering a useful new tool for single molecule electronics.
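    The 1D and 2D log-conductance histograms referred to above are straightforward to reproduce with NumPy once the traces are available; the sketch below assumes lists of per-trace conductance (in units of G0) and tip-displacement arrays, with illustrative bin counts and ranges.

      import numpy as np

      def conductance_histograms(traces, displacements, log_g_range=(-6.0, 0.5), bins=300):
          """traces, displacements: lists of 1D arrays, one pair per pulling curve.
          Returns a pooled 1D histogram of log10(G/G0) and a 2D (displacement, log10 G)
          histogram used to visualize plateau structure."""
          log_g = np.concatenate([np.log10(np.clip(t, 1e-9, None)) for t in traces])
          z = np.concatenate(displacements)
          hist_1d, g_edges = np.histogram(log_g, bins=bins, range=log_g_range)
          hist_2d, z_edges, g_edges_2d = np.histogram2d(
              z, log_g, bins=[200, bins],
              range=[[float(z.min()), float(z.max())], list(log_g_range)])
          return hist_1d, g_edges, hist_2d, z_edges, g_edges_2d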

  4. MCNP Output Data Analysis with ROOT (MODAR)

    NASA Astrophysics Data System (ADS)

    Carasco, C.

    2010-06-01

    MCNP Output Data Analysis with ROOT (MODAR) is a tool based on CERN's ROOT software. MODAR has been designed to handle time-energy data produced by MCNP simulations of neutron inspection devices using the associated particle technique. MODAR exploits ROOT's Graphical User Interface and functionalities to visualize and process MCNP simulation results in a fast and user-friendly way. MODAR makes it possible to take into account the detection system's time resolution (which is not possible with MCNP) as well as the detectors' energy response function and counting statistics in a straightforward way. Program summary: Program title: MODAR. Catalogue identifier: AEGA_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGA_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 155 373. No. of bytes in distributed program, including test data, etc.: 14 815 461. Distribution format: tar.gz. Programming language: C++. Computer: Most Unix workstations and PC. Operating system: Most Unix systems, Linux and Windows, provided the ROOT package has been installed; examples were tested under Suse Linux and Windows XP. RAM: Depends on the size of the MCNP output file. The example presented in the article, which involves three two-dimensional 139×740 bin histograms, allocates about 60 MB; these data are running under ROOT and include consumption by ROOT itself. Classification: 17.6. External routines: ROOT version 5.24.00 (http://root.cern.ch/drupal/). Nature of problem: The output of an MCNP simulation is an ASCII file. The data processing is usually performed by copying and pasting the relevant parts of the ASCII file into Microsoft Excel. Such an approach is satisfactory when the quantity of data is small but is not efficient when the size of the simulated data is large, for example when time-energy correlations are studied in detail, such as in problems involving the associated particle technique. In addition, since the finite time resolution of the simulated detector cannot be modeled with MCNP, systems in which time-energy correlation is crucial cannot be described in a satisfactory way. Finally, realistic particle energy deposit in detectors is calculated with MCNP in a two-step process involving type-5 then type-8 tallies. In the first step, the photon flux energy spectrum associated with a time region is selected and serves as a source energy distribution for the second step. Thus, several files must be manipulated before getting the result, which can be time consuming if one needs to study several time regions or different detector performances. In the same way, modeling counting statistics obtained in a limited acquisition time requires several steps and can also be time consuming. Solution method: In order to overcome the previous limitations, the MODAR C++ code has been written to make use of CERN's ROOT data analysis software. MCNP output data are read from the MCNP output file with dedicated routines. Two-dimensional histograms are filled and can be handled efficiently within the ROOT framework. To keep the analysis tool user friendly, all processing and data display can be done by means of the ROOT Graphical User Interface. Specific routines have been written to include the detectors' finite time resolution and energy response function as well as counting statistics in a straightforward way.
Additional comments: The possibility of adding tallies has also been incorporated in MODAR in order to describe systems in which the signal from several detectors can be summed. Moreover, MODAR can be adapted to handle other problems involving two-dimensional data. Running time: The CPU time needed to smear a two-dimensional histogram depends on the size of the histogram. In the presented example, the time-energy smearing of one of the 139×740 two-dimensional histograms takes 3 minutes with a DELL computer equipped with INTEL Core 2.

  5. A scalable method to improve gray matter segmentation at ultra high field MRI.

    PubMed

    Gulban, Omer Faruk; Schneider, Marian; Marquardt, Ingo; Haast, Roy A M; De Martino, Federico

    2018-01-01

    High-resolution (functional) magnetic resonance imaging (MRI) at ultra high magnetic fields (7 Tesla and above) enables researchers to study how anatomical and functional properties change within the cortical ribbon, along surfaces and across cortical depths. These studies require an accurate delineation of the gray matter ribbon, which often suffers from inclusion of blood vessels, dura mater and other non-brain tissue. Residual segmentation errors are commonly corrected by browsing the data slice-by-slice and manually changing labels. This task becomes increasingly laborious and prone to error at higher resolutions since both work and error scale with the number of voxels. Here we show that many mislabeled, non-brain voxels can be corrected more efficiently and semi-automatically by representing three-dimensional anatomical images using two-dimensional histograms. We propose both a uni-modal (based on first spatial derivative) and multi-modal (based on compositional data analysis) approach to this representation and quantify the benefits in 7 Tesla MRI data of nine volunteers. We present an openly accessible Python implementation of these approaches and demonstrate that editing cortical segmentations using two-dimensional histogram representations as an additional post-processing step aids existing algorithms and yields improved gray matter borders. By making our data and corresponding expert (ground truth) segmentations openly available, we facilitate future efforts to develop and test segmentation algorithms on this challenging type of data.
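    The uni-modal variant described above reduces each voxel to a point in a 2D histogram of intensity versus gradient magnitude, so that groups of mislabeled voxels (e.g. bright, high-gradient vessels) can be selected at once. The sketch below is a minimal illustration of that idea, not the published implementation.

      import numpy as np

      def intensity_gradient_histogram(volume, bins=200):
          """2D histogram of (intensity, gradient magnitude) over all voxels of a 3D image."""
          gx, gy, gz = np.gradient(volume.astype(float))
          gmag = np.sqrt(gx**2 + gy**2 + gz**2)
          hist, i_edges, g_edges = np.histogram2d(volume.ravel(), gmag.ravel(), bins=bins)
          return hist, i_edges, g_edges, gmag

      def voxels_in_box(volume, gmag, i_lo, i_hi, g_lo, g_hi):
          """Boolean mask of voxels whose (intensity, gradient) falls inside a selected
          rectangle of the 2D histogram, e.g. for relabeling vessels or dura."""
          return (volume >= i_lo) & (volume < i_hi) & (gmag >= g_lo) & (gmag < g_hi)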

  6. A scalable method to improve gray matter segmentation at ultra high field MRI

    PubMed Central

    De Martino, Federico

    2018-01-01

    High-resolution (functional) magnetic resonance imaging (MRI) at ultra high magnetic fields (7 Tesla and above) enables researchers to study how anatomical and functional properties change within the cortical ribbon, along surfaces and across cortical depths. These studies require an accurate delineation of the gray matter ribbon, which often suffers from inclusion of blood vessels, dura mater and other non-brain tissue. Residual segmentation errors are commonly corrected by browsing the data slice-by-slice and manually changing labels. This task becomes increasingly laborious and prone to error at higher resolutions since both work and error scale with the number of voxels. Here we show that many mislabeled, non-brain voxels can be corrected more efficiently and semi-automatically by representing three-dimensional anatomical images using two-dimensional histograms. We propose both a uni-modal (based on first spatial derivative) and multi-modal (based on compositional data analysis) approach to this representation and quantify the benefits in 7 Tesla MRI data of nine volunteers. We present an openly accessible Python implementation of these approaches and demonstrate that editing cortical segmentations using two-dimensional histogram representations as an additional post-processing step aids existing algorithms and yields improved gray matter borders. By making our data and corresponding expert (ground truth) segmentations openly available, we facilitate future efforts to develop and test segmentation algorithms on this challenging type of data. PMID:29874295

  7. Nonlinear dimensionality reduction of CT histogram based feature space for predicting recurrence-free survival in non-small-cell lung cancer

    NASA Astrophysics Data System (ADS)

    Kawata, Y.; Niki, N.; Ohmatsu, H.; Aokage, K.; Kusumoto, M.; Tsuchida, T.; Eguchi, K.; Kaneko, M.

    2015-03-01

    High-resolution CT scanners have improved the detection of lung cancers. Recently released positive results from the National Lung Screening Trial (NLST) in the US show that CT screening does in fact have a positive impact on the reduction of lung cancer related mortality. While this study shows the efficacy of CT based screening, physicians often face the problem of deciding appropriate management strategies for maximizing patient survival and preserving lung function. Several key manifold-learning approaches efficiently reveal intrinsic low-dimensional structures latent in high-dimensional data spaces. This study was performed to investigate whether dimensionality reduction can identify embedded structures in the CT histogram feature space of non-small-cell lung cancer (NSCLC) to improve the performance in predicting the likelihood of recurrence-free survival (RFS) for patients with NSCLC.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Font, Joan; Beckman, John E.; Fathi, Kambiz

    In this Letter, we introduce a technique for finding resonance radii in a disk galaxy. We use a two-dimensional velocity field in H{alpha} emission obtained with Fabry-Perot interferometry, derive the classical rotation curve, and subtract it off, leaving a residual velocity map. As the streaming motions should reverse sign at corotation, we detect these reversals and plot them in a histogram against galactocentric radius, excluding points where the amplitude of the reversal is smaller than the measurement uncertainty. The histograms show well-defined peaks which we assume to occur at resonance radii, identifying corotations as the most prominent peaks corresponding to the relevant morphological features of the galaxy (notably bars and spiral arm systems). We compare our results with published measurements on the same galaxies using other methods and different types of data.

  9. Abstracting Attribute Space for Transfer Function Exploration and Design.

    PubMed

    Maciejewski, Ross; Jang, Yun; Woo, Insoo; Jänicke, Heike; Gaither, Kelly P; Ebert, David S

    2013-01-01

    Currently, user centered transfer function design begins with the user interacting with a one or two-dimensional histogram of the volumetric attribute space. The attribute space is visualized as a function of the number of voxels, allowing the user to explore the data in terms of the attribute size/magnitude. However, such visualizations provide the user with no information on the relationship between various attribute spaces (e.g., density, temperature, pressure, x, y, z) within the multivariate data. In this work, we propose a modification to the attribute space visualization in which the user is no longer presented with the magnitude of the attribute; instead, the user is presented with an information metric detailing the relationship between attributes of the multivariate volumetric data. In this way, the user can guide their exploration based on the relationship between the attribute magnitude and user selected attribute information as opposed to being constrained by only visualizing the magnitude of the attribute. We refer to this modification to the traditional histogram widget as an abstract attribute space representation. Our system utilizes common one and two-dimensional histogram widgets where the bins of the abstract attribute space now correspond to an attribute relationship in terms of the mean, standard deviation, entropy, or skewness. In this manner, we exploit the relationships and correlations present in the underlying data with respect to the dimension(s) under examination. These relationships are often times key to insight and allow us to guide attribute discovery as opposed to automatic extraction schemes which try to calculate and extract distinct attributes a priori. In this way, our system aids in the knowledge discovery of the interaction of properties within volumetric data.
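    Replacing per-bin voxel counts with a per-bin information metric is essentially a binned statistic over the attribute space; a minimal SciPy sketch (attribute names and bin counts are illustrative, and the entropy variant is one possible metric among those mentioned):

      import numpy as np
      from scipy.stats import binned_statistic_2d

      def abstract_attribute_histogram(attr_x, attr_y, attr_value, bins=64, metric="mean"):
          """Summarize a third attribute (mean, std, entropy, ...) in each (attr_x, attr_y)
          bin instead of counting voxels."""
          if metric == "entropy":
              def metric(values):                    # Shannon entropy of a coarse histogram
                  counts, _ = np.histogram(values, bins=16)
                  p = counts[counts > 0] / max(counts.sum(), 1)
                  return float(-(p * np.log(p)).sum())
          stat, x_edges, y_edges, _ = binned_statistic_2d(
              attr_x.ravel(), attr_y.ravel(), attr_value.ravel(),
              statistic=metric, bins=bins)
          return stat, x_edges, y_edges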

  10. Kinetics of Surface-Mediated Fibrillization of Amyloid-β (12-28) Peptides.

    PubMed

    Lin, Yi-Chih; Li, Chen; Fakhraai, Zahra

    2018-04-17

    Surfaces or interfaces are considered to be key factors in facilitating the formation of amyloid fibrils under physiological conditions. In this report, we study the kinetics of the surface-mediated fibrillization (SMF) of an amyloid-β fragment (Aβ 12-28 ) on mica. We employ a spin-coating-based drying procedure to control the exposure time of the substrate to a low-concentration peptide solution and then monitor the fibril growth as a function of time via atomic force microscopy (AFM). The evolution of surface-mediated fibril growth is quantitatively characterized in terms of the length histogram of imaged fibrils and their surface concentration. A two-dimensional (2D) kinetic model is proposed to numerically simulate the length evolution of surface-mediated fibrils by assuming a diffusion-limited aggregation (DLA) process along with size-dependent rate constants. We find that both monomer and fibril diffusion on the surface are required to obtain length histograms as a function of time that resemble those observed in experiments. The best-fit simulated data can accurately describe the key features of experimental length histograms and suggests that the mobility of loosely bound amyloid species is crucial in regulating the kinetics of SMF. We determine that the mobility exponent for the size dependence of the DLA rate constants is α = 0.55 ± 0.05, which suggests that the diffusion of loosely bound surface fibrils roughly depends on the inverse of the square root of their size. These studies elucidate the influence of deposition rate and surface diffusion on the formation of amyloid fibrils through SMF. The method used here can be broadly adopted to study the diffusion and aggregation of peptides or proteins on various surfaces to investigate the role of chemical interactions in two-dimensional fibril formation and diffusion.

  11. BIBLIOGRAPHY ON CURRICULUM DEVELOPMENT. SUPPLEMENT I.

    ERIC Educational Resources Information Center

    Harvard Univ., Cambridge, MA. Graduate School of Education.

    THIS BIBLIOGRAPHY (SUPPLEMENT I) LISTS MATERIALS ON VARIOUS ASPECTS OF CURRICULUM DEVELOPMENT. EIGHTY-TWO UNANNOTATED REFERENCES ARE PROVIDED FOR DOCUMENTS DATING FROM 1961 TO 1966. BOOKS, JOURNALS, REPORT MATERIALS, AND SOME UNPUBLISHED MANUSCRIPTS ARE LISTED IN SUCH AREAS AS EDUCATIONAL GAMES, CURRICULUM CHANGE, CONCEPT DEVELOPMENT, PROGRAM…

  12. Assessment of Arterial Wall Enhancement for Differentiation of Parent Artery Disease from Small Artery Disease: Comparison between Histogram Analysis and Visual Analysis on 3-Dimensional Contrast-Enhanced T1-Weighted Turbo Spin Echo MR Images at 3T.

    PubMed

    Jang, Jinhee; Kim, Tae-Won; Hwang, Eo-Jin; Choi, Hyun Seok; Koo, Jaseong; Shin, Yong Sam; Jung, So-Lyung; Ahn, Kook-Jin; Kim, Bum-Soo

    2017-01-01

    The purpose of this study was to compare histogram analysis and visual scores in 3T MRI assessment of middle cerebral arterial wall enhancement in patients with acute stroke, for the differentiation of parent artery disease (PAD) from small artery disease (SAD). Among 82 consecutive patients seen in a tertiary hospital over one year, 25 patients with acute infarcts in the middle cerebral artery (MCA) territory were included in this study, comprising 15 patients with PAD and 10 patients with SAD. Three-dimensional contrast-enhanced T1-weighted turbo spin echo MR images with black-blood preparation at 3T were analyzed both qualitatively and quantitatively. The degree of MCA stenosis and visual and histogram assessments of MCA wall enhancement were evaluated. A statistical analysis was performed to compare diagnostic accuracy between qualitative and quantitative metrics. The degree of stenosis, visual enhancement score, geometric mean (GM), and the 90th percentile (90P) value from the histogram analysis were significantly higher in PAD than in SAD (p = 0.006 for stenosis, < 0.001 for the others). The areas under the receiver operating characteristic curves for GM and 90P were 1.00 (95% confidence interval [CI], 0.86-1.00). Histogram analysis of the relevant arterial wall enhancement allows differentiation between PAD and SAD in patients with acute stroke within the MCA territory.

  13. Exploring the dark foldable proteome by considering hydrophobic amino acids topology

    PubMed Central

    Bitard-Feildel, Tristan; Callebaut, Isabelle

    2017-01-01

    The protein universe corresponds to the set of all proteins found in all organisms. One way to explore it is to consider the domain content of the proteins. However, parts of many sequences and many entire sequences remain un-annotated despite a converging number of domain families. The un-annotated part of the protein universe is referred to as the dark proteome and remains poorly characterized. In this study, we quantify the amount of foldable domains within the dark proteome by using the hydrophobic cluster analysis methodology. These un-annotated foldable domains were grouped using a combination of remote homology searches and domain annotations, leading to the definition of different levels of darkness. The dark foldable domains were analyzed to understand what makes them different from domains stored in databases and thus difficult to annotate. The un-annotated domains of the dark proteome display specific features relative to database domains: shorter length, non-canonical content and particular topology of hydrophobic residues, a higher propensity for disorder, and a higher energy. These features make them hard to relate to known families. Based on these observations, we emphasize that domain annotation methodologies can still be improved to fully apprehend and decipher the molecular evolution of the protein universe. PMID:28134276

  14. Method and apparatus for detecting dilute concentrations of radioactive xenon in samples of xenon extracted from the atmosphere

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Warburton, William K.; Hennig, Wolfgang G.

    A method and apparatus for measuring the concentrations of radioxenon isotopes in a gaseous sample wherein the sample cell is surrounded by N sub-detectors that are sensitive to both electrons and to photons from radioxenon decays. Signal processing electronics are provided that can detect events within the sub-detectors, measure their energies, determine whether they arise from electrons or photons, and detect coincidences between events within the same or different sub-detectors. The energies of detected two or three event coincidences are recorded as points in associated two or three-dimensional histograms. Counts within regions of interest in the histograms are then used to compute estimates of the radioxenon isotope concentrations. The method achieves lower backgrounds and lower minimum detectable concentrations by using smaller detector crystals, eliminating interference between double and triple coincidence decay branches, and segregating double coincidences within the same sub-detector from those occurring between different sub-detectors.
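    The two-event part of this scheme reduces to filling a 2D histogram with coincident (electron energy, photon energy) pairs and counting events inside rectangular regions of interest; a minimal NumPy sketch with illustrative energy units and window values:

      import numpy as np

      def coincidence_histogram(electron_kev, photon_kev, bins=256, e_max=1000.0):
          """2D histogram of coincident (electron energy, photon energy) pairs."""
          hist, e_edges, p_edges = np.histogram2d(
              electron_kev, photon_kev, bins=bins, range=[[0.0, e_max], [0.0, e_max]])
          return hist, e_edges, p_edges

      def roi_counts(electron_kev, photon_kev, e_window, p_window):
          """Number of coincidences inside one rectangular region of interest
          (a beta-gamma branch of a given radioxenon isotope; windows are illustrative)."""
          e = np.asarray(electron_kev)
          p = np.asarray(photon_kev)
          sel = ((e >= e_window[0]) & (e < e_window[1]) &
                 (p >= p_window[0]) & (p < p_window[1]))
          return int(sel.sum())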

  15. LSAH: a fast and efficient local surface feature for point cloud registration

    NASA Astrophysics Data System (ADS)

    Lu, Rongrong; Zhu, Feng; Wu, Qingxiao; Kong, Yanzi

    2018-04-01

    Point cloud registration is a fundamental task in high-level three-dimensional applications. Noise, uneven point density, and varying point cloud resolutions are the three main challenges for point cloud registration. In this paper, we design a robust and compact local surface descriptor called the Local Surface Angles Histogram (LSAH) and propose an effective coarse-to-fine algorithm for point cloud registration. The LSAH descriptor is formed by concatenating five normalized sub-histograms into one histogram, where each sub-histogram is created by accumulating a different type of angle over a local surface patch. The experimental results show that LSAH is more robust to uneven point density and varying point cloud resolutions than four state-of-the-art local descriptors in terms of feature matching. Moreover, we tested our LSAH-based coarse-to-fine algorithm for point cloud registration, and the experimental results demonstrate that the algorithm is robust and efficient as well.

  16. SHARE: system design and case studies for statistical health information release

    PubMed Central

    Gardner, James; Xiong, Li; Xiao, Yonghui; Gao, Jingjing; Post, Andrew R; Jiang, Xiaoqian; Ohno-Machado, Lucila

    2013-01-01

    Objectives We present SHARE, a new system for statistical health information release with differential privacy. We present two case studies that evaluate the software on real medical datasets and demonstrate the feasibility and utility of applying the differential privacy framework on biomedical data. Materials and Methods SHARE releases statistical information in electronic health records with differential privacy, a strong privacy framework for statistical data release. It includes a number of state-of-the-art methods for releasing multidimensional histograms and longitudinal patterns. We performed a variety of experiments on two real datasets, the surveillance, epidemiology and end results (SEER) breast cancer dataset and the Emory electronic medical record (EeMR) dataset, to demonstrate the feasibility and utility of SHARE. Results Experimental results indicate that SHARE can deal with heterogeneous data present in medical data, and that the released statistics are useful. The Kullback–Leibler divergence between the released multidimensional histograms and the original data distribution is below 0.5 and 0.01 for seven-dimensional and three-dimensional data cubes generated from the SEER dataset, respectively. The relative error for longitudinal pattern queries on the EeMR dataset varies between 0 and 0.3. While the results are promising, they also suggest that challenges remain in applying statistical data release using the differential privacy framework for higher dimensional data. Conclusions SHARE is one of the first systems to provide a mechanism for custodians to release differentially private aggregate statistics for a variety of use cases in the medical domain. This proof-of-concept system is intended to be applied to large-scale medical data warehouses. PMID:23059729

  17. Isobio software: biological dose distribution and biological dose volume histogram from physical dose conversion using linear-quadratic-linear model.

    PubMed

    Jaikuna, Tanwiwat; Khadsiri, Phatchareewan; Chawapun, Nisa; Saekho, Suwit; Tharavichitkul, Ekkasit

    2017-02-01

    To develop an in-house software program that can calculate and generate the biological dose distribution and biological dose volume histogram by physical dose conversion using the linear-quadratic-linear (LQL) model. The Isobio software was developed using MATLAB version 2014b to calculate and generate the biological dose distribution and biological dose volume histograms. The physical dose from each voxel in the treatment plan was extracted through the Computational Environment for Radiotherapy Research (CERR), and the accuracy was verified from the difference between the dose volume histogram from CERR and that from the treatment planning system. The equivalent dose in 2 Gy fractions (EQD2) was calculated using the biologically effective dose (BED) based on the LQL model. The software calculation and the manual calculation were compared for EQD2 verification with a paired t-test using IBM SPSS Statistics version 22 (64-bit). Two- and three-dimensional biological dose distributions and biological dose volume histograms were displayed correctly by the Isobio software. Differences in physical dose were found between CERR and the treatment planning system (TPS), with 3.33% in the high-risk clinical target volume (HR-CTV) determined by D90%, 0.56% in the bladder and 1.74% in the rectum determined by D2cc in Oncentra, and less than 1% in Pinnacle. The difference in EQD2 between the software calculation and the manual calculation was not significant (0.00%), with p-values of 0.820, 0.095, and 0.593 for external beam radiation therapy (EBRT) and 0.240, 0.320, and 0.849 for brachytherapy (BT) in the HR-CTV, bladder, and rectum, respectively. The Isobio software is a feasible tool for generating the biological dose distribution and biological dose volume histogram for treatment plan evaluation in both EBRT and BT.
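    For doses per fraction below the LQL transition dose, the conversion reduces to the familiar linear-quadratic expressions; the sketch below shows that baseline BED/EQD2 relation (not the full LQL model implemented in Isobio), with an illustrative alpha/beta value.

      def eqd2_from_physical_dose(total_dose_gy, dose_per_fraction_gy, alpha_beta_gy=3.0):
          """Equivalent dose in 2 Gy fractions from total physical dose and fraction size.

          Standard LQ relations: BED = D * (1 + d / (a/b)); EQD2 = BED / (1 + 2 / (a/b)).
          alpha_beta_gy = 3 Gy is a typical late-responding-tissue value (illustrative)."""
          bed = total_dose_gy * (1.0 + dose_per_fraction_gy / alpha_beta_gy)
          return bed / (1.0 + 2.0 / alpha_beta_gy)

      # Example: 60 Gy delivered in 2 Gy fractions maps to EQD2 = 60 Gy.
      # eqd2_from_physical_dose(60.0, 2.0)  ->  60.0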

  18. Elucidating the effects of adsorbent flexibility on fluid adsorption using simple models and flat-histogram sampling methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shen, Vincent K., E-mail: vincent.shen@nist.gov; Siderius, Daniel W.

    2014-06-28

    Using flat-histogram Monte Carlo methods, we investigate the adsorptive behavior of the square-well fluid in two simple slit-pore-like models intended to capture fundamental characteristics of flexible adsorbent materials. Both models require as input thermodynamic information about the flexible adsorbent material itself. An important component of this work involves formulating the flexible pore models in the appropriate thermodynamic (statistical mechanical) ensembles, namely, the osmotic ensemble and a variant of the grand-canonical ensemble. Two-dimensional probability distributions, which are calculated using flat-histogram methods, provide the information necessary to determine adsorption thermodynamics. For example, we are able to determine precisely adsorption isotherms, (equilibrium) phase transition conditions, limits of stability, and free energies for a number of different flexible adsorbent materials, distinguishable as different inputs into the models. While the models used in this work are relatively simple from a geometric perspective, they yield non-trivial adsorptive behavior, including adsorption-desorption hysteresis solely due to material flexibility and so-called “breathing” of the adsorbent. The observed effects can in turn be tied to the inherent properties of the bare adsorbent. Some of the effects are expected on physical grounds while others arise from a subtle balance of thermodynamic and mechanical driving forces. In addition, the computational strategy presented here can be easily applied to more complex models for flexible adsorbents.

  19. Elucidating the effects of adsorbent flexibility on fluid adsorption using simple models and flat-histogram sampling methods

    NASA Astrophysics Data System (ADS)

    Shen, Vincent K.; Siderius, Daniel W.

    2014-06-01

    Using flat-histogram Monte Carlo methods, we investigate the adsorptive behavior of the square-well fluid in two simple slit-pore-like models intended to capture fundamental characteristics of flexible adsorbent materials. Both models require as input thermodynamic information about the flexible adsorbent material itself. An important component of this work involves formulating the flexible pore models in the appropriate thermodynamic (statistical mechanical) ensembles, namely, the osmotic ensemble and a variant of the grand-canonical ensemble. Two-dimensional probability distributions, which are calculated using flat-histogram methods, provide the information necessary to determine adsorption thermodynamics. For example, we are able to determine precisely adsorption isotherms, (equilibrium) phase transition conditions, limits of stability, and free energies for a number of different flexible adsorbent materials, distinguishable as different inputs into the models. While the models used in this work are relatively simple from a geometric perspective, they yield non-trivial adsorptive behavior, including adsorption-desorption hysteresis solely due to material flexibility and so-called "breathing" of the adsorbent. The observed effects can in turn be tied to the inherent properties of the bare adsorbent. Some of the effects are expected on physical grounds while others arise from a subtle balance of thermodynamic and mechanical driving forces. In addition, the computational strategy presented here can be easily applied to more complex models for flexible adsorbents.
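    As a generic illustration of what "flat-histogram sampling" means (a Wang-Landau-style sketch, not the ensembles, models, or code used in this work), the following estimates a density of states by forcing the visit histogram to become flat; the toy system of M independent two-level units is an assumption chosen only because its exact density of states is a binomial coefficient.

```python
import numpy as np
from math import comb

# Wang-Landau flat-histogram sketch: estimate ln g(E) for M two-level units,
# where E = number of excited units and the exact answer is ln C(M, E).
rng = np.random.default_rng(0)
M = 20
lng = np.zeros(M + 1)            # running estimate of ln g(E)
f = 1.0                          # modification factor added to ln g on each visit
state = int(rng.integers(0, M + 1))

while f > 1e-6:
    hist = np.zeros(M + 1)
    while True:
        prop = state + int(rng.choice([-1, 1]))
        # accept with min(1, g(old)/g(new)); rejected moves re-count the old state
        if 0 <= prop <= M and rng.random() < np.exp(lng[state] - lng[prop]):
            state = prop
        lng[state] += f
        hist[state] += 1
        if hist.min() > 0.8 * hist.mean():   # "flat enough" criterion
            break
    f *= 0.5                                  # refine the estimate and repeat

lng -= lng[0]                                 # fix normalization so g(0) = 1
exact = np.array([np.log(comb(M, e)) for e in range(M + 1)])
print(np.max(np.abs(lng - exact)))            # residual error vs the exact result (typically small)
```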

  20. Biomorphic networks: approach to invariant feature extraction and segmentation for ATR

    NASA Astrophysics Data System (ADS)

    Baek, Andrew; Farhat, Nabil H.

    1998-10-01

    Invariant features in two dimensional binary images are extracted in a single layer network of locally coupled spiking (pulsating) model neurons with prescribed synapto-dendritic response. The feature vector for an image is represented as invariant structure in the aggregate histogram of interspike intervals obtained by computing time intervals between successive spikes produced from each neuron over a given period of time and combining such intervals from all neurons in the network into a histogram. Simulation results show that the feature vectors are more pattern-specific and invariant under translation, rotation, and change in scale or intensity than achieved in earlier work. We also describe an application of such networks to segmentation of line (edge-enhanced or silhouette) images. The biomorphic spiking network's capabilities in segmentation and invariant feature extraction may prove to be, when they are combined, valuable in Automated Target Recognition (ATR) and other automated object recognition systems.
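    A minimal sketch of the aggregate interspike-interval (ISI) histogram used here as a feature vector is given below; the spike trains are synthetic stand-ins, since reproducing the locally coupled spiking network itself is beyond a short example, and the bin count and range are arbitrary choices.

```python
import numpy as np

# Pool intervals between successive spikes of every neuron into one histogram
# and use the (normalized) histogram as the image's feature vector.
rng = np.random.default_rng(1)
n_neurons = 64
spike_trains = [np.sort(rng.uniform(0.0, 1.0, rng.integers(5, 30)))
                for _ in range(n_neurons)]            # spike times in seconds (synthetic)

all_isis = np.concatenate([np.diff(t) for t in spike_trains if t.size > 1])
feature_vector, bin_edges = np.histogram(all_isis, bins=50, range=(0.0, 0.5))
feature_vector = feature_vector / feature_vector.sum()   # normalization helps scale invariance
print(feature_vector.shape)                               # (50,) pooled ISI histogram
```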

  1. Querying Patterns in High-Dimensional Heterogenous Datasets

    ERIC Educational Resources Information Center

    Singh, Vishwakarma

    2012-01-01

    The recent technological advancements have led to the availability of a plethora of heterogenous datasets, e.g., images tagged with geo-location and descriptive keywords. An object in these datasets is described by a set of high-dimensional feature vectors. For example, a keyword-tagged image is represented by a color-histogram and a…

  2. Modeling semantic aspects for cross-media image indexing.

    PubMed

    Monay, Florent; Gatica-Perez, Daniel

    2007-10-01

    To go beyond the query-by-example paradigm in image retrieval, there is a need for semantic indexing of large image collections for intuitive text-based image search. Different models have been proposed to learn the dependencies between the visual content of an image set and the associated text captions, then allowing for the automatic creation of semantic indices for unannotated images. The task, however, remains unsolved. In this paper, we present three alternatives to learn a Probabilistic Latent Semantic Analysis model (PLSA) for annotated images, and evaluate their respective performance for automatic image indexing. Under the PLSA assumptions, an image is modeled as a mixture of latent aspects that generates both image features and text captions, and we investigate three ways to learn the mixture of aspects. We also propose a more discriminative image representation than the traditional Blob histogram, concatenating quantized local color information and quantized local texture descriptors. The first learning procedure of a PLSA model for annotated images is a standard EM algorithm, which implicitly assumes that the visual and the textual modalities can be treated equivalently. The other two models are based on an asymmetric PLSA learning, allowing the definition of the latent space to be constrained by either the visual or the textual modality. We demonstrate that the textual modality is more appropriate to learn a semantically meaningful latent space, which translates into improved annotation performance. A comparison of our learning algorithms with respect to recent methods on a standard dataset is presented, and a detailed evaluation of the performance shows the validity of our framework.

  3. Three-Dimensional Object Recognition and Registration for Robotic Grasping Systems Using a Modified Viewpoint Feature Histogram

    PubMed Central

    Chen, Chin-Sheng; Chen, Po-Chun; Hsu, Chih-Ming

    2016-01-01

    This paper presents a novel 3D feature descriptor for object recognition and for identifying six-degree-of-freedom poses in mobile manipulation and grasping applications. Firstly, a Microsoft Kinect sensor is used to capture 3D point cloud data. A viewpoint feature histogram (VFH) descriptor for the 3D point cloud data then encodes the geometry and viewpoint, so an object can be simultaneously recognized and registered in a stable pose and the information is stored in a database. The VFH is robust to a large degree of surface noise and missing depth information so it is reliable for stereo data. However, the pose estimation for an object fails when the object is placed symmetrically to the viewpoint. To overcome this problem, this study proposes a modified viewpoint feature histogram (MVFH) descriptor that consists of two parts: a surface shape component that comprises an extended fast point feature histogram and an extended viewpoint direction component. The MVFH descriptor characterizes an object’s pose and enhances the system’s ability to identify objects with mirrored poses. Finally, once the object has been recognized, its pose roughly estimated by the MVFH descriptor, and the match registered against the database, the pose is refined using an iterative closest point algorithm. The estimation results demonstrate that the MVFH feature descriptor allows more accurate pose estimation. The experiments also show that the proposed method can be applied in vision-guided robotic grasping systems. PMID:27886080

  4. Simple Math is Enough: Two Examples of Inferring Functional Associations from Genomic Data

    NASA Technical Reports Server (NTRS)

    Liang, Shoudan

    2003-01-01

    Non-random features in the genomic data are usually biologically meaningful. The key is to choose the feature well. Having a p-value based score prioritizes the findings. If two proteins share an unusually large number of common interaction partners, they tend to be involved in the same biological process. We used this finding to predict the functions of 81 un-annotated proteins in yeast.
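    The "unusually large number of common partners" idea can be scored with a hypergeometric tail probability, as in the hedged sketch below; the network size and partner counts are made-up numbers, and scipy is assumed to be available.

```python
from scipy.stats import hypergeom

# Given N proteins in the network, protein A with nA partners and protein B with
# nB partners, the chance of seeing at least k shared partners at random follows
# a hypergeometric tail. A small p-value suggests shared biological process.
N, nA, nB, k = 6000, 40, 55, 12          # illustrative values only
p_value = hypergeom.sf(k - 1, N, nA, nB)  # P(shared partners >= k)
print(f"p = {p_value:.2e}")
```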

  5. MCNP output data analysis with ROOT (MODAR)

    NASA Astrophysics Data System (ADS)

    Carasco, C.

    2010-12-01

    MCNP Output Data Analysis with ROOT (MODAR) is a tool based on CERN's ROOT software. MODAR has been designed to handle time-energy data produced by MCNP simulations of neutron inspection devices using the associated particle technique. MODAR exploits ROOT's Graphical User Interface and functionalities to visualize and process MCNP simulation results in a fast and user-friendly way. MODAR makes it possible to take into account the detection system time resolution (which is not possible with MCNP) as well as detector energy response functions and counting statistics in a straightforward way.
    New version program summary
    Program title: MODAR
    Catalogue identifier: AEGA_v1_1
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGA_v1_1.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 150 927
    No. of bytes in distributed program, including test data, etc.: 4 981 633
    Distribution format: tar.gz
    Programming language: C++
    Computer: Most Unix workstations and PCs
    Operating system: Most Unix systems, Linux and Windows, provided the ROOT package has been installed. Examples were tested under Suse Linux and Windows XP.
    RAM: Depends on the size of the MCNP output file. The example presented in the article, which involves three two-dimensional 139×740-bin histograms, allocates about 60 MB. These data are running under ROOT and include consumption by ROOT itself.
    Classification: 17.6
    Catalogue identifier of previous version: AEGA_v1_0
    Journal reference of previous version: Comput. Phys. Comm. 181 (2010) 1161
    External routines: ROOT version 5.24.00 (http://root.cern.ch/drupal/)
    Does the new version supersede the previous version?: Yes
    Nature of problem: The output of an MCNP simulation is an ASCII file. The data processing is usually performed by copying and pasting the relevant parts of the ASCII file into Microsoft Excel. Such an approach is satisfactory when the quantity of data is small but is not efficient when the size of the simulated data is large, for example when time-energy correlations are studied in detail, such as in problems involving the associated particle technique. In addition, since the finite time resolution of the simulated detector cannot be modeled with MCNP, systems in which time-energy correlation is crucial cannot be described in a satisfactory way. Finally, realistic particle energy deposit in detectors is calculated with MCNP in a two-step process involving type-5 then type-8 tallies. In the first step, the photon flux energy spectrum associated with a time region is selected and serves as a source energy distribution for the second step. Thus, several files must be manipulated before getting the result, which can be time consuming if one needs to study several time regions or different detector performances. In the same way, modeling counting statistics obtained in a limited acquisition time requires several steps and can also be time consuming.
    Solution method: In order to overcome the previous limitations, the MODAR C++ code has been written to make use of CERN's ROOT data analysis software. MCNP output data are read from the MCNP output file with dedicated routines. Two-dimensional histograms are filled and can be handled efficiently within the ROOT framework. To keep a user-friendly analysis tool, all processing and data display can be done by means of the ROOT Graphical User Interface. Specific routines have been written to include the detector's finite time resolution and energy response function as well as counting statistics in a straightforward way.
    Reasons for new version: For applications involving the associated particle technique, a large number of gamma rays are produced by the fast neutron interactions. To study the energy spectra, it is useful to identify the gamma-ray energy peaks in a straightforward way. Therefore, the possibility to show gamma rays corresponding to specific reactions has been added in MODAR.
    Summary of revisions: It is possible to use a gamma-ray database to better identify gamma-ray peaks, with their first and second escapes, in the energy spectra. Histograms can be scaled by the number of source particles to evaluate the number of counts that is expected without statistical uncertainties.
    Additional comments: The possibility of adding tallies has also been incorporated in MODAR in order to describe systems in which the signal from several detectors can be summed. Moreover, MODAR can be adapted to handle other problems involving two-dimensional data.
    Running time: The CPU time needed to smear a two-dimensional histogram depends on the size of the histogram. In the presented example, the time-energy smearing of one of the 139×740 two-dimensional histograms takes 3 minutes with a DELL computer equipped with an INTEL Core 2.
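    The time-resolution and energy-response folding that MODAR adds on top of raw MCNP tallies can be pictured as a Gaussian smearing of a two-dimensional time-energy histogram. The sketch below does this with plain numpy/scipy rather than ROOT, and every bin width and resolution value is invented for illustration.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Fold an assumed detector time resolution and energy response into a simulated
# time-energy histogram by Gaussian smearing along each axis (in bin units).
time_bins, energy_bins = 139, 740
counts = np.random.default_rng(2).poisson(1.0, size=(time_bins, energy_bins)).astype(float)

time_res_ns, ns_per_bin = 1.0, 0.5          # assumed 1 ns time resolution, 0.5 ns bins
energy_res_keV, keV_per_bin = 50.0, 10.0    # assumed 50 keV resolution, 10 keV bins
sigma = (time_res_ns / ns_per_bin, energy_res_keV / keV_per_bin)

smeared = gaussian_filter(counts, sigma=sigma)   # response-folded histogram
print(counts.sum(), smeared.sum())               # smearing redistributes counts; totals match closely
```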

  6. Local intensity area descriptor for facial recognition in ideal and noise conditions

    NASA Astrophysics Data System (ADS)

    Tran, Chi-Kien; Tseng, Chin-Dar; Chao, Pei-Ju; Ting, Hui-Min; Chang, Liyun; Huang, Yu-Jie; Lee, Tsair-Fwu

    2017-03-01

    We propose a local texture descriptor, local intensity area descriptor (LIAD), which is applied for human facial recognition in ideal and noisy conditions. Each facial image is divided into small regions from which LIAD histograms are extracted and concatenated into a single feature vector to represent the facial image. The recognition is performed using a nearest neighbor classifier with histogram intersection and chi-square statistics as dissimilarity measures. Experiments were conducted with LIAD using the ORL database of faces (Olivetti Research Laboratory, Cambridge), the Face94 face database, the Georgia Tech face database, and the FERET database. The results demonstrated the improvement in accuracy of our proposed descriptor compared to conventional descriptors [local binary pattern (LBP), uniform LBP, local ternary pattern, histogram of oriented gradients, and local directional pattern]. Moreover, the proposed descriptor was less sensitive to noise and had low histogram dimensionality. Thus, it is expected to be a powerful texture descriptor that can be used for various computer vision problems.
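    For reference, the two dissimilarity measures named above can be written in a few lines; the sketch below is a generic numpy version operating on any pair of concatenated histograms, not code from the paper, and the test vectors are arbitrary.

```python
import numpy as np

def histogram_intersection_distance(h1, h2):
    """1 minus the intersection of two L1-normalized histograms (0 = identical)."""
    h1, h2 = h1 / h1.sum(), h2 / h2.sum()
    return 1.0 - np.minimum(h1, h2).sum()

def chi_square_distance(h1, h2, eps=1e-10):
    """Chi-square statistic between two L1-normalized histograms."""
    h1, h2 = h1 / h1.sum(), h2 / h2.sum()
    return 0.5 * np.sum((h1 - h2) ** 2 / (h1 + h2 + eps))

a = np.array([4, 10, 3, 0, 7], dtype=float)
b = np.array([5, 8, 4, 1, 6], dtype=float)
print(histogram_intersection_distance(a, b), chi_square_distance(a, b))
```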

  7. Two-dimensional hidden semantic information model for target saliency detection and eyetracking identification

    NASA Astrophysics Data System (ADS)

    Wan, Weibing; Yuan, Lingfeng; Zhao, Qunfei; Fang, Tao

    2018-01-01

    Saliency detection has been applied to the target acquisition case. This paper proposes a two-dimensional hidden Markov model (2D-HMM) that exploits the hidden semantic information of an image to detect its salient regions. A spatial pyramid histogram of oriented gradient descriptors is used to extract features. After encoding the image by a learned dictionary, the 2D-Viterbi algorithm is applied to infer the saliency map. This model can predict fixation of the targets and further creates robust and effective depictions of the targets' change in posture and viewpoint. To validate the model with a human visual search mechanism, two eyetrack experiments are employed to train our model directly from eye movement data. The results show that our model achieves better performance than visual attention. Moreover, it indicates the plausibility of utilizing visual track data to identify targets.

  8. Measuring kinetics of complex single ion channel data using mean-variance histograms.

    PubMed

    Patlak, J B

    1993-07-01

    The measurement of single ion channel kinetics is difficult when those channels exhibit subconductance events. When the kinetics are fast, and when the current magnitudes are small, as is the case for Na+, Ca2+, and some K+ channels, these difficulties can lead to serious errors in the estimation of channel kinetics. I present here a method, based on the construction and analysis of mean-variance histograms, that can overcome these problems. A mean-variance histogram is constructed by calculating the mean current and the current variance within a brief "window" (a set of N consecutive data samples) superimposed on the digitized raw channel data. Systematic movement of this window over the data produces large numbers of mean-variance pairs which can be assembled into a two-dimensional histogram. Defined current levels (open, closed, or sublevel) appear in such plots as low variance regions. The total number of events in such low variance regions is estimated by curve fitting and plotted as a function of window width. This function decreases with the same time constants as the original dwell time probability distribution for each of the regions. The method can therefore be used: 1) to present a qualitative summary of the single channel data from which the signal-to-noise ratio, open channel noise, steadiness of the baseline, and number of conductance levels can be quickly determined; 2) to quantify the dwell time distribution in each of the levels exhibited. In this paper I present the analysis of a Na+ channel recording that had a number of complexities. The signal-to-noise ratio was only about 8 for the main open state; open channel noise and fast flickers to other states were present, as were a substantial number of subconductance states. "Standard" half-amplitude threshold analysis of these data produced open and closed time histograms that were well fitted by the sum of two exponentials, but with apparently erroneous time constants, whereas the mean-variance histogram technique provided a more credible analysis of the open, closed, and subconductance times for the patch. I also show that the method produces accurate results on simulated data in a wide variety of conditions, whereas the half-amplitude method, when applied to complex simulated data, shows the same errors as were apparent in the real data. The utility and the limitations of this new method are discussed.
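    The construction itself is easy to prototype: slide an N-sample window over the digitized trace, collect (mean, variance) pairs, and bin them in two dimensions. The sketch below uses a synthetic two-level trace with Gaussian noise purely as a stand-in for real channel data; the window width and bin counts are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(3)
levels = rng.choice([0.0, 2.0], size=20000, p=[0.6, 0.4])     # closed/open current levels (pA)
trace = levels + rng.normal(0.0, 0.3, size=levels.size)        # add recording noise

N = 10                                                          # window width in samples
windows = np.lib.stride_tricks.sliding_window_view(trace, N)
means = windows.mean(axis=1)
variances = windows.var(axis=1)

# Low-variance regions of H correspond to dwells at defined current levels;
# repeating this for several N gives the event count vs. window width curve.
H, mean_edges, var_edges = np.histogram2d(means, variances, bins=(80, 80))
print(H.shape)
```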

  9. 3D/2D image registration using weighted histogram of gradient directions

    NASA Astrophysics Data System (ADS)

    Ghafurian, Soheil; Hacihaliloglu, Ilker; Metaxas, Dimitris N.; Tan, Virak; Li, Kang

    2015-03-01

    Three-dimensional (3D) to two-dimensional (2D) image registration is crucial in many medical applications such as image-guided evaluation of musculoskeletal disorders. One of the key problems is to estimate the 3D CT-reconstructed bone model positions (translation and rotation) which maximize the similarity between the digitally reconstructed radiographs (DRRs) and the 2D fluoroscopic images using a registration method. This problem is computationally intensive due to a large search space and the complicated DRR generation process. Also, finding a similarity measure which converges to the global optimum instead of local optima adds to the challenge. To circumvent these issues, most existing registration methods need a manual initialization, which requires user interaction and is prone to human error. In this paper, we introduce a novel feature-based registration method using the weighted histogram of gradient directions of images. This method simplifies the computation by searching the parameter space (rotation and translation) sequentially rather than simultaneously. In our numeric simulation experiments, the proposed registration algorithm was able to achieve sub-millimeter and sub-degree accuracies. Moreover, our method is robust to the initial guess. It can tolerate up to +/-90° rotation offset from the global optimal solution, which minimizes the need for human interaction to initialize the algorithm.
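    The core feature, a gradient-direction histogram in which each pixel votes with its gradient magnitude, can be sketched as follows; the toy image, bin count, and weighting scheme are assumptions, and the paper's DRR generation and sequential search strategy are not reproduced.

```python
import numpy as np

def weighted_gradient_direction_histogram(image, n_bins=36):
    """Histogram of gradient directions, with each pixel weighted by gradient magnitude."""
    gy, gx = np.gradient(image.astype(float))
    magnitude = np.hypot(gx, gy)
    direction = np.arctan2(gy, gx)                        # radians in [-pi, pi]
    hist, _ = np.histogram(direction, bins=n_bins, range=(-np.pi, np.pi),
                           weights=magnitude)             # magnitude-weighted votes
    return hist / (hist.sum() + 1e-12)

img = np.zeros((64, 64)); img[20:44, 20:44] = 1.0         # toy stand-in for a fluoroscopic image
print(weighted_gradient_direction_histogram(img)[:8])
```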

  10. Inexpensive read-out for coincident electron spectroscopy with a transmission electron microscope at nanometer scale using micro channel plates and multistrip anodes

    NASA Astrophysics Data System (ADS)

    Hollander, R. W.; Bom, V. R.; van Eijk, C. W. E.; Faber, J. S.; Hoevers, H.; Kruit, P.

    1994-09-01

    The elemental composition of a sample at nanometer scale is determined by measurement of the characteristic energy of Auger electrons, emitted in coincidence with incoming primary electrons from a microbeam in a scanning transmission electron microscope (STEM). Single electrons are detected with position sensitive detectors, consisting of MicroChannel Plates (MCP) and MultiStrip Anodes (MSA), one for the energy of the Auger electrons (Auger-detector) and one for the energy loss of primary electrons (EELS-detector). The MSAs are sensed with LeCroy 2735DC preamplifiers. The fast readout is based on LeCroy's PCOS III system. On the detection of a coincidence (Event), energy data of Auger and EELS are combined with timing data into an Event word. Event words are stored in list mode in a VME memory module. Blocks of Event words are scanned by transputers in VME and two-dimensional energy histograms are filled using the timing information to obtain a maximal true/accidental ratio. The resulting histograms are stored on disk of a PC-386, which also controls data taking. The system is designed to handle 10^5 Events per second, 90% of which are accidental. In the histograms the "true" to "accidental" ratio will be 5. The dead time is 15%.

  11. Bandgap Inhomogeneity of a PbSe Quantum Dot Ensemble from Two-Dimensional Spectroscopy and Comparison to Size Inhomogeneity from Electron Microscopy

    DOE PAGES

    Park, Samuel D.; Baranov, Dmitry; Ryu, Jisu; ...

    2017-01-03

    Femtosecond two-dimensional Fourier transform spectroscopy is used to determine the static bandgap inhomogeneity of a colloidal quantum dot ensemble. The excited states of quantum dots absorb light, so their absorptive two-dimensional (2D) spectra will typically have positive and negative peaks. We show that the absorption bandgap inhomogeneity is robustly determined by the slope of the nodal line separating positive and negative peaks in the 2D spectrum around the bandgap transition; this nodal line slope is independent of excited state parameters not known from the absorption and emission spectra. The absorption bandgap inhomogeneity is compared to a size and shape distribution determined by electron microscopy. The electron microscopy images are analyzed using new 2D histograms that correlate major and minor image projections to reveal elongated nanocrystals, a conclusion supported by grazing incidence small-angle X-ray scattering and high-resolution transmission electron microscopy. Lastly, the absorption bandgap inhomogeneity quantitatively agrees with the bandgap variations calculated from the size and shape distribution, placing upper bounds on any surface contributions.

  12. Theory and Application of DNA Histogram Analysis.

    ERIC Educational Resources Information Center

    Bagwell, Charles Bruce

    The underlying principles and assumptions associated with DNA histograms are discussed along with the characteristics of fluorescent probes. Information theory was described and used to calculate the information content of a DNA histogram. Two major types of DNA histogram analyses are proposed: parametric and nonparametric analysis. Three levels…

  13. Color image segmentation to detect defects on fresh ham

    NASA Astrophysics Data System (ADS)

    Marty-Mahe, Pascale; Loisel, Philippe; Brossard, Didier

    2003-04-01

    We present in this paper the color segmentation methods that were used to detect appearance defects on 3 dimensional shape of fresh ham. The use of color histograms turned out to be an efficient solution to characterize the healthy skin, but a special care must be taken to choose the color components because of the 3 dimensional shape of ham.

  14. Face-iris multimodal biometric scheme based on feature level fusion

    NASA Astrophysics Data System (ADS)

    Huo, Guang; Liu, Yuanning; Zhu, Xiaodong; Dong, Hongxing; He, Fei

    2015-11-01

    Unlike score level fusion, feature level fusion demands all the features extracted from unimodal traits with high distinguishability, as well as homogeneity and compatibility, which is difficult to achieve. Therefore, most multimodal biometric research focuses on score level fusion, whereas few investigate feature level fusion. We propose a face-iris recognition method based on feature level fusion. We build a special two-dimensional-Gabor filter bank to extract local texture features from face and iris images, and then transform them by histogram statistics into an energy-orientation variance histogram feature with lower dimensions and higher distinguishability. Finally, through a fusion-recognition strategy based on principal components analysis and support vector machine (FRSPS), feature level fusion and one-to-n identification are accomplished. The experimental results demonstrate that this method can not only effectively extract face and iris features but also provide higher recognition accuracy. Compared with some state-of-the-art fusion methods, the proposed method has a significant performance advantage.

  15. Genetic Engineering of Optical Properties of Biomaterials

    NASA Astrophysics Data System (ADS)

    Gourley, Paul; Naviaux, Robert; Yaffe, Michael

    2008-03-01

    Baker's yeast cells are easily cultured and can be manipulated genetically to produce large numbers of bioparticles (cells and mitochondria) with controllable size and optical properties. We have recently employed nanolaser spectroscopy to study the refractive index of individual cells and isolated mitochondria from two mutant strains. Results show that biomolecular changes induced by mutation can produce bioparticles with radical changes in refractive index. Wild-type mitochondria exhibit a distribution with a well-defined mean and small variance. In striking contrast, mitochondria from one mutant strain produced a histogram that is highly collapsed with a ten-fold decrease in the mean and standard deviation. In a second mutant strain we observed an opposite effect with the mean nearly unchanged but the variance increased nearly a thousand-fold. Both histograms could be self-consistently modeled with a single, log-normal distribution. The strains were further examined by 2-dimensional gel electrophoresis to measure changes in protein composition. All of these data show that genetic manipulation of cells represents a new approach to engineering optical properties of bioparticles.

  16. Parameterization of the Age-Dependent Whole Brain Apparent Diffusion Coefficient Histogram

    PubMed Central

    Batra, Marion; Nägele, Thomas

    2015-01-01

    Purpose. The distribution of apparent diffusion coefficient (ADC) values in the brain can be used to characterize age effects and pathological changes of the brain tissue. The aim of this study was the parameterization of the whole brain ADC histogram by an advanced model with influence of age considered. Methods. Whole brain ADC histograms were calculated for all data and for seven age groups between 10 and 80 years. Modeling of the histograms was performed for two parts of the histogram separately: the brain tissue part was modeled by two Gaussian curves, while the remaining part was fitted by the sum of a Gaussian curve, a biexponential decay, and a straight line. Results. A consistent fitting of the histograms of all age groups was possible with the proposed model. Conclusions. This study confirms the strong dependence of the whole brain ADC histograms on the age of the examined subjects. The proposed model can be used to characterize changes of the whole brain ADC histogram in certain diseases under consideration of age effects. PMID:26609526
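    A minimal version of the brain-tissue part of such a model, fitting the sum of two Gaussians to an ADC histogram with scipy's curve_fit, is sketched below; the synthetic ADC values, bin range, and starting guesses are assumptions, and the paper's additional background and CSF terms are omitted.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(4)
adc = np.concatenate([rng.normal(750, 60, 60000),        # stand-in for one tissue peak
                      rng.normal(880, 90, 40000)])       # stand-in for a second tissue peak
counts, edges = np.histogram(adc, bins=120, range=(400, 1400))
centers = 0.5 * (edges[:-1] + edges[1:])

def two_gaussians(x, a1, mu1, s1, a2, mu2, s2):
    return (a1 * np.exp(-0.5 * ((x - mu1) / s1) ** 2) +
            a2 * np.exp(-0.5 * ((x - mu2) / s2) ** 2))

p0 = [counts.max(), 750, 50, counts.max() / 2, 900, 80]   # rough starting guesses
params, _ = curve_fit(two_gaussians, centers, counts, p0=p0)
print(params.reshape(2, 3))    # amplitude, mean, width of the two fitted peaks
```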

  17. [Population policies and intervention on natality in Eastern Europe. Part 2: classified bibliography].

    PubMed

    Hecht, J; Henripin, J; Ivanov, S; There, C

    1987-06-01

    This is an unannotated bibliography of literature concerning Eastern European population policies, with an emphasis on policies pertaining to fertility. The bibliography is organized into two sections: the first is devoted to population theories and policies, fertility, contraception, and abortion; the second presents citations by country. An author index is included. Eastern Europe is defined as including Albania, the USSR, and Yugoslavia.

  18. Automatic characterization and segmentation of human skin using three-dimensional optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Hori, Yasuaki; Yasuno, Yoshiaki; Sakai, Shingo; Matsumoto, Masayuki; Sugawara, Tomoko; Madjarova, Violeta; Yamanari, Masahiro; Makita, Shuichi; Yasui, Takeshi; Araki, Tsutomu; Itoh, Masahide; Yatagai, Toyohiko

    2006-03-01

    A set of fully automated algorithms that is specialized for analyzing a three-dimensional optical coherence tomography (OCT) volume of human skin is reported. The algorithm set first determines the skin surface of the OCT volume, and a depth-oriented algorithm provides the mean epidermal thickness, distribution map of the epidermis, and a segmented volume of the epidermis. Subsequently, an en face shadowgram is produced by an algorithm to visualize the infundibula in the skin with high contrast. The population and occupation ratio of the infundibula are provided by a histogram-based thresholding algorithm and a distance mapping algorithm. En face OCT slices at constant depths from the sample surface are extracted, and the histogram-based thresholding algorithm is again applied to these slices, yielding a three-dimensional segmented volume of the infundibula. The dermal attenuation coefficient is also calculated from the OCT volume in order to evaluate the skin texture. The algorithm set examines swept-source OCT volumes of the skins of several volunteers, and the results show the high stability, portability and reproducibility of the algorithm.

  19. Measuring kinetics of complex single ion channel data using mean-variance histograms.

    PubMed Central

    Patlak, J B

    1993-01-01

    The measurement of single ion channel kinetics is difficult when those channels exhibit subconductance events. When the kinetics are fast, and when the current magnitudes are small, as is the case for Na+, Ca2+, and some K+ channels, these difficulties can lead to serious errors in the estimation of channel kinetics. I present here a method, based on the construction and analysis of mean-variance histograms, that can overcome these problems. A mean-variance histogram is constructed by calculating the mean current and the current variance within a brief "window" (a set of N consecutive data samples) superimposed on the digitized raw channel data. Systematic movement of this window over the data produces large numbers of mean-variance pairs which can be assembled into a two-dimensional histogram. Defined current levels (open, closed, or sublevel) appear in such plots as low variance regions. The total number of events in such low variance regions is estimated by curve fitting and plotted as a function of window width. This function decreases with the same time constants as the original dwell time probability distribution for each of the regions. The method can therefore be used: 1) to present a qualitative summary of the single channel data from which the signal-to-noise ratio, open channel noise, steadiness of the baseline, and number of conductance levels can be quickly determined; 2) to quantify the dwell time distribution in each of the levels exhibited. In this paper I present the analysis of a Na+ channel recording that had a number of complexities. The signal-to-noise ratio was only about 8 for the main open state; open channel noise and fast flickers to other states were present, as were a substantial number of subconductance states. "Standard" half-amplitude threshold analysis of these data produced open and closed time histograms that were well fitted by the sum of two exponentials, but with apparently erroneous time constants, whereas the mean-variance histogram technique provided a more credible analysis of the open, closed, and subconductance times for the patch. I also show that the method produces accurate results on simulated data in a wide variety of conditions, whereas the half-amplitude method, when applied to complex simulated data, shows the same errors as were apparent in the real data. The utility and the limitations of this new method are discussed. PMID:7690261

  20. Discovering Functions of Unannotated Genes from a Transcriptome Survey of Wild Fungal Isolates

    PubMed Central

    Ellison, Christopher E.; Kowbel, David; Glass, N. Louise; Taylor, John W.

    2014-01-01

    ABSTRACT Most fungal genomes are poorly annotated, and many fungal traits of industrial and biomedical relevance are not well suited to classical genetic screens. Assigning genes to phenotypes on a genomic scale thus remains an urgent need in the field. We developed an approach to infer gene function from expression profiles of wild fungal isolates, and we applied our strategy to the filamentous fungus Neurospora crassa. Using transcriptome measurements in 70 strains from two well-defined clades of this microbe, we first identified 2,247 cases in which the expression of an unannotated gene rose and fell across N. crassa strains in parallel with the expression of well-characterized genes. We then used image analysis of hyphal morphologies, quantitative growth assays, and expression profiling to test the functions of four genes predicted from our population analyses. The results revealed two factors that influenced regulation of metabolism of nonpreferred carbon and nitrogen sources, a gene that governed hyphal architecture, and a gene that mediated amino acid starvation resistance. These findings validate the power of our population-transcriptomic approach for inference of novel gene function, and we suggest that this strategy will be of broad utility for genome-scale annotation in many fungal systems. PMID:24692637

  1. Three-dimensional volumetric analysis of irradiated lung with adjuvant breast irradiation.

    PubMed

    Teh, Amy Yuen Meei; Park, Eileen J H; Shen, Liang; Chung, Hans T

    2009-12-01

    To retrospectively evaluate the dose-volume histogram data of irradiated lung in adjuvant breast radiotherapy (ABR) using a three-dimensional computed tomography (3D-CT)-guided planning technique; and to investigate the relationship between lung dose-volume data and traditionally used two-dimensional (2D) parameters, as well as their correlation with the incidence of steroid-requiring radiation pneumonitis (SRRP). Patients beginning ABR between January 2005 and February 2006 were retrospectively reviewed. Patients included were women aged ≥18 years with ductal carcinoma in situ or Stage I-III invasive carcinoma, who received radiotherapy using a 3D-CT technique to the breast or chest wall (two-field radiotherapy [2FRT]) with or without supraclavicular irradiation (three-field radiotherapy [3FRT]), to 50 Gy in 25 fractions. A 10-Gy tumor-bed boost was allowed. Lung dose-volume histogram parameters (V(10), V(20), V(30), V(40)), 2D parameters (central lung depth [CLD], maximum lung depth [MLD], and lung length [LL]), and incidence of SRRP were reported. A total of 89 patients met the inclusion criteria: 51 had 2FRT, and 38 had 3FRT. With 2FRT, mean ipsilateral V(10), V(20), V(30), V(40) and CLD, MLD, LL were 20%, 14%, 11%, and 8% and 2.0 cm, 2.1 cm, and 14.6 cm, respectively, with strong correlation between CLD and ipsilateral V(10)-V(40) (R(2) = 0.73-0.83, p < 0.0005). With 3FRT, mean ipsilateral V(10), V(20), V(30), and V(40) were 30%, 22%, 17%, and 11%, but its correlation with 2D parameters was poor. With a median follow-up of 14.5 months, 1 case of SRRP was identified. With only 1 case of SRRP observed, our study is limited in its ability to provide definitive guidance, but it does provide a starting point for acceptable lung irradiation during ABR. Further prospective studies are warranted.
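    The V(x) metrics quoted above are simple to compute from per-voxel dose data, as in the hedged sketch below; equal voxel volumes are assumed and the dose values are synthetic stand-ins for a real plan.

```python
import numpy as np

def v_gy(dose_voxels, threshold_gy):
    """Percentage of the organ volume receiving at least threshold_gy (equal voxel volumes)."""
    dose_voxels = np.asarray(dose_voxels, dtype=float)
    return 100.0 * np.mean(dose_voxels >= threshold_gy)

rng = np.random.default_rng(5)
ipsilateral_lung_dose = rng.gamma(shape=1.2, scale=8.0, size=50000)   # Gy, illustrative only
for x in (10, 20, 30, 40):
    print(f"V{x} = {v_gy(ipsilateral_lung_dose, x):.1f}%")
```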

  2. Protein function prediction using neighbor relativity in protein-protein interaction network.

    PubMed

    Moosavi, Sobhan; Rahgozar, Masoud; Rahimi, Amir

    2013-04-01

    There is a large gap between the number of discovered proteins and the number of functionally annotated ones. Due to the high cost of determining protein function by wet-lab research, function prediction has become a major task for computational biology and bioinformatics. Some researchers utilize protein interaction information to predict function for un-annotated proteins. In this paper, we propose a novel approach called "Neighbor Relativity Coefficient" (NRC) based on interaction network topology which estimates the functional similarity between two proteins. NRC is calculated for each pair of proteins based on their graph-based features including distance, common neighbors and the number of paths between them. In order to ascribe function to an un-annotated protein, NRC estimates a weight for each neighbor to transfer its annotation to the unknown protein. Finally, the unknown protein is annotated with the top-scoring transferred functions. We also investigate the effect of using different coefficients for various types of functions. The proposed method has been evaluated on Saccharomyces cerevisiae and Homo sapiens interaction networks. The performance analysis demonstrates that NRC yields better results in comparison with previous protein function prediction approaches that utilize the interaction network. Copyright © 2012 Elsevier Ltd. All rights reserved.

  3. Unsupervised spike sorting based on discriminative subspace learning.

    PubMed

    Keshtkaran, Mohammad Reza; Yang, Zhi

    2014-01-01

    Spike sorting is a fundamental preprocessing step for many neuroscience studies which rely on the analysis of spike trains. In this paper, we present two unsupervised spike sorting algorithms based on discriminative subspace learning. The first algorithm simultaneously learns the discriminative feature subspace and performs clustering. It uses histogram of features in the most discriminative projection to detect the number of neurons. The second algorithm performs hierarchical divisive clustering that learns a discriminative 1-dimensional subspace for clustering in each level of the hierarchy until achieving almost unimodal distribution in the subspace. The algorithms are tested on synthetic and in-vivo data, and are compared against two widely used spike sorting methods. The comparative results demonstrate that our spike sorting methods can achieve substantially higher accuracy in lower dimensional feature space, and they are highly robust to noise. Moreover, they provide significantly better cluster separability in the learned subspace than in the subspace obtained by principal component analysis or wavelet transform.

  4. Proteogenomic analysis reveals alternative splicing and translation as part of the abscisic acid response in Arabidopsis seedlings.

    PubMed

    Zhu, Fu-Yuan; Chen, Mo-Xian; Ye, Neng-Hui; Shi, Lu; Ma, Kai-Long; Yang, Jing-Fang; Cao, Yun-Ying; Zhang, Youjun; Yoshida, Takuya; Fernie, Alisdair R; Fan, Guang-Yi; Wen, Bo; Zhou, Ruo; Liu, Tie-Yuan; Fan, Tao; Gao, Bei; Zhang, Di; Hao, Ge-Fei; Xiao, Shi; Liu, Ying-Gao; Zhang, Jianhua

    2017-08-01

    In eukaryotes, mechanisms such as alternative splicing (AS) and alternative translation initiation (ATI) contribute to organismal protein diversity. Specifically, splicing factors play crucial roles in responses to environment and development cues; however, the underlying mechanisms are not well investigated in plants. Here, we report the parallel employment of short-read RNA sequencing, single molecule long-read sequencing and proteomic identification to unravel AS isoforms and previously unannotated proteins in response to abscisic acid (ABA) treatment. Combining the data from the two sequencing methods, approximately 83.4% of intron-containing genes were alternatively spliced. Two AS types, which are referred to as alternative first exon (AFE) and alternative last exon (ALE), were more abundant than intron retention (IR); however, by contrast to AS events detected under normal conditions, differentially expressed AS isoforms were more likely to be translated. ABA extensively affects the AS pattern, indicated by the increasing number of non-conventional splicing sites. This work also identified thousands of unannotated peptides and proteins by ATI based on mass spectrometry and a virtual peptide library deduced from both strands of coding regions within the Arabidopsis genome. The results enhance our understanding of AS and alternative translation mechanisms under normal conditions, and in response to ABA treatment. © 2017 The Authors The Plant Journal © 2017 John Wiley & Sons Ltd.

  5. Information granules in image histogram analysis.

    PubMed

    Wieclawek, Wojciech

    2018-04-01

    A concept of granular computing employed in intensity-based image enhancement is discussed. First, a weighted granular computing idea is introduced. Then, the implementation of this term in the image processing area is presented. Finally, multidimensional granular histogram analysis is introduced. The proposed approach is dedicated to digital images, especially to medical images acquired by Computed Tomography (CT). Like the histogram equalization approach, this method is based on image histogram analysis. Yet, unlike the histogram equalization technique, it works on a selected range of the pixel intensity and is controlled by two parameters. Performance is tested on anonymous clinical CT series. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Multidimensional generalized-ensemble algorithms for complex systems.

    PubMed

    Mitsutake, Ayori; Okamoto, Yuko

    2009-06-07

    We give general formulations of the multidimensional multicanonical algorithm, simulated tempering, and replica-exchange method. We generalize the original potential energy function E(0) by adding any physical quantity V of interest as a new energy term. These multidimensional generalized-ensemble algorithms then perform a random walk not only in E(0) space but also in V space. Among the three algorithms, the replica-exchange method is the easiest to perform because the weight factor is just a product of regular Boltzmann-like factors, while the weight factors for the multicanonical algorithm and simulated tempering are not a priori known. We give a simple procedure for obtaining the weight factors for these two latter algorithms, which uses a short replica-exchange simulation and the multiple-histogram reweighting techniques. As an example of applications of these algorithms, we have performed a two-dimensional replica-exchange simulation and a two-dimensional simulated-tempering simulation using an alpha-helical peptide system. From these simulations, we study the helix-coil transitions of the peptide in gas phase and in aqueous solution.
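    A concrete piece of such a scheme is the pairwise exchange test between replicas that differ in both temperature and the coupling to the extra term V. The sketch below implements that Metropolis criterion for weights of the form exp[-beta(E0 + lambda V)]; the numerical energies and parameter values are placeholders, and the multiple-histogram reweighting step is not shown.

```python
import numpy as np

rng = np.random.default_rng(6)

def swap_accepted(beta_i, lam_i, E0_i, V_i, beta_j, lam_j, E0_j, V_j):
    """Metropolis test for exchanging the configurations of replicas i and j,
    where replica i samples exp[-beta_i * (E0(x) + lam_i * V(x))]."""
    delta = ((beta_i - beta_j) * (E0_j - E0_i) +
             (beta_i * lam_i - beta_j * lam_j) * (V_j - V_i))
    return (delta <= 0.0) or (rng.random() < np.exp(-delta))

# One attempted swap between neighboring replicas in the (temperature, lambda) grid:
print(swap_accepted(beta_i=1.0, lam_i=0.0, E0_i=-120.0, V_i=3.0,
                    beta_j=0.9, lam_j=0.2, E0_j=-115.0, V_j=2.5))
```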

  7. Statistical Properties of Line Centroid Velocity Increments in the rho Ophiuchi Cloud

    NASA Technical Reports Server (NTRS)

    Lis, D. C.; Keene, Jocelyn; Li, Y.; Phillips, T. G.; Pety, J.

    1998-01-01

    We present a comparison of histograms of CO (2-1) line centroid velocity increments in the rho Ophiuchi molecular cloud with those computed for spectra synthesized from a three-dimensional, compressible, but non-starforming and non-gravitating hydrodynamic simulation. Histograms of centroid velocity increments in the rho Ophiuchi cloud show clearly non-Gaussian wings, similar to those found in histograms of velocity increments and derivatives in experimental studies of laboratory and atmospheric flows, as well as numerical simulations of turbulence. The magnitude of these wings increases monotonically with decreasing separation, down to the angular resolution of the data. This behavior is consistent with that found in the phase of the simulation which has most of the properties of incompressible turbulence. The time evolution of the magnitude of the non-Gaussian wings in the histograms of centroid velocity increments in the simulation is consistent with the evolution of the vorticity in the flow. However, we cannot exclude the possibility that the wings are associated with the shock interaction regions. Moreover, in an active starforming region like the rho Ophiuchi cloud, the effects of shocks may be more important than in the simulation. However, being able to identify shock interaction regions in the interstellar medium is also important, since numerical simulations show that vorticity is generated in shock interactions.

  8. The Proteome Folding Project: Proteome-scale prediction of structure and function

    PubMed Central

    Drew, Kevin; Winters, Patrick; Butterfoss, Glenn L.; Berstis, Viktors; Uplinger, Keith; Armstrong, Jonathan; Riffle, Michael; Schweighofer, Erik; Bovermann, Bill; Goodlett, David R.; Davis, Trisha N.; Shasha, Dennis; Malmström, Lars; Bonneau, Richard

    2011-01-01

    The incompleteness of proteome structure and function annotation is a critical problem for biologists and, in particular, severely limits interpretation of high-throughput and next-generation experiments. We have developed a proteome annotation pipeline based on structure prediction, where function and structure annotations are generated using an integration of sequence comparison, fold recognition, and grid-computing-enabled de novo structure prediction. We predict protein domain boundaries and three-dimensional (3D) structures for protein domains from 94 genomes (including human, Arabidopsis, rice, mouse, fly, yeast, Escherichia coli, and worm). De novo structure predictions were distributed on a grid of more than 1.5 million CPUs worldwide (World Community Grid). We generated significant numbers of new confident fold annotations (9% of domains that are otherwise unannotated in these genomes). We demonstrate that predicted structures can be combined with annotations from the Gene Ontology database to predict new and more specific molecular functions. PMID:21824995

  9. A new computer code for discrete fracture network modelling

    NASA Astrophysics Data System (ADS)

    Xu, Chaoshui; Dowd, Peter

    2010-03-01

    The authors describe a comprehensive software package for two- and three-dimensional stochastic rock fracture simulation using marked point processes. Fracture locations can be modelled by a Poisson, a non-homogeneous, a cluster or a Cox point process; fracture geometries and properties are modelled by their respective probability distributions. Virtual sampling tools such as plane, window and scanline sampling are included in the software together with a comprehensive set of statistical tools including histogram analysis, probability plots, rose diagrams and hemispherical projections. The paper describes in detail the theoretical basis of the implementation and provides a case study in rock fracture modelling to demonstrate the application of the software.
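    A minimal two-dimensional version of the marked-point-process idea (homogeneous Poisson fracture centres with orientation and length marks) is sketched below; the intensity, von Mises orientation, and lognormal length parameters are illustrative choices, not values from the paper, which supports far richer processes and 3D geometries.

```python
import numpy as np

rng = np.random.default_rng(7)
region = 100.0                                   # side of the square study region (m)
intensity = 0.02                                 # fracture centres per square metre

n = rng.poisson(intensity * region**2)           # Poisson number of fractures
centres = rng.uniform(0.0, region, size=(n, 2))
angles = rng.vonmises(mu=np.pi / 3, kappa=4.0, size=n)    # preferred orientation (mark)
lengths = rng.lognormal(mean=1.5, sigma=0.5, size=n)      # trace-length distribution (mark)

d = np.column_stack([np.cos(angles), np.sin(angles)]) * (lengths / 2)[:, None]
segments = np.stack([centres - d, centres + d], axis=1)   # (n, 2 endpoints, xy)
print(n, segments.shape)
```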

  10. Modeling late rectal toxicities based on a parameterized representation of the 3D dose distribution

    NASA Astrophysics Data System (ADS)

    Buettner, Florian; Gulliford, Sarah L.; Webb, Steve; Partridge, Mike

    2011-04-01

    Many models exist for predicting toxicities based on dose-volume histograms (DVHs) or dose-surface histograms (DSHs). This approach has several drawbacks as firstly the reduction of the dose distribution to a histogram results in the loss of spatial information and secondly the bins of the histograms are highly correlated with each other. Furthermore, some of the complex nonlinear models proposed in the past lack a direct physical interpretation and the ability to predict probabilities rather than binary outcomes. We propose a parameterized representation of the 3D distribution of the dose to the rectal wall which explicitly includes geometrical information in the form of the eccentricity of the dose distribution as well as its lateral and longitudinal extent. We use a nonlinear kernel-based probabilistic model to predict late rectal toxicity based on the parameterized dose distribution and assessed its predictive power using data from the MRC RT01 trial (ISRCTN 47772397). The endpoints under consideration were rectal bleeding, loose stools, and a global toxicity score. We extract simple rules identifying 3D dose patterns related to a specifically low risk of complication. Normal tissue complication probability (NTCP) models based on parameterized representations of geometrical and volumetric measures resulted in areas under the curve (AUCs) of 0.66, 0.63 and 0.67 for predicting rectal bleeding, loose stools and global toxicity, respectively. In comparison, NTCP models based on standard DVHs performed worse and resulted in AUCs of 0.59 for all three endpoints. In conclusion, we have presented low-dimensional, interpretable and nonlinear NTCP models based on the parameterized representation of the dose to the rectal wall. These models had a higher predictive power than models based on standard DVHs and their low dimensionality allowed for the identification of 3D dose patterns related to a low risk of complication.

  11. Prospective Clinical Trial of Bladder Filling and Three-Dimensional Dosimetry in High-Dose-Rate Vaginal Cuff Brachytherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, Alexandra J.; Cormack, Robert A.; Lee, Hang

    2008-11-01

    Purpose: To investigate the effect of bladder filling on dosimetry and to determine the best bladder dosimetric parameter for vaginal cuff brachytherapy. Methods and Materials: In this prospective clinical trial, a total of 20 women underwent vaginal cylinder high-dose-rate brachytherapy. The bladder was full for Fraction 2 and empty for Fraction 3. Dose-volume histogram and dose-surface histogram values were generated for the bladder, rectum, and urethra. The midline maximal bladder point (MBP) and the midline maximal rectal point were recorded. Paired t tests, Pearson correlations, and regression analyses were performed. Results: The volume and surface area of the irradiated bladder were significantly smaller when the bladder was empty than when full. Of the several dose-volume histogram and dose-surface histogram parameters evaluated, the bladder maximal dose received by 2 cm³ of tissue, volume of bladder receiving ≥50% of the dose, volume of bladder receiving ≥70% of the dose, and surface area of bladder receiving ≥50% of the dose significantly predicted for the difference between the empty vs. full filling state. The volume of bladder receiving ≥70% of the dose and the maximal dose received by 2 cm³ of tissue correlated significantly with the MBP. Bladder filling did not alter the volume or surface area of the rectum irradiated. However, an empty bladder did result in the nearest point of bowel being significantly closer to the vaginal cylinder than when the bladder was full. Conclusions: Patients undergoing vaginal cuff brachytherapy treated with an empty bladder have a lower bladder dose than those treated with a full bladder. The MBP correlated well with the volumetric assessments of bladder dose and provided a noninvasive method for reporting the MBP dose using three-dimensional imaging. The MBP can therefore be used as a surrogate for complex dosimetry in the clinic.

  12. MRI histogram analysis enables objective and continuous classification of intervertebral disc degeneration.

    PubMed

    Waldenberg, Christian; Hebelka, Hanna; Brisby, Helena; Lagerstrand, Kerstin Magdalena

    2018-05-01

    Magnetic resonance imaging (MRI) is the best diagnostic imaging method for low back pain. However, the technique is currently not utilized in its full capacity, often failing to depict painful intervertebral discs (IVDs), potentially due to the rough degeneration classification system used clinically today. MR image histograms, which reflect the IVD heterogeneity, may offer sensitive imaging biomarkers for IVD degeneration classification. This study investigates the feasibility of using histogram analysis as means of objective and continuous grading of IVD degeneration. Forty-nine IVDs in ten low back pain patients (six males, 25-69 years) were examined with MRI (T2-weighted images and T2-maps). Each IVD was semi-automatically segmented on three mid-sagittal slices. Histogram features of the IVD were extracted from the defined regions of interest and correlated to Pfirrmann grade. Both T2-weighted images and T2-maps displayed similar histogram features. Histograms of well-hydrated IVDs displayed two separate peaks, representing annulus fibrosus and nucleus pulposus. Degenerated IVDs displayed decreased peak separation, where the separation was shown to correlate strongly with Pfirrmann grade (P < 0.05). In addition, some degenerated IVDs within the same Pfirrmann grade displayed diametrically different histogram appearances. Histogram features correlated well with IVD degeneration, suggesting that IVD histogram analysis is a suitable tool for objective and continuous IVD degeneration classification. As histogram analysis revealed IVD heterogeneity, it may be a clinical tool for characterization of regional IVD degeneration effects. To elucidate the usefulness of histogram analysis in patient management, IVD histogram features between asymptomatic and symptomatic individuals needs to be compared.

  13. CellAtlasSearch: a scalable search engine for single cells.

    PubMed

    Srivastava, Divyanshu; Iyer, Arvind; Kumar, Vibhor; Sengupta, Debarka

    2018-05-21

    Owing to the advent of high throughput single cell transcriptomics, the past few years have seen exponential growth in the production of gene expression data. Recently, efforts have been made by various research groups to homogenize and store single cell expression from a large number of studies. The true value of this ever increasing data deluge can be unlocked by making it searchable. To this end, we propose CellAtlasSearch, a novel search architecture for high dimensional expression data, which is massively parallel as well as light-weight, thus infinitely scalable. In CellAtlasSearch, we use a Graphics Processing Unit (GPU) friendly version of Locality Sensitive Hashing (LSH) for unmatched speedup in data processing and query. Currently, CellAtlasSearch features over 300 000 reference expression profiles including both bulk and single-cell data. It enables the user to query individual single cell transcriptomes and find matching samples from the database along with necessary meta information. CellAtlasSearch aims to assist researchers and clinicians in characterizing unannotated single cells. It also facilitates noise free, low dimensional representation of single-cell expression profiles by projecting them on a wide variety of reference samples. The web-server is accessible at: http://www.cellatlassearch.com.
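    The random-projection flavour of LSH that makes this kind of search scale can be sketched in a few lines of numpy; this is a generic SimHash-style illustration, not the CellAtlasSearch code, and the gene count, atlas size, and noise level are reduced, invented values.

```python
import numpy as np

rng = np.random.default_rng(8)
n_genes, n_bits = 2000, 64
planes = rng.normal(size=(n_bits, n_genes))                 # random hyperplanes

def lsh_code(expression_profile):
    """64-bit binary signature: sign pattern of random projections."""
    return (planes @ expression_profile > 0).astype(np.uint8)

reference = rng.lognormal(size=(1000, n_genes))             # mock reference atlas profiles
codes = (reference @ planes.T > 0).astype(np.uint8)         # precomputed atlas signatures

query = reference[42] * rng.normal(1.0, 0.05, n_genes)      # noisy copy of one atlas profile
hamming = np.count_nonzero(codes != lsh_code(query), axis=1)
print(np.argmin(hamming))                                   # typically 42, the source profile
```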

  14. Detection of ochratoxin A contamination in stored wheat using near-infrared hyperspectral imaging

    NASA Astrophysics Data System (ADS)

    Senthilkumar, T.; Jayas, D. S.; White, N. D. G.; Fields, P. G.; Gräfenhan, T.

    2017-03-01

    Near-infrared (NIR) hyperspectral imaging system was used to detect five concentration levels of ochratoxin A (OTA) in contaminated wheat kernels. The wheat kernels artificially inoculated with two different OTA producing Penicillium verrucosum strains, two different non-toxigenic P. verrucosum strains, and sterile control wheat kernels were subjected to NIR hyperspectral imaging. The acquired three-dimensional data were reshaped into readable two-dimensional data. Principal Component Analysis (PCA) was applied to the two dimensional data to identify the key wavelengths which had greater significance in detecting OTA contamination in wheat. Statistical and histogram features extracted at the key wavelengths were used in the linear, quadratic and Mahalanobis statistical discriminant models to differentiate between sterile control, five concentration levels of OTA contamination in wheat kernels, and five infection levels of non-OTA producing P. verrucosum inoculated wheat kernels. The classification models differentiated sterile control samples from OTA contaminated wheat kernels and non-OTA producing P. verrucosum inoculated wheat kernels with a 100% accuracy. The classification models also differentiated between five concentration levels of OTA contaminated wheat kernels and between five infection levels of non-OTA producing P. verrucosum inoculated wheat kernels with a correct classification of more than 98%. The non-OTA producing P. verrucosum inoculated wheat kernels and OTA contaminated wheat kernels subjected to hyperspectral imaging provided different spectral patterns.

  15. Score-level fusion of two-dimensional and three-dimensional palmprint for personal recognition systems

    NASA Astrophysics Data System (ADS)

    Chaa, Mourad; Boukezzoula, Naceur-Eddine; Attia, Abdelouahab

    2017-01-01

    Two types of scores extracted from two-dimensional (2-D) and three-dimensional (3-D) palmprints for personal recognition systems are merged, introducing a local image descriptor for 2-D palmprint-based recognition systems, named bank of binarized statistical image features (B-BSIF). The main idea of B-BSIF is that the histograms extracted from the binarized statistical image features (BSIF) code images (obtained by applying BSIF descriptors of different sizes with length 12) are concatenated into one to produce a large feature vector. The 3-D palmprint contains the depth information of the palm surface. The self-quotient image (SQI) algorithm is applied for reconstructing illumination-invariant 3-D palmprint images. To extract discriminative Gabor features from SQI images, Gabor wavelets are defined and used. Indeed, dimensionality reduction methods have shown their effectiveness in biometric systems. Given this, a principal component analysis (PCA)+linear discriminant analysis (LDA) technique is employed. For the matching process, the cosine Mahalanobis distance is applied. Extensive experiments were conducted on a 2-D and 3-D palmprint database with 10,400 range images from 260 individuals. Then, a comparison was made between the proposed algorithm and other existing methods in the literature. Results clearly show that the proposed framework provides a higher correct recognition rate. Furthermore, the best results were obtained by merging the score of the B-BSIF descriptor with the score of the SQI+Gabor wavelets+PCA+LDA method, yielding an equal error rate of 0.00% and a rank-1 recognition rate of 100.00%.

  16. SVM based colon polyps classifier in a wireless active stereo endoscope.

    PubMed

    Ayoub, J; Granado, B; Mhanna, Y; Romain, O

    2010-01-01

    This work focuses on the recognition of three-dimensional colon polyps captured by an active stereo vision sensor. The detection algorithm consists of an SVM classifier trained on robust feature descriptors. The study is related to Cyclope, a prototype sensor that allows real-time 3D object reconstruction and continues to be optimized technically to improve its classification task by differentiating between hyperplastic and adenomatous polyps. Experimental results were encouraging and show a correct classification rate of approximately 97%. The work contains detailed statistics about the detection rate and the computing complexity. Inspired by the intensity histogram, the work presents a new approach that extracts a set of features based on the depth histogram and combines stereo measurement with SVM classifiers to correctly classify benign and malignant polyps.
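
    A minimal sketch of the depth-histogram-plus-SVM idea described above, trained on synthetic stand-in data; the bin count, depth range, and kernel choice are assumptions and do not reproduce the Cyclope pipeline.

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    def depth_histogram(depth_map, bins=32, depth_range=(0.0, 50.0)):
        """Feature vector: normalised histogram of depth values inside a polyp ROI."""
        hist, _ = np.histogram(depth_map, bins=bins, range=depth_range, density=True)
        return hist

    # hypothetical training set: 100 reconstructed polyp depth maps with labels
    X = np.array([depth_histogram(np.random.rand(64, 64) * 50) for _ in range(100)])
    y = np.random.randint(0, 2, 100)                   # 0 = hyperplastic, 1 = adenomatous

    clf = SVC(kernel="rbf", C=1.0, gamma="scale")
    print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())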

  17. True progression versus pseudoprogression in the treatment of glioblastomas: a comparison study of normalized cerebral blood volume and apparent diffusion coefficient by histogram analysis.

    PubMed

    Song, Yong Sub; Choi, Seung Hong; Park, Chul-Kee; Yi, Kyung Sik; Lee, Woong Jae; Yun, Tae Jin; Kim, Tae Min; Lee, Se-Hoon; Kim, Ji-Hoon; Sohn, Chul-Ho; Park, Sung-Hye; Kim, Il Han; Jahng, Geon-Ho; Chang, Kee-Hyun

    2013-01-01

    The purpose of this study was to differentiate true progression from pseudoprogression of glioblastomas treated with concurrent chemoradiotherapy (CCRT) with temozolomide (TMZ) by using histogram analysis of apparent diffusion coefficient (ADC) and normalized cerebral blood volume (nCBV) maps. Twenty patients with histopathologically proven glioblastoma who had received CCRT with TMZ underwent perfusion-weighted imaging and diffusion-weighted imaging (b = 0, 1000 sec/mm(2)). The corresponding nCBV and ADC maps for the newly visible, entirely enhancing lesions were calculated after the completion of CCRT with TMZ. Two observers independently measured the histogram parameters of the nCBV and ADC maps. The histogram parameters between the true progression group (n = 10) and the pseudoprogression group (n = 10) were compared by use of an unpaired Student's t test and subsequent multivariable stepwise logistic regression analysis to determine the best predictors for the differential diagnosis between the two groups. Receiver operating characteristic analysis was employed to determine the best cutoff values for the histogram parameters that proved to be significant predictors for differentiating true progression from pseudoprogression. Intraclass correlation coefficient was used to determine the level of inter-observer reliability for the histogram parameters. The 5th percentile value (C5) of the cumulative ADC histograms was a significant predictor for the differential diagnosis between true progression and pseudoprogression (p = 0.044 for observer 1; p = 0.011 for observer 2). Optimal cutoff values of 892 × 10(-6) mm(2)/sec for observer 1 and 907 × 10(-6) mm(2)/sec for observer 2 could help differentiate between the two groups with a sensitivity of 90% and 80%, respectively, a specificity of 90% and 80%, respectively, and an area under the curve of 0.880 and 0.840, respectively. There was no other significant differentiating parameter on the nCBV histograms. Inter-observer reliability was excellent or good for all histogram parameters (intraclass correlation coefficient range: 0.70-0.99). The C5 of the cumulative ADC histogram can be a promising parameter for the differentiation of true progression from pseudoprogression of newly visible, entirely enhancing lesions after CCRT with TMZ for glioblastomas.

  18. True Progression versus Pseudoprogression in the Treatment of Glioblastomas: A Comparison Study of Normalized Cerebral Blood Volume and Apparent Diffusion Coefficient by Histogram Analysis

    PubMed Central

    Song, Yong Sub; Park, Chul-Kee; Yi, Kyung Sik; Lee, Woong Jae; Yun, Tae Jin; Kim, Tae Min; Lee, Se-Hoon; Kim, Ji-Hoon; Sohn, Chul-Ho; Park, Sung-Hye; Kim, Il Han; Jahng, Geon-Ho; Chang, Kee-Hyun

    2013-01-01

    Objective: The purpose of this study was to differentiate true progression from pseudoprogression of glioblastomas treated with concurrent chemoradiotherapy (CCRT) with temozolomide (TMZ) by using histogram analysis of apparent diffusion coefficient (ADC) and normalized cerebral blood volume (nCBV) maps. Materials and Methods: Twenty patients with histopathologically proven glioblastoma who had received CCRT with TMZ underwent perfusion-weighted imaging and diffusion-weighted imaging (b = 0, 1000 sec/mm²). The corresponding nCBV and ADC maps for the newly visible, entirely enhancing lesions were calculated after the completion of CCRT with TMZ. Two observers independently measured the histogram parameters of the nCBV and ADC maps. The histogram parameters between the true progression group (n = 10) and the pseudoprogression group (n = 10) were compared by use of an unpaired Student's t test and subsequent multivariable stepwise logistic regression analysis to determine the best predictors for the differential diagnosis between the two groups. Receiver operating characteristic analysis was employed to determine the best cutoff values for the histogram parameters that proved to be significant predictors for differentiating true progression from pseudoprogression. Intraclass correlation coefficient was used to determine the level of inter-observer reliability for the histogram parameters. Results: The 5th percentile value (C5) of the cumulative ADC histograms was a significant predictor for the differential diagnosis between true progression and pseudoprogression (p = 0.044 for observer 1; p = 0.011 for observer 2). Optimal cutoff values of 892 × 10⁻⁶ mm²/sec for observer 1 and 907 × 10⁻⁶ mm²/sec for observer 2 could help differentiate between the two groups with a sensitivity of 90% and 80%, respectively, a specificity of 90% and 80%, respectively, and an area under the curve of 0.880 and 0.840, respectively. There was no other significant differentiating parameter on the nCBV histograms. Inter-observer reliability was excellent or good for all histogram parameters (intraclass correlation coefficient range: 0.70-0.99). Conclusion: The C5 of the cumulative ADC histogram can be a promising parameter for the differentiation of true progression from pseudoprogression of newly visible, entirely enhancing lesions after CCRT with TMZ for glioblastomas. PMID:23901325

  19. SAMBA: Sparse Approximation of Moment-Based Arbitrary Polynomial Chaos

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahlfeld, R., E-mail: r.ahlfeld14@imperial.ac.uk; Belkouchi, B.; Montomoli, F.

    2016-09-01

    A new arbitrary Polynomial Chaos (aPC) method is presented for moderately high-dimensional problems characterised by limited input data availability. The proposed methodology improves the algorithm of aPC and extends the method, which was previously only introduced as a tensor product expansion, to moderately high-dimensional stochastic problems. The fundamental idea of aPC is to use the statistical moments of the input random variables to develop the polynomial chaos expansion. This approach provides the possibility to propagate continuous or discrete probability density functions and also histograms (data sets) as long as their moments exist, are finite and the determinant of the moment matrix is strictly positive. For cases with limited data availability, this approach avoids bias and fitting errors caused by wrong assumptions. In this work, an alternative way to calculate the aPC is suggested, which provides the optimal polynomials, Gaussian quadrature collocation points and weights from the moments using only a handful of matrix operations on the Hankel matrix of moments. It can therefore be implemented without requiring prior knowledge about statistical data analysis or a detailed understanding of the mathematics of polynomial chaos expansions. The extension to more input variables suggested in this work is an anisotropic and adaptive version of Smolyak's algorithm that is solely based on the moments of the input probability distributions. It is referred to as SAMBA (PC), which is short for Sparse Approximation of Moment-Based Arbitrary Polynomial Chaos. It is illustrated that for moderately high-dimensional problems (up to 20 different input variables or histograms) SAMBA can significantly simplify the calculation of sparse Gaussian quadrature rules. SAMBA's efficiency for multivariate functions with regard to data availability is further demonstrated by analysing higher order convergence and accuracy for a set of nonlinear test functions with 2, 5 and 10 different input distributions or histograms.

  20. SAMBA: Sparse Approximation of Moment-Based Arbitrary Polynomial Chaos

    NASA Astrophysics Data System (ADS)

    Ahlfeld, R.; Belkouchi, B.; Montomoli, F.

    2016-09-01

    A new arbitrary Polynomial Chaos (aPC) method is presented for moderately high-dimensional problems characterised by limited input data availability. The proposed methodology improves the algorithm of aPC and extends the method, which was previously only introduced as a tensor product expansion, to moderately high-dimensional stochastic problems. The fundamental idea of aPC is to use the statistical moments of the input random variables to develop the polynomial chaos expansion. This approach provides the possibility to propagate continuous or discrete probability density functions and also histograms (data sets) as long as their moments exist, are finite and the determinant of the moment matrix is strictly positive. For cases with limited data availability, this approach avoids bias and fitting errors caused by wrong assumptions. In this work, an alternative way to calculate the aPC is suggested, which provides the optimal polynomials, Gaussian quadrature collocation points and weights from the moments using only a handful of matrix operations on the Hankel matrix of moments. It can therefore be implemented without requiring prior knowledge about statistical data analysis or a detailed understanding of the mathematics of polynomial chaos expansions. The extension to more input variables suggested in this work is an anisotropic and adaptive version of Smolyak's algorithm that is solely based on the moments of the input probability distributions. It is referred to as SAMBA (PC), which is short for Sparse Approximation of Moment-Based Arbitrary Polynomial Chaos. It is illustrated that for moderately high-dimensional problems (up to 20 different input variables or histograms) SAMBA can significantly simplify the calculation of sparse Gaussian quadrature rules. SAMBA's efficiency for multivariate functions with regard to data availability is further demonstrated by analysing higher order convergence and accuracy for a set of nonlinear test functions with 2, 5 and 10 different input distributions or histograms.
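
    The following Python sketch illustrates the classical moments-to-quadrature route referred to above: Cholesky factorization of the Hankel moment matrix yields the recurrence coefficients of the orthogonal polynomials, and the eigen-decomposition of the resulting Jacobi matrix gives nodes and weights. This is a generic textbook construction, not the SAMBA code itself, and it becomes numerically delicate as the number of nodes grows.

    import numpy as np

    def quadrature_from_moments(m, n):
        """Gaussian quadrature nodes and weights for a measure specified only by
        its raw moments m[0..2n], via the Hankel moment matrix and the Jacobi
        matrix (Golub-Welsch style); intended as a sketch, not production code."""
        H = np.array([[m[i + j] for j in range(n + 1)] for i in range(n + 1)])
        R = np.linalg.cholesky(H).T                    # upper-triangular factor
        alpha = np.zeros(n)
        off = np.zeros(n)                              # off[0] unused
        for j in range(n):
            prev = R[j - 1, j] / R[j - 1, j - 1] if j > 0 else 0.0
            alpha[j] = R[j, j + 1] / R[j, j] - prev
            if j > 0:
                off[j] = R[j, j] / R[j - 1, j - 1]
        J = np.diag(alpha) + np.diag(off[1:], 1) + np.diag(off[1:], -1)
        nodes, vectors = np.linalg.eigh(J)
        weights = m[0] * vectors[0, :] ** 2
        return nodes, weights

    # sanity check with the uniform distribution on [0, 1]: m_k = 1/(k+1);
    # the result should match 3-point Gauss-Legendre quadrature on [0, 1]
    moments = [1.0 / (k + 1) for k in range(7)]
    print(quadrature_from_moments(moments, 3))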

  1. n-SIFT: n-dimensional scale invariant feature transform.

    PubMed

    Cheung, Warren; Hamarneh, Ghassan

    2009-09-01

    We propose the n-dimensional scale invariant feature transform (n-SIFT) method for extracting and matching salient features from scalar images of arbitrary dimensionality, and compare this method's performance to other related features. The proposed features extend the concepts used for 2-D scalar images in the computer vision SIFT technique for extracting and matching distinctive scale invariant features. We apply the features to images of arbitrary dimensionality through the use of hyperspherical coordinates for gradients and multidimensional histograms to create the feature vectors. We analyze the performance of a fully automated multimodal medical image matching technique based on these features, and successfully apply the technique to determine accurate feature point correspondence between pairs of 3-D MRI images and dynamic 3D + time CT data.

  2. STILTS Plotting Tools

    NASA Astrophysics Data System (ADS)

    Taylor, M. B.

    2009-09-01

    The new plotting functionality in version 2.0 of STILTS is described. STILTS is a mature and powerful package for all kinds of table manipulation, and this version adds facilities for generating plots from one or more tables to its existing wide range of non-graphical capabilities. 2- and 3-dimensional scatter plots and 1-dimensional histograms may be generated using highly configurable style parameters. Features include multiple dataset overplotting, variable transparency, 1-, 2- or 3-dimensional symmetric or asymmetric error bars, higher-dimensional visualization using color, and textual point labeling. Vector and bitmapped output formats are supported. The plotting options provide enough flexibility to perform meaningful visualization on datasets from a few points up to tens of millions. Arbitrarily large datasets can be plotted without heavy memory usage.

  3. Diffusion-weighted imaging: Apparent diffusion coefficient histogram analysis for detecting pathologic complete response to chemoradiotherapy in locally advanced rectal cancer.

    PubMed

    Choi, Moon Hyung; Oh, Soon Nam; Rha, Sung Eun; Choi, Joon-Il; Lee, Sung Hak; Jang, Hong Seok; Kim, Jun-Gi; Grimm, Robert; Son, Yohan

    2016-07-01

    To investigate the usefulness of apparent diffusion coefficient (ADC) values derived from histogram analysis of the whole rectal cancer as a quantitative parameter to evaluate pathologic complete response (pCR) on preoperative magnetic resonance imaging (MRI). We enrolled a total of 86 consecutive patients who had undergone surgery for rectal cancer after neoadjuvant chemoradiotherapy (CRT) at our institution between July 2012 and November 2014. Two radiologists who were blinded to the final pathological results reviewed post-CRT MRI to evaluate tumor stage. Quantitative image analysis was performed using T2-weighted and diffusion-weighted images independently by two radiologists using dedicated software that performed histogram analysis to assess the distribution of ADC in the whole tumor. After surgery, 16 patients were confirmed to have achieved pCR (18.6%). All parameters from pre- and post-CRT ADC histogram showed good or excellent agreement between the two readers. The minimum, 10th, 25th, 50th, and 75th percentile and mean ADC from post-CRT ADC histogram were significantly higher in the pCR group than in the non-pCR group for both readers. The 25th percentile value from ADC histogram in post-CRT MRI had the best diagnostic performance for detecting pCR, with an area under the receiver operating characteristic curve of 0.796. Low percentile values derived from the ADC histogram analysis of rectal cancer on MRI after CRT showed a significant difference between pCR and non-pCR groups, demonstrating the utility of the ADC value as a quantitative and objective marker to evaluate complete pathologic response to preoperative CRT in rectal cancer. J. Magn. Reson. Imaging 2016;44:212-220. © 2015 Wiley Periodicals, Inc.

  4. Hybrid three-dimensional and support vector machine approach for automatic vehicle tracking and classification using a single camera

    NASA Astrophysics Data System (ADS)

    Kachach, Redouane; Cañas, José María

    2016-05-01

    Using video in traffic monitoring is one of the most active research domains in the computer vision community. TrafficMonitor, a system that employs a hybrid approach for automatic vehicle tracking and classification on highways using a simple stationary calibrated camera, is presented. The proposed system consists of three modules: vehicle detection, vehicle tracking, and vehicle classification. Moving vehicles are detected by an enhanced Gaussian mixture model background estimation algorithm. The design includes a technique to resolve the occlusion problem by using a combination of a two-dimensional proximity tracking algorithm and the Kanade-Lucas-Tomasi feature tracking algorithm. The last module classifies the shapes identified into five vehicle categories: motorcycle, car, van, bus, and truck, by using three-dimensional templates and an algorithm based on the histogram of oriented gradients and a support vector machine classifier. Several experiments have been performed using both real and simulated traffic in order to validate the system. The experiments were conducted on the GRAM-RTM dataset and a real video dataset that is made publicly available as part of this work.
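
    A minimal sketch of the histogram-of-oriented-gradients plus support-vector-machine classification stage mentioned above, using scikit-image and scikit-learn on synthetic patches; the patch size, HOG parameters, and kernel are assumptions, and the detection, tracking, and three-dimensional template modules are omitted.

    import numpy as np
    from skimage.feature import hog
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    def hog_descriptor(gray_patch):
        """HOG descriptor for one vehicle blob, resized beforehand to 64 x 64."""
        return hog(gray_patch, orientations=9, pixels_per_cell=(8, 8),
                   cells_per_block=(2, 2), block_norm="L2-Hys")

    # hypothetical training set: 200 normalised vehicle patches, five classes
    X = np.array([hog_descriptor(np.random.rand(64, 64)) for _ in range(200)])
    y = np.random.randint(0, 5, 200)        # motorcycle, car, van, bus, truck

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", gamma="scale"))
    clf.fit(X, y)
    print(clf.predict(X[:3]))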

  5. A comparison of methods using optical coherence tomography to detect demineralized regions in teeth

    PubMed Central

    Sowa, Michael G.; Popescu, Dan P.; Friesen, Jeri R.; Hewko, Mark D.; Choo-Smith, Lin-P’ing

    2013-01-01

    Optical coherence tomography (OCT) is a three-dimensional optical imaging technique that can be used to identify areas of early caries formation in dental enamel. The OCT signal at 850 nm back-reflected from sound enamel is attenuated more strongly than the signal back-reflected from demineralized regions. To quantify this observation, the OCT signal as a function of depth into the enamel (also known as the A-scan intensity), the histogram of the A-scan intensities and three summary parameters derived from the A-scan are defined and their diagnostic potential compared. A total of 754 OCT A-scans were analyzed. The three summary parameters derived from the A-scans, the OCT attenuation coefficient as well as the mean and standard deviation of the lognormal fit to the histogram of the A-scan ensemble, show statistically significant differences (p < 0.01) when comparing parameters from sound enamel and caries. Furthermore, these parameters only show a modest correlation. Based on the area under the curve (AUC) of the receiver operating characteristics (ROC) plot, the OCT attenuation coefficient shows higher discriminatory capacity (AUC=0.98) compared to the parameters derived from the lognormal fit to the histogram of the A-scan. However, direct analysis of the A-scans or the histogram of A-scan intensities using linear support vector machine classification shows diagnostic discrimination (AUC = 0.96) comparable to that achieved using the attenuation coefficient. These findings suggest that either direct analysis of the A-scan, its intensity histogram or the attenuation coefficient derived from the descending slope of the OCT A-scan has a high capacity to discriminate between regions of caries and sound enamel. PMID:22052833

  6. Histogram Curve Matching Approaches for Object-based Image Classification of Land Cover and Land Use

    PubMed Central

    Toure, Sory I.; Stow, Douglas A.; Weeks, John R.; Kumar, Sunil

    2013-01-01

    The classification of image-objects is usually done using parametric statistical measures of central tendency and/or dispersion (e.g., mean or standard deviation). The objectives of this study were to analyze digital number histograms of image objects and evaluate classification measures exploiting characteristic signatures of such histograms. Two histogram matching classifiers were evaluated and compared to the standard nearest neighbor to mean classifier. An ADS40 airborne multispectral image of San Diego, California, was used for assessing the utility of curve matching classifiers in a geographic object-based image analysis (GEOBIA) approach. The classifications were performed with data sets having 0.5 m, 2.5 m, and 5 m spatial resolutions. Results show that histograms are reliable features for characterizing classes. Also, both histogram matching classifiers consistently performed better than the one based on the standard nearest neighbor to mean rule. The highest classification accuracies were produced with images having 2.5 m spatial resolution. PMID:24403648

  7. Particle swarm optimization-based local entropy weighted histogram equalization for infrared image enhancement

    NASA Astrophysics Data System (ADS)

    Wan, Minjie; Gu, Guohua; Qian, Weixian; Ren, Kan; Chen, Qian; Maldague, Xavier

    2018-06-01

    Infrared image enhancement plays a significant role in intelligent urban surveillance systems for smart city applications. Unlike existing methods that only exaggerate the global contrast, we propose a particle swarm optimization-based local entropy weighted histogram equalization, which involves the enhancement of both local details and fore- and background contrast. First of all, a novel local entropy weighted histogram depicting the distribution of detail information is calculated based on a modified hyperbolic tangent function. Then, the histogram is divided into two parts via a threshold maximizing the inter-class variance in order to improve the contrasts of foreground and background, respectively. To avoid over-enhancement and noise amplification, double plateau thresholds of the presented histogram are formulated by means of a particle swarm optimization algorithm. Lastly, each sub-image is equalized independently according to the constrained sub-local entropy weighted histogram. Comparative experiments implemented on real infrared images prove that our algorithm outperforms other state-of-the-art methods in terms of both visual and quantitative evaluations.
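
    A toy sketch of plateau-limited histogram equalization, the core mechanism described above, with fixed thresholds standing in for the PSO-optimised ones and the plain grey-level histogram standing in for the local entropy weighted histogram; the parameter values are arbitrary.

    import numpy as np

    def double_plateau_equalize(img, t_low, t_up, levels=256):
        """Histogram equalization with double plateau thresholds: bin counts are
        clipped from above (limits over-enhancement of large uniform regions)
        and lifted from below (protects weak details) before the cumulative
        mapping is built."""
        hist, _ = np.histogram(img, bins=levels, range=(0, levels))
        hist = np.clip(hist, t_low, t_up).astype(float)
        cdf = np.cumsum(hist)
        lut = np.round((levels - 1) * cdf / cdf[-1]).astype(np.uint8)
        return lut[img]

    # toy 8-bit infrared-like frame
    frame = (np.random.rand(240, 320) * 255).astype(np.uint8)
    enhanced = double_plateau_equalize(frame, t_low=5, t_up=500)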

  8. New Funding Opportunity - Illuminating the Druggable Genome | Office of Cancer Clinical Proteomics Research

    Cancer.gov

    The National Institutes of Health Common Fund announces two new Funding Opportunity Announcements with a focus on the Illuminating the Druggable Genome (IDG). These funding opportunities are designed to foster the development of technologies and information management to facilitate the unveiling of the functions of the poorly characterized and/or un-annotated members in four protein classes of the Druggable Genome. The IDG project is predicated on the need to fully explore the underlying biology and role in disease of genes linked to already drugged genes within the Druggable Genome.

  9. The transcriptional diversity of 25 Drosophila cell lines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cherbas, Lucy; Willingham, Aarron; Zhang, Dayu

    2010-12-22

    Drosophila melanogaster cell lines are important resources for cell biologists. In this article, we catalog the expression of exons, genes, and unannotated transcriptional signals for 25 lines. Unannotated transcription is substantial (typically 19% of euchromatic signal). Conservatively, we identify 1405 novel transcribed regions; 684 of these appear to be new exons of neighboring, often distant, genes. Sixty-four percent of genes are expressed detectably in at least one line, but only 21% are detected in all lines. Each cell line expresses, on average, 5885 genes, including a common set of 3109. Expression levels vary over several orders of magnitude. Major signaling pathways are well represented: most differentiation pathways are "off" and survival/growth pathways "on." Roughly 50% of the genes expressed by each line are not part of the common set, and these show considerable individuality. Thirty-one percent are expressed at a higher level in at least one cell line than in any single developmental stage, suggesting that each line is enriched for genes characteristic of small sets of cells. Most remarkable is that imaginal disc-derived lines can generally be assigned, on the basis of expression, to small territories within developing discs. These mappings reveal unexpected stability of even fine-grained spatial determination. No two cell lines show identical transcription factor expression. We conclude that each line has retained features of an individual founder cell superimposed on a common "cell line" gene expression pattern. We report the transcriptional profiles of 25 Drosophila melanogaster cell lines, principally by whole-genome tiling microarray analysis of total RNA, carried out as part of the modENCODE project. The data produced in this study add to our knowledge of the cell lines and of the Drosophila transcriptome in several ways. We summarize the expression of previously annotated genes in each of the 25 lines with emphasis on what those patterns reveal about the origins of the lines and the stability of spatial expression patterns. In addition, we offer an initial analysis of previously unannotated transcripts in the cell lines.

  10. Histogram based analysis of lung perfusion of children after congenital diaphragmatic hernia repair.

    PubMed

    Kassner, Nora; Weis, Meike; Zahn, Katrin; Schaible, Thomas; Schoenberg, Stefan O; Schad, Lothar R; Zöllner, Frank G

    2018-05-01

    To investigate a histogram-based approach to characterize the distribution of perfusion in the whole left and right lung by descriptive statistics and to show how histograms could be used to visually explore perfusion defects in two-year-old children after Congenital Diaphragmatic Hernia (CDH) repair. 28 children (age 24.2 ± 1.7 months; all left-sided hernia; 9 after extracorporeal membrane oxygenation therapy) underwent quantitative DCE-MRI of the lung. Segmentations of the left and right lung were manually drawn to mask the calculated pulmonary blood flow maps and then to derive histograms for each lung side. Individual and group-wise analysis of histograms of the left and right lung was performed. Ipsilateral and contralateral lungs show significant differences in shape and descriptive statistics derived from the histogram (Wilcoxon signed-rank test, p < 0.05) at the group-wise and individual level. Subgroup analysis (patients with vs without ECMO therapy) showed no significant differences using histogram-derived parameters. Histogram analysis can be a valuable tool to characterize and visualize whole-lung perfusion of children after CDH repair. It allows for several ways to analyze the data, describing the perfusion differences between the right and left lung as well as exploring and visualizing localized perfusion patterns in the 3D lung volume. Subgroup analysis will be possible given sufficient sample sizes. Copyright © 2017 Elsevier Inc. All rights reserved.

  11. Using the Bootstrap Method for a Statistical Significance Test of Differences between Summary Histograms

    NASA Technical Reports Server (NTRS)

    Xu, Kuan-Man

    2006-01-01

    A new method is proposed to compare statistical differences between summary histograms, which are the histograms summed over a large ensemble of individual histograms. It consists of choosing a distance statistic for measuring the difference between summary histograms and using a bootstrap procedure to calculate the statistical significance level. Bootstrapping is an approach to statistical inference that makes few assumptions about the underlying probability distribution that describes the data. Three distance statistics are compared in this study. They are the Euclidean distance, the Jeffries-Matusita distance and the Kuiper distance. The data used in testing the bootstrap method are satellite measurements of cloud systems called cloud objects. Each cloud object is defined as a contiguous region/patch composed of individual footprints or fields of view. A histogram of measured values over footprints is generated for each parameter of each cloud object and then summary histograms are accumulated over all individual histograms in a given cloud-object size category. The results of statistical hypothesis tests using all three distances as test statistics are generally similar, indicating the validity of the proposed method. The Euclidean distance is determined to be most suitable after comparing the statistical tests of several parameters with distinct probability distributions among three cloud-object size categories. Impacts on the statistical significance levels resulting from differences in the total lengths of satellite footprint data between two size categories are also discussed.
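
    A minimal sketch of the proposed procedure with the Euclidean distance as the test statistic: individual histograms are resampled with replacement from the pooled ensemble to build the null distribution of the distance between two summary histograms. The toy data and the exact resampling scheme are simplifications, not the cloud-object analysis itself.

    import numpy as np

    def summary_hist(individual):
        """Sum individual histograms (rows) and normalise to unit area."""
        total = individual.sum(axis=0).astype(float)
        return total / total.sum()

    def bootstrap_pvalue(group_a, group_b, n_boot=2000, seed=0):
        """Significance of the Euclidean distance between two summary histograms,
        resampling individual histograms from the pooled ensemble under the null
        hypothesis that both groups come from one distribution."""
        rng = np.random.default_rng(seed)
        observed = np.linalg.norm(summary_hist(group_a) - summary_hist(group_b))
        pooled = np.vstack([group_a, group_b])
        exceed = 0
        for _ in range(n_boot):
            idx_a = rng.integers(0, len(pooled), size=len(group_a))
            idx_b = rng.integers(0, len(pooled), size=len(group_b))
            d = np.linalg.norm(summary_hist(pooled[idx_a]) - summary_hist(pooled[idx_b]))
            exceed += d >= observed
        return exceed / n_boot

    # toy ensembles of individual histograms (40 and 60 objects, 20 bins each)
    a = np.random.poisson(5.0, size=(40, 20))
    b = np.random.poisson(5.0, size=(60, 20))
    print(bootstrap_pvalue(a, b))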

  12. Improved automatic adjustment of density and contrast in FCR system using neural network

    NASA Astrophysics Data System (ADS)

    Takeo, Hideya; Nakajima, Nobuyoshi; Ishida, Masamitsu; Kato, Hisatoyo

    1994-05-01

    The FCR system automatically adjusts image density and contrast by analyzing the histogram of image data in the radiation field. The advanced image recognition methods proposed in this paper, which use neural network technology, can improve this automatic adjustment performance. There are two methods, both based on a three-layer neural network with backpropagation. In one method the image data are input directly to the input layer; in the other, the histogram data are input. The former is effective for imaging menus such as the shoulder joint, in which the position occupied by the region of interest on the histogram changes with differences in positioning, and the latter is effective for imaging menus such as the pediatric chest, in which the histogram shape changes with differences in positioning. We experimentally confirm the validity of these methods with respect to automatic adjustment performance, compared with conventional histogram analysis methods.

  13. Correlation between friction and thickness of vanadium-pentoxide nanowires

    NASA Astrophysics Data System (ADS)

    Kim, Taekyeong

    2015-11-01

    We investigated the correlation between friction and thickness of vanadium-pentoxide nanowires (V2O5 NWs) by using friction/atomic force microscopy (FFM/AFM). We observed that the friction signal generally increased with thickness in the FFM/AFM image of the V2O5 NWs. We constructed a two-dimensional (2D) correlation distribution of the frictional force and the thickness of the V2O5 NWs and found that they are strongly correlated; i.e., thicker NWs had higher friction. We also generated a histogram for the correlation factors obtained from each distribution and found that the most probable factor is ~0.45. Furthermore, we found that the adhesion force between the tip and the V2O5 NWs was about -3 nN, and that the friction increased with increasing applied load for different thicknesses of V2O5 NWs. Our results provide an understanding of tribological and nanomechanical studies of various one-dimensional NWs for future fundamental research.

  14. Phase Transitions in a Model of Y-Molecules

    NASA Astrophysics Data System (ADS)

    Holz, Danielle; Ruth, Donovan; Toral, Raul; Gunton, James

    Immunoglobulin is a Y-shaped molecule that functions as an antibody to neutralize pathogens. In special cases where there is a high concentration of immunoglobulin molecules, self-aggregation can occur and the molecules undergo phase transitions. This prevents the molecules from performing their function. We used a simplified model of two-dimensional Y-molecules with three identical arms on a triangular lattice, simulated in the two-dimensional grand canonical ensemble. The molecules were permitted to be placed, removed, rotated or moved on the lattice. Once phase coexistence was found, we used histogram reweighting and multicanonical sampling to calculate our phase diagram.
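
    As a rough illustration of the histogram reweighting step mentioned above, the following sketch applies generic single-histogram (Ferrenberg-Swendsen) reweighting to a toy energy histogram; it is not the Y-molecule simulation code, and the energies, counts, and temperatures are made up.

    import numpy as np

    def reweight_mean_energy(energies, counts, beta0, beta):
        """Single-histogram reweighting: from an energy histogram sampled at
        inverse temperature beta0, estimate the mean energy at a nearby beta by
        weighting each bin with exp(-(beta - beta0) * E)."""
        log_w = np.log(counts + 1e-300) - (beta - beta0) * energies
        log_w -= log_w.max()                           # guard against overflow
        w = np.exp(log_w)
        return np.sum(w * energies) / np.sum(w)

    # toy histogram of energies measured in a run at beta0
    E = np.linspace(-200.0, 0.0, 101)
    counts = np.exp(-0.5 * ((E + 100.0) / 15.0) ** 2)  # stand-in for measured counts
    print(reweight_mean_energy(E, counts, beta0=1.0, beta=1.02))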

  15. Discourse Analysis in Stylistics and Literature Instruction.

    ERIC Educational Resources Information Center

    Short, Mick

    1990-01-01

    A review of research regarding discourse analysis in stylistics and literature instruction covers studies of text, systematic analysis, meaning, style, literature pedagogy, and applied linguistics. A 10-citation annotated bibliography and a larger unannotated bibliography are included. (CB)

  16. Improved image retrieval based on fuzzy colour feature vector

    NASA Astrophysics Data System (ADS)

    Ben-Ahmeida, Ahlam M.; Ben Sasi, Ahmed Y.

    2013-03-01

    One of the image indexing techniques is content-based image retrieval (CBIR), an efficient way of retrieving images from an image database automatically based on their visual contents such as colour, texture, and shape. This paper discusses a CBIR method based on colour feature extraction and similarity checking. The query image and all images in the database are divided into pieces, the features of each part are extracted separately, and the corresponding portions are compared in order to increase the retrieval accuracy. The proposed approach is based on the use of fuzzy sets to overcome the curse of dimensionality. The colour contribution of each pixel is associated with all the bins in the histogram using fuzzy-set membership functions. As a result, the Fuzzy Colour Histogram (FCH) outperformed the Conventional Colour Histogram (CCH) in image retrieval, owing to its speed: images were represented as signatures that required less memory, depending on the number of divisions. The results also showed that the FCH is less sensitive and more robust to brightness changes than the CCH, with better retrieval recall values.
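
    A minimal sketch of the fuzzy colour histogram idea: each pixel contributes to neighbouring bins through a triangular membership function instead of falling into a single hard bin. The membership shape, bin count, and per-channel handling are assumptions; the paper's exact membership design may differ.

    import numpy as np

    def fuzzy_colour_histogram(channel, n_bins=16):
        """Fuzzy histogram of one 8-bit colour channel: every pixel contributes to
        the nearest bin centres with weights from a triangular membership function."""
        centres = (np.arange(n_bins) + 0.5) * 256.0 / n_bins
        width = 256.0 / n_bins
        values = channel.astype(float).ravel()[:, None]        # pixels x 1
        membership = np.clip(1.0 - np.abs(values - centres) / width, 0.0, 1.0)
        hist = membership.sum(axis=0)
        return hist / hist.sum()

    # toy image channel and its fuzzy signature
    red = (np.random.rand(100, 100) * 255).astype(np.uint8)
    print(fuzzy_colour_histogram(red))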

  17. Assessing clutter reduction in parallel coordinates using image processing techniques

    NASA Astrophysics Data System (ADS)

    Alhamaydh, Heba; Alzoubi, Hussein; Almasaeid, Hisham

    2018-01-01

    Information visualization has emerged as an important research field for multidimensional data and correlation analysis in recent years. Parallel coordinates (PCs) are one of the popular techniques for visualizing high-dimensional data. A problem with the PC technique is that it suffers from crowding, a clutter that hides important data and obscures the information. Earlier research has been conducted to reduce clutter without loss of data content. We introduce the use of image processing techniques as an approach for assessing the performance of clutter reduction techniques in PCs. We use histogram analysis as our first measure, where the mean feature of the color histograms of the possible alternative orderings of coordinates for the PC images is calculated and compared. The second measure is the contrast feature extracted from the texture of PC images based on gray-level co-occurrence matrices. The results show that the best PC image is the one that has the minimal mean value of the color histogram feature and the maximal contrast value of the texture feature. In addition to its simplicity, the proposed assessment method has the advantage of objectively assessing alternative orderings of the PC visualization.
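
    The following sketch shows how the two measures described above might be computed for one rendered parallel-coordinates image: a mean intensity derived from its histogram and the GLCM contrast of its texture. The interpretation of the histogram mean, the GLCM settings, and the scikit-image function names (as in recent library versions) are assumptions.

    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    def clutter_scores(pc_image_gray):
        """Two scores for one rendered parallel-coordinates image (uint8): the
        histogram-weighted mean intensity and the GLCM contrast of the texture."""
        hist, _ = np.histogram(pc_image_gray, bins=256, range=(0, 256))
        hist_mean = np.average(np.arange(256), weights=hist)
        glcm = graycomatrix(pc_image_gray, distances=[1], angles=[0],
                            levels=256, symmetric=True, normed=True)
        contrast = graycoprops(glcm, "contrast")[0, 0]
        return hist_mean, contrast

    # toy rendering; in practice each candidate axis ordering would be scored
    image = (np.random.rand(300, 300) * 255).astype(np.uint8)
    print(clutter_scores(image))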

  18. Identification of genetic elements in metabolism by high-throughput mouse phenotyping.

    PubMed

    Rozman, Jan; Rathkolb, Birgit; Oestereicher, Manuela A; Schütt, Christine; Ravindranath, Aakash Chavan; Leuchtenberger, Stefanie; Sharma, Sapna; Kistler, Martin; Willershäuser, Monja; Brommage, Robert; Meehan, Terrence F; Mason, Jeremy; Haselimashhadi, Hamed; Hough, Tertius; Mallon, Ann-Marie; Wells, Sara; Santos, Luis; Lelliott, Christopher J; White, Jacqueline K; Sorg, Tania; Champy, Marie-France; Bower, Lynette R; Reynolds, Corey L; Flenniken, Ann M; Murray, Stephen A; Nutter, Lauryl M J; Svenson, Karen L; West, David; Tocchini-Valentini, Glauco P; Beaudet, Arthur L; Bosch, Fatima; Braun, Robert B; Dobbie, Michael S; Gao, Xiang; Herault, Yann; Moshiri, Ala; Moore, Bret A; Kent Lloyd, K C; McKerlie, Colin; Masuya, Hiroshi; Tanaka, Nobuhiko; Flicek, Paul; Parkinson, Helen E; Sedlacek, Radislav; Seong, Je Kyung; Wang, Chi-Kuang Leo; Moore, Mark; Brown, Steve D; Tschöp, Matthias H; Wurst, Wolfgang; Klingenspor, Martin; Wolf, Eckhard; Beckers, Johannes; Machicao, Fausto; Peter, Andreas; Staiger, Harald; Häring, Hans-Ulrich; Grallert, Harald; Campillos, Monica; Maier, Holger; Fuchs, Helmut; Gailus-Durner, Valerie; Werner, Thomas; Hrabe de Angelis, Martin

    2018-01-18

    Metabolic diseases are a worldwide problem, but the underlying genetic factors and their relevance to metabolic disease remain incompletely understood. Genome-wide research is needed to characterize so-far unannotated mammalian metabolic genes. Here, we generate and analyze metabolic phenotypic data of 2,016 knockout mouse strains under the aegis of the International Mouse Phenotyping Consortium (IMPC) and find 974 gene knockouts with strong metabolic phenotypes. 429 of those had no previous link to metabolism and 51 genes remain functionally completely unannotated. We compared human orthologues of these uncharacterized genes in five GWAS consortia and indeed 23 candidate genes are associated with metabolic disease. We further identify common regulatory elements in promoters of candidate genes. As each regulatory element is composed of several transcription factor binding sites, our data reveal an extensive metabolic phenotype-associated network of co-regulated genes. Our systematic mouse phenotype analysis thus paves the way for full functional annotation of the genome.

  19. Potential of MR histogram analyses for prediction of response to chemotherapy in patients with colorectal hepatic metastases.

    PubMed

    Liang, He-Yue; Huang, Ya-Qin; Yang, Zhao-Xia; Ying-Ding; Zeng, Meng-Su; Rao, Sheng-Xiang

    2016-07-01

    To determine if magnetic resonance imaging (MRI) histogram analyses can help predict response to chemotherapy in patients with colorectal hepatic metastases by using response evaluation criteria in solid tumours (RECIST 1.1) as the reference standard. Standard MRI including diffusion-weighted imaging (b=0, 500 s/mm(2)) was performed before chemotherapy in 53 patients with colorectal hepatic metastases. Histograms were generated for apparent diffusion coefficient (ADC) maps and arterial and portal venous phase images; thereafter, the mean, percentiles (1st, 10th, 50th, 90th, 99th), skewness, kurtosis, and variance were computed. Quantitative histogram parameters were compared between responders (partial and complete response, n=15) and non-responders (progressive and stable disease, n=38). Receiver operating characteristic (ROC) analyses were further performed for the significant parameters. The mean and the 1st, 10th, 50th, 90th, and 99th percentiles of the ADC maps were significantly lower in the responding group than in the non-responding group (p=0.000-0.002), with areas under the ROC curve (AUCs) of 0.76-0.82. The histogram parameters of the arterial and portal venous phases showed no significant difference (p>0.05) between the two groups. Histogram-derived parameters for ADC maps seem to be a promising tool for predicting response to chemotherapy in patients with colorectal hepatic metastases. • ADC histogram analyses can potentially predict chemotherapy response in colorectal liver metastases. • Lower histogram-derived ADC parameters (mean, percentiles) tend to indicate good response. • MR enhancement histogram analyses are not reliable for predicting response.

  20. Histogram-based normalization technique on human brain magnetic resonance images from different acquisitions.

    PubMed

    Sun, Xiaofei; Shi, Lin; Luo, Yishan; Yang, Wei; Li, Hongpeng; Liang, Peipeng; Li, Kuncheng; Mok, Vincent C T; Chu, Winnie C W; Wang, Defeng

    2015-07-28

    Intensity normalization is an important preprocessing step in brain magnetic resonance image (MRI) analysis. During MR image acquisition, different scanners or parameters may be used for scanning different subjects or the same subject at a different time, which may result in large intensity variations. This intensity variation will greatly undermine the performance of subsequent MRI processing and population analysis, such as image registration, segmentation, and tissue volume measurement. In this work, we proposed a new histogram normalization method to reduce the intensity variation between MRIs obtained from different acquisitions. In our experiment, we scanned each subject twice on two different scanners using different imaging parameters. With noise estimation, the image with the lower noise level was determined and treated as the high-quality reference image. Then the histogram of the low-quality image was normalized to the histogram of the high-quality image. The normalization algorithm includes two main steps: (1) intensity scaling (IS), where, for the high-quality reference image, the intensities of the image are first rescaled to a range between the low intensity region (LIR) value and the high intensity region (HIR) value; and (2) histogram normalization (HN), where the histogram of the low-quality input image is stretched to match the histogram of the reference image, so that the intensity range in the normalized image will also lie between LIR and HIR. We performed three sets of experiments to evaluate the proposed method, i.e., image registration, segmentation, and tissue volume measurement, and compared it with an existing intensity normalization method. The results validate that our histogram normalization framework achieves better results in all the experiments. It is also demonstrated that the brain template with normalization preprocessing is of higher quality than the template with no normalization processing. We have proposed a histogram-based MRI intensity normalization method. The method can normalize scans which were acquired on different MRI units. We have validated that the method can greatly improve the image analysis performance. Furthermore, it is demonstrated that with the help of our normalization method, we can create a higher quality Chinese brain template.
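
    A minimal sketch of the histogram normalization (HN) step as classic cumulative-distribution matching between a low-quality image and a reference; the intensity scaling (IS) stage and the exact LIR/HIR definitions from the paper are omitted, and the bin count and test images are arbitrary.

    import numpy as np

    def match_histogram(source, reference, levels=1024):
        """Map source intensities so that their histogram matches the reference
        histogram (CDF matching); both images are assumed to share a comparable
        intensity range after any prior scaling."""
        s_hist, s_edges = np.histogram(source, bins=levels)
        r_hist, r_edges = np.histogram(reference, bins=levels)
        s_cdf = np.cumsum(s_hist) / source.size
        r_cdf = np.cumsum(r_hist) / reference.size
        r_centres = 0.5 * (r_edges[:-1] + r_edges[1:])
        # for each source bin, pick the reference intensity with the same CDF value
        mapping = r_centres[np.clip(np.searchsorted(r_cdf, s_cdf), 0, levels - 1)]
        idx = np.clip(np.digitize(source, s_edges) - 1, 0, levels - 1)
        return mapping[idx]

    low_quality = np.random.gamma(2.0, 200.0, size=(128, 128))    # stand-in MRI
    reference = np.random.gamma(2.5, 180.0, size=(128, 128))      # stand-in reference
    normalised = match_histogram(low_quality, reference)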

  1. Dynamic contrast-enhanced MR imaging of the rectum: Correlations between single-section and whole-tumor histogram analyses.

    PubMed

    Choi, M H; Oh, S N; Park, G E; Yeo, D-M; Jung, S E

    2018-05-10

    To evaluate the interobserver and intermethod correlations of histogram metrics of dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) parameters acquired by multiple readers using the single-section and whole-tumor volume methods. Four DCE parameters (Ktrans, Kep, Ve, Vp) were evaluated in 45 patients (31 men and 14 women; mean age, 61 ± 11 years [range, 29-83 years]) with locally advanced rectal cancer using pre-chemoradiotherapy (CRT) MRI. Ten histogram metrics were extracted using two methods of lesion selection performed by three radiologists: the whole-tumor volume method for the whole tumor on axial section-by-section images and the single-section method for the entire area of the tumor on one axial image. The interobserver and intermethod correlations were evaluated using the intraclass correlation coefficients (ICCs). The ICCs showed excellent interobserver and intermethod correlations in most of the histogram metrics of the DCE parameters. The ICCs among the three readers were > 0.7 (P<0.001) for all histogram metrics, except for the minimum and maximum. The intermethod correlations for most of the histogram metrics were excellent for each radiologist, regardless of the differences in the radiologists' experience. The interobserver and intermethod correlations for most of the histogram metrics of the DCE parameters are excellent in rectal cancer. Therefore, the single-section method may be a potential alternative to the whole-tumor volume method using pre-CRT MRI, despite the fact that the high agreement between the two methods cannot be extrapolated to post-CRT MRI. Copyright © 2018 Société française de radiologie. Published by Elsevier Masson SAS. All rights reserved.

  2. Comparison between types I and II epithelial ovarian cancer using histogram analysis of monoexponential, biexponential, and stretched-exponential diffusion models.

    PubMed

    Wang, Feng; Wang, Yuxiang; Zhou, Yan; Liu, Congrong; Xie, Lizhi; Zhou, Zhenyu; Liang, Dong; Shen, Yang; Yao, Zhihang; Liu, Jianyu

    2017-12-01

    To evaluate the utility of histogram analysis of monoexponential, biexponential, and stretched-exponential models for a dualistic model of epithelial ovarian cancer (EOC). Fifty-two patients with histopathologically proven EOC underwent preoperative magnetic resonance imaging (MRI) (including diffusion-weighted imaging [DWI] with 11 b-values) using a 3.0T system and were divided into two groups: types I and II. Apparent diffusion coefficient (ADC), true diffusion coefficient (D), pseudodiffusion coefficient (D*), perfusion fraction (f), distributed diffusion coefficient (DDC), and intravoxel water diffusion heterogeneity (α) histograms were obtained based on solid components of the entire tumor. The following metrics of each histogram were compared between the two types: 1) mean; 2) median; 3) 10th percentile and 90th percentile. Conventional MRI morphological features were also recorded. Significant morphological features for predicting EOC type were maximum diameter (P = 0.007), texture of lesion (P = 0.001), and peritoneal implants (P = 0.001). For ADC, D, f, DDC, and α, all metrics were significantly lower in type II than type I (P < 0.05). Mean, median, 10th, and 90th percentile of D* were not significantly different (P = 0.336, 0.154, 0.779, and 0.203, respectively). Most histogram metrics of ADC, D, and DDC had significantly higher area under the receiver operating characteristic curve values than those of f and α (P < 0.05). CONCLUSION: It is feasible to grade EOC by morphological features and three models with histogram analysis. ADC, D, and DDC have better performance than f and α; f and α may provide additional information. Level of Evidence: 4. Technical Efficacy: Stage 1. J. Magn. Reson. Imaging 2017;46:1797-1809. © 2017 International Society for Magnetic Resonance in Medicine.

  3. Variations of attractors and wavelet spectra of the immunofluorescence distributions for women in the pregnant period

    NASA Astrophysics Data System (ADS)

    Galich, Nikolay E.

    2008-07-01

    This communication describes the treatment of immunology data. New nonlinear methods for the statistical analysis of immunofluorescence of peripheral blood neutrophils have been developed. We used the respiratory burst reaction of DNA fluorescence in neutrophil cell nuclei, which arises from oxidative activity. The histograms of photon count statistics of the radiant neutrophil populations in flow cytometry experiments are considered. Distributions of the fluorescence flash frequency as functions of the fluorescence intensity are analyzed. Statistical peculiarities of the histogram set for women in the pregnant period allow all histograms to be divided into three classes. The classification is based on three different types of smoothed and long-range scale-averaged immunofluorescence distributions, their bifurcations and wavelet spectra. Heterogeneity peculiarities of long-range scale immunofluorescence distributions and peculiarities of wavelet spectra allow all histograms to be divided into three groups. The first group of histograms belongs to healthy donors. The two other groups belong to donors with autoimmune and inflammatory diseases. Some of the illnesses are not diagnosed by standard biochemical methods. Medical standards and statistical data of the immunofluorescence histograms for the identification of health and illness are interconnected. Peculiarities of immunofluorescence for women in the pregnant period are classified. Health or illness criteria are connected with statistical features of the immunofluorescence histograms. The fluorescence of neutrophil populations is a sensitive and clear indicator of health status.

  4. Complexity of possibly gapped histogram and analysis of histogram.

    PubMed

    Fushing, Hsieh; Roy, Tania

    2018-02-01

    We demonstrate that gaps and distributional patterns embedded within real-valued measurements are inseparable biological and mechanistic information contents of the system. Such patterns are discovered through data-driven possibly gapped histogram, which further leads to the geometry-based analysis of histogram (ANOHT). Constructing a possibly gapped histogram is a complex problem of statistical mechanics due to the ensemble of candidate histograms being captured by a two-layer Ising model. This construction is also a distinctive problem of Information Theory from the perspective of data compression via uniformity. By defining a Hamiltonian (or energy) as a sum of total coding lengths of boundaries and total decoding errors within bins, this issue of computing the minimum energy macroscopic states is surprisingly resolved by applying the hierarchical clustering algorithm. Thus, a possibly gapped histogram corresponds to a macro-state. And then the first phase of ANOHT is developed for simultaneous comparison of multiple treatments, while the second phase of ANOHT is developed based on classical empirical process theory for a tree-geometry that can check the authenticity of branches of the treatment tree. The well-known Iris data are used to illustrate our technical developments. Also, a large baseball pitching dataset and a heavily right-censored divorce data are analysed to showcase the existential gaps and utilities of ANOHT.

  5. Complexity of possibly gapped histogram and analysis of histogram

    PubMed Central

    Roy, Tania

    2018-01-01

    We demonstrate that gaps and distributional patterns embedded within real-valued measurements are inseparable biological and mechanistic information contents of the system. Such patterns are discovered through data-driven possibly gapped histogram, which further leads to the geometry-based analysis of histogram (ANOHT). Constructing a possibly gapped histogram is a complex problem of statistical mechanics due to the ensemble of candidate histograms being captured by a two-layer Ising model. This construction is also a distinctive problem of Information Theory from the perspective of data compression via uniformity. By defining a Hamiltonian (or energy) as a sum of total coding lengths of boundaries and total decoding errors within bins, this issue of computing the minimum energy macroscopic states is surprisingly resolved by applying the hierarchical clustering algorithm. Thus, a possibly gapped histogram corresponds to a macro-state. And then the first phase of ANOHT is developed for simultaneous comparison of multiple treatments, while the second phase of ANOHT is developed based on classical empirical process theory for a tree-geometry that can check the authenticity of branches of the treatment tree. The well-known Iris data are used to illustrate our technical developments. Also, a large baseball pitching dataset and a heavily right-censored divorce data are analysed to showcase the existential gaps and utilities of ANOHT. PMID:29515829

  6. Complexity of possibly gapped histogram and analysis of histogram

    NASA Astrophysics Data System (ADS)

    Fushing, Hsieh; Roy, Tania

    2018-02-01

    We demonstrate that gaps and distributional patterns embedded within real-valued measurements are inseparable biological and mechanistic information contents of the system. Such patterns are discovered through data-driven possibly gapped histogram, which further leads to the geometry-based analysis of histogram (ANOHT). Constructing a possibly gapped histogram is a complex problem of statistical mechanics due to the ensemble of candidate histograms being captured by a two-layer Ising model. This construction is also a distinctive problem of Information Theory from the perspective of data compression via uniformity. By defining a Hamiltonian (or energy) as a sum of total coding lengths of boundaries and total decoding errors within bins, this issue of computing the minimum energy macroscopic states is surprisingly resolved by applying the hierarchical clustering algorithm. Thus, a possibly gapped histogram corresponds to a macro-state. And then the first phase of ANOHT is developed for simultaneous comparison of multiple treatments, while the second phase of ANOHT is developed based on classical empirical process theory for a tree-geometry that can check the authenticity of branches of the treatment tree. The well-known Iris data are used to illustrate our technical developments. Also, a large baseball pitching dataset and a heavily right-censored divorce data are analysed to showcase the existential gaps and utilities of ANOHT.
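
    As a loose illustration of the clustering idea above, the following sketch uses single-linkage hierarchical clustering of one-dimensional data to propose blocks separated by gaps, each of which could then be binned on its own; the gap threshold is an ad hoc assumption, and this is not the paper's energy-minimising construction.

    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage

    def possibly_gapped_blocks(x, gap_factor=3.0):
        """Split sorted 1-D data wherever neighbouring points are unusually far
        apart (single-linkage clustering cut at a distance threshold); each
        resulting cluster becomes one block of a possibly gapped histogram."""
        x = np.sort(np.asarray(x, dtype=float))
        Z = linkage(x[:, None], method="single")
        gaps = np.diff(x)
        threshold = gap_factor * np.median(gaps[gaps > 0])
        labels = fcluster(Z, t=threshold, criterion="distance")
        return sorted((x[labels == k].min(), x[labels == k].max())
                      for k in np.unique(labels))

    # bimodal toy data with an obvious gap between the two modes
    data = np.concatenate([np.random.normal(0, 1, 300), np.random.normal(10, 1, 300)])
    print(possibly_gapped_blocks(data))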

  7. Apparent diffusion coefficient histogram shape analysis for monitoring early response in patients with advanced cervical cancers undergoing concurrent chemo-radiotherapy.

    PubMed

    Meng, Jie; Zhu, Lijing; Zhu, Li; Wang, Huanhuan; Liu, Song; Yan, Jing; Liu, Baorui; Guan, Yue; Ge, Yun; He, Jian; Zhou, Zhengyang; Yang, Xiaofeng

    2016-10-22

    To explore the role of apparent diffusion coefficient (ADC) histogram shape-related parameters in early assessment of treatment response during the concurrent chemo-radiotherapy (CCRT) course of advanced cervical cancers. This prospective study was approved by the local ethics committee and informed consent was obtained from all patients. Thirty-two patients with advanced cervical squamous cell carcinomas underwent diffusion-weighted magnetic resonance imaging (b values, 0 and 800 s/mm²) before CCRT, at the end of the 2nd and 4th weeks during CCRT, and immediately after CCRT completion. Whole-lesion ADC histogram analysis generated several histogram shape-related parameters including skewness, kurtosis, s-sDav, width, standard deviation, as well as first-order entropy and second-order entropies. The averaged ADC histograms of the 32 patients were generated to visually observe dynamic changes of the histogram shape following CCRT. All parameters except width and standard deviation showed significant changes during CCRT (all P < 0.05), and their variation trends fell into four different patterns. Skewness and kurtosis both showed a high early decline rate (43.10%, 48.29%) at the end of the 2nd week of CCRT. All entropies kept decreasing significantly from 2 weeks after CCRT was initiated. The shape of the averaged ADC histogram also changed markedly following CCRT. ADC histogram shape analysis holds potential for monitoring early tumor response in patients with advanced cervical cancers undergoing CCRT.

  8. Dissimilarity representations in lung parenchyma classification

    NASA Astrophysics Data System (ADS)

    Sørensen, Lauge; de Bruijne, Marleen

    2009-02-01

    A good problem representation is important for a pattern recognition system to be successful. The traditional approach to statistical pattern recognition is feature representation. More specifically, objects are represented by a number of features in a feature vector space, and classifiers are built in this representation. This is also the general trend in lung parenchyma classification in computed tomography (CT) images, where the features often are measures on feature histograms. Instead, we propose to build normal density based classifiers in dissimilarity representations for lung parenchyma classification. This allows for the classifiers to work on dissimilarities between objects, which might be a more natural way of representing lung parenchyma. In this context, dissimilarity is defined between CT regions of interest (ROIs). ROIs are represented by their CT attenuation histogram and ROI dissimilarity is defined as a histogram dissimilarity measure between the attenuation histograms. In this setting, the full histograms are utilized according to the chosen histogram dissimilarity measure. We apply this idea to classification of different emphysema patterns as well as normal, healthy tissue. Two dissimilarity representation approaches as well as different histogram dissimilarity measures are considered. The approaches are evaluated on a set of 168 CT ROIs using normal density based classifiers, all showing good performance. Compared to using histogram dissimilarity directly as distance in a k nearest neighbor classifier, which achieves a classification accuracy of 92.9%, the best dissimilarity representation based classifier is significantly better, with a classification accuracy of 97.0% (p = 0.046).

  9. Whole-lesion apparent diffusion coefficient histogram analysis: significance in T and N staging of gastric cancers.

    PubMed

    Liu, Song; Zhang, Yujuan; Chen, Ling; Guan, Wenxian; Guan, Yue; Ge, Yun; He, Jian; Zhou, Zhengyang

    2017-10-02

    Whole-lesion apparent diffusion coefficient (ADC) histogram analysis has been introduced and proved effective in the assessment of multiple tumors. However, the application of whole-volume ADC histogram analysis to gastrointestinal tumors has just started and has never been reported for T and N staging of gastric cancers. Eighty patients with pathologically confirmed gastric carcinomas underwent diffusion weighted (DW) magnetic resonance imaging before surgery prospectively. Whole-lesion ADC histogram analysis was performed by two radiologists independently. The differences in ADC histogram parameters among different T and N stages were compared with the independent-samples Kruskal-Wallis test. Receiver operating characteristic (ROC) analysis was performed to evaluate the performance of ADC histogram parameters in differentiating particular T or N stages of gastric cancers. There were significant differences in all the ADC histogram parameters for gastric cancers at different T (except ADCmin and ADCmax) and N (except ADCmax) stages. Most ADC histogram parameters differed significantly between T1 vs T3, T1 vs T4, T2 vs T4, N0 vs N1, and N0 vs N3, and some parameters (ADC5%, ADC10%, ADCmin) differed significantly between N0 vs N2 and N2 vs N3 (all P < 0.05). Most parameters except ADCmax performed well in differentiating different T and N stages of gastric cancers. Especially for identifying patients with and without lymph node metastasis, the ADC10% yielded the largest area under the ROC curve of 0.794 (95% confidence interval, 0.677-0.911). All the parameters except ADCmax showed excellent inter-observer agreement with intra-class correlation coefficients higher than 0.800. Whole-volume ADC histogram parameters held great potential for differentiating different T and N stages of gastric cancers preoperatively.
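
    A small sketch of the percentile-type histogram parameters and of a single-parameter ROC evaluation, assuming illustrative arrays of lesion ADC values and per-patient nodal status; it is not the authors' analysis code.

      import numpy as np
      from sklearn.metrics import roc_auc_score

      def adc_percentile_parameters(adc_values):
          return {
              "ADCmin": float(np.min(adc_values)),
              "ADC5": float(np.percentile(adc_values, 5)),
              "ADC10": float(np.percentile(adc_values, 10)),
              "ADCmean": float(np.mean(adc_values)),
              "ADCmax": float(np.max(adc_values)),
          }

      # Usage sketch for one parameter (lower ADC10 taken as indicating nodal disease):
      # adc10 = np.array([adc_percentile_parameters(v)["ADC10"] for v in lesions])
      # auc = roc_auc_score(node_positive, -adc10)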

  10. Dose-volume histogram prediction using density estimation.

    PubMed

    Skarpman Munter, Johanna; Sjölund, Jens

    2015-09-07

    Knowledge of what dose-volume histograms can be expected for a previously unseen patient could increase consistency and quality in radiotherapy treatment planning. We propose a machine learning method that uses previous treatment plans to predict such dose-volume histograms. The key to the approach is the framing of dose-volume histograms in a probabilistic setting. The training consists of estimating, from the patients in the training set, the joint probability distribution of some predictive features and the dose. The joint distribution immediately provides an estimate of the conditional probability of the dose given the values of the predictive features. The prediction consists of estimating, from the new patient, the distribution of the predictive features and marginalizing the conditional probability from the training over this. Integrating the resulting probability distribution for the dose yields an estimate of the dose-volume histogram. To illustrate how the proposed method relates to previously proposed methods, we use the signed distance to the target boundary as a single predictive feature. As a proof-of-concept, we predicted dose-volume histograms for the brainstems of 22 acoustic schwannoma patients treated with stereotactic radiosurgery, and for the lungs of 9 lung cancer patients treated with stereotactic body radiation therapy. Comparing with two previous attempts at dose-volume histogram prediction we find that, given the same input data, the predictions are similar. In summary, we propose a method for dose-volume histogram prediction that exploits the intrinsic probabilistic properties of dose-volume histograms. We argue that the proposed method makes up for some deficiencies in previously proposed methods, thereby potentially increasing ease of use, flexibility and ability to perform well with small amounts of training data.
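
    A coarse, histogram-based sketch of the described pipeline with the signed distance to the target boundary as the single predictive feature; simple binning stands in for the paper's density estimation, and the array names (dist_train, dose_train, dist_new) are assumptions.

      import numpy as np

      def predict_dvh(dist_train, dose_train, dist_new, n_dist_bins=50, n_dose_bins=100):
          d_edges = np.linspace(min(dist_train.min(), dist_new.min()),
                                max(dist_train.max(), dist_new.max()), n_dist_bins + 1)
          dose_edges = np.linspace(0.0, dose_train.max(), n_dose_bins + 1)

          # Joint distribution of (distance, dose) from the training plans,
          # normalized per distance bin to give p(dose | distance).
          joint, _, _ = np.histogram2d(dist_train, dose_train, bins=[d_edges, dose_edges])
          cond = joint / np.clip(joint.sum(axis=1, keepdims=True), 1, None)

          # Distance distribution of the new patient's organ, then marginalization.
          p_dist, _ = np.histogram(dist_new, bins=d_edges)
          p_dist = p_dist / p_dist.sum()
          p_dose = p_dist @ cond                      # predicted differential dose histogram

          # Cumulative DVH: fraction of volume receiving at least each dose level.
          dvh = np.cumsum(p_dose[::-1])[::-1]
          return dose_edges[:-1], dvh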

  11. Selection and evaluation of optimal two-dimensional CAIPIRINHA kernels applied to time-resolved three-dimensional CE-MRA.

    PubMed

    Weavers, Paul T; Borisch, Eric A; Riederer, Stephen J

    2015-06-01

    To develop and validate a method for choosing the optimal two-dimensional CAIPIRINHA kernel for subtraction contrast-enhanced MR angiography (CE-MRA) and estimate the degree of image quality improvement versus that of some reference acceleration parameter set at R ≥ 8. A metric based on patient-specific coil calibration information was defined for evaluating optimality of CAIPIRINHA kernels as applied to subtraction CE-MRA. Evaluation in retrospective studies using archived coil calibration data from abdomen, calf, foot, and hand CE-MRA exams was accomplished with an evaluation metric comparing the geometry factor (g-factor) histograms. Prospective calf, foot, and hand CE-MRA studies were evaluated with vessel signal-to-noise ratio (SNR). Retrospective studies show g-factor improvement for the selected CAIPIRINHA kernels was significant in the feet, moderate in the abdomen, and modest in the calves and hands. Prospective CE-MRA studies using optimal CAIPIRINHA show reduced noise amplification with identical acquisition time in studies of the feet, with minor improvements in the hands and calves. A method for selection of the optimal CAIPIRINHA kernel for high (R ≥ 8) acceleration CE-MRA exams given a specific patient and receiver array was demonstrated. CAIPIRINHA optimization appears valuable in accelerated CE-MRA of the feet and to a lesser extent in the abdomen. © 2014 Wiley Periodicals, Inc.

  12. Predicting pathologic tumor response to chemoradiotherapy with histogram distances characterizing longitudinal changes in 18F-FDG uptake patterns

    PubMed Central

    Tan, Shan; Zhang, Hao; Zhang, Yongxue; Chen, Wengen; D’Souza, Warren D.; Lu, Wei

    2013-01-01

    Purpose: A family of fluorine-18 (18F)-fluorodeoxyglucose (18F-FDG) positron-emission tomography (PET) features based on histogram distances is proposed for predicting pathologic tumor response to neoadjuvant chemoradiotherapy (CRT). These features describe the longitudinal change of FDG uptake distribution within a tumor. Methods: Twenty patients with esophageal cancer treated with CRT plus surgery were included in this study. All patients underwent PET/CT scans before (pre-) and after (post-) CRT. The two scans were first rigidly registered, and the original tumor sites were then manually delineated on the pre-PET/CT by an experienced nuclear medicine physician. Two histograms representing the FDG uptake distribution were extracted from the pre- and the registered post-PET images, respectively, both within the delineated tumor. Distances between the two histograms quantify longitudinal changes in FDG uptake distribution resulting from CRT, and thus are potential predictors of tumor response. A total of 19 histogram distances were examined and compared to both traditional PET response measures and Haralick texture features. Receiver operating characteristic analyses and Mann-Whitney U test were performed to assess their predictive ability. Results: Among all tested histogram distances, seven bin-to-bin and seven crossbin distances outperformed traditional PET response measures using maximum standardized uptake value (AUC = 0.70) or total lesion glycolysis (AUC = 0.80). The seven bin-to-bin distances were: L2 distance (AUC = 0.84), χ2 distance (AUC = 0.83), intersection distance (AUC = 0.82), cosine distance (AUC = 0.83), squared Euclidean distance (AUC = 0.83), L1 distance (AUC = 0.82), and Jeffrey distance (AUC = 0.82). The seven crossbin distances were: quadratic-chi distance (AUC = 0.89), earth mover distance (AUC = 0.86), fast earth mover distance (AUC = 0.86), diffusion distance (AUC = 0.88), Kolmogorov-Smirnov distance (AUC = 0.88), quadratic form distance (AUC = 0.87), and match distance (AUC = 0.84). These crossbin histogram distance features showed slightly higher prediction accuracy than texture features on post-PET images. Conclusions: The results suggest that longitudinal patterns in 18F-FDG uptake characterized using histogram distances provide useful information for predicting the pathologic response of esophageal cancer to CRT. PMID:24089897
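
    A sketch of a few of the listed bin-to-bin and cross-bin distances between the pre- and post-CRT uptake histograms, both assumed to be normalized to sum to one; the variable names are illustrative and only a subset of the 19 measures is shown.

      import numpy as np
      from scipy.stats import wasserstein_distance

      def chi2_distance(p, q, eps=1e-12):
          return 0.5 * np.sum((p - q) ** 2 / (p + q + eps))

      def intersection_distance(p, q):
          return 1.0 - np.sum(np.minimum(p, q))

      def l2_distance(p, q):
          return np.sqrt(np.sum((p - q) ** 2))

      def earth_mover_distance(p, q, bin_centers):
          # 1-D earth mover's (Wasserstein) distance between the two histograms.
          return wasserstein_distance(bin_centers, bin_centers, u_weights=p, v_weights=q)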

  13. System design of a small OpenPET prototype with 4-layer DOI detectors.

    PubMed

    Yoshida, Eiji; Kinouchi, Shoko; Tashima, Hideaki; Nishikido, Fumihiko; Inadama, Naoko; Murayama, Hideo; Yamaya, Taiga

    2012-01-01

    We have proposed an OpenPET geometry which consists of two axially separated detector rings. The open gap is suitable for in-beam PET. We have developed a small prototype of the OpenPET, especially as a proof of concept of in-beam imaging. This paper presents an overview of the main features implemented in this prototype. We also evaluated the detector performance. The prototype was designed with 2 detector rings having 8 depth-of-interaction detectors. Each detector consisted of 784 Lu(2x)Gd(2(1-x))SiO₅:Ce (LGSO) crystals arranged in a 4-layer design, coupled to a position-sensitive photomultiplier tube (PS-PMT). The size of the LGSO array was smaller than the sensitive area of the PS-PMT, so that we could obtain sufficient LGSO identification. Peripheral LGSOs near the open gap directly detect the gamma rays on the side face in the OpenPET geometry. Output signals of two detectors stacked axially were projected onto one 2-dimensional position histogram to reduce the scale of the coincidence processor. Front-end circuits were separated from the detector head by 1.2-m coaxial cables to protect the electronic circuits from radiation damage. The detectors had sufficient crystal identification capability. Cross talk between the combined two detectors could be ignored. The timing and energy resolutions were 3.0 ns and 14%, respectively. The coincidence window was set to 20 ns, because the timing histogram showed not only the main peak but also two small shifted peaks caused by the coaxial cables. However, the detector offers the promise of sufficient performance, because random coincidences are at a nearly undetectable level for in-beam PET experiments.

  14. Autocorrelation descriptor improvements for QSAR: 2DA_Sign and 3DA_Sign

    NASA Astrophysics Data System (ADS)

    Sliwoski, Gregory; Mendenhall, Jeffrey; Meiler, Jens

    2016-03-01

    Quantitative structure-activity relationship (QSAR) is a branch of computer aided drug discovery that relates chemical structures to biological activity. Two well established and related QSAR descriptors are two- and three-dimensional autocorrelation (2DA and 3DA). These descriptors encode the relative position of atoms or atom properties by calculating the separation between atom pairs in terms of number of bonds (2DA) or Euclidean distance (3DA). The sums of all values computed for a given small molecule are collected in a histogram. Atom properties can be added with a coefficient that is the product of atom properties for each pair. This procedure can lead to information loss when signed atom properties are considered such as partial charge. For example, the product of two positive charges is indistinguishable from the product of two equivalent negative charges. In this paper, we present variations of 2DA and 3DA called 2DA_Sign and 3DA_Sign that avoid information loss by splitting unique sign pairs into individual histograms. We evaluate these variations with models trained on nine datasets spanning a range of drug target classes. Both 2DA_Sign and 3DA_Sign significantly increase model performance across all datasets when compared with traditional 2DA and 3DA. Lastly, we find that limiting 3DA_Sign to maximum atom pair distances of 6 Å instead of 12 Å further increases model performance, suggesting that conformational flexibility may hinder performance with longer 3DA descriptors. Consistent with this finding, limiting the number of bonds in 2DA_Sign from 11 to 5 fails to improve performance.
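
    A hedged sketch of the sign-splitting idea for a two-dimensional autocorrelation (2DA_Sign), assuming a precomputed matrix of bond-path lengths and a signed per-atom property such as partial charge; it is an illustration of the concept, not the authors' implementation.

      import numpy as np

      def two_da_sign(bond_dist, prop, max_bonds=11):
          # Three histograms, indexed by bond separation: (+,+), (-,-) and mixed-sign pairs.
          hists = {"pp": np.zeros(max_bonds + 1),
                   "nn": np.zeros(max_bonds + 1),
                   "pn": np.zeros(max_bonds + 1)}
          n = len(prop)
          for i in range(n):
              for j in range(i + 1, n):
                  d = int(bond_dist[i, j])
                  if d > max_bonds:
                      continue
                  key = ("pp" if prop[i] >= 0 and prop[j] >= 0
                         else "nn" if prop[i] < 0 and prop[j] < 0 else "pn")
                  hists[key][d] += abs(prop[i] * prop[j])
          return hists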

  15. BIBLIOGRAPHY ON CURRICULUM DEVELOPMENT.

    ERIC Educational Resources Information Center

    Harvard Univ., Cambridge, MA. Graduate School of Education.

    THIS BIBLIOGRAPHY LISTS MATERIALS ON VARIOUS ASPECTS OF CURRICULUM DEVELOPMENT. FORTY UNANNOTATED REFERENCES ARE PROVIDED FOR DOCUMENTS DATING FROM 1960 TO 1966. BOOKS, JOURNALS, REPORT MATERIALS, AND SOME UNPUBLISHED MANUSCRIPTS ARE LISTED IN SUCH AREAS AS COGNITIVE STUDIES, VOCATIONAL REHABILITATION, INSTRUCTIONAL MATERIALS, SCIENCE STUDIES, AND…

  16. BIBLIOGRAPHY ON VERBAL LEARNING.

    ERIC Educational Resources Information Center

    Harvard Univ., Cambridge, MA. Graduate School of Education.

    THIS BIBLIOGRAPHY LISTS MATERIAL ON VARIOUS ASPECTS OF VERBAL LEARNING. APPROXIMATELY 50 UNANNOTATED REFERENCES ARE PROVIDED TO DOCUMENTS DATING FROM 1960 TO 1965. JOURNALS, BOOKS, AND REPORT MATERIALS ARE LISTED. SUBJECT AREAS INCLUDED ARE CONDITIONING, VERBAL BEHAVIOR, PROBLEM SOLVING, SEMANTIC SATIATION, STIMULUS DURATION, AND VERBAL…

  17. BIBLIOGRAPHY ON INDIVIDUALIZED INSTRUCTION.

    ERIC Educational Resources Information Center

    Harvard Univ., Cambridge, MA. Graduate School of Education.

    THIS BIBLIOGRAPHY LISTS MATERIAL ON VARIOUS ASPECTS OF INDIVIDUALIZED INSTRUCTION. APPROXIMATELY 85 UNANNOTATED REFERENCES ARE PROVIDED TO DOCUMENTS DATING FROM 1958 TO 1966. JOURNALS, BOOKS, AND REPORT MATERIALS ARE LISTED. SUBJECT AREAS INCLUDED ARE PROGRAMED INSTRUCTION, TEACHING MACHINES, RESPONSE MODE, SELF-INSTRUCTION, AND COMPUTER-ASSISTED…

  18. BIBLIOGRAPHY ON TESTING AND MEASUREMENT.

    ERIC Educational Resources Information Center

    Harvard Univ., Cambridge, MA. Graduate School of Education.

    THIS BIBLIOGRAPHY LISTS MATERIAL ON VARIOUS ASPECTS OF TESTING AND MEASUREMENT. APPROXIMATELY 80 UNANNOTATED REFERENCES ARE PROVIDED TO DOCUMENTS DATING FROM 1960 TO 1966. JOURNALS, BOOKS, AND REPORT MATERIALS ARE LISTED. SUBJECT AREAS INCLUDED ARE SCHOOL READINESS, CRITERION MEASURES, LONGITUDINAL ANALYSIS, PERSONALITY MEASUREMENT, STATISTICAL…

  19. Delay, change and bifurcation of the immunofluorescence distribution attractors in health statuses diagnostics and in medical treatment

    NASA Astrophysics Data System (ADS)

    Galich, Nikolay E.; Filatov, Michael V.

    2008-07-01

    This communication describes immunology experiments and the treatment of the experimental data. New nonlinear methods for the statistical analysis of immunofluorescence of peripheral blood neutrophils have been developed. We used the respiratory burst reaction of DNA fluorescence in the nuclei of neutrophil cells due to oxidative activity. The histograms of photon count statistics of the radiant neutrophil populations in flow cytometry experiments are considered. Distributions of the fluorescence flash frequency as functions of the fluorescence intensity are analyzed. Statistical peculiarities of the histogram sets for healthy and unhealthy donors allow all histograms to be divided into three classes. The classification is based on three different types of smoothed, long-range scale averaged immunofluorescence distributions and their bifurcations. Heterogeneity peculiarities of the long-range scale immunofluorescence distributions allow all histograms to be divided into three groups. The first group of histograms belongs to healthy donors. The two other groups belong to donors with autoimmune and inflammatory diseases. Some of the illnesses are not diagnosed by standard biochemical methods. Medical standards and the statistical data of the immunofluorescence histograms for identification of health and illness are interconnected. The possibilities and alterations of immunofluorescence statistics in the registration, diagnostics and monitoring of different diseases under various medical treatments have been demonstrated. Health or illness criteria are connected with statistical features of the immunofluorescence histograms. Neutrophil population fluorescence presents a sensitive and clear indicator of health status.

  20. Data processing for soft X-ray diagnostics based on GEM detector measurements for fusion plasma imaging

    NASA Astrophysics Data System (ADS)

    Czarski, T.; Chernyshova, M.; Pozniak, K. T.; Kasprowicz, G.; Byszuk, A.; Juszczyk, B.; Wojenski, A.; Zabolotny, W.; Zienkiewicz, P.

    2015-12-01

    The measurement system based on the GEM (Gas Electron Multiplier) detector is developed for X-ray diagnostics of magnetic confinement fusion plasmas. The Triple Gas Electron Multiplier (T-GEM) is presented as a soft X-ray (SXR) energy- and position-sensitive detector. The paper focuses on the measurement subject and describes the fundamental data processing needed to obtain reliable characteristics (histograms) useful for physicists; it thus constitutes the software part of the project, between the electronic hardware and the physics applications. The project is original and was developed by the paper's authors. The multi-channel measurement system and the essential data processing for X-ray energy and position recognition are considered. Several modes of data acquisition, determined by hardware and software processing, are introduced. Typical measurement issues are discussed with a view to enhancing data quality. The primary version, based on a 1-D GEM detector, was applied to the high-resolution X-ray crystal spectrometer KX1 in the JET tokamak. The current version considers 2-D detector structures, initially for investigation purposes. Two detector structures, with single-pixel sensors and with multi-pixel (directional) sensors, are considered for two-dimensional X-ray imaging. Fundamental output characteristics are presented for one- and two-dimensional detector structures. Representative results for a reference source and for tokamak plasma are demonstrated.

  1. Cluster analysis based on dimensional information with applications to feature selection and classification

    NASA Technical Reports Server (NTRS)

    Eigen, D. J.; Fromm, F. R.; Northouse, R. A.

    1974-01-01

    A new clustering algorithm is presented that is based on dimensional information. The algorithm includes an inherent feature selection criterion, which is discussed. Further, a heuristic method for choosing the proper number of intervals for a frequency distribution histogram, a feature necessary for the algorithm, is presented. The algorithm, although usable as a stand-alone clustering technique, is then utilized as a global approximator. Local clustering techniques and configuration of a global-local scheme are discussed, and finally the complete global-local and feature selector configuration is shown in application to a real-time adaptive classification scheme for the analysis of remote sensed multispectral scanner data.
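
    The abstract does not spell out the bin-count heuristic, so the following sketch only illustrates one standard rule (Freedman-Diaconis) for choosing the number of intervals of a frequency-distribution histogram from a one-dimensional feature sample.

      import numpy as np

      def freedman_diaconis_bins(x):
          x = np.asarray(x, dtype=float)
          iqr = np.subtract(*np.percentile(x, [75, 25]))
          if iqr == 0:
              return 1
          width = 2.0 * iqr / len(x) ** (1.0 / 3.0)
          return max(1, int(np.ceil((x.max() - x.min()) / width)))

      # counts, edges = np.histogram(x, bins=freedman_diaconis_bins(x))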

  2. Integration of neutron time-of-flight single-crystal Bragg peaks in reciprocal space

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schultz, Arthur J; Joergensen, Mads; Wang, Xiaoping

    2014-01-01

    The intensities of single-crystal Bragg peaks obtained by mapping neutron time-of-flight event data into reciprocal space and integrating in various ways are compared. These include spherical integration with a fixed radius, ellipsoid fitting and integration of the peak intensity, and one-dimensional peak profile fitting. In comparison to intensities obtained by integrating in real detector histogram space, the data integrated in reciprocal space result in better agreement factors and more accurate atomic parameters. Furthermore, structure refinement using integrated intensities from one-dimensional profile fitting is demonstrated to be more accurate than simple peak-minus-background integration.
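
    A toy sketch of the fixed-radius spherical integration variant in reciprocal space, with the local background estimated from a surrounding shell; the event array format and the radii are assumptions, and the ellipsoid and profile-fitting variants are not shown.

      import numpy as np

      def integrate_peak(events, q_peak, r_peak=0.1, r_bkg=0.15):
          # events: (N, 3) array of event Q-vectors; q_peak: predicted peak position.
          d = np.linalg.norm(events - q_peak, axis=1)
          n_peak = np.count_nonzero(d <= r_peak)
          n_bkg = np.count_nonzero((d > r_peak) & (d <= r_bkg))
          v_ratio = r_peak ** 3 / (r_bkg ** 3 - r_peak ** 3)   # sphere / shell volume ratio
          intensity = n_peak - n_bkg * v_ratio                 # background-corrected counts
          sigma = np.sqrt(n_peak + n_bkg * v_ratio ** 2)       # Poisson error propagation
          return intensity, sigma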

  3. Research of image retrieval technology based on color feature

    NASA Astrophysics Data System (ADS)

    Fu, Yanjun; Jiang, Guangyu; Chen, Fengying

    2009-10-01

    Recently, with the development of communication and computer technology and the improvement of storage technology and the capability of digital imaging equipment, more image resources are available to us than ever, so a way to locate the desired image quickly and accurately is needed. The early approach was to index images with keywords in a database, but this method becomes very difficult as the number of images to search grows. In order to overcome the limitations of the traditional searching method, content-based image retrieval technology arose, and it is now an active research subject. Color image retrieval is an important part of it, and color is the most important feature for color image retrieval. Three key questions on how to make use of the color characteristic are discussed in the paper: the representation of color, the extraction of the color feature, and the measurement of similarity based on color. On this basis, the extraction of the color histogram feature is discussed in particular. Considering the advantages and disadvantages of the overall histogram and the partition histogram, a new method based on the partition-overall histogram is proposed. Its basic idea is to divide the image space according to a certain strategy and then calculate the color histogram of each block as the color feature of that block. Users choose the blocks that contain important spatial information and assign their weight values. The system calculates the distance between the corresponding blocks that the users chose; the remaining blocks are merged into a partial overall histogram, and the distance between these is calculated as well. All distances are then accumulated to give the overall distance between the two images. The partition-overall histogram combines the advantages of the two methods above: choosing blocks makes the feature carry more spatial information, which improves performance, while the distances between partition-overall histograms remain invariant to rotation and translation. The HSV color space, which matches the visual characteristics of humans, is used to represent the color characteristic of the image. Taking advantage of human color perception, the color space is quantized with unequal intervals to obtain the feature vector. Finally, image similarity is matched with the histogram intersection algorithm applied to the partition-overall histograms. Users can choose an example image to express the visual query, and can also adjust the weight values through relevance feedback to obtain the best search result. An image retrieval system based on these approaches is presented. The experimental results show that image retrieval based on the partition-overall histogram can retain spatial distribution information while extracting the color feature efficiently, and it is superior to normal color histograms in retrieval precision; the query precision rate is more than 95%. In addition, the efficient block representation lowers the complexity of the images to be searched, and thus the searching efficiency is increased. The image retrieval algorithm based on the partition-overall histogram proposed in the paper is efficient and effective.
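
    A rough sketch of the partition histogram machinery described above, assuming OpenCV-style HSV arrays (H in [0, 180), S and V in [0, 256)); the block grid, quantization levels, and weights are illustrative rather than the paper's exact unequal-interval quantization.

      import numpy as np

      def block_histograms(hsv, grid=(3, 3), bins=(8, 3, 3)):
          h, w = hsv.shape[:2]
          hists = []
          for r in range(grid[0]):
              for c in range(grid[1]):
                  block = hsv[r * h // grid[0]:(r + 1) * h // grid[0],
                              c * w // grid[1]:(c + 1) * w // grid[1]]
                  hist, _ = np.histogramdd(block.reshape(-1, 3), bins=bins,
                                           range=((0, 180), (0, 256), (0, 256)))
                  hists.append(hist.ravel() / max(hist.sum(), 1))
          return hists

      def intersection_similarity(h1, h2):
          return np.sum(np.minimum(h1, h2))

      def image_similarity(hists_a, hists_b, weights=None):
          # Weighted sum of per-block histogram intersections between two images.
          weights = weights or [1.0 / len(hists_a)] * len(hists_a)
          return sum(w * intersection_similarity(a, b)
                     for w, a, b in zip(weights, hists_a, hists_b))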

  4. CHILD DEVELOPMENT BIBLIOGRAPHY. SUPPLEMENT I.

    ERIC Educational Resources Information Center

    Harvard Univ., Cambridge, MA. Graduate School of Education.

    THIS BIBLIOGRAPHY SUPPLEMENT LISTS MATERIAL ON VARIOUS ASPECTS OF CHILD DEVELOPMENT. APPROXIMATELY 90 UNANNOTATED REFERENCES ARE PROVIDED TO DOCUMENTS DATING FROM 1956 TO 1966. JOURNALS, BOOKS, AND REPORT MATERIALS ARE LISTED. SUBJECT AREAS INCLUDED ARE BEHAVIOR TESTS, CONDITIONING, MATERNAL REACTIONS, GRADE PREDICTABILITY, EXPERIMENTAL STUDIES,…

  5. BIBLIOGRAPHY ON CREATIVITY.

    ERIC Educational Resources Information Center

    Harvard Univ., Cambridge, MA. Graduate School of Education.

    THIS BIBLIOGRAPHY LISTS MATERIAL ON VARIOUS ASPECTS OF CREATIVITY. APPROXIMATELY 50 UNANNOTATED REFERENCES ARE PROVIDED TO DOCUMENTS DATING FROM 1961 TO 1966. JOURNALS, BOOKS, AND REPORT MATERIALS ARE LISTED. SUBJECT AREAS INCLUDED ARE (1) IDENTIFICATION, DEVELOPMENT, AND MEASUREMENT OF CREATIVITY, (2) PSYCHOLOGICAL STUDIES OF CREATIVITY, (3)…

  6. BIBLIOGRAPHY ON ACHIEVEMENT.

    ERIC Educational Resources Information Center

    Harvard Univ., Cambridge, MA. Graduate School of Education.

    THIS BIBLIOGRAPHY LISTS MATERIAL ON VARIOUS ASPECTS OF ACHIEVEMENT. APPROXIMATELY 40 UNANNOTATED REFERENCES ARE PROVIDED TO DOCUMENTS DATING FROM 1952 TO 1965. JOURNALS, BOOKS, AND REPORT MATERIALS ARE LISTED. SUBJECT AREAS INCLUDED ARE BEHAVIOR TESTS, ACHIEVEMENT BEHAVIOR, ACADEMIC ACHIEVEMENT, AND SOCIAL-CLASS BACKGROUND. A RELATED REPORT IS ED…

  7. BIBLIOGRAPHY ON MENTAL ABILITY.

    ERIC Educational Resources Information Center

    Harvard Univ., Cambridge, MA. Graduate School of Education.

    THIS BIBLIOGRAPHY LISTS MATERIAL ON VARIOUS ASPECTS OF HUMAN INTELLECT. APPROXIMATELY 50 UNANNOTATED REFERENCES ARE PROVIDED TO DOCUMENTS DATING FROM 1955 TO 1966. BOOKS, REPORTS, JOURNAL MATERIALS, AND SOME UNPUBLISHED TITLES ARE LISTED. SUBJECT AREAS INCLUDED ARE (1) INTELLECTUAL DEVELOPMENT, (2) ABILITY DIFFERENCES BETWEEN INDIVIDUALS, RACES,…

  8. BIBLIOGRAPHY ON LANGUAGE DEVELOPMENT.

    ERIC Educational Resources Information Center

    Harvard Univ., Cambridge, MA. Graduate School of Education.

    THIS BIBLIOGRAPHY LISTS MATERIAL ON VARIOUS ASPECTS OF LANGUAGE DEVELOPMENT. APPROXIMATELY 65 UNANNOTATED REFERENCES ARE PROVIDED TO DOCUMENTS DATING FROM 1958 TO 1966. JOURNALS, BOOKS, AND REPORT MATERIALS ARE LISTED. SUBJECT AREAS INCLUDED ARE THE NATURE OF LANGUAGE, LINGUISTICS, LANGUAGE LEARNING, LANGUAGE SKILLS, LANGUAGE PATTERNS, AND…

  9. BIBLIOGRAPHY ON TEACHING. SUPPLEMENT.

    ERIC Educational Resources Information Center

    Harvard Univ., Cambridge, MA. Graduate School of Education.

    THIS BIBLIOGRAPHY LISTS MATERIAL ON VARIOUS ASPECTS OF TEACHING. APPROXIMATELY 100 UNANNOTATED REFERENCES ARE PROVIDED FOR DOCUMENTS DATING FROM 1960 TO 1966. BOOKS, JOURNALS, REPORT MATERIALS, AND SOME UNPUBLISHED MANUSCRIPTS ARE LISTED IN SUCH AREAS OF EDUCATION AS HEURISTIC GAMES, TEACHER EVALUATION, CURRICULUMS, TEACHING TECHNIQUES, AND…

  10. Polymer gel dosimeters for pretreatment radiotherapy verification using the three-dimensional gamma evaluation and pass rate maps.

    PubMed

    Hsieh, Ling-Ling; Shieh, Jiunn-I; Wei, Li-Ju; Wang, Yi-Chun; Cheng, Kai-Yuan; Shih, Cheng-Ting

    2017-05-01

    Polymer gel dosimeters (PGDs) have been widely studied for use in the pretreatment verification of clinical radiation therapy. However, the readability of PGDs in three-dimensional (3D) dosimetry remains unclear. In this study, pretreatment verifications of clinical radiation therapy were performed using an N-isopropyl-acrylamide (NIPAM) PGD, and the results were used to evaluate the performance of the NIPAM PGD for 3D dose measurement. A gel phantom was used to measure the dose distribution of a clinical case of intensity-modulated radiation therapy. Magnetic resonance imaging scans were performed for dose readouts. The measured dose volumes were compared with the planned dose volume. The relative volume histograms showed that relative volumes with a negative percent dose difference decreased as time elapsed. Furthermore, the histograms revealed few changes beyond 24 h post-irradiation. For the 3%/3 mm and 2%/2 mm criteria, the pass rates of the 12- and 24-h dose volumes were higher than 95%, respectively. This study thus concludes that the pass rate map can be used to evaluate the dose-temporal readability of PGDs and that the NIPAM PGD can be used for clinical pretreatment verifications. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  11. Multicenter study of quantitative computed tomography analysis using a computer-aided three-dimensional system in patients with idiopathic pulmonary fibrosis.

    PubMed

    Iwasawa, Tae; Kanauchi, Tetsu; Hoshi, Toshiko; Ogura, Takashi; Baba, Tomohisa; Gotoh, Toshiyuki; Oba, Mari S

    2016-01-01

    To evaluate the feasibility of automated quantitative analysis with a three-dimensional (3D) computer-aided system (i.e., Gaussian histogram normalized correlation, GHNC) of computed tomography (CT) images from different scanners. Each institution's review board approved the research protocol. Informed patient consent was not required. The participants in this multicenter prospective study were 80 patients (65 men, 15 women) with idiopathic pulmonary fibrosis. Their mean age was 70.6 years. Computed tomography (CT) images were obtained by four different scanners set at different exposures. We measured the extent of fibrosis using GHNC, and used Pearson's correlation analysis, Bland-Altman plots, and kappa analysis to directly compare the GHNC results with manual scoring by radiologists. Multiple linear regression analysis was performed to determine the association between the CT data and forced vital capacity (FVC). For each scanner, the extent of fibrosis as determined by GHNC was significantly correlated with the radiologists' score. In multivariate analysis, the extent of fibrosis as determined by GHNC was significantly correlated with FVC (p < 0.001). There was no significant difference between the results obtained using different CT scanners. Gaussian histogram normalized correlation was feasible, irrespective of the type of CT scanner used.

  12. Microbubble cloud characterization by nonlinear frequency mixing.

    PubMed

    Cavaro, M; Payan, C; Moysan, J; Baqué, F

    2011-05-01

    Within the framework of the Generation IV forum, France has decided to develop sodium-cooled fast nuclear reactors. The French Safety Authority requires the associated monitoring of argon gas entrained in the sodium. This implies estimating the void fraction and a histogram describing the bubble population. In this context, the present letter studies the possibility of achieving an accurate determination of this histogram with acoustic methods. A nonlinear, two-frequency mixing technique has been implemented, and a specific optical device has been developed in order to validate the experimental results. The acoustically reconstructed histograms are in excellent agreement with those obtained using optical methods.

  13. Identification and Classification of New Transcripts in Dorper and Small-Tailed Han Sheep Skeletal Muscle Transcriptomes.

    PubMed

    Chao, Tianle; Wang, Guizhi; Wang, Jianmin; Liu, Zhaohua; Ji, Zhibin; Hou, Lei; Zhang, Chunlan

    2016-01-01

    High-throughput mRNA sequencing enables the discovery of new transcripts and additional parts of incompletely annotated transcripts. Compared with the human and cow genomes, the reference annotation level of the sheep genome is still low. An investigation of new transcripts in sheep skeletal muscle will improve our understanding of muscle development. Therefore, applying high-throughput sequencing, two cDNA libraries from the biceps brachii of small-tailed Han sheep and Dorper sheep were constructed, and whole-transcriptome analysis was performed to determine the unknown transcript catalogue of this tissue. In this study, 40,129 transcripts were finally mapped to the sheep genome. Among them, 3,467 transcripts were determined to be unannotated in the current reference sheep genome and were defined as new transcripts. Based on protein-coding capacity prediction and comparative analysis of sequence similarity, 246 transcripts were classified as portions of unannotated genes or incompletely annotated genes. Another 1,520 transcripts were predicted with high confidence to be long non-coding RNAs. Our analysis also revealed 334 new transcripts that displayed specific expression in ruminants and uncovered a number of new transcripts without intergenus homology but with specific expression in sheep skeletal muscle. The results confirmed a complex transcript pattern of coding and non-coding RNA in sheep skeletal muscle. This study provided important information concerning the sheep genome and transcriptome annotation, which could provide a basis for further study.

  14. Discourse Analysis and Social Construction.

    ERIC Educational Resources Information Center

    Bazerman, Charles

    1990-01-01

    A brief review of social constructivism as a general movement and how it has been applied in particular to scientific knowledge precedes a review of investigations into the role language and linguistic activities have in the social construction of knowledge. A 39-citation unannotated bibliography is included. (CB)

  15. BIBLIOGRAPHY ON LEARNING PROCESS. SUPPLEMENT II.

    ERIC Educational Resources Information Center

    Harvard Univ., Cambridge, MA. Graduate School of Education.

    THIS SUPPLEMENTARY BIBLIOGRAPHY LISTS MATERIALS ON VARIOUS FACETS OF HUMAN LEARNING. APPROXIMATELY 60 UNANNOTATED REFERENCES ARE PROVIDED FOR DOCUMENTS DATING FROM 1954 TO 1966. JOURNAL ARTICLES, BOOKS, RESEARCH REPORTS, AND CONFERENCE PAPERS ARE LISTED. SOME SUBJECT AREAS INCLUDED ARE (1) LEARNING PARAMETERS AND ABILITY, (2) RETENTION AND…

  16. BIBLIOGRAPHY ON URBAN EDUCATION, SUPPLEMENT TO BIBLIOGRAPHY ON DISADVANTAGED.

    ERIC Educational Resources Information Center

    Harvard Univ., Cambridge, MA. Graduate School of Education.

    THIS BIBLIOGRAPHY LISTS MATERIAL ON VARIOUS ASPECTS OF URBAN EDUCATION. APPROXIMATELY 220 UNANNOTATED REFERENCES ARE PROVIDED TO DOCUMENTS FROM 1961 TO 1965. JOURNALS, BOOKS, AND REPORTS ARE LISTED. SUBJECT AREAS INCLUDED ARE FAMILY ENVIRONMENT, CULTURALLY DEPRIVED, LOW ACHIEVERS, DROPOUTS, AND DESEGREGATED EDUCATION. (TC)

  17. BIBLIOGRAPHY ON THE CULTURALLY DISADVANTAGED. SUPPLEMENT III.

    ERIC Educational Resources Information Center

    Harvard Univ., Cambridge, MA. Graduate School of Education.

    THIS BIBLIOGRAPHY SUPPLEMENT LISTS MATERIAL ON VARIOUS ASPECTS OF THE CULTURALLY DISADVANTAGED. APPROXIMATELY 220 UNANNOTATED REFERENCES ARE PROVIDED TO DOCUMENTS DATING FROM 1963 TO 1966. JOURNALS, BOOKS, AND REPORT MATERIALS ARE LISTED. SUBJECT AREAS INCLUDED ARE PRESCHOOL PROGRAMS, NEIGHBORHOOD DEVELOPMENT PROGRAMS, SHORT-TERM GROUP COUNSELING,…

  18. ACQUISITIONS LIST, MAY 1966.

    ERIC Educational Resources Information Center

    Harvard Univ., Cambridge, MA. Graduate School of Education.

    THIS ACQUISITIONS LIST IS A BIBLIOGRAPHY OF MATERIAL ON VARIOUS ASPECTS OF EDUCATION. OVER 300 UNANNOTATED REFERENCES ARE PROVIDED FOR DOCUMENTS DATING MAINLY FROM 1960 TO 1966. BOOKS, JOURNALS, REPORT MATERIALS, AND UNPUBLISHED MANUSCRIPTS ARE LISTED UNDER THE FOLLOWING HEADINGS--(1) ACHIEVEMENT, (2) ADOLESCENCE, (3) CHILD DEVELOPMENT, (4)…

  19. Potential for reducing the numbers of SiPM readout surfaces of laser-processed X'tal cube PET detectors.

    PubMed

    Hirano, Yoshiyuki; Inadama, Naoko; Yoshida, Eiji; Nishikido, Fumihiko; Murayama, Hideo; Watanabe, Mitsuo; Yamaya, Taiga

    2013-03-07

    We are developing a three-dimensional (3D) position-sensitive detector with isotropic spatial resolution, the X'tal cube. Originally, our design consisted of a crystal block for which all six surfaces were covered with arrays of multi-pixel photon counters (MPPCs). In this paper, we examined the feasibility of reducing the number of surfaces to which an MPPC array must be connected, with the aim of reducing the complexity of the system. We evaluated two kinds of laser-processed X'tal cubes, with 3 mm and 2 mm pitch segments, while reducing the number of 4 × 4 MPPC arrays down to two surfaces. The sub-surface laser engraving technique was used to fabricate 3D grids in a monolithic crystal block. The 3D flood histograms were obtained by an Anger-type calculation. Two figures of merit, peak-to-valley ratios and distance-to-width ratios, were used to evaluate crystal identification performance. Clear separation was obtained even in the 2-surface configuration for the 3 mm X'tal cube, with average peak-to-valley and distance-to-width ratios of 6.7 and 2.6, respectively. For the 2 mm X'tal cube, the 6-surface configuration could separate all crystals, and the 2-surface case could as well, although its flood histograms were relatively shrunken, especially on planes parallel to the sensitive surfaces. However, the minimum peak-to-valley ratio did not fall below 3.9. We concluded that reducing the number of MPPC readout surfaces is feasible for both the 3 mm and the 2 mm X'tal cubes.

  20. Inference of Functionally-Relevant N-acetyltransferase Residues Based on Statistical Correlations.

    PubMed

    Neuwald, Andrew F; Altschul, Stephen F

    2016-12-01

    Over evolutionary time, members of a superfamily of homologous proteins sharing a common structural core diverge into subgroups filling various functional niches. At the sequence level, such divergence appears as correlations that arise from residue patterns distinct to each subgroup. Such a superfamily may be viewed as a population of sequences corresponding to a complex, high-dimensional probability distribution. Here we model this distribution as hierarchical interrelated hidden Markov models (hiHMMs), which describe these sequence correlations implicitly. By characterizing such correlations one may hope to obtain information regarding functionally-relevant properties that have thus far evaded detection. To do so, we infer a hiHMM distribution from sequence data using Bayes' theorem and Markov chain Monte Carlo (MCMC) sampling, which is widely recognized as the most effective approach for characterizing a complex, high dimensional distribution. Other routines then map correlated residue patterns to available structures with a view to hypothesis generation. When applied to N-acetyltransferases, this reveals sequence and structural features indicative of functionally important, yet generally unknown biochemical properties. Even for sets of proteins for which nothing is known beyond unannotated sequences and structures, this can lead to helpful insights. We describe, for example, a putative coenzyme-A-induced-fit substrate binding mechanism mediated by arginine residue switching between salt bridge and π-π stacking interactions. A suite of programs implementing this approach is available (psed.igs.umaryland.edu).

  1. Direct observation of intermediate states in model membrane fusion

    PubMed Central

    Keidel, Andrea; Bartsch, Tobias F.; Florin, Ernst-Ludwig

    2016-01-01

    We introduce a novel assay for membrane fusion of solid supported membranes on silica beads and on coverslips. Fusion of the lipid bilayers is induced by bringing an optically trapped bead in contact with the coverslip surface while observing the bead’s thermal motion with microsecond temporal and nanometer spatial resolution using a three-dimensional position detector. The probability of fusion is controlled by the membrane tension on the particle. We show that the progression of fusion can be monitored by changes in the three-dimensional position histograms of the bead and in its rate of diffusion. We were able to observe all fusion intermediates including transient fusion, formation of a stalk, hemifusion and the completion of a fusion pore. Fusion intermediates are characterized by axial but not lateral confinement of the motion of the bead and independently by the change of its rate of diffusion due to the additional drag from the stalk-like connection between the two membranes. The detailed information provided by this assay makes it ideally suited for studies of early events in pure lipid bilayer fusion or fusion assisted by fusogenic molecules. PMID:27029285

  2. The use of thematic mapper data for land cover discrimination: Preliminary results from the UK SATMaP programme

    NASA Technical Reports Server (NTRS)

    Jackson, M. J.; Baker, J. R.; Townshend, J. R. G.; Gayler, J. E.; Hardy, J. R.

    1983-01-01

    The principal objectives of the UK SATMaP program are to determine thematic mapper (TM) performance with particular reference to spatial resolution properties and geometric characteristics of the data. So far, analysis is restricted to images from the U.S. and concentrates on spectra and radiometric properties. The results indicate that the data are inherently three dimensional compared with the two dimensional character of MSS data. Preliminary classification results indicate the importance of the near infrared band (TM 4), at least one middle infrared band (TM 5 or TM 6) and at least one of the visible bands (preferably either TM 3 or TM 1). The thermal infrared also appears to have discriminatory ability despite its coarser spatial resolution. For band 4 the forward and reverse scans show somewhat different spectral responses in one scene but this effect is absent in the other analyzed. From examination of the histograms it would appear that the full 8-bit quantization is not being effectively utilized for all the bands.

  3. Direct observation of intermediate states in model membrane fusion.

    PubMed

    Keidel, Andrea; Bartsch, Tobias F; Florin, Ernst-Ludwig

    2016-03-31

    We introduce a novel assay for membrane fusion of solid supported membranes on silica beads and on coverslips. Fusion of the lipid bilayers is induced by bringing an optically trapped bead in contact with the coverslip surface while observing the bead's thermal motion with microsecond temporal and nanometer spatial resolution using a three-dimensional position detector. The probability of fusion is controlled by the membrane tension on the particle. We show that the progression of fusion can be monitored by changes in the three-dimensional position histograms of the bead and in its rate of diffusion. We were able to observe all fusion intermediates including transient fusion, formation of a stalk, hemifusion and the completion of a fusion pore. Fusion intermediates are characterized by axial but not lateral confinement of the motion of the bead and independently by the change of its rate of diffusion due to the additional drag from the stalk-like connection between the two membranes. The detailed information provided by this assay makes it ideally suited for studies of early events in pure lipid bilayer fusion or fusion assisted by fusogenic molecules.

  4. Visual Image Sensor Organ Replacement

    NASA Technical Reports Server (NTRS)

    Maluf, David A.

    2014-01-01

    This innovation is a system that augments human vision through a technique called "Sensing Super-position" using a Visual Instrument Sensory Organ Replacement (VISOR) device. The VISOR device translates visual and other sensors (i.e., thermal) into sounds to enable very difficult sensing tasks. Three-dimensional spatial brightness and multi-spectral maps of a sensed image are processed using real-time image processing techniques (e.g. histogram normalization) and transformed into a two-dimensional map of an audio signal as a function of frequency and time. Because the human hearing system is capable of learning to process and interpret extremely complicated and rapidly changing auditory patterns, the translation of images into sounds reduces the risk of accidentally filtering out important clues. The VISOR device was developed to augment the current state-of-the-art head-mounted (helmet) display systems. It provides the ability to sense beyond the human visible light range, to increase human sensing resolution, to use wider angle visual perception, and to improve the ability to sense distances. It also allows compensation for movement by the human or changes in the scene being viewed.

  5. Representation and Reconstruction of Three-dimensional Microstructures in Ni-based Superalloys

    DTIC Science & Technology

    2010-12-20

    • ... Materiala, 56, pp. 427-437 (2009)
    • Application of joint histogram and mutual information to registration and data fusion problems in serial... sectioning data sets and synthetically generated microstructures. The method is easy to use, and allows for a quantitative description of shapes.
    • Further... following objectives were achieved: we have successfully applied 3-D moment invariant analysis to several experimental data sets; we have extended 2-D...

  6. Information Design: A Bibliography.

    ERIC Educational Resources Information Center

    Albers, Michael J.; Lisberg, Beth Conney

    2000-01-01

    Presents a 17-item annotated list of essential books on information design chosen by members of the InfoDesign e-mail list. Includes a 113-item unannotated bibliography of additional works, on topics of creativity and critical thinking; visual thinking; graphic design; infographics; information design; instructional design; interface design;…

  7. Enhancing clinical concept extraction with distributional semantics

    PubMed Central

    Cohen, Trevor; Wu, Stephen; Gonzalez, Graciela

    2011-01-01

    Extracting concepts (such as drugs, symptoms, and diagnoses) from clinical narratives constitutes a basic enabling technology to unlock the knowledge within and support more advanced reasoning applications such as diagnosis explanation, disease progression modeling, and intelligent analysis of the effectiveness of treatment. The recent release of annotated training sets of de-identified clinical narratives has contributed to the development and refinement of concept extraction methods. However, as the annotation process is labor-intensive, training data are necessarily limited in the concepts and concept patterns covered, which impacts the performance of supervised machine learning applications trained with these data. This paper proposes an approach to minimize this limitation by combining supervised machine learning with empirical learning of semantic relatedness from the distribution of the relevant words in additional unannotated text. The approach uses a sequential discriminative classifier (Conditional Random Fields) to extract the mentions of medical problems, treatments and tests from clinical narratives. It takes advantage of all Medline abstracts indexed as being of the publication type “clinical trials” to estimate the relatedness between words in the i2b2/VA training and testing corpora. In addition to the traditional features such as dictionary matching, pattern matching and part-of-speech tags, we also used as a feature words that appear in similar contexts to the word in question (that is, words that have a similar vector representation measured with the commonly used cosine metric, where vector representations are derived using methods of distributional semantics). To the best of our knowledge, this is the first effort exploring the use of distributional semantics, the semantics derived empirically from unannotated text often using vector space models, for a sequence classification task such as concept extraction. Therefore, we first experimented with different sliding window models and found the model with parameters that led to best performance in a preliminary sequence labeling task. The evaluation of this approach, performed against the i2b2/VA concept extraction corpus, showed that incorporating features based on the distribution of words across a large unannotated corpus significantly aids concept extraction. Compared to a supervised-only approach as a baseline, the micro-averaged f-measure for exact match increased from 80.3% to 82.3% and the micro-averaged f-measure based on inexact match increased from 89.7% to 91.3%. These improvements are highly significant according to the bootstrap resampling method and also considering the performance of other systems. Thus, distributional semantic features significantly improve the performance of concept extraction from clinical narratives by taking advantage of word distribution information obtained from unannotated data. PMID:22085698
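
    A toy sketch of the distributional part only: co-occurrence vectors are built from unannotated sentences, and the cosine-nearest words of a token can then be appended to its feature set for the sequence classifier; corpus handling, Medline access, and the CRF itself are omitted, and all names are illustrative.

      import numpy as np

      def cooccurrence_vectors(sentences, window=2):
          # sentences: list of token lists from an unannotated corpus.
          vocab = {w: i for i, w in enumerate(sorted({w for s in sentences for w in s}))}
          M = np.zeros((len(vocab), len(vocab)))
          for s in sentences:
              for i, w in enumerate(s):
                  for j in range(max(0, i - window), min(len(s), i + window + 1)):
                      if j != i:
                          M[vocab[w], vocab[s[j]]] += 1
          return vocab, M

      def most_similar(word, vocab, M, k=3):
          # Rank vocabulary words by cosine similarity to the query word's vector.
          v = M[vocab[word]]
          norms = np.linalg.norm(M, axis=1) * (np.linalg.norm(v) + 1e-12)
          sims = M @ v / np.where(norms == 0, 1, norms)
          order = np.argsort(-sims)
          inv = {i: w for w, i in vocab.items()}
          return [inv[i] for i in order if inv[i] != word][:k]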

  8. Serial data acquisition for the X-ray plasma diagnostics with selected GEM detector structures

    NASA Astrophysics Data System (ADS)

    Czarski, T.; Chernyshova, M.; Pozniak, K. T.; Kasprowicz, G.; Zabolotny, W.; Kolasinski, P.; Krawczyk, R.; Wojenski, A.; Zienkiewicz, P.

    2015-10-01

    The measurement system based on the GEM (Gas Electron Multiplier) detector is developed for X-ray diagnostics of magnetic confinement tokamak plasmas. The paper focuses on the measurement subject and describes the fundamental data processing needed to obtain reliable characteristics (histograms) useful for physicists. The required data processing has two steps: (1) processing in the time domain, i.e. event selection for bunches of coinciding clusters, and (2) processing in the planar space domain, i.e. cluster identification for the given detector structure. It thus constitutes the software part of the project, between the electronic hardware and the physics applications. The whole project is original and was developed by the paper's authors. The previous version, based on a 1-D GEM detector, was applied to the high-resolution X-ray crystal spectrometer KX1 in the JET tokamak. The current version considers 2-D detector structures for the new data acquisition system. The fast and accurate mode of data acquisition implemented in the hardware in real time can be applied to dynamic plasma diagnostics. Several detector structures with single-pixel sensors and multi-pixel (directional) sensors are considered for two-dimensional X-ray imaging. The final data processing is presented by histograms for selected ranges of position, time interval and cluster charge values. Exemplary radiation source properties are measured via the basic cumulative characteristics: the cluster position distribution and the cluster charge value distribution corresponding to the energy spectra. A shorter version of this contribution is due to be published in PoS at the 1st EPS conference on Plasma Diagnostics.

  9. Robust Audio Watermarking by Using Low-Frequency Histogram

    NASA Astrophysics Data System (ADS)

    Xiang, Shijun

    In continuation of earlier work, in which the problem of time-scale modification (TSM) was studied [1] by modifying the shape of the audio time-domain histogram, here we consider the additional ingredient of resisting additive noise-like operations, such as Gaussian noise, lossy compression and low-pass filtering. In other words, we study the problem of making the watermark robust against both TSM and additive noise. To this end, in this paper we extract the histogram from a Gaussian-filtered low-frequency component for audio watermarking. The watermark is inserted by shaping the histogram: two consecutive bins are used as a group, and a bit is hidden by reassigning their populations. The watermarked signals are perceptually similar to the original ones. Compared with the previous time-domain watermarking scheme [1], the proposed watermarking method is more robust against additive noise, MP3 compression, low-pass filtering, etc.
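
    A hedged sketch of the two-bin rule only: the bit is read from the relative population of two consecutive bins of the low-frequency amplitude histogram, and embedding moves just enough samples across the shared edge to enforce a ratio threshold T (an assumed parameter); the filtering and the actual sample modification are omitted.

      import numpy as np

      def extract_bit(samples, edges):
          # edges = (e0, e1, e2) delimit the two consecutive bins forming one group.
          a = np.count_nonzero((samples >= edges[0]) & (samples < edges[1]))
          b = np.count_nonzero((samples >= edges[1]) & (samples < edges[2]))
          return 1 if a >= b else 0

      def samples_to_move(a, b, bit, T=1.5):
          # Number of samples to shift across the shared bin edge so that the
          # population ratio satisfies a/b >= T (bit 1) or b/a >= T (bit 0).
          if bit == 1 and a < T * b:
              return int(np.ceil((T * b - a) / (1 + T)))   # move them from bin b into bin a
          if bit == 0 and b < T * a:
              return int(np.ceil((T * a - b) / (1 + T)))   # move them from bin a into bin b
          return 0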

  10. Occupational Training Programs in Illinois Secondary Schools. Bulletin No. 44-1173.

    ERIC Educational Resources Information Center

    Illinois State Board of Vocational Education and Rehabilitation, Springfield. Div. of Vocational and Technical Education.

    The directory is an unannotated list of approved occupational training programs found in the secondary schools of Illinois. Entries are listed alphabetically by location. Counties are listed with each school location to differentiate between similar location names. Programs are grouped under five occupational areas: applied biological and…

  11. Assessment of histological differentiation in gastric cancers using whole-volume histogram analysis of apparent diffusion coefficient maps.

    PubMed

    Zhang, Yujuan; Chen, Jun; Liu, Song; Shi, Hua; Guan, Wenxian; Ji, Changfeng; Guo, Tingting; Zheng, Huanhuan; Guan, Yue; Ge, Yun; He, Jian; Zhou, Zhengyang; Yang, Xiaofeng; Liu, Tian

    2017-02-01

    To investigate the efficacy of histogram analysis of the entire tumor volume in apparent diffusion coefficient (ADC) maps for differentiating between histological grades in gastric cancer. Seventy-eight patients with gastric cancer were enrolled in a retrospective 3.0T magnetic resonance imaging (MRI) study. ADC maps were obtained at two different b values (0 and 1000 sec/mm²) for each patient. Tumors were delineated on each slice of the ADC maps, and a histogram for the entire tumor volume was subsequently generated. A series of histogram parameters (e.g., skew and kurtosis) were calculated and correlated with the histological grade of the surgical specimen. The diagnostic performance of each parameter for distinguishing poorly from moderately well-differentiated gastric cancers was assessed by using the area under the receiver operating characteristic curve (AUC). There were significant differences in the 5th, 10th, 25th, and 50th percentiles, skew, and kurtosis between poorly and well-differentiated gastric cancers (P < 0.05). There were correlations between the degrees of differentiation and histogram parameters, including the 10th percentile, skew, kurtosis, and max frequency; the correlation coefficients were 0.273, -0.361, -0.339, and -0.370, respectively. Among all the histogram parameters, the max frequency had the largest AUC value, which was 0.675. Histogram analysis of the ADC maps on the basis of the entire tumor volume can be useful in differentiating between histological grades of gastric cancer. J. Magn. Reson. Imaging 2017;45:440-449. © 2016 International Society for Magnetic Resonance in Medicine.

  12. Efficient reversible data hiding in encrypted image with public key cryptosystem

    NASA Astrophysics Data System (ADS)

    Xiang, Shijun; Luo, Xinrong

    2017-12-01

    This paper proposes a new reversible data hiding scheme for encrypted images by using the homomorphic and probabilistic properties of the Paillier cryptosystem. The proposed method can embed additional data directly into an encrypted image without any preprocessing operations on the original image. By selecting two pixels as a group for encryption, the data hider can retrieve the absolute differences of the groups of two pixels by employing a modular multiplicative inverse method. Additional data can be embedded into the encrypted image by shifting the histogram of the absolute differences using the homomorphic property in the encrypted domain. On the receiver side, a legal user can extract the marked histogram in the encrypted domain in the same way as in the data hiding procedure. Then, the hidden data can be extracted from the marked histogram and the encrypted version of the original image can be restored by using inverse histogram shifting operations. In addition, the marked absolute differences can be computed after decryption for extraction of the additional data and restoration of the original image. Compared with previous state-of-the-art works, the proposed scheme effectively avoids preprocessing operations before encryption and can efficiently embed and extract data in the encrypted domain. Experiments on standard image files also confirm the effectiveness of the proposed scheme.
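
    A plain-domain sketch of the underlying difference-histogram shifting for a single pixel pair, ignoring overflow handling; in the paper the equivalent shift is realized homomorphically on Paillier ciphertexts, which is not reproduced here, and 'peak' denotes the most frequent difference value.

      def embed_pair(p1, p2, bit, peak=0):
          d = p1 - p2
          if d == peak:
              return p1 + bit, p2            # the peak bin carries one payload bit
          if d > peak:
              return p1 + 1, p2              # shift to keep the embedding reversible
          return p1, p2

      def recover_pair(m1, m2, peak=0):
          d = m1 - m2
          if d == peak or d == peak + 1:
              return (m1 - (d - peak), m2), d - peak   # original pair, extracted bit
          if d > peak + 1:
              return (m1 - 1, m2), None                # undo the shift, no bit here
          return (m1, m2), None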

  13. Using color histogram normalization for recovering chromatic illumination-changed images.

    PubMed

    Pei, S C; Tseng, C L; Wu, C C

    2001-11-01

    We propose a novel image-recovery method using the covariance matrix of the red-green-blue (R-G-B) color histogram and tensor theories. The image-recovery method is called the color histogram normalization algorithm. It is known that the color histograms of an image taken under varied illuminations are related by a general affine transformation of the R-G-B coordinates when the illumination is changed. We propose a simplified affine model for application with illumination variation. This simplified affine model considers the effects of only three basic forms of distortion: translation, scaling, and rotation. According to this principle, we can estimate the affine transformation matrix necessary to recover images whose color distributions are varied as a result of illumination changes. We compare the normalized color histogram of the standard image with that of the tested image. By performing some operations of simple linear algebra, we can estimate the matrix of the affine transformation between two images under different illuminations. To demonstrate the performance of the proposed algorithm, we divide the experiments into two parts: computer-simulated images and real images corresponding to illumination changes. Simulation results show that the proposed algorithm is effective for both types of images. We also explain the noise-sensitive skew-rotation estimation that exists in the general affine model and demonstrate that the proposed simplified affine model without the use of skew rotation is better than the general affine model for such applications.
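
    A sketch covering only the translation and scaling components of the simplified affine model: each R, G, B channel of the test image is shifted and scaled so its mean and spread match the reference image's; the rotation component estimated from the histogram covariance matrices is omitted, and the array names are illustrative.

      import numpy as np

      def normalize_channels(test_img, ref_img):
          # Per-channel mean/std matching as a first-order color normalization.
          test = test_img.astype(float).reshape(-1, 3)
          ref = ref_img.astype(float).reshape(-1, 3)
          scale = ref.std(axis=0) / np.clip(test.std(axis=0), 1e-6, None)
          shift = ref.mean(axis=0) - scale * test.mean(axis=0)
          out = test * scale + shift
          return np.clip(out, 0, 255).reshape(test_img.shape).astype(np.uint8)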

  14. Detection of Local Tumor Recurrence After Definitive Treatment of Head and Neck Squamous Cell Carcinoma: Histogram Analysis of Dynamic Contrast-Enhanced T1-Weighted Perfusion MRI.

    PubMed

    Choi, Sang Hyun; Lee, Jeong Hyun; Choi, Young Jun; Park, Ji Eun; Sung, Yu Sub; Kim, Namkug; Baek, Jung Hwan

    2017-01-01

    This study aimed to explore the added value of histogram analysis of the ratio of initial to final 90-second time-signal intensity AUC (AUCR) for differentiating local tumor recurrence from contrast-enhancing scar on follow-up dynamic contrast-enhanced T1-weighted perfusion MRI of patients treated for head and neck squamous cell carcinoma (HNSCC). AUCR histogram parameters were assessed among tumor recurrence (n = 19) and contrast-enhancing scar (n = 27) at primary sites and compared using the t test. ROC analysis was used to determine the best differentiating parameters. The added value of AUCR histogram parameters was assessed when they were added to inconclusive conventional MRI results. Histogram analysis showed statistically significant differences in the 50th, 75th, and 90th percentiles of the AUCR values between the two groups (p < 0.05). The 90th percentile of the AUCR values (AUCR 90 ) was the best predictor of local tumor recurrence (AUC, 0.77; 95% CI, 0.64-0.91) with an estimated cutoff of 1.02. AUCR 90 increased sensitivity by 11.7% over that of conventional MRI alone when added to inconclusive results. Histogram analysis of AUCR can improve the diagnostic yield for local tumor recurrence during surveillance after treatment for HNSCC.

  15. Image barcodes

    NASA Astrophysics Data System (ADS)

    Damera-Venkata, Niranjan; Yen, Jonathan

    2003-01-01

    The visually significant two-dimensional barcode (VSB) developed by Shaked et al. is a method for designing an information-carrying two-dimensional barcode that has the appearance of a given graphical entity, such as a company logo. The encoding and decoding of information using the VSB relies on a base image with very few gray levels (typically only two), which in turn requires the image histogram to be bi-modal. For continuous-tone images such as digital photographs of individuals, the representation of tone or "shades of gray" is not only important for obtaining a pleasing rendition of the face; in most cases the VSB renders these images unrecognizable owing to its inability to represent true gray-tone variations. This paper extends the concept of the VSB to an image barcode (IBC). We enable the encoding and subsequent decoding of information embedded in hardcopy versions of continuous-tone base images, such as those acquired with a digital camera. The encoding-decoding process is modeled as robust data transmission through a noisy print-scan channel that is explicitly modeled. The IBC supports a high information capacity that differentiates it from common hardcopy watermarks. The reason for the improved image quality over the VSB is a joint encoding/halftoning strategy based on a modified version of block error diffusion. Encoder stability, image quality vs. information capacity tradeoffs, and decoding issues with and without explicit knowledge of the base image are discussed.

  16. Use of 3-dimensional surface acquisition to study facial morphology in 5 populations.

    PubMed

    Kau, Chung How; Richmond, Stephen; Zhurov, Alexei; Ovsenik, Maja; Tawfik, Wael; Borbely, Peter; English, Jeryl D

    2010-04-01

    The aim of this study was to assess the use of 3-dimensional facial averages for determining morphologic differences among various population groups. We recruited 473 subjects from 5 populations. Three-dimensional images of the subjects were obtained in a reproducible and controlled environment with a commercially available stereo-photogrammetric camera capture system. Minolta VI-900 (Konica Minolta, Tokyo, Japan) and 3dMDface (3dMD LLC, Atlanta, Ga) systems were used. Each image was obtained as a facial mesh and orientated along a triangulated axis. All faces were overlaid, one on top of the other, and a complex mathematical algorithm was performed until average composite faces of 1 man and 1 woman were achieved for each subgroup. These average facial composites were superimposed based on a previously validated superimposition method, and the facial differences were quantified. Distinct facial differences were observed among the groups. The linear differences between surface shells ranged from 0.37 to 1.00 mm for the male groups. The linear differences ranged from 0.28 to 0.87 mm for the women. The color histograms showed that the similarities in facial shells between the subgroups by sex ranged from 26.70% to 70.39% for men and 36.09% to 79.83% for women. The average linear distance from the signed color histograms for the male subgroups ranged from -6.30 to 4.44 mm. The female subgroups ranged from -6.32 to 4.25 mm. Average faces can be efficiently and effectively created from a sample of 3-dimensional faces. Average faces can be used to compare differences in facial morphologies for various populations and sexes. Facial morphologic differences were greatest when totally different ethnic variations were compared. Facial morphologic similarities were present in comparable groups, but there were large variations in concentrated areas of the face. Copyright 2010 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.

  17. Hybrid Histogram Descriptor: A Fusion Feature Representation for Image Retrieval.

    PubMed

    Feng, Qinghe; Hao, Qiaohong; Chen, Yuqi; Yi, Yugen; Wei, Ying; Dai, Jiangyan

    2018-06-15

    Currently, visual sensors are becoming increasingly affordable and widespread, accelerating the growth of image data. Image retrieval has attracted increasing interest due to space exploration, industrial, and biomedical applications. Nevertheless, designing an effective feature representation is acknowledged as a hard yet fundamental issue. This paper presents a fusion feature representation called a hybrid histogram descriptor (HHD) for image retrieval. The proposed descriptor jointly comprises two histograms: a perceptually uniform histogram, which is extracted by exploiting the color and edge orientation information in perceptually uniform regions; and a motif co-occurrence histogram, which is acquired by calculating the probability of a pair of motif patterns. To evaluate the performance, we benchmarked the proposed descriptor on the RSSCN7, AID, Outex-00013, Outex-00014 and ETHZ-53 datasets. Experimental results suggest that the proposed descriptor is more effective and robust than ten recent fusion-based descriptors under the content-based image retrieval framework. The computational complexity was also analyzed to give an in-depth evaluation. Furthermore, compared with state-of-the-art convolutional neural network (CNN)-based descriptors, the proposed descriptor achieves comparable performance but does not require any training process.

  18. Improved dose-volume histogram estimates for radiopharmaceutical therapy by optimizing quantitative SPECT reconstruction parameters

    NASA Astrophysics Data System (ADS)

    Cheng, Lishui; Hobbs, Robert F.; Segars, Paul W.; Sgouros, George; Frey, Eric C.

    2013-06-01

    In radiopharmaceutical therapy, an understanding of the dose distribution in normal and target tissues is important for optimizing treatment. Three-dimensional (3D) dosimetry takes into account patient anatomy and the nonuniform uptake of radiopharmaceuticals in tissues. Dose-volume histograms (DVHs) provide a useful summary representation of the 3D dose distribution and have been widely used for external beam treatment planning. Reliable 3D dosimetry requires an accurate 3D radioactivity distribution as the input. However, activity distribution estimates from SPECT are corrupted by noise and partial volume effects (PVEs). In this work, we systematically investigated OS-EM based quantitative SPECT (QSPECT) image reconstruction in terms of its effect on DVHs estimates. A modified 3D NURBS-based Cardiac-Torso (NCAT) phantom that incorporated a non-uniform kidney model and clinically realistic organ activities and biokinetics was used. Projections were generated using a Monte Carlo (MC) simulation; noise effects were studied using 50 noise realizations with clinical count levels. Activity images were reconstructed using QSPECT with compensation for attenuation, scatter and collimator-detector response (CDR). Dose rate distributions were estimated by convolution of the activity image with a voxel S kernel. Cumulative DVHs were calculated from the phantom and QSPECT images and compared both qualitatively and quantitatively. We found that noise, PVEs, and ringing artifacts due to CDR compensation all degraded histogram estimates. Low-pass filtering and early termination of the iterative process were needed to reduce the effects of noise and ringing artifacts on DVHs, but resulted in increased degradations due to PVEs. Large objects with few features, such as the liver, had more accurate histogram estimates and required fewer iterations and more smoothing for optimal results. Smaller objects with fine details, such as the kidneys, required more iterations and less smoothing at early time points post-radiopharmaceutical administration but more smoothing and fewer iterations at later time points when the total organ activity was lower. The results of this study demonstrate the importance of using optimal reconstruction and regularization parameters. Optimal results were obtained with different parameters at each time point, but using a single set of parameters for all time points produced near-optimal dose-volume histograms.
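
    A cumulative dose-volume histogram of the kind compared here reports, for each dose level D, the fraction of the organ volume receiving at least D. The sketch below computes one from a 3D dose array and an organ mask; the array names and the synthetic data are assumptions for illustration, not the phantom or QSPECT pipeline used in the study.

```python
# Sketch of a cumulative dose-volume histogram (DVH), assuming `dose` is a 3D array
# of absorbed dose and `organ_mask` is a boolean array selecting the organ voxels.
import numpy as np

def cumulative_dvh(dose, organ_mask, n_bins=200):
    d = dose[organ_mask].ravel()
    edges = np.linspace(0.0, d.max(), n_bins + 1)
    # Fraction of the organ volume receiving at least dose D, for each bin edge D.
    volume_fraction = np.array([(d >= D).mean() for D in edges])
    return edges, volume_fraction

# Example on synthetic data:
rng = np.random.default_rng(1)
dose = rng.gamma(shape=2.0, scale=1.5, size=(64, 64, 64))
organ_mask = np.zeros_like(dose, dtype=bool)
organ_mask[20:40, 20:40, 20:40] = True
edges, vf = cumulative_dvh(dose, organ_mask)
print(f"V(>= 3 Gy) = {vf[np.searchsorted(edges, 3.0)]:.2%} of the organ volume")
```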

  19. Factorization-based texture segmentation

    DOE PAGES

    Yuan, Jiangye; Wang, Deliang; Cheriyadat, Anil M.

    2015-06-17

    This study introduces a factorization-based approach that efficiently segments textured images. We use local spectral histograms as features, and construct an M × N feature matrix using M-dimensional feature vectors in an N-pixel image. Based on the observation that each feature can be approximated by a linear combination of several representative features, we factor the feature matrix into two matrices: one consisting of the representative features, and the other containing the weights of the representative features used at each pixel for the linear combination. The factorization method is based on singular value decomposition and nonnegative matrix factorization. The method uses local spectral histograms to discriminate region appearances in a computationally efficient way and at the same time accurately localizes region boundaries. Finally, the experiments conducted on public segmentation data sets show the promise of this simple yet powerful approach.
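
    A much-simplified sketch of this pipeline is given below: per-pixel local intensity histograms serve as the feature matrix, which is factored with non-negative matrix factorization, and each pixel is labeled by its dominant component. The windowed-histogram features and the use of scikit-learn's NMF are stand-ins for the authors' local spectral histograms and SVD/NMF procedure, not a reimplementation of it.

```python
# Simplified sketch of factorization-based texture segmentation: local intensity
# histograms as per-pixel features, factored with NMF; pixels take the label of
# their dominant component.
import numpy as np
from scipy.ndimage import uniform_filter
from sklearn.decomposition import NMF

def local_histogram_features(img, n_bins=8, window=15):
    """Per-pixel histogram of quantized intensities over a local window."""
    img = (img - img.min()) / (np.ptp(img) + 1e-9)
    q = np.minimum((img * n_bins).astype(int), n_bins - 1)
    # Averaging a per-bin indicator image over the window gives the local histogram.
    feats = [uniform_filter((q == b).astype(float), size=window) for b in range(n_bins)]
    return np.stack(feats, axis=-1)              # shape (H, W, n_bins)

def segment(img, n_segments=2, **kw):
    F = local_histogram_features(img, **kw)
    H, W, M = F.shape
    Y = F.reshape(-1, M)                          # N x M feature matrix (N = H*W pixels)
    weights = NMF(n_components=n_segments, init="nndsvda", max_iter=500).fit_transform(Y)
    return weights.argmax(axis=1).reshape(H, W)

# Toy image: two textures with different mean intensity.
rng = np.random.default_rng(0)
img = np.hstack([rng.normal(0.3, 0.05, (64, 32)), rng.normal(0.7, 0.05, (64, 32))])
labels = segment(img, n_segments=2)
```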

  20. Mechanical break junctions: enormous information in a nanoscale package.

    PubMed

    Natelson, Douglas

    2012-04-24

    Mechanical break junctions, particularly those in which a metal tip is repeatedly moved in and out of contact with a metal film, have provided many insights into electronic conduction at the atomic and molecular scale, most often by averaging over many possible junction configurations. This averaging throws away a great deal of information, and Makk et al. in this issue of ACS Nano demonstrate that, with both simulated and real experimental data, more sophisticated two-dimensional analysis methods can reveal information otherwise obscured in simple histograms. As additional measured quantities come into play in break junction experiments, including thermopower, noise, and optical response, these more sophisticated analytic approaches are likely to become even more powerful. While break junctions are not directly practical for useful electronic devices, they are incredibly valuable tools for unraveling the electronic transport physics relevant for ultrascaled nanoelectronics.

  1. A New Quantum Watermarking Based on Quantum Wavelet Transforms

    NASA Astrophysics Data System (ADS)

    Heidari, Shahrokh; Naseri, Mosayeb; Gheibi, Reza; Baghfalaki, Masoud; Rasoul Pourarian, Mohammad; Farouk, Ahmed

    2017-06-01

    Quantum watermarking is a technique for embedding specific information, usually the owner's identification, into quantum cover data for copyright protection purposes. In this paper, a new scheme for quantum watermarking based on quantum wavelet transforms is proposed, which includes scrambling, embedding and extracting procedures. The invisibility and robustness of the proposed watermarking method are confirmed by simulation. The invisibility of the scheme is examined by the peak signal-to-noise ratio (PSNR) and histogram calculations. Furthermore, the robustness of the scheme is analyzed by the bit error rate (BER) and the two-dimensional correlation (Corr 2-D). The simulation results indicate that the proposed watermarking scheme offers not only acceptable visual quality but also good resistance against different types of attack. Supported by Kermanshah Branch, Islamic Azad University, Kermanshah, Iran.

  2. MEKS: A program for computation of inclusive jet cross sections at hadron colliders

    NASA Astrophysics Data System (ADS)

    Gao, Jun; Liang, Zhihua; Soper, Davison E.; Lai, Hung-Liang; Nadolsky, Pavel M.; Yuan, C.-P.

    2013-06-01

    EKS is a numerical program that predicts differential cross sections for production of single-inclusive hadronic jets and jet pairs at next-to-leading order (NLO) accuracy in a perturbative QCD calculation. We describe MEKS 1.0, an upgraded EKS program with increased numerical precision, suitable for comparisons to the latest experimental data from the Large Hadron Collider and Tevatron. The program integrates the regularized parton-level matrix elements over the kinematical phase space for production of two and three partons using the VEGAS algorithm. It stores the generated weighted events in finely binned two-dimensional histograms for fast offline analysis. A user interface allows one to customize computation of inclusive jet observables. Results of a benchmark comparison of the MEKS program and the commonly used FastNLO program are also documented. Program Summary. Program title: MEKS 1.0 Catalogue identifier: AEOX_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEOX_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 9234 No. of bytes in distributed program, including test data, etc.: 51997 Distribution format: tar.gz Programming language: Fortran (main program), C (CUBA library and analysis program). Computer: All. Operating system: Any UNIX-like system. RAM: ~300 MB Classification: 11.1. External routines: LHAPDF (https://lhapdf.hepforge.org/) Nature of problem: Computation of differential cross sections for inclusive production of single hadronic jets and jet pairs at next-to-leading order accuracy in perturbative quantum chromodynamics. Solution method: Upon subtraction of infrared singularities, the hard-scattering matrix elements are integrated over available phase space using an optimized VEGAS algorithm. Weighted events are generated and filled into a finely binned two-dimensional histogram, from which the final cross sections with typical experimental binning and cuts are computed by an independent analysis program. Monte Carlo sampling of event weights is tuned automatically to get better efficiency. Running time: Depends on details of the calculation and sought numerical accuracy. See benchmark performance in Section 4. The tests provided take approximately 27 min for the jetbin run and a few seconds for jetana.
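
    The binned-event storage strategy described above can be illustrated with a toy example: weighted events are filled into a finely binned two-dimensional histogram (here in jet transverse momentum and rapidity) that a separate analysis step can later rebin to experimental binning. The sketch below is illustrative only and is not MEKS code; the toy spectra, weights, and bin choices are assumptions.

```python
# Illustrative sketch of finely binned, weighted 2D event storage with offline rebinning.
import numpy as np

rng = np.random.default_rng(42)
n_events = 100_000
pt = rng.exponential(scale=60.0, size=n_events) + 20.0        # toy jet pT spectrum [GeV]
y = rng.uniform(-2.5, 2.5, size=n_events)                      # toy rapidity
w = rng.normal(1.0, 0.1, size=n_events)                        # toy event weights

# Fine binning stored once ...
fine_H, pt_edges, y_edges = np.histogram2d(
    pt, y, bins=[np.linspace(20, 500, 481), np.linspace(-2.5, 2.5, 51)], weights=w
)

# ... and an offline analysis rebins to, e.g., 20 GeV-wide pT bins by summing fine bins.
coarse_pt = fine_H.reshape(24, 20, 50).sum(axis=(1, 2))
print(coarse_pt[:3])
```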

  3. Correlation of 18F-FDG PET and MRI Apparent Diffusion Coefficient Histogram Metrics with Survival in Diffuse Intrinsic Pontine Glioma: A Report from the Pediatric Brain Tumor Consortium.

    PubMed

    Zukotynski, Katherine A; Vajapeyam, Sridhar; Fahey, Frederic H; Kocak, Mehmet; Brown, Douglas; Ricci, Kelsey I; Onar-Thomas, Arzu; Fouladi, Maryam; Poussaint, Tina Young

    2017-08-01

    The purpose of this study was to describe baseline 18 F-FDG PET voxel characteristics in pediatric diffuse intrinsic pontine glioma (DIPG) and to correlate these metrics with baseline MRI apparent diffusion coefficient (ADC) histogram metrics, progression-free survival (PFS), and overall survival. Methods: Baseline brain 18 F-FDG PET and MRI scans were obtained in 33 children from Pediatric Brain Tumor Consortium clinical DIPG trials. 18 F-FDG PET images, postgadolinium MR images, and ADC MR images were registered to baseline fluid attenuation inversion recovery MR images. Three-dimensional regions of interest on fluid attenuation inversion recovery MR images and postgadolinium MR images and 18 F-FDG PET and MR ADC histograms were generated. Metrics evaluated included peak number, skewness, and kurtosis. Correlation between PET and MR ADC histogram metrics was evaluated. PET pixel values within the region of interest for each tumor were plotted against MR ADC values. The association of these imaging markers with survival was described. Results: PET histograms were almost always unimodal (94%, vs. 6% bimodal). None of the PET histogram parameters (skewness or kurtosis) had a significant association with PFS, although a higher PET postgadolinium skewness tended toward a less favorable PFS (hazard ratio, 3.48; 95% confidence interval [CI], 0.75-16.28 [ P = 0.11]). There was a significant association between higher MR ADC postgadolinium skewness and shorter PFS (hazard ratio, 2.56; 95% CI, 1.11-5.91 [ P = 0.028]), and there was the suggestion that this also led to shorter overall survival (hazard ratio, 2.18; 95% CI, 0.95-5.04 [ P = 0.067]). Higher MR ADC postgadolinium kurtosis tended toward shorter PFS (hazard ratio, 1.30; 95% CI, 0.98-1.74 [ P = 0.073]). PET and MR ADC pixel values were negatively correlated using the Pearson correlation coefficient. Further, the level of PET and MR ADC correlation was significantly positively associated with PFS; tumors with higher values of ADC-PET correlation had more favorable PFS (hazard ratio, 0.17; 95% CI, 0.03-0.89 [ P = 0.036]), suggesting that a higher level of negative ADC-PET correlation leads to less favorable PFS. A more significant negative correlation may indicate higher-grade elements within the tumor leading to poorer outcomes. Conclusion: 18 F-FDG PET and MR ADC histogram metrics in pediatric DIPG demonstrate different characteristics with often a negative correlation between PET and MR ADC pixel values. A higher negative correlation is associated with a worse PFS, which may indicate higher-grade elements within the tumor. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.

  4. Naturalness preservation image contrast enhancement via histogram modification

    NASA Astrophysics Data System (ADS)

    Tian, Qi-Chong; Cohen, Laurent D.

    2018-04-01

    Contrast enhancement is a technique for improving image contrast to obtain better visual quality. Since many existing contrast enhancement algorithms produce over-enhanced results, naturalness preservation needs to be considered within the contrast enhancement framework. This paper proposes a naturalness-preserving contrast enhancement method, which adopts histogram matching to improve contrast and uses image quality assessment to automatically select the optimal target histogram. Contrast improvement and naturalness preservation are both considered in the target histogram, so this method can avoid the over-enhancement problem. In the proposed method, the optimal target histogram is a weighted sum of the original histogram, a uniform histogram, and a Gaussian-shaped histogram. A structural metric and a statistical naturalness metric are then used to determine the weights of the corresponding histograms. Finally, the contrast-enhanced image is obtained by matching the optimal target histogram. Experiments demonstrate that the proposed method outperforms the compared histogram-based contrast enhancement algorithms.
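
    A sketch of the matching step is given below: the target histogram is formed as a weighted sum of the original, uniform, and Gaussian-shaped histograms, and the image is mapped onto it by classical histogram specification. The fixed weights and Gaussian parameters are assumptions for illustration; the paper selects the weights automatically with quality metrics, which is omitted here.

```python
# Sketch of histogram matching against a weighted target histogram, with fixed
# (assumed) weights rather than the quality-metric-driven selection in the paper.
import numpy as np

def match_to_target(img_u8, w_orig=0.4, w_uniform=0.3, w_gauss=0.3, mu=128.0, sigma=40.0):
    levels = np.arange(256)
    h_orig = np.bincount(img_u8.ravel(), minlength=256).astype(float)
    h_orig /= h_orig.sum()
    h_uniform = np.full(256, 1.0 / 256)
    h_gauss = np.exp(-0.5 * ((levels - mu) / sigma) ** 2)
    h_gauss /= h_gauss.sum()

    target = w_orig * h_orig + w_uniform * h_uniform + w_gauss * h_gauss
    # Classic histogram specification: map the source CDF onto the target CDF.
    cdf_src = np.cumsum(h_orig)
    cdf_tgt = np.cumsum(target)
    lut = np.interp(cdf_src, cdf_tgt, levels).round().astype(np.uint8)
    return lut[img_u8]

rng = np.random.default_rng(0)
low_contrast = rng.integers(90, 160, size=(128, 128), dtype=np.uint8)
enhanced = match_to_target(low_contrast)
```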

  5. A Selected Bibliography on International Education.

    ERIC Educational Resources Information Center

    Foreign Policy Association, New York, NY.

    This unannotated bibliography is divided into four major sections; 1) General Background Readings for Teachers; 2) Approaches and Methods; 3) Materials for the Classroom; and, 4) Sources of Information and Materials. It offers a highly selective list of items which provide wide coverage of the field. Included are items on foreign policy, war and…

  6. The Sociology of Family Health. A Bibliography.

    ERIC Educational Resources Information Center

    Jumba-Masagazi, A. H. K., Comp.

    This unannotated bibliography is on man, his family, the society he makes and lives in, and his health. It is about man and his East African environment. It attempts to bring together both the applied and social sciences as they affect the family. Among the disciplines drawn from are: anthropology, sociology, medicine, religion, economics, labor…

  7. DWI-associated entire-tumor histogram analysis for the differentiation of low-grade prostate cancer from intermediate-high-grade prostate cancer.

    PubMed

    Wu, Chen-Jiang; Wang, Qing; Li, Hai; Wang, Xiao-Ning; Liu, Xi-Sheng; Shi, Hai-Bin; Zhang, Yu-Dong

    2015-10-01

    To investigate diagnostic efficiency of DWI using entire-tumor histogram analysis in differentiating the low-grade (LG) prostate cancer (PCa) from intermediate-high-grade (HG) PCa in comparison with conventional ROI-based measurement. DW images (b of 0-1400 s/mm(2)) from 126 pathology-confirmed PCa (diameter >0.5 cm) in 110 patients were retrospectively collected and processed by mono-exponential model. The measurement of tumor apparent diffusion coefficients (ADCs) was performed with using histogram-based and ROI-based approach, respectively. The diagnostic ability of ADCs from two methods for differentiating LG-PCa (Gleason score, GS ≤ 6) from HG-PCa (GS > 6) was determined by ROC regression, and compared by McNemar's test. There were 49 LG-tumor and 77 HG-tumor at pathologic findings. Histogram-based ADCs (mean, median, 10th and 90th) and ROI-based ADCs (mean) showed dominant relationships with ordinal GS of Pca (ρ = -0.225 to -0.406, p < 0.05). All above imaging indices reflected significant difference between LG-PCa and HG-PCa (all p values <0.01). Histogram 10th ADCs had dominantly high Az (0.738), Youden index (0.415), and positive likelihood ratio (LR+, 2.45) in stratifying tumor GS against mean, median and 90th ADCs, and ROI-based ADCs. Histogram mean, median, and 10th ADCs showed higher specificity (65.3%-74.1% vs. 44.9%, p < 0.01), but lower sensitivity (57.1%-71.3% vs. 84.4%, p < 0.05) than ROI-based ADCs in differentiating LG-PCa from HG-PCa. DWI-associated histogram analysis had higher specificity, Az, Youden index, and LR+ for differentiation of PCa Gleason grade than ROI-based approach.

  8. Measuring the apparent diffusion coefficient in primary rectal tumors: is there a benefit in performing histogram analyses?

    PubMed

    van Heeswijk, Miriam M; Lambregts, Doenja M J; Maas, Monique; Lahaye, Max J; Ayas, Z; Slenter, Jos M G M; Beets, Geerard L; Bakers, Frans C H; Beets-Tan, Regina G H

    2017-06-01

    The apparent diffusion coefficient (ADC) is a potential prognostic imaging marker in rectal cancer. Typically, mean ADC values are used, derived from precise manual whole-volume tumor delineations by experts. The aim was first to explore whether non-precise circular delineation combined with histogram analysis can be a less cumbersome alternative to acquire similar ADC measurements and second to explore whether histogram analyses provide additional prognostic information. Thirty-seven patients who underwent a primary staging MRI including diffusion-weighted imaging (DWI; b0, 25, 50, 100, 500, 1000; 1.5 T) were included. Volumes-of-interest (VOIs) were drawn on b1000-DWI: (a) precise delineation, manually tracing tumor boundaries (2 expert readers), and (b) non-precise delineation, drawing circular VOIs with a wide margin around the tumor (2 non-experts). Mean ADC and histogram metrics (mean, min, max, median, SD, skewness, kurtosis, 5th-95th percentiles) were derived from the VOIs and delineation time was recorded. Measurements were compared between the two methods and correlated with prognostic outcome parameters. Median delineation time reduced from 47-165 s (precise) to 21-43 s (non-precise). The 45th percentile of the non-precise delineation showed the best correlation with the mean ADC from the precise delineation as the reference standard (ICC 0.71-0.75). None of the mean ADC or histogram parameters showed significant prognostic value; only the total tumor volume (VOI) was significantly larger in patients with positive clinical N stage and mesorectal fascia involvement. When performing non-precise tumor delineation, histogram analysis (in specific 45th ADC percentile) may be used as an alternative to obtain similar ADC values as with precise whole tumor delineation. Histogram analyses are not beneficial to obtain additional prognostic information.

  9. The histogram analysis of diffusion-weighted intravoxel incoherent motion (IVIM) imaging for differentiating the gleason grade of prostate cancer.

    PubMed

    Zhang, Yu-Dong; Wang, Qing; Wu, Chen-Jiang; Wang, Xiao-Ning; Zhang, Jing; Liu, Hui; Liu, Xi-Sheng; Shi, Hai-Bin

    2015-04-01

    To evaluate histogram analysis of intravoxel incoherent motion (IVIM) for discriminating the Gleason grade of prostate cancer (PCa). A total of 48 patients pathologically confirmed as having clinically significant PCa (size > 0.5 cm) underwent preoperative DW-MRI (b of 0-900 s/mm(2)). Data was post-processed by monoexponential and IVIM model for quantitation of apparent diffusion coefficients (ADCs), perfusion fraction f, diffusivity D and pseudo-diffusivity D*. Histogram analysis was performed by outlining entire-tumour regions of interest (ROIs) from histological-radiological correlation. The ability of imaging indices to differentiate low-grade (LG, Gleason score (GS) ≤6) from intermediate/high-grade (HG, GS > 6) PCa was analysed by ROC regression. Eleven patients had LG tumours (18 foci) and 37 patients had HG tumours (42 foci) on pathology examination. HG tumours had significantly lower ADCs and D in terms of mean, median, 10th and 75th percentiles, combined with higher histogram kurtosis and skewness for ADCs, D and f, than LG PCa (p < 0.05). Histogram D showed relatively higher correlations (ρ = 0.641-0.668 vs. ADCs: 0.544-0.574) with ordinal GS of PCa; and its mean, median and 10th percentile performed better than ADCs did in distinguishing LG from HG PCa. It is feasible to stratify the pathological grade of PCa by IVIM with histogram metrics. D performed better in distinguishing LG from HG tumour than conventional ADCs. • GS had relatively higher correlation with tumour D than ADCs. • Difference of histogram D among two-grade tumours was statistically significant. • D yielded better individual features in demonstrating tumour grade than ADC. • D* and f failed to determine tumour grade of PCa.

  10. The application of dimensional analysis to the problem of solar wind-magnetosphere energy coupling

    NASA Technical Reports Server (NTRS)

    Bargatze, L. F.; Mcpherron, R. L.; Baker, D. N.; Hones, E. W., Jr.

    1984-01-01

    The constraints imposed by dimensional analysis are used to find how the solar wind-magnetosphere energy transfer rate depends upon interplanetary parameters. The analyses assume that only magnetohydrodynamic processes are important in controlling the rate of energy transfer. The study utilizes ISEE-3 solar wind observations, the AE index, and UT from three 10-day intervals during the International Magnetospheric Study. Simple linear regression and histogram techniques are used to find the value of the magnetohydrodynamic coupling exponent, alpha, which is consistent with observations of magnetospheric response. Once alpha is estimated, the form of the solar wind energy transfer rate is obtained by substitution into an equation of the interplanetary variables whose exponents depend upon alpha.

  11. Whole-tumor apparent diffusion coefficient (ADC) histogram analysis to differentiate benign peripheral neurogenic tumors from soft tissue sarcomas.

    PubMed

    Nakajo, Masanori; Fukukura, Yoshihiko; Hakamada, Hiroto; Yoneyama, Tomohide; Kamimura, Kiyohisa; Nagano, Satoshi; Nakajo, Masayuki; Yoshiura, Takashi

    2018-02-22

    Apparent diffusion coefficient (ADC) histogram analyses have been used to differentiate tumor grades and predict therapeutic responses in various anatomic sites with moderate success. To determine the ability of diffusion-weighted imaging (DWI) with a whole-tumor ADC histogram analysis to differentiate benign peripheral neurogenic tumors (BPNTs) from soft tissue sarcomas (STSs). Retrospective study, single institution. In all, 25 BPNTs and 31 STSs. Two-b-value DWI (b-values = 0 and 1000 s/mm²) was performed at 3.0T. Whole-tumor ADC histogram parameters were calculated by two radiologists and compared between BPNTs and STSs. Nonparametric tests were performed for comparisons between BPNTs and STSs. P < 0.05 was considered statistically significant. The ability of each parameter to differentiate STSs from BPNTs was evaluated using area under the curve (AUC) values derived from a receiver operating characteristic curve analysis. The mean ADC and all percentile parameters were significantly lower in STSs than in BPNTs (P < 0.001-0.009), with AUCs of 0.703-0.773. However, the coefficient of variation (P = 0.020 and AUC = 0.682) and skewness (P = 0.012 and AUC = 0.697) were significantly higher in STSs than in BPNTs. Kurtosis (P = 0.295) and entropy (P = 0.604) did not differ significantly between BPNTs and STSs. Whole-tumor ADC histogram parameters except kurtosis and entropy differed significantly between BPNTs and STSs. Level of Evidence: 3. Technical Efficacy: Stage 2. J. Magn. Reson. Imaging 2018. © 2018 International Society for Magnetic Resonance in Medicine.

  12. A semi-supervised learning framework for biomedical event extraction based on hidden topics.

    PubMed

    Zhou, Deyu; Zhong, Dayou

    2015-05-01

    Scientists have devoted decades of effort to understanding the interactions between proteins and RNA production. This information can strengthen current knowledge of drug reactions and the development of certain diseases. Nevertheless, because it lacks explicit structure, the life science literature, one of the most important sources of this information, is difficult for computer-based systems to access. Therefore, biomedical event extraction, which automatically acquires knowledge of molecular events from research articles, has recently attracted community-wide efforts. Most approaches are based on statistical models and require large-scale annotated corpora to estimate model parameters precisely; such corpora are usually difficult to obtain in practice. Employing unannotated data via semi-supervised learning for biomedical event extraction is therefore a feasible solution and is attracting growing interest. In this paper, a semi-supervised learning framework based on hidden topics for biomedical event extraction is presented. In this framework, sentences in the unannotated corpus are automatically assigned event annotations based on their distances to sentences in the annotated corpus. More specifically, not only the structures of the sentences but also the hidden topics embedded in them are used to describe this distance. The sentences and their newly assigned event annotations, together with the annotated corpus, are used for training. Experiments were conducted on the multi-level event extraction corpus, a gold standard corpus. Experimental results show that the proposed framework achieves an improvement of more than 2.2% in F-score on biomedical event extraction compared to the state-of-the-art approach. The results suggest that by incorporating unannotated data, the proposed framework indeed improves the performance of the state-of-the-art event extraction system, and that the similarity between sentences can be precisely described by hidden topics and sentence structure. Copyright © 2015 Elsevier B.V. All rights reserved.

  13. [The value of spectral frequency analysis by Doppler examination (author's transl)].

    PubMed

    Boccalon, H; Reggi, M; Lozes, A; Canal, C; Jausseran, J M; Courbier, R; Puel, P; Enjalbert, A

    1981-01-01

    Arterial stenoses of moderate extent may involve modifications of the blood flow. Arterial shading is not always examined at the best incident angle to assess the extent of the stenosis. Spectral frequency analysis by Doppler examination is a good means of evaluating the effect of moderate arterial lesions. The present study was carried out with a Doppler effect having an acoustic spectrum, which is shown in a histogram having 16 frequency bands. The values were recorded on the two femoral arteries. A study was also made of 49 normal subjects so as to establish a normal envelope histogram, taking into account the following parameters: maximum peak (800 Hz), low cut-off frequency (420 Hz), high cut-off frequency (2,600 Hz); the first peak was found to be present in 81% of the subjects (at 375 Hz) and the second peak in 75% of the subjects (2,020 Hz). Thirteen patients with iliac lesions of different extent were included in the study; details of these lesions were established in all cases by aortography. None of the recorded frequency histograms were located within the normal envelope. Two cases of moderate iliac stenoses were noted (<50% of the diameter) which interfered with the histogram, even though the femoral velocity signal was normal.

  14. Universal and adapted vocabularies for generic visual categorization.

    PubMed

    Perronnin, Florent

    2008-07-01

    Generic Visual Categorization (GVC) is the pattern classification problem which consists in assigning labels to an image based on its semantic content. This is a challenging task as one has to deal with inherent object/scene variations as well as changes in viewpoint, lighting and occlusion. Several state-of-the-art GVC systems use a vocabulary of visual terms to characterize images with a histogram of visual word counts. We propose a novel practical approach to GVC based on a universal vocabulary, which describes the content of all the considered classes of images, and class vocabularies obtained through the adaptation of the universal vocabulary using class-specific data. The main novelty is that an image is characterized by a set of histograms - one per class - where each histogram describes whether the image content is best modeled by the universal vocabulary or the corresponding class vocabulary. This framework is applied to two types of local image features: low-level descriptors such as the popular SIFT and high-level histograms of word co-occurrences in a spatial neighborhood. It is shown experimentally on two challenging datasets (an in-house database of 19 categories and the PASCAL VOC 2006 dataset) that the proposed approach exhibits state-of-the-art performance at a modest computational cost.
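
    The visual-word histogram representation that this work builds on can be sketched as follows: local descriptors pooled over a training collection are clustered into a vocabulary, and an image is then described by the normalized counts of its descriptors' nearest visual words. In the sketch below, small raw patches stand in for SIFT descriptors and the toy images are random; both are assumptions for illustration rather than the authors' setup, which additionally adapts class-specific vocabularies.

```python
# Sketch of the baseline bag-of-visual-words representation: cluster local
# descriptors into a vocabulary, then describe each image by a word-count histogram.
import numpy as np
from sklearn.cluster import KMeans

def extract_patches(img, patch=8, step=8):
    """Small raw patches used here as stand-ins for SIFT descriptors."""
    H, W = img.shape
    return np.array([img[r:r + patch, c:c + patch].ravel()
                     for r in range(0, H - patch + 1, step)
                     for c in range(0, W - patch + 1, step)])

rng = np.random.default_rng(0)
training_images = [rng.random((64, 64)) for _ in range(10)]     # toy training collection

# "Universal" vocabulary: cluster descriptors pooled over all training images.
vocab = KMeans(n_clusters=32, n_init=4, random_state=0).fit(
    np.vstack([extract_patches(im) for im in training_images]))

def word_histogram(img):
    words = vocab.predict(extract_patches(img))
    return np.bincount(words, minlength=vocab.n_clusters) / len(words)

query_hist = word_histogram(rng.random((64, 64)))
```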

  15. SUBSTRUCTURE WITHIN THE SSA22 PROTOCLUSTER AT z ≈ 3.09

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Topping, Michael W.; Shapley, Alice E.; Steidel, Charles C., E-mail: mtopping@astro.ucla.edu

    We present the results of a densely sampled spectroscopic survey of the SSA22 protocluster at z ≈ 3.09. Our sample with Keck/LRIS spectroscopy includes 106 Ly α emitters (LAEs) and 40 Lyman break galaxies (LBGs) at z = 3.05–3.12. These galaxies are contained within the 9′ × 9′ region in which the protocluster was discovered, which also hosts the maximum galaxy overdensity in the SSA22 region. The redshift histogram of our spectroscopic sample reveals two distinct peaks, at z = 3.069 (blue; 43 galaxies) and z = 3.095 (red; 103 galaxies). Furthermore, objects in the blue and red peaks are segregated on the sky, with galaxies in the blue peak concentrating toward the western half of the field. These results suggest that the blue and red redshift peaks represent two distinct structures in physical space. Although the double-peaked redshift histogram is traced in the same manner by LBGs and LAEs, and brighter and fainter galaxies, we find that 9 out of 10 X-ray AGNs in SSA22, and all 7 spectroscopically confirmed giant Ly α “blobs,” reside in the red peak. We combine our data set with sparsely sampled spectroscopy from the literature over a significantly wider area, finding preliminary evidence that the double-peaked structure in redshift space extends beyond the region of our dense spectroscopic sampling. In order to fully characterize the three-dimensional structure, dynamics, and evolution of large-scale structure in the SSA22 overdensity, we require the measurement of large samples of LAE and LBG redshifts over a significantly wider area, as well as detailed comparisons with cosmological simulations of massive cluster formation.

  16. Assessing correlations between the spatial distribution of the dose to the rectal wall and late rectal toxicity after prostate radiotherapy: an analysis of data from the MRC RT01 trial (ISRCTN 47772397)

    NASA Astrophysics Data System (ADS)

    Buettner, Florian; Gulliford, Sarah L.; Webb, Steve; Sydes, Matthew R.; Dearnaley, David P.; Partridge, Mike

    2009-11-01

    Many studies have been performed to assess correlations between measures derived from dose-volume histograms and late rectal toxicities for radiotherapy of prostate cancer. The purpose of this study was to quantify correlations between measures describing the shape and location of the dose distribution and different outcomes. The dose to the rectal wall was projected on a two-dimensional map. In order to characterize the dose distribution, its centre of mass, longitudinal and lateral extent, and eccentricity were calculated at different dose levels. Furthermore, the dose-surface histogram (DSH) was determined. Correlations between these measures and seven clinically relevant rectal-toxicity endpoints were quantified by maximally selected standardized Wilcoxon rank statistics. The analysis was performed using data from the RT01 prostate radiotherapy trial. For some endpoints, the shape of the dose distribution is more strongly correlated with the outcome than simple DSHs. Rectal bleeding was most strongly correlated with the lateral extent of the dose distribution. For loose stools, the strongest correlations were found for longitudinal extent; proctitis was most strongly correlated with DSH. For the other endpoints no statistically significant correlations could be found. The strengths of the correlations between the shape of the dose distribution and outcome differed considerably between the different endpoints. Due to these significant correlations, it is desirable to use shape-based tools in order to assess the quality of a dose distribution.

  17. Selecting a Variable for Predicting the Diagnosis of PTB Patients From Comparison of Chest X-ray Images

    NASA Astrophysics Data System (ADS)

    Mohd. Rijal, Omar; Mohd. Noor, Norliza; Teng, Shee Lee

    A statistical method of comparing two digital chest radiographs of pulmonary tuberculosis (PTB) patients has been proposed. After applying appropriate image registration procedures, a selected subset of each image is converted to an image histogram (or box plot). Comparing two chest X-ray images is then equivalent to directly comparing the two corresponding histograms. From each histogram, eleven percentiles (of image intensity) are calculated. The number of percentiles that shift to the left (NLSP) when the second image is compared to the first has been shown to be an indicator of a patient's progress. In this study, the values of NLSP are compared with the actual diagnosis (Y) of several medical practitioners. A logistic regression model is used to study the relationship between NLSP and Y. This study showed that NLSP may be used as an alternative or second opinion for Y. The proposed regression model also shows that important explanatory variables, such as the outcome of the sputum test (Z) and the degree of image registration (W), may be omitted when estimating Y-values.
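
    A minimal sketch of the NLSP computation and the logistic regression of diagnosis on NLSP is given below; the registered image arrays, the choice of deciles as the eleven percentiles, and the toy cohort are assumptions for illustration, not the study's data.

```python
# Hedged sketch of the NLSP idea, assuming `img_before` and `img_after` are already
# registered grayscale arrays, and per-patient NLSP values and diagnoses are available.
import numpy as np
from sklearn.linear_model import LogisticRegression

PERCENTILES = np.linspace(0, 100, 11)          # eleven percentiles (assumed to be deciles)

def nlsp(img_before, img_after):
    """Number of intensity percentiles that shift to the left (decrease) in the second image."""
    p_before = np.percentile(img_before, PERCENTILES)
    p_after = np.percentile(img_after, PERCENTILES)
    return int(np.sum(p_after < p_before))

# Toy cohort: NLSP per patient and the practitioners' diagnosis (1 = improved).
nlsp_values = np.array([[9], [10], [2], [1], [8], [3], [11], [0], [7], [4]])
diagnoses = np.array([1, 1, 0, 0, 1, 0, 1, 0, 1, 0])
model = LogisticRegression().fit(nlsp_values, diagnoses)
print(model.predict_proba([[6]])[0, 1])        # estimated probability of improvement at NLSP = 6
```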

  18. The Holocaust: A Selected Monographic Bibliography.

    ERIC Educational Resources Information Center

    Silverstein, Leah, Comp.

    This unannotated bibliography on the Holocaust was prepared in the hope that it will be a tool for better understanding of this event. The 2,145 items in the bibliography are books found in the collections of the Library of Congress published between 1980 and 1992. All entries contain Library of Congress Call Numbers. The books are in various…

  19. Bibliography on Fetal Alcohol Syndrome and Related Issues. Second Edition.

    ERIC Educational Resources Information Center

    All Indian Pueblo Council, Albuquerque, NM.

    The bibliography on Fetal Alcohol Syndrome presents 312 unannotated journal articles for use by professionals working with American Indian people and is designed to serve as a vital source of knowledge on alcohol and child health. The bibliography is intended to list articles on Fetal Alcohol Syndrome and humans, and only highlight a minimal…

  20. Twenty Years of Tannen: An Extensive Bibliography of the Writings of Deborah F. Tannen, 1976-1995.

    ERIC Educational Resources Information Center

    Cullum, Linda, Comp.; And Others

    This unannotated bibliography features 99 listings of books and articles on, among other topics, language, linguistics, conversation, and gender, all written by the influential sociolinguist, Deborah Tannen. It also offers 10 listings of works co-authored or co-edited by Tannen. Although the bibliography focuses on written works published in…

  1. Gray-level transformations for interactive image enhancement. M.S. Thesis. Final Technical Report

    NASA Technical Reports Server (NTRS)

    Fittes, B. A.

    1975-01-01

    A gray-level transformation method suitable for interactive image enhancement was presented. It is shown that the well-known histogram equalization approach is a special case of this method. A technique for improving the uniformity of a histogram is also developed. Experimental results which illustrate the capabilities of both algorithms are described. Two proposals for implementing gray-level transformations in a real-time interactive image enhancement system are also presented.
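
    The special case mentioned above, histogram equalization, is itself a gray-level transformation built from the image's cumulative gray-level distribution; a minimal sketch follows, with the toy image as an assumption for illustration.

```python
# Sketch of histogram equalization as a gray-level transformation T(r) = 255 * CDF(r).
import numpy as np

def equalize(img_u8):
    hist = np.bincount(img_u8.ravel(), minlength=256)
    cdf = np.cumsum(hist) / img_u8.size
    lut = np.round(255 * cdf).astype(np.uint8)    # the gray-level lookup table
    return lut[img_u8]

rng = np.random.default_rng(0)
dark_image = rng.integers(0, 100, size=(256, 256), dtype=np.uint8)
print(equalize(dark_image).max())                 # gray levels spread toward the full 0-255 range
```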

  2. Study on the application of MRF and the D-S theory to image segmentation of the human brain and quantitative analysis of the brain tissue

    NASA Astrophysics Data System (ADS)

    Guan, Yihong; Luo, Yatao; Yang, Tao; Qiu, Lei; Li, Junchang

    2012-01-01

    The spatial information captured by a Markov random field (MRF) model was used in image segmentation; it can effectively remove noise and yield more accurate segmentation results. Based on the fuzziness and clustering of pixel grayscale information, the clustering centers of the different tissues and the background of a medical image are found with the fuzzy c-means clustering method. Each threshold point for multi-threshold segmentation is then found with a two-dimensional histogram method, and the image is segmented. Multivariate information is fused on the basis of Dempster-Shafer evidence theory to obtain the fused segmentation. This paper adopts the above three theories to propose a new human brain image segmentation method. Experimental results show that the segmentation result is more in line with human vision and is of vital significance for the accurate analysis and application of brain tissues.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Orea, Adrian; Betancourt, Minerba

    The objective of this project was to use MINERvA data to tune the simulation models in order to obtain the precision needed for current and future neutrino experiments. In order to do this, the current models need to be validated and then improved. Validation was done by recreating figures that have been used in previous publications, comparing data from the detector with the simulation model (GENIE). Additionally, a newer version of GENIE was compared to the version used for the publications, both to validate the new version and to note any improvements. Another objective was to add new samples into the NUISANCE framework, which was used to compare data from the detector and the simulation models. Specifically, the added sample was the two-dimensional histogram of the double-differential cross section as a function of the transverse and z-direction momentum for numu and numubar, which was also used for validation.

  4. Histogram contrast analysis and the visual segregation of IID textures.

    PubMed

    Chubb, C; Econopouly, J; Landy, M S

    1994-09-01

    A new psychophysical methodology is introduced, histogram contrast analysis, that allows one to measure stimulus transformations, f, used by the visual system to draw distinctions between different image regions. The method involves the discrimination of images constructed by selecting texture micropatterns randomly and independently (across locations) on the basis of a given micropattern histogram. Different components of f are measured by use of different component functions to modulate the micropattern histogram until the resulting textures are discriminable. When no discrimination threshold can be obtained for a given modulating component function, a second titration technique may be used to measure the contribution of that component to f. The method includes several strong tests of its own assumptions. An example is given of the method applied to visual textures composed of small, uniform squares with randomly chosen gray levels. In particular, for a fixed mean gray level mu and a fixed gray-level variance sigma 2, histogram contrast analysis is used to establish that the class S of all textures composed of small squares with jointly independent, identically distributed gray levels with mean mu and variance sigma 2 is perceptually elementary in the following sense: there exists a single, real-valued function f S of gray level, such that two textures I and J in S are discriminable only if the average value of f S applied to the gray levels in I is significantly different from the average value of f S applied to the gray levels in J. Finally, histogram contrast analysis is used to obtain a seventh-order polynomial approximation of f S.

  5. Insight on AV-45 binding in white and grey matter from histogram analysis: a study on early Alzheimer's disease patients and healthy subjects

    PubMed Central

    Nemmi, Federico; Saint-Aubert, Laure; Adel, Djilali; Salabert, Anne-Sophie; Pariente, Jérémie; Barbeau, Emmanuel; Payoux, Pierre; Péran, Patrice

    2014-01-01

    Purpose: The AV-45 amyloid biomarker is known to show uptake in white matter in patients with Alzheimer's disease (AD) but also in healthy populations. This binding, thought to be of a non-specific lipophilic nature, has not yet been investigated. The aim of this study was to determine the differential pattern of AV-45 binding in white matter in healthy and pathological populations. Methods: We recruited 24 patients presenting with early-stage AD and 17 matched healthy subjects. We used an optimized PET-MRI registration method and an approach based on intensity histograms using several indexes. We compared the results of the intensity histogram analyses with a more canonical approach based on the target-to-cerebellum standardized uptake value ratio (SUVr) in white and grey matter using MANOVA and discriminant analyses. A cluster analysis on white and grey matter histograms was also performed. Results: White matter histogram analysis revealed significant differences between AD and healthy subjects that were not revealed by SUVr analysis. However, white matter histograms alone were not decisive for discriminating the groups, and indexes based on grey matter only showed better discriminative power than SUVr. The cluster analysis divided our sample into two clusters, showing different uptake in grey but also in white matter. Conclusion: These results demonstrate that AV-45 binding in white matter conveys subtle information not detectable using the SUVr approach. Although it is not better than standard SUVr for discriminating AD patients from healthy subjects, this information could reveal white matter modifications. PMID:24573658

  6. Texton-based analysis of paintings

    NASA Astrophysics Data System (ADS)

    van der Maaten, Laurens J. P.; Postma, Eric O.

    2010-08-01

    The visual examination of paintings is traditionally performed by skilled art historians using their eyes. Recent advances in intelligent systems may support art historians in determining the authenticity or date of creation of paintings. In this paper, we propose a technique for the examination of brushstroke structure that views the wildly overlapping brushstrokes as texture. The analysis of the painting texture is performed with the help of a texton codebook, i.e., a codebook of small prototypical textural patches. The texton codebook can be learned from a collection of paintings. Our textural analysis technique represents paintings in terms of histograms that measure the frequency with which the textons in the codebook occur in the painting (so-called texton histograms). We present experiments that show the validity and effectiveness of our technique for textural analysis on a collection of digitized high-resolution reproductions of paintings by Van Gogh and his contemporaries. As texton histograms cannot easily be interpreted by art experts, the paper proposes two approaches to visualize the results of the textural analysis. The first approach visualizes the similarities between the histogram representations of paintings by employing a recently proposed dimensionality reduction technique, called t-SNE. We show that t-SNE reveals a clear separation of paintings created by Van Gogh and those created by other painters. In addition, the period of creation is faithfully reflected in the t-SNE visualizations. The second approach visualizes the similarities and differences between paintings by highlighting regions in a painting in which the textural structure of the painting is unusual. We illustrate the validity of this approach by means of an experiment in which we highlight regions in a painting by Monet that are not very "Van Gogh-like". Taken together, we believe the tools developed in this study are well suited to assisting art historians in their study of paintings.

  7. Impact of the radiotherapy technique on the correlation between dose-volume histograms of the bladder wall defined on MRI imaging and dose-volume/surface histograms in prostate cancer patients

    NASA Astrophysics Data System (ADS)

    Maggio, Angelo; Carillo, Viviana; Cozzarini, Cesare; Perna, Lucia; Rancati, Tiziana; Valdagni, Riccardo; Gabriele, Pietro; Fiorino, Claudio

    2013-04-01

    The aim of this study was to evaluate the correlation between the ‘true’ absolute and relative dose-volume histograms (DVHs) of the bladder wall, dose-wall histogram (DWH) defined on MRI imaging and other surrogates of bladder dosimetry in prostate cancer patients, planned both with 3D-conformal and intensity-modulated radiation therapy (IMRT) techniques. For 17 prostate cancer patients, previously treated with radical intent, CT and MRI scans were acquired and matched. The contours of bladder walls were drawn by using MRI images. External bladder surfaces were then used to generate artificial bladder walls by performing automatic contractions of 5, 7 and 10 mm. For each patient a 3D conformal radiotherapy (3DCRT) and an IMRT treatment plan was generated with a prescription dose of 77.4 Gy (1.8 Gy/fr) and DVH of the whole bladder of the artificial walls (DVH-5/10) and dose-surface histograms (DSHs) were calculated and compared against the DWH in absolute and relative value, for both treatment planning techniques. A specific software (VODCA v. 4.4.0, MSS Inc.) was used for calculating the dose-volume/surface histogram. Correlation was quantified for selected dose-volume/surface parameters by the Spearman correlation coefficient. The agreement between %DWH and DVH5, DVH7 and DVH10 was found to be very good (maximum average deviations below 2%, SD < 5%): DVH5 showed the best agreement. The correlation was slightly better for absolute (R = 0.80-0.94) compared to relative (R = 0.66-0.92) histograms. The DSH was also found to be highly correlated with the DWH, although slightly higher deviations were generally found. The DVH was not a good surrogate of the DWH (R < 0.7 for most of parameters). When comparing the two treatment techniques, more pronounced differences between relative histograms were seen for IMRT with respect to 3DCRT (p < 0.0001).

  8. Location of Rotator Cuff Tear Initiation: A Magnetic Resonance Imaging Study of 191 Shoulders.

    PubMed

    Jeong, Jeung Yeol; Min, Seul Ki; Park, Keun Min; Park, Yong Bok; Han, Kwang Joon; Yoo, Jae Chul

    2018-03-01

    Degenerative rotator cuff tears (RCTs) are generally thought to originate at the anterior margin of the supraspinatus tendon. However, a recent ultrasonography study suggested that they might originate more posteriorly than originally thought, perhaps even from the isolated infraspinatus (ISP) tendon, and propagate toward the anterior supraspinatus. Hypothesis/Purpose: It was hypothesized that this finding could be reproduced with magnetic resonance imaging (MRI). The purpose was to determine the most common location of degenerative RCTs by using 3-dimensional multiplanar MRI reconstruction. It was assumed that the location of the partial-thickness tears would identify the area of the initiation of full-thickness tears. Cross-sectional study; Level of evidence, 3. A retrospective analysis was conducted including 245 patients who had RCTs (nearly full- or partial-thickness tears) at the outpatient department between January 2011 and December 2013. RCTs were measured on 3-dimensional multiplanar reconstruction MRI with OsiriX software. The width and distance from the biceps tendon to the anterior margin of the tear were measured on T2-weighted sagittal images. In a spreadsheet, columns of consecutive numbers represented the size of each tear (anteroposterior width) and their locations with respect to the biceps brachii tendon. Data were pooled to graphically represent the width and location of all tears. Frequency histograms of the columns were made to visualize the distribution of tears. The tears were divided into 2 groups based on width (group A, <10 mm; group B, <20 and ≥10 mm) and analyzed for any differences in location related to size. The mean width of all RCTs was 11.9 ± 4.1 mm, and the mean length was 11.1 ± 5.0 mm. Histograms showed the most common location of origin to be 9 to 10 mm posterior to the biceps tendon. The histograms of groups A and B showed similar tear location distributions, indicating that the region approximately 10 mm posterior to the biceps tendon is the most common site of tear initiation. These results demonstrate that degenerative RCTs most commonly originate from approximately 9 to 10 mm posterior to the biceps tendon.

  9. One-Dimensional Signal Extraction Of Paper-Written ECG Image And Its Archiving

    NASA Astrophysics Data System (ADS)

    Zhang, Zhi-ni; Zhang, Hong; Zhuang, Tian-ge

    1987-10-01

    A method for converting paper-written electrocardiograms to one-dimensional (1-D) signals for archival storage on floppy disk is presented here. Appropriate image processing techniques were employed to remove the background noise inherent to ECG recorder charts and to reconstruct the ECG waveform. The entire process consists of (1) digitization of paper-written ECGs with an image processing system via a TV camera; (2) image preprocessing, including histogram filtering and binary image generation; (3) ECG feature extraction and ECG wave tracing; and (4) transmission of the processed ECG data to IBM-PC compatible floppy disks for storage and retrieval. The algorithms employed here may also be used in the recognition of paper-written EEG or EMG and may be useful in robotic vision.

  10. Cell line name recognition in support of the identification of synthetic lethality in cancer from text

    PubMed Central

    Kaewphan, Suwisa; Van Landeghem, Sofie; Ohta, Tomoko; Van de Peer, Yves; Ginter, Filip; Pyysalo, Sampo

    2016-01-01

    Motivation: The recognition and normalization of cell line names in text is an important task in biomedical text mining research, facilitating for instance the identification of synthetically lethal genes from the literature. While several tools have previously been developed to address cell line recognition, it is unclear whether available systems can perform sufficiently well in realistic and broad-coverage applications such as extracting synthetically lethal genes from the cancer literature. In this study, we revisit the cell line name recognition task, evaluating both available systems and newly introduced methods on various resources to obtain a reliable tagger not tied to any specific subdomain. In support of this task, we introduce two text collections manually annotated for cell line names: the broad-coverage corpus Gellus and CLL, a focused target domain corpus. Results: We find that the best performance is achieved using NERsuite, a machine learning system based on Conditional Random Fields, trained on the Gellus corpus and supported with a dictionary of cell line names. The system achieves an F-score of 88.46% on the test set of Gellus and 85.98% on the independently annotated CLL corpus. It was further applied at large scale to 24 302 102 unannotated articles, resulting in the identification of 5 181 342 cell line mentions, normalized to 11 755 unique cell line database identifiers. Availability and implementation: The manually annotated datasets, the cell line dictionary, derived corpora, NERsuite models and the results of the large-scale run on unannotated texts are available under open licenses at http://turkunlp.github.io/Cell-line-recognition/. Contact: sukaew@utu.fi PMID:26428294

  11. Deep convolutional neural networks for automatic classification of gastric carcinoma using whole slide images in digital histopathology.

    PubMed

    Sharma, Harshita; Zerbe, Norman; Klempert, Iris; Hellwich, Olaf; Hufnagl, Peter

    2017-11-01

    Deep learning using convolutional neural networks is an actively emerging field in histological image analysis. This study explores deep learning methods for computer-aided classification in H&E stained histopathological whole slide images of gastric carcinoma. An introductory convolutional neural network architecture is proposed for two computerized applications, namely, cancer classification based on immunohistochemical response and necrosis detection based on the existence of tumor necrosis in the tissue. Classification performance of the developed deep learning approach is quantitatively compared with traditional image analysis methods in digital histopathology requiring prior computation of handcrafted features, such as statistical measures using gray level co-occurrence matrix, Gabor filter-bank responses, LBP histograms, gray histograms, HSV histograms and RGB histograms, followed by random forest machine learning. Additionally, the widely known AlexNet deep convolutional framework is comparatively analyzed for the corresponding classification problems. The proposed convolutional neural network architecture reports favorable results, with an overall classification accuracy of 0.6990 for cancer classification and 0.8144 for necrosis detection. Copyright © 2017 Elsevier Ltd. All rights reserved.
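
    For the handcrafted baseline, a minimal sketch of one of the listed feature types is given below: LBP histograms fed to a random forest. The patch data, LBP settings and train/test split are assumptions for illustration, not the paper's configuration.

    ```python
    import numpy as np
    from skimage.feature import local_binary_pattern
    from sklearn.ensemble import RandomForestClassifier

    P, R = 8, 1                      # LBP neighbours and radius (assumed values)
    N_BINS = P + 2                   # code range for the 'uniform' LBP variant

    def lbp_histogram(patch):
        img8 = (np.clip(patch, 0, 1) * 255).astype(np.uint8)
        codes = local_binary_pattern(img8, P, R, method="uniform")
        hist, _ = np.histogram(codes, bins=N_BINS, range=(0, N_BINS), density=True)
        return hist

    # Synthetic stand-ins for two tissue classes (smooth vs. strongly textured).
    rng = np.random.default_rng(0)
    smooth = [rng.normal(0.5, 0.05, (64, 64)) for _ in range(40)]
    textured = [rng.random((64, 64)) for _ in range(40)]
    X = np.array([lbp_histogram(p) for p in smooth + textured])
    y = np.array([0] * 40 + [1] * 40)

    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X[::2], y[::2])
    print("held-out accuracy:", clf.score(X[1::2], y[1::2]))
    ```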

  12. Distribution of a suite of elements including arsenic and mercury in Alabama coal

    USGS Publications Warehouse

    Goldhaber, Martin B.; Bigelow, R.C.; Hatch, J.R.; Pashin, J.C.

    2000-01-01

    Arsenic and other elements are unusually abundant in Alabama coal. This conclusion is based on chemical analyses of coal in the U.S. Geological Survey's National Coal Resources Data System (NCRDS; Bragg and others, 1994). According to NCRDS data, the average concentration of arsenic in Alabama coal (72 ppm) is three times higher than is the average for all U.S. coal (24 ppm). Of the U.S. coal analyses for arsenic that are at least 3 standard deviations above the mean, approximately 90% are from the coal fields of Alabama. Figure 1 contrasts the abundance of arsenic in coal of the Warrior field of Alabama (histogram C) with that of coal of the Powder River Basin, Wyoming (histogram A), and the Eastern Interior Province including the Illinois Basin and nearby areas (histogram B). The Warrior field is by far the largest in Alabama. On the histogram, the large 'tail' of very high values (> 200 ppm) in the Warrior coal contrasts with the other two regions that have very few analyses greater than 200 ppm.

  13. Deep sequencing and in silico analysis of small RNA library reveals novel miRNA from leaf Persicaria minor transcriptome.

    PubMed

    Samad, Abdul Fatah A; Nazaruddin, Nazaruddin; Murad, Abdul Munir Abdul; Jani, Jaeyres; Zainal, Zamri; Ismail, Ismanizan

    2018-03-01

    In the current era, the majority of microRNAs (miRNAs) are discovered through computational approaches, which are largely confined to model plants. Here, for the first time, we describe the identification and characterization of novel miRNAs in a non-model plant, Persicaria minor (P. minor), using a computational approach. Unannotated sequences from deep sequencing were analyzed based on previously well-established parameters. Around 24 putative novel miRNAs were identified from 6,417,780 reads of the unannotated sequences, representing 11 unique putative miRNA sequences. The PsRobot target prediction tool was deployed to identify the target transcripts of the putative novel miRNAs. Most of the predicted target transcripts (mRNAs) are known to be involved in plant development and stress responses. Gene ontology analysis showed that the majority of the putative novel miRNA targets are involved in cellular components (69.07%), followed by molecular functions (30.08%) and biological processes (0.85%). Out of the 11 unique putative miRNAs, 7 were validated through semi-quantitative PCR. These novel miRNA discoveries in P. minor may expand and update the current public miRNA database.

  14. Cloning, purification, crystallization and preliminary structural studies of penicillin V acylase from Bacillus subtilis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rathinaswamy, Priya; Pundle, Archana V.; Prabhune, Asmita A.

    An unannotated protein reported from B. subtilis has been expressed in E. coli and identified as possessing penicillin V acylase activity. The crystallization and preliminary crystallographic analysis of this penicillin V acylase is presented. Penicillin acylase proteins are amidohydrolase enzymes that cleave penicillins at the amide bond connecting the side chain to their β-lactam nucleus. An unannotated protein from Bacillus subtilis has been expressed in Escherichia coli, purified and confirmed to possess penicillin V acylase activity. The protein was crystallized using the hanging-drop vapour-diffusion method from a solution containing 4 M sodium formate in 100 mM Tris–HCl buffer pH 8.2. Diffraction data were collected under cryogenic conditions to a spacing of 2.5 Å. The crystals belonged to the orthorhombic space group C222₁, with unit-cell parameters a = 111.0, b = 308.0, c = 56.0 Å. The estimated Matthews coefficient was 3.23 Å³ Da⁻¹, corresponding to 62% solvent content. The structure has been solved using molecular-replacement methods with B. sphaericus penicillin V acylase (PDB code 2pva) as the search model.
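
    The reported solvent content can be sanity-checked from the Matthews coefficient with the usual approximation (solvent fraction ≈ 1 − 1.23/V_M); the short calculation below reproduces the quoted 62%.

    ```python
    # Standard approximation: solvent fraction ~ 1 - 1.23 / V_M, with V_M in A^3/Da.
    V_M = 3.23                                  # Matthews coefficient from the abstract
    solvent_fraction = 1.0 - 1.23 / V_M
    print(f"estimated solvent content: {solvent_fraction:.0%}")   # ~62%
    ```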

  15. A High-Performance Parallel Implementation of the Certified Reduced Basis Method

    DTIC Science & Technology

    2010-12-15

    point of view of model reduction due to the "curse of dimensionality". We consider transient thermal conduction in a three-dimensional "Swiss cheese" ... "Swiss cheese" problem (see Figure 7a) there are 54 unique ordered pairs in I. A histogram of 〈δµ〉 values computed for the ntrain = 106 case is given in ... our primal-dual RB method yields a very fast and accurate output approximation for the "Swiss Cheese" problem. Our goal in this final subsection is

  16. SU-F-R-50: Radiation-Induced Changes in CT Number Histogram During Chemoradiation Therapy for Pancreatic Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, X; Schott, D; Song, Y

    Purpose: In an effort toward early assessment of treatment response, we investigated radiation-induced changes in the CT number histogram of the GTV during the delivery of chemoradiation therapy (CRT) for pancreatic cancer. Methods: Diagnostic-quality CT data acquired daily during routine CT-guided CRT using a CT-on-rails for 20 pancreatic head cancer patients were analyzed. All patients were treated with a radiation dose of 50.4 Gy in 28 fractions. On each daily CT set, the contours of the pancreatic head and the spinal cord were delineated. The Hounsfield Unit (HU) histograms in these contours were extracted and processed using MATLAB. Eight parameters of the histogram, including the mean HU over all the voxels, peak position, volume, standard deviation (SD), skewness, kurtosis, energy, and entropy, were calculated for each fraction. Significance was inspected using paired two-tailed t-tests and correlations were analyzed using Spearman rank correlation tests. Results: In general, the HU histogram in the pancreatic head (but not in the spinal cord) changed during the CRT delivery. Changes from the first to the last fraction in mean HU in the pancreatic head ranged from −13.4 to 3.7 HU with an average of −4.4 HU, which was significant (P<0.001). Among the other quantities, the volume decreased, the skewness increased (less skewed), and the kurtosis decreased (less sharp) during the CRT delivery. The changes of mean HU, volume, skewness, and kurtosis became significant after two weeks of treatment. Patient pathological response status was associated with the change of SD (ΔSD), i.e., ΔSD = 1.85 (average of 7 patients) for good response and −0.08 (average of 6 patients) for moderate and poor response. Conclusion: Significant changes in the HU histogram and histogram-based metrics (e.g., mean HU, skewness, and kurtosis) in the tumor were observed during the course of chemoradiation therapy for pancreatic cancer. These changes may potentially be used for early assessment of treatment response.
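
    The eight histogram quantities named above are standard descriptive statistics; a minimal sketch (synthetic HU values, not the study's MATLAB code) computing them from a contoured region is given below.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    hu = rng.normal(40, 12, 2500)               # hypothetical HU voxels inside the GTV
    hist, edges = np.histogram(hu, bins=64)
    p = hist / hist.sum()                       # normalized histogram
    nz = p[p > 0]

    metrics = {
        "mean_HU": hu.mean(),
        "peak_position": 0.5 * (edges[hist.argmax()] + edges[hist.argmax() + 1]),
        "volume_voxels": hu.size,
        "SD": hu.std(ddof=1),
        "skewness": stats.skew(hu),
        "kurtosis": stats.kurtosis(hu),
        "energy": np.sum(p ** 2),
        "entropy": -np.sum(nz * np.log2(nz)),
    }
    for name, value in metrics.items():
        print(f"{name}: {value:.3f}")
    ```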

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ngo, Van; Wang, Yibo; Haas, Stephan

    Crystal structures of several bacterial NaV channels have been recently published and molecular dynamics simulations of ion permeation through these channels are consistent with many electrophysiological properties of eukaryotic channels. Bacterial NaV channels have been characterized as functionally asymmetric, and the mechanism of this asymmetry has not been clearly understood. To address this question, we combined non-equilibrium simulation data with two-dimensional equilibrium unperturbed landscapes generated by umbrella sampling and Weighted Histogram Analysis Methods for multiple ions traversing the selectivity filter of the bacterial NaVAb channel. This approach provided new insight into the mechanism of selective ion permeation in bacterial NaV channels. The non-equilibrium simulations indicate that two or three extracellular K+ ions can block the entrance to the selectivity filter of NaVAb in the presence of applied forces in the inward direction, but not in the outward direction. The block state occurs in an unstable local minimum of the equilibrium unperturbed free-energy landscape of two K+ ions that can be 'locked' in place by modest applied forces. In contrast to K+, three Na+ ions move favorably through the selectivity filter together as a unit in a loose "knock-on" mechanism of permeation in both inward and outward directions, and there is no similar local minimum in the two-dimensional free-energy landscape of two Na+ ions for a block state. The useful work predicted by the non-equilibrium simulations that is required to break the K+ block is equivalent to the large applied potentials experimentally measured for two bacterial NaV channels to induce inward currents of K+ ions. Here, these results illustrate how inclusion of non-equilibrium factors in the simulations can provide detailed information about mechanisms of ion selectivity that is missing from mechanisms derived from either crystal structures or equilibrium unperturbed free-energy landscapes.

  18. Differentiating between Glioblastoma and Primary CNS Lymphoma Using Combined Whole-tumor Histogram Analysis of the Normalized Cerebral Blood Volume and the Apparent Diffusion Coefficient.

    PubMed

    Bao, Shixing; Watanabe, Yoshiyuki; Takahashi, Hiroto; Tanaka, Hisashi; Arisawa, Atsuko; Matsuo, Chisato; Wu, Rongli; Fujimoto, Yasunori; Tomiyama, Noriyuki

    2018-05-31

    This study aimed to determine whether whole-tumor histogram analysis of normalized cerebral blood volume (nCBV) and apparent diffusion coefficient (ADC) for contrast-enhancing lesions can be used to differentiate between glioblastoma (GBM) and primary central nervous system lymphoma (PCNSL). Twenty patients (9 with PCNSL and 11 with GBM) without any hemorrhagic lesions underwent MRI, including diffusion-weighted imaging and dynamic susceptibility contrast perfusion-weighted imaging, before surgery. Histogram analysis of nCBV and ADC from whole-tumor voxels in contrast-enhancing lesions was performed. An unpaired t-test was used to compare the mean values for each type of tumor. A multivariate logistic regression model (LRM) was built to classify GBM and PCNSL using the best parameters of ADC and nCBV. All nCBV histogram parameters of GBMs were larger than those of PCNSLs, but only the average nCBV was statistically significant after Bonferroni correction. Meanwhile, the ADC histogram parameters were also larger in GBM than in PCNSL, but these differences were not statistically significant. According to receiver operating characteristic curve analysis, the nCBV average and the ADC 25th percentile demonstrated the largest areas under the curve, with values of 0.869 and 0.838, respectively. The LRM combining these two parameters differentiated between GBM and PCNSL with a higher area under the curve value (Logit(P) = −21.12 + 10.00 × ADC 25th percentile (10⁻³ mm²/s) + 5.420 × nCBV mean; P < 0.001). Our results suggest that whole-tumor histogram analysis of nCBV and ADC combined can be a valuable objective diagnostic method for differentiating between GBM and PCNSL.
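
    The reported logistic model can be applied directly; the snippet below uses the coefficients quoted in the abstract, with hypothetical input values, and assumes that larger logits favour the class with the higher ADC and nCBV values (GBM, per the abstract).

    ```python
    import math

    def logit_score(adc_25th, ncbv_mean):
        """ADC 25th percentile in 10^-3 mm^2/s; nCBV mean is dimensionless."""
        logit = -21.12 + 10.00 * adc_25th + 5.420 * ncbv_mean
        return 1.0 / (1.0 + math.exp(-logit))    # probability from the fitted LRM

    # Hypothetical inputs: higher ADC/nCBV pushes the score up (GBM direction).
    print(logit_score(adc_25th=0.85, ncbv_mean=3.0))
    print(logit_score(adc_25th=0.60, ncbv_mean=1.5))
    ```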

  19. Histogram analysis of diffusion kurtosis imaging estimates for in vivo assessment of 2016 WHO glioma grades: A cross-sectional observational study.

    PubMed

    Hempel, Johann-Martin; Schittenhelm, Jens; Brendle, Cornelia; Bender, Benjamin; Bier, Georg; Skardelly, Marco; Tabatabai, Ghazaleh; Castaneda Vega, Salvador; Ernemann, Ulrike; Klose, Uwe

    2017-10-01

    To assess the diagnostic performance of histogram analysis of diffusion kurtosis imaging (DKI) maps for in vivo assessment of the 2016 World Health Organization Classification of Tumors of the Central Nervous System (2016 CNS WHO) integrated glioma grades. Seventy-seven patients with histopathologically-confirmed glioma who provided written informed consent were retrospectively assessed between 01/2014 and 03/2017 from a prospective trial approved by the local institutional review board. Ten histogram parameters of mean kurtosis (MK) and mean diffusivity (MD) metrics from DKI were independently assessed by two blinded physicians from a volume of interest around the entire solid tumor. One-way ANOVA was used to compare MK and MD histogram parameter values between 2016 CNS WHO-based tumor grades. Receiver operating characteristic analysis was performed on MK and MD histogram parameters for significant results. The 25th, 50th, 75th, and 90th percentiles of MK and average MK showed significant differences between IDH1/2 wild-type gliomas, IDH1/2 mutated gliomas, and oligodendrogliomas with chromosome 1p/19q loss of heterozygosity and IDH1/2 mutation (p<0.001). The 50th, 75th, and 90th percentiles showed a slightly higher diagnostic performance (area under the curve (AUC) range, 0.868-0.991) than average MK (AUC range, 0.855-0.988) in classifying glioma according to the integrated approach of 2016 CNS WHO. Histogram analysis of DKI can stratify gliomas according to the integrated approach of 2016 CNS WHO. The 50th (median), 75th, and 90th percentiles showed the highest diagnostic performance. However, the average MK is also robust and feasible in routine clinical practice. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Bin Ratio-Based Histogram Distances and Their Application to Image Classification.

    PubMed

    Hu, Weiming; Xie, Nianhua; Hu, Ruiguang; Ling, Haibin; Chen, Qiang; Yan, Shuicheng; Maybank, Stephen

    2014-12-01

    Large variations in image background may cause partial matching and normalization problems for histogram-based representations, i.e., the histograms of the same category may have bins which are significantly different, and normalization may produce large changes in the differences between corresponding bins. In this paper, we deal with this problem by using the ratios between bin values of histograms, rather than bin values' differences which are used in the traditional histogram distances. We propose a bin ratio-based histogram distance (BRD), which is an intra-cross-bin distance, in contrast with previous bin-to-bin distances and cross-bin distances. The BRD is robust to partial matching and histogram normalization, and captures correlations between bins with only a linear computational complexity. We combine the BRD with the ℓ1 histogram distance and the χ² histogram distance to generate the ℓ1 BRD and the χ² BRD, respectively. These combinations exploit and benefit from the robustness of the BRD under partial matching and the robustness of the ℓ1 and χ² distances to small noise. We propose a method for assessing the robustness of histogram distances to partial matching. The BRDs and logistic regression-based histogram fusion are applied to image classification. The experimental results on synthetic data sets show the robustness of the BRDs to partial matching, and the experiments on seven benchmark data sets demonstrate promising results of the BRDs for image classification.

  1. A fully automatic end-to-end method for content-based image retrieval of CT scans with similar liver lesion annotations.

    PubMed

    Spanier, A B; Caplan, N; Sosna, J; Acar, B; Joskowicz, L

    2018-01-01

    The goal of medical content-based image retrieval (M-CBIR) is to assist radiologists in the decision-making process by retrieving medical cases similar to a given image. One of the key interests of radiologists is lesions and their annotations, since the patient treatment depends on the lesion diagnosis. Therefore, a key feature of M-CBIR systems is the retrieval of scans with the most similar lesion annotations. To be of value, M-CBIR systems should be fully automatic to handle large case databases. We present a fully automatic end-to-end method for the retrieval of CT scans with similar liver lesion annotations. The input is a database of abdominal CT scans labeled with liver lesions, a query CT scan, and optionally one radiologist-specified lesion annotation of interest. The output is an ordered list of the database CT scans with the most similar liver lesion annotations. The method starts by automatically segmenting the liver in the scan. It then extracts a histogram-based features vector from the segmented region, learns the features' relative importance, and ranks the database scans according to the relative importance measure. The main advantages of our method are that it fully automates the end-to-end querying process, that it uses simple and efficient techniques that are scalable to large datasets, and that it produces quality retrieval results using an unannotated CT scan. Our experimental results on 9 CT queries on a dataset of 41 volumetric CT scans from the 2014 Image CLEF Liver Annotation Task yield an average retrieval accuracy (Normalized Discounted Cumulative Gain index) of 0.77 and 0.84 without/with annotation, respectively. Fully automatic end-to-end retrieval of similar cases based on image information alone, rather than on disease diagnosis, may help radiologists to better diagnose liver lesions.
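
    As a loose sketch of the retrieval step (not the published method), the example below builds an intensity-histogram feature from a segmented region and ranks database entries by a weighted L1 distance; the bin range, weights and synthetic HU values are assumptions.

    ```python
    import numpy as np

    def histogram_feature(hu_values, bins=32, hu_range=(-100, 300)):
        hist, _ = np.histogram(hu_values, bins=bins, range=hu_range, density=True)
        return hist

    rng = np.random.default_rng(0)
    query = histogram_feature(rng.normal(60, 25, 4000))       # segmented query region
    database = [histogram_feature(rng.normal(mu, 25, 4000)) for mu in (55, 90, 140, 62)]
    weights = np.ones(32)               # stand-in for the learned per-bin importance

    scores = [np.sum(weights * np.abs(query - f)) for f in database]
    print("retrieval order (most similar first):", np.argsort(scores))
    ```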

  2. Learning dictionaries of sparse codes of 3D movements of body joints for real-time human activity understanding.

    PubMed

    Qi, Jin; Yang, Zhiyong

    2014-01-01

    Real-time human activity recognition is essential for human-robot interactions for assisted healthy independent living. Most previous work in this area is performed on traditional two-dimensional (2D) videos and both global and local methods have been used. Since 2D videos are sensitive to changes in lighting conditions, view angle, and scale, researchers have begun to explore applications of 3D information in human activity understanding in recent years. Unfortunately, features that work well on 2D videos usually don't perform well on 3D videos and there is no consensus on what 3D features should be used. Here we propose a model of human activity recognition based on 3D movements of body joints. Our method has three steps: learning dictionaries of sparse codes of 3D movements of joints, sparse coding, and classification. In the first step, space-time volumes of 3D movements of body joints are obtained via dense sampling and independent component analysis is then performed to construct a dictionary of sparse codes for each activity. In the second step, the space-time volumes are projected to the dictionaries and a set of sparse histograms of the projection coefficients are constructed as feature representations of the activities. Finally, the sparse histograms are used as inputs to a support vector machine to recognize human activities. We tested this model on three databases of human activities and found that it outperforms the state-of-the-art algorithms. Thus, this model can be used for real-time human activity recognition in many applications.

  3. Grating interferometry-based phase microtomography of atherosclerotic human arteries

    NASA Astrophysics Data System (ADS)

    Buscema, Marzia; Holme, Margaret N.; Deyhle, Hans; Schulz, Georg; Schmitz, Rüdiger; Thalmann, Peter; Hieber, Simone E.; Chicherova, Natalia; Cattin, Philippe C.; Beckmann, Felix; Herzen, Julia; Weitkamp, Timm; Saxer, Till; Müller, Bert

    2014-09-01

    Cardiovascular diseases are the number one cause of death and morbidity in the world. Understanding disease development in terms of lumen morphology and tissue composition of constricted arteries is essential to improve treatment and patient outcome. X-ray tomography provides non-destructive three-dimensional data with micrometer-resolution. However, a common problem is simultaneous visualization of soft and hard tissue-containing specimens, such as atherosclerotic human coronary arteries. Unlike absorption based techniques, where X-ray absorption strongly depends on atomic number and tissue density, phase contrast methods such as grating interferometry have significant advantages as the phase shift is only a linear function of the atomic number. We demonstrate that grating interferometry-based phase tomography is a powerful method to three-dimensionally visualize a variety of anatomical features in atherosclerotic human coronary arteries, including plaque, muscle, fat, and connective tissue. Three formalin-fixed, human coronary arteries were measured using advanced laboratory μCT. While this technique gives information about plaque morphology, it is impossible to extract the lumen morphology. Therefore, selected regions were measured using grating based phase tomography, sinograms were treated with a wavelet-Fourier filter to remove ring artifacts, and reconstructed data were processed to allow extraction of vessel lumen morphology. Phase tomography data in combination with conventional laboratory μCT data of the same specimen shows potential, through use of a joint histogram, to identify more tissue types than either technique alone. Such phase tomography data was also rigidly registered to subsequently decalcified arteries that were histologically sectioned, although the quality of registration was insufficient for joint histogram analysis.
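
    The joint histogram mentioned above is simply a 2D histogram over co-registered grey values; a minimal sketch with synthetic phase and absorption volumes follows.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Two co-registered grey-value volumes (flattened), each a mixture of two "tissues".
    phase = np.concatenate([rng.normal(0.2, 0.05, 5000), rng.normal(0.6, 0.05, 5000)])
    absorption = np.concatenate([rng.normal(0.3, 0.05, 5000), rng.normal(0.4, 0.05, 5000)])

    joint, x_edges, y_edges = np.histogram2d(phase, absorption, bins=64)
    # Distinct peaks in `joint` mark tissue classes that neither 1D histogram separates.
    print(joint.shape, int(joint.max()))
    ```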

  4. [Characteristics of high resolution diffusion weighted imaging apparent diffusion coefficient histogram and its correlations with cancer stages in patients with nasopharyngeal carcinoma].

    PubMed

    Wang, G J; Wang, Y; Ye, Y; Chen, F; Lu, Y T; Li, S L

    2017-11-07

    Objective: To investigate the features of apparent diffusion coefficient (ADC) histogram parameters based on entire-tumor-volume data in high-resolution diffusion-weighted imaging of nasopharyngeal carcinoma (NPC) and to evaluate their correlations with cancer stages. Methods: This retrospective study included 154 NPC patients [102 males and 52 females, mean age (48±11) years] who had received readout segmentation of long variable echo trains MRI scans before radiation therapy. The area of the tumor was delineated on each section of the axial ADC maps to generate ADC histograms using ImageJ. The ADC histogram of the entire tumor, along with the histogram parameters (tumor voxels, ADC(mean), ADC(25%), ADC(50%), ADC(75%), skewness and kurtosis), was obtained by merging all sections with SPSS 22.0 software. Intra-observer repeatability was assessed using intra-class correlation coefficients (ICC). The patients were subdivided into two groups according to cancer volume: a small cancer group (<305 voxels, about 2 cm³) and a large cancer group (≥2 cm³). The correlation between ADC histogram parameters and cancer stages was evaluated with the Spearman test. Results: The ICCs for measuring the ADC histogram parameters tumor voxels, ADC(mean), ADC(25%), ADC(50%), ADC(75%), skewness and kurtosis were 0.938, 0.861, 0.885, 0.838, 0.836, 0.358 and 0.456, respectively. The tumor voxel count was positively correlated with T staging (r = 0.368, P < 0.05). There were significant differences in tumor voxels among patients with different T stages (K = 22.306, P < 0.05). There were significant differences in ADC(mean), ADC(25%) and ADC(50%) among patients with different T stages in the small cancer group (K = 8.409, 8.187, 8.699, all P < 0.05), and these three indices were positively correlated with T staging (r = 0.221, 0.209, 0.235, all P < 0.05). Skewness and kurtosis differed significantly between the groups with different cancer volumes (t = -2.987, Z = -3.770, both P < 0.05). Conclusion: Tumor volume and tissue uniformity of NPC are important factors affecting ADC and cancer stages; the ADC histogram parameters (ADC(mean), ADC(25%), ADC(50%)) increase with T staging in NPC smaller than 2 cm³.

  5. Example MODIS Global Cloud Optical and Microphysical Properties: Comparisons between Terra and Aqua

    NASA Technical Reports Server (NTRS)

    Hubanks, P. A.; Platnick, S.; King, M. D.; Ackerman, S. A.; Frey, R. A.

    2003-01-01

    MODIS observations from the NASA EOS Terra spacecraft (launched in December 1999, 1030 local time equatorial crossing) have provided a unique data set of Earth observations. With the launch of the NASA Aqua spacecraft in May 2002 (1330 local time), two MODIS daytime (sunlit) and nighttime observations are now available in a 24 hour period, allowing for some measure of diurnal variability. We report on an initial analysis of several operational global (Level-3) cloud products from the two platforms. The MODIS atmosphere Level-3 products, which include clear-sky and aerosol products in addition to cloud products, are available as three separate files providing daily, eight-day, and monthly aggregations; each temporal aggregation is spatially aggregated to a 1 degree grid. The files contain approximately 600 statistical datasets (from simple means and standard deviations to 1- and 2-dimensional histograms). Operational cloud products include detection (cloud fraction), cloud-top properties, and daytime-only cloud optical thickness and particle effective radius for both water and ice clouds. We will compare example global Terra and Aqua cloud fraction, optical thickness, and effective radius aggregations.

  6. Reciprocal-space mapping of epitaxic thin films with crystallite size and shape polydispersity.

    PubMed

    Boulle, A; Conchon, F; Guinebretière, R

    2006-01-01

    A development is presented that allows the simulation of reciprocal-space maps (RSMs) of epitaxic thin films exhibiting fluctuations in the size and shape of the crystalline domains over which diffraction is coherent (crystallites). Three different crystallite shapes are studied, namely parallelepipeds, trigonal prisms and hexagonal prisms. For each shape, two cases are considered. Firstly, the overall size is allowed to vary but with a fixed thickness/width ratio. Secondly, the thickness and width are allowed to vary independently. The calculations are performed assuming three different size probability density functions: the normal distribution, the lognormal distribution and a general histogram distribution. In all cases considered, the computation of the RSM only requires a two-dimensional Fourier integral and the integrand has a simple analytical expression, i.e. there is no significant increase in computing times by taking size and shape fluctuations into account. The approach presented is compatible with most lattice disorder models (dislocations, inclusions, mosaicity, ...) and allows a straightforward account of the instrumental resolution. The applicability of the model is illustrated with the case of an yttria-stabilized zirconia film grown on sapphire.

  7. Novel Variants of a Histogram Shift-Based Reversible Watermarking Technique for Medical Images to Improve Hiding Capacity

    PubMed Central

    Tuckley, Kushal

    2017-01-01

    In telemedicine systems, critical medical data is shared on a public communication channel. This increases the risk of unauthorised access to patient's information. This underlines the importance of secrecy and authentication for the medical data. This paper presents two innovative variations of classical histogram shift methods to increase the hiding capacity. The first technique divides the image into nonoverlapping blocks and embeds the watermark individually using the histogram method. The second method separates the region of interest and embeds the watermark only in the region of noninterest. This approach preserves the medical information intact. This method finds its use in critical medical cases. The high PSNR (above 45 dB) obtained for both techniques indicates imperceptibility of the approaches. Experimental results illustrate superiority of the proposed approaches when compared with other methods based on histogram shifting techniques. These techniques improve embedding capacity by 5–15% depending on the image type, without affecting the quality of the watermarked image. Both techniques also enable lossless reconstruction of the watermark and the host medical image. A higher embedding capacity makes the proposed approaches attractive for medical image watermarking applications without compromising the quality of the image. PMID:29104744
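
    For context, a minimal sketch of the classical single peak/zero-bin histogram-shift embedding that both variants build on is given below; it is not the paper's block-based or ROI-based scheme, and the cover image and payload are synthetic.

    ```python
    import numpy as np

    def embed(img, bits):
        """Classical histogram-shift embedding with one peak/zero bin pair."""
        hist = np.bincount(img.ravel(), minlength=256)
        peak = int(hist.argmax())
        zero = peak + 1 + int(hist[peak + 1:].argmin())   # emptiest bin right of the peak
        assert hist[zero] == 0, "needs an empty bin for lossless recovery"
        out = img.astype(np.int32)
        out[(out > peak) & (out < zero)] += 1             # shift to free bin peak+1
        flat = out.ravel()
        carriers = np.flatnonzero(img.ravel() == peak)    # capacity = hist[peak]
        for idx, bit in zip(carriers, bits):
            flat[idx] = peak + bit                        # bit 0 -> peak, bit 1 -> peak+1
        return flat.reshape(img.shape).astype(np.uint8), peak, zero

    rng = np.random.default_rng(0)
    cover = rng.integers(0, 200, (64, 64), dtype=np.uint8)    # synthetic cover image
    watermark = rng.integers(0, 2, 16)
    stego, peak, zero = embed(cover, watermark)
    print("peak/zero bins used:", peak, zero)
    ```

    Decoding reverses the shift after reading the bits at the peak and peak+1 grey levels; the capacity equals the peak-bin count, which is exactly what the block-based and region-of-noninterest variants described above aim to increase.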

  8. Computing a Non-trivial Lower Bound on the Joint Entropy between Two Images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perumalla, Kalyan S.

    In this report, a non-trivial lower bound on the joint entropy of two non-identical images is developed, which is greater than the individual entropies of the images. The lower bound is the least joint entropy possible among all pairs of images that have the same histograms as those of the given images. New algorithms are presented to compute the joint entropy lower bound with a computation time proportional to S log S where S is the number of histogram bins of the images. This is faster than the traditional methods of computing the exact joint entropy with a computation time that is quadratic in S.
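
    The sketch below (not the report's O(S log S) bound algorithm) simply computes the joint entropy of two images from their joint histogram, together with the marginal entropies it must exceed, to make the quantities concrete.

    ```python
    import numpy as np

    def entropy(p):
        p = p[p > 0]
        return float(-(p * np.log2(p)).sum())

    rng = np.random.default_rng(0)
    img_a = rng.integers(0, 64, (128, 128))
    img_b = np.clip(img_a + rng.integers(-4, 5, img_a.shape), 0, 63)   # correlated pair

    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=64)
    p_joint = joint / joint.sum()
    h_a = entropy(p_joint.sum(axis=1))        # marginal entropy of image A
    h_b = entropy(p_joint.sum(axis=0))        # marginal entropy of image B
    h_ab = entropy(p_joint.ravel())           # joint entropy, >= max(h_a, h_b)
    print(h_a, h_b, h_ab)
    ```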

  9. Energy-landscape paving for prediction of face-centered-cubic hydrophobic-hydrophilic lattice model proteins

    NASA Astrophysics Data System (ADS)

    Liu, Jingfa; Song, Beibei; Liu, Zhaoxia; Huang, Weibo; Sun, Yuanyuan; Liu, Wenjie

    2013-11-01

    Protein structure prediction (PSP) is a classical NP-hard problem in computational biology. The energy-landscape paving (ELP) method is a class of heuristic global optimization algorithms, and has been successfully applied to solving many optimization problems with complex energy landscapes in continuous space. By introducing a new update mechanism for the histogram function in ELP, and by incorporating greedy-strategy-based generation of the initial conformation and a pull-move-based neighborhood search strategy, an improved energy-landscape paving (ELP+) method is put forward. Twelve general benchmark instances are first tested on both two-dimensional and three-dimensional (3D) face-centered-cubic (fcc) hydrophobic-hydrophilic (HP) lattice models. The lowest energies by ELP+ are as good as or better than those of other methods in the literature for all instances. Then, five sets of larger-scale instances, denoted by S, R, F90, F180, and CASP target instances on the 3D FCC HP lattice model are tested. The proposed algorithm finds lower energies than those by the five other methods in the literature. Not unexpectedly, this is particularly pronounced for the longer sequences considered. Computational results show that ELP+ is an effective method for PSP on the fcc HP lattice model.

  10. Extracting rate coefficients from single-molecule photon trajectories and FRET efficiency histograms for a fast-folding protein.

    PubMed

    Chung, Hoi Sung; Gopich, Irina V; McHale, Kevin; Cellmer, Troy; Louis, John M; Eaton, William A

    2011-04-28

    Recently developed statistical methods by Gopich and Szabo were used to extract folding and unfolding rate coefficients from single-molecule Förster resonance energy transfer (FRET) data for proteins with kinetics too fast to measure waiting time distributions. Two types of experiments and two different analyses were performed. In one experiment bursts of photons were collected from donor and acceptor fluorophores attached to a 73-residue protein, α(3)D, freely diffusing through the illuminated volume of a confocal microscope system. In the second, the protein was immobilized by linkage to a surface, and photons were collected until one of the fluorophores bleached. Folding and unfolding rate coefficients and mean FRET efficiencies for the folded and unfolded subpopulations were obtained from a photon by photon analysis of the trajectories using a maximum likelihood method. The ability of the method to describe the data in terms of a two-state model was checked by recoloring the photon trajectories with the extracted parameters and comparing the calculated FRET efficiency histograms with the measured histograms. The sum of the rate coefficients for the two-state model agreed to within 30% with the relaxation rate obtained from the decay of the donor-acceptor cross-correlation function, confirming the high accuracy of the method. Interestingly, apparently reliable rate coefficients could be extracted using the maximum likelihood method, even at low (<10%) population of the minor component where the cross-correlation function was too noisy to obtain any useful information. The rate coefficients and mean FRET efficiencies were also obtained in an approximate procedure by simply fitting the FRET efficiency histograms, calculated by binning the donor and acceptor photons, with a sum of three-Gaussian functions. The kinetics are exposed in these histograms by the growth of a FRET efficiency peak at values intermediate between the folded and unfolded peaks as the bin size increases, a phenomenon with similarities to NMR exchange broadening. When comparable populations of folded and unfolded molecules are present, this method yields rate coefficients in very good agreement with those obtained with the maximum likelihood method. As a first step toward characterizing transition paths, the Viterbi algorithm was used to locate the most probable transition points in the photon trajectories.
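
    The approximate histogram-fitting procedure can be illustrated with a three-Gaussian least-squares fit; the snippet below uses simulated FRET efficiencies and arbitrary initial guesses, not the study's data or maximum likelihood machinery.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def three_gaussians(e, a1, m1, s1, a2, m2, s2, a3, m3, s3):
        g = lambda a, m, s: a * np.exp(-0.5 * ((e - m) / s) ** 2)
        return g(a1, m1, s1) + g(a2, m2, s2) + g(a3, m3, s3)

    rng = np.random.default_rng(0)
    # Simulated efficiencies: unfolded (~0.4), folded (~0.85), plus a small
    # intermediate component of the kind that grows with bin time.
    eff = np.concatenate([rng.normal(0.40, 0.07, 3000),
                          rng.normal(0.85, 0.05, 3000),
                          rng.normal(0.62, 0.10, 800)])
    counts, edges = np.histogram(eff, bins=60, range=(0, 1))
    centers = 0.5 * (edges[:-1] + edges[1:])

    p0 = [300, 0.4, 0.05, 300, 0.85, 0.05, 50, 0.6, 0.1]     # rough initial guesses
    popt, _ = curve_fit(three_gaussians, centers, counts, p0=p0)
    print("fitted peak positions:", popt[1], popt[4], popt[7])
    ```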

  11. Adaptive image contrast enhancement using generalizations of histogram equalization.

    PubMed

    Stark, J A

    2000-01-01

    This paper proposes a scheme for adaptive image-contrast enhancement based on a generalization of histogram equalization (HE). HE is a useful technique for improving image contrast, but its effect is too severe for many purposes. However, dramatically different results can be obtained with relatively minor modifications. A concise description of adaptive HE is set out, and this framework is used in a discussion of past suggestions for variations on HE. A key feature of this formalism is a "cumulation function," which is used to generate a grey level mapping from the local histogram. By choosing alternative forms of cumulation function one can achieve a wide variety of effects. A specific form is proposed. Through the variation of one or two parameters, the resulting process can produce a range of degrees of contrast enhancement, at one extreme leaving the image unchanged, at another yielding full adaptive equalization.
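
    A minimal sketch of the general idea follows: the grey-level mapping is derived from a cumulative, transformed histogram, where the exponent alpha is an illustrative stand-in for the paper's cumulation-function parameters (alpha = 1 gives ordinary HE, alpha near 0 leaves the image essentially unchanged).

    ```python
    import numpy as np

    def generalized_equalize(img, alpha=1.0, levels=256):
        """Grey-level mapping from a transformed cumulative histogram."""
        hist = np.bincount(img.ravel(), minlength=levels).astype(float)
        cdf = np.cumsum(hist ** alpha)        # illustrative 'cumulation' choice
        cdf /= cdf[-1]
        mapping = np.round(cdf * (levels - 1)).astype(np.uint8)
        return mapping[img]

    rng = np.random.default_rng(0)
    low_contrast = rng.integers(90, 140, (128, 128)).astype(np.uint8)
    print(low_contrast.min(), low_contrast.max())                  # narrow input range
    print(generalized_equalize(low_contrast, alpha=1.0).max())     # full equalization
    print(generalized_equalize(low_contrast, alpha=0.3).max())     # gentler enhancement
    ```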

  12. Gliomas: Application of Cumulative Histogram Analysis of Normalized Cerebral Blood Volume on 3 T MRI to Tumor Grading

    PubMed Central

    Kim, Hyungjin; Choi, Seung Hong; Kim, Ji-Hoon; Ryoo, Inseon; Kim, Soo Chin; Yeom, Jeong A.; Shin, Hwaseon; Jung, Seung Chai; Lee, A. Leum; Yun, Tae Jin; Park, Chul-Kee; Sohn, Chul-Ho; Park, Sung-Hye

    2013-01-01

    Background Glioma grading assumes significant importance in that low- and high-grade gliomas display different prognoses and are treated with dissimilar therapeutic strategies. The objective of our study was to retrospectively assess the usefulness of a cumulative normalized cerebral blood volume (nCBV) histogram for glioma grading based on 3 T MRI. Methods From February 2010 to April 2012, 63 patients with astrocytic tumors underwent 3 T MRI with dynamic susceptibility contrast perfusion-weighted imaging. Regions of interest containing the entire tumor volume were drawn on every section of the co-registered relative CBV (rCBV) maps and T2-weighted images. The percentile values from the cumulative nCBV histograms and the other histogram parameters were correlated with tumor grades. Cochran’s Q test and the McNemar test were used to compare the diagnostic accuracies of the histogram parameters after the receiver operating characteristic curve analysis. Using the parameter offering the highest diagnostic accuracy, a validation process was performed with an independent test set of nine patients. Results The 99th percentile of the cumulative nCBV histogram (nCBV C99), mean and peak height differed significantly between low- and high-grade gliomas (P = <0.001, 0.014 and <0.001, respectively) and between grade III and IV gliomas (P = <0.001, 0.001 and <0.001, respectively). The diagnostic accuracy of nCBV C99 was significantly higher than that of the mean nCBV (P = 0.016) in distinguishing high- from low-grade gliomas and was comparable to that of the peak height (P = 1.000). Validation using the two cutoff values of nCBV C99 achieved a diagnostic accuracy of 66.7% (6/9) for the separation of all three glioma grades. Conclusion Cumulative histogram analysis of nCBV using 3 T MRI can be a useful method for preoperative glioma grading. The nCBV C99 value is helpful in distinguishing high- from low-grade gliomas and grade IV from III gliomas. PMID:23704910

  13. Introducing parallelism to histogramming functions for GEM systems

    NASA Astrophysics Data System (ADS)

    Krawczyk, Rafał D.; Czarski, Tomasz; Kolasinski, Piotr; Pozniak, Krzysztof T.; Linczuk, Maciej; Byszuk, Adrian; Chernyshova, Maryna; Juszczyk, Bartlomiej; Kasprowicz, Grzegorz; Wojenski, Andrzej; Zabolotny, Wojciech

    2015-09-01

    This article is an assessment of the potential parallelization of histogramming algorithms in a GEM detector system. Histogramming and preprocessing algorithms in MATLAB were analyzed with regard to adding parallelism. A preliminary implementation of parallel strip histogramming resulted in a speedup. An analysis of the parallelizability of the algorithms is presented. An overview of potential hardware and software support for implementing the parallel algorithm is discussed.

  14. Comparison of Histograms for Use in Cloud Observation and Modeling

    NASA Technical Reports Server (NTRS)

    Green, Lisa; Xu, Kuan-Man

    2005-01-01

    Cloud observation and cloud modeling data can be presented in histograms for each characteristic to be measured. Combining information from single-cloud histograms yields a summary histogram. Summary histograms can be compared to each other to reach conclusions about the behavior of an ensemble of clouds in different places at different times or about the accuracy of a particular cloud model. As in any scientific comparison, it is necessary to decide whether any apparent differences are statistically significant. The usual methods of deciding statistical significance when comparing histograms do not apply in this case because they assume independent data. Thus, a new method is necessary. The proposed method uses the Euclidean distance metric and bootstrapping to calculate the significance level.
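
    The sketch below illustrates the flavour of such a test with a simple label-reassignment stand-in for the authors' bootstrapping: the Euclidean distance between two summary histograms is compared against distances obtained after pooling and reshuffling the single-cloud histograms. All data are synthetic.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def cloud_histogram(mu):
        """Normalized histogram of one cloud's property values (synthetic)."""
        h, _ = np.histogram(rng.normal(mu, 1.0, 200), bins=20, range=(-4, 8))
        return h / h.sum()

    ensemble_a = np.array([cloud_histogram(1.0) for _ in range(30)])
    ensemble_b = np.array([cloud_histogram(1.3) for _ in range(30)])

    def summary_distance(a, b):
        """Euclidean distance between the two ensembles' summary histograms."""
        return np.linalg.norm(a.mean(axis=0) - b.mean(axis=0))

    observed = summary_distance(ensemble_a, ensemble_b)
    pooled = np.vstack([ensemble_a, ensemble_b])
    null = []
    for _ in range(2000):                    # reassign clouds to groups at random
        idx = rng.permutation(len(pooled))
        null.append(summary_distance(pooled[idx[:30]], pooled[idx[30:]]))
    p_value = float(np.mean(np.array(null) >= observed))
    print(f"distance = {observed:.4f}, p = {p_value:.3f}")
    ```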

  15. Classification of molecular structure images by using ANN, RF, LBP, HOG, and size reduction methods for early stomach cancer detection

    NASA Astrophysics Data System (ADS)

    Aytaç Korkmaz, Sevcan; Binol, Hamidullah

    2018-03-01

    Stomach cancer still causes many deaths, and early diagnosis is crucial in reducing the mortality rate of cancer patients. Therefore, computer-aided methods for early detection have been developed in this article. Stomach cancer images were obtained from the Fırat University Medical Faculty Pathology Department. The Local Binary Patterns (LBP) and Histogram of Oriented Gradients (HOG) features of these images were calculated. At the same time, Sammon mapping, Stochastic Neighbor Embedding (SNE), Isomap, Classical multidimensional scaling (MDS), Local Linear Embedding (LLE), Linear Discriminant Analysis (LDA), t-Distributed Stochastic Neighbor Embedding (t-SNE), and Laplacian Eigenmaps methods were used for the dimensional reduction of the features. The high dimension of these features was reduced to lower dimensions using these dimensional reduction methods. Artificial neural network (ANN) and Random Forest (RF) classifiers were used to classify the stomach cancer images with these new, lower-dimensional feature sets. In this way, systems were developed that measure the effect of feature dimensionality by obtaining features of different dimensions with the dimensional reduction methods. When all the developed methods are compared, the best accuracy results are obtained with the LBP_MDS_ANN and LBP_LLE_ANN methods.

  16. Planetary Photojournal Home Page Graphic

    NASA Technical Reports Server (NTRS)

    2004-01-01

    This image is an unannotated version of the Planetary Photojournal Home Page graphic. This digital collage contains a highly stylized rendition of our solar system and points beyond. As this graphic was intended to be used as a navigation aid in searching for data within the Photojournal, certain artistic embellishments have been added (color, location, etc.). Several data sets from various planetary and astronomy missions were combined to create this image.

  17. Computational tools for exploring sequence databases as a resource for antimicrobial peptides.

    PubMed

    Porto, W F; Pires, A S; Franco, O L

    Data mining has been recognized by many researchers as a hot topic in different areas. In the post-genomic era, the growing number of sequences deposited in databases has been the reason why these databases have become a resource for novel biological information. In recent years, the identification of antimicrobial peptides (AMPs) in databases has gained attention. The identification of unannotated AMPs has shed some light on the distribution and evolution of AMPs and, in some cases, indicated suitable candidates for developing novel antimicrobial agents. The data mining process has been performed mainly by local alignments and/or regular expressions. Nevertheless, for the identification of distant homologous sequences, other techniques such as antimicrobial activity prediction and molecular modelling are required. In this context, this review addresses the tools and techniques, and also their limitations, for mining AMPs from databases. These methods could be helpful not only for the development of novel AMPs, but also for other kinds of proteins, at a higher level of structural genomics. Moreover, solving the problem of unannotated proteins could bring immeasurable benefits to society, especially in the case of AMPs, which could be helpful for developing novel antimicrobial agents and combating resistant bacteria. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. GI-SVM: A sensitive method for predicting genomic islands based on unannotated sequence of a single genome.

    PubMed

    Lu, Bingxin; Leong, Hon Wai

    2016-02-01

    Genomic islands (GIs) are clusters of functionally related genes acquired by lateral genetic transfer (LGT), and they are present in many bacterial genomes. GIs are extremely important for bacterial research, because they not only promote genome evolution but also contain genes that enhance adaption and enable antibiotic resistance. Many methods have been proposed to predict GI. But most of them rely on either annotations or comparisons with other closely related genomes. Hence these methods cannot be easily applied to new genomes. As the number of newly sequenced bacterial genomes rapidly increases, there is a need for methods to detect GI based solely on sequences of a single genome. In this paper, we propose a novel method, GI-SVM, to predict GIs given only the unannotated genome sequence. GI-SVM is based on one-class support vector machine (SVM), utilizing composition bias in terms of k-mer content. From our evaluations on three real genomes, GI-SVM can achieve higher recall compared with current methods, without much loss of precision. Besides, GI-SVM allows flexible parameter tuning to get optimal results for each genome. In short, GI-SVM provides a more sensitive method for researchers interested in a first-pass detection of GI in newly sequenced genomes.
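
    A minimal sketch of the underlying idea (not GI-SVM itself) is shown below: k-mer composition vectors for genome windows are scored by a one-class SVM, and windows with atypical composition stand out as candidate islands. The window size, k, nu parameter and the synthetic genome are assumptions.

    ```python
    import numpy as np
    from itertools import product
    from sklearn.svm import OneClassSVM

    K = 3
    KMERS = {"".join(p): i for i, p in enumerate(product("ACGT", repeat=K))}

    def kmer_vector(seq):
        v = np.zeros(len(KMERS))
        for i in range(len(seq) - K + 1):
            v[KMERS[seq[i:i + K]]] += 1
        return v / max(v.sum(), 1)

    rng = np.random.default_rng(0)
    host = "".join(rng.choice(list("ACGT"), size=50000, p=[0.3, 0.2, 0.2, 0.3]))
    alien = "".join(rng.choice(list("ACGT"), size=5000, p=[0.15, 0.35, 0.35, 0.15]))
    genome = host[:25000] + alien + host[25000:]        # "island" inserted mid-genome

    win, step = 2000, 1000
    starts = range(0, len(genome) - win + 1, step)
    X = np.array([kmer_vector(genome[s:s + win]) for s in starts])

    svm = OneClassSVM(nu=0.1, kernel="rbf", gamma="scale").fit(X)
    scores = svm.decision_function(X)                   # low score = atypical composition
    print("most island-like window starts at", list(starts)[int(np.argmin(scores))])
    ```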

  19. Illuminating structural proteins in viral "dark matter" with metaproteomics

    DOE PAGES

    Brum, Jennifer R.; Ignacio-Espinoza, J. Cesar; Kim, Eun -Hae; ...

    2016-02-16

    Viruses are ecologically important, yet environmental virology is limited by dominance of unannotated genomic sequences representing taxonomic and functional "viral dark matter." Although recent analytical advances are rapidly improving taxonomic annotations, identifying functional dark matter remains problematic. Here, we apply paired metaproteomics and dsDNA-targeted metagenomics to identify 1,875 virion-associated proteins from the ocean. Over one-half of these proteins were newly functionally annotated and represent abundant and widespread viral metagenome-derived protein clusters (PCs). One primarily unannotated PC dominated the dataset, but structural modeling and genomic context identified this PC as a previously unidentified capsid protein from multiple uncultivated tailed virus families. Furthermore, four of the five most abundant PCs in the metaproteome represent capsid proteins containing the HK97-like protein fold previously found in many viruses that infect all three domains of life. The dominance of these proteins within our dataset, as well as their global distribution throughout the world's oceans and seas, supports prior hypotheses that this HK97-like protein fold is the most abundant biological structure on Earth. Altogether, these culture-independent analyses improve virion-associated protein annotations, facilitate the investigation of proteins within natural viral communities, and offer a high-throughput means of illuminating functional viral dark matter.

  20. Illuminating structural proteins in viral "dark matter" with metaproteomics.

    PubMed

    Brum, Jennifer R; Ignacio-Espinoza, J Cesar; Kim, Eun-Hae; Trubl, Gareth; Jones, Robert M; Roux, Simon; VerBerkmoes, Nathan C; Rich, Virginia I; Sullivan, Matthew B

    2016-03-01

    Viruses are ecologically important, yet environmental virology is limited by dominance of unannotated genomic sequences representing taxonomic and functional "viral dark matter." Although recent analytical advances are rapidly improving taxonomic annotations, identifying functional dark matter remains problematic. Here, we apply paired metaproteomics and dsDNA-targeted metagenomics to identify 1,875 virion-associated proteins from the ocean. Over one-half of these proteins were newly functionally annotated and represent abundant and widespread viral metagenome-derived protein clusters (PCs). One primarily unannotated PC dominated the dataset, but structural modeling and genomic context identified this PC as a previously unidentified capsid protein from multiple uncultivated tailed virus families. Furthermore, four of the five most abundant PCs in the metaproteome represent capsid proteins containing the HK97-like protein fold previously found in many viruses that infect all three domains of life. The dominance of these proteins within our dataset, as well as their global distribution throughout the world's oceans and seas, supports prior hypotheses that this HK97-like protein fold is the most abundant biological structure on Earth. Together, these culture-independent analyses improve virion-associated protein annotations, facilitate the investigation of proteins within natural viral communities, and offer a high-throughput means of illuminating functional viral dark matter.

  1. Illuminating structural proteins in viral “dark matter” with metaproteomics

    PubMed Central

    Brum, Jennifer R.; Ignacio-Espinoza, J. Cesar; Kim, Eun-Hae; Trubl, Gareth; Jones, Robert M.; Roux, Simon; VerBerkmoes, Nathan C.; Rich, Virginia I.; Sullivan, Matthew B.

    2016-01-01

    Viruses are ecologically important, yet environmental virology is limited by dominance of unannotated genomic sequences representing taxonomic and functional “viral dark matter.” Although recent analytical advances are rapidly improving taxonomic annotations, identifying functional dark matter remains problematic. Here, we apply paired metaproteomics and dsDNA-targeted metagenomics to identify 1,875 virion-associated proteins from the ocean. Over one-half of these proteins were newly functionally annotated and represent abundant and widespread viral metagenome-derived protein clusters (PCs). One primarily unannotated PC dominated the dataset, but structural modeling and genomic context identified this PC as a previously unidentified capsid protein from multiple uncultivated tailed virus families. Furthermore, four of the five most abundant PCs in the metaproteome represent capsid proteins containing the HK97-like protein fold previously found in many viruses that infect all three domains of life. The dominance of these proteins within our dataset, as well as their global distribution throughout the world’s oceans and seas, supports prior hypotheses that this HK97-like protein fold is the most abundant biological structure on Earth. Together, these culture-independent analyses improve virion-associated protein annotations, facilitate the investigation of proteins within natural viral communities, and offer a high-throughput means of illuminating functional viral dark matter. PMID:26884177

  2. Multiple two-dimensional versus three-dimensional PTV definition in treatment planning for conformal radiotherapy.

    PubMed

    Stroom, J C; Korevaar, G A; Koper, P C; Visser, A G; Heijmen, B J

    1998-06-01

    To demonstrate the need for a fully three-dimensional (3D) computerized expansion of the gross tumour volume (GTV) or clinical target volume (CTV), as delineated by the radiation oncologist on CT slices, to obtain the proper planning target volume (PTV) for treatment planning according to the ICRU-50 recommendations. For 10 prostate cancer patients two PTVs have been determined by expansion of the GTV with a 1.5 cm margin, i.e. a 3D PTV and a multiple 2D PTV. The former was obtained by automatically adding the margin while accounting in 3D for GTV contour differences in neighbouring slices. The latter was generated by automatically adding the 1.5 cm margin to the GTV in each CT slice separately; the resulting PTV is a computer simulation of the PTV that a radiation oncologist would obtain with (the still common) manual contouring in CT slices. For each patient the two PTVs were compared to assess the deviations of the multiple 2D PTV from the 3D PTV. For both PTVs conformal plans were designed using a three-field technique with fixed block margins. For each patient dose-volume histograms and tumour control probabilities (TCPs) of the (correct) 3D PTV were calculated, both for the plan designed for this PTV and for the treatment plan based on the (deviating) 2D PTV. Depending on the shape of the GTV, multiple 2D PTV generation could locally result in a 1 cm underestimation of the GTV-to-PTV margin. The deviations occurred predominantly in the cranio-caudal direction at locations where the GTV contour shape varies significantly from slice to slice. This could lead to serious underdosage and to a TCP decrease of up to 15%. A full 3D GTV-to-PTV expansion should be applied in conformal radiotherapy to avoid underdosage.
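
    The geometric difference between the two expansions can be reproduced with simple binary morphology; the sketch below (toy mask, not a planning system) dilates a GTV mask slice by slice with a 2D disk and in full 3D with a ball, and counts the voxels the slice-wise margin misses. Mask shape, margin and voxel spacing are illustrative assumptions.

    ```python
    import numpy as np
    from scipy.ndimage import binary_dilation

    def ball(radius):
        r = int(radius)
        z, y, x = np.ogrid[-r:r + 1, -r:r + 1, -r:r + 1]
        return x**2 + y**2 + z**2 <= radius**2

    gtv = np.zeros((40, 64, 64), dtype=bool)        # toy GTV mask (slices, rows, cols)
    gtv[18:22, 28:36, 28:36] = True
    gtv[15:18, 30:34, 30:34] = True                 # abrupt shape change between slices

    margin = 3                                      # e.g. ~1.5 cm at 5 mm voxel spacing
    ptv_3d = binary_dilation(gtv, structure=ball(margin))
    disk = ball(margin)[margin]                     # central slice of the ball = 2D disk
    ptv_2d = np.stack([binary_dilation(s, structure=disk) for s in gtv])

    missing = ptv_3d & ~ptv_2d                      # margin lost by slice-wise expansion
    print("voxels missed by the multiple-2D expansion:", int(missing.sum()))
    ```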

  3. Detection and tracking of gas plumes in LWIR hyperspectral video sequence data

    NASA Astrophysics Data System (ADS)

    Gerhart, Torin; Sunu, Justin; Lieu, Lauren; Merkurjev, Ekaterina; Chang, Jen-Mei; Gilles, Jérôme; Bertozzi, Andrea L.

    2013-05-01

    Automated detection of chemical plumes presents a segmentation challenge. The segmentation problem for gas plumes is difficult due to the diffusive nature of the cloud. The advantage of considering hyperspectral images in the gas plume detection problem over the conventional RGB imagery is the presence of non-visual data, allowing for a richer representation of information. In this paper we present an effective method of visualizing hyperspectral video sequences containing chemical plumes and investigate the effectiveness of segmentation techniques on these post-processed videos. Our approach uses a combination of dimension reduction and histogram equalization to prepare the hyperspectral videos for segmentation. First, Principal Components Analysis (PCA) is used to reduce the dimension of the entire video sequence. This is done by projecting each pixel onto the first few Principal Components resulting in a type of spectral filter. Next, a Midway method for histogram equalization is used. These methods redistribute the intensity values in order to reduce flicker between frames. This properly prepares these high-dimensional video sequences for more traditional segmentation techniques. We compare the ability of various clustering techniques to properly segment the chemical plume. These include K-means, spectral clustering, and the Ginzburg-Landau functional.

  4. Motor Oil Classification using Color Histograms and Pattern Recognition Techniques.

    PubMed

    Ahmadi, Shiva; Mani-Varnosfaderani, Ahmad; Habibi, Biuck

    2018-04-20

    Motor oil classification is important for quality control and the identification of oil adulteration. In this work, we propose a simple, rapid, inexpensive and nondestructive approach based on image analysis and pattern recognition techniques for the classification of nine different types of motor oils according to their corresponding color histograms. For this, we applied color histograms in different color spaces such as red green blue (RGB), grayscale, and hue saturation intensity (HSI) in order to extract features that can help with the classification procedure. These color histograms and their combinations were used as input for model development and then were statistically evaluated by using linear discriminant analysis (LDA), quadratic discriminant analysis (QDA), and support vector machine (SVM) techniques. Here, two common solutions for solving a multiclass classification problem were applied: (1) transformation to a binary classification problem using a one-against-all (OAA) approach and (2) extension from binary classifiers to a single globally optimized multilabel classification model. In the OAA strategy, LDA, QDA, and SVM reached up to 97% in terms of accuracy, sensitivity, and specificity for both the training and test sets. In the extension from the binary case, despite good performances by the SVM classification model, QDA and LDA provided better results up to 92% for RGB-grayscale-HSI color histograms and up to 93% for the HSI color map, respectively. In order to reduce the number of independent variables for modeling, a principal component analysis algorithm was used. Our results suggest that the proposed method is promising for the identification and classification of different types of motor oils.

  5. A flexible new method for 3D measurement based on multi-view image sequences

    NASA Astrophysics Data System (ADS)

    Cui, Haihua; Zhao, Zhimin; Cheng, Xiaosheng; Guo, Changye; Jia, Huayu

    2016-11-01

    Three-dimensional measurement is the basis of reverse engineering. This paper develops a new, flexible and fast optical measurement method based on multi-view geometry. First, feature points are detected and matched with an improved SIFT algorithm. The Hellinger kernel is used to estimate the histogram distance instead of the traditional Euclidean distance, which makes the matching more robust for weakly textured images; then a three-principle filtering scheme for the essential-matrix computation is designed, and the essential matrix is calculated using an improved a contrario RANSAC filter. A single-view point cloud is constructed accurately from two view images; after this, overlapping features are used to eliminate the accumulated errors introduced by additional views, which improves the precision of the camera positions. Finally, the method is verified in a dental restoration CAD/CAM application; experimental results show that the proposed method is fast, accurate and flexible for 3D tooth measurement.
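
    The Hellinger comparison of descriptor histograms mentioned above can be sketched as follows. This is a generic illustration rather than the paper's implementation: it shows the Hellinger distance between two normalized histograms, and the equivalent square-root mapping under which ordinary Euclidean distance between descriptors corresponds to the Hellinger kernel.

    ```python
    import numpy as np

    def hellinger_distance(h1, h2):
        """Hellinger distance between two (possibly unnormalized) histograms."""
        p = np.asarray(h1, dtype=float); p /= p.sum()
        q = np.asarray(h2, dtype=float); q /= q.sum()
        return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

    def sqrt_map(descriptor):
        """L1-normalize a histogram-like descriptor (e.g. SIFT) and take its square
        root, so that Euclidean distances afterwards reflect the Hellinger kernel."""
        d = np.asarray(descriptor, dtype=float)
        return np.sqrt(d / max(d.sum(), 1e-12))
    ```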

  6. Three dimensional fabric evolution of sheared sand

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hasan, Alsidqi; Alshibli, Khalid

    2012-10-24

    Granular particles undergo translation and rolling when they are sheared. This paper presents a three-dimensional (3D) experimental assessment of fabric evolution of sheared sand at the particle level. An F-75 Ottawa sand specimen was tested under an axisymmetric triaxial loading condition. It measured 9.5 mm in diameter and 20 mm in height. The quantitative evaluation was conducted by analyzing 3D high-resolution x-ray synchrotron micro-tomography images of the specimen at eight axial strain levels. The analyses included visualization of particle translation and rotation, and quantification of fabric orientation as shearing continued. Representative individual particles were successfully tracked and visualized to assess the mode of interaction between them. This paper discusses fabric evolution and compares the evolution of particles within and outside the shear band as shearing continues. Changes in particle orientation distributions are presented using fabric histograms and a fabric tensor.

  7. Pushing the limits of Monte Carlo simulations for the three-dimensional Ising model

    NASA Astrophysics Data System (ADS)

    Ferrenberg, Alan M.; Xu, Jiahao; Landau, David P.

    2018-04-01

    While the three-dimensional Ising model has defied analytic solution, various numerical methods like Monte Carlo, Monte Carlo renormalization group, and series expansion have provided precise information about the phase transition. Using Monte Carlo simulation that employs the Wolff cluster flipping algorithm with both 32-bit and 53-bit random number generators and data analysis with histogram reweighting and quadruple precision arithmetic, we have investigated the critical behavior of the simple cubic Ising model, with lattice sizes ranging from 16³ to 1024³. By analyzing data with cross correlations between various thermodynamic quantities obtained from the same data pool, e.g., logarithmic derivatives of magnetization and derivatives of magnetization cumulants, we have obtained the critical inverse temperature Kc = 0.221654626(5) and the critical exponent of the correlation length ν = 0.629912(86) with precision that exceeds all previous Monte Carlo estimates.
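
    Histogram reweighting, the analysis technique named above, extrapolates measurements taken at one coupling to nearby couplings. The sketch below is a generic single-histogram reweighting estimator in NumPy, not the authors' cross-correlation or quadruple-precision machinery; the variable names and the Ising sign convention (Boltzmann weight exp(K·S) with S the sum of bond products) are stated assumptions.

    ```python
    import numpy as np

    def reweight(S_samples, obs_samples, K0, K):
        """Single-histogram reweighting: estimate <obs> at coupling K from a run at K0.
        Assumes configurations are weighted by exp(K * S), where S is the sum of
        nearest-neighbour bond products recorded for each sampled configuration."""
        S = np.asarray(S_samples, dtype=float)
        O = np.asarray(obs_samples, dtype=float)
        logw = (K - K0) * S
        logw -= logw.max()           # guard against overflow; cancels in the ratio
        w = np.exp(logw)
        return np.sum(O * w) / np.sum(w)

    # Example: sweep K near the simulated coupling K0 to trace an observable's peak.
    # Ks = np.linspace(K0 - 1e-3, K0 + 1e-3, 41)
    # curve = [reweight(S_samples, obs_samples, K0, K) for K in Ks]
    ```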

  8. Digital image improvement by adding noise: an example by a professional photographer

    NASA Astrophysics Data System (ADS)

    Kurihara, Takehito; Manabe, Yoshitsugu; Aoki, Naokazu; Kobayashi, Hiroyuki

    2008-01-01

    To overcome shortcomings of digital images, or to reproduce the grain of traditional silver halide photographs, some photographers add noise (grain) to digital images. In an effort to find the factors of preferable noise, we analyzed how a professional photographer introduces noise into B&W digital images and found two noticeable characteristics: 1) there is more noise in mid-tones, gradually decreasing in highlights and shadows toward the ends of the tonal range, and 2) histograms in highlights are skewed toward shadows and vice versa, while almost symmetrical in mid-tones. Next, we examined whether the professional's noise could be reproduced. The symmetrical histograms were approximated by a Gaussian distribution and the skewed ones by a chi-square distribution. The images on which the noise was reproduced were judged by the professional himself to be satisfactory enough. As the professional said he added the noise so that "it looked like the grain of B&W gelatin silver photographs," we compared the two kinds of noise and found they have in common: 1) more noise in mid-tones but almost none in the brightest highlights and deepest shadows, and 2) asymmetrical histograms in highlights and shadows. We think these common characteristics might be one condition for "good" noise.
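
    A toy version of such tone-dependent grain can be written directly from the two observations above. The sketch below is only an illustration under stated assumptions: the mid-tone-peaked amplitude curve, the 0.3/0.7 tone breakpoints, and the chi-square skewing are our own choices, not the photographer's recipe or the authors' model.

    ```python
    import numpy as np

    def add_tonal_grain(img, max_sigma=8.0, dof=4, rng=None):
        """Add grain to an 8-bit image: strongest in mid-tones, skewed toward the
        mid-tones near both ends of the tonal range, roughly symmetric in between."""
        rng = np.random.default_rng() if rng is None else rng
        x = img.astype(float) / 255.0
        sigma = max_sigma * 4.0 * x * (1.0 - x)     # zero at the tonal ends, peak at mid-tones
        sym = rng.normal(0.0, 1.0, img.shape)       # symmetric mid-tone grain
        skewed = (rng.chisquare(dof, img.shape) - dof) / np.sqrt(2.0 * dof)  # zero-mean, right-skewed
        # highlights: histogram skewed toward shadows; shadows: skewed toward highlights
        noise = np.where(x > 0.7, -skewed, np.where(x < 0.3, skewed, sym))
        return np.clip(img + sigma * noise, 0, 255).astype(img.dtype)
    ```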

  9. RNA-Seq Profiling Reveals Novel Hepatic Gene Expression Pattern in Aflatoxin B1 Treated Rats

    PubMed Central

    Merrick, B. Alex; Phadke, Dhiral P.; Auerbach, Scott S.; Mav, Deepak; Stiegelmeyer, Suzy M.; Shah, Ruchir R.; Tice, Raymond R.

    2013-01-01

    Deep sequencing was used to investigate the subchronic effects of 1 ppm aflatoxin B1 (AFB1), a potent hepatocarcinogen, on the male rat liver transcriptome prior to onset of histopathological lesions or tumors. We hypothesized RNA-Seq would reveal more differentially expressed genes (DEG) than microarray analysis, including low copy and novel transcripts related to AFB1’s carcinogenic activity compared to feed controls (CTRL). Paired-end reads were mapped to the rat genome (Rn4) with TopHat and further analyzed by DESeq and Cufflinks-Cuffdiff pipelines to identify differentially expressed transcripts, new exons and unannotated transcripts. PCA and cluster analysis of DEGs showed clear separation between AFB1 and CTRL treatments and concordance among group replicates. qPCR of eight high and medium DEGs and three low DEGs showed good comparability among RNA-Seq and microarray transcripts. DESeq analysis identified 1,026 differentially expressed transcripts at greater than two-fold change (p<0.005) compared to 626 transcripts by microarray due to base pair resolution of transcripts by RNA-Seq, probe placement within transcripts or an absence of probes to detect novel transcripts, splice variants and exons. Pathway analysis among DEGs revealed signaling of Ahr, Nrf2, GSH, xenobiotic, cell cycle, extracellular matrix, and cell differentiation networks consistent with pathways leading to AFB1 carcinogenesis, including almost 200 upregulated transcripts controlled by E2f1-related pathways related to kinetochore structure, mitotic spindle assembly and tissue remodeling. We report 49 novel, differentially-expressed transcripts including confirmation by PCR-cloning of two unique, unannotated, hepatic AFB1-responsive transcripts (HAfT’s) on chromosomes 1.q55 and 15.q11, overexpressed by 10 to 25-fold. Several potentially novel exons were found and exon refinements were made including AFB1 exon-specific induction of homologous family members, Ugt1a6 and Ugt1a7c. We find the rat transcriptome contains many previously unidentified, AFB1-responsive exons and transcripts supporting RNA-Seq’s capabilities to provide new insights into AFB1-mediated gene expression leading to hepatocellular carcinoma. PMID:23630614

  10. Discovering functions of unannotated genes from a transcriptome survey of wild fungal isolates.

    PubMed

    Ellison, Christopher E; Kowbel, David; Glass, N Louise; Taylor, John W; Brem, Rachel B

    2014-04-01

    Most fungal genomes are poorly annotated, and many fungal traits of industrial and biomedical relevance are not well suited to classical genetic screens. Assigning genes to phenotypes on a genomic scale thus remains an urgent need in the field. We developed an approach to infer gene function from expression profiles of wild fungal isolates, and we applied our strategy to the filamentous fungus Neurospora crassa. Using transcriptome measurements in 70 strains from two well-defined clades of this microbe, we first identified 2,247 cases in which the expression of an unannotated gene rose and fell across N. crassa strains in parallel with the expression of well-characterized genes. We then used image analysis of hyphal morphologies, quantitative growth assays, and expression profiling to test the functions of four genes predicted from our population analyses. The results revealed two factors that influenced regulation of metabolism of nonpreferred carbon and nitrogen sources, a gene that governed hyphal architecture, and a gene that mediated amino acid starvation resistance. These findings validate the power of our population-transcriptomic approach for inference of novel gene function, and we suggest that this strategy will be of broad utility for genome-scale annotation in many fungal systems. IMPORTANCE Some fungal species cause deadly infections in humans or crop plants, and other fungi are workhorses of industrial chemistry, including the production of biofuels. Advances in medical and industrial mycology require an understanding of the genes that control fungal traits. We developed a method to infer functions of uncharacterized genes by observing correlated expression of their mRNAs with those of known genes across wild fungal isolates. We applied this strategy to a filamentous fungus and predicted functions for thousands of unknown genes. In four cases, we experimentally validated the predictions from our method, discovering novel genes involved in the metabolism of nutrient sources relevant for biofuel production, as well as colony morphology and starvation resistance. Our strategy is straightforward, inexpensive, and applicable for predicting gene function in many fungal species.

  11. RNA-Seq profiling reveals novel hepatic gene expression pattern in aflatoxin B1 treated rats.

    PubMed

    Merrick, B Alex; Phadke, Dhiral P; Auerbach, Scott S; Mav, Deepak; Stiegelmeyer, Suzy M; Shah, Ruchir R; Tice, Raymond R

    2013-01-01

    Deep sequencing was used to investigate the subchronic effects of 1 ppm aflatoxin B1 (AFB1), a potent hepatocarcinogen, on the male rat liver transcriptome prior to onset of histopathological lesions or tumors. We hypothesized RNA-Seq would reveal more differentially expressed genes (DEG) than microarray analysis, including low copy and novel transcripts related to AFB1's carcinogenic activity compared to feed controls (CTRL). Paired-end reads were mapped to the rat genome (Rn4) with TopHat and further analyzed by DESeq and Cufflinks-Cuffdiff pipelines to identify differentially expressed transcripts, new exons and unannotated transcripts. PCA and cluster analysis of DEGs showed clear separation between AFB1 and CTRL treatments and concordance among group replicates. qPCR of eight high and medium DEGs and three low DEGs showed good comparability among RNA-Seq and microarray transcripts. DESeq analysis identified 1,026 differentially expressed transcripts at greater than two-fold change (p<0.005) compared to 626 transcripts by microarray due to base pair resolution of transcripts by RNA-Seq, probe placement within transcripts or an absence of probes to detect novel transcripts, splice variants and exons. Pathway analysis among DEGs revealed signaling of Ahr, Nrf2, GSH, xenobiotic, cell cycle, extracellular matrix, and cell differentiation networks consistent with pathways leading to AFB1 carcinogenesis, including almost 200 upregulated transcripts controlled by E2f1-related pathways related to kinetochore structure, mitotic spindle assembly and tissue remodeling. We report 49 novel, differentially-expressed transcripts including confirmation by PCR-cloning of two unique, unannotated, hepatic AFB1-responsive transcripts (HAfT's) on chromosomes 1.q55 and 15.q11, overexpressed by 10 to 25-fold. Several potentially novel exons were found and exon refinements were made including AFB1 exon-specific induction of homologous family members, Ugt1a6 and Ugt1a7c. We find the rat transcriptome contains many previously unidentified, AFB1-responsive exons and transcripts supporting RNA-Seq's capabilities to provide new insights into AFB1-mediated gene expression leading to hepatocellular carcinoma.

  12. Alternative types of molecule-decorated atomic chains in Au–CO–Au single-molecule junctions

    PubMed Central

    Balogh, Zoltán; Makk, Péter

    2015-01-01

    Summary We investigate the formation and evolution of Au–CO single-molecule break junctions. The conductance histogram exhibits two distinct molecular configurations, which are further investigated by a combined statistical analysis. According to conditional histogram and correlation analysis these molecular configurations show strong anticorrelations with each other and with pure Au monoatomic junctions and atomic chains. We identify molecular precursor configurations with somewhat higher conductance, which are formed prior to single-molecule junctions. According to detailed length analysis two distinct types of molecule-affected chain-formation processes are observed, and we compare these results to former theoretical calculations considering bridge- and atop-type molecular configurations where the latter has reduced conductance due to destructive Fano interference. PMID:26199840

  13. Alternative types of molecule-decorated atomic chains in Au-CO-Au single-molecule junctions.

    PubMed

    Balogh, Zoltán; Makk, Péter; Halbritter, András

    2015-01-01

    We investigate the formation and evolution of Au-CO single-molecule break junctions. The conductance histogram exhibits two distinct molecular configurations, which are further investigated by a combined statistical analysis. According to conditional histogram and correlation analysis these molecular configurations show strong anticorrelations with each other and with pure Au monoatomic junctions and atomic chains. We identify molecular precursor configurations with somewhat higher conductance, which are formed prior to single-molecule junctions. According to detailed length analysis two distinct types of molecule-affected chain-formation processes are observed, and we compare these results to former theoretical calculations considering bridge- and atop-type molecular configurations where the latter has reduced conductance due to destructive Fano interference.

  14. An embedded face-classification system for infrared images on an FPGA

    NASA Astrophysics Data System (ADS)

    Soto, Javier E.; Figueroa, Miguel

    2014-10-01

    We present a face-classification architecture for long-wave infrared (IR) images implemented on a Field Programmable Gate Array (FPGA). The circuit is fast, compact and low power, can recognize faces in real time and be embedded in a larger image-processing and computer vision system operating locally on an IR camera. The algorithm uses Local Binary Patterns (LBP) to perform feature extraction on each IR image. First, each pixel in the image is represented as an LBP pattern that encodes the similarity between the pixel and its neighbors. Uniform LBP codes are then used to reduce the number of patterns to 59 while preserving more than 90% of the information contained in the original LBP representation. Then, the image is divided into 64 non-overlapping regions, and each region is represented as a 59-bin histogram of patterns. Finally, the algorithm concatenates all 64 regions to create a 3,776-bin spatially enhanced histogram. We reduce the dimensionality of this histogram using Linear Discriminant Analysis (LDA), which improves clustering and enables us to store an entire database of 53 subjects on-chip. During classification, the circuit applies LBP and LDA to each incoming IR image in real time, and compares the resulting feature vector to each pattern stored in the local database using the Manhattan distance. We implemented the circuit on a Xilinx Artix-7 XC7A100T FPGA and tested it with the UCHThermalFace database, which consists of 28 images (81 x 150 pixels) of each of 53 subjects in indoor and outdoor conditions. The circuit achieves a 98.6% hit ratio, trained with 16 images and tested with 12 images of each subject in the database. Using a 100 MHz clock, the circuit classifies 8,230 images per second, and consumes only 309 mW.
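
    The feature-extraction stage described above (59-bin uniform-LBP histograms over 64 regions, concatenated into a 3,776-bin descriptor) can be prototyped in NumPy as below. This is a hedged software sketch, not the FPGA design: the 8-neighbour, radius-1 LBP and the 8 x 8 region grid are assumptions consistent with the abstract.

    ```python
    import numpy as np

    def lbp_codes(img):
        """8-neighbour LBP code for each interior pixel of a 2-D grayscale array."""
        c = img[1:-1, 1:-1]
        shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
        code = np.zeros(c.shape, dtype=np.int64)
        for bit, (dy, dx) in enumerate(shifts):
            nb = img[1 + dy:img.shape[0] - 1 + dy, 1 + dx:img.shape[1] - 1 + dx]
            code |= (nb >= c).astype(np.int64) << bit
        return code

    def uniform_lookup():
        """Map the 256 LBP codes to 59 bins: 58 uniform patterns plus one catch-all."""
        lut, nxt = np.zeros(256, dtype=np.int64), 0
        for v in range(256):
            bits = [(v >> i) & 1 for i in range(8)]
            if sum(bits[i] != bits[(i + 1) % 8] for i in range(8)) <= 2:
                lut[v], nxt = nxt, nxt + 1
            else:
                lut[v] = 58
        return lut

    def lbp_descriptor(img, grid=(8, 8)):
        """Concatenate 59-bin uniform-LBP histograms over a grid of regions (8x8 -> 3,776 bins)."""
        codes = uniform_lookup()[lbp_codes(img)]
        h, w = codes.shape
        feats = []
        for i in range(grid[0]):
            for j in range(grid[1]):
                block = codes[i * h // grid[0]:(i + 1) * h // grid[0],
                              j * w // grid[1]:(j + 1) * w // grid[1]]
                feats.append(np.bincount(block.ravel(), minlength=59))
        return np.concatenate(feats)
    ```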

  15. In vitro quantification of the size distribution of intrasaccular voids left after endovascular coiling of cerebral aneurysms.

    PubMed

    Sadasivan, Chander; Brownstein, Jeremy; Patel, Bhumika; Dholakia, Ronak; Santore, Joseph; Al-Mufti, Fawaz; Puig, Enrique; Rakian, Audrey; Fernandez-Prada, Kenneth D; Elhammady, Mohamed S; Farhat, Hamad; Fiorella, David J; Woo, Henry H; Aziz-Sultan, Mohammad A; Lieber, Baruch B

    2013-03-01

    Endovascular coiling of cerebral aneurysms remains limited by coil compaction and associated recanalization. Recent coil designs which effect higher packing densities may be far from optimal because hemodynamic forces causing compaction are not well understood since detailed data regarding the location and distribution of coil masses are unavailable. We present an in vitro methodology to characterize coil masses deployed within aneurysms by quantifying intra-aneurysmal void spaces. Eight identical aneurysms were packed with coils by both balloon- and stent-assist techniques. The samples were embedded, sequentially sectioned and imaged. Empty spaces between the coils were numerically filled with circles (2D) in the planar images and with spheres (3D) in the three-dimensional composite images. The 2D and 3D void size histograms were analyzed for local variations and by fitting theoretical probability distribution functions. Balloon-assist packing densities (31±2%) were lower (p = 0.04) than the stent-assist group (40±7%). The maximum and average 2D and 3D void sizes were higher (p = 0.03 to 0.05) in the balloon-assist group as compared to the stent-assist group. None of the void size histograms were normally distributed; theoretical probability distribution fits suggest that the histograms are most probably exponentially distributed with decay constants of 6-10 mm. Significant (p ≤ 0.001 to p = 0.03) spatial trends were noted with the void sizes but correlation coefficients were generally low (absolute r ≤ 0.35). The methodology we present can provide valuable input data for numerical calculations of hemodynamic forces impinging on intra-aneurysmal coil masses and be used to compare and optimize coil configurations as well as coiling techniques.

  16. Improving the imaging of calcifications in CT by histogram-based selective deblurring

    NASA Astrophysics Data System (ADS)

    Rollano-Hijarrubia, Empar; van der Meer, Frits; van der Lugt, Add; Weinans, Harrie; Vrooman, Henry; Vossepoel, Albert; Stokking, Rik

    2005-04-01

    Imaging of small high-density structures, such as calcifications, with computed tomography (CT) is limited by the spatial resolution of the system. Blur causes small calcifications to be imaged with lower contrast and overestimated volume, thereby hampering the analysis of vessels. The aim of this work is to reduce the blur of calcifications by applying three-dimensional (3D) deconvolution. Unfortunately, the high-frequency amplification of the deconvolution produces edge-related ring artifacts and enhances noise and original artifacts, which degrades the imaging of low-density structures. A method, referred to as Histogram-based Selective Deblurring (HiSD), was implemented to avoid these negative effects. HiSD uses the histogram information to generate a restored image in which the low-intensity voxel information of the observed image is combined with the high-intensity voxel information of the deconvolved image. To evaluate HiSD we scanned four in-vitro atherosclerotic plaques of carotid arteries with a multislice spiral CT and with a microfocus CT (μCT), used as reference. Restored images were generated from the observed images, and qualitatively and quantitatively compared with their corresponding μCT images. Transverse views and maximum-intensity projections of restored images show the decrease of blur of the calcifications in 3D. Measurements of the areas of 27 calcifications and total volumes of calcification of 4 plaques show that the overestimation of calcification was smaller for restored images (mean-error: 90% for area; 92% for volume) than for observed images (143%; 213%, respectively). The qualitative and quantitative analyses show that the imaging of calcifications in CT can be improved considerably by applying HiSD.
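
    The selective combination at the heart of this approach, keeping low-intensity voxels from the observed image and taking high-intensity voxels from the deconvolved one, can be illustrated schematically as below. The sigmoid blend, its width, and the idea of taking the threshold from the intensity histogram are illustrative assumptions; the published HiSD procedure is not reproduced here.

    ```python
    import numpy as np

    def selective_deblur(observed, deconvolved, threshold, width=50.0):
        """Blend observed and deconvolved CT volumes with an intensity-dependent weight:
        voxels well above `threshold` (e.g. the calcification range identified from the
        histogram) are taken from the deconvolved image, the rest from the observed one."""
        w = 1.0 / (1.0 + np.exp(-(observed.astype(float) - threshold) / width))
        return (1.0 - w) * observed + w * deconvolved
    ```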

  17. Multi-site Study of Diffusion Metric Variability: Characterizing the Effects of Site, Vendor, Field Strength, and Echo Time using the Histogram Distance.

    PubMed

    Helmer, K G; Chou, M-C; Preciado, R I; Gimi, B; Rollins, N K; Song, A; Turner, J; Mori, S

    2016-02-27

    MRI-based multi-site trials now routinely include some form of diffusion-weighted imaging (DWI) in their protocol. These studies can include data originating from scanners built by different vendors, each with their own set of unique protocol restrictions, including restrictions on the number of available gradient directions, whether an externally-generated list of gradient directions can be used, and restrictions on the echo time (TE). One challenge of multi-site studies is to create a common imaging protocol that will result in a reliable and accurate set of diffusion metrics. The present study describes the effect of site, scanner vendor, field strength, and TE on two common metrics: the first moment of the diffusion tensor field (mean diffusivity, MD), and the fractional anisotropy (FA). We have shown in earlier work that ROI metrics and the means of MD and FA histograms are not sufficiently sensitive for use in site characterization. Here we use the distance between whole-brain histograms of FA and MD to investigate within- and between-site effects. We concluded that the variability of DTI metrics due to site, vendor, field strength, and echo time could influence the results in multi-center trials and that the histogram distance is a sensitive metric for each of these variables.
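
    A whole-brain histogram distance of the kind used here can be computed in a few lines. The snippet below is a generic sketch assuming NumPy and SciPy; the bin count, FA range, and the choice of the Jensen-Shannon distance are illustrative stand-ins rather than the study's exact metric.

    ```python
    import numpy as np
    from scipy.spatial.distance import jensenshannon

    def fa_histogram(fa_map, brain_mask, bins=100):
        """Normalized whole-brain FA histogram restricted to the brain mask."""
        h, _ = np.histogram(fa_map[brain_mask > 0], bins=bins, range=(0.0, 1.0))
        return h / h.sum()

    # Distance between two sites' whole-brain FA histograms (placeholder arrays):
    # d = jensenshannon(fa_histogram(fa_site_a, mask_a), fa_histogram(fa_site_b, mask_b))
    ```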

  18. Contrast Enhancement Algorithm Based on Gap Adjustment for Histogram Equalization

    PubMed Central

    Chiu, Chung-Cheng; Ting, Chih-Chung

    2016-01-01

    Image enhancement methods have been widely used to improve the visual effects of images. Owing to its simplicity and effectiveness, histogram equalization (HE) is one of the methods used for enhancing image contrast. However, HE may result in over-enhancement and feature loss problems that lead to an unnatural look and loss of details in the processed images. Researchers have proposed various HE-based methods to solve the over-enhancement problem; however, they have largely ignored the feature loss problem. Therefore, a contrast enhancement algorithm based on gap adjustment for histogram equalization (CegaHE) is proposed. It refers to a visual contrast enhancement algorithm based on histogram equalization (VCEA), which generates visually pleasing enhanced images, and improves the enhancement effects of VCEA. CegaHE adjusts the gaps between two gray values based on the adjustment equation, which takes the properties of human visual perception into consideration, to solve the over-enhancement problem. In addition, it alleviates the feature loss problem and further enhances the textures in the dark regions of the images to improve the quality of the processed images for human visual perception. Experimental results demonstrate that CegaHE is a reliable method for contrast enhancement and that it significantly outperforms VCEA and other methods. PMID:27338412
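
    For reference, the baseline whose over-enhancement and feature-loss problems CegaHE addresses is plain global HE, sketched below for an 8-bit grayscale image. This is a minimal illustration of the standard CDF-remapping form of HE, not the CegaHE gap-adjustment algorithm itself.

    ```python
    import numpy as np

    def global_histogram_equalization(img):
        """Classic global HE: remap gray levels of an 8-bit image through the scaled CDF."""
        hist = np.bincount(img.ravel(), minlength=256)
        cdf = np.cumsum(hist).astype(float)
        cdf_min = cdf[cdf > 0].min()
        lut = np.round(255.0 * np.clip(cdf - cdf_min, 0, None) / max(cdf[-1] - cdf_min, 1)).astype(np.uint8)
        return lut[img]
    ```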

  19. Automation in clinical microbiology: a new approach to identifying micro-organisms by automated pattern matching of proteins labelled with 35S-methionine.

    PubMed Central

    Tabaqchali, S; Silman, R; Holland, D

    1987-01-01

    A new rapid automated method for the identification and classification of microorganisms is described. It is based on the incorporation of 35S-methionine into cellular proteins and subsequent separation of the radiolabelled proteins by sodium dodecyl sulphate-polyacrylamide gel electrophoresis (SDS-PAGE). The protein patterns produced were species specific and reproducible, permitting discrimination between the species. A large number of Gram negative and Gram positive aerobic and anaerobic organisms were successfully tested. Furthermore, there were sufficient differences within species between the protein profiles to permit subdivision of the species. New typing schemes for Clostridium difficile, coagulase negative staphylococci, and Staphylococcus aureus, including the methicillin resistant strains, could thus be introduced; this has provided the basis for useful epidemiological studies. To standardise and automate the procedure an automated electrophoresis system and a two dimensional scanner were developed to scan the dried gels directly. The scanner is operated by a computer which also stores and analyses the scan data. Specific histograms are produced for each bacterial species. Pattern recognition software is used to construct databases and to compare data obtained from different gels: in this way duplicate "unknowns" can be identified. Specific small areas showing differences between various histograms can also be isolated and expanded to maximise the differences, thus providing differentiation between closely related bacterial species and the identification of differences within the species to provide new typing schemes. This system should be widely applied in clinical microbiology laboratories in the near future. PMID:3312300

  20. Predicting the Valence of a Scene from Observers’ Eye Movements

    PubMed Central

    R.-Tavakoli, Hamed; Atyabi, Adham; Rantanen, Antti; Laukka, Seppo J.; Nefti-Meziani, Samia; Heikkilä, Janne

    2015-01-01

    Multimedia analysis benefits from understanding the emotional content of a scene in a variety of tasks such as video genre classification and content-based image retrieval. Recently, there has been an increasing interest in applying human bio-signals, particularly eye movements, to recognize the emotional gist of a scene such as its valence. In order to determine the emotional category of images using eye movements, the existing methods often learn a classifier using several features that are extracted from eye movements. Although it has been shown that eye movement is potentially useful for recognition of scene valence, the contribution of each feature is not well-studied. To address the issue, we study the contribution of features extracted from eye movements in the classification of images into pleasant, neutral, and unpleasant categories. We assess ten features and their fusion. The features are histogram of saccade orientation, histogram of saccade slope, histogram of saccade length, histogram of saccade duration, histogram of saccade velocity, histogram of fixation duration, fixation histogram, top-ten salient coordinates, and saliency map. We utilize machine learning approach to analyze the performance of features by learning a support vector machine and exploiting various feature fusion schemes. The experiments reveal that ‘saliency map’, ‘fixation histogram’, ‘histogram of fixation duration’, and ‘histogram of saccade slope’ are the most contributing features. The selected features signify the influence of fixation information and angular behavior of eye movements in the recognition of the valence of images. PMID:26407322

  1. Histogram analysis of T2*-based pharmacokinetic imaging in cerebral glioma grading.

    PubMed

    Liu, Hua-Shan; Chiang, Shih-Wei; Chung, Hsiao-Wen; Tsai, Ping-Huei; Hsu, Fei-Ting; Cho, Nai-Yu; Wang, Chao-Ying; Chou, Ming-Chung; Chen, Cheng-Yu

    2018-03-01

    To investigate the feasibility of histogram analysis of the T2*-based permeability parameter volume transfer constant (Ktrans) for glioma grading and to explore the diagnostic performance of the histogram analysis of Ktrans and blood plasma volume (vp). We recruited 31 and 11 patients with high- and low-grade gliomas, respectively. The histogram parameters of Ktrans and vp, derived from the first-pass pharmacokinetic modeling based on the T2* dynamic susceptibility-weighted contrast-enhanced perfusion-weighted magnetic resonance imaging (T2* DSC-PW-MRI) from the entire tumor volume, were evaluated for differentiating glioma grades. Histogram parameters of Ktrans and vp showed significant differences between high- and low-grade gliomas and exhibited significant correlations with tumor grades. The mean Ktrans derived from the T2* DSC-PW-MRI had the highest sensitivity and specificity for differentiating high-grade gliomas from low-grade gliomas compared with other histogram parameters of Ktrans and vp. Histogram analysis of T2*-based pharmacokinetic imaging is useful for cerebral glioma grading. The histogram parameters of the entire tumor Ktrans measurement can provide increased accuracy with additional information regarding microvascular permeability changes for identifying high-grade brain tumors.

  2. Stochastic HKMDHE: A multi-objective contrast enhancement algorithm

    NASA Astrophysics Data System (ADS)

    Pratiher, Sawon; Mukhopadhyay, Sabyasachi; Maity, Srideep; Pradhan, Asima; Ghosh, Nirmalya; Panigrahi, Prasanta K.

    2018-02-01

    This contribution proposes a novel extension of the existing `Hyper Kurtosis based Modified Duo-Histogram Equalization' (HKMDHE) algorithm, for multi-objective contrast enhancement of biomedical images. A novel modified objective function has been formulated by joint optimization of the individual histogram equalization objectives. The optimal adequacy of the proposed methodology with respect to image quality metrics such as brightness preserving abilities, peak signal-to-noise ratio (PSNR), Structural Similarity Index (SSIM) and universal image quality metric has been experimentally validated. The performance analysis of the proposed Stochastic HKMDHE with existing histogram equalization methodologies like Global Histogram Equalization (GHE) and Contrast Limited Adaptive Histogram Equalization (CLAHE) has been given for comparative evaluation.

  3. Infrared image segmentation method based on spatial coherence histogram and maximum entropy

    NASA Astrophysics Data System (ADS)

    Liu, Songtao; Shen, Tongsheng; Dai, Yao

    2014-11-01

    In order to segment the target well and suppress background noise effectively, an infrared image segmentation method based on the spatial coherence histogram and maximum entropy is proposed. First, the spatial coherence histogram is constructed by weighting pixels of the same gray level according to their position, with weights obtained by computing their local density. Then, after enhancing the image with the spatial coherence histogram, a 1D maximum entropy method is used to segment it. The method not only achieves better segmentation results but also requires less computation time than traditional 2D histogram-based segmentation methods.
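
    The thresholding stage can be illustrated with the standard 1D maximum-entropy (Kapur) criterion, sketched below for a 256-bin gray-level histogram. This is the plain criterion only; the spatial-coherence weighting proposed in the paper is not reproduced here.

    ```python
    import numpy as np

    def max_entropy_threshold(hist):
        """Kapur's 1-D maximum-entropy threshold from a 256-bin gray-level histogram."""
        p = hist.astype(float) / hist.sum()
        best_t, best_h = 1, -np.inf
        for t in range(1, 256):
            p0, p1 = p[:t].sum(), p[t:].sum()
            if p0 <= 0 or p1 <= 0:
                continue
            q0, q1 = p[:t] / p0, p[t:] / p1
            h0 = -np.sum(q0[q0 > 0] * np.log(q0[q0 > 0]))   # background entropy
            h1 = -np.sum(q1[q1 > 0] * np.log(q1[q1 > 0]))   # target entropy
            if h0 + h1 > best_h:
                best_h, best_t = h0 + h1, t
        return best_t   # pixels with gray level >= best_t are labelled as target

    # Example: t = max_entropy_threshold(np.bincount(ir_image.ravel(), minlength=256))
    ```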

  4. Photojournal Home Page Graphic 2007

    NASA Technical Reports Server (NTRS)

    2008-01-01

    This image is an unannotated version of the Photojournal Home Page graphic released in October 2007. This digital collage contains a highly stylized rendition of our solar system and points beyond. As this graphic was intended to be used as a navigation aid in searching for data within the Photojournal, certain artistic embellishments have been added (color, location, etc.). Several data sets from various planetary and astronomy missions were combined to create this image.

  5. Automated Discovery of Long Intergenic RNAs Associated with Breast Cancer Progression

    DTIC Science & Technology

    2012-02-01

    manuscript in preparation), (2) development and publication of an algorithm for detecting gene fusions in RNA-Seq data [1], and (3) discovery of outlier long...subjected to de novo assembly algorithms to discover novel transcripts representing either unannotated genes or novel somatic mutations such as gene...fusions. To this end the P.I. developed and published a novel algorithm called ChimeraScan to facilitate the discovery and validation of gene

  6. The Amazing Histogram.

    ERIC Educational Resources Information Center

    Vandermeulen, H.; DeWreede, R. E.

    1983-01-01

    Presents a histogram drawing program which sorts real numbers in up to 30 categories. Entered data are sorted and saved in a text file which is then used to generate the histogram. Complete Applesoft program listings are included. (JN)

  7. Bin recycling strategy for improving the histogram precision on GPU

    NASA Astrophysics Data System (ADS)

    Cárdenas-Montes, Miguel; Rodríguez-Vázquez, Juan José; Vega-Rodríguez, Miguel A.

    2016-07-01

    The histogram is an easily comprehensible way to present data and analyses. In the current scientific context, with access to large volumes of data, the processing time for building histograms has dramatically increased. For this reason, parallel construction is necessary to alleviate the impact of processing time on analysis activities. In this scenario, GPU computing is becoming widely used for reducing the processing time of histogram construction to affordable levels. Alongside the increase in processing time, implementations are also stressed with respect to bin-count accuracy. Accuracy issues arising from the particularities of the implementations are not usually taken into consideration when building histograms with very large data sets. In this work, a bin recycling strategy to create an accuracy-aware implementation for building histograms on GPU is presented. In order to evaluate the approach, this strategy was applied to the computation of the three-point angular correlation function, which is a relevant function in cosmology for the study of the large-scale structure of the Universe. As a consequence of the study, a high-accuracy implementation for histogram construction on GPU is proposed.
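
    The bin-count accuracy concern motivating this work can be demonstrated with a tiny NumPy experiment: single-precision accumulators (common when floating-point atomics are used on the GPU) silently stop counting once a bin reaches 2^24, whereas integer accumulation of partial histograms does not lose counts. The snippet only illustrates that failure mode and a generic chunked-accumulation workaround; it is not the paper's bin recycling strategy.

    ```python
    import numpy as np

    # A float32 counter saturates: above 2**24, adding 1.0 no longer changes its value.
    c = np.float32(2 ** 24)
    print(c + np.float32(1.0) == c)                # True: the increment is silently lost

    # Accumulating partial histograms in 64-bit integers avoids the loss.
    data = np.random.default_rng(0).integers(0, 100, size=10_000_000)
    partials = [np.bincount(chunk, minlength=100) for chunk in np.array_split(data, 16)]
    histogram = np.sum(partials, axis=0)           # exact bin counts
    assert histogram.sum() == data.size
    ```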

  8. Clinical Utility of Blood Cell Histogram Interpretation

    PubMed Central

    Bhagya, S.; Majeed, Abdul

    2017-01-01

    An automated haematology analyser provides blood cell histograms by plotting the sizes of different blood cells on X-axis and their relative number on Y-axis. Histogram interpretation needs careful analysis of Red Blood Cell (RBC), White Blood Cell (WBC) and platelet distribution curves. Histogram analysis is often a neglected part of the automated haemogram which if interpreted well, has significant potential to provide diagnostically relevant information even before higher level investigations are ordered. PMID:29207767

  9. Clinical Utility of Blood Cell Histogram Interpretation.

    PubMed

    Thomas, E T Arun; Bhagya, S; Majeed, Abdul

    2017-09-01

    An automated haematology analyser provides blood cell histograms by plotting the sizes of different blood cells on X-axis and their relative number on Y-axis. Histogram interpretation needs careful analysis of Red Blood Cell (RBC), White Blood Cell (WBC) and platelet distribution curves. Histogram analysis is often a neglected part of the automated haemogram which if interpreted well, has significant potential to provide diagnostically relevant information even before higher level investigations are ordered.

  10. [Low dose volume histogram analysis of the lungs in prediction of acute radiation pneumonitis in patients with esophageal cancer treated with three-dimensional conformal radiotherapy].

    PubMed

    Shen, Wen-bin; Zhu, Shu-chai; Gao, Hong-mei; Li, You-mei; Liu, Zhi-kun; Li, Juan; Su, Jing-wei; Wan, Jun

    2013-01-01

    To investigate the predictive value of the low dose volume of the lung for acute radiation pneumonitis (RP) in patients with esophageal cancer treated with three-dimensional conformal radiotherapy (3D-CRT) only, and to analyze the relation of the comprehensive dose-volume parameters V5, V20 and mean lung dose (MLD) with acute RP. Two hundred and twenty-two patients with esophageal cancer treated by 3D-CRT were followed up. The V5-V30 and MLD were calculated from the dose-volume histogram system. The clinical factors and treatment parameters were collected and analyzed. Acute RP was evaluated according to the RTOG toxicity criteria. Acute RP of grade 1, 2, 3 and 4 was observed in 68 (30.6%), 40 (18.0%), 8 (3.6%) and 1 (0.5%) cases, respectively. In the univariate analysis of measurement data, the primary tumor length, number of radiation fields, MLD and lung V5-V30 had a significant relationship with acute RP. The number of radiation fields, GTV volume, MLD and lung V5-V30 differed significantly according to whether ≥ grade 1 or ≥ grade 2 acute RP developed. Binary logistic regression analysis showed that MLD and lung V5, V20 and V25 were independent risk factors for ≥ grade 1 acute RP, and that the number of radiation fields, MLD and lung V5 were independent risk factors for ≥ grade 2 acute RP. The incidence of ≥ grade 1 and ≥ grade 2 acute RP was significantly decreased when the MLD was less than 14 Gy and V5 and V20 were less than 60% and 28%, respectively. When V20 was ≤ 28%, acute RP was significantly decreased in the V5 ≤ 60% group. When the MLD was ≤ 14 Gy, ≥ grade 1 acute RP was significantly decreased in the V5 ≤ 60% group. When the MLD was >14 Gy, ≥ grade 2 acute RP was significantly decreased in the V5 ≤ 60% group. The low dose volume of the lung is effective in predicting radiation pneumonitis in patients with esophageal cancer treated with 3D-CRT only. The comprehensive parameters combining V5, V20 and MLD may increase the accuracy of predicting radiation pneumonitis.
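
    The dose-volume histogram quantities discussed here (MLD and Vx, the percentage of lung volume receiving at least x Gy) reduce to simple statistics over the per-voxel lung dose. The sketch below is a generic illustration assuming equal-volume voxels and NumPy; the array and function names are placeholders, and the commented constraint check merely echoes the cut-offs reported above.

    ```python
    import numpy as np

    def lung_dvh_metrics(lung_dose_gy, thresholds=(5, 20, 25, 30)):
        """Mean lung dose (MLD) and Vx (percentage of lung volume receiving >= x Gy)
        from the per-voxel dose array of the lungs, assuming equal-volume voxels."""
        d = np.asarray(lung_dose_gy, dtype=float).ravel()
        metrics = {"MLD_Gy": d.mean()}
        for x in thresholds:
            metrics[f"V{x}_%"] = 100.0 * np.mean(d >= x)
        return metrics

    # Constraint check in the spirit of the cut-offs reported above (hypothetical dose array):
    # m = lung_dvh_metrics(lung_dose_voxels)
    # acceptable = m["MLD_Gy"] < 14 and m["V5_%"] < 60 and m["V20_%"] < 28
    ```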

  11. Rotation-invariant image and video description with local binary pattern features.

    PubMed

    Zhao, Guoying; Ahonen, Timo; Matas, Jiří; Pietikäinen, Matti

    2012-04-01

    In this paper, we propose a novel approach to compute rotation-invariant features from histograms of local noninvariant patterns. We apply this approach to both static and dynamic local binary pattern (LBP) descriptors. For static-texture description, we present LBP histogram Fourier (LBP-HF) features, and for dynamic-texture recognition, we present two rotation-invariant descriptors computed from the LBPs from three orthogonal planes (LBP-TOP) features in the spatiotemporal domain. LBP-HF is a novel rotation-invariant image descriptor computed from discrete Fourier transforms of LBP histograms. The approach can also be generalized to embed any uniform features into this framework, and combining the supplementary information, e.g., sign and magnitude components of the LBP, can improve the description ability. Moreover, two variants of rotation-invariant descriptors are proposed for the LBP-TOP, which is an effective descriptor for dynamic-texture recognition, as shown by its recent success in different application problems, but it is not rotation invariant. In the experiments, it is shown that the LBP-HF and its extensions outperform noninvariant and earlier versions of the rotation-invariant LBP in rotation-invariant texture classification. In experiments on two dynamic-texture databases with rotations or view variations, the proposed video features can effectively deal with rotation variations of dynamic textures (DTs). They are also robust with respect to changes in viewpoint, outperforming recent methods proposed for view-invariant recognition of DTs.
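
    The core LBP-HF idea, that an image rotation cyclically shifts the rotation index of each uniform pattern and that DFT magnitudes are invariant to cyclic shifts, can be sketched as follows. This is a simplified, hedged illustration: the exact bin bookkeeping of the published descriptor is not reproduced, and the input layout (one length-P histogram row per number of 1-bits) is an assumption.

    ```python
    import numpy as np

    def lbp_hf(rows_by_ones):
        """rows_by_ones: dict mapping n (number of 1-bits in the uniform pattern,
        1..P-1) to a length-P array of histogram counts ordered by rotation r.
        Returns rotation-invariant features: DFT magnitudes along the rotation index."""
        feats = []
        for n in sorted(rows_by_ones):
            row = np.asarray(rows_by_ones[n], dtype=float)
            feats.append(np.abs(np.fft.fft(row)))   # unchanged by any cyclic shift of `row`
        return np.concatenate(feats)
    ```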

  12. Intrinsic and extrinsic approaches for detecting genes in a bacterial genome.

    PubMed Central

    Borodovsky, M; Rudd, K E; Koonin, E V

    1994-01-01

    The unannotated regions of the Escherichia coli genome DNA sequence from the EcoSeq6 database, totaling 1,278 'intergenic' sequences of the combined length of 359,279 basepairs, were analyzed using computer-assisted methods with the aim of identifying putative unknown genes. The proposed strategy for finding new genes includes two key elements: i) prediction of expressed open reading frames (ORFs) using the GeneMark method based on Markov chain models for coding and non-coding regions of Escherichia coli DNA, and ii) search for protein sequence similarities using programs based on the BLAST algorithm and programs for motif identification. A total of 354 putative expressed ORFs were predicted by GeneMark. Using the BLASTX and TBLASTN programs, it was shown that 208 ORFs located in the unannotated regions of the E. coli chromosome are significantly similar to other protein sequences. Identification of 182 ORFs as probable genes was supported by GeneMark and BLAST, comprising 51.4% of the GeneMark 'hits' and 87.5% of the BLAST 'hits'. 73 putative new genes, comprising 20.6% of the GeneMark predictions, belong to ancient conserved protein families that include both eubacterial and eukaryotic members. This value is close to the overall proportion of highly conserved sequences among eubacterial proteins, indicating that the majority of the putative expressed ORFs that are predicted by GeneMark, but have no significant BLAST hits, nevertheless are likely to be real genes. The majority of the putative genes identified by BLAST search have been described since the release of the EcoSeq6 database, but about 70 genes have not been detected so far. Among these new identifications are genes encoding proteins with a variety of predicted functions including dehydrogenases, kinases, several other metabolic enzymes, ATPases, rRNA methyltransferases, membrane proteins, and different types of regulatory proteins. PMID:7984428

  13. A flower image retrieval method based on ROI feature.

    PubMed

    Hong, An-Xiang; Chen, Gang; Li, Jun-Li; Chi, Zhe-Ru; Zhang, Dan

    2004-07-01

    Flower image retrieval is a very important step for computer-aided plant species recognition. In this paper, we propose an efficient segmentation method based on color clustering and domain knowledge to extract flower regions from flower images. For flower retrieval, we use the color histogram of a flower region to characterize the color features of the flower and two shape-based feature sets, Centroid-Contour Distance (CCD) and Angle Code Histogram (ACH), to characterize the shape features of a flower contour. Experimental results showed that our flower region extraction method based on color clustering and domain knowledge can produce accurate flower regions. Flower retrieval results on a database of 885 flower images collected from 14 plant species showed that our Region-of-Interest (ROI) based retrieval approach using both color and shape features can perform better than a method based on the global color histogram proposed by Swain and Ballard (1991) and a method based on domain knowledge-driven segmentation and color names proposed by Das et al. (1999).
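
    One of the shape descriptors named above, the Centroid-Contour Distance, is simply the sequence of distances from the region centroid to points along the flower contour. The sketch below is a generic, hedged version: the resampling count and the scale normalization are illustrative choices, not the paper's exact parameters.

    ```python
    import numpy as np

    def centroid_contour_distance(contour, n_samples=64):
        """Centroid-Contour Distance (CCD) signature of a closed contour given as an
        (N, 2) array of (x, y) points; distances are resampled and scale-normalized."""
        pts = np.asarray(contour, dtype=float)
        centroid = pts.mean(axis=0)
        d = np.linalg.norm(pts - centroid, axis=1)
        idx = np.linspace(0, len(d) - 1, n_samples).astype(int)   # resample evenly along the contour
        d = d[idx]
        return d / (d.max() + 1e-12)   # normalize for scale invariance
    ```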

  14. Object-based change detection method using refined Markov random field

    NASA Astrophysics Data System (ADS)

    Peng, Daifeng; Zhang, Yongjun

    2017-01-01

    In order to fully consider the local spatial constraints between neighboring objects in object-based change detection (OBCD), an OBCD approach is presented by introducing a refined Markov random field (MRF). First, two periods of images are stacked and segmented to produce image objects. Second, object spectral and textural histogram features are extracted and the G-statistic is implemented to measure the distance among different histogram distributions. Meanwhile, object heterogeneity is calculated by combining spectral and textural histogram distances using adaptive weights. Third, an expectation-maximization algorithm is applied for determining the change category of each object and the initial change map is then generated. Finally, a refined change map is produced by employing the proposed refined object-based MRF method. Three experiments were conducted and compared with some state-of-the-art unsupervised OBCD methods to evaluate the effectiveness of the proposed method. Experimental results demonstrate that the proposed method obtains the highest accuracy among the methods used in this paper, which confirms its validity and effectiveness in OBCD.
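
    The G-statistic used to compare object histograms is the log-likelihood-ratio statistic of a 2 x k contingency table formed by the two count histograms. A minimal NumPy version is sketched below as a generic illustration; the adaptive spectral/textural weighting described in the paper is not included.

    ```python
    import numpy as np

    def g_statistic(h1, h2):
        """G-statistic (log-likelihood ratio) between two count histograms, treated
        as a 2 x k contingency table; larger values mean more dissimilar histograms."""
        obs = np.vstack([np.asarray(h1, dtype=float), np.asarray(h2, dtype=float)])
        total = obs.sum()
        expected = np.outer(obs.sum(axis=1), obs.sum(axis=0)) / total
        mask = obs > 0
        return 2.0 * np.sum(obs[mask] * np.log(obs[mask] / expected[mask]))
    ```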

  15. Multipurpose contrast enhancement on epiphyseal plates and ossification centers for bone age assessment

    PubMed Central

    2013-01-01

    Background The high variation of background luminance, low contrast and excessively enhanced contrast of hand bone radiographs often impede bone age assessment rating systems in evaluating the degree of development of the epiphyseal plates and ossification centers. Global histogram equalization (GHE) has been the most frequently adopted image contrast enhancement technique, but its performance is not satisfying. A brightness- and detail-preserving histogram equalization method with a good contrast enhancement effect has been a goal of much recent research in histogram equalization. Nevertheless, producing a histogram-equalized radiograph that is well balanced in terms of brightness preservation, detail preservation and contrast enhancement is a daunting task. Method In this paper, we propose a novel histogram equalization framework that takes several desirable properties into account, namely the Multipurpose Beta Optimized Bi-Histogram Equalization (MBOBHE). This method performs the histogram optimization separately in both sub-histograms after segmenting the histogram at an optimized separating point determined by a regularization function constituted by three components. The result is then assessed by qualitative and quantitative analyses of the essential aspects of the equalized images, using a total of 160 hand radiographs acquired from an online hand bone database for testing and analysis. Result From the qualitative analysis, we found that basic bi-histogram equalization methods are not capable of displaying small features in the image, because the separating point is selected by focusing on only a certain metric without considering contrast enhancement and detail preservation. From the quantitative analysis, we found that MBOBHE correlates well with human visual perception, and this improvement shortens the evaluation time taken by the inspector in assessing bone age. Conclusions The proposed MBOBHE outperforms other existing methods regarding the comprehensive performance of histogram equalization. All the features pertinent to bone age assessment are more prominent than with other methods, which shortens the evaluation time required in manual bone age assessment using the TW method, while the accuracy remains unaffected or slightly better than with the unprocessed original image. The holistic properties of brightness preservation, detail preservation and contrast enhancement are simultaneously taken into consideration, and the resulting visual effect is conducive to manual inspection. PMID:23565999
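
    MBOBHE builds on bi-histogram equalization, in which the gray-level histogram is split at a separating point and each sub-histogram is equalized into its own output range. The sketch below shows only that basic bi-histogram step (with the mean intensity as a default separating point); the beta-optimized separating point and the regularization function of MBOBHE are not reproduced.

    ```python
    import numpy as np

    def bi_histogram_equalization(img, sep=None):
        """Split the gray levels of an 8-bit image at `sep` (default: mean intensity),
        equalize each sub-histogram into its own output range, and recombine."""
        sep = int(img.mean()) if sep is None else int(sep)
        out = np.empty_like(img)
        for lo, hi, sel in [(0, sep, img <= sep), (sep + 1, 255, img > sep)]:
            if not np.any(sel):
                continue
            vals = img[sel]
            hist = np.bincount(vals - lo, minlength=hi - lo + 1)
            cdf = np.cumsum(hist).astype(float) / hist.sum()
            lut = np.round(lo + cdf * (hi - lo)).astype(img.dtype)
            out[sel] = lut[vals - lo]
        return out
    ```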

  16. Using histograms to introduce randomization in the generation of ensembles of decision trees

    DOEpatents

    Kamath, Chandrika; Cantu-Paz, Erick; Littau, David

    2005-02-22

    A system for decision tree ensembles that includes a module to read the data, a module to create a histogram, a module to evaluate a potential split according to some criterion using the histogram, a module to select a split point randomly in an interval around the best split, a module to split the data, and a module to combine multiple decision trees in ensembles. The decision tree method includes the steps of reading the data; creating a histogram; evaluating a potential split according to some criterion using the histogram, selecting a split point randomly in an interval around the best split, splitting the data, and combining multiple decision trees in ensembles.
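
    A sketch of the randomized, histogram-based split selection described above is given below. It is an illustrative reading of the abstract, not the patented implementation: Gini impurity is used as the example criterion, labels are assumed to be non-negative integers, and the "interval around the best split" is taken to be the bins adjacent to the best-scoring bin boundary.

    ```python
    import numpy as np

    def histogram_random_split(x, y, n_bins=32, rng=None):
        """Choose a split threshold for feature values x with integer class labels y:
        1) histogram x, 2) score each bin boundary with a criterion (weighted Gini
        impurity here), 3) draw the split point at random from the interval around
        the best-scoring boundary."""
        rng = np.random.default_rng() if rng is None else rng
        edges = np.histogram_bin_edges(x, bins=n_bins)

        def gini(labels):
            if labels.size == 0:
                return 0.0
            p = np.bincount(labels).astype(float) / labels.size
            return 1.0 - np.sum(p ** 2)

        best_i, best_score = 1, np.inf
        for i in range(1, n_bins):
            left, right = y[x < edges[i]], y[x >= edges[i]]
            score = (left.size * gini(left) + right.size * gini(right)) / y.size
            if score < best_score:
                best_score, best_i = score, i
        return rng.uniform(edges[best_i - 1], edges[best_i + 1])
    ```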

  17. Color Histogram Diffusion for Image Enhancement

    NASA Technical Reports Server (NTRS)

    Kim, Taemin

    2011-01-01

    Various color histogram equalization (CHE) methods have been proposed to extend grayscale histogram equalization (GHE) for color images. In this paper a new method called histogram diffusion that extends the GHE method to arbitrary dimensions is proposed. Ranges in a histogram are specified as overlapping bars of uniform heights and variable widths which are proportional to their frequencies. This diagram is called the vistogram. As an alternative approach to GHE, the squared error of the vistogram from the uniform distribution is minimized. Each bar in the vistogram is approximated by a Gaussian function. Gaussian particles in the vistogram diffuse as a nonlinear autonomous system of ordinary differential equations. CHE results of color images showed that the approach is effective.

  18. Locally advanced rectal cancer: post-chemoradiotherapy ADC histogram analysis for predicting a complete response.

    PubMed

    Cho, Seung Hyun; Kim, Gab Chul; Jang, Yun-Jin; Ryeom, Hunkyu; Kim, Hye Jung; Shin, Kyung-Min; Park, Jun Seok; Choi, Gyu-Seog; Kim, See Hyung

    2015-09-01

    The value of diffusion-weighted imaging (DWI) for reliable differentiation between pathologic complete response (pCR) and residual tumor is still unclear. Recently, a few studies have reported that histogram analysis can be helpful for monitoring the therapeutic response in various cancers. To investigate whether post-chemoradiotherapy (CRT) apparent diffusion coefficient (ADC) histogram analysis can be helpful to predict a pCR in locally advanced rectal cancer (LARC). Fifty patients who underwent preoperative CRT followed by surgery were enrolled in this retrospective study, non-pCR (n = 41) and pCR (n = 9), respectively. ADC histogram analysis encompassing the whole tumor was performed on two post-CRT ADC600 and ADC1000 (b factors 0, 600 vs. 0, 1000 s/mm²) maps. Mean, minimum, maximum, SD, mode, 10th, 25th, 50th, 75th, 90th percentile ADCs, skewness, and kurtosis were derived. Diagnostic performance for predicting pCR was evaluated and compared. On both maps, 10th and 25th percentile ADCs showed better diagnostic performance than that using the mean ADC. Tenth percentile ADCs revealed the best diagnostic performance on both the ADC600 (Az 0.841, sensitivity 100%, specificity 70.7%) and ADC1000 (Az 0.821, sensitivity 77.8%, specificity 87.8%) maps. In the comparison between the 10th percentile and mean ADC, the specificity was significantly improved on both the ADC600 (70.7% vs. 53.7%; P = 0.031) and ADC1000 (87.8% vs. 73.2%; P = 0.039) maps. Post-CRT ADC histogram analysis is helpful for predicting pCR in LARC, especially in improving the specificity, compared with the mean ADC.
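
    The whole-tumor histogram parameters listed above reduce to order statistics and moments of the masked ADC voxel values. The snippet below is a generic sketch assuming NumPy and SciPy; the array names are placeholders and the mode is omitted for brevity.

    ```python
    import numpy as np
    from scipy.stats import skew, kurtosis

    def adc_histogram_parameters(adc_map, tumor_mask):
        """Whole-tumor ADC histogram parameters: mean, min, max, SD, percentiles,
        skewness and kurtosis of the voxel values inside the tumor mask."""
        vals = adc_map[tumor_mask > 0].astype(float)
        pct = np.percentile(vals, [10, 25, 50, 75, 90])
        return {
            "mean": vals.mean(), "min": vals.min(), "max": vals.max(), "sd": vals.std(ddof=1),
            "p10": pct[0], "p25": pct[1], "p50": pct[2], "p75": pct[3], "p90": pct[4],
            "skewness": skew(vals), "kurtosis": kurtosis(vals),
        }
    ```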

  19. Radiological indeterminate vestibular schwannoma and meningioma in cerebellopontine angle area: differentiating using whole-tumor histogram analysis of apparent diffusion coefficient.

    PubMed

    Xu, Xiao-Quan; Li, Yan; Hong, Xun-Ning; Wu, Fei-Yun; Shi, Hai-Bin

    2017-02-01

    To assess the role of whole-tumor histogram analysis of apparent diffusion coefficient (ADC) maps in differentiating radiologically indeterminate vestibular schwannoma (VS) from meningioma in the cerebellopontine angle (CPA). Diffusion-weighted (DW) images (b = 0 and 1000 s/mm²) of pathologically confirmed and radiologically indeterminate CPA meningioma (CPAM) (n = 27) and VS (n = 12) were retrospectively collected and processed with a mono-exponential model. Whole-tumor regions of interest were drawn on all slices of the ADC maps to obtain histogram parameters, including the mean ADC (ADCmean), median ADC (ADCmedian), 10th/25th/75th/90th percentile ADC (ADC10, ADC25, ADC75 and ADC90), skewness and kurtosis. The differences in ADC histogram parameters between CPAM and VS were compared using unpaired t-tests. Multiple receiver operating characteristic (ROC) curve analysis was used to determine and compare the diagnostic value of each significant parameter. Significant differences were found in ADCmean, ADCmedian, ADC10, ADC25, ADC75 and ADC90 between CPAM and VS (all p values < 0.001), while no significant difference was found in kurtosis (p = 0.562) and skewness (p = 0.047). ROC curve analysis revealed that a cut-off value of 1.126 × 10⁻³ mm²/s for the ADC90 value generated the highest area under the curve (AUC) for differentiating CPAM from VS (AUC, 0.975; sensitivity, 100%; specificity, 88.9%). Histogram analysis of ADC maps based on the whole tumor can be a useful tool for differentiating radiologically indeterminate CPAM from VS. The ADC90 value was the most promising parameter for differentiating these two entities.

  20. Atherogenic lipid phenotype in a general group of subjects.

    PubMed

    Van, Joanne; Pan, Jianqiu; Charles, M Arthur; Krauss, Ronald; Wong, Nathan; Wu, Xiaoshan

    2007-11-01

    The atherogenic lipid phenotype is a major cardiovascular risk factor, but normal values do not exist derived from 1 analysis in a general study group. To determine normal values of all of the atherogenic lipid phenotype parameters using subjects from a general study group. One hundred two general subjects were used to determine their atherogenic lipid phenotype using polyacrylamide gradient gels. Low-density lipoprotein (LDL) size revealed 24% of subjects express LDL phenotype B, defined as average LDL peak particle size 258 Å or less; however, among the Chinese subjects, the expression of the B phenotype was higher at 44% (P = .02). For the total group, mean LDL size was 265 ± 11 Å (1 SD); however, histograms were bimodal in both men and women. After excluding subjects expressing LDL phenotype B, because they are at increased cardiovascular risk and thus are not completely healthy, LDL histograms were unimodal and the mean LDL size was 270 ± 7 Å. A small, dense LDL concentration histogram (total group) revealed skewing; thus, phenotype B subjects were excluded, for the rationale described previously, and the mean value was 13 ± 9 mg/dL (0.33 ± 0.23 mmol/L). High-density lipoprotein (HDL) cholesterol histograms were bimodal in both sexes. After removing subjects as described previously or if HDL cholesterol levels were less than 45 mg/dL, histograms were unimodal and revealed a mean HDL cholesterol value of 61 ± 12 mg/dL (1.56 ± 0.31 mmol/L). HDL2, HDL2a, and HDL2b were similarly evaluated. Approximate normal values for the atherogenic lipid phenotype, similar to those derived from cardiovascular endpoint trials, can be determined if those high proportions of subjects with dyslipidemic cardiovascular risk are excluded.

  1. Identification of unannotated exons of low abundance transcripts in Drosophila melanogaster and cloning of a new serine protease gene upregulated upon injury.

    PubMed

    Maia, Rafaela M; Valente, Valeria; Cunha, Marco A V; Sousa, Josane F; Araujo, Daniela D; Silva, Wilson A; Zago, Marco A; Dias-Neto, Emmanuel; Souza, Sandro J; Simpson, Andrew J G; Monesi, Nadia; Ramos, Ricardo G P; Espreafico, Enilza M; Paçó-Larson, Maria L

    2007-07-24

    The sequencing of the D. melanogaster genome revealed an unexpectedly small number of genes (~ 14,000), indicating that mechanisms acting on the generation of transcript diversity must have played a major role in the evolution of complex metazoans. Among the most extensively used mechanisms that account for this diversity is alternative splicing. It is estimated that over 40% of Drosophila protein-coding genes contain one or more alternative exons. A recent transcription map of Drosophila embryogenesis indicates that 30% of the transcribed regions are unannotated, and that 1/3 of this is estimated as missed or alternative exons of previously characterized protein-coding genes. Therefore, the identification of the variety of expressed transcripts depends on experimental data for its final validation and is continuously being performed using different approaches. We applied the Open Reading Frame Expressed Sequence Tags (ORESTES) methodology, which is capable of generating cDNA data from the central portion of rare transcripts, in order to investigate the presence of hitherto unannotated regions of the Drosophila transcriptome. Bioinformatic analysis of 1,303 Drosophila ORESTES clusters identified 68 sequences derived from unannotated regions in the current Drosophila genome version (4.3). Of these, a set of 38 was analysed by polyA+ northern blot hybridization, validating 17 (50%) new exons of low abundance transcripts. For one of these ESTs, we obtained the cDNA encompassing the complete coding sequence of a new serine protease, named SP212. The SP212 gene is part of a serine protease gene cluster located in the chromosome region 88A12-B1. This cluster includes the predicted genes CG9631, CG9649 and CG31326, which were previously identified as up-regulated after immune challenges in genomic-scale microarray analysis. In agreement with the proposal that this locus is co-regulated in response to microorganism infection, we show here that SP212 is also up-regulated upon injury. Using the ORESTES methodology we identified 17 novel exons from low abundance Drosophila transcripts, and through a PCR approach the complete CDS of one of these transcripts was defined. Our results show that computational identification and manual inspection are not sufficient to annotate a genome in the absence of experimentally derived data.

  2. Identification of unannotated exons of low abundance transcripts in Drosophila melanogaster and cloning of a new serine protease gene upregulated upon injury

    PubMed Central

    Maia, Rafaela M; Valente, Valeria; Cunha, Marco AV; Sousa, Josane F; Araujo, Daniela D; Silva, Wilson A; Zago, Marco A; Dias-Neto, Emmanuel; Souza, Sandro J; Simpson, Andrew JG; Monesi, Nadia; Ramos, Ricardo GP; Espreafico, Enilza M; Paçó-Larson, Maria L

    2007-01-01

    Background The sequencing of the D. melanogaster genome revealed an unexpectedly small number of genes (~14,000), indicating that mechanisms acting on the generation of transcript diversity must have played a major role in the evolution of complex metazoans. Among the most extensively used mechanisms that account for this diversity is alternative splicing. It is estimated that over 40% of Drosophila protein-coding genes contain one or more alternative exons. A recent transcription map of Drosophila embryogenesis indicates that 30% of the transcribed regions are unannotated, and that one-third of these are estimated to be missed or alternative exons of previously characterized protein-coding genes. Therefore, the identification of the variety of expressed transcripts depends on experimental data for its final validation and is continuously being performed using different approaches. We applied the Open Reading Frame Expressed Sequence Tags (ORESTES) methodology, which is capable of generating cDNA data from the central portion of rare transcripts, in order to investigate the presence of hitherto unannotated regions of the Drosophila transcriptome. Results Bioinformatic analysis of 1,303 Drosophila ORESTES clusters identified 68 sequences derived from unannotated regions in the current Drosophila genome version (4.3). Of these, a set of 38 was analysed by polyA+ northern blot hybridization, validating 17 (50%) new exons of low abundance transcripts. For one of these ESTs, we obtained the cDNA encompassing the complete coding sequence of a new serine protease, named SP212. The SP212 gene is part of a serine protease gene cluster located in the chromosome region 88A12-B1. This cluster includes the predicted genes CG9631, CG9649 and CG31326, which were previously identified as up-regulated after immune challenges in genomic-scale microarray analysis. In agreement with the proposal that this locus is co-regulated in response to infection by microorganisms, we show here that SP212 is also up-regulated upon injury. Conclusion Using the ORESTES methodology we identified 17 novel exons from low abundance Drosophila transcripts, and through a PCR approach the complete CDS of one of these transcripts was defined. Our results show that computational identification and manual inspection are not sufficient to annotate a genome in the absence of experimentally derived data. PMID:17650329

  3. Discussion on the 3D visualizing of 1:200 000 geological map

    NASA Astrophysics Data System (ADS)

    Wang, Xiaopeng

    2018-01-01

    Using United States National Aeronautics and Space Administration Shuttle Radar Topography Mission (SRTM) terrain data as the digital elevation model (DEM), overlaying the scanned 1:200 000 scale geological map, and programming with Microsoft Direct3D in the C# language, the author realized a three-dimensional visualization of the standard-sheet geological map. Users can inspect the regional geological content from arbitrary angles, with rotation and roaming, and can examine the stratigraphic column (synthetical histogram), map sections, and legend at any moment. This provides an intuitive analysis tool for geological practitioners performing structural analysis with the aid of landforms, planning field exploration routes, and similar tasks.

  4. Global Interior Robot Localisation by a Colour Content Image Retrieval System

    NASA Astrophysics Data System (ADS)

    Chaari, A.; Lelandais, S.; Montagne, C.; Ahmed, M. Ben

    2007-12-01

    We propose a new global localisation approach to determine the coarse position of a mobile robot in a structured indoor space using colour-based image retrieval techniques. We use an original method of colour quantisation based on the baker's transformation to extract a two-dimensional colour palette combining spatial and vicinity-related information as well as the colourimetric aspect of the original image. We conceive several retrieval approaches leading to a specific similarity measure (given in the full text) that integrates the spatial organisation of colours in the palette. The baker's transformation provides a quantisation of the image into a space where colours that are nearby in the original space are also nearby in the output space, thereby providing dimensionality reduction and invariance to minor changes in the image, whereas the distance measure provides partial invariance to translation, small changes of viewpoint, and scale. In addition, we developed a hierarchical search module based on the classification of images by room. This hierarchical module reduces the indoor search space and improves the system's performance. Results are then compared with those obtained from colour histograms under several similarity measures. In this paper, we focus on colour-based features to describe indoor images; a finalised system must obviously integrate other types of signature, such as shape and texture.

  5. Myocardial Infarct Segmentation from Magnetic Resonance Images for Personalized Modeling of Cardiac Electrophysiology

    PubMed Central

    Ukwatta, Eranga; Arevalo, Hermenegild; Li, Kristina; Yuan, Jing; Qiu, Wu; Malamas, Peter; Wu, Katherine C.

    2016-01-01

    Accurate representation of myocardial infarct geometry is crucial to patient-specific computational modeling of the heart in ischemic cardiomyopathy. We have developed a methodology for segmentation of left ventricular (LV) infarct from clinically acquired, two-dimensional (2D), late-gadolinium enhanced cardiac magnetic resonance (LGE-CMR) images, for personalized modeling of ventricular electrophysiology. The infarct segmentation was expressed as a continuous min-cut optimization problem, which was solved using its dual formulation, the continuous max-flow (CMF). The optimization objective comprised a smoothness term and a data term that quantified the similarity between image intensity histograms of segmented regions and those of a set of training images. A manual segmentation of the LV myocardium was used to initialize and constrain the developed method. The three-dimensional geometry of infarct was reconstructed from its segmentation using an implicit, shape-based interpolation method. The proposed methodology was extensively evaluated using metrics based on geometry, and outcomes of individualized electrophysiological simulations of cardiac (dys)function. Several existing LV infarct segmentation approaches were implemented, and compared with the proposed method. Our results demonstrated that the CMF method was more accurate than the existing approaches in reproducing expert manual LV infarct segmentations, and in electrophysiological simulations. The infarct segmentation method we have developed and comprehensively evaluated in this study constitutes an important step in advancing clinical applications of personalized simulations of cardiac electrophysiology. PMID:26731693

  6. FPGA based charge fast histogramming for GEM detector

    NASA Astrophysics Data System (ADS)

    Poźniak, Krzysztof T.; Byszuk, A.; Chernyshova, M.; Cieszewski, R.; Czarski, T.; Dominik, W.; Jakubowska, K.; Kasprowicz, G.; Rzadkiewicz, J.; Scholz, M.; Zabolotny, W.

    2013-10-01

    This article presents a fast charge histogramming method for the position-sensitive X-ray GEM detector. The energy-resolved measurements are carried out simultaneously for 256 channels of the GEM detector. The whole histogramming process is performed in 21 FPGA chips (Spartan-6 series from Xilinx). The results of the histogramming process are stored in an external DDR3 memory. The structure of the electronic measurement equipment and the firmware functionality implemented in the FPGAs are described. Examples of test measurements are presented.

  7. Local dynamic range compensation for scanning electron microscope imaging system.

    PubMed

    Sim, K S; Huang, Y H

    2015-01-01

    This paper presents an extension of earlier work that introduces modified dynamic range histogram modification (MDRHM). The technique is used to enhance the scanning electron microscope (SEM) imaging system. In contrast to conventional histogram modification compensators, it applies histogram profiling by extending the dynamic range of each tile of an image to the full 0-255 range while retaining its histogram shape. The proposed technique yields better image compensation than conventional methods. © Wiley Periodicals, Inc.
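
    For orientation only, the tile-wise dynamic-range extension described in this record can be sketched as below. This is not the authors' implementation; it is a minimal NumPy version that linearly stretches each tile to the full 0-255 range, which leaves the relative shape of the tile histogram unchanged. The tile size and the handling of flat tiles are assumptions.

    ```python
    import numpy as np

    def stretch_tiles(image, tile=64):
        """Linearly stretch each tile of an 8-bit image to the full 0-255 range.

        A linear mapping rescales the tile histogram's dynamic range while
        preserving its shape (up to requantisation).
        """
        out = image.astype(np.float64).copy()
        h, w = image.shape
        for r in range(0, h, tile):
            for c in range(0, w, tile):
                block = out[r:r + tile, c:c + tile]
                lo, hi = block.min(), block.max()
                if hi > lo:  # skip flat tiles to avoid division by zero
                    out[r:r + tile, c:c + tile] = (block - lo) * 255.0 / (hi - lo)
        return np.clip(out, 0, 255).astype(np.uint8)

    # example on a synthetic low-contrast image
    img = (np.random.rand(256, 256) * 60 + 80).astype(np.uint8)
    enhanced = stretch_tiles(img, tile=64)
    ```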

  8. K+ block is the mechanism of functional asymmetry in bacterial Nav channels

    DOE PAGES

    Ngo, Van; Wang, Yibo; Haas, Stephan; ...

    2016-01-04

    Crystal structures of several bacterial Nav channels have been recently published and molecular dynamics simulations of ion permeation through these channels are consistent with many electrophysiological properties of eukaryotic channels. Bacterial Nav channels have been characterized as functionally asymmetric, and the mechanism of this asymmetry has not been clearly understood. To address this question, we combined non-equilibrium simulation data with two-dimensional equilibrium unperturbed landscapes generated by umbrella sampling and Weighted Histogram Analysis Methods for multiple ions traversing the selectivity filter of bacterial NavAb channel. This approach provided new insight into the mechanism of selective ion permeation in bacterial Nav channels. The non-equilibrium simulations indicate that two or three extracellular K+ ions can block the entrance to the selectivity filter of NavAb in the presence of applied forces in the inward direction, but not in the outward direction. The block state occurs in an unstable local minimum of the equilibrium unperturbed free-energy landscape of two K+ ions that can be ‘locked’ in place by modest applied forces. In contrast to K+, three Na+ ions move favorably through the selectivity filter together as a unit in a loose “knock-on” mechanism of permeation in both inward and outward directions, and there is no similar local minimum in the two-dimensional free-energy landscape of two Na+ ions for a block state. The useful work predicted by the non-equilibrium simulations that is required to break the K+ block is equivalent to large applied potentials experimentally measured for two bacterial Nav channels to induce inward currents of K+ ions. These results illustrate how inclusion of non-equilibrium factors in the simulations can provide detailed information about mechanisms of ion selectivity that is missing from mechanisms derived from either crystal structures or equilibrium unperturbed free-energy landscapes.

  9. K+ Block Is the Mechanism of Functional Asymmetry in Bacterial Nav Channels

    PubMed Central

    Ngo, Van; Wang, Yibo; Haas, Stephan; Noskov, Sergei Y.; Farley, Robert A.

    2016-01-01

    Crystal structures of several bacterial Nav channels have been recently published and molecular dynamics simulations of ion permeation through these channels are consistent with many electrophysiological properties of eukaryotic channels. Bacterial Nav channels have been characterized as functionally asymmetric, and the mechanism of this asymmetry has not been clearly understood. To address this question, we combined non-equilibrium simulation data with two-dimensional equilibrium unperturbed landscapes generated by umbrella sampling and Weighted Histogram Analysis Methods for multiple ions traversing the selectivity filter of bacterial NavAb channel. This approach provided new insight into the mechanism of selective ion permeation in bacterial Nav channels. The non-equilibrium simulations indicate that two or three extracellular K+ ions can block the entrance to the selectivity filter of NavAb in the presence of applied forces in the inward direction, but not in the outward direction. The block state occurs in an unstable local minimum of the equilibrium unperturbed free-energy landscape of two K+ ions that can be ‘locked’ in place by modest applied forces. In contrast to K+, three Na+ ions move favorably through the selectivity filter together as a unit in a loose “knock-on” mechanism of permeation in both inward and outward directions, and there is no similar local minimum in the two-dimensional free-energy landscape of two Na+ ions for a block state. The useful work predicted by the non-equilibrium simulations that is required to break the K+ block is equivalent to large applied potentials experimentally measured for two bacterial Nav channels to induce inward currents of K+ ions. These results illustrate how inclusion of non-equilibrium factors in the simulations can provide detailed information about mechanisms of ion selectivity that is missing from mechanisms derived from either crystal structures or equilibrium unperturbed free-energy landscapes. PMID:26727271

  10. Histogram Profiling of Postcontrast T1-Weighted MRI Gives Valuable Insights into Tumor Biology and Enables Prediction of Growth Kinetics and Prognosis in Meningiomas.

    PubMed

    Gihr, Georg Alexander; Horvath-Rizea, Diana; Kohlhof-Meinecke, Patricia; Ganslandt, Oliver; Henkes, Hans; Richter, Cindy; Hoffmann, Karl-Titus; Surov, Alexey; Schob, Stefan

    2018-06-14

    Meningiomas are the most frequently diagnosed intracranial masses, oftentimes requiring surgery. Procedure-related morbidity in particular can be substantial, especially in elderly patients. Hence, reliable imaging modalities enabling pretherapeutic prediction of tumor grade, growth kinetics, realistic prognosis, and, as a consequence, the necessity of surgery are of great value. In this context, a promising diagnostic approach is advanced analysis of magnetic resonance imaging data. Therefore, our study investigated whether histogram profiling of routinely acquired postcontrast T1-weighted images is capable of separating low-grade from high-grade lesions and whether histogram parameters reflect Ki-67 expression in meningiomas. Pretreatment T1-weighted postcontrast volumes of 44 meningioma patients were used for signal intensity histogram profiling. WHO grade, tumor volume, and Ki-67 expression were evaluated. Comparative and correlative statistics investigating the association between histogram profile parameters and neuropathology were performed. None of the investigated histogram parameters revealed significant differences between low-grade and high-grade meningiomas. However, significant correlations were identified between Ki-67 and the histogram parameters skewness and entropy, as well as between entropy and tumor volume. Contrary to previously reported findings, pretherapeutic postcontrast T1-weighted images can be used to predict growth kinetics in meningiomas if whole-tumor histogram analysis is employed. However, no differences between distinct WHO grades were identifiable in our cohort. As a consequence, histogram analysis of postcontrast T1-weighted images is a promising approach to obtain quantitative in vivo biomarkers reflecting the proliferative potential in meningiomas. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
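
    The first-order histogram profiling discussed here (skewness, entropy of the signal intensity distribution within the tumor) can be illustrated with a short, hedged sketch. The bin count, the use of SciPy, and the toy data are assumptions, not details taken from the study.

    ```python
    import numpy as np
    from scipy.stats import skew

    def histogram_profile(volume, mask, bins=128):
        """First-order histogram parameters of the voxels inside a tumor mask."""
        values = volume[mask > 0].ravel().astype(np.float64)
        hist, _ = np.histogram(values, bins=bins)
        p = hist / hist.sum()                 # normalised histogram
        p = p[p > 0]
        entropy = -np.sum(p * np.log2(p))     # Shannon entropy in bits
        return {"mean": values.mean(),
                "skewness": skew(values),
                "entropy": entropy}

    # toy example: synthetic "post-contrast" intensities inside a cubic mask
    vol = np.random.gamma(shape=2.0, scale=100.0, size=(64, 64, 64))
    msk = np.zeros(vol.shape, dtype=bool)
    msk[20:40, 20:40, 20:40] = True
    print(histogram_profile(vol, msk))
    ```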

  11. Comparison of image enhancement methods for the effective diagnosis in successive whole-body bone scans.

    PubMed

    Jeong, Chang Bu; Kim, Kwang Gi; Kim, Tae Sung; Kim, Seok Ki

    2011-06-01

    Whole-body bone scan is one of the most frequent diagnostic procedures in nuclear medicine. In particular, it plays a significant role in important procedures such as the diagnosis of osseous metastasis and the evaluation of osseous tumor response to chemotherapy and radiation therapy. It can also be used to monitor the possibility of any recurrence of the tumor. However, it is a very time-consuming effort for radiologists to quantify subtle interval changes between successive whole-body bone scans because of many variations such as intensity, geometry, and morphology. In this paper, we present the most effective method of image enhancement based on histograms, which may assist radiologists in interpreting successive whole-body bone scans effectively. Forty-eight successive whole-body bone scans from 10 patients were obtained and evaluated using six methods of image enhancement based on histograms: histogram equalization, brightness-preserving bi-histogram equalization, contrast-limited adaptive histogram equalization, end-in search, histogram matching, and exact histogram matching (EHM). Comparison of the results of the different methods was made using three similarity measures: peak signal-to-noise ratio, histogram intersection, and structural similarity. Image enhancement of successive bone scans using EHM showed the best results of the six methods for all similarity measures. EHM is the best method of image enhancement based on histograms for diagnosing successive whole-body bone scans. The method for successive whole-body bone scans has the potential to greatly assist radiologists in quantifying interval changes more accurately and quickly by compensating for the variable nature of intensity information. Consequently, it can improve radiologists' diagnostic accuracy as well as reduce reading time for detecting interval changes.
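
    Two of the three similarity measures named above are simple to state; the sketch below gives minimal NumPy versions of histogram intersection and PSNR for 8-bit images. The bin count and the assumption of equally sized images are illustrative choices, not details from the paper.

    ```python
    import numpy as np

    def histogram_intersection(img_a, img_b, bins=256):
        """Histogram intersection of two equally sized 8-bit images, in [0, 1]."""
        h_a, _ = np.histogram(img_a, bins=bins, range=(0, 256))
        h_b, _ = np.histogram(img_b, bins=bins, range=(0, 256))
        return np.minimum(h_a, h_b).sum() / h_a.sum()

    def psnr(img_a, img_b, peak=255.0):
        """Peak signal-to-noise ratio in dB."""
        mse = np.mean((img_a.astype(np.float64) - img_b.astype(np.float64)) ** 2)
        return np.inf if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)
    ```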

  12. Face recognition algorithm using extended vector quantization histogram features.

    PubMed

    Yan, Yan; Lee, Feifei; Wu, Xueqian; Chen, Qiu

    2018-01-01

    In this paper, we propose a face recognition algorithm based on a combination of vector quantization (VQ) and Markov stationary features (MSF). The VQ algorithm has been shown to be an effective method for generating features; it extracts a codevector histogram as a facial feature representation for face recognition. Still, the VQ histogram features are unable to convey spatial structural information, which to some extent limits their usefulness in discrimination. To alleviate this limitation of VQ histograms, we utilize Markov stationary features (MSF) to extend the VQ histogram-based features so as to add spatial structural information. We demonstrate the effectiveness of our proposed algorithm by achieving recognition results superior to those of several state-of-the-art methods on publicly available face databases.
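
    As a hedged illustration of the codevector-histogram idea (not the authors' exact pipeline, which further extends the histogram with Markov stationary features), the sketch below builds a small codebook from image patches with k-means and represents a face by the histogram of nearest codevectors. The patch size, codebook size, and use of scipy.cluster.vq are assumptions.

    ```python
    import numpy as np
    from scipy.cluster.vq import kmeans2, vq

    def extract_patches(image, size=4):
        """Non-overlapping size x size patches flattened into vectors."""
        h, w = image.shape
        patches = [image[r:r + size, c:c + size].ravel()
                   for r in range(0, h - size + 1, size)
                   for c in range(0, w - size + 1, size)]
        return np.asarray(patches, dtype=np.float64)

    def vq_histogram(image, codebook):
        """Normalised histogram of nearest-codevector indices: the VQ feature."""
        codes, _ = vq(extract_patches(image), codebook)
        hist = np.bincount(codes, minlength=len(codebook)).astype(np.float64)
        return hist / hist.sum()

    # a codebook would be trained on patches pooled from training faces, e.g.
    # codebook, _ = kmeans2(np.vstack([extract_patches(f) for f in train_faces]),
    #                       k=64, minit='++', seed=0)
    ```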

  13. Ultrasonic histogram assessment of early response to concurrent chemo-radiotherapy in patients with locally advanced cervical cancer: a feasibility study.

    PubMed

    Xu, Yan; Ru, Tong; Zhu, Lijing; Liu, Baorui; Wang, Huanhuan; Zhu, Li; He, Jian; Liu, Song; Zhou, Zhengyang; Yang, Xiaofeng

    To monitor early response of locally advanced cervical cancers undergoing concurrent chemo-radiotherapy (CCRT) by ultrasonic histogram. B-mode ultrasound examinations were performed at 4 time points in thirty-four patients during CCRT. Six ultrasonic histogram parameters were used to assess the echogenicity, homogeneity and heterogeneity of tumors. Ipeak increased rapidly from the first week after therapy initiation, whereas Wlow, Whigh and Ahigh changed significantly at the second week. The average ultrasonic histogram progressively moved toward the right and converted into a more symmetrical shape. The ultrasonic histogram could serve as a potential marker to monitor early response during CCRT. Copyright © 2018 Elsevier Inc. All rights reserved.

  14. Face verification system for Android mobile devices using histogram based features

    NASA Astrophysics Data System (ADS)

    Sato, Sho; Kobayashi, Kazuhiro; Chen, Qiu

    2016-07-01

    This paper proposes a face verification system that runs on Android mobile devices. In this system, a facial image is first captured by a built-in camera on the Android device, and face detection is then implemented using Haar-like features and the AdaBoost learning algorithm. The proposed system verifies the detected face using histogram-based features, which are generated by a binary Vector Quantization (VQ) histogram using DCT coefficients in low-frequency domains, as well as an Improved Local Binary Pattern (Improved LBP) histogram in the spatial domain. Verification results with the different types of histogram-based features are first obtained separately and then combined by weighted averaging. We evaluate the proposed algorithm using the publicly available ORL database and facial images captured by an Android tablet.

  15. Measurement of susceptibility artifacts with histogram-based reference value on magnetic resonance images according to standard ASTM F2119.

    PubMed

    Heinrich, Andreas; Teichgräber, Ulf K; Güttler, Felix V

    2015-12-01

    The standard ASTM F2119 describes a test method for measuring the size of a susceptibility artifact based on the example of a passive implant. A pixel in an image is considered to be a part of an image artifact if the intensity is changed by at least 30% in the presence of a test object, compared to a reference image in which the test object is absent (reference value). The aim of this paper is to simplify and accelerate the test method using a histogram-based reference value. Four test objects were scanned parallel and perpendicular to the main magnetic field, and the largest susceptibility artifacts were measured using two methods of reference value determination (reference image-based and histogram-based reference value). The results between both methods were compared using the Mann-Whitney U-test. The difference between both reference values was 42.35 ± 23.66. The difference of artifact size was 0.64 ± 0.69 mm. The artifact sizes of both methods did not show significant differences; the p-value of the Mann-Whitney U-test was between 0.710 and 0.521. A standard-conform method for a rapid, objective, and reproducible evaluation of susceptibility artifacts could be implemented. The result of the histogram-based method does not significantly differ from the ASTM-conform method.
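
    The 30% intensity-change criterion described in this record lends itself to a direct implementation. The sketch below flags artifact pixels given a test-object image and either a registered reference image (standard method) or a single histogram-derived reference value (the simplified method discussed here); the modal-intensity estimator for that scalar reference is an assumption for illustration.

    ```python
    import numpy as np

    def artifact_mask(test_img, reference, threshold=0.30):
        """Pixels whose intensity changes by at least `threshold` (30% per
        ASTM F2119) relative to the reference are flagged as artifact.

        `reference` may be a registered reference image or a single
        histogram-based reference value.
        """
        ref = np.asarray(reference, dtype=np.float64)
        return np.abs(test_img.astype(np.float64) - ref) >= threshold * ref

    def histogram_reference_value(reference_img, bins=256):
        """Illustrative scalar reference: the modal intensity of a reference scan."""
        hist, edges = np.histogram(reference_img, bins=bins)
        mode_bin = np.argmax(hist)
        return 0.5 * (edges[mode_bin] + edges[mode_bin + 1])
    ```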

  16. Multi-site Study of Diffusion Metric Variability: Characterizing the Effects of Site, Vendor, Field Strength, and Echo Time using the Histogram Distance

    PubMed Central

    Helmer, K. G.; Chou, M-C.; Preciado, R. I.; Gimi, B.; Rollins, N. K.; Song, A.; Turner, J.; Mori, S.

    2016-01-01

    MRI-based multi-site trials now routinely include some form of diffusion-weighted imaging (DWI) in their protocol. These studies can include data originating from scanners built by different vendors, each with their own set of unique protocol restrictions, including restrictions on the number of available gradient directions, whether an externally-generated list of gradient directions can be used, and restrictions on the echo time (TE). One challenge of multi-site studies is to create a common imaging protocol that will result in a reliable and accurate set of diffusion metrics. The present study describes the effect of site, scanner vendor, field strength, and TE on two common metrics: the first moment of the diffusion tensor field (mean diffusivity, MD), and the fractional anisotropy (FA). We have shown in earlier work that ROI metrics and the means of MD and FA histograms are not sufficiently sensitive for use in site characterization. Here we use the distance between whole-brain histograms of FA and MD to investigate within- and between-site effects. We concluded that the variability of DTI metrics due to site, vendor, field strength, and echo time could influence the results in multi-center trials and that the histogram distance is a sensitive metric for each of these variables. PMID:27350723
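
    A minimal sketch of comparing whole-brain FA or MD histograms by a histogram distance is given below. The specific distance (here an L1 distance between normalised histograms) and the binning are illustrative assumptions, since the abstract does not specify them.

    ```python
    import numpy as np

    def whole_brain_histogram(metric_map, brain_mask, bins=200, value_range=(0.0, 1.0)):
        """Normalised histogram of FA (or MD) values within the brain mask.

        The default range suits FA; for MD pass a suitable range,
        e.g. value_range=(0.0, 4e-3) mm^2/s.
        """
        vals = metric_map[brain_mask > 0]
        hist, _ = np.histogram(vals, bins=bins, range=value_range)
        return hist / hist.sum()

    def histogram_distance(h1, h2):
        """L1 distance between normalised histograms (0 = identical, 2 = disjoint)."""
        return np.abs(h1 - h2).sum()

    # e.g. compare FA histograms acquired at two sites:
    # d = histogram_distance(whole_brain_histogram(fa_site1, mask1),
    #                        whole_brain_histogram(fa_site2, mask2))
    ```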

  17. General solution for high dynamic range three-dimensional shape measurement using the fringe projection technique

    NASA Astrophysics Data System (ADS)

    Feng, Shijie; Zhang, Yuzhen; Chen, Qian; Zuo, Chao; Li, Rubin; Shen, Guochen

    2014-08-01

    This paper presents a general solution for realizing high dynamic range three-dimensional (3-D) shape measurement based on fringe projection. Three concrete techniques are involved in the solution for measuring objects with a large range of reflectivity (LRR) or with a shiny specular surface. For the first technique, the measured surface reflectivities are sub-divided into several groups based on their histogram distribution; the optimal exposure time for each group can then be predicted adaptively so that both bright and dark areas on the measured surface can be handled without compromise. Phase-shifted images are then captured at the calculated exposure times and a composite phase-shifted image is generated by extracting the optimally exposed pixels from the raw fringe images. For the second technique, the first technique is extended by introducing two orthogonal polarizers placed in front of the camera and the projector, respectively, and the third is developed by combining the second technique with the strategy of properly altering the angle between the transmission axes of the two polarizers. Experimental results show that the first technique can effectively improve the measurement accuracy of diffuse objects with LRR, the second is capable of measuring objects with weak specular reflection (WSR: e.g. shiny plastic surface) and the third can inspect surfaces with strong specular reflection (SSR: e.g. highlights on aluminum alloy) precisely. Further, more complex scenes, such as one with LRR and WSR, or even one simultaneously involving LRR, WSR and SSR, can be measured accurately by the proposed solution.
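
    The compositing step of the first technique can be sketched as follows, under simplifying assumptions: fringe images are captured at several exposure times, and for every pixel the longest exposure that is still unsaturated in all phase-shifted frames is kept. The saturation limit and this "brightest unsaturated" selection rule are stand-ins for the paper's histogram-based reflectivity grouping, not its exact procedure.

    ```python
    import numpy as np

    def composite_fringes(exposure_stack, saturation=250):
        """Fuse phase-shifted fringe images captured at several exposure times.

        exposure_stack: array of shape (n_exposures, n_phase_shifts, H, W),
        ordered from shortest to longest exposure.
        """
        n_exp = exposure_stack.shape[0]
        # a pixel is usable at exposure e if no phase-shifted frame saturates there
        usable = (exposure_stack < saturation).all(axis=1)        # (n_exp, H, W)
        # index of the longest usable exposure (fall back to the shortest one)
        idx = np.where(usable.any(axis=0),
                       (n_exp - 1) - np.argmax(usable[::-1], axis=0),
                       0)
        return np.take_along_axis(exposure_stack,
                                  idx[None, None, :, :], axis=0)[0]
    ```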

  18. An Improved Pathological Brain Detection System Based on Two-Dimensional PCA and Evolutionary Extreme Learning Machine.

    PubMed

    Nayak, Deepak Ranjan; Dash, Ratnakar; Majhi, Banshidhar

    2017-12-07

    Pathological brain detection has made notable strides in the past years; as a consequence, many pathological brain detection systems (PBDSs) have been proposed. However, the accuracy of these systems still needs significant improvement in order to meet the demands of real-world diagnostic situations. In this paper, an efficient PBDS based on MR images is proposed that markedly improves on recent results. The proposed system makes use of contrast limited adaptive histogram equalization (CLAHE) to enhance the quality of the input MR images. Thereafter, a two-dimensional PCA (2DPCA) strategy is employed to extract the features and subsequently, a PCA+LDA approach is used to generate a compact and discriminative feature set. Finally, a new learning algorithm called MDE-ELM is suggested that combines modified differential evolution (MDE) and extreme learning machine (ELM) for segregation of MR images as pathological or healthy. The MDE is utilized to optimize the input weights and hidden biases of single-hidden-layer feed-forward neural networks (SLFN), whereas an analytical method is used for determining the output weights. The proposed algorithm performs optimization based on both the root mean squared error (RMSE) and the norm of the output weights of SLFNs. The suggested scheme is benchmarked on three standard datasets and the results are compared against other competent schemes. The experimental outcomes show that the proposed scheme offers superior results compared to its counterparts. Further, it has been noticed that the proposed MDE-ELM classifier obtains better accuracy with a more compact network architecture than conventional algorithms.

  19. Combining Vector Quantization and Histogram Equalization.

    ERIC Educational Resources Information Center

    Cosman, Pamela C.; And Others

    1992-01-01

    Discussion of contrast enhancement techniques focuses on the use of histogram equalization with a data compression technique, i.e., tree-structured vector quantization. The enhancement technique of intensity windowing is described, and the use of enhancement techniques for medical images is explained, including adaptive histogram equalization.…

  20. Histogram and gray level co-occurrence matrix on gray-scale ultrasound images for diagnosing lymphocytic thyroiditis.

    PubMed

    Shin, Young Gyung; Yoo, Jaeheung; Kwon, Hyeong Ju; Hong, Jung Hwa; Lee, Hye Sun; Yoon, Jung Hyun; Kim, Eun-Kyung; Moon, Hee Jung; Han, Kyunghwa; Kwak, Jin Young

    2016-08-01

    The objective of the study was to evaluate whether texture analysis using histogram and gray level co-occurrence matrix (GLCM) parameters can help clinicians diagnose lymphocytic thyroiditis (LT) and differentiate LT according to pathologic grade. The background thyroid pathology of 441 patients was classified into no evidence of LT, chronic LT (CLT), and Hashimoto's thyroiditis (HT). Histogram and GLCM parameters were extracted from the regions of interest on ultrasound. The diagnostic performances of the parameters for diagnosing and differentiating LT were calculated. Of the histogram and GLCM parameters, the histogram mean had the highest Az (0.63) and VUS (0.303). As the degree of LT increased, the mean decreased and the standard deviation and entropy increased. The histogram mean from gray-scale ultrasound showed the best diagnostic performance as a single parameter in differentiating LT according to pathologic grade as well as in diagnosing LT. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Whole-Lesion Histogram Analysis of Apparent Diffusion Coefficient for the Assessment of Cervical Cancer.

    PubMed

    Guan, Yue; Shi, Hua; Chen, Ying; Liu, Song; Li, Weifeng; Jiang, Zhuoran; Wang, Huanhuan; He, Jian; Zhou, Zhengyang; Ge, Yun

    2016-01-01

    The aim of this study was to explore the application of whole-lesion histogram analysis of apparent diffusion coefficient (ADC) values in cervical cancer. A total of 54 women (mean age, 53 years) with cervical cancers underwent 3-T diffusion-weighted imaging with b values of 0 and 800 s/mm² prospectively. Whole-lesion histogram analysis of ADC values was performed. A paired sample t test was used to compare differences in ADC histogram parameters between cervical cancers and normal cervical tissues. Receiver operating characteristic curves were constructed to identify the optimal threshold of each parameter. All histogram parameters in this study, including ADCmean, ADCmin, ADC10%-ADC90%, mode, skewness, and kurtosis, of cervical cancers were significantly lower than those of normal cervical tissues (all P < 0.0001). ADC90% had the largest area under the receiver operating characteristic curve (0.996). Whole-lesion histogram analysis of ADC maps is useful in the assessment of cervical cancer.
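
    A sketch of the whole-lesion ADC histogram parameters listed above (mean, minimum, deciles, mode, skewness, kurtosis) is shown below; the binning used for the mode and the use of SciPy are assumptions.

    ```python
    import numpy as np
    from scipy.stats import skew, kurtosis

    def adc_histogram_parameters(adc_map, lesion_mask, bins=100):
        """Whole-lesion histogram parameters of an ADC map."""
        vals = adc_map[lesion_mask > 0].astype(np.float64)
        hist, edges = np.histogram(vals, bins=bins)
        mode = 0.5 * (edges[np.argmax(hist)] + edges[np.argmax(hist) + 1])
        deciles = {f"ADC{p}%": np.percentile(vals, p) for p in range(10, 100, 10)}
        return {"ADCmean": vals.mean(), "ADCmin": vals.min(), "mode": mode,
                "skewness": skew(vals), "kurtosis": kurtosis(vals), **deciles}
    ```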

  2. Time-cumulated visible and infrared radiance histograms used as descriptors of surface and cloud variations

    NASA Technical Reports Server (NTRS)

    Seze, Genevieve; Rossow, William B.

    1991-01-01

    The spatial and temporal stability of the distributions of satellite-measured visible and infrared radiances, caused by variations in clouds and surfaces, are investigated using bidimensional and monodimensional histograms and time-composite images. Similar analysis of the histograms of the original and time-composite images provides separation of the contributions of the space and time variations to the total variations. The variability of both the surfaces and clouds is found to be larger at scales much larger than the minimum resolved by satellite imagery. This study shows that the shapes of these histograms are distinctive characteristics of the different climate regimes and that particular attributes of these histograms can be related to several general, though not universal, properties of clouds and surface variations at regional and synoptic scales. There are also significant exceptions to these relationships in particular climate regimes. The characteristics of these radiance histograms provide a stable well defined descriptor of the cloud and surface properties.

  3. [Clinical application of MRI histogram in evaluation of muscle fatty infiltration].

    PubMed

    Zheng, Y M; Du, J; Li, W Z; Wang, Z X; Zhang, W; Xiao, J X; Yuan, Y

    2016-10-18

    To describe a method based on analysis of the histogram of intensity values produced from magnetic resonance imaging (MRI) for quantifying the degree of fatty infiltration. The study included 25 patients with dystrophinopathy. All subjects underwent muscle MRI at thigh level. The histogram M values of 250 muscles, adjusted for subcutaneous fat and representing the degree of fatty infiltration, were compared with expert visual reading using the modified Mercuri scale. There was a significant positive correlation between the histogram M values and the scores of visual reading (r=0.854, P<0.001). The distinct pattern of muscle involvement detected in the patients with dystrophinopathy in our study of histogram M values was similar to that of visual reading and to results in the literature. The histogram M values had stronger correlations with the clinical data than the scores of visual reading: the correlations with age were r=0.730 (P<0.001) and r=0.753 (P<0.001), and with knee extensor strength r=-0.468 (P=0.024) and r=-0.460 (P=0.027), respectively. Meanwhile, histogram M value analysis had better repeatability than visual reading, with intraclass correlation coefficients of 0.998 (95% CI: 0.997-0.998, P<0.001) and 0.958 (95% CI: 0.946-0.967, P<0.001), respectively. Histogram M value analysis of MRI, with its advantages of repeatability and objectivity, can be used to evaluate the degree of muscle fatty infiltration.

  4. Adaptive histogram equalization in digital radiography of destructive skeletal lesions.

    PubMed

    Braunstein, E M; Capek, P; Buckwalter, K; Bland, P; Meyer, C R

    1988-03-01

    Adaptive histogram equalization, an image-processing technique that distributes pixel values of an image uniformly throughout the gray scale, was applied to 28 plain radiographs of bone lesions, after they had been digitized. The non-equalized and equalized digital images were compared by two skeletal radiologists with respect to lesion margins, internal matrix, soft-tissue mass, cortical breakthrough, and periosteal reaction. Receiver operating characteristic (ROC) curves were constructed on the basis of the responses. Equalized images were superior to nonequalized images in determination of cortical breakthrough and presence or absence of periosteal reaction. ROC analysis showed no significant difference in determination of margins, matrix, or soft-tissue masses.

  5. [Registration and 3D rendering of serial tissue section images].

    PubMed

    Liu, Zhexing; Jiang, Guiping; Dong, Wu; Zhang, Yu; Xie, Xiaomian; Hao, Liwei; Wang, Zhiyuan; Li, Shuxiang

    2002-12-01

    Reconstructing 3D images from serial tissue section images is an important morphological research method, and registration of the serial images is a key step in the 3D reconstruction. First, the segmentation-counting registration algorithm, which is based on the joint histogram, is introduced. After thresholding the two images to be registered, the criterion function is defined as a count within a specific region of the joint histogram, which greatly speeds up the alignment process. The method is then used for the serial tissue image matching task, laying a solid foundation for 3D rendering. Finally, preliminary surface rendering results are presented.
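
    The counting criterion described above can be sketched as follows: after thresholding both images, the similarity is a count over a region of their joint histogram. The particular region counted here (the two diagonal cells of the 2x2 joint histogram of the binarised images, i.e. label agreement) and the exhaustive translation search are illustrative assumptions.

    ```python
    import numpy as np

    def counting_criterion(fixed, moving, t_fixed, t_moving):
        """Count of pixels whose binarised labels agree, i.e. the mass in the
        diagonal cells of the 2x2 joint histogram of the thresholded images."""
        a = fixed >= t_fixed
        b = moving >= t_moving
        return np.count_nonzero(a == b)

    def register_translation(fixed, moving, t_fixed, t_moving, search=10):
        """Exhaustive search over small integer shifts maximising the criterion."""
        best, best_shift = -1, (0, 0)
        for dy in range(-search, search + 1):
            for dx in range(-search, search + 1):
                shifted = np.roll(np.roll(moving, dy, axis=0), dx, axis=1)
                score = counting_criterion(fixed, shifted, t_fixed, t_moving)
                if score > best:
                    best, best_shift = score, (dy, dx)
        return best_shift, best
    ```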

  6. The Histogram-Area Connection

    ERIC Educational Resources Information Center

    Gratzer, William; Carpenter, James E.

    2008-01-01

    This article demonstrates an alternative approach to the construction of histograms--one based on the notion of using area to represent relative density in intervals of unequal length. The resulting histograms illustrate the connection between the area of the rectangles associated with particular outcomes and the relative frequency (probability)…

  7. Investigating Student Understanding of Histograms

    ERIC Educational Resources Information Center

    Kaplan, Jennifer J.; Gabrosek, John G.; Curtiss, Phyllis; Malone, Chris

    2014-01-01

    Histograms are adept at revealing the distribution of data values, especially the shape of the distribution and any outlier values. They are included in introductory statistics texts, research methods texts, and in the popular press, yet students often have difficulty interpreting the information conveyed by a histogram. This research identifies…

  8. Visual vs Fully Automatic Histogram-Based Assessment of Idiopathic Pulmonary Fibrosis (IPF) Progression Using Sequential Multidetector Computed Tomography (MDCT)

    PubMed Central

    Colombi, Davide; Dinkel, Julien; Weinheimer, Oliver; Obermayer, Berenike; Buzan, Teodora; Nabers, Diana; Bauer, Claudia; Oltmanns, Ute; Palmowski, Karin; Herth, Felix; Kauczor, Hans Ulrich; Sverzellati, Nicola

    2015-01-01

    Objectives To describe changes over time in the extent of idiopathic pulmonary fibrosis (IPF) at multidetector computed tomography (MDCT) assessed by semi-quantitative visual scores (VSs) and fully automatic histogram-based quantitative evaluation, and to test the relationship between these two methods of quantification. Methods Forty IPF patients (median age: 70 y, interquartile: 62-75 years; M:F, 33:7) who underwent two MDCT examinations at different time points with a median interval of 13 months (interquartile: 10-17 months) were retrospectively evaluated. The in-house software YACTA automatically quantified the lung density histogram (10th-90th percentiles in 5th-percentile steps). Longitudinal changes in VSs and in the percentiles of the attenuation histogram were obtained in 20 untreated patients and 20 patients treated with pirfenidone. Pearson correlation analysis was used to test the relationship between VSs and selected percentiles. Results In follow-up MDCT, visual overall extent of parenchymal abnormalities (OE) increased in median by 5 %/year (interquartile: 0 %/y; +11 %/y). A substantial difference was found between treated and untreated patients in HU changes of the 40th and of the 80th percentiles of the density histogram. Correlation analysis between VSs and selected percentiles showed higher correlation between the changes (Δ) in OE and Δ 40th percentile (r=0.69; p<0.001) as compared to Δ 80th percentile (r=0.58; p<0.001); closer correlation was found between Δ ground-glass extent and Δ 40th percentile (r=0.66, p<0.001) as compared to Δ 80th percentile (r=0.47, p=0.002), while the Δ reticulations correlated better with the Δ 80th percentile (r=0.56, p<0.001) in comparison to Δ 40th percentile (r=0.43, p=0.003). Conclusions There is a relevant and fully automatically measurable difference at MDCT in VSs and in histogram analysis at one-year follow-up of IPF patients, whether treated or untreated: Δ 40th percentile might reflect the change in overall extent of lung abnormalities, notably of ground-glass pattern; furthermore Δ 80th percentile might reveal the course of reticular opacities. PMID:26110421

  9. Breast lesion characterization using whole-lesion histogram analysis with stretched-exponential diffusion model.

    PubMed

    Liu, Chunling; Wang, Kun; Li, Xiaodan; Zhang, Jine; Ding, Jie; Spuhler, Karl; Duong, Timothy; Liang, Changhong; Huang, Chuan

    2018-06-01

    Diffusion-weighted imaging (DWI) has been studied in breast imaging and can provide more information about diffusion, perfusion and other physiological interests than standard pulse sequences. The stretched-exponential model has previously been shown to be more reliable than conventional DWI techniques, but different diagnostic sensitivities were found from study to study. This work investigated the characteristics of whole-lesion histogram parameters derived from the stretched-exponential diffusion model for benign and malignant breast lesions, compared them with the conventional apparent diffusion coefficient (ADC), and further determined which histogram metrics can best be used to differentiate malignant from benign lesions. This was a prospective study. Seventy females were included in the study. Multi-b value DWI was performed on a 1.5T scanner. Histogram parameters of whole lesions for the distributed diffusion coefficient (DDC), heterogeneity index (α), and ADC were calculated by two radiologists and compared among benign lesions, ductal carcinoma in situ (DCIS), and invasive carcinoma confirmed by pathology. Nonparametric tests were performed for comparisons among invasive carcinoma, DCIS, and benign lesions. Comparisons of receiver operating characteristic (ROC) curves were performed to show the ability to discriminate malignant from benign lesions. The majority of histogram parameters (mean/min/max, skewness/kurtosis, 10th-90th percentile values) from DDC, α, and ADC were significantly different among invasive carcinoma, DCIS, and benign lesions. DDC10% (area under the curve [AUC] = 0.931), ADC10% (AUC = 0.893), and αmean (AUC = 0.787) were found to be the best metrics in differentiating benign from malignant tumors among all histogram parameters derived from ADC and α, respectively. The combination of DDC10% and αmean, using logistic regression, yielded the highest sensitivity (90.2%) and specificity (95.5%). DDC10% and αmean derived from the stretched-exponential model provide more information and better diagnostic performance in differentiating malignancy from benign lesions than ADC parameters derived from a monoexponential model. Level of Evidence: 2. Technical Efficacy: Stage 2. J. Magn. Reson. Imaging 2018;47:1701-1710. © 2017 International Society for Magnetic Resonance in Medicine.
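
    For orientation, the sketch below fits the stretched-exponential signal model S(b) = S0·exp(−(b·DDC)^α) for a single voxel and notes how a whole-lesion percentile summary such as DDC10% would follow; the synthetic data, b-value units, starting guesses, and bounds are assumptions, not values from the study.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def stretched_exp(b, s0, ddc, alpha):
        """Stretched-exponential DWI model: S(b) = S0 * exp(-(b * DDC)**alpha)."""
        return s0 * np.exp(-(b * ddc) ** alpha)

    b_values = np.array([0, 50, 100, 200, 400, 600, 800, 1000], float)   # s/mm^2
    signal = np.array([1000, 960, 925, 870, 780, 705, 640, 585], float)  # toy voxel

    popt, _ = curve_fit(stretched_exp, b_values, signal,
                        p0=(signal[0], 1.0e-3, 0.9),
                        bounds=([0, 1e-6, 0.1], [np.inf, 5e-3, 1.0]))
    s0, ddc, alpha = popt   # DDC in mm^2/s, alpha dimensionless in (0, 1]

    # A whole-lesion summary such as DDC10% would then be
    # np.percentile(ddc_map[lesion_mask > 0], 10) over the voxel-wise DDC map.
    ```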

  10. Thresholding histogram equalization.

    PubMed

    Chuang, K S; Chen, S; Hwang, I M

    2001-12-01

    The drawbacks of adaptive histogram equalization techniques are the loss of definition on the edges of the object and overenhancement of noise in the images. These drawbacks can be avoided if the noise is excluded in the equalization transformation function computation. A method has been developed to separate the histogram into zones, each with its own equalization transformation. This method can be used to suppress the nonanatomic noise and enhance only certain parts of the object. This method can be combined with other adaptive histogram equalization techniques. Preliminary results indicate that this method can produce images with superior contrast.
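
    A minimal sketch of the zoned-equalisation idea is given below: the intensity histogram is split into zones and each zone is equalised within its own range, so noise in one zone (e.g. background) does not influence the mapping of another. The zone boundaries here are hypothetical, and this is not the authors' exact formulation.

    ```python
    import numpy as np

    def zoned_histogram_equalization(image, boundaries=(0, 64, 192, 256)):
        """Equalise each intensity zone of an 8-bit image separately."""
        out = image.copy()
        for lo, hi in zip(boundaries[:-1], boundaries[1:]):
            sel = (image >= lo) & (image < hi)
            if not np.any(sel):
                continue
            vals = image[sel]
            hist, _ = np.histogram(vals, bins=hi - lo, range=(lo, hi))
            cdf = np.cumsum(hist) / hist.sum()
            # remap values within [lo, hi) using the zone's own equalisation
            out[sel] = (lo + cdf[vals - lo] * (hi - 1 - lo)).astype(image.dtype)
        return out
    ```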

  11. Histogram-based quantitative evaluation of endobronchial ultrasonography images of peripheral pulmonary lesion.

    PubMed

    Morikawa, Kei; Kurimoto, Noriaki; Inoue, Takeo; Mineshita, Masamichi; Miyazawa, Teruomi

    2015-01-01

    Endobronchial ultrasonography using a guide sheath (EBUS-GS) is an increasingly common bronchoscopic technique, but currently, no methods have been established to quantitatively evaluate EBUS images of peripheral pulmonary lesions. The purpose of this study was to evaluate whether histogram data collected from EBUS-GS images can contribute to the diagnosis of lung cancer. Histogram-based analyses focusing on the brightness of EBUS images were retrospectively conducted: 60 patients (38 lung cancer; 22 inflammatory diseases) with clear EBUS images were included. For each patient, a 400-pixel region of interest, typically located at a 3- to 5-mm radius from the probe, was selected from EBUS images recorded during bronchoscopy. Histogram height, width, height/width ratio, standard deviation, kurtosis and skewness were investigated as diagnostic indicators. Median histogram height, width, height/width ratio and standard deviation were significantly different between lung cancer and benign lesions (all p < 0.01). With a cutoff value for standard deviation of 10.5, lung cancer could be diagnosed with an accuracy of 81.7%. The other characteristics investigated were inferior to the histogram standard deviation. The histogram standard deviation appears to be the most useful characteristic for diagnosing lung cancer using EBUS images. © 2015 S. Karger AG, Basel.

  12. Web servlet-assisted, dial-in flow cytometry data analysis.

    PubMed

    Battye, F

    2001-02-01

    The obvious benefits of centralized data storage notwithstanding, the size of modern flow cytometry data files discourages their transmission over commonly used telephone modem connections. The proposed solution is to install at the central location a web servlet that can extract compact data arrays, of a form dependent on the requested display type, from the stored files and transmit them to a remote client computer program for display. A client program and a web servlet, both written in the Java programming language, were designed to communicate over standard network connections. The client program creates familiar numerical and graphical display types and allows the creation of gates from combinations of user-defined regions. Data compression techniques further reduce transmission times for data arrays that are already much smaller than the data file itself. For typical data files, network transmission times were reduced more than 700-fold for extraction of one-dimensional (1-D) histograms, between 18 and 120-fold for 2-D histograms, and 6-fold for color-coded dot plots. Numerous display formats are possible without further access to the data file. This scheme enables telephone modem access to centrally stored data without restricting flexibility of display format or preventing comparisons with locally stored files. Copyright 2001 Wiley-Liss, Inc.

  13. Tumor segmentation of multi-echo MR T2-weighted images with morphological operators

    NASA Astrophysics Data System (ADS)

    Torres, W.; Martín-Landrove, M.; Paluszny, M.; Figueroa, G.; Padilla, G.

    2009-02-01

    In the present work an automatic brain tumor segmentation procedure based on mathematical morphology is proposed. The approach considers sequences of eight multi-echo MR T2-weighted images. The relaxation time T2 characterizes the relaxation of water protons in the brain tissue: white matter, gray matter, cerebrospinal fluid (CSF) or pathological tissue. Image data is initially regularized by the application of a log-convex filter in order to adjust its geometrical properties to those of noiseless data, which exhibits monotonically decreasing convex behavior. The regularized data is then analyzed by means of an 8-dimensional morphological eccentricity filter. In a first stage, the filter was used for spatial homogenization of the tissues in the image, replacing each pixel by the most representative pixel within its structuring element, i.e. the one which exhibits the minimum total distance to all members of the structuring element. On the filtered images, the relaxation time T2 is estimated by means of a least-squares regression algorithm and the histogram of T2 is determined. The T2 histogram was partitioned using the watershed morphological operator; relaxation time classes were established and used for tissue classification and segmentation of the image. The method was validated on 15 sets of MRI data with excellent results.

  14. Environmental justice assessment for transportation : risk analysis

    DOT National Transportation Integrated Search

    1999-04-01

    This paper presents methods of comparing populations and their racial/ethnic compositions using tabulations, histograms, and Chi Squared tests for statistical significance of differences found. Two examples of these methods are presented: comparison ...

  15. Histogram of gradient and binarized statistical image features of wavelet subband-based palmprint features extraction

    NASA Astrophysics Data System (ADS)

    Attallah, Bilal; Serir, Amina; Chahir, Youssef; Boudjelal, Abdelwahhab

    2017-11-01

    Palmprint recognition systems are dependent on feature extraction. A method of feature extraction using higher discrimination information was developed to characterize palmprint images. In this method, two individual feature extraction techniques are applied to a discrete wavelet transform of a palmprint image, and their outputs are fused. The two techniques used in the fusion are the histogram of gradient and the binarized statistical image features. They are then evaluated using an extreme learning machine classifier before selecting a feature based on principal component analysis. Three palmprint databases, the Hong Kong Polytechnic University (PolyU) Multispectral Palmprint Database, Hong Kong PolyU Palmprint Database II, and the Delhi Touchless (IIDT) Palmprint Database, are used in this study. The study shows that our method effectively identifies and verifies palmprints and outperforms other methods based on feature extraction.

  16. Clinical outcomes using carbon-ion radiotherapy and dose-volume histogram comparison between carbon-ion radiotherapy and photon therapy for T2b-4N0M0 non-small cell lung cancer-A pilot study.

    PubMed

    Shirai, Katsuyuki; Kawashima, Motohiro; Saitoh, Jun-Ichi; Abe, Takanori; Fukata, Kyohei; Shigeta, Yuka; Irie, Daisuke; Shiba, Shintaro; Okano, Naoko; Ohno, Tatsuya; Nakano, Takashi

    2017-01-01

    The safety and efficacy of carbon-ion radiotherapy for advanced non-small cell lung cancer have not been established. We evaluated the clinical outcomes and dose-volume histogram parameters of carbon-ion radiotherapy compared with photon therapy in T2b-4N0M0 non-small cell lung cancer. Twenty-three patients were treated with carbon-ion radiotherapy between May 2011 and December 2015. Seven, 14, and 2 patients had T2b, T3, and T4, respectively. The median age was 78 (range, 53-91) years, with 22 male patients. There were 12 adenocarcinomas, 8 squamous cell carcinomas, 1 non-small cell lung carcinoma, and 2 clinically diagnosed lung cancers. Eleven patients were operable, and 12 patients were inoperable. Most patients (91%) were treated with carbon-ion radiotherapy of 60.0 Gy relative biological effectiveness (RBE) in 4 fractions or 64.0 Gy (RBE) in 16 fractions. Local control and overall survival rates were calculated. Dose-volume histogram parameters of normal lung and tumor coverages were compared between carbon-ion radiotherapy and photon therapies, including three-dimensional conformal radiotherapy (3DCRT) and intensity-modulated radiotherapy (IMRT). The median follow-up of surviving patients was 25 months. Three patients experienced local recurrence, and the 2-year local control rate was 81%. During follow-up, 5 patients died of lung cancer, and 1 died of intercurrent disease. The 2-year overall survival rate was 70%. Operable patients had a better overall survival rate compared with inoperable patients (100% vs. 43%; P = 0.04). There was no grade ≥2 radiation pneumonitis. In dose-volume histogram analysis, carbon-ion radiotherapy had a significantly lower dose to normal lung and greater tumor coverage compared with photon therapies. Carbon-ion radiotherapy was effectively and safely performed for T2b-4N0M0 non-small cell lung cancer, and the dose distribution was superior compared with those for photon therapies. A Japanese multi-institutional study is ongoing to prospectively evaluate these patients and establish the use of carbon-ion radiotherapy.

  17. Construction and Evaluation of Histograms in Teacher Training

    ERIC Educational Resources Information Center

    Bruno, A.; Espinel, M. C.

    2009-01-01

    This article details the results of a written test designed to reveal how education majors construct and evaluate histograms and frequency polygons. Included is a description of the mistakes made by the students which shows how they tend to confuse histograms with bar diagrams, incorrectly assign data along the Cartesian axes and experience…

  18. Empirical Histograms in Item Response Theory with Ordinal Data

    ERIC Educational Resources Information Center

    Woods, Carol M.

    2007-01-01

    The purpose of this research is to describe, test, and illustrate a new implementation of the empirical histogram (EH) method for ordinal items. The EH method involves the estimation of item response model parameters simultaneously with the approximation of the distribution of the random latent variable (theta) as a histogram. Software for the EH…

  19. Symbol recognition via statistical integration of pixel-level constraint histograms: a new descriptor.

    PubMed

    Yang, Su

    2005-02-01

    A new descriptor for symbol recognition is proposed. 1) A histogram is constructed for every pixel to figure out the distribution of the constraints among the other pixels. 2) All the histograms are statistically integrated to form a feature vector with fixed dimension. The robustness and invariance were experimentally confirmed.

  20. Airborne gamma-ray spectrometer and magnetometer survey, Durango D, Colorado. Final report Volume II A. Detail area

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1983-01-01

    This volume contains geology of the Durango D detail area, radioactive mineral occurrences in Colorado, and geophysical data interpretation. Eight appendices provide: stacked profiles, geologic histograms, geochemical histograms, speed and altitude histograms, geologic statistical tables, geochemical statistical tables, magnetic and ancillary profiles, and test line data.

  1. Airborne gamma-ray spectrometer and magnetometer survey, Durango C, Colorado. Final report Volume II A. Detail area

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1983-01-01

    Geology of Durango C detail area, radioactive mineral occurrences in Colorado, and geophysical data interpretation are included in this report. Eight appendices provide: stacked profiles, geologic histograms, geochemical histograms, speed and altitude histograms, geologic statistical tables, magnetic and ancillary profiles, and test line data.

  2. Comparison of three-dimensional vs. conventional radiotherapy in saving optic tract in paranasal sinus tumors.

    PubMed

    Kamian, S; Kazemian, A; Esfahani, M; Mohammadi, E; Aghili, M

    2010-01-01

    To assess the possibility of delivering a homogeneous irradiation with respect to the maximal tolerated dose to the optic pathway for paranasal sinus (PNS) tumors. Treatment planning with conformal three-dimensional (3D) and conventional two-dimensional (2D) techniques was done on CT scans of 20 patients who had early or advanced PNS tumors. Four cases had been previously irradiated. Dose-volume histograms (DVHs) for the planning target volume (PTV) and the visual pathway, including the globes, chiasma and optic nerves, were compared between the two treatment plans. The area under the curve (AUC) in the DVH of the globes on the same side and on the contralateral side of tumor involvement was significantly higher in 2D planning (p < 0.05), which caused a higher integral dose to both globes. Also, the AUC in the DVH of the chiasma was higher in 2D treatment planning (p = 0.002). The integral dose to the contralateral optic nerve was significantly lower with 3D planning (p = 0.007), but there was no significant difference for the optic nerve on the same side as the tumor (p > 0.05). The AUC in the DVH of the PTV was not significantly different (201.1 ± 16.23 mm³ in 2D planning vs. 201.15 ± 15.09 mm³ in 3D planning). The volume of PTV receiving 90% of the prescribed dose was 96.9 ± 4.41 cm³ in 2D planning and 97.2 ± 2.61 cm³ in 3D planning (p > 0.05). 3D conformal radiotherapy (RT) for PNS tumors enables the delivery of radiation to the tumor, with respect to critical organs, with lower toxicity to the optic pathway.
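
    For reference, a cumulative dose-volume histogram and its area under the curve can be computed from the dose values inside a structure as sketched below; the dose grid and bin width are illustrative, and with relative volume on the y-axis the AUC equals the mean dose (with absolute volume, the units scale accordingly).

    ```python
    import numpy as np

    def cumulative_dvh(dose_in_structure, bin_width=0.1):
        """Cumulative DVH: fraction of structure volume receiving at least dose D."""
        d_max = float(dose_in_structure.max())
        d_axis = np.arange(0.0, d_max + bin_width, bin_width)
        volume_fraction = np.array([(dose_in_structure >= d).mean() for d in d_axis])
        return d_axis, volume_fraction

    def dvh_auc(d_axis, volume_fraction):
        """Trapezoidal area under the cumulative DVH."""
        return float(np.sum(np.diff(d_axis) *
                            0.5 * (volume_fraction[:-1] + volume_fraction[1:])))
    ```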

  3. Action recognition via cumulative histogram of multiple features

    NASA Astrophysics Data System (ADS)

    Yan, Xunshi; Luo, Yupin

    2011-01-01

    Spatial-temporal interest points (STIPs) are popular in human action recognition. However, they suffer from difficulties in determining the codebook size and lose much information when forming histograms. In this paper, spatial-temporal interest regions (STIRs) are proposed, which are based on STIPs and are capable of marking the locations of the most "shining" human body parts. In order to represent human actions, the proposed approach takes advantage of multiple features, including STIRs, pyramid histograms of oriented gradients and pyramid histograms of oriented optical flow. To achieve this, cumulative histograms are used to integrate dynamic information in sequences and to form feature vectors. Furthermore, the widely used nearest neighbor and AdaBoost methods are employed as classification algorithms. Experiments on the public datasets KTH, Weizmann and UCF Sports show that the proposed approach achieves effective and robust results.

  4. Visual Contrast Enhancement Algorithm Based on Histogram Equalization

    PubMed Central

    Ting, Chih-Chung; Wu, Bing-Fei; Chung, Meng-Liang; Chiu, Chung-Cheng; Wu, Ya-Ching

    2015-01-01

    Image enhancement techniques primarily improve the contrast of an image to lend it a better appearance. One of the popular enhancement methods is histogram equalization (HE) because of its simplicity and effectiveness. However, it is rarely applied to consumer electronics products because it can cause excessive contrast enhancement and feature loss problems. These problems make the images processed by HE look unnatural and introduce unwanted artifacts in them. In this study, a visual contrast enhancement algorithm (VCEA) based on HE is proposed. VCEA considers the requirements of the human visual perception in order to address the drawbacks of HE. It effectively solves the excessive contrast enhancement problem by adjusting the spaces between two adjacent gray values of the HE histogram. In addition, VCEA reduces the effects of the feature loss problem by using the obtained spaces. Furthermore, VCEA enhances the detailed textures of an image to generate an enhanced image with better visual quality. Experimental results show that images obtained by applying VCEA have higher contrast and are more suited to human visual perception than those processed by HE and other HE-based methods. PMID:26184219
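
    VCEA starts from ordinary histogram equalization and then re-spaces adjacent output gray levels to avoid over-enhancement. For reference, a minimal NumPy sketch of the baseline HE step (not the VCEA adjustment itself) is given below; it assumes an 8-bit grayscale input, and the function name is illustrative.

    ```python
    import numpy as np

    def histogram_equalize(gray_u8):
        """Baseline HE: map each gray level through the normalized cumulative histogram."""
        hist, _ = np.histogram(gray_u8.ravel(), bins=256, range=(0, 256))
        cdf = hist.cumsum() / hist.sum()
        lut = np.round(255.0 * cdf).astype(np.uint8)  # one output level per input gray level
        return lut[gray_u8]                           # VCEA would re-space these output levels
    ```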

  5. Adaptive local thresholding for robust nucleus segmentation utilizing shape priors

    NASA Astrophysics Data System (ADS)

    Wang, Xiuzhong; Srinivas, Chukka

    2016-03-01

    This paper describes a novel local thresholding method for foreground detection. First, a Canny edge detection method is used for initial edge detection. Then, tensor voting is applied on the initial edge pixels, using a nonsymmetric tensor field tailored to encode prior information about nucleus size, shape, and intensity spatial distribution. Tensor analysis is then performed to generate the saliency image and, based on that, the refined edge. Next, the image domain is divided into blocks. In each block, at least one foreground and one background pixel are sampled for each refined edge pixel. The saliency weighted foreground histogram and background histogram are then created. These two histograms are used to calculate a threshold by minimizing the background and foreground pixel classification error. The block-wise thresholds are then used to generate the threshold for each pixel via interpolation. Finally, the foreground is obtained by comparing the original image with the threshold image. The effective use of prior information, combined with robust techniques, results in far more reliable foreground detection, which leads to robust nucleus segmentation.
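
    To make the block-wise idea concrete, the sketch below derives one threshold per block and expands it to a per-pixel threshold map. It is a simplified stand-in under stated assumptions: the per-block statistic here is just the block mean, whereas the paper derives each threshold from saliency-weighted foreground and background histograms, and the nearest-neighbour expansion replaces the paper's interpolation.

    ```python
    import numpy as np

    def blockwise_threshold_map(image, block=64):
        """One threshold per block (here simply the block mean as a placeholder),
        expanded to a full-resolution threshold map."""
        h, w = image.shape
        nby, nbx = -(-h // block), -(-w // block)  # ceiling division
        coarse = np.zeros((nby, nbx))
        for by in range(nby):
            for bx in range(nbx):
                tile = image[by * block:(by + 1) * block, bx * block:(bx + 1) * block]
                coarse[by, bx] = tile.mean()
        full = np.repeat(np.repeat(coarse, block, axis=0), block, axis=1)
        return full[:h, :w]

    def segment_foreground(image, block=64):
        """Foreground mask: pixels darker than their local threshold (nuclei are dark)."""
        return image < blockwise_threshold_map(image, block)
    ```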

  6. Moving from spatially segregated to transparent motion: a modelling approach

    PubMed Central

    Durant, Szonya; Donoso-Barrera, Alejandra; Tan, Sovira; Johnston, Alan

    2005-01-01

    Motion transparency, in which patterns of moving elements group together to give the impression of lacy overlapping surfaces, provides an important challenge to models of motion perception. It has been suggested that we perceive transparent motion when the shape of the velocity histogram of the stimulus is bimodal. To investigate this further, random-dot kinematogram motion sequences were created to simulate segregated (perceptually spatially separated) and transparent (perceptually overlapping) motion. The motion sequences were analysed using the multi-channel gradient model (McGM) to obtain the speed and direction at every pixel of each frame of the motion sequences. The velocity histograms obtained were found to be quantitatively similar and all were bimodal. However, the spatial and temporal properties of the velocity field differed between segregated and transparent stimuli. Transparent stimuli produced patches of rightward and leftward motion that varied in location over time. This demonstrates that we can successfully differentiate between these two types of motion on the basis of the time varying local velocity field. However, the percept of motion transparency cannot be based simply on the presence of a bimodal velocity histogram. PMID:17148338

  7. Manifestation of peripheral coding in the effect of increasing loudness and enhanced discrimination of the intensity of tone bursts before and after tone burst noise

    NASA Astrophysics Data System (ADS)

    Rimskaya-Korsavkova, L. K.

    2017-07-01

    To find the possible reasons for the midlevel elevation of the Weber fraction in intensity discrimination of a tone burst, a comparison was performed for the complementary distributions of spike activity of an ensemble of space nerves, such as the distribution of time instants when spikes occur, the distribution of interspike intervals, and the autocorrelation function. The distribution properties were detected in a poststimulus histogram, an interspike interval histogram, and an autocorrelation histogram, all obtained from the reaction of an ensemble of model space nerves in response to an auditory noise burst-useful tone burst complex. Two configurations were used: in the first, the peak amplitude of the tone burst was varied and the noise amplitude was fixed; in the other, the tone burst amplitude was fixed and the noise amplitude was varied. Noise could precede or follow the tone burst. The noise and tone burst durations, as well as the interval between them, was 4 kHz and corresponded to the characteristic frequencies of the model space nerves. The profiles of all the mentioned histograms had two maxima. The values and the positions of the maxima in the poststimulus histogram corresponded to the amplitudes and mutual time position of the noise and the tone burst. The maximum that occurred in response to the tone burst action could be a basis for the formation of the loudness of the latter (explicit loudness). However, the positions of the maxima in the other two histograms did not depend on the positions of tone bursts and noise in the combinations. The first maximum fell in short intervals and united intervals corresponding to the noise and tone burst durations. The second maximum fell in intervals corresponding to a tone burst delay with respect to noise, and its value was proportional to the noise amplitude or tone burst amplitude that was smaller in the complex. An increase in the tone burst or noise amplitude caused nonlinear variations in the two maxima and in the ratio between them. The size of the first maximum in the interspike interval distribution could be the basis for the formation of the loudness of the masked tone burst (implicit loudness), and the size of the second maximum, for the formation of intensity in the periodicity pitch of the complex. The auditory effect of the midlevel enhancement of tone burst loudness could be the result of variations in the implicit tone burst loudness caused by variations in tone-burst or noise intensity. The reason for the enhancement of the Weber fraction could be competitive interaction between such subjective qualities as explicit and implicit tone-burst loudness and the intensity of the periodicity pitch of the complex.

  8. SEURAT: visual analytics for the integrated analysis of microarray data.

    PubMed

    Gribov, Alexander; Sill, Martin; Lück, Sonja; Rücker, Frank; Döhner, Konstanze; Bullinger, Lars; Benner, Axel; Unwin, Antony

    2010-06-03

    In translational cancer research, gene expression data is collected together with clinical data and genomic data arising from other chip-based high-throughput technologies. Software tools for the joint analysis of such high-dimensional data sets together with clinical data are required. We have developed an open source software tool which provides interactive visualization capability for the integrated analysis of high-dimensional gene expression data together with associated clinical data, array CGH data and SNP array data. The different data types are organized by a comprehensive data manager. Interactive tools are provided for all graphics: heatmaps, dendrograms, barcharts, histograms, eventcharts and a chromosome browser, which displays genetic variations along the genome. All graphics are dynamic and fully linked so that any object selected in a graphic will be highlighted in all other graphics. For exploratory data analysis the software provides unsupervised data analytics like clustering, seriation algorithms and biclustering algorithms. The SEURAT software meets the growing needs of researchers to perform joint analysis of gene expression, genomic and clinical data.

  9. Application of Magnetic Resonance Imaging and Three-Dimensional Treatment Planning in the Treatment of Orbital Lymphoma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rudoltz, Marc S.; Ayyangar, Komanduri; Mohiuddin, Mohammed

    Radiotherapy for lymphoma of the orbit must be individualized for each patient and clinical setting. Most techniques focus on optimizing the dose to the tumor while sparing the lens. This study describes a technique utilizing magnetic resonance imaging (MRI) and three-dimensional (3D) planning in the treatment of orbital lymphoma. A patient presented with an intermediate-grade lymphoma of the right orbit. The prescribed tumor dose was 4050 cGy in 18 fractions. 3D planning was carried out, and the tumor volumes, retina, and lens were subsequently outlined. Dose calculations, including dose-volume histograms of the target, retina, and lens, were then performed. Part of the retina was outside of the treatment volume, while 50% of the retina received 90% or more of the prescribed dose. The patient was clinically NED (no evidence of disease) when last seen 2 years following therapy, with no treatment-related morbidity. Patients with lymphomas of the orbit can be optimally treated using MRI-based 3D treatment planning.

  10. Histogram analysis of apparent diffusion coefficient for monitoring early response in patients with advanced cervical cancers undergoing concurrent chemo-radiotherapy.

    PubMed

    Meng, Jie; Zhu, Lijing; Zhu, Li; Ge, Yun; He, Jian; Zhou, Zhengyang; Yang, Xiaofeng

    2017-11-01

    Background: Apparent diffusion coefficient (ADC) histogram analysis has been widely used in determining tumor prognosis. Purpose: To investigate the dynamic changes of ADC histogram parameters during concurrent chemo-radiotherapy (CCRT) in patients with advanced cervical cancers. Material and Methods: This prospective study enrolled 32 patients with advanced cervical cancers undergoing CCRT who received diffusion-weighted (DW) magnetic resonance imaging (MRI) before CCRT, at the end of the second and fourth week during CCRT and one month after CCRT completion. The ADC histogram for the entire tumor volume was generated, and a series of histogram parameters was obtained. Dynamic changes of those parameters in cervical cancers were investigated as early biomarkers for treatment response. Results: All histogram parameters except AUClow showed significant changes during CCRT (all P < 0.05). There were three variable trends involving different parameters. The mode, 5th, 10th, and 25th percentiles showed similar early increase rates (33.33%, 33.99%, 34.12%, and 30.49%, respectively) at the end of the second week of CCRT. The pre-CCRT 5th and 25th percentiles of the complete response (CR) group were significantly lower than those of the partial response (PR) group. Conclusion: A series of ADC histogram parameters of cervical cancers changed significantly at the early stage of CCRT, indicating their potential in monitoring early tumor response to therapy.
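
    For readers who want to reproduce this kind of whole-tumour ADC histogram profiling, the snippet below computes the mode and the low percentiles tracked in the study from a flat array of tumour ADC values; the bin count, function name, and use of NumPy/SciPy are illustrative assumptions rather than the authors' software.

    ```python
    import numpy as np
    from scipy import stats

    def adc_histogram_parameters(adc_values, bins=128):
        """Whole-tumour ADC histogram metrics of the kind followed during CCRT."""
        v = np.asarray(adc_values, dtype=float).ravel()
        counts, edges = np.histogram(v, bins=bins)
        mode = 0.5 * (edges[counts.argmax()] + edges[counts.argmax() + 1])  # centre of tallest bin
        return {
            "mean": v.mean(),
            "mode": mode,
            "p5": np.percentile(v, 5),
            "p10": np.percentile(v, 10),
            "p25": np.percentile(v, 25),
            "skewness": stats.skew(v),
            "kurtosis": stats.kurtosis(v),
        }
    ```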

  11. Whole Tumor Histogram-profiling of Diffusion-Weighted Magnetic Resonance Images Reflects Tumorbiological Features of Primary Central Nervous System Lymphoma.

    PubMed

    Schob, Stefan; Münch, Benno; Dieckow, Julia; Quäschling, Ulf; Hoffmann, Karl-Titus; Richter, Cindy; Garnov, Nikita; Frydrychowicz, Clara; Krause, Matthias; Meyer, Hans-Jonas; Surov, Alexey

    2018-04-01

    Diffusion weighted imaging (DWI) quantifies motion of hydrogen nuclei in biological tissues and has thereby been used to assess the underlying tissue microarchitecture. Histogram-profiling of DWI provides more detailed information on the diffusion characteristics of a lesion than the standardly calculated values of the apparent diffusion coefficient (ADC): minimum, mean, and maximum. Hence, the aim of our study was to investigate which parameters of histogram-profiling of DWI in primary central nervous system lymphoma can be used to specifically predict features like cellular density, chromatin content and proliferative activity. Pre-treatment ADC maps of 21 PCNSL patients (8 female, 13 male, 28-89 years) from a 1.5T system were used for Matlab-based histogram profiling. Results of histopathology (H&E staining) and immunohistochemistry (Ki-67 expression) were quantified. Correlations between histogram-profiling parameters and neuropathologic examination were calculated using SPSS 23.0. The lower percentiles (p10 and p25) showed significant correlations with structural parameters of the neuropathologic examination (cellular density, chromatin content). The highest percentile, p90, correlated significantly with Ki-67 expression, reflecting proliferative activity. Kurtosis of the ADC histogram correlated significantly with cellular density. Histogram-profiling of DWI in PCNSL provides a comprehensible set of parameters, which reflect distinct tumor-architectural and tumor-biological features, and hence, are promising biomarkers for treatment response and prognosis. Copyright © 2018. Published by Elsevier Inc.

  12. ADC histogram analysis of muscle lymphoma - Correlation with histopathology in a rare entity.

    PubMed

    Meyer, Hans-Jonas; Pazaitis, Nikolaos; Surov, Alexey

    2018-06-21

    Diffusion weighted imaging (DWI) is able to reflect histopathological architecture. A novel imaging approach, namely histogram analysis, is used to further characterize lesions on MRI. The purpose of this study is to correlate histogram parameters derived from apparent diffusion coefficient (ADC) maps with histopathological parameters in muscle lymphoma. Eight patients (mean age 64.8 years, range 45-72 years) with histopathologically confirmed muscle lymphoma were retrospectively identified. Cell count, total nucleic area, and average nucleic area were estimated using ImageJ. Additionally, the Ki-67 index was calculated. DWI was obtained on a 1.5T scanner using b values of 0 and 1000 s/mm². Histogram analysis was performed as a whole-lesion measurement using a custom-made Matlab-based application. The correlation analysis revealed a statistically significant correlation between cell count and ADCmean (p = -0.76, P = 0.03) as well as with ADCp75 (p = -0.79, P = 0.02). Kurtosis and entropy correlated with average nucleic area (p = -0.81, P = 0.02 and p = 0.88, P = 0.007, respectively). None of the analyzed ADC parameters correlated with total nucleic area or with the Ki-67 index. This study identified significant correlations between cellularity and histogram parameters derived from ADC maps in muscle lymphoma. Thus, histogram analysis parameters reflect histopathology in muscle tumors. Advances in knowledge: Whole-lesion ADC histogram analysis is able to reflect histopathological parameters in muscle lymphomas.

  13. The value of whole lesion ADC histogram profiling to differentiate between morphologically indistinguishable ring enhancing lesions-comparison of glioblastomas and brain abscesses.

    PubMed

    Horvath-Rizea, Diana; Surov, Alexey; Hoffmann, Karl-Titus; Garnov, Nikita; Vörkel, Cathrin; Kohlhof-Meinecke, Patricia; Ganslandt, Oliver; Bäzner, Hansjörg; Gihr, Georg Alexander; Kalman, Marcell; Henkes, Elina; Henkes, Hans; Schob, Stefan

    2018-04-06

    Morphologically similar-appearing ring-enhancing lesions in the brain parenchyma can be caused by a number of distinct pathologies; however, they consistently represent life-threatening conditions. The two most frequently encountered diseases manifesting as such are glioblastoma multiforme (GBM) and brain abscess (BA), each requiring disparate therapeutic approaches. As a result of their morphological resemblance, essential treatment might be significantly delayed or even omitted if the results of conventional imaging remain inconclusive. Therefore, our study aimed to investigate whether ADC histogram profiling can reliably distinguish between the two entities, thus enhancing the differential diagnostic process and preventing treatment failure in this highly critical context. 103 patients (51 BA, 52 GBM) with histopathologically confirmed diagnoses were enrolled. Pretreatment diffusion weighted imaging (DWI) was obtained in a 1.5T system using b values of 0, 500, and 1000 s/mm². Whole-lesion ADC volumes were analyzed using a histogram-based approach. Statistical analysis was performed using SPSS version 23. All investigated parameters differed statistically between the two groups. Most importantly, ADCp10 was able to differentiate reliably between BA and GBM with excellent accuracy (0.948) using a cut-off value of 70 × 10⁻⁵ mm²/s. ADC whole-lesion histogram profiling provides a valuable tool to differentiate between morphologically indistinguishable mass lesions. Among the investigated parameters, the 10th percentile of the ADC volume distinguished best between GBM and BA.

  14. Whole lesion histogram analysis of meningiomas derived from ADC values. Correlation with several cellularity parameters, proliferation index KI 67, nucleic content, and membrane permeability.

    PubMed

    Surov, Alexey; Hamerla, Gordian; Meyer, Hans Jonas; Winter, Karsten; Schob, Stefan; Fiedler, Eckhard

    2018-09-01

    To analyze several histopathological features and their possible correlations with whole-lesion histogram analysis derived from ADC maps in meningioma. The retrospective study involved 36 patients with primary meningiomas. For every tumor, the following histogram analysis parameters of the apparent diffusion coefficient (ADC) were calculated: ADCmean, ADCmax, ADCmin, ADCmedian, ADCmode, the ADC percentiles P10, P25, P75, and P90, as well as kurtosis, skewness, and entropy. All measurements were performed by two radiologists. The proliferation index KI 67, minimal, maximal, and mean cell count, total nucleic area, and expression of the water channel aquaporin 4 (AQP4) were estimated. Spearman's correlation coefficient was used to analyze associations between the investigated parameters. Excellent interobserver agreement for all ADC values (0.84-0.97) was identified. All ADC values correlated inversely with tumor cellularity, with the strongest correlation between P10, P25 and mean cell count (-0.558). KI 67 correlated inversely with all ADC values except ADCmin. ADC parameters did not correlate with total nucleic area. All ADC values correlated statistically significantly with expression of AQP4. ADC histogram analysis is a valid method with excellent interobserver agreement. Cellularity parameters and proliferation potential are associated with different ADC values. Membrane permeability may play a greater role in water diffusion than cell count and proliferation activity. Copyright © 2018 Elsevier Inc. All rights reserved.

  15. Time-cumulated visible and infrared histograms used as descriptor of cloud cover

    NASA Technical Reports Server (NTRS)

    Seze, G.; Rossow, W.

    1987-01-01

    To study the statistical behavior of clouds for different climate regimes, the spatial and temporal stability of VIS-IR bidimensional histograms is tested. Also, the effect of data sampling and averaging on the histogram shapes is considered; in particular the sampling strategy used by the International Satellite Cloud Climatology Project is tested.
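
    A bidimensional VIS-IR histogram of the kind whose stability is tested here can be sketched in a few lines; the bin counts, normalization, and function name below are illustrative assumptions, not the ISCCP processing itself.

    ```python
    import numpy as np

    def vis_ir_histogram(vis_reflectance, ir_temperature, bins=(32, 32)):
        """Normalized bidimensional VIS-IR histogram for one scene or time period."""
        counts, vis_edges, ir_edges = np.histogram2d(
            np.ravel(vis_reflectance), np.ravel(ir_temperature), bins=bins)
        return counts / counts.sum(), vis_edges, ir_edges

    # Time-cumulated version: sum the raw count arrays over many scenes, then normalize once.
    ```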

  16. Interpreting Histograms. As Easy as It Seems?

    ERIC Educational Resources Information Center

    Lem, Stephanie; Onghena, Patrick; Verschaffel, Lieven; Van Dooren, Wim

    2014-01-01

    Histograms are widely used, but recent studies have shown that they are not as easy to interpret as it might seem. In this article, we report on three studies on the interpretation of histograms in which we investigated, namely, (1) whether the misinterpretation by university students can be considered to be the result of heuristic reasoning, (2)…

  17. Improving Real World Performance of Vision Aided Navigation in a Flight Environment

    DTIC Science & Technology

    2016-09-15

    Excerpt from the report's table of contents: Introduction; 4.2 Wide Area Search Extent; 4.3 Large-Scale Image Navigation Histogram Filter (4.3.1 Location Model, 4.3.2 Measurement Model, 4.3.3 Histogram Filter, Iteration of Histogram Filter); 4.4 Implementation and Flight Test Campaign (4.4.1 Software Implementation).

  18. Airborne gamma-ray spectrometer and magnetometer survey, Durango A, Colorado. Final report Volume II A. Detail area

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1983-01-01

    This volume contains the geology of the Durango A detail area, radioactive mineral occurrences in Colorado, and geophysical data interpretation. Eight appendices provide the following: stacked profiles, geologic histograms, geochemical histograms, speed and altitude histograms, geologic statistical tables, geochemical statistical tables, magnetic and ancillary profiles, and test line data.

  19. Airborne gamma-ray spectrometer and magnetometer survey, Durango B, Colorado. Final report Volume II A. Detail area

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1983-01-01

    The geology of the Durango B detail area, the radioactive mineral occurrences in Colorado and the geophysical data interpretation are included in this report. Seven appendices contain: stacked profiles, geologic histograms, geochemical histograms, speed and altitude histograms, geologic statistical tables, geochemical statistical tables, and test line data.

  20. Students' Understanding of Bar Graphs and Histograms: Results from the LOCUS Assessments

    ERIC Educational Resources Information Center

    Whitaker, Douglas; Jacobbe, Tim

    2017-01-01

    Bar graphs and histograms are core statistical tools that are widely used in statistical practice and commonly taught in classrooms. Despite their importance and the instructional time devoted to them, many students demonstrate misunderstandings when asked to read and interpret bar graphs and histograms. Much of the research that has been…

  1. An evaluation of the effectiveness of adaptive histogram equalization for contrast enhancement.

    PubMed

    Zimmerman, J B; Pizer, S M; Staab, E V; Perry, J R; McCartney, W; Brenton, B C

    1988-01-01

    Adaptive histogram equalization (AHE) and intensity windowing have been compared using psychophysical observer studies. Experienced radiologists were shown clinical CT (computerized tomographic) images of the chest. Into some of the images, appropriate artificial lesions were introduced; the physicians were then shown the images processed with both AHE and intensity windowing. They were asked to assess the probability that a given image contained the artificial lesion, and their accuracy was measured. The results of these experiments show that for this particular diagnostic task, there was no significant difference in the ability of the two methods to depict luminance contrast; thus, further evaluation of AHE using controlled clinical trials is indicated.
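
    Readers wishing to try a comparable comparison on their own CT slices can start from the sketch below; it uses scikit-image's contrast-limited variant of AHE (CLAHE) as a stand-in for the AHE implementation evaluated in the study, and the window centre/width defaults and function name are illustrative assumptions.

    ```python
    import numpy as np
    from skimage import exposure

    def compare_clahe_and_windowing(ct_slice_hu, window_center=40.0, window_width=400.0):
        """Contrast-limited AHE (scikit-image) versus a fixed intensity window,
        both returned as floats in [0, 1] for side-by-side display."""
        lo = window_center - window_width / 2.0
        hi = window_center + window_width / 2.0
        windowed = np.clip((ct_slice_hu - lo) / (hi - lo), 0.0, 1.0)
        rescaled = exposure.rescale_intensity(ct_slice_hu.astype(float), out_range=(0.0, 1.0))
        clahe = exposure.equalize_adapthist(rescaled, clip_limit=0.02)
        return clahe, windowed
    ```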

  2. Comparison Tools for Assessing the Microgravity Environment of Space Missions, Carriers and Conditions

    NASA Technical Reports Server (NTRS)

    DeLombard, Richard; Hrovat, Kenneth; Moskowitz, Milton; McPherson, Kevin M.

    1998-01-01

    The microgravity environment of the NASA Shuttles and Russia's Mir space station has been measured by specially designed accelerometer systems. The need for comparisons between different missions, vehicles, conditions, etc. has been addressed by the two new processes described in this paper. The Principal Component Spectral Analysis (PCSA) and Quasi-steady Three-dimensional Histogram (QTH) techniques provide the means to describe the microgravity acceleration environment over a long time span of data on a single plot. As described in this paper, the PCSA and QTH techniques allow both the range and the median of the microgravity environment to be represented graphically on a single page. A variety of operating conditions may be made evident by using PCSA or QTH plots. The PCSA plot can help to distinguish between equipment operating full time or part time, as well as show the variability of the magnitude and/or frequency of an acceleration source. A QTH plot summarizes the magnitude and orientation of the low-frequency acceleration vector. This type of plot can show the microgravity effects of attitude, altitude, venting, etc.

  3. Dependence of Interfacial Excess on the Threshold Value of the Isoconcentration Surface

    NASA Technical Reports Server (NTRS)

    Yoon, Kevin E.; Noebe, Ronald D.; Hellman, Olof C.; Seidman, David N.

    2004-01-01

    The proximity histogram (or proxigram for short) is used for analyzing data collected by a three-dimensional atom probe microscope. The interfacial excess of Re (2.41 +/- 0.68 atoms/sq nm) is calculated by employing a proxigram in a completely geometrically independent way for gamma/gamma' interfaces in Rene N6, a third-generation single-crystal Ni-based superalloy. A possible dependence of interfacial excess on the variation of the threshold value of an isoconcentration surface is investigated using the data collected for Rene N6 alloy. It is demonstrated that the dependence of the interfacial excess value on the threshold value of the isoconcentration surface is weak.

  4. Modeling Early Postnatal Brain Growth and Development with CT: Changes in the Brain Radiodensity Histogram from Birth to 2 Years.

    PubMed

    Cauley, K A; Hu, Y; Och, J; Yorks, P J; Fielden, S W

    2018-04-01

    The majority of brain growth and development occur in the first 2 years of life. This study investigated these changes by analysis of the brain radiodensity histogram of head CT scans from the clinical population, 0-2 years of age. One hundred twenty consecutive head CTs with normal findings meeting the inclusion criteria from children from birth to 2 years were retrospectively identified from 3 different CT scan platforms. Histogram analysis was performed on brain-extracted images, and histogram mean, mode, full width at half maximum, skewness, kurtosis, and SD were correlated with subject age. The effects of scan platform were investigated. Normative curves were fitted by polynomial regression analysis. Average total brain volume was 360 cm³ at birth, 948 cm³ at 1 year, and 1072 cm³ at 2 years. Total brain tissue density showed an 11% increase in mean density at 1 year and 19% at 2 years. Brain radiodensity histogram skewness was positive at birth, declining logarithmically in the first 200 days of life. The histogram kurtosis also decreased in the first 200 days to approach a normal distribution. Direct segmentation of CT images showed that changes in brain radiodensity histogram skewness correlated with, and can be explained by, a relative increase in gray matter volume and an increase in gray and white matter tissue density that occurs during this period of brain maturation. Normative metrics of the brain radiodensity histogram derived from routine clinical head CT images can be used to develop a model of normal brain development. © 2018 by American Journal of Neuroradiology.

  5. Histogram analysis derived from apparent diffusion coefficient (ADC) is more sensitive to reflect serological parameters in myositis than conventional ADC analysis.

    PubMed

    Meyer, Hans Jonas; Emmer, Alexander; Kornhuber, Malte; Surov, Alexey

    2018-05-01

    Diffusion-weighted imaging (DWI) has the potential to reflect histopathological architecture. A novel imaging approach, namely histogram analysis, is used to further characterize tissues on MRI. The aim of this study was to correlate histogram parameters derived from apparent diffusion coefficient (ADC) maps with serological parameters in myositis. Sixteen patients with autoimmune myositis were included in this retrospective study. DWI was obtained on a 1.5 T scanner using b-values of 0 and 1000 s/mm². Histogram analysis was performed as a whole-muscle measurement using a custom-made Matlab-based application. The following ADC histogram parameters were estimated: ADCmean, ADCmax, ADCmin, ADCmedian, ADCmode, the percentiles ADCp10, ADCp25, ADCp75 and ADCp90, as well as the histogram parameters kurtosis, skewness, and entropy. In all patients, the blood sample was acquired within 3 days of the MRI. The following serological parameters were estimated: alanine aminotransferase, aspartate aminotransferase, creatine kinase, lactate dehydrogenase, C-reactive protein (CRP) and myoglobin. All patients were screened for Jo1 autoantibodies. Kurtosis correlated inversely with CRP (p = -0.55, P = 0.03). Furthermore, ADCp10 and ADCp90 values tended to correlate with creatine kinase (p = -0.43, P = 0.11 and p = -0.42, P = 0.12, respectively). In addition, ADCmean, p10, p25, median, mode, and entropy were different between Jo1-positive and Jo1-negative patients. ADC histogram parameters are sensitive for the detection of muscle alterations in myositis patients. Advances in knowledge: This study identified that kurtosis derived from ADC maps is associated with CRP in myositis patients. Furthermore, several ADC histogram parameters are statistically different between Jo1-positive and Jo1-negative patients.

  6. Can histogram analysis of MR images predict aggressiveness in pancreatic neuroendocrine tumors?

    PubMed

    De Robertis, Riccardo; Maris, Bogdan; Cardobi, Nicolò; Tinazzi Martini, Paolo; Gobbo, Stefano; Capelli, Paola; Ortolani, Silvia; Cingarlini, Sara; Paiella, Salvatore; Landoni, Luca; Butturini, Giovanni; Regi, Paolo; Scarpa, Aldo; Tortora, Giampaolo; D'Onofrio, Mirko

    2018-06-01

    To evaluate MRI-derived whole-tumour histogram analysis parameters in predicting pancreatic neuroendocrine neoplasm (panNEN) grade and aggressiveness. Pre-operative MR images of 42 consecutive patients with panNENs >1 cm were retrospectively analysed. T1-/T2-weighted images and ADC maps were analysed. Histogram-derived parameters were compared to histopathological features using the Mann-Whitney U test. Diagnostic accuracy was assessed by ROC-AUC analysis; sensitivity and specificity were assessed for each histogram parameter. ADC entropy was significantly higher in G2-3 tumours with ROC-AUC 0.757; sensitivity and specificity were 83.3% (95% CI: 61.2-94.5) and 61.1% (95% CI: 36.1-81.7). ADC kurtosis was higher in panNENs with vascular involvement, nodal and hepatic metastases (p = .008, .021 and .008; ROC-AUC = 0.820, 0.709 and 0.820); sensitivity and specificity were 85.7/74.3% (95% CI: 42-99.2/56.4-86.9), 36.8/96.5% (95% CI: 17.2-61.4/76-99.8) and 100/62.8% (95% CI: 56.1-100/44.9-78.1). No significant differences between groups were found for other histogram-derived parameters (p > .05). Whole-tumour histogram analysis of ADC maps may be helpful in predicting tumour grade, vascular involvement, nodal and liver metastases in panNENs. ADC entropy and ADC kurtosis are the most accurate parameters for identification of panNENs with malignant behaviour. • Whole-tumour ADC histogram analysis can predict aggressiveness in pancreatic neuroendocrine neoplasms. • ADC entropy and kurtosis are higher in aggressive tumours. • ADC histogram analysis can quantify tumour diffusion heterogeneity. • Non-invasive quantification of tumour heterogeneity can provide adjunctive information for prognostication.
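
    As a minimal sketch of the two most discriminative parameters reported here, the snippet below computes histogram entropy and kurtosis for one tumour and shows how a parameter's diagnostic accuracy across a cohort could be summarized as an ROC-AUC; the bin count, function names, and the scikit-learn call are illustrative assumptions, not the study's exact pipeline.

    ```python
    import numpy as np
    from scipy import stats
    from sklearn.metrics import roc_auc_score

    def adc_entropy_and_kurtosis(adc_values, bins=128):
        """Shannon entropy of the whole-tumour ADC histogram, plus kurtosis of the values."""
        v = np.asarray(adc_values, dtype=float).ravel()
        counts, _ = np.histogram(v, bins=bins)
        p = counts / counts.sum()
        entropy = -np.sum(p[p > 0] * np.log(p[p > 0]))
        return entropy, stats.kurtosis(v)

    # Cohort-level discrimination of, e.g., G2-3 versus G1 tumours as an ROC-AUC:
    # entropies = [adc_entropy_and_kurtosis(v)[0] for v in tumour_adc_arrays]
    # auc = roc_auc_score(is_grade_2_or_3, entropies)
    ```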

  7. Non-small cell lung cancer: Whole-lesion histogram analysis of the apparent diffusion coefficient for assessment of tumor grade, lymphovascular invasion and pleural invasion.

    PubMed

    Tsuchiya, Naoko; Doai, Mariko; Usuda, Katsuo; Uramoto, Hidetaka; Tonami, Hisao

    2017-01-01

    Investigating the diagnostic accuracy of histogram analyses of apparent diffusion coefficient (ADC) values for determining non-small cell lung cancer (NSCLC) tumor grades, lymphovascular invasion, and pleural invasion. We studied 60 surgically diagnosed NSCLC patients. Diffusion-weighted imaging (DWI) was performed in the axial plane using a navigator-triggered single-shot, echo-planar imaging sequence with prospective acquisition correction. The ADC maps were generated, and we placed a volume-of-interest on the tumor to construct the whole-lesion histogram. Using the histogram, we calculated the mean, 5th, 10th, 25th, 50th, 75th, 90th, and 95th percentiles of ADC, skewness, and kurtosis. Histogram parameters were correlated with tumor grade, lymphovascular invasion, and pleural invasion. We performed a receiver operating characteristics (ROC) analysis to assess the diagnostic performance of histogram parameters for distinguishing different pathologic features. The ADC mean, 10th, 25th, 50th, 75th, 90th, and 95th percentiles showed significant differences among the tumor grades. The ADC mean, 25th, 50th, 75th, 90th, and 95th percentiles were significant histogram parameters between high- and low-grade tumors. The ROC analysis between high- and low-grade tumors showed that the 95th percentile ADC achieved the highest area under curve (AUC) at 0.74. Lymphovascular invasion was associated with the ADC mean, 50th, 75th, 90th, and 95th percentiles, skewness, and kurtosis. Kurtosis achieved the highest AUC at 0.809. Pleural invasion was only associated with skewness, with the AUC of 0.648. ADC histogram analyses on the basis of the entire tumor volume are able to stratify NSCLCs' tumor grade, lymphovascular invasion and pleural invasion.

  8. Investigation on improved infrared image detail enhancement algorithm based on adaptive histogram statistical stretching and gradient filtering

    NASA Astrophysics Data System (ADS)

    Zeng, Bangze; Zhu, Youpan; Li, Zemin; Hu, Dechao; Luo, Lin; Zhao, Deli; Huang, Juan

    2014-11-01

    Due to the low contrast, strong noise, and unclear visual effect of infrared images, targets are very difficult to observe and identify. This paper presents an improved infrared image detail enhancement algorithm based on adaptive histogram statistical stretching and gradient filtering (AHSS-GF). Based on the fact that the human eye is very sensitive to edges and lines, the authors propose extracting the details and textures by using gradient filtering. A new histogram can be acquired by calculating the sum of the original histogram over a fixed window. With the minimum value as the cut-off point, histogram statistical stretching is carried out. After proper weights are given to the details and the background, the detail-enhanced result is acquired. The results indicate that image contrast can be improved and that details and textures can be enhanced effectively as well.

  9. LEDs as light source: examining quality of acquired images

    NASA Astrophysics Data System (ADS)

    Bachnak, Rafic; Funtanilla, Jeng; Hernandez, Jose

    2004-05-01

    Recent advances in technology have made light emitting diodes (LEDs) viable in a number of applications, including vehicle stoplights, traffic lights, machine-vision-inspection, illumination, and street signs. This paper presents the results of comparing images taken by a videoscope using two different light sources. One of the sources is the internal metal halide lamp and the other is a LED placed at the tip of the insertion tube. Images acquired using these two light sources were quantitatively compared using their histogram, intensity profile along a line segment, and edge detection. Also, images were qualitatively compared using image registration and transformation. The gray-level histogram, edge detection, image profile and image registration do not offer conclusive results. The LED light source, however, produces good images for visual inspection by an operator. The paper will present the results and discuss the usefulness and shortcomings of various comparison methods.

  10. Face Liveness Detection Using Defocus

    PubMed Central

    Kim, Sooyeon; Ban, Yuseok; Lee, Sangyoun

    2015-01-01

    In order to develop security systems for identity authentication, face recognition (FR) technology has been applied. One of the main problems of applying FR technology is that the systems are especially vulnerable to attacks with spoofing faces (e.g., 2D pictures). To defend from these attacks and to enhance the reliability of FR systems, many anti-spoofing approaches have been recently developed. In this paper, we propose a method for face liveness detection using the effect of defocus. From two images sequentially taken at different focuses, three features, focus, power histogram and gradient location and orientation histogram (GLOH), are extracted. Afterwards, we detect forged faces through the feature-level fusion approach. For reliable performance verification, we develop two databases with a handheld digital camera and a webcam. The proposed method achieves a 3.29% half total error rate (HTER) at a given depth of field (DoF) and can be extended to camera-equipped devices, like smartphones. PMID:25594594

  11. Statistically based splicing detection reveals neural enrichment and tissue-specific induction of circular RNA during human fetal development.

    PubMed

    Szabo, Linda; Morey, Robert; Palpant, Nathan J; Wang, Peter L; Afari, Nastaran; Jiang, Chuan; Parast, Mana M; Murry, Charles E; Laurent, Louise C; Salzman, Julia

    2015-06-16

    The pervasive expression of circular RNA is a recently discovered feature of gene expression in highly diverged eukaryotes, but the functions of most circular RNAs are still unknown. Computational methods to discover and quantify circular RNA are essential. Moreover, discovering biological contexts where circular RNAs are regulated will shed light on potential functional roles they may play. We present a new algorithm that increases the sensitivity and specificity of circular RNA detection by discovering and quantifying circular and linear RNA splicing events at both annotated and un-annotated exon boundaries, including intergenic regions of the genome, with high statistical confidence. Unlike approaches that rely on read count and exon homology to determine confidence in prediction of circular RNA expression, our algorithm uses a statistical approach. Using our algorithm, we unveiled striking induction of general and tissue-specific circular RNAs, including in the heart and lung, during human fetal development. We discover regions of the human fetal brain, such as the frontal cortex, with marked enrichment for genes where circular RNA isoforms are dominant. The vast majority of circular RNA production occurs at major spliceosome splice sites; however, we find the first examples of developmentally induced circular RNAs processed by the minor spliceosome, and an enriched propensity of minor spliceosome donors to splice into circular RNA at un-annotated, rather than annotated, exons. Together, these results suggest a potentially significant role for circular RNA in human development.

  12. Performance evaluation for 120 four-layer DOI block detectors of the jPET-D4.

    PubMed

    Inadama, Naoko; Murayama, Hideo; Ono, Yusuke; Tsuda, Tomoaki; Hamamoto, Manabu; Yamaya, Taiga; Yoshida, Eiji; Shibuya, Kengo; Nishikido, Fumihiko; Takahashi, Kei; Kawai, Hideyuki

    2008-01-01

    The jPET-D4 is a brain positron emission tomography (PET) scanner that we have developed to meet user demands for high sensitivity and high spatial resolution. For this scanner, we developed a four-layer depth-of-interaction (DOI) detector. The four-layer DOI detector is a key component of the jPET-D4, and its performance has a great influence on the overall system performance. Previously, we reported the original technique for encoding four-layer DOI. Here, we introduce the final design of the jPET-D4 detector and present the results of an investigation on uniformity in performance of the detector. The performance evaluation was done over the 120 DOI crystal blocks for the detectors, which are to be assembled into the jPET-D4 scanner. We also introduce the crystal assembly method, which is simple enough, even though each DOI crystal block is composed of 1,024 crystal elements. The jPET-D4 detector consists of four layers of 16 × 16 Gd₂SiO₅ (GSO) crystals and a 256-channel flat-panel position-sensitive photomultiplier tube (256ch FP-PMT). To identify scintillated crystals in the four-layer DOI detector, we use pulse shape discrimination and position discrimination on the two-dimensional (2D) position histogram. For pulse shape discrimination, two kinds of GSO crystals that show different scintillation decay time constants are used in the upper two and lower two layers, respectively. Proper reflector arrangement in the crystal block then allows the scintillated crystals to be identified in these two-layer groupings with two 2D position histograms. We produced the 120 DOI crystal blocks for the jPET-D4 system, and measured their characteristics such as the accuracy of pulse shape discrimination, energy resolution, and the pulse height of the full energy peak. The results show a satisfactory and uniform performance of the four-layer DOI crystal blocks; for example, the misidentification rate in each GSO layer is <5% based on pulse shape discrimination, the averaged energy resolutions for the central four crystals of the first (farthest from the FP-PMT), second, third, and fourth layers are 15.7 ± 1.0, 15.8 ± 0.6, 17.7 ± 1.2, and 17.3 ± 1.4%, respectively, and the variation in pulse height of the full energy peak among the four layers is <5% on average.

  13. Spline smoothing of histograms by linear programming

    NASA Technical Reports Server (NTRS)

    Bennett, J. O.

    1972-01-01

    An algorithm is presented for obtaining an approximating function to the frequency distribution from a sample of size n. To obtain the approximating function, a histogram is first made from the data. Next, Euclidean-space approximations to the graph of the histogram, using central B-splines as basis elements, are obtained by linear programming. The approximating function has area one and is nonnegative.
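
    As a rough illustration of fitting a histogram with B-splines by linear programming, the sketch below minimizes the sum of absolute deviations between a cubic spline and the histogram heights, with nonnegativity of the spline enforced at the bin centres. It differs from the original algorithm in stated ways: the unit-area constraint is omitted, ordinary clamped B-splines are used instead of central B-splines, and the knot placement is an arbitrary choice.

    ```python
    import numpy as np
    from scipy.interpolate import BSpline
    from scipy.optimize import linprog

    def l1_spline_fit(bin_centers, bin_heights, n_knots=10, degree=3):
        """Least-absolute-deviation spline fit to a histogram, solved as a linear program."""
        x = np.asarray(bin_centers, dtype=float)
        h = np.asarray(bin_heights, dtype=float)
        # clamped knot vector spanning the histogram support
        t = np.r_[[x[0]] * degree, np.linspace(x[0], x[-1], n_knots), [x[-1]] * degree]
        n_coef = len(t) - degree - 1
        # basis matrix: column j is the j-th B-spline basis function evaluated at the bin centres
        B = np.column_stack([BSpline(t, np.eye(n_coef)[j], degree)(x) for j in range(n_coef)])
        n_bins = len(x)
        # LP variables: spline coefficients c (free sign) and residual bounds r >= 0
        # minimize sum(r)  subject to  -r <= B c - h <= r  and  B c >= 0 at the bin centres
        cost = np.r_[np.zeros(n_coef), np.ones(n_bins)]
        A_ub = np.block([
            [ B, -np.eye(n_bins)],
            [-B, -np.eye(n_bins)],
            [-B, np.zeros((n_bins, n_bins))],
        ])
        b_ub = np.r_[h, -h, np.zeros(n_bins)]
        bounds = [(None, None)] * n_coef + [(0, None)] * n_bins
        result = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
        return BSpline(t, result.x[:n_coef], degree)
    ```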

  14. Histogram analysis of greyscale sonograms to differentiate between the subtypes of follicular variant of papillary thyroid cancer.

    PubMed

    Kwon, M-R; Shin, J H; Hahn, S Y; Oh, Y L; Kwak, J Y; Lee, E; Lim, Y

    2018-06-01

    To evaluate the diagnostic value of histogram analysis using ultrasound (US) to differentiate between the subtypes of follicular variant of papillary thyroid carcinoma (FVPTC). The present study included 151 patients with surgically confirmed FVPTC diagnosed between January 2014 and May 2016. Their preoperative US features were reviewed retrospectively. Histogram parameters (mean, maximum, minimum, range, root mean square, skewness, kurtosis, energy, entropy, and correlation) were obtained for each nodule. The 152 nodules in 151 patients comprised 48 non-invasive follicular thyroid neoplasms with papillary-like nuclear features (NIFTPs; 31.6%), 60 invasive encapsulated FVPTCs (EFVPTCs; 39.5%), and 44 infiltrative FVPTCs (28.9%). The US features differed significantly between the subtypes of FVPTC. Discrimination was achieved between NIFTPs and infiltrative FVPTC, and between invasive EFVPTC and infiltrative FVPTC using histogram parameters; however, the parameters were not significantly different between NIFTP and invasive EFVPTC. It is feasible to use greyscale histogram analysis to differentiate between NIFTP and infiltrative FVPTC, but not between NIFTP and invasive EFVPTC. Histograms can be used as a supplementary tool to differentiate the subtypes of FVPTC. Copyright © 2017 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.

  15. DSP+FPGA-based real-time histogram equalization system of infrared image

    NASA Astrophysics Data System (ADS)

    Gu, Dongsheng; Yang, Nansheng; Pi, Defu; Hua, Min; Shen, Xiaoyan; Zhang, Ruolan

    2001-10-01

    Histogram modification is a simple but effective method for enhancing an infrared image. Several methods exist for equalizing an infrared image's histogram, suited to the different characteristics of different infrared images, such as the traditional HE (Histogram Equalization) method and the improved HP (Histogram Projection) and PE (Plateau Equalization) methods. To realize these methods in a single system, a large amount of memory and extremely fast processing are required. In our system, we introduce a DSP + FPGA based real-time processing architecture to handle these tasks together. The FPGA is used to realize the common part of these methods, while the DSP handles the parts that differ. The choice of method and its parameters can be input via a keyboard or a computer. This makes the system both powerful and easy to operate and maintain. In this article, we give the block diagram of the system and the software flow chart of the methods, and at the end we show an infrared image and its histogram before and after processing with the HE method.
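
    Of the methods mentioned, plateau equalization (PE) is easy to sketch in software terms: the histogram is clipped at a plateau value before the cumulative mapping is built, so that large uniform backgrounds do not dominate the transfer curve. The snippet below assumes an integer-valued image already quantized to `levels` gray values; the parameter values and function name are illustrative, and a plateau of 1 reduces the mapping to histogram projection (HP) while a very large plateau recovers ordinary HE.

    ```python
    import numpy as np

    def plateau_equalization(ir_image, plateau=200, levels=16384):
        """Plateau equalization: clip the histogram at `plateau`, then equalize as usual."""
        hist, _ = np.histogram(ir_image.ravel(), bins=levels, range=(0, levels))
        clipped = np.minimum(hist, plateau)
        cdf = clipped.cumsum() / clipped.sum()
        lut = np.round(255.0 * cdf).astype(np.uint8)  # map to an 8-bit display range
        return lut[ir_image]
    ```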

  16. Histogram Analysis of Diffusion Weighted Imaging at 3T is Useful for Prediction of Lymphatic Metastatic Spread, Proliferative Activity, and Cellularity in Thyroid Cancer.

    PubMed

    Schob, Stefan; Meyer, Hans Jonas; Dieckow, Julia; Pervinder, Bhogal; Pazaitis, Nikolaos; Höhn, Anne Kathrin; Garnov, Nikita; Horvath-Rizea, Diana; Hoffmann, Karl-Titus; Surov, Alexey

    2017-04-12

    Pre-surgical diffusion weighted imaging (DWI) is increasingly important in the context of thyroid cancer for identification of the optimal treatment strategy. It has exemplarily been shown that DWI at 3T can distinguish undifferentiated from well-differentiated thyroid carcinoma, which has decisive implications for the magnitude of surgery. This study used DWI histogram analysis of whole tumor apparent diffusion coefficient (ADC) maps. The primary aim was to discriminate thyroid carcinomas which had already gained the capacity to metastasize lymphatically from those not yet being able to spread via the lymphatic system. The secondary aim was to reflect prognostically important tumor-biological features like cellularity and proliferative activity with ADC histogram analysis. Fifteen patients with follicular-cell derived thyroid cancer were enrolled. Lymph node status, extent of infiltration of surrounding tissue, and Ki-67 and p53 expression were assessed in these patients. DWI was obtained in a 3T system using b values of 0, 400, and 800 s/mm². Whole tumor ADC volumes were analyzed using a histogram-based approach. Several ADC parameters showed significant correlations with immunohistopathological parameters. Most importantly, ADC histogram skewness and ADC histogram kurtosis were able to differentiate between nodal negative and nodal positive thyroid carcinoma. Histogram analysis of whole ADC tumor volumes has the potential to provide valuable information on tumor biology in thyroid carcinoma. However, further studies are warranted.

  17. Histogram Analysis of Diffusion Weighted Imaging at 3T is Useful for Prediction of Lymphatic Metastatic Spread, Proliferative Activity, and Cellularity in Thyroid Cancer

    PubMed Central

    Schob, Stefan; Meyer, Hans Jonas; Dieckow, Julia; Pervinder, Bhogal; Pazaitis, Nikolaos; Höhn, Anne Kathrin; Garnov, Nikita; Horvath-Rizea, Diana; Hoffmann, Karl-Titus; Surov, Alexey

    2017-01-01

    Pre-surgical diffusion weighted imaging (DWI) is increasingly important in the context of thyroid cancer for identification of the optimal treatment strategy. It has exemplarily been shown that DWI at 3T can distinguish undifferentiated from well-differentiated thyroid carcinoma, which has decisive implications for the magnitude of surgery. This study used DWI histogram analysis of whole tumor apparent diffusion coefficient (ADC) maps. The primary aim was to discriminate thyroid carcinomas which had already gained the capacity to metastasize lymphatically from those not yet being able to spread via the lymphatic system. The secondary aim was to reflect prognostically important tumor-biological features like cellularity and proliferative activity with ADC histogram analysis. Fifteen patients with follicular-cell derived thyroid cancer were enrolled. Lymph node status, extent of infiltration of surrounding tissue, and Ki-67 and p53 expression were assessed in these patients. DWI was obtained in a 3T system using b values of 0, 400, and 800 s/mm2. Whole tumor ADC volumes were analyzed using a histogram-based approach. Several ADC parameters showed significant correlations with immunohistopathological parameters. Most importantly, ADC histogram skewness and ADC histogram kurtosis were able to differentiate between nodal negative and nodal positive thyroid carcinoma. Conclusions: histogram analysis of whole ADC tumor volumes has the potential to provide valuable information on tumor biology in thyroid carcinoma. However, further studies are warranted. PMID:28417929

  18. Enhancing tumor apparent diffusion coefficient histogram skewness stratifies the postoperative survival in recurrent glioblastoma multiforme patients undergoing salvage surgery.

    PubMed

    Zolal, Amir; Juratli, Tareq A; Linn, Jennifer; Podlesek, Dino; Sitoci Ficici, Kerim Hakan; Kitzler, Hagen H; Schackert, Gabriele; Sobottka, Stephan B; Rieger, Bernhard; Krex, Dietmar

    2016-05-01

    Objective: To determine the value of apparent diffusion coefficient (ADC) histogram parameters for the prediction of individual survival in patients undergoing surgery for recurrent glioblastoma (GBM) in a retrospective cohort study. Methods: Thirty-one patients who underwent surgery for first recurrence of a known GBM between 2008 and 2012 were included. The following parameters were collected: age, sex, enhancing tumor size, mean ADC, median ADC, ADC skewness, ADC kurtosis and fifth percentile of the ADC histogram, initial progression-free survival (PFS), extent of second resection and further adjuvant treatment. The association of these parameters with survival and PFS after second surgery was analyzed using the log-rank test and Cox regression. Results: Using the log-rank test, ADC histogram skewness of the enhancing tumor was significantly associated with both survival (p = 0.001) and PFS after second surgery (p = 0.005). Further parameters associated with prolonged survival after second surgery were: gross total resection at second surgery (p = 0.026), tumor size (p = 0.040) and third surgery (p = 0.003). In the multivariate Cox analysis, ADC histogram skewness was shown to be an independent prognostic factor for survival after second surgery. Conclusion: ADC histogram skewness of the enhancing lesion, enhancing lesion size, third surgery, as well as gross total resection have been shown to be associated with survival following the second surgery. ADC histogram skewness was an independent prognostic factor for survival in the multivariate analysis.
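
    For orientation, the kind of analysis described, a log-rank comparison on a split of ADC skewness followed by a multivariate Cox model, could be sketched with the lifelines package as below; the column names, the median split, and the covariate set are assumptions for illustration, not the authors' exact procedure.

    ```python
    from lifelines import CoxPHFitter
    from lifelines.statistics import logrank_test

    def skewness_survival_analysis(df):
        """df is assumed to contain: 'survival_months', 'event' (1 = death observed),
        'adc_skewness', 'gross_total_resection' (0/1), and 'tumor_size'."""
        high = df["adc_skewness"] > df["adc_skewness"].median()  # illustrative split
        lr = logrank_test(
            df.loc[high, "survival_months"], df.loc[~high, "survival_months"],
            event_observed_A=df.loc[high, "event"], event_observed_B=df.loc[~high, "event"])
        cph = CoxPHFitter().fit(
            df[["survival_months", "event", "adc_skewness", "gross_total_resection", "tumor_size"]],
            duration_col="survival_months", event_col="event")
        return lr.p_value, cph.summary
    ```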

  19. Approach to Privacy-Preserve Data in Two-Tiered Wireless Sensor Network Based on Linear System and Histogram

    NASA Astrophysics Data System (ADS)

    Dang, Van H.; Wohlgemuth, Sven; Yoshiura, Hiroshi; Nguyen, Thuc D.; Echizen, Isao

    The wireless sensor network (WSN) has been one of the key technologies for the future, with broad applications from the military to everyday life [1,2,3,4,5]. There are two kinds of WSN models: models with sensors for sensing data and a sink for receiving and processing queries from users; and models with special additional nodes capable of storing large amounts of data from sensors and processing queries from the sink. Among the latter type, a two-tiered model [6,7] has been widely adopted because of its storage and energy-saving benefits for weak sensors, as proved by the advent of commercial storage node products such as Stargate [8] and RISE. However, by concentrating storage in certain nodes, this model becomes more vulnerable to attack. Our novel technique, called zip-histogram, contributes to solving the problems of previous studies [6,7] by protecting the stored data's confidentiality and integrity (including data from the sensors and queries from the sink) against attackers who might target storage nodes in two-tiered WSNs.

  20. The Influence of Tungsten on the Chemical Composition of a Temporally Evolving Nanostructure of a Model Ni-Al-Cr Superalloy

    NASA Technical Reports Server (NTRS)

    Sudbrack, Chantal K.; Isheim, Dieter; Noebe, Ronald D.; Jacobson, Nathan S.; Seidman, David N.

    2004-01-01

    The influence of W on the temporal evolution of gamma' precipitation toward equilibrium in a model Ni-Al-Cr alloy is investigated by three-dimensional atom-probe (3DAP) microscopy and transmission electron microscopy (TEM). We report on the alloys Ni-10 Al-8.5 Cr (at.%) and Ni-10 Al-8.5 Cr-2 W (at.%), which were aged isothermally in the gamma + gamma' two-phase field at 1073 K, for times ranging from 0.25 to 264 h. Spheroidal-shaped gamma' precipitates, 5-15 nm in diameter, form during quenching from above the solvus temperature in both alloys at a high number density (approximately 10²³ per cubic meter). As gamma' precipitates grow with aging at 1073 K, a transition from spheroidal- to cuboidal-shaped precipitates is observed in both alloys. The elemental partitioning and spatially resolved concentration profiles across the gamma' precipitates are obtained as a function of aging time from three-dimensional atom-by-atom reconstructions. Proximity histogram concentration profiles of the quaternary alloy demonstrate that W concentration gradients exist in gamma' precipitates in the as-quenched and 0.25-h aging states, which disappear after 1 h of aging. The diffusion coefficient of W in gamma' is estimated to be 6.2 × 10⁻²⁰ m²/s at 1073 K. The W addition decreases the coarsening rate constant, and leads to stronger partitioning of Al to gamma' and Cr to gamma.

  1. SU-F-J-207: Non-Small Cell Lung Cancer Patient Survival Prediction with Quantitative Tumor Textures Analysis in Baseline CT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Y; Zou, J; Murillo, P

    Purpose: Chemo-radiation therapy (CRT) is widely used in treating patients with locally advanced non-small cell lung cancer (NSCLC). Determining the likelihood of patient response to treatment and optimizing the treatment regimen are of clinical significance. To date, no imaging biomarker has reliably correlated with NSCLC patient survival. This pilot study extracts CT texture information from tumor regions for patient survival prediction. Methods: Thirteen patients with stage II-III NSCLC were treated using CRT with a median dose of 6210 cGy. Non-contrast-enhanced CT images were acquired for treatment planning and retrospectively collected for this study. Texture analysis was applied in segmented tumor regions using the Local Binary Pattern (LBP) method. By comparing the HU of a voxel with those of its neighboring voxels, the LBPs were measured at multiple scales with different group radii and numbers of neighbors. The LBP histograms formed a multi-dimensional texture vector for each patient, which was then used to establish and test a Support Vector Machine (SVM) model to predict patients' one-year survival. The leave-one-out cross-validation strategy was used recursively to enlarge the training set and derive a reliable predictor. The predictions were compared with the true clinical outcomes. Results: A 10-dimensional LBP histogram was extracted from the 3D segmented tumor region of each of the 13 patients. Using the SVM model with the leave-one-out strategy, only 1 out of 13 patients was misclassified. The experiments showed an accuracy of 93%, sensitivity of 100%, and specificity of 86%. Conclusion: Within the framework of a Support Vector Machine based model, the Local Binary Pattern method is able to extract a quantitative imaging biomarker for the prediction of NSCLC patient survival. More patients are to be included in the study.
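
    A hedged sketch of the pipeline described, uniform LBP histograms as features and a leave-one-out-validated SVM as the classifier, is given below using scikit-image and scikit-learn; the single-slice 2D LBP, the RBF kernel, and the function names are simplifying assumptions (the study computed multi-scale LBPs over the 3D segmented tumor region).

    ```python
    import numpy as np
    from skimage.feature import local_binary_pattern
    from sklearn.model_selection import LeaveOneOut
    from sklearn.svm import SVC

    def lbp_histogram(tumor_slice, n_points=8, radius=1):
        """Normalized histogram of 'uniform' LBP codes (P + 2 = 10 bins for P = 8)."""
        codes = local_binary_pattern(tumor_slice, n_points, radius, method="uniform")
        hist, _ = np.histogram(codes, bins=np.arange(n_points + 3), density=True)
        return hist

    def loocv_accuracy(feature_vectors, one_year_survival):
        """Leave-one-out cross-validated SVM, mirroring the validation scheme above."""
        X = np.asarray(feature_vectors)
        y = np.asarray(one_year_survival)
        predictions = np.empty_like(y)
        for train_idx, test_idx in LeaveOneOut().split(X):
            clf = SVC(kernel="rbf").fit(X[train_idx], y[train_idx])
            predictions[test_idx] = clf.predict(X[test_idx])
        return (predictions == y).mean()
    ```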

  2. Transforming growth factor-beta-1 is a serum biomarker of radiation-induced pneumonitis in esophageal cancer patients treated with thoracic radiotherapy: preliminary results of a prospective study.

    PubMed

    Li, Jingxia; Mu, Shuangfeng; Mu, Lixiang; Zhang, Xiaohui; Pang, Ranran; Gao, Shegan

    2015-01-01

    To examine the relationship between plasma levels of transforming growth factor-beta-1 (TGF-β1), interleukin-1 beta (IL-1β), and angiotensin-converting enzyme (ACE) in esophageal carcinoma patients and radiation-induced pneumonitis (RP). Sixty-three patients with esophageal carcinoma were treated with three-dimensional conformal radiotherapy (RT) using the Elekta Precise treatment planning system with a prescribed dose of 50-70 Gy. Dose-volume histograms from the three-dimensional conformal RT plans were used to determine the percentages of lung volume receiving at least 5, 10, and 20 Gy (V5, V10, and V20) and the normal tissue complication probability. RP was diagnosed based on computed tomography imaging, respiratory symptoms, and signs. The severity of radiation-induced lung toxicity was determined using the LENT-SOMA scale defined by the Radiation Therapy Oncology Group. Plasma samples obtained before RT, during RT (at 40 Gy), and at 1 day, 1 month, and 3 months after RT were assayed for TGF-β1, IL-1β, and ACE levels by enzyme-linked immunosorbent assay. Of the 63 patients, 17 (27%) developed RP; 13 (21%) had grade I RP and 4 (6%) had grade II or higher. Plasma TGF-β1 levels were elevated in the patients who developed RP compared with the 46 patients who did not. The plasma IL-1β levels did not change. The ACE levels were significantly lower in the 17 patients with RP than in the 46 patients without RP throughout the RT. As expected, RP was associated with a higher dose of irradiation (>60 Gy); no other factors, including dose-volume histogram parameters, age, sex, smoking status, tumor location, and treatment method, were associated with RP. Elevated plasma TGF-β1 levels can be used as a marker for RP.

  3. Inverse optimization of objective function weights for treatment planning using clinical dose-volume histograms

    NASA Astrophysics Data System (ADS)

    Babier, Aaron; Boutilier, Justin J.; Sharpe, Michael B.; McNiven, Andrea L.; Chan, Timothy C. Y.

    2018-05-01

    We developed and evaluated a novel inverse optimization (IO) model to estimate objective function weights from clinical dose-volume histograms (DVHs). These weights were used to solve a treatment planning problem to generate ‘inverse plans’ that had similar DVHs to the original clinical DVHs. Our methodology was applied to 217 clinical head and neck cancer treatment plans that were previously delivered at Princess Margaret Cancer Centre in Canada. Inverse plan DVHs were compared to the clinical DVHs using objective function values, dose-volume differences, and frequency of clinical planning criteria satisfaction. Median differences between the clinical and inverse DVHs were within 1.1 Gy. For most structures, the difference in clinical planning criteria satisfaction between the clinical and inverse plans was at most 1.4%. For structures where the two plans differed by more than 1.4% in planning criteria satisfaction, the difference in average criterion violation was less than 0.5 Gy. Overall, the inverse plans were very similar to the clinical plans. Compared with a previous inverse optimization method from the literature, our new inverse plans typically satisfied the same or more clinical criteria, and had consistently lower fluence heterogeneity. Overall, this paper demonstrates that DVHs, which are essentially summary statistics, provide sufficient information to estimate objective function weights that result in high quality treatment plans. However, as with any summary statistic that compresses three-dimensional dose information, care must be taken to avoid generating plans with undesirable features such as hotspots; our computational results suggest that such undesirable spatial features were uncommon. Our IO-based approach can be integrated into the current clinical planning paradigm to better initialize the planning process and improve planning efficiency. It could also be embedded in a knowledge-based planning or adaptive radiation therapy framework to automatically generate a new plan given a predicted or updated target DVH, respectively.

  4. Inverse optimization of objective function weights for treatment planning using clinical dose-volume histograms.

    PubMed

    Babier, Aaron; Boutilier, Justin J; Sharpe, Michael B; McNiven, Andrea L; Chan, Timothy C Y

    2018-05-10

    We developed and evaluated a novel inverse optimization (IO) model to estimate objective function weights from clinical dose-volume histograms (DVHs). These weights were used to solve a treatment planning problem to generate 'inverse plans' that had similar DVHs to the original clinical DVHs. Our methodology was applied to 217 clinical head and neck cancer treatment plans that were previously delivered at Princess Margaret Cancer Centre in Canada. Inverse plan DVHs were compared to the clinical DVHs using objective function values, dose-volume differences, and frequency of clinical planning criteria satisfaction. Median differences between the clinical and inverse DVHs were within 1.1 Gy. For most structures, the difference in clinical planning criteria satisfaction between the clinical and inverse plans was at most 1.4%. For structures where the two plans differed by more than 1.4% in planning criteria satisfaction, the difference in average criterion violation was less than 0.5 Gy. Overall, the inverse plans were very similar to the clinical plans. Compared with a previous inverse optimization method from the literature, our new inverse plans typically satisfied the same or more clinical criteria, and had consistently lower fluence heterogeneity. Overall, this paper demonstrates that DVHs, which are essentially summary statistics, provide sufficient information to estimate objective function weights that result in high quality treatment plans. However, as with any summary statistic that compresses three-dimensional dose information, care must be taken to avoid generating plans with undesirable features such as hotspots; our computational results suggest that such undesirable spatial features were uncommon. Our IO-based approach can be integrated into the current clinical planning paradigm to better initialize the planning process and improve planning efficiency. It could also be embedded in a knowledge-based planning or adaptive radiation therapy framework to automatically generate a new plan given a predicted or updated target DVH, respectively.

  5. SU-E-T-375: Passive Scattering to Pencil-Beam-Scanning Comparison for Medulloblastoma Proton Therapy: LET Distributions and Radiobiological Implications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giantsoudi, D; MacDonald, S; Paganetti, H

    2014-06-01

    Purpose: To compare the linear energy transfer (LET) distributions between passive scattering and pencil beam scanning proton radiation therapy techniques for medulloblastoma patients and study the potential radiobiological implications. Methods: A group of medulloblastoma patients, previously treated with passive scattering (PS) proton craniospinal irradiation followed by a posterior fossa or involved-field boost, was selected from the patient database of our institution. Using the beam geometry and planning computed tomography (CT) image sets of the original treatment plans, pencil beam scanning (PBS) treatment plans were generated for the cranial treatment of each patient, with an average beam spot size of 8 mm (sigma in air at isocenter). Three-dimensional dose and LET distributions were calculated by Monte Carlo methods (TOPAS) for both the original passive scattering and the new pencil beam scanning treatment plans. LET volume histograms were calculated for the target and OARs and compared for the two delivery methods. Variable RBE-weighted dose distributions and volume histograms were also calculated using a variable dose- and LET-based model. Results: Better dose conformity was achieved with PBS planning compared to PS, leading to increased dose coverage for the boost target area and decreased average dose to the structures adjacent to it and to critical structures outside the whole-brain treatment field. LET values for the target were lower for PBS plans. Elevated LET values were noticed for OARs close to the boosted target areas, due to the end of range of proton beams falling inside these structures, resulting in higher RBE-weighted dose for these structures compared to the clinical RBE value of 1.1. Conclusion: Transitioning from passive scattering to pencil beam scanning proton radiation treatment can be dosimetrically beneficial for medulloblastoma patients. LET-guided treatment planning could contribute to better decision making for these cases, especially for critical structures in close proximity to the boosted target area.

  6. Image Enhancement via Subimage Histogram Equalization Based on Mean and Variance

    PubMed Central

    2017-01-01

    This paper puts forward a novel image enhancement method, Mean and Variance based Subimage Histogram Equalization (MVSIHE), which effectively increases the contrast of the input image while preserving brightness and details better than some other methods based on histogram equalization (HE). Firstly, the histogram of the input image is divided into four segments based on the mean and variance of the luminance component, and the histogram bins of each segment are modified and equalized, respectively. Secondly, the result is obtained via the concatenation of the processed subhistograms. Lastly, normalization is applied to the intensity levels, and the processed image is integrated with the input image. One hundred benchmark images from the public CVG-UGR-Database are used for comparison with other state-of-the-art methods. The experimental results show that the algorithm can not only enhance image information effectively but also preserve the brightness and details of the original image well. PMID:29403529
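
    A simplified sketch of the core idea, under assumptions: the grey-level range is split at mean - std, mean, and mean + std of the luminance, and each segment is equalized within its own sub-range. The published MVSIHE bin-modification, normalization, and image-fusion steps are omitted here.

```python
import numpy as np

def equalize_segment(img, out, lo, hi):
    """Equalize pixels with lo <= value < hi using their empirical CDF,
    mapping them back into the same [lo, hi) sub-range."""
    mask = (img >= lo) & (img < hi)
    if not mask.any():
        return
    vals = img[mask]
    levels = np.unique(vals)  # sorted unique grey levels in this segment
    cdf = np.searchsorted(np.sort(vals), levels, side="right") / vals.size
    out[mask] = np.interp(vals, levels, lo + cdf * (hi - lo))

def subimage_equalize(img):
    """Four-segment equalization driven by the mean and std of the luminance."""
    img = img.astype(float)
    m, s = img.mean(), img.std()
    cuts = [img.min(), max(img.min(), m - s), m, min(img.max(), m + s), img.max() + 1]
    out = np.empty_like(img)
    for lo, hi in zip(cuts[:-1], cuts[1:]):
        equalize_segment(img, out, lo, hi)
    return out
```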

  7. Unsupervised chunking based on graph propagation from bilingual corpus.

    PubMed

    Zhu, Ling; Wong, Derek F; Chao, Lidia S

    2014-01-01

    This paper presents a novel unsupervised shallow parsing model trained on the unannotated Chinese text of a parallel Chinese-English corpus. In this approach, no annotated information from the Chinese side is used. The exploitation of graph-based label propagation for bilingual knowledge transfer, together with the use of the projected labels as features in the unsupervised model, contributes to better performance. Experimental comparisons with state-of-the-art algorithms show that the proposed approach achieves impressively higher accuracy in terms of F-score.

  8. At-TAX: a whole genome tiling array resource for developmental expression analysis and transcript identification in Arabidopsis thaliana

    PubMed Central

    Laubinger, Sascha; Zeller, Georg; Henz, Stefan R; Sachsenberg, Timo; Widmer, Christian K; Naouar, Naïra; Vuylsteke, Marnik; Schölkopf, Bernhard; Rätsch, Gunnar; Weigel, Detlef

    2008-01-01

    Gene expression maps for model organisms, including Arabidopsis thaliana, have typically been created using gene-centric expression arrays. Here, we describe a comprehensive expression atlas, Arabidopsis thaliana Tiling Array Express (At-TAX), which is based on whole-genome tiling arrays. We demonstrate that tiling arrays are accurate tools for gene expression analysis and identified more than 1,000 unannotated transcribed regions. Visualizations of gene expression estimates, transcribed regions, and tiling probe measurements are accessible online at the At-TAX homepage. PMID:18613972

  9. Diagnosis of Tempromandibular Disorders Using Local Binary Patterns.

    PubMed

    Haghnegahdar, A A; Kolahi, S; Khojastepour, L; Tajeripour, F

    2018-03-01

    Temporomandibular joint disorder (TMD) might be manifested as structural changes in bone through modification, adaptation or direct destruction. We propose to use Local Binary Pattern (LBP) characteristics and histograms of oriented gradients computed on the recorded images as a diagnostic tool in TMD assessment. CBCT images of 66 patients (132 joints) with TMD and 66 normal cases (132 joints) were collected, and 2 coronal cuts were prepared from each condyle; images were limited to the head of the mandibular condyle. To extract image features, we first apply LBP and then the histogram of oriented gradients. To reduce dimensionality, Singular Value Decomposition (SVD) is applied to the feature-vector matrix of all images. For evaluation, we used K nearest neighbor (K-NN), Support Vector Machine, Naïve Bayesian and Random Forest classifiers, and Receiver Operating Characteristic (ROC) analysis to evaluate the hypothesis. The K nearest neighbor classifier achieves very good accuracy (0.9242), with desirable sensitivity (0.9470) and specificity (0.9015), whereas the other classifiers have lower accuracy, sensitivity and specificity. We proposed a fully automatic approach to detect TMD using image processing techniques based on local binary patterns and feature extraction. K-NN was the best classifier in our experiments for distinguishing patients from healthy individuals, with 92.42% accuracy, 94.70% sensitivity and 90.15% specificity. The proposed method can help automatically diagnose TMD at its initial stages.
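
    A minimal sketch of the last two steps described above, SVD-based dimensionality reduction followed by a k-nearest-neighbor classifier; the feature matrix, number of retained components, and k are placeholder assumptions, not the study's settings.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def reduce_with_svd(features, n_components=20):
    """Project the (n_samples, n_features) matrix onto its top singular vectors."""
    X = np.asarray(features, dtype=float)
    U, S, Vt = np.linalg.svd(X - X.mean(axis=0), full_matrices=False)
    return U[:, :n_components] * S[:n_components]

# Hypothetical usage with a stacked LBP/HOG feature matrix and joint labels:
# X_red = reduce_with_svd(feature_matrix)
# knn = KNeighborsClassifier(n_neighbors=5).fit(X_red, labels)
```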

  10. Document image cleanup and binarization

    NASA Astrophysics Data System (ADS)

    Wu, Victor; Manmatha, Raghaven

    1998-04-01

    Image binarization is a difficult task for documents with text over textured or shaded backgrounds, poor contrast, and/or considerable noise. Current optical character recognition (OCR) and document analysis technology does not handle such documents well. We have developed a simple yet effective algorithm for document image clean-up and binarization. The algorithm consists of two basic steps. In the first step, the input image is smoothed using a low-pass filter. The smoothing operation enhances the text relative to any background texture, because background texture normally has higher frequency content than text; it also removes speckle noise. In the second step, the intensity histogram of the smoothed image is computed and a threshold is automatically selected as follows. For black text, the first peak of the histogram corresponds to text. Thresholding the image at the value of the valley between the first and second peaks of the histogram binarizes the image well. In order to reliably identify the valley, the histogram is smoothed by a low-pass filter before the threshold is computed. The algorithm has been applied to some 50 images from a wide variety of sources: digitized video frames, photos, newspapers, advertisements in magazines or sales flyers, personal checks, etc. These images contain 21,820 characters and 4,406 words; 91 percent of the characters and 86 percent of the words were successfully cleaned up and binarized. A commercial OCR system was applied to the binarized text when it consisted of OCR-recognizable fonts, yielding a recognition rate of 84 percent for characters and 77 percent for words.
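
    A minimal sketch of the two-step procedure described above, assuming an 8-bit greyscale page image with dark text: low-pass filter the image, smooth its intensity histogram, and threshold at the valley between the first two peaks. Parameter values and function names are illustrative, not the authors' implementation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, gaussian_filter1d
from scipy.signal import find_peaks

def binarize_document(image, image_sigma=1.0, hist_sigma=3.0):
    """Smooth the page, then threshold at the valley between the first two histogram peaks."""
    smoothed = gaussian_filter(image.astype(float), sigma=image_sigma)
    hist, _ = np.histogram(smoothed, bins=256, range=(0, 256))
    hist = gaussian_filter1d(hist.astype(float), sigma=hist_sigma)

    peaks, _ = find_peaks(hist)
    if len(peaks) < 2:
        threshold = smoothed.mean()  # degenerate histogram: fall back to the mean
    else:
        valley = np.argmin(hist[peaks[0]:peaks[1] + 1]) + peaks[0]
        threshold = float(valley)
    return (smoothed <= threshold).astype(np.uint8)  # 1 = dark text, 0 = background
```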

  11. The value of whole lesion ADC histogram profiling to differentiate between morphologically indistinguishable ring enhancing lesions–comparison of glioblastomas and brain abscesses

    PubMed Central

    Hoffmann, Karl-Titus; Garnov, Nikita; Vörkel, Cathrin; Kohlhof-Meinecke, Patricia; Ganslandt, Oliver; Bäzner, Hansjörg; Gihr, Georg Alexander; Kalman, Marcell; Henkes, Elina; Henkes, Hans; Schob, Stefan

    2018-01-01

    Background Morphologically similar ring enhancing lesions in the brain parenchyma can be caused by a number of distinct pathologies, yet they consistently represent life-threatening conditions. The two most frequently encountered diseases manifesting as such are glioblastoma multiforme (GBM) and brain abscess (BA), each requiring a disparate therapeutic approach. As a result of their morphological resemblance, essential treatment might be significantly delayed or even omitted if the results of conventional imaging remain inconclusive. Therefore, our study aimed to investigate whether ADC histogram profiling can reliably distinguish between the two entities, thus enhancing the differential diagnostic process and preventing treatment failure in this highly critical context. Methods 103 patients (51 BA, 52 GBM) with histopathologically confirmed diagnoses were enrolled. Pretreatment diffusion weighted imaging (DWI) was obtained on a 1.5T system using b values of 0, 500, and 1000 s/mm^2. Whole lesion ADC volumes were analyzed using a histogram-based approach. Statistical analysis was performed using SPSS version 23. Results All investigated parameters differed significantly between the two groups. Most importantly, ADCp10 was able to differentiate reliably between BA and GBM with excellent accuracy (0.948) using a cutpoint value of 70 × 10^-5 mm^2/s. Conclusions ADC whole lesion histogram profiling provides a valuable tool to differentiate between morphologically indistinguishable mass lesions. Among the investigated parameters, the 10th percentile of the ADC volume distinguished best between GBM and BA. PMID:29719596

  12. Investigating the Role of Global Histogram Equalization Technique for 99mTechnetium-Methylene diphosphonate Bone Scan Image Enhancement.

    PubMed

    Pandey, Anil Kumar; Sharma, Param Dev; Dheer, Pankaj; Parida, Girish Kumar; Goyal, Harish; Patel, Chetan; Bal, Chandrashekhar; Kumar, Rakesh

    2017-01-01

    99mTc-methylene diphosphonate (99mTc-MDP) bone scan images have a limited number of counts per pixel and hence inferior image quality compared to X-rays. Theoretically, the global histogram equalization (GHE) technique can improve the contrast of a given image, though the practical benefits of doing so have gained only limited acceptance. In this study, we investigated the effect of the GHE technique on 99mTc-MDP bone scan images. A set of 89 low-contrast 99mTc-MDP whole-body bone scan images was included in this study. These images were acquired with parallel-hole collimation on a Symbia E gamma camera and then processed with the histogram equalization technique. The image quality of the input and processed images was reviewed by two nuclear medicine physicians on a 5-point scale, where a score of 1 indicates very poor and 5 the best image quality. A statistical test was applied to assess the significance of the difference between the mean scores assigned to the input and processed images. The technique improves the contrast of the images; however, oversaturation was noticed in the processed images. Student's t-test was applied, and a statistically significant difference between the input and processed image quality was found at P < 0.001 (with α = 0.05). However, further improvement in image quality is needed to meet the requirements of nuclear medicine physicians. GHE techniques can be used on low-contrast bone scan images; in some cases, a histogram equalization technique combined with another postprocessing technique is useful.

  13. Image contrast enhancement using adjacent-blocks-based modification for local histogram equalization

    NASA Astrophysics Data System (ADS)

    Wang, Yang; Pan, Zhibin

    2017-11-01

    Infrared images usually have some non-ideal characteristics such as weak target-to-background contrast and strong noise. Because of these characteristics, it is necessary to apply contrast enhancement algorithms to improve the visual quality of infrared images. The histogram equalization (HE) algorithm is widely used for contrast enhancement due to its effectiveness and simple implementation. A drawback of the HE algorithm, however, is that the local contrast of an image cannot be equally enhanced. Local histogram equalization algorithms have proved to be effective techniques for local image contrast enhancement, but over-enhancement of noise and artifacts is easily found in images enhanced by local histogram equalization. In this paper, a new contrast enhancement technique based on local histogram equalization is proposed to overcome the drawbacks mentioned above. The input images are segmented into three kinds of overlapped sub-blocks using their gradients. To overcome the over-enhancement effect, the histograms of these sub-blocks are then modified by adjacent sub-blocks. We pay particular attention to improving the contrast of detail information while preserving the brightness of flat regions in these sub-blocks. It is shown that the proposed algorithm outperforms other related algorithms by enhancing the local contrast without introducing over-enhancement effects or additional noise.

  14. Value of MR histogram analyses for prediction of microvascular invasion of hepatocellular carcinoma.

    PubMed

    Huang, Ya-Qin; Liang, He-Yue; Yang, Zhao-Xia; Ding, Ying; Zeng, Meng-Su; Rao, Sheng-Xiang

    2016-06-01

    The objective is to explore the value of preoperative magnetic resonance (MR) histogram analyses in predicting microvascular invasion (MVI) of hepatocellular carcinoma (HCC). Fifty-one patients with histologically confirmed HCC who underwent diffusion-weighted and contrast-enhanced MR imaging were included. Histogram analyses were performed, and the mean, variance, skewness, kurtosis, and 1st, 10th, 50th, 90th, and 99th percentiles were derived. Quantitative histogram parameters were compared between HCCs with and without MVI. Receiver operating characteristic (ROC) analyses were generated to compare the diagnostic performance of tumor size, histogram analyses of apparent diffusion coefficient (ADC) maps, and MR enhancement. The mean and the 1st, 10th, and 50th percentiles of the ADC maps, and the mean, variance, and 1st, 10th, 50th, 90th, and 99th percentiles of the portal venous phase (PVP) images were significantly different between the groups with and without MVI (P < 0.05), with areas under the ROC curves (AUCs) of 0.66 to 0.74 for ADC and 0.76 to 0.88 for PVP. The largest AUC of PVP (1st percentile) showed significantly higher accuracy compared with that of the arterial phase (AP) or tumor size (P < 0.001). MR histogram analyses, in particular the 1st percentile of the PVP images, held promise for prediction of MVI of HCC.
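
    A minimal sketch of the first-order histogram metrics listed above, computed from the voxel values of a segmented lesion (on an ADC map or a portal-venous-phase image); the helper names and the commented AUC example are assumptions for illustration.

```python
import numpy as np
from scipy.stats import skew, kurtosis

def histogram_parameters(voxels):
    """First-order histogram metrics of one lesion's voxel values."""
    v = np.asarray(voxels, dtype=float).ravel()
    p1, p10, p50, p90, p99 = np.percentile(v, [1, 10, 50, 90, 99])
    return {"mean": v.mean(), "variance": v.var(),
            "skewness": skew(v), "kurtosis": kurtosis(v),
            "p1": p1, "p10": p10, "p50": p50, "p90": p90, "p99": p99}

# Hypothetical AUC of one parameter for separating MVI-positive from MVI-negative lesions:
# from sklearn.metrics import roc_auc_score
# auc = roc_auc_score(mvi_labels, [histogram_parameters(v)["p1"] for v in lesion_voxels])
```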

  15. Histogram analysis of apparent diffusion coefficient maps for differentiating primary CNS lymphomas from tumefactive demyelinating lesions.

    PubMed

    Lu, Shan Shan; Kim, Sang Joon; Kim, Namkug; Kim, Ho Sung; Choi, Choong Gon; Lim, Young Min

    2015-04-01

    This study investigated the usefulness of histogram analysis of apparent diffusion coefficient (ADC) maps for discriminating primary CNS lymphomas (PCNSLs), especially atypical PCNSLs, from tumefactive demyelinating lesions (TDLs). Forty-seven patients with PCNSLs and 18 with TDLs were enrolled in our study. Hyperintense lesions seen on T2-weighted images were defined as ROIs after the ADC maps were registered to the corresponding T2-weighted images. ADC histograms were calculated from the ROIs containing the entire lesion on every section and on a voxel-by-voxel basis. The ADC histogram parameters were compared among all PCNSLs and TDLs as well as between the subgroup of atypical PCNSLs and TDLs. ROC curves were constructed to evaluate the diagnostic performance of the histogram parameters and to determine the optimum thresholds. Differences between the PCNSLs and TDLs were found in the minimum ADC values (ADCmin) and in the 5th and 10th percentiles (ADC5% and ADC10%) of the cumulative ADC histograms, whereas no statistically significant differences were found in the mean ADC value or in the mode, kurtosis, and skewness. The ADCmin, ADC5%, and ADC10% were also lower in atypical PCNSLs than in TDLs. ADCmin was the best indicator for discriminating atypical PCNSLs from TDLs, with a threshold of 556 × 10^-6 mm^2/s (sensitivity, 81.3%; specificity, 88.9%). Histogram analysis of ADC maps may help to discriminate PCNSLs from TDLs and may be particularly useful in differentiating atypical PCNSLs from TDLs.

  16. Macronuclear chromatin structure dynamics in Colpoda inflata (Protista, Ciliophora) resting encystment.

    PubMed

    Tiano, L; Chessa, M G; Carrara, S; Tagliafierro, G; Delmonte Corrado, M U

    1999-01-01

    The chromatin structure dynamics of the Colpoda inflata macronucleus have been investigated in relation to its functional condition, concerning chromatin body extrusion regulating activity. Samples of 2- and 25-day-old resting cysts derived from a standard culture, and of 1-year-old resting cysts derived from a senescent culture, were examined by means of histogram analysis performed on acquired optical microscopy images. Three groups of histograms were detected in each sample. Histogram classification, clustering and matching were assessed in order to obtain the mean histogram of each group. Comparative analysis of the mean histogram showed a similarity in the grey level range of 25-day- and 1-year-old cysts, unlike the wider grey level range found in 2-day-old cysts. Moreover, the respective mean histograms of the three cyst samples appeared rather similar in shape. All this implies that macronuclear chromatin structural features of 1-year-old cysts are common to both cyst standard cultures. The evaluation of the acquired images and their respective histograms evidenced a dynamic state of the macronuclear chromatin, appearing differently condensed in relation to the chromatin body extrusion regulating activity of the macronucleus. The coexistence of a chromatin-decondensed macronucleus with a pycnotic extrusion body suggests that chromatin unable to decondense, thus inactive, is extruded. This finding, along with the presence of chromatin structural features common to standard and senescent cyst populations, supports the occurrence of 'rejuvenated' cell lines from 1-year-old encysted senescent cells, a phenomenon which could be a result of accomplished macronuclear renewal.

  17. Convolution Comparison Pattern: An Efficient Local Image Descriptor for Fingerprint Liveness Detection

    PubMed Central

    Gottschlich, Carsten

    2016-01-01

    We present a new type of local image descriptor which yields binary patterns from small image patches. For the application to fingerprint liveness detection, we achieve rotation invariant image patches by taking the fingerprint segmentation and orientation field into account. We compute the discrete cosine transform (DCT) for these rotation invariant patches and attain binary patterns by comparing pairs of two DCT coefficients. These patterns are summarized into one or more histograms per image. Each histogram comprises the relative frequencies of pattern occurrences. Multiple histograms are concatenated and the resulting feature vector is used for image classification. We name this novel type of descriptor convolution comparison pattern (CCP). Experimental results show the usefulness of the proposed CCP descriptor for fingerprint liveness detection. CCP outperforms other local image descriptors such as LBP, LPQ and WLD on the LivDet 2013 benchmark. The CCP descriptor is a general type of local image descriptor which we expect to prove useful in areas beyond fingerprint liveness detection such as biological and medical image processing, texture recognition, face recognition and iris recognition, liveness detection for face and iris images, and machine vision for surface inspection and material classification. PMID:26844544
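
    A rough sketch of the pattern-building idea, not the published CCP descriptor: take the 2D DCT of each small patch, compare a fixed list of coefficient pairs, pack the comparison bits into one code per patch, and histogram the codes over the image. The patch size and the particular coefficient pairs are illustrative choices.

```python
import numpy as np
from scipy.fft import dctn

# Pairs of low-frequency DCT coefficients to compare (illustrative selection).
COEFF_PAIRS = [((0, 1), (1, 0)), ((0, 2), (2, 0)), ((1, 1), (0, 1)), ((1, 2), (2, 1))]

def ccp_code(patch):
    """One binary pattern per patch: bit i is 1 if coefficient a exceeds coefficient b."""
    c = dctn(patch.astype(float), norm="ortho")
    bits = [int(c[a] > c[b]) for a, b in COEFF_PAIRS]
    return sum(b << i for i, b in enumerate(bits))

def ccp_histogram(image, patch=8):
    """Relative frequencies of the patch codes over a greyscale image."""
    codes = [ccp_code(image[r:r + patch, s:s + patch])
             for r in range(0, image.shape[0] - patch + 1, patch)
             for s in range(0, image.shape[1] - patch + 1, patch)]
    n_codes = 2 ** len(COEFF_PAIRS)
    hist, _ = np.histogram(codes, bins=n_codes, range=(0, n_codes), density=True)
    return hist
```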

  18. Predicting low-temperature free energy landscapes with flat-histogram Monte Carlo methods

    NASA Astrophysics Data System (ADS)

    Mahynski, Nathan A.; Blanco, Marco A.; Errington, Jeffrey R.; Shen, Vincent K.

    2017-02-01

    We present a method for predicting the free energy landscape of fluids at low temperatures from flat-histogram grand canonical Monte Carlo simulations performed at higher ones. We illustrate our approach for both pure and multicomponent systems using two different sampling methods as a demonstration. This allows us to predict the thermodynamic behavior of systems which undergo both first order and continuous phase transitions upon cooling using simulations performed only at higher temperatures. After surveying a variety of different systems, we identify a range of temperature differences over which the extrapolation of high temperature simulations tends to quantitatively predict the thermodynamic properties of fluids at lower ones. Beyond this range, extrapolation still provides a reasonably well-informed estimate of the free energy landscape; this prediction then requires less computational effort to refine with an additional simulation at the desired temperature than reconstruction of the surface without any initial estimate. In either case, this method significantly increases the computational efficiency of these flat-histogram methods when investigating thermodynamic properties of fluids over a wide range of temperatures. For example, we demonstrate how a binary fluid phase diagram may be quantitatively predicted for many temperatures using only information obtained from a single supercritical state.

  19. Non-small cell lung cancer: Whole-lesion histogram analysis of the apparent diffusion coefficient for assessment of tumor grade, lymphovascular invasion and pleural invasion

    PubMed Central

    Tsuchiya, Naoko; Doai, Mariko; Usuda, Katsuo; Uramoto, Hidetaka

    2017-01-01

    Purpose To investigate the diagnostic accuracy of histogram analyses of apparent diffusion coefficient (ADC) values for determining non-small cell lung cancer (NSCLC) tumor grade, lymphovascular invasion, and pleural invasion. Materials and methods We studied 60 surgically diagnosed NSCLC patients. Diffusion-weighted imaging (DWI) was performed in the axial plane using a navigator-triggered single-shot, echo-planar imaging sequence with prospective acquisition correction. The ADC maps were generated, and we placed a volume-of-interest on the tumor to construct the whole-lesion histogram. Using the histogram, we calculated the mean; the 5th, 10th, 25th, 50th, 75th, 90th, and 95th percentiles of ADC; skewness; and kurtosis. Histogram parameters were correlated with tumor grade, lymphovascular invasion, and pleural invasion. We performed a receiver operating characteristic (ROC) analysis to assess the diagnostic performance of the histogram parameters for distinguishing different pathologic features. Results The ADC mean and the 10th, 25th, 50th, 75th, 90th, and 95th percentiles showed significant differences among the tumor grades. The ADC mean and the 25th, 50th, 75th, 90th, and 95th percentiles differed significantly between high- and low-grade tumors. The ROC analysis between high- and low-grade tumors showed that the 95th percentile ADC achieved the highest area under the curve (AUC), at 0.74. Lymphovascular invasion was associated with the ADC mean, the 50th, 75th, 90th, and 95th percentiles, skewness, and kurtosis; kurtosis achieved the highest AUC, at 0.809. Pleural invasion was associated only with skewness, with an AUC of 0.648. Conclusions ADC histogram analyses based on the entire tumor volume are able to stratify NSCLC tumor grade, lymphovascular invasion, and pleural invasion. PMID:28207858

  20. An Approach to Improve the Quality of Infrared Images of Vein-Patterns

    PubMed Central

    Lin, Chih-Lung

    2011-01-01

    This study develops an approach to improve the quality of infrared (IR) images of vein-patterns, which usually have noise, low contrast, low brightness and small objects of interest, thus requiring preprocessing to improve their quality. The main characteristics of the proposed approach are that no prior knowledge about the IR image is necessary and no parameters must be preset. Two main goals are pursued: impulse noise reduction and adaptive contrast enhancement. In our study, a fast median-based filter (FMBF) is developed as the noise reduction method. It is based on an IR imaging mechanism to detect the noisy pixels and on a modified median-based filter to remove the noisy pixels in IR images. FMBF has the advantage of a low computation load. In addition, FMBF retains reasonably good edges and texture information when the size of the filter window increases. The most important advantage is that the peak signal-to-noise ratio (PSNR) achieved by FMBF is higher than the PSNR achieved by the median filter. A hybrid cumulative histogram equalization (HCHE) is proposed for adaptive contrast enhancement. HCHE can automatically generate a hybrid cumulative histogram (HCH) based on two different pieces of information about the image histogram, and it improves the enhancement effect on hot objects rather than the background. The experimental results demonstrate that the proposed approach is feasible as an effective and adaptive process for enhancing the quality of IR vein-pattern images. PMID:22247674

  1. An approach to improve the quality of infrared images of vein-patterns.

    PubMed

    Lin, Chih-Lung

    2011-01-01

    This study develops an approach to improve the quality of infrared (IR) images of vein-patterns, which usually have noise, low contrast, low brightness and small objects of interest, thus requiring preprocessing to improve their quality. The main characteristics of the proposed approach are that no prior knowledge about the IR image is necessary and no parameters must be preset. Two main goals are pursued: impulse noise reduction and adaptive contrast enhancement. In our study, a fast median-based filter (FMBF) is developed as the noise reduction method. It is based on an IR imaging mechanism to detect the noisy pixels and on a modified median-based filter to remove the noisy pixels in IR images. FMBF has the advantage of a low computation load. In addition, FMBF retains reasonably good edges and texture information when the size of the filter window increases. The most important advantage is that the peak signal-to-noise ratio (PSNR) achieved by FMBF is higher than the PSNR achieved by the median filter. A hybrid cumulative histogram equalization (HCHE) is proposed for adaptive contrast enhancement. HCHE can automatically generate a hybrid cumulative histogram (HCH) based on two different pieces of information about the image histogram, and it improves the enhancement effect on hot objects rather than the background. The experimental results demonstrate that the proposed approach is feasible as an effective and adaptive process for enhancing the quality of IR vein-pattern images.
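
    A simplified sketch of the two-stage enhancement described in the two records above, under assumptions: the noise step is approximated by a plain conditional median filter (the paper's FMBF uses an IR-specific noise detector), and the contrast step by ordinary cumulative-histogram equalization rather than the full hybrid cumulative histogram (HCHE). An 8-bit greyscale input is assumed.

```python
import numpy as np
from scipy.ndimage import median_filter

def conditional_median(img, size=3, deviation=25):
    """Replace only the pixels that differ strongly from their local median."""
    med = median_filter(img, size=size)
    noisy = np.abs(img.astype(int) - med.astype(int)) > deviation
    out = img.copy()
    out[noisy] = med[noisy]
    return out

def cumulative_equalize(img):
    """Standard CDF-based equalization of an 8-bit image (stand-in for HCHE)."""
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    cdf = np.cumsum(hist).astype(float)
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())
    return (cdf[img] * 255).astype(np.uint8)

def enhance_vein_image(img):
    return cumulative_equalize(conditional_median(img))
```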

  2. Pedestrian detection from thermal images: A sparse representation based approach

    NASA Astrophysics Data System (ADS)

    Qi, Bin; John, Vijay; Liu, Zheng; Mita, Seiichi

    2016-05-01

    Pedestrian detection, a key technology in computer vision, plays a paramount role in advanced driver assistance systems (ADASs) and autonomous vehicles. The objective of pedestrian detection is to identify and locate people in a dynamic environment so that accidents can be avoided. With significant variations introduced by illumination, occlusion, articulated pose, and complex backgrounds, pedestrian detection is a challenging task for visual perception. Unlike visible images, thermal images are captured and presented as intensity maps based on objects' emissivity, and thus have an enhanced spectral range that makes human beings perceptible against the cool background. In this study, a sparse representation based approach is proposed for pedestrian detection from thermal images. We first adopt the histogram of sparse codes to represent image features and then detect pedestrians with the extracted features in a unimodal and a multimodal framework, respectively. In the unimodal framework, two types of dictionaries, i.e. a joint dictionary and an individual dictionary, are built by learning from prepared training samples. In the multimodal framework, a weighted fusion scheme is proposed to further highlight the contributions from features with higher separability. To validate the proposed approach, experiments were conducted to compare with three widely used features, Haar wavelets (HWs), histogram of oriented gradients (HOG), and histogram of phase congruency (HPC), as well as two classification methods, AdaBoost and support vector machine (SVM). Experimental results on a publicly available data set demonstrate the superiority of the proposed approach.

  3. Monte Carlo Sampling in Fractal Landscapes

    NASA Astrophysics Data System (ADS)

    Leitão, Jorge C.; Lopes, J. M. Viana Parente; Altmann, Eduardo G.

    2013-05-01

    We design a random walk to explore fractal landscapes such as those describing chaotic transients in dynamical systems. We show that the random walk moves efficiently only when its step length depends on the height of the landscape via the largest Lyapunov exponent of the chaotic system. We propose a generalization of the Wang-Landau algorithm which constructs not only the density of states (transient time distribution) but also the correct step length. As a result, we obtain a flat-histogram Monte Carlo method which samples fractal landscapes in polynomial time, a dramatic improvement over the exponential scaling of traditional uniform-sampling methods. Our results are not limited by the dimensionality of the landscape and are confirmed numerically in chaotic systems with up to 30 dimensions.

  4. Reconnection at the earth's magnetopause - Magnetic field observations and flux transfer events

    NASA Technical Reports Server (NTRS)

    Russell, C. T.

    1984-01-01

    Theoretical models of plasma acceleration by magnetic-field-line reconnection at the earth's magnetopause and the high-resolution three-dimensional plasma measurements obtained with the ISEE satellites are compared and illustrated with diagrams, graphs, drawings, and histograms. The history of reconnection theory and the results of early satellite observations are summarized; the thickness of the magnetopause current layer is discussed; problems in analyzing the polarization of current-layer rotation are considered; and the flux-transfer events responsible for periods of patchy reconnection are characterized in detail. The need for further observations and refinements of the theory to explain the initiation of reconnection and to identify the mechanism determining whether it is patchy or steady-state is indicated.

  5. Iterative dataset optimization in automated planning: Implementation for breast and rectal cancer radiotherapy.

    PubMed

    Fan, Jiawei; Wang, Jiazhou; Zhang, Zhen; Hu, Weigang

    2017-06-01

    To develop a new automated treatment planning solution for breast and rectal cancer radiotherapy. The automated treatment planning solution developed in this study includes selection of an iteratively optimized training dataset, dose volume histogram (DVH) prediction for the organs at risk (OARs), and automatic generation of clinically acceptable treatment plans. The iteratively optimized training dataset is selected by iterative optimization from 40 treatment plans for left-breast and rectal cancer patients who received radiation therapy. A two-dimensional kernel density estimation algorithm (denoted two-parameter KDE), which incorporates two predictive features, was implemented to produce the predicted DVHs. Finally, 10 additional new left-breast treatment plans were re-planned using the Pinnacle3 Auto-Planning (AP) module (version 9.10, Philips Medical Systems) with the objective functions derived from the predicted DVH curves. Automatically generated re-optimized treatment plans were compared with the original manually optimized plans. By combining the iteratively optimized training dataset methodology and the two-parameter KDE prediction algorithm, our proposed automated planning strategy improves the accuracy of the DVH prediction. The treatment plans generated automatically using objectives derived from the predicted DVHs can achieve better dose sparing for some OARs without compromising other metrics of plan quality. The proposed automated treatment planning solution can be used to efficiently evaluate and improve the quality and consistency of the treatment plans for intensity-modulated breast and rectal cancer radiation therapy. © 2017 American Association of Physicists in Medicine.
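
    Not the paper's KDE-based prediction model, just a minimal sketch of the object it predicts: a cumulative dose-volume histogram computed from a dose grid and an organ-at-risk mask. The array names, bin width, and the V20 example are assumptions.

```python
import numpy as np

def cumulative_dvh(dose, mask, bin_gy=0.1):
    """Fraction of the masked volume receiving at least each dose level (Gy)."""
    organ_dose = dose[mask > 0]
    edges = np.arange(0.0, organ_dose.max() + bin_gy, bin_gy)
    volume_fraction = np.array([(organ_dose >= d).mean() for d in edges])
    return edges, volume_fraction

# Hypothetical DVH metric: V20, the fraction of the organ receiving at least 20 Gy.
# edges, vf = cumulative_dvh(dose_grid, oar_mask)
# v20 = vf[np.searchsorted(edges, 20.0)]
```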

  6. A survey and evaluations of histogram-based statistics in alignment-free sequence comparison.

    PubMed

    Luczak, Brian B; James, Benjamin T; Girgis, Hani Z

    2017-12-06

    Since the dawn of the bioinformatics field, sequence alignment scores have been the main method for comparing sequences. However, alignment algorithms are quadratic, requiring long execution times. As alternatives, scientists have developed tens of alignment-free statistics for measuring the similarity between two sequences. We surveyed tens of alignment-free k-mer statistics. Additionally, we evaluated 33 statistics and multiplicative combinations of the statistics and/or their squares. These statistics are calculated on two k-mer histograms representing two sequences. Our evaluations using global alignment scores revealed that the majority of the statistics are sensitive and capable of finding sequences similar to a query sequence. Therefore, any of these statistics can filter out dissimilar sequences quickly. Further, we observed that multiplicative combinations of the statistics are highly correlated with the identity score. Furthermore, combinations involving sequence length difference or Earth Mover's distance, which takes the length difference into account, are always among the paired statistics most highly correlated with identity scores. Similarly, paired statistics including length difference or Earth Mover's distance are among the best performers in finding the K closest sequences. Interestingly, similar performance can be obtained using histograms of shorter words, which remarkably reduces the memory requirement and increases the speed. Moreover, we found that simple single statistics are sufficient for processing next-generation sequencing reads and for applications relying on local alignment. Finally, we measured the time requirement of each statistic. The survey and the evaluations will help scientists identify efficient alternatives to the costly alignment algorithm, saving thousands of computational hours. The source code of the benchmarking tool is available as Supplementary Materials. © The Author 2017. Published by Oxford University Press.
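
    A minimal sketch, not the benchmarked tool, of the basic ingredients discussed above: a fixed-length k-mer histogram for a DNA sequence and two simple alignment-free statistics computed on a pair of such histograms. The choice of k and the two statistics shown are illustrative.

```python
import numpy as np
from itertools import product

def kmer_histogram(seq, k=4):
    """Counts of every k-mer over the alphabet ACGT, as a fixed-length vector."""
    index = {"".join(p): i for i, p in enumerate(product("ACGT", repeat=k))}
    hist = np.zeros(len(index))
    for i in range(len(seq) - k + 1):
        kmer = seq[i:i + k]
        if kmer in index:  # skip k-mers containing N or other symbols
            hist[index[kmer]] += 1
    return hist

def euclidean_distance(h1, h2):
    return np.linalg.norm(h1 - h2)

def cosine_similarity(h1, h2):
    return h1 @ h2 / (np.linalg.norm(h1) * np.linalg.norm(h2))

# h_a, h_b = kmer_histogram(seq_a), kmer_histogram(seq_b)
# d, s = euclidean_distance(h_a, h_b), cosine_similarity(h_a, h_b)
```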

  7. Scaling images using their background ratio. An application in statistical comparisons of images.

    PubMed

    Kalemis, A; Binnie, D; Bailey, D L; Flower, M A; Ott, R J

    2003-06-07

    Comparison of two medical images often requires image scaling as a pre-processing step. This is usually done with the scaling-to-the-mean or scaling-to-the-maximum techniques, which, under certain circumstances, may introduce a significant amount of bias in quantitative applications. In this paper, we present a simple scaling method which assumes only that the most predominant values in the corresponding images belong to their background structure. The ratio of the two images to be compared is calculated and its frequency histogram is plotted. The scaling factor is given by the position of the peak in this histogram which belongs to the background structure. The method was tested against the traditional scaling-to-the-mean technique on simulated planar gamma-camera images, which were compared using pixelwise statistical parametric tests. Both sensitivity and specificity for each condition were measured over a range of different contrasts and sizes of inhomogeneity for the two scaling techniques. The new method was found to preserve sensitivity in all cases, while the traditional technique resulted in significant degradation of sensitivity in certain cases.
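
    A minimal sketch of the scaling idea described above, under assumptions: form the voxel-wise ratio of the two images, histogram it, and take the position of the dominant (background) peak as the scaling factor. Bin count and masking of zero-valued pixels are illustrative choices.

```python
import numpy as np

def background_ratio_scale(image_a, image_b, bins=256):
    """Scale factor that matches image_b's background level to image_a's."""
    valid = image_b > 0  # avoid division by zero
    ratio = image_a[valid] / image_b[valid]
    hist, edges = np.histogram(ratio, bins=bins)
    peak = np.argmax(hist)  # the most frequent ratio is taken as the background ratio
    return 0.5 * (edges[peak] + edges[peak + 1])

# scaled_b = image_b * background_ratio_scale(image_a, image_b)
```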

  8. Subtype Differentiation of Small (≤ 4 cm) Solid Renal Mass Using Volumetric Histogram Analysis of DWI at 3-T MRI.

    PubMed

    Li, Anqin; Xing, Wei; Li, Haojie; Hu, Yao; Hu, Daoyu; Li, Zhen; Kamel, Ihab R

    2018-05-29

    The purpose of this article is to evaluate the utility of volumetric histogram analysis of apparent diffusion coefficient (ADC) values derived from reduced-FOV DWI for subtyping small (≤ 4 cm) solid renal masses at 3-T MRI. This retrospective study included 38 clear cell renal cell carcinomas (RCCs), 16 papillary RCCs, 18 chromophobe RCCs, 13 minimal fat angiomyolipomas (AMLs), and seven oncocytomas evaluated with preoperative MRI. Volumetric ADC maps were generated using all slices of the reduced-FOV DW images to obtain histogram parameters, including the mean, median, 10th percentile, 25th percentile, 75th percentile, 90th percentile, and SD ADC values, as well as skewness, kurtosis, and entropy. Comparisons of these parameters were made by one-way ANOVA, t test, and ROC curve analysis. ADC histogram parameters differentiated eight of 10 pairs of renal tumors. Three subtype pairs (clear cell RCC vs papillary RCC, clear cell RCC vs chromophobe RCC, and clear cell RCC vs minimal fat AML) were differentiated by mean ADC. However, five other subtype pairs (clear cell RCC vs oncocytoma, papillary RCC vs minimal fat AML, papillary RCC vs oncocytoma, chromophobe RCC vs minimal fat AML, and chromophobe RCC vs oncocytoma) were differentiated by histogram distribution parameters exclusively (all p < 0.05). The mean ADC, median ADC, 75th and 90th percentile ADC, SD ADC, and entropy of malignant tumors were significantly higher than those of benign tumors (all p < 0.05). Combining mean ADC with histogram parameters yielded the highest AUC (0.851; sensitivity, 80.0%; specificity, 86.1%). Quantitative volumetric ADC histogram analysis may help differentiate various subtypes of small solid renal tumors, including benign and malignant lesions.

  9. Sensitivity encoded silicon photomultiplier--a new sensor for high-resolution PET-MRI.

    PubMed

    Schulz, Volkmar; Berker, Yannick; Berneking, Arne; Omidvari, Negar; Kiessling, Fabian; Gola, Alberto; Piemonte, Claudio

    2013-07-21

    Detectors for simultaneous positron emission tomography and magnetic resonance imaging, in particular those with sub-mm spatial resolution, are commonly composed of scintillator crystal arrays read out via arrays of solid state sensors, such as avalanche photo diodes (APDs) or silicon photomultipliers (SiPMs). Usually a light guide between the crystals and the sensor is used to enable the identification of crystals which are smaller than the sensor elements. However, this complicates crystal identification at the gaps and edges of the sensor arrays. A solution is to use as many sensors as crystals with a direct coupling, which unfortunately increases the complexity and power consumption of the readout electronics. Since 1997, position-sensitive APDs have been successfully used to identify sub-mm crystals. Unfortunately, these devices show a limitation in their time resolution and a degradation of spatial resolution when placed in higher magnetic fields. To overcome these limitations, this paper presents a new sensor concept that extends conventional SiPMs by adding position information via the spatial encoding of the channel sensitivity. The concept allows a direct coupling of high-resolution crystal arrays to the sensor with a reduced number of readout channels. The theory of sensitivity encoding is detailed and linked to compressed sensing to compute unique sparse solutions. Two devices have been designed using one- and two-dimensional linear sensitivity encoding with eight and four readout channels, respectively. Flood histograms of both devices show the capability to precisely identify all 4 × 4 LYSO crystals with dimensions of 0.93 × 0.93 × 10 mm^3. For these crystals, the energy and time resolution (MV ± SD) of the devices with one (two)-dimensional encoding have been measured to be 12.3 · (1 ± 0.047)% (13.7 · (1 ± 0.047)%) around 511 keV with a paired coincidence time resolution (full width at half maximum) of 462 · (1 ± 0.054) ps (452 · (1 ± 0.078) ps).

  10. Sensitivity encoded silicon photomultiplier—a new sensor for high-resolution PET-MRI

    NASA Astrophysics Data System (ADS)

    Schulz, Volkmar; Berker, Yannick; Berneking, Arne; Omidvari, Negar; Kiessling, Fabian; Gola, Alberto; Piemonte, Claudio

    2013-07-01

    Detectors for simultaneous positron emission tomography and magnetic resonance imaging in particular with sub-mm spatial resolution are commonly composed of scintillator crystal arrays, readout via arrays of solid state sensors, such as avalanche photo diodes (APDs) or silicon photomultipliers (SiPMs). Usually a light guide between the crystals and the sensor is used to enable the identification of crystals which are smaller than the sensor elements. However, this complicates crystal identification at the gaps and edges of the sensor arrays. A solution is to use as many sensors as crystals with a direct coupling, which unfortunately increases the complexity and power consumption of the readout electronics. Since 1997, position-sensitive APDs have been successfully used to identify sub-mm crystals. Unfortunately, these devices show a limitation in their time resolution and a degradation of spatial resolution when placed in higher magnetic fields. To overcome these limitations, this paper presents a new sensor concept that extends conventional SiPMs by adding position information via the spatial encoding of the channel sensitivity. The concept allows a direct coupling of high-resolution crystal arrays to the sensor with a reduced amount of readout channels. The theory of sensitivity encoding is detailed and linked to compressed sensing to compute unique sparse solutions. Two devices have been designed using one- and two-dimensional linear sensitivity encoding with eight and four readout channels, respectively. Flood histograms of both devices show the capability to precisely identify all 4 × 4 LYSO crystals with dimensions of 0.93 × 0.93 × 10 mm3. For these crystals, the energy and time resolution (MV ± SD) of the devices with one (two)-dimensional encoding have been measured to be 12.3 · (1 ± 0.047)% (13.7 · (1 ± 0.047)%) around 511 keV with a paired coincidence time resolution (full width at half maximum) of 462 · (1 ± 0.054) ps (452 · (1 ± 0.078) ps).

  11. The Predicted Secretome of the Plant Pathogenic Fungus Fusarium graminearum: A Refined Comparative Analysis

    PubMed Central

    Brown, Neil A.; Antoniw, John; Hammond-Kosack, Kim E.

    2012-01-01

    The fungus Fusarium graminearum forms an intimate association with the host species wheat whilst infecting the floral tissues at anthesis. During the prolonged latent period of infection, extracellular communication between live pathogen and host cells must occur, implying a role for secreted fungal proteins. The wheat cells in contact with fungal hyphae subsequently die and intracellular hyphal colonisation results in the development of visible disease symptoms. Since the original genome annotation analysis was done in 2007, which predicted the secretome using TargetP, the F. graminearum gene call has changed considerably through the combined efforts of the BROAD and MIPS institutes. As a result of the modifications to the genome and the recent findings that suggested a role for secreted proteins in virulence, the F. graminearum secretome was revisited. In the current study, a refined F. graminearum secretome was predicted by combining several bioinformatic approaches. This strategy increased the probability of identifying truly secreted proteins. A secretome of 574 proteins was predicted of which 99% was supported by transcriptional evidence. The function of the annotated and unannotated secreted proteins was explored. The potential role(s) of the annotated proteins including, putative enzymes, phytotoxins and antifungals are discussed. Characterisation of the unannotated proteins included the analysis of Pfam domains and features associated with known fungal effectors, for example, small size, cysteine-rich and containing internal amino acid repeats. A comprehensive comparative genomic analysis involving 57 fungal and oomycete genomes revealed that only a small number of the predicted F. graminearum secreted proteins can be considered to be either species or sequenced strain specific. PMID:22493673

  12. Serial data acquisition for GEM-2D detector

    NASA Astrophysics Data System (ADS)

    Kolasinski, Piotr; Pozniak, Krzysztof T.; Czarski, Tomasz; Linczuk, Maciej; Byszuk, Adrian; Chernyshova, Maryna; Juszczyk, Bartlomiej; Kasprowicz, Grzegorz; Wojenski, Andrzej; Zabolotny, Wojciech; Zienkiewicz, Pawel; Mazon, Didier; Malard, Philippe; Herrmann, Albrecht; Vezinet, Didier

    2014-11-01

    This article discusses a fast data acquisition and histogramming method for the X-ray GEM detector. The whole histogramming process is performed by FPGA chips (Spartan-6 series from Xilinx). The results of the histogramming process are stored in an internal FPGA memory and then sent to a PC, where the data are merged and processed in MATLAB. The structure of the firmware functionality implemented in the FPGAs is described, and examples of test measurements and results are presented.

  13. Frequency distribution histograms for the rapid analysis of data

    NASA Technical Reports Server (NTRS)

    Burke, P. V.; Bullen, B. L.; Poff, K. L.

    1988-01-01

    The mean and standard error are good representations for the response of a population to an experimental parameter and are frequently used for this purpose. Frequency distribution histograms show, in addition, responses of individuals in the population. Both the statistics and a visual display of the distribution of the responses can be obtained easily using a microcomputer and available programs. The type of distribution shown by the histogram may suggest different mechanisms to be tested.

  14. A Monte Carlo study of the impact of the choice of rectum volume definition on estimates of equivalent uniform doses and the volume parameter

    NASA Astrophysics Data System (ADS)

    Kvinnsland, Yngve; Muren, Ludvig Paul; Dahl, Olav

    2004-08-01

    Calculations of normal tissue complication probability (NTCP) values for the rectum are difficult because it is a hollow, non-rigid organ. Finding the true cumulative dose distribution for a number of treatment fractions requires a CT scan before each treatment fraction. This is labour intensive, and several surrogate distributions have therefore been suggested, such as dose wall histograms, dose surface histograms and histograms for the solid rectum, with and without margins. In this study, a Monte Carlo method is used to investigate the relationships, in terms of equivalent uniform dose, between the cumulative dose distributions based on all treatment fractions and the above-mentioned histograms that are based on one CT scan only. Furthermore, the effect of a specific choice of histogram on estimates of the volume parameter of the probit NTCP model was investigated. It was found that the solid rectum and the rectum wall histograms (without margins) gave equivalent uniform doses with an expected value close to the values calculated from the cumulative dose distributions in the rectum wall. With the number of patients available in this study, the standard deviations of the estimates of the volume parameter were large, and it was not possible to decide which volume gave the best estimates of the volume parameter, although there were distinct differences in the mean values obtained.
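
    For reference, a minimal sketch of the equivalent-uniform-dose metric used for the comparison above, in its common generalized form EUD = ((1/N) * sum_i d_i^a)^(1/a); the value of the parameter a and the use of rectum-wall voxel doses are assumptions for illustration, not the study's settings.

```python
import numpy as np

def generalized_eud(doses, a):
    """Generalized EUD of a set of voxel doses (Gy) with parameter a."""
    d = np.asarray(doses, dtype=float)
    return np.mean(d ** a) ** (1.0 / a)

# Hypothetical example: EUD of a rectum-wall dose sample with a serial-like parameter.
# eud = generalized_eud(rectal_wall_doses, a=8.0)
```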

  15. Value of MR histogram analyses for prediction of microvascular invasion of hepatocellular carcinoma

    PubMed Central

    Huang, Ya-Qin; Liang, He-Yue; Yang, Zhao-Xia; Ding, Ying; Zeng, Meng-Su; Rao, Sheng-Xiang

    2016-01-01

    Abstract The objective is to explore the value of preoperative magnetic resonance (MR) histogram analyses in predicting microvascular invasion (MVI) of hepatocellular carcinoma (HCC). Fifty-one patients with histologically confirmed HCC who underwent diffusion-weighted and contrast-enhanced MR imaging were included. Histogram analyses were performed, and the mean, variance, skewness, kurtosis, and 1st, 10th, 50th, 90th, and 99th percentiles were derived. Quantitative histogram parameters were compared between HCCs with and without MVI. Receiver operating characteristic (ROC) analyses were generated to compare the diagnostic performance of tumor size, histogram analyses of apparent diffusion coefficient (ADC) maps, and MR enhancement. The mean and the 1st, 10th, and 50th percentiles of the ADC maps, and the mean, variance, and 1st, 10th, 50th, 90th, and 99th percentiles of the portal venous phase (PVP) images were significantly different between the groups with and without MVI (P < 0.05), with areas under the ROC curves (AUCs) of 0.66 to 0.74 for ADC and 0.76 to 0.88 for PVP. The largest AUC of PVP (1st percentile) showed significantly higher accuracy compared with that of the arterial phase (AP) or tumor size (P < 0.001). MR histogram analyses, in particular the 1st percentile of the PVP images, held promise for prediction of MVI of HCC. PMID:27368028

  16. Effect of respiratory and cardiac gating on the major diffusion-imaging metrics

    PubMed Central

    Hamaguchi, Hiroyuki; Sugimori, Hiroyuki; Nakanishi, Mitsuhiro; Nakagawa, Shin; Fujiwara, Taro; Yoshida, Hirokazu; Takamori, Sayaka; Shirato, Hiroki

    2016-01-01

    The effect of respiratory gating on the major diffusion-imaging metrics and that of cardiac gating on mean kurtosis (MK) are not known. For evaluation of whether the major diffusion-imaging metrics—MK, fractional anisotropy (FA), and mean diffusivity (MD) of the brain—varied between gated and non-gated acquisitions, respiratory-gated, cardiac-gated, and non-gated diffusion-imaging of the brain were performed in 10 healthy volunteers. MK, FA, and MD maps were constructed for all acquisitions, and the histograms were constructed. The normalized peak height and location of the histograms were compared among the acquisitions by use of Friedman and post hoc Wilcoxon tests. The effect of the repetition time (TR) on the diffusion-imaging metrics was also tested, and we corrected for its variation among acquisitions, if necessary. The results showed a shift in the peak location of the MK and MD histograms to the right with an increase in TR (p ≤ 0.01). The corrected peak location of the MK histograms, the normalized peak height of the FA histograms, the normalized peak height and the corrected peak location of the MD histograms varied significantly between the gated and non-gated acquisitions (p < 0.05). These results imply an influence of respiration and cardiac pulsation on the major diffusion-imaging metrics. The gating conditions must be kept identical if reproducible results are to be achieved. PMID:27073115

  17. SEURAT: Visual analytics for the integrated analysis of microarray data

    PubMed Central

    2010-01-01

    Background: In translational cancer research, gene expression data is collected together with clinical data and genomic data arising from other chip-based high-throughput technologies. Software tools for the joint analysis of such high-dimensional data sets together with clinical data are required. Results: We have developed an open source software tool which provides interactive visualization capability for the integrated analysis of high-dimensional gene expression data together with associated clinical data, array CGH data and SNP array data. The different data types are organized by a comprehensive data manager. Interactive tools are provided for all graphics: heatmaps, dendrograms, barcharts, histograms, eventcharts and a chromosome browser, which displays genetic variations along the genome. All graphics are dynamic and fully linked so that any object selected in a graphic will be highlighted in all other graphics. For exploratory data analysis the software provides unsupervised data analytics like clustering, seriation algorithms and biclustering algorithms. Conclusions: The SEURAT software meets the growing needs of researchers to perform joint analysis of gene expression, genomic and clinical data. PMID:20525257

  18. Augmented reality three-dimensional object visualization and recognition with axially distributed sensing.

    PubMed

    Markman, Adam; Shen, Xin; Hua, Hong; Javidi, Bahram

    2016-01-15

    An augmented reality (AR) smartglass display combines real-world scenes with digital information enabling the rapid growth of AR-based applications. We present an augmented reality-based approach for three-dimensional (3D) optical visualization and object recognition using axially distributed sensing (ADS). For object recognition, the 3D scene is reconstructed, and feature extraction is performed by calculating the histogram of oriented gradients (HOG) of a sliding window. A support vector machine (SVM) is then used for classification. Once an object has been identified, the 3D reconstructed scene with the detected object is optically displayed in the smartglasses allowing the user to see the object, remove partial occlusions of the object, and provide critical information about the object such as 3D coordinates, which are not possible with conventional AR devices. To the best of our knowledge, this is the first report on combining axially distributed sensing with 3D object visualization and recognition for applications to augmented reality. The proposed approach can have benefits for many applications, including medical, military, transportation, and manufacturing.
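
    As a rough illustration of the recognition stage described above (HOG features of a sliding window fed to an SVM), here is a minimal sketch using scikit-image and scikit-learn; the window size, step and HOG parameters are assumptions, since the abstract does not give them.

    ```python
    import numpy as np
    from skimage.feature import hog
    from sklearn.svm import SVC

    def train_hog_svm(windows, labels):
        """Train a linear SVM on HOG features of fixed-size grayscale windows."""
        feats = [hog(w, orientations=9, pixels_per_cell=(8, 8),
                     cells_per_block=(2, 2)) for w in windows]
        clf = SVC(kernel="linear")
        clf.fit(np.array(feats), labels)
        return clf

    def detect(scene, clf, win=64, step=16):
        """Slide a window over the reconstructed scene and classify each patch."""
        hits = []
        for r in range(0, scene.shape[0] - win + 1, step):
            for c in range(0, scene.shape[1] - win + 1, step):
                patch = scene[r:r + win, c:c + win]
                f = hog(patch, orientations=9, pixels_per_cell=(8, 8),
                        cells_per_block=(2, 2))
                if clf.predict(f.reshape(1, -1))[0] == 1:
                    hits.append((r, c))
        return hits
    ```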

  19. Deviation from the mean in teaching uncertainties

    NASA Astrophysics Data System (ADS)

    Budini, N.; Giorgi, S.; Sarmiento, L. M.; Cámara, C.; Carreri, R.; Gómez Carrillo, S. C.

    2017-07-01

    In this work we present two simple and interactive web-based activities for introducing students to the concepts of uncertainties in measurements. These activities are based on the real-time construction of histograms from students' measurements and their subsequent analysis through an active and dynamic approach.

  20. Regionally adaptive histogram equalization of the chest.

    PubMed

    Sherrier, R H; Johnson, G A

    1987-01-01

    Advances in the area of digital chest radiography have resulted in the acquisition of high-quality images of the human chest. With these advances, there arises a genuine need for image processing algorithms specific to the chest, in order to fully exploit this digital technology. We have implemented the well-known technique of histogram equalization, noting the problems encountered when it is adapted to chest images. These problems have been successfully solved with our regionally adaptive histogram equalization method. With this technique histograms are calculated locally and then modified according to both the mean pixel value of that region as well as certain characteristics of the cumulative distribution function. This process, which has allowed certain regions of the chest radiograph to be enhanced differentially, may also have broader implications for other image processing tasks.
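
    The exact modification rule applied to each local histogram is not given in the abstract, so the sketch below only illustrates the general idea: equalize tile-wise histograms and weight the equalization by the tile's mean grey level. The tile size and the weighting scheme are assumptions.

    ```python
    import numpy as np

    def regional_hist_eq(img, tile=64, alpha_low=0.3, alpha_high=1.0):
        """Tile-wise histogram equalization whose strength depends on the tile mean
        (a sketch; the paper also conditions on properties of the local CDF)."""
        out = img.astype(float).copy()
        gmax = float(img.max()) if img.max() > 0 else 1.0
        for r in range(0, img.shape[0], tile):
            for c in range(0, img.shape[1], tile):
                block = img[r:r + tile, c:c + tile].astype(float)
                hist, edges = np.histogram(block, bins=256, range=(0, gmax))
                cdf = np.cumsum(hist) / hist.sum()              # local CDF
                eq = np.interp(block.ravel(), edges[:-1], cdf * gmax)
                # weight the equalization by the local mean: darker regions get more
                alpha = alpha_low + (alpha_high - alpha_low) * (1 - block.mean() / gmax)
                out[r:r + tile, c:c + tile] = (alpha * eq.reshape(block.shape)
                                               + (1 - alpha) * block)
        return out
    ```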

  1. Infrared face recognition based on LBP histogram and KW feature selection

    NASA Astrophysics Data System (ADS)

    Xie, Zhihua

    2014-07-01

    The conventional LBP-based feature, represented by the local binary pattern (LBP) histogram, still has room for performance improvement. This paper focuses on dimension reduction of LBP micro-patterns and proposes an improved infrared face recognition method based on the LBP histogram representation. To extract locally robust features from infrared face images, LBP is used to capture the composition of micro-patterns within sub-blocks. Based on statistical test theory, a Kruskal-Wallis (KW) feature selection method is proposed to retain the LBP patterns that are suitable for infrared face recognition. The experimental results show that the combination of LBP and KW feature selection improves the performance of infrared face recognition; the proposed method outperforms traditional methods based on the LBP histogram, the discrete cosine transform (DCT), or principal component analysis (PCA).
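
    A minimal sketch of this pipeline is given below: block-wise uniform LBP histograms are concatenated and the histogram bins are ranked by their Kruskal-Wallis H statistic across subject classes. The block grid, the 'uniform' LBP variant and the number of retained bins are assumptions not stated in the abstract.

    ```python
    import numpy as np
    from skimage.feature import local_binary_pattern
    from scipy.stats import kruskal

    def lbp_block_histograms(img, blocks=(4, 4), P=8, R=1):
        """Concatenated uniform-LBP histograms of image sub-blocks."""
        lbp = local_binary_pattern(img, P, R, method="uniform")
        n_bins = P + 2                              # number of uniform LBP codes
        h, w = img.shape
        feats = []
        for bi in range(blocks[0]):
            for bj in range(blocks[1]):
                patch = lbp[bi * h // blocks[0]:(bi + 1) * h // blocks[0],
                            bj * w // blocks[1]:(bj + 1) * w // blocks[1]]
                hist, _ = np.histogram(patch, bins=n_bins, range=(0, n_bins))
                feats.append(hist / hist.sum())
        return np.concatenate(feats)

    def kw_select(features, labels, n_keep=100):
        """Rank feature dimensions by the Kruskal-Wallis H statistic across classes
        (assumes each class contributes several, not all identical, samples)."""
        scores = []
        for j in range(features.shape[1]):
            groups = [features[labels == c, j] for c in np.unique(labels)]
            scores.append(kruskal(*groups).statistic)
        return np.argsort(scores)[::-1][:n_keep]
    ```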

  2. Digital tape unit test facility software

    NASA Technical Reports Server (NTRS)

    Jackson, J. T.

    1971-01-01

    Two computer programs are described which are used for the collection and analysis of data from the digital tape unit test facility (DTUTF). The data are the recorded results of skew tests made on magnetic digital tapes which are used on computers as input/output media. The results of each tape test are keypunched onto an 80 column computer card. The format of the card is checked and the card image is stored on a master summary tape via the DTUTF card checking and tape updating system. The master summary tape containing the results of all the tape tests is then used for analysis as input to the DTUTF histogram generating system which produces a histogram of skew vs. date for selected data, followed by some statistical analysis of the data.

  3. Detection of acute lymphocyte leukemia using k-nearest neighbor algorithm based on shape and histogram features

    NASA Astrophysics Data System (ADS)

    Purwanti, Endah; Calista, Evelyn

    2017-05-01

    Leukemia is a type of cancer caused by malignant neoplasms of leukocyte cells. Acute lymphocytic leukemia (ALL) is a form of leukemia that can cause death quickly. In this study, we propose automatic detection of lymphocytic leukemia through classification of single-cell lymphocyte images obtained from peripheral blood smears. There are two main objectives: the first is to extract features from the cells, and the second is to classify the lymphocytes into two classes, namely normal and abnormal. We use a combination of shape and histogram features, and the classification algorithm is k-nearest neighbour with k varied over 1, 3, 5, 7, 9, 11, 13, and 15. The best accuracy, sensitivity, and specificity in this study are each 90%, obtained with the combined area-perimeter-mean-standard deviation features and k = 7.

  4. Generalized image contrast enhancement technique based on Heinemann contrast discrimination model

    NASA Astrophysics Data System (ADS)

    Liu, Hong; Nodine, Calvin F.

    1994-03-01

    This paper presents a generalized image contrast enhancement technique which equalizes perceived brightness based on the Heinemann contrast discrimination model. This is a modified algorithm which presents an improvement over the previous study by Mokrane in its mathematically proven existence of a unique solution and in its easily tunable parameterization. The model uses a log-log representation of contrast luminosity between targets and the surround in a fixed luminosity background setting. The algorithm consists of two nonlinear gray-scale mapping functions which have seven parameters, two of which are adjustable Heinemann constants. Another parameter is the background gray level. The remaining four parameters are nonlinear functions of gray scale distribution of the image, and can be uniquely determined once the previous three are given. Tests have been carried out to examine the effectiveness of the algorithm for increasing the overall contrast of images. It can be demonstrated that the generalized algorithm provides better contrast enhancement than histogram equalization. In fact, the histogram equalization technique is a special case of the proposed mapping.

  5. a Robust Descriptor Based on Spatial and Frequency Structural Information for Visible and Thermal Infrared Image Matching

    NASA Astrophysics Data System (ADS)

    Fu, Z.; Qin, Q.; Wu, C.; Chang, Y.; Luo, B.

    2017-09-01

    Due to differences in imaging principles, matching between visible and thermal infrared images still presents new challenges and difficulties. Inspired by the complementary spatial and frequency information of geometric structural features, a robust descriptor is proposed for matching visible and thermal infrared images. We first define two spatial regions around each point of interest, using the histogram of oriented magnitudes, which captures 2-D structural shape information, to describe the larger region, and the edge oriented histogram to describe the spatial distribution within the smaller region. The two vectors are then normalized and combined into a higher-dimensional feature vector. Finally, the proposed descriptor is obtained by applying principal component analysis (PCA) to reduce the dimension of the combined feature vector and make the descriptor more robust. Experimental results showed that the proposed method provided significant improvements in the number of correct matches and clear advantages from combining spatial and frequency structural information.

  6. Assessment of placental volume and vascularization at 11-14 weeks of gestation in a Taiwanese population using three-dimensional power Doppler ultrasound.

    PubMed

    Wang, Hsing-I; Yang, Ming-Jie; Wang, Peng-Hui; Wu, Yi-Cheng; Chen, Chih-Yao

    2014-12-01

    The placental volume and vascular indices are crucial in helping doctors to evaluate early fetal growth and development. Inadequate placental volume or vascularity might indicate poor fetal growth or gestational complications. This study aimed to evaluate the placental volume and vascular indices during the period of 11-14 weeks of gestation in a Taiwanese population. From June 2006 to September 2009, three-dimensional power Doppler ultrasound was performed in 222 normal pregnancies from 11-14 weeks of gestation. Power Doppler ultrasound was applied to the placenta and the placental volume was obtained by a rotational technique (VOCAL). The three-dimensional power histogram was used to assess the placental vascular indices, including the mean gray value, the vascularization index, the flow index, and the vascularization flow index. The placental vascular indices were then plotted against gestational age (GA) and placental volume. Our results showed that the linear regression equation for placental volume using gestational week as the independent variable was placental volume = 18.852 × GA - 180.89 (r = 0.481, p < 0.05). All the placental vascular indices showed a constant distribution throughout the period 11-14 weeks of gestation. A tendency for a reduction in the placental mean gray value with gestational week was observed, but without statistical significance. All the placental vascular indices estimated by three-dimensional power Doppler ultrasonography showed a constant distribution throughout gestation. Copyright © 2014. Published by Elsevier Taiwan.

  7. Differentiation of orbital lymphoma and idiopathic orbital inflammatory pseudotumor: combined diagnostic value of conventional MRI and histogram analysis of ADC maps.

    PubMed

    Ren, Jiliang; Yuan, Ying; Wu, Yingwei; Tao, Xiaofeng

    2018-05-02

    The overlap of morphological features and mean ADC values restricts the clinical application of MRI in the differential diagnosis of orbital lymphoma and idiopathic orbital inflammatory pseudotumor (IOIP). In this paper, we aimed to retrospectively evaluate the combined diagnostic value of conventional magnetic resonance imaging (MRI) and whole-tumor histogram analysis of apparent diffusion coefficient (ADC) maps in the differentiation of the two lesions. In total, 18 patients with orbital lymphoma and 22 patients with IOIP were included, who underwent both conventional MRI and diffusion-weighted imaging before treatment. Conventional MRI features and histogram parameters derived from ADC maps, including mean ADC (ADCmean), median ADC (ADCmedian), skewness, kurtosis, and the 10th, 25th, 75th and 90th percentiles of ADC (ADC10, ADC25, ADC75, ADC90), were evaluated and compared between orbital lymphoma and IOIP. Multivariate logistic regression analysis was used to identify the most valuable variables for discrimination. A differential model was built upon the selected variables, and receiver operating characteristic (ROC) analysis was performed to determine the differential ability of the model. Multivariate logistic regression showed that ADC10 (P = 0.023) and involvement of the orbital preseptal space (P = 0.029) were the most promising indexes for discriminating orbital lymphoma from IOIP. The logistic model defined by ADC10 and involvement of the orbital preseptal space achieved an AUC of 0.939, with a sensitivity of 77.30% and a specificity of 94.40%. The conventional MRI feature of orbital preseptal space involvement and the ADC histogram parameter ADC10 are valuable in the differential diagnosis of orbital lymphoma and IOIP.

  8. Investigating the Role of Global Histogram Equalization Technique for 99mTechnetium-Methylene diphosphonate Bone Scan Image Enhancement

    PubMed Central

    Pandey, Anil Kumar; Sharma, Param Dev; Dheer, Pankaj; Parida, Girish Kumar; Goyal, Harish; Patel, Chetan; Bal, Chandrashekhar; Kumar, Rakesh

    2017-01-01

    Purpose of the Study: 99mTechnetium-methylene diphosphonate (99mTc-MDP) bone scan images have a limited number of counts per pixel, and hence they have inferior image quality compared to X-rays. Theoretically, the global histogram equalization (GHE) technique can improve the contrast of a given image, though the practical benefits of doing so have gained only limited acceptance. In this study, we have investigated the effect of the GHE technique on 99mTc-MDP bone scan images. Materials and Methods: A set of 89 low-contrast 99mTc-MDP whole-body bone scan images were included in this study. These images were acquired with parallel-hole collimation on a Symbia E gamma camera. The images were then processed with the histogram equalization technique. The image quality of the input and processed images was reviewed by two nuclear medicine physicians on a 5-point scale, where a score of 1 indicates very poor and 5 the best image quality. A statistical test was applied to assess the significance of the difference between the mean scores assigned to input and processed images. Results: The technique improves the contrast of the images; however, oversaturation was noticed in the processed images. Student's t-test was applied, and a statistically significant difference between input and processed image quality was found at P < 0.001 (with α = 0.05). However, further improvement in image quality is needed as per the requirements of nuclear medicine physicians. Conclusion: The GHE technique can be used on low-contrast bone scan images. In some cases, histogram equalization combined with another postprocessing technique is useful. PMID:29142344

  9. Investigation of depth-of-interaction (DOI) effects in single- and dual-layer block detectors by the use of light sharing in scintillators.

    PubMed

    Yamamoto, Seiichi

    2012-01-01

    In block detectors for PET scanners that use different lengths of slits in scintillators to share light among photomultiplier tubes (PMTs), the position histogram is distorted when the depth of interaction (DOI) of the gamma photons is near the PMTs (the DOI effect). However, it remains unclear whether a DOI effect is observed for block detectors that use light sharing in scintillators. To investigate this, I tested the effect for single- and dual-layer block detectors. In the single-layer block detector, Ce-doped Gd₂SiO₅ (GSO) crystals of 1.9 × 1.9 × 15 mm³ (0.5 mol% Ce) were used. In the dual-layer block detector, GSO crystals of 1.9 × 1.9 × 6 mm³ (1.5 mol% Ce) were used for the front layer and GSO crystals of 1.9 × 1.9 × 9 mm³ (0.5 mol% Ce) for the back layer. These scintillators were arranged to form an 8 × 8 matrix with multi-layer optical film inserted partly between the scintillators to obtain an optimized position response with the use of two dual-PMTs. Position histograms and energy responses were measured for these block detectors at three different DOI positions, and the flood histograms were obtained. The results indicated that DOI effects are observed in both block detectors, but the dual-layer block showed more severe distortion in the position histogram as well as larger energy variations. We conclude that, in block detectors that use light sharing in the scintillators, the DOI effect is an important factor for the performance of the detectors, especially for DOI block detectors.

  10. Histogram analysis of apparent diffusion coefficient at 3.0 T in urinary bladder lesions: correlation with pathologic findings.

    PubMed

    Suo, Shi-Teng; Chen, Xiao-Xi; Fan, Yu; Wu, Lian-Ming; Yao, Qiu-Ying; Cao, Meng-Qiu; Liu, Qiang; Xu, Jian-Rong

    2014-08-01

    To investigate the potential value of histogram analysis of apparent diffusion coefficient (ADC) obtained at standard (700 s/mm(2)) and high (1500 s/mm(2)) b values on a 3.0-T scanner in the differentiation of bladder cancer from benign lesions and in assessing bladder tumors of different pathologic T stages and to evaluate the diagnostic performance of ADC-based histogram parameters. In all, 52 patients with bladder lesions, including benign lesions (n = 7) and malignant tumors (n = 45; T1 stage or less, 23; T2 stage, 7; T3 stage, 8; and T4 stage, 7), were retrospectively evaluated. Magnetic resonance examination at 3.0 T and diffusion-weighted imaging were performed. ADC maps were obtained at two b values (b = 700 and 1500 s/mm(2); ie, ADC-700 and ADC-1500). Parameters of histogram analysis included mean, kurtosis, skewness, and entropy. The correlations between these parameters and pathologic results were revealed. Receiver operating characteristic (ROC) curves were generated to determine the diagnostic value of histogram parameters. Significant differences were found in mean ADC-700, mean ADC-1500, skewness ADC-1500, and kurtosis ADC-1500 between bladder cancer and benign lesions (P = .002-.032). There were also significant differences in mean ADC-700, mean ADC-1500, and kurtosis ADC-1500 among bladder tumors of different pathologic T stages (P = .000-.046). No significant differences were observed in other parameters. Mean ADC-1500 and kurtosis ADC-1500 were significantly correlated with T stage, respectively (ρ = -0.614, P < .001; ρ = 0.374, P = .011). ROC analysis showed that the combination of mean ADC-1500 and kurtosis ADC-1500 has the maximal area under the ROC curve (AUC, 0.894; P < .001) in the differentiation of benign lesions and malignant tumors, with a sensitivity of 77.78% and specificity of 100%. AUCs for differentiating low- and high-stage tumors were 0.840 for mean ADC-1500 (P < .001) and 0.696 for kurtosis ADC-1500 (P = .015). Histogram analysis of ADC-1500 at 3.0 T can be useful in evaluation of bladder lesions. A combination of mean ADC-1500 and kurtosis ADC-1500 may be more beneficial in the differentiation of benign and malignant lesions. Mean ADC-1500 was the most promising parameter for differentiating low- from high-stage bladder cancer. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.
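
    A minimal sketch of the voxel-histogram parameters used in this kind of study (mean, skewness, kurtosis and histogram entropy of the ADC values inside a lesion ROI) follows; the bin count and the base-2 entropy are assumptions.

    ```python
    import numpy as np
    from scipy.stats import skew, kurtosis

    def adc_histogram_features(adc_roi, n_bins=128):
        """Histogram-based features of the ADC values inside a lesion ROI."""
        vals = adc_roi[np.isfinite(adc_roi)].ravel()
        hist, _ = np.histogram(vals, bins=n_bins)
        p = hist / hist.sum()
        p = p[p > 0]
        return {
            "mean": vals.mean(),
            "skewness": skew(vals),
            "kurtosis": kurtosis(vals),           # excess kurtosis by default
            "entropy": -np.sum(p * np.log2(p)),   # Shannon entropy of the histogram
        }
    ```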

  11. Activity-Based Protein Profiling of Organophosphorus and Thiocarbamate Pesticides Reveals Multiple Serine Hydrolase Targets in Mouse Brain

    PubMed Central

    NOMURA, DANIEL K.; CASIDA, JOHN E.

    2010-01-01

    Organophosphorus (OP) and thiocarbamate (TC) agrochemicals are used worldwide as insecticides, herbicides, and fungicides, but their safety assessment in terms of potential off-targets remains incomplete. In this study, we used a chemoproteomic platform, termed activity-based protein profiling, to broadly define serine hydrolase targets in mouse brain of a panel of 29 OP and TC pesticides. Among the secondary targets identified, enzymes involved in degradation of endocannabinoid signaling lipids, monoacylglycerol lipase and fatty acid amide hydrolase, were inhibited by several OP and TC pesticides. Blockade of these two enzymes led to elevations in brain endocannabinoid levels and dysregulated brain arachidonate metabolism. Other secondary targets include enzymes thought to also play important roles in the nervous system and unannotated proteins. This study reveals a multitude of secondary targets for OP and TC pesticides and underscores the utility of chemoproteomic platforms in gaining insights into biochemical pathways that are perturbed by these toxicants. PMID:21341672

  12. Remote logo detection using angle-distance histograms

    NASA Astrophysics Data System (ADS)

    Youn, Sungwook; Ok, Jiheon; Baek, Sangwook; Woo, Seongyoun; Lee, Chulhee

    2016-05-01

    Among all the various computer vision applications, automatic logo recognition has drawn great interest from industry as well as various academic institutions. In this paper, we propose an angle-distance map, which we used to develop a robust logo detection algorithm. The proposed angle-distance histogram is invariant against scale and rotation. The proposed method first used shape information and color characteristics to find the candidate regions and then applied the angle-distance histogram. Experiments show that the proposed method detected logos of various sizes and orientations.
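
    The abstract does not describe how the angle-distance histogram is built, so the following is only a plausible sketch: contour points of a candidate region are binned by their angle and centroid distance, with the distance normalized by its maximum for scale invariance and the angle referenced to the farthest point as a crude step toward rotation invariance.

    ```python
    import numpy as np

    def angle_distance_histogram(points, angle_bins=36, dist_bins=8):
        """2D histogram of (angle, normalized distance) of contour points about
        their centroid (a hypothetical construction, not the paper's exact one)."""
        pts = np.asarray(points, dtype=float)
        centred = pts - pts.mean(axis=0)
        dist = np.hypot(centred[:, 0], centred[:, 1])
        dist = dist / (dist.max() + 1e-12)                 # scale invariance
        ang = np.arctan2(centred[:, 1], centred[:, 0])
        ang = (ang - ang[np.argmax(dist)]) % (2 * np.pi)   # reference direction
        hist, _, _ = np.histogram2d(ang, dist, bins=(angle_bins, dist_bins),
                                    range=[[0, 2 * np.pi], [0, 1]])
        return (hist / hist.sum()).ravel()
    ```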

  13. Domain atrophy creates rare cases of functional partial protein domains.

    PubMed

    Prakash, Ananth; Bateman, Alex

    2015-04-30

    Protein domains display a range of structural diversity, with numerous additions and deletions of secondary structural elements between related domains. We have observed a small number of cases of surprising large-scale deletions of core elements of structural domains. We propose a new concept called domain atrophy, where protein domains lose a significant number of core structural elements. Here, we implement a new pipeline to systematically identify new cases of domain atrophy across all known protein sequences. The output of this pipeline was carefully checked by hand, which filtered out partial domain instances that were unlikely to represent true domain atrophy due to misannotations or un-annotated sequence fragments. We identify 75 cases of domain atrophy, of which eight cases are found in a three-dimensional protein structure and 67 cases have been inferred based on mapping to a known homologous structure. Domains with structural variations include ancient folds such as the TIM-barrel and Rossmann folds. Most of these domains are observed to show structural loss that does not affect their functional sites. Our analysis has significantly increased the known cases of domain atrophy. We discuss specific instances of domain atrophy and see that there has often been a compensatory mechanism that helps to maintain the stability of the partial domain. Our study indicates that although domain atrophy is an extremely rare phenomenon, protein domains under certain circumstances can tolerate extreme mutations giving rise to partial, but functional, domains.

  14. Structural Physics of Bee Honeycomb

    NASA Astrophysics Data System (ADS)

    Kaatz, Forrest; Bultheel, Adhemar; Egami, Takeshi

    2008-03-01

    Honeybee combs have aroused interest in the ability of honeybees to form regular hexagonal geometric constructs since ancient times. Here we use a real space technique based on the pair distribution function (PDF) and radial distribution function (RDF), and a reciprocal space method utilizing the Debye-Waller Factor (DWF) to quantify the order for a range of honeycombs made by Apis mellifera. The PDFs and RDFs are fit with a series of Gaussian curves. We characterize the order in the honeycomb using a real space order parameter, OP3, to describe the order in the combs and a two-dimensional Fourier transform from which a Debye-Waller order parameter, u, is derived. Both OP3 and u take values from [0, 1] where the value one represents perfect order. The analyzed combs have values of OP3 from 0.33 to 0.60 and values of u from 0.83 to 0.98. RDF fits of honeycomb histograms show that naturally made comb can be crystalline in a 2D ordered structural sense, yet is more `liquid-like' than cells made on `foundation' wax. We show that with the assistance of man-made foundation wax, honeybees can manufacture highly ordered arrays of hexagonal cells.

  15. An embedded real-time red peach detection system based on an OV7670 camera, ARM cortex-M4 processor and 3D look-up tables.

    PubMed

    Teixidó, Mercè; Font, Davinia; Pallejà, Tomàs; Tresanchez, Marcel; Nogués, Miquel; Palacín, Jordi

    2012-10-22

    This work proposes the development of an embedded real-time fruit detection system for future automatic fruit harvesting. The proposed embedded system is based on an ARM Cortex-M4 (STM32F407VGT6) processor and an Omnivision OV7670 color camera. The future goal of this embedded vision system will be to control a robotized arm to automatically select and pick some fruit directly from the tree. The complete embedded system has been designed to be placed directly in the gripper tool of the future robotized harvesting arm. The embedded system will be able to perform real-time fruit detection and tracking by using a three-dimensional look-up-table (LUT) defined in the RGB color space and optimized for fruit picking. Additionally, two different methodologies for creating optimized 3D LUTs based on existing linear color models and fruit histograms were implemented in this work and compared for the case of red peaches. The resulting system is able to acquire general and zoomed orchard images and to update the relative tracking information of a red peach in the tree ten times per second.
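
    A rough sketch of the 3D LUT idea follows: the RGB cube is quantized, fruit and background training pixels are histogrammed into it, and classification of a new frame is a single table lookup per pixel. The quantization step and the majority-vote rule are assumptions; the paper builds its LUTs from linear color models and fruit histograms.

    ```python
    import numpy as np

    def build_rgb_lut(fruit_pixels, background_pixels, step=8):
        """Boolean 3D look-up table over a quantized RGB cube (a sketch)."""
        bins = 256 // step

        def hist3d(px):
            idx = (np.asarray(px) // step).astype(int)
            h = np.zeros((bins, bins, bins), dtype=np.int32)
            np.add.at(h, (idx[:, 0], idx[:, 1], idx[:, 2]), 1)
            return h

        # a cell is marked 'fruit' where fruit samples outnumber background samples
        return hist3d(fruit_pixels) > hist3d(background_pixels)

    def classify(image_rgb, lut, step=8):
        """Per-pixel fruit/background decision with a single LUT lookup."""
        idx = (image_rgb // step).astype(int)
        return lut[idx[..., 0], idx[..., 1], idx[..., 2]]
    ```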

  16. An Embedded Real-Time Red Peach Detection System Based on an OV7670 Camera, ARM Cortex-M4 Processor and 3D Look-Up Tables

    PubMed Central

    Teixidó, Mercè; Font, Davinia; Pallejà, Tomàs; Tresanchez, Marcel; Nogués, Miquel; Palacín, Jordi

    2012-01-01

    This work proposes the development of an embedded real-time fruit detection system for future automatic fruit harvesting. The proposed embedded system is based on an ARM Cortex-M4 (STM32F407VGT6) processor and an Omnivision OV7670 color camera. The future goal of this embedded vision system will be to control a robotized arm to automatically select and pick some fruit directly from the tree. The complete embedded system has been designed to be placed directly in the gripper tool of the future robotized harvesting arm. The embedded system will be able to perform real-time fruit detection and tracking by using a three-dimensional look-up-table (LUT) defined in the RGB color space and optimized for fruit picking. Additionally, two different methodologies for creating optimized 3D LUTs based on existing linear color models and fruit histograms were implemented in this work and compared for the case of red peaches. The resulting system is able to acquire general and zoomed orchard images and to update the relative tracking information of a red peach in the tree ten times per second. PMID:23202040

  17. Pilot climate data system: A state-of-the-art capability in scientific data management

    NASA Technical Reports Server (NTRS)

    Smith, P. H.; Treinish, L. A.; Novak, L. V.

    1983-01-01

    The Pilot Climate Data System (PCDS) was developed by the Information Management Branch of NASA's Goddard Space Flight Center to manage a large collection of climate-related data of interest to the research community. The PCDS now provides uniform data catalogs, inventories, access methods, graphical displays and statistical calculations for selected NASA and non-NASA data sets. Data manipulation capabilities were developed to permit researchers to easily combine or compare data. The current capabilities of the PCDS include many tools for the statistical survey of climate data. A climate researcher can examine any data set of interest via flexible utilities to create a variety of two- and three-dimensional displays, including vector plots, scatter diagrams, histograms, contour plots, surface diagrams and pseudo-color images. The graphics and statistics subsystems employ an intermediate data storage format which is data-set independent. Outside of the graphics system there exist other utilities to select, filter, list, compress, and calculate time-averages and variances for any data of interest. The PCDS now fully supports approximately twenty different data sets and is being used on a trial basis by several different in-house research groups.

  18. Stationary Wavelet Transform and AdaBoost with SVM Based Pathological Brain Detection in MRI Scanning.

    PubMed

    Nayak, Deepak Ranjan; Dash, Ratnakar; Majhi, Banshidhar

    2017-01-01

    This paper presents an automatic classification system for segregating pathological brain from normal brains in magnetic resonance imaging scanning. The proposed system employs contrast limited adaptive histogram equalization scheme to enhance the diseased region in brain MR images. Two-dimensional stationary wavelet transform is harnessed to extract features from the preprocessed images. The feature vector is constructed using the energy and entropy values, computed from the level-2 SWT coefficients. Then, the relevant and uncorrelated features are selected using symmetric uncertainty ranking filter. Subsequently, the selected features are given input to the proposed AdaBoost with support vector machine classifier, where SVM is used as the base classifier of AdaBoost algorithm. To validate the proposed system, three standard MR image datasets, Dataset-66, Dataset-160, and Dataset-255 have been utilized. The 5 runs of k-fold stratified cross validation results indicate the suggested scheme offers better performance than other existing schemes in terms of accuracy and number of features. The proposed system earns ideal classification over Dataset-66 and Dataset-160; whereas, for Dataset-255, an accuracy of 99.45% is achieved. Copyright © Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
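
    The feature construction (energy and entropy of stationary wavelet sub-bands) can be sketched with PyWavelets as below. The wavelet family, the entropy definition and the use of all detail sub-bands are assumptions; the paper uses the level-2 SWT coefficients specifically, and the image side lengths must be divisible by 2**level for swt2.

    ```python
    import numpy as np
    import pywt

    def swt_energy_entropy(img, wavelet="db4", level=2):
        """Energy and entropy of SWT detail sub-bands of a grayscale image
        (a sketch under assumed parameters, not the paper's exact feature set)."""
        coeffs = pywt.swt2(img.astype(float), wavelet, level=level)
        feats = []
        for cA, (cH, cV, cD) in coeffs:
            for band in (cH, cV, cD):
                energy = np.sum(band ** 2)
                p = np.abs(band).ravel()
                p = p / (p.sum() + 1e-12)
                p = p[p > 0]
                entropy = -np.sum(p * np.log2(p))
                feats.extend([energy, entropy])
        return np.array(feats)
    ```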

  19. Statistical thermodynamics of aligned rigid rods with attractive lateral interactions: Theory and Monte Carlo simulations

    NASA Astrophysics Data System (ADS)

    dos Santos, G. J.; Linares, D. H.; Ramirez-Pastor, A. J.

    2018-04-01

    The phase behaviour of aligned rigid rods of length k (k-mers) adsorbed on two-dimensional square lattices has been studied by Monte Carlo (MC) simulations and histogram reweighting technique. The k-mers, containing k identical units (each one occupying a lattice site) were deposited along one of the directions of the lattice. In addition, attractive lateral interactions were considered. The methodology was applied, particularly, to the study of the critical point of the condensation transition occurring in the system. The process was monitored by following the fourth order Binder cumulant as a function of temperature for different lattice sizes. The results, obtained for k ranging from 2 to 7, show that: (i) the transition coverage exhibits a decreasing behaviour when it is plotted as a function of the k-mer size and (ii) the transition temperature, Tc, exhibits a power law dependence on k, Tc ∼ k^0.4, shifting to higher values as k increases. Comparisons with an analytical model based on a generalization of the Bragg-Williams approximation (BWA) were performed in order to support the simulation technique. A significant qualitative agreement was obtained between BWA and MC results.
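
    The critical point is located from the fourth-order Binder cumulant of the order parameter; the standard estimator (not spelled out in the abstract) is sketched below, with the crossing of the cumulant curves for different lattice sizes giving the estimate of Tc.

    ```python
    import numpy as np

    def binder_cumulant(order_parameter_samples):
        """Fourth-order Binder cumulant U4 = 1 - <m^4> / (3 <m^2>^2)."""
        m = np.asarray(order_parameter_samples, dtype=float)
        m2 = np.mean(m ** 2)
        m4 = np.mean(m ** 4)
        return 1.0 - m4 / (3.0 * m2 ** 2)

    # U4 is computed from MC samples of the order parameter at several temperatures
    # and lattice sizes; the common crossing point of the curves estimates Tc.
    ```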

  20. Skin temperature evaluation by infrared thermography: Comparison of two image analysis methods during the nonsteady state induced by physical exercise

    NASA Astrophysics Data System (ADS)

    Formenti, Damiano; Ludwig, Nicola; Rossi, Alessio; Trecroci, Athos; Alberti, Giampietro; Gargano, Marco; Merla, Arcangelo; Ammer, Kurt; Caumo, Andrea

    2017-03-01

    The most common method to derive a temperature value from a thermal image in humans is to average the temperature values of all the pixels confined within a region of interest (ROI) defined by a demarcated boundary. This summary measure of skin temperature is denoted as Troi in this study. Recently, an alternative method for deriving skin temperature from the thermal image has been developed. This novel method (denoted as Tmax) is based on an automated (software-driven) selection of the warmest pixels within the ROI. Troi and Tmax have been compared under basal, steady-state conditions, where they are very well correlated and show a bias of approximately 1 °C (Tmax > Troi). The aim of this study was to investigate the relationship between Tmax and Troi under the nonsteady-state conditions induced by physical exercise. Thermal images of the quadriceps of 13 subjects performing a squat exercise were recorded for 120 s before (basal steady state) and for 480 s after the initiation of the exercise (nonsteady state). The thermal images were then analysed to extract Troi and Tmax. Troi and Tmax changed almost in parallel during the nonsteady state. On closer inspection, it was found that during the nonsteady state the bias between the two methods slightly increased (from 0.7 to 1.1 °C) and the degree of association between them slightly decreased (from Pearson's r = 0.96 to 0.83). Troi and Tmax had different relationships with the skin temperature histogram. Whereas Troi was the mean, which can be interpreted as the centre of gravity of the histogram, Tmax was related to the extreme upper tail of the histogram. During the nonsteady state, the histogram increased its spread and became slightly more asymmetric. As a result, Troi deviated a little from the 50th percentile, while Tmax remained constantly higher than the 95th percentile. Despite their differences, Troi and Tmax showed substantial agreement in assessing the changes in skin temperature following physical exercise. Further studies are needed to clarify the relationship among Tmax, Troi and cutaneous blood flow during physical exercise.
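
    A minimal sketch of the two ROI summary temperatures is given below; the fraction of warmest pixels used for Tmax is an assumption standing in for the software-driven selection described in the abstract.

    ```python
    import numpy as np

    def troi_tmax(thermal_roi, warm_fraction=0.05):
        """Troi = mean of all ROI pixels; Tmax = mean of the warmest pixels
        (here the top 5% of the ROI, an assumed fraction)."""
        vals = thermal_roi[np.isfinite(thermal_roi)].ravel()
        troi = vals.mean()
        n_warm = max(1, int(round(warm_fraction * vals.size)))
        tmax = np.sort(vals)[-n_warm:].mean()
        return troi, tmax
    ```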

  1. Regulation of muscle stiffness during periodic length changes in the isolated abdomen of the hermit crab.

    PubMed

    Chapple, W D

    1997-09-01

    Reflex activation of the ventral superficial muscles (VSM) in the abdomen of the hermit crab, Pagurus pollicarus, was studied using sinusoidal and stochastic longitudinal vibration of the muscle while recording the length and force of the muscle and the spike times of three exciter motoneurons. In the absence of vibration, the interspike interval histograms of the two larger motoneurons were bimodal; cutting sensory nerves containing most of the mechanoreceptor input removed the short interval peak in the histogram, indicating that the receptors are important in maintaining tonic firing. Vibration of the muscle evoked a reflex increase in motoneuron frequency that habituated after an initial peak but remained above control levels for the duration of stimulation. Motoneuron frequency increased with root mean square (rms) stimulus amplitude. Average stiffness during stimulation was about two times the stiffness of passive muscle. The reflex did not alter muscle dynamics. Estimated transfer functions were calculated from the fast Fourier transform of length and force signals. Coherence was >0.9 for the frequency range of 3-35 Hz. Stiffness magnitude gradually increased over this range in both reflex activated and passive muscle; phase was between 10 and 20 degrees. Reflex stiffness decreased with increasing stimulus amplitudes, but at larger amplitudes, this decrease was much less pronounced; in this range stiffness was regulated by the reflex. The sinusoidal frequency at which reflex bursts were elicited was approximately 6 Hz, consistent with previous measurements using ramp stretch. During reflex excitation, there was an increase in amplitude of the short interval peak in the interspike interval histogram; this was reduced when the majority of afferent pathways was removed. A phase histogram of motoneuron firing during sinusoidal vibration had a peak at approximately 110 ms, also suggesting that an important component of the reflex is via direct projections from the mechanoreceptors. These results are consistent with the hypothesis that a robust feedforward regulation of abdominal stiffness during continuous disturbances is achieved by mechanoreceptors signalling the absolute value of changing forces; habituation of the reflex, its high threshold for low-frequency disturbances and the activation kinetics of the muscle further modify reflex dynamics.

  2. Histograms and Raisin Bread

    ERIC Educational Resources Information Center

    Leyden, Michael B.

    1975-01-01

    Describes various elementary school activities using a loaf of raisin bread to promote inquiry skills. Activities include estimating the number of raisins in the loaf by constructing histograms of the number of raisins in a slice. (MLH)

  3. Infrared small target enhancement: grey level mapping based on improved sigmoid transformation and saliency histogram

    NASA Astrophysics Data System (ADS)

    Wan, Minjie; Gu, Guohua; Qian, Weixian; Ren, Kan; Chen, Qian

    2018-06-01

    Infrared (IR) small target enhancement plays a significant role in modern infrared search and track (IRST) systems and is a basic technique for target detection and tracking. In this paper, a coarse-to-fine grey level mapping method using an improved sigmoid transformation and a saliency histogram is designed to enhance IR small targets against different backgrounds. In the rough enhancement stage, the intensity histogram is modified via an improved sigmoid function so as to narrow the regular intensity range of the background as much as possible. In the further enhancement stage, a linear transformation is performed based on a saliency histogram constructed by averaging the cumulative saliency values provided by a saliency map. Compared with other typical methods, the presented method achieves better visual quality as well as better quantitative evaluations.
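
    The rough-enhancement stage (remapping grey levels through a sigmoid so that the background range is compressed) can be sketched as follows; centring on the image median and the gain value are assumptions, and the saliency-histogram stage is not shown.

    ```python
    import numpy as np

    def sigmoid_stretch(img, center=None, gain=10.0):
        """Sigmoid grey-level remapping that compresses the background range
        (a sketch of the rough-enhancement idea, not the paper's exact mapping)."""
        x = img.astype(float)
        x = (x - x.min()) / (x.max() - x.min() + 1e-12)     # normalize to [0, 1]
        c = np.median(x) if center is None else center      # assumed background level
        y = 1.0 / (1.0 + np.exp(-gain * (x - c)))
        return (y - y.min()) / (y.max() - y.min() + 1e-12)
    ```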

  4. A domain-knowledge-inspired mathematical framework for the description and classification of H&E stained histopathology images.

    PubMed

    Massar, Melody L; Bhagavatula, Ramamurthy; Ozolek, John A; Castro, Carlos A; Fickus, Matthew; Kovačević, Jelena

    2011-10-19

    We present the current state of our work on a mathematical framework for identification and delineation of histopathology images: local histograms and occlusion models. Local histograms are histograms computed over defined spatial neighborhoods whose purpose is to characterize an image locally. This unit of description is augmented by our occlusion models, which describe a methodology for image formation. In the context of this image formation model, the power of local histograms with respect to appropriate families of images is shown through various proven statements about expected performance. We conclude by presenting a preliminary study to demonstrate the power of the framework in the context of histopathology image classification tasks that, while differing greatly in application, both originate from what is considered an appropriate class of images for this framework.

  5. [Research on K-means clustering segmentation method for MRI brain image based on selecting multi-peaks in gray histogram].

    PubMed

    Chen, Zhaoxue; Yu, Haizhong; Chen, Hao

    2013-12-01

    To solve the problem that traditional K-means clustering selects initial clustering centers randomly, we proposed a new K-means segmentation algorithm based on robustly selecting the 'peaks' corresponding to white matter, gray matter and cerebrospinal fluid in the multi-peak gray histogram of an MRI brain image. The new algorithm takes the gray values of the selected histogram 'peaks' as the initial K-means clustering centers and can segment the MRI brain image into the three tissue classes more effectively, accurately, steadily and reliably. Extensive experiments have shown that the proposed algorithm overcomes many shortcomings of the traditional K-means clustering method, such as low efficiency, poor accuracy, poor robustness and long computation time. The histogram 'peak' selection idea of the proposed segmentation method is of broader applicability.
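
    A minimal sketch of the peak-initialized K-means idea follows, using SciPy peak detection and scikit-learn; it assumes at least three prominent peaks are present and that background voxels are zero, neither of which is guaranteed in practice.

    ```python
    import numpy as np
    from scipy.signal import find_peaks
    from sklearn.cluster import KMeans

    def segment_brain(img, n_tissues=3):
        """K-means on voxel intensities, initialized at grey-level histogram peaks."""
        vals = img[img > 0].ravel().astype(float)            # ignore zero background
        hist, edges = np.histogram(vals, bins=256)
        centers = 0.5 * (edges[:-1] + edges[1:])
        peaks, props = find_peaks(hist, prominence=1)
        top = peaks[np.argsort(props["prominences"])[::-1][:n_tissues]]
        init = np.sort(centers[top]).reshape(-1, 1)          # CSF, GM, WM centers
        km = KMeans(n_clusters=n_tissues, init=init, n_init=1).fit(vals.reshape(-1, 1))
        labels = np.zeros(img.shape, dtype=int)
        labels[img > 0] = km.labels_ + 1                     # 0 remains background
        return labels
    ```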

  6. Neutron camera employing row and column summations

    DOEpatents

    Clonts, Lloyd G.; Diawara, Yacouba; Donahue, Jr, Cornelius; Montcalm, Christopher A.; Riedel, Richard A.; Visscher, Theodore

    2016-06-14

    For each photomultiplier tube in an Anger camera, an R×S array of preamplifiers is provided to detect electrons generated within the photomultiplier tube. The outputs of the preamplifiers are digitized to measure the magnitude of the signals from each preamplifier. For each photomultiplier tube, a corresponding summation circuitry including R row summation circuits and S column summation circuits numerically adds the magnitudes of the signals from the preamplifiers for each row and for each column to generate histograms. For a P×Q array of photomultiplier tubes, P×Q summation circuitries generate P×Q row histograms including R entries and P×Q column histograms including S entries. The total set of histograms includes P×Q×(R+S) entries, which can be analyzed by a position calculation circuit to determine the locations of events (detection of a neutron).
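
    The row/column summation itself is simple to sketch in software terms (the patent implements it in circuitry); the centroid-style position estimate shown here is only an assumption, since the patent's position calculation circuit is not described in the abstract.

    ```python
    import numpy as np

    def row_column_histograms(preamp_magnitudes):
        """Row and column sums of an R x S grid of digitized preamplifier signals
        for one photomultiplier tube and one event."""
        grid = np.asarray(preamp_magnitudes, dtype=float)
        row_hist = grid.sum(axis=1)      # R entries, one per row
        col_hist = grid.sum(axis=0)      # S entries, one per column
        return row_hist, col_hist

    def event_position(row_hist, col_hist):
        """Hypothetical centroid estimate of the event location from the two histograms."""
        r = np.average(np.arange(row_hist.size), weights=row_hist)
        c = np.average(np.arange(col_hist.size), weights=col_hist)
        return r, c
    ```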

  7. Evaluation of breast cancer using intravoxel incoherent motion (IVIM) histogram analysis: comparison with malignant status, histological subtype, and molecular prognostic factors.

    PubMed

    Cho, Gene Young; Moy, Linda; Kim, Sungheon G; Baete, Steven H; Moccaldi, Melanie; Babb, James S; Sodickson, Daniel K; Sigmund, Eric E

    2016-08-01

    To examine heterogeneous breast cancer through intravoxel incoherent motion (IVIM) histogram analysis. This HIPAA-compliant, IRB-approved retrospective study included 62 patients (age 48.44 ± 11.14 years, 50 malignant lesions and 12 benign) who underwent contrast-enhanced 3 T breast MRI and diffusion-weighted imaging. Apparent diffusion coefficient (ADC) and IVIM biomarkers of tissue diffusivity (Dt), perfusion fraction (fp), and pseudo-diffusivity (Dp) were calculated using voxel-based analysis for the whole lesion volume. Histogram analysis was performed to quantify tumour heterogeneity. Comparisons were made using Mann-Whitney tests between benign/malignant status, histological subtype, and molecular prognostic factor status while Spearman's rank correlation was used to characterize the association between imaging biomarkers and prognostic factor expression. The average values of the ADC and IVIM biomarkers, Dt and fp, showed significant differences between benign and malignant lesions. Additional significant differences were found in the histogram parameters among tumour subtypes and molecular prognostic factor status. IVIM histogram metrics, particularly fp and Dp, showed significant correlation with hormonal factor expression. Advanced diffusion imaging biomarkers show relationships with molecular prognostic factors and breast cancer malignancy. This analysis reveals novel diagnostic metrics that may explain some of the observed variability in treatment response among breast cancer patients. • Novel IVIM biomarkers characterize heterogeneous breast cancer. • Histogram analysis enables quantification of tumour heterogeneity. • IVIM biomarkers show relationships with breast cancer malignancy and molecular prognostic factors.
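
    The IVIM biomarkers named above (fp, Dp, Dt) come from the biexponential IVIM signal model S(b)/S0 = fp*exp(-b*Dp) + (1 - fp)*exp(-b*Dt). A minimal voxel-wise fit is sketched below; the starting values and bounds are typical assumptions, not the study's protocol.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def ivim_model(b, fp, Dp, Dt):
        """Biexponential IVIM model for the normalized signal S(b)/S0."""
        return fp * np.exp(-b * Dp) + (1.0 - fp) * np.exp(-b * Dt)

    def fit_ivim(b_values, signal):
        """Fit fp, Dp, Dt for one voxel; 'signal' must be normalized to S(b=0)."""
        p0 = (0.1, 0.01, 0.001)                       # assumed starting values
        bounds = ([0.0, 0.0, 0.0], [1.0, 0.5, 0.01])  # b in s/mm^2, D in mm^2/s
        popt, _ = curve_fit(ivim_model, np.asarray(b_values, float),
                            np.asarray(signal, float), p0=p0, bounds=bounds)
        return dict(zip(("fp", "Dp", "Dt"), popt))
    ```

    Repeating the fit for every voxel in the lesion yields the parameter maps whose histograms are then summarized (mean, percentiles, skewness, kurtosis) as in the study.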

  8. Whole-tumor histogram analysis of the cerebral blood volume map: tumor volume defined by 11C-methionine positron emission tomography image improves the diagnostic accuracy of cerebral glioma grading.

    PubMed

    Wu, Rongli; Watanabe, Yoshiyuki; Arisawa, Atsuko; Takahashi, Hiroto; Tanaka, Hisashi; Fujimoto, Yasunori; Watabe, Tadashi; Isohashi, Kayako; Hatazawa, Jun; Tomiyama, Noriyuki

    2017-10-01

    This study aimed to compare tumor volume definitions based on conventional magnetic resonance (MR) and 11C-methionine positron emission tomography (MET/PET) images for pre-operative differentiation of glioma grade using whole-tumor histogram analysis of normalized cerebral blood volume (nCBV) maps. Thirty-four patients with histopathologically proven primary brain low-grade gliomas (n = 15) and high-grade gliomas (n = 19) underwent pre-operative or pre-biopsy MET/PET, fluid-attenuated inversion recovery, dynamic susceptibility contrast perfusion-weighted magnetic resonance imaging, and contrast-enhanced T1-weighted imaging at 3.0 T. The histogram distribution derived from the nCBV maps was obtained by co-registering the whole tumor volume delineated on conventional MR or MET/PET images, and eight histogram parameters were assessed. The mean nCBV value had the highest AUC (0.906) based on MET/PET images. Diagnostic accuracy improved significantly when the tumor volume was measured from MET/PET images compared with conventional MR images for the mean, 50th, and 75th percentile nCBV values (p = 0.0246, 0.0223, and 0.0150, respectively). Whole-tumor histogram analysis of the CBV map provides more valuable histogram parameters and increases diagnostic accuracy in the differentiation of pre-operative cerebral gliomas when the tumor volume is derived from MET/PET images.

  9. Effect of respiratory and cardiac gating on the major diffusion-imaging metrics.

    PubMed

    Hamaguchi, Hiroyuki; Tha, Khin Khin; Sugimori, Hiroyuki; Nakanishi, Mitsuhiro; Nakagawa, Shin; Fujiwara, Taro; Yoshida, Hirokazu; Takamori, Sayaka; Shirato, Hiroki

    2016-08-01

    The effect of respiratory gating on the major diffusion-imaging metrics and that of cardiac gating on mean kurtosis (MK) are not known. For evaluation of whether the major diffusion-imaging metrics-MK, fractional anisotropy (FA), and mean diffusivity (MD) of the brain-varied between gated and non-gated acquisitions, respiratory-gated, cardiac-gated, and non-gated diffusion-imaging of the brain were performed in 10 healthy volunteers. MK, FA, and MD maps were constructed for all acquisitions, and the histograms were constructed. The normalized peak height and location of the histograms were compared among the acquisitions by use of Friedman and post hoc Wilcoxon tests. The effect of the repetition time (TR) on the diffusion-imaging metrics was also tested, and we corrected for its variation among acquisitions, if necessary. The results showed a shift in the peak location of the MK and MD histograms to the right with an increase in TR (p ≤ 0.01). The corrected peak location of the MK histograms, the normalized peak height of the FA histograms, the normalized peak height and the corrected peak location of the MD histograms varied significantly between the gated and non-gated acquisitions (p < 0.05). These results imply an influence of respiration and cardiac pulsation on the major diffusion-imaging metrics. The gating conditions must be kept identical if reproducible results are to be achieved. © The Author(s) 2016.

  10. Uyghur face recognition method combining 2DDCT with POEM

    NASA Astrophysics Data System (ADS)

    Yi, Lihamu; Ya, Ermaimaiti

    2017-11-01

    In this paper, in light of the reduced recognition rate and poor robustness of Uyghur face recognition under illumination changes and partial occlusion, a Uyghur face recognition method combining the two-dimensional discrete cosine transform (2DDCT) with patterns of oriented edge magnitudes (POEM) was proposed. Firstly, the Uyghur face images were divided into 8×8 block matrices, and the blocked images were converted into the frequency domain using the 2DDCT; secondly, the images were compressed by excluding the non-sensitive medium-frequency and non-high-frequency parts, which reduces the feature dimensions needed for the Uyghur face images and further reduces the amount of computation; thirdly, the corresponding POEM histograms of the Uyghur face images were obtained by calculating the POEM feature quantities; fourthly, the POEM histograms were concatenated as the texture histogram of the central feature point to obtain the texture features of the Uyghur face feature points; finally, classification of the training samples was carried out using a deep learning algorithm. The simulation experiment results showed that the proposed algorithm further improved the recognition rate on the self-built Uyghur face database, greatly improved the computing speed on the self-built Uyghur face database, and had strong robustness.

  11. A Concise Guide to Feature Histograms with Applications to LIDAR-Based Spacecraft Relative Navigation

    NASA Astrophysics Data System (ADS)

    Rhodes, Andrew P.; Christian, John A.; Evans, Thomas

    2017-12-01

    With the availability and popularity of 3D sensors, it is advantageous to re-examine the use of point cloud descriptors for the purpose of pose estimation and spacecraft relative navigation. One popular descriptor is the oriented unique repeatable clustered viewpoint feature histogram (OUR-CVFH), which is most often utilized in personal and industrial robotics to simultaneously recognize and navigate relative to an object. Recent research into using the OUR-CVFH descriptor for spacecraft navigation has produced favorable results. Since OUR-CVFH is the most recent innovation in a large family of feature histogram point cloud descriptors, discussions of parameter settings and insights into its functionality are spread among various publications and online resources. This paper organizes the history of feature histogram point cloud descriptors for a straightforward explanation of their evolution. This article compiles all the requisite information needed to implement OUR-CVFH into one location, as well as providing useful suggestions on how to tune the generation parameters. This work is beneficial for anyone interested in using this histogram descriptor for object recognition or navigation, be it personal robotics or spacecraft navigation.

  12. Improved LSB matching steganography with histogram characters reserved

    NASA Astrophysics Data System (ADS)

    Chen, Zhihong; Liu, Wenyao

    2008-03-01

    This letter builds on research into the LSB (least significant bit, i.e. the last bit of a binary pixel value) matching steganographic method and on steganalytic methods that target the histograms of cover images, and proposes a modification to LSB matching. In LSB matching, if the LSB of the next cover pixel matches the next bit of secret data, nothing is done; otherwise, one is added to or subtracted from the cover pixel value at random. In our improved method, a steganographic information table is defined that records the changes introduced by the embedded secret bits. Using the table, the decision to add or subtract one for the next pixel with the same value is made dynamically so that the change to the cover image's histogram is minimized. The modified method therefore embeds the same payload as LSB matching but with improved steganographic security and less vulnerability to attacks. The experimental results of the new method show that the histograms maintain their attributes, such as peak values and overall trends, to an acceptable degree and perform better than LSB matching with respect to histogram distortion and resistance against existing steganalysis.
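
    A simplified sketch of the histogram-aware +/-1 choice is given below; the running per-grey-level change counter stands in for the paper's steganographic information table, which is not fully specified in the abstract.

    ```python
    import numpy as np

    def lsb_match_embed(pixels, bits):
        """LSB matching in which the +/-1 choice is steered by a running change
        table so the cover histogram drifts as little as possible (a sketch)."""
        stego = pixels.astype(np.int16).ravel().copy()   # int16 avoids overflow at 0/255
        net_change = np.zeros(256, dtype=int)            # drift of each grey-level count
        for i, bit in enumerate(bits):                   # assumes len(bits) <= pixels.size
            v = int(stego[i])
            if (v & 1) == bit:
                continue                                 # LSB already matches the secret bit
            cands = [c for c in (v - 1, v + 1) if 0 <= c <= 255]
            new = min(cands, key=lambda c: net_change[c])  # refill the most-depleted level
            net_change[v] -= 1
            net_change[new] += 1
            stego[i] = new
        return stego.reshape(pixels.shape).astype(np.uint8)
    ```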

  13. Diagnosis of Tempromandibular Disorders Using Local Binary Patterns

    PubMed Central

    Haghnegahdar, A.A.; Kolahi, S.; Khojastepour, L.; Tajeripour, F.

    2018-01-01

    Background: Temporomandibular joint disorder (TMD) may be manifested as structural changes in bone through modification, adaptation or direct destruction. We propose to use Local Binary Pattern (LBP) characteristics and histogram-oriented gradients computed on the recorded images as a diagnostic tool in TMD assessment. Material and Methods: CBCT images of 66 patients (132 joints) with TMD and 66 normal cases (132 joints) were collected, and two coronal cuts were prepared from each condyle; images were limited to the head of the mandibular condyle. To extract image features, we first applied LBP and then the histogram of oriented gradients. To reduce dimensionality, the linear-algebra Singular Value Decomposition (SVD) was applied to the feature-vector matrix of all images. For evaluation, we used K nearest neighbor (K-NN), Support Vector Machine, Naïve Bayesian and Random Forest classifiers. We used Receiver Operating Characteristic (ROC) analysis to evaluate the hypothesis. Results: The K nearest neighbor classifier achieves very good accuracy (0.9242) as well as desirable sensitivity (0.9470) and specificity (0.9015), whereas the other classifiers have lower accuracy, sensitivity and specificity. Conclusion: We proposed a fully automatic approach to detect TMD using image processing techniques based on local binary patterns and feature extraction. K-NN was the best classifier in our experiments for distinguishing patients from healthy individuals, with 92.42% accuracy, 94.70% sensitivity and 90.15% specificity. The proposed method can help automatically diagnose TMD at its initial stages. PMID:29732343

  14. Histograms and Frequency Density.

    ERIC Educational Resources Information Center

    Micromath, 2003

    2003-01-01

    Introduces exercises on histograms and frequency density. Guides pupils to Discovering Important Statistical Concepts Using Spreadsheets (DISCUSS), created at the University of Coventry. Includes curriculum points, teaching tips, activities, and internet address (http://www.coventry.ac.uk/discuss/). (KHR)

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad Allen

    EDENx is a multivariate data visualization tool that allows interactive user-driven analysis of large-scale data sets with high dimensionality. EDENx builds on our earlier system, called EDEN, to enable analysis of more dimensions and larger scale data sets. EDENx provides an initial overview of summary statistics for each variable in the data set under investigation. EDENx allows the user to interact with graphical summary plots of the data to investigate subsets and their statistical associations. These plots include histograms, binned scatterplots, binned parallel coordinate plots, timeline plots, and graphical correlation indicators. From the EDENx interface, a user can select a subsample of interest and launch a more detailed data visualization via the EDEN system. EDENx is best suited for high-level, aggregate analysis tasks, while EDEN is more appropriate for detailed data investigations.

  16. The DataCube Server. Animate Agent Project Working Note 2, Version 1.0

    DTIC Science & Technology

    1993-11-01

    before this can be called, a histogram of all the needed levels must be made and their one-band images must be made. Note that if a level's backprojection will not be used, then the level does not need to be histogrammed. Any points outside the active region in a level's backprojection will be undefined...

  17. Variability in CT lung-nodule quantification: Effects of dose reduction and reconstruction methods on density and texture based features.

    PubMed

    Lo, P; Young, S; Kim, H J; Brown, M S; McNitt-Gray, M F

    2016-08-01

    To investigate the effects of dose level and reconstruction method on density and texture based features computed from CT lung nodules. This study had two major components. In the first component, a uniform water phantom was scanned at three dose levels and images were reconstructed using four conventional filtered backprojection (FBP) and four iterative reconstruction (IR) methods for a total of 24 different combinations of acquisition and reconstruction conditions. In the second component, raw projection (sinogram) data were obtained for 33 lung nodules from patients scanned as a part of their clinical practice, where low dose acquisitions were simulated by adding noise to sinograms acquired at clinical dose levels (a total of four dose levels) and reconstructed using one FBP kernel and two IR kernels for a total of 12 conditions. For the water phantom, spherical regions of interest (ROIs) were created at multiple locations within the water phantom on one reference image obtained at a reference condition. For the lung nodule cases, the ROI of each nodule was contoured semiautomatically (with manual editing) from images obtained at a reference condition. All ROIs were applied to their corresponding images reconstructed at different conditions. For 17 of the nodule cases, repeat contours were performed to assess repeatability. Histogram (eight features) and gray level co-occurrence matrix (GLCM) based texture features (34 features) were computed for all ROIs. For the lung nodule cases, the reference condition was selected to be 100% of clinical dose with FBP reconstruction using the B45f kernel; feature values calculated from other conditions were compared to this reference condition. A measure, which the authors refer to as Q, was introduced to assess the stability of features across different conditions; it is defined as the ratio of reproducibility (across conditions) to repeatability (across repeat contours) of each feature. The water phantom results demonstrated substantial variability among feature values calculated across conditions, with the exception of histogram mean. Features calculated from lung nodules demonstrated similar results, with histogram mean as the most robust feature (Q ≤ 1), having a mean and standard deviation Q of 0.37 and 0.22, respectively. Surprisingly, histogram standard deviation and variance features were also quite robust. Some GLCM features were also quite robust across conditions, namely, diff. variance, sum variance, sum average, variance, and mean. Except for histogram mean, all features had a Q larger than one in at least one of the 3% dose level conditions. As expected, the histogram mean was the most robust feature in this study. The effects of acquisition and reconstruction conditions on GLCM features varied widely, although features involving a summation of products between intensities and probabilities tended to be more robust, barring a few exceptions. Overall, care should be taken to account for variation in density and texture features if a variety of dose and reconstruction conditions are used for the quantification of lung nodules in CT; otherwise, changes in quantification results may be more reflective of changes due to acquisition and reconstruction conditions than of changes in the nodule itself.
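    The stability measure Q described above (reproducibility across conditions divided by repeatability across repeat contours) can be sketched as follows; using the sample standard deviation as the dispersion measure on both sides of the ratio is an assumption for illustration only.

        import numpy as np

        def stability_q(values_across_conditions, values_across_repeat_contours):
            """Q = spread of a feature across acquisition/reconstruction conditions
            divided by its spread across repeat contours; Q <= 1 suggests the
            feature varies no more across conditions than across contours."""
            reproducibility = np.std(values_across_conditions, ddof=1)
            repeatability = np.std(values_across_repeat_contours, ddof=1)
            return reproducibility / repeatability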

  18. Action recognition using multi-scale histograms of oriented gradients based depth motion trail Images

    NASA Astrophysics Data System (ADS)

    Wang, Guanxi; Tie, Yun; Qi, Lin

    2017-07-01

    In this paper, we propose a novel approach based on depth maps and compute Multi-Scale Histograms of Oriented Gradients (MSHOG) from sequences of depth maps to recognize actions. Each depth frame in a depth video sequence is projected onto three orthogonal Cartesian planes. Under each projection view, the absolute difference between two consecutive projected maps is accumulated through the depth video sequence to form a Depth Motion Trail Image (DMTI). The MSHOG is then computed from these maps for the representation of an action. In addition, we apply L2-Regularized Collaborative Representation (L2-CRC) to classify actions. We evaluate the proposed approach on the MSR Action3D and MSRGesture3D datasets. Promising experimental results demonstrate the effectiveness of the proposed method.
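    The accumulation step that produces a Depth Motion Trail Image for one projection view can be written in a few lines of Python; the MSHOG descriptor and the L2-CRC classifier themselves are not sketched here.

        import numpy as np

        def depth_motion_trail(projected_frames):
            """Sum |D_{t+1} - D_t| over a sequence of projected depth maps for
            one Cartesian view, giving a Depth Motion Trail Image (DMTI)."""
            frames = np.asarray(projected_frames, dtype=np.float32)
            return np.abs(np.diff(frames, axis=0)).sum(axis=0)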

  19. Aromaticity of benzene derivatives: an exploration of the Cambridge Structural Database.

    PubMed

    Majerz, Irena; Dziembowska, Teresa

    2018-04-01

    The harmonic oscillator model of aromaticity (HOMA) index, one of the most popular aromaticity indices, has been analyzed for solid-state benzene rings in the Cambridge Structural Database (CSD). Histograms of HOMA were investigated for benzene, for benzene derivatives with one formyl, nitro, amino or hydroxy group, and for derivatives with two formyl, nitro, amino or hydroxy groups in the ortho, meta and para positions. The majority of the substituted benzene derivatives in the CSD are characterized by a high value of HOMA, indicating fully aromatic character; however, the distribution of HOMA values from 1 down to about 0 indicates aromaticity decreasing to non-aromatic character. Among the benzene derivatives investigated, a significant decrease in aromaticity can be related to compounds with diamino and dinitro groups in the meta position.

  20. Diffusion Profiling via a Histogram Approach Distinguishes Low-grade from High-grade Meningiomas, Can Reflect the Respective Proliferative Potential and Progesterone Receptor Status.

    PubMed

    Gihr, Georg Alexander; Horvath-Rizea, Diana; Garnov, Nikita; Kohlhof-Meinecke, Patricia; Ganslandt, Oliver; Henkes, Hans; Meyer, Hans Jonas; Hoffmann, Karl-Titus; Surov, Alexey; Schob, Stefan

    2018-02-01

    Presurgical grading, estimation of growth kinetics, and other prognostic factors are becoming increasingly important for selecting the best therapeutic approach for meningioma patients. Diffusion-weighted imaging (DWI) provides microstructural information and reflects tumor biology. A novel DWI approach, histogram profiling of apparent diffusion coefficient (ADC) volumes, provides more distinct information than conventional DWI. Therefore, our study investigated whether ADC histogram profiling distinguishes low-grade from high-grade lesions and reflects Ki-67 expression and progesterone receptor status. Pretreatment ADC volumes of 37 meningioma patients (28 low-grade, 9 high-grade) were used for histogram profiling. WHO grade, Ki-67 expression, and progesterone receptor status were evaluated. Comparative and correlative statistics investigating the association between histogram profiling and neuropathology were performed. The entire ADC profile (p10, p25, p75, p90, mean, median) was significantly lower in high-grade than in low-grade meningiomas. The lower percentiles, mean, and mode showed significant correlations with Ki-67 expression. Skewness and entropy of the ADC volumes were significantly associated with progesterone receptor status and Ki-67 expression. ROC analysis revealed entropy to be the most accurate parameter for distinguishing low-grade from high-grade meningiomas. ADC histogram profiling provides a distinct set of parameters which help differentiate low-grade from high-grade meningiomas, and the histogram metrics correlate significantly with histological surrogates of the respective proliferative potential. More specifically, entropy proved to be the most promising imaging biomarker for presurgical grading. Both entropy and skewness were significantly associated with progesterone receptor status and Ki-67 expression and should therefore be investigated further as predictors of prognostically relevant tumor-biological features. Since absolute ADC values vary between MRI scanners of different vendors and field strengths, their use in the presurgical setting is more limited.
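    A generic Python sketch of this kind of ADC histogram profiling (percentiles, mean, median, mode, skewness, kurtosis, entropy) is given below; the bin count and the base-2 entropy are arbitrary illustrative choices rather than the study's protocol.

        import numpy as np
        from scipy import stats

        def adc_histogram_profile(adc_voxels, n_bins=128):
            """Histogram profiling of an ADC volume restricted to a tumor mask."""
            v = np.asarray(adc_voxels, dtype=float).ravel()
            hist, edges = np.histogram(v, bins=n_bins)
            p = hist / hist.sum()
            p = p[p > 0]  # drop empty bins before the entropy sum
            return {
                "p10": np.percentile(v, 10), "p25": np.percentile(v, 25),
                "p75": np.percentile(v, 75), "p90": np.percentile(v, 90),
                "mean": v.mean(), "median": np.median(v),
                "mode": 0.5 * (edges[np.argmax(hist)] + edges[np.argmax(hist) + 1]),
                "skewness": stats.skew(v), "kurtosis": stats.kurtosis(v),
                "entropy": float(-(p * np.log2(p)).sum()),
            }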

  1. Histogram Analysis of CT Perfusion of Hepatocellular Carcinoma for Predicting Response to Transarterial Radioembolization: Value of Tumor Heterogeneity Assessment.

    PubMed

    Reiner, Caecilia S; Gordic, Sonja; Puippe, Gilbert; Morsbach, Fabian; Wurnig, Moritz; Schaefer, Niklaus; Veit-Haibach, Patrick; Pfammatter, Thomas; Alkadhi, Hatem

    2016-03-01

    To evaluate, in patients with hepatocellular carcinoma (HCC), whether assessment of tumor heterogeneity by histogram analysis of computed tomography (CT) perfusion helps predict response to transarterial radioembolization (TARE). Sixteen patients (15 male; mean age 65 years; age range 47-80 years) with HCC underwent CT liver perfusion for treatment planning prior to TARE with Yttrium-90 microspheres. Arterial perfusion (AP) derived from CT perfusion was measured in the entire tumor volume, and heterogeneity was analyzed voxel-wise by histogram analysis. Response to TARE was evaluated on follow-up imaging (median follow-up, 129 days) based on modified Response Evaluation Criteria in Solid Tumors (mRECIST). Results of histogram analysis and mean AP values of the tumor were compared between responders and non-responders. Receiver operating characteristics were calculated to determine the parameters' ability to discriminate responders from non-responders. According to mRECIST, 8 patients (50%) were responders and 8 (50%) non-responders. Comparing responders and non-responders, the 50th and 75th percentiles of AP derived from histogram analysis were significantly different (AP 43.8/54.3 vs. 27.6/34.3 mL min(-1) 100 mL(-1); p < 0.05), while the mean AP of the HCCs (43.5 vs. 27.9 mL min(-1) 100 mL(-1); p > 0.05) was not. Further heterogeneity parameters from histogram analysis (skewness, coefficient of variation, and 25th percentile) did not differ between responders and non-responders (p > 0.05). If the cut-off for the 75th percentile was set to an AP of 37.5 mL min(-1) 100 mL(-1), therapy response could be predicted with a sensitivity of 88% (7/8) and specificity of 75% (6/8). Voxel-wise histogram analysis of pretreatment CT perfusion, indicating tumor heterogeneity of HCC, improves the pretreatment prediction of response to TARE.

  2. Histogram analysis of apparent diffusion coefficient in differentiating adrenal adenoma from pheochromocytoma.

    PubMed

    Umanodan, Tomokazu; Fukukura, Yoshihiko; Kumagae, Yuichi; Shindo, Toshikazu; Nakajo, Masatoyo; Takumi, Koji; Nakajo, Masanori; Hakamada, Hiroto; Umanodan, Aya; Yoshiura, Takashi

    2017-04-01

    To determine the diagnostic performance of apparent diffusion coefficient (ADC) histogram analysis in diffusion-weighted (DW) magnetic resonance imaging (MRI) for differentiating adrenal adenoma from pheochromocytoma. We retrospectively evaluated 52 adrenal tumors (39 adenomas and 13 pheochromocytomas) in 47 patients (21 men, 26 women; mean age, 59.3 years; range, 16-86 years) who underwent DW 3.0T MRI. Histogram parameters of ADC (b-values of 0 and 200 [ADC200], 0 and 400 [ADC400], and 0 and 800 s/mm2 [ADC800]) - mean, variance, coefficient of variation (CV), kurtosis, skewness, and entropy - were compared between adrenal adenomas and pheochromocytomas, using the Mann-Whitney U-test. Receiver operating characteristic (ROC) curves for the histogram parameters were generated to differentiate adrenal adenomas from pheochromocytomas. Sensitivity and specificity were calculated by using a threshold criterion that would maximize the average of sensitivity and specificity. Variance and CV of ADC800 were significantly higher in pheochromocytomas than in adrenal adenomas (P < 0.001 and P = 0.001, respectively). With all b-value combinations, the entropy of ADC was significantly higher in pheochromocytomas than in adrenal adenomas (all P ≤ 0.001), and showed the highest area under the ROC curve among the ADC histogram parameters for diagnosing adrenal adenomas (ADC200, 0.82; ADC400, 0.87; and ADC800, 0.92), with sensitivity of 84.6% and specificity of 84.6% (cutoff, ≤2.82) with ADC200; sensitivity of 89.7% and specificity of 84.6% (cutoff, ≤2.77) with ADC400; and sensitivity of 94.9% and specificity of 92.3% (cutoff, ≤2.67) with ADC800. ADC histogram analysis of DW MRI can help differentiate adrenal adenoma from pheochromocytoma. Level of Evidence: 3. J. Magn. Reson. Imaging 2017;45:1195-1203. © 2016 International Society for Magnetic Resonance in Medicine.

  3. [Image Feature Extraction and Discriminant Analysis of Xinjiang Uygur Medicine Based on Color Histogram].

    PubMed

    Hamit, Murat; Yun, Weikang; Yan, Chuanbo; Kutluk, Abdugheni; Fang, Yang; Alip, Elzat

    2015-06-01

    Image feature extraction is an important part of image processing and an important field of research and application in image processing technology. Uygur medicine is a branch of traditional Chinese medicine that is receiving increasing attention from researchers, but large amounts of Uygur medicine data have not been fully utilized. In this study, we extracted image color histogram features of Xinjiang Uygur herbal and zooid medicines. First, we performed preprocessing, including image color enhancement, size normalization and color space transformation. We then extracted color histogram features and analyzed them with statistical methods. Finally, we evaluated the classification ability of the features by Bayes discriminant analysis. Experimental results showed that high accuracy for Uygur medicine image classification was obtained by using color histogram features. This study should be of help for content-based medical image retrieval of Xinjiang Uygur medicine.
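    A minimal Python sketch of extracting a concatenated per-channel color histogram feature, of the kind described above, is shown below; the bin count and L1 normalization are illustrative assumptions.

        import numpy as np

        def color_histogram_feature(rgb_image, bins_per_channel=16):
            """Concatenated, L1-normalized per-channel histograms of an RGB image."""
            img = np.asarray(rgb_image)
            channel_hists = [
                np.histogram(img[..., c], bins=bins_per_channel, range=(0, 256))[0]
                for c in range(3)
            ]
            feature = np.concatenate(channel_hists).astype(float)
            return feature / feature.sum()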

  4. Advanced concentration analysis of atom probe tomography data: Local proximity histograms and pseudo-2D concentration maps.

    PubMed

    Felfer, Peter; Cairney, Julie

    2018-06-01

    Analysing the distribution of selected chemical elements with respect to interfaces is one of the most common data-mining tasks in atom probe tomography. This can be represented by 1D concentration profiles, 2D concentration maps or proximity histograms, which represent the concentration, density etc. of selected species as a function of the distance from a reference surface/interface; these are some of the most useful tools for the analysis of solute distributions in atom probe data. In this paper, we present extensions to the proximity histogram in the form of 'local' proximity histograms, calculated for selected parts of a surface, and pseudo-2D concentration maps, which are 2D concentration maps calculated on non-flat surfaces. In this way, local concentration changes at interfaces and other structures can be assessed more effectively. Copyright © 2018 Elsevier B.V. All rights reserved.
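    The core of a (global) proximity histogram can be sketched in a few lines of Python, given precomputed signed distances of atoms from the reference interface; the bin width and the simple count ratio used for concentration are illustrative assumptions.

        import numpy as np

        def proximity_histogram(distances, is_solute, bin_width=0.5):
            """Solute concentration vs. distance from a reference interface."""
            d = np.asarray(distances, dtype=float)
            solute = np.asarray(is_solute, dtype=bool)
            edges = np.arange(d.min(), d.max() + bin_width, bin_width)
            bin_index = np.digitize(d, edges) - 1
            centers, concentration = [], []
            for b in range(len(edges) - 1):
                in_bin = bin_index == b
                if in_bin.any():
                    centers.append(0.5 * (edges[b] + edges[b + 1]))
                    concentration.append(solute[in_bin].mean())
            return np.array(centers), np.array(concentration)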

  5. Three-Class Mammogram Classification Based on Descriptive CNN Features

    PubMed Central

    Zhang, Qianni; Jadoon, Adeel

    2017-01-01

    In this paper, a novel classification technique for a large data set of mammograms using a deep learning method is proposed. The proposed model targets a three-class classification study (normal, malignant, and benign cases). In our model we present two methods, namely, convolutional neural network-discrete wavelet (CNN-DW) and convolutional neural network-curvelet transform (CNN-CT). An augmented data set is generated by using mammogram patches. To enhance the contrast of the mammogram images, the data set is filtered by contrast limited adaptive histogram equalization (CLAHE). In the CNN-DW method, enhanced mammogram images are decomposed into four subbands by means of the two-dimensional discrete wavelet transform (2D-DWT), while in the second method the discrete curvelet transform (DCT) is used. In both methods, dense scale-invariant feature transform (DSIFT) descriptors are extracted for all subbands. An input data matrix containing these subband features of all the mammogram patches is created and processed as input to a convolutional neural network (CNN). A softmax layer and a support vector machine (SVM) layer are used to train the CNN for classification. The proposed methods have been compared with existing methods in terms of accuracy rate, error rate, and various validation assessment measures. CNN-DW and CNN-CT achieved accuracy rates of 81.83% and 83.74%, respectively. Simulation results clearly validate the significance and impact of our proposed model as compared to other well-known existing techniques. PMID:28191461

  6. Three-Class Mammogram Classification Based on Descriptive CNN Features.

    PubMed

    Jadoon, M Mohsin; Zhang, Qianni; Haq, Ihsan Ul; Butt, Sharjeel; Jadoon, Adeel

    2017-01-01

    In this paper, a novel classification technique for a large data set of mammograms using a deep learning method is proposed. The proposed model targets a three-class classification study (normal, malignant, and benign cases). In our model we present two methods, namely, convolutional neural network-discrete wavelet (CNN-DW) and convolutional neural network-curvelet transform (CNN-CT). An augmented data set is generated by using mammogram patches. To enhance the contrast of the mammogram images, the data set is filtered by contrast limited adaptive histogram equalization (CLAHE). In the CNN-DW method, enhanced mammogram images are decomposed into four subbands by means of the two-dimensional discrete wavelet transform (2D-DWT), while in the second method the discrete curvelet transform (DCT) is used. In both methods, dense scale-invariant feature transform (DSIFT) descriptors are extracted for all subbands. An input data matrix containing these subband features of all the mammogram patches is created and processed as input to a convolutional neural network (CNN). A softmax layer and a support vector machine (SVM) layer are used to train the CNN for classification. The proposed methods have been compared with existing methods in terms of accuracy rate, error rate, and various validation assessment measures. CNN-DW and CNN-CT achieved accuracy rates of 81.83% and 83.74%, respectively. Simulation results clearly validate the significance and impact of our proposed model as compared to other well-known existing techniques.

  7. A hierarchical word-merging algorithm with class separability measure.

    PubMed

    Wang, Lei; Zhou, Luping; Shen, Chunhua; Liu, Lingqiao; Liu, Huan

    2014-03-01

    In image recognition with the bag-of-features model, a small-sized visual codebook is usually preferred to obtain a low-dimensional histogram representation and high computational efficiency. Such a visual codebook has to be discriminative enough to achieve excellent recognition performance. To create a compact and discriminative codebook, in this paper we propose to merge the visual words in a large-sized initial codebook by maximally preserving class separability. We first show that this results in a difficult optimization problem. To deal with this situation, we devise a suboptimal but very efficient hierarchical word-merging algorithm, which optimally merges two words at each level of the hierarchy. By exploiting the characteristics of the class separability measure and designing a novel indexing structure, the proposed algorithm can hierarchically merge 10,000 visual words down to two words in merely 90 seconds. Also, to show the properties of the proposed algorithm and reveal its advantages, we conduct detailed theoretical analysis to compare it with another hierarchical word-merging algorithm that maximally preserves mutual information, obtaining interesting findings. Experimental studies are conducted to verify the effectiveness of the proposed algorithm on multiple benchmark data sets. As shown, it can efficiently produce more compact and discriminative codebooks than the state-of-the-art hierarchical word-merging algorithms, especially when the size of the codebook is significantly reduced.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shin, D; Kang, S; Kim, D

    Purpose: The difference between the three-dimensional dose (3D dose) and the 4D dose, which accounts for respiratory motion, can vary according to the geometrical relationship between the planning target volume (PTV) and an organ at risk (OAR). The purpose of this study was to investigate the difference between 3D and 4D dose using the overlap volume histogram (OVH), an indicator that quantifies the geometrical relationship between a PTV and an OAR. Methods: Five liver cancer patients previously treated with stereotactic body radiotherapy (SBRT) were investigated. Four-dimensional computed tomography (4DCT) images were acquired for all patients, and ITV-based treatment planning was performed. The 3D dose was calculated on the end-exhale phase image as the reference phase image. 4D dose accumulation was implemented from all phase images using a dose-warping technique based on a deformable image registration (DIR) algorithm (Horn and Schunck optical flow) in DIRART. In this study the OVH was used to quantify the geometrical relationship between a PTV and an OAR. The OVH between the PTV and a selected OAR was generated for each patient case and compared across all cases. The dose difference between 3D and 4D dose for the normal organ was calculated and compared for all cases according to the OVH. Results: The 3D and 4D dose difference for the OAR was analyzed using dose-volume histograms (DVH). At a specific point corresponding to 10% of the OAR volume overlapped by the expanded PTV, the mean dose difference was 34.56% in the minimum OVH distance case and 13.36% in the maximum OVH distance case. As the OVH distance increased, the mean dose difference between 4D and 3D dose decreased. Conclusion: A consistent trend in the dose difference was verified according to the OVH. The OVH appears to be an indicator with the potential to predict the dose difference between 4D and 3D dose. This work was supported by the Radiation Technology R&D program (No. 2013M2A2A7043498) and the Mid-career Researcher Program (2014R1A2A1A10050270) through the National Research Foundation of Korea funded by the Ministry of Science, ICT&Future Planning.
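    A simple Python sketch of an overlap volume histogram, computed from binary PTV and OAR masks with a Euclidean distance transform, is shown below; only outward PTV expansion is considered, and the choice of expansion distances is an illustrative assumption.

        import numpy as np
        from scipy.ndimage import distance_transform_edt

        def overlap_volume_histogram(ptv_mask, oar_mask, expansion_mm, voxel_size_mm):
            """Fraction of the OAR lying within distance r of the PTV, for each
            expansion distance r (a basic, outward-only OVH)."""
            # distance (mm) from every voxel outside the PTV to the nearest PTV voxel
            dist_to_ptv = distance_transform_edt(~ptv_mask, sampling=voxel_size_mm)
            oar_distances = dist_to_ptv[oar_mask]
            return np.array([(oar_distances <= r).mean() for r in expansion_mm])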

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matney, Jason; Park, Peter C.; The University of Texas Graduate School of Biomedical Sciences, Houston, Texas

    Purpose: To quantify and compare the effects of respiratory motion on paired passively scattered proton therapy (PSPT) and intensity modulated photon therapy (IMRT) plans, and to establish the relationship between the magnitude of tumor motion and the respiratory-induced dose difference for both modalities. Methods and Materials: In a randomized clinical trial comparing PSPT and IMRT, radiation therapy plans were designed according to common planning protocols. Four-dimensional (4D) dose was computed for PSPT and IMRT plans for a patient cohort with respiratory motion ranging from 3 to 17 mm. Image registration and dose accumulation were performed using grayscale-based deformable image registration algorithms. The dose-volume histogram (DVH) differences (4D-3D [3D = 3-dimensional]) were compared for PSPT and IMRT. Changes in 4D-3D dose were correlated to the magnitude of tumor respiratory motion. Results: The average 4D-3D dose to 95% of the internal target volume was close to zero, with 19 of 20 patients within 1% of the prescribed dose for both modalities. The mean 4D-3D difference between the 2 modalities was not statistically significant (P<.05) for all dose-volume histogram indices (mean ± SD) except the lung V5 (PSPT: +1.1% ± 0.9%; IMRT: +0.4% ± 1.2%) and maximum cord dose (PSPT: +1.5 ± 2.9 Gy; IMRT: 0.0 ± 0.2 Gy). Changes in 4D-3D dose were correlated to tumor motion for only 2 indices: dose to 95% of the planning target volume, and heterogeneity index. Conclusions: With our current margin formalisms, target coverage was maintained in the presence of respiratory motion up to 17 mm for both PSPT and IMRT. Only 2 of 11 4D-3D indices (lung V5 and spinal cord maximum) were statistically distinguishable between PSPT and IMRT, contrary to the notion that proton therapy will be more susceptible to respiratory motion. Because of the lack of strong correlations with 4D-3D dose differences in PSPT and IMRT, the extent of tumor motion was not an adequate predictor of potential dosimetric error caused by breathing motion.

  10. Spatiotemporal models for the simulation of infrared backgrounds

    NASA Astrophysics Data System (ADS)

    Wilkes, Don M.; Cadzow, James A.; Peters, R. Alan, II; Li, Xingkang

    1992-09-01

    It is highly desirable for designers of automatic target recognizers (ATRs) to be able to test their algorithms on targets superimposed on a wide variety of background imagery. Background imagery in the infrared spectrum is expensive to gather from real sources, consequently, there is a need for accurate models for producing synthetic IR background imagery. We have developed a model for such imagery that will do the following: Given a real, infrared background image, generate another image, distinctly different from the one given, that has the same general visual characteristics as well as the first and second-order statistics of the original image. The proposed model consists of a finite impulse response (FIR) kernel convolved with an excitation function, and histogram modification applied to the final solution. A procedure for deriving the FIR kernel using a signal enhancement algorithm has been developed, and the histogram modification step is a simple memoryless nonlinear mapping that imposes the first order statistics of the original image onto the synthetic one, thus the overall model is a linear system cascaded with a memoryless nonlinearity. It has been found that the excitation function relates to the placement of features in the image, the FIR kernel controls the sharpness of the edges and the global spectrum of the image, and the histogram controls the basic coloration of the image. A drawback to this method of simulating IR backgrounds is that a database of actual background images must be collected in order to produce accurate FIR and histogram models. If this database must include images of all types of backgrounds obtained at all times of the day and all times of the year, the size of the database would be prohibitive. In this paper we propose improvements to the model described above that enable time-dependent modeling of the IR background. This approach can greatly reduce the number of actual IR backgrounds that are required to produce a sufficiently accurate mathematical model for synthesizing a similar IR background for different times of the day. Original and synthetic IR backgrounds will be presented. Previous research in simulating IR backgrounds was performed by Strenzwilk, et al., Botkin, et al., and Rapp. The most recent work of Strenzwilk, et al. was based on the use of one-dimensional ARMA models for synthesizing the images. Their results were able to retain the global statistical and spectral behavior of the original image, but the synthetic image was not visually very similar to the original. The research presented in this paper is the result of an attempt to improve upon their results, and represents a significant improvement in quality over previously obtained results.
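    The histogram-modification stage described above (a memoryless nonlinear mapping imposing the original image's first-order statistics on the synthetic image) can be sketched with a simple rank-order mapping in Python; the FIR kernel derivation and excitation model are not shown here.

        import numpy as np

        def impose_first_order_statistics(synthetic, reference):
            """Rank-order histogram matching: the k-th smallest synthetic pixel
            takes the k-th quantile of the reference image's gray levels."""
            syn = np.asarray(synthetic, dtype=float).ravel()
            ref = np.sort(np.asarray(reference, dtype=float).ravel())
            order = np.argsort(syn)
            matched = np.empty_like(syn)
            matched[order] = ref[np.linspace(0, ref.size - 1, syn.size).astype(int)]
            return matched.reshape(np.shape(synthetic))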

  11. Fast and fully automatic phalanx segmentation using a grayscale-histogram morphology algorithm

    NASA Astrophysics Data System (ADS)

    Hsieh, Chi-Wen; Liu, Tzu-Chiang; Jong, Tai-Lang; Chen, Chih-Yen; Tiu, Chui-Mei; Chan, Din-Yuen

    2011-08-01

    Bone age assessment is a common radiological examination used in pediatrics to diagnose the discrepancy between the skeletal and chronological age of a child; therefore, it is beneficial to develop a computer-based bone age assessment to help junior pediatricians estimate bone age easily. Unfortunately, the phalanx on radiograms is not easily separated from the background and soft tissue. Therefore, we proposed a new method, called the grayscale-histogram morphology algorithm, to segment the phalanges fast and precisely. The algorithm includes three parts: a tri-stage sieve algorithm used to eliminate the background of hand radiograms, a centroid-edge dual scanning algorithm to frame the phalanx region, and finally a segmentation algorithm based on disk traverse-subtraction filter to segment the phalanx. Moreover, two more segmentation methods: adaptive two-mean and adaptive two-mean clustering were performed, and their results were compared with the segmentation algorithm based on disk traverse-subtraction filter using five indices comprising misclassification error, relative foreground area error, modified Hausdorff distances, edge mismatch, and region nonuniformity. In addition, the CPU time of the three segmentation methods was discussed. The result showed that our method had a better performance than the other two methods. Furthermore, satisfactory segmentation results were obtained with a low standard error.

  12. Quantum effects and anharmonicity in the H2-Li+-benzene complex: A model for hydrogen storage materials

    NASA Astrophysics Data System (ADS)

    Kolmann, Stephen J.; D'Arcy, Jordan H.; Jordan, Meredith J. T.

    2013-12-01

    Quantum and anharmonic effects are investigated in H2-Li+-benzene, a model for hydrogen adsorption in metal-organic frameworks and carbon-based materials. Three- and 8-dimensional quantum diffusion Monte Carlo (QDMC) and rigid-body diffusion Monte Carlo (RBDMC) simulations are performed on potential energy surfaces interpolated from electronic structure calculations at the M05-2X/6-31+G(d,p) and M05-2X/6-311+G(2df,p) levels of theory using a three-dimensional spline or a modified Shepard interpolation. These calculations investigate the intermolecular interactions in this system, with three- and 8-dimensional 0 K H2 binding enthalpy estimates, ΔHbind (0 K), being 16.5 kJ mol-1 and 12.4 kJ mol-1, respectively: 0.1 and 0.6 kJ mol-1 higher than harmonic values. Zero-point energy effects are 35% of the value of ΔHbind (0 K) at M05-2X/6-311+G(2df,p) and cannot be neglected; uncorrected electronic binding energies overestimate ΔHbind (0 K) by at least 6 kJ mol-1. Harmonic intermolecular binding enthalpies can be corrected by treating the H2 "helicopter" and "ferris wheel" rotations as free and hindered rotations, respectively. These simple corrections yield results within 2% of the 8-dimensional anharmonic calculations. Nuclear ground state probability density histograms obtained from the QDMC and RBDMC simulations indicate the H2 molecule is delocalized above the Li+-benzene system at 0 K.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kolmann, Stephen J.; D'Arcy, Jordan H.; Jordan, Meredith J. T., E-mail: m.jordan@chem.usyd.edu.au

    Quantum and anharmonic effects are investigated in H2-Li+-benzene, a model for hydrogen adsorption in metal-organic frameworks and carbon-based materials. Three- and 8-dimensional quantum diffusion Monte Carlo (QDMC) and rigid-body diffusion Monte Carlo (RBDMC) simulations are performed on potential energy surfaces interpolated from electronic structure calculations at the M05-2X/6-31+G(d,p) and M05-2X/6-311+G(2df,p) levels of theory using a three-dimensional spline or a modified Shepard interpolation. These calculations investigate the intermolecular interactions in this system, with three- and 8-dimensional 0 K H2 binding enthalpy estimates, ΔHbind (0 K), being 16.5 kJ mol-1 and 12.4 kJ mol-1, respectively: 0.1 and 0.6 kJ mol-1 higher than harmonic values. Zero-point energy effects are 35% of the value of ΔHbind (0 K) at M05-2X/6-311+G(2df,p) and cannot be neglected; uncorrected electronic binding energies overestimate ΔHbind (0 K) by at least 6 kJ mol-1. Harmonic intermolecular binding enthalpies can be corrected by treating the H2 "helicopter" and "ferris wheel" rotations as free and hindered rotations, respectively. These simple corrections yield results within 2% of the 8-dimensional anharmonic calculations. Nuclear ground state probability density histograms obtained from the QDMC and RBDMC simulations indicate the H2 molecule is delocalized above the Li+-benzene system at 0 K.

  14. Postural tasks are associated with center of pressure spatial patterns of three-dimensional statokinesigrams in young and elderly healthy subjects.

    PubMed

    Baracat, Patrícia Junqueira Ferraz; de Sá Ferreira, Arthur

    2013-12-01

    The present study investigated the association between postural tasks and center of pressure spatial patterns of three-dimensional statokinesigrams. Young (n=35; 27.0±7.7years) and elderly (n=38; 67.3±8.7years) healthy volunteers maintained an undisturbed standing position during postural tasks characterized by combined sensory (vision/no vision) and biomechanical challenges (feet apart/together). A method for the analysis of three-dimensional statokinesigrams based on nonparametric statistics and image-processing analysis was employed. Four patterns of spatial distribution were derived from ankle and hip strategies according to the quantity (single; double; multi) and location (anteroposterior; mediolateral) of high-density regions on three-dimensional statokinesigrams. Significant associations between postural task and spatial pattern were observed (young: gamma=0.548, p<.001; elderly: gamma=0.582, p<.001). Robustness analysis revealed small changes related to parameter choices for histogram processing. MANOVA revealed multivariate main effects for postural task [Wilks' Lambda=0.245, p<.001] and age [Wilks' Lambda=0.308, p<.001], with interaction [Wilks' Lambda=0.732, p<.001]. The quantity of high-density regions was positively correlated to stabilogram and statokinesigram variables (p<.05 or lower). In conclusion, postural tasks are associated with center of pressure spatial patterns and are similar in young and elderly healthy volunteers. Single-centered patterns reflected more stable postural conditions and were more frequent with complete visual input and a wide base of support. Copyright © 2013 Elsevier B.V. All rights reserved.

  15. The ISI distribution of the stochastic Hodgkin-Huxley neuron.

    PubMed

    Rowat, Peter F; Greenwood, Priscilla E

    2014-01-01

    The simulation of ion-channel noise has an important role in computational neuroscience. In recent years several approximate methods of carrying out this simulation have been published, based on stochastic differential equations, and all giving slightly different results. The obvious, and essential, question is: which method is the most accurate and which is most computationally efficient? Here we make a contribution to the answer. We compare interspike interval histograms from simulated data using four different approximate stochastic differential equation (SDE) models of the stochastic Hodgkin-Huxley neuron, as well as the exact Markov chain model simulated by the Gillespie algorithm. One of the recent SDE models is the same as the Kurtz approximation first published in 1978. All the models considered give similar ISI histograms over a wide range of deterministic and stochastic input. Three features of these histograms are an initial peak, followed by one or more bumps, and then an exponential tail. We explore how these features depend on deterministic input and on level of channel noise, and explain the results using the stochastic dynamics of the model. We conclude with a rough ranking of the four SDE models with respect to the similarity of their ISI histograms to the histogram of the exact Markov chain model.
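    An interspike-interval histogram of the kind compared across models above can be computed from a spike-time array in a few lines; the bin width is an arbitrary illustrative choice.

        import numpy as np

        def isi_histogram(spike_times_ms, bin_width_ms=1.0):
            """Interspike-interval histogram from a 1-D array of spike times (ms)."""
            isis = np.diff(np.sort(np.asarray(spike_times_ms, dtype=float)))
            edges = np.arange(0.0, isis.max() + bin_width_ms, bin_width_ms)
            counts, edges = np.histogram(isis, bins=edges)
            return counts, edges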

  16. Histogram equalization with Bayesian estimation for noise robust speech recognition.

    PubMed

    Suh, Youngjoo; Kim, Hoirin

    2018-02-01

    The histogram equalization approach is an efficient feature normalization technique for noise-robust automatic speech recognition. However, it suffers from performance degradation when some fundamental conditions are not satisfied in the test environment. To remedy these limitations of the original histogram equalization methods, a class-based histogram equalization approach has been proposed. Although this approach showed substantial performance improvement under noise environments, it still suffers from performance degradation due to overfitting when test data are insufficient. To address this issue, the proposed histogram equalization technique employs Bayesian estimation in the estimation of the test cumulative distribution function. A previous study conducted on the Aurora-4 task reported that the proposed approach provided substantial performance gains in speech recognition systems based on Gaussian mixture model-hidden Markov model acoustic modeling. In this work, the proposed approach was examined in speech recognition systems with the deep neural network-hidden Markov model (DNN-HMM), the current mainstream speech recognition approach, where it also showed meaningful performance improvement over the conventional maximum likelihood estimation-based method. The fusion of the proposed features with the mel-frequency cepstral coefficients provided additional performance gains in DNN-HMM systems, which otherwise suffer from performance degradation in the clean test condition.
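    For orientation, conventional (non-Bayesian) histogram equalization of a single feature dimension maps each test value through the empirical test CDF and then through the inverse reference CDF; the sketch below uses simple empirical CDFs with linear interpolation, which is an assumption and is not the Bayesian-estimation variant proposed above.

        import numpy as np

        def histogram_equalize_feature(test_feature, reference_feature):
            """Map one feature dimension of test data onto the reference
            (training) distribution via empirical CDF matching."""
            test = np.asarray(test_feature, dtype=float)
            ref_sorted = np.sort(np.asarray(reference_feature, dtype=float))
            # empirical CDF of the test data, evaluated at each test value
            ranks = np.argsort(np.argsort(test))
            test_cdf = (ranks + 0.5) / test.size
            # inverse reference CDF by linear interpolation
            ref_cdf = (np.arange(ref_sorted.size) + 0.5) / ref_sorted.size
            return np.interp(test_cdf, ref_cdf, ref_sorted)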

  17. Dynamic Contrast-enhanced MR Imaging in Renal Cell Carcinoma: Reproducibility of Histogram Analysis on Pharmacokinetic Parameters

    PubMed Central

    Wang, Hai-yi; Su, Zi-hua; Xu, Xiao; Sun, Zhi-peng; Duan, Fei-xue; Song, Yuan-yuan; Li, Lu; Wang, Ying-wei; Ma, Xin; Guo, Ai-tao; Ma, Lin; Ye, Hui-yi

    2016-01-01

    Pharmacokinetic parameters derived from dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) have been increasingly used to evaluate the permeability of tumor vessels. Histogram metrics are a recognized and promising approach to quantitative MR imaging that has recently been introduced in the analysis of DCE-MRI pharmacokinetic parameters in oncology because of tumor heterogeneity. In this study, 21 patients with renal cell carcinoma (RCC) underwent paired DCE-MRI studies on a 3.0 T MR system. The extended Tofts model and a population-based arterial input function were used to calculate kinetic parameters of RCC tumors. The mean value and histogram metrics (mode, skewness and kurtosis) of each pharmacokinetic parameter were generated automatically using ImageJ software. Intra- and inter-observer reproducibility and scan-rescan reproducibility were evaluated using intra-class correlation coefficients (ICCs) and the coefficient of variation (CoV). Our results demonstrated that the histogram method (mode, skewness and kurtosis) was not superior to the conventional mean value method for reproducibility evaluation of DCE-MRI pharmacokinetic parameters (Ktrans and Ve) in renal cell carcinoma; in particular, skewness and kurtosis showed lower intra-observer, inter-observer and scan-rescan reproducibility than the mean value. Our findings suggest that additional studies are necessary before wide incorporation of histogram metrics in the quantitative analysis of DCE-MRI pharmacokinetic parameters. PMID:27380733

  18. Multiplicity of the 660-km discontinuity beneath the Izu-Bonin area

    NASA Astrophysics Data System (ADS)

    Zhou, Yuan-Ze; Yu, Xiang-Wei; Yang, Hui; Zang, Shao-Xian

    2012-05-01

    The relatively simple subducting slab geometry in the Izu-Bonin region provides a valuable opportunity to study the multiplicity of the 660-km discontinuity and the related response of the subducting slab on the discontinuity. Vertical short-period recordings of deep events with simple direct P phases beneath the Izu-Bonin region were retrieved from two seismic networks in the western USA and were used to study the structure of the 660-km discontinuity. After careful selection and pre-processing, 23 events from the networks, forming 32 pairs of event-network records, were processed. Related vespagrams were produced using the N-th root slant stack method for detecting weak down-going SdP phases that were inverted to the related conversion points. From depth histograms and the spatial distribution of the conversion points, there were three clear interfaces at depths of 670, 710 and 730 km. These interfaces were depressed approximately 20-30 km in the northern region. In the southern region, only two layers were identified in the depth histograms, and no obvious layered structure could be observed from the distribution of the conversion points.

  19. Texture operator for snow particle classification into snowflake and graupel

    NASA Astrophysics Data System (ADS)

    Nurzyńska, Karolina; Kubo, Mamoru; Muramoto, Ken-ichiro

    2012-11-01

    In order to improve the estimation of precipitation, the coefficients of the Z-R relation should be determined for each snow type. Therefore, it is necessary to identify the type of falling snow. Consequently, this research addresses the problem of classifying snow particles into snowflake and graupel in an automatic manner (as these types are the most common in the study region). Having correctly classified precipitation events, it is believed that it will be possible to estimate the related parameters accurately. The automatic classification system presented here describes the images with texture operators. Some of them are well known from the literature: first-order features, co-occurrence matrix, grey-tone difference matrix, run length matrix, and local binary pattern; in addition, a novel approach to designing simple local statistic operators is introduced. In this work the following texture operators are defined: mean histogram, min-max histogram, and mean-variance histogram. Moreover, building a feature vector based on the structure created in many of the mentioned algorithms is also suggested. For classification, the k-nearest neighbour classifier was applied. The results showed that it is possible to achieve correct classification accuracy above 80% with most of the techniques. The best result, 86.06%, was achieved for an operator built from the structure obtained in the middle stage of the co-occurrence matrix calculation. Next, it was noticed that describing an image with two texture operators does not improve the classification results considerably. In the best case the correct classification efficiency was 87.89% for a pair of texture operators created from the local binary pattern and the structure built in a middle stage of the grey-tone difference matrix calculation. This also suggests that the information gathered by each texture operator is redundant. Therefore, principal component analysis was applied in order to remove the unnecessary information and additionally reduce the length of the feature vectors. Improvement of the correct classification efficiency up to 100% is possible for the following methods: min-max histogram, the texture operator built from the structure obtained in a middle stage of the co-occurrence matrix calculation, the texture operator built from the structure obtained in a middle stage of the grey-tone difference matrix creation, and the texture operator based on a histogram, when the feature vector stores 99% of the initial information.
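    The final dimensionality-reduction step, retaining the principal components that preserve 99% of the variance of the texture feature vectors, can be sketched as follows (an SVD-based PCA; centering only, without scaling, is an assumption).

        import numpy as np

        def pca_reduce_to_99_percent(feature_vectors):
            """Project texture feature vectors (rows = samples) onto the leading
            principal components that retain 99% of the total variance."""
            X = np.asarray(feature_vectors, dtype=float)
            Xc = X - X.mean(axis=0)
            _, S, Vt = np.linalg.svd(Xc, full_matrices=False)
            explained = np.cumsum(S ** 2) / np.sum(S ** 2)
            k = int(np.searchsorted(explained, 0.99)) + 1
            return Xc @ Vt[:k].T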

  20. Reconstructing the Dwarf Galaxy Progenitor from Tidal Streams Using MilkyWay@home

    NASA Astrophysics Data System (ADS)

    Newberg, Heidi; Shelton, Siddhartha

    2018-04-01

    We attempt to reconstruct the mass and radial profile of stars and dark matter in the dwarf galaxy progenitor of the Orphan Stream, using only information from the stars in the Orphan Stream. We show that given perfect data and perfect knowledge of the dwarf galaxy profile and Milky Way potential, we are able to reconstruct the mass and radial profiles of both the stars and dark matter in the progenitor to high accuracy using only the density of stars along the stream and either the velocity dispersion or width of the stream in the sky. To perform this test, we simulated the tidal disruption of a two component (stars and dark matter) dwarf galaxy along the orbit of the Orphan Stream. We then created a histogram of the density of stars along the stream and a histogram of either the velocity dispersion or width of the stream in the sky as a function of position along the stream. The volunteer supercomputer MilkyWay@home was given these two histograms, the Milky Way potential model, and the orbital parameters for the progenitor. N-body simulations were run, varying dwarf galaxy parameters and the time of disruption. The goodness-of-fit of the model to the data was determined using an Earth-Mover Distance algorithm. The parameters were optimized using Differential Evolution. Future work will explore whether currently available information on the Orphan Stream stars is sufficient to constrain its progenitor, and how sensitive the optimization is to our knowledge of the Milky Way potential and the density model of the dwarf galaxy progenitor, as well as a host of other real-life unknowns.
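    A minimal sketch of the goodness-of-fit step, an earth-mover distance between a simulated and an observed histogram along the stream, is given below using scipy; treating the histograms as weighted 1-D distributions over bin centers is an assumption, and the differential-evolution call in the trailing comment is only schematic (the cost function name is hypothetical).

        import numpy as np
        from scipy.stats import wasserstein_distance

        def histogram_emd(observed_counts, simulated_counts, bin_centers):
            """Earth-mover distance between two binned densities along the stream."""
            obs = np.asarray(observed_counts, dtype=float)
            sim = np.asarray(simulated_counts, dtype=float)
            return wasserstein_distance(bin_centers, bin_centers,
                                        obs / obs.sum(), sim / sim.sum())

        # Schematically, a cost function wrapping an N-body simulation could then be
        # minimised with scipy.optimize.differential_evolution, e.g.
        #   result = differential_evolution(cost, bounds)   # 'cost' is hypothetical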

  1. Breast density evaluation using spectral mammography, radiologist reader assessment and segmentation techniques: a retrospective study based on left and right breast comparison

    PubMed Central

    Molloi, Sabee; Ding, Huanjun; Feig, Stephen

    2015-01-01

    Purpose: The purpose of this study was to compare the precision of mammographic breast density measurement using radiologist reader assessment, histogram threshold segmentation, fuzzy C-means segmentation and spectral material decomposition. Materials and Methods: Spectral mammography images from a total of 92 consecutive asymptomatic women (50-69 years old) who presented for annual screening mammography were retrospectively analyzed for this study. Breast density was estimated using assessment by 10 radiologist readers, standard histogram thresholding, a fuzzy C-means algorithm and spectral material decomposition. The correlation of breast density between the left and right breasts was used to assess the precision of these techniques for measuring breast composition relative to dual-energy material decomposition. Results: In comparison with the other techniques, breast density measurements using dual-energy material decomposition showed the highest correlation. The relative standard error of estimate for breast density measurements from the left and right breasts using radiologist reader assessment, standard histogram thresholding, the fuzzy C-means algorithm and dual-energy material decomposition was calculated to be 1.95, 2.87, 2.07 and 1.00, respectively. Conclusion: The results indicate that the precision of dual-energy material decomposition was approximately a factor of two higher than that of the other techniques, in terms of better correlation of breast density measurements between the right and left breasts. PMID:26031229
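    A bare-bones version of the "standard histogram thresholding" approach to percent density, using a global Otsu threshold inside a breast mask, is sketched below; the choice of Otsu's method and the simple pixel-count ratio are assumptions for illustration only.

        import numpy as np
        from skimage.filters import threshold_otsu

        def percent_density_by_thresholding(mammogram, breast_mask):
            """Percent breast density from a global histogram threshold."""
            breast_pixels = mammogram[breast_mask]
            t = threshold_otsu(breast_pixels)
            dense_fraction = (breast_pixels > t).mean()
            return 100.0 * dense_fraction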

  2. LACIE performance predictor final operational capability program description, volume 2

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Given the swath table files, the segment set for one country and cloud cover data, the SAGE program determines how many times and under what conditions each segment is accessed by satellites. The program writes a record for each segment on a data file which contains the pertinent acquisition data. The weather data file can also be generated from a NASA supplied tape. The Segment Acquisition Selector Program (SACS) selects data from the segment reference file based upon data input manually and from a crop window file. It writes the extracted data to a data acquisition file and prints two summary reports. The POUT program reads from associated LACIE files and produces printed reports. The major types of reports that can be produced are: (1) Substrate Reference Data Reports, (2) Population Mean, Standard Deviation and Histogram Reports, (3) Histograms of Monte Carlo Statistics Reports, and (4) Frequency of Sample Segment Acquisitions Reports.

  3. Apparent diffusion coefficient histogram analysis can evaluate radiation-induced parotid damage and predict late xerostomia degree in nasopharyngeal carcinoma

    PubMed Central

    Zhou, Nan; Guo, Tingting; Zheng, Huanhuan; Pan, Xia; Chu, Chen; Dou, Xin; Li, Ming; Liu, Song; Zhu, Lijing; Liu, Baorui; Chen, Weibo; He, Jian; Yan, Jing; Zhou, Zhengyang; Yang, Xiaofeng

    2017-01-01

    We investigated apparent diffusion coefficient (ADC) histogram analysis to evaluate radiation-induced parotid damage and predict xerostomia degrees in nasopharyngeal carcinoma (NPC) patients receiving radiotherapy. The imaging of bilateral parotid glands in NPC patients was conducted 2 weeks before radiotherapy (time point 1), one month after radiotherapy (time point 2), and four months after radiotherapy (time point 3). From time point 1 to 2, parotid volume, skewness, and kurtosis decreased (P < 0.001, = 0.001, and < 0.001, respectively), but all other ADC histogram parameters increased (all P < 0.001, except P = 0.006 for standard deviation [SD]). From time point 2 to 3, parotid volume continued to decrease (P = 0.022), and SD, 75th and 90th percentiles continued to increase (P = 0.024, 0.010, and 0.006, respectively). Early change rates of parotid ADCmean, ADCmin, kurtosis, and 25th, 50th, 75th, 90th percentiles (from time point 1 to 2) correlated with late parotid atrophy rate (from time point 1 to 3) (all P < 0.05). Multiple linear regression analysis revealed correlations among parotid volume, time point, and ADC histogram parameters. Early mean change rates for bilateral parotid SD and ADCmax could predict late xerostomia degrees at seven months after radiotherapy (three months after time point 3) with AUC of 0.781 and 0.818 (P = 0.014, 0.005, respectively). ADC histogram parameters were reproducible (intraclass correlation coefficient, 0.830 - 0.999). ADC histogram analysis could be used to evaluate radiation-induced parotid damage noninvasively, and predict late xerostomia degrees of NPC patients treated with radiotherapy. PMID:29050274

  4. Histogram Analysis of Diffusion Tensor Imaging Parameters in Pediatric Cerebellar Tumors.

    PubMed

    Wagner, Matthias W; Narayan, Anand K; Bosemani, Thangamadhan; Huisman, Thierry A G M; Poretti, Andrea

    2016-05-01

    Apparent diffusion coefficient (ADC) values have been shown to assist in differentiating cerebellar pilocytic astrocytomas and medulloblastomas. Previous studies have applied only ADC measurements and calculated the mean/median values. Here we investigated the value of diffusion tensor imaging (DTI) histogram characteristics of the entire tumor for differentiation of cerebellar pilocytic astrocytomas and medulloblastomas. Presurgical DTI data were analyzed with a region of interest (ROI) approach to include the entire tumor. For each tumor, histogram-derived metrics including the 25th percentile, 75th percentile, and skewness were calculated for fractional anisotropy (FA) and mean (MD), axial (AD), and radial (RD) diffusivity. The histogram metrics were used as primary predictors of interest in a logistic regression model. Statistical significance levels were set at p < .01. The study population included 17 children with pilocytic astrocytoma and 16 with medulloblastoma (mean age, 9.21 ± 5.18 years and 7.66 ± 4.97 years, respectively). Compared to children with medulloblastoma, children with pilocytic astrocytoma showed higher MD (P = .003 and P = .008), AD (P = .004 and P = .007), and RD (P = .003 and P = .009) values for the 25th and 75th percentile. In addition, histogram skewness showed statistically significant differences for MD between low- and high-grade tumors (P = .008). The 25th percentile for MD yields the best results for the presurgical differentiation between pediatric cerebellar pilocytic astrocytomas and medulloblastomas. The analysis of other DTI metrics does not provide additional diagnostic value. Our study confirms the diagnostic value of the quantitative histogram analysis of DTI data in pediatric neuro-oncology. Copyright © 2015 by the American Society of Neuroimaging.

  5. Correlation of histogram analysis of apparent diffusion coefficient with uterine cervical pathologic finding.

    PubMed

    Lin, Yuning; Li, Hui; Chen, Ziqian; Ni, Ping; Zhong, Qun; Huang, Huijuan; Sandrasegaran, Kumar

    2015-05-01

    The purpose of this study was to investigate the application of histogram analysis of apparent diffusion coefficient (ADC) in characterizing pathologic features of cervical cancer and benign cervical lesions. This prospective study was approved by the institutional review board, and written informed consent was obtained. Seventy-three patients with cervical cancer (33-69 years old; 35 patients with International Federation of Gynecology and Obstetrics stage IB cervical cancer) and 38 patients (38-61 years old) with normal cervix or cervical benign lesions (control group) were enrolled. All patients underwent 3-T diffusion-weighted imaging (DWI) with b values of 0 and 800 s/mm(2). ADC values of the entire tumor in the patient group and the whole cervix volume in the control group were assessed. Mean ADC, median ADC, 25th and 75th percentiles of ADC, skewness, and kurtosis were calculated. Histogram parameters were compared between different pathologic features, as well as between stage IB cervical cancer and control groups. Mean ADC, median ADC, and 25th percentile of ADC were significantly higher for adenocarcinoma (p = 0.021, 0.006, and 0.004, respectively), and skewness was significantly higher for squamous cell carcinoma (p = 0.011). Median ADC was statistically significantly higher for well or moderately differentiated tumors (p = 0.044), and skewness was statistically significantly higher for poorly differentiated tumors (p = 0.004). No statistically significant difference of ADC histogram was observed between lymphovascular space invasion subgroups. All histogram parameters differed significantly between stage IB cervical cancer and control groups (p < 0.05). Distribution of ADCs characterized by histogram analysis may help to distinguish early-stage cervical cancer from normal cervix or cervical benign lesions and may be useful for evaluating the different pathologic features of cervical cancer.

  6. Apparent diffusion coefficient histogram analysis can evaluate radiation-induced parotid damage and predict late xerostomia degree in nasopharyngeal carcinoma.

    PubMed

    Zhou, Nan; Guo, Tingting; Zheng, Huanhuan; Pan, Xia; Chu, Chen; Dou, Xin; Li, Ming; Liu, Song; Zhu, Lijing; Liu, Baorui; Chen, Weibo; He, Jian; Yan, Jing; Zhou, Zhengyang; Yang, Xiaofeng

    2017-09-19

    We investigated apparent diffusion coefficient (ADC) histogram analysis to evaluate radiation-induced parotid damage and predict xerostomia degrees in nasopharyngeal carcinoma (NPC) patients receiving radiotherapy. The imaging of bilateral parotid glands in NPC patients was conducted 2 weeks before radiotherapy (time point 1), one month after radiotherapy (time point 2), and four months after radiotherapy (time point 3). From time point 1 to 2, parotid volume, skewness, and kurtosis decreased (P < 0.001, = 0.001, and < 0.001, respectively), but all other ADC histogram parameters increased (all P < 0.001, except P = 0.006 for standard deviation [SD]). From time point 2 to 3, parotid volume continued to decrease (P = 0.022), and SD, 75th and 90th percentiles continued to increase (P = 0.024, 0.010, and 0.006, respectively). Early change rates of parotid ADCmean, ADCmin, kurtosis, and 25th, 50th, 75th, 90th percentiles (from time point 1 to 2) correlated with late parotid atrophy rate (from time point 1 to 3) (all P < 0.05). Multiple linear regression analysis revealed correlations among parotid volume, time point, and ADC histogram parameters. Early mean change rates for bilateral parotid SD and ADCmax could predict late xerostomia degrees at seven months after radiotherapy (three months after time point 3) with AUC of 0.781 and 0.818 (P = 0.014, 0.005, respectively). ADC histogram parameters were reproducible (intraclass correlation coefficient, 0.830 - 0.999). ADC histogram analysis could be used to evaluate radiation-induced parotid damage noninvasively, and predict late xerostomia degrees of NPC patients treated with radiotherapy.

  7. Principal component analysis of the CT density histogram to generate parametric response maps of COPD

    NASA Astrophysics Data System (ADS)

    Zha, N.; Capaldi, D. P. I.; Pike, D.; McCormack, D. G.; Cunningham, I. A.; Parraga, G.

    2015-03-01

    Pulmonary x-ray computed tomography (CT) may be used to characterize emphysema and airways disease in patients with chronic obstructive pulmonary disease (COPD). One analysis approach, parametric response mapping (PRM), utilizes registered inspiratory and expiratory CT image volumes and CT-density-histogram thresholds, but there is no consensus regarding the threshold values used, or their clinical meaning. Principal component analysis (PCA) of the CT density histogram can be exploited to quantify emphysema using data-driven CT-density-histogram thresholds. Thus, the objective of this proof-of-concept demonstration was to develop a PRM approach using PCA-derived thresholds in COPD patients and ex-smokers without airflow limitation. Methods: Fifteen COPD ex-smokers and 5 normal ex-smokers were evaluated. Thoracic CT images were acquired at full inspiration and full expiration, and these images were non-rigidly co-registered. PCA was performed on the CT density histograms, from which the components with eigenvalues greater than one were summed. Since the values of the principal component curve correlate directly with the variability in the sample, the maximum and minimum points on the curve were used as threshold values for the PCA-adjusted PRM technique. Results: Significant correlations were determined between both conventional and PCA-adjusted PRM and the 3He MRI apparent diffusion coefficient (p<0.001), CT RA950 (p<0.0001), and 3He MRI ventilation defect percent, a measurement of both small airways disease (p=0.049 and p=0.06, respectively) and emphysema (p=0.02). Conclusions: PRM generated using PCA thresholds of the CT density histogram showed significant correlations with CT and 3He MRI measurements of emphysema, but not airways disease.
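
    The data-driven thresholding idea described above can be sketched as follows: run PCA across per-subject CT density histograms sampled on a common HU axis, sum the retained component loading curves, and read the two thresholds off the extrema of that summed curve. The sketch below is only an illustration under those assumptions; the component-retention rule and all names are ours, not the paper's exact procedure.

```python
import numpy as np
from sklearn.decomposition import PCA

def pca_density_thresholds(histograms, hu_bins):
    """Data-driven CT-density thresholds from PCA of density histograms.

    histograms: (n_subjects, n_bins) array of normalized CT density
    histograms sampled on a common HU axis `hu_bins` (n_bins,).
    The retained component loading curves are summed, and the HU values
    at the maximum and minimum of the summed curve are returned.
    """
    X = np.asarray(histograms, dtype=float)
    pca = PCA().fit(X)
    keep = pca.explained_variance_ > 1.0          # Kaiser-style retention rule (assumed)
    if not keep.any():                            # fall back to the first component
        keep[0] = True
    curve = pca.components_[keep].sum(axis=0)     # summed loading curve over HU bins
    return hu_bins[np.argmax(curve)], hu_bins[np.argmin(curve)]

# Illustrative use with random histograms on a -1000..0 HU axis
rng = np.random.default_rng(1)
hu = np.linspace(-1000, 0, 101)
H = rng.dirichlet(np.ones(101), size=20)
print(pca_density_thresholds(H, hu))
```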

  8. Automated labelling of cancer textures in colorectal histopathology slides using quasi-supervised learning.

    PubMed

    Onder, Devrim; Sarioglu, Sulen; Karacali, Bilge

    2013-04-01

    Quasi-supervised learning is a statistical learning algorithm that contrasts two datasets by computing an estimate of the posterior probability of each sample in either dataset. This method has not been applied to histopathological images before. The purpose of this study is to evaluate the performance of the method in identifying colorectal tissues with or without adenocarcinoma. Light microscopic digital images from histopathological sections were obtained from 30 colorectal radical surgery materials including adenocarcinoma and non-neoplastic regions. The texture features were extracted by using local histograms and co-occurrence matrices. The quasi-supervised learning algorithm operates on two datasets, one containing samples of normal tissues labelled only indirectly, and the other containing an unlabelled collection of samples of both normal and cancer tissues. As such, the algorithm eliminates the need for manually labelled samples of normal and cancer tissues for conventional supervised learning and significantly reduces expert intervention. Several texture feature vector datasets corresponding to different extraction parameters were tested within the proposed framework. Independent component analysis was identified as the dimensionality reduction approach that improved the labelling performance in this series. The proposed method was applied to a dataset of 22,080 vectors whose dimensionality had been reduced from 132 to 119. Regions containing cancer tissue could be identified accurately, with false and true positive rates of up to 19% and 88%, respectively, without using manually labelled ground-truth datasets in a quasi-supervised strategy. The resulting labelling performance was compared to that of a conventional, powerful supervised classifier using manually labelled ground-truth data. The corresponding supervised classifier results were 3.5% and 95% for the same case. In comparison with the benchmark classifier, the results of this series suggest that quasi-supervised image texture labelling may be a useful method in the analysis and classification of pathological slides, but further study is required to improve the results. Copyright © 2013 Elsevier Ltd. All rights reserved.
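
    As an illustration of the kind of texture features mentioned above (local histograms and co-occurrence matrices), the following hypothetical sketch builds a small feature vector for one image patch; the quantization level, neighbor offset, and chosen statistics are assumptions, not the study's exact extraction parameters.

```python
import numpy as np

def texture_features(tile, levels=16):
    """Feature vector: local gray-level histogram + co-occurrence statistics.

    tile: 2-D uint8 image patch. Gray levels are quantized to `levels`
    bins; the co-occurrence matrix uses the horizontal-neighbor offset.
    """
    q = (tile.astype(int) * levels // 256).clip(0, levels - 1)
    hist = np.bincount(q.ravel(), minlength=levels) / q.size      # local histogram
    left, right = q[:, :-1].ravel(), q[:, 1:].ravel()
    glcm = np.zeros((levels, levels))
    np.add.at(glcm, (left, right), 1.0)                           # co-occurrence counts
    glcm /= glcm.sum()
    i, j = np.indices((levels, levels))
    contrast = np.sum(glcm * (i - j) ** 2)
    energy = np.sum(glcm ** 2)
    homogeneity = np.sum(glcm / (1.0 + np.abs(i - j)))
    return np.concatenate([hist, [contrast, energy, homogeneity]])

# Illustrative use on a random patch
rng = np.random.default_rng(2)
patch = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
print(texture_features(patch).shape)   # (levels + 3,)
```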

  9. Dosimetric comparison of standard three-dimensional conformal radiotherapy followed by intensity-modulated radiotherapy boost schedule (sequential IMRT plan) with simultaneous integrated boost-IMRT (SIB IMRT) treatment plan in patients with localized carcinoma prostate.

    PubMed

    Bansal, A; Kapoor, R; Singh, S K; Kumar, N; Oinam, A S; Sharma, S C

    2012-07-01

    DOSIMETRIC AND RADIOBIOLOGICAL COMPARISON OF TWO RADIATION SCHEDULES IN LOCALIZED CARCINOMA PROSTATE: Standard Three-Dimensional Conformal Radiotherapy (3DCRT) followed by Intensity Modulated Radiotherapy (IMRT) boost (sequential-IMRT) with Simultaneous Integrated Boost IMRT (SIB-IMRT). Thirty patients were enrolled. In all patients, the target consisted of PTV P + SV (prostate and seminal vesicles) and PTV LN (lymph nodes), where PTV refers to planning target volume; the critical structures included the bladder, rectum and small bowel. All patients were treated with the sequential-IMRT plan, but for dosimetric comparison, a SIB-IMRT plan was also created. The prescription dose to PTV P + SV was 74 Gy in both strategies but with a different dose per fraction; the dose to PTV LN was 50 Gy delivered in 25 fractions over 5 weeks for sequential-IMRT and 54 Gy delivered in 27 fractions over 5.5 weeks for SIB-IMRT. The treatment plans were compared in terms of dose-volume histograms. Also, the Tumor Control Probability (TCP) and Normal Tissue Complication Probability (NTCP) obtained with the two plans were compared. The volume of rectum receiving 70 Gy or more (V70) was reduced to 18.23% with SIB-IMRT from 22.81% with sequential-IMRT. SIB-IMRT reduced the mean doses to both bladder and rectum by 13% and 17%, respectively, as compared to sequential-IMRT. NTCPs of 0.86 ± 0.75% and 0.01 ± 0.02% for the bladder, 5.87 ± 2.58% and 4.31 ± 2.61% for the rectum, and 8.83 ± 7.08% and 8.25 ± 7.98% for the bowel were seen with the sequential-IMRT and SIB-IMRT plans, respectively. For equal PTV coverage, SIB-IMRT markedly reduced doses to critical structures and should therefore be considered as the strategy for dose escalation. SIB-IMRT achieves a lower NTCP than sequential-IMRT.
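
    The dose-volume histogram quantities compared above (for example, the rectal volume receiving at least 70 Gy) can be illustrated with a small sketch that bins the voxel doses inside a structure. This is a generic illustration, not the planning system's implementation.

```python
import numpy as np

def cumulative_dvh(dose_in_structure, bin_width=0.1):
    """Cumulative dose-volume histogram for one structure.

    dose_in_structure: 1-D array of voxel doses (Gy) inside the structure.
    Returns dose bin edges and the percent volume receiving at least that dose.
    """
    d = np.asarray(dose_in_structure, dtype=float)
    edges = np.arange(0.0, d.max() + bin_width, bin_width)
    volume_at_least = np.array([(d >= e).mean() * 100.0 for e in edges])
    return edges, volume_at_least

def v_dose(dose_in_structure, threshold_gy=70.0):
    """V_D metric: percent of the structure receiving >= the threshold dose."""
    d = np.asarray(dose_in_structure, dtype=float)
    return (d >= threshold_gy).mean() * 100.0

# Illustrative rectum dose sample
rng = np.random.default_rng(3)
rectum = rng.normal(45.0, 18.0, size=20000).clip(0, 78)
print(f"V70 = {v_dose(rectum, 70.0):.1f}%")
```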

  10. The multi-state energy landscape of the SAM-I riboswitch: A single-molecule Förster resonance energy transfer spectroscopy study

    NASA Astrophysics Data System (ADS)

    Manz, Christoph; Kobitski, Andrei Yu.; Samanta, Ayan; Jäschke, Andres; Nienhaus, G. Ulrich

    2018-03-01

    RNA (ribonucleic acid) molecules are highly flexible biopolymers fluctuating at physiological temperatures among many different conformations that are represented by minima in a hierarchical conformational free energy landscape. Here we have employed single-molecule FRET (smFRET) to explore the energy landscape of the B. subtilis yitJ SAM-I riboswitch (RS). In this small RNA molecule, specific binding of an S-adenosyl-L-methionine (SAM) ligand in the aptamer domain regulates gene expression by inducing structural changes in another domain, the expression platform, causing transcription termination by the RNA polymerase. We have measured smFRET histograms over wide ranges of Mg2+ concentration for three RS variants that were specifically labeled with fluorescent dyes on different sites. In the analysis, different conformations are associated with discrete Gaussian model distributions, which are typically fairly broad on the FRET efficiency scale and thus can be extremely challenging to unravel due to their mutual overlap. Our earlier work on two SAM-I RS variants revealed four major conformations. By introducing a global fitting procedure which models both the Mg2+ concentration dependencies of the fractional populations and the average FRET efficiencies of the individual FRET distributions according to Mg2+ binding isotherms, we were able to consistently describe the histogram data of both variants at all studied Mg2+ concentrations. With the third FRET-labeled variant, however, we found significant deviations when applying the four-state model to the data. This can arise because the different FRET labeling of the new variant allows two states to be distinguished that were previously not separable due to overlap. Indeed, the resulting five-state model presented here consistently describes the smFRET histograms of all three variants as well as their variations with Mg2+ concentration. We also performed a triangulation of the donor position for two of the constructs to explore how the expression platform is oriented with respect to the aptamer.
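
    The decomposition of an smFRET efficiency histogram into discrete Gaussian sub-populations can be illustrated with a standard Gaussian mixture fit, as sketched below on synthetic data. Note that this is only a single-histogram decomposition; the study's actual analysis is a global fit that additionally ties the fractional populations and mean FRET efficiencies to Mg2+ binding isotherms across all concentrations and labeling variants.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Synthetic single-molecule FRET efficiencies drawn from overlapping states
rng = np.random.default_rng(4)
E = np.concatenate([
    rng.normal(0.25, 0.06, 800),   # e.g. an extended conformation
    rng.normal(0.50, 0.07, 600),
    rng.normal(0.75, 0.05, 600),   # e.g. a compact, ligand-bound-like state
]).clip(0, 1)

# Decompose the FRET efficiency distribution into Gaussian sub-populations
gmm = GaussianMixture(n_components=3, random_state=0).fit(E.reshape(-1, 1))
for w, m, c in zip(gmm.weights_, gmm.means_.ravel(), gmm.covariances_.ravel()):
    print(f"population {w:.2f}  <E> = {m:.2f}  sigma = {np.sqrt(c):.2f}")
```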

  11. Empirical Distributional Semantics: Methods and Biomedical Applications

    PubMed Central

    Cohen, Trevor; Widdows, Dominic

    2009-01-01

    Over the past fifteen years, a range of methods have been developed that are able to learn human-like estimates of the semantic relatedness between terms from the way in which these terms are distributed in a corpus of unannotated natural language text. These methods have also been evaluated in a number of applications in the cognitive science, computational linguistics and the information retrieval literatures. In this paper, we review the available methodologies for derivation of semantic relatedness from free text, as well as their evaluation in a variety of biomedical and other applications. Recent methodological developments, and their applicability to several existing applications are also discussed. PMID:19232399

  12. SPAM- SPECTRAL ANALYSIS MANAGER (DEC VAX/VMS VERSION)

    NASA Technical Reports Server (NTRS)

    Solomon, J. E.

    1994-01-01

    The Spectral Analysis Manager (SPAM) was developed to allow easy qualitative analysis of multi-dimensional imaging spectrometer data. Imaging spectrometers provide sufficient spectral sampling to define unique spectral signatures on a per pixel basis. Thus direct material identification becomes possible for geologic studies. SPAM provides a variety of capabilities for carrying out interactive analysis of the massive and complex datasets associated with multispectral remote sensing observations. In addition to normal image processing functions, SPAM provides multiple levels of on-line help, a flexible command interpretation, graceful error recovery, and a program structure which can be implemented in a variety of environments. SPAM was designed to be visually oriented and user friendly with the liberal employment of graphics for rapid and efficient exploratory analysis of imaging spectrometry data. SPAM provides functions to enable arithmetic manipulations of the data, such as normalization, linear mixing, band ratio discrimination, and low-pass filtering. SPAM can be used to examine the spectra of an individual pixel or the average spectra over a number of pixels. SPAM also supports image segmentation, fast spectral signature matching, spectral library usage, mixture analysis, and feature extraction. High speed spectral signature matching is performed by using a binary spectral encoding algorithm to separate and identify mineral components present in the scene. The same binary encoding allows automatic spectral clustering. Spectral data may be entered from a digitizing tablet, stored in a user library, compared to the master library containing mineral standards, and then displayed as a timesequence spectral movie. The output plots, histograms, and stretched histograms produced by SPAM can be sent to a lineprinter, stored as separate RGB disk files, or sent to a Quick Color Recorder. SPAM is written in C for interactive execution and is available for two different machine environments. There is a DEC VAX/VMS version with a central memory requirement of approximately 242K of 8 bit bytes and a machine independent UNIX 4.2 version. The display device currently supported is the Raster Technologies display processor. Other 512 x 512 resolution color display devices, such as De Anza, may be added with minor code modifications. This program was developed in 1986.

  13. SPAM- SPECTRAL ANALYSIS MANAGER (UNIX VERSION)

    NASA Technical Reports Server (NTRS)

    Solomon, J. E.

    1994-01-01

    The Spectral Analysis Manager (SPAM) was developed to allow easy qualitative analysis of multi-dimensional imaging spectrometer data. Imaging spectrometers provide sufficient spectral sampling to define unique spectral signatures on a per pixel basis. Thus direct material identification becomes possible for geologic studies. SPAM provides a variety of capabilities for carrying out interactive analysis of the massive and complex datasets associated with multispectral remote sensing observations. In addition to normal image processing functions, SPAM provides multiple levels of on-line help, a flexible command interpretation, graceful error recovery, and a program structure which can be implemented in a variety of environments. SPAM was designed to be visually oriented and user friendly with the liberal employment of graphics for rapid and efficient exploratory analysis of imaging spectrometry data. SPAM provides functions to enable arithmetic manipulations of the data, such as normalization, linear mixing, band ratio discrimination, and low-pass filtering. SPAM can be used to examine the spectra of an individual pixel or the average spectra over a number of pixels. SPAM also supports image segmentation, fast spectral signature matching, spectral library usage, mixture analysis, and feature extraction. High speed spectral signature matching is performed by using a binary spectral encoding algorithm to separate and identify mineral components present in the scene. The same binary encoding allows automatic spectral clustering. Spectral data may be entered from a digitizing tablet, stored in a user library, compared to the master library containing mineral standards, and then displayed as a timesequence spectral movie. The output plots, histograms, and stretched histograms produced by SPAM can be sent to a lineprinter, stored as separate RGB disk files, or sent to a Quick Color Recorder. SPAM is written in C for interactive execution and is available for two different machine environments. There is a DEC VAX/VMS version with a central memory requirement of approximately 242K of 8 bit bytes and a machine independent UNIX 4.2 version. The display device currently supported is the Raster Technologies display processor. Other 512 x 512 resolution color display devices, such as De Anza, may be added with minor code modifications. This program was developed in 1986.
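
    The high-speed matching step described in both SPAM records relies on binary spectral encoding. One common form of such encoding, thresholding each band against the spectrum's own mean and comparing codes by Hamming distance, is sketched below as an illustration; it is not claimed to be SPAM's exact algorithm.

```python
import numpy as np

def binary_encode(spectrum):
    """Binary-encode a spectrum: 1 where the band value exceeds the
    spectrum's own mean, else 0 (one common form of binary encoding)."""
    s = np.asarray(spectrum, dtype=float)
    return (s > s.mean()).astype(np.uint8)

def hamming_match(pixel_spectrum, library):
    """Return library indices sorted by Hamming distance to the pixel code."""
    code = binary_encode(pixel_spectrum)
    codes = np.array([binary_encode(s) for s in library])
    distances = (codes != code).sum(axis=1)
    return np.argsort(distances), np.sort(distances)

# Illustrative 32-band spectra: a noisy pixel matched against a small library
rng = np.random.default_rng(5)
library = rng.random((5, 32))
pixel = library[2] + rng.normal(0, 0.02, 32)
order, dist = hamming_match(pixel, library)
print("best match:", order[0], "distance:", dist[0])
```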

  14. Impact of Spot Size and Spacing on the Quality of Robustly Optimized Intensity Modulated Proton Therapy Plans for Lung Cancer.

    PubMed

    Liu, Chenbin; Schild, Steven E; Chang, Joe Y; Liao, Zhongxing; Korte, Shawn; Shen, Jiajian; Ding, Xiaoning; Hu, Yanle; Kang, Yixiu; Keole, Sameer R; Sio, Terence T; Wong, William W; Sahoo, Narayan; Bues, Martin; Liu, Wei

    2018-06-01

    To investigate how spot size and spacing affect plan quality, robustness, and interplay effects of robustly optimized intensity modulated proton therapy (IMPT) for lung cancer. Two robustly optimized IMPT plans were created for 10 lung cancer patients: first by a large-spot machine with in-air energy-dependent large spot size at isocenter (σ: 6-15 mm) and spacing (1.3 σ), and second by a small-spot machine with in-air energy-dependent small spot size (σ: 2-6 mm) and spacing (5 mm). Both plans were generated by optimizing the radiation dose to the internal target volume on averaged 4-dimensional computed tomography scans using an in-house-developed IMPT planning system. The dose-volume histogram band method was used to evaluate plan robustness. Dose evaluation software was developed to model time-dependent spot delivery to incorporate interplay effects with randomized starting phases for each field per fraction. Patient anatomy voxels were mapped phase-to-phase via deformable image registration, and doses were scored using in-house-developed software. Dose-volume histogram indices, including internal target volume dose coverage, homogeneity, and sparing of organs at risk (OARs), were compared using the Wilcoxon signed-rank test. Compared with the large-spot machine, the small-spot machine resulted in significantly lower heart and esophagus mean doses, with comparable target dose coverage, homogeneity, and protection of other OARs. Plan robustness was comparable for targets and most OARs. With interplay effects considered, significantly lower heart and esophagus mean doses with comparable target dose coverage and homogeneity were observed using smaller spots. Robust optimization with a small-spot machine significantly improves heart and esophagus sparing, with comparable plan robustness and interplay effects compared with robust optimization with a large-spot machine. A small-spot machine uses a larger number of spots to cover the same tumors compared with a large-spot machine, which gives the planning system more freedom to compensate for the higher sensitivity to uncertainties and interplay effects in lung cancer treatments. Copyright © 2018 Elsevier Inc. All rights reserved.

  15. A Bayesian Modeling Approach for Estimation of a Shape-Free Groundwater Age Distribution using Multiple Tracers

    DOE PAGES

    Massoudieh, Arash; Visser, Ate; Sharifi, Soroosh; ...

    2013-10-15

    Because of the mixing of groundwaters with different ages in aquifers, groundwater age is more appropriately represented by a distribution than by a scalar number. To infer a groundwater age distribution from environmental tracers, a mathematical form is often assumed for the shape of the distribution and the parameters of the mathematical distribution are estimated using deterministic or stochastic inverse methods. We found that the prescription of the mathematical form limits the exploration of the age distribution to the shapes that can be described by the selected distribution. In this paper, the use of freeform histograms as groundwater age distributions is evaluated. A Bayesian Markov Chain Monte Carlo approach is used to estimate the fraction of groundwater in each histogram bin. This method was able to capture the shape of a hypothetical gamma distribution from the concentrations of four age tracers. The number of bins that can be considered in this approach is limited based on the number of tracers available. The histogram method was also tested on tracer data sets from Holten (The Netherlands; 3H, 3He, 85Kr, 39Ar) and the La Selva Biological Station (Costa Rica; SF6, CFCs, 3H, 4He and 14C), and compared to a number of mathematical forms. According to standard Bayesian measures of model goodness, the best mathematical distribution performs better than the histogram distributions in terms of the ability to capture the observed tracer data relative to their complexity. Among the histogram distributions, the four-bin histogram performs best in most of the cases. The Monte Carlo simulations showed strong correlations in the posterior estimates of bin contributions, indicating that these bins cannot be well constrained using the available age tracers. The fact that mathematical forms overall perform better than the freeform histogram does not undermine the benefit of the freeform approach, especially for cases where a larger amount of observed data is available and when the real groundwater distribution is more complex than can be represented by simple mathematical forms.
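
    The idea of estimating the groundwater fraction in each histogram bin from a few tracer concentrations can be sketched with a toy Metropolis sampler, as below. Everything here is hypothetical: the bin-wise tracer response matrix, the noise model, the flat prior in the softmax parameterization, and the proposal scale are illustrative choices, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical setup: 4 age bins, 3 tracers. A[i, j] is the concentration
# tracer i would have if all water belonged to age bin j (assumed known).
A = np.array([[1.0, 0.6, 0.2, 0.05],
              [0.9, 0.7, 0.4, 0.10],
              [0.1, 0.3, 0.6, 0.90]])
true_f = np.array([0.1, 0.4, 0.3, 0.2])
sigma = 0.02
c_obs = A @ true_f + rng.normal(0, sigma, size=3)

def log_post(z):
    """Log-posterior over unconstrained z; bin fractions f = softmax(z)."""
    f = np.exp(z - z.max()); f /= f.sum()
    resid = c_obs - A @ f
    return -0.5 * np.sum((resid / sigma) ** 2)   # Gaussian likelihood, flat prior in z

# Random-walk Metropolis over z
z = np.zeros(4); lp = log_post(z); samples = []
for it in range(20000):
    z_new = z + rng.normal(0, 0.2, size=4)
    lp_new = log_post(z_new)
    if np.log(rng.random()) < lp_new - lp:
        z, lp = z_new, lp_new
    if it > 5000:                                 # discard burn-in
        f = np.exp(z - z.max()); samples.append(f / f.sum())

print("posterior mean bin fractions:", np.mean(samples, axis=0).round(2))
```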

  16. A Bayesian Modeling Approach for Estimation of a Shape-Free Groundwater Age Distribution using Multiple Tracers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Massoudieh, Arash; Visser, Ate; Sharifi, Soroosh

    Because of the mixing of groundwaters with different ages in aquifers, groundwater age is more appropriately represented by a distribution than by a scalar number. To infer a groundwater age distribution from environmental tracers, a mathematical form is often assumed for the shape of the distribution and the parameters of the mathematical distribution are estimated using deterministic or stochastic inverse methods. We found that the prescription of the mathematical form limits the exploration of the age distribution to the shapes that can be described by the selected distribution. In this paper, the use of freeform histograms as groundwater age distributions is evaluated. A Bayesian Markov Chain Monte Carlo approach is used to estimate the fraction of groundwater in each histogram bin. This method was able to capture the shape of a hypothetical gamma distribution from the concentrations of four age tracers. The number of bins that can be considered in this approach is limited based on the number of tracers available. The histogram method was also tested on tracer data sets from Holten (The Netherlands; 3H, 3He, 85Kr, 39Ar) and the La Selva Biological Station (Costa Rica; SF6, CFCs, 3H, 4He and 14C), and compared to a number of mathematical forms. According to standard Bayesian measures of model goodness, the best mathematical distribution performs better than the histogram distributions in terms of the ability to capture the observed tracer data relative to their complexity. Among the histogram distributions, the four-bin histogram performs best in most of the cases. The Monte Carlo simulations showed strong correlations in the posterior estimates of bin contributions, indicating that these bins cannot be well constrained using the available age tracers. The fact that mathematical forms overall perform better than the freeform histogram does not undermine the benefit of the freeform approach, especially for cases where a larger amount of observed data is available and when the real groundwater distribution is more complex than can be represented by simple mathematical forms.

  17. Histogram Analysis of Apparent Diffusion Coefficients for Occult Tonsil Cancer in Patients with Cervical Nodal Metastasis from an Unknown Primary Site at Presentation.

    PubMed

    Choi, Young Jun; Lee, Jeong Hyun; Kim, Hye Ok; Kim, Dae Yoon; Yoon, Ra Gyoung; Cho, So Hyun; Koh, Myeong Ju; Kim, Namkug; Kim, Sang Yoon; Baek, Jung Hwan

    2016-01-01

    To explore the added value of histogram analysis of apparent diffusion coefficient (ADC) values over magnetic resonance (MR) imaging and fluorine 18 ((18)F) fluorodeoxyglucose (FDG) positron emission tomography (PET)/computed tomography (CT) for the detection of occult palatine tonsil squamous cell carcinoma (SCC) in patients with cervical nodal metastasis from a cancer of an unknown primary site. The institutional review board approved this retrospective study, and the requirement for informed consent was waived. Differences in the bimodal histogram parameters of the ADC values were assessed among occult palatine tonsil SCC (n = 19), overt palatine tonsil SCC (n = 20), and normal palatine tonsils (n = 20). One-way analysis of variance was used to analyze differences among the three groups. Receiver operating characteristic curve analysis was used to determine the best differentiating parameters. The increased sensitivity of histogram analysis over MR imaging and (18)F-FDG PET/CT for the detection of occult palatine tonsil SCC was evaluated as added value. Histogram analysis showed statistically significant differences in the mean, standard deviation, and 50th and 90th percentile ADC values among the three groups (P < .0045). Occult palatine tonsil SCC had a significantly higher standard deviation for the overall curves, mean and standard deviation of the higher curves, and 90th percentile ADC value, compared with normal palatine tonsils (P < .0167). Receiver operating characteristic curve analysis showed that the standard deviation of the overall curve best delineated occult palatine tonsil SCC from normal palatine tonsils, with a sensitivity of 78.9% (15 of 19 patients) and a specificity of 60% (12 of 20 patients). The added value of ADC histogram analysis was 52.6% over MR imaging alone and 15.8% over combined conventional MR imaging and (18)F-FDG PET/CT. Adding ADC histogram analysis to conventional MR imaging can improve the detection sensitivity for occult palatine tonsil SCC in patients with a cervical nodal metastasis originating from a cancer of an unknown primary site. © RSNA, 2015.

  18. Comparison of Utility of Histogram Apparent Diffusion Coefficient and R2* for Differentiation of Low-Grade From High-Grade Clear Cell Renal Cell Carcinoma.

    PubMed

    Zhang, Yu-Dong; Wu, Chen-Jiang; Wang, Qing; Zhang, Jing; Wang, Xiao-Ning; Liu, Xi-Sheng; Shi, Hai-Bin

    2015-08-01

    The purpose of this study was to compare histogram analysis of the apparent diffusion coefficient (ADC) and R2* for differentiating low-grade from high-grade clear cell renal cell carcinoma (RCC). Forty-six patients with pathologically confirmed clear cell RCC underwent preoperative BOLD and DWI MRI of the kidneys. ADCs based on the entire tumor volume were calculated with b values of 0 and 800 s/mm². ROI-based R2* was calculated with eight TEs of 6.7-22.8 milliseconds. Histogram analysis of tumor ADCs and R2* values was performed to obtain the mean; median; width; fifth, 10th, 90th, and 95th percentiles; and histogram inhomogeneity, kurtosis, and skewness for all lesions. Thirty-three low-grade and 13 high-grade clear cell RCCs were found at pathologic examination. The TNM classification and tumor volume of clear cell RCC significantly correlated with histogram ADC and R2* (ρ = -0.317 to 0.506; p < 0.05). High-grade clear cell RCC had significantly lower mean, median, and 10th percentile ADCs but higher inhomogeneity and median R2* than low-grade clear cell RCC (all p < 0.05). Compared with the other histogram ADC and R2* indexes, the 10th percentile ADC had the highest accuracy (91.3%) in discriminating low- from high-grade clear cell RCC. Discrimination of hemorrhage with R2* was achieved at a threshold of 68.95 Hz. At this threshold, high-grade clear cell RCC had a significantly higher prevalence of intratumor hemorrhage (high-grade, 76.9%; low-grade, 45.4%; p < 0.05) and a larger hemorrhagic area than low-grade clear cell RCC (high-grade, 34.9% ± 31.6%; low-grade, 8.9% ± 16.8%; p < 0.05). A close relation was found between the MRI indexes and pathologic findings. Histogram analysis of ADC and R2* allows differentiation of low- from high-grade clear cell RCC with high accuracy.

  19. Histogram analysis of apparent diffusion coefficient maps for assessing thymic epithelial tumours: correlation with world health organization classification and clinical staging.

    PubMed

    Kong, Ling-Yan; Zhang, Wei; Zhou, Yue; Xu, Hai; Shi, Hai-Bin; Feng, Qing; Xu, Xiao-Quan; Yu, Tong-Fu

    2018-04-01

    To investigate the value of apparent diffusion coefficient (ADC) histogram analysis for assessing the World Health Organization (WHO) pathological classification and Masaoka clinical stages of thymic epithelial tumours. 37 patients with histologically confirmed thymic epithelial tumours were enrolled. ADC measurements were performed using a hot-spot ROI (ADCHS-ROI) and a histogram-based approach. ADC histogram parameters included mean ADC (ADCmean), median ADC (ADCmedian), 10th and 90th percentiles of ADC (ADC10 and ADC90), kurtosis and skewness. One-way ANOVA, independent-sample t-test, and receiver operating characteristic analysis were used for statistical analyses. There were significant differences in ADCmean, ADCmedian, ADC10, ADC90 and ADCHS-ROI among the low-risk thymoma (type A, AB, B1; n = 14), high-risk thymoma (type B2, B3; n = 9) and thymic carcinoma (type C; n = 14) groups (all p-values <0.05), while there was no significant difference in skewness (p = 0.181) or kurtosis (p = 0.088). ADC10 showed the best differentiating ability (cut-off value, ≤0.689 × 10⁻³ mm² s⁻¹; AUC, 0.957; sensitivity, 95.65%; specificity, 92.86%) for discriminating low-risk thymoma from high-risk thymoma and thymic carcinoma. Advanced Masaoka stage (Stage III and IV; n = 24) tumours showed significantly lower ADC parameters and higher kurtosis than early Masaoka stage (Stage I and II; n = 13) tumours (all p-values <0.05), while there was no significant difference in skewness (p = 0.063). ADC10 showed the best differentiating ability (cut-off value, ≤0.689 × 10⁻³ mm² s⁻¹; AUC, 0.913; sensitivity, 91.30%; specificity, 85.71%) for discriminating advanced from early Masaoka stage epithelial tumours. ADC histogram analysis may assist in assessing the WHO pathological classification and Masaoka clinical stages of thymic epithelial tumours. Advances in knowledge: 1. ADC histogram analysis could help to assess the WHO pathological classification of thymic epithelial tumours. 2. ADC histogram analysis could help to evaluate the Masaoka clinical stages of thymic epithelial tumours. 3. ADC10 might be a promising imaging biomarker for assessing and characterizing thymic epithelial tumours.

  20. Utility of whole-lesion ADC histogram metrics for assessing the malignant potential of pancreatic intraductal papillary mucinous neoplasms (IPMNs).

    PubMed

    Hoffman, David H; Ream, Justin M; Hajdu, Christina H; Rosenkrantz, Andrew B

    2017-04-01

    To evaluate whole-lesion ADC histogram metrics for assessing the malignant potential of pancreatic intraductal papillary mucinous neoplasms (IPMNs), including in comparison with conventional MRI features. Eighteen branch-duct IPMNs underwent MRI with DWI prior to resection (n = 16) or FNA (n = 2). A blinded radiologist placed 3D volumes-of-interest on the entire IPMN on the ADC map, from which whole-lesion histogram metrics were generated. The reader also assessed IPMN size, mural nodularity, and adjacent main-duct dilation. Benign (low-to-intermediate grade dysplasia; n = 10) and malignant (high-grade dysplasia or invasive adenocarcinoma; n = 8) IPMNs were compared. Whole-lesion ADC histogram metrics demonstrating significant differences between benign and malignant IPMNs were: entropy (5.1 ± 0.2 vs. 5.4 ± 0.2; p = 0.01, AUC = 86%); mean of the bottom 10th percentile (2.2 ± 0.4 vs. 1.6 ± 0.7; p = 0.03; AUC = 81%); and mean of the 10-25th percentile (2.8 ± 0.4 vs. 2.3 ± 0.6; p = 0.04; AUC = 79%). The overall mean ADC, skewness, and kurtosis were not significantly different between groups (p ≥ 0.06; AUC = 50-78%). For entropy (highest performing histogram metric), an optimal threshold of >5.3 achieved a sensitivity of 100%, a specificity of 70%, and an accuracy of 83% for predicting malignancy. No significant difference (p = 0.18-0.64) was observed between benign and malignant IPMNs for cyst size ≥3 cm, adjacent main-duct dilatation, or mural nodule. At multivariable analysis of entropy in combination with all other ADC histogram and conventional MRI features, entropy was the only significant independent predictor of malignancy (p = 0.004). Although requiring larger studies, ADC entropy obtained from 3D whole-lesion histogram analysis may serve as a biomarker for identifying the malignant potential of IPMNs, independent of conventional MRI features.
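
    Whole-lesion entropy, the best-performing metric above, is the Shannon entropy of the normalized ADC histogram; a brief illustrative computation follows (the bin count is an assumption, and the resulting value depends on it).

```python
import numpy as np

def adc_histogram_entropy(adc_voxels, bins=128):
    """Shannon entropy (in bits) of a whole-lesion ADC histogram.

    Higher entropy indicates a more heterogeneous (flatter, more spread
    out) distribution of ADC values inside the lesion.
    """
    counts, _ = np.histogram(np.asarray(adc_voxels, float), bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]                      # ignore empty bins
    return float(-(p * np.log2(p)).sum())

# Illustrative comparison: narrow versus broad ADC spread
rng = np.random.default_rng(7)
homogeneous = rng.normal(1.2, 0.05, 5000)
heterogeneous = rng.normal(1.2, 0.40, 5000)
print(adc_histogram_entropy(homogeneous), adc_histogram_entropy(heterogeneous))
```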

  1. A novel pre-processing technique for improving image quality in digital breast tomosynthesis.

    PubMed

    Kim, Hyeongseok; Lee, Taewon; Hong, Joonpyo; Sabir, Sohail; Lee, Jung-Ryun; Choi, Young Wook; Kim, Hak Hee; Chae, Eun Young; Cho, Seungryong

    2017-02-01

    Nonlinear pre-reconstruction processing of the projection data in computed tomography (CT), where accurate recovery of the CT numbers is important for diagnosis, is usually discouraged, since such processing would violate the physics of image formation in CT. However, one can devise a pre-processing step to enhance the detectability of lesions in digital breast tomosynthesis (DBT), where accurate recovery of the CT numbers is fundamentally impossible due to the incompleteness of the scanned data. Since the detection of lesions such as micro-calcifications and masses in breasts is the purpose of using DBT, a technique producing higher detectability of lesions is justified. A histogram modification technique was developed in the projection data domain. The histogram of the raw projection data was first divided into two parts: one for the breast projection data and the other for the background. Background pixel values were set to a single value that represents the boundary between breast and background. After that, both histogram parts were shifted by an appropriate offset and the histogram-modified projection data were log-transformed. A filtered-backprojection (FBP) algorithm was used for image reconstruction of DBT. To evaluate the performance of the proposed method, we computed the detectability index for images reconstructed from clinically acquired data. Typical breast border enhancement artifacts were greatly suppressed and the detectability of calcifications and masses was increased by use of the proposed method. Compared to a global threshold-based post-reconstruction processing technique, the proposed method produced images of higher contrast without introducing additional image artifacts. In this work, we report a novel pre-processing technique that improves the detectability of lesions in DBT and has potential advantages over the global threshold-based post-reconstruction processing technique. The proposed method not only increased the lesion detectability but also reduced typical image artifacts pronounced in conventional FBP-based DBT. © 2016 American Association of Physicists in Medicine.
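
    A rough sketch of the projection-domain histogram modification described above, clamping background pixels to the breast/background boundary value, shifting by an offset, and log-transforming before filtered backprojection, is given below. The threshold, offset, and sign conventions are assumptions for illustration only, not the authors' exact procedure.

```python
import numpy as np

def modify_projection_histogram(proj, breast_threshold, offset):
    """Pre-reconstruction histogram-modification sketch for a DBT projection.

    proj: raw transmission projection (background/air pixels assumed brightest).
    Pixels above `breast_threshold` are treated as background and clamped to
    that boundary value; the image is then shifted by `offset` and
    log-transformed, as would precede a filtered-backprojection step.
    """
    p = np.asarray(proj, dtype=float).copy()
    p[p > breast_threshold] = breast_threshold   # flatten the background histogram peak
    p = p + offset                               # shift both histogram parts
    p = np.clip(p, 1e-6, None)                   # keep the log well defined
    return -np.log(p / p.max())                  # line-integral-like values for FBP

# Illustrative raw projection: bright background, darker breast region
rng = np.random.default_rng(8)
raw = np.where(rng.random((64, 64)) < 0.4,
               2000 + rng.normal(0, 50, (64, 64)),
               900 + rng.normal(0, 120, (64, 64)))
print(modify_projection_histogram(raw, breast_threshold=1500, offset=100.0).shape)
```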

  2. Sensing Super-position: Visual Instrument Sensor Replacement

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Schipper, John F.

    2006-01-01

    The coming decade of fast, cheap and miniaturized electronics and sensory devices opens new pathways for the development of sophisticated equipment to overcome limitations of the human senses. This project addresses the technical feasibility of augmenting human vision through Sensing Super-position using a Visual Instrument Sensory Organ Replacement (VISOR). The current implementation of the VISOR device translates visual and other passive or active sensory instruments into sounds, which become relevant when the visual resolution is insufficient for very difficult and particular sensing tasks. A successful Sensing Super-position system meets many human and pilot-vehicle system requirements. The system can be further developed into a cheap, portable, and low-power device, taking into account the limited capabilities of the human user as well as the typical characteristics of the user's dynamic environment. The system operates in real time, giving the desired information for the particular augmented sensing tasks. The Sensing Super-position device increases perceived image resolution via an auditory representation provided in addition to the visual representation. Auditory mapping is performed to distribute an image in time. The three-dimensional spatial brightness and multi-spectral maps of a sensed image are processed using real-time image processing techniques (e.g. histogram normalization) and transformed into a two-dimensional map of an audio signal as a function of frequency and time. This paper details the approach of developing Sensing Super-position systems as a way to augment the human vision system by exploiting the capabilities of the human hearing system as an additional neural input. The human hearing system is capable of learning to process and interpret extremely complicated and rapidly changing auditory patterns. The known capabilities of the human hearing system to learn and understand complicated auditory patterns provided the basic motivation for developing an image-to-sound mapping system.
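
    One simple way to realize such an image-to-sound mapping, scanning columns in time, mapping rows to audio frequencies, and letting brightness set amplitude after a basic normalization, is sketched below; it is a hypothetical illustration, not the VISOR implementation.

```python
import numpy as np

def image_to_sound(image, duration=1.0, fs=22050, f_low=200.0, f_high=4000.0):
    """Image-to-sound sketch: columns are scanned left-to-right in time,
    rows map to audio frequencies (top row = highest pitch), and pixel
    brightness controls the amplitude of each frequency component."""
    img = np.asarray(image, float)
    img = (img - img.min()) / (np.ptp(img) + 1e-9)   # simple brightness normalization
    rows, cols = img.shape
    freqs = np.logspace(np.log10(f_high), np.log10(f_low), rows)
    t_col = np.arange(int(fs * duration / cols)) / fs
    audio = []
    for c in range(cols):
        column = img[:, c][:, None]                  # (rows, 1) brightness weights
        tones = np.sin(2 * np.pi * freqs[:, None] * t_col[None, :])
        audio.append((column * tones).sum(axis=0) / rows)
    return np.concatenate(audio)

# Illustrative use on a random image
rng = np.random.default_rng(14)
print(image_to_sound(rng.random((32, 32))).shape)
```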

  3. Comparison of respiratory-gated and respiratory-ungated planning in scattered carbon ion beam treatment of the pancreas using four-dimensional computed tomography.

    PubMed

    Mori, Shinichiro; Yanagi, Takeshi; Hara, Ryusuke; Sharp, Gregory C; Asakura, Hiroshi; Kumagai, Motoki; Kishimoto, Riwa; Yamada, Shigeru; Kato, Hirotoshi; Kandatsu, Susumu; Kamada, Tadashi

    2010-01-01

    We compared respiratory-gated and respiratory-ungated treatment strategies using four-dimensional (4D) scattered carbon ion beam distribution in pancreatic 4D computed tomography (CT) datasets. Seven inpatients with pancreatic tumors underwent 4DCT scanning under free-breathing conditions using a rapidly rotating cone-beam CT, which was integrated with a 256-slice detector, in cine mode. Two types of bolus for gated and ungated treatment were designed to cover the planning target volume (PTV) using 4DCT datasets in a 30% duty cycle around exhalation and a single respiratory cycle, respectively. Carbon ion beam distribution for each strategy was calculated as a function of respiratory phase by applying the compensating bolus to 4DCT at the respective phases. Smearing was not applied to the bolus, but consideration was given to drill diameter. The accumulated dose distributions were calculated by applying deformable registration and calculating the dose-volume histogram. Doses to normal tissues in gated treatment were minimized mainly on the inferior aspect, which thereby minimized excessive doses to normal tissues. Over 95% of the dose, however, was delivered to the clinical target volume at all phases for both treatment strategies. Maximum doses to the duodenum and pancreas averaged across all patients were 43.1/43.1 GyE (ungated/gated) and 43.2/43.2 GyE (ungated/gated), respectively. Although gated treatment minimized excessive dosing to normal tissue, the difference between treatment strategies was small. Respiratory gating may not always be required in pancreatic treatment as long as dose distribution is assessed. Any application of our results to clinical use should be undertaken only after discussion with oncologists, particularly with regard to radiotherapy combined with chemotherapy.

  4. CHANGE OF MAGNETIC FIELD-GAS ALIGNMENT AT THE GRAVITY-DRIVEN ALFVÉNIC TRANSITION IN MOLECULAR CLOUDS: IMPLICATIONS FOR DUST POLARIZATION OBSERVATIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Che-Yu; King, Patrick K.; Li, Zhi-Yun

    Diffuse striations in molecular clouds are preferentially aligned with local magnetic fields, whereas dense filaments tend to be perpendicular to them. When and why this transition occurs remain uncertain. To explore the physics behind this transition, we compute the histogram of relative orientation (HRO) between the density gradient and the magnetic field in three-dimensional magnetohydrodynamic (MHD) simulations of prestellar core formation in shock-compressed regions within giant molecular clouds. We find that, in the magnetically dominated (sub-Alfvénic) post-shock region, the gas structure is preferentially aligned with the local magnetic field. For overdense sub-regions with super-Alfvénic gas, their elongation becomes preferentially perpendicular to the local magnetic field. The transition occurs when self-gravitating gas gains enough kinetic energy from the gravitational acceleration to overcome the magnetic support against the cross-field contraction, which results in a power-law increase of the field strength with density. Similar results can be drawn from HROs in projected two-dimensional maps with integrated column densities and synthetic polarized dust emission. We quantitatively analyze our simulated polarization properties, and interpret the reduced polarization fraction at high column densities as the result of increased distortion of magnetic field directions in trans- or super-Alfvénic gas. Furthermore, we introduce measures of the inclination and tangledness of the magnetic field along the line of sight as the controlling factors of the polarization fraction. Observations of the polarization fraction and angle dispersion can therefore be utilized in studying local magnetic field morphology in star-forming regions.
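
    The HRO itself reduces to computing, at every cell, the angle between the local density gradient and the magnetic field and histogramming its cosine; a simplified sketch on a Cartesian grid follows (the simulations' actual analysis involves further weighting and projection steps not shown here).

```python
import numpy as np

def relative_orientation_histogram(density, bx, by, bz, bins=20):
    """Histogram of relative orientation between the density gradient and
    the magnetic field on a 3-D grid (a simplified HRO).

    Returns counts and bin edges in cos(phi), where phi is the angle
    between grad(n) and B at each cell. cos(phi) near 0 means the gas
    structure (iso-density surface) is aligned with B; |cos(phi)| near 1
    means it is perpendicular to B.
    """
    gx, gy, gz = np.gradient(density)
    dot = gx * bx + gy * by + gz * bz
    gnorm = np.sqrt(gx**2 + gy**2 + gz**2)
    bnorm = np.sqrt(bx**2 + by**2 + bz**2)
    valid = (gnorm > 0) & (bnorm > 0)
    cos_phi = dot[valid] / (gnorm[valid] * bnorm[valid])
    return np.histogram(cos_phi, bins=bins, range=(-1.0, 1.0))

# Illustrative use on random density and field cubes
rng = np.random.default_rng(9)
n = rng.random((32, 32, 32))
counts, edges = relative_orientation_histogram(n, *rng.normal(size=(3, 32, 32, 32)))
print(counts.sum())
```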

  5. Clarification to "Examining Rater Errors in the Assessment of Written Composition with a Many-Faceted Rasch Model."

    ERIC Educational Resources Information Center

    Englehard, George, Jr.

    1996-01-01

    Data presented in figure three of the article cited may be misleading in that the automatic scaling procedure used by the computer program that generated the histogram highlighted spikes that would look different with different histogram methods. (SLD)

  6. Using Computer Graphics in Statistics.

    ERIC Educational Resources Information Center

    Kerley, Lyndell M.

    1990-01-01

    Described is software which allows a student to use simulation to produce analytical output as well as graphical results. The results include a frequency histogram of a selected population distribution, a frequency histogram of the distribution of the sample means, and a test of the normality of the distribution of the sample means. (KR)

  7. Efficient visibility-driven medical image visualisation via adaptive binned visibility histogram.

    PubMed

    Jung, Younhyun; Kim, Jinman; Kumar, Ashnil; Feng, David Dagan; Fulham, Michael

    2016-07-01

    'Visibility' is a fundamental optical property that represents the proportion of the voxels in a volume that is observable by users during interactive volume rendering. The manipulation of this 'visibility' improves the volume rendering process, for instance by ensuring the visibility of regions of interest (ROIs) or by guiding the identification of an optimal rendering view-point. The construction of visibility histograms (VHs), which represent the distribution of visibility over all voxels in the rendered volume, enables users to explore the volume with real-time feedback about occlusion patterns among spatially related structures during volume rendering manipulations. Volume rendered medical images have been a primary beneficiary of VHs, given the need to ensure that specific ROIs are visible relative to the surrounding structures, e.g. the visualisation of tumours that may otherwise be occluded by neighbouring structures. VH construction and its subsequent manipulations, however, are computationally expensive due to the histogram binning of the visibilities. This limits the real-time application of VHs to medical images that have large intensity ranges and volume dimensions and require a large number of histogram bins. In this study, we introduce an efficient adaptive binned visibility histogram (AB-VH) in which a smaller number of histogram bins is used to represent the visibility distribution of the full VH. We adaptively bin medical images by using a cluster analysis algorithm that groups the voxels according to their intensity similarities into a smaller subset of bins while preserving the distribution of the intensity range of the original images. We increase efficiency by exploiting the parallel computation and multiple render targets (MRT) extension of modern graphical processing units (GPUs), and this enables efficient computation of the histogram. We show the application of our method to single-modality computed tomography (CT), magnetic resonance (MR) imaging and multi-modality positron emission tomography-CT (PET-CT). In our experiments, the AB-VH markedly improved the computational efficiency of the VH construction and thus improved the subsequent VH-driven volume manipulations. This efficiency was achieved without major visual or numerical differences between the AB-VH and its full-bin counterpart. We applied several variants of the K-means clustering algorithm with varying K (the number of clusters) and found that higher values of K resulted in better performance at a lower computational gain. The AB-VH also had improved performance when compared to the conventional method of down-sampling the histogram bins (equal binning) for volume rendering visualisation. Copyright © 2016 Elsevier Ltd. All rights reserved.
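
    The adaptive binning idea can be sketched with an ordinary k-means clustering of voxel intensities followed by per-cluster accumulation of visibility, as below; this is a CPU-side illustration only and omits the GPU/MRT acceleration described in the paper.

```python
import numpy as np
from sklearn.cluster import KMeans

def adaptive_visibility_histogram(intensities, visibilities, n_bins=32):
    """Adaptive-binned visibility histogram (AB-VH) sketch.

    Voxel intensities are grouped into `n_bins` clusters by k-means (so the
    bin boundaries adapt to the intensity distribution instead of being
    equally spaced), and each voxel's visibility is accumulated into the
    cluster it belongs to.
    """
    x = np.asarray(intensities, float).reshape(-1, 1)
    v = np.asarray(visibilities, float)
    km = KMeans(n_clusters=n_bins, n_init=4, random_state=0).fit(x)
    order = np.argsort(km.cluster_centers_.ravel())       # sort bins by intensity
    visibility_per_bin = np.array([v[km.labels_ == k].sum() for k in order])
    bin_centers = km.cluster_centers_.ravel()[order]
    return bin_centers, visibility_per_bin

# Illustrative use on synthetic voxel intensities and per-voxel visibilities
rng = np.random.default_rng(10)
intensity = rng.normal(100, 30, 20000)
visibility = rng.random(20000)
centers, vh = adaptive_visibility_histogram(intensity, visibility, n_bins=16)
print(len(centers), vh.sum())
```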

  8. A case of EDTA-dependent pseudothrombocytopenia: simple recognition of an underdiagnosed and misleading phenomenon

    PubMed Central

    2014-01-01

    Background: EDTA-dependent pseudothrombocytopenia (EDTA-PTCP) is a common laboratory phenomenon with a prevalence ranging from 0.1-2% in hospitalized patients to 15-17% in outpatients evaluated for isolated thrombocytopenia. Despite its harmlessness, EDTA-PTCP frequently leads to time-consuming, costly and even invasive diagnostic investigations. EDTA-PTCP is often overlooked because blood smears are not evaluated visually in routine practice and histograms as well as warning flags of hematology analyzers are not interpreted correctly. Nonetheless, EDTA-PTCP may be diagnosed easily even by general practitioners without any experience in blood film examinations. This is the first report illustrating the typical patterns of the platelet (PLT) and white blood cell (WBC) histograms of hematology analyzers. Case presentation: A 37-year-old female patient of Caucasian origin was referred with suspected acute leukemia and the crew of the emergency unit arranged extensive investigations for work-up. However, examination of the EDTA blood sample revealed atypical lymphocytes and an isolated thrombocytopenia together with typical patterns of the WBC and PLT histograms: a serrated curve of the platelet histogram and a peculiar peak on the left side of the WBC histogram. EDTA-PTCP was confirmed by a normal platelet count when examining citrated blood. Conclusion: Awareness of typical PLT and WBC patterns may alert to the presence of EDTA-PTCP in routine laboratory practice, helping to avoid unnecessary investigations and over-treatment. PMID:24808761

  9. Assessment of Intrafraction Breathing Motion on Left Anterior Descending Artery Dose During Left-Sided Breast Radiation Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    El-Sherif, Omar, E-mail: Omar.ElSherif@lhsc.on.ca; Department of Physics, London Regional Cancer Program, London, Ontario; Yu, Edward

    Purpose: To use 4-dimensional computed tomography (4D-CT) imaging to predict the level of uncertainty in cardiac dose estimates of the left anterior descending artery that arises due to breathing motion during radiation therapy for left-sided breast cancer. Methods and Materials: The fast helical CT (FH-CT) and 4D-CT of 30 left-sided breast cancer patients were retrospectively analyzed. Treatment plans were created on the FH-CT. The original treatment plan was then superimposed onto all 10 phases of the 4D-CT to quantify the dosimetric impact of respiratory motion through 4D dose accumulation (4D-dose). Dose-volume histograms for the heart, left ventricle (LV), and left anterior descending (LAD) artery obtained from the FH-CT were compared with those obtained from the 4D-dose. Results: The 95% confidence interval of 4D-dose and FH-CT differences in mean dose estimates for the heart, LV, and LAD were ±0.5 Gy, ±1.0 Gy, and ±8.7 Gy, respectively. Conclusion: Fast helical CT is a good approximation for doses to the heart and LV; however, dose estimates for the LAD are susceptible to uncertainties that arise due to intrafraction breathing motion that cannot be ascertained without the additional information obtained from 4D-CT and dose accumulation. For future clinical studies, we suggest the use of 4D-CT–derived dose-volume histograms for estimating the dose to the LAD.

  10. Three-dimensional dose verification of the clinical application of gamma knife stereotactic radiosurgery using polymer gel and MRI.

    PubMed

    Papagiannis, P; Karaiskos, P; Kozicki, M; Rosiak, J M; Sakelliou, L; Sandilos, P; Seimenis, I; Torrens, M

    2005-05-07

    This work seeks to verify multi-shot clinical applications of stereotactic radiosurgery with a Leksell Gamma Knife model C unit employing a polymer gel-MRI based experimental procedure, which has already been shown to be capable of verifying the precision and accuracy of dose delivery in single-shot gamma knife applications. The treatment plan studied in the present work resembles a clinical treatment case of pituitary adenoma using four 8 mm and one 14 mm collimator helmet shots to deliver a prescription dose of 15 Gy to the 50% isodose line (30 Gy maximum dose). For the experimental dose verification of the treatment plan, the same criteria as those used in the clinical treatment planning evaluation were employed. These included comparison of measured and GammaPlan calculated data, in terms of percentage isodose contours on axial, coronal and sagittal planes, as well as 3D plan evaluation criteria such as dose-volume histograms for the target volume, target coverage and conformity indices. Measured percentage isodose contours compared favourably with calculated ones despite individual point fluctuations at low dose contours (e.g., 20%) mainly due to the effect of T2 measurement uncertainty on dose resolution. Dose-volume histogram data were also found in a good agreement while the experimental results for the percentage target coverage and conformity index were 94% and 1.17 relative to corresponding GammaPlan calculations of 96% and 1.12, respectively. Overall, polymer gel results verified the planned dose distribution within experimental uncertainties and uncertainty related to the digitization process of selected GammaPlan output data.

  11. Terrestrial laser scanning to quantify above-ground biomass of structurally complex coastal wetland vegetation

    NASA Astrophysics Data System (ADS)

    Owers, Christopher J.; Rogers, Kerrylee; Woodroffe, Colin D.

    2018-05-01

    Above-ground biomass represents a small yet significant contributor to carbon storage in coastal wetlands. Despite this, above-ground biomass is often poorly quantified, particularly in areas where vegetation structure is complex. Traditional methods for providing accurate estimates involve harvesting vegetation to develop mangrove allometric equations and quantify saltmarsh biomass in quadrats. However broad scale application of these methods may not capture structural variability in vegetation resulting in a loss of detail and estimates with considerable uncertainty. Terrestrial laser scanning (TLS) collects high resolution three-dimensional point clouds capable of providing detailed structural morphology of vegetation. This study demonstrates that TLS is a suitable non-destructive method for estimating biomass of structurally complex coastal wetland vegetation. We compare volumetric models, 3-D surface reconstruction and rasterised volume, and point cloud elevation histogram modelling techniques to estimate biomass. Our results show that current volumetric modelling approaches for estimating TLS-derived biomass are comparable to traditional mangrove allometrics and saltmarsh harvesting. However, volumetric modelling approaches oversimplify vegetation structure by under-utilising the large amount of structural information provided by the point cloud. The point cloud elevation histogram model presented in this study, as an alternative to volumetric modelling, utilises all of the information within the point cloud, as opposed to sub-sampling based on specific criteria. This method is simple but highly effective for both mangrove (r2 = 0.95) and saltmarsh (r2 > 0.92) vegetation. Our results provide evidence that application of TLS in coastal wetlands is an effective non-destructive method to accurately quantify biomass for structurally complex vegetation.
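
    A point-cloud elevation histogram model of the kind described above can be sketched as a normalized height histogram per plot used as the predictor vector in a regression against harvested biomass. The sketch below uses entirely synthetic data and illustrative bin edges; it is not the authors' calibrated model.

```python
import numpy as np

def elevation_histogram(z, bin_edges):
    """Normalized histogram of point-cloud heights for one plot/quadrat."""
    counts, _ = np.histogram(np.asarray(z, float), bins=bin_edges)
    return counts / max(counts.sum(), 1)

# Hypothetical calibration: regress harvested biomass on histogram bins
rng = np.random.default_rng(11)
edges = np.linspace(0.0, 2.0, 11)                      # 10 height bins (m), assumed
plots = [rng.gamma(2.0, 0.2, size=rng.integers(500, 2000)) for _ in range(30)]
X = np.array([elevation_histogram(z, edges) for z in plots])
biomass = X @ rng.random(10) * 5.0 + rng.normal(0, 0.1, 30)   # synthetic "harvest" values

design = np.c_[np.ones(30), X]
coef, *_ = np.linalg.lstsq(design, biomass, rcond=None)
pred = design @ coef
r2 = 1 - np.sum((biomass - pred) ** 2) / np.sum((biomass - biomass.mean()) ** 2)
print(f"in-sample r^2 = {r2:.2f}")
```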

  12. Data Analysis and Graphing in an Introductory Physics Laboratory: Spreadsheet versus Statistics Suite

    ERIC Educational Resources Information Center

    Peterlin, Primoz

    2010-01-01

    Two methods of data analysis are compared: spreadsheet software and a statistics software suite. Their use is compared analysing data collected in three selected experiments taken from an introductory physics laboratory, which include a linear dependence, a nonlinear dependence and a histogram. The merits of each method are compared. (Contains 7…

  13. NASA TLA workload analysis support. Volume 3: FFD autopilot scenario validation data

    NASA Technical Reports Server (NTRS)

    Sundstrom, J. L.

    1980-01-01

    The data used to validate a seven-time-line analysis of the forward flight deck (FFD) autopilot mode for the pilot and copilot of the NASA B737 terminal configured vehicle are presented. Demand workloads are given in two forms: workload histograms and workload summaries (bar graphs). A report showing task length and task interaction is also presented.

  14. Dimension scaling effects on the yield sensitivity of HEMT digital circuits

    NASA Technical Reports Server (NTRS)

    Sarker, Jogendra C.; Purviance, John E.

    1992-01-01

    In our previous work, using a graphical tool (yield factor histograms), we studied the yield sensitivity of High Electron Mobility Transistor (HEMT) and HEMT circuit performance under variation of the process parameters. This work studies the scaling effects of the process parameters on the yield sensitivity of HEMT digital circuits. The results from two HEMT circuits are presented.

  15. Gaze Fluctuations Are Not Additively Decomposable: Reply to Bogartz and Staub

    ERIC Educational Resources Information Center

    Kelty-Stephen, Damian G.; Mirman, Daniel

    2013-01-01

    Our previous work interpreted single-lognormal fits to inter-gaze distance (i.e., "gaze steps") histograms as evidence of multiplicativity and hence interactions across scales in visual cognition. Bogartz and Staub (2012) proposed that gaze steps are additively decomposable into fixations and saccades, matching the histograms better and…

  16. A study of self organized criticality in ion temperature gradient mode driven gyrokinetic turbulence

    NASA Astrophysics Data System (ADS)

    Mavridis, M.; Isliker, H.; Vlahos, L.; Görler, T.; Jenko, F.; Told, D.

    2014-10-01

    An investigation of the characteristics of self-organized criticality (SOC) in ITG mode driven turbulence is made, with the use of various statistical tools (histograms, power spectra, and Hurst exponents estimated with rescaled range analysis and the structure function method). For this purpose, local non-linear gyrokinetic simulations of the cyclone base case scenario are performed with the GENE software package. Although most authors concentrate on global simulations, which seem to be a better choice for such an investigation, we use local simulations in an attempt to study the locally underlying mechanisms of SOC. We also study the structural properties of radially extended structures, with several tools (fractal dimension estimate, cluster analysis, and two dimensional autocorrelation function), in order to explore whether they can be characterized as avalanches. We find that, for large enough driving temperature gradients, the local simulations exhibit most of the features of SOC, with the exception of the probability distributions of observables, which show tails that are, however, not of power-law form. The radial structures have the same radial extent at all temperature gradients examined; radial motion (transport), though, appears only at large temperature gradients, in which case the radial structures can be interpreted as avalanches.
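
    Of the statistical tools listed above, the rescaled-range (R/S) estimate of the Hurst exponent is straightforward to illustrate: compute R/S over windows of increasing length and fit the slope of log(R/S) against log(window length). The sketch below is a generic implementation under those assumptions, not the analysis code used in the study.

```python
import numpy as np

def hurst_rescaled_range(x, min_window=8):
    """Estimate the Hurst exponent of a time series by rescaled-range (R/S)
    analysis: the slope of log(R/S) versus log(window length)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    windows = np.unique(np.logspace(np.log10(min_window), np.log10(n // 2), 12).astype(int))
    log_w, log_rs = [], []
    for w in windows:
        rs = []
        for start in range(0, n - w + 1, w):
            seg = x[start:start + w]
            dev = np.cumsum(seg - seg.mean())        # cumulative deviation within the window
            r = dev.max() - dev.min()                # range
            s = seg.std()                            # standard deviation
            if s > 0:
                rs.append(r / s)
        if rs:
            log_w.append(np.log(w))
            log_rs.append(np.log(np.mean(rs)))
    return np.polyfit(log_w, log_rs, 1)[0]

# White noise should give H close to 0.5
rng = np.random.default_rng(12)
print(f"white noise H ~ {hurst_rescaled_range(rng.normal(size=4096)):.2f}")
```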

  17. A study of self organized criticality in ion temperature gradient mode driven gyrokinetic turbulence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mavridis, M.; Isliker, H.; Vlahos, L.

    2014-10-15

    An investigation of the characteristics of self-organized criticality (SOC) in ITG mode driven turbulence is made, with the use of various statistical tools (histograms, power spectra, and Hurst exponents estimated with rescaled range analysis and the structure function method). For this purpose, local non-linear gyrokinetic simulations of the cyclone base case scenario are performed with the GENE software package. Although most authors concentrate on global simulations, which seem to be a better choice for such an investigation, we use local simulations in an attempt to study the locally underlying mechanisms of SOC. We also study the structural properties of radially extended structures, with several tools (fractal dimension estimate, cluster analysis, and two dimensional autocorrelation function), in order to explore whether they can be characterized as avalanches. We find that, for large enough driving temperature gradients, the local simulations exhibit most of the features of SOC, with the exception of the probability distributions of observables, which show tails that are, however, not of power-law form. The radial structures have the same radial extent at all temperature gradients examined; radial motion (transport), though, appears only at large temperature gradients, in which case the radial structures can be interpreted as avalanches.

  18. g-PRIME: A Free, Windows Based Data Acquisition and Event Analysis Software Package for Physiology in Classrooms and Research Labs.

    PubMed

    Lott, Gus K; Johnson, Bruce R; Bonow, Robert H; Land, Bruce R; Hoy, Ronald R

    2009-01-01

    We present g-PRIME, a software-based tool for physiology data acquisition, analysis, and stimulus generation in education and research. The software was developed in an undergraduate neurophysiology course and strongly influenced by instructor and student feedback. g-PRIME is a free, stand-alone Windows application coded and "compiled" in Matlab (it does not require a Matlab license to run). g-PRIME supports many data acquisition interfaces, from the PC sound card to expensive, high-throughput calibrated equipment. The program is designed as a software oscilloscope with standard trigger modes, multi-channel visualization controls, and data logging features. Extensive analysis options allow real-time and offline filtering of signals, multi-parameter threshold-and-window based event detection, and two-dimensional display of a variety of parameters including event time, energy density, maximum FFT frequency component, max/min amplitudes, and inter-event rate and intervals. The software also correlates detected events with another simultaneously acquired source (event-triggered average) in real time or offline. g-PRIME supports parameter histogram production and a variety of elegant publication-quality graphics outputs. A major goal of this software is to merge powerful engineering acquisition and analysis tools with a biological approach to studies of nervous system function.
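    For context, a minimal sketch (not g-PRIME's own code, which is a compiled Matlab application) of threshold-based event detection with a refractory window and an inter-event-interval histogram, using synthetic data; the sampling rate, threshold, and refractory period are illustrative choices.

      import numpy as np

      def detect_events(signal, fs, threshold, refractory_ms=2.0):
          """Indices where `signal` crosses `threshold` upward, with a refractory window
          so that a single event is not counted more than once."""
          above = signal > threshold
          crossings = np.flatnonzero(~above[:-1] & above[1:]) + 1
          refractory = int(refractory_ms * 1e-3 * fs)
          events, last = [], -refractory
          for idx in crossings:
              if idx - last >= refractory:
                  events.append(idx)
                  last = idx
          return np.array(events)

      fs = 10_000.0                                   # assumed sampling rate (Hz)
      rng = np.random.default_rng(1)
      sig = rng.normal(0.0, 0.05, int(fs))            # one second of baseline noise
      spike_idx = rng.choice(int(fs), 40, replace=False)
      sig[spike_idx] += 1.0                           # inject synthetic spike-like events
      events = detect_events(sig, fs, threshold=0.5)
      intervals_ms = np.diff(events) / fs * 1e3
      counts, edges = np.histogram(intervals_ms, bins=20)   # inter-event interval histogram
      print(len(events), counts.sum())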

  19. Cancer imaging phenomics toolkit: quantitative imaging analytics for precision diagnostics and predictive modeling of clinical outcome.

    PubMed

    Davatzikos, Christos; Rathore, Saima; Bakas, Spyridon; Pati, Sarthak; Bergman, Mark; Kalarot, Ratheesh; Sridharan, Patmaa; Gastounioti, Aimilia; Jahani, Nariman; Cohen, Eric; Akbari, Hamed; Tunc, Birkan; Doshi, Jimit; Parker, Drew; Hsieh, Michael; Sotiras, Aristeidis; Li, Hongming; Ou, Yangming; Doot, Robert K; Bilello, Michel; Fan, Yong; Shinohara, Russell T; Yushkevich, Paul; Verma, Ragini; Kontos, Despina

    2018-01-01

    The growth of multiparametric imaging protocols has paved the way for quantitative imaging phenotypes that predict treatment response and clinical outcome, reflect underlying cancer molecular characteristics and spatiotemporal heterogeneity, and can guide personalized treatment planning. This growth has underlined the need for efficient quantitative analytics to derive high-dimensional imaging signatures of diagnostic and predictive value in this emerging era of integrated precision diagnostics. This paper presents cancer imaging phenomics toolkit (CaPTk), a new and dynamically growing software platform for analysis of radiographic images of cancer, currently focusing on brain, breast, and lung cancer. CaPTk leverages the value of quantitative imaging analytics along with machine learning to derive phenotypic imaging signatures, based on two-level functionality. First, image analysis algorithms are used to extract comprehensive panels of diverse and complementary features, such as multiparametric intensity histogram distributions, texture, shape, kinetics, connectomics, and spatial patterns. At the second level, these quantitative imaging signatures are fed into multivariate machine learning models to produce diagnostic, prognostic, and predictive biomarkers. Results from clinical studies in three areas are shown: (i) computational neuro-oncology of brain gliomas for precision diagnostics, prediction of outcome, and treatment planning; (ii) prediction of treatment response for breast and lung cancer, and (iii) risk assessment for breast cancer.
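    A minimal sketch of the two-level idea described above, under the assumption of simple first-level intensity-histogram features fed into a generic scikit-learn classifier; the synthetic volumes, feature set, and model choice are illustrative and not CaPTk's actual feature panels or algorithms.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import cross_val_score

      def intensity_features(volume, bins=16):
          """First level: a normalized intensity histogram plus a few summary statistics."""
          hist, _ = np.histogram(volume, bins=bins, range=(0.0, 1.0), density=True)
          return np.concatenate([hist, [volume.mean(), volume.std(), np.percentile(volume, 90)]])

      rng = np.random.default_rng(10)
      # Synthetic "lesion" volumes: class 1 is drawn slightly brighter than class 0.
      volumes = [rng.beta(2 + y, 5 - 2 * y, size=(16, 16, 16)) for y in (0, 1) for _ in range(40)]
      labels = np.array([0] * 40 + [1] * 40)
      X = np.array([intensity_features(v) for v in volumes])

      # Second level: feed the quantitative signatures into a multivariate machine-learning model.
      clf = RandomForestClassifier(n_estimators=200, random_state=0)
      print(cross_val_score(clf, X, labels, cv=5).mean())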

  20. A Computer Program for Practical Semivariogram Modeling and Ordinary Kriging: A Case Study of Porosity Distribution in an Oil Field

    NASA Astrophysics Data System (ADS)

    Mert, Bayram Ali; Dag, Ahmet

    2017-12-01

    In this study, a practical and educational geostatistical program (JeoStat) was first developed, and then an example analysis of porosity distribution, using oilfield data, was presented. With this program, two- or three-dimensional variogram analysis can be performed using normal, log-normal or indicator-transformed data. In these analyses, JeoStat offers seven commonly used theoretical variogram models (Spherical, Gaussian, Exponential, Linear, Generalized Linear, Hole Effect and Paddington Mix) to the users. These theoretical models can be easily and quickly fitted to the experimental variograms using a mouse. JeoStat uses the ordinary kriging interpolation technique for computation of point or block estimates, and uses cross-validation tests for validation of the fitted theoretical model. All the results obtained by the analysis, as well as all the graphics such as histograms, variograms and kriging estimation maps, can be saved to the hard drive, including digitised graphics and maps. The numerical values of any point in a map can be monitored using the mouse and text boxes. This program is available to students, researchers, consultants and corporations of any size free of charge. The JeoStat software package and source codes are available at: http://www.jeostat.com/JeoStat_2017.0.rar.
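    A minimal sketch, not JeoStat's code, of computing an experimental semivariogram and fitting the spherical model it lists, using NumPy and SciPy on synthetic porosity-like data; the lag spacing and initial parameter guesses are illustrative.

      import numpy as np
      from scipy.optimize import curve_fit

      def experimental_variogram(coords, values, lag_edges):
          """Mean semivariance 0.5*(z_i - z_j)^2 of all point pairs in each lag-distance bin."""
          d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
          g = 0.5 * (values[:, None] - values[None, :]) ** 2
          iu = np.triu_indices(len(values), k=1)
          d, g = d[iu], g[iu]
          gamma = np.array([g[(d >= lo) & (d < hi)].mean()
                            for lo, hi in zip(lag_edges[:-1], lag_edges[1:])])
          return 0.5 * (lag_edges[:-1] + lag_edges[1:]), gamma

      def spherical(h, nugget, sill, rng_):
          """Spherical variogram model: rises to the sill at the range and stays flat beyond it."""
          h = np.asarray(h, dtype=float)
          rising = nugget + (sill - nugget) * (1.5 * h / rng_ - 0.5 * (h / rng_) ** 3)
          return np.where(h < rng_, rising, sill)

      rng = np.random.default_rng(2)
      coords = rng.uniform(0, 100, size=(200, 2))                        # synthetic well locations
      values = np.sin(coords[:, 0] / 15.0) + 0.1 * rng.normal(size=200)  # synthetic porosity-like data
      h, gamma = experimental_variogram(coords, values, np.linspace(0, 60, 13))
      popt, _ = curve_fit(spherical, h, gamma, p0=[0.01, gamma.max(), 30.0],
                          bounds=([0.0, 0.0, 1e-3], [np.inf, np.inf, np.inf]))
      print("nugget, sill, range:", np.round(popt, 3))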

  1. Dosimetric comparison between conventional and conformal radiotherapy for carcinoma cervix: Are we treating the right volumes?

    PubMed Central

    Goswami, Jyotirup; Patra, Niladri B.; Sarkar, Biplab; Basu, Ayan; Pal, Santanu

    2013-01-01

    Background and Purpose: Conventional portals, based on bony anatomy, for external beam radiotherapy of cervical cancer have been repeatedly shown to be inadequate. Conversely, with image-based conformal radiotherapy, better target coverage may be offset by the greater toxicities and poorer compliance associated with treating larger volumes. This study was designed to dosimetrically compare conformal and conventional radiotherapy. Materials and Methods: Five patients with carcinoma of the cervix underwent a planning CT scan with IV contrast, and the targets and organs at risk (OAR) were contoured. Two sets of plans (conventional and conformal) were generated for each patient. Field sizes were recorded, and dose-volume histograms of both sets of plans were generated and compared on the basis of target coverage and OAR sparing. Results: Target coverage was significantly improved with conformal plans, although the required field sizes were significantly larger. On the other hand, dose homogeneity was not significantly improved. Doses to the OARs (rectum, urinary bladder, and small bowel) were not significantly different across the two arms. Conclusion: Three-dimensional conformal radiotherapy gives significantly better target coverage, which may translate into better local control and survival. On the other hand, it also requires significantly larger field sizes, though doses to the OARs are not significantly increased. PMID:24455584
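    For context, a minimal sketch of how a cumulative dose-volume histogram of the kind compared above can be computed from a dose grid and an organ mask; the dose distribution, mask, and bin width are synthetic and illustrative.

      import numpy as np

      def cumulative_dvh(dose, mask, bin_width=0.1):
          """Cumulative DVH: percent of the organ volume receiving at least each dose level.
          `dose` is a dose grid in Gy and `mask` a boolean array of the same shape."""
          organ_dose = dose[mask]
          edges = np.arange(0.0, organ_dose.max() + bin_width, bin_width)
          counts, _ = np.histogram(organ_dose, bins=edges)
          volume_pct = 100.0 * counts[::-1].cumsum()[::-1] / organ_dose.size
          return edges[:-1], volume_pct

      rng = np.random.default_rng(3)
      dose = rng.gamma(shape=20.0, scale=2.0, size=(40, 64, 64))   # synthetic dose grid (Gy)
      mask = np.zeros_like(dose, dtype=bool)
      mask[10:30, 20:44, 20:44] = True                             # synthetic organ contour
      d, v = cumulative_dvh(dose, mask)
      print(f"V20Gy = {v[np.searchsorted(d, 20.0)]:.1f}% of the organ receives >= 20 Gy")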

  2. A multimodel intercomparison of resolution effects on precipitation: simulations and theory

    DOE PAGES

    Rauscher, Sara A.; O'Brien, Travis A.; Piani, Claudio; ...

    2016-02-27

    An ensemble of six pairs of RCM experiments performed at 25 and 50 km for the period 1961–2000 over a large European domain is examined in order to evaluate the effects of resolution on the simulation of daily precipitation statistics. Application of the non-parametric two-sample Kolmogorov–Smirnov test, which tests for differences in the location and shape of the probability distributions of two samples, shows that the distribution of daily precipitation differs between the pairs of simulations over most land areas in both summer and winter, with the strongest signal over southern Europe. Two-dimensional histograms reveal that precipitation intensity increases with resolution over almost the entire domain in both winter and summer. In addition, the 25 km simulations have more dry days than the 50 km simulations. The increase in dry days with resolution is indicative of an improvement in model performance at higher resolution, while the more intense precipitation exceeds observed values. The systematic increase in precipitation extremes with resolution across all models suggests that this response is fundamental to model formulation. Simple theoretical arguments suggest that fluid continuity, combined with the emergent scaling properties of the horizontal wind field, results in an increase in resolved vertical transport as grid spacing decreases. This increase in resolution-dependent vertical mass flux then drives an intensification of convergence and resolvable-scale precipitation as grid spacing decreases. In conclusion, this theoretical result could help explain the increasingly, and often anomalously, large stratiform contribution to total rainfall observed with increasing resolution in many regional and global models.

  3. A multimodel intercomparison of resolution effects on precipitation: simulations and theory

    NASA Astrophysics Data System (ADS)

    Rauscher, Sara A.; O'Brien, Travis A.; Piani, Claudio; Coppola, Erika; Giorgi, Filippo; Collins, William D.; Lawston, Patricia M.

    2016-10-01

    An ensemble of six pairs of RCM experiments performed at 25 and 50 km for the period 1961-2000 over a large European domain is examined in order to evaluate the effects of resolution on the simulation of daily precipitation statistics. Application of the non-parametric two-sample Kolmogorov-Smirnov test, which tests for differences in the location and shape of the probability distributions of two samples, shows that the distribution of daily precipitation differs between the pairs of simulations over most land areas in both summer and winter, with the strongest signal over southern Europe. Two-dimensional histograms reveal that precipitation intensity increases with resolution over almost the entire domain in both winter and summer. In addition, the 25 km simulations have more dry days than the 50 km simulations. The increase in dry days with resolution is indicative of an improvement in model performance at higher resolution, while the more intense precipitation exceeds observed values. The systematic increase in precipitation extremes with resolution across all models suggests that this response is fundamental to model formulation. Simple theoretical arguments suggest that fluid continuity, combined with the emergent scaling properties of the horizontal wind field, results in an increase in resolved vertical transport as grid spacing decreases. This increase in resolution-dependent vertical mass flux then drives an intensification of convergence and resolvable-scale precipitation as grid spacing decreases. This theoretical result could help explain the increasingly, and often anomalously, large stratiform contribution to total rainfall observed with increasing resolution in many regional and global models.
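    A minimal sketch of the two diagnostics mentioned, a two-sample Kolmogorov-Smirnov test on daily precipitation and a two-dimensional histogram, using SciPy and NumPy on synthetic gamma-distributed precipitation; the distributions and sample sizes are placeholders, not the RCM output analyzed in the paper.

      import numpy as np
      from scipy.stats import ks_2samp

      rng = np.random.default_rng(4)
      # Synthetic daily precipitation (mm/day) for one grid cell at two resolutions.
      precip_50km = rng.gamma(shape=0.6, scale=4.0, size=14600)    # ~40 years of daily values
      precip_25km = rng.gamma(shape=0.5, scale=6.0, size=14600)    # more intense, more light/dry days

      # Two-sample Kolmogorov-Smirnov test for a difference in the daily distributions.
      stat, p_value = ks_2samp(precip_50km, precip_25km)
      print(f"KS statistic = {stat:.3f}, p = {p_value:.2e}")

      # Two-dimensional histogram of paired daily intensities, of the kind used to show
      # precipitation intensity increasing with resolution.
      hist2d, xedges, yedges = np.histogram2d(precip_50km, precip_25km,
                                              bins=50, range=[[0, 50], [0, 50]])
      print(hist2d.shape)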

  4. MRI intensity nonuniformity correction using simultaneously spatial and gray-level histogram information.

    PubMed

    Milles, Julien; Zhu, Yue Min; Gimenez, Gérard; Guttmann, Charles R G; Magnin, Isabelle E

    2007-03-01

    A novel approach for correcting intensity nonuniformity in magnetic resonance imaging (MRI) is presented. This approach is based on the simultaneous use of spatial and gray-level histogram information. Spatial information about intensity nonuniformity is obtained using cubic B-spline smoothing. Gray-level histogram information of the image corrupted by intensity nonuniformity is exploited in the frequency domain. The proposed correction method is illustrated using both physical phantom and human brain images. The results are consistent with theoretical prediction and demonstrate a new way of dealing with intensity nonuniformity problems. They are all the more significant as the ground truth on intensity nonuniformity is unknown in clinical images.

  5. An effective image classification method with the fusion of invariant feature and a new color descriptor

    NASA Astrophysics Data System (ADS)

    Mansourian, Leila; Taufik Abdullah, Muhamad; Nurliyana Abdullah, Lili; Azman, Azreen; Mustaffa, Mas Rina

    2017-02-01

    Pyramid Histogram of Words (PHOW) combines Bag of Visual Words (BoVW) with spatial pyramid matching (SPM) in order to add location information to the extracted features. However, PHOW descriptors extracted from various color spaces do not represent color information explicitly, which means they discard color, an important characteristic of any image and one that is motivated by human vision. This article concatenates a PHOW Multi-Scale Dense Scale Invariant Feature Transform (MSDSIFT) histogram with a proposed color histogram to improve the performance of existing image classification algorithms. Performance evaluation on several datasets shows that the new approach outperforms other existing, state-of-the-art methods.
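    A minimal sketch of the kind of descriptor fusion described, concatenating an HSV color histogram with an existing (e.g., PHOW/MSDSIFT) descriptor; the weighting scheme, bin counts, and placeholder descriptor are assumptions for illustration, not the authors' exact formulation.

      import numpy as np
      from skimage.color import rgb2hsv

      def hsv_color_histogram(image_rgb, bins=(8, 4, 4)):
          """Normalized joint HSV histogram used as a global color descriptor."""
          hsv = rgb2hsv(image_rgb)
          hist, _ = np.histogramdd(hsv.reshape(-1, 3), bins=bins, range=((0, 1),) * 3)
          return (hist / hist.sum()).ravel()

      def fuse_descriptors(texture_descriptor, image_rgb, color_weight=0.5):
          """Concatenate an existing descriptor (e.g., a PHOW/MSDSIFT histogram) with color."""
          return np.concatenate([texture_descriptor,
                                 color_weight * hsv_color_histogram(image_rgb)])

      rng = np.random.default_rng(11)
      img = rng.random((64, 64, 3))          # placeholder RGB image in [0, 1]
      phow = rng.random(1200)                # placeholder for a PHOW histogram from another library
      print(fuse_descriptors(phow, img).shape)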

  6. Radarclinometry: Bootstrapping the radar reflectance function from the image pixel-signal frequency distribution and an altimetry profile

    USGS Publications Warehouse

    Wildey, R.L.

    1988-01-01

    A method is derived for determining the dependence of radar backscatter on incidence angle that is applicable to the region corresponding to a particular radar image. The method is based on enforcing mathematical consistency between the frequency distribution of the image's pixel signals (histogram of DN values with suitable normalizations) and a one-dimensional frequency distribution of slope component, as might be obtained from a radar or laser altimetry profile in or near the area imaged. In order to achieve a unique solution, the auxiliary assumption is made that the two-dimensional frequency distribution of slope is isotropic. The backscatter is not derived in absolute units. The method is developed in such a way as to separate the reflectance function from the pixel-signal transfer characteristic. However, these two sources of variation are distinguishable only on the basis of a weak dependence on the azimuthal component of slope; therefore such an approach can be expected to be ill-conditioned unless the revision of the transfer characteristic is limited to the determination of an additive instrumental background level. The altimetry profile does not have to be registered in the image, and the statistical nature of the approach minimizes pixel noise effects and the effects of a disparity between the resolutions of the image and the altimetry profile, except in the wings of the distribution where low-number statistics preclude accuracy anyway. The problem of dealing with unknown slope components perpendicular to the profiling traverse, which besets the one-to-one comparison between individual slope components and pixel-signal values, disappears in the present approach. In order to test the resulting algorithm, an artificial radar image was generated from the digitized topographic map of the Lake Champlain West quadrangle in the Adirondack Mountains, U.S.A., using an arbitrarily selected reflectance function. From the same map, a one-dimensional frequency distribution of slope component was extracted. The algorithm recaptured the original reflectance function to the degree that, for the central 90% of the data, the discrepancy translates to a RMS slope error of 0.1°. For the central 99% of the data, the maximum error translates to 1°; at the absolute extremes of the data the error grows to 6°. © 1988 Kluwer Academic Publishers.

  7. Fusion-based multi-target tracking and localization for intelligent surveillance systems

    NASA Astrophysics Data System (ADS)

    Rababaah, Haroun; Shirkhodaie, Amir

    2008-04-01

    In this paper, we present two approaches addressing visual target tracking and localization in a complex urban environment: fusion-based multi-target visual tracking, and multi-target localization via camera calibration. For multi-target tracking, the data fusion concepts of hypothesis generation/evaluation/selection, target-to-target registration, and association are employed. An association matrix is implemented using RGB histograms for associated tracking of multiple targets of interest. Motion segmentation of targets of interest (TOI) from the background was achieved by a Gaussian Mixture Model. Foreground segmentation, on the other hand, was achieved by the Connected Components Analysis (CCA) technique. The tracking of individual targets was estimated by fusing two sources of information: the centroid with spatial gating, and the RGB histogram association matrix. The localization problem is addressed through an effective camera calibration technique using edge modeling for grid mapping (EMGM). A two-stage image-pixel-to-world-coordinates mapping technique is introduced that performs coarse and fine location estimation of moving TOIs. In coarse estimation, an approximate neighborhood of the target position is estimated based on a nearest 4-neighbor method, and in fine estimation, Euclidean interpolation is used to localize the position within the estimated four neighbors. Both techniques were tested and showed reliable results for tracking and localization of targets of interest in a complex urban environment.
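    A minimal sketch of an RGB-histogram association matrix between tracked targets and new detections; the Bhattacharyya coefficient used as the similarity measure here is a common choice but an assumption, not necessarily the measure used by the authors.

      import numpy as np

      def rgb_histogram(patch, bins=8):
          """Normalized joint RGB histogram of an image patch (H x W x 3, uint8)."""
          hist, _ = np.histogramdd(patch.reshape(-1, 3).astype(float),
                                   bins=(bins,) * 3, range=((0, 256),) * 3)
          return (hist / hist.sum()).ravel()

      def association_matrix(track_patches, detection_patches):
          """Bhattacharyya coefficient between each track's histogram and each detection's
          histogram; rows are tracks, columns are detections."""
          t = np.array([rgb_histogram(p) for p in track_patches])
          d = np.array([rgb_histogram(p) for p in detection_patches])
          return np.sqrt(t) @ np.sqrt(d).T

      rng = np.random.default_rng(5)
      tracks = [rng.integers(0, 256, (32, 32, 3), dtype=np.uint8) for _ in range(3)]
      detections = [tracks[1].copy(), rng.integers(0, 256, (32, 32, 3), dtype=np.uint8)]
      A = association_matrix(tracks, detections)
      print(np.round(A, 3))   # each detection is associated with the track scoring highest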

  8. Application of Markov Models for Analysis of Development of Psychological Characteristics

    ERIC Educational Resources Information Center

    Kuravsky, Lev S.; Malykh, Sergey B.

    2004-01-01

    A technique to study the combined influence of environmental and genetic factors on the basis of changes in phenotype distributions is presented. Histograms are used as the basic analyzed characteristics. A continuous-time, discrete-state Markov process with piecewise-constant interstate transition rates is associated with the evolution of each histogram.…

  9. Post-Modeling Histogram Matching of Maps Produced Using Regression Trees

    Treesearch

    Andrew J. Lister; Tonya W. Lister

    2006-01-01

    Spatial predictive models often use statistical techniques that in some way rely on averaging of values. Estimates from linear modeling are known to be susceptible to truncation of variance when the independent (predictor) variables are measured with error. A straightforward post-processing technique (histogram matching) for attempting to mitigate this effect is...
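    A minimal sketch of quantile-based histogram matching of the kind referred to above, mapping variance-compressed predictions onto the distribution of a reference sample; the synthetic data and the specific quantile mapping are illustrative, not the authors' procedure.

      import numpy as np

      def histogram_match(predicted, reference):
          """Map `predicted` values onto the distribution of `reference` by matching quantiles,
          restoring spread that averaging-based models tend to compress."""
          ranks = np.argsort(np.argsort(predicted))
          quantiles = (ranks + 0.5) / predicted.size
          return np.quantile(reference, quantiles)

      rng = np.random.default_rng(6)
      observed = rng.gamma(2.0, 50.0, 5000)                    # e.g., plot-level reference values
      predicted = 0.3 * observed + 0.7 * observed.mean()       # variance-compressed model output
      matched = histogram_match(predicted, observed)
      print(round(predicted.std(), 1), round(matched.std(), 1), round(observed.std(), 1))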

  10. Microprocessor-Based Neural-Pulse-Wave Analyzer

    NASA Technical Reports Server (NTRS)

    Kojima, G. K.; Bracchi, F.

    1983-01-01

    Microprocessor-based system analyzes amplitudes and rise times of neural waveforms. Displaying histograms of measured parameters helps researchers determine how many nerves contribute to signal and specify waveform characteristics of each. Results are improved noise rejection, full or partial separation of overlapping peaks, and isolation and identification of related peaks in different histograms.

  11. Histogram-based automatic thresholding for bruise detection of apples by structured-illumination reflectance imaging

    USDA-ARS?s Scientific Manuscript database

    Thresholding is an important step in the segmentation of image features, and the existing methods are not all effective when the image histogram exhibits a unimodal pattern, which is common in defect detection of fruit. This study was aimed at developing a general automatic thresholding methodology ...

  12. Real-Time Tracking by Double Templates Matching Based on Timed Motion History Image with HSV Feature

    PubMed Central

    Li, Zhiyong; Li, Pengfei; Yu, Xiaoping; Hashem, Mervat

    2014-01-01

    It is a challenge to represent the target appearance model for moving object tracking under complex environments. This study presents a novel method whose appearance model is described by double templates based on the timed motion history image with HSV color histogram feature (tMHI-HSV). The main components include offline and online template initialization, tMHI-HSV-based calculation of candidate patches' feature histograms, double templates matching (DTM) for object location, and template updating. Firstly, we initialize the target object region and calculate its HSV color histogram feature as the offline template and the online template. Secondly, the tMHI-HSV is used to segment the motion region and calculate these candidate object patches' color histograms to represent their appearance models. Finally, we utilize the DTM method to track the target and update the offline and online templates in real time. The experimental results show that the proposed method can efficiently handle scale variation and pose change of rigid and nonrigid objects, even under illumination change and occlusion. PMID:24592185

  13. Adjustments for the display of quantized ion channel dwell times in histograms with logarithmic bins.

    PubMed

    Stark, J A; Hladky, S B

    2000-02-01

    Dwell-time histograms are often plotted as part of patch-clamp investigations of ion channel currents. The advantages of plotting these histograms with a logarithmic time axis have been demonstrated previously (J. Physiol. (Lond.) 378:141-174; Pflügers Arch. 410:530-553; Biophys. J. 52:1047-1054). Sigworth and Sine argued that the interpretation of such histograms is simplified if the counts are presented in a manner similar to that of a probability density function. However, when ion channel records are recorded as a discrete time series, the dwell times are quantized. As a result, the mapping of dwell times to logarithmically spaced bins is highly irregular; bins may be empty, and significant irregularities may extend beyond the duration of 100 samples. Using simple approximations based on the nature of the binning process and the transformation rules for probability density functions, we develop adjustments for the display of the counts to compensate for this effect. Tests with simulated data suggest that this procedure provides a faithful representation of the data.
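    A minimal sketch of the display problem addressed here: dwell times quantized at the sampling interval fall irregularly into logarithmic bins, and dividing each bin's count by the number of quantized dwell-time values it can hold smooths the display. This simple per-bin correction is an illustration of the issue, not the authors' adjustment.

      import numpy as np

      rng = np.random.default_rng(7)
      dt = 0.1                                              # sampling interval (ms)
      true_dwells = rng.exponential(scale=5.0, size=20000)  # single-exponential dwell times (ms)
      quantized = np.maximum(np.round(true_dwells / dt), 1) * dt   # quantized to multiples of dt

      edges = np.logspace(np.log10(dt), np.log10(quantized.max()), 40)   # logarithmic bin edges
      counts, _ = np.histogram(quantized, bins=edges)

      # Number of distinct quantized dwell-time values that each log bin can hold; at short
      # times many bins hold zero or one multiple of dt, which makes the raw counts irregular.
      levels_per_bin = np.floor(edges[1:] / dt) - np.floor(edges[:-1] / dt)
      adjusted = np.divide(counts, levels_per_bin,
                           out=np.zeros(counts.size), where=levels_per_bin > 0)
      print(counts[:12])
      print(np.round(adjusted[:12], 1))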

  14. Accurate modeling and evaluation of microstructures in complex materials

    NASA Astrophysics Data System (ADS)

    Tahmasebi, Pejman

    2018-02-01

    Accurate characterization of heterogeneous materials is of great importance for different fields of science and engineering. Such a goal can be achieved through imaging. Acquiring three- or two-dimensional images under different conditions is not, however, always feasible. On the other hand, accurate characterization of complex and multiphase materials requires various digital images (I) under different conditions. An ensemble method is presented that can take a single I (or a set of Is) and stochastically produce several similar models of the given disordered material. The method is based on successive calculation of a conditional probability by which the initial stochastic models are produced. Then, a graph formulation is utilized for removing unrealistic structures. A distance transform function for Is with highly connected microstructures and long-range features is considered, which results in a new I that is more informative. Reproduction of the I is also considered through a histogram matching approach in an iterative framework. Such an iterative algorithm avoids the reproduction of unrealistic structures. Furthermore, a multiscale approach, based on a pyramid representation of large Is, is presented that can produce materials with millions of pixels in a matter of seconds. Finally, nonstationary systems, those for which the distribution of data varies spatially, are studied using two different methods. The method is tested on several complex and large examples of microstructures. The produced results are all in excellent agreement with the utilized Is, and the similarities are quantified using various correlation functions.

  15. Dosimetric impact in the dose-volume histograms of rectal and vesical wall contouring in prostate cancer IMRT treatments.

    PubMed

    Gómez, Laura; Andrés, Carlos; Ruiz, Antonio

    2017-01-01

    The main purpose of this study was to evaluate the differences in dose-volume histograms of IMRT treatments for prostate cancer depending on whether the main organs at risk (rectum and bladder) are delineated as solid organs or by contouring their walls. The rectum and bladder have typically been delineated as solid organs, including their contents, which, in practice, can lead to an erroneous assessment of the risk of adverse effects. A retrospective study was performed on 25 patients treated with IMRT for prostate adenocarcinoma. 76.32 Gy in 36 fractions was prescribed to the prostate and seminal vesicles. In addition to the delineation of the rectum and bladder as solid organs (including their contents), the rectal and bladder walls were also delineated, and the resulting dose-volume histograms were analyzed for the two groups of structures. Data analysis shows statistically significant differences in the main parameters used to assess the risk of toxicity of a prostate radiotherapy treatment. Higher doses were received by the rectal and bladder walls compared with the doses received by the corresponding solid organs. The observed differences in doses received by the rectum and bladder depending on the method of contouring could gain greater importance in inverse-planning treatments, where the treatment planning system optimizes the dose in these volumes. Therefore, the method of delineation of these structures should be taken into account when making clinical decisions regarding dose limits and the risk of chronic toxicity.

  16. Anvil Clouds of Tropical Mesoscale Convective Systems in Monsoon Regions

    NASA Technical Reports Server (NTRS)

    Cetrone, J.; Houze, R. A., Jr.

    2009-01-01

    The anvil clouds of tropical mesoscale convective systems (MCSs) in West Africa, the Maritime Continent, and the Bay of Bengal have been examined with TRMM and CloudSat satellite data and ARM ground-based radar observations. The anvils spreading out from the precipitating cores of MCSs are subdivided into thick, medium, and thin portions. The thick portions of anvils show distinct differences from one climatological regime to another. The thick anvils of West African MCSs have a broad, flat histogram of reflectivity in their upper portions and a maximum of reflectivity in their lower portions. The reflectivity histogram of the Bay of Bengal thick anvils has a sharply peaked distribution of reflectivity at all altitudes, with modal values that increase monotonically downward. The reflectivity histogram of the Maritime Continent thick anvils is intermediate between those of the West African and Bay of Bengal anvils, consistent with the fact that this region comprises a mix of land and ocean influences. It is suggested that the difference between the statistics of the continental and oceanic anvils is related to some combination of two factors: (1) the West African anvils tend to be closely tied to the convective regions of MCSs, while the oceanic anvils are more likely to extend outward from large stratiform precipitation areas of MCSs, and (2) the West African MCSs result from greater buoyancy, so that the convective cells are more likely to produce graupel particles and detrain them into the anvils.

  17. Fluorescence lifetime imaging ophthalmoscopy in type 2 diabetic patients who have no signs of diabetic retinopathy

    NASA Astrophysics Data System (ADS)

    Schweitzer, Dietrich; Deutsch, Lydia; Klemm, Matthias; Jentsch, Susanne; Hammer, Martin; Peters, Sven; Haueisen, Jens; Müller, Ulrich A.; Dawczynski, Jens

    2015-06-01

    The time-resolved autofluorescence of the eye is used for the detection of metabolic alteration in diabetic patients who have no signs of diabetic retinopathy. One eye from 37 phakic and 11 pseudophakic patients with type 2 diabetes, and one eye from 25 phakic and 23 pseudophakic healthy subjects were included in the study. After a three-exponential fit of the decay of autofluorescence, histograms of lifetimes τi, amplitudes αi, and relative contributions Qi were statistically compared between corresponding groups in two spectral channels (490 < ch1 < 560 nm, 560 < ch2 < 700 nm). The change in single fluorophores was estimated by applying the Holm-Bonferroni method and by calculating differences in the sum histograms of lifetimes. Median and mean of the histograms of τ2, τ3, and α3 in ch1 show the greatest differences between phakic diabetic patients and age-matched controls (p < 0.000004). The lack of pixels with a τ2 of ~360 ps, the increased number of pixels with τ2 > 450 ps, and the shift of τ3 from ~3000 to 3700 ps in ch1 of diabetic patients when compared with healthy subjects indicate an increased production of free flavin adenine dinucleotide, accumulation of advanced glycation end products (AGE), and, probably, a change from free to protein-bound reduced nicotinamide adenine dinucleotide at the fundus. AGE also accumulated in the crystalline lens.

  18. Detection of simulated microcalcifications in fixed mammary tissue: An ROC study of the effect of local versus global histogram equalization.

    PubMed

    Sund, T; Olsen, J B

    2006-09-01

    To investigate whether sliding window adaptive histogram equalization (SWAHE) of digital mammograms improves the detection of simulated calcifications, as compared to images normalized by global histogram equalization (GHE). Direct digital mammograms were obtained from mammary tissue phantoms superimposed with different frames. Each frame was divided into forty squares by a wire mesh and contained granular calcifications randomly positioned in about 50% of the squares. Three radiologists read the mammograms on a display monitor. They classified their confidence in the presence of microcalcifications in each square on a scale of 1 to 5. Images processed with GHE were read first and used as a reference. In a later session, the same images processed with SWAHE were read. The results were compared using ROC methodology. When the total areas Az were compared, the results were completely equivocal. When comparing the high-specificity partial ROC area Az,0.2 below false-positive fraction (FPF) 0.20, two of the three observers performed best with the images processed with SWAHE. The difference was not statistically significant. When the reader's confidence threshold for malignancy is set at a high level, increasing the contrast of mammograms with SWAHE may enhance the visibility of microcalcifications without adversely affecting the false-positive rate. When the reader's confidence threshold is set at a low level, the effect of SWAHE is an increase in false positives. Further investigation is needed to confirm the validity of these conclusions.
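    For context, a minimal sketch contrasting global histogram equalization with a tile-based adaptive variant on a synthetic low-contrast image, using scikit-image's equalize_hist and equalize_adapthist; the latter implements CLAHE, a close relative of SWAHE rather than the exact method evaluated here.

      import numpy as np
      from skimage import exposure

      rng = np.random.default_rng(8)
      yy, xx = np.mgrid[0:512, 0:512]
      # Synthetic low-contrast image: smooth background plus faint bright specks.
      image = 0.4 + 0.1 * np.sin(xx / 80.0) * np.cos(yy / 60.0) + 0.02 * rng.normal(size=(512, 512))
      rows, cols = rng.integers(0, 512, 200), rng.integers(0, 512, 200)
      image[rows, cols] += 0.05                  # simulated microcalcification-like specks
      image = np.clip(image, 0.0, 1.0)

      ghe = exposure.equalize_hist(image)                 # global histogram equalization
      adaptive = exposure.equalize_adapthist(image,       # tile-based adaptive equalization (CLAHE)
                                             kernel_size=64, clip_limit=0.02)
      print(round(image.std(), 3), round(ghe.std(), 3), round(adaptive.std(), 3))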

  19. A method for normalizing pathology images to improve feature extraction for quantitative pathology.

    PubMed

    Tam, Allison; Barker, Jocelyn; Rubin, Daniel

    2016-01-01

    With the advent of digital slide scanning technologies and the potential proliferation of large repositories of digital pathology images, many research studies can leverage these data for biomedical discovery and to develop clinical applications. However, quantitative analysis of digital pathology images is impeded by batch effects generated by varied staining protocols and staining conditions of pathological slides. To overcome this problem, this paper proposes a novel, fully automated stain normalization method to reduce batch effects and thus aid research in digital pathology applications. The proposed method, intensity centering and histogram equalization (ICHE), normalizes a diverse set of pathology images by first scaling the centroids of the intensity histograms to a common point and then applying a modified version of contrast-limited adaptive histogram equalization. Normalization was performed on two datasets of digitized hematoxylin and eosin (H&E) slides of different tissue slices from the same lung tumor, and one immunohistochemistry dataset of digitized slides created by restaining one of the H&E datasets. The ICHE method was evaluated based on image intensity values, quantitative features, and the effect on downstream applications, such as computer-aided diagnosis. For comparison, three methods from the literature were reimplemented and evaluated using the same criteria. The authors found that ICHE not only improved performance compared with un-normalized images, but in most cases showed improvement compared with previous methods for correcting batch effects in the literature. ICHE may be a useful preprocessing step in a digital pathology image processing pipeline.
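    A rough sketch of the two stages described, intensity centering followed by contrast-limited adaptive histogram equalization, applied to synthetic grayscale slides; the additive centering shift and the use of scikit-image's CLAHE stand in for the paper's centroid scaling and modified equalization, so this is an approximation, not the ICHE implementation.

      import numpy as np
      from skimage import exposure

      def center_intensity(image, target=0.5):
          """Stage 1: move the image's intensity-histogram centroid to a common point
          (a simple additive shift is used here in place of the paper's centroid scaling)."""
          return np.clip(image + (target - image.mean()), 0.0, 1.0)

      def iche_like(image, target=0.5, clip_limit=0.01):
          """Stage 2: contrast-limited adaptive histogram equalization on the centered image."""
          return exposure.equalize_adapthist(center_intensity(image, target), clip_limit=clip_limit)

      rng = np.random.default_rng(12)
      base = rng.random((256, 256))
      slide_a = np.clip(0.8 * base + 0.05, 0, 1)     # same tissue, two staining "batches"
      slide_b = np.clip(0.6 * base + 0.30, 0, 1)
      norm_a, norm_b = iche_like(slide_a), iche_like(slide_b)
      print(round(abs(slide_a.mean() - slide_b.mean()), 3),
            round(abs(norm_a.mean() - norm_b.mean()), 3))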

  20. Fluorescence lifetime imaging ophthalmoscopy in type 2 diabetic patients who have no signs of diabetic retinopathy.

    PubMed

    Schweitzer, Dietrich; Deutsch, Lydia; Klemm, Matthias; Jentsch, Susanne; Hammer, Martin; Peters, Sven; Haueisen, Jens; Müller, Ulrich A; Dawczynski, Jens

    2015-06-01

    The time-resolved autofluorescence of the eye is used for the detection of metabolic alteration in diabetic patients who have no signs of diabetic retinopathy. One eye from 37 phakic and 11 pseudophakic patients with type 2 diabetes, and one eye from 25 phakic and 23 pseudophakic healthy subjects were included in the study. After a three-exponential fit of the decay of autofluorescence, histograms of lifetimes τ(i), amplitudes α(i), and relative contributions Q(i) were statistically compared between corresponding groups in two spectral channels (490 < ch1 < 560 nm, 560 < ch2 < 700 nm). The change in single fluorophores was estimated by applying the Holm–Bonferroni method and by calculating differences in the sum histograms of lifetimes. Median and mean of the histograms of τ(2), τ(3), and α(3) in ch1 show the greatest differences between phakic diabetic patients and age-matched controls (p < 0.000004). The lack of pixels with a τ(2) of ∼360 ps, the increased number of pixels with τ(2) > 450 ps, and the shift of τ(3) from ∼3000 to 3700 ps in ch1 of diabetic patients when compared with healthy subjects indicate an increased production of free flavin adenine dinucleotide, accumulation of advanced glycation end products (AGE), and, probably, a change from free to protein-bound reduced nicotinamide adenine dinucleotide at the fundus. AGE also accumulated in the crystalline lens.
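    A minimal sketch of a three-exponential decay fit of the kind described, using scipy.optimize.curve_fit on a synthetic photon-counting decay; real FLIM analysis also involves convolution with the instrument response function and per-pixel fitting, which are omitted here, and the time axis and parameters are illustrative.

      import numpy as np
      from scipy.optimize import curve_fit

      def tri_exp(t, a1, tau1, a2, tau2, a3, tau3):
          """Three-exponential fluorescence decay model."""
          return a1 * np.exp(-t / tau1) + a2 * np.exp(-t / tau2) + a3 * np.exp(-t / tau3)

      t = np.linspace(0.0, 12000.0, 1024)                       # time axis in ps
      rng = np.random.default_rng(9)
      clean = tri_exp(t, 0.6, 120.0, 0.3, 450.0, 0.1, 3200.0)
      decay = rng.poisson(clean * 5000) / 5000.0                # photon-counting (shot) noise

      p0 = [0.5, 100.0, 0.3, 500.0, 0.2, 3000.0]
      popt, _ = curve_fit(tri_exp, t, decay, p0=p0, bounds=(0.0, np.inf))
      amps, taus = popt[0::2], popt[1::2]
      q = amps * taus / np.sum(amps * taus)                     # relative contributions Q_i
      print(np.round(taus, 0), np.round(q, 3))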
