Sample records for adaptive histogram equalization

  1. Thresholding histogram equalization.

    PubMed

    Chuang, K S; Chen, S; Hwang, I M

    2001-12-01

    The drawbacks of adaptive histogram equalization techniques are the loss of definition on the edges of the object and overenhancement of noise in the images. These drawbacks can be avoided if the noise is excluded in the equalization transformation function computation. A method has been developed to separate the histogram into zones, each with its own equalization transformation. This method can be used to suppress the nonanatomic noise and enhance only certain parts of the object. This method can be combined with other adaptive histogram equalization techniques. Preliminary results indicate that this method can produce images with superior contrast.
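    A minimal sketch of the zone-wise idea described above (the zone boundary and the 8-bit range are illustrative assumptions, not values from the paper): the gray range is split into zones, and each zone gets its own equalization transformation, so counts in one zone (e.g. nonanatomic background noise) cannot distort the mapping of another.

```python
import numpy as np

def zoned_equalization(img, boundaries):
    """Equalize each gray-level zone with its own transformation.

    `boundaries` splits [0, 256) into zones; pixels in each zone are
    remapped by a CDF computed from that zone's histogram only, so a
    noisy zone cannot influence the enhancement of the others.
    """
    out = np.empty_like(img)
    edges = [0] + list(boundaries) + [256]
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (img >= lo) & (img < hi)
        if not mask.any():
            continue
        hist, _ = np.histogram(img[mask], bins=hi - lo, range=(lo, hi))
        cdf = hist.cumsum() / hist.sum()
        # Map this zone's gray levels onto [lo, hi - 1] via its own CDF.
        lut = (lo + cdf * (hi - 1 - lo)).astype(img.dtype)
        out[mask] = lut[img[mask] - lo]
    return out
```

    Because each zone maps back into its own interval, a zone can also simply be left untouched to suppress it rather than enhance it.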

  2. Combining Vector Quantization and Histogram Equalization.

    ERIC Educational Resources Information Center

    Cosman, Pamela C.; And Others

    1992-01-01

    Discussion of contrast enhancement techniques focuses on the use of histogram equalization with a data compression technique, i.e., tree-structured vector quantization. The enhancement technique of intensity windowing is described, and the use of enhancement techniques for medical images is explained, including adaptive histogram equalization.…

  3. Regionally adaptive histogram equalization of the chest.

    PubMed

    Sherrier, R H; Johnson, G A

    1987-01-01

    Advances in the area of digital chest radiography have resulted in the acquisition of high-quality images of the human chest. With these advances, there arises a genuine need for image processing algorithms specific to the chest, in order to fully exploit this digital technology. We have implemented the well-known technique of histogram equalization, noting the problems encountered when it is adapted to chest images. These problems have been successfully solved with our regionally adaptive histogram equalization method. With this technique, histograms are calculated locally and then modified according to both the mean pixel value of that region and certain characteristics of the cumulative distribution function. This process, which has allowed certain regions of the chest radiograph to be enhanced differentially, may also have broader implications for other image processing tasks.

  4. Stochastic HKMDHE: A multi-objective contrast enhancement algorithm

    NASA Astrophysics Data System (ADS)

    Pratiher, Sawon; Mukhopadhyay, Sabyasachi; Maity, Srideep; Pradhan, Asima; Ghosh, Nirmalya; Panigrahi, Prasanta K.

    2018-02-01

    This contribution proposes a novel extension of the existing `Hyper Kurtosis based Modified Duo-Histogram Equalization' (HKMDHE) algorithm for multi-objective contrast enhancement of biomedical images. A modified objective function has been formulated by jointly optimizing the individual histogram equalization objectives. The adequacy of the proposed methodology has been experimentally validated with respect to image quality metrics such as brightness preservation, peak signal-to-noise ratio (PSNR), Structural Similarity Index (SSIM), and the universal image quality metric. A comparative performance analysis of the proposed Stochastic HKMDHE against existing histogram equalization methodologies such as Global Histogram Equalization (GHE) and Contrast Limited Adaptive Histogram Equalization (CLAHE) is also given.

  5. Adaptive image contrast enhancement using generalizations of histogram equalization.

    PubMed

    Stark, J A

    2000-01-01

    This paper proposes a scheme for adaptive image-contrast enhancement based on a generalization of histogram equalization (HE). HE is a useful technique for improving image contrast, but its effect is too severe for many purposes. However, dramatically different results can be obtained with relatively minor modifications. A concise description of adaptive HE is set out, and this framework is used in a discussion of past suggestions for variations on HE. A key feature of this formalism is a "cumulation function," which is used to generate a grey level mapping from the local histogram. By choosing alternative forms of cumulation function one can achieve a wide variety of effects. A specific form is proposed. Through the variation of one or two parameters, the resulting process can produce a range of degrees of contrast enhancement, at one extreme leaving the image unchanged, at another yielding full adaptive equalization.
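    The "cumulation function" idea can be illustrated with a deliberately simplified global (not adaptive) sketch: blending the identity mapping with the histogram CDF through one parameter spans the range from "image unchanged" to full equalization. The one-parameter blend below is an assumption for illustration; the paper's specific two-parameter form, applied locally, differs.

```python
import numpy as np

def blended_equalization(img, alpha):
    """Grey-level mapping from a blend of two cumulation functions.

    alpha = 0 reproduces the identity mapping (image unchanged);
    alpha = 1 gives full global histogram equalization.
    """
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    cdf = hist.cumsum() / hist.sum()
    # Convex combination of the equalizing map (255 * cdf) and identity.
    lut = alpha * 255.0 * cdf + (1.0 - alpha) * np.arange(256)
    lut = np.clip(np.round(lut), 0, 255).astype(np.uint8)
    return lut[img]
```

    Intermediate alpha values give intermediate degrees of contrast enhancement, mirroring the paper's claim that one or two parameters can sweep from no change to full equalization.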

  6. Chest CT window settings with multiscale adaptive histogram equalization: pilot study.

    PubMed

    Fayad, Laura M; Jin, Yinpeng; Laine, Andrew F; Berkmen, Yahya M; Pearson, Gregory D; Freedman, Benjamin; Van Heertum, Ronald

    2002-06-01

    Multiscale adaptive histogram equalization (MAHE), a wavelet-based algorithm, was investigated as a method of automatic simultaneous display of the full dynamic contrast range of a computed tomographic image. Interpretation times were significantly lower for MAHE-enhanced images compared with those for conventionally displayed images. Diagnostic accuracy, however, was insufficient in this pilot study to allow recommendation of MAHE as a replacement for conventional window display.

  7. Adaptive histogram equalization in digital radiography of destructive skeletal lesions.

    PubMed

    Braunstein, E M; Capek, P; Buckwalter, K; Bland, P; Meyer, C R

    1988-03-01

    Adaptive histogram equalization, an image-processing technique that distributes pixel values of an image uniformly throughout the gray scale, was applied to 28 plain radiographs of bone lesions, after they had been digitized. The non-equalized and equalized digital images were compared by two skeletal radiologists with respect to lesion margins, internal matrix, soft-tissue mass, cortical breakthrough, and periosteal reaction. Receiver operating characteristic (ROC) curves were constructed on the basis of the responses. Equalized images were superior to nonequalized images in determination of cortical breakthrough and presence or absence of periosteal reaction. ROC analysis showed no significant difference in determination of margins, matrix, or soft-tissue masses.

  8. An improved contrast enhancement algorithm for infrared images based on adaptive double plateaus histogram equalization

    NASA Astrophysics Data System (ADS)

    Li, Shuo; Jin, Weiqi; Li, Li; Li, Yiyang

    2018-05-01

    Infrared thermal images can reflect the thermal-radiation distribution of a particular scene. However, the contrast of the infrared images is usually low. Hence, it is generally necessary to enhance the contrast of infrared images in advance to facilitate subsequent recognition and analysis. Based on the adaptive double plateaus histogram equalization, this paper presents an improved contrast enhancement algorithm for infrared thermal images. In the proposed algorithm, the normalized coefficient of variation of the histogram, which characterizes the level of contrast enhancement, is introduced as feedback information to adjust the upper and lower plateau thresholds. The experiments on actual infrared images show that compared to the three typical contrast-enhancement algorithms, the proposed algorithm has better scene adaptability and yields better contrast-enhancement results for infrared images with more dark areas or a higher dynamic range. Hence, it has high application value in contrast enhancement, dynamic range compression, and digital detail enhancement for infrared thermal images.
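    The double-plateau mechanism can be sketched as follows; fixed thresholds are passed in for illustration, whereas the paper adapts them using the normalized coefficient of variation of the histogram as feedback.

```python
import numpy as np

def double_plateau_equalization(img, t_low, t_up):
    """Histogram equalization with lower and upper plateau thresholds.

    Bins larger than t_up are clipped down to t_up (limiting background
    over-enhancement); occupied bins smaller than t_low are raised to
    t_low (protecting small-area details).  The modified histogram is
    then equalized as usual.
    """
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    mod = hist.astype(float)
    mod[mod > t_up] = t_up
    mod[(mod > 0) & (mod < t_low)] = t_low
    cdf = mod.cumsum() / mod.sum()
    lut = np.round(255 * cdf).astype(np.uint8)
    return lut[img]
```

    With t_low = 0 and t_up = inf this degenerates to plain global histogram equalization, which makes the role of each plateau easy to see.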

  9. An evaluation of the effectiveness of adaptive histogram equalization for contrast enhancement.

    PubMed

    Zimmerman, J B; Pizer, S M; Staab, E V; Perry, J R; McCartney, W; Brenton, B C

    1988-01-01

    Adaptive histogram equalization (AHE) and intensity windowing have been compared using psychophysical observer studies. Experienced radiologists were shown clinical CT (computerized tomographic) images of the chest. Into some of the images, appropriate artificial lesions were introduced; the physicians were then shown the images processed with both AHE and intensity windowing. They were asked to assess the probability that a given image contained the artificial lesion, and their accuracy was measured. The results of these experiments show that for this particular diagnostic task, there was no significant difference in the ability of the two methods to depict luminance contrast; thus, further evaluation of AHE using controlled clinical trials is indicated.

  10. Feature and contrast enhancement of mammographic image based on multiscale analysis and morphology.

    PubMed

    Wu, Shibin; Yu, Shaode; Yang, Yuhan; Xie, Yaoqin

    2013-01-01

    A new algorithm for feature and contrast enhancement of mammographic images is proposed in this paper. The approach is based on multiscale transform and mathematical morphology. First, the Laplacian Gaussian pyramid operator is applied to decompose the mammogram into subband images at different scales. The detail (high-frequency) subimages are then equalized by contrast limited adaptive histogram equalization (CLAHE), and the low-pass subimages are processed by mathematical morphology. Finally, the enhanced image is reconstructed from the Laplacian Gaussian pyramid coefficients modified at one or more levels by CLAHE and mathematical morphology, respectively, and the reconstructed image is processed by a global nonlinear operator. The experimental results show that the presented algorithm is effective for feature and contrast enhancement of mammograms. The performance of the proposed algorithm is measured by a contrast evaluation criterion, signal-to-noise ratio (SNR), and contrast improvement index (CII).
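    A one-level stand-in for the decompose-enhance-reconstruct scheme above, with a box filter in place of the Gaussian low-pass and a plain detail gain in place of CLAHE and morphology; the filter size and gain are illustrative assumptions, not values from the paper.

```python
import numpy as np

def box_blur(img, k=5):
    """Simple separable box filter standing in for the Gaussian low-pass."""
    pad = k // 2
    padded = np.pad(img.astype(float), pad, mode="edge")
    kernel = np.ones(k) / k
    blurred = np.apply_along_axis(lambda r: np.convolve(r, kernel, "valid"), 1, padded)
    blurred = np.apply_along_axis(lambda c: np.convolve(c, kernel, "valid"), 0, blurred)
    return blurred

def enhance_single_level(img, detail_gain=2.0):
    """Decompose into low-pass + detail, boost the detail, reconstruct.

    In the paper's full pyramid scheme the detail bands would be
    CLAHE-equalized and the low-pass band processed by morphology;
    here the detail band is simply amplified to show the structure.
    """
    low = box_blur(img)
    detail = img.astype(float) - low
    out = low + detail_gain * detail
    return np.clip(np.round(out), 0, 255).astype(np.uint8)
```

    Repeating the split on successively downsampled low-pass images yields the multi-level pyramid the paper actually uses.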

  11. Feature and Contrast Enhancement of Mammographic Image Based on Multiscale Analysis and Morphology

    PubMed Central

    Wu, Shibin; Xie, Yaoqin

    2013-01-01

    A new algorithm for feature and contrast enhancement of mammographic images is proposed in this paper. The approach is based on multiscale transform and mathematical morphology. First, the Laplacian Gaussian pyramid operator is applied to decompose the mammogram into subband images at different scales. The detail (high-frequency) subimages are then equalized by contrast limited adaptive histogram equalization (CLAHE), and the low-pass subimages are processed by mathematical morphology. Finally, the enhanced image is reconstructed from the Laplacian Gaussian pyramid coefficients modified at one or more levels by CLAHE and mathematical morphology, respectively, and the reconstructed image is processed by a global nonlinear operator. The experimental results show that the presented algorithm is effective for feature and contrast enhancement of mammograms. The performance of the proposed algorithm is measured by a contrast evaluation criterion, signal-to-noise ratio (SNR), and contrast improvement index (CII). PMID:24416072

  12. Multi-scale Morphological Image Enhancement of Chest Radiographs by a Hybrid Scheme.

    PubMed

    Alavijeh, Fatemeh Shahsavari; Mahdavi-Nasab, Homayoun

    2015-01-01

    Chest radiography is a common diagnostic imaging test, which contains an enormous amount of information about a patient. However, its interpretation is highly challenging. The accuracy of the diagnostic process is greatly influenced by image processing algorithms; hence enhancement of the images is indispensable in order to improve visibility of the details. This paper aims at improving radiograph parameters such as contrast, sharpness, noise level, and brightness to enhance chest radiographs, making use of a triangulation method. Here, contrast limited adaptive histogram equalization technique and noise suppression are simultaneously performed in wavelet domain in a new scheme, followed by morphological top-hat and bottom-hat filtering. A unique implementation of morphological filters allows for adjustment of the image brightness and significant enhancement of the contrast. The proposed method is tested on chest radiographs from Japanese Society of Radiological Technology database. The results are compared with conventional enhancement techniques such as histogram equalization, contrast limited adaptive histogram equalization, Retinex, and some recently proposed methods to show its strengths. The experimental results reveal that the proposed method can remarkably improve the image contrast while keeping the sensitive chest tissue information so that radiologists might have a more precise interpretation.

  13. Multi-scale Morphological Image Enhancement of Chest Radiographs by a Hybrid Scheme

    PubMed Central

    Alavijeh, Fatemeh Shahsavari; Mahdavi-Nasab, Homayoun

    2015-01-01

    Chest radiography is a common diagnostic imaging test, which contains an enormous amount of information about a patient. However, its interpretation is highly challenging. The accuracy of the diagnostic process is greatly influenced by image processing algorithms; hence enhancement of the images is indispensable in order to improve visibility of the details. This paper aims at improving radiograph parameters such as contrast, sharpness, noise level, and brightness to enhance chest radiographs, making use of a triangulation method. Here, contrast limited adaptive histogram equalization technique and noise suppression are simultaneously performed in wavelet domain in a new scheme, followed by morphological top-hat and bottom-hat filtering. A unique implementation of morphological filters allows for adjustment of the image brightness and significant enhancement of the contrast. The proposed method is tested on chest radiographs from Japanese Society of Radiological Technology database. The results are compared with conventional enhancement techniques such as histogram equalization, contrast limited adaptive histogram equalization, Retinex, and some recently proposed methods to show its strengths. The experimental results reveal that the proposed method can remarkably improve the image contrast while keeping the sensitive chest tissue information so that radiologists might have a more precise interpretation. PMID:25709942

  14. Lower-upper-threshold correlation for underwater range-gated imaging self-adaptive enhancement.

    PubMed

    Sun, Liang; Wang, Xinwei; Liu, Xiaoquan; Ren, Pengdao; Lei, Pingshun; He, Jun; Fan, Songtao; Zhou, Yan; Liu, Yuliang

    2016-10-10

    In underwater range-gated imaging (URGI), enhancement of low-brightness, low-contrast images is critical for human observation. Traditional histogram equalization over-enhances images, with the result that details are lost. To suppress over-enhancement, a lower-upper-threshold correlation method is proposed for underwater range-gated imaging self-adaptive enhancement based on double-plateau histogram equalization. The lower threshold determines image details and suppresses over-enhancement, and it is correlated with the upper threshold. First, the upper threshold is updated by searching for the local maximum in real time; then the lower threshold is calculated from the upper threshold and the number of nonzero units selected from a filtered histogram. With this method, the backgrounds of underwater images are constrained while details are enhanced. Finally, proof-of-concept experiments are performed. Peak signal-to-noise ratio, variance, contrast, and human visual properties are used to evaluate the objective quality of the global image and of regions of interest. The evaluation results demonstrate that the proposed method adaptively selects proper upper and lower thresholds under different conditions, contributing effective image enhancement to URGI for human observation.

  15. Comparison of image enhancement methods for the effective diagnosis in successive whole-body bone scans.

    PubMed

    Jeong, Chang Bu; Kim, Kwang Gi; Kim, Tae Sung; Kim, Seok Ki

    2011-06-01

    Whole-body bone scan is one of the most frequent diagnostic procedures in nuclear medicine. In particular, it plays a significant role in important procedures such as the diagnosis of osseous metastasis and evaluation of osseous tumor response to chemotherapy and radiation therapy. It can also be used to monitor any recurrence of the tumor. However, it is very time-consuming for radiologists to quantify subtle interval changes between successive whole-body bone scans because of many variations in intensity, geometry, and morphology. In this paper, we present the most effective method of histogram-based image enhancement, which may assist radiologists in interpreting successive whole-body bone scans effectively. Forty-eight successive whole-body bone scans from 10 patients were obtained and evaluated using six histogram-based image enhancement methods: histogram equalization, brightness-preserving bi-histogram equalization, contrast-limited adaptive histogram equalization, end-in search, histogram matching, and exact histogram matching (EHM). The results of the different methods were compared using three similarity measures: peak signal-to-noise ratio, histogram intersection, and structural similarity. Image enhancement of successive bone scans using EHM showed the best results of the six methods for all similarity measures. EHM is thus the best histogram-based image enhancement method for diagnosing successive whole-body bone scans. The method has the potential to greatly assist radiologists in quantifying interval changes more accurately and quickly by compensating for the variable nature of intensity information. Consequently, it can improve radiologists' diagnostic accuracy as well as reduce reading time for detecting interval changes.
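    The rank-ordering idea behind exact histogram matching can be sketched as below; ties are broken arbitrarily here, whereas true EHM orders tied pixels using local neighborhood information.

```python
import numpy as np

def exact_histogram_match(src, ref):
    """Give `src` exactly the histogram of `ref` (same pixel count).

    The pixels of `src` are rank-ordered and assigned, in order, the
    sorted gray values of `ref`, so the output histogram matches the
    reference histogram exactly rather than approximately.
    """
    assert src.size == ref.size
    order = np.argsort(src, axis=None, kind="stable")
    out = np.empty(src.size, dtype=ref.dtype)
    out[order] = np.sort(ref, axis=None)
    return out.reshape(src.shape)
```

    Matching each new scan exactly to a baseline scan's histogram is what compensates for the intensity variability between successive acquisitions.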

  16. A psychophysical comparison of two methods for adaptive histogram equalization.

    PubMed

    Zimmerman, J B; Cousins, S B; Hartzell, K M; Frisse, M E; Kahn, M G

    1989-05-01

    Adaptive histogram equalization (AHE) is a method for adaptive contrast enhancement of digital images. It is an automatic, reproducible method for the simultaneous viewing of contrast within a digital image with a large dynamic range. Recent experiments have shown that in specific cases, there is no significant difference in the ability of AHE and linear intensity windowing to display gray-scale contrast. More recently, a variant of AHE which limits the allowed contrast enhancement of the image has been proposed. This contrast-limited adaptive histogram equalization (CLAHE) produces images in which the noise content of an image is not excessively enhanced, but in which sufficient contrast is provided for the visualization of structures within the image. Images processed with CLAHE have a more natural appearance and facilitate the comparison of different areas of an image. However, the reduced contrast enhancement of CLAHE may hinder the ability of an observer to detect the presence of some significant gray-scale contrast. In this report, a psychophysical observer experiment was performed to determine if there is a significant difference in the ability of AHE and CLAHE to depict gray-scale contrast. Observers were presented with computed tomography (CT) images of the chest processed with AHE and CLAHE. Subtle artificial lesions were introduced into some images. The observers were asked to rate their confidence regarding the presence of the lesions; this rating-scale data was analyzed using receiver operating characteristic (ROC) curve techniques. These ROC curves were compared for significant differences in the observers' performances. In this report, no difference was found in the abilities of AHE and CLAHE to depict contrast information.
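    The contrast-limiting step of CLAHE can be sketched for a single tile: histogram bins above a clip limit are trimmed and the excess is redistributed uniformly, which bounds the slope of the mapping and hence the noise amplification. Full CLAHE builds one such LUT per tile and bilinearly interpolates between neighboring LUTs, which is omitted here; the clip limit is an illustrative input.

```python
import numpy as np

def clip_limited_lut(tile, clip_limit):
    """Contrast-limited equalization mapping for one tile.

    Clipping caps the histogram peaks that plain AHE would turn into
    steep, noise-amplifying sections of the mapping; the clipped mass
    is spread evenly over all 256 bins so total mass is preserved.
    """
    hist, _ = np.histogram(tile, bins=256, range=(0, 256))
    hist = hist.astype(float)
    excess = np.maximum(hist - clip_limit, 0).sum()
    hist = np.minimum(hist, clip_limit) + excess / 256.0
    cdf = hist.cumsum() / hist.sum()
    return np.round(255 * cdf).astype(np.uint8)
```

    A very large clip limit recovers plain AHE for the tile; a clip limit near the mean bin count approaches the identity mapping, which is the trade-off the observer study above examines.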

  17. Uniform enhancement of optical micro-angiography images using Rayleigh contrast-limited adaptive histogram equalization.

    PubMed

    Yousefi, Siavash; Qin, Jia; Zhi, Zhongwei; Wang, Ruikang K

    2013-02-01

    Optical microangiography is an imaging technology capable of providing detailed functional blood flow maps within microcirculatory tissue beds in vivo. Some practical issues, however, exist when displaying and quantifying the microcirculation that perfuses the scanned tissue volume. These issues include: (I) the probing light is subject to specular reflection when it shines onto the sample, and the unevenness of the tissue surface makes the light energy entering the tissue non-uniform over the entire scanned volume; (II) biological tissue is heterogeneous in nature, meaning its scattering and absorption properties attenuate the probe beam. These physical limitations can result in local contrast degradation and non-uniform micro-angiogram images. In this paper, we propose a post-processing method that uses Rayleigh contrast-limited adaptive histogram equalization to increase the contrast and improve the overall appearance and uniformity of optical micro-angiograms without saturating the vessel intensity or changing the physical meaning of the micro-angiograms. The qualitative and quantitative performance of the proposed method is compared with that of common histogram equalization and contrast enhancement methods, and we demonstrate that the proposed method outperforms these existing approaches. The proposed method is not limited to optical microangiography and can be used in other imaging modalities such as photoacoustic tomography and scanning laser confocal microscopy.
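    Histogram specification toward a Rayleigh-shaped output can be sketched by passing the empirical CDF through the inverse Rayleigh CDF, x = sigma * sqrt(-2 ln(1 - F)). The global, unclipped version below and its sigma value are illustrative simplifications of the tile-based, contrast-limited method used in the paper.

```python
import numpy as np

def rayleigh_equalization(img, sigma=0.4):
    """Histogram specification toward a Rayleigh-shaped histogram.

    The empirical CDF is mapped through the inverse Rayleigh CDF and
    rescaled to 8 bits; compared with the uniform target of plain HE,
    the Rayleigh target keeps most pixels dark, which suits sparse
    bright structures such as vessels.
    """
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    cdf = hist.cumsum() / hist.sum()
    cdf = np.clip(cdf, 0, 1 - 1e-6)          # keep the log finite
    shaped = sigma * np.sqrt(-2.0 * np.log(1.0 - cdf))
    lut = np.round(255 * shaped / shaped[-1]).astype(np.uint8)
    return lut[img]
```

    Smaller sigma concentrates the output histogram near black; the inverse-CDF trick works for any invertible target distribution, Rayleigh being the paper's choice.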

  18. [A fast iterative algorithm for adaptive histogram equalization].

    PubMed

    Cao, X; Liu, X; Deng, Z; Jiang, D; Zheng, C

    1997-01-01

    In this paper, we propose an iterative algorithm called FAHE, which is based on the correlation between the current local histogram and the one computed before the sliding window moves. Compared with basic AHE, the computing time of FAHE is decreased from 5 hours to 4 minutes on a 486DX/33-compatible computer when using a 65 x 65 sliding window on a 512 x 512 image with an 8-bit gray-level range.
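    The speed-up exploits the overlap between successive windows: when the window slides one pixel, only one column of pixels leaves the local histogram and one column enters, instead of all w*w pixels being recounted. A sketch for a single image row (the window size and all names are illustrative):

```python
import numpy as np

def sliding_window_ahe_row(img, row, w):
    """AHE along one image row with an incrementally updated histogram.

    The local histogram is built once for the leftmost window; each
    slide then subtracts the leaving column and adds the entering one,
    which is the overlap-reuse idea behind the reported speed-up.
    """
    h = w // 2
    pad = np.pad(img, h, mode="edge").astype(np.intp)
    hist = np.zeros(256, dtype=np.intp)
    for v in pad[row:row + w, 0:w].ravel():   # initial window at col 0
        hist[v] += 1
    W = img.shape[1]
    out = np.empty(W, dtype=np.uint8)
    area = w * w
    for col in range(W):
        if col > 0:
            for v in pad[row:row + w, col - 1]:      # column leaving
                hist[v] -= 1
            for v in pad[row:row + w, col + w - 1]:  # column entering
                hist[v] += 1
        rank = hist[:img[row, col] + 1].sum()        # local CDF at the pixel
        out[col] = int(np.round(255.0 * rank / area))
    return out
```

    A full implementation snakes the window over all rows so that every move, vertical or horizontal, updates only one row or column of the histogram.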

  19. An adaptive enhancement algorithm for infrared video based on modified k-means clustering

    NASA Astrophysics Data System (ADS)

    Zhang, Linze; Wang, Jingqi; Wu, Wen

    2016-09-01

    In this paper, we propose a video enhancement algorithm to improve the output video of an infrared camera. The video obtained by an infrared camera is sometimes very dark because there is no clear target. In this case, the infrared video is divided into frame images by frame extraction so that image enhancement can be carried out. The first frame image is divided into k sub-images by K-means clustering according to the gray intervals they occupy, and each sub-image is then histogram-equalized according to the amount of information it contains; we also use a method to solve the problem of final cluster centers lying close to each other in some cases. For the subsequent frame images, the initial cluster centers are determined from the final cluster centers of the previous frame, and the histogram equalization of each sub-image is carried out after image segmentation based on K-means clustering. Histogram equalization stretches the gray values of the image over the whole gray-level range, and the gray-level range of each sub-image is determined by its share of the pixels in the frame image. Experimental results show that this algorithm can improve the contrast of infrared video of dim scenes in which no night target is obvious, and can adaptively reduce, within a certain range, the negative effect of overexposed pixels.
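    A simplified single-frame sketch of the clustering-then-equalization idea: pixels are clustered on gray value with 1-D k-means, each cluster receives a slice of [0, 255] proportional to its pixel count, and each cluster is equalized within its slice. Seeding each frame's centers from the previous frame, as the paper does for video, is omitted, and all parameter values are illustrative.

```python
import numpy as np

def kmeans_1d(values, k, iters=20):
    """Plain 1-D k-means on gray values, centers initialized over the range."""
    centers = np.linspace(values.min(), values.max(), k).astype(float)
    labels = np.zeros(len(values), dtype=int)
    for _ in range(iters):
        labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = values[labels == j].mean()
    return labels, centers

def cluster_equalization(img, k=3):
    """Equalize each gray-value cluster inside its own output interval.

    Dark and bright regions are stretched independently, so a few
    overexposed pixels cannot compress the contrast of the rest of
    the frame.
    """
    flat = img.ravel().astype(float)
    labels, centers = kmeans_1d(flat, k)
    out = np.zeros_like(flat)
    start = 0.0
    for j in np.argsort(centers):
        members = labels == j
        if not members.any():
            continue
        span = 255.0 * members.mean()        # share of the output range
        v = flat[members]
        srt = np.sort(v)
        ecdf = np.searchsorted(srt, v, side="right") / len(v)
        out[members] = start + ecdf * span
        start += span
    return np.round(out).reshape(img.shape).astype(np.uint8)
```

    Allocating each sub-image's output interval by its pixel share is the "ratio of pixels" rule described in the abstract.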

  20. Uniform enhancement of optical micro-angiography images using Rayleigh contrast-limited adaptive histogram equalization

    PubMed Central

    Yousefi, Siavash; Qin, Jia; Zhi, Zhongwei

    2013-01-01

    Optical microangiography is an imaging technology that is capable of providing detailed functional blood flow maps within microcirculatory tissue beds in vivo. Some practical issues however exist when displaying and quantifying the microcirculation that perfuses the scanned tissue volume. These issues include: (I) Probing light is subject to specular reflection when it shines onto sample. The unevenness of the tissue surface makes the light energy entering the tissue not uniform over the entire scanned tissue volume. (II) The biological tissue is heterogeneous in nature, meaning the scattering and absorption properties of tissue would attenuate the probe beam. These physical limitations can result in local contrast degradation and non-uniform micro-angiogram images. In this paper, we propose a post-processing method that uses Rayleigh contrast-limited adaptive histogram equalization to increase the contrast and improve the overall appearance and uniformity of optical micro-angiograms without saturating the vessel intensity and changing the physical meaning of the micro-angiograms. The qualitative and quantitative performance of the proposed method is compared with those of common histogram equalization and contrast enhancement methods. We demonstrate that the proposed method outperforms other existing approaches. The proposed method is not limited to optical microangiography and can be used in other image modalities such as photo-acoustic tomography and scanning laser confocal microscopy. PMID:23482880

  1. Preprocessing with image denoising and histogram equalization for endoscopy image analysis using texture analysis.

    PubMed

    Hiroyasu, Tomoyuki; Hayashinuma, Katsutoshi; Ichikawa, Hiroshi; Yagi, Nobuaki

    2015-08-01

    A preprocessing method for endoscopy image analysis using texture analysis is proposed. In a previous study, we proposed a feature value that combines a co-occurrence matrix and a run-length matrix to analyze the extent of early gastric cancer from images taken with narrow-band imaging endoscopy. However, the obtained feature value does not identify lesion zones correctly due to the influence of noise and halation. Therefore, we propose a new preprocessing method with a non-local means filter for de-noising and contrast limited adaptive histogram equalization. We have confirmed that the pattern of gastric mucosa in images can be improved by the proposed method. Furthermore, the lesion zone is shown more correctly by the obtained color map.

  2. Multipurpose contrast enhancement on epiphyseal plates and ossification centers for bone age assessment

    PubMed Central

    2013-01-01

    Background: The high variation of background luminance, low contrast, and excessively enhanced contrast of hand bone radiographs often impede bone age assessment rating systems in evaluating the degree of development of the epiphyseal plates and ossification centers. Global histogram equalization (GHE) has been the most frequently adopted image contrast enhancement technique, but its performance is not satisfying. A brightness- and detail-preserving histogram equalization method with good contrast enhancement has been a goal of much recent research; nevertheless, producing a histogram-equalized radiograph that is well balanced in terms of brightness preservation, detail preservation, and contrast enhancement remains a daunting task. Method: In this paper, we propose a novel histogram equalization framework that takes several desirable properties into account, namely the Multipurpose Beta Optimized Bi-Histogram Equalization (MBOBHE). This method performs histogram optimization separately in both sub-histograms after segmenting the histogram at an optimized separating point determined from a regularization function constituted by three components. The result is then assessed by qualitative and quantitative analyses evaluating the essential aspects of the histogram-equalized image, using a total of 160 hand radiographs acquired from an online hand bone database. Result: From the qualitative analysis, we found that basic bi-histogram equalizations cannot display small features in the image, owing to incorrect selection of the separating point based on only a certain metric without considering contrast enhancement and detail preservation. From the quantitative analysis, we found that MBOBHE correlates well with human visual perception, and this improvement shortens the evaluation time taken by an inspector in assessing bone age. Conclusions: The proposed MBOBHE outperforms other existing methods in the comprehensive performance of histogram equalization. All features pertinent to bone age assessment are more prominent than with other methods, which shortens the evaluation time required in manual bone age assessment using the TW method, while the accuracy remains unaffected or becomes slightly better than with the unprocessed original image. The holistic properties of brightness preservation, detail preservation, and contrast enhancement are taken into consideration simultaneously, and the visual effect is thus conducive to manual inspection. PMID:23565999
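    The bi-histogram mechanism that MBOBHE builds on can be sketched as follows: the image is split at a separating gray level (the mean by default, as in brightness-preserving bi-histogram equalization) and each sub-histogram is equalized within its own sub-range, which limits the mean-brightness shift of plain HE. MBOBHE's contribution, optimizing the separating point with a three-component regularization function, is not reproduced here.

```python
import numpy as np

def bi_histogram_equalization(img, sep=None):
    """Bi-histogram equalization around a separating gray level.

    Pixels at or below `sep` are equalized over [0, sep]; pixels above
    it are equalized over [sep + 1, 255].  Because neither half can
    cross the split, the output mean stays near the input mean.
    """
    if sep is None:
        sep = int(img.mean())
    out = np.empty_like(img)
    for lo, hi, mask in ((0, sep, img <= sep), (sep + 1, 255, img > sep)):
        if not mask.any():
            continue
        hist, _ = np.histogram(img[mask], bins=hi - lo + 1, range=(lo, hi + 1))
        cdf = hist.cumsum() / hist.sum()
        lut = np.round(lo + cdf * (hi - lo)).astype(img.dtype)
        out[mask] = lut[img[mask] - lo]
    return out
```

    Moving `sep` away from the mean is exactly the degree of freedom MBOBHE exploits to balance brightness preservation, detail preservation, and contrast enhancement.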

  3. Bas-relief generation using adaptive histogram equalization.

    PubMed

    Sun, Xianfang; Rosin, Paul L; Martin, Ralph R; Langbein, Frank C

    2009-01-01

    An algorithm is presented to automatically generate bas-reliefs based on adaptive histogram equalization (AHE), starting from an input height field. A mesh model may alternatively be provided, in which case a height field is first created via orthogonal or perspective projection. The height field is regularly gridded and treated as an image, enabling a modified AHE method to be used to generate a bas-relief with a user-chosen height range. We modify the original image-contrast-enhancement AHE method to use gradient weights also to enhance the shape features of the bas-relief. To effectively compress the height field, we limit the height-dependent scaling factors used to compute relative height variations in the output from height variations in the input; this prevents any height differences from having too great effect. Results of AHE over different neighborhood sizes are averaged to preserve information at different scales in the resulting bas-relief. Compared to previous approaches, the proposed algorithm is simple and yet largely preserves original shape features. Experiments show that our results are, in general, comparable to and in some cases better than the best previously published methods.

  4. Brain early infarct detection using gamma correction extreme-level eliminating with weighting distribution.

    PubMed

    Teh, V; Sim, K S; Wong, E K

    2016-11-01

    According to statistics from the World Health Organization (WHO), stroke is one of the major causes of death globally. Computed tomography (CT) is one of the main medical imaging systems used for diagnosis of ischemic stroke. A CT scan provides brain images in Digital Imaging and Communications in Medicine (DICOM) format. The presentation of CT brain images relies mainly on the window setting (window center and window width), which converts an image from DICOM format into an ordinary grayscale format. Nevertheless, ordinary window parameters cannot deliver proper contrast on CT brain images for ischemic stroke detection. In this paper, a new method, namely gamma correction extreme-level eliminating with weighting distribution (GCELEWD), is proposed to improve the contrast of CT brain images. GCELEWD is capable of highlighting the hypodense region for diagnosis of ischemic stroke. The performance of this new technique is compared with that of four existing contrast enhancement techniques: brightness preserving bi-histogram equalization (BBHE), dualistic sub-image histogram equalization (DSIHE), extreme-level eliminating histogram equalization (ELEHE), and adaptive gamma correction with weighting distribution (AGCWD). GCELEWD shows better visualization for ischemic stroke detection and higher values with image quality assessment (IQA) modules. SCANNING 38:842-856, 2016. © 2016 Wiley Periodicals, Inc.
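    Of the techniques compared, AGCWD admits a compact sketch: the gray-level PDF is compressed by a power alpha, and the resulting weighted CDF serves as a per-level gamma, T(l) = 255 * (l/255) ** (1 - cdf_w(l)). This illustrates the baseline being compared against, not the proposed GCELEWD extension; alpha below is an illustrative default.

```python
import numpy as np

def agcwd(img, alpha=0.5):
    """Adaptive gamma correction with a weighting distribution (AGCWD-style).

    The weighting distribution flattens dominant histogram peaks
    before the CDF is taken, so the per-level gamma is less driven by
    large uniform regions of the image.
    """
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    pdf = hist / hist.sum()
    lo, hi = pdf.min(), pdf.max()
    pdf_w = hi * ((pdf - lo) / max(hi - lo, 1e-12)) ** alpha
    if pdf_w.sum() == 0:                      # flat-histogram edge case
        pdf_w = pdf
    cdf_w = pdf_w.cumsum() / pdf_w.sum()
    levels = np.arange(256) / 255.0
    lut = np.round(255.0 * levels ** (1.0 - cdf_w)).astype(np.uint8)
    return lut[img]
```

    Since the exponent 1 - cdf_w(l) never exceeds 1, the mapping can only brighten a gray level or leave it unchanged, which is why AGCWD is well suited to dark images.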

  5. Hue-preserving and saturation-improved color histogram equalization algorithm.

    PubMed

    Song, Ki Sun; Kang, Hee; Kang, Moon Gi

    2016-06-01

In this paper, an algorithm is proposed to improve contrast and saturation without color degradation. The local histogram equalization (HE) method offers better performance than the global HE method, but it sometimes produces undesirable results due to its block-based processing. The proposed contrast-enhancement (CE) algorithm incorporates the characteristics of the global HE method into the local HE method to avoid these artifacts while enhancing both global and local contrast. There are two common ways to apply a CE algorithm to color images: processing the luminance channel only, or processing each color channel independently. However, both can incur excessive or reduced saturation and color degradation. The proposed algorithm solves these problems by using channel-adaptive equalization and the similarity of ratios between the channels. Experimental results show that the proposed algorithm enhances contrast and saturation while preserving hue, and outperforms existing methods in terms of objective evaluation metrics.
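
    The ratio-preserving idea can be sketched as follows: equalize a luminance channel, then scale R, G and B by the same per-pixel ratio, which keeps channel ratios (and hence hue) fixed wherever no clipping occurs. This is a minimal illustration of the principle, not the paper's channel-adaptive method:

```python
import numpy as np

def hue_preserving_he(rgb):
    """Equalize luminance, then scale R, G, B by the same per-pixel ratio
    so channel ratios (hence hue) are preserved. rgb: float array in [0,1]."""
    lum = rgb.mean(axis=2)                       # simple luminance proxy
    q = np.round(lum * 255).astype(int)
    hist = np.bincount(q.ravel(), minlength=256)
    cdf = np.cumsum(hist) / q.size
    lum_eq = cdf[q]                              # equalized luminance in [0,1]
    ratio = lum_eq / np.maximum(lum, 1e-6)
    return np.clip(rgb * ratio[..., None], 0.0, 1.0)
```

    The clipping step is exactly where hue preservation can break down, which is one motivation for the channel-adaptive equalization proposed in the paper.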

  6. Image contrast enhancement using adjacent-blocks-based modification for local histogram equalization

    NASA Astrophysics Data System (ADS)

    Wang, Yang; Pan, Zhibin

    2017-11-01

Infrared images usually have some non-ideal characteristics, such as weak target-to-background contrast and strong noise. Because of these characteristics, it is necessary to apply a contrast enhancement algorithm to improve the visual quality of infrared images. The histogram equalization (HE) algorithm is widely used for contrast enhancement due to its effectiveness and simple implementation, but it cannot equally enhance the local contrast across an image. Local histogram equalization algorithms have proved to be effective techniques for local contrast enhancement. However, over-enhancement of noise and artifacts can easily appear in images enhanced by local histogram equalization. In this paper, a new contrast enhancement technique based on local histogram equalization is proposed to overcome the drawbacks mentioned above. The input images are segmented into three kinds of overlapped sub-blocks according to their gradients. To overcome the over-enhancement effect, the histograms of these sub-blocks are then modified using those of adjacent sub-blocks. We pay particular attention to improving the contrast of detail information while preserving the brightness of flat regions within the sub-blocks. It will be shown that the proposed algorithm outperforms other related algorithms by enhancing the local contrast without introducing over-enhancement effects or additional noise.
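
    The adjacent-block idea can be sketched in a simplified, non-overlapping form: each block's histogram is blended with the mean histogram of its 4-neighbours before equalization, which damps block-local over-enhancement. `block` and `w_self` are illustrative parameters, and the image size is assumed divisible by `block`:

```python
import numpy as np

def adjacent_smoothed_block_he(img, block=8, w_self=0.5):
    """Local HE where each block's histogram is blended with the mean
    histogram of its adjacent blocks before equalization (sketch)."""
    H, W = img.shape
    out = np.empty_like(img)
    nby, nbx = H // block, W // block
    hists = np.zeros((nby, nbx, 256))
    for by in range(nby):
        for bx in range(nbx):
            tile = img[by*block:(by+1)*block, bx*block:(bx+1)*block]
            hists[by, bx] = np.bincount(tile.ravel(), minlength=256)
    for by in range(nby):
        for bx in range(nbx):
            # Blend with the average histogram of the 4-neighbourhood.
            nbrs = [hists[y, x] for y, x in
                    ((by-1, bx), (by+1, bx), (by, bx-1), (by, bx+1))
                    if 0 <= y < nby and 0 <= x < nbx]
            if nbrs:
                h = w_self * hists[by, bx] + (1 - w_self) * np.mean(nbrs, axis=0)
            else:
                h = hists[by, bx]
            cdf = np.cumsum(h) / h.sum()
            lut = np.round(255 * cdf).astype(np.uint8)
            tile = img[by*block:(by+1)*block, bx*block:(bx+1)*block]
            out[by*block:(by+1)*block, bx*block:(bx+1)*block] = lut[tile]
    return out
```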

  7. Information-Adaptive Image Encoding and Restoration

    NASA Technical Reports Server (NTRS)

    Park, Stephen K.; Rahman, Zia-ur

    1998-01-01

The multiscale retinex with color restoration (MSRCR) has shown itself to be a very versatile automatic image enhancement algorithm that simultaneously provides dynamic range compression, color constancy, and color rendition. A number of algorithms exist that provide one or more of these features, but not all. In this paper we compare the performance of the MSRCR with techniques that are widely used for image enhancement. Specifically, we compare the MSRCR with color adjustment methods such as gamma correction and gain/offset application, histogram modification techniques such as histogram equalization and manual histogram adjustment, and other more powerful techniques such as homomorphic filtering and 'burning and dodging'. The comparison is carried out by testing the suite of image enhancement methods on a set of diverse images. We find that though some of these techniques work well for some of these images, only the MSRCR performs universally well on the test set.

  8. Adaptive sigmoid function bihistogram equalization for image contrast enhancement

    NASA Astrophysics Data System (ADS)

    Arriaga-Garcia, Edgar F.; Sanchez-Yanez, Raul E.; Ruiz-Pinales, Jose; Garcia-Hernandez, Ma. de Guadalupe

    2015-09-01

    Contrast enhancement plays a key role in a wide range of applications including consumer electronic applications, such as video surveillance, digital cameras, and televisions. The main goal of contrast enhancement is to increase the quality of images. However, most state-of-the-art methods induce different types of distortion such as intensity shift, wash-out, noise, intensity burn-out, and intensity saturation. In addition, in consumer electronics, simple and fast methods are required in order to be implemented in real time. A bihistogram equalization method based on adaptive sigmoid functions is proposed. It consists of splitting the image histogram into two parts that are equalized independently by using adaptive sigmoid functions. In order to preserve the mean brightness of the input image, the parameter of the sigmoid functions is chosen to minimize the absolute mean brightness metric. Experiments on the Berkeley database have shown that the proposed method improves the quality of images and preserves their mean brightness. An application to improve the colorfulness of images is also presented.
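
    The classical splitting scheme that this method extends, brightness-preserving bi-histogram equalization (BBHE), is easy to sketch: split the histogram at the mean and equalize each half within its own sub-range. The adaptive sigmoid weighting of the proposed method is omitted here:

```python
import numpy as np

def bbhe(img):
    """Brightness-preserving bi-histogram equalization (BBHE): split the
    histogram at the mean intensity and equalize each half into its own
    sub-range, so the overall mean brightness is roughly preserved."""
    m = int(img.mean())
    lut = np.zeros(256, dtype=np.uint8)
    for lo, hi in ((0, m), (m + 1, 255)):
        if hi < lo:
            continue
        mask = (img >= lo) & (img <= hi)
        hist = np.bincount(img[mask].ravel(), minlength=256)[lo:hi+1].astype(float)
        n = hist.sum()
        if n == 0:
            lut[lo:hi+1] = np.arange(lo, hi + 1)  # identity for empty half
            continue
        cdf = np.cumsum(hist) / n
        lut[lo:hi+1] = np.round(lo + cdf * (hi - lo)).astype(np.uint8)
    return lut[img]
```

    Because each half maps only into its own sub-range, dark pixels stay below the mean and bright pixels above it, which is the brightness-preserving property the sigmoid variant further refines.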

  9. Image Enhancement via Subimage Histogram Equalization Based on Mean and Variance

    PubMed Central

    2017-01-01

This paper puts forward a novel image enhancement method via Mean and Variance based Subimage Histogram Equalization (MVSIHE), which effectively increases the contrast of the input image while preserving brightness and details better than some other methods based on histogram equalization (HE). Firstly, the histogram of the input image is divided into four segments based on the mean and variance of the luminance component, and the histogram bins of each segment are modified and equalized, respectively. Secondly, the result is obtained via the concatenation of the processed subhistograms. Lastly, normalization is applied to the intensity levels, and the processed image is combined with the input image. One hundred benchmark images from a public image database named CVG-UGR-Database are used for comparison with other state-of-the-art methods. The experimental results show that the algorithm can not only enhance image information effectively but also preserve the brightness and details of the original image well. PMID:29403529

  10. DSP+FPGA-based real-time histogram equalization system of infrared image

    NASA Astrophysics Data System (ADS)

    Gu, Dongsheng; Yang, Nansheng; Pi, Defu; Hua, Min; Shen, Xiaoyan; Zhang, Ruolan

    2001-10-01

Histogram modification is a simple but effective way to enhance an infrared image. Several methods exist for equalizing an infrared image's histogram, suited to the differing characteristics of different infrared images: the traditional HE (histogram equalization) method, and the improved HP (histogram projection) and PE (plateau equalization) methods, among others. To realize all of these methods in a single system, the system must have a large amount of memory and very high processing speed. In our system, we introduce a DSP+FPGA-based real-time processing architecture to accomplish them together: the FPGA implements the part common to these methods, while the DSP handles the parts that differ. The choice of method and its parameters can be entered via a keyboard or a computer. By this means, the system is powerful yet easy to operate and maintain. In this article, we present the block diagram of the system and the software flow chart of the methods. Finally, we show an infrared image and its histogram before and after processing with the HE method.
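
    Of the variants listed, plateau equalization (PE) is the simplest to sketch: clip every histogram bin at a plateau value before forming the CDF, so a huge background peak cannot monopolize the output range. `plateau` is the user parameter the keyboard input above would select:

```python
import numpy as np

def plateau_equalize(img, plateau):
    """Plateau equalization (PE) for IR images: cap each histogram bin at
    `plateau` before building the CDF, limiting background over-stretch."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    clipped = np.minimum(hist, plateau)          # the plateau step
    cdf = np.cumsum(clipped) / clipped.sum()
    lut = np.round(255 * cdf).astype(np.uint8)
    return lut[img]
```

    With a very large plateau the clipping never triggers and PE reduces to plain HE, which is why the two share most of their datapath in a combined FPGA implementation.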

  11. Particle swarm optimization-based local entropy weighted histogram equalization for infrared image enhancement

    NASA Astrophysics Data System (ADS)

    Wan, Minjie; Gu, Guohua; Qian, Weixian; Ren, Kan; Chen, Qian; Maldague, Xavier

    2018-06-01

Infrared image enhancement plays a significant role in intelligent urban surveillance systems for smart city applications. Unlike existing methods that only stretch the global contrast, we propose a particle swarm optimization-based local entropy weighted histogram equalization which enhances both local details and foreground-background contrast. First, a novel local entropy weighted histogram depicting the distribution of detail information is calculated based on a modified hyperbolic tangent function. Then, the histogram is divided into two parts via a threshold maximizing the inter-class variance, in order to improve the contrast of the foreground and background, respectively. To avoid over-enhancement and noise amplification, double plateau thresholds for the presented histogram are determined by means of the particle swarm optimization algorithm. Lastly, each sub-image is equalized independently according to the constrained sub-local-entropy-weighted histogram. Comparative experiments on real infrared images prove that our algorithm outperforms other state-of-the-art methods in terms of both visual and quantitative evaluations.
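
    The threshold "maximizing the inter-class variance" used for the foreground/background split is Otsu's method, which can be computed directly from the histogram:

```python
import numpy as np

def otsu_threshold(img):
    """Return the threshold maximizing inter-class variance (Otsu), as used
    here to split the histogram into foreground and background parts."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    levels = np.arange(256)
    omega = np.cumsum(p)                 # class-0 probability up to t
    mu = np.cumsum(p * levels)           # first moment up to t
    mu_t = mu[-1]                        # global mean
    # Inter-class variance for every candidate threshold t.
    denom = omega * (1.0 - omega)
    sigma_b2 = np.where(denom > 0,
                        (mu_t * omega - mu) ** 2 / np.maximum(denom, 1e-12),
                        0.0)
    return int(np.argmax(sigma_b2))
```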

  12. Detection of simulated microcalcifications in fixed mammary tissue: An ROC study of the effect of local versus global histogram equalization.

    PubMed

    Sund, T; Olsen, J B

    2006-09-01

To investigate whether sliding window adaptive histogram equalization (SWAHE) of digital mammograms improves the detection of simulated calcifications, as compared to images normalized by global histogram equalization (GHE). Direct digital mammograms were obtained from mammary tissue phantoms superimposed with different frames. Each frame was divided into forty squares by a wire mesh, and contained granular calcifications randomly positioned in about 50% of the squares. Three radiologists read the mammograms on a display monitor. They classified their confidence in the presence of microcalcifications in each square on a scale of 1 to 5. Images processed with GHE were first read and used as a reference. In a later session, the same images processed with SWAHE were read. The results were compared using ROC methodology. When the total areas Az were compared, the results were completely equivocal. When comparing the high-specificity partial ROC area Az,0.2 below a false-positive fraction (FPF) of 0.20, two of the three observers performed best with the images processed with SWAHE. The difference was not statistically significant. When the reader's confidence threshold in malignancy is set at a high level, increasing the contrast of mammograms with SWAHE may enhance the visibility of microcalcifications without adversely affecting the false-positive rate. When the reader's confidence threshold is set at a low level, the effect of SWAHE is an increase of false positives. Further investigation is needed to confirm the validity of the conclusions.
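
    SWAHE itself reduces, per pixel, to replacing the value by its rank within the local window, i.e. the local CDF evaluated at the pixel. A naive O(N·w²) sketch; practical implementations update the window histogram incrementally as the window slides:

```python
import numpy as np

def swahe(img, radius=4):
    """Sliding-window adaptive HE: map each pixel to the rank of its value
    within its local window (the per-pixel local CDF)."""
    H, W = img.shape
    out = np.empty_like(img)
    pad = np.pad(img, radius, mode='reflect')
    for y in range(H):
        for x in range(W):
            win = pad[y:y + 2 * radius + 1, x:x + 2 * radius + 1]
            rank = (win <= img[y, x]).mean()   # local CDF at this pixel
            out[y, x] = int(round(255 * rank))
    return out
```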

  13. Information granules in image histogram analysis.

    PubMed

    Wieclawek, Wojciech

    2018-04-01

A concept of granular computing employed in intensity-based image enhancement is discussed. First, a weighted granular computing idea is introduced. Then, the implementation of this concept in the image processing area is presented. Finally, multidimensional granular histogram analysis is introduced. The proposed approach is dedicated to digital images, especially medical images acquired by computed tomography (CT). Like the histogram equalization approach, this method is based on image histogram analysis. Yet, unlike the histogram equalization technique, it works on a selected range of pixel intensities and is controlled by two parameters. Performance is tested on anonymized clinical CT series. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. Histogram-based adaptive gray level scaling for texture feature classification of colorectal polyps

    NASA Astrophysics Data System (ADS)

    Pomeroy, Marc; Lu, Hongbing; Pickhardt, Perry J.; Liang, Zhengrong

    2018-02-01

Texture features have played an ever-increasing role in computer-aided detection (CADe) and diagnosis (CADx) methods since their inception. Texture features are often used for false-positive reduction in CADe packages, especially for detecting colorectal polyps and distinguishing them from falsely tagged residual stool and healthy colon wall folds. While texture features have shown great success there, their performance for CADx has lagged behind, primarily because features are more similar among different polyp types. In this paper, we present an adaptive gray level scaling and compare it to the conventional equal spacing of gray level bins. We use a dataset taken from computed tomography colonography patients, with 392 polyp regions of interest (ROIs) identified and a diagnosis confirmed through pathology. Using the histogram information from the entire ROI dataset, we generate the gray level bins such that each bin contains roughly the same number of voxels. Each image ROI is then scaled down to two different numbers of gray levels, using both an equal spacing of Hounsfield units for each bin and our adaptive method. We compute a set of texture features from the scaled images, including 30 gray level co-occurrence matrix (GLCM) features and 11 gray level run length matrix (GLRLM) features. Using a random forest classifier to distinguish between hyperplastic polyps and all others (adenomas and adenocarcinomas), we find that the adaptive gray level scaling can improve performance, based on the area under the receiver operating characteristic curve, by up to 4.6%.
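
    The adaptive binning step, choosing bin edges so each gray level holds roughly the same number of voxels, is quantile (equal-frequency) binning, sketched below:

```python
import numpy as np

def adaptive_gray_bins(values, n_levels):
    """Equal-frequency binning: choose bin edges from quantiles so each gray
    level holds roughly the same number of voxels, then quantize."""
    qs = np.linspace(0, 1, n_levels + 1)[1:-1]   # interior quantile points
    edges = np.quantile(values, qs)
    return np.digitize(values, edges)            # labels in 0..n_levels-1
```

    By contrast, the conventional scheme simply divides the Hounsfield-unit range into equal-width intervals, regardless of how the voxel values are distributed.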

  15. Color Histogram Diffusion for Image Enhancement

    NASA Technical Reports Server (NTRS)

    Kim, Taemin

    2011-01-01

Various color histogram equalization (CHE) methods have been proposed to extend grayscale histogram equalization (GHE) to color images. In this paper a new method called histogram diffusion, which extends the GHE method to arbitrary dimensions, is proposed. Ranges in a histogram are specified as overlapping bars of uniform height and variable width proportional to their frequencies. This diagram is called the vistogram. As an alternative approach to GHE, the squared error of the vistogram from the uniform distribution is minimized. Each bar in the vistogram is approximated by a Gaussian function. Gaussian particles in the vistogram diffuse as a nonlinear autonomous system of ordinary differential equations. CHE results on color images showed that the approach is effective.

  16. Histogram equalization with Bayesian estimation for noise robust speech recognition.

    PubMed

    Suh, Youngjoo; Kim, Hoirin

    2018-02-01

The histogram equalization approach is an efficient feature normalization technique for noise-robust automatic speech recognition. However, it suffers from performance degradation when some fundamental conditions are not satisfied in the test environment. To remedy these limitations of the original histogram equalization methods, a class-based histogram equalization approach has been proposed. Although this approach showed substantial performance improvement under noise environments, it still suffers from performance degradation due to overfitting when test data are insufficient. To address this issue, the proposed histogram equalization technique employs Bayesian estimation in the test cumulative distribution function estimation. A previous study conducted on the Aurora-4 task reported that the proposed approach provided substantial performance gains in speech recognition systems based on acoustic modeling with Gaussian mixture model-hidden Markov models. In this work, the proposed approach was examined in speech recognition systems with deep neural network-hidden Markov models (DNN-HMM), the current mainstream speech recognition approach, where it also showed meaningful performance improvement over the conventional maximum likelihood estimation-based method. The fusion of the proposed features with the mel-frequency cepstral coefficients provided additional performance gains in DNN-HMM systems, which otherwise suffer from performance degradation in the clean test condition.
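
    The underlying normalization maps each test feature x to F_ref⁻¹(F_test(x)), so the test feature distribution is reshaped to match a reference (training) distribution. A minimal empirical sketch; the paper's Bayesian estimation of the test CDF is omitted:

```python
import numpy as np

def he_normalize(test_feats, ref_feats):
    """Feature-space histogram equalization: map each test value x to
    F_ref^{-1}(F_test(x)) using empirical CDFs, so the normalized test
    distribution matches the reference distribution."""
    order = np.argsort(test_feats)
    n = test_feats.size
    # Empirical CDF positions of the sorted test samples.
    u = (np.arange(1, n + 1) - 0.5) / n
    # Inverse reference CDF via reference sample quantiles.
    mapped = np.quantile(ref_feats, u)
    out = np.empty_like(mapped)
    out[order] = mapped                   # undo the sort
    return out
```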

  17. Efficient visibility-driven medical image visualisation via adaptive binned visibility histogram.

    PubMed

    Jung, Younhyun; Kim, Jinman; Kumar, Ashnil; Feng, David Dagan; Fulham, Michael

    2016-07-01

'Visibility' is a fundamental optical property that represents the proportion of the voxels in a volume that is observable to users during interactive volume rendering. The manipulation of this 'visibility' improves the volume rendering process, for instance by ensuring the visibility of regions of interest (ROIs) or by guiding the identification of an optimal rendering viewpoint. The construction of visibility histograms (VHs), which represent the distribution of the visibility of all voxels in the rendered volume, enables users to explore the volume with real-time feedback about occlusion patterns among spatially related structures during volume rendering manipulations. Volume-rendered medical images have been a primary beneficiary of the VH, given the need to ensure that specific ROIs are visible relative to the surrounding structures, e.g. the visualisation of tumours that may otherwise be occluded by neighbouring structures. VH construction and its subsequent manipulations, however, are computationally expensive due to the histogram binning of the visibilities. This limits the real-time application of the VH to medical images that have large intensity ranges and volume dimensions and thus require a large number of histogram bins. In this study, we introduce an efficient adaptive binned visibility histogram (AB-VH) in which a smaller number of histogram bins are used to represent the visibility distribution of the full VH. We adaptively bin medical images by using a cluster analysis algorithm that groups the voxels according to their intensity similarities into a smaller subset of bins while preserving the distribution of the intensity range of the original images. We increase efficiency by exploiting the parallel computation and multiple render targets (MRT) extension of modern graphical processing units (GPUs), which enables efficient computation of the histogram. 
We show the application of our method to single-modality computed tomography (CT), magnetic resonance (MR) imaging and multi-modality positron emission tomography-CT (PET-CT). In our experiments, the AB-VH markedly improved the computational efficiency of VH construction and thus of the subsequent VH-driven volume manipulations. This efficiency was achieved without major visual or numerical differences between the AB-VH and its full-bin counterpart. We applied several variants of the K-means clustering algorithm with varying K (the number of clusters) and found that higher values of K resulted in better accuracy at a smaller computational gain. The AB-VH also outperformed the conventional method of down-sampling the histogram bins (equal binning) for volume rendering visualisation. Copyright © 2016 Elsevier Ltd. All rights reserved.
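
    The adaptive binning step can be sketched as a 1-D k-means over voxel intensities, with each resulting cluster becoming one histogram bin. This is a plain Lloyd iteration; the paper's GPU/MRT machinery is omitted:

```python
import numpy as np

def kmeans_bins(intensities, k, iters=20, seed=0):
    """1-D k-means over voxel intensities: cluster centres define the
    adaptive histogram bins (sketch of the AB-VH binning step)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(intensities, dtype=float).ravel()
    centres = rng.choice(np.unique(x), size=k, replace=False)
    for _ in range(iters):
        # Assign each voxel to the nearest centre, then update centres.
        labels = np.argmin(np.abs(x[:, None] - centres[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centres[j] = x[labels == j].mean()
    centres = np.sort(centres)
    labels = np.argmin(np.abs(x[:, None] - centres[None, :]), axis=1)
    return centres, labels
```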

  18. An Approach to Improve the Quality of Infrared Images of Vein-Patterns

    PubMed Central

    Lin, Chih-Lung

    2011-01-01

This study develops an approach to improve the quality of infrared (IR) images of vein-patterns, which usually have noise, low contrast, low brightness and small objects of interest, and thus require preprocessing to improve their quality. The main characteristics of the proposed approach are that no prior knowledge about the IR image is necessary and no parameters must be preset. Two main goals are sought: impulse noise reduction and adaptive contrast enhancement. In our study, a fast median-based filter (FMBF) is developed as the noise reduction method. It is based on an IR imaging mechanism to detect the noisy pixels and on a modified median-based filter to remove them from IR images. FMBF has the advantage of a low computation load. In addition, FMBF can retain reasonably good edges and texture information when the size of the filter window increases. The most important advantage is that the peak signal-to-noise ratio (PSNR) achieved by FMBF is higher than that achieved by the median filter. A hybrid cumulative histogram equalization (HCHE) is proposed for adaptive contrast enhancement. HCHE can automatically generate a hybrid cumulative histogram (HCH) based on two different pieces of information about the image histogram. HCHE can improve the enhancement effect on hot objects rather than the background. The experimental results demonstrate that the proposed approach is feasible as an effective and adaptive process for enhancing the quality of IR vein-pattern images. PMID:22247674

  20. A Framework for Reproducible Latent Fingerprint Enhancements.

    PubMed

    Carasso, Alfred S

    2014-01-01

Photoshop processing of latent fingerprints is the preferred methodology among law enforcement forensic experts, but that approach is not fully reproducible and may lead to questionable enhancements. Alternative, independent, fully reproducible enhancements, using IDL Histogram Equalization and IDL Adaptive Histogram Equalization, can produce better-defined ridge structures, along with considerable background information. Applying a systematic slow-motion smoothing procedure to such IDL enhancements, based on the rapid FFT solution of a Lévy stable fractional diffusion equation, can attenuate background detail while preserving ridge information. The resulting smoothed latent print enhancements are comparable to, but distinct from, forensic Photoshop images suitable for input into automated fingerprint identification systems (AFIS). In addition, this progressive smoothing procedure can be reexamined by displaying the suite of progressively smoother IDL images. That suite can be stored, providing an audit trail that allows monitoring for possible loss of useful information in transit to the user-selected optimal image. Such independent and fully reproducible enhancements provide a valuable frame of reference that may be helpful in informing, complementing, and possibly validating the forensic Photoshop methodology.

  2. Sliding window adaptive histogram equalization of intraoral radiographs: effect on image quality.

    PubMed

    Sund, T; Møystad, A

    2006-05-01

    To investigate whether contrast enhancement by non-interactive, sliding window adaptive histogram equalization (SWAHE) can enhance the image quality of intraoral radiographs in the dental clinic. Three dentists read 22 periapical and 12 bitewing storage phosphor (SP) radiographs. For the periapical readings they graded the quality of the examination with regard to visually locating the root apex. For the bitewing readings they registered all occurrences of approximal caries on a confidence scale. Each reading was first done on an unprocessed radiograph ("single-view"), and then re-done with the image processed with SWAHE displayed beside the unprocessed version ("twin-view"). The processing parameters for SWAHE were the same for all the images. For the periapical examinations, twin-view was judged to raise the image quality for 52% of those cases where the single-view quality was below the maximum. For the bitewing radiographs, there was a change of caries classification (both positive and negative) with twin-view in 19% of the cases, but with only a 3% net increase in the total number of caries registrations. For both examinations interobserver variance was unaffected. Non-interactive SWAHE applied to dental SP radiographs produces a supplemental contrast enhanced image which in twin-view reading improves the image quality of periapical examinations. SWAHE also affects caries diagnosis of bitewing images, and further study using a gold standard is warranted.

  3. A cost-effective line-based light-balancing technique using adaptive processing.

    PubMed

    Hsia, Shih-Chang; Chen, Ming-Huei; Chen, Yu-Min

    2006-09-01

Camera imaging systems are widely used; however, the displayed image may have an unequal light distribution. This paper presents novel light-balancing techniques to compensate for uneven illumination based on adaptive signal processing. For text images, we first estimate the background level and then process each pixel with a nonuniform gain. This algorithm can balance the light distribution while keeping high contrast in the image. For graphic images, adaptive section control using a piecewise nonlinear gain is proposed to equalize the histogram. Simulations show that the light-balancing performance is better than that of other methods. Moreover, we employ line-based processing to efficiently reduce the memory requirement and computational cost, making the method applicable in real-time systems.

  4. Design of interpolation functions for subpixel-accuracy stereo-vision systems.

    PubMed

    Haller, Istvan; Nedevschi, Sergiu

    2012-02-01

Traditionally, subpixel interpolation in stereo-vision systems was designed for the block-matching algorithm. During the evaluation of different interpolation strategies, a strong correlation was observed between the type of stereo algorithm and the subpixel accuracy of the different solutions. Subpixel interpolation should be adapted to each stereo algorithm to achieve maximum accuracy. In consequence, it is more important to propose methodologies for interpolation function generation than specific function shapes. We propose two such methodologies based on data generated by the stereo algorithms. The first uses a histogram to model the environment and applies histogram equalization to an existing solution, adapting it to the data. The second employs synthetic images of a known environment and applies function fitting to the resulting data. The resulting function matches the algorithm and the data as well as possible. An extensive evaluation set is used to validate the findings. Both real and synthetic test cases were employed in different scenarios. The test results are consistent and show significant improvements compared with traditional solutions. © 2011 IEEE

  5. A method for normalizing pathology images to improve feature extraction for quantitative pathology.

    PubMed

    Tam, Allison; Barker, Jocelyn; Rubin, Daniel

    2016-01-01

With the advent of digital slide scanning technologies and the potential proliferation of large repositories of digital pathology images, many research studies can leverage these data for biomedical discovery and to develop clinical applications. However, quantitative analysis of digital pathology images is impeded by batch effects generated by varied staining protocols and staining conditions of pathological slides. To overcome this problem, this paper proposes a novel, fully automated stain normalization method to reduce batch effects and thus aid research in digital pathology applications. The method, intensity centering and histogram equalization (ICHE), normalizes a diverse set of pathology images by first scaling the centroids of the intensity histograms to a common point and then applying a modified version of contrast-limited adaptive histogram equalization. Normalization was performed on two datasets of digitized hematoxylin and eosin (H&E) slides of different tissue slices from the same lung tumor, and one immunohistochemistry dataset of digitized slides created by restaining one of the H&E datasets. The ICHE method was evaluated based on image intensity values, quantitative features, and the effect on downstream applications such as computer-aided diagnosis. For comparison, three methods from the literature were reimplemented and evaluated using the same criteria. The authors found that ICHE not only improved performance compared with un-normalized images, but in most cases showed improvement compared with previous methods for correcting batch effects in the literature. ICHE may be a useful preprocessing step in a digital pathology image processing pipeline.
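
    The intensity-centering half of ICHE, moving each image's histogram centroid to a common point, can be sketched as a simple shift. The paper pairs this with a modified CLAHE pass, omitted here; `target_mean` is an assumed parameter:

```python
import numpy as np

def center_intensity(img, target_mean=128.0):
    """Intensity centering: shift the image so the centroid of its intensity
    histogram lands on a common target value, reducing batch effects."""
    img = np.asarray(img, dtype=float)
    shifted = img + (target_mean - img.mean())
    return np.clip(shifted, 0, 255)
```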

  6. Contrast limited adaptive histogram equalization image processing to improve the detection of simulated spiculations in dense mammograms.

    PubMed

    Pisano, E D; Zong, S; Hemminger, B M; DeLuca, M; Johnston, R E; Muller, K; Braeuning, M P; Pizer, S M

    1998-11-01

    The purpose of this project was to determine whether Contrast Limited Adaptive Histogram Equalization (CLAHE) improves detection of simulated spiculations in dense mammograms. Lines simulating the appearance of spiculations, a common marker of malignancy when visualized with masses, were embedded in dense mammograms digitized at 50 micron pixels, 12 bits deep. Film images with no CLAHE applied were compared to film images with nine different combinations of clip levels and region sizes applied. A simulated spiculation was embedded in a background of dense breast tissue, with the orientation of the spiculation varied. The key variables involved in each trial included the orientation of the spiculation, contrast level of the spiculation and the CLAHE settings applied to the image. Combining the 10 CLAHE conditions, 4 contrast levels and 4 orientations gave 160 combinations. The trials were constructed by pairing 160 combinations of key variables with 40 backgrounds. Twenty student observers were asked to detect the orientation of the spiculation in the image. There was a statistically significant improvement in detection performance for spiculations with CLAHE over unenhanced images when the region size was set at 32 with a clip level of 2, and when the region size was set at 32 with a clip level of 4. The selected CLAHE settings should be tested in the clinic with digital mammograms to determine whether detection of spiculations associated with masses detected at mammography can be improved.
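
    The clip level studied here works by capping each tile histogram bin and redistributing the excess before equalization, which bounds the slope of the local mapping. A one-pass sketch of that clipping step; real implementations iterate, since redistribution can push bins back over the cap:

```python
import numpy as np

def clip_histogram(hist, clip_limit):
    """CLAHE's clipping step: cap each bin at clip_limit and redistribute
    the excess uniformly, limiting the contrast of the local mapping."""
    hist = hist.astype(float)
    excess = np.maximum(hist - clip_limit, 0).sum()
    hist = np.minimum(hist, clip_limit)
    hist += excess / hist.size          # uniform redistribution of the excess
    return hist
```

    A clip level of 2 or 4, as in the study, means no bin may exceed 2x or 4x the average bin count of its region before the CDF is computed.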

  7. A comparative study on preprocessing techniques in diabetic retinopathy retinal images: illumination correction and contrast enhancement.

    PubMed

    Rasta, Seyed Hossein; Partovi, Mahsa Eisazadeh; Seyedarabi, Hadi; Javadzadeh, Alireza

    2015-01-01

    To investigate the effect of preprocessing techniques, including contrast enhancement and illumination correction, on retinal image quality, a comparative study was carried out. We studied and implemented several illumination correction and contrast enhancement techniques on color retinal images to find the best technique for optimum image enhancement. To compare and choose the best illumination correction technique, we analyzed the corrected red and green components of color retinal images statistically and visually. The two contrast enhancement techniques were analyzed using a vessel segmentation algorithm by calculating the sensitivity and specificity. The statistical evaluation of the illumination correction techniques was carried out by calculating the coefficients of variation. The dividing method using the median filter to estimate background illumination showed the lowest coefficient of variation in the red component. The quotient and homomorphic filtering methods, after the dividing method, presented good results based on their low coefficients of variation. Contrast-limited adaptive histogram equalization increased the sensitivity of the vessel segmentation algorithm by up to 5% at the same level of accuracy. The contrast-limited adaptive histogram equalization technique has a higher sensitivity than the polynomial transformation operator as a contrast enhancement technique for vessel segmentation. Three techniques, the dividing method using the median filter to estimate background, the quotient-based method, and homomorphic filtering, were found to be effective illumination correction techniques based on the statistical evaluation. Applying a local contrast enhancement technique such as CLAHE to fundus images showed good potential for enhancing vasculature segmentation.

  8. A novel parallel architecture for local histogram equalization

    NASA Astrophysics Data System (ADS)

    Ohannessian, Mesrob I.; Choueiter, Ghinwa F.; Diab, Hassan

    2005-07-01

    Local histogram equalization is an image enhancement algorithm that has found wide application in the pre-processing stage of areas such as computer vision, pattern recognition and medical imaging. Its computationally intensive nature, however, is a major limitation for real-time interactive applications. This work explores the possibility of performing parallel local histogram equalization, using an array of special-purpose elementary processors, through an HDL implementation that targets FPGA or ASIC platforms. A novel parallelization scheme is presented and the corresponding architecture is derived. The algorithm is reduced to pixel-level operations. Processing elements are assigned image blocks to maintain a reasonable performance-cost ratio. To further simplify both the processor and memory organizations, a bit-serial access scheme is used. A brief performance assessment is provided to illustrate and quantify the merit of the approach.
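
    In software, the per-block assignment can be mimicked directly: each block owns an independent equalization mapping, so the loop body below is exactly the unit of work one processing element would execute. This is a sequential NumPy analogue, not the paper's HDL design.

```python
import numpy as np

def blockwise_equalize(img, block=16):
    """Local HE with one independent equalization per image block.
    Blocks share no state, so each iteration could be dispatched to
    its own processing element in parallel."""
    out = np.empty_like(img)
    h, w = img.shape
    for r in range(0, h, block):
        for c in range(0, w, block):
            tile = img[r:r + block, c:c + block]
            hist = np.bincount(tile.ravel(), minlength=256)
            lut = np.round(hist.cumsum() / tile.size * 255).astype(np.uint8)
            out[r:r + block, c:c + block] = lut[tile]
    return out

gradient = np.tile(np.arange(64, dtype=np.uint8), (64, 1))   # 0..63 intensity ramp
local = blockwise_equalize(gradient, block=16)
```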

  9. Reducing Error Rates for Iris Image using higher Contrast in Normalization process

    NASA Astrophysics Data System (ADS)

    Aminu Ghali, Abdulrahman; Jamel, Sapiee; Abubakar Pindar, Zahraddeen; Hasssan Disina, Abdulkadir; Mat Daris, Mustafa

    2017-08-01

    The iris recognition system is among the most secure and fastest means of identification and authentication. However, iris recognition suffers from blurring, low contrast, and illumination artifacts due to low-quality images, which compromise the accuracy of the system. The acceptance or rejection rate of a verified user depends solely on the quality of the image. In many cases, an iris recognition system with low image contrast could falsely accept or reject a user. Therefore, this paper adopts the histogram equalization technique to address the problems of False Rejection Rate (FRR) and False Acceptance Rate (FAR) by enhancing the contrast of the iris image. The histogram equalization technique enhances the image quality and neutralizes the low contrast of the image at the normalization stage. The experimental results show that the histogram equalization technique reduces the FRR and FAR compared to existing techniques.

  10. Contrast Enhancement Algorithm Based on Gap Adjustment for Histogram Equalization

    PubMed Central

    Chiu, Chung-Cheng; Ting, Chih-Chung

    2016-01-01

    Image enhancement methods have been widely used to improve the visual effects of images. Owing to its simplicity and effectiveness, histogram equalization (HE) is one of the methods used for enhancing image contrast. However, HE may result in over-enhancement and feature loss problems that lead to an unnatural look and loss of details in the processed images. Researchers have proposed various HE-based methods to solve the over-enhancement problem; however, they have largely ignored the feature loss problem. Therefore, a contrast enhancement algorithm based on gap adjustment for histogram equalization (CegaHE) is proposed. It builds on the visual contrast enhancement algorithm based on histogram equalization (VCEA), which generates visually pleasing enhanced images, and improves on VCEA's enhancement effects. CegaHE adjusts the gaps between two gray values based on an adjustment equation, which takes the properties of human visual perception into consideration, to solve the over-enhancement problem. In addition, it alleviates the feature loss problem and further enhances the textures in the dark regions of the images to improve the quality of the processed images for human visual perception. Experimental results demonstrate that CegaHE is a reliable method for contrast enhancement and that it significantly outperforms VCEA and other methods. PMID:27338412
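
    The core gap idea, though not the published CegaHE adjustment equation, can be hedged into a few lines: compute the HE mapping, then cap the gap between the outputs of adjacent input gray levels so sparse histograms cannot be over-stretched.

```python
import numpy as np

def gap_capped_equalize(img, max_gap=8):
    """HE whose mapping is post-processed so no two adjacent input
    gray levels end up more than max_gap output levels apart, curbing
    the over-enhancement of sparse histograms (a simplified uniform
    cap; CegaHE weights gaps by perceptual properties instead)."""
    hist = np.bincount(img.ravel(), minlength=256)
    lut = np.round(hist.cumsum() / img.size * 255).astype(np.int64)
    gaps = np.minimum(np.diff(lut), max_gap)          # cap each inter-level jump
    capped = np.concatenate(([lut[0]], lut[0] + np.cumsum(gaps)))
    return np.clip(capped, 0, 255).astype(np.uint8)[img]

bilevel = np.zeros((64, 64), dtype=np.uint8)
bilevel[:, 32:] = 255                   # plain HE would leave a 127-level jump here
gentle = gap_capped_equalize(bilevel, max_gap=8)
```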

  11. A method for normalizing pathology images to improve feature extraction for quantitative pathology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tam, Allison; Barker, Jocelyn; Rubin, Daniel

    Purpose: With the advent of digital slide scanning technologies and the potential proliferation of large repositories of digital pathology images, many research studies can leverage these data for biomedical discovery and to develop clinical applications. However, quantitative analysis of digital pathology images is impeded by batch effects generated by varied staining protocols and staining conditions of pathological slides. Methods: To overcome this problem, this paper proposes a novel, fully automated stain normalization method to reduce batch effects and thus aid research in digital pathology applications. Their method, intensity centering and histogram equalization (ICHE), normalizes a diverse set of pathology imagesmore » by first scaling the centroids of the intensity histograms to a common point and then applying a modified version of contrast-limited adaptive histogram equalization. Normalization was performed on two datasets of digitized hematoxylin and eosin (H&E) slides of different tissue slices from the same lung tumor, and one immunohistochemistry dataset of digitized slides created by restaining one of the H&E datasets. Results: The ICHE method was evaluated based on image intensity values, quantitative features, and the effect on downstream applications, such as a computer aided diagnosis. For comparison, three methods from the literature were reimplemented and evaluated using the same criteria. The authors found that ICHE not only improved performance compared with un-normalized images, but in most cases showed improvement compared with previous methods for correcting batch effects in the literature. Conclusions: ICHE may be a useful preprocessing step a digital pathology image processing pipeline.« less

  12. Multi-stream LSTM-HMM decoding and histogram equalization for noise robust keyword spotting.

    PubMed

    Wöllmer, Martin; Marchi, Erik; Squartini, Stefano; Schuller, Björn

    2011-09-01

    Highly spontaneous, conversational, and potentially emotional and noisy speech is known to be a challenge for today's automatic speech recognition (ASR) systems, which highlights the need for advanced algorithms that improve speech features and models. Histogram equalization is an efficient method to reduce the mismatch between clean and noisy conditions by normalizing all moments of the probability distribution of the feature vector components. In this article, we propose to combine histogram equalization and multi-condition training for robust keyword detection in noisy speech. To better cope with conversational speaking styles, we show how contextual information can be effectively exploited in a multi-stream ASR framework that dynamically models context-sensitive phoneme estimates generated by a long short-term memory neural network. The proposed techniques are evaluated on the SEMAINE database, a corpus containing emotionally colored conversations with a cognitive system for "Sensitive Artificial Listening".
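
    For a single feature stream, the normalization described above can be sketched as a rank-based mapping onto a reference distribution. The Gaussian reference below is a common choice in HEQ for ASR, assumed here for illustration rather than taken from the paper.

```python
import numpy as np

def equalize_to_gaussian(x, seed=0):
    """Map a 1-D feature sequence onto a standard-normal reference by
    rank: the i-th smallest feature value becomes the i-th smallest
    draw of the reference sample. This matches all moments of the
    distribution, not just mean and variance as plain CMVN would."""
    ref = np.sort(np.random.default_rng(seed).standard_normal(x.size))
    out = np.empty_like(ref)
    out[np.argsort(x)] = ref
    return out

rng = np.random.default_rng(1)
skewed = np.exp(rng.standard_normal(1000))   # heavily right-skewed "noisy" feature
heq = equalize_to_gaussian(skewed)
```

    Because the mapping is rank-preserving, it is monotone in the original feature values, so the relative ordering that carries phonetic information survives the normalization.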

  13. A Comparative Study on Preprocessing Techniques in Diabetic Retinopathy Retinal Images: Illumination Correction and Contrast Enhancement

    PubMed Central

    Rasta, Seyed Hossein; Partovi, Mahsa Eisazadeh; Seyedarabi, Hadi; Javadzadeh, Alireza

    2015-01-01

    To investigate the effect of preprocessing techniques, including contrast enhancement and illumination correction, on retinal image quality, a comparative study was carried out. We studied and implemented several illumination correction and contrast enhancement techniques on color retinal images to find the best technique for optimum image enhancement. To compare and choose the best illumination correction technique, we analyzed the corrected red and green components of color retinal images statistically and visually. The two contrast enhancement techniques were analyzed using a vessel segmentation algorithm by calculating the sensitivity and specificity. The statistical evaluation of the illumination correction techniques was carried out by calculating the coefficients of variation. The dividing method using the median filter to estimate background illumination showed the lowest coefficient of variation in the red component. The quotient and homomorphic filtering methods, after the dividing method, presented good results based on their low coefficients of variation. Contrast-limited adaptive histogram equalization increased the sensitivity of the vessel segmentation algorithm by up to 5% at the same level of accuracy. The contrast-limited adaptive histogram equalization technique has a higher sensitivity than the polynomial transformation operator as a contrast enhancement technique for vessel segmentation. Three techniques, the dividing method using the median filter to estimate background, the quotient-based method, and homomorphic filtering, were found to be effective illumination correction techniques based on the statistical evaluation. Applying a local contrast enhancement technique such as CLAHE to fundus images showed good potential for enhancing vasculature segmentation. PMID:25709940

  14. Analysis of the hand vein pattern for people recognition

    NASA Astrophysics Data System (ADS)

    Castro-Ortega, R.; Toxqui-Quitl, C.; Cristóbal, G.; Marcos, J. Victor; Padilla-Vivanco, A.; Hurtado Pérez, R.

    2015-09-01

    The shape of the hand vascular pattern contains useful and unique features that can be used for identifying and authenticating people, with applications in access control, medicine, and financial services. In this work, an optical system for image acquisition of the hand vascular pattern is implemented. It consists of a CCD camera with sensitivity in the IR and a light source emitting at 880 nm. The IR radiation interacts with the deoxyhemoglobin, hemoglobin, and water present in the blood of the veins, making it possible to see the vein pattern underneath the skin. Segmentation of the Region Of Interest (ROI) is achieved using geometric moments to locate the centroid of the image. For enhancement of the vein pattern, we use Histogram Equalization and Contrast Limited Adaptive Histogram Equalization (CLAHE). In order to remove unnecessary information such as body hair and skin folds, a low-pass filter is implemented. A method based on geometric moments is used to obtain invariant descriptors of the input images. The classification task is achieved using Artificial Neural Network (ANN) and K-Nearest Neighbors (K-nn) algorithms. Experimental results using our database show a correct classification rate higher than 86.36% with the ANN, for 912 images of 38 people with 12 images each.

  15. Radiologists' preferences for digital mammographic display. The International Digital Mammography Development Group.

    PubMed

    Pisano, E D; Cole, E B; Major, S; Zong, S; Hemminger, B M; Muller, K E; Johnston, R E; Walsh, R; Conant, E; Fajardo, L L; Feig, S A; Nishikawa, R M; Yaffe, M J; Williams, M B; Aylward, S R

    2000-09-01

    To determine the preferences of radiologists among eight different image processing algorithms applied to digital mammograms obtained for screening and diagnostic imaging tasks. Twenty-eight images representing histologically proved masses or calcifications were obtained by using three clinically available digital mammographic units. Images were processed and printed on film by using manual intensity windowing, histogram-based intensity windowing, mixture model intensity windowing, peripheral equalization, multiscale image contrast amplification (MUSICA), contrast-limited adaptive histogram equalization, Trex processing, and unsharp masking. Twelve radiologists compared the processed digital images with screen-film mammograms obtained in the same patient for breast cancer screening and breast lesion diagnosis. For the screening task, screen-film mammograms were preferred to all digital presentations, but the acceptability of images processed with the Trex and MUSICA algorithms was not significantly different. All printed digital images were preferred to screen-film radiographs in the diagnosis of masses; mammograms processed with unsharp masking were significantly preferred. For the diagnosis of calcifications, no processed digital mammogram was preferred to screen-film mammograms. When digital mammograms were preferred to screen-film mammograms, radiologists selected different digital processing algorithms for each of three mammographic reading tasks and for different lesion types. Soft-copy display will eventually allow radiologists to select among these options more easily.

  16. Moderated histogram equalization, an automatic means of enhancing the contrast in digital light micrographs reversibly.

    PubMed

    Entwistle, A

    2004-06-01

    A means for improving the contrast in the images produced from digital light micrographs is described that requires no intervention by the experimenter: zero-order, scaling, tonally independent, moderated histogram equalization. It is based upon histogram equalization, which often results in digital light micrographs that contain regions that appear to be saturated, negatively biased or very grainy. Here a non-decreasing monotonic function is introduced into the process, which moderates the changes in contrast that are generated. This method is highly effective for all three of the main types of contrast found in digital light micrography: bright objects viewed against a dark background, e.g. fluorescence and dark-ground or dark-field image data sets; bright and dark objects sets against a grey background, e.g. image data sets collected with phase or Nomarski differential interference contrast optics; and darker objects set against a light background, e.g. views of absorbing specimens. Moreover, it is demonstrated that there is a single fixed moderating function, whose actions are independent of the number of elements of image data, which works well with all types of digital light micrographs, including multimodal or multidimensional image data sets. The use of this fixed function is very robust as the appearance of the final image is not altered discernibly when it is applied repeatedly to an image data set. Consequently, moderated histogram equalization can be applied to digital light micrographs as a push-button solution, thereby eliminating biases that those undertaking the processing might have introduced during manual processing. Finally, moderated histogram equalization yields a mapping function and so, through the use of look-up tables, indexes or palettes, the information present in the original data file can be preserved while an image with the improved contrast is displayed on the monitor screen.
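
    The moderation idea can be hedged as follows. The paper's fixed moderating function is not reproduced here; a convex blend of the HE lookup table with the identity mapping is a stand-in that is likewise non-decreasing and monotonic, and it keeps the original data intact because only the mapping needs to be stored.

```python
import numpy as np

def moderated_equalize(img, strength=0.5):
    """Blend the HE lookup table with the identity mapping (an assumed
    moderating step, simpler than the published function). The result
    stays monotone, and because only the 256-entry lut is needed for
    display, the original micrograph data can be kept untouched."""
    hist = np.bincount(img.ravel(), minlength=256)
    he_lut = hist.cumsum() / img.size * 255.0
    identity = np.arange(256, dtype=np.float64)
    lut = np.round((1.0 - strength) * identity + strength * he_lut).astype(np.uint8)
    return lut[img], lut

rng = np.random.default_rng(2)
micrograph = rng.integers(0, 64, size=(64, 64), dtype=np.uint8)  # dim fluorescence-like frame
moderated, lut = moderated_equalize(micrograph, strength=0.5)
```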

  17. Flood Detection/Monitoring Using Adjustable Histogram Equalization Technique

    PubMed Central

    Riaz, Muhammad Mohsin; Ghafoor, Abdul

    2014-01-01

    A flood monitoring technique using adjustable histogram equalization is proposed. The technique overcomes the limitations (over-enhancement, artifacts, and an unnatural look) of the existing technique by adjusting the contrast of images. The proposed technique takes pre- and post-event images and applies different processing steps to generate a flood map without user interaction. The resultant flood maps can be used for flood monitoring and detection. Simulation results show that the proposed technique provides better output quality compared to the state-of-the-art technique. PMID:24558332

  18. Novel medical image enhancement algorithms

    NASA Astrophysics Data System (ADS)

    Agaian, Sos; McClendon, Stephen A.

    2010-01-01

    In this paper, we present two novel medical image enhancement algorithms. The first, a global image enhancement algorithm, utilizes an alpha-trimmed mean filter as its backbone to sharpen images. The second algorithm uses a cascaded unsharp masking technique to separate the high frequency components of an image in order for them to be enhanced using a modified adaptive contrast enhancement algorithm. Experimental results from enhancing electron microscopy, radiological, CT scan and MRI scan images, using the MATLAB environment, are then compared to the original images as well as other enhancement methods, such as histogram equalization and two forms of adaptive contrast enhancement. An image processing scheme for electron microscopy images of Purkinje cells will also be implemented and utilized as a comparison tool to evaluate the performance of our algorithm.
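
    The backbone filter of the first algorithm can be sketched directly. This is a straightforward, unoptimized version; the cascaded unsharp-masking stage that subtracts the smoothed image from the original is not shown.

```python
import numpy as np

def alpha_trimmed_mean(img, k=3, trim=2):
    """For each k-by-k neighbourhood, sort the samples, drop the
    `trim` smallest and `trim` largest, and average the rest, so
    impulse-like outliers never reach the mean."""
    pad = k // 2
    padded = np.pad(img.astype(np.float64), pad, mode='edge')
    out = np.empty(img.shape, dtype=np.float64)
    h, w = img.shape
    for r in range(h):
        for c in range(w):
            window = np.sort(padded[r:r + k, c:c + k].ravel())
            out[r, c] = window[trim:window.size - trim].mean()
    return out

noisy = np.full((16, 16), 100, dtype=np.uint8)
noisy[8, 8] = 255                                    # a single impulse
smoothed = alpha_trimmed_mean(noisy)
```

    Unlike a plain mean, the trimmed mean removes the impulse entirely, which is why it makes a robust base for subsequent sharpening.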

  19. Complex adaptation-based LDR image rendering for 3D image reconstruction

    NASA Astrophysics Data System (ADS)

    Lee, Sung-Hak; Kwon, Hyuk-Ju; Sohng, Kyu-Ik

    2014-07-01

    A low-dynamic-range tone-compression technique is developed for realistic image rendering that can make three-dimensional (3D) images similar to realistic scenes by overcoming brightness dimming in the 3D display mode. The 3D surround provides varying conditions for image quality: illuminant adaptation, contrast, gamma, color, sharpness, and so on. In general, gain/offset adjustment, gamma compensation, and histogram equalization have performed well in contrast compression; however, as a result of signal saturation and clipping effects, image details are removed and information is lost in bright and dark areas. Thus, an enhanced image mapping technique is proposed based on space-varying image compression. The performance of contrast compression is enhanced with complex adaptation in a 3D viewing surround, combining global and local adaptation. Evaluating local image rendering in view of tone and color expression, noise reduction, and edge compensation confirms that the proposed 3D image-mapping model can compensate for the loss of image quality in the 3D mode.

  20. Osteoarthritis classification using self organizing map based on gabor kernel and contrast-limited adaptive histogram equalization.

    PubMed

    Anifah, Lilik; Purnama, I Ketut Eddy; Hariadi, Mochamad; Purnomo, Mauridhi Hery

    2013-01-01

    Localization is the first step in osteoarthritis (OA) classification. Manual classification, however, is time-consuming, tedious, and expensive. The proposed system is designed as a decision support system for medical doctors to classify the severity of knee OA. A method is proposed here to localize the joint space area and then classify OA into KL-Grade 0, KL-Grade 1, KL-Grade 2, KL-Grade 3 and KL-Grade 4 in four steps: preprocessing, segmentation, feature extraction, and classification. In this system, right and left knee detection was performed by employing Contrast-Limited Adaptive Histogram Equalization (CLAHE) and template matching. The Gabor kernel, row sum graph and moment methods were used to localize the joint space area of the knee. CLAHE is used in the preprocessing step, i.e., to normalize the varied intensities. The segmentation process was conducted using the Gabor kernel, template matching, row sum graph and gray-level center of mass methods. GLCM features (contrast, correlation, energy, and homogeneity) were employed as training data. Overall, 50 samples were used for training and 258 for testing. Experimental results showed the best performance using a Gabor kernel with parameters α=8, θ=0, Ψ=[0 π/2], γ=0.8, N=4, with 5000 iterations, a momentum value of 0.5 and α0=0.6 for the classification process. The run gave classification accuracy rates of 93.8% for KL-Grade 0, 70% for KL-Grade 1, 4% for KL-Grade 2, 10% for KL-Grade 3 and 88.9% for KL-Grade 4.

  1. Osteoarthritis Classification Using Self Organizing Map Based on Gabor Kernel and Contrast-Limited Adaptive Histogram Equalization

    PubMed Central

    Anifah, Lilik; Purnama, I Ketut Eddy; Hariadi, Mochamad; Purnomo, Mauridhi Hery

    2013-01-01

    Localization is the first step in osteoarthritis (OA) classification. Manual classification, however, is time-consuming, tedious, and expensive. The proposed system is designed as a decision support system for medical doctors to classify the severity of knee OA. A method is proposed here to localize the joint space area and then classify OA into KL-Grade 0, KL-Grade 1, KL-Grade 2, KL-Grade 3 and KL-Grade 4 in four steps: preprocessing, segmentation, feature extraction, and classification. In this system, right and left knee detection was performed by employing Contrast-Limited Adaptive Histogram Equalization (CLAHE) and template matching. The Gabor kernel, row sum graph and moment methods were used to localize the joint space area of the knee. CLAHE is used in the preprocessing step, i.e., to normalize the varied intensities. The segmentation process was conducted using the Gabor kernel, template matching, row sum graph and gray-level center of mass methods. GLCM features (contrast, correlation, energy, and homogeneity) were employed as training data. Overall, 50 samples were used for training and 258 for testing. Experimental results showed the best performance using a Gabor kernel with parameters α=8, θ=0, Ψ=[0 π/2], γ=0.8, N=4, with 5000 iterations, a momentum value of 0.5 and α0=0.6 for the classification process. The run gave classification accuracy rates of 93.8% for KL-Grade 0, 70% for KL-Grade 1, 4% for KL-Grade 2, 10% for KL-Grade 3 and 88.9% for KL-Grade 4. PMID:23525188

  2. Investigation on improved infrared image detail enhancement algorithm based on adaptive histogram statistical stretching and gradient filtering

    NASA Astrophysics Data System (ADS)

    Zeng, Bangze; Zhu, Youpan; Li, Zemin; Hu, Dechao; Luo, Lin; Zhao, Deli; Huang, Juan

    2014-11-01

    Due to their low contrast, high noise, and unclear visual effect, infrared images make targets very difficult to observe and identify. This paper presents an improved infrared image detail enhancement algorithm based on adaptive histogram statistical stretching and gradient filtering (AHSS-GF). Based on the fact that the human eye is very sensitive to edges and lines, the authors propose extracting the details and textures using gradient filtering. A new histogram is acquired by summing the original histogram over a fixed window. Using the minimum value as the cut-off point, histogram statistical stretching is carried out. After proper weights are given to the details and the background, the detail-enhanced result is obtained. The results indicate that image contrast can be improved and that details and textures can be enhanced effectively.
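
    The histogram stage can be hedged into a short sketch: sum the histogram over a fixed moving window, discard sparsely populated bins, and linearly stretch what remains. The gradient-filtering detail branch and the weighting step are omitted, and the cut-off below is a small count fraction of our own choosing rather than the paper's exact minimum-value rule.

```python
import numpy as np

def histogram_statistical_stretch(img, window=5, cutoff_frac=0.001):
    """Smooth the histogram with a fixed moving-window sum, keep only
    bins whose smoothed count clears a small cut-off, and linearly
    stretch the surviving intensity range to the full 8-bit scale."""
    hist = np.bincount(img.ravel(), minlength=256).astype(np.float64)
    smooth = np.convolve(hist, np.ones(window), mode='same')
    keep = np.flatnonzero(smooth > cutoff_frac * img.size)
    lo, hi = keep[0], keep[-1]
    x = np.clip(img.astype(np.float64), lo, hi)
    return np.round((x - lo) / max(hi - lo, 1) * 255).astype(np.uint8)

rng = np.random.default_rng(6)
ir_like = rng.integers(100, 121, size=(64, 64), dtype=np.uint8)  # narrow-range IR-like frame
stretched = histogram_statistical_stretch(ir_like)
```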

  3. Improving the convergence rate in affine registration of PET and SPECT brain images using histogram equalization.

    PubMed

    Salas-Gonzalez, D; Górriz, J M; Ramírez, J; Padilla, P; Illán, I A

    2013-01-01

    A procedure to improve the convergence rate of affine registration methods for medical brain images when the images differ greatly from the template is presented. The methodology is based on a histogram matching of the source images with respect to the reference brain template before proceeding with the affine registration. The preprocessed source brain images are spatially normalized to a template using a general affine model with 12 parameters. A sum of squared differences between the source images and the template is considered as the objective function, and a Gauss-Newton optimization algorithm is used to find the minimum of the cost function. Using histogram equalization as a preprocessing step improves the convergence rate of the affine registration algorithm for brain images, as we show in this work using SPECT and PET brain images.
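
    The preprocessing step amounts to classical histogram matching of the source image onto the template, sketched below; the affine registration and Gauss-Newton optimization themselves are not shown.

```python
import numpy as np

def match_histogram(source, template):
    """Map each source intensity through the source CDF and the
    inverse template CDF, so the matched image's histogram
    approximates the template's."""
    s_vals, s_counts = np.unique(source.ravel(), return_counts=True)
    t_vals, t_counts = np.unique(template.ravel(), return_counts=True)
    s_cdf = np.cumsum(s_counts) / source.size
    t_cdf = np.cumsum(t_counts) / template.size
    matched_vals = np.interp(s_cdf, t_cdf, t_vals.astype(np.float64))
    return np.interp(source.astype(np.float64), s_vals, matched_vals)

rng = np.random.default_rng(3)
source = rng.integers(0, 100, size=(64, 64), dtype=np.uint8)      # dim source scan
template = rng.integers(100, 200, size=(64, 64), dtype=np.uint8)  # brighter template
matched = match_histogram(source, template)
```

    After matching, a sum-of-squared-differences cost compares like with like, which is why the optimizer needs fewer iterations when source and template intensities started far apart.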

  4. Multispectral histogram normalization contrast enhancement

    NASA Technical Reports Server (NTRS)

    Soha, J. M.; Schwartz, A. A.

    1979-01-01

    A multispectral histogram normalization or decorrelation enhancement which achieves effective color composites by removing interband correlation is described. The enhancement procedure employs either linear or nonlinear transformations to equalize principal component variances. An additional rotation to any set of orthogonal coordinates is thus possible, while full histogram utilization is maintained by avoiding the reintroduction of correlation. For the three-dimensional case, the enhancement procedure may be implemented with a lookup table. An application of the enhancement to Landsat multispectral scanning imagery is presented.
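
    A minimal sketch of the variance-equalizing transform follows: symmetric whitening, i.e. scaling in the principal-component basis and rotating back, which is one of the orthogonal-coordinate choices the abstract allows.

```python
import numpy as np

def decorrelation_stretch(bands):
    """Rotate the band vectors to principal components, equalize the
    variances, and rotate back: interband correlation is removed, so
    the color composite uses the full gamut instead of clustering
    along the gray diagonal."""
    x = bands.reshape(bands.shape[0], -1).astype(np.float64)
    x = x - x.mean(axis=1, keepdims=True)
    w, v = np.linalg.eigh(np.cov(x))                  # principal components
    whiten = v @ np.diag(1.0 / np.sqrt(np.maximum(w, 1e-12))) @ v.T
    return (whiten @ x).reshape(bands.shape)

rng = np.random.default_rng(4)
base = rng.standard_normal((32, 32))
bands = np.stack([base,                               # three strongly correlated bands
                  0.9 * base + 0.1 * rng.standard_normal((32, 32)),
                  0.8 * base + 0.2 * rng.standard_normal((32, 32))])
stretched = decorrelation_stretch(bands)
```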

  5. Investigating the Role of Global Histogram Equalization Technique for 99mTechnetium-Methylene diphosphonate Bone Scan Image Enhancement.

    PubMed

    Pandey, Anil Kumar; Sharma, Param Dev; Dheer, Pankaj; Parida, Girish Kumar; Goyal, Harish; Patel, Chetan; Bal, Chandrashekhar; Kumar, Rakesh

    2017-01-01

    99mTechnetium-methylene diphosphonate (99mTc-MDP) bone scan images have a limited number of counts per pixel, and hence they have inferior image quality compared to X-rays. Theoretically, the global histogram equalization (GHE) technique can improve the contrast of a given image, though the practical benefits of doing so have only limited acceptance. In this study, we have investigated the effect of the GHE technique on 99mTc-MDP bone scan images. A set of 89 low-contrast 99mTc-MDP whole-body bone scan images was included in this study. These images were acquired with parallel-hole collimation on a Symbia E gamma camera. The images were then processed with the histogram equalization technique. The image quality of the input and processed images was reviewed by two nuclear medicine physicians on a 5-point scale, where a score of 1 is for very poor and 5 is for the best image quality. A statistical test was applied to find the significance of the difference between the mean scores assigned to input and processed images. This technique improves the contrast of the images; however, oversaturation was noticed in the processed images. Student's t-test was applied, and a statistically significant difference in the input and processed image quality was found at P < 0.001 (with α = 0.05). However, further improvement in image quality is needed as per the requirements of nuclear medicine physicians. GHE techniques can be used on low-contrast bone scan images. In some cases, a histogram equalization technique in combination with some other postprocessing technique is useful.
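
    The trade-off reported above, contrast gain versus oversaturation, can be reproduced with plain GHE plus a saturation check. The fraction-of-extreme-pixels diagnostic is our own illustration, not the physicians' scoring.

```python
import numpy as np

def ghe_with_saturation(img, tail=5):
    """Global histogram equalization, returning the enhanced image
    together with the fraction of pixels mapped into the extreme
    `tail` output levels, a rough proxy for oversaturation."""
    hist = np.bincount(img.ravel(), minlength=256)
    lut = np.round(hist.cumsum() / img.size * 255).astype(np.uint8)
    out = lut[img]
    saturated = float(((out <= tail) | (out >= 255 - tail)).mean())
    return out, saturated

rng = np.random.default_rng(5)
bone_like = rng.poisson(5.0, size=(64, 64)).clip(0, 255).astype(np.uint8)  # low-count image
enhanced, frac = ghe_with_saturation(bone_like)
```

    On a low-count image the contrast range widens dramatically, but some pixels land at the extremes of the output scale, which is the oversaturation effect the study observed.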

  6. Finger vein recognition based on finger crease location

    NASA Astrophysics Data System (ADS)

    Lu, Zhiying; Ding, Shumeng; Yin, Jing

    2016-07-01

    Finger vein recognition technology has significant advantages over other methods in terms of accuracy, uniqueness, and stability, and it has promising applications in the field of biometric recognition. We propose using finger creases to locate and extract an object region. Then we use linear fitting to overcome the problem of finger rotation in the plane. The method of modular adaptive histogram equalization (MAHE) is presented to enhance image contrast and reduce computational cost. To extract the finger vein features, we use a fusion method, which can obtain clear and distinguishable vein patterns under different conditions. We used the Hausdorff average distance algorithm to examine the recognition performance of the system. The experimental results demonstrate that MAHE balances recognition accuracy and computation time better than three other methods. Our resulting equal error rate throughout the total procedure was 3.268% on a database of 153 finger vein images.

  7. Investigating the Role of Global Histogram Equalization Technique for 99mTechnetium-Methylene diphosphonate Bone Scan Image Enhancement

    PubMed Central

    Pandey, Anil Kumar; Sharma, Param Dev; Dheer, Pankaj; Parida, Girish Kumar; Goyal, Harish; Patel, Chetan; Bal, Chandrashekhar; Kumar, Rakesh

    2017-01-01

    Purpose of the Study: 99mTechnetium-methylene diphosphonate (99mTc-MDP) bone scan images have a limited number of counts per pixel, and hence they have inferior image quality compared to X-rays. Theoretically, the global histogram equalization (GHE) technique can improve the contrast of a given image, though the practical benefits of doing so have only limited acceptance. In this study, we have investigated the effect of the GHE technique on 99mTc-MDP bone scan images. Materials and Methods: A set of 89 low-contrast 99mTc-MDP whole-body bone scan images was included in this study. These images were acquired with parallel-hole collimation on a Symbia E gamma camera. The images were then processed with the histogram equalization technique. The image quality of the input and processed images was reviewed by two nuclear medicine physicians on a 5-point scale, where a score of 1 is for very poor and 5 is for the best image quality. A statistical test was applied to find the significance of the difference between the mean scores assigned to input and processed images. Results: This technique improves the contrast of the images; however, oversaturation was noticed in the processed images. Student's t-test was applied, and a statistically significant difference in the input and processed image quality was found at P < 0.001 (with α = 0.05). However, further improvement in image quality is needed as per the requirements of nuclear medicine physicians. Conclusion: GHE techniques can be used on low-contrast bone scan images. In some cases, a histogram equalization technique in combination with some other postprocessing technique is useful. PMID:29142344

  8. Illumination robust face recognition using spatial adaptive shadow compensation based on face intensity prior

    NASA Astrophysics Data System (ADS)

    Hsieh, Cheng-Ta; Huang, Kae-Horng; Lee, Chang-Hsing; Han, Chin-Chuan; Fan, Kuo-Chin

    2017-12-01

Robust face recognition under illumination variations is an important and challenging task in a face recognition system, particularly for face recognition in the wild. In this paper, a face image preprocessing approach, called spatial adaptive shadow compensation (SASC), is proposed to eliminate shadows in the face image due to different lighting directions. First, spatial adaptive histogram equalization (SAHE), which uses a face intensity prior model, is proposed to enhance the contrast of each local face region without generating visible noise in smooth face areas. Adaptive shadow compensation (ASC), which performs shadow compensation in each local image block, is then used to produce a well-compensated face image appropriate for face feature extraction and recognition. Finally, null-space linear discriminant analysis (NLDA) is employed to extract discriminant features from SASC-compensated images. Experiments performed on the Yale B, extended Yale B, and CMU PIE face databases have shown that the proposed SASC always yields the best face recognition accuracy. That is, SASC is more robust to illumination variations than other shadow compensation approaches.

  9. An Unsupervised Approach for Extraction of Blood Vessels from Fundus Images.

    PubMed

    Dash, Jyotiprava; Bhoi, Nilamani

    2018-04-26

Pathological disorders may arise from small changes in retinal blood vessels and may later lead to blindness. Hence, the accurate segmentation of blood vessels is a challenging task for pathological analysis. This paper offers an unsupervised recursive method for the extraction of blood vessels from ophthalmoscope images. First, a vessel-enhanced image is generated with the help of gamma correction and contrast-limited adaptive histogram equalization (CLAHE). Next, the vessels are extracted iteratively by applying an adaptive thresholding technique. Finally, the segmented vessel image is produced by applying a morphological cleaning operation. Evaluations are conducted on the publicly available Digital Retinal Images for Vessel Extraction (DRIVE) and Child Heart And Health Study in England (CHASE_DB1) databases using nine different measures. The proposed method achieves average accuracies of 0.957 and 0.952 on the DRIVE and CHASE_DB1 databases, respectively.
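A minimal sketch of the enhancement and thresholding steps, assuming a float green-channel image in [0, 1]. The CLAHE stage and the paper's iterative refinement are omitted; a local-mean threshold (computed with an integral image) stands in for the authors' adaptive thresholding, and `block`/`offset` are illustrative parameters:

```python
import numpy as np

def gamma_correct(img, gamma=0.8):
    """Gamma correction on a float image in [0, 1]."""
    return np.clip(img, 0.0, 1.0) ** gamma

def adaptive_threshold(img, block=15, offset=0.02):
    """Mark pixels darker than their local neighborhood mean minus an offset
    (vessels appear dark on the green channel of fundus images)."""
    pad = block // 2
    padded = np.pad(img, pad, mode='reflect')
    # Local means via a summed-area table (integral image).
    ii = np.cumsum(np.cumsum(padded, axis=0), axis=1)
    ii = np.pad(ii, ((1, 0), (1, 0)))
    h, w = img.shape
    s = (ii[block:block+h, block:block+w] - ii[:h, block:block+w]
         - ii[block:block+h, :w] + ii[:h, :w])
    local_mean = s / (block * block)
    return img < (local_mean - offset)

# Demo: a dark vessel-like line on a bright background.
demo = np.full((32, 32), 0.8)
demo[16, :] = 0.2
mask = adaptive_threshold(gamma_correct(demo))
```

The dark line survives the threshold while the uniform background does not, which is the behavior an iterative vessel-extraction loop would build on.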

  10. Fast and efficient molecule detection in localization-based super-resolution microscopy by parallel adaptive histogram equalization.

    PubMed

    Li, Yiming; Ishitsuka, Yuji; Hedde, Per Niklas; Nienhaus, G Ulrich

    2013-06-25

In localization-based super-resolution microscopy, individual fluorescent markers are stochastically photoactivated and subsequently localized within a series of camera frames, yielding a final image with a resolution far beyond the diffraction limit. Yet, before localization can be performed, the subregions within the frames where the individual molecules are present have to be identified, oftentimes in the presence of high background. In this work, we address the importance of reliable molecule identification for the quality of the final reconstructed super-resolution image. We present a fast and robust algorithm (a-livePALM) that vastly improves the molecule detection efficiency while minimizing false assignments that can lead to image artifacts.

  11. Automated Age-related Macular Degeneration screening system using fundus images.

    PubMed

    Kunumpol, P; Umpaipant, W; Kanchanaranya, N; Charoenpong, T; Vongkittirux, S; Kupakanjana, T; Tantibundhit, C

    2017-07-01

This work proposed an automated screening system for Age-related Macular Degeneration (AMD), distinguishing between wet and dry AMD using fundus images to assist ophthalmologists in eye disease screening and management. The algorithm employs contrast-limited adaptive histogram equalization (CLAHE) for image enhancement. Subsequently, discrete wavelet transform (DWT) and locality sensitivity discrimination analysis (LSDA) were used to extract features for a neural network model to classify the results. The results showed that the proposed algorithm was able to distinguish between normal eyes, dry AMD, and wet AMD with 98.63% sensitivity, 99.15% specificity, and 98.94% accuracy, suggesting promising potential as a medical support system for faster eye disease screening at lower costs.

  12. A multiresolution processing method for contrast enhancement in portal imaging.

    PubMed

    Gonzalez-Lopez, Antonio

    2018-06-18

Portal images have a unique feature among the imaging modalities used in radiotherapy: they provide direct visualization of the irradiated volumes. However, contrast and spatial resolution are strongly limited due to the high energy of the radiation sources. Because of this, imaging modalities using x-ray energy beams have gained importance in the verification of patient positioning, replacing portal imaging. The purpose of this work was to develop a method for the enhancement of local contrast in portal images. The method operates in the subbands of a wavelet decomposition of the image, re-scaling them in such a way that coefficients in the high and medium resolution subbands are amplified, an approach totally different from those operating on the image histogram, widely used nowadays. Portal images of an anthropomorphic phantom were acquired in an electronic portal imaging device (EPID). Then, different re-scaling strategies were investigated, studying the effects of the scaling parameters on the enhanced images. Also, the effect of using different types of transforms was studied. Finally, the implemented methods were combined with histogram equalization methods like contrast limited adaptive histogram equalization (CLAHE), and these combinations were compared. Uniform amplification of the detail subbands shows the best results in contrast enhancement. On the other hand, linear re-scaling of the high resolution subbands increases the visibility of fine detail in the images, at the expense of an increase in noise levels. Also, since processing is applied only to the detail subbands, not to the approximation, the mean gray level of the image is minimally modified and no further display adjustments are required. It is shown that re-scaling of the detail subbands of portal images can be used as an efficient method for the enhancement of both the local contrast and the resolution of these images. © 2018 Institute of Physics and Engineering in Medicine.
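The subband re-scaling idea can be illustrated with a one-level Haar transform in NumPy. The paper studies several transform types and re-scaling strategies; this sketch uses Haar only, with uniform amplification of the detail subbands. Because the approximation subband is left untouched, the mean gray level is preserved exactly, as the abstract notes:

```python
import numpy as np

def haar2d(x):
    """One-level 2-D Haar transform; returns (LL, LH, HL, HH)."""
    a = (x[0::2] + x[1::2]) / 2.0       # row pairs: average
    d = (x[0::2] - x[1::2]) / 2.0       # row pairs: difference
    ll = (a[:, 0::2] + a[:, 1::2]) / 2.0
    lh = (a[:, 0::2] - a[:, 1::2]) / 2.0
    hl = (d[:, 0::2] + d[:, 1::2]) / 2.0
    hh = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return ll, lh, hl, hh

def ihaar2d(ll, lh, hl, hh):
    """Exact inverse of haar2d."""
    a = np.empty((ll.shape[0], ll.shape[1] * 2))
    d = np.empty_like(a)
    a[:, 0::2], a[:, 1::2] = ll + lh, ll - lh
    d[:, 0::2], d[:, 1::2] = hl + hh, hl - hh
    x = np.empty((a.shape[0] * 2, a.shape[1]))
    x[0::2], x[1::2] = a + d, a - d
    return x

def enhance(img, gain=2.0):
    """Amplify detail subbands uniformly; the approximation is untouched,
    so the mean gray level of the image is preserved."""
    ll, lh, hl, hh = haar2d(img)
    return ihaar2d(ll, gain * lh, gain * hl, gain * hh)

rng = np.random.default_rng(1)
img = rng.random((16, 16))              # stand-in for a portal image
sharp = enhance(img, gain=2.0)
```

With `gain=1.0` the round trip reconstructs the input exactly; larger gains amplify edges and fine detail (and, as the abstract cautions, noise).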

  13. Quality based approach for adaptive face recognition

    NASA Astrophysics Data System (ADS)

    Abboud, Ali J.; Sellahewa, Harin; Jassim, Sabah A.

    2009-05-01

Recent advances in biometric technology have pushed towards more robust and reliable systems. We aim to build systems that have low recognition errors and are less affected by variation in recording conditions. Recognition errors are often attributed to the usage of low quality biometric samples. Hence, there is a need to develop new intelligent techniques and strategies to automatically measure/quantify the quality of biometric image samples and, if necessary, restore image quality according to the need of the intended application. In this paper, we present no-reference image quality measures in the spatial domain that have impact on face recognition. The first is called symmetrical adaptive local quality index (SALQI) and the second is called middle halve (MH). Also, an adaptive strategy has been developed to select the best way to restore the image quality, called symmetrical adaptive histogram equalization (SAHE). The main benefits of using quality measures for the adaptive strategy are: (1) avoidance of excessive unnecessary enhancement procedures that may cause undesired artifacts, and (2) reduced computational complexity, which is essential for real time applications. We test the success of the proposed measures and adaptive approach for a wavelet-based face recognition system that uses the nearest neighbor classifier. We demonstrate noticeable improvements in the performance of the adaptive face recognition system over the corresponding non-adaptive scheme.

  14. A contrast enhancement method for improving the segmentation of breast lesions on ultrasonography.

    PubMed

    Flores, Wilfrido Gómez; Pereira, Wagner Coelho de Albuquerque

    2017-01-01

This paper presents an adaptive contrast enhancement method based on a sigmoidal mapping function (SACE) used for improving the computerized segmentation of breast lesions on ultrasound. First, from the original ultrasound image an intensity variation map is obtained, which is used to generate local sigmoidal mapping functions related to distinct contextual regions. Then, a bilinear interpolation scheme is used to transform every original pixel to a new gray level value. Also, four contrast enhancement techniques widely used in breast ultrasound enhancement are implemented: histogram equalization (HEQ), contrast limited adaptive histogram equalization (CLAHE), fuzzy enhancement (FEN), and sigmoid based enhancement (SEN). In addition, these contrast enhancement techniques are considered in a computerized lesion segmentation scheme based on watershed transformation. The performance comparison among techniques is assessed in terms of both the quality of contrast enhancement and the segmentation accuracy. The former is quantified by the measure, where the greater the value, the better the contrast enhancement, whereas the latter is calculated by the Jaccard index, which should tend towards unity to indicate adequate segmentation. The experiments consider a data set with 500 breast ultrasound images. The results show that SACE outperforms its counterparts, where the median values for the measure are: SACE: 139.4, SEN: 68.2, HEQ: 64.1, CLAHE: 62.8, and FEN: 7.9. Considering the segmentation performance results, the SACE method presents the largest accuracy, where the median values for the Jaccard index are: SACE: 0.81, FEN: 0.80, CLAHE: 0.79, HEQ: 0.77, and SEN: 0.63. The SACE method performs well due to the combination of three elements: (1) the intensity variation map reduces intensity variations that could distort the real response of the mapping function, (2) the sigmoidal mapping function enhances the gray level range where the transition between lesion and background is found, and (3) the adaptive enhancing scheme copes with local contrasts. Hence, the SACE approach is appropriate for enhancing contrast before computerized lesion segmentation. Copyright © 2016 Elsevier Ltd. All rights reserved.
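A hedged sketch of a sigmoidal gray-level mapping of the kind SACE builds on, for a float image in [0, 1]. The intensity variation map, contextual regions, and bilinear interpolation of the full method are omitted; `center` and `slope` are illustrative parameters, not the paper's:

```python
import numpy as np

def sigmoid_enhance(img, center=None, slope=10.0):
    """Sigmoidal gray-level mapping on a float image in [0, 1].
    Gray levels near `center` (default: the image mean) are stretched,
    while the extremes are compressed."""
    if center is None:
        center = float(img.mean())
    s = 1.0 / (1.0 + np.exp(-slope * (img - center)))
    lo = 1.0 / (1.0 + np.exp(-slope * (0.0 - center)))
    hi = 1.0 / (1.0 + np.exp(-slope * (1.0 - center)))
    return (s - lo) / (hi - lo)        # rescale so 0 -> 0 and 1 -> 1

ramp = np.linspace(0.0, 1.0, 101)      # synthetic gray-level ramp
out = sigmoid_enhance(ramp)
```

The mapping is monotonic and endpoint-preserving, and its slope exceeds 1 around `center`, which is where the lesion/background transition would be stretched.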

  15. Recovery of Background Structures in Nanoscale Helium Ion Microscope Imaging.

    PubMed

    Carasso, Alfred S; Vladár, András E

    2014-01-01

    This paper discusses a two step enhancement technique applicable to noisy Helium Ion Microscope images in which background structures are not easily discernible due to a weak signal. The method is based on a preliminary adaptive histogram equalization, followed by 'slow motion' low-exponent Lévy fractional diffusion smoothing. This combined approach is unexpectedly effective, resulting in a companion enhanced image in which background structures are rendered much more visible, and noise is significantly reduced, all with minimal loss of image sharpness. The method also provides useful enhancements of scanning charged-particle microscopy images obtained by composing multiple drift-corrected 'fast scan' frames. The paper includes software routines, written in Interactive Data Language (IDL),(1) that can perform the above image processing tasks.

  16. [Algorithm of locally adaptive region growing based on multi-template matching applied to automated detection of hemorrhages].

    PubMed

    Gao, Wei-Wei; Shen, Jian-Xin; Wang, Yu-Liang; Liang, Chun; Zuo, Jing

    2013-02-01

In order to automatically detect hemorrhages in fundus images and develop an automated diabetic retinopathy screening system, a novel algorithm named locally adaptive region growing based on multi-template matching was established and studied. Firstly, the spectral signature of major anatomical structures in the fundus was studied, so that the right channel among the RGB channels could be selected for different segmentation objects. Secondly, the fundus image was preprocessed by means of HSV brightness correction and contrast limited adaptive histogram equalization (CLAHE). Then, seeds for region growing were identified by removing the optic disc and vessels from the result of normalized cross-correlation (NCC) template matching applied to the preprocessed image with several templates. Finally, locally adaptive region growing segmentation was used to find the exact contours of hemorrhages, completing the automated detection of the lesions. The approach was tested on 90 fundus images of different resolutions with variable color, brightness and quality. Results suggest that the approach can quickly and effectively detect hemorrhages in fundus images, and that it is stable and robust. As a result, the approach can meet clinical demands.
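Region growing with a locally adaptive acceptance rule can be sketched as follows. This generic 4-connected version with a running-mean tolerance is an illustration only, not the paper's multi-template algorithm; `tol` is a hypothetical parameter:

```python
import numpy as np
from collections import deque

def region_grow(img, seed, tol=0.1):
    """Grow a region from `seed`, adding 4-connected neighbors whose
    intensity is within `tol` of the running region mean."""
    h, w = img.shape
    mask = np.zeros((h, w), dtype=bool)
    mask[seed] = True
    total, count = float(img[seed]), 1
    q = deque([seed])
    while q:
        y, x = q.popleft()
        for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
            if 0 <= ny < h and 0 <= nx < w and not mask[ny, nx]:
                if abs(float(img[ny, nx]) - total / count) <= tol:
                    mask[ny, nx] = True
                    total += float(img[ny, nx])
                    count += 1
                    q.append((ny, nx))
    return mask

# Demo: a bright 3x3 "lesion" on a dark background.
img = np.zeros((10, 10))
img[2:5, 2:5] = 1.0
mask = region_grow(img, (3, 3), tol=0.1)
```

Starting from a seed inside the lesion, the region stops exactly at the contour where the intensity deviates from the region mean by more than the tolerance.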

  17. Differentially Private Histogram Publication For Dynamic Datasets: An Adaptive Sampling Approach

    PubMed Central

    Li, Haoran; Jiang, Xiaoqian; Xiong, Li; Liu, Jinfei

    2016-01-01

Differential privacy has recently become a de facto standard for private statistical data release. Many algorithms have been proposed to generate differentially private histograms or synthetic data. However, most of them focus on “one-time” release of a static dataset and do not adequately address the increasing need to release series of dynamic datasets in real time. A straightforward application of existing histogram methods on each snapshot of such dynamic datasets will incur high accumulated error due to the composability of differential privacy and correlations or overlapping users between the snapshots. In this paper, we address the problem of releasing series of dynamic datasets in real time with differential privacy, using a novel adaptive distance-based sampling approach. Our first method, DSFT, uses a fixed distance threshold and releases a differentially private histogram only when the current snapshot is sufficiently different from the previous one, i.e., with a distance greater than a predefined threshold. Our second method, DSAT, further improves DSFT and uses a dynamic threshold adaptively adjusted by a feedback control mechanism to capture the data dynamics. Extensive experiments on real and synthetic datasets demonstrate that our approach achieves better utility than baseline methods and existing state-of-the-art methods. PMID:26973795
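The DSFT idea (publish a fresh noisy histogram only when the snapshot has drifted past a fixed threshold, otherwise re-publish the last release) can be sketched as below. This simplification deliberately ignores the privacy-budget split between the distance test and the release and uses an exact rather than noisy distance, so it is illustrative only, not a differentially private implementation:

```python
import numpy as np

def dsft_release(snapshots, epsilon=1.0, threshold=0.2, seed=0):
    """Release a Laplace-noised histogram only when the current snapshot's
    normalized histogram is far enough (L1 distance) from the last one that
    triggered a release; otherwise re-publish the previous release."""
    rng = np.random.default_rng(seed)
    releases, last = [], None
    for hist in snapshots:
        hist = np.asarray(hist, dtype=float)
        cur = hist / hist.sum()
        if last is None or np.abs(cur - last).sum() > threshold:
            # Fresh release: add Laplace noise scaled to 1/epsilon.
            noisy = hist + rng.laplace(scale=1.0 / epsilon, size=hist.shape)
            releases.append(noisy)
            last = cur
        else:
            releases.append(releases[-1])  # re-publish, no new noise drawn
    return releases

# Snapshot 2 is identical to 1 (re-published); snapshot 3 has drifted.
rel = dsft_release([[10, 0], [10, 0], [0, 10]])
```

Re-publishing unchanged releases is what keeps the accumulated error low across a long series of similar snapshots.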

  18. Gray-level transformations for interactive image enhancement. M.S. Thesis. Final Technical Report

    NASA Technical Reports Server (NTRS)

    Fittes, B. A.

    1975-01-01

    A gray-level transformation method suitable for interactive image enhancement was presented. It is shown that the well-known histogram equalization approach is a special case of this method. A technique for improving the uniformity of a histogram is also developed. Experimental results which illustrate the capabilities of both algorithms are described. Two proposals for implementing gray-level transformations in a real-time interactive image enhancement system are also presented.

  19. Perceptual Contrast Enhancement with Dynamic Range Adjustment

    PubMed Central

    Zhang, Hong; Li, Yuecheng; Chen, Hao; Yuan, Ding; Sun, Mingui

    2013-01-01

In recent years, although great efforts have been made to improve its performance, few histogram equalization (HE) methods take human visual perception (HVP) into account explicitly. The human visual system (HVS) is more sensitive to edges than to brightness. This paper makes intuitive use of this property and develops a perceptual contrast enhancement approach with dynamic range adjustment through histogram modification. The use of perceptual contrast connects the image enhancement problem with the HVS. To pre-condition the input image before the HE procedure is implemented, a perceptual contrast map (PCM) is constructed based on the modified Difference of Gaussian (DOG) algorithm. As a result, the contrast of the image is sharpened and high frequency noise is suppressed. A modified Clipped Histogram Equalization (CHE) is also developed which improves visual quality by automatically detecting the dynamic range of the image with improved perceptual contrast. Experimental results show that the new HE algorithm outperforms several state-of-the-art algorithms in improving perceptual contrast and enhancing details. In addition, the new algorithm is simple to implement, making it suitable for real-time applications. PMID:24339452
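A basic clipped histogram equalization, in the spirit of the CHE component described above (the paper's modification with automatic dynamic-range detection and the DOG-based perceptual contrast map are not reproduced here; `clip_frac` is an illustrative parameter):

```python
import numpy as np

def clipped_histogram_equalization(img, clip_frac=0.02, levels=256):
    """Clip histogram bins at a fraction of the pixel count, redistribute
    the excess uniformly, then equalize with the clipped histogram's CDF.
    Clipping bounds the slope of the mapping, limiting over-enhancement."""
    hist = np.bincount(img.ravel(), minlength=levels).astype(float)
    clip = clip_frac * img.size
    excess = np.maximum(hist - clip, 0.0).sum()
    hist = np.minimum(hist, clip) + excess / levels
    cdf = np.cumsum(hist)
    cdf /= cdf[-1]
    lut = np.round(cdf * (levels - 1)).astype(img.dtype)
    return lut[img]

# Demo: an image dominated by a single gray level, plus two outliers.
img = np.full((32, 32), 100, dtype=np.uint8)
img[0, 0], img[0, 1] = 50, 200
out = clipped_histogram_equalization(img)
```

Plain HE would map the dominant gray level almost to 255; clipping keeps it near mid-range while the gray-level ordering is preserved.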

  20. Visual Contrast Enhancement Algorithm Based on Histogram Equalization

    PubMed Central

    Ting, Chih-Chung; Wu, Bing-Fei; Chung, Meng-Liang; Chiu, Chung-Cheng; Wu, Ya-Ching

    2015-01-01

    Image enhancement techniques primarily improve the contrast of an image to lend it a better appearance. One of the popular enhancement methods is histogram equalization (HE) because of its simplicity and effectiveness. However, it is rarely applied to consumer electronics products because it can cause excessive contrast enhancement and feature loss problems. These problems make the images processed by HE look unnatural and introduce unwanted artifacts in them. In this study, a visual contrast enhancement algorithm (VCEA) based on HE is proposed. VCEA considers the requirements of the human visual perception in order to address the drawbacks of HE. It effectively solves the excessive contrast enhancement problem by adjusting the spaces between two adjacent gray values of the HE histogram. In addition, VCEA reduces the effects of the feature loss problem by using the obtained spaces. Furthermore, VCEA enhances the detailed textures of an image to generate an enhanced image with better visual quality. Experimental results show that images obtained by applying VCEA have higher contrast and are more suited to human visual perception than those processed by HE and other HE-based methods. PMID:26184219

  1. Recovery of Background Structures in Nanoscale Helium Ion Microscope Imaging

    PubMed Central

    Carasso, Alfred S; Vladár, András E

    2014-01-01

    This paper discusses a two step enhancement technique applicable to noisy Helium Ion Microscope images in which background structures are not easily discernible due to a weak signal. The method is based on a preliminary adaptive histogram equalization, followed by ‘slow motion’ low-exponent Lévy fractional diffusion smoothing. This combined approach is unexpectedly effective, resulting in a companion enhanced image in which background structures are rendered much more visible, and noise is significantly reduced, all with minimal loss of image sharpness. The method also provides useful enhancements of scanning charged-particle microscopy images obtained by composing multiple drift-corrected ‘fast scan’ frames. The paper includes software routines, written in Interactive Data Language (IDL),1 that can perform the above image processing tasks. PMID:26601050

  2. Generalized image contrast enhancement technique based on Heinemann contrast discrimination model

    NASA Astrophysics Data System (ADS)

    Liu, Hong; Nodine, Calvin F.

    1994-03-01

    This paper presents a generalized image contrast enhancement technique which equalizes perceived brightness based on the Heinemann contrast discrimination model. This is a modified algorithm which presents an improvement over the previous study by Mokrane in its mathematically proven existence of a unique solution and in its easily tunable parameterization. The model uses a log-log representation of contrast luminosity between targets and the surround in a fixed luminosity background setting. The algorithm consists of two nonlinear gray-scale mapping functions which have seven parameters, two of which are adjustable Heinemann constants. Another parameter is the background gray level. The remaining four parameters are nonlinear functions of gray scale distribution of the image, and can be uniquely determined once the previous three are given. Tests have been carried out to examine the effectiveness of the algorithm for increasing the overall contrast of images. It can be demonstrated that the generalized algorithm provides better contrast enhancement than histogram equalization. In fact, the histogram equalization technique is a special case of the proposed mapping.

  3. Change Detection of High-Resolution Remote Sensing Images Based on Adaptive Fusion of Multiple Features

    NASA Astrophysics Data System (ADS)

    Wang, G. H.; Wang, H. B.; Fan, W. F.; Liu, Y.; Chen, C.

    2018-04-01

Traditional change detection algorithms depend mainly on the spectral information of image patches and fail to effectively mine and fuse the advantages of multiple image features. Borrowing ideas from object-oriented analysis, this article proposes a remote sensing image change detection algorithm based on the fusion of multiple features. First, image objects are obtained by multi-scale segmentation; then the color histogram and linear-gradient (edge line) histogram of each object are computed. The Earth Mover's Distance (EMD) statistical operator is used to measure the color distance and the edge line feature distance between corresponding objects in different periods, and an adaptive weighting method combines the two distances into an object heterogeneity measure. Finally, curvature analysis of the heterogeneity histogram yields the patch-level change detection results. The experimental results show that the method can fully fuse the color and edge line features, thus improving the accuracy of change detection.

  4. An Adaptive Image Enhancement Technique by Combining Cuckoo Search and Particle Swarm Optimization Algorithm

    PubMed Central

    Ye, Zhiwei; Wang, Mingwei; Hu, Zhengbing; Liu, Wei

    2015-01-01

Image enhancement is an important procedure of image processing and analysis. This paper presents a new technique that uses a modified measure and a blend of cuckoo search and particle swarm optimization (CS-PSO) to enhance low-contrast images adaptively. Contrast enhancement is obtained by global transformation of the input intensities; the method employs the incomplete Beta function as the transformation function and a novel criterion for measuring image quality that considers three factors: threshold, entropy value, and gray-level probability density of the image. The enhancement process is a nonlinear optimization problem with several constraints. CS-PSO is utilized to maximize the objective fitness criterion in order to enhance the contrast and detail in an image by adapting the parameters of a novel extension to a local enhancement technique. The performance of the proposed method has been compared with other existing techniques such as linear contrast stretching, histogram equalization, and evolutionary-computing-based image enhancement methods like the backtracking search algorithm, differential search algorithm, genetic algorithm, and particle swarm optimization in terms of processing time and image quality. Experimental results demonstrate that the proposed method is robust and adaptive and exhibits better performance than the other methods considered in the paper. PMID:25784928
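The incomplete Beta function used as the gray-level transformation can be sketched as below; the CS-PSO search over the parameters is omitted and (α, β) are fixed by hand, with α, β ≥ 1 assumed so the integrand stays bounded. The normalized incomplete Beta is computed here by simple trapezoidal integration rather than a special-function library:

```python
import numpy as np

def incomplete_beta_transform(img, alpha=2.0, beta=2.0, n=1024):
    """Map gray levels in [0, 1] through the normalized incomplete Beta
    function (the regularized Beta CDF), via trapezoidal integration.
    Assumes alpha, beta >= 1 so the integrand is bounded at 0 and 1."""
    t = np.linspace(0.0, 1.0, n + 1)
    pdf = t ** (alpha - 1) * (1 - t) ** (beta - 1)
    cdf = np.concatenate(([0.0], np.cumsum((pdf[1:] + pdf[:-1]) / 2)))
    cdf /= cdf[-1]
    return np.interp(np.clip(img, 0.0, 1.0), t, cdf)

ramp = np.linspace(0.0, 1.0, 5)        # synthetic gray-level ramp
y = incomplete_beta_transform(ramp)
```

In the paper's setting, an optimizer such as CS-PSO would search over (α, β) to maximize the fitness criterion; here they are illustrative constants.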

  5. A new method for fusion, denoising and enhancement of x-ray images retrieved from Talbot-Lau grating interferometry.

    PubMed

    Scholkmann, Felix; Revol, Vincent; Kaufmann, Rolf; Baronowski, Heidrun; Kottler, Christian

    2014-03-21

    This paper introduces a new image denoising, fusion and enhancement framework for combining and optimal visualization of x-ray attenuation contrast (AC), differential phase contrast (DPC) and dark-field contrast (DFC) images retrieved from x-ray Talbot-Lau grating interferometry. The new image fusion framework comprises three steps: (i) denoising each input image (AC, DPC and DFC) through adaptive Wiener filtering, (ii) performing a two-step image fusion process based on the shift-invariant wavelet transform, i.e. first fusing the AC with the DPC image and then fusing the resulting image with the DFC image, and finally (iii) enhancing the fused image to obtain a final image using adaptive histogram equalization, adaptive sharpening and contrast optimization. Application examples are presented for two biological objects (a human tooth and a cherry) and the proposed method is compared to two recently published AC/DPC/DFC image processing techniques. In conclusion, the new framework for the processing of AC, DPC and DFC allows the most relevant features of all three images to be combined in one image while reducing the noise and enhancing adaptively the relevant image features. The newly developed framework may be used in technical and medical applications.

  6. An adaptive image enhancement technique by combining cuckoo search and particle swarm optimization algorithm.

    PubMed

    Ye, Zhiwei; Wang, Mingwei; Hu, Zhengbing; Liu, Wei

    2015-01-01

Image enhancement is an important procedure of image processing and analysis. This paper presents a new technique that uses a modified measure and a blend of cuckoo search and particle swarm optimization (CS-PSO) to enhance low-contrast images adaptively. Contrast enhancement is obtained by global transformation of the input intensities; the method employs the incomplete Beta function as the transformation function and a novel criterion for measuring image quality that considers three factors: threshold, entropy value, and gray-level probability density of the image. The enhancement process is a nonlinear optimization problem with several constraints. CS-PSO is utilized to maximize the objective fitness criterion in order to enhance the contrast and detail in an image by adapting the parameters of a novel extension to a local enhancement technique. The performance of the proposed method has been compared with other existing techniques such as linear contrast stretching, histogram equalization, and evolutionary-computing-based image enhancement methods like the backtracking search algorithm, differential search algorithm, genetic algorithm, and particle swarm optimization in terms of processing time and image quality. Experimental results demonstrate that the proposed method is robust and adaptive and exhibits better performance than the other methods considered in the paper.

  7. Automatic image equalization and contrast enhancement using Gaussian mixture modeling.

    PubMed

    Celik, Turgay; Tjahjadi, Tardi

    2012-01-01

    In this paper, we propose an adaptive image equalization algorithm that automatically enhances the contrast in an input image. The algorithm uses the Gaussian mixture model to model the image gray-level distribution, and the intersection points of the Gaussian components in the model are used to partition the dynamic range of the image into input gray-level intervals. The contrast equalized image is generated by transforming the pixels' gray levels in each input interval to the appropriate output gray-level interval according to the dominant Gaussian component and the cumulative distribution function of the input interval. To take account of the hypothesis that homogeneous regions in the image represent homogeneous silences (or set of Gaussian components) in the image histogram, the Gaussian components with small variances are weighted with smaller values than the Gaussian components with larger variances, and the gray-level distribution is also used to weight the components in the mapping of the input interval to the output interval. Experimental results show that the proposed algorithm produces better or comparable enhanced images than several state-of-the-art algorithms. Unlike the other algorithms, the proposed algorithm is free of parameter setting for a given dynamic range of the enhanced image and can be applied to a wide range of image types.

  8. A long-term target detection approach in infrared image sequence

    NASA Astrophysics Data System (ADS)

    Li, Hang; Zhang, Qi; Li, Yuanyuan; Wang, Liqiang

    2015-12-01

An automatic target detection method for long-term infrared (IR) image sequences from a moving platform is proposed. Firstly, based on non-linear histogram equalization, target candidates are coarse-to-fine segmented using two self-adaptive thresholds generated in the intensity space. Then the real target is captured via two different selection approaches. At the beginning of the image sequence, the genuine target, which has little texture, is discriminated from other candidates by using a contrast-based confidence measure. On the other hand, when the target becomes larger, we apply an online EM method to iteratively estimate and update the distributions of the target's size and position based on the prior detection results, and then recognize the genuine one which satisfies both the size and position constraints. Experimental results demonstrate that the presented method is accurate, robust and efficient.

  9. Histogram analysis for smartphone-based rapid hematocrit determination

    PubMed Central

    Jalal, Uddin M.; Kim, Sang C.; Shim, Joon S.

    2017-01-01

    A novel and rapid analysis technique using histogram has been proposed for the colorimetric quantification of blood hematocrits. A smartphone-based “Histogram” app for the detection of hematocrits has been developed integrating the smartphone embedded camera with a microfluidic chip via a custom-made optical platform. The developed histogram analysis shows its effectiveness in the automatic detection of sample channel including auto-calibration and can analyze the single-channel as well as multi-channel images. Furthermore, the analyzing method is advantageous to the quantification of blood-hematocrit both in the equal and varying optical conditions. The rapid determination of blood hematocrits carries enormous information regarding physiological disorders, and the use of such reproducible, cost-effective, and standard techniques may effectively help with the diagnosis and prevention of a number of human diseases. PMID:28717569

  10. Wavelength-adaptive dehazing using histogram merging-based classification for UAV images.

    PubMed

    Yoon, Inhye; Jeong, Seokhwa; Jeong, Jaeheon; Seo, Doochun; Paik, Joonki

    2015-03-19

    Since incoming light to an unmanned aerial vehicle (UAV) platform can be scattered by haze and dust in the atmosphere, the acquired image loses the original color and brightness of the subject. Enhancement of hazy images is an important task in improving the visibility of various UAV images. This paper presents a spatially-adaptive dehazing algorithm that merges color histograms with consideration of the wavelength-dependent atmospheric turbidity. Based on the wavelength-adaptive hazy image acquisition model, the proposed dehazing algorithm consists of three steps: (i) image segmentation based on geometric classes; (ii) generation of the context-adaptive transmission map; and (iii) intensity transformation for enhancing a hazy UAV image. The major contribution of the research is a novel hazy UAV image degradation model by considering the wavelength of light sources. In addition, the proposed transmission map provides a theoretical basis to differentiate visually important regions from others based on the turbidity and merged classification results.

  11. Universal and adapted vocabularies for generic visual categorization.

    PubMed

    Perronnin, Florent

    2008-07-01

    Generic Visual Categorization (GVC) is the pattern classification problem which consists in assigning labels to an image based on its semantic content. This is a challenging task as one has to deal with inherent object/scene variations as well as changes in viewpoint, lighting and occlusion. Several state-of-the-art GVC systems use a vocabulary of visual terms to characterize images with a histogram of visual word counts. We propose a novel practical approach to GVC based on a universal vocabulary, which describes the content of all the considered classes of images, and class vocabularies obtained through the adaptation of the universal vocabulary using class-specific data. The main novelty is that an image is characterized by a set of histograms - one per class - where each histogram describes whether the image content is best modeled by the universal vocabulary or the corresponding class vocabulary. This framework is applied to two types of local image features: low-level descriptors such as the popular SIFT and high-level histograms of word co-occurrences in a spatial neighborhood. It is shown experimentally on two challenging datasets (an in-house database of 19 categories and the PASCAL VOC 2006 dataset) that the proposed approach exhibits state-of-the-art performance at a modest computational cost.
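The histogram-of-visual-word-counts representation mentioned above can be sketched as follows. The toy 2-D "descriptors" and 3-word vocabulary are stand-ins for real SIFT descriptors and a learned (universal or class-adapted) vocabulary:

```python
import numpy as np

def bovw_histogram(descriptors, vocabulary):
    """Assign each local descriptor to its nearest visual word (squared
    Euclidean distance) and count occurrences, yielding a normalized
    word-count histogram characterizing the image."""
    d2 = ((descriptors[:, None, :] - vocabulary[None, :, :]) ** 2).sum(-1)
    words = d2.argmin(axis=1)
    hist = np.bincount(words, minlength=len(vocabulary)).astype(float)
    return hist / hist.sum()

# Toy 2-D "descriptors" and a 3-word vocabulary.
vocab = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
desc = np.array([[0.1, 0.0], [0.9, 0.1], [0.05, 0.05], [0.1, 0.9]])
h = bovw_histogram(desc, vocab)
```

In the adapted-vocabulary scheme of the paper, one such histogram would be computed per class vocabulary and the set of histograms compared.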

  12. Automated Segmentation of Light-Sheet Fluorescent Imaging to Characterize Experimental Doxorubicin-Induced Cardiac Injury and Repair.

    PubMed

    Packard, René R Sevag; Baek, Kyung In; Beebe, Tyler; Jen, Nelson; Ding, Yichen; Shi, Feng; Fei, Peng; Kang, Bong Jin; Chen, Po-Heng; Gau, Jonathan; Chen, Michael; Tang, Jonathan Y; Shih, Yu-Huan; Ding, Yonghe; Li, Debiao; Xu, Xiaolei; Hsiai, Tzung K

    2017-08-17

    This study sought to develop an automated segmentation approach based on histogram analysis of raw axial images acquired by light-sheet fluorescent imaging (LSFI) to establish rapid reconstruction of the 3-D zebrafish cardiac architecture in response to doxorubicin-induced injury and repair. Input images underwent a 4-step automated image segmentation process consisting of stationary noise removal, histogram equalization, adaptive thresholding, and image fusion followed by 3-D reconstruction. We applied this method to 3-month-old zebrafish injected intraperitoneally with doxorubicin followed by LSFI at 3, 30, and 60 days post-injection. We observed an initial decrease in myocardial and endocardial cavity volumes at day 3, followed by ventricular remodeling at day 30, and recovery at day 60 (P < 0.05, n = 7-19). Doxorubicin-injected fish developed ventricular diastolic dysfunction and worsening global cardiac function, evidenced by elevated E/A ratios and myocardial performance indexes quantified by pulsed-wave Doppler ultrasound at day 30, followed by normalization at day 60 (P < 0.05, n = 9-20). Treatment with the γ-secretase inhibitor DAPT, to inhibit cleavage and release of Notch Intracellular Domain (NICD), blocked cardiac architectural regeneration and restoration of ventricular function at day 60 (P < 0.05, n = 6-14). Our approach provides a high-throughput model with translational implications for drug discovery and genetic modifiers of chemotherapy-induced cardiomyopathy.
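
    The adaptive-thresholding step of the 4-step pipeline above can be sketched with a local-mean rule; the block size and the integral-image formulation below are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def adaptive_threshold(img, block=3, offset=0.0):
    """Threshold each pixel against the mean of its (block x block)
    neighborhood, computed efficiently with an integral image."""
    img = np.asarray(img, dtype=float)
    pad = block // 2
    padded = np.pad(img, pad, mode='edge')
    # integral image with a leading zero row/column
    ii = np.zeros((padded.shape[0] + 1, padded.shape[1] + 1))
    ii[1:, 1:] = padded.cumsum(0).cumsum(1)
    h, w = img.shape
    # window sum over padded[i:i+block, j:j+block] for every pixel (i, j)
    s = (ii[block:block + h, block:block + w]
         - ii[:h, block:block + w]
         - ii[block:block + h, :w]
         + ii[:h, :w])
    local_mean = s / (block * block)
    return (img > local_mean + offset).astype(np.uint8)
```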

  13. Automated Detection of Diabetic Retinopathy using Deep Learning.

    PubMed

    Lam, Carson; Yi, Darvin; Guo, Margaret; Lindsey, Tony

    2018-01-01

    Diabetic retinopathy is a leading cause of blindness among working-age adults. Early detection of this condition is critical for good prognosis. In this paper, we demonstrate the use of convolutional neural networks (CNNs) on color fundus images for the recognition task of diabetic retinopathy staging. Our network models achieved test metric performance comparable to baseline literature results, with validation sensitivity of 95%. We additionally explored multinomial classification models, and demonstrate that errors primarily occur in the misclassification of mild disease as normal due to the CNNs' inability to detect subtle disease features. We discovered that preprocessing with contrast limited adaptive histogram equalization and ensuring dataset fidelity by expert verification of class labels improves recognition of subtle features. Transfer learning on pretrained GoogLeNet and AlexNet models from ImageNet improved peak test set accuracies to 74.5%, 68.8%, and 57.2% on 2-ary, 3-ary, and 4-ary classification models, respectively.

  14. Retinex based low-light image enhancement using guided filtering and variational framework

    NASA Astrophysics Data System (ADS)

    Zhang, Shi; Tang, Gui-jin; Liu, Xiao-hua; Luo, Su-huai; Wang, Da-dong

    2018-03-01

    A new image enhancement algorithm based on Retinex theory is proposed to solve the problem of poor visual quality of images captured in low-light conditions. First, an image is converted from the RGB color space to the HSV color space to get the V channel. Next, the illuminations are respectively estimated by the guided filtering and the variational framework on the V channel, and combined into a new illumination by average gradient. The new reflectance is calculated using the V channel and the new illumination. Then a new V channel, obtained by multiplying the new illumination and reflectance, is processed with contrast limited adaptive histogram equalization (CLAHE). Finally, the new image in HSV space is converted back to RGB space to obtain the enhanced image. Experimental results show that the proposed method has better subjective quality and objective quality than existing methods.
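
    The CLAHE step applied to the V channel can be sketched, in a simplified global form, as clip-limited histogram equalization. Real CLAHE additionally tiles the image and bilinearly interpolates the per-tile mappings, and the clip limit of 0.01 below is an illustrative default, not the paper's setting.

```python
import numpy as np

def clip_limited_equalize(v, clip_limit=0.01, nbins=256):
    """Histogram-equalize a [0, 1] V channel with a clipped histogram:
    mass above the clip limit is redistributed uniformly, which bounds
    the slope of the mapping and so limits noise amplification."""
    v = np.asarray(v, dtype=float)
    hist, edges = np.histogram(v, bins=nbins, range=(0.0, 1.0))
    hist = hist.astype(float) / v.size
    excess = np.clip(hist - clip_limit, 0, None).sum()
    hist = np.minimum(hist, clip_limit) + excess / nbins
    cdf = np.cumsum(hist)
    cdf = cdf / cdf[-1]                       # normalize to end at 1
    centers = 0.5 * (edges[:-1] + edges[1:])
    return np.interp(v, centers, cdf)         # monotone intensity mapping
```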

  15. Wavelength-Adaptive Dehazing Using Histogram Merging-Based Classification for UAV Images

    PubMed Central

    Yoon, Inhye; Jeong, Seokhwa; Jeong, Jaeheon; Seo, Doochun; Paik, Joonki

    2015-01-01

    Since incoming light to an unmanned aerial vehicle (UAV) platform can be scattered by haze and dust in the atmosphere, the acquired image loses the original color and brightness of the subject. Enhancement of hazy images is an important task in improving the visibility of various UAV images. This paper presents a spatially-adaptive dehazing algorithm that merges color histograms with consideration of the wavelength-dependent atmospheric turbidity. Based on the wavelength-adaptive hazy image acquisition model, the proposed dehazing algorithm consists of three steps: (i) image segmentation based on geometric classes; (ii) generation of the context-adaptive transmission map; and (iii) intensity transformation for enhancing a hazy UAV image. The major contribution of the research is a novel hazy UAV image degradation model by considering the wavelength of light sources. In addition, the proposed transmission map provides a theoretical basis to differentiate visually important regions from others based on the turbidity and merged classification results. PMID:25808767

  16. A Comparison of the Multiscale Retinex With Other Image Enhancement Techniques

    NASA Technical Reports Server (NTRS)

    Rahman, Zia-Ur; Woodell, Glenn A.; Jobson, Daniel J.

    1997-01-01

    The multiscale retinex with color restoration (MSRCR) has shown itself to be a very versatile automatic image enhancement algorithm that simultaneously provides dynamic range compression, color constancy, and color rendition. A number of algorithms exist that provide one or more of these features, but not all. In this paper we compare the performance of the MSRCR with techniques that are widely used for image enhancement. Specifically, we compare the MSRCR with color adjustment methods such as gamma correction and gain/offset application, histogram modification techniques such as histogram equalization and manual histogram adjustment, and other more powerful techniques such as homomorphic filtering and 'burning and dodging'. The comparison is carried out by testing the suite of image enhancement methods on a set of diverse images. We find that though some of these techniques work well for some of these images, only the MSRCR performs universally well on the test set.

  17. SU-E-J-16: Automatic Image Contrast Enhancement Based On Automatic Parameter Optimization for Radiation Therapy Setup Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qiu, J; Washington University in St Louis, St Louis, MO; Li, H. Harold

    Purpose: In RT patient setup 2D images, tissues often cannot be seen well due to the lack of image contrast. Contrast enhancement features provided by image reviewing software, e.g., Mosaiq and ARIA, require manual selection of the image processing filters and parameters, and are thus inefficient and cannot be automated. In this work, we developed a novel method to automatically enhance the 2D RT image contrast to allow automatic verification of patient daily setups as a prerequisite step of automatic patient safety assurance. Methods: The new method is based on contrast limited adaptive histogram equalization (CLAHE) and high-pass filtering algorithms. The most important innovation is to automatically select the optimal parameters by optimizing the image contrast. The image processing procedure includes the following steps: 1) background and noise removal, 2) high-pass filtering by subtracting the Gaussian-smoothed result, and 3) histogram equalization using the CLAHE algorithm. Three parameters were determined through an iterative optimization based on the interior-point constrained optimization algorithm: the Gaussian smoothing weighting factor, and the CLAHE algorithm block size and clip limit parameters. The goal of the optimization is to maximize the entropy of the processed result. Results: A total of 42 RT images were processed. The results were visually evaluated by RT physicians and physicists. About 48% of the images processed by the new method were ranked as excellent. In comparison, only 29% and 18% of the images processed by the basic CLAHE algorithm and by basic window-level adjustment, respectively, were ranked as excellent. Conclusion: This new image contrast enhancement method is robust and automatic, and is able to significantly outperform the basic CLAHE algorithm and the manual window-level adjustment process that are currently used in clinical 2D image review software tools.
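
    The entropy objective used in this record's parameter optimization can be sketched as follows; the grid search over a single hypothetical gamma parameter is a simplified stand-in for the paper's interior-point optimization over the Gaussian weighting factor and the CLAHE block size and clip limit.

```python
import numpy as np

def image_entropy(img, nbins=256):
    """Shannon entropy (bits) of the gray-level histogram: the quantity
    maximized when searching over enhancement parameters."""
    hist, _ = np.histogram(img, bins=nbins, range=(0.0, 1.0))
    p = hist.astype(float) / hist.sum()
    p = p[p > 0]                              # convention: 0 * log 0 = 0
    return float(-(p * np.log2(p)).sum())

def best_gamma(img, candidates=(0.5, 1.0, 2.0)):
    """Toy grid search: pick the gamma whose output has maximal entropy."""
    return max(candidates, key=lambda g: image_entropy(np.clip(img, 0, 1) ** g))
```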

  18. Contrast-dependent saturation adjustment for outdoor image enhancement.

    PubMed

    Wang, Shuhang; Cho, Woon; Jang, Jinbeum; Abidi, Mongi A; Paik, Joonki

    2017-01-01

    Outdoor images captured in bad-weather conditions usually have poor intensity contrast and color saturation, since the light arriving at the camera is severely scattered or attenuated. The task of improving image quality in poor conditions remains a challenge. Existing methods of image quality improvement are usually effective for a small group of images but often fail to produce satisfactory results for a broader variety of images. In this paper, we propose an image enhancement method for outdoor images that combines content-adaptive contrast improvement with contrast-dependent saturation adjustment. The main contribution of this work is twofold: (1) we propose content-adaptive histogram equalization based on the human visual system to improve the intensity contrast; and (2) we introduce a simple yet effective prior for adjusting the color saturation depending on the intensity contrast. The proposed method is tested with different kinds of images and compared with eight state-of-the-art methods: four enhancement methods and four haze removal methods. Experimental results show the proposed method can more effectively improve the visibility and preserve the naturalness of the images, as opposed to the compared methods.

  19. TU-H-CAMPUS-JeP3-02: Automated Dose Accumulation and Dose Accuracy Assessment for Online Or Offline Adaptive Replanning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, G; Ahunbay, E; Li, X

    Purpose: With the introduction of high-quality treatment imaging during radiation therapy (RT) delivery, e.g., MR-Linac, online or offline adaptive replanning becomes appealing. Dose accumulation of delivered fractions, a prerequisite for adaptive replanning, can be cumbersome and inaccurate. The purpose of this work is to develop an automated process to accumulate daily doses and to assess the dose accumulation accuracy voxel-by-voxel for adaptive replanning. Methods: The process includes the following main steps: 1) reconstructing daily dose for each delivered fraction with a treatment planning system (Monaco, Elekta) based on the daily images, using the machine delivery log file and considering patient repositioning if applicable; 2) overlaying the daily dose to the planning image based on deformable image registration (DIR) (ADMIRE, Elekta); 3) assessing voxel dose deformation accuracy based on the deformation field using predetermined criteria; and 4) outputting accumulated dose and dose-accuracy volume histograms and parameters. Daily CTs acquired using a CT-on-rails during routine CT-guided RT for sample patients with head and neck and prostate cancers were used to test the process. Results: Daily and accumulated doses (dose-volume histograms, etc.) along with their accuracies (dose-accuracy volume histogram) can be robustly generated using the proposed process. The test data for a head and neck cancer case show that the gross tumor volume decreased by 20% towards the end of the treatment course, and the parotid gland mean dose increased by 10%. Such information would trigger adaptive replanning for the subsequent fractions. The voxel-based accuracy in the accumulated dose showed that errors in accumulated dose near rigid structures were small. Conclusion: A procedure, as well as the necessary tools, to automatically accumulate daily dose and assess dose accumulation accuracy is developed and is useful for adaptive replanning.
    Partially supported by Elekta, Inc.

  20. Reducing charging effects in scanning electron microscope images by Rayleigh contrast stretching method (RCS).

    PubMed

    Wan Ismail, W Z; Sim, K S; Tso, C P; Ting, H Y

    2011-01-01

    To reduce undesirable charging effects in scanning electron microscope images, Rayleigh contrast stretching is developed and employed. First, re-scaling is performed on the input image histograms with Rayleigh algorithm. Then, contrast stretching or contrast adjustment is implemented to improve the images while reducing the contrast charging artifacts. This technique has been compared to some existing histogram equalization (HE) extension techniques: recursive sub-image HE, contrast stretching dynamic HE, multipeak HE and recursive mean separate HE. Other post processing methods, such as wavelet approach, spatial filtering, and exponential contrast stretching, are compared as well. Overall, the proposed method produces better image compensation in reducing charging artifacts. Copyright © 2011 Wiley Periodicals, Inc.
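
    A minimal sketch of Rayleigh re-scaling, assuming the common formulation that pushes each pixel's empirical CDF value through the inverse Rayleigh CDF, x = sigma * sqrt(-2 ln(1 - F)). The sigma value and the final [0, 1] rescaling are illustrative choices, not taken from the paper, and ties are broken arbitrarily.

```python
import numpy as np

def rayleigh_stretch(img, sigma=0.4):
    """Remap gray levels so their distribution follows a Rayleigh law,
    by composing the empirical CDF with the inverse Rayleigh CDF."""
    flat = np.asarray(img, dtype=float).ravel()
    order = flat.argsort()
    ranks = np.empty_like(order)
    ranks[order] = np.arange(flat.size)
    F = (ranks + 1) / (flat.size + 1)          # empirical CDF in (0, 1)
    out = sigma * np.sqrt(-2.0 * np.log1p(-F))  # inverse Rayleigh CDF
    out = out.reshape(np.asarray(img).shape)
    return np.clip(out / out.max(), 0.0, 1.0)   # rescale to [0, 1]
```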

  1. Multivariate statistical model for 3D image segmentation with application to medical images.

    PubMed

    John, Nigel M; Kabuka, Mansur R; Ibrahim, Mohamed O

    2003-12-01

    In this article we describe a statistical model that was developed to segment brain magnetic resonance images. The statistical segmentation algorithm was applied after a pre-processing stage involving the use of a 3D anisotropic filter along with histogram equalization techniques. The segmentation algorithm makes use of prior knowledge and a probability-based multivariate model designed to semi-automate the process of segmentation. The algorithm was applied to images obtained from the Center for Morphometric Analysis at Massachusetts General Hospital as part of the Internet Brain Segmentation Repository (IBSR). The developed algorithm showed improved accuracy over the k-means, adaptive Maximum A Posteriori Probability (MAP), biased MAP, and other algorithms. Experimental results showing the segmentation and comparisons with other algorithms are provided. Results are based on an overlap criterion against expertly segmented images from the IBSR. The algorithm produced average results of approximately 80% overlap with the expertly segmented images (compared with 85% for manual segmentation and 55% for other algorithms).
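
    The overlap criterion against expert segmentations can be read as a Jaccard-style ratio; the sketch below is one plausible reading of that measure (the IBSR's exact averaging may differ).

```python
import numpy as np

def overlap_ratio(seg, ref):
    """Jaccard overlap |A ∩ B| / |A ∪ B| between a binary segmentation
    and an expert reference mask."""
    seg = np.asarray(seg, dtype=bool)
    ref = np.asarray(ref, dtype=bool)
    union = np.logical_or(seg, ref).sum()
    if union == 0:
        return 1.0                             # two empty masks agree
    return np.logical_and(seg, ref).sum() / union
```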

  2. Boundary segmentation for fluorescence microscopy using steerable filters

    NASA Astrophysics Data System (ADS)

    Ho, David Joon; Salama, Paul; Dunn, Kenneth W.; Delp, Edward J.

    2017-02-01

    Fluorescence microscopy is used to image multiple subcellular structures in living cells which are not readily observed using conventional optical microscopy. Moreover, two-photon microscopy is widely used to image structures deeper in tissue. Recent advancement in fluorescence microscopy has enabled the generation of large data sets of images at different depths, times, and spectral channels. Thus, automatic object segmentation is necessary since manual segmentation would be inefficient and biased. However, automatic segmentation is still a challenging problem, as regions of interest may lack well-defined boundaries and may have non-uniform pixel intensities. This paper describes a method for segmenting tubular structures in fluorescence microscopy images of rat kidney and liver samples using adaptive histogram equalization, foreground/background segmentation, steerable filters to capture directional tendencies, and connected-component analysis. The results from several data sets demonstrate that our method can segment tubular boundaries successfully. Moreover, our method has better performance when compared to other popular image segmentation methods when using ground truth data obtained via manual segmentation.

  3. Contact-free palm-vein recognition based on local invariant features.

    PubMed

    Kang, Wenxiong; Liu, Yang; Wu, Qiuxia; Yue, Xishun

    2014-01-01

    Contact-free palm-vein recognition is one of the most challenging and promising areas in hand biometrics. In view of the existing problems in contact-free palm-vein imaging, including projection transformation, uneven illumination and difficulty in extracting exact ROIs, this paper presents a novel recognition approach for contact-free palm-vein recognition that performs feature extraction and matching on all vein textures distributed over the palm surface, including finger veins and palm veins, to minimize the loss of feature information. First, a hierarchical enhancement algorithm, which combines a DOG filter and histogram equalization, is adopted to alleviate uneven illumination and to highlight vein textures. Second, RootSIFT, a more stable local invariant feature extraction method in comparison to SIFT, is adopted to overcome the projection transformation in contact-free mode. Subsequently, a novel hierarchical mismatching removal algorithm based on neighborhood searching and LBP histograms is adopted to improve the accuracy of feature matching. Finally, we rigorously evaluated the proposed approach using two different databases and obtained 0.996% and 3.112% Equal Error Rates (EERs), respectively, which demonstrate the effectiveness of the proposed approach.

  4. Contact-Free Palm-Vein Recognition Based on Local Invariant Features

    PubMed Central

    Kang, Wenxiong; Liu, Yang; Wu, Qiuxia; Yue, Xishun

    2014-01-01

    Contact-free palm-vein recognition is one of the most challenging and promising areas in hand biometrics. In view of the existing problems in contact-free palm-vein imaging, including projection transformation, uneven illumination and difficulty in extracting exact ROIs, this paper presents a novel recognition approach for contact-free palm-vein recognition that performs feature extraction and matching on all vein textures distributed over the palm surface, including finger veins and palm veins, to minimize the loss of feature information. First, a hierarchical enhancement algorithm, which combines a DOG filter and histogram equalization, is adopted to alleviate uneven illumination and to highlight vein textures. Second, RootSIFT, a more stable local invariant feature extraction method in comparison to SIFT, is adopted to overcome the projection transformation in contact-free mode. Subsequently, a novel hierarchical mismatching removal algorithm based on neighborhood searching and LBP histograms is adopted to improve the accuracy of feature matching. Finally, we rigorously evaluated the proposed approach using two different databases and obtained 0.996% and 3.112% Equal Error Rates (EERs), respectively, which demonstrate the effectiveness of the proposed approach. PMID:24866176

  5. Blind identification of image manipulation type using mixed statistical moments

    NASA Astrophysics Data System (ADS)

    Jeong, Bo Gyu; Moon, Yong Ho; Eom, Il Kyu

    2015-01-01

    We present a blind identification of image manipulation types such as blurring, scaling, sharpening, and histogram equalization. Motivated by the fact that image manipulations can change the frequency characteristics of an image, we introduce three types of feature vectors composed of statistical moments. The proposed statistical moments are generated from separated wavelet histograms, the characteristic functions of the wavelet variance, and the characteristic functions of the spatial image. Our method can solve the n-class classification problem. Through experimental simulations, we demonstrate that our proposed method can achieve high performance in manipulation type detection. The average rate of the correctly identified manipulation types is as high as 99.22%, using 10,800 test images and six manipulation types including the authentic image.

  6. Low-resolution expression recognition based on central oblique average CS-LBP with adaptive threshold

    NASA Astrophysics Data System (ADS)

    Han, Sheng; Xi, Shi-qiong; Geng, Wei-dong

    2017-11-01

    To solve the problem of the low recognition rates of traditional feature extraction operators on low-resolution images, a novel expression recognition algorithm is proposed, named central oblique average center-symmetric local binary pattern (CS-LBP) with adaptive threshold (ATCS-LBP). Firstly, the features of face images are extracted by the proposed operator after preprocessing. Secondly, the obtained feature image is divided into blocks. Thirdly, the histogram of each block is computed independently and all histograms are concatenated to create a final feature vector. Finally, expression classification is achieved by using a support vector machine (SVM) classifier. Experimental results on the Japanese female facial expression (JAFFE) database show that the proposed algorithm can achieve a recognition rate of 81.9% when the resolution is as low as 16×16, which is much better than that of traditional feature extraction operators.
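
    A fixed-threshold CS-LBP baseline can be sketched as below; the record's ATCS-LBP adds an adaptive threshold and central oblique averaging, neither of which is reproduced here, only the standard center-symmetric comparison of the four opposite neighbor pairs.

```python
import numpy as np

def cs_lbp(img, threshold=0.0):
    """Center-symmetric LBP: each interior pixel gets a 4-bit code,
    one bit per center-symmetric neighbor pair (N-S, NE-SW, E-W, SE-NW)."""
    img = np.asarray(img, dtype=float)
    # the 8 neighbors of every interior pixel, as shifted views,
    # in circular order starting from 'north'
    n = [img[:-2, 1:-1], img[:-2, 2:], img[1:-1, 2:], img[2:, 2:],
         img[2:, 1:-1], img[2:, :-2], img[1:-1, :-2], img[:-2, :-2]]
    code = np.zeros(n[0].shape, dtype=np.uint8)
    for i in range(4):                         # compare opposite pairs
        code |= ((n[i] - n[i + 4]) > threshold).astype(np.uint8) << i
    return code

def cs_lbp_histogram(img):
    """16-bin histogram of codes: the per-block feature of the abstract."""
    return np.bincount(cs_lbp(img).ravel(), minlength=16)
```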

  7. Adaptive Processing of RADARSAT-1 Fine Mode Data: Ship Parameter Estimation

    DTIC Science & Technology

    2007-03-01

    (The extracted text for this record contains only figure-caption fragments.) Figure 1: Length histogram of analyzed ships according to the AIS data. Figure 2: Aspect angle histogram of analyzed ships. Figure 60: D7S1, the 63 m long freighter "Germa" is one of the smallest ships in the data set.

  8. Adaptive Monte Carlo methods

    NASA Astrophysics Data System (ADS)

    Fasnacht, Marc

    We develop adaptive Monte Carlo methods for the calculation of the free energy as a function of a parameter of interest. The methods presented are particularly well-suited for systems with complex energy landscapes, where standard sampling techniques have difficulties. The Adaptive Histogram Method uses a biasing potential derived from histograms recorded during the simulation to achieve uniform sampling in the parameter of interest. The Adaptive Integration Method directly calculates an estimate of the free energy from the average derivative of the Hamiltonian with respect to the parameter of interest and uses it as a biasing potential. We compare both methods to a state-of-the-art method, and demonstrate that they compare favorably for the calculation of potentials of mean force of dense Lennard-Jones fluids. We use the Adaptive Integration Method to calculate accurate potentials of mean force for different types of simple particles in a Lennard-Jones fluid. Our approach allows us to separate the contributions of the solvent to the potential of mean force from the effect of the direct interaction between the particles. With the contributions of the solvent determined, we can find the potential of mean force directly for any other direct interaction without additional simulations. We also test the accuracy of the Adaptive Integration Method on a thermodynamic cycle, which allows us to perform a consistency check between potentials of mean force and chemical potentials calculated using the Adaptive Integration Method. The results demonstrate a high degree of consistency of the method.

  9. Medical image classification using spatial adjacent histogram based on adaptive local binary patterns.

    PubMed

    Liu, Dong; Wang, Shengsheng; Huang, Dezhi; Deng, Gang; Zeng, Fantao; Chen, Huiling

    2016-05-01

    Medical image recognition is an important task in both computer vision and computational biology. In the field of medical image classification, representing an image based on local binary patterns (LBP) descriptor has become popular. However, most existing LBP-based methods encode the binary patterns in a fixed neighborhood radius and ignore the spatial relationships among local patterns. The ignoring of the spatial relationships in the LBP will cause a poor performance in the process of capturing discriminative features for complex samples, such as medical images obtained by microscope. To address this problem, in this paper we propose a novel method to improve local binary patterns by assigning an adaptive neighborhood radius for each pixel. Based on these adaptive local binary patterns, we further propose a spatial adjacent histogram strategy to encode the micro-structures for image representation. An extensive set of evaluations are performed on four medical datasets which show that the proposed method significantly improves standard LBP and compares favorably with several other prevailing approaches. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Automatic dynamic range adjustment for ultrasound B-mode imaging.

    PubMed

    Lee, Yeonhwa; Kang, Jinbum; Yoo, Yangmo

    2015-02-01

    In medical ultrasound imaging, dynamic range (DR) is defined as the difference between the maximum and minimum values of the displayed signal, and it is one of the most essential parameters determining image quality. Typically, DR is given a fixed value and adjusted manually by operators, which leads to low clinical productivity and high user dependency. Furthermore, in 3D ultrasound imaging, DR values cannot be adjusted during 3D data acquisition. A histogram matching method, which equalizes the histogram of an input image based on that of a reference image, can be applied to determine the DR value. However, it can lead to an over-contrasted image. In this paper, a new Automatic Dynamic Range Adjustment (ADRA) method is presented that adaptively adjusts the DR value by manipulating input images to be similar to a reference image. The proposed ADRA method uses the distance ratio between the log average and each extreme value of a reference image. To evaluate the performance of the ADRA method, the similarity between the reference and input images was measured by computing a correlation coefficient (CC). In in vivo experiments, applying the ADRA method increased the CC values from 0.6872 to 0.9870 and from 0.9274 to 0.9939 for kidney and liver data, respectively, compared to the fixed DR case. In addition, the proposed ADRA method was shown to outperform the histogram matching method on in vivo liver and kidney data. When using 3D abdominal data with 70 frames, while the CC value from the ADRA method was only slightly increased (i.e., 0.6%), the proposed method showed improved image quality in the c-plane compared to its fixed counterpart, which suffered from a shadow artifact. These results indicate that the proposed method can enhance image quality in 2D and 3D ultrasound B-mode imaging by improving the similarity between the reference and input images while eliminating unnecessary manual interaction by the user.
Copyright © 2014 Elsevier B.V. All rights reserved.
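
    The histogram-matching baseline that the ADRA method is compared against can be sketched via CDF composition; this is a generic implementation, not the paper's code.

```python
import numpy as np

def match_histogram(src, ref):
    """Map src gray levels so their empirical distribution matches
    ref's, by composing src's CDF with the inverse of ref's CDF."""
    src = np.asarray(src, dtype=float)
    ref = np.asarray(ref, dtype=float)
    s_vals, s_counts = np.unique(src.ravel(), return_counts=True)
    r_vals, r_counts = np.unique(ref.ravel(), return_counts=True)
    s_cdf = np.cumsum(s_counts) / src.size
    r_cdf = np.cumsum(r_counts) / ref.size
    # invert ref's CDF by linear interpolation
    matched_vals = np.interp(s_cdf, r_cdf, r_vals)
    return np.interp(src, s_vals, matched_vals)
```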

  11. Labeling Defects in CT Images of Hardwood Logs with Species-Dependent and Species-Independent Classifiers

    Treesearch

    Pei Li; Jing He; A. Lynn Abbott; Daniel L. Schmoldt

    1996-01-01

    This paper analyses computed tomography (CT) images of hardwood logs, with the goal of locating internal defects. The ability to detect and identify defects automatically is a critical component of efficiency improvements for future sawmills and veneer mills. This paper describes an approach in which 1) histogram equalization is used during preprocessing to normalize...

  12. Is there a preference for linearity when viewing natural images?

    NASA Astrophysics Data System (ADS)

    Kane, David; Bertalmío, Marcelo

    2015-01-01

    The system gamma of the imaging pipeline, defined as the product of the encoding and decoding gammas, is typically greater than one and is stronger for images viewed with a dark background (e.g. cinema) than for those viewed in lighter conditions (e.g. office displays) [1-3]. However, for high dynamic range (HDR) images reproduced on a low dynamic range (LDR) monitor, subjects often prefer a system gamma of less than one [4], presumably reflecting the greater need for histogram equalization in HDR images. In this study we ask subjects to rate the perceived quality of images presented on an LDR monitor using various levels of system gamma. We reveal that the optimal system gamma is below one for images with an HDR and approaches or exceeds one for images with an LDR. Additionally, the highest quality scores occur for images where a system gamma of one is optimal, suggesting a preference for linearity (where possible). We find that subjective image quality scores can be predicted by computing the degree of histogram equalization of the lightness distribution. Accordingly, an optimal, image-dependent system gamma can be computed that maximizes perceived image quality.
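
    The quality predictor in this record is "the degree of histogram equalization of the lightness distribution". One plausible stand-in (not the authors' exact measure) is the normalized entropy of the lightness histogram, which is 1 for a perfectly flat histogram and 0 for a constant image.

```python
import numpy as np

def equalization_degree(lightness, nbins=64):
    """Score how close a [0, 1] lightness histogram is to uniform,
    as entropy normalized by its maximum log(nbins)."""
    hist, _ = np.histogram(lightness, bins=nbins, range=(0.0, 1.0))
    p = hist.astype(float) / hist.sum()
    p = p[p > 0]                               # convention: 0 * log 0 = 0
    return float(-(p * np.log(p)).sum() / np.log(nbins))
```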

  13. Detection and tracking of gas plumes in LWIR hyperspectral video sequence data

    NASA Astrophysics Data System (ADS)

    Gerhart, Torin; Sunu, Justin; Lieu, Lauren; Merkurjev, Ekaterina; Chang, Jen-Mei; Gilles, Jérôme; Bertozzi, Andrea L.

    2013-05-01

    Automated detection of chemical plumes presents a segmentation challenge. The segmentation problem for gas plumes is difficult due to the diffusive nature of the cloud. The advantage of considering hyperspectral images in the gas plume detection problem over conventional RGB imagery is the presence of non-visual data, allowing for a richer representation of information. In this paper we present an effective method of visualizing hyperspectral video sequences containing chemical plumes and investigate the effectiveness of segmentation techniques on these post-processed videos. Our approach uses a combination of dimension reduction and histogram equalization to prepare the hyperspectral videos for segmentation. First, Principal Components Analysis (PCA) is used to reduce the dimension of the entire video sequence. This is done by projecting each pixel onto the first few Principal Components, resulting in a type of spectral filter. Next, a Midway method for histogram equalization is used. These methods redistribute the intensity values in order to reduce flicker between frames. This properly prepares these high-dimensional video sequences for more traditional segmentation techniques. We compare the ability of various clustering techniques to properly segment the chemical plume. These include K-means, spectral clustering, and the Ginzburg-Landau functional.
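
    The PCA step that projects every pixel's spectrum onto the first few principal components can be sketched with an SVD; this generic implementation stands in for whatever numerical routine the authors used.

```python
import numpy as np

def pca_project(pixels, k=3):
    """Project spectral pixel vectors (rows) onto their first k
    principal components via SVD of the mean-centered data."""
    X = np.asarray(pixels, dtype=float)
    Xc = X - X.mean(axis=0)
    # rows of Vt are principal directions, ordered by singular value
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T
```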

  14. Rejection of the maternal electrocardiogram in the electrohysterogram signal.

    PubMed

    Leman, H; Marque, C

    2000-08-01

    The electrohysterogram (EHG) signal is mainly corrupted by the mother's electrocardiogram (ECG), which remains present despite analog filtering during acquisition. Wavelets are a powerful denoising tool and have already proved their efficiency on the EHG. In this paper, we propose a new method that employs the redundant wavelet packet transform. We first study wavelet packet coefficient histograms and propose an algorithm to automatically detect the histogram mode number. Using a new criterion, we compute a best basis adapted to the denoising. After EHG wavelet packet coefficient thresholding in the selected basis, the inverse transform is applied. The ECG seems to be very efficiently removed.

  15. Adaptive gamma correction-based expert system for nonuniform illumination face enhancement

    NASA Astrophysics Data System (ADS)

    Abdelhamid, Iratni; Mustapha, Aouache; Adel, Oulefki

    2018-03-01

    The image quality of a face recognition system suffers under severe lighting conditions. This study therefore develops an approach for nonuniform illumination adjustment based on an adaptive gamma correction (AdaptGC) filter. An approach for adaptive gain factor prediction was developed via neural network model-based cross-validation (NN-CV). To achieve this objective, a gamma correction function and its effects on face image quality with different gain values were examined first. Second, an orientation histogram (OH) algorithm was assessed as a face feature descriptor. Subsequently, a density histogram module was developed for face label generation. During the NN-CV construction, the model was assessed on its ability to recognize the OH descriptor and predict the face label. The performance of the NN-CV model was evaluated using the statistical measures of root mean square error and coefficient of efficiency. Third, to evaluate the AdaptGC enhancement approach, image quality metrics were adopted (entropy, contrast per pixel, second-derivative-like measure of enhancement, and sharpness), supported by visual inspection. The experimental results were examined using five face databases, namely, extended Yale-B, Carnegie Mellon University Pose, Illumination, and Expression (CMU-PIE), Mobio, FERET, and Oulu-CASIA-NIR-VIS. The final results show that, in terms of contrast and nonuniform illumination adjustment, the AdaptGC filter is the best choice compared with state-of-the-art methods. In summary, AdaptGC delivers a favorable enhancement rate, providing satisfying features for high-accuracy face recognition systems.
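    The gamma correction step itself is simple; the difficulty the paper addresses is choosing the gain. The sketch below applies pointwise gamma correction, with a hypothetical mean-based heuristic standing in for the paper's NN-CV gain prediction (the heuristic is illustrative only and is exact only for constant images):

```python
import numpy as np

def gamma_correct(img, gamma):
    """Pointwise gamma correction for an image with values in [0, 1]."""
    return np.power(img, gamma)

def adaptive_gain(img):
    """Hypothetical gain heuristic (NOT the paper's NN-based predictor):
    choose gamma so that the mean luminance is pushed toward 0.5,
    brightening dark images (gamma < 1) and darkening bright ones."""
    mean = img.mean()
    return np.log(0.5) / np.log(mean + 1e-6)  # eps guards log(0)
```

    For a uniformly dark image with mean 0.25 this yields gamma ≈ 0.5, lifting the mean to about 0.5.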

  16. A novel retinal vessel extraction algorithm based on matched filtering and gradient vector flow

    NASA Astrophysics Data System (ADS)

    Yu, Lei; Xia, Mingliang; Xuan, Li

    2013-10-01

    The microvasculature network of the retina plays an important role in the study and diagnosis of retinal diseases (e.g., age-related macular degeneration and diabetic retinopathy). Although it is possible to noninvasively acquire high-resolution retinal images with modern retinal imaging technologies, non-uniform illumination, the low contrast of thin vessels and background noise all make diagnosis difficult. In this paper, we introduce a novel retinal vessel extraction algorithm based on gradient vector flow and matched filtering to segment retinal vessels at different likelihood levels. Firstly, we use an isotropic Gaussian kernel and adaptive histogram equalization to smooth and enhance the retinal images, respectively. Secondly, a multi-scale matched filtering method is adopted to extract the retinal vessels. Then, the gradient vector flow algorithm is introduced to locate the edges of the retinal vessels. Finally, we combine the results of the matched filtering method and the gradient vector flow algorithm to extract vessels at different likelihood levels. The experiments demonstrate that our algorithm is efficient and that the intensities of the vessel images exactly represent the likelihood of the vessels.

  17. Infrared and visible image fusion using discrete cosine transform and swarm intelligence for surveillance applications

    NASA Astrophysics Data System (ADS)

    Paramanandham, Nirmala; Rajendiran, Kishore

    2018-01-01

    A novel image fusion technique is presented for integrating infrared and visible images. Integrating images from the same or different sensing modalities can deliver information that cannot be obtained by viewing the sensor outputs individually and consecutively. In this paper, a swarm intelligence based image fusion technique in the discrete cosine transform (DCT) domain is proposed for surveillance applications, which integrates the infrared image with the visible image to generate a single informative fused image. Particle swarm optimization (PSO) is used in the fusion process to obtain the optimized weighting factor. These optimized weighting factors are used to fuse the DCT coefficients of the visible and infrared images. The inverse DCT is applied to obtain the initial fused image. An enhanced fused image is obtained through adaptive histogram equalization for better visual understanding and target detection. The proposed framework is evaluated using quantitative metrics such as standard deviation, spatial frequency, entropy and mean gradient. The experimental results demonstrate that the proposed algorithm outperforms many other state-of-the-art techniques reported in the literature.
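    The DCT-domain fusion step might be sketched as below, using an explicit orthonormal DCT-II matrix and a fixed weight `w` standing in for the PSO-optimized factor (an illustrative implementation, not the authors' code):

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II matrix of size n x n (inverse = transpose)."""
    k = np.arange(n)[:, None]
    x = np.arange(n)[None, :]
    c = np.cos(np.pi * (2 * x + 1) * k / (2 * n))
    c[0] *= 1.0 / np.sqrt(2)          # DC row scaling for orthonormality
    return c * np.sqrt(2.0 / n)

def fuse_dct(visible, infrared, w=0.5):
    """Fuse two same-sized images by weighted averaging of their 2-D
    DCT coefficients; w stands in for the PSO-optimized weight."""
    n, m = visible.shape
    cn, cm = dct_matrix(n), dct_matrix(m)
    f_vis = cn @ visible @ cm.T        # forward 2-D DCT
    f_ir = cn @ infrared @ cm.T
    fused = w * f_vis + (1.0 - w) * f_ir
    return cn.T @ fused @ cm           # inverse 2-D DCT
```

    Because the DCT is linear, a fixed-weight fusion in the transform domain equals the same weighting in the pixel domain; the benefit in the paper comes from optimizing the weights per coefficient/block via PSO.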

  18. Classification of stroke disease using convolutional neural network

    NASA Astrophysics Data System (ADS)

    Marbun, J. T.; Seniman; Andayani, U.

    2018-03-01

    Stroke is a condition that occurs when the blood supply to the brain stops flowing because of a blockage or a broken blood vessel. Symptoms experienced during a stroke include loss of consciousness, disrupted vision, and body paralysis. A general examination using Computerized Tomography (CT) scanning is performed to image the part of the brain affected by stroke. The CT image must then be checked manually by a doctor, under proper lighting, to determine the type of stroke, which is why a method is needed to classify stroke from CT images automatically. The method proposed in this research is a Convolutional Neural Network. A CT image of the brain is used as the input. The stages before classification are image processing (grayscaling, scaling, and Contrast Limited Adaptive Histogram Equalization); the image is then classified with the Convolutional Neural Network. The results showed that the method can be used as a tool to classify stroke disease and to distinguish the type of stroke from CT images.
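    The contrast-limited equalization step in the preprocessing pipeline can be sketched in a simplified global form (clipping the histogram and redistributing the excess, but without CLAHE's per-tile processing and bilinear interpolation):

```python
import numpy as np

def clipped_hist_equalize(img, clip_frac=0.02, nbins=256):
    """Simplified contrast-limited histogram equalization (global form;
    real CLAHE applies this per tile and interpolates the mappings)."""
    img = np.asarray(img, dtype=np.float64)
    lo, hi = img.min(), img.max()
    if hi == lo:
        return np.zeros_like(img)
    norm = (img - lo) / (hi - lo)
    hist, _ = np.histogram(norm, bins=nbins, range=(0.0, 1.0))
    limit = max(1, int(clip_frac * img.size))
    excess = np.maximum(hist - limit, 0).sum()
    hist = np.minimum(hist, limit) + excess // nbins  # redistribute excess
    cdf = np.cumsum(hist).astype(np.float64)
    cdf /= cdf[-1]                                    # normalized mapping
    bins = np.clip((norm * nbins).astype(int), 0, nbins - 1)
    return cdf[bins]                                  # output in (0, 1]
```

    Clipping the histogram bounds the slope of the mapping, which is what limits noise over-enhancement compared with plain histogram equalization.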

  19. Design and testing of artifact-suppressed adaptive histogram equalization: a contrast-enhancement technique for display of digital chest radiographs.

    PubMed

    Rehm, K; Seeley, G W; Dallas, W J; Ovitt, T W; Seeger, J F

    1990-01-01

    One of the goals of our research in the field of digital radiography has been to develop contrast-enhancement algorithms for eventual use in the display of chest images on video devices with the aim of preserving the diagnostic information presently available with film, some of which would normally be lost because of the smaller dynamic range of video monitors. The ASAHE algorithm discussed in this article has been tested by investigating observer performance in a difficult detection task involving phantoms and simulated lung nodules, using film as the output medium. The results of the experiment showed that the algorithm is successful in providing contrast-enhanced, natural-looking chest images while maintaining diagnostic information. The algorithm did not effect an increase in nodule detectability, but this was not unexpected because film is a medium capable of displaying a wide range of gray levels. It is sufficient at this stage to show that there is no degradation in observer performance. Future tests will evaluate the performance of the ASAHE algorithm in preparing chest images for video display.

  20. A novel method for segmentation of Infrared Scanning Laser Ophthalmoscope (IR-SLO) images of retina.

    PubMed

    Ajaz, Aqsa; Aliahmad, Behzad; Kumar, Dinesh K

    2017-07-01

    Retinal vessel segmentation forms an essential element of automatic retinal disease screening systems. The development of a multimodal imaging system combining IR-SLO and OCT could help in studying the early stages of retinal disease. The ability of IR-SLO to examine alterations in the structure of the retina, and its direct correlation with OCT, can be useful for the assessment of various diseases. This paper presents an automatic method for segmentation of IR-SLO fundus images based on the combination of morphological filters and image enhancement techniques. As a first step, the retinal vessels are contrasted using morphological filters, followed by background exclusion using Contrast Limited Adaptive Histogram Equalization (CLAHE) and bilateral filtering. The final segmentation is obtained using the Isodata technique. Our approach was tested on a set of 26 IR-SLO images and the results were compared to two sets of gold standard images. The performance of the proposed method was evaluated in terms of sensitivity, specificity and accuracy. The system has an average accuracy of 0.90 for both sets.
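    The Isodata (Ridler-Calvard) thresholding used for the final segmentation can be sketched as follows; it iterates the threshold as the midpoint of the two class means until convergence:

```python
import numpy as np

def isodata_threshold(img, tol=1e-3):
    """Isodata (Ridler-Calvard) threshold: iterate
    t <- (mean(below t) + mean(above t)) / 2 until it stabilizes."""
    values = np.asarray(img, dtype=np.float64).ravel()
    t = values.mean()                      # initial guess: global mean
    while True:
        lower = values[values <= t]
        upper = values[values > t]
        if lower.size == 0 or upper.size == 0:
            return t                       # degenerate split; stop
        t_new = (lower.mean() + upper.mean()) / 2.0
        if abs(t_new - t) < tol:
            return t_new
        t = t_new
```

    On a clean bimodal distribution the iteration lands between the two modes after one or two steps.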

  1. Iris double recognition based on modified evolutionary neural network

    NASA Astrophysics Data System (ADS)

    Liu, Shuai; Liu, Yuan-Ning; Zhu, Xiao-Dong; Huo, Guang; Liu, Wen-Tao; Feng, Jia-Kai

    2017-11-01

    Aiming at multicategory iris recognition under illumination and noise interference, this paper proposes a method of iris double recognition based on a modified evolutionary neural network. Histogram equalization and a Laplacian of Gaussian operator are used to process the iris to suppress illumination and noise interference, and a Haar wavelet converts the iris features to binary feature encoding. The Hamming distance between the test iris and the template iris is calculated and compared with a classification threshold to determine the iris type. If the iris cannot be identified this way, a secondary recognition is performed. The connection weights of the back-propagation (BP) neural network are trained adaptively by the modified evolutionary neural network, which is composed of particle swarm optimization with a mutation operator and the BP neural network. Experimental results on different iris libraries under different circumstances show that, under illumination and noise interference, the correct recognition rate of this algorithm is higher, the ROC curve is closer to the coordinate axis, the training and recognition time is shorter, and the stability and robustness are better.
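    The first-stage Hamming-distance comparison of binary iris codes can be sketched as below; the `threshold` default is an illustrative value of the kind used in the iris literature, not the paper's:

```python
import numpy as np

def hamming_distance(code_a, code_b):
    """Fractional Hamming distance between two equal-length binary
    iris codes: fraction of bits that disagree (0 = identical)."""
    code_a = np.asarray(code_a, dtype=bool)
    code_b = np.asarray(code_b, dtype=bool)
    return np.count_nonzero(code_a ^ code_b) / code_a.size

def matches_template(test_code, template_code, threshold=0.32):
    """Accept the test iris as the template's class if the distance
    falls below a classification threshold (value illustrative)."""
    return hamming_distance(test_code, template_code) <= threshold
```

    Identical codes give distance 0; statistically independent codes cluster around 0.5, so a threshold near 0.3 separates genuine from impostor comparisons.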

  2. A Robust and Fast Computation Touchless Palm Print Recognition System Using LHEAT and the IFkNCN Classifier

    PubMed Central

    Jaafar, Haryati; Ibrahim, Salwani; Ramli, Dzati Athiar

    2015-01-01

    Mobile implementation is a current trend in biometric design. This paper proposes a new approach to palm print recognition, in which smart phones are used to capture palm print images at a distance. A touchless system was developed because of public demand for privacy and sanitation. Robust hand tracking, image enhancement, and fast computation processing algorithms are required for effective touchless and mobile-based recognition. In this project, hand tracking and the region of interest (ROI) extraction method were discussed. A sliding neighborhood operation with local histogram equalization, followed by a local adaptive thresholding or LHEAT approach, was proposed in the image enhancement stage to manage low-quality palm print images. To accelerate the recognition process, a new classifier, improved fuzzy-based k nearest centroid neighbor (IFkNCN), was implemented. By removing outliers and reducing the amount of training data, this classifier exhibited faster computation. Our experimental results demonstrate that a touchless palm print system using LHEAT and IFkNCN achieves a promising recognition rate of 98.64%. PMID:26113861

  3. Quantification and classification of neuronal responses in kernel-smoothed peristimulus time histograms

    PubMed Central

    Fried, Itzhak; Koch, Christof

    2014-01-01

    Peristimulus time histograms are a widespread form of visualizing neuronal responses. Kernel convolution methods transform these histograms into a smooth, continuous probability density function. This provides an improved estimate of a neuron's actual response envelope. We here develop a classifier, called the h-coefficient, to determine whether time-locked fluctuations in the firing rate of a neuron should be classified as a response or as random noise. Unlike previous approaches, the h-coefficient takes advantage of the more precise response envelope estimation provided by the kernel convolution method. The h-coefficient quantizes the smoothed response envelope and calculates the probability of a response of a given shape to occur by chance. We tested the efficacy of the h-coefficient in a large data set of Monte Carlo simulated smoothed peristimulus time histograms with varying response amplitudes, response durations, trial numbers, and baseline firing rates. Across all these conditions, the h-coefficient significantly outperformed more classical classifiers, with a mean false alarm rate of 0.004 and a mean hit rate of 0.494. We also tested the h-coefficient's performance in a set of neuronal responses recorded in humans. The algorithm behind the h-coefficient provides various opportunities for further adaptation and the flexibility to target specific parameters in a given data set. Our findings confirm that the h-coefficient can provide a conservative and powerful tool for the analysis of peristimulus time histograms with great potential for future development. PMID:25475352
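    The kernel-convolution smoothing that the h-coefficient builds on can be sketched with a Gaussian kernel (a generic illustration of PSTH smoothing, not the h-coefficient classifier itself):

```python
import numpy as np

def smoothed_psth(spike_times, n_trials, t_max, dt=0.001, sigma=0.010):
    """Estimate a firing-rate envelope (spikes/s) by convolving the
    binned peristimulus time histogram with a Gaussian kernel."""
    n_bins = int(round(t_max / dt))
    edges = np.linspace(0.0, t_max, n_bins + 1)
    counts, _ = np.histogram(spike_times, bins=edges)
    rate = counts / (n_trials * dt)            # raw PSTH in spikes/s
    half = int(4 * sigma / dt)                 # kernel support: +/- 4 sigma
    t = np.arange(-half, half + 1) * dt
    kernel = np.exp(-0.5 * (t / sigma) ** 2)
    kernel /= kernel.sum()                     # unit area preserves rate mass
    return np.convolve(rate, kernel, mode="same")
```

    The unit-area kernel preserves the total spike mass while replacing the jagged histogram with a continuous response-envelope estimate.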

  4. Object-based change detection method using refined Markov random field

    NASA Astrophysics Data System (ADS)

    Peng, Daifeng; Zhang, Yongjun

    2017-01-01

    In order to fully consider the local spatial constraints between neighboring objects in object-based change detection (OBCD), an OBCD approach is presented by introducing a refined Markov random field (MRF). First, two periods of images are stacked and segmented to produce image objects. Second, object spectral and texture histogram features are extracted, and the G-statistic is implemented to measure the distance among different histogram distributions. Meanwhile, object heterogeneity is calculated by combining spectral and texture histogram distances using adaptive weights. Third, an expectation-maximization algorithm is applied to determine the change category of each object, and the initial change map is then generated. Finally, a refined change map is produced by employing the proposed refined object-based MRF method. Three experiments were conducted and compared with some state-of-the-art unsupervised OBCD methods to evaluate the effectiveness of the proposed method. Experimental results demonstrate that the proposed method obtains the highest accuracy among the methods used in this paper, which confirms its validity and effectiveness in OBCD.
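    A plausible form of the G-statistic distance between two histograms is the log-likelihood-ratio statistic sketched below (the paper's exact formulation may differ; this is the standard two-sample form, zero for identical distributions):

```python
import numpy as np

def g_statistic(h1, h2):
    """Two-sample G-statistic (log-likelihood ratio) between two
    count histograms; 0 when the distributions are identical."""
    h1 = np.asarray(h1, dtype=np.float64)
    h2 = np.asarray(h2, dtype=np.float64)
    n1, n2 = h1.sum(), h2.sum()
    pooled = h1 + h2
    g = 0.0
    for obs, n in ((h1, n1), (h2, n2)):
        expected = pooled * n / (n1 + n2)   # expected counts under H0
        mask = obs > 0                       # 0 * log(0) := 0
        g += (obs[mask] * np.log(obs[mask] / expected[mask])).sum()
    return 2.0 * g
```

    Larger values indicate histograms that are less likely to come from the same underlying distribution, which is why it serves as an object heterogeneity measure.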

  5. A novel method for the evaluation of uncertainty in dose-volume histogram computation.

    PubMed

    Henríquez, Francisco Cutanda; Castrillón, Silvia Vargas

    2008-03-15

    Dose-volume histograms (DVHs) are a useful tool in state-of-the-art radiotherapy treatment planning, and it is essential to recognize their limitations. Even after a specific dose-calculation model is optimized, dose distributions computed using treatment-planning systems are affected by several sources of uncertainty, such as algorithm limitations, measurement uncertainty in the data used to model the beam, and residual differences between measured and computed dose. This report presents a novel method to take these uncertainties into account: a probabilistic approach using a new kind of histogram, the dose-expected volume histogram. The expected value of the volume in the region of interest receiving an absorbed dose equal to or greater than a certain value is found by using the probability distribution of the dose at each point. A rectangular probability distribution is assumed for this point dose, and a formulation that accounts for the associated uncertainties is presented for practical computations. The method is applied to a set of DVHs for different regions of interest, including 6 brain patients, 8 lung patients, 8 pelvis patients, and 6 prostate patients planned for intensity-modulated radiation therapy. Results show a greater effect on planning target volume coverage than on organs at risk. In cases of steep DVH gradients, such as planning target volumes, the new method shows the largest differences with respect to the corresponding DVH; thus, the effect of the uncertainty is larger.
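    Under a rectangular (uniform) point-dose distribution of half-width w around each computed voxel dose d, the probability that the true dose exceeds a threshold t is clip((d + w - t) / 2w, 0, 1), and the expected volume is the mean of these probabilities. An illustrative sketch (not the authors' formulation):

```python
import numpy as np

def expected_volume_histogram(doses, thresholds, half_width):
    """Dose-expected volume histogram: for each threshold t, the
    expected fractional volume receiving >= t, assuming each voxel
    dose is uniform on [d - half_width, d + half_width]."""
    d = np.asarray(doses, dtype=np.float64)[:, None]       # voxels
    t = np.asarray(thresholds, dtype=np.float64)[None, :]  # dose axis
    prob = np.clip((d + half_width - t) / (2.0 * half_width), 0.0, 1.0)
    return prob.mean(axis=0)   # expected fractional volume per threshold
```

    With zero uncertainty this reduces to the ordinary cumulative DVH; the uncertainty smooths the curve most where the DVH gradient is steep, matching the paper's observation about planning target volumes.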

  6. Experimental Visualizations of a Generic Launch Vehicle Flow Field: Time-Resolved Shadowgraph and Infrared Imaging

    NASA Technical Reports Server (NTRS)

    Garbeff, Theodore J., II; Panda, Jayanta; Ross, James C.

    2017-01-01

    Time-resolved shadowgraph and infrared (IR) imaging were performed to investigate off-body and on-body flow features of a generic 'hammer-head' launch vehicle geometry previously tested by Coe and Nute (1962). The measurements discussed here were one part of a large range of wind tunnel test techniques that included steady-state pressure-sensitive paint (PSP), dynamic PSP, unsteady surface pressures, and unsteady force measurements. Image data were captured over a Mach number range of 0.6 ≤ M ≤ 1.2 at a Reynolds number of 3 million per foot. Both shadowgraph and IR imagery were captured in conjunction with unsteady pressures and forces and correlated with IRIG-B timing. High-speed shadowgraph imagery was used to identify wake structure and reattachment behind the payload fairing of the vehicle. Various data processing strategies were employed, and ultimately these results correlated well with the location and magnitude of unsteady surface pressure measurements. Two research-grade IR cameras were positioned to image boundary layer transition at the vehicle nose and flow reattachment behind the payload fairing. The poor emissivity of the model surface treatment (fast PSP) proved challenging for the infrared measurement. Reference image subtraction and contrast limited adaptive histogram equalization (CLAHE) were used to analyze this dataset. Ultimately, turbulent boundary layer transition was observed and located forward of the trip dot line at the model sphere-cone junction. The flow reattachment location was identified behind the payload fairing in both steady and unsteady thermal data. As demonstrated in this effort, recent advances in high-speed and thermal imaging technology have modernized classical techniques, providing a new viewpoint for the modern researcher.

  7. Evaluation of the effectiveness of color attributes for video indexing

    NASA Astrophysics Data System (ADS)

    Chupeau, Bertrand; Forest, Ronan

    2001-10-01

    Color features are reviewed and their effectiveness assessed in the application framework of key-frame clustering for abstracting unconstrained video. Existing color spaces and associated quantization schemes are first studied. Description of global color distribution by means of histograms is then detailed. In our work, 12 combinations of color space and quantization were selected, together with 12 histogram metrics. Their respective effectiveness with respect to picture similarity measurement was evaluated through a query-by-example scenario. For that purpose, a set of still-picture databases was built by extracting key frames from several video clips, including news, documentaries, sports and cartoons. Classical retrieval performance evaluation criteria were adapted to the specificity of our testing methodology.
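    The abstract does not list the twelve metrics evaluated, but two common histogram metrics of the kind typically compared in such studies can be sketched as generic examples (these are illustrative, not necessarily among the paper's twelve):

```python
import numpy as np

def histogram_intersection(h1, h2):
    """Normalized histogram intersection: 1 = identical, 0 = disjoint."""
    h1 = np.asarray(h1, dtype=np.float64) / np.sum(h1)
    h2 = np.asarray(h2, dtype=np.float64) / np.sum(h2)
    return np.minimum(h1, h2).sum()

def chi_square_distance(h1, h2):
    """Chi-square histogram distance; 0 for identical distributions."""
    h1 = np.asarray(h1, dtype=np.float64) / np.sum(h1)
    h2 = np.asarray(h2, dtype=np.float64) / np.sum(h2)
    denom = h1 + h2
    mask = denom > 0                       # skip jointly empty bins
    return 0.5 * (((h1[mask] - h2[mask]) ** 2) / denom[mask]).sum()
```

    In a query-by-example evaluation such as the one described, each metric ranks database key frames against the query frame's color histogram, and retrieval criteria score the rankings.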

  8. Evaluation of the effectiveness of color attributes for video indexing

    NASA Astrophysics Data System (ADS)

    Chupeau, Bertrand; Forest, Ronan

    2001-01-01

    Color features are reviewed and their effectiveness assessed in the application framework of key-frame clustering for abstracting unconstrained video. Existing color spaces and associated quantization schemes are first studied. Description of global color distribution by means of histograms is then detailed. In our work, twelve combinations of color space and quantization were selected, together with twelve histogram metrics. Their respective effectiveness with respect to picture similarity measurement was evaluated through a query-by-example scenario. For that purpose, a set of still-picture databases was built by extracting key frames from several video clips, including news, documentaries, sports and cartoons. Classical retrieval performance evaluation criteria were adapted to the specificity of our testing methodology.

  10. Robust Face Detection from Still Images

    DTIC Science & Technology

    2014-01-01

    significant change in false acceptance rates. Keywords— face detection; illumination; skin color variation; Haar-like features; OpenCV I. INTRODUCTION... OpenCV and an algorithm which used histogram equalization. The test is performed against 17 subjects under 576 viewing conditions from the extended Yale...original OpenCV algorithm proved the least accurate, having a hit rate of only 75.6%. It also had the lowest FAR but only by a slight margin at 25.2

  11. Coding and decoding with adapting neurons: a population approach to the peri-stimulus time histogram.

    PubMed

    Naud, Richard; Gerstner, Wulfram

    2012-01-01

    The response of a neuron to a time-dependent stimulus, as measured in a Peri-Stimulus-Time-Histogram (PSTH), exhibits an intricate temporal structure that reflects potential temporal coding principles. Here we analyze the encoding and decoding of PSTHs for spiking neurons with arbitrary refractoriness and adaptation. As a modeling framework, we use the spike response model, also known as the generalized linear neuron model. Because of refractoriness, the effect of the most recent spike on the spiking probability a few milliseconds later is very strong. The influence of the last spike needs therefore to be described with high precision, while the rest of the neuronal spiking history merely introduces an average self-inhibition or adaptation that depends on the expected number of past spikes but not on the exact spike timings. Based on these insights, we derive a 'quasi-renewal equation' which is shown to yield an excellent description of the firing rate of adapting neurons. We explore the domain of validity of the quasi-renewal equation and compare it with other rate equations for populations of spiking neurons. The problem of decoding the stimulus from the population response (or PSTH) is addressed analogously. We find that for small levels of activity and weak adaptation, a simple accumulator of the past activity is sufficient to decode the original input, but when refractory effects become large decoding becomes a non-linear function of the past activity. The results presented here can be applied to the mean-field analysis of coupled neuron networks, but also to arbitrary point processes with negative self-interaction.

  12. Adaptive local thresholding for robust nucleus segmentation utilizing shape priors

    NASA Astrophysics Data System (ADS)

    Wang, Xiuzhong; Srinivas, Chukka

    2016-03-01

    This paper describes a novel local thresholding method for foreground detection. First, a Canny edge detection method is used for initial edge detection. Then, tensor voting is applied on the initial edge pixels, using a nonsymmetric tensor field tailored to encode prior information about nucleus size, shape, and intensity spatial distribution. Tensor analysis is then performed to generate the saliency image and, based on that, the refined edge. Next, the image domain is divided into blocks. In each block, at least one foreground and one background pixel are sampled for each refined edge pixel. The saliency weighted foreground histogram and background histogram are then created. These two histograms are used to calculate a threshold by minimizing the background and foreground pixel classification error. The block-wise thresholds are then used to generate the threshold for each pixel via interpolation. Finally, the foreground is obtained by comparing the original image with the threshold image. The effective use of prior information, combined with robust techniques, results in far more reliable foreground detection, which leads to robust nucleus segmentation.

  13. A 5 Gb/s CMOS adaptive equalizer for serial link

    NASA Astrophysics Data System (ADS)

    Wu, Hongbing; Wang, Jingyu; Liu, Hongxia

    2018-04-01

    A 5 Gb/s adaptive equalizer with a new adaptation scheme, implemented in a 0.13 μm CMOS process, is presented. The circuit consists of an equalizer amplifier, a limiter amplifier and an adaptation loop. The adaptive algorithm exploits both a low-frequency gain loop and an equalizer loop to minimize inter-symbol interference (ISI) for a variety of cable characteristics. In addition, an offset cancellation loop is used to alleviate the offset influence of the signal path. The adaptive equalizer core occupies an area of 0.3567 mm2 and consumes 81.7 mW from a 1.8 V power supply. Experimental results demonstrate that the equalizer can compensate for the designed cable loss with 0.23 UI peak-to-peak jitter. Project supported by the National Natural Science Foundation of China (No. 61376099), the Foundation for Fundamental Research of China (No. JSZL2016110B003), and the Major Fundamental Research Program of Shaanxi (No. 2017ZDJC-26).

  14. Adaptive channel estimation for soft decision decoding over non-Gaussian optical channel

    NASA Astrophysics Data System (ADS)

    Xiang, Jing-song; Miao, Tao-tao; Huang, Sheng; Liu, Huan-lin

    2016-10-01

    An adaptive a priori log-likelihood ratio (LLR) estimation method is proposed for non-Gaussian channels in intensity modulation/direct detection (IM/DD) optical communication systems. Using a nonparametric histogram and weighted least-squares linear fitting in the tail regions, the LLR is estimated and used for soft-decision decoding of low-density parity-check (LDPC) codes. This method adapts well to the three main kinds of IM/DD optical channel, i.e., the chi-square channel, the Webb-Gaussian channel and the additive white Gaussian noise (AWGN) channel. The performance penalty of the channel estimation is negligible.

  15. Performance analysis and enhancement for visible light communication using CMOS sensors

    NASA Astrophysics Data System (ADS)

    Guan, Weipeng; Wu, Yuxiang; Xie, Canyu; Fang, Liangtao; Liu, Xiaowei; Chen, Yingcong

    2018-03-01

    Complementary Metal-Oxide-Semiconductor (CMOS) sensors are widely used in mobile phones and cameras. Hence, it is attractive to use these cameras as receivers for visible light communication (VLC). Using the rolling shutter mechanism can increase the data rate of VLC based on CMOS cameras, and different techniques have been proposed to improve the demodulation of the rolling shutter mechanism. However, these techniques are too complex. In this work, we demonstrate and analyze the performance of a VLC link using a CMOS camera for different LED luminaires, for the first time to the best of our knowledge. Experimental evaluations comparing their bit-error-rate (BER) performance and demodulation are also performed. They show that simply changing to an LED luminaire with more uniform light output removes the blooming effect, which not only reduces the complexity of demodulation but also enhances communication quality. In addition, we propose and demonstrate the use of contrast limited adaptive histogram equalization to extend the transmission distance and mitigate the influence of background noise. The experimental results show that the BER can be decreased by an order of magnitude using the proposed method.

  16. A Morphological Hessian Based Approach for Retinal Blood Vessels Segmentation and Denoising Using Region Based Otsu Thresholding

    PubMed Central

    BahadarKhan, Khan; A Khaliq, Amir; Shahid, Muhammad

    2016-01-01

    Diabetic Retinopathy (DR) harms retinal blood vessels in the eye, causing visual deficiency. The appearance and structure of blood vessels in retinal images play an essential part in the diagnosis of eye diseases. We propose a computationally light, unsupervised, automated technique with promising results for detection of retinal vasculature using a morphological Hessian-based approach and region-based Otsu thresholding. Contrast Limited Adaptive Histogram Equalization (CLAHE) and morphological filters have been used for enhancement and to remove low-frequency noise or geometrical objects, respectively. The Hessian matrix and eigenvalue approach has been used in a modified form at two different scales to extract wide- and thin-vessel enhanced images separately. Otsu thresholding has been further applied in a novel way to classify vessel and non-vessel pixels from both enhanced images. Finally, postprocessing steps have been used to eliminate unwanted regions/segments, non-vessel pixels, disease abnormalities and noise, to obtain the final segmented image. The proposed technique has been analyzed on the openly accessible DRIVE (Digital Retinal Images for Vessel Extraction) and STARE (STructured Analysis of the REtina) databases, along with ground truth data that has been precisely marked by experts. PMID:27441646
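    The Otsu thresholding step can be sketched as below (a standard global implementation; the paper applies it region-wise):

```python
import numpy as np

def otsu_threshold(img, nbins=256):
    """Otsu's method: pick the threshold that maximizes the
    between-class variance of the grey-level histogram."""
    values = np.asarray(img, dtype=np.float64).ravel()
    hist, edges = np.histogram(values, bins=nbins)
    centers = (edges[:-1] + edges[1:]) / 2.0
    p = hist / hist.sum()
    w0 = np.cumsum(p)                 # class-0 (background) probability
    mu = np.cumsum(p * centers)       # cumulative mean
    mu_t = mu[-1]                     # global mean
    w1 = 1.0 - w0
    valid = (w0 > 0) & (w1 > 0)
    var_between = np.zeros(nbins)
    var_between[valid] = (mu_t * w0[valid] - mu[valid]) ** 2 \
        / (w0[valid] * w1[valid])
    return centers[np.argmax(var_between)]
```

    Maximizing between-class variance is equivalent to minimizing within-class variance, so the returned threshold best separates vessel and non-vessel intensity populations in the enhanced image.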

  17. Stationary Wavelet Transform and AdaBoost with SVM Based Pathological Brain Detection in MRI Scanning.

    PubMed

    Nayak, Deepak Ranjan; Dash, Ratnakar; Majhi, Banshidhar

    2017-01-01

    This paper presents an automatic classification system for segregating pathological brains from normal brains in magnetic resonance imaging scanning. The proposed system employs a contrast limited adaptive histogram equalization scheme to enhance the diseased region in brain MR images. A two-dimensional stationary wavelet transform (SWT) is harnessed to extract features from the preprocessed images. The feature vector is constructed using the energy and entropy values computed from the level-2 SWT coefficients. Then, the relevant and uncorrelated features are selected using a symmetric uncertainty ranking filter. Subsequently, the selected features are given as input to the proposed AdaBoost with support vector machine classifier, where the SVM is used as the base classifier of the AdaBoost algorithm. To validate the proposed system, three standard MR image datasets, Dataset-66, Dataset-160, and Dataset-255, have been utilized. The results of 5 runs of stratified k-fold cross-validation indicate that the suggested scheme offers better performance than other existing schemes in terms of accuracy and number of features. The proposed system achieves ideal classification over Dataset-66 and Dataset-160; for Dataset-255, an accuracy of 99.45% is achieved.
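    The energy and entropy features computed from a wavelet coefficient sub-band might look like the sketch below (illustrative; the paper's exact entropy definition may differ, here Shannon entropy of the normalized coefficient power):

```python
import numpy as np

def energy_entropy_features(coeffs):
    """Energy and Shannon entropy of one coefficient sub-band, of the
    kind used to build a feature vector from level-2 SWT coefficients."""
    c = np.asarray(coeffs, dtype=np.float64).ravel()
    energy = np.sum(c ** 2)
    p = c ** 2 / (energy + 1e-12)      # normalized coefficient power
    p = p[p > 0]                       # 0 * log(0) := 0
    entropy = -np.sum(p * np.log2(p))
    return energy, entropy
```

    Computing this pair for each sub-band of the level-2 decomposition yields a compact feature vector, which the ranking filter then prunes before classification.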

  18. An assessment of a film enhancement system for use in a radiation therapy department.

    PubMed

    Solowsky, E L; Reinstein, L E; Meek, A G

    1990-01-01

    The clinical uses of a radiotherapy film enhancement system are explored. The primary functions of the system are to improve the quality of poorly exposed simulator and portal films, and to perform comparisons between the two films to determine whether patient or block positioning errors are present. Other features include: the production of inexpensive, high quality hardcopy images of simulation films and initial portal films for chart documentation; the capacity to overlay lateral simulation films with sagittal MRI films to aid in field design; and a mode to zoom in on individual CT or MRI images and enlarge them for video display during chart rounds or instructional sessions. This commercially available system comprises a microcomputer, frame grabber, CCD camera with zoom lens, and a high-resolution thermal printer. The user-friendly software is menu-driven and utilizes both keyboard and trackball to perform its functions. At the heart of the software is a very fast Adaptive Histogram Equalization (AHE) routine, which enhances and improves the readability of most portal films. The system has been evaluated for several disease sites, and its advantages and limitations are presented.

  19. Chromaticity based smoke removal in endoscopic images

    NASA Astrophysics Data System (ADS)

    Tchaka, Kevin; Pawar, Vijay M.; Stoyanov, Danail

    2017-02-01

    In minimally invasive surgery, image quality is a critical pre-requisite to ensure a surgeon's ability to perform a procedure. In endoscopic procedures, image quality can deteriorate for a number of reasons, such as fogging due to the temperature gradient after intra-corporeal insertion, lack of focus, and smoke generated when using electro-cautery to dissect tissues without bleeding. In this paper we investigate the use of vision processing techniques to remove surgical smoke and improve the clarity of the image. We model the image formation process by introducing a haze medium to account for the degradation of visibility. For simplicity and computational efficiency we use an adapted dark-channel prior method combined with histogram equalization to remove smoke artifacts, recover the radiance image, and enhance the contrast and brightness of the final result. Our initial results on images from robotic assisted procedures are promising and show that the proposed approach may be used to enhance image quality during surgery without additional suction devices. In addition, the processing pipeline may be used as an important part of a robust surgical vision pipeline that can continue working in the presence of smoke.
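    The dark-channel prior underlying the de-smoking step assumes that haze-free regions have some color channel near zero in every local patch, so a bright dark channel signals haze or smoke. A simplified sketch of the dark channel and the resulting transmission estimate (omega is the standard aerosol-retention parameter from the dark-channel literature; the adapted method in the record differs in detail):

```python
import numpy as np

def dark_channel(img, patch=3):
    """Min over color channels, then a local minimum filter (erosion).
    img: H x W x 3 array with values in [0, 1]."""
    dc = img.min(axis=2)
    pad = patch // 2
    padded = np.pad(dc, pad, mode='edge')
    h, w = dc.shape
    out = np.empty_like(dc)
    for i in range(h):
        for j in range(w):
            out[i, j] = padded[i:i + patch, j:j + patch].min()
    return out

def transmission(img, atmosphere, omega=0.95, patch=3):
    """Estimated transmission t = 1 - omega * dark_channel(I / A)."""
    return 1.0 - omega * dark_channel(img / atmosphere, patch)
```

    The radiance image is then recovered per pixel as (I - A) / t + A, clipped to the valid range, before the contrast step.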

  20. Coding and Decoding with Adapting Neurons: A Population Approach to the Peri-Stimulus Time Histogram

    PubMed Central

    Naud, Richard; Gerstner, Wulfram

    2012-01-01

    The response of a neuron to a time-dependent stimulus, as measured in a Peri-Stimulus-Time-Histogram (PSTH), exhibits an intricate temporal structure that reflects potential temporal coding principles. Here we analyze the encoding and decoding of PSTHs for spiking neurons with arbitrary refractoriness and adaptation. As a modeling framework, we use the spike response model, also known as the generalized linear neuron model. Because of refractoriness, the effect of the most recent spike on the spiking probability a few milliseconds later is very strong. The influence of the last spike needs therefore to be described with high precision, while the rest of the neuronal spiking history merely introduces an average self-inhibition or adaptation that depends on the expected number of past spikes but not on the exact spike timings. Based on these insights, we derive a ‘quasi-renewal equation’ which is shown to yield an excellent description of the firing rate of adapting neurons. We explore the domain of validity of the quasi-renewal equation and compare it with other rate equations for populations of spiking neurons. The problem of decoding the stimulus from the population response (or PSTH) is addressed analogously. We find that for small levels of activity and weak adaptation, a simple accumulator of the past activity is sufficient to decode the original input, but when refractory effects become large decoding becomes a non-linear function of the past activity. The results presented here can be applied to the mean-field analysis of coupled neuron networks, but also to arbitrary point processes with negative self-interaction. PMID:23055914
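    As a concrete reference point, the PSTH analyzed in this record is simply a trial-averaged, binned spike count converted to a rate. A minimal sketch, with bin width and units as illustrative choices:

```python
import numpy as np

def psth(spike_trains, t_max, bin_ms=1.0):
    """Trial-averaged peri-stimulus time histogram in spikes/s.
    spike_trains: list of per-trial spike-time arrays (ms)."""
    edges = np.arange(0.0, t_max + bin_ms, bin_ms)
    counts = np.zeros(len(edges) - 1)
    for train in spike_trains:
        c, _ = np.histogram(train, bins=edges)
        counts += c
    # spikes per trial per bin, converted to a rate (bin width in ms)
    return counts / (len(spike_trains) * bin_ms / 1000.0)
```

    Two trials that both spike at 0.5 ms yield a 1000 spikes/s estimate in the first 1 ms bin and zero elsewhere.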

  1. Noise-induced hearing loss alters the temporal dynamics of auditory-nerve responses

    PubMed Central

    Scheidt, Ryan E.; Kale, Sushrut; Heinz, Michael G.

    2010-01-01

    Auditory-nerve fibers demonstrate dynamic response properties in that they adapt to rapid changes in sound level, both at the onset and offset of a sound. These dynamic response properties affect temporal coding of stimulus modulations that are perceptually relevant for many sounds such as speech and music. Temporal dynamics have been well characterized in auditory-nerve fibers from normal-hearing animals, but little is known about the effects of sensorineural hearing loss on these dynamics. This study examined the effects of noise-induced hearing loss on the temporal dynamics in auditory-nerve fiber responses from anesthetized chinchillas. Post-stimulus time histograms were computed from responses to 50-ms tones presented at characteristic frequency and 30 dB above fiber threshold. Several response metrics related to temporal dynamics were computed from post-stimulus-time histograms and were compared between normal-hearing and noise-exposed animals. Results indicate that noise-exposed auditory-nerve fibers show significantly reduced response latency, increased onset response and percent adaptation, faster adaptation after onset, and slower recovery after offset. The decrease in response latency only occurred in noise-exposed fibers with significantly reduced frequency selectivity. These changes in temporal dynamics have important implications for temporal envelope coding in hearing-impaired ears, as well as for the design of dynamic compression algorithms for hearing aids. PMID:20696230

  2. Adaptive frequency-domain equalization in digital coherent optical receivers.

    PubMed

    Faruk, Md Saifuddin; Kikuchi, Kazuro

    2011-06-20

    We propose a novel frequency-domain adaptive equalizer in digital coherent optical receivers, which can reduce computational complexity of the conventional time-domain adaptive equalizer based on finite-impulse-response (FIR) filters. The proposed equalizer can operate on the input sequence sampled by free-running analog-to-digital converters (ADCs) at the rate of two samples per symbol; therefore, the arbitrary initial sampling phase of ADCs can be adjusted so that the best symbol-spaced sequence is produced. The equalizer can also be configured in the butterfly structure, which enables demultiplexing of polarization tributaries apart from equalization of linear transmission impairments. The performance of the proposed equalization scheme is verified by 40-Gbits/s dual-polarization quadrature phase-shift keying (QPSK) transmission experiments.
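    The complexity saving of frequency-domain equalization comes from replacing a long time-domain FIR convolution with an FFT, one complex multiply per frequency bin, and an inverse FFT. A zero-forcing sketch of the per-bin operation; the adaptive, two-samples-per-symbol butterfly structure of the paper is much richer than this:

```python
import numpy as np

def fd_equalize(rx_block, channel_freq_resp):
    """One-tap-per-bin frequency-domain equalization of one block
    (zero-forcing; assumes circular convolution, e.g. via a cyclic prefix)."""
    return np.fft.ifft(np.fft.fft(rx_block) / channel_freq_resp)
```

    When the channel acts as a circular convolution, division by the channel's frequency response recovers the transmitted block exactly.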

  3. Using recurrent neural networks for adaptive communication channel equalization.

    PubMed

    Kechriotis, G; Zervas, E; Manolakos, E S

    1994-01-01

    Nonlinear adaptive filters based on a variety of neural network models have been used successfully for system identification and noise-cancellation in a wide class of applications. An important problem in data communications is that of channel equalization, i.e., the removal of interferences introduced by linear or nonlinear message corrupting mechanisms, so that the originally transmitted symbols can be recovered correctly at the receiver. In this paper we introduce an adaptive recurrent neural network (RNN) based equalizer whose small size and high performance makes it suitable for high-speed channel equalization. We propose RNN based structures for both trained adaptation and blind equalization, and we evaluate their performance via extensive simulations for a variety of signal modulations and communication channel models. It is shown that the RNN equalizers have comparable performance with traditional linear filter based equalizers when the channel interferences are relatively mild, and that they outperform them by several orders of magnitude when either the channel's transfer function has spectral nulls or severe nonlinear distortion is present. In addition, the small-size RNN equalizers, being essentially generalized IIR filters, are shown to outperform multilayer perceptron equalizers of larger computational complexity in linear and nonlinear channel equalization cases.
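    For context, the traditional linear adaptive equalizer that the RNN structures are compared against updates FIR taps by stochastic gradient descent on the squared error (the LMS rule). A minimal trained-adaptation sketch:

```python
import numpy as np

def lms_equalizer(rx, desired, n_taps=5, mu=0.01):
    """Train an FIR equalizer with the LMS rule; returns the tap vector."""
    w = np.zeros(n_taps)
    for n in range(n_taps - 1, len(rx)):
        x = rx[n - n_taps + 1:n + 1][::-1]   # newest sample first
        e = desired[n] - w @ x               # a-priori error
        w += mu * e * x                      # stochastic-gradient update
    return w
```

    On a distortion-free channel the taps converge to an identity response, which is the sanity check below.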

  4. Fast and efficient search for MPEG-4 video using adjacent pixel intensity difference quantization histogram feature

    NASA Astrophysics Data System (ADS)

    Lee, Feifei; Kotani, Koji; Chen, Qiu; Ohmi, Tadahiro

    2010-02-01

    In this paper, a fast search algorithm for MPEG-4 video clips from a video database is proposed. An adjacent pixel intensity difference quantization (APIDQ) histogram, which had previously been applied reliably to human face recognition, is utilized as the feature vector of a VOP (video object plane). Instead of a fully decompressed video sequence, partially decoded data, namely the DC sequence of the video object, are extracted from the video sequence. Combined with active search, a temporal pruning algorithm, fast and robust video search can be realized. The proposed search algorithm has been evaluated on a total of 15 hours of video containing TV programs such as drama, talk, and news, searching for 200 given MPEG-4 video clips, each 15 seconds long. Experimental results show the proposed algorithm can detect a similar video clip in merely 80 ms, and an Equal Error Rate (EER) of 2% is achieved in the drama and news categories, which is more accurate and robust than the conventional fast video search algorithm.
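    As a rough illustration of the APIDQ idea, adjacent-pixel intensity differences are quantized and accumulated into a histogram feature. This sketch uses horizontal differences only and a hypothetical uniform quantizer; the published APIDQ combines multiple directions and its own quantization table:

```python
import numpy as np

def apidq_histogram(img, n_levels=32):
    """Histogram of uniformly quantized horizontal adjacent-pixel
    differences (illustrative sketch, not the published quantizer)."""
    d = np.diff(img.astype(int), axis=1).ravel()       # differences in [-255, 255]
    q = np.clip((d + 255) * n_levels // 511, 0, n_levels - 1)
    h = np.bincount(q, minlength=n_levels).astype(float)
    return h / h.sum()
```

    A flat image concentrates all mass in the bin for zero difference, which is what the check below exercises.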

  5. Improved Adaptive LSB Steganography Based on Chaos and Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Yu, Lifang; Zhao, Yao; Ni, Rongrong; Li, Ting

    2010-12-01

    We propose a novel steganographic method in JPEG images with high performance. Firstly, we propose an improved adaptive LSB steganography, which can achieve high capacity while preserving the first-order statistics. Secondly, in order to minimize visual degradation of the stego image, we shuffle the bit order of the message based on chaos whose parameters are selected by the genetic algorithm. Shuffling the message's bit order provides a new way to improve the performance of steganography. Experimental results show that our method outperforms classical steganographic methods in image quality, while preserving the characteristics of the histogram and providing high capacity.
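    The LSB substitution underlying the scheme overwrites the least significant bit of selected cover samples with message bits, changing each sample by at most one level. A plain sketch without the adaptive selection or chaotic shuffling of the record:

```python
import numpy as np

def embed_lsb(cover, bits):
    """Write message bits into the LSBs of the first len(bits) samples."""
    stego = cover.copy()
    n = len(bits)
    stego[:n] = (stego[:n] // 2) * 2 + bits   # clear the LSB, then set it
    return stego

def extract_lsb(stego, n):
    """Read the message back from the first n LSBs."""
    return stego[:n] % 2
```

    Extraction needs only the embedding positions, and the distortion per sample is bounded by one gray level.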

  6. A novel joint-processing adaptive nonlinear equalizer using a modular recurrent neural network for chaotic communication systems.

    PubMed

    Zhao, Haiquan; Zeng, Xiangping; Zhang, Jiashu; Liu, Yangguang; Wang, Xiaomin; Li, Tianrui

    2011-01-01

    To eliminate nonlinear channel distortion in chaotic communication systems, a novel joint-processing adaptive nonlinear equalizer based on a pipelined recurrent neural network (JPRNN) is proposed, using a modified real-time recurrent learning (RTRL) algorithm. Furthermore, an adaptive amplitude RTRL algorithm is adopted to overcome the deteriorating effect introduced by the nesting process. Computer simulations illustrate that the proposed equalizer outperforms the pipelined recurrent neural network (PRNN) and recurrent neural network (RNN) equalizers. Copyright © 2010 Elsevier Ltd. All rights reserved.

  7. A negentropy minimization approach to adaptive equalization for digital communication systems.

    PubMed

    Choi, Sooyong; Lee, Te-Won

    2004-07-01

    In this paper, we introduce and investigate a new adaptive equalization method based on minimizing the approximate negentropy of the estimation error for a finite-length equalizer. We consider an approximate negentropy using nonpolynomial expansions of the estimation error as a new performance criterion to improve upon a linear equalizer based on minimizing the minimum mean squared error (MMSE). Negentropy includes higher-order statistical information, and its minimization provides improved convergence, performance, and accuracy compared to traditional methods such as MMSE in terms of bit error rate (BER). The proposed negentropy minimization (NEGMIN) equalizer has two kinds of solutions, the MMSE solution and an alternative one, depending on the ratio of the normalization parameters. The NEGMIN equalizer has the best BER performance when the ratio of the normalization parameters is properly adjusted to maximize the output power (variance) of the NEGMIN equalizer. Simulation experiments show that the BER performance of the NEGMIN equalizer with the non-MMSE solution has characteristics similar to the adaptive minimum bit error rate (AMBER) equalizer. The main advantage of the proposed equalizer is that it needs significantly fewer training symbols than the AMBER equalizer. Furthermore, the proposed equalizer is more robust to nonlinear distortions than the MMSE equalizer.

  8. Lifting wavelet method of target detection

    NASA Astrophysics Data System (ADS)

    Han, Jun; Zhang, Chi; Jiang, Xu; Wang, Fang; Zhang, Jin

    2009-11-01

    Image target recognition plays a very important role in scientific exploration, aeronautics, space-to-ground observation, photography, and topographic mapping. Image noise, blur, and various kinds of interference in complex environments have always affected the stability of recognition algorithms. To address the real-time, accuracy, and anti-interference problems of target detection, this paper uses a lifting-wavelet image target detection method. First, histogram equalization and frame differencing are used to obtain the target region, followed by adaptive thresholding and mathematical morphology operations to eliminate background errors. Second, a multi-channel wavelet filter performs wavelet-transform denoising and enhancement of the original image, overcoming the noise sensitivity of general algorithms and reducing the false-detection rate; the multi-resolution characteristics of the wavelet and the lifting framework, which can be designed directly in the spatial domain, are exploited for target detection and feature extraction. The experimental results show that the designed lifting wavelet resolves the detection difficulties caused by target motion against complex backgrounds, effectively suppresses noise, and improves the efficiency and speed of detection.

  9. The effects of cloud inhomogeneities upon radiative fluxes, and the supply of a cloud truth validation dataset

    NASA Technical Reports Server (NTRS)

    Welch, Ronald M.

    1996-01-01

    The ASTER polar cloud mask algorithm is currently under development. Several classification techniques have been developed and implemented. The merits and accuracy of each are being examined. The classification techniques under investigation include fuzzy logic, hierarchical neural network, and a pairwise histogram comparison scheme based on sample histograms called the Paired Histogram Method. Scene adaptive methods also are being investigated as a means to improve classifier performance. The feature, arctan of Band 4 and Band 5, and the Band 2 vs. Band 4 feature space are key to separating frozen water (e.g., ice/snow, slush/wet ice, etc.) from cloud over frozen water, and land from cloud over land, respectively. A total of 82 Landsat TM circumpolar scenes are being used as a basis for algorithm development and testing. Numerous spectral features are being tested and include the 7 basic Landsat TM bands, in addition to ratios, differences, arctans, and normalized differences of each combination of bands. A technique for deriving cloud base and top height is developed. It uses 2-D cross correlation between a cloud edge and its corresponding shadow to determine the displacement of the cloud from its shadow. The height is then determined from this displacement, the solar zenith angle, and the sensor viewing angle.

  10. Ship detection based on rotation-invariant HOG descriptors for airborne infrared images

    NASA Astrophysics Data System (ADS)

    Xu, Guojing; Wang, Jinyan; Qi, Shengxiang

    2018-03-01

    Infrared thermal imagery is widely used in various kinds of aircraft because it works day and night. Detecting ships in infrared images has attracted considerable research interest in recent years. In the case of downward-looking infrared imagery, in order to overcome the uncertainty of target imaging attitude due to the unknown positional relationship between the aircraft and the target, we propose a new infrared ship detection method which integrates rotation-invariant gradient direction histogram (Circle Histogram of Oriented Gradients, C-HOG) descriptors and a support vector machine (SVM) classifier. In detail, the proposed method uses HOG descriptors to express the local features of infrared images, to adapt to changes in illumination and to overcome sea clutter effects. Differently from the traditional computation of the HOG descriptor, we subdivide the image into annular spatial bins instead of rectangular sub-regions, and then a Radial Gradient Transform (RGT) is applied to the gradient to achieve rotation-invariant histogram information. Considering airborne engineering application and real-time requirements, we use an SVM trained on ship target and non-target background infrared sample images to discriminate real ships from false targets. Experimental results show that the proposed method performs well in both robustness and run time for infrared ship target detection with different rotation angles.
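    The HOG building block referenced here is a magnitude-weighted histogram of gradient orientations. A sketch of one unsigned-orientation cell histogram; the C-HOG of the record additionally uses annular spatial bins and the radial gradient transform to gain rotation invariance:

```python
import numpy as np

def orientation_histogram(img, n_bins=9):
    """Magnitude-weighted histogram of unsigned gradient orientations
    (a single HOG cell, L1-normalized)."""
    gy, gx = np.gradient(img.astype(float))          # gradients along rows, cols
    mag = np.hypot(gx, gy)
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180.0     # unsigned orientation
    bins = np.minimum((ang / (180.0 / n_bins)).astype(int), n_bins - 1)
    h = np.bincount(bins.ravel(), weights=mag.ravel(), minlength=n_bins)
    return h / (h.sum() + 1e-12)
```

    A horizontal intensity ramp has all its gradient energy at orientation zero, so the first bin carries essentially all the mass.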

  11. Fast and fully automatic phalanx segmentation using a grayscale-histogram morphology algorithm

    NASA Astrophysics Data System (ADS)

    Hsieh, Chi-Wen; Liu, Tzu-Chiang; Jong, Tai-Lang; Chen, Chih-Yen; Tiu, Chui-Mei; Chan, Din-Yuen

    2011-08-01

    Bone age assessment is a common radiological examination used in pediatrics to diagnose the discrepancy between the skeletal and chronological age of a child; therefore, it is beneficial to develop a computer-based bone age assessment to help junior pediatricians estimate bone age easily. Unfortunately, the phalanx on radiograms is not easily separated from the background and soft tissue. Therefore, we proposed a new method, called the grayscale-histogram morphology algorithm, to segment the phalanges fast and precisely. The algorithm includes three parts: a tri-stage sieve algorithm used to eliminate the background of hand radiograms, a centroid-edge dual scanning algorithm to frame the phalanx region, and finally a segmentation algorithm based on disk traverse-subtraction filter to segment the phalanx. Moreover, two more segmentation methods: adaptive two-mean and adaptive two-mean clustering were performed, and their results were compared with the segmentation algorithm based on disk traverse-subtraction filter using five indices comprising misclassification error, relative foreground area error, modified Hausdorff distances, edge mismatch, and region nonuniformity. In addition, the CPU time of the three segmentation methods was discussed. The result showed that our method had a better performance than the other two methods. Furthermore, satisfactory segmentation results were obtained with a low standard error.
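    The two-mean comparison methods in this record split the grayscale range by iterating between a threshold and the means of the two classes it induces. A basic sketch of that fixed-point iteration (the details of the adaptive variants are not specified in the abstract):

```python
import numpy as np

def two_mean_threshold(pixels, iters=50):
    """Iterative two-class split: threshold = average of the class means."""
    pixels = np.asarray(pixels, dtype=float)
    t = pixels.mean()
    for _ in range(iters):
        lo, hi = pixels[pixels <= t], pixels[pixels > t]
        if len(lo) == 0 or len(hi) == 0:
            break
        new_t = (lo.mean() + hi.mean()) / 2.0
        if abs(new_t - t) < 1e-6:      # converged
            break
        t = new_t
    return t
```

    For a clean bimodal distribution the iteration settles midway between the two modes.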

  12. Adaptive Reception for Underwater Communications

    DTIC Science & Technology

    2011-06-01

    Experimental results prove the effectiveness of the receiver. Subject terms: underwater acoustic communications, adaptive algorithms, Kalman filter. ...the update algorithm design and the value of the spatial diversity are addressed. In this research, an adaptive multichannel equalizer made up of a... for the time-varying nature of the channel is to use an Adaptive Decision Feedback Equalizer based on either the RLS or LMS algorithm. Although this

  13. Content based Image Retrieval based on Different Global and Local Color Histogram Methods: A Survey

    NASA Astrophysics Data System (ADS)

    Suhasini, Pallikonda Sarah; Sri Rama Krishna, K.; Murali Krishna, I. V.

    2017-02-01

    Different global and local color histogram methods for content based image retrieval (CBIR) are investigated in this paper. The color histogram is a widely used descriptor for CBIR. The conventional method of extracting a color histogram is global, which misses the spatial content, is less invariant to deformation and viewpoint changes, and results in a very large three-dimensional histogram corresponding to the color space used. To address these deficiencies, different global and local histogram methods have been proposed in recent research: different ways of extracting local histograms to establish spatial correspondence, invariant color histograms to add deformation and viewpoint invariance, and a fuzzy linking method to reduce the size of the histogram. The color space and the distance metric used are vital in obtaining a color histogram. In this paper, the performance of CBIR based on different global and local color histograms in three different color spaces, namely RGB, HSV, and L*a*b*, and with three distance measures, Euclidean, quadratic, and histogram intersection, is surveyed, to choose an appropriate method for future research.
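    Of the three distance measures surveyed, histogram intersection is the simplest: the overlap of two normalized histograms, with 1 meaning identical and 0 meaning disjoint. A one-line sketch:

```python
import numpy as np

def histogram_intersection(h1, h2):
    """Similarity in [0, 1] for two L1-normalized histograms."""
    return float(np.minimum(h1, h2).sum())
```

    Unlike the Euclidean and quadratic distances, this is a similarity, so retrieval ranks candidates by descending score.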

  14. Learning Rate Updating Methods Applied to Adaptive Fuzzy Equalizers for Broadband Power Line Communications

    NASA Astrophysics Data System (ADS)

    Ribeiro, Moisés V.

    2004-12-01

    This paper introduces adaptive fuzzy equalizers with variable step size for broadband power line (PL) communications. Based on delta-bar-delta and local Lipschitz estimation updating rules, feedforward, and decision feedback approaches, we propose singleton and nonsingleton fuzzy equalizers with variable step size to cope with the intersymbol interference (ISI) effects of PL channels and the hardness of the impulse noises generated by appliances and nonlinear loads connected to low-voltage power grids. The computed results show that the convergence rates of the proposed equalizers are higher than the ones attained by the traditional adaptive fuzzy equalizers introduced by J. M. Mendel and his students. Additionally, some interesting BER curves reveal that the proposed techniques are efficient for mitigating the above-mentioned impairments.

  15. Naturalness preservation image contrast enhancement via histogram modification

    NASA Astrophysics Data System (ADS)

    Tian, Qi-Chong; Cohen, Laurent D.

    2018-04-01

    Contrast enhancement is a technique for enhancing image contrast to obtain better visual quality. Since many existing contrast enhancement algorithms usually produce over-enhanced results, naturalness preservation needs to be considered in the framework of image contrast enhancement. This paper proposes a naturalness-preserving contrast enhancement method, which adopts histogram matching to improve the contrast and uses image quality assessment to automatically select the optimal target histogram. Contrast improvement and naturalness preservation are both considered in the target histogram, so this method can avoid the over-enhancement problem. In the proposed method, the optimal target histogram is a weighted sum of the original histogram, the uniform histogram, and a Gaussian-shaped histogram. Then a structural metric and a statistical naturalness metric are used to determine the weights of the corresponding histograms. Finally, the contrast-enhanced image is obtained by matching the optimal target histogram. The experiments demonstrate the proposed method outperforms the compared histogram-based contrast enhancement algorithms.
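    The core mechanics, blending a target histogram from the original, uniform, and Gaussian-shaped ones and then matching to it via cumulative distributions, can be sketched as follows. The quality-metric weight selection is the part this sketch omits: the weights, mean, and spread below are placeholders, not the paper's values.

```python
import numpy as np

def match_histogram(img, target_hist):
    """Map 8-bit pixel values so the output histogram approximates target_hist."""
    src_hist = np.bincount(img.ravel(), minlength=256).astype(float)
    src_cdf = np.cumsum(src_hist) / src_hist.sum()
    tgt_cdf = np.cumsum(target_hist) / np.sum(target_hist)
    lut = np.searchsorted(tgt_cdf, src_cdf).clip(0, 255).astype(np.uint8)
    return lut[img]

def blended_target(hist, w_orig, w_uni, w_gauss, mu=128.0, sigma=64.0):
    """Weighted sum of original, uniform, and Gaussian-shaped histograms."""
    x = np.arange(256)
    gauss = np.exp(-0.5 * ((x - mu) / sigma) ** 2)
    h = w_orig * hist / hist.sum() + w_uni / 256.0 + w_gauss * gauss / gauss.sum()
    return h / h.sum()
```

    Matching to a purely uniform target reduces to ordinary histogram equalization, one extreme of the blend.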

  16. Video Segmentation Descriptors for Event Recognition

    DTIC Science & Technology

    2014-12-08

    Velastin, 3D Extended Histogram of Oriented Gradients (3DHOG) for Classification of Road Users in Urban Scenes, BMVC, 2009. [3] M.-Y. Chen and A. Hauptmann... computed on the 3D volume output by the hierarchical segmentation. Each video is described as follows. Each supertube is temporally divided in n-frame... the strength of these descriptors is their adaptability to scene variations since they are grounded on a video segmentation. This makes them naturally robust

  17. Numerically accurate computational techniques for optimal estimator analyses of multi-parameter models

    NASA Astrophysics Data System (ADS)

    Berger, Lukas; Kleinheinz, Konstantin; Attili, Antonio; Bisetti, Fabrizio; Pitsch, Heinz; Mueller, Michael E.

    2018-05-01

    Modelling unclosed terms in partial differential equations typically involves two steps: First, a set of known quantities needs to be specified as input parameters for a model, and second, a specific functional form needs to be defined to model the unclosed terms by the input parameters. Both steps involve a certain modelling error, with the former known as the irreducible error and the latter referred to as the functional error. Typically, only the total modelling error, which is the sum of functional and irreducible error, is assessed, but the concept of the optimal estimator enables the separate analysis of the total and the irreducible errors, yielding a systematic modelling error decomposition. In this work, attention is paid to the techniques themselves required for the practical computation of irreducible errors. Typically, histograms are used for optimal estimator analyses, but this technique is found to add a non-negligible spurious contribution to the irreducible error if models with multiple input parameters are assessed. Thus, the error decomposition of an optimal estimator analysis becomes inaccurate, and misleading conclusions concerning modelling errors may be drawn. In this work, numerically accurate techniques for optimal estimator analyses are identified and a suitable evaluation of irreducible errors is presented. Four different computational techniques are considered: a histogram technique, artificial neural networks, multivariate adaptive regression splines, and an additive model based on a kernel method. For multiple input parameter models, only artificial neural networks and multivariate adaptive regression splines are found to yield satisfactorily accurate results. Beyond a certain number of input parameters, the assessment of models in an optimal estimator analysis even becomes practically infeasible if histograms are used. 
The optimal estimator analysis in this paper is applied to modelling the filtered soot intermittency in large eddy simulations using a dataset of a direct numerical simulation of a non-premixed sooting turbulent flame.

  18. Color Swapping to Enhance Breast Cancer Digital Images Qualities Using Stain Normalization

    NASA Astrophysics Data System (ADS)

    Muhimmah, Izzati; Puspasari Wijaya, Dhina; Indrayanti

    2017-03-01

    Histopathology is the diagnosis of disease by visual examination of tissues under the microscope. The virtually transparent tissue sections are prepared using a number of colored histochemical stains that bind selectively to cellular components. Color variation is a problem in histopathology, arising from the microscope lighting and a range of other factors. This research aimed to investigate image enhancement by applying a nonlinear mapping approach to stain normalization and histogram equalization for contrast enhancement. Validation was carried out on 59 datasets, with 96.6% accordance with expert justification.

  19. Blind adaptive equalization of polarization-switched QPSK modulation.

    PubMed

    Millar, David S; Savory, Seb J

    2011-04-25

    Coherent detection in combination with digital signal processing has recently enabled significant progress in the capacity of optical communications systems. This improvement has enabled detection of optimum constellations for optical signals in four dimensions. In this paper, we propose and investigate an algorithm for the blind adaptive equalization of one such modulation format: polarization-switched quaternary phase shift keying (PS-QPSK). The proposed algorithm, which includes both blind initialization and adaptation of the equalizer, is found to be insensitive to the input polarization state and demonstrates highly robust convergence in the presence of PDL, DGD and polarization rotation.
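    Blind adaptation for constant-envelope formats is classically done with the constant modulus algorithm (CMA); PS-QPSK needs a modified cost function, but the CMA skeleton conveys the idea. A hedged single-polarization sketch: the equalizer in the record is a polarization butterfly with its own initialization scheme, which this does not reproduce.

```python
import numpy as np

def cma_equalize(rx, n_taps=7, mu=1e-3, radius=1.0):
    """Constant-modulus blind FIR equalizer (single polarization sketch)."""
    w = np.zeros(n_taps, dtype=complex)
    w[n_taps // 2] = 1.0                        # center-spike initialization
    out = np.zeros(len(rx), dtype=complex)
    for n in range(n_taps - 1, len(rx)):
        x = rx[n - n_taps + 1:n + 1][::-1]      # newest sample first
        y = w @ x
        out[n] = y
        # gradient step on the constant-modulus cost E[(|y|^2 - R)^2]
        w += mu * y * (radius - np.abs(y) ** 2) * np.conj(x)
    return out, w
```

    No training symbols are needed: the update only penalizes deviation of the output modulus from the target radius.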

  20. Semi-automatic breast ultrasound image segmentation based on mean shift and graph cuts.

    PubMed

    Zhou, Zhuhuang; Wu, Weiwei; Wu, Shuicai; Tsui, Po-Hsiang; Lin, Chung-Chih; Zhang, Ling; Wang, Tianfu

    2014-10-01

    Computerized tumor segmentation on breast ultrasound (BUS) images remains a challenging task. In this paper, we proposed a new method for semi-automatic tumor segmentation on BUS images using Gaussian filtering, histogram equalization, mean shift, and graph cuts. The only interaction required was to select two diagonal points to determine a region of interest (ROI) on an input image. The ROI image was shrunken by a factor of 2 using bicubic interpolation to reduce computation time. The shrunken image was smoothed by a Gaussian filter and then contrast-enhanced by histogram equalization. Next, the enhanced image was filtered by pyramid mean shift to improve homogeneity. The object and background seeds for graph cuts were automatically generated on the filtered image. Using these seeds, the filtered image was then segmented by graph cuts into a binary image containing the object and background. Finally, the binary image was expanded by a factor of 2 using bicubic interpolation, and the expanded image was processed by morphological opening and closing to refine the tumor contour. The method was implemented with OpenCV 2.4.3 and Visual Studio 2010 and tested on 38 BUS images with benign tumors and 31 BUS images with malignant tumors from different ultrasound scanners. Experimental results showed that our method had a true positive rate (TP) of 91.7%, a false positive (FP) rate of 11.9%, and a similarity (SI) rate of 85.6%. The mean run time on an Intel Core 2.66 GHz CPU with 4 GB RAM was 0.49 ± 0.36 s. The experimental results indicate that the proposed method may be useful in BUS image segmentation. © The Author(s) 2014.
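    The contrast-enhancement stage of this pipeline is standard global histogram equalization: remap each gray level through the normalized cumulative histogram. A minimal 8-bit sketch:

```python
import numpy as np

def hist_equalize(img):
    """Global histogram equalization for an 8-bit grayscale image."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]                 # first occupied level
    lut = np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255.0)
    return lut.clip(0, 255).astype(np.uint8)[img]
```

    The remapping stretches the occupied gray levels across the full 0..255 range, which is what the assertion below checks.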

  1. Three-Class Mammogram Classification Based on Descriptive CNN Features

    PubMed Central

    Zhang, Qianni; Jadoon, Adeel

    2017-01-01

    In this paper, a novel classification technique for a large data set of mammograms using a deep learning method is proposed. The proposed model targets a three-class classification study (normal, malignant, and benign cases). In our model we have presented two methods, namely, convolutional neural network-discrete wavelet (CNN-DW) and convolutional neural network-curvelet transform (CNN-CT). An augmented data set is generated by using mammogram patches. To enhance the contrast of mammogram images, the data set is filtered by contrast limited adaptive histogram equalization (CLAHE). In the CNN-DW method, enhanced mammogram images are decomposed into their four subbands by means of two-dimensional discrete wavelet transform (2D-DWT), while in the second method discrete curvelet transform (DCT) is used. In both methods, dense scale-invariant features (DSIFT) are extracted for all subbands. An input data matrix containing these subband features of all the mammogram patches is created and processed as input to a convolutional neural network (CNN). A softmax layer and a support vector machine (SVM) layer are used to train the CNN for classification. The proposed methods have been compared with existing methods in terms of accuracy rate, error rate, and various validation assessment measures. CNN-DW and CNN-CT have achieved accuracy rates of 81.83% and 83.74%, respectively. Simulation results clearly validate the significance and impact of our proposed model as compared to other well-known existing techniques. PMID:28191461

  3. Malignancy Detection on Mammography Using Dual Deep Convolutional Neural Networks and Genetically Discovered False Color Input Enhancement.

    PubMed

    Teare, Philip; Fishman, Michael; Benzaquen, Oshra; Toledano, Eyal; Elnekave, Eldad

    2017-08-01

    Breast cancer is the most prevalent malignancy in the US and the third highest cause of cancer-related mortality worldwide. Regular mammography screening has been attributed with doubling the rate of early cancer detection over the past three decades, yet estimates of mammographic accuracy in the hands of experienced radiologists remain suboptimal, with sensitivity ranging from 62 to 87% and specificity from 75 to 91%. Advances in machine learning (ML) in recent years have demonstrated capabilities of image analysis which often surpass those of human observers. Here we present two novel techniques to address inherent challenges in the application of ML to the domain of mammography. We describe the use of genetic search of image enhancement methods, leading us to the use of a novel form of false color enhancement through contrast limited adaptive histogram equalization (CLAHE), as a method to optimize mammographic feature representation. We also utilize dual deep convolutional neural networks at different scales for classification of full mammogram images and derivative patches, combined with a random forest gating network, as a novel architectural solution capable of discerning malignancy with a sensitivity of 0.91 and a specificity of 0.80. To our knowledge, this represents the first automatic stand-alone mammography malignancy detection algorithm with sensitivity and specificity performance similar to that of expert radiologists.

  4. A novel algorithm to detect glaucoma risk using texton and local configuration pattern features extracted from fundus images.

    PubMed

    Acharya, U Rajendra; Bhat, Shreya; Koh, Joel E W; Bhandary, Sulatha V; Adeli, Hojjat

    2017-09-01

    Glaucoma is an optic neuropathy defined by characteristic damage to the optic nerve and accompanying visual field deficits. Early diagnosis and treatment are critical to prevent irreversible vision loss and ultimate blindness. Current techniques for computer-aided analysis of the optic nerve and retinal nerve fiber layer (RNFL) are expensive and require keen interpretation by trained specialists. Hence, an automated system is highly desirable for a cost-effective and accurate screening for the diagnosis of glaucoma. This paper presents a new methodology and a computerized diagnostic system. Color fundus images are converted to grayscale and enhanced using adaptive histogram equalization, and the resulting images are convolved with the Leung-Malik (LM), Schmid (S), and maximum response (MR4 and MR8) filter banks. The convolution process produces textons, the basic microstructures of typical images. Local configuration pattern (LCP) features are extracted from these textons. The significant features are selected using a sequential floating forward search (SFFS) method and ranked using the statistical t-test. Finally, various classifiers are used for classification of images into normal and glaucomatous classes. A high classification accuracy of 95.8% is achieved using six features obtained from the LM filter bank and the k-nearest neighbor (kNN) classifier. A glaucoma risk index (GRI) is also formulated to obtain a reliable and effective system. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. A robust technique based on VLM and Frangi filter for retinal vessel extraction and denoising.

    PubMed

    Khan, Khan Bahadar; Khaliq, Amir A; Jalil, Abdul; Shahid, Muhammad

    2018-01-01

    The exploration of retinal vessel structure is colossally important on account of numerous diseases, including stroke, diabetic retinopathy (DR) and coronary heart disease, which can damage the retinal vessel structure. The retinal vascular network is very hard to extract due to its spreading and diminishing geometry and contrast variation within an image. The proposed technique consists of unique parallel processes for denoising and extraction of blood vessels in retinal images. In the preprocessing section, adaptive histogram equalization enhances the dissimilarity between the vessels and the background, and morphological top-hat filters are employed to eliminate the macula, optic disc, etc. To remove local noise, the difference image is computed from the top-hat filtered image and the high-boost filtered image. The Frangi filter is applied at multiple scales to enhance vessels of diverse widths. Segmentation is performed by applying improved Otsu thresholding to the high-boost filtered image and to Frangi's enhanced image, separately. In the postprocessing steps, a Vessel Location Map (VLM) is extracted by using a raster-to-vector transformation. Postprocessing steps are employed in a novel way to reject misclassified vessel pixels. The final segmented image is obtained by a pixel-by-pixel AND operation between the VLM and the Frangi output image. The method has been rigorously analyzed on the STARE, DRIVE and HRF datasets.
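    The segmentation stage builds on Otsu thresholding; a textbook NumPy version of plain Otsu (the paper's *improved* variant adds refinements not reproduced here):

```python
import numpy as np

def otsu_threshold(img, levels=256):
    """Plain Otsu: choose the threshold that maximizes the between-class
    variance sigma_b^2(k) = [mu_T*omega(k) - mu(k)]^2 / [omega(k)(1 - omega(k))]."""
    hist = np.bincount(img.ravel(), minlength=levels).astype(np.float64)
    p = hist / hist.sum()
    omega = np.cumsum(p)                       # class-0 probability
    mu = np.cumsum(p * np.arange(levels))      # class-0 cumulative mean
    mu_t = mu[-1]                              # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    sigma_b = np.nan_to_num(sigma_b, nan=0.0, posinf=0.0, neginf=0.0)
    return int(np.argmax(sigma_b))
```

    On a vessel-enhanced image, pixels above the returned threshold form the candidate vessel map that the VLM then prunes.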

  6. Touch HDR: photograph enhancement by user controlled wide dynamic range adaptation

    NASA Astrophysics Data System (ADS)

    Verrall, Steve; Siddiqui, Hasib; Atanassov, Kalin; Goma, Sergio; Ramachandra, Vikas

    2013-03-01

    High Dynamic Range (HDR) technology enables photographers to capture a greater range of tonal detail. HDR is typically used to bring out detail in a dark foreground object set against a bright background. HDR technologies include multi-frame HDR and single-frame HDR. Multi-frame HDR requires the combination of a sequence of images taken at different exposures. Single-frame HDR requires histogram equalization post-processing of a single image, a technique referred to as local tone mapping (LTM). Images generated using HDR technology can look less natural than their non-HDR counterparts. Sometimes it is only desired to enhance small regions of an original image. For example, it may be desired to enhance the tonal detail of one subject's face while preserving the original background. The Touch HDR technique described in this paper achieves these goals by enabling selective blending of HDR and non-HDR versions of the same image to create a hybrid image. The HDR version of the image can be generated by either multi-frame or single-frame HDR. Selective blending can be performed as a post-processing step, for example, as a feature of a photo editor application, at any time after the image has been captured. HDR and non-HDR blending is controlled by a weighting surface, which is configured by the user through a sequence of touches on a touchscreen.
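    The selective blending described above is a per-pixel convex combination controlled by the weighting surface; a minimal grayscale sketch (the radial `touch` surface is an invented stand-in for a user stroke):

```python
import numpy as np

def blend_hdr(original, hdr, weight):
    """Hybrid image: weight = 1 keeps the HDR rendition, weight = 0 keeps
    the original pixel; intermediate values mix the two smoothly."""
    w = np.clip(weight, 0.0, 1.0)
    return w * hdr + (1.0 - w) * original

# The weighting surface would come from touchscreen strokes; here a
# Gaussian falloff stands in for a single touch at the image centre.
yy, xx = np.mgrid[0:64, 0:64]
touch = np.exp(-((yy - 32) ** 2 + (xx - 32) ** 2) / (2 * 12.0 ** 2))
```

    Because the blend is convex, the hybrid never leaves the range spanned by the two source images, so no re-normalization is needed after blending.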

  7. Filipinos in the Navy: Service, Interpersonal Relations, and Cultural Adaptation

    DTIC Science & Technology

    1977-01-01

    conversation. Between people of equal social standing it would be natural to expect that the number of occasions when each party starts the conversation...would be equal. A deviation from this one-to-one ratio is likely to reflect the inequality of social relations. Thus, we asked the Filipino... Key words: American equal opportunity; Filipino cultural adaptation

  8. Properties of an adaptive feedback equalization algorithm.

    PubMed

    Engebretson, A M; French-St George, M

    1993-01-01

    This paper describes a new approach to feedback equalization for hearing aids. The method involves the use of an adaptive algorithm that estimates and tracks the characteristic of the hearing aid feedback path. The algorithm is described and the results of simulation studies and bench testing are presented.
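    The abstract does not spell out the algorithm, but feedback-path estimation of this kind is classically an LMS adaptive FIR identification problem; a toy sketch under that assumption (the 4-tap path `h` and all parameters are invented for the demo):

```python
import numpy as np

def lms_identify(x, d, n_taps, mu=0.02):
    """LMS system identification: adapt FIR weights w so that the filter
    output w * x tracks the observed feedback signal d."""
    w = np.zeros(n_taps)
    buf = np.zeros(n_taps)          # most recent input sample first
    for xi, di in zip(x, d):
        buf = np.roll(buf, 1)
        buf[0] = xi
        e = di - w @ buf            # prediction error
        w += mu * e * buf           # stochastic gradient step
    return w

rng = np.random.default_rng(1)
x = rng.standard_normal(5000)                  # probe signal
h = np.array([0.5, -0.3, 0.2, 0.1])            # hypothetical feedback path
d = np.convolve(x, h)[: len(x)]                # observed feedback signal
w = lms_identify(x, d, n_taps=4)
```

    Once the estimate `w` tracks the feedback path, subtracting the re-filtered output from the microphone signal cancels the feedback, which is the essence of equalization schemes like the one above.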

  9. MRI histogram analysis enables objective and continuous classification of intervertebral disc degeneration.

    PubMed

    Waldenberg, Christian; Hebelka, Hanna; Brisby, Helena; Lagerstrand, Kerstin Magdalena

    2018-05-01

    Magnetic resonance imaging (MRI) is the best diagnostic imaging method for low back pain. However, the technique is currently not utilized in its full capacity, often failing to depict painful intervertebral discs (IVDs), potentially due to the rough degeneration classification system used clinically today. MR image histograms, which reflect the IVD heterogeneity, may offer sensitive imaging biomarkers for IVD degeneration classification. This study investigates the feasibility of using histogram analysis as a means of objective and continuous grading of IVD degeneration. Forty-nine IVDs in ten low back pain patients (six males, 25-69 years) were examined with MRI (T2-weighted images and T2-maps). Each IVD was semi-automatically segmented on three mid-sagittal slices. Histogram features of the IVD were extracted from the defined regions of interest and correlated to Pfirrmann grade. Both T2-weighted images and T2-maps displayed similar histogram features. Histograms of well-hydrated IVDs displayed two separate peaks, representing the annulus fibrosus and the nucleus pulposus. Degenerated IVDs displayed decreased peak separation, and the separation was shown to correlate strongly with Pfirrmann grade (P < 0.05). In addition, some degenerated IVDs within the same Pfirrmann grade displayed diametrically different histogram appearances. Histogram features correlated well with IVD degeneration, suggesting that IVD histogram analysis is a suitable tool for objective and continuous IVD degeneration classification. As histogram analysis revealed IVD heterogeneity, it may be a clinical tool for characterization of regional IVD degeneration effects. To elucidate the usefulness of histogram analysis in patient management, IVD histogram features between asymptomatic and symptomatic individuals need to be compared.
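    The two-peak structure and its separation can be computed directly from a histogram; a NumPy sketch with synthetic bimodal counts standing in for the annulus/nucleus intensity clusters (the paper's exact feature definitions are not reproduced):

```python
import numpy as np

def peak_separation(values, bins):
    """Distance between the two tallest interior histogram peaks; returns
    0.0 when fewer than two peaks exist, as in a degenerated disc whose
    annulus and nucleus intensity clusters have merged."""
    hist, edges = np.histogram(values, bins=bins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    # interior bins strictly higher than both neighbours
    interior = (hist[1:-1] > hist[:-2]) & (hist[1:-1] > hist[2:])
    peaks = np.where(interior)[0] + 1
    if len(peaks) < 2:
        return 0.0
    top2 = peaks[np.argsort(hist[peaks])[-2:]]
    return float(abs(centers[top2[0]] - centers[top2[1]]))

# Synthetic bimodal intensity sample: two clusters centred at 30 and 100
counts = {28: 1, 29: 3, 30: 5, 31: 3, 32: 1,
          98: 1, 99: 3, 100: 5, 101: 3, 102: 1}
values = np.repeat(list(counts), list(counts.values())).astype(float)
```

    A large separation indicates a well-hydrated disc with distinct tissue compartments; the value shrinking toward zero tracks the merging of the two peaks described above.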

  10. Adaptively combined FIR and functional link artificial neural network equalizer for nonlinear communication channel.

    PubMed

    Zhao, Haiquan; Zhang, Jiashu

    2009-04-01

    This paper proposes a novel, computationally efficient adaptive nonlinear equalizer based on a combination of a finite impulse response (FIR) filter and a functional link artificial neural network (CFFLANN) to compensate linear and nonlinear distortions in nonlinear communication channels. This convex nonlinear combination improves convergence speed while retaining a low steady-state error. In addition, since the CFFLANN does not need the hidden layers found in conventional neural-network-based equalizers, it exhibits a simpler structure than traditional neural networks (NNs) and requires less computation during training. Moreover, an appropriate adaptation algorithm for the proposed equalizer is derived using a modified least mean square (MLMS) method. Simulation results clearly show that the proposed equalizer with the MLMS algorithm can effectively eliminate linear and nonlinear distortions of various intensities and offers better anti-jamming performance. Furthermore, comparisons of the mean squared error (MSE), the bit error rate (BER), and the effect of the eigenvalue ratio (EVR) of the input correlation matrix are presented.

  11. Deforestation due to Urbanization: a Case Study for Trabzon, Turkey

    NASA Astrophysics Data System (ADS)

    Telkenaroglu, C.; Dikmen, M.

    2017-11-01

    This paper examines the deforestation of Trabzon, Turkey, due to urbanization between 2006 and 2016. For this purpose, Landsat 7 ETM+ (Enhanced Thematic Mapper Plus) images are obtained from the United States Geological Survey (USGS) archive (USGS, 2017a) and the VNIR bands relevant to this study are utilized. For both years, and for each band, histograms are equalized. Finally, Normalized Difference Vegetation Index (NDVI) values are calculated as images. The resulting vegetation indices are assessed in comparison to binary ground truth images. A visual inspection against Google Timelapse imagery for each year is also performed to validate and support the results.

  12. METEOSAT studies of clouds and radiation budget

    NASA Technical Reports Server (NTRS)

    Saunders, R. W.

    1982-01-01

    Radiation budget studies of the atmosphere/surface system from Meteosat, cloud parameter determination from space, and sea surface temperature measurements from TIROS-N data are all described. This work was carried out on the interactive planetary image processing system (IPIPS), which allows interactive manipulation of the image data in addition to the conventional computational tasks. The current hardware configuration of IPIPS is shown. The I²S is the principal interactive display, allowing interaction via a trackball, four buttons under program control, or a touch tablet. Simple image processing operations, such as contrast enhancement, pseudocoloring, histogram equalization, and multispectral combinations, can all be executed at the push of a button.

  13. Bin Ratio-Based Histogram Distances and Their Application to Image Classification.

    PubMed

    Hu, Weiming; Xie, Nianhua; Hu, Ruiguang; Ling, Haibin; Chen, Qiang; Yan, Shuicheng; Maybank, Stephen

    2014-12-01

    Large variations in image background may cause partial matching and normalization problems for histogram-based representations, i.e., the histograms of the same category may have bins which are significantly different, and normalization may produce large changes in the differences between corresponding bins. In this paper, we deal with this problem by using the ratios between bin values of histograms, rather than bin values' differences which are used in the traditional histogram distances. We propose a bin ratio-based histogram distance (BRD), which is an intra-cross-bin distance, in contrast with previous bin-to-bin distances and cross-bin distances. The BRD is robust to partial matching and histogram normalization, and captures correlations between bins with only a linear computational complexity. We combine the BRD with the ℓ1 histogram distance and the χ² histogram distance to generate the ℓ1 BRD and the χ² BRD, respectively. These combinations exploit and benefit from the robustness of the BRD under partial matching and the robustness of the ℓ1 and χ² distances to small noise. We propose a method for assessing the robustness of histogram distances to partial matching. The BRDs and logistic regression-based histogram fusion are applied to image classification. The experimental results on synthetic data sets show the robustness of the BRDs to partial matching, and the experiments on seven benchmark data sets demonstrate promising results of the BRDs for image classification.
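    For reference, the two base distances that the BRD is combined with have standard forms; a NumPy sketch (the bin-ratio terms themselves follow the paper's own definitions and are not reproduced here):

```python
import numpy as np

def l1_distance(h1, h2):
    """ℓ1 (Manhattan) distance between two histograms."""
    return float(np.abs(h1 - h2).sum())

def chi2_distance(h1, h2, eps=1e-12):
    """Symmetric chi-squared distance; down-weights differences in
    well-populated bins, making it robust to small per-bin noise."""
    return float((((h1 - h2) ** 2) / (h1 + h2 + eps)).sum() / 2.0)
```

    Both are bin-to-bin distances, which is precisely the limitation under partial matching that motivates combining them with the cross-bin ratio terms of the BRD.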

  14. An Experimental Comparison of Similarity Assessment Measures for 3D Models on Constrained Surface Deformation

    NASA Astrophysics Data System (ADS)

    Quan, Lulin; Yang, Zhixin

    2010-05-01

    To address issues in the area of design customization, this paper presents the specification and application of constrained surface deformation and reports an experimental performance comparison of three prevalent similarity assessment algorithms in the constrained surface deformation domain. Constrained surface deformation is a promising method that supports various downstream applications of customized design. Similarity assessment is regarded as the key technology for inspecting the success of a new design: it measures the difference between the deformed new design and the initial sample model and indicates whether that difference is within the allowed limit. According to our theoretical analysis and pre-experiments, three similarity assessment algorithms are suitable for this domain: the shape histogram based method, the skeleton based method, and the U-system moment based method. We analyze their basic functions and implementation methodologies in detail and conduct a series of experiments in various situations to test their accuracy and efficiency using precision-recall diagrams. A shoe model is chosen as the industrial example for the experiments. The experiments show that the shape histogram based method achieves the best performance of the three. Based on this result, we propose a novel approach that integrates surface constraints and the shape histogram description with an adaptive weighting method, which emphasizes the role of constraints during the assessment. Limited initial experimental results demonstrate that our algorithm outperforms the other three algorithms. A clear direction for future development is also drawn at the end of the paper.

  15. Accelerated weight histogram method for exploring free energy landscapes

    NASA Astrophysics Data System (ADS)

    Lindahl, V.; Lidmar, J.; Hess, B.

    2014-07-01

    Calculating free energies is an important and notoriously difficult task for molecular simulations. The rapid increase in computational power has made it possible to probe increasingly complex systems, yet extracting accurate free energies from these simulations remains a major challenge. Fully exploring the free energy landscape of, say, a biological macromolecule typically requires sampling large conformational changes and slow transitions. Often, the only feasible way to study such a system is to simulate it using an enhanced sampling method. The accelerated weight histogram (AWH) method is a new, efficient extended ensemble sampling technique which adaptively biases the simulation to promote exploration of the free energy landscape. The AWH method uses a probability weight histogram which allows for efficient free energy updates and results in an easy discretization procedure. A major advantage of the method is its general formulation, making it a powerful platform for developing further extensions and analyzing its relation to already existing methods. Here, we demonstrate its efficiency and general applicability by calculating the potential of mean force along a reaction coordinate for both a single dimension and multiple dimensions. We make use of a non-uniform, free energy dependent target distribution in reaction coordinate space so that computational efforts are not wasted on physically irrelevant regions. We present numerical results for molecular dynamics simulations of lithium acetate in solution and chignolin, a 10-residue long peptide that folds into a β-hairpin. We further present practical guidelines for setting up and running an AWH simulation.

  17. The value of a kurtosis metric in estimating the hazard to hearing of complex industrial noise exposures.

    PubMed

    Qiu, Wei; Hamernik, Roger P; Davis, Robert I

    2013-05-01

    A series of Gaussian and non-Gaussian equal energy noise exposures were designed with the objective of establishing the extent to which the kurtosis statistic could be used to grade the severity of noise trauma produced by the exposures. Here, 225 chinchillas distributed in 29 groups, with 6 to 8 animals per group, were exposed at 97 dB SPL. The equal energy exposures were presented either continuously for 5 d or on an interrupted schedule for 19 d. The non-Gaussian noises all differed in the level of the kurtosis statistic or in the temporal structure of the noise, where the latter was defined by different peak, interval, and duration histograms of the impact noise transients embedded in the noise signal. Noise-induced trauma was estimated from auditory evoked potential hearing thresholds and surface preparation histology that quantified sensory cell loss. Results indicated that the equal energy hypothesis is a valid unifying principle for estimating the consequences of an exposure if and only if the equivalent energy exposures had the same kurtosis. Furthermore, for the same level of kurtosis the detailed temporal structure of an exposure does not have a strong effect on trauma.
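    The kurtosis statistic itself is straightforward to compute; a NumPy sketch contrasting Gaussian noise (kurtosis ≈ 3) with the same noise carrying sparse high-level transients (the impulse amplitude and spacing are invented for the demo):

```python
import numpy as np

def kurtosis(x):
    """Sample kurtosis beta = E[(x - mu)^4] / sigma^4: about 3 for
    Gaussian noise, much larger when high-peak transients are embedded
    in the signal."""
    x = np.asarray(x, dtype=float)
    m = x.mean()
    return float(np.mean((x - m) ** 4) / x.var() ** 2)

rng = np.random.default_rng(0)
gaussian = rng.standard_normal(100_000)
impulsive = gaussian.copy()
impulsive[::1000] += 20.0        # sparse impact-noise transients
```

    The two signals can be scaled to equal energy, yet their kurtosis values differ by an order of magnitude, which is the property the exposure-grading argument above relies on.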

  18. Theory and Application of DNA Histogram Analysis.

    ERIC Educational Resources Information Center

    Bagwell, Charles Bruce

    The underlying principles and assumptions associated with DNA histograms are discussed along with the characteristics of fluorescent probes. Information theory is described and used to calculate the information content of a DNA histogram. Two major types of DNA histogram analyses are proposed: parametric and nonparametric analysis. Three levels…

  19. Adaptive nonlinear Volterra equalizer for mitigation of chirp-induced distortions in cost effective IMDD OFDM systems.

    PubMed

    André, Nuno Sequeira; Habel, Kai; Louchet, Hadrien; Richter, André

    2013-11-04

    We report experimental validations of an adaptive 2nd order Volterra equalization scheme for cost effective IMDD OFDM systems. This equalization scheme was applied to both uplink and downlink transmission. Downlink settings were optimized for maximum bitrate where we achieved 34 Gb/s over 10 km of SSMF using an EML with 10 GHz bandwidth. For the uplink, maximum reach was optimized achieving 14 Gb/s using a low-cost DML with 2.5 GHz bandwidth.

  20. Enabling the extended compact genetic algorithm for real-parameter optimization by using adaptive discretization.

    PubMed

    Chen, Ying-ping; Chen, Chao-Hong

    2010-01-01

    An adaptive discretization method, called split-on-demand (SoD), enables estimation of distribution algorithms (EDAs) for discrete variables to solve continuous optimization problems. SoD randomly splits a continuous interval if the number of search points within the interval exceeds a threshold, which is decreased at every iteration. After the split operation, the nonempty intervals are assigned integer codes, and the search points are discretized accordingly. As an example of using SoD with EDAs, the integration of SoD and the extended compact genetic algorithm (ECGA) is presented and numerically examined. In this integration, we adopt a local search mechanism as an optional component of our back-end optimization engine. As a result, the proposed framework can be considered a memetic algorithm, and SoD can potentially be applied to other memetic algorithms. The numerical experiments consist of two parts: (1) a set of benchmark functions, on which ECGA with SoD is compared against ECGA with two well-known discretization methods, the fixed-height histogram (FHH) and the fixed-width histogram (FWH); (2) a real-world application, the economic dispatch problem, on which ECGA with SoD is compared to other methods. The experimental results indicate that SoD is the better discretization method to work with ECGA. Moreover, ECGA with SoD works quite well on the economic dispatch problem and delivers solutions better than the best known results reported by existing methods.
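    The split operation can be sketched in a few lines of Python; this simplified recursion omits the iteration-dependent threshold decrease and the integer coding of nonempty intervals described above:

```python
import random

def split_on_demand(points, lo, hi, threshold):
    """Randomly split [lo, hi) whenever it contains more than `threshold`
    search points; returns the sorted list of interval edges. The width
    guard stops pathological recursion on degenerate random cuts."""
    inside = [p for p in points if lo <= p < hi]
    if len(inside) <= threshold or hi - lo < 1e-9:
        return [lo, hi]
    cut = random.uniform(lo, hi)
    left = split_on_demand(inside, lo, cut, threshold)
    right = split_on_demand(inside, cut, hi, threshold)
    return left[:-1] + right          # the cut point is shared once
```

    Densely populated regions end up covered by many narrow intervals and sparse regions by few wide ones, which is what lets a discrete EDA model a continuous search space adaptively.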

  1. Adaptive frequency-domain equalization for the transmission of the fundamental mode in a few-mode fiber.

    PubMed

    Bai, Neng; Xia, Cen; Li, Guifang

    2012-10-08

    We propose and experimentally demonstrate single-carrier adaptive frequency-domain equalization (SC-FDE) to mitigate multipath interference (MPI) for the transmission of the fundamental mode in a few-mode fiber. The FDE approach reduces computational complexity significantly compared to the time-domain equalization (TDE) approach while maintaining the same performance. Both FDE and TDE methods are evaluated by simulating long-haul fundamental-mode transmission using a few-mode fiber. For fundamental-mode operation, the required tap length of the equalizer depends on the differential mode group delay (DMGD) of a single span rather than the DMGD of the entire link.
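    Single-carrier FDE reduces equalization to one complex multiplication per frequency bin; a toy NumPy sketch under the usual cyclic-prefix assumption, using MMSE coefficients W = H*/(|H|² + 1/SNR) (the channel and all parameters are invented for the demo):

```python
import numpy as np

def fde_mmse(rx, channel_taps, snr):
    """One-tap-per-bin frequency-domain equalizer. The cyclic prefix makes
    the channel circulant over the block, so it diagonalizes in the DFT
    basis and equalization is a per-bin complex multiply."""
    n = len(rx)
    H = np.fft.fft(channel_taps, n)
    W = np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)   # MMSE coefficients
    return np.fft.ifft(np.fft.fft(rx) * W)

# Toy check: a BPSK block through a 2-tap channel, cyclic convolution
rng = np.random.default_rng(2)
sym = rng.choice([-1.0, 1.0], size=256)
h = np.array([1.0, 0.4])
rx = np.fft.ifft(np.fft.fft(sym) * np.fft.fft(h, 256))
eq = fde_mmse(rx, h, snr=1e6)
```

    The per-bin multiply is why FDE scales with the FFT cost rather than with the tap count, which is the complexity advantage the abstract cites over TDE.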

  2. Histogram deconvolution - An aid to automated classifiers

    NASA Technical Reports Server (NTRS)

    Lorre, J. J.

    1983-01-01

    It is shown that N-dimensional histograms are convolved by the addition of noise in the picture domain. Three methods are described which provide the ability to deconvolve such noise-affected histograms. The purpose of the deconvolution is to provide automated classifiers with a higher quality N-dimensional histogram from which to obtain classification statistics.

  3. Parameterization of the Age-Dependent Whole Brain Apparent Diffusion Coefficient Histogram

    PubMed Central

    Batra, Marion; Nägele, Thomas

    2015-01-01

    Purpose. The distribution of apparent diffusion coefficient (ADC) values in the brain can be used to characterize age effects and pathological changes of the brain tissue. The aim of this study was the parameterization of the whole brain ADC histogram by an advanced model that takes the influence of age into account. Methods. Whole brain ADC histograms were calculated for all data and for seven age groups between 10 and 80 years. Modeling of the histograms was performed for two parts of the histogram separately: the brain tissue part was modeled by two Gaussian curves, while the remaining part was fitted by the sum of a Gaussian curve, a biexponential decay, and a straight line. Results. A consistent fit of the histograms of all age groups was possible with the proposed model. Conclusions. This study confirms the strong dependence of the whole brain ADC histogram on the age of the examined subjects. The proposed model can be used to characterize changes of the whole brain ADC histogram in certain diseases under consideration of age effects. PMID:26609526

  4. Introducing parallelism to histogramming functions for GEM systems

    NASA Astrophysics Data System (ADS)

    Krawczyk, Rafał D.; Czarski, Tomasz; Kolasinski, Piotr; Pozniak, Krzysztof T.; Linczuk, Maciej; Byszuk, Adrian; Chernyshova, Maryna; Juszczyk, Bartlomiej; Kasprowicz, Grzegorz; Wojenski, Andrzej; Zabolotny, Wojciech

    2015-09-01

    This article is an assessment of the potential parallelization of histogramming algorithms in a GEM detector system. Histogramming and preprocessing algorithms in MATLAB were analyzed with regard to adding parallelism. A preliminary implementation of parallel strip histogramming resulted in a speedup. An analysis of the algorithms' parallelizability is presented, and an overview of potential hardware and software support for implementing the parallel algorithms is discussed.

  5. Comparison of Histograms for Use in Cloud Observation and Modeling

    NASA Technical Reports Server (NTRS)

    Green, Lisa; Xu, Kuan-Man

    2005-01-01

    Cloud observation and cloud modeling data can be presented in histograms for each characteristic to be measured. Combining information from single-cloud histograms yields a summary histogram. Summary histograms can be compared to each other to reach conclusions about the behavior of an ensemble of clouds in different places at different times or about the accuracy of a particular cloud model. As in any scientific comparison, it is necessary to decide whether any apparent differences are statistically significant. The usual methods of deciding statistical significance when comparing histograms do not apply in this case because they assume independent data. Thus, a new method is necessary. The proposed method uses the Euclidean distance metric and bootstrapping to calculate the significance level.
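    The Euclidean-distance-plus-bootstrap idea can be sketched as follows; this is a crude pooled-resampling version for illustration, whereas the paper's procedure for summary histograms built from dependent single-cloud data differs in detail:

```python
import numpy as np

def hist_distance(a, b):
    """Euclidean distance between two normalized histograms."""
    return float(np.sqrt(((a - b) ** 2).sum()))

def bootstrap_pvalue(samples_a, samples_b, bins, n_boot=500, seed=0):
    """Pool the two samples, repeatedly draw pseudo-sample pairs from the
    pool, and count how often their histogram distance reaches the
    observed one; a small fraction means a significant difference."""
    rng = np.random.default_rng(seed)
    obs = hist_distance(np.histogram(samples_a, bins)[0] / len(samples_a),
                        np.histogram(samples_b, bins)[0] / len(samples_b))
    pooled = np.concatenate([samples_a, samples_b])
    count = 0
    for _ in range(n_boot):
        x = rng.choice(pooled, size=len(samples_a))
        y = rng.choice(pooled, size=len(samples_b))
        d = hist_distance(np.histogram(x, bins)[0] / len(x),
                          np.histogram(y, bins)[0] / len(y))
        count += d >= obs
    return count / n_boot
```

    Because the null distribution is generated by resampling rather than by a closed-form test, no independence assumption about the underlying histogram bins is needed, which is the point made above.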

  6. MIMO equalization with adaptive step size for few-mode fiber transmission systems.

    PubMed

    van Uden, Roy G H; Okonkwo, Chigo M; Sleiffer, Vincent A J M; de Waardt, Hugo; Koonen, Antonius M J

    2014-01-13

    Optical multiple-input multiple-output (MIMO) transmission systems generally employ minimum mean squared error time or frequency domain equalizers. Using an experimental 3-mode dual polarization coherent transmission setup, we show that the convergence time of the MMSE time domain equalizer (TDE) and frequency domain equalizer (FDE) can be reduced by approximately 50% and 30%, respectively. The criterion used to estimate the system convergence time is the time it takes for the MIMO equalizer to reach an average output error which is within a margin of 5% of the average output error after 50,000 symbols. The convergence reduction difference between the TDE and FDE is attributed to the limited maximum step size for stable convergence of the frequency domain equalizer. The adaptive step size requires a small overhead in the form of a lookup table. It is highlighted that the convergence time reduction is achieved without sacrificing optical signal-to-noise ratio performance.

  7. An approach to analyze the breast tissues in infrared images using nonlinear adaptive level sets and Riesz transform features.

    PubMed

    Prabha, S; Suganthi, S S; Sujatha, C M

    2015-01-01

    Breast thermography is a potential imaging method for the early detection of breast cancer. Pathological conditions can be determined by measuring temperature variations in the abnormal breast regions. Accurate delineation of breast tissues is reported to be a challenging task due to inherent limitations of infrared images such as low contrast, low signal-to-noise ratio and the absence of clear edges. A segmentation technique is attempted to delineate the breast tissues by detecting proper lower breast boundaries and inframammary folds. Characteristic features are extracted to analyze the asymmetrical thermal variations in normal and abnormal breast tissues. An automated analysis of thermal variations of breast tissues is attempted using nonlinear adaptive level sets and the Riesz transform. Breast thermal images are initially subjected to Stein's unbiased risk estimate based orthonormal wavelet denoising. These denoised images are enhanced using the contrast-limited adaptive histogram equalization method. The breast tissues are then segmented using the nonlinear adaptive level set method. The phase map of the enhanced image is integrated into the level set framework for final boundary estimation. The segmented results are validated against the corresponding ground truth images using overlap and regional similarity metrics. The segmented images are further processed with the Riesz transform, and structural texture features are derived from the transformed coefficients to analyze pathological conditions of breast tissues. Results show that the estimated average signal-to-noise ratio of denoised images and the average sharpness of enhanced images are improved by 38% and 6%, respectively. The interscale consideration adopted in the denoising algorithm is able to improve the signal-to-noise ratio by preserving edges. The proposed segmentation framework could delineate the breast tissues with a high degree of correlation (97%) between the segmented and ground truth areas. Also, the average segmentation accuracy and sensitivity are both found to be 98%. Similarly, the maximum regional overlap between segmented and ground truth images, obtained using the volume similarity measure, is observed to be 99%. Directionality as a feature showed a considerable difference of 11% between normal and abnormal tissues. The proposed framework for breast thermal image analysis, aided with the necessary preprocessing, is found to be useful in assisting the early diagnosis of breast abnormalities.

  8. Knowledge-based low-level image analysis for computer vision systems

    NASA Technical Reports Server (NTRS)

    Dhawan, Atam P.; Baxi, Himanshu; Ranganath, M. V.

    1988-01-01

    Two algorithms for entry-level image analysis and preliminary segmentation are proposed which are flexible enough to incorporate local properties of the image. The first algorithm involves pyramid-based multiresolution processing and a strategy to define and use interlevel and intralevel link strengths. The second algorithm, which is designed for selected window processing, extracts regions adaptively using local histograms. The preliminary segmentation and a set of features are employed as the input to an efficient rule-based low-level analysis system, resulting in suboptimal meaningful segmentation.

  9. Free energy profiles from single-molecule pulling experiments.

    PubMed

    Hummer, Gerhard; Szabo, Attila

    2010-12-14

    Nonequilibrium pulling experiments provide detailed information about the thermodynamic and kinetic properties of molecules. We show that unperturbed free energy profiles as a function of molecular extension can be obtained rigorously from such experiments without using work-weighted position histograms. An inverse Weierstrass transform is used to relate the system free energy obtained from the Jarzynski equality directly to the underlying molecular free energy surface. An accurate approximation for the free energy surface is obtained by using the method of steepest descent to evaluate the inverse transform. The formalism is applied to simulated data obtained from a kinetic model of RNA folding, in which the dynamics consists of jumping between linker-dominated folded and unfolded free energy surfaces.
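
For orientation, the central relations can be stated compactly (in our notation, which may differ from the paper's): the Jarzynski equality gives the system free energy along the pulling coordinate from the nonequilibrium work distribution, and, for a harmonic pulling spring of stiffness k, that system free energy is a Gaussian (Weierstrass) smoothing of the molecular profile, which the paper inverts:

```latex
% Jarzynski equality: system free energy from nonequilibrium work
e^{-\beta \Delta G(\lambda)} \;=\; \left\langle e^{-\beta W(\lambda)} \right\rangle

% For a harmonic spring of stiffness k centered at \lambda, G(\lambda) is a
% Weierstrass (Gaussian) transform of the molecular free energy surface
% G_0(x); the inverse transform recovers G_0(x) from G(\lambda).
e^{-\beta G(\lambda)} \;\propto\; \int \mathrm{d}x \; e^{-\beta G_0(x)}\, e^{-\beta k (x - \lambda)^2 / 2}
```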

  10. Supervised retinal vessel segmentation from color fundus images based on matched filtering and AdaBoost classifier.

    PubMed

    Memari, Nogol; Ramli, Abd Rahman; Bin Saripan, M Iqbal; Mashohor, Syamsiah; Moghbel, Mehrdad

    2017-01-01

    The structure and appearance of the blood vessel network in retinal fundus images are an essential part of diagnosing various problems associated with the eyes, such as diabetes and hypertension. In this paper, an automatic retinal vessel segmentation method utilizing matched filter techniques coupled with an AdaBoost classifier is proposed. The fundus image is enhanced using morphological operations, the contrast is increased using the contrast-limited adaptive histogram equalization (CLAHE) method, and the inhomogeneity is corrected using a Retinex approach. Then, the blood vessels are enhanced using a combination of B-COSFIRE and Frangi matched filters. From this preprocessed image, different statistical features are computed on a pixel-wise basis and used in an AdaBoost classifier to extract the blood vessel network inside the image. Finally, the segmented images are postprocessed to remove misclassified pixels and regions. The proposed method was validated using the publicly accessible Digital Retinal Images for Vessel Extraction (DRIVE), Structured Analysis of the Retina (STARE) and Child Heart and Health Study in England (CHASE_DB1) datasets commonly used for determining the accuracy of retinal vessel segmentation methods. The accuracy of the proposed segmentation method was comparable to other state-of-the-art methods while being very close to the manual segmentation provided by the second human observer, with an average accuracy of 0.972, 0.951 and 0.948 in the DRIVE, STARE and CHASE_DB1 datasets, respectively.
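
CLAHE recurs throughout these records as the contrast-enhancement step. A minimal numpy-only sketch of its core idea follows: tile-local histogram equalization with a clipped histogram whose excess mass is redistributed uniformly. Real CLAHE (e.g. OpenCV's cv2.createCLAHE) additionally interpolates bilinearly between neighboring tile mappings to avoid block seams; the function name and defaults here are illustrative.

```python
import numpy as np

def clahe_lite(img, tiles=8, clip=0.01, nbins=256):
    """Tile-local histogram equalization with a clipped histogram.

    Expects a 2D float image with values in [0, 1]. `clip` is the maximum
    fraction of a tile's pixels allowed in one bin; the clipped excess is
    redistributed uniformly across all bins. Unlike full CLAHE, no
    interpolation between tile mappings is performed.
    """
    img = np.asarray(img, dtype=float)
    out = np.empty_like(img)
    h, w = img.shape
    ys = np.linspace(0, h, tiles + 1, dtype=int)
    xs = np.linspace(0, w, tiles + 1, dtype=int)
    for i in range(tiles):
        for j in range(tiles):
            tile = img[ys[i]:ys[i + 1], xs[j]:xs[j + 1]]
            hist, _ = np.histogram(tile, bins=nbins, range=(0.0, 1.0))
            hist = hist / tile.size
            excess = np.clip(hist - clip, 0.0, None).sum()   # clip histogram,
            hist = np.minimum(hist, clip) + excess / nbins   # redistribute excess
            cdf = np.cumsum(hist)
            cdf /= cdf[-1]
            idx = np.minimum((tile * nbins).astype(int), nbins - 1)
            out[ys[i]:ys[i + 1], xs[j]:xs[j + 1]] = cdf[idx]
    return out
```

Applied to a low-contrast image, the per-tile mappings stretch local intensity ranges while the clip limit bounds noise amplification.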

  11. An improved protocol for optical projection tomography imaging reveals lobular heterogeneities in pancreatic islet and β-cell mass distribution

    PubMed Central

    2011-01-01

    Optical projection tomography (OPT) imaging is a powerful tool for three-dimensional imaging of gene and protein distribution patterns in biomedical specimens. We have previously demonstrated the possibility, by this technique, to extract information of the spatial and quantitative distribution of the islets of Langerhans in the intact mouse pancreas. In order to further increase the sensitivity of OPT imaging for this type of assessment, we have developed a protocol implementing a computational statistical approach: contrast limited adaptive histogram equalization (CLAHE). We demonstrate that this protocol significantly increases the sensitivity of OPT imaging for islet detection, helps preserve islet morphology and diminish subjectivity in thresholding for tomographic reconstruction. When applied to studies of the pancreas from healthy C57BL/6 mice, our data reveal that, at least in this strain, the pancreas harbors substantially more islets than has previously been reported. Further, we provide evidence that the gastric, duodenal and splenic lobes of the pancreas display dramatic differences in total and relative islet and β-cell mass distribution. This includes a 75% higher islet density in the gastric lobe as compared to the splenic lobe and a higher relative volume of insulin producing cells in the duodenal lobe as compared to the other lobes. Altogether, our data show that CLAHE substantially improves OPT based assessments of the islets of Langerhans and that lobular origin must be taken into careful consideration in quantitative and spatial assessments of the pancreas. PMID:21633198

  12. New clinical grading scales and objective measurement for conjunctival injection.

    PubMed

    Park, In Ki; Chun, Yeoun Sook; Kim, Kwang Gi; Yang, Hee Kyung; Hwang, Jeong-Min

    2013-08-05

    To establish a new clinical grading scale and objective measurement method to evaluate conjunctival injection. Photographs of conjunctival injection with variable ocular diseases in 429 eyes were reviewed. Seventy-three images with concordance by three ophthalmologists were classified into 4-step and 10-step subjective grading scales, and used as standard photographs. Each image was quantified in four ways: the relative magnitude of the redness component of each red-green-blue (RGB) pixel; two different algorithms based on the area occupied by blood vessels (K-means clustering with the LAB color model and the contrast-limited adaptive histogram equalization [CLAHE] algorithm); and the presence of blood vessel edges, based on the Canny edge-detection algorithm. Areas under the receiver operating characteristic curve (AUCs) were calculated to summarize the diagnostic accuracies of the four algorithms. The RGB color model, K-means clustering with the LAB color model, and the CLAHE algorithm showed good correlation with the clinical 10-step grading scale (R = 0.741, 0.784, 0.919, respectively) and with the clinical 4-step grading scale (R = 0.645, 0.702, 0.838, respectively). The CLAHE method showed the largest AUC, best distinction power (P < 0.001, ANOVA, Bonferroni multiple comparison test), and high reproducibility (R = 0.996). The CLAHE algorithm showed the best correlation with the 10-step and 4-step subjective clinical grading scales, together with high distinction power and reproducibility, and can be a useful method for the assessment of conjunctival injection.
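
The first of the four quantifications, the relative magnitude of the redness component, can be sketched directly. The paper does not publish its exact formula; the ratio R/(R+G+B) used below is a common way to normalize the red channel against overall brightness, and the function name is ours.

```python
import numpy as np

def relative_redness(rgb):
    """Mean relative redness R/(R+G+B) over an RGB image region.

    An illustrative formula, not necessarily the paper's exact one:
    each pixel's red channel is normalized by its total intensity,
    and the result is averaged over the region.
    """
    rgb = np.asarray(rgb, dtype=float)
    total = rgb.sum(axis=-1)
    r = np.divide(rgb[..., 0], total, out=np.zeros_like(total), where=total > 0)
    return float(r.mean())
```

A pure-red region scores 1, a neutral gray region 1/3.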

  13. Color Retinal Image Enhancement Based on Luminosity and Contrast Adjustment.

    PubMed

    Zhou, Mei; Jin, Kai; Wang, Shaoze; Ye, Juan; Qian, Dahong

    2018-03-01

    Many common eye diseases and cardiovascular diseases can be diagnosed through retinal imaging. However, due to uneven illumination, image blurring, and low contrast, retinal images with poor quality are not useful for diagnosis, especially in automated image analyzing systems. Here, we propose a new image enhancement method to improve color retinal image luminosity and contrast. A luminance gain matrix, which is obtained by gamma correction of the value channel in the HSV (hue, saturation, and value) color space, is used to enhance the R, G, and B (red, green and blue) channels, respectively. Contrast is then enhanced in the luminosity channel of L * a * b * color space by CLAHE (contrast-limited adaptive histogram equalization). Image enhancement by the proposed method is compared to other methods by evaluating quality scores of the enhanced images. The performance of the method is mainly validated on a dataset of 961 poor-quality retinal images. Quality assessment (range 0-1) of image enhancement of this poor dataset indicated that our method improved color retinal image quality from an average of 0.0404 (standard deviation 0.0291) up to an average of 0.4565 (standard deviation 0.1000). The proposed method is shown to achieve superior image enhancement compared to contrast enhancement in other color spaces or by other related methods, while simultaneously preserving image naturalness. This method of color retinal image enhancement may be employed to assist ophthalmologists in more efficient screening of retinal diseases and in development of improved automated image analysis for clinical diagnosis.
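
The luminance-gain step described above can be sketched with numpy alone. This covers only the first stage of the pipeline (the subsequent CLAHE on the L channel in L*a*b* space is omitted); the gamma value is an illustrative choice, not the paper's.

```python
import numpy as np

def luminosity_gain(rgb, gamma=0.7, eps=1e-6):
    """Brighten a float RGB image (values in [0, 1]) with a per-pixel gain.

    The HSV value channel is V = max(R, G, B); gamma-correcting V and
    dividing by the original V yields a gain matrix that is applied to
    all three channels, preserving channel ratios (hue). gamma=0.7 is
    an illustrative choice.
    """
    rgb = np.asarray(rgb, dtype=float)
    v = rgb.max(axis=-1)
    gain = np.where(v > eps, v ** gamma / np.maximum(v, eps), 1.0)
    return np.clip(rgb * gain[..., None], 0.0, 1.0)
```

With gamma < 1 the gain exceeds 1 for dark pixels, lifting underexposed regions while leaving the value channel equal to V**gamma.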

  14. The Application of the Montage Image Mosaic Engine To The Visualization Of Astronomical Images

    NASA Astrophysics Data System (ADS)

    Berriman, G. Bruce; Good, J. C.

    2017-05-01

    The Montage Image Mosaic Engine was designed as a scalable toolkit, written in C for performance and portability across *nix platforms, that assembles FITS images into mosaics. This code is freely available and has been widely used in the astronomy and IT communities for research, product generation, and for developing next-generation cyber-infrastructure. Recently, it has begun finding applicability in the field of visualization. This development has come about because the toolkit design allows easy integration into scalable systems that process data for subsequent visualization in a browser or client. The toolkit includes a visualization tool suitable for automation and for integration into Python: mViewer creates, with a single command, complex multi-color images overlaid with coordinate displays, labels, and observation footprints, and includes an adaptive image histogram equalization method that preserves the structure of a stretched image over its dynamic range. The Montage toolkit contains functionality originally developed to support the creation and management of mosaics, but which also offers value to visualization: a background rectification algorithm that reveals the faint structure in an image; and tools for creating cutout and downsampled versions of large images. Version 5 of Montage offers support for visualizing data written in the HEALPix sky-tessellation scheme, and functionality for processing and organizing images to comply with the TOAST sky-tessellation scheme required for consumption by the World Wide Telescope (WWT). Four online tutorials allow readers to reproduce and extend all the visualizations presented in this paper.

  15. Behavioral assessment of adaptive feedback equalization in a digital hearing aid.

    PubMed

    French-St George, M; Wood, D J; Engebretson, A M

    1993-01-01

    An evaluation was made of the efficacy of a digital feedback equalization algorithm employed by the Central Institute for the Deaf Wearable Adaptive Digital Hearing Aid. Three questions were addressed: 1) Does acoustic feedback limit gain adjustments made by hearing aid users? 2) Does feedback equalization permit users with hearing-impairment to select more gain without feedback? and, 3) If more gain is used when feedback equalization is active, does word identification performance improve? Nine subjects with hearing impairment participated in the study. Results suggest that listeners with hearing impairment are indeed limited by acoustic feedback when listening to soft speech (55 dB A) in quiet. The average listener used an additional 4 dB gain when feedback equalization was active. This additional gain resulted in an average 10 rationalized arcsine units (RAU) improvement in word identification score.

  16. Predicting the Valence of a Scene from Observers’ Eye Movements

    PubMed Central

    R.-Tavakoli, Hamed; Atyabi, Adham; Rantanen, Antti; Laukka, Seppo J.; Nefti-Meziani, Samia; Heikkilä, Janne

    2015-01-01

    Multimedia analysis benefits from understanding the emotional content of a scene in a variety of tasks such as video genre classification and content-based image retrieval. Recently, there has been an increasing interest in applying human bio-signals, particularly eye movements, to recognize the emotional gist of a scene such as its valence. In order to determine the emotional category of images using eye movements, the existing methods often learn a classifier using several features that are extracted from eye movements. Although it has been shown that eye movement is potentially useful for recognition of scene valence, the contribution of each feature is not well-studied. To address the issue, we study the contribution of features extracted from eye movements in the classification of images into pleasant, neutral, and unpleasant categories. We assess ten features and their fusion. The features are histogram of saccade orientation, histogram of saccade slope, histogram of saccade length, histogram of saccade duration, histogram of saccade velocity, histogram of fixation duration, fixation histogram, top-ten salient coordinates, and saliency map. We utilize machine learning approach to analyze the performance of features by learning a support vector machine and exploiting various feature fusion schemes. The experiments reveal that ‘saliency map’, ‘fixation histogram’, ‘histogram of fixation duration’, and ‘histogram of saccade slope’ are the most contributing features. The selected features signify the influence of fixation information and angular behavior of eye movements in the recognition of the valence of images. PMID:26407322

  17. Histogram analysis of T2*-based pharmacokinetic imaging in cerebral glioma grading.

    PubMed

    Liu, Hua-Shan; Chiang, Shih-Wei; Chung, Hsiao-Wen; Tsai, Ping-Huei; Hsu, Fei-Ting; Cho, Nai-Yu; Wang, Chao-Ying; Chou, Ming-Chung; Chen, Cheng-Yu

    2018-03-01

    To investigate the feasibility of histogram analysis of the T2*-based permeability parameter volume transfer constant (Ktrans) for glioma grading and to explore the diagnostic performance of the histogram analysis of Ktrans and blood plasma volume (vp). We recruited 31 and 11 patients with high- and low-grade gliomas, respectively. The histogram parameters of Ktrans and vp, derived from the first-pass pharmacokinetic modeling based on the T2* dynamic susceptibility-weighted contrast-enhanced perfusion-weighted magnetic resonance imaging (T2* DSC-PW-MRI) from the entire tumor volume, were evaluated for differentiating glioma grades. Histogram parameters of Ktrans and vp showed significant differences between high- and low-grade gliomas and exhibited significant correlations with tumor grades. The mean Ktrans derived from the T2* DSC-PW-MRI had the highest sensitivity and specificity for differentiating high-grade gliomas from low-grade gliomas compared with other histogram parameters of Ktrans and vp. Histogram analysis of T2*-based pharmacokinetic imaging is useful for cerebral glioma grading. The histogram parameters of the entire tumor Ktrans measurement can provide increased accuracy with additional information regarding microvascular permeability changes for identifying high-grade brain tumors. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Infrared image segmentation method based on spatial coherence histogram and maximum entropy

    NASA Astrophysics Data System (ADS)

    Liu, Songtao; Shen, Tongsheng; Dai, Yao

    2014-11-01

    In order to segment the target well and suppress background noise effectively, an infrared image segmentation method based on the spatial coherence histogram and maximum entropy is proposed. First, the spatial coherence histogram is constructed by weighting the importance of the different positions of pixels sharing the same gray level, obtained by computing their local density. Then, after enhancing the image with the spatial coherence histogram, the 1D maximum entropy method is used to segment it. The novel method not only achieves better segmentation results but also has a faster computation time than traditional 2D histogram-based segmentation methods.
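
The 1D maximum entropy step is the classical Kapur-style criterion: choose the threshold that maximizes the summed entropies of the two class distributions. A minimal sketch (function name ours; the paper applies this to its spatially enhanced histogram):

```python
import numpy as np

def max_entropy_threshold(hist):
    """Kapur-style 1D maximum-entropy threshold selection.

    Splits a gray-level histogram at the level t that maximizes the sum
    of the entropies of the two resulting class distributions.
    """
    p = np.asarray(hist, dtype=float)
    p = p / p.sum()
    cum = np.cumsum(p)
    best_t, best_h = 0, -np.inf
    for t in range(1, len(p)):
        p0, p1 = cum[t - 1], 1.0 - cum[t - 1]
        if p0 <= 0 or p1 <= 0:
            continue
        a = p[:t] / p0                      # class-0 distribution
        b = p[t:] / p1                      # class-1 distribution
        h = (-(a[a > 0] * np.log(a[a > 0])).sum()
             - (b[b > 0] * np.log(b[b > 0])).sum())
        if h > best_h:
            best_t, best_h = t, h
    return best_t
```

On a bimodal histogram the selected threshold lands in the gap between the two modes.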

  19. Mutual information estimation for irregularly sampled time series

    NASA Astrophysics Data System (ADS)

    Rehfeld, K.; Marwan, N.; Heitzig, J.; Kurths, J.

    2012-04-01

    For the automated, objective and joint analysis of time series, similarity measures are crucial. Used in the analysis of climate records, they allow for a complementary, unbiased view of sparse datasets. The irregular sampling of many of these time series, however, makes it necessary to either perform signal reconstruction (e.g. interpolation) or to develop and use adapted measures. Standard linear interpolation comes with an inevitable loss of information and bias effects. We have recently developed a Gaussian kernel-based correlation algorithm with which the interpolation error can be substantially lowered, but this would not work should the functional relationship in a bivariate setting be non-linear. We therefore propose an algorithm to estimate lagged auto and cross mutual information from irregularly sampled time series. We have extended the standard and adaptive binning histogram estimators and use Gaussian distributed weights in the estimation of the (joint) probabilities. To test our method we have simulated linear and nonlinear auto-regressive processes with Gamma-distributed inter-sampling intervals. We have then performed a sensitivity analysis for the estimation of actual coupling length, the lag of coupling and the decorrelation time in the synthetic time series and contrast our results to the performance of a signal reconstruction scheme. Finally we applied our estimator to speleothem records. We compare the estimated memory (or decorrelation time) to that from a least-squares estimator based on fitting an auto-regressive process of order 1. The calculated (cross) mutual information results are compared for the different estimators (standard or adaptive binning) and contrasted with results from signal reconstruction. We find that the kernel-based estimator has a significantly lower root mean square error and less systematic sampling bias than the interpolation-based method.
It is possible that these encouraging results could be further improved by using non-histogram mutual information estimators, like k-Nearest Neighbor or Kernel-Density estimators, but for short (<1000 points) and irregularly sampled datasets the proposed algorithm is already a great improvement.
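
The baseline these authors extend is the plain binned (histogram) mutual information estimator for regularly sampled pairs; their contribution, Gaussian-distributed weights for irregular sampling, is omitted in this sketch.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Plain binned (histogram) mutual information estimate, in nats.

    MI = sum_ij p(i,j) * log( p(i,j) / (p(i) * p(j)) ), with all
    probabilities estimated from a joint 2D histogram. Note the
    well-known positive bias for finite samples.
    """
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)     # marginal of x
    py = pxy.sum(axis=0, keepdims=True)     # marginal of y
    nz = pxy > 0                            # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())
```

A variable has high mutual information with itself and near-zero mutual information with independent noise.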

  20. Data-aided adaptive weighted channel equalizer for coherent optical OFDM.

    PubMed

    Mousa-Pasandi, Mohammad E; Plant, David V

    2010-02-15

    We report an adaptive weighted channel equalizer (AWCE) for orthogonal frequency division multiplexing (OFDM) and study its performance for long-haul coherent optical OFDM (CO-OFDM) transmission systems. This equalizer updates the equalization parameters on a symbol-by-symbol basis and can thus track slight drifts of the optical channel. This makes it suitable for combating polarization mode dispersion (PMD) degradation while increasing the periodicity of pilot symbols, which can be translated into a significant overhead reduction. Furthermore, the AWCE can increase the precision of RF-pilot enabled phase noise estimation in the presence of noise, using data-aided phase noise estimation. Simulation results corroborate the capability of the AWCE in both reducing overhead and improving the quality of the phase noise compensation (PNC).

  1. The Amazing Histogram.

    ERIC Educational Resources Information Center

    Vandermeulen, H.; DeWreede, R. E.

    1983-01-01

    Presents a histogram drawing program which sorts real numbers in up to 30 categories. Entered data are sorted and saved in a text file which is then used to generate the histogram. Complete Applesoft program listings are included. (JN)

  2. Image statistics and the perception of surface gloss and lightness.

    PubMed

    Kim, Juno; Anderson, Barton L

    2010-07-01

    Despite previous data demonstrating the critical importance of 3D surface geometry in the perception of gloss and lightness, I. Motoyoshi, S. Nishida, L. Sharan, and E. H. Adelson (2007) recently proposed that a simple image statistic--histogram or sub-band skew--is computed by the visual system to infer the gloss and albedo of surfaces. One key source of evidence used to support this claim was an experiment in which adaptation to skewed image statistics resulted in opponent aftereffects in observers' judgments of gloss and lightness. We report a series of adaptation experiments that were designed to assess the cause of these aftereffects. We replicated their original aftereffects in gloss but found no consistent aftereffect in lightness. We report that adaptation to zero-skew adaptors produced similar aftereffects as positively skewed adaptors, and that negatively skewed adaptors induced no reliable aftereffects. We further find that the adaptation effect observed with positively skewed adaptors is not robust to changes in mean luminance that diminish the intensity of the luminance extrema. Finally, we show that adaptation to positive skew reduces (rather than increases) the apparent lightness of light pigmentation on non-uniform albedo surfaces. These results challenge the view that the adaptation results reported by Motoyoshi et al. (2007) provide evidence that skew is explicitly computed by the visual system.

  3. Bin recycling strategy for improving the histogram precision on GPU

    NASA Astrophysics Data System (ADS)

    Cárdenas-Montes, Miguel; Rodríguez-Vázquez, Juan José; Vega-Rodríguez, Miguel A.

    2016-07-01

    Histograms are an easily comprehensible way to present data and analyses. In the current scientific context, with access to large volumes of data, the processing time for building histograms has dramatically increased. For this reason, parallel construction is necessary to alleviate the impact of the processing time on analysis activities. In this scenario, GPU computing is becoming widely used to reduce the processing time of histogram construction to affordable levels. Alongside the increase in processing time, implementations are stressed on bin-count accuracy. Accuracy issues arising from the particularities of the implementations are not usually taken into consideration when building histograms with very large data sets. In this work, a bin recycling strategy to create an accuracy-aware implementation for building histograms on GPU is presented. In order to evaluate the approach, this strategy was applied to the computation of the three-point angular correlation function, which is a relevant function in Cosmology for the study of the Large Scale Structure of the Universe. As a consequence of the study, a high-accuracy implementation for histogram construction on GPU is proposed.
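
The bin recycling strategy itself is specific to the paper, but the accuracy hazard it addresses is easy to reproduce even without a GPU: a single-precision bin counter, common in GPU histogramming, silently stops counting once it reaches 2**24, because larger integers are not exactly representable in float32.

```python
import numpy as np

# float32 has a 24-bit significand: integers are exact only up to 2**24.
# Past that point, incrementing a float32 bin counter by 1 does nothing.
count = np.float32(2 ** 24)
assert count + np.float32(1.0) == count        # the increment is lost

# A 64-bit integer (or float64) accumulator keeps counting correctly.
exact = np.int64(2 ** 24)
assert exact + 1 == 2 ** 24 + 1
```

This is why accuracy-aware histogram implementations widen or recycle bins before counts approach the representable limit.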

  4. AHIMSA - Ad hoc histogram information measure sensing algorithm for feature selection in the context of histogram inspired clustering techniques

    NASA Technical Reports Server (NTRS)

    Dasarathy, B. V.

    1976-01-01

    An algorithm is proposed for dimensionality reduction in the context of clustering techniques based on histogram analysis. The approach is based on an evaluation of the hills and valleys in the unidimensional histograms along the different features and provides an economical means of assessing the significance of the features in a nonparametric unsupervised data environment. The method has relevance to remote sensing applications.
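
The hill-and-valley evaluation can be illustrated with a crude stand-in (the function below is ours, not AHIMSA itself): a feature whose unidimensional histogram shows several distinct hills is a better candidate for histogram-inspired clustering than a unimodal one. A real implementation would first smooth the histogram and discount insignificant bumps.

```python
import numpy as np

def count_hills(hist):
    """Count local maxima ('hills') of a unidimensional histogram.

    A bin is a hill if it strictly exceeds its left neighbor and is at
    least as large as its right neighbor (so flat-topped peaks count once).
    """
    h = np.asarray(hist, dtype=float)
    return sum(
        1 for i in range(1, len(h) - 1) if h[i] > h[i - 1] and h[i] >= h[i + 1]
    )
```

A unimodal histogram yields one hill, a bimodal one two, suggesting the latter feature separates two clusters.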

  5. Clinical Utility of Blood Cell Histogram Interpretation

    PubMed Central

    Bhagya, S.; Majeed, Abdul

    2017-01-01

    An automated haematology analyser provides blood cell histograms by plotting the sizes of different blood cells on X-axis and their relative number on Y-axis. Histogram interpretation needs careful analysis of Red Blood Cell (RBC), White Blood Cell (WBC) and platelet distribution curves. Histogram analysis is often a neglected part of the automated haemogram which if interpreted well, has significant potential to provide diagnostically relevant information even before higher level investigations are ordered. PMID:29207767

  6. Clinical Utility of Blood Cell Histogram Interpretation.

    PubMed

    Thomas, E T Arun; Bhagya, S; Majeed, Abdul

    2017-09-01

    An automated haematology analyser provides blood cell histograms by plotting the sizes of different blood cells on X-axis and their relative number on Y-axis. Histogram interpretation needs careful analysis of Red Blood Cell (RBC), White Blood Cell (WBC) and platelet distribution curves. Histogram analysis is often a neglected part of the automated haemogram which if interpreted well, has significant potential to provide diagnostically relevant information even before higher level investigations are ordered.

  7. Potential of MR histogram analyses for prediction of response to chemotherapy in patients with colorectal hepatic metastases.

    PubMed

    Liang, He-Yue; Huang, Ya-Qin; Yang, Zhao-Xia; Ying-Ding; Zeng, Meng-Su; Rao, Sheng-Xiang

    2016-07-01

    To determine if magnetic resonance imaging (MRI) histogram analyses can help predict response to chemotherapy in patients with colorectal hepatic metastases, using the response evaluation criteria in solid tumours (RECIST1.1) as the reference standard. Standard MRI including diffusion-weighted imaging (b=0, 500 s/mm(2)) was performed before chemotherapy in 53 patients with colorectal hepatic metastases. Histograms were computed for apparent diffusion coefficient (ADC) maps and for arterial and portal venous phase images; thereafter, the mean, percentiles (1st, 10th, 50th, 90th, 99th), skewness, kurtosis, and variance were generated. Quantitative histogram parameters were compared between responders (partial and complete response, n=15) and non-responders (progressive and stable disease, n=38). Receiver operating characteristic (ROC) analyses were further performed for the significant parameters. The mean, 1st percentile, 10th percentile, 50th percentile, 90th percentile, and 99th percentile of the ADC maps were significantly lower in the responding group than in the non-responding group (p=0.000-0.002), with areas under the ROC curve (AUCs) of 0.76-0.82. The histogram parameters of the arterial and portal venous phases showed no significant difference (p>0.05) between the two groups. Histogram-derived parameters for ADC maps seem to be a promising tool for predicting response to chemotherapy in patients with colorectal hepatic metastases. • ADC histogram analyses can potentially predict chemotherapy response in colorectal liver metastases. • Lower histogram-derived parameters (mean, percentiles) for ADC tend to have good response. • MR enhancement histogram analyses are not reliable to predict response.
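
The histogram parameters compared between the responder groups are straightforward descriptive statistics of the voxel values; a minimal sketch (function name ours) computing them from an array of ADC values:

```python
import numpy as np

def histogram_parameters(values):
    """Descriptive parameters of the kind compared between responder groups:
    mean, selected percentiles, variance, skewness, and excess kurtosis."""
    v = np.asarray(values, dtype=float)
    m, s = v.mean(), v.std()
    z = (v - m) / s                               # standardized values
    pct = np.percentile(v, [1, 10, 50, 90, 99])
    return {
        "mean": m,
        "percentiles": dict(zip([1, 10, 50, 90, 99], pct)),
        "variance": v.var(),
        "skewness": float((z ** 3).mean()),
        "kurtosis": float((z ** 4).mean() - 3.0),  # excess kurtosis
    }
```

For approximately normal data, skewness and excess kurtosis come out near zero.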

  8. Using histograms to introduce randomization in the generation of ensembles of decision trees

    DOEpatents

    Kamath, Chandrika; Cantu-Paz, Erick; Littau, David

    2005-02-22

    A system for decision tree ensembles that includes a module to read the data, a module to create a histogram, a module to evaluate a potential split according to some criterion using the histogram, a module to select a split point randomly in an interval around the best split, a module to split the data, and a module to combine multiple decision trees in ensembles. The decision tree method includes the steps of reading the data; creating a histogram; evaluating a potential split according to some criterion using the histogram, selecting a split point randomly in an interval around the best split, splitting the data, and combining multiple decision trees in ensembles.
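
The split-selection step can be sketched as follows. This is our illustrative reading of the patented scheme, not its code: candidate splits are evaluated only at histogram bin edges, and the final split point is drawn uniformly from an interval around the best edge, which injects the randomization that differentiates trees in the ensemble. Weighted Gini impurity is an illustrative criterion.

```python
import numpy as np

def randomized_histogram_split(x, y, bins=32, rng=None):
    """Histogram-based split with a randomized split point.

    x: feature values; y: binary integer labels. Splits are scored only
    at interior histogram bin edges; the returned split point is drawn
    uniformly from a half-bin-width interval around the best edge.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x, dtype=float)
    y = np.asarray(y)
    edges = np.histogram_bin_edges(x, bins=bins)

    def gini(labels):
        p = np.bincount(labels, minlength=2) / len(labels)
        return 1.0 - (p ** 2).sum()

    best_edge, best_score = None, np.inf
    for e in edges[1:-1]:                        # interior bin edges only
        mask = x <= e
        left, right = y[mask], y[~mask]
        if len(left) == 0 or len(right) == 0:
            continue
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
        if score < best_score:
            best_edge, best_score = e, score
    width = edges[1] - edges[0]
    return rng.uniform(best_edge - width / 2, best_edge + width / 2)
```

For well-separated classes the randomized split still lands inside the gap between them, while different random draws yield slightly different trees.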

  9. Proceedings of the Conference on Moments and Signal

    NASA Astrophysics Data System (ADS)

    Purdue, P.; Solomon, H.

    1992-09-01

    The focus of this paper is (1) to describe systematic methodologies for selecting nonlinear transformations for blind equalization algorithms (and thus new types of cumulants), and (2) to give an overview of the existing blind equalization algorithms and point out their strengths as well as weaknesses. It is shown that all blind equalization algorithms belong to one of the following three categories, depending on where the nonlinear transformation is applied to the data: (1) the Bussgang algorithms, where the nonlinearity is at the output of the adaptive equalization filter; (2) the polyspectra (or Higher-Order Spectra) algorithms, where the nonlinearity is at the input of the adaptive equalization filter; and (3) the algorithms where the nonlinearity is inside the adaptive filter, i.e., the nonlinear filter or neural network. We describe methodologies for selecting nonlinear transformations based on various optimality criteria such as MSE or MAP. We illustrate that such existing algorithms as Sato, Benveniste-Goursat, Godard or CMA, Stop-and-Go, and Donoho are indeed special cases of the Bussgang family of techniques when the nonlinearity is memoryless. We present results demonstrating that the polyspectra-based algorithms exhibit a faster convergence rate than the Bussgang algorithms. However, this improved performance comes at the expense of more computations per iteration. We also show that blind equalizers based on nonlinear filters or neural networks are better suited for channels that have nonlinear distortions.
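
As a concrete member of the Bussgang family, the Godard/constant-modulus algorithm (CMA) applies its memoryless nonlinearity at the equalizer output. A minimal real-valued BPSK sketch (the toy channel, step size, and tap count are our choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
symbols = rng.choice([-1.0, 1.0], size=5000)     # BPSK: constant modulus
channel = np.array([1.0, 0.4, 0.2])              # toy ISI channel
received = np.convolve(symbols, channel)[: len(symbols)]

ntaps, mu = 11, 1e-3
R2 = 1.0                                         # E|a|^4 / E|a|^2 = 1 for BPSK
w = np.zeros(ntaps)
w[ntaps // 2] = 1.0                              # center-spike initialization

dispersion = []
for n in range(ntaps, len(received)):
    x = received[n - ntaps:n][::-1]              # tap-delay line
    yhat = w @ x                                 # equalizer output
    dispersion.append((yhat ** 2 - R2) ** 2)     # CMA cost at this symbol
    w -= mu * yhat * (yhat ** 2 - R2) * x        # stochastic-gradient update

# the dispersion (CMA cost) should shrink as the equalizer converges
print(np.mean(dispersion[:500]), "->", np.mean(dispersion[-500:]))
```

No training symbols are used anywhere: the output nonlinearity y(|y|^2 - R2) alone drives adaptation, which is the defining Bussgang property.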

  10. Histogram based analysis of lung perfusion of children after congenital diaphragmatic hernia repair.

    PubMed

    Kassner, Nora; Weis, Meike; Zahn, Katrin; Schaible, Thomas; Schoenberg, Stefan O; Schad, Lothar R; Zöllner, Frank G

    2018-05-01

    To investigate a histogram based approach to characterize the distribution of perfusion in the whole left and right lung by descriptive statistics, and to show how histograms can be used to visually explore perfusion defects in two-year-old children after Congenital Diaphragmatic Hernia (CDH) repair. 28 children (age 24.2 ± 1.7 months; all left-sided hernia; 9 after extracorporeal membrane oxygenation therapy) underwent quantitative DCE-MRI of the lung. Segmentations of the left and right lung were manually drawn to mask the calculated pulmonary blood flow maps and then to derive histograms for each lung side. Individual and group-wise analysis of the histograms of the left and right lung was performed. The ipsilateral and contralateral lungs showed significant differences in shape and in descriptive statistics derived from the histogram (Wilcoxon signed-rank test, p < 0.05) at both the group-wise and individual level. Subgroup analysis (patients with vs without ECMO therapy) showed no significant differences using histogram-derived parameters. Histogram analysis can be a valuable tool to characterize and visualize whole-lung perfusion of children after CDH repair. It offers several ways to analyze the data, both describing the perfusion differences between the right and left lung and exploring and visualizing localized perfusion patterns in the 3D lung volume. Subgroup analysis will be possible given sufficient sample sizes. Copyright © 2017 Elsevier Inc. All rights reserved.
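The paired group-wise comparison described above can be sketched as follows; the choice of the median as the histogram-derived statistic and all function names are illustrative assumptions, not the authors' exact pipeline:

```python
import numpy as np
from scipy.stats import wilcoxon

def compare_lung_perfusion(pbf_left, pbf_right):
    """Compare a histogram-derived descriptive statistic (here the median
    pulmonary blood flow per lung) between the two lungs across patients
    with the paired Wilcoxon signed-rank test, as in the study.

    pbf_left / pbf_right: lists of per-patient PBF voxel-value arrays."""
    med_l = np.array([np.median(v) for v in pbf_left])
    med_r = np.array([np.median(v) for v in pbf_right])
    stat, p = wilcoxon(med_l, med_r)   # paired, non-parametric test
    return med_l, med_r, p
```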

  11. Using the Bootstrap Method for a Statistical Significance Test of Differences between Summary Histograms

    NASA Technical Reports Server (NTRS)

    Xu, Kuan-Man

    2006-01-01

    A new method is proposed to compare statistical differences between summary histograms, which are the histograms summed over a large ensemble of individual histograms. It consists of choosing a distance statistic for measuring the difference between summary histograms and using a bootstrap procedure to calculate the statistical significance level. Bootstrapping is an approach to statistical inference that makes few assumptions about the underlying probability distribution that describes the data. Three distance statistics are compared in this study. They are the Euclidean distance, the Jeffries-Matusita distance and the Kuiper distance. The data used in testing the bootstrap method are satellite measurements of cloud systems called cloud objects. Each cloud object is defined as a contiguous region/patch composed of individual footprints or fields of view. A histogram of measured values over footprints is generated for each parameter of each cloud object and then summary histograms are accumulated over all individual histograms in a given cloud-object size category. The results of statistical hypothesis tests using all three distances as test statistics are generally similar, indicating the validity of the proposed method. The Euclidean distance is determined to be most suitable after comparing the statistical tests of several parameters with distinct probability distributions among three cloud-object size categories. Impacts on the statistical significance levels resulting from differences in the total lengths of satellite footprint data between two size categories are also discussed.
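The proposed procedure, with the Euclidean distance as the test statistic, can be sketched as follows; the bin layout, resampling counts, and names are illustrative assumptions:

```python
import numpy as np

def bootstrap_histogram_test(hists_a, hists_b, n_boot=1000, seed=0):
    """Significance test for the difference between two summary histograms.

    Each input is a stack of individual (e.g. per-cloud-object) histograms;
    the summary histogram is their normalized sum. The observed Euclidean
    distance between the two summaries is compared with a bootstrap null
    distribution obtained by resampling histograms, with replacement,
    from the pooled ensemble."""
    rng = np.random.default_rng(seed)

    def summary(h):
        s = h.sum(axis=0).astype(float)
        return s / s.sum()

    d_obs = np.linalg.norm(summary(hists_a) - summary(hists_b))
    pooled = np.vstack([hists_a, hists_b])
    n_a, n_all = len(hists_a), len(pooled)
    d_null = np.empty(n_boot)
    for i in range(n_boot):
        ia = rng.integers(0, n_all, n_a)          # bootstrap resample A
        ib = rng.integers(0, n_all, n_all - n_a)  # bootstrap resample B
        d_null[i] = np.linalg.norm(summary(pooled[ia]) - summary(pooled[ib]))
    return float(np.mean(d_null >= d_obs))        # bootstrap p-value
```

The Jeffries-Matusita or Kuiper distance can be substituted for `np.linalg.norm` without changing the resampling logic.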

  12. An analysis of neural receptive field plasticity by point process adaptive filtering

    PubMed Central

    Brown, Emery N.; Nguyen, David P.; Frank, Loren M.; Wilson, Matthew A.; Solo, Victor

    2001-01-01

    Neural receptive fields are plastic: with experience, neurons in many brain regions change their spiking responses to relevant stimuli. Analysis of receptive field plasticity from experimental measurements is crucial for understanding how neural systems adapt their representations of relevant biological information. Current analysis methods using histogram estimates of spike rate functions in nonoverlapping temporal windows do not track the evolution of receptive field plasticity on a fine time scale. Adaptive signal processing is an established engineering paradigm for estimating time-varying system parameters from experimental measurements. We present an adaptive filter algorithm for tracking neural receptive field plasticity based on point process models of spike train activity. We derive an instantaneous steepest descent algorithm by using as the criterion function the instantaneous log likelihood of a point process spike train model. We apply the point process adaptive filter algorithm in a study of spatial (place) receptive field properties of simulated and actual spike train data from rat CA1 hippocampal neurons. A stability analysis of the algorithm is sketched in the Appendix. The adaptive algorithm can update the place field parameter estimates on a millisecond time scale. It reliably tracked the migration, changes in scale, and changes in maximum firing rate characteristic of hippocampal place fields in a rat running on a linear track. Point process adaptive filtering offers an analytic method for studying the dynamics of neural receptive fields. PMID:11593043
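For a log-linear rate model λ_k = exp(θ·x_k), the instantaneous steepest-descent update described above takes a simple closed form: the gradient of the instantaneous point-process log likelihood gives θ ← θ + ε·x_k·(dN_k − λ_k·Δ). A minimal sketch (the learning rate, covariate layout, and names are assumptions, not the authors' parameterization):

```python
import numpy as np

def point_process_filter(spikes, stimulus, eps=0.05, dt=0.001):
    """Instantaneous steepest-descent point-process adaptive filter for a
    log-linear conditional intensity lambda_k = exp(theta . x_k).

    spikes:   0/1 spike indicators dN_k per time bin of width dt
    stimulus: covariate vector x_k per time bin (rows)
    Returns the trajectory of the parameter estimate theta."""
    theta = np.zeros(stimulus.shape[1])
    path = np.empty((len(spikes), len(theta)))
    for k in range(len(spikes)):
        lam = np.exp(theta @ stimulus[k])
        # gradient of the instantaneous log likelihood of dN_k
        theta = theta + eps * stimulus[k] * (spikes[k] - lam * dt)
        path[k] = theta
    return path
```

Because the update uses only the current bin, the estimate can track parameter drift (e.g. a migrating place field) on the time scale of single bins.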

  13. Generalized image contrast enhancement technique based on the Heinemann contrast discrimination model

    NASA Astrophysics Data System (ADS)

    Liu, Hong; Nodine, Calvin F.

    1996-07-01

    This paper presents a generalized image contrast enhancement technique, which equalizes the perceived brightness distribution based on the Heinemann contrast discrimination model. It is based on the mathematically proven existence of a unique solution to a nonlinear equation, and is formulated with easily tunable parameters. The model uses a two-step log-log representation of luminance contrast between targets and surround in a luminous background setting. The algorithm consists of two nonlinear gray scale mapping functions that have seven parameters, two of which are adjustable Heinemann constants. Another parameter is the background gray level. The remaining four parameters are nonlinear functions of the gray-level distribution of the given image, and can be uniquely determined once the previous three are set. Tests have been carried out to demonstrate the effectiveness of the algorithm for increasing the overall contrast of radiology images. The traditional histogram equalization can be reinterpreted as an image enhancement technique based on the knowledge of human contrast perception. In fact, it is a special case of the proposed algorithm.
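The "special case" mentioned in the last sentence, traditional histogram equalization, amounts to mapping each gray level through the image's own cumulative distribution function:

```python
import numpy as np

def histogram_equalize(img, levels=256):
    """Classical histogram equalization for a uint8 image: build the
    gray-level histogram, form its normalized cumulative distribution
    function, and use the scaled CDF as a lookup table."""
    hist = np.bincount(img.ravel(), minlength=levels)
    cdf = hist.cumsum() / img.size
    lut = np.round((levels - 1) * cdf).astype(np.uint8)
    return lut[img]
```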

  14. Cost-effective forensic image enhancement

    NASA Astrophysics Data System (ADS)

    Dalrymple, Brian E.

    1998-12-01

    In 1977, a paper was presented at the SPIE conference in Reston, Virginia, detailing the computer enhancement of the Zapruder film. The forensic value of this examination in a major homicide investigation was apparent to the viewer. Equally clear was the potential for extracting evidence which is beyond the reach of conventional detection techniques. The cost of this technology in 1976, however, was prohibitive, and well beyond the means of most police agencies. Twenty-two years later, a highly efficient means of image enhancement is easily within the grasp of most police agencies, not only for homicides but for any case application. A PC workstation combined with an enhancement software package allows a forensic investigator to fully exploit digital technology. The goal of this approach is the optimization of the signal to noise ratio in images. Obstructive backgrounds may be diminished or eliminated while weak signals are optimized by the use of algorithms including Fast Fourier Transform, Histogram Equalization and Image Subtraction. An added benefit is the speed with which these processes are completed and the results known. The efficacy of forensic image enhancement is illustrated through case applications.

  15. Low-level image properties in facial expressions.

    PubMed

    Menzel, Claudia; Redies, Christoph; Hayn-Leichsenring, Gregor U

    2018-06-04

    We studied low-level image properties of face photographs and analyzed whether they change with different emotional expressions displayed by an individual. Differences in image properties were measured in three databases that depicted a total of 167 individuals. Face images were used either in their original form, cut to a standard format or superimposed with a mask. Image properties analyzed were: brightness, redness, yellowness, contrast, spectral slope, overall power and relative power in low, medium and high spatial frequencies. Results showed that image properties differed significantly between expressions within each individual image set. Further, specific facial expressions corresponded to patterns of image properties that were consistent across all three databases. In order to experimentally validate our findings, we equalized the luminance histograms and spectral slopes of three images from a given individual who showed two expressions. Participants were significantly slower in matching the expression in an equalized compared to an original image triad. Thus, existing differences in these image properties (i.e., spectral slope, brightness or contrast) facilitate emotion detection in particular sets of face images. Copyright © 2018. Published by Elsevier B.V.

  16. Evaluation of thresholding techniques for segmenting scaffold images in tissue engineering

    NASA Astrophysics Data System (ADS)

    Rajagopalan, Srinivasan; Yaszemski, Michael J.; Robb, Richard A.

    2004-05-01

    Tissue engineering attempts to address the ever-widening gap between the demand and supply of organ and tissue transplants using natural and biomimetic scaffolds. The regeneration of specific tissues aided by synthetic materials is dependent on the structural and morphometric properties of the scaffold. These properties can be derived non-destructively using quantitative analysis of high-resolution microCT scans of scaffolds. Thresholding of the scanned images into polymeric and porous phases is central to the outcome of the subsequent structural and morphometric analysis. Visual thresholding of scaffolds produced using stochastic processes is inaccurate. Depending on the algorithmic assumptions made, automatic thresholding might also be inaccurate. Hence there is a need to analyze the performance of different techniques and propose alternate ones, if needed. This paper provides a quantitative comparison of different thresholding techniques for segmenting scaffold images. The thresholding algorithms examined include those that exploit spatial information, locally adaptive characteristics, histogram entropy information, histogram shape information, and clustering of gray-level information. The performance of the different techniques was evaluated using established criteria, including misclassification error, edge mismatch, relative foreground error, and region non-uniformity. Algorithms that exploit local image characteristics seem to perform much better than those using global information.

  17. FPGA based charge fast histogramming for GEM detector

    NASA Astrophysics Data System (ADS)

    Poźniak, Krzysztof T.; Byszuk, A.; Chernyshova, M.; Cieszewski, R.; Czarski, T.; Dominik, W.; Jakubowska, K.; Kasprowicz, G.; Rzadkiewicz, J.; Scholz, M.; Zabolotny, W.

    2013-10-01

    This article presents a fast charge histogramming method for the position-sensitive X-ray GEM detector. The energy-resolved measurements are carried out simultaneously for 256 channels of the GEM detector. The whole process of histogramming is performed in 21 FPGA chips (Spartan-6 series from Xilinx). The results of the histogramming process are stored in an external DDR3 memory. The structure of the electronic measuring equipment and the firmware functionality implemented in the FPGAs are described. Examples of test measurements are presented.

  18. Local dynamic range compensation for scanning electron microscope imaging system.

    PubMed

    Sim, K S; Huang, Y H

    2015-01-01

    This paper presents an extension of earlier work, introducing the modified dynamic range histogram modification (MDRHM) technique to enhance the scanning electron microscope (SEM) imaging system. In contrast with conventional histogram modification compensators, this technique profiles the histogram by extending the dynamic range of each tile of an image to the full 0-255 range while retaining its histogram shape. The proposed technique yields better image compensation than conventional methods. © Wiley Periodicals, Inc.
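The core idea, a per-tile linear stretch to the full 0-255 range (a linear map rescales a histogram without changing its shape), can be sketched as follows; this is a simplification, not the authors' complete MDRHM pipeline:

```python
import numpy as np

def tile_range_stretch(img, tile=64):
    """Per-tile linear contrast stretch for a uint8 image: each tile's
    gray range is mapped onto the full 0-255 range. Because the mapping
    is linear, the shape of each tile's histogram is preserved, which is
    the key property of MDRHM-style compensation."""
    out = np.empty_like(img)
    h, w = img.shape
    for i in range(0, h, tile):
        for j in range(0, w, tile):
            block = img[i:i + tile, j:j + tile].astype(float)
            lo, hi = block.min(), block.max()
            scale = 255.0 / (hi - lo) if hi > lo else 0.0
            out[i:i + tile, j:j + tile] = ((block - lo) * scale).astype(img.dtype)
    return out
```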

  19. Illumination invariant feature point matching for high-resolution planetary remote sensing images

    NASA Astrophysics Data System (ADS)

    Wu, Bo; Zeng, Hai; Hu, Han

    2018-03-01

    Despite its success with regular close-range and remote-sensing images, the scale-invariant feature transform (SIFT) algorithm is essentially not invariant to illumination differences due to the use of gradients for feature description. In planetary remote sensing imagery, which normally lacks sufficient textural information, salient regions are generally triggered by the shadow effects of keypoints, reducing the matching performance of classical SIFT. Based on the observation of dual peaks in a histogram of the dominant orientations of SIFT keypoints, this paper proposes an illumination-invariant SIFT matching method for high-resolution planetary remote sensing images. First, as the peaks in the orientation histogram are generally aligned closely with the sub-solar azimuth angle at the time of image collection, an adaptive suppression Gaussian function is tuned to level the histogram and thereby alleviate the differences in illumination caused by a changing solar angle. Next, the suppression function is incorporated into the original SIFT procedure for obtaining feature descriptors, which are used for initial image matching. Finally, as the distribution of feature descriptors changes after anisotropic suppression, and the ratio check used for matching and outlier removal in classical SIFT may produce inferior results, this paper proposes an improved matching procedure based on cross-checking and template image matching. The experimental results for several high-resolution remote sensing images from both the Moon and Mars, with illumination differences of 20°-180°, reveal that the proposed method retrieves about 40%-60% more matches than the classical SIFT method. The proposed method is of significance for matching or co-registration of planetary remote sensing images for their synergistic use in various applications. It also has the potential to be useful for flyby and rover images by integrating with the affine invariant feature detectors.
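The peak-leveling idea can be sketched as an inverted Gaussian weighting of the orientation histogram, scaled so the dominant (illumination-induced) peak is pulled down to the histogram mean; this is a simplified stand-in for the paper's adaptive suppression function, with the width parameter chosen arbitrarily:

```python
import numpy as np

def suppress_orientation_peak(hist, sigma=3.0):
    """Level an orientation histogram around its dominant peak: bins near
    the peak are attenuated by an inverted Gaussian whose depth is chosen
    so the peak is reduced to the histogram mean, while distant bins are
    left essentially untouched."""
    hist = np.asarray(hist, dtype=float)
    n = len(hist)
    k = int(np.argmax(hist))
    d = np.abs(np.arange(n) - k)
    d = np.minimum(d, n - d)                 # circular bin distance
    depth = 1.0 - hist.mean() / hist[k]      # pull the peak down to the mean
    weight = 1.0 - depth * np.exp(-d ** 2 / (2 * sigma ** 2))
    return hist * weight
```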

  20. Comparison of NRZ and duo-binary format in adaptive equalization assisted 10G-optics based 25G-EPON

    NASA Astrophysics Data System (ADS)

    Xia, Junqi; Li, Zhengxuan; Li, Yingchun; Xu, Tingting; Chen, Jian; Song, Yingxiong; Wang, Min

    2018-03-01

    We investigate and compare the requirements of FFE/DFE-based adaptive equalization techniques for NRZ and duo-binary based 25-Gb/s transmission, two of the most promising schemes for 25G-EPON. A 25-Gb/s transmission system based on 10G optical transceivers is demonstrated, and the performance of FFE alone and of FFE combined with DFE with different numbers of taps is compared for the two modulation formats. The FFE/DFE-based duo-binary receiver shows better performance than the NRZ receiver. For the duo-binary receiver, only a 13-tap FFE is needed in the back-to-back case, and the combination of a 17-tap FFE and a 5-tap DFE achieves a sensitivity of -23.45 dBm in the 25 km transmission case, which is ∼0.6 dB better than the best performance of NRZ equalization. In addition, the required training sequence length for FFE/DFE-based adaptive equalization is verified. Experimental results show that a training length of 400 symbols is optimal for both modulations, which allows a short packet preamble in upstream burst-mode transmission.
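A feed-forward equalizer adapted by LMS over a training sequence and then switched to decision-directed operation can be sketched as follows; this is a simplified NRZ-style baseband sketch (real 25G-EPON receivers add a decision-feedback section and operate on sampled optical waveforms):

```python
import numpy as np

def lms_ffe(received, training, n_taps=13, mu=0.01):
    """Feed-forward equalizer (FFE) adapted with the LMS algorithm on a
    known training sequence, then run decision-directed: once training
    symbols run out, the sliced output np.sign(y) serves as reference."""
    w = np.zeros(n_taps)
    w[n_taps // 2] = 1.0              # center-spike initialization
    d = n_taps // 2                   # decision delay of the equalizer
    out = np.empty(len(received) - n_taps)
    for n in range(len(out)):
        x = received[n:n + n_taps][::-1]
        y = w @ x
        ref = training[n + d] if n + d < len(training) else np.sign(y)
        w += mu * (ref - y) * x       # LMS tap update
        out[n] = y
    return out, w
```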

  1. FEM: Feature-enhanced map

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Afonine, Pavel V.; Moriarty, Nigel W.; Mustyakimov, Marat

    A method is presented that modifies a 2mFobs − DFmodel σA-weighted map such that the resulting map can strengthen a weak signal, if present, and can reduce model bias and noise. The method consists of first randomizing the starting map and filling in missing reflections using multiple methods. This is followed by restricting the map to regions with convincing density and the application of sharpening. The final map is then created by combining a series of histogram-equalized intermediate maps. In the test cases shown, the maps produced in this way are found to have increased interpretability and decreased model bias compared with the starting 2mFobs − DFmodel σA-weighted map.

  2. FEM: feature-enhanced map

    PubMed Central

    Afonine, Pavel V.; Moriarty, Nigel W.; Mustyakimov, Marat; Sobolev, Oleg V.; Terwilliger, Thomas C.; Turk, Dusan; Urzhumtsev, Alexandre; Adams, Paul D.

    2015-01-01

    A method is presented that modifies a 2mFobs − DFmodel σA-weighted map such that the resulting map can strengthen a weak signal, if present, and can reduce model bias and noise. The method consists of first randomizing the starting map and filling in missing reflections using multiple methods. This is followed by restricting the map to regions with convincing density and the application of sharpening. The final map is then created by combining a series of histogram-equalized intermediate maps. In the test cases shown, the maps produced in this way are found to have increased interpretability and decreased model bias compared with the starting 2mFobs − DFmodel σA-weighted map. PMID:25760612

  3. FEM: Feature-enhanced map

    DOE PAGES

    Afonine, Pavel V.; Moriarty, Nigel W.; Mustyakimov, Marat; ...

    2015-02-26

    A method is presented that modifies a 2mFobs − DFmodel σA-weighted map such that the resulting map can strengthen a weak signal, if present, and can reduce model bias and noise. The method consists of first randomizing the starting map and filling in missing reflections using multiple methods. This is followed by restricting the map to regions with convincing density and the application of sharpening. The final map is then created by combining a series of histogram-equalized intermediate maps. In the test cases shown, the maps produced in this way are found to have increased interpretability and decreased model bias compared with the starting 2mFobs − DFmodel σA-weighted map.

  4. A natural-color mapping for single-band night-time image based on FPGA

    NASA Astrophysics Data System (ADS)

    Wang, Yilun; Qian, Yunsheng

    2018-01-01

    A natural-color mapping method for single-band night-time images based on FPGA can transfer the color of a reference image to a single-band night-time image; the result is consistent with human visual habits and can help observers identify targets. This paper introduces the processing flow of the natural-color mapping algorithm based on FPGA. First, the image is transformed by histogram equalization, and the intensity features and standard deviation features of the reference image are stored in SRAM. Then, the intensity features and standard deviation features of the real-time digital images are calculated by the FPGA. Finally, the FPGA completes the color mapping by matching pixels between images using the features in the luminance channel.
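A software sketch of such a statistics-based mapping (histogram-equalize the single band, then impose each reference channel's mean and standard deviation on the output); the exact FPGA pipeline and its feature matching are not reproduced here:

```python
import numpy as np

def stats_color_mapping(gray, ref_rgb):
    """Color a single-band night-time image from a reference image:
    histogram-equalize the input band, then give each output channel the
    corresponding reference channel's mean and standard deviation."""
    # histogram equalization of the single input band
    hist = np.bincount(gray.ravel(), minlength=256)
    cdf = hist.cumsum() / gray.size
    eq = (255 * cdf)[gray]
    # match first- and second-order statistics per output channel
    out = np.empty(gray.shape + (3,))
    mu_g, sd_g = eq.mean(), eq.std() + 1e-9
    for c in range(3):
        mu_r, sd_r = ref_rgb[..., c].mean(), ref_rgb[..., c].std()
        out[..., c] = (eq - mu_g) / sd_g * sd_r + mu_r
    return np.clip(out, 0, 255).astype(np.uint8)
```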

  5. Whole-lesion apparent diffusion coefficient histogram analysis: significance in T and N staging of gastric cancers.

    PubMed

    Liu, Song; Zhang, Yujuan; Chen, Ling; Guan, Wenxian; Guan, Yue; Ge, Yun; He, Jian; Zhou, Zhengyang

    2017-10-02

    Whole-lesion apparent diffusion coefficient (ADC) histogram analysis has been introduced and proved effective in the assessment of multiple tumors. However, the application of whole-volume ADC histogram analysis in gastrointestinal tumors has just started and has never been reported for T and N staging of gastric cancers. Eighty patients with pathologically confirmed gastric carcinomas underwent diffusion-weighted (DW) magnetic resonance imaging before surgery prospectively. Whole-lesion ADC histogram analysis was performed by two radiologists independently. The differences of ADC histogram parameters among different T and N stages were compared with the independent-samples Kruskal-Wallis test. Receiver operating characteristic (ROC) analysis was performed to evaluate the performance of ADC histogram parameters in differentiating particular T or N stages of gastric cancers. There were significant differences in all the ADC histogram parameters for gastric cancers at different T (except ADCmin and ADCmax) and N (except ADCmax) stages. Most ADC histogram parameters differed significantly between T1 vs T3, T1 vs T4, T2 vs T4, N0 vs N1, N0 vs N3, and some parameters (ADC5%, ADC10%, ADCmin) differed significantly between N0 vs N2, N2 vs N3 (all P < 0.05). Most parameters except ADCmax performed well in differentiating different T and N stages of gastric cancers. Especially for identifying patients with and without lymph node metastasis, the ADC10% yielded the largest area under the ROC curve of 0.794 (95% confidence interval, 0.677-0.911). All the parameters except ADCmax showed excellent inter-observer agreement with intra-class correlation coefficients higher than 0.800. Whole-volume ADC histogram parameters held great potential in differentiating different T and N stages of gastric cancers preoperatively.
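The histogram parameters used in studies of this kind are computed from the voxel values of the segmented lesion; a minimal sketch (parameter names follow the abstract, and the formulas are the standard percentile and moment definitions, not necessarily the authors' software):

```python
import numpy as np

def adc_histogram_params(adc_voxels):
    """Whole-lesion ADC histogram parameters: mean, min/max, percentiles,
    and the standardized third and fourth moments (skewness and excess
    kurtosis) of all voxel ADC values inside the lesion."""
    v = np.asarray(adc_voxels, dtype=float)
    mean, sd = v.mean(), v.std()
    return {
        'ADCmean': mean, 'ADCmin': v.min(), 'ADCmax': v.max(),
        'ADC5%': np.percentile(v, 5), 'ADC10%': np.percentile(v, 10),
        'ADC90%': np.percentile(v, 90),
        'skewness': np.mean(((v - mean) / sd) ** 3),
        'kurtosis': np.mean(((v - mean) / sd) ** 4) - 3,
    }
```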

  6. Histogram Profiling of Postcontrast T1-Weighted MRI Gives Valuable Insights into Tumor Biology and Enables Prediction of Growth Kinetics and Prognosis in Meningiomas.

    PubMed

    Gihr, Georg Alexander; Horvath-Rizea, Diana; Kohlhof-Meinecke, Patricia; Ganslandt, Oliver; Henkes, Hans; Richter, Cindy; Hoffmann, Karl-Titus; Surov, Alexey; Schob, Stefan

    2018-06-14

    Meningiomas are the most frequently diagnosed intracranial masses, oftentimes requiring surgery. Procedure-related morbidity in particular can be substantial, especially in elderly patients. Hence, reliable imaging modalities enabling pretherapeutic prediction of tumor grade, growth kinetics, realistic prognosis, and, as a consequence, the necessity of surgery are of great value. In this context, a promising diagnostic approach is advanced analysis of magnetic resonance imaging data. Therefore, our study investigated whether histogram profiling of routinely acquired postcontrast T1-weighted images is capable of separating low-grade from high-grade lesions and whether histogram parameters reflect Ki-67 expression in meningiomas. Pretreatment T1-weighted postcontrast volumes of 44 meningioma patients were used for signal intensity histogram profiling. WHO grade, tumor volume, and Ki-67 expression were evaluated. Comparative and correlative statistics investigating the association between histogram profile parameters and neuropathology were performed. None of the investigated histogram parameters revealed significant differences between low-grade and high-grade meningiomas. However, significant correlations were identified between Ki-67 and the histogram parameters skewness and entropy, as well as between entropy and tumor volume. Contrary to previously reported findings, pretherapeutic postcontrast T1-weighted images can be used to predict growth kinetics in meningiomas if whole-tumor histogram analysis is employed. However, no differences between distinct WHO grades were identifiable in our cohort. As a consequence, histogram analysis of postcontrast T1-weighted images is a promising approach to obtain quantitative in vivo biomarkers reflecting the proliferative potential of meningiomas. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.

  7. Dose-volume histogram prediction using density estimation.

    PubMed

    Skarpman Munter, Johanna; Sjölund, Jens

    2015-09-07

    Knowledge of what dose-volume histograms can be expected for a previously unseen patient could increase consistency and quality in radiotherapy treatment planning. We propose a machine learning method that uses previous treatment plans to predict such dose-volume histograms. The key to the approach is the framing of dose-volume histograms in a probabilistic setting. The training consists of estimating, from the patients in the training set, the joint probability distribution of some predictive features and the dose. The joint distribution immediately provides an estimate of the conditional probability of the dose given the values of the predictive features. The prediction consists of estimating, from the new patient, the distribution of the predictive features and marginalizing the conditional probability from the training over this. Integrating the resulting probability distribution for the dose yields an estimate of the dose-volume histogram. To illustrate how the proposed method relates to previously proposed methods, we use the signed distance to the target boundary as a single predictive feature. As a proof-of-concept, we predicted dose-volume histograms for the brainstems of 22 acoustic schwannoma patients treated with stereotactic radiosurgery, and for the lungs of 9 lung cancer patients treated with stereotactic body radiation therapy. Comparing with two previous attempts at dose-volume histogram prediction we find that, given the same input data, the predictions are similar. In summary, we propose a method for dose-volume histogram prediction that exploits the intrinsic probabilistic properties of dose-volume histograms. We argue that the proposed method makes up for some deficiencies in previously proposed methods, thereby potentially increasing ease of use, flexibility and ability to perform well with small amounts of training data.
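With the signed distance to the target boundary as the single predictive feature, the train/predict/marginalize/integrate chain can be sketched as follows; a simple 2D histogram stands in for the density estimate, and the binning and names are illustrative assumptions:

```python
import numpy as np

def predict_dvh(train_dist, train_dose, new_dist, dist_bins, dose_bins):
    """Dose-volume histogram prediction by density estimation: learn the
    conditional distribution of dose given signed distance-to-target from
    training voxels, marginalize it over the new patient's distance
    distribution, and cumulate to obtain the predicted DVH."""
    # joint histogram over (distance, dose) -> conditional p(dose | distance)
    joint, _, _ = np.histogram2d(train_dist, train_dose,
                                 bins=[dist_bins, dose_bins])
    row_sums = joint.sum(axis=1, keepdims=True)
    cond = np.divide(joint, row_sums, out=np.zeros_like(joint),
                     where=row_sums > 0)
    # the new patient's distance distribution
    p_dist, _ = np.histogram(new_dist, bins=dist_bins)
    p_dist = p_dist / p_dist.sum()
    p_dose = p_dist @ cond                 # marginalize over distance
    # cumulative DVH: fraction of volume receiving at least each dose level
    return p_dose[::-1].cumsum()[::-1]
```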

  8. Structure Size Enhanced Histogram

    NASA Astrophysics Data System (ADS)

    Wesarg, Stefan; Kirschner, Matthias

    Direct volume visualization requires the definition of transfer functions (TFs) for the assignment of opacity and color. Multi-dimensional TFs are based on at least two image properties, and are specified by means of 2D histograms. In this work we propose a new type of a 2D histogram which combines gray value with information about the size of the structures. This structure size enhanced (SSE) histogram is an intuitive approach for representing anatomical features. Clinicians — the users we are focusing on — are much more familiar with selecting features by their size than by their gradient magnitude value. As a proof of concept, we employ the SSE histogram for the definition of two-dimensional TFs for the visualization of 3D MRI and CT image data.

  9. Face recognition algorithm using extended vector quantization histogram features.

    PubMed

    Yan, Yan; Lee, Feifei; Wu, Xueqian; Chen, Qiu

    2018-01-01

    In this paper, we propose a face recognition algorithm based on a combination of vector quantization (VQ) and Markov stationary features (MSF). The VQ algorithm has been shown to be an effective method for generating features; it extracts a codevector histogram as a facial feature representation for face recognition. Still, the VQ histogram features are unable to convey spatial structural information, which to some extent limits their usefulness in discrimination. To alleviate this limitation of VQ histograms, we utilize Markov stationary features (MSF) to extend the VQ histogram-based features so as to add spatial structural information. We demonstrate the effectiveness of our proposed algorithm by achieving recognition results superior to those of several state-of-the-art methods on publicly available face databases.

  10. Ultrasonic histogram assessment of early response to concurrent chemo-radiotherapy in patients with locally advanced cervical cancer: a feasibility study.

    PubMed

    Xu, Yan; Ru, Tong; Zhu, Lijing; Liu, Baorui; Wang, Huanhuan; Zhu, Li; He, Jian; Liu, Song; Zhou, Zhengyang; Yang, Xiaofeng

    To monitor early response of locally advanced cervical cancers undergoing concurrent chemo-radiotherapy (CCRT) by ultrasonic histogram. B-mode ultrasound examinations were performed at 4 time points in thirty-four patients during CCRT. Six ultrasonic histogram parameters were used to assess the echogenicity, homogeneity and heterogeneity of tumors. Ipeak increased rapidly from the first week after therapy initiation, whereas Wlow, Whigh and Ahigh changed significantly at the second week. The average ultrasonic histogram progressively moved toward the right and converted into a more symmetrical shape. The ultrasonic histogram could serve as a potential marker to monitor early response during CCRT. Copyright © 2018 Elsevier Inc. All rights reserved.

  11. Face verification system for Android mobile devices using histogram based features

    NASA Astrophysics Data System (ADS)

    Sato, Sho; Kobayashi, Kazuhiro; Chen, Qiu

    2016-07-01

    This paper proposes a face verification system that runs on Android mobile devices. In this system, a facial image is first captured by a built-in camera on the Android device, and then face detection is implemented using Haar-like features and the AdaBoost learning algorithm. The proposed system verifies the detected face using histogram based features, which are generated by a binary Vector Quantization (VQ) histogram using DCT coefficients in low frequency domains, as well as an Improved Local Binary Pattern (Improved LBP) histogram in the spatial domain. Verification results with the different types of histogram based features are first obtained separately and then combined by weighted averaging. We evaluate our proposed algorithm using the publicly available ORL database and facial images captured by an Android tablet.

  12. Histogram Curve Matching Approaches for Object-based Image Classification of Land Cover and Land Use

    PubMed Central

    Toure, Sory I.; Stow, Douglas A.; Weeks, John R.; Kumar, Sunil

    2013-01-01

    The classification of image-objects is usually done using parametric statistical measures of central tendency and/or dispersion (e.g., mean or standard deviation). The objectives of this study were to analyze digital number histograms of image objects and evaluate classification measures exploiting characteristic signatures of such histograms. Two histogram-matching classifiers were evaluated and compared to the standard nearest-neighbor-to-mean classifier. An ADS40 airborne multispectral image of San Diego, California, was used for assessing the utility of curve matching classifiers in a geographic object-based image analysis (GEOBIA) approach. The classifications were performed with data sets having 0.5 m, 2.5 m, and 5 m spatial resolutions. Results show that histograms are reliable features for characterizing classes. Also, both histogram-matching classifiers consistently performed better than the one based on the standard nearest-neighbor-to-mean rule. The highest classification accuracies were produced with images having 2.5 m spatial resolution. PMID:24403648
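A histogram curve-matching classifier of this kind assigns an image object to the class whose reference histogram curve is closest to the object's own normalized histogram; a minimal sketch using the L1 distance (the study's exact matching measures may differ):

```python
import numpy as np

def classify_by_histogram(obj_pixels, class_histograms, bins):
    """Histogram curve matching for GEOBIA: compare the object's
    normalized digital-number histogram against each class's reference
    histogram and return the class with the smallest L1 distance.
    Unlike nearest-mean rules, this uses the full histogram shape."""
    h, _ = np.histogram(obj_pixels, bins=bins)
    h = h / h.sum()
    dists = {name: np.abs(h - ref).sum()
             for name, ref in class_histograms.items()}
    return min(dists, key=dists.get)
```

The advantage over the nearest-mean rule appears when two classes share a mean but differ in histogram shape, as in the test below.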

  13. Histogram and gray level co-occurrence matrix on gray-scale ultrasound images for diagnosing lymphocytic thyroiditis.

    PubMed

    Shin, Young Gyung; Yoo, Jaeheung; Kwon, Hyeong Ju; Hong, Jung Hwa; Lee, Hye Sun; Yoon, Jung Hyun; Kim, Eun-Kyung; Moon, Hee Jung; Han, Kyunghwa; Kwak, Jin Young

    2016-08-01

    The objective of the study was to evaluate whether texture analysis using histogram and gray level co-occurrence matrix (GLCM) parameters can help clinicians diagnose lymphocytic thyroiditis (LT) and differentiate LT according to pathologic grade. The background thyroid pathology of 441 patients was classified into no evidence of LT, chronic LT (CLT), and Hashimoto's thyroiditis (HT). Histogram and GLCM parameters were extracted from the regions of interest on ultrasound. The diagnostic performances of the parameters for diagnosing and differentiating LT were calculated. Of the histogram and GLCM parameters, the mean on histogram had the highest Az (0.63) and VUS (0.303). As the degrees of LT increased, the mean decreased and the standard deviation and entropy increased. The mean on histogram from gray-scale ultrasound showed the best diagnostic performance as a single parameter in differentiating LT according to pathologic grade as well as in diagnosing LT. Copyright © 2016 Elsevier Ltd. All rights reserved.
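A GLCM for one pixel offset, together with two of the texture parameters used in the study (entropy and contrast), can be computed as follows; the quantization level and offset are illustrative choices, not the study's settings:

```python
import numpy as np

def glcm_features(img, levels=16, dx=1, dy=0):
    """Gray level co-occurrence matrix (GLCM) for one pixel offset
    (dx, dy), after quantizing a uint8 image to `levels` gray levels.
    Returns the normalized GLCM plus its entropy and contrast."""
    q = (img.astype(float) / 256 * levels).astype(int)
    glcm = np.zeros((levels, levels))
    a = q[:q.shape[0] - dy, :q.shape[1] - dx]   # reference pixels
    b = q[dy:, dx:]                             # offset neighbors
    np.add.at(glcm, (a.ravel(), b.ravel()), 1)  # accumulate co-occurrences
    p = glcm / glcm.sum()
    i, j = np.indices(p.shape)
    entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))
    contrast = np.sum(p * (i - j) ** 2)
    return p, entropy, contrast
```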

  14. Whole-Lesion Histogram Analysis of Apparent Diffusion Coefficient for the Assessment of Cervical Cancer.

    PubMed

    Guan, Yue; Shi, Hua; Chen, Ying; Liu, Song; Li, Weifeng; Jiang, Zhuoran; Wang, Huanhuan; He, Jian; Zhou, Zhengyang; Ge, Yun

    2016-01-01

    The aim of this study was to explore the application of whole-lesion histogram analysis of apparent diffusion coefficient (ADC) values in cervical cancer. A total of 54 women (mean age, 53 years) with cervical cancers prospectively underwent 3-T diffusion-weighted imaging with b values of 0 and 800 s/mm². Whole-lesion histogram analysis of ADC values was performed. A paired-sample t test was used to compare differences in ADC histogram parameters between cervical cancers and normal cervical tissues. Receiver operating characteristic curves were constructed to identify the optimal threshold of each parameter. All histogram parameters in this study, including ADCmean, ADCmin, ADC10%-ADC90%, mode, skewness, and kurtosis, were significantly lower for cervical cancers than for normal cervical tissues (all P < 0.0001). ADC90% had the largest area under the receiver operating characteristic curve, 0.996. Whole-lesion histogram analysis of ADC maps is useful in the assessment of cervical cancer.
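
    Most of the whole-lesion parameters reported above (ADCmean, ADCmin, percentiles, skewness, kurtosis) are simple descriptive statistics over the lesion's voxel ADC values. A minimal sketch, assuming a flat list of voxel values and population (biased) moment estimators; the dictionary keys are our own naming, not the paper's:

```python
import math

def percentile(sorted_vals, p):
    """Linearly interpolated p-th percentile of pre-sorted values."""
    k = (len(sorted_vals) - 1) * p / 100.0
    lo, hi = int(math.floor(k)), int(math.ceil(k))
    if lo == hi:
        return sorted_vals[lo]
    return sorted_vals[lo] + (sorted_vals[hi] - sorted_vals[lo]) * (k - lo)

def histogram_parameters(adc_values):
    """Whole-lesion histogram descriptors from voxel ADC values."""
    n = len(adc_values)
    mean = sum(adc_values) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in adc_values) / n)
    skew = sum(((v - mean) / sd) ** 3 for v in adc_values) / n
    kurt = sum(((v - mean) / sd) ** 4 for v in adc_values) / n - 3  # excess kurtosis
    s = sorted(adc_values)
    return {"mean": mean, "min": s[0], "ADC10": percentile(s, 10),
            "ADC90": percentile(s, 90), "skewness": skew, "kurtosis": kurt}

# toy voxel ADC values (units of 10^-3 mm^2/s)
print(histogram_parameters([1.0, 1.2, 1.4]))
```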

  15. Time-cumulated visible and infrared radiance histograms used as descriptors of surface and cloud variations

    NASA Technical Reports Server (NTRS)

    Seze, Genevieve; Rossow, William B.

    1991-01-01

    The spatial and temporal stability of the distributions of satellite-measured visible and infrared radiances, caused by variations in clouds and surfaces, is investigated using two-dimensional and one-dimensional histograms and time-composite images. Similar analysis of the histograms of the original and time-composite images separates the contributions of spatial and temporal variations to the total variation. The variability of both surfaces and clouds is found to be larger at scales much larger than the minimum resolved by satellite imagery. This study shows that the shapes of these histograms are distinctive characteristics of the different climate regimes and that particular attributes of these histograms can be related to several general, though not universal, properties of cloud and surface variations at regional and synoptic scales. There are also significant exceptions to these relationships in particular climate regimes. The characteristics of these radiance histograms provide a stable, well defined descriptor of cloud and surface properties.

  16. Adaptive Pre-FFT Equalizer with High-Precision Channel Estimator for ISI Channels

    NASA Astrophysics Data System (ADS)

    Yoshida, Makoto

    We present an attractive approach for OFDM transmission using an adaptive pre-FFT equalizer, which can select an ICI reduction mode according to channel conditions, and a degenerated-inverse-matrix-based channel estimator (DIME), which uses a cyclic sinc-function matrix uniquely determined by the transmitted subcarriers. In addition to simulation results, the proposed system with an adaptive pre-FFT equalizer and DIME has been laboratory tested using a software defined radio (SDR)-based test bed. The simulation and experimental results demonstrate that the system, at a rate of more than 100 Mbps, can provide a bit error rate of less than 10⁻³ over a fast multi-path fading channel with a moving velocity of more than 200 km/h and a delay spread of 1.9 µs (a maximum delay path of 7.3 µs) in the 5-GHz band.

  17. Adaptive Filter Design Using Type-2 Fuzzy Cerebellar Model Articulation Controller.

    PubMed

    Lin, Chih-Min; Yang, Ming-Shu; Chao, Fei; Hu, Xiao-Min; Zhang, Jun

    2016-10-01

    This paper proposes an efficient network and applies it as an adaptive filter for signal processing problems. An adaptive filter is proposed using a novel interval type-2 fuzzy cerebellar model articulation controller (T2FCMAC). The T2FCMAC realizes an interval type-2 fuzzy logic system based on the structure of the CMAC. Owing to their better ability to handle uncertainties, type-2 fuzzy sets can solve some complicated problems more effectively than type-1 fuzzy sets. In addition, a Lyapunov function is utilized to derive the conditions on the adaptive learning rates, so that the convergence of the filtering error can be guaranteed. In order to demonstrate the performance of the proposed adaptive T2FCMAC filter, it is tested in signal processing applications, including a nonlinear channel equalization system, a time-varying channel equalization system, and an adaptive noise cancellation system. The advantages of the proposed filter over other adaptive filters are verified through simulations.

  18. Apparent diffusion coefficient histogram shape analysis for monitoring early response in patients with advanced cervical cancers undergoing concurrent chemo-radiotherapy.

    PubMed

    Meng, Jie; Zhu, Lijing; Zhu, Li; Wang, Huanhuan; Liu, Song; Yan, Jing; Liu, Baorui; Guan, Yue; Ge, Yun; He, Jian; Zhou, Zhengyang; Yang, Xiaofeng

    2016-10-22

    To explore the role of apparent diffusion coefficient (ADC) histogram shape related parameters in early assessment of treatment response during the concurrent chemo-radiotherapy (CCRT) course of advanced cervical cancers. This prospective study was approved by the local ethics committee and informed consent was obtained from all patients. Thirty-two patients with advanced cervical squamous cell carcinomas underwent diffusion weighted magnetic resonance imaging (b values, 0 and 800 s/mm²) before CCRT, at the end of the 2nd and 4th weeks during CCRT, and immediately after CCRT completion. Whole-lesion ADC histogram analysis generated several histogram shape related parameters including skewness, kurtosis, s-sDav, width, standard deviation, as well as first-order and second-order entropies. The averaged ADC histograms of the 32 patients were generated to visually observe dynamic changes of the histogram shape following CCRT. All parameters except width and standard deviation showed significant changes during CCRT (all P < 0.05), and their variation trends fell into four different patterns. Skewness and kurtosis both showed high early decline rates (43.10 %, 48.29 %) at the end of the 2nd week of CCRT. All entropies decreased significantly from 2 weeks after CCRT initiation onward. The shape of the averaged ADC histogram also changed markedly following CCRT. ADC histogram shape analysis holds potential for monitoring early tumor response in patients with advanced cervical cancers undergoing CCRT.

  19. [Clinical application of MRI histogram in evaluation of muscle fatty infiltration].

    PubMed

    Zheng, Y M; Du, J; Li, W Z; Wang, Z X; Zhang, W; Xiao, J X; Yuan, Y

    2016-10-18

    To describe a method based on analysis of the histogram of intensity values produced from magnetic resonance imaging (MRI) for quantifying the degree of fatty infiltration. The study included 25 patients with dystrophinopathy. All subjects underwent muscle MRI at the thigh level. The histogram M values of 250 muscles, adjusted for subcutaneous fat and representing the degree of fatty infiltration, were compared with expert visual reading using the modified Mercuri scale. There was a significant positive correlation between the histogram M values and the scores of visual reading (r=0.854, P<0.001). The distinct pattern of muscle involvement detected by histogram M values in the patients with dystrophinopathy in our study was similar to that of visual reading and to results in the literature. The histogram M values had stronger correlations with the clinical data than the scores of visual reading: with age, r=0.730 (P<0.001) and r=0.753 (P<0.001), and with strength of the knee extensor, r=-0.468 (P=0.024) and r=-0.460 (P=0.027), respectively. Meanwhile, histogram M value analysis had better repeatability than visual reading, with intraclass correlation coefficients of 0.998 (95% CI: 0.997-0.998, P<0.001) and 0.958 (95% CI: 0.946-0.967, P<0.001), respectively. Histogram M value analysis of MRI, with the advantages of repeatability and objectivity, can be used to evaluate the degree of muscle fatty infiltration.

  20. Dissimilarity representations in lung parenchyma classification

    NASA Astrophysics Data System (ADS)

    Sørensen, Lauge; de Bruijne, Marleen

    2009-02-01

    A good problem representation is important for a pattern recognition system to be successful. The traditional approach to statistical pattern recognition is feature representation. More specifically, objects are represented by a number of features in a feature vector space, and classifiers are built in this representation. This is also the general trend in lung parenchyma classification in computed tomography (CT) images, where the features often are measures on feature histograms. Instead, we propose to build normal density based classifiers in dissimilarity representations for lung parenchyma classification. This allows the classifiers to work on dissimilarities between objects, which might be a more natural way of representing lung parenchyma. In this context, dissimilarity is defined between CT regions of interest (ROIs). ROIs are represented by their CT attenuation histogram, and ROI dissimilarity is defined as a histogram dissimilarity measure between the attenuation histograms. In this setting, the full histograms are utilized according to the chosen histogram dissimilarity measure. We apply this idea to classification of different emphysema patterns as well as normal, healthy tissue. Two dissimilarity representation approaches as well as different histogram dissimilarity measures are considered. The approaches are evaluated on a set of 168 CT ROIs using normal density based classifiers, all showing good performance. Compared to using histogram dissimilarity directly as the distance in a k nearest neighbor classifier, which achieves a classification accuracy of 92.9%, the best dissimilarity representation based classifier is significantly better with a classification accuracy of 97.0% (p = 0.046).
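
    The attenuation-histogram dissimilarities this record relies on can be illustrated with two common choices. This is a generic sketch assuming equal-width bins and normalized histograms; it does not reproduce the paper's specific measures:

```python
def l1_distance(h1, h2):
    """Sum of absolute bin differences (city-block distance)."""
    return sum(abs(a - b) for a, b in zip(h1, h2))

def cumulative_distance(h1, h2):
    """Earth mover's distance for 1-D histograms with unit bin spacing:
    the L1 distance between the cumulative histograms."""
    d, c1, c2 = 0.0, 0.0, 0.0
    for a, b in zip(h1, h2):
        c1 += a
        c2 += b
        d += abs(c1 - c2)
    return d

h_a = [0.5, 0.5, 0.0]
h_b = [0.0, 0.5, 0.5]
print(l1_distance(h_a, h_b), cumulative_distance(h_a, h_b))
```

Either distance can be plugged directly into a k-nearest-neighbor rule; the cumulative variant is sensitive to how far mass has shifted along the bin axis, not just which bins differ.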

  1. The Histogram-Area Connection

    ERIC Educational Resources Information Center

    Gratzer, William; Carpenter, James E.

    2008-01-01

    This article demonstrates an alternative approach to the construction of histograms--one based on the notion of using area to represent relative density in intervals of unequal length. The resulting histograms illustrate the connection between the area of the rectangles associated with particular outcomes and the relative frequency (probability)…
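
    The area-based construction reads directly as a formula: with unequal bin widths, the bar height is relative frequency divided by bin width, so each bar's area equals the bin's relative frequency and the total area is 1. A minimal sketch with toy counts and edges:

```python
def density_heights(counts, edges):
    """Bar heights for a density histogram with unequal bin widths:
    height = relative frequency / bin width, so each bar's area equals
    its bin's relative frequency and the areas sum to 1."""
    total = sum(counts)
    return [c / total / (edges[i + 1] - edges[i]) for i, c in enumerate(counts)]

# bins [0, 10) and [10, 50): same count, but the wide bin gets a lower bar
heights = density_heights([5, 5], [0, 10, 50])
areas = [h * w for h, w in zip(heights, (10, 40))]
print(heights, sum(areas))
```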

  2. Investigating Student Understanding of Histograms

    ERIC Educational Resources Information Center

    Kaplan, Jennifer J.; Gabrosek, John G.; Curtiss, Phyllis; Malone, Chris

    2014-01-01

    Histograms are adept at revealing the distribution of data values, especially the shape of the distribution and any outlier values. They are included in introductory statistics texts, research methods texts, and in the popular press, yet students often have difficulty interpreting the information conveyed by a histogram. This research identifies…

  3. Low complexity adaptive equalizers for underwater acoustic communications

    NASA Astrophysics Data System (ADS)

    Soflaei, Masoumeh; Azmi, Paeiz

    2014-08-01

    Interference due to scattering from the surface and reflection from the bottom is one of the most important problems for reliable communications in shallow water channels. One of the best suggested ways to solve this problem is to use adaptive equalizers. Convergence rate and misadjustment error of the adaptive algorithms play important roles in adaptive equalizer performance. In this paper, the affine projection algorithm (APA), selective regressor APA (SR-APA), the family of selective partial update (SPU) algorithms, the family of set-membership (SM) algorithms, and the selective partial update selective regressor APA (SPU-SR-APA) are compared with conventional algorithms such as least mean square (LMS) in underwater acoustic communications. We apply experimental data from the Strait of Hormuz to demonstrate the efficiency of the proposed methods over a shallow water channel. We observe that the steady-state mean square error (MSE) values of the SR-APA, SPU-APA, SPU-normalized least mean square (SPU-NLMS), SPU-SR-APA, SM-APA and SM-NLMS algorithms decrease in comparison with the LMS algorithm. These algorithms also have better convergence rates than the LMS-type algorithm.
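
    The LMS baseline these algorithms are compared against can be sketched in a few lines. This is a generic training-mode illustration with a hypothetical one-tap ISI channel, not the paper's underwater channel or its SPU/SM variants:

```python
import random

def lms_equalize(received, desired, taps=5, mu=0.01):
    """Train an FIR equalizer with the least-mean-square update
    w <- w + mu * e * x, where e = desired - w.x is the output error."""
    w = [0.0] * taps
    buf = [0.0] * taps
    errors = []
    for x, d in zip(received, desired):
        buf = [x] + buf[:-1]              # shift the new sample into the delay line
        y = sum(wi * xi for wi, xi in zip(w, buf))
        e = d - y
        w = [wi + mu * e * xi for wi, xi in zip(w, buf)]
        errors.append(e * e)
    return w, errors

random.seed(0)
symbols = [random.choice((-1.0, 1.0)) for _ in range(4000)]
# toy ISI channel: each received sample leaks 40% of the previous symbol
received = [s + 0.4 * p for s, p in zip(symbols, [0.0] + symbols[:-1])]
w, errors = lms_equalize(received, symbols, taps=5, mu=0.02)
print(sum(errors[-200:]) / 200)  # steady-state MSE, should be small
```

The steady-state MSE and the rate at which `errors` decays are exactly the two performance measures (misadjustment and convergence rate) the record discusses.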

  4. Extracting an evaluative feedback from the brain for adaptation of motor neuroprosthetic decoders.

    PubMed

    Mahmoudi, Babak; Principe, Jose C; Sanchez, Justin C

    2010-01-01

    The design of Brain-Machine Interface (BMI) neural decoders that have robust performance in changing environments encountered in daily life activity is a challenging problem. One solution to this problem is the design of neural decoders that are able to assist and adapt to the user by participating in their perception-action-reward cycle (PARC). Using inspiration both from artificial intelligence and neurobiology reinforcement learning theories, we have designed a novel decoding architecture that enables a symbiotic relationship between the user and an Intelligent Assistant (IA). By tapping into the motor and reward centers in the brain, the IA adapts the process of decoding neural motor commands into prosthetic actions based on the user's goals. The focus of this paper is on extraction of goal information directly from the brain and making it accessible to the IA as an evaluative feedback for adaptation. We have recorded the neural activity of the Nucleus Accumbens in behaving rats during a reaching task. The peri-event time histograms demonstrate a rich representation of the reward prediction in this subcortical structure that can be modeled on a single trial basis as a scalar evaluative feedback with high precision.
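
    The peri-event time histogram mentioned above is built by binning spike times relative to each event and averaging across events. A minimal sketch with hypothetical spike and event times and uniform bins:

```python
def peri_event_histogram(spike_times, event_times, window=(-0.5, 0.5), bin_width=0.1):
    """Count spikes in time bins aligned to each event, then normalize by
    the number of events and the bin width to get a rate (spikes/s)."""
    start, end = window
    n_bins = int(round((end - start) / bin_width))
    counts = [0] * n_bins
    for ev in event_times:
        for t in spike_times:
            rel = t - ev
            if start <= rel < end:
                counts[int((rel - start) / bin_width)] += 1
    n_ev = len(event_times)
    return [c / (n_ev * bin_width) for c in counts]

# toy data: spikes clustered just after each event at t = 1.0 and t = 2.0
rates = peri_event_histogram([1.05, 1.12, 2.06, 2.11], [1.0, 2.0])
print(rates)
```

The elevated bins immediately after time zero are the kind of event-locked response the record reads out as a reward-prediction signal.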

  5. Variations of attractors and wavelet spectra of the immunofluorescence distributions for women in the pregnant period

    NASA Astrophysics Data System (ADS)

    Galich, Nikolay E.

    2008-07-01

    This communication describes the treatment of immunology data. New nonlinear methods for the statistical analysis of immunofluorescence in peripheral blood neutrophils have been developed, based on the respiratory burst reaction of DNA fluorescence in neutrophil cell nuclei due to oxidative activity. The histograms of photon count statistics of the radiant neutrophil populations in flow cytometry experiments are considered, and the distributions of fluorescence flash frequency as functions of fluorescence intensity are analyzed. Statistical peculiarities of the histogram sets for women in the pregnant period allow all histograms to be divided into three classes. The classification is based on three different types of smoothed, long-range scale-averaged immunofluorescence distributions, their bifurcations and wavelet spectra. Heterogeneity of the long-range scale immunofluorescence distributions and features of the wavelet spectra likewise divide the histograms into three groups: the first belongs to healthy donors, and the other two belong to donors with autoimmune and inflammatory diseases, some of which are not diagnosed by standard biochemical methods. Medical standards and statistical data of the immunofluorescence histograms for identifying health and illness are interconnected, and peculiarities of immunofluorescence for women in the pregnant period are classified. Health or illness criteria are thus connected with statistical features of the immunofluorescence histograms; neutrophil population fluorescence provides a sensitive, clear indicator of health status.

  6. Complexity of possibly gapped histogram and analysis of histogram.

    PubMed

    Fushing, Hsieh; Roy, Tania

    2018-02-01

    We demonstrate that gaps and distributional patterns embedded within real-valued measurements are inseparable biological and mechanistic information contents of the system. Such patterns are discovered through data-driven possibly gapped histogram, which further leads to the geometry-based analysis of histogram (ANOHT). Constructing a possibly gapped histogram is a complex problem of statistical mechanics due to the ensemble of candidate histograms being captured by a two-layer Ising model. This construction is also a distinctive problem of Information Theory from the perspective of data compression via uniformity. By defining a Hamiltonian (or energy) as a sum of total coding lengths of boundaries and total decoding errors within bins, this issue of computing the minimum energy macroscopic states is surprisingly resolved by applying the hierarchical clustering algorithm. Thus, a possibly gapped histogram corresponds to a macro-state. And then the first phase of ANOHT is developed for simultaneous comparison of multiple treatments, while the second phase of ANOHT is developed based on classical empirical process theory for a tree-geometry that can check the authenticity of branches of the treatment tree. The well-known Iris data are used to illustrate our technical developments. Also, a large baseball pitching dataset and a heavily right-censored divorce data are analysed to showcase the existential gaps and utilities of ANOHT.
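
    The spirit of a possibly gapped histogram, letting large empty stretches become explicit gaps rather than forcing contiguous bins, can be sketched with a simple spacing rule. Note this is a heuristic simplification of ours, not the authors' minimum-energy, hierarchical-clustering construction:

```python
def gapped_bins(values, gap_factor=3.0):
    """Split sorted data wherever the spacing exceeds gap_factor times the
    median spacing; each run of values becomes its own bin cluster, and the
    stretches between runs are reported implicitly as gaps."""
    s = sorted(values)
    gaps = [b - a for a, b in zip(s, s[1:])]
    med = sorted(gaps)[len(gaps) // 2]
    clusters, current = [], [s[0]]
    for g, v in zip(gaps, s[1:]):
        if g > gap_factor * med:
            clusters.append(current)
            current = []
        current.append(v)
    clusters.append(current)
    return [(c[0], c[-1], len(c)) for c in clusters]  # (lo, hi, count) per bin

# two well-separated runs of measurements produce two bins with a gap between
print(gapped_bins([1.0, 1.1, 1.2, 1.3, 5.0, 5.1, 5.2]))
```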

  7. Histogram-based quantitative evaluation of endobronchial ultrasonography images of peripheral pulmonary lesion.

    PubMed

    Morikawa, Kei; Kurimoto, Noriaki; Inoue, Takeo; Mineshita, Masamichi; Miyazawa, Teruomi

    2015-01-01

    Endobronchial ultrasonography using a guide sheath (EBUS-GS) is an increasingly common bronchoscopic technique, but currently no methods have been established to quantitatively evaluate EBUS images of peripheral pulmonary lesions. The purpose of this study was to evaluate whether histogram data collected from EBUS-GS images can contribute to the diagnosis of lung cancer. Histogram-based analyses focusing on the brightness of EBUS images were retrospectively conducted; 60 patients (38 with lung cancer; 22 with inflammatory diseases) with clear EBUS images were included. For each patient, a 400-pixel region of interest, typically located at a 3- to 5-mm radius from the probe, was selected from EBUS images recorded during bronchoscopy. Histogram height, width, height/width ratio, standard deviation, kurtosis and skewness were investigated as diagnostic indicators. Median histogram height, width, height/width ratio and standard deviation were significantly different between lung cancer and benign lesions (all p < 0.01). With a cutoff value for standard deviation of 10.5, lung cancer could be diagnosed with an accuracy of 81.7%. The other characteristics investigated were inferior to histogram standard deviation. Histogram standard deviation thus appears to be the most useful characteristic for diagnosing lung cancer using EBUS images. © 2015 S. Karger AG, Basel.

  8. Complexity of possibly gapped histogram and analysis of histogram

    PubMed Central

    Roy, Tania

    2018-01-01

    We demonstrate that gaps and distributional patterns embedded within real-valued measurements are inseparable biological and mechanistic information contents of the system. Such patterns are discovered through data-driven possibly gapped histogram, which further leads to the geometry-based analysis of histogram (ANOHT). Constructing a possibly gapped histogram is a complex problem of statistical mechanics due to the ensemble of candidate histograms being captured by a two-layer Ising model. This construction is also a distinctive problem of Information Theory from the perspective of data compression via uniformity. By defining a Hamiltonian (or energy) as a sum of total coding lengths of boundaries and total decoding errors within bins, this issue of computing the minimum energy macroscopic states is surprisingly resolved by applying the hierarchical clustering algorithm. Thus, a possibly gapped histogram corresponds to a macro-state. And then the first phase of ANOHT is developed for simultaneous comparison of multiple treatments, while the second phase of ANOHT is developed based on classical empirical process theory for a tree-geometry that can check the authenticity of branches of the treatment tree. The well-known Iris data are used to illustrate our technical developments. Also, a large baseball pitching dataset and a heavily right-censored divorce data are analysed to showcase the existential gaps and utilities of ANOHT. PMID:29515829

  9. Complexity of possibly gapped histogram and analysis of histogram

    NASA Astrophysics Data System (ADS)

    Fushing, Hsieh; Roy, Tania

    2018-02-01

    We demonstrate that gaps and distributional patterns embedded within real-valued measurements are inseparable biological and mechanistic information contents of the system. Such patterns are discovered through data-driven possibly gapped histogram, which further leads to the geometry-based analysis of histogram (ANOHT). Constructing a possibly gapped histogram is a complex problem of statistical mechanics due to the ensemble of candidate histograms being captured by a two-layer Ising model. This construction is also a distinctive problem of Information Theory from the perspective of data compression via uniformity. By defining a Hamiltonian (or energy) as a sum of total coding lengths of boundaries and total decoding errors within bins, this issue of computing the minimum energy macroscopic states is surprisingly resolved by applying the hierarchical clustering algorithm. Thus, a possibly gapped histogram corresponds to a macro-state. And then the first phase of ANOHT is developed for simultaneous comparison of multiple treatments, while the second phase of ANOHT is developed based on classical empirical process theory for a tree-geometry that can check the authenticity of branches of the treatment tree. The well-known Iris data are used to illustrate our technical developments. Also, a large baseball pitching dataset and a heavily right-censored divorce data are analysed to showcase the existential gaps and utilities of ANOHT.

  10. Comparison of adverse effects of proton and X-ray chemoradiotherapy for esophageal cancer using an adaptive dose–volume histogram analysis

    PubMed Central

    Makishima, Hirokazu; Ishikawa, Hitoshi; Terunuma, Toshiyuki; Hashimoto, Takayuki; Yamanashi, Koichi; Sekiguchi, Takao; Mizumoto, Masashi; Okumura, Toshiyuki; Sakae, Takeji; Sakurai, Hideyuki

    2015-01-01

    Cardiopulmonary late toxicity is of concern in concurrent chemoradiotherapy (CCRT) for esophageal cancer. The aim of this study was to examine the benefit of proton beam therapy (PBT) using clinical data and an adaptive dose–volume histogram (DVH) analysis. The subjects were 44 patients with esophageal cancer who underwent definitive CCRT using X-rays (n = 19) or protons (n = 25). Experimental recalculation using protons was performed for each patient actually treated with X-rays, and vice versa. Target coverage and dose constraints of normal tissues were conserved. Lung V5–V20, mean lung dose (MLD), and heart V30–V50 were compared between the experimental and actual treatment plans as risk organ doses. Potential toxicity was thereby estimated with protons for patients actually treated with X-rays, and vice versa. Pulmonary events of Grade ≥2 occurred in 8/44 cases (18%), and cardiac events were seen in 11 cases (25%). Risk organ doses in patients with events of Grade ≥2 were significantly higher than in those with events of Grade ≤1. Risk organ doses were lower in the proton plans than in the X-ray plans. All patients with toxicity who were actually treated with X-rays (n = 13) had reduced predicted lung and heart doses with protons, while all patients treated with protons (n = 24) who had toxicity of Grade ≤1 had worsened predicted toxicity with X-rays. Analysis of normal tissue complication probability showed a potential reduction in toxicity by using proton beams. Irradiation dose, volume and adverse effects on the heart and lung can thus be reduced using protons, making PBT a promising treatment modality for the management of esophageal cancer. PMID:25755255
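
    The DVH metrics compared in this record (Vx and mean dose) reduce to counting and averaging over the voxel doses inside an organ mask. A minimal sketch, assuming uniform voxel volumes and toy dose values:

```python
def v_x(doses, threshold):
    """Percentage of organ volume receiving at least `threshold` Gy
    (uniform voxel volumes assumed, so volume fraction = voxel fraction)."""
    return 100.0 * sum(d >= threshold for d in doses) / len(doses)

def mean_dose(doses):
    """Mean organ dose over all voxels in the structure mask."""
    return sum(doses) / len(doses)

# toy lung voxel doses in Gy
lung = [2.0, 6.0, 12.0, 22.0, 35.0]
print(v_x(lung, 20), mean_dose(lung))  # V20 = 40.0 %, MLD = 15.4 Gy
```

Comparing such values between a proton plan and an X-ray plan for the same structure is exactly the plan-versus-plan comparison the record performs.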

  11. Interactive Dose Shaping - efficient strategies for CPU-based real-time treatment planning

    NASA Astrophysics Data System (ADS)

    Ziegenhein, P.; Kamerling, C. P.; Oelfke, U.

    2014-03-01

    Conventional intensity modulated radiation therapy (IMRT) treatment planning is based on the traditional concept of iterative optimization using an objective function specified by dose volume histogram constraints for pre-segmented VOIs. This indirect approach suffers from unavoidable shortcomings: i) The control of local dose features is limited to segmented VOIs. ii) Any objective function is a mathematical measure of the plan quality, i.e., is not able to define the clinically optimal treatment plan. iii) Adapting an existing plan to changed patient anatomy as detected by IGRT procedures is difficult. To overcome these shortcomings, we introduce the method of Interactive Dose Shaping (IDS) as a new paradigm for IMRT treatment planning. IDS allows for a direct and interactive manipulation of local dose features in real-time. The key element driving the IDS process is a two-step Dose Modification and Recovery (DMR) strategy: A local dose modification is initiated by the user which translates into modified fluence patterns. This also affects existing desired dose features elsewhere which is compensated by a heuristic recovery process. The IDS paradigm was implemented together with a CPU-based ultra-fast dose calculation and a 3D GUI for dose manipulation and visualization. A local dose feature can be implemented via the DMR strategy within 1-2 seconds. By imposing a series of local dose features, equal plan qualities could be achieved compared to conventional planning for prostate and head and neck cases within 1-2 minutes. The idea of Interactive Dose Shaping for treatment planning has been introduced and first applications of this concept have been realized.

  12. Identification and characterization of neutrophil extracellular trap shapes in flow cytometry

    NASA Astrophysics Data System (ADS)

    Ginley, Brandon; Emmons, Tiffany; Sasankan, Prabhu; Urban, Constantin; Segal, Brahm H.; Sarder, Pinaki

    2017-03-01

    Neutrophil extracellular trap (NET) formation is an alternate immunologic weapon used mainly by neutrophils. Chromatin backbones fused with granule-derived proteins are shot like projectiles onto foreign invaders. This mechanism is thought to be highly anti-microbial, to aid in preventing bacterial dissemination, to break down structures several sizes larger than neutrophils themselves, and to have several more uses yet unknown. NETs have been implicated in a wide array of systemic host immune defenses, including sepsis, autoimmune diseases, and cancer. Existing methods used to visually quantify NETotic versus non-NETotic shapes are extremely time-consuming and subject to user bias. These limitations are obstacles to developing NETs as prognostic biomarkers and therapeutic targets. We propose an automated pipeline for quantitatively detecting neutrophil and NET shapes captured with a flow cytometry imaging system. Our method uses contrast limited adaptive histogram equalization to improve signal intensity in dimly illuminated NETs. From the contrast-improved image, fixed-value thresholding is applied to convert the image to binary. Feature extraction is performed on the resulting binary image by calculating region properties of the foreground structures. Classification of the resulting features is performed using a Support Vector Machine. Our method distinguishes NETs from neutrophils without traps at 0.97/0.96 sensitivity/specificity on n = 387 images, and is 1500X faster than manual classification per sample. Our method can be extended to rapidly analyze whole-slide immunofluorescence tissue images for NET classification, and has potential to streamline the quantification of NETs for patients with diseases associated with cancer and autoimmunity.
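
    Contrast limited adaptive histogram equalization, the first stage of this pipeline (and of several records that follow), equalizes local histograms after clipping them, so that noise in nearly flat regions is not over-amplified. A minimal single-tile sketch of ours, omitting the tile grid and the bilinear interpolation between tiles that full CLAHE uses:

```python
def clahe_tile(tile, clip_limit=40, levels=256):
    """Histogram-equalize one tile of 8-bit pixels using a clipped histogram:
    counts above clip_limit are truncated and the excess is redistributed
    uniformly across all levels, limiting contrast amplification (the 'CL'
    in CLAHE)."""
    hist = [0] * levels
    for p in tile:
        hist[p] += 1
    excess = sum(h - clip_limit for h in hist if h > clip_limit)
    hist = [min(h, clip_limit) for h in hist]
    bonus = excess // levels                  # uniform redistribution of excess
    hist = [h + bonus for h in hist]
    # build the equalization lookup from the cumulative clipped histogram
    total = sum(hist)
    cdf, run = [], 0
    for h in hist:
        run += h
        cdf.append(run)
    return [round((levels - 1) * cdf[p] / total) for p in tile]

# a dim, low-contrast tile gets stretched toward the full 0-255 range
dim = [10, 10, 11, 12, 12, 13, 14, 15]
print(clahe_tile(dim, clip_limit=2))
```

In practice one would use a library implementation (e.g. OpenCV's `createCLAHE` or scikit-image's `equalize_adapthist`) rather than this sketch.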

  13. Automated segmentation of geographic atrophy using deep convolutional neural networks

    NASA Astrophysics Data System (ADS)

    Hu, Zhihong; Wang, Ziyuan; Sadda, SriniVas R.

    2018-02-01

    Geographic atrophy (GA) is an end-stage manifestation of advanced age-related macular degeneration (AMD), the leading cause of blindness and visual impairment in developed nations. Techniques to rapidly and precisely detect and quantify GA would appear to be of critical importance in advancing the understanding of its pathogenesis. In this study, we develop an automated supervised classification system using deep convolutional neural networks (CNNs) for segmenting GA in fundus autofluorescence (FAF) images. More specifically, to enhance the contrast of GA relative to the background, we apply contrast limited adaptive histogram equalization. Blood vessels may cause GA segmentation errors because their intensity levels are similar to those of GA. A tensor-voting technique is performed to identify the blood vessels, and a vessel inpainting technique is applied to suppress the GA segmentation errors due to the blood vessels. To handle the large variation of GA lesion sizes, three deep CNNs with three different input image patch sizes are applied. Fifty randomly chosen FAF images were obtained from fifty subjects with GA. The algorithm-defined GA regions are compared with manual delineation by a certified grader. A two-fold cross-validation is applied to evaluate the algorithm performance. The mean segmentation accuracy, true positive rate (i.e. sensitivity), true negative rate (i.e. specificity), positive predictive value, false discovery rate, and overlap ratio between the algorithm- and manually-defined GA regions are 0.97 +/- 0.02, 0.89 +/- 0.08, 0.98 +/- 0.02, 0.87 +/- 0.12, 0.13 +/- 0.12, and 0.79 +/- 0.12 respectively, demonstrating a high level of agreement.

  14. Computer-aided diagnosis based on enhancement of degraded fundus photographs.

    PubMed

    Jin, Kai; Zhou, Mei; Wang, Shaoze; Lou, Lixia; Xu, Yufeng; Ye, Juan; Qian, Dahong

    2018-05-01

    Retinal imaging is an important and effective tool for detecting retinal diseases. However, degraded images caused by the aberrations of the eye can disguise lesions, so that a diseased eye can be mistakenly diagnosed as normal. In this work, we propose a new image enhancement method to improve the quality of degraded images. In this method, the image is converted from the input RGB colour space to LAB colour space and then each normalized component is enhanced using contrast-limited adaptive histogram equalization. Human visual system (HVS)-based fundus image quality assessment, combined with diagnosis by experts, is used to evaluate the enhancement. The study included 191 degraded-quality fundus photographs of 143 subjects with optic media opacity. Objective quality assessment of image enhancement (range: 0-1) indicated that our method improved colour retinal image quality from an average of 0.0773 (variance 0.0801) to an average of 0.3973 (variance 0.0756). Following enhancement, areas under the curve (AUC) were 0.996 for the glaucoma classifier, 0.989 for the diabetic retinopathy (DR) classifier, 0.975 for the age-related macular degeneration (AMD) classifier and 0.979 for the other retinal diseases classifier. This relatively simple method for enhancing degraded-quality fundus images achieves superior image enhancement, as demonstrated in a qualitative HVS-based image quality assessment. The enhancement may, therefore, be employed to assist ophthalmologists in more efficient screening of retinal diseases and in the development of computer-aided diagnosis. © 2017 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.

  15. An Improved Pathological Brain Detection System Based on Two-Dimensional PCA and Evolutionary Extreme Learning Machine.

    PubMed

    Nayak, Deepak Ranjan; Dash, Ratnakar; Majhi, Banshidhar

    2017-12-07

    Pathological brain detection has made notable strides in recent years, and as a consequence many pathological brain detection systems (PBDSs) have been proposed. However, the accuracy of these systems still needs significant improvement to meet the demands of real-world diagnostic situations. In this paper, an efficient PBDS based on MR images is proposed that markedly improves on recent results. The proposed system makes use of contrast-limited adaptive histogram equalization (CLAHE) to enhance the quality of the input MR images. Thereafter, a two-dimensional PCA (2DPCA) strategy is employed to extract features, and a PCA+LDA approach is subsequently used to generate a compact and discriminative feature set. Finally, a new learning algorithm called MDE-ELM is suggested that combines modified differential evolution (MDE) and the extreme learning machine (ELM) to classify MR images as pathological or healthy. The MDE is utilized to optimize the input weights and hidden biases of single-hidden-layer feed-forward neural networks (SLFNs), whereas an analytical method is used to determine the output weights. The proposed algorithm performs optimization based on both the root mean squared error (RMSE) and the norm of the output weights of the SLFNs. The suggested scheme is benchmarked on three standard datasets and the results are compared against other competent schemes. The experimental outcomes show that the proposed scheme offers superior results compared to its counterparts. Further, the proposed MDE-ELM classifier obtains better accuracy with a more compact network architecture than conventional algorithms.
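
    The ELM half of such a scheme can be illustrated with a minimal sketch: random input weights and hidden biases (the quantities an evolutionary method like MDE would optimize) and output weights obtained analytically via the Moore-Penrose pseudoinverse. The network size and toy data below are arbitrary illustrations, not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_train(X, y, n_hidden=40):
    """Single-hidden-layer ELM: random hidden layer, analytical output weights."""
    W = rng.normal(size=(X.shape[1], n_hidden))   # input weights (what MDE would tune)
    b = rng.normal(size=n_hidden)                 # hidden biases (likewise)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))        # sigmoid hidden activations
    beta = np.linalg.pinv(H) @ y                  # output weights via pseudoinverse
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta

# Toy binary problem: classify points by the sign of x0 + x1
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)
W, b, beta = elm_train(X, y)
acc = np.mean((elm_predict(X, W, b, beta) > 0.5) == (y > 0.5))
```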

  16. Deep architecture neural network-based real-time image processing for image-guided radiotherapy.

    PubMed

    Mori, Shinichiro

    2017-08-01

    To develop real-time image processing for image-guided radiotherapy, we evaluated several neural network models for use with different imaging modalities, including X-ray fluoroscopic image denoising. Setup images of prostate cancer patients were acquired with two oblique X-ray fluoroscopic units. Two types of residual network were designed: a convolutional autoencoder (rCAE) and a convolutional neural network (rCNN). We varied the convolutional kernel size and number of convolutional layers for both networks, and the number of pooling and upsampling layers for the rCAE. The ground-truth image was generated by applying the contrast-limited adaptive histogram equalization (CLAHE) method. Network models were trained so that the output produced from the unprocessed input image approached the quality of the ground-truth image. For the image-denoising evaluation, noisy input images were used for training. More than 6 convolutional layers with convolutional kernels >5×5 improved image quality, but did not allow real-time imaging. After adding a pair of pooling and upsampling layers to both networks, rCAEs with >3 convolutions each and rCNNs with >12 convolutions achieved real-time processing at 30 frames per second (fps) with acceptable image quality. The suggested networks thus achieved real-time image processing for contrast enhancement and image denoising on a conventional modern personal computer. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  17. Comparison of algorithms for automatic border detection of melanoma in dermoscopy images

    NASA Astrophysics Data System (ADS)

    Srinivasa Raghavan, Sowmya; Kaur, Ravneet; LeAnder, Robert

    2016-09-01

    Melanoma is one of the most rapidly accelerating cancers in the world [1]. Early diagnosis is critical to an effective cure. We propose a new algorithm for more accurately detecting melanoma borders in dermoscopy images. Proper border detection requires eliminating occlusions like hair and bubbles by processing the original image. The preprocessing step involves transforming the RGB image to the CIE L*u*v* color space in order to decouple brightness from color information, then increasing contrast using contrast-limited adaptive histogram equalization (CLAHE), followed by artifact removal using a Gaussian filter. After preprocessing, the Chan-Vese technique segments the preprocessed images to create a lesion mask, which undergoes a morphological closing operation. Next, the largest central blob in the lesion is detected, after which the blob is dilated to generate an output mask. Finally, the automatically-generated mask is compared to the manual mask by calculating the XOR error [3]. Our border detection algorithm was developed using training and test sets of 30 and 20 images, respectively. This detection method was compared to the SRM method [4] by calculating the average XOR error for each of the two algorithms. The average error for the test images was 0.10 using the new algorithm and 0.99 using the SRM method. Comparing the average error values produced by the two algorithms, the average XOR error for our technique is clearly lower than for the SRM method, implying that the new algorithm detects melanoma borders more accurately than the SRM algorithm.
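
    The XOR error used for evaluation can be sketched as follows. Since reference [3] is not reproduced here, the exact normalization is an assumption; the common convention of dividing the disagreeing pixels by the area of the manual ground-truth mask is used.

```python
import numpy as np

def xor_error(auto_mask, manual_mask):
    """XOR border-detection error: pixels where the automatic and manual
    masks disagree, divided by the area of the manual (ground-truth) mask.
    0 means perfect agreement; larger values mean worse borders."""
    auto = np.asarray(auto_mask, dtype=bool)
    manual = np.asarray(manual_mask, dtype=bool)
    return np.logical_xor(auto, manual).sum() / manual.sum()

# Toy example: a 6x6 manual lesion (36 px) and an automatic mask that
# misses one 6-pixel column of it -> error = 6/36
manual = np.zeros((10, 10), dtype=bool)
manual[2:8, 2:8] = True
auto = np.zeros((10, 10), dtype=bool)
auto[2:8, 2:7] = True
err = xor_error(auto, manual)
```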

  18. Blind equalization with criterion with memory nonlinearity

    NASA Astrophysics Data System (ADS)

    Chen, Yuanjie; Nikias, Chrysostomos L.; Proakis, John G.

    1992-06-01

    Blind equalization methods usually combat the linear distortion caused by a nonideal channel via a transversal filter, without resorting to a priori known training sequences. We introduce a new criterion with memory nonlinearity (CRIMNO) for the blind equalization problem. The basic idea of this criterion is to augment the Godard [or constant modulus algorithm (CMA)] cost function with additional terms that penalize the autocorrelations of the equalizer outputs. Several variations of the CRIMNO algorithm are derived, depending on (1) whether empirical averages or single-point estimates are used to approximate the expectations, (2) whether the recent or the delayed equalizer coefficients are used, and (3) whether the weights applied to the autocorrelation terms are fixed or allowed to adapt. Simulation experiments show that the CRIMNO algorithm, and especially its adaptive-weight version, exhibits faster convergence than the Godard (or CMA) algorithm. Extensions of the CRIMNO criterion to accommodate correlated channel inputs are also presented.
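
    The Godard/CMA baseline that CRIMNO augments can be sketched as a blind transversal equalizer whose taps are adapted by the constant-modulus stochastic gradient. The channel, step size, and tap count below are arbitrary illustrative choices, and the CRIMNO autocorrelation penalty terms are omitted.

```python
import numpy as np

rng = np.random.default_rng(1)

# BPSK symbols have constant modulus 1, so the Godard radius R2 = 1
n = 20000
a = rng.choice([-1.0, 1.0], size=n)                # transmitted symbols
x = np.convolve(a, [1.0, 0.4], mode="full")[:n]    # mild channel ISI

ntaps, mu, R2 = 7, 1e-3, 1.0
taps = np.zeros(ntaps)
taps[3] = 1.0                                      # center-spike initialization
for k in range(ntaps, n):
    xv = x[k - ntaps:k][::-1]                      # regressor, newest sample first
    y = taps @ xv                                  # equalizer output
    taps -= mu * y * (y * y - R2) * xv             # CMA gradient step

# Residual constant-modulus error of the converged equalizer
y_out = np.convolve(x, taps)[:n]
resid = np.mean((np.abs(y_out[n // 2:]) - 1.0) ** 2)
```

For this channel the unequalized modulus error is 0.16 (|x| is always 0.6 or 1.4), so a converged equalizer should reduce it by well over half.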

  19. Three-dimensional volumetric gray-scale uterine cervix histogram prediction of days to delivery in full term pregnancy.

    PubMed

    Kim, Ji Youn; Kim, Hai-Joong; Hahn, Meong Hi; Jeon, Hye Jin; Cho, Geum Joon; Hong, Sun Chul; Oh, Min Jeong

    2013-09-01

    Our aim was to determine whether the volumetric gray-scale histogram difference between the anterior and posterior cervix can indicate the extent of cervical consistency. We collected data from 95 patients at 36-37 weeks of gestational age who were appropriate for vaginal delivery, from September 2010 to October 2011 in the Department of Obstetrics and Gynecology, Korea University Ansan Hospital. Patients were excluded who had any of the following: Cesarean section, labor induction, or premature rupture of membranes. Thirty-four patients were finally enrolled. The patients underwent evaluation of the cervix by Bishop score, cervical length, cervical volume, and three-dimensional (3D) cervical volumetric gray-scale histogram, and the interval in days from cervical evaluation to delivery was counted. We compared the 3D cervical volumetric gray-scale histogram, Bishop score, cervical length, and cervical volume with this interval. The gray-scale histogram difference between the anterior and posterior cervix was significantly correlated with days to delivery; its correlation coefficient (R) was 0.500 (P = 0.003). Cervical length was also significantly related to days to delivery, with a correlation coefficient (R) of 0.421 and a P-value of 0.013. However, the anterior lip histogram, posterior lip histogram, total cervical volume, and Bishop score were not associated with days to delivery (P > 0.05). The gray-scale histogram difference between the anterior and posterior cervix and the cervical length thus correlated with days to delivery; these measures may help predict cervical consistency.

  20. Construction and Evaluation of Histograms in Teacher Training

    ERIC Educational Resources Information Center

    Bruno, A.; Espinel, M. C.

    2009-01-01

    This article details the results of a written test designed to reveal how education majors construct and evaluate histograms and frequency polygons. Included is a description of the mistakes made by the students which shows how they tend to confuse histograms with bar diagrams, incorrectly assign data along the Cartesian axes and experience…

  1. Empirical Histograms in Item Response Theory with Ordinal Data

    ERIC Educational Resources Information Center

    Woods, Carol M.

    2007-01-01

    The purpose of this research is to describe, test, and illustrate a new implementation of the empirical histogram (EH) method for ordinal items. The EH method involves the estimation of item response model parameters simultaneously with the approximation of the distribution of the random latent variable (theta) as a histogram. Software for the EH…

  2. Symbol recognition via statistical integration of pixel-level constraint histograms: a new descriptor.

    PubMed

    Yang, Su

    2005-02-01

    A new descriptor for symbol recognition is proposed. 1) A histogram is constructed for every pixel to figure out the distribution of the constraints among the other pixels. 2) All the histograms are statistically integrated to form a feature vector with fixed dimension. The robustness and invariance were experimentally confirmed.

  3. Airborne gamma-ray spectrometer and magnetometer survey, Durango D, Colorado. Final report Volume II A. Detail area

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1983-01-01

    This volume contains geology of the Durango D detail area, radioactive mineral occurrences in Colorado, and geophysical data interpretation. Eight appendices provide: stacked profiles, geologic histograms, geochemical histograms, speed and altitude histograms, geologic statistical tables, geochemical statistical tables, magnetic and ancillary profiles, and test line data.

  4. Airborne gamma-ray spectrometer and magnetometer survey, Durango C, Colorado. Final report Volume II A. Detail area

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1983-01-01

    Geology of Durango C detail area, radioactive mineral occurrences in Colorado, and geophysical data interpretation are included in this report. Eight appendices provide: stacked profiles, geologic histograms, geochemical histograms, speed and altitude histograms, geologic statistical tables, magnetic and ancillary profiles, and test line data.

  5. Action recognition via cumulative histogram of multiple features

    NASA Astrophysics Data System (ADS)

    Yan, Xunshi; Luo, Yupin

    2011-01-01

    Spatial-temporal interest points (STIPs) are popular in human action recognition. However, they suffer from the difficulty of determining codebook size, and much information is lost when forming histograms. In this paper, spatial-temporal interest regions (STIRs) are proposed, which are based on STIPs and are capable of marking the locations of the most ``shining'' human body parts. To represent human actions, the proposed approach takes advantage of multiple features, including STIRs, the pyramid histogram of oriented gradients and the pyramid histogram of oriented optical flows. To achieve this, a cumulative histogram is used to integrate dynamic information across sequences and to form feature vectors. Furthermore, the widely used nearest neighbor and AdaBoost methods are employed as classification algorithms. Experiments on the public datasets KTH, Weizmann and UCF Sports show that the proposed approach achieves effective and robust results.

  6. Pediatric Out-of-Hospital Cardiac Arrest Characteristics and Their Association With Survival and Neurobehavioral Outcome.

    PubMed

    Meert, Kathleen L; Telford, Russell; Holubkov, Richard; Slomine, Beth S; Christensen, James R; Dean, J Michael; Moler, Frank W

    2016-12-01

    To investigate relationships between cardiac arrest characteristics and survival and neurobehavioral outcome among children recruited to the Therapeutic Hypothermia after Pediatric Cardiac Arrest Out-of-Hospital trial. Secondary analysis of Therapeutic Hypothermia after Pediatric Cardiac Arrest Out-of-Hospital trial data. Thirty-six PICUs in the United States and Canada. All children (n = 295) had chest compressions for greater than or equal to 2 minutes, were comatose, and required mechanical ventilation after return of circulation. Neurobehavioral function was assessed using the Vineland Adaptive Behavior Scales, Second Edition at baseline (reflecting prearrest status) and 12 months postarrest. U.S. norms for Vineland Adaptive Behavior Scales, Second Edition scores are 100 (mean) ± 15 (SD). Higher scores indicate better functioning. Outcomes included 12-month survival and 12-month survival with Vineland Adaptive Behavior Scales, Second Edition greater than or equal to 70. Cardiac etiology of arrest, initial arrest rhythm of ventricular fibrillation/tachycardia, shorter duration of chest compressions, compressions not required at hospital arrival, fewer epinephrine doses, and witnessed arrest were associated with greater 12-month survival and 12-month survival with Vineland Adaptive Behavior Scales, Second Edition greater than or equal to 70. Weekend arrest was associated with lower 12-month survival. Body habitus was associated with 12-month survival with Vineland Adaptive Behavior Scales, Second Edition greater than or equal to 70; underweight children had better outcomes, and obese children had worse outcomes. On multivariate analysis, acute life threatening event/sudden unexpected infant death, chest compressions more than 30 minutes, and weekend arrest were associated with lower 12-month survival; witnessed arrest was associated with greater 12-month survival. 
Acute life threatening event/sudden unexpected infant death, other respiratory causes of arrest except drowning, other/unknown causes of arrest, and compressions more than 30 minutes were associated with lower 12-month survival with Vineland Adaptive Behavior Scales, Second Edition greater than or equal to 70. Many factors are associated with survival and neurobehavioral outcome among children who are comatose and require mechanical ventilation after out-of-hospital cardiac arrest. These factors may be useful for identifying children at risk for poor outcomes, and for improving prevention and resuscitation strategies.

  7. Improvement of resolution in full-view linear-array photoacoustic computed tomography using a novel adaptive weighting method

    NASA Astrophysics Data System (ADS)

    Omidi, Parsa; Diop, Mamadou; Carson, Jeffrey; Nasiriavanaki, Mohammadreza

    2017-03-01

    Linear-array-based photoacoustic computed tomography is a popular methodology for deep, high-resolution imaging. However, issues such as phase aberration, side-lobe effects, and propagation limitations deteriorate the resolution. The effect of phase aberration due to acoustic attenuation and the assumption of a constant speed of sound (SoS) can be reduced by applying an adaptive weighting method such as the coherence factor (CF). Utilizing an adaptive beamforming algorithm such as minimum variance (MV) can improve the resolution at the focal point by suppressing the side-lobes. Moreover, directional objects that emit mainly parallel to the detection plane, such as vessels and other absorbing structures stretched in the direction perpendicular to the detection plane, can be nearly invisible, which degrades resolution. In this study, we propose a full-view array-level weighting algorithm in which different weights are assigned to different positions of the linear array based on an orientation algorithm that uses the histogram of oriented gradients (HOG). Simulation results obtained from a synthetic phantom show the superior performance of the proposed method over existing reconstruction methods.
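
    The coherence factor weighting mentioned above can be sketched for a single pixel as the ratio of coherent to incoherent energy across the N delayed channel signals, a standard formulation; the channel count below is illustrative.

```python
import numpy as np

def coherence_factor(channel_data):
    """Coherence factor for one reconstructed pixel:
    CF = |sum_i s_i|^2 / (N * sum_i |s_i|^2), in [0, 1].
    Perfectly aligned (coherent) channel signals give CF = 1;
    incoherent signals give CF on the order of 1/N."""
    s = np.asarray(channel_data, dtype=np.complex128)
    N = s.shape[0]
    num = np.abs(s.sum()) ** 2
    den = N * np.sum(np.abs(s) ** 2)
    return float(num / den) if den > 0 else 0.0

rng = np.random.default_rng(2)
coherent = np.ones(64)            # aligned delayed signals -> CF = 1
noisy = rng.normal(size=64)       # incoherent signals -> CF ~ 1/64
```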

  8. Histogram analysis of apparent diffusion coefficient for monitoring early response in patients with advanced cervical cancers undergoing concurrent chemo-radiotherapy.

    PubMed

    Meng, Jie; Zhu, Lijing; Zhu, Li; Ge, Yun; He, Jian; Zhou, Zhengyang; Yang, Xiaofeng

    2017-11-01

    Background Apparent diffusion coefficient (ADC) histogram analysis has been widely used in determining tumor prognosis. Purpose To investigate the dynamic changes of ADC histogram parameters during concurrent chemo-radiotherapy (CCRT) in patients with advanced cervical cancers. Material and Methods This prospective study enrolled 32 patients with advanced cervical cancers undergoing CCRT who received diffusion-weighted (DW) magnetic resonance imaging (MRI) before CCRT, at the end of the second and fourth week during CCRT and one month after CCRT completion. The ADC histogram for the entire tumor volume was generated, and a series of histogram parameters was obtained. Dynamic changes of those parameters in cervical cancers were investigated as early biomarkers for treatment response. Results All histogram parameters except AUC low showed significant changes during CCRT (all P < 0.05). There were three variable trends involving different parameters. The mode, 5th, 10th, and 25th percentiles showed similar early increase rates (33.33%, 33.99%, 34.12%, and 30.49%, respectively) at the end of the second week of CCRT. The pre-CCRT 5th and 25th percentiles of the complete response (CR) group were significantly lower than those of the partial response (PR) group. Conclusion A series of ADC histogram parameters of cervical cancers changed significantly at the early stage of CCRT, indicating their potential in monitoring early tumor response to therapy.
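
    Whole-volume ADC histogram parameters of the kind reported here (percentiles, mode, and shape measures) can be sketched from a voxel sample as follows. The synthetic ADC values and bin count are illustrative assumptions, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic stand-in for the ADC values of all voxels inside a tumor ROI
adc_voxels = rng.gamma(shape=9.0, scale=0.12, size=5000)

def histogram_parameters(v, n_bins=128):
    """Percentiles, mode, skewness and kurtosis of a voxel-value sample."""
    p5, p10, p25, p50, p75, p90, p95 = np.percentile(
        v, [5, 10, 25, 50, 75, 90, 95])
    counts, edges = np.histogram(v, bins=n_bins)
    mode = 0.5 * (edges[np.argmax(counts)] + edges[np.argmax(counts) + 1])
    z = (v - v.mean()) / v.std()
    return {"mean": v.mean(), "mode": mode,
            "p5": p5, "p10": p10, "p25": p25, "p50": p50,
            "p75": p75, "p90": p90, "p95": p95,
            "skewness": np.mean(z ** 3),       # third standardized moment
            "kurtosis": np.mean(z ** 4)}       # fourth standardized moment

params = histogram_parameters(adc_voxels)
```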

  9. Whole Tumor Histogram-profiling of Diffusion-Weighted Magnetic Resonance Images Reflects Tumorbiological Features of Primary Central Nervous System Lymphoma.

    PubMed

    Schob, Stefan; Münch, Benno; Dieckow, Julia; Quäschling, Ulf; Hoffmann, Karl-Titus; Richter, Cindy; Garnov, Nikita; Frydrychowicz, Clara; Krause, Matthias; Meyer, Hans-Jonas; Surov, Alexey

    2018-04-01

    Diffusion weighted imaging (DWI) quantifies the motion of hydrogen nuclei in biological tissues and has hereby been used to assess the underlying tissue microarchitecture. Histogram-profiling of DWI provides more detailed information on the diffusion characteristics of a lesion than the standardly calculated apparent diffusion coefficient (ADC) values of minimum, mean and maximum. Hence, the aim of our study was to investigate which parameters of DWI histogram-profiling in primary central nervous system lymphoma (PCNSL) can be used to specifically predict features like cellular density, chromatin content and proliferative activity. Pre-treatment ADC maps of 21 PCNSL patients (8 female, 13 male, 28-89 years) from a 1.5T system were used for Matlab-based histogram profiling. Results of histopathology (H&E staining) and immunohistochemistry (Ki-67 expression) were quantified. Correlations between histogram-profiling parameters and the neuropathologic examination were calculated using SPSS 23.0. The lower percentiles (p10 and p25) showed significant correlations with structural parameters of the neuropathologic examination (cellular density, chromatin content). The highest percentile, p90, correlated significantly with Ki-67 expression, resembling proliferative activity. Kurtosis of the ADC histogram correlated significantly with cellular density. Histogram-profiling of DWI in PCNSL provides a comprehensible set of parameters, which reflect distinct tumor-architectural and tumor-biological features and, hence, are promising biomarkers for treatment response and prognosis. Copyright © 2018. Published by Elsevier Inc.

  10. ADC histogram analysis of muscle lymphoma - Correlation with histopathology in a rare entity.

    PubMed

    Meyer, Hans-Jonas; Pazaitis, Nikolaos; Surov, Alexey

    2018-06-21

    Diffusion weighted imaging (DWI) is able to reflect histopathological architecture. A novel imaging approach, namely histogram analysis, is used to further characterize lesions on MRI. The purpose of this study is to correlate histogram parameters derived from apparent diffusion coefficient (ADC) maps with histopathology parameters in muscle lymphoma. Eight patients (mean age 64.8 years, range 45-72 years) with histopathologically confirmed muscle lymphoma were retrospectively identified. Cell count, total nucleic area and average nucleic area were estimated using ImageJ. Additionally, the Ki67-index was calculated. DWI was obtained on a 1.5T scanner using b values of 0 and 1000 s/mm2. Histogram analysis was performed as a whole-lesion measurement using a custom-made Matlab-based application. The correlation analysis revealed statistically significant correlations between cell count and ADCmean (ρ=-0.76, P=0.03) as well as ADCp75 (ρ=-0.79, P=0.02). Kurtosis and entropy correlated with average nucleic area (ρ=-0.81, P=0.02 and ρ=0.88, P=0.007, respectively). None of the analyzed ADC parameters correlated with total nucleic area or with the Ki67-index. This study identified significant correlations between cellularity and histogram parameters derived from ADC maps in muscle lymphoma. Thus, histogram analysis parameters reflect histopathology in muscle tumors. Advances in knowledge: Whole-lesion ADC histogram analysis is able to reflect histopathology parameters in muscle lymphomas.

  11. Delay, change and bifurcation of the immunofluorescence distribution attractors in health statuses diagnostics and in medical treatment

    NASA Astrophysics Data System (ADS)

    Galich, Nikolay E.; Filatov, Michael V.

    2008-07-01

    This communication describes immunology experiments and the treatment of the resulting experimental data. New nonlinear methods for the statistical analysis of peripheral blood neutrophil immunofluorescence have been developed, using the respiratory-burst reaction of DNA fluorescence in neutrophil cell nuclei due to oxidative activity. Histograms of photon-count statistics of the radiant neutrophil populations in flow cytometry experiments are considered, and distributions of fluorescence-flash frequency as functions of fluorescence intensity are analyzed. Statistical peculiarities of the histogram sets for healthy and unhealthy donors allow all histograms to be divided into three classes, based on three different types of smoothed, long-range-scale-averaged immunofluorescence distributions and their bifurcations. The first group of histograms belongs to healthy donors; the other two belong to donors with autoimmune and inflammatory diseases, some of which are not diagnosed by standard biochemical methods. Medical standards and the statistical data of the immunofluorescence histograms for identifying health and illness are interconnected. The possibilities of immunofluorescence statistics for the registration, diagnosis, and monitoring of different diseases under various medical treatments have been demonstrated: health and illness criteria are connected with statistical features of the immunofluorescence histograms, and neutrophil population fluorescence is a sensitive, clear indicator of health status.

  12. A CMOS merged CDR and continuous-time adaptive equalizer

    NASA Astrophysics Data System (ADS)

    Sánchez-Azqueta, C.; Aguirre, J.; Gimeno, C.; Aldea, C.; Celma, S.

    2015-06-01

    We present a low-voltage merged CDR and continuous-time adaptive equalizer capable of compensating the attenuation of an SI-POF channel while simultaneously synchronizing and regenerating the incoming signal in a single stage. The system operates at 1.25 Gbps with NRZ modulation through a 50-m SI-POF channel and is designed in standard 0.18-μm CMOS, fed from a 1-V supply with a power consumption of 43.4 mW.

  13. A NOISE ADAPTIVE FUZZY EQUALIZATION METHOD FOR PROCESSING SOLAR EXTREME ULTRAVIOLET IMAGES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Druckmueller, M., E-mail: druckmuller@fme.vutbr.cz

    A new image enhancement tool ideally suited for the visualization of fine structures in extreme ultraviolet images of the corona is presented in this paper. The Noise Adaptive Fuzzy Equalization method is particularly suited for the exceptionally high dynamic range images from the Atmospheric Imaging Assembly instrument on the Solar Dynamics Observatory. This method produces artifact-free images and gives significantly better results than methods based on convolution or Fourier transform which are often used for that purpose.

  14. Edge enhancement and image equalization by unsharp masking using self-adaptive photochromic filters.

    PubMed

    Ferrari, José A; Flores, Jorge L; Perciante, César D; Frins, Erna

    2009-07-01

    A new method for real-time edge enhancement and image equalization using photochromic filters is presented. The reversible self-adaptive capacity of photochromic materials is used to create an unsharp mask of the original image. This unsharp mask produces a kind of self-filtering of the original image. Unlike the usual Fourier (coherent) image processing, the proposed technique can also be used with incoherent illumination. Validation experiments with bacteriorhodopsin and photochromic glass are presented.
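
    A digital analogue of the optical unsharp-masking principle can be sketched as follows, with a simple box blur standing in for the low-pass mask that the photochromic material forms optically; the kernel size and gain are arbitrary illustrative choices.

```python
import numpy as np

def unsharp_mask(img, kernel=5, amount=1.0):
    """Unsharp masking: subtract a blurred copy of the image (the 'unsharp
    mask') and add the difference back, which boosts edges. The blur here
    is a separable box filter with edge padding."""
    img = np.asarray(img, dtype=np.float64)
    k = np.ones(kernel) / kernel
    pad = kernel // 2
    blur = np.apply_along_axis(
        lambda r: np.convolve(np.pad(r, pad, mode="edge"), k, mode="valid"),
        1, img)
    blur = np.apply_along_axis(
        lambda c: np.convolve(np.pad(c, pad, mode="edge"), k, mode="valid"),
        0, blur)
    return img + amount * (img - blur)

# A vertical step edge: unsharp masking overshoots on both sides of it
step = np.tile(np.concatenate([np.zeros(10), np.ones(10)]), (8, 1))
sharp = unsharp_mask(step)
```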

  15. Binaural interaction in low-frequency neurons in inferior colliculus of the cat. II. Effects of changing rate and direction of interaural phase.

    PubMed

    Yin, T C; Kuwada, S

    1983-10-01

    We used the binaural beat stimulus to study the interaural phase sensitivity of inferior colliculus (IC) neurons in the cat. The binaural beat, produced by delivering tones of slightly different frequencies to the two ears, generates continuous and graded changes in interaural phase. Over 90% of the cells that exhibit a sensitivity to changes in the interaural delay also show a sensitivity to interaural phase disparities with the binaural beat. Cells respond with a burst of impulses with each complete cycle of the beat frequency. The period histogram obtained by binning the poststimulus time histogram on the beat frequency gives a measure of the interaural phase sensitivity of the cell. In general, there is good correspondence in the shapes of the period histograms generated from binaural beats and the interaural phase curves derived from interaural delays and in the mean interaural phase angle calculated from them. The magnitude of the beat frequency determines the rate of change of interaural phase and the sign determines the direction of phase change. While most cells respond in a phase-locked manner up to beat frequencies of 10 Hz, there are some cells that will phase-lock up to 80 Hz. Beat frequency and mean interaural phase angle are linearly related for most cells. Most cells respond equally in the two directions of phase change and with different rates of change, at least up to 10 Hz. However, some IC cells exhibit marked sensitivity to the speed of phase change, either responding more vigorously at low beat frequencies or at high beat frequencies. In addition, other cells demonstrate a clear directional sensitivity. The cells that show sensitivity to the direction and speed of phase changes would be expected to demonstrate a sensitivity to moving sound sources in the free field. 
Changes in the mean interaural phase of the binaural beat period histograms are used to determine the effects of changes in average and interaural intensity on the phase sensitivity of the cells. The effects of both forms of intensity variation are continuously distributed. The binaural beat offers a number of advantages for studying the interaural phase sensitivity of binaural cells. The dynamic characteristics of the interaural phase can be varied so that the speed and direction of phase change are under direct control. The data can be obtained in a much more efficient manner, as the binaural beat is about 10 times faster in terms of data collection than the interaural delay.
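
    The period histogram described above (binning spikes on the beat frequency) and the mean interaural phase angle can be sketched as follows, using synthetic spike times. The vector-strength measure is a standard companion statistic for phase locking, not necessarily the authors' exact computation.

```python
import numpy as np

def period_histogram(spike_times, beat_freq, n_bins=16):
    """Fold each spike time modulo one beat period and histogram the
    resulting phases (in cycles, 0-1)."""
    phases = np.mod(np.asarray(spike_times) * beat_freq, 1.0)
    return np.histogram(phases, bins=n_bins, range=(0.0, 1.0))

def mean_phase_angle(spike_times, beat_freq):
    """Circular mean phase of the folded spikes (in cycles) and the
    vector strength (0 = uniform phases, 1 = perfect phase locking)."""
    ph = 2 * np.pi * np.mod(np.asarray(spike_times) * beat_freq, 1.0)
    vec = np.exp(1j * ph).mean()
    return (np.angle(vec) / (2 * np.pi)) % 1.0, np.abs(vec)

# Synthetic cell bursting once per cycle of a 1-Hz beat, near phase 0.25
rng = np.random.default_rng(4)
spikes = np.concatenate(
    [k + 0.25 + 0.02 * rng.normal(size=20) for k in range(10)])
counts, edges = period_histogram(spikes, beat_freq=1.0)
phase, strength = mean_phase_angle(spikes, beat_freq=1.0)
```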

  16. Kalman Filtering Approach to Blind Equalization

    DTIC Science & Technology

    1993-12-01

    Naval Postgraduate School, Monterey, California. Thesis: Kalman Filtering Approach to Blind Equalization, by Mehmet Kutlu. From the abstract (fragment): ... which introduces errors due to intersymbol interference. The solution to this problem is provided by equalizers which use a training sequence to adapt to ...

  17. Time-domain digital pre-equalization for band-limited signals based on receiver-side adaptive equalizers.

    PubMed

    Zhang, Junwen; Yu, Jianjun; Chi, Nan; Chien, Hung-Chang

    2014-08-25

    We theoretically and experimentally investigate a time-domain digital pre-equalization (DPEQ) scheme for bandwidth-limited optical coherent communication systems, based on feedback of channel characteristics from receiver-side blind and adaptive equalizers, such as the least-mean-squares (LMS) algorithm and the constant- or multi-modulus algorithms (CMA, MMA). Based on the proposed DPEQ scheme, we theoretically and experimentally study its performance under various channel conditions and resolutions for channel estimation, such as filtering bandwidth, tap length, and OSNR. Using a high-speed 64-GSa/s DAC in cooperation with the proposed DPEQ technique, we successfully synthesized band-limited 40-Gbaud signals in the modulation formats of polarization-division multiplexed (PDM) quadrature phase shift keying (QPSK), 8-quadrature amplitude modulation (QAM) and 16-QAM, and significant improvement in both back-to-back and transmission BER performance is also demonstrated.
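
    A receiver-side LMS equalizer of the kind whose converged response such a scheme could feed back can be sketched as follows; the channel model, tap count, and step size are illustrative assumptions, and a known training sequence drives the tap adaptation.

```python
import numpy as np

rng = np.random.default_rng(5)

n = 5000
train = rng.choice([-1.0, 1.0], size=n)                # known training symbols
rx = np.convolve(train, [0.9, 0.35], mode="full")[:n]  # band-limiting channel
rx += 0.01 * rng.normal(size=n)                        # small additive noise

ntaps, mu = 9, 0.01
w = np.zeros(ntaps)                                    # equalizer taps
for k in range(ntaps, n):
    xv = rx[k - ntaps:k][::-1]                         # regressor, newest first
    err = train[k - 1] - w @ xv                        # training error
    w += mu * err * xv                                 # LMS tap update

# Mean squared error over the last 500 symbols after convergence
mse = np.mean([(train[k - 1] - w @ rx[k - ntaps:k][::-1]) ** 2
               for k in range(n - 500, n)])
```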

  18. Time-cumulated visible and infrared histograms used as descriptor of cloud cover

    NASA Technical Reports Server (NTRS)

    Seze, G.; Rossow, W.

    1987-01-01

    To study the statistical behavior of clouds for different climate regimes, the spatial and temporal stability of VIS-IR bidimensional histograms is tested. Also, the effect of data sampling and averaging on the histogram shapes is considered; in particular the sampling strategy used by the International Satellite Cloud Climatology Project is tested.

  19. Interpreting Histograms. As Easy as It Seems?

    ERIC Educational Resources Information Center

    Lem, Stephanie; Onghena, Patrick; Verschaffel, Lieven; Van Dooren, Wim

    2014-01-01

    Histograms are widely used, but recent studies have shown that they are not as easy to interpret as it might seem. In this article, we report on three studies on the interpretation of histograms in which we investigated, namely, (1) whether the misinterpretation by university students can be considered to be the result of heuristic reasoning, (2)…

  20. Improving Real World Performance of Vision Aided Navigation in a Flight Environment

    DTIC Science & Technology

    2016-09-15

    Table-of-contents excerpt: 4.2 Wide Area Search Extent; 4.3 Large-Scale Image Navigation Histogram Filter (4.3.1 Location Model; 4.3.2 Measurement Model; 4.3.3 Histogram Filter; Iteration of Histogram Filter); 4.4 Implementation and Flight Test Campaign (4.4.1 Software Implementation).

  1. Airborne gamma-ray spectrometer and magnetometer survey, Durango A, Colorado. Final report Volume II A. Detail area

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1983-01-01

    This volume contains geology of the Durango A detail area, radioactive mineral occurrences in Colorado, and geophysical data interpretation. Eight appendices provide the following: stacked profiles, geologic histograms, geochemical histograms, speed and altitude histograms, geologic statistical tables, geochemical statistical tables, magnetic and ancillary profiles, and test line data.

  2. Airborne gamma-ray spectrometer and magnetometer survey, Durango B, Colorado. Final report Volume II A. Detail area

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1983-01-01

    The geology of the Durango B detail area, the radioactive mineral occurrences in Colorado and the geophysical data interpretation are included in this report. Seven appendices contain: stacked profiles, geologic histograms, geochemical histograms, speed and altitude histograms, geologic statistical tables, geochemical statistical tables, and test line data.

  3. Students' Understanding of Bar Graphs and Histograms: Results from the LOCUS Assessments

    ERIC Educational Resources Information Center

    Whitaker, Douglas; Jacobbe, Tim

    2017-01-01

    Bar graphs and histograms are core statistical tools that are widely used in statistical practice and commonly taught in classrooms. Despite their importance and the instructional time devoted to them, many students demonstrate misunderstandings when asked to read and interpret bar graphs and histograms. Much of the research that has been…

  4. Adaptive reconfigurable V-BLAST type equalizer for cognitive MIMO-OFDM radios

    NASA Astrophysics Data System (ADS)

    Ozden, Mehmet Tahir

    2015-12-01

    An adaptive channel shortening equalizer design for multiple input multiple output-orthogonal frequency division multiplexing (MIMO-OFDM) radio receivers is considered in this presentation. The proposed receiver has desirable features for cognitive and software defined radio implementations. It consists of two sections: a MIMO decision feedback equalizer (MIMO-DFE) and adaptive multiple Viterbi detection. In the MIMO-DFE section, a complete modified Gram-Schmidt orthogonalization of the multichannel input data is accomplished using sequential processing multichannel Givens lattice stages, so that a Vertical Bell Laboratories Layered Space Time (V-BLAST) type MIMO-DFE is realized at the front-end section of the channel shortening equalizer. Matrix operations, a major bottleneck for receiver operations, are accordingly avoided, and only scalar operations are used. A highly modular and regular radio receiver architecture is achieved, with a structure suitable for digital signal processing (DSP) chip and field programmable gate array (FPGA) implementations, which are important for software defined radio realizations. The MIMO-DFE section of the proposed receiver can also be reconfigured for spectrum sensing and positioning functions, which are important tasks for cognitive radio applications. In the adaptive multiple Viterbi detection section, a systolic array implementation for each channel is performed so that a receiver architecture with high computational concurrency is attained. The total computational complexity is given in terms of the equalizer and desired response filter lengths, alphabet size, and number of antennas. The performance of the proposed receiver is presented for the two-channel case by means of mean squared error (MSE) and probability of error evaluations, which are conducted for time-invariant and time-variant channel conditions, orthogonal and nonorthogonal transmissions, and two different modulation schemes.

  5. Modeling Early Postnatal Brain Growth and Development with CT: Changes in the Brain Radiodensity Histogram from Birth to 2 Years.

    PubMed

    Cauley, K A; Hu, Y; Och, J; Yorks, P J; Fielden, S W

    2018-04-01

    The majority of brain growth and development occurs in the first 2 years of life. This study investigated these changes by analysis of the brain radiodensity histogram of head CT scans from the clinical population, 0-2 years of age. One hundred twenty consecutive head CTs with normal findings meeting the inclusion criteria from children from birth to 2 years were retrospectively identified from 3 different CT scan platforms. Histogram analysis was performed on brain-extracted images, and histogram mean, mode, full width at half maximum, skewness, kurtosis, and SD were correlated with subject age. The effects of scan platform were investigated. Normative curves were fitted by polynomial regression analysis. Average total brain volume was 360 cm³ at birth, 948 cm³ at 1 year, and 1072 cm³ at 2 years. Total brain tissue density showed an 11% increase in mean density at 1 year and 19% at 2 years. Brain radiodensity histogram skewness was positive at birth, declining logarithmically in the first 200 days of life. The histogram kurtosis also decreased in the first 200 days to approach a normal distribution. Direct segmentation of CT images showed that changes in brain radiodensity histogram skewness correlated with, and can be explained by, a relative increase in gray matter volume and an increase in gray and white matter tissue density that occurs during this period of brain maturation. Normative metrics of the brain radiodensity histogram derived from routine clinical head CT images can be used to develop a model of normal brain development. © 2018 by American Journal of Neuroradiology.
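    The histogram descriptors tracked in this study (mean, mode, FWHM, skewness, kurtosis, SD) are standard summary statistics. A minimal sketch of computing them from voxel radiodensities follows; the bin count and value range are assumptions, not the study's protocol:

```python
import numpy as np

def histogram_metrics(values, bins=256, value_range=(0.0, 100.0)):
    """Summarize a radiodensity histogram: mean, SD, mode, FWHM, skewness, kurtosis."""
    counts, edges = np.histogram(values, bins=bins, range=value_range)
    centers = 0.5 * (edges[:-1] + edges[1:])
    mean = values.mean()
    sd = values.std()
    mode = centers[np.argmax(counts)]       # center of the most populated bin
    # Full width at half maximum of the binned histogram
    above = centers[counts >= counts.max() / 2]
    fwhm = above.max() - above.min() if above.size else 0.0
    z = (values - mean) / sd
    skew = np.mean(z ** 3)                  # Fisher skewness (0 for symmetric data)
    kurt = np.mean(z ** 4) - 3.0            # excess kurtosis (0 for a Gaussian)
    return {"mean": mean, "sd": sd, "mode": mode,
            "fwhm": fwhm, "skewness": skew, "kurtosis": kurt}
```

On a near-Gaussian density distribution, skewness and excess kurtosis both sit near zero, which is the asymptote the study reports the infant brain histogram approaching over the first 200 days.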

  6. Histogram analysis derived from apparent diffusion coefficient (ADC) is more sensitive to reflect serological parameters in myositis than conventional ADC analysis.

    PubMed

    Meyer, Hans Jonas; Emmer, Alexander; Kornhuber, Malte; Surov, Alexey

    2018-05-01

    Diffusion-weighted imaging (DWI) has the potential to reflect histopathological architecture. A novel imaging approach, histogram analysis, is used to further characterize tissues on MRI. The aim of this study was to correlate histogram parameters derived from apparent diffusion coefficient (ADC) maps with serological parameters in myositis. 16 patients with autoimmune myositis were included in this retrospective study. DWI was obtained on a 1.5 T scanner using b-values of 0 and 1000 s/mm². Histogram analysis was performed as a whole-muscle measurement using a custom-made Matlab-based application. The following ADC histogram parameters were estimated: ADCmean, ADCmax, ADCmin, ADCmedian, ADCmode, the percentiles ADCp10, ADCp25, ADCp75, ADCp90, as well as the histogram parameters kurtosis, skewness, and entropy. In all patients, the blood sample was acquired within 3 days of the MRI. The following serological parameters were estimated: alanine aminotransferase, aspartate aminotransferase, creatine kinase, lactate dehydrogenase, C-reactive protein (CRP) and myoglobin. All patients were screened for Jo1 autoantibodies. Kurtosis correlated inversely with CRP (r = -0.55, p = 0.03). Furthermore, ADCp10 and ADCp90 values tended to correlate with creatine kinase (r = -0.43, p = 0.11 and r = -0.42, p = 0.12, respectively). In addition, ADCmean, p10, p25, median, mode, and entropy differed between Jo1-positive and Jo1-negative patients. ADC histogram parameters are sensitive for the detection of muscle alterations in myositis patients. Advances in knowledge: This study identified that kurtosis derived from ADC maps is associated with CRP in myositis patients. Furthermore, several ADC histogram parameters are statistically different between Jo1-positive and Jo1-negative patients.

  7. Can histogram analysis of MR images predict aggressiveness in pancreatic neuroendocrine tumors?

    PubMed

    De Robertis, Riccardo; Maris, Bogdan; Cardobi, Nicolò; Tinazzi Martini, Paolo; Gobbo, Stefano; Capelli, Paola; Ortolani, Silvia; Cingarlini, Sara; Paiella, Salvatore; Landoni, Luca; Butturini, Giovanni; Regi, Paolo; Scarpa, Aldo; Tortora, Giampaolo; D'Onofrio, Mirko

    2018-06-01

    To evaluate MRI-derived whole-tumour histogram analysis parameters in predicting pancreatic neuroendocrine neoplasm (panNEN) grade and aggressiveness. Pre-operative MR images of 42 consecutive patients with panNENs >1 cm were retrospectively analysed. T1-/T2-weighted images and ADC maps were analysed. Histogram-derived parameters were compared to histopathological features using the Mann-Whitney U test. Diagnostic accuracy was assessed by ROC-AUC analysis; sensitivity and specificity were assessed for each histogram parameter. ADC entropy was significantly higher in G2-3 tumours with ROC-AUC 0.757; sensitivity and specificity were 83.3% (95% CI: 61.2-94.5) and 61.1% (95% CI: 36.1-81.7). ADC kurtosis was higher in panNENs with vascular involvement, nodal and hepatic metastases (p = .008, .021 and .008; ROC-AUC = 0.820, 0.709 and 0.820); sensitivity and specificity were 85.7/74.3% (95% CI: 42-99.2/56.4-86.9), 36.8/96.5% (95% CI: 17.2-61.4/76-99.8) and 100/62.8% (95% CI: 56.1-100/44.9-78.1). No significant differences between groups were found for the other histogram-derived parameters (p > .05). Whole-tumour histogram analysis of ADC maps may be helpful in predicting tumour grade, vascular involvement, and nodal and liver metastases in panNENs. ADC entropy and ADC kurtosis are the most accurate parameters for identification of panNENs with malignant behaviour. • Whole-tumour ADC histogram analysis can predict aggressiveness in pancreatic neuroendocrine neoplasms. • ADC entropy and kurtosis are higher in aggressive tumours. • ADC histogram analysis can quantify tumour diffusion heterogeneity. • Non-invasive quantification of tumour heterogeneity can provide adjunctive information for prognostication.

  8. Non-small cell lung cancer: Whole-lesion histogram analysis of the apparent diffusion coefficient for assessment of tumor grade, lymphovascular invasion and pleural invasion.

    PubMed

    Tsuchiya, Naoko; Doai, Mariko; Usuda, Katsuo; Uramoto, Hidetaka; Tonami, Hisao

    2017-01-01

    To investigate the diagnostic accuracy of histogram analyses of apparent diffusion coefficient (ADC) values for determining non-small cell lung cancer (NSCLC) tumor grade, lymphovascular invasion, and pleural invasion, we studied 60 surgically diagnosed NSCLC patients. Diffusion-weighted imaging (DWI) was performed in the axial plane using a navigator-triggered single-shot, echo-planar imaging sequence with prospective acquisition correction. ADC maps were generated, and we placed a volume-of-interest on the tumor to construct the whole-lesion histogram. From the histogram, we calculated the mean, the 5th, 10th, 25th, 50th, 75th, 90th, and 95th percentiles of ADC, skewness, and kurtosis. Histogram parameters were correlated with tumor grade, lymphovascular invasion, and pleural invasion. We performed a receiver operating characteristic (ROC) analysis to assess the diagnostic performance of the histogram parameters for distinguishing different pathologic features. The ADC mean and the 10th, 25th, 50th, 75th, 90th, and 95th percentiles showed significant differences among the tumor grades. The ADC mean and the 25th, 50th, 75th, 90th, and 95th percentiles were significant histogram parameters between high- and low-grade tumors. The ROC analysis between high- and low-grade tumors showed that the 95th percentile ADC achieved the highest area under the curve (AUC) at 0.74. Lymphovascular invasion was associated with the ADC mean, the 50th, 75th, 90th, and 95th percentiles, skewness, and kurtosis. Kurtosis achieved the highest AUC at 0.809. Pleural invasion was only associated with skewness, with an AUC of 0.648. ADC histogram analyses based on the entire tumor volume are able to stratify NSCLC tumor grade, lymphovascular invasion, and pleural invasion.

  9. Cluster analysis based on dimensional information with applications to feature selection and classification

    NASA Technical Reports Server (NTRS)

    Eigen, D. J.; Fromm, F. R.; Northouse, R. A.

    1974-01-01

    A new clustering algorithm is presented that is based on dimensional information. The algorithm includes an inherent feature selection criterion, which is discussed. Further, a heuristic method for choosing the proper number of intervals for a frequency distribution histogram, a feature necessary for the algorithm, is presented. The algorithm, although usable as a stand-alone clustering technique, is then utilized as a global approximator. Local clustering techniques and configuration of a global-local scheme are discussed, and finally the complete global-local and feature selector configuration is shown in application to a real-time adaptive classification scheme for the analysis of remotely sensed multispectral scanner data.
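    The paper's own interval-number heuristic is not described in the abstract. For orientation only, two classical rules of thumb for choosing the number of histogram bins (not the authors' method) can be written as:

```python
import numpy as np

def sturges_bins(x):
    # Sturges' rule: k = ceil(log2(n)) + 1 bins for n samples
    return int(np.ceil(np.log2(len(x)))) + 1

def freedman_diaconis_bins(x):
    # Freedman-Diaconis rule: bin width h = 2 * IQR * n^(-1/3),
    # then the bin count is the data range divided by h.
    q75, q25 = np.percentile(x, [75, 25])
    h = 2.0 * (q75 - q25) * len(x) ** (-1.0 / 3.0)
    return max(1, int(np.ceil((x.max() - x.min()) / h)))
```

Sturges depends only on sample size, while Freedman-Diaconis also adapts to the spread of the data, which matters for the skewed frequency distributions typical of multispectral scanner channels.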

  10. Improved automatic adjustment of density and contrast in FCR system using neural network

    NASA Astrophysics Data System (ADS)

    Takeo, Hideya; Nakajima, Nobuyoshi; Ishida, Masamitsu; Kato, Hisatoyo

    1994-05-01

    The FCR system automatically adjusts image density and contrast by analyzing the histogram of the image data in the radiation field. The advanced image recognition methods proposed in this paper, based on neural network technology, can improve this automatic adjustment. Two methods are presented, both built on a 3-layer neural network trained with back propagation: in one, the image data are input directly to the input layer; in the other, the histogram data are input. The former is effective for imaging menus such as the shoulder joint, where the position that the region of interest occupies on the histogram changes with differences in positioning; the latter is effective for imaging menus such as the pediatric chest, where the histogram shape changes with differences in positioning. We experimentally confirmed the validity of these methods for automatic adjustment performance, compared with conventional histogram analysis methods.

  11. 4.5-Gb/s RGB-LED based WDM visible light communication system employing CAP modulation and RLS based adaptive equalization.

    PubMed

    Wang, Yiguang; Huang, Xingxing; Tao, Li; Shi, Jianyang; Chi, Nan

    2015-05-18

    Inter-symbol interference (ISI) is one of the key problems that seriously limit the transmission data rate in high-speed VLC systems. To eliminate ISI and further improve system performance, a series of equalization schemes has been widely investigated. As an adaptive algorithm commonly used in wireless communication, recursive least squares (RLS) is also well suited to visible light communication owing to its fast convergence and good performance. In this paper, for the first time we experimentally demonstrate a high-speed RGB-LED based WDM VLC system employing carrier-less amplitude and phase (CAP) modulation and RLS-based adaptive equalization. An aggregate data rate of 4.5 Gb/s is successfully achieved over 1.5-m indoor free space transmission with the bit error rate (BER) below the 7% forward error correction (FEC) limit of 3.8×10⁻³. To the best of our knowledge, this is the highest data rate ever achieved in RGB-LED based VLC systems.
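    The recursive least squares adaptation mentioned above follows the standard RLS recursion (gain vector plus rank-one inverse-correlation update). The sketch below is a generic real-valued equalizer with arbitrary parameters, not the paper's CAP-VLC receiver:

```python
import numpy as np

def rls_equalizer(rx, desired, n_taps=9, lam=0.99, delta=100.0):
    """Recursive least squares adaptation of a real-valued FIR equalizer."""
    w = np.zeros(n_taps)
    P = np.eye(n_taps) * delta          # inverse input-correlation estimate
    out = np.zeros(len(rx))
    buf = np.zeros(n_taps)              # tapped delay line, newest sample first
    for n in range(len(rx)):
        buf = np.roll(buf, 1)
        buf[0] = rx[n]
        y = w @ buf
        e = desired[n] - y              # a priori error
        Px = P @ buf
        k = Px / (lam + buf @ Px)       # RLS gain vector
        w = w + k * e
        P = (P - np.outer(k, Px)) / lam # rank-one update with forgetting factor lam
        out[n] = y
    return out, w
```

Compared with LMS, each symbol costs O(n_taps²) for the matrix update, but convergence takes roughly 2·n_taps symbols, which is the fast convergence the abstract cites.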

  12. Research of image retrieval technology based on color feature

    NASA Astrophysics Data System (ADS)

    Fu, Yanjun; Jiang, Guangyu; Chen, Fengying

    2009-10-01

    Recently, with the development of communication and computer technology and improvements in storage technology and digital imaging equipment, more image resources are available to us than ever, raising the question of how to locate a desired image quickly and accurately. The early approach was to index images with keywords in a database, but this becomes very difficult as the collections we search grow. To overcome the limitations of the traditional search method, content-based image retrieval technology arose, and it is now an active research subject. Color image retrieval is an important part of it, and color is the most important feature for it. Three key questions on how to make use of the color characteristic are discussed in the paper: the representation of color, the extraction of color features, and the measurement of similarity based on color. On this basis, the extraction of color histogram features is discussed in detail. Weighing the advantages and disadvantages of the overall histogram and the partition histogram, a new method based on a partition-overall histogram is proposed. The basic idea is to divide the image space according to a certain strategy and then calculate a color histogram of each block as the color feature of that block. Users choose the blocks that contain important spatial information and confirm their weights. The system calculates the distance between the corresponding blocks the users chose; the other blocks are merged into partial overall histograms and their distance is calculated as well. All the distances are then accumulated into the real distance between two pictures.
    The partition-overall histogram combines the advantages of the two methods above: choosing blocks makes the feature carry more spatial information, which improves performance, while the distances between partition-overall histograms are invariant to rotation and translation. The HSV color space, which is suited to the visual characteristics of humans, is used to represent the color characteristics of the image. Taking advantage of human color perception, the color sectors are quantized with unequal intervals to obtain the characteristic vector. Finally, image similarity is matched with the histogram intersection algorithm and the partition-overall histogram. Users can choose a demonstration image to express the visual query and can also adjust the weights through relevance feedback to obtain the best search result. An image retrieval system based on these approaches is presented. Experimental results show that image retrieval based on the partition-overall histogram can keep the spatial distribution information while extracting color features efficiently, and it is superior to normal color histograms in retrieval precision; the query precision rate is more than 95%. In addition, the efficient block representation lowers the complexity of the images to be searched, and thus the search efficiency is increased. The image retrieval algorithm based on the partition-overall histogram proposed in this paper is efficient and effective.
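    A much-simplified, hypothetical rendering of the block-partition and histogram-intersection ideas (greyscale histograms on a fixed grid with uniform weights, omitting the paper's HSV quantization and relevance feedback):

```python
import numpy as np

def block_histograms(img, grid=(3, 3), bins=16):
    """Per-block normalized histograms; a simplified stand-in for the
    partition-overall scheme described in the abstract."""
    h, w = img.shape
    feats = []
    for i in range(grid[0]):
        for j in range(grid[1]):
            blk = img[i * h // grid[0]:(i + 1) * h // grid[0],
                      j * w // grid[1]:(j + 1) * w // grid[1]]
            hist, _ = np.histogram(blk, bins=bins, range=(0, 256))
            feats.append(hist / hist.sum())
    return np.array(feats)

def histogram_intersection(h1, h2):
    # Swain-Ballard histogram intersection: 1.0 means identical distributions
    return np.minimum(h1, h2).sum()

def image_similarity(f1, f2, weights=None):
    """Weighted sum of per-block intersections; uniform weights by default."""
    sims = [histogram_intersection(a, b) for a, b in zip(f1, f2)]
    weights = np.ones(len(sims)) / len(sims) if weights is None else weights
    return float(np.dot(weights, sims))
```

In the paper's scheme the user-selected blocks keep individual weights while the remaining blocks are pooled into partial overall histograms before matching; here every block is matched separately for brevity.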

  13. Spline smoothing of histograms by linear programming

    NASA Technical Reports Server (NTRS)

    Bennett, J. O.

    1972-01-01

    An algorithm is presented for obtaining an approximating function to the frequency distribution from a sample of size n. First, a histogram is made from the data. Next, Euclidean-space approximations to the graph of the histogram, using central B-splines as basis elements, are obtained by linear programming. The approximating function has area one and is nonnegative.
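    One way to realize this construction is an LP that minimizes the summed absolute deviation between the spline and the histogram at the bin centers, subject to nonnegative coefficients and unit area. This is an interpretation under stated assumptions (the abstract does not give the exact objective), using quadratic central B-splines and scipy's linprog:

```python
import numpy as np
from scipy.optimize import linprog

def quad_bspline(x):
    """Central quadratic B-spline: support [-1.5, 1.5], unit area, nonnegative."""
    ax = np.abs(x)
    return np.where(ax < 0.5, 0.75 - ax ** 2,
           np.where(ax < 1.5, 0.5 * (1.5 - ax) ** 2, 0.0))

def smooth_histogram(centers, density, n_basis=12):
    """Fit f(x) = sum_j c_j * B((x - k_j)/s)/s to histogram densities by LP.
    Each scaled basis has unit area, so sum(c) = 1 enforces area one, and
    c >= 0 with nonnegative bases makes f nonnegative everywhere."""
    s = (centers[-1] - centers[0]) / (n_basis - 3)      # knot spacing
    knots = centers[0] + s * (np.arange(n_basis) - 1)   # extend past the ends
    A = quad_bspline((centers[:, None] - knots[None, :]) / s) / s
    n, m = A.shape
    # Variables: m spline coefficients, then n deviation slacks d_i with
    # -d_i <= (A c - density)_i <= d_i; objective = sum of slacks (L1 fit).
    obj = np.concatenate([np.zeros(m), np.ones(n)])
    A_ub = np.block([[A, -np.eye(n)], [-A, -np.eye(n)]])
    b_ub = np.concatenate([density, -density])
    A_eq = np.concatenate([np.ones(m), np.zeros(n)])[None, :]
    res = linprog(obj, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=(0, None))
    coef = res.x[:m]
    f = lambda x: quad_bspline((x[:, None] - knots[None, :]) / s) @ coef / s
    return f, res
```

The nonnegativity and area-one properties stated in the abstract fall out of the constraint structure rather than the objective, so they hold regardless of fit quality.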

  14. Histogram analysis of greyscale sonograms to differentiate between the subtypes of follicular variant of papillary thyroid cancer.

    PubMed

    Kwon, M-R; Shin, J H; Hahn, S Y; Oh, Y L; Kwak, J Y; Lee, E; Lim, Y

    2018-06-01

    To evaluate the diagnostic value of histogram analysis using ultrasound (US) to differentiate between the subtypes of follicular variant of papillary thyroid carcinoma (FVPTC). The present study included 151 patients with surgically confirmed FVPTC diagnosed between January 2014 and May 2016. Their preoperative US features were reviewed retrospectively. Histogram parameters (mean, maximum, minimum, range, root mean square, skewness, kurtosis, energy, entropy, and correlation) were obtained for each nodule. The 152 nodules in 151 patients comprised 48 non-invasive follicular thyroid neoplasm with papillary-like nuclear features (NIFTPs; 31.6%), 60 invasive encapsulated FVPTCs (EFVPTCs; 39.5%), and 44 infiltrative FVPTCs (28.9%). The US features differed significantly between the subtypes of FVPTC. Discrimination was achieved between NIFTPs and infiltrative FVPTC, and between invasive EFVPTC and infiltrative FVPTC using histogram parameters; however, the parameters were not significantly different between NIFTP and invasive EFVPTC. It is feasible to use greyscale histogram analysis to differentiate between NIFTP and infiltrative FVPTC, but not between NIFTP and invasive EFVPTC. Histograms can be used as a supplementary tool to differentiate the subtypes of FVPTC. Copyright © 2017 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.

  15. True progression versus pseudoprogression in the treatment of glioblastomas: a comparison study of normalized cerebral blood volume and apparent diffusion coefficient by histogram analysis.

    PubMed

    Song, Yong Sub; Choi, Seung Hong; Park, Chul-Kee; Yi, Kyung Sik; Lee, Woong Jae; Yun, Tae Jin; Kim, Tae Min; Lee, Se-Hoon; Kim, Ji-Hoon; Sohn, Chul-Ho; Park, Sung-Hye; Kim, Il Han; Jahng, Geon-Ho; Chang, Kee-Hyun

    2013-01-01

    The purpose of this study was to differentiate true progression from pseudoprogression of glioblastomas treated with concurrent chemoradiotherapy (CCRT) with temozolomide (TMZ) by using histogram analysis of apparent diffusion coefficient (ADC) and normalized cerebral blood volume (nCBV) maps. Twenty patients with histopathologically proven glioblastoma who had received CCRT with TMZ underwent perfusion-weighted imaging and diffusion-weighted imaging (b = 0, 1000 sec/mm²). The corresponding nCBV and ADC maps for the newly visible, entirely enhancing lesions were calculated after the completion of CCRT with TMZ. Two observers independently measured the histogram parameters of the nCBV and ADC maps. The histogram parameters between the true progression group (n = 10) and the pseudoprogression group (n = 10) were compared by use of an unpaired Student's t test and subsequent multivariable stepwise logistic regression analysis to determine the best predictors for the differential diagnosis between the two groups. Receiver operating characteristic analysis was employed to determine the best cutoff values for the histogram parameters that proved to be significant predictors for differentiating true progression from pseudoprogression. Intraclass correlation coefficient was used to determine the level of inter-observer reliability for the histogram parameters. The 5th percentile value (C5) of the cumulative ADC histograms was a significant predictor for the differential diagnosis between true progression and pseudoprogression (p = 0.044 for observer 1; p = 0.011 for observer 2). Optimal cutoff values of 892 × 10⁻⁶ mm²/sec for observer 1 and 907 × 10⁻⁶ mm²/sec for observer 2 could help differentiate between the two groups with a sensitivity of 90% and 80%, respectively, a specificity of 90% and 80%, respectively, and an area under the curve of 0.880 and 0.840, respectively.
Inter-observer reliability was excellent or good for all histogram parameters (intraclass correlation coefficient range: 0.70-0.99). The C5 of the cumulative ADC histogram can be a promising parameter for the differentiation of true progression from pseudoprogression of newly visible, entirely enhancing lesions after CCRT with TMZ for glioblastomas.
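    The C5 decision rule amounts to reading off the 5th percentile of a lesion's ADC distribution and comparing it with a cutoff. A toy sketch on synthetic ADC values (in 10⁻⁶ mm²/sec), using observer 1's cutoff from the abstract and assuming, consistent with the study's direction, that lower C5 indicates true progression:

```python
import numpy as np

# Synthetic ADC values for one enhancing lesion, in 10⁻⁶ mm²/sec
# (hypothetical distribution, for illustration only).
rng = np.random.default_rng(0)
adc = rng.normal(1100.0, 150.0, 5000)

# C5: the 5th percentile of the cumulative ADC histogram
c5 = np.percentile(adc, 5)

# Observer 1's cutoff from the abstract; lower C5 suggests the hypercellular
# tissue of true progression rather than treatment-related pseudoprogression.
cutoff = 892.0
label = "true progression" if c5 < cutoff else "pseudoprogression"
```

In practice the percentile would be computed over the voxels of the segmented enhancing lesion rather than a synthetic sample.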

  16. True Progression versus Pseudoprogression in the Treatment of Glioblastomas: A Comparison Study of Normalized Cerebral Blood Volume and Apparent Diffusion Coefficient by Histogram Analysis

    PubMed Central

    Song, Yong Sub; Park, Chul-Kee; Yi, Kyung Sik; Lee, Woong Jae; Yun, Tae Jin; Kim, Tae Min; Lee, Se-Hoon; Kim, Ji-Hoon; Sohn, Chul-Ho; Park, Sung-Hye; Kim, Il Han; Jahng, Geon-Ho; Chang, Kee-Hyun

    2013-01-01

    Objective The purpose of this study was to differentiate true progression from pseudoprogression of glioblastomas treated with concurrent chemoradiotherapy (CCRT) with temozolomide (TMZ) by using histogram analysis of apparent diffusion coefficient (ADC) and normalized cerebral blood volume (nCBV) maps. Materials and Methods Twenty patients with histopathologically proven glioblastoma who had received CCRT with TMZ underwent perfusion-weighted imaging and diffusion-weighted imaging (b = 0, 1000 sec/mm²). The corresponding nCBV and ADC maps for the newly visible, entirely enhancing lesions were calculated after the completion of CCRT with TMZ. Two observers independently measured the histogram parameters of the nCBV and ADC maps. The histogram parameters between the true progression group (n = 10) and the pseudoprogression group (n = 10) were compared by use of an unpaired Student's t test and subsequent multivariable stepwise logistic regression analysis to determine the best predictors for the differential diagnosis between the two groups. Receiver operating characteristic analysis was employed to determine the best cutoff values for the histogram parameters that proved to be significant predictors for differentiating true progression from pseudoprogression. Intraclass correlation coefficient was used to determine the level of inter-observer reliability for the histogram parameters. Results The 5th percentile value (C5) of the cumulative ADC histograms was a significant predictor for the differential diagnosis between true progression and pseudoprogression (p = 0.044 for observer 1; p = 0.011 for observer 2). Optimal cutoff values of 892 × 10⁻⁶ mm²/sec for observer 1 and 907 × 10⁻⁶ mm²/sec for observer 2 could help differentiate between the two groups with a sensitivity of 90% and 80%, respectively, a specificity of 90% and 80%, respectively, and an area under the curve of 0.880 and 0.840, respectively.
There was no other significant differentiating parameter on the nCBV histograms. Inter-observer reliability was excellent or good for all histogram parameters (intraclass correlation coefficient range: 0.70-0.99). Conclusion The C5 of the cumulative ADC histogram can be a promising parameter for the differentiation of true progression from pseudoprogression of newly visible, entirely enhancing lesions after CCRT with TMZ for glioblastomas. PMID:23901325

  17. PIRATE: pediatric imaging response assessment and targeting environment

    NASA Astrophysics Data System (ADS)

    Glenn, Russell; Zhang, Yong; Krasin, Matthew; Hua, Chiaho

    2010-02-01

    By combining the strengths of various imaging modalities, the multimodality imaging approach has potential to improve tumor staging, delineation of tumor boundaries, chemo-radiotherapy regime design, and treatment response assessment in cancer management. To address the urgent needs for efficient tools to analyze large-scale clinical trial data, we have developed an integrated multimodality, functional and anatomical imaging analysis software package for target definition and therapy response assessment in pediatric radiotherapy (RT) patients. Our software provides quantitative tools for automated image segmentation, region-of-interest (ROI) histogram analysis, spatial volume-of-interest (VOI) analysis, and voxel-wise correlation across modalities. To demonstrate the clinical applicability of this software, histogram analyses were performed on baseline and follow-up 18F-fluorodeoxyglucose (18F-FDG) PET images of nine patients with rhabdomyosarcoma enrolled in an institutional clinical trial at St. Jude Children's Research Hospital. In addition, we combined 18F-FDG PET, dynamic-contrast-enhanced (DCE) MR, and anatomical MR data to visualize the heterogeneity in tumor pathophysiology with the ultimate goal of adaptive targeting of regions with high tumor burden. Our software is able to simultaneously analyze multimodality images across multiple time points, which could greatly speed up the analysis of large-scale clinical trial data and validation of potential imaging biomarkers.

  18. Robust image region descriptor using local derivative ordinal binary pattern

    NASA Astrophysics Data System (ADS)

    Shang, Jun; Chen, Chuanbo; Pei, Xiaobing; Liang, Hu; Tang, He; Sarem, Mudar

    2015-05-01

    Binary image descriptors have received a lot of attention in recent years, since they provide numerous advantages, such as a low memory footprint and an efficient matching strategy. However, they rely on intermediate representations and are generally less discriminative than floating-point descriptors. We propose an image region descriptor, namely the local derivative ordinal binary pattern, for object recognition and image categorization. In order to preserve more local contrast and edge information, we adaptively quantize the intensity differences between the central pixels and their neighbors in the detected local affine covariant regions. These differences are then sorted, mapped into binary codes, and histogrammed, weighted by the sum of the absolute values of the differences. Furthermore, the gray level of the central pixel is quantized to further improve discriminative ability. Finally, we combine them to form a joint histogram representing the features of the image. We observe that our descriptor preserves more local brightness and edge information than traditional binary descriptors. Our descriptor is also robust to rotation, illumination variations, and other geometric transformations. We conduct extensive experiments on the standard ETHZ and Kentucky datasets for object recognition and on PASCAL for image classification. The experimental results show that our descriptor outperforms existing state-of-the-art methods.
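    A heavily simplified sketch in the spirit of the description: 8-neighbor sign codes histogrammed with the sum of absolute intensity differences as the weight. It omits the adaptive quantization, ordinal sorting, central-pixel gray-level quantization, and affine covariant region detection of the actual descriptor:

```python
import numpy as np

def weighted_binary_pattern_hist(img):
    """Per-pixel 8-bit neighbor-sign code, histogrammed with a weight equal
    to the sum of absolute differences (a crude proxy for local contrast)."""
    img = img.astype(float)
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    center = img[1:-1, 1:-1]
    codes = np.zeros_like(center, dtype=int)
    weights = np.zeros_like(center)
    for bit, (dy, dx) in enumerate(offsets):
        # Shifted view aligned with the interior pixels
        diff = img[1 + dy:img.shape[0] - 1 + dy,
                   1 + dx:img.shape[1] - 1 + dx] - center
        codes |= (diff >= 0).astype(int) << bit   # one sign bit per neighbor
        weights += np.abs(diff)                   # contrast-based weight
    hist = np.bincount(codes.ravel(), weights=weights.ravel(), minlength=256)
    return hist / (hist.sum() + 1e-12)            # normalized 256-bin descriptor
```

Weighting by absolute differences makes strong edges dominate the histogram, which is the intuition behind the contrast-preserving weighting the abstract describes.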

  19. Histogram Analysis of Diffusion Weighted Imaging at 3T is Useful for Prediction of Lymphatic Metastatic Spread, Proliferative Activity, and Cellularity in Thyroid Cancer.

    PubMed

    Schob, Stefan; Meyer, Hans Jonas; Dieckow, Julia; Pervinder, Bhogal; Pazaitis, Nikolaos; Höhn, Anne Kathrin; Garnov, Nikita; Horvath-Rizea, Diana; Hoffmann, Karl-Titus; Surov, Alexey

    2017-04-12

    Pre-surgical diffusion weighted imaging (DWI) is increasingly important in the context of thyroid cancer for identification of the optimal treatment strategy. It has exemplarily been shown that DWI at 3T can distinguish undifferentiated from well-differentiated thyroid carcinoma, which has decisive implications for the magnitude of surgery. This study used DWI histogram analysis of whole tumor apparent diffusion coefficient (ADC) maps. The primary aim was to discriminate thyroid carcinomas which had already gained the capacity to metastasize lymphatically from those not yet being able to spread via the lymphatic system. The secondary aim was to reflect prognostically important tumor-biological features like cellularity and proliferative activity with ADC histogram analysis. Fifteen patients with follicular-cell derived thyroid cancer were enrolled. Lymph node status, extent of infiltration of surrounding tissue, and Ki-67 and p53 expression were assessed in these patients. DWI was obtained in a 3T system using b values of 0, 400, and 800 s/mm². Whole tumor ADC volumes were analyzed using a histogram-based approach. Several ADC parameters showed significant correlations with immunohistopathological parameters. Most importantly, ADC histogram skewness and ADC histogram kurtosis were able to differentiate between nodal negative and nodal positive thyroid carcinoma. Histogram analysis of whole ADC tumor volumes has the potential to provide valuable information on tumor biology in thyroid carcinoma. However, further studies are warranted.

  20. Histogram Analysis of Diffusion Weighted Imaging at 3T is Useful for Prediction of Lymphatic Metastatic Spread, Proliferative Activity, and Cellularity in Thyroid Cancer

    PubMed Central

    Schob, Stefan; Meyer, Hans Jonas; Dieckow, Julia; Pervinder, Bhogal; Pazaitis, Nikolaos; Höhn, Anne Kathrin; Garnov, Nikita; Horvath-Rizea, Diana; Hoffmann, Karl-Titus; Surov, Alexey

    2017-01-01

    Pre-surgical diffusion weighted imaging (DWI) is increasingly important in the context of thyroid cancer for identification of the optimal treatment strategy. It has exemplarily been shown that DWI at 3T can distinguish undifferentiated from well-differentiated thyroid carcinoma, which has decisive implications for the magnitude of surgery. This study used DWI histogram analysis of whole tumor apparent diffusion coefficient (ADC) maps. The primary aim was to discriminate thyroid carcinomas which had already gained the capacity to metastasize lymphatically from those not yet being able to spread via the lymphatic system. The secondary aim was to reflect prognostically important tumor-biological features like cellularity and proliferative activity with ADC histogram analysis. Fifteen patients with follicular-cell derived thyroid cancer were enrolled. Lymph node status, extent of infiltration of surrounding tissue, and Ki-67 and p53 expression were assessed in these patients. DWI was obtained in a 3T system using b values of 0, 400, and 800 s/mm². Whole tumor ADC volumes were analyzed using a histogram-based approach. Several ADC parameters showed significant correlations with immunohistopathological parameters. Most importantly, ADC histogram skewness and ADC histogram kurtosis were able to differentiate between nodal negative and nodal positive thyroid carcinoma. Conclusions: Histogram analysis of whole ADC tumor volumes has the potential to provide valuable information on tumor biology in thyroid carcinoma. However, further studies are warranted. PMID:28417929

  1. Enhancing tumor apparent diffusion coefficient histogram skewness stratifies the postoperative survival in recurrent glioblastoma multiforme patients undergoing salvage surgery.

    PubMed

    Zolal, Amir; Juratli, Tareq A; Linn, Jennifer; Podlesek, Dino; Sitoci Ficici, Kerim Hakan; Kitzler, Hagen H; Schackert, Gabriele; Sobottka, Stephan B; Rieger, Bernhard; Krex, Dietmar

    2016-05-01

    Objective: To determine the value of apparent diffusion coefficient (ADC) histogram parameters for the prediction of individual survival in patients undergoing surgery for recurrent glioblastoma (GBM) in a retrospective cohort study. Methods: Thirty-one patients who underwent surgery for first recurrence of a known GBM between 2008 and 2012 were included. The following parameters were collected: age, sex, enhancing tumor size, mean ADC, median ADC, ADC skewness, ADC kurtosis and fifth percentile of the ADC histogram, initial progression free survival (PFS), extent of second resection and further adjuvant treatment. The association of these parameters with survival and PFS after second surgery was analyzed using the log-rank test and Cox regression. Results: Using the log-rank test, ADC histogram skewness of the enhancing tumor was significantly associated with both survival (p = 0.001) and PFS after second surgery (p = 0.005). Further parameters associated with prolonged survival after second surgery were: gross total resection at second surgery (p = 0.026), tumor size (p = 0.040) and third surgery (p = 0.003). In the multivariate Cox analysis, ADC histogram skewness was shown to be an independent prognostic factor for survival after second surgery. Conclusion: ADC histogram skewness of the enhancing lesion, enhancing lesion size, third surgery, as well as gross total resection have been shown to be associated with survival following the second surgery. ADC histogram skewness was an independent prognostic factor for survival in the multivariate analysis.

  2. Quality Improvement of Liver Ultrasound Images Using Fuzzy Techniques.

    PubMed

    Bayani, Azadeh; Langarizadeh, Mostafa; Radmard, Amir Reza; Nejad, Ahmadreza Farzaneh

    2016-12-01

    Liver ultrasound images are common and are often applied to diagnose diffuse liver diseases such as fatty liver. However, the low quality of such images makes it difficult to analyze them and diagnose diseases. The purpose of this study, therefore, is to improve the contrast and quality of liver ultrasound images. In this study, a number of image contrast enhancement algorithms based on fuzzy logic were applied to liver ultrasound images - in which the view of the kidney is observable - using Matlab2013b to improve the image contrast and quality, which has a fuzzy definition: contrast improvement using a fuzzy intensification operator, contrast improvement applying fuzzy image histogram hyperbolization, and contrast improvement by fuzzy IF-THEN rules. With the measurement of Mean Squared Error and Peak Signal to Noise Ratio obtained from different images, fuzzy methods provided better results, and their implementation - compared with the histogram equalization method - led both to the improvement of contrast and visual quality of images and to the improvement of the results of liver segmentation algorithms. Comparison of the four algorithms revealed the power of fuzzy logic in improving image contrast compared with traditional image processing algorithms. Moreover, the contrast improvement algorithm based on a fuzzy intensification operator was selected as the strongest algorithm considering the measured indicators. This method can also be used in future studies on other ultrasound images for quality improvement and in other image processing and analysis applications.
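    One of the operator families compared here, fuzzy intensification, can be sketched as the classical INT operator: gray levels are mapped to [0, 1] memberships and pushed away from 0.5, darkening the dark half and brightening the bright half. This is a generic textbook version, not necessarily the exact operator implemented in the study.

```python
import numpy as np

def fuzzy_intensify(img, iterations=1):
    """Contrast enhancement with the classical fuzzy intensification
    (INT) operator: memberships below 0.5 are mapped to 2*mu^2,
    memberships above 0.5 to 1 - 2*(1-mu)^2. Returns values in [0, 1]."""
    rng_ = img.max() - img.min()
    mu = (img.astype(np.float64) - img.min()) / (rng_ + 1e-12)
    for _ in range(iterations):
        mu = np.where(mu <= 0.5,
                      2.0 * mu ** 2,
                      1.0 - 2.0 * (1.0 - mu) ** 2)
    return mu
```

    Repeated application drives memberships toward 0 or 1, so `iterations` controls the strength of the enhancement.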

  3. Quality Improvement of Liver Ultrasound Images Using Fuzzy Techniques

    PubMed Central

    Bayani, Azadeh; Langarizadeh, Mostafa; Radmard, Amir Reza; Nejad, Ahmadreza Farzaneh

    2016-01-01

    Background: Liver ultrasound images are common and are often applied to diagnose diffuse liver diseases such as fatty liver. However, the low quality of such images makes it difficult to analyze them and diagnose diseases. The purpose of this study, therefore, is to improve the contrast and quality of liver ultrasound images. Methods: In this study, a number of image contrast enhancement algorithms based on fuzzy logic were applied to liver ultrasound images - in which the view of the kidney is observable - using Matlab2013b to improve the image contrast and quality, which has a fuzzy definition: contrast improvement using a fuzzy intensification operator, contrast improvement applying fuzzy image histogram hyperbolization, and contrast improvement by fuzzy IF-THEN rules. Results: With the measurement of Mean Squared Error and Peak Signal to Noise Ratio obtained from different images, fuzzy methods provided better results, and their implementation - compared with the histogram equalization method - led both to the improvement of contrast and visual quality of images and to the improvement of the results of liver segmentation algorithms. Conclusion: Comparison of the four algorithms revealed the power of fuzzy logic in improving image contrast compared with traditional image processing algorithms. Moreover, the contrast improvement algorithm based on a fuzzy intensification operator was selected as the strongest algorithm considering the measured indicators. This method can also be used in future studies on other ultrasound images for quality improvement and in other image processing and analysis applications. PMID:28077898

  4. Histogram-based normalization technique on human brain magnetic resonance images from different acquisitions.

    PubMed

    Sun, Xiaofei; Shi, Lin; Luo, Yishan; Yang, Wei; Li, Hongpeng; Liang, Peipeng; Li, Kuncheng; Mok, Vincent C T; Chu, Winnie C W; Wang, Defeng

    2015-07-28

    Intensity normalization is an important preprocessing step in brain magnetic resonance image (MRI) analysis. During MR image acquisition, different scanners or parameters would be used for scanning different subjects or the same subject at a different time, which may result in large intensity variations. This intensity variation will greatly undermine the performance of subsequent MRI processing and population analysis, such as image registration, segmentation, and tissue volume measurement. In this work, we proposed a new histogram normalization method to reduce the intensity variation between MRIs obtained from different acquisitions. In our experiment, we scanned each subject twice on two different scanners using different imaging parameters. With noise estimation, the image with the lower noise level was determined and treated as the high-quality reference image. Then the histogram of the low-quality image was normalized to the histogram of the high-quality image. The normalization algorithm includes two main steps: (1) intensity scaling (IS), where, for the high-quality reference image, the intensities of the image are first rescaled to a range between the low intensity region (LIR) value and the high intensity region (HIR) value; and (2) histogram normalization (HN), where the histogram of the low-quality input image is stretched to match the histogram of the reference image, so that the intensity range in the normalized image will also lie between LIR and HIR. We performed three sets of experiments to evaluate the proposed method, i.e., image registration, segmentation, and tissue volume measurement, and compared it with an existing intensity normalization method. The results validate that our histogram normalization framework achieves better results in all the experiments. It is also demonstrated that the brain template with normalization preprocessing is of higher quality than the template with no normalization processing.
We have proposed a histogram-based MRI intensity normalization method. The method can normalize scans which were acquired on different MRI units. We have validated that the method can greatly improve the image analysis performance. Furthermore, it is demonstrated that with the help of our normalization method, we can create a higher quality Chinese brain template.
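    The histogram normalization (HN) step described above can be sketched as standard CDF-based histogram matching; the LIR/HIR intensity scaling step is a plain linear stretch and is omitted here. This is a generic reconstruction from the abstract, not the authors' code.

```python
import numpy as np

def match_histogram(source, reference):
    """Histogram matching via cumulative distribution functions: each
    source intensity is mapped to the reference intensity with the
    same cumulative rank, so the output histogram approximates the
    reference histogram and stays within its intensity range."""
    s = np.asarray(source, dtype=np.float64).ravel()
    r = np.asarray(reference, dtype=np.float64).ravel()
    s_sorted = np.sort(s)
    s_cdf = np.searchsorted(s_sorted, s, side='right') / s.size
    r_sorted = np.sort(r)
    r_cdf = np.arange(1, r.size + 1) / r.size
    matched = np.interp(s_cdf, r_cdf, r_sorted)
    return matched.reshape(np.shape(source))
```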

  5. Object tracking with adaptive HOG detector and adaptive Rao-Blackwellised particle filter

    NASA Astrophysics Data System (ADS)

    Rosa, Stefano; Paleari, Marco; Ariano, Paolo; Bona, Basilio

    2012-01-01

    Scenarios for a manned mission to the Moon or Mars call for astronaut teams to be accompanied by semiautonomous robots. A prerequisite for human-robot interaction is the capability of successfully tracking humans and objects in the environment. In this paper we present a system for real-time visual object tracking in 2D images for mobile robotic systems. The proposed algorithm is able to specialize to individual objects and to adapt to substantial changes in illumination and object appearance during tracking. The algorithm is composed of two main blocks: a detector based on Histogram of Oriented Gradient (HOG) descriptors and linear Support Vector Machines (SVM), and a tracker implemented by an adaptive Rao-Blackwellised particle filter (RBPF). The SVM is re-trained online on new samples taken from previous predicted positions. We use the effective sample size to decide when the classifier needs to be re-trained. Position hypotheses for the tracked object are the result of a clustering procedure applied to the set of particles. The algorithm has been tested on challenging video sequences presenting strong changes in object appearance, illumination, and occlusion. Experimental tests show that the presented method is able to achieve near real-time performance with a precision of about 7 pixels on standard video sequences of dimensions 320 × 240.
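    The re-training criterion mentioned above, the effective sample size of the particle set, is a standard quantity in particle filtering; a minimal sketch:

```python
import numpy as np

def effective_sample_size(weights):
    """Effective sample size of a particle set: ESS = 1 / sum(w_i^2)
    for normalized weights. It equals N for uniform weights and
    approaches 1 as the weight mass concentrates on one particle,
    signalling degeneracy (here, a trigger for re-training)."""
    w = np.asarray(weights, dtype=np.float64)
    w = w / w.sum()
    return 1.0 / np.sum(w ** 2)
```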

  6. Spatio-Temporal Equalizer for a Receiving-Antenna Feed Array

    NASA Technical Reports Server (NTRS)

    Mukai, Ryan; Lee, Dennis; Vilnrotter, Victor

    2010-01-01

    A spatio-temporal equalizer has been conceived as an improved means of suppressing multipath effects in the reception of aeronautical telemetry signals, and may be adaptable to radar and aeronautical communication applications as well. This equalizer would be an integral part of a system that would also include a seven-element planar array of receiving feed horns centered at the focal point of a paraboloidal antenna that would be nominally aimed at or near the aircraft that would be the source of the signal that one seeks to receive (see Figure 1). This spatio-temporal equalizer would consist mostly of a bank of seven adaptive finite-impulse-response (FIR) filters - one for each element in the array - and the outputs of the filters would be summed (see Figure 2). The combination of the spatial diversity of the feed-horn array and the temporal diversity of the filter bank would afford better multipath-suppression performance than is achievable by means of temporal equalization alone. The seven-element feed array would supplant the single feed horn used in a conventional paraboloidal ground telemetry-receiving antenna. The radio-frequency telemetry signals received by the seven elements of the array would be digitized, converted to complex baseband form, and sent to the FIR filter bank, which would adapt itself in real time to enable reception of telemetry at a low bit error rate, even in the presence of multipath of the type found at many flight test ranges.
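    The filter-bank-and-sum structure described above can be sketched as follows. The array geometry, baseband conversion, and the adaptation of the taps are not shown, and the function name is ours.

```python
import numpy as np

def spatio_temporal_combine(channels, fir_taps):
    """Sum of per-element FIR filter outputs: `channels` is
    (n_elements, n_samples) baseband data and `fir_taps` is
    (n_elements, n_taps). Each channel passes through its own causal
    FIR filter and the filtered signals are summed."""
    channels = np.asarray(channels, dtype=np.float64)
    out = np.zeros(channels.shape[1])
    for x, h in zip(channels, fir_taps):
        # full convolution truncated to the input length (causal FIR)
        out += np.convolve(x, h)[:channels.shape[1]]
    return out
```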

  7. A Two-Stage Approach for Improving the Convergence of Least-Mean-Square Adaptive Decision-Feedback Equalizers in the Presence of Severe Narrowband Interference

    NASA Astrophysics Data System (ADS)

    Batra, Arun; Zeidler, James R.; Beex, A. A. Louis

    2007-12-01

    It has previously been shown that a least-mean-square (LMS) decision-feedback filter can mitigate the effect of narrowband interference (L.-M. Li and L. Milstein, 1983). An adaptive implementation of the filter was shown to converge relatively quickly for mild interference. It is shown here, however, that in the case of severe narrowband interference, the LMS decision-feedback equalizer (DFE) requires a very large number of training symbols for convergence, making it unsuitable for some types of communication systems. This paper investigates the introduction of an LMS prediction-error filter (PEF) as a prefilter to the equalizer and demonstrates that it reduces the convergence time of the two-stage system by as much as two orders of magnitude. It is also shown that the steady-state bit-error rate (BER) performance of the proposed system is still approximately equal to that attained in steady-state by the LMS DFE-only. Finally, it is shown that the two-stage system can be implemented without the use of training symbols. This two-stage structure lowers the complexity of the overall system by reducing the number of filter taps that need to be adapted, while incurring a slight loss in the steady-state BER.

  8. Statistical efficiency of adaptive algorithms.

    PubMed

    Widrow, Bernard; Kamenetsky, Max

    2003-01-01

    The statistical efficiency of a learning algorithm applied to the adaptation of a given set of variable weights is defined as the ratio of the quality of the converged solution to the amount of data used in training the weights. Statistical efficiency is computed by averaging over an ensemble of learning experiences. A high quality solution is very close to optimal, while a low quality solution corresponds to noisy weights and less than optimal performance. In this work, two gradient descent adaptive algorithms are compared, the LMS algorithm and the LMS/Newton algorithm. LMS is simple and practical, and is used in many applications worldwide. LMS/Newton is based on Newton's method and the LMS algorithm. LMS/Newton is optimal in the least squares sense. It maximizes the quality of its adaptive solution while minimizing the use of training data. Many least squares adaptive algorithms have been devised over the years, but no other least squares algorithm can give better performance, on average, than LMS/Newton. LMS is easily implemented, but LMS/Newton, although of great mathematical interest, cannot be implemented in most practical applications. Because of its optimality, LMS/Newton serves as a benchmark for all least squares adaptive algorithms. The performances of LMS and LMS/Newton are compared, and it is found that under many circumstances, both algorithms provide equal performance. For example, when both algorithms are tested with statistically nonstationary input signals, their average performances are equal. When adapting with stationary input signals and with random initial conditions, their respective learning times are on average equal. However, under worst-case initial conditions, the learning time of LMS can be much greater than that of LMS/Newton, and this is the principal disadvantage of the LMS algorithm. But the strong points of LMS are ease of implementation and optimal performance under important practical conditions. 
For these reasons, the LMS algorithm has enjoyed very widespread application. It is used in almost every modem for channel equalization and echo cancelling. Furthermore, it is related to the famous backpropagation algorithm used for training neural networks.
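    For reference, the LMS update discussed above is the textbook stochastic gradient descent on the instantaneous squared error. The sketch below identifies a short FIR channel from input/output data; it is a generic illustration of the algorithm, not the paper's experimental setup.

```python
import numpy as np

def lms_identify(x, d, n_taps, mu):
    """Textbook LMS adaptation: at each step the tap vector moves
    along the negative instantaneous gradient of the squared error
    e^2, i.e. w <- w + 2*mu*e*u."""
    w = np.zeros(n_taps)
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]  # [x[n], x[n-1], ...]
        e = d[n] - w @ u                   # a priori error
        w += 2.0 * mu * e * u              # steepest-descent update
    return w
```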

  9. Neural network with dynamically adaptable neurons

    NASA Technical Reports Server (NTRS)

    Tawel, Raoul (Inventor)

    1994-01-01

    This invention is an adaptive neuron for use in neural network processors. The adaptive neuron participates in the supervised learning phase of operation on a co-equal basis with the synapse matrix elements by adaptively changing its gain, in a manner similar to the change of weights in the synapse elements. In this manner, training time is decreased by as much as three orders of magnitude.

  10. Underwater wireless optical MIMO system with spatial modulation and adaptive power allocation

    NASA Astrophysics Data System (ADS)

    Huang, Aiping; Tao, Linwei; Niu, Yilong

    2018-04-01

    In this paper, we investigate the performance of an underwater wireless optical multiple-input multiple-output communication system combining spatial modulation (SM-UOMIMO) with flag dual amplitude pulse position modulation (FDAPPM). Channel impulse responses for coastal and harbor ocean water links are obtained by Monte Carlo (MC) simulation. Moreover, we obtain closed-form and upper-bound average bit error rate (BER) expressions for receiver diversity, including optical combining, equal gain combining, and selected combining. A novel adaptive power allocation algorithm (PAA) is also proposed to minimize the average BER of the SM-UOMIMO system. Our numerical results indicate an excellent match between the analytical results and numerical simulations, which confirms the accuracy of our derived expressions. Furthermore, the results show that the adaptive PAA clearly outperforms both conventional fixed-factor PAA and equal PAA. A multiple-input single-output system with adaptive PAA even obtains better BER performance than the MIMO one, while effectively reducing receiver complexity.

  11. Value of MR histogram analyses for prediction of microvascular invasion of hepatocellular carcinoma.

    PubMed

    Huang, Ya-Qin; Liang, He-Yue; Yang, Zhao-Xia; Ding, Ying; Zeng, Meng-Su; Rao, Sheng-Xiang

    2016-06-01

    The objective is to explore the value of preoperative magnetic resonance (MR) histogram analyses in predicting microvascular invasion (MVI) of hepatocellular carcinoma (HCC). Fifty-one patients with histologically confirmed HCC who underwent diffusion-weighted and contrast-enhanced MR imaging were included. Histogram analyses were performed and the mean, variance, skewness, kurtosis, and 1st, 10th, 50th, 90th, and 99th percentiles were derived. Quantitative histogram parameters were compared between HCCs with and without MVI. Receiver operating characteristic (ROC) analyses were generated to compare the diagnostic performance of tumor size, histogram analyses of apparent diffusion coefficient (ADC) maps, and MR enhancement. The mean and the 1st, 10th, and 50th percentiles of the ADC maps, and the mean, variance, and 1st, 10th, 50th, 90th, and 99th percentiles of the portal venous phase (PVP) images were significantly different between the groups with and without MVI (P < 0.05), with areas under the ROC curves (AUCs) of 0.66 to 0.74 for ADC and 0.76 to 0.88 for PVP. The largest AUC of PVP (1st percentile) showed significantly higher accuracy compared with that of the arterial phase (AP) or tumor size (P < 0.001). MR histogram analyses - in particular, the 1st percentile of the PVP images - held promise for prediction of MVI of HCC.
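    The diagnostic performance comparison above relies on the area under the ROC curve, which for a single scalar parameter equals the Mann-Whitney U statistic normalized by the number of case pairs; a minimal sketch:

```python
import numpy as np

def roc_auc(scores_pos, scores_neg):
    """AUC via the Mann-Whitney U statistic: the fraction of
    (positive, negative) pairs ranked correctly, with ties counted
    as half. Flip the sign of the scores if a lower parameter value
    indicates the positive class."""
    pos = np.asarray(scores_pos, dtype=np.float64)
    neg = np.asarray(scores_neg, dtype=np.float64)
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (pos.size * neg.size)
```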

  12. Digital equalization of time-delay array receivers on coherent laser communications.

    PubMed

    Belmonte, Aniceto

    2017-01-15

    Field conjugation arrays use adaptive combining techniques on multi-aperture receivers to improve the performance of coherent laser communication links by mitigating the consequences of atmospheric turbulence on the down-converted coherent power. However, this motivates the use of complex receivers as optical signals collected by different apertures need to be adaptively processed, co-phased, and scaled before they are combined. Here, we show that multiple apertures, coupled with optical delay lines, combine retarded versions of a signal at a single coherent receiver, which uses digital equalization to obtain diversity gain against atmospheric fading. We found in our analysis that, instead of field conjugation arrays, digital equalization of time-delay multi-aperture receivers is a simpler and more versatile approach to accomplish reduction of atmospheric fading.

  13. Histogram analysis of apparent diffusion coefficient maps for differentiating primary CNS lymphomas from tumefactive demyelinating lesions.

    PubMed

    Lu, Shan Shan; Kim, Sang Joon; Kim, Namkug; Kim, Ho Sung; Choi, Choong Gon; Lim, Young Min

    2015-04-01

    This study intended to investigate the usefulness of histogram analysis of apparent diffusion coefficient (ADC) maps for discriminating primary CNS lymphomas (PCNSLs), especially atypical PCNSLs, from tumefactive demyelinating lesions (TDLs). Forty-seven patients with PCNSLs and 18 with TDLs were enrolled in our study. Hyperintense lesions seen on T2-weighted images were defined as ROIs after ADC maps were registered to the corresponding T2-weighted image. ADC histograms were calculated from the ROIs containing the entire lesion on every section and on a voxel-by-voxel basis. The ADC histogram parameters were compared among all PCNSLs and TDLs as well as between the subgroup of atypical PCNSLs and TDLs. ROC curves were constructed to evaluate the diagnostic performance of the histogram parameters and to determine the optimum thresholds. The differences between the PCNSLs and TDLs were found in the minimum ADC values (ADCmin) and in the 5th and 10th percentiles (ADC5% and ADC10%) of the cumulative ADC histograms. However, no statistical significance was found in the mean ADC value or in the ADC values concerning the mode, kurtosis, and skewness. The ADCmin, ADC5%, and ADC10% were also lower in atypical PCNSLs than in TDLs. ADCmin was the best indicator for discriminating atypical PCNSLs from TDLs, with a threshold of 556 × 10⁻⁶ mm²/s (sensitivity, 81.3%; specificity, 88.9%). Histogram analysis of ADC maps may help to discriminate PCNSLs from TDLs and may be particularly useful in differentiating atypical PCNSLs from TDLs.

  14. Assessment of histological differentiation in gastric cancers using whole-volume histogram analysis of apparent diffusion coefficient maps.

    PubMed

    Zhang, Yujuan; Chen, Jun; Liu, Song; Shi, Hua; Guan, Wenxian; Ji, Changfeng; Guo, Tingting; Zheng, Huanhuan; Guan, Yue; Ge, Yun; He, Jian; Zhou, Zhengyang; Yang, Xiaofeng; Liu, Tian

    2017-02-01

    To investigate the efficacy of histogram analysis of the entire tumor volume in apparent diffusion coefficient (ADC) maps for differentiating between histological grades in gastric cancer. Seventy-eight patients with gastric cancer were enrolled in a retrospective 3.0T magnetic resonance imaging (MRI) study. ADC maps were obtained at two different b values (0 and 1000 s/mm²) for each patient. Tumors were delineated on each slice of the ADC maps, and a histogram for the entire tumor volume was subsequently generated. A series of histogram parameters (eg, skew and kurtosis) were calculated and correlated with the histological grade of the surgical specimen. The diagnostic performance of each parameter for distinguishing poorly from moderately well-differentiated gastric cancers was assessed by using the area under the receiver operating characteristic curve (AUC). There were significant differences in the 5th, 10th, 25th, and 50th percentiles, skew, and kurtosis between poorly and well-differentiated gastric cancers (P < 0.05). There were correlations between the degrees of differentiation and histogram parameters, including the 10th percentile, skew, kurtosis, and max frequency; the correlation coefficients were 0.273, -0.361, -0.339, and -0.370, respectively. Among all the histogram parameters, the max frequency had the largest AUC value, which was 0.675. Histogram analysis of the ADC maps on the basis of the entire tumor volume can be useful in differentiating between histological grades for gastric cancer. Level of Evidence: 4. J. Magn. Reson. Imaging 2017;45:440-449. © 2016 International Society for Magnetic Resonance in Medicine.

  15. Macronuclear chromatin structure dynamics in Colpoda inflata (Protista, Ciliophora) resting encystment.

    PubMed

    Tiano, L; Chessa, M G; Carrara, S; Tagliafierro, G; Delmonte Corrado, M U

    1999-01-01

    The chromatin structure dynamics of the Colpoda inflata macronucleus have been investigated in relation to its functional condition, concerning chromatin body extrusion regulating activity. Samples of 2- and 25-day-old resting cysts derived from a standard culture, and of 1-year-old resting cysts derived from a senescent culture, were examined by means of histogram analysis performed on acquired optical microscopy images. Three groups of histograms were detected in each sample. Histogram classification, clustering and matching were assessed in order to obtain the mean histogram of each group. Comparative analysis of the mean histogram showed a similarity in the grey level range of 25-day- and 1-year-old cysts, unlike the wider grey level range found in 2-day-old cysts. Moreover, the respective mean histograms of the three cyst samples appeared rather similar in shape. All this implies that macronuclear chromatin structural features of 1-year-old cysts are common to both cyst standard cultures. The evaluation of the acquired images and their respective histograms evidenced a dynamic state of the macronuclear chromatin, appearing differently condensed in relation to the chromatin body extrusion regulating activity of the macronucleus. The coexistence of a chromatin-decondensed macronucleus with a pycnotic extrusion body suggests that chromatin unable to decondense, thus inactive, is extruded. This finding, along with the presence of chromatin structural features common to standard and senescent cyst populations, supports the occurrence of 'rejuvenated' cell lines from 1-year-old encysted senescent cells, a phenomenon which could be a result of accomplished macronuclear renewal.

  16. Non-small cell lung cancer: Whole-lesion histogram analysis of the apparent diffusion coefficient for assessment of tumor grade, lymphovascular invasion and pleural invasion

    PubMed Central

    Tsuchiya, Naoko; Doai, Mariko; Usuda, Katsuo; Uramoto, Hidetaka

    2017-01-01

    Purpose: To investigate the diagnostic accuracy of histogram analyses of apparent diffusion coefficient (ADC) values for determining non-small cell lung cancer (NSCLC) tumor grades, lymphovascular invasion, and pleural invasion. Materials and methods: We studied 60 surgically diagnosed NSCLC patients. Diffusion-weighted imaging (DWI) was performed in the axial plane using a navigator-triggered single-shot, echo-planar imaging sequence with prospective acquisition correction. The ADC maps were generated, and we placed a volume-of-interest on the tumor to construct the whole-lesion histogram. Using the histogram, we calculated the mean, 5th, 10th, 25th, 50th, 75th, 90th, and 95th percentiles of ADC, skewness, and kurtosis. Histogram parameters were correlated with tumor grade, lymphovascular invasion, and pleural invasion. We performed a receiver operating characteristic (ROC) analysis to assess the diagnostic performance of histogram parameters for distinguishing different pathologic features. Results: The ADC mean and 10th, 25th, 50th, 75th, 90th, and 95th percentiles showed significant differences among the tumor grades. The ADC mean and 25th, 50th, 75th, 90th, and 95th percentiles were significant histogram parameters between high- and low-grade tumors. The ROC analysis between high- and low-grade tumors showed that the 95th percentile ADC achieved the highest area under the curve (AUC) at 0.74. Lymphovascular invasion was associated with the ADC mean, 50th, 75th, 90th, and 95th percentiles, skewness, and kurtosis. Kurtosis achieved the highest AUC at 0.809. Pleural invasion was only associated with skewness, with an AUC of 0.648. Conclusions: ADC histogram analyses on the basis of the entire tumor volume are able to stratify NSCLC tumor grade, lymphovascular invasion, and pleural invasion. PMID:28207858

  17. SAMBA: Sparse Approximation of Moment-Based Arbitrary Polynomial Chaos

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahlfeld, R., E-mail: r.ahlfeld14@imperial.ac.uk; Belkouchi, B.; Montomoli, F.

    2016-09-01

    A new arbitrary Polynomial Chaos (aPC) method is presented for moderately high-dimensional problems characterised by limited input data availability. The proposed methodology improves the algorithm of aPC and extends the method, which was previously only introduced as a tensor product expansion, to moderately high-dimensional stochastic problems. The fundamental idea of aPC is to use the statistical moments of the input random variables to develop the polynomial chaos expansion. This approach provides the possibility to propagate continuous or discrete probability density functions and also histograms (data sets) as long as their moments exist, are finite and the determinant of the moment matrix is strictly positive. For cases with limited data availability, this approach avoids bias and fitting errors caused by wrong assumptions. In this work, an alternative way to calculate the aPC is suggested, which provides the optimal polynomials, Gaussian quadrature collocation points and weights from the moments using only a handful of matrix operations on the Hankel matrix of moments. It can therefore be implemented without requiring prior knowledge about statistical data analysis or a detailed understanding of the mathematics of polynomial chaos expansions. The extension to more input variables suggested in this work is an anisotropic and adaptive version of Smolyak's algorithm that is solely based on the moments of the input probability distributions. It is referred to as SAMBA (PC), which is short for Sparse Approximation of Moment-Based Arbitrary Polynomial Chaos. It is illustrated that for moderately high-dimensional problems (up to 20 different input variables or histograms) SAMBA can significantly simplify the calculation of sparse Gaussian quadrature rules. SAMBA's efficiency for multivariate functions with regard to data availability is further demonstrated by analysing higher order convergence and accuracy for a set of nonlinear test functions with 2, 5 and 10 different input distributions or histograms.
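    The "handful of matrix operations on the Hankel matrix of moments" corresponds to the standard moment-based Golub-Welsch construction, sketched below under the assumption that the supplied moments define a positive-definite Hankel matrix: Cholesky-factor the Hankel matrix, read off the three-term recurrence coefficients, and eigen-decompose the resulting Jacobi matrix to obtain quadrature nodes and weights.

```python
import numpy as np

def quadrature_from_moments(moments):
    """Gauss quadrature nodes/weights from raw moments m_0..m_{2n}.

    The Hankel matrix of moments is Cholesky-factored; its upper
    factor yields the recurrence coefficients of the orthogonal
    polynomials, and the eigenvalues of the tridiagonal Jacobi matrix
    are the nodes, with weights from the first eigenvector row."""
    m = np.asarray(moments, dtype=np.float64)
    n = (len(m) - 1) // 2
    H = np.array([[m[i + j] for j in range(n + 1)] for i in range(n + 1)])
    R = np.linalg.cholesky(H).T        # upper-triangular factor
    d = np.diag(R)
    alpha = np.zeros(n)
    alpha[0] = R[0, 1] / d[0]
    for k in range(1, n):
        alpha[k] = R[k, k + 1] / d[k] - R[k - 1, k] / d[k - 1]
    beta = d[1:n + 1] / d[:n]
    J = np.diag(alpha) + np.diag(beta[:n - 1], 1) + np.diag(beta[:n - 1], -1)
    nodes, V = np.linalg.eigh(J)
    weights = m[0] * V[0, :] ** 2
    return nodes, weights
```

    For example, the moments 1, 1/2, 1/3, 1/4, 1/5 of the uniform density on [0, 1] recover the two-point Gauss-Legendre rule on that interval.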

  18. Performance Evaluation of EnKF-based Hydrogeological Site Characterization using Color Coherent Vectors

    NASA Astrophysics Data System (ADS)

    Moslehi, M.; de Barros, F.

    2017-12-01

    Complexity of hydrogeological systems arises from multi-scale heterogeneity and insufficient measurements of their underlying parameters, such as hydraulic conductivity and porosity. An inadequate characterization of hydrogeological properties can significantly decrease the trustworthiness of numerical models that predict groundwater flow and solute transport. Therefore, a variety of data assimilation methods have been proposed in order to estimate hydrogeological parameters from spatially scarce data by incorporating the governing physical models. In this work, we propose a novel framework for evaluating the performance of these estimation methods. We focus on the Ensemble Kalman Filter (EnKF) approach, a widely used data assimilation technique that reconciles multiple sources of measurements to sequentially estimate model parameters such as the hydraulic conductivity. Several methods have been used in the literature to quantify the accuracy of the estimations obtained by EnKF, including rank histograms, RMSE and ensemble spread. However, these commonly used methods do not account for the spatial information and variability of geological formations, which can cause hydraulic conductivity fields with very different spatial structures to have similar histograms or RMSE. We propose a vision-based approach that quantifies the accuracy of estimations by considering the spatial structure embedded in the estimated fields. Our new approach consists of adapting a metric from computer vision, Color Coherent Vectors (CCV), to evaluate the accuracy of the fields estimated by EnKF. CCV is a histogram-based technique for comparing images that incorporates spatial information. We represent estimated fields as digital three-channel images and use CCV to compare and quantify the accuracy of estimations. The sensitivity of CCV to spatial information makes it a suitable metric for assessing the performance of spatial data assimilation techniques. Under various data assimilation settings, such as the number, layout, and type of measurements, we compare the performance of CCV with other metrics such as RMSE. By simulating hydrogeological processes using estimated and true fields, we observe that CCV outperforms other existing evaluation metrics.
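
The coherence idea behind CCV can be sketched concretely. The following is a minimal, illustrative implementation (not the authors' code; the 4-connectivity, the colour-label input format, and the coherence threshold `tau` are our assumptions): for each colour, pixels belonging to a connected region of at least `tau` pixels count as coherent, the rest as incoherent.

```python
from collections import deque

def color_coherence_vector(img, tau=4):
    """Simplified Color Coherence Vector: for each colour label, split the
    pixels into 'coherent' (lying in a 4-connected region of size >= tau)
    and 'incoherent' ones. img is a 2-D list of colour labels."""
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]
    ccv = {}
    for i in range(h):
        for j in range(w):
            if seen[i][j]:
                continue
            colour = img[i][j]
            # flood-fill the 4-connected region of this colour
            region, q = [], deque([(i, j)])
            seen[i][j] = True
            while q:
                y, x = q.popleft()
                region.append((y, x))
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if (0 <= ny < h and 0 <= nx < w and not seen[ny][nx]
                            and img[ny][nx] == colour):
                        seen[ny][nx] = True
                        q.append((ny, nx))
            coh, inc = ccv.get(colour, (0, 0))
            if len(region) >= tau:
                coh += len(region)
            else:
                inc += len(region)
            ccv[colour] = (coh, inc)
    return ccv

# Two fields with identical colour histograms but different spatial structure
blocky = [[0, 0, 1, 1],
          [0, 0, 1, 1],
          [1, 1, 0, 0],
          [1, 1, 0, 0]]
checker = [[0, 1, 0, 1],
           [1, 0, 1, 0],
           [0, 1, 0, 1],
           [1, 0, 1, 0]]
ccv_a = color_coherence_vector(blocky)
ccv_b = color_coherence_vector(checker)
```

The two 4x4 fields have identical colour histograms (eight pixels of each label), yet their CCVs differ sharply: the blocky field's pixels are all coherent, the checkerboard's all incoherent. That spatial sensitivity is exactly what plain histograms and RMSE miss.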

  19. SU-F-J-94: Development of a Plug-in Based Image Analysis Tool for Integration Into Treatment Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Owen, D; Anderson, C; Mayo, C

    Purpose: To extend the functionality of a commercial treatment planning system (TPS) to support (i) direct use of quantitative image-based metrics within treatment plan optimization and (ii) evaluation of dose-functional volume relationships to assist in functional image adaptive radiotherapy. Methods: A script was written that interfaces with a commercial TPS via an Application Programming Interface (API). The script executes a program that performs dose-functional volume analyses. Written in C#, the script reads the dose grid and correlates it with image data on a voxel-by-voxel basis through API extensions that can access registration transforms. A user interface was designed through WinForms to input parameters and display results. To test the performance of this program, image- and dose-based metrics computed from perfusion SPECT images aligned to the treatment planning CT were generated, validated, and compared. Results: The integration of image analysis information was successfully implemented as a plug-in to a commercial TPS. Perfusion SPECT images were used to validate the calculation and display of image-based metrics as well as dose-intensity metrics and histograms for defined structures on the treatment planning CT. Various biological dose correction models, custom image-based metrics, dose-intensity computations, and dose-intensity histograms were applied to analyze the image-dose profile. Conclusion: It is possible to add image analysis features to commercial TPSs through custom scripting applications. A tool was developed to enable the evaluation of image-intensity-based metrics in the context of functional targeting and avoidance.
In addition to providing dose-intensity metrics and histograms that can be easily extracted from a plan database and correlated with outcomes, the system can also be extended to a plug-in optimization system, which can directly use the computed metrics for optimization of post-treatment tumor or normal tissue response models. Supported by NIH - P01 - CA059827.

  20. SAMBA: Sparse Approximation of Moment-Based Arbitrary Polynomial Chaos

    NASA Astrophysics Data System (ADS)

    Ahlfeld, R.; Belkouchi, B.; Montomoli, F.

    2016-09-01

    A new arbitrary Polynomial Chaos (aPC) method is presented for moderately high-dimensional problems characterised by limited input data availability. The proposed methodology improves the aPC algorithm and extends the method, which was previously introduced only as a tensor product expansion, to moderately high-dimensional stochastic problems. The fundamental idea of aPC is to use the statistical moments of the input random variables to develop the polynomial chaos expansion. This approach provides the possibility to propagate continuous or discrete probability density functions and also histograms (data sets) as long as their moments exist, are finite and the determinant of the moment matrix is strictly positive. For cases with limited data availability, this approach avoids bias and fitting errors caused by wrong assumptions. In this work, an alternative way to calculate the aPC is suggested, which provides the optimal polynomials, Gaussian quadrature collocation points and weights from the moments using only a handful of matrix operations on the Hankel matrix of moments. It can therefore be implemented without requiring prior knowledge about statistical data analysis or a detailed understanding of the mathematics of polynomial chaos expansions. The extension to more input variables suggested in this work is an anisotropic and adaptive version of Smolyak's algorithm that is solely based on the moments of the input probability distributions. It is referred to as SAMBA (PC), which is short for Sparse Approximation of Moment-Based Arbitrary Polynomial Chaos. It is illustrated that for moderately high-dimensional problems (up to 20 different input variables or histograms) SAMBA can significantly simplify the calculation of sparse Gaussian quadrature rules.
SAMBA's efficiency for multivariate functions with regard to data availability is further demonstrated by analysing higher order convergence and accuracy for a set of nonlinear test functions with 2, 5 and 10 different input distributions or histograms.
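
The core ingredient described above, obtaining Gaussian quadrature nodes and weights from the Hankel matrix of moments with a handful of matrix operations, can be sketched as a Golub-Welsch-style construction. This is an illustrative sketch, not the authors' implementation; the function name is ours:

```python
import numpy as np

def quadrature_from_moments(moments, n):
    """Compute an n-point Gaussian quadrature rule from raw moments
    m_0..m_{2n} via the Hankel matrix of moments (Golub-Welsch style)."""
    H = np.array([[moments[i + j] for j in range(n + 1)]
                  for i in range(n + 1)], dtype=float)
    R = np.linalg.cholesky(H).T          # upper-triangular factor, H = R.T @ R
    # Three-term recurrence coefficients read off the Cholesky factor
    alpha = np.zeros(n)
    beta = np.zeros(n - 1)
    alpha[0] = R[0, 1] / R[0, 0]
    for j in range(1, n):
        alpha[j] = R[j, j + 1] / R[j, j] - R[j - 1, j] / R[j - 1, j - 1]
        beta[j - 1] = (R[j, j] / R[j - 1, j - 1]) ** 2
    # Jacobi matrix: eigenvalues are the nodes, weights come from the
    # first component of each eigenvector
    J = np.diag(alpha) + np.diag(np.sqrt(beta), 1) + np.diag(np.sqrt(beta), -1)
    nodes, vecs = np.linalg.eigh(J)
    weights = moments[0] * vecs[0, :] ** 2
    return nodes, weights

# Raw moments of a standard normal distribution: 1, 0, 1, 0, 3, 0, 15
m = [1, 0, 1, 0, 3, 0, 15]
x, w = quadrature_from_moments(m, 3)
# nodes ≈ [-1.732, 0, 1.732], weights ≈ [1/6, 2/3, 1/6]
```

Feeding in the raw moments of a standard normal distribution recovers the 3-point probabilists' Gauss-Hermite rule, illustrating the point made in the abstract: only the moments, not the density itself, are needed, so histograms and data sets can be propagated as well.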

  1. Image enhancement software for underwater recovery operations: User's manual

    NASA Astrophysics Data System (ADS)

    Partridge, William J.; Therrien, Charles W.

    1989-06-01

    This report describes software for performing image enhancement on live or recorded video images. The software was developed for operational use during underwater recovery operations at the Naval Undersea Warfare Engineering Station. The image processing is performed on an IBM-PC/AT compatible computer equipped with hardware to digitize and display video images. The software provides the capability to perform contrast enhancement and other similar functions in real time through hardware lookup tables, to automatically perform histogram equalization, and to capture one or more frames and either average them or apply one of several different processing algorithms to a captured frame. The report is in the form of a user manual for the software and includes guided tutorial and reference sections. A Digital Image Processing Primer in the appendix explains the principal concepts used in the image processing.
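
Real-time contrast enhancement via lookup tables, as mentioned above, amounts to precomputing a 256-entry table once and indexing every frame through it. A minimal software sketch (the window bounds and function name are illustrative, not from the report):

```python
import numpy as np

def window_lut(low, high):
    """Build a 256-entry lookup table that linearly stretches the intensity
    window [low, high] onto the full 8-bit range, the role the hardware
    lookup tables play for real-time contrast enhancement."""
    levels = np.arange(256, dtype=float)
    stretched = (levels - low) / (high - low)
    return (np.clip(stretched, 0.0, 1.0) * 255).astype(np.uint8)

lut = window_lut(64, 192)
# applying the table is a single indexing operation per frame:
# enhanced = lut[frame]
```

Because the per-pixel work reduces to one table lookup, the same transform runs at video rate regardless of how the table was computed.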

  2. Local dynamic range compensation for scanning electron microscope imaging system by sub-blocking multiple peak HE with convolution.

    PubMed

    Sim, K S; Teh, V; Tey, Y C; Kho, T K

    2016-11-01

    This paper introduces a new technique to improve Scanning Electron Microscope (SEM) image quality, which we name sub-blocking multiple peak histogram equalization (SUB-B-MPHE) with a convolution operator. Using this proposed technique, the modified MPHE performs better than the original MPHE. In addition, the sub-blocking method incorporates a convolution operator that helps remove the blocking effect from SEM images processed with this technique by properly distributing suitable pixel values across the whole image. Overall, SUB-B-MPHE with convolution outperforms the other methods. SCANNING 38:492-501, 2016. © 2015 Wiley Periodicals, Inc.
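
For reference, the baseline global histogram equalization that MPHE and SUB-B-MPHE refine can be sketched in a few lines. This is an illustrative sketch of plain HE only, not of the sub-blocking multiple-peak variant the paper proposes:

```python
import numpy as np

def histogram_equalize(img):
    """Global histogram equalization for an 8-bit grayscale image:
    map each grey level through the normalized cumulative histogram."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf = cdf / cdf[-1]                       # normalize CDF to [0, 1]
    lut = np.round(255 * cdf).astype(np.uint8)
    return lut[img]

# A low-contrast ramp confined to grey levels [100, 140] is spread
# over (nearly) the full 8-bit range
img = np.tile(np.linspace(100, 140, 64).astype(np.uint8), (64, 1))
out = histogram_equalize(img)
```

Variants such as MPHE equalize around multiple histogram peaks instead of the single global CDF, which is what preserves local detail that this plain version tends to wash out.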

  3. Time Reversal Acoustic Communication Using Filtered Multitone Modulation

    PubMed Central

    Sun, Lin; Chen, Baowei; Li, Haisen; Zhou, Tian; Li, Ruo

    2015-01-01

    The multipath spread in underwater acoustic channels is severe; therefore, when the symbol rate of time reversal (TR) acoustic communication using single-carrier (SC) modulation is high, the large intersymbol interference (ISI) span caused by multipath degrades the performance of the TR process and must be removed using a long adaptive equalizer as the post-processor. In this paper, a TR acoustic communication method using filtered multitone (FMT) modulation is proposed in order to reduce the residual ISI in the TR-processed signal. In the proposed method, FMT modulation is exploited to modulate information symbols onto separate subcarriers with high spectral containment, and the TR technique, together with adaptive equalization, is adopted at the receiver to suppress ISI and noise. The performance of the proposed method is assessed through simulation and real data from a trial in an experimental pool. The proposed method was compared with TR acoustic communication using SC modulation with the same spectral efficiency. Results demonstrate that the proposed method can improve the performance of the TR process and reduce the computational complexity of adaptive equalization for post-processing. PMID:26393586

  4. Time Reversal Acoustic Communication Using Filtered Multitone Modulation.

    PubMed

    Sun, Lin; Chen, Baowei; Li, Haisen; Zhou, Tian; Li, Ruo

    2015-09-17

    The multipath spread in underwater acoustic channels is severe; therefore, when the symbol rate of time reversal (TR) acoustic communication using single-carrier (SC) modulation is high, the large intersymbol interference (ISI) span caused by multipath degrades the performance of the TR process and must be removed using a long adaptive equalizer as the post-processor. In this paper, a TR acoustic communication method using filtered multitone (FMT) modulation is proposed in order to reduce the residual ISI in the TR-processed signal. In the proposed method, FMT modulation is exploited to modulate information symbols onto separate subcarriers with high spectral containment, and the TR technique, together with adaptive equalization, is adopted at the receiver to suppress ISI and noise. The performance of the proposed method is assessed through simulation and real data from a trial in an experimental pool. The proposed method was compared with TR acoustic communication using SC modulation with the same spectral efficiency. Results demonstrate that the proposed method can improve the performance of the TR process and reduce the computational complexity of adaptive equalization for post-processing.

  5. Micro-Droplet Detection Method for Measuring the Concentration of Alkaline Phosphatase-Labeled Nanoparticles in Fluorescence Microscopy

    PubMed Central

    Li, Rufeng; Wang, Yibei; Xu, Hong; Fei, Baowei; Qin, Binjie

    2017-01-01

    This paper developed and evaluated a quantitative image analysis method to measure the concentration of nanoparticles on which alkaline phosphatase (AP) was immobilized. These AP-labeled nanoparticles are widely used as signal markers for tagging biomolecules at nanometer and sub-nanometer scales. The AP-labeled nanoparticle concentration measurement can then be directly used to quantitatively analyze the biomolecular concentration. Micro-droplets are mono-dispersed micro-reactors that can be used to encapsulate and detect AP-labeled nanoparticles. Micro-droplets include both empty micro-droplets and fluorescent micro-droplets; fluorescent micro-droplets are generated by the fluorescence reaction between the APs adhering to a single nanoparticle and the corresponding fluorogenic substrates within droplets. By detecting micro-droplets and calculating the proportion of fluorescent micro-droplets among the overall micro-droplets, we can calculate the AP-labeled nanoparticle concentration. The proposed micro-droplet detection method includes the following steps: (1) Gaussian filtering to remove the noise of overall fluorescent targets, (2) contrast-limited adaptive histogram equalization to enhance the contrast of weakly luminescent micro-droplets, (3) an inter-class-variance-maximizing (Otsu) thresholding method to segment the enhanced image and obtain a binary map of the overall micro-droplets, (4) a circular Hough transform (CHT) method to detect the overall micro-droplets and (5) an intensity-mean-based thresholding segmentation method to extract the fluorescent micro-droplets. The experimental results on fluorescent micro-droplet images show that the average accuracy of our micro-droplet detection method is 0.9586, the average true positive rate is 0.9502, and the average false positive rate is 0.0073. The detection method can be successfully applied to measure AP-labeled nanoparticle concentration in fluorescence microscopy. PMID:29160812

  6. Micro-Droplet Detection Method for Measuring the Concentration of Alkaline Phosphatase-Labeled Nanoparticles in Fluorescence Microscopy.

    PubMed

    Li, Rufeng; Wang, Yibei; Xu, Hong; Fei, Baowei; Qin, Binjie

    2017-11-21

    This paper developed and evaluated a quantitative image analysis method to measure the concentration of nanoparticles on which alkaline phosphatase (AP) was immobilized. These AP-labeled nanoparticles are widely used as signal markers for tagging biomolecules at nanometer and sub-nanometer scales. The AP-labeled nanoparticle concentration measurement can then be directly used to quantitatively analyze the biomolecular concentration. Micro-droplets are mono-dispersed micro-reactors that can be used to encapsulate and detect AP-labeled nanoparticles. Micro-droplets include both empty micro-droplets and fluorescent micro-droplets; fluorescent micro-droplets are generated by the fluorescence reaction between the APs adhering to a single nanoparticle and the corresponding fluorogenic substrates within droplets. By detecting micro-droplets and calculating the proportion of fluorescent micro-droplets among the overall micro-droplets, we can calculate the AP-labeled nanoparticle concentration. The proposed micro-droplet detection method includes the following steps: (1) Gaussian filtering to remove the noise of overall fluorescent targets, (2) contrast-limited adaptive histogram equalization to enhance the contrast of weakly luminescent micro-droplets, (3) an inter-class-variance-maximizing (Otsu) thresholding method to segment the enhanced image and obtain a binary map of the overall micro-droplets, (4) a circular Hough transform (CHT) method to detect the overall micro-droplets and (5) an intensity-mean-based thresholding segmentation method to extract the fluorescent micro-droplets. The experimental results on fluorescent micro-droplet images show that the average accuracy of our micro-droplet detection method is 0.9586, the average true positive rate is 0.9502, and the average false positive rate is 0.0073. The detection method can be successfully applied to measure AP-labeled nanoparticle concentration in fluorescence microscopy.
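
Step (3) of the pipeline above, Otsu thresholding, picks the grey level that maximizes the between-class (inter-class) variance of the histogram. A minimal sketch (illustrative only, not the authors' code):

```python
import numpy as np

def otsu_threshold(img):
    """Otsu's method: choose the threshold that maximizes the
    between-class variance of the grey-level histogram."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    omega = np.cumsum(p)                    # class probability of background
    mu = np.cumsum(p * np.arange(256))      # cumulative mean
    mu_t = mu[-1]
    # between-class variance; degenerate splits (omega 0 or 1) become NaN
    with np.errstate(divide='ignore', invalid='ignore'):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1 - omega))
    return int(np.nanargmax(sigma_b))

# Two well-separated intensity modes: the threshold lands between them
img = np.concatenate([np.full(500, 50, np.uint8), np.full(500, 200, np.uint8)])
t = otsu_threshold(img)
binary = img > t                            # binary map, as in step (3)
```

Pixels above the returned threshold form the binary map of micro-droplets that the circular Hough transform in step (4) would then consume.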

  7. A novel pipeline for adrenal tumour segmentation.

    PubMed

    Koyuncu, Hasan; Ceylan, Rahime; Erdogan, Hasan; Sivri, Mesut

    2018-06-01

    Adrenal tumours occur on adrenal glands surrounded by organs and osteoid. These tumours can be categorized as functional, non-functional, malign, or benign. Depending on their appearance in the abdomen, adrenal tumours can arise from one adrenal gland (unilateral) or from both adrenal glands (bilateral) and can connect with other organs, including the liver, spleen, pancreas, etc. This connection phenomenon constitutes the most important handicap for adrenal tumour segmentation. Size change, variety of shape, diverse location, and low contrast (similar grey values between the various tissues) are other disadvantages compounding segmentation difficulty. Few studies have considered adrenal tumour segmentation, and no significant improvement has been achieved for unilateral, bilateral, adherent, or noncohesive tumour segmentation. There is also no recognised segmentation pipeline or method for adrenal tumours covering different shape, size, or location information. This study proposes an adrenal tumour segmentation (ATUS) pipeline designed to eliminate the above disadvantages. ATUS incorporates a number of image-processing methods, including contrast limited adaptive histogram equalization, split and merge based on quadtree decomposition, mean shift segmentation, large grey level eliminator, and region growing. Performance assessment of ATUS was realised on 32 arterial and portal phase computed tomography images using six metrics: Dice, Jaccard, sensitivity, specificity, accuracy, and structural similarity index. ATUS achieved remarkable segmentation performance and was not affected by the discussed handicaps, particularly adherence to other organs, with success rates of 83.06%, 71.44%, 86.44%, 99.66%, 99.43%, and 98.51% for the respective metrics on images with sufficient contrast uptake.
The proposed ATUS system realises detailed adrenal tumour segmentation, and avoids known disadvantages preventing accurate segmentation. Copyright © 2018 Elsevier B.V. All rights reserved.

  8. Blood vessel segmentation in modern wide-field retinal images in the presence of additive Gaussian noise.

    PubMed

    Asem, Morteza Modarresi; Oveisi, Iman Sheikh; Janbozorgi, Mona

    2018-07-01

    Retinal blood vessels indicate some serious health ramifications, such as cardiovascular disease and stroke. Thanks to modern imaging technology, high-resolution images provide detailed information to help analyze retinal vascular features before symptoms associated with such conditions fully develop. Additionally, these retinal images can be used by ophthalmologists to facilitate diagnosis and the procedures of eye surgery. A fuzzy noise reduction algorithm was employed to enhance color images corrupted by Gaussian noise. The present paper proposes employing a contrast limited adaptive histogram equalization to enhance illumination and increase the contrast of retinal images captured from state-of-the-art cameras. Possessing directional properties, the multistructure elements method can lead to high-performance edge detection. Therefore, multistructure elements-based morphology operators are used to detect high-quality image ridges. Following this detection, the irrelevant ridges, which are not part of the vessel tree, were removed by morphological operators by reconstruction, attempting also to keep the thin vessels preserved. A combined method of connected components analysis (CCA) in conjunction with a thresholding approach was further used to identify the ridges that correspond to vessels. The application of CCA can yield higher efficiency when it is locally applied rather than applied on the whole image. The significance of our work lies in the way in which several methods are effectively combined and the originality of the database employed, making this work unique in the literature. Computer simulation results in wide-field retinal images with up to a 200-deg field of view are a testimony of the efficacy of the proposed approach, with an accuracy of 0.9524.

  9. Protective effects of biodegradable collagen implants on thinned sclera after strabismus surgery: a paired-eye study.

    PubMed

    Yoo, Tae Keun; Han, Sueng-Han; Han, Jinu

    2017-12-01

    To determine the efficacy of a biodegradable Ologen (Aeon Astron Europe BV, Leiden, The Netherlands) collagen matrix in reducing the blue color change due to exposed thinned sclera after strabismus surgery. Fourteen patients with intermittent exotropia undergoing symmetric bilateral lateral rectus recession surgery were included in this prospective, randomized, paired-eye controlled study. In each patient, Ologen was implanted at the original rectus insertion site in one randomly selected eye; the other eye underwent conventional surgery. Ologen was inserted under the conjunctiva without suturing, covering the muscle insertion site. Conjunctival color change was analyzed using computer-based image analysis immediately and 1 week, 1 month, and 3 months postoperatively. Slit-lamp photographs of each eye were evaluated using contrast limited adaptive histogram equalization (CLAHE), Canny edge, and the RGB (red-green-blue) model. Secondary outcomes were conjunctival and sclera thickness 3 months postoperatively determined by anterior segment optical coherence tomography. Immediately and 1 week postoperatively all color models showed no significant differences between Ologen-implanted and control eyes. Three months postoperatively, Ologen-implanted eyes exhibited significantly lower CLAHE (P = 0.041) and RGB model blue color (P = 0.008) values than control eyes. Canny edge (P = 0.061) and RGB model red color (P = 0.152) values did not differ between eyes. Conjunctival stroma and episcleral complex thickness was greater in Ologen-implanted eyes than in controls (P = 0.001). Blue color change was significantly less noticeable in Ologen-implanted eyes than in controls. Thus, Ologen implantation helps prevent visible blue sclera at the original rectus insertion site after lateral rectus recession. Copyright © 2017 American Association for Pediatric Ophthalmology and Strabismus. Published by Elsevier Inc. All rights reserved.

  10. Enhancement and restoration of non-uniform illuminated Fundus Image of Retina obtained through thin layer of cataract.

    PubMed

    Mitra, Anirban; Roy, Sudipta; Roy, Somais; Setua, Sanjit Kumar

    2018-03-01

    Retinal fundus images are extensively used, either manually or automatically, to identify and analyze various diseases. Due to the complexity of the imaging arrangement, there is large radiance, reflectance and contrast inconsistency within and across images. A novel method based on a physical model of cataract is proposed to reduce the blurriness introduced when the fundus camera acquires the image through a thin layer of cataract. After blurriness reduction, the method enhances the images with the objective of improving contrast without introducing artifacts. Because the thickness of the cataract is unevenly distributed, the cataract surroundings are first predicted in the frequency domain. Second, the resulting image is enhanced by intensity histogram equalization in an adapted Hue Saturation Intensity (HSI) color space so that the gamut problem can be avoided. The final image, with suitable color and contrast, is acquired using the proposed max-min color correction approach. The results indicate that the proposed method not only more effectively enhances non-uniform retinal images obtained through a thin layer of cataract, but also produces images with appropriate brightness and saturation while maintaining complete color space information. The proposed enhancement method has been tested on openly available datasets and the results compared with standard image enhancement algorithms and a cataract removal method. Results show noticeable improvement over existing methods. Cataract often prevents the clinician from objectively evaluating fundus features, and also affects subjective tests. Enhancement and restoration of non-uniformly illuminated fundus images of the retina obtained through a thin layer of cataract are shown here to be potentially beneficial. Copyright © 2018 Elsevier B.V. All rights reserved.

  11. Chromosome number variation in two antipodean floras.

    PubMed

    Peruzzi, Lorenzo; Dawson, Murray I; Bedini, Gianni

    2011-01-01

    We compared chromosome number (CN) variation in the nearly antipodean Italian and New Zealand floras to verify (i) whether patterns of variation reflect their similar latitudinal ranges or their different biogeographic/taxonomic contexts, (ii) if any differences are equally distributed across major taxa/lineages and (iii) if the frequency, number and taxonomic distribution of B-chromosomes differ between the two countries. We compared two datasets comprising 3426 (Italy) and 2525 (New Zealand) distinct cytotypes. We also compared a subset based on taxonomic orders and superimposed them onto a phylogeny of vascular plants. We used standard statistics, histograms, and either analysis of variance or Kruskal-Wallis tests to analyse the data. Mean CN of the vascular New Zealand flora is about twice that of Italy. For most orders, mean CN values for New Zealand are higher than those of the Italian flora and the differences are statistically significant. Further differences in CN variation among the orders and main clades that we studied, irrespective of geographical distinctions, are revealed. No correlation was found between chromosome and B-chromosome number. Mean CN of the whole New Zealand dataset is about twice that of the Italian flora. This suggests that extensive polyploidization played a major role in the evolution of the New Zealand vascular flora, which is characterized by a high rate of endemism. Our results show that the hypothesis of a polyploid increase proportional to distance from the Equator cannot be applied to territories with the same latitudinal ranges but placed in different hemispheres. We suggest that bioclimatic gradients, rather than or in addition to latitudinal gradients, might account for a polyploidy increase. Our data also suggest that any adaptive role of B-chromosomes at geographic scale may be sought in their frequency rather than in their number.

  12. Chromosome number variation in two antipodean floras

    PubMed Central

    Peruzzi, Lorenzo; Dawson, Murray I.; Bedini, Gianni

    2011-01-01

    Background and aims We compared chromosome number (CN) variation in the nearly antipodean Italian and New Zealand floras to verify (i) whether patterns of variation reflect their similar latitudinal ranges or their different biogeographic/taxonomic contexts, (ii) if any differences are equally distributed across major taxa/lineages and (iii) if the frequency, number and taxonomic distribution of B-chromosomes differ between the two countries. Methodology We compared two datasets comprising 3426 (Italy) and 2525 (New Zealand) distinct cytotypes. We also compared a subset based on taxonomic orders and superimposed them onto a phylogeny of vascular plants. We used standard statistics, histograms, and either analysis of variance or Kruskal–Wallis tests to analyse the data. Principal results Mean CN of the vascular New Zealand flora is about twice that of Italy. For most orders, mean CN values for New Zealand are higher than those of the Italian flora and the differences are statistically significant. Further differences in CN variation among the orders and main clades that we studied, irrespective of geographical distinctions, are revealed. No correlation was found between chromosome and B-chromosome number. Conclusions Mean CN of the whole New Zealand dataset is about twice that of the Italian flora. This suggests that extensive polyploidization played a major role in the evolution of the New Zealand vascular flora, which is characterized by a high rate of endemism. Our results show that the hypothesis of a polyploid increase proportional to distance from the Equator cannot be applied to territories with the same latitudinal ranges but placed in different hemispheres. We suggest that bioclimatic gradients, rather than or in addition to latitudinal gradients, might account for a polyploidy increase. Our data also suggest that any adaptive role of B-chromosomes at geographic scale may be sought in their frequency rather than in their number. PMID:22476490

  13. Smartphone-based colorimetric analysis for detection of saliva alcohol concentration.

    PubMed

    Jung, Youngkee; Kim, Jinhee; Awofeso, Olumide; Kim, Huisung; Regnier, Fred; Bae, Euiwon

    2015-11-01

    A simple device and associated analytical methods are reported. We provide objective and accurate determination of saliva alcohol concentrations using smartphone-based colorimetric imaging. The device utilizes any smartphone with a miniature attachment that positions the sample and provides constant illumination for sample imaging. Analyses of histograms based on channel imaging of red-green-blue (RGB) and hue-saturation-value (HSV) color space provide unambiguous determination of blood alcohol concentration from color changes on sample pads. A smartphone-based sample analysis by colorimetry was developed and tested with blind samples that matched with the training sets. This technology can be adapted to any smartphone and used to conduct color change assays.
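
The RGB and HSV channel analysis described above rests on a standard colour-space conversion, available in Python's standard library. A minimal sketch with an illustrative pure-red sample patch (the channel values are ours, not from the study):

```python
import colorsys

# Mean RGB of a hypothetical reddish sample-pad patch, scaled to [0, 1].
# Converting to HSV makes hue shifts explicit that raw RGB can hide.
r, g, b = 0.8, 0.1, 0.1
h, s, v = colorsys.rgb_to_hsv(r, g, b)
# h: hue (0 = red), s: saturation (max-min over max), v: value (max channel)
```

In a full assay, each sample-pad image would be reduced to channel histograms in both colour spaces before being matched against the training set.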

  14. Discoveries far from the lamppost with matrix elements and ranking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Debnath, Dipsikha; Gainer, James S.; Matchev, Konstantin T.

    2015-04-01

    The prevalence of null results in searches for new physics at the LHC motivates the effort to make these searches as model-independent as possible. We describe procedures for adapting the Matrix Element Method for situations where the signal hypothesis is not known a priori. We also present general and intuitive approaches for performing analyses and presenting results, which involve the flattening of background distributions using likelihood information. The first flattening method involves ranking events by background matrix element, the second involves quantile binning with respect to likelihood (and other) variables, and the third method involves reweighting histograms by the inverse of the background distribution.

  15. Thermodynamic properties of solvated peptides from selective integrated tempering sampling with a new weighting factor estimation algorithm

    NASA Astrophysics Data System (ADS)

    Shen, Lin; Xie, Liangxu; Yang, Mingjun

    2017-04-01

    Conformational sampling on rugged energy landscapes is a long-standing challenge in computer simulations. The recently developed integrated tempering sampling method, together with its selective variant (SITS), has emerged as a powerful tool for exploring the free energy landscape or functional motions of various systems. The estimation of weighting factors constitutes a critical step in these methods and requires accurate calculation of the partition function ratio between different thermodynamic states. In this work, we propose a new adaptive update algorithm to compute the weighting factors based on the weighted histogram analysis method (WHAM). The adaptive-WHAM algorithm with SITS is then applied to study the thermodynamic properties of several representative peptide systems solvated in an explicit water box. The performance of the new algorithm is validated in simulations of these solvated peptide systems. We anticipate more applications of this coupled optimisation and production algorithm to other complicated systems such as biochemical reactions in solution.

  16. DWI-associated entire-tumor histogram analysis for the differentiation of low-grade prostate cancer from intermediate-high-grade prostate cancer.

    PubMed

    Wu, Chen-Jiang; Wang, Qing; Li, Hai; Wang, Xiao-Ning; Liu, Xi-Sheng; Shi, Hai-Bin; Zhang, Yu-Dong

    2015-10-01

    To investigate the diagnostic efficiency of DWI using entire-tumor histogram analysis in differentiating low-grade (LG) prostate cancer (PCa) from intermediate-high-grade (HG) PCa, in comparison with conventional ROI-based measurement. DW images (b of 0-1400 s/mm²) from 126 pathology-confirmed PCa (diameter >0.5 cm) in 110 patients were retrospectively collected and processed with a mono-exponential model. The measurement of tumor apparent diffusion coefficients (ADCs) was performed using histogram-based and ROI-based approaches, respectively. The ability of ADCs from the two methods to differentiate LG-PCa (Gleason score, GS ≤ 6) from HG-PCa (GS > 6) was determined by ROC regression and compared by McNemar's test. There were 49 LG tumors and 77 HG tumors at pathologic examination. Histogram-based ADCs (mean, median, 10th and 90th percentiles) and ROI-based ADCs (mean) showed significant correlations with the ordinal GS of PCa (ρ = -0.225 to -0.406, p < 0.05). All of the above imaging indices showed significant differences between LG-PCa and HG-PCa (all p values <0.01). The histogram 10th-percentile ADC had the highest Az (0.738), Youden index (0.415), and positive likelihood ratio (LR+, 2.45) for stratifying tumor GS compared with the mean, median, and 90th-percentile ADCs and the ROI-based ADCs. Histogram mean, median, and 10th-percentile ADCs showed higher specificity (65.3%-74.1% vs. 44.9%, p < 0.01), but lower sensitivity (57.1%-71.3% vs. 84.4%, p < 0.05), than ROI-based ADCs in differentiating LG-PCa from HG-PCa. DWI-associated histogram analysis had higher specificity, Az, Youden index, and LR+ for differentiation of PCa Gleason grade than the ROI-based approach.
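The entire-tumor histogram metrics this record relies on (mean, median, 10th and 90th percentiles of all tumor-voxel ADCs) can be sketched as follows; the function name and the flat voxel array are illustrative:

```python
import numpy as np

def adc_histogram_metrics(adc_voxels):
    """Entire-tumor histogram summary of voxel-wise ADC values."""
    v = np.asarray(adc_voxels, dtype=float)
    return {
        "mean": float(v.mean()),
        "median": float(np.median(v)),
        "p10": float(np.percentile(v, 10)),  # low percentiles track the densest tissue
        "p90": float(np.percentile(v, 90)),
    }
```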

  17. Dynamic contrast-enhanced MR imaging of the rectum: Correlations between single-section and whole-tumor histogram analyses.

    PubMed

    Choi, M H; Oh, S N; Park, G E; Yeo, D-M; Jung, S E

    2018-05-10

    To evaluate the interobserver and intermethod correlations of histogram metrics of dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) parameters acquired by multiple readers using the single-section and whole-tumor volume methods. Four DCE parameters (Ktrans, Kep, Ve, Vp) were evaluated in 45 patients (31 men and 14 women; mean age, 61±11 years [range, 29-83 years]) with locally advanced rectal cancer using pre-chemoradiotherapy (CRT) MRI. Ten histogram metrics were extracted using two methods of lesion selection performed by three radiologists: the whole-tumor volume method for the whole tumor on axial section-by-section images and the single-section method for the entire area of the tumor on one axial image. The interobserver and intermethod correlations were evaluated using the intraclass correlation coefficients (ICCs). The ICCs showed excellent interobserver and intermethod correlations in most of the histogram metrics of the DCE parameters. The ICCs among the three readers were > 0.7 (P<0.001) for all histogram metrics, except for the minimum and maximum. The intermethod correlations for most of the histogram metrics were excellent for each radiologist, regardless of the differences in the radiologists' experience. The interobserver and intermethod correlations for most of the histogram metrics of the DCE parameters are excellent in rectal cancer. Therefore, the single-section method may be a potential alternative to the whole-tumor volume method using pre-CRT MRI, despite the fact that the high agreement between the two methods cannot be extrapolated to post-CRT MRI. Copyright © 2018 Société française de radiologie. Published by Elsevier Masson SAS. All rights reserved.

  18. Measuring the apparent diffusion coefficient in primary rectal tumors: is there a benefit in performing histogram analyses?

    PubMed

    van Heeswijk, Miriam M; Lambregts, Doenja M J; Maas, Monique; Lahaye, Max J; Ayas, Z; Slenter, Jos M G M; Beets, Geerard L; Bakers, Frans C H; Beets-Tan, Regina G H

    2017-06-01

    The apparent diffusion coefficient (ADC) is a potential prognostic imaging marker in rectal cancer. Typically, mean ADC values are used, derived from precise manual whole-volume tumor delineations by experts. The aim was first to explore whether non-precise circular delineation combined with histogram analysis can be a less cumbersome alternative to acquire similar ADC measurements and second to explore whether histogram analyses provide additional prognostic information. Thirty-seven patients who underwent a primary staging MRI including diffusion-weighted imaging (DWI; b0, 25, 50, 100, 500, 1000; 1.5 T) were included. Volumes-of-interest (VOIs) were drawn on b1000-DWI: (a) precise delineation, manually tracing tumor boundaries (2 expert readers), and (b) non-precise delineation, drawing circular VOIs with a wide margin around the tumor (2 non-experts). Mean ADC and histogram metrics (mean, min, max, median, SD, skewness, kurtosis, 5th-95th percentiles) were derived from the VOIs and delineation time was recorded. Measurements were compared between the two methods and correlated with prognostic outcome parameters. Median delineation time was reduced from 47-165 s (precise) to 21-43 s (non-precise). The 45th percentile of the non-precise delineation showed the best correlation with the mean ADC from the precise delineation as the reference standard (ICC 0.71-0.75). None of the mean ADC or histogram parameters showed significant prognostic value; only the total tumor volume (VOI) was significantly larger in patients with positive clinical N stage and mesorectal fascia involvement. When performing non-precise tumor delineation, histogram analysis (specifically the 45th ADC percentile) may be used as an alternative to obtain ADC values similar to those from precise whole-tumor delineation. Histogram analyses are not beneficial for obtaining additional prognostic information.

  19. The histogram analysis of diffusion-weighted intravoxel incoherent motion (IVIM) imaging for differentiating the Gleason grade of prostate cancer.

    PubMed

    Zhang, Yu-Dong; Wang, Qing; Wu, Chen-Jiang; Wang, Xiao-Ning; Zhang, Jing; Liu, Hui; Liu, Xi-Sheng; Shi, Hai-Bin

    2015-04-01

    To evaluate histogram analysis of intravoxel incoherent motion (IVIM) for discriminating the Gleason grade of prostate cancer (PCa). A total of 48 patients pathologically confirmed as having clinically significant PCa (size > 0.5 cm) underwent preoperative DW-MRI (b of 0-900 s/mm²). Data were post-processed with monoexponential and IVIM models for quantitation of apparent diffusion coefficients (ADCs), perfusion fraction f, diffusivity D and pseudo-diffusivity D*. Histogram analysis was performed by outlining entire-tumour regions of interest (ROIs) from histological-radiological correlation. The ability of imaging indices to differentiate low-grade (LG, Gleason score (GS) ≤6) from intermediate/high-grade (HG, GS > 6) PCa was analysed by ROC regression. Eleven patients had LG tumours (18 foci) and 37 patients had HG tumours (42 foci) on pathology examination. HG tumours had significantly lower ADCs and D in terms of mean, median, 10th and 75th percentiles, combined with higher histogram kurtosis and skewness for ADCs, D and f, than LG PCa (p < 0.05). Histogram D showed relatively higher correlations (ρ = 0.641-0.668 vs. ADCs: 0.544-0.574) with ordinal GS of PCa; and its mean, median and 10th percentile performed better than ADCs did in distinguishing LG from HG PCa. It is feasible to stratify the pathological grade of PCa by IVIM with histogram metrics. D performed better in distinguishing LG from HG tumour than conventional ADCs. • GS had relatively higher correlation with tumour D than ADCs. • Difference of histogram D among two-grade tumours was statistically significant. • D yielded better individual features in demonstrating tumour grade than ADC. • D* and f failed to determine tumour grade of PCa.

  20. Subtype Differentiation of Small (≤ 4 cm) Solid Renal Mass Using Volumetric Histogram Analysis of DWI at 3-T MRI.

    PubMed

    Li, Anqin; Xing, Wei; Li, Haojie; Hu, Yao; Hu, Daoyu; Li, Zhen; Kamel, Ihab R

    2018-05-29

    The purpose of this article is to evaluate the utility of volumetric histogram analysis of apparent diffusion coefficient (ADC) derived from reduced-FOV DWI for small (≤ 4 cm) solid renal mass subtypes at 3-T MRI. This retrospective study included 38 clear cell renal cell carcinomas (RCCs), 16 papillary RCCs, 18 chromophobe RCCs, 13 minimal fat angiomyolipomas (AMLs), and seven oncocytomas evaluated with preoperative MRI. Volumetric ADC maps were generated using all slices of the reduced-FOV DW images to obtain histogram parameters, including mean, median, 10th percentile, 25th percentile, 75th percentile, 90th percentile, and SD ADC values, as well as skewness, kurtosis, and entropy. Comparisons of these parameters were made by one-way ANOVA, t test, and ROC curve analysis. ADC histogram parameters differentiated eight of 10 pairs of renal tumors. Three subtype pairs (clear cell RCC vs papillary RCC, clear cell RCC vs chromophobe RCC, and clear cell RCC vs minimal fat AML) were differentiated by mean ADC. However, five other subtype pairs (clear cell RCC vs oncocytoma, papillary RCC vs minimal fat AML, papillary RCC vs oncocytoma, chromophobe RCC vs minimal fat AML, and chromophobe RCC vs oncocytoma) were differentiated by histogram distribution parameters exclusively (all p < 0.05). Mean ADC, median ADC, 75th and 90th percentile ADC, SD ADC, and entropy of malignant tumors were significantly higher than those of benign tumors (all p < 0.05). The combination of mean ADC with histogram parameters yielded the highest AUC (0.851; sensitivity, 80.0%; specificity, 86.1%). Quantitative volumetric ADC histogram analysis may help differentiate various subtypes of small solid renal tumors, including benign and malignant lesions.

  1. Diffusion-weighted imaging: Apparent diffusion coefficient histogram analysis for detecting pathologic complete response to chemoradiotherapy in locally advanced rectal cancer.

    PubMed

    Choi, Moon Hyung; Oh, Soon Nam; Rha, Sung Eun; Choi, Joon-Il; Lee, Sung Hak; Jang, Hong Seok; Kim, Jun-Gi; Grimm, Robert; Son, Yohan

    2016-07-01

    To investigate the usefulness of apparent diffusion coefficient (ADC) values derived from histogram analysis of the whole rectal cancer as a quantitative parameter to evaluate pathologic complete response (pCR) on preoperative magnetic resonance imaging (MRI). We enrolled a total of 86 consecutive patients who had undergone surgery for rectal cancer after neoadjuvant chemoradiotherapy (CRT) at our institution between July 2012 and November 2014. Two radiologists who were blinded to the final pathological results reviewed post-CRT MRI to evaluate tumor stage. Quantitative image analysis was performed using T2-weighted and diffusion-weighted images independently by two radiologists using dedicated software that performed histogram analysis to assess the distribution of ADC in the whole tumor. After surgery, 16 patients were confirmed to have achieved pCR (18.6%). All parameters from pre- and post-CRT ADC histogram showed good or excellent agreement between two readers. The minimum, 10th, 25th, 50th, and 75th percentile and mean ADC from post-CRT ADC histogram were significantly higher in the pCR group than in the non-pCR group for both readers. The 25th percentile value from ADC histogram in post-CRT MRI had the best diagnostic performance for detecting pCR, with an area under the receiver operating characteristic curve of 0.796. Low percentile values derived from the ADC histogram analysis of rectal cancer on MRI after CRT showed a significant difference between pCR and non-pCR groups, demonstrating the utility of the ADC value as a quantitative and objective marker to evaluate complete pathologic response to preoperative CRT in rectal cancer. J. Magn. Reson. Imaging 2016;44:212-220. © 2015 Wiley Periodicals, Inc.
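The area under the ROC curve quoted above (0.796 for the 25th percentile) can be estimated without explicit thresholding via the Mann-Whitney statistic; a sketch with hypothetical score lists:

```python
import numpy as np

def auc_from_scores(scores_pos, scores_neg):
    """Mann-Whitney estimate of the ROC area: the probability that a
    randomly chosen positive case scores above a randomly chosen
    negative case, counting ties as one half."""
    pos = np.asarray(scores_pos, float)
    neg = np.asarray(scores_neg, float)
    wins = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (wins + 0.5 * ties) / (pos.size * neg.size)
```

Here `scores_pos` would be, for instance, the 25th-percentile ADCs of the pCR group and `scores_neg` those of the non-pCR group.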

  2. Serial data acquisition for GEM-2D detector

    NASA Astrophysics Data System (ADS)

    Kolasinski, Piotr; Pozniak, Krzysztof T.; Czarski, Tomasz; Linczuk, Maciej; Byszuk, Adrian; Chernyshova, Maryna; Juszczyk, Bartlomiej; Kasprowicz, Grzegorz; Wojenski, Andrzej; Zabolotny, Wojciech; Zienkiewicz, Pawel; Mazon, Didier; Malard, Philippe; Herrmann, Albrecht; Vezinet, Didier

    2014-11-01

    This article discusses a fast data-acquisition and histogramming method for the X-ray GEM detector. The whole histogramming process is performed by FPGA chips (Spartan-6 series from Xilinx). The results of the histogramming process are stored in internal FPGA memory and then sent to a PC, where the data are merged and processed in MATLAB. The structure of the firmware functionality implemented in the FPGAs is described. Examples of test measurements and results are presented.

  3. Frequency distribution histograms for the rapid analysis of data

    NASA Technical Reports Server (NTRS)

    Burke, P. V.; Bullen, B. L.; Poff, K. L.

    1988-01-01

    The mean and standard error are good representations for the response of a population to an experimental parameter and are frequently used for this purpose. Frequency distribution histograms show, in addition, responses of individuals in the population. Both the statistics and a visual display of the distribution of the responses can be obtained easily using a microcomputer and available programs. The type of distribution shown by the histogram may suggest different mechanisms to be tested.
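The summary the abstract describes, the mean and standard error plus a frequency-distribution histogram showing the individuals, can be sketched as follows (function name and bin count are illustrative):

```python
import numpy as np

def summarize_with_histogram(responses, bins=10):
    """Mean and standard error of a set of responses, plus the
    frequency-distribution histogram that shows the individuals."""
    r = np.asarray(responses, float)
    mean = float(r.mean())
    sem = float(r.std(ddof=1) / np.sqrt(r.size))  # standard error of the mean
    counts, edges = np.histogram(r, bins=bins)
    return mean, sem, counts, edges
```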

  4. A Monte Carlo study of the impact of the choice of rectum volume definition on estimates of equivalent uniform doses and the volume parameter

    NASA Astrophysics Data System (ADS)

    Kvinnsland, Yngve; Muren, Ludvig Paul; Dahl, Olav

    2004-08-01

    Calculations of normal tissue complication probability (NTCP) values for the rectum are difficult because it is a hollow, non-rigid, organ. Finding the true cumulative dose distribution for a number of treatment fractions requires a CT scan before each treatment fraction. This is labour intensive, and several surrogate distributions have therefore been suggested, such as dose wall histograms, dose surface histograms and histograms for the solid rectum, with and without margins. In this study, a Monte Carlo method is used to investigate the relationships between the cumulative dose distributions based on all treatment fractions and the above-mentioned histograms that are based on one CT scan only, in terms of equivalent uniform dose. Furthermore, the effect of a specific choice of histogram on estimates of the volume parameter of the probit NTCP model was investigated. It was found that the solid rectum and the rectum wall histograms (without margins) gave equivalent uniform doses with an expected value close to the values calculated from the cumulative dose distributions in the rectum wall. With the number of patients available in this study the standard deviations of the estimates of the volume parameter were large, and it was not possible to decide which volume gave the best estimates of the volume parameter, but there were distinct differences in the mean values of the values obtained.
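The two quantities the study works with, a cumulative dose-volume histogram and the equivalent uniform dose, can be sketched as below; the voxel-dose arrays and the volume-effect parameter `a` are illustrative, and the generalized-EUD form is a standard stand-in for whichever EUD definition the paper uses:

```python
import numpy as np

def cumulative_dvh(voxel_doses, thresholds):
    """Cumulative DVH: fraction of the volume receiving at least each
    threshold dose."""
    d = np.asarray(voxel_doses, float)
    return np.array([(d >= t).mean() for t in thresholds])

def equivalent_uniform_dose(voxel_doses, a):
    """Generalized EUD: the uniform dose with the same assumed effect;
    a = 1 reduces to the mean dose, large a emphasizes hot spots."""
    d = np.asarray(voxel_doses, float)
    return float(np.mean(d ** a) ** (1.0 / a))
```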

  5. Detection of Local Tumor Recurrence After Definitive Treatment of Head and Neck Squamous Cell Carcinoma: Histogram Analysis of Dynamic Contrast-Enhanced T1-Weighted Perfusion MRI.

    PubMed

    Choi, Sang Hyun; Lee, Jeong Hyun; Choi, Young Jun; Park, Ji Eun; Sung, Yu Sub; Kim, Namkug; Baek, Jung Hwan

    2017-01-01

    This study aimed to explore the added value of histogram analysis of the ratio of initial to final 90-second time-signal intensity AUC (AUCR) for differentiating local tumor recurrence from contrast-enhancing scar on follow-up dynamic contrast-enhanced T1-weighted perfusion MRI of patients treated for head and neck squamous cell carcinoma (HNSCC). AUCR histogram parameters were assessed among tumor recurrence (n = 19) and contrast-enhancing scar (n = 27) at primary sites and compared using the t test. ROC analysis was used to determine the best differentiating parameters. The added value of AUCR histogram parameters was assessed when they were added to inconclusive conventional MRI results. Histogram analysis showed statistically significant differences in the 50th, 75th, and 90th percentiles of the AUCR values between the two groups (p < 0.05). The 90th percentile of the AUCR values (AUCR90) was the best predictor of local tumor recurrence (AUC, 0.77; 95% CI, 0.64-0.91) with an estimated cutoff of 1.02. AUCR90 increased sensitivity by 11.7% over that of conventional MRI alone when added to inconclusive results. Histogram analysis of AUCR can improve the diagnostic yield for local tumor recurrence during surveillance after treatment for HNSCC.
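The AUCR descriptor, the ratio of the initial to the final 90-second area under a time-signal intensity curve, might be computed per voxel roughly as follows; the sampling times and curves are illustrative:

```python
import numpy as np

def _trapezoid(y, x):
    # area under y(x) by the trapezoidal rule
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def aucr(times, signal, window=90.0):
    """Ratio of the initial to the final `window`-second AUC of a
    time-signal intensity curve."""
    t = np.asarray(times, float)
    s = np.asarray(signal, float)
    first = t <= t[0] + window
    last = t >= t[-1] - window
    return _trapezoid(s[first], t[first]) / _trapezoid(s[last], t[last])
```

A plateau curve gives AUCR near 1, while a wash-out curve (early peak, late decay) gives AUCR above 1, consistent with the cutoff of 1.02 reported above.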

  6. Value of MR histogram analyses for prediction of microvascular invasion of hepatocellular carcinoma

    PubMed Central

    Huang, Ya-Qin; Liang, He-Yue; Yang, Zhao-Xia; Ding, Ying; Zeng, Meng-Su; Rao, Sheng-Xiang

    2016-01-01

    The objective is to explore the value of preoperative magnetic resonance (MR) histogram analyses in predicting microvascular invasion (MVI) of hepatocellular carcinoma (HCC). Fifty-one patients with histologically confirmed HCC who underwent diffusion-weighted and contrast-enhanced MR imaging were included. Histogram analyses were performed, and the mean, variance, skewness, kurtosis, and 1st, 10th, 50th, 90th, and 99th percentiles were derived. Quantitative histogram parameters were compared between HCCs with and without MVI. Receiver operating characteristic (ROC) analyses were generated to compare the diagnostic performance of tumor size, histogram analyses of apparent diffusion coefficient (ADC) maps, and MR enhancement. The mean, 1st, 10th, and 50th percentiles of ADC maps, and the mean, variance, and 1st, 10th, 50th, 90th, and 99th percentiles of the portal venous phase (PVP) images were significantly different between the groups with and without MVI (P <0.05), with areas under the ROC curves (AUCs) of 0.66 to 0.74 for ADC and 0.76 to 0.88 for PVP. The largest AUC of PVP (1st percentile) showed significantly higher accuracy compared with that of the arterial phase (AP) or tumor size (P <0.001). MR histogram analyses, in particular the 1st percentile of PVP images, hold promise for the prediction of MVI of HCC. PMID:27368028

  7. Effect of respiratory and cardiac gating on the major diffusion-imaging metrics

    PubMed Central

    Hamaguchi, Hiroyuki; Sugimori, Hiroyuki; Nakanishi, Mitsuhiro; Nakagawa, Shin; Fujiwara, Taro; Yoshida, Hirokazu; Takamori, Sayaka; Shirato, Hiroki

    2016-01-01

    The effect of respiratory gating on the major diffusion-imaging metrics and that of cardiac gating on mean kurtosis (MK) are not known. For evaluation of whether the major diffusion-imaging metrics—MK, fractional anisotropy (FA), and mean diffusivity (MD) of the brain—varied between gated and non-gated acquisitions, respiratory-gated, cardiac-gated, and non-gated diffusion-imaging of the brain were performed in 10 healthy volunteers. MK, FA, and MD maps were constructed for all acquisitions, and the histograms were constructed. The normalized peak height and location of the histograms were compared among the acquisitions by use of Friedman and post hoc Wilcoxon tests. The effect of the repetition time (TR) on the diffusion-imaging metrics was also tested, and we corrected for its variation among acquisitions, if necessary. The results showed a shift in the peak location of the MK and MD histograms to the right with an increase in TR (p ≤ 0.01). The corrected peak location of the MK histograms, the normalized peak height of the FA histograms, the normalized peak height and the corrected peak location of the MD histograms varied significantly between the gated and non-gated acquisitions (p < 0.05). These results imply an influence of respiration and cardiac pulsation on the major diffusion-imaging metrics. The gating conditions must be kept identical if reproducible results are to be achieved. PMID:27073115

  8. Infrared face recognition based on LBP histogram and KW feature selection

    NASA Astrophysics Data System (ADS)

    Xie, Zhihua

    2014-07-01

    The conventional LBP-based feature as represented by the local binary pattern (LBP) histogram still has room for performance improvement. This paper focuses on the dimension reduction of LBP micro-patterns and proposes an improved infrared face recognition method based on LBP histogram representation. To extract local robust features from infrared face images, LBP is chosen to capture the composition of micro-patterns in sub-blocks. Based on statistical test theory, a Kruskal-Wallis (KW) feature selection method is proposed to select the LBP patterns that are suitable for infrared face recognition. The experimental results show that combining LBP with KW feature selection improves the performance of infrared face recognition; the proposed method outperforms traditional methods based on the LBP histogram, discrete cosine transform (DCT), or principal component analysis (PCA).
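A minimal 8-neighbour LBP histogram, the feature this paper starts from, can be sketched as follows; the KW selection step (ranking histogram bins by a Kruskal-Wallis test across subject classes and keeping the most discriminative ones) is omitted here for brevity:

```python
import numpy as np

def lbp_histogram(img):
    """Normalized 256-bin histogram of basic 8-neighbour LBP codes."""
    img = np.asarray(img, float)
    h, w = img.shape
    center = img[1:-1, 1:-1]
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
              (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros_like(center, dtype=np.int64)
    for bit, (dy, dx) in enumerate(shifts):
        neighbour = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        code |= (neighbour >= center).astype(np.int64) << bit  # one bit per neighbour
    hist = np.bincount(code.ravel(), minlength=256)
    return hist / hist.sum()
```

In the paper's setting this would be computed per sub-block and the per-block histograms concatenated before feature selection.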

  9. Remote logo detection using angle-distance histograms

    NASA Astrophysics Data System (ADS)

    Youn, Sungwook; Ok, Jiheon; Baek, Sangwook; Woo, Seongyoun; Lee, Chulhee

    2016-05-01

    Among computer vision applications, automatic logo recognition has drawn great interest from industry as well as academic institutions. In this paper, we propose an angle-distance map, which we use to develop a robust logo detection algorithm. The proposed angle-distance histogram is invariant to scale and rotation. The proposed method first uses shape information and color characteristics to find candidate regions and then applies the angle-distance histogram. Experiments show that the proposed method detects logos of various sizes and orientations.
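An angle-distance histogram in the spirit described, with scale invariance obtained by normalizing radii by the maximum radius, can be sketched as below; the bin counts are illustrative and the paper's exact construction (including how rotation invariance is achieved) may differ:

```python
import numpy as np

def angle_distance_histogram(points, n_angle=8, n_dist=4):
    """Joint histogram of contour points in (angle, normalized distance)
    about the centroid; dividing by the maximum radius removes scale."""
    p = np.asarray(points, float)
    centered = p - p.mean(axis=0)
    radius = np.hypot(centered[:, 0], centered[:, 1])
    angle = np.arctan2(centered[:, 1], centered[:, 0])
    radius = radius / radius.max()  # scale invariance
    hist, _, _ = np.histogram2d(angle, radius, bins=[n_angle, n_dist],
                                range=[[-np.pi, np.pi], [0.0, 1.0]])
    return hist / hist.sum()
```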

  10. Diagnosis of Temporomandibular Disorders Using Local Binary Patterns.

    PubMed

    Haghnegahdar, A A; Kolahi, S; Khojastepour, L; Tajeripour, F

    2018-03-01

    Temporomandibular joint disorder (TMD) might be manifested as structural changes in bone through modification, adaptation, or direct destruction. We propose to use Local Binary Pattern (LBP) characteristics and histogram-oriented gradients on the recorded images as a diagnostic tool in TMD assessment. CBCT images of 66 patients (132 joints) with TMD and 66 normal cases (132 joints) were collected, and 2 coronal cuts were prepared from each condyle; images were limited to the head of the mandibular condyle. To extract image features, we first apply LBP and then histograms of oriented gradients. To reduce dimensionality, the linear-algebra Singular Value Decomposition (SVD) is applied to the feature-vector matrix of all images. For evaluation, we used K nearest neighbor (K-NN), Support Vector Machine, Naïve Bayesian, and Random Forest classifiers, and Receiver Operating Characteristic (ROC) analysis to evaluate the hypothesis. The K nearest neighbor classifier achieves very good accuracy (0.9242), with desirable sensitivity (0.9470) and specificity (0.9015), whereas the other classifiers have lower accuracy, sensitivity, and specificity. We proposed a fully automatic approach to detect TMD using image processing techniques based on local binary patterns and feature extraction. K-NN was the best classifier in our experiments for distinguishing patients from healthy individuals, with 92.42% accuracy, 94.70% sensitivity, and 90.15% specificity. The proposed method can help automatically diagnose TMD at its initial stages.

  11. Impact of the radiotherapy technique on the correlation between dose-volume histograms of the bladder wall defined on MRI imaging and dose-volume/surface histograms in prostate cancer patients

    NASA Astrophysics Data System (ADS)

    Maggio, Angelo; Carillo, Viviana; Cozzarini, Cesare; Perna, Lucia; Rancati, Tiziana; Valdagni, Riccardo; Gabriele, Pietro; Fiorino, Claudio

    2013-04-01

    The aim of this study was to evaluate the correlation between the ‘true’ absolute and relative dose-volume histogram (DVH) of the bladder wall, the dose-wall histogram (DWH) defined on MRI, and other surrogates of bladder dosimetry in prostate cancer patients, planned with both 3D-conformal and intensity-modulated radiation therapy (IMRT) techniques. For 17 prostate cancer patients, previously treated with radical intent, CT and MRI scans were acquired and matched. The contours of the bladder walls were drawn using the MRI images. The external bladder surfaces were then used to generate artificial bladder walls by performing automatic contractions of 5, 7 and 10 mm. For each patient, a 3D conformal radiotherapy (3DCRT) and an IMRT treatment plan were generated with a prescription dose of 77.4 Gy (1.8 Gy/fr), and the DVHs of the whole bladder and of the artificial walls (DVH-5/7/10) and dose-surface histograms (DSHs) were calculated and compared against the DWH in absolute and relative values, for both treatment planning techniques. A dedicated software package (VODCA v. 4.4.0, MSS Inc.) was used for calculating the dose-volume/surface histograms. Correlation was quantified for selected dose-volume/surface parameters by the Spearman correlation coefficient. The agreement between the %DWH and DVH5, DVH7 and DVH10 was found to be very good (maximum average deviations below 2%, SD < 5%): DVH5 showed the best agreement. The correlation was slightly better for absolute (R = 0.80-0.94) than for relative (R = 0.66-0.92) histograms. The DSH was also found to be highly correlated with the DWH, although slightly larger deviations were generally found. The DVH was not a good surrogate of the DWH (R < 0.7 for most parameters). When comparing the two treatment techniques, more pronounced differences between relative histograms were seen for IMRT with respect to 3DCRT (p < 0.0001).

  12. Comparison of adverse effects of proton and X-ray chemoradiotherapy for esophageal cancer using an adaptive dose-volume histogram analysis.

    PubMed

    Makishima, Hirokazu; Ishikawa, Hitoshi; Terunuma, Toshiyuki; Hashimoto, Takayuki; Yamanashi, Koichi; Sekiguchi, Takao; Mizumoto, Masashi; Okumura, Toshiyuki; Sakae, Takeji; Sakurai, Hideyuki

    2015-05-01

    Cardiopulmonary late toxicity is of concern in concurrent chemoradiotherapy (CCRT) for esophageal cancer. The aim of this study was to examine the benefit of proton beam therapy (PBT) using clinical data and adaptive dose-volume histogram (DVH) analysis. The subjects were 44 patients with esophageal cancer who underwent definitive CCRT using X-rays (n = 19) or protons (n = 25). Experimental recalculation using protons was performed for each patient actually treated with X-rays, and vice versa. Target coverage and dose constraints of normal tissues were conserved. Lung V5-V20, mean lung dose (MLD), and heart V30-V50 were compared for risk organ doses between experimental plans and actual treatment plans. Potential toxicity was estimated using protons in patients actually treated with X-rays, and vice versa. Pulmonary events of Grade ≥2 occurred in 8/44 cases (18%), and cardiac events were seen in 11 cases (25%). Risk organ doses in patients with events of Grade ≥2 were significantly higher than for those with events of Grade ≤1. Risk organ doses were lower in proton plans compared with X-ray plans. All patients suffering toxicity who were treated with X-rays (n = 13) had reduced predicted doses in lung and heart using protons, while doses in all patients treated with protons (n = 24) with toxicity of Grade ≤1 had worsened predicted toxicity with X-rays. Analysis of normal tissue complication probability showed a potential reduction in toxicity by using proton beams. Irradiation dose, volume and adverse effects on the heart and lung can be reduced using protons. Thus, PBT is a promising treatment modality for the management of esophageal cancer. © The Author 2015. Published by Oxford University Press on behalf of The Japan Radiation Research Society and Japanese Society for Radiation Oncology.

  13. Objective evaluation of linear and nonlinear tomosynthetic reconstruction algorithms

    NASA Astrophysics Data System (ADS)

    Webber, Richard L.; Hemler, Paul F.; Lavery, John E.

    2000-04-01

    This investigation objectively tests five different tomosynthetic reconstruction methods involving three different digital sensors, each used in a different radiologic application: chest, breast, and pelvis, respectively. The common task was to simulate a specific representative projection for each application by summation of appropriately shifted tomosynthetically generated slices produced by using the five algorithms. These algorithms were, respectively, (1) conventional back projection, (2) iteratively deconvoluted back projection, (3) a nonlinear algorithm similar to back projection, except that the minimum value from all of the component projections for each pixel is computed instead of the average value, (4) a similar algorithm wherein the maximum value was computed instead of the minimum value, and (5) the same type of algorithm except that the median value was computed. Using these five algorithms, we obtained data from each sensor-tissue combination, yielding three factorially distributed series of contiguous tomosynthetic slices. The respective slice stacks then were aligned orthogonally and averaged to yield an approximation of a single orthogonal projection radiograph of the complete (unsliced) tissue thickness. Resulting images were histogram equalized, and actual projection control images were subtracted from their tomosynthetically synthesized counterparts. Standard deviations of the resulting histograms were recorded as inverse figures of merit (FOMs). Visual rankings of image differences by five human observers of a subset (breast data only) also were performed to determine whether their subjective observations correlated with homologous FOMs. Nonparametric statistical analysis of these data demonstrated significant differences (P < 0.05) between reconstruction algorithms. The nonlinear minimization reconstruction method nearly always outperformed the other methods tested. Observer rankings were similar to those measured objectively.
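The five reconstructions share a shift-and-add core and differ only in how the aligned projections are combined; a sketch of that core, assuming integer column shifts for simplicity (the iterative-deconvolution variant is not reproduced here):

```python
import numpy as np

def tomosynthesis_slice(projections, shifts, combine="mean"):
    """Shift-and-add reconstruction of one slice: align each projection
    for the plane of interest, then combine across the stack.
    combine='mean' is conventional back projection; 'min', 'max' and
    'median' are the nonlinear variants."""
    aligned = np.stack([np.roll(p, -s, axis=1)
                        for p, s in zip(projections, shifts)])
    op = {"mean": np.mean, "min": np.min,
          "max": np.max, "median": np.median}[combine]
    return op(aligned, axis=0)
```

With shifts matched to the plane all variants agree on an in-plane feature; with mismatched shifts the minimum suppresses the out-of-plane feature that the mean smears into ghosting, which is consistent with the minimization method's strong showing above.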

  14. Underwater Acoustic Propagation and Communications: A Coupled Research Program

    DTIC Science & Technology

    2015-06-15

    coding technique suitable for both SIMO and MIMO systems. 4. an adaptive OFDM modulation technique, whereby the transmitter acts in response to...timate based adaptation for SIMO and MIMO systems in an iterative turbo-equalization framework were developed and analyzed. MIMO and SISO

  15. Histograms and Raisin Bread

    ERIC Educational Resources Information Center

    Leyden, Michael B.

    1975-01-01

    Describes various elementary school activities using a loaf of raisin bread to promote inquiry skills. Activities include estimating the number of raisins in the loaf by constructing histograms of the number of raisins in a slice. (MLH)

  16. Infrared small target enhancement: grey level mapping based on improved sigmoid transformation and saliency histogram

    NASA Astrophysics Data System (ADS)

    Wan, Minjie; Gu, Guohua; Qian, Weixian; Ren, Kan; Chen, Qian

    2018-06-01

    Infrared (IR) small target enhancement plays a significant role in modern infrared search and track (IRST) systems and is a basic step in target detection and tracking. In this paper, a coarse-to-fine grey level mapping method using an improved sigmoid transformation and a saliency histogram is designed to enhance IR small targets under different backgrounds. In the rough enhancement stage, the intensity histogram is modified via an improved sigmoid function so as to narrow the typical intensity range of the background as much as possible. In the further enhancement stage, a linear transformation is performed based on a saliency histogram constructed by averaging the cumulative saliency values provided by a saliency map. Compared with other typical methods, the presented method achieves better performance in both visual assessment and quantitative evaluation.
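A rough stand-in for the sigmoid grey-level mapping of the coarse stage; the paper's improved transformation is more elaborate, and the gain and pivot here are illustrative:

```python
import numpy as np

def sigmoid_stretch(img, center=None, gain=10.0):
    """Sigmoid grey-level mapping: compresses intensities around `center`
    (e.g. the background level) and stretches the brighter tail where a
    small target lives. Assumes a non-constant input image."""
    img = np.asarray(img, float)
    x = (img - img.min()) / (img.max() - img.min())  # normalize to [0, 1]
    if center is None:
        center = float(x.mean())  # crude background-level estimate
    y = 1.0 / (1.0 + np.exp(-gain * (x - center)))
    return (y - y.min()) / (y.max() - y.min())
```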

  17. A domain-knowledge-inspired mathematical framework for the description and classification of H&E stained histopathology images.

    PubMed

    Massar, Melody L; Bhagavatula, Ramamurthy; Ozolek, John A; Castro, Carlos A; Fickus, Matthew; Kovačević, Jelena

    2011-10-19

    We present the current state of our work on a mathematical framework for the identification and delineation of histopathology images: local histograms and occlusion models. Local histograms are histograms computed over defined spatial neighborhoods whose purpose is to characterize an image locally. This unit of description is augmented by our occlusion models, which describe a methodology for image formation. In the context of this image formation model, the power of local histograms with respect to appropriate families of images is shown through various proven statements about expected performance. We conclude by presenting a preliminary study demonstrating the power of the framework in the context of histopathology image classification tasks that, while differing greatly in application, both originate from what is considered an appropriate class of images for this framework.

  18. [Research on K-means clustering segmentation method for MRI brain image based on selecting multi-peaks in gray histogram].

    PubMed

    Chen, Zhaoxue; Yu, Haizhong; Chen, Hao

    2013-12-01

    To solve the problem that traditional K-means clustering selects initial cluster centers randomly, we proposed a new K-means segmentation algorithm based on robustly selecting 'peaks' standing for white matter, gray matter, and cerebrospinal fluid in the multi-peak gray histogram of an MRI brain image. The new algorithm takes the gray values of the selected histogram 'peaks' as the initial K-means cluster centers and can segment the MRI brain image into the three tissue classes more effectively, accurately, and stably. Extensive experiments have shown that the proposed algorithm overcomes several shortcomings of the traditional K-means method, such as low efficiency, poor accuracy, weak robustness, and long running time. The histogram 'peak' selection idea of the proposed segmentation method is broadly applicable.
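
    The core idea, initializing 1-D K-means from the strongest histogram peaks, can be sketched as follows (the paper's robust WM/GM/CSF peak selection is more elaborate than this simple smoothed local-maximum search; all names and parameters here are illustrative):

    ```python
    import numpy as np

    def histogram_peak_centers(img, k=3, bins=256, smooth=5):
        """Pick k initial K-means centers from the k largest peaks of the
        grey-level histogram."""
        hist, edges = np.histogram(img.ravel(), bins=bins)
        kernel = np.ones(smooth) / smooth
        h = np.convolve(hist, kernel, mode="same")       # light smoothing
        # interior local maxima of the smoothed histogram
        peaks = [i for i in range(1, bins - 1)
                 if h[i] >= h[i - 1] and h[i] > h[i + 1]]
        peaks.sort(key=lambda i: h[i], reverse=True)     # strongest first
        return np.array([(edges[i] + edges[i + 1]) / 2
                         for i in sorted(peaks[:k])])

    def kmeans_1d(img, centers, iters=20):
        """Plain 1-D K-means seeded with the histogram-derived centers."""
        x = img.ravel().astype(float)
        c = centers.astype(float).copy()
        for _ in range(iters):
            labels = np.argmin(np.abs(x[:, None] - c[None, :]), axis=1)
            for j in range(len(c)):
                if np.any(labels == j):
                    c[j] = x[labels == j].mean()
        return c, labels.reshape(img.shape)
    ```

    Seeding from peaks makes the result deterministic, unlike random initialization.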

  19. Neutron camera employing row and column summations

    DOEpatents

    Clonts, Lloyd G.; Diawara, Yacouba; Donahue, Jr, Cornelius; Montcalm, Christopher A.; Riedel, Richard A.; Visscher, Theodore

    2016-06-14

    For each photomultiplier tube in an Anger camera, an R×S array of preamplifiers is provided to detect electrons generated within the photomultiplier tube. The outputs of the preamplifiers are digitized to measure the magnitude of the signals from each preamplifier. For each photomultiplier tube, corresponding summation circuitry including R row summation circuits and S column summation circuits numerically adds the magnitudes of the signals from the preamplifiers for each row and for each column to generate histograms. For a P×Q array of photomultiplier tubes, P×Q summation circuitries generate P×Q row histograms including R entries and P×Q column histograms including S entries. The total set of histograms includes P×Q×(R+S) entries, which can be analyzed by a position calculation circuit to determine the locations of events (detection of a neutron).
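
    The row/column summations reduce to sums along the two axes of each tube's preamplifier array; the numpy sketch below uses a simple centroid as a stand-in for the patent's position calculation circuit (function names are illustrative, not from the patent):

    ```python
    import numpy as np

    def row_col_histograms(counts):
        """R row sums and S column sums for one tube's R x S
        preamplifier signal array."""
        counts = np.asarray(counts, dtype=float)
        return counts.sum(axis=1), counts.sum(axis=0)

    def event_position(counts):
        """Estimate the event location as the centroid of the row and
        column histograms (simplified position calculation)."""
        rows, cols = row_col_histograms(counts)
        r = (np.arange(rows.size) * rows).sum() / rows.sum()
        c = (np.arange(cols.size) * cols).sum() / cols.sum()
        return r, c
    ```

    For a single localized charge deposit, the centroid recovers the deposit's row and column directly.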

  20. Evaluation of breast cancer using intravoxel incoherent motion (IVIM) histogram analysis: comparison with malignant status, histological subtype, and molecular prognostic factors.

    PubMed

    Cho, Gene Young; Moy, Linda; Kim, Sungheon G; Baete, Steven H; Moccaldi, Melanie; Babb, James S; Sodickson, Daniel K; Sigmund, Eric E

    2016-08-01

    To examine heterogeneous breast cancer through intravoxel incoherent motion (IVIM) histogram analysis. This HIPAA-compliant, IRB-approved retrospective study included 62 patients (age 48.44 ± 11.14 years, 50 malignant lesions and 12 benign) who underwent contrast-enhanced 3 T breast MRI and diffusion-weighted imaging. Apparent diffusion coefficient (ADC) and IVIM biomarkers of tissue diffusivity (Dt), perfusion fraction (fp), and pseudo-diffusivity (Dp) were calculated using voxel-based analysis for the whole lesion volume. Histogram analysis was performed to quantify tumour heterogeneity. Comparisons were made using Mann-Whitney tests between benign/malignant status, histological subtype, and molecular prognostic factor status while Spearman's rank correlation was used to characterize the association between imaging biomarkers and prognostic factor expression. The average values of the ADC and IVIM biomarkers, Dt and fp, showed significant differences between benign and malignant lesions. Additional significant differences were found in the histogram parameters among tumour subtypes and molecular prognostic factor status. IVIM histogram metrics, particularly fp and Dp, showed significant correlation with hormonal factor expression. Advanced diffusion imaging biomarkers show relationships with molecular prognostic factors and breast cancer malignancy. This analysis reveals novel diagnostic metrics that may explain some of the observed variability in treatment response among breast cancer patients. • Novel IVIM biomarkers characterize heterogeneous breast cancer. • Histogram analysis enables quantification of tumour heterogeneity. • IVIM biomarkers show relationships with breast cancer malignancy and molecular prognostic factors.

  1. Whole-tumor histogram analysis of the cerebral blood volume map: tumor volume defined by 11C-methionine positron emission tomography image improves the diagnostic accuracy of cerebral glioma grading.

    PubMed

    Wu, Rongli; Watanabe, Yoshiyuki; Arisawa, Atsuko; Takahashi, Hiroto; Tanaka, Hisashi; Fujimoto, Yasunori; Watabe, Tadashi; Isohashi, Kayako; Hatazawa, Jun; Tomiyama, Noriyuki

    2017-10-01

    This study aimed to compare the tumor volume definition using conventional magnetic resonance (MR) and 11C-methionine positron emission tomography (MET/PET) images in the differentiation of the pre-operative glioma grade by using whole-tumor histogram analysis of normalized cerebral blood volume (nCBV) maps. Thirty-four patients with histopathologically proven primary brain low-grade gliomas (n = 15) and high-grade gliomas (n = 19) underwent pre-operative or pre-biopsy MET/PET, fluid-attenuated inversion recovery, dynamic susceptibility contrast perfusion-weighted magnetic resonance imaging, and contrast-enhanced T1-weighted imaging at 3.0 T. The histogram distribution derived from the nCBV maps was obtained by co-registering the whole tumor volume delineated on conventional MR or MET/PET images, and eight histogram parameters were assessed. The mean nCBV value had the highest AUC value (0.906) based on MET/PET images. Diagnostic accuracy significantly improved when the tumor volume was measured from MET/PET images compared with conventional MR images for the parameters of mean, 50th, and 75th percentile nCBV value (p = 0.0246, 0.0223, and 0.0150, respectively). Whole-tumor histogram analysis of the CBV map provides more valuable histogram parameters and increases diagnostic accuracy in the differentiation of pre-operative cerebral gliomas when the tumor volume is derived from MET/PET images.

  2. Effect of respiratory and cardiac gating on the major diffusion-imaging metrics.

    PubMed

    Hamaguchi, Hiroyuki; Tha, Khin Khin; Sugimori, Hiroyuki; Nakanishi, Mitsuhiro; Nakagawa, Shin; Fujiwara, Taro; Yoshida, Hirokazu; Takamori, Sayaka; Shirato, Hiroki

    2016-08-01

    The effect of respiratory gating on the major diffusion-imaging metrics and that of cardiac gating on mean kurtosis (MK) are not known. For evaluation of whether the major diffusion-imaging metrics-MK, fractional anisotropy (FA), and mean diffusivity (MD) of the brain-varied between gated and non-gated acquisitions, respiratory-gated, cardiac-gated, and non-gated diffusion-imaging of the brain were performed in 10 healthy volunteers. MK, FA, and MD maps were constructed for all acquisitions, and the histograms were constructed. The normalized peak height and location of the histograms were compared among the acquisitions by use of Friedman and post hoc Wilcoxon tests. The effect of the repetition time (TR) on the diffusion-imaging metrics was also tested, and we corrected for its variation among acquisitions, if necessary. The results showed a shift in the peak location of the MK and MD histograms to the right with an increase in TR (p ≤ 0.01). The corrected peak location of the MK histograms, the normalized peak height of the FA histograms, the normalized peak height and the corrected peak location of the MD histograms varied significantly between the gated and non-gated acquisitions (p < 0.05). These results imply an influence of respiration and cardiac pulsation on the major diffusion-imaging metrics. The gating conditions must be kept identical if reproducible results are to be achieved. © The Author(s) 2016.

  3. A Substituting Meaning for the Equals Sign in Arithmetic Notating Tasks

    ERIC Educational Resources Information Center

    Jones, Ian; Pratt, Dave

    2012-01-01

    Three studies explore arithmetic tasks that support both substitutive and basic relational meanings for the equals sign. The duality of meanings enabled children to engage meaningfully and purposefully with the structural properties of arithmetic statements in novel ways. Some, but not all, children were successful at the adapted task and were…

  4. Gender Equality in German Universities: Vernacularising the Battle for the Best Brains

    ERIC Educational Resources Information Center

    Zippel, Kathrin; Ferree, Myra Marx; Zimmermann, Karin

    2016-01-01

    We examine how global pressures for competitiveness and gender equality have merged into a discourse of "inclusive excellence" in the twenty-first century and shaped three recent German higher education programmes. After placing these programmes in the larger discourse about gender inequalities, we focus on how they adapt current global…

  5. Repetitive Behaviors in Autism: Relationships with Associated Clinical Features

    ERIC Educational Resources Information Center

    Gabriels, Robin L.; Cuccaro, Michael L.; Hill, Dina E.; Ivers, Bonnie J.; Goldson, Edward

    2005-01-01

    Relationships between repetitive behaviors (RBs) and associated clinical features (i.e., cognitive and adaptive functioning levels, sleep problems, medication use, and other behavioral problems) were examined in two groups (High nonverbal IQ greater than or equal to 97 versus Low nonverbal IQ less than or equal to 56) of children with autism…

  6. Efficient nonlinear equalizer for intra-channel nonlinearity compensation for next generation agile and dynamically reconfigurable optical networks.

    PubMed

    Malekiha, Mahdi; Tselniker, Igor; Plant, David V

    2016-02-22

    In this work, we propose and experimentally demonstrate a novel low-complexity technique for fiber nonlinearity compensation. We achieved a transmission distance of 2818 km for a 32-GBaud dual-polarization 16QAM signal. For efficient implementation, and to facilitate integration with conventional digital signal processing (DSP) approaches, we independently compensate fiber nonlinearities after linear impairment equalization. Therefore this algorithm can be easily implemented in currently deployed transmission systems after using linear DSP. The proposed equalizer operates at one sample per symbol and requires only one computation step. The structure of the algorithm is based on a first-order perturbation model with quantized perturbation coefficients. Also, it does not require any prior calculation or detailed knowledge of the transmission system. We identified common symmetries between perturbation coefficients to avoid duplicate and unnecessary operations. In addition, we use only a few adaptive filter coefficients by grouping multiple nonlinear terms and dedicating only one adaptive nonlinear filter coefficient to each group. Finally, the complexity of the proposed algorithm is lower than previously studied nonlinear equalizers by more than one order of magnitude.

  7. A successive overrelaxation iterative technique for an adaptive equalizer

    NASA Technical Reports Server (NTRS)

    Kosovych, O. S.

    1973-01-01

    An adaptive strategy for the equalization of pulse-amplitude-modulated signals in the presence of intersymbol interference and additive noise is reported. The successive overrelaxation iterative technique is used as the algorithm for the iterative adjustment of the equalizer coefficients during a training period for the minimization of the mean square error. With 2-cyclic and nonnegative Jacobi matrices substantial improvement is demonstrated in the rate of convergence over the commonly used gradient techniques. The Jacobi theorems are also extended to nonpositive Jacobi matrices. Numerical examples strongly indicate that the improvements obtained for the special cases are possible for general channel characteristics. The technique is analytically demonstrated to decrease the mean square error at each iteration for a large range of parameter values for light or moderate intersymbol interference and for small intervals for general channels. Analytically, convergence of the relaxation algorithm was proven in a noisy environment and the coefficient variance was demonstrated to be bounded.
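
    The successive overrelaxation (SOR) iteration itself is standard; applied to normal equations of the form A c = b for the MSE-optimal coefficients (with A the channel autocorrelation matrix and b the cross-correlation vector, names chosen here for illustration), a sketch looks like this:

    ```python
    import numpy as np

    def sor_solve(A, b, omega=1.2, iters=200):
        """Successive overrelaxation for A c = b. For symmetric positive
        definite A, SOR converges for any relaxation factor 0 < omega < 2;
        omega = 1 reduces to Gauss-Seidel."""
        c = np.zeros_like(b, dtype=float)
        for _ in range(iters):
            for i in range(len(b)):
                # Gauss-Seidel residual using already-updated entries
                sigma = A[i] @ c - A[i, i] * c[i]
                gs = (b[i] - sigma) / A[i, i]
                c[i] = (1 - omega) * c[i] + omega * gs   # over-relax
        return c
    ```

    Choosing omega above 1 over-corrects each coordinate update, which is exactly the mechanism that can beat plain gradient descent in convergence rate.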

  8. Equalization of nonlinear transmission impairments by maximum-likelihood-sequence estimation in digital coherent receivers.

    PubMed

    Khairuzzaman, Md; Zhang, Chao; Igarashi, Koji; Katoh, Kazuhiro; Kikuchi, Kazuro

    2010-03-01

    We describe a successful introduction of maximum-likelihood-sequence estimation (MLSE) into digital coherent receivers together with finite-impulse response (FIR) filters in order to equalize both linear and nonlinear fiber impairments. The MLSE equalizer based on the Viterbi algorithm is implemented in the offline digital signal processing (DSP) core. We transmit 20-Gbit/s quadrature phase-shift keying (QPSK) signals through a 200-km-long standard single-mode fiber. The bit-error rate performance shows that the MLSE equalizer outperforms the conventional adaptive FIR filter, especially when nonlinear impairments are predominant.

  9. Automated segmentation and isolation of touching cell nuclei in cytopathology smear images of pleural effusion using distance transform watershed method

    NASA Astrophysics Data System (ADS)

    Win, Khin Yadanar; Choomchuay, Somsak; Hamamoto, Kazuhiko

    2017-06-01

    The automated segmentation of cell nuclei is an essential stage in the quantitative image analysis of cell nuclei extracted from smear cytology images of pleural fluid. Cell nuclei can indicate cancer, as their characteristics are associated with cell proliferation and malignancy in terms of size, shape, and stain color. Nevertheless, automatic nuclei segmentation has remained challenging due to artifacts caused by slide preparation and nuclei heterogeneity such as poor contrast, inconsistent staining, cell variation, and overlapping cells. In this paper, we propose a watershed-based method capable of segmenting the nuclei of a variety of cells in cytology pleural fluid smear images. First, the original image is preprocessed by converting it to grayscale and enhancing it by adjusting and equalizing the intensity using histogram equalization. Next, the cell nuclei are segmented by OTSU thresholding to produce a binary image. Undesirable artifacts are eliminated using morphological operations. Finally, the distance-transform-based watershed method is applied to isolate touching and overlapping cell nuclei. The proposed method was tested on 25 Papanicolaou (Pap) stained pleural fluid images. The accuracy of our proposed method is 92%. The method is relatively simple, and the results are very promising.
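
    Of the pipeline's stages, the OTSU binarization is the most self-contained; below is a numpy sketch of OTSU's between-class-variance maximization (the distance-transform watershed split of touching nuclei would typically come from a library such as scikit-image and is not shown):

    ```python
    import numpy as np

    def otsu_threshold(img, bins=256):
        """Return the grey level maximizing between-class variance of the
        histogram (OTSU's method), used above to binarize the nuclei."""
        hist, edges = np.histogram(img.ravel(), bins=bins)
        p = hist.astype(float) / hist.sum()       # grey-level probabilities
        centers = (edges[:-1] + edges[1:]) / 2
        w0 = np.cumsum(p)                         # class-0 (background) weight
        m = np.cumsum(p * centers)                # cumulative first moment
        mt = m[-1]                                # global mean
        w1 = 1.0 - w0
        valid = (w0 > 0) & (w1 > 0)
        between = np.zeros_like(w0)
        between[valid] = (mt * w0 - m)[valid] ** 2 / (w0 * w1)[valid]
        return centers[np.argmax(between)]
    ```

    On a clearly bimodal intensity distribution the returned threshold falls between the two modes.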

  10. MRI volumetry of prefrontal cortex

    NASA Astrophysics Data System (ADS)

    Sheline, Yvette I.; Black, Kevin J.; Lin, Daniel Y.; Pimmel, Joseph; Wang, Po; Haller, John W.; Csernansky, John G.; Gado, Mokhtar; Walkup, Ronald K.; Brunsden, Barry S.; Vannier, Michael W.

    1995-05-01

    Prefrontal cortex volumetry by brain magnetic resonance (MR) is required to estimate changes postulated to occur in certain psychiatric and neurologic disorders. A semiautomated method with quantitative characterization of its performance is sought to reliably distinguish small prefrontal cortex volume changes within individuals and between groups. Stereological methods were tested by a blinded comparison of measurements applied to 3D MR scans obtained using an MPRAGE protocol. Fixed grid stereologic methods were used to estimate prefrontal cortex volumes on a graphic workstation, after the images are scaled from 16 to 8 bits using a histogram method. In addition images were resliced into coronal sections perpendicular to the bicommissural plane. Prefrontal cortex volumes were defined as all sections of the frontal lobe anterior to the anterior commissure. Ventricular volumes were excluded. Stereological measurement yielded high repeatability and precision, and was time efficient for the raters. The coefficient of error was

  11. Evaluating CMA equalization of SOQPSK-TG data for aeronautical telemetry

    NASA Astrophysics Data System (ADS)

    Cole-Rhodes, Arlene; KoneDossongui, Serge; Umuolo, Henry; Rice, Michael

    2015-05-01

    This paper presents the results of using a constant modulus algorithm (CMA) to recover shaped offset quadrature-phase shift keying (SOQPSK)-TG modulated data, which has been transmitted using the iNET data packet structure. This standard is defined and used for aeronautical telemetry. Based on the iNET-packet structure, the adaptive block processing CMA equalizer can be initialized using the minimum mean square error (MMSE) equalizer [3]. This CMA equalizer is being evaluated for use on iNET structured data, with initial tests being conducted on measured data which has been received in a controlled laboratory environment. Thus the CMA equalizer is applied at the receiver to data packets which have been experimentally generated in order to determine the feasibility of our equalization approach, and its performance is compared to that of the MMSE equalizer. Performance evaluation is based on computed bit error rate (BER) counts for these equalizers.
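
    A generic sample-by-sample CMA tap update can be sketched as follows (the paper's adaptive block-processing variant with MMSE initialization differs in detail; `num_taps`, `mu`, and `r2` are illustrative parameters):

    ```python
    import numpy as np

    def cma_equalize(x, num_taps=7, mu=1e-3, r2=1.0):
        """Constant modulus algorithm: adapt FIR taps w by stochastic
        gradient descent so the equalizer output modulus |y|^2 is driven
        toward the constant r2 (generic CMA sketch)."""
        w = np.zeros(num_taps, dtype=complex)
        w[num_taps // 2] = 1.0                      # center-spike start
        y = np.zeros(len(x), dtype=complex)
        for n in range(num_taps, len(x)):
            u = x[n - num_taps:n][::-1]             # regressor, newest first
            y[n] = w @ u
            e = y[n] * (np.abs(y[n]) ** 2 - r2)     # CMA error term
            w -= mu * e * np.conj(u)                # gradient step
        return y, w
    ```

    Because the cost depends only on the output modulus, no training symbols are needed; a wrong input gain, for instance, is absorbed into the taps until the output modulus settles at r2.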

  12. Pattern-histogram-based temporal change detection using personal chest radiographs

    NASA Astrophysics Data System (ADS)

    Ugurlu, Yucel; Obi, Takashi; Hasegawa, Akira; Yamaguchi, Masahiro; Ohyama, Nagaaki

    1999-05-01

    Accurate and reliable detection of temporal changes from a pair of images is of considerable interest in medical science. Traditional registration and subtraction techniques can be applied to extract temporal differences when the object is rigid or corresponding points are obvious. However, in radiological imaging, loss of depth information, the elasticity of the object, the absence of clearly defined landmarks, and three-dimensional positioning differences constrain the performance of conventional registration techniques. In this paper, we propose a new method to detect interval changes accurately without using an image registration technique. The method is based on the construction of a so-called pattern histogram and a comparison procedure. The pattern histogram is a graphic representation of the frequency counts of all allowable patterns in the multi-dimensional pattern vector space. The K-means algorithm is employed to partition the pattern vector space successively. Any differences in the pattern histograms imply that different patterns are involved in the scenes. In our experiment, a pair of chest radiographs of pneumoconiosis is employed and the changing histogram bins are visualized on both images. We found that the method can be used as an alternative way of temporal change detection, particularly when precise image registration is not available.

  13. A Concise Guide to Feature Histograms with Applications to LIDAR-Based Spacecraft Relative Navigation

    NASA Astrophysics Data System (ADS)

    Rhodes, Andrew P.; Christian, John A.; Evans, Thomas

    2017-12-01

    With the availability and popularity of 3D sensors, it is advantageous to re-examine the use of point cloud descriptors for the purpose of pose estimation and spacecraft relative navigation. One popular descriptor is the oriented unique repeatable clustered viewpoint feature histogram (OUR-CVFH), which is most often utilized in personal and industrial robotics to simultaneously recognize and navigate relative to an object. Recent research into using the OUR-CVFH descriptor for spacecraft navigation has produced favorable results. Since OUR-CVFH is the most recent innovation in a large family of feature histogram point cloud descriptors, discussions of parameter settings and insights into its functionality are spread among various publications and online resources. This paper organizes the history of feature histogram point cloud descriptors for a straightforward explanation of their evolution. This article compiles all the requisite information needed to implement OUR-CVFH into one location, as well as providing useful suggestions on how to tune the generation parameters. This work is beneficial for anyone interested in using this histogram descriptor for object recognition or navigation, be it in personal robotics or spacecraft navigation.

  14. Hybrid Histogram Descriptor: A Fusion Feature Representation for Image Retrieval.

    PubMed

    Feng, Qinghe; Hao, Qiaohong; Chen, Yuqi; Yi, Yugen; Wei, Ying; Dai, Jiangyan

    2018-06-15

    Currently, visual sensors are becoming increasingly affordable and widespread, accelerating the growth of image data. Image retrieval has attracted increasing interest due to space exploration, industrial, and biomedical applications. Nevertheless, designing an effective feature representation is acknowledged as a hard yet fundamental issue. This paper presents a fusion feature representation called the hybrid histogram descriptor (HHD) for image retrieval. The proposed descriptor comprises two histograms jointly: a perceptually uniform histogram, which is extracted by exploiting the color and edge orientation information in perceptually uniform regions; and a motif co-occurrence histogram, which is acquired by calculating the probability of a pair of motif patterns. To evaluate the performance, we benchmarked the proposed descriptor on the RSSCN7, AID, Outex-00013, Outex-00014 and ETHZ-53 datasets. Experimental results suggest that the proposed descriptor is more effective and robust than ten recent fusion-based descriptors under the content-based image retrieval framework. The computational complexity was also analyzed to give an in-depth evaluation. Furthermore, compared with state-of-the-art convolutional neural network (CNN)-based descriptors, the proposed descriptor also achieves comparable performance, but does not require any training process.

  15. Improved LSB matching steganography with histogram characters reserved

    NASA Astrophysics Data System (ADS)

    Chen, Zhihong; Liu, Wenyao

    2008-03-01

    This letter builds on research into the LSB (least significant bit, i.e., the last bit of a binary pixel value) matching steganographic method and the steganalytic methods that target histograms of cover images, and proposes a modification to LSB matching. In LSB matching, if the LSB of the next cover pixel matches the next bit of secret data, nothing is done; otherwise, one is added to or subtracted from the cover pixel value at random. In our improved method, a steganographic information table is defined that records the changes introduced by the embedded secret bits. Using the table, the choice to add or subtract one at the next pixel with the same value is made dynamically, so that the change to the cover image's histogram is minimized. Therefore, the modified method embeds the same payload as LSB matching but with improved steganographic security and less vulnerability to attacks. The experimental results show that with the new method the histograms maintain their attributes, such as peak values and alternating trends, to an acceptable degree, and that it outperforms LSB matching in terms of histogram distortion and resistance against existing steganalysis.
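
    A sketch of the idea, using a simple running table of net histogram changes in place of the paper's more detailed steganographic information table (function name, `delta` bookkeeping, and tie-breaking rule are all illustrative assumptions):

    ```python
    import numpy as np

    def lsb_match_embed(pixels, bits, seed=0):
        """LSB matching with histogram-aware +/-1 choice: when a pixel's
        LSB must flip, move it into whichever neighboring bin the running
        change table shows as depleted, pushing the stego histogram back
        toward the cover histogram."""
        rng = np.random.default_rng(seed)
        out = pixels.astype(int).copy().ravel()
        delta = np.zeros(256, dtype=int)       # net histogram change so far
        for i, bit in enumerate(bits):
            v = out[i]
            if v & 1 == bit:
                continue                       # LSB already matches the bit
            # +/-1 candidates, clamped at the grey-level range ends
            cands = [c for c in (v - 1, v + 1) if 0 <= c <= 255]
            cands.sort(key=lambda c: delta[c])
            if delta[cands[0]] < delta[cands[-1]]:
                c = cands[0]                   # refill the depleted bin
            else:
                c = int(rng.choice(cands))     # tie: fall back to random
            delta[v] -= 1
            delta[c] += 1
            out[i] = c
        return out.reshape(pixels.shape)
    ```

    Extraction is unchanged from plain LSB matching: the receiver simply reads the LSBs of the first len(bits) pixels.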

  16. Hybrid time-frequency domain equalization for LED nonlinearity mitigation in OFDM-based VLC systems.

    PubMed

    Li, Jianfeng; Huang, Zhitong; Liu, Xiaoshuang; Ji, Yuefeng

    2015-01-12

    A novel hybrid time-frequency domain equalization scheme is proposed and experimentally demonstrated to mitigate the white light emitting diode (LED) nonlinearity in visible light communication (VLC) systems based on orthogonal frequency division multiplexing (OFDM). We handle the linear and nonlinear distortion separately in a nonlinear OFDM system. The linear part is equalized in frequency domain and the nonlinear part is compensated by an adaptive nonlinear time domain equalizer (N-TDE). The experimental results show that with only a small number of parameters the nonlinear equalizer can efficiently mitigate the LED nonlinearity. With the N-TDE the modulation index (MI) and BER performance can be significantly enhanced.

  17. Histograms and Frequency Density.

    ERIC Educational Resources Information Center

    Micromath, 2003

    2003-01-01

    Introduces exercises on histograms and frequency density. Guides pupils to Discovering Important Statistical Concepts Using Spreadsheets (DISCUSS), created at the University of Coventry. Includes curriculum points, teaching tips, activities, and internet address (http://www.coventry.ac.uk/discuss/). (KHR)

  18. The spectral archive of cosmic X-ray sources observed by the Einstein Observatory Focal Plane Crystal Spectrometer

    NASA Technical Reports Server (NTRS)

    Lum, Kenneth S. K.; Canizares, Claude R.; Clark, George W.; Coyne, Joan M.; Markert, Thomas H.; Saez, Pablo J.; Schattenburg, Mark L.; Winkler, P. F.

    1992-01-01

    The Einstein Observatory Focal Plane Crystal Spectrometer (FPCS) used the technique of Bragg spectroscopy to study cosmic X-ray sources in the 0.2-3 keV energy range. The high spectral resolving power (E/ΔE ≈ 100-1000) of this instrument allowed it to resolve closely spaced lines and study the structure of individual features in the spectra of 41 cosmic X-ray sources. An archival summary of the results is presented as a concise record of the FPCS observations and a source of information for future analysis by the general astrophysics community. For each observation, the instrument configuration, background rate, X-ray flux or upper limit within the energy band observed, and spectral histograms are given. Examples of the contributions the FPCS observations have made to the understanding of the objects observed are discussed.

  19. Video enhancement workbench: an operational real-time video image processing system

    NASA Astrophysics Data System (ADS)

    Yool, Stephen R.; Van Vactor, David L.; Smedley, Kirk G.

    1993-01-01

    Video image sequences can be exploited in real-time, giving analysts rapid access to information for military or criminal investigations. Video-rate dynamic range adjustment subdues fluctuations in image intensity, thereby assisting discrimination of small or low- contrast objects. Contrast-regulated unsharp masking enhances differentially shadowed or otherwise low-contrast image regions. Real-time removal of localized hotspots, when combined with automatic histogram equalization, may enhance resolution of objects directly adjacent. In video imagery corrupted by zero-mean noise, real-time frame averaging can assist resolution and location of small or low-contrast objects. To maximize analyst efficiency, lengthy video sequences can be screened automatically for low-frequency, high-magnitude events. Combined zoom, roam, and automatic dynamic range adjustment permit rapid analysis of facial features captured by video cameras recording crimes in progress. When trying to resolve small objects in murky seawater, stereo video places the moving imagery in an optimal setting for human interpretation.

  20. Lead concentration distribution and source tracing of urban/suburban aquatic sediments in two typical famous tourist cities: Haikou and Sanya, China.

    PubMed

    Dong, Zhicheng; Bao, Zhengyu; Wu, Guoai; Fu, Yangrong; Yang, Yi

    2010-11-01

    The content and spatial distribution of lead in the aquatic systems in two Chinese tropical cities in Hainan province (Haikou and Sanyan) show an unequal distribution of lead between the urban and the suburban areas. The lead content is significantly higher (72.3 mg/kg) in the urban area than the suburbs (15.0 mg/kg) in Haikou, but quite equal in Sanya (41.6 and 43.9 mg/kg). The frequency distribution histograms suggest that the lead in Haikou and in Sanya derives from different natural and/or anthropogenic sources. The isotopic compositions indicate that urban sediment lead in Haikou originates mainly from anthropogenic sources (automobile exhaust, atmospheric deposition, etc.) which contribute much more than the natural sources, while natural lead (basalt and sea sands) is still dominant in the suburban areas in Haikou. In Sanya, the primary source is natural (soils and sea sands).

  1. Image contrast enhancement with brightness preservation using an optimal gamma correction and weighted sum approach

    NASA Astrophysics Data System (ADS)

    Jiang, G.; Wong, C. Y.; Lin, S. C. F.; Rahman, M. A.; Ren, T. R.; Kwok, Ngaiming; Shi, Haiyan; Yu, Ying-Hao; Wu, Tonghai

    2015-04-01

    The enhancement of image contrast and preservation of image brightness are two important but conflicting objectives in image restoration. Previous attempts based on linear histogram equalization achieved contrast enhancement, but exact preservation of brightness was not accomplished. A new perspective is taken here to provide balanced performance of contrast enhancement and brightness preservation simultaneously by casting the quest for such a solution as an optimization problem. Specifically, the non-linear gamma correction method is adopted to enhance the contrast, while a weighted sum approach is employed for brightness preservation. In addition, the efficient golden search algorithm is exploited to determine the required optimal parameters to produce the enhanced images. Experiments are conducted on natural colour images captured under various indoor, outdoor and illumination conditions. Results have shown that the proposed method outperforms currently available methods in both contrast enhancement and brightness preservation.
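
    A minimal sketch of the approach: golden-section search over the gamma exponent, minimizing an illustrative weighted sum of (negated) contrast and brightness drift. The cost function and weight `w` are assumptions for illustration, not the paper's exact objective:

    ```python
    import numpy as np

    def golden_search(f, a, b, tol=1e-4):
        """Golden-section search for the minimizer of a unimodal f on [a, b]."""
        g = (np.sqrt(5) - 1) / 2
        c, d = b - g * (b - a), a + g * (b - a)
        while b - a > tol:
            if f(c) < f(d):            # minimum lies in [a, d]
                b, d = d, c
                c = b - g * (b - a)
            else:                      # minimum lies in [c, b]
                a, c = c, d
                d = a + g * (b - a)
        return (a + b) / 2

    def enhance(img, w=0.5):
        """Pick gamma balancing contrast gain against brightness drift."""
        x = np.clip(img.astype(float) / 255.0, 0, 1)

        def cost(gamma):
            y = x ** gamma
            contrast = y.std()                    # larger is better
            drift = abs(y.mean() - x.mean())      # smaller preserves brightness
            return -(1 - w) * contrast + w * drift

        gamma = golden_search(cost, 0.2, 5.0)
        return (x ** gamma) * 255.0, gamma
    ```

    Golden-section search needs only function evaluations (no derivatives), which is why it pairs naturally with a non-linear objective like this.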

  2. Nondestructive Detection of the Internal Quality of Apple Using X-Ray and Machine Vision

    NASA Astrophysics Data System (ADS)

    Yang, Fuzeng; Yang, Liangliang; Yang, Qing; Kang, Likui

    The internal quality of an apple cannot be judged by eye during sorting, so defective apples may reach the market. This paper describes an instrument using X-ray imaging and machine vision. The following steps were used to process the X-ray image in order to identify mould-core apples. First, a lifting wavelet transform was applied to obtain one low-frequency image and three high-frequency images. Second, the low-frequency image was enhanced through histogram equalization. Then, the edge of each apple image was detected using the Canny operator. Finally, a threshold on the apple core's diameter was set to separate mould-core from normal apples. The experimental results show that this method can detect mould-core apples on-line in less than 0.03 seconds per apple, with an accuracy of 92%.

  3. Fluorescent Microscopy Enhancement Using Imaging

    NASA Astrophysics Data System (ADS)

    Conrad, Morgan P.; Recktenwald, Diether J.; Woodhouse, Bryan S.

    1986-06-01

    To enhance our capabilities for observing fluorescent stains in biological systems, we are developing a low-cost imaging system based on an IBM AT microcomputer and a commercial image-capture board compatible with a standard RS-170 format video camera. The image is digitized in real time with 256 grey levels, while being displayed and also stored in memory. The software allows for interactive processing of the data, such as histogram equalization or pseudocolor enhancement of the display. The entire image, or a quadrant thereof, can be averaged over time to improve the signal-to-noise ratio. Images may be stored to disk for later use or comparison. The camera may be selected for better response in the UV or near IR. Combined with signal averaging, this increases the sensitivity relative to that of the human eye, while still allowing the fluorescence distribution on either the surface or the internal cytoskeletal structure to be observed.
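    The time averaging described above is plain frame averaging. A generic sketch (synthetic data, not the original AT software) showing the expected roughly sqrt(N) noise reduction:

```python
import numpy as np

rng = np.random.default_rng(1)
truth = np.full((32, 32), 100.0)                 # noiseless scene
# Simulate 64 video frames with independent additive noise (std = 10).
frames = [truth + rng.normal(0, 10, truth.shape) for _ in range(64)]

avg = np.mean(np.stack(frames), axis=0)           # pixel-wise average over time

noise_single = np.std(frames[0] - truth)          # about 10
noise_avg = np.std(avg - truth)                   # about 10 / sqrt(64)
```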

  4. The DataCube Server. Animate Agent Project Working Note 2, Version 1.0

    DTIC Science & Technology

    1993-11-01

    before this can be called, a histogram of all the needed levels must be made and their one-band images must be made. Note: if a level's backprojection will not be used, then the level does not need to be histogrammed. Any points outside the active region in a level's backprojection will be undefined...

  5. DNA flow cytometric analysis of primary operable breast cancer. Relation of ploidy and S-phase fraction to outcome of patients in NSABP B-04.

    PubMed

    Fisher, B; Gunduz, N; Costantino, J; Fisher, E R; Redmond, C; Mamounas, E P; Siderits, R

    1991-10-01

    Between 1971 and 1974, 1665 women with primary operable breast cancer were randomized into a National Surgical Adjuvant Breast and Bowel Project (NSABP) trial (B-04) conducted to evaluate the effectiveness of several different regimens of surgical and radiation therapy. No systemic therapy was given. Cells from archival paraffin-embedded tumor tissue taken from 398 patients were analyzed for ploidy and S-phase fraction (SPF) using flow cytometry. Characteristics and outcome of patients with satisfactory DNA histograms were comparable to those from whom no satisfactory cytometric studies were available. In patients with diploid tumors (43%), the mean SPF was 3.4% +/- 2.3%; in the aneuploid population (57%), the SPF was 7.9% +/- 6.3%. Only 29.9% +/- 17.3% of cells in aneuploid tumors were aneuploid. Diploid tumors were more likely than aneuploid tumors to be of good nuclear grade (P < 0.001) and smaller size (P = 0.03). More tumors with high SPF were of poor nuclear grade than were tumors with low SPF (P = 0.002). No significant difference in 10-year disease-free survival (P = 0.3) or survival (P = 0.1) was found between women with diploid or aneuploid tumors. Patients with low-SPF tumors had a 13% better disease-free survival (P = 0.0006) than those with a high SPF and a 14% better survival (P = 0.007) at 10 years than patients with high-SPF tumors. After adjustment for clinical tumor size, the difference in both disease-free survival and survival between patients with high- and low-SPF tumors was only 10% (P = 0.04 and 0.08, respectively). Although SPF was found to be of independent prognostic significance for disease-free survival and of marginal significance for survival, it did not identify patients with such a good prognosis as to preclude their receiving chemotherapy. The overall survival of patients with low SPF was only 53% at 10 years. These findings and those of others indicate that additional studies are necessary before tumor ploidy and SPF can be used to select patients who should or should not receive systemic therapy.

  6. Enhanced facial texture illumination normalization for face recognition.

    PubMed

    Luo, Yong; Guan, Ye-Peng

    2015-08-01

    An uncontrolled lighting condition is one of the most critical challenges for practical face recognition applications. An enhanced facial texture illumination normalization method is put forward to resolve this challenge. An adaptive relighting algorithm is developed to improve the brightness uniformity of face images. Facial texture is extracted by using an illumination estimation difference algorithm. An anisotropic histogram-stretching algorithm is proposed to minimize the intraclass distance of facial skin and maximize the dynamic range of facial texture distribution. Compared with the existing methods, the proposed method can more effectively eliminate the redundant information of facial skin and illumination. Extensive experiments show that the proposed method has superior performance in normalizing illumination variation and enhancing facial texture features for illumination-insensitive face recognition.

  7. Diffusion Profiling via a Histogram Approach Distinguishes Low-grade from High-grade Meningiomas, Can Reflect the Respective Proliferative Potential and Progesterone Receptor Status.

    PubMed

    Gihr, Georg Alexander; Horvath-Rizea, Diana; Garnov, Nikita; Kohlhof-Meinecke, Patricia; Ganslandt, Oliver; Henkes, Hans; Meyer, Hans Jonas; Hoffmann, Karl-Titus; Surov, Alexey; Schob, Stefan

    2018-02-01

    Presurgical grading, estimation of growth kinetics, and other prognostic factors are becoming increasingly important for selecting the best therapeutic approach for meningioma patients. Diffusion-weighted imaging (DWI) provides microstructural information and reflects tumor biology. A novel DWI approach, histogram profiling of apparent diffusion coefficient (ADC) volumes, provides more distinct information than conventional DWI. Therefore, our study investigated whether ADC histogram profiling distinguishes low-grade from high-grade lesions and reflects Ki-67 expression and progesterone receptor status. Pretreatment ADC volumes of 37 meningioma patients (28 low-grade, 9 high-grade) were used for histogram profiling. WHO grade, Ki-67 expression, and progesterone receptor status were evaluated. Comparative and correlative statistics investigating the association between histogram profiling and neuropathology were performed. The entire ADC profile (p10, p25, p75, p90, mean, median) was significantly lower in high-grade than in low-grade meningiomas. The lower percentiles, mean, and mode showed significant correlations with Ki-67 expression. Skewness and entropy of the ADC volumes were significantly associated with progesterone receptor status and Ki-67 expression. ROC analysis revealed entropy to be the most accurate parameter for distinguishing low-grade from high-grade meningiomas. ADC histogram profiling thus provides a distinct set of parameters that help differentiate low-grade from high-grade meningiomas, and histogram metrics correlate significantly with histological surrogates of the proliferative potential. More specifically, entropy proved to be the most promising imaging biomarker for presurgical grading. Both entropy and skewness were significantly associated with progesterone receptor status and Ki-67 expression and should therefore be investigated further as predictors of prognostically relevant tumor-biological features. Since absolute ADC values vary between MRI scanners of different vendors and field strengths, their use in the presurgical setting is more limited.

  8. Histogram Analysis of CT Perfusion of Hepatocellular Carcinoma for Predicting Response to Transarterial Radioembolization: Value of Tumor Heterogeneity Assessment.

    PubMed

    Reiner, Caecilia S; Gordic, Sonja; Puippe, Gilbert; Morsbach, Fabian; Wurnig, Moritz; Schaefer, Niklaus; Veit-Haibach, Patrick; Pfammatter, Thomas; Alkadhi, Hatem

    2016-03-01

    To evaluate, in patients with hepatocellular carcinoma (HCC), whether assessment of tumor heterogeneity by histogram analysis of computed tomography (CT) perfusion helps predict response to transarterial radioembolization (TARE). Sixteen patients (15 male; mean age 65 years; age range 47-80 years) with HCC underwent CT liver perfusion for treatment planning prior to TARE with Yttrium-90 microspheres. Arterial perfusion (AP) derived from CT perfusion was measured in the entire tumor volume, and heterogeneity was analyzed voxel-wise by histogram analysis. Response to TARE was evaluated on follow-up imaging (median follow-up, 129 days) based on modified Response Evaluation Criteria in Solid Tumors (mRECIST). Results of histogram analysis and mean AP values of the tumor were compared between responders and non-responders. Receiver operating characteristic curves were calculated to determine the parameters' ability to discriminate responders from non-responders. According to mRECIST, 8 patients (50%) were responders and 8 (50%) non-responders. Comparing responders and non-responders, the 50th and 75th percentiles of AP derived from histogram analysis were significantly different (43.8/54.3 vs. 27.6/34.3 mL min(-1) 100 mL(-1); p < 0.05), while the mean AP of the HCCs (43.5 vs. 27.9 mL min(-1) 100 mL(-1); p > 0.05) was not. Further heterogeneity parameters from histogram analysis (skewness, coefficient of variation, and 25th percentile) did not differ between responders and non-responders (p > 0.05). If the cutoff for the 75th percentile was set to an AP of 37.5 mL min(-1) 100 mL(-1), therapy response could be predicted with a sensitivity of 88% (7/8) and specificity of 75% (6/8). Voxel-wise histogram analysis of pretreatment CT perfusion, indicating tumor heterogeneity of HCC, improves the pretreatment prediction of response to TARE.
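    The decision rule in the abstract (predicted responder if the 75th percentile of voxel-wise arterial perfusion exceeds 37.5 mL min(-1) 100 mL(-1)) is a one-liner. A sketch follows, with the cutoff taken from the abstract and the voxel samples purely synthetic, not patient data:

```python
import numpy as np

def predict_responder(ap_voxels, cutoff=37.5):
    """Predict TARE response from the 75th percentile of a tumor's voxel-wise
    arterial perfusion values (mL min^-1 100 mL^-1); cutoff per the abstract."""
    return float(np.percentile(ap_voxels, 75)) > cutoff

# Illustrative synthetic voxel distributions around the reported group means.
responder_like = np.random.default_rng(0).normal(54.0, 8.0, 5000)
nonresponder_like = np.random.default_rng(1).normal(30.0, 8.0, 5000)
```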

  9. ADC histogram analysis for adrenal tumors: histogram analysis of apparent diffusion coefficient in differentiating adrenal adenoma from pheochromocytoma.

    PubMed

    Umanodan, Tomokazu; Fukukura, Yoshihiko; Kumagae, Yuichi; Shindo, Toshikazu; Nakajo, Masatoyo; Takumi, Koji; Nakajo, Masanori; Hakamada, Hiroto; Umanodan, Aya; Yoshiura, Takashi

    2017-04-01

    To determine the diagnostic performance of apparent diffusion coefficient (ADC) histogram analysis in diffusion-weighted (DW) magnetic resonance imaging (MRI) for differentiating adrenal adenoma from pheochromocytoma. We retrospectively evaluated 52 adrenal tumors (39 adenomas and 13 pheochromocytomas) in 47 patients (21 men, 26 women; mean age, 59.3 years; range, 16-86 years) who underwent DW 3.0T MRI. Histogram parameters of ADC (b-values of 0 and 200 [ADC200], 0 and 400 [ADC400], and 0 and 800 s/mm² [ADC800]), namely mean, variance, coefficient of variation (CV), kurtosis, skewness, and entropy, were compared between adrenal adenomas and pheochromocytomas using the Mann-Whitney U-test. Receiver operating characteristic (ROC) curves for the histogram parameters were generated to differentiate adrenal adenomas from pheochromocytomas. Sensitivity and specificity were calculated by using a threshold criterion that maximizes the average of sensitivity and specificity. Variance and CV of ADC800 were significantly higher in pheochromocytomas than in adrenal adenomas (P < 0.001 and P = 0.001, respectively). With all b-value combinations, the entropy of ADC was significantly higher in pheochromocytomas than in adrenal adenomas (all P ≤ 0.001), and showed the highest area under the ROC curve among the ADC histogram parameters for diagnosing adrenal adenomas (ADC200, 0.82; ADC400, 0.87; ADC800, 0.92), with a sensitivity of 84.6% and specificity of 84.6% (cutoff, ≤2.82) for ADC200; sensitivity of 89.7% and specificity of 84.6% (cutoff, ≤2.77) for ADC400; and sensitivity of 94.9% and specificity of 92.3% (cutoff, ≤2.67) for ADC800. ADC histogram analysis of DW MRI can help differentiate adrenal adenoma from pheochromocytoma. Level of Evidence: 3. J. Magn. Reson. Imaging 2017;45:1195-1203. © 2016 International Society for Magnetic Resonance in Medicine.
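    The threshold criterion described above (maximize the average of sensitivity and specificity for an "adenoma if parameter ≤ cutoff" rule) can be sketched generically; the scores below are illustrative, not the paper's entropy values:

```python
import numpy as np

def best_cutoff(scores_pos, scores_neg):
    """Pick the cutoff maximizing (sensitivity + specificity) / 2 for the rule
    'positive (adenoma) if score <= cutoff'."""
    best_c, best_j = None, -1.0
    for c in np.unique(np.concatenate([scores_pos, scores_neg])):
        sens = np.mean(scores_pos <= c)   # positives correctly flagged
        spec = np.mean(scores_neg > c)    # negatives correctly passed
        j = (sens + spec) / 2
        if j > best_j:
            best_c, best_j = float(c), float(j)
    return best_c, best_j

adenoma_scores = np.array([1.0, 1.2, 1.5, 2.0])   # hypothetical entropies
pheo_scores = np.array([2.5, 3.0, 3.5])
```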

  10. Robust Audio Watermarking by Using Low-Frequency Histogram

    NASA Astrophysics Data System (ADS)

    Xiang, Shijun

    In continuation of earlier work, in which the problem of time-scale modification (TSM) was studied [1] by modifying the shape of the audio time-domain histogram, here we consider the additional ingredient of resisting additive noise-like operations, such as Gaussian noise, lossy compression and low-pass filtering. In other words, we study the problem of making the watermark robust against both TSM and additive noise. To this end, in this paper we extract the histogram from a Gaussian-filtered low-frequency component of the audio for watermarking. The watermark is inserted by shaping the histogram: two consecutive bins are used as a group to hide one bit by reassigning their populations. The watermarked signals are perceptually similar to the original. Compared with the previous time-domain watermarking scheme [1], the proposed method is more robust against additive noise, MP3 compression, low-pass filtering, etc.
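    The bin-pair embedding idea can be sketched as below. This is a toy illustration of hiding one bit by reassigning the populations of two consecutive histogram bins; the Gaussian low-frequency filtering, bin selection and robustness machinery of the paper are omitted, and the function names and the nudge amount are assumptions.

```python
import numpy as np

def embed_bit(x, edges, i, bit):
    """Hide one bit by reassigning samples between consecutive histogram bins
    i and i+1 (bins assumed non-empty). For bit 1, force count(bin i) >=
    count(bin i+1); for bit 0, the opposite. Moved samples are nudged just
    across the shared bin edge, barely perturbing the signal."""
    lo, mid, hi = edges[i], edges[i + 1], edges[i + 2]
    x = x.copy()
    in_a = (x >= lo) & (x < mid)
    in_b = (x >= mid) & (x < hi)
    na, nb = int(in_a.sum()), int(in_b.sum())
    if bit and na < nb:                       # bin i must catch up
        k = (nb - na) // 2 + 1
        x[np.flatnonzero(in_b)[:k]] = mid - 1e-6
    elif not bit and na >= nb:                # bin i+1 must win
        k = (na - nb) // 2 + 1
        x[np.flatnonzero(in_a)[:k]] = mid
    return x

def extract_bit(x, edges, i):
    na = int(((x >= edges[i]) & (x < edges[i + 1])).sum())
    nb = int(((x >= edges[i + 1]) & (x < edges[i + 2])).sum())
    return int(na >= nb)
```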

  11. [Image Feature Extraction and Discriminant Analysis of Xinjiang Uygur Medicine Based on Color Histogram].

    PubMed

    Hamit, Murat; Yun, Weikang; Yan, Chuanbo; Kutluk, Abdugheni; Fang, Yang; Alip, Elzat

    2015-06-01

    Image feature extraction is an important part of image processing and an important field of research and application of image-processing technology. Uygur medicine is one branch of traditional Chinese medicine and researchers are paying increasing attention to it, but large amounts of Uygur medicine data have not been fully utilized. In this study, we extracted image color histogram features of herbal and zooid medicines of Xinjiang Uygur. First, we performed preprocessing, including image color enhancement, size normalization and color space transformation. Then we extracted the color histogram features and analyzed them statistically. Finally, we evaluated the classification ability of the features by Bayes discriminant analysis. Experimental results showed that high accuracy for Uygur medicine image classification was obtained by using the color histogram feature. This study may aid content-based medical image retrieval for Xinjiang Uygur medicine.
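    A generic color-histogram feature of the kind described (per-channel histograms computed after preprocessing, later fed to a Bayes discriminant) might look like this; the bin count and normalization are assumptions, not the paper's settings:

```python
import numpy as np

def color_histogram_feature(img, bins=16):
    """Concatenated per-channel color histogram of an RGB image (H, W, 3),
    each channel's histogram normalized to sum to 1."""
    feats = []
    for c in range(3):
        h, _ = np.histogram(img[..., c], bins=bins, range=(0, 256))
        feats.append(h / h.sum())
    return np.concatenate(feats)
```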

  12. LSAH: a fast and efficient local surface feature for point cloud registration

    NASA Astrophysics Data System (ADS)

    Lu, Rongrong; Zhu, Feng; Wu, Qingxiao; Kong, Yanzi

    2018-04-01

    Point cloud registration is a fundamental task in high-level three-dimensional applications. Noise, uneven point density and varying point cloud resolution are the three main challenges for point cloud registration. In this paper, we design a robust and compact local surface descriptor called the Local Surface Angles Histogram (LSAH) and propose an effective coarse-to-fine algorithm for point cloud registration. The LSAH descriptor is formed by concatenating five normalized sub-histograms into one histogram, each sub-histogram created by accumulating a different type of angle from a local surface patch. The experimental results show that LSAH is more robust to uneven point density and varying point cloud resolution than four state-of-the-art local descriptors in terms of feature matching. Moreover, we tested our LSAH-based coarse-to-fine algorithm for point cloud registration; the results demonstrate that it is both robust and efficient.

  13. Advanced concentration analysis of atom probe tomography data: Local proximity histograms and pseudo-2D concentration maps.

    PubMed

    Felfer, Peter; Cairney, Julie

    2018-06-01

    Analysing the distribution of selected chemical elements with respect to interfaces is one of the most common data-mining tasks in atom probe tomography. This can be represented by 1D concentration profiles, 2D concentration maps or proximity histograms, which give the concentration, density, etc. of selected species as a function of the distance from a reference surface/interface. These are some of the most useful tools for the analysis of solute distributions in atom probe data. In this paper, we present extensions to the proximity histogram in the form of 'local' proximity histograms, calculated for selected parts of a surface, and pseudo-2D concentration maps, which are 2D concentration maps calculated on non-flat surfaces. In this way, local concentration changes at interfaces and other structures can be assessed more effectively. Copyright © 2018 Elsevier B.V. All rights reserved.

  14. Histogram contrast analysis and the visual segregation of IID textures.

    PubMed

    Chubb, C; Econopouly, J; Landy, M S

    1994-09-01

    A new psychophysical methodology is introduced, histogram contrast analysis, that allows one to measure stimulus transformations, f, used by the visual system to draw distinctions between different image regions. The method involves the discrimination of images constructed by selecting texture micropatterns randomly and independently (across locations) on the basis of a given micropattern histogram. Different components of f are measured by use of different component functions to modulate the micropattern histogram until the resulting textures are discriminable. When no discrimination threshold can be obtained for a given modulating component function, a second titration technique may be used to measure the contribution of that component to f. The method includes several strong tests of its own assumptions. An example is given of the method applied to visual textures composed of small, uniform squares with randomly chosen gray levels. In particular, for a fixed mean gray level μ and a fixed gray-level variance σ², histogram contrast analysis is used to establish that the class S of all textures composed of small squares with jointly independent, identically distributed gray levels with mean μ and variance σ² is perceptually elementary in the following sense: there exists a single, real-valued function f_S of gray level such that two textures I and J in S are discriminable only if the average value of f_S applied to the gray levels in I is significantly different from the average value of f_S applied to the gray levels in J. Finally, histogram contrast analysis is used to obtain a seventh-order polynomial approximation of f_S.

  15. Insight on AV-45 binding in white and grey matter from histogram analysis: a study on early Alzheimer's disease patients and healthy subjects

    PubMed Central

    Nemmi, Federico; Saint-Aubert, Laure; Adel, Djilali; Salabert, Anne-Sophie; Pariente, Jérémie; Barbeau, Emmanuel; Payoux, Pierre; Péran, Patrice

    2014-01-01

    Purpose: The AV-45 amyloid biomarker is known to show uptake in white matter in patients with Alzheimer's disease (AD) but also in the healthy population. This binding, thought to be of a non-specific lipophilic nature, has not yet been investigated. The aim of this study was to determine the differential pattern of AV-45 binding in white matter in healthy and pathological populations. Methods: We recruited 24 patients presenting with early-stage AD and 17 matched healthy subjects. We used an optimized PET-MRI registration method and an approach based on intensity histograms using several indexes. We compared the results of the intensity histogram analyses with a more canonical approach based on the target-to-cerebellum standardized uptake value ratio (SUVr) in white and grey matter using MANOVA and discriminant analyses. A cluster analysis on white and grey matter histograms was also performed. Results: White matter histogram analysis revealed significant differences between AD patients and healthy subjects which were not revealed by SUVr analysis. However, white matter histograms were not decisive for discriminating the groups, and indexes based on grey matter alone showed better discriminative power than SUVr. The cluster analysis divided our sample into two clusters showing different uptakes in grey but also in white matter. Conclusion: These results demonstrate that AV-45 binding in white matter conveys subtle information not detectable using the SUVr approach. Although it is not better than standard SUVr for discriminating AD patients from healthy subjects, this information could reveal white matter modifications. PMID:24573658

  16. A Perspective on Diversity, Equality and Equity in Swedish Schools

    ERIC Educational Resources Information Center

    Johansson, Olof; Davis, Anna; Geijer, Luule

    2007-01-01

    This study presents policy and theory as they apply to diversity, equality and equity in Swedish social and educational policy. All education in Sweden should, according to the curriculum (Lpo 94, 1994, p. 5) be of equivalent value, irrespective of where in the country it is provided and education should be adapted to each pupil's circumstances…

  17. Predicting pathologic tumor response to chemoradiotherapy with histogram distances characterizing longitudinal changes in 18F-FDG uptake patterns

    PubMed Central

    Tan, Shan; Zhang, Hao; Zhang, Yongxue; Chen, Wengen; D’Souza, Warren D.; Lu, Wei

    2013-01-01

    Purpose: A family of fluorine-18 (18F)-fluorodeoxyglucose (18F-FDG) positron-emission tomography (PET) features based on histogram distances is proposed for predicting pathologic tumor response to neoadjuvant chemoradiotherapy (CRT). These features describe the longitudinal change of FDG uptake distribution within a tumor. Methods: Twenty patients with esophageal cancer treated with CRT plus surgery were included in this study. All patients underwent PET/CT scans before (pre-) and after (post-) CRT. The two scans were first rigidly registered, and the original tumor sites were then manually delineated on the pre-PET/CT by an experienced nuclear medicine physician. Two histograms representing the FDG uptake distribution were extracted from the pre- and the registered post-PET images, respectively, both within the delineated tumor. Distances between the two histograms quantify longitudinal changes in FDG uptake distribution resulting from CRT, and thus are potential predictors of tumor response. A total of 19 histogram distances were examined and compared to both traditional PET response measures and Haralick texture features. Receiver operating characteristic analyses and Mann-Whitney U test were performed to assess their predictive ability. Results: Among all tested histogram distances, seven bin-to-bin and seven crossbin distances outperformed traditional PET response measures using maximum standardized uptake value (AUC = 0.70) or total lesion glycolysis (AUC = 0.80). The seven bin-to-bin distances were: L2 distance (AUC = 0.84), χ2 distance (AUC = 0.83), intersection distance (AUC = 0.82), cosine distance (AUC = 0.83), squared Euclidean distance (AUC = 0.83), L1 distance (AUC = 0.82), and Jeffrey distance (AUC = 0.82). 
The seven crossbin distances were: quadratic-chi distance (AUC = 0.89), earth mover distance (AUC = 0.86), fast earth mover distance (AUC = 0.86), diffusion distance (AUC = 0.88), Kolmogorov-Smirnov distance (AUC = 0.88), quadratic form distance (AUC = 0.87), and match distance (AUC = 0.84). These crossbin histogram distance features showed slightly higher prediction accuracy than texture features on post-PET images. Conclusions: The results suggest that longitudinal patterns in 18F-FDG uptake characterized using histogram distances provide useful information for predicting the pathologic response of esophageal cancer to CRT. PMID:24089897
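    Several of the bin-to-bin distances listed above are simple to compute. The sketch below shows a few of them for two normalized histograms (the paper's SUV binning, and its crossbin distances, which require a ground-distance matrix, are not reproduced here):

```python
import numpy as np

def hist_distances(p, q):
    """Bin-to-bin distances between two histograms p and q (normalized
    internally). Jeffrey divergence follows the common definition against
    the average histogram m = (p + q) / 2."""
    p = p / p.sum()
    q = q / q.sum()
    eps = 1e-12                      # guard against log(0) / divide-by-zero
    m = (p + q) / 2
    return {
        "L1": np.abs(p - q).sum(),
        "L2": np.sqrt(((p - q) ** 2).sum()),
        "chi2": (((p - q) ** 2) / (p + q + eps)).sum(),
        "intersection": 1.0 - np.minimum(p, q).sum(),
        "cosine": 1.0 - (p @ q) / (np.linalg.norm(p) * np.linalg.norm(q)),
        "jeffrey": (p * np.log((p + eps) / (m + eps))).sum()
                   + (q * np.log((q + eps) / (m + eps))).sum(),
    }
```

    In the study's setting, p would be the pre-CRT and q the registered post-CRT FDG-uptake histogram of the same tumor volume, and each distance becomes one candidate response predictor.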

  18. [Characteristics of high resolution diffusion weighted imaging apparent diffusion coefficient histogram and its correlations with cancer stages in patients with nasopharyngeal carcinoma].

    PubMed

    Wang, G J; Wang, Y; Ye, Y; Chen, F; Lu, Y T; Li, S L

    2017-11-07

    Objective: To investigate the features of apparent diffusion coefficient (ADC) histogram parameters based on entire-tumor-volume data in high-resolution diffusion weighted imaging of nasopharyngeal carcinoma (NPC) and to evaluate their correlations with cancer stage. Methods: This retrospective study included 154 NPC patients [102 males and 52 females, mean age (48±11) years] who had received readout segmentation of long variable echo trains MRI scans before radiation therapy. The area of tumor was delineated on each section of the axial ADC maps to generate an ADC histogram using ImageJ. The ADC histogram of the entire tumor, along with the histogram parameters (tumor voxels, ADC(mean), ADC(25%), ADC(50%), ADC(75%), skewness and kurtosis), was obtained by merging all sections with SPSS 22.0 software. Intra-observer repeatability was assessed by using intra-class correlation coefficients (ICC). The patients were subdivided into two groups according to cancer volume: a small cancer group (<305 voxels, about 2 cm³) and a large cancer group (≥2 cm³). The correlation between ADC histogram parameters and cancer stage was evaluated with the Spearman test. Results: The ICC for measuring the ADC histogram parameters tumor voxels, ADC(mean), ADC(25%), ADC(50%), ADC(75%), skewness and kurtosis was 0.938, 0.861, 0.885, 0.838, 0.836, 0.358 and 0.456, respectively. Tumor voxels were positively correlated with T stage (r = 0.368, P < 0.05), and differed significantly among patients with different T stages (K = 22.306, P < 0.05). In the small cancer group there were significant differences in ADC(mean), ADC(25%) and ADC(50%) among patients with different T stages (K = 8.409, 8.187, 8.699, all P < 0.05), and these three indices were positively correlated with T stage (r = 0.221, 0.209, 0.235, all P < 0.05). Skewness and kurtosis differed significantly between the groups with different cancer volumes (t = -2.987, Z = -3.770, both P < 0.05). Conclusion: Tumor volume and tissue uniformity of NPC are important factors affecting ADC and cancer stage; the ADC histogram parameters ADC(mean), ADC(25%) and ADC(50%) increase with T stage in NPC smaller than 2 cm³.

  19. RLS Channel Estimation with Adaptive Forgetting Factor for DS-CDMA Frequency-Domain Equalization

    NASA Astrophysics Data System (ADS)

    Kojima, Yohei; Tomeba, Hiromichi; Takeda, Kazuaki; Adachi, Fumiyuki

    Frequency-domain equalization (FDE) based on the minimum mean square error (MMSE) criterion can improve the downlink bit error rate (BER) performance of DS-CDMA beyond that possible with conventional rake combining in a frequency-selective fading channel. FDE requires accurate channel estimation. Recently, we proposed a pilot-assisted channel estimation (CE) scheme based on the MMSE criterion. With MMSE-CE, the channel estimation accuracy is almost insensitive to the pilot chip sequence, and a good BER performance is achieved. In this paper, we propose a channel estimation scheme using a one-tap recursive least squares (RLS) algorithm, in which the forgetting factor is adapted to the changing channel condition by the least mean square (LMS) algorithm, for DS-CDMA with FDE. We evaluate the BER performance of RLS-CE with an adaptive forgetting factor in a frequency-selective fast Rayleigh fading channel by computer simulation.
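    A fixed-forgetting-factor version of a one-tap RLS channel estimator is sketched below for a single scalar channel coefficient. The paper's LMS adaptation of the forgetting factor and the DS-CDMA specifics are omitted, so treat this as the general structure the abstract refers to, not the authors' algorithm.

```python
import numpy as np

def rls_one_tap(x, y, lam=0.95):
    """Track a scalar channel h in y[n] = h * x[n] + noise with one-tap RLS.
    lam is the forgetting factor (fixed here; the paper adapts it via LMS)."""
    h, P = 0.0 + 0.0j, 1e3
    for xn, yn in zip(x, y):
        k = P * np.conj(xn) / (lam + P * abs(xn) ** 2)  # gain
        h = h + k * (yn - h * xn)                        # a-priori error update
        P = (P - k * xn * P) / lam                       # covariance update
    return h

rng = np.random.default_rng(0)
h_true = 0.8 - 0.3j                                       # demo: static channel
x = rng.normal(size=200) + 1j * rng.normal(size=200)      # pilot symbols
y = h_true * x + 0.01 * (rng.normal(size=200) + 1j * rng.normal(size=200))
h_hat = rls_one_tap(x, y)
```

    A smaller lam forgets old data faster and tracks fast fading better at the cost of noisier estimates, which is exactly the trade-off the adaptive forgetting factor is meant to manage.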

  20. Visual learning with reduced adaptation is eccentricity-specific.

    PubMed

    Harris, Hila; Sagi, Dov

    2018-01-12

    Visual learning is known to be specific to the trained target location, showing little transfer to untrained locations. Recently, learning was shown to transfer across equal-eccentricity retinal locations when sensory adaptation due to repetitive stimulation was minimized. It was suggested that learning transfers to previously untrained locations when the learned representation is location-invariant, with sensory adaptation introducing location-dependent representations and thus preventing transfer. Spatial invariance may also fail when the trained and tested locations are at different distances from the center of gaze (different retinal eccentricities), owing to differences in the corresponding low-level cortical representations (e.g., the allocated cortical area decreases with eccentricity). Thus, if learning improves performance by better classifying target-dependent early visual representations, generalization is predicted to fail when locations of different retinal eccentricities are trained and tested in the absence of sensory adaptation. Here, using the texture discrimination task, we show specificity of learning across different retinal eccentricities (4-8°) using reduced-adaptation training. The existence of generalization across equal-eccentricity locations but not across different eccentricities demonstrates that learning accesses visual representations preceding location-independent representations, with the specificity of learning explained by inhomogeneous sensory representation.

  1. A complex valued radial basis function network for equalization of fast time varying channels.

    PubMed

    Gan, Q; Saratchandran, P; Sundararajan, N; Subramanian, K R

    1999-01-01

    This paper presents a complex-valued radial basis function (RBF) network for the equalization of fast time-varying channels. A new method for calculating the centers of the RBF network is given. The method allows the number of RBF centers to be fixed even as the equalizer order is increased, so that good performance is obtained by a high-order RBF equalizer with a small number of centers. Simulations are performed on time-varying channels using a Rayleigh fading channel model to compare the performance of our RBF equalizer with an adaptive maximum-likelihood sequence estimator (MLSE) consisting of a channel estimator and an MLSE implemented by the Viterbi algorithm. The results show that the RBF equalizer produces superior performance with less computational complexity.

  2. Binarization of apodizers by adapted one-dimensional error diffusion method

    NASA Astrophysics Data System (ADS)

    Kowalczyk, Marek; Cichocki, Tomasz; Martinez-Corral, Manuel; Andres, Pedro

    1994-10-01

    Two novel algorithms for the binarization of continuous, rotationally symmetric, real positive pupil filters are presented. Both algorithms are based on the 1-D error diffusion concept. The original gray-tone apodizer is substituted by a set of transparent and opaque concentric annular zones. Depending on the algorithm, the resulting binary mask consists of either equal-width or equal-area zones. The diffractive behavior of the binary filters is evaluated. It is shown that pupils with equal-width zones give a Fraunhofer diffraction pattern more similar to that of the original continuous-tone pupil than those with equal-area zones, assuming in both cases the same resolution limit of the printing device.
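    One-dimensional error diffusion over the annular zones can be sketched as below (an equal-width-zone illustration; the radial sampling and the single-neighbor diffusion weight are assumptions, not the paper's exact scheme):

```python
import numpy as np

def binarize_zones(t, threshold=0.5):
    """Binarize a sampled rotationally symmetric transmittance profile t(r)
    into transparent (1) / opaque (0) annular zones by 1-D error diffusion:
    the quantization error of each zone is carried forward into the next,
    so the binary mask preserves the profile's average transmittance."""
    out = np.empty_like(t)
    err = 0.0
    for k, tk in enumerate(t):
        val = tk + err
        out[k] = 1.0 if val >= threshold else 0.0
        err = val - out[k]           # propagate residual to the next zone
    return out

# Example: a smooth Gaussian-like apodizer sampled over 50 equal-width zones.
r = np.linspace(0, 1, 50)
profile = np.exp(-4 * r ** 2)
mask = binarize_zones(profile)
```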

  3. Microbubble cloud characterization by nonlinear frequency mixing.

    PubMed

    Cavaro, M; Payan, C; Moysan, J; Baqué, F

    2011-05-01

    In the framework of the Generation IV forum, France decided to develop sodium-cooled fast nuclear reactors. The French Safety Authority requires the associated monitoring of argon gas in the sodium, which implies estimating the void fraction and a histogram describing the bubble population. In this context, the present letter studies the possibility of achieving an accurate determination of this histogram with acoustic methods. A nonlinear two-frequency mixing technique has been implemented, and a specific optical device has been developed in order to validate the experimental results. The acoustically reconstructed histograms are in excellent agreement with those obtained using optical methods.

  4. The ISI distribution of the stochastic Hodgkin-Huxley neuron.

    PubMed

    Rowat, Peter F; Greenwood, Priscilla E

    2014-01-01

    The simulation of ion-channel noise has an important role in computational neuroscience. In recent years several approximate methods of carrying out this simulation have been published, based on stochastic differential equations, and all giving slightly different results. The obvious, and essential, question is: which method is the most accurate and which is most computationally efficient? Here we make a contribution to the answer. We compare interspike interval histograms from simulated data using four different approximate stochastic differential equation (SDE) models of the stochastic Hodgkin-Huxley neuron, as well as the exact Markov chain model simulated by the Gillespie algorithm. One of the recent SDE models is the same as the Kurtz approximation first published in 1978. All the models considered give similar ISI histograms over a wide range of deterministic and stochastic input. Three features of these histograms are an initial peak, followed by one or more bumps, and then an exponential tail. We explore how these features depend on deterministic input and on level of channel noise, and explain the results using the stochastic dynamics of the model. We conclude with a rough ranking of the four SDE models with respect to the similarity of their ISI histograms to the histogram of the exact Markov chain model.

  5. Dynamic Contrast-enhanced MR Imaging in Renal Cell Carcinoma: Reproducibility of Histogram Analysis on Pharmacokinetic Parameters

    PubMed Central

    Wang, Hai-yi; Su, Zi-hua; Xu, Xiao; Sun, Zhi-peng; Duan, Fei-xue; Song, Yuan-yuan; Li, Lu; Wang, Ying-wei; Ma, Xin; Guo, Ai-tao; Ma, Lin; Ye, Hui-yi

    2016-01-01

    Pharmacokinetic parameters derived from dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) have been increasingly used to evaluate the permeability of tumor vessels. Histogram metrics are a promising quantitative MR imaging method recently introduced into the analysis of DCE-MRI pharmacokinetic parameters in oncology to account for tumor heterogeneity. In this study, 21 patients with renal cell carcinoma (RCC) underwent paired DCE-MRI studies on a 3.0 T MR system. The extended Tofts model and a population-based arterial input function were used to calculate kinetic parameters of RCC tumors. The mean value and histogram metrics (mode, skewness, and kurtosis) of each pharmacokinetic parameter were generated automatically using ImageJ software. Intra- and inter-observer reproducibility and scan-rescan reproducibility were evaluated using intra-class correlation coefficients (ICCs) and the coefficient of variation (CoV). Our results demonstrated that the histogram metrics (mode, skewness, and kurtosis) were not superior to the conventional mean value in reproducibility evaluations of DCE-MRI pharmacokinetic parameters (Ktrans and Ve) in renal cell carcinoma; skewness and kurtosis in particular showed lower intra-observer, inter-observer, and scan-rescan reproducibility than the mean value. Our findings suggest that additional studies are necessary before wide incorporation of histogram metrics into quantitative analysis of DCE-MRI pharmacokinetic parameters. PMID:27380733
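The histogram metrics named above (mode, skewness, kurtosis) and the scan-rescan CoV are standard quantities; the study computed them with ImageJ, but a minimal numpy sketch of the same definitions might look like this (the bin count and the sample data in the test are arbitrary assumptions):

```python
import numpy as np

def histogram_metrics(param_map):
    """Mean, mode, skewness and excess kurtosis of a pharmacokinetic parameter map."""
    v = np.asarray(param_map, dtype=float).ravel()
    mean, sd = v.mean(), v.std()
    counts, edges = np.histogram(v, bins=16)
    peak = int(np.argmax(counts))
    mode = 0.5 * (edges[peak] + edges[peak + 1])   # centre of the tallest bin
    skewness = np.mean((v - mean) ** 3) / sd ** 3
    kurtosis = np.mean((v - mean) ** 4) / sd ** 4 - 3.0
    return {"mean": mean, "mode": mode, "skewness": skewness, "kurtosis": kurtosis}

def scan_rescan_cov(first, second):
    """Within-subject coefficient of variation (%) between paired scan/rescan values."""
    a = np.asarray(first, dtype=float)
    b = np.asarray(second, dtype=float)
    within_sd = np.abs(a - b) / np.sqrt(2.0)       # SD of each scan/rescan pair
    return 100.0 * np.mean(within_sd / ((a + b) / 2.0))
```

A perfectly symmetric parameter map gives zero skewness, and identical scan/rescan values give a CoV of zero, which is the intuition behind using these quantities as reproducibility measures.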

  6. Real-Time Motion Tracking for Indoor Moving Sphere Objects with a LiDAR Sensor.

    PubMed

    Huang, Lvwen; Chen, Siyuan; Zhang, Jianfeng; Cheng, Bang; Liu, Mingqing

    2017-08-23

    Object tracking is a crucial subfield of computer vision, with wide applications in navigation, robotics, and military systems. In this paper, real-time visualization of 3D point-cloud data from the VLP-16 3D Light Detection and Ranging (LiDAR) sensor is achieved. After preprocessing, fast ground segmentation, Euclidean clustering of outliers, Viewpoint Feature Histogram (VFH) feature extraction, object-model construction, and search-based matching of a moving spherical target, a Kalman filter and an adaptive particle filter are used to estimate the target's position in real time. Experiments on three kinds of scenes, under partial occlusion and interference, different moving speeds, and different trajectories, show that the Kalman filter offers high efficiency while the adaptive particle filter offers high robustness and high precision. The research can be applied to fruit identification and tracking in natural environments, robot navigation and control, and other fields.
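The abstract does not give the filter equations; as an illustrative sketch only, a one-dimensional constant-velocity Kalman filter of the kind used for such position tracking can be written as follows (the state model, time step, and noise levels q and r are assumed values, and the real system tracks a 3D target):

```python
import numpy as np

def kalman_step(x, P, z, dt=0.1, q=1e-3, r=0.05):
    """One predict/update cycle of a 1D constant-velocity Kalman filter.
    x = [position, velocity], P = state covariance, z = noisy position measurement."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity transition
    H = np.array([[1.0, 0.0]])              # only position is observed
    Q = q * np.eye(2)                       # process noise covariance
    R = np.array([[r]])                     # measurement noise covariance
    x = F @ x                               # predict state
    P = F @ P @ F.T + Q                     # predict covariance
    y = z - H @ x                           # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x = x + (K @ y).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

# track a noiseless target moving at 1 unit/s, sampled every 0.1 s
x_est, P_est = np.zeros(2), np.eye(2)
for k in range(1, 51):
    x_est, P_est = kalman_step(x_est, P_est, np.array([0.1 * k]))
```

Because the filter's model exactly matches this toy target, both position and velocity estimates converge to the true values; the paper's adaptive particle filter trades this efficiency for robustness under occlusion.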

  7. Real-Time Motion Tracking for Indoor Moving Sphere Objects with a LiDAR Sensor

    PubMed Central

    Chen, Siyuan; Zhang, Jianfeng; Cheng, Bang; Liu, Mingqing

    2017-01-01

    Object tracking is a crucial subfield of computer vision, with wide applications in navigation, robotics, and military systems. In this paper, real-time visualization of 3D point-cloud data from the VLP-16 3D Light Detection and Ranging (LiDAR) sensor is achieved. After preprocessing, fast ground segmentation, Euclidean clustering of outliers, Viewpoint Feature Histogram (VFH) feature extraction, object-model construction, and search-based matching of a moving spherical target, a Kalman filter and an adaptive particle filter are used to estimate the target's position in real time. Experiments on three kinds of scenes, under partial occlusion and interference, different moving speeds, and different trajectories, show that the Kalman filter offers high efficiency while the adaptive particle filter offers high robustness and high precision. The research can be applied to fruit identification and tracking in natural environments, robot navigation and control, and other fields. PMID:28832520

  8. 75 FR 34726 - Energy Conservation Program for Consumer Products: Notice of Petition for Waiver of LG...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-18

    ... relative humidity sensors and adaptive control anti-sweat heaters. The rationale for granting these waivers is equally applicable to LG, which has products containing similar relative humidity sensors and anti... humidity sensors and adaptive control anti-sweat heaters. Therefore, it is ordered that: The application...

  9. Identifying Reading Problems with Computer-Adaptive Assessments

    ERIC Educational Resources Information Center

    Merrell, C.; Tymms, P.

    2007-01-01

    This paper describes the development of an adaptive assessment called Interactive Computerised Assessment System (InCAS) that is aimed at children of a wide age and ability range to identify specific reading problems. Rasch measurement has been used to create the equal interval scales that form each part of the assessment. The rationale for the…

  10. Multiple point least squares equalization in a room

    NASA Technical Reports Server (NTRS)

    Elliott, S. J.; Nelson, P. A.

    1988-01-01

    Equalization filters designed to minimize the mean square error between a delayed version of the original electrical signal and the equalized response at a single point in a room have previously been investigated. In general, such a strategy degrades the response at positions in the room away from the equalization point. A method is presented for designing an equalization filter by adjusting the filter coefficients to minimize the sum of the squares of the errors between the equalized responses at multiple points in the room and delayed versions of the original electrical signal. Such an equalization filter can give a more uniform frequency response over a greater volume of the enclosure than the single-point equalizer above. Computer simulation results are presented for equalizing the frequency responses from a loudspeaker to various typical ear positions, in a room with dimensions and acoustic damping typical of a car interior, using the two approaches outlined above. Adaptive filter algorithms, which can automatically adjust the coefficients of a digital equalization filter to achieve this minimization, are also discussed.
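The multiple-point least-squares design described above can be sketched by stacking one convolution (Toeplitz) matrix per measurement point and solving a single least-squares problem; this is an independent illustration, not the paper's implementation, and the impulse responses, delay, and tap count below are assumed:

```python
import numpy as np

def multipoint_equalizer(impulse_responses, delay, n_taps):
    """Design an FIR equalizer w minimizing the summed squared error between each
    equalized response h_m * w and a unit pulse delayed by `delay` samples."""
    blocks, targets = [], []
    for h in impulse_responses:
        h = np.asarray(h, dtype=float)
        length = len(h) + n_taps - 1
        C = np.zeros((length, n_taps))          # convolution (Toeplitz) matrix
        for k in range(n_taps):
            C[k:k + len(h), k] = h
        d = np.zeros(length)
        d[delay] = 1.0                          # delayed unit pulse target
        blocks.append(C)
        targets.append(d)
    A = np.vstack(blocks)                       # stack all measurement points
    b = np.concatenate(targets)
    w, *_ = np.linalg.lstsq(A, b, rcond=None)
    return w

# for an ideal (unit) channel the optimal equalizer is just the delayed unit pulse
w = multipoint_equalizer([[1.0]], delay=2, n_taps=4)
```

With several distinct room responses in the list, the same solve trades off error across all points at once, which is why the resulting response is more uniform over the enclosure than a single-point design.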

  11. Apparent diffusion coefficient histogram analysis can evaluate radiation-induced parotid damage and predict late xerostomia degree in nasopharyngeal carcinoma

    PubMed Central

    Zhou, Nan; Guo, Tingting; Zheng, Huanhuan; Pan, Xia; Chu, Chen; Dou, Xin; Li, Ming; Liu, Song; Zhu, Lijing; Liu, Baorui; Chen, Weibo; He, Jian; Yan, Jing; Zhou, Zhengyang; Yang, Xiaofeng

    2017-01-01

    We investigated apparent diffusion coefficient (ADC) histogram analysis to evaluate radiation-induced parotid damage and predict xerostomia degrees in nasopharyngeal carcinoma (NPC) patients receiving radiotherapy. The imaging of bilateral parotid glands in NPC patients was conducted 2 weeks before radiotherapy (time point 1), one month after radiotherapy (time point 2), and four months after radiotherapy (time point 3). From time point 1 to 2, parotid volume, skewness, and kurtosis decreased (P < 0.001, = 0.001, and < 0.001, respectively), but all other ADC histogram parameters increased (all P < 0.001, except P = 0.006 for standard deviation [SD]). From time point 2 to 3, parotid volume continued to decrease (P = 0.022), and SD, 75th and 90th percentiles continued to increase (P = 0.024, 0.010, and 0.006, respectively). Early change rates of parotid ADCmean, ADCmin, kurtosis, and 25th, 50th, 75th, 90th percentiles (from time point 1 to 2) correlated with late parotid atrophy rate (from time point 1 to 3) (all P < 0.05). Multiple linear regression analysis revealed correlations among parotid volume, time point, and ADC histogram parameters. Early mean change rates for bilateral parotid SD and ADCmax could predict late xerostomia degrees at seven months after radiotherapy (three months after time point 3) with AUC of 0.781 and 0.818 (P = 0.014, 0.005, respectively). ADC histogram parameters were reproducible (intraclass correlation coefficient, 0.830 - 0.999). ADC histogram analysis could be used to evaluate radiation-induced parotid damage noninvasively, and predict late xerostomia degrees of NPC patients treated with radiotherapy. PMID:29050274

  12. Histogram Analysis of Diffusion Tensor Imaging Parameters in Pediatric Cerebellar Tumors.

    PubMed

    Wagner, Matthias W; Narayan, Anand K; Bosemani, Thangamadhan; Huisman, Thierry A G M; Poretti, Andrea

    2016-05-01

    Apparent diffusion coefficient (ADC) values have been shown to assist in differentiating cerebellar pilocytic astrocytomas and medulloblastomas. Previous studies have applied only ADC measurements and calculated the mean/median values. Here we investigated the value of diffusion tensor imaging (DTI) histogram characteristics of the entire tumor for differentiation of cerebellar pilocytic astrocytomas and medulloblastomas. Presurgical DTI data were analyzed with a region of interest (ROI) approach to include the entire tumor. For each tumor, histogram-derived metrics including the 25th percentile, 75th percentile, and skewness were calculated for fractional anisotropy (FA) and mean (MD), axial (AD), and radial (RD) diffusivity. The histogram metrics were used as primary predictors of interest in a logistic regression model. Statistical significance levels were set at p < .01. The study population included 17 children with pilocytic astrocytoma and 16 with medulloblastoma (mean age, 9.21 ± 5.18 years and 7.66 ± 4.97 years, respectively). Compared to children with medulloblastoma, children with pilocytic astrocytoma showed higher MD (P = .003 and P = .008), AD (P = .004 and P = .007), and RD (P = .003 and P = .009) values for the 25th and 75th percentile. In addition, histogram skewness showed statistically significant differences for MD between low- and high-grade tumors (P = .008). The 25th percentile for MD yields the best results for the presurgical differentiation between pediatric cerebellar pilocytic astrocytomas and medulloblastomas. The analysis of other DTI metrics does not provide additional diagnostic value. Our study confirms the diagnostic value of the quantitative histogram analysis of DTI data in pediatric neuro-oncology. Copyright © 2015 by the American Society of Neuroimaging.

  13. Correlation of histogram analysis of apparent diffusion coefficient with uterine cervical pathologic finding.

    PubMed

    Lin, Yuning; Li, Hui; Chen, Ziqian; Ni, Ping; Zhong, Qun; Huang, Huijuan; Sandrasegaran, Kumar

    2015-05-01

    The purpose of this study was to investigate the application of histogram analysis of apparent diffusion coefficient (ADC) in characterizing pathologic features of cervical cancer and benign cervical lesions. This prospective study was approved by the institutional review board, and written informed consent was obtained. Seventy-three patients with cervical cancer (33-69 years old; 35 patients with International Federation of Gynecology and Obstetrics stage IB cervical cancer) and 38 patients (38-61 years old) with normal cervix or cervical benign lesions (control group) were enrolled. All patients underwent 3-T diffusion-weighted imaging (DWI) with b values of 0 and 800 s/mm(2). ADC values of the entire tumor in the patient group and the whole cervix volume in the control group were assessed. Mean ADC, median ADC, 25th and 75th percentiles of ADC, skewness, and kurtosis were calculated. Histogram parameters were compared between different pathologic features, as well as between stage IB cervical cancer and control groups. Mean ADC, median ADC, and 25th percentile of ADC were significantly higher for adenocarcinoma (p = 0.021, 0.006, and 0.004, respectively), and skewness was significantly higher for squamous cell carcinoma (p = 0.011). Median ADC was statistically significantly higher for well or moderately differentiated tumors (p = 0.044), and skewness was statistically significantly higher for poorly differentiated tumors (p = 0.004). No statistically significant difference of ADC histogram was observed between lymphovascular space invasion subgroups. All histogram parameters differed significantly between stage IB cervical cancer and control groups (p < 0.05). Distribution of ADCs characterized by histogram analysis may help to distinguish early-stage cervical cancer from normal cervix or cervical benign lesions and may be useful for evaluating the different pathologic features of cervical cancer.

  14. Differentiating between Glioblastoma and Primary CNS Lymphoma Using Combined Whole-tumor Histogram Analysis of the Normalized Cerebral Blood Volume and the Apparent Diffusion Coefficient.

    PubMed

    Bao, Shixing; Watanabe, Yoshiyuki; Takahashi, Hiroto; Tanaka, Hisashi; Arisawa, Atsuko; Matsuo, Chisato; Wu, Rongli; Fujimoto, Yasunori; Tomiyama, Noriyuki

    2018-05-31

    This study aimed to determine whether whole-tumor histogram analysis of normalized cerebral blood volume (nCBV) and apparent diffusion coefficient (ADC) for contrast-enhancing lesions can be used to differentiate between glioblastoma (GBM) and primary central nervous system lymphoma (PCNSL). Twenty patients, 9 with PCNSL and 11 with GBM, none with hemorrhagic lesions, underwent MRI including diffusion-weighted imaging and dynamic susceptibility contrast perfusion-weighted imaging before surgery. Histogram analysis of nCBV and ADC from whole-tumor voxels in contrast-enhancing lesions was performed. An unpaired t-test was used to compare the mean values for each type of tumor. A multivariate logistic regression model (LRM) was built to classify GBM and PCNSL using the best parameters of ADC and nCBV. All nCBV histogram parameters of GBMs were larger than those of PCNSLs, but only the average nCBV was statistically significant after Bonferroni correction. ADC histogram parameters were also larger in GBM than in PCNSL, but these differences were not statistically significant. According to receiver operating characteristic curve analysis, the nCBV average and the ADC 25th percentile demonstrated the largest areas under the curve, with values of 0.869 and 0.838, respectively. The LRM combining these two parameters differentiated between GBM and PCNSL with a higher area under the curve value (Logit(P) = -21.12 + 10.00 × ADC 25th percentile (10^-3 mm^2/s) + 5.420 × nCBV mean, P < 0.001). Our results suggest that whole-tumor histogram analysis of nCBV and ADC combined can be a valuable objective diagnostic method for differentiating between GBM and PCNSL.
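Given the fitted model reported in the abstract, the combined score can be evaluated directly; the sketch below plugs the published coefficients into a logistic function (the abstract does not state which class a high P corresponds to, so that interpretation is left open, and the input values in the example are hypothetical):

```python
import math

def combined_logit_probability(adc_p25, ncbv_mean):
    """Evaluate the reported two-parameter logistic model:
    Logit(P) = -21.12 + 10.00 * ADC 25th percentile (10^-3 mm^2/s)
               + 5.420 * nCBV mean."""
    logit = -21.12 + 10.00 * adc_p25 + 5.420 * ncbv_mean
    return 1.0 / (1.0 + math.exp(-logit))   # logistic (sigmoid) transform

p_low = combined_logit_probability(0.8, 1.0)   # hypothetical low-ADC, low-nCBV tumor
p_high = combined_logit_probability(1.5, 2.0)  # hypothetical high-ADC, high-nCBV tumor
```

Both coefficients are positive, so P increases monotonically with either input; the threshold separating the two diagnoses would come from the ROC analysis in the paper.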

  15. Histogram analysis of diffusion kurtosis imaging estimates for in vivo assessment of 2016 WHO glioma grades: A cross-sectional observational study.

    PubMed

    Hempel, Johann-Martin; Schittenhelm, Jens; Brendle, Cornelia; Bender, Benjamin; Bier, Georg; Skardelly, Marco; Tabatabai, Ghazaleh; Castaneda Vega, Salvador; Ernemann, Ulrike; Klose, Uwe

    2017-10-01

    To assess the diagnostic performance of histogram analysis of diffusion kurtosis imaging (DKI) maps for in vivo assessment of the 2016 World Health Organization Classification of Tumors of the Central Nervous System (2016 CNS WHO) integrated glioma grades. Seventy-seven patients with histopathologically confirmed glioma who provided written informed consent were retrospectively assessed between 01/2014 and 03/2017 from a prospective trial approved by the local institutional review board. Ten histogram parameters of mean kurtosis (MK) and mean diffusivity (MD) metrics from DKI were independently assessed by two blinded physicians from a volume of interest around the entire solid tumor. One-way ANOVA was used to compare MK and MD histogram parameter values between 2016 CNS WHO-based tumor grades. Receiver operating characteristic analysis was performed on MK and MD histogram parameters for significant results. The 25th, 50th, 75th, and 90th percentiles of MK and average MK showed significant differences between IDH1/2 wild-type gliomas, IDH1/2-mutated gliomas, and oligodendrogliomas with chromosome 1p/19q loss of heterozygosity and IDH1/2 mutation (p<0.001). The 50th, 75th, and 90th percentiles showed a slightly higher diagnostic performance (area under the curve (AUC) range, 0.868-0.991) than average MK (AUC range, 0.855-0.988) in classifying glioma according to the integrated approach of the 2016 CNS WHO. Histogram analysis of DKI can stratify gliomas according to the integrated approach of the 2016 CNS WHO. The 50th (median), 75th, and 90th percentiles showed the highest diagnostic performance; however, the average MK is also robust and feasible in routine clinical practice. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Apparent diffusion coefficient histogram analysis can evaluate radiation-induced parotid damage and predict late xerostomia degree in nasopharyngeal carcinoma.

    PubMed

    Zhou, Nan; Guo, Tingting; Zheng, Huanhuan; Pan, Xia; Chu, Chen; Dou, Xin; Li, Ming; Liu, Song; Zhu, Lijing; Liu, Baorui; Chen, Weibo; He, Jian; Yan, Jing; Zhou, Zhengyang; Yang, Xiaofeng

    2017-09-19

    We investigated apparent diffusion coefficient (ADC) histogram analysis to evaluate radiation-induced parotid damage and predict xerostomia degrees in nasopharyngeal carcinoma (NPC) patients receiving radiotherapy. The imaging of bilateral parotid glands in NPC patients was conducted 2 weeks before radiotherapy (time point 1), one month after radiotherapy (time point 2), and four months after radiotherapy (time point 3). From time point 1 to 2, parotid volume, skewness, and kurtosis decreased (P < 0.001, = 0.001, and < 0.001, respectively), but all other ADC histogram parameters increased (all P < 0.001, except P = 0.006 for standard deviation [SD]). From time point 2 to 3, parotid volume continued to decrease (P = 0.022), and SD, 75th and 90th percentiles continued to increase (P = 0.024, 0.010, and 0.006, respectively). Early change rates of parotid ADCmean, ADCmin, kurtosis, and 25th, 50th, 75th, 90th percentiles (from time point 1 to 2) correlated with late parotid atrophy rate (from time point 1 to 3) (all P < 0.05). Multiple linear regression analysis revealed correlations among parotid volume, time point, and ADC histogram parameters. Early mean change rates for bilateral parotid SD and ADCmax could predict late xerostomia degrees at seven months after radiotherapy (three months after time point 3) with AUC of 0.781 and 0.818 (P = 0.014, 0.005, respectively). ADC histogram parameters were reproducible (intraclass correlation coefficient, 0.830 - 0.999). ADC histogram analysis could be used to evaluate radiation-induced parotid damage noninvasively, and predict late xerostomia degrees of NPC patients treated with radiotherapy.

  17. Comparison between types I and II epithelial ovarian cancer using histogram analysis of monoexponential, biexponential, and stretched-exponential diffusion models.

    PubMed

    Wang, Feng; Wang, Yuxiang; Zhou, Yan; Liu, Congrong; Xie, Lizhi; Zhou, Zhenyu; Liang, Dong; Shen, Yang; Yao, Zhihang; Liu, Jianyu

    2017-12-01

    To evaluate the utility of histogram analysis of monoexponential, biexponential, and stretched-exponential diffusion models applied to a dualistic model of epithelial ovarian cancer (EOC). Fifty-two patients with histopathologically proven EOC underwent preoperative magnetic resonance imaging (MRI) (including diffusion-weighted imaging [DWI] with 11 b-values) using a 3.0T system and were divided into two groups: types I and II. Apparent diffusion coefficient (ADC), true diffusion coefficient (D), pseudodiffusion coefficient (D*), perfusion fraction (f), distributed diffusion coefficient (DDC), and intravoxel water diffusion heterogeneity (α) histograms were obtained from the solid components of the entire tumor. The following metrics of each histogram were compared between the two types: 1) mean; 2) median; 3) 10th and 90th percentiles. Conventional MRI morphological features were also recorded. Significant morphological features for predicting EOC type were maximum diameter (P = 0.007), texture of the lesion (P = 0.001), and peritoneal implants (P = 0.001). For ADC, D, f, DDC, and α, all metrics were significantly lower in type II than in type I (P < 0.05). The mean, median, 10th, and 90th percentiles of D* were not significantly different (P = 0.336, 0.154, 0.779, and 0.203, respectively). Most histogram metrics of ADC, D, and DDC had significantly higher areas under the receiver operating characteristic curve than those of f and α (P < 0.05). In conclusion, it is feasible to grade EOC by morphological features and the three models with histogram analysis. ADC, D, and DDC perform better than f and α; f and α may provide additional information. Level of Evidence: 4. Technical Efficacy: Stage 1. J. Magn. Reson. Imaging 2017;46:1797-1809. © 2017 International Society for Magnetic Resonance in Medicine.
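Of the three models, only the monoexponential one, S(b) = S0·exp(-b·ADC), has a simple linearized fit; the biexponential and stretched-exponential models require nonlinear optimization. A minimal sketch of the monoexponential fit via log-linear regression (the b-values and signal below are synthetic, not the study's protocol):

```python
import numpy as np

def fit_monoexponential(b_values, signals):
    """Fit S(b) = S0 * exp(-b * ADC) by linear regression on the log-signal;
    returns (ADC, S0)."""
    b = np.asarray(b_values, dtype=float)
    log_s = np.log(np.asarray(signals, dtype=float))
    slope, intercept = np.polyfit(b, log_s, 1)   # log S = log S0 - b * ADC
    return -slope, np.exp(intercept)

# synthetic decay with ADC = 1.0e-3 mm^2/s and S0 = 100
b_vals = [0.0, 200.0, 400.0, 600.0, 800.0, 1000.0]
sig = [100.0 * np.exp(-1.0e-3 * bb) for bb in b_vals]
adc, s0 = fit_monoexponential(b_vals, sig)
```

Repeating such a fit voxel by voxel over the tumor ROI yields the ADC map whose histogram metrics (mean, median, percentiles) the study compares between tumor types.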

  18. Principal component analysis of the CT density histogram to generate parametric response maps of COPD

    NASA Astrophysics Data System (ADS)

    Zha, N.; Capaldi, D. P. I.; Pike, D.; McCormack, D. G.; Cunningham, I. A.; Parraga, G.

    2015-03-01

    Pulmonary x-ray computed tomography (CT) may be used to characterize emphysema and airways disease in patients with chronic obstructive pulmonary disease (COPD). One analysis approach, parametric response mapping (PRM), utilizes registered inspiratory and expiratory CT image volumes and CT-density-histogram thresholds, but there is no consensus regarding the threshold values used, or their clinical meaning. Principal component analysis (PCA) of the CT density histogram can be exploited to quantify emphysema using data-driven CT-density-histogram thresholds. Thus, the objective of this proof-of-concept demonstration was to develop a PRM approach using PCA-derived thresholds in COPD patients and ex-smokers without airflow limitation. Methods: Fifteen COPD ex-smokers and 5 normal ex-smokers were evaluated. Thoracic CT images were acquired at full inspiration and full expiration, and these images were non-rigidly co-registered. PCA was performed on the CT density histograms, and the components with eigenvalues greater than one were summed. Since the values of the principal component curve correlate directly with the variability in the sample, the maximum and minimum points on the curve were used as threshold values for the PCA-adjusted PRM technique. Results: A significant correlation was determined between conventional and PCA-adjusted PRM with 3He MRI apparent diffusion coefficient (p<0.001), with CT RA950 (p<0.0001), as well as with 3He MRI ventilation defect percent, a measurement of both small airways disease (p=0.049 and p=0.06, respectively) and emphysema (p=0.02). Conclusions: PRM generated using PCA thresholds of the CT density histogram showed significant correlations with CT and 3He MRI measurements of emphysema, but not airways disease.
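The PCA step can be illustrated with a small numpy sketch: given one density histogram per subject (rows), center the data and take principal directions from the SVD. The component-selection rule (eigenvalues greater than one) and the threshold extraction are omitted, so this is only the core decomposition, with invented toy data:

```python
import numpy as np

def histogram_pca(histograms, n_components=2):
    """Principal components of a (subjects x bins) matrix of density histograms.
    Returns the component directions and the per-subject scores."""
    X = np.asarray(histograms, dtype=float)
    Xc = X - X.mean(axis=0)                     # centre each bin across subjects
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:n_components]              # directions of greatest variance
    scores = Xc @ components.T
    return components, scores

# toy data: all variation lies along the first histogram bin
comps, scores = histogram_pca([[1.0, 0, 0, 0], [2.0, 0, 0, 0], [3.0, 0, 0, 0]])
```

The singular values squared (divided by the number of subjects) play the role of the eigenvalues the authors threshold at one; here the first component captures all the variance by construction.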

  19. Computerized image analysis: estimation of breast density on mammograms

    NASA Astrophysics Data System (ADS)

    Zhou, Chuan; Chan, Heang-Ping; Petrick, Nicholas; Sahiner, Berkman; Helvie, Mark A.; Roubidoux, Marilyn A.; Hadjiiski, Lubomir M.; Goodsitt, Mitchell M.

    2000-06-01

    An automated image analysis tool is being developed for estimation of mammographic breast density, which may be useful for risk estimation or for monitoring breast density change in a prevention or intervention program. A mammogram is digitized using a laser scanner and the resolution is reduced to a pixel size of 0.8 mm X 0.8 mm. Breast density analysis is performed in three stages. First, the breast region is segmented from the surrounding background by an automated breast boundary-tracking algorithm. Second, an adaptive dynamic range compression technique is applied to the breast image to reduce the range of the gray level distribution in the low frequency background and to enhance the differences in the characteristic features of the gray level histogram for breasts of different densities. Third, rule-based classification is used to classify the breast images into several classes according to the characteristic features of their gray level histogram. For each image, a gray level threshold is automatically determined to segment the dense tissue from the breast region. The area of segmented dense tissue as a percentage of the breast area is then estimated. In this preliminary study, we analyzed the interobserver variation of breast density estimation by two experienced radiologists using BI-RADS lexicon. The radiologists' visually estimated percent breast densities were compared with the computer's calculation. The results demonstrate the feasibility of estimating mammographic breast density using computer vision techniques and its potential to improve the accuracy and reproducibility in comparison with the subjective visual assessment by radiologists.
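The final stage, thresholding the segmented breast region and reporting dense tissue as a percentage of breast area, reduces to a few lines; this sketch assumes the breast mask and the gray-level threshold are already available from the earlier stages, and the toy image is invented:

```python
import numpy as np

def percent_density(image, breast_mask, threshold):
    """Percentage of the segmented breast area whose grey level exceeds the
    automatically determined threshold (dense area / breast area * 100)."""
    img = np.asarray(image, dtype=float)
    mask = np.asarray(breast_mask, dtype=bool)
    dense = (img > threshold) & mask            # dense tissue inside the breast
    return 100.0 * dense.sum() / mask.sum()

# toy 2x2 "mammogram": two of four breast pixels exceed the threshold
pd = percent_density([[10.0, 200.0], [150.0, 30.0]],
                     [[True, True], [True, True]], 100.0)
```

In the paper this percentage is what the computer's estimate is compared against the radiologists' visual BI-RADS assessments.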

  20. Diagnosis of Temporomandibular Disorders Using Local Binary Patterns

    PubMed Central

    Haghnegahdar, A.A.; Kolahi, S.; Khojastepour, L.; Tajeripour, F.

    2018-01-01

    Background: Temporomandibular joint disorder (TMD) may be manifested as structural changes in bone through modification, adaptation, or direct destruction. We propose to use Local Binary Pattern (LBP) characteristics and histogram-oriented gradients on the recorded images as a diagnostic tool in TMD assessment. Material and Methods: CBCT images of 66 patients (132 joints) with TMD and 66 normal cases (132 joints) were collected, and 2 coronal cuts were prepared from each condyle; images were limited to the head of the mandibular condyle. To extract image features, we first apply LBP and then histograms of oriented gradients. To reduce dimensionality, Singular Value Decomposition (SVD) is applied to the feature-vector matrix of all images. For evaluation, we used K-nearest-neighbor (K-NN), Support Vector Machine, Naïve Bayesian, and Random Forest classifiers, with Receiver Operating Characteristic (ROC) analysis to evaluate the hypothesis. Results: The K-nearest-neighbor classifier achieves very good accuracy (0.9242) together with desirable sensitivity (0.9470) and specificity (0.9015), whereas the other classifiers show lower accuracy, sensitivity, and specificity. Conclusion: We proposed a fully automatic approach to detect TMD using image processing techniques based on local binary patterns and feature extraction. K-NN was the best classifier in our experiments for distinguishing patients from healthy individuals, with 92.42% accuracy, 94.70% sensitivity, and 90.15% specificity. The proposed method can help automatically diagnose TMD at its initial stages. PMID:29732343
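As an illustration of the LBP step (the paper does not specify which LBP variant it uses, so this is the basic 3x3 formulation): each interior pixel is encoded by comparing its eight neighbours to the centre value, one bit per neighbour:

```python
import numpy as np

def lbp_image(gray):
    """Basic 3x3 local binary pattern: each interior pixel becomes an 8-bit code,
    one bit per neighbour whose value is >= the centre value."""
    g = np.asarray(gray, dtype=float)
    # neighbour offsets, clockwise from the top-left corner
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    centre = g[1:-1, 1:-1]
    codes = np.zeros(centre.shape, dtype=int)
    for bit, (dr, dc) in enumerate(offsets):
        neighbour = g[1 + dr:g.shape[0] - 1 + dr, 1 + dc:g.shape[1] - 1 + dc]
        codes |= (neighbour >= centre).astype(int) << bit
    return codes
```

A histogram of these codes over the condyle region would then serve as the texture feature vector fed, after SVD reduction, to the classifiers.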

  1. Comparing stochastic proton interactions simulated using TOPAS-nBio to experimental data from fluorescent nuclear track detectors

    NASA Astrophysics Data System (ADS)

    Underwood, T. S. A.; Sung, W.; McFadden, C. H.; McMahon, S. J.; Hall, D. C.; McNamara, A. L.; Paganetti, H.; Sawakuchi, G. O.; Schuemann, J.

    2017-04-01

    Whilst Monte Carlo (MC) simulations of proton energy deposition have been well-validated at the macroscopic level, their microscopic validation remains lacking. Equally, no gold standard yet exists for experimental metrology of individual proton tracks. In this work we compare the distributions of stochastic proton interactions simulated using the TOPAS-nBio MC platform against confocal microscope data for Al2O3:C,Mg fluorescent nuclear track detectors (FNTDs). We irradiated 8 × 4 × 0.5 mm3 FNTD chips inside a water phantom, positioned at seven positions along a pristine proton Bragg peak with a range in water of 12 cm. MC simulations were implemented in two stages: (1) using TOPAS to model the beam properties within a water phantom and (2) using TOPAS-nBio with Geant4-DNA physics to score particle interactions through a water surrogate of Al2O3:C,Mg. The measured median track integrated brightness (IB) was observed to be strongly correlated with both (i) voxelized track-averaged linear energy transfer (LET) and (ii) the frequency-mean microdosimetric lineal energy, ȳF, both simulated in pure water. Histograms of FNTD track IB were compared against TOPAS-nBio histograms of the number of terminal electrons per proton, scored in water with mass density scaled to mimic Al2O3:C,Mg. Trends between exposure depths observed in TOPAS-nBio simulations were experimentally replicated in the study of FNTD track IB. Our results represent an important first step towards the experimental validation of MC simulations on the sub-cellular scale and suggest that FNTDs can enable experimental study of the microdosimetric properties of individual proton tracks.

  2. Comparing stochastic proton interactions simulated using TOPAS-nBio to experimental data from fluorescent nuclear track detectors.

    PubMed

    Underwood, T S A; Sung, W; McFadden, C H; McMahon, S J; Hall, D C; McNamara, A L; Paganetti, H; Sawakuchi, G O; Schuemann, J

    2017-04-21

    Whilst Monte Carlo (MC) simulations of proton energy deposition have been well-validated at the macroscopic level, their microscopic validation remains lacking. Equally, no gold standard yet exists for experimental metrology of individual proton tracks. In this work we compare the distributions of stochastic proton interactions simulated using the TOPAS-nBio MC platform against confocal microscope data for Al2O3:C,Mg fluorescent nuclear track detectors (FNTDs). We irradiated 8 × 4 × 0.5 mm3 FNTD chips inside a water phantom, positioned at seven positions along a pristine proton Bragg peak with a range in water of 12 cm. MC simulations were implemented in two stages: (1) using TOPAS to model the beam properties within a water phantom and (2) using TOPAS-nBio with Geant4-DNA physics to score particle interactions through a water surrogate of Al2O3:C,Mg. The measured median track integrated brightness (IB) was observed to be strongly correlated with both (i) voxelized track-averaged linear energy transfer (LET) and (ii) the frequency-mean microdosimetric lineal energy, ȳF, both simulated in pure water. Histograms of FNTD track IB were compared against TOPAS-nBio histograms of the number of terminal electrons per proton, scored in water with mass density scaled to mimic Al2O3:C,Mg. Trends between exposure depths observed in TOPAS-nBio simulations were experimentally replicated in the study of FNTD track IB. Our results represent an important first step towards the experimental validation of MC simulations on the sub-cellular scale and suggest that FNTDs can enable experimental study of the microdosimetric properties of individual proton tracks.

  3. Application of the Minkowski-functionals for automated pattern classification of breast parenchyma depicted by digital mammography

    NASA Astrophysics Data System (ADS)

    Boehm, Holger F.; Fischer, Tanja; Riosk, Dororthea; Britsch, Stefanie; Reiser, Maximilian

    2008-03-01

    With an estimated lifetime risk of about 10%, breast cancer is the most common cancer among women in western societies. Extensive mammography-screening programs have been implemented for diagnosis of the disease at an early stage. Several algorithms for computer-aided detection (CAD) have been proposed to help radiologists manage the increasing volume of mammographic image data and identify new cases of cancer. However, a major issue with most CAD solutions is that performance strongly depends on the structure and density of the breast tissue. Prior information about the global tissue quality of a patient would help in selecting the most effective CAD approach and thereby increase the sensitivity of lesion detection. In our study, we propose an automated method for textural evaluation of digital mammograms using the Minkowski functionals in 2D. 80 mammograms are consensus-classified by two experienced readers as fibrosis, involution/atrophy, or normal. For each case, the topology of the graylevel distribution is evaluated within a retromamillary image section of 512 x 512 pixels. In addition, we obtain parameters from the graylevel histogram (20th percentile, median, and mean graylevel intensity). Correct classification of the mammograms based on the densitometric parameters is achieved in 38% to 48% of cases, whereas topological analysis increases the rate to 83%. The findings demonstrate the effectiveness of the proposed algorithm: compared to features obtained from graylevel histograms and comparable studies, the presented method performs equally well or better. Our future work will focus on characterizing the mammographic tissue according to the Breast Imaging Reporting and Data System (BI-RADS). Moreover, other databases will be tested for an in-depth evaluation of the efficiency of our approach.
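For a binary image (e.g. the graylevel distribution thresholded at one level), the three 2D Minkowski functionals are area, perimeter, and the Euler characteristic; below is a numpy sketch using the 2x2 quad-count formula for 4-connectivity (an independent illustration, not the authors' implementation):

```python
import numpy as np

def minkowski_2d(binary):
    """2D Minkowski functionals of a binary image: area, perimeter (count of
    foreground/background 4-neighbour edges) and Euler characteristic
    (4-connectivity) via 2x2 quad counts."""
    img = np.asarray(binary, dtype=int)
    p = np.pad(img, 1)                          # zero border so outer edges count
    area = int(img.sum())
    perimeter = int(np.abs(np.diff(p, axis=0)).sum()
                    + np.abs(np.diff(p, axis=1)).sum())
    # classify every 2x2 window by its number of foreground pixels
    q = p[:-1, :-1] + p[:-1, 1:] + p[1:, :-1] + p[1:, 1:]
    diagonal = (p[:-1, :-1] == p[1:, 1:]) & (p[:-1, 1:] == p[1:, :-1]) \
               & (p[:-1, :-1] != p[:-1, 1:])
    q1 = int((q == 1).sum())                    # exactly one foreground pixel
    q3 = int((q == 3).sum())                    # exactly three
    qd = int(((q == 2) & diagonal).sum())       # two, diagonally opposed
    euler = (q1 - q3 + 2 * qd) // 4
    return area, perimeter, euler
```

Evaluating these functionals at a series of graylevel thresholds yields the topological feature curves on which the classification in the paper is presumably based; a single pixel gives (1, 4, 1) and a 3x3 ring gives (8, 16, 0), matching one component with one hole.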

  4. A New Vegetation Segmentation Approach for Cropped Fields Based on Threshold Detection from Hue Histograms

    PubMed Central

    Hassanein, Mohamed; El-Sheimy, Naser

    2018-01-01

    Over the last decade, the use of unmanned aerial vehicle (UAV) technology has evolved significantly in different applications as it provides a special platform capable of combining the benefits of terrestrial and aerial remote sensing. Therefore, such technology has been established as an important source of data collection for different precision agriculture (PA) applications such as crop health monitoring and weed management. Generally, these PA applications depend on performing a vegetation segmentation process as an initial step, which aims to detect the vegetation objects in collected agriculture fields’ images. The main result of the vegetation segmentation process is a binary image, where vegetations are presented in white color and the remaining objects are presented in black. Such process could easily be performed using different vegetation indexes derived from multispectral imagery. Recently, to expand the use of UAV imagery systems for PA applications, it was important to reduce the cost of such systems through using low-cost RGB cameras Thus, developing vegetation segmentation techniques for RGB images is a challenging problem. The proposed paper introduces a new vegetation segmentation methodology for low-cost UAV RGB images, which depends on using Hue color channel. The proposed methodology follows the assumption that the colors in any agriculture field image can be distributed into vegetation and non-vegetations colors. Therefore, four main steps are developed to detect five different threshold values using the hue histogram of the RGB image, these thresholds are capable to discriminate the dominant color, either vegetation or non-vegetation, within the agriculture field image. The achieved results for implementing the proposed methodology showed its ability to generate accurate and stable vegetation segmentation performance with mean accuracy equal to 87.29% and standard deviation as 12.5%. PMID:29670055
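    The five-threshold detection procedure itself is not reproduced in the abstract. As a simplified sketch of the underlying idea, the snippet below converts RGB to hue with NumPy and masks a fixed green hue band; the band limits and function names are assumptions for illustration, not the paper's detected thresholds:

    ```python
    import numpy as np

    def rgb_to_hue(img):
        """Hue channel in [0, 1) for an RGB image with floats in [0, 1]."""
        r, g, b = img[..., 0], img[..., 1], img[..., 2]
        mx, mn = img.max(axis=-1), img.min(axis=-1)
        d = mx - mn
        safe = np.where(d == 0, 1.0, d)  # avoid divide-by-zero on gray pixels
        h = np.select(
            [d == 0, mx == r, mx == g],
            [0.0, ((g - b) / safe) % 6, (b - r) / safe + 2],
            default=(r - g) / safe + 4)
        return h / 6.0

    def segment_vegetation(img, lo=1/6, hi=1/2):
        """Binary vegetation mask: hue inside an assumed green band [lo, hi].
        (The paper instead detects five thresholds from the hue histogram.)"""
        hue = rgb_to_hue(img)
        return (hue >= lo) & (hue <= hi)
    ```

    A histogram of the hue channel (e.g. `np.histogram(rgb_to_hue(img), bins=36)`) is the structure from which the paper's thresholds would be detected.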

  5. SU-F-BRB-07: A Plan Comparison Tool to Ensure Robustness and Deliverability in Online-Adaptive Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hill, P; Labby, Z; Bayliss, R A

    Purpose: To develop a plan comparison tool that will ensure robustness and deliverability through analysis of baseline and online-adaptive radiotherapy plans using similarity metrics. Methods: The ViewRay MRIdian treatment planning system allows export of a plan file that contains plan and delivery information. A software tool was developed to read and compare two plans, providing information and metrics to assess their similarity. In addition to performing direct comparisons (e.g. demographics, ROI volumes, number of segments, total beam-on time), the tool computes and presents histograms of derived metrics (e.g. step-and-shoot segment field sizes, segment average leaf gaps). Such metrics were investigated for their ability to predict whether an online-adapted plan is reasonably similar to a baseline plan where deliverability has already been established. Results: In the realm of online-adaptive planning, comparing ROI volumes offers a sanity check to verify observations found during contouring. Beyond ROI analysis, it has been found that simply editing contours and re-optimizing to adapt treatment can produce a delivery that is substantially different from the baseline plan (e.g. number of segments increased by 31%), with no changes in optimization parameters and only minor changes in anatomy. Currently the tool can quickly identify large omissions or deviations from baseline expectations. As our online-adaptive patient population increases, we will continue to develop and refine quantitative acceptance criteria for adapted plans and relate them to historical delivery QA measurements. Conclusion: The plan comparison tool is in clinical use and reports a wide range of comparison metrics, illustrating key differences between two plans. This independent check is accomplished in seconds and can be performed in parallel to other tasks in the online-adaptive workflow. Current use prevents large planning or delivery errors from occurring, and ongoing refinements will lead to increased assurance of plan quality.

  6. SU-F-I-10: Spatially Local Statistics for Adaptive Image Filtering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iliopoulos, AS; Sun, X; Floros, D

    Purpose: To facilitate adaptive image filtering operations, addressing spatial variations in both noise and signal. Such issues are prevalent in cone-beam projections, where physical effects such as X-ray scattering result in spatially variant noise, violating common assumptions of homogeneous noise and challenging conventional filtering approaches to signal extraction and noise suppression. Methods: We present a computational mechanism for probing into and quantifying the spatial variance of noise throughout an image. The mechanism builds a pyramid of local statistics at multiple spatial scales; local statistical information at each scale includes (weighted) mean, median, standard deviation, median absolute deviation, as well as histogram or dynamic range after local mean/median shifting. Based on inter-scale differences of local statistics, the spatial scope of distinguishable noise variation is detected in a semi- or unsupervised manner. Additionally, we propose and demonstrate the incorporation of such information in globally parametrized (i.e., non-adaptive) filters, effectively transforming the latter into spatially adaptive filters. The multi-scale mechanism is materialized by efficient algorithms and implemented on parallel CPU/GPU architectures. Results: We demonstrate the impact of local statistics for adaptive image processing and analysis using cone-beam projections of a Catphan phantom, fitted within an annulus to increase X-ray scattering. The effective spatial scope of local statistics calculations is shown to vary throughout the image domain, necessitating multi-scale noise and signal structure analysis. Filtering results with and without spatial filter adaptation are compared visually, illustrating improvements in imaging signal extraction and noise suppression, and in preserving information in low-contrast regions. Conclusion: Local image statistics can be incorporated in filtering operations to equip them with adaptivity to spatial signal/noise variations. An efficient multi-scale computational mechanism is developed to curtail processing latency. Spatially adaptive filtering may impact subsequent processing tasks such as reconstruction and numerical gradient computations for deformable registration. NIH Grant No. R01-184173.
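    The pyramid of local statistics described above can be illustrated with a plain NumPy sketch: an integral-image box filter gives local means over several window radii, from which local standard deviations follow. This is a toy version under assumed naming; the actual tool also tracks medians, MAD, and local histograms:

    ```python
    import numpy as np

    def box_filter(img, k):
        """Local mean over a (2k+1)x(2k+1) window via an integral image,
        with edge padding so the output matches the input shape."""
        pad = np.pad(img, k, mode='edge')
        c = pad.cumsum(0).cumsum(1)
        c = np.pad(c, ((1, 0), (1, 0)))  # zero row/column for clean differences
        n = 2 * k + 1
        H, W = img.shape
        s = c[n:n+H, n:n+W] - c[:H, n:n+W] - c[n:n+H, :W] + c[:H, :W]
        return s / (n * n)

    def local_stats_pyramid(img, scales=(1, 2, 4)):
        """Per-pixel local mean and standard deviation at several radii."""
        stats = {}
        for k in scales:
            mean = box_filter(img, k)
            var = box_filter(img ** 2, k) - mean ** 2
            stats[k] = (mean, np.sqrt(np.maximum(var, 0.0)))
        return stats
    ```

    Comparing the local standard deviation maps across scales is one simple way to probe where the noise scope changes, in the spirit of the inter-scale differences the abstract describes.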

  7. Dosimetric and radiobiological consequences of computed tomography-guided adaptive strategies for intensity modulated radiation therapy of the prostate.

    PubMed

    Battista, Jerry J; Johnson, Carol; Turnbull, David; Kempe, Jeff; Bzdusek, Karl; Van Dyk, Jacob; Bauman, Glenn

    2013-12-01

    To examine a range of scenarios for image-guided adaptive radiation therapy of prostate cancer, including different schedules for megavoltage CT imaging, patient repositioning, and dose replanning. We simulated multifraction dose distributions with deformable registration using 35 sets of megavoltage CT scans of 13 patients. We computed cumulative dose-volume histograms, from which tumor control probabilities and normal tissue complication probabilities (NTCPs) for rectum were calculated. Five-field intensity modulated radiation therapy (IMRT) with 18-MV x-rays was planned to achieve an isocentric dose of 76 Gy to the clinical target volume (CTV). The differences between D95, tumor control probability, V70Gy, and NTCP for rectum, for accumulated versus planned dose distributions, were compared for different target volume sizes, margins, and adaptive strategies. The CTV D95 for IMRT treatment plans, averaged over 13 patients, was 75.2 Gy. Using the largest CTV margins (10/7 mm), the D95 values accumulated over 35 fractions were within 2% of the planned value, regardless of the adaptive strategy used. For tighter margins (5 mm), the average D95 values dropped to approximately 73.0 Gy even with frequent repositioning, and daily replanning was necessary to correct this deficit. When personalized margins were applied to an adaptive CTV derived from the first 6 treatment fractions using the STAPLE (Simultaneous Truth and Performance Level Estimation) algorithm, target coverage could be maintained using a single replan 1 week into therapy. For all approaches, normal tissue parameters (rectum V70Gy and NTCP) remained within acceptable limits. The frequency of adaptive interventions depends on the size of the CTV combined with the target margins used during IMRT optimization. The application of adaptive target margins (<5 mm) to an adaptive CTV determined 1 week into therapy minimizes the need for subsequent dose replanning. Copyright © 2013 Elsevier Inc. All rights reserved.
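    The cumulative dose-volume histogram underpinning the D95 and V70Gy metrics can be sketched in a few lines of NumPy. Equal voxel volumes are assumed and the function names are illustrative, not from the study:

    ```python
    import numpy as np

    def cumulative_dvh(dose, bin_width=0.5):
        """Percent of volume receiving at least each dose level,
        assuming all voxels have equal volume."""
        dose = np.asarray(dose, float).ravel()
        levels = np.arange(0.0, dose.max() + bin_width, bin_width)
        volume = np.array([(dose >= d).mean() * 100 for d in levels])
        return levels, volume

    def v_dose(dose, threshold):
        """V_x metric (e.g. V70Gy): percent of volume receiving >= threshold."""
        return (np.asarray(dose, float) >= threshold).mean() * 100

    def d_volume(dose, percent):
        """D_x metric (e.g. D95): minimum dose received by the hottest
        `percent` of the volume, i.e. the (100 - percent)th dose percentile."""
        return float(np.percentile(np.asarray(dose, float), 100 - percent))
    ```

    Accumulating the deformably mapped per-fraction doses and recomputing these metrics is what allows the planned and delivered distributions to be compared.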

  8. Local intensity area descriptor for facial recognition in ideal and noise conditions

    NASA Astrophysics Data System (ADS)

    Tran, Chi-Kien; Tseng, Chin-Dar; Chao, Pei-Ju; Ting, Hui-Min; Chang, Liyun; Huang, Yu-Jie; Lee, Tsair-Fwu

    2017-03-01

    We propose a local texture descriptor, local intensity area descriptor (LIAD), which is applied for human facial recognition in ideal and noisy conditions. Each facial image is divided into small regions from which LIAD histograms are extracted and concatenated into a single feature vector to represent the facial image. The recognition is performed using a nearest neighbor classifier with histogram intersection and chi-square statistics as dissimilarity measures. Experiments were conducted with LIAD using the ORL database of faces (Olivetti Research Laboratory, Cambridge), the Face94 face database, the Georgia Tech face database, and the FERET database. The results demonstrated the improvement in accuracy of our proposed descriptor compared to conventional descriptors [local binary pattern (LBP), uniform LBP, local ternary pattern, histogram of oriented gradients, and local directional pattern]. Moreover, the proposed descriptor was less sensitive to noise and had low histogram dimensionality. Thus, it is expected to be a powerful texture descriptor that can be used for various computer vision problems.
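    The two dissimilarity measures named in the abstract, histogram intersection and the chi-square statistic, combined with a nearest-neighbor rule, can be sketched generically as follows (this is a standard formulation, not the authors' code):

    ```python
    import numpy as np

    def hist_intersection(h1, h2):
        """Histogram intersection similarity; larger means more similar."""
        return float(np.minimum(h1, h2).sum())

    def chi_square(h1, h2, eps=1e-10):
        """Chi-square distance between histograms; smaller means more similar."""
        return float(((h1 - h2) ** 2 / (h1 + h2 + eps)).sum())

    def nearest_neighbor(query, gallery, labels):
        """Classify by the label of the gallery histogram with the smallest
        chi-square distance to the concatenated query histogram."""
        d = [chi_square(query, g) for g in gallery]
        return labels[int(np.argmin(d))]
    ```

    In the paper's pipeline, `query` and each gallery entry would be the concatenation of the per-region LIAD histograms of a face image.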

  9. A Bayesian Modeling Approach for Estimation of a Shape-Free Groundwater Age Distribution using Multiple Tracers

    DOE PAGES

    Massoudieh, Arash; Visser, Ate; Sharifi, Soroosh; ...

    2013-10-15

    Due to the mixing of groundwaters with different ages in aquifers, groundwater age is more appropriately represented by a distribution rather than a scalar number. To infer a groundwater age distribution from environmental tracers, a mathematical form is often assumed for the shape of the distribution, and the parameters of the mathematical distribution are estimated using deterministic or stochastic inverse methods. We found that the prescription of the mathematical form limits the exploration of the age distribution to the shapes that can be described by the selected distribution. In this paper, the use of freeform histograms as groundwater age distributions is evaluated. A Bayesian Markov Chain Monte Carlo approach is used to estimate the fraction of groundwater in each histogram bin. This method was able to capture the shape of a hypothetical gamma distribution from the concentrations of four age tracers. The number of bins that can be considered in this approach is limited by the number of tracers available. The histogram method was also tested on tracer data sets from Holten (The Netherlands; 3H, 3He, 85Kr, 39Ar) and the La Selva Biological Station (Costa Rica; SF6, CFCs, 3H, 4He and 14C), and compared to a number of mathematical forms. According to standard Bayesian measures of model goodness, the best mathematical distribution performs better than the histogram distributions in terms of the ability to capture the observed tracer data relative to their complexity. Among the histogram distributions, the four-bin histogram performs better in most cases. The Monte Carlo simulations showed strong correlations in the posterior estimates of bin contributions, indicating that these bins cannot be well constrained using the available age tracers. The fact that mathematical forms overall perform better than the freeform histogram does not undermine the benefit of the freeform approach, especially for cases where a larger amount of observed data is available and where the real groundwater age distribution is more complex than can be represented by simple mathematical forms.
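    A toy version of the histogram-bin estimation can be written as a short Metropolis sampler: given a tracer response matrix G (modeled concentration of each tracer for water in each age bin), sample bin fractions that reproduce the observed concentrations under a Gaussian misfit. This sketch uses an ad hoc fold-and-renormalize proposal (whose asymmetry is ignored) and assumed names, so it illustrates the idea rather than reproducing the paper's MCMC:

    ```python
    import numpy as np

    def metropolis_bin_fractions(G, c_obs, sigma=0.05, n_iter=20000, seed=0):
        """Sample age-bin fractions f (f >= 0, sum(f) = 1) under a Gaussian
        likelihood for the tracer misfit G @ f - c_obs."""
        rng = np.random.default_rng(seed)
        n = G.shape[1]
        f = np.full(n, 1.0 / n)                 # start from a uniform histogram
        loglik = lambda f: -0.5 * np.sum(((G @ f - c_obs) / sigma) ** 2)
        ll = loglik(f)
        samples = []
        for it in range(n_iter):
            prop = np.abs(f + rng.normal(0.0, 0.05, n))  # fold negatives back
            prop /= prop.sum()                           # stay on the simplex
            ll_p = loglik(prop)
            if np.log(rng.random()) < ll_p - ll:         # Metropolis accept
                f, ll = prop, ll_p
            if it >= n_iter // 2:                        # discard burn-in
                samples.append(f.copy())
        return np.array(samples)
    ```

    The posterior correlations between bin fractions that the paper reports would show up here as correlations between columns of the returned sample array.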

  10. A Bayesian Modeling Approach for Estimation of a Shape-Free Groundwater Age Distribution using Multiple Tracers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Massoudieh, Arash; Visser, Ate; Sharifi, Soroosh

    Due to the mixing of groundwaters with different ages in aquifers, groundwater age is more appropriately represented by a distribution rather than a scalar number. To infer a groundwater age distribution from environmental tracers, a mathematical form is often assumed for the shape of the distribution, and the parameters of the mathematical distribution are estimated using deterministic or stochastic inverse methods. We found that the prescription of the mathematical form limits the exploration of the age distribution to the shapes that can be described by the selected distribution. In this paper, the use of freeform histograms as groundwater age distributions is evaluated. A Bayesian Markov Chain Monte Carlo approach is used to estimate the fraction of groundwater in each histogram bin. This method was able to capture the shape of a hypothetical gamma distribution from the concentrations of four age tracers. The number of bins that can be considered in this approach is limited by the number of tracers available. The histogram method was also tested on tracer data sets from Holten (The Netherlands; 3H, 3He, 85Kr, 39Ar) and the La Selva Biological Station (Costa Rica; SF6, CFCs, 3H, 4He and 14C), and compared to a number of mathematical forms. According to standard Bayesian measures of model goodness, the best mathematical distribution performs better than the histogram distributions in terms of the ability to capture the observed tracer data relative to their complexity. Among the histogram distributions, the four-bin histogram performs better in most cases. The Monte Carlo simulations showed strong correlations in the posterior estimates of bin contributions, indicating that these bins cannot be well constrained using the available age tracers. The fact that mathematical forms overall perform better than the freeform histogram does not undermine the benefit of the freeform approach, especially for cases where a larger amount of observed data is available and where the real groundwater age distribution is more complex than can be represented by simple mathematical forms.

  11. Histogram Analysis of Apparent Diffusion Coefficients for Occult Tonsil Cancer in Patients with Cervical Nodal Metastasis from an Unknown Primary Site at Presentation.

    PubMed

    Choi, Young Jun; Lee, Jeong Hyun; Kim, Hye Ok; Kim, Dae Yoon; Yoon, Ra Gyoung; Cho, So Hyun; Koh, Myeong Ju; Kim, Namkug; Kim, Sang Yoon; Baek, Jung Hwan

    2016-01-01

    To explore the added value of histogram analysis of apparent diffusion coefficient (ADC) values over magnetic resonance (MR) imaging and fluorine 18 ((18)F) fluorodeoxyglucose (FDG) positron emission tomography (PET)/computed tomography (CT) for the detection of occult palatine tonsil squamous cell carcinoma (SCC) in patients with cervical nodal metastasis from a cancer of an unknown primary site. The institutional review board approved this retrospective study, and the requirement for informed consent was waived. Differences in the bimodal histogram parameters of the ADC values were assessed among occult palatine tonsil SCC (n = 19), overt palatine tonsil SCC (n = 20), and normal palatine tonsils (n = 20). One-way analysis of variance was used to analyze differences among the three groups. Receiver operating characteristic curve analysis was used to determine the best differentiating parameters. The increased sensitivity of histogram analysis over MR imaging and (18)F-FDG PET/CT for the detection of occult palatine tonsil SCC was evaluated as added value. Histogram analysis showed statistically significant differences in the mean, standard deviation, and 50th and 90th percentile ADC values among the three groups (P < .0045). Occult palatine tonsil SCC had a significantly higher standard deviation for the overall curves, mean and standard deviation of the higher curves, and 90th percentile ADC value, compared with normal palatine tonsils (P < .0167). Receiver operating characteristic curve analysis showed that the standard deviation of the overall curve best delineated occult palatine tonsil SCC from normal palatine tonsils, with a sensitivity of 78.9% (15 of 19 patients) and a specificity of 60% (12 of 20 patients). The added value of ADC histogram analysis was 52.6% over MR imaging alone and 15.8% over combined conventional MR imaging and (18)F-FDG PET/CT. 
Adding ADC histogram analysis to conventional MR imaging can improve the detection sensitivity for occult palatine tonsil SCC in patients with a cervical nodal metastasis originating from a cancer of an unknown primary site. © RSNA, 2015.
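    The kinds of histogram parameters and cutoff performance figures reported here (mean, standard deviation, percentiles; sensitivity and specificity at a threshold) can be computed generically. A sketch with assumed names, not the study's pipeline:

    ```python
    import numpy as np

    def adc_histogram_features(adc_values):
        """Summary features of an ADC value distribution: mean, standard
        deviation, and the 50th/90th percentiles used in the study."""
        v = np.asarray(adc_values, float)
        return {
            'mean': float(v.mean()),
            'std': float(v.std(ddof=1)),
            'p50': float(np.percentile(v, 50)),
            'p90': float(np.percentile(v, 90)),
        }

    def sensitivity_specificity(feature, labels, cutoff):
        """Sensitivity/specificity when values above `cutoff` are positive."""
        feature = np.asarray(feature, float)
        labels = np.asarray(labels, bool)
        pred = feature > cutoff
        sens = (pred & labels).sum() / labels.sum()
        spec = (~pred & ~labels).sum() / (~labels).sum()
        return float(sens), float(spec)
    ```

    Sweeping the cutoff over the observed feature values and recording these two numbers traces the ROC curve used to pick the best differentiating parameter.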

  12. Gliomas: Application of Cumulative Histogram Analysis of Normalized Cerebral Blood Volume on 3 T MRI to Tumor Grading

    PubMed Central

    Kim, Hyungjin; Choi, Seung Hong; Kim, Ji-Hoon; Ryoo, Inseon; Kim, Soo Chin; Yeom, Jeong A.; Shin, Hwaseon; Jung, Seung Chai; Lee, A. Leum; Yun, Tae Jin; Park, Chul-Kee; Sohn, Chul-Ho; Park, Sung-Hye

    2013-01-01

    Background Glioma grading assumes significant importance in that low- and high-grade gliomas display different prognoses and are treated with dissimilar therapeutic strategies. The objective of our study was to retrospectively assess the usefulness of a cumulative normalized cerebral blood volume (nCBV) histogram for glioma grading based on 3 T MRI. Methods From February 2010 to April 2012, 63 patients with astrocytic tumors underwent 3 T MRI with dynamic susceptibility contrast perfusion-weighted imaging. Regions of interest containing the entire tumor volume were drawn on every section of the co-registered relative CBV (rCBV) maps and T2-weighted images. The percentile values from the cumulative nCBV histograms and the other histogram parameters were correlated with tumor grades. Cochran’s Q test and the McNemar test were used to compare the diagnostic accuracies of the histogram parameters after the receiver operating characteristic curve analysis. Using the parameter offering the highest diagnostic accuracy, a validation process was performed with an independent test set of nine patients. Results The 99th percentile of the cumulative nCBV histogram (nCBV C99), mean and peak height differed significantly between low- and high-grade gliomas (P < 0.001, P = 0.014 and P < 0.001, respectively) and between grade III and IV gliomas (P < 0.001, P = 0.001 and P < 0.001, respectively). The diagnostic accuracy of nCBV C99 was significantly higher than that of the mean nCBV (P = 0.016) in distinguishing high- from low-grade gliomas and was comparable to that of the peak height (P = 1.000). Validation using the two cutoff values of nCBV C99 achieved a diagnostic accuracy of 66.7% (6/9) for the separation of all three glioma grades. Conclusion Cumulative histogram analysis of nCBV using 3 T MRI can be a useful method for preoperative glioma grading. The nCBV C99 value is helpful in distinguishing high- from low-grade gliomas and grade IV from grade III gliomas. PMID:23704910

  13. Comparison of Utility of Histogram Apparent Diffusion Coefficient and R2* for Differentiation of Low-Grade From High-Grade Clear Cell Renal Cell Carcinoma.

    PubMed

    Zhang, Yu-Dong; Wu, Chen-Jiang; Wang, Qing; Zhang, Jing; Wang, Xiao-Ning; Liu, Xi-Sheng; Shi, Hai-Bin

    2015-08-01

    The purpose of this study was to compare histogram analysis of apparent diffusion coefficient (ADC) and R2* for differentiating low-grade from high-grade clear cell renal cell carcinoma (RCC). Forty-six patients with pathologically confirmed clear cell RCC underwent preoperative BOLD and DWI MRI of the kidneys. ADCs based on the entire tumor volume were calculated with b value combinations of 0 and 800 s/mm(2). ROI-based R2* was calculated with eight TE combinations of 6.7-22.8 milliseconds. Histogram analysis of tumor ADCs and R2* values was performed to obtain the mean; median; width; fifth, 10th, 90th, and 95th percentiles; and histogram inhomogeneity, kurtosis, and skewness for all lesions. Thirty-three low-grade and 13 high-grade clear cell RCCs were found at pathologic examination. The TNM classification and tumor volume of clear cell RCC significantly correlated with histogram ADC and R2* (ρ = -0.317 to 0.506; p < 0.05). High-grade clear cell RCC had significantly lower mean, median, and 10th percentile ADCs but higher inhomogeneity and median R2* than low-grade clear cell RCC (all p < 0.05). Compared with the other histogram ADC and R2* indexes, the 10th percentile ADC had the highest accuracy (91.3%) in discriminating low- from high-grade clear cell RCC. Discrimination of hemorrhage by R2* was achieved with a threshold of 68.95 Hz. At this threshold, high-grade clear cell RCC had a significantly higher prevalence of intratumor hemorrhage (high-grade, 76.9%; low-grade, 45.4%; p < 0.05) and a larger hemorrhagic area than low-grade clear cell RCC (high-grade, 34.9% ± 31.6%; low-grade, 8.9% ± 16.8%; p < 0.05). A close relation was found between MRI indexes and pathologic findings. Histogram analysis of ADC and R2* allows differentiation of low- from high-grade clear cell RCC with high accuracy.

  14. Histogram analysis of apparent diffusion coefficient maps for assessing thymic epithelial tumours: correlation with world health organization classification and clinical staging.

    PubMed

    Kong, Ling-Yan; Zhang, Wei; Zhou, Yue; Xu, Hai; Shi, Hai-Bin; Feng, Qing; Xu, Xiao-Quan; Yu, Tong-Fu

    2018-04-01

    To investigate the value of apparent diffusion coefficient (ADC) histogram analysis for assessing World Health Organization (WHO) pathological classification and Masaoka clinical stages of thymic epithelial tumours. 37 patients with histologically confirmed thymic epithelial tumours were enrolled. ADC measurements were performed using a hot-spot ROI (ADCHS-ROI) and a histogram-based approach. ADC histogram parameters included mean ADC (ADCmean), median ADC (ADCmedian), the 10th and 90th percentiles of ADC (ADC10 and ADC90), kurtosis, and skewness. One-way ANOVA, independent-sample t-test, and receiver operating characteristic analysis were used for statistical analyses. There were significant differences in ADCmean, ADCmedian, ADC10, ADC90 and ADCHS-ROI among low-risk thymoma (type A, AB, B1; n = 14), high-risk thymoma (type B2, B3; n = 9) and thymic carcinoma (type C; n = 14) groups (all p-values <0.05), while there was no significant difference in skewness (p = 0.181) or kurtosis (p = 0.088). ADC10 showed the best differentiating ability (cut-off value, ≤0.689 × 10(-3) mm(2)/s; AUC, 0.957; sensitivity, 95.65%; specificity, 92.86%) for discriminating low-risk thymoma from high-risk thymoma and thymic carcinoma. Advanced Masaoka stage (Stage III and IV; n = 24) tumours showed significantly lower ADC parameters and higher kurtosis than early Masaoka stage (Stage I and II; n = 13) tumours (all p-values <0.05), while there was no significant difference in skewness (p = 0.063). ADC10 showed the best differentiating ability (cut-off value, ≤0.689 × 10(-3) mm(2)/s; AUC, 0.913; sensitivity, 91.30%; specificity, 85.71%) for discriminating advanced from early Masaoka stage epithelial tumours. ADC histogram analysis may assist in assessing the WHO pathological classification and Masaoka clinical stages of thymic epithelial tumours. Advances in knowledge: 1. ADC histogram analysis could help to assess the WHO pathological classification of thymic epithelial tumours. 2. ADC histogram analysis could help to evaluate the Masaoka clinical stages of thymic epithelial tumours. 3. ADC10 might be a promising imaging biomarker for assessing and characterizing thymic epithelial tumours.

  15. Utility of whole-lesion ADC histogram metrics for assessing the malignant potential of pancreatic intraductal papillary mucinous neoplasms (IPMNs).

    PubMed

    Hoffman, David H; Ream, Justin M; Hajdu, Christina H; Rosenkrantz, Andrew B

    2017-04-01

    To evaluate whole-lesion ADC histogram metrics for assessing the malignant potential of pancreatic intraductal papillary mucinous neoplasms (IPMNs), including in comparison with conventional MRI features. Eighteen branch-duct IPMNs underwent MRI with DWI prior to resection (n = 16) or FNA (n = 2). A blinded radiologist placed 3D volumes-of-interest on the entire IPMN on the ADC map, from which whole-lesion histogram metrics were generated. The reader also assessed IPMN size, mural nodularity, and adjacent main-duct dilation. Benign (low-to-intermediate grade dysplasia; n = 10) and malignant (high-grade dysplasia or invasive adenocarcinoma; n = 8) IPMNs were compared. Whole-lesion ADC histogram metrics demonstrating significant differences between benign and malignant IPMNs were: entropy (5.1 ± 0.2 vs. 5.4 ± 0.2; p = 0.01, AUC = 86%); mean of the bottom 10th percentile (2.2 ± 0.4 vs. 1.6 ± 0.7; p = 0.03; AUC = 81%); and mean of the 10-25th percentile (2.8 ± 0.4 vs. 2.3 ± 0.6; p = 0.04; AUC = 79%). The overall mean ADC, skewness, and kurtosis were not significantly different between groups (p ≥ 0.06; AUC = 50-78%). For entropy (highest performing histogram metric), an optimal threshold of >5.3 achieved a sensitivity of 100%, a specificity of 70%, and an accuracy of 83% for predicting malignancy. No significant difference (p = 0.18-0.64) was observed between benign and malignant IPMNs for cyst size ≥3 cm, adjacent main-duct dilatation, or mural nodule. At multivariable analysis of entropy in combination with all other ADC histogram and conventional MRI features, entropy was the only significant independent predictor of malignancy (p = 0.004). Although requiring larger studies, ADC entropy obtained from 3D whole-lesion histogram analysis may serve as a biomarker for identifying the malignant potential of IPMNs, independent of conventional MRI features.
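    ADC entropy, the best-performing metric in this study, is the Shannon entropy of the lesion's whole-volume ADC histogram. A minimal sketch (the bin count and function name are assumptions, not the study's settings):

    ```python
    import numpy as np

    def histogram_entropy(values, bins=64):
        """Shannon entropy (bits) of the value histogram; higher entropy
        reflects a more heterogeneous distribution of values."""
        hist, _ = np.histogram(values, bins=bins)
        p = hist / hist.sum()
        p = p[p > 0]  # 0 * log(0) is taken as 0
        return float(-(p * np.log2(p)).sum())
    ```

    Entropy is maximal when values spread evenly across bins and zero when they all fall in one bin, which is why it captures lesion heterogeneity independent of the overall mean ADC.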

  16. Clarification to "Examining Rater Errors in the Assessment of Written Composition with a Many-Faceted Rasch Model."

    ERIC Educational Resources Information Center

    Englehard, George, Jr.

    1996-01-01

    Data presented in figure three of the article cited may be misleading in that the automatic scaling procedure used by the computer program that generated the histogram highlighted spikes that would look different with different histogram methods. (SLD)

  17. Using Computer Graphics in Statistics.

    ERIC Educational Resources Information Center

    Kerley, Lyndell M.

    1990-01-01

    Described is software which allows a student to use simulation to produce analytical output as well as graphical results. The results include a frequency histogram of a selected population distribution, a frequency histogram of the distribution of the sample means, and a test of the normality of the distribution of the sample means. (KR)
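    The simulation such software performs can be reproduced in a few lines: draw repeated samples, histogram their means, and watch the sampling distribution tighten around the population mean. Names and parameters below are illustrative:

    ```python
    import numpy as np

    def sample_mean_distribution(population, n, reps=2000, bins=20, seed=42):
        """Simulate the sampling distribution of the mean: draw `reps`
        samples of size n and return the means with their frequency histogram."""
        rng = np.random.default_rng(seed)
        means = np.array([rng.choice(population, size=n).mean()
                          for _ in range(reps)])
        hist, edges = np.histogram(means, bins=bins)
        return means, hist, edges
    ```

    Even for a skewed population (e.g. exponential), the histogram of sample means is approximately normal for moderate n, which is the central limit theorem lesson the software illustrates.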

  18. Performance comparisons on spatial lattice algorithm and direct matrix inverse method with application to adaptive arrays processing

    NASA Technical Reports Server (NTRS)

    An, S. H.; Yao, K.

    1986-01-01

    The lattice algorithm has been employed in numerous adaptive filtering applications such as speech analysis/synthesis, noise canceling, spectral analysis, and channel equalization. In this paper its application to adaptive-array processing is discussed. The advantages are a fast convergence rate as well as computational accuracy independent of the noise and interference conditions. The results produced by this technique are compared to those obtained by the direct matrix inverse method.
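    The direct matrix inverse benchmark mentioned here computes the adaptive weights in closed form, w = R^-1 s / (s^H R^-1 s), from the interference covariance R and the steering vector s. A compact NumPy sketch for a uniform linear array (function names are assumptions):

    ```python
    import numpy as np

    def steering_vector(n, theta, d=0.5):
        """Uniform linear array steering vector; d is spacing in wavelengths,
        theta the arrival angle in radians from broadside."""
        return np.exp(2j * np.pi * d * np.arange(n) * np.sin(theta))

    def dmi_weights(R, s):
        """Direct matrix inverse beamformer: w = R^-1 s / (s^H R^-1 s),
        giving unit (distortionless) response toward the steering vector s."""
        Rinv_s = np.linalg.solve(R, s)
        return Rinv_s / (s.conj() @ Rinv_s)
    ```

    The lattice approach reaches the same solution recursively, trading this one-shot matrix inversion for order-recursive updates with better numerical behavior.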

  19. A case of EDTA-dependent pseudothrombocytopenia: simple recognition of an underdiagnosed and misleading phenomenon

    PubMed Central

    2014-01-01

    Background EDTA-dependent pseudothrombocytopenia (EDTA-PTCP) is a common laboratory phenomenon with a prevalence ranging from 0.1-2% in hospitalized patients to 15-17% in outpatients evaluated for isolated thrombocytopenia. Despite its harmlessness, EDTA-PTCP frequently leads to time-consuming, costly and even invasive diagnostic investigations. EDTA-PTCP is often overlooked because blood smears are not evaluated visually in routine practice and histograms as well as warning flags of hematology analyzers are not interpreted correctly. Nonetheless, EDTA-PTCP may be diagnosed easily even by general practitioners without any experience in blood film examination. This is the first report illustrating the typical patterns of the platelet (PLT) and white blood cell (WBC) histograms of hematology analyzers. Case presentation A 37-year-old female patient of Caucasian origin was referred with suspected acute leukemia and the crew of the emergency unit arranged extensive investigations for work-up. However, examination of the EDTA blood sample revealed atypical lymphocytes and an isolated thrombocytopenia together with typical patterns of the WBC and PLT histograms: a serrated curve of the platelet histogram and a peculiar peak on the left side of the WBC histogram. EDTA-PTCP was confirmed by a normal platelet count when examining citrated blood. Conclusion Awareness of typical PLT and WBC patterns may alert to the presence of EDTA-PTCP in routine laboratory practice, helping to avoid unnecessary investigations and over-treatment. PMID:24808761

  20. Adaptation of Human Resource Management and Industrial Relations Graduate Courses in Spain to the European Higher Education Area

    ERIC Educational Resources Information Center

    Barba-Sanchez, Virginia

    2009-01-01

    Europe's higher education system is currently undergoing a process of change and convergence in order to guarantee equal conditions for labour mobility within its borders. Spain, like any other European country, must adapt its legislation, homogenize its studies, and raise awareness among its educational institutions (beginning with their teaching…

  1. Disentangling Stability, Variability and Adaptability in Human Performance: Focus on the Interplay between Local Variance and Serial Correlation

    ERIC Educational Resources Information Center

    Torre, Kjerstin; Balasubramaniam, Ramesh

    2011-01-01

    We address the complex relationship between the stability, variability, and adaptability of psychological systems by decomposing the global variance of serial performance into two independent parts: the local variance (LV) and the serial correlation structure. For two time series with equal LV, the presence of persistent long-range correlations…

  2. Functional Outcome Trajectories After Out-of-Hospital Pediatric Cardiac Arrest.

    PubMed

    Silverstein, Faye S; Slomine, Beth S; Christensen, James; Holubkov, Richard; Page, Kent; Dean, J Michael; Moler, Frank W

    2016-12-01

    To analyze functional performance measures collected prospectively during the conduct of a clinical trial that enrolled children (up to age 18 yr old), resuscitated after out-of-hospital cardiac arrest, who were at high risk of poor outcomes. Children with Glasgow Motor Scale score less than 5, within 6 hours of resuscitation, were enrolled in a clinical trial that compared two targeted temperature management interventions (THAPCA-OH, NCT00878644). The primary outcome, 12-month survival with Vineland Adaptive Behavior Scale, second edition, score greater than or equal to 70, did not differ between groups. Thirty-eight North American PICUs. Two hundred ninety-five children were enrolled; 270 of 295 had baseline Vineland Adaptive Behavior Scale, second edition, scores greater than or equal to 70; 87 of 270 survived 1 year. Targeted temperatures were 33.0°C and 36.8°C for the hypothermia and normothermia groups. Baseline measures included the Vineland Adaptive Behavior Scale, second edition, Pediatric Cerebral Performance Category, and Pediatric Overall Performance Category. Pediatric Cerebral Performance Category and Pediatric Overall Performance Category were rescored at hospital discharge; all three were scored at 3 and 12 months. In survivors with baseline Vineland Adaptive Behavior Scale, second edition, scores greater than or equal to 70, we evaluated relationships of hospital discharge Pediatric Cerebral Performance Category with 3- and 12-month scores and between 3- and 12-month Vineland Adaptive Behavior Scale, second edition, scores. Hospital discharge Pediatric Cerebral Performance Category scores strongly predicted 3- and 12-month Pediatric Cerebral Performance Category (r = 0.82 and 0.79; p < 0.0001) and Vineland Adaptive Behavior Scale, second edition, scores (r = -0.81 and -0.77; p < 0.0001). Three-month Vineland Adaptive Behavior Scale, second edition, scores strongly predicted 12-month performance (r = 0.95; p < 0.0001).
Hypothermia treatment did not alter these relationships. In comatose children, with Glasgow Motor Scale score less than 5 in the initial hours after out-of-hospital cardiac arrest resuscitation, function scores at hospital discharge and at 3 months predicted 12-month performance well in the majority of survivors.

  3. Adaptation in pronoun resolution: Evidence from Brazilian and European Portuguese.

    PubMed

    Fernandes, Eunice G; Luegi, Paula; Correa Soares, Eduardo; de la Fuente, Israel; Hemforth, Barbara

    2018-04-26

    Previous research accounting for pronoun resolution as a problem of probabilistic inference has not explored the phenomenon of adaptation, whereby the processor constantly tracks and adapts, rationally, to changes in a statistical environment. We investigate whether Brazilian (BP) and European Portuguese (EP) speakers adapt to variations in the probability of occurrence of ambiguous overt and null pronouns, in two experiments assessing resolution toward subject and object referents. For each variety (BP, EP), participants were faced with either the same number of null and overt pronouns (equal distribution), or with an environment with fewer overt (than null) pronouns (unequal distribution). We find that the preference for interpreting overt pronouns as referring back to an object referent (object-biased interpretation) is higher when there are fewer overt pronouns (i.e., in the unequal, relative to the equal distribution condition). This is especially the case for BP, a variety with higher prior frequency and smaller object-biased interpretation of overt pronouns, suggesting that participants adapted incrementally and integrated prior statistical knowledge with the knowledge obtained in the experiment. We hypothesize that comprehenders adapted rationally, with the goal of maintaining, across variations in pronoun probability, the likelihood of subject and object referents. Our findings unify insights from research in pronoun resolution and in adaptation, and add to previous studies in both topics: They provide evidence for the influence of pronoun probability in pronoun resolution, and for an adaptation process whereby the language processor not only tracks statistical information, but uses it to make interpretational inferences. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  4. Fast convergent frequency-domain MIMO equalizer for few-mode fiber communication systems

    NASA Astrophysics Data System (ADS)

    He, Xuan; Weng, Yi; Wang, Junyi; Pan, Z.

    2018-02-01

    Space division multiplexing using few-mode fibers has been extensively explored to sustain continuous traffic growth. In few-mode fiber optical systems, both spatial and polarization modes are exploited to transmit parallel channels, thus increasing the overall capacity. However, signals on spatial channels inevitably suffer from intrinsic inter-modal coupling and large accumulated differential mode group delay (DMGD), which makes demultiplexing the spatial modes even harder. Many research articles have demonstrated that a frequency-domain adaptive multi-input multi-output (MIMO) equalizer can effectively compensate the DMGD and demultiplex the spatial channels with digital signal processing (DSP). However, the large accumulated DMGD usually requires a large number of training blocks for the initial convergence of adaptive MIMO equalizers, which decreases the overall system efficiency and can even degrade equalizer performance in fast-changing optical channels. The least mean square (LMS) algorithm is commonly used in MIMO equalization to dynamically demultiplex the spatial signals. We have proposed a signal power spectral density (PSD) dependent method and a noise PSD directed method to improve the convergence speed of the adaptive frequency-domain LMS algorithm. We also proposed a frequency-domain recursive least square (RLS) algorithm to further increase the convergence speed of the MIMO equalizer at the cost of greater hardware complexity. In this paper, we compare the hardware complexity and convergence speed of the signal PSD dependent and noise PSD directed algorithms against the conventional frequency-domain LMS algorithm. In our numerical study of a three-mode 112 Gbit/s PDM-QPSK optical system with 3000 km transmission, the noise PSD directed and signal PSD dependent methods improved the convergence speed by 48.3% and 36.1% respectively, at the cost of 17.2% and 10.7% higher hardware complexity. 
We will also compare the frequency domain RLS algorithm against conventional frequency domain LMS algorithm. Our numerical study shows that, in a three-mode 224 Gbit/s PDM-16-QAM system with 3000 km transmission, the RLS algorithm could improve the convergence speed by 53.7% over conventional frequency domain LMS algorithm.
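    The per-bin adaptation at the heart of such a frequency-domain LMS equalizer can be sketched for a single channel. This is a minimal single-input illustration under assumed conditions (a normalized step size, a toy two-tap circular channel, QPSK training blocks), not the authors' MIMO implementation:

```python
import cmath
import random

def dft(x):
    """Naive DFT, adequate for a short illustrative block."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def circular_channel(tx, h):
    """Toy channel model: circular convolution of a block with taps h."""
    N = len(tx)
    return [sum(h[m] * tx[(n - m) % N] for m in range(len(h))) for n in range(N)]

def fd_nlms(rx_blocks, tx_blocks, mu=0.5):
    """Adapt one complex tap W[k] per frequency bin (normalized LMS) so that
    W[k] * RX[k] approaches TX[k] over the training blocks."""
    N = len(rx_blocks[0])
    W = [complex(1.0)] * N
    for rx, tx in zip(rx_blocks, tx_blocks):
        RX, TX = dft(rx), dft(tx)
        for k in range(N):
            err = TX[k] - W[k] * RX[k]
            W[k] += mu * err * RX[k].conjugate() / (abs(RX[k]) ** 2 + 1e-12)
    return W

# Train on random QPSK blocks sent through a two-tap channel.
rng = random.Random(1)
N, h = 8, [0.7, 0.3j]
tx_blocks = [[rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) for _ in range(N)]
             for _ in range(40)]
rx_blocks = [circular_channel(tx, h) for tx in tx_blocks]
W = fd_nlms(rx_blocks, tx_blocks)
```

    For a circular-convolution channel the per-bin optimum is W[k] = 1/H[k], so with a normalized step the taps converge geometrically over the training blocks; the convergence-acceleration methods in the abstract refine exactly this adaptation loop.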

  5. Adapted Finnegan scoring list for observation of anti-depressant exposed infants.

    PubMed

    Kieviet, Noera; van Ravenhorst, Mariëtte; Dolman, Koert M; van de Ven, Peter M; Heres, Marion; Wennink, Hanneke; Honig, Adriaan

    2015-01-01

    The Finnegan scoring list (FSL) is widely used to screen for poor neonatal adaptation in infants exposed to anti-depressants in utero. However, the large number of FSL items and the differential weighting of each item make scoring time-consuming. The aim of this study was to shorten and simplify the FSL while preserving its clinimetric properties. This observational study examined infants exposed to an anti-depressant during pregnancy admitted for at least 72 h on a maternity ward. Trained nurses completed the FSL three times daily. Items for the adapted FSL were selected through forward analysis, whereby the number of selected items was based on the area under the curve (AUC). Internal validity was assessed by cross-validation. 183 infants met the inclusion criteria. By forward analysis, eight equally-weighted items resulted in an AUC of 0.91. In cross-validation, the mean AUC was 0.89 for 8 items. This adapted FSL had a sensitivity of 97.7% and specificity of 37.0% at a cut-off of 1, and a sensitivity of 41.9% and specificity of 86.2% at a cut-off of 2. An adapted FSL with eight equally-weighted items has acceptable clinimetric properties and can serve as an easy-to-apply screening tool in infants exposed to anti-depressants during pregnancy.

  6. Gaze Fluctuations Are Not Additively Decomposable: Reply to Bogartz and Staub

    ERIC Educational Resources Information Center

    Kelty-Stephen, Damian G.; Mirman, Daniel

    2013-01-01

    Our previous work interpreted single-lognormal fits to inter-gaze distance (i.e., "gaze steps") histograms as evidence of multiplicativity and hence interactions across scales in visual cognition. Bogartz and Staub (2012) proposed that gaze steps are additively decomposable into fixations and saccades, matching the histograms better and…

  7. A fast adaptive convex hull algorithm on two-dimensional processor arrays with a reconfigurable BUS system

    NASA Technical Reports Server (NTRS)

    Olariu, S.; Schwing, J.; Zhang, J.

    1991-01-01

    A bus system that can change dynamically to suit computational needs is referred to as reconfigurable. We present a fast adaptive convex hull algorithm on a two-dimensional processor array with a reconfigurable bus system (2-D PARBS, for short). Specifically, we show that computing the convex hull of a planar set of n points takes O(log n/log m) time on a 2-D PARBS of size mn x n with 3 ≤ m ≤ n. Our result implies that the convex hull of n points in the plane can be computed in O(1) time on a 2-D PARBS of size n^1.5 x n.
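    As a point of reference for the parallel bound above, the standard sequential O(n log n) convex hull (Andrew's monotone chain) can be sketched; this is a generic baseline, not the PARBS algorithm itself:

```python
def cross(o, a, b):
    """Z-component of (a - o) x (b - o): > 0 for a counter-clockwise turn."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def convex_hull(points):
    """Andrew's monotone chain: hull vertices in counter-clockwise order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    lower, upper = [], []
    for p in pts:                       # build lower hull left to right
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):             # build upper hull right to left
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]      # endpoints shared, drop duplicates

hull = convex_hull([(0, 0), (1, 0), (1, 1), (0, 1), (0.5, 0.5)])
```

    The interior point (0.5, 0.5) is discarded and the four corners come back in counter-clockwise order; the sort dominates the running time.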

  8. Trinucleotide's quadruplet symmetries and natural symmetry law of DNA creation ensuing Chargaff's second parity rule.

    PubMed

    Rosandić, Marija; Vlahović, Ines; Glunčić, Matko; Paar, Vladimir

    2016-07-01

    For almost 50 years, a conclusive explanation of Chargaff's second parity rule (CSPR), the equality of the frequencies of nucleotides A=T and C=G, or equivalently of direct and reverse-complement trinucleotides in the same DNA strand, has remained elusive. Here, we relate CSPR to the interstrand mirror symmetry in 20 symbolic quadruplets of trinucleotides (direct, reverse complement, complement, and reverse) mapped to the double-stranded genome. The symmetries of the Q-box corresponding to the quadruplets can be obtained as a consequence of Watson-Crick base pairing and CSPR together. Alternatively, assuming the Natural symmetry law for DNA creation, that each trinucleotide in one strand of DNA must simultaneously appear also in the opposite strand, automatically leads to the Q-box direct-reverse mirror symmetry, which in conjunction with Watson-Crick base pairing generates CSPR. We demonstrate the quadruplet symmetries in chromosomes of a wide range of organisms, from Escherichia coli to the Neanderthal and human genomes, introducing novel quadruplet-frequency histograms and 3D diagrams with combined interstrand frequencies. These "landscapes" are mutually similar in all mammals, including extinct Neanderthals, and somewhat different in most older species. In human chromosomes 1-12, X, and Y, the "landscapes" are almost identical, and slightly different in the remaining smaller and telocentric chromosomes. Quadruplet frequencies could provide a new robust tool for characterization and classification of genomes and their evolutionary trajectories.
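    The strand bookkeeping behind CSPR can be illustrated directly: if every trinucleotide counted on one strand is also counted on the complementary strand (both read 5' to 3'), then the combined frequency of each trinucleotide exactly equals that of its reverse complement, which is the identity described above. A minimal sketch (function names are illustrative):

```python
import random
from collections import Counter

COMP = str.maketrans("ACGT", "TGCA")

def revcomp(s):
    """Reverse complement, so both strands are read 5' to 3'."""
    return s.translate(COMP)[::-1]

def trinucleotide_counts(seq):
    """Overlapping trinucleotide frequencies of one strand."""
    return Counter(seq[i:i + 3] for i in range(len(seq) - 2))

def double_strand_counts(seq):
    """Count every trinucleotide on both strands; the resulting frequency of
    each trinucleotide equals that of its reverse complement, the exact
    identity that CSPR expresses statistically for single strands."""
    return trinucleotide_counts(seq) + trinucleotide_counts(revcomp(seq))

# Any sequence works; the symmetry is an identity, not a statistical fact.
rng = random.Random(0)
seq = "".join(rng.choice("ACGT") for _ in range(2000))
combined = double_strand_counts(seq)
```

    Collecting such combined counts per chromosome is, in essence, the raw material of the quadruplet-frequency histograms and "landscapes" the abstract describes.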

  9. Speaker Invariance for Phonetic Information: an fMRI Investigation

    PubMed Central

    Salvata, Caden; Blumstein, Sheila E.; Myers, Emily B.

    2012-01-01

    The current study explored how listeners map the variable acoustic input onto a common sound structure representation while being able to retain phonetic detail to distinguish among the identity of talkers. An adaptation paradigm was utilized to examine areas which showed an equal neural response (equal release from adaptation) to phonetic change when spoken by the same speaker and when spoken by two different speakers, and insensitivity (failure to show release from adaptation) when the same phonetic input was spoken by a different speaker. Neural areas which showed speaker invariance were located in the anterior portion of the middle superior temporal gyrus bilaterally. These findings provide support for the view that speaker normalization processes allow for the translation of a variable speech input to a common abstract sound structure. That this process appears to occur early in the processing stream, recruiting temporal structures, suggests that this mapping takes place prelexically, before sound structure input is mapped on to lexical representations. PMID:23264714

  10. MRI intensity nonuniformity correction using simultaneously spatial and gray-level histogram information.

    PubMed

    Milles, Julien; Zhu, Yue Min; Gimenez, Gérard; Guttmann, Charles R G; Magnin, Isabelle E

    2007-03-01

    A novel approach for correcting intensity nonuniformity in magnetic resonance imaging (MRI) is presented. This approach is based on the simultaneous use of spatial and gray-level histogram information. Spatial information about intensity nonuniformity is obtained using cubic B-spline smoothing. Gray-level histogram information of the image corrupted by intensity nonuniformity is exploited from a frequential point of view. The proposed correction method is illustrated using both physical phantom and human brain images. The results are consistent with theoretical prediction, and demonstrate a new way of dealing with intensity nonuniformity problems. They are all the more significant as the ground truth on intensity nonuniformity is unknown in clinical images.

  11. An effective image classification method with the fusion of invariant feature and a new color descriptor

    NASA Astrophysics Data System (ADS)

    Mansourian, Leila; Taufik Abdullah, Muhamad; Nurliyana Abdullah, Lili; Azman, Azreen; Mustaffa, Mas Rina

    2017-02-01

    Pyramid Histogram of Words (PHOW) combines Bag of Visual Words (BoVW) with spatial pyramid matching (SPM) in order to add location information to the extracted features. However, the PHOW variants extracted from various color spaces do not capture color information individually; that is, they discard color information, an important characteristic of any image that is motivated by human vision. This article concatenates the PHOW Multi-Scale Dense Scale Invariant Feature Transform (MSDSIFT) histogram with a proposed color histogram to improve the performance of existing image classification algorithms. Performance evaluation on several datasets shows that the new approach outperforms existing state-of-the-art methods.

  12. Application of Markov Models for Analysis of Development of Psychological Characteristics

    ERIC Educational Resources Information Center

    Kuravsky, Lev S.; Malykh, Sergey B.

    2004-01-01

    A technique to study combined influence of environmental and genetic factors on the base of changes in phenotype distributions is presented. Histograms are exploited as base analyzed characteristics. A continuous time, discrete state Markov process with piece-wise constant interstate transition rates is associated with evolution of each histogram.…

  13. Post-Modeling Histogram Matching of Maps Produced Using Regression Trees

    Treesearch

    Andrew J. Lister; Tonya W. Lister

    2006-01-01

    Spatial predictive models often use statistical techniques that in some way rely on averaging of values. Estimates from linear modeling are known to be susceptible to truncation of variance when the independent (predictor) variables are measured with error. A straightforward post-processing technique (histogram matching) for attempting to mitigate this effect is...

  14. Microprocessor-Based Neural-Pulse-Wave Analyzer

    NASA Technical Reports Server (NTRS)

    Kojima, G. K.; Bracchi, F.

    1983-01-01

    Microprocessor-based system analyzes amplitudes and rise times of neural waveforms. Displaying histograms of measured parameters helps researchers determine how many nerves contribute to signal and specify waveform characteristics of each. Results are improved noise rejection, full or partial separation of overlapping peaks, and isolation and identification of related peaks in different histograms.

  15. Histogram-based automatic thresholding for bruise detection of apples by structured-illumination reflectance imaging

    USDA-ARS?s Scientific Manuscript database

    Thresholding is an important step in the segmentation of image features, and the existing methods are not all effective when the image histogram exhibits a unimodal pattern, which is common in defect detection of fruit. This study was aimed at developing a general automatic thresholding methodology ...

  16. Adaptive Control Of Woofer-Tweeter Adaptive Optics

    DTIC Science & Technology

    2009-03-01

    the actuator geometry and the matrix F describes the lowpass filter. The columns of T form a set of basis vectors in the space of the master...set equal to the simulated aperture size of 76 cm. The tweeter DM has 39 actuators across the aperture with a spacing of 2 cm for a total of 1521...actuators over the square aperture.

  17. Smart Acoustic Network Using Combined FSK-PSK, Adaptive Beamforming and Equalization

    DTIC Science & Technology

    2002-09-30

    sonar data transmission from underwater vehicle during mission. The two-year objectives for the high-reliability acoustic network using multiple... sonar laboratory) and used for acoustic networking during underwater vehicle operation. The joint adaptive coherent path beamformer method consists...broadband communications transducer, while the low noise preamplifier conditions received signals for analog to digital conversion. External user

  18. Smart Acoustic Network Using Combined FSK-PSK, Adaptive, Beamforming and Equalization

    DTIC Science & Technology

    2001-09-30

    sonar data transmission from underwater vehicle during mission. The two-year objectives for the high-reliability acoustic network using multiple... sonar laboratory) and used for acoustic networking during underwater vehicle operation. The joint adaptive coherent path beamformer method consists...broadband communications transducer, while the low noise preamplifier conditions received signals for analog to digital conversion. External user

  19. Deep convolutional neural networks for automatic classification of gastric carcinoma using whole slide images in digital histopathology.

    PubMed

    Sharma, Harshita; Zerbe, Norman; Klempert, Iris; Hellwich, Olaf; Hufnagl, Peter

    2017-11-01

    Deep learning using convolutional neural networks is an actively emerging field in histological image analysis. This study explores deep learning methods for computer-aided classification in H&E stained histopathological whole slide images of gastric carcinoma. An introductory convolutional neural network architecture is proposed for two computerized applications, namely, cancer classification based on immunohistochemical response and necrosis detection based on the existence of tumor necrosis in the tissue. Classification performance of the developed deep learning approach is quantitatively compared with traditional image analysis methods in digital histopathology requiring prior computation of handcrafted features, such as statistical measures using gray level co-occurrence matrix, Gabor filter-bank responses, LBP histograms, gray histograms, HSV histograms and RGB histograms, followed by random forest machine learning. Additionally, the widely known AlexNet deep convolutional framework is comparatively analyzed for the corresponding classification problems. The proposed convolutional neural network architecture reports favorable results, with an overall classification accuracy of 0.6990 for cancer classification and 0.8144 for necrosis detection. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Distribution of a suite of elements including arsenic and mercury in Alabama coal

    USGS Publications Warehouse

    Goldhaber, Martin B.; Bigelow, R.C.; Hatch, J.R.; Pashin, J.C.

    2000-01-01

    Arsenic and other elements are unusually abundant in Alabama coal. This conclusion is based on chemical analyses of coal in the U.S. Geological Survey's National Coal Resources Data System (NCRDS; Bragg and others, 1994). According to NCRDS data, the average concentration of arsenic in Alabama coal (72 ppm) is three times higher than is the average for all U.S. coal (24 ppm). Of the U.S. coal analyses for arsenic that are at least 3 standard deviations above the mean, approximately 90% are from the coal fields of Alabama. Figure 1 contrasts the abundance of arsenic in coal of the Warrior field of Alabama (histogram C) with that of coal of the Powder River Basin, Wyoming (histogram A), and the Eastern Interior Province including the Illinois Basin and nearby areas (histogram B). The Warrior field is by far the largest in Alabama. On the histogram, the large 'tail' of very high values (> 200 ppm) in the Warrior coal contrasts with the other two regions that have very few analyses greater than 200 ppm.

  1. Real-Time Tracking by Double Templates Matching Based on Timed Motion History Image with HSV Feature

    PubMed Central

    Li, Zhiyong; Li, Pengfei; Yu, Xiaoping; Hashem, Mervat

    2014-01-01

    It is a challenge to represent the target appearance model for moving object tracking in complex environments. This study presents a novel method in which the appearance model is described by double templates based on the timed motion history image with HSV color histogram feature (tMHI-HSV). The main components include offline and online template initialization, tMHI-HSV-based calculation of candidate patches' feature histograms, double templates matching (DTM) for object location, and template updating. Firstly, we initialize the target object region and calculate its HSV color histogram feature as the offline template and the online template. Secondly, the tMHI-HSV is used to segment the motion region and calculate the candidate object patches' color histograms to represent their appearance models. Finally, we utilize the DTM method to track the target and update the offline and online templates in real time. The experimental results show that the proposed method can efficiently handle the scale variation and pose change of rigid and nonrigid objects, even under illumination change and occlusion. PMID:24592185
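    The abstract does not specify how candidate-patch histograms are matched against the templates; a common choice in such trackers is the Bhattacharyya coefficient between normalized histograms, sketched here as an assumed illustration:

```python
import math

def normalized(hist):
    """Scale a histogram so its bins sum to 1 (empty histograms pass through)."""
    s = float(sum(hist))
    return [h / s for h in hist] if s else list(hist)

def bhattacharyya(h1, h2):
    """Similarity in [0, 1] between two histograms (1 = identical shape)."""
    return sum(math.sqrt(a * b)
               for a, b in zip(normalized(h1), normalized(h2)))
```

    A candidate patch would then be scored against both the offline and the online template histograms, for instance keeping the larger of the two similarities; that double-scoring is the spirit of the DTM step.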

  2. Adjustments for the display of quantized ion channel dwell times in histograms with logarithmic bins.

    PubMed

    Stark, J A; Hladky, S B

    2000-02-01

    Dwell-time histograms are often plotted as part of patch-clamp investigations of ion channel currents. The advantages of plotting these histograms with a logarithmic time axis were demonstrated in J. Physiol. (Lond.) 378:141-174, Pflügers Arch. 410:530-553, and Biophys. J. 52:1047-1054. Sigworth and Sine argued that the interpretation of such histograms is simplified if the counts are presented in a manner similar to that of a probability density function. However, when ion channel records are recorded as a discrete time series, the dwell times are quantized. As a result, the mapping of dwell times to logarithmically spaced bins is highly irregular; bins may be empty, and significant irregularities may extend beyond the duration of 100 samples. Using simple approximations based on the nature of the binning process and the transformation rules for probability density functions, we develop adjustments for the display of the counts to compensate for this effect. Tests with simulated data suggest that this procedure provides a faithful representation of the data.
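    The binning irregularity, and one simple per-bin occupancy adjustment, can be sketched; the bin spacing and the adjustment rule below are illustrative, in the spirit of the paper rather than its exact procedure:

```python
import math

def log_bin_edges(t_min, t_max, bins_per_decade=8):
    """Logarithmically spaced bin edges from t_min up to (at least) t_max."""
    n = int(math.ceil((math.log10(t_max) - math.log10(t_min)) * bins_per_decade))
    return [t_min * 10 ** (i / bins_per_decade) for i in range(n + 1)]

def quantized_log_histogram(dwells, dt, edges):
    """Bin dwell times (integer multiples of dt) into log-spaced bins and
    divide each count by the number of quantized levels the bin contains,
    so bins holding few (or no) representable durations are not misread."""
    nbins = len(edges) - 1
    counts, levels = [0] * nbins, [0] * nbins

    def bin_of(t):
        for i in range(nbins):
            if edges[i] <= t < edges[i + 1]:
                return i
        return None

    for k in range(1, int(edges[-1] / dt) + 1):   # representable levels
        i = bin_of(k * dt)
        if i is not None:
            levels[i] += 1
    for t in dwells:
        i = bin_of(t)
        if i is not None:
            counts[i] += 1
    adjusted = [c / l if l else None for c, l in zip(counts, levels)]
    return counts, levels, adjusted

# Dwell times sampled at dt = 0.1, binned at 4 bins per decade over one decade.
edges = log_bin_edges(0.1, 1.0, bins_per_decade=4)
counts, levels, adjusted = quantized_log_histogram(
    [k * 0.1 for k in (1, 1, 2, 3, 5, 8)], 0.1, edges)
```

    Note how the narrow low-time bins contain only one or two representable levels while wider bins contain several; the per-level division is what keeps the displayed density comparable across bins.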

  3. Estimation of urban surface water at subpixel level from neighborhood pixels using multispectral remote sensing image (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Xie, Huan; Luo, Xin; Xu, Xiong; Wang, Chen; Pan, Haiyan; Tong, Xiaohua; Liu, Shijie

    2016-10-01

    Water bodies are a fundamental element of urban ecosystems, and water mapping is critical for urban and landscape planning and management. Although remote sensing has increasingly been used for water mapping in rural areas, applying this spatially explicit approach in urban areas remains challenging, because urban water bodies are mostly small and spectral confusion between water and complex urban features is widespread. The water index (WI) is the most common method for water extraction at the pixel level, and spectral mixture analysis (SMA) has recently been widely employed for analyzing urban environments at the subpixel level. In this paper, we introduce an automatic subpixel water mapping method for urban areas using multispectral remote sensing data. The objectives of this research are: (1) developing an automatic technique for extracting land-water mixed pixels with a water index; (2) deriving the most representative endmembers of water and land from neighboring water pixels and an adaptively, iteratively selected optimal neighboring land pixel, respectively; (3) applying a linear unmixing model for subpixel water fraction estimation. Specifically, to automatically extract land-water pixels, locally weighted scatter plot smoothing is first applied to the original histogram curve of the WI image. The Otsu threshold is then derived as a starting point for selecting land-water pixels from the histogram of the WI image, with the land and water thresholds determined from the slopes of the histogram curve. Based on this pixel-level process, the image is divided into three parts: water pixels, land pixels, and mixed land-water pixels. SMA is then applied to the mixed land-water pixels for water fraction estimation at the subpixel level. 
    With the assumption that the endmember signature of a target pixel should be more similar to those of adjacent pixels due to spatial dependence, the water and land endmembers are determined from neighboring pure land or pure water pixels within a given distance. To obtain the most representative endmembers for SMA, we designed an adaptive iterative endmember selection method based on the spatial similarity of adjacent pixels. According to the spectral similarity within a spatially adjacent region, the land endmember spectrum is determined by selecting the most representative land pixel in a local window, and the water endmember spectrum is determined by averaging the water pixels in the local window. The proposed hierarchical processing method based on WI and SMA (WISMA) is applied to urban areas for reliability evaluation using Landsat-8 Operational Land Imager (OLI) images. For comparison, four methods at the pixel level and the subpixel level were chosen. Results indicate that the water maps generated by the proposed method correspond closely with the reference water maps at subpixel precision, and that WISMA achieved the best performance in water mapping under a comprehensive analysis of different accuracy measures (RMSE and SE).
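    The Otsu threshold used as the starting point above can be computed directly from a gray-level histogram; a minimal sketch of that one step:

```python
def otsu_threshold(histogram):
    """Return the gray level maximizing between-class variance."""
    total = sum(histogram)
    sum_all = sum(i * h for i, h in enumerate(histogram))
    w0 = sum0 = 0.0
    best_t, best_var = 0, -1.0
    for t, h in enumerate(histogram):
        w0 += h                      # pixels at or below level t
        if w0 == 0:
            continue
        w1 = total - w0              # pixels above level t
        if w1 == 0:
            break
        sum0 += t * h
        m0, m1 = sum0 / w0, (sum_all - sum0) / w1
        var_between = w0 * w1 * (m0 - m1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Bimodal toy histogram: modes near levels 2 and 7.
hist = [0] * 10
hist[2], hist[3], hist[6], hist[7] = 100, 20, 20, 100
t = otsu_threshold(hist)
```

    In the method above this global threshold is only the seed; the final land and water thresholds are then refined from the slopes of the smoothed histogram curve.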

  4. Assessment of Arterial Wall Enhancement for Differentiation of Parent Artery Disease from Small Artery Disease: Comparison between Histogram Analysis and Visual Analysis on 3-Dimensional Contrast-Enhanced T1-Weighted Turbo Spin Echo MR Images at 3T.

    PubMed

    Jang, Jinhee; Kim, Tae-Won; Hwang, Eo-Jin; Choi, Hyun Seok; Koo, Jaseong; Shin, Yong Sam; Jung, So-Lyung; Ahn, Kook-Jin; Kim, Bum-Soo

    2017-01-01

    The purpose of this study was to compare histogram analysis and visual scores in 3T MRI assessment of middle cerebral arterial wall enhancement in patients with acute stroke, for the differentiation of parent artery disease (PAD) from small artery disease (SAD). Among 82 consecutive patients seen at a tertiary hospital over one year, 25 patients with acute infarcts in the middle cerebral artery (MCA) territory were included in this study: 15 patients with PAD and 10 patients with SAD. Three-dimensional contrast-enhanced T1-weighted turbo spin echo MR images with black-blood preparation at 3T were analyzed both qualitatively and quantitatively. The degree of MCA stenosis and visual and histogram assessments of MCA wall enhancement were evaluated. A statistical analysis was performed to compare diagnostic accuracy between the qualitative and quantitative metrics. The degree of stenosis, visual enhancement score, geometric mean (GM), and the 90th percentile (90P) value from the histogram analysis were significantly higher in PAD than in SAD (p = 0.006 for stenosis, p < 0.001 for others). The receiver operating characteristic curve areas of GM and 90P were both 1 (95% confidence interval [CI], 0.86-1.00). A histogram analysis of relevant arterial wall enhancement allows differentiation between PAD and SAD in patients with acute stroke within the MCA territory.

  5. Utility of histogram analysis of apparent diffusion coefficient maps obtained using 3.0T MRI for distinguishing uterine carcinosarcoma from endometrial carcinoma.

    PubMed

    Takahashi, Masahiro; Kozawa, Eito; Tanisaka, Megumi; Hasegawa, Kousei; Yasuda, Masanori; Sakai, Fumikazu

    2016-06-01

    We explored the role of histogram analysis of apparent diffusion coefficient (ADC) maps for discriminating uterine carcinosarcoma and endometrial carcinoma. We retrospectively evaluated findings in 13 patients with uterine carcinosarcoma and 50 patients with endometrial carcinoma who underwent diffusion-weighted imaging (b = 0, 500, 1000 s/mm(2)) at 3T with acquisition of corresponding ADC maps. We derived histogram data from regions of interest drawn on all slices of the ADC maps in which tumor was visualized, excluding areas of necrosis and hemorrhage in the tumor. We used the Mann-Whitney test to evaluate the capacity of the histogram parameters (mean ADC value, 5th to 95th percentiles, skewness, kurtosis) to discriminate uterine carcinosarcoma and endometrial carcinoma, and analyzed the receiver operating characteristic (ROC) curve to determine the optimum threshold value for each parameter and its corresponding sensitivity and specificity. Carcinosarcomas demonstrated significantly higher mean ADC values, 95th, 90th, 75th, 50th, and 25th percentiles, and kurtosis than endometrial carcinomas (P < 0.05). ROC curve analysis of the 75th percentile yielded the best area under the ROC curve (AUC; 0.904), sensitivity of 100%, and specificity of 78.0%, with a cutoff value of 1.034 × 10(-3) mm(2)/s. Histogram analysis of ADC maps might be helpful for discriminating uterine carcinosarcomas and endometrial carcinomas. J. Magn. Reson. Imaging 2016;43:1301-1307. © 2015 Wiley Periodicals, Inc.

  6. Characterization of testicular germ cell tumors: Whole-lesion histogram analysis of the apparent diffusion coefficient at 3T.

    PubMed

    Min, Xiangde; Feng, Zhaoyan; Wang, Liang; Cai, Jie; Yan, Xu; Li, Basen; Ke, Zan; Zhang, Peipei; You, Huijuan

    2018-01-01

    To assess the values of parameters derived from whole-lesion histograms of the apparent diffusion coefficient (ADC) at 3T for the characterization of testicular germ cell tumors (TGCTs). A total of 24 men with TGCTs underwent 3T diffusion-weighted imaging. Fourteen tumors were pathologically confirmed as seminomas, and ten as nonseminomas. Whole-lesion histogram analysis of the ADC values was performed. A Mann-Whitney U test was employed to compare the differences in ADC histogram parameters between seminomas and nonseminomas. Receiver operating characteristic analysis was used to identify the cutoff values of each parameter for differentiating seminomas from nonseminomas, and the area under the curve (AUC) was calculated to evaluate diagnostic accuracy. The median 10th, 25th, 50th, 75th, and 90th percentiles and the mean, minimum, and maximum ADC values were all significantly lower for seminomas than for nonseminomas (p<0.05 for all). In contrast, the median kurtosis and skewness of the ADC values were both significantly higher for seminomas (p=0.003 and 0.001, respectively). For differentiating nonseminomas from seminomas, the 10th percentile ADC yielded the highest AUC, with a sensitivity and specificity of 100% and 92.86%, respectively. Whole-lesion histogram analysis of ADCs might be used for preoperative characterization of TGCTs. Copyright © 2017 Elsevier B.V. All rights reserved.
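    The whole-lesion ADC histogram parameters reported in studies like these (percentiles, skewness, kurtosis) can be computed from the pooled ROI voxel values. A minimal sketch using a linear-interpolation percentile; actual implementations vary between vendors and papers:

```python
import math

def percentile(sorted_vals, p):
    """Linear-interpolation percentile (p in 0..100) on pre-sorted data."""
    k = (len(sorted_vals) - 1) * p / 100.0
    f = int(math.floor(k))
    c = min(f + 1, len(sorted_vals) - 1)
    return sorted_vals[f] + (sorted_vals[c] - sorted_vals[f]) * (k - f)

def adc_histogram_params(values):
    """Whole-lesion histogram metrics: mean, percentiles, skewness, kurtosis
    (population moments; kurtosis is not excess kurtosis)."""
    v = sorted(values)
    n = len(v)
    mean = sum(v) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in v) / n)
    skew = sum((x - mean) ** 3 for x in v) / (n * sd ** 3) if sd else 0.0
    kurt = sum((x - mean) ** 4 for x in v) / (n * sd ** 4) if sd else 0.0
    params = {"mean": mean, "skewness": skew, "kurtosis": kurt}
    for p in (10, 25, 50, 75, 90):
        params["p%d" % p] = percentile(v, p)
    return params

# Example: synthetic stand-in for pooled ROI voxel values.
params = adc_histogram_params(list(range(1, 101)))
```

    On this symmetric synthetic input the skewness is zero and the kurtosis is near the uniform-distribution value of 1.8; on real lesions the low-percentile values and shape moments are what carried the diagnostic signal above.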

  7. Using color histogram normalization for recovering chromatic illumination-changed images.

    PubMed

    Pei, S C; Tseng, C L; Wu, C C

    2001-11-01

    We propose a novel image-recovery method using the covariance matrix of the red-green-blue (R-G-B) color histogram and tensor theories. The image-recovery method is called the color histogram normalization algorithm. It is known that the color histograms of an image taken under varied illuminations are related by a general affine transformation of the R-G-B coordinates when the illumination is changed. We propose a simplified affine model for application with illumination variation. This simplified affine model considers the effects of only three basic forms of distortion: translation, scaling, and rotation. According to this principle, we can estimate the affine transformation matrix necessary to recover images whose color distributions are varied as a result of illumination changes. We compare the normalized color histogram of the standard image with that of the tested image. By performing some operations of simple linear algebra, we can estimate the matrix of the affine transformation between two images under different illuminations. To demonstrate the performance of the proposed algorithm, we divide the experiments into two parts: computer-simulated images and real images corresponding to illumination changes. Simulation results show that the proposed algorithm is effective for both types of images. We also explain the noise-sensitive skew-rotation estimation that exists in the general affine model and demonstrate that the proposed simplified affine model without the use of skew rotation is better than the general affine model for such applications.
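    A further-simplified sketch of this idea, with only per-channel translation and scaling (the rotation component of the authors' simplified affine model is omitted here), matches the color distribution of a source image to a reference:

```python
import math

def channel_stats(pixels):
    """Per-channel mean and standard deviation of an R-G-B pixel list."""
    n = len(pixels)
    means = [sum(p[c] for p in pixels) / n for c in range(3)]
    stds = [math.sqrt(sum((p[c] - means[c]) ** 2 for p in pixels) / n)
            for c in range(3)]
    return means, stds

def match_color_distribution(src_pixels, ref_pixels):
    """Map src pixels so each channel's mean/std matches the reference
    (translation + scaling only; no rotation term)."""
    ms, ss = channel_stats(src_pixels)
    mr, sr = channel_stats(ref_pixels)
    return [tuple((p[c] - ms[c]) * (sr[c] / ss[c] if ss[c] else 1.0) + mr[c]
                  for c in range(3))
            for p in src_pixels]

# A source whose colors are an exact per-channel affine distortion of the
# reference is recovered exactly by this translation + scaling model.
ref = [(10.0, 20.0, 30.0), (50.0, 60.0, 70.0), (90.0, 100.0, 110.0)]
src = [tuple(2.0 * c + 5.0 for c in p) for p in ref]
matched = match_color_distribution(src, ref)
```

    The paper's model additionally estimates a rotation between the histogram covariance ellipsoids, which this per-channel sketch deliberately leaves out.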

  8. One-dimensional error-diffusion technique adapted for binarization of rotationally symmetric pupil filters

    NASA Astrophysics Data System (ADS)

    Kowalczyk, Marek; Martínez-Corral, Manuel; Cichocki, Tomasz; Andrés, Pedro

    1995-02-01

    Two novel algorithms for the binarization of continuous, rotationally symmetric, real and positive pupil filters are presented. Both algorithms are based on the one-dimensional error-diffusion concept. In our numerical experiment an original gray-tone apodizer is replaced by a set of transparent and opaque concentric annular zones. Depending on the algorithm, the resulting binary mask consists of either equal-width or equal-area zones. The diffractive behavior of the binary filters is evaluated. It is shown that the filter with equal-width zones gives a Fraunhofer diffraction pattern more similar to that of the original gray-tone apodizer than the filter with equal-area zones, assuming in both cases the same resolution limit of the device used to print the filters.
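
    The one-dimensional error-diffusion idea underlying both algorithms can be sketched as follows: quantize each zone's transmittance to 0 or 1 and carry the quantization error forward to the next zone (a generic sketch; the paper's equal-width and equal-area zone constructions add their own zone geometry on top of this):

```python
def error_diffuse_1d(profile):
    """Binarize a gray-tone transmittance profile (values in [0, 1]) by
    one-dimensional error diffusion: quantize each zone to 0 or 1 and
    push the quantization error onto the next zone."""
    binary, err = [], 0.0
    for t in profile:
        val = t + err
        b = 1 if val >= 0.5 else 0
        err = val - b
        binary.append(b)
    return binary
```

    Because the error is carried forward, the binary mask preserves the average transmittance of the gray-tone profile.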

  9. GPU accelerated population annealing algorithm

    NASA Astrophysics Data System (ADS)

    Barash, Lev Yu.; Weigel, Martin; Borovský, Michal; Janke, Wolfhard; Shchur, Lev N.

    2017-11-01

    Population annealing is a promising recent approach for Monte Carlo simulations in statistical physics, in particular for the simulation of systems with complex free-energy landscapes. It is a hybrid method, combining importance sampling through Markov chains with elements of sequential Monte Carlo in the form of population control. While it appears to provide algorithmic capabilities for the simulation of such systems that are roughly comparable to those of more established approaches such as parallel tempering, it is intrinsically much more suitable for massively parallel computing. Here, we tap into this structural advantage and present a highly optimized implementation of the population annealing algorithm on GPUs that promises speed-ups of several orders of magnitude as compared to a serial implementation on CPUs. While the sample code is for simulations of the 2D ferromagnetic Ising model, it should be easily adapted for simulations of other spin models, including disordered systems. Our code includes implementations of some advanced algorithmic features that have only recently been suggested, namely the automatic adaptation of temperature steps and a multi-histogram analysis of the data at different temperatures.
    Program Files doi: http://dx.doi.org/10.17632/sgzt4b7b3m.1
    Licensing provisions: Creative Commons Attribution license (CC BY 4.0)
    Programming language: C, CUDA
    External routines/libraries: NVIDIA CUDA Toolkit 6.5 or newer
    Nature of problem: The program calculates the internal energy, specific heat, several magnetization moments, entropy, and free energy of the 2D Ising model on square lattices of edge length L with periodic boundary conditions as a function of inverse temperature β.
    Solution method: The code uses population annealing, a hybrid method combining Markov chain updates with population control. The code is implemented for NVIDIA GPUs using the CUDA language and employs advanced techniques such as multi-spin coding, adaptive temperature steps, and multi-histogram reweighting.
    Additional comments: Code repository at https://github.com/LevBarash/PAising. The system size and the size of the population of replicas are limited depending on the memory of the GPU device used. For the default parameter values used in the sample programs, L = 64, θ = 100, β0 = 0, βf = 1, Δβ = 0.005, R = 20 000, a typical run time on an NVIDIA Tesla K80 GPU is 151 seconds for the single-spin-coded (SSC) and 17 seconds for the multi-spin-coded (MSC) program (see Section 2 for a description of these parameters).
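
    The population-control step at the heart of population annealing can be sketched as follows: when the inverse temperature moves from β to β', each replica i is resampled with mean multiplicity exp(-(β'-β)E_i)/Q, where Q is the population average of those exponential factors (an illustrative CPU sketch in Python, not the paper's CUDA code):

```python
import math
import random

def resample_population(energies, beta_old, beta_new, rng=random):
    """Population-control step of population annealing: return the list of
    surviving replica indices, where replica i is copied a stochastically
    rounded number of times with mean exp(-(beta_new-beta_old)*E_i)/Q."""
    dbeta = beta_new - beta_old
    w = [math.exp(-dbeta * e) for e in energies]
    q = sum(w) / len(w)  # keeps the expected population size constant
    copies = []
    for i, wi in enumerate(w):
        r = wi / q
        base = int(r)
        n = base + (1 if rng.random() < r - base else 0)
        copies.extend([i] * n)
    return copies
```

    Between resampling steps each replica is evolved by ordinary Metropolis updates at the new temperature.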

  10. The impact of slice-reduced computed tomography on histogram-based densitometry assessment of lung fibrosis in patients with systemic sclerosis.

    PubMed

    Nguyen-Kim, Thi Dan Linh; Maurer, Britta; Suliman, Yossra A; Morsbach, Fabian; Distler, Oliver; Frauenfelder, Thomas

    2018-04-01

    To evaluate the usability of slice-reduced sequential computed tomography (CT) compared to standard high-resolution CT (HRCT) in patients with systemic sclerosis (SSc) for qualitative and quantitative assessment of interstitial lung disease (ILD) with respect to (I) detection of lung parenchymal abnormalities, (II) qualitative and semiquantitative visual assessment, (III) quantification of ILD by histograms, and (IV) accuracy for the 20% cut-off discrimination. From standard chest HRCT of 60 SSc patients, sequential 9-slice computed tomography (reduced HRCT) was retrospectively reconstructed. ILD was assessed by visual scoring and quantitative histogram parameters. Results from standard and reduced HRCT were compared using non-parametric tests and analysed by univariate linear regression analyses. With respect to the detection of parenchymal abnormalities, only the detection of intrapulmonary bronchiectasis was significantly lower in reduced HRCT compared to standard HRCT (P=0.039). No differences were found comparing visual scores for fibrosis severity and extension from standard and reduced HRCT (P=0.051-0.073). All scores correlated significantly (P<0.001) with histogram parameters derived from both standard and reduced HRCT. Significantly higher values of kurtosis and skewness were found for reduced HRCT (both P<0.001). In contrast to standard HRCT, histogram parameters from reduced HRCT showed significant discrimination at a cut-off of 20% fibrosis (sensitivity 88% for kurtosis and skewness; specificity 81% for kurtosis and 86% for skewness; cut-off kurtosis ≤26, cut-off skewness ≤4; both P<0.001). Reduced HRCT is a robust method to assess lung fibrosis in SSc at minimal radiation dose, with no difference from standard HRCT in the scoring of lung fibrosis severity and extension. In contrast to standard HRCT, histogram parameters derived from reduced HRCT could discriminate at a threshold of 20% lung fibrosis with high sensitivity and specificity. Hence it might be used to detect early disease progression of lung fibrosis in the context of monitoring and treatment of SSc patients.

  11. Histogram analysis of ADC in rectal cancer: associations with different histopathological findings including expression of EGFR, Hif1-alpha, VEGF, p53, PD1, and KI 67. A preliminary study.

    PubMed

    Meyer, Hans Jonas; Höhn, Annekathrin; Surov, Alexey

    2018-04-06

    Functional imaging modalities like diffusion-weighted imaging are increasingly used to predict tumor behavior, such as cellularity and vascularity, in different tumors. Histogram analysis is an emerging image analysis approach in which every voxel contributes to the histogram, so that statistical information about the tumor can be provided. The purpose of this study was to elucidate possible associations between ADC histogram parameters and several immunohistochemical features in rectal cancer. Overall, 11 patients with histologically proven rectal cancer were included in the study. There were 2 (18.18%) females and 9 males with a mean age of 67.1 years. The KI 67 index and the expression of p53, EGFR, VEGF, and Hif1-alpha were semiautomatically estimated. The tumors were divided into PD1-positive and PD1-negative lesions. ADC histogram analysis was performed as a whole-lesion measurement using an in-house MATLAB application. Spearman's correlation analysis revealed a strong correlation between EGFR expression and ADCmax (ρ=0.72, P=0.02). None of the vascular parameters (VEGF, Hif1-alpha) correlated with ADC parameters. Kurtosis and skewness correlated inversely with p53 expression (ρ=-0.64, P=0.03 and ρ=-0.81, P=0.002, respectively). ADCmedian and ADCmode correlated with Ki67 (ρ=-0.62, P=0.04 and ρ=-0.65, P=0.03, respectively). PD1-positive tumors showed statistically significantly lower ADCmax values than PD1-negative tumors, 1.93 ± 0.36 vs 2.32 ± 0.47 ×10⁻³ mm²/s, p=0.04. Several associations were identified between histogram parameters derived from ADC maps and EGFR, KI 67, and p53 expression in rectal cancer. Furthermore, ADCmax differed between PD1-positive and PD1-negative tumors, indicating a potentially important role of ADC parameters for future treatment prediction.

  12. Histogram analysis of ADC in rectal cancer: associations with different histopathological findings including expression of EGFR, Hif1-alpha, VEGF, p53, PD1, and KI 67. A preliminary study

    PubMed Central

    Meyer, Hans Jonas; Höhn, Annekathrin; Surov, Alexey

    2018-01-01

    Functional imaging modalities like diffusion-weighted imaging are increasingly used to predict tumor behavior, such as cellularity and vascularity, in different tumors. Histogram analysis is an emerging image analysis approach in which every voxel contributes to the histogram, so that statistical information about the tumor can be provided. The purpose of this study was to elucidate possible associations between ADC histogram parameters and several immunohistochemical features in rectal cancer. Overall, 11 patients with histologically proven rectal cancer were included in the study. There were 2 (18.18%) females and 9 males with a mean age of 67.1 years. The KI 67 index and the expression of p53, EGFR, VEGF, and Hif1-alpha were semiautomatically estimated. The tumors were divided into PD1-positive and PD1-negative lesions. ADC histogram analysis was performed as a whole-lesion measurement using an in-house MATLAB application. Spearman's correlation analysis revealed a strong correlation between EGFR expression and ADCmax (ρ=0.72, P=0.02). None of the vascular parameters (VEGF, Hif1-alpha) correlated with ADC parameters. Kurtosis and skewness correlated inversely with p53 expression (ρ=-0.64, P=0.03 and ρ=-0.81, P=0.002, respectively). ADCmedian and ADCmode correlated with Ki67 (ρ=-0.62, P=0.04 and ρ=-0.65, P=0.03, respectively). PD1-positive tumors showed statistically significantly lower ADCmax values than PD1-negative tumors, 1.93 ± 0.36 vs 2.32 ± 0.47 ×10⁻³ mm²/s, p=0.04. Several associations were identified between histogram parameters derived from ADC maps and EGFR, KI 67, and p53 expression in rectal cancer. Furthermore, ADCmax differed between PD1-positive and PD1-negative tumors, indicating a potentially important role of ADC parameters for future treatment prediction. PMID:29719621

  13. SU-C-207A-07: Cumulative 18F-FDG Uptake Histogram Relative to Radiation Dose Volume Histogram of Lung After IMRT Or PSPT and Their Association with Radiation Pneumonitis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shusharina, N; Choi, N; Bortfeld, T

    2016-06-15

    Purpose: To determine whether the difference in cumulative 18F-FDG uptake histogram of the lung treated with either IMRT or PSPT is associated with radiation pneumonitis (RP) in patients with inoperable stage II and III NSCLC. Methods: We analyzed 24 patients from a prospective randomized trial comparing IMRT (n=12) vs. PSPT (n=12) for inoperable NSCLC. All patients underwent PET-CT imaging between 35 and 88 days post-therapy. Post-treatment PET-CT was aligned with the planning 4D CT to establish a voxel-to-voxel correspondence between post-treatment PET and planning dose images. 18F-FDG uptake as a function of radiation dose to normal lung was obtained for each patient. The distribution of the standard uptake value (SUV) was analyzed using a volume histogram method. The image quantitative characteristics and DVH measures were correlated with clinical symptoms of pneumonitis. Results: Patients with RP were present in both groups: 5 in the IMRT group and 6 in the PSPT group. The analysis of cumulative SUV histograms showed significantly higher relative volumes of the normal lung having higher SUV uptake in the PSPT patients for both symptomatic and asymptomatic cases (VSUV=2: 10% for IMRT vs 16% for proton RT; VSUV=1: 10% for IMRT vs 23% for proton RT). In addition, the SUV histograms for symptomatic cases in PSPT patients exhibited a significantly longer tail at the highest SUV. The absolute volume of the lung receiving a dose >70 Gy was larger in the PSPT patients. Conclusion: The 18F-FDG uptake-radiation dose response correlates with RP in both groups of patients by means of the linear regression slope. SUV is higher for the PSPT patients for both symptomatic and asymptomatic cases. The higher uptake in PSPT patients is explained by larger volumes of the lung receiving a high radiation dose.
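
    The V_SUV measures quoted above come from a cumulative SUV-volume histogram: for each SUV threshold, the percentage of lung voxels whose uptake exceeds it. A minimal sketch (function and parameter names are illustrative):

```python
def cumulative_volume_histogram(suvs, thresholds):
    """Cumulative SUV-volume histogram: for each threshold t, return the
    percentage of voxels whose standard uptake value exceeds t."""
    n = len(suvs)
    return {t: 100.0 * sum(1 for s in suvs if s > t) / n for t in thresholds}
```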

  14. LinoSPAD: a time-resolved 256×1 CMOS SPAD line sensor system featuring 64 FPGA-based TDC channels running at up to 8.5 giga-events per second

    NASA Astrophysics Data System (ADS)

    Burri, Samuel; Homulle, Harald; Bruschini, Claudio; Charbon, Edoardo

    2016-04-01

    LinoSPAD is a reconfigurable camera sensor with a 256×1 CMOS SPAD (single-photon avalanche diode) pixel array connected to a low-cost Xilinx Spartan 6 FPGA. The LinoSPAD sensor's line of pixels has a pitch of 24 μm and a 40% fill factor. The FPGA implements an array of 64 TDCs and histogram engines capable of processing up to 8.5 giga-photons per second. The LinoSPAD sensor measures 1.68 mm×6.8 mm, and each pixel has a direct digital output to connect to the FPGA. The chip is bonded on a carrier PCB that connects to the FPGA motherboard. The 64 carry-chain-based TDCs, sampled at 400 MHz, can generate a timestamp every 7.5 ns with a mean time resolution below 25 ps per code. The 64 histogram engines provide time-of-arrival histograms covering up to 50 ns. An alternative mode allows the readout of 28-bit timestamps, which have a range of up to 4.5 ms. Since the FPGA TDCs have considerable non-linearity, we implemented a correction module capable of increasing histogram linearity in real time. The TDC array is interfaced to a computer using a super-speed USB 3 link to transfer over 150k histograms per second for the 12.5 ns reference period used in our characterization. After characterization and subsequent programming of the post-processing, we measure an instrument response histogram shorter than 100 ps FWHM using a strong laser pulse with 50 ps FWHM. This timing resolution, combined with the high fill factor, makes the sensor well suited for a wide variety of applications, from fluorescence lifetime microscopy over Raman spectroscopy to 3D time-of-flight imaging.

  15. Isobio software: biological dose distribution and biological dose volume histogram from physical dose conversion using linear-quadratic-linear model.

    PubMed

    Jaikuna, Tanwiwat; Khadsiri, Phatchareewan; Chawapun, Nisa; Saekho, Suwit; Tharavichitkul, Ekkasit

    2017-02-01

    To develop an in-house software program able to calculate and generate the biological dose distribution and biological dose volume histogram by physical dose conversion using the linear-quadratic-linear (LQL) model. The Isobio software was developed using MATLAB version 2014b to calculate and generate the biological dose distributions and biological dose volume histograms. The physical dose from each voxel in the treatment plan was extracted through the Computational Environment for Radiotherapy Research (CERR), and the accuracy was verified by the difference between the dose volume histogram from CERR and that from the treatment planning system. The equivalent dose in 2 Gy fractions (EQD2) was calculated using the biologically effective dose (BED) based on the LQL model. The software calculation and a manual calculation were compared for EQD2 verification with paired t-test statistical analysis using IBM SPSS Statistics version 22 (64-bit). Two- and three-dimensional biological dose distributions and biological dose volume histograms were displayed correctly by the Isobio software. Differences in physical dose were found between CERR and the treatment planning system (TPS) in Oncentra, with 3.33% in the high-risk clinical target volume (HR-CTV) determined by D90%, 0.56% in the bladder and 1.74% in the rectum determined by D2cc, and less than 1% in Pinnacle. The difference in EQD2 between the software calculation and the manual calculation was not significant, with 0.00% at p-values of 0.820, 0.095, and 0.593 for external beam radiation therapy (EBRT) and 0.240, 0.320, and 0.849 for brachytherapy (BT) in the HR-CTV, bladder, and rectum, respectively. The Isobio software is a feasible tool to generate the biological dose distribution and biological dose volume histogram for treatment plan evaluation in both EBRT and BT.
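
    As a reference point for the dose conversion, the standard LQ-based EQD2 formula can be sketched as follows (the Isobio software extends this with the linear-quadratic-linear model at high doses per fraction, which this sketch does not include):

```python
def eqd2(total_dose, dose_per_fraction, alpha_beta):
    """Equivalent dose in 2 Gy fractions from the basic linear-quadratic
    model: EQD2 = D * (d + a/b) / (2 + a/b), where D is the total dose,
    d the dose per fraction, and a/b the tissue's alpha/beta ratio in Gy."""
    return total_dose * (dose_per_fraction + alpha_beta) / (2.0 + alpha_beta)
```

    A conventionally fractionated 60 Gy in 2 Gy fractions is its own EQD2, while hypofractionated schedules convert to a smaller or larger value depending on the alpha/beta ratio.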

  16. Diagnostic accuracy of ultrasonic histogram features to evaluate radiation toxicity of the parotid glands: a clinical study of xerostomia following head-and-neck cancer radiotherapy.

    PubMed

    Yang, Xiaofeng; Tridandapani, Srini; Beitler, Jonathan J; Yu, David S; Chen, Zhengjia; Kim, Sungjin; Bruner, Deborah W; Curran, Walter J; Liu, Tian

    2014-10-01

    To investigate the diagnostic accuracy of ultrasound histogram features in the quantitative assessment of radiation-induced parotid gland injury and to identify potential imaging biomarkers for radiation-induced xerostomia (dry mouth)-the most common and debilitating side effect after head-and-neck radiotherapy (RT). Thirty-four patients, who have developed xerostomia after RT for head-and-neck cancer, were enrolled. Radiation-induced xerostomia was defined by the Radiation Therapy Oncology Group/European Organization for Research and Treatment of Cancer morbidity scale. Ultrasound scans were performed on each patient's parotids bilaterally. The 34 patients were stratified into the acute-toxicity groups (16 patients, ≤ 3 months after treatment) and the late-toxicity group (18 patients, > 3 months after treatment). A separate control group of 13 healthy volunteers underwent similar ultrasound scans of their parotid glands. Six sonographic features were derived from the echo-intensity histograms to assess acute and late toxicity of the parotid glands. The quantitative assessments were compared to a radiologist's clinical evaluations. The diagnostic accuracy of these ultrasonic histogram features was evaluated with the receiver operating characteristic (ROC) curve. With an area under the ROC curve greater than 0.90, several histogram features demonstrated excellent diagnostic accuracy for evaluation of acute and late toxicity of parotid glands. Significant differences (P < .05) in all six sonographic features were demonstrated between the control, acute-toxicity, and late-toxicity groups. However, subjective radiologic evaluation cannot distinguish between acute and late toxicity of parotid glands. We demonstrated that ultrasound histogram features could be used to measure acute and late toxicity of the parotid glands after head-and-neck cancer RT, which may be developed into a low-cost imaging method for xerostomia monitoring and assessment. Copyright © 2014 AUR. 
Published by Elsevier Inc. All rights reserved.

  17. Subtype differentiation of renal tumors using voxel-based histogram analysis of intravoxel incoherent motion parameters.

    PubMed

    Gaing, Byron; Sigmund, Eric E; Huang, William C; Babb, James S; Parikh, Nainesh S; Stoffel, David; Chandarana, Hersh

    2015-03-01

    The aim of this study was to determine if voxel-based histogram analysis of intravoxel incoherent motion (IVIM) imaging parameters can differentiate various subtypes of renal tumors, including benign and malignant lesions. A total of 44 patients with renal tumors who underwent surgery and had histopathology available were included in this Health Insurance Portability and Accountability Act-compliant, institutional review board-approved, single-institution prospective study. In addition to the routine renal magnetic resonance imaging examination performed on a 1.5-T system, all patients were imaged with axial diffusion-weighted imaging using 8 b values (range, 0-800 s/mm²). A biexponential model was fitted to the diffusion signal data using a segmented algorithm to extract the IVIM parameters perfusion fraction (fp), tissue diffusivity (Dt), and pseudodiffusivity (Dp) for each voxel. Mean and histogram measures of heterogeneity (standard deviation, skewness, and kurtosis) of the IVIM parameters were correlated with pathology results of tumor subtype using unequal-variance t tests to compare subtypes in terms of each measure. Correction for multiple comparisons was accomplished using the Tukey honestly significant difference procedure. A total of 44 renal tumors, including 23 clear cell (ccRCC), 4 papillary (pRCC), 5 chromophobe, and 5 cystic renal cell carcinomas, as well as the benign lesions 4 oncocytomas (Onc) and 3 angiomyolipomas (AMLs), were included in our analysis. The mean IVIM parameters fp and Dt differentiated 8 of 15 pairs of renal tumor subtypes. Histogram analysis of IVIM parameters differentiated 9 of 15 subtype pairs. One subtype pair (ccRCC vs pRCC) was differentiated by mean analysis but not by histogram analysis. However, 2 other subtype pairs (AML vs Onc and ccRCC vs Onc) were differentiated by histogram distribution parameters exclusively. The standard deviation of Dt [σ(Dt)] differentiated ccRCC (0.362 ± 0.136 × 10⁻³ mm²/s) from AML (0.199 ± 0.043 × 10⁻³ mm²/s) (P = 0.002). Kurtosis of fp separated Onc (2.767 ± 1.299) from AML (-0.325 ± 0.279; P = 0.001), ccRCC (0.612 ± 1.139; P = 0.042), and pRCC (0.308 ± 0.730; P = 0.025). Intravoxel incoherent motion imaging parameters with inclusion of histogram measures of heterogeneity can help differentiate malignant from benign lesions as well as various subtypes of renal cancers.
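
    The segmented IVIM fitting approach can be sketched in two steps: a log-linear least-squares fit of the high-b-value points yields Dt and an extrapolated zero-b intercept, from which the perfusion fraction fp follows; the pseudodiffusivity Dp then requires a further nonlinear fit, omitted here (the threshold value and names are illustrative, not the study's exact settings):

```python
import math

def ivim_segmented(bvals, signals, b_thresh=200.0):
    """Segmented IVIM fit (sketch): (1) log-linear fit of the points with
    b >= b_thresh gives the tissue diffusivity Dt and an extrapolated
    zero-b intercept; (2) the perfusion fraction is fp = 1 - intercept/S0."""
    hi = [(b, math.log(s)) for b, s in zip(bvals, signals) if b >= b_thresh]
    n = len(hi)
    sx = sum(b for b, _ in hi)
    sy = sum(y for _, y in hi)
    sxx = sum(b * b for b, _ in hi)
    sxy = sum(b * y for b, y in hi)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    dt = -slope
    s0 = signals[bvals.index(0)]
    fp = 1.0 - math.exp(intercept) / s0
    return fp, dt
```

    On synthetic biexponential data the fit recovers fp and Dt because the fast pseudodiffusion component has decayed away above the b-value threshold.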

  18. A tone mapping operator based on neural and psychophysical models of visual perception

    NASA Astrophysics Data System (ADS)

    Cyriac, Praveen; Bertalmio, Marcelo; Kane, David; Vazquez-Corral, Javier

    2015-03-01

    High dynamic range imaging techniques involve capturing and storing real world radiance values that span many orders of magnitude. However, common display devices can usually reproduce intensity ranges only up to two to three orders of magnitude. Therefore, in order to display a high dynamic range image on a low dynamic range screen, the dynamic range of the image needs to be compressed without losing details or introducing artefacts, and this process is called tone mapping. A good tone mapping operator must be able to produce a low dynamic range image that matches as much as possible the perception of the real world scene. We propose a two stage tone mapping approach, in which the first stage is a global method for range compression based on a gamma curve that equalizes the lightness histogram the best, and the second stage performs local contrast enhancement and color induction using neural activity models for the visual cortex.
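
    The first, global stage can be sketched as a search over candidate gamma values for the curve that leaves the lightness histogram closest to uniform (the uniformity score below is a toy criterion of our own choosing; the paper's actual equalization measure and the second, neural stage are not reproduced):

```python
def best_equalizing_gamma(lightness, gammas=None):
    """Pick the gamma for which the curve L -> L**gamma makes the lightness
    distribution (values in [0, 1]) most uniform, scored by the mean
    absolute deviation of the sorted mapped values from a uniform ramp."""
    if gammas is None:
        gammas = [0.1 * k for k in range(1, 31)]  # candidates 0.1 .. 3.0
    n = len(lightness)
    best, best_score = None, float("inf")
    for g in gammas:
        mapped = sorted(x ** g for x in lightness)
        score = sum(abs(m - i / (n - 1)) for i, m in enumerate(mapped)) / n
        if score < best_score:
            best, best_score = g, score
    return best
```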

  19. Enhancing the pictorial content of digital holograms at 100 frames per second.

    PubMed

    Tsang, P W M; Poon, T-C; Cheung, K W K

    2012-06-18

    We report a low-complexity, non-iterative method for enhancing the sharpness, brightness, and contrast of the pictorial content recorded in a digital hologram, without the need to re-generate the hologram from the original object scene. In our proposed method, the hologram is first back-projected to a 2-D virtual diffraction plane (VDP) located in close proximity to the original object points. Next, the field distribution on the VDP, which shares similar optical properties with the object scene, is enhanced. Subsequently, the processed VDP is expanded into a full hologram. We demonstrate two types of enhancement: a modified histogram equalization to improve the brightness and contrast, and localized high-boost filtering (LHBF) to increase the sharpness. Experimental results demonstrate that our proposed method is capable of enhancing a 2048×2048 hologram at a rate of around 100 frames per second. To the best of our knowledge, this is the first time real-time image enhancement has been considered in the context of digital holography.
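
    The high-boost part of the enhancement can be sketched in one dimension: add back an amplified difference between the signal and a local mean (an illustrative sketch; the paper's LHBF operates on the 2-D VDP field and localizes the boost):

```python
def high_boost_1d(signal, boost=1.5):
    """High-boost filtering (1-D sketch): sharpened = original +
    boost * (original - local mean), with a 3-sample local mean."""
    out = []
    n = len(signal)
    for i in range(n):
        lo, hi = max(0, i - 1), min(n, i + 2)
        local_mean = sum(signal[lo:hi]) / (hi - lo)
        out.append(signal[i] + boost * (signal[i] - local_mean))
    return out
```

    A flat region passes through unchanged, while intensity steps are overshot on both sides, which is perceived as increased sharpness.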

  20. Joint OSNR monitoring and modulation format identification in digital coherent receivers using deep neural networks.

    PubMed

    Khan, Faisal Nadeem; Zhong, Kangping; Zhou, Xian; Al-Arashi, Waled Hussein; Yu, Changyuan; Lu, Chao; Lau, Alan Pak Tao

    2017-07-24

    We experimentally demonstrate the use of deep neural networks (DNNs) in combination with signals' amplitude histograms (AHs) for simultaneous optical signal-to-noise ratio (OSNR) monitoring and modulation format identification (MFI) in digital coherent receivers. The proposed technique automatically extracts OSNR- and modulation-format-dependent features of the AHs, obtained after constant modulus algorithm (CMA) equalization, and exploits them for the joint estimation of these parameters. Experimental results for 112 Gbps polarization-multiplexed (PM) quadrature phase-shift keying (QPSK), 112 Gbps PM 16 quadrature amplitude modulation (16-QAM), and 240 Gbps PM 64-QAM signals demonstrate OSNR monitoring with mean estimation errors of 1.2 dB, 0.4 dB, and 1 dB, respectively. Similarly, the results for MFI show 100% identification accuracy for all three modulation formats. The proposed technique applies deep machine learning algorithms inside a standard digital coherent receiver and does not require any additional hardware. Therefore, it is attractive for cost-effective multi-parameter estimation in next-generation elastic optical networks (EONs).

  1. Optimal nonlinear codes for the perception of natural colours.

    PubMed

    von der Twer, T; MacLeod, D I

    2001-08-01

    We discuss how visual nonlinearity can be optimized for the precise representation of environmental inputs. Such optimization leads to neural signals with a compressively nonlinear input-output function the gradient of which is matched to the cube root of the probability density function (PDF) of the environmental input values (and not to the PDF directly as in histogram equalization). Comparisons between theory and psychophysical and electrophysiological data are roughly consistent with the idea that parvocellular (P) cells are optimized for precision representation of colour: their contrast-response functions span a range appropriately matched to the environmental distribution of natural colours along each dimension of colour space. Thus P cell codes for colour may have been selected to minimize error in the perceptual estimation of stimulus parameters for natural colours. But magnocellular (M) cells have a much stronger than expected saturating nonlinearity; this supports the view that the function of M cells is mainly to detect boundaries rather than to specify contrast or lightness.
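
    The contrast between the two codes can be made concrete for a discretized stimulus PDF: histogram equalization gives a response whose gradient tracks the PDF itself, whereas the error-minimizing code tracks the cube root of the PDF (a small illustrative sketch):

```python
def response_curves(pdf):
    """For a discretized stimulus PDF, return two normalized input-output
    functions: histogram equalization (gradient proportional to the PDF)
    and the precision-optimal code (gradient proportional to the PDF's
    cube root). Both curves run from their first increment up to 1.0."""
    def cumulative(weights):
        total, acc, out = sum(weights), 0.0, []
        for w in weights:
            acc += w
            out.append(acc / total)
        return out

    return cumulative(pdf), cumulative([p ** (1.0 / 3.0) for p in pdf])
```

    For a sharply peaked PDF the cube-root code devotes noticeably fewer response levels to the peak than histogram equalization does, spreading precision onto rarer stimuli.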

  2. An application of viola jones method for face recognition for absence process efficiency

    NASA Astrophysics Data System (ADS)

    Rizki Damanik, Rudolfo; Sitanggang, Delima; Pasaribu, Hendra; Siagian, Hendrik; Gulo, Frisman

    2018-04-01

    An attendance record is the document a company uses to log the arrival time of each employee. The most common problem with a fingerprint machine is a slow sensor or a sensor that fails to recognize a finger. Employees arrive late to work because of difficulties with the fingerprint system: they need about 3-5 minutes to check in when a finger is wet or damaged. To overcome this problem, this research utilized facial recognition for the attendance process, employing the Viola-Jones method. In the processing phase, the RGB face image was converted into a histogram-equalized face image for the subsequent recognition stage. The result was that the attendance process could be completed in less than 1 second, at a maximum tilt of ±70° and a distance of 20-200 cm. After implementing facial recognition, the attendance process is more efficient, taking less than 1 minute.
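
    The histogram-equalization preprocessing step mentioned above can be sketched for an 8-bit grayscale image (a textbook implementation, not the paper's code):

```python
def equalize_histogram(pixels, levels=256):
    """Classic histogram equalization for an 8-bit grayscale image given
    as a flat list of values in 0..levels-1: build the cumulative
    histogram and remap intensities so the output histogram is as flat
    as the discrete levels allow."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    cdf, acc = [], 0
    for h in hist:
        acc += h
        cdf.append(acc)
    cdf_min = next(c for c in cdf if c > 0)
    n = len(pixels)
    if n == cdf_min:  # constant image: nothing to equalize
        return list(pixels)
    lut = [max(0, round((c - cdf_min) / (n - cdf_min) * (levels - 1)))
           for c in cdf]
    return [lut[p] for p in pixels]
```

    In face-detection pipelines this normalizes lighting before the detector runs.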

  3. Discrete Walsh Hadamard transform based visible watermarking technique for digital color images

    NASA Astrophysics Data System (ADS)

    Santhi, V.; Thangavelu, Arunkumar

    2011-10-01

    As the size of the Internet grows enormously, illegal manipulation of digital multimedia data becomes very easy with the advancement of technology tools. To protect such multimedia data from unauthorized access, digital watermarking systems are used. In this paper a new discrete Walsh-Hadamard transform based visible watermarking system is proposed. As the watermark is embedded in the transform domain, the system is robust to many signal processing attacks. Moreover, in the proposed method the watermark is embedded in a tiling manner across the full range of frequencies to make it robust to compression and cropping attacks. The robustness of the algorithm is tested against noise addition, cropping, compression, histogram equalization, and resizing attacks. The experimental results show that the algorithm is robust to common signal processing attacks, and the observed peak signal-to-noise ratio (PSNR) of the watermarked image varies from 20 to 30 dB depending on the size of the watermark.
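
    The transform domain in question can be illustrated with a fast Walsh-Hadamard transform of a length-2^k sequence (a standard textbook implementation; the paper's embedding and tiling strategy are not shown):

```python
def fwht(a):
    """Unnormalized fast Walsh-Hadamard transform of a sequence whose
    length is a power of two. Applying it twice returns len(a) times
    the input, so the inverse is fwht followed by division by len(a)."""
    a = list(a)
    h = 1
    while h < len(a):
        for i in range(0, len(a), h * 2):
            for j in range(i, i + h):
                x, y = a[j], a[j + h]
                a[j], a[j + h] = x + y, x - y
        h *= 2
    return a
```

    A watermark would be added to selected transform coefficients before inverting.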

  4. A CCD-based reader combined with CdS quantum dot-labeled lateral flow strips for ultrasensitive quantitative detection of CagA

    NASA Astrophysics Data System (ADS)

    Gui, Chen; Wang, Kan; Li, Chao; Dai, Xuan; Cui, Daxiang

    2014-02-01

    Immunochromatographic assays are widely used to detect many analytes. CagA has been shown to be closely associated with the initiation of gastric carcinoma. Here, we report the development of a charge-coupled device (CCD)-based test strip reader combined with CdS quantum dot-labeled lateral flow strips for quantitative detection of CagA, which uses a 365-nm ultraviolet LED as the excitation light source and captures test strip images through an acquisition module. The captured image is transferred to a computer and processed by a software system. A revised weighted threshold histogram equalization (WTHE) image processing algorithm was applied to analyze the result. CdS quantum dot-labeled lateral flow strips for the detection of CagA were prepared. One hundred serum samples from clinical patients with gastric cancer and from healthy people were tested, demonstrating that the device can realize rapid, stable, point-of-care detection with a sensitivity of 20 pg/mL.
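
    A much-simplified sketch of the weighted-threshold idea behind WTHE: cap each histogram bin at a fraction of the peak and apply a power-law weight before building the equalization mapping, which tempers over-enhancement from dominant gray levels (the published WTHE algorithm uses a more elaborate weighting function; `upper` and `power` here are illustrative parameters of our own):

```python
def weighted_threshold_equalize(pixels, upper=0.5, power=0.5, levels=256):
    """Histogram equalization with clamped, power-law-weighted bins:
    each bin is capped at upper * max_bin and raised to `power` before
    the cumulative mapping is formed."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    cap = upper * max(hist)
    w = [min(h, cap) ** power for h in hist]
    total = sum(w)
    acc, lut = 0.0, []
    for wi in w:
        acc += wi
        lut.append(round(acc / total * (levels - 1)))
    return [lut[p] for p in pixels]
```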

  5. A semi-blind logo watermarking scheme for color images by comparison and modification of DFT coefficients

    NASA Astrophysics Data System (ADS)

    Kusyk, Janusz; Eskicioglu, Ahmet M.

    2005-10-01

    Digital watermarking is considered to be a major technology for the protection of multimedia data. Some of the important applications are broadcast monitoring, copyright protection, and access control. In this paper, we present a semi-blind watermarking scheme for embedding a logo in color images using the DFT domain. After computing the DFT of the luminance layer of the cover image, the magnitudes of DFT coefficients are compared, and modified. A given watermark is embedded in three frequency bands: Low, middle, and high. Our experiments show that the watermarks extracted from the lower frequencies have the best visual quality for low pass filtering, adding Gaussian noise, JPEG compression, resizing, rotation, and scaling, and the watermarks extracted from the higher frequencies have the best visual quality for cropping, intensity adjustment, histogram equalization, and gamma correction. Extractions from the fragmented and translated image are identical to extractions from the unattacked watermarked image. The collusion and rewatermarking attacks do not provide the hacker with useful tools.

  6. False-Color-Image Map of Quadrangles 3768 and 3668, Imam-Saheb (215), Rustaq (216), Baghlan (221), and Taloqan (222) Quadrangles, Afghanistan

    USGS Publications Warehouse

    Davis, Philip A.; Turner, Kenzie J.

    2007-01-01

    This map is a false-color rendition created from Landsat 7 Enhanced Thematic Mapper Plus imagery collected between 1999 and 2002. The false colors were generated by applying an adaptive histogram equalization stretch to Landsat bands 7 (displayed in red), 4 (displayed in green), and 2 (displayed in blue). These three bands contain most of the spectral differences provided by Landsat imagery and, therefore, provide the most discrimination between surface materials. Landsat bands 4 and 7 are in the near-infrared and short-wave-infrared regions, respectively, where differences in absorption of sunlight by different surface materials are more pronounced than in visible wavelengths. Cultural data were extracted from files downloaded from the Afghanistan Information Management Service (AIMS) Web site (http://www.aims.org.af). The AIMS files were originally derived from maps produced by the Afghanistan Geodesy and Cartography Head Office (AGCHO). Cultural features were not derived from the Landsat base and consequently do not match it precisely. This map is part of a series that includes a geologic map, a topographic map, a Landsat natural-color-image map, and a Landsat false-color-image map for the USGS/AGS (U.S. Geological Survey/Afghan Geological Survey) quadrangles covering Afghanistan. The maps for any given quadrangle have the same open-file report (OFR) number but a different letter suffix, namely, -A, -B, -C, and -D for the geologic, topographic, Landsat natural-color, and Landsat false-color maps, respectively. The OFR numbers range in sequence from 1092 to 1123. The present map series is to be followed by a second series, in which the geology is reinterpreted on the basis of analysis of remote-sensing data, limited fieldwork, and library research. The second series is to be produced by the USGS in cooperation with the AGS and AGCHO.

  7. False-Color-Image Map of Quadrangle 3362, Shin-Dand (415) and Tulak (416) Quadrangles, Afghanistan

    USGS Publications Warehouse

    Davis, Philip A.; Turner, Kenzie J.

    2007-01-01

    This map is a false-color rendition created from Landsat 7 Enhanced Thematic Mapper Plus imagery collected between 1999 and 2002. The false colors were generated by applying an adaptive histogram equalization stretch to Landsat bands 7 (displayed in red), 4 (displayed in green), and 2 (displayed in blue). These three bands contain most of the spectral differences provided by Landsat imagery and, therefore, provide the most discrimination between surface materials. Landsat bands 4 and 7 are in the near-infrared and short-wave-infrared regions, respectively, where differences in absorption of sunlight by different surface materials are more pronounced than in visible wavelengths. Cultural data were extracted from files downloaded from the Afghanistan Information Management Service (AIMS) Web site (http://www.aims.org.af). The AIMS files were originally derived from maps produced by the Afghanistan Geodesy and Cartography Head Office (AGCHO). Cultural features were not derived from the Landsat base and consequently do not match it precisely. This map is part of a series that includes a geologic map, a topographic map, a Landsat natural-color-image map, and a Landsat false-color-image map for the USGS/AGS (U.S. Geological Survey/Afghan Geological Survey) quadrangles covering Afghanistan. The maps for any given quadrangle have the same open-file report (OFR) number but a different letter suffix, namely, -A, -B, -C, and -D for the geologic, topographic, Landsat natural-color, and Landsat false-color maps, respectively. The OFR numbers range in sequence from 1092 to 1123. The present map series is to be followed by a second series, in which the geology is reinterpreted on the basis of analysis of remote-sensing data, limited fieldwork, and library research. The second series is to be produced by the USGS in cooperation with the AGS and AGCHO.

  8. False-Color-Image Map of Quadrangle 3670, Jarm-Keshem (223) and Zebak (224) Quadrangles, Afghanistan

    USGS Publications Warehouse

    Davis, Philip A.; Turner, Kenzie J.

    2007-01-01

    This map is a false-color rendition created from Landsat 7 Enhanced Thematic Mapper Plus imagery collected between 1999 and 2002. The false colors were generated by applying an adaptive histogram equalization stretch to Landsat bands 7 (displayed in red), 4 (displayed in green), and 2 (displayed in blue). These three bands contain most of the spectral differences provided by Landsat imagery and, therefore, provide the most discrimination between surface materials. Landsat bands 4 and 7 are in the near-infrared and short-wave-infrared regions, respectively, where differences in absorption of sunlight by different surface materials are more pronounced than in visible wavelengths. Cultural data were extracted from files downloaded from the Afghanistan Information Management Service (AIMS) Web site (http://www.aims.org.af). The AIMS files were originally derived from maps produced by the Afghanistan Geodesy and Cartography Head Office (AGCHO). Cultural features were not derived from the Landsat base and consequently do not match it precisely. This map is part of a series that includes a geologic map, a topographic map, a Landsat natural-color-image map, and a Landsat false-color-image map for the USGS/AGS (U.S. Geological Survey/Afghan Geological Survey) quadrangles covering Afghanistan. The maps for any given quadrangle have the same open-file report (OFR) number but a different letter suffix, namely, -A, -B, -C, and -D for the geologic, topographic, Landsat natural-color, and Landsat false-color maps, respectively. The OFR numbers range in sequence from 1092 to 1123. The present map series is to be followed by a second series, in which the geology is reinterpreted on the basis of analysis of remote-sensing data, limited fieldwork, and library research. The second series is to be produced by the USGS in cooperation with the AGS and AGCHO.

  9. False-Color-Image Map of Quadrangle 3164, Lashkargah (605) and Kandahar (606) Quadrangles, Afghanistan

    USGS Publications Warehouse

    Davis, Philip A.; Turner, Kenzie J.

    2007-01-01

    This map is a false-color rendition created from Landsat 7 Enhanced Thematic Mapper Plus imagery collected between 1999 and 2002. The false colors were generated by applying an adaptive histogram equalization stretch to Landsat bands 7 (displayed in red), 4 (displayed in green), and 2 (displayed in blue). These three bands contain most of the spectral differences provided by Landsat imagery and, therefore, provide the most discrimination between surface materials. Landsat bands 4 and 7 are in the near-infrared and short-wave-infrared regions, respectively, where differences in absorption of sunlight by different surface materials are more pronounced than in visible wavelengths. Cultural data were extracted from files downloaded from the Afghanistan Information Management Service (AIMS) Web site (http://www.aims.org.af). The AIMS files were originally derived from maps produced by the Afghanistan Geodesy and Cartography Head Office (AGCHO). Cultural features were not derived from the Landsat base and consequently do not match it precisely. This map is part of a series that includes a geologic map, a topographic map, a Landsat natural-color-image map, and a Landsat false-color-image map for the USGS/AGS (U.S. Geological Survey/Afghan Geological Survey) quadrangles covering Afghanistan. The maps for any given quadrangle have the same open-file report (OFR) number but a different letter suffix, namely, -A, -B, -C, and -D for the geologic, topographic, Landsat natural-color, and Landsat false-color maps, respectively. The OFR numbers range in sequence from 1092 to 1123. The present map series is to be followed by a second series, in which the geology is reinterpreted on the basis of analysis of remote-sensing data, limited fieldwork, and library research. The second series is to be produced by the USGS in cooperation with the AGS and AGCHO.

  10. False-Color-Image Map of Quadrangle 3166, Jaldak (701) and Maruf-Nawa (702) Quadrangles, Afghanistan

    USGS Publications Warehouse

    Davis, Philip A.; Turner, Kenzie J.

    2007-01-01

    This map is a false-color rendition created from Landsat 7 Enhanced Thematic Mapper Plus imagery collected between 1999 and 2002. The false colors were generated by applying an adaptive histogram equalization stretch to Landsat bands 7 (displayed in red), 4 (displayed in green), and 2 (displayed in blue). These three bands contain most of the spectral differences provided by Landsat imagery and, therefore, provide the most discrimination between surface materials. Landsat bands 4 and 7 are in the near-infrared and short-wave-infrared regions, respectively, where differences in absorption of sunlight by different surface materials are more pronounced than in visible wavelengths. Cultural data were extracted from files downloaded from the Afghanistan Information Management Service (AIMS) Web site (http://www.aims.org.af). The AIMS files were originally derived from maps produced by the Afghanistan Geodesy and Cartography Head Office (AGCHO). Cultural features were not derived from the Landsat base and consequently do not match it precisely. This map is part of a series that includes a geologic map, a topographic map, a Landsat natural-color-image map, and a Landsat false-color-image map for the USGS/AGS (U.S. Geological Survey/Afghan Geological Survey) quadrangles covering Afghanistan. The maps for any given quadrangle have the same open-file report (OFR) number but a different letter suffix, namely, -A, -B, -C, and -D for the geologic, topographic, Landsat natural-color, and Landsat false-color maps, respectively. The OFR numbers range in sequence from 1092 to 1123. The present map series is to be followed by a second series, in which the geology is reinterpreted on the basis of analysis of remote-sensing data, limited fieldwork, and library research. The second series is to be produced by the USGS in cooperation with the AGS and AGCHO.

  11. False-Color-Image Map of Quadrangle 3366, Gizab (513) and Nawer (514) Quadrangles, Afghanistan

    USGS Publications Warehouse

    Davis, Philip A.; Turner, Kenzie J.

    2007-01-01

    This map is a false-color rendition created from Landsat 7 Enhanced Thematic Mapper Plus imagery collected between 1999 and 2002. The false colors were generated by applying an adaptive histogram equalization stretch to Landsat bands 7 (displayed in red), 4 (displayed in green), and 2 (displayed in blue). These three bands contain most of the spectral differences provided by Landsat imagery and, therefore, provide the most discrimination between surface materials. Landsat bands 4 and 7 are in the near-infrared and short-wave-infrared regions, respectively, where differences in absorption of sunlight by different surface materials are more pronounced than in visible wavelengths. Cultural data were extracted from files downloaded from the Afghanistan Information Management Service (AIMS) Web site (http://www.aims.org.af). The AIMS files were originally derived from maps produced by the Afghanistan Geodesy and Cartography Head Office (AGCHO). Cultural features were not derived from the Landsat base and consequently do not match it precisely. This map is part of a series that includes a geologic map, a topographic map, a Landsat natural-color-image map, and a Landsat false-color-image map for the USGS/AGS (U.S. Geological Survey/Afghan Geological Survey) quadrangles covering Afghanistan. The maps for any given quadrangle have the same open-file report (OFR) number but a different letter suffix, namely, -A, -B, -C, and -D for the geologic, topographic, Landsat natural-color, and Landsat false-color maps, respectively. The OFR numbers range in sequence from 1092 to 1123. The present map series is to be followed by a second series, in which the geology is reinterpreted on the basis of analysis of remote-sensing data, limited fieldwork, and library research. The second series is to be produced by the USGS in cooperation with the AGS and AGCHO.

  12. False-Color-Image Map of Quadrangle 3564, Chahriaq (Joand) (405) and Gurziwan (406) Quadrangles, Afghanistan

    USGS Publications Warehouse

    Davis, Philip A.; Turner, Kenzie J.

    2007-01-01

    This map is a false-color rendition created from Landsat 7 Enhanced Thematic Mapper Plus imagery collected between 1999 and 2002. The false colors were generated by applying an adaptive histogram equalization stretch to Landsat bands 7 (displayed in red), 4 (displayed in green), and 2 (displayed in blue). These three bands contain most of the spectral differences provided by Landsat imagery and, therefore, provide the most discrimination between surface materials. Landsat bands 4 and 7 are in the near-infrared and short-wave-infrared regions, respectively, where differences in absorption of sunlight by different surface materials are more pronounced than in visible wavelengths. Cultural data were extracted from files downloaded from the Afghanistan Information Management Service (AIMS) Web site (http://www.aims.org.af). The AIMS files were originally derived from maps produced by the Afghanistan Geodesy and Cartography Head Office (AGCHO). Cultural features were not derived from the Landsat base and consequently do not match it precisely. This map is part of a series that includes a geologic map, a topographic map, a Landsat natural-color-image map, and a Landsat false-color-image map for the USGS/AGS (U.S. Geological Survey/Afghan Geological Survey) quadrangles covering Afghanistan. The maps for any given quadrangle have the same open-file report (OFR) number but a different letter suffix, namely, -A, -B, -C, and -D for the geologic, topographic, Landsat natural-color, and Landsat false-color maps, respectively. The OFR numbers range in sequence from 1092 to 1123. The present map series is to be followed by a second series, in which the geology is reinterpreted on the basis of analysis of remote-sensing data, limited fieldwork, and library research. The second series is to be produced by the USGS in cooperation with the AGS and AGCHO.

  13. False-Color-Image Map of Quadrangle 3264, Nawzad-Musa-Qala (423) and Dehrawat (424) Quadrangles, Afghanistan

    USGS Publications Warehouse

    Davis, Philip A.; Turner, Kenzie J.

    2007-01-01

    This map is a false-color rendition created from Landsat 7 Enhanced Thematic Mapper Plus imagery collected between 1999 and 2002. The false colors were generated by applying an adaptive histogram equalization stretch to Landsat bands 7 (displayed in red), 4 (displayed in green), and 2 (displayed in blue). These three bands contain most of the spectral differences provided by Landsat imagery and, therefore, provide the most discrimination between surface materials. Landsat bands 4 and 7 are in the near-infrared and short-wave-infrared regions, respectively, where differences in absorption of sunlight by different surface materials are more pronounced than in visible wavelengths. Cultural data were extracted from files downloaded from the Afghanistan Information Management Service (AIMS) Web site (http://www.aims.org.af). The AIMS files were originally derived from maps produced by the Afghanistan Geodesy and Cartography Head Office (AGCHO). Cultural features were not derived from the Landsat base and consequently do not match it precisely. This map is part of a series that includes a geologic map, a topographic map, a Landsat natural-color-image map, and a Landsat false-color-image map for the USGS/AGS (U.S. Geological Survey/Afghan Geological Survey) quadrangles covering Afghanistan. The maps for any given quadrangle have the same open-file report (OFR) number but a different letter suffix, namely, -A, -B, -C, and -D for the geologic, topographic, Landsat natural-color, and Landsat false-color maps, respectively. The OFR numbers range in sequence from 1092 to 1123. The present map series is to be followed by a second series, in which the geology is reinterpreted on the basis of analysis of remote-sensing data, limited fieldwork, and library research. The second series is to be produced by the USGS in cooperation with the AGS and AGCHO.

  14. False-Color-Image Map of Quadrangle 3468, Chak Wardak-Syahgerd (509) and Kabul (510) Quadrangles, Afghanistan

    USGS Publications Warehouse

    Davis, Philip A.; Turner, Kenzie J.

    2007-01-01

    This map is a false-color rendition created from Landsat 7 Enhanced Thematic Mapper Plus imagery collected between 1999 and 2002. The false colors were generated by applying an adaptive histogram equalization stretch to Landsat bands 7 (displayed in red), 4 (displayed in green), and 2 (displayed in blue). These three bands contain most of the spectral differences provided by Landsat imagery and, therefore, provide the most discrimination between surface materials. Landsat bands 4 and 7 are in the near-infrared and short-wave-infrared regions, respectively, where differences in absorption of sunlight by different surface materials are more pronounced than in visible wavelengths. Cultural data were extracted from files downloaded from the Afghanistan Information Management Service (AIMS) Web site (http://www.aims.org.af). The AIMS files were originally derived from maps produced by the Afghanistan Geodesy and Cartography Head Office (AGCHO). Cultural features were not derived from the Landsat base and consequently do not match it precisely. This map is part of a series that includes a geologic map, a topographic map, a Landsat natural-color-image map, and a Landsat false-color-image map for the USGS/AGS (U.S. Geological Survey/Afghan Geological Survey) quadrangles covering Afghanistan. The maps for any given quadrangle have the same open-file report (OFR) number but a different letter suffix, namely, -A, -B, -C, and -D for the geologic, topographic, Landsat natural-color, and Landsat false-color maps, respectively. The OFR numbers range in sequence from 1092 to 1123. The present map series is to be followed by a second series, in which the geology is reinterpreted on the basis of analysis of remote-sensing data, limited fieldwork, and library research. The second series is to be produced by the USGS in cooperation with the AGS and AGCHO.

  15. False-Color-Image Map of Quadrangles 3772, 3774, 3672, and 3674, Gaz-Khan (313), Sarhad (314), Kol-I-Chaqmaqtin (315), Khandud (319), Deh-Ghulaman (320), and Ertfah (321) Quadrangles, Afghanistan

    USGS Publications Warehouse

    Davis, Philip A.; Turner, Kenzie J.

    2007-01-01

    This map is a false-color rendition created from Landsat 7 Enhanced Thematic Mapper Plus imagery collected between 1999 and 2002. The false colors were generated by applying an adaptive histogram equalization stretch to Landsat bands 7 (displayed in red), 4 (displayed in green), and 2 (displayed in blue). These three bands contain most of the spectral differences provided by Landsat imagery and, therefore, provide the most discrimination between surface materials. Landsat bands 4 and 7 are in the near-infrared and short-wave-infrared regions, respectively, where differences in absorption of sunlight by different surface materials are more pronounced than in visible wavelengths. Cultural data were extracted from files downloaded from the Afghanistan Information Management Service (AIMS) Web site (http://www.aims.org.af). The AIMS files were originally derived from maps produced by the Afghanistan Geodesy and Cartography Head Office (AGCHO). Cultural features were not derived from the Landsat base and consequently do not match it precisely. This map is part of a series that includes a geologic map, a topographic map, a Landsat natural-color-image map, and a Landsat false-color-image map for the USGS/AGS (U.S. Geological Survey/Afghan Geological Survey) quadrangles covering Afghanistan. The maps for any given quadrangle have the same open-file report (OFR) number but a different letter suffix, namely, -A, -B, -C, and -D for the geologic, topographic, Landsat natural-color, and Landsat false-color maps, respectively. The OFR numbers range in sequence from 1092 to 1123. The present map series is to be followed by a second series, in which the geology is reinterpreted on the basis of analysis of remote-sensing data, limited fieldwork, and library research. The second series is to be produced by the USGS in cooperation with the AGS and AGCHO.

  16. False-Color-Image Map of Quadrangle 3470 and the Northern Edge of Quadrangle 3370, Jalal-Abad (511), Chaghasaray (512), and Northernmost Jaji-Maydan (517) Quadrangles, Afghanistan

    USGS Publications Warehouse

    Davis, Philip A.; Turner, Kenzie J.

    2007-01-01

    This map is a false-color rendition created from Landsat 7 Enhanced Thematic Mapper Plus imagery collected between 1999 and 2002. The false colors were generated by applying an adaptive histogram equalization stretch to Landsat bands 7 (displayed in red), 4 (displayed in green), and 2 (displayed in blue). These three bands contain most of the spectral differences provided by Landsat imagery and, therefore, provide the most discrimination between surface materials. Landsat bands 4 and 7 are in the near-infrared and short-wave-infrared regions, respectively, where differences in absorption of sunlight by different surface materials are more pronounced than in visible wavelengths. Cultural data were extracted from files downloaded from the Afghanistan Information Management Service (AIMS) Web site (http://www.aims.org.af). The AIMS files were originally derived from maps produced by the Afghanistan Geodesy and Cartography Head Office (AGCHO). Cultural features were not derived from the Landsat base and consequently do not match it precisely. This map is part of a series that includes a geologic map, a topographic map, a Landsat natural-color-image map, and a Landsat false-color-image map for the USGS/AGS (U.S. Geological Survey/Afghan Geological Survey) quadrangles covering Afghanistan. The maps for any given quadrangle have the same open-file report (OFR) number but a different letter suffix, namely, -A, -B, -C, and -D for the geologic, topographic, Landsat natural-color, and Landsat false-color maps, respectively. The OFR numbers range in sequence from 1092 to 1123. The present map series is to be followed by a second series, in which the geology is reinterpreted on the basis of analysis of remote-sensing data, limited fieldwork, and library research. The second series is to be produced by the USGS in cooperation with the AGS and AGCHO.

  17. False-Color-Image Map of Quadrangles 3666 and 3766, Balkh (219), Mazar-I-Sharif (220), Qarqin (213), and Hazara Toghai (214) Quadrangles, Afghanistan

    USGS Publications Warehouse

    Davis, Philip A.; Turner, Kenzie J.

    2007-01-01

    This map is a false-color rendition created from Landsat 7 Enhanced Thematic Mapper Plus imagery collected between 1999 and 2002. The false colors were generated by applying an adaptive histogram equalization stretch to Landsat bands 7 (displayed in red), 4 (displayed in green), and 2 (displayed in blue). These three bands contain most of the spectral differences provided by Landsat imagery and, therefore, provide the most discrimination between surface materials. Landsat bands 4 and 7 are in the near-infrared and short-wave-infrared regions, respectively, where differences in absorption of sunlight by different surface materials are more pronounced than in visible wavelengths. Cultural data were extracted from files downloaded from the Afghanistan Information Management Service (AIMS) Web site (http://www.aims.org.af). The AIMS files were originally derived from maps produced by the Afghanistan Geodesy and Cartography Head Office (AGCHO). Cultural features were not derived from the Landsat base and consequently do not match it precisely. This map is part of a series that includes a geologic map, a topographic map, a Landsat natural-color-image map, and a Landsat false-color-image map for the USGS/AGS (U.S. Geological Survey/Afghan Geological Survey) quadrangles covering Afghanistan. The maps for any given quadrangle have the same open-file report (OFR) number but a different letter suffix, namely, -A, -B, -C, and -D for the geologic, topographic, Landsat natural-color, and Landsat false-color maps, respectively. The OFR numbers range in sequence from 1092 to 1123. The present map series is to be followed by a second series, in which the geology is reinterpreted on the basis of analysis of remote-sensing data, limited fieldwork, and library research. The second series is to be produced by the USGS in cooperation with the AGS and AGCHO.

  18. False-Color-Image Map of Quadrangle 3364, Pasa-Band (417) and Kejran (418) Quadrangles, Afghanistan

    USGS Publications Warehouse

    Davis, Philip A.; Turner, Kenzie J.

    2007-01-01

    This map is a false-color rendition created from Landsat 7 Enhanced Thematic Mapper Plus imagery collected between 1999 and 2002. The false colors were generated by applying an adaptive histogram equalization stretch to Landsat bands 7 (displayed in red), 4 (displayed in green), and 2 (displayed in blue). These three bands contain most of the spectral differences provided by Landsat imagery and, therefore, provide the most discrimination between surface materials. Landsat bands 4 and 7 are in the near-infrared and short-wave-infrared regions, respectively, where differences in absorption of sunlight by different surface materials are more pronounced than in visible wavelengths. Cultural data were extracted from files downloaded from the Afghanistan Information Management Service (AIMS) Web site (http://www.aims.org.af). The AIMS files were originally derived from maps produced by the Afghanistan Geodesy and Cartography Head Office (AGCHO). Cultural features were not derived from the Landsat base and consequently do not match it precisely. This map is part of a series that includes a geologic map, a topographic map, a Landsat natural-color-image map, and a Landsat false-color-image map for the USGS/AGS (U.S. Geological Survey/Afghan Geological Survey) quadrangles covering Afghanistan. The maps for any given quadrangle have the same open-file report (OFR) number but a different letter suffix, namely, -A, -B, -C, and -D for the geologic, topographic, Landsat natural-color, and Landsat false-color maps, respectively. The OFR numbers range in sequence from 1092 to 1123. The present map series is to be followed by a second series, in which the geology is reinterpreted on the basis of analysis of remote-sensing data, limited fieldwork, and library research. The second series is to be produced by the USGS in cooperation with the AGS and AGCHO.

  19. False-Color-Image Map of Quadrangle 3368 and Part of Quadrangle 3370, Ghazni (515), Gardez (516), and Part of Jaji-Maydan (517) Quadrangles, Afghanistan

    USGS Publications Warehouse

    Davis, Philip A.; Turner, Kenzie J.

    2007-01-01

    This map is a false-color rendition created from Landsat 7 Enhanced Thematic Mapper Plus imagery collected between 1999 and 2002. The false colors were generated by applying an adaptive histogram equalization stretch to Landsat bands 7 (displayed in red), 4 (displayed in green), and 2 (displayed in blue). These three bands contain most of the spectral differences provided by Landsat imagery and, therefore, provide the most discrimination between surface materials. Landsat bands 4 and 7 are in the near-infrared and short-wave-infrared regions, respectively, where differences in absorption of sunlight by different surface materials are more pronounced than in visible wavelengths. Cultural data were extracted from files downloaded from the Afghanistan Information Management Service (AIMS) Web site (http://www.aims.org.af). The AIMS files were originally derived from maps produced by the Afghanistan Geodesy and Cartography Head Office (AGCHO). Cultural features were not derived from the Landsat base and consequently do not match it precisely. This map is part of a series that includes a geologic map, a topographic map, a Landsat natural-color-image map, and a Landsat false-color-image map for the USGS/AGS (U.S. Geological Survey/Afghan Geological Survey) quadrangles covering Afghanistan. The maps for any given quadrangle have the same open-file report (OFR) number but a different letter suffix, namely, -A, -B, -C, and -D for the geologic, topographic, Landsat natural-color, and Landsat false-color maps, respectively. The OFR numbers range in sequence from 1092 to 1123. The present map series is to be followed by a second series, in which the geology is reinterpreted on the basis of analysis of remote-sensing data, limited fieldwork, and library research. The second series is to be produced by the USGS in cooperation with the AGS and AGCHO.

  20. False-Color-Image Map of Quadrangles 3062 and 2962, Charburjak (609), Khanneshin (610), Gawdezereh (615), and Galachah (616) Quadrangles, Afghanistan

    USGS Publications Warehouse

    Davis, Philip A.; Turner, Kenzie J.

    2007-01-01

    This map is a false-color rendition created from Landsat 7 Enhanced Thematic Mapper Plus imagery collected between 1999 and 2002. The false colors were generated by applying an adaptive histogram equalization stretch to Landsat bands 7 (displayed in red), 4 (displayed in green), and 2 (displayed in blue). These three bands contain most of the spectral differences provided by Landsat imagery and, therefore, provide the most discrimination between surface materials. Landsat bands 4 and 7 are in the near-infrared and short-wave-infrared regions, respectively, where differences in absorption of sunlight by different surface materials are more pronounced than in visible wavelengths. Cultural data were extracted from files downloaded from the Afghanistan Information Management Service (AIMS) Web site (http://www.aims.org.af). The AIMS files were originally derived from maps produced by the Afghanistan Geodesy and Cartography Head Office (AGCHO). Cultural features were not derived from the Landsat base and consequently do not match it precisely. This map is part of a series that includes a geologic map, a topographic map, a Landsat natural-color-image map, and a Landsat false-color-image map for the USGS/AGS (U.S. Geological Survey/Afghan Geological Survey) quadrangles covering Afghanistan. The maps for any given quadrangle have the same open-file report (OFR) number but a different letter suffix, namely, -A, -B, -C, and -D for the geologic, topographic, Landsat natural-color, and Landsat false-color maps, respectively. The OFR numbers range in sequence from 1092 to 1123. The present map series is to be followed by a second series, in which the geology is reinterpreted on the basis of analysis of remote-sensing data, limited fieldwork, and library research. The second series is to be produced by the USGS in cooperation with the AGS and AGCHO.
