Local dynamic range compensation for scanning electron microscope imaging system.
Sim, K S; Huang, Y H
2015-01-01
This paper presents an extension of earlier work that introduces the modified dynamic range histogram modification (MDRHM) technique, used to enhance the scanning electron microscope (SEM) imaging system. In contrast to conventional histogram modification compensators, this technique performs histogram profiling by extending the dynamic range of each tile of an image to the full 0-255 range while retaining its histogram shape. The proposed technique yields better image compensation than conventional methods. © Wiley Periodicals, Inc.
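A minimal sketch of the tile-wise dynamic range expansion described above, assuming 8-bit greyscale frames and a simple linear per-tile mapping; the tile size and the exact MDRHM profiling rules are assumptions rather than details from the paper.

import numpy as np

def stretch_tile(tile):
    """Linearly map a tile's intensity range onto 0-255.

    A linear stretch rescales the support of the tile histogram without
    reordering grey levels, so the histogram shape is retained.
    """
    lo, hi = tile.min(), tile.max()
    if hi == lo:                                   # flat tile: nothing to stretch
        return tile.astype(np.uint8)
    return ((tile - lo) * 255.0 / (hi - lo)).astype(np.uint8)

def tilewise_range_expansion(img, tile=64):
    """Apply the per-tile stretch over a whole 8-bit image (tile size assumed)."""
    out = np.empty_like(img, dtype=np.uint8)
    h, w = img.shape
    for y in range(0, h, tile):
        for x in range(0, w, tile):
            out[y:y+tile, x:x+tile] = stretch_tile(img[y:y+tile, x:x+tile])
    return out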
Digital enhancement of computerized axial tomograms
NASA Technical Reports Server (NTRS)
Roberts, E., Jr.
1978-01-01
A systematic evaluation was conducted of certain digital image enhancement techniques performed in image space. Three types of images were used, computer generated phantoms, tomograms of a synthetic phantom, and axial tomograms of human anatomy containing images of lesions, artificially introduced into the tomograms. Several types of smoothing, sharpening, and histogram modification were explored. It was concluded that the most useful enhancement techniques are a selective smoothing of singular picture elements, combined with contrast manipulation. The most useful tool in applying these techniques is the gray-scale histogram.
Naturalness preservation image contrast enhancement via histogram modification
NASA Astrophysics Data System (ADS)
Tian, Qi-Chong; Cohen, Laurent D.
2018-04-01
Contrast enhancement is a technique for enhancing image contrast to obtain better visual quality. Since many existing contrast enhancement algorithms usually produce over-enhanced results, naturalness preservation needs to be considered in the framework of image contrast enhancement. This paper proposes a naturalness-preserving contrast enhancement method, which adopts histogram matching to improve the contrast and uses image quality assessment to automatically select the optimal target histogram. Contrast improvement and naturalness preservation are both considered in the target histogram, so this method can avoid the over-enhancement problem. In the proposed method, the optimal target histogram is a weighted sum of the original histogram, the uniform histogram, and the Gaussian-shaped histogram. Then the structural metric and the statistical naturalness metric are used to determine the weights of the corresponding histograms. Finally, the contrast-enhanced image is obtained by matching the optimal target histogram. The experiments demonstrate that the proposed method outperforms the compared histogram-based contrast enhancement algorithms.
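The core step, a target histogram blended from the original, uniform and Gaussian-shaped histograms followed by histogram matching, can be sketched as below; the fixed weights and Gaussian parameters are placeholders, whereas the paper selects them automatically via quality metrics.

import numpy as np

def match_to_target(img, w_orig=0.4, w_unif=0.3, w_gauss=0.3, mu=128.0, sigma=48.0):
    """Histogram-match an 8-bit image to a blended target histogram (weights assumed)."""
    h_orig, _ = np.histogram(img, bins=256, range=(0, 256), density=True)
    h_unif = np.full(256, 1.0 / 256)
    levels = np.arange(256)
    h_gauss = np.exp(-0.5 * ((levels - mu) / sigma) ** 2)
    h_gauss /= h_gauss.sum()

    target = w_orig * h_orig + w_unif * h_unif + w_gauss * h_gauss
    target /= target.sum()

    # Classic histogram matching: send the source CDF onto the target CDF.
    cdf_src = np.cumsum(h_orig)
    cdf_tgt = np.cumsum(target)
    mapping = np.searchsorted(cdf_tgt, cdf_src).clip(0, 255).astype(np.uint8)
    return mapping[img]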
A Comparison of the Multiscale Retinex With Other Image Enhancement Techniques
NASA Technical Reports Server (NTRS)
Rahman, Zia-Ur; Woodell, Glenn A.; Jobson, Daniel J.
1997-01-01
The multiscale retinex with color restoration (MSRCR) has shown itself to be a very versatile automatic image enhancement algorithm that simultaneously provides dynamic range compression, color constancy, and color rendition. A number of algorithms exist that provide one or more of these features, but not all. In this paper we compare the performance of the MSRCR with techniques that are widely used for image enhancement. Specifically, we compare the MSRCR with color adjustment methods such as gamma correction and gain/offset application, histogram modification techniques such as histogram equalization and manual histogram adjustment, and other more powerful techniques such as homomorphic filtering and 'burning and dodging'. The comparison is carried out by testing the suite of image enhancement methods on a set of diverse images. We find that though some of these techniques work well for some of these images, only the MSRCR performs universally well on the test set.
Information-Adaptive Image Encoding and Restoration
NASA Technical Reports Server (NTRS)
Park, Stephen K.; Rahman, Zia-ur
1998-01-01
The multiscale retinex with color restoration (MSRCR) has shown itself to be a very versatile automatic image enhancement algorithm that simultaneously provides dynamic range compression, color constancy, and color rendition. A number of algorithms exist that provide one or more of these features, but not all. In this paper we compare the performance of the MSRCR with techniques that are widely used for image enhancement. Specifically, we compare the MSRCR with color adjustment methods such as gamma correction and gain/offset application, histogram modification techniques such as histogram equalization and manual histogram adjustment, and other more powerful techniques such as homomorphic filtering and 'burning and dodging'. The comparison is carried out by testing the suite of image enhancement methods on a set of diverse images. We find that though some of these techniques work well for some of these images, only the MSRCR performs universally well on the test set.
Image contrast enhancement using adjacent-blocks-based modification for local histogram equalization
NASA Astrophysics Data System (ADS)
Wang, Yang; Pan, Zhibin
2017-11-01
Infrared images usually have non-ideal characteristics such as weak target-to-background contrast and strong noise. Because of these characteristics, contrast enhancement is necessary to improve the visual quality of infrared images. The histogram equalization (HE) algorithm is a widely used contrast enhancement algorithm due to its effectiveness and simple implementation, but a drawback of HE is that the local contrast of an image cannot be equally enhanced. Local histogram equalization algorithms have proved to be effective techniques for local image contrast enhancement. However, over-enhancement of noise and artifacts can easily appear in images enhanced by local histogram equalization. In this paper, a new contrast enhancement technique based on local histogram equalization is proposed to overcome the drawbacks mentioned above. The input images are segmented into three kinds of overlapping sub-blocks using their gradients. To overcome the over-enhancement effect, the histograms of these sub-blocks are then modified by the adjacent sub-blocks. We pay particular attention to improving the contrast of detail information while preserving the brightness of the flat regions in these sub-blocks. It is shown that the proposed algorithm outperforms other related algorithms by enhancing the local contrast without introducing over-enhancement effects or additional noise.
Adaptive image contrast enhancement using generalizations of histogram equalization.
Stark, J A
2000-01-01
This paper proposes a scheme for adaptive image-contrast enhancement based on a generalization of histogram equalization (HE). HE is a useful technique for improving image contrast, but its effect is too severe for many purposes. However, dramatically different results can be obtained with relatively minor modifications. A concise description of adaptive HE is set out, and this framework is used in a discussion of past suggestions for variations on HE. A key feature of this formalism is a "cumulation function," which is used to generate a grey level mapping from the local histogram. By choosing alternative forms of cumulation function one can achieve a wide variety of effects. A specific form is proposed. Through the variation of one or two parameters, the resulting process can produce a range of degrees of contrast enhancement, at one extreme leaving the image unchanged, at another yielding full adaptive equalization.
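A hedged sketch of the cumulation-function idea, using a power of the histogram as one simple choice of cumulation function (not Stark's exact family); alpha = 1 gives ordinary HE and alpha near 0 approaches the identity mapping.

import numpy as np

def generalized_he(img, alpha=0.5):
    """Grey-level mapping generated by a 'cumulation function' of the histogram.

    alpha = 1 reproduces ordinary histogram equalization; alpha close to 0
    leaves the image almost unchanged.  The power-law form is one simple
    choice of cumulation function, not Stark's exact family.
    """
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    weighted = hist.astype(float) ** alpha          # cumulation kernel
    cdf = np.cumsum(weighted)
    cdf /= cdf[-1]
    lut = np.round(255 * cdf).astype(np.uint8)
    return lut[img]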
A novel pre-processing technique for improving image quality in digital breast tomosynthesis.
Kim, Hyeongseok; Lee, Taewon; Hong, Joonpyo; Sabir, Sohail; Lee, Jung-Ryun; Choi, Young Wook; Kim, Hak Hee; Chae, Eun Young; Cho, Seungryong
2017-02-01
Nonlinear pre-reconstruction processing of the projection data in computed tomography (CT), where accurate recovery of the CT numbers is important for diagnosis, is usually discouraged because such processing would violate the physics of image formation in CT. However, one can devise a pre-processing step to enhance the detectability of lesions in digital breast tomosynthesis (DBT), where accurate recovery of the CT numbers is fundamentally impossible due to the incompleteness of the scanned data. Since the detection of lesions such as micro-calcifications and masses in the breast is the purpose of DBT, a technique that produces higher lesion detectability is justified. A histogram modification technique was developed in the projection data domain. The histogram of the raw projection data was first divided into two parts: one for the breast projection data and the other for the background. Background pixel values were set to a single value that represents the boundary between breast and background. After that, both histogram parts were shifted by an appropriate offset and the histogram-modified projection data were log-transformed. A filtered-backprojection (FBP) algorithm was used for DBT image reconstruction. To evaluate the performance of the proposed method, we computed the detectability index for images reconstructed from clinically acquired data. Typical breast border enhancement artifacts were greatly suppressed and the detectability of calcifications and masses was increased by use of the proposed method. Compared to a global threshold-based post-reconstruction processing technique, the proposed method produced images of higher contrast without invoking additional image artifacts. In this work, we report a novel pre-processing technique that improves the detectability of lesions in DBT and has potential advantages over the global threshold-based post-reconstruction processing technique. The proposed method not only increased lesion detectability but also reduced typical image artifacts pronounced in conventional FBP-based DBT. © 2016 American Association of Physicists in Medicine.
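As a hedged illustration of the projection-domain histogram modification described above, the sketch below splits the raw-count histogram at an assumed breast/background boundary, collapses the background onto that boundary value, applies an offset shift, and log-transforms the result; the boundary value, the offset, and the assumption that background counts are higher than breast counts are illustrative choices, not the authors' exact parameters.

import numpy as np

def preprocess_projection(proj, breast_thresh, offset=100.0):
    """Hedged sketch of the histogram-modification pre-processing step.

    proj          : raw projection counts (background air assumed brighter
                    than breast tissue).
    breast_thresh : assumed boundary value separating the breast part of
                    the histogram from the background part.
    offset        : assumed shift applied before the log transform.
    """
    p = proj.astype(float).copy()
    # Collapse the background part of the histogram onto the boundary value.
    p[p > breast_thresh] = breast_thresh
    # Shift the modified histogram by an offset.
    p += offset
    # Standard log transform prior to filtered backprojection.
    return -np.log(p / p.max())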
Combining Vector Quantization and Histogram Equalization.
ERIC Educational Resources Information Center
Cosman, Pamela C.; And Others
1992-01-01
Discussion of contrast enhancement techniques focuses on the use of histogram equalization with a data compression technique, i.e., tree-structured vector quantization. The enhancement technique of intensity windowing is described, and the use of enhancement techniques for medical images is explained, including adaptive histogram equalization.…
NASA Technical Reports Server (NTRS)
Dasarathy, B. V.
1976-01-01
An algorithm is proposed for dimensionality reduction in the context of clustering techniques based on histogram analysis. The approach is based on an evaluation of the hills and valleys in the unidimensional histograms along the different features and provides an economical means of assessing the significance of the features in a nonparametric unsupervised data environment. The method has relevance to remote sensing applications.
Thresholding histogram equalization.
Chuang, K S; Chen, S; Hwang, I M
2001-12-01
The drawbacks of adaptive histogram equalization techniques are the loss of definition on the edges of the object and overenhancement of noise in the images. These drawbacks can be avoided if the noise is excluded in the equalization transformation function computation. A method has been developed to separate the histogram into zones, each with its own equalization transformation. This method can be used to suppress the nonanatomic noise and enhance only certain parts of the object. This method can be combined with other adaptive histogram equalization techniques. Preliminary results indicate that this method can produce images with superior contrast.
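A minimal sketch of zone-wise equalization, where the histogram is split at assumed thresholds and each zone receives its own transformation confined to its grey-level range; the thresholds and zone handling are illustrative, not the paper's exact scheme.

import numpy as np

def zoned_equalization(img, thresholds=(60, 200)):
    """Equalize each histogram zone independently (zone limits assumed).

    Pixels are grouped into zones by the thresholds; each zone gets its own
    equalization transform confined to that zone's grey-level range, so a
    noise-dominated zone can be excluded or treated separately.
    """
    edges = [0, *thresholds, 256]
    out = img.copy()
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (img >= lo) & (img < hi)
        if not mask.any():
            continue
        hist, _ = np.histogram(img[mask], bins=hi - lo, range=(lo, hi))
        cdf = np.cumsum(hist).astype(float)
        cdf /= cdf[-1]
        lut = lo + np.round((hi - 1 - lo) * cdf)
        out[mask] = lut[img[mask] - lo].astype(img.dtype)
    return out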
Detecting duplicate biological entities using Shortest Path Edit Distance.
Rudniy, Alex; Song, Min; Geller, James
2010-01-01
Duplicate entity detection in biological data is an important research task. In this paper, we propose a novel and context-sensitive Shortest Path Edit Distance (SPED) extending and supplementing our previous work on Markov Random Field-based Edit Distance (MRFED). SPED transforms the edit distance computational problem to the calculation of the shortest path among two selected vertices of a graph. We produce several modifications of SPED by applying Levenshtein, arithmetic mean, histogram difference and TFIDF techniques to solve subtasks. We compare SPED performance to other well-known distance algorithms for biological entity matching. The experimental results show that SPED produces competitive outcomes.
DSP+FPGA-based real-time histogram equalization system of infrared image
NASA Astrophysics Data System (ADS)
Gu, Dongsheng; Yang, Nansheng; Pi, Defu; Hua, Min; Shen, Xiaoyan; Zhang, Ruolan
2001-10-01
Histogram modification is a simple but effective method to enhance an infrared image. Several methods exist to equalize an infrared image's histogram, suited to the different characteristics of different infrared images, such as the traditional HE (histogram equalization) method and the improved HP (histogram projection) and PE (plateau equalization) methods. To realize all of these methods in a single system, the system must have a large amount of memory and extremely high speed. In our system, we introduce DSP + FPGA based real-time processing technology to do these things together. The FPGA realizes the part common to these methods, while the DSP handles the parts that differ. The choice of method and its parameters can be input from a keyboard or a computer. By this means, the system is powerful yet easy to operate and maintain. In this article, we give the diagram of the system and the software flow chart of the methods. At the end, we show an infrared image and its histogram before and after processing with the HE method.
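Of the three mappings mentioned, plateau equalization (PE) is sketched below for a high-bit-depth IR frame; the plateau value and bit depth are assumed parameters.

import numpy as np

def plateau_equalization(img, plateau=500, bits=14):
    """Plateau equalization (PE) for a high-dynamic-range IR frame (sketch).

    Clipping every histogram bin at the plateau value keeps large uniform
    backgrounds from dominating the mapping; an infinite plateau recovers
    ordinary histogram equalization.
    """
    nbins = 1 << bits
    hist, _ = np.histogram(img, bins=nbins, range=(0, nbins))
    clipped = np.minimum(hist, plateau)
    cdf = np.cumsum(clipped).astype(float)
    cdf /= cdf[-1]
    lut = np.round(255 * cdf).astype(np.uint8)      # map to an 8-bit display range
    return lut[img]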
Robust Audio Watermarking by Using Low-Frequency Histogram
NASA Astrophysics Data System (ADS)
Xiang, Shijun
In continuation of earlier work, in which the problem of time-scale modification (TSM) was studied [1] by modifying the shape of the audio time-domain histogram, here we consider the additional requirement of resisting additive noise-like operations, such as Gaussian noise, lossy compression and low-pass filtering. In other words, we study the problem of making the watermark robust against both TSM and additive noise. To this end, in this paper we extract the histogram from a Gaussian-filtered low-frequency component for audio watermarking. The watermark is inserted by shaping the histogram: two consecutive bins are used as a group, and a bit is hidden by reassigning their populations. The watermarked signals are perceptually similar to the originals. Compared with the previous time-domain watermarking scheme [1], the proposed watermarking method is more robust against additive noise, MP3 compression, low-pass filtering, etc.
Flood Detection/Monitoring Using Adjustable Histogram Equalization Technique
Riaz, Muhammad Mohsin; Ghafoor, Abdul
2014-01-01
A flood monitoring technique using adjustable histogram equalization is proposed. The technique overcomes the limitations (over-enhancement, artifacts, and an unnatural look) of the existing technique by adjusting the contrast of the images. The proposed technique takes pre- and post-event images and applies different processing steps to generate a flood map without user interaction. The resulting flood maps can be used for flood monitoring and detection. Simulation results show that the proposed technique provides better output quality compared to the state-of-the-art existing technique. PMID:24558332
Tuckley, Kushal
2017-01-01
In telemedicine systems, critical medical data is shared on a public communication channel. This increases the risk of unauthorised access to patient's information. This underlines the importance of secrecy and authentication for the medical data. This paper presents two innovative variations of classical histogram shift methods to increase the hiding capacity. The first technique divides the image into nonoverlapping blocks and embeds the watermark individually using the histogram method. The second method separates the region of interest and embeds the watermark only in the region of noninterest. This approach preserves the medical information intact. This method finds its use in critical medical cases. The high PSNR (above 45 dB) obtained for both techniques indicates imperceptibility of the approaches. Experimental results illustrate superiority of the proposed approaches when compared with other methods based on histogram shifting techniques. These techniques improve embedding capacity by 5–15% depending on the image type, without affecting the quality of the watermarked image. Both techniques also enable lossless reconstruction of the watermark and the host medical image. A higher embedding capacity makes the proposed approaches attractive for medical image watermarking applications without compromising the quality of the image. PMID:29104744
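As background for the two variations above, the following sketch shows the classical block-wise histogram-shift embedding they build on; it assumes an empty bin exists above the histogram peak and omits the overflow bookkeeping and ROI handling needed in practice, so it is not the paper's exact method.

import numpy as np

def embed_block(block, bits):
    """Classical histogram-shift embedding in one 8-bit image block (sketch).

    Assumes an empty bin exists above the histogram peak and that the peak
    is not at grey level 255; overflow bookkeeping is omitted.
    """
    hist, _ = np.histogram(block, bins=256, range=(0, 256))
    peak = int(hist.argmax())
    zero = peak + 1 + int(hist[peak + 1:].argmin())  # emptiest bin above the peak
    out = block.astype(np.int32).copy()
    # Shift grey levels strictly between peak and zero up by one to open a gap.
    shift = (out > peak) & (out < zero)
    out[shift] += 1
    # Each pixel at the peak carries one payload bit: 0 = stay, 1 = move into the gap.
    carriers = np.flatnonzero(out == peak)
    for idx, bit in zip(carriers, bits):
        if bit:
            out.flat[idx] += 1
    return out.astype(np.uint8), peak, zero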
Post-Modeling Histogram Matching of Maps Produced Using Regression Trees
Andrew J. Lister; Tonya W. Lister
2006-01-01
Spatial predictive models often use statistical techniques that in some way rely on averaging of values. Estimates from linear modeling are known to be susceptible to truncation of variance when the independent (predictor) variables are measured with error. A straightforward post-processing technique (histogram matching) for attempting to mitigate this effect is...
Pattern-histogram-based temporal change detection using personal chest radiographs
NASA Astrophysics Data System (ADS)
Ugurlu, Yucel; Obi, Takashi; Hasegawa, Akira; Yamaguchi, Masahiro; Ohyama, Nagaaki
1999-05-01
Accurate and reliable detection of temporal changes from a pair of images is of considerable interest in medical science. Traditional registration and subtraction techniques can be applied to extract temporal differences when the object is rigid or corresponding points are obvious. However, in radiological imaging, loss of depth information, the elasticity of the object, the absence of clearly defined landmarks and three-dimensional positioning differences constrain the performance of conventional registration techniques. In this paper, we propose a new method to detect interval changes accurately without using an image registration technique. The method is based on the construction of a so-called pattern histogram and a comparison procedure. The pattern histogram is a graphic representation of the frequency counts of all allowable patterns in the multi-dimensional pattern vector space. A K-means algorithm is employed to partition the pattern vector space successively. Any differences in the pattern histograms imply that different patterns are involved in the scenes. In our experiment, a pair of chest radiographs of pneumoconiosis is employed and the changing histogram bins are visualized on both images. We found that the method can be used as an alternative means of temporal change detection, particularly when precise image registration is not available.
Regionally adaptive histogram equalization of the chest.
Sherrier, R H; Johnson, G A
1987-01-01
Advances in the area of digital chest radiography have resulted in the acquisition of high-quality images of the human chest. With these advances, there arises a genuine need for image processing algorithms specific to the chest, in order to fully exploit this digital technology. We have implemented the well-known technique of histogram equalization, noting the problems encountered when it is adapted to chest images. These problems have been successfully solved with our regionally adaptive histogram equalization method. With this technique histograms are calculated locally and then modified according to both the mean pixel value of that region as well as certain characteristics of the cumulative distribution function. This process, which has allowed certain regions of the chest radiograph to be enhanced differentially, may also have broader implications for other image processing tasks.
NASA Astrophysics Data System (ADS)
Maggio, Angelo; Carillo, Viviana; Cozzarini, Cesare; Perna, Lucia; Rancati, Tiziana; Valdagni, Riccardo; Gabriele, Pietro; Fiorino, Claudio
2013-04-01
The aim of this study was to evaluate the correlation between the 'true' absolute and relative dose-volume histograms of the bladder wall (the dose-wall histogram, DWH), defined on MRI imaging, and other surrogates of bladder dosimetry in prostate cancer patients, planned with both 3D-conformal and intensity-modulated radiation therapy (IMRT) techniques. For 17 prostate cancer patients previously treated with radical intent, CT and MRI scans were acquired and matched. The contours of the bladder walls were drawn using the MRI images. The external bladder surfaces were then used to generate artificial bladder walls by performing automatic contractions of 5, 7 and 10 mm. For each patient, a 3D conformal radiotherapy (3DCRT) and an IMRT treatment plan were generated with a prescription dose of 77.4 Gy (1.8 Gy/fr); DVHs of the whole bladder and of the artificial walls (DVH5-DVH10), as well as dose-surface histograms (DSHs), were calculated and compared against the DWH in absolute and relative terms for both treatment planning techniques. Dedicated software (VODCA v. 4.4.0, MSS Inc.) was used for calculating the dose-volume/surface histograms. Correlation was quantified for selected dose-volume/surface parameters by the Spearman correlation coefficient. The agreement between %DWH and DVH5, DVH7 and DVH10 was found to be very good (maximum average deviations below 2%, SD < 5%): DVH5 showed the best agreement. The correlation was slightly better for absolute (R = 0.80-0.94) compared to relative (R = 0.66-0.92) histograms. The DSH was also found to be highly correlated with the DWH, although slightly higher deviations were generally found. The DVH was not a good surrogate of the DWH (R < 0.7 for most parameters). When comparing the two treatment techniques, more pronounced differences between relative histograms were seen for IMRT with respect to 3DCRT (p < 0.0001).
Information granules in image histogram analysis.
Wieclawek, Wojciech
2018-04-01
A concept of granular computing employed in intensity-based image enhancement is discussed. First, a weighted granular computing idea is introduced. Then, its implementation in the image processing area is presented. Finally, multidimensional granular histogram analysis is introduced. The proposed approach is dedicated to digital images, especially medical images acquired by Computed Tomography (CT). Like the histogram equalization approach, this method is based on image histogram analysis. Yet, unlike the histogram equalization technique, it works on a selected range of pixel intensities and is controlled by two parameters. Performance is tested on anonymized clinical CT series. Copyright © 2017 Elsevier Ltd. All rights reserved.
Improved LSB matching steganography with histogram characters reserved
NASA Astrophysics Data System (ADS)
Chen, Zhihong; Liu, Wenyao
2008-03-01
This letter builds on research into the LSB (least significant bit, i.e. the last bit of a binary pixel value) matching steganographic method and on steganalytic methods that target the histograms of cover images, and proposes a modification to LSB matching. In LSB matching, if the LSB of the next cover pixel matches the next bit of secret data, nothing is done; otherwise, one is added to or subtracted from the cover pixel value at random. In our improved method, a steganographic information table is defined that records the changes introduced by the embedded secret bits. Using the table, the decision to add or subtract one for the next pixel with the same value is made dynamically, so that the change to the cover image's histogram is minimized. Therefore, the modified method embeds the same payload as LSB matching but with improved steganographic security and less vulnerability to attacks. Experimental results show that the histograms maintain their attributes, such as peak values and alternating trends, to an acceptable degree and that the method outperforms LSB matching in terms of histogram distortion and resistance against existing steganalysis.
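For reference, plain LSB matching (the baseline the letter modifies) can be sketched as below; the histogram-aware steganographic information table is not reproduced, so the +-1 choice here is random rather than histogram-minimizing.

import numpy as np

def lsb_match_embed(cover, bits, seed=0):
    """Plain LSB matching: change a pixel by +-1 (chosen at random here) whenever
    its LSB disagrees with the next secret bit; assumes len(bits) <= cover.size."""
    rng = np.random.default_rng(seed)
    stego = cover.astype(np.int16).copy()
    flat = stego.ravel()                      # view into stego
    for i, bit in enumerate(bits):
        if (flat[i] & 1) != bit:
            step = rng.choice((-1, 1))
            if flat[i] == 0:                  # keep the value inside 0..255
                step = 1
            elif flat[i] == 255:
                step = -1
            flat[i] += step
    return stego.astype(np.uint8)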
Reducing Error Rates for Iris Image using higher Contrast in Normalization process
NASA Astrophysics Data System (ADS)
Aminu Ghali, Abdulrahman; Jamel, Sapiee; Abubakar Pindar, Zahraddeen; Hasssan Disina, Abdulkadir; Mat Daris, Mustafa
2017-08-01
The iris recognition system is one of the most secure and fastest means of identification and authentication. However, iris recognition suffers from blurring, low contrast and poor illumination due to low-quality images, which compromises the accuracy of the system. The acceptance or rejection rate of a verified user depends solely on the quality of the image. In many cases, an iris recognition system with low image contrast could falsely accept or reject a user. Therefore, this paper adopts the histogram equalization technique to address the problems of False Rejection Rate (FRR) and False Acceptance Rate (FAR) by enhancing the contrast of the iris image. The histogram equalization technique enhances the image quality and compensates for the low contrast of the image at the normalization stage. The experimental results show that the histogram equalization technique reduces FRR and FAR compared to existing techniques.
NASA Astrophysics Data System (ADS)
Jadhav, Vidya
2015-09-01
Surface modifications caused by swift heavy ion irradiation of a crystalline p-type gallium antimonide crystal are reported. Single-crystal, ⟨1 0 0⟩-oriented, ∼500 μm thick p-type GaSb samples with a carrier concentration of 3.30 × 10¹⁷ cm⁻³ were irradiated with 100 MeV Fe⁷⁺ ions. We used the 15UD Pelletron facilities at IUAC with fluences varying from 5 × 10¹⁰ to 1 × 10¹⁴ ions cm⁻². The effects of irradiation on these samples were investigated using spectroscopic ellipsometry, atomic force microscopy and ultraviolet-visible-NIR spectroscopy. The ellipsometry parameters psi (Ψ) and delta (Δ) were recorded for the unirradiated sample and for samples irradiated with different fluences. The data were fit to a three-phase model to determine the refractive index and extinction coefficient, which were evaluated for the various fluences in the ultraviolet, visible and infrared regimes. Atomic force microscopy was used to study the surface modifications. To obtain more statistical information about the surface, we plotted the height-structure histogram for all the samples. For the unirradiated sample a Gaussian fit was observed, indicating a more ordered height-structure symmetry, whereas for the samples irradiated at fluences of 1 × 10¹³, 5 × 10¹³ and 1 × 10¹⁴ ions cm⁻² the data were scattered. The width of the histogram for samples irradiated up to a fluence of 1 × 10¹³ ions cm⁻² was found to be almost the same; however, it decreased at higher fluence. UV reflectance spectra of the samples irradiated with increasing fluences exhibit three peaks at 292, 500 and 617 nm, representing the high-energy GaSb E1, E1 + Δ and E2 band gaps in all irradiated samples.
An improved reversible data hiding algorithm based on modification of prediction errors
NASA Astrophysics Data System (ADS)
Jafar, Iyad F.; Hiary, Sawsan A.; Darabkh, Khalid A.
2014-04-01
Reversible data hiding algorithms are concerned with the ability of hiding data and recovering the original digital image upon extraction. This issue is of interest in medical and military imaging applications. One particular class of such algorithms relies on the idea of histogram shifting of prediction errors. In this paper, we propose an improvement over one popular algorithm in this class. The improvement is achieved by employing a different predictor, the use of more bins in the prediction error histogram in addition to multilevel embedding. The proposed extension shows significant improvement over the original algorithm and its variations.
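A minimal sketch of histogram shifting applied to prediction errors, the idea this class of algorithms builds on; the left-neighbour predictor, the single peak bin and the absence of overflow handling are simplifications, and the paper's improved predictor, multi-bin selection and multilevel embedding are not reproduced.

import numpy as np

def prediction_errors(img):
    """Left-neighbour prediction errors; column 0 is kept as raw values (assumed predictor)."""
    err = img.astype(np.int32).copy()
    err[:, 1:] = img[:, 1:].astype(np.int32) - img[:, :-1].astype(np.int32)
    return err

def embed_in_errors(err, bits, peak=0):
    """Shift the prediction-error histogram around the peak bin and embed one
    bit per peak-valued error; overflow handling is omitted in this sketch."""
    out = err.copy()
    out[out > peak] += 1                      # open a gap at peak + 1
    carriers = np.flatnonzero(err == peak)
    for idx, bit in zip(carriers, bits):
        if bit:
            out.flat[idx] += 1                # bit 1 moves the error into the gap
    return out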
Pandey, Anil Kumar; Sharma, Param Dev; Dheer, Pankaj; Parida, Girish Kumar; Goyal, Harish; Patel, Chetan; Bal, Chandrashekhar; Kumar, Rakesh
2017-01-01
Purpose of the Study: 99mTechnetium-methylene diphosphonate (99mTc-MDP) bone scan images have limited number of counts per pixel, and hence, they have inferior image quality compared to X-rays. Theoretically, global histogram equalization (GHE) technique can improve the contrast of a given image though practical benefits of doing so have only limited acceptance. In this study, we have investigated the effect of GHE technique for 99mTc-MDP-bone scan images. Materials and Methods: A set of 89 low contrast 99mTc-MDP whole-body bone scan images were included in this study. These images were acquired with parallel hole collimation on Symbia E gamma camera. The images were then processed with histogram equalization technique. The image quality of input and processed images were reviewed by two nuclear medicine physicians on a 5-point scale where score of 1 is for very poor and 5 is for the best image quality. A statistical test was applied to find the significance of difference between the mean scores assigned to input and processed images. Results: This technique improves the contrast of the images; however, oversaturation was noticed in the processed images. Student's t-test was applied, and a statistically significant difference in the input and processed image quality was found at P < 0.001 (with α = 0.05). However, further improvement in image quality is needed as per requirements of nuclear medicine physicians. Conclusion: GHE techniques can be used on low contrast bone scan images. In some of the cases, a histogram equalization technique in combination with some other postprocessing technique is useful. PMID:29142344
Waldenberg, Christian; Hebelka, Hanna; Brisby, Helena; Lagerstrand, Kerstin Magdalena
2018-05-01
Magnetic resonance imaging (MRI) is the best diagnostic imaging method for low back pain. However, the technique is currently not utilized to its full capacity, often failing to depict painful intervertebral discs (IVDs), potentially due to the rough degeneration classification system used clinically today. MR image histograms, which reflect the IVD heterogeneity, may offer sensitive imaging biomarkers for IVD degeneration classification. This study investigates the feasibility of using histogram analysis as a means of objective and continuous grading of IVD degeneration. Forty-nine IVDs in ten low back pain patients (six males, 25-69 years) were examined with MRI (T2-weighted images and T2-maps). Each IVD was semi-automatically segmented on three mid-sagittal slices. Histogram features of the IVD were extracted from the defined regions of interest and correlated to Pfirrmann grade. Both T2-weighted images and T2-maps displayed similar histogram features. Histograms of well-hydrated IVDs displayed two separate peaks, representing the annulus fibrosus and the nucleus pulposus. Degenerated IVDs displayed decreased peak separation, and the separation was shown to correlate strongly with Pfirrmann grade (P < 0.05). In addition, some degenerated IVDs within the same Pfirrmann grade displayed diametrically different histogram appearances. Histogram features correlated well with IVD degeneration, suggesting that IVD histogram analysis is a suitable tool for objective and continuous IVD degeneration classification. As histogram analysis revealed IVD heterogeneity, it may serve as a clinical tool for characterization of regional IVD degeneration effects. To elucidate the usefulness of histogram analysis in patient management, IVD histogram features of asymptomatic and symptomatic individuals need to be compared.
Nemmi, Federico; Saint-Aubert, Laure; Adel, Djilali; Salabert, Anne-Sophie; Pariente, Jérémie; Barbeau, Emmanuel; Payoux, Pierre; Péran, Patrice
2014-01-01
Purpose: The AV-45 amyloid biomarker is known to show uptake in white matter in patients with Alzheimer's disease (AD) but also in the healthy population. This binding, thought to be of a non-specific lipophilic nature, has not yet been investigated. The aim of this study was to determine the differential pattern of AV-45 binding in white matter in healthy and pathological populations. Methods: We recruited 24 patients presenting with AD at an early stage and 17 matched, healthy subjects. We used an optimized PET-MRI registration method and an approach based on the intensity histogram using several indexes. We compared the results of the intensity histogram analyses with a more canonical approach based on the target-to-cerebellum Standard Uptake Value ratio (SUVr) in white and grey matter using MANOVA and discriminant analyses. A cluster analysis of the white and grey matter histograms was also performed. Results: White matter histogram analysis revealed significant differences between AD and healthy subjects that were not revealed by the SUVr analysis. However, the white matter histograms were not decisive for discriminating the groups, and indexes based on grey matter only showed better discriminative power than SUVr. The cluster analysis divided our sample into two clusters, showing different uptake in grey but also in white matter. Conclusion: These results demonstrate that AV-45 binding in white matter conveys subtle information not detectable using the SUVr approach. Although it is not better than standard SUVr at discriminating AD patients from healthy subjects, this information could reveal white matter modifications. PMID:24573658
McCann, Cooper; Repasky, Kevin S.; Morin, Mikindra; ...
2017-05-23
Hyperspectral image analysis has benefited from an array of methods that take advantage of the increased spectral depth compared to multispectral sensors; however, the focus of these developments has been on supervised classification methods. Lack of a priori knowledge regarding land cover characteristics can make unsupervised classification methods preferable under certain circumstances. An unsupervised classification technique is presented in this paper that utilizes physically relevant basis functions to model the reflectance spectra. These fit parameters used to generate the basis functions allow clustering based on spectral characteristics rather than spectral channels and provide both noise and data reduction. Histogram splitting of the fit parameters is then used as a means of producing an unsupervised classification. Unlike current unsupervised classification techniques that rely primarily on Euclidian distance measures to determine similarity, the unsupervised classification technique uses the natural splitting of the fit parameters associated with the basis functions creating clusters that are similar in terms of physical parameters. The data set used in this work utilizes the publicly available data collected at Indian Pines, Indiana. This data set provides reference data allowing for comparisons of the efficacy of different unsupervised data analysis. The unsupervised histogram splitting technique presented in this paper is shown to be better than the standard unsupervised ISODATA clustering technique with an overall accuracy of 34.3/19.0% before merging and 40.9/39.2% after merging. Finally, this improvement is also seen as an improvement of kappa before/after merging of 24.8/30.5 for the histogram splitting technique compared to 15.8/28.5 for ISODATA.
Histogram analysis for smartphone-based rapid hematocrit determination
Jalal, Uddin M.; Kim, Sang C.; Shim, Joon S.
2017-01-01
A novel and rapid histogram-based analysis technique has been proposed for the colorimetric quantification of blood hematocrits. A smartphone-based "Histogram" app for the detection of hematocrits has been developed, integrating the smartphone's embedded camera with a microfluidic chip via a custom-made optical platform. The developed histogram analysis is effective in the automatic detection of the sample channel, including auto-calibration, and can analyze single-channel as well as multi-channel images. Furthermore, the analysis method is advantageous for the quantification of blood hematocrit under both uniform and varying optical conditions. The rapid determination of blood hematocrits carries considerable information regarding physiological disorders, and the use of such reproducible, cost-effective, and standard techniques may effectively help with the diagnosis and prevention of a number of human diseases. PMID:28717569
List mode multichannel analyzer
Archer, Daniel E [Livermore, CA; Luke, S John [Pleasanton, CA; Mauger, G Joseph [Livermore, CA; Riot, Vincent J [Berkeley, CA; Knapp, David A [Livermore, CA
2007-08-07
A digital list mode multichannel analyzer (MCA) built around a programmable FPGA device for onboard data analysis and on-the-fly modification of system detection/operating parameters, and capable of collecting and processing data in very small time bins (<1 millisecond) when used in histogramming mode, or in list mode as a list mode MCA.
Histogram equalization with Bayesian estimation for noise robust speech recognition.
Suh, Youngjoo; Kim, Hoirin
2018-02-01
The histogram equalization approach is an efficient feature normalization technique for noise robust automatic speech recognition. However, it suffers from performance degradation when some fundamental conditions are not satisfied in the test environment. To remedy these limitations of the original histogram equalization methods, class-based histogram equalization approach has been proposed. Although this approach showed substantial performance improvement under noise environments, it still suffers from performance degradation due to the overfitting problem when test data are insufficient. To address this issue, the proposed histogram equalization technique employs the Bayesian estimation method in the test cumulative distribution function estimation. It was reported in a previous study conducted on the Aurora-4 task that the proposed approach provided substantial performance gains in speech recognition systems based on the acoustic modeling of the Gaussian mixture model-hidden Markov model. In this work, the proposed approach was examined in speech recognition systems with deep neural network-hidden Markov model (DNN-HMM), the current mainstream speech recognition approach where it also showed meaningful performance improvement over the conventional maximum likelihood estimation-based method. The fusion of the proposed features with the mel-frequency cepstral coefficients provided additional performance gains in DNN-HMM systems, which otherwise suffer from performance degradation in the clean test condition.
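As a simple illustration of the underlying idea (matching a test feature distribution to a reference), the sketch below equalizes a 1-D feature stream to a Gaussian reference via its empirical CDF; the Bayesian CDF estimation and class-based variants of the paper are not reproduced, and the Gaussian reference is an assumption.

import numpy as np
from scipy.stats import norm

def histogram_equalize_feature(test_feat, ref_mean=0.0, ref_std=1.0):
    """Map a 1-D feature sequence so its distribution matches a Gaussian
    reference (a stand-in for the training distribution)."""
    test_feat = np.asarray(test_feat, dtype=float)
    order = np.argsort(test_feat)
    ranks = np.empty_like(order)
    ranks[order] = np.arange(test_feat.size)
    cdf = (ranks + 0.5) / test_feat.size      # empirical CDF kept inside (0, 1)
    return norm.ppf(cdf, loc=ref_mean, scale=ref_std)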
Motor Oil Classification using Color Histograms and Pattern Recognition Techniques.
Ahmadi, Shiva; Mani-Varnosfaderani, Ahmad; Habibi, Biuck
2018-04-20
Motor oil classification is important for quality control and the identification of oil adulteration. In this work, we propose a simple, rapid, inexpensive and nondestructive approach based on image analysis and pattern recognition techniques for the classification of nine different types of motor oils according to their corresponding color histograms. For this, we applied color histograms in different color spaces such as red green blue (RGB), grayscale, and hue saturation intensity (HSI) in order to extract features that can help with the classification procedure. These color histograms and their combinations were used as input for model development and then were statistically evaluated by using linear discriminant analysis (LDA), quadratic discriminant analysis (QDA), and support vector machine (SVM) techniques. Here, two common solutions for solving a multiclass classification problem were applied: (1) transformation to a binary classification problem using a one-against-all (OAA) approach and (2) extension from binary classifiers to a single globally optimized multilabel classification model. In the OAA strategy, LDA, QDA, and SVM reached up to 97% in terms of accuracy, sensitivity, and specificity for both the training and test sets. In the extension from the binary case, despite good performances by the SVM classification model, QDA and LDA provided better results up to 92% for RGB-grayscale-HSI color histograms and up to 93% for the HSI color map, respectively. In order to reduce the number of independent variables for modeling, a principal component analysis algorithm was used. Our results suggest that the proposed method is promising for the identification and classification of different types of motor oils.
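A hedged sketch of the feature-extraction and classification pipeline, using per-channel RGB histograms and scikit-learn classifiers; the bin count, kernel and hyper-parameters are assumptions, and the one-against-all setup and HSI conversion of the paper are omitted.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.svm import SVC

def rgb_histogram_features(img, bins=32):
    """Concatenate per-channel RGB histograms into a single feature vector."""
    feats = [np.histogram(img[..., c], bins=bins, range=(0, 256), density=True)[0]
             for c in range(3)]
    return np.concatenate(feats)

def train_classifiers(X, y):
    """X: stacked feature vectors (n_images x 3*bins), y: oil-type labels."""
    lda = LinearDiscriminantAnalysis().fit(X, y)
    svm = SVC(kernel="rbf", C=1.0).fit(X, y)  # hyper-parameters assumed
    return lda, svm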
High capacity reversible watermarking for audio by histogram shifting and predicted error expansion.
Wang, Fei; Xie, Zhaoxin; Chen, Zuo
2014-01-01
With reversible watermarking, the watermark information embedded in audio signals can be extracted while the original audio data are recovered losslessly. Currently, the few reversible audio watermarking algorithms are confronted with the following problems: relatively low SNR (signal-to-noise ratio) of the embedded audio; a large amount of auxiliary embedded location information; and the absence of accurate capacity control. In this paper, we present a novel reversible audio watermarking scheme based on improved prediction error expansion and histogram shifting. First, we use a differential evolution algorithm to optimize the prediction coefficients and then apply prediction error expansion to produce the stego data. Second, to reduce the length of the location map, we introduce a histogram shifting scheme. Meanwhile, the prediction error modification threshold for a given embedding capacity can be computed by our proposed scheme. Experiments show that this algorithm improves the SNR of the embedded audio signals and the embedding capacity, drastically reduces the location map length, and enhances capacity control.
Assessing clutter reduction in parallel coordinates using image processing techniques
NASA Astrophysics Data System (ADS)
Alhamaydh, Heba; Alzoubi, Hussein; Almasaeid, Hisham
2018-01-01
Information visualization has emerged as an important research field for multidimensional data and correlation analysis in recent years. Parallel coordinates (PCs) are one of the popular techniques for visualizing high-dimensional data. A problem with the PC technique is that it suffers from crowding, a clutter that hides important data and obscures information. Earlier research has sought to reduce clutter without loss of data content. We introduce the use of image processing techniques as an approach for assessing the performance of clutter reduction techniques in PCs. We use histogram analysis as our first measure, where the mean feature of the color histograms of the possible alternative coordinate orderings for the PC images is calculated and compared. The second measure is the contrast feature extracted from the texture of the PC images based on gray-level co-occurrence matrices. The results show that the best PC image is the one with the minimal mean value of the color histogram feature and the maximal contrast value of the texture feature. In addition to its simplicity, the proposed assessment method has the advantage of objectively assessing alternative orderings of PC visualizations.
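The two image measures used to score candidate coordinate orderings can be sketched directly in NumPy as below; the grey-level quantization, horizontal co-occurrence offset and bin counts are assumptions.

import numpy as np

def histogram_mean(gray_img, bins=256):
    """Mean of the grey-level histogram of a rendered PC image."""
    hist, edges = np.histogram(gray_img, bins=bins, range=(0, 256))
    centers = (edges[:-1] + edges[1:]) / 2.0
    return float((hist * centers).sum() / hist.sum())

def glcm_contrast(gray_img, levels=64):
    """Contrast of a horizontal grey-level co-occurrence matrix."""
    q = (gray_img.astype(float) * levels / 256).astype(int).clip(0, levels - 1)
    left, right = q[:, :-1].ravel(), q[:, 1:].ravel()
    glcm = np.zeros((levels, levels))
    np.add.at(glcm, (left, right), 1)         # accumulate co-occurrences
    glcm /= glcm.sum()
    i, j = np.indices((levels, levels))
    return float(((i - j) ** 2 * glcm).sum())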
Application of Markov Models for Analysis of Development of Psychological Characteristics
ERIC Educational Resources Information Center
Kuravsky, Lev S.; Malykh, Sergey B.
2004-01-01
A technique to study the combined influence of environmental and genetic factors on the basis of changes in phenotype distributions is presented. Histograms are used as the basic analyzed characteristics. A continuous-time, discrete-state Markov process with piecewise-constant interstate transition rates is associated with the evolution of each histogram.…
On algorithmic optimization of histogramming functions for GEM systems
NASA Astrophysics Data System (ADS)
Krawczyk, Rafał D.; Czarski, Tomasz; Kolasinski, Piotr; Poźniak, Krzysztof T.; Linczuk, Maciej; Byszuk, Adrian; Chernyshova, Maryna; Juszczyk, Bartlomiej; Kasprowicz, Grzegorz; Wojenski, Andrzej; Zabolotny, Wojciech
2015-09-01
This article concerns optimization methods for data analysis for the X-ray GEM detector system. The offline analysis of collected samples was optimized for MATLAB computations. Functions compiled in the C language were used through the MEX library. Significant speedup was achieved for both the ordering/preprocessing and the histogramming of samples. The techniques used and the results obtained are presented.
[The value of spectral frequency analysis by Doppler examination (author's transl)].
Boccalon, H; Reggi, M; Lozes, A; Canal, C; Jausseran, J M; Courbier, R; Puel, P; Enjalbert, A
1981-01-01
Arterial stenoses of moderate extent may involve modifications of the blood flow. Arterial shading is not always examined at the best incident angle to assess the extent of the stenosis. Spectral frequency analysis by Doppler examination is a good means of evaluating the effect of moderate arterial lesions. The present study was carried out with a Doppler device whose acoustic spectrum is displayed as a histogram of 16 frequency bands. The values were recorded on the two femoral arteries. A study was also made of 49 normal subjects to establish a normal envelope histogram, taking into account the following parameters: maximum peak (800 Hz), low cut-off frequency (420 Hz), high cut-off frequency (2,600 Hz); a first peak was present in 81% of the subjects (at 375 Hz) and a second peak in 75% of the subjects (2,020 Hz). Thirteen patients with iliac lesions of differing extent were included in the study; the details of these lesions were established in all cases by aortography. None of the recorded frequency histograms fell within the normal envelope. Two cases of moderate iliac stenosis (<50% of the diameter) were noted which affected the histogram, even though the femoral velocity signal was normal.
Microbubble cloud characterization by nonlinear frequency mixing.
Cavaro, M; Payan, C; Moysan, J; Baqué, F
2011-05-01
Within the framework of the Generation IV forum, France decided to develop sodium-cooled fast nuclear reactors. The French Safety Authority requires the associated monitoring of argon gas entrained in the sodium. This implies estimating the void fraction and a histogram describing the bubble population. In this context, the present letter studies the possibility of achieving an accurate determination of the histogram with acoustic methods. A nonlinear, two-frequency mixing technique has been implemented, and a dedicated optical device has been developed in order to validate the experimental results. The acoustically reconstructed histograms are in excellent agreement with those obtained using optical methods.
Diagnosis of Tempromandibular Disorders Using Local Binary Patterns.
Haghnegahdar, A A; Kolahi, S; Khojastepour, L; Tajeripour, F
2018-03-01
Temporomandibular joint disorder (TMD) might be manifested as structural changes in bone through modification, adaptation or direct destruction. We propose to use Local Binary Pattern (LBP) characteristics and histogram-oriented gradients on the recorded images as a diagnostic tool in TMD assessment. CBCT images of 66 patients (132 joints) with TMD and 66 normal cases (132 joints) were collected and 2 coronal cut prepared from each condyle, although images were limited to head of mandibular condyle. In order to extract features of images, first we use LBP and then histogram of oriented gradients. To reduce dimensionality, the linear algebra Singular Value Decomposition (SVD) is applied to the feature vectors matrix of all images. For evaluation, we used K nearest neighbor (K-NN), Support Vector Machine, Naïve Bayesian and Random Forest classifiers. We used Receiver Operating Characteristic (ROC) to evaluate the hypothesis. K nearest neighbor classifier achieves a very good accuracy (0.9242), moreover, it has desirable sensitivity (0.9470) and specificity (0.9015) results, when other classifiers have lower accuracy, sensitivity and specificity. We proposed a fully automatic approach to detect TMD using image processing techniques based on local binary patterns and feature extraction. K-NN has been the best classifier for our experiments in detecting patients from healthy individuals, by 92.42% accuracy, 94.70% sensitivity and 90.15% specificity. The proposed method can help automatically diagnose TMD at its initial stages.
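A minimal sketch of the basic 8-neighbour LBP descriptor referred to above; the uniform/rotation-invariant variants, the HOG features and the SVD step of the study are not included.

import numpy as np

def lbp_histogram(img):
    """Basic 8-neighbour Local Binary Pattern codes and their normalized histogram."""
    x = img.astype(np.int32)
    c = x[1:-1, 1:-1]                          # interior pixels
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(c)
    for bit, (dy, dx) in enumerate(offsets):
        neighbour = x[1 + dy:x.shape[0] - 1 + dy, 1 + dx:x.shape[1] - 1 + dx]
        codes |= (neighbour >= c).astype(np.int32) << bit
    hist, _ = np.histogram(codes, bins=256, range=(0, 256))
    return hist / hist.sum()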
Morikawa, Kei; Kurimoto, Noriaki; Inoue, Takeo; Mineshita, Masamichi; Miyazawa, Teruomi
2015-01-01
Endobronchial ultrasonography using a guide sheath (EBUS-GS) is an increasingly common bronchoscopic technique, but currently, no methods have been established to quantitatively evaluate EBUS images of peripheral pulmonary lesions. The purpose of this study was to evaluate whether histogram data collected from EBUS-GS images can contribute to the diagnosis of lung cancer. Histogram-based analyses focusing on the brightness of EBUS images were retrospectively conducted: 60 patients (38 with lung cancer; 22 with inflammatory diseases) with clear EBUS images were included. For each patient, a 400-pixel region of interest was selected, typically located at a 3- to 5-mm radius from the probe, from EBUS images recorded during bronchoscopy. Histogram height, width, height/width ratio, standard deviation, kurtosis and skewness were investigated as diagnostic indicators. Median histogram height, width, height/width ratio and standard deviation were significantly different between lung cancer and benign lesions (all p < 0.01). With a cutoff value for standard deviation of 10.5, lung cancer could be diagnosed with an accuracy of 81.7%. The other characteristics investigated were inferior to histogram standard deviation. Histogram standard deviation appears to be the most useful characteristic for diagnosing lung cancer using EBUS images. © 2015 S. Karger AG, Basel.
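A hedged sketch of the ROI histogram descriptors listed above; the bin count and the exact height/width definitions are assumptions, and scipy is used for kurtosis and skewness.

import numpy as np
from scipy.stats import kurtosis, skew

def roi_histogram_features(roi):
    """Brightness-histogram descriptors for an EBUS region of interest."""
    values = roi.ravel().astype(float)
    hist, edges = np.histogram(values, bins=64, range=(0, 256))
    height = int(hist.max())                               # tallest bin
    occupied = np.flatnonzero(hist)
    width = float(edges[occupied[-1] + 1] - edges[occupied[0]])
    return {
        "height": height,
        "width": width,
        "height_width_ratio": height / width if width else float("nan"),
        "std": float(values.std()),
        "kurtosis": float(kurtosis(values)),
        "skewness": float(skew(values)),
    }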
NASA Astrophysics Data System (ADS)
Wan, Minjie; Gu, Guohua; Qian, Weixian; Ren, Kan; Chen, Qian
2018-06-01
Infrared (IR) small target enhancement plays a significant role in modern infrared search and track (IRST) systems and underpins target detection and tracking. In this paper, a coarse-to-fine grey-level mapping method using an improved sigmoid transformation and a saliency histogram is designed to enhance IR small targets against different backgrounds. In the coarse enhancement stage, the intensity histogram is modified via an improved sigmoid function so as to narrow the typical intensity range of the background as much as possible. In the fine enhancement stage, a linear transformation is applied based on a saliency histogram constructed by averaging the cumulative saliency values provided by a saliency map. Compared with other typical methods, the presented method achieves better results in both visual comparisons and quantitative evaluations.
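A minimal sketch of the coarse stage (sigmoid-based grey-level remapping); the centre and gain values are assumptions, and the saliency-histogram fine stage is not reproduced.

import numpy as np

def sigmoid_remap(img, center=None, gain=0.05):
    """Coarse stage: squeeze the background grey range with a sigmoid mapping.

    center defaults to the image mean and gain controls how strongly the
    background intensities are compressed (both values are assumptions).
    """
    x = img.astype(float)
    c = x.mean() if center is None else center
    s = 1.0 / (1.0 + np.exp(-gain * (x - c)))
    span = s.max() - s.min()
    s = (s - s.min()) / span if span > 0 else np.zeros_like(s)
    return (255 * s).astype(np.uint8)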
Perceptual Contrast Enhancement with Dynamic Range Adjustment
Zhang, Hong; Li, Yuecheng; Chen, Hao; Yuan, Ding; Sun, Mingui
2013-01-01
In recent years, although great efforts have been made to improve its performance, few histogram equalization (HE) methods take human visual perception (HVP) into account explicitly. The human visual system (HVS) is more sensitive to edges than to brightness. This paper makes use of this property and develops a perceptual contrast enhancement approach with dynamic range adjustment through histogram modification. The use of perceptual contrast connects the image enhancement problem with the HVS. To pre-condition the input image before the HE procedure is applied, a perceptual contrast map (PCM) is constructed based on a modified Difference of Gaussians (DoG) algorithm. As a result, the contrast of the image is sharpened and high-frequency noise is suppressed. A modified Clipped Histogram Equalization (CHE) is also developed, which improves visual quality by automatically detecting the dynamic range of the image with improved perceptual contrast. Experimental results show that the new HE algorithm outperforms several state-of-the-art algorithms in improving perceptual contrast and enhancing details. In addition, the new algorithm is simple to implement, making it suitable for real-time applications. PMID:24339452
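The two building blocks named above, a DoG-based perceptual contrast map and clipped histogram equalization, can be sketched as follows; the Gaussian scales and clip fraction are assumptions, not the paper's modified DOG/CHE parameters.

import numpy as np
from scipy.ndimage import gaussian_filter

def perceptual_contrast_map(img, sigma_center=1.0, sigma_surround=4.0):
    """Difference-of-Gaussians magnitude as a simple perceptual contrast map."""
    x = img.astype(float)
    dog = gaussian_filter(x, sigma_center) - gaussian_filter(x, sigma_surround)
    return np.abs(dog)

def clipped_histogram_equalization(img, clip_fraction=0.01):
    """Clip the histogram before equalization to limit over-enhancement."""
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    clip = max(1, int(clip_fraction * img.size))
    excess = np.maximum(hist - clip, 0).sum()
    hist = np.minimum(hist, clip) + excess // 256   # redistribute the clipped mass
    cdf = np.cumsum(hist).astype(float)
    cdf /= cdf[-1]
    lut = np.round(255 * cdf).astype(np.uint8)
    return lut[img]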
Abstracting Attribute Space for Transfer Function Exploration and Design.
Maciejewski, Ross; Jang, Yun; Woo, Insoo; Jänicke, Heike; Gaither, Kelly P; Ebert, David S
2013-01-01
Currently, user-centered transfer function design begins with the user interacting with a one- or two-dimensional histogram of the volumetric attribute space. The attribute space is visualized as a function of the number of voxels, allowing the user to explore the data in terms of the attribute size/magnitude. However, such visualizations provide the user with no information on the relationship between various attribute spaces (e.g., density, temperature, pressure, x, y, z) within the multivariate data. In this work, we propose a modification to the attribute space visualization in which the user is no longer presented with the magnitude of the attribute; instead, the user is presented with an information metric detailing the relationship between attributes of the multivariate volumetric data. In this way, the user can guide their exploration based on the relationship between the attribute magnitude and user-selected attribute information, as opposed to being constrained to visualizing only the magnitude of the attribute. We refer to this modification of the traditional histogram widget as an abstract attribute space representation. Our system utilizes common one- and two-dimensional histogram widgets where the bins of the abstract attribute space now correspond to an attribute relationship in terms of the mean, standard deviation, entropy, or skewness. In this manner, we exploit the relationships and correlations present in the underlying data with respect to the dimension(s) under examination. These relationships are oftentimes key to insight and allow us to guide attribute discovery, as opposed to automatic extraction schemes which try to calculate and extract distinct attributes a priori. In this way, our system aids in the knowledge discovery of the interaction of properties within volumetric data.
Diagnosis of Tempromandibular Disorders Using Local Binary Patterns
Haghnegahdar, A.A.; Kolahi, S.; Khojastepour, L.; Tajeripour, F.
2018-01-01
Background: Temporomandibular joint disorder (TMD) might be manifested as structural changes in bone through modification, adaptation or direct destruction. We propose to use Local Binary Pattern (LBP) characteristics and histograms of oriented gradients on the recorded images as a diagnostic tool in TMD assessment. Material and Methods: CBCT images of 66 patients (132 joints) with TMD and 66 normal cases (132 joints) were collected, and two coronal cuts were prepared from each condyle, with images limited to the head of the mandibular condyle. To extract image features, we first apply LBP and then the histogram of oriented gradients. To reduce dimensionality, Singular Value Decomposition (SVD) is applied to the feature vector matrix of all images. For evaluation, we used K nearest neighbor (K-NN), Support Vector Machine, Naïve Bayesian and Random Forest classifiers, and Receiver Operating Characteristic (ROC) analysis to evaluate the hypothesis. Results: The K nearest neighbor classifier achieves very good accuracy (0.9242) together with desirable sensitivity (0.9470) and specificity (0.9015), whereas the other classifiers yield lower accuracy, sensitivity and specificity. Conclusion: We propose a fully automatic approach to detect TMD using image processing techniques based on local binary patterns and feature extraction. K-NN was the best classifier in our experiments for distinguishing patients from healthy individuals, with 92.42% accuracy, 94.70% sensitivity and 90.15% specificity. The proposed method can help automatically diagnose TMD at its initial stages. PMID:29732343
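As a rough illustration of the feature-extraction step described above, the following Python sketch combines an LBP histogram with HOG features per slice and reduces the stacked feature matrix with SVD. The skimage function names are real, but the parameter choices (P, R, cell sizes, number of retained singular vectors) are illustrative assumptions rather than the authors' settings.

```python
# Minimal sketch of the described pipeline: LBP histogram + HOG features,
# stacked per image, then reduced with SVD. Parameter values are illustrative.
import numpy as np
from skimage.feature import local_binary_pattern, hog

def lbp_hog_features(image, P=8, R=1.0):
    """Return a 1-D feature vector for one grayscale slice (2-D array)."""
    lbp = local_binary_pattern(image, P, R, method="uniform")
    # Histogram of LBP codes (P + 2 bins for the 'uniform' method).
    lbp_hist, _ = np.histogram(lbp, bins=P + 2, range=(0, P + 2), density=True)
    hog_vec = hog(image, orientations=9, pixels_per_cell=(16, 16),
                  cells_per_block=(2, 2), feature_vector=True)
    return np.concatenate([lbp_hist, hog_vec])

def reduce_with_svd(feature_matrix, k=20):
    """Project the (n_images x n_features) matrix onto its first k right singular vectors."""
    centered = feature_matrix - feature_matrix.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:k].T
```

The reduced feature matrix could then be fed to a K-NN classifier (e.g., scikit-learn's KNeighborsClassifier), mirroring the evaluation described in the abstract.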
Postmortem validation of breast density using dual-energy mammography
Molloi, Sabee; Ducote, Justin L.; Ding, Huanjun; Feig, Stephen A.
2014-01-01
Purpose: Mammographic density has been shown to be an indicator of breast cancer risk and also reduces the sensitivity of screening mammography. Currently, there is no accepted standard for measuring breast density. Dual energy mammography has been proposed as a technique for accurate measurement of breast density. The purpose of this study is to validate its accuracy in postmortem breasts and compare it with other existing techniques. Methods: Forty postmortem breasts were imaged using a dual energy mammography system. Glandular and adipose equivalent phantoms of uniform thickness were used to calibrate a dual energy basis decomposition algorithm. Dual energy decomposition was applied after scatter correction to calculate breast density. Breast density was also estimated using radiologist reader assessment, standard histogram thresholding and a fuzzy C-mean algorithm. Chemical analysis was used as the reference standard to assess the accuracy of different techniques to measure breast composition. Results: Breast density measurements using radiologist reader assessment, standard histogram thresholding, fuzzy C-mean algorithm, and dual energy were in good agreement with the measured fibroglandular volume fraction using chemical analysis. The standard error estimates using radiologist reader assessment, standard histogram thresholding, fuzzy C-mean, and dual energy were 9.9%, 8.6%, 7.2%, and 4.7%, respectively. Conclusions: The results indicate that dual energy mammography can be used to accurately measure breast density. The variability in breast density estimation using dual energy mammography was lower than reader assessment rankings, standard histogram thresholding, and fuzzy C-mean algorithm. Improved quantification of breast density is expected to further enhance its utility as a risk factor for breast cancer. PMID:25086548
Postmortem validation of breast density using dual-energy mammography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Molloi, Sabee, E-mail: symolloi@uci.edu; Ducote, Justin L.; Ding, Huanjun
2014-08-15
Purpose: Mammographic density has been shown to be an indicator of breast cancer risk and also reduces the sensitivity of screening mammography. Currently, there is no accepted standard for measuring breast density. Dual energy mammography has been proposed as a technique for accurate measurement of breast density. The purpose of this study is to validate its accuracy in postmortem breasts and compare it with other existing techniques. Methods: Forty postmortem breasts were imaged using a dual energy mammography system. Glandular and adipose equivalent phantoms of uniform thickness were used to calibrate a dual energy basis decomposition algorithm. Dual energy decomposition was applied after scatter correction to calculate breast density. Breast density was also estimated using radiologist reader assessment, standard histogram thresholding and a fuzzy C-mean algorithm. Chemical analysis was used as the reference standard to assess the accuracy of different techniques to measure breast composition. Results: Breast density measurements using radiologist reader assessment, standard histogram thresholding, fuzzy C-mean algorithm, and dual energy were in good agreement with the measured fibroglandular volume fraction using chemical analysis. The standard error estimates using radiologist reader assessment, standard histogram thresholding, fuzzy C-mean, and dual energy were 9.9%, 8.6%, 7.2%, and 4.7%, respectively. Conclusions: The results indicate that dual energy mammography can be used to accurately measure breast density. The variability in breast density estimation using dual energy mammography was lower than reader assessment rankings, standard histogram thresholding, and fuzzy C-mean algorithm. Improved quantification of breast density is expected to further enhance its utility as a risk factor for breast cancer.
Gray-level transformations for interactive image enhancement. M.S. Thesis. Final Technical Report
NASA Technical Reports Server (NTRS)
Fittes, B. A.
1975-01-01
A gray-level transformation method suitable for interactive image enhancement was presented. It is shown that the well-known histogram equalization approach is a special case of this method. A technique for improving the uniformity of a histogram is also developed. Experimental results which illustrate the capabilities of both algorithms are described. Two proposals for implementing gray-level transformations in a real-time interactive image enhancement system are also presented.
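As a minimal sketch of the idea that histogram equalization is one particular gray-level transformation, the following Python snippet applies an arbitrary 256-entry lookup table to an 8-bit image and builds the equalization table from the scaled cumulative histogram; the bin count and the 8-bit assumption are illustrative, not taken from the report.

```python
# Sketch: gray-level transformation applied via a 256-entry lookup table,
# with histogram equalization as the special case where the table is the
# scaled cumulative histogram. Assumes an 8-bit grayscale numpy array.
import numpy as np

def apply_gray_transform(image, lut):
    """Apply a gray-level mapping given as a 256-entry lookup table."""
    return lut[image]

def equalization_lut(image):
    hist, _ = np.histogram(image, bins=256, range=(0, 256))
    cdf = np.cumsum(hist).astype(np.float64)
    cdf /= cdf[-1]                      # normalize to [0, 1]
    return np.round(255 * cdf).astype(np.uint8)

# equalized = apply_gray_transform(img, equalization_lut(img))
```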
2013-01-01
Background The high variations of background luminance, low contrast and excessively enhanced contrast of hand bone radiographs often impede bone age assessment rating systems in evaluating the degree of development of epiphyseal plates and ossification centers. Global histogram equalization (GHE) has been the most frequently adopted image contrast enhancement technique, but its performance is not satisfying. A brightness- and detail-preserving histogram equalization method with a good contrast enhancement effect has been a goal of much recent research in histogram equalization. Nevertheless, producing a well-balanced histogram-equalized radiograph in terms of brightness preservation, detail preservation and contrast enhancement is a daunting task. Method In this paper, we propose a novel framework of histogram equalization that takes several desirable properties into account, namely the Multipurpose Beta Optimized Bi-Histogram Equalization (MBOBHE). This method performs histogram optimization separately in both sub-histograms after segmenting the histogram at an optimized separating point determined by a regularization function constituted by three components. The result is then assessed by qualitative and quantitative analyses evaluating the essential aspects of the histogram-equalized image, using a total of 160 hand radiographs acquired from an online hand bone database for testing and analysis. Result From the qualitative analysis, we found that basic bi-histogram equalizations are not capable of displaying small features in the image, due to incorrect selection of the separating point when focusing on only a certain metric without considering contrast enhancement and detail preservation. From the quantitative analysis, we found that MBOBHE correlates well with human visual perception, and this improvement shortens the evaluation time taken by the inspector in assessing bone age. Conclusions The proposed MBOBHE outperforms other existing methods regarding the comprehensive performance of histogram equalization. All features pertinent to bone age assessment are more prominent than with other methods; this shortens the evaluation time required in manual bone age assessment using the TW method, while accuracy remains unaffected or slightly better than with the unprocessed original image. The holistic properties in terms of brightness preservation, detail preservation and contrast enhancement are simultaneously taken into consideration, and thus the visual effect is conducive to manual inspection. PMID:23565999
Breast density quantification with cone-beam CT: A post-mortem study
Johnson, Travis; Ding, Huanjun; Le, Huy Q.; Ducote, Justin L.; Molloi, Sabee
2014-01-01
Forty post-mortem breasts were imaged with a flat-panel based cone-beam x-ray CT system at 50 kVp. The feasibility of breast density quantification has been investigated using standard histogram thresholding and an automatic segmentation method based on the fuzzy c-means algorithm (FCM). The breasts were chemically decomposed into water, lipid, and protein immediately after image acquisition was completed. The percent fibroglandular volume (%FGV) from chemical analysis was used as the gold standard for breast density comparison. Both image-based segmentation techniques showed good precision in breast density quantification with high linear coefficients between the right and left breast of each pair. When comparing with the gold standard using %FGV from chemical analysis, Pearson’s r-values were estimated to be 0.983 and 0.968 for the FCM clustering and the histogram thresholding techniques, respectively. The standard error of the estimate (SEE) was also reduced from 3.92% to 2.45% by applying the automatic clustering technique. The results of the postmortem study suggested that breast tissue can be characterized in terms of water, lipid and protein contents with high accuracy by using chemical analysis, which offers a gold standard for breast density studies comparing different techniques. In the investigated image segmentation techniques, the FCM algorithm had high precision and accuracy in breast density quantification. In comparison to conventional histogram thresholding, it was more efficient and reduced inter-observer variation. PMID:24254317
Molloi, Sabee; Ding, Huanjun; Feig, Stephen
2015-01-01
Purpose The purpose of this study was to compare the precision of mammographic breast density measurement using radiologist reader assessment, histogram threshold segmentation, fuzzy C-mean segmentation and spectral material decomposition. Materials and Methods Spectral mammography images from a total of 92 consecutive asymptomatic women (50–69 years old) who presented for annual screening mammography were retrospectively analyzed for this study. Breast density was estimated using assessments by 10 radiologist readers, standard histogram thresholding, a fuzzy C-mean algorithm and spectral material decomposition. The breast density correlation between left and right breasts was used to assess the precision of these techniques to measure breast composition relative to dual-energy material decomposition. Results In comparison to the other techniques, breast density measurements using dual-energy material decomposition showed the highest left-right correlation. The relative standard error of estimate for breast density measurements from left and right breasts using radiologist reader assessment, standard histogram thresholding, the fuzzy C-mean algorithm and dual-energy material decomposition was calculated to be 1.95, 2.87, 2.07 and 1.00, respectively. Conclusion The results indicate that the precision of dual-energy material decomposition was approximately a factor of two higher than that of the other techniques, as reflected in the better correlation of breast density measurements from the right and left breasts. PMID:26031229
Differential diagnosis of normal pressure hydrocephalus by MRI mean diffusivity histogram analysis.
Ivkovic, M; Liu, B; Ahmed, F; Moore, D; Huang, C; Raj, A; Kovanlikaya, I; Heier, L; Relkin, N
2013-01-01
Accurate diagnosis of normal pressure hydrocephalus is challenging because the clinical symptoms and radiographic appearance of NPH often overlap those of other conditions, including age-related neurodegenerative disorders such as Alzheimer and Parkinson diseases. We hypothesized that radiologic differences between NPH and AD/PD can be characterized by a robust and objective MR imaging DTI technique that does not require intersubject image registration or operator-defined regions of interest, thus avoiding many pitfalls common in DTI methods. We collected 3T DTI data from 15 patients with probable NPH and 25 controls with AD, PD, or dementia with Lewy bodies. We developed a parametric model for the shape of intracranial mean diffusivity histograms that separates brain and ventricular components from a third component composed mostly of partial volume voxels. To accurately fit the shape of the third component, we constructed a parametric function named the generalized Voss-Dyke function. We then examined the use of the fitting parameters for the differential diagnosis of NPH from AD, PD, and DLB. Using parameters for the MD histogram shape, we distinguished clinically probable NPH from the 3 other disorders with 86% sensitivity and 96% specificity. The technique yielded 86% sensitivity and 88% specificity when differentiating NPH from AD only. An adequate parametric model for the shape of intracranial MD histograms can distinguish NPH from AD, PD, or DLB with high sensitivity and specificity.
Wan Ismail, W Z; Sim, K S; Tso, C P; Ting, H Y
2011-01-01
To reduce undesirable charging effects in scanning electron microscope images, Rayleigh contrast stretching is developed and employed. First, re-scaling is performed on the input image histograms with the Rayleigh algorithm. Then, contrast stretching or contrast adjustment is implemented to improve the images while reducing the contrast charging artifacts. This technique has been compared to some existing histogram equalization (HE) extension techniques: recursive sub-image HE, contrast stretching dynamic HE, multipeak HE and recursive mean separate HE. Other post-processing methods, such as the wavelet approach, spatial filtering, and exponential contrast stretching, are compared as well. Overall, the proposed method produces better image compensation in reducing charging artifacts. Copyright © 2011 Wiley Periodicals, Inc.
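A hedged sketch of the two-stage idea described above, assuming 8-bit SEM images: gray levels are first remapped so the histogram approximately follows a Rayleigh-shaped target (plain CDF-based histogram specification), then a percentile-based contrast stretch is applied. The Rayleigh scale sigma and the percentile cutoffs are illustrative assumptions, not the published settings.

```python
# Hedged sketch: (1) remap gray levels toward a Rayleigh-shaped histogram,
# (2) percentile-based contrast stretching. Parameter values are illustrative.
import numpy as np

def rayleigh_remap(image, sigma=80.0):
    hist, _ = np.histogram(image, bins=256, range=(0, 256))
    cdf = np.cumsum(hist) / hist.sum()
    # Histogram specification: push the source CDF through the inverse CDF of a
    # Rayleigh distribution with scale 'sigma' (in gray-level units).
    target = sigma * np.sqrt(-2.0 * np.log(1.0 - np.clip(cdf, 0.0, 0.999)))
    return np.clip(target, 0, 255).astype(np.uint8)[image]

def contrast_stretch(image, low_pct=1, high_pct=99):
    lo, hi = np.percentile(image, [low_pct, high_pct])
    out = (image.astype(np.float64) - lo) / max(hi - lo, 1e-6)
    return np.clip(255 * out, 0, 255).astype(np.uint8)

# enhanced = contrast_stretch(rayleigh_remap(sem_image))
```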
NASA Technical Reports Server (NTRS)
Welch, Ronald M.
1996-01-01
The ASTER polar cloud mask algorithm is currently under development. Several classification techniques have been developed and implemented. The merits and accuracy of each are being examined. The classification techniques under investigation include fuzzy logic, hierarchical neural network, and a pairwise histogram comparison scheme based on sample histograms called the Paired Histogram Method. Scene adaptive methods also are being investigated as a means to improve classifier performance. The feature, arctan of Band 4 and Band 5, and the Band 2 vs. Band 4 feature space are key to separating frozen water (e.g., ice/snow, slush/wet ice, etc.) from cloud over frozen water, and land from cloud over land, respectively. A total of 82 Landsat TM circumpolar scenes are being used as a basis for algorithm development and testing. Numerous spectral features are being tested and include the 7 basic Landsat TM bands, in addition to ratios, differences, arctans, and normalized differences of each combination of bands. A technique for deriving cloud base and top height is developed. It uses 2-D cross correlation between a cloud edge and its corresponding shadow to determine the displacement of the cloud from its shadow. The height is then determined from this displacement, the solar zenith angle, and the sensor viewing angle.
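For the cloud-height step described above, a minimal sketch of the geometry is given below; it assumes the cloud-to-shadow displacement is measured along the solar azimuth and that the sun and sensor lie in the same vertical plane, so the height follows from the two zenith angles (the sign of the viewing term depends on the actual imaging geometry).

```python
# Minimal sketch of height retrieval from the measured cloud-to-shadow
# displacement. Assumes displacement measured along the solar azimuth and
# sun/sensor in the same vertical plane:
#   h = d / (tan(theta_sun) + tan(theta_view))
import numpy as np

def cloud_height(displacement_m, solar_zenith_deg, view_zenith_deg=0.0):
    ts = np.tan(np.radians(solar_zenith_deg))
    tv = np.tan(np.radians(view_zenith_deg))
    return displacement_m / (ts + tv)

# e.g. a 2400 m shadow offset at 60 deg solar zenith, nadir view -> ~1386 m
```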
The Utility of Using a Near-Infrared (NIR) Camera to Measure Beach Surface Moisture
NASA Astrophysics Data System (ADS)
Nelson, S.; Schmutz, P. P.
2017-12-01
Surface moisture content is an important factor that must be considered when studying aeolian sediment transport in a beach environment. A few different instruments and procedures are available for measuring surface moisture content (i.e., moisture probes, LiDAR, and gravimetric moisture data from surface scrapings); however, these methods can be inaccurate, costly, and inapplicable, particularly in the field. Near-infrared (NIR) spectral band imagery is another technique used to obtain moisture data. NIR imagery has been predominantly used through remote sensing and has yet to be used for ground-based measurements. Dry sand reflects infrared radiation given off by the sun, while wet sand absorbs IR radiation. This study therefore assesses the utility of measuring the surface moisture content of beach sand with a modified NIR camera. A traditional point-and-shoot digital camera was internally modified with the placement of a visible-light-blocking filter. Images were taken of three different types of beach sand at controlled moisture content values, with sunlight as the source of infrared radiation. A technique was established through trial and error by comparing histogram values, obtained in Adobe Photoshop, across the various moisture conditions. The resultant IR absorption histogram values were calibrated to actual gravimetric moisture content from surface scrapings of the samples. Overall, the results illustrate that the NIR-modified camera does not provide the ability to adequately measure beach surface moisture content. However, there were noted differences in IR absorption histogram values among the different sediment types. Sediment with darker quartz mineralogy provided larger variations in histogram values, but the technique is not sensitive enough to accurately represent low moisture percentages, which are of most importance when studying aeolian sediment transport.
Texton-based analysis of paintings
NASA Astrophysics Data System (ADS)
van der Maaten, Laurens J. P.; Postma, Eric O.
2010-08-01
The visual examination of paintings is traditionally performed by skilled art historians using their eyes. Recent advances in intelligent systems may support art historians in determining the authenticity or date of creation of paintings. In this paper, we propose a technique for the examination of brushstroke structure that views the wildly overlapping brushstrokes as texture. The analysis of the painting texture is performed with the help of a texton codebook, i.e., a codebook of small prototypical textural patches. The texton codebook can be learned from a collection of paintings. Our textural analysis technique represents paintings in terms of histograms that measure the frequency with which the textons in the codebook occur in the painting (so-called texton histograms). We present experiments that show the validity and effectiveness of our technique for textural analysis on a collection of digitized high-resolution reproductions of paintings by Van Gogh and his contemporaries. As texton histograms cannot easily be interpreted by art experts, the paper proposes two approaches to visualize the results of the textural analysis. The first approach visualizes the similarities between the histogram representations of paintings by employing a recently proposed dimensionality reduction technique, called t-SNE. We show that t-SNE reveals a clear separation of paintings created by Van Gogh from those created by other painters. In addition, the period of creation is faithfully reflected in the t-SNE visualizations. The second approach visualizes the similarities and differences between paintings by highlighting regions in a painting in which the textural structure of the painting is unusual. We illustrate the validity of this approach by means of an experiment in which we highlight regions in a painting by Monet that are not very "Van Gogh-like". Taken together, we believe the tools developed in this study are well suited to assisting art historians in their study of paintings.
Statistical density modification using local pattern matching
Terwilliger, Thomas C.
2007-01-23
A computer implemented method modifies an experimental electron density map. A set of selected known experimental and model electron density maps is provided, and standard templates of electron density are created from the selected experimental and model electron density maps by clustering and averaging values of electron density in a spherical region about each point in the grid that defines each selected map. Histograms are also created from the selected experimental and model electron density maps that relate the value of electron density at the center of each of the spherical regions to a correlation coefficient of the density surrounding each corresponding grid point in each one of the standard templates. The standard templates and the histograms are applied to grid points on the experimental electron density map to form new estimates of electron density at each grid point in the experimental electron density map.
NASA Astrophysics Data System (ADS)
McCann, C.; Repasky, K. S.; Morin, M.; Lawrence, R. L.; Powell, S. L.
2016-12-01
Compact, cost-effective, flight-based hyperspectral imaging systems can provide scientifically relevant data over large areas for a variety of applications such as ecosystem studies, precision agriculture, and land management. To fully realize this capability, unsupervised classification techniques based on radiometrically-calibrated data that cluster based on biophysical similarity rather than simply spectral similarity are needed. An automated technique to produce high-resolution, large-area, radiometrically-calibrated hyperspectral data sets, using the Landsat surface reflectance data product as a calibration target, was developed and applied to three subsequent years of data covering approximately 1850 hectares. The radiometrically-calibrated data allow inter-comparison of the temporal series. Advantages of the radiometric calibration technique include the need for minimal site access, no ancillary instrumentation, and automated processing. Fitting the reflectance spectra of each pixel using a set of biophysically relevant basis functions reduces the data from 80 spectral bands to 9 parameters, providing noise reduction and data compression. Examination of histograms of these parameters allows for determination of a natural splitting into biophysically similar clusters. This method creates clusters that are similar in terms of biophysical parameters, not simply spectral proximity. Furthermore, this method can be applied to other data sets, such as urban scenes, by developing other physically meaningful basis functions. The ability to use hyperspectral imaging for a variety of important applications requires the development of data processing techniques that can be automated. The radiometric calibration combined with the histogram-based unsupervised classification technique presented here provides one potential avenue for managing the big data associated with hyperspectral imaging.
Generalized expectation-maximization segmentation of brain MR images
NASA Astrophysics Data System (ADS)
Devalkeneer, Arnaud A.; Robe, Pierre A.; Verly, Jacques G.; Phillips, Christophe L. M.
2006-03-01
Manual segmentation of medical images is impractical because it is time consuming, not reproducible, and prone to human error. It is also very difficult to take into account the 3D nature of the images. Thus, semi- or fully-automatic methods are of great interest. Current segmentation algorithms based on an Expectation-Maximization (EM) procedure present some limitations. The algorithm by Ashburner et al., 2005, does not allow multichannel inputs, e.g. two MR images of different contrast, and does not use spatial constraints between adjacent voxels, e.g. Markov random field (MRF) constraints. The solution of Van Leemput et al., 1999, employs a simplified model (mixture coefficients are not estimated and only one Gaussian is used per tissue class, with three for the image background). We have thus implemented an algorithm that combines the features of these two approaches: multichannel inputs, intensity bias correction, a multi-Gaussian histogram model, and Markov random field (MRF) constraints. Our proposed method classifies tissues in three iterative main stages by way of a Generalized-EM (GEM) algorithm: (1) estimation of the Gaussian parameters modeling the histogram of the images, (2) correction of image intensity non-uniformity, and (3) modification of prior classification knowledge by MRF techniques. The goal of the GEM algorithm is to maximize the log-likelihood across the classes and voxels. Our segmentation algorithm was validated on synthetic data (with the Dice metric criterion) and real data (by a neurosurgeon) and compared to the original algorithms by Ashburner et al. and Van Leemput et al. Our combined approach leads to more robust and accurate segmentation.
Statistical normalization techniques for magnetic resonance imaging.
Shinohara, Russell T; Sweeney, Elizabeth M; Goldsmith, Jeff; Shiee, Navid; Mateen, Farrah J; Calabresi, Peter A; Jarso, Samson; Pham, Dzung L; Reich, Daniel S; Crainiceanu, Ciprian M
2014-01-01
While computed tomography and other imaging techniques are measured in absolute units with physical meaning, magnetic resonance images are expressed in arbitrary units that are difficult to interpret and differ between study visits and subjects. Much work in the image processing literature on intensity normalization has focused on histogram matching and other histogram mapping techniques, with little emphasis on normalizing images to have biologically interpretable units. Furthermore, there are no formalized principles or goals for the crucial comparability of image intensities within and across subjects. To address this, we propose a set of criteria necessary for the normalization of images. We further propose simple and robust biologically motivated normalization techniques for multisequence brain imaging that have the same interpretation across acquisitions and satisfy the proposed criteria. We compare the performance of different normalization methods in thousands of images of patients with Alzheimer's disease, hundreds of patients with multiple sclerosis, and hundreds of healthy subjects obtained in several different studies at dozens of imaging centers.
Sun, Xiaofei; Shi, Lin; Luo, Yishan; Yang, Wei; Li, Hongpeng; Liang, Peipeng; Li, Kuncheng; Mok, Vincent C T; Chu, Winnie C W; Wang, Defeng
2015-07-28
Intensity normalization is an important preprocessing step in brain magnetic resonance image (MRI) analysis. During MR image acquisition, different scanners or parameters may be used for scanning different subjects or the same subject at different times, which can result in large intensity variations. This intensity variation greatly undermines the performance of subsequent MRI processing and population analysis, such as image registration, segmentation, and tissue volume measurement. In this work, we propose a new histogram normalization method to reduce the intensity variation between MRIs obtained from different acquisitions. In our experiment, we scanned each subject twice on two different scanners using different imaging parameters. With noise estimation, the image with the lower noise level was determined and treated as the high-quality reference image. Then the histogram of the low-quality image was normalized to the histogram of the high-quality image. The normalization algorithm includes two main steps: (1) intensity scaling (IS), where, for the high-quality reference image, the intensities of the image are first rescaled to a range between the low intensity region (LIR) value and the high intensity region (HIR) value; and (2) histogram normalization (HN), where the histogram of the low-quality input image is stretched to match the histogram of the reference image, so that the intensity range in the normalized image also lies between LIR and HIR. We performed three sets of experiments to evaluate the proposed method, i.e., image registration, segmentation, and tissue volume measurement, and compared it with an existing intensity normalization method. Our histogram normalization framework achieved better results in all the experiments, and the brain template built with normalization preprocessing is of higher quality than the template built without normalization. In summary, we have proposed a histogram-based MRI intensity normalization method that can normalize scans acquired on different MRI units; we have validated that it can greatly improve image analysis performance, and demonstrated that, with the help of our normalization method, a higher quality Chinese brain template can be created.
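A minimal sketch of the two steps named above (intensity scaling of the reference, then histogram normalization of the input toward the reference) is given below in Python; the LIR/HIR values are illustrative assumptions, and the matching step is plain CDF-based histogram specification rather than the authors' exact implementation.

```python
# Sketch: rescale the reference image to [LIR, HIR], then remap the input
# (low-quality) image so its cumulative histogram follows the reference's.
import numpy as np

def rescale_reference(ref, lir=0.0, hir=255.0):
    r = ref.astype(np.float64)
    return lir + (hir - lir) * (r - r.min()) / max(r.max() - r.min(), 1e-9)

def match_histogram(src, ref):
    s_vals, s_counts = np.unique(src.ravel(), return_counts=True)
    r_vals, r_counts = np.unique(ref.ravel(), return_counts=True)
    s_quant = np.cumsum(s_counts).astype(np.float64) / src.size
    r_quant = np.cumsum(r_counts).astype(np.float64) / ref.size
    # For each source quantile, look up the reference value at the same quantile.
    mapped_vals = np.interp(s_quant, r_quant, r_vals)
    return np.interp(src.ravel(), s_vals, mapped_vals).reshape(src.shape)

# normalized = match_histogram(low_quality_img, rescale_reference(high_quality_img))
```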
Evaluation of pulmonary function using single-breath-hold dual-energy computed tomography with xenon
Kyoyama, Hiroyuki; Hirata, Yusuke; Kikuchi, Satoshi; Sakai, Kosuke; Saito, Yuriko; Mikami, Shintaro; Moriyama, Gaku; Yanagita, Hisami; Watanabe, Wataru; Otani, Katharina; Honda, Norinari; Uematsu, Kazutsugu
2017-01-01
Abstract Xenon-enhanced dual-energy computed tomography (xenon-enhanced CT) can provide lung ventilation maps that may be useful for assessing structural and functional abnormalities of the lung. Xenon-enhanced CT has been performed using a multiple-breath-hold technique during xenon washout. We recently developed xenon-enhanced CT using a single-breath-hold technique to assess ventilation. We sought to evaluate whether xenon-enhanced CT using a single-breath-hold technique correlates with pulmonary function testing (PFT) results. Twenty-six patients, including 11 chronic obstructive pulmonary disease (COPD) patients, underwent xenon-enhanced CT and PFT. Three of the COPD patients underwent xenon-enhanced CT before and after bronchodilator treatment. Images from xenon-CT were obtained by dual-source CT during a breath-hold after a single vital-capacity inspiration of a xenon–oxygen gas mixture. Image postprocessing by 3-material decomposition generated conventional CT and xenon-enhanced images. Low-attenuation areas on xenon images matched low-attenuation areas on conventional CT in 21 cases but matched normal-attenuation areas in 5 cases. Volumes of Hounsfield unit (HU) histograms of xenon images correlated moderately and highly with vital capacity (VC) and total lung capacity (TLC), respectively (r = 0.68 and 0.85). Means and modes of histograms weakly correlated with VC (r = 0.39 and 0.38), moderately with forced expiratory volume in 1 second (FEV1) (r = 0.59 and 0.56), weakly with the ratio of FEV1 to FVC (r = 0.46 and 0.42), and moderately with the ratio of FEV1 to its predicted value (r = 0.64 and 0.60). Mode and volume of histograms increased in 2 COPD patients after the improvement of FEV1 with bronchodilators. Inhalation of xenon gas caused no adverse effects. Xenon-enhanced CT using a single-breath-hold technique depicted functional abnormalities not detectable on thin-slice CT. Mode, mean, and volume of HU histograms of xenon images reflected pulmonary function. Xenon images obtained with xenon-enhanced CT using a single-breath-hold technique can qualitatively depict pulmonary ventilation. A larger study comprising only COPD patients should be conducted, as xenon-enhanced CT is expected to be a promising technique for the management of COPD. PMID:28099359
Kyoyama, Hiroyuki; Hirata, Yusuke; Kikuchi, Satoshi; Sakai, Kosuke; Saito, Yuriko; Mikami, Shintaro; Moriyama, Gaku; Yanagita, Hisami; Watanabe, Wataru; Otani, Katharina; Honda, Norinari; Uematsu, Kazutsugu
2017-01-01
Xenon-enhanced dual-energy computed tomography (xenon-enhanced CT) can provide lung ventilation maps that may be useful for assessing structural and functional abnormalities of the lung. Xenon-enhanced CT has been performed using a multiple-breath-hold technique during xenon washout. We recently developed xenon-enhanced CT using a single-breath-hold technique to assess ventilation. We sought to evaluate whether xenon-enhanced CT using a single-breath-hold technique correlates with pulmonary function testing (PFT) results. Twenty-six patients, including 11 chronic obstructive pulmonary disease (COPD) patients, underwent xenon-enhanced CT and PFT. Three of the COPD patients underwent xenon-enhanced CT before and after bronchodilator treatment. Images from xenon-CT were obtained by dual-source CT during a breath-hold after a single vital-capacity inspiration of a xenon-oxygen gas mixture. Image postprocessing by 3-material decomposition generated conventional CT and xenon-enhanced images. Low-attenuation areas on xenon images matched low-attenuation areas on conventional CT in 21 cases but matched normal-attenuation areas in 5 cases. Volumes of Hounsfield unit (HU) histograms of xenon images correlated moderately and highly with vital capacity (VC) and total lung capacity (TLC), respectively (r = 0.68 and 0.85). Means and modes of histograms weakly correlated with VC (r = 0.39 and 0.38), moderately with forced expiratory volume in 1 second (FEV1) (r = 0.59 and 0.56), weakly with the ratio of FEV1 to FVC (r = 0.46 and 0.42), and moderately with the ratio of FEV1 to its predicted value (r = 0.64 and 0.60). Mode and volume of histograms increased in 2 COPD patients after the improvement of FEV1 with bronchodilators. Inhalation of xenon gas caused no adverse effects. Xenon-enhanced CT using a single-breath-hold technique depicted functional abnormalities not detectable on thin-slice CT. Mode, mean, and volume of HU histograms of xenon images reflected pulmonary function. Xenon images obtained with xenon-enhanced CT using a single-breath-hold technique can qualitatively depict pulmonary ventilation. A larger study comprising only COPD patients should be conducted, as xenon-enhanced CT is expected to be a promising technique for the management of COPD.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Teng, C; Yin, L; Ainsley, C
2015-06-15
Purpose: To characterize the changes in Hounsfield unit (HU) in lung radiotherapy with proton beams during the course of treatment and to study the effect on the proton plan dose distribution. Methods: Twenty consecutive patients with non-small cell lung cancer treated with proton radiotherapy who underwent multiple CT scans including the planning CT and weekly verification CTs were studied. HU histograms were computed for irradiated lung volumes in beam paths for all scans using the same treatment plan. Histograms for un-irradiated lung volume were used as control to characterize inter-scan variations. HU statistics were calculated for both irradiated and un-irradiated lung volumes for each patient scan. Further, multiple CT scans based on the same planning CT were generated by replacing the HU of the lung based on the verification CT scans HU values. Using the same beam arrangement, we created plans for each of the altered CT scans to study the dosimetric effect using the dose volume histogram. Results: Lung HU decreased for irradiated lung volume during the course of radiotherapy. The magnitude of this change increased with total irradiation dose. On average, HU changed by −53.8 in the irradiated volume. This change resulted in less than 0.5mm of beam overshoot in tissue for every 1cm beam traversed in the irradiated lung. The dose modification is about +3% for the lung, and less than +1% for the primary tumor. Conclusion: HU of the lung decrease throughout the course of radiation therapy. This change results in a beam overshoot (e.g. 3mm for 6cm of lung traversed) and causes a small dose modification in the overall plan. However, this overshoot does not affect the quality of plans since the margins used in planning, based on proton range uncertainty, are greater. HU needs to change by 150 units before re-planning is warranted.
A Bio Medical Waste Identification and Classification Algorithm Using Mltrp and Rvm.
Achuthan, Aravindan; Ayyallu Madangopal, Vasumathi
2016-10-01
We aimed to extract histogram features for texture analysis and to classify the types of Bio Medical Waste (BMW) for garbage disposal and management. The given BMW image was preprocessed using a median filtering technique that efficiently reduced the noise in the image. After that, the histogram features of the filtered image were extracted with the help of the proposed Modified Local Tetra Pattern (MLTrP) technique. Finally, a Relevance Vector Machine (RVM) was used to classify the BMW into human body parts, plastics, cotton and liquids. The BMW images were collected from a garbage image dataset for analysis. The performance of the proposed BMW identification and classification system was evaluated in terms of sensitivity, specificity, classification rate and accuracy with the help of MATLAB. When compared to the existing techniques, the proposed technique provided better results. This work proposes a new texture analysis and classification technique for BMW management and disposal. It can be used in many real-time applications such as hospital and healthcare management systems for proper BMW disposal.
Multi-scale Morphological Image Enhancement of Chest Radiographs by a Hybrid Scheme.
Alavijeh, Fatemeh Shahsavari; Mahdavi-Nasab, Homayoun
2015-01-01
Chest radiography is a common diagnostic imaging test, which contains an enormous amount of information about a patient. However, its interpretation is highly challenging. The accuracy of the diagnostic process is greatly influenced by image processing algorithms; hence enhancement of the images is indispensable in order to improve visibility of the details. This paper aims at improving radiograph parameters such as contrast, sharpness, noise level, and brightness to enhance chest radiographs, making use of a triangulation method. Here, contrast limited adaptive histogram equalization technique and noise suppression are simultaneously performed in wavelet domain in a new scheme, followed by morphological top-hat and bottom-hat filtering. A unique implementation of morphological filters allows for adjustment of the image brightness and significant enhancement of the contrast. The proposed method is tested on chest radiographs from Japanese Society of Radiological Technology database. The results are compared with conventional enhancement techniques such as histogram equalization, contrast limited adaptive histogram equalization, Retinex, and some recently proposed methods to show its strengths. The experimental results reveal that the proposed method can remarkably improve the image contrast while keeping the sensitive chest tissue information so that radiologists might have a more precise interpretation.
Multi-scale Morphological Image Enhancement of Chest Radiographs by a Hybrid Scheme
Alavijeh, Fatemeh Shahsavari; Mahdavi-Nasab, Homayoun
2015-01-01
Chest radiography is a common diagnostic imaging test, which contains an enormous amount of information about a patient. However, its interpretation is highly challenging. The accuracy of the diagnostic process is greatly influenced by image processing algorithms; hence enhancement of the images is indispensable in order to improve visibility of the details. This paper aims at improving radiograph parameters such as contrast, sharpness, noise level, and brightness to enhance chest radiographs, making use of a triangulation method. Here, contrast limited adaptive histogram equalization technique and noise suppression are simultaneously performed in wavelet domain in a new scheme, followed by morphological top-hat and bottom-hat filtering. A unique implementation of morphological filters allows for adjustment of the image brightness and significant enhancement of the contrast. The proposed method is tested on chest radiographs from Japanese Society of Radiological Technology database. The results are compared with conventional enhancement techniques such as histogram equalization, contrast limited adaptive histogram equalization, Retinex, and some recently proposed methods to show its strengths. The experimental results reveal that the proposed method can remarkably improve the image contrast while keeping the sensitive chest tissue information so that radiologists might have a more precise interpretation. PMID:25709942
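A hedged sketch of such a hybrid scheme, using scikit-image building blocks: wavelet-domain denoising, CLAHE, and morphological top-hat/bottom-hat filtering. The combination rule (add the top-hat response, subtract the bottom-hat response) and all parameter values are illustrative assumptions rather than the authors' exact pipeline.

```python
# Hedged sketch of a CLAHE + wavelet denoising + top-hat/bottom-hat scheme.
import numpy as np
from skimage import exposure, restoration, morphology

def enhance_radiograph(img, clip_limit=0.02, radius=15):
    x = img.astype(np.float64)
    x = (x - x.min()) / max(x.max() - x.min(), 1e-9)            # scale to [0, 1]
    x = restoration.denoise_wavelet(x)                          # noise suppression
    x = exposure.equalize_adapthist(x, clip_limit=clip_limit)   # CLAHE
    selem = morphology.disk(radius)
    tophat = morphology.white_tophat(x, selem)                  # bright details
    bothat = morphology.black_tophat(x, selem)                  # dark details
    return np.clip(x + tophat - bothat, 0.0, 1.0)
```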
Modifications to Improve Data Acquisition and Analysis for Camouflage Design
1983-01-01
terrains into facsimiles of the original scenes in 3, 4, or 5 colors in CIELAB notation. Tasks that were addressed included optimization of the... a histogram algorithm (HIST) was used as a first step in the clustering of the CIELAB values of the scene pixels. This algorithm is highly efficient... however, an optimal process and the CIELAB coordinates of the final color domains can be influenced by the color coordinate increments used in the...
Histogram contrast analysis and the visual segregation of IID textures.
Chubb, C; Econopouly, J; Landy, M S
1994-09-01
A new psychophysical methodology is introduced, histogram contrast analysis, that allows one to measure stimulus transformations, f, used by the visual system to draw distinctions between different image regions. The method involves the discrimination of images constructed by selecting texture micropatterns randomly and independently (across locations) on the basis of a given micropattern histogram. Different components of f are measured by use of different component functions to modulate the micropattern histogram until the resulting textures are discriminable. When no discrimination threshold can be obtained for a given modulating component function, a second titration technique may be used to measure the contribution of that component to f. The method includes several strong tests of its own assumptions. An example is given of the method applied to visual textures composed of small, uniform squares with randomly chosen gray levels. In particular, for a fixed mean gray level mu and a fixed gray-level variance sigma^2, histogram contrast analysis is used to establish that the class S of all textures composed of small squares with jointly independent, identically distributed gray levels with mean mu and variance sigma^2 is perceptually elementary in the following sense: there exists a single, real-valued function f_S of gray level, such that two textures I and J in S are discriminable only if the average value of f_S applied to the gray levels in I is significantly different from the average value of f_S applied to the gray levels in J. Finally, histogram contrast analysis is used to obtain a seventh-order polynomial approximation of f_S.
Asking the Right Questions: Techniques for Collaboration and School Change. 2nd Edition.
ERIC Educational Resources Information Center
Holcomb, Edie L.
This work provides school change leaders with tools, techniques, tips, examples, illustrations, and stories about promoting school change. Tools provided include histograms, surveys, run charts, weighted voting, force-field analysis, decision matrices, and many others. Chapter 1, "Introduction," applies a matrix for asking questions…
High frequency measurements of shot noise suppression in atomic-scale metal contacts
NASA Astrophysics Data System (ADS)
Wheeler, Patrick J.; Evans, Kenneth; Russom, Jeffrey; King, Nicholas; Natelson, Douglas
2009-03-01
Shot noise provides a means of assessing the number and transmission coefficients of transmitting channels in atomic- and molecular-scale junctions. Previous experiments at low temperatures in metal and semiconductor point contacts have demonstrated the expected suppression of shot noise when junction conductance is near an integer multiple of the conductance quantum, G0≡2e^2/h. Using high frequency techniques, we demonstrate the high speed acquisition of such data at room temperature in mechanical break junctions. In clean Au contacts conductance histograms with clear peaks at G0, 2G0, and 3G0 are acquired within hours, and histograms of simultaneous measurements of the shot noise show clear suppression at those conductance values. We describe the dependence of the noise on bias voltage and analyze the noise vs. conductance histograms in terms of a model that averages over transmission coefficients.
Multi-stream LSTM-HMM decoding and histogram equalization for noise robust keyword spotting.
Wöllmer, Martin; Marchi, Erik; Squartini, Stefano; Schuller, Björn
2011-09-01
Highly spontaneous, conversational, and potentially emotional and noisy speech is known to be a challenge for today's automatic speech recognition (ASR) systems, which highlights the need for advanced algorithms that improve speech features and models. Histogram Equalization is an efficient method to reduce the mismatch between clean and noisy conditions by normalizing all moments of the probability distribution of the feature vector components. In this article, we propose to combine histogram equalization and multi-condition training for robust keyword detection in noisy speech. To better cope with conversational speaking styles, we show how contextual information can be effectively exploited in a multi-stream ASR framework that dynamically models context-sensitive phoneme estimates generated by a long short-term memory neural network. The proposed techniques are evaluated on the SEMAINE database, a corpus containing emotionally colored conversations with a cognitive system for "Sensitive Artificial Listening".
SU-F-I-45: An Automated Technique to Measure Image Contrast in Clinical CT Images
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanders, J; Abadi, E; Meng, B
Purpose: To develop and validate an automated technique for measuring image contrast in chest computed tomography (CT) exams. Methods: An automated computer algorithm was developed to measure the distribution of Hounsfield units (HUs) inside four major organs: the lungs, liver, aorta, and bones. These organs were first segmented or identified using computer vision and image processing techniques. Regions of interest (ROIs) were automatically placed inside the lungs, liver, and aorta and histograms of the HUs inside the ROIs were constructed. The mean and standard deviation of each histogram were computed for each CT dataset. Comparison of the mean and standard deviation of the HUs in the different organs provides different contrast values. The ROI for the bones is simply the segmentation mask of the bones. Since the histogram for bones does not follow a Gaussian distribution, the 25th and 75th percentile were computed instead of the mean. The sensitivity and accuracy of the algorithm was investigated by comparing the automated measurements with manual measurements. Fifteen contrast enhanced and fifteen non-contrast enhanced chest CT clinical datasets were examined in the validation procedure. Results: The algorithm successfully measured the histograms of the four organs in both contrast and non-contrast enhanced chest CT exams. The automated measurements were in agreement with manual measurements. The algorithm has sufficient sensitivity as indicated by the near unity slope of the automated versus manual measurement plots. Furthermore, the algorithm has sufficient accuracy as indicated by the high coefficient of determination, R2, values ranging from 0.879 to 0.998. Conclusion: Patient-specific image contrast can be measured from clinical datasets. The algorithm can be run on both contrast enhanced and non-enhanced clinical datasets. The method can be applied to automatically assess the contrast characteristics of clinical chest CT images and quantify dependencies that may not be captured in phantom data.
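Assuming the organ masks have already been produced by the segmentation step, the measurement itself reduces to collecting HU statistics inside each mask; the short Python sketch below illustrates that step, with the contrast reported as a difference of organ means (an assumption about the exact contrast definition used).

```python
# Minimal sketch: given binary masks for the segmented organs, collect HU
# summary statistics and report a simple contrast as a difference of means.
import numpy as np

def hu_stats(ct_volume, mask):
    vals = ct_volume[mask]
    return {"mean": float(vals.mean()), "std": float(vals.std()),
            "p25": float(np.percentile(vals, 25)),
            "p75": float(np.percentile(vals, 75))}

def organ_contrast(ct_volume, mask_a, mask_b):
    return hu_stats(ct_volume, mask_a)["mean"] - hu_stats(ct_volume, mask_b)["mean"]

# e.g. aorta-to-liver contrast: organ_contrast(ct, aorta_mask, liver_mask)
```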
Scaling images using their background ratio. An application in statistical comparisons of images.
Kalemis, A; Binnie, D; Bailey, D L; Flower, M A; Ott, R J
2003-06-07
Comparison of two medical images often requires image scaling as a pre-processing step. This is usually done with the scaling-to-the-mean or scaling-to-the-maximum techniques, which, in quantitative applications, may under certain circumstances contribute a significant amount of bias. In this paper, we present a simple scaling method which assumes only that the most predominant values in the corresponding images belong to their background structure. The ratio of the two images to be compared is calculated and its frequency histogram is plotted. The scaling factor is given by the position of the peak in this histogram which belongs to the background structure. The method was tested against the traditional scaling-to-the-mean technique on simulated planar gamma-camera images which were compared using pixelwise statistical parametric tests. Both sensitivity and specificity for each condition were measured over a range of different contrasts and sizes of inhomogeneity for the two scaling techniques. The new method was found to preserve sensitivity in all cases, while the traditional technique resulted in significant degradation of sensitivity in certain cases.
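A minimal Python sketch of the described scaling, assuming two co-registered images as numpy arrays: form the pixelwise ratio, histogram it, and take the location of the dominant (background) peak as the scaling factor. The bin count and the small-value cutoff used to avoid division by near-zero pixels are assumptions.

```python
# Sketch: background-ratio scaling factor from the peak of the ratio histogram.
import numpy as np

def background_ratio_scale(img_a, img_b, bins=256, eps=1e-6):
    valid = img_b > eps                      # avoid division by ~zero pixels
    ratio = img_a[valid] / img_b[valid]
    hist, edges = np.histogram(ratio, bins=bins)
    peak = np.argmax(hist)                   # background is the predominant value
    return 0.5 * (edges[peak] + edges[peak + 1])

# scaled_b = background_ratio_scale(image1, image2) * image2
```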
Generalized image contrast enhancement technique based on Heinemann contrast discrimination model
NASA Astrophysics Data System (ADS)
Liu, Hong; Nodine, Calvin F.
1994-03-01
This paper presents a generalized image contrast enhancement technique which equalizes perceived brightness based on the Heinemann contrast discrimination model. This is a modified algorithm which presents an improvement over the previous study by Mokrane in its mathematically proven existence of a unique solution and in its easily tunable parameterization. The model uses a log-log representation of contrast luminosity between targets and the surround in a fixed luminosity background setting. The algorithm consists of two nonlinear gray-scale mapping functions which have seven parameters, two of which are adjustable Heinemann constants. Another parameter is the background gray level. The remaining four parameters are nonlinear functions of gray scale distribution of the image, and can be uniquely determined once the previous three are given. Tests have been carried out to examine the effectiveness of the algorithm for increasing the overall contrast of images. It can be demonstrated that the generalized algorithm provides better contrast enhancement than histogram equalization. In fact, the histogram equalization technique is a special case of the proposed mapping.
NASA Astrophysics Data System (ADS)
Zha, N.; Capaldi, D. P. I.; Pike, D.; McCormack, D. G.; Cunningham, I. A.; Parraga, G.
2015-03-01
Pulmonary x-ray computed tomography (CT) may be used to characterize emphysema and airways disease in patients with chronic obstructive pulmonary disease (COPD). One analysis approach, parametric response mapping (PRM), utilizes registered inspiratory and expiratory CT image volumes and CT-density-histogram thresholds, but there is no consensus regarding the threshold values used, or their clinical meaning. Principal-component-analysis (PCA) of the CT density histogram can be exploited to quantify emphysema using data-driven CT-density-histogram thresholds. Thus, the objective of this proof-of-concept demonstration was to develop a PRM approach using PCA-derived thresholds in COPD patients and ex-smokers without airflow limitation. Methods: Fifteen COPD ex-smokers and 5 normal ex-smokers were evaluated. Thoracic CT images were acquired at full inspiration and full expiration and these images were non-rigidly co-registered. PCA was performed on the CT density histograms, and the components with eigenvalues greater than one were summed. Since the values of the principal component curve correlate directly with the variability in the sample, the maximum and minimum points on the curve were used as threshold values for the PCA-adjusted PRM technique. Results: A significant correlation was determined between conventional and PCA-adjusted PRM with 3He MRI apparent diffusion coefficient (p<0.001), with CT RA950 (p<0.0001), as well as with 3He MRI ventilation defect percent, a measurement of both small airways disease (p=0.049 and p=0.06, respectively) and emphysema (p=0.02). Conclusions: PRM generated using PCA thresholds of the CT density histogram showed significant correlations with CT and 3He MRI measurements of emphysema, but not airways disease.
NASA Astrophysics Data System (ADS)
Attallah, Bilal; Serir, Amina; Chahir, Youssef; Boudjelal, Abdelwahhab
2017-11-01
Palmprint recognition systems depend on feature extraction. A method of feature extraction using higher discrimination information was developed to characterize palmprint images. In this method, two individual feature extraction techniques are applied to a discrete wavelet transform of a palmprint image, and their outputs are fused. The two techniques used in the fusion are the histogram of gradients and the binarized statistical image features. The fused features are then evaluated using an extreme learning machine classifier before feature selection based on principal component analysis. Three palmprint databases, the Hong Kong Polytechnic University (PolyU) Multispectral Palmprint Database, Hong Kong PolyU Palmprint Database II, and the Delhi Touchless (IIDT) Palmprint Database, are used in this study. The study shows that our method effectively identifies and verifies palmprints and outperforms other methods based on feature extraction.
Single-photon technique for the detection of periodic extraterrestrial laser pulses.
Leeb, W R; Poppe, A; Hammel, E; Alves, J; Brunner, M; Meingast, S
2013-06-01
To draw humankind's attention to its existence, an extraterrestrial civilization could well direct periodic laser pulses toward Earth. We developed a technique capable of detecting a quasi-periodic light signal with an average of less than one photon per pulse within a measurement time of a few tens of milliseconds in the presence of the radiation emitted by an exoplanet's host star. Each of the electronic events produced by one or more single-photon avalanche detectors is tagged with precise time-of-arrival information and stored. From this we compute a histogram displaying the frequency of event-time differences in classes with bin widths on the order of a nanosecond. The existence of periodic laser pulses manifests itself in histogram peaks regularly spaced at multiples of the (a priori unknown) pulse repetition frequency. With laser sources simulating both the pulse source and the background radiation, we tested a detection system in the laboratory at a wavelength of 850 nm. We present histograms obtained from various recorded data sequences with the number of photons per pulse, the background photons per pulse period, and the recording time as main parameters. We then simulated a periodic signal hypothetically generated on a planet orbiting a G2V-type star (distance to Earth 500 light-years) and show that the technique is capable of detecting the signal even if the received pulses carry as little as one photon on average on top of the star's background light.
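The core analysis step, histogramming event-time differences with nanosecond-scale bins, can be sketched as below; the pairwise loop is written for clarity rather than speed, and the bin width and maximum delay window are illustrative assumptions, not the published settings.

```python
# Minimal sketch: histogram of photon time-tag differences; a periodic pulse
# train appears as regularly spaced peaks at multiples of the pulse period.
import numpy as np

def time_difference_histogram(time_tags_ns, bin_ns=1.0, max_delta_ns=1e6):
    t = np.sort(np.asarray(time_tags_ns, dtype=np.float64))
    deltas = []
    for i in range(len(t)):
        j = i + 1
        while j < len(t) and t[j] - t[i] <= max_delta_ns:
            deltas.append(t[j] - t[i])     # keep only differences inside the window
            j += 1
    bins = np.arange(0.0, max_delta_ns + bin_ns, bin_ns)
    hist, edges = np.histogram(deltas, bins=bins)
    return hist, edges
```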
A histogram-based technique for rapid vector extraction from PIV photographs
NASA Technical Reports Server (NTRS)
Humphreys, William M., Jr.
1991-01-01
A new analysis technique, performed totally in the image plane, is proposed which rapidly extracts all available vectors from individual interrogation regions on PIV photographs. The technique avoids the need for using Fourier transforms with the associated computational burden. The data acquisition and analysis procedure is described, and results of a preliminary simulation study to evaluate the accuracy of the technique are presented. Recently obtained PIV photographs are analyzed.
NASA Astrophysics Data System (ADS)
Berger, Lukas; Kleinheinz, Konstantin; Attili, Antonio; Bisetti, Fabrizio; Pitsch, Heinz; Mueller, Michael E.
2018-05-01
Modelling unclosed terms in partial differential equations typically involves two steps: First, a set of known quantities needs to be specified as input parameters for a model, and second, a specific functional form needs to be defined to model the unclosed terms by the input parameters. Both steps involve a certain modelling error, with the former known as the irreducible error and the latter referred to as the functional error. Typically, only the total modelling error, which is the sum of functional and irreducible error, is assessed, but the concept of the optimal estimator enables the separate analysis of the total and the irreducible errors, yielding a systematic modelling error decomposition. In this work, attention is paid to the techniques themselves required for the practical computation of irreducible errors. Typically, histograms are used for optimal estimator analyses, but this technique is found to add a non-negligible spurious contribution to the irreducible error if models with multiple input parameters are assessed. Thus, the error decomposition of an optimal estimator analysis becomes inaccurate, and misleading conclusions concerning modelling errors may be drawn. In this work, numerically accurate techniques for optimal estimator analyses are identified and a suitable evaluation of irreducible errors is presented. Four different computational techniques are considered: a histogram technique, artificial neural networks, multivariate adaptive regression splines, and an additive model based on a kernel method. For multiple input parameter models, only artificial neural networks and multivariate adaptive regression splines are found to yield satisfactorily accurate results. Beyond a certain number of input parameters, the assessment of models in an optimal estimator analysis even becomes practically infeasible if histograms are used. The optimal estimator analysis in this paper is applied to modelling the filtered soot intermittency in large eddy simulations using a dataset of a direct numerical simulation of a non-premixed sooting turbulent flame.
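For the single-parameter case, the histogram technique the authors examine amounts to approximating the optimal estimator, i.e. the conditional mean of the exact quantity given the model inputs, by bin averages. The sketch below (illustrative bin count, one input parameter only) estimates the irreducible error this way; with several input parameters the multi-dimensional bins become sparsely populated, which is the spurious contribution the abstract warns about and the reason regression-type estimators such as neural networks or MARS are preferred there.

```python
import numpy as np
from scipy.stats import binned_statistic

def irreducible_error_1d(phi, q, n_bins=64):
    """Histogram-based optimal-estimator analysis for ONE input parameter.

    phi : samples of the model input parameter
    q   : samples of the exact (e.g. DNS) quantity to be modelled
    The optimal estimator is the conditional mean <q | phi>, approximated by
    bin averages; the irreducible error is the mean squared deviation of q
    about it.
    """
    cond_mean, _, bin_idx = binned_statistic(phi, q, statistic="mean",
                                             bins=n_bins)
    q_hat = cond_mean[np.clip(bin_idx - 1, 0, n_bins - 1)]  # bin_idx is 1-based
    valid = np.isfinite(q_hat)                              # skip empty bins
    return float(np.mean((q[valid] - q_hat[valid]) ** 2))
```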
Teh, V; Sim, K S; Wong, E K
2016-11-01
According to statistics from the World Health Organization (WHO), stroke is one of the major causes of death globally. Computed tomography (CT) is one of the main medical diagnostic systems used for the diagnosis of ischemic stroke. CT scans provide brain images in Digital Imaging and Communications in Medicine (DICOM) format. The presentation of CT brain images relies mainly on the window setting (window center and window width), which converts an image from DICOM format into a normal grayscale format. Nevertheless, the ordinary window parameters cannot deliver proper contrast on CT brain images for ischemic stroke detection. In this paper, a new method, namely gamma correction extreme-level eliminating with weighting distribution (GCELEWD), is implemented to improve the contrast of CT brain images. GCELEWD is capable of highlighting the hypodense region for diagnosis of ischemic stroke. The performance of this new technique, GCELEWD, is compared with four existing contrast enhancement techniques: brightness preserving bi-histogram equalization (BBHE), dualistic sub-image histogram equalization (DSIHE), extreme-level eliminating histogram equalization (ELEHE), and adaptive gamma correction with weighting distribution (AGCWD). GCELEWD shows better visualization for ischemic stroke detection and higher scores with the image quality assessment (IQA) module. SCANNING 38:842-856, 2016. © 2016 Wiley Periodicals, Inc.
Whole brain myelin mapping using T1- and T2-weighted MR imaging data
Ganzetti, Marco; Wenderoth, Nicole; Mantini, Dante
2014-01-01
Despite recent advancements in MR imaging, non-invasive mapping of myelin in the brain still remains an open issue. Here we attempted to provide a potential solution. Specifically, we developed a processing workflow based on T1-w and T2-w MR data to generate an optimized myelin enhanced contrast image. The workflow allows whole brain mapping using the T1-w/T2-w technique, which was originally introduced as a non-invasive method for assessing cortical myelin content. The hallmark of our approach is a retrospective calibration algorithm, applied to bias-corrected T1-w and T2-w images, that relies on image intensities outside the brain. This permits standardizing the intensity histogram of the ratio image, thereby allowing for across-subject statistical analyses. Quantitative comparisons of image histograms within and across different datasets confirmed the effectiveness of our normalization procedure. Not only did the calibrated T1-w/T2-w images exhibit a comparable intensity range, but also the shape of the intensity histograms was largely corresponding. We also assessed the reliability and specificity of the ratio image compared to other MR-based techniques, such as magnetization transfer ratio (MTR), fractional anisotropy (FA), and fluid-attenuated inversion recovery (FLAIR). With respect to these other techniques, T1-w/T2-w had consistently high values, as well as low inter-subject variability, in brain structures where myelin is most abundant. Overall, our results suggested that the T1-w/T2-w technique may be a valid tool supporting the non-invasive mapping of myelin in the brain. Therefore, it might find important applications in the study of brain development, aging and disease. PMID:25228871
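The key step is forming the calibrated T1-w/T2-w ratio image. The fragment below is a minimal sketch under stated assumptions: it linearly rescales the bias-corrected images so that the median intensity of a user-supplied reference region outside the brain maps to fixed values before taking the ratio. The published workflow's calibration model and reference choice are more elaborate, and ref_mask, ref_t1, and ref_t2 are hypothetical names.

```python
import numpy as np

def calibrated_t1w_t2w_ratio(t1w, t2w, ref_mask, ref_t1=1.0, ref_t2=1.0, eps=1e-6):
    """Sketch of a T1-w/T2-w myelin-contrast map with a simple calibration.

    t1w, t2w : bias-corrected T1-w and T2-w volumes (same shape)
    ref_mask : boolean mask of a reference region OUTSIDE the brain; its median
               intensity is mapped to ref_t1 / ref_t2 (illustrative linear
               calibration, standardizing the ratio histogram across subjects)
    """
    s1 = ref_t1 / (np.median(t1w[ref_mask]) + eps)
    s2 = ref_t2 / (np.median(t2w[ref_mask]) + eps)
    return (s1 * t1w) / (s2 * t2w + eps)
```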
Rasta, Seyed Hossein; Partovi, Mahsa Eisazadeh; Seyedarabi, Hadi; Javadzadeh, Alireza
2015-01-01
To investigate the effect of preprocessing techniques, including contrast enhancement and illumination correction, on retinal image quality, a comparative study was carried out. We studied and implemented a few illumination correction and contrast enhancement techniques on color retinal images to find the best technique for optimum image enhancement. To compare and choose the best illumination correction technique, we analyzed the corrected red and green components of color retinal images statistically and visually. The two contrast enhancement techniques were analyzed using a vessel segmentation algorithm by calculating the sensitivity and specificity. The statistical evaluation of the illumination correction techniques was carried out by calculating the coefficients of variation. The dividing method using the median filter to estimate background illumination showed the lowest coefficient of variation in the red component. The quotient and homomorphic filtering methods, after the dividing method, presented good results based on their low coefficients of variation. Contrast-limited adaptive histogram equalization (CLAHE) increased the sensitivity of the vessel segmentation algorithm by up to 5% at the same level of accuracy. CLAHE has a higher sensitivity than the polynomial transformation operator as a contrast enhancement technique for vessel segmentation. Three techniques, namely the dividing method using the median filter to estimate background, quotient-based filtering, and homomorphic filtering, were found to be effective illumination correction techniques based on the statistical evaluation. Applying a local contrast enhancement technique such as CLAHE to fundus images showed good potential for enhancing vasculature segmentation.
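The two steps the study singles out, background division with a median filter and CLAHE, can be sketched as follows for the green channel; the kernel size and clip limit are illustrative choices, not values from the paper.

```python
import numpy as np
from scipy.ndimage import median_filter
from skimage import exposure

def preprocess_green_channel(green, bg_kernel=51, clip_limit=0.01):
    """Illumination correction by the dividing method, followed by CLAHE.

    green : 2-D green-channel image of a colour fundus photograph
    """
    g = green.astype(np.float64)
    background = median_filter(g, size=bg_kernel)      # estimate illumination
    corrected = g / np.maximum(background, 1e-6)       # dividing method
    corrected = corrected / corrected.max()            # rescale to [0, 1]
    return exposure.equalize_adapthist(corrected, clip_limit=clip_limit)
```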
NASA Astrophysics Data System (ADS)
Wyjadłowski, Marek
2017-12-01
The constant development of geotechnical technologies imposes the need for monitoring techniques that ensure proper quality and the safe execution of geotechnical works. Several monitoring methods enable the preliminary design of the work process and the ongoing control of hydrotechnical works (pile driving, sheet piling, ground improvement methods). Wave parameter measurements and/or continuous histogram recording of shocks and vibrations and their dynamic impact on engineering structures in the close vicinity of the building site enable the modification of technology parameters, such as vibrator frequency or hammer drop height. Many examples of practical applications have already been published and provide a basis for the formulation of guidelines for work on subsequent sites. In the current work, the author's experience gained during sheet piling works for the reconstruction of the City Channel in Wrocław (Poland) is presented. The chosen examples describe ways of proceeding in the case of new and old residential buildings whose concrete or masonry walls were exposed to vibrations, and in the case of hydrotechnical structures (sluices, bridges).
Adaptive histogram equalization in digital radiography of destructive skeletal lesions.
Braunstein, E M; Capek, P; Buckwalter, K; Bland, P; Meyer, C R
1988-03-01
Adaptive histogram equalization, an image-processing technique that distributes pixel values of an image uniformly throughout the gray scale, was applied to 28 plain radiographs of bone lesions, after they had been digitized. The non-equalized and equalized digital images were compared by two skeletal radiologists with respect to lesion margins, internal matrix, soft-tissue mass, cortical breakthrough, and periosteal reaction. Receiver operating characteristic (ROC) curves were constructed on the basis of the responses. Equalized images were superior to nonequalized images in determination of cortical breakthrough and presence or absence of periosteal reaction. ROC analysis showed no significant difference in determination of margins, matrix, or soft-tissue masses.
Spatial detection of tv channel logos as outliers from the content
NASA Astrophysics Data System (ADS)
Ekin, Ahmet; Braspenning, Ralph
2006-01-01
This paper proposes a purely image-based TV channel logo detection algorithm that can detect logos independently from their motion and transparency features. The proposed algorithm can robustly detect any type of logos, such as transparent and animated, without requiring any temporal constraints whereas known methods have to wait for the occurrence of large motion in the scene and assume stationary logos. The algorithm models logo pixels as outliers from the actual scene content that is represented by multiple 3-D histograms in the YCbCr space. We use four scene histograms corresponding to each of the four corners because the content characteristics change from one image corner to another. A further novelty of the proposed algorithm is that we define image corners and the areas where we compute the scene histograms by a cinematic technique called Golden Section Rule that is used by professionals. The robustness of the proposed algorithm is demonstrated over a dataset of representative TV content.
Evaluation of thresholding techniques for segmenting scaffold images in tissue engineering
NASA Astrophysics Data System (ADS)
Rajagopalan, Srinivasan; Yaszemski, Michael J.; Robb, Richard A.
2004-05-01
Tissue engineering attempts to address the ever widening gap between the demand and supply of organ and tissue transplants using natural and biomimetic scaffolds. The regeneration of specific tissues aided by synthetic materials is dependent on the structural and morphometric properties of the scaffold. These properties can be derived non-destructively using quantitative analysis of high resolution microCT scans of scaffolds. Thresholding of the scanned images into polymeric and porous phase is central to the outcome of the subsequent structural and morphometric analysis. Visual thresholding of scaffolds produced using stochastic processes is inaccurate. Depending on the algorithmic assumptions made, automatic thresholding might also be inaccurate. Hence there is a need to analyze the performance of different techniques and propose alternate ones, if needed. This paper provides a quantitative comparison of different thresholding techniques for segmenting scaffold images. The thresholding algorithms examined include those that exploit spatial information, locally adaptive characteristics, histogram entropy information, histogram shape information, and clustering of gray-level information. The performance of different techniques was evaluated using established criteria, including misclassification error, edge mismatch, relative foreground error, and region non-uniformity. Algorithms that exploit local image characteristics seem to perform much better than those using global information.
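The evaluation criteria listed above have standard formulations in the thresholding literature; the sketch below gives commonly used definitions of three of them (misclassification error, relative foreground error, and region non-uniformity) for a binary segmentation compared against a reference mask. These are generic formulations, not necessarily the exact weightings used in the study.

```python
import numpy as np

def misclassification_error(gt, seg):
    """Fraction of pixels assigned to the wrong phase (boolean masks)."""
    return 1.0 - (np.sum(gt & seg) + np.sum(~gt & ~seg)) / gt.size

def relative_foreground_error(gt, seg):
    """Relative difference in foreground area between reference and result."""
    return abs(int(seg.sum()) - int(gt.sum())) / max(int(gt.sum()), 1)

def region_non_uniformity(image, seg):
    """Foreground intensity variance relative to the whole-image variance,
    weighted by the foreground fraction (lower is better)."""
    fg = image[seg]
    if fg.size == 0 or image.var() == 0:
        return 0.0
    return (fg.size / image.size) * fg.var() / image.var()
```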
Histogram-based ionogram displays and their application to autoscaling
NASA Astrophysics Data System (ADS)
Lynn, Kenneth J. W.
2018-03-01
A simple method is described for displaying and auto scaling the basic ionogram parameters foF2 and h'F2 as well as some additional layer parameters from digital ionograms. The technique employed is based on forming frequency and height histograms in each ionogram. This technique has now been applied specifically to ionograms produced by the IPS5D ionosonde developed and operated by the Australian Space Weather Service (SWS). The SWS ionograms are archived in a cleaned format and readily available from the SWS internet site. However, the method is applicable to any ionosonde which produces ionograms in a digital format at a useful signal-to-noise level. The most novel feature of the technique for autoscaling is its simplicity and the avoidance of the mathematical imaging and line fitting techniques often used. The program arose from the necessity to display many days of ionogram output to allow the location of specific types of ionospheric event such as ionospheric storms, travelling ionospheric disturbances and repetitive ionospheric height changes for further investigation and measurement. Examples and applications of the method are given including the removal of sporadic E and spread F.
Parallel Geospatial Data Management for Multi-Scale Environmental Data Analysis on GPUs
NASA Astrophysics Data System (ADS)
Wang, D.; Zhang, J.; Wei, Y.
2013-12-01
As the spatial and temporal resolutions of Earth observatory data and Earth system simulation outputs are getting higher, in-situ and/or post-processing of such large amounts of geospatial data increasingly becomes a bottleneck in scientific inquiries of Earth systems and their human impacts. Existing geospatial techniques that are based on outdated computing models (e.g., serial algorithms and disk-resident systems), as implemented in many commercial and open source packages, are incapable of processing large-scale geospatial data at the desired level of performance. In this study, we have developed a set of parallel data structures and algorithms that are capable of utilizing the massively data parallel computing power available on commodity Graphics Processing Units (GPUs) for a popular geospatial technique called Zonal Statistics. Given two input datasets, one representing measurements (e.g., temperature or precipitation) and the other representing polygonal zones (e.g., ecological or administrative zones), Zonal Statistics computes major statistics (or complete distribution histograms) of the measurements in all zones. Our technique has four steps and each step can be mapped to GPU hardware by identifying its inherent data parallelism. First, a raster is divided into blocks and per-block histograms are derived. Second, the Minimum Bounding Boxes (MBRs) of polygons are computed and are spatially matched with raster blocks; matched polygon-block pairs are tested and blocks that are either inside or intersect with polygons are identified. Third, per-block histograms are aggregated to polygons for blocks that are completely within polygons. Finally, for blocks that intersect with polygon boundaries, all the raster cells within the blocks are examined using a point-in-polygon test and cells that fall within polygons are used to update the corresponding histograms. As the task becomes I/O bound after applying spatial indexing and GPU hardware acceleration, we have developed a GPU-based data compression technique by reusing our previous work on Bitplane Quadtree (or BPQ-Tree) based indexing of binary bitmaps. Results have shown that our GPU-based parallel Zonal Statistics technique on 3000+ US counties over 20+ billion NASA SRTM 30 meter resolution Digital Elevation Model (DEM) raster cells has achieved impressive end-to-end runtimes: 101 seconds and 46 seconds on a low-end workstation equipped with an Nvidia GTX Titan GPU using cold and hot cache, respectively; and 60-70 seconds using a single OLCF TITAN computing node and 10-15 seconds using 8 nodes. Our experimental results clearly show the potential of using high-end computing facilities for large-scale geospatial processing.
Measurement of Device Parameters Using Image Recovery Techniques in Large-Scale IC Devices
NASA Technical Reports Server (NTRS)
Scheick, Leif; Edmonds, Larry
2004-01-01
Devices that respond to radiation on a cell level will produce histograms showing the relative frequency of cell damage as a function of damage. The measured distribution is the convolution of distributions from radiation responses, measurement noise, and manufacturing parameters. A method of extracting device characteristics and parameters from measured distributions via mathematical and image subtraction techniques is described.
Digital image modification detection using color information and its histograms.
Zhou, Haoyu; Shen, Yue; Zhu, Xinghui; Liu, Bo; Fu, Zigang; Fan, Na
2016-09-01
The rapid development of open source and commercial image editing software makes the authenticity of digital images questionable. Copy-move forgery is one of the most widely used tampering techniques to create desirable objects or conceal undesirable objects in a scene. Existing techniques reported in the literature to detect such tampering aim to improve the robustness of these methods against the use of JPEG compression, blurring, noise, or other types of post-processing operations. These post-processing operations are frequently used with the intention to conceal tampering and reduce tampering clues. A robust method based on the color moments and five other image descriptors is proposed in this paper. The method divides the image into fixed-size overlapping blocks. A clustering operation divides the entire search space into smaller pieces with similar color distribution. Blocks from the tampered regions will reside within the same cluster, since both copied and moved regions have similar color distributions. Five image descriptors are used to extract block features, which makes the method more robust to post-processing operations. An ensemble of deep compositional pattern-producing neural networks is trained with these extracted features. Similarity among feature vectors in clusters indicates possible forged regions. Experimental results show that the proposed method can detect copy-move forgery even if an image was distorted by gamma correction, additive white Gaussian noise, JPEG compression, or blurring. Copyright © 2016. Published by Elsevier Ireland Ltd.
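The block-level workflow (overlapping blocks, colour-based features, matching of near-identical blocks that lie far apart) can be illustrated with the minimal sketch below. It uses only colour-moment features and lexicographic sorting, so it is a generic copy-move detector in the spirit of the method, not the paper's descriptor set, clustering step, or neural-network ensemble; block size, step, and tolerances are illustrative.

```python
import numpy as np

def block_features(img, block=16, step=4):
    """Colour-moment features (mean and std per channel) for overlapping blocks.
    Returns the feature matrix and the top-left coordinate of each block."""
    h, w, _ = img.shape
    feats, coords = [], []
    for y in range(0, h - block + 1, step):
        for x in range(0, w - block + 1, step):
            patch = img[y:y + block, x:x + block].reshape(-1, 3).astype(np.float64)
            feats.append(np.concatenate([patch.mean(0), patch.std(0)]))
            coords.append((y, x))
    return np.asarray(feats), np.asarray(coords)

def candidate_copy_move_pairs(feats, coords, feat_tol=2.0, min_offset=32):
    """Lexicographically sort the features and compare neighbouring rows;
    near-identical blocks that lie far apart are candidate copy-move pairs."""
    order = np.lexsort(feats.T[::-1])          # first feature = primary key
    pairs = []
    for a, b in zip(order[:-1], order[1:]):
        if np.linalg.norm(feats[a] - feats[b]) < feat_tol and \
           np.linalg.norm(coords[a] - coords[b]) > min_offset:
            pairs.append((tuple(coords[a]), tuple(coords[b])))
    return pairs
```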
Content based Image Retrieval based on Different Global and Local Color Histogram Methods: A Survey
NASA Astrophysics Data System (ADS)
Suhasini, Pallikonda Sarah; Sri Rama Krishna, K.; Murali Krishna, I. V.
2017-02-01
Different global and local color histogram methods for content based image retrieval (CBIR) are investigated in this paper. The color histogram is a widely used descriptor for CBIR. The conventional method of extracting a color histogram is global; it misses the spatial content, is less invariant to deformation and viewpoint changes, and results in a very large three-dimensional histogram corresponding to the color space used. To address these deficiencies, different global and local histogram methods have been proposed in recent research. Different ways of extracting local histograms to obtain spatial correspondence, invariant colour histograms to add deformation and viewpoint invariance, and fuzzy linking methods to reduce the size of the histogram are found in recent papers. The color space and the distance metric used are vital in obtaining a color histogram. In this paper, the performance of CBIR based on different global and local color histograms in three different color spaces, namely RGB, HSV, and L*a*b*, and with three distance measures, Euclidean, quadratic, and histogram intersection, is surveyed, to choose an appropriate method for future research.
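The three distance measures named above are easy to state explicitly; the sketch below gives common formulations for normalized histograms (the bin-similarity matrix A in the quadratic-form distance must be supplied by the caller and is an assumption of this sketch).

```python
import numpy as np

def euclidean_distance(h1, h2):
    """L2 distance between two normalized colour histograms."""
    return float(np.linalg.norm(h1 - h2))

def histogram_intersection(h1, h2):
    """Similarity in [0, 1] for normalized histograms (1 = identical)."""
    return float(np.minimum(h1, h2).sum())

def quadratic_form_distance(h1, h2, A):
    """Quadratic distance with a bin-similarity matrix A, which accounts for
    perceptual similarity between neighbouring colour bins."""
    d = h1 - h2
    return float(np.sqrt(d @ A @ d))
```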
Spatiotemporal models for the simulation of infrared backgrounds
NASA Astrophysics Data System (ADS)
Wilkes, Don M.; Cadzow, James A.; Peters, R. Alan, II; Li, Xingkang
1992-09-01
It is highly desirable for designers of automatic target recognizers (ATRs) to be able to test their algorithms on targets superimposed on a wide variety of background imagery. Background imagery in the infrared spectrum is expensive to gather from real sources, consequently, there is a need for accurate models for producing synthetic IR background imagery. We have developed a model for such imagery that will do the following: Given a real, infrared background image, generate another image, distinctly different from the one given, that has the same general visual characteristics as well as the first and second-order statistics of the original image. The proposed model consists of a finite impulse response (FIR) kernel convolved with an excitation function, and histogram modification applied to the final solution. A procedure for deriving the FIR kernel using a signal enhancement algorithm has been developed, and the histogram modification step is a simple memoryless nonlinear mapping that imposes the first order statistics of the original image onto the synthetic one, thus the overall model is a linear system cascaded with a memoryless nonlinearity. It has been found that the excitation function relates to the placement of features in the image, the FIR kernel controls the sharpness of the edges and the global spectrum of the image, and the histogram controls the basic coloration of the image. A drawback to this method of simulating IR backgrounds is that a database of actual background images must be collected in order to produce accurate FIR and histogram models. If this database must include images of all types of backgrounds obtained at all times of the day and all times of the year, the size of the database would be prohibitive. In this paper we propose improvements to the model described above that enable time-dependent modeling of the IR background. This approach can greatly reduce the number of actual IR backgrounds that are required to produce a sufficiently accurate mathematical model for synthesizing a similar IR background for different times of the day. Original and synthetic IR backgrounds will be presented. Previous research in simulating IR backgrounds was performed by Strenzwilk, et al., Botkin, et al., and Rapp. The most recent work of Strenzwilk, et al. was based on the use of one-dimensional ARMA models for synthesizing the images. Their results were able to retain the global statistical and spectral behavior of the original image, but the synthetic image was not visually very similar to the original. The research presented in this paper is the result of an attempt to improve upon their results, and represents a significant improvement in quality over previously obtained results.
A comparison of methods using optical coherence tomography to detect demineralized regions in teeth
Sowa, Michael G.; Popescu, Dan P.; Friesen, Jeri R.; Hewko, Mark D.; Choo-Smith, Lin-P’ing
2013-01-01
Optical coherence tomography (OCT) is a three-dimensional optical imaging technique that can be used to identify areas of early caries formation in dental enamel. The OCT signal at 850 nm back-reflected from sound enamel is attenuated more strongly than the signal back-reflected from demineralized regions. To quantify this observation, the OCT signal as a function of depth into the enamel (also known as the A-scan intensity), the histogram of the A-scan intensities, and three summary parameters derived from the A-scan are defined and their diagnostic potential compared. A total of 754 OCT A-scans were analyzed. The three summary parameters derived from the A-scans, the OCT attenuation coefficient as well as the mean and standard deviation of the lognormal fit to the histogram of the A-scan ensemble, show statistically significant differences (p < 0.01) when comparing parameters from sound enamel and caries. Furthermore, these parameters show only a modest correlation. Based on the area under the curve (AUC) of the receiver operating characteristic (ROC) plot, the OCT attenuation coefficient shows higher discriminatory capacity (AUC = 0.98) compared to the parameters derived from the lognormal fit to the histogram of the A-scan. However, direct analysis of the A-scans or the histogram of A-scan intensities using linear support vector machine classification shows diagnostic discrimination (AUC = 0.96) comparable to that achieved using the attenuation coefficient. These findings suggest that either direct analysis of the A-scan, its intensity histogram, or the attenuation coefficient derived from the descending slope of the OCT A-scan has high capacity to discriminate between regions of caries and sound enamel. PMID:22052833
Arisawa, Atsuko; Watanabe, Yoshiyuki; Tanaka, Hisashi; Takahashi, Hiroto; Matsuo, Chisato; Fujiwara, Takuya; Fujiwara, Masahiro; Fujimoto, Yasunori; Tomiyama, Noriyuki
2018-06-01
Arterial spin labeling (ASL) is a non-invasive perfusion technique that may be an alternative to dynamic susceptibility contrast magnetic resonance imaging (DSC-MRI) for assessment of brain tumors. To our knowledge, there have been no reports on histogram analysis of ASL. The purpose of this study was to determine whether ASL is comparable with DSC-MRI in terms of differentiating high-grade and low-grade gliomas by evaluating the histogram analysis of cerebral blood flow (CBF) in the entire tumor. Thirty-four patients with pathologically proven glioma underwent ASL and DSC-MRI. High-signal areas on contrast-enhanced T1-weighted images or high-intensity areas on fluid-attenuated inversion recovery images were designated as the volumes of interest (VOIs). ASL-CBF, DSC-CBF, and DSC-cerebral blood volume maps were constructed and co-registered to the VOI. Perfusion histogram analyses of the whole VOI and statistical analyses were performed to compare the ASL and DSC images. There was no significant difference in the mean values for any of the histogram metrics in both of the low-grade gliomas (n = 15) and the high-grade gliomas (n = 19). Strong correlations were seen in the 75th percentile, mean, median, and standard deviation values between the ASL and DSC images. The area under the curve values tended to be greater for the DSC images than for the ASL images. DSC-MRI is superior to ASL for distinguishing high-grade from low-grade glioma. ASL could be an alternative evaluation method when DSC-MRI cannot be used, e.g., in patients with renal failure, those in whom repeated examination is required, and in children.
Axelsen, Jacob Bock; Yan, Koon-Kiu; Maslov, Sergei
2007-01-01
Background: The evolution of the full repertoire of proteins encoded in a given genome is mostly driven by gene duplications, deletions, and sequence modifications of existing proteins. Indirect information about relative rates and other intrinsic parameters of these three basic processes is contained in the proteome-wide distribution of sequence identities of pairs of paralogous proteins. Results: We introduce a simple mathematical framework based on a stochastic birth-and-death model that allows one to extract some of this information and apply it to the set of all pairs of paralogous proteins in H. pylori, E. coli, S. cerevisiae, C. elegans, D. melanogaster, and H. sapiens. It was found that the histogram of sequence identities p generated by an all-to-all alignment of all protein sequences encoded in a genome is well fitted with a power-law form ~ p^-γ, with the value of the exponent γ around 4 for the majority of organisms used in this study. This implies that the intra-protein variability of substitution rates is best described by the Gamma distribution with the exponent α ≈ 0.33. Different features of the shape of such histograms allow us to quantify the ratio between the genome-wide average deletion/duplication rates and the amino-acid substitution rate. Conclusion: We separately measure the short-term ("raw") duplication and deletion rates r*_dup and r*_del, which include gene copies that will be removed soon after the duplication event, and their dramatically reduced long-term counterparts r_dup and r_del. The high deletion rate among recently duplicated proteins is consistent with a scenario in which they didn't have enough time to significantly change their functional roles and thus are to a large degree disposable. Systematic trends of each of the four duplication/deletion rates with the total number of genes in the genome were analyzed. All but the deletion rate of recent duplicates, r*_del, were shown to systematically increase with N_genes. Abnormally flat shapes of sequence identity histograms observed for yeast and human are consistent with the lineages leading to these organisms undergoing one or more whole-genome duplications. This interpretation is corroborated by our analysis of the genome of Paramecium tetraurelia, where the p^-4 profile of the histogram is gradually restored by the successive removal of paralogs generated in its four known whole-genome duplication events. PMID:18039386
Blind technique using blocking artifacts and entropy of histograms for image tampering detection
NASA Astrophysics Data System (ADS)
Manu, V. T.; Mehtre, B. M.
2017-06-01
The tremendous technological advancements in recent times has enabled people to create, edit and circulate images easily than ever before. As a result of this, ensuring the integrity and authenticity of the images has become challenging. Malicious editing of images to deceive the viewer is referred to as image tampering. A widely used image tampering technique is image splicing or compositing, in which regions from different images are copied and pasted. In this paper, we propose a tamper detection method utilizing the blocking and blur artifacts which are the footprints of splicing. The classification of images as tampered or not, is done based on the standard deviations of the entropy histograms and block discrete cosine transformations. We can detect the exact boundaries of the tampered area in the image, if the image is classified as tampered. Experimental results on publicly available image tampering datasets show that the proposed method outperforms the existing methods in terms of accuracy.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aggarwal, R.K.
1978-07-01
Various proposed improvements in the design and operational procedures for bomber/transport aircraft are evaluated. The evaluation is performed in terms of the estimated savings in fuel consumption and in Direct Operating Cost (DOC). As an aid in the evaluation of design modifications, graphs of fuel and DOC savings as a function of the design parameters are developed. These graphs are based on actual mission trajectory data rather than some typical trajectory profile. The actual mission data is presented in terms of histograms which provide statistical information concerning altitude, air speed, take-off weight, landing weight, and mission time. Separate analyses are performed on the following aircraft: the B-52G, the B-52H, the KC-135, the C-141, the C-130, and the C-5A. (Author)
Visual Contrast Enhancement Algorithm Based on Histogram Equalization
Ting, Chih-Chung; Wu, Bing-Fei; Chung, Meng-Liang; Chiu, Chung-Cheng; Wu, Ya-Ching
2015-01-01
Image enhancement techniques primarily improve the contrast of an image to lend it a better appearance. One of the popular enhancement methods is histogram equalization (HE) because of its simplicity and effectiveness. However, it is rarely applied to consumer electronics products because it can cause excessive contrast enhancement and feature loss problems. These problems make the images processed by HE look unnatural and introduce unwanted artifacts in them. In this study, a visual contrast enhancement algorithm (VCEA) based on HE is proposed. VCEA considers the requirements of the human visual perception in order to address the drawbacks of HE. It effectively solves the excessive contrast enhancement problem by adjusting the spaces between two adjacent gray values of the HE histogram. In addition, VCEA reduces the effects of the feature loss problem by using the obtained spaces. Furthermore, VCEA enhances the detailed textures of an image to generate an enhanced image with better visual quality. Experimental results show that images obtained by applying VCEA have higher contrast and are more suited to human visual perception than those processed by HE and other HE-based methods. PMID:26184219
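VCEA builds on plain histogram equalization, which maps each gray level through the normalized cumulative histogram. The sketch below implements that baseline only; VCEA's adjustment of the spaces between adjacent gray values and its texture enhancement are not reproduced here.

```python
import numpy as np

def histogram_equalization(gray):
    """Classical HE for an 8-bit grayscale image: map each gray level through
    the normalized cumulative histogram (CDF)."""
    hist = np.bincount(gray.ravel(), minlength=256)
    cdf = np.cumsum(hist).astype(np.float64)
    cdf_min = cdf[cdf > 0].min()                      # first non-empty bin
    lut = np.clip(np.round(255.0 * (cdf - cdf_min) / max(cdf[-1] - cdf_min, 1)),
                  0, 255).astype(np.uint8)
    return lut[gray]
```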
Adaptive local thresholding for robust nucleus segmentation utilizing shape priors
NASA Astrophysics Data System (ADS)
Wang, Xiuzhong; Srinivas, Chukka
2016-03-01
This paper describes a novel local thresholding method for foreground detection. First, a Canny edge detection method is used for initial edge detection. Then, tensor voting is applied on the initial edge pixels, using a nonsymmetric tensor field tailored to encode prior information about nucleus size, shape, and intensity spatial distribution. Tensor analysis is then performed to generate the saliency image and, based on that, the refined edge. Next, the image domain is divided into blocks. In each block, at least one foreground and one background pixel are sampled for each refined edge pixel. The saliency weighted foreground histogram and background histogram are then created. These two histograms are used to calculate a threshold by minimizing the background and foreground pixel classification error. The block-wise thresholds are then used to generate the threshold for each pixel via interpolation. Finally, the foreground is obtained by comparing the original image with the threshold image. The effective use of prior information, combined with robust techniques, results in far more reliable foreground detection, which leads to robust nucleus segmentation.
Fusion-based multi-target tracking and localization for intelligent surveillance systems
NASA Astrophysics Data System (ADS)
Rababaah, Haroun; Shirkhodaie, Amir
2008-04-01
In this paper, we present two approaches addressing visual target tracking and localization in complex urban environments. The two techniques presented in this paper are: fusion-based multi-target visual tracking, and multi-target localization via camera calibration. For multi-target tracking, the data fusion concepts of hypothesis generation/evaluation/selection, target-to-target registration, and association are employed. An association matrix is implemented using RGB histograms for associated tracking of multiple targets of interest. Motion segmentation of targets of interest (TOI) from the background was achieved by a Gaussian Mixture Model. Foreground segmentation, on the other hand, was achieved by the Connected Components Analysis (CCA) technique. The tracking of individual targets was estimated by fusing two sources of information: the centroid with spatial gating, and the RGB histogram association matrix. The localization problem is addressed through an effective camera calibration technique using edge modeling for grid mapping (EMGM). A two-stage image pixel to world coordinates mapping technique is introduced that performs coarse and fine location estimation of moving TOIs. In coarse estimation, an approximate neighborhood of the target position is estimated based on the nearest 4-neighbor method, and in fine estimation, we use Euclidean interpolation to localize the position within the estimated four neighbors. Both techniques were tested and showed reliable results for tracking and localization of targets of interest in complex urban environments.
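A minimal sketch of the appearance part of the association step is given below: each track and each detection is summarized by a normalized joint RGB histogram, a cost matrix is filled with a histogram-similarity score, and the assignment is solved globally. The Bhattacharyya coefficient and the Hungarian solver are stand-ins chosen for the sketch; the paper fuses this appearance cue with centroid-based spatial gating, which is omitted here.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def rgb_histogram(patch, bins=8):
    """Normalized joint RGB histogram of an image patch (H x W x 3, uint8)."""
    h, _ = np.histogramdd(patch.reshape(-1, 3), bins=(bins,) * 3,
                          range=[(0, 256)] * 3)
    return (h / h.sum()).ravel()

def associate_tracks(track_hists, detection_hists):
    """Build an association cost matrix from histogram similarity
    (Bhattacharyya coefficient) and solve it with the Hungarian method."""
    cost = np.zeros((len(track_hists), len(detection_hists)))
    for i, ht in enumerate(track_hists):
        for j, hd in enumerate(detection_hists):
            cost[i, j] = 1.0 - np.sum(np.sqrt(ht * hd))   # 0 = identical
    rows, cols = linear_sum_assignment(cost)
    return list(zip(rows, cols))
```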
Freezing Transition Studies Through Constrained Cell Model Simulation
NASA Astrophysics Data System (ADS)
Nayhouse, Michael; Kwon, Joseph Sang-Il; Heng, Vincent R.; Amlani, Ankur M.; Orkoulas, G.
2014-10-01
In the present work, a simulation method based on cell models is used to deduce the fluid-solid transition of a system of particles that interact via a pair potential. The simulations are implemented under constant-pressure conditions on a generalized version of the constrained cell model. The constrained cell model is constructed by dividing the volume into Wigner-Seitz cells and confining each particle in a single cell. This model is a special case of a more general cell model which is formed by introducing an additional field variable that controls the number of particles per cell and, thus, the relative stability of the solid against the fluid phase. High field values force configurations with one particle per cell and thus favor the solid phase. Fluid-solid coexistence on the isotherm that corresponds to a reduced temperature of 2 is determined from constant-pressure simulations of the generalized cell model using tempering and histogram reweighting techniques. The entire fluid-solid phase boundary is determined through a thermodynamic integration technique based on histogram reweighting, using the previous coexistence point as a reference point. The vapor-liquid phase diagram is obtained from constant-pressure simulations of the unconstrained system using tempering and histogram reweighting. The phase diagram of the system is found to contain a stable critical point and a triple point. The phase diagram of the corresponding constrained cell model is also found to contain both a stable critical point and a triple point.
NASA Astrophysics Data System (ADS)
McCann, Cooper Patrick
Low-cost flight-based hyperspectral imaging systems have the potential to provide valuable information for ecosystem and environmental studies as well as aid in land management and land health monitoring. This thesis describes (1) a bootstrap method of producing mesoscale, radiometrically-referenced hyperspectral data using the Landsat surface reflectance (LaSRC) data product as a reference target, (2) biophysically relevant basis functions to model the reflectance spectra, (3) an unsupervised classification technique based on natural histogram splitting of these biophysically relevant parameters, and (4) local and multi-temporal anomaly detection. The bootstrap method extends standard processing techniques to remove uneven illumination conditions between flight passes, allowing the creation of radiometrically self-consistent data. Through selective spectral and spatial resampling, LaSRC data is used as a radiometric reference target. Advantages of the bootstrap method include the need for minimal site access, no ancillary instrumentation, and automated data processing. Data from a flight on 06/02/2016 is compared with concurrently collected ground-based reflectance spectra as a means of validation, achieving an average error of 2.74%. Fitting reflectance spectra using basis functions based on biophysically relevant spectral features allows both noise and data reduction while shifting information from spectral bands to biophysical features. Histogram splitting is used to determine a clustering based on natural splittings of these fit parameters. The Indian Pines reference data enabled comparisons of the efficacy of this technique to established techniques. The splitting technique is shown to be an improvement over the ISODATA clustering technique, with an overall accuracy of 34.3/19.0% before merging and 40.9/39.2% after merging. This improvement is also seen as an improvement of kappa before/after merging of 24.8/30.5 for the histogram splitting technique compared to 15.8/28.5 for ISODATA. Three hyperspectral flights over the Kevin Dome area, covering 1843 ha, acquired 06/21/2014, 06/24/2015 and 06/26/2016, are examined with different methods of anomaly detection. Detection of anomalies within a single data set is examined to determine, on a local scale, areas that are significantly different from the surrounding area. Additionally, the detection and identification of persistent anomalies and non-persistent anomalies were investigated across multiple data sets.
nSTAT: Open-Source Neural Spike Train Analysis Toolbox for Matlab
Cajigas, I.; Malik, W.Q.; Brown, E.N.
2012-01-01
Over the last decade there has been a tremendous advance in the analytical tools available to neuroscientists to understand and model neural function. In particular, the point process - Generalized Linear Model (PPGLM) framework has been applied successfully to problems ranging from neuro-endocrine physiology to neural decoding. However, the lack of freely distributed software implementations of published PP-GLM algorithms together with problem-specific modifications required for their use, limit wide application of these techniques. In an effort to make existing PP-GLM methods more accessible to the neuroscience community, we have developed nSTAT – an open source neural spike train analysis toolbox for Matlab®. By adopting an Object-Oriented Programming (OOP) approach, nSTAT allows users to easily manipulate data by performing operations on objects that have an intuitive connection to the experiment (spike trains, covariates, etc.), rather than by dealing with data in vector/matrix form. The algorithms implemented within nSTAT address a number of common problems including computation of peri-stimulus time histograms, quantification of the temporal response properties of neurons, and characterization of neural plasticity within and across trials. nSTAT provides a starting point for exploratory data analysis, allows for simple and systematic building and testing of point process models, and for decoding of stimulus variables based on point process models of neural function. By providing an open-source toolbox, we hope to establish a platform that can be easily used, modified, and extended by the scientific community to address limitations of current techniques and to extend available techniques to more complex problems. PMID:22981419
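One of the routine computations listed above, the peri-stimulus time histogram, is easy to illustrate outside the toolbox. The sketch below is a plain Python/NumPy version of the concept (nSTAT itself is a Matlab toolbox, and its API is not reproduced here); the window and bin width are illustrative.

```python
import numpy as np

def peri_stimulus_time_histogram(spike_times, trial_starts,
                                 window=(-0.5, 1.0), bin_width=0.01):
    """PSTH sketch: align spikes to each trial onset, histogram the aligned
    times, and convert counts to a firing rate in spikes/s."""
    edges = np.arange(window[0], window[1] + bin_width, bin_width)
    counts = np.zeros(len(edges) - 1)
    for t0 in trial_starts:
        aligned = spike_times - t0
        aligned = aligned[(aligned >= window[0]) & (aligned < window[1])]
        counts += np.histogram(aligned, bins=edges)[0]
    rate = counts / (len(trial_starts) * bin_width)
    return rate, edges
```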
Differentially Private Synthesization of Multi-Dimensional Data using Copula Functions
Li, Haoran; Xiong, Li; Jiang, Xiaoqian
2014-01-01
Differential privacy has recently emerged in private statistical data release as one of the strongest privacy guarantees. Most of the existing techniques that generate differentially private histograms or synthetic data only work well for single dimensional or low-dimensional histograms. They become problematic for high dimensional and large domain data due to increased perturbation error and computation complexity. In this paper, we propose DPCopula, a differentially private data synthesization technique using Copula functions for multi-dimensional data. The core of our method is to compute a differentially private copula function from which we can sample synthetic data. Copula functions are used to describe the dependence between multivariate random vectors and allow us to build the multivariate joint distribution using one-dimensional marginal distributions. We present two methods for estimating the parameters of the copula functions with differential privacy: maximum likelihood estimation and Kendall’s τ estimation. We present formal proofs for the privacy guarantee as well as the convergence property of our methods. Extensive experiments using both real datasets and synthetic datasets demonstrate that DPCopula generates highly accurate synthetic multi-dimensional data with significantly better utility than state-of-the-art techniques. PMID:25405241
Detection and tracking of gas plumes in LWIR hyperspectral video sequence data
NASA Astrophysics Data System (ADS)
Gerhart, Torin; Sunu, Justin; Lieu, Lauren; Merkurjev, Ekaterina; Chang, Jen-Mei; Gilles, Jérôme; Bertozzi, Andrea L.
2013-05-01
Automated detection of chemical plumes presents a segmentation challenge. The segmentation problem for gas plumes is difficult due to the diffusive nature of the cloud. The advantage of considering hyperspectral images in the gas plume detection problem over conventional RGB imagery is the presence of non-visual data, allowing for a richer representation of information. In this paper we present an effective method of visualizing hyperspectral video sequences containing chemical plumes and investigate the effectiveness of segmentation techniques on these post-processed videos. Our approach uses a combination of dimension reduction and histogram equalization to prepare the hyperspectral videos for segmentation. First, Principal Components Analysis (PCA) is used to reduce the dimension of the entire video sequence. This is done by projecting each pixel onto the first few Principal Components, resulting in a type of spectral filter. Next, a Midway method for histogram equalization is used. These methods redistribute the intensity values in order to reduce flicker between frames. This properly prepares these high-dimensional video sequences for more traditional segmentation techniques. We compare the ability of various clustering techniques to properly segment the chemical plume. These include K-means, spectral clustering, and the Ginzburg-Landau functional.
Seppänen, Tapio
2017-01-01
Fourier transform infrared (FTIR) microspectroscopy images contain information from the whole infrared spectrum used for microspectroscopic analyses. In combination with the FTIR image, visible light images are used to depict the area from which the FTIR spectral image was sampled. These two images are traditionally acquired as separate files. This paper proposes a histogram shifting-based data hiding technique to embed visible light images in FTIR spectral images producing single entities. The primary objective is to improve data management efficiency. Secondary objectives are confidentiality, availability, and reliability. Since the integrity of biomedical data is vital, the proposed method applies reversible data hiding. After extraction of the embedded data, the FTIR image is reversed to its original state. Furthermore, the proposed method applies authentication tags generated with keyed Hash-Based Message Authentication Codes (HMAC) to detect tampered or corrupted areas of FTIR images. The experimental results show that the FTIR spectral images carrying the payload maintain good perceptual fidelity and the payload can be reliably recovered even after bit flipping or cropping attacks. It has been also shown that extraction successfully removes all modifications caused by the payload. Finally, authentication tags successfully indicated tampered FTIR image areas. PMID:29259987
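Histogram-shifting embedding, the reversible technique named above, can be sketched in a few lines for a grayscale carrier. The fragment below is the classical peak/zero-bin scheme (a generic sketch, not the paper's FTIR-specific embedding, block layout, or HMAC authentication): the histogram is shifted by one gray level between the peak bin and an empty bin to free the bin next to the peak, and each payload bit is then written at the pixels whose value equals the peak.

```python
import numpy as np

def hs_embed(image, bits):
    """Minimal histogram-shifting embedder for a grayscale uint8 image.
    Capacity equals the number of pixels in the peak bin; extra bits are dropped."""
    img = image.astype(np.int32).copy()
    hist = np.bincount(img.ravel(), minlength=256)
    peak = int(np.argmax(hist))
    zeros = np.where(hist[peak + 1:] == 0)[0]
    if zeros.size == 0:
        raise ValueError("no empty bin above the peak; choose another bin pair")
    zero = peak + 1 + int(zeros[0])

    img[(img > peak) & (img < zero)] += 1          # open the bin at peak + 1
    flat = img.ravel()                              # view into img
    carriers = np.where(flat == peak)[0][:len(bits)]
    flat[carriers] += np.asarray(bits[:len(carriers)], dtype=np.int32)
    return flat.reshape(image.shape).astype(np.uint8), peak, zero
```

Extraction reads a 0 at pixels equal to the peak and a 1 at peak + 1, then shifts the intermediate range back down by one, restoring the original image exactly.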
DOE Office of Scientific and Technical Information (OSTI.GOV)
Takashima, Kengo; Yamamoto, Takahiro, E-mail: takahiro@rs.tus.ac.jp; Department of Liberal Arts
Conductance fluctuation of edge-disordered graphene nanoribbons (ED-GNRs) is examined using the non-equilibrium Green's function technique combined with the extended Hückel approximation. The mean free path λ and the localization length ξ of the ED-GNRs are determined to classify the quantum transport regimes. In the diffusive regime, where the length L_c of the ED-GNRs is much longer than λ and much shorter than ξ, the conductance histogram is given by a Gaussian distribution function with universal conductance fluctuation. In the localization regime, where L_c ≫ ξ, the histogram is no longer the universal Gaussian distribution but a lognormal distribution that characterizes Anderson localization.
Using Statistical Process Control to Make Data-Based Clinical Decisions.
ERIC Educational Resources Information Center
Pfadt, Al; Wheeler, Donald J.
1995-01-01
Statistical process control (SPC), which employs simple statistical tools and problem-solving techniques such as histograms, control charts, flow charts, and Pareto charts to implement continual product improvement procedures, can be incorporated into human service organizations. Examples illustrate use of SPC procedures to analyze behavioral data…
Benchmarking the Degree of Implementation of Learner-Centered Approaches
ERIC Educational Resources Information Center
Blumberg, Phyllis; Pontiggia, Laura
2011-01-01
We describe an objective way to measure whether curricula, educational programs, and institutions are learner-centered. This technique for benchmarking learner-centeredness uses rubrics to measure courses on 29 components within Weimer's five dimensions. We converted the scores on the rubrics to four-point indices and constructed histograms that…
Liu, Chunling; Wang, Kun; Li, Xiaodan; Zhang, Jine; Ding, Jie; Spuhler, Karl; Duong, Timothy; Liang, Changhong; Huang, Chuan
2018-06-01
Diffusion-weighted imaging (DWI) has been studied in breast imaging and can provide more information about diffusion, perfusion, and other physiological processes of interest than standard pulse sequences. The stretched-exponential model has previously been shown to be more reliable than conventional DWI techniques, but different diagnostic sensitivities were found from study to study. This work investigated the characteristics of whole-lesion histogram parameters derived from the stretched-exponential diffusion model for benign and malignant breast lesions, compared them with the conventional apparent diffusion coefficient (ADC), and further determined which histogram metrics can best be used to differentiate malignant from benign lesions. This was a prospective study. Seventy females were included in the study. Multi-b-value DWI was performed on a 1.5T scanner. Histogram parameters of whole lesions for the distributed diffusion coefficient (DDC), heterogeneity index (α), and ADC were calculated by two radiologists and compared among benign lesions, ductal carcinoma in situ (DCIS), and invasive carcinoma confirmed by pathology. Nonparametric tests were performed for comparisons among invasive carcinoma, DCIS, and benign lesions. Comparisons of receiver operating characteristic (ROC) curves were performed to show the ability to discriminate malignant from benign lesions. The majority of histogram parameters (mean/min/max, skewness/kurtosis, 10th-90th percentile values) from DDC, α, and ADC were significantly different among invasive carcinoma, DCIS, and benign lesions. DDC_10% (area under the curve [AUC] = 0.931), ADC_10% (AUC = 0.893), and α_mean (AUC = 0.787) were found to be the best metrics in differentiating benign from malignant tumors among all histogram parameters derived from ADC and α, respectively. The combination of DDC_10% and α_mean, using logistic regression, yielded the highest sensitivity (90.2%) and specificity (95.5%). DDC_10% and α_mean derived from the stretched-exponential model provide more information and better diagnostic performance in differentiating malignancy from benign lesions than ADC parameters derived from a monoexponential model. Level of Evidence: 2. Technical Efficacy: Stage 2. J. Magn. Reson. Imaging 2018;47:1701-1710. © 2017 International Society for Magnetic Resonance in Medicine.
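A voxel-wise fit of the stretched-exponential model S(b) = S0 * exp(-(b*DDC)^α), followed by whole-lesion histogram metrics such as the 10th percentile, can be sketched as follows; the initial guess and bounds are illustrative, and the metric set is a small subset of those listed above.

```python
import numpy as np
from scipy.optimize import curve_fit

def stretched_exp(b, s0, ddc, alpha):
    """Stretched-exponential DWI signal model: S(b) = S0 * exp(-(b*DDC)^alpha)."""
    return s0 * np.exp(-np.power(b * ddc, alpha))

def fit_voxel(bvals, signal):
    """Fit DDC and alpha for a single voxel (b in s/mm^2, illustrative bounds)."""
    p0 = (signal[0], 1e-3, 0.8)
    bounds = ([0.0, 1e-6, 0.1], [np.inf, 1e-1, 1.0])
    (s0, ddc, alpha), _ = curve_fit(stretched_exp, bvals, signal,
                                    p0=p0, bounds=bounds)
    return ddc, alpha

def lesion_histogram_metrics(values):
    """Whole-lesion histogram metrics of a fitted parameter map (e.g. DDC)."""
    values = np.asarray(values, dtype=np.float64)
    return {"p10": float(np.percentile(values, 10)),
            "mean": float(values.mean()),
            "median": float(np.median(values)),
            "skewness": float(((values - values.mean()) ** 3).mean()
                              / (values.std() ** 3 + 1e-12))}
```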
DOE Office of Scientific and Technical Information (OSTI.GOV)
Font, Joan; Beckman, John E.; Fathi, Kambiz
In this Letter, we introduce a technique for finding resonance radii in a disk galaxy. We use a two-dimensional velocity field in Hα emission obtained with Fabry-Perot interferometry, derive the classical rotation curve, and subtract it off, leaving a residual velocity map. As the streaming motions should reverse sign at corotation, we detect these reversals and plot them in a histogram against galactocentric radius, excluding points where the amplitude of the reversal is smaller than the measurement uncertainty. The histograms show well-defined peaks which we assume to occur at resonance radii, identifying corotations as the most prominent peaks corresponding to the relevant morphological features of the galaxy (notably bars and spiral arm systems). We compare our results with published measurements on the same galaxies using other methods and different types of data.
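The histogram step can be sketched as below for residual velocities sampled along a single profile (the published method works on the full 2-D residual map, so this one-dimensional version and its bin count are simplifying assumptions): sign reversals are located, reversals weaker than the measurement uncertainty are discarded, and the surviving radii are histogrammed so that peaks mark candidate corotation radii.

```python
import numpy as np

def corotation_candidates(radius, residual_velocity, amp_err, n_bins=40):
    """Histogram of galactocentric radii at which the residual (streaming)
    velocity changes sign, keeping only reversals larger than amp_err."""
    r = np.asarray(radius, dtype=np.float64)
    v = np.asarray(residual_velocity, dtype=np.float64)
    sign_change = np.where(np.sign(v[:-1]) * np.sign(v[1:]) < 0)[0]
    # amplitude of each reversal: jump between the two neighbouring samples
    amp = np.abs(v[sign_change + 1] - v[sign_change])
    keep = sign_change[amp > amp_err]
    r_rev = 0.5 * (r[keep] + r[keep + 1])       # mid-point radius of reversal
    counts, edges = np.histogram(r_rev, bins=n_bins)
    return counts, edges
```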
Analysis of memory use for improved design and compile-time allocation of local memory
NASA Technical Reports Server (NTRS)
Mcniven, Geoffrey D.; Davidson, Edward S.
1986-01-01
Trace analysis techniques are used to study memory referencing behavior for the purpose of designing local memories and determining how to allocate them for data and instructions. In an attempt to assess the inherent behavior of the source code, the trace analysis system described here reduced the effects of the compiler and host architecture on the trace by using a technique called flattening. The variables in the trace, their associated single-assignment values, and references are histogrammed on the basis of various parameters describing memory referencing behavior. Bounds are developed specifying the amount of memory space required to store all live values in a particular histogram class. The reduction achieved in main memory traffic by allocating local memory is specified for each class.
Improved image retrieval based on fuzzy colour feature vector
NASA Astrophysics Data System (ADS)
Ben-Ahmeida, Ahlam M.; Ben Sasi, Ahmed Y.
2013-03-01
Content-based image retrieval (CBIR) is an image indexing technique that retrieves images from a database automatically on the basis of visual content such as colour, texture, and shape. This paper discusses a CBIR method based on colour feature extraction and similarity checking. The query image and all images in the database are divided into pieces, the features of each part are extracted separately, and the corresponding portions are compared in order to increase retrieval accuracy. The proposed approach is based on fuzzy sets, to overcome the curse of dimensionality. The colour contribution of each pixel is associated with all the bins in the histogram using fuzzy-set membership functions. As a result, the Fuzzy Colour Histogram (FCH) outperformed the Conventional Colour Histogram (CCH) in image retrieval, returning results faster because images are represented as signatures that require less memory, depending on the number of divisions. The results also showed that the FCH is less sensitive and more robust to brightness changes than the CCH, with better retrieval recall values.
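A hedged sketch of the fuzzy colour histogram idea follows: each pixel spreads its vote over neighbouring bins through membership functions instead of voting for a single bin. The triangular membership shape, bin count, and L1 similarity used here are assumptions for illustration, not the authors' exact choices.

```python
import numpy as np

def fuzzy_histogram_1d(channel, n_bins=8):
    """Fuzzy histogram of one colour channel (values in [0, 255]).

    Each pixel spreads its unit contribution over the nearest bin centres
    with triangular membership weights, instead of casting a crisp vote.
    """
    centers = np.linspace(0, 255, n_bins)
    width = centers[1] - centers[0]
    values = channel.astype(float).ravel()
    hist = np.zeros(n_bins)
    for i, c in enumerate(centers):
        # Membership falls linearly to zero one bin-width away from the centre.
        weight = np.clip(1.0 - np.abs(values - c) / width, 0.0, None)
        hist[i] = weight.sum()
    return hist / hist.sum()

def fuzzy_colour_signature(image_rgb, n_bins=8):
    """Concatenate per-channel fuzzy histograms into one compact signature."""
    return np.concatenate(
        [fuzzy_histogram_1d(image_rgb[..., ch], n_bins) for ch in range(3)])

def signature_distance(sig_a, sig_b):
    """One simple similarity choice: L1 distance between signatures."""
    return float(np.abs(sig_a - sig_b).sum())
```

Because the signature has only a few tens of values, comparing a query against a large database stays cheap, which matches the memory and speed advantages reported above.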
Dietrich, John D.; Johnson, Ronald C.
2013-01-01
Thirteen stratigraphic cross sections of the Eocene Green River Formation in the Piceance Basin of northwestern Colorado are presented in this report. Originally published in a much larger and more detailed form by Self and others (2010), they are shown here in simplified, page-size versions that are easily accessed and used for presentation purposes. Modifications to the original versions include the elimination of the detailed lithologic columns and oil-yield histograms from Fischer assay data and the addition of ground-surface lines to give the depth of the various oil shale units shown on the cross section.
Reversible Data Hiding Based on DNA Computing
Xie, Yingjie
2017-01-01
Biocomputing, especially DNA, computing has got great development. It is widely used in information security. In this paper, a novel algorithm of reversible data hiding based on DNA computing is proposed. Inspired by the algorithm of histogram modification, which is a classical algorithm for reversible data hiding, we combine it with DNA computing to realize this algorithm based on biological technology. Compared with previous results, our experimental results have significantly improved the ER (Embedding Rate). Furthermore, some PSNR (peak signal-to-noise ratios) of test images are also improved. Experimental results show that it is suitable for protecting the copyright of cover image in DNA-based information security. PMID:28280504
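For context, the classical histogram-modification (histogram-shifting) scheme that such work builds on can be sketched as below for an 8-bit grayscale cover image; the DNA-coding layer of the paper is not reproduced, and the peak/zero-bin bookkeeping shown is the textbook variant, not necessarily the authors' exact implementation.

```python
import numpy as np

def embed_histogram_shift(image, bits):
    """Classical histogram-shifting reversible embedding (grayscale, 8-bit).

    Finds the peak bin P and an empty bin Z > P, shifts grey levels in
    (P, Z) up by one to free the level P+1, then encodes one bit at each
    pixel originally equal to P: bit 1 -> P+1, bit 0 -> stays at P.
    Returns the marked image and the (P, Z) side information needed for recovery.
    """
    img = image.astype(np.int32).copy()
    hist = np.bincount(img.ravel(), minlength=256)
    peak = int(hist.argmax())
    empty = np.where(hist[peak + 1:] == 0)[0]
    if empty.size == 0:
        raise ValueError("no empty bin above the peak; capacity is zero")
    zero = int(peak + 1 + empty[0])

    img[(img > peak) & (img < zero)] += 1          # make room at peak + 1
    carriers = np.flatnonzero(image.ravel() == peak)
    if len(bits) > carriers.size:
        raise ValueError("payload exceeds embedding capacity")
    flat = img.ravel()
    for bit, idx in zip(bits, carriers):
        if bit:
            flat[idx] = peak + 1
    return flat.reshape(image.shape).astype(np.uint8), (peak, zero)

def extract_and_restore(marked, side_info, n_bits):
    """Recover the payload bits and the original image exactly."""
    peak, zero = side_info
    img = marked.astype(np.int32).copy()
    flat = img.ravel()
    carriers = np.flatnonzero((flat == peak) | (flat == peak + 1))
    bits = [int(flat[i] == peak + 1) for i in carriers[:n_bits]]
    flat[carriers] = peak                          # undo the bit embedding
    flat[(flat >= peak + 2) & (flat <= zero)] -= 1  # undo the histogram shift
    return bits, flat.reshape(marked.shape).astype(np.uint8)
```

The embedding capacity equals the height of the peak bin, which is why improvements such as the DNA-based variant above focus on raising the achievable ER while keeping the distortion (PSNR) acceptable.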
Standardized volume-rendering of contrast-enhanced renal magnetic resonance angiography.
Smedby, O; Oberg, R; Asberg, B; Stenström, H; Eriksson, P
2005-08-01
To propose a technique for standardizing volume-rendering technique (VRT) protocols and to compare this with maximum intensity projection (MIP) in regard to image quality and diagnostic confidence in stenosis diagnosis with magnetic resonance angiography (MRA). Twenty patients were examined with MRA under suspicion of renal artery stenosis. Using the histogram function in the volume-rendering software, the 95th and 99th percentiles of the 3D data set were identified and used to define the VRT transfer function. Two radiologists assessed the stenosis pathology and image quality from rotational sequences of MIP and VRT images. Good overall agreement (mean kappa=0.72) was found between MIP and VRT diagnoses. The agreement between MIP and VRT was considerably better than that between observers (mean kappa=0.43). One of the observers judged VRT images as having higher image quality than MIP images. Presenting renal MRA images with VRT gave results in good agreement with MIP. With VRT protocols defined from the histogram of the image, the lack of an absolute gray scale in MRI need not be a major problem.
Bin Ratio-Based Histogram Distances and Their Application to Image Classification.
Hu, Weiming; Xie, Nianhua; Hu, Ruiguang; Ling, Haibin; Chen, Qiang; Yan, Shuicheng; Maybank, Stephen
2014-12-01
Large variations in image background may cause partial matching and normalization problems for histogram-based representations, i.e., the histograms of the same category may have bins which are significantly different, and normalization may produce large changes in the differences between corresponding bins. In this paper, we deal with this problem by using the ratios between bin values of histograms, rather than bin values' differences which are used in the traditional histogram distances. We propose a bin ratio-based histogram distance (BRD), which is an intra-cross-bin distance, in contrast with previous bin-to-bin distances and cross-bin distances. The BRD is robust to partial matching and histogram normalization, and captures correlations between bins with only a linear computational complexity. We combine the BRD with the ℓ1 histogram distance and the χ(2) histogram distance to generate the ℓ1 BRD and the χ(2) BRD, respectively. These combinations exploit and benefit from the robustness of the BRD under partial matching and the robustness of the ℓ1 and χ(2) distances to small noise. We propose a method for assessing the robustness of histogram distances to partial matching. The BRDs and logistic regression-based histogram fusion are applied to image classification. The experimental results on synthetic data sets show the robustness of the BRDs to partial matching, and the experiments on seven benchmark data sets demonstrate promising results of the BRDs for image classification.
Muscle categorization using PDF estimation and Naive Bayes classification.
Adel, Tameem M; Smith, Benn E; Stashuk, Daniel W
2012-01-01
The structure of motor unit potentials (MUPs) and their times of occurrence provide information about the motor units (MUs) that created them. As such, electromyographic (EMG) data can be used to categorize muscles as normal or suffering from a neuromuscular disease. Using pattern discovery (PD) allows clinicians to understand the rationale underlying a certain muscle characterization; i.e., it is transparent. Discretization is required in PD, which leads to some loss in accuracy. In this work, characterization techniques that are based on estimating probability density functions (PDFs) for each muscle category are implemented. Characterization probabilities of each motor unit potential train (MUPT) are obtained from these PDFs and then Bayes rule is used to aggregate the MUPT characterization probabilities to calculate muscle-level probabilities. Even though this technique is not as transparent as PD, its accuracy is higher than that of discrete PD. Ultimately, the goal is to use a technique that is based on both PDFs and PD and make it as transparent and as efficient as possible, but first it was necessary to thoroughly assess how accurate a fully continuous approach can be. Using Gaussian PDF estimation achieved improvements in muscle categorization accuracy over PD, and further improvements resulted from using feature value histograms to choose more representative PDFs, for instance using a log-normal distribution to represent skewed histograms.
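A schematic sketch of the PDF-plus-Bayes aggregation idea is given below for a single scalar MUPT feature: per-category Gaussian densities are fitted, MUPT-level posteriors are computed, and naive-Bayes aggregation yields a muscle-level probability. The data structures, priors, and the use of one feature only are simplifying assumptions.

```python
import numpy as np
from scipy.stats import norm

def fit_category_pdfs(training_features):
    """Fit a Gaussian PDF per muscle category to a scalar MUPT feature.

    training_features : dict mapping category name -> 1-D array of feature
    values from MUPTs of muscles with that label (illustrative structure).
    """
    return {cat: (float(np.mean(v)), float(np.std(v, ddof=1)))
            for cat, v in training_features.items()}

def mupt_posteriors(feature_value, pdfs, priors):
    """Posterior probability of each category for a single MUPT."""
    likes = {c: norm.pdf(feature_value, loc=m, scale=s) * priors[c]
             for c, (m, s) in pdfs.items()}
    total = sum(likes.values())
    return {c: l / total for c, l in likes.items()}

def muscle_posterior(mupt_features, pdfs, priors):
    """Naive-Bayes aggregation of MUPT evidence into muscle-level probabilities.

    Per-MUPT log-likelihoods are summed (log space for numerical stability)
    and combined with the category priors.
    """
    log_post = {c: np.log(priors[c]) for c in pdfs}
    for x in mupt_features:
        for c, (m, s) in pdfs.items():
            log_post[c] += norm.logpdf(x, loc=m, scale=s)
    shift = max(log_post.values())
    unnorm = {c: np.exp(v - shift) for c, v in log_post.items()}
    z = sum(unnorm.values())
    return {c: u / z for c, u in unnorm.items()}
```

Replacing `norm` with, say, `scipy.stats.lognorm` for skewed feature histograms corresponds to the refinement mentioned at the end of the abstract.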
Automatic lesion boundary detection in dermoscopy images using gradient vector flow snakes
Erkol, Bulent; Moss, Randy H.; Stanley, R. Joe; Stoecker, William V.; Hvatum, Erik
2011-01-01
Background Malignant melanoma has a good prognosis if treated early. Dermoscopy images of pigmented lesions are most commonly taken at × 10 magnification under lighting at a low angle of incidence while the skin is immersed in oil under a glass plate. Accurate skin lesion segmentation from the background skin is important because some of the features anticipated to be used for diagnosis deal with shape of the lesion and others deal with the color of the lesion compared with the color of the surrounding skin. Methods In this research, gradient vector flow (GVF) snakes are investigated to find the border of skin lesions in dermoscopy images. An automatic initialization method is introduced to make the skin lesion border determination process fully automated. Results Skin lesion segmentation results are presented for 70 benign and 30 melanoma skin lesion images for the GVF-based method and a color histogram analysis technique. The average errors obtained by the GVF-based method are lower for both the benign and melanoma image sets than for the color histogram analysis technique based on comparison with manually segmented lesions determined by a dermatologist. Conclusions The experimental results for the GVF-based method demonstrate promise as an automated technique for skin lesion segmentation in dermoscopy images. PMID:15691255
NASA Astrophysics Data System (ADS)
Moslehi, M.; de Barros, F.
2017-12-01
Complexity of hydrogeological systems arises from the multi-scale heterogeneity and insufficient measurements of their underlying parameters such as hydraulic conductivity and porosity. An inadequate characterization of hydrogeological properties can significantly decrease the trustworthiness of numerical models that predict groundwater flow and solute transport. Therefore, a variety of data assimilation methods have been proposed in order to estimate hydrogeological parameters from spatially scarce data by incorporating the governing physical models. In this work, we propose a novel framework for evaluating the performance of these estimation methods. We focus on the Ensemble Kalman Filter (EnKF) approach that is a widely used data assimilation technique. It reconciles multiple sources of measurements to sequentially estimate model parameters such as the hydraulic conductivity. Several methods have been used in the literature to quantify the accuracy of the estimations obtained by EnKF, including Rank Histograms, RMSE and Ensemble Spread. However, these commonly used methods do not regard the spatial information and variability of geological formations. This can cause hydraulic conductivity fields with very different spatial structures to have similar histograms or RMSE. We propose a vision-based approach that can quantify the accuracy of estimations by considering the spatial structure embedded in the estimated fields. Our new approach consists of adapting a new metric, Color Coherent Vectors (CCV), to evaluate the accuracy of estimated fields achieved by EnKF. CCV is a histogram-based technique for comparing images that incorporate spatial information. We represent estimated fields as digital three-channel images and use CCV to compare and quantify the accuracy of estimations. The sensitivity of CCV to spatial information makes it a suitable metric for assessing the performance of spatial data assimilation techniques. Under various factors of data assimilation methods such as number, layout, and type of measurements, we compare the performance of CCV with other metrics such as RMSE. By simulating hydrogeological processes using estimated and true fields, we observe that CCV outperforms other existing evaluation metrics.
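The colour coherence vector itself (in the usual sense of coherent versus incoherent pixels per quantized level) can be sketched compactly; quantizing a single-channel estimated field into discrete levels, the connectivity rule, and the coherence size threshold below are assumptions, not the authors' settings.

```python
import numpy as np
from scipy import ndimage

def color_coherence_vector(field, n_levels=8, size_threshold=25):
    """Coherence vector of a 2-D field (e.g., an estimated conductivity field).

    The field is quantized into n_levels discrete levels. For each level,
    pixels in connected components larger than size_threshold are counted
    as 'coherent', the rest as 'incoherent'. Unlike a plain histogram, the
    result retains information about spatial structure.
    """
    field = np.asarray(field, dtype=float)
    lo, hi = field.min(), field.max()
    quantized = np.clip(((field - lo) / (hi - lo + 1e-12) * n_levels).astype(int),
                        0, n_levels - 1)
    coherent = np.zeros(n_levels, dtype=int)
    incoherent = np.zeros(n_levels, dtype=int)
    for level in range(n_levels):
        labels, n_comp = ndimage.label(quantized == level)  # 4-connected components
        for comp in range(1, n_comp + 1):
            size = int((labels == comp).sum())
            if size >= size_threshold:
                coherent[level] += size
            else:
                incoherent[level] += size
    return coherent, incoherent

def ccv_distance(ccv_a, ccv_b):
    """One simple choice of distance: L1 over both coherence components."""
    return float(np.abs(ccv_a[0] - ccv_b[0]).sum()
                 + np.abs(ccv_a[1] - ccv_b[1]).sum())
```

Applied to an estimated and a reference field, a small `ccv_distance` requires agreement not only in the value distribution but also in how those values cluster spatially, which is what a plain RMSE or histogram comparison misses.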
Li, Zhiwei; Ai, Tao; Hu, Yiqi; Yan, Xu; Nickel, Marcel Dominik; Xu, Xiao; Xia, Liming
2018-01-01
To investigate the application of whole-lesion histogram analysis of pharmacokinetic parameters for differentiating malignant from benign breast lesions on dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI). In all, 92 women with 97 breast lesions (26 benign and 71 malignant lesions) were enrolled in this study. Patients underwent dynamic breast MRI at 3T using a prototypical CAIPIRINHA-Dixon-TWIST-VIBE (CDT-VIBE) sequence and a subsequent surgery or biopsy. Histogram parameters of the inflow rate of the agent between plasma and interstitium (Ktrans), the outflow rate of the agent between interstitium and plasma (Kep), and the extravascular space volume per unit volume of tissue (ve), including mean value, 25th/50th/75th/90th percentiles, skewness, and kurtosis, were then calculated based on the whole lesion. A single-sample Kolmogorov-Smirnov test, paired t-test, and receiver operating characteristic curve (ROC) analysis were used for statistical analysis. Malignant breast lesions had significantly higher Ktrans and Kep, and lower ve, in mean values and 25th/50th/75th/90th percentiles, and significantly higher skewness of ve than benign breast lesions (all P < 0.05). There was no significant difference in kurtosis values between malignant and benign breast lesions (all P > 0.05). The 90th percentile of Ktrans, the 90th percentile of Kep, and the 50th percentile of ve showed the greatest areas under the ROC curve (AUC) for each pharmacokinetic parameter derived from DCE-MRI. The 90th percentile of Kep achieved the highest AUC value (0.927) among all histogram-derived values. The whole-lesion histogram analysis of pharmacokinetic parameters can improve the diagnostic accuracy of breast DCE-MRI with the CDT-VIBE technique. The 90th percentile of Kep may be the best indicator for differentiating between malignant and benign breast lesions. Level of Evidence: 4 Technical Efficacy: Stage 2 J. Magn. Reson. Imaging 2018;47:91-96. © 2017 International Society for Magnetic Resonance in Medicine.
Theory and Application of DNA Histogram Analysis.
ERIC Educational Resources Information Center
Bagwell, Charles Bruce
The underlying principles and assumptions associated with DNA histograms are discussed along with the characteristics of fluorescent probes. Information theory was described and used to calculate the information content of a DNA histogram. Two major types of DNA histogram analyses are proposed: parametric and nonparametric analysis. Three levels…
NOTE: Reducing the number of segments in unidirectional MLC segmentations
NASA Astrophysics Data System (ADS)
Mellado, X.; Cruz, S.; Artacho, J. M.; Canellas, M.
2010-02-01
In intensity-modulated radiation therapy (IMRT), fluence matrices obtained from a treatment planning system are usually delivered by a linear accelerator equipped with a multileaf collimator (MLC). A segmentation method is needed for decomposing these fluence matrices into segments suitable for the MLC, and the number of segments used is an important factor for treatment time. In this work, an algorithm for reduction of the number of segments (NS) is presented for unidirectional segmentations, where there is no backtracking of the MLC leaves. It uses a geometrical representation of the segmentation output to search for the key values in a fluence matrix that complicate its decomposition. The NS reduction is achieved by making minor modifications to these values, under the condition that the dose-volume histogram is not substantially modified, and does not, on average, increase the total number of monitor units delivered. The proposed method was tested using two clinical cases planned with the PCRT 3D® treatment planning system.
Measuring kinetics of complex single ion channel data using mean-variance histograms.
Patlak, J B
1993-07-01
The measurement of single ion channel kinetics is difficult when those channels exhibit subconductance events. When the kinetics are fast, and when the current magnitudes are small, as is the case for Na+, Ca2+, and some K+ channels, these difficulties can lead to serious errors in the estimation of channel kinetics. I present here a method, based on the construction and analysis of mean-variance histograms, that can overcome these problems. A mean-variance histogram is constructed by calculating the mean current and the current variance within a brief "window" (a set of N consecutive data samples) superimposed on the digitized raw channel data. Systematic movement of this window over the data produces large numbers of mean-variance pairs which can be assembled into a two-dimensional histogram. Defined current levels (open, closed, or sublevel) appear in such plots as low variance regions. The total number of events in such low variance regions is estimated by curve fitting and plotted as a function of window width. This function decreases with the same time constants as the original dwell time probability distribution for each of the regions. The method can therefore be used: 1) to present a qualitative summary of the single channel data from which the signal-to-noise ratio, open channel noise, steadiness of the baseline, and number of conductance levels can be quickly determined; 2) to quantify the dwell time distribution in each of the levels exhibited. In this paper I present the analysis of a Na+ channel recording that had a number of complexities. The signal-to-noise ratio was only about 8 for the main open state, open channel noise, and fast flickers to other states were present, as were a substantial number of subconductance states. "Standard" half-amplitude threshold analysis of these data produce open and closed time histograms that were well fitted by the sum of two exponentials, but with apparently erroneous time constants, whereas the mean-variance histogram technique provided a more credible analysis of the open, closed, and subconductance times for the patch. I also show that the method produces accurate results on simulated data in a wide variety of conditions, whereas the half-amplitude method, when applied to complex simulated data shows the same errors as were apparent in the real data. The utility and the limitations of this new method are discussed.
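The construction step described above is easy to sketch in NumPy: slide a window of N samples over the record, compute the mean and variance at each position, and accumulate the pairs into a 2-D histogram. The window length and bin counts below are arbitrary placeholders.

```python
import numpy as np

def mean_variance_histogram(current, window=10, mean_bins=100, var_bins=100):
    """Build a mean-variance histogram from a digitized single-channel record.

    current : 1-D array of current samples.
    window  : number of consecutive samples per window (N in the text).
    Returns (hist2d, mean_edges, var_edges).
    """
    current = np.asarray(current, dtype=float)
    # Sliding-window mean and variance via cumulative sums (O(n) overall).
    csum = np.concatenate(([0.0], np.cumsum(current)))
    csum2 = np.concatenate(([0.0], np.cumsum(current ** 2)))
    means = (csum[window:] - csum[:-window]) / window
    mean_sq = (csum2[window:] - csum2[:-window]) / window
    variances = np.maximum(mean_sq - means ** 2, 0.0)
    hist2d, mean_edges, var_edges = np.histogram2d(
        means, variances, bins=(mean_bins, var_bins))
    return hist2d, mean_edges, var_edges
```

Low-variance regions of the resulting 2-D histogram correspond to the defined current levels; repeating the construction for a range of window widths gives the event-count-versus-width curves used for the kinetic fits described above.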
Histogram deconvolution - An aid to automated classifiers
NASA Technical Reports Server (NTRS)
Lorre, J. J.
1983-01-01
It is shown that N-dimensional histograms are convolved by the addition of noise in the picture domain. Three methods are described which provide the ability to deconvolve such noise-affected histograms. The purpose of the deconvolution is to provide automated classifiers with a higher quality N-dimensional histogram from which to obtain classification statistics.
Parameterization of the Age-Dependent Whole Brain Apparent Diffusion Coefficient Histogram
Batra, Marion; Nägele, Thomas
2015-01-01
Purpose. The distribution of apparent diffusion coefficient (ADC) values in the brain can be used to characterize age effects and pathological changes of the brain tissue. The aim of this study was the parameterization of the whole brain ADC histogram by an advanced model with influence of age considered. Methods. Whole brain ADC histograms were calculated for all data and for seven age groups between 10 and 80 years. Modeling of the histograms was performed for two parts of the histogram separately: the brain tissue part was modeled by two Gaussian curves, while the remaining part was fitted by the sum of a Gaussian curve, a biexponential decay, and a straight line. Results. A consistent fitting of the histograms of all age groups was possible with the proposed model. Conclusions. This study confirms the strong dependence of the whole brain ADC histograms on the age of the examined subjects. The proposed model can be used to characterize changes of the whole brain ADC histogram in certain diseases under consideration of age effects. PMID:26609526
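The brain-tissue part of such a model can be fitted with scipy.optimize.curve_fit; the sketch below fits only the two-Gaussian tissue component and omits the non-tissue terms (Gaussian plus biexponential decay plus straight line), and the ADC range and initial guesses are illustrative rather than the published settings.

```python
import numpy as np
from scipy.optimize import curve_fit

def two_gaussians(x, a1, mu1, s1, a2, mu2, s2):
    """Sum of two Gaussian curves (the brain-tissue part of the model)."""
    return (a1 * np.exp(-0.5 * ((x - mu1) / s1) ** 2)
            + a2 * np.exp(-0.5 * ((x - mu2) / s2) ** 2))

def fit_tissue_histogram(adc_values, bins=256, adc_range=(0.2e-3, 1.6e-3)):
    """Fit the tissue portion of a whole-brain ADC histogram.

    adc_values : 1-D array of ADC values from brain voxels (mm^2/s).
    The range and starting parameters are illustrative assumptions.
    """
    counts, edges = np.histogram(adc_values, bins=bins, range=adc_range)
    centers = 0.5 * (edges[:-1] + edges[1:])
    p0 = [counts.max(), 0.7e-3, 0.1e-3,          # first tissue peak (guess)
          0.5 * counts.max(), 0.9e-3, 0.15e-3]   # second tissue peak (guess)
    params, _ = curve_fit(two_gaussians, centers, counts, p0=p0, maxfev=20000)
    return params, centers, counts
```

Tracking how the fitted peak positions and widths drift across the age groups is then a compact way to express the age dependence reported above.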
Sadasivan, Chander; Brownstein, Jeremy; Patel, Bhumika; Dholakia, Ronak; Santore, Joseph; Al-Mufti, Fawaz; Puig, Enrique; Rakian, Audrey; Fernandez-Prada, Kenneth D; Elhammady, Mohamed S; Farhat, Hamad; Fiorella, David J; Woo, Henry H; Aziz-Sultan, Mohammad A; Lieber, Baruch B
2013-03-01
Endovascular coiling of cerebral aneurysms remains limited by coil compaction and associated recanalization. Recent coil designs which effect higher packing densities may be far from optimal because hemodynamic forces causing compaction are not well understood since detailed data regarding the location and distribution of coil masses are unavailable. We present an in vitro methodology to characterize coil masses deployed within aneurysms by quantifying intra-aneurysmal void spaces. Eight identical aneurysms were packed with coils by both balloon- and stent-assist techniques. The samples were embedded, sequentially sectioned and imaged. Empty spaces between the coils were numerically filled with circles (2D) in the planar images and with spheres (3D) in the three-dimensional composite images. The 2D and 3D void size histograms were analyzed for local variations and by fitting theoretical probability distribution functions. Balloon-assist packing densities (31±2%) were lower (p = 0.04) than the stent-assist group (40±7%). The maximum and average 2D and 3D void sizes were higher (p = 0.03 to 0.05) in the balloon-assist group as compared to the stent-assist group. None of the void size histograms were normally distributed; theoretical probability distribution fits suggest that the histograms are most probably exponentially distributed with decay constants of 6-10 mm. Significant (p ≤ 0.001 to p = 0.03) spatial trends were noted with the void sizes but correlation coefficients were generally low (absolute r ≤ 0.35). The methodology we present can provide valuable input data for numerical calculations of hemodynamic forces impinging on intra-aneurysmal coil masses and be used to compare and optimize coil configurations as well as coiling techniques.
Introducing parallelism to histogramming functions for GEM systems
NASA Astrophysics Data System (ADS)
Krawczyk, Rafał D.; Czarski, Tomasz; Kolasinski, Piotr; Pozniak, Krzysztof T.; Linczuk, Maciej; Byszuk, Adrian; Chernyshova, Maryna; Juszczyk, Bartlomiej; Kasprowicz, Grzegorz; Wojenski, Andrzej; Zabolotny, Wojciech
2015-09-01
This article assesses the potential parallelization of histogramming algorithms in a GEM detector system. Histogramming and preprocessing algorithms in MATLAB were analyzed with regard to adding parallelism. A preliminary implementation of parallel strip histogramming resulted in a speedup. An analysis of the algorithms' parallelizability is presented, and an overview of potential hardware and software support for implementing the parallel algorithm is discussed.
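The basic parallel-histogramming pattern analyzed in such work can be illustrated in Python (the MATLAB/FPGA specifics of the GEM system are not reproduced): each worker histograms a chunk of the event stream into a private bin array, and the partial histograms are merged by an element-wise sum, so the result is bit-identical to a single-pass histogram. The bin count and charge range are placeholder values.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

N_BINS = 256
BIN_RANGE = (0.0, 4096.0)   # illustrative charge range (ADC units)

def partial_histogram(chunk):
    """Histogram one chunk of charge samples into a private bin array."""
    counts, _ = np.histogram(chunk, bins=N_BINS, range=BIN_RANGE)
    return counts

def parallel_histogram(samples, n_workers=4):
    """Split the sample stream, histogram chunks in parallel, merge by summing."""
    chunks = np.array_split(np.asarray(samples), n_workers)
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        partials = list(pool.map(partial_histogram, chunks))
    return np.sum(partials, axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    data = rng.uniform(0, 4096, size=1_000_000)
    assert np.array_equal(parallel_histogram(data),
                          np.histogram(data, bins=N_BINS, range=BIN_RANGE)[0])
```

The same partition-then-merge structure is what maps naturally onto per-strip processing in hardware.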
Comparison of Histograms for Use in Cloud Observation and Modeling
NASA Technical Reports Server (NTRS)
Green, Lisa; Xu, Kuan-Man
2005-01-01
Cloud observation and cloud modeling data can be presented in histograms for each characteristic to be measured. Combining information from single-cloud histograms yields a summary histogram. Summary histograms can be compared to each other to reach conclusions about the behavior of an ensemble of clouds in different places at different times or about the accuracy of a particular cloud model. As in any scientific comparison, it is necessary to decide whether any apparent differences are statistically significant. The usual methods of deciding statistical significance when comparing histograms do not apply in this case because they assume independent data. Thus, a new method is necessary. The proposed method uses the Euclidean distance metric and bootstrapping to calculate the significance level.
Analytical techniques of pilot scanning behavior and their application
NASA Technical Reports Server (NTRS)
Harris, R. L., Sr.; Glover, B. J.; Spady, A. A., Jr.
1986-01-01
The state of the art of oculometric data analysis techniques and their applications in certain research areas such as pilot workload, information transfer provided by various display formats, crew role in automated systems, and pilot training are documented. These analytical techniques produce the following data: real-time viewing of the pilot's scanning behavior, average dwell times, dwell percentages, instrument transition paths, dwell histograms, and entropy rate measures. These types of data are discussed, and overviews of the experimental setup, data analysis techniques, and software are presented. A glossary of terms frequently used in pilot scanning behavior and a bibliography of reports on related research sponsored by NASA Langley Research Center are also presented.
Ghosh, Tonmoy; Fattah, Shaikh Anowarul; Wahid, Khan A
2018-01-01
Wireless capsule endoscopy (WCE) is the most advanced technology for visualizing the whole gastrointestinal (GI) tract in a non-invasive way. Its major disadvantage is the long reviewing time, which is laborious because continuous manual intervention is necessary. In order to reduce the burden on the clinician, in this paper an automatic bleeding detection method for WCE video is proposed based on the color histogram of block statistics, namely CHOBS. A single pixel in a WCE image may be distorted due to the capsule motion in the GI tract. Instead of considering individual pixel values, a block surrounding that individual pixel is chosen for extracting local statistical features. By combining local block features of the three different color planes of the RGB color space, an index value is defined. A color histogram, which is extracted from those index values, provides a distinguishable color texture feature. A feature reduction technique utilizing the color histogram pattern and principal component analysis is proposed, which can drastically reduce the feature dimension. For bleeding zone detection, blocks are classified using the extracted local features, which adds no extra computational burden for feature extraction. From extensive experimentation on several WCE videos and 2300 images collected from a publicly available database, a very satisfactory bleeding frame and zone detection performance is achieved in comparison to that obtained by some of the existing methods. In the case of bleeding frame detection, the accuracy, sensitivity, and specificity obtained from the proposed method are 97.85%, 99.47%, and 99.15%, respectively, and in the case of bleeding zone detection, a precision of 95.75% is achieved. The proposed method offers not only a low feature dimension but also highly satisfactory bleeding detection performance, and it can effectively detect bleeding frames and zones in a continuous WCE video.
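A simplified sketch of the block-statistics idea is given below: replace each pixel by the mean of a small surrounding block in each RGB plane, pack the three quantized block means into a single index, and histogram the indices. The exact index definition, block size, and the PCA-based feature reduction of the paper are not reproduced; PCA would simply be applied to the resulting histograms afterwards.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def block_index_histogram(image_rgb, block=3, levels=8):
    """Colour-texture histogram from local block statistics (CHOBS-style sketch).

    Each pixel is described by the mean of its block x block neighbourhood in
    R, G and B (reducing the effect of single-pixel distortions). The three
    block means are quantized to 'levels' values, packed into one index in
    [0, levels**3), and the histogram of indices is the frame feature.
    """
    img = np.asarray(image_rgb, dtype=float)
    block_means = np.stack(
        [uniform_filter(img[..., c], size=block) for c in range(3)], axis=-1)
    quantized = np.clip((block_means / 256.0 * levels).astype(int),
                        0, levels - 1)
    index = (quantized[..., 0] * levels * levels
             + quantized[..., 1] * levels
             + quantized[..., 2])
    hist = np.bincount(index.ravel(), minlength=levels ** 3).astype(float)
    return hist / hist.sum()
```

A classifier such as an SVM or k-NN trained on these histograms (optionally compressed with PCA) would then label frames as bleeding or non-bleeding.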
NASA Astrophysics Data System (ADS)
Zavaletta, Vanessa A.; Bartholmai, Brian J.; Robb, Richard A.
2007-03-01
Diffuse lung diseases, such as idiopathic pulmonary fibrosis (IPF), can be characterized and quantified by analysis of volumetric high resolution CT scans of the lungs. These data sets typically have dimensions of 512 x 512 x 400. It is too subjective and labor intensive for a radiologist to analyze each slice and quantify regional abnormalities manually. Thus, computer aided techniques are necessary, particularly texture analysis techniques which classify various lung tissue types. Second and higher order statistics which relate the spatial variation of the intensity values are good discriminatory features for various textures. The intensity values in lung CT scans range between [-1024, 1024]. Calculation of second order statistics on this range is too computationally intensive so the data is typically binned between 16 or 32 gray levels. There are more effective ways of binning the gray level range to improve classification. An optimal and very efficient way to nonlinearly bin the histogram is to use a dynamic programming algorithm. The objective of this paper is to show that nonlinear binning using dynamic programming is computationally efficient and improves the discriminatory power of the second and higher order statistics for more accurate quantification of diffuse lung disease.
Marinozzi, Franco; Bini, Fabiano; Marinozzi, Andrea; Zuppante, Francesca; De Paolis, Annalisa; Pecci, Raffaella; Bedini, Rossella
2013-01-01
Micro-CT analysis is a powerful technique for non-invasive evaluation of the morphometric parameters of trabecular bone samples. This analysis requires a prior binarization of the images, and a problem which arises from the binarization process is the partial volume artifact: voxels at the external surface of the sample can contain both bone and air, so thresholding produces an incorrect estimation of the volume occupied by the two materials. The aim of this study is the extraction of bone volumetric information directly from the image histograms, by fitting them with a suitable set of functions. Nineteen trabecular bone samples were extracted from the femoral heads of eight patients undergoing hip arthroplasty surgery. The samples were acquired using a micro-CT scanner. Histograms of the acquired images were computed and fitted by Gaussian-like functions accounting for: a) gray levels produced by bone x-ray absorption, b) the portions of the image occupied by air, and c) voxels that contain a mixture of bone and air. This latter contribution can be considered an estimation of the partial volume effect. Comparison of the proposed technique with bone volumes measured by a reference instrument, a helium pycnometer, shows the method to be a good way to calculate the bone volume of trabecular bone samples accurately.
Towards human behavior recognition based on spatio temporal features and support vector machines
NASA Astrophysics Data System (ADS)
Ghabri, Sawsen; Ouarda, Wael; Alimi, Adel M.
2017-03-01
Security and surveillance are vital issues in today's world, and recent acts of terrorism have highlighted the urgent need for efficient surveillance. There is a need for an automated video surveillance system which can detect the identity and activity of a person. In this article, we propose a new paradigm to recognize aggressive human behavior such as a boxing action. Our proposed system for human activity detection fuses Spatio Temporal Interest Point (STIP) and Histogram of Oriented Gradient (HoG) features into a novel feature called the Spatio Temporal Histogram of Oriented Gradient (STHOG). To evaluate the robustness of the proposed paradigm, with the HoG technique applied locally at STIP points, we performed experiments on the KTH human action dataset using multi-class Support Vector Machine classification. The proposed scheme outperforms basic descriptors like HoG and STIP, achieving a classification accuracy of 82.26%.
Liquid-liquid transition in the ST2 model of water
NASA Astrophysics Data System (ADS)
Debenedetti, Pablo
2013-03-01
We present clear evidence of the existence of a metastable liquid-liquid phase transition in the ST2 model of water. Using four different techniques (the weighted histogram analysis method with single-particle moves, well-tempered metadynamics with single-particle moves, weighted histograms with parallel tempering and collective particle moves, and conventional molecular dynamics), we calculate the free energy surface over a range of thermodynamic conditions, we perform a finite size scaling analysis for the free energy barrier between the coexisting liquid phases, we demonstrate the attainment of diffusive behavior, and we perform stringent thermodynamic consistency checks. The results provide conclusive evidence of a first-order liquid-liquid transition. We also show that structural equilibration in the sluggish low-density phase is attained over the time scale of our simulations, and that crystallization times are significantly longer than structural equilibration, even under deeply supercooled conditions. We place our results in the context of the theory of metastability.
NASA Astrophysics Data System (ADS)
Liu, Changjiang; Cheng, Irene; Zhang, Yi; Basu, Anup
2017-06-01
This paper presents an improved multi-scale Retinex (MSR) based enhancement for aerial images under low visibility. For traditional multi-scale Retinex, three scales are commonly employed, which limits its application scenarios. We extend this to a general-purpose enhancement method and design an MSR with more than three scales. Based on mathematical analysis and deduction, an explicit multi-scale representation is proposed that balances image contrast and color consistency. In addition, a histogram truncation technique is introduced as a post-processing strategy to remap the multi-scale Retinex output to the dynamic range of the display. Analysis of experimental results and comparisons with existing algorithms demonstrate the effectiveness and generality of the proposed method. Results on image quality assessment prove the accuracy of the proposed method with respect to both objective and subjective criteria.
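A compact sketch of the MSR-plus-histogram-truncation pipeline is shown below for a single channel; the scales, weights, and truncation percentiles are illustrative defaults rather than the authors' tuned values, and the explicit multi-scale weighting derived in the paper is not reproduced.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def multi_scale_retinex(channel, sigmas=(15, 80, 250), weights=None):
    """Multi-scale Retinex of one image channel (values in [0, 255])."""
    img = channel.astype(float) + 1.0                 # avoid log(0)
    if weights is None:
        weights = np.full(len(sigmas), 1.0 / len(sigmas))
    msr = np.zeros_like(img)
    for w, sigma in zip(weights, sigmas):
        surround = gaussian_filter(img, sigma=sigma)  # centre/surround ratio in log space
        msr += w * (np.log(img) - np.log(surround + 1e-6))
    return msr

def histogram_truncation(msr, low_pct=1.0, high_pct=99.0):
    """Clip the histogram tails of the unbounded MSR output and stretch to [0, 255]."""
    lo, hi = np.percentile(msr, [low_pct, high_pct])
    clipped = np.clip(msr, lo, hi)
    return ((clipped - lo) / (hi - lo + 1e-12) * 255.0).astype(np.uint8)

def enhance_channel(channel):
    return histogram_truncation(multi_scale_retinex(channel))
```

Truncating the histogram before rescaling prevents a few extreme Retinex outputs from compressing the usable display range, which is the role the post-processing step plays above.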
A new phase correction method in NMR imaging based on autocorrelation and histogram analysis.
Ahn, C B; Cho, Z H
1987-01-01
A new statistical approach to phase correction in NMR imaging is proposed. The proposed scheme consists of first- and zero-order phase corrections, each performed by inverse multiplication of the estimated phase error. The first-order error is estimated by the phase of the autocorrelation calculated from the complex-valued phase-distorted image, while the zero-order correction factor is extracted from the histogram of the phase distribution of the first-order corrected image. Since all the correction procedures are performed in the spatial domain after completion of data acquisition, no prior adjustments or additional measurements are required. The algorithm is applicable to most phase-sensitive NMR imaging techniques, including inversion recovery imaging, quadrature modulated imaging, spectroscopic imaging, and flow imaging. Some experimental results with inversion recovery imaging as well as quadrature spectroscopic imaging are shown to demonstrate the usefulness of the algorithm.
Mechanisms of behavior modification in clinical behavioral medicine in China.
Yang, Zhiyin; Su, Zhonghua; Ji, Feng; Zhu, Min; Bai, Bo
2014-08-01
Behavior modification, as the core of clinical behavioral medicine, is often used in clinical settings. We seek to summarize behavior modification techniques that are commonly used in clinical practice of behavioral medicine in China and discuss possible biobehavioral mechanisms. We reviewed common behavior modification techniques in clinical settings in China, and we reviewed studies that explored possible biobehavioral mechanisms. Commonly used clinical approaches of behavior modification in China include behavior therapy, cognitive therapy, cognitive-behavioral therapy, health education, behavior management, behavioral relaxation training, stress management intervention, desensitization therapy, biofeedback therapy, and music therapy. These techniques have been applied in the clinical treatment of a variety of diseases, such as chronic diseases, psychosomatic diseases, and psychological disorders. The biobehavioral mechanisms of these techniques involve the autonomic nervous system, neuroendocrine system, neurobiochemistry, and neuroplasticity. Behavior modification techniques are commonly used in the treatment of a variety of somatic and psychological disorders in China. Multiple biobehavioral mechanisms are involved in successful behavior modification.
Predicting the Valence of a Scene from Observers’ Eye Movements
R.-Tavakoli, Hamed; Atyabi, Adham; Rantanen, Antti; Laukka, Seppo J.; Nefti-Meziani, Samia; Heikkilä, Janne
2015-01-01
Multimedia analysis benefits from understanding the emotional content of a scene in a variety of tasks such as video genre classification and content-based image retrieval. Recently, there has been an increasing interest in applying human bio-signals, particularly eye movements, to recognize the emotional gist of a scene such as its valence. In order to determine the emotional category of images using eye movements, the existing methods often learn a classifier using several features that are extracted from eye movements. Although it has been shown that eye movement is potentially useful for recognition of scene valence, the contribution of each feature is not well-studied. To address the issue, we study the contribution of features extracted from eye movements in the classification of images into pleasant, neutral, and unpleasant categories. We assess ten features and their fusion. The features are histogram of saccade orientation, histogram of saccade slope, histogram of saccade length, histogram of saccade duration, histogram of saccade velocity, histogram of fixation duration, fixation histogram, top-ten salient coordinates, and saliency map. We utilize machine learning approach to analyze the performance of features by learning a support vector machine and exploiting various feature fusion schemes. The experiments reveal that ‘saliency map’, ‘fixation histogram’, ‘histogram of fixation duration’, and ‘histogram of saccade slope’ are the most contributing features. The selected features signify the influence of fixation information and angular behavior of eye movements in the recognition of the valence of images. PMID:26407322
TU-G-BRD-08: In-Vivo EPID Dosimetry: Quantifying the Detectability of Four Classes of Errors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ford, E; Phillips, M; Bojechko, C
Purpose: EPID dosimetry is an emerging method for treatment verification and QA. Given that the in-vivo EPID technique is in clinical use at some centers, we investigate the sensitivity and specificity for detecting different classes of errors. We assess the impact of these errors using dose volume histogram endpoints. Though data exist for EPID dosimetry performed pre-treatment, this is the first study quantifying its effectiveness when used during patient treatment (in-vivo). Methods: We analyzed 17 patients; EPID images of the exit dose were acquired and used to reconstruct the planar dose at isocenter. This dose was compared to the TPS dose using a 3%/3mm gamma criterion. To simulate errors, modifications were made to treatment plans using four possible classes of error: 1) patient misalignment, 2) changes in patient body habitus, 3) machine output changes and 4) MLC misalignments. Each error was applied with varying magnitudes. To assess the detectability of the error, the area under a ROC curve (AUC) was analyzed. The AUC was compared to changes in D99 of the PTV introduced by the simulated error. Results: For systematic changes in the MLC leaves, changes in the machine output and patient habitus, the AUC varied from 0.78–0.97 scaling with the magnitude of the error. The optimal gamma threshold as determined by the ROC curve varied between 84–92%. There was little diagnostic power in detecting random MLC leaf errors and patient shifts (AUC 0.52–0.74). Some errors with weak detectability had large changes in D99. Conclusion: These data demonstrate the ability of EPID-based in-vivo dosimetry in detecting variations in patient habitus and errors related to machine parameters such as systematic MLC misalignments and machine output changes. There was no correlation found between the detectability of the error using the gamma pass rate, ROC analysis and the impact on the dose volume histogram. Funded by grant R18HS022244 from AHRQ.
Histogram analysis of T2*-based pharmacokinetic imaging in cerebral glioma grading.
Liu, Hua-Shan; Chiang, Shih-Wei; Chung, Hsiao-Wen; Tsai, Ping-Huei; Hsu, Fei-Ting; Cho, Nai-Yu; Wang, Chao-Ying; Chou, Ming-Chung; Chen, Cheng-Yu
2018-03-01
To investigate the feasibility of histogram analysis of the T2*-based permeability parameter volume transfer constant (Ktrans) for glioma grading and to explore the diagnostic performance of the histogram analysis of Ktrans and blood plasma volume (vp). We recruited 31 and 11 patients with high- and low-grade gliomas, respectively. The histogram parameters of Ktrans and vp, derived from the first-pass pharmacokinetic modeling based on the T2* dynamic susceptibility-weighted contrast-enhanced perfusion-weighted magnetic resonance imaging (T2* DSC-PW-MRI) from the entire tumor volume, were evaluated for differentiating glioma grades. Histogram parameters of Ktrans and vp showed significant differences between high- and low-grade gliomas and exhibited significant correlations with tumor grades. The mean Ktrans derived from the T2* DSC-PW-MRI had the highest sensitivity and specificity for differentiating high-grade gliomas from low-grade gliomas compared with other histogram parameters of Ktrans and vp. Histogram analysis of T2*-based pharmacokinetic imaging is useful for cerebral glioma grading. The histogram parameters of the entire tumor Ktrans measurement can provide increased accuracy with additional information regarding microvascular permeability changes for identifying high-grade brain tumors. Copyright © 2017 Elsevier B.V. All rights reserved.
Stochastic HKMDHE: A multi-objective contrast enhancement algorithm
NASA Astrophysics Data System (ADS)
Pratiher, Sawon; Mukhopadhyay, Sabyasachi; Maity, Srideep; Pradhan, Asima; Ghosh, Nirmalya; Panigrahi, Prasanta K.
2018-02-01
This contribution proposes a novel extension of the existing `Hyper Kurtosis based Modified Duo-Histogram Equalization' (HKMDHE) algorithm for multi-objective contrast enhancement of biomedical images. A modified objective function is formulated by jointly optimizing the individual histogram equalization objectives. The adequacy of the proposed methodology with respect to image quality metrics such as brightness preservation, peak signal-to-noise ratio (PSNR), Structural Similarity Index (SSIM), and the universal image quality metric has been experimentally validated. A performance comparison of the proposed Stochastic HKMDHE with existing histogram equalization methodologies such as Global Histogram Equalization (GHE) and Contrast Limited Adaptive Histogram Equalization (CLAHE) is also given.
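For reference, the GHE baseline that such methods are compared against is the textbook CDF mapping sketched below; this is not the HKMDHE algorithm itself, only the standard routine it is benchmarked against.

```python
import numpy as np

def global_histogram_equalization(image):
    """Textbook global histogram equalization for an 8-bit grayscale image.

    Each grey level is mapped through the normalized cumulative histogram so
    that the output histogram is approximately uniform.
    """
    img = np.asarray(image, dtype=np.uint8)
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = np.cumsum(hist).astype(float)
    cdf_min = cdf[hist > 0][0]                   # CDF value at the first occupied bin
    transfer = np.round((cdf - cdf_min) / (cdf[-1] - cdf_min + 1e-12) * 255.0)
    return transfer.astype(np.uint8)[img]        # apply the lookup table
```

Brightness-preserving variants such as HKMDHE modify this transfer function so that the mean intensity and fine structure are retained while contrast is stretched.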
Infrared image segmentation method based on spatial coherence histogram and maximum entropy
NASA Astrophysics Data System (ADS)
Liu, Songtao; Shen, Tongsheng; Dai, Yao
2014-11-01
In order to segment the target well and suppress background noise effectively, an infrared image segmentation method based on a spatial coherence histogram and maximum entropy is proposed. First, the spatial coherence histogram is constructed by weighting the importance of the different positions of pixels with the same gray level, obtained by computing their local density. Then, after enhancing the image with the spatial coherence histogram, a 1D maximum entropy method is used to segment the image. The novel method not only produces better segmentation results but also has a faster computation time than traditional 2D histogram-based segmentation methods.
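The 1D maximum entropy step can be sketched with a Kapur-style criterion on a grey-level histogram, as below; the spatial-coherence weighting described above is not reproduced here, but it would simply replace the plain np.bincount counts with the weighted histogram.

```python
import numpy as np

def max_entropy_threshold(histogram):
    """Kapur-style maximum entropy threshold selection on a 1-D histogram.

    histogram : array of (possibly weighted) grey-level counts.
    Returns the grey level t that maximizes the sum of the entropies of the
    background and object distributions split at t.
    """
    p = histogram.astype(float)
    p /= p.sum()
    best_t, best_score = 0, -np.inf
    for t in range(1, len(p) - 1):
        p0, p1 = p[:t].sum(), p[t:].sum()
        if p0 <= 0 or p1 <= 0:
            continue
        q0, q1 = p[:t] / p0, p[t:] / p1
        h0 = -np.sum(q0[q0 > 0] * np.log(q0[q0 > 0]))
        h1 = -np.sum(q1[q1 > 0] * np.log(q1[q1 > 0]))
        if h0 + h1 > best_score:
            best_score, best_t = h0 + h1, t
    return best_t

def segment(image):
    """Binary segmentation of an 8-bit image at the maximum-entropy threshold."""
    hist = np.bincount(np.asarray(image, dtype=np.uint8).ravel(), minlength=256)
    return image >= max_entropy_threshold(hist)
```

Weighting pixels by local density before thresholding pushes the chosen threshold toward separating spatially coherent targets from scattered background noise.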
ERIC Educational Resources Information Center
Vandermeulen, H.; DeWreede, R. E.
1983-01-01
Presents a histogram drawing program which sorts real numbers in up to 30 categories. Entered data are sorted and saved in a text file which is then used to generate the histogram. Complete Applesoft program listings are included. (JN)
Bin recycling strategy for improving the histogram precision on GPU
NASA Astrophysics Data System (ADS)
Cárdenas-Montes, Miguel; Rodríguez-Vázquez, Juan José; Vega-Rodríguez, Miguel A.
2016-07-01
The histogram is an easily comprehensible way to present data and analyses. In the current scientific context, with access to large volumes of data, the processing time for building histograms has dramatically increased. For this reason, parallel construction is necessary to alleviate the impact of processing time on analysis activities. In this scenario, GPU computing is becoming widely used to reduce the processing time of histogram construction to affordable levels. Alongside the increase in processing time, the implementations are also stressed with respect to bin-count accuracy. Accuracy issues due to the particularities of the implementations are not usually taken into consideration when building histograms with very large data sets. In this work, a bin recycling strategy to create an accuracy-aware implementation for building histograms on GPU is presented. In order to evaluate the approach, the strategy was applied to the computation of the three-point angular correlation function, which is a relevant function in cosmology for the study of the Large Scale Structure of the Universe. As a consequence of the study, a high-accuracy implementation for histogram construction on GPU is proposed.
Clinical Utility of Blood Cell Histogram Interpretation
Bhagya, S.; Majeed, Abdul
2017-01-01
An automated haematology analyser provides blood cell histograms by plotting the sizes of different blood cells on X-axis and their relative number on Y-axis. Histogram interpretation needs careful analysis of Red Blood Cell (RBC), White Blood Cell (WBC) and platelet distribution curves. Histogram analysis is often a neglected part of the automated haemogram which if interpreted well, has significant potential to provide diagnostically relevant information even before higher level investigations are ordered. PMID:29207767
Clinical Utility of Blood Cell Histogram Interpretation.
Thomas, E T Arun; Bhagya, S; Majeed, Abdul
2017-09-01
An automated haematology analyser provides blood cell histograms by plotting the sizes of different blood cells on X-axis and their relative number on Y-axis. Histogram interpretation needs careful analysis of Red Blood Cell (RBC), White Blood Cell (WBC) and platelet distribution curves. Histogram analysis is often a neglected part of the automated haemogram which if interpreted well, has significant potential to provide diagnostically relevant information even before higher level investigations are ordered.
Image Processing for Planetary Limb/Terminator Extraction
NASA Technical Reports Server (NTRS)
Udomkesmalee, S.; Zhu, D. Q.; Chu, C. -C.
1995-01-01
A novel image segmentation technique for extracting the limb and terminator of planetary bodies is proposed. Conventional edge-based histogramming approaches are used to trace object boundaries. The limb and terminator bifurcation is achieved by locating the harmonized segment in the two equations representing the 2-D parameterized boundary curve. Real planetary images from Voyager 1 and 2 served as representative test cases to verify the proposed methodology.
Vladimir A. Kovalev; Alexander Petkov; Cyle Wold; Wei Min Hao
2010-01-01
Data-processing techniques for the scanning lidar data are considered that allow determining the upper and lower boundaries of the smoke plume or smoke layering in the vicinity of wildfires. The task is fulfilled by utilizing the Atmospheric Heterogeneity Height Indicator (AHHI). The AHHI is a histogram, which shows a number of heterogeneity events defined by scanning...
Liang, He-Yue; Huang, Ya-Qin; Yang, Zhao-Xia; Ying-Ding; Zeng, Meng-Su; Rao, Sheng-Xiang
2016-07-01
To determine if magnetic resonance imaging (MRI) histogram analyses can help predict response to chemotherapy in patients with colorectal hepatic metastases, using response evaluation criteria in solid tumours (RECIST 1.1) as the reference standard. Standard MRI including diffusion-weighted imaging (b=0, 500 s/mm²) was performed before chemotherapy in 53 patients with colorectal hepatic metastases. Histograms were generated for apparent diffusion coefficient (ADC) maps and arterial and portal venous phase images; thereafter, mean, percentiles (1st, 10th, 50th, 90th, 99th), skewness, kurtosis, and variance were generated. Quantitative histogram parameters were compared between responders (partial and complete response, n=15) and non-responders (progressive and stable disease, n=38). Receiver operating characteristic (ROC) analyses were further performed for the significant parameters. The mean and the 1st, 10th, 50th, 90th, and 99th percentiles of the ADC maps were significantly lower in the responding group than in the non-responding group (p=0.000-0.002), with areas under the ROC curve (AUCs) of 0.76-0.82. The histogram parameters of the arterial and portal venous phases showed no significant difference (p>0.05) between the two groups. Histogram-derived parameters for ADC maps seem to be a promising tool for predicting response to chemotherapy in patients with colorectal hepatic metastases. • ADC histogram analyses can potentially predict chemotherapy response in colorectal liver metastases. • Lower histogram-derived parameters (mean, percentiles) for ADC tend to indicate good response. • MR enhancement histogram analyses are not reliable for predicting response.
Using histograms to introduce randomization in the generation of ensembles of decision trees
Kamath, Chandrika; Cantu-Paz, Erick; Littau, David
2005-02-22
A system for decision tree ensembles that includes a module to read the data, a module to create a histogram, a module to evaluate a potential split according to some criterion using the histogram, a module to select a split point randomly in an interval around the best split, a module to split the data, and a module to combine multiple decision trees in ensembles. The decision tree method includes the steps of reading the data, creating a histogram, evaluating a potential split according to some criterion using the histogram, selecting a split point randomly in an interval around the best split, splitting the data, and combining multiple decision trees in ensembles.
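A small sketch of the split-selection step listed above follows: build a histogram of a feature, score candidate split points on the binned data, then draw the actual split point at random from an interval around the best bin boundary, which is what injects randomization into the ensemble. Variance reduction is used here as a stand-in criterion, since the summary above does not fix one, and the bin and jitter settings are placeholders.

```python
import numpy as np

def randomized_histogram_split(feature, target, n_bins=32, jitter_bins=1,
                               rng=None):
    """Pick a split point for one feature using a histogram approximation.

    1. Bin the feature values into a histogram.
    2. Score every bin boundary as a candidate split by the reduction in
       target variance it achieves (one possible criterion).
    3. Return a split point drawn uniformly from +/- jitter_bins bin widths
       around the best boundary.
    """
    rng = rng or np.random.default_rng()
    feature = np.asarray(feature, dtype=float)
    target = np.asarray(target, dtype=float)
    edges = np.histogram_bin_edges(feature, bins=n_bins)

    best_edge, best_gain = None, -np.inf
    parent_cost = target.var() * target.size
    for edge in edges[1:-1]:
        left, right = target[feature <= edge], target[feature > edge]
        if left.size == 0 or right.size == 0:
            continue
        gain = parent_cost - (left.var() * left.size + right.var() * right.size)
        if gain > best_gain:
            best_gain, best_edge = gain, edge

    if best_edge is None:                         # degenerate case: no valid split
        return float(np.median(feature))
    width = edges[1] - edges[0]
    return float(rng.uniform(best_edge - jitter_bins * width,
                             best_edge + jitter_bins * width))
```

Because each tree in the ensemble draws its own randomized split points, the trees decorrelate even when they are grown on the same data.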
Color Histogram Diffusion for Image Enhancement
NASA Technical Reports Server (NTRS)
Kim, Taemin
2011-01-01
Various color histogram equalization (CHE) methods have been proposed to extend grayscale histogram equalization (GHE) to color images. In this paper, a new method called histogram diffusion that extends the GHE method to arbitrary dimensions is proposed. Ranges in a histogram are specified as overlapping bars of uniform heights and variable widths which are proportional to their frequencies. This diagram is called the vistogram. As an alternative approach to GHE, the squared error of the vistogram from the uniform distribution is minimized. Each bar in the vistogram is approximated by a Gaussian function. Gaussian particles in the vistogram diffuse as a nonlinear autonomous system of ordinary differential equations. CHE results on color images showed that the approach is effective.
Histogram based analysis of lung perfusion of children after congenital diaphragmatic hernia repair.
Kassner, Nora; Weis, Meike; Zahn, Katrin; Schaible, Thomas; Schoenberg, Stefan O; Schad, Lothar R; Zöllner, Frank G
2018-05-01
To investigate a histogram-based approach to characterize the distribution of perfusion in the whole left and right lung by descriptive statistics, and to show how histograms can be used to visually explore perfusion defects in two-year-old children after Congenital Diaphragmatic Hernia (CDH) repair. 28 children (age 24.2±1.7 months; all left-sided hernia; 9 after extracorporeal membrane oxygenation therapy) underwent quantitative DCE-MRI of the lung. Segmentations of the left and right lung were manually drawn to mask the calculated pulmonary blood flow maps and then to derive histograms for each lung side. Individual and group-wise analyses of the histograms of the left and right lung were performed. The ipsilateral and contralateral lungs show significant differences in shape and in descriptive statistics derived from the histogram (Wilcoxon signed-rank test, p<0.05) at the group-wise and individual level. Subgroup analysis (patients with vs without ECMO therapy) showed no significant differences using histogram-derived parameters. Histogram analysis can be a valuable tool to characterize and visualize whole-lung perfusion of children after CDH repair. It allows several possibilities for analyzing the data, both describing the perfusion differences between the right and left lung and exploring and visualizing localized perfusion patterns in the 3D lung volume. Subgroup analysis will be possible given sufficient sample sizes. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Xu, Kuan-Man
2006-01-01
A new method is proposed to compare statistical differences between summary histograms, which are the histograms summed over a large ensemble of individual histograms. It consists of choosing a distance statistic for measuring the difference between summary histograms and using a bootstrap procedure to calculate the statistical significance level. Bootstrapping is an approach to statistical inference that makes few assumptions about the underlying probability distribution that describes the data. Three distance statistics are compared in this study. They are the Euclidean distance, the Jeffries-Matusita distance and the Kuiper distance. The data used in testing the bootstrap method are satellite measurements of cloud systems called cloud objects. Each cloud object is defined as a contiguous region/patch composed of individual footprints or fields of view. A histogram of measured values over footprints is generated for each parameter of each cloud object and then summary histograms are accumulated over all individual histograms in a given cloud-object size category. The results of statistical hypothesis tests using all three distances as test statistics are generally similar, indicating the validity of the proposed method. The Euclidean distance is determined to be most suitable after comparing the statistical tests of several parameters with distinct probability distributions among three cloud-object size categories. Impacts on the statistical significance levels resulting from differences in the total lengths of satellite footprint data between two size categories are also discussed.
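One plausible sketch of such a test, using the Euclidean distance as the test statistic, is given below: the individual (per-cloud-object) histograms of the two categories are pooled and resampled with replacement to build a null distribution of distances between summary histograms. The pooling scheme and the unit-area normalization are our assumptions rather than the paper's exact procedure.

```python
import numpy as np

def euclidean_distance(h1, h2):
    """Euclidean distance between two normalized summary histograms."""
    return float(np.sqrt(np.sum((h1 - h2) ** 2)))

def summary_histogram(individual_hists):
    """Sum individual histograms and normalize to unit area."""
    total = np.sum(individual_hists, axis=0).astype(float)
    return total / total.sum()

def bootstrap_histogram_test(hists_a, hists_b, n_boot=10000, rng=None):
    """Bootstrap significance test for the difference of two summary histograms.

    hists_a, hists_b : arrays of shape (n_objects, n_bins) holding the
    individual histograms of the two categories being compared.
    Returns (observed_distance, p_value).
    """
    rng = rng or np.random.default_rng()
    hists_a = np.asarray(hists_a, dtype=float)
    hists_b = np.asarray(hists_b, dtype=float)
    observed = euclidean_distance(summary_histogram(hists_a),
                                  summary_histogram(hists_b))
    pooled = np.concatenate([hists_a, hists_b], axis=0)
    n_a = len(hists_a)
    count = 0
    for _ in range(n_boot):
        # Resample two pseudo-categories from the pooled set (null hypothesis:
        # both categories come from the same population of histograms).
        sample = pooled[rng.integers(0, len(pooled), size=len(pooled))]
        d = euclidean_distance(summary_histogram(sample[:n_a]),
                               summary_histogram(sample[n_a:]))
        if d >= observed:
            count += 1
    p_value = (count + 1) / (n_boot + 1)
    return observed, p_value
```

Because the resampling treats each cloud object's histogram as the sampling unit, no independence assumption is needed for the individual footprints within an object, which is the motivation for bootstrapping stated above.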
FPGA based charge fast histogramming for GEM detector
NASA Astrophysics Data System (ADS)
Poźniak, Krzysztof T.; Byszuk, A.; Chernyshova, M.; Cieszewski, R.; Czarski, T.; Dominik, W.; Jakubowska, K.; Kasprowicz, G.; Rzadkiewicz, J.; Scholz, M.; Zabolotny, W.
2013-10-01
This article presents a fast charge histogramming method for the position-sensitive X-ray GEM detector. The energy-resolved measurements are carried out simultaneously for 256 channels of the GEM detector. The whole process of histogramming is performed in 21 FPGA chips (Spartan-6 series from Xilinx). The results of the histogramming process are stored in an external DDR3 memory. The structure of the electronic measurement equipment and the firmware functionality implemented in the FPGAs are described. Examples of test measurements are presented.
Liu, Song; Zhang, Yujuan; Chen, Ling; Guan, Wenxian; Guan, Yue; Ge, Yun; He, Jian; Zhou, Zhengyang
2017-10-02
Whole-lesion apparent diffusion coefficient (ADC) histogram analysis has been introduced and proved effective in assessment of multiple tumors. However, the application of whole-volume ADC histogram analysis in gastrointestinal tumors has just started and never been reported in T and N staging of gastric cancers. Eighty patients with pathologically confirmed gastric carcinomas underwent diffusion weighted (DW) magnetic resonance imaging before surgery prospectively. Whole-lesion ADC histogram analysis was performed by two radiologists independently. The differences of ADC histogram parameters among different T and N stages were compared with independent-samples Kruskal-Wallis test. Receiver operating characteristic (ROC) analysis was performed to evaluate the performance of ADC histogram parameters in differentiating particular T or N stages of gastric cancers. There were significant differences of all the ADC histogram parameters for gastric cancers at different T (except ADCmin and ADCmax) and N (except ADCmax) stages. Most ADC histogram parameters differed significantly between T1 vs T3, T1 vs T4, T2 vs T4, N0 vs N1, N0 vs N3, and some parameters (ADC5%, ADC10%, ADCmin) differed significantly between N0 vs N2, N2 vs N3 (all P < 0.05). Most parameters except ADCmax performed well in differentiating different T and N stages of gastric cancers. Especially for identifying patients with and without lymph node metastasis, the ADC10% yielded the largest area under the ROC curve of 0.794 (95% confidence interval, 0.677-0.911). All the parameters except ADCmax showed excellent inter-observer agreement with intra-class correlation coefficients higher than 0.800. Whole-volume ADC histogram parameters held great potential in differentiating different T and N stages of gastric cancers preoperatively.
Gihr, Georg Alexander; Horvath-Rizea, Diana; Kohlhof-Meinecke, Patricia; Ganslandt, Oliver; Henkes, Hans; Richter, Cindy; Hoffmann, Karl-Titus; Surov, Alexey; Schob, Stefan
2018-06-14
Meningiomas are the most frequently diagnosed intracranial masses, oftentimes requiring surgery. Procedure-related morbidity in particular can be substantial, especially in elderly patients. Hence, reliable imaging modalities enabling pretherapeutic prediction of tumor grade, growth kinetics, realistic prognosis and, as a consequence, the necessity of surgery are of great value. In this context, a promising diagnostic approach is advanced analysis of magnetic resonance imaging data. Therefore, our study investigated whether histogram profiling of routinely acquired postcontrast T1-weighted images is capable of separating low-grade from high-grade lesions and whether histogram parameters reflect Ki-67 expression in meningiomas. Pretreatment T1-weighted postcontrast volumes of 44 meningioma patients were used for signal intensity histogram profiling. WHO grade, tumor volume, and Ki-67 expression were evaluated. Comparative and correlative statistics investigating the association between histogram profile parameters and neuropathology were performed. None of the investigated histogram parameters revealed significant differences between low-grade and high-grade meningiomas. However, significant correlations were identified between Ki-67 and the histogram parameters skewness and entropy as well as between entropy and tumor volume. Contrary to previously reported findings, pretherapeutic postcontrast T1-weighted images can be used to predict growth kinetics in meningiomas if whole tumor histogram analysis is employed. However, no differences between distinct WHO grades were identifiable in our cohort. As a consequence, histogram analysis of postcontrast T1-weighted images is a promising approach to obtain quantitative in vivo biomarkers reflecting the proliferative potential in meningiomas. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
Jeong, Chang Bu; Kim, Kwang Gi; Kim, Tae Sung; Kim, Seok Ki
2011-06-01
Whole-body bone scan is one of the most frequent diagnostic procedures in nuclear medicine. In particular, it plays a significant role in important procedures such as the diagnosis of osseous metastasis and evaluation of osseous tumor response to chemotherapy and radiation therapy. It can also be used to monitor the possibility of any recurrence of the tumor. However, it is a very time-consuming effort for radiologists to quantify subtle interval changes between successive whole-body bone scans because of many variations such as intensity, geometry, and morphology. In this paper, we identify the most effective method of image enhancement based on histograms, which may assist radiologists in interpreting successive whole-body bone scans effectively. Forty-eight successive whole-body bone scans from 10 patients were obtained and evaluated using six methods of image enhancement based on histograms: histogram equalization, brightness-preserving bi-histogram equalization, contrast-limited adaptive histogram equalization, end-in search, histogram matching, and exact histogram matching (EHM). Comparison of the results of the different methods was made using three similarity measures: peak signal-to-noise ratio, histogram intersection, and structural similarity. Image enhancement of successive bone scans using EHM showed the best results of the six methods for all similarity measures. EHM is the best method of image enhancement based on histograms for diagnosing successive whole-body bone scans. The method has the potential to greatly assist radiologists in quantifying interval changes more accurately and quickly by compensating for the variable nature of intensity information. Consequently, it can improve radiologists' diagnostic accuracy as well as reduce reading time for detecting interval changes.
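A minimal sketch of the histogram matching step evaluated above, written with plain NumPy so no particular toolbox is assumed: the source scan's intensity CDF is mapped onto the reference scan's CDF, and a PSNR helper is included for comparing the result. This is generic CDF-based matching, not the exact histogram matching (EHM) variant of the paper; array names are illustrative.

# Generic CDF-based histogram matching between two registered scans, plus PSNR.
import numpy as np

def match_histogram(source, reference):
    """Map source intensities so their CDF follows the reference CDF."""
    src_values, src_idx, src_counts = np.unique(source.ravel(),
                                                return_inverse=True,
                                                return_counts=True)
    ref_values, ref_counts = np.unique(reference.ravel(), return_counts=True)
    src_cdf = np.cumsum(src_counts) / source.size
    ref_cdf = np.cumsum(ref_counts) / reference.size
    matched_values = np.interp(src_cdf, ref_cdf, ref_values)
    return matched_values[src_idx].reshape(source.shape)

def psnr(a, b, data_range=None):
    """Peak signal-to-noise ratio between two same-sized images."""
    if data_range is None:
        data_range = max(a.max(), b.max()) - min(a.min(), b.min())
    mse = np.mean((a.astype(float) - b.astype(float)) ** 2)
    return 10.0 * np.log10(data_range ** 2 / mse)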
Dose-volume histogram prediction using density estimation.
Skarpman Munter, Johanna; Sjölund, Jens
2015-09-07
Knowledge of what dose-volume histograms can be expected for a previously unseen patient could increase consistency and quality in radiotherapy treatment planning. We propose a machine learning method that uses previous treatment plans to predict such dose-volume histograms. The key to the approach is the framing of dose-volume histograms in a probabilistic setting. The training consists of estimating, from the patients in the training set, the joint probability distribution of some predictive features and the dose. The joint distribution immediately provides an estimate of the conditional probability of the dose given the values of the predictive features. The prediction consists of estimating, from the new patient, the distribution of the predictive features and marginalizing the conditional probability from the training over this. Integrating the resulting probability distribution for the dose yields an estimate of the dose-volume histogram. To illustrate how the proposed method relates to previously proposed methods, we use the signed distance to the target boundary as a single predictive feature. As a proof-of-concept, we predicted dose-volume histograms for the brainstems of 22 acoustic schwannoma patients treated with stereotactic radiosurgery, and for the lungs of 9 lung cancer patients treated with stereotactic body radiation therapy. Comparing with two previous attempts at dose-volume histogram prediction we find that, given the same input data, the predictions are similar. In summary, we propose a method for dose-volume histogram prediction that exploits the intrinsic probabilistic properties of dose-volume histograms. We argue that the proposed method makes up for some deficiencies in previously proposed methods, thereby potentially increasing ease of use, flexibility and ability to perform well with small amounts of training data.
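To make the probabilistic framing concrete, the sketch below estimates the joint distribution of (signed distance to target, dose) from training voxels with a simple 2D histogram, conditions on distance, marginalizes over a new patient's distance distribution, and integrates to a cumulative DVH. The histogram-based density estimate, the binning and the variable names are illustrative assumptions; the paper's own density estimation may differ.

# Density-estimation sketch of DVH prediction using a single predictive
# feature (signed distance to the target boundary).
import numpy as np

def conditional_dose_given_distance(train_dist, train_dose, dist_edges, dose_edges):
    joint, _, _ = np.histogram2d(train_dist, train_dose,
                                 bins=[dist_edges, dose_edges])
    joint = joint / joint.sum()
    # P(dose | distance): normalize each distance row; empty rows stay zero.
    row_sums = joint.sum(axis=1, keepdims=True)
    return np.divide(joint, row_sums, out=np.zeros_like(joint),
                     where=row_sums > 0)

def predict_dvh(new_dist, cond, dist_edges, dose_edges):
    # Distance distribution of the new patient's organ-at-risk voxels.
    p_dist, _ = np.histogram(new_dist, bins=dist_edges)
    p_dist = p_dist / p_dist.sum()
    # Marginalize: p(dose) = sum over distance bins of P(dose | d) * p(d).
    p_dose = p_dist @ cond
    dose_centers = 0.5 * (dose_edges[:-1] + dose_edges[1:])
    # Cumulative DVH: fraction of volume receiving at least each dose level.
    dvh = 1.0 - np.cumsum(p_dose) + p_dose
    return dose_centers, dvh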
Structure Size Enhanced Histogram
NASA Astrophysics Data System (ADS)
Wesarg, Stefan; Kirschner, Matthias
Direct volume visualization requires the definition of transfer functions (TFs) for the assignment of opacity and color. Multi-dimensional TFs are based on at least two image properties, and are specified by means of 2D histograms. In this work we propose a new type of a 2D histogram which combines gray value with information about the size of the structures. This structure size enhanced (SSE) histogram is an intuitive approach for representing anatomical features. Clinicians — the users we are focusing on — are much more familiar with selecting features by their size than by their gradient magnitude value. As a proof of concept, we employ the SSE histogram for the definition of two-dimensional TFs for the visualization of 3D MRI and CT image data.
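The sketch below builds a 2D histogram over gray value and an estimate of local structure size, in the spirit of the SSE histogram. Per-voxel structure size is approximated here by twice the Euclidean distance transform inside a thresholded foreground mask, which is only a rough proxy; the paper's actual size estimation is likely more sophisticated, and the threshold and bin counts are illustrative assumptions.

# Rough "gray value vs. structure size" 2D histogram for transfer-function design.
import numpy as np
from scipy import ndimage

def sse_histogram(volume, foreground_threshold, gray_bins=64, size_bins=32):
    mask = volume > foreground_threshold
    # Distance to the nearest background voxel as a crude local size estimate.
    size_map = 2.0 * ndimage.distance_transform_edt(mask)
    gray = volume[mask].ravel()
    size = size_map[mask].ravel()
    hist2d, gray_edges, size_edges = np.histogram2d(
        gray, size, bins=[gray_bins, size_bins])
    return hist2d, gray_edges, size_edges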
Face recognition algorithm using extended vector quantization histogram features.
Yan, Yan; Lee, Feifei; Wu, Xueqian; Chen, Qiu
2018-01-01
In this paper, we propose a face recognition algorithm based on a combination of vector quantization (VQ) and Markov stationary features (MSF). The VQ algorithm has been shown to be an effective method for generating features; it extracts a codevector histogram as a facial feature representation for face recognition. Still, the VQ histogram features are unable to convey spatial structural information, which to some extent limits their usefulness in discrimination. To alleviate this limitation of VQ histograms, we utilize Markov stationary features (MSF) to extend the VQ histogram-based features so as to add spatial structural information. We demonstrate the effectiveness of our proposed algorithm by achieving recognition results superior to those of several state-of-the-art methods on publicly available face databases.
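As a toy illustration of combining a codevector histogram with spatial structure, the sketch below quantizes image patches with k-means, forms the label histogram (the basic VQ feature), and augments it with the stationary distribution of a label transition matrix built from horizontally adjacent patches. This follows my reading of the Markov stationary feature idea and is not the authors' exact construction; patch size, codebook size and the horizontal-adjacency choice are assumptions.

# Toy VQ histogram plus a Markov-style spatial extension.
import numpy as np
from sklearn.cluster import KMeans

def patch_labels(image, patch=4, codebook_size=32, seed=0):
    h, w = image.shape
    ph, pw = h // patch, w // patch
    patches = image[:ph * patch, :pw * patch].reshape(ph, patch, pw, patch)
    patches = patches.transpose(0, 2, 1, 3).reshape(ph * pw, patch * patch)
    km = KMeans(n_clusters=codebook_size, n_init=10, random_state=seed)
    return km.fit_predict(patches.astype(float)).reshape(ph, pw), codebook_size

def vq_msf_like_feature(image):
    labels, k = patch_labels(image)
    hist = np.bincount(labels.ravel(), minlength=k).astype(float)
    hist /= hist.sum()
    # Co-occurrence of horizontally adjacent labels as a transition matrix.
    trans = np.zeros((k, k))
    np.add.at(trans, (labels[:, :-1].ravel(), labels[:, 1:].ravel()), 1.0)
    trans += 1e-6                          # avoid all-zero rows
    trans /= trans.sum(axis=1, keepdims=True)
    # Stationary distribution: eigenvector of trans.T for the largest eigenvalue.
    vals, vecs = np.linalg.eig(trans.T)
    stat = np.real(vecs[:, np.argmax(np.real(vals))])
    stat = np.abs(stat) / np.abs(stat).sum()
    return np.concatenate([hist, stat])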
Xu, Yan; Ru, Tong; Zhu, Lijing; Liu, Baorui; Wang, Huanhuan; Zhu, Li; He, Jian; Liu, Song; Zhou, Zhengyang; Yang, Xiaofeng
To monitor early response of locally advanced cervical cancers undergoing concurrent chemo-radiotherapy (CCRT) by ultrasonic histogram. B-mode ultrasound examinations were performed at 4 time points in thirty-four patients during CCRT. Six ultrasonic histogram parameters were used to assess the echogenicity, homogeneity and heterogeneity of tumors. Ipeak increased rapidly from the first week after therapy initiation, whereas Wlow, Whigh and Ahigh changed significantly at the second week. The average ultrasonic histogram progressively moved toward the right and converted into a more symmetrical shape. The ultrasonic histogram could serve as a potential marker to monitor early response during CCRT. Copyright © 2018 Elsevier Inc. All rights reserved.
Face verification system for Android mobile devices using histogram based features
NASA Astrophysics Data System (ADS)
Sato, Sho; Kobayashi, Kazuhiro; Chen, Qiu
2016-07-01
This paper proposes a face verification system that runs on Android mobile devices. In this system, a facial image is first captured by a built-in camera on the Android device, and then face detection is implemented using Haar-like features and the AdaBoost learning algorithm. The proposed system verifies the detected face using histogram-based features, which are generated by a binary Vector Quantization (VQ) histogram using DCT coefficients in low frequency domains, as well as an Improved Local Binary Pattern (Improved LBP) histogram in the spatial domain. Verification results with the different types of histogram-based features are first obtained separately and then combined by weighted averaging. We evaluate our proposed algorithm using the publicly available ORL database and facial images captured by an Android tablet.
Histogram Curve Matching Approaches for Object-based Image Classification of Land Cover and Land Use
Toure, Sory I.; Stow, Douglas A.; Weeks, John R.; Kumar, Sunil
2013-01-01
The classification of image-objects is usually done using parametric statistical measures of central tendency and/or dispersion (e.g., mean or standard deviation). The objectives of this study were to analyze digital number histograms of image objects and evaluate classification measures exploiting characteristic signatures of such histograms. Two histogram matching classifiers were evaluated and compared to the standard nearest neighbor to mean classifier. An ADS40 airborne multispectral image of San Diego, California was used for assessing the utility of curve matching classifiers in a geographic object-based image analysis (GEOBIA) approach. The classifications were performed with data sets having 0.5 m, 2.5 m, and 5 m spatial resolutions. Results show that histograms are reliable features for characterizing classes. Also, both histogram matching classifiers consistently performed better than the one based on the standard nearest neighbor to mean rule. The highest classification accuracies were produced with images having 2.5 m spatial resolution. PMID:24403648
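The sketch below contrasts the two rules mentioned above for a single image object: a curve-matching rule that compares the object's normalized histogram to class reference histograms (here via an L1 distance), and the standard nearest-neighbor-to-mean rule. The specific matching metric, bin count and value range are illustrative assumptions rather than the paper's exact choices.

# Histogram curve matching vs. nearest-neighbor-to-mean for an image object.
import numpy as np

def object_histogram(values, bins=64, value_range=(0, 255)):
    hist, _ = np.histogram(values, bins=bins, range=value_range)
    return hist / hist.sum()

def classify_by_curve_matching(obj_values, class_histograms):
    """class_histograms: dict mapping class name -> reference histogram."""
    h = object_histogram(obj_values)
    distances = {c: np.abs(h - ref).sum() for c, ref in class_histograms.items()}
    return min(distances, key=distances.get)

def classify_by_nearest_mean(obj_values, class_means):
    """class_means: dict mapping class name -> mean digital number."""
    m = np.mean(obj_values)
    return min(class_means, key=lambda c: abs(m - class_means[c]))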
Shin, Young Gyung; Yoo, Jaeheung; Kwon, Hyeong Ju; Hong, Jung Hwa; Lee, Hye Sun; Yoon, Jung Hyun; Kim, Eun-Kyung; Moon, Hee Jung; Han, Kyunghwa; Kwak, Jin Young
2016-08-01
The objective of the study was to evaluate whether texture analysis using histogram and gray level co-occurrence matrix (GLCM) parameters can help clinicians diagnose lymphocytic thyroiditis (LT) and differentiate LT according to pathologic grade. The background thyroid pathology of 441 patients was classified into no evidence of LT, chronic LT (CLT), and Hashimoto's thyroiditis (HT). Histogram and GLCM parameters were extracted from the regions of interest on ultrasound. The diagnostic performances of the parameters for diagnosing and differentiating LT were calculated. Of the histogram and GLCM parameters, the histogram mean had the highest Az (0.63) and VUS (0.303). As the degree of LT increased, the mean decreased and the standard deviation and entropy increased. The histogram mean from gray-scale ultrasound showed the best diagnostic performance as a single parameter in differentiating LT according to pathologic grade as well as in diagnosing LT. Copyright © 2016 Elsevier Ltd. All rights reserved.
Guan, Yue; Shi, Hua; Chen, Ying; Liu, Song; Li, Weifeng; Jiang, Zhuoran; Wang, Huanhuan; He, Jian; Zhou, Zhengyang; Ge, Yun
2016-01-01
The aim of this study was to explore the application of whole-lesion histogram analysis of apparent diffusion coefficient (ADC) values of cervical cancer. A total of 54 women (mean age, 53 years) with cervical cancers underwent 3-T diffusion-weighted imaging with b values of 0 and 800 s/mm² prospectively. Whole-lesion histogram analysis of ADC values was performed. Paired sample t test was used to compare differences in ADC histogram parameters between cervical cancers and normal cervical tissues. Receiver operating characteristic curves were constructed to identify the optimal threshold of each parameter. All histogram parameters in this study including ADCmean, ADCmin, ADC10%-ADC90%, mode, skewness, and kurtosis of cervical cancers were significantly lower than those of normal cervical tissues (all P < 0.0001). ADC90% had the largest area under receiver operating characteristic curve of 0.996. Whole-lesion histogram analysis of ADC maps is useful in the assessment of cervical cancer.
NASA Technical Reports Server (NTRS)
Seze, Genevieve; Rossow, William B.
1991-01-01
The spatial and temporal stability of the distributions of satellite-measured visible and infrared radiances, caused by variations in clouds and surfaces, are investigated using bidimensional and monodimensional histograms and time-composite images. Similar analysis of the histograms of the original and time-composite images provides separation of the contributions of the space and time variations to the total variations. The variability of both the surfaces and clouds is found to be larger at scales much larger than the minimum resolved by satellite imagery. This study shows that the shapes of these histograms are distinctive characteristics of the different climate regimes and that particular attributes of these histograms can be related to several general, though not universal, properties of clouds and surface variations at regional and synoptic scales. There are also significant exceptions to these relationships in particular climate regimes. The characteristics of these radiance histograms provide a stable well defined descriptor of the cloud and surface properties.
NASA Astrophysics Data System (ADS)
Wan, Minjie; Gu, Guohua; Qian, Weixian; Ren, Kan; Chen, Qian; Maldague, Xavier
2018-06-01
Infrared image enhancement plays a significant role in intelligent urban surveillance systems for smart city applications. Unlike existing methods that only exaggerate the global contrast, we propose a particle swarm optimization-based local entropy weighted histogram equalization which involves the enhancement of both local details and foreground and background contrast. First of all, a novel local entropy weighted histogram depicting the distribution of detail information is calculated based on a modified hyperbolic tangent function. Then, the histogram is divided into two parts via a threshold maximizing the inter-class variance in order to improve the contrasts of foreground and background, respectively. To avoid over-enhancement and noise amplification, double plateau thresholds of the presented histogram are formulated by means of the particle swarm optimization algorithm. Lastly, each sub-image is equalized independently according to the constrained sub-local entropy weighted histogram. Comparative experiments implemented on real infrared images prove that our algorithm outperforms other state-of-the-art methods in terms of both visual and quantized evaluations.
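The final equalization step can be illustrated by a simple plateau-limited histogram equalization: the histogram is clipped from above and below by two plateau thresholds before the usual CDF mapping. In the method above those thresholds come from particle swarm optimization and the histogram is a local-entropy-weighted one; here the thresholds are simply passed in, and the image is assumed to be an integer-valued array, so this only sketches the mapping itself.

# Plateau-limited histogram equalization for an integer-valued frame.
import numpy as np

def double_plateau_equalize(image, upper, lower, levels=256):
    hist = np.bincount(image.ravel(), minlength=levels).astype(float)
    clipped = np.clip(hist, lower, upper)
    # Gray levels that never occur keep zero weight so they do not expand.
    clipped[hist == 0] = 0.0
    cdf = np.cumsum(clipped)
    cdf = cdf / cdf[-1]
    mapping = np.round(cdf * (levels - 1)).astype(image.dtype)
    return mapping[image]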
Visibility enhancement of color images using Type-II fuzzy membership function
NASA Astrophysics Data System (ADS)
Singh, Harmandeep; Khehra, Baljit Singh
2018-04-01
Images taken in poor environmental conditions decrease the visibility and hide information in digital images. Therefore, image enhancement techniques are necessary for improving the significant details of these images. An extensive review has shown that histogram-based enhancement techniques greatly suffer from over/under-enhancement issues. Fuzzy-based enhancement techniques suffer from problems with over- and under-saturated pixels. In this paper, a novel Type-II fuzzy-based image enhancement technique has been proposed for improving the visibility of images. The Type-II fuzzy logic can automatically extract the local atmospheric light and roughly eliminate the atmospheric veil in local detail enhancement. The proposed technique has been evaluated on 10 well-known weather-degraded color images and is also compared with four well-known existing image enhancement techniques. The experimental results reveal that the proposed technique outperforms the others in terms of visible edge ratio, color gradients and number of saturated pixels.
Meng, Jie; Zhu, Lijing; Zhu, Li; Wang, Huanhuan; Liu, Song; Yan, Jing; Liu, Baorui; Guan, Yue; Ge, Yun; He, Jian; Zhou, Zhengyang; Yang, Xiaofeng
2016-10-22
To explore the role of apparent diffusion coefficient (ADC) histogram shape related parameters in early assessment of treatment response during the concurrent chemo-radiotherapy (CCRT) course of advanced cervical cancers. This prospective study was approved by the local ethics committee and informed consent was obtained from all patients. Thirty-two patients with advanced cervical squamous cell carcinomas underwent diffusion weighted magnetic resonance imaging (b values, 0 and 800 s/mm²) before CCRT, at the end of 2nd and 4th week during CCRT and immediately after CCRT completion. Whole lesion ADC histogram analysis generated several histogram shape related parameters including skewness, kurtosis, s-sDav, width, standard deviation, as well as first-order entropy and second-order entropies. The averaged ADC histograms of 32 patients were generated to visually observe dynamic changes of the histogram shape following CCRT. All parameters except width and standard deviation showed significant changes during CCRT (all P < 0.05), and their variation trends fell into four different patterns. Skewness and kurtosis both showed high early decline rates (43.10 %, 48.29 %) at the end of 2nd week of CCRT. All entropies kept decreasing significantly since 2 weeks after CCRT initiated. The shape of the averaged ADC histogram also changed obviously following CCRT. ADC histogram shape analysis held the potential in monitoring early tumor response in patients with advanced cervical cancers undergoing CCRT.
[Clinical application of MRI histogram in evaluation of muscle fatty infiltration].
Zheng, Y M; Du, J; Li, W Z; Wang, Z X; Zhang, W; Xiao, J X; Yuan, Y
2016-10-18
To describe a method based on analysis of the histogram of intensity values from magnetic resonance imaging (MRI) for quantifying the degree of fatty infiltration. The study included 25 patients with dystrophinopathy. All subjects underwent a muscle MRI examination at thigh level. The histogram M values of 250 muscles, adjusted for subcutaneous fat and representing the degree of fatty infiltration, were compared with expert visual reading using the modified Mercuri scale. There was a significant positive correlation between the histogram M values and the visual reading scores (r=0.854, P<0.001). The distinct pattern of muscle involvement detected by the histogram M values in the patients with dystrophinopathy was similar to that of visual reading and to results in the literature. The histogram M values had stronger correlations with the clinical data than the visual reading scores: the correlations with age were r=0.730 (P<0.001) and r=0.753 (P<0.001), and with knee extensor strength r=-0.468 (P=0.024) and r=-0.460 (P=0.027), respectively. Meanwhile, histogram M value analysis had better repeatability than visual reading, with intraclass correlation coefficients of 0.998 (95% CI: 0.997-0.998, P<0.001) and 0.958 (95% CI: 0.946-0.967, P<0.001), respectively. Histogram M value analysis of MRI, with the advantages of repeatability and objectivity, can be used to evaluate the degree of muscle fatty infiltration.
Dissimilarity representations in lung parenchyma classification
NASA Astrophysics Data System (ADS)
Sørensen, Lauge; de Bruijne, Marleen
2009-02-01
A good problem representation is important for a pattern recognition system to be successful. The traditional approach to statistical pattern recognition is feature representation. More specifically, objects are represented by a number of features in a feature vector space, and classifiers are built in this representation. This is also the general trend in lung parenchyma classification in computed tomography (CT) images, where the features often are measures on feature histograms. Instead, we propose to build normal density based classifiers in dissimilarity representations for lung parenchyma classification. This allows for the classifiers to work on dissimilarities between objects, which might be a more natural way of representing lung parenchyma. In this context, dissimilarity is defined between CT regions of interest (ROI)s. ROIs are represented by their CT attenuation histogram and ROI dissimilarity is defined as a histogram dissimilarity measure between the attenuation histograms. In this setting, the full histograms are utilized according to the chosen histogram dissimilarity measure. We apply this idea to classification of different emphysema patterns as well as normal, healthy tissue. Two dissimilarity representation approaches as well as different histogram dissimilarity measures are considered. The approaches are evaluated on a set of 168 CT ROIs using normal density based classifiers, all showing good performance. Compared to using histogram dissimilarity directly as the distance in a k-nearest neighbor classifier, which achieves a classification accuracy of 92.9%, the best dissimilarity representation based classifier is significantly better with a classification accuracy of 97.0% (p = 0.046).
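The sketch below shows one way to build such a dissimilarity representation: each ROI is described by its attenuation histogram, its dissimilarities to a set of prototype ROIs form the new feature vector, and a normal-density-based classifier is trained in that space. The L1 distance between cumulative histograms and the use of quadratic discriminant analysis are illustrative choices; the paper evaluates several dissimilarity measures and representation variants.

# Dissimilarity representation from attenuation histograms plus a
# normal-density-based classifier.
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

def histogram_dissimilarity(h1, h2):
    # L1 distance between cumulative histograms (a simple earth-mover-like measure).
    return np.abs(np.cumsum(h1) - np.cumsum(h2)).sum()

def dissimilarity_representation(histograms, prototypes):
    return np.array([[histogram_dissimilarity(h, p) for p in prototypes]
                     for h in histograms])

def train_and_score(train_hists, train_labels, test_hists, test_labels, prototypes):
    clf = QuadraticDiscriminantAnalysis()   # Gaussian class-conditional densities
    clf.fit(dissimilarity_representation(train_hists, prototypes), train_labels)
    preds = clf.predict(dissimilarity_representation(test_hists, prototypes))
    return np.mean(preds == np.asarray(test_labels))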
ERIC Educational Resources Information Center
Gratzer, William; Carpenter, James E.
2008-01-01
This article demonstrates an alternative approach to the construction of histograms--one based on the notion of using area to represent relative density in intervals of unequal length. The resulting histograms illustrate the connection between the area of the rectangles associated with particular outcomes and the relative frequency (probability)…
Investigating Student Understanding of Histograms
ERIC Educational Resources Information Center
Kaplan, Jennifer J.; Gabrosek, John G.; Curtiss, Phyllis; Malone, Chris
2014-01-01
Histograms are adept at revealing the distribution of data values, especially the shape of the distribution and any outlier values. They are included in introductory statistics texts, research methods texts, and in the popular press, yet students often have difficulty interpreting the information conveyed by a histogram. This research identifies…
NASA Astrophysics Data System (ADS)
Galich, Nikolay E.
2008-07-01
This communication describes the treatment of immunology data. New nonlinear methods for the statistical analysis of immunofluorescence of peripheral blood neutrophils have been developed. We used the respiratory burst reaction of DNA fluorescence in neutrophil cell nuclei due to oxidative activity. Histograms of the photon count statistics of the radiant neutrophil populations in flow cytometry experiments are considered. Distributions of the fluorescence flash frequency as functions of the fluorescence intensity are analyzed. Statistical peculiarities of the histogram set for women during pregnancy allow all histograms to be divided into three classes. The classification is based on three different types of smoothed, long-range scale-averaged immunofluorescence distributions, their bifurcations and wavelet spectra. Heterogeneity peculiarities of the long-range scale immunofluorescence distributions and peculiarities of the wavelet spectra allow all histograms to be divided into three groups. The first histogram group belongs to healthy donors. The two other groups belong to donors with autoimmune and inflammatory diseases. Some of the illnesses are not diagnosed by standard biochemical methods. Medical standards and statistical data of the immunofluorescence histograms for the identification of health and illness are interconnected. Peculiarities of immunofluorescence for women during pregnancy are classified. Health or illness criteria are connected with statistical features of the immunofluorescence histograms. Neutrophil population fluorescence is a sensitive, clear indicator of health status.
Complexity of possibly gapped histogram and analysis of histogram.
Fushing, Hsieh; Roy, Tania
2018-02-01
We demonstrate that gaps and distributional patterns embedded within real-valued measurements are inseparable biological and mechanistic information contents of the system. Such patterns are discovered through data-driven possibly gapped histogram, which further leads to the geometry-based analysis of histogram (ANOHT). Constructing a possibly gapped histogram is a complex problem of statistical mechanics due to the ensemble of candidate histograms being captured by a two-layer Ising model. This construction is also a distinctive problem of Information Theory from the perspective of data compression via uniformity. By defining a Hamiltonian (or energy) as a sum of total coding lengths of boundaries and total decoding errors within bins, this issue of computing the minimum energy macroscopic states is surprisingly resolved by applying the hierarchical clustering algorithm. Thus, a possibly gapped histogram corresponds to a macro-state. And then the first phase of ANOHT is developed for simultaneous comparison of multiple treatments, while the second phase of ANOHT is developed based on classical empirical process theory for a tree-geometry that can check the authenticity of branches of the treatment tree. The well-known Iris data are used to illustrate our technical developments. Also, a large baseball pitching dataset and a heavily right-censored divorce data are analysed to showcase the existential gaps and utilities of ANOHT.
Muijs, Christina T; Schreurs, Liesbeth M; Busz, Dianne M; Beukema, Jannet C; van der Borden, Arnout J; Pruim, Jan; Van der Jagt, Eric J; Plukker, John Th; Langendijk, Johannes A
2009-12-01
To determine the consequences of target volume (TV) modifications, based on the additional use of PET information, on radiation planning, assuming PET/CT imaging represents the true extent of the tumour. For 21 patients with esophageal cancer, two separate TVs were retrospectively defined based on CT (CT-TV) and co-registered PET/CT images (PET/CT-TV). Two 3D-CRT plans (prescribed dose 50.4 Gy) were constructed to cover the corresponding TVs. Subsequently, these plans were compared for target coverage, normal tissue dose-volume histograms and the corresponding normal tissue complication probability (NTCP) values. The addition of PET led to the modification of CT-TV by at least 10% in 12 of 21 patients (57%) (reduction in 9, enlargement in 3). PET/CT-TV was inadequately covered by the CT-based treatment plan in 8 patients (36%). Treatment plan modifications resulted in significant changes (p<0.05) in dose distributions to heart and lungs. Corresponding changes in NTCP values ranged from -3% to +2% for radiation pneumonitis and from -0.2% to +1.2% for cardiac mortality. This study demonstrated that TVs based on CT might exclude PET-avid disease. The consequences are underdosing and thereby possibly ineffective treatment. Moreover, the addition of PET in radiation planning might result in clinically important changes in NTCP.
Research of Daily Conversation Transmitting System Based on Mouth Part Pattern Recognition
NASA Astrophysics Data System (ADS)
Watanabe, Mutsumi; Nishi, Natsuko
The authors are developing a vision-based intention transfer technique that recognizes the user's facial expressions and movements, to support free and convenient communication with aged or disabled persons who have difficulty talking, discriminating small printed characters, and operating keyboards with their hands and fingers. In this paper we report a prototype system in which layered daily conversations are successively selected by recognizing the transition in shape of the user's mouth parts using image sequences from a camera placed in front of the user. Four mouth part patterns are used in the system. A method that automatically recognizes these patterns by analyzing the intensity histogram data around the mouth region is newly developed. A selection in progress is confirmed by detecting the opening and closing movements of the mouth through the temporal change in the intensity histogram data. The method has been implemented on a desktop PC in VC++. Experimental results of mouth shape pattern recognition by twenty-five persons have shown the effectiveness of the method.
Segmentation by fusion of histogram-based k-means clusters in different color spaces.
Mignotte, Max
2008-05-01
This paper presents a new, simple, and efficient segmentation approach, based on a fusion procedure which aims at combining several segmentation maps associated to simpler partition models in order to finally get a more reliable and accurate segmentation result. The different label fields to be fused in our application are given by the same and simple (K-means based) clustering technique on an input image expressed in different color spaces. Our fusion strategy aims at combining these segmentation maps with a final clustering procedure using as input features, the local histogram of the class labels, previously estimated and associated to each site and for all these initial partitions. This fusion framework remains simple to implement, fast, general enough to be applied to various computer vision applications (e.g., motion detection and segmentation), and has been successfully applied on the Berkeley image database. The experiments herein reported in this paper illustrate the potential of this approach compared to the state-of-the-art segmentation methods recently proposed in the literature.
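The sketch below follows the fusion idea described above: the same k-means clustering is run on the image expressed in several color spaces, and pixels are then re-clustered using, as features, the local histograms of the class labels from all initial segmentations. Color conversions use scikit-image, the input is assumed to be an RGB float image in [0, 1], and the window size, number of clusters and set of color spaces are illustrative choices rather than the paper's exact configuration.

# Fusion of k-means segmentations from several color spaces via local label histograms.
import numpy as np
from sklearn.cluster import KMeans
from skimage import color
from scipy.ndimage import uniform_filter

def kmeans_labels(pixels, k, seed=0):
    return KMeans(n_clusters=k, n_init=5, random_state=seed).fit_predict(pixels)

def fuse_segmentations(rgb, k=6, window=7, seed=0):
    h, w, _ = rgb.shape
    spaces = [rgb, color.rgb2hsv(rgb), color.rgb2lab(rgb), color.rgb2ycbcr(rgb)]
    local_hists = []
    for img in spaces:
        labels = kmeans_labels(img.reshape(-1, 3), k, seed).reshape(h, w)
        # Local histogram of labels: one smoothed indicator map per label.
        for c in range(k):
            local_hists.append(uniform_filter((labels == c).astype(float),
                                              size=window))
    features = np.stack(local_hists, axis=-1).reshape(h * w, -1)
    return kmeans_labels(features, k, seed).reshape(h, w)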
Kim, Ji Youn; Kim, Hai-Joong; Hahn, Meong Hi; Jeon, Hye Jin; Cho, Geum Joon; Hong, Sun Chul; Oh, Min Jeong
2013-09-01
Our aim was to determine whether the volumetric gray-scale histogram difference between the anterior and posterior cervix can indicate the extent of cervical consistency. We collected data from 95 patients at 36 to 37 weeks of gestational age who were appropriate for vaginal delivery, from September 2010 to October 2011 in the Department of Obstetrics and Gynecology, Korea University Ansan Hospital. Patients were excluded if they had any of the following: Cesarean section, labor induction, or premature rupture of membranes. Thirty-four patients were finally enrolled. The patients underwent evaluation of the cervix by Bishop score, cervical length, cervical volume, and three-dimensional (3D) cervical volumetric gray-scale histogram. The interval in days from the cervix evaluation to the delivery day was counted. We compared the 3D cervical volumetric gray-scale histogram, Bishop score, cervical length, and cervical volume with the interval in days from the evaluation of the cervix to delivery. The gray-scale histogram difference between the anterior and posterior cervix was significantly correlated with days to delivery; its correlation coefficient (R) was 0.500 (P = 0.003). The cervical length was also significantly related to the days to delivery; the correlation coefficient (R) and P-value were 0.421 and 0.013. However, the anterior lip histogram, posterior lip histogram, total cervical volume, and Bishop score were not associated with days to delivery (P >0.05). The gray-scale histogram difference between the anterior and posterior cervix and the cervical length correlated with the days to delivery. These methods can be utilized to better help predict cervical consistency.
Xu, Xiao-Quan; Ma, Gao; Wang, Yan-Jun; Hu, Hao; Su, Guo-Yi; Shi, Hai-Bin; Wu, Fei-Yun
2017-07-18
To evaluate the correlation between histogram parameters derived from diffusion-kurtosis (DK) imaging and the clinical stage of nasopharyngeal carcinoma (NPC). High T-stage (T3/4) NPC showed significantly higher Kapp-mean (P = 0.018), Kapp-median (P = 0.029) and Kapp-90th (P = 0.003) than low T-stage (T1/2) NPC. High N-stage NPC (N2/3) showed significantly lower Dapp-mean (P = 0.002), Dapp-median (P = 0.002) and Dapp-10th (P < 0.001) than low N-stage NPC (N0/1). High AJCC-stage NPC (III/IV) showed significantly lower Dapp-10th (P = 0.038) than low AJCC-stage NPC (I/II). ROC analyses indicated that Kapp-90th was optimal for predicting high T-stage (AUC, 0.759; sensitivity, 0.842; specificity, 0.607), while Dapp-10th was best for predicting high N- and AJCC-stage (N-stage, AUC, 0.841; sensitivity, 0.875; specificity, 0.807; AJCC-stage, AUC, 0.671; sensitivity, 0.800; specificity, 0.588). DK imaging data of forty-seven consecutive NPC patients were retrospectively analyzed. Apparent diffusion for Gaussian distribution (Dapp) and apparent kurtosis coefficient (Kapp) were generated using diffusion-kurtosis model. Histogram parameters, including mean, median, 10th, 90th percentiles, skewness and kurtosis of Dapp and Kapp were calculated. Patients were divided into low and high T, N and clinical stage based on American Joint Committee on Cancer (AJCC) staging system. Differences of histogram parameters between low and high T, N and AJCC stages were compared using t test. Multiple receiver operating characteristic (ROC) curves were used to determine and compare the value of significant parameters in predicting high T, N and AJCC stage, respectively. DK imaging-derived parameters correlated well with clinical stage of NPC, therefore could serve as an adjunctive imaging technique for evaluating NPC.
Ghosh, Tonmoy; Wahid, Khan A.
2018-01-01
Wireless capsule endoscopy (WCE) is the most advanced technology to visualize the whole gastrointestinal (GI) tract in a non-invasive way. Its major disadvantage, however, is the long reviewing time, which is very laborious as continuous manual intervention is necessary. In order to reduce the burden on the clinician, in this paper an automatic bleeding detection method for WCE video is proposed based on the color histogram of block statistics, namely CHOBS. A single pixel in a WCE image may be distorted due to capsule motion in the GI tract. Instead of considering individual pixel values, a block surrounding each individual pixel is chosen for extracting local statistical features. By combining local block features of the three color planes of the RGB color space, an index value is defined. A color histogram, which is extracted from those index values, provides a distinguishable color texture feature. A feature reduction technique utilizing the color histogram pattern and principal component analysis is proposed, which can drastically reduce the feature dimension. For bleeding zone detection, blocks are classified using the extracted local features, which does not add any computational burden for feature extraction. From extensive experimentation on several WCE videos and 2300 images collected from a publicly available database, a very satisfactory bleeding frame and zone detection performance is achieved in comparison to that obtained by some existing methods. In the case of bleeding frame detection, the accuracy, sensitivity, and specificity obtained by the proposed method are 97.85%, 99.47%, and 99.15%, respectively, and in the case of bleeding zone detection, a precision of 95.75% is achieved. The proposed method offers not only a low feature dimension but also highly satisfactory bleeding detection performance, and it can effectively detect bleeding frames and zones in continuous WCE video data. PMID:29468094
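A rough sketch of the block-statistics color feature is given below: per-pixel local block means of the R, G and B planes are quantized and combined into a single index, the index histogram is the frame feature, and PCA reduces its dimension before a simple classifier. The block size, number of quantization levels, classifier and the assumption of a float RGB image in [0, 1] are illustrative choices, not the exact CHOBS configuration.

# CHOBS-like block-statistics color histogram feature with PCA reduction.
import numpy as np
from scipy.ndimage import uniform_filter
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

def block_index_histogram(rgb, block=5, levels=8):
    """rgb: float image in [0, 1]; returns a normalized index histogram."""
    index = np.zeros(rgb.shape[:2], dtype=int)
    for c in range(3):
        block_mean = uniform_filter(rgb[..., c], size=block)
        quantized = np.minimum((block_mean * levels).astype(int), levels - 1)
        index = index * levels + quantized
    hist = np.bincount(index.ravel(), minlength=levels ** 3).astype(float)
    return hist / hist.sum()

def train_detector(frames, labels, n_components=32):
    feats = np.array([block_index_histogram(f) for f in frames])
    pca = PCA(n_components=n_components).fit(feats)
    clf = KNeighborsClassifier(n_neighbors=5).fit(pca.transform(feats), labels)
    return pca, clf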
Texture operator for snow particle classification into snowflake and graupel
NASA Astrophysics Data System (ADS)
Nurzyńska, Karolina; Kubo, Mamoru; Muramoto, Ken-ichiro
2012-11-01
In order to improve the estimation of precipitation, the coefficients of the Z-R relation should be determined for each snow type. Therefore, it is necessary to identify the type of falling snow. Consequently, this research addresses the problem of automatic snow particle classification into snowflake and graupel (as these types are the most common in the study region). With correctly classified precipitation events, it is believed that it will be possible to estimate the related parameters accurately. The automatic classification system presented here describes the images with texture operators. Some of them are well known from the literature: first-order features, co-occurrence matrix, grey-tone difference matrix, run length matrix, and local binary pattern; in addition, a novel approach to designing simple local statistic operators is introduced. In this work the following texture operators are defined: mean histogram, min-max histogram, and mean-variance histogram. Moreover, building a feature vector based on the structure created in many of the mentioned algorithms is also suggested. For classification, the k-nearest neighbour classifier was applied. The results showed that it is possible to achieve a correct classification accuracy above 80% with most of the techniques. The best result, 86.06%, was achieved for an operator built from the structure obtained in the middle stage of the co-occurrence matrix calculation. Next, it was noticed that describing an image with two texture operators does not improve the classification results considerably. In the best case the correct classification efficiency was 87.89% for a pair of texture operators created from the local binary pattern and the structure built in a middle stage of the grey-tone difference matrix calculation. This also suggests that the information gathered by each texture operator is redundant. Therefore, principal component analysis was applied in order to remove the unnecessary information and additionally reduce the length of the feature vectors. An improvement of the correct classification efficiency up to 100% is possible for the following methods: min-max histogram, the texture operator built from the structure obtained in a middle stage of the co-occurrence matrix calculation, the texture operator built from the structure obtained in a middle stage of the grey-tone difference matrix creation, and the texture operator based on a histogram, when the feature vector stores 99% of the initial information.
Construction and Evaluation of Histograms in Teacher Training
ERIC Educational Resources Information Center
Bruno, A.; Espinel, M. C.
2009-01-01
This article details the results of a written test designed to reveal how education majors construct and evaluate histograms and frequency polygons. Included is a description of the mistakes made by the students which shows how they tend to confuse histograms with bar diagrams, incorrectly assign data along the Cartesian axes and experience…
Empirical Histograms in Item Response Theory with Ordinal Data
ERIC Educational Resources Information Center
Woods, Carol M.
2007-01-01
The purpose of this research is to describe, test, and illustrate a new implementation of the empirical histogram (EH) method for ordinal items. The EH method involves the estimation of item response model parameters simultaneously with the approximation of the distribution of the random latent variable (theta) as a histogram. Software for the EH…
Yang, Su
2005-02-01
A new descriptor for symbol recognition is proposed. 1) A histogram is constructed for every pixel to figure out the distribution of the constraints among the other pixels. 2) All the histograms are statistically integrated to form a feature vector with fixed dimension. The robustness and invariance were experimentally confirmed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1983-01-01
This volume contains geology of the Durango D detail area, radioactive mineral occurrences in Colorado, and geophysical data interpretation. Eight appendices provide: stacked profiles, geologic histograms, geochemical histograms, speed and altitude histograms, geologic statistical tables, geochemical statistical tables, magnetic and ancillary profiles, and test line data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1983-01-01
Geology of Durango C detail area, radioactive mineral occurrences in Colorado, and geophysical data interpretation are included in this report. Eight appendices provide: stacked profiles, geologic histograms, geochemical histograms, speed and altitude histograms, geologic statistical tables, magnetic and ancillary profiles, and test line data.
Action recognition via cumulative histogram of multiple features
NASA Astrophysics Data System (ADS)
Yan, Xunshi; Luo, Yupin
2011-01-01
Spatial-temporal interest points (STIPs) are popular in human action recognition. However, they suffer from difficulties in determining the size of the codebook and lose much information when forming histograms. In this paper, spatial-temporal interest regions (STIRs) are proposed, which are based on STIPs and are capable of marking the locations of the most "shining" human body parts. In order to represent human actions, the proposed approach takes advantage of multiple features, including STIRs, pyramid histograms of oriented gradients and pyramid histograms of oriented optical flows. To achieve this, a cumulative histogram is used to integrate dynamic information in sequences and to form feature vectors. Furthermore, the widely used nearest neighbor and AdaBoost methods are employed as classification algorithms. Experiments on the public datasets KTH, Weizmann and UCF Sports show that the proposed approach achieves effective and robust results.
An Analysis of Measures Used to Evaluate the Air Force Critical Item Program
1991-09-01
example of a histogram. Cause & Effect Diagram. The cause and effect diagram was introduced in 1953 by Dr. Kaoru Ishikawa in summarizing the opinions of...Personal Interview. Air Force Institute of Technology, School of Engineering, Wright-Patterson AFB OH, 24 April 1991. 31. Ishikawa , Dr. Kaoru . Guide to...collected. How the data are collected will determine which measurement techniques are appropriate. Ishikawa classifies data collection into five categories
Meng, Jie; Zhu, Lijing; Zhu, Li; Ge, Yun; He, Jian; Zhou, Zhengyang; Yang, Xiaofeng
2017-11-01
Background: Apparent diffusion coefficient (ADC) histogram analysis has been widely used in determining tumor prognosis. Purpose: To investigate the dynamic changes of ADC histogram parameters during concurrent chemo-radiotherapy (CCRT) in patients with advanced cervical cancers. Material and Methods: This prospective study enrolled 32 patients with advanced cervical cancers undergoing CCRT who received diffusion-weighted (DW) magnetic resonance imaging (MRI) before CCRT, at the end of the second and fourth week during CCRT and one month after CCRT completion. The ADC histogram for the entire tumor volume was generated, and a series of histogram parameters was obtained. Dynamic changes of those parameters in cervical cancers were investigated as early biomarkers for treatment response. Results: All histogram parameters except AUClow showed significant changes during CCRT (all P < 0.05). There were three variable trends involving different parameters. The mode, 5th, 10th, and 25th percentiles showed similar early increase rates (33.33%, 33.99%, 34.12%, and 30.49%, respectively) at the end of the second week of CCRT. The pre-CCRT 5th and 25th percentiles of the complete response (CR) group were significantly lower than those of the partial response (PR) group. Conclusion: A series of ADC histogram parameters of cervical cancers changed significantly at the early stage of CCRT, indicating their potential in monitoring early tumor response to therapy.
Schob, Stefan; Münch, Benno; Dieckow, Julia; Quäschling, Ulf; Hoffmann, Karl-Titus; Richter, Cindy; Garnov, Nikita; Frydrychowicz, Clara; Krause, Matthias; Meyer, Hans-Jonas; Surov, Alexey
2018-04-01
Diffusion weighted imaging (DWI) quantifies motion of hydrogen nuclei in biological tissues and hereby has been used to assess the underlying tissue microarchitecture. Histogram-profiling of DWI provides more detailed information on the diffusion characteristics of a lesion than the standardly calculated values of the apparent diffusion coefficient (ADC): minimum, mean and maximum. Hence, the aim of our study was to investigate which parameters of histogram-profiling of DWI in primary central nervous system lymphoma can be used to specifically predict features like cellular density, chromatin content and proliferative activity. Pre-treatment ADC maps of 21 PCNSL patients (8 female, 13 male, 28-89 years) from a 1.5T system were used for Matlab-based histogram profiling. Results of histopathology (H&E staining) and immunohistochemistry (Ki-67 expression) were quantified. Correlations between histogram-profiling parameters and the neuropathologic examination were calculated using SPSS 23.0. The lower percentiles (p10 and p25) showed significant correlations with structural parameters of the neuropathologic examination (cellular density, chromatin content). The highest percentile, p90, correlated significantly with Ki-67 expression, resembling proliferative activity. Kurtosis of the ADC histogram correlated significantly with cellular density. Histogram-profiling of DWI in PCNSL provides a comprehensible set of parameters, which reflect distinct tumor-architectural and tumor-biological features, and hence, are promising biomarkers for treatment response and prognosis. Copyright © 2018. Published by Elsevier Inc.
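A generic sketch of the whole-lesion histogram profiling used in this and several of the studies above: given an ADC map and a lesion mask, it computes percentiles, skewness, kurtosis and histogram entropy. The bin count and the exact parameter set are illustrative assumptions rather than the specific configuration of any single paper.

# Whole-lesion ADC histogram profiling from a map and a boolean mask.
import numpy as np
from scipy.stats import skew, kurtosis

def adc_histogram_profile(adc_map, lesion_mask, bins=128):
    values = adc_map[lesion_mask].astype(float)
    hist, _ = np.histogram(values, bins=bins)
    p = hist / hist.sum()
    p_nonzero = p[p > 0]
    return {
        "mean": values.mean(),
        "p10": np.percentile(values, 10),
        "p25": np.percentile(values, 25),
        "median": np.percentile(values, 50),
        "p90": np.percentile(values, 90),
        "skewness": skew(values),
        "kurtosis": kurtosis(values),
        "entropy": float(-(p_nonzero * np.log2(p_nonzero)).sum()),
    }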
ADC histogram analysis of muscle lymphoma - Correlation with histopathology in a rare entity.
Meyer, Hans-Jonas; Pazaitis, Nikolaos; Surov, Alexey
2018-06-21
Diffusion weighted imaging (DWI) is able to reflect histopathology architecture. A novel imaging approach, namely histogram analysis, is used to further characterize lesions on MRI. The purpose of this study is to correlate histogram parameters derived from apparent diffusion coefficient (ADC) maps with histopathology parameters in muscle lymphoma. Eight patients (mean age 64.8 years, range 45-72 years) with histopathologically confirmed muscle lymphoma were retrospectively identified. Cell count, total nucleic and average nucleic areas were estimated using ImageJ. Additionally, the Ki67-index was calculated. DWI was obtained on a 1.5T scanner using b values of 0 and 1000 s/mm². Histogram analysis was performed as a whole-lesion measurement using a custom-made Matlab-based application. The correlation analysis revealed statistically significant correlations between cell count and ADCmean (p=-0.76, P=0.03) as well as ADCp75 (p=-0.79, P=0.02). Kurtosis and entropy correlated with average nucleic area (p=-0.81, P=0.02, p=0.88, P=0.007, respectively). None of the analyzed ADC parameters correlated with total nucleic area or with the Ki67-index. This study identified significant correlations between cellularity and histogram parameters derived from ADC maps in muscle lymphoma. Thus, histogram analysis parameters reflect histopathology in muscle tumors. Advances in knowledge: Whole-lesion ADC histogram analysis is able to reflect histopathology parameters in muscle lymphomas.
NASA Astrophysics Data System (ADS)
Galich, Nikolay E.; Filatov, Michael V.
2008-07-01
This communication describes the immunology experiments and the treatment of the experimental data. New nonlinear methods for the statistical analysis of immunofluorescence of peripheral blood neutrophils have been developed. We used the respiratory burst reaction of DNA fluorescence in neutrophil cell nuclei due to oxidative activity. Histograms of the photon count statistics of the radiant neutrophil populations in flow cytometry experiments are considered. Distributions of the fluorescence flash frequency as functions of the fluorescence intensity are analyzed. Statistical peculiarities of the histogram sets for healthy and unhealthy donors allow all histograms to be divided into three classes. The classification is based on three different types of smoothed, long-range scale-averaged immunofluorescence distributions and their bifurcations. Heterogeneity peculiarities of the long-range scale immunofluorescence distributions allow all histograms to be divided into three groups. The first histogram group belongs to healthy donors. The two other groups belong to donors with autoimmune and inflammatory diseases. Some of the illnesses are not diagnosed by standard biochemical methods. Medical standards and statistical data of the immunofluorescence histograms for the identification of health and illness are interconnected. Possibilities and alterations of immunofluorescence statistics in the registration, diagnostics and monitoring of different diseases under various medical treatments have been demonstrated. Health or illness criteria are connected with statistical features of the immunofluorescence histograms. Neutrophil population fluorescence is a sensitive, clear indicator of health status.
Accelerated weight histogram method for exploring free energy landscapes
NASA Astrophysics Data System (ADS)
Lindahl, V.; Lidmar, J.; Hess, B.
2014-07-01
Calculating free energies is an important and notoriously difficult task for molecular simulations. The rapid increase in computational power has made it possible to probe increasingly complex systems, yet extracting accurate free energies from these simulations remains a major challenge. Fully exploring the free energy landscape of, say, a biological macromolecule typically requires sampling large conformational changes and slow transitions. Often, the only feasible way to study such a system is to simulate it using an enhanced sampling method. The accelerated weight histogram (AWH) method is a new, efficient extended ensemble sampling technique which adaptively biases the simulation to promote exploration of the free energy landscape. The AWH method uses a probability weight histogram which allows for efficient free energy updates and results in an easy discretization procedure. A major advantage of the method is its general formulation, making it a powerful platform for developing further extensions and analyzing its relation to already existing methods. Here, we demonstrate its efficiency and general applicability by calculating the potential of mean force along a reaction coordinate for both a single dimension and multiple dimensions. We make use of a non-uniform, free energy dependent target distribution in reaction coordinate space so that computational efforts are not wasted on physically irrelevant regions. We present numerical results for molecular dynamics simulations of lithium acetate in solution and chignolin, a 10-residue long peptide that folds into a β-hairpin. We further present practical guidelines for setting up and running an AWH simulation.
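The following toy sketch is not the AWH algorithm itself; it only illustrates the underlying idea of adaptively updating a bias from a sampling histogram along a one-dimensional reaction coordinate, in a Wang-Landau-flavored way, so that the free energy landscape is flattened and can be read off from the accumulated bias. The model potential, update-size schedule and all names are assumptions made for illustration; for the actual method, the paper and its implementation should be consulted.

# Toy adaptive-bias random walk on a 1D double-well free-energy profile.
import numpy as np

def toy_adaptive_bias(n_steps=200000, n_bins=50, seed=0):
    rng = np.random.default_rng(seed)
    x_grid = np.linspace(-1.0, 1.0, n_bins)
    free_energy = 5.0 * (x_grid ** 2 - 0.5) ** 2      # double well, in kT units
    bias = np.zeros(n_bins)
    visits = np.zeros(n_bins)
    state = n_bins // 2
    delta = 0.5                                        # initial bias update size
    for step in range(1, n_steps + 1):
        proposal = int(np.clip(state + rng.integers(-1, 2), 0, n_bins - 1))
        # Metropolis acceptance on the biased profile F(x) + V_bias(x).
        d_e = (free_energy[proposal] + bias[proposal]) - (free_energy[state] + bias[state])
        if d_e <= 0 or rng.random() < np.exp(-d_e):
            state = proposal
        visits[state] += 1
        bias[state] += delta                           # push bias up where we are
        if step % 20000 == 0:
            delta *= 0.5                               # crude update-size schedule
    # At convergence F + bias is roughly flat, so F is estimated (up to a
    # constant) by the negative bias; shift so the minimum is zero.
    estimate = bias.max() - bias
    return x_grid, free_energy, estimate, visits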
n-SIFT: n-dimensional scale invariant feature transform.
Cheung, Warren; Hamarneh, Ghassan
2009-09-01
We propose the n-dimensional scale invariant feature transform (n-SIFT) method for extracting and matching salient features from scalar images of arbitrary dimensionality, and compare this method's performance to other related features. The proposed features extend the concepts used for 2-D scalar images in the computer vision SIFT technique for extracting and matching distinctive scale invariant features. We apply the features to images of arbitrary dimensionality through the use of hyperspherical coordinates for gradients and multidimensional histograms to create the feature vectors. We analyze the performance of a fully automated multimodal medical image matching technique based on these features, and successfully apply the technique to determine accurate feature point correspondence between pairs of 3-D MRI images and dynamic 3D + time CT data.
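The descriptor construction described above, gradients expressed in hyperspherical coordinates and accumulated into a multidimensional, magnitude-weighted histogram, can be sketched for the 3-D case as follows. This is not the published n-SIFT implementation; the patch size, bin counts, and normalisation are assumptions for illustration.

```python
import numpy as np

def orientation_histogram_3d(patch, n_theta=8, n_phi=4):
    """Bin 3-D gradient directions (hyperspherical angles) of a patch,
    weighted by gradient magnitude -- the core of an n-SIFT-like descriptor."""
    gz, gy, gx = np.gradient(patch.astype(float))
    mag = np.sqrt(gx**2 + gy**2 + gz**2)
    theta = np.arctan2(gy, gx)                                     # azimuth in [-pi, pi]
    phi = np.arccos(np.clip(gz / (mag + 1e-12), -1.0, 1.0))        # polar angle in [0, pi]
    hist, _ = np.histogramdd(
        np.column_stack([theta.ravel(), phi.ravel()]),
        bins=(n_theta, n_phi),
        range=((-np.pi, np.pi), (0.0, np.pi)),
        weights=mag.ravel(),
    )
    v = hist.ravel()
    return v / (np.linalg.norm(v) + 1e-12)                         # normalised feature vector

# Example: descriptor for a random 16^3 patch around a hypothetical keypoint.
patch = np.random.rand(16, 16, 16)
feat = orientation_histogram_3d(patch)
```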
NASA Astrophysics Data System (ADS)
Licznar, Paweł; Rupp, David; Adamowski, Witold
2013-04-01
In the fall of 2008, the Municipal Water Supply and Sewerage Company (MWSSC) in Warsaw began operating the first large precipitation monitoring network dedicated to urban hydrology in Poland. The process of establishing the network, as well as the preliminary phase of its operation, raised a number of questions concerning optimal gauge location and density and revealed the urgent need for new data processing techniques. When considering full-field precipitation as input to hydrodynamic models of stormwater and combined sewage systems, standard processing techniques developed previously for single gauges, which concentrate mainly on the analysis of maximum rainfall rates and the development of intensity-duration-frequency (IDF) curves, were found inadequate. We used a multifractal rainfall modeling framework based on microcanonical multiplicative random cascades to analyze the properties of Warsaw precipitation. We calculated breakdown coefficients (BDC) for the hierarchy of timescales from λ=1 (5-min) up to λ=128 (1280-min) for all 25 gauges in the network. At small timescales the histograms of BDCs were strongly deformed by the recording precision of rainfall amounts; a randomization procedure statistically removed these precision artifacts from the original series. At large timescales BDC values were sparse because of the relatively short period of observations (2008-2011). An algorithm with a moving window was proposed to increase the number of BDC values at large timescales and to smooth their histograms. The resulting empirical BDC histograms were modeled by a theoretical "2N-B" distribution, which combined two separate normal (N) distributions and one beta (B) distribution. A clear evolution of the BDC histograms, from a 2N-B distribution at small timescales to an N-B distribution at intermediate timescales and finally to a single beta distribution at large timescales, was observed for all gauges. Cluster analysis revealed close patterns of BDC distributions among almost all gauges and timescales, with the exception of two gauges located at the city limits (one of them at the Okęcie airport). We evaluated the performance of the microcanonical cascades at disaggregating 1280-min (quasi-daily) precipitation totals into 5-min rainfall data for selected gauges. The synthetic time series were analyzed with respect to their intermittency and the variability of rainfall intensities and compared to the observational series. We showed that microcanonical cascade models could be used in practice for generating synthetic rainfall time series suitable as input to urban hydrology models in Warsaw.
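A minimal sketch of two ingredients named in the abstract, computing breakdown coefficients at a dyadic timescale and performing microcanonical disaggregation steps, is given below. The symmetric beta generator, the synthetic rainfall series, and the fixed number of cascade levels are illustrative assumptions; the paper's 2N-B fitting and moving-window scheme are not reproduced.

```python
import numpy as np

def breakdown_coefficients(series, level):
    """BDCs at one dyadic timescale: ratio of each child sum to its parent sum.
    `series` is a 5-min rainfall depth series; `level` sets the parent aggregation 2**level."""
    coarse = series.reshape(-1, 2**level).sum(axis=1)        # parent totals
    finer = series.reshape(-1, 2**(level - 1)).sum(axis=1)   # child totals
    parents = np.repeat(coarse, 2)
    mask = parents > 0                                       # BDCs undefined for dry parents
    return finer[mask] / parents[mask]

def disaggregate_once(totals, rng, a=2.0):
    """One microcanonical cascade step: split each total as W and 1-W with W ~ Beta(a, a),
    so mass is conserved exactly within every parent interval."""
    w = rng.beta(a, a, size=totals.size)
    return np.column_stack([totals * w, totals * (1.0 - w)]).ravel()

rng = np.random.default_rng(1)
rain_5min = rng.gamma(0.3, 1.0, size=2**10)        # stand-in for an observed 5-min series
bdc = breakdown_coefficients(rain_5min, level=3)   # e.g. 40-min -> 20-min scale

# Disaggregate quasi-daily (1280-min) totals down to 5-min resolution in 8 cascade steps.
daily = rain_5min.reshape(-1, 256).sum(axis=1)
x = daily
for _ in range(8):
    x = disaggregate_once(x, rng)
```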
Time-cumulated visible and infrared histograms used as descriptor of cloud cover
NASA Technical Reports Server (NTRS)
Seze, G.; Rossow, W.
1987-01-01
To study the statistical behavior of clouds for different climate regimes, the spatial and temporal stability of VIS-IR bidimensional histograms is tested. Also, the effect of data sampling and averaging on the histogram shapes is considered; in particular the sampling strategy used by the International Satellite Cloud Climatology Project is tested.
Interpreting Histograms. As Easy as It Seems?
ERIC Educational Resources Information Center
Lem, Stephanie; Onghena, Patrick; Verschaffel, Lieven; Van Dooren, Wim
2014-01-01
Histograms are widely used, but recent studies have shown that they are not as easy to interpret as it might seem. In this article, we report on three studies on the interpretation of histograms in which we investigated, namely, (1) whether the misinterpretation by university students can be considered to be the result of heuristic reasoning, (2)…
Improving Real World Performance of Vision Aided Navigation in a Flight Environment
2016-09-15
Excerpt from the table of contents: 4.2 Wide Area Search Extent; 4.3 Large-Scale Image Navigation Histogram Filter (4.3.1 Location Model; 4.3.2 Measurement Model; 4.3.3 Histogram Filter; Iteration of Histogram Filter); 4.4 Implementation and Flight Test Campaign (4.4.1 Software Implementation).
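The recovered outline names a histogram filter with a location model and a measurement model. A generic 1-D discrete Bayes (histogram) filter with a predict/update cycle is sketched below as an illustration of that filter class, not of the thesis implementation; the circular grid, motion blur kernel, and Gaussian measurement likelihood are assumptions.

```python
import numpy as np

class HistogramFilter:
    """Discrete Bayes filter over a gridded 1-D location state (circular grid for simplicity)."""
    def __init__(self, n_cells):
        self.belief = np.full(n_cells, 1.0 / n_cells)    # uniform prior

    def predict(self, shift, spread=1):
        """Location model: shift the belief, then blur to represent motion uncertainty."""
        b = np.roll(self.belief, shift)
        kernel = np.ones(2 * spread + 1) / (2 * spread + 1)
        self.belief = np.convolve(b, kernel, mode="same")
        self.belief /= self.belief.sum()

    def update(self, likelihood):
        """Measurement model: multiply by the per-cell measurement likelihood and renormalise."""
        self.belief *= likelihood
        self.belief /= self.belief.sum()

# One filter iteration: move roughly 3 cells, then observe something consistent with cell 42.
hf = HistogramFilter(100)
hf.predict(shift=3)
cells = np.arange(100)
hf.update(np.exp(-0.5 * ((cells - 42) / 4.0) ** 2))
print(int(np.argmax(hf.belief)))
```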
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1983-01-01
This volume contains geology of the Durango A detail area, radioactive mineral occurences in Colorado, and geophysical data interpretation. Eight appendices provide the following: stacked profiles, geologic histograms, geochemical histograms, speed and altitude histograms, geologic statistical tables, geochemical statistical tables, magnetic and ancillary profiles, and test line data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1983-01-01
The geology of the Durango B detail area, the radioactive mineral occurrences in Colorado and the geophysical data interpretation are included in this report. Seven appendices contain: stacked profiles, geologic histograms, geochemical histograms, speed and altitude histograms, geologic statistical tables, geochemical statistical tables, and test line data.
Students' Understanding of Bar Graphs and Histograms: Results from the LOCUS Assessments
ERIC Educational Resources Information Center
Whitaker, Douglas; Jacobbe, Tim
2017-01-01
Bar graphs and histograms are core statistical tools that are widely used in statistical practice and commonly taught in classrooms. Despite their importance and the instructional time devoted to them, many students demonstrate misunderstandings when asked to read and interpret bar graphs and histograms. Much of the research that has been…
Cauley, K A; Hu, Y; Och, J; Yorks, P J; Fielden, S W
2018-04-01
The majority of brain growth and development occur in the first 2 years of life. This study investigated these changes by analysis of the brain radiodensity histogram of head CT scans from the clinical population, 0-2 years of age. One hundred twenty consecutive head CTs with normal findings meeting the inclusion criteria from children from birth to 2 years were retrospectively identified from 3 different CT scan platforms. Histogram analysis was performed on brain-extracted images, and histogram mean, mode, full width at half maximum, skewness, kurtosis, and SD were correlated with subject age. The effects of scan platform were investigated. Normative curves were fitted by polynomial regression analysis. Average total brain volume was 360 cm³ at birth, 948 cm³ at 1 year, and 1072 cm³ at 2 years. Total brain tissue density showed an 11% increase in mean density at 1 year and 19% at 2 years. Brain radiodensity histogram skewness was positive at birth, declining logarithmically in the first 200 days of life. The histogram kurtosis also decreased in the first 200 days to approach a normal distribution. Direct segmentation of CT images showed that changes in brain radiodensity histogram skewness correlated with, and can be explained by, a relative increase in gray matter volume and an increase in gray and white matter tissue density that occurs during this period of brain maturation. Normative metrics of the brain radiodensity histogram derived from routine clinical head CT images can be used to develop a model of normal brain development. © 2018 by American Journal of Neuroradiology.
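The histogram metrics reported in the study (mean, mode, full width at half maximum, skewness, kurtosis, SD) can be computed from brain-extracted voxel values roughly as sketched below; the bin width and the synthetic HU sample are assumptions, and FWHM is read directly off the binned histogram.

```python
import numpy as np
from scipy import stats

def radiodensity_histogram_metrics(hu_values, bin_width=1.0):
    """Summarise a brain radiodensity histogram (HU values of brain voxels only)."""
    hu = np.asarray(hu_values, float)
    bins = np.arange(hu.min(), hu.max() + bin_width, bin_width)
    counts, edges = np.histogram(hu, bins=bins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    mode = centers[np.argmax(counts)]
    # Full width at half maximum of the histogram peak.
    half = counts.max() / 2.0
    above = centers[counts >= half]
    fwhm = above.max() - above.min()
    return {
        "mean": hu.mean(),
        "sd": hu.std(ddof=1),
        "mode": mode,
        "fwhm": fwhm,
        "skewness": stats.skew(hu),
        "kurtosis": stats.kurtosis(hu),   # excess kurtosis (0 for a normal distribution)
    }

# Example with synthetic "brain" HU values standing in for a brain-extracted CT.
metrics = radiodensity_histogram_metrics(np.random.normal(30, 6, size=200_000))
```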
Meyer, Hans Jonas; Emmer, Alexander; Kornhuber, Malte; Surov, Alexey
2018-05-01
Diffusion-weighted imaging (DWI) has the potential to reflect histopathological architecture. A novel imaging approach, namely histogram analysis, is used to further characterize tissues on MRI. The aim of this study was to correlate histogram parameters derived from apparent diffusion coefficient (ADC) maps with serological parameters in myositis. 16 patients with autoimmune myositis were included in this retrospective study. DWI was obtained on a 1.5 T scanner by using b-values of 0 and 1000 s/mm². Histogram analysis was performed as a whole muscle measurement by using a custom-made Matlab-based application. The following ADC histogram parameters were estimated: ADCmean, ADCmax, ADCmin, ADCmedian, ADCmode, the percentiles ADCp10, ADCp25, ADCp75, and ADCp90, as well as the histogram parameters kurtosis, skewness, and entropy. In all patients, the blood sample was acquired within 3 days of the MRI. The following serological parameters were estimated: alanine aminotransferase, aspartate aminotransferase, creatine kinase, lactate dehydrogenase, C-reactive protein (CRP) and myoglobin. All patients were screened for Jo1 autoantibodies. Kurtosis correlated inversely with CRP (ρ = -0.55, p = 0.03). Furthermore, ADCp10 and ADCp90 values tended to correlate with creatine kinase (ρ = -0.43, p = 0.11 and ρ = -0.42, p = 0.12, respectively). In addition, ADCmean, p10, p25, median, mode, and entropy were different between Jo1-positive and Jo1-negative patients. ADC histogram parameters are sensitive for detection of muscle alterations in myositis patients. Advances in knowledge: This study identified that kurtosis derived from ADC maps is associated with CRP in myositis patients. Furthermore, several ADC histogram parameters are statistically different between Jo1-positive and Jo1-negative patients.
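A minimal sketch of whole-lesion ADC histogram parameters of the kind listed above (mean, minimum, maximum, median, mode, percentiles, skewness, kurtosis, entropy), computed from voxels inside a mask, might look as follows; the bin count used for entropy and the synthetic ADC map are assumptions.

```python
import numpy as np
from scipy import stats

def adc_histogram_parameters(adc_map, mask, n_bins=128):
    """Whole-lesion ADC histogram parameters from voxels where mask is True."""
    vals = adc_map[mask].astype(float)
    counts, edges = np.histogram(vals, bins=n_bins)
    p = counts / counts.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    p_nz = p[p > 0]
    params = {
        "mean": vals.mean(), "min": vals.min(), "max": vals.max(),
        "median": np.median(vals), "mode": centers[np.argmax(counts)],
        "skewness": stats.skew(vals), "kurtosis": stats.kurtosis(vals),
        "entropy": float(-(p_nz * np.log2(p_nz)).sum()),
    }
    for q in (10, 25, 75, 90):
        params[f"p{q}"] = np.percentile(vals, q)
    return params

# Example on a synthetic ADC map (units of 10^-6 mm^2/s) with a spherical "lesion" mask.
adc = np.random.normal(1100, 150, size=(30, 64, 64))
zz, yy, xx = np.ogrid[:30, :64, :64]
mask = (zz - 15) ** 2 + (yy - 32) ** 2 + (xx - 32) ** 2 < 10 ** 2
params = adc_histogram_parameters(adc, mask)
```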
Can histogram analysis of MR images predict aggressiveness in pancreatic neuroendocrine tumors?
De Robertis, Riccardo; Maris, Bogdan; Cardobi, Nicolò; Tinazzi Martini, Paolo; Gobbo, Stefano; Capelli, Paola; Ortolani, Silvia; Cingarlini, Sara; Paiella, Salvatore; Landoni, Luca; Butturini, Giovanni; Regi, Paolo; Scarpa, Aldo; Tortora, Giampaolo; D'Onofrio, Mirko
2018-06-01
To evaluate MRI derived whole-tumour histogram analysis parameters in predicting pancreatic neuroendocrine neoplasm (panNEN) grade and aggressiveness. Pre-operative MR images of 42 consecutive patients with panNENs >1 cm were retrospectively analysed. T1-/T2-weighted images and ADC maps were analysed. Histogram-derived parameters were compared to histopathological features using the Mann-Whitney U test. Diagnostic accuracy was assessed by ROC-AUC analysis; sensitivity and specificity were assessed for each histogram parameter. ADC entropy was significantly higher in G2-3 tumours with ROC-AUC 0.757; sensitivity and specificity were 83.3% (95% CI: 61.2-94.5) and 61.1% (95% CI: 36.1-81.7). ADC kurtosis was higher in panNENs with vascular involvement, nodal and hepatic metastases (p = .008, .021 and .008; ROC-AUC = 0.820, 0.709 and 0.820); sensitivity and specificity were: 85.7/74.3% (95% CI: 42-99.2/56.4-86.9), 36.8/96.5% (95% CI: 17.2-61.4/76-99.8) and 100/62.8% (95% CI: 56.1-100/44.9-78.1). No significant differences between groups were found for other histogram-derived parameters (p > .05). Whole-tumour histogram analysis of ADC maps may be helpful in predicting tumour grade, vascular involvement, nodal and liver metastases in panNENs. ADC entropy and ADC kurtosis are the most accurate parameters for identification of panNENs with malignant behaviour. • Whole-tumour ADC histogram analysis can predict aggressiveness in pancreatic neuroendocrine neoplasms. • ADC entropy and kurtosis are higher in aggressive tumours. • ADC histogram analysis can quantify tumour diffusion heterogeneity. • Non-invasive quantification of tumour heterogeneity can provide adjunctive information for prognostication.
Tsuchiya, Naoko; Doai, Mariko; Usuda, Katsuo; Uramoto, Hidetaka; Tonami, Hisao
2017-01-01
Investigating the diagnostic accuracy of histogram analyses of apparent diffusion coefficient (ADC) values for determining non-small cell lung cancer (NSCLC) tumor grades, lymphovascular invasion, and pleural invasion. We studied 60 surgically diagnosed NSCLC patients. Diffusion-weighted imaging (DWI) was performed in the axial plane using a navigator-triggered single-shot, echo-planar imaging sequence with prospective acquisition correction. The ADC maps were generated, and we placed a volume-of-interest on the tumor to construct the whole-lesion histogram. Using the histogram, we calculated the mean, 5th, 10th, 25th, 50th, 75th, 90th, and 95th percentiles of ADC, skewness, and kurtosis. Histogram parameters were correlated with tumor grade, lymphovascular invasion, and pleural invasion. We performed a receiver operating characteristics (ROC) analysis to assess the diagnostic performance of histogram parameters for distinguishing different pathologic features. The ADC mean, 10th, 25th, 50th, 75th, 90th, and 95th percentiles showed significant differences among the tumor grades. The ADC mean, 25th, 50th, 75th, 90th, and 95th percentiles were significant histogram parameters between high- and low-grade tumors. The ROC analysis between high- and low-grade tumors showed that the 95th percentile ADC achieved the highest area under curve (AUC) at 0.74. Lymphovascular invasion was associated with the ADC mean, 50th, 75th, 90th, and 95th percentiles, skewness, and kurtosis. Kurtosis achieved the highest AUC at 0.809. Pleural invasion was only associated with skewness, with the AUC of 0.648. ADC histogram analyses on the basis of the entire tumor volume are able to stratify NSCLCs' tumor grade, lymphovascular invasion and pleural invasion.
Improved automatic adjustment of density and contrast in FCR system using neural network
NASA Astrophysics Data System (ADS)
Takeo, Hideya; Nakajima, Nobuyoshi; Ishida, Masamitsu; Kato, Hisatoyo
1994-05-01
The FCR system automatically adjusts image density and contrast by analyzing the histogram of the image data in the radiation field. The advanced image recognition methods proposed in this paper, which use neural network technology, can improve this automatic adjustment performance. There are two methods, both based on a 3-layer neural network with back propagation. In one method the image data are fed directly to the input layer; in the other, the histogram data are used as input. The former is effective for imaging menus such as the shoulder joint, where the position of the region of interest within the histogram changes with patient positioning, and the latter is effective for imaging menus such as the pediatric chest, where the histogram shape changes with positioning. We experimentally confirm the validity of both methods, in terms of automatic adjustment performance, as compared with conventional histogram analysis methods.
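The second approach described, feeding the grey-level histogram to a small three-layer back-propagation network that predicts adjustment parameters, can be sketched as below. The network size, the (density, contrast) output parameterisation, and the toy training data are assumptions, and scikit-learn's MLPRegressor stands in for the original network.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def image_to_histogram_features(image, n_bins=64):
    """Normalised grey-level histogram used as the network input vector."""
    h, _ = np.histogram(image.ravel(), bins=n_bins, range=(0.0, 1.0))
    return h / h.sum()

# Toy training set: random "radiographs" with known (density, contrast) targets.
rng = np.random.default_rng(0)
X, y = [], []
for _ in range(500):
    density, contrast = rng.uniform(0.3, 0.7), rng.uniform(0.5, 2.0)
    img = np.clip(rng.normal(density, 0.1 / contrast, size=(64, 64)), 0.0, 1.0)
    X.append(image_to_histogram_features(img))
    y.append([density, contrast])

# A single hidden layer gives the 3-layer (input / hidden / output) structure.
net = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
net.fit(np.array(X), np.array(y))
pred_density, pred_contrast = net.predict(np.array(X[:1]))[0]
```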
NASA Astrophysics Data System (ADS)
Zeng, Bangze; Zhu, Youpan; Li, Zemin; Hu, Dechao; Luo, Lin; Zhao, Deli; Huang, Juan
2014-11-01
Due to the low contrast, strong noise, and poor visual quality of infrared images, targets are difficult to observe and identify. This paper presents an improved infrared image detail enhancement algorithm based on adaptive histogram statistical stretching and gradient filtering (AHSS-GF). Because the human eye is very sensitive to edges and lines, the details and textures are extracted by gradient filtering. A new histogram is acquired by summing the original histogram over a fixed window, and histogram statistical stretching is then carried out with the minimum value as the cut-off point. After proper weights are assigned to the details and the background, the detail-enhanced result is obtained. The results indicate that image contrast is improved and that the details and textures are enhanced effectively.
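A rough sketch in the spirit of the AHSS-GF outline, a high-pass detail layer plus histogram statistical stretching of the base layer and a weighted recombination, is given below; the window size, cut-off count, layer weights, and the use of a local-mean high-pass in place of the paper's gradient filter are assumptions.

```python
import numpy as np
from scipy import ndimage

def ahss_gf_like_enhance(ir, window=9, cutoff=5, w_base=1.0, w_detail=3.0):
    """Rough sketch of AHSS-GF-style infrared detail enhancement (wide-range input, 8-bit output)."""
    ir = ir.astype(float)
    # Detail/texture layer: a local-mean high-pass standing in for the paper's gradient filtering.
    detail = ir - ndimage.uniform_filter(ir, size=3)
    base = ir - detail
    # Histogram "statistical stretching": sum the histogram over a fixed window,
    # drop sparsely populated bins, and stretch the surviving range to 0..255.
    counts, edges = np.histogram(base, bins=256)
    smoothed = np.convolve(counts, np.ones(window), mode="same")
    valid = np.where(smoothed > cutoff)[0]
    lo, hi = edges[valid[0]], edges[valid[-1] + 1]
    base8 = np.clip((base - lo) / (hi - lo), 0.0, 1.0) * 255.0
    out = w_base * base8 + w_detail * detail
    return np.clip(out, 0, 255).astype(np.uint8)

enhanced = ahss_gf_like_enhance(np.random.randint(0, 16384, size=(240, 320)))
```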
Mechanical methods of chaparral modification
George Roby; Lisle Green
1976-01-01
Chaparral modification is undertaken for a variety of land-management purposes. To help land managers in selecting equipment and methods for such work, practitioners in county, State, and Federal modification projects were asked for evaluations of equipment and techniques they had used. This handbook describes the alternative techniques and equipment, provides...
NASA Technical Reports Server (NTRS)
Eigen, D. J.; Fromm, F. R.; Northouse, R. A.
1974-01-01
A new clustering algorithm is presented that is based on dimensional information. The algorithm includes an inherent feature selection criterion, which is discussed. Further, a heuristic method for choosing the proper number of intervals for a frequency distribution histogram, a feature necessary for the algorithm, is presented. The algorithm, although usable as a stand-alone clustering technique, is then utilized as a global approximator. Local clustering techniques and configuration of a global-local scheme are discussed, and finally the complete global-local and feature selector configuration is shown in application to a real-time adaptive classification scheme for the analysis of remote sensed multispectral scanner data.
Single Photon Counting Performance and Noise Analysis of CMOS SPAD-Based Image Sensors.
Dutton, Neale A W; Gyongy, Istvan; Parmesan, Luca; Henderson, Robert K
2016-07-20
SPAD-based solid state CMOS image sensors utilising analogue integrators have attained deep sub-electron read noise (DSERN) permitting single photon counting (SPC) imaging. A new method is proposed to determine the read noise in DSERN image sensors by evaluating the peak separation and width (PSW) of single photon peaks in a photon counting histogram (PCH). The technique is used to identify and analyse cumulative noise in analogue integrating SPC SPAD-based pixels. The DSERN of our SPAD image sensor is exploited to confirm recent multi-photon threshold quanta image sensor (QIS) theory. Finally, various single and multiple photon spatio-temporal oversampling techniques are reviewed.
Advances in the surface modification techniques of bone-related implants for last 10 years
Qiu, Zhi-Ye; Chen, Cen; Wang, Xiu-Mei; Lee, In-Seop
2014-01-01
At the time of implanting bone-related implants into the human body, a variety of biological responses to the material surface occur with respect to surface chemistry and physical state. The commonly used biomaterials (e.g. titanium and its alloys, Co–Cr alloy, stainless steel, polyetheretherketone, ultra-high molecular weight polyethylene and various calcium phosphates) have many drawbacks such as lack of biocompatibility and improper mechanical properties. As surface modification is a very promising technology for overcoming such problems, a variety of surface modification techniques are being investigated. This review paper covers recent advances in surface modification techniques of bone-related materials including physicochemical coating, radiation grafting, plasma surface engineering, ion beam processing and surface patterning techniques. The contents are organized by the different types of techniques and the materials to which they apply, and typical examples are also described. PMID:26816626
Research of image retrieval technology based on color feature
NASA Astrophysics Data System (ADS)
Fu, Yanjun; Jiang, Guangyu; Chen, Fengying
2009-10-01
Recently, with the development of communication and computer technology and improvements in storage technology and digital imaging equipment, far more image resources are available than ever before, so a way to locate the desired image quickly and accurately is needed. The early approach was to search a database by keywords, but this becomes impractical as image collections grow. To overcome the limitations of traditional search, content-based image retrieval emerged and is now an active research subject; color image retrieval is an important part of it, and color is its most important feature. Three key questions on how to make use of the color characteristic are discussed in this paper: the representation of color, the extraction of color features, and the measurement of similarity based on color. On this basis, the extraction of the color histogram feature is discussed in detail. Considering the advantages and disadvantages of the overall histogram and the partition histogram, a new method based on the partition-overall histogram is proposed. The basic idea is to divide the image space according to a certain strategy and then calculate the color histogram of each block as that block's color feature. Users choose the blocks that contain important spatial information and confirm their weights. The system calculates the distances between the corresponding user-chosen blocks; the remaining blocks are merged into partial overall histograms, and their distance is calculated as well. Accumulating all these distances gives the overall distance between two images. The partition-overall histogram combines the advantages of the two earlier methods: choosing blocks makes the feature carry more spatial information, which improves performance, while the distances between partition-overall histograms remain invariant to rotation and translation. The HSV color space, which is suited to human visual perception, is used to represent the color characteristics of the image; taking advantage of human color perception, the color sectors are quantized with unequal intervals to obtain the feature vector. Finally, image similarity is matched with the histogram intersection algorithm applied to the partition-overall histogram. Users can choose a demonstration image to express their query, and can also adjust the weights through relevance feedback to obtain the best search result. An image retrieval system based on these approaches is presented. The experimental results show that image retrieval based on the partition-overall histogram preserves spatial distribution information while extracting color features efficiently, and that it is superior to normal color histograms in retrieval precision; the query precision rate is more than 95%. In addition, the efficient block-based representation reduces the complexity of the images to be searched, which increases search efficiency. The image retrieval algorithm based on the partition-overall histogram proposed in this paper is efficient and effective.
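A minimal sketch of the partition-overall idea, per-block HSV histograms, block-to-block matching for user-selected blocks, pooling of the remaining blocks, and histogram-intersection scoring, might look as follows; the 3×3 grid, HSV quantisation, and block weights are assumptions rather than the paper's exact settings.

```python
import numpy as np
import cv2

def block_hsv_histograms(bgr_image, grid=(3, 3), bins=(8, 4, 4)):
    """Per-block HSV histograms on a grid partition of a uint8 BGR image."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    h, w = hsv.shape[:2]
    hists = []
    for i in range(grid[0]):
        for j in range(grid[1]):
            block = hsv[i * h // grid[0]:(i + 1) * h // grid[0],
                        j * w // grid[1]:(j + 1) * w // grid[1]]
            hist = cv2.calcHist([block], [0, 1, 2], None, list(bins),
                                [0, 180, 0, 256, 0, 256]).ravel()
            hists.append(hist / (hist.sum() + 1e-12))
    return np.array(hists)

def intersection(h1, h2):
    return np.minimum(h1, h2).sum()

def similarity(hists_q, hists_db, important=(4,), w_important=2.0):
    """Important blocks are matched block-to-block; the rest are pooled ('overall')."""
    score = sum(w_important * intersection(hists_q[b], hists_db[b]) for b in important)
    rest = [b for b in range(len(hists_q)) if b not in important]
    return score + intersection(hists_q[rest].mean(axis=0), hists_db[rest].mean(axis=0))

# Usage on two random stand-in images.
img_a = np.random.randint(0, 256, (120, 160, 3), dtype=np.uint8)
img_b = np.random.randint(0, 256, (120, 160, 3), dtype=np.uint8)
s = similarity(block_hsv_histograms(img_a), block_hsv_histograms(img_b))
```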
A novel technique for prevention of self-mutilation in three Harris' hawks (Parabuteo unicinctus).
Smith, Stephen P; Forbes, Neil A
2009-03-01
Feather destructive behavior is an ever-increasing problem in captive birds of prey and any associated self-mutilation can be challenging to manage. This clinical report describes a noninvasive, nonpainful, temporary beak modification technique used to prevent self-mutilation in 3 Harris' hawks (Parabuteo unicinctus), when other methods and husbandry modifications had failed. The technique was successful in all 3 birds, and no adverse effects were observed. Further investigations are required to further evaluate if this beak modification technique causes any psychologic or physical stress to the avian patient.
Grating interferometry-based phase microtomography of atherosclerotic human arteries
NASA Astrophysics Data System (ADS)
Buscema, Marzia; Holme, Margaret N.; Deyhle, Hans; Schulz, Georg; Schmitz, Rüdiger; Thalmann, Peter; Hieber, Simone E.; Chicherova, Natalia; Cattin, Philippe C.; Beckmann, Felix; Herzen, Julia; Weitkamp, Timm; Saxer, Till; Müller, Bert
2014-09-01
Cardiovascular diseases are the number one cause of death and morbidity in the world. Understanding disease development in terms of lumen morphology and tissue composition of constricted arteries is essential to improve treatment and patient outcome. X-ray tomography provides non-destructive three-dimensional data with micrometer-resolution. However, a common problem is simultaneous visualization of soft and hard tissue-containing specimens, such as atherosclerotic human coronary arteries. Unlike absorption based techniques, where X-ray absorption strongly depends on atomic number and tissue density, phase contrast methods such as grating interferometry have significant advantages as the phase shift is only a linear function of the atomic number. We demonstrate that grating interferometry-based phase tomography is a powerful method to three-dimensionally visualize a variety of anatomical features in atherosclerotic human coronary arteries, including plaque, muscle, fat, and connective tissue. Three formalin-fixed, human coronary arteries were measured using advanced laboratory μCT. While this technique gives information about plaque morphology, it is impossible to extract the lumen morphology. Therefore, selected regions were measured using grating based phase tomography, sinograms were treated with a wavelet-Fourier filter to remove ring artifacts, and reconstructed data were processed to allow extraction of vessel lumen morphology. Phase tomography data in combination with conventional laboratory μCT data of the same specimen shows potential, through use of a joint histogram, to identify more tissue types than either technique alone. Such phase tomography data was also rigidly registered to subsequently decalcified arteries that were histologically sectioned, although the quality of registration was insufficient for joint histogram analysis.
Spline smoothing of histograms by linear programming
NASA Technical Reports Server (NTRS)
Bennett, J. O.
1972-01-01
An algorithm is presented for obtaining an approximating function to the frequency distribution from a sample of size n. To obtain the approximating function, a histogram is made from the data. Next, Euclidean space approximations to the graph of the histogram, using central B-splines as basis elements, are obtained by linear programming. The approximating function has area one and is nonnegative.
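A sketch of the same idea with SciPy, fitting nonnegative coefficients of a B-spline basis to histogram bin heights by linear programming, is shown below; for brevity it uses degree-1 (hat-function) B-splines rather than the central B-splines of the report, minimises the maximum deviation, and renormalises to unit area afterwards rather than constraining it inside the LP.

```python
import numpy as np
from scipy.optimize import linprog

def smooth_histogram_lp(counts, edges, n_knots=12):
    """Fit a nonnegative piecewise-linear spline (degree-1 B-splines, i.e. hat functions)
    to histogram bin heights by minimising the maximum deviation with a linear program."""
    density = counts / (counts.sum() * np.diff(edges))      # histogram as a density
    x = 0.5 * (edges[:-1] + edges[1:])                      # bin centres
    knots = np.linspace(edges[0], edges[-1], n_knots)
    # Hat-function basis matrix B[m, j] = basis_j(x_m).
    B = np.maximum(0.0, 1.0 - np.abs((x[:, None] - knots[None, :]) / (knots[1] - knots[0])))
    m, n = B.shape
    # Variables [c_1..c_n, e]: minimise e subject to -e <= (B c - density) <= e and c >= 0.
    cvec = np.zeros(n + 1); cvec[-1] = 1.0
    A_ub = np.vstack([np.hstack([B, -np.ones((m, 1))]),
                      np.hstack([-B, -np.ones((m, 1))])])
    b_ub = np.concatenate([density, -density])
    res = linprog(cvec, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
    fit = B @ res.x[:n]
    return fit / np.trapz(fit, x)                           # enforce unit area a posteriori

counts, edges = np.histogram(np.random.standard_normal(2000), bins=30)
smoothed = smooth_histogram_lp(counts, edges)
```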
Kwon, M-R; Shin, J H; Hahn, S Y; Oh, Y L; Kwak, J Y; Lee, E; Lim, Y
2018-06-01
To evaluate the diagnostic value of histogram analysis using ultrasound (US) to differentiate between the subtypes of follicular variant of papillary thyroid carcinoma (FVPTC). The present study included 151 patients with surgically confirmed FVPTC diagnosed between January 2014 and May 2016. Their preoperative US features were reviewed retrospectively. Histogram parameters (mean, maximum, minimum, range, root mean square, skewness, kurtosis, energy, entropy, and correlation) were obtained for each nodule. The 152 nodules in 151 patients comprised 48 non-invasive follicular thyroid neoplasms with papillary-like nuclear features (NIFTPs; 31.6%), 60 invasive encapsulated FVPTCs (EFVPTCs; 39.5%), and 44 infiltrative FVPTCs (28.9%). The US features differed significantly between the subtypes of FVPTC. Discrimination was achieved between NIFTPs and infiltrative FVPTC, and between invasive EFVPTC and infiltrative FVPTC, using histogram parameters; however, the parameters were not significantly different between NIFTP and invasive EFVPTC. It is feasible to use greyscale histogram analysis to differentiate between NIFTP and infiltrative FVPTC, but not between NIFTP and invasive EFVPTC. Histograms can be used as a supplementary tool to differentiate the subtypes of FVPTC. Copyright © 2017 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.
Song, Yong Sub; Choi, Seung Hong; Park, Chul-Kee; Yi, Kyung Sik; Lee, Woong Jae; Yun, Tae Jin; Kim, Tae Min; Lee, Se-Hoon; Kim, Ji-Hoon; Sohn, Chul-Ho; Park, Sung-Hye; Kim, Il Han; Jahng, Geon-Ho; Chang, Kee-Hyun
2013-01-01
The purpose of this study was to differentiate true progression from pseudoprogression of glioblastomas treated with concurrent chemoradiotherapy (CCRT) with temozolomide (TMZ) by using histogram analysis of apparent diffusion coefficient (ADC) and normalized cerebral blood volume (nCBV) maps. Twenty patients with histopathologically proven glioblastoma who had received CCRT with TMZ underwent perfusion-weighted imaging and diffusion-weighted imaging (b = 0, 1000 sec/mm²). The corresponding nCBV and ADC maps for the newly visible, entirely enhancing lesions were calculated after the completion of CCRT with TMZ. Two observers independently measured the histogram parameters of the nCBV and ADC maps. The histogram parameters between the true progression group (n = 10) and the pseudoprogression group (n = 10) were compared by use of an unpaired Student's t test and subsequent multivariable stepwise logistic regression analysis to determine the best predictors for the differential diagnosis between the two groups. Receiver operating characteristic analysis was employed to determine the best cutoff values for the histogram parameters that proved to be significant predictors for differentiating true progression from pseudoprogression. Intraclass correlation coefficient was used to determine the level of inter-observer reliability for the histogram parameters. The 5th percentile value (C5) of the cumulative ADC histograms was a significant predictor for the differential diagnosis between true progression and pseudoprogression (p = 0.044 for observer 1; p = 0.011 for observer 2). Optimal cutoff values of 892 × 10⁻⁶ mm²/sec for observer 1 and 907 × 10⁻⁶ mm²/sec for observer 2 could help differentiate between the two groups with a sensitivity of 90% and 80%, respectively, a specificity of 90% and 80%, respectively, and an area under the curve of 0.880 and 0.840, respectively. There was no other significant differentiating parameter on the nCBV histograms. Inter-observer reliability was excellent or good for all histogram parameters (intraclass correlation coefficient range: 0.70-0.99). The C5 of the cumulative ADC histogram can be a promising parameter for the differentiation of true progression from pseudoprogression of newly visible, entirely enhancing lesions after CCRT with TMZ for glioblastomas.
Song, Yong Sub; Park, Chul-Kee; Yi, Kyung Sik; Lee, Woong Jae; Yun, Tae Jin; Kim, Tae Min; Lee, Se-Hoon; Kim, Ji-Hoon; Sohn, Chul-Ho; Park, Sung-Hye; Kim, Il Han; Jahng, Geon-Ho; Chang, Kee-Hyun
2013-01-01
Objective: The purpose of this study was to differentiate true progression from pseudoprogression of glioblastomas treated with concurrent chemoradiotherapy (CCRT) with temozolomide (TMZ) by using histogram analysis of apparent diffusion coefficient (ADC) and normalized cerebral blood volume (nCBV) maps. Materials and Methods: Twenty patients with histopathologically proven glioblastoma who had received CCRT with TMZ underwent perfusion-weighted imaging and diffusion-weighted imaging (b = 0, 1000 sec/mm²). The corresponding nCBV and ADC maps for the newly visible, entirely enhancing lesions were calculated after the completion of CCRT with TMZ. Two observers independently measured the histogram parameters of the nCBV and ADC maps. The histogram parameters between the true progression group (n = 10) and the pseudoprogression group (n = 10) were compared by use of an unpaired Student's t test and subsequent multivariable stepwise logistic regression analysis to determine the best predictors for the differential diagnosis between the two groups. Receiver operating characteristic analysis was employed to determine the best cutoff values for the histogram parameters that proved to be significant predictors for differentiating true progression from pseudoprogression. Intraclass correlation coefficient was used to determine the level of inter-observer reliability for the histogram parameters. Results: The 5th percentile value (C5) of the cumulative ADC histograms was a significant predictor for the differential diagnosis between true progression and pseudoprogression (p = 0.044 for observer 1; p = 0.011 for observer 2). Optimal cutoff values of 892 × 10⁻⁶ mm²/sec for observer 1 and 907 × 10⁻⁶ mm²/sec for observer 2 could help differentiate between the two groups with a sensitivity of 90% and 80%, respectively, a specificity of 90% and 80%, respectively, and an area under the curve of 0.880 and 0.840, respectively. There was no other significant differentiating parameter on the nCBV histograms. Inter-observer reliability was excellent or good for all histogram parameters (intraclass correlation coefficient range: 0.70-0.99). Conclusion: The C5 of the cumulative ADC histogram can be a promising parameter for the differentiation of true progression from pseudoprogression of newly visible, entirely enhancing lesions after CCRT with TMZ for glioblastomas. PMID:23901325
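The two steps reported, the 5th percentile (C5) of each lesion's cumulative ADC histogram and an ROC analysis of that parameter across the two groups, can be sketched as follows; the synthetic ADC values, group sizes, and the Youden-index cut-off are illustrative assumptions, not the study's data.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

def cumulative_adc_c5(adc_values):
    """5th percentile (C5) of a lesion's cumulative ADC histogram."""
    return np.percentile(adc_values, 5)

rng = np.random.default_rng(0)
# Hypothetical lesions: true progression tends to show lower ADC than pseudoprogression.
c5_true = [cumulative_adc_c5(rng.normal(850, 120, 4000)) for _ in range(10)]
c5_pseudo = [cumulative_adc_c5(rng.normal(1000, 120, 4000)) for _ in range(10)]

labels = np.r_[np.ones(10), np.zeros(10)]          # 1 = true progression
scores = -np.r_[c5_true, c5_pseudo]                # lower C5 -> more suspicious
auc = roc_auc_score(labels, scores)
fpr, tpr, thresholds = roc_curve(labels, scores)
best = np.argmax(tpr - fpr)                        # Youden-index optimal cut-off
cutoff_c5 = -thresholds[best]
```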
Evaluation of Techniques for Reducing In-Use Automotive Fuel Consumption
DOT National Transportation Integrated Search
1981-04-01
This report presents an assessment of proposed techniques for reducing fuel consumption in the in-use light duty road vehicle fleet. Three general classes of techniques are treated: (1) modification of vehicles, (2) modification of traffic flow, and ...
Schob, Stefan; Meyer, Hans Jonas; Dieckow, Julia; Pervinder, Bhogal; Pazaitis, Nikolaos; Höhn, Anne Kathrin; Garnov, Nikita; Horvath-Rizea, Diana; Hoffmann, Karl-Titus; Surov, Alexey
2017-04-12
Pre-surgical diffusion weighted imaging (DWI) is increasingly important in the context of thyroid cancer for identification of the optimal treatment strategy. It has exemplarily been shown that DWI at 3T can distinguish undifferentiated from well-differentiated thyroid carcinoma, which has decisive implications for the magnitude of surgery. This study used DWI histogram analysis of whole tumor apparent diffusion coefficient (ADC) maps. The primary aim was to discriminate thyroid carcinomas which had already gained the capacity to metastasize lymphatically from those not yet being able to spread via the lymphatic system. The secondary aim was to reflect prognostically important tumor-biological features like cellularity and proliferative activity with ADC histogram analysis. Fifteen patients with follicular-cell derived thyroid cancer were enrolled. Lymph node status, extent of infiltration of surrounding tissue, and Ki-67 and p53 expression were assessed in these patients. DWI was obtained in a 3T system using b values of 0, 400, and 800 s/mm². Whole tumor ADC volumes were analyzed using a histogram-based approach. Several ADC parameters showed significant correlations with immunohistopathological parameters. Most importantly, ADC histogram skewness and ADC histogram kurtosis were able to differentiate between nodal negative and nodal positive thyroid carcinoma. Histogram analysis of whole ADC tumor volumes has the potential to provide valuable information on tumor biology in thyroid carcinoma. However, further studies are warranted.
Schob, Stefan; Meyer, Hans Jonas; Dieckow, Julia; Pervinder, Bhogal; Pazaitis, Nikolaos; Höhn, Anne Kathrin; Garnov, Nikita; Horvath-Rizea, Diana; Hoffmann, Karl-Titus; Surov, Alexey
2017-01-01
Pre-surgical diffusion weighted imaging (DWI) is increasingly important in the context of thyroid cancer for identification of the optimal treatment strategy. It has exemplarily been shown that DWI at 3T can distinguish undifferentiated from well-differentiated thyroid carcinoma, which has decisive implications for the magnitude of surgery. This study used DWI histogram analysis of whole tumor apparent diffusion coefficient (ADC) maps. The primary aim was to discriminate thyroid carcinomas which had already gained the capacity to metastasize lymphatically from those not yet being able to spread via the lymphatic system. The secondary aim was to reflect prognostically important tumor-biological features like cellularity and proliferative activity with ADC histogram analysis. Fifteen patients with follicular-cell derived thyroid cancer were enrolled. Lymph node status, extent of infiltration of surrounding tissue, and Ki-67 and p53 expression were assessed in these patients. DWI was obtained in a 3T system using b values of 0, 400, and 800 s/mm2. Whole tumor ADC volumes were analyzed using a histogram-based approach. Several ADC parameters showed significant correlations with immunohistopathological parameters. Most importantly, ADC histogram skewness and ADC histogram kurtosis were able to differentiate between nodal negative and nodal positive thyroid carcinoma. Conclusions: histogram analysis of whole ADC tumor volumes has the potential to provide valuable information on tumor biology in thyroid carcinoma. However, further studies are warranted. PMID:28417929
Zolal, Amir; Juratli, Tareq A; Linn, Jennifer; Podlesek, Dino; Sitoci Ficici, Kerim Hakan; Kitzler, Hagen H; Schackert, Gabriele; Sobottka, Stephan B; Rieger, Bernhard; Krex, Dietmar
2016-05-01
Objective: To determine the value of apparent diffusion coefficient (ADC) histogram parameters for the prediction of individual survival in patients undergoing surgery for recurrent glioblastoma (GBM) in a retrospective cohort study. Methods: Thirty-one patients who underwent surgery for first recurrence of a known GBM between 2008 and 2012 were included. The following parameters were collected: age, sex, enhancing tumor size, mean ADC, median ADC, ADC skewness, ADC kurtosis and fifth percentile of the ADC histogram, initial progression free survival (PFS), extent of second resection and further adjuvant treatment. The association of these parameters with survival and PFS after second surgery was analyzed using the log-rank test and Cox regression. Results: Using the log-rank test, ADC histogram skewness of the enhancing tumor was significantly associated with both survival (p = 0.001) and PFS after second surgery (p = 0.005). Further parameters associated with prolonged survival after second surgery were: gross total resection at second surgery (p = 0.026), tumor size (p = 0.040) and third surgery (p = 0.003). In the multivariate Cox analysis, ADC histogram skewness was shown to be an independent prognostic factor for survival after second surgery. Conclusion: ADC histogram skewness of the enhancing lesion, enhancing lesion size, third surgery, as well as gross total resection have been shown to be associated with survival following the second surgery. ADC histogram skewness was an independent prognostic factor for survival in the multivariate analysis.
Image correlation and sampling study
NASA Technical Reports Server (NTRS)
Popp, D. J.; Mccormack, D. S.; Sedwick, J. L.
1972-01-01
The development of analytical approaches for solving image correlation and image sampling of multispectral data is discussed. Relevant multispectral image statistics which are applicable to image correlation and sampling are identified. The general image statistics include intensity mean, variance, amplitude histogram, power spectral density function, and autocorrelation function. The translation problem associated with digital image registration and the analytical means for comparing commonly used correlation techniques are considered. General expressions for determining the reconstruction error for specific image sampling strategies are developed.
NASA Astrophysics Data System (ADS)
Zhou, Xiang
Using an innovative portable holographic inspection and testing system (PHITS) developed at the Australian Defence Force Academy, fatigue cracks in riveted lap joints can be detected by visually inspecting the abnormal fringe changes recorded on holographic interferograms. In this thesis, for automatic crack detection, some modern digital image processing techniques are investigated and applied to holographic interferogram evaluation. Fringe analysis algorithms are developed for identification of the crack-induced fringe changes. Theoretical analysis of PHITS and riveted lap joints and two typical experiments demonstrate that the fatigue cracks in lightly-clamped joints induce two characteristic fringe changes: local fringe discontinuities at the cracking sites; and the global crescent fringe distribution near to the edge of the rivet hole. Both of the fringe features are used for crack detection in this thesis. As a basis of the fringe feature extraction, an algorithm for local fringe orientation calculation is proposed. For high orientation accuracy and computational efficiency, Gaussian gradient filtering and neighboring direction averaging are used to minimize the effects of image background variations and random noise. The neighboring direction averaging is also used to approximate the fringe directions in centerlines of bright and dark fringes. Experimental results indicate that for high orientation accuracy the scales of the Gaussian filter and neighboring direction averaging should be chosen according to the local fringe spacings. The orientation histogram technique is applied to detect the local fringe discontinuity due to the fatigue cracks. The Fourier descriptor technique is used to characterize the global fringe distribution change from a circular to a crescent distribution with the fatigue crack growth. Experiments and computer simulations are conducted to analyze the detectability and reliability of crack detection using the two techniques. Results demonstrate that the Fourier descriptor technique is more promising in the detection of the short cracks near the edge of the rivet head. However, it is not as reliable as the fringe orientation technique for detection of the long through cracks. For reliability, both techniques should be used in practical crack detection. Neither the Fourier descriptor technique nor the orientation histogram technique have been previously applied to holographic interferometry. While this work related primarily to interferograms of cracked rivets, the techniques would be readily applied to other areas of fringe pattern analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arai, K; Tohoku University Graduate School of Medicine, Sendal, Miyagi; Kadoya, N
Purpose: The aim of this study was to confirm On-Board Imager cone-beam computed tomography (CBCT) using a histogram-matching algorithm as a useful method for proton dose calculation in head and neck radiotherapy. Methods: We studied one head and neck phantom and ten patients with head and neck cancer treated using intensity-modulated radiation therapy (IMRT) and proton beam therapy. We modified Hounsfield unit (HU) values of CBCT (mCBCT) using a histogram-matching algorithm. In order to evaluate the accuracy of the proton dose calculation, we compared dose differences in dosimetric parameters (Dmean) for clinical target volume (CTV), planning target volume (PTV) and left parotid and proton ranges (PR) between the planning CT (reference) and CBCT or mCBCT, and gamma passing rates of CBCT and mCBCT. To minimize the effect of organ deformation, we also performed image registration. Results: For patients, the average differences in Dmean for CTV, PTV, and left parotid between planning CT and CBCT were 1.63 ± 2.34%, 3.30 ± 1.02%, and 5.42 ± 3.06%, respectively. Similarly, the average differences between planning CT and mCBCT were 0.20 ± 0.19%, 0.58 ± 0.43%, and 3.53 ± 2.40%, respectively. The average differences in PR between planning CT and CBCT or mCBCT of a 50° beam for ten patients were 2.1 ± 2.1 mm and 0.3 ± 0.5 mm, respectively. Similarly, the average differences in PR of a 120° beam were 2.9 ± 2.6 mm and 1.1 ± 0.9 mm, respectively. The average dose and PR differences of mCBCT were smaller than those of CBCT. Additionally, the average gamma passing rates of mCBCT were larger than those of CBCT. Conclusion: We evaluated the accuracy of the proton dose calculation in CBCT and mCBCT with the image registration for ten patients. Our results showed that HU modification using a histogram-matching algorithm could improve the accuracy of the proton dose calculation.
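Histogram matching of CBCT Hounsfield units to a planning CT can be sketched as a CDF-to-CDF intensity mapping, as below; the bin count, HU range, and synthetic volumes are assumptions, and the clinical masking to the patient outline and the registration steps are omitted.

```python
import numpy as np

def histogram_match(cbct, planning_ct, n_bins=2048, hu_range=(-1024, 3071)):
    """Return a modified CBCT (mCBCT) whose HU histogram matches the planning CT."""
    src_counts, edges = np.histogram(cbct, bins=n_bins, range=hu_range)
    ref_counts, _ = np.histogram(planning_ct, bins=n_bins, range=hu_range)
    src_cdf = np.cumsum(src_counts) / src_counts.sum()
    ref_cdf = np.cumsum(ref_counts) / ref_counts.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    # Each CBCT voxel: find its CDF quantile, then the HU with the same quantile on the CT.
    src_quantile = np.interp(cbct.ravel(), centers, src_cdf)
    matched = np.interp(src_quantile, ref_cdf, centers)
    return matched.reshape(cbct.shape)

# Example with synthetic volumes standing in for a registered CBCT / planning CT pair.
cbct = np.random.normal(-80, 220, size=(32, 128, 128))
ct = np.random.normal(0, 260, size=(32, 128, 128))
mcbct = histogram_match(cbct, ct)
```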
Image Enhancement via Subimage Histogram Equalization Based on Mean and Variance
2017-01-01
This paper puts forward a novel image enhancement method via Mean and Variance based Subimage Histogram Equalization (MVSIHE), which effectively increases the contrast of the input image with brightness and details well preserved compared with some other methods based on histogram equalization (HE). Firstly, the histogram of input image is divided into four segments based on the mean and variance of luminance component, and the histogram bins of each segment are modified and equalized, respectively. Secondly, the result is obtained via the concatenation of the processed subhistograms. Lastly, the normalization method is deployed on intensity levels, and the integration of the processed image with the input image is performed. 100 benchmark images from a public image database named CVG-UGR-Database are used for comparison with other state-of-the-art methods. The experiment results show that the algorithm can not only enhance image information effectively but also well preserve brightness and details of the original image. PMID:29403529
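One plausible reading of the segmentation step, splitting the luminance histogram at mean − σ, mean, and mean + σ and equalising each of the four segments within its own range before blending with the input, is sketched below; the boundary choice, the rank-based equalisation, and the blending weight are assumptions rather than the published MVSIHE algorithm.

```python
import numpy as np

def mvsihe_like(gray, blend=0.7):
    """Sketch: equalise four histogram segments bounded by mean-std, mean, and mean+std,
    then blend the result with the input to preserve overall brightness."""
    g = gray.astype(float)
    m, s = g.mean(), g.std()
    bounds = [g.min(), m - s, m, m + s, g.max() + 1e-6]
    out = np.zeros_like(g)
    for lo, hi in zip(bounds[:-1], bounds[1:]):
        seg = (g >= lo) & (g < hi)
        if not seg.any():
            continue
        vals = g[seg]
        # Equalise within the segment: rank-map its values onto [lo, hi).
        ranks = np.argsort(np.argsort(vals)).astype(float)
        out[seg] = lo + (hi - lo) * ranks / max(len(vals) - 1, 1)
    out = blend * out + (1.0 - blend) * g
    return np.clip(out, 0, 255).astype(np.uint8)

enhanced = mvsihe_like(np.random.randint(0, 256, size=(256, 256)).astype(np.uint8))
```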
Value of MR histogram analyses for prediction of microvascular invasion of hepatocellular carcinoma.
Huang, Ya-Qin; Liang, He-Yue; Yang, Zhao-Xia; Ding, Ying; Zeng, Meng-Su; Rao, Sheng-Xiang
2016-06-01
The objective is to explore the value of preoperative magnetic resonance (MR) histogram analyses in predicting microvascular invasion (MVI) of hepatocellular carcinoma (HCC). Fifty-one patients with histologically confirmed HCC who underwent diffusion-weighted and contrast-enhanced MR imaging were included. Histogram analyses were performed, and the mean, variance, skewness, kurtosis, and 1st, 10th, 50th, 90th, and 99th percentiles were derived. Quantitative histogram parameters were compared between HCCs with and without MVI. Receiver operating characteristic (ROC) analyses were generated to compare the diagnostic performance of tumor size, histogram analyses of apparent diffusion coefficient (ADC) maps, and MR enhancement. The mean and the 1st, 10th, and 50th percentiles of the ADC maps, and the mean, variance, and 1st, 10th, 50th, 90th, and 99th percentiles of the portal venous phase (PVP) images, were significantly different between the groups with and without MVI (P < 0.05), with areas under the ROC curves (AUCs) of 0.66 to 0.74 for ADC and 0.76 to 0.88 for PVP. The largest AUC of PVP (1st percentile) showed significantly higher accuracy compared with that of the arterial phase (AP) or tumor size (P < 0.001). MR histogram analyses, in particular the 1st percentile of the PVP images, held promise for prediction of MVI of HCC.
Lu, Shan Shan; Kim, Sang Joon; Kim, Namkug; Kim, Ho Sung; Choi, Choong Gon; Lim, Young Min
2015-04-01
This study intended to investigate the usefulness of histogram analysis of apparent diffusion coefficient (ADC) maps for discriminating primary CNS lymphomas (PCNSLs), especially atypical PCNSLs, from tumefactive demyelinating lesions (TDLs). Forty-seven patients with PCNSLs and 18 with TDLs were enrolled in our study. Hyperintense lesions seen on T2-weighted images were defined as ROIs after ADC maps were registered to the corresponding T2-weighted image. ADC histograms were calculated from the ROIs containing the entire lesion on every section and on a voxel-by-voxel basis. The ADC histogram parameters were compared among all PCNSLs and TDLs as well as between the subgroup of atypical PCNSLs and TDLs. ROC curves were constructed to evaluate the diagnostic performance of the histogram parameters and to determine the optimum thresholds. The differences between the PCNSLs and TDLs were found in the minimum ADC values (ADCmin) and in the 5th and 10th percentiles (ADC5% and ADC10%) of the cumulative ADC histograms. However, no statistical significance was found in the mean ADC value or in the ADC value concerning the mode, kurtosis, and skewness. The ADCmin, ADC5%, and ADC10% were also lower in atypical PCNSLs than in TDLs. ADCmin was the best indicator for discriminating atypical PCNSLs from TDLs, with a threshold of 556 × 10⁻⁶ mm²/s (sensitivity, 81.3%; specificity, 88.9%). Histogram analysis of ADC maps may help to discriminate PCNSLs from TDLs and may be particularly useful in differentiating atypical PCNSLs from TDLs.
Zhang, Yujuan; Chen, Jun; Liu, Song; Shi, Hua; Guan, Wenxian; Ji, Changfeng; Guo, Tingting; Zheng, Huanhuan; Guan, Yue; Ge, Yun; He, Jian; Zhou, Zhengyang; Yang, Xiaofeng; Liu, Tian
2017-02-01
To investigate the efficacy of histogram analysis of the entire tumor volume in apparent diffusion coefficient (ADC) maps for differentiating between histological grades in gastric cancer. Seventy-eight patients with gastric cancer were enrolled in a retrospective 3.0T magnetic resonance imaging (MRI) study. ADC maps were obtained at two different b values (0 and 1000 sec/mm²) for each patient. Tumors were delineated on each slice of the ADC maps, and a histogram for the entire tumor volume was subsequently generated. A series of histogram parameters (eg, skew and kurtosis) were calculated and correlated with the histological grade of the surgical specimen. The diagnostic performance of each parameter for distinguishing poorly from moderately well-differentiated gastric cancers was assessed by using the area under the receiver operating characteristic curve (AUC). There were significant differences in the 5th, 10th, 25th, and 50th percentiles, skew, and kurtosis between poorly and well-differentiated gastric cancers (P < 0.05). There were correlations between the degrees of differentiation and histogram parameters, including the 10th percentile, skew, kurtosis, and max frequency; the correlation coefficients were 0.273, -0.361, -0.339, and -0.370, respectively. Among all the histogram parameters, the max frequency had the largest AUC value, which was 0.675. Histogram analysis of the ADC maps on the basis of the entire tumor volume can be useful in differentiating between histological grades for gastric cancer. J. Magn. Reson. Imaging 2017;45:440-449. © 2016 International Society for Magnetic Resonance in Medicine.
Tiano, L; Chessa, M G; Carrara, S; Tagliafierro, G; Delmonte Corrado, M U
1999-01-01
The chromatin structure dynamics of the Colpoda inflata macronucleus have been investigated in relation to its functional condition, concerning chromatin body extrusion regulating activity. Samples of 2- and 25-day-old resting cysts derived from a standard culture, and of 1-year-old resting cysts derived from a senescent culture, were examined by means of histogram analysis performed on acquired optical microscopy images. Three groups of histograms were detected in each sample. Histogram classification, clustering and matching were assessed in order to obtain the mean histogram of each group. Comparative analysis of the mean histogram showed a similarity in the grey level range of 25-day- and 1-year-old cysts, unlike the wider grey level range found in 2-day-old cysts. Moreover, the respective mean histograms of the three cyst samples appeared rather similar in shape. All this implies that macronuclear chromatin structural features of 1-year-old cysts are common to both cyst standard cultures. The evaluation of the acquired images and their respective histograms evidenced a dynamic state of the macronuclear chromatin, appearing differently condensed in relation to the chromatin body extrusion regulating activity of the macronucleus. The coexistence of a chromatin-decondensed macronucleus with a pycnotic extrusion body suggests that chromatin unable to decondense, thus inactive, is extruded. This finding, along with the presence of chromatin structural features common to standard and senescent cyst populations, supports the occurrence of 'rejuvenated' cell lines from 1-year-old encysted senescent cells, a phenomenon which could be a result of accomplished macronuclear renewal.
Tsuchiya, Naoko; Doai, Mariko; Usuda, Katsuo; Uramoto, Hidetaka
2017-01-01
Purpose: Investigating the diagnostic accuracy of histogram analyses of apparent diffusion coefficient (ADC) values for determining non-small cell lung cancer (NSCLC) tumor grades, lymphovascular invasion, and pleural invasion. Materials and methods: We studied 60 surgically diagnosed NSCLC patients. Diffusion-weighted imaging (DWI) was performed in the axial plane using a navigator-triggered single-shot, echo-planar imaging sequence with prospective acquisition correction. The ADC maps were generated, and we placed a volume-of-interest on the tumor to construct the whole-lesion histogram. Using the histogram, we calculated the mean, 5th, 10th, 25th, 50th, 75th, 90th, and 95th percentiles of ADC, skewness, and kurtosis. Histogram parameters were correlated with tumor grade, lymphovascular invasion, and pleural invasion. We performed a receiver operating characteristics (ROC) analysis to assess the diagnostic performance of histogram parameters for distinguishing different pathologic features. Results: The ADC mean, 10th, 25th, 50th, 75th, 90th, and 95th percentiles showed significant differences among the tumor grades. The ADC mean, 25th, 50th, 75th, 90th, and 95th percentiles were significant histogram parameters between high- and low-grade tumors. The ROC analysis between high- and low-grade tumors showed that the 95th percentile ADC achieved the highest area under curve (AUC) at 0.74. Lymphovascular invasion was associated with the ADC mean, 50th, 75th, 90th, and 95th percentiles, skewness, and kurtosis. Kurtosis achieved the highest AUC at 0.809. Pleural invasion was only associated with skewness, with the AUC of 0.648. Conclusions: ADC histogram analyses on the basis of the entire tumor volume are able to stratify NSCLCs' tumor grade, lymphovascular invasion and pleural invasion. PMID:28207858
Wu, Ningling; Christenbury, Joseph G; Dishler, Jon G; Bozkurt, Tahir Kansu; Duel, Daniel; Zhang, Lijun; Hamilton, D Rex
2017-09-01
To identify risk factors for opaque bubble layer (OBL) formation and compare the incidence of OBL using a cone modification technique versus the original technique for LASIK flap creation using the VisuMax laser (Carl Zeiss Meditec, Jena, Germany). This retrospective study examined videos of flap creation using the VisuMax laser to identify OBL occurrence. Eyes were divided into three groups: eyes where OBL occurred using the original technique (OBL group), eyes where OBL did not occur using the original technique (no OBL group), and eyes in which the cone modification technique was used for LASIK flap creation (larger flap diameter) (cone modification technique group). Preoperative measurements including simulated keratometry (flat and steep) values, white-to-white distance (WTW), pachymetry, patient age and gender, amount of correction, flap parameters, energy setting, corneal hysteresis, and corneal resistance factor were analyzed to identify parameters with statistical difference between the OBL and no OBL groups. Incidence of OBL was compared between the original and cone modification techniques. OBL incidence was significantly lower with the cone modification technique (7.6%; 7 of 92 eyes) than with the original technique (28.8%; 34 of 118 eyes) (Fisher's exact test, P = .0009). Factors identified with a significant difference between eyes with and without OBL using the original technique were: corneal thickness (OBL: 561.2 μm, no OBL: 549.6 μm, P = .0132), WTW diameter (OBL: 11.6 mm, no OBL: 11.9 mm, P = .0048), corneal resistance factor (OBL: 10.4 mm Hg, no OBL: 9.6 mm Hg, P = .0329), and corneal astigmatism (OBL: 0.80 diopter, no OBL: 1.00 diopter, P = .0472). CONCLUSIONS: Less astigmatic, thicker, denser, and smaller corneas increased the risk of OBL using the original technique for flap creation. The cone modification technique was associated with lower risk of OBL formation, even in eyes with significant risk factors for OBL using the original technique. [J Refract Surg. 2017;33(9):584-590.]. Copyright 2017, SLACK Incorporated.
A psychophysical comparison of two methods for adaptive histogram equalization.
Zimmerman, J B; Cousins, S B; Hartzell, K M; Frisse, M E; Kahn, M G
1989-05-01
Adaptive histogram equalization (AHE) is a method for adaptive contrast enhancement of digital images. It is an automatic, reproducible method for the simultaneous viewing of contrast within a digital image with a large dynamic range. Recent experiments have shown that in specific cases, there is no significant difference in the ability of AHE and linear intensity windowing to display gray-scale contrast. More recently, a variant of AHE which limits the allowed contrast enhancement of the image has been proposed. This contrast-limited adaptive histogram equalization (CLAHE) produces images in which the noise content of an image is not excessively enhanced, but in which sufficient contrast is provided for the visualization of structures within the image. Images processed with CLAHE have a more natural appearance and facilitate the comparison of different areas of an image. However, the reduced contrast enhancement of CLAHE may hinder the ability of an observer to detect the presence of some significant gray-scale contrast. In this report, a psychophysical observer experiment was performed to determine if there is a significant difference in the ability of AHE and CLAHE to depict gray-scale contrast. Observers were presented with computed tomography (CT) images of the chest processed with AHE and CLAHE. Subtle artificial lesions were introduced into some images. The observers were asked to rate their confidence regarding the presence of the lesions; this rating-scale data was analyzed using receiver operating characteristic (ROC) curve techniques. These ROC curves were compared for significant differences in the observers' performances. In this report, no difference was found in the abilities of AHE and CLAHE to depict contrast information.
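For readers who want to reproduce the AHE/CLAHE contrast difference qualitatively, scikit-image's equalize_adapthist can serve as a stand-in, as in the hedged snippet below; the clip_limit values and the synthetic low-contrast image are assumptions, and clip_limit=1.0 only approximates unconstrained AHE.

```python
import numpy as np
from skimage import exposure

# Synthetic low-contrast "CT slice" with values in [0, 1].
img = np.clip(np.random.normal(0.45, 0.05, size=(512, 512)), 0.0, 1.0)

ahe_like = exposure.equalize_adapthist(img, clip_limit=1.0)   # effectively unlimited AHE
clahe = exposure.equalize_adapthist(img, clip_limit=0.01)     # contrast-limited AHE (CLAHE)
```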
Performance analysis of a dual-tree algorithm for computing spatial distance histograms
Chen, Shaoping; Tu, Yi-Cheng; Xia, Yuni
2011-01-01
Many scientific and engineering fields produce large volumes of spatiotemporal data. The storage, retrieval, and analysis of such data impose great challenges on database systems design. Analysis of scientific spatiotemporal data often involves computing functions of all point-to-point interactions. One such analytic, the Spatial Distance Histogram (SDH), is of vital importance to scientific discovery. Recently, algorithms for efficient SDH processing in large-scale scientific databases have been proposed. These algorithms adopt a recursive tree-traversing strategy to process point-to-point distances in the visited tree nodes in batches, and thus require less time when compared to the brute-force approach where all pairwise distances have to be computed. Despite the promising experimental results, the complexity of such algorithms has not been thoroughly studied. In this paper, we present an analysis of such algorithms based on a geometric modeling approach. The main technique is to transform the analysis of point counts into a problem of quantifying the area of regions where pairwise distances can be processed in batches by the algorithm. From the analysis, we conclude that the number of pairwise distances that are left to be processed decreases exponentially with more levels of the tree visited. This leads to the proof of a time complexity lower than the quadratic time needed for a brute-force algorithm and builds the foundation for a constant-time approximate algorithm. Our model is also general in that it works for a wide range of point spatial distributions, histogram types, and space-partitioning options in building the tree. PMID:21804753
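For orientation, the quantity being computed can be written as a brute-force O(N^2) baseline in a few lines; the point coordinates and bucket width below are synthetic, and the dual-tree algorithm itself is not reproduced here.

    # Sketch: brute-force spatial distance histogram (SDH), the quadratic baseline
    # that the tree-based algorithms are designed to beat.
    import numpy as np

    rng = np.random.default_rng(0)
    points = rng.uniform(0.0, 100.0, size=(2000, 3))  # synthetic particle positions
    bucket_width = 5.0

    # All pairwise distances for i < j: O(N^2) work.
    i, j = np.triu_indices(len(points), k=1)
    distances = np.linalg.norm(points[i] - points[j], axis=1)

    edges = np.arange(0.0, distances.max() + bucket_width, bucket_width)
    sdh, _ = np.histogram(distances, bins=edges)
    print("bucket counts:", sdh)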
Kim, Ilsoo; Allen, Toby W
2012-04-28
Free energy perturbation, a method for computing the free energy difference between two states, is often combined with non-Boltzmann biased sampling techniques in order to accelerate the convergence of free energy calculations. Here we present a new extension of the Bennett acceptance ratio (BAR) method by combining it with umbrella sampling (US) along a reaction coordinate in configurational space. In this approach, which we call Bennett acceptance ratio with umbrella sampling (BAR-US), the conditional histogram of energy difference (a mapping of the 3N-dimensional configurational space via a reaction coordinate onto 1D energy difference space) is weighted for marginalization with the associated population density along a reaction coordinate computed by US. This procedure produces marginal histograms of energy difference, from forward and backward simulations, with higher overlap in energy difference space, rendering free energy difference estimations using BAR statistically more reliable. In addition to BAR-US, two histogram analysis methods, termed Bennett overlapping histograms with US (BOH-US) and Bennett-Hummer (linear) least square with US (BHLS-US), are employed as consistency and convergence checks for free energy difference estimation by BAR-US. The proposed methods (BAR-US, BOH-US, and BHLS-US) are applied to a 1-dimensional asymmetric model potential, as has been used previously to test free energy calculations from non-equilibrium processes. We then consider the more stringent test of a 1-dimensional strongly (but linearly) shifted harmonic oscillator, which exhibits no overlap between two states when sampled using unbiased Brownian dynamics. We find that the efficiency of the proposed methods is enhanced over the original Bennett's methods (BAR, BOH, and BHLS) through fast uniform sampling of energy difference space via US in configurational space. We apply the proposed methods to the calculation of the electrostatic contribution to the absolute solvation free energy (excess chemical potential) of water. We then address the controversial issue of ion selectivity in the K(+) ion channel, KcsA. We have calculated the relative binding affinity of K(+) over Na(+) within a binding site of the KcsA channel for which different, though adjacent, K(+) and Na(+) configurations exist, ideally suited to these US-enhanced methods. Our studies demonstrate that the significant improvements in free energy calculations obtained using the proposed methods can have serious consequences for elucidating biological mechanisms and for the interpretation of experimental data.
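The overlapping-histogram relation that underlies the BOH/BHLS consistency checks can be illustrated on a one-dimensional toy system with an analytically known free energy difference; the sketch below uses two harmonic states with different force constants and omits the umbrella-sampling weighting that the paper adds, so all parameters are illustrative.

    # Sketch: overlapping-histogram estimate of dF for two 1-D harmonic states,
    # U0 = 0.5*k0*x^2 and U1 = 0.5*k1*x^2 at beta = 1, where dF = 0.5*ln(k1/k0).
    import numpy as np

    rng = np.random.default_rng(1)
    beta, k0, k1, n = 1.0, 1.0, 4.0, 200_000
    x0 = rng.normal(0.0, 1.0 / np.sqrt(beta * k0), n)  # samples from state 0
    x1 = rng.normal(0.0, 1.0 / np.sqrt(beta * k1), n)  # samples from state 1
    dU = lambda x: 0.5 * (k1 - k0) * x ** 2            # energy difference U1 - U0

    edges = np.linspace(0.0, 8.0, 81)
    centers = 0.5 * (edges[1:] + edges[:-1])
    width = edges[1] - edges[0]
    cF, _ = np.histogram(dU(x0), bins=edges)           # forward histogram
    cR, _ = np.histogram(dU(x1), bins=edges)           # reverse histogram
    pF, pR = cF / (n * width), cR / (n * width)

    # ln pF(dU) - ln pR(dU) = beta*(dU - dF): fit a line over well-sampled bins.
    ok = (cF >= 20) & (cR >= 20)
    slope, intercept = np.polyfit(centers[ok], np.log(pF[ok] / pR[ok]), 1)
    print("fitted slope (expect ~beta):", round(slope, 3))
    print("estimated dF:", round(-intercept / beta, 3), "exact:", round(0.5 * np.log(k1 / k0), 3))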
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burke, TImothy P.; Kiedrowski, Brian C.; Martin, William R.
Kernel Density Estimators (KDEs) are a non-parametric density estimation technique that has recently been applied to Monte Carlo radiation transport simulations. Kernel density estimators are an alternative to histogram tallies for obtaining global solutions in Monte Carlo tallies. With KDEs, a single event, either a collision or particle track, can contribute to the score at multiple tally points with the uncertainty at those points being independent of the desired resolution of the solution. Thus, KDEs show potential for obtaining estimates of a global solution with reduced variance when compared to a histogram. Previously, KDEs have been applied to neutronics for one-group reactor physics problems and fixed source shielding applications. However, little work was done to obtain reaction rates using KDEs. This paper introduces a new form of the MFP KDE that is capable of handling general geometries. Furthermore, extending the MFP KDE to 2-D problems in continuous energy introduces inaccuracies to the solution. An ad-hoc solution to these inaccuracies is introduced that produces errors smaller than 4% at material interfaces.
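The histogram-versus-KDE contrast can be seen with a generic one-dimensional example built from the same simulated events; this uses SciPy's Gaussian KDE and does not reproduce the mean-free-path (MFP) KDE introduced above.

    # Sketch: histogram tally vs. kernel density estimate from the same events.
    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(2)
    events = rng.normal(loc=5.0, scale=1.5, size=500)  # e.g. collision sites in 1-D

    # Histogram tally: resolution and variance are tied to the chosen bin width.
    counts, edges = np.histogram(events, bins=20, density=True)

    # KDE tally: every event contributes a smooth kernel, so the estimate can be
    # evaluated at arbitrary points without committing to a bin grid.
    kde = gaussian_kde(events)
    print("KDE estimate at x=5:      ", kde(np.array([5.0]))[0])
    print("histogram bin containing 5:", counts[np.searchsorted(edges, 5.0) - 1])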
Directional Histogram Ratio at Random Probes: A Local Thresholding Criterion for Capillary Images
Lu, Na; Silva, Jharon; Gu, Yu; Gerber, Scott; Wu, Hulin; Gelbard, Harris; Dewhurst, Stephen; Miao, Hongyu
2013-01-01
With the development of micron-scale imaging techniques, capillaries can be conveniently visualized using methods such as two-photon and whole mount microscopy. However, the presence of background staining, leaky vessels and the diffusion of small fluorescent molecules can lead to significant complexity in image analysis and loss of information necessary to accurately quantify vascular metrics. One solution to this problem is the development of accurate thresholding algorithms that reliably distinguish blood vessels from surrounding tissue. Although various thresholding algorithms have been proposed, our results suggest that without appropriate pre- or post-processing, the existing approaches may fail to obtain satisfactory results for capillary images that include areas of contamination. In this study, we propose a novel local thresholding algorithm, called directional histogram ratio at random probes (DHR-RP). This method explicitly considers the geometric features of tube-like objects in conducting image binarization, and has a reliable performance in distinguishing small vessels from either clean or contaminated background. Experimental and simulation studies suggest that our DHR-RP algorithm is superior over existing thresholding methods. PMID:23525856
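For context, a generic local-thresholding baseline of the kind such methods are usually compared against can be written with scikit-image; this is not the DHR-RP algorithm, and the window size and offset below are arbitrary assumptions.

    # Sketch: generic local thresholding vs. a global Otsu threshold (not DHR-RP).
    import numpy as np
    from skimage import data, filters

    img = data.camera().astype(float)  # stand-in for a capillary image

    # A threshold is computed in a window around every pixel, so slowly varying
    # background staining does not swamp thin, bright structures.
    local_thresh = filters.threshold_local(img, block_size=51, method="gaussian", offset=0)
    binary_local = img > local_thresh

    binary_otsu = img > filters.threshold_otsu(img)  # global threshold for comparison
    print("foreground fraction, local:", binary_local.mean(), "Otsu:", binary_otsu.mean())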
Multifractal diffusion entropy analysis: Optimal bin width of probability histograms
NASA Astrophysics Data System (ADS)
Jizba, Petr; Korbel, Jan
2014-11-01
In the framework of Multifractal Diffusion Entropy Analysis we propose a method for choosing an optimal bin-width in histograms generated from underlying probability distributions of interest. The method presented uses techniques of Rényi’s entropy and the mean squared error analysis to discuss the conditions under which the error in the multifractal spectrum estimation is minimal. We illustrate the utility of our approach by focusing on a scaling behavior of financial time series. In particular, we analyze the S&P500 stock index as sampled at a daily rate in the time period 1950-2013. In order to demonstrate a strength of the method proposed we compare the multifractal δ-spectrum for various bin-widths and show the robustness of the method, especially for large values of q. For such values, other methods in use, e.g., those based on moment estimation, tend to fail for heavy-tailed data or data with long correlations. Connection between the δ-spectrum and Rényi’s q parameter is also discussed and elucidated on a simple example of multiscale time series.
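For comparison with an entropy-based criterion, a common rule-of-thumb bin width (Freedman-Diaconis) is available directly in NumPy; the heavy-tailed sample below is synthetic, and this is not the criterion proposed in the paper.

    # Sketch: Freedman-Diaconis bin width, h = 2*IQR*n**(-1/3), via NumPy.
    import numpy as np

    rng = np.random.default_rng(3)
    returns = rng.standard_t(df=3, size=5000)  # heavy-tailed toy "log-returns"

    counts, edges = np.histogram(returns, bins="fd")
    print("number of bins:", len(counts), "bin width:", edges[1] - edges[0])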
A Study of Hand Back Skin Texture Patterns for Personal Identification and Gender Classification
Xie, Jin; Zhang, Lei; You, Jane; Zhang, David; Qu, Xiaofeng
2012-01-01
Human hand back skin texture (HBST) is often consistent for a person and distinctive from person to person. In this paper, we study the HBST pattern recognition problem with applications to personal identification and gender classification. A specially designed system is developed to capture HBST images, and an HBST image database was established, which consists of 1,920 images from 80 persons (160 hands). An efficient texton learning based method is then presented to classify the HBST patterns. First, textons are learned in the space of filter bank responses from a set of training images using the l1 -minimization based sparse representation (SR) technique. Then, under the SR framework, we represent the feature vector at each pixel over the learned dictionary to construct a representation coefficient histogram. Finally, the coefficient histogram is used as skin texture feature for classification. Experiments on personal identification and gender classification are performed by using the established HBST database. The results show that HBST can be used to assist human identification and gender classification. PMID:23012512
On the equivalence of the RTI and SVM approaches to time correlated analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Croft, S.; Favalli, A.; Henzlova, D.
2014-11-21
Recently two papers on how to perform passive neutron auto-correlation analysis on time gated histograms formed from pulse train data, generically called time correlation analysis (TCA), have appeared in this journal [1,2]. For those of us working in international nuclear safeguards these treatments are of particular interest because passive neutron multiplicity counting is a widely deployed technique for the quantification of plutonium. The purpose of this letter is to show that the skewness-variance-mean (SVM) approach developed in [1] is equivalent in terms of assay capability to the random trigger interval (RTI) analysis laid out in [2]. Mathematically we could also use other numerical ways to extract the time correlated information from the histogram data including for example what we might call the mean, mean square, and mean cube approach. The important feature however, from the perspective of real world applications, is that the correlated information extracted is the same, and subsequently gets interpreted in the same way based on the same underlying physics model.
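To make the quantities concrete, the statistics used by an SVM-type analysis are just the low-order moments of the time-gated multiplicity histogram; a minimal sketch with invented gate counts follows (the numbers carry no physical meaning).

    # Sketch: mean, variance, and third central moment of a multiplicity histogram,
    # where hist[n] is the number of gates containing exactly n neutron pulses.
    import numpy as np

    multiplicities = np.arange(8)
    hist = np.array([5120, 2310, 950, 340, 110, 35, 9, 2], dtype=float)  # toy counts

    p = hist / hist.sum()                             # probability per multiplicity
    mean = np.sum(multiplicities * p)
    var = np.sum((multiplicities - mean) ** 2 * p)
    third = np.sum((multiplicities - mean) ** 3 * p)  # unstandardised skewness term
    print("mean:", mean, "variance:", var, "third central moment:", third)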
NASA Astrophysics Data System (ADS)
Dang, Van H.; Wohlgemuth, Sven; Yoshiura, Hiroshi; Nguyen, Thuc D.; Echizen, Isao
The wireless sensor network (WSN) has been one of the key technologies for the future, with broad applications from the military to everyday life [1,2,3,4,5]. There are two kinds of WSN models: models with sensors for sensing data and a sink for receiving and processing queries from users; and models with special additional nodes capable of storing large amounts of data from sensors and processing queries from the sink. Among the latter type, a two-tiered model [6,7] has been widely adopted because of its storage and energy saving benefits for weak sensors, as proved by the advent of commercial storage node products such as Stargate [8] and RISE. However, by concentrating storage in certain nodes, this model becomes more vulnerable to attack. Our novel technique, called zip-histogram, contributes to solving the problems of previous studies [6,7] by protecting the stored data's confidentiality and integrity (including data from the sensor and queries from the sink) against attackers who might target storage nodes in two-tiered WSNs.
Single Photon Counting Performance and Noise Analysis of CMOS SPAD-Based Image Sensors
Dutton, Neale A. W.; Gyongy, Istvan; Parmesan, Luca; Henderson, Robert K.
2016-01-01
SPAD-based solid state CMOS image sensors utilising analogue integrators have attained deep sub-electron read noise (DSERN) permitting single photon counting (SPC) imaging. A new method is proposed to determine the read noise in DSERN image sensors by evaluating the peak separation and width (PSW) of single photon peaks in a photon counting histogram (PCH). The technique is used to identify and analyse cumulative noise in analogue integrating SPC SPAD-based pixels. The DSERN of our SPAD image sensor is exploited to confirm recent multi-photon threshold quanta image sensor (QIS) theory. Finally, various single and multiple photon spatio-temporal oversampling techniques are reviewed. PMID:27447643
Component Pin Recognition Using Algorithms Based on Machine Learning
NASA Astrophysics Data System (ADS)
Xiao, Yang; Hu, Hong; Liu, Ze; Xu, Jiangchang
2018-04-01
The purpose of machine vision for a plug-in machine is to improve the machine's stability and accuracy, and recognition of the component pin is an important part of the vision system. This paper focuses on component pin recognition using three different techniques. The first technique involves traditional image processing using the core algorithm for binary large object (BLOB) analysis. The second technique uses the histogram of oriented gradients (HOG) to experimentally compare the effect of the support vector machine (SVM) and the adaptive boosting machine (AdaBoost) learning meta-algorithm classifiers. The third technique is the use of a deep learning method known as the convolutional neural network (CNN), which identifies the pin by comparing a sample against its training data. The main purpose of the research presented in this paper is to increase the knowledge of learning methods used in the plug-in machine industry in order to achieve better results.
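A minimal version of the second technique, HOG features with an SVM classifier, can be sketched on synthetic pin/no-pin patches with scikit-image and scikit-learn; the patch generator and HOG parameters below are illustrative assumptions, and the AdaBoost and CNN variants are not shown.

    # Sketch: HOG features + linear SVM on synthetic pin/no-pin patches.
    import numpy as np
    from skimage.feature import hog
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(4)

    def make_patch(with_pin):
        patch = rng.normal(0.2, 0.05, (32, 32))   # dull board background
        if with_pin:
            patch[:, 14:18] += 0.6                # a bright vertical "pin"
        return patch

    labels = np.array([0, 1] * 30)
    X = np.array([hog(make_patch(bool(y)), orientations=9,
                      pixels_per_cell=(8, 8), cells_per_block=(2, 2)) for y in labels])

    acc = cross_val_score(SVC(kernel="linear"), X, labels, cv=5).mean()
    print("5-fold CV accuracy on synthetic patches:", round(acc, 3))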
Vidić, Igor; Egnell, Liv; Jerome, Neil P; Teruel, Jose R; Sjøbakk, Torill E; Østlie, Agnes; Fjøsne, Hans E; Bathen, Tone F; Goa, Pål Erik
2018-05-01
Diffusion-weighted MRI (DWI) is currently one of the fastest developing MRI-based techniques in oncology. Histogram properties from model fitting of DWI are useful features for differentiation of lesions, and classification can potentially be improved by machine learning. To evaluate classification of malignant and benign tumors and breast cancer subtypes using support vector machine (SVM). Prospective. Fifty-one patients with benign (n = 23) and malignant (n = 28) breast tumors (26 ER+, of which six were HER2+). Patients were imaged with DW-MRI (3T) using twice refocused spin-echo echo-planar imaging with repetition time/echo time (TR/TE) = 9000/86 msec, 90 × 90 matrix size, 2 × 2 mm in-plane resolution, 2.5 mm slice thickness, and 13 b-values. Apparent diffusion coefficient (ADC), relative enhanced diffusivity (RED), and the intravoxel incoherent motion (IVIM) parameters diffusivity (D), pseudo-diffusivity (D*), and perfusion fraction (f) were calculated. The histogram properties (median, mean, standard deviation, skewness, kurtosis) were used as features in SVM (10-fold cross-validation) for differentiation of lesions and subtyping. Accuracies of the SVM classifications were calculated to find the combination of features with highest prediction accuracy. Mann-Whitney tests were performed for univariate comparisons. For benign versus malignant tumors, univariate analysis found 11 histogram properties to be significant differentiators. Using SVM, the highest accuracy (0.96) was achieved from a single feature (mean of RED), or from three feature combinations of IVIM or ADC. Combining features from all models gave perfect classification. No single feature predicted HER2 status of ER+ tumors (univariate or SVM), although high accuracy (0.90) was achieved with SVM combining several features. Importantly, these features had to include higher-order statistics (kurtosis and skewness), indicating the importance of accounting for heterogeneity. Our findings suggest that SVM, using features from a combination of diffusion models, improves prediction accuracy for differentiation of benign versus malignant breast tumors, and may further assist in subtyping of breast cancer. Level of Evidence: 3. Technical Efficacy: Stage 3. J. Magn. Reson. Imaging 2018;47:1205-1216. © 2017 International Society for Magnetic Resonance in Medicine.
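The feature-extraction and classification pipeline described above reduces to a few reusable steps; the sketch below runs it on synthetic "lesions" with scikit-learn, and the diffusivity values, lesion counts, and kernel choice are assumptions rather than the study's data or settings.

    # Sketch: per-lesion histogram properties -> SVM with 10-fold cross-validation.
    import numpy as np
    from scipy import stats
    from sklearn.svm import SVC
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(5)

    def histogram_features(voxels):
        # Median, mean, SD, skewness, kurtosis of one lesion's parameter map.
        return [np.median(voxels), voxels.mean(), voxels.std(),
                stats.skew(voxels), stats.kurtosis(voxels)]

    X, y = [], []
    for label, diffusivity in [(0, 1.6e-3), (1, 1.1e-3)]:   # 0 = benign, 1 = malignant
        for _ in range(25):
            voxels = rng.normal(diffusivity, 2.5e-4, size=400)
            X.append(histogram_features(voxels))
            y.append(label)

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    acc = cross_val_score(clf, np.array(X), np.array(y), cv=10).mean()
    print("10-fold CV accuracy on synthetic lesions:", round(acc, 3))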
Ocampo-Duque, William; Osorio, Carolina; Piamba, Christian; Schuhmacher, Marta; Domingo, José L
2013-02-01
The integration of water quality monitoring variables is essential in environmental decision making. Nowadays, advanced techniques to manage subjectivity, imprecision, uncertainty, vagueness, and variability are required in such a complex evaluation process. We here propose a probabilistic fuzzy hybrid model to assess river water quality. Fuzzy logic reasoning has been used to compute a water quality integrative index. By applying a Monte Carlo technique, based on non-parametric probability distributions, the randomness of model inputs was estimated. Annual histograms of nine water quality variables were built with monitoring data systematically collected in the Colombian Cauca River, and probability density estimations using the kernel smoothing method were applied to fit data. Several years were assessed, and river sectors upstream and downstream of the city of Santiago de Cali, a big city with basic wastewater treatment and high industrial activity, were analyzed. The probabilistic fuzzy water quality index was able to explain the reduction in water quality, as the river receives a larger number of agriculture, domestic, and industrial effluents. The results of the hybrid model were compared to traditional water quality indexes. The main advantage of the proposed method is that it considers flexible boundaries between the linguistic qualifiers used to define the water status, with the membership of water quality in the diverse output fuzzy sets or classes reported as percentiles and histograms, which allows a better classification of the real water condition. The results of this study show that fuzzy inference systems integrated with stochastic non-parametric techniques may be used as complementary tools in water quality indexing methodologies. Copyright © 2012 Elsevier Ltd. All rights reserved.
Wu, Chen-Jiang; Wang, Qing; Li, Hai; Wang, Xiao-Ning; Liu, Xi-Sheng; Shi, Hai-Bin; Zhang, Yu-Dong
2015-10-01
To investigate the diagnostic efficiency of DWI using entire-tumor histogram analysis in differentiating low-grade (LG) prostate cancer (PCa) from intermediate-high-grade (HG) PCa in comparison with conventional ROI-based measurement. DW images (b of 0-1400 s/mm(2)) from 126 pathology-confirmed PCa (diameter >0.5 cm) in 110 patients were retrospectively collected and processed with a mono-exponential model. The measurement of tumor apparent diffusion coefficients (ADCs) was performed using the histogram-based and ROI-based approaches, respectively. The diagnostic ability of ADCs from the two methods for differentiating LG-PCa (Gleason score, GS ≤ 6) from HG-PCa (GS > 6) was determined by ROC regression, and compared by McNemar's test. There were 49 LG tumors and 77 HG tumors at pathologic examination. Histogram-based ADCs (mean, median, 10th and 90th) and ROI-based ADCs (mean) showed dominant relationships with ordinal GS of PCa (ρ = -0.225 to -0.406, p < 0.05). All above imaging indices reflected a significant difference between LG-PCa and HG-PCa (all p values <0.01). Histogram 10th percentile ADCs had higher Az (0.738), Youden index (0.415), and positive likelihood ratio (LR+, 2.45) for stratifying tumor GS than the mean, median and 90th percentile ADCs and the ROI-based ADCs. Histogram mean, median, and 10th percentile ADCs showed higher specificity (65.3%-74.1% vs. 44.9%, p < 0.01), but lower sensitivity (57.1%-71.3% vs. 84.4%, p < 0.05) than ROI-based ADCs in differentiating LG-PCa from HG-PCa. DWI-associated histogram analysis had higher specificity, Az, Youden index, and LR+ for differentiation of PCa Gleason grade than the ROI-based approach.
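Whole-lesion histogram metrics of the kind compared above are typically extracted by pooling all ADC voxels inside a tumour mask; a minimal sketch with synthetic arrays follows (in practice the mask comes from the histological-radiological correlation, not from hard-coded indices).

    # Sketch: entire-tumour ADC histogram metrics from an ADC map and a mask.
    import numpy as np

    rng = np.random.default_rng(6)
    adc_map = rng.normal(1.2e-3, 3e-4, size=(24, 128, 128))  # mm^2/s, toy volume
    mask = np.zeros_like(adc_map, dtype=bool)
    mask[10:14, 60:80, 60:80] = True                         # toy tumour VOI

    voxels = adc_map[mask]
    metrics = {
        "mean": voxels.mean(),
        "median": np.median(voxels),
        "p10": np.percentile(voxels, 10),   # low percentiles track the densest tumour part
        "p90": np.percentile(voxels, 90),
    }
    print(metrics)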
Choi, M H; Oh, S N; Park, G E; Yeo, D-M; Jung, S E
2018-05-10
To evaluate the interobserver and intermethod correlations of histogram metrics of dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) parameters acquired by multiple readers using the single-section and whole-tumor volume methods. Four DCE parameters (K trans , K ep , V e , V p ) were evaluated in 45 patients (31 men and 14 women; mean age, 61±11 years [range, 29-83 years]) with locally advanced rectal cancer using pre-chemoradiotherapy (CRT) MRI. Ten histogram metrics were extracted using two methods of lesion selection performed by three radiologists: the whole-tumor volume method for the whole tumor on axial section-by-section images and the single-section method for the entire area of the tumor on one axial image. The interobserver and intermethod correlations were evaluated using the intraclass correlation coefficients (ICCs). The ICCs showed excellent interobserver and intermethod correlations in most of histogram metrics of the DCE parameters. The ICCs among the three readers were > 0.7 (P<0.001) for all histogram metrics, except for the minimum and maximum. The intermethod correlations for most of the histogram metrics were excellent for each radiologist, regardless of the differences in the radiologists' experience. The interobserver and intermethod correlations for most of the histogram metrics of the DCE parameters are excellent in rectal cancer. Therefore, the single-section method may be a potential alternative to the whole-tumor volume method using pre-CRT MRI, despite the fact that the high agreement between the two methods cannot be extrapolated to post-CRT MRI. Copyright © 2018 Société française de radiologie. Published by Elsevier Masson SAS. All rights reserved.
van Heeswijk, Miriam M; Lambregts, Doenja M J; Maas, Monique; Lahaye, Max J; Ayas, Z; Slenter, Jos M G M; Beets, Geerard L; Bakers, Frans C H; Beets-Tan, Regina G H
2017-06-01
The apparent diffusion coefficient (ADC) is a potential prognostic imaging marker in rectal cancer. Typically, mean ADC values are used, derived from precise manual whole-volume tumor delineations by experts. The aim was first to explore whether non-precise circular delineation combined with histogram analysis can be a less cumbersome alternative to acquire similar ADC measurements and second to explore whether histogram analyses provide additional prognostic information. Thirty-seven patients who underwent a primary staging MRI including diffusion-weighted imaging (DWI; b0, 25, 50, 100, 500, 1000; 1.5 T) were included. Volumes-of-interest (VOIs) were drawn on b1000-DWI: (a) precise delineation, manually tracing tumor boundaries (2 expert readers), and (b) non-precise delineation, drawing circular VOIs with a wide margin around the tumor (2 non-experts). Mean ADC and histogram metrics (mean, min, max, median, SD, skewness, kurtosis, 5th-95th percentiles) were derived from the VOIs and delineation time was recorded. Measurements were compared between the two methods and correlated with prognostic outcome parameters. Median delineation time reduced from 47-165 s (precise) to 21-43 s (non-precise). The 45th percentile of the non-precise delineation showed the best correlation with the mean ADC from the precise delineation as the reference standard (ICC 0.71-0.75). None of the mean ADC or histogram parameters showed significant prognostic value; only the total tumor volume (VOI) was significantly larger in patients with positive clinical N stage and mesorectal fascia involvement. When performing non-precise tumor delineation, histogram analysis (in specific 45th ADC percentile) may be used as an alternative to obtain similar ADC values as with precise whole tumor delineation. Histogram analyses are not beneficial to obtain additional prognostic information.
Zhang, Yu-Dong; Wang, Qing; Wu, Chen-Jiang; Wang, Xiao-Ning; Zhang, Jing; Liu, Hui; Liu, Xi-Sheng; Shi, Hai-Bin
2015-04-01
To evaluate histogram analysis of intravoxel incoherent motion (IVIM) for discriminating the Gleason grade of prostate cancer (PCa). A total of 48 patients pathologically confirmed as having clinically significant PCa (size > 0.5 cm) underwent preoperative DW-MRI (b of 0-900 s/mm(2)). Data was post-processed by monoexponential and IVIM model for quantitation of apparent diffusion coefficients (ADCs), perfusion fraction f, diffusivity D and pseudo-diffusivity D*. Histogram analysis was performed by outlining entire-tumour regions of interest (ROIs) from histological-radiological correlation. The ability of imaging indices to differentiate low-grade (LG, Gleason score (GS) ≤6) from intermediate/high-grade (HG, GS > 6) PCa was analysed by ROC regression. Eleven patients had LG tumours (18 foci) and 37 patients had HG tumours (42 foci) on pathology examination. HG tumours had significantly lower ADCs and D in terms of mean, median, 10th and 75th percentiles, combined with higher histogram kurtosis and skewness for ADCs, D and f, than LG PCa (p < 0.05). Histogram D showed relatively higher correlations (ρ = 0.641-0.668 vs. ADCs: 0.544-0.574) with ordinal GS of PCa; and its mean, median and 10th percentile performed better than ADCs did in distinguishing LG from HG PCa. It is feasible to stratify the pathological grade of PCa by IVIM with histogram metrics. D performed better in distinguishing LG from HG tumour than conventional ADCs. • GS had relatively higher correlation with tumour D than ADCs. • Difference of histogram D among two-grade tumours was statistically significant. • D yielded better individual features in demonstrating tumour grade than ADC. • D* and f failed to determine tumour grade of PCa.
Li, Anqin; Xing, Wei; Li, Haojie; Hu, Yao; Hu, Daoyu; Li, Zhen; Kamel, Ihab R
2018-05-29
The purpose of this article is to evaluate the utility of volumetric histogram analysis of apparent diffusion coefficient (ADC) derived from reduced-FOV DWI for small (≤ 4 cm) solid renal mass subtypes at 3-T MRI. This retrospective study included 38 clear cell renal cell carcinomas (RCCs), 16 papillary RCCs, 18 chromophobe RCCs, 13 minimal fat angiomyolipomas (AMLs), and seven oncocytomas evaluated with preoperative MRI. Volumetric ADC maps were generated using all slices of the reduced-FOV DW images to obtain histogram parameters, including mean, median, 10th percentile, 25th percentile, 75th percentile, 90th percentile, and SD ADC values, as well as skewness, kurtosis, and entropy. Comparisons of these parameters were made by one-way ANOVA, t test, and ROC curves analysis. ADC histogram parameters differentiated eight of 10 pairs of renal tumors. Three subtype pairs (clear cell RCC vs papillary RCC, clear cell RCC vs chromophobe RCC, and clear cell RCC vs minimal fat AML) were differentiated by mean ADC. However, five other subtype pairs (clear cell RCC vs oncocytoma, papillary RCC vs minimal fat AML, papillary RCC vs oncocytoma, chromophobe RCC vs minimal fat AML, and chromophobe RCC vs oncocytoma) were differentiated by histogram distribution parameters exclusively (all p < 0.05). Mean ADC, median ADC, 75th and 90th percentile ADC, SD ADC, and entropy of malignant tumors were significantly higher than those of benign tumors (all p < 0.05). Combination of mean ADC with histogram parameters yielded the highest AUC (0.851; sensitivity, 80.0%; specificity, 86.1%). Quantitative volumetric ADC histogram analysis may help differentiate various subtypes of small solid renal tumors, including benign and malignant lesions.
Choi, Moon Hyung; Oh, Soon Nam; Rha, Sung Eun; Choi, Joon-Il; Lee, Sung Hak; Jang, Hong Seok; Kim, Jun-Gi; Grimm, Robert; Son, Yohan
2016-07-01
To investigate the usefulness of apparent diffusion coefficient (ADC) values derived from histogram analysis of the whole rectal cancer as a quantitative parameter to evaluate pathologic complete response (pCR) on preoperative magnetic resonance imaging (MRI). We enrolled a total of 86 consecutive patients who had undergone surgery for rectal cancer after neoadjuvant chemoradiotherapy (CRT) at our institution between July 2012 and November 2014. Two radiologists who were blinded to the final pathological results reviewed post-CRT MRI to evaluate tumor stage. Quantitative image analysis was performed using T2 -weighted and diffusion-weighted images independently by two radiologists using dedicated software that performed histogram analysis to assess the distribution of ADC in the whole tumor. After surgery, 16 patients were confirmed to have achieved pCR (18.6%). All parameters from pre- and post-CRT ADC histogram showed good or excellent agreement between two readers. The minimum, 10th, 25th, 50th, and 75th percentile and mean ADC from post-CRT ADC histogram were significantly higher in the pCR group than in the non-pCR group for both readers. The 25th percentile value from ADC histogram in post-CRT MRI had the best diagnostic performance for detecting pCR, with an area under the receiver operating characteristic curve of 0.796. Low percentile values derived from the ADC histogram analysis of rectal cancer on MRI after CRT showed a significant difference between pCR and non-pCR groups, demonstrating the utility of the ADC value as a quantitative and objective marker to evaluate complete pathologic response to preoperative CRT in rectal cancer. J. Magn. Reson. Imaging 2016;44:212-220. © 2015 Wiley Periodicals, Inc.
Hot-wire anemometry in hypersonic helium flow
NASA Technical Reports Server (NTRS)
Wagner, R. D.; Weinstein, L. M.
1974-01-01
Hot-wire anemometry techniques are described that have been developed and used for hypersonic-helium-flow studies. The short run time available dictated certain innovations in applying conventional hot-wire techniques. Some examples are given to show the application of the techniques used. Modifications to conventional equipment are described, including probe modifications and probe heating controls.
Serial data acquisition for GEM-2D detector
NASA Astrophysics Data System (ADS)
Kolasinski, Piotr; Pozniak, Krzysztof T.; Czarski, Tomasz; Linczuk, Maciej; Byszuk, Adrian; Chernyshova, Maryna; Juszczyk, Bartlomiej; Kasprowicz, Grzegorz; Wojenski, Andrzej; Zabolotny, Wojciech; Zienkiewicz, Pawel; Mazon, Didier; Malard, Philippe; Herrmann, Albrecht; Vezinet, Didier
2014-11-01
This article discusses a fast data acquisition and histogramming method for the X-ray GEM detector. The whole process of histogramming is performed by FPGA chips (Spartan-6 series from Xilinx). The results of the histogramming process are stored in an internal FPGA memory and then sent to a PC, where the data are merged and processed in MATLAB. The structure of the firmware functionality implemented in the FPGAs is described. Examples of test measurements and results are presented.
Frequency distribution histograms for the rapid analysis of data
NASA Technical Reports Server (NTRS)
Burke, P. V.; Bullen, B. L.; Poff, K. L.
1988-01-01
The mean and standard error are good representations for the response of a population to an experimental parameter and are frequently used for this purpose. Frequency distribution histograms show, in addition, responses of individuals in the population. Both the statistics and a visual display of the distribution of the responses can be obtained easily using a microcomputer and available programs. The type of distribution shown by the histogram may suggest different mechanisms to be tested.
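A minimal sketch of the idea: report the mean and standard error, but also print the frequency distribution itself, which can reveal structure (for example bimodality) that the summary statistics hide; the data below are synthetic.

    # Sketch: mean, standard error, and a text frequency-distribution histogram.
    import numpy as np

    rng = np.random.default_rng(7)
    # A bimodal "population response" whose mean alone would be misleading.
    responses = np.concatenate([rng.normal(10, 1, 60), rng.normal(20, 1, 40)])

    mean = responses.mean()
    sem = responses.std(ddof=1) / np.sqrt(len(responses))  # standard error of the mean
    counts, edges = np.histogram(responses, bins=12)

    print(f"mean = {mean:.2f} +/- {sem:.2f} (SE)")
    for lo, hi, c in zip(edges[:-1], edges[1:], counts):
        print(f"{lo:5.1f}-{hi:5.1f} | " + "*" * c)         # crude frequency histogram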
NASA Astrophysics Data System (ADS)
Kvinnsland, Yngve; Muren, Ludvig Paul; Dahl, Olav
2004-08-01
Calculations of normal tissue complication probability (NTCP) values for the rectum are difficult because it is a hollow, non-rigid organ. Finding the true cumulative dose distribution for a number of treatment fractions requires a CT scan before each treatment fraction. This is labour intensive, and several surrogate distributions have therefore been suggested, such as dose wall histograms, dose surface histograms and histograms for the solid rectum, with and without margins. In this study, a Monte Carlo method is used to investigate the relationships between the cumulative dose distributions based on all treatment fractions and the above-mentioned histograms that are based on one CT scan only, in terms of equivalent uniform dose. Furthermore, the effect of a specific choice of histogram on estimates of the volume parameter of the probit NTCP model was investigated. It was found that the solid rectum and the rectum wall histograms (without margins) gave equivalent uniform doses with an expected value close to the values calculated from the cumulative dose distributions in the rectum wall. With the number of patients available in this study the standard deviations of the estimates of the volume parameter were large, and it was not possible to decide which volume gave the best estimates of the volume parameter, but there were distinct differences in the mean values obtained.
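For readers unfamiliar with the comparison metric, a generalized equivalent uniform dose can be computed from a differential dose-volume histogram as gEUD = (sum_i v_i * D_i^a)^(1/a); the sketch below uses invented dose bins and an illustrative volume parameter, not the probit-model fit from the study.

    # Sketch: generalised equivalent uniform dose from a differential DVH.
    import numpy as np

    dose_bins = np.array([10.0, 30.0, 50.0, 65.0, 70.0])   # Gy, bin centres (toy)
    rel_volume = np.array([0.10, 0.20, 0.30, 0.25, 0.15])  # organ fraction per bin (toy)

    def geud(dose, vol, a):
        vol = vol / vol.sum()                   # ensure the fractions sum to 1
        return np.sum(vol * dose ** a) ** (1.0 / a)

    for a in (1.0, 8.0):                        # a=1 gives the mean dose; large a -> max-like
        print(f"a = {a:>4}: gEUD = {geud(dose_bins, rel_volume, a):.1f} Gy")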
Choi, Sang Hyun; Lee, Jeong Hyun; Choi, Young Jun; Park, Ji Eun; Sung, Yu Sub; Kim, Namkug; Baek, Jung Hwan
2017-01-01
This study aimed to explore the added value of histogram analysis of the ratio of initial to final 90-second time-signal intensity AUC (AUCR) for differentiating local tumor recurrence from contrast-enhancing scar on follow-up dynamic contrast-enhanced T1-weighted perfusion MRI of patients treated for head and neck squamous cell carcinoma (HNSCC). AUCR histogram parameters were assessed among tumor recurrence (n = 19) and contrast-enhancing scar (n = 27) at primary sites and compared using the t test. ROC analysis was used to determine the best differentiating parameters. The added value of AUCR histogram parameters was assessed when they were added to inconclusive conventional MRI results. Histogram analysis showed statistically significant differences in the 50th, 75th, and 90th percentiles of the AUCR values between the two groups (p < 0.05). The 90th percentile of the AUCR values (AUCR90) was the best predictor of local tumor recurrence (AUC, 0.77; 95% CI, 0.64-0.91) with an estimated cutoff of 1.02. AUCR90 increased sensitivity by 11.7% over that of conventional MRI alone when added to inconclusive results. Histogram analysis of AUCR can improve the diagnostic yield for local tumor recurrence during surveillance after treatment for HNSCC.
Value of MR histogram analyses for prediction of microvascular invasion of hepatocellular carcinoma
Huang, Ya-Qin; Liang, He-Yue; Yang, Zhao-Xia; Ding, Ying; Zeng, Meng-Su; Rao, Sheng-Xiang
2016-01-01
The objective is to explore the value of preoperative magnetic resonance (MR) histogram analyses in predicting microvascular invasion (MVI) of hepatocellular carcinoma (HCC). Fifty-one patients with histologically confirmed HCC who underwent diffusion-weighted and contrast-enhanced MR imaging were included. Histogram analyses were performed and mean, variance, skewness, kurtosis, 1st, 10th, 50th, 90th, and 99th percentiles were derived. Quantitative histogram parameters were compared between HCCs with and without MVI. Receiver operating characteristics (ROC) analyses were generated to compare the diagnostic performance of tumor size, histogram analyses of apparent diffusion coefficient (ADC) maps, and MR enhancement. The mean, 1st, 10th, and 50th percentiles of ADC maps, and the mean, variance, 1st, 10th, 50th, 90th, and 99th percentiles of the portal venous phase (PVP) images were significantly different between the groups with and without MVI (P <0.05), with area under the ROC curves (AUCs) of 0.66 to 0.74 for ADC and 0.76 to 0.88 for PVP. The largest AUC of PVP (1st percentile) showed significantly higher accuracy compared with that of arterial phase (AP) or tumor size (P <0.001). MR histogram analyses—in particular the 1st percentile for PVP images—held promise for prediction of MVI of HCC. PMID:27368028
Effect of respiratory and cardiac gating on the major diffusion-imaging metrics
Hamaguchi, Hiroyuki; Sugimori, Hiroyuki; Nakanishi, Mitsuhiro; Nakagawa, Shin; Fujiwara, Taro; Yoshida, Hirokazu; Takamori, Sayaka; Shirato, Hiroki
2016-01-01
The effect of respiratory gating on the major diffusion-imaging metrics and that of cardiac gating on mean kurtosis (MK) are not known. For evaluation of whether the major diffusion-imaging metrics—MK, fractional anisotropy (FA), and mean diffusivity (MD) of the brain—varied between gated and non-gated acquisitions, respiratory-gated, cardiac-gated, and non-gated diffusion-imaging of the brain were performed in 10 healthy volunteers. MK, FA, and MD maps were constructed for all acquisitions, and the histograms were constructed. The normalized peak height and location of the histograms were compared among the acquisitions by use of Friedman and post hoc Wilcoxon tests. The effect of the repetition time (TR) on the diffusion-imaging metrics was also tested, and we corrected for its variation among acquisitions, if necessary. The results showed a shift in the peak location of the MK and MD histograms to the right with an increase in TR (p ≤ 0.01). The corrected peak location of the MK histograms, the normalized peak height of the FA histograms, the normalized peak height and the corrected peak location of the MD histograms varied significantly between the gated and non-gated acquisitions (p < 0.05). These results imply an influence of respiration and cardiac pulsation on the major diffusion-imaging metrics. The gating conditions must be kept identical if reproducible results are to be achieved. PMID:27073115
Infrared face recognition based on LBP histogram and KW feature selection
NASA Astrophysics Data System (ADS)
Xie, Zhihua
2014-07-01
The conventional LBP-based feature as represented by the local binary pattern (LBP) histogram still has room for performance improvements. This paper focuses on the dimension reduction of LBP micro-patterns and proposes an improved infrared face recognition method based on LBP histogram representation. To extract locally robust features in infrared face images, LBP is chosen to get the composition of micro-patterns of sub-blocks. Based on statistical test theory, a Kruskal-Wallis (KW) feature selection method is proposed to get the LBP patterns which are suitable for infrared face recognition. The experimental results show that the combination of LBP and KW feature selection improves the performance of infrared face recognition; the proposed method outperforms the traditional methods based on the LBP histogram, discrete cosine transform (DCT), or principal component analysis (PCA).
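The two ingredients, block-wise LBP histograms and Kruskal-Wallis scoring of each histogram bin across classes, can be sketched with scikit-image and SciPy; the images, LBP parameters, and number of selected bins below are illustrative assumptions rather than the paper's settings.

    # Sketch: uniform LBP histograms + Kruskal-Wallis (KW) scoring per bin.
    import numpy as np
    from scipy.stats import kruskal
    from skimage.feature import local_binary_pattern

    rng = np.random.default_rng(8)
    P, R = 8, 1                                    # LBP neighbours and radius

    def toy_image(sigma):
        return rng.normal(100, sigma, (64, 64)).clip(0, 255).astype(np.uint8)

    def lbp_histogram(img):
        codes = local_binary_pattern(img, P, R, method="uniform")
        hist, _ = np.histogram(codes, bins=P + 2, range=(0, P + 2), density=True)
        return hist

    class_a = [lbp_histogram(toy_image(5)) for _ in range(10)]    # smoother "faces"
    class_b = [lbp_histogram(toy_image(25)) for _ in range(10)]   # noisier "faces"

    # Larger KW statistic = that LBP bin separates the classes better.
    scores = [kruskal([h[b] for h in class_a], [h[b] for h in class_b]).statistic
              for b in range(P + 2)]
    print("most discriminative LBP bins:", np.argsort(scores)[::-1][:5])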
Pixel-based skin segmentation in psoriasis images.
George, Y; Aldeen, M; Garnavi, R
2016-08-01
In this paper, we present a detailed comparison study of skin segmentation methods for psoriasis images. Different techniques are modified and then applied to a set of psoriasis images acquired from the Royal Melbourne Hospital, Melbourne, Australia, with the aim of finding the best technique suited for application to psoriasis images. We investigate the effect of different colour transformations on skin detection performance. In this respect, explicit skin thresholding is evaluated with three different decision boundaries (CbCr, HS and rgHSV). A histogram-based Bayesian classifier is applied to extract skin probability maps (SPMs) for different colour channels. This is then followed by different approaches to find a binary skin map (SM) image from the SPMs. The approaches used include binary decision tree (DT) and Otsu's thresholding. Finally, a set of morphological operations are implemented to refine the resulting SM image. The paper provides detailed analysis and comparison of the performance of the Bayesian classifier in five different colour spaces (YCbCr, HSV, RGB, XYZ and CIELab). The results show that the histogram-based Bayesian classifier is more effective than explicit thresholding when applied to psoriasis images. It is also found that the CbCr decision boundary outperforms HS and rgHSV. Another finding is that the SPMs of the Cb, Cr, H and B-CIELab colour bands yield the best SMs for psoriasis images. In this study, we used a set of 100 psoriasis images for training and testing the presented methods. True Positive (TP) and True Negative (TN) are used as statistical evaluation measures.
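The core of a histogram-based Bayesian skin classifier is a per-bin application of Bayes' rule to skin and non-skin colour histograms; the sketch below works in a CbCr-like 2-D space with synthetic training pixels, so the bin count, prior, and threshold are illustrative assumptions only.

    # Sketch: histogram-based Bayesian skin probability map in a 2-D colour space.
    import numpy as np

    rng = np.random.default_rng(9)
    bins = 32
    edges = np.linspace(0, 256, bins + 1)

    # Synthetic training pixels: (Cb, Cr) pairs for skin and non-skin.
    skin = rng.normal([110, 150], 12, size=(5000, 2))
    nonskin = rng.uniform(0, 255, size=(20000, 2))

    h_skin, _, _ = np.histogram2d(skin[:, 0], skin[:, 1], bins=[edges, edges])
    h_bg, _, _ = np.histogram2d(nonskin[:, 0], nonskin[:, 1], bins=[edges, edges])

    # P(skin | Cb, Cr) via Bayes' rule, prior taken from the training proportions.
    prior = len(skin) / (len(skin) + len(nonskin))
    p_c_skin = h_skin / h_skin.sum()
    p_c_bg = h_bg / h_bg.sum()
    post = p_c_skin * prior / (p_c_skin * prior + p_c_bg * (1 - prior) + 1e-12)

    # Classify one new pixel by histogram-bin lookup and a 0.5 posterior threshold.
    cb, cr = 111.0, 149.0
    i, j = np.searchsorted(edges, cb) - 1, np.searchsorted(edges, cr) - 1
    print("P(skin | pixel) =", round(float(post[i, j]), 3),
          "-> skin" if post[i, j] > 0.5 else "-> background")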
Computerized image analysis: estimation of breast density on mammograms
NASA Astrophysics Data System (ADS)
Zhou, Chuan; Chan, Heang-Ping; Petrick, Nicholas; Sahiner, Berkman; Helvie, Mark A.; Roubidoux, Marilyn A.; Hadjiiski, Lubomir M.; Goodsitt, Mitchell M.
2000-06-01
An automated image analysis tool is being developed for estimation of mammographic breast density, which may be useful for risk estimation or for monitoring breast density change in a prevention or intervention program. A mammogram is digitized using a laser scanner and the resolution is reduced to a pixel size of 0.8 mm X 0.8 mm. Breast density analysis is performed in three stages. First, the breast region is segmented from the surrounding background by an automated breast boundary-tracking algorithm. Second, an adaptive dynamic range compression technique is applied to the breast image to reduce the range of the gray level distribution in the low frequency background and to enhance the differences in the characteristic features of the gray level histogram for breasts of different densities. Third, rule-based classification is used to classify the breast images into several classes according to the characteristic features of their gray level histogram. For each image, a gray level threshold is automatically determined to segment the dense tissue from the breast region. The area of segmented dense tissue as a percentage of the breast area is then estimated. In this preliminary study, we analyzed the interobserver variation of breast density estimation by two experienced radiologists using BI-RADS lexicon. The radiologists' visually estimated percent breast densities were compared with the computer's calculation. The results demonstrate the feasibility of estimating mammographic breast density using computer vision techniques and its potential to improve the accuracy and reproducibility in comparison with the subjective visual assessment by radiologists.
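The final percent-density step reduces to a grey-level threshold over the segmented breast region; in the sketch below a synthetic image and an Otsu threshold stand in for the digitized mammogram and the rule-based threshold selection described above.

    # Sketch: percent density = dense pixels / breast pixels after thresholding.
    import numpy as np
    from skimage.filters import threshold_otsu

    rng = np.random.default_rng(10)
    breast = rng.normal(100, 15, size=(400, 400))                 # fatty background (toy)
    breast[100:220, 120:260] += rng.normal(60, 10, (120, 140))    # a denser region (toy)

    mask = np.ones_like(breast, dtype=bool)       # breast-region segmentation mask
    thresh = threshold_otsu(breast[mask])         # stand-in for rule-based threshold choice
    dense = (breast > thresh) & mask

    percent_density = 100.0 * dense.sum() / mask.sum()
    print(f"threshold = {thresh:.1f}, percent density = {percent_density:.1f}%")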
Multispectral histogram normalization contrast enhancement
NASA Technical Reports Server (NTRS)
Soha, J. M.; Schwartz, A. A.
1979-01-01
A multispectral histogram normalization or decorrelation enhancement which achieves effective color composites by removing interband correlation is described. The enhancement procedure employs either linear or nonlinear transformations to equalize principal component variances. An additional rotation to any set of orthogonal coordinates is thus possible, while full histogram utilization is maintained by avoiding the reintroduction of correlation. For the three-dimensional case, the enhancement procedure may be implemented with a lookup table. An application of the enhancement to Landsat multispectral scanning imagery is presented.
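A bare-bones version of such a decorrelation stretch, rotate the bands to principal components, equalize their variances, and rotate back, can be written with NumPy alone; the three synthetic bands below are stand-ins, and the lookup-table implementation mentioned above is not reproduced.

    # Sketch: decorrelation stretch of a 3-band image via principal components.
    import numpy as np

    rng = np.random.default_rng(11)
    h, w = 128, 128
    base = rng.normal(size=(h, w))
    bands = np.stack([base + 0.1 * rng.normal(size=(h, w)) for _ in range(3)], axis=-1)

    X = bands.reshape(-1, 3)
    mean = X.mean(axis=0)
    evals, evecs = np.linalg.eigh(np.cov(X - mean, rowvar=False))

    pcs = (X - mean) @ evecs          # rotate into principal-component space
    pcs /= np.sqrt(evals)             # equalise principal-component variances
    stretched = pcs @ evecs.T + mean  # rotate back to the original band axes

    print("band correlations before:\n", np.corrcoef(X, rowvar=False).round(2))
    print("band correlations after:\n", np.corrcoef(stretched, rowvar=False).round(2))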
Remote logo detection using angle-distance histograms
NASA Astrophysics Data System (ADS)
Youn, Sungwook; Ok, Jiheon; Baek, Sangwook; Woo, Seongyoun; Lee, Chulhee
2016-05-01
Among all the various computer vision applications, automatic logo recognition has drawn great interest from industry as well as various academic institutions. In this paper, we propose an angle-distance map, which we used to develop a robust logo detection algorithm. The proposed angle-distance histogram is invariant against scale and rotation. The proposed method first used shape information and color characteristics to find the candidate regions and then applied the angle-distance histogram. Experiments show that the proposed method detected logos of various sizes and orientations.
Sim, K S; Teh, V; Tey, Y C; Kho, T K
2016-11-01
This paper introduces a new technique to improve scanning electron microscope (SEM) image quality, which we name sub-blocking multiple peak histogram equalization (SUB-B-MPHE) with a convolution operator. Using this proposed technique, the modified MPHE performs better than the original MPHE. In addition, the sub-blocking method incorporates a convolution operator that helps remove the blocking effect from SEM images after the technique is applied. Hence, the convolution operator effectively removes the blocking effect by properly distributing suitable pixel values across the whole image. Overall, the SUB-B-MPHE with convolution outperforms the other methods. SCANNING 38:492-501, 2016. © 2015 Wiley Periodicals, Inc.
Classroom Management Through the Application of Behavior Modification Techniques.
ERIC Educational Resources Information Center
Ferinden, William E., Jr.
The primary aim of this book is to bring to the grade school teacher a survey of the most recent techniques and ideas of behavior modification which are applicable to good classroom management. All of the approaches and techniques presented could be of interest to teachers working at all grade levels. Since research has shown that the systematic…
Behavior Modification in the Classroom
ERIC Educational Resources Information Center
Whitman, Mryon; Whitman, Joan
1971-01-01
This article presents the theoretical rationale for behavior modification, principally through its comparison with traditional psychotherapies, and suggests some behavior modification techniques for the classroom management of maladaptive behavior. (Author)
A Review of Some Alternative Approaches to Drug Management of Hyperactivity in Children.
ERIC Educational Resources Information Center
Walden, Everett L.; Thompson, Sheila A.
1981-01-01
Literature is cited on such classroom management techniques as environmental manipulations, curriculum modifications, behavior modification, and dealing with affective attitude. Also considered are home management techniques and self-control programs including biofeedback and relaxation training. (SB)
ERIC Educational Resources Information Center
Leyden, Michael B.
1975-01-01
Describes various elementary school activities using a loaf of raisin bread to promote inquiry skills. Activities include estimating the number of raisins in the loaf by constructing histograms of the number of raisins in a slice. (MLH)
Massar, Melody L; Bhagavatula, Ramamurthy; Ozolek, John A; Castro, Carlos A; Fickus, Matthew; Kovačević, Jelena
2011-10-19
We present the current state of our work on a mathematical framework for identification and delineation of histopathology images: local histograms and occlusion models. Local histograms are histograms computed over defined spatial neighborhoods whose purpose is to characterize an image locally. This unit of description is augmented by our occlusion models that describe a methodology for image formation. In the context of this image formation model, the power of local histograms with respect to appropriate families of images will be shown through various proved statements about expected performance. We conclude by presenting a preliminary study to demonstrate the power of the framework in the context of histopathology image classification tasks that, while differing greatly in application, both originate from what is considered an appropriate class of images for this framework.
Chen, Zhaoxue; Yu, Haizhong; Chen, Hao
2013-12-01
To solve the problem of traditional K-means clustering, in which initial clustering centers are selected randomly, we proposed a new K-means segmentation algorithm based on robustly selecting 'peaks' standing for white matter, gray matter and cerebrospinal fluid in the multi-peak gray histogram of an MRI brain image. The new algorithm takes the gray values of the selected histogram 'peaks' as the initial K-means clustering centers and can segment the MRI brain image into the three tissue classes more effectively, accurately and stably. Extensive experiments have shown that the proposed algorithm can overcome many shortcomings of the traditional K-means clustering method, such as low efficiency, poor accuracy and robustness, and long running time. The histogram 'peak' selection idea of the proposed segmentation method is broadly applicable.
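The essential step, seeding K-means with grey-level histogram peaks instead of random centres, can be sketched with SciPy peak detection and scikit-learn; the intensity mixture, peak-detection parameters, and class count below are illustrative, not taken from the paper.

    # Sketch: K-means seeded with grey-level histogram peaks (synthetic intensities).
    import numpy as np
    from scipy.signal import find_peaks
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(12)
    img = np.concatenate([rng.normal(60, 8, 3000),    # CSF-like intensities
                          rng.normal(120, 8, 5000),   # grey-matter-like
                          rng.normal(180, 8, 4000)])  # white-matter-like

    hist, edges = np.histogram(img, bins=256, range=(0, 255))
    centres = 0.5 * (edges[:-1] + edges[1:])
    peaks, _ = find_peaks(hist, prominence=hist.max() * 0.1, distance=20)
    init = centres[peaks][:3].reshape(-1, 1)          # grey values of detected peaks

    km = KMeans(n_clusters=3, init=init, n_init=1).fit(img.reshape(-1, 1))
    print("histogram-peak seeds: ", init.ravel().round(1))
    print("final cluster centres:", np.sort(km.cluster_centers_.ravel()).round(1))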
Neutron camera employing row and column summations
Clonts, Lloyd G.; Diawara, Yacouba; Donahue, Jr, Cornelius; Montcalm, Christopher A.; Riedel, Richard A.; Visscher, Theodore
2016-06-14
For each photomultiplier tube in an Anger camera, an R×S array of preamplifiers is provided to detect electrons generated within the photomultiplier tube. The outputs of the preamplifiers are digitized to measure the magnitude of the signals from each preamplifier. For each photomultiplier tube, a corresponding summation circuitry including R row summation circuits and S column summation circuits numerically add the magnitudes of the signals from preamplifiers for each row and for each column to generate histograms. For a P×Q array of photomultiplier tubes, P×Q summation circuitries generate P×Q row histograms including R entries and P×Q column histograms including S entries. The total set of histograms include P×Q×(R+S) entries, which can be analyzed by a position calculation circuit to determine the locations of events (detection of a neutron).
Cho, Gene Young; Moy, Linda; Kim, Sungheon G; Baete, Steven H; Moccaldi, Melanie; Babb, James S; Sodickson, Daniel K; Sigmund, Eric E
2016-08-01
To examine heterogeneous breast cancer through intravoxel incoherent motion (IVIM) histogram analysis. This HIPAA-compliant, IRB-approved retrospective study included 62 patients (age 48.44 ± 11.14 years, 50 malignant lesions and 12 benign) who underwent contrast-enhanced 3 T breast MRI and diffusion-weighted imaging. Apparent diffusion coefficient (ADC) and IVIM biomarkers of tissue diffusivity (Dt), perfusion fraction (fp), and pseudo-diffusivity (Dp) were calculated using voxel-based analysis for the whole lesion volume. Histogram analysis was performed to quantify tumour heterogeneity. Comparisons were made using Mann-Whitney tests between benign/malignant status, histological subtype, and molecular prognostic factor status while Spearman's rank correlation was used to characterize the association between imaging biomarkers and prognostic factor expression. The average values of the ADC and IVIM biomarkers, Dt and fp, showed significant differences between benign and malignant lesions. Additional significant differences were found in the histogram parameters among tumour subtypes and molecular prognostic factor status. IVIM histogram metrics, particularly fp and Dp, showed significant correlation with hormonal factor expression. Advanced diffusion imaging biomarkers show relationships with molecular prognostic factors and breast cancer malignancy. This analysis reveals novel diagnostic metrics that may explain some of the observed variability in treatment response among breast cancer patients. • Novel IVIM biomarkers characterize heterogeneous breast cancer. • Histogram analysis enables quantification of tumour heterogeneity. • IVIM biomarkers show relationships with breast cancer malignancy and molecular prognostic factors.
Wu, Rongli; Watanabe, Yoshiyuki; Arisawa, Atsuko; Takahashi, Hiroto; Tanaka, Hisashi; Fujimoto, Yasunori; Watabe, Tadashi; Isohashi, Kayako; Hatazawa, Jun; Tomiyama, Noriyuki
2017-10-01
This study aimed to compare the tumor volume definition using conventional magnetic resonance (MR) and 11C-methionine positron emission tomography (MET/PET) images in the differentiation of the pre-operative glioma grade by using whole-tumor histogram analysis of normalized cerebral blood volume (nCBV) maps. Thirty-four patients with histopathologically proven primary brain low-grade gliomas (n = 15) and high-grade gliomas (n = 19) underwent pre-operative or pre-biopsy MET/PET, fluid-attenuated inversion recovery, dynamic susceptibility contrast perfusion-weighted magnetic resonance imaging, and contrast-enhanced T1-weighted at 3.0 T. The histogram distribution derived from the nCBV maps was obtained by co-registering the whole tumor volume delineated on conventional MR or MET/PET images, and eight histogram parameters were assessed. The mean nCBV value had the highest AUC value (0.906) based on MET/PET images. Diagnostic accuracy significantly improved when the tumor volume was measured from MET/PET images compared with conventional MR images for the parameters of mean, 50th, and 75th percentile nCBV value (p = 0.0246, 0.0223, and 0.0150, respectively). Whole-tumor histogram analysis of CBV map provides more valuable histogram parameters and increases diagnostic accuracy in the differentiation of pre-operative cerebral gliomas when the tumor volume is derived from MET/PET images.
Effect of respiratory and cardiac gating on the major diffusion-imaging metrics.
Hamaguchi, Hiroyuki; Tha, Khin Khin; Sugimori, Hiroyuki; Nakanishi, Mitsuhiro; Nakagawa, Shin; Fujiwara, Taro; Yoshida, Hirokazu; Takamori, Sayaka; Shirato, Hiroki
2016-08-01
The effect of respiratory gating on the major diffusion-imaging metrics and that of cardiac gating on mean kurtosis (MK) are not known. For evaluation of whether the major diffusion-imaging metrics-MK, fractional anisotropy (FA), and mean diffusivity (MD) of the brain-varied between gated and non-gated acquisitions, respiratory-gated, cardiac-gated, and non-gated diffusion-imaging of the brain were performed in 10 healthy volunteers. MK, FA, and MD maps were constructed for all acquisitions, and the histograms were constructed. The normalized peak height and location of the histograms were compared among the acquisitions by use of Friedman and post hoc Wilcoxon tests. The effect of the repetition time (TR) on the diffusion-imaging metrics was also tested, and we corrected for its variation among acquisitions, if necessary. The results showed a shift in the peak location of the MK and MD histograms to the right with an increase in TR (p ≤ 0.01). The corrected peak location of the MK histograms, the normalized peak height of the FA histograms, the normalized peak height and the corrected peak location of the MD histograms varied significantly between the gated and non-gated acquisitions (p < 0.05). These results imply an influence of respiration and cardiac pulsation on the major diffusion-imaging metrics. The gating conditions must be kept identical if reproducible results are to be achieved. © The Author(s) 2016.
A cost-effective line-based light-balancing technique using adaptive processing.
Hsia, Shih-Chang; Chen, Ming-Huei; Chen, Yu-Min
2006-09-01
The camera imaging system has been widely used; however, the displayed image often has an uneven light distribution. This paper presents novel light-balancing techniques to compensate for uneven illumination based on adaptive signal processing. For text image processing, we first estimate the background level and then process each pixel with a nonuniform gain. This algorithm can balance the light distribution while keeping a high contrast in the image. For graph image processing, adaptive section control using a piecewise nonlinear gain is proposed to equalize the histogram. Simulations show that the light-balancing performance is better than that of other methods. Moreover, we employ line-based processing to efficiently reduce the memory requirement and the computational cost to make it applicable in real-time systems.
The magnifying glass - A feature space local expansion for visual analysis. [and image enhancement
NASA Technical Reports Server (NTRS)
Juday, R. D.
1981-01-01
The Magnifying Glass Transformation (MGT) technique is proposed as a multichannel spectral operation yielding visual imagery which is enhanced in a specified spectral vicinity, guided by the statistics of training samples. An example application is increasing the discrimination among spectral neighbors within an interactive display without altering the appearance of spectrally distant objects or the overall interpretation. A direct histogram specification technique is applied to the channels within the multispectral image so that a subset of the spectral domain occupies an increased fraction of the domain. The transformation is carried out by obtaining the training information, establishing the condition of the covariance matrix, determining the influenced solid, and initializing the lookup table. Finally, the image is transformed.
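Direct histogram specification for a single channel amounts to mapping the source cumulative distribution onto a desired target distribution; a NumPy-only sketch follows, with a synthetic source channel and target histogram standing in for the MGT's training-derived targets.

    # Sketch: direct histogram specification via CDF matching for one channel.
    import numpy as np

    rng = np.random.default_rng(13)
    src = rng.normal(90, 20, size=(256, 256)).clip(0, 255)     # source channel (toy)
    target = rng.normal(160, 35, size=100_000).clip(0, 255)    # desired distribution (toy)

    bins = np.arange(257)
    src_cdf = np.cumsum(np.histogram(src, bins=bins)[0]) / src.size
    tgt_cdf = np.cumsum(np.histogram(target, bins=bins)[0]) / target.size

    # For each source grey level, pick the target level with the closest CDF value.
    lookup = np.searchsorted(tgt_cdf, src_cdf).clip(0, 255)
    matched = lookup[src.astype(np.uint8)]

    print("source mean:", src.mean().round(1), "-> matched mean:", matched.mean().round(1))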
Method for star identification using neural networks
NASA Astrophysics Data System (ADS)
Lindsey, Clark S.; Lindblad, Thomas; Eide, Age J.
1997-04-01
Identification of star constellations with an onboard star tracker provides the highest precision of all attitude determination techniques for spacecraft. A method for identification of star constellations inspired by neural network (NNW) techniques is presented. It compares feature vectors derived from histograms of distances to multiple stars around the unknown star. The NNW method appears most robust with respect to position noise and would require a smaller database than conventional methods, especially for small fields of view. The neural network method is quite slow when performed on a sequential (serial) processor, but would provide very high speed if implemented in special hardware. Such hardware solutions could also yield low weight and low power consumption, both important features for small satellites.
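A minimal Python sketch of the kind of feature vector described above (a normalized histogram of distances from an unknown star to its neighbors); the bin count, distance range, and function name are illustrative assumptions.

```python
import numpy as np

def distance_histogram_feature(center, neighbors, n_bins=16, max_dist=1.0):
    """Feature vector for one star: histogram of distances from the star to
    the neighboring stars in the field of view.

    `center` is a 2-vector, `neighbors` an (N, 2) array of star positions.
    This only sketches the kind of feature vector a histogram-based
    identification scheme might compare against a catalogue.
    """
    d = np.linalg.norm(neighbors - center, axis=1)
    hist, _ = np.histogram(d, bins=n_bins, range=(0.0, max_dist))
    hist = hist.astype(float)
    return hist / max(hist.sum(), 1.0)              # normalize for comparison

# Matching could then pick the catalogue star whose stored feature vector lies
# closest (e.g. in Euclidean distance) to this observed feature vector.
```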
Automatic video segmentation and indexing
NASA Astrophysics Data System (ADS)
Chahir, Youssef; Chen, Liming
1999-08-01
Indexing is an important aspect of video database management. Video indexing involves the analysis of video sequences, which is a computationally intensive process. However, effective management of digital video requires robust indexing techniques. The main purpose of our proposed video segmentation is twofold. First, we develop an algorithm that identifies camera shot boundaries. The approach is based on a combination of color histograms and a block-based technique. Next, each temporal segment is represented by a color reference frame, which specifies the shot similarities and which is used to constitute scenes. Experimental results using a variety of videos selected from the corpus of the French Audiovisual National Institute are presented to demonstrate the effectiveness of shot detection, the content characterization of shots, and scene constitution.
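A minimal Python sketch of the color-histogram part of such shot-boundary detection; the bin count, the L1 distance, and the cut threshold are illustrative assumptions, and the block-based refinement described in the abstract is omitted.

```python
import numpy as np

def color_histogram(frame, bins=8):
    """3-D RGB histogram of a frame (H x W x 3 uint8), flattened and normalized."""
    hist, _ = np.histogramdd(frame.reshape(-1, 3),
                             bins=(bins, bins, bins),
                             range=((0, 256), (0, 256), (0, 256)))
    hist = hist.ravel()
    return hist / hist.sum()

def shot_boundaries(frames, threshold=0.4):
    """Declare a cut whenever the histogram distance between consecutive
    frames exceeds a threshold (L1 distance used here for simplicity)."""
    cuts = []
    prev = color_histogram(frames[0])
    for i, frame in enumerate(frames[1:], start=1):
        cur = color_histogram(frame)
        if np.abs(cur - prev).sum() > threshold:
            cuts.append(i)
        prev = cur
    return cuts
```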
NASA Astrophysics Data System (ADS)
Rhodes, Andrew P.; Christian, John A.; Evans, Thomas
2017-12-01
With the availability and popularity of 3D sensors, it is advantageous to re-examine the use of point cloud descriptors for the purpose of pose estimation and spacecraft relative navigation. One popular descriptor is the oriented unique repeatable clustered viewpoint feature histogram (OUR-CVFH).
Hybrid Histogram Descriptor: A Fusion Feature Representation for Image Retrieval.
Feng, Qinghe; Hao, Qiaohong; Chen, Yuqi; Yi, Yugen; Wei, Ying; Dai, Jiangyan
2018-06-15
Currently, visual sensors are becoming increasingly affordable and popular, accelerating the growth of image data. Image retrieval has attracted increasing interest owing to applications in space exploration, industry, and biomedicine. Nevertheless, designing an effective feature representation is acknowledged as a hard yet fundamental issue. This paper presents a fusion feature representation called a hybrid histogram descriptor (HHD) for image retrieval. The proposed descriptor jointly comprises two histograms: a perceptually uniform histogram, which is extracted by exploiting the color and edge orientation information in perceptually uniform regions; and a motif co-occurrence histogram, which is acquired by calculating the probability of a pair of motif patterns. To evaluate the performance, we benchmarked the proposed descriptor on the RSSCN7, AID, Outex-00013, Outex-00014 and ETHZ-53 datasets. Experimental results suggest that the proposed descriptor is more effective and robust than ten recent fusion-based descriptors under the content-based image retrieval framework. The computational complexity was also analyzed to give an in-depth evaluation. Furthermore, compared with state-of-the-art convolutional neural network (CNN)-based descriptors, the proposed descriptor also achieves comparable performance, but does not require any training process.
Histograms and Frequency Density.
ERIC Educational Resources Information Center
Micromath, 2003
2003-01-01
Introduces exercises on histograms and frequency density. Guides pupils to Discovering Important Statistical Concepts Using Spreadsheets (DISCUSS), created at the University of Coventry. Includes curriculum points, teaching tips, activities, and internet address (http://www.coventry.ac.uk/discuss/). (KHR)
The DataCube Server. Animate Agent Project Working Note 2, Version 1.0
1993-11-01
Before this can be called, a histogram of all the needed levels must be made and their one-band images must be made. Note that if a level's backprojection will not be used, then the level does not need to be histogrammed. Any points outside the active region in a level's backprojection will be undefined.
Hu, Yunzi; Daoud, Walid A.; Cheuk, Kevin Ka Leung; Lin, Carol Sze Ki
2016-01-01
Polycondensation and ring-opening polymerization are two important polymer synthesis methods. Poly(lactic acid), the most typical biodegradable polymer, has been researched extensively since the 1900s. It is of significant importance to have an up-to-date review of the recent improvements in techniques for biodegradable polymers. This review takes poly(lactic acid) as an example to present newly developed polymer synthesis techniques for polycondensation and ring-opening polymerization reported in the recent decade (2005–2015), on the basis of industrial technique modifications and advanced laboratory research. Different polymerization methods, including various solvents, heating programs, reaction apparatus and catalyst systems, are summarized and compared with the current industrial production situation. Newly developed modification techniques for improving polymer properties are also discussed based on the case of poly(lactic acid). PMID:28773260
Study on modification of the Misgav Ladach method for cesarean section.
Li, M; Zou, L; Zhu, J
2001-01-01
From May to December 1999, 172 pregnant women scheduled for delivery by cesarean section were randomly assigned to a modification group (59 cases, modified Misgav Ladach technique), a Misgav Ladach group (57 cases, Misgav Ladach technique), or a Pfannenstiel group (56 cases, Pfannenstiel technique). The modifications included: transversely incising the fascia for 2 to 3 cm and then dividing it bluntly; not opening or dissociating the visceral peritoneum; suturing the low transverse uterine incision in two layers; and closing the skin with a continuous suture. Results showed that the average delivery time was (3.6 +/- 2.6) min in the modification group and (5.7 +/- 2.9) min in the Misgav Ladach group (P < 0.05). Median operating time was (28.3 +/- 5.4) min in the modification group compared with (27.5 +/- 6.5) min in the Misgav Ladach group (P > 0.05). Average blood loss was (128 +/- 35) ml in the modification group compared with (212 +/- 147) ml in the Pfannenstiel group (P < 0.05). It was concluded that the modified Misgav Ladach technique not only preserved all the advantages of the Misgav Ladach method, but also offered additional advantages, such as faster delivery of the fetus, less tissue damage, and easier mastery by obstetricians.
Ecological Effects of Weather Modification: A Problem Analysis.
ERIC Educational Resources Information Center
Cooper, Charles F.; Jolly, William C.
This publication reviews the potential hazards to the environment of weather modification techniques as they eventually become capable of producing large scale weather pattern modifications. Such weather modifications could result in ecological changes which would generally require several years to be fully evident, including the alteration of…
Gihr, Georg Alexander; Horvath-Rizea, Diana; Garnov, Nikita; Kohlhof-Meinecke, Patricia; Ganslandt, Oliver; Henkes, Hans; Meyer, Hans Jonas; Hoffmann, Karl-Titus; Surov, Alexey; Schob, Stefan
2018-02-01
Presurgical grading, estimation of growth kinetics, and other prognostic factors are becoming increasingly important for selecting the best therapeutic approach for meningioma patients. Diffusion-weighted imaging (DWI) provides microstructural information and reflects tumor biology. A novel DWI approach, histogram profiling of apparent diffusion coefficient (ADC) volumes, provides more distinct information than conventional DWI. Therefore, our study investigated whether ADC histogram profiling distinguishes low-grade from high-grade lesions and reflects Ki-67 expression and progesterone receptor status. Pretreatment ADC volumes of 37 meningioma patients (28 low-grade, 9 high-grade) were used for histogram profiling. WHO grade, Ki-67 expression, and progesterone receptor status were evaluated. Comparative and correlative statistics investigating the association between histogram profiling and neuropathology were performed. The entire ADC profile (p10, p25, p75, p90, mean, median) was significantly lower in high-grade versus low-grade meningiomas. The lower percentiles, mean, and mode showed significant correlations with Ki-67 expression. Skewness and entropy of the ADC volumes were significantly associated with progesterone receptor status and Ki-67 expression. ROC analysis revealed entropy to be the most accurate parameter distinguishing low-grade from high-grade meningiomas. ADC histogram profiling provides a distinct set of parameters, which help differentiate low-grade from high-grade meningiomas. Also, histogram metrics correlate significantly with histological surrogates of the respective proliferative potential. More specifically, entropy proved to be the most promising imaging biomarker for presurgical grading. Both entropy and skewness were significantly associated with progesterone receptor status and Ki-67 expression and therefore should be investigated further as predictors of prognostically relevant tumor biological features. Since absolute ADC values vary between MRI scanners of different vendors and field strengths, their use is more limited in the presurgical setting.
Reiner, Caecilia S; Gordic, Sonja; Puippe, Gilbert; Morsbach, Fabian; Wurnig, Moritz; Schaefer, Niklaus; Veit-Haibach, Patrick; Pfammatter, Thomas; Alkadhi, Hatem
2016-03-01
To evaluate, in patients with hepatocellular carcinoma (HCC), whether assessment of tumor heterogeneity by histogram analysis of computed tomography (CT) perfusion helps predict response to transarterial radioembolization (TARE). Sixteen patients (15 male; mean age 65 years; age range 47-80 years) with HCC underwent CT liver perfusion for treatment planning prior to TARE with Yttrium-90 microspheres. Arterial perfusion (AP) derived from CT perfusion was measured in the entire tumor volume, and heterogeneity was analyzed voxel-wise by histogram analysis. Response to TARE was evaluated on follow-up imaging (median follow-up, 129 days) based on modified Response Evaluation Criteria in Solid Tumors (mRECIST). Results of histogram analysis and mean AP values of the tumor were compared between responders and non-responders. Receiver operating characteristics were calculated to determine the parameters' ability to discriminate responders from non-responders. According to mRECIST, 8 patients (50%) were responders and 8 (50%) non-responders. Comparing responders and non-responders, the 50th and 75th percentiles of AP derived from histogram analysis were significantly different [AP 43.8/54.3 vs. 27.6/34.3 mL min(-1) 100 mL(-1); p < 0.05], while the mean AP of HCCs (43.5 vs. 27.9 mL min(-1) 100 mL(-1); p > 0.05) was not. Further heterogeneity parameters from histogram analysis (skewness, coefficient of variation, and 25th percentile) did not differ between responders and non-responders (p > 0.05). If the cut-off for the 75th percentile was set to an AP of 37.5 mL min(-1) 100 mL(-1), therapy response could be predicted with a sensitivity of 88% (7/8) and specificity of 75% (6/8). Voxel-wise histogram analysis of pretreatment CT perfusion indicating tumor heterogeneity of HCC improves the pretreatment prediction of response to TARE.
Umanodan, Tomokazu; Fukukura, Yoshihiko; Kumagae, Yuichi; Shindo, Toshikazu; Nakajo, Masatoyo; Takumi, Koji; Nakajo, Masanori; Hakamada, Hiroto; Umanodan, Aya; Yoshiura, Takashi
2017-04-01
To determine the diagnostic performance of apparent diffusion coefficient (ADC) histogram analysis in diffusion-weighted (DW) magnetic resonance imaging (MRI) for differentiating adrenal adenoma from pheochromocytoma. We retrospectively evaluated 52 adrenal tumors (39 adenomas and 13 pheochromocytomas) in 47 patients (21 men, 26 women; mean age, 59.3 years; range, 16-86 years) who underwent DW 3.0T MRI. Histogram parameters of ADC (b-values of 0 and 200 [ADC200], 0 and 400 [ADC400], and 0 and 800 s/mm(2) [ADC800]), namely mean, variance, coefficient of variation (CV), kurtosis, skewness, and entropy, were compared between adrenal adenomas and pheochromocytomas using the Mann-Whitney U-test. Receiver operating characteristic (ROC) curves for the histogram parameters were generated to differentiate adrenal adenomas from pheochromocytomas. Sensitivity and specificity were calculated by using a threshold criterion that would maximize the average of sensitivity and specificity. Variance and CV of ADC800 were significantly higher in pheochromocytomas than in adrenal adenomas (P < 0.001 and P = 0.001, respectively). With all b-value combinations, the entropy of ADC was significantly higher in pheochromocytomas than in adrenal adenomas (all P ≤ 0.001), and showed the highest area under the ROC curve among the ADC histogram parameters for diagnosing adrenal adenomas (ADC200, 0.82; ADC400, 0.87; and ADC800, 0.92), with sensitivity of 84.6% and specificity of 84.6% (cutoff, ≤2.82) with ADC200; sensitivity of 89.7% and specificity of 84.6% (cutoff, ≤2.77) with ADC400; and sensitivity of 94.9% and specificity of 92.3% (cutoff, ≤2.67) with ADC800. ADC histogram analysis of DW MRI can help differentiate adrenal adenoma from pheochromocytoma. Level of Evidence: 3. J. Magn. Reson. Imaging 2017;45:1195-1203. © 2016 International Society for Magnetic Resonance in Medicine.
Simultaneous F0-F1 modifications of Arabic for the improvement of natural-sounding
NASA Astrophysics Data System (ADS)
Ykhlef, F.; Bensebti, M.
2013-03-01
Pitch (F0) modification is one of the most important problems in the area of speech synthesis. Several techniques have been developed in the literature to achieve this goal. The main restrictions of these techniques are in the modification range and the synthesised speech quality, intelligibility and naturalness. The control of formants in a spoken language can significantly improve the naturalness of the synthesised speech. This improvement is mainly dependent on the control of the first formant (F1). Inspired by this observation, this article proposes a new approach that modifies both F0 and F1 of Arabic voiced sounds in order to improve the naturalness of the pitch-shifted speech. The developed strategy takes a parallel processing approach, in which the analysis segments are decomposed into sub-bands in the wavelet domain, modified in the desired sub-band by using a resampling technique, and reconstructed without affecting the remaining sub-bands. Pitch marking and voicing detection are performed in the frequency decomposition step based on the comparison of the multi-level approximation and detail signals. The performance of the proposed technique is evaluated by listening tests and compared to the pitch synchronous overlap and add (PSOLA) technique in the third approximation level. Experimental results have shown that manipulating F0 in conjunction with F1 in the wavelet domain yields more natural-sounding synthesised speech than the classical pitch modification technique. This improvement was appropriate for high pitch modifications.
Hamit, Murat; Yun, Weikang; Yan, Chuanbo; Kutluk, Abdugheni; Fang, Yang; Alip, Elzat
2015-06-01
Image feature extraction is an important part of image processing and an important field of research and application in image processing technology. Uygur medicine is a branch of traditional Chinese medicine and is attracting growing research attention. However, large amounts of Uygur medicine data have not been fully utilized. In this study, we extracted color histogram features from images of Xinjiang Uygur herbal and zooid medicines. First, we performed preprocessing, including image color enhancement, size normalization and color space transformation. Then we extracted color histogram features and analyzed them with statistical methods. Finally, we evaluated the classification ability of the features by Bayes discriminant analysis. Experimental results showed that high accuracy for Uygur medicine image classification was obtained by using color histogram features. This study should be helpful for content-based medical image retrieval of Xinjiang Uygur medicine.
LSAH: a fast and efficient local surface feature for point cloud registration
NASA Astrophysics Data System (ADS)
Lu, Rongrong; Zhu, Feng; Wu, Qingxiao; Kong, Yanzi
2018-04-01
Point cloud registration is a fundamental task in high-level three-dimensional applications. Noise, uneven point density and varying point cloud resolutions are the three main challenges for point cloud registration. In this paper, we design a robust and compact local surface descriptor called the Local Surface Angles Histogram (LSAH) and propose an effective coarse-to-fine algorithm for point cloud registration. The LSAH descriptor is formed by concatenating five normalized sub-histograms into one histogram. Each of the five sub-histograms is created by accumulating a different type of angle from the local surface patch. The experimental results show that our LSAH is more robust to uneven point density and varying point cloud resolutions than four state-of-the-art local descriptors in terms of feature matching. Moreover, we tested our LSAH-based coarse-to-fine algorithm for point cloud registration. The experimental results demonstrate that our algorithm is robust and efficient as well.
Felfer, Peter; Cairney, Julie
2018-06-01
Analysing the distribution of selected chemical elements with respect to interfaces is one of the most common tasks in data mining in atom probe tomography. This can be represented by 1D concentration profiles, 2D concentration maps or proximity histograms, which represent the concentration, density, etc. of selected species as a function of the distance from a reference surface/interface. These are some of the most useful tools for the analysis of solute distributions in atom probe data. In this paper, we present extensions to the proximity histogram in the form of 'local' proximity histograms, calculated for selected parts of a surface, and pseudo-2D concentration maps, which are 2D concentration maps calculated on non-flat surfaces. This way, local concentration changes at interfaces and other structures can be assessed more effectively. Copyright © 2018 Elsevier B.V. All rights reserved.
SU-E-T-178: Clinical Feasibility of Multi-Leaf Collimator Based Dynamic Wedge
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jeong, C; Kwak, J; Ahn, S
2015-06-15
Purpose: A multi-leaf collimator (MLC)-based dynamic wedge (MDW), which provides a dose profile similar to that of a physical wedge (PW) along the x-jaw direction while significantly reducing monitor units (MU), was developed and investigated for clinical use. Methods: A novel technique was used to create the wedge profile using the MLC. A modification was applied to the DICOM-RT format file of the plan made with the PW to replace the PW with the MDW. The Varian enhanced dynamic wedge profile was used to produce the MLC sequence, while the MU of the wedged field was recalculated using the PW factor and fluence map. The profiles for all possible MDWs to substitute PWs were verified in 6/15 MV x-ray irradiations. New plans with MDWs were compared with the original plans in 5 rectal, 5 RT breast and 5 liver cases. Results: The wedge profiles of the MDW fields matched well with those of the PWs inside the fields, with less scatter than the PWs outside the fields. For plan comparisons of the clinical cases, no significant dose discrepancy was observed between the MDW and PW plans in the dose-volume histograms. The maximum and mean doses in the PTVs agreed within 1.0%. The OAR results of the MDW plans were slightly improved in the maximum doses (3.22 ∼ 150.4 cGy) and the mean doses (17.18 ∼ 85.52 cGy) on average for all cases, while the prescribed doses were 45 Gy for rectal cases, 40 or 45 Gy for liver cases, and 50 Gy for breast cases. The MUs of the fields in which the PW was replaced with the MDW were reduced to 68% of those of the PW. Conclusion: We developed a novel MLC-based dynamic wedge technique that shows a clinical advantage compared to the PW.
Motion compensation in digital subtraction angiography using graphics hardware.
Deuerling-Zheng, Yu; Lell, Michael; Galant, Adam; Hornegger, Joachim
2006-07-01
An inherent disadvantage of digital subtraction angiography (DSA) is its sensitivity to patient motion, which causes artifacts in the subtraction images. These artifacts can often reduce the diagnostic value of the technique. Automated, fast and accurate motion compensation is therefore required. To cope with this requirement, we first examine a method explicitly designed to detect local motions in DSA. Then, we implement a motion compensation algorithm by means of block matching on modern graphics hardware. Both methods search for maximal local similarity by evaluating a histogram-based measure. In this context, we are the first to map an optimizing search strategy onto graphics hardware while parallelizing block matching. Moreover, we provide an innovative method for creating histograms on graphics hardware with vertex texturing and frame buffer blending. It turns out that both methods can effectively correct the artifacts in most cases, while the hardware implementation of block matching performs much faster: the displacements of two 1024 x 1024 images can be calculated at 3 frames/s with integer precision or 2 frames/s with sub-pixel precision. Preliminary clinical evaluation indicates that the computation with integer precision could already be sufficient.
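The following Python sketch illustrates block matching driven by a histogram-based similarity measure; here the score is the normalized peak height of the histogram of grey-level differences, which is only one plausible choice and not necessarily the measure used in the paper, and this CPU version ignores the GPU mapping that is the paper's main contribution.

```python
import numpy as np

def diff_hist_score(ref_block, cand_block, bins=64):
    """Score a candidate displacement by how concentrated the histogram of the
    grey-level differences is: a well-aligned block gives a difference image
    dominated by values near zero, i.e. a tall, narrow histogram peak."""
    diff = ref_block.astype(np.int16) - cand_block.astype(np.int16)
    hist, _ = np.histogram(diff, bins=bins, range=(-255, 256))
    return hist.max() / max(hist.sum(), 1)

def match_block(mask, live, y, x, size=32, search=8):
    """Exhaustive integer-precision search for the displacement of one block."""
    ref = mask[y:y + size, x:x + size]
    best, best_dydx = -1.0, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            ny, nx = y + dy, x + dx
            if ny < 0 or nx < 0 or ny + size > live.shape[0] or nx + size > live.shape[1]:
                continue
            cand = live[ny:ny + size, nx:nx + size]
            s = diff_hist_score(ref, cand)
            if s > best:
                best, best_dydx = s, (dy, dx)
    return best_dydx
```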
Redshift data and statistical inference
NASA Technical Reports Server (NTRS)
Newman, William I.; Haynes, Martha P.; Terzian, Yervant
1994-01-01
Frequency histograms and the 'power spectrum analysis' (PSA) method, the latter developed by Yu & Peebles (1969), have been widely employed as techniques for establishing the existence of periodicities. We provide a formal analysis of these two classes of methods, including controlled numerical experiments, to better understand their proper use and application. In particular, we note that typical published applications of frequency histograms commonly employ far greater numbers of class intervals or bins than is advisable by statistical theory, sometimes giving rise to the appearance of spurious patterns. The PSA method generates a sequence of random numbers from observational data which, it is claimed, is exponentially distributed with unit mean and variance, essentially independent of the distribution of the original data. We show that the derived random process is nonstationary and produces a small but systematic bias in the usual estimate of the mean and variance. Although the derived variable may be reasonably described by an exponential distribution, the tail of the distribution is far removed from that of an exponential, thereby rendering statistical inference and confidence testing based on the tail of the distribution completely unreliable. Finally, we examine a number of astronomical examples wherein these methods have been used, giving rise to widespread acceptance of statistically unconfirmed conclusions.
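The point about over-binned frequency histograms can be illustrated with a short Python sketch; the synthetic sample, the forced bin count of 200, and the use of the Freedman-Diaconis rule are illustrative assumptions, not the authors' prescription.

```python
import numpy as np

rng = np.random.default_rng(0)
z = rng.normal(0.03, 0.01, size=400)        # synthetic "redshift-like" sample

# Far too many class intervals: many empty bins and ragged spikes that can be
# mistaken for periodic structure.
overbinned, _ = np.histogram(z, bins=200)

# A theory-guided choice (here the Freedman-Diaconis rule built into numpy)
# keeps the bin count commensurate with the sample size.
counts, edges = np.histogram(z, bins='fd')
print(f"{len(counts)} bins by the Freedman-Diaconis rule vs {len(overbinned)} forced bins")
```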
Behavior Modification Techniques. I: Overcorrection. II: Punishment.
ERIC Educational Resources Information Center
Sovner, Robert, Ed.; Hurley, Anne Des Noyers, Ed.
1983-01-01
Two newsletters review the principles and application of two behavior modification techniques with mentally retarded persons: overcorrection and punishment. Overcorrection may be either restitutional, in which the client is made to restore the environment to a far better state than before the inappropriate behavior occurred, or positive practice…
Tan, Shan; Zhang, Hao; Zhang, Yongxue; Chen, Wengen; D’Souza, Warren D.; Lu, Wei
2013-01-01
Purpose: A family of fluorine-18 (18F)-fluorodeoxyglucose (18F-FDG) positron-emission tomography (PET) features based on histogram distances is proposed for predicting pathologic tumor response to neoadjuvant chemoradiotherapy (CRT). These features describe the longitudinal change of FDG uptake distribution within a tumor. Methods: Twenty patients with esophageal cancer treated with CRT plus surgery were included in this study. All patients underwent PET/CT scans before (pre-) and after (post-) CRT. The two scans were first rigidly registered, and the original tumor sites were then manually delineated on the pre-PET/CT by an experienced nuclear medicine physician. Two histograms representing the FDG uptake distribution were extracted from the pre- and the registered post-PET images, respectively, both within the delineated tumor. Distances between the two histograms quantify longitudinal changes in FDG uptake distribution resulting from CRT, and thus are potential predictors of tumor response. A total of 19 histogram distances were examined and compared to both traditional PET response measures and Haralick texture features. Receiver operating characteristic analyses and Mann-Whitney U test were performed to assess their predictive ability. Results: Among all tested histogram distances, seven bin-to-bin and seven crossbin distances outperformed traditional PET response measures using maximum standardized uptake value (AUC = 0.70) or total lesion glycolysis (AUC = 0.80). The seven bin-to-bin distances were: L2 distance (AUC = 0.84), χ2 distance (AUC = 0.83), intersection distance (AUC = 0.82), cosine distance (AUC = 0.83), squared Euclidean distance (AUC = 0.83), L1 distance (AUC = 0.82), and Jeffrey distance (AUC = 0.82). The seven crossbin distances were: quadratic-chi distance (AUC = 0.89), earth mover distance (AUC = 0.86), fast earth mover distance (AUC = 0.86), diffusion distance (AUC = 0.88), Kolmogorov-Smirnov distance (AUC = 0.88), quadratic form distance (AUC = 0.87), and match distance (AUC = 0.84). These crossbin histogram distance features showed slightly higher prediction accuracy than texture features on post-PET images. Conclusions: The results suggest that longitudinal patterns in 18F-FDG uptake characterized using histogram distances provide useful information for predicting the pathologic response of esophageal cancer to CRT. PMID:24089897
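A minimal Python sketch of a few of the bin-to-bin distances listed above, computed between two normalized uptake histograms; the helper names and the 1-D earth mover's distance shortcut (L1 distance between cumulative sums) are illustrative and assume equally spaced bins.

```python
import numpy as np

def normalize(h):
    h = np.asarray(h, dtype=float)
    return h / max(h.sum(), 1e-12)

def histogram_distances(h1, h2):
    """Distances between two normalized FDG-uptake histograms (pre- and post-CRT)."""
    p, q = normalize(h1), normalize(h2)
    eps = 1e-12
    return {
        "L1":           np.abs(p - q).sum(),
        "L2":           np.sqrt(((p - q) ** 2).sum()),
        "chi2":         (((p - q) ** 2) / (p + q + eps)).sum(),
        "intersection": 1.0 - np.minimum(p, q).sum(),           # as a distance
        "cosine":       1.0 - (p @ q) / (np.linalg.norm(p) * np.linalg.norm(q) + eps),
        # Cross-bin example: for 1-D histograms with unit ground distance, the
        # earth mover's distance reduces to the L1 distance of cumulative sums.
        "emd_1d":       np.abs(np.cumsum(p) - np.cumsum(q)).sum(),
    }
```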
Wang, G J; Wang, Y; Ye, Y; Chen, F; Lu, Y T; Li, S L
2017-11-07
Objective: To investigate the features of apparent diffusion coefficient (ADC) histogram parameters based on entire tumor volume data in high-resolution diffusion-weighted imaging of nasopharyngeal carcinoma (NPC) and to evaluate their correlations with cancer stages. Methods: This retrospective study included 154 NPC patients [102 males and 52 females, mean age (48±11) years] who had received readout segmentation of long variable echo trains MRI scans before radiation therapy. The tumor area was delineated on each section of the axial ADC maps to generate an ADC histogram by using ImageJ. The ADC histogram of the entire tumor, along with the histogram parameters (the number of tumor voxels, ADC(mean), ADC(25%), ADC(50%), ADC(75%), skewness and kurtosis), was obtained by merging all sections with SPSS 22.0 software. Intra-observer repeatability was assessed by using intra-class correlation coefficients (ICC). The patients were subdivided into two groups according to cancer volume: a small cancer group (<305 voxels, about 2 cm(3)) and a large cancer group (≥2 cm(3)). The correlation between ADC histogram parameters and cancer stages was evaluated with the Spearman test. Results: The ICCs for measuring the ADC histogram parameters tumor voxels, ADC(mean), ADC(25%), ADC(50%), ADC(75%), skewness, and kurtosis were 0.938, 0.861, 0.885, 0.838, 0.836, 0.358 and 0.456, respectively. The number of tumor voxels was positively correlated with T staging (r = 0.368, P < 0.05). There were significant differences in tumor voxels among patients with different T stages (K = 22.306, P < 0.05). There were significant differences in ADC(mean), ADC(25%) and ADC(50%) among patients with different T stages in the small cancer group (K = 8.409, 8.187, 8.699, all P < 0.05), and these three indices were positively correlated with T staging (r = 0.221, 0.209, 0.235, all P < 0.05). Skewness and kurtosis differed significantly between the groups with different cancer volumes (t = -2.987, Z = -3.770, both P < 0.05). Conclusion: Tumor volume and tissue uniformity of NPC are important factors affecting ADC and cancer stages; the ADC histogram parameters ADC(mean), ADC(25%) and ADC(50%) increase with T staging in NPC smaller than 2 cm(3).
Techniques of laparoscopic cholecystectomy: Nomenclature and selection.
Haribhakti, Sanjiv P; Mistry, Jitendra H
2015-01-01
There are more than 50 different techniques of laparoscopic cholecystectomy (LC) available in the literature, mainly due to modifications by surgeons aiming to improve postoperative outcome and cosmesis. These modifications include a reduction in port size and/or number compared with standard LC. There is no uniform nomenclature to describe these different techniques, so it is not possible to compare their outcomes. We briefly describe the advantages and disadvantages of each of these techniques and suggest the situations where a particular technique would be useful. We also propose a nomenclature that is easy to remember and apply, so that future comparisons between techniques will be possible.
Approximate techniques of structural reanalysis
NASA Technical Reports Server (NTRS)
Noor, A. K.; Lowder, H. E.
1974-01-01
A study is made of two approximate techniques for structural reanalysis. These include Taylor series expansions for response variables in terms of design variables and the reduced-basis method. In addition, modifications to these techniques are proposed to overcome some of their major drawbacks. The modifications include a rational approach to the selection of the reduced-basis vectors and the use of the Taylor series approximation in an iterative process. For the reduced basis, a normalized set of vectors is chosen which consists of the original analyzed design and the first-order sensitivity analysis vectors. The use of the Taylor series approximation as a first (initial) estimate in an iterative process can lead to significant improvements in accuracy, even with one iteration cycle. Therefore, the range of applicability of the reanalysis technique can be extended. Numerical examples are presented which demonstrate the gain in accuracy obtained by using the proposed modification techniques, for a wide range of variations in the design variables.
Mainstreaming: Sharing Ideas, Strategies, Materials, Techniques.
ERIC Educational Resources Information Center
Hillside School, Cushing, OK.
The manual provides teaching approaches based on a model of least to highest modification of instruction, which may be used for a continuum of special education placements ranging from regular classroom through hospital settings. The first section on adaptive techniques (requiring the least modification) includes suggestions to adjust time for…
Two Student Self-Management Techniques Applied to Data-Based Program Modification.
ERIC Educational Resources Information Center
Wesson, Caren
Two student self-management techniques, student charting and student selection of instructional activities, were applied to ongoing data-based program modification. Forty-two elementary school resource room students were assigned randomly (within teacher) to one of three treatment conditions: Teacher Chart-Teacher Select Instructional Activities…
The ISI distribution of the stochastic Hodgkin-Huxley neuron.
Rowat, Peter F; Greenwood, Priscilla E
2014-01-01
The simulation of ion-channel noise has an important role in computational neuroscience. In recent years several approximate methods of carrying out this simulation have been published, based on stochastic differential equations, and all giving slightly different results. The obvious, and essential, question is: which method is the most accurate and which is most computationally efficient? Here we make a contribution to the answer. We compare interspike interval histograms from simulated data using four different approximate stochastic differential equation (SDE) models of the stochastic Hodgkin-Huxley neuron, as well as the exact Markov chain model simulated by the Gillespie algorithm. One of the recent SDE models is the same as the Kurtz approximation first published in 1978. All the models considered give similar ISI histograms over a wide range of deterministic and stochastic input. Three features of these histograms are an initial peak, followed by one or more bumps, and then an exponential tail. We explore how these features depend on deterministic input and on level of channel noise, and explain the results using the stochastic dynamics of the model. We conclude with a rough ranking of the four SDE models with respect to the similarity of their ISI histograms to the histogram of the exact Markov chain model.
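A minimal Python sketch of how an interspike-interval histogram could be extracted from a simulated voltage trace; the threshold-crossing spike definition, bin count, and interval range are illustrative assumptions.

```python
import numpy as np

def isi_histogram(v, dt, threshold=0.0, bins=60, t_max=60.0):
    """Interspike-interval histogram from a simulated membrane-potential trace.

    Spike times are taken as upward crossings of `threshold`; intervals are
    the differences between successive spike times (same time unit as dt).
    """
    v = np.asarray(v)
    crossings = np.flatnonzero((v[:-1] < threshold) & (v[1:] >= threshold))
    spike_times = (crossings + 1) * dt
    isi = np.diff(spike_times)
    counts, edges = np.histogram(isi, bins=bins, range=(0.0, t_max))
    return counts, edges
```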
Wang, Hai-yi; Su, Zi-hua; Xu, Xiao; Sun, Zhi-peng; Duan, Fei-xue; Song, Yuan-yuan; Li, Lu; Wang, Ying-wei; Ma, Xin; Guo, Ai-tao; Ma, Lin; Ye, Hui-yi
2016-01-01
Pharmacokinetic parameters derived from dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) have been increasingly used to evaluate the permeability of tumor vessels. Histogram metrics are a recognized promising method of quantitative MR imaging that has recently been introduced in the analysis of DCE-MRI pharmacokinetic parameters in oncology due to tumor heterogeneity. In this study, 21 patients with renal cell carcinoma (RCC) underwent paired DCE-MRI studies on a 3.0 T MR system. The extended Tofts model and a population-based arterial input function were used to calculate kinetic parameters of RCC tumors. The Mean value and histogram metrics (Mode, Skewness and Kurtosis) of each pharmacokinetic parameter were generated automatically using ImageJ software. Intra- and inter-observer reproducibility and scan-rescan reproducibility were evaluated using intra-class correlation coefficients (ICCs) and the coefficient of variation (CoV). Our results demonstrated that the histogram method (Mode, Skewness and Kurtosis) was not superior to the conventional Mean value method in the reproducibility evaluation of DCE-MRI pharmacokinetic parameters (Ktrans and Ve) in renal cell carcinoma, especially for Skewness and Kurtosis, which showed lower intra-observer, inter-observer and scan-rescan reproducibility than the Mean value. Our findings suggest that additional studies are necessary before wide incorporation of histogram metrics in the quantitative analysis of DCE-MRI pharmacokinetic parameters. PMID:27380733
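A minimal Python sketch of the histogram metrics named above (Mean, Mode, Skewness, Kurtosis) for one pharmacokinetic parameter map; the bin count and function name are illustrative assumptions, not the ImageJ workflow used in the study.

```python
import numpy as np
from scipy import stats

def histogram_metrics(param_values, n_bins=64):
    """Mean value plus histogram metrics (mode, skewness, kurtosis) for one
    pharmacokinetic parameter restricted to the tumor ROI; `param_values` is a
    1-D array of voxel values (e.g. Ktrans inside the ROI)."""
    x = np.asarray(param_values, dtype=float)
    counts, edges = np.histogram(x, bins=n_bins)
    peak = np.argmax(counts)
    mode = 0.5 * (edges[peak] + edges[peak + 1])   # center of the tallest bin
    return {
        "mean":     float(x.mean()),
        "mode":     float(mode),
        "skewness": float(stats.skew(x)),
        "kurtosis": float(stats.kurtosis(x)),
    }
```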
Hu, Yu-Chuan; Li, Gang; Yang, Yang; Han, Yu; Sun, Ying-Zhi; Liu, Zhi-Cheng; Tian, Qiang; Han, Zi-Yang; Liu, Le-De; Hu, Bin-Quan; Qiu, Zi-Yu; Wang, Wen; Cui, Guang-Bin
2017-01-01
Current machine learning techniques provide the opportunity to develop noninvasive and automated glioma grading tools, by utilizing quantitative parameters derived from multi-modal magnetic resonance imaging (MRI) data. However, the efficacies of different machine learning methods in glioma grading have not been investigated. A comprehensive comparison of varied machine learning methods in differentiating low-grade gliomas (LGGs) and high-grade gliomas (HGGs) as well as WHO grade II, III and IV gliomas based on multi-parametric MRI images was proposed in the current study. The parametric histogram and image texture attributes of 120 glioma patients were extracted from the perfusion, diffusion and permeability parametric maps of preoperative MRI. Then, 25 commonly used machine learning classifiers combined with 8 independent attribute selection methods were applied and evaluated using a leave-one-out cross validation (LOOCV) strategy. Besides, the influences of parameter selection on the classifying performances were investigated. We found that the support vector machine (SVM) exhibited superior performance to other classifiers. By combining all tumor attributes with the synthetic minority over-sampling technique (SMOTE), the highest classifying accuracy of 0.945 or 0.961 for LGG and HGG or grade II, III and IV gliomas was achieved. Application of the Recursive Feature Elimination (RFE) attribute selection strategy further improved the classifying accuracies. Besides, the performances of the LibSVM, SMO, and IBk classifiers were influenced by some key parameters such as kernel type, c, gamma, K, etc. SVM is a promising tool in developing an automated preoperative glioma grading system, especially when combined with the RFE strategy. Model parameters should be considered in glioma grading model optimization. PMID:28599282
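A minimal scikit-learn sketch of the kind of pipeline discussed above (attribute scaling, RFE-based attribute selection, an SVM classifier, and LOOCV); the placeholder data, feature counts, and SVM parameters are assumptions, and the SMOTE over-sampling step used in the study is omitted here.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.feature_selection import RFE
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# X: (n_patients, n_attributes) histogram/texture attributes, y: 0 = LGG, 1 = HGG.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 30))               # placeholder attributes
y = rng.integers(0, 2, size=40)             # placeholder labels

model = make_pipeline(
    StandardScaler(),
    RFE(SVC(kernel="linear", C=1.0), n_features_to_select=10),  # attribute selection
    SVC(kernel="rbf", C=1.0, gamma="scale"),                     # final classifier
)
acc = cross_val_score(model, X, y, cv=LeaveOneOut()).mean()      # LOOCV accuracy
print(f"LOOCV accuracy: {acc:.3f}")
```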
Perroud, Thomas D.; Bokoch, Michael P.; Zare, Richard N.
2005-01-01
We apply the photon counting histogram (PCH) model, a fluorescence technique with single-molecule sensitivity, to study pH-induced conformational changes of cytochrome c. PCH is able to distinguish different protein conformations based on the brightness of a fluorophore sensitive to its local environment. We label cytochrome c through its single free cysteine with tetramethylrhodamine-5-maleimide (TMR), a fluorophore with specific brightnesses that we associate with specific protein conformations. Ensemble measurements demonstrate two different fluorescence responses with increasing pH: (i) a decrease in fluorescence intensity caused by the alkaline transition of cytochrome c (pH 7.0–9.5), and (ii) an increase in intensity when the protein unfolds (pH 9.5–10.8). The magnitudes of these two responses depend strongly on the molar ratio of TMR used to label cytochrome c. Using PCH we determine that this effect arises from the proportion of a nonfunctional conformation in the sample, which can be differentiated from the functional conformation. We further determine the causes of each ensemble fluorescence response: (i) during the alkaline transition, the fluorophore enters a dark state and discrete conformations are observed, and (ii) as cytochrome c unfolds, the fluorophore incrementally brightens, but discrete conformations are no longer resolved. Moreover, we also show that functional TMR-cytochrome c undergoes a response of identical magnitude regardless of the proportion of nonfunctional protein in the sample. As expected for a technique with single-molecule sensitivity, we demonstrate that PCH can directly observe the most relevant conformation, unlike ensemble fluorometry. PMID:16314563
Zhang, Xin; Yan, Lin-Feng; Hu, Yu-Chuan; Li, Gang; Yang, Yang; Han, Yu; Sun, Ying-Zhi; Liu, Zhi-Cheng; Tian, Qiang; Han, Zi-Yang; Liu, Le-De; Hu, Bin-Quan; Qiu, Zi-Yu; Wang, Wen; Cui, Guang-Bin
2017-07-18
Current machine learning techniques provide the opportunity to develop noninvasive and automated glioma grading tools, by utilizing quantitative parameters derived from multi-modal magnetic resonance imaging (MRI) data. However, the efficacies of different machine learning methods in glioma grading have not been investigated. A comprehensive comparison of varied machine learning methods in differentiating low-grade gliomas (LGGs) and high-grade gliomas (HGGs) as well as WHO grade II, III and IV gliomas based on multi-parametric MRI images was proposed in the current study. The parametric histogram and image texture attributes of 120 glioma patients were extracted from the perfusion, diffusion and permeability parametric maps of preoperative MRI. Then, 25 commonly used machine learning classifiers combined with 8 independent attribute selection methods were applied and evaluated using a leave-one-out cross validation (LOOCV) strategy. Besides, the influences of parameter selection on the classifying performances were investigated. We found that the support vector machine (SVM) exhibited superior performance to other classifiers. By combining all tumor attributes with the synthetic minority over-sampling technique (SMOTE), the highest classifying accuracy of 0.945 or 0.961 for LGG and HGG or grade II, III and IV gliomas was achieved. Application of the Recursive Feature Elimination (RFE) attribute selection strategy further improved the classifying accuracies. Besides, the performances of the LibSVM, SMO, and IBk classifiers were influenced by some key parameters such as kernel type, c, gamma, K, etc. SVM is a promising tool in developing an automated preoperative glioma grading system, especially when combined with the RFE strategy. Model parameters should be considered in glioma grading model optimization.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, T; Yu, D; Beitler, J
Purpose: Xerostomia (dry mouth), secondary to parotid-gland injury, is a distressing side-effect in head-and-neck radiotherapy (RT). This study's purpose is to develop a novel ultrasound technique to quantitatively evaluate post-RT parotid-gland injury. Methods: Recent ultrasound studies have shown that healthy parotid glands exhibit homogeneous echotexture, whereas post-RT parotid glands are often heterogeneous, with multiple hypoechoic (inflammation) or hyperechoic (fibrosis) regions. We propose to use a Gaussian mixture model to analyze the ultrasonic echo-histogram of the parotid glands. An IRB-approved clinical study was conducted: (1) control group: 13 healthy volunteers, who served as the control; (2) acute-toxicity group: 20 patients (mean age: 62.5 ± 8.9 years, follow-up: 2.0 ± 0.8 months); and (3) late-toxicity group: 18 patients (mean age: 60.7 ± 7.3 years, follow-up: 20.1 ± 10.4 months). All patients experienced RTOG grade 1 or 2 salivary-gland toxicity. Each participant underwent an ultrasound scan (10 MHz) of the bilateral parotid glands. An echo-intensity histogram was derived for each parotid, and a Gaussian mixture model was used to fit the histogram using the expectation-maximization (EM) algorithm. The quality of the fitting was evaluated with the R-squared value. Results: (1) Control group: all parotid glands fitted well with one Gaussian component, with a mean intensity of 79.8 ± 4.9 (R-squared > 0.96). (2) Acute-toxicity group: 37 of the 40 post-RT parotid glands fitted well with two Gaussian components, with mean intensities of 42.9 ± 7.4 and 73.3 ± 12.2 (R-squared > 0.95). (3) Late-toxicity group: 32 of the 36 post-RT parotid glands fitted well with three Gaussian components, with mean intensities of 49.7 ± 7.6, 77.2 ± 8.7, and 118.6 ± 11.8 (R-squared > 0.98). Conclusion: RT-associated parotid-gland injury is common in head-and-neck RT, but challenging to assess. This work has demonstrated that the Gaussian mixture model of the echo-histogram could quantify acute and late toxicity of the parotid glands. This study provides meaningful preliminary data for future observational and interventional clinical research.
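A minimal Python sketch of fitting Gaussian mixtures to the echo-intensity samples of a parotid gland, in the spirit of the EM-based histogram fitting described above; the use of scikit-learn's GaussianMixture and BIC-based model ordering is an assumption, since the study selected the number of components by the R-squared of the histogram fit.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def fit_echo_mixture(intensities, max_components=3):
    """Fit 1-, 2- and 3-component Gaussian mixtures (EM under the hood) to the
    echo-intensity samples of one parotid gland and report the sorted component
    means plus the BIC for each model order."""
    x = np.asarray(intensities, dtype=float).reshape(-1, 1)
    fits = {}
    for k in range(1, max_components + 1):
        gm = GaussianMixture(n_components=k, random_state=0).fit(x)
        fits[k] = {"means": np.sort(gm.means_.ravel()), "bic": gm.bic(x)}
    return fits

# Usage: fits = fit_echo_mixture(echo_samples)   # echo_samples: 1-D array of pixel intensities
```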
Lin, Ying-Ting
2013-04-30
A tandem technique based on hardware instrumentation is often used for the chemical analysis of a single cell, first isolating and then detecting the wanted identities. The first part is the separation of wanted chemicals from the bulk of a cell; the second part is the actual detection of the important identities. To identify the key structural modifications around ligand binding, the present study aims to develop a counterpart tandem technique for cheminformatics. A statistical regression and its outliers act as a computational technique for separation. A PPARγ (peroxisome proliferator-activated receptor gamma) agonist cellular system was subjected to such an investigation. Results show that this tandem regression-outlier analysis, or the prioritization of the context equations tagged with features of the outliers, is an effective cheminformatics regression technique for detecting key structural modifications, as well as their tendency to impact ligand binding. The key structural modifications around ligand binding are effectively extracted or characterized out of cellular reactions. This is because molecular binding is the paramount factor in such a ligand-cellular system, and key structural modifications around ligand binding are expected to create outliers. Therefore, such outliers can be captured by this tandem regression-outlier analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luo, Y; Liao, Z; Jiang, W
Purpose: To evaluate the feasibility of using an automatic segmentation tool to delineate cardiac substructures from computed tomography (CT) images for cardiac toxicity analysis for non-small cell lung cancer (NSCLC) patients after radiotherapy. Methods: A multi-atlas segmentation tool developed in-house was used to delineate eleven cardiac substructures, including the whole heart, four heart chambers, and six greater vessels, automatically from the averaged 4DCT planning images for 49 NSCLC patients. The automatically segmented contours were edited appropriately by two experienced radiation oncologists. The modified contours were compared with the auto-segmented contours using the Dice similarity coefficient (DSC) and mean surface distance (MSD) to evaluate how much modification was needed. In addition, the dose-volume histograms (DVH) of the modified contours were compared with those of the auto-segmented contours to evaluate the dosimetric difference between them. Results: Of the eleven structures, the averaged DSC values ranged from 0.73 ± 0.08 to 0.95 ± 0.04 and the averaged MSD values ranged from 1.3 ± 0.6 mm to 2.9 ± 5.1 mm for the 49 patients. Overall, the modifications were small. The pulmonary vein (PV) and the inferior vena cava required the most modifications. The V30 (volume receiving 30 Gy or above) for the whole heart and the mean dose to the whole heart and four heart chambers did not show statistically significant differences between the modified and auto-segmented contours. The maximum dose to the greater vessels did not show statistically significant differences except for the PV. Conclusion: The automatic segmentation of the cardiac substructures did not require substantial modification. The dosimetric evaluation showed no statistically significant difference between auto-segmented and modified contours except for the PV, which suggests that auto-segmented contours for the cardiac dose response study are feasible in clinical practice with a minor modification to the PV.
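A minimal Python sketch of the two agreement measures named above, the Dice similarity coefficient and a symmetric mean surface distance for binary masks; the surface extraction via binary erosion and the distance-transform approach are one common implementation, not necessarily the one used in the study.

```python
import numpy as np
from scipy.ndimage import binary_erosion, distance_transform_edt

def dice(a, b):
    """Dice similarity coefficient between two binary masks."""
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

def mean_surface_distance(a, b, spacing=None):
    """Symmetric mean surface distance between two non-empty binary masks;
    pass voxel spacing (e.g. (dz, dy, dx) in mm) to get physical units."""
    a, b = a.astype(bool), b.astype(bool)
    surf_a = a & ~binary_erosion(a)                 # boundary voxels of each mask
    surf_b = b & ~binary_erosion(b)
    dist_to_b = distance_transform_edt(~surf_b, sampling=spacing)
    dist_to_a = distance_transform_edt(~surf_a, sampling=spacing)
    d_ab = dist_to_b[surf_a]                        # distances from surface A to surface B
    d_ba = dist_to_a[surf_b]                        # and vice versa
    return (d_ab.sum() + d_ba.sum()) / (len(d_ab) + len(d_ba))
```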
Zhou, Nan; Guo, Tingting; Zheng, Huanhuan; Pan, Xia; Chu, Chen; Dou, Xin; Li, Ming; Liu, Song; Zhu, Lijing; Liu, Baorui; Chen, Weibo; He, Jian; Yan, Jing; Zhou, Zhengyang; Yang, Xiaofeng
2017-01-01
We investigated apparent diffusion coefficient (ADC) histogram analysis to evaluate radiation-induced parotid damage and predict xerostomia degrees in nasopharyngeal carcinoma (NPC) patients receiving radiotherapy. The imaging of bilateral parotid glands in NPC patients was conducted 2 weeks before radiotherapy (time point 1), one month after radiotherapy (time point 2), and four months after radiotherapy (time point 3). From time point 1 to 2, parotid volume, skewness, and kurtosis decreased (P < 0.001, = 0.001, and < 0.001, respectively), but all other ADC histogram parameters increased (all P < 0.001, except P = 0.006 for standard deviation [SD]). From time point 2 to 3, parotid volume continued to decrease (P = 0.022), and SD, 75th and 90th percentiles continued to increase (P = 0.024, 0.010, and 0.006, respectively). Early change rates of parotid ADCmean, ADCmin, kurtosis, and 25th, 50th, 75th, 90th percentiles (from time point 1 to 2) correlated with late parotid atrophy rate (from time point 1 to 3) (all P < 0.05). Multiple linear regression analysis revealed correlations among parotid volume, time point, and ADC histogram parameters. Early mean change rates for bilateral parotid SD and ADCmax could predict late xerostomia degrees at seven months after radiotherapy (three months after time point 3) with AUC of 0.781 and 0.818 (P = 0.014, 0.005, respectively). ADC histogram parameters were reproducible (intraclass correlation coefficient, 0.830 - 0.999). ADC histogram analysis could be used to evaluate radiation-induced parotid damage noninvasively, and predict late xerostomia degrees of NPC patients treated with radiotherapy. PMID:29050274
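A minimal Python sketch of the ADC histogram parameters and early change rates referred to above; the function names are illustrative, and the voxel values are assumed to be supplied as a flat array from the delineated parotid gland.

```python
import numpy as np
from scipy import stats

def adc_histogram_params(adc_voxels):
    """Histogram parameters of one parotid gland's ADC voxels."""
    x = np.asarray(adc_voxels, dtype=float)
    p25, p50, p75, p90 = np.percentile(x, [25, 50, 75, 90])
    return {
        "mean": x.mean(), "min": x.min(), "max": x.max(), "sd": x.std(ddof=1),
        "p25": p25, "p50": p50, "p75": p75, "p90": p90,
        "skewness": stats.skew(x), "kurtosis": stats.kurtosis(x),
    }

def early_change_rate(value_tp1, value_tp2):
    """Relative change of a parameter from time point 1 to time point 2,
    e.g. (ADCmean at tp2 - ADCmean at tp1) / ADCmean at tp1."""
    return (value_tp2 - value_tp1) / value_tp1
```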
Histogram Analysis of Diffusion Tensor Imaging Parameters in Pediatric Cerebellar Tumors.
Wagner, Matthias W; Narayan, Anand K; Bosemani, Thangamadhan; Huisman, Thierry A G M; Poretti, Andrea
2016-05-01
Apparent diffusion coefficient (ADC) values have been shown to assist in differentiating cerebellar pilocytic astrocytomas and medulloblastomas. Previous studies have applied only ADC measurements and calculated the mean/median values. Here we investigated the value of diffusion tensor imaging (DTI) histogram characteristics of the entire tumor for differentiation of cerebellar pilocytic astrocytomas and medulloblastomas. Presurgical DTI data were analyzed with a region of interest (ROI) approach to include the entire tumor. For each tumor, histogram-derived metrics including the 25th percentile, 75th percentile, and skewness were calculated for fractional anisotropy (FA) and mean (MD), axial (AD), and radial (RD) diffusivity. The histogram metrics were used as primary predictors of interest in a logistic regression model. Statistical significance levels were set at p < .01. The study population included 17 children with pilocytic astrocytoma and 16 with medulloblastoma (mean age, 9.21 ± 5.18 years and 7.66 ± 4.97 years, respectively). Compared to children with medulloblastoma, children with pilocytic astrocytoma showed higher MD (P = .003 and P = .008), AD (P = .004 and P = .007), and RD (P = .003 and P = .009) values for the 25th and 75th percentile. In addition, histogram skewness showed statistically significant differences for MD between low- and high-grade tumors (P = .008). The 25th percentile for MD yields the best results for the presurgical differentiation between pediatric cerebellar pilocytic astrocytomas and medulloblastomas. The analysis of other DTI metrics does not provide additional diagnostic value. Our study confirms the diagnostic value of the quantitative histogram analysis of DTI data in pediatric neuro-oncology. Copyright © 2015 by the American Society of Neuroimaging.
Lin, Yuning; Li, Hui; Chen, Ziqian; Ni, Ping; Zhong, Qun; Huang, Huijuan; Sandrasegaran, Kumar
2015-05-01
The purpose of this study was to investigate the application of histogram analysis of apparent diffusion coefficient (ADC) in characterizing pathologic features of cervical cancer and benign cervical lesions. This prospective study was approved by the institutional review board, and written informed consent was obtained. Seventy-three patients with cervical cancer (33-69 years old; 35 patients with International Federation of Gynecology and Obstetrics stage IB cervical cancer) and 38 patients (38-61 years old) with normal cervix or cervical benign lesions (control group) were enrolled. All patients underwent 3-T diffusion-weighted imaging (DWI) with b values of 0 and 800 s/mm(2). ADC values of the entire tumor in the patient group and the whole cervix volume in the control group were assessed. Mean ADC, median ADC, 25th and 75th percentiles of ADC, skewness, and kurtosis were calculated. Histogram parameters were compared between different pathologic features, as well as between stage IB cervical cancer and control groups. Mean ADC, median ADC, and 25th percentile of ADC were significantly higher for adenocarcinoma (p = 0.021, 0.006, and 0.004, respectively), and skewness was significantly higher for squamous cell carcinoma (p = 0.011). Median ADC was statistically significantly higher for well or moderately differentiated tumors (p = 0.044), and skewness was statistically significantly higher for poorly differentiated tumors (p = 0.004). No statistically significant difference of ADC histogram was observed between lymphovascular space invasion subgroups. All histogram parameters differed significantly between stage IB cervical cancer and control groups (p < 0.05). Distribution of ADCs characterized by histogram analysis may help to distinguish early-stage cervical cancer from normal cervix or cervical benign lesions and may be useful for evaluating the different pathologic features of cervical cancer.
Bao, Shixing; Watanabe, Yoshiyuki; Takahashi, Hiroto; Tanaka, Hisashi; Arisawa, Atsuko; Matsuo, Chisato; Wu, Rongli; Fujimoto, Yasunori; Tomiyama, Noriyuki
2018-05-31
This study aimed to determine whether whole-tumor histogram analysis of normalized cerebral blood volume (nCBV) and apparent diffusion coefficient (ADC) for contrast-enhancing lesions can be used to differentiate between glioblastoma (GBM) and primary central nervous system lymphoma (PCNSL). Twenty patients, 9 with PCNSL and 11 with GBM without any hemorrhagic lesions, underwent MRI, including diffusion-weighted imaging and dynamic susceptibility contrast perfusion-weighted imaging, before surgery. Histogram analysis of nCBV and ADC from whole-tumor voxels in contrast-enhancing lesions was performed. An unpaired t-test was used to compare the mean values for each type of tumor. A multivariate logistic regression model (LRM) was built to classify GBM and PCNSL using the best parameters of ADC and nCBV. All nCBV histogram parameters of GBMs were larger than those of PCNSLs, but only the average nCBV was statistically significant after Bonferroni correction. Meanwhile, ADC histogram parameters were also larger in GBM compared to those in PCNSL, but these differences were not statistically significant. According to receiver operating characteristic curve analysis, the nCBV average and ADC 25th percentile demonstrated the largest areas under the curve, with values of 0.869 and 0.838, respectively. The LRM combining these two parameters differentiated between GBM and PCNSL with a higher area under the curve value (Logit(P) = -21.12 + 10.00 × ADC 25th percentile (10(-3) mm(2)/s) + 5.420 × nCBV mean, P < 0.001). Our results suggest that whole-tumor histogram analysis of nCBV and ADC combined can be a valuable objective diagnostic method for differentiating between GBM and PCNSL.
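For illustration, the reported logistic model can be applied directly to a new case as follows; the input values in the usage comment are hypothetical, and the orientation of the probability (toward GBM or PCNSL) follows the authors' convention, which is not restated in the abstract.

```python
import math

def combined_lrm_probability(adc_p25, ncbv_mean):
    """Probability-style score from the logistic model reported in the abstract:
    Logit(P) = -21.12 + 10.00 * ADC 25th percentile (10^-3 mm^2/s) + 5.420 * nCBV mean.
    Used here only to show how the two histogram parameters are combined."""
    logit = -21.12 + 10.00 * adc_p25 + 5.420 * ncbv_mean
    return 1.0 / (1.0 + math.exp(-logit))

# Usage (hypothetical inputs): combined_lrm_probability(adc_p25=0.85, ncbv_mean=3.2)
```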
Hempel, Johann-Martin; Schittenhelm, Jens; Brendle, Cornelia; Bender, Benjamin; Bier, Georg; Skardelly, Marco; Tabatabai, Ghazaleh; Castaneda Vega, Salvador; Ernemann, Ulrike; Klose, Uwe
2017-10-01
To assess the diagnostic performance of histogram analysis of diffusion kurtosis imaging (DKI) maps for in vivo assessment of the 2016 World Health Organization Classification of Tumors of the Central Nervous System (2016 CNS WHO) integrated glioma grades. Seventy-seven patients with histopathologically confirmed glioma who provided written informed consent were retrospectively assessed between 01/2014 and 03/2017 from a prospective trial approved by the local institutional review board. Ten histogram parameters of mean kurtosis (MK) and mean diffusivity (MD) metrics from DKI were independently assessed by two blinded physicians from a volume of interest around the entire solid tumor. One-way ANOVA was used to compare MK and MD histogram parameter values between 2016 CNS WHO-based tumor grades. Receiver operating characteristic analysis was performed on MK and MD histogram parameters for significant results. The 25th, 50th, 75th, and 90th percentiles of MK and average MK showed significant differences between IDH1/2 wild-type gliomas, IDH1/2 mutated gliomas, and oligodendrogliomas with chromosome 1p/19q loss of heterozygosity and IDH1/2 mutation (p<0.001). The 50th, 75th, and 90th percentiles showed a slightly higher diagnostic performance (area under the curve [AUC] range, 0.868-0.991) than average MK (AUC range, 0.855-0.988) in classifying glioma according to the integrated approach of the 2016 CNS WHO. Histogram analysis of DKI can stratify gliomas according to the integrated approach of the 2016 CNS WHO. The 50th (median), 75th, and 90th percentiles showed the highest diagnostic performance. However, the average MK is also robust and feasible in routine clinical practice. Copyright © 2017 Elsevier B.V. All rights reserved.
Zhou, Nan; Guo, Tingting; Zheng, Huanhuan; Pan, Xia; Chu, Chen; Dou, Xin; Li, Ming; Liu, Song; Zhu, Lijing; Liu, Baorui; Chen, Weibo; He, Jian; Yan, Jing; Zhou, Zhengyang; Yang, Xiaofeng
2017-09-19
We investigated apparent diffusion coefficient (ADC) histogram analysis to evaluate radiation-induced parotid damage and predict xerostomia degrees in nasopharyngeal carcinoma (NPC) patients receiving radiotherapy. The imaging of bilateral parotid glands in NPC patients was conducted 2 weeks before radiotherapy (time point 1), one month after radiotherapy (time point 2), and four months after radiotherapy (time point 3). From time point 1 to 2, parotid volume, skewness, and kurtosis decreased (P < 0.001, = 0.001, and < 0.001, respectively), but all other ADC histogram parameters increased (all P < 0.001, except P = 0.006 for standard deviation [SD]). From time point 2 to 3, parotid volume continued to decrease (P = 0.022), and SD, 75th and 90th percentiles continued to increase (P = 0.024, 0.010, and 0.006, respectively). Early change rates of parotid ADCmean, ADCmin, kurtosis, and 25th, 50th, 75th, 90th percentiles (from time point 1 to 2) correlated with late parotid atrophy rate (from time point 1 to 3) (all P < 0.05). Multiple linear regression analysis revealed correlations among parotid volume, time point, and ADC histogram parameters. Early mean change rates for bilateral parotid SD and ADCmax could predict late xerostomia degrees at seven months after radiotherapy (three months after time point 3) with AUC of 0.781 and 0.818 (P = 0.014, 0.005, respectively). ADC histogram parameters were reproducible (intraclass correlation coefficient, 0.830 - 0.999). ADC histogram analysis could be used to evaluate radiation-induced parotid damage noninvasively, and predict late xerostomia degrees of NPC patients treated with radiotherapy.
Wang, Feng; Wang, Yuxiang; Zhou, Yan; Liu, Congrong; Xie, Lizhi; Zhou, Zhenyu; Liang, Dong; Shen, Yang; Yao, Zhihang; Liu, Jianyu
2017-12-01
To evaluate the utility of histogram analysis of monoexponential, biexponential, and stretched-exponential diffusion models applied to the dualistic model of epithelial ovarian cancer (EOC). Fifty-two patients with histopathologically proven EOC underwent preoperative magnetic resonance imaging (MRI) (including diffusion-weighted imaging [DWI] with 11 b-values) using a 3.0T system and were divided into two groups: types I and II. Apparent diffusion coefficient (ADC), true diffusion coefficient (D), pseudodiffusion coefficient (D*), perfusion fraction (f), distributed diffusion coefficient (DDC), and intravoxel water diffusion heterogeneity (α) histograms were obtained based on the solid components of the entire tumor. The following metrics of each histogram were compared between the two types: 1) mean; 2) median; 3) 10th percentile and 90th percentile. Conventional MRI morphological features were also recorded. Significant morphological features for predicting EOC type were maximum diameter (P = 0.007), texture of lesion (P = 0.001), and peritoneal implants (P = 0.001). For ADC, D, f, DDC, and α, all metrics were significantly lower in type II than type I (P < 0.05). The mean, median, 10th, and 90th percentiles of D* were not significantly different (P = 0.336, 0.154, 0.779, and 0.203, respectively). Most histogram metrics of ADC, D, and DDC had significantly higher area under the receiver operating characteristic curve values than those of f and α (P < 0.05). CONCLUSION: It is feasible to grade EOC by morphological features and the three models with histogram analysis. ADC, D, and DDC have better performance than f and α; f and α may provide additional information. Level of Evidence: 4. Technical Efficacy: Stage 1. J. Magn. Reson. Imaging 2017;46:1797-1809. © 2017 International Society for Magnetic Resonance in Medicine.
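The three signal models named in this abstract can be written compactly and fitted per voxel with non-linear least squares. The sketch below is illustrative only; the b-values and the synthetic signal are assumptions.

```python
# Sketch of the monoexponential, biexponential (IVIM) and stretched-exponential DWI
# models, fitted with scipy's non-linear least squares.  Units follow DWI convention
# (b in s/mm^2, diffusivities in mm^2/s); the data here are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def mono(b, s0, adc):                       # monoexponential: ADC
    return s0 * np.exp(-b * adc)

def ivim(b, s0, f, d_star, d):              # biexponential (IVIM): f, D*, D
    return s0 * (f * np.exp(-b * d_star) + (1 - f) * np.exp(-b * d))

def stretched(b, s0, ddc, alpha):           # stretched exponential: DDC, alpha
    return s0 * np.exp(-(b * ddc) ** alpha)

b = np.array([0, 20, 50, 100, 200, 400, 600, 800, 1000, 1500, 2000], float)  # 11 b-values (assumed)
signal = ivim(b, 1.0, 0.1, 0.02, 1.0e-3) + np.random.default_rng(1).normal(0, 0.005, b.size)

adc_fit, _ = curve_fit(mono, b, signal, p0=[1.0, 1e-3])
ivim_fit, _ = curve_fit(ivim, b, signal, p0=[1.0, 0.1, 1e-2, 1e-3],
                        bounds=([0, 0, 1e-3, 1e-5], [2, 1, 1, 1e-2]))
str_fit, _ = curve_fit(stretched, b, signal, p0=[1.0, 1e-3, 0.8],
                       bounds=([0, 1e-5, 0.1], [2, 1e-2, 1.0]))
print("ADC:", adc_fit[1], "f/D*/D:", ivim_fit[1:], "DDC/alpha:", str_fit[1:])
```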
A basic review on the inferior alveolar nerve block techniques
Khalil, Hesham
2014-01-01
The inferior alveolar nerve block is the most common injection technique used in dentistry, and many modifications of the conventional nerve block have been described recently in the literature. Selecting the best technique depends on many factors, including the success rate and the complications related to the selected technique. Dentists should be aware of the current modifications of the inferior alveolar nerve block techniques in order to choose effectively between them. Some operators may encounter difficulty in identifying the anatomical landmarks that are useful in applying the inferior alveolar nerve block and rely instead on assumptions as to where the needle should be positioned. Such assumptions can lead to failure, and the failure rate of the inferior alveolar nerve block has been reported to be 20-25%, which is considered very high. In this basic review, the anatomical details of the inferior alveolar nerve are given together with a description of both its conventional and modified blocking techniques; in addition, an overview of the complications which may result from the application of this important technique is provided. PMID:25886095
Techniques of laparoscopic cholecystectomy: Nomenclature and selection
Haribhakti, Sanjiv P.; Mistry, Jitendra H.
2015-01-01
There are more than 50 different techniques of laparoscopic cholecystectomy (LC) described in the literature, mainly due to modifications made by surgeons with the aim of improving postoperative outcome and cosmesis. These modifications include reductions in port size and/or number compared with what is used in standard LC. There is no uniform nomenclature to describe these different techniques, so it is not possible to compare the outcomes of the different techniques. We briefly outline the advantages and disadvantages of each of these techniques and suggest the situations where a particular technique would be useful. We also propose a nomenclature which is easy to remember and apply, so that future comparison between the techniques will be possible. PMID:25883450
Local intensity area descriptor for facial recognition in ideal and noise conditions
NASA Astrophysics Data System (ADS)
Tran, Chi-Kien; Tseng, Chin-Dar; Chao, Pei-Ju; Ting, Hui-Min; Chang, Liyun; Huang, Yu-Jie; Lee, Tsair-Fwu
2017-03-01
We propose a local texture descriptor, local intensity area descriptor (LIAD), which is applied for human facial recognition in ideal and noisy conditions. Each facial image is divided into small regions from which LIAD histograms are extracted and concatenated into a single feature vector to represent the facial image. The recognition is performed using a nearest neighbor classifier with histogram intersection and chi-square statistics as dissimilarity measures. Experiments were conducted with LIAD using the ORL database of faces (Olivetti Research Laboratory, Cambridge), the Face94 face database, the Georgia Tech face database, and the FERET database. The results demonstrated the improvement in accuracy of our proposed descriptor compared to conventional descriptors [local binary pattern (LBP), uniform LBP, local ternary pattern, histogram of oriented gradients, and local directional pattern]. Moreover, the proposed descriptor was less sensitive to noise and had low histogram dimensionality. Thus, it is expected to be a powerful texture descriptor that can be used for various computer vision problems.
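For readers unfamiliar with the matching stage, the sketch below shows a nearest-neighbour classifier that compares concatenated local histograms with histogram intersection and chi-square dissimilarities, as described above. The LIAD extraction itself is not reproduced; the feature vectors are assumed to be precomputed, normalised histograms.

```python
# Sketch of the matching stage only: 1-nearest-neighbour classification of histogram
# feature vectors using chi-square and histogram-intersection dissimilarities.
import numpy as np

def chi_square(h1, h2, eps=1e-10):
    return 0.5 * np.sum((h1 - h2) ** 2 / (h1 + h2 + eps))

def intersection_dissimilarity(h1, h2):
    return 1.0 - np.sum(np.minimum(h1, h2))   # 0 when two normalised histograms are identical

def nn_classify(query, gallery, labels, dissimilarity=chi_square):
    distances = [dissimilarity(query, g) for g in gallery]
    return labels[int(np.argmin(distances))]

# Toy usage with random "histograms" standing in for LIAD features.
rng = np.random.default_rng(0)
gallery = rng.random((10, 256)); gallery /= gallery.sum(axis=1, keepdims=True)
labels = np.arange(10)                      # one enrolled subject per histogram
query = np.abs(gallery[3] + rng.normal(0, 0.001, 256)); query /= query.sum()
print(nn_classify(query, gallery, labels))                          # chi-square match
print(nn_classify(query, gallery, labels, intersection_dissimilarity))
```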
Massoudieh, Arash; Visser, Ate; Sharifi, Soroosh; ...
2013-10-15
Due to the mixing of groundwaters with different ages in aquifers, groundwater age is more appropriately represented by a distribution than by a scalar number. To infer a groundwater age distribution from environmental tracers, a mathematical form is often assumed for the shape of the distribution and the parameters of the mathematical distribution are estimated using deterministic or stochastic inverse methods. We found that the prescription of the mathematical form limits the exploration of the age distribution to the shapes that can be described by the selected distribution. In this paper, the use of freeform histograms as groundwater age distributions is evaluated. A Bayesian Markov Chain Monte Carlo approach is used to estimate the fraction of groundwater in each histogram bin. This method was able to capture the shape of a hypothetical gamma distribution from the concentrations of four age tracers. The number of bins that can be considered in this approach is limited based on the number of tracers available. The histogram method was also tested on tracer data sets from Holten (The Netherlands; 3H, 3He, 85Kr, 39Ar) and the La Selva Biological Station (Costa Rica; SF6, CFCs, 3H, 4He and 14C), and compared to a number of mathematical forms. According to standard Bayesian measures of model goodness, the best mathematical distribution performs better than the histogram distributions in terms of the ability to capture the observed tracer data relative to their complexity. Among the histogram distributions, the four-bin histogram performs better in most of the cases. The Monte Carlo simulations showed strong correlations in the posterior estimates of bin contributions, indicating that these bins cannot be well constrained using the available age tracers. The fact that mathematical forms overall perform better than the freeform histogram does not undermine the benefit of the freeform approach, especially for cases where a larger amount of observed data is available and when the real groundwater distribution is more complex than can be represented by simple mathematical forms.
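The following toy Metropolis sampler illustrates the freeform-histogram idea: age-bin fractions are inferred from tracer concentrations under a linear mixing model. The tracer response matrix, noise level and proposal settings are invented for illustration and do not reproduce the authors' MCMC setup.

```python
# Toy Metropolis sampler: groundwater age is represented by fractions f_i in a few age
# bins, inferred from tracer concentrations c = G @ f.  G (modelled tracer concentration
# for each bin's representative age) and the observations are synthetic assumptions.
import numpy as np

rng = np.random.default_rng(0)
G = np.array([[1.0, 0.6, 0.2, 0.05],        # 4 tracers x 4 age bins (illustrative only)
              [0.1, 0.8, 0.5, 0.10],
              [0.0, 0.3, 0.9, 0.40],
              [0.0, 0.05, 0.4, 1.00]])
f_true = np.array([0.1, 0.4, 0.35, 0.15])
sigma = 0.02
c_obs = G @ f_true + rng.normal(0, sigma, 4)

def log_post(z):                             # z unconstrained; softmax gives bin fractions
    f = np.exp(z - z.max()); f /= f.sum()
    resid = c_obs - G @ f
    return -0.5 * np.sum(resid ** 2) / sigma ** 2

z = np.zeros(4); lp = log_post(z); samples = []
for it in range(20000):
    z_new = z + rng.normal(0, 0.2, 4)        # random-walk proposal
    lp_new = log_post(z_new)
    if np.log(rng.random()) < lp_new - lp:
        z, lp = z_new, lp_new
    if it > 5000 and it % 10 == 0:           # burn-in, then thinning
        f = np.exp(z - z.max()); samples.append(f / f.sum())

print("posterior mean bin fractions:", np.mean(samples, axis=0).round(3))
```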
Choi, Young Jun; Lee, Jeong Hyun; Kim, Hye Ok; Kim, Dae Yoon; Yoon, Ra Gyoung; Cho, So Hyun; Koh, Myeong Ju; Kim, Namkug; Kim, Sang Yoon; Baek, Jung Hwan
2016-01-01
To explore the added value of histogram analysis of apparent diffusion coefficient (ADC) values over magnetic resonance (MR) imaging and fluorine 18 ((18)F) fluorodeoxyglucose (FDG) positron emission tomography (PET)/computed tomography (CT) for the detection of occult palatine tonsil squamous cell carcinoma (SCC) in patients with cervical nodal metastasis from a cancer of an unknown primary site. The institutional review board approved this retrospective study, and the requirement for informed consent was waived. Differences in the bimodal histogram parameters of the ADC values were assessed among occult palatine tonsil SCC (n = 19), overt palatine tonsil SCC (n = 20), and normal palatine tonsils (n = 20). One-way analysis of variance was used to analyze differences among the three groups. Receiver operating characteristic curve analysis was used to determine the best differentiating parameters. The increased sensitivity of histogram analysis over MR imaging and (18)F-FDG PET/CT for the detection of occult palatine tonsil SCC was evaluated as added value. Histogram analysis showed statistically significant differences in the mean, standard deviation, and 50th and 90th percentile ADC values among the three groups (P < .0045). Occult palatine tonsil SCC had a significantly higher standard deviation for the overall curves, mean and standard deviation of the higher curves, and 90th percentile ADC value, compared with normal palatine tonsils (P < .0167). Receiver operating characteristic curve analysis showed that the standard deviation of the overall curve best delineated occult palatine tonsil SCC from normal palatine tonsils, with a sensitivity of 78.9% (15 of 19 patients) and a specificity of 60% (12 of 20 patients). The added value of ADC histogram analysis was 52.6% over MR imaging alone and 15.8% over combined conventional MR imaging and (18)F-FDG PET/CT. Adding ADC histogram analysis to conventional MR imaging can improve the detection sensitivity for occult palatine tonsil SCC in patients with a cervical nodal metastasis originating from a cancer of an unknown primary site. © RSNA, 2015.
Kim, Hyungjin; Choi, Seung Hong; Kim, Ji-Hoon; Ryoo, Inseon; Kim, Soo Chin; Yeom, Jeong A.; Shin, Hwaseon; Jung, Seung Chai; Lee, A. Leum; Yun, Tae Jin; Park, Chul-Kee; Sohn, Chul-Ho; Park, Sung-Hye
2013-01-01
Background Glioma grading assumes significant importance in that low- and high-grade gliomas display different prognoses and are treated with dissimilar therapeutic strategies. The objective of our study was to retrospectively assess the usefulness of a cumulative normalized cerebral blood volume (nCBV) histogram for glioma grading based on 3 T MRI. Methods From February 2010 to April 2012, 63 patients with astrocytic tumors underwent 3 T MRI with dynamic susceptibility contrast perfusion-weighted imaging. Regions of interest containing the entire tumor volume were drawn on every section of the co-registered relative CBV (rCBV) maps and T2-weighted images. The percentile values from the cumulative nCBV histograms and the other histogram parameters were correlated with tumor grades. Cochran's Q test and the McNemar test were used to compare the diagnostic accuracies of the histogram parameters after the receiver operating characteristic curve analysis. Using the parameter offering the highest diagnostic accuracy, a validation process was performed with an independent test set of nine patients. Results The 99th percentile of the cumulative nCBV histogram (nCBV C99), mean and peak height differed significantly between low- and high-grade gliomas (P < 0.001, 0.014 and < 0.001, respectively) and between grade III and IV gliomas (P < 0.001, 0.001 and < 0.001, respectively). The diagnostic accuracy of nCBV C99 was significantly higher than that of the mean nCBV (P = 0.016) in distinguishing high- from low-grade gliomas and was comparable to that of the peak height (P = 1.000). Validation using the two cutoff values of nCBV C99 achieved a diagnostic accuracy of 66.7% (6/9) for the separation of all three glioma grades. Conclusion Cumulative histogram analysis of nCBV using 3 T MRI can be a useful method for preoperative glioma grading. The nCBV C99 value is helpful in distinguishing high- from low-grade gliomas and grade IV from III gliomas. PMID:23704910
Zhang, Yu-Dong; Wu, Chen-Jiang; Wang, Qing; Zhang, Jing; Wang, Xiao-Ning; Liu, Xi-Sheng; Shi, Hai-Bin
2015-08-01
The purpose of this study was to compare histogram analysis of apparent diffusion coefficient (ADC) and R2* for differentiating low-grade from high-grade clear cell renal cell carcinoma (RCC). Forty-six patients with pathologically confirmed clear cell RCC underwent preoperative BOLD and DWI MRI of the kidneys. ADCs based on the entire tumor volume were calculated with b value combinations of 0 and 800 s/mm(2). ROI-based R2* was calculated with eight TE combinations of 6.7-22.8 milliseconds. Histogram analysis of tumor ADCs and R2* values was performed to obtain the mean; median; width; fifth, 10th, 90th, and 95th percentiles; and histogram inhomogeneity, kurtosis, and skewness for all lesions. Thirty-three low-grade and 13 high-grade clear cell RCCs were found at pathologic examination. The TNM classification and tumor volume of clear cell RCC significantly correlated with histogram ADC and R2* (ρ = -0.317 to 0.506; p < 0.05). High-grade clear cell RCC had significantly lower mean, median, and 10th percentile ADCs but higher inhomogeneity and median R2* than low-grade clear cell RCC (all p < 0.05). Compared with other histogram ADC and R2* indexes, the 10th percentile ADC had the highest accuracy (91.3%) in discriminating low- from high-grade clear cell RCC. Discrimination of hemorrhage with R2* was achieved at a threshold of 68.95 Hz. At this threshold, high-grade clear cell RCC had a significantly higher prevalence of intratumor hemorrhage (high-grade, 76.9%; low-grade, 45.4%; p < 0.05) and a larger hemorrhagic area than low-grade clear cell RCC (high-grade, 34.9% ± 31.6%; low-grade, 8.9% ± 16.8%; p < 0.05). A close relation was found between MRI indexes and pathologic findings. Histogram analysis of ADC and R2* allows differentiation of low- from high-grade clear cell RCC with high accuracy.
Kong, Ling-Yan; Zhang, Wei; Zhou, Yue; Xu, Hai; Shi, Hai-Bin; Feng, Qing; Xu, Xiao-Quan; Yu, Tong-Fu
2018-04-01
To investigate the value of apparent diffusion coefficient (ADC) histogram analysis for assessing World Health Organization (WHO) pathological classification and Masaoka clinical stages of thymic epithelial tumours. 37 patients with histologically confirmed thymic epithelial tumours were enrolled. ADC measurements were performed using a hot-spot ROI (ADCHS-ROI) and a histogram-based approach. ADC histogram parameters included mean ADC (ADCmean), median ADC (ADCmedian), 10th and 90th percentiles of ADC (ADC10 and ADC90), kurtosis, and skewness. One-way ANOVA, independent-sample t-test, and receiver operating characteristic analysis were used for statistical analyses. There were significant differences in ADCmean, ADCmedian, ADC10, ADC90, and ADCHS-ROI among the low-risk thymoma (type A, AB, B1; n = 14), high-risk thymoma (type B2, B3; n = 9), and thymic carcinoma (type C; n = 14) groups (all p-values <0.05), while there was no significant difference in skewness (p = 0.181) or kurtosis (p = 0.088). ADC10 showed the best differentiating ability (cut-off value, ≤0.689 × 10⁻³ mm²/s; AUC, 0.957; sensitivity, 95.65%; specificity, 92.86%) for discriminating low-risk thymoma from high-risk thymoma and thymic carcinoma. Advanced Masaoka stage (Stage III and IV; n = 24) tumours showed significantly lower ADC parameters and higher kurtosis than early Masaoka stage (Stage I and II; n = 13) tumours (all p-values <0.05), while there was no significant difference in skewness (p = 0.063). ADC10 showed the best differentiating ability (cut-off value, ≤0.689 × 10⁻³ mm²/s; AUC, 0.913; sensitivity, 91.30%; specificity, 85.71%) for discriminating advanced from early Masaoka stage epithelial tumours. ADC histogram analysis may assist in assessing the WHO pathological classification and Masaoka clinical stages of thymic epithelial tumours. Advances in knowledge: 1. ADC histogram analysis could help to assess the WHO pathological classification of thymic epithelial tumours. 2. ADC histogram analysis could help to evaluate the Masaoka clinical stages of thymic epithelial tumours. 3. ADC10 might be a promising imaging biomarker for assessing and characterizing thymic epithelial tumours.
Hoffman, David H; Ream, Justin M; Hajdu, Christina H; Rosenkrantz, Andrew B
2017-04-01
To evaluate whole-lesion ADC histogram metrics for assessing the malignant potential of pancreatic intraductal papillary mucinous neoplasms (IPMNs), including in comparison with conventional MRI features. Eighteen branch-duct IPMNs underwent MRI with DWI prior to resection (n = 16) or FNA (n = 2). A blinded radiologist placed 3D volumes-of-interest on the entire IPMN on the ADC map, from which whole-lesion histogram metrics were generated. The reader also assessed IPMN size, mural nodularity, and adjacent main-duct dilation. Benign (low-to-intermediate grade dysplasia; n = 10) and malignant (high-grade dysplasia or invasive adenocarcinoma; n = 8) IPMNs were compared. Whole-lesion ADC histogram metrics demonstrating significant differences between benign and malignant IPMNs were: entropy (5.1 ± 0.2 vs. 5.4 ± 0.2; p = 0.01, AUC = 86%); mean of the bottom 10th percentile (2.2 ± 0.4 vs. 1.6 ± 0.7; p = 0.03; AUC = 81%); and mean of the 10-25th percentile (2.8 ± 0.4 vs. 2.3 ± 0.6; p = 0.04; AUC = 79%). The overall mean ADC, skewness, and kurtosis were not significantly different between groups (p ≥ 0.06; AUC = 50-78%). For entropy (highest performing histogram metric), an optimal threshold of >5.3 achieved a sensitivity of 100%, a specificity of 70%, and an accuracy of 83% for predicting malignancy. No significant difference (p = 0.18-0.64) was observed between benign and malignant IPMNs for cyst size ≥3 cm, adjacent main-duct dilatation, or mural nodule. At multivariable analysis of entropy in combination with all other ADC histogram and conventional MRI features, entropy was the only significant independent predictor of malignancy (p = 0.004). Although requiring larger studies, ADC entropy obtained from 3D whole-lesion histogram analysis may serve as a biomarker for identifying the malignant potential of IPMNs, independent of conventional MRI features.
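A small sketch of the whole-lesion histogram metrics discussed above, computed from ADC values inside a 3-D volume of interest. The bin count and the entropy base are assumptions, since the exact histogram settings are not restated here.

```python
# Sketch of whole-lesion ADC histogram metrics: entropy, low-percentile means,
# skewness and kurtosis, from voxels inside a volume of interest.
import numpy as np
from scipy.stats import skew, kurtosis

def adc_histogram_metrics(adc_map, voi_mask, bins=64):
    vals = adc_map[voi_mask > 0]
    hist, _ = np.histogram(vals, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    entropy = -np.sum(p * np.log2(p))                   # Shannon entropy of the ADC histogram
    lo10 = vals[vals <= np.percentile(vals, 10)]        # bottom 10th percentile of voxels
    p10_25 = vals[(vals > np.percentile(vals, 10)) & (vals <= np.percentile(vals, 25))]
    return {"mean": vals.mean(), "entropy": entropy,
            "mean_bottom_10pct": lo10.mean(), "mean_10_25pct": p10_25.mean(),
            "skewness": skew(vals), "kurtosis": kurtosis(vals)}

# Toy usage on a synthetic ADC volume with an all-true mask.
rng = np.random.default_rng(0)
adc = rng.normal(2.0, 0.5, (32, 32, 16)); mask = np.ones_like(adc, dtype=bool)
print(adc_histogram_metrics(adc, mask))
```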
Graphic/symbol segmentation for Group 4 facsimile systems
NASA Astrophysics Data System (ADS)
Deutermann, A. R.
1982-04-01
The purpose of this study was to examine possible techniques for segmenting graphic and symbol areas and to assemble a code that represents the entire document. Parameters to be considered include compression, commonality with facsimile and TELETEX* transmissions, and complexity of implementation. Six segmentation techniques were selected for analysis. The techniques were designed to differ from each other as much as possible, so as to display a wide variety of characteristics. For each technique, many minor modifications would be possible, but it is not expected that these modifications would alter the conclusions drawn from the study.
ERIC Educational Resources Information Center
Englehard, George, Jr.
1996-01-01
Data presented in figure three of the article cited may be misleading in that the automatic scaling procedure used by the computer program that generated the histogram highlighted spikes that would look different with different histogram methods. (SLD)
Using Computer Graphics in Statistics.
ERIC Educational Resources Information Center
Kerley, Lyndell M.
1990-01-01
Described is software which allows a student to use simulation to produce analytical output as well as graphical results. The results include a frequency histogram of a selected population distribution, a frequency histogram of the distribution of the sample means, and tests of the normality of the distribution of the sample means. (KR)
Efficient visibility-driven medical image visualisation via adaptive binned visibility histogram.
Jung, Younhyun; Kim, Jinman; Kumar, Ashnil; Feng, David Dagan; Fulham, Michael
2016-07-01
'Visibility' is a fundamental optical property that represents the observable, by users, proportion of the voxels in a volume during interactive volume rendering. The manipulation of this 'visibility' improves the volume rendering processes; for instance by ensuring the visibility of regions of interest (ROIs) or by guiding the identification of an optimal rendering view-point. The construction of visibility histograms (VHs), which represent the distribution of the visibility of all voxels in the rendered volume, enables users to explore the volume with real-time feedback about occlusion patterns among spatially related structures during volume rendering manipulations. Volume rendered medical images have been a primary beneficiary of the VH given the need to ensure that specific ROIs are visible relative to the surrounding structures, e.g. the visualisation of tumours that may otherwise be occluded by neighbouring structures. VH construction and its subsequent manipulations, however, are computationally expensive due to the histogram binning of the visibilities. This limits the real-time application of the VH to medical images that have large intensity ranges and volume dimensions and require a large number of histogram bins. In this study, we introduce an efficient adaptive binned visibility histogram (AB-VH) in which a smaller number of histogram bins are used to represent the visibility distribution of the full VH. We adaptively bin medical images by using a cluster analysis algorithm that groups the voxels according to their intensity similarities into a smaller subset of bins while preserving the distribution of the intensity range of the original images. We increase efficiency by exploiting the parallel computation and multiple render targets (MRT) extension of modern graphical processing units (GPUs), and this enables efficient computation of the histogram. We show the application of our method to single-modality computed tomography (CT), magnetic resonance (MR) imaging and multi-modality positron emission tomography-CT (PET-CT). In our experiments, the AB-VH markedly improved the computational efficiency of VH construction and thus improved the subsequent VH-driven volume manipulations. This efficiency was achieved without major visual degradation of the VH or numerical differences between the AB-VH and its full-bin counterpart. We applied several variants of the K-means clustering algorithm with varying K (the number of clusters) and found that higher values of K resulted in better performance but a lower computational gain. The AB-VH also had improved performance when compared to the conventional method of down-sampling the histogram bins (equal binning) for volume rendering visualisation. Copyright © 2016 Elsevier Ltd. All rights reserved.
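The adaptive-binning idea can be sketched with an off-the-shelf K-means step: cluster the voxel intensities and let the K cluster centres act as the reduced set of histogram bins. This is a CPU illustration of the concept, not the paper's GPU/MRT implementation.

```python
# Sketch of adaptive intensity binning: K-means on (subsampled) voxel intensities,
# then every voxel is assigned to its nearest cluster centre, i.e. its adaptive bin.
import numpy as np
from sklearn.cluster import KMeans

def adaptive_binned_histogram(volume, k=64, sample=200_000, seed=0):
    rng = np.random.default_rng(seed)
    vox = volume.reshape(-1, 1).astype(np.float32)
    train = vox if vox.shape[0] <= sample else vox[rng.choice(vox.shape[0], sample, replace=False)]
    km = KMeans(n_clusters=k, n_init=4, random_state=seed).fit(train)
    labels = km.predict(vox)                               # assign every voxel to an adaptive bin
    counts = np.bincount(labels, minlength=k)
    centres = km.cluster_centers_.ravel()
    order = np.argsort(centres)                            # report bins in intensity order
    return centres[order], counts[order]

# Toy usage on a synthetic volume.
volume = np.random.default_rng(1).normal(100, 20, (64, 64, 32))
centres, counts = adaptive_binned_histogram(volume, k=16)
print(centres[:5], counts[:5])
```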
An analysis of automatic human detection and tracking
NASA Astrophysics Data System (ADS)
Demuth, Philipe R.; Cosmo, Daniel L.; Ciarelli, Patrick M.
2015-12-01
This paper presents an automatic method to detect and follow people on video streams. This method uses two techniques to determine the initial position of the person at the beginning of the video file: one based on optical flow and the other one based on Histogram of Oriented Gradients (HOG). After defining the initial bounding box, tracking is done using four different trackers: Median Flow tracker, TLD tracker, Mean Shift tracker and a modified version of the Mean Shift tracker using HSV color space. The results of the methods presented in this paper are then compared at the end of the paper.
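The HOG-based initialisation step can be sketched with OpenCV's built-in people detector, which supplies the first bounding box for the trackers mentioned above. The video path is a placeholder, and the tracker hand-off is only indicated in a comment because constructor names differ across OpenCV versions.

```python
# Sketch of HOG-based person detection on the first frame of a video stream.
import cv2

cap = cv2.VideoCapture("input_video.avi")          # placeholder path
ok, frame = cap.read()
if not ok:
    raise RuntimeError("could not read the first frame")

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())
boxes, weights = hog.detectMultiScale(frame, winStride=(8, 8), padding=(8, 8), scale=1.05)

if len(boxes) > 0:
    x, y, w, h = boxes[int(weights.argmax())]      # highest-confidence detection
    print("initial bounding box:", (x, y, w, h))
    # This box would then seed the chosen tracker (e.g. Median Flow, TLD, or Mean Shift
    # on an HSV back-projection), as described in the paper.
cap.release()
```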
NASA Technical Reports Server (NTRS)
Vanoss, C. J.
1978-01-01
Pancreatic islets were obtained from guinea pig pancreas by the collagenase method and kept alive in tissue culture prior to further studies. Pancreas cell morphology was studied by standard histochemical techniques using light microscopy. Preparative vertical electrophoresis-levitation of dispersed fetal guinea pig pancreas cells was conducted in phosphate buffer containing a heavy water (D2O) gradient, which does not cause clumping of cells or alter the osmolarity of the buffers. The faster migrating fractions tended to be enriched in beta-cell content. Alpha and delta cells were found to some degree in most fractions. A histogram showing the cell count distribution is included.
2014-01-01
Background EDTA-dependent pseudothrombocytopenia (EDTA-PTCP) is a common laboratory phenomenon with a prevalence ranging from 0.1-2% in hospitalized patients to 15-17% in outpatients evaluated for isolated thrombocytopenia. Despite its harmlessness, EDTA-PTCP frequently leads to time-consuming, costly and even invasive diagnostic investigations. EDTA-PTCP is often overlooked because blood smears are not evaluated visually in routine practice and the histograms as well as the warning flags of hematology analyzers are not interpreted correctly. Nonetheless, EDTA-PTCP may be diagnosed easily even by general practitioners without any experience in blood film examination. This is the first report illustrating the typical patterns of the platelet (PLT) and white blood cell (WBC) histograms of hematology analyzers. Case presentation A 37-year-old female patient of Caucasian origin was referred with suspected acute leukemia, and the crew of the emergency unit arranged extensive investigations for work-up. However, examination of the EDTA blood sample revealed atypical lymphocytes and an isolated thrombocytopenia together with typical patterns of the WBC and PLT histograms: a serrated curve of the platelet histogram and a peculiar peak on the left side of the WBC histogram. EDTA-PTCP was confirmed by a normal platelet count when examining citrated blood. Conclusion Awareness of typical PLT and WBC patterns may alert to the presence of EDTA-PTCP in routine laboratory practice, helping to avoid unnecessary investigations and over-treatment. PMID:24808761
Muhammad, Sani Ismaila; Maznah, Ismail; Mahmud, Rozi Binti; Esmaile, Maher Faik; Zuki, Abu Bakar Zakaria
2013-01-01
Background Bone mass density is an important parameter used in the estimation of the severity and depth of lesions in osteoporosis. Estimation of bone density using existing methods in experimental models has its advantages as well as drawbacks. Materials and methods In this study, the X-ray histogram edge detection technique was used to estimate the bone mass density in ovariectomized rats treated orally with germinated brown rice (GBR) bioactives, and the results were compared with estimated results obtained using Archimedes' principle. New bone cell proliferation was assessed by histology and immunohistochemical reaction using polyclonal nuclear antigen. Additionally, serum alkaline phosphatase activity, serum and bone calcium and zinc concentrations were detected using a chemistry analyzer and atomic absorption spectroscopy. Rats were divided into groups of six as follows: sham (nonovariectomized, nontreated); ovariectomized, nontreated; and ovariectomized and treated with estrogen, or Remifemin®, GBR-phenolics, acylated steryl glucosides, gamma oryzanol, and gamma amino-butyric acid extracted from GBR at different doses. Results Our results indicate a significant increase in alkaline phosphatase activity, serum and bone calcium, and zinc and ash content in the treated groups compared with the ovariectomized nontreated group (P < 0.05). Bone density increased significantly (P < 0.05) in groups treated with estrogen, GBR, Remifemin®, and gamma oryzanol compared to the ovariectomized nontreated group. Histological sections revealed more osteoblasts in the treated groups when compared with the untreated groups. A polyclonal nuclear antigen reaction showing proliferating new cells was observed in groups treated with estrogen, Remifemin®, GBR, acylated steryl glucosides, and gamma oryzanol. There was a good correlation between bone mass densities estimated using Archimedes' principle and the edge detection technique between the treated groups (r2 = 0.737, P = 0.004). Conclusion Our study shows that GBR bioactives increase bone density, which might be via the activation of zinc formation and increased calcium content, and that the X-ray edge detection technique is effective in the measurement of bone density and can be employed effectively in this respect. PMID:24187491
Probability Distribution Extraction from TEC Estimates based on Kernel Density Estimation
NASA Astrophysics Data System (ADS)
Demir, Uygar; Toker, Cenk; Çenet, Duygu
2016-07-01
Statistical analysis of the ionosphere, specifically the Total Electron Content (TEC), may reveal important information about its temporal and spatial characteristics. One of the core metrics that express the statistical properties of a stochastic process is its Probability Density Function (pdf). Furthermore, statistical parameters such as mean, variance and kurtosis, which can be derived from the pdf, may provide information about the spatial uniformity or clustering of the electron content. For example, the variance differentiates between a quiet ionosphere and a disturbed one, whereas kurtosis differentiates between a geomagnetic storm and an earthquake. Therefore, valuable information about the state of the ionosphere (and the natural phenomena that cause the disturbance) can be obtained by looking at the statistical parameters. In the literature, there are publications which try to fit the histogram of TEC estimates to some well-known pdfs such as Gaussian, Exponential, etc. However, constraining a histogram to fit a function with a fixed shape will increase the estimation error, and all the information extracted from such a pdf will continue to contain this error. In such techniques, it is highly likely to observe some artificial characteristics in the estimated pdf which are not present in the original data. In the present study, we use the Kernel Density Estimation (KDE) technique to estimate the pdf of the TEC. KDE is a non-parametric approach which does not impose a specific form on the TEC. As a result, better pdf estimates that almost perfectly fit the observed TEC values can be obtained as compared to the techniques mentioned above. KDE is particularly good at representing the tail probabilities and outliers. We also calculate the mean, variance and kurtosis of the measured TEC values. The technique is applied to the ionosphere over Turkey, where the TEC values are estimated from GNSS measurements from the TNPGN-Active (Turkish National Permanent GNSS Network) network. This study is supported by TUBITAK 115E915 and joint TUBITAK 114E092 and AS CR14/001 projects.
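A minimal sketch of the KDE step: estimate the pdf of TEC samples with a Gaussian kernel and report the mean, variance and kurtosis. The TEC values here are synthetic stand-ins for the GNSS-derived estimates mentioned in the abstract.

```python
# Sketch of non-parametric pdf estimation of TEC samples with a Gaussian KDE.
import numpy as np
from scipy.stats import gaussian_kde, kurtosis

rng = np.random.default_rng(0)
tec = rng.gamma(shape=9.0, scale=2.0, size=5000)        # stand-in for TEC estimates (TECU)

kde = gaussian_kde(tec)                                 # non-parametric pdf estimate
grid = np.linspace(tec.min(), tec.max(), 512)
pdf = kde(grid)                                         # evaluate the estimated pdf on a grid

print("mean:", tec.mean(), "variance:", tec.var(), "kurtosis:", kurtosis(tec))
print("pdf integrates to ~1:", np.trapz(pdf, grid))
```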
Treating Retentive Encopresis: Dietary Modification and Behavioral Techniques.
ERIC Educational Resources Information Center
Nabors, Laura; Morgan, Sam B.
1995-01-01
A home-based contingency management program, consisting of diet modification, laxatives, correction for soiling accidents, stimulus control training, and positive reinforcement, was implemented for treatment of a 4-year-old encopretic male. The findings provide evidence supporting the effectiveness of dietary modification combined with behavior…
The double stapling technique for low anterior resection. Results, modifications, and observations.
Griffen, F D; Knight, C D; Whitaker, J M; Knight, C D
1990-01-01
Since the introduction of the end-to-end anastomosis (EEA) stapler for rectal reconstruction, we have used a modification of the conventional technique in which the lower rectal segment is closed with the linear stapler (TA-55) and the anastomosis is performed using the EEA instrument across the linear staple line (double stapling technique). Our experience with this procedure includes stapled colorectal anastomoses in 75 patients and is the basis for the report. This review presents the details and advantages of the technique and the results. Complications include two patients with anastomotic leak (2.7%), and two with stenosis that required treatment (2.7%). Protective colostomy was not done in this series. There were no deaths. Our experience and that of others suggests that this modification of the EEA technique can allow a lower anastomosis in some patients, and that it can be done with greater safety and facility. PMID:2357137
Gaze Fluctuations Are Not Additively Decomposable: Reply to Bogartz and Staub
ERIC Educational Resources Information Center
Kelty-Stephen, Damian G.; Mirman, Daniel
2013-01-01
Our previous work interpreted single-lognormal fits to inter-gaze distance (i.e., "gaze steps") histograms as evidence of multiplicativity and hence interactions across scales in visual cognition. Bogartz and Staub (2012) proposed that gaze steps are additively decomposable into fixations and saccades, matching the histograms better and…
Efficient computational nonlinear dynamic analysis using modal modification response technique
NASA Astrophysics Data System (ADS)
Marinone, Timothy; Avitabile, Peter; Foley, Jason; Wolfson, Janet
2012-08-01
Structural systems generally contain nonlinear characteristics. These nonlinear systems require significant computational resources for solution of the equations of motion. Much of the model, however, is linear, with the nonlinearity resulting from discrete local elements connecting different components together. Using a component mode synthesis approach, a nonlinear model can be developed by interconnecting these linear components with highly nonlinear connection elements. The approach presented in this paper, the Modal Modification Response Technique (MMRT), is a very efficient technique that has been created to address this specific class of nonlinear problem. By utilizing a Structural Dynamics Modification (SDM) approach in conjunction with mode superposition, a significantly smaller set of matrices is required for use in the direct integration of the equations of motion. The approach is compared to traditional analytical approaches to make evident the usefulness of the technique for a variety of test cases.
Foda, Abd Al-Rahman Mohammad
2013-05-01
Manual tissue microarray (TMA) construction had been introduced to avoid the high cost of automated and semiautomated techniques. The cheapest and simplest technique for constructing manual TMA was that of using mechanical pencil tips. This study was carried out to modify this method, aiming to raise its quality to reach that of expensive ones. Some modifications were introduced to Shebl's technique. Two conventional mechanical pencil tips of different diameters were used to construct the recipient blocks. A source of mild heat was used, and blocks were incubated at 38°C overnight. With our modifications, 3 high-density TMA blocks were constructed. We successfully performed immunostaining without substantial tissue loss. Our modifications increased the number of cores per block and improved the stability of the cores within the paraffin block. This new, modified technique is a good alternative for expensive machines in many laboratories.
Bodison, Stefanie C; Parham, L Diane
This systematic review examined the effectiveness of specific sensory techniques and sensory environmental modifications to improve participation of children with sensory integration (SI) difficulties. Abstracts of 11,436 articles published between January 2007 and May 2015 were examined. Studies were included if designs reflected high levels of evidence, participants demonstrated SI difficulties, and outcome measures addressed function or participation. Eight studies met inclusion criteria. Seven studies evaluated effects of specific sensory techniques for children with autism spectrum disorder (ASD) or attention deficit hyperactivity disorder: Qigong massage, weighted vests, slow swinging, and incorporation of multisensory activities into preschool routines. One study of sensory environmental modifications examined adaptations to a dental clinic for children with ASD. Strong evidence supported Qigong massage, moderate evidence supported sensory modifications to the dental care environment, and limited evidence supported weighted vests. The evidence is insufficient to draw conclusions regarding slow linear swinging and incorporation of multisensory activities into preschool settings. Copyright © 2018 by the American Occupational Therapy Association, Inc.
Public perceptions of hurricane modification.
Klima, Kelly; Bruine de Bruin, Wändi; Morgan, M Granger; Grossmann, Iris
2012-07-01
If hurricane modification were to become a feasible strategy for potentially reducing hurricane damages, it would likely generate public discourse about whether to support its implementation. To facilitate an informed and constructive discourse, policymakers need to understand how people perceive hurricane modification. Here, we examine Florida residents' perceptions of hurricane modification techniques that aim to alter path and wind speed. Following the mental models approach, we conducted a survey study about public perceptions of hurricane modification that was guided by formative interviews on the topic. We report a set of four primary findings. First, hurricane modification was perceived as a relatively ineffective strategy for damage reduction, compared to other strategies for damage reduction. Second, hurricane modification was expected to lead to changes in projected hurricane path, but not necessarily to the successful reduction of projected hurricane strength. Third, more anger was evoked when a hurricane was described as having changed from the initially forecasted path or strength after an attempted modification. Fourth, unlike what we expected, participants who more strongly agreed with statements that recognized the uncertainty inherent in forecasts reported more rather than less anger at scientists across hurricane modification scenarios. If the efficacy of intensity-reduction techniques can be increased, people may be willing to support hurricane modification. However, such an effort would need to be combined with open and honest communications to members of the general public. © 2011 Society for Risk Analysis.
Milles, Julien; Zhu, Yue Min; Gimenez, Gérard; Guttmann, Charles R G; Magnin, Isabelle E
2007-03-01
A novel approach for correcting intensity nonuniformity in magnetic resonance imaging (MRI) is presented. This approach is based on the simultaneous use of spatial and gray-level histogram information. Spatial information about intensity nonuniformity is obtained using cubic B-spline smoothing. Gray-level histogram information of the image corrupted by intensity nonuniformity is exploited from a frequential point of view. The proposed correction method is illustrated using both physical phantom and human brain images. The results are consistent with theoretical prediction, and demonstrate a new way of dealing with intensity nonuniformity problems. They are all the more significant as the ground truth on intensity nonuniformity is unknown in clinical images.
NASA Astrophysics Data System (ADS)
Mansourian, Leila; Taufik Abdullah, Muhamad; Nurliyana Abdullah, Lili; Azman, Azreen; Mustaffa, Mas Rina
2017-02-01
Pyramid Histogram of Words (PHOW) combines Bag of Visual Words (BoVW) with spatial pyramid matching (SPM) in order to add location information to the extracted features. However, PHOW variants extracted from various color spaces do not extract color information individually; that is, they discard color information, which is an important characteristic of any image and is motivated by human vision. This article concatenates the PHOW Multi-Scale Dense Scale Invariant Feature Transform (MSDSIFT) histogram with a proposed color histogram to improve the performance of existing image classification algorithms. Performance evaluation on several datasets proves that the new approach outperforms other existing, state-of-the-art methods.
Analysis of the hand vein pattern for people recognition
NASA Astrophysics Data System (ADS)
Castro-Ortega, R.; Toxqui-Quitl, C.; Cristóbal, G.; Marcos, J. Victor; Padilla-Vivanco, A.; Hurtado Pérez, R.
2015-09-01
The shape of the hand vascular pattern contains useful and unique features that can be used for identifying and authenticating people, with applications in access control, medicine and financial services. In this work, an optical system for the image acquisition of the hand vascular pattern is implemented. It consists of a CCD camera with sensitivity in the IR and a light source with emission at 880 nm. The IR radiation interacts with the deoxyhemoglobin, hemoglobin and water present in the blood of the veins, making it possible to see the vein pattern underneath the skin. The segmentation of the Region Of Interest (ROI) is achieved using geometrical moments to locate the centroid of an image. For enhancement of the vein pattern we use Histogram Equalization and Contrast Limited Adaptive Histogram Equalization (CLAHE). In order to remove unnecessary information such as body hair and skinfolds, a low pass filter is implemented. A method based on geometric moments is used to obtain the invariant descriptors of the input images. The classification task is achieved using Artificial Neural Networks (ANN) and K-Nearest Neighbors (K-nn) algorithms. Experimental results using our database show a correct classification rate higher than 86.36% with ANN for 912 images of 38 people, with 12 versions each.
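The enhancement steps named above (histogram equalisation, CLAHE, low-pass filtering) map directly onto standard OpenCV calls; the sketch below assumes a grayscale near-infrared hand image and illustrative CLAHE parameters.

```python
# Sketch of the enhancement stage: global histogram equalisation, CLAHE, and a
# Gaussian low-pass step to suppress hair and skinfolds.  File name is a placeholder.
import cv2

img = cv2.imread("hand_ir.png", cv2.IMREAD_GRAYSCALE)   # placeholder file name
if img is None:
    raise FileNotFoundError("hand_ir.png not found")

equalised = cv2.equalizeHist(img)                        # plain histogram equalisation
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
enhanced = clahe.apply(img)                              # contrast-limited adaptive equalisation
smoothed = cv2.GaussianBlur(enhanced, (5, 5), 0)         # low-pass filtering

cv2.imwrite("hand_ir_clahe.png", smoothed)
```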
Web servlet-assisted, dial-in flow cytometry data analysis.
Battye, F
2001-02-01
The obvious benefits of centralized data storage notwithstanding, the size of modern flow cytometry data files discourages their transmission over commonly used telephone modem connections. The proposed solution is to install at the central location a web servlet that can extract compact data arrays, of a form dependent on the requested display type, from the stored files and transmit them to a remote client computer program for display. A client program and a web servlet, both written in the Java programming language, were designed to communicate over standard network connections. The client program creates familiar numerical and graphical display types and allows the creation of gates from combinations of user-defined regions. Data compression techniques further reduce transmission times for data arrays that are already much smaller than the data file itself. For typical data files, network transmission times were reduced more than 700-fold for extraction of one-dimensional (1-D) histograms, between 18 and 120-fold for 2-D histograms, and 6-fold for color-coded dot plots. Numerous display formats are possible without further access to the data file. This scheme enables telephone modem access to centrally stored data without restricting flexibility of display format or preventing comparisons with locally stored files. Copyright 2001 Wiley-Liss, Inc.
Underwater image enhancement based on the dark channel prior and attenuation compensation
NASA Astrophysics Data System (ADS)
Guo, Qingwen; Xue, Lulu; Tang, Ruichun; Guo, Lingrui
2017-10-01
Aimed at the two problems of underwater imaging, fog effect and color cast, an Improved Segmentation Dark Channel Prior (ISDCP) defogging method is proposed to solve the fog effects caused by physical properties of water. Due to mass refraction of light in the process of underwater imaging, fog effects would lead to image blurring. And color cast is closely related to different degree of attenuation while light with different wavelengths is traveling in water. The proposed method here integrates the ISDCP and quantitative histogram stretching techniques into the image enhancement procedure. Firstly, the threshold value is set during the refinement process of the transmission maps to identify the original mismatching, and to conduct the differentiated defogging process further. Secondly, a method of judging the propagating distance of light is adopted to get the attenuation degree of energy during the propagation underwater. Finally, the image histogram is stretched quantitatively in Red-Green-Blue channel respectively according to the degree of attenuation in each color channel. The proposed method ISDCP can reduce the computational complexity and improve the efficiency in terms of defogging effect to meet the real-time requirements. Qualitative and quantitative comparison for several different underwater scenes reveals that the proposed method can significantly improve the visibility compared with previous methods.
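The per-channel stretching step can be sketched as percentile-based contrast stretching applied separately to the R, G and B channels. The dark-channel/transmission-map part is not reproduced, and the percentile limits below are assumptions standing in for the attenuation-dependent amounts of stretching.

```python
# Sketch of quantitative per-channel histogram stretching for an underwater image.
import cv2
import numpy as np

def stretch_channel(channel, low_pct, high_pct):
    lo, hi = np.percentile(channel, [low_pct, high_pct])
    out = (channel.astype(np.float32) - lo) * 255.0 / max(hi - lo, 1e-6)
    return np.clip(out, 0, 255).astype(np.uint8)

img = cv2.imread("underwater.png")                       # placeholder file, BGR order
if img is None:
    raise FileNotFoundError("underwater.png not found")
b, g, r = cv2.split(img)
# Red attenuates fastest under water, so it is stretched most aggressively here (assumed limits).
out = cv2.merge([stretch_channel(b, 2, 98),
                 stretch_channel(g, 1, 99),
                 stretch_channel(r, 0.5, 99.5)])
cv2.imwrite("underwater_stretched.png", out)
```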
Laser hardening techniques on steam turbine blade and application
NASA Astrophysics Data System (ADS)
Yao, Jianhua; Zhang, Qunli; Kong, Fanzhi; Ding, Qingming
Different laser surface hardening techniques, such as laser alloying and laser solution strengthening were adopted to perform modification treatment on the local region of inset edge for 2Cr13 and 17-4PH steam turbine blades to prolong the life of the blades. The microstructures, microhardness and anti-cavitation properties were investigated on the blades after laser treatment. The hardening mechanism and technique adaptability were researched. Large scale installation practices confirmed that the laser surface modification techniques are safe and reliable, which can improve the properties of blades greatly with advantages of high automation, high quality, little distortion and simple procedure.
ERIC Educational Resources Information Center
Starling, Julia; Munro, Natalie; Togher, Leanne; Arciuli, Joanne
2012-01-01
Purpose: This study evaluated the efficacy of a collaborative intervention where a speech-language pathologist (SLP) trained mainstream secondary school teachers to make modifications to their oral and written instructional language. The trained teachers' uptake of techniques in their whole-class teaching practices and the impact this had on the…
ERIC Educational Resources Information Center
New York State Commission on Quality of Care for the Mentally Disabled, Albany.
This report examines issues concerned with the use of aversive behavior modification techniques in actual treatment practices at one intermediate care facility for the mentally retarded. The review of these practices reveals how, once the philosophy of using aversives takes hold at a program (to deal with seemingly intractable behaviors), its…
William E. Taylor; Robert G. Hooper
1991-01-01
A modification to Copeyon's drilling technique for making highly effective artificial cavities for red-cockaded woodpeckers is described. The changes virtually eliminate the possibility of making a mistake in constructing cavities and reduce the learning time to less than 2 weeks. The basic change is the use of a 3-inch access hole that allows the relative position...
A New Quantum Watermarking Based on Quantum Wavelet Transforms
NASA Astrophysics Data System (ADS)
Heidari, Shahrokh; Naseri, Mosayeb; Gheibi, Reza; Baghfalaki, Masoud; Rasoul Pourarian, Mohammad; Farouk, Ahmed
2017-06-01
Quantum watermarking is a technique to embed specific information, usually the owner's identification, into quantum cover data for copyright protection purposes. In this paper, a new scheme for quantum watermarking based on quantum wavelet transforms is proposed, which includes scrambling, embedding and extracting procedures. The invisibility and robustness performances of the proposed watermarking method are confirmed by simulation. The invisibility of the scheme is examined by the peak signal-to-noise ratio (PSNR) and the histogram calculation. Furthermore, the robustness of the scheme is analyzed by the Bit Error Rate (BER) and the Correlation Two-Dimensional (Corr 2-D) calculation. The simulation results indicate that the proposed watermarking scheme shows not only acceptable visual quality but also good resistance against different types of attack. Supported by Kermanshah Branch, Islamic Azad University, Kermanshah, Iran
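The two evaluation metrics named above, PSNR for invisibility and bit error rate for robustness, are classical image-processing quantities; the sketch below computes them on ordinary arrays and is not a quantum simulation of the scheme.

```python
# Sketch of PSNR (invisibility) and bit error rate (robustness) calculations.
import numpy as np

def psnr(original, watermarked, peak=255.0):
    mse = np.mean((original.astype(np.float64) - watermarked.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

def bit_error_rate(bits_in, bits_out):
    bits_in, bits_out = np.asarray(bits_in), np.asarray(bits_out)
    return np.mean(bits_in != bits_out)

# Toy usage with a synthetic cover image and a slightly perturbed "watermarked" copy.
rng = np.random.default_rng(0)
cover = rng.integers(0, 256, (64, 64), dtype=np.uint8)
stego = np.clip(cover.astype(int) + rng.integers(-1, 2, cover.shape), 0, 255).astype(np.uint8)
print("PSNR (dB):", round(psnr(cover, stego), 2))
print("BER:", bit_error_rate(rng.integers(0, 2, 128), rng.integers(0, 2, 128)))
```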
Modifications Of Hydrostatic-Bearing Computer Program
NASA Technical Reports Server (NTRS)
Hibbs, Robert I., Jr.; Beatty, Robert F.
1991-01-01
Several modifications made to enhance utility of HBEAR, computer program for analysis and design of hydrostatic bearings. Modifications make program applicable to more realistic cases and reduce time and effort necessary to arrive at a suitable design. Uses search technique to iterate on size of orifice to obtain required pressure ratio.
Microprocessor-Based Neural-Pulse-Wave Analyzer
NASA Technical Reports Server (NTRS)
Kojima, G. K.; Bracchi, F.
1983-01-01
Microprocessor-based system analyzes amplitudes and rise times of neural waveforms. Displaying histograms of measured parameters helps researchers determine how many nerves contribute to signal and specify waveform characteristics of each. Results are improved noise rejection, full or partial separation of overlapping peaks, and isolation and identification of related peaks in different histograms. 2
USDA-ARS?s Scientific Manuscript database
Thresholding is an important step in the segmentation of image features, and the existing methods are not all effective when the image histogram exhibits a unimodal pattern, which is common in defect detection of fruit. This study was aimed at developing a general automatic thresholding methodology ...
Evaluation of digital radiography practice using exposure index tracking
Zhou, Yifang; Allahverdian, Janet; Nute, Jessica L.; Lee, Christina
2016-01-01
Some digital radiography (DR) detectors and software allow for remote download of exam statistics, including image reject status, body part, projection, and exposure index (EI). The ability to have automated data collection from multiple DR units is conducive to a quality control (QC) program monitoring institutional radiographic exposures. We have implemented such a QC program with the goal to identify outliers in machine radiation output and opportunities for improvement in radiation dose levels. We studied the QC records of four digital detectors in greater detail on a monthly basis for one year. Although individual patient entrance skin exposure varied, the radiation dose levels to the detectors were made to be consistent via phototimer recalibration. The exposure data stored on each digital detector were periodically downloaded in a spreadsheet format for analysis. EI median and standard deviation were calculated for each protocol (by body part) and EI histograms were created for torso protocols. When histograms of EI values for different units were compared, we observed differences up to 400 in average EI (representing 60% difference in radiation levels to the detector) between units nominally calibrated to the same EI. We identified distinct components of the EI distributions, which in some cases, had mean EI values 300 apart. Peaks were observed at the current calibrated EI, a previously calibrated EI, and an EI representing computed radiography (CR) techniques. Our findings in this ongoing project have allowed us to make useful interventions, from emphasizing the use of phototimers instead of institutional memory of manual techniques to improvements in our phototimer calibration. We believe that this QC program can be implemented at other sites and can reveal problems with radiation levels in the aggregate that are difficult to identify on a case‐by‐case basis. PACS number(s): 87.59.bf PMID:27929507
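The aggregation behind such a QC program can be sketched with a few lines of pandas: per-protocol median and standard deviation of the exposure index from the downloaded spreadsheet. The column names are assumptions about the export format.

```python
# Sketch of per-protocol EI statistics from a downloaded detector export.
# Column names ("body_part", "projection", "EI") are assumed, not the vendor's schema.
import pandas as pd

df = pd.read_csv("detector_export.csv")                  # placeholder file name
stats = (df.groupby(["body_part", "projection"])["EI"]
           .agg(median_EI="median", std_EI="std", n="count")
           .reset_index())
print(stats)

# A shift of ~400 in average EI between two units nominally calibrated to the same
# target would appear here as separated peaks when per-unit EI values are histogrammed.
```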
Söhn, Matthias; Alber, Markus; Yan, Di
2007-09-01
The variability of dose-volume histogram (DVH) shapes in a patient population can be quantified using principal component analysis (PCA). We applied this to rectal DVHs of prostate cancer patients and investigated the correlation of the PCA parameters with late bleeding. PCA was applied to the rectal wall DVHs of 262 patients, who had been treated with a four-field box, conformal adaptive radiotherapy technique. The correlated changes in the DVH pattern were revealed as "eigenmodes," which were ordered by their importance to represent data set variability. Each DVH is uniquely characterized by its principal components (PCs). The correlation of the first three PCs and chronic rectal bleeding of Grade 2 or greater was investigated with uni- and multivariate logistic regression analyses. Rectal wall DVHs in four-field conformal RT can primarily be represented by the first two or three PCs, which describe approximately 94% or 96% of the DVH shape variability, respectively. The first eigenmode models the total irradiated rectal volume; thus, PC1 correlates to the mean dose. Mode 2 describes the interpatient differences of the relative rectal volume in the two- or four-field overlap region. Mode 3 reveals correlations of volumes with intermediate doses ( approximately 40-45 Gy) and volumes with doses >70 Gy; thus, PC3 is associated with the maximal dose. According to univariate logistic regression analysis, only PC2 correlated significantly with toxicity. However, multivariate logistic regression analysis with the first two or three PCs revealed an increased probability of bleeding for DVHs with more than one large PC. PCA can reveal the correlation structure of DVHs for a patient population as imposed by the treatment technique and provide information about its relationship to toxicity. It proves useful for augmenting normal tissue complication probability modeling approaches.
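The analysis pipeline described above can be sketched as PCA on a matrix of cumulative DVHs followed by logistic regression of toxicity on the leading principal components. The data below are synthetic; only the structure of the computation is illustrated.

```python
# Sketch of DVH shape analysis: PCA of per-patient cumulative DVHs, then logistic
# regression of a binary toxicity outcome on the leading principal components.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_patients, n_dose_bins = 262, 80
dvh = np.sort(rng.random((n_patients, n_dose_bins)), axis=1)[:, ::-1]   # fake cumulative DVHs
toxicity = rng.integers(0, 2, n_patients)                               # fake Grade >= 2 bleeding

pca = PCA(n_components=3)
pcs = pca.fit_transform(dvh)                     # each DVH summarised by its first 3 PCs
print("explained variance:", pca.explained_variance_ratio_.round(3))

model = LogisticRegression().fit(pcs, toxicity)  # multivariate regression on PC1-PC3
print("coefficients:", model.coef_.round(3))
```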
Sharma, Harshita; Zerbe, Norman; Klempert, Iris; Hellwich, Olaf; Hufnagl, Peter
2017-11-01
Deep learning using convolutional neural networks is an actively emerging field in histological image analysis. This study explores deep learning methods for computer-aided classification in H&E stained histopathological whole slide images of gastric carcinoma. An introductory convolutional neural network architecture is proposed for two computerized applications, namely, cancer classification based on immunohistochemical response and necrosis detection based on the existence of tumor necrosis in the tissue. Classification performance of the developed deep learning approach is quantitatively compared with traditional image analysis methods in digital histopathology requiring prior computation of handcrafted features, such as statistical measures using gray level co-occurrence matrix, Gabor filter-bank responses, LBP histograms, gray histograms, HSV histograms and RGB histograms, followed by random forest machine learning. Additionally, the widely known AlexNet deep convolutional framework is comparatively analyzed for the corresponding classification problems. The proposed convolutional neural network architecture reports favorable results, with an overall classification accuracy of 0.6990 for cancer classification and 0.8144 for necrosis detection. Copyright © 2017 Elsevier Ltd. All rights reserved.
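For context, the following sketch shows the general shape of the handcrafted-feature baseline mentioned above (an LBP histogram plus a gray-level histogram fed to a random forest); the patch size, bin counts, and toy data are illustrative assumptions, not the authors' pipeline.

```python
# Sketch of a handcrafted-feature baseline (LBP histogram + gray histogram -> random forest),
# in the spirit of the comparison methods; parameters are illustrative assumptions.
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.ensemble import RandomForestClassifier

def patch_features(gray_patch, P=8, R=1.0):
    """Concatenate a uniform-LBP histogram and a gray-level histogram."""
    lbp = local_binary_pattern(gray_patch, P, R, method="uniform")
    lbp_hist, _ = np.histogram(lbp, bins=P + 2, range=(0, P + 2), density=True)
    gray_hist, _ = np.histogram(gray_patch, bins=32, range=(0, 256), density=True)
    return np.concatenate([lbp_hist, gray_hist])

# Toy data: random 64x64 patches with binary labels (e.g. necrosis vs. non-necrosis).
rng = np.random.default_rng(0)
patches = rng.integers(0, 256, size=(40, 64, 64)).astype(np.uint8)
labels = rng.integers(0, 2, size=40)

X = np.array([patch_features(p) for p in patches])
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```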
Distribution of a suite of elements including arsenic and mercury in Alabama coal
Goldhaber, Martin B.; Bigelow, R.C.; Hatch, J.R.; Pashin, J.C.
2000-01-01
Arsenic and other elements are unusually abundant in Alabama coal. This conclusion is based on chemical analyses of coal in the U.S. Geological Survey's National Coal Resources Data System (NCRDS; Bragg and others, 1994). According to NCRDS data, the average concentration of arsenic in Alabama coal (72 ppm) is three times higher than is the average for all U.S. coal (24 ppm). Of the U.S. coal analyses for arsenic that are at least 3 standard deviations above the mean, approximately 90% are from the coal fields of Alabama. Figure 1 contrasts the abundance of arsenic in coal of the Warrior field of Alabama (histogram C) with that of coal of the Powder River Basin, Wyoming (histogram A), and the Eastern Interior Province including the Illinois Basin and nearby areas (histogram B). The Warrior field is by far the largest in Alabama. On the histogram, the large 'tail' of very high values (> 200 ppm) in the Warrior coal contrasts with the other two regions that have very few analyses greater than 200 ppm.
Real-Time Tracking by Double Templates Matching Based on Timed Motion History Image with HSV Feature
Li, Zhiyong; Li, Pengfei; Yu, Xiaoping; Hashem, Mervat
2014-01-01
It is a challenge to represent the target appearance model for moving object tracking in complex environments. This study presents a novel method whose appearance model is described by double templates based on the timed motion history image with an HSV color histogram feature (tMHI-HSV). The main components include offline and online template initialization, calculation of tMHI-HSV-based candidate patch feature histograms, double template matching (DTM) for object location, and template updating. Firstly, we initialize the target object region and calculate its HSV color histogram feature as the offline and online templates. Secondly, the tMHI-HSV is used to segment the motion region and to calculate the candidate object patches' color histograms to represent their appearance models. Finally, we utilize the DTM method to track the target and update the offline and online templates in real time. The experimental results show that the proposed method can efficiently handle the scale variation and pose change of rigid and nonrigid objects, even under illumination changes and occlusion. PMID:24592185
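A minimal sketch of the HSV colour-histogram appearance model and template comparison, using OpenCV's calcHist/compareHist, is given below; the patch sizes, bin counts, and Bhattacharyya metric are assumptions for illustration and do not reproduce the tMHI-HSV segmentation or update rules.

```python
# Sketch: HSV colour-histogram appearance model and template matching with OpenCV
# (illustrative of the general idea, not the paper's tMHI-HSV implementation).
import cv2
import numpy as np

def hsv_hist(bgr_patch, h_bins=30, s_bins=32):
    hsv = cv2.cvtColor(bgr_patch, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0, 1], None, [h_bins, s_bins], [0, 180, 0, 256])
    cv2.normalize(hist, hist, alpha=0, beta=1, norm_type=cv2.NORM_MINMAX)
    return hist

# Hypothetical offline template and a candidate patch.
rng = np.random.default_rng(0)
template = rng.integers(0, 256, (40, 40, 3), dtype=np.uint8)
candidate = rng.integers(0, 256, (40, 40, 3), dtype=np.uint8)

d_offline = cv2.compareHist(hsv_hist(template), hsv_hist(candidate), cv2.HISTCMP_BHATTACHARYYA)
print("Bhattacharyya distance to offline template:", d_offline)
# The candidate with the smallest combined distance to the offline and online templates
# would be selected as the new object location, and the online template updated.
```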
Stark, J A; Hladky, S B
2000-02-01
Dwell-time histograms are often plotted as part of patch-clamp investigations of ion channel currents. The advantages of plotting these histograms with a logarithmic time axis were demonstrated in earlier work (J. Physiol. (Lond.) 378:141-174; Pflügers Arch. 410:530-553; Biophys. J. 52:1047-1054). Sigworth and Sine argued that the interpretation of such histograms is simplified if the counts are presented in a manner similar to that of a probability density function. However, when ion channel records are recorded as a discrete time series, the dwell times are quantized. As a result, the mapping of dwell times to logarithmically spaced bins is highly irregular; bins may be empty, and significant irregularities may extend beyond the duration of 100 samples. Using simple approximations based on the nature of the binning process and the transformation rules for probability density functions, we develop adjustments for the display of the counts to compensate for this effect. Tests with simulated data suggest that this procedure provides a faithful representation of the data.
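The following sketch illustrates the underlying issue: dwell times quantized to the sampling interval are binned on a logarithmic axis and displayed as a density (counts per unit time); the sampling interval, mean dwell time, and bin count are assumptions, and the authors' specific compensation is not reproduced.

```python
# Minimal sketch: log-binned histogram of quantized dwell times, displayed as a density
# (counts / bin width). This illustrates the general binning issue; it is not the
# authors' exact compensation procedure.
import numpy as np

rng = np.random.default_rng(0)
dt = 0.1e-3                                            # sampling interval (s), assumed
tau = 2e-3                                             # mean dwell time (s), assumed
dwells = rng.exponential(tau, size=20000)
dwells_q = np.maximum(np.round(dwells / dt), 1) * dt   # dwell times quantized to dt

edges = np.logspace(np.log10(dt), np.log10(dwells_q.max()), 40)
counts, edges = np.histogram(dwells_q, bins=edges)
density = counts / np.diff(edges)                      # counts per unit time, per bin
# Because dwell times are multiples of dt, narrow log bins may contain zero or several
# quantization levels; `density` should be further adjusted for the number of
# representable dwell times falling in each bin.
```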
Qualitative evaluations and comparisons of six night-vision colorization methods
NASA Astrophysics Data System (ADS)
Zheng, Yufeng; Reese, Kristopher; Blasch, Erik; McManamon, Paul
2013-05-01
Current multispectral night vision (NV) colorization techniques can manipulate images to produce colorized images that closely resemble natural scenes. The colorized NV images can enhance human perception by improving observer object classification and reaction times, especially in low-light conditions. This paper focuses on the qualitative (subjective) evaluations and comparisons of six NV colorization methods. The multispectral images include visible (Red-Green-Blue), near infrared (NIR), and long wave infrared (LWIR) images. The six colorization methods are channel-based color fusion (CBCF), statistic matching (SM), histogram matching (HM), joint-histogram matching (JHM), statistic matching then joint-histogram matching (SM-JHM), and the lookup table (LUT). Four categories of quality measurements are used for the qualitative evaluations: contrast, detail, colorfulness, and overall quality. Each measurement is rated on a 1-to-3 scale representing low, average, and high quality, respectively. Specifically, high contrast (a rated score of 3) means an adequate level of brightness and contrast. High detail represents high clarity of detailed content with few artifacts. High colorfulness preserves more natural colors (i.e., closely resembles the daylight image). Overall quality is determined from the NV image compared to the reference image. Nine sets of multispectral NV images were used in our experiments. For each set, the six colorized NV images (produced from NIR and LWIR images) are concurrently presented to users along with the reference color (RGB) image (taken in daytime). A total of 67 subjects passed a screening test ("Ishihara Color Blindness Test") and were asked to evaluate the nine sets of colorized images. The experimental results ranked the colorization methods from best to worst as CBCF, SM, SM-JHM, LUT, JHM, and HM. It is anticipated that this work will provide a benchmark for NV colorization and for quantitative evaluation using an objective metric such as the objective evaluation index (OEI).
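As an illustration of one of the compared methods, the sketch below implements the basic idea of statistic matching (SM): shifting and scaling each channel of a false-colour NV image so its mean and standard deviation match a daylight reference; operating directly on RGB channels and the toy arrays are simplifying assumptions.

```python
# Sketch of the "statistic matching" (SM) idea: shift and scale each channel of a
# false-colour NV image so its mean/std match those of a daytime reference image.
# Illustrative only; the published SM method may operate in a different colour space.
import numpy as np

def statistic_match(source, reference):
    """source, reference: float arrays of shape (H, W, 3)."""
    out = np.empty_like(source, dtype=np.float64)
    for c in range(3):
        s, r = source[..., c], reference[..., c]
        out[..., c] = (s - s.mean()) / (s.std() + 1e-12) * r.std() + r.mean()
    return np.clip(out, 0, 255)

rng = np.random.default_rng(0)
nv_false_colour = rng.random((120, 160, 3)) * 120        # hypothetical NV colour image
daylight_ref = rng.random((120, 160, 3)) * 255           # hypothetical daytime reference
colourised = statistic_match(nv_false_colour, daylight_ref)
```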
Zadpoor, Amir A
2015-03-01
Mechanical characterization of biological tissues and biomaterials at the nano-scale is often performed using nanoindentation experiments. The different constituents of the characterized materials will then appear in the histogram that shows the probability of measuring a certain range of mechanical properties. An objective technique is needed to separate the probability distributions that are mixed together in such a histogram. In this paper, finite mixture models (FMMs) are proposed as a tool capable of performing such types of analysis. Finite Gaussian mixture models assume that the measured probability distribution is a weighted combination of a finite number of Gaussian distributions with separate mean and standard deviation values. Dedicated optimization algorithms are available for fitting such a weighted mixture model to experimental data. Moreover, certain objective criteria are available to determine the optimum number of Gaussian distributions. In this paper, FMMs are used for interpreting the probability distribution functions representing the distributions of the elastic moduli of osteoarthritic human cartilage and co-polymeric microspheres. As for cartilage experiments, FMMs indicate that at least three mixture components are needed for describing the measured histogram. While the mechanical properties of the softer mixture components, often assumed to be associated with Glycosaminoglycans, were found to be more or less constant regardless of whether two or three mixture components were used, those of the second mixture component (i.e. collagen network) considerably changed depending on the number of mixture components. Regarding the co-polymeric microspheres, the optimum number of mixture components estimated by the FMM theory, i.e. 3, nicely matches the number of co-polymeric components used in the structure of the polymer. The computer programs used for the presented analyses are made freely available online for other researchers to use. Copyright © 2014 Elsevier B.V. All rights reserved.
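A minimal sketch of this procedure with scikit-learn's GaussianMixture, selecting the number of components by BIC, is shown below; the synthetic moduli and the BIC criterion are illustrative assumptions rather than the authors' released programs.

```python
# Sketch: fitting finite Gaussian mixture models to a set of nanoindentation moduli
# and selecting the number of components by BIC (scikit-learn; illustrative only).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Hypothetical elastic moduli (MPa) drawn from three underlying constituents.
moduli = np.concatenate([rng.normal(0.1, 0.03, 400),
                         rng.normal(0.5, 0.10, 300),
                         rng.normal(2.0, 0.40, 200)]).reshape(-1, 1)

fits = {k: GaussianMixture(n_components=k, random_state=0).fit(moduli) for k in range(1, 6)}
bics = {k: m.bic(moduli) for k, m in fits.items()}
best_k = min(bics, key=bics.get)          # lowest BIC = preferred model
best = fits[best_k]
print("optimal number of components:", best_k)
print("means:", best.means_.ravel(), "weights:", best.weights_)
```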
Grover, Davinder S; Fellman, Ronald L
2016-06-01
To describe a novel technique for thermally marking the tip of a suture, in preparation for a gonioscopy-assisted transluminal trabeculotomy. One patient was used as an example for this technique. Technique report. The authors introduce a modification of a novel surgical procedure (GATT) in which a suture is marked and thermally blunted, allowing proper visualization while performing an ab interno, minimally invasive, circumferential 360-degree suture trabeculotomy. The authors have previously reported on the GATT surgery with the use of an illuminated microcatheter, which allowed for visualization of the tip of the catheter as it circumnavigated Schlemm canal. This modification allows for similar visualization of the tip of the suture; however, it is much more cost-effective while still maintaining similar safety.
WebGL and web audio software lightweight components for multimedia education
NASA Astrophysics Data System (ADS)
Chang, Xin; Yuksel, Kivanc; Skarbek, Władysław
2017-08-01
The paper presents the results of our recent work on the development of the contemporary computing platform DC2 for multimedia education using WebGL and Web Audio, the W3C standards. Using the literate programming paradigm, the WEBSA educational tools were developed. They offer the user (student) access to an expandable collection of WebGL shaders and Web Audio scripts. The unique feature of DC2 is the option of literate programming, offered to both the author and the reader in order to improve interactivity with lightweight WebGL and Web Audio components. For instance, users can define source audio nodes including synthetic sources, destination audio nodes, and nodes for audio processing such as sound wave shaping, spectral band filtering, convolution-based modification, etc. In the case of WebGL, besides classic graphics effects based on mesh and fractal definitions, novel image processing and analysis by shaders is offered, such as nonlinear filtering, histogram of gradients, and Bayesian classifiers.
NASA Astrophysics Data System (ADS)
Kusyk, Janusz; Eskicioglu, Ahmet M.
2005-10-01
Digital watermarking is considered to be a major technology for the protection of multimedia data. Some of the important applications are broadcast monitoring, copyright protection, and access control. In this paper, we present a semi-blind watermarking scheme for embedding a logo in color images using the DFT domain. After computing the DFT of the luminance layer of the cover image, the magnitudes of the DFT coefficients are compared and modified. A given watermark is embedded in three frequency bands: low, middle, and high. Our experiments show that the watermarks extracted from the lower frequencies have the best visual quality for low-pass filtering, adding Gaussian noise, JPEG compression, resizing, rotation, and scaling, and the watermarks extracted from the higher frequencies have the best visual quality for cropping, intensity adjustment, histogram equalization, and gamma correction. Extractions from the fragmented and translated image are identical to extractions from the unattacked watermarked image. The collusion and rewatermarking attacks do not provide the hacker with useful tools.
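The sketch below conveys the general flavour of embedding watermark bits in mid-frequency DFT magnitudes of the luminance channel; the band radius, embedding strength, and additive rule are assumptions and do not reproduce the authors' comparison-based modification.

```python
# Sketch: additive embedding of binary watermark bits in selected DFT magnitude
# coefficients of the luminance channel (conceptual illustration only).
import numpy as np

rng = np.random.default_rng(0)
luminance = rng.random((256, 256))                 # hypothetical cover-image luminance
logo_bits = rng.integers(0, 2, size=64)            # hypothetical watermark bits

F = np.fft.fftshift(np.fft.fft2(luminance))
mag, phase = np.abs(F), np.angle(F)

cy, cx = np.array(F.shape) // 2
radius, alpha = 40, 2.0                            # "middle band" radius and strength (assumed)
angles = np.linspace(0, np.pi, len(logo_bits), endpoint=False)
rows = (cy + radius * np.sin(angles)).astype(int)
cols = (cx + radius * np.cos(angles)).astype(int)
mag[rows, cols] += alpha * logo_bits               # modify mid-frequency magnitudes

watermarked = np.real(np.fft.ifft2(np.fft.ifftshift(mag * np.exp(1j * phase))))
```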
Jang, Jinhee; Kim, Tae-Won; Hwang, Eo-Jin; Choi, Hyun Seok; Koo, Jaseong; Shin, Yong Sam; Jung, So-Lyung; Ahn, Kook-Jin; Kim, Bum-Soo
2017-01-01
The purpose of this study was to compare histogram analysis and visual scores in 3T MRI assessment of middle cerebral arterial wall enhancement in patients with acute stroke, for the differentiation of parent artery disease (PAD) from small artery disease (SAD). Among 82 consecutive patients seen at a tertiary hospital over one year, 25 patients with acute infarcts in the middle cerebral artery (MCA) territory were included in this study: 15 patients with PAD and 10 with SAD. Three-dimensional contrast-enhanced T1-weighted turbo spin echo MR images with black-blood preparation at 3T were analyzed both qualitatively and quantitatively. The degree of MCA stenosis and the visual and histogram assessments of MCA wall enhancement were evaluated. A statistical analysis was performed to compare diagnostic accuracy between qualitative and quantitative metrics. The degree of stenosis, visual enhancement score, geometric mean (GM), and the 90th percentile (90P) value from the histogram analysis were significantly higher in PAD than in SAD (p = 0.006 for stenosis, p < 0.001 for the others). The receiver operating characteristic curve areas of GM and 90P were both 1.00 (95% confidence interval [CI], 0.86-1.00). A histogram analysis of the relevant arterial wall enhancement allows differentiation between PAD and SAD in patients with acute stroke within the MCA territory.
Takahashi, Masahiro; Kozawa, Eito; Tanisaka, Megumi; Hasegawa, Kousei; Yasuda, Masanori; Sakai, Fumikazu
2016-06-01
We explored the role of histogram analysis of apparent diffusion coefficient (ADC) maps for discriminating uterine carcinosarcoma and endometrial carcinoma. We retrospectively evaluated findings in 13 patients with uterine carcinosarcoma and 50 patients with endometrial carcinoma who underwent diffusion-weighted imaging (b = 0, 500, 1000 s/mm^2) at 3T with acquisition of corresponding ADC maps. We derived histogram data from regions of interest drawn on all slices of the ADC maps in which tumor was visualized, excluding areas of necrosis and hemorrhage in the tumor. We used the Mann-Whitney test to evaluate the capacity of histogram parameters (mean ADC value, 5th to 95th percentiles, skewness, kurtosis) to discriminate uterine carcinosarcoma and endometrial carcinoma and analyzed the receiver operating characteristic (ROC) curve to determine the optimum threshold value for each parameter and its corresponding sensitivity and specificity. Carcinosarcomas demonstrated significantly higher mean values of ADC, 95th, 90th, 75th, 50th, and 25th percentiles, and kurtosis than endometrial carcinomas (P < 0.05). ROC curve analysis of the 75th percentile yielded the best area under the ROC curve (AUC; 0.904), sensitivity of 100%, and specificity of 78.0%, with a cutoff value of 1.034 × 10^-3 mm^2/s. Histogram analysis of ADC maps might be helpful for discriminating uterine carcinosarcomas and endometrial carcinomas. J. Magn. Reson. Imaging 2016;43:1301-1307. © 2015 Wiley Periodicals, Inc.
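A generic sketch of whole-lesion ADC histogram parameters (mean, percentiles, skewness, kurtosis) computed from an ADC map and a tumour mask is given below; the array sizes and ROI are placeholders.

```python
# Sketch: whole-lesion ADC histogram parameters (percentiles, skewness, kurtosis)
# from an ADC map and a tumour mask; illustrative of the kind of analysis described.
import numpy as np
from scipy.stats import skew, kurtosis

rng = np.random.default_rng(0)
adc_map = rng.normal(1.1e-3, 0.2e-3, size=(30, 128, 128))    # hypothetical ADC map (mm^2/s)
mask = np.zeros_like(adc_map, dtype=bool)
mask[10:20, 40:80, 40:80] = True                             # hypothetical tumour ROI

values = adc_map[mask]
params = {
    "mean": values.mean(),
    **{f"p{q}": np.percentile(values, q) for q in (5, 25, 50, 75, 90, 95)},
    "skewness": skew(values),
    "kurtosis": kurtosis(values),
}
print(params)
```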
Min, Xiangde; Feng, Zhaoyan; Wang, Liang; Cai, Jie; Yan, Xu; Li, Basen; Ke, Zan; Zhang, Peipei; You, Huijuan
2018-01-01
To assess the values of parameters derived from whole-lesion histograms of the apparent diffusion coefficient (ADC) at 3T for the characterization of testicular germ cell tumors (TGCTs). A total of 24 men with TGCTs underwent 3T diffusion-weighted imaging. Fourteen tumors were pathologically confirmed as seminomas, and ten tumors were pathologically confirmed as nonseminomas. Whole-lesion histogram analysis of the ADC values was performed. A Mann-Whitney U test was employed to compare the differences in ADC histogram parameters between seminomas and nonseminomas. Receiver operating characteristic analysis was used to identify the cutoff values for each parameter for differentiating seminomas from nonseminomas; furthermore, the area under the curve (AUC) was calculated to evaluate the diagnostic accuracy. The medians of the 10th, 25th, 50th, 75th, and 90th percentile, mean, minimum, and maximum ADC values were all significantly lower for seminomas than for nonseminomas (p<0.05 for all). In contrast, the median kurtosis and skewness of the ADC values of seminomas were both significantly higher than those of nonseminomas (p=0.003 and 0.001, respectively). For differentiating nonseminomas from seminomas, the 10th percentile ADC yielded the highest AUC with a sensitivity and specificity of 100% and 92.86%, respectively. Whole-lesion histogram analysis of ADCs might be used for preoperative characterization of TGCTs. Copyright © 2017 Elsevier B.V. All rights reserved.
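For the cut-off selection step, a sketch using scikit-learn's roc_curve and the Youden index is shown below; the synthetic ADC values and group labels are placeholders, not the study data.

```python
# Sketch: choosing an ADC cut-off for seminoma vs. nonseminoma with an ROC curve
# and the Youden index (scikit-learn; data below are synthetic placeholders).
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(0)
# 0 = seminoma (lower ADC), 1 = nonseminoma (higher ADC); hypothetical 10th-percentile ADCs.
labels = np.array([0] * 14 + [1] * 10)
adc_p10 = np.concatenate([rng.normal(0.7e-3, 0.10e-3, 14), rng.normal(1.1e-3, 0.15e-3, 10)])

fpr, tpr, thresholds = roc_curve(labels, adc_p10)
best = np.argmax(tpr - fpr)                        # Youden's J statistic
print("AUC:", roc_auc_score(labels, adc_p10))
print("cut-off:", thresholds[best], "sensitivity:", tpr[best], "specificity:", 1 - fpr[best])
```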
Fried, Itzhak; Koch, Christof
2014-01-01
Peristimulus time histograms are a widespread form of visualizing neuronal responses. Kernel convolution methods transform these histograms into a smooth, continuous probability density function. This provides an improved estimate of a neuron's actual response envelope. We here develop a classifier, called the h-coefficient, to determine whether time-locked fluctuations in the firing rate of a neuron should be classified as a response or as random noise. Unlike previous approaches, the h-coefficient takes advantage of the more precise response envelope estimation provided by the kernel convolution method. The h-coefficient quantizes the smoothed response envelope and calculates the probability of a response of a given shape to occur by chance. We tested the efficacy of the h-coefficient in a large data set of Monte Carlo simulated smoothed peristimulus time histograms with varying response amplitudes, response durations, trial numbers, and baseline firing rates. Across all these conditions, the h-coefficient significantly outperformed more classical classifiers, with a mean false alarm rate of 0.004 and a mean hit rate of 0.494. We also tested the h-coefficient's performance in a set of neuronal responses recorded in humans. The algorithm behind the h-coefficient provides various opportunities for further adaptation and the flexibility to target specific parameters in a given data set. Our findings confirm that the h-coefficient can provide a conservative and powerful tool for the analysis of peristimulus time histograms with great potential for future development. PMID:25475352
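A minimal sketch of the kernel-convolution step that produces the smoothed response envelope is given below; the bin width, Poisson toy data, and Gaussian kernel width are assumptions, and the h-coefficient itself is not reproduced.

```python
# Sketch: turning a peristimulus time histogram into a smooth rate estimate by kernel
# convolution with a Gaussian (the general idea behind the response-envelope estimate;
# the h-coefficient classifier is not reproduced here).
import numpy as np
from scipy.ndimage import gaussian_filter1d

bin_width = 0.001                                                  # 1 ms bins (assumed)
spike_counts = np.random.default_rng(0).poisson(0.02, size=1000)   # toy PSTH over a 1 s window
rate = spike_counts / bin_width                                    # counts -> firing rate (Hz)

sigma_ms = 10                                  # kernel width (assumed; equals bins here, as bins are 1 ms)
envelope = gaussian_filter1d(rate.astype(float), sigma=sigma_ms)
# `envelope` approximates a continuous firing-rate density; a classifier such as the
# h-coefficient would then ask how likely a deflection of this shape is under chance.
```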
Using color histogram normalization for recovering chromatic illumination-changed images.
Pei, S C; Tseng, C L; Wu, C C
2001-11-01
We propose a novel image-recovery method using the covariance matrix of the red-green-blue (R-G-B) color histogram and tensor theories. The image-recovery method is called the color histogram normalization algorithm. It is known that the color histograms of an image taken under varied illuminations are related by a general affine transformation of the R-G-B coordinates when the illumination is changed. We propose a simplified affine model for application with illumination variation. This simplified affine model considers the effects of only three basic forms of distortion: translation, scaling, and rotation. According to this principle, we can estimate the affine transformation matrix necessary to recover images whose color distributions are varied as a result of illumination changes. We compare the normalized color histogram of the standard image with that of the tested image. By performing some operations of simple linear algebra, we can estimate the matrix of the affine transformation between two images under different illuminations. To demonstrate the performance of the proposed algorithm, we divide the experiments into two parts: computer-simulated images and real images corresponding to illumination changes. Simulation results show that the proposed algorithm is effective for both types of images. We also explain the noise-sensitive skew-rotation estimation that exists in the general affine model and demonstrate that the proposed simplified affine model without the use of skew rotation is better than the general affine model for such applications.
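The sketch below illustrates the simplified affine idea in its most basic form, estimating per-channel scaling and translation from the colour means and standard deviations and omitting rotation; it is a rough illustration, not the paper's covariance/tensor-based estimator.

```python
# Sketch of the simplified affine idea: estimate per-channel translation and scaling
# between two R-G-B distributions and map the test image towards the reference
# (rotation omitted, as in the simplified model; illustrative only).
import numpy as np

def estimate_translation_scaling(ref_rgb, test_rgb):
    """ref_rgb, test_rgb: (N, 3) arrays of pixel colours."""
    scale = ref_rgb.std(axis=0) / (test_rgb.std(axis=0) + 1e-12)
    shift = ref_rgb.mean(axis=0) - test_rgb.mean(axis=0) * scale
    return scale, shift

rng = np.random.default_rng(0)
reference = rng.random((10000, 3)) * 255
test = reference * np.array([0.8, 0.9, 1.1]) + np.array([10, -5, 20])  # simulated illumination change

scale, shift = estimate_translation_scaling(reference, test)
recovered = np.clip(test * scale + shift, 0, 255)
```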
NASA Astrophysics Data System (ADS)
Urbaniak, Daniel J.
2004-11-01
In the research reported here, the surface modification of medical grade poly(dimethyl siloxane), polyetherurethane, and stainless steel through gamma-radiation grafting of hydrophilic polymers was investigated. Emphasis was placed on developing improved and simplified surface modification methods that produce more stable and more bioacceptable hydrophilic graft surfaces. As a result of this research, new surface modification techniques were developed that yield significantly improved surface stability unachievable using previous surface modification techniques. The surface modification of poly(dimethyl siloxane) with hydrophilic polymers was carried out using gamma-radiation-initiated graft polymerization. The addition of alkali metal hydroxides afforded a unique way to enhance the grafting of N-vinyl-2-pyrrolidone, dimethylacrylamide, 2-methacryloyloxyethyl phosphoryl choline, N,N-dimethyl-N-(methacryloyloxyethyl)-N-(3-sulfopropyl)-ammonium-betaine, N,N-dimethyl-N-(methacrylamidopropyl)-N-(3-sulfopropyl)-ammonium-betaine, and copolymers thereof to silicones. Ethanolamine was found to further enhance the grafting of some hydrophilic polymers to silicone. The resulting hydrophilic surface grafts were resistant to hydrophobic surface rearrangement. This process overcomes previous problems inherent in silicone surface modification. The technique was also found to moderately enhance the grafting of hydrophilic monomers to polyetherurethane and to 316-L stainless steel. The surface modification of 316-L stainless steel was further enhanced by treating the substrates with a chromium III methacrylate bonding agent prior to irradiation. The coatings were evaluated for their potential use as depots for delivering therapeutic agents. The release of ofloxacin from surface-modified poly(dimethyl siloxane) and dexamethasone from surface-modified 316-L stainless steel was evaluated by in-vitro experiments. Therapeutic levels of drugs were released from surface-modified specimens via a burst effect. Improved surface characterization methods were another aspect of this research. New nanomechanical testing techniques were developed and used to evaluate the viscoelastic surface mechanical properties of low-modulus surface-modified specimens. Dynamic nanoindentation characterization techniques were designed to measure the storage modulus and loss modulus of compliant viscoelastic substrate surfaces. The results of these experiments were compared with modulus data obtained by conventional dynamic mechanical spectroscopy. Nanoscratch testing methods were also developed that qualitatively compared the abrasion resistance of surface-modified substrates. (Abstract shortened by UMI.)
Nguyen-Kim, Thi Dan Linh; Maurer, Britta; Suliman, Yossra A; Morsbach, Fabian; Distler, Oliver; Frauenfelder, Thomas
2018-04-01
To evaluate the usability of slice-reduced sequential computed tomography (CT) compared to standard high-resolution CT (HRCT) in patients with systemic sclerosis (SSc) for qualitative and quantitative assessment of interstitial lung disease (ILD) with respect to (I) detection of lung parenchymal abnormalities, (II) qualitative and semiquantitative visual assessment, (III) quantification of ILD by histograms and (IV) accuracy for discrimination at the 20% cut-off. Sequential 9-slice CT (reduced HRCT) was retrospectively reconstructed from the standard chest HRCT of 60 SSc patients. ILD was assessed by visual scoring and quantitative histogram parameters. Results from standard and reduced HRCT were compared using non-parametric tests and analysed by univariate linear regression analyses. With respect to the detection of parenchymal abnormalities, only the detection of intrapulmonary bronchiectasis was significantly lower in reduced HRCT compared to standard HRCT (P=0.039). No differences were found comparing visual scores for fibrosis severity and extension from standard and reduced HRCT (P=0.051-0.073). All scores correlated significantly (P<0.001) with histogram parameters derived from both standard and reduced HRCT. Significantly higher values of kurtosis and skewness were found for reduced HRCT (both P<0.001). In contrast to standard HRCT, histogram parameters from reduced HRCT showed significant discrimination at the 20% fibrosis cut-off (sensitivity 88% for kurtosis and skewness; specificity 81% for kurtosis and 86% for skewness; cut-off kurtosis ≤26, cut-off skewness ≤4; both P<0.001). Reduced HRCT is a robust method to assess lung fibrosis in SSc with minimal radiation dose and with no difference from standard HRCT in the scoring of lung fibrosis severity and extension. In contrast to standard HRCT, histogram parameters derived from reduced HRCT could discriminate at a threshold of 20% lung fibrosis with high sensitivity and specificity. Hence it might be used to detect early disease progression of lung fibrosis in the context of monitoring and treatment of SSc patients.
Meyer, Hans Jonas; Höhn, Annekathrin; Surov, Alexey
2018-04-06
Functional imaging modalities like diffusion-weighted imaging are increasingly used to predict tumor behavior, such as cellularity and vascularity, in different tumors. Histogram analysis is an emerging image analysis approach, in which every voxel is used to obtain a histogram, thereby providing statistical information about the tumor. The purpose of this study was to elucidate possible associations between ADC histogram parameters and several immunohistochemical features in rectal cancer. Overall, 11 patients with histologically proven rectal cancer were included in the study. There were 2 (18.18%) females and 9 males with a mean age of 67.1 years. The Ki-67 index and the expression of p53, EGFR, VEGF, and Hif1-alpha were semiautomatically estimated. The tumors were divided into PD1-positive and PD1-negative lesions. ADC histogram analysis was performed as a whole-lesion measurement using an in-house MATLAB application. Spearman's correlation analysis revealed a strong correlation between EGFR expression and ADCmax (ρ=0.72, P=0.02). None of the vascular parameters (VEGF, Hif1-alpha) correlated with ADC parameters. Kurtosis and skewness correlated inversely with p53 expression (ρ=-0.64, P=0.03 and ρ=-0.81, P=0.002, respectively). ADCmedian and ADCmode correlated with Ki-67 (ρ=-0.62, P=0.04 and ρ=-0.65, P=0.03, respectively). PD1-positive tumors showed statistically significantly lower ADCmax values in comparison to PD1-negative tumors, 1.93 ± 0.36 vs 2.32 ± 0.47 × 10^-3 mm^2/s, p=0.04. Several associations were identified between histogram parameters derived from ADC maps and EGFR, Ki-67, and p53 expression in rectal cancer. Furthermore, ADCmax differed between PD1-positive and PD1-negative tumors, indicating an important role of ADC parameters for possible future treatment prediction. PMID:29719621
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shusharina, N; Choi, N; Bortfeld, T
2016-06-15
Purpose: To determine whether the difference in the cumulative 18F-FDG uptake histogram of lung treated with either IMRT or PSPT is associated with radiation pneumonitis (RP) in patients with inoperable stage II and III NSCLC. Methods: We analyzed 24 patients from a prospective randomized trial comparing IMRT (n=12) vs. PSPT (n=12) for inoperable NSCLC. All patients underwent PET-CT imaging between 35 and 88 days post-therapy. Post-treatment PET-CT was aligned with the planning 4D CT to establish a voxel-to-voxel correspondence between post-treatment PET and planning dose images. 18F-FDG uptake as a function of radiation dose to normal lung was obtained for each patient. The distribution of the standard uptake value (SUV) was analyzed using a volume histogram method. The image quantitative characteristics and DVH measures were correlated with clinical symptoms of pneumonitis. Results: Patients with RP were present in both groups: 5 in the IMRT group and 6 in the PSPT group. The analysis of cumulative SUV histograms showed significantly higher relative volumes of the normal lung having higher SUV uptake in the PSPT patients for both symptomatic and asymptomatic cases (VSUV=2: 10% for IMRT vs 16% for proton RT, and VSUV=1: 10% for IMRT vs 23% for proton RT). In addition, the SUV histograms for symptomatic cases in PSPT patients exhibited a significantly longer tail at the highest SUV. The absolute volume of the lung receiving a dose >70 Gy was larger in the PSPT patients. Conclusion: The 18F-FDG uptake – radiation dose response correlates with RP in both groups of patients by means of the linear regression slope. SUV is higher for the PSPT patients for both symptomatic and asymptomatic cases. The higher uptake in PSPT patients is explained by larger volumes of the lung receiving a high radiation dose.
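A sketch of the cumulative SUV-volume quantity reported above (the relative lung volume with uptake above a threshold) is given below; the synthetic SUV values are placeholders for the registered post-treatment PET voxels.

```python
# Sketch: cumulative SUV-volume values for normal lung, i.e. the relative lung volume
# with uptake above a threshold (e.g. V_SUV=1, V_SUV=2); values are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)
suv = rng.lognormal(mean=-0.5, sigma=0.6, size=200000)   # hypothetical SUVs of lung voxels

def v_suv(suv_values, threshold):
    return 100.0 * np.mean(suv_values > threshold)        # percent of lung volume

print("V_SUV=1: %.1f%%, V_SUV=2: %.1f%%" % (v_suv(suv, 1.0), v_suv(suv, 2.0)))
```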
NASA Astrophysics Data System (ADS)
Burri, Samuel; Homulle, Harald; Bruschini, Claudio; Charbon, Edoardo
2016-04-01
LinoSPAD is a reconfigurable camera sensor with a 256×1 CMOS SPAD (single-photon avalanche diode) pixel array connected to a low-cost Xilinx Spartan 6 FPGA. The LinoSPAD sensor's line of pixels has a pitch of 24 μm and a 40% fill factor. The FPGA implements an array of 64 TDCs and histogram engines capable of processing up to 8.5 giga-photons per second. The LinoSPAD sensor measures 1.68 mm×6.8 mm and each pixel has a direct digital output to connect to the FPGA. The chip is bonded on a carrier PCB to connect to the FPGA motherboard. The 64 carry-chain-based TDCs, sampled at 400 MHz, can generate a timestamp every 7.5 ns with a mean time resolution below 25 ps per code. The 64 histogram engines provide time-of-arrival histograms covering up to 50 ns. An alternative mode allows the readout of 28-bit timestamps, which have a range of up to 4.5 ms. Since the FPGA TDCs have considerable non-linearity, we implemented a correction module capable of increasing histogram linearity in real time. The TDC array is interfaced to a computer using a super-speed USB3 link to transfer over 150k histograms per second for the 12.5 ns reference period used in our characterization. After characterization and subsequent programming of the post-processing, we measure an instrument response histogram shorter than 100 ps FWHM using a strong laser pulse with 50 ps FWHM. This timing resolution, combined with the high fill factor, makes the sensor well suited for a wide variety of applications, from fluorescence lifetime microscopy and Raman spectroscopy to 3D time-of-flight.
Jaikuna, Tanwiwat; Khadsiri, Phatchareewan; Chawapun, Nisa; Saekho, Suwit; Tharavichitkul, Ekkasit
2017-02-01
To develop an in-house software program that is able to calculate and generate the biological dose distribution and biological dose volume histogram by physical dose conversion using the linear-quadratic-linear (LQL) model. The Isobio software was developed using MATLAB version 2014b to calculate and generate the biological dose distribution and biological dose volume histograms. The physical dose from each voxel in treatment planning was extracted through the Computational Environment for Radiotherapy Research (CERR), and the accuracy was verified by comparing the dose volume histogram from CERR with that from the treatment planning system. An equivalent dose in 2-Gy fractions (EQD2) was calculated from the biologically effective dose (BED) based on the LQL model. The software calculation and the manual calculation were compared for EQD2 verification with paired t-test statistical analysis using IBM SPSS Statistics version 22 (64-bit). Two- and three-dimensional biological dose distributions and biological dose volume histograms were displayed correctly by the Isobio software. Differences in physical dose were found between CERR and the treatment planning system (TPS): in Oncentra, 3.33% in the high-risk clinical target volume (HR-CTV) determined by D90%, 0.56% in the bladder and 1.74% in the rectum determined by D2cc, and less than 1% in Pinnacle. The difference in EQD2 between the software calculation and the manual calculation was not significant (0.00%), with p-values of 0.820, 0.095, and 0.593 for external beam radiation therapy (EBRT) and 0.240, 0.320, and 0.849 for brachytherapy (BT) in the HR-CTV, bladder, and rectum, respectively. The Isobio software is a feasible tool to generate the biological dose distribution and biological dose volume histogram for treatment plan evaluation in both EBRT and BT.
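For reference, the sketch below shows the standard LQ-based BED and EQD2 conversion, which the LQL model follows below its transition dose; the alpha/beta value and fractionation are assumptions, and the Isobio software itself implements the full LQL form.

```python
# Sketch: physical dose per fraction -> BED -> EQD2 using the standard LQ relation,
# which the LQL model follows below its transition dose (illustrative only; the
# Isobio software implements the full LQL form). Alpha/beta values are assumptions.
import numpy as np

def eqd2(dose_per_fraction, n_fractions, alpha_beta):
    """EQD2 = BED / (1 + 2/(alpha/beta)), with BED = n*d*(1 + d/(alpha/beta))."""
    d = np.asarray(dose_per_fraction, dtype=float)
    bed = n_fractions * d * (1.0 + d / alpha_beta)
    return bed / (1.0 + 2.0 / alpha_beta)

# Example: 7 Gy x 4 brachytherapy fractions to a rectal voxel, alpha/beta = 3 Gy.
print(eqd2(7.0, 4, 3.0))          # approximately 56 Gy EQD2 for this voxel
```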
Yang, Xiaofeng; Tridandapani, Srini; Beitler, Jonathan J; Yu, David S; Chen, Zhengjia; Kim, Sungjin; Bruner, Deborah W; Curran, Walter J; Liu, Tian
2014-10-01
To investigate the diagnostic accuracy of ultrasound histogram features in the quantitative assessment of radiation-induced parotid gland injury and to identify potential imaging biomarkers for radiation-induced xerostomia (dry mouth), the most common and debilitating side effect of head-and-neck radiotherapy (RT). Thirty-four patients who had developed xerostomia after RT for head-and-neck cancer were enrolled. Radiation-induced xerostomia was defined by the Radiation Therapy Oncology Group/European Organization for Research and Treatment of Cancer morbidity scale. Ultrasound scans were performed on each patient's parotids bilaterally. The 34 patients were stratified into an acute-toxicity group (16 patients, ≤ 3 months after treatment) and a late-toxicity group (18 patients, > 3 months after treatment). A separate control group of 13 healthy volunteers underwent similar ultrasound scans of their parotid glands. Six sonographic features were derived from the echo-intensity histograms to assess acute and late toxicity of the parotid glands. The quantitative assessments were compared to a radiologist's clinical evaluations. The diagnostic accuracy of these ultrasonic histogram features was evaluated with the receiver operating characteristic (ROC) curve. With an area under the ROC curve greater than 0.90, several histogram features demonstrated excellent diagnostic accuracy for evaluation of acute and late toxicity of the parotid glands. Significant differences (P < .05) in all six sonographic features were demonstrated between the control, acute-toxicity, and late-toxicity groups. However, subjective radiologic evaluation could not distinguish between acute and late toxicity of the parotid glands. We demonstrated that ultrasound histogram features could be used to measure acute and late toxicity of the parotid glands after head-and-neck cancer RT, which may be developed into a low-cost imaging method for xerostomia monitoring and assessment. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.
A Novel Surgical Technique for Ahmed Valves in Refractory Glaucoma With Silicone Oil Endotamponade.
Davo-Cabrera, Juan Maria; Lanzagorta-Aresti, Aitor; Alcocer Yuste, Pablo
2017-10-01
The purpose of the study is to describe a novel technique to implant Ahmed valves in patients with refractory glaucoma due to silicone oil (SO) endotamponade. Three patients with glaucoma, without SO removal, were used as examples for this technique. Technique report. We introduce a modification of the standard Ahmed valve technique for patients with SO. This modification consists of locating the Ahmed valve more tangential to the limbus curvature instead of the usual perpendicular position and inserting the tube in the posterior chamber. The tube can be longer and run parallel to the pupil. This technique allows use of the superotemporal quadrant (fewer complications), avoids corneal touch, and decreases SO loss through the tube to the subconjunctival space.
MCNP Output Data Analysis with ROOT (MODAR)
NASA Astrophysics Data System (ADS)
Carasco, C.
2010-06-01
MCNP Output Data Analysis with ROOT (MODAR) is a tool based on CERN's ROOT software. MODAR has been designed to handle time-energy data issued by MCNP simulations of neutron inspection devices using the associated particle technique. MODAR exploits ROOT's Graphical User Interface and functionalities to visualize and process MCNP simulation results in a fast and user-friendly way. MODAR allows the user to take into account the detection system time resolution (which is not possible with MCNP) as well as the detectors' energy response function and counting statistics in a straightforward way. Program summary Program title: MODAR Catalogue identifier: AEGA_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGA_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 155 373 No. of bytes in distributed program, including test data, etc.: 14 815 461 Distribution format: tar.gz Programming language: C++ Computer: Most Unix workstations and PCs Operating system: Most Unix systems, Linux and Windows, provided the ROOT package has been installed. Examples were tested under SuSE Linux and Windows XP. RAM: Depends on the size of the MCNP output file. The example presented in the article, which involves three two-dimensional 139×740-bin histograms, allocates about 60 MB. These figures apply when running under ROOT and include consumption by ROOT itself. Classification: 17.6 External routines: ROOT version 5.24.00 (http://root.cern.ch/drupal/) Nature of problem: The output of an MCNP simulation is an ASCII file. The data processing is usually performed by copying and pasting the relevant parts of the ASCII file into Microsoft Excel. Such an approach is satisfactory when the quantity of data is small but is not efficient when the size of the simulated data is large, for example when time-energy correlations are studied in detail such as in problems involving the associated particle technique. In addition, since the finite time resolution of the simulated detector cannot be modeled with MCNP, systems in which time-energy correlation is crucial cannot be described in a satisfactory way. Finally, realistic particle energy deposit in detectors is calculated with MCNP in a two-step process involving type-5 then type-8 tallies. In the first step, the photon flux energy spectrum associated to a time region is selected and serves as a source energy distribution for the second step. Thus, several files must be manipulated before getting the result, which can be time consuming if one needs to study several time regions or different detector performances. In the same way, modeling counting statistics obtained in a limited acquisition time requires several steps and can also be time consuming. Solution method: In order to overcome the previous limitations, the MODAR C++ code has been written to make use of CERN's ROOT data analysis software. MCNP output data are read from the MCNP output file with dedicated routines. Two-dimensional histograms are filled and can be handled efficiently within the ROOT framework. To keep a user-friendly analysis tool, all processing and data display can be done by means of the ROOT Graphical User Interface. Specific routines have been written to include the detectors' finite time resolution and energy response function as well as counting statistics in a straightforward way.
Additional comments: The possibility of adding tallies has also been incorporated in MODAR in order to describe systems in which the signal from several detectors can be summed. Moreover, MODAR can be adapted to handle other problems involving two-dimensional data. Running time: The CPU time needed to smear a two-dimensional histogram depends on the size of the histogram. In the presented example, the time-energy smearing of one of the 139×740 two-dimensional histograms takes 3 minutes with a DELL computer equipped with INTEL Core 2.
A smart technique for attendance system to recognize faces through parallelism
NASA Astrophysics Data System (ADS)
Prabhavathi, B.; Tanuja, V.; Madhu Viswanatham, V.; Rajashekhara Babu, M.
2017-11-01
A major part of recognising a person is the face, and with the help of image processing techniques we can exploit a person's physical features. In the old approach used in schools and colleges, the professor calls each student's name and then marks the attendance for the students. In this paper we want to deviate from the old approach and follow a new approach using techniques from image processing. We present automatic attendance marking for students in the classroom. First, a classroom image is taken, and this image is kept in the data record. To the images stored in the database we apply an algorithm which includes steps such as histogram classification, noise removal, face detection, and face recognition. Using these steps we detect the faces and then compare them with the database. The attendance gets marked automatically if the system recognizes the faces.
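A generic sketch of the pre-processing and detection steps (histogram equalization, simple denoising, Haar-cascade face detection) with OpenCV is shown below; it is illustrative only and does not reflect the authors' parallel implementation, and the synthetic image stands in for a real classroom photograph.

```python
# Illustrative sketch of pre-processing and detection steps (histogram equalization,
# denoising, face detection) with OpenCV; a generic pipeline, not the authors' system.
import cv2
import numpy as np

# Placeholder for a real classroom image (e.g. a hypothetical "classroom.jpg").
img = np.full((480, 640, 3), 128, dtype=np.uint8)

gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
gray = cv2.equalizeHist(gray)                           # histogram equalization
gray = cv2.medianBlur(gray, 3)                          # simple noise removal

cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
# Each detected face would then be passed to a face-recognition step and matched
# against the enrolled student database to mark attendance.
print("faces detected:", len(faces))
```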
Gaing, Byron; Sigmund, Eric E; Huang, William C; Babb, James S; Parikh, Nainesh S; Stoffel, David; Chandarana, Hersh
2015-03-01
The aim of this study was to determine if voxel-based histogram analysis of intravoxel incoherent motion imaging (IVIM) parameters can differentiate various subtypes of renal tumors, including benign and malignant lesions. A total of 44 patients with renal tumors who underwent surgery and had histopathology available were included in this Health Insurance Portability and Accountability Act-compliant, institutional review board-approved, single-institution prospective study. In addition to a routine renal magnetic resonance imaging examination performed on a 1.5-T system, all patients were imaged with axial diffusion-weighted imaging using 8 b values (range, 0-800 s/mm^2). A biexponential model was fitted to the diffusion signal data using a segmented algorithm to extract the IVIM parameters perfusion fraction (fp), tissue diffusivity (Dt), and pseudodiffusivity (Dp) for each voxel. Mean and histogram measures of heterogeneity (standard deviation, skewness, and kurtosis) of the IVIM parameters were correlated with pathology results of tumor subtype using unequal variance t tests to compare subtypes in terms of each measure. Correction for multiple comparisons was accomplished using the Tukey honestly significant difference procedure. A total of 44 renal tumors including 23 clear cell (ccRCC), 4 papillary (pRCC), 5 chromophobe, and 5 cystic renal cell carcinomas, as well as benign lesions, 4 oncocytomas (Onc) and 3 angiomyolipomas (AMLs), were included in our analysis. Mean IVIM parameters fp and Dt differentiated 8 of 15 pairs of renal tumor subtypes. Histogram analysis of IVIM parameters differentiated 9 of 15 subtype pairs. One subtype pair (ccRCC vs pRCC) was differentiated by mean analysis but not by histogram analysis. However, 2 other subtype pairs (AML vs Onc and ccRCC vs Onc) were differentiated by histogram distribution parameters exclusively. The standard deviation of Dt [σ(Dt)] differentiated ccRCC (0.362 ± 0.136 × 10^-3 mm^2/s) from AML (0.199 ± 0.043 × 10^-3 mm^2/s) (P = 0.002). Kurtosis of fp separated Onc (2.767 ± 1.299) from AML (-0.325 ± 0.279; P = 0.001), ccRCC (0.612 ± 1.139; P = 0.042), and pRCC (0.308 ± 0.730; P = 0.025). Intravoxel incoherent motion imaging parameters with inclusion of histogram measures of heterogeneity can help differentiate malignant from benign lesions as well as various subtypes of renal cancers.
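The sketch below shows a segmented IVIM fit for a single voxel in the spirit of the description above: Dt is estimated mono-exponentially from the high-b data, then fp and Dp are fitted with Dt fixed; the b-values, signal values, and b-value threshold are assumptions.

```python
# Sketch of a segmented IVIM fit for one voxel: Dt from a mono-exponential fit to the
# high-b data, then fp and Dp from the bi-exponential model (illustrative; the study's
# algorithm and b-values may differ in detail).
import numpy as np
from scipy.optimize import curve_fit

b = np.array([0, 30, 60, 100, 200, 400, 600, 800], dtype=float)      # s/mm^2 (assumed)
signal = np.array([1.00, 0.93, 0.88, 0.84, 0.76, 0.63, 0.52, 0.43])  # toy normalized signal

# Step 1: Dt from b >= 200 s/mm^2 (perfusion contribution assumed negligible there).
hi = b >= 200
Dt = -np.polyfit(b[hi], np.log(signal[hi]), 1)[0]

# Step 2: fit fp and Dp with Dt fixed.
def ivim(b, fp, Dp):
    return fp * np.exp(-b * Dp) + (1 - fp) * np.exp(-b * Dt)

(fp, Dp), _ = curve_fit(ivim, b, signal, p0=[0.1, 0.01], bounds=([0, 0], [1, 1]))
print(f"Dt={Dt:.2e} mm^2/s, fp={fp:.2f}, Dp={Dp:.2e} mm^2/s")
```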
NASA Astrophysics Data System (ADS)
Pal, S. K.; Majumdar, T. J.; Bhattacharya, Amit K.
Fusion of optical and synthetic aperture radar data has been attempted in the present study for mapping of various lithologic units over a part of the Singhbhum Shear Zone (SSZ) and its surroundings. ERS-2 SAR data over the study area have been enhanced using a Fast Fourier Transform (FFT) based filtering approach, and also using the Frost filtering technique. Both enhanced SAR images have then been separately fused with the histogram-equalized IRS-1C LISS III image using the Principal Component Analysis (PCA) technique. Later, the Feature-oriented Principal Components Selection (FPCS) technique has been applied to generate False Color Composite (FCC) images, from which corresponding geological maps have been prepared. Finally, GIS techniques have been successfully used for change detection analysis in the lithological interpretation between the published geological map and the fusion-based geological maps. In general, there is good agreement between these maps over a large portion of the study area. Based on the change detection studies, a few areas could be identified that need attention for further detailed ground-based geological studies.
MCNP output data analysis with ROOT (MODAR)
NASA Astrophysics Data System (ADS)
Carasco, C.
2010-12-01
MCNP Output Data Analysis with ROOT (MODAR) is a tool based on CERN's ROOT software. MODAR has been designed to handle time-energy data issued by MCNP simulations of neutron inspection devices using the associated particle technique. MODAR exploits ROOT's Graphical User Interface and functionalities to visualize and process MCNP simulation results in a fast and user-friendly way. MODAR allows the user to take into account the detection system time resolution (which is not possible with MCNP) as well as the detectors' energy response function and counting statistics in a straightforward way. New version program summary Program title: MODAR Catalogue identifier: AEGA_v1_1 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGA_v1_1.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 150 927 No. of bytes in distributed program, including test data, etc.: 4 981 633 Distribution format: tar.gz Programming language: C++ Computer: Most Unix workstations and PCs Operating system: Most Unix systems, Linux and Windows, provided the ROOT package has been installed. Examples were tested under SuSE Linux and Windows XP. RAM: Depends on the size of the MCNP output file. The example presented in the article, which involves three two-dimensional 139×740-bin histograms, allocates about 60 MB. These figures apply when running under ROOT and include consumption by ROOT itself. Classification: 17.6 Catalogue identifier of previous version: AEGA_v1_0 Journal reference of previous version: Comput. Phys. Comm. 181 (2010) 1161 External routines: ROOT version 5.24.00 (http://root.cern.ch/drupal/) Does the new version supersede the previous version?: Yes Nature of problem: The output of an MCNP simulation is an ASCII file. The data processing is usually performed by copying and pasting the relevant parts of the ASCII file into Microsoft Excel. Such an approach is satisfactory when the quantity of data is small but is not efficient when the size of the simulated data is large, for example when time-energy correlations are studied in detail such as in problems involving the associated particle technique. In addition, since the finite time resolution of the simulated detector cannot be modeled with MCNP, systems in which time-energy correlation is crucial cannot be described in a satisfactory way. Finally, realistic particle energy deposit in detectors is calculated with MCNP in a two-step process involving type-5 then type-8 tallies. In the first step, the photon flux energy spectrum associated to a time region is selected and serves as a source energy distribution for the second step. Thus, several files must be manipulated before getting the result, which can be time consuming if one needs to study several time regions or different detector performances. In the same way, modeling counting statistics obtained in a limited acquisition time requires several steps and can also be time consuming. Solution method: In order to overcome the previous limitations, the MODAR C++ code has been written to make use of CERN's ROOT data analysis software. MCNP output data are read from the MCNP output file with dedicated routines. Two-dimensional histograms are filled and can be handled efficiently within the ROOT framework. To keep a user-friendly analysis tool, all processing and data display can be done by means of the ROOT Graphical User Interface.
Specific routines have been written to include the detectors' finite time resolution and energy response function as well as counting statistics in a straightforward way. Reasons for new version: For applications involving the associated particle technique, a large number of gamma rays are produced by fast neutron interactions. To study the energy spectra, it is useful to identify the gamma-ray energy peaks in a straightforward way. Therefore, the possibility of showing gamma rays corresponding to specific reactions has been added to MODAR. Summary of revisions: It is possible to use a gamma-ray database to better identify gamma-ray peaks, together with their first and second escape peaks, in the energy spectra. Histograms can be scaled by the number of source particles to evaluate the number of counts expected without statistical uncertainties. Additional comments: The possibility of adding tallies has also been incorporated in MODAR in order to describe systems in which the signal from several detectors can be summed. Moreover, MODAR can be adapted to handle other problems involving two-dimensional data. Running time: The CPU time needed to smear a two-dimensional histogram depends on the size of the histogram. In the presented example, the time-energy smearing of one of the 139×740 two-dimensional histograms takes 3 minutes on a DELL computer equipped with an Intel Core 2.
Intraluminal pressure response in Baerveldt tube shunts: a comparison of modification techniques.
Gilbert, David D; Bond, Brent
2007-01-01
To design an apparatus for in vitro quantification of effects of modifications to nonvalved tube shunts, and to compare the effects of these modifications done to prevent early postoperative hypotony. A testing apparatus was designed and constructed to measure intraluminal pressure (ILP) during constant infusion of balanced salt solution at 2 microL/min through Baerveldt tube shunts mounted on cadaver eyes. Three different modifications were performed and 3 shunts were used for each modification. The modifications were partial tube occlusion with a 3-0 Supramid suture, tube perforation with a 27-gauge needle on an occluded tube, and a 1.2 mm longitudinal venting slit on an occluded tube. The final steady state ILP for the 3-0 Supramid partially occluded tube ranged between 1.6 and 2.8 mm Hg over 3 trials with an average final steady state ILP of 2.0 mm Hg. The 27-gauge needle perforation produced ILPs of 4.6 to 8.2 mm Hg over 3 trials with an average final steady state pressure of 5.9 mm Hg. The 1.2 mm longitudinal venting slit produced an ILP range of 12.6 to 17.3 mm Hg over 3 trials with an average final steady state ILP of 14.8 mm Hg. The differences in final steady state ILP between the 3 modification techniques were statistically significant (P=0.00036). The 3-0 Supramid occlusion and the 27-gauge perforation techniques produced hypotony, whereas the 1.2 mm longitudinal venting slit produced acceptable pressures in vitro.
ERIC Educational Resources Information Center
Cooper, Linda L.; Shore, Felice S.
2008-01-01
This paper identifies and discusses misconceptions that students have in making judgments of center and variability when data are presented graphically. An assessment addressing interpreting center and variability in histograms and stem-and-leaf plots was administered to, and follow-up interviews were conducted with, undergraduates enrolled in…
Texture and phase analysis of deformed SUS304 by using HIPPO
DOE Office of Scientific and Technical Information (OSTI.GOV)
Takajo, Shigehiro; Vogel, Sven C.
2016-11-15
These slides summarize the author's research activity at Los Alamos National Laboratory (LANL) on texture and phase analysis of deformed SUS304 using HIPPO. The following topics are covered: the diffraction histogram at each sample position, the diffraction histogram with all bank data averaged, the possibility of an ε-phase, and MAUD analysis including the ε-phase.
Shift-Invariant Image Reconstruction of Speckle-Degraded Images Using Bispectrum Estimation
1990-05-01
process with the requisite negative exponential PDF; the author calls this the Negative Exponential Model (NEM), whose flowchart is given in Figure 6. (Figure captions: statistical histograms and phase for the negative-exponential FDF multiplicative method; truth object speckled via the model; histogram of speckle.)
Damage Proxy Map from Interferometric Synthetic Aperture Radar Coherence
NASA Technical Reports Server (NTRS)
Webb, Frank H. (Inventor); Yun, Sang-Ho (Inventor); Fielding, Eric Jameson (Inventor); Simons, Mark (Inventor)
2015-01-01
A method, apparatus, and article of manufacture provide the ability to generate a damage proxy map. A master coherence map and a slave coherence map, for an area prior and subsequent to (including) a damage event are obtained. The slave coherence map is registered to the master coherence map. Pixel values of the slave coherence map are modified using histogram matching to provide a first histogram of the master coherence map that exactly matches a second histogram of the slave coherence map. A coherence difference between the slave coherence map and the master coherence map is computed to produce a damage proxy map. The damage proxy map is displayed with the coherence difference displayed in a visually distinguishable manner.
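A sketch of the two core steps (histogram-matching the co-event "slave" coherence map to the pre-event "master" map, then differencing) using scikit-image is given below; the synthetic coherence maps are placeholders.

```python
# Sketch: histogram-match the co-event ("slave") coherence map to the pre-event
# ("master") map and take the coherence difference as a damage proxy (illustrative
# of the described steps; maps below are synthetic placeholders).
import numpy as np
from skimage.exposure import match_histograms

rng = np.random.default_rng(0)
master = rng.beta(5, 2, size=(512, 512))                                  # pre-event coherence
slave = np.clip(master - rng.beta(2, 8, size=master.shape) * 0.3, 0, 1)   # co-event coherence

slave_matched = match_histograms(slave, master)     # align the two histograms
damage_proxy = master - slave_matched               # larger values = larger coherence loss
```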
Selected photographic techniques, a compilation
NASA Technical Reports Server (NTRS)
1971-01-01
A selection has been made of methods, devices, and techniques developed in the field of photography during implementation of space and nuclear research projects. These items include many adaptations, variations, and modifications to standard hardware and practice, and should prove interesting to both amateur and professional photographers and photographic technicians. This compilation is divided into two sections. The first section presents techniques and devices that have been found useful in making photolab work simpler, more productive, and higher in quality. Section two deals with modifications to and special applications for existing photographic equipment.
Multimodal Image Registration through Simultaneous Segmentation.
Aganj, Iman; Fischl, Bruce
2017-11-01
Multimodal image registration facilitates the combination of complementary information from images acquired with different modalities. Most existing methods require computation of the joint histogram of the images, while some perform joint segmentation and registration in alternate iterations. In this work, we introduce a new non-information-theoretical method for pairwise multimodal image registration, in which the error of segmentation - using both images - is considered as the registration cost function. We empirically evaluate our method via rigid registration of multi-contrast brain magnetic resonance images, and demonstrate an often higher registration accuracy in the results produced by the proposed technique, compared to those by several existing methods.
Event-Based Processing of Neutron Scattering Data
Peterson, Peter F.; Campbell, Stuart I.; Reuter, Michael A.; ...
2015-09-16
Many of the world's time-of-flight spallation neutron sources are migrating to the recording of individual neutron events. This provides for new opportunities in data processing, the least of which is to filter the events based on correlating them with logs of sample environment and other ancillary equipment. This paper will describe techniques for processing neutron scattering data acquired in event mode that preserve event information all the way to a final spectrum, including any necessary corrections or normalizations. This results in smaller final errors, while significantly reducing processing time and memory requirements in typical experiments. Results with traditional histogramming techniques will be shown for comparison.
Automatic measurement of images on astrometric plates
NASA Astrophysics Data System (ADS)
Ortiz Gil, A.; Lopez Garcia, A.; Martinez Gonzalez, J. M.; Yershov, V.
1994-04-01
We present some results on the process of automatic detection and measurement of objects in overlapped fields of astrometric plates. The main steps of our algorithm are the following: determination of the scale and tilt between the charge-coupled device (CCD) and microscope coordinate systems and estimation of the signal-to-noise ratio in each field; image identification and improvement of its position and size; final image centering; and image selection and storage. Several parameters allow the use of variable criteria for image identification, characterization and selection. Problems related to faint images and crowded fields will be approached by special techniques (morphological filters, histogram properties and fitting models).
LISA Technology Development at GSFC
NASA Technical Reports Server (NTRS)
Thorpe, James Ira; McWilliams, S.; Baker, J.
2008-01-01
The prime focus of LISA technology development efforts at NASA/GSFC has been in LISA interferometry, specifically in the area of laser frequency noise mitigation. Laser frequency noise is addressed through a combination of stabilization and common-mode rejection. Current plans call for two stages of stabilization, pre-stabilization to a local frequency reference and further stabilization using the constellation as a frequency reference. In order for these techniques to be used simultaneously, the pre-stabilization step must provide an adjustable frequency offset. Here, we report on a modification to the standard modulation/demodulation techniques used to stabilize to optical cavities that generates a frequency-tunable reference from a fixed-length cavity. This technique requires no modifications to the cavity itself and only minor modifications to the components. The measured noise performance and dynamic range of the laboratory prototype meets the LISA requirements.
Titration in the treatment of the more troubled patient.
Winer, J A; Ornstein, E D
2001-01-01
This article defines and discusses a modification of technique recommended by the authors in the psychoanalytic treatment of more troubled patients--a modification they call titration. Titration is defined as a conscious decision by the analyst to increase or decrease assistance (or gratification) gradually, in order to facilitate the analytic process. The authors emphasize the complexity of decisions in treatment by focusing on the decision-making processes that titration requires. Guidelines and a case vignette are presented. The authors conclude by considering some of the politics involved in the introduction of technique modifications, the salience of the titration concept, and directions for further exploration.
Albumin modification and fragmentation in renal disease.
Donadio, Carlo; Tognotti, Danika; Donadio, Elena
2012-02-18
Albumin is the most important antioxidant substance in plasma and performs many physiological functions. Furthermore, albumin is the major carrier of endogenous molecules and exogenous ligands. This paper reviews the importance of post-translational modifications of albumin and fragments thereof in patients with renal disease. First, current views and controversies on renal handling of proteins, mainly albumin, will be discussed. Post-translational modifications, namely the fragmentation of albumin found with proteomic techniques in nephrotic patients, diabetics, and ESRD patients will be presented and discussed. It is reasonable to hypothesize that proteolytic fragmentation of serum albumin is due to a higher susceptibility to proteases, induced by oxidative stress. The clinical relevance of the fragmentation of albumin has not yet been established. These modifications could affect some physiological functions of albumin and have a patho-physiological role in uremic syndrome. Proteomic analysis of serum allows the identification of over-expressed proteins and can detect post-translational modifications of serum proteins, hitherto hidden, using standard laboratory techniques. Copyright © 2011 Elsevier B.V. All rights reserved.
Surface Characterization Techniques: An Overview
NASA Technical Reports Server (NTRS)
Miyoshi, Kazuhisa
2002-01-01
To understand the benefits that surface modifications provide, and ultimately to devise better ones, it is necessary to study the physical, mechanical, and chemical changes they cause. This chapter surveys classical and leading-edge developments in surface structure and property characterization methodologies. The primary emphases are on the use of these techniques as they relate to surface modifications, thin films and coatings, and tribological engineering surfaces and on the implications rather than the instrumentation.
Ostrander, Roger V; McKinney, Bart I
2012-10-01
Studies suggest that arthroscopic repair techniques may have high recurrence rates for larger rotator cuff tears. A more anatomic repair may improve the success rate when performing arthroscopic rotator cuff repair. We hypothesized that a triple-row modification of the suture-bridge technique for rotator cuff repair would result in significantly more footprint contact area and pressure between the rotator cuff and the humeral tuberosity. Eighteen ovine infraspinatus tendons were repaired using 1 of 3 simulated arthroscopic techniques: a double-row repair, the suture-bridge technique, and a triple-row repair. The triple-row repair technique is a modification of the suture-bridge technique that uses an additional reducing anchor between the medial and lateral rows. Six samples were tested per group. Pressure-indicating film was used to measure the footprint contact area and pressure after each repair. The triple-row repair resulted in significantly more rotator cuff footprint contact area and contact pressure compared with the double-row technique and the standard suture-bridge technique. No statistical difference in contact area or contact pressure was found between the double-row technique and the suture-bridge technique. The triple-row technique for rotator cuff repair results in significantly more footprint contact area and contact pressure compared with the double-row and standard suture-bridge techniques. This more anatomic repair may improve the healing rate when performing arthroscopic rotator cuff repair. Copyright © 2012 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Mosby, Inc. All rights reserved.
Nakajo, Masanori; Fukukura, Yoshihiko; Hakamada, Hiroto; Yoneyama, Tomohide; Kamimura, Kiyohisa; Nagano, Satoshi; Nakajo, Masayuki; Yoshiura, Takashi
2018-02-22
Apparent diffusion coefficient (ADC) histogram analyses have been used to differentiate tumor grades and predict therapeutic responses in various anatomic sites with moderate success. To determine the ability of diffusion-weighted imaging (DWI) with whole-tumor ADC histogram analysis to differentiate benign peripheral neurogenic tumors (BPNTs) from soft tissue sarcomas (STSs). Retrospective study, single institution. In all, 25 BPNTs and 31 STSs. Two-b-value DWI (b-values = 0, 1000 s/mm2) was performed at 3.0T. Whole-tumor ADC histogram parameters were calculated by two radiologists and compared between BPNTs and STSs. Nonparametric tests were performed for comparisons between BPNTs and STSs; P < 0.05 was considered statistically significant. The ability of each parameter to differentiate STSs from BPNTs was evaluated using area under the curve (AUC) values derived from receiver operating characteristic curve analysis. The mean ADC and all percentile parameters were significantly lower in STSs than in BPNTs (P < 0.001-0.009), with AUCs of 0.703-0.773. However, the coefficient of variation (P = 0.020, AUC = 0.682) and skewness (P = 0.012, AUC = 0.697) were significantly higher in STSs than in BPNTs. Kurtosis (P = 0.295) and entropy (P = 0.604) did not differ significantly between BPNTs and STSs. Whole-tumor ADC histogram parameters except kurtosis and entropy differed significantly between BPNTs and STSs. Level of Evidence: 3. Technical Efficacy: Stage 2. J. Magn. Reson. Imaging 2018. © 2018 International Society for Magnetic Resonance in Medicine.
Histogram Matching Extends Acceptable Signal Strength Range on Optical Coherence Tomography Images
Chen, Chieh-Li; Ishikawa, Hiroshi; Wollstein, Gadi; Bilonick, Richard A.; Sigal, Ian A.; Kagemann, Larry; Schuman, Joel S.
2015-01-01
Purpose. We minimized the influence of image quality variability, as measured by signal strength (SS), on optical coherence tomography (OCT) thickness measurements using the histogram matching (HM) method. Methods. We scanned 12 eyes from 12 healthy subjects with the Cirrus HD-OCT device to obtain a series of OCT images with a wide range of SS (maximal range, 1–10) at the same visit. For each eye, the histogram of an image with the highest SS (best image quality) was set as the reference. We applied HM to the images with lower SS by shaping the input histogram into the reference histogram. Retinal nerve fiber layer (RNFL) thickness was automatically measured before and after HM processing (defined as original and HM measurements), and compared to the device output (device measurements). Nonlinear mixed effects models were used to analyze the relationship between RNFL thickness and SS. In addition, the lowest tolerable SSs, which gave the RNFL thickness within the variability margin of manufacturer recommended SS range (6–10), were determined for device, original, and HM measurements. Results. The HM measurements showed less variability across a wide range of image quality than the original and device measurements (slope = 1.17 vs. 4.89 and 1.72 μm/SS, respectively). The lowest tolerable SS was successfully reduced to 4.5 after HM processing. Conclusions. The HM method successfully extended the acceptable SS range on OCT images. This would qualify more OCT images with low SS for clinical assessment, broadening the OCT application to a wider range of subjects. PMID:26066749
Qi, Xi-Xun; Shi, Da-Fa; Ren, Si-Xie; Zhang, Su-Ya; Li, Long; Li, Qing-Chang; Guan, Li-Ming
2018-04-01
To investigate the value of histogram analysis of diffusion kurtosis imaging (DKI) maps in the evaluation of glioma grading. A total of 39 glioma patients who underwent preoperative magnetic resonance imaging (MRI) were classified into low-grade (13 cases) and high-grade (26 cases) glioma groups. Parametric DKI maps were derived, and histogram metrics between low- and high-grade gliomas were analysed. The optimum diagnostic thresholds of the parameters, the area under the receiver operating characteristic curve (AUC), sensitivity, and specificity were determined using ROC analysis. Significant differences were observed not only in 12 metrics of histogram DKI parameters (P<0.05), but also in mean diffusivity (MD) and mean kurtosis (MK) values, including age as a covariate (F=19.127, P<0.001 and F=20.894, P<0.001, respectively), between low- and high-grade gliomas. Mean MK was the best independent predictor of differentiating glioma grades (B=18.934, 22.237 adjusted for age, P<0.05). The partial correlation coefficient between fractional anisotropy (FA) and kurtosis fractional anisotropy (KFA) was 0.675 (P<0.001). The AUC, sensitivity, and specificity of the mean MK were 0.925, 88.5% and 84.6%, respectively. DKI parameters can effectively distinguish between low- and high-grade gliomas. Mean MK is the best independent predictor of differentiating glioma grades. • DKI is a new and important method. • DKI can provide additional information on microstructural architecture. • Histogram analysis of DKI may be more effective in glioma grading.
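Several of the abstracts in this compilation report a single area under the ROC curve for one histogram metric (here, mean MK). For reference, that AUC can be computed from ranks via the Mann-Whitney U statistic; the sketch below uses hypothetical mean-MK values, not the study's data.

    import numpy as np

    def auc_from_scores(scores_pos, scores_neg):
        """Rank-based AUC: probability that a positive case scores higher than a negative one."""
        combined = np.concatenate([scores_pos, scores_neg])
        ranks = combined.argsort().argsort() + 1          # 1-based ranks (ties ignored for brevity)
        r_pos = ranks[:len(scores_pos)].sum()
        n_pos, n_neg = len(scores_pos), len(scores_neg)
        u = r_pos - n_pos * (n_pos + 1) / 2               # Mann-Whitney U for the positive group
        return u / (n_pos * n_neg)

    # Hypothetical mean-MK values for high-grade (positive) and low-grade gliomas.
    rng = np.random.default_rng(11)
    high_grade = rng.normal(0.95, 0.15, size=26)
    low_grade = rng.normal(0.70, 0.15, size=13)
    print(auc_from_scores(high_grade, low_grade))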
Cui, Yanfen; Yang, Xiaotang; Du, Xiaosong; Zhuo, Zhizheng; Xin, Lei; Cheng, Xintao
2018-04-01
To investigate potential relationships between diffusion kurtosis imaging (DKI)-derived parameters using whole-tumour volume histogram analysis and clinicopathological prognostic factors in patients with rectal adenocarcinoma. 79 consecutive patients with rectal adenocarcinoma who underwent MRI examination were retrospectively evaluated. Parameters D, K and conventional ADC were measured using whole-tumour volume histogram analysis. Student's t-test or Mann-Whitney U-test, receiver operating characteristic curves and Spearman's correlation were used for statistical analysis. Almost all the percentile metrics of K correlated positively with nodal involvement, higher histological grades, the presence of lymphangiovascular invasion (LVI) and circumferential margin (CRM) (p<0.05), with the exception of the correlations between K10th, K90th and histological grades. In contrast, significant negative correlations were observed between the 25th and 50th percentiles and mean values of ADC and D, as well as ADC10th, and tumour T stage (p<0.05). Meanwhile, lower 75th and 90th percentiles of ADC and D values also correlated inversely with nodal involvement (p<0.05). Kmean showed a relatively higher area under the curve (AUC) and higher specificity than the other percentiles for differentiation of lesions with nodal involvement. DKI metrics with whole-tumour volume histogram analysis, especially the K parameters, were associated with important prognostic factors of rectal cancer. • K correlated positively with some important prognostic factors of rectal cancer. • Kmean showed higher AUC and specificity for differentiation of nodal involvement. • DKI metrics with whole-tumour volume histogram analysis depicted tumour heterogeneity.
Poussaint, Tina Young; Vajapeyam, Sridhar; Ricci, Kelsey I.; Panigrahy, Ashok; Kocak, Mehmet; Kun, Larry E.; Boyett, James M.; Pollack, Ian F.; Fouladi, Maryam
2016-01-01
Background Diffuse intrinsic pontine glioma (DIPG) is associated with poor survival regardless of therapy. We used volumetric apparent diffusion coefficient (ADC) histogram metrics to determine associations with progression-free survival (PFS) and overall survival (OS) at baseline and after radiation therapy (RT). Methods Baseline and post-RT quantitative ADC histograms were generated from fluid-attenuated inversion recovery (FLAIR) images and enhancement regions of interest. Metrics assessed included number of peaks (ie, unimodal or bimodal), mean and median ADC, standard deviation, mode, skewness, and kurtosis. Results Based on FLAIR images, the majority of tumors had unimodal peaks with significantly shorter average survival. Pre-RT FLAIR mean, mode, and median values were significantly associated with decreased risk of progression; higher pre-RT ADC values had longer PFS on average. Pre-RT FLAIR skewness and standard deviation were significantly associated with increased risk of progression; higher pre-RT FLAIR skewness and standard deviation had shorter PFS. Nonenhancing tumors at baseline showed higher ADC FLAIR mean values, lower kurtosis, and higher PFS. For enhancing tumors at baseline, bimodal enhancement histograms had much worse PFS and OS than unimodal cases and significantly lower mean peak values. Enhancement in tumors only after RT led to significantly shorter PFS and OS than in patients with baseline or no baseline enhancement. Conclusions ADC histogram metrics in DIPG demonstrate significant correlations between diffusion metrics and survival, with lower diffusion values (increased cellularity), increased skewness, and enhancement associated with shorter survival, requiring future investigations in large DIPG clinical trials. PMID:26487690
Issues around Creating a Reusable Learning Object to Support Statistics Teaching
ERIC Educational Resources Information Center
Gilchrist, Mollie
2007-01-01
Although our health professional students have some experience of simple charts, such as pie and bar, and some intuition of histograms, they do not appear to have much knowledge or understanding about box and whisker plots and their relation to the data they are describing or compared to histograms. The boxplot is a versatile charting tool, useful…
ERIC Educational Resources Information Center
CASE, C. MARSTON
This paper is concerned with graphic presentation and analysis of grouped observations. It presents a method and supporting theory for the construction of an area-conserving, minimal-length frequency polygon corresponding to a given histogram. Traditionally, the concept of a frequency polygon corresponding to a given histogram has referred to that…
Methods for Determining Particle Size Distributions from Nuclear Detonations.
1987-03-01
Table-of-contents excerpt: Debris; IV. Summary of Sample Preparation Method; V. Set Parameters for PCS; VI. Analysis by Vendors; XV. Results From Brookhaven Analysis Using the Method of Cumulants; XVI. Results From Brookhaven Analysis of Sample R-3 Using Histogram Method; XVII. Results From Brookhaven Analysis of Sample R-8 Using Histogram Method; XVIII. TEM Particle …
Schob, Stefan; Beeskow, Anne; Dieckow, Julia; Meyer, Hans-Jonas; Krause, Matthias; Frydrychowicz, Clara; Hirsch, Franz-Wolfgang; Surov, Alexey
2018-05-31
Medulloblastomas are the most common central nervous system tumors in childhood. Treatment and prognosis strongly depend on histology and transcriptomic profiling; however, the proliferative potential also has prognostic value. Our study aimed to investigate correlations between histogram profiling of diffusion-weighted images and further microarchitectural features. Seven patients (median age 14.6 years, range 2-20 years; 5 male, 2 female) were included in this retrospective study. Using a Matlab-based analysis tool, histogram analysis of whole apparent diffusion coefficient (ADC) volumes was performed. ADC entropy revealed a strong inverse correlation with the expression of the proliferation marker Ki67 (r = -0.962, p = 0.009) and with total nuclear area (r = -0.888, p = 0.044). Furthermore, ADC percentiles, most of all ADCp90, showed significant correlations with Ki67 expression (r = 0.902, p = 0.036). Diffusion histogram profiling of medulloblastomas provides valuable in vivo information which can potentially be used for risk stratification and prognostication. Entropy in particular proved to be the most promising imaging biomarker. However, further studies are warranted.
Meng, Jie; Zhu, Lijing; Zhu, Li; Xie, Li; Wang, Huanhuan; Liu, Song; Yan, Jing; Liu, Baorui; Guan, Yue; He, Jian; Ge, Yun; Zhou, Zhengyang; Yang, Xiaofeng
2017-11-03
To explore the value of whole-lesion apparent diffusion coefficient (ADC) histogram and texture analysis in predicting tumor recurrence of advanced cervical cancer treated with concurrent chemo-radiotherapy (CCRT). 36 women with pathologically confirmed advanced cervical squamous carcinomas were enrolled in this prospective study. 3.0 T pelvic MR examinations including diffusion-weighted imaging (b = 0, 800 s/mm2) were performed before CCRT (pre-CCRT) and at the end of the 2nd week of CCRT (mid-CCRT). ADC histogram and texture features were derived from the whole volume of the cervical cancers. With a mean follow-up of 25 months (range, 11-43), 10/36 (27.8%) patients developed recurrence. Pre-CCRT 75th and 90th percentiles, correlation, and autocorrelation, and mid-CCRT ADCmean, 10th, 25th, 50th, 75th, and 90th percentiles and autocorrelation effectively differentiated the recurrence from the nonrecurrence group, with areas under the curve ranging from 0.742 to 0.850 (P values, 0.001-0.038). Pre- and mid-treatment whole-lesion ADC histogram and texture analysis hold great potential in predicting tumor recurrence of advanced cervical cancer treated with CCRT.
NASA Astrophysics Data System (ADS)
Zhang, Min; Zhou, Xiangrong; Goshima, Satoshi; Chen, Huayue; Muramatsu, Chisako; Hara, Takeshi; Yokoyama, Ryujiro; Kanematsu, Masayuki; Fujita, Hiroshi
2013-03-01
In this paper, we present a texture classification method based on textons learned via sparse representation (SR) with new feature histogram maps for the classification of emphysema. First, an overcomplete dictionary of textons is learned via KSVD on image patches from every class in the training dataset. In this stage, a high-pass filter is introduced to exclude patches in smooth areas and speed up the dictionary learning process. Second, 3D joint-SR coefficients and intensity histograms of the test images are used for characterizing regions of interest (ROIs), instead of the conventional feature histograms constructed from SR coefficients of the test images over the dictionary. Classification is then performed using a classifier with distance as a histogram dissimilarity measure. Four hundred and seventy annotated ROIs extracted from 14 test subjects, including 6 paraseptal emphysema (PSE) subjects, 5 centrilobular emphysema (CLE) subjects and 3 panlobular emphysema (PLE) subjects, are used to evaluate the effectiveness and robustness of the proposed method. The proposed method is tested on 167 PSE, 240 CLE and 63 PLE ROIs consisting of mild, moderate and severe pulmonary emphysema. The accuracy of the proposed system is around 74%, 88% and 89% for PSE, CLE and PLE, respectively.
Efficient reversible data hiding in encrypted image with public key cryptosystem
NASA Astrophysics Data System (ADS)
Xiang, Shijun; Luo, Xinrong
2017-12-01
This paper proposes a new reversible data hiding scheme for encrypted images by using homomorphic and probabilistic properties of Paillier cryptosystem. The proposed method can embed additional data directly into encrypted image without any preprocessing operations on original image. By selecting two pixels as a group for encryption, data hider can retrieve the absolute differences of groups of two pixels by employing a modular multiplicative inverse method. Additional data can be embedded into encrypted image by shifting histogram of the absolute differences by using the homomorphic property in encrypted domain. On the receiver side, legal user can extract the marked histogram in encrypted domain in the same way as data hiding procedure. Then, the hidden data can be extracted from the marked histogram and the encrypted version of original image can be restored by using inverse histogram shifting operations. Besides, the marked absolute differences can be computed after decryption for extraction of additional data and restoration of original image. Compared with previous state-of-the-art works, the proposed scheme can effectively avoid preprocessing operations before encryption and can efficiently embed and extract data in encrypted domain. The experiments on the standard image files also certify the effectiveness of the proposed scheme.
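The histogram-shifting idea at the core of the scheme can be illustrated in the plain (unencrypted) domain; the Paillier homomorphic operations and pixel-pair grouping of the paper are omitted here. In this simplified sketch, diffs stands in for the absolute differences, peak and zero are the most-populated and an empty histogram bin, and reversibility is checked with asserts.

    import numpy as np

    def hs_embed(diffs, bits, peak, zero):
        """Histogram-shifting embedding on an integer array (requires peak < zero, zero bin empty)."""
        d = diffs.copy()
        d[(d > peak) & (d < zero)] += 1                    # shift (peak, zero) right, emptying bin peak+1
        carriers = np.flatnonzero(diffs == peak)[:len(bits)]
        d[carriers] += np.asarray(bits, dtype=d.dtype)     # peak stays = bit 0, peak+1 = bit 1
        return d

    def hs_extract(marked, n_bits, peak, zero):
        """Recover the hidden bits and restore the original array."""
        carrier_pos = np.flatnonzero((marked == peak) | (marked == peak + 1))[:n_bits]
        bits = (marked[carrier_pos] == peak + 1).astype(int)
        d = marked.copy()
        d[carrier_pos] = peak                              # undo the embedding at carrier positions
        d[(d > peak + 1) & (d <= zero)] -= 1               # undo the shift
        return bits, d

    # Toy example on hypothetical absolute differences of pixel pairs.
    rng = np.random.default_rng(1)
    diffs = rng.integers(0, 8, size=1000)
    hist = np.bincount(diffs, minlength=256)
    peak = int(np.argmax(hist))
    zero = int(np.flatnonzero(hist == 0)[0])               # first empty bin; assumed to lie above the peak
    assert peak < zero
    bits = rng.integers(0, 2, size=50)
    marked = hs_embed(diffs, bits, peak, zero)
    rec_bits, restored = hs_extract(marked, len(bits), peak, zero)
    assert np.array_equal(rec_bits, bits) and np.array_equal(restored, diffs)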
Comparison of 3D CRT and IMRT Treatment Plans
Bakiu, Erjona; Telhaj, Ervis; Kozma, Elvisa; Ruçi, Ferdinand; Malkaj, Partizan
2013-01-01
Treatment plans of patients with prostate tumors were studied. The patients were scanned on a CT simulator and the images were sent to Focal, the system in which the physician delineates the tumor and the organs at risk. Three-dimensional conformal (3D CRT) and intensity-modulated radiotherapy (IMRT) plans were then created for the same patients in the XiO treatment planning system. The plans were compared according to their dose-volume histograms. The IMRT plans conform the isodoses to the planning target volume (PTV) more closely and better spare the organs at risk, but the time needed to create and verify such plans is greater than for 3D CRT. It is therefore necessary to decide, for each patient, which of the two techniques to use, depending on the prescribed dose to the PTV and on the overall time required. PMID:24167395
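A cumulative dose-volume histogram of the kind used to compare the plans can be computed directly from a dose grid and a binary structure mask; the sketch below is generic (not the XiO algorithm), and the dose array and PTV mask are hypothetical.

    import numpy as np

    def cumulative_dvh(dose, mask, n_bins=200):
        """Return dose levels and the fraction of the structure receiving at least each level."""
        d = dose[mask]                                   # doses inside the structure (e.g. PTV, rectum)
        edges = np.linspace(0.0, d.max(), n_bins)
        volume_fraction = np.asarray([(d >= e).mean() for e in edges])
        return edges, volume_fraction

    # Hypothetical dose grid (Gy) and PTV mask.
    rng = np.random.default_rng(2)
    dose = rng.normal(74.0, 3.0, size=(40, 40, 40))
    ptv = np.zeros_like(dose, dtype=bool)
    ptv[10:30, 10:30, 10:30] = True
    edges, vol = cumulative_dvh(dose, ptv)
    d95 = edges[vol >= 0.95].max()                       # highest dose still covering 95% of the PTV
    print(f"D95 = {d95:.1f} Gy")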
Wang, Yue; Adalý, Tülay; Kung, Sun-Yuan; Szabo, Zsolt
2007-01-01
This paper presents a probabilistic neural network based technique for unsupervised quantification and segmentation of brain tissues from magnetic resonance images. It is shown that this problem can be solved by distribution learning and relaxation labeling, resulting in an efficient method that may be particularly useful in quantifying and segmenting abnormal brain tissues where the number of tissue types is unknown and the distributions of tissue types heavily overlap. The new technique uses suitable statistical models for both the pixel and context images and formulates the problem in terms of model-histogram fitting and global consistency labeling. The quantification is achieved by probabilistic self-organizing mixtures and the segmentation by a probabilistic constraint relaxation network. The experimental results show the efficient and robust performance of the new algorithm and that it outperforms the conventional classification based approaches. PMID:18172510
Cardot, J C; Berthout, P; Verdenet, J; Bidet, A; Faivre, R; Bassand, J P; Bidet, R; Maurat, J P
1982-01-01
Regional and global left ventricular wall motion was assessed in 120 patients using radionuclide cineangiography (RCA) and contrast angiography. Functional imaging procedures based on a temporal Fourier analysis of dynamic image sequences were applied to the study of cardiac contractility. Two images were constructed by taking the phase and amplitude values of the first harmonic in the Fourier transform for each pixel. These two images aided in determining the perimeter of the left ventricle to calculate the global ejection fraction. Regional left ventricular wall motion was studied by analyzing the phase value and by examining the distribution histogram of these values. The accuracy of global ejection fraction calculation was improved by the Fourier technique. This technique increased the sensitivity of RCA for determining segmental abnormalities especially in the left anterior oblique view (LAO).
NASA Astrophysics Data System (ADS)
Peng, Yahui; Ma, Xiao; Gao, Xinyu; Zhou, Fangxu
2015-12-01
Computer vision is an important tool for sports video processing. However, its application in badminton match analysis is very limited. In this study, we proposed straightforward but robust histogram-based background estimation and player detection methods for badminton video clips, and compared the results with the naive averaging method and the mixture-of-Gaussians method, respectively. The proposed method yielded better background estimation results than the naive averaging method and more accurate player detection results than the mixture-of-Gaussians player detection method. The preliminary results indicated that the proposed histogram-based method can estimate the background and extract the players accurately. We conclude that the proposed method can be used for badminton player tracking, and further studies are warranted for automated match analysis.
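The abstract does not spell out the histogram-based estimator, so the sketch below assumes a common variant: the background value of each pixel is the mode of its temporal intensity histogram over the clip, and players are pixels that deviate strongly from that background. Frame sizes, bin counts, and the threshold are illustrative.

    import numpy as np

    def histogram_background(frames, n_bins=32):
        """Per-pixel modal intensity over a stack of grayscale frames (T, H, W), values in 0..255."""
        t, h, w = frames.shape
        bin_idx = (frames.astype(np.int64) * n_bins) // 256          # quantize each pixel per frame
        counts = np.zeros((n_bins, h, w), dtype=np.int64)
        for b in range(n_bins):
            counts[b] = (bin_idx == b).sum(axis=0)                    # temporal histogram per pixel
        mode_bin = counts.argmax(axis=0)
        return (mode_bin * 256 + 128) // n_bins                       # bin centre as background value

    def detect_players(frame, background, thresh=30):
        """Binary foreground mask: pixels far from the estimated background."""
        return np.abs(frame.astype(int) - background.astype(int)) > thresh

    # Hypothetical clip: static court plus a bright blob standing in for a player in a few frames.
    rng = np.random.default_rng(3)
    court = rng.integers(60, 90, size=(120, 160))
    frames = np.stack([court + rng.integers(-5, 6, size=court.shape) for _ in range(50)])
    frames[:10, 40:60, 20:40] = 220
    bg = histogram_background(frames)
    mask = detect_players(frames[0], bg)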
Machine assisted histogram classification
NASA Astrophysics Data System (ADS)
Benyó, B.; Gaspar, C.; Somogyi, P.
2010-04-01
LHCb is one of the four major experiments under completion at the Large Hadron Collider (LHC). Monitoring the quality of the acquired data is important, because it allows the verification of the detector performance. Anomalies, such as missing values or unexpected distributions, can be indicators of a malfunctioning detector, resulting in poor data quality. Spotting faulty or ageing components can be done either visually, using instruments such as the LHCb Histogram Presenter, or with the help of automated tools. In order to assist detector experts in handling the vast monitoring information resulting from the sheer size of the detector, we propose a graph-based clustering tool combined with a machine learning algorithm and demonstrate its use by processing histograms representing 2D hitmap events. We prove the concept by detecting ion feedback events in the LHCb experiment's RICH subdetector.
Methods of Single Station and Limited Data Analysis and Forecasting
1985-08-15
example using real data. Discusses modifications of the SSA technique in certain climatological regimes and describes some statistical techniques for SSA of ... forecaster has access to radar or satellite observations, or any computer products during the period of his isolation. Where calculations are involved, it is ... chapters of the text will deal with special topics such as modifications of the SSA technique that must be considered for certain climatological regimes
High-resolution studies of the HF ionospheric modification interaction region
NASA Technical Reports Server (NTRS)
Duncan, L. M.; Sheerin, J. P.
1985-01-01
The use of the pulse edge analysis technique to explain ionospheric modifications caused by high-power HF radio waves is discussed. The technique, implemented at the Arecibo Observatory, uses long radar pulses and very rapid data sampling. A comparison of the pulse leading and trailing edge characteristics is obtained and the comparison is used to estimate the relative changes in the interaction region height and layer width; an example utilizing this technique is provided. Main plasma line overshoot and miniovershoot were studied from the pulse edge observations; the observations at various HF pulsings and radar resolutions are graphically presented. From the pulse edge data the development and the occurrence of main plasma line overshoot and miniovershoot are explained. The theories of soliton formation and collapse, wave ducting, profile modification, and parametric instabilities are examined as a means of explaining main plasma line overshoots and miniovershoots.
Huang, Junfeng; Wang, Fangjun; Ye, Mingliang; Zou, Hanfa
2014-11-06
Comprehensive analysis of the post-translational modifications (PTMs) on proteins at proteome level is crucial to elucidate the regulatory mechanisms of various biological processes. In the past decades, thanks to the development of specific PTM enrichment techniques and efficient multidimensional liquid chromatography (LC) separation strategy, the identification of protein PTMs have made tremendous progress. A huge number of modification sites for some major protein PTMs have been identified by proteomics analysis. In this review, we first introduced the recent progresses of PTM enrichment methods for the analysis of several major PTMs including phosphorylation, glycosylation, ubiquitination, acetylation, methylation, and oxidation/reduction status. We then briefly summarized the challenges for PTM enrichment. Finally, we introduced the fractionation and separation techniques for efficient separation of PTM peptides in large-scale PTM analysis. Copyright © 2014 Elsevier B.V. All rights reserved.
Wildfire Detection using by Multi Dimensional Histogram in Boreal Forest
NASA Astrophysics Data System (ADS)
Honda, K.; Kimura, K.; Honma, T.
2008-12-01
Early detection of wildfires is important for reducing damage to the environment and to humans. Several attempts have been made to detect wildfires from satellite imagery, mainly classified into three approaches: the Dozier method (1981-), the threshold method (1986-), and the contextual method (1994-). However, the accuracy of these methods is insufficient: the detected results include both commission and omission errors. In addition, analyzing satellite imagery with high accuracy is difficult because ground truth data are scarce. Kudoh and Hosoi (2003) developed a detection method using a three-dimensional (3D) histogram built from past fire data in NOAA-AVHRR imagery, but their method is impractical because it relies on manual work to pick out past fire data from a huge dataset. The purpose of this study is therefore to collect fire points (hot spots) efficiently from satellite imagery and to improve wildfire detection with the collected data. In our method, we collect past fire data using the Alaska Fire History data obtained from the Alaska Fire Service (AFS): we select points that are expected to be wildfires and keep those lying inside the fire areas of the AFS data. Next, we build a 3D histogram from the past fire data, using Bands 1, 21 and 32 of MODIS, and compute a wildfire likelihood from this three-dimensional histogram. As a result, wildfires are selected effectively with the 3D histogram, and a toroidally spreading wildfire could be detected, which indicates good detection performance. However, areas surrounding glaciers tend to show elevated brightness temperatures and produce false alarms, and burnt areas and bare ground are also sometimes flagged as false alarms, so the method still needs improvement. Additionally, we are testing various combinations of MODIS bands to detect wildfires more effectively. To adapt our method to other areas, we are applying it to tropical forest in Kalimantan, Indonesia and around Chiang Mai, Thailand, but the ground truth data in these areas are sparser than in Alaska; our method needs a large amount of accurately observed data in the same area to build the multi-dimensional histogram. In this study, we demonstrate a system that selects wildfire data efficiently from satellite imagery; furthermore, building a multi-dimensional histogram from past fire data makes accurate wildfire detection possible.
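A minimal sketch of the 3-D histogram idea, assuming each pixel is described by its values in MODIS Bands 1, 21, and 32 and that a set of past-fire pixels is available: the past-fire samples populate a 3-D histogram, which is then read out as a normalized fire likelihood for new pixels. Band ranges and bin counts below are illustrative, not the study's settings.

    import numpy as np

    def build_fire_histogram(fire_samples, edges):
        """3-D histogram of (band 1, band 21, band 32) values observed at past fire pixels."""
        hist, _ = np.histogramdd(fire_samples, bins=edges)
        return hist / hist.sum()                                     # empirical likelihood per bin

    def fire_likelihood(pixels, hist, edges):
        """Look up each pixel's bin in the 3-D histogram."""
        idx = [np.clip(np.digitize(pixels[:, k], edges[k]) - 1, 0, len(edges[k]) - 2)
               for k in range(3)]
        return hist[idx[0], idx[1], idx[2]]

    # Hypothetical reflectance / brightness-temperature values for the three bands.
    rng = np.random.default_rng(4)
    edges = [np.linspace(0, 1, 21), np.linspace(280, 340, 21), np.linspace(270, 320, 21)]
    past_fires = np.column_stack([rng.uniform(0.1, 0.3, 5000),
                                  rng.normal(325, 5, 5000),
                                  rng.normal(300, 5, 5000)])
    hist = build_fire_histogram(past_fires, edges)
    scene = np.column_stack([rng.uniform(0, 1, 10),
                             rng.uniform(280, 340, 10),
                             rng.uniform(270, 320, 10)])
    likelihood = fire_likelihood(scene, hist, edges)                 # higher = more fire-like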
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, J; Harb, J; Jawad, M
2014-06-15
Purpose: In follow-up T2-weighted MR images of spinal tumor patients treated with stereotactic body radiation therapy (SBRT), high intensity features embedded in dark surroundings may suggest a local failure (LF). We investigated image intensity histogram features to predict LF and local control (LC). Methods: Sixty-seven spinal tumors were treated with SBRT at our institution with scheduled follow-up MR T2-weighted (TR 3200-6600 ms; TE 75-132 ms) imaging. The LF group included 10 tumors with 8.7 months median follow-up, while the LC group had 11 tumors with 24.1 months median follow-up. The follow-up images were fused to the planning CT. Image intensity histograms of the GTV were calculated. Voxels in greater than 90% (V90), 80% (V80), and peak (Vpeak) of the histogram were grouped into sub-ROIs to determine the best feature histogram. The intensity of each sub-ROI was evaluated using the mean T2-weighted signal ratio (intensity in sub-ROI / intensity in normal vertebrae). An ROC curve in predicting LF for each sub-ROI was calculated to determine the best feature histogram parameter for LF prediction. Results: Mean T2-weighted signal ratio in the LF group was significantly higher than that in the LC group for all sub-ROIs (1.1±0.4 vs. 0.7±0.2, 1.2±0.4 vs. 0.8±0.2, 1.4±0.5 vs. 0.8±0.2, for V90, V80, and Vpeak; p=0.02, 0.02, and 0.002, respectively). The corresponding areas under the ROC curve (AUC) were 0.78, 0.80, and 0.87 (p=0.02, 0.03, and 0.004, respectively). No correlation was found between T2-weighted signal ratio in Vpeak and follow-up time (Pearson's ρ=0.15). Conclusion: Increased T2-weighted signal can be used to identify local failure while decreased signal indicates local control after spinal SBRT. By choosing the best histogram parameter (here the Vpeak), the AUC of the ROC can be substantially improved, which implies reliable prediction of LC and LF. These results are being further studied and validated with large multi-institutional data.
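A sketch of the sub-ROI construction, under the assumption that V90 and V80 collect voxels above the 90th and 80th percentiles of the GTV intensity histogram and Vpeak collects voxels in the modal bin; the intensity values and the normal-vertebra reference are hypothetical.

    import numpy as np

    def sub_roi_signal_ratios(gtv_values, normal_mean, n_bins=64):
        """Mean T2 signal ratio (GTV / normal vertebra) in three histogram-defined sub-ROIs."""
        v90 = gtv_values[gtv_values >= np.percentile(gtv_values, 90)]
        v80 = gtv_values[gtv_values >= np.percentile(gtv_values, 80)]
        counts, edges = np.histogram(gtv_values, bins=n_bins)
        k = counts.argmax()                                          # modal (peak) bin
        vpeak = gtv_values[(gtv_values >= edges[k]) & (gtv_values < edges[k + 1])]
        return {name: roi.mean() / normal_mean
                for name, roi in (("V90", v90), ("V80", v80), ("Vpeak", vpeak))}

    # Hypothetical T2-weighted intensities inside a GTV, normalized by normal vertebrae.
    rng = np.random.default_rng(5)
    gtv = rng.gamma(4.0, 50.0, size=5000)
    ratios = sub_roi_signal_ratios(gtv, normal_mean=200.0)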
NASA Technical Reports Server (NTRS)
1972-01-01
Developments by the aerospace industry in seals and sealing techniques are announced for possible use in other areas. The announcements presented are grouped as: sealing techniques for cryogenic fluids, high pressure applications, and modification for improved performance.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schulz, Douglas A.
2007-10-08
A biometric system suitable for validating user identity using only mouse movements and no specialized equipment is presented. Mouse curves (mouse movements with little or no pause between them) are individually classified and used to develop classification histograms, which are representative of an individual's typical mouse use. These classification histograms can then be compared to validate identity. This classification approach is suitable for providing continuous identity validation during an entire user session.
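A minimal sketch of the comparison step, assuming each mouse curve has already been assigned one of K curve classes: per-session class histograms are built and compared with a histogram-intersection score to decide whether a new session matches the enrolled profile. The class counts, probabilities, and threshold are illustrative.

    import numpy as np

    def class_histogram(curve_classes, n_classes):
        """Normalized histogram of mouse-curve class labels for one session."""
        counts = np.bincount(curve_classes, minlength=n_classes)
        return counts / counts.sum()

    def histogram_intersection(h1, h2):
        """Similarity in [0, 1]; 1 means identical class usage."""
        return float(np.minimum(h1, h2).sum())

    # Hypothetical sessions: enrolled user profile vs. a new session to validate.
    rng = np.random.default_rng(9)
    enrolled = class_histogram(rng.choice(8, size=500, p=[.3, .2, .15, .1, .1, .05, .05, .05]), 8)
    session = class_histogram(rng.choice(8, size=200, p=[.28, .22, .14, .1, .11, .05, .05, .05]), 8)
    same_user = histogram_intersection(enrolled, session) > 0.85     # illustrative threshold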
DIF Testing with an Empirical-Histogram Approximation of the Latent Density for Each Group
ERIC Educational Resources Information Center
Woods, Carol M.
2011-01-01
This research introduces, illustrates, and tests a variation of IRT-LR-DIF, called EH-DIF-2, in which the latent density for each group is estimated simultaneously with the item parameters as an empirical histogram (EH). IRT-LR-DIF is used to evaluate the degree to which items have different measurement properties for one group of people versus…
An Automated Energy Detection Algorithm Based on Kurtosis-Histogram Excision
2018-01-01
ARL-TR-8269, US Army Research Laboratory, January 2018.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, X; Schott, D; Song, Y
Purpose: In an effort toward early assessment of treatment response, we investigated radiation-induced changes in the CT number histogram of the GTV during the delivery of chemoradiation therapy (CRT) for pancreatic cancer. Methods: Diagnostic-quality CT data acquired daily during routine CT-guided CRT using a CT-on-rails for 20 pancreatic head cancer patients were analyzed. All patients were treated with a radiation dose of 50.4 Gy in 28 fractions. On each daily CT set, the contours of the pancreatic head and the spinal cord were delineated. The Hounsfield unit (HU) histograms in these contours were extracted and processed using MATLAB. Eight parameters of the histogram, including the mean HU over all the voxels, peak position, volume, standard deviation (SD), skewness, kurtosis, energy, and entropy, were calculated for each fraction. Significance was inspected using paired two-tailed t-tests and correlations were analyzed using Spearman rank correlation tests. Results: In general, the HU histogram in the pancreatic head (but not in the spinal cord) changed during the CRT delivery. Changes from the first to the last fraction in mean HU in the pancreatic head ranged from −13.4 to 3.7 HU with an average of −4.4 HU, which was significant (P<0.001). Among other quantities, the volume decreased, the skewness increased (less skewed), and the kurtosis decreased (less sharp) during the CRT delivery. The changes of mean HU, volume, skewness, and kurtosis became significant after two weeks of treatment. Patient pathological response status is associated with the change in SD (ΔSD), i.e., ΔSD = 1.85 (average of 7 patients) for good response and −0.08 (average of 6 patients) for moderate and poor response. Conclusion: Significant changes in the HU histogram and the histogram-based metrics (e.g., mean HU, skewness, and kurtosis) in the tumor were observed during the course of chemoradiation therapy for pancreatic cancer. These changes may potentially be used for early assessment of treatment response.
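The eight histogram parameters listed above can be computed from the HU values inside a contour with plain NumPy; this is a generic sketch (not the authors' MATLAB code), and the voxel values are hypothetical.

    import numpy as np

    def hu_histogram_metrics(hu, n_bins=64):
        """Summary metrics of the HU distribution inside a contoured structure."""
        counts, edges = np.histogram(hu, bins=n_bins)
        p = counts / counts.sum()                                    # normalized histogram
        centers = 0.5 * (edges[:-1] + edges[1:])
        mean, sd = hu.mean(), hu.std()
        z = (hu - mean) / sd
        return {
            "mean_HU": mean,
            "peak_position": centers[counts.argmax()],
            "volume_voxels": hu.size,                                # volume proxy: voxel count
            "SD": sd,
            "skewness": np.mean(z ** 3),
            "kurtosis": np.mean(z ** 4) - 3.0,                       # excess kurtosis
            "energy": np.sum(p ** 2),
            "entropy": -np.sum(p[p > 0] * np.log2(p[p > 0])),
        }

    # Hypothetical HU values inside the pancreatic-head contour of one daily CT.
    rng = np.random.default_rng(6)
    hu_values = rng.normal(40.0, 15.0, size=20000)
    metrics = hu_histogram_metrics(hu_values)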
Hu, Fubi; Yang, Ru; Huang, Zixing; Wang, Min; Zhang, Hanmei; Yan, Xu; Song, Bin
2017-12-01
To retrospectively determine the feasibility of intravoxel incoherent motion (IVIM) imaging based on histogram analysis for the staging of liver fibrosis (LF) using histopathologic findings as the reference standard. 56 consecutive patients (14 men, 42 women; age range, 15-76 years) with chronic liver diseases (CLDs) were studied using IVIM-DWI with 9 b-values (0, 25, 50, 75, 100, 150, 200, 500, 800 s/mm2) at 3.0 T. Fibrosis stage was evaluated using the METAVIR scoring system. Histogram metrics including mean, standard deviation (Std), skewness, kurtosis, minimum (Min), maximum (Max), range, interquartile (Iq) range, and percentiles (10th, 25th, 50th, 75th, 90th) were extracted from apparent diffusion coefficient (ADC), true diffusion coefficient (D), pseudo-diffusion coefficient (D*), and perfusion fraction (f) maps. All histogram metrics among the different fibrosis groups were compared using one-way analysis of variance or the nonparametric Kruskal-Wallis test. For significant parameters, receiver operating characteristic (ROC) curve analyses were further performed for the staging of LF. Based on their METAVIR stage, the 56 patients were reclassified into three groups as follows: F0-1 group (n = 25), F2-3 group (n = 21), and F4 group (n = 10). The mean, Iq range, and percentiles (50th, 75th, and 90th) of the D* maps showed significant differences between the groups (all P < 0.05). The areas under the ROC curve (AUC) of the mean, Iq range, and 50th, 75th, and 90th percentiles of the D* maps for identifying significant LF (≥F2 stage) were 0.901, 0.859, 0.876, 0.943, and 0.886 (all P < 0.0001), respectively; for diagnosing severe fibrosis or cirrhosis (F4), the AUCs were 0.917, 0.922, 0.943, 0.985, and 0.939 (all P < 0.0001), respectively. The histogram metrics of the ADC, D, and f maps demonstrated no significant difference among the groups (all P > 0.05). Histogram analysis of the D* map derived from IVIM can be used to stage liver fibrosis in patients with CLDs and provides more quantitative information beyond the mean value.
Maurer, Britta; Suliman, Yossra A.; Morsbach, Fabian; Distler, Oliver; Frauenfelder, Thomas
2018-01-01
Background: To evaluate the usability of slice-reduced sequential computed tomography (CT) compared to standard high-resolution CT (HRCT) in patients with systemic sclerosis (SSc) for qualitative and quantitative assessment of interstitial lung disease (ILD) with respect to (I) detection of lung parenchymal abnormalities, (II) qualitative and semiquantitative visual assessment, (III) quantification of ILD by histograms and (IV) accuracy for discrimination at a 20% cut-off. Methods: From the standard chest HRCT of 60 SSc patients, sequential 9-slice CT (reduced HRCT) was retrospectively reconstructed. ILD was assessed by visual scoring and quantitative histogram parameters. Results from standard and reduced HRCT were compared using non-parametric tests and analysed by univariate linear regression analyses. Results: With respect to the detection of parenchymal abnormalities, only the detection of intrapulmonary bronchiectasis was significantly lower in reduced HRCT compared to standard HRCT (P=0.039). No differences were found comparing visual scores for fibrosis severity and extension from standard and reduced HRCT (P=0.051–0.073). All scores correlated significantly (P<0.001) with histogram parameters derived from both standard and reduced HRCT. Significantly higher values of kurtosis and skewness were found for reduced HRCT (both P<0.001). In contrast to standard HRCT, histogram parameters from reduced HRCT showed significant discrimination at the 20% fibrosis cut-off (sensitivity 88% for kurtosis and skewness; specificity 81% for kurtosis and 86% for skewness; cut-off kurtosis ≤26, cut-off skewness ≤4; both P<0.001). Conclusions: Reduced HRCT is a robust method to assess lung fibrosis in SSc with minimal radiation dose and no difference in the scoring assessment of lung fibrosis severity and extension compared with standard HRCT. In contrast to standard HRCT, histogram parameters derived from the reduced-HRCT approach could discriminate at a threshold of 20% lung fibrosis with high sensitivity and specificity. Hence it might be used to detect early disease progression of lung fibrosis in the context of monitoring and treatment of SSc patients. PMID:29850118
Colombi, Davide; Dinkel, Julien; Weinheimer, Oliver; Obermayer, Berenike; Buzan, Teodora; Nabers, Diana; Bauer, Claudia; Oltmanns, Ute; Palmowski, Karin; Herth, Felix; Kauczor, Hans Ulrich; Sverzellati, Nicola
2015-01-01
Objectives: To describe changes over time in the extent of idiopathic pulmonary fibrosis (IPF) at multidetector computed tomography (MDCT) assessed by semi-quantitative visual scores (VSs) and fully automatic histogram-based quantitative evaluation, and to test the relationship between these two methods of quantification. Methods: Forty IPF patients (median age: 70 years, interquartile range: 62-75 years; M:F, 33:7) who underwent 2 MDCT examinations at different time points with a median interval of 13 months (interquartile range: 10-17 months) were retrospectively evaluated. The in-house software YACTA automatically quantified the lung density histogram (10th-90th percentile in 5th-percentile steps). Longitudinal changes in VSs and in the percentiles of the attenuation histogram were obtained in 20 untreated patients and 20 patients treated with pirfenidone. Pearson correlation analysis was used to test the relationship between VSs and selected percentiles. Results: In follow-up MDCT, the visual overall extent of parenchymal abnormalities (OE) increased by a median of 5%/year (interquartile range: 0%/year to +11%/year). A substantial difference was found between treated and untreated patients in the HU changes of the 40th and 80th percentiles of the density histogram. Correlation analysis between VSs and selected percentiles showed a higher correlation between the changes (Δ) in OE and Δ 40th percentile (r=0.69; p<0.001) as compared to Δ 80th percentile (r=0.58; p<0.001); a closer correlation was found between Δ ground-glass extent and Δ 40th percentile (r=0.66, p<0.001) as compared to Δ 80th percentile (r=0.47, p=0.002), while the Δ reticulations correlated better with the Δ 80th percentile (r=0.56, p<0.001) in comparison to Δ 40th percentile (r=0.43, p=0.003). Conclusions: There is a relevant and fully automatically measurable difference at MDCT in VSs and in histogram analysis at one-year follow-up of IPF patients, whether treated or untreated: Δ 40th percentile might reflect the change in overall extent of lung abnormalities, notably of ground-glass pattern; furthermore Δ 80th percentile might reveal the course of reticular opacities. PMID:26110421
Quantitative computed tomography applied to interstitial lung diseases.
Obert, Martin; Kampschulte, Marian; Limburg, Rebekka; Barańczuk, Stefan; Krombach, Gabriele A
2018-03-01
To evaluate a new image marker that retrieves information from computed tomography (CT) density histograms, with respect to classification properties between different lung parenchyma groups, and to compare the new image marker with conventional markers. Density histograms from 220 different subjects (normal = 71; emphysema = 73; fibrotic = 76) were used to compare the conventionally applied emphysema index (EI), 15th percentile value (PV), mean value (MV), variance (V), skewness (S), and kurtosis (K) with a new histogram's functional shape (HFS) method. Multinomial logistic regression (MLR) analysis was performed to calculate predictions of different lung parenchyma group membership using the individual methods, as well as combinations thereof, as covariates. Overall correctly assigned subjects (OCA), sensitivity (sens), specificity (spec), and Nagelkerke's pseudo R2 (NR2) effect size were estimated. NR2 was used to set up a ranking list of the different methods. The MLR indicated the highest classification power (OCA of 92%; sens 0.95; spec 0.89; NR2 0.95) when all histogram analysis methods were applied together. The highest classification power among individually applied methods was found using the HFS concept (OCA 86%; sens 0.93; spec 0.79; NR2 0.80). Conventional methods achieved lower classification potential on their own: EI (OCA 69%; sens 0.95; spec 0.26; NR2 0.52); PV (OCA 69%; sens 0.90; spec 0.37; NR2 0.57); MV (OCA 65%; sens 0.71; spec 0.58; NR2 0.61); V (OCA 66%; sens 0.72; spec 0.53; NR2 0.66); S (OCA 65%; sens 0.88; spec 0.26; NR2 0.55); and K (OCA 63%; sens 0.90; spec 0.16; NR2 0.48). The HFS method, which has so far been applied to CT bone density curve analysis, is also a remarkable information-extraction tool for lung density histograms. Presumably, being a principled mathematical approach, the HFS method can also extract valuable health-related information from histograms in completely different areas. Copyright © 2018 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reiner, Caecilia S., E-mail: caecilia.reiner@usz.ch; Gordic, Sonja; Puippe, Gilbert
2016-03-15
Purpose: To evaluate, in patients with hepatocellular carcinoma (HCC), whether assessment of tumor heterogeneity by histogram analysis of computed tomography (CT) perfusion helps predict response to transarterial radioembolization (TARE). Materials and Methods: Sixteen patients (15 male; mean age 65 years; age range 47–80 years) with HCC underwent CT liver perfusion for treatment planning prior to TARE with Yttrium-90 microspheres. Arterial perfusion (AP) derived from CT perfusion was measured in the entire tumor volume, and heterogeneity was analyzed voxel-wise by histogram analysis. Response to TARE was evaluated on follow-up imaging (median follow-up, 129 days) based on modified Response Evaluation Criteria in Solid Tumors (mRECIST). Results of histogram analysis and mean AP values of the tumor were compared between responders and non-responders. Receiver operating characteristics were calculated to determine the parameters' ability to discriminate responders from non-responders. Results: According to mRECIST, 8 patients (50%) were responders and 8 (50%) non-responders. Comparing responders and non-responders, the 50th and 75th percentiles of AP derived from histogram analysis were significantly different (AP 43.8/54.3 vs. 27.6/34.3 mL min⁻¹ 100 mL⁻¹; p < 0.05), while the mean AP of the HCCs (43.5 vs. 27.9 mL min⁻¹ 100 mL⁻¹; p > 0.05) was not. Further heterogeneity parameters from histogram analysis (skewness, coefficient of variation, and 25th percentile) did not differ between responders and non-responders (p > 0.05). If the cut-off for the 75th percentile was set to an AP of 37.5 mL min⁻¹ 100 mL⁻¹, therapy response could be predicted with a sensitivity of 88% (7/8) and specificity of 75% (6/8). Conclusion: Voxel-wise histogram analysis of pretreatment CT perfusion indicating tumor heterogeneity of HCC improves the pretreatment prediction of response to TARE.
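The reported cut-off evaluation amounts to thresholding one histogram feature (the 75th percentile of arterial perfusion) and counting correct calls. The sketch below shows that computation on hypothetical per-patient values, not the study's data.

    import numpy as np

    def sensitivity_specificity(feature, is_responder, cutoff):
        """Classify 'predicted responder' as feature >= cutoff and compare with the outcome."""
        pred = feature >= cutoff
        sens = np.sum(pred & is_responder) / is_responder.sum()
        spec = np.sum(~pred & ~is_responder) / (~is_responder).sum()
        return sens, spec

    # Hypothetical 75th-percentile arterial perfusion values (mL/min/100 mL) and outcomes.
    ap75 = np.array([54.3, 48.0, 60.1, 41.2, 39.5, 45.0, 52.7, 38.9,   # responders
                     34.3, 30.1, 36.8, 28.5, 39.0, 25.7, 33.2, 31.0])  # non-responders
    responder = np.array([True] * 8 + [False] * 8)
    sens, spec = sensitivity_specificity(ap75, responder, cutoff=37.5)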
Landmark Detection in Orbital Images Using Salience Histograms
NASA Technical Reports Server (NTRS)
Wagstaff, Kiri L.; Panetta, Julian; Schorghofer, Norbert; Greeley, Ronald; PendletonHoffer, Mary; bunte, Melissa
2010-01-01
NASA's planetary missions have collected, and continue to collect, massive volumes of orbital imagery. The volume is such that it is difficult to manually review all of the data and determine its significance. As a result, images are indexed and searchable by location and date but generally not by their content. A new automated method analyzes images and identifies "landmarks," or visually salient features such as gullies, craters, dust devil tracks, and the like. This technique uses a statistical measure of salience derived from information theory, so it is not associated with any specific landmark type. It identifies regions that are unusual or that stand out from their surroundings, so the resulting landmarks are context-sensitive areas that can be used to recognize the same area when it is encountered again. A machine learning classifier is used to identify the type of each discovered landmark. Using a specified window size, an intensity histogram is computed for each such window within the larger image (sliding the window across the image). Next, a salience map is computed that specifies, for each pixel, the salience of the window centered at that pixel. The salience map is thresholded to identify landmark contours (polygons) using the upper quartile of salience values. Descriptive attributes are extracted for each landmark polygon: size, perimeter, mean intensity, standard deviation of intensity, and shape features derived from an ellipse fit.
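A compact sketch of the salience-map step. The article derives salience from information theory without fixing the formula here, so this example assumes local histogram entropy as the salience measure; the orbital image is synthetic and the window size is illustrative.

    import numpy as np

    def local_histogram_entropy(image, window=15, n_bins=16):
        """Salience map: Shannon entropy of the intensity histogram in a window around each pixel."""
        half = window // 2
        padded = np.pad(image, half, mode="reflect")
        quantized = (padded.astype(np.int64) * n_bins) // 256
        h, w = image.shape
        salience = np.zeros((h, w))
        for i in range(h):
            for j in range(w):
                patch = quantized[i:i + window, j:j + window]
                p = np.bincount(patch.ravel(), minlength=n_bins) / patch.size
                p = p[p > 0]
                salience[i, j] = -np.sum(p * np.log2(p))
        return salience

    # Hypothetical orbital image: smooth background with one crater-like bright ring.
    rng = np.random.default_rng(7)
    img = rng.integers(90, 110, size=(80, 80)).astype(np.uint8)
    yy, xx = np.mgrid[:80, :80]
    img[np.abs(np.hypot(yy - 40, xx - 40) - 10) < 2] = 230
    sal = local_histogram_entropy(img)
    landmarks = sal >= np.percentile(sal, 75)        # keep the upper quartile of salience values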
NASA Astrophysics Data System (ADS)
Lei, Tianhu; Udupa, Jayaram K.; Moonis, Gul; Schwartz, Eric; Balcer, Laura
2005-04-01
Based on Fuzzy Connectedness (FC) object delineation principles and algorithms, a hierarchical brain tissue segmentation technique has been developed for MR images. After MR image background intensity inhomogeneity correction and intensity standardization, three FC objects for cerebrospinal fluid (CSF), gray matter (GM), and white matter (WM) are generated via FC object delineation, and an intracranial (IC) mask is created via morphological operations. Then, the IC mask is decomposed into parenchymal (BP) and CSF masks, while the BP mask is separated into WM and GM masks. WM mask is further divided into pure and dirty white matter masks (PWM and DWM). In Multiple Sclerosis studies, a severe white matter lesion (LS) mask is defined from DWM mask. Based on the segmented brain tissue images, a histogram-based method has been developed to find disease-specific, image-based quantitative markers for characterizing the macromolecular manifestation of the two diseases. These same procedures have been applied to 65 MS (46 patients and 19 normal subjects) and 25 AD (15 patients and 10 normal subjects) data sets, each of which consists of FSE PD- and T2-weighted MR images. Histograms representing standardized PD and T2 intensity distributions and their numerical parameters provide an effective means for characterizing the two diseases. The procedures are systematic, nearly automated, robust, and the results are reproducible.
NASA Astrophysics Data System (ADS)
Xu, Pengcheng; Wang, Dong; Singh, Vijay P.; Wang, Yuankun; Wu, Jichun; Wang, Lachun; Zou, Xinqing; Chen, Yuanfang; Chen, Xi; Liu, Jiufu; Zou, Ying; He, Ruimin
2017-12-01
Hydrometeorological data are needed for obtaining point and areal means, quantifying the spatial variability of hydrometeorological variables, and calibrating and verifying hydrometeorological models. Hydrometeorological networks are utilized to collect such data. Since data collection is expensive, it is essential to design an optimal network based on the minimal number of hydrometeorological stations in order to reduce costs. This study proposes a two-phase copula entropy-based multiobjective optimization approach that includes: (1) copula entropy-based directional information transfer (CDIT) for clustering the potential hydrometeorological gauges into several groups, and (2) a multiobjective method for selecting the optimal combination of gauges for the regionalized groups. Although entropy theory has been employed for network design before, the joint histogram method used for mutual information estimation has several limitations. The copula entropy-based mutual information (MI) estimation method is shown to be more effective for quantifying the uncertainty of redundant information than the joint histogram (JH) method. The effectiveness of this approach is verified by applying it to one type of hydrometeorological gauge network, with the use of three model evaluation measures, including the Nash-Sutcliffe Coefficient (NSC), the arithmetic mean of the negative copula entropy (MNCE), and MNCE/NSC. Results indicate that the two-phase copula entropy-based multiobjective technique is capable of evaluating the performance of regional hydrometeorological networks and can enable decision makers to develop strategies for water resources management.
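For reference, the joint-histogram (JH) mutual-information estimator that the copula-entropy method is compared against can be sketched as follows; the two gauge records and the bin count are hypothetical.

    import numpy as np

    def joint_histogram_mi(x, y, n_bins=16):
        """Mutual information (nats) between two series estimated via a 2-D joint histogram."""
        joint, _, _ = np.histogram2d(x, y, bins=n_bins)
        pxy = joint / joint.sum()
        px = pxy.sum(axis=1, keepdims=True)
        py = pxy.sum(axis=0, keepdims=True)
        nz = pxy > 0
        return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

    # Hypothetical records from two nearby precipitation gauges (correlated noise).
    rng = np.random.default_rng(8)
    a = rng.gamma(2.0, 5.0, size=2000)
    b = 0.7 * a + rng.gamma(2.0, 2.0, size=2000)
    print(joint_histogram_mi(a, b))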
Real-time computed tomography dosimetry during ultrasound-guided brachytherapy for prostate cancer.
Kaplan, Irving D; Meskell, Paul; Oldenburg, Nicklas E; Saltzman, Brian; Kearney, Gary P; Holupka, Edward J
2006-01-01
Ultrasound-guided implantation of permanent radioactive seeds is a treatment option for localized prostate cancer. Several techniques have been described for the optimal placement of the seeds in the prostate during this procedure. Postimplantation dosimetric calculations are performed after the implant. Areas of underdosing can only be corrected with either an external beam boost or by performing a second implant. We demonstrate the feasibility of performing computed tomography (CT)-based postplanning during the ultrasound-guided implant and subsequently correcting for underdosed areas. Ultrasound-guided brachytherapy is performed on a modified CT table with general anesthesia. The postplanning CT scan is performed after the implant, while the patient is still under anesthesia. Additional seeds are implanted into "cold spots," and the resultant dosimetry confirmed with CT. Intraoperative postplanning was successfully performed. Dose-volume histograms demonstrated adequate dose coverage during the initial implant, but on detailed analysis, for some patients, areas of underdosing were observed either at the apex or the peripheral zone. Additional seeds were implanted to bring these areas to prescription dose. Intraoperative postplanning is feasible during ultrasound-guided brachytherapy for prostate cancer. Although the postimplant dose-volume histograms for all patients, before the implantation of additional seeds, were adequate according to the American Brachytherapy Society criteria, specific critical areas can be underdosed. Additional seeds can then be implanted to optimize the dosimetry and reduce the risk of underdosing areas of cancer.
Li, Zhiming; Yu, Lan; Wang, Xin; Yu, Haiyang; Gao, Yuanxiang; Ren, Yande; Wang, Gang; Zhou, Xiaoming
2017-11-09
The purpose of this study was to investigate the diagnostic performance of mammographic texture analysis in the differential diagnosis of benign and malignant breast tumors. Digital mammography images were obtained from the Picture Archiving and Communication System at our institute. Texture features of the mammographic images were calculated. The Mann-Whitney U test was used to identify differences between the benign and malignant groups. Receiver operating characteristic (ROC) curve analysis was used to assess the diagnostic performance of the texture features. Significant differences in histogram, gray-level co-occurrence matrix (GLCM), and run-length matrix (RLM) texture features were found between the benign and malignant groups (P < .05). The areas under the ROC curve (AUROC) for histogram, GLCM, and RLM features were 0.800, 0.787, and 0.761, with no differences between them (P > .05). The AUROCs of imaging-based diagnosis, texture analysis, and imaging-based diagnosis combined with texture analysis were 0.873, 0.863, and 0.961, respectively. When imaging-based diagnosis was combined with texture analysis, the AUROC was higher than that of imaging-based diagnosis or texture analysis alone (P < .05). Mammographic texture analysis is a reliable technique for the differential diagnosis of benign and malignant breast tumors. Furthermore, the combination of imaging-based diagnosis and texture analysis can significantly improve diagnostic performance. Copyright © 2017 Elsevier Inc. All rights reserved.
Efficient Scalable Median Filtering Using Histogram-Based Operations.
Green, Oded
2018-05-01
Median filtering is a smoothing technique for noise removal in images. While there are various implementations of median filtering for a single-core CPU, there are few implementations for accelerators and multi-core systems. Many parallel implementations of median filtering use a sorting algorithm to rearrange the values within a filtering window and take the median of the sorted values. While using sorting algorithms allows for simple parallel implementations, the cost of the sorting becomes prohibitive as the filtering windows grow, which makes such algorithms, sequential and parallel alike, inefficient. In this work, we introduce the first software parallel median filtering that is non-sorting-based. The new algorithm uses efficient histogram-based operations, which reduce the computational requirements of the algorithm while also accessing the image fewer times. We show an implementation of our algorithm for both the CPU and NVIDIA's CUDA-supported graphics processing unit (GPU). The new algorithm is compared with several other leading CPU and GPU implementations. The CPU implementation scales almost linearly on a quad-core system. The GPU implementation is several orders of magnitude faster than the other GPU implementations for mid-size median filters. For small kernels, comparison-based approaches remain preferable, as fewer operations are required. Lastly, the new algorithm is open source and can be found in the OpenCV library.
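For reference, here is a minimal, single-threaded sketch of the running-histogram idea on 8-bit images (in the spirit of Huang's classic algorithm); the paper's parallel CPU/GPU algorithm is considerably more elaborate, so this only illustrates why a histogram avoids re-sorting the window at every pixel.

```python
import numpy as np

def median_from_hist(hist, target):
    """Smallest gray level whose cumulative count reaches `target`."""
    c = 0
    for g in range(256):
        c += hist[g]
        if c >= target:
            return g
    return 255

def median_filter_8bit(img, radius=2):
    h, w = img.shape
    pad = np.pad(img, radius, mode='edge')
    out = np.empty_like(img)
    win = 2 * radius + 1
    target = (win * win) // 2 + 1
    for y in range(h):
        # build the histogram for the first window of this row
        hist = np.bincount(pad[y:y + win, 0:win].ravel(), minlength=256)
        out[y, 0] = median_from_hist(hist, target)
        for x in range(1, w):
            # slide right: remove the leaving column, add the entering column
            hist -= np.bincount(pad[y:y + win, x - 1], minlength=256)
            hist += np.bincount(pad[y:y + win, x + win - 1], minlength=256)
            out[y, x] = median_from_hist(hist, target)
    return out

noisy = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
smoothed = median_filter_8bit(noisy, radius=2)
```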
Germline Genetic Modification and Identity: the Mitochondrial and Nuclear Genomes.
Scott, Rosamund; Wilkinson, Stephen
2017-12-01
In a legal 'first', the UK removed a prohibition against modifying embryos in human reproduction, to enable mitochondrial replacement techniques (MRTs), a move the Government distanced from 'germline genetic modification', which it aligned with modifying the nuclear genome. This paper (1) analyzes the uses and meanings of this term in UK/US legal and policy debates; and (2) evaluates related ethical concerns about identity. It shows that, with respect to identity, MRTs and nuclear genome editing techniques such as CRISPR/Cas-9 (now a policy topic), are not as different as has been supposed. While it does not follow that the two should be treated exactly alike, one of the central reasons offered for treating MRTs more permissively than nuclear genetic modification, and for not regarding MRTs as 'germline genetic modification', is thereby in doubt. Identity cannot, by itself, do the work thus far assigned to it, explicitly or otherwise, in law and policy.
Malinowski, Andrzej; Mołas, Justyna; Maciołek-Blewniewska, Grazyna; Cieślak, Jarosław
2006-02-01
Vault haematoma is one of the most common complications of vaginal hysterectomy. The aim of this work was to analyse the effect of a modified incision and closure technique of the vaginal vault on the incidence of vault haematoma after vaginal hysterectomy. The study group consisted of 333 women, of whom 49 (group A) underwent vaginal hysterectomy with the traditional technique of incision and closure of the vaginal vault, and 284 (group B) with the modified technique. The following parameters were evaluated: number of vault haematomas, blood loss, postoperative fever, antibiotic requirement, and length of hospital stay. The risk of vault haematoma was significantly lower in group B (1.06% vs. 12.4%). Blood loss was higher in group A (310 ml vs. 206 ml). Postoperative fever occurred in 12.2% of patients in group A and 1.4% in group B. The length of hospitalization was shorter for women in group B (4.3 days compared with 7.3 days). The modified incision and closure technique of the vaginal vault during vaginal hysterectomy is recommended to minimise intra- and postoperative complications.
Is cepstrum averaging applicable to circularly polarized electric-field data?
NASA Astrophysics Data System (ADS)
Tunnell, T.
1990-04-01
In FY 1988 a cepstrum averaging technique was developed to eliminate the ground reflections from charged particle beam (CPB) electromagnetic pulse (EMP) data. The work was done for the Los Alamos National Laboratory Project DEWPOINT at SST-7. The technique averages the cepstra of horizontally and vertically polarized electric field data (i.e., linearly polarized electric field data). This cepstrum averaging technique was programmed into the FORTRAN codes CEP and CEPSIM. Steve Knox, the principal investigator for Project DEWPOINT, asked the authors to determine if the cepstrum averaging technique could be applied to circularly polarized electric field data. The answer is, Yes, but some modifications may be necessary. There are two aspects to this answer that we need to address, namely, the Yes and the modifications. First, regarding the Yes, the technique is applicable to elliptically polarized electric field data in general: circular polarization is a special case of elliptical polarization. Secondly, regarding the modifications, greater care may be required in computing the phase in the calculation of the complex logarithm. The calculation of the complex logarithm is the most critical step in cepstrum-based analysis. This memorandum documents these findings.
Yi, Jisook; Lee, Young Han; Kim, Sang Kyum; Kim, Seung Hyun; Song, Ho-Taek; Shin, Kyoo-Ho; Suh, Jin-Suck
2018-05-01
This study aimed to compare computed tomography (CT) features, including tumor size and textural and histogram measurements, of giant-cell tumors of bone (GCTBs) before and after denosumab treatment and determine their applicability in monitoring GCTB response to denosumab treatment. This retrospective study included eight patients (male, 3; female, 5; mean age, 33.4 years) diagnosed with GCTB, who had received treatment by denosumab and had undergone pre- and post-treatment non-contrast CT between January 2010 and December 2016. This study was approved by the institutional review board. Pre- and post-treatment size, histogram, and textural parameters of GCTBs were compared by the Wilcoxon signed-rank test. Pathological findings of five patients who underwent surgery after denosumab treatment were evaluated for assessment of treatment response. Relative to the baseline values, the tumor size had decreased, while the mean attenuation, standard deviation, entropy (all, P = 0.017), and skewness (P = 0.036) of the GCTBs had significantly increased post-treatment. Although the difference was statistically insignificant, the tumors also exhibited increased kurtosis, contrast, and inverse difference moment (P = 0.123, 0.327, and 0.575, respectively) post-treatment. Histologic findings revealed new bone formation and complete depletion or decrease in the number of osteoclast-like giant cells. The histogram and textural parameters of GCTBs changed significantly after denosumab treatment. Knowledge of the tendency towards increased mean attenuation and heterogeneity but increased local homogeneity in post-treatment CT histogram and textural features of GCTBs might aid in treatment planning and tumor response evaluation during denosumab treatment. Copyright © 2018. Published by Elsevier B.V.
Hao, Yonghong; Pan, Chu; Chen, WeiWei; Li, Tao; Zhu, WenZhen; Qi, JianPin
2016-12-01
To explore the usefulness of whole-lesion histogram analysis of apparent diffusion coefficient (ADC) derived from reduced field-of-view (r-FOV) diffusion-weighted imaging (DWI) in differentiating malignant and benign thyroid nodules and stratifying papillary thyroid cancer (PTC) with aggressive histological features. This Institutional Review Board-approved, retrospective study included 93 patients with 101 pathologically proven thyroid nodules. All patients underwent preoperative r-FOV DWI at 3T. The whole-lesion ADC assessments were performed for each patient. Histogram-derived ADC parameters between different subgroups (pathologic type, extrathyroidal extension, lymph node metastasis) were compared. Receiver operating characteristic curve analysis was used to determine optimal histogram parameters in differentiating benign and malignant nodules and predicting aggressiveness of PTC. Mean ADC, median ADC, 5th percentile ADC, 25th percentile ADC, 75th percentile ADC, 95th percentile ADC (all P < 0.001), and kurtosis (P = 0.001) were significantly lower in malignant thyroid nodules, and mean ADC achieved the highest AUC (0.919) with a cutoff value of 1842.78 × 10⁻⁶ mm²/s in differentiating malignant and benign nodules. Compared to the PTCs without extrathyroidal extension, PTCs with extrathyroidal extension showed significantly lower median ADC, 5th percentile ADC, and 25th percentile ADC. The 5th percentile ADC achieved the highest AUC (0.757) with a cutoff value of 911.5 × 10⁻⁶ mm²/s for differentiating between PTCs with and without extrathyroidal extension. Whole-lesion ADC histogram analysis might help to differentiate malignant nodules from benign ones and show the PTCs with extrathyroidal extension. J. Magn. Reson. Imaging 2016;44:1546-1555. © 2016 International Society for Magnetic Resonance in Medicine.
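Several of the abstracts above and below report the same family of whole-lesion ADC histogram metrics (mean, median, percentiles, skewness, kurtosis). A minimal sketch of how such metrics are computed from a masked ADC map is given below; the array values and the lesion mask are synthetic placeholders, not patient data.

```python
import numpy as np
from scipy import stats

def adc_histogram_metrics(adc_map, mask):
    """Whole-lesion histogram metrics from an ADC map and a binary lesion mask."""
    vals = adc_map[mask > 0].astype(float)      # whole-lesion voxels only
    pcts = np.percentile(vals, [5, 25, 50, 75, 95])
    return {
        "mean": vals.mean(),
        "median": pcts[2],
        "p5": pcts[0], "p25": pcts[1], "p75": pcts[3], "p95": pcts[4],
        "skewness": stats.skew(vals),
        "kurtosis": stats.kurtosis(vals),       # excess kurtosis
    }

adc = np.random.normal(1500e-6, 300e-6, (64, 64))   # fake ADC map, mm^2/s
lesion = np.zeros((64, 64), dtype=bool)
lesion[20:40, 20:40] = True                          # placeholder lesion mask
print(adc_histogram_metrics(adc, lesion))
```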
Hu, Xin-Xing; Yang, Zhao-Xia; Liang, He-Yue; Ding, Ying; Grimm, Robert; Fu, Cai-Xia; Liu, Hui; Yan, Xu; Ji, Yuan; Zeng, Meng-Su; Rao, Sheng-Xiang
2017-08-01
To evaluate whether whole-tumor histogram-derived parameters for an apparent diffusion coefficient (ADC) map and contrast-enhanced magnetic resonance imaging (MRI) could aid in assessing the Ki-67 labeling index (LI) of hepatocellular carcinoma (HCC). In all, 57 patients with HCC who underwent pretreatment MRI with a 3T MR scanner were included retrospectively. Histogram parameters including mean, median, standard deviation, skewness, kurtosis, and percentiles (5th, 25th, 75th, 95th) were derived from the ADC map and MR enhancement. Correlations between histogram parameters and Ki-67 LI were evaluated and differences between low Ki-67 (≤10%) and high Ki-67 (>10%) groups were assessed. Mean, median, 5th, 25th, 75th percentiles of ADC, and mean, median, 25th, 75th, 95th percentiles of enhancement of the arterial phase (AP) demonstrated significant inverse correlations with Ki-67 LI (rho up to -0.48 for ADC, -0.43 for AP) and showed significant differences between low and high Ki-67 groups (P < 0.001-0.04). Areas under the receiver operating characteristic (ROC) curve for identification of high Ki-67 were 0.78, 0.77, 0.79, 0.82, and 0.76 for mean, median, 5th, 25th, 75th percentiles of ADC, respectively, and 0.74, 0.81, 0.76, 0.82, 0.69 for mean, median, 25th, 75th, 95th percentiles of AP, respectively. Histogram-derived parameters of ADC and AP were potentially helpful for predicting the Ki-67 LI of HCC. Level of Evidence: 3. Technical Efficacy: Stage 3. J. Magn. Reson. Imaging 2017;46:383-392. © 2016 International Society for Magnetic Resonance in Medicine.
Cho, Seung Hyun; Kim, Gab Chul; Jang, Yun-Jin; Ryeom, Hunkyu; Kim, Hye Jung; Shin, Kyung-Min; Park, Jun Seok; Choi, Gyu-Seog; Kim, See Hyung
2015-09-01
The value of diffusion-weighted imaging (DWI) for reliable differentiation between pathologic complete response (pCR) and residual tumor is still unclear. Recently, a few studies reported that histogram analysis can be helpful for monitoring therapeutic response in various cancers. To investigate whether post-chemoradiotherapy (CRT) apparent diffusion coefficient (ADC) histogram analysis can be helpful to predict a pCR in locally advanced rectal cancer (LARC). Fifty patients who underwent preoperative CRT followed by surgery were enrolled in this retrospective study, comprising non-pCR (n = 41) and pCR (n = 9) groups. ADC histogram analysis encompassing the whole tumor was performed on two post-CRT maps, ADC600 and ADC1000 (b factors 0, 600 vs. 0, 1000 s/mm²). Mean, minimum, maximum, SD, mode, 10th, 25th, 50th, 75th, 90th percentile ADCs, skewness, and kurtosis were derived. Diagnostic performance for predicting pCR was evaluated and compared. On both maps, the 10th and 25th percentile ADCs showed better diagnostic performance than the mean ADC. The 10th percentile ADC revealed the best diagnostic performance on both the ADC600 (Az 0.841, sensitivity 100%, specificity 70.7%) and ADC1000 (Az 0.821, sensitivity 77.8%, specificity 87.8%) maps. In the comparison between the 10th percentile and mean ADC, the specificity was significantly improved on both the ADC600 (70.7% vs. 53.7%; P = 0.031) and ADC1000 (87.8% vs. 73.2%; P = 0.039) maps. Post-CRT ADC histogram analysis is helpful for predicting pCR in LARC, especially in improving specificity compared with the mean ADC. © The Foundation Acta Radiologica 2014.
Zhang, Wei; Zhou, Yue; Xu, Xiao-Quan; Kong, Ling-Yan; Xu, Hai; Yu, Tong-Fu; Shi, Hai-Bin; Feng, Qing
2018-01-01
To assess the performance of a whole-tumor histogram analysis of apparent diffusion coefficient (ADC) maps in differentiating thymic carcinoma from lymphoma, and compare it with that of a commonly used hot-spot region-of-interest (ROI)-based ADC measurement. Diffusion-weighted imaging data of 15 patients with thymic carcinoma and 13 patients with lymphoma were retrospectively collected and processed with a mono-exponential model. ADC measurements were performed by using a histogram-based and a hot-spot-ROI-based approach. In the histogram-based approach, the following parameters were generated: mean ADC (ADCmean), median ADC (ADCmedian), 10th and 90th percentiles of ADC (ADC10 and ADC90), kurtosis, and skewness. The difference in ADCs between thymic carcinoma and lymphoma was compared using a t test. Receiver operating characteristic analyses were conducted to determine and compare the differentiating performance of the ADCs. Lymphoma demonstrated significantly lower ADCmean, ADCmedian, ADC10, ADC90, and hot-spot-ROI-based mean ADC than those found in thymic carcinoma (all p values < 0.05). There were no differences found in the kurtosis (p = 0.412) and skewness (p = 0.273). The ADC10 demonstrated optimal differentiating performance (cut-off value, 0.403 × 10⁻³ mm²/s; area under the receiver operating characteristic curve [AUC], 0.977; sensitivity, 92.3%; specificity, 93.3%), followed by the ADCmean, ADCmedian, ADC90, and hot-spot-ROI-based mean ADC. The AUC of ADC10 was significantly higher than that of the hot-spot-ROI-based ADC (0.977 vs. 0.797, p = 0.036). Compared with the commonly used hot-spot-ROI-based ADC measurement, a histogram analysis of ADC maps can improve the differentiating performance between thymic carcinoma and lymphoma.
Xu, Xiao-Quan; Li, Yan; Hong, Xun-Ning; Wu, Fei-Yun; Shi, Hai-Bin
2017-02-01
To assess the role of whole-tumor histogram analysis of apparent diffusion coefficient (ADC) maps in differentiating radiologically indeterminate vestibular schwannoma (VS) from meningioma in the cerebellopontine angle (CPA). Diffusion-weighted (DW) images (b = 0 and 1000 s/mm²) of pathologically confirmed and radiologically indeterminate CPA meningioma (CPAM) (n = 27) and VS (n = 12) were retrospectively collected and processed with a mono-exponential model. Whole-tumor regions of interest were drawn on all slices of the ADC maps to obtain histogram parameters, including the mean ADC (ADCmean), median ADC (ADCmedian), 10th/25th/75th/90th percentile ADC (ADC10, ADC25, ADC75 and ADC90), skewness and kurtosis. The differences in ADC histogram parameters between CPAM and VS were compared using an unpaired t-test. Multiple receiver operating characteristic (ROC) curve analyses were used to determine and compare the diagnostic value of each significant parameter. Significant differences were found in the ADCmean, ADCmedian, ADC10, ADC25, ADC75 and ADC90 between CPAM and VS (all p values < 0.001), while no significant difference was found in kurtosis (p = 0.562) and skewness (p = 0.047). ROC curve analysis revealed that a cut-off value of 1.126 × 10⁻³ mm²/s for the ADC90 value generated the highest area under the curve (AUC) for differentiating CPAM from VS (AUC, 0.975; sensitivity, 100%; specificity, 88.9%). Histogram analysis of ADC maps based on the whole tumor can be a useful tool for differentiating radiologically indeterminate CPAM from VS. The ADC90 value was the most promising parameter for differentiating these two entities.
Atherogenic lipid phenotype in a general group of subjects.
Van, Joanne; Pan, Jianqiu; Charles, M Arthur; Krauss, Ronald; Wong, Nathan; Wu, Xiaoshan
2007-11-01
The atherogenic lipid phenotype is a major cardiovascular risk factor, but normal values do not exist derived from 1 analysis in a general study group. To determine normal values of all of the atherogenic lipid phenotype parameters using subjects from a general study group. One hundred two general subjects were used to determine their atherogenic lipid phenotype using polyacrylamide gradient gels. Low-density lipoprotein (LDL) size revealed 24% of subjects express LDL phenotype B, defined as average LDL peak particle size 258 Å or less; however, among the Chinese subjects, the expression of the B phenotype was higher at 44% (P = .02). For the total group, mean LDL size was 265 ± 11 Å (1 SD); however, histograms were bimodal in both men and women. After excluding subjects expressing LDL phenotype B, because they are at increased cardiovascular risk and thus are not completely healthy, LDL histograms were unimodal and the mean LDL size was 270 ± 7 Å. A small, dense LDL concentration histogram (total group) revealed skewing; thus, phenotype B subjects were excluded, for the rationale described previously, and the mean value was 13 ± 9 mg/dL (0.33 ± 0.23 mmol/L). High-density lipoprotein (HDL) cholesterol histograms were bimodal in both sexes. After removing subjects as described previously or if HDL cholesterol levels were less than 45 mg/dL, histograms were unimodal and revealed a mean HDL cholesterol value of 61 ± 12 mg/dL (1.56 ± 0.31 mmol/L). HDL2, HDL2a, and HDL2b were similarly evaluated. Approximate normal values for the atherogenic lipid phenotype, similar to those derived from cardiovascular endpoint trials, can be determined if those high proportions of subjects with dyslipidemic cardiovascular risk are excluded.
Feasibility of histogram analysis of susceptibility-weighted MRI for staging of liver fibrosis
Yang, Zhao-Xia; Liang, He-Yue; Hu, Xin-Xing; Huang, Ya-Qin; Ding, Ying; Yang, Shan; Zeng, Meng-Su; Rao, Sheng-Xiang
2016-01-01
PURPOSE We aimed to evaluate whether histogram analysis of susceptibility-weighted imaging (SWI) could quantify liver fibrosis grade in patients with chronic liver disease (CLD). METHODS Fifty-three patients with CLD who underwent multi-echo SWI (TEs of 2.5, 5, and 10 ms) were included. Histogram analysis of the SWI images was performed, and the mean, variance, skewness, kurtosis, and the 1st, 10th, 50th, 90th, and 99th percentiles were derived. Quantitative histogram parameters were compared. For significant parameters, further receiver operating characteristic (ROC) analyses were performed to evaluate the potential diagnostic performance for differentiating liver fibrosis stages. RESULTS The number of patients in each pathologic fibrosis grade was 7, 3, 5, 5, and 33 for F0, F1, F2, F3, and F4, respectively. The variance (TE: 10 ms), 90th percentile (TE: 10 ms), and 99th percentile (TE: 10 and 5 ms) in the F0–F3 group were significantly lower than in the F4 group, with areas under the ROC curve (AUCs) of 0.84 for variance and 0.70–0.73 for the 90th and 99th percentiles, respectively. The variance (TE: 10 and 5 ms), 99th percentile (TE: 10 ms), and skewness (TE: 2.5 and 5 ms) in the F0–F2 group were smaller than those of the F3/F4 group, with AUCs of 0.88 and 0.69 for variance (TE: 10 and 5 ms, respectively), 0.68 for the 99th percentile (TE: 10 ms), and 0.73 and 0.68 for skewness (TE: 2.5 and 5 ms, respectively). CONCLUSION Magnetic resonance histogram analysis of SWI, particularly the variance, is promising for predicting advanced liver fibrosis and cirrhosis. PMID:27113421
A contrast enhancement method for improving the segmentation of breast lesions on ultrasonography.
Flores, Wilfrido Gómez; Pereira, Wagner Coelho de Albuquerque
2017-01-01
This paper presents an adaptive contrast enhancement method based on a sigmoidal mapping function (SACE), used for improving the computerized segmentation of breast lesions on ultrasound. First, an intensity variation map is obtained from the original ultrasound image and used to generate local sigmoidal mapping functions related to distinct contextual regions. Then, a bilinear interpolation scheme is used to transform every original pixel to a new gray-level value. In addition, four contrast enhancement techniques widely used in breast ultrasound enhancement are implemented: histogram equalization (HEQ), contrast limited adaptive histogram equalization (CLAHE), fuzzy enhancement (FEN), and sigmoid-based enhancement (SEN). These contrast enhancement techniques are also considered in a computerized lesion segmentation scheme based on the watershed transformation. The performance comparison among techniques is assessed in terms of both the quality of contrast enhancement and the segmentation accuracy. The former is quantified by a contrast enhancement measure (the greater the value, the better the contrast enhancement), whereas the latter is calculated by the Jaccard index, which should tend towards unity to indicate adequate segmentation. The experiments consider a data set with 500 breast ultrasound images. The results show that SACE outperforms its counterparts, with median values for the contrast measure of SACE: 139.4, SEN: 68.2, HEQ: 64.1, CLAHE: 62.8, and FEN: 7.9. Considering the segmentation performance results, the SACE method presents the largest accuracy, with median values for the Jaccard index of SACE: 0.81, FEN: 0.80, CLAHE: 0.79, HEQ: 0.77, and SEN: 0.63. The SACE method performs well due to the combination of three elements: (1) the intensity variation map reduces intensity variations that could distort the real response of the mapping function, (2) the sigmoidal mapping function enhances the gray-level range where the transition between lesion and background is found, and (3) the adaptive enhancing scheme copes with local contrasts. Hence, the SACE approach is appropriate for enhancing contrast before computerized lesion segmentation. Copyright © 2016 Elsevier Ltd. All rights reserved.
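A heavily simplified, global version of the sigmoidal mapping at the core of SACE is sketched below for illustration; the per-region mappings driven by the intensity variation map and the bilinear blending described above are omitted, and the gain and center values are arbitrary assumptions.

```python
import numpy as np

def sigmoid_enhance(img, center=None, gain=8.0):
    """img: float image in [0, 1]; returns a contrast-enhanced image in [0, 1]."""
    x = img.astype(float)
    c = np.median(x) if center is None else center   # transition point of the curve
    y = 1.0 / (1.0 + np.exp(-gain * (x - c)))
    # rescale so the output still spans [0, 1]
    return (y - y.min()) / (y.max() - y.min() + 1e-12)

us = np.random.rand(128, 128)        # stand-in for a breast ultrasound frame
enhanced = sigmoid_enhance(us, gain=10.0)
```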
Salas-Gonzalez, D; Górriz, J M; Ramírez, J; Padilla, P; Illán, I A
2013-01-01
A procedure to improve the convergence rate of affine registration methods for medical brain images when the images differ greatly from the template is presented. The methodology is based on histogram matching of the source images with respect to the reference brain template before proceeding with the affine registration. The preprocessed source brain images are spatially normalized to a template using a general affine model with 12 parameters. The sum of squared differences between the source images and the template is used as the objective function, and a Gauss-Newton optimization algorithm is used to find the minimum of the cost function. Using histogram equalization as a preprocessing step improves the convergence rate of the affine registration algorithm for brain images, as we show in this work using SPECT and PET brain images.
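A hedged sketch of the histogram-matching preprocessing step (not the registration itself) is given below; the source and template volumes are random placeholders.

```python
import numpy as np

def match_histogram(source, template):
    """Map source intensities so their empirical CDF matches the template's."""
    s = source.ravel()
    t = template.ravel()
    _, s_idx, s_counts = np.unique(s, return_inverse=True, return_counts=True)
    t_vals, t_counts = np.unique(t, return_counts=True)
    s_cdf = np.cumsum(s_counts) / s.size        # empirical CDF of the source
    t_cdf = np.cumsum(t_counts) / t.size        # empirical CDF of the template
    # map each source quantile to the template intensity with the same quantile
    mapped = np.interp(s_cdf, t_cdf, t_vals)
    return mapped[s_idx].reshape(source.shape)

src = np.random.rand(64, 64, 32)                # e.g. a SPECT volume (placeholder)
tpl = np.random.beta(2, 5, (64, 64, 32))        # brain template (placeholder)
src_matched = match_histogram(src, tpl)
```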
HoDOr: histogram of differential orientations for rigid landmark tracking in medical images
NASA Astrophysics Data System (ADS)
Tiwari, Abhishek; Patwardhan, Kedar Anil
2018-03-01
Feature extraction plays a pivotal role in pattern recognition and matching. An ideal feature should be invariant to image transformations such as translation, rotation, scaling, etc. In this work, we present a novel rotation-invariant feature, which is based on the Histogram of Oriented Gradients (HOG). We compare the performance of the proposed approach with the HOG feature on 2D phantom data, as well as 3D medical imaging data. We have used traditional histogram comparison measures such as the Bhattacharyya distance and the Normalized Correlation Coefficient (NCC) to assess the efficacy of the proposed approach under image rotation. In our experiments, the proposed feature performs 40%, 20%, and 28% better than the HOG feature on phantom (2D), Computed Tomography (CT, 3D), and Ultrasound (US, 3D) data, respectively, for image matching and landmark tracking tasks.
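For reference, the two histogram-comparison measures named above can be computed as in the following sketch, applied to normalized orientation histograms (the HoDOr feature itself is not reproduced here; the histograms are synthetic).

```python
import numpy as np

def bhattacharyya_distance(p, q):
    p = p / p.sum()
    q = q / q.sum()
    bc = np.sum(np.sqrt(p * q))                 # Bhattacharyya coefficient
    return -np.log(max(bc, 1e-12))

def normalized_correlation(p, q):
    p = p - p.mean()
    q = q - q.mean()
    return float(p @ q / (np.linalg.norm(p) * np.linalg.norm(q) + 1e-12))

h1 = np.random.rand(36)                         # 36-bin orientation histogram
h2 = np.roll(h1, 3) + 0.05 * np.random.rand(36) # rotated, slightly noisy version
print(bhattacharyya_distance(h1, h2), normalized_correlation(h1, h2))
```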
A novel parallel architecture for local histogram equalization
NASA Astrophysics Data System (ADS)
Ohannessian, Mesrob I.; Choueiter, Ghinwa F.; Diab, Hassan
2005-07-01
Local histogram equalization is an image enhancement algorithm that has found wide application in the pre-processing stage of areas such as computer vision, pattern recognition and medical imaging. The computationally intensive nature of the procedure, however, is a main limitation when real time interactive applications are in question. This work explores the possibility of performing parallel local histogram equalization, using an array of special purpose elementary processors, through an HDL implementation that targets FPGA or ASIC platforms. A novel parallelization scheme is presented and the corresponding architecture is derived. The algorithm is reduced to pixel-level operations. Processing elements are assigned image blocks, to maintain a reasonable performance-cost ratio. To further simplify both processor and memory organizations, a bit-serial access scheme is used. A brief performance assessment is provided to illustrate and quantify the merit of the approach.
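A sequential reference version of block-wise local histogram equalization is sketched below; the block size is arbitrary, and the parallel processor-array mapping that is the paper's contribution is not reproduced.

```python
import numpy as np

def equalize_block(block):
    """Standard histogram equalization of one 8-bit image block."""
    hist = np.bincount(block.ravel(), minlength=256)
    cdf = np.cumsum(hist).astype(float)
    cdf_min = cdf[cdf > 0].min()
    lut = np.clip(np.round(255 * (cdf - cdf_min) / (block.size - cdf_min + 1e-12)), 0, 255)
    return lut.astype(np.uint8)[block]

def local_hist_eq(img, block=32):
    out = np.empty_like(img)
    for y in range(0, img.shape[0], block):
        for x in range(0, img.shape[1], block):
            out[y:y + block, x:x + block] = equalize_block(img[y:y + block, x:x + block])
    return out

img = np.random.randint(0, 256, (256, 256), dtype=np.uint8)
enhanced = local_hist_eq(img, block=32)
```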
A Review of Particle Swarm Optimization
NASA Astrophysics Data System (ADS)
Jain, N. K.; Nangia, Uma; Jain, Jyoti
2018-03-01
This paper presents an overview of the research progress in Particle Swarm Optimization (PSO) during 1995-2017. Fifty-two papers have been reviewed and categorized into nine categories based on various aspects. This technique has attracted many researchers because of its simplicity, which has led to many improvements and modifications of the basic PSO. Some researchers have carried out hybridization of PSO with other evolutionary techniques. This paper discusses the progress of PSO, its improvements, modifications and applications.
NASA Astrophysics Data System (ADS)
Krummacher, B. C.; Mathai, M. K.; Choong, V.; Choulis, S. A.; So, F.; Winnacker, A.
2006-09-01
The external light output of organic light emitting diodes (OLEDs) can be increased by modifying the light emitting surface. The apparent light extraction enhancement is given by the ratio of the efficiency of the modified device to that of the unmodified device. This apparent light extraction enhancement depends on the OLED architecture itself and is not the correct value for judging the effectiveness of a technique that enhances light outcoupling through substrate surface modification. We propose a general method to evaluate substrate surface modification techniques for light extraction enhancement of OLEDs independently of the device architecture. This method is experimentally demonstrated using green electrophosphorescent OLEDs with different device architectures. The substrate surface of these OLEDs was modified by applying a prismatic film to increase light outcoupling from the device stack. It was demonstrated that the conventionally measured apparent light extraction enhancement by means of the prismatic film does not reflect the actual performance of the light outcoupling technique. Rather, by comparing the light extracted out of the prismatic film to that generated in the OLED layers and coupled into the substrate (before the substrate/air interface), a more accurate evaluation of light outcoupling enhancement can be achieved. Furthermore, we show that substrate surface modification can change the output spectrum of a broad-band-emitting OLED.
Liu, Chenwei; Shea, Nancy; Rucker, Sally; Harvey, Linda; Russo, Paul; Saul, Richard; Lopez, Mary F; Mikulskis, Alvydas; Kuzdzal, Scott; Golenko, Eva; Fishman, David; Vonderheid, Eric; Booher, Susan; Cowen, Edward W; Hwang, Sam T; Whiteley, Gordon R
2007-11-01
The use of proteomic patterns as a potential diagnostic technology has been well established for several cancer conditions and other diseases. The use of machine learning techniques such as decision trees, neural networks, genetic algorithms, and other methods has been the basis for pattern determination. Cancer is known to involve signaling pathways that are regulated through post-translational modification (PTM) of proteins. These modifications are also detectable with high confidence using high-resolution MS. We generated data using a prOTOF mass spectrometer on two sets of patient samples, ovarian cancer and cutaneous T-cell lymphoma (CTCL), with matched normal samples for each disease. Using the knowledge of mass shifts caused by common modifications, we built models using peak pairs and compared this to a conventional technique using individual peaks. The results for each disease showed that a small number of peak pairs gave classification equal to or better than the conventional technique that used multiple individual peaks. This simple peak-picking technique could be used to guide identification of important peak pairs involved in the disease process.
Enhanced removal of nitrate from water using surface modification of adsorbents--a review.
Loganathan, Paripurnanda; Vigneswaran, Saravanamuthu; Kandasamy, Jaya
2013-12-15
Elevated concentration of nitrate results in eutrophication of natural water bodies affecting the aquatic environment and reduces the quality of drinking water. This in turn causes harm to people's health, especially that of infants and livestock. Adsorbents with the high capacity to selectively adsorb nitrate are required to effectively remove nitrate from water. Surface modifications of adsorbents have been reported to enhance their adsorption of nitrate. The major techniques of surface modification are: protonation, impregnation of metals and metal oxides, grafting of amine groups, organic compounds including surfactant coating of aluminosilicate minerals, and heat treatment. This paper reviews current information on these techniques, compares the enhanced nitrate adsorption capacities achieved by the modifications, and the mechanisms of adsorption, and presents advantages and drawbacks of the techniques. Most studies on this subject have been conducted in batch experiments. These studies need to include continuous mode column trials which have more relevance to real operating systems and pilot-plant trials. Reusability of adsorbents is important for economic reasons and practical treatment applications. However, only limited information is available on the regeneration of surface modified adsorbents. Copyright © 2013 Elsevier Ltd. All rights reserved.
Chest CT window settings with multiscale adaptive histogram equalization: pilot study.
Fayad, Laura M; Jin, Yinpeng; Laine, Andrew F; Berkmen, Yahya M; Pearson, Gregory D; Freedman, Benjamin; Van Heertum, Ronald
2002-06-01
Multiscale adaptive histogram equalization (MAHE), a wavelet-based algorithm, was investigated as a method of automatic simultaneous display of the full dynamic contrast range of a computed tomographic image. Interpretation times were significantly lower for MAHE-enhanced images compared with those for conventionally displayed images. Diagnostic accuracy, however, was insufficient in this pilot study to allow recommendation of MAHE as a replacement for conventional window display.
Differentially Private Histogram Publication For Dynamic Datasets: An Adaptive Sampling Approach
Li, Haoran; Jiang, Xiaoqian; Xiong, Li; Liu, Jinfei
2016-01-01
Differential privacy has recently become a de facto standard for private statistical data release. Many algorithms have been proposed to generate differentially private histograms or synthetic data. However, most of them focus on "one-time" release of a static dataset and do not adequately address the increasing need to release series of dynamic datasets in real time. A straightforward application of existing histogram methods to each snapshot of such dynamic datasets will incur high accumulated error due to the composability of differential privacy and the correlations or overlapping users between snapshots. In this paper, we address the problem of releasing series of dynamic datasets in real time with differential privacy, using a novel adaptive distance-based sampling approach. Our first method, DSFT, uses a fixed distance threshold and releases a differentially private histogram only when the current snapshot is sufficiently different from the previous one, i.e., with a distance greater than a predefined threshold. Our second method, DSAT, further improves DSFT and uses a dynamic threshold adaptively adjusted by a feedback control mechanism to capture the data dynamics. Extensive experiments on real and synthetic datasets demonstrate that our approach achieves better utility than baseline methods and existing state-of-the-art methods. PMID:26973795
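The fixed-threshold idea behind DSFT can be sketched roughly as follows; this is an illustrative toy, not the authors' algorithm (the distance test here is run on already-noised histograms, and the real method's privacy accounting is not reproduced), with a synthetic Poisson stream as placeholder data.

```python
import numpy as np

def laplace_histogram(counts, epsilon):
    """Add Laplace noise with scale 1/epsilon to each histogram bin."""
    return counts + np.random.laplace(scale=1.0 / epsilon, size=counts.shape)

def dsft_release(snapshots, epsilon_per_release, threshold):
    published = None
    releases = []
    for counts in snapshots:
        noisy = laplace_histogram(counts, epsilon_per_release)
        if published is None or np.abs(noisy - published).sum() > threshold:
            published = noisy                   # distance exceeded: new release
        releases.append(published)              # otherwise: reuse the last release
    return releases

rng = np.random.default_rng(1)
stream = [rng.poisson(50, size=20) for _ in range(10)]   # 10 snapshots, 20 bins
out = dsft_release(stream, epsilon_per_release=0.5, threshold=100.0)
```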
Helmer, K G; Chou, M-C; Preciado, R I; Gimi, B; Rollins, N K; Song, A; Turner, J; Mori, S
2016-02-27
MRI-based multi-site trials now routinely include some form of diffusion-weighted imaging (DWI) in their protocol. These studies can include data originating from scanners built by different vendors, each with their own set of unique protocol restrictions, including restrictions on the number of available gradient directions, on whether an externally generated list of gradient directions can be used, and on the echo time (TE). One challenge of multi-site studies is to create a common imaging protocol that will result in a reliable and accurate set of diffusion metrics. The present study describes the effect of site, scanner vendor, field strength, and TE on two common metrics: the first moment of the diffusion tensor field (mean diffusivity, MD) and the fractional anisotropy (FA). We have shown in earlier work that ROI metrics and the means of MD and FA histograms are not sufficiently sensitive for use in site characterization. Here we use the distance between whole-brain histograms of FA and MD to investigate within- and between-site effects. We concluded that the variability of DTI metrics due to site, vendor, field strength, and echo time could influence the results in multi-center trials and that histogram distance is a sensitive metric for each of these variables.
Liu, Song; Zhang, Yujuan; Xia, Jie; Chen, Ling; Guan, Wenxian; Guan, Yue; Ge, Yun; He, Jian; Zhou, Zhengyang
2017-10-01
To explore the application of histogram analysis in preoperative T and N staging of gastric cancers, with a focus on characteristic parameters of apparent diffusion coefficient (ADC) maps. Eighty-seven patients with gastric cancers underwent diffusion-weighted magnetic resonance imaging (b = 0, 1000 s/mm²), which generated ADC maps. Whole-volume histogram analysis was performed on the ADC maps and 7 characteristic parameters were obtained. All those patients underwent surgery and postoperative pathologic T and N stages were determined. Four parameters, including skew, kurtosis, s-sDav and sample number, showed significant differences among gastric cancers at different T and N stages. Most parameters correlated with T and N stages significantly and worked in differentiating gastric cancers at different T or N stages. In particular, skew yielded a sensitivity of 0.758, a specificity of 0.810, and an area under the curve (AUC) of 0.802 for differentiating gastric cancers with and without lymph node metastasis (P < 0.001). All the parameters, except AUClow, showed good or excellent inter-observer agreement with intra-class correlation coefficients ranging from 0.710 to 0.991. Characteristic parameters derived from whole-volume ADC histogram analysis could help assess preoperative T and N stages of gastric cancers. Copyright © 2017. Published by Elsevier Inc.
Winter, Karsten; Richter, Cindy; Hoehn, Anna-Kathrin
2018-01-01
Our purpose was to analyze associations between apparent diffusion coefficient (ADC) histogram analysis parameters and histopathological features in head and neck squamous cell carcinoma (HNSCC). The study involved 32 patients with primary HNSCC. For every tumor, the following histogram analysis parameters were calculated: ADCmean, ADCmax, ADCmin, ADCmedian, ADCmode, P10, P25, P75, P90, kurtosis, skewness, and entropy. Furthermore, the proliferation index Ki-67, cell count, and total and average nucleic areas were estimated. Spearman's correlation coefficient (ρ) was used to analyze associations between the investigated parameters. In the overall sample, all ADC values showed moderate inverse correlations with Ki-67. All ADC values except ADCmax correlated inversely with tumor cellularity. Slight correlations were identified between total/average nucleic area and ADCmean, ADCmin, ADCmedian, and P25. In G1/2 tumors, only ADCmode correlated well with Ki-67. No statistically significant correlations between ADC parameters and cellularity were found. In G3 tumors, Ki-67 correlated with all ADC parameters except ADCmode. Cell count correlated well with all ADC parameters except ADCmax. Total nucleic area correlated inversely with ADCmean, ADCmin, ADCmedian, P25, and P90. ADC histogram parameters reflect proliferation potential and cellularity in HNSCC. The associations between histopathology and imaging depend on tumor grading. PMID:29805759
Universal and adapted vocabularies for generic visual categorization.
Perronnin, Florent
2008-07-01
Generic Visual Categorization (GVC) is the pattern classification problem which consists in assigning labels to an image based on its semantic content. This is a challenging task as one has to deal with inherent object/scene variations as well as changes in viewpoint, lighting and occlusion. Several state-of-the-art GVC systems use a vocabulary of visual terms to characterize images with a histogram of visual word counts. We propose a novel practical approach to GVC based on a universal vocabulary, which describes the content of all the considered classes of images, and class vocabularies obtained through the adaptation of the universal vocabulary using class-specific data. The main novelty is that an image is characterized by a set of histograms - one per class - where each histogram describes whether the image content is best modeled by the universal vocabulary or the corresponding class vocabulary. This framework is applied to two types of local image features: low-level descriptors such as the popular SIFT and high-level histograms of word co-occurrences in a spatial neighborhood. It is shown experimentally on two challenging datasets (an in-house database of 19 categories and the PASCAL VOC 2006 dataset) that the proposed approach exhibits state-of-the-art performance at a modest computational cost.
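The basic visual-word histogram that such systems build (before the universal/adapted vocabulary step proposed here) can be sketched as follows, using random descriptors and a random codebook as placeholders for SIFT-like features and a learned vocabulary.

```python
import numpy as np

def bow_histogram(descriptors, vocabulary):
    """descriptors: (n, d) local features; vocabulary: (k, d) visual words."""
    d2 = ((descriptors[:, None, :] - vocabulary[None, :, :]) ** 2).sum(axis=2)
    words = d2.argmin(axis=1)                   # nearest visual word per descriptor
    hist = np.bincount(words, minlength=len(vocabulary)).astype(float)
    return hist / max(hist.sum(), 1.0)          # normalized word-count histogram

rng = np.random.default_rng(0)
vocab = rng.normal(size=(64, 128))              # e.g. 64 words over 128-D descriptors
desc = rng.normal(size=(500, 128))              # descriptors from one image
h = bow_histogram(desc, vocab)                  # the image's visual-word histogram
```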
Statistical Properties of Line Centroid Velocity Increments in the rho Ophiuchi Cloud
NASA Technical Reports Server (NTRS)
Lis, D. C.; Keene, Jocelyn; Li, Y.; Phillips, T. G.; Pety, J.
1998-01-01
We present a comparison of histograms of CO (2-1) line centroid velocity increments in the rho Ophiuchi molecular cloud with those computed for spectra synthesized from a three-dimensional, compressible, but non-starforming and non-gravitating hydrodynamic simulation. Histograms of centroid velocity increments in the rho Ophiuchi cloud show clearly non-Gaussian wings, similar to those found in histograms of velocity increments and derivatives in experimental studies of laboratory and atmospheric flows, as well as numerical simulations of turbulence. The magnitude of these wings increases monotonically with decreasing separation, down to the angular resolution of the data. This behavior is consistent with that found in the phase of the simulation which has most of the properties of incompressible turbulence. The time evolution of the magnitude of the non-Gaussian wings in the histograms of centroid velocity increments in the simulation is consistent with the evolution of the vorticity in the flow. However, we cannot exclude the possibility that the wings are associated with the shock interaction regions. Moreover, in an active starforming region like the rho Ophiuchi cloud, the effects of shocks may be more important than in the simulation. However, being able to identify shock interaction regions in the interstellar medium is also important, since numerical simulations show that vorticity is generated in shock interactions.
Contrast Enhancement Algorithm Based on Gap Adjustment for Histogram Equalization
Chiu, Chung-Cheng; Ting, Chih-Chung
2016-01-01
Image enhancement methods have been widely used to improve the visual effects of images. Owing to its simplicity and effectiveness histogram equalization (HE) is one of the methods used for enhancing image contrast. However, HE may result in over-enhancement and feature loss problems that lead to unnatural look and loss of details in the processed images. Researchers have proposed various HE-based methods to solve the over-enhancement problem; however, they have largely ignored the feature loss problem. Therefore, a contrast enhancement algorithm based on gap adjustment for histogram equalization (CegaHE) is proposed. It refers to a visual contrast enhancement algorithm based on histogram equalization (VCEA), which generates visually pleasing enhanced images, and improves the enhancement effects of VCEA. CegaHE adjusts the gaps between two gray values based on the adjustment equation, which takes the properties of human visual perception into consideration, to solve the over-enhancement problem. Besides, it also alleviates the feature loss problem and further enhances the textures in the dark regions of the images to improve the quality of the processed images for human visual perception. Experimental results demonstrate that CegaHE is a reliable method for contrast enhancement and that it significantly outperforms VCEA and other methods. PMID:27338412
Analysis of dose heterogeneity using a subvolume-DVH
NASA Astrophysics Data System (ADS)
Said, M.; Nilsson, P.; Ceberg, C.
2017-11-01
The dose-volume histogram (DVH) is universally used in radiation therapy for its highly efficient way of summarizing three-dimensional dose distributions. An apparent limitation that is inherent to standard histograms is the loss of spatial information, e.g. it is no longer possible to tell where low- and high-dose regions are, and whether they are connected or disjoint. Two methods for overcoming the spatial fragmentation of low- and high-dose regions are presented, both based on the gray-level size zone matrix, which is a two-dimensional histogram describing the frequencies of connected regions of similar intensities. The first approach is a quantitative metric which can be likened to a homogeneity index. The large cold spot metric (LCS) is here defined to emphasize large contiguous regions receiving too low a dose; emphasis is put on both size, and deviation from the prescribed dose. In contrast, the subvolume-DVH (sDVH) is an extension to the standard DVH and allows for a qualitative evaluation of the degree of dose heterogeneity. The information retained from the two-dimensional histogram is overlaid on top of the DVH and the two are presented simultaneously. Both methods gauge the underlying heterogeneity in ways that the DVH alone cannot, and both have their own merits—the sDVH being more intuitive and the LCS being quantitative.
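For orientation, a plain cumulative dose-volume histogram can be computed as in the sketch below (placeholder dose grid and structure mask); the subvolume-DVH additionally records the sizes of connected iso-dose regions via the gray-level size zone matrix, which is not shown.

```python
import numpy as np

def cumulative_dvh(dose, mask, dose_bins):
    """Fraction of the structure volume receiving at least each bin dose."""
    d = dose[mask > 0].ravel()
    return np.array([(d >= b).mean() for b in dose_bins])

dose = np.random.normal(60.0, 3.0, (50, 50, 30))      # Gy, placeholder dose grid
ptv = np.zeros_like(dose, dtype=bool)
ptv[15:35, 15:35, 10:20] = True                        # placeholder target volume
bins = np.linspace(0.0, 70.0, 71)
dvh = cumulative_dvh(dose, ptv, bins)                  # e.g. dvh[57] ~ V57Gy
```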
Digital image classification with the help of artificial neural network by simple histogram.
Dey, Pranab; Banerjee, Nirmalya; Kaur, Rajwant
2016-01-01
Visual image classification is a great challenge to the cytopathologist in routine day-to-day work. Artificial neural network (ANN) may be helpful in this matter. In this study, we have tried to classify digital images of malignant and benign cells in effusion cytology smear with the help of simple histogram data and ANN. A total of 404 digital images consisting of 168 benign cells and 236 malignant cells were selected for this study. The simple histogram data was extracted from these digital images and an ANN was constructed with the help of Neurointelligence software [Alyuda Neurointelligence 2.2 (577), Cupertino, California, USA]. The network architecture was 6-3-1. The images were classified as training set (281), validation set (63), and test set (60). The on-line backpropagation training algorithm was used for this study. A total of 10,000 iterations were done to train the ANN system with the speed of 609.81/s. After the adequate training of this ANN model, the system was able to identify all 34 malignant cell images and 24 out of 26 benign cells. The ANN model can be used for the identification of the individual malignant cells with the help of simple histogram data. This study will be helpful in the future to identify malignant cells in unknown situations.
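An illustrative stand-in for the 6-3-1 network described above (not the authors' Neurointelligence model) is sketched below with scikit-learn, using synthetic histogram features and labels as placeholders; the train/hold-out split loosely follows the counts in the text.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(404, 6))                   # 6 histogram features per cell image
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # placeholder benign/malignant labels

clf = MLPClassifier(hidden_layer_sizes=(3,),    # one hidden layer of 3 units (6-3-1)
                    activation='logistic',
                    max_iter=10000, random_state=0)
clf.fit(X[:281], y[:281])                       # 281 training images, rest held out
print("held-out accuracy:", clf.score(X[281:], y[281:]))
```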
Liang, Alice L W; Vavasour, Irene M; Mädler, Burkhard; Traboulsee, Anthony L; Lang, Donna J; Li, David K B; MacKay, Alex L; Laule, Cornelia
2012-06-01
The presence of diffuse and widespread abnormalities within the 'normal appearing' white matter (NAWM) of the multiple sclerosis (MS) brain has been established. T(1) histogram analysis has revealed increased T(1) (related to water content) in segmented NAWM, while quantitative assessment of T(2) relaxation measures has demonstrated decreased myelin water fraction (MWF, related to myelin content) and increased geometric mean T(2) (GMT(2)) of the intra/extracellular water pool. Previous studies with follow-up periods of 1-5 years have demonstrated longitudinal changes in T(1) histogram metrics over time; however, longitudinal changes in MWF and GMT(2) of segmented NAWM have not been examined. We examined the short-term evolution of MWF, GMT(2) and T(1) in MS NAWM based on monthly scanning over 6 months in 18 relapsing-remitting (RR) MS subjects. Histogram metrics demonstrated short-term stability of T(1), MWF and GMT(2) in the NAWM of these subjects: we observed no change in MWF, GMT(2) or T(1) histogram metrics in NAWM in RRMS over the course of 6 months. Longer follow-up periods may be required to establish demonstrable changes in NAWM based on MWF, GMT(2) and T(1) metrics.
Slope histogram distribution-based parametrisation of Martian geomorphic features
NASA Astrophysics Data System (ADS)
Balint, Zita; Székely, Balázs; Kovács, Gábor
2014-05-01
The application of geomorphometric methods to the large Martian digital topographic datasets paves the way to analysing Martian areomorphic processes in more detail. One of these methods is the analysis of local slope distributions. For this purpose, a visualization program was developed that calculates local slope histograms and compares them based on a Kolmogorov distance criterion. As input data we used digital terrain models (DTMs) derived from HRSC (High Resolution Stereo Camera) images of various Martian regions. The Kolmogorov-criterion-based discrimination produces classes of slope histograms that are displayed using coloration, yielding an image map. In this image map the distributions can be visualized by different colours representing the various classes. Our goal is to create a local slope histogram based classification for large Martian areas in order to obtain information about the general morphological characteristics of each region. This is a contribution of the TMIS.ascrea project, financed by the Austrian Research Promotion Agency (FFG). The present research is partly realized in the frames of the TÁMOP 4.2.4.A/2-11-1-2012-0001 high priority "National Excellence Program - Elaborating and Operating an Inland Student and Researcher Personal Support System" convergence program project's scholarship support, using Hungarian state and European Union funds and co-financing from the European Social Fund.
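A rough sketch of the slope-histogram comparison described above is given below, with a random array standing in for an HRSC-derived DTM and an arbitrary tile split; the Kolmogorov distance here is taken as the maximum absolute difference between the two cumulative histograms.

```python
import numpy as np

def slope_deg(dtm, cell_size):
    """Slope in degrees from a DTM via finite differences."""
    gy, gx = np.gradient(dtm, cell_size)
    return np.degrees(np.arctan(np.hypot(gx, gy)))

def slope_histogram(slope_tile, bins=np.arange(0, 61, 2)):
    h, _ = np.histogram(slope_tile, bins=bins)
    return h / max(h.sum(), 1)

def kolmogorov_distance(h1, h2):
    return float(np.abs(np.cumsum(h1) - np.cumsum(h2)).max())

dtm = np.random.rand(256, 256) * 500.0          # placeholder DTM heights (m)
slope = slope_deg(dtm, cell_size=50.0)
hA = slope_histogram(slope[:128, :128])          # one tile
hB = slope_histogram(slope[128:, 128:])          # another tile
print(kolmogorov_distance(hA, hB))
```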
ERIC Educational Resources Information Center
Habib, H. S.
A professor involved with the HELDS project (Higher Education for Learning Disabled Students) describes modifications in a general chemistry course. A syllabus lists program objectives for eight text chapters, evaluation components, and course rules. Two units are described in detail, with information presented on modifications made for LD…
[Cosmetic technique application on the modification of lip deformity after cleft lip surgery].
Lixian, Chen; Huajun, Wang; Caixia, Gong; Qian, Zheng; Bing, Shi; Bihe, Zhang
2018-02-01
This study aims to explore the effect of a cosmetic technique on the modification of lip deformity after cleft lip surgery. A total of 35 patients who had undergone cleft lip surgery and required two-stage repair for residual nasolabial deformity were selected. The cosmetic technique was used to modify their lip deformities prior to surgery. Frontal photographs of the patients were taken before and after modification of their lip deformities. These photos were subsequently assessed by both the patients and the medical staff. The visual analogue method, the Asher-McDade aesthetic index, and the Mortier PB scale were used by patients and medical staff to evaluate changes in lip shape produced by the cosmetic technique. Prior to application of the cosmetic technique, the mean self-score of the patients and the mean score of the medical staff were 56±13 and 3.22±1.11 points, respectively. After application of the cosmetic technique, the mean self-score of the patients and the mean score of the medical staff were 67±12 and 2.85±1.03 points, respectively. The differences were statistically significant for both the patients and the medical staff (P<0.05). Appropriate use of the cosmetic technique can modify the lip deformity after cleft lip surgery to a certain degree. The use of this technique has evident effects in restoring the symmetry of the lip peaks and the continuity of the labial arch and in highlighting the philtrum column.
Fission gas bubble identification using MATLAB's image processing toolbox
DOE Office of Scientific and Technical Information (OSTI.GOV)
Collette, R.; King, J.; Keiser, Jr., D.
Automated image processing routines have the potential to aid in the fuel performance evaluation process by eliminating bias in human judgment that may vary from person-to-person or sample-to-sample. In addition, this study presents several MATLAB based image analysis routines designed for fission gas void identification in post-irradiation examination of uranium molybdenum (U–Mo) monolithic-type plate fuels. Frequency domain filtration, enlisted as a pre-processing technique, can eliminate artifacts from the image without compromising the critical features of interest. This process is coupled with a bilateral filter, an edge-preserving noise removal technique aimed at preparing the image for optimal segmentation. Adaptive thresholding proved to be the most consistent gray-level feature segmentation technique for U–Mo fuel microstructures. The Sauvola adaptive threshold technique segments the image based on histogram weighting factors in stable contrast regions and local statistics in variable contrast regions. Once all processing is complete, the algorithm outputs the total fission gas void count, the mean void size, and the average porosity. The final results demonstrate an ability to extract fission gas void morphological data faster, more consistently, and at least as accurately as manual segmentation methods.
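The Sauvola rule named above is a standard local threshold, T(x, y) = m(x, y) · (1 + k · (s(x, y)/R − 1)), with local mean m and local standard deviation s over a sliding window. A hedged sketch of it (not the authors' MATLAB routine) is shown below, with arbitrary window and k values and a synthetic micrograph as placeholder input.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def sauvola_threshold(img, window=25, k=0.2, R=128.0):
    """Per-pixel Sauvola threshold; returns True where pixels exceed it."""
    x = img.astype(float)
    mean = uniform_filter(x, size=window)
    sq_mean = uniform_filter(x * x, size=window)
    std = np.sqrt(np.maximum(sq_mean - mean ** 2, 0.0))
    thresh = mean * (1.0 + k * (std / R - 1.0))
    return img > thresh

micrograph = np.random.randint(0, 256, (200, 200)).astype(np.uint8)
voids = ~sauvola_threshold(micrograph)           # e.g. dark regions flagged as voids
porosity = voids.mean()                          # area fraction flagged as void
```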
Advances in photographic X-ray imaging for solar astronomy
NASA Technical Reports Server (NTRS)
Moses, J. Daniel; Schueller, R.; Waljeski, K.; Davis, John M.
1989-01-01
The technique of obtaining quantitative data from high resolution soft X-ray photographic images produced by grazing incidence optics was successfully developed to a high degree during the Solar Research Sounding Rocket Program and the S-054 X-Ray Spectrographic Telescope Experiment Program on Skylab. Continued use of soft X-ray photographic imaging in sounding rocket flights of the High Resolution Solar Soft X-Ray Imaging Payload has provided opportunities to further develop these techniques. The developments discussed include: (1) The calibration and use of an inexpensive, commercially available microprocessor controlled drum type film processor for photometric film development; (2) The use of Kodak Technical Pan 2415 film and Kodak SO-253 High Speed Holographic film for improved resolution; and (3) The application of a technique described by Cook, Ewing, and Sutton for determining the film characteristics curves from density histograms of the flight film. Although the superior sensitivity, noise level, and linearity of microchannel plate and CCD detectors attracts the development efforts of many groups working in soft X-ray imaging, the high spatial resolution and dynamic range as well as the reliability and ease of application of photographic media assures the continued use of these techniques in solar X-ray astronomy observations.
Anterior segment sparing to reduce charged particle radiotherapy complications in uveal melanoma
NASA Technical Reports Server (NTRS)
Daftari, I. K.; Char, D. H.; Verhey, L. J.; Castro, J. R.; Petti, P. L.; Meecham, W. J.; Kroll, S.; Blakely, E. A.; Chatterjee, A. (Principal Investigator)
1997-01-01
PURPOSE: The purpose of this investigation is to delineate the risk factors in the development of neovascular glaucoma (NVG) after helium-ion irradiation of uveal melanoma patients and to propose a treatment technique that may reduce this risk. METHODS AND MATERIALS: A total of 347 uveal melanoma patients were treated with helium ions using a single-port treatment technique. Using univariate and multivariate statistics, the NVG complication rate was analyzed according to the percent of anterior chamber in the radiation field, tumor size, tumor location, sex, age, dose, and other risk factors. Several University of California San Francisco-Lawrence Berkeley National Laboratory (LBNL) patients in each size category (medium, large, and extralarge) were retrospectively replanned using two ports instead of a single port. By using appropriate polar and azimuthal gaze angles, or by treating patients with two ports, the maximum dose to the anterior segment of the eye can often be reduced. Although a larger volume of the anterior chamber may receive a lower dose with two ports than with a single-port treatment, we hypothesize that this could reduce the level of complications that result from irradiation of the anterior chamber of the eye. Dose-volume histograms were calculated for the lens and compared for the single-port and two-port techniques. RESULTS: NVG developed in 121 (35%) patients. The risk of NVG peaked between 1 and 2.5 years posttreatment. By univariate and multivariate analysis, the percent of lens in the field was strongly correlated with the development of NVG. Other contributing factors were tumor height, history of diabetes, and vitreous hemorrhage. Dose-volume histogram analysis of single-port vs. two-port techniques demonstrates that, for some patients in the medium and large tumor categories, a significant decrease in dose to the structures in the anterior segment of the eye could have been achieved with the use of two ports. CONCLUSION: The development of NVG after helium-ion irradiation is correlated with the amount of lens and anterior chamber in the treatment field, tumor height, proximity to the fovea, history of diabetes, and the development of vitreous hemorrhage. Although the influence of the higher LET deposition of helium ions is unclear, this study suggests that reducing the dose to the anterior segment of the eye may reduce NVG complications. Based on this retrospective analysis of LBNL patients, we have implemented techniques to reduce the amount of the anterior segment receiving a high dose in our new series of patients treated with protons using the cyclotron at the UC Davis Crocker Nuclear Laboratory (CNL).
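As a rough illustration of the dose-volume histogram comparison described above, the sketch below computes a cumulative DVH for a structure mask on a dose grid. The dose grids, the lens mask, and the two-port scaling factor are hypothetical stand-ins, not the study's treatment-planning data.

```python
import numpy as np

def cumulative_dvh(dose, mask, bins=100):
    """Dose levels and the fraction of the masked structure receiving at least each level."""
    d = dose[mask]
    levels = np.linspace(0.0, d.max(), bins)
    volume_fraction = np.array([(d >= lv).mean() for lv in levels])
    return levels, volume_fraction

# Hypothetical comparison of single-port vs. two-port dose grids for the same lens mask.
rng = np.random.default_rng(0)
lens_mask = np.zeros((40, 40, 40), dtype=bool)
lens_mask[15:25, 15:25, 15:25] = True
dose_one_port = rng.uniform(0.0, 60.0, lens_mask.shape)   # stand-in dose grid (GyE)
dose_two_port = 0.6 * dose_one_port                       # stand-in for the replanned grid

for name, dose in [("single port", dose_one_port), ("two ports", dose_two_port)]:
    levels, vf = cumulative_dvh(dose, lens_mask)
    idx = min(np.searchsorted(levels, 30.0), len(levels) - 1)
    print(name, "- max lens dose:", round(float(dose[lens_mask].max()), 1),
          "| fraction of lens >= 30 GyE:", round(float(vf[idx]), 2))
```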
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abraham, John Bishoy Sam; Pacheco, Jose L.; Aguirre, Brandon Adrian
2016-08-09
We demonstrate low-energy single-ion detection using a co-planar detector fabricated on a diamond substrate and characterized by ion-beam-induced charge collection. Histograms taken with low-fluence ion pulses illustrate quantized ion detection down to a single ion with a signal-to-noise ratio of approximately 10. We anticipate that this detection technique can serve as a basis to optimize the yield of single color centers in diamond. In conclusion, the ability to count ions implanted into a diamond substrate is expected to reduce the uncertainty in the yield of color center formation by removing Poisson statistics from the implantation process.
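The following toy simulation (not the authors' analysis) illustrates the claimed benefit: dose-timed implantation leaves a Poisson spread in the number of delivered ions per site, whereas gating the beam off at a detected count fixes that number exactly.

```python
import numpy as np

rng = np.random.default_rng(1)
target = 5            # desired ions per implantation site
sites = 100_000

poisson_delivery = rng.poisson(target, sites)   # dose-timed implantation, no counting
counted_delivery = np.full(sites, target)       # beam gated off at the detected ion count

print("Poisson: mean %.2f, std %.2f" % (poisson_delivery.mean(), poisson_delivery.std()))
print("Counted: mean %.2f, std %.2f" % (counted_delivery.mean(), counted_delivery.std()))
print("P(exactly %d ions) without counting: %.2f" % (target, (poisson_delivery == target).mean()))
```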
NASA Technical Reports Server (NTRS)
Thomson, F.
1972-01-01
The additional processing performed on data collected over the Rhode River Test Site and Forestry Site in November 1970 is reported. The techniques and procedures used to obtain the processed results are described. Thermal data collected over three approximately parallel lines of the site were contoured and the results color-coded to delineate important scene constituents and to identify trees attacked by pine bark beetles. Contouring work and histogram preparation are reviewed, and the important conclusions from the spectral analysis and recognition computer (SPARC) signature extension work are summarized. The SPARC setup and processing records are presented and recommendations are made for future data collection over the site.
Towards advanced OCT clinical applications
NASA Astrophysics Data System (ADS)
Kirillin, Mikhail; Panteleeva, Olga; Agrba, Pavel; Pasukhin, Mikhail; Sergeeva, Ekaterina; Plankina, Elena; Dudenkova, Varvara; Gubarkova, Ekaterina; Kiseleva, Elena; Gladkova, Natalia; Shakhova, Natalia; Vitkin, Alex
2015-07-01
In this paper we report on our recent achievements in applying conventional and cross-polarization OCT (CP OCT) modalities to in vivo clinical diagnostics in different medical areas, including gynecology, dermatology, and stomatology. In gynecology, CP OCT was employed for diagnosing the fallopian tubes and cervix; in dermatology, OCT was used for monitoring the treatment of psoriasis, scleroderma, and atopic dermatitis; and in stomatology, for the diagnosis of oral diseases. For all considered applications, we propose and develop image processing methods that enhance the diagnostic value of the technique. In particular, we use histogram analysis, Fourier analysis, and neural networks to calculate different tissue characteristics as revealed by the polarization evolution observed with OCT. These approaches enable improved OCT image quantification and increase the resultant diagnostic accuracy.
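As an illustration of the kind of histogram- and Fourier-based quantification mentioned above, the sketch below computes a few simple features for an OCT region of interest. The specific feature set and the synthetic region are assumptions for demonstration only, not the authors' processing chain.

```python
import numpy as np

def oct_region_features(roi):
    """roi: 2-D array of OCT signal intensities (rows = depth) for one tissue region."""
    hist, _ = np.histogram(roi, bins=64)
    p = hist / hist.sum() + 1e-12                      # histogram as probabilities
    depth_profile = roi.mean(axis=1)                   # average A-scan over the region
    spectrum = np.abs(np.fft.rfft(depth_profile))
    return {
        "mean": float(roi.mean()),
        "std": float(roi.std()),
        "skewness": float(((roi - roi.mean()) ** 3).mean() / roi.std() ** 3),
        "histogram_entropy": float(-np.sum(p * np.log2(p))),
        "low_freq_fraction": float(spectrum[:8].sum() / spectrum.sum()),
    }

roi = np.random.default_rng(2).normal(100.0, 15.0, (128, 64))  # stand-in for a real B-scan ROI
print(oct_region_features(roi))
```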
Maturity assessment of harumanis mango using thermal camera sensor
NASA Astrophysics Data System (ADS)
Sa'ad, F. S. A.; Shakaff, A. Y. Md.; Zakaria, A.; Abdullah, A. H.; Ibrahim, M. F.
2017-03-01
The perceived quality of fruits such as mangoes depends greatly on parameters such as ripeness, shape, and size, and is influenced by other factors such as harvesting time. Unfortunately, manual fruit grading has several drawbacks, including subjectivity, tediousness, and inconsistency. Automating the procedure and developing new classification techniques may solve these problems. This paper presents novel work on using infrared imaging as a tool for quality monitoring of Harumanis mangoes. The histogram of the infrared image was used to distinguish and classify the level of ripeness of the fruits, based on the colour spectrum, by week. The proposed approach using thermal data was able to achieve 90.5% correct classification.
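A minimal sketch of this histogram-based classification idea is given below, using a normalized grey-level histogram as the feature vector and an off-the-shelf SVM. The synthetic "thermal images", labels, and classifier choice are all assumptions, not the authors' setup.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def thermal_histogram_feature(image, bins=32):
    """Normalized grey-level histogram of a 0-255 scaled thermal image."""
    hist, _ = np.histogram(image, bins=bins, range=(0, 255))
    return hist / hist.sum()

# Stand-in data: 60 synthetic "thermal images" whose mean level shifts with ripeness week.
rng = np.random.default_rng(3)
images = [rng.normal(100 + 20 * week, 10, (64, 64)).clip(0, 255)
          for week in range(3) for _ in range(20)]
labels = [week for week in range(3) for _ in range(20)]

X = np.array([thermal_histogram_feature(im) for im in images])
print("cross-validated accuracy:", cross_val_score(SVC(kernel="rbf"), X, labels, cv=5).mean())
```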
ERIC Educational Resources Information Center
Schneider, Marlene; Robin, Arthur
This manual describes the Turtle Technique which was developed to help children with behavior problems control their own disruptive behavior. The technique differs from other behavior modification techniques in that it is based upon self-control rather than external control of disruptive behavior. The Turtle Technique first teaches the child how…
Nanotubular surface modification of metallic implants via electrochemical anodization technique.
Wang, Lu-Ning; Jin, Ming; Zheng, Yudong; Guan, Yueping; Lu, Xin; Luo, Jing-Li
2014-01-01
Due to increased awareness and interest in the biomedical implant field as a result of an aging population, research in the field of implantable devices has grown rapidly in the last few decades. Among biomedical implants, metallic implant materials have been widely used to replace disordered bony tissues in orthopedic and orthodontic surgeries. The clinical success of implants is closely related to their early osseointegration (i.e., the direct structural and functional connection between living bone and the surface of a load-bearing artificial implant), which relies heavily on the surface condition of the implant. Electrochemical techniques for modifying biomedical implants are relatively simple, cost-effective, and appropriate for implants with complex shapes. Recently, growing metal oxide nanotubular arrays via electrochemical anodization has become an attractive way to modify metallic implants and enhance their biocompatibility and bioactivity. This article thoroughly reviews the relevance of electrochemical anodization techniques for the nanoscale modification of metallic implant surfaces and covers the types of nanotubular/nanoporous modification achievable via electrochemical approaches, which hold tremendous potential for bio-implant applications. In vitro and in vivo studies using metallic oxide nanotubes are also presented, revealing the potential of nanotubes in biomedical applications. Finally, an outlook on the future growth of research in metallic oxide nanotubular arrays is provided. This article will therefore provide researchers with an in-depth understanding of electrochemical anodization modification and offer guidance on designing and tuning new materials to achieve the desired performance and reliable biocompatibility.
Performance and durability of high emittance heat receiver surfaces for solar dynamic power systems
NASA Technical Reports Server (NTRS)
Degroh, Kim K.; Roig, David M.; Burke, Christopher A.; Shah, Dilipkumar R.
1994-01-01
Haynes 188, a cobalt-based superalloy, will be used to make thermal energy storage (TES) containment canisters for a 2 kW solar dynamic ground test demonstrator (SD GTD). Haynes 188 containment canisters with a high thermal emittance (epsilon) are desired for radiating heat away from local hot spots and improving the heating distribution, which will in turn improve canister service life. In addition to needing a high emittance, the surface needs to be durable in an elevated-temperature, high-vacuum environment for an extended time period. Thirty-five Haynes 188 samples were treated with 14 different surface modification techniques and evaluated for emittance enhancement and vacuum heat treatment (VHT) durability. Optical properties were obtained for the modified surfaces. Emittance-enhanced samples were exposed to VHT for up to 2692 hours at 827 C and pressures at or below 10^-6 torr, with integral thermal cycling. Optical properties were measured intermittently during exposure and after the final VHT exposure. The various surface modification treatments increased the emittance of pristine Haynes 188 from 0.11 up to 0.86. Seven different surface modification techniques were found to provide surfaces that met the SD GTD receiver VHT durability requirement. Of the 7 surface treatments, 2 displayed excellent VHT durability: an alumina-based (AB) coating and a zirconia-based coating. The alumina-based coating was chosen as the emittance enhancement surface modification technique for the SD GTD receiver. Details of the performance and vacuum heat treatment durability of this coating and other Haynes 188 emittance surface modification techniques are discussed. Technology from this program will lead to successful demonstration of solar dynamic power for space applications and has potential for application in other systems requiring high-emittance surfaces.
Angular relational signature-based chest radiograph image view classification.
Santosh, K C; Wendling, Laurent
2018-01-22
In a computer-aided diagnosis (CAD) system, especially for chest radiograph or chest X-ray (CXR) screening, CXR image view information is required. Automatically separating CXR image views (frontal and lateral) can ease the subsequent CXR screening process, since the screening techniques may not work equally well for both views. We present a novel technique to classify frontal and lateral CXR images, in which we introduce an angular relational signature based on the force histogram to extract features and apply three different state-of-the-art classifiers (multi-layer perceptron, random forest, and support vector machine) to make the decision. We validated our fully automatic technique on a set of 8100 images hosted by the U.S. National Library of Medicine (NLM), National Institutes of Health (NIH), and achieved an accuracy close to 100%. Our method outperforms state-of-the-art methods in processing time (less than or close to 2 s for the whole test set) while achieving comparable accuracy, which justifies its practicality. Graphical Abstract: Interpreting chest X-rays (CXR) through the angular relational signature.
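The sketch below illustrates only the classification stage with the three classifiers named in the abstract. The angular feature extractor is a simplified placeholder (the actual force-histogram computation is not reproduced here), and the images and labels are synthetic stand-ins.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def angular_signature(image, n_angles=36):
    """Placeholder directional signature: mean intensity in each angular sector about the centre."""
    rows, cols = image.shape
    y, x = np.ogrid[:rows, :cols]
    theta = np.arctan2(y - rows / 2, x - cols / 2)
    sector = ((theta + np.pi) / (2 * np.pi) * n_angles).astype(int) % n_angles
    return np.array([image[sector == k].mean() for k in range(n_angles)])

rng = np.random.default_rng(4)
X = np.array([angular_signature(rng.random((64, 64))) for _ in range(40)])  # stand-in "CXR" images
y = rng.integers(0, 2, 40)                                                  # 0 = frontal, 1 = lateral
for clf in (MLPClassifier(max_iter=1000), RandomForestClassifier(), SVC()):
    print(type(clf).__name__, cross_val_score(clf, X, y, cv=4).mean())
```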
NASA Astrophysics Data System (ADS)
Liu, Hong; Nodine, Calvin F.
1996-07-01
This paper presents a generalized image contrast enhancement technique, which equalizes the perceived brightness distribution based on the Heinemann contrast discrimination model. It is based on the mathematically proven existence of a unique solution to a nonlinear equation, and is formulated with easily tunable parameters. The model uses a two-step log-log representation of luminance contrast between targets and surround in a luminous background setting. The algorithm consists of two nonlinear gray scale mapping functions that have seven parameters, two of which are adjustable Heinemann constants. Another parameter is the background gray level. The remaining four parameters are nonlinear functions of the gray-level distribution of the given image, and can be uniquely determined once the previous three are set. Tests have been carried out to demonstrate the effectiveness of the algorithm for increasing the overall contrast of radiology images. The traditional histogram equalization can be reinterpreted as an image enhancement technique based on the knowledge of human contrast perception. In fact, it is a special case of the proposed algorithm.
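For reference, the special case mentioned at the end, plain histogram equalization via the cumulative grey-level histogram, can be written in a few lines; the Heinemann-model mapping with its seven parameters is not reproduced here.

```python
import numpy as np

def histogram_equalize(image):
    """Map 8-bit grey levels through the normalized cumulative histogram."""
    hist, _ = np.histogram(image.ravel(), bins=256, range=(0, 256))
    cdf = hist.cumsum() / hist.sum()
    return np.uint8(np.round(255.0 * cdf[image]))

img = np.random.default_rng(5).integers(40, 120, (128, 128), dtype=np.uint8)  # low-contrast test image
out = histogram_equalize(img)
print("grey-level range before:", (int(img.min()), int(img.max())),
      "after:", (int(out.min()), int(out.max())))
```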
NASA Astrophysics Data System (ADS)
Paramanandham, Nirmala; Rajendiran, Kishore
2018-01-01
A novel image fusion technique is presented for integrating infrared and visible images. Integration of images from the same or different sensing modalities can deliver information that cannot be obtained by viewing the sensor outputs individually and consecutively. In this paper, a swarm-intelligence-based image fusion technique in the discrete cosine transform (DCT) domain is proposed for surveillance applications; it integrates the infrared image with the visible image to generate a single informative fused image. Particle swarm optimization (PSO) is used in the fusion process to obtain the optimized weighting factors. These optimized weighting factors are used to fuse the DCT coefficients of the visible and infrared images, and the inverse DCT is applied to obtain the initial fused image. An enhanced fused image is then obtained through adaptive histogram equalization for better visual understanding and target detection. The proposed framework is evaluated using quantitative metrics such as standard deviation, spatial frequency, entropy, and mean gradient. The experimental results demonstrate that the proposed algorithm outperforms many other state-of-the-art techniques reported in the literature.
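A simplified sketch of the fusion stage is shown below: a single fixed global weight on the DCT coefficients stands in for the PSO-optimized weighting factors, followed by adaptive histogram equalization. The input images and the weight value are illustrative assumptions.

```python
import numpy as np
from scipy.fft import dctn, idctn
from skimage import data, exposure, img_as_float
from skimage.measure import shannon_entropy

visible = img_as_float(data.camera())          # stand-in for the visible-band frame
infrared = 1.0 - visible                       # crude stand-in for the co-registered IR frame

w = 0.6                                        # fixed weight; the paper optimizes this with PSO
fused_dct = w * dctn(visible, norm="ortho") + (1.0 - w) * dctn(infrared, norm="ortho")
fused = np.clip(idctn(fused_dct, norm="ortho"), 0.0, 1.0)
enhanced = exposure.equalize_adapthist(fused)  # adaptive histogram equalization (CLAHE)

print("std:", enhanced.std(), "entropy:", shannon_entropy(enhanced))
```

Because a single global weight commutes with the linear DCT, the practical gain comes from letting PSO choose block- or coefficient-wise weights, which this sketch deliberately omits.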
Cost Savings Potential of Modification to the Standard Light Rail Vehicle Specification
DOT National Transportation Integrated Search
1979-02-01
This report describes an assessment of the Standard Light Rail Vehicle (SLRV) specification to determine whether the relaxation or modification of some requirements could result in a significant reduction in vehicle costs. A Technique of Assessment b...