Science.gov

Sample records for adaptive histogram equalization

  1. Combining Vector Quantization and Histogram Equalization.

    ERIC Educational Resources Information Center

    Cosman, Pamela C.; And Others

    1992-01-01

    Discussion of contrast enhancement techniques focuses on the use of histogram equalization with a data compression technique, i.e., tree-structured vector quantization. The enhancement technique of intensity windowing is described, and the use of enhancement techniques for medical images is explained, including adaptive histogram equalization.…

  2. Osteoarthritis Classification Using Self Organizing Map Based on Gabor Kernel and Contrast-Limited Adaptive Histogram Equalization

    PubMed Central

    Anifah, Lilik; Purnama, I Ketut Eddy; Hariadi, Mochamad; Purnomo, Mauridhi Hery

    2013-01-01

    Localization is the first step in osteoarthritis (OA) classification. Manual classification, however, is time-consuming, tedious, and expensive. The proposed system is designed as a decision support system to help medical doctors classify the severity of knee OA. A method is proposed here to localize the joint space area and classify OA into KL-Grade 0, KL-Grade 1, KL-Grade 2, KL-Grade 3, and KL-Grade 4 in four steps: preprocessing, segmentation, feature extraction, and classification. In this system, right and left knee detection was performed using Contrast-Limited Adaptive Histogram Equalization (CLAHE) and template matching. The Gabor kernel, row sum graph, and moment methods were used to localize the joint space area of the knee. CLAHE is used in the preprocessing step, i.e., to normalize the varied intensities. The segmentation process was conducted using the Gabor kernel, template matching, row sum graph, and gray-level center of mass methods. GLCM features (contrast, correlation, energy, and homogeneity) were employed as training data. Overall, 50 data were used for training and 258 for testing. Experimental results showed the best performance using the Gabor kernel with parameters α=8, θ=0, Ψ=[0 π/2], γ=0.8, N=4 and, for the classification process, 5000 iterations, a momentum value of 0.5, and α0=0.6. The run gave classification accuracy rates of 93.8% for KL-Grade 0, 70% for KL-Grade 1, 4% for KL-Grade 2, 10% for KL-Grade 3, and 88.9% for KL-Grade 4. PMID:23525188
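
    The contrast-limiting step at the heart of CLAHE (clip the local histogram, redistribute the excess, then equalize) can be sketched for a single tile as below; the clip limit, the 8-bit range, and the uniform redistribution are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

def clahe_tile_equalize(tile, clip_limit=40):
    """Equalize one image tile with a clipped histogram (CLAHE core idea).

    Counts above clip_limit are cut off and redistributed uniformly over
    all bins, which limits contrast amplification in near-uniform regions.
    """
    hist, _ = np.histogram(tile, bins=256, range=(0, 256))
    excess = np.maximum(hist - clip_limit, 0).sum()
    hist = np.minimum(hist, clip_limit) + excess // 256
    cdf = np.cumsum(hist).astype(np.float64)
    cdf = (cdf - cdf.min()) / max(cdf.max() - cdf.min(), 1)  # normalize to [0, 1]
    lut = np.round(cdf * 255).astype(np.uint8)               # gray-level mapping
    return lut[tile]

# Low-contrast tile: values concentrated in a narrow band around 100.
rng = np.random.default_rng(0)
tile = rng.integers(90, 110, size=(64, 64)).astype(np.uint8)
out = clahe_tile_equalize(tile)
print(tile.min(), tile.max(), out.min(), out.max())
```

    In a full CLAHE implementation the image is divided into tiles and the per-tile mappings are bilinearly interpolated; the clipping above is what distinguishes CLAHE from plain adaptive HE.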

  4. An innovative technique for contrast enhancement of computed tomography images using normalized gamma-corrected contrast-limited adaptive histogram equalization

    NASA Astrophysics Data System (ADS)

    Al-Ameen, Zohair; Sulong, Ghazali; Rehman, Amjad; Al-Dhelaan, Abdullah; Saba, Tanzila; Al-Rodhaan, Mznah

    2015-12-01

    Image contrast is an essential visual feature that determines whether an image is of good quality. In computed tomography (CT), captured images tend to be low contrast, a prevalent artifact that reduces image quality and hampers the extraction of useful information. A common tactic for treating this artifact is to use histogram-based techniques. However, although these techniques may improve the contrast for different grayscale imaging applications, the results are mostly unacceptable for CT images owing to various faults: noise amplification, excess brightness, and imperfect contrast. Therefore, an ameliorated version of contrast-limited adaptive histogram equalization (CLAHE) is introduced in this article to provide good brightness with decent contrast for CT images. The modification to the aforesaid technique adds an initial phase of a normalized gamma correction function that adjusts the gamma of the processed image, avoiding the excess brightness and imperfect contrast produced by the basic CLAHE. The newly developed technique is tested with synthetic and real degraded low-contrast CT images, on which it produces markedly better-quality results. Moreover, this low-complexity contrast enhancement technique is evaluated against various versions of histogram-based enhancement techniques using three advanced image quality assessment metrics: Universal Image Quality Index (UIQI), Structural Similarity Index (SSIM), and Feature Similarity Index (FSIM). The proposed technique provided acceptable results with no visible artifacts and outperformed all the comparable techniques.
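
    The two-stage pipeline described above (normalized gamma correction followed by histogram equalization) can be sketched as follows; the gamma value is an illustrative constant rather than the article's adaptive choice, and plain global HE stands in for the CLAHE stage.

```python
import numpy as np

def normalized_gamma(img, gamma=0.6):
    """Normalized gamma correction: map to [0, 1], apply power law, map back.

    gamma < 1 brightens dark regions; the value here is illustrative, not
    the normalized adaptive choice described in the article.
    """
    x = img.astype(np.float64) / 255.0
    return np.round((x ** gamma) * 255).astype(np.uint8)

def global_equalize(img):
    """Plain global histogram equalization (stand-in for the CLAHE stage)."""
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    cdf = np.cumsum(hist).astype(np.float64)
    cdf = (cdf - cdf.min()) / max(cdf.max() - cdf.min(), 1)
    return np.round(cdf * 255).astype(np.uint8)[img]

rng = np.random.default_rng(1)
ct = rng.integers(0, 60, size=(32, 32)).astype(np.uint8)  # dark, low-contrast "CT"
enhanced = global_equalize(normalized_gamma(ct))
print(ct.mean(), enhanced.mean())
```

    Applying the gamma stage first lifts the dark intensities so that the subsequent equalization does not over-brighten them, which is the intuition behind the article's ordering.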

  5. Contrast enhancement via texture region based histogram equalization

    NASA Astrophysics Data System (ADS)

    Singh, Kuldeep; Vishwakarma, Dinesh K.; Singh Walia, Gurjit; Kapoor, Rajiv

    2016-08-01

    This paper presents two novel contrast enhancement approaches using texture-region-based histogram equalization (HE). In HE-based contrast enhancement methods, the enhanced image often contains undesirable artefacts because an excessive number of pixels in the non-textured areas heavily bias the histogram. The novel idea presented in this paper is to suppress the impact of pixels in non-textured areas and to exploit texture features for the computation of the histogram in the process of HE. The first algorithm, named Dominant Orientation-based Texture Histogram Equalization (DOTHE), constructs the histogram of the image using only those image patches having dominant orientation. DOTHE categorizes image patches into smooth, dominant, or non-dominant orientation patches using image variance and the singular value decomposition algorithm, and utilizes only dominant orientation patches in the process of HE. The second method, termed Edge-based Texture Histogram Equalization, calculates significant edges in the image and constructs the histogram using the grey levels present in the neighbourhood of edges. The cumulative distribution function of the histogram formed from texture features is mapped onto the entire dynamic range of the input image to produce the contrast-enhanced image. Subjective as well as objective performance assessment of the proposed methods is conducted and compared with other existing HE methods. The performance assessment in terms of visual quality, contrast improvement index, entropy, and measure of enhancement reveals that the proposed methods outperform the existing HE methods.
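
    The second method's key idea, building the histogram only from grey levels near significant edges, can be sketched as below; a simple gradient-magnitude threshold stands in for the paper's edge detector, and `edge_thresh` is an illustrative parameter.

```python
import numpy as np

def edge_based_equalize(img, edge_thresh=10):
    """HE where the histogram is built only from pixels near strong edges.

    Pixels in flat regions are excluded from the histogram, so a large
    uniform background cannot bias the equalization mapping.
    """
    gy, gx = np.gradient(img.astype(np.float64))
    edges = np.hypot(gx, gy) > edge_thresh         # crude "significant edge" mask
    hist, _ = np.histogram(img[edges], bins=256, range=(0, 256))
    cdf = np.cumsum(hist).astype(np.float64)
    cdf = (cdf - cdf.min()) / max(cdf.max() - cdf.min(), 1)
    return np.round(cdf * 255).astype(np.uint8)[img]

# Flat background with a bright square: background pixels do not dominate
# the histogram because only edge neighbourhoods are counted.
img = np.full((64, 64), 50, dtype=np.uint8)
img[20:40, 20:40] = 150
out = edge_based_equalize(img)
print(sorted(set(out.ravel().tolist())))
```
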

  6. A novel parallel architecture for local histogram equalization

    NASA Astrophysics Data System (ADS)

    Ohannessian, Mesrob I.; Choueiter, Ghinwa F.; Diab, Hassan

    2005-07-01

    Local histogram equalization is an image enhancement algorithm that has found wide application in the pre-processing stage of areas such as computer vision, pattern recognition, and medical imaging. The computationally intensive nature of the procedure, however, is a main limitation for real-time interactive applications. This work explores the possibility of performing parallel local histogram equalization, using an array of special-purpose elementary processors, through an HDL implementation that targets FPGA or ASIC platforms. A novel parallelization scheme is presented and the corresponding architecture is derived. The algorithm is reduced to pixel-level operations. Processing elements are assigned image blocks to maintain a reasonable performance-cost ratio. To further simplify both processor and memory organizations, a bit-serial access scheme is used. A brief performance assessment is provided to illustrate and quantify the merit of the approach.
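
    The block-level decomposition the architecture exploits can be sketched in software: each tile is equalized independently, so the per-tile work maps directly onto an array of processing elements. The tile size is an illustrative choice, and this sketch omits the HDL bit-serial details entirely.

```python
import numpy as np

def equalize_block(block):
    """Standard HE on one block; blocks are independent, so an array of
    processing elements can run this map in parallel (the scheme's core idea).
    """
    hist, _ = np.histogram(block, bins=256, range=(0, 256))
    cdf = np.cumsum(hist).astype(np.float64)
    cdf = (cdf - cdf.min()) / max(cdf.max() - cdf.min(), 1)
    return np.round(cdf * 255).astype(np.uint8)[block]

def block_local_equalize(img, block=16):
    """Tile the image and equalize each tile independently (a block-level
    approximation of local HE; block size is an illustrative parameter)."""
    out = np.empty_like(img)
    h, w = img.shape
    for r in range(0, h, block):
        for c in range(0, w, block):
            out[r:r+block, c:c+block] = equalize_block(img[r:r+block, c:c+block])
    return out

rng = np.random.default_rng(2)
img = rng.integers(100, 130, size=(64, 64)).astype(np.uint8)
out = block_local_equalize(img)
print(img.std(), out.std())
```
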

  7. Flood detection/monitoring using adjustable histogram equalization technique.

    PubMed

    Nazir, Fakhera; Riaz, Muhammad Mohsin; Ghafoor, Abdul; Arif, Fahim

    2014-01-01

    A flood monitoring technique using adjustable histogram equalization is proposed. The technique overcomes the limitations (over-enhancement, artifacts, and an unnatural look) of the existing technique by adjusting the contrast of images. The proposed technique takes pre- and post-flood images and applies different processing steps to generate a flood map without user interaction. The resultant flood maps can be used for flood monitoring and detection. Simulation results show that the proposed technique provides better output quality compared to the existing state-of-the-art technique. PMID:24558332

  9. Contrast Enhancement Algorithm Based on Gap Adjustment for Histogram Equalization.

    PubMed

    Chiu, Chung-Cheng; Ting, Chih-Chung

    2016-01-01

    Image enhancement methods have been widely used to improve the visual effects of images. Owing to its simplicity and effectiveness, histogram equalization (HE) is one of the methods used for enhancing image contrast. However, HE may cause over-enhancement and feature loss, which lead to an unnatural look and loss of detail in the processed images. Researchers have proposed various HE-based methods to solve the over-enhancement problem; however, they have largely ignored the feature loss problem. Therefore, a contrast enhancement algorithm based on gap adjustment for histogram equalization (CegaHE) is proposed. It builds on a visual contrast enhancement algorithm based on histogram equalization (VCEA), which generates visually pleasing enhanced images, and improves its enhancement effects. CegaHE adjusts the gaps between two gray values based on an adjustment equation, which takes the properties of human visual perception into consideration, to solve the over-enhancement problem. In addition, it alleviates the feature loss problem and further enhances the textures in the dark regions of the images to improve the quality of the processed images for human visual perception. Experimental results demonstrate that CegaHE is a reliable method for contrast enhancement and that it significantly outperforms VCEA and other methods. PMID:27338412
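
    The gap-adjustment idea can be sketched by clamping the jumps between consecutive output levels of the HE mapping; clamping every gap to a fixed `max_gap` is a simplified stand-in for CegaHE's perception-based adjustment equation, not the paper's actual formula.

```python
import numpy as np

def gap_limited_equalize(img, max_gap=8):
    """HE followed by clamping the gap between consecutive output levels.

    Large jumps in the HE mapping cause over-enhancement; limiting each
    per-level gap to max_gap keeps the mapping monotonic while taming the
    jumps (a crude stand-in for CegaHE's adjustment equation).
    """
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    cdf = np.cumsum(hist).astype(np.float64)
    cdf = (cdf - cdf.min()) / max(cdf.max() - cdf.min(), 1)
    lut = np.round(cdf * 255)
    gaps = np.minimum(np.diff(lut, prepend=lut[0]), max_gap)  # clamp jumps
    lut = np.clip(np.cumsum(gaps) + lut[0], 0, 255).astype(np.uint8)
    return lut[img]

img = np.array([[10, 10, 10, 200],
                [10, 10, 10, 200]], dtype=np.uint8)
out = gap_limited_equalize(img)
print(out)
```

    Plain HE would push the two populated grey levels almost to the extremes of the range; the clamped mapping keeps their order but limits the output gap.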

  11. The L_infinity constrained global optimal histogram equalization technique for real time imaging

    NASA Astrophysics Data System (ADS)

    Ren, Qiongwei; Niu, Yi; Liu, Lin; Jiao, Yang; Shi, Guangming

    2015-08-01

    Although current imaging sensors can achieve 12-bit or higher precision, current display devices and commonly used digital image formats are still only 8-bit. This mismatch causes significant waste of sensor precision and loss of information when storing and displaying images. For better use of the precision budget, tone mapping operators have to be used to map the high-precision data into low-precision digital images adaptively. In this paper, the classic histogram equalization tone mapping operator is re-examined in the sense of optimization. We point out that the traditional histogram equalization technique and its variants are fundamentally improper in that they suffer from local optimum problems. To overcome this drawback, we remodel the histogram equalization tone mapping task based on graph theory, which achieves globally optimal solutions. Another advantage of the graph-based modeling is that tone continuity is also modeled as a vital constraint in our approach, which suppresses the annoying boundary artifacts of the traditional approaches. In addition, we propose a novel dynamic programming technique to solve the histogram equalization problem in real time. Experimental results show that the proposed tone-preserving global optimal histogram equalization technique outperforms the traditional approaches by exhibiting more subtle details in the foreground while preserving the smoothness of the background.

  12. Infrared image enhancement based on atmospheric scattering model and histogram equalization

    NASA Astrophysics Data System (ADS)

    Li, Yi; Zhang, Yunfeng; Geng, Aihui; Cao, Lihua; Chen, Juan

    2016-09-01

    Infrared images are fuzzy due to the special imaging technology of infrared sensors. To achieve contrast enhancement and recover clear edge details from a fuzzy infrared image, we propose an efficient enhancement method based on an atmospheric scattering model and histogram equalization. The algorithm optimizes and improves a visible-image haze-removal method to suit the characteristics of fuzzy infrared images. First, an average filtering operation is used to obtain a coarse estimate of the transmission rate. Then we obtain the fuzziness-free image through a self-adaptive transmission rate calculated from the statistics of the original infrared image. Finally, to deal with the low-lighting problem of the fuzziness-free image, we propose a sectional plateau histogram equalization method that is capable of background suppression. Experimental results show that the performance and efficiency of the proposed algorithm are satisfactory compared to four other algorithms, in both subjective observation and objective quantitative evaluation. In addition, the proposed algorithm can enhance infrared images for different applications under different circumstances.
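
    Plateau histogram equalization, the background-suppression ingredient of the final step, can be sketched as below: the histogram is clipped at a plateau value before equalizing, so a huge near-uniform background cannot monopolize the output range. The fixed plateau constant is an illustrative assumption; the paper selects it sectionally.

```python
import numpy as np

def plateau_equalize(img, plateau=50):
    """Plateau HE: clip the histogram at a plateau value before equalizing.

    Without clipping, a dominant background bin absorbs almost the whole
    output range and small bright targets stay compressed.
    """
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    hist = np.minimum(hist, plateau)          # suppress dominant background bins
    cdf = np.cumsum(hist).astype(np.float64)
    cdf = (cdf - cdf.min()) / max(cdf.max() - cdf.min(), 1)
    return np.round(cdf * 255).astype(np.uint8)[img]

# Mostly-uniform background (value 20) with a small bright target (value 180).
img = np.full((64, 64), 20, dtype=np.uint8)
img[30:34, 30:34] = 180
out = plateau_equalize(img)
print(int(out[0, 0]), int(out[31, 31]))
```

    With plain HE the background bin (4080 of 4096 pixels) would map itself and the target to nearly the same level; clipping restores a visible separation.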

  13. Compensating Acoustic Mismatch Using Class-Based Histogram Equalization for Robust Speech Recognition

    NASA Astrophysics Data System (ADS)

    Suh, Youngjoo; Kim, Sungtak; Kim, Hoirin

    2007-12-01

    A new class-based histogram equalization method is proposed for robust speech recognition. The proposed method aims at not only compensating for an acoustic mismatch between training and test environments but also reducing the two fundamental limitations of the conventional histogram equalization method: the discrepancy between the phonetic distributions of training and test speech data, and the nonmonotonic transformation caused by the acoustic mismatch. The algorithm employs multiple class-specific reference and test cumulative distribution functions, classifies noisy test features into their corresponding classes, and equalizes the features by using their corresponding class reference and test distributions. The minimum mean-square error log-spectral amplitude (MMSE-LSA)-based speech enhancement is added just prior to the baseline feature extraction to reduce the corruption by additive noise. The experiments on the Aurora2 database proved the effectiveness of the proposed method by reducing relative errors by [InlineEquation not available: see fulltext.] over the mel-cepstral-based features and by [InlineEquation not available: see fulltext.] over the conventional histogram equalization method, respectively.
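
    The core operation of histogram equalization for feature compensation, mapping each test feature through its empirical CDF and then inverting the reference CDF, can be sketched as below. The class-specific partitioning that distinguishes the paper's method is omitted; this is the conventional single-class baseline it improves on.

```python
import numpy as np

def histogram_equalize_features(test_feat, ref_feat):
    """Map test features so their empirical CDF matches a reference CDF.

    Each test value is replaced by the reference quantile at its own
    empirical rank, removing a global shift/scale mismatch between the
    test and reference feature distributions.
    """
    ranks = (np.searchsorted(np.sort(test_feat), test_feat, side="right")
             / len(test_feat))                       # empirical CDF of test data
    return np.quantile(ref_feat, np.clip(ranks, 0.0, 1.0))

rng = np.random.default_rng(5)
ref = rng.normal(0.0, 1.0, 2000)          # "clean" training features
test = rng.normal(3.0, 2.0, 2000)         # noisy, shifted test features
eq = histogram_equalize_features(test, ref)
print(round(float(test.mean()), 2), round(float(eq.mean()), 2))
```
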

  14. Particle swarm optimized multi-objective histogram equalization for image enhancement

    NASA Astrophysics Data System (ADS)

    Shanmugavadivu, P.; Balasubramanian, K.

    2014-04-01

    Histogram equalization (HE) is a simple and effective technique for enhancing the contrast of an input image. However, it fails to preserve brightness while enhancing contrast, owing to the abrupt mean shift during equalization. Many HE-based methods have been developed to overcome the mean-shift problem, but they suffer from over-enhancement. In this paper, a multi-objective HE model is proposed to enhance contrast as well as preserve brightness. The central idea of this technique is to first segment the histogram of the input image into two using Otsu's threshold. A set of optimized weighting constraints is formulated and applied to both sub-images. Then, the sub-images are equalized independently and their union produces the contrast-enhanced, brightness-preserved output image. Here, Particle Swarm Optimization (PSO) is employed to find the optimal constraints. This technique is shown to have an edge over other contemporary methods in terms of entropy and contrast improvement index.
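
    The histogram-splitting backbone of the method, Otsu's threshold followed by independent equalization of the two sub-images into their own halves of the range, can be sketched as below; the PSO-optimized weighting constraints are omitted, so this is only the unweighted bi-histogram baseline.

```python
import numpy as np

def otsu_threshold(img):
    """Otsu's threshold: maximize between-class variance over all splits."""
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    p = hist / hist.sum()
    levels = np.arange(256)
    w0 = np.cumsum(p)                     # class-0 weight at each threshold
    m = np.cumsum(p * levels)             # cumulative mean
    mg = m[-1]                            # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mg * w0 - m) ** 2 / (w0 * (1 - w0))
    return int(np.argmax(np.nan_to_num(sigma_b)))

def bi_histogram_equalize(img):
    """Split at Otsu's threshold; equalize each sub-image into its own
    half of the output range so dark pixels stay dark and bright stay bright."""
    t = otsu_threshold(img)
    out = np.empty_like(img)
    for lo, hi, mask in ((0, t, img <= t), (t + 1, 255, img > t)):
        if not mask.any():
            continue
        hist, _ = np.histogram(img[mask], bins=256, range=(0, 256))
        cdf = np.cumsum(hist).astype(np.float64)
        cdf = (cdf - cdf.min()) / max(cdf.max() - cdf.min(), 1)
        out[mask] = np.round(lo + cdf * (hi - lo)).astype(np.uint8)[img[mask]]
    return out

rng = np.random.default_rng(3)
img = np.concatenate([rng.integers(30, 60, 512), rng.integers(160, 200, 512)])
img = img.reshape(32, 32).astype(np.uint8)
out = bi_histogram_equalize(img)
t = otsu_threshold(img)
print(t, out[img <= t].max(), out[img > t].min())
```

    Keeping the two sub-ranges separate is what limits the mean shift of plain HE; the paper's weighting constraints then fine-tune how much of the range each sub-image receives.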

  15. Efficient visibility-driven medical image visualisation via adaptive binned visibility histogram.

    PubMed

    Jung, Younhyun; Kim, Jinman; Kumar, Ashnil; Feng, David Dagan; Fulham, Michael

    2016-07-01

    'Visibility' is a fundamental optical property that represents the proportion of the voxels in a volume observable by users during interactive volume rendering. The manipulation of this 'visibility' improves the volume rendering process, for instance by ensuring the visibility of regions of interest (ROIs) or by guiding the identification of an optimal rendering viewpoint. The construction of visibility histograms (VHs), which represent the distribution of the visibility of all voxels in the rendered volume, enables users to explore the volume with real-time feedback about occlusion patterns among spatially related structures during volume rendering manipulations. Volume-rendered medical images have been a primary beneficiary of the VH, given the need to ensure that specific ROIs are visible relative to the surrounding structures, e.g. the visualisation of tumours that may otherwise be occluded by neighbouring structures. VH construction and its subsequent manipulations, however, are computationally expensive due to the histogram binning of the visibilities. This limits the real-time application of the VH to medical images that have large intensity ranges and volume dimensions and require a large number of histogram bins. In this study, we introduce an efficient adaptive binned visibility histogram (AB-VH) in which a smaller number of histogram bins are used to represent the visibility distribution of the full VH. We adaptively bin medical images by using a cluster analysis algorithm that groups the voxels according to their intensity similarities into a smaller subset of bins while preserving the distribution of the intensity range of the original images. We increase efficiency by exploiting the parallel computation and multiple render targets (MRT) extension of modern graphical processing units (GPUs), and this enables efficient computation of the histogram.
    We show the application of our method to single-modality computed tomography (CT), magnetic resonance

  16. A fracture enhancement method based on the histogram equalization of eigenstructure-based coherence

    NASA Astrophysics Data System (ADS)

    Dou, Xi-Ying; Han, Li-Guo; Wang, En-Li; Dong, Xue-Hua; Yang, Qing; Yan, Gao-Han

    2014-06-01

    Eigenstructure-based coherence attributes are efficient and mature techniques for large-scale fracture detection. However, in horizontally bedded and continuous strata, buried fractures in high-grayscale-value zones are difficult to detect. Furthermore, middle- and small-scale fractures in fractured zones, where migration image energies are usually not perfectly concentrated, are also hard to detect because of the fuzzy, clouded shadows caused by low grayscale values. A new fracture enhancement method combined with histogram equalization is proposed to solve these problems. With this method, the contrast between discontinuities and background in coherence images is increased, linear structures are highlighted by stepwise adjustment of the threshold of the coherence image, and fractures are detected at different scales. Application of the method shows that it can also improve fracture recognition and detection accuracy.

  17. Adaptive sigmoid function bihistogram equalization for image contrast enhancement

    NASA Astrophysics Data System (ADS)

    Arriaga-Garcia, Edgar F.; Sanchez-Yanez, Raul E.; Ruiz-Pinales, Jose; Garcia-Hernandez, Ma. de Guadalupe

    2015-09-01

    Contrast enhancement plays a key role in a wide range of applications including consumer electronic applications, such as video surveillance, digital cameras, and televisions. The main goal of contrast enhancement is to increase the quality of images. However, most state-of-the-art methods induce different types of distortion such as intensity shift, wash-out, noise, intensity burn-out, and intensity saturation. In addition, in consumer electronics, simple and fast methods are required in order to be implemented in real time. A bihistogram equalization method based on adaptive sigmoid functions is proposed. It consists of splitting the image histogram into two parts that are equalized independently by using adaptive sigmoid functions. In order to preserve the mean brightness of the input image, the parameter of the sigmoid functions is chosen to minimize the absolute mean brightness metric. Experiments on the Berkeley database have shown that the proposed method improves the quality of images and preserves their mean brightness. An application to improve the colorfulness of images is also presented.

  18. Differentially Private Histogram Publication For Dynamic Datasets: An Adaptive Sampling Approach

    PubMed Central

    Li, Haoran; Jiang, Xiaoqian; Xiong, Li; Liu, Jinfei

    2016-01-01

    Differential privacy has recently become a de facto standard for private statistical data release. Many algorithms have been proposed to generate differentially private histograms or synthetic data. However, most of them focus on "one-time" release of a static dataset and do not adequately address the increasing need to release series of dynamic datasets in real time. A straightforward application of existing histogram methods to each snapshot of such dynamic datasets will incur high accumulated error due to the composability of differential privacy and the correlations or overlapping users between snapshots. In this paper, we address the problem of releasing series of dynamic datasets in real time with differential privacy, using a novel adaptive distance-based sampling approach. Our first method, DSFT, uses a fixed distance threshold and releases a differentially private histogram only when the current snapshot is sufficiently different from the previous one, i.e., with a distance greater than a predefined threshold. Our second method, DSAT, further improves DSFT and uses a dynamic threshold adaptively adjusted by a feedback control mechanism to capture the data dynamics. Extensive experiments on real and synthetic datasets demonstrate that our approach achieves better utility than baseline methods and existing state-of-the-art methods. PMID:26973795
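
    The fixed-threshold variant (DSFT) can be sketched as below: a Laplace-noised histogram is published only when the current snapshot's histogram has drifted far enough from the last release, otherwise the previous release is reused and no budget is spent. The L1 distance, epsilon, threshold, and bin count are all illustrative assumptions; the paper's exact distance measure and budget accounting are not reproduced.

```python
import numpy as np

def dsft_release(snapshots, epsilon=1.0, threshold=5.0, bins=4, seed=0):
    """Distance-based Sampling with a Fixed Threshold (DSFT), sketched.

    A noisy histogram is published only when the current snapshot's
    histogram moves more than `threshold` (L1 distance) from the last
    released snapshot; otherwise the previous release is re-used.
    """
    rng = np.random.default_rng(seed)
    releases, last = [], None
    for snap in snapshots:
        hist, _ = np.histogram(snap, bins=bins, range=(0, bins))
        if last is None or np.abs(hist - last[0]).sum() > threshold:
            noisy = hist + rng.laplace(scale=1.0 / epsilon, size=bins)
            last = (hist, noisy)            # remember true hist and its release
        releases.append(last[1])
    return releases

# Three snapshots: the second barely changes, the third changes a lot.
s1 = [0, 0, 1, 1, 2, 3]
s2 = [0, 0, 1, 1, 2, 3]          # identical -> previous release re-used
s3 = [3, 3, 3, 3, 3, 3]          # large shift -> new noisy release
r = dsft_release([s1, s2, s3])
print([np.round(x, 2) for x in r])
```
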

  19. Wavelength-adaptive dehazing using histogram merging-based classification for UAV images.

    PubMed

    Yoon, Inhye; Jeong, Seokhwa; Jeong, Jaeheon; Seo, Doochun; Paik, Joonki

    2015-01-01

    Since incoming light to an unmanned aerial vehicle (UAV) platform can be scattered by haze and dust in the atmosphere, the acquired image loses the original color and brightness of the subject. Enhancement of hazy images is an important task for improving the visibility of various UAV images. This paper presents a spatially adaptive dehazing algorithm that merges color histograms with consideration of the wavelength-dependent atmospheric turbidity. Based on the wavelength-adaptive hazy image acquisition model, the proposed dehazing algorithm consists of three steps: (i) image segmentation based on geometric classes; (ii) generation of the context-adaptive transmission map; and (iii) intensity transformation for enhancing a hazy UAV image. The major contribution of the research is a novel hazy UAV image degradation model that considers the wavelength of light sources. In addition, the proposed transmission map provides a theoretical basis for differentiating visually important regions from others based on the turbidity and merged classification results. PMID:25808767

  1. Medical image classification using spatial adjacent histogram based on adaptive local binary patterns.

    PubMed

    Liu, Dong; Wang, Shengsheng; Huang, Dezhi; Deng, Gang; Zeng, Fantao; Chen, Huiling

    2016-05-01

    Medical image recognition is an important task in both computer vision and computational biology. In the field of medical image classification, representing an image with a local binary patterns (LBP) descriptor has become popular. However, most existing LBP-based methods encode the binary patterns in a fixed neighborhood radius and ignore the spatial relationships among local patterns. Ignoring these spatial relationships leads to poor performance in capturing discriminative features for complex samples, such as medical images obtained by microscope. To address this problem, we propose a novel method that improves local binary patterns by assigning an adaptive neighborhood radius to each pixel. Based on these adaptive local binary patterns, we further propose a spatial adjacent histogram strategy to encode the micro-structures for image representation. An extensive set of evaluations is performed on four medical datasets, showing that the proposed method significantly improves standard LBP and compares favorably with several other prevailing approaches. PMID:27058283
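
    The fixed-radius baseline that the paper extends can be sketched as below: each interior pixel receives an 8-bit code, one bit per neighbour, set when that neighbour is at least as bright as the centre. The paper's adaptive per-pixel radius selection and spatial adjacent histogram are omitted here.

```python
import numpy as np

def lbp_8neighbour(img):
    """Classic 8-neighbour LBP at radius 1.

    Each interior pixel gets an 8-bit code: bit k is set when the k-th
    neighbour (clockwise from top-left) is >= the centre value.
    """
    c = img[1:-1, 1:-1].astype(np.int32)
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]   # clockwise from top-left
    code = np.zeros_like(c)
    for bit, (dy, dx) in enumerate(offsets):
        n = img[1 + dy:img.shape[0] - 1 + dy, 1 + dx:img.shape[1] - 1 + dx]
        code |= (n.astype(np.int32) >= c).astype(np.int32) << bit
    return code

img = np.array([[9, 9, 9],
                [9, 5, 9],
                [9, 9, 9]], dtype=np.uint8)
print(lbp_8neighbour(img))   # every neighbour brighter than the centre
```

    A histogram of these codes over an image patch is the standard LBP descriptor; the paper replaces the fixed radius 1 with a per-pixel adaptive radius before histogramming.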

  2. Adapting histogram for automatic noise data removal in building interior point cloud data

    NASA Astrophysics Data System (ADS)

    Shukor, S. A. Abdul; Rushforth, E. J.

    2015-05-01

    3D point cloud data is now preferred by researchers for generating 3D models, which can be used in a variety of applications, including 3D building interior models. The rise of Building Information Modeling (BIM) for Architecture, Engineering, and Construction (AEC) applications has recently brought more attention to 3D interior modelling. To generate a 3D model representing a building interior, a laser scanner is used to collect the point cloud data. However, this data often comes with noise, due to several factors including surrounding objects, lighting, and the specifications of the laser scanner. This paper highlights the use of histograms to remove the noise data. Histograms, familiar from statistics and probability, are regularly used in a number of applications such as image processing, where a histogram can represent the total number of pixels in an image at each intensity level. Here, histograms represent the number of points recorded at range-distance intervals in various projections. Because unwanted noise data has a sparser cloud density than the required data and is usually situated at a notable distance from it, noise data will have lower frequencies in the histogram. By defining the acceptable range using the average frequency, points below this range can be removed. This research has shown that these histograms can automatically remove unwanted data from 3D point cloud data representing building interiors. This feature will aid data preprocessing in producing an ideal 3D model from the point cloud data.
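
    The average-frequency acceptance rule described above can be sketched as follows: bin points by range distance, then discard points falling in bins whose frequency is below the mean bin frequency. Using Euclidean distance from the origin and a fixed bin count are illustrative simplifications of the paper's per-projection histograms.

```python
import numpy as np

def remove_sparse_points(points, bins=10):
    """Drop points in distance bins whose frequency is below the average
    bin frequency (the acceptance criterion described in the paper).
    """
    d = np.linalg.norm(points, axis=1)             # range from the scanner
    hist, edges = np.histogram(d, bins=bins)
    keep_bins = hist >= hist.mean()                # dense bins = real structure
    idx = np.clip(np.digitize(d, edges) - 1, 0, bins - 1)
    return points[keep_bins[idx]]

rng = np.random.default_rng(4)
room = rng.normal(loc=5.0, scale=0.2, size=(500, 3))     # dense interior cluster
noise = rng.normal(loc=20.0, scale=0.3, size=(5, 3))     # sparse far outliers
cloud = np.vstack([room, noise])
clean = remove_sparse_points(cloud)
print(len(cloud), len(clean))
```
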

  3. Fast Adaptive Blind MMSE Equalizer for Multichannel FIR Systems

    NASA Astrophysics Data System (ADS)

    Kacha, Ibrahim; Abed-Meraim, Karim; Belouchrani, Adel

    2006-12-01

    We propose a new blind minimum mean square error (MMSE) equalization algorithm for noisy multichannel finite impulse response (FIR) systems that relies only on second-order statistics. The proposed algorithm offers two important advantages: a low computational complexity and a relative robustness against channel order overestimation errors. Exploiting the fact that the columns of the equalizer matrix filter belong both to the signal subspace and to the kernel of the truncated data covariance matrix, the proposed algorithm blindly achieves a direct estimation of the zero-delay MMSE equalizer parameters. We develop a two-step procedure to further improve the performance gain and control the equalization delay. An efficient fast adaptive implementation of our equalizer, based on the projection approximation and the shift invariance property of the temporal data covariance matrix, is proposed to reduce the computational complexity from [InlineEquation not available: see fulltext.] to [InlineEquation not available: see fulltext.], where [InlineEquation not available: see fulltext.] is the number of emitted signals, [InlineEquation not available: see fulltext.] the data vector length, and [InlineEquation not available: see fulltext.] the dimension of the signal subspace. We then derive a statistical performance analysis to compare the equalization performance with that of the optimal MMSE equalizer. Finally, simulation results are provided to illustrate the effectiveness of the proposed blind equalization algorithm.

  4. EZ-ROSE: a computer program for equal-area circular histograms and statistical analysis of two-dimensional vectorial data

    NASA Astrophysics Data System (ADS)

    Baas, Jaco H.

    2000-03-01

    EZ-ROSE 1.0 is a computer program for the statistical analysis of populations of two-dimensional vectorial data and their presentation in equal-area rose diagrams. The program is implemented as a Microsoft® Excel workbook containing worksheets for the input of directional (circular) or lineational (semi-circular) data and their automatic processing, which includes the calculation of a frequency distribution for a selected class width, statistical analysis, and the construction of a rose diagram in CorelDraw™. The statistical analysis involves tests of uniformity for the vectorial population distribution, such as the nonparametric Kuiper and Watson tests and the parametric Rayleigh test. The statistics calculated include the vector mean, its magnitude (length) and strength (data concentration); the Batschelet circular standard deviation as an alternative measure of vectorial concentration; and a confidence sector for the vector mean. The statistics together with the frequency data are used to prepare a Corel Script™ file that contains all the necessary instructions to draw automatically an equal-area circular frequency histogram (rose diagram) in CorelDraw™. The advantages of EZ-ROSE, compared to other software for circular statistics, are: (1) the ability to use an equal-area scale in rose diagrams; (2) the wide range of tools for a comprehensive statistical analysis; (3) the ease of use, as Microsoft® Excel and CorelDraw™ are widely known to users of Microsoft® Windows; and (4) the high degree of flexibility due to the application of Microsoft® Excel and CorelDraw™, which offer a whole range of tools for possible addition of other statistical methods and changes of the rose-diagram layout.
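    EZ-ROSE itself is an Excel/CorelDraw workbook, but the core statistics it reports can be illustrated in a short numpy sketch: the vector mean direction, the mean resultant length (a measure of concentration), and the Rayleigh uniformity test. The function name is illustrative, and the Rayleigh p-value uses the simple large-sample approximation p ≈ exp(-nR²).

```python
import numpy as np

def circular_stats(angles_deg):
    """Vector mean, mean resultant length, and Rayleigh uniformity test
    for directional data (a sketch of statistics EZ-ROSE computes)."""
    theta = np.radians(angles_deg)
    C, S = np.cos(theta).sum(), np.sin(theta).sum()
    n = len(theta)
    R = np.hypot(C, S) / n                   # mean resultant length, 0..1
    mean_dir = np.degrees(np.arctan2(S, C)) % 360.0
    z = n * R**2                             # Rayleigh test statistic
    p = np.exp(-z)                           # large-sample approximation
    return mean_dir, R, p
```

    A tightly clustered sample yields a mean direction near the cluster center, R close to 1, and a small p-value, rejecting uniformity.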

  5. Low complexity adaptive equalizers for underwater acoustic communications

    NASA Astrophysics Data System (ADS)

    Soflaei, Masoumeh; Azmi, Paeiz

    2014-08-01

    Interference caused by scattering from the surface and reflection from the bottom is one of the most important obstacles to reliable communication in shallow water channels. One of the best ways to address this problem is to use adaptive equalizers, whose performance depends heavily on the convergence rate and misadjustment error of the underlying adaptive algorithms. In this paper, the affine projection algorithm (APA), selective regressor APA (SR-APA), the family of selective partial update (SPU) algorithms, the family of set-membership (SM) algorithms and the selective partial update selective regressor APA (SPU-SR-APA) are compared with conventional algorithms such as least mean square (LMS) in underwater acoustic communications. We apply experimental data from the Strait of Hormuz to demonstrate the efficiency of the proposed methods over a shallow water channel. We observe that the steady-state mean square error (MSE) of the SR-APA, SPU-APA, SPU-normalized least mean square (SPU-NLMS), SPU-SR-APA, SM-APA and SM-NLMS algorithms decreases in comparison with the LMS algorithm. These algorithms also converge faster than LMS-type algorithms.
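    The LMS baseline that the abstract compares against is the classic stochastic-gradient tap update. A minimal numpy sketch, with illustrative function and parameter names, trains an FIR equalizer on a known training sequence:

```python
import numpy as np

def lms_equalize(x, d, n_taps=8, mu=0.01):
    """Train an FIR equalizer on received samples x against a known
    training sequence d using the LMS update w <- w + mu * e * x_vec."""
    w = np.zeros(n_taps)
    mse = []
    for k in range(n_taps - 1, len(x)):
        x_vec = x[k - n_taps + 1:k + 1][::-1]   # x[k], x[k-1], ...
        y = w @ x_vec                           # equalizer output
        e = d[k] - y                            # instantaneous error
        w += mu * e * x_vec                     # LMS coefficient update
        mse.append(e ** 2)
    return w, np.array(mse)
```

    APA and NLMS variants replace this scalar update with projections onto recent regressors or a normalized step, trading computation for faster convergence.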

  6. A successive overrelaxation iterative technique for an adaptive equalizer

    NASA Technical Reports Server (NTRS)

    Kosovych, O. S.

    1973-01-01

    An adaptive strategy for the equalization of pulse-amplitude-modulated signals in the presence of intersymbol interference and additive noise is reported. The successive overrelaxation iterative technique is used as the algorithm for iteratively adjusting the equalizer coefficients during a training period to minimize the mean square error. With 2-cyclic and nonnegative Jacobi matrices, substantial improvement is demonstrated in the rate of convergence over the commonly used gradient techniques. The Jacobi theorems are also extended to nonpositive Jacobi matrices. Numerical examples strongly indicate that the improvements obtained for the special cases carry over to general channel characteristics. The technique is analytically shown to decrease the mean square error at each iteration over a large range of parameter values for light or moderate intersymbol interference, and over small intervals for general channels. Convergence of the relaxation algorithm is proven analytically in a noisy environment, and the coefficient variance is shown to be bounded.
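    Successive overrelaxation solves a linear system such as the normal equations of the MMSE tap-weight problem by sweeping through the unknowns and over-correcting each one. A textbook sketch (the matrix here is a hypothetical example, not a channel from the paper):

```python
import numpy as np

def sor_solve(A, b, omega=1.2, n_iter=100):
    """Successive overrelaxation for A x = b (e.g. the normal equations
    of an MMSE tap-weight problem). omega in (0, 2); omega = 1 reduces
    to Gauss-Seidel."""
    x = np.zeros_like(b, dtype=float)
    for _ in range(n_iter):
        for i in range(len(b)):
            sigma = A[i] @ x - A[i, i] * x[i]   # off-diagonal contribution
            x[i] = (1 - omega) * x[i] + omega * (b[i] - sigma) / A[i, i]
    return x
```

    For symmetric positive definite systems, a well-chosen omega accelerates convergence over plain Gauss-Seidel, which is the source of the rate improvement the abstract reports over gradient descent.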

  7. A multiresolution approach to image enhancement via histogram shaping and adaptive Wiener filtering

    NASA Astrophysics Data System (ADS)

    Pace, T.; Manville, D.; Lee, H.; Cloud, G.; Puritz, J.

    2008-04-01

    It is critical in military applications to be able to extract features in imagery that may be of interest to the viewer at any time of the day or night. Infrared (IR) imagery is ideally suited for producing these types of images. However, even under the best of circumstances, the traditional approach of applying a global automatic gain control (AGC) to the digital image may not provide the user with local area details that may be of interest. Processing the imagery locally can enhance additional features and characteristics in the image which provide the viewer with an improved understanding of the scene being observed. This paper describes a multi-resolution pyramid approach for decomposing an image, enhancing its contrast by remapping the histograms to desired pdfs, filtering them and recombining them to create an output image with much more visible detail than the input image. The technique improves the local area image contrast in light and dark areas providing the warfighter with significantly improved situational awareness.

  8. Brief Communication: Contrast-stretching- and histogram-smoothness-based synthetic aperture radar image enhancement for flood map generation

    NASA Astrophysics Data System (ADS)

    Nazir, F.; Riaz, M. M.; Ghafoor, A.; Arif, F.

    2015-02-01

    Synthetic-aperture-radar-image-based flood map generation is usually challenging due to degraded contrast. A three-step approach, based on adaptive histogram clipping, histogram remapping and smoothing, is proposed to generate a more visualized flood map image. The pre- and post-flood images are adaptively histogram equalized. The hidden details in the difference image are enhanced using contrast-based enhancement and histogram smoothing. A fast-ready flood map is then generated from the equalized pre-, post- and difference images. Results, evaluated using different data sets, show the significance of the proposed technique.
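    The histogram-clipping step can be illustrated with the standard clip-and-redistribute operation used in contrast-limited equalization. This is a simplified sketch (real CLAHE-style implementations iterate, since redistribution can push bins back over the limit); the function name is illustrative.

```python
import numpy as np

def clip_histogram(hist, clip_limit):
    """Histogram clipping: counts above clip_limit are cut off and the
    excess is redistributed uniformly across all bins, limiting the
    contrast amplification of a subsequent equalization step."""
    hist = hist.astype(float)
    excess = np.maximum(hist - clip_limit, 0).sum()
    clipped = np.minimum(hist, clip_limit)
    return clipped + excess / len(hist)      # spread excess over all bins
```

    The total count is preserved, so the clipped histogram still normalizes to a valid mapping function.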

  9. Adaptive block-wise alphabet reduction scheme for lossless compression of images with sparse and locally sparse histograms

    NASA Astrophysics Data System (ADS)

    Masmoudi, Atef; Zouari, Sonia; Ghribi, Abdelaziz

    2015-11-01

    We propose a new adaptive block-wise lossless image compression algorithm, which is based on the so-called alphabet reduction scheme combined with an adaptive arithmetic coding (AC). This new encoding algorithm is particularly efficient for lossless compression of images with sparse and locally sparse histograms. AC is a very efficient technique for lossless data compression and produces a rate close to the entropy; however, a compression performance loss occurs when encoding images or blocks whose number of active symbols is small compared with the nominal alphabet, which amplifies the zero-frequency problem. Most methods add one to the frequency count of each symbol of the nominal alphabet, which distorts the statistical model and therefore reduces the efficiency of the AC. The aim of this work is to overcome this drawback by assigning to each image block the smallest possible set containing all its occurring symbols, called the active symbols, as an alternative to using the nominal alphabet in conventional arithmetic encoders. We show experimentally that the proposed method outperforms several lossless image compression encoders and standards, including conventional arithmetic encoders, JPEG2000, and JPEG-LS.
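    The active-symbol idea reduces to mapping each block onto the set of values that actually occur in it, so the coder's model contains no never-seen symbols. A minimal sketch with illustrative function names (the paper's block partitioning and coder are not reproduced here):

```python
import numpy as np

def reduce_alphabet(block):
    """Map a block's pixels onto its set of active symbols, so an
    arithmetic coder only models symbols that actually occur, avoiding
    the zero-frequency amplification described above."""
    active, indices = np.unique(block, return_inverse=True)
    return active, indices.reshape(block.shape)

def restore_block(active, indices):
    """Inverse mapping: recover the original pixel values."""
    return active[indices]
```

    The decoder needs only the (typically tiny) active-symbol list per block to invert the mapping losslessly.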

  10. Comparison of hybrid adaptive blind equalizers for QAM signals

    NASA Astrophysics Data System (ADS)

    Labed, A.; Belouchrani, A.; Aissa-El-Bey, A.; Chonavel, T.

    2009-06-01

    This paper compares different hybrid blind equalization algorithms for QAM signals. In hybrid equalizers, a penalty term with zeros at the constellation point coordinates, called the constellation matching error (CME), is added to the criterion of one of the standard algorithms, such as the constant modulus algorithm (CMA), the multimodulus algorithm (MMA) or the recently proposed extended constant modulus algorithm (ECMA). Among the CMEs considered is one we recently introduced: the product of the l1-norms of the deviations of the equalizer output from the constellation points. The hybrid algorithms obtained by combining different CMEs with the CMA or the ECMA are compared through simulations on 16-QAM, 64-QAM and 256-QAM signals transmitted over different channels.
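    The CMA building block that these hybrids extend drives the squared modulus of the equalizer output toward a dispersion constant R2, with no training sequence. A minimal sketch of the standard CMA (not the paper's hybrid variants); the function name and center-spike initialization are conventional choices, not taken from the paper:

```python
import numpy as np

def cma_equalize(x, n_taps=8, mu=2e-3, R2=1.0):
    """Blind constant modulus algorithm: drive |y|^2 toward the
    dispersion constant R2 via w <- w + mu * y * (R2 - |y|^2) * conj(x)."""
    w = np.zeros(n_taps, dtype=complex)
    w[0] = 1.0                               # center-spike initialization
    y = np.zeros(len(x), dtype=complex)
    for k in range(n_taps - 1, len(x)):
        x_vec = x[k - n_taps + 1:k + 1][::-1]
        y[k] = w @ x_vec
        e = R2 - np.abs(y[k])**2             # constant-modulus error
        w += mu * e * y[k] * np.conj(x_vec)
    return w, y
```

    A hybrid algorithm in the paper's sense adds a CME penalty to this cost so that the equalizer is also attracted to the exact constellation points, which matters for dense constellations like 64-QAM and 256-QAM.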

  11. Light wave transmission through free space using atmospheric laser links with adaptive equalization

    NASA Astrophysics Data System (ADS)

    Hussein, Gamal A.; Mohamed, Abd El-Naser A.; Oraby, Osama A.; Hassan, Emad S.; Eldokany, Ibrahim M.; El-Rabaie, El-Sayed M.; Dessouky, Moawad I.; Alshebeili, Saleh A.; El-Samie, Fathi E. Abd

    2013-07-01

    The utilization of adaptive equalization is suggested in the design of atmospheric laser link transceiver architectures for television and broadcast signal interconnect between the external place of an event and the master control room. At the transmitter side of the proposed transceiver, an array of atmospheric laser sources, digital signal processing, and optical radiators are used to send light waves through free space. At the receiver side, an adaptive finite impulse response least mean square (LMS) equalizer with activity detection guidance (ADG) and tap decoupling (TD) is used to mitigate the effect of channel impairments. The performance of the suggested adaptive equalizer is compared with that of a conventional adaptive equalizer based only on the standard LMS algorithm. The simulation results reveal that the adaptive LMS equalizer with ADG and TD is a promising solution to the inter-symbol interference problem in optical wireless communication systems.

  12. On the similarity of 239Pu α-activity histograms when the angular velocities of the Earth diurnal rotation, orbital movement and rotation of collimators are equalized

    NASA Astrophysics Data System (ADS)

    Shnoll, S. E.; Rubinstein, I. A.; Shapovalov, S. N.; Tolokonnikova, A. A.; Shlektaryov, V. A.; Kolombet, V. A.; Kondrashova, M. N.

    2016-01-01

    It was shown earlier that the persistent "scatter" of results of measurements of any nature is determined by the diurnal and orbital movement of the Earth. The movement is accompanied by "macroscopic fluctuations" (MF)—regular, periodic changes in the shape of histograms, spectra of fluctuation amplitudes of the measured parameters. There are two near-daily periods ("sidereal", 1436 min; and "solar", 1440 min) and three yearly ones ("calendar", 365 average solar days; "tropical", 365 days 5 h and 48 min; and "sidereal", 365 days 6 h and 9 min). This periodicity was explained by the objects whose parameters are measured passing through the same spatial-temporal heterogeneities as the Earth rotates and shifts along its orbit.

  13. Digital timing recovery combined with adaptive equalization for optical coherent receivers

    NASA Astrophysics Data System (ADS)

    Zhou, Xian; Chen, Xue; Zhou, Weiqing; Fan, Yangyang; Zhu, Hai; Li, Zhiyu

    2009-11-01

    We propose a novel equalization and timing recovery scheme for polarization multiplexing (POLMUX) coherent receivers that adds an adaptive butterfly-structured equalizer to an all-digital timing recovery loop. It resolves a mutual dependency: the digital equalizer requires timing-recovered (synchronous) signals, while the Gardner timing-error detection algorithm requires equalized signals because of its small tolerance to dispersion. This joint module completes synchronization, equalization and polarization de-multiplexing simultaneously without any extra computational cost. Finally, we demonstrate the good performance of the new scheme in a 112-Gbit/s POLMUX-NRZ-DQPSK digital optical coherent receiver.

  14. Time-domain digital pre-equalization for band-limited signals based on receiver-side adaptive equalizers.

    PubMed

    Zhang, Junwen; Yu, Jianjun; Chi, Nan; Chien, Hung-Chang

    2014-08-25

    We theoretically and experimentally investigate a time-domain digital pre-equalization (DPEQ) scheme for bandwidth-limited optical coherent communication systems, based on feedback of channel characteristics from receiver-side blind adaptive equalizers such as the least-mean-squares (LMS) algorithm and the constant- or multi-modulus algorithms (CMA, MMA). We study the performance of the proposed DPEQ scheme under various channel conditions and channel-estimation resolutions, such as filtering bandwidth, tap length, and OSNR. Using a high-speed 64-GSa/s DAC together with the proposed DPEQ technique, we successfully synthesized band-limited 40-Gbaud signals in polarization-division multiplexed (PDM) quadrature phase shift keying (QPSK), 8-quadrature amplitude modulation (QAM) and 16-QAM formats, and demonstrate significant improvement in both back-to-back and transmission BER performance. PMID:25321257

  15. A novel high precision adaptive equalizer in digital coherent optical receivers

    NASA Astrophysics Data System (ADS)

    Ma, Xiurong; Xu, Yujun; Wang, Xiao; Ding, Zhaocai

    2015-10-01

    A novel high-precision adaptive equalization method is introduced and applied to dynamic equalization in a quadrature phase shift keying (QPSK) coherent optical communication system. A frequency-domain constant modulus algorithm (CMA) first equalizes the received signal roughly. Non-ideal output signals are then picked out through an error measurement and equalized further in a fixed time-domain CMA equalizer. This high-precision equalization method decreases the equalization error and thereby reduces the bit error ratio (BER) of the coherent communication system. Simulation results show a 6% decrease in computational complexity for the proposed scheme compared with time-domain CMA. Furthermore, compared with time-domain CMA and frequency-domain CMA, OSNR improvements of about 2 dB and 2.2 dB, respectively, can be obtained by the proposed scheme at a BER of 1e-3.

  16. Learning Rate Updating Methods Applied to Adaptive Fuzzy Equalizers for Broadband Power Line Communications

    NASA Astrophysics Data System (ADS)

    Ribeiro, Moisés V.

    2004-12-01

    This paper introduces adaptive fuzzy equalizers with variable step size for broadband power line (PL) communications. Based on delta-bar-delta and local Lipschitz estimation updating rules, in feedforward and decision feedback configurations, we propose singleton and nonsingleton fuzzy equalizers with variable step size to cope with the intersymbol interference (ISI) of PL channels and the harshness of the impulse noise generated by appliances and nonlinear loads connected to low-voltage power grids. The computed results show that the convergence rates of the proposed equalizers are higher than those attained by the traditional adaptive fuzzy equalizers introduced by J. M. Mendel and his students. Additionally, BER curves reveal that the proposed techniques are efficient at mitigating the above-mentioned impairments.

  17. Color Histogram Diffusion for Image Enhancement

    NASA Technical Reports Server (NTRS)

    Kim, Taemin

    2011-01-01

    Various color histogram equalization (CHE) methods have been proposed to extend grayscale histogram equalization (GHE) to color images. In this paper a new method called histogram diffusion, which extends GHE to arbitrary dimensions, is proposed. Ranges in a histogram are specified as overlapping bars of uniform height and variable width proportional to their frequencies; this diagram is called the vistogram. As an alternative approach to GHE, the squared error of the vistogram from the uniform distribution is minimized. Each bar in the vistogram is approximated by a Gaussian function, and the Gaussian particles in the vistogram diffuse as a nonlinear autonomous system of ordinary differential equations. CHE results on color images show that the approach is effective.
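    The GHE baseline being generalized here is the classic cumulative-histogram remapping. A minimal sketch for 8-bit grayscale images (function name illustrative; assumes a non-constant image):

```python
import numpy as np

def histogram_equalize(img):
    """Grayscale histogram equalization (GHE): remap intensities through
    the normalized cumulative histogram so that the output histogram is
    approximately uniform. Assumes an 8-bit image."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[hist > 0].min()            # cdf at first occupied bin
    scale = max(cdf[-1] - cdf_min, 1)        # guard against flat images
    lut = np.clip(np.round(255.0 * (cdf - cdf_min) / scale), 0, 255)
    return lut.astype(np.uint8)[img]
```

    A low-contrast input occupying a narrow intensity band is stretched to cover the full 0..255 range, which is exactly the behavior the diffusion method generalizes to color.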

  18. A NOISE ADAPTIVE FUZZY EQUALIZATION METHOD FOR PROCESSING SOLAR EXTREME ULTRAVIOLET IMAGES

    SciTech Connect

    Druckmueller, M.

    2013-08-15

    A new image enhancement tool ideally suited for the visualization of fine structures in extreme ultraviolet images of the corona is presented in this paper. The Noise Adaptive Fuzzy Equalization method is particularly suited for the exceptionally high dynamic range images from the Atmospheric Imaging Assembly instrument on the Solar Dynamics Observatory. This method produces artifact-free images and gives significantly better results than methods based on convolution or Fourier transform which are often used for that purpose.

  19. Image enhancement by local histogram stretching

    NASA Astrophysics Data System (ADS)

    Alparslan, E.; Fuatince, Mr.

    1981-05-01

    An image enhancement algorithm that makes use of local histogram stretching is introduced. The algorithm yields considerable improvements in the human observation of image details compared with straightforward histogram equalization and a number of other enhancement techniques. It is especially suitable for producing hard copies of images on electrostatic plotters with limited gray levels, as shown in applications to the Girl image and a Landsat image.
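    A simple form of local histogram stretching rescales each tile of the image independently to the full intensity range. This sketch is an illustration of the general idea, not the 1981 algorithm itself; the tile size and function name are assumptions.

```python
import numpy as np

def local_stretch(img, tile=32):
    """Contrast-stretch each tile independently to the full 0..255
    range, a simple form of local histogram stretching."""
    out = img.astype(float).copy()
    h, w = img.shape
    for i in range(0, h, tile):
        for j in range(0, w, tile):
            block = out[i:i + tile, j:j + tile]
            lo, hi = block.min(), block.max()
            if hi > lo:                      # leave flat tiles unchanged
                out[i:i + tile, j:j + tile] = 255 * (block - lo) / (hi - lo)
    return out.astype(np.uint8)
```

    Because each tile uses its own local extrema, detail in a dim region is expanded even when bright regions elsewhere would dominate a global stretch; practical variants blend neighboring tiles to avoid visible block seams.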

  20. Adaptive reconfigurable V-BLAST type equalizer for cognitive MIMO-OFDM radios

    NASA Astrophysics Data System (ADS)

    Ozden, Mehmet Tahir

    2015-12-01

    An adaptive channel shortening equalizer design for multiple input multiple output-orthogonal frequency division multiplexing (MIMO-OFDM) radio receivers is considered in this paper. The proposed receiver has desirable features for cognitive and software defined radio implementations. It consists of two sections: a MIMO decision feedback equalizer (MIMO-DFE) and adaptive multiple Viterbi detection. In the MIMO-DFE section, a complete modified Gram-Schmidt orthogonalization of the multichannel input data is accomplished using sequential processing multichannel Givens lattice stages, so that a Vertical Bell Laboratories Layered Space Time (V-BLAST) type MIMO-DFE is realized at the front-end of the channel shortening equalizer. Matrix operations, a major bottleneck for receiver operations, are accordingly avoided, and only scalar operations are used. The result is a highly modular and regular radio receiver architecture suitable for digital signal processing (DSP) chip and field programmable gate array (FPGA) implementations, which are important for software defined radio realizations. The MIMO-DFE section can also be reconfigured for spectrum sensing and positioning, important tasks for cognitive radio applications. In the adaptive multiple Viterbi detection section, a systolic array implementation for each channel yields a receiver architecture with high computational concurrency. The total computational complexity is given in terms of the equalizer and desired response filter lengths, alphabet size, and number of antennas. The performance of the proposed receiver is presented for the two-channel case by means of mean squared error (MSE) and probability of error evaluations, conducted under time-invariant and time-variant channel conditions, orthogonal and nonorthogonal transmissions, and two different modulation schemes.
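    The orthogonalization at the heart of the MIMO-DFE front end is modified Gram-Schmidt. The receiver realizes it with scalar Givens lattice stages; the textbook matrix form below is only a sketch of the same factorization, not the paper's architecture.

```python
import numpy as np

def modified_gram_schmidt(A):
    """Modified Gram-Schmidt QR factorization of the columns of A:
    each new orthonormal direction is removed from all remaining
    columns immediately, which is numerically more stable than the
    classical variant."""
    A = A.astype(complex).copy()
    m, n = A.shape
    Q = np.zeros((m, n), dtype=complex)
    R = np.zeros((n, n), dtype=complex)
    for k in range(n):
        R[k, k] = np.linalg.norm(A[:, k])
        Q[:, k] = A[:, k] / R[k, k]
        for j in range(k + 1, n):            # deflate remaining columns
            R[k, j] = np.vdot(Q[:, k], A[:, j])
            A[:, j] -= R[k, j] * Q[:, k]
    return Q, R
```

    The inner loop uses only vector dot products and scaled subtractions, which is what makes a scalar-only, lattice-stage hardware mapping possible.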

  1. Histograms and Frequency Density.

    ERIC Educational Resources Information Center

    Micromath, 2003

    2003-01-01

    Introduces exercises on histograms and frequency density. Guides pupils to Discovering Important Statistical Concepts Using Spreadsheets (DISCUSS), created at the University of Coventry. Includes curriculum points, teaching tips, activities, and internet address (http://www.coventry.ac.uk/discuss/). (KHR)

  2. Histograms and Raisin Bread

    ERIC Educational Resources Information Center

    Leyden, Michael B.

    1975-01-01

    Describes various elementary school activities using a loaf of raisin bread to promote inquiry skills. Activities include estimating the number of raisins in the loaf by constructing histograms of the number of raisins in a slice. (MLH)

  3. Structure Size Enhanced Histogram

    NASA Astrophysics Data System (ADS)

    Wesarg, Stefan; Kirschner, Matthias

    Direct volume visualization requires the definition of transfer functions (TFs) for the assignment of opacity and color. Multi-dimensional TFs are based on at least two image properties, and are specified by means of 2D histograms. In this work we propose a new type of a 2D histogram which combines gray value with information about the size of the structures. This structure size enhanced (SSE) histogram is an intuitive approach for representing anatomical features. Clinicians — the users we are focusing on — are much more familiar with selecting features by their size than by their gradient magnitude value. As a proof of concept, we employ the SSE histogram for the definition of two-dimensional TFs for the visualization of 3D MRI and CT image data.

  4. Reducing interferences in wireless communication systems by mobile agents with recurrent neural networks-based adaptive channel equalization

    NASA Astrophysics Data System (ADS)

    Beritelli, Francesco; Capizzi, Giacomo; Lo Sciuto, Grazia; Napoli, Christian; Tramontana, Emiliano; Woźniak, Marcin

    2015-09-01

    Channel equalization in communication systems is traditionally solved with adaptive filtering algorithms. Today, Mobile Agents (MAs) with Recurrent Neural Networks (RNNs) can also be adopted for effective interference reduction in modern wireless communication systems (WCSs). In this paper, MAs with RNNs, referred to as MAs-RNNs, are proposed as a novel computing approach for reducing interference in WCSs by performing adaptive channel equalization. We implement this new paradigm for interference reduction. Simulation results and evaluations demonstrate the effectiveness of the approach, showing that better transmission performance in a wireless communication network can be achieved by using the MAs-RNNs-based adaptive filtering algorithm.

  5. A low power CMOS 3.3 Gbps continuous-time adaptive equalizer for serial link

    NASA Astrophysics Data System (ADS)

    Hao, Ju; Yumei, Zhou; Jianzhong, Zhao

    2011-09-01

    This paper describes the use of a high-speed continuous-time analog adaptive equalizer as the front-end of a receiver for a high-speed serial interface compliant with serial communication specifications such as USB2.0, PCI-E2.0 and RapidIO. The low and high frequency loops are merged to decrease the effect of delay between the two paths; in addition, the infinite input impedance facilitates cascading stages to improve the high-frequency boosting gain. The implemented circuit architecture accommodates the wide frequency range from 1 to 3.3 Gbps over FR4-PCB traces of different lengths, which bring as much as 25 dB of loss. Replica control circuits provide a convenient way to regulate the common-mode voltage for fully differential operation, and AC coupling is adopted to suppress the common-mode input from the preceding stage. A prototype chip was fabricated in 0.18-μm 1P6M mixed-signal CMOS technology. The actual area is 0.6 × 0.57 mm2, and the analog equalizer operates up to 3.3 Gbps over an FR4-PCB trace with 25 dB loss. The overall power dissipation is approximately 23.4 mW.

  6. Testing the equality of students' performance using Alexander-Govern test with adaptive trimmed means

    NASA Astrophysics Data System (ADS)

    Abdullah, Suhaida; Yahaya, Sharipah Soaad Syed; Yusof, Zahayu Md

    2014-06-01

    The equality of independent groups has to be analyzed with caution. Classical approaches such as the t-test for two groups and analysis of variance (ANOVA) for more than two groups are the favored choices of researchers. However, these methods can be undermined by the presence of nonnormality, variance heterogeneity, or both: ANOVA rests on the assumptions of normality and homogeneity of variance, and in real-life data these requirements are often hard to attain. The Alexander-Govern test with adaptive trimmed means (AG_atm) is an alternative to the classical tests when their assumptions are violated. In this paper, the performance of AG_atm is compared to the original AG test and ANOVA using simulated and real-life data. The simulation study shows that AG_atm performs better than the original AG test and the classical tests. For the real-life data, students' performance in a decision analysis course, measured by final examination score, was chosen; exploratory data analysis found this data to be nonnormal.
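    The trimmed mean underlying the AG_atm test discards a fraction of each tail before averaging, which blunts the influence of nonnormality. The sketch below uses a fixed symmetric trimming proportion; the paper's adaptive variant chooses the proportion from the data, which is not reproduced here.

```python
import numpy as np

def trimmed_mean(x, trim=0.2):
    """Symmetric trimmed mean: discard the lowest and highest `trim`
    fraction of observations before averaging, reducing the influence
    of outliers and heavy tails."""
    x = np.sort(np.asarray(x, dtype=float))
    g = int(trim * len(x))                   # observations cut per tail
    return x[g:len(x) - g].mean()
```

    With one extreme score in a class of ten, the 20% trimmed mean stays near the bulk of the data while the ordinary mean is dragged toward the outlier.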

  7. An IIR adaptive electronic equalizer for polarization multiplexed fiber optic communication systems

    NASA Astrophysics Data System (ADS)

    Zeng, Xiang-Ye; Liu, Jian-Fei; Zhao, Qi-Da

    2011-09-01

    An electronic digital equalizer for polarization multiplexed coherent fiber optic communication systems is designed to compensate for the polarization mode dispersion (PMD) and residual chromatic dispersion (CD) of the transmission channel. The proposed equalizer is realized with a fractionally spaced infinite impulse response (IIR) butterfly structure with 21 feedforward taps and 2 feedback taps. Compared with a finite impulse response (FIR) structure, this structure reduces the hardware implementation complexity under the same conditions. To keep track of the random variation of channel characteristics, the filter weights are updated by the least mean square (LMS) algorithm. The simulation results show that the proposed equalizer can compensate residual CD of 1600 ps/nm and differential group delay (DGD) of 90 ps simultaneously, and can also increase the PMD and residual CD tolerance of the whole communication system.

  8. Urban Heat Island Adaptation Strategies are not created equal: Assessment of Impacts and Tradeoffs

    NASA Astrophysics Data System (ADS)

    Georgescu, Matei

    2014-05-01

    Sustainable urban expansion requires an extension of contemporary approaches that focus nearly exclusively on the reduction of greenhouse gas emissions. Researchers have proposed biophysical approaches to urban heat island mitigation (e.g., via deployment of cool or green roofs), but little is known about how these technologies vary with place and season, or about their impacts beyond near-surface temperature. Using a suite of continuous, multi-year and multi-member continental-scale numerical simulations for the United States, we examine hydroclimatic impacts for a variety of U.S. urban expansion (to the year 2100) and urban adaptation futures and compare them to contemporary urban extent. Adaptation approaches include widespread adoption of cool roofs, green roofs, and a hypothetical hybrid approach integrating properties of both (i.e., reflective green roofs). Widespread adoption of adaptation strategies exhibits hydroclimatic impacts that are regionally and seasonally dependent. For some regions and seasons, urban-induced warming of 3°C can be completely offset by the adaptation approaches examined; for others, widespread adoption of some strategies can significantly reduce precipitation. Finally, implications of large-scale urbanization for seasonal energy demand will be examined.

  9. An Adaptive Niching Genetic Algorithm using a niche size equalization mechanism

    NASA Astrophysics Data System (ADS)

    Nagata, Yuichi

    Niching GAs have been widely investigated to apply genetic algorithms (GAs) to multimodal function optimization problems. In this paper, we suggest a new niching GA that attempts to form niches, each consisting of an equal number of individuals. The proposed GA can be applied also to combinatorial optimization problems by defining a distance metric in the search space. We apply the proposed GA to the job-shop scheduling problem (JSP) and demonstrate that the proposed niching method enhances the ability to maintain niches and improve the performance of GAs.

  10. Equalizing resolution in smoothed-particle hydrodynamics calculations using self-adaptive sinc kernels

    NASA Astrophysics Data System (ADS)

    García-Senz, Domingo; Cabezón, Rubén M.; Escartín, José A.; Ebinger, Kevin

    2014-10-01

    Context. The smoothed-particle hydrodynamics (SPH) technique is a numerical method for solving gas-dynamical problems. It has been applied to simulate the evolution of a wide variety of astrophysical systems. The method has a second-order accuracy, with a resolution that is usually much higher in the compressed regions than in the diluted zones of the fluid. Aims: We propose and check a method to balance and equalize the resolution of SPH between high- and low-density regions. This method relies on the versatility of a family of interpolators called sinc kernels, which allows increasing the interpolation quality by varying only a single parameter (the exponent of the sinc function). Methods: The proposed method was checked and validated through a number of numerical tests, from standard one-dimensional Riemann problems in shock tubes, to multidimensional simulations of explosions, hydrodynamic instabilities, and the collapse of a Sun-like polytrope. Results: The analysis of the hydrodynamical simulations suggests that the scheme devised to equalize the accuracy improves the treatment of the post-shock regions and, in general, of the rarefied zones of fluids while causing no harm to the growth of hydrodynamic instabilities. The method is robust and easy to implement with a low computational overhead. It conserves mass, energy, and momentum and reduces to the standard SPH scheme in regions of the fluid that have smooth density gradients.

  11. Fast tracking using edge histograms

    NASA Astrophysics Data System (ADS)

    Rokita, Przemyslaw

    1997-04-01

    This paper proposes a new algorithm for tracking objects and object boundaries. The algorithm was developed and applied in a system for compositing computer-generated images and real-world video sequences, but it can be applied in general to any tracking system where accuracy and high processing speed are required. The algorithm is based on the analysis of histograms obtained by summing the pixels of edge-segmented images along chosen axes. Edge segmentation is done by spatial convolution using a gradient operator. The advantage of this approach is that it can be performed in real time using commercially available hardware convolution filters. After edge extraction and histogram computation, the respective positions of the maxima in the edge-intensity histograms of the current and previous frames are compared and matched. The resulting information about the displacement of the histogram maxima can be directly converted into information about changes of the target boundary positions along the chosen axes.
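    The projection-and-match step can be sketched as follows (a toy illustration with hypothetical helper names, not the paper's implementation): project the edge map onto each axis by summing pixels, then compare the peak positions between consecutive frames to estimate the displacement.

```python
def axis_histograms(edge_map):
    """Sum edge magnitudes along rows and columns of a 2-D edge map."""
    rows = [sum(row) for row in edge_map]
    cols = [sum(col) for col in zip(*edge_map)]
    return rows, cols

def peak_shift(hist_prev, hist_curr):
    """Displacement of the histogram maximum between two frames."""
    return hist_curr.index(max(hist_curr)) - hist_prev.index(max(hist_prev))

# Toy example: a vertical edge at column 2 moves to column 4.
frame1 = [[0, 0, 1, 0, 0, 0]] * 4
frame2 = [[0, 0, 0, 0, 1, 0]] * 4

_, cols1 = axis_histograms(frame1)
_, cols2 = axis_histograms(frame2)
dx = peak_shift(cols1, cols2)  # horizontal displacement in pixels
```

    A real system would match all significant maxima, not just the global one, but the peak-displacement idea is the same.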

  12. CHIL - a comprehensive histogramming language

    SciTech Connect

    Milner, W.T.; Biggerstaff, J.A.

    1984-01-01

    A high level language, CHIL, has been developed for use in processing event-by-event experimental data at the Holifield Heavy Ion Research Facility (HHIRF) using PERKIN-ELMER 3230 computers. CHIL has been fully integrated into all software which supports on-line and off-line histogramming and off-line preprocessing. CHIL supports simple gates, free-form-gates (2-D regions of arbitrary shape), condition test and branch statements, bit-tests, loops, calls to up to three user supplied subroutines and histogram generating statements. Any combination of 1, 2, 3 or 4-D histograms (32 megachannels max) may be recorded at 16 or 32 bits/channel. User routines may intercept the data being processed and modify it as desired. The CPU-intensive part of the processing utilizes microcoded routines which enhance performance by about a factor of two.

  13. Studies on effects of feedback delay on the convergence performance of adaptive time-domain equalizers for fiber dispersive channels

    NASA Astrophysics Data System (ADS)

    Guo, Qun; Xu, Bo; Qiu, Kun

    2016-04-01

    The adaptive time-domain equalizer (TDE) is an important module in digital optical coherent receivers. From an implementation perspective, we analyze and compare in detail the effects of error-signal feedback delay on the convergence performance of a TDE using either the least-mean-square (LMS) or the constant modulus algorithm (CMA). For this purpose, a simplified theoretical model is proposed, based on which iterative equations for the mean value and the variance of the tap coefficients are derived, with and without error-signal feedback delay, for both the LMS- and CMA-based methods for the first time. The analytical results show that a decreased step size has to be used for the TDE to converge, and a slower convergence speed cannot be avoided as the feedback delay increases. Compared with the data-aided LMS-based method, the CMA-based method has a slower convergence speed and larger variation after convergence. Similar results are confirmed using numerical simulations for fiber dispersive channels. As the step size increases, a feedback delay of 20 clock cycles might cause the TDE to diverge. Compared with the CMA-based method, the LMS-based method has a higher tolerance of the feedback delay and allows a larger step size for a faster convergence speed.
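    The effect described above, stale error feedback in tap adaptation, can be sketched with a one-tap LMS identification loop (a toy model, not the paper's TDE; the channel gain, step size, and delay values are illustrative). The error is computed with the tap value in effect at the time the sample arrives, but only applied to the update `delay` iterations later; with a small step size both variants still converge.

```python
import random

def lms_delayed(x, d, mu, delay):
    """One-tap LMS (w*x ~ d) whose error feedback arrives `delay` steps late."""
    w = 0.0
    pending = []                            # (input, error) pairs awaiting feedback
    for xn, dn in zip(x, d):
        pending.append((xn, dn - w * xn))   # error computed with the current tap
        if len(pending) > delay:            # delayed feedback becomes available
            xd, ed = pending.pop(0)
            w += mu * ed * xd               # standard LMS update with stale error
    return w

random.seed(0)
h = 0.8                                      # unknown channel gain to identify
x = [random.uniform(-1.0, 1.0) for _ in range(5000)]
d = [h * xn for xn in x]
w_no_delay = lms_delayed(x, d, mu=0.05, delay=0)
w_delay = lms_delayed(x, d, mu=0.05, delay=10)
# Increasing mu with a nonzero delay eventually makes the loop diverge,
# which is the trade-off the abstract quantifies.
```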

  14. Theory and Application of DNA Histogram Analysis.

    ERIC Educational Resources Information Center

    Bagwell, Charles Bruce

    The underlying principles and assumptions associated with DNA histograms are discussed along with the characteristics of fluorescent probes. Information theory was described and used to calculate the information content of a DNA histogram. Two major types of DNA histogram analyses are proposed: parametric and nonparametric analysis. Three levels…

  15. Quantitative histogram analysis of images

    NASA Astrophysics Data System (ADS)

    Holub, Oliver; Ferreira, Sérgio T.

    2006-11-01

    A routine for histogram analysis of images has been written in the object-oriented, graphical development environment LabVIEW. The program converts an RGB bitmap image into an intensity-linear greyscale image according to selectable conversion coefficients. This greyscale image is subsequently analysed by plots of the intensity histogram and probability distribution of brightness, and by calculation of various parameters, including average brightness, standard deviation, variance, minimal and maximal brightness, mode, skewness and kurtosis of the histogram and the median of the probability distribution. The program allows interactive selection of specific regions of interest (ROI) in the image and definition of lower and upper threshold levels (e.g., to permit the removal of a constant background signal). The results of the analysis of multiple images can be conveniently saved and exported for plotting in other programs, which allows fast analysis of relatively large sets of image data. The program file accompanies this manuscript together with a detailed description of two application examples: the analysis of fluorescence microscopy images, specifically of tau-immunofluorescence in primary cultures of rat cortical and hippocampal neurons, and the quantification of protein bands by Western blot. The possibilities and limitations of this kind of analysis are discussed. Program summary: Title of program: HAWGC. Catalogue identifier: ADXG_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXG_v1_0. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Computers: Mobile Intel Pentium III, AMD Duron. Installations: No installation necessary; executable file together with necessary files for LabVIEW Run-time engine. Operating systems or monitors under which the program has been tested: Windows ME/2000/XP. Programming language used: LabVIEW 7.0. Memory required to execute with typical data: ~16 MB for starting and ~160 MB used for

  16. Spline smoothing of histograms by linear programming

    NASA Technical Reports Server (NTRS)

    Bennett, J. O.

    1972-01-01

    An algorithm is presented for obtaining an approximating function to a frequency distribution from a sample of size n. To obtain the approximating function, a histogram is made from the data. Next, Euclidean-space approximations to the graph of the histogram, using central B-splines as basis elements, are obtained by linear programming. The approximating function has area one and is nonnegative.

  17. Interpreting Histograms. As Easy as It Seems?

    ERIC Educational Resources Information Center

    Lem, Stephanie; Onghena, Patrick; Verschaffel, Lieven; Van Dooren, Wim

    2014-01-01

    Histograms are widely used, but recent studies have shown that they are not as easy to interpret as it might seem. In this article, we report on three studies on the interpretation of histograms in which we investigated, namely, (1) whether the misinterpretation by university students can be considered to be the result of heuristic reasoning, (2)…

  18. Adaptive Channel-Tracking Method and Equalization for MC-CDMA Systems over Rapidly Fading Channel under Colored Noise

    NASA Astrophysics Data System (ADS)

    Yang, Chang-Yi; Chen, Bor-Sen

    2010-12-01

    A recursive maximum-likelihood (RML) algorithm for channel estimation under a rapidly fading channel and colored noise in a multicarrier code-division multiple-access (MC-CDMA) system is proposed in this paper. A moving-average model with exogenous input (MAX) is used to describe the transmission channel and colored noise. Based on the pseudoregression method, the proposed RML algorithm can simultaneously estimate the parameters of the channel and the colored noise. These estimated parameters can then be used to enhance the minimum mean-square error (MMSE) equalizer. Considering high-speed mobile stations, a one-step linear trend predictor is added to improve symbol detection. Simulation results indicate that the proposed RML estimator can track the channel more precisely than the conventional estimator. Meanwhile, the performance of the proposed enhanced MMSE equalizer is robust to rapid Rayleigh fading under colored noise in MC-CDMA systems.

  19. Are Urban Heat Island Adaptation Strategies Created Equal? Hydroclimatic Impact Assessment for U.S. 2100 Urban Expansion (Invited)

    NASA Astrophysics Data System (ADS)

    Georgescu, M.; Bierwagen, B. G.; Morefield, P.; Weaver, C. P.

    2013-12-01

    With population projections ranging from 380 to 690 million inhabitants for the U.S. by 2100, considerable conversion of landscapes will be necessary to meet increased demand for the built environment. Incorporating Integrated Climate and Land Use Scenarios (ICLUS) urban expansion data for 2100 as surface boundary conditions within the Weather Research and Forecasting (WRF) modeling system, we examine the hydroclimatic consequences of built-environment expansion scenarios across the conterminous U.S. Continuous, multi-year and multi-member continental-scale numerical simulations are performed for a modern-day urban representation (Control), a worst-case (A2) and a best-case (B1) urban expansion scenario. Three adaptation approaches are explored to assess their potential to offset urban-induced warming from growth of the built environment: (i) widespread adoption of cool roofs, (ii) a simple representation of green roofs, and (iii) a hypothetical hybrid approach integrating properties of both cool and green roofs (i.e., reflective green roofs). Widespread adoption of adaptation strategies exhibits hydroclimatic impacts that are regionally and seasonally dependent. To help prioritize region-specific adaptation strategies, the potential of each of the trio of strategies to offset urban-induced warming is examined and contrasted across the various hydrometeorological environments.

  20. Ear segmentation using histogram based K-means clustering and Hough transformation under CVL dataset

    NASA Astrophysics Data System (ADS)

    Liu, Heng; Liu, Dekai

    2009-10-01

    Using the CVL dataset, we provide an image segmentation approach based on adaptive histogram-based K-means clustering and fast Hough transformation. This work first analyzes the characteristics of ear images in the CVL face dataset. Based on this analysis, we use an adaptive histogram-based K-means clustering method to threshold the ear images and roughly segment the ear parts. After ear contour extraction, with boundary determination through vertical projection, the Hough transformation is used to locate the ear contour accurately. The experimental results and comparisons with other segmentation methods show that our approach is effective.
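    The histogram-based K-means thresholding step can be sketched in one dimension: cluster the gray levels of the histogram into two groups, weighting each level by its count, and threshold between the cluster centers (a simplified stand-in for the paper's adaptive method; the function name is illustrative).

```python
def histogram_kmeans_threshold(hist, iters=20):
    """Two-cluster k-means on a gray-level histogram; returns the threshold."""
    levels = range(len(hist))
    total = sum(hist)
    mean = sum(g * h for g, h in zip(levels, hist)) / total
    c0, c1 = mean / 2, (mean + len(hist) - 1) / 2   # initial centroids
    for _ in range(iters):
        # assign each gray level to the nearer centroid, weighted by counts
        s0 = w0 = s1 = w1 = 0
        for g, h in zip(levels, hist):
            if abs(g - c0) <= abs(g - c1):
                s0 += g * h; w0 += h
            else:
                s1 += g * h; w1 += h
        if w0: c0 = s0 / w0
        if w1: c1 = s1 / w1
    return (c0 + c1) / 2   # threshold between the two cluster centers

# Bimodal toy histogram: dark peak near level 2, bright peak near level 12.
hist = [1, 4, 9, 4, 1, 0, 0, 0, 0, 0, 1, 4, 9, 4, 1, 0]
t = histogram_kmeans_threshold(hist)
```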

  1. Circle detection using scan lines and histograms

    NASA Astrophysics Data System (ADS)

    Chen, Ming; Zhang, Feng; Du, Zhenhong; Liu, Renyi

    2013-11-01

    Circle detection is significant in image processing and pattern recognition. We present a new algorithm for detecting circles, which is based on the global geometric symmetry of circles. First, the horizontal and vertical midpoint histograms of the edge image are obtained by using scan lines. Then, we apply the peak-finding algorithm to the midpoint histograms to look for the center of the circle. The normalized radius histogram is finally used to verify the existence of the circle and extract its radius. Synthetic images with different levels of pepper noise and real images containing several circles have been taken to test the performance. Experimental results demonstrate that the proposed algorithm has the advantage of computational efficiency as compared with the randomized Hough transform and some other algorithms.
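    The symmetry argument can be sketched as follows: on each horizontal scan line, the midpoint of the extreme edge pixels of a circle lies on the vertical line through the center, so the midpoint histograms peak at the center coordinates. This is a simplified single-circle sketch with illustrative names, not the paper's full algorithm (which also verifies radii with a normalized radius histogram).

```python
import math
from collections import Counter

def circle_center(edge_points):
    """Estimate a circle's center from edge points via midpoint histograms."""
    rows, cols = {}, {}
    for x, y in edge_points:
        rows.setdefault(y, []).append(x)
        cols.setdefault(x, []).append(y)
    # Midpoints of leftmost/rightmost (and top/bottom) edge pixels per line
    mid_x = Counter(round((min(v) + max(v)) / 2) for v in rows.values())
    mid_y = Counter(round((min(v) + max(v)) / 2) for v in cols.values())
    # Peaks of the midpoint histograms give the center coordinates
    return mid_x.most_common(1)[0][0], mid_y.most_common(1)[0][0]

# Synthetic circle of radius 10 centered at (30, 20)
pts = [(30 + round(10 * math.cos(a / 50 * 2 * math.pi)),
        20 + round(10 * math.sin(a / 50 * 2 * math.pi))) for a in range(50)]
cx, cy = circle_center(pts)
```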

  2. Quantization of color histograms using GLA

    NASA Astrophysics Data System (ADS)

    Yang, Christopher C.; Yip, Milo K.

    2002-09-01

    The color histogram has been used as one of the most important image descriptors in a wide range of content-based image retrieval (CBIR) projects for color image indexing. It captures the global chromatic distribution of an image. Traditionally, there are two major approaches to quantizing the color space: (1) quantize each dimension of a color coordinate system uniformly to generate a fixed number of bins; and (2) quantize a color coordinate system arbitrarily. The first approach works best on cubical color coordinate systems, such as RGB. For a non-cubical color coordinate system, such as CIELAB or CIELUV, some bins may fall outside the gamut (transformed from the RGB cube) of the color space. As a result, the effectiveness of the color histogram is reduced and hence the retrieval performance suffers. The second approach uses arbitrary quantization, so the volumes of the bins are not necessarily uniform, which also significantly affects the effectiveness of the histogram. In this paper, we propose to build the color histogram by tessellating the non-cubical color gamut transformed from the RGB cube using a vector quantization (VQ) method, the Generalized Lloyd Algorithm (GLA) [6]. With this approach, the problem of empty bins due to the gamut of the color coordinate system is avoided. Moreover, all bins quantized by GLA occupy the same volume, which guarantees the uniformity of the quantized bins in the histogram. An experiment has been conducted to evaluate the quantitative performance of our approach. The image collection from UC Berkeley's digital library project is used as the test bed. The indexing effectiveness of a histogram space [3] is used as the performance measure. The experimental results show that the GLA quantization approach significantly increases the indexing effectiveness.
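    A minimal one-dimensional sketch of the Generalized Lloyd Algorithm: alternate between assigning each sample to its nearest codeword and moving each codeword to the centroid of its cell. The same alternation extends to full color vectors; the function name and initialization are illustrative, not the paper's setup.

```python
def gla_quantize(samples, k, iters=25):
    """Generalized Lloyd Algorithm in 1-D: returns a k-entry codebook."""
    samples = sorted(samples)
    # initialize codewords spread over the data range (requires k >= 2)
    code = [samples[i * (len(samples) - 1) // (k - 1)] for i in range(k)]
    for _ in range(iters):
        cells = [[] for _ in range(k)]
        for s in samples:                      # nearest-codeword assignment
            j = min(range(k), key=lambda i: abs(s - code[i]))
            cells[j].append(s)
        # centroid update; keep an empty cell's codeword unchanged
        code = [sum(c) / len(c) if c else code[i] for i, c in enumerate(cells)]
    return code

data = [0.1, 0.2, 0.15, 0.9, 1.0, 0.95, 2.0, 2.1, 1.9]
codebook = gla_quantize(data, k=3)   # converges to the three cluster means
```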

  3. Accelerated weight histogram method for exploring free energy landscapes

    SciTech Connect

    Lindahl, V.; Lidmar, J.; Hess, B.

    2014-07-28

    Calculating free energies is an important and notoriously difficult task for molecular simulations. The rapid increase in computational power has made it possible to probe increasingly complex systems, yet extracting accurate free energies from these simulations remains a major challenge. Fully exploring the free energy landscape of, say, a biological macromolecule typically requires sampling large conformational changes and slow transitions. Often, the only feasible way to study such a system is to simulate it using an enhanced sampling method. The accelerated weight histogram (AWH) method is a new, efficient extended ensemble sampling technique which adaptively biases the simulation to promote exploration of the free energy landscape. The AWH method uses a probability weight histogram which allows for efficient free energy updates and results in an easy discretization procedure. A major advantage of the method is its general formulation, making it a powerful platform for developing further extensions and analyzing its relation to already existing methods. Here, we demonstrate its efficiency and general applicability by calculating the potential of mean force along a reaction coordinate for both a single dimension and multiple dimensions. We make use of a non-uniform, free energy dependent target distribution in reaction coordinate space so that computational efforts are not wasted on physically irrelevant regions. We present numerical results for molecular dynamics simulations of lithium acetate in solution and chignolin, a 10-residue long peptide that folds into a β-hairpin. We further present practical guidelines for setting up and running an AWH simulation.

  4. Accelerated weight histogram method for exploring free energy landscapes

    NASA Astrophysics Data System (ADS)

    Lindahl, V.; Lidmar, J.; Hess, B.

    2014-07-01

    Calculating free energies is an important and notoriously difficult task for molecular simulations. The rapid increase in computational power has made it possible to probe increasingly complex systems, yet extracting accurate free energies from these simulations remains a major challenge. Fully exploring the free energy landscape of, say, a biological macromolecule typically requires sampling large conformational changes and slow transitions. Often, the only feasible way to study such a system is to simulate it using an enhanced sampling method. The accelerated weight histogram (AWH) method is a new, efficient extended ensemble sampling technique which adaptively biases the simulation to promote exploration of the free energy landscape. The AWH method uses a probability weight histogram which allows for efficient free energy updates and results in an easy discretization procedure. A major advantage of the method is its general formulation, making it a powerful platform for developing further extensions and analyzing its relation to already existing methods. Here, we demonstrate its efficiency and general applicability by calculating the potential of mean force along a reaction coordinate for both a single dimension and multiple dimensions. We make use of a non-uniform, free energy dependent target distribution in reaction coordinate space so that computational efforts are not wasted on physically irrelevant regions. We present numerical results for molecular dynamics simulations of lithium acetate in solution and chignolin, a 10-residue long peptide that folds into a β-hairpin. We further present practical guidelines for setting up and running an AWH simulation.

  5. Equality Handbook

    ERIC Educational Resources Information Center

    Malzer, Maris; Popovic, Milica; Striedinger, Angelika; Bjorklund, Karin; Olsson, Anna-Clara; Elstad, Linda; Brus, Sanja; Stark, Kat; Stojanovic, Marko; Scholz, Christine

    2009-01-01

    "Tolerance is not enough, discrimination must be fought" is what ESU staff stated at their Seminar on Equality in London last May. Following the seminar, they decided to provide members with more practical tools to fight discrimination in higher education. This handbook is part of that strategy. Focusing on several issues that are high…

  6. Maximum-Likelihood Fits to Histograms for Improved Parameter Estimation

    NASA Astrophysics Data System (ADS)

    Fowler, J. W.

    2014-08-01

    Straightforward methods for adapting the familiar χ² statistic to histograms of discrete events and other Poisson-distributed data generally yield biased estimates of the parameters of a model. The bias can be important even when the total number of events is large. For the case of estimating a microcalorimeter's energy resolution at 6 keV from the observed shape of the Mn K fluorescence spectrum, a poor choice of χ² can lead to biases of at least 10% in the estimated resolution when up to thousands of photons are observed. The best remedy is a Poisson maximum-likelihood fit, through a simple modification of the standard Levenberg-Marquardt algorithm for χ² minimization. Where the modification is not possible, another approach allows iterative approximation of the maximum-likelihood fit.
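    The bias is easy to reproduce for the simplest case, a constant-rate model fitted to Poisson bin counts: minimizing Neyman's chi-square (weights 1/n_i) yields the harmonic mean of the counts, which always sits at or below the Poisson maximum-likelihood estimate (the arithmetic mean). A small stand-alone illustration, not the paper's microcalorimeter analysis:

```python
def neyman_chisq_rate(counts):
    """Minimizer of sum_i (n_i - lam)^2 / n_i: the harmonic mean of counts."""
    return len(counts) / sum(1.0 / n for n in counts)

def poisson_ml_rate(counts):
    """Poisson maximum-likelihood estimate of a constant rate: the mean."""
    return sum(counts) / len(counts)

counts = [8, 12, 9, 11, 10, 7, 13, 10]  # toy bin contents, true rate 10
lam_chisq = neyman_chisq_rate(counts)    # systematically low
lam_ml = poisson_ml_rate(counts)         # unbiased for this model
```

    Any spread in the counts pulls the harmonic mean below the arithmetic mean, which is the mechanism behind the low bias of naive χ² fits to sparse histograms.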

  7. Histogramming of the Charged Particle Measurements with MSL/RAD - Comparison of Histogram Data with Simulations

    NASA Astrophysics Data System (ADS)

    Ehresmann, B.; Zeitlin, C.; Hassler, D. M.; Wimmer-Schweingruber, R. F.; Boettcher, S.; Koehler, J.; Martin, C.; Brinza, D.; Rafkin, S. C.

    2012-12-01

    The Radiation Assessment Detector (RAD) on-board the Mars Science Laboratory (MSL) is designed to measure a broad range of energetic particle radiation. A significant part of this radiation consists of charged particles, which mainly stem from cosmic background radiation, Solar particle events, and secondaries created by the interaction of these particles with the Martian atmosphere and soil. To measure charged particles RAD is equipped with a set of detectors: a particle telescope consisting of three silicon Solid-State Detectors (SSDs), a CsI scintillator and a plastic scintillator, as well as a further plastic scintillator used as anti-coincidence. RAD uses an elaborate post-processing logic to analyze if a measured event qualifies as a charged particle, as well as to distinguish between particles stopping in any one of the detectors and particles penetrating the whole detector stack. RAD then arranges these qualifying events in an appropriate stopping or penetrating charged particle histogram, reducing the data volume necessary to maintain crucial information about the measured particle. For ground-based data analysis it is of prime importance to derive information, such as particle species or energy, from the data in the downloaded histograms. Here, we will present how the chosen binning of these histograms enables us to derive this information. Pre-flight, we used the Monte-Carlo code GEANT4 to simulate the expected particle radiation and its interactions with a full model of the RAD sensor head. By mirroring the on-board processing logic, we derived statistics of which particle species and energies populate any one bin in the set of charged particle histograms. Finally, we will compare the resulting histogram data from RAD cruise and surface observations with simulations. RAD is supported by NASA (HEOMD) under JPL subcontract #1273039 to SwRI, and by DLR in Germany under contract to Christian-Albrechts-Universitaet zu Kiel (CAU).

  8. Active steganalysis for histogram-shifting based reversible data hiding

    NASA Astrophysics Data System (ADS)

    Lou, Der-Chyuan; Chou, Chao-Lung; Tso, Hao-Kuan; Chiu, Chung-Cheng

    2012-05-01

    This paper presents an innovative active steganalysis algorithm for reversible data hiding schemes based on histogram shifting. These schemes use histogram shifting to embed secret data in cover-images. However, some histogram patterns originating during the embedding procedure may be recognized readily by a steganalyst. The proposed algorithm analyzes the characteristics of histogram changing during the data embedding procedure, and then models these features into reference templates by using a 1 × 4 sliding window. A support vector machine is trained as the classifier for discriminating between cover-images and stego-images by adopting the template matching techniques. The hidden messages located at the histogram peak of the cover-image were further estimated by measuring the feature of adjacent histogram differences. Experimental results indicate that the proposed active steganalysis algorithm can effectively detect stego-images at low bit rates and estimate the hidden messages locations.

  9. Using the gradient histogram to analyze the continuous phase plate

    NASA Astrophysics Data System (ADS)

    Yang, Chunlin

    2015-01-01

    The geometrical optical method has been used to discuss the far-field distribution characteristics of a continuous phase plate. The gradient histogram of the plate’s surface has been calculated. It has been proved that the gradient histogram can be used to show the angular spectrum of a phase plate. The gradient histogram can simplify the analysis process of the angular spectrum and describe the focal spot morphology more intuitively.

  10. Automatic image equalization and contrast enhancement using Gaussian mixture modeling.

    PubMed

    Celik, Turgay; Tjahjadi, Tardi

    2012-01-01

    In this paper, we propose an adaptive image equalization algorithm that automatically enhances the contrast in an input image. The algorithm uses the Gaussian mixture model to model the image gray-level distribution, and the intersection points of the Gaussian components in the model are used to partition the dynamic range of the image into input gray-level intervals. The contrast equalized image is generated by transforming the pixels' gray levels in each input interval to the appropriate output gray-level interval according to the dominant Gaussian component and the cumulative distribution function of the input interval. To take account of the hypothesis that homogeneous regions in the image represent homogeneous silences (or set of Gaussian components) in the image histogram, the Gaussian components with small variances are weighted with smaller values than the Gaussian components with larger variances, and the gray-level distribution is also used to weight the components in the mapping of the input interval to the output interval. Experimental results show that the proposed algorithm produces better or comparable enhanced images than several state-of-the-art algorithms. Unlike the other algorithms, the proposed algorithm is free of parameter setting for a given dynamic range of the enhanced image and can be applied to a wide range of image types. PMID:21775265

  11. Monitoring Incremental Histogram Distribution for Change Detection in Data Streams

    NASA Astrophysics Data System (ADS)

    Sebastião, Raquel; Gama, João; Rodrigues, Pedro Pereira; Bernardes, João

    Histograms are a common technique for density estimation and they have been widely used as a tool in exploratory data analysis. Learning histograms from static and stationary data is a well known topic. Nevertheless, very few works discuss this problem when we have a continuous flow of data generated from dynamic environments.

  12. From data to probability densities without histograms

    NASA Astrophysics Data System (ADS)

    Berg, Bernd A.; Harris, Robert C.

    2008-09-01

    When one deals with data drawn from continuous variables, a histogram is often inadequate to display their probability density. It deals inefficiently with statistical noise, and the bin sizes are free parameters. In contrast, the empirical cumulative distribution function (obtained after sorting the data) is parameter free. But it is a step function, so its differentiation does not give a smooth probability density. Based on Fourier series expansion and Kolmogorov tests, we introduce a simple method which overcomes this problem. Error bars on the estimated probability density are calculated using a jackknife method. We give several examples and provide computer code reproducing them. You may want to look at the corresponding figures 4 to 9 first. Program summary: Program title: cdf_to_pd. Catalogue identifier: AEBC_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEBC_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 2758. No. of bytes in distributed program, including test data, etc.: 18 594. Distribution format: tar.gz. Programming language: Fortran 77. Computer: Any capable of compiling and executing Fortran code. Operating system: Any capable of compiling and executing Fortran code. Classification: 4.14, 9. Nature of problem: When one deals with data drawn from continuous variables, a histogram is often inadequate to display the probability density. It deals inefficiently with statistical noise, and the bin sizes are free parameters. In contrast, the empirical cumulative distribution function (obtained after sorting the data) is parameter free. But it is a step function, so its differentiation does not give a smooth probability density. Solution method: Based on Fourier series expansion and Kolmogorov tests, we introduce a simple method, which
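    The parameter-free starting point of the method, the empirical cumulative distribution function obtained by sorting the data, can be sketched as follows (the Fourier smoothing and Kolmogorov-test steps of the paper are not reproduced here):

```python
import bisect

def empirical_cdf(data):
    """Parameter-free empirical CDF: F(x) = fraction of samples <= x."""
    xs = sorted(data)
    n = len(xs)
    def F(x):
        # binary search for the number of samples <= x
        return bisect.bisect_right(xs, x) / n
    return F

F = empirical_cdf([3.0, 1.0, 2.0, 4.0])   # step function with steps of 1/4
```

    Because F is a step function, a direct numerical derivative is useless as a density estimate, which is exactly the problem the paper's Fourier-series expansion addresses.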

  13. Comparison of Histograms for Use in Cloud Observation and Modeling

    NASA Technical Reports Server (NTRS)

    Green, Lisa; Xu, Kuan-Man

    2005-01-01

    Cloud observation and cloud modeling data can be presented in histograms for each characteristic to be measured. Combining information from single-cloud histograms yields a summary histogram. Summary histograms can be compared to each other to reach conclusions about the behavior of an ensemble of clouds in different places at different times or about the accuracy of a particular cloud model. As in any scientific comparison, it is necessary to decide whether any apparent differences are statistically significant. The usual methods of deciding statistical significance when comparing histograms do not apply in this case because they assume independent data. Thus, a new method is necessary. The proposed method uses the Euclidean distance metric and bootstrapping to calculate the significance level.
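    The proposed comparison can be sketched as follows, assuming equal binning and treating each cloud's histogram as one resampling unit; the function names and the details of the resampling scheme are illustrative, not the paper's exact procedure:

```python
import random

def euclidean(h1, h2):
    """Euclidean distance between two histograms with the same binning."""
    return sum((a - b) ** 2 for a, b in zip(h1, h2)) ** 0.5

def summary(group):
    """Bin-wise average of a group of per-cloud histograms."""
    n = len(group)
    return [sum(h[i] for h in group) / n for i in range(len(group[0]))]

def bootstrap_pvalue(group_a, group_b, trials=2000, seed=1):
    """Fraction of bootstrap resamples whose between-group distance
    reaches the observed one; small values suggest a real difference."""
    observed = euclidean(summary(group_a), summary(group_b))
    pooled = group_a + group_b
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        resampled = [rng.choice(pooled) for _ in pooled]
        a, b = resampled[:len(group_a)], resampled[len(group_a):]
        if euclidean(summary(a), summary(b)) >= observed:
            hits += 1
    return hits / trials

same = [[1, 0, 0], [0, 1, 0]]
p_same = bootstrap_pvalue(same, same)          # identical groups
diff_a, diff_b = [[1, 0, 0]] * 5, [[0, 0, 1]] * 5
p_diff = bootstrap_pvalue(diff_a, diff_b)      # clearly different groups
```

    Resampling whole histograms, rather than individual bins, is what sidesteps the independence assumption that rules out the usual significance tests.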

  14. Fast ordering algorithm for exact histogram specification.

    PubMed

    Nikolova, Mila; Steidl, Gabriele

    2014-12-01

    This paper provides a fast algorithm to order the integer gray values in digital (quantized) images in a meaningful, strict way. It can be used in any application based on exact histogram specification. Our algorithm relies on the ordering procedure based on the specialized variational approach. This variational method was shown to be superior to all other state-of-the-art ordering algorithms in terms of faithful total strict ordering, but not in speed. Indeed, the relevant functionals are in general difficult to minimize because their gradient is nearly flat over vast regions. In this paper, we propose a simple and fast fixed point algorithm to minimize these functionals. The fast convergence of our algorithm results from known analytical properties of the model. Our algorithm is equivalent to an iterative nonlinear filtering. Furthermore, we show that a particular form of the variational model gives rise to much faster convergence than the alternative forms. We demonstrate that only a few iterations of this filter yield almost the same pixel ordering as the minimizer. Thus, we apply only a few iteration steps to obtain images whose pixels can be ordered in a strict and faithful way. Numerical experiments confirm that our algorithm outperforms its main competitors by far. PMID:25347881
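    The overall idea of exact histogram specification, strictly order all pixels and then assign output gray levels so the target histogram is matched exactly, can be sketched with a simple tie-breaking rule (a 3x3 neighborhood mean, standing in for the paper's variational ordering; names are illustrative):

```python
def exact_hist_spec(image, target_hist):
    """Exact histogram specification sketch; sum(target_hist) must equal
    the number of pixels. Ties in gray value are broken by neighbor mean."""
    h, w = len(image), len(image[0])
    def local_mean(r, c):
        vals = [image[rr][cc]
                for rr in range(max(0, r - 1), min(h, r + 2))
                for cc in range(max(0, c - 1), min(w, c + 2))]
        return sum(vals) / len(vals)
    # strict total order over pixel positions
    order = sorted(((r, c) for r in range(h) for c in range(w)),
                   key=lambda rc: (image[rc[0]][rc[1]], local_mean(*rc)))
    out = [[0] * w for _ in range(h)]
    it = iter(order)
    for level, count in enumerate(target_hist):
        for _ in range(count):       # hand out each level `count` times
            r, c = next(it)
            out[r][c] = level
    return out

img = [[1, 2], [3, 4]]
out = exact_hist_spec(img, [1, 1, 1, 1])   # flatten to one pixel per level
```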

  15. AHIMSA - Ad hoc histogram information measure sensing algorithm for feature selection in the context of histogram inspired clustering techniques

    NASA Technical Reports Server (NTRS)

    Dasarathy, B. V.

    1976-01-01

    An algorithm is proposed for dimensionality reduction in the context of clustering techniques based on histogram analysis. The approach is based on an evaluation of the hills and valleys in the unidimensional histograms along the different features and provides an economical means of assessing the significance of the features in a nonparametric unsupervised data environment. The method has relevance to remote sensing applications.
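    The hills-and-valleys idea can be illustrated by counting the local maxima of a unidimensional histogram: a feature whose histogram shows several separated modes is a better candidate for clustering. This is a toy stand-in for the AHIMSA measure itself, which is information-theoretic rather than a simple peak count.

```python
def count_peaks(hist):
    """Count 'hills' (local maxima) in a 1-D histogram."""
    peaks = 0
    for i in range(1, len(hist) - 1):
        if hist[i] > hist[i - 1] and hist[i] >= hist[i + 1]:
            peaks += 1
    return peaks

# A bimodal feature separates two clusters; a unimodal one does not.
bimodal = [1, 5, 2, 1, 1, 4, 6, 3, 1]
unimodal = [1, 2, 5, 7, 5, 2, 1]
```

    Ranking features by such a score and keeping the top few is one cheap, nonparametric route to dimensionality reduction in an unsupervised setting.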

  16. Adaptation.

    PubMed

    Broom, Donald M

    2006-01-01

    The term adaptation is used in biology in three different ways. It may refer to changes which occur at the cell and organ level, or at the individual level, or at the level of gene action and evolutionary processes. Adaptation by cells, especially nerve cells helps in: communication within the body, the distinguishing of stimuli, the avoidance of overload and the conservation of energy. The time course and complexity of these mechanisms varies. Adaptive characters of organisms, including adaptive behaviours, increase fitness so this adaptation is evolutionary. The major part of this paper concerns adaptation by individuals and its relationships to welfare. In complex animals, feed forward control is widely used. Individuals predict problems and adapt by acting before the environmental effect is substantial. Much of adaptation involves brain control and animals have a set of needs, located in the brain and acting largely via motivational mechanisms, to regulate life. Needs may be for resources but are also for actions and stimuli which are part of the mechanism which has evolved to obtain the resources. Hence pigs do not just need food but need to be able to carry out actions like rooting in earth or manipulating materials which are part of foraging behaviour. The welfare of an individual is its state as regards its attempts to cope with its environment. This state includes various adaptive mechanisms including feelings and those which cope with disease. The part of welfare which is concerned with coping with pathology is health. Disease, which implies some significant effect of pathology, always results in poor welfare. Welfare varies over a range from very good, when adaptation is effective and there are feelings of pleasure or contentment, to very poor. A key point concerning the concept of individual adaptation in relation to welfare is that welfare may be good or poor while adaptation is occurring. Some adaptation is very easy and energetically cheap and

  17. Online Data Monitoring Framework Based on Histogram Packaging in Network Distributed Data Acquisition Systems

    NASA Astrophysics Data System (ADS)

    Konno, T.; Cabarera, A.; Ishitsuka, M.; Kuze, M.; Sakamoto, Y.

    2011-12-01

    "Online monitor framework" is a new general software framework for online data monitoring, which provides a way to collect information from online systems, including data acquisition, and to display it to shifters far from the experimental sites. "Monitor Server", the core system in this framework, gathers the monitoring information from the online subsystems, and the information is handled as collections of histograms named "Histogram Packages". The Monitor Server broadcasts the histogram packages to "Monitor Viewers", the graphical user interfaces of the framework. We developed two types of viewer with different technologies: Java and web browser. We adopted XML-based files for the configuration of the GUI components on the windows and the graphical objects on the canvases; the Monitor Viewer creates its GUIs automatically from these configuration files. This monitoring framework has been developed for the Double Chooz reactor neutrino oscillation experiment in France, but can be extended for general use in other experiments. This document reports the structure of the online monitor framework with some examples from its adaptation to the Double Chooz experiment.

  18. RHINO: Real-time histogram interpretation of numerical observations

    NASA Astrophysics Data System (ADS)

    Chandler, Susan; Lukesh, Gordon

    2006-05-01

    RHINO, Real-time Histogram Interpretation of Numerical Observations, is a specialty algorithm and tool under development for the United States Air Force Office of Scientific Research (AFOSR). The intent is to provide real-time feedback for adaptive control of telescope pointing in ground-space-ground laser illumination experiments. Nukove, together with New Mexico State University, first established a proof-of-principle laboratory experiment using RHINO and, under a controlled environment, demonstrated reduction of the pointing error known as boresight. RHINO is resilient to effects such as glints, speckle, and scintillation. The forthcoming commercially available version of RHINO will use real-time field data and provide adaptive control to the user. The utility of RHINO is evident in a realistic scenario: consider a space asset that has been joined by a microsat, perhaps 0.5 m in size. The microsat may have been launched simply to listen in from close proximity, monitor the asset, image the asset or, most critically, cause damage to the asset. If the goal is to destroy the microsat by long-range illumination with a high-power laser and the microsat is meters from the asset (μrad at 1 Mm), laser pointing is of utmost importance, as the goal is certainly not to damage the space asset. RHINO offers the capability to estimate key metrics of laser system pointing, known as jitter and boresight. The algorithms used have been under development for nearly a decade, have been established in a laboratory environment, and have been tested with field data.

  19. Brightness preserving image enhancement based on a gradient and intensity histogram

    NASA Astrophysics Data System (ADS)

    Sun, Zebin; Feng, Wenquan; Zhao, Qi; Huang, Lidong

    2015-09-01

    We present a straightforward brightness-preserving image enhancement technique. The proposed method is based on an original gradient and intensity histogram (GIH) which contains both the gradient and intensity information of the image. This characteristic enables GIH to avoid the high peaks of the traditional intensity histogram and thus alleviates over-enhancement in our enhancement method, i.e., gradient and intensity histogram equalization (GIHE). GIHE can also enhance the gradient strength of an image, which improves subjective quality, since the human visual system is more sensitive to gradients than to the absolute intensity of an image. Considering that brightness preservation and dynamic range compression are highly demanded in consumer electronics, we manipulate the intensity of the enhanced image by amplifying small intensities and attenuating large intensities using a brightness preserving function (BPF). The BPF is straightforward and universal and can be used in other image enhancement techniques. We demonstrate that the proposed method can effectively improve subjective quality as well as preserve the brightness of the input image.
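    The abstract does not give the BPF's closed form. As a rough illustration of "amplify the small intensities and attenuate the large intensities while preserving brightness", one could use a power-law mapping rescaled to the input mean; the function below and its `gamma` parameter are assumptions for this sketch, not the paper's BPF:

```python
import numpy as np

def brightness_preserving_map(img, gamma=0.8):
    """Illustrative BPF (not the paper's exact function): a power law
    with gamma < 1 amplifies small intensities and attenuates large
    ones; rescaling then restores the input's mean brightness."""
    x = img.astype(float) / 255.0
    y = x ** gamma                   # boost dark pixels, compress bright ones
    y *= x.mean() / y.mean()         # preserve the original mean brightness
    return np.clip(y * 255.0, 0.0, 255.0).astype(np.uint8)
```

Any monotone mapping could be substituted for the power law; the mean-rescaling step is what carries the brightness-preservation idea.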

  20. A spectral histogram model for texton modeling and texture discrimination.

    PubMed

    Liu, Xiuwen; Wang, DeLiang

    2002-10-01

    We suggest the spectral histogram, defined as the marginal distribution of filter responses, as a quantitative definition of a texton pattern. By matching spectral histograms, an arbitrary image can be transformed into an image with textons similar to those observed. We use the chi-square statistic to measure the difference between two spectral histograms, which leads to a texture discrimination model. The performance of the model matches psychophysical results well on a systematic set of texture discrimination data, and it exhibits the nonlinearity and asymmetry phenomena seen in human texture discrimination. A quantitative comparison with the Malik-Perona model is given, and a number of issues regarding the model are discussed. PMID:12446034
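    A minimal sketch of the spectral histogram idea with the chi-square comparison, assuming a toy filter bank of the identity and two derivative filters (the model's actual bank of Gabor and difference-of-Gaussian filters is richer; bin count and range are also assumptions):

```python
import numpy as np

def spectral_histogram(img, bins=16):
    """Concatenated marginal distributions of a small filter bank's
    responses: the intensity itself plus horizontal and vertical
    first differences (a stand-in for a full Gabor/DOG bank)."""
    dx = img[:, 1:] - img[:, :-1]          # horizontal derivative response
    dy = img[1:, :] - img[:-1, :]          # vertical derivative response
    hists = []
    for resp in (img, dx, dy):
        h, _ = np.histogram(resp, bins=bins, range=(-255, 255))
        hists.append(h / h.sum())          # normalize to a distribution
    return np.concatenate(hists)

def chi2_distance(h1, h2, eps=1e-12):
    """Chi-square statistic between two spectral histograms."""
    return 0.5 * np.sum((h1 - h2) ** 2 / (h1 + h2 + eps))
```

Two patches of the same texture should yield a small `chi2_distance`; distinct textures a larger one, which is the basis of the discrimination model.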

  1. Action recognition via cumulative histogram of multiple features

    NASA Astrophysics Data System (ADS)

    Yan, Xunshi; Luo, Yupin

    2011-01-01

    Spatial-temporal interest points (STIPs) are popular in human action recognition. However, they suffer from two difficulties: determining the size of the codebook and losing much information when forming histograms. In this paper, spatial-temporal interest regions (STIRs) are proposed, which are based on STIPs and are capable of marking the locations of the most "shining" human body parts. To represent human actions, the proposed approach takes advantage of multiple features, including STIRs, pyramid histograms of oriented gradients and pyramid histograms of oriented optical flows. To achieve this, a cumulative histogram is used to integrate dynamic information across sequences and to form feature vectors. Furthermore, the widely used nearest neighbor and AdaBoost methods are employed as classification algorithms. Experiments on the public datasets KTH, Weizmann and UCF Sports show that the proposed approach achieves effective and robust results.

  2. Approximate Splitting for Ensembles of Trees using Histograms

    SciTech Connect

    Kamath, C; Cantu-Paz, E; Littau, D

    2001-09-28

    Recent work in classification indicates that significant improvements in accuracy can be obtained by growing an ensemble of classifiers and having them vote for the most popular class. Implicit in many of these techniques is the concept of randomization, which generates different classifiers. In this paper, the authors focus on ensembles of decision trees that are created using a randomized procedure based on histograms. Techniques such as histograms, which discretize continuous variables, have long been used in classification to convert the data into a form suitable for processing and to reduce the compute time. The approach combines the ideas behind discretization through histograms and randomization in ensembles to create decision trees by randomly selecting a split point in an interval around the best bin boundary in the histogram. Experimental results with public domain data show that ensembles generated using this approach are competitive in accuracy with, and superior in computational cost to, other ensemble techniques such as boosting and bagging.
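    The split-randomization step described above can be sketched as follows, assuming Gini impurity as the split score and uniform sampling within one bin width of the best boundary (both assumptions; the paper may score bins and randomize the interval differently):

```python
import numpy as np

def randomized_histogram_split(values, labels, n_bins=10, rng=None):
    """Histogram a continuous feature, score each interior bin boundary
    by weighted Gini impurity, then draw the actual split point
    uniformly from an interval around the best boundary."""
    rng = np.random.default_rng() if rng is None else rng
    edges = np.histogram_bin_edges(values, bins=n_bins)

    def gini(y):
        if len(y) == 0:
            return 0.0
        p = np.bincount(y) / len(y)
        return 1.0 - np.sum(p ** 2)

    best, best_score = None, np.inf
    for b in edges[1:-1]:                       # interior bin boundaries only
        left, right = labels[values <= b], labels[values > b]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(labels)
        if score < best_score:
            best, best_score = b, score
    width = edges[1] - edges[0]
    # randomization: each tree in the ensemble gets a slightly different split
    return rng.uniform(best - width / 2, best + width / 2)
```

Growing each tree of the ensemble with a different random seed then yields distinct trees from the same data, which is the source of the ensemble's diversity.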

  3. Frequency distribution histograms for the rapid analysis of data

    NASA Technical Reports Server (NTRS)

    Burke, P. V.; Bullen, B. L.; Poff, K. L.

    1988-01-01

    The mean and standard error are good representations for the response of a population to an experimental parameter and are frequently used for this purpose. Frequency distribution histograms show, in addition, responses of individuals in the population. Both the statistics and a visual display of the distribution of the responses can be obtained easily using a microcomputer and available programs. The type of distribution shown by the histogram may suggest different mechanisms to be tested.

  4. Adapt

    NASA Astrophysics Data System (ADS)

    Bargatze, L. F.

    2015-12-01

    Active Data Archive Product Tracking (ADAPT) is a collection of software routines that permits one to generate XML metadata files to describe and register data products in support of the NASA Heliophysics Virtual Observatory VxO effort. ADAPT is also a philosophy. The ADAPT concept is to use any and all available metadata associated with scientific data to produce XML metadata descriptions in a consistent, uniform, and organized fashion to provide blanket access to the full complement of data stored on a targeted data server. In this poster, we present an application of ADAPT to describe all of the data products stored in the Common Data File (CDF) format and served out by the CDAWEB and SPDF data servers hosted at the NASA Goddard Space Flight Center. These data servers are the primary repositories for NASA Heliophysics data. For this purpose, the ADAPT routines have been used to generate data resource descriptions by using an XML schema named Space Physics Archive, Search, and Extract (SPASE). SPASE is the designated standard for documenting Heliophysics data products, as adopted by the Heliophysics Data and Model Consortium. The set of SPASE XML resource descriptions produced by ADAPT includes high-level descriptions of numerical data products, display data products, and catalogs, and also includes low-level "Granule" descriptions. A SPASE Granule is effectively a universal access metadata resource; a Granule associates an individual data file (e.g. a CDF file) with a "parent" high-level data resource description, assigns a resource identifier to the file, and lists the corresponding access URL(s). The CDAWEB and SPDF file systems were queried to provide the input required by the ADAPT software to create an initial set of SPASE metadata resource descriptions. The CDAWEB and SPDF data repositories were then queried on a nightly basis and the CDF file lists were checked for any changes such as the occurrence of new, modified, or deleted

  5. Bin recycling strategy for improving the histogram precision on GPU

    NASA Astrophysics Data System (ADS)

    Cárdenas-Montes, Miguel; Rodríguez-Vázquez, Juan José; Vega-Rodríguez, Miguel A.

    2016-07-01

    A histogram is an easily comprehensible way to present data and analyses. In the current scientific context, with access to large volumes of data, the processing time for building histograms has increased dramatically. For this reason, parallel construction is necessary to alleviate the impact of processing time on analysis activities. In this scenario, GPU computing is becoming widely used to reduce the processing time of histogram construction to affordable levels. Along with this reduction in processing time, the implementations are stressed with regard to bin-count accuracy: accuracy issues due to the particularities of the implementations are not usually taken into consideration when building histograms with very large data sets. In this work, a bin recycling strategy for creating an accuracy-aware implementation of histogram construction on GPU is presented. To evaluate the approach, this strategy was applied to the computation of the three-point angular correlation function, a relevant function in cosmology for the study of the large-scale structure of the Universe. As a consequence of the study, a high-accuracy implementation of histogram construction on GPU is proposed.

  6. Quantification of emphysema severity by histogram analysis of CT scans.

    PubMed

    Mendonça, Paulo R S; Padfield, Dirk R; Ross, James C; Miller, James V; Dutta, Sandeep; Gautham, Sardar Mal

    2005-01-01

    Emphysema is characterized by the destruction and overdistension of lung tissue, which manifest on high-resolution computed tomography (CT) images as regions of low attenuation. Typically, it is diagnosed by clinical symptoms, physical examination, pulmonary function tests, and X-ray and CT imaging. In this paper we discuss a quantitative imaging approach to analyzing emphysema which employs low-level segmentations of CT images that partition the data into perceptually relevant regions. We constructed multi-dimensional histograms of feature values computed over the image segmentation. For each region in the segmentation, we derive a rich set of feature measurements. While any combination of physical and geometric features could be used, we found that limiting the scope to two features, the mean attenuation across a region and the region area, is effective. The subject histogram is compared to a set of canonical histograms representative of various stages of emphysema using the Earth Mover's Distance metric. Disease severity is assigned based on which canonical histogram is most similar to the subject histogram. Experimental results with 81 cases of emphysema at different stages of disease progression show good agreement with the reading of an expert radiologist. PMID:16685912
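    The final matching step might look like the sketch below, which simplifies the paper's multi-dimensional comparison to 1-D attenuation histograms via SciPy's Wasserstein (Earth Mover's) distance; the grade names and the canonical histograms are invented for illustration:

```python
import numpy as np
from scipy.stats import wasserstein_distance

def classify_severity(subject_hist, canonical, bin_centers):
    """Assign the severity grade whose canonical histogram is closest
    to the subject's under the 1-D Earth Mover's Distance. `canonical`
    maps grade name -> histogram over the same bin centers."""
    best_grade, best_d = None, np.inf
    for grade, hist in canonical.items():
        d = wasserstein_distance(bin_centers, bin_centers,
                                 subject_hist, hist)
        if d < best_d:
            best_grade, best_d = grade, d
    return best_grade
```

The EMD is attractive here because it penalizes mass moved between distant attenuation bins more than mass moved between neighbouring ones, unlike a bin-wise chi-square comparison.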

  7. A rule-based expert system for the automatic classification of DNA "ploidy" histograms measured by the CAS 200 image analysis system.

    PubMed

    Marchevsky, A M; Truong, H; Tolmachoff, T

    1997-02-15

    DNA "ploidy" histogram interpretation is one of the most important sources of variation in DNA image cytometry and is influenced by multiple technical factors such as scaling, selection of peaks, and variable classification criteria. A rule-based expert system was developed to automate and eliminate subjectivity from this interpretative process. Ninety-eight Feulgen-stained histologic sections from patients with breast, colon, and lung cancer were measured with the CAS 200 image analysis system (Becton Dickinson, Santa Clara, CA); they included diploid (n = 42), aneuploid (n = 46), tetraploid (n = 7), and multiploid (n = 3) examples. The data were converted from listmode format into ASCII with the aid of CELLSHEET software (JVC Imaging, Elmhurst, IL). Individual microphotometric nuclear measurements were sorted into one of 64 bins based on DNA index. The 64 bins were then divided into 5 semi-arbitrarily defined ranges: hypodiploid, diploid, aneuploid, tetraploid, and hypertetraploid. The nuclear percentages in each range were calculated with EXCEL 4.0 (Microsoft, Redmond, WA). The histograms were divided into 2 equal sets: training and testing. The data from the training set were used to develop 16 IF-THEN rules to classify the histograms as diploid, aneuploid, or tetraploid. A macro was programmed in EXCEL to automate all these operations. The rule-based expert system correctly classified 45/50 histograms of the training set. Two tetraploid histograms were classified as aneuploid. Three multiploid histograms were classified as tetraploid. All histograms in the testing set were correctly classified by the expert system. The potential role of rule-based expert system technology for the objective classification of DNA "ploidy" histograms measured by image cytometry is discussed. PMID:9056741
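    The flavor of such a rule base can be sketched with a few invented IF-THEN rules over the five range percentages; the thresholds below are illustrative only, not the published 16 rules:

```python
def classify_ploidy(fractions):
    """Tiny illustrative stand-in for a DNA-ploidy rule base.
    `fractions` maps the five DNA-index ranges (hypodiploid, diploid,
    aneuploid, tetraploid, hypertetraploid) to nuclear percentages.
    All thresholds are hypothetical."""
    if fractions["tetraploid"] > 15:
        return "tetraploid"
    if (fractions["aneuploid"] > 10
            or fractions["hypodiploid"] > 5
            or fractions["hypertetraploid"] > 5):
        return "aneuploid"
    return "diploid"
```

Encoding the interpretation as explicit rules is what removes observer subjectivity: every histogram with the same range percentages receives the same label.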

  8. Face recognition with histograms of fractional differential gradients

    NASA Astrophysics Data System (ADS)

    Yu, Lei; Ma, Yan; Cao, Qi

    2014-05-01

    It has been shown that fractional differentiation can enhance edge information and nonlinearly preserve textural detail in an image. This paper investigates its ability for face recognition and presents a local descriptor called histograms of fractional differential gradients (HFDG) to extract facial visual features. HFDG encodes a face image into gradient patterns using multi-orientation fractional differential masks, from which histograms of gradient directions are computed as the face representation. Experimental results on the Yale, face recognition technology (FERET), Carnegie Mellon University pose, illumination, and expression (CMU PIE), and A. Martinez and R. Benavente (AR) databases validate the feasibility of the proposed method and show that HFDG outperforms local binary patterns (LBP), histograms of oriented gradients (HOG), enhanced local directional patterns (ELDP), and Gabor feature-based methods.

  9. Java multi-histogram volume rendering framework for medical images

    NASA Astrophysics Data System (ADS)

    Senseney, Justin; Bokinsky, Alexandra; Cheng, Ruida; McCreedy, Evan; McAuliffe, Matthew J.

    2013-03-01

    This work extends the multi-histogram volume rendering framework proposed by Kniss et al. [1] to provide rendering results based on the impression of overlaid triangles on a graph of image intensity versus gradient magnitude. The developed method of volume rendering allows greater emphasis on boundary visualization while avoiding issues common in medical image acquisition. For example, partial-voluming effects in computed tomography and intensity inhomogeneity of similar tissue types in magnetic resonance imaging introduce pixel values that will not reflect differing tissue types when a standard transfer function is applied to an intensity histogram. This new framework improves upon the Kniss multi-histogram framework by using Java, the GPU, and MIPAV, an open-source medical image processing application, to allow multi-histogram techniques to be widely disseminated; the earlier OpenGL view-aligned texture rendering approach suffered from performance setbacks, inaccessibility, and usability problems. Rendering results can now be interactively compared with other rendering frameworks, surfaces can be extracted for use in other programs, and file formats that are widely used in the field of biomedical imaging can be visualized using this multi-histogram approach. OpenCL and GLSL are used to produce the new multi-histogram approach, leveraging texture memory on the graphics processing unit of desktop computers to provide a new interactive method for visualizing biomedical images. Performance results for this method are generated and qualitative rendering results are compared. The resulting framework provides the opportunity for further applications in medical imaging, both in volume rendering and in generic image processing.

  10. A Novel Method for the Evaluation of Uncertainty in Dose-Volume Histogram Computation

    SciTech Connect

    Henriquez, Francisco Cutanda; Castrillon, Silvia Vargas

    2008-03-15

    Purpose: Dose-volume histograms (DVHs) are a useful tool in state-of-the-art radiotherapy treatment planning, and it is essential to recognize their limitations. Even after a specific dose-calculation model is optimized, dose distributions computed by using treatment-planning systems are affected by several sources of uncertainty, such as algorithm limitations, measurement uncertainty in the data used to model the beam, and residual differences between measured and computed dose. This report presents a novel method to take them into account. Methods and Materials: To take into account the effect of the associated uncertainties, a probabilistic approach using a new kind of histogram, the dose-expected volume histogram, is introduced. The expected value of the volume in the region of interest receiving an absorbed dose equal to or greater than a certain value is found by using the probability distribution of the dose at each point. A rectangular probability distribution is assumed for this point dose, and a formulation that accounts for the uncertainties associated with point dose is presented for practical computations. Results: This method is applied to a set of DVHs for different regions of interest, including 6 brain patients, 8 lung patients, 8 pelvis patients, and 6 prostate patients planned for intensity-modulated radiation therapy. Conclusions: Results show a greater effect on planning target volume coverage than on organs at risk. In cases of steep DVH gradients, such as planning target volumes, this new method shows the largest differences with the corresponding DVH; thus, the effect of the uncertainty is larger.
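    Under the rectangular (uniform) point-dose distribution described above, the expected volume receiving at least dose D has a simple closed form. The sketch below assumes, for brevity, a constant half-width `u` of the uniform interval across all points (the paper's formulation allows point-dependent uncertainty):

```python
import numpy as np

def expected_volume_fraction(doses, u, thresholds):
    """Dose-expected volume histogram: each point dose d is modelled as
    uniform on [d-u, d+u], so P(dose >= D) = clip((d+u-D)/(2u), 0, 1);
    the expected fractional volume receiving at least D is the mean of
    this probability over all points in the region of interest."""
    doses = np.asarray(doses, float)[:, None]
    D = np.asarray(thresholds, float)[None, :]
    p = np.clip((doses + u - D) / (2.0 * u), 0.0, 1.0)  # P(dose >= D) per point
    return p.mean(axis=0)
```

In the limit u → 0 this recovers the ordinary cumulative DVH; larger u smooths the histogram around steep gradients, which is where the paper reports the largest differences.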

  11. An introduction to RHINO: real-time histogram interpretation of numerical observations

    NASA Astrophysics Data System (ADS)

    Chandler, Susan; Lukesh, Gordon

    2006-02-01

    RHINO, Real-time Histogram Interpretation of Numerical Observations, is a specialty algorithm and tool under development for the United States Air Force Office of Scientific Research. The intent is to provide real-time feedback for adaptive control of telescope pointing for ground-space-ground laser illumination experiments. Nukove together with New Mexico State University first established a proof-of-principle laboratory experiment using RHINO and, under a controlled environment, reduction of the pointing error known as boresight was demonstrated. Additionally, the RHINO algorithm successfully predicted a systematic pointing offset due to solar illumination of a satellite. RHINO is resilient to effects such as glints, speckle, and scintillation. The forthcoming commercially available version of RHINO will use real-time field data and provide adaptive control to the user.

  12. On algorithmic optimization of histogramming functions for GEM systems

    NASA Astrophysics Data System (ADS)

    Krawczyk, Rafał D.; Czarski, Tomasz; Kolasinski, Piotr; Poźniak, Krzysztof T.; Linczuk, Maciej; Byszuk, Adrian; Chernyshova, Maryna; Juszczyk, Bartlomiej; Kasprowicz, Grzegorz; Wojenski, Andrzej; Zabolotny, Wojciech

    2015-09-01

    This article concerns optimization methods for data analysis in the X-ray GEM detector system. The offline analysis of collected samples was optimized for MATLAB computation. Compiled C-language functions were used via the MEX interface. Significant speedups were achieved both for the ordering preprocessing and for the histogramming of samples. The techniques used and the results obtained are presented.

  13. Histogram-Based Calibration Method for Pipeline ADCs.

    PubMed

    Son, Hyeonuk; Jang, Jaewon; Kim, Heetae; Kang, Sungho

    2015-01-01

    Measurement and calibration of an analog-to-digital converter (ADC) using a histogram-based method require a large volume of data and a long test duration, especially for a high-resolution ADC. A fast and accurate calibration method for pipelined ADCs is proposed in this research. The proposed calibration method composes histograms from the outputs of each stage and calculates the error sources. The digitized outputs of a stage are directly influenced by the operation of the prior stage, so the resulting histograms provide information about the errors in the prior stage. The composed histograms reduce the number of required samples, and thus the calibration time, and can be implemented with simple modules. For a 14-bit pipelined ADC, the measured maximum integral non-linearity (INL) is improved from 6.78 to 0.52 LSB, and the spurious-free dynamic range (SFDR) and signal-to-noise-and-distortion ratio (SNDR) are improved from 67.0 to 106.2 dB and from 65.6 to 84.8 dB, respectively. PMID:26070196
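    For reference, the classic code-density version of the histogram test, from which DNL and INL such as those quoted above are derived, can be sketched as follows. This is the textbook ramp-stimulus formulation, not the paper's stage-wise method:

```python
import numpy as np

def histogram_test(codes, n_bits):
    """ADC code-density test with a uniform full-scale ramp stimulus:
    every code should appear equally often, so DNL of code k is its
    normalized bin count minus 1, and INL is the running sum of DNL.
    The two saturated end codes are excluded, as is conventional."""
    counts = np.bincount(codes, minlength=2 ** n_bits).astype(float)
    ideal = counts[1:-1].sum() / (2 ** n_bits - 2)  # expected count per code
    dnl = counts[1:-1] / ideal - 1.0
    inl = np.cumsum(dnl)
    return dnl, inl
```

The abstract's point about data volume is visible here: resolving the DNL of each of 2^14 codes to a fraction of an LSB requires many samples per code, which is what the stage-wise histogram method reduces.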

  14. Wildfire Detection Using a Multi-Dimensional Histogram in Boreal Forest

    NASA Astrophysics Data System (ADS)

    Honda, K.; Kimura, K.; Honma, T.

    2008-12-01

    Early detection of wildfires is an issue for reducing damage to the environment and humans. There have been attempts to detect wildfires using satellite imagery, mainly classified into three methods: the Dozier method (1981-), threshold methods (1986-) and contextual methods (1994-). However, the accuracy of these methods is not sufficient: the detected results include commission and omission errors. In addition, it is not easy to analyze satellite imagery with high accuracy because of insufficient ground-truth data. Kudoh and Hosoi (2003) developed a detection method using a three-dimensional (3D) histogram built from past fire data in NOAA-AVHRR imagery, but their method is impractical because it depends on picking past fire data out of huge data sets by hand. The purpose of this study is therefore to collect fire points (hot spots) efficiently from satellite imagery and to improve the method for detecting wildfires with the collected data. We collect past fire data using the Alaska Fire History data obtained from the Alaska Fire Service (AFS): we select points that are expected to be wildfires and keep those inside the fire areas of the AFS data. Next, we build a 3D histogram from the past fire data, using Bands 1, 21 and 32 of MODIS, and calculate a likelihood for detecting wildfires from the three-dimensional histogram. As a result, we can select wildfires effectively with the 3D histogram; we detected a toroidally spreading wildfire, which is evidence of good wildfire detection. However, the areas surrounding glaciers tend to show elevated brightness temperatures, producing false alarms; burnt areas and bare ground are also sometimes flagged, so the method needs further improvement. Additionally, we are trying various combinations of MODIS bands as a better way to detect wildfires effectively. To adapt our method to other areas, we are applying our method to tropical
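    The 3D-histogram likelihood idea can be sketched as follows, assuming placeholder band ranges and a simple max-normalization of cell counts (both assumptions; the study's actual band ranges and likelihood definition are not given in the abstract):

```python
import numpy as np

def fire_likelihood(train, query, bins=8,
                    ranges=((0, 1), (280, 340), (260, 320))):
    """Bin past-fire pixels in (band 1 reflectance, band 21 brightness
    temperature, band 32 brightness temperature) space, then read the
    normalized cell count as a fire likelihood for new pixels.
    `train` and `query` are (N, 3) arrays; band ranges are invented."""
    h, edges = np.histogramdd(train, bins=bins, range=ranges)
    h = h / h.max()                              # normalize counts to [0, 1]
    idx = []
    for q, e in zip(np.asarray(query, float).T, edges):
        # locate each query value's bin along this axis
        i = np.clip(np.searchsorted(e, q, side="right") - 1, 0, bins - 1)
        idx.append(i)
    return h[tuple(idx)]
```

Pixels falling in cells never populated by past fires score zero, which is how the histogram suppresses false alarms from spectrally dissimilar surfaces.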

  15. Science EQUALS Success.

    ERIC Educational Resources Information Center

    Cobb, Kitty B., Ed.; Conwell, Catherine R., Ed.

    The purpose of the EQUALS programs is to increase the interest and awareness that females and minorities have concerning mathematics and science related careers. This book, produced by an EQUALS program in North Carolina, contains 35 hands-on, discovery science activities that center around four EQUALS processes--problem solving, cooperative…

  16. Brain tumor CT attenuation coefficients: semiquantitative analysis of histograms.

    PubMed

    Ratzka, M; Haubitz, I

    1983-01-01

    This paper reports on work in progress on semiquantitative curve analyses of histograms of brain tumors. Separation of statistical groups of attenuation values obtained by computer calculation is done separately from scanning, using histogram printouts as the data input for a programmable calculator. This method is discussed together with its results in 50 cases of malignant gliomas. The detection of hidden tissue portions and the more accurate evaluation of partial enhancement effects have been the investigators' main concerns to the present time; however, this method may allow more specific diagnosis of malignancy and changes in tumor characteristics than visual assessment alone. This has not been proven by studies that have evaluated large numbers of cases, but seems to be worth pursuing as a new approach. PMID:6410783

  17. Fingerprint image segmentation based on multi-features histogram analysis

    NASA Astrophysics Data System (ADS)

    Wang, Peng; Zhang, Youguang

    2007-11-01

    An effective fingerprint image segmentation method based on multi-feature histogram analysis is presented. We extract a new feature and combine it with three existing features to segment fingerprints. Two of these four features, each of which is related to one of the other two, are reciprocals of each other, so the features fall into two groups. The histograms of these two features are computed to determine which feature group should be used to segment the given fingerprint. The features can also divide fingerprints into two classes, of high and low quality. Experimental results show that our algorithm classifies foreground and background effectively at lower computational cost, reduces the number of spurious minutiae detected, and improves the performance of AFIS.

  18. Retrospective Reconstructions of Active Bone Marrow Dose-Volume Histograms

    SciTech Connect

    Veres, Cristina; Allodji, Rodrigue S.; Llanas, Damien; Vu Bezin, Jérémi; Chavaudra, Jean; Mège, Jean Pierre; Lefkopoulos, Dimitri; Quiniou, Eric; Deutsh, Eric; Vathaire, Florent de; Diallo, Ibrahima

    2014-12-01

    Purpose: To present a method for calculating dose-volume histograms (DVHs) to the active bone marrow (ABM) of patients who had undergone radiation therapy (RT) and subsequently developed leukemia. Methods and Materials: The study focuses on 15 patients treated between 1961 and 1996. Whole-body RT planning computed tomographic (CT) data were not available. We therefore generated representative whole-body CTs similar to patient anatomy. In addition, we developed a method enabling us to obtain information on the density distribution of ABM over the whole skeleton. Dose could then be calculated in a series of points distributed all over the skeleton in such a way that their local density reflected age-specific data for ABM distribution. Dose to particular regions and dose-volume histograms of the entire ABM were estimated for all patients. Results: Depending on patient age, the total number of dose calculation points generated ranged from 1,190,970 to 4,108,524. The average dose to ABM ranged from 0.3 to 16.4 Gy. Dose-volume histogram analysis showed that the median doses (D{sub 50%}) ranged from 0.06 to 12.8 Gy. We also evaluated the inhomogeneity of individual patient ABM dose distribution according to clinical situation. It was evident that the coefficient of variation of the dose for the whole ABM ranged from 1.0 to 5.7, which means that the standard deviation could be more than 5 times higher than the mean. Conclusions: For patients with available long-term follow-up data, our method provides reconstruction of dose-volume data comparable to detailed dose calculations, which have become standard in modern CT-based 3-dimensional RT planning. Our strategy of using dose-volume histograms offers new perspectives for retrospective epidemiological studies.
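    A cumulative DVH over a cloud of dose points, as used above, reduces to counting the fraction of points at or above each dose level; a minimal sketch:

```python
import numpy as np

def cumulative_dvh(point_doses, dose_axis):
    """Cumulative dose-volume histogram from per-point dose samples
    whose spatial density follows the active-bone-marrow distribution:
    the value at dose D is the fraction of points receiving >= D."""
    d = np.sort(np.asarray(point_doses, float))
    # searchsorted gives, for each D, the number of points below D
    return 1.0 - np.searchsorted(d, dose_axis, side="left") / d.size
```

Quantities such as D{sub 50%} are then read off as the dose at which this curve crosses 0.5.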

  19. Slope histogram distribution-based parametrisation of Martian geomorphic features

    NASA Astrophysics Data System (ADS)

    Balint, Zita; Székely, Balázs; Kovács, Gábor

    2014-05-01

    The application of geomorphometric methods to the large Martian digital topographic datasets paves the way to analysing Martian areomorphic processes in more detail. One of these methods is the analysis of local slope distributions. For this purpose a visualization program was developed that calculates local slope histograms and compares them based on a Kolmogorov distance criterion. As input data we used digital terrain models (DTMs) derived from HRSC high-resolution stereo camera images of various Martian regions. The Kolmogorov-criterion-based discrimination produces classes of slope histograms that are displayed in different colours, one per class, to obtain an image map. Our goal is to create a local-slope-histogram-based classification of large Martian areas in order to obtain information about the general morphological characteristics of a region. This is a contribution of the TMIS.ascrea project, financed by the Austrian Research Promotion Agency (FFG). The present research was partly realized in the framework of the TÁMOP 4.2.4.A/2-11-1-2012-0001 high-priority "National Excellence Program - Elaborating and Operating an Inland Student and Researcher Personal Support System" convergence program project's scholarship support, using Hungarian state and European Union funds and co-financing from the European Social Fund.

  20. Finding significantly connected voxels based on histograms of connection strengths

    NASA Astrophysics Data System (ADS)

    Kasenburg, Niklas; Pedersen, Morten Vester; Darkner, Sune

    2016-03-01

    We explore a new approach for structural connectivity based segmentations of subcortical brain regions. Connectivity based segmentations are usually based on fibre connections from a seed region to predefined target regions. We present a method for finding significantly connected voxels based on the distribution of connection strengths. Paths from seed voxels to all voxels in a target region are obtained from a shortest-path tractography. For each seed voxel we approximate the distribution with a histogram of path scores. We hypothesise that the majority of estimated connections are false-positives and that their connection strength is distributed differently from true-positive connections. Therefore, an empirical null-distribution is defined for each target region as the average normalized histogram over all voxels in the seed region. Single histograms are then tested against the corresponding null-distribution and significance is determined using the false discovery rate (FDR). Segmentations are based on significantly connected voxels and their FDR. In this work we focus on the thalamus and the target regions were chosen by dividing the cortex into a prefrontal/temporal zone, motor zone, somatosensory zone and a parieto-occipital zone. The obtained segmentations consistently show a sparse number of significantly connected voxels that are located near the surface of the anterior thalamus over a population of 38 subjects.
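
    The final significance step can be illustrated with a standard Benjamini-Hochberg procedure for controlling the false discovery rate (a generic sketch; the paper does not state this exact variant, and the p-values below are invented):

```python
# Generic Benjamini-Hochberg FDR selection (illustrative p-values).
def benjamini_hochberg(pvalues, alpha=0.05):
    """Return the indices of hypotheses rejected at FDR level alpha."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    k_max = 0
    for rank, idx in enumerate(order, start=1):
        if pvalues[idx] <= alpha * rank / m:
            k_max = rank
    return sorted(order[:k_max])

# One p-value per seed voxel, from testing its histogram against the null.
significant = benjamini_hochberg([0.001, 0.4, 0.02, 0.9], alpha=0.05)
```

    Voxels surviving this selection are the "significantly connected" ones that form the segmentation.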

  1. Implementing a 3D histogram version of the Energy-Test in ROOT

    NASA Astrophysics Data System (ADS)

    Cohen, E. O.; Reid, I. D.; Piasetzky, E.

    2016-08-01

    Comparing simulation and data histograms is of interest in nuclear and particle physics experiments; however, the leading three-dimensional histogram comparison tool available in ROOT, the 3D Kolmogorov-Smirnov test, exhibits shortcomings. Throughout the following, we present and discuss the implementation of an alternative comparison test for three-dimensional histograms, based on the Energy-Test by Aslan and Zech.

  2. Content based Image Retrieval based on Different Global and Local Color Histogram Methods: A Survey

    NASA Astrophysics Data System (ADS)

    Suhasini, Pallikonda Sarah; Sri Rama Krishna, K.; Murali Krishna, I. V.

    2016-06-01

    Different global and local color histogram methods for content based image retrieval (CBIR) are investigated in this paper. The color histogram is a widely used descriptor for CBIR. The conventional method of extracting a color histogram is global, which misses the spatial content, is less invariant to deformation and viewpoint changes, and results in a very large three-dimensional histogram corresponding to the color space used. To address these deficiencies, different global and local histogram methods have been proposed in recent research: different ways of extracting local histograms to capture spatial correspondence, invariant color histograms to add deformation and viewpoint invariance, and fuzzy linking to reduce the size of the histogram. The color space and the distance metric used are vital in obtaining the color histogram. In this paper the performance of CBIR based on different global and local color histograms is surveyed in three color spaces, namely RGB, HSV and L*a*b*, and with three distance measures, Euclidean, quadratic and histogram intersection, in order to choose an appropriate method for future research.
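
    Two of the surveyed distance measures are simple enough to sketch directly (toy normalized histograms; the quadratic-form distance, which needs a bin-similarity matrix, is omitted):

```python
# Sketch of two histogram distance/similarity measures used in CBIR.
def euclidean(h1, h2):
    """L2 distance between two normalized histograms."""
    return sum((a - b) ** 2 for a, b in zip(h1, h2)) ** 0.5

def intersection_similarity(h1, h2):
    """Histogram intersection: 1.0 for identical normalized histograms."""
    return sum(min(a, b) for a, b in zip(h1, h2))

query = [0.5, 0.3, 0.2]   # toy 3-bin colour histograms
match = [0.5, 0.3, 0.2]
other = [0.1, 0.1, 0.8]
```

    Ranking database histograms by either measure against the query histogram is the retrieval step itself.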

  3. Equal Justice Under Law.

    ERIC Educational Resources Information Center

    Johnson, Earl, Jr., Ed.

    1994-01-01

    This special theme issue of "Update on Law-Related Education" "tells about the past, present, and future of equal legal representation for all in our society." It is dedicated to the history and heroes of legal aid for the poor and the need to further that cause if the United States hopes to achieve equal justice for all. In his foreword, Justice…

  4. Early Understanding of Equality

    ERIC Educational Resources Information Center

    Leavy, Aisling; Hourigan, Mairéad; McMahon, Áine

    2013-01-01

    Quite a bit of the arithmetic in elementary school contains elements of algebraic reasoning. After researching and testing a number of instructional strategies with Irish third graders, these authors found effective methods for cultivating a relational concept of equality in third-grade students. Understanding equality is fundamental to algebraic…

  5. Equality and Economy

    ERIC Educational Resources Information Center

    Brink, Chris

    2012-01-01

    The two big events in higher education during 2010 were the implementation of the Equality Act, and the introduction of a new dispensation on fees and funding. The former is intended to promote equality, the latter is premised on the need for economy. In this article, the author focuses on the effect of the latter on the former. He considers this…

  6. Equality, Innovation and Diversity.

    ERIC Educational Resources Information Center

    Smith, Janet

    1999-01-01

    Offers some ideas concerning promotion of gender equality and diversity within European Union-funded programs and activities. Reviews efforts since the 1970s to foster equal access in European schools and universities, examines some principles of innovation and entrepreneurship, and considers stages in diversity policy development. (DB)

  7. Integer Equal Flows

    SciTech Connect

    Meyers, C A; Schulz, A S

    2009-01-07

    The integer equal flow problem is an NP-hard network flow problem, in which all arcs in given sets R_1, ..., R_ℓ must carry equal flow. We show this problem is effectively inapproximable, even if the cardinality of each set R_k is two. When ℓ is fixed, it is solvable in polynomial time.

  8. Robust human intrusion detection technique using hue-saturation histograms

    NASA Astrophysics Data System (ADS)

    Hassan, Waqas; Mitra, Bhargav; Bangalore, Nagachetan; Birch, Philip; Young, Rupert; Chatwin, Chris

    2011-04-01

    A robust human intrusion detection technique using hue-saturation histograms is presented in this paper. Initially a region of interest (ROI) is manually identified in the scene viewed by a single fixed CCTV camera. All objects in the ROI are automatically demarcated from the background using brightness and chromaticity distortion parameters. The segmented objects are then tracked using correlation between hue-saturation based bivariate distributions. The technique has been applied on all the 'Sterile Zone' sequences of the United Kingdom Home Office iLIDS dataset and its performance is evaluated with over 70% positive results.
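
    The tracking step, correlating hue-saturation bivariate distributions, might be sketched as below (bin counts and the normalized-correlation form are our assumptions, not the authors' implementation):

```python
# Sketch: 2D hue-saturation histograms compared by normalized correlation.
import math

def hs_histogram(pixels, h_bins=8, s_bins=8):
    """2D histogram of (hue, saturation); hue in [0, 360), sat in [0, 1]."""
    hist = [[0.0] * s_bins for _ in range(h_bins)]
    for h, s in pixels:
        hi = min(int(h / 360.0 * h_bins), h_bins - 1)
        si = min(int(s * s_bins), s_bins - 1)
        hist[hi][si] += 1
    return hist

def correlation(h1, h2):
    """Normalized cross-correlation between two flattened 2D histograms."""
    a = [v for row in h1 for v in row]
    b = [v for row in h2 for v in row]
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) *
                    sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0
```

    An object segmented in one frame is matched to candidates in the next frame by picking the candidate whose histogram correlates most strongly.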

  9. Image segmentation using random-walks on the histogram

    NASA Astrophysics Data System (ADS)

    Morin, Jean-Philippe; Desrosiers, Christian; Duong, Luc

    2012-02-01

    This document presents a novel method for the problem of image segmentation, based on random-walks. This method shares similarities with the Mean-shift algorithm, as it finds the modes of the intensity histogram of images. However, unlike Mean-shift, our proposed method is stochastic and also provides class membership probabilities. Also, unlike other random-walk based methods, our approach does not require any form of user interaction, and can scale to very large images. To illustrate the usefulness, efficiency and scalability of our method, we test it on the task of segmenting anatomical structures present in cardiac CT and brain MRI images.

  10. Classification of CT-brain slices based on local histograms

    NASA Astrophysics Data System (ADS)

    Avrunin, Oleg G.; Tymkovych, Maksym Y.; Pavlov, Sergii V.; Timchik, Sergii V.; Kisała, Piotr; Orakbaev, Yerbol

    2015-12-01

    Neurosurgical intervention is a very complicated process. Modern operating procedures are based on data such as CT, MRI, etc. Automated analysis of these data is an important task for researchers. Some modern methods of brain-slice segmentation use additional information to process these images, which can be obtained by classification. To classify the CT images of the brain, we suggest using local histograms and features extracted from them. The paper shows the process of feature extraction and classification of CT slices of the brain. The feature extraction process is specialized for axial cross-sections of the brain. The work can be applied to medical neurosurgical systems.

  11. Density Equalizing Map Projections

    SciTech Connect

    Close, E. R.; Merrill, D. W.; Holmes, H. H.

    1995-07-01

    A geographic map is mathematically transformed so that the subareas of the map are proportional to a given quantity such as population. In other words, population density is equalized over the entire map. The transformed map can be used as a display tool, or it can be statistically analyzed. For example, cases of disease plotted on the transformed map should be uniformly distributed at random, if disease rates are everywhere equal. Geographic clusters of disease can be readily identified, and their statistical significance determined, on a density equalized map.

  12. Equalization in redundant channels

    NASA Technical Reports Server (NTRS)

    Tulpule, Bhalchandra R. (Inventor); Collins, Robert E. (Inventor); Cominelli, Donald F. (Inventor); O'Neill, Richard D. (Inventor)

    1988-01-01

    A miscomparison between a channel's configuration data base and a voted system configuration data base in a redundant channel system having identically operating, frame synchronous channels triggers autoequalization of the channel's historical signal data bases in a hierarchical, chronological manner with that of a correctly operating channel. After equalization, symmetrization of the channel's configuration data base with that of the system permits upgrading of the previously degraded channel to full redundancy. An externally provided equalization command, e.g., manually actuated, can also trigger equalization.

  13. Density Equalizing Map Projections

    1995-07-01

    A geographic map is mathematically transformed so that the subareas of the map are proportional to a given quantity such as population. In other words, population density is equalized over the entire map. The transformed map can be used as a display tool, or it can be statistically analyzed. For example, cases of disease plotted on the transformed map should be uniformly distributed at random, if disease rates are everywhere equal. Geographic clusters of disease can be readily identified, and their statistical significance determined, on a density equalized map.

  14. Lean histogram of oriented gradients features for effective eye detection

    NASA Astrophysics Data System (ADS)

    Sharma, Riti; Savakis, Andreas

    2015-11-01

    Reliable object detection is very important in computer vision and robotics applications. The histogram of oriented gradients (HOG) is established as one of the most popular hand-crafted features, which along with support vector machine (SVM) classification provides excellent performance for object recognition. We investigate dimensionality reduction on HOG features in combination with SVM classifiers to obtain efficient feature representation and improved classification performance. In addition to lean HOG features, we explore descriptors resulting from dimensionality reduction on histograms of binary descriptors. We consider three dimensionality reduction techniques: standard principal component analysis; random projections, a computationally efficient linear mapping that is data independent; and locality preserving projections (LPP), which learns the manifold structure of the data. Our methods focus on the application of eye detection and were tested on an eye database created using the BioID and FERET face databases. Our results indicate that manifold learning is beneficial to classification utilizing HOG features. To demonstrate the broader usefulness of lean HOG features for object class recognition, we evaluated our system's classification performance on the CalTech-101 dataset with favorable outcomes.
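
    Of the three techniques, random projection is the easiest to sketch: a data-independent Gaussian matrix maps a long HOG descriptor to a much shorter one while approximately preserving distances (the dimensions, seed, and toy feature below are illustrative, not the paper's configuration):

```python
# Sketch: data-independent random projection of a HOG-like descriptor.
import random

def random_projection_matrix(d_in, d_out, seed=0):
    """Gaussian projection matrix scaled by 1/sqrt(d_out)."""
    rng = random.Random(seed)
    scale = d_out ** -0.5
    return [[rng.gauss(0.0, 1.0) * scale for _ in range(d_in)]
            for _ in range(d_out)]

def project(matrix, feature):
    """Map a d_in-dimensional descriptor to d_out dimensions."""
    return [sum(w * x for w, x in zip(row, feature)) for row in matrix]

hog_like = [float(i % 9) for i in range(3780)]  # toy 3780-d descriptor
lean = project(random_projection_matrix(3780, 128), hog_like)
```

    The projected vectors then feed the SVM classifier in place of the full HOG features.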

  15. In vivo resolution of oligomers with fluorescence photobleaching recovery histograms

    PubMed Central

    Youn, B.S.; Lepock, J.R.; Borrelli, M.J.; Jervis, E.J.

    2006-01-01

    Simple independent enzyme-catalyzed reactions distributed homogeneously throughout an aqueous environment cannot adequately explain the regulation of metabolic and other cellular processes in vivo. Such an unstructured system results in unacceptably slow substrate turnover rates and consumes inordinate amounts of cellular energy. Current approaches to resolving compartmentalization in living cells require the partitioning of the molecular species in question such that its localization can be resolved with fluorescence microscopy. Standard imaging approaches will not resolve localization of protein activity for proteins that are ubiquitously distributed, but whose function requires a change in state of the protein. The small heat shock protein sHSP27 exists as both dimers and large multimers and is distributed homogeneously throughout the cytoplasm. A fusion of the green fluorescent protein variant S65T and sHSP27 is used to assess the ability of diffusion rate histograms to resolve compartmentalization of the 2 dominant oligomeric species of sHSP27. Diffusion rates were measured by multiphoton fluorescence photobleaching recovery. Under physiologic conditions, diffusion rate histograms resolved at least 2 diffusive transport rates within a living cell potentially corresponding to the large and small oligomers of sHSP27. Given that oligomerization is often a means of regulation, compartmentalization of different oligomer species could provide a means for efficient regulation and localization of sHSP27 activity. PMID:16817323

  16. Microcanonical thermostatistics analysis without histograms: Cumulative distribution and Bayesian approaches

    NASA Astrophysics Data System (ADS)

    Alves, Nelson A.; Morero, Lucas D.; Rizzi, Leandro G.

    2015-06-01

    Microcanonical thermostatistics analysis has become an important tool to reveal essential aspects of phase transitions in complex systems. An efficient way to estimate the microcanonical inverse temperature β(E) and the microcanonical entropy S(E) is achieved with the statistical temperature weighted histogram analysis method (ST-WHAM). The strength of this method lies in its flexibility, as it can be used to analyse data produced by algorithms with generalised sampling weights. However, for any sampling weight, ST-WHAM requires the calculation of derivatives of energy histograms H(E), which leads to non-trivial and tedious binning tasks for models with a continuous energy spectrum such as those for biomolecular and colloidal systems. Here, we discuss two alternative methods that avoid the need for such energy binning to obtain continuous estimates for H(E) in order to evaluate β(E) by using ST-WHAM: (i) a series expansion to estimate probability densities from the empirical cumulative distribution function (CDF), and (ii) a Bayesian approach to model this CDF. Comparison with a simple linear regression method is also carried out. The performance of these approaches is evaluated considering coarse-grained protein models for folding and peptide aggregation.

  17. The Transition Matrix in Flat-histogram Sampling

    NASA Astrophysics Data System (ADS)

    Brown, Gregory; Eisenbach, M.; Li, Y. W.; Stocks, G. M.; Nicholson, D. M.; Odbadrakh, Kh.; Rikvold, P. A.

    2015-03-01

    Calculating the thermodynamic density of states (DOS) via flat-histogram sampling is a powerful numerical method for characterizing the temperature-dependent properties of materials. Since the calculated DOS is refined directly from the statistics of the sampling, methods of accelerating the sampling, e.g. through windowing and slow forcing, skew the resulting DOS. Calculating the infinite-temperature transition matrix during the flat-histogram sampling decouples the sampling from estimating the DOS, and allows the techniques of Transition Matrix Monte Carlo to be applied. This enables the calculation of the properties for very large system sizes and thus finite-size scaling analysis of the specific heat, magnetic susceptibility, and cumulant crossings at critical points. We discuss these developments in the context of models for magnetocaloric and spin-crossover materials. This work was performed at the Oak Ridge National Laboratory, which is managed by UT-Battelle for the U.S. Department of Energy. It was sponsored by the U.S. Department of Energy, Office of Basic Energy Sciences, Office of Advanced Scientific Computing Research, and the Oak Ridge Leadership Computing Facility. PAR is supported by the National Science Foundation.

  18. A preliminary evaluation of histogram-based binarization algorithms

    SciTech Connect

    Kanai, Junichi; Grover, K.

    1995-04-01

    To date, most Optical Character Recognition (OCR) systems process binary document images, and the quality of the input image strongly affects their performance. Since a binarization process is inherently lossy, different algorithms typically produce different binary images from the same gray scale image. The objective of this research is to study effects of global binarization algorithms on the performance of OCR systems. Several binarization methods were examined: the best fixed threshold value for the data set, the ideal histogram method, and Otsu's algorithm. Four contemporary OCR systems and 50 hard copy pages containing 91,649 characters were used in the experiments. These pages were digitized at 300 dpi and 8 bits/pixel, and 36 different threshold values (ranging from 59 to 199 in increments of 4) were used. The resulting 1,800 binary images were processed by all four OCR systems. All systems made approximately 40% more errors from images generated by Otsu's method than those of the ideal histogram method. Two of the systems made approximately the same number of errors from images generated by the best fixed threshold value and Otsu's method.
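
    For reference, Otsu's global method chooses the threshold that maximizes the between-class variance of the grey-level histogram; a compact sketch (the test histogram is synthetic, not from the experiments above):

```python
# Otsu's global thresholding on a 256-bin grey-level histogram.
def otsu_threshold(hist):
    """Return the grey level maximizing between-class variance."""
    total = sum(hist)
    sum_all = sum(i * h for i, h in enumerate(hist))
    w_b = sum_b = 0.0
    best_t, best_var = 0, -1.0
    for t in range(len(hist)):
        w_b += hist[t]          # background weight
        if w_b == 0:
            continue
        w_f = total - w_b       # foreground weight
        if w_f == 0:
            break
        sum_b += t * hist[t]
        m_b = sum_b / w_b
        m_f = (sum_all - sum_b) / w_f
        var_between = w_b * w_f * (m_b - m_f) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Synthetic bimodal histogram with peaks at grey levels 50 and 200.
hist = [0] * 256
hist[50] = hist[200] = 100
```

    On a cleanly bimodal histogram the selected threshold separates the two peaks, which is the behaviour the study compares against the ideal histogram method.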

  19. Nonrigid registration of joint histograms for intensity standardization in magnetic resonance imaging.

    PubMed

    Jäger, Florian; Hornegger, Joachim

    2009-01-01

    A major disadvantage of magnetic resonance imaging (MRI) compared to other imaging modalities like computed tomography is the fact that its intensities are not standardized. Our contribution is a novel method for MRI signal intensity standardization of arbitrary MRI scans, so as to create a pulse sequence dependent standard intensity scale. The proposed method is the first approach that uses the properties of all acquired images jointly (e.g., T1- and T2-weighted images). The image properties are stored in multidimensional joint histograms. In order to normalize the probability density function (pdf) of a newly acquired data set, a nonrigid image registration is performed between a reference and the joint histogram of the acquired images. From this matching a nonparametric transformation is obtained, which describes a mapping between the corresponding intensity spaces and subsequently adapts the image properties of the newly acquired series to a given standard. As the proposed intensity standardization is based on the probability density functions of the data sets only, it is independent of spatial coherence or prior segmentations of the reference and current images. Furthermore, it is not designed for a particular application, body region or acquisition protocol. The evaluation was done using two different settings. First, MRI head images were used, hence the approach can be compared to state-of-the-art methods. Second, whole body MRI scans were used. For this modality no other normalization algorithm is known in literature. The Jeffrey divergence of the pdfs of the whole body scans was reduced by 45%. All used data sets were acquired during clinical routine and thus included pathologies. PMID:19116196

  20. The retina dose-area histogram: a metric for quantitatively comparing rival eye plaque treatment options

    PubMed Central

    2013-01-01

    Purpose Episcleral plaques have a history of over a half century in the delivery of radiation therapy to intraocular tumors such as choroidal melanoma. Although the tumor control rate is high, vision-impairing complications subsequent to treatment remain an issue. Notable late complications are radiation retinopathy and maculopathy. The obvious way to reduce the risk of radiation damage to the retina is to conform the prescribed isodose surface to the tumor base and to reduce the dose delivered to the surrounding healthy retina, especially the macula. Using a fusion of fundus photography, ultrasound and CT images, tumor size, shape and location within the eye can be accurately simulated as part of the radiation planning process. In this work an adaptation of the dose-volume histogram (DVH), the retina dose-area histogram (RDAH), is introduced as a metric to help compare rival plaque designs and conformal treatment planning options with the goal of reducing radiation retinopathy. Material and methods The RDAH is calculated by transforming a digitized fundus-photo collage of the tumor into a rasterized polar map of the retinal surface known as a retinal diagram (RD). The perimeter of the tumor base is digitized on the RD and its area computed. Area and radiation dose are calculated for every pixel in the RD. Results The areal resolution of the RDAH is a function of the pixel resolution of the raster image used to display the RD and the number of polygon edges used to digitize the perimeter of the tumor base. A practical demonstration is presented. Conclusions The RDAH provides a quantitative metric by which episcleral plaque treatment plan options may be evaluated and compared in order to confirm adequate dosimetric coverage of the tumor and margin, and to help minimize dose to the macula and retina. PMID:23634152

  1. Technology--The Equalizer.

    ERIC Educational Resources Information Center

    Sloane, Eydie

    1989-01-01

    This article describes a number of computer-based learning tools for disabled students. Adaptive input devices, assisted technologies, software, and hardware and software resources are discussed. (IAH)

  2. An efficient Earth Mover's Distance algorithm for robust histogram comparison.

    PubMed

    Ling, Haibin; Okada, Kazunori

    2007-05-01

    We propose EMD-L1: a fast and exact algorithm for computing the Earth Mover's Distance (EMD) between a pair of histograms. The efficiency of the new algorithm enables its application to problems that were previously prohibitive due to high time complexities. The proposed EMD-L1 significantly simplifies the original linear programming formulation of EMD. Exploiting the L1 metric structure, the number of unknown variables in EMD-L1 is reduced to O(N) from O(N^2) of the original EMD for a histogram with N bins. In addition, the number of constraints is reduced by half and the objective function of the linear program is simplified. Formally, without any approximation, we prove that the EMD-L1 formulation is equivalent to the original EMD with a L1 ground distance. To perform the EMD-L1 computation, we propose an efficient tree-based algorithm, Tree-EMD. Tree-EMD exploits the fact that a basic feasible solution of the simplex algorithm-based solver forms a spanning tree when we interpret EMD-L1 as a network flow optimization problem. We empirically show that this new algorithm has an average time complexity of O(N^2), which significantly improves the best reported supercubic complexity of the original EMD. The accuracy of the proposed methods is evaluated by experiments for two computation-intensive problems: shape recognition and interest point matching using multidimensional histogram-based local features. For shape recognition, EMD-L1 is applied to compare shape contexts on the widely tested MPEG7 shape data set, as well as an articulated shape data set. For interest point matching, SIFT, shape context and spin image are tested on both synthetic and real image pairs with large geometrical deformation, illumination change, and heavy intensity noise. The results demonstrate that our EMD-L1-based solutions outperform previously reported state-of-the-art features and distance measures in solving the two tasks. PMID:17356203
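
    A quick intuition for why the L1 ground distance simplifies matters: for one-dimensional histograms of equal total mass, the EMD reduces to the L1 distance between their cumulative histograms (this classic 1D identity is a sketch, not the paper's EMD-L1 tree algorithm, which handles multidimensional histograms):

```python
# 1D special case: EMD equals the L1 distance between cumulative histograms.
def emd_1d(h1, h2):
    """EMD between two 1D histograms with equal total mass."""
    c1 = c2 = total = 0.0
    for a, b in zip(h1, h2):
        c1 += a
        c2 += b
        total += abs(c1 - c2)
    return total
```

    Moving one unit of mass across two bins costs 2 under the L1 ground distance, exactly what the cumulative-difference sum reports.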

  3. Implementing a 3D histogram version of the Energy-Test in ROOT

    NASA Astrophysics Data System (ADS)

    Cohen, E. O.; Reid, I. D.; Piasetzky, E.

    2016-08-01

    Comparing simulation and data histograms is of interest in nuclear and particle physics experiments; however, the leading three-dimensional histogram comparison tool available in ROOT, the 3D Kolmogorov-Smirnov test, exhibits shortcomings. Throughout the following, we present and discuss the implementation of an alternative comparison test for three-dimensional histograms, based on the Energy-Test by Aslan and Zech. The software package can be found at http://www-nuclear.tau.ac.il/ecohen/.

  4. Fast and fully automatic phalanx segmentation using a grayscale-histogram morphology algorithm

    NASA Astrophysics Data System (ADS)

    Hsieh, Chi-Wen; Liu, Tzu-Chiang; Jong, Tai-Lang; Chen, Chih-Yen; Tiu, Chui-Mei; Chan, Din-Yuen

    2011-08-01

    Bone age assessment is a common radiological examination used in pediatrics to diagnose the discrepancy between the skeletal and chronological age of a child; therefore, it is beneficial to develop a computer-based bone age assessment to help junior pediatricians estimate bone age easily. Unfortunately, the phalanx on radiograms is not easily separated from the background and soft tissue. Therefore, we proposed a new method, called the grayscale-histogram morphology algorithm, to segment the phalanges fast and precisely. The algorithm includes three parts: a tri-stage sieve algorithm used to eliminate the background of hand radiograms, a centroid-edge dual scanning algorithm to frame the phalanx region, and finally a segmentation algorithm based on a disk traverse-subtraction filter to segment the phalanx. Moreover, two more segmentation methods, adaptive two-mean and adaptive two-mean clustering, were performed, and their results were compared with the segmentation algorithm based on the disk traverse-subtraction filter using five indices comprising misclassification error, relative foreground area error, modified Hausdorff distances, edge mismatch, and region nonuniformity. In addition, the CPU time of the three segmentation methods was discussed. The result showed that our method had a better performance than the other two methods. Furthermore, satisfactory segmentation results were obtained with a low standard error.

  5. Violence detection based on histogram of optical flow orientation

    NASA Astrophysics Data System (ADS)

    Yang, Zhijie; Zhang, Tao; Yang, Jie; Wu, Qiang; Bai, Li; Yao, Lixiu

    2013-12-01

    In this paper, we propose a novel approach for violence detection and localization in a public scene. Currently, violence detection is considerably under-researched compared with the common action recognition. Although existing methods can detect the presence of violence in a video, they cannot precisely locate the regions in the scene where violence is happening. This paper will tackle the challenge and propose a novel method to locate the violence location in the scene, which is important for public surveillance. The Gaussian Mixture Model is extended into the optical flow domain in order to detect candidate violence regions. In each region, a new descriptor, Histogram of Optical Flow Orientation (HOFO), is proposed to measure the spatial-temporal features. A linear SVM is trained based on the descriptor. The performance of the method is demonstrated on the publicly available data sets, BEHAVE and CAVIAR.
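
    The HOFO descriptor itself can be sketched as a magnitude-weighted histogram of flow orientations (the bin count and weighting scheme are our illustrative assumptions):

```python
# Sketch: Histogram of Optical Flow Orientation over (u, v) flow vectors.
import math

def hofo(flow, n_bins=8):
    """Bin each flow vector's angle, weighted by magnitude; unit-normalize."""
    hist = [0.0] * n_bins
    for u, v in flow:
        mag = math.hypot(u, v)
        if mag == 0.0:
            continue
        angle = math.atan2(v, u) % (2 * math.pi)
        hist[min(int(angle / (2 * math.pi) * n_bins), n_bins - 1)] += mag
    total = sum(hist)
    return [h / total for h in hist] if total else hist
```

    Descriptors of this form, computed per candidate region, are what the linear SVM is trained on.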

  6. Transhumanism and moral equality.

    PubMed

    Wilson, James

    2007-10-01

    Conservative thinkers such as Francis Fukuyama have produced a battery of objections to the transhumanist project of fundamentally enhancing human capacities. This article examines one of these objections, namely that by allowing some to greatly extend their capacities, we will undermine the fundamental moral equality of human beings. I argue that this objection is groundless: once we understand the basis for human equality, it is clear that anyone who now has sufficient capacities to count as a person from the moral point of view will continue to count as one even if others are fundamentally enhanced; and it is mistaken to think that a creature which had even far greater capacities than an unenhanced human being should count as more than an equal from the moral point of view. PMID:17845448

  7. Using histograms to introduce randomization in the generation of ensembles of decision trees

    DOEpatents

    Kamath, Chandrika; Cantu-Paz, Erick; Littau, David

    2005-02-22

    A system for decision tree ensembles that includes a module to read the data, a module to create a histogram, a module to evaluate a potential split according to some criterion using the histogram, a module to select a split point randomly in an interval around the best split, a module to split the data, and a module to combine multiple decision trees in ensembles. The decision tree method includes the steps of reading the data; creating a histogram; evaluating a potential split according to some criterion using the histogram, selecting a split point randomly in an interval around the best split, splitting the data, and combining multiple decision trees in ensembles.
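
    The split-selection step might be sketched as follows: score candidate splits at histogram bin boundaries by a criterion (weighted Gini impurity here), then draw the actual split point uniformly from an interval around the best boundary (criterion, bin count, interval width, and binary 0/1 labels are illustrative assumptions, not the patented implementation):

```python
# Sketch: histogram-based split evaluation with a randomized split point.
import random

def histogram_split(values, labels, n_bins=10, rng=random):
    """Pick a split: score bin boundaries by Gini impurity, then sample
    the final split point from an interval around the best boundary."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / n_bins
    boundaries = [lo + k * width for k in range(1, n_bins)]

    def gini(subset):                       # binary 0/1 labels assumed
        n = len(subset)
        if n == 0:
            return 0.0
        p = sum(subset) / n
        return 2 * p * (1 - p)

    best_b, best_score = boundaries[0], float("inf")
    for b in boundaries:
        left = [l for v, l in zip(values, labels) if v <= b]
        right = [l for v, l in zip(values, labels) if v > b]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(values)
        if score < best_score:
            best_score, best_b = score, b
    # Randomization step: sample around the best boundary.
    return rng.uniform(best_b - width / 2, best_b + width / 2)
```

    Repeating this randomized choice per tree is what decorrelates the members of the ensemble.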

  8. Custom IC/Embedded IP design for histogram in video processing application

    NASA Astrophysics Data System (ADS)

    Pandey, Manoj; Chaturvedi, Richa; Rai, S. K.

    2016-03-01

    The histogram is an integral part of video processing applications. In either design method, ASIC or embedded, histogram computation is an important functional block. This paper proposes a custom Integrated Circuit (IC), as an ASIC, and an embedded IP to compute the colored histogram function. Histogram computation has two features: color and spatial. The color feature is calculated using find_bin and the spatial feature is calculated using a kernel function. The design is verified using the Cadence NCSIM tool, while it is synthesized using RTL Compiler. Finally, the embedded IP is interfaced with a kernel-based mean shift algorithm for tracking a moving object and implemented on a Xilinx Spartan 6 LX150T FPGA.

  9. Infrared image segmentation method based on spatial coherence histogram and maximum entropy

    NASA Astrophysics Data System (ADS)

    Liu, Songtao; Shen, Tongsheng; Dai, Yao

    2014-11-01

    In order to segment the target well and suppress background noises effectively, an infrared image segmentation method based on the spatial coherence histogram and maximum entropy is proposed. First, the spatial coherence histogram is constructed by weighting the importance of the different positions of pixels with the same gray level, which is obtained by computing their local density. Then, after enhancing the image by the spatial coherence histogram, a 1D maximum entropy method is used to segment the image. The novel method not only gets better segmentation results, but also has a faster computation time than traditional 2D histogram-based segmentation methods.
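
    The 1D maximum entropy step can be illustrated with Kapur's criterion, which selects the threshold maximizing the summed entropies of the two grey-level classes (a generic sketch on a synthetic histogram, not the enhanced spatial coherence histogram itself):

```python
# Kapur-style 1D maximum entropy thresholding of a grey-level histogram.
import math

def max_entropy_threshold(hist):
    """Return the bin index maximizing background + foreground entropy."""
    total = sum(hist)
    p = [h / total for h in hist]
    best_t, best_h = 0, -1.0
    for t in range(1, len(p)):
        w0 = sum(p[:t])
        w1 = 1.0 - w0
        if w0 <= 0 or w1 <= 0:
            continue
        h0 = -sum(q / w0 * math.log(q / w0) for q in p[:t] if q > 0)
        h1 = -sum(q / w1 * math.log(q / w1) for q in p[t:] if q > 0)
        if h0 + h1 > best_h:
            best_h, best_t = h0 + h1, t
    return best_t

# Synthetic bimodal histogram: two flat clusters separated by empty bins.
hist = [5, 5, 5, 5, 0, 0, 5, 5, 5, 5]
```

    On such a histogram the maximizing threshold lands in the empty gap between the two grey-level clusters.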

  10. 43 CFR 2201.6 - Value equalization; cash equalization waiver.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 43 Public Lands: Interior 2 2011-10-01 2011-10-01 false Value equalization; cash equalization... PROCEDURES Exchanges-Specific Requirements § 2201.6 Value equalization; cash equalization waiver. (a) To equalize the agreed upon values of the Federal and non-Federal lands involved in an exchange, either...

  11. Equal Opportunity in Housing.

    ERIC Educational Resources Information Center

    Commission on Civil Rights, Washington, DC.

    This overview of developments in housing opportunities for minorities and women includes an historical review of housing discrimination, its nature, and its effects. Federal legislation and Federal actions which were taken to assure equal housing opportunities for women and minorities are described. Other topic areas addressed include minority…

  12. Equality and Academic Subjects

    ERIC Educational Resources Information Center

    Hardarson, Atli

    2013-01-01

    A recent national curriculum guide for upper secondary schools in my home country, Iceland, requires secondary schools to work towards equality and five other overarching aims. This requirement raises questions about the extent to which secondary schools have to change their curricula in order to approach these aims or work towards them in an adequate…

  13. Defining Equality in Education

    ERIC Educational Resources Information Center

    Benson, Ronald E.

    1977-01-01

    Defines equality of education in three areas: 1) by the degree of integration of school systems; 2) by a comparison of material resources and assets in education; and 3) by the effects of schooling as measured by the mean scores of groups on standardized tests. Available from: College of Education, 107 Quadrangle, Iowa State University, Ames, Iowa…

  14. Status Equalization Project.

    ERIC Educational Resources Information Center

    Cohen, Elizabeth G.; Deslonde, James

    The introduction of the Multiple Ability Curriculum (MAC) and Expectation Training (ET) into the curriculum of racially integrated elementary schools appears to improve the equal status interaction between students of differing academic and social status. The goals of the MAC and ET are the following: (1) prevent classroom dominance by students of…

  15. Equal Opportunity in Employment

    ERIC Educational Resources Information Center

    Bullock, Paul

    This book focuses on discrimination in employment, defined as the denial of equal opportunity in the labor market to qualified persons on the basis of race, color, religion, national origin, age, sex, or any other factor not related to their individual qualifications for work. The average nonwhite college graduate can expect to earn less during…

  16. Fast and efficient search for MPEG-4 video using adjacent pixel intensity difference quantization histogram feature

    NASA Astrophysics Data System (ADS)

    Lee, Feifei; Kotani, Koji; Chen, Qiu; Ohmi, Tadahiro

    2010-02-01

    In this paper, a fast search algorithm for MPEG-4 video clips in a video database is proposed. An adjacent pixel intensity difference quantization (APIDQ) histogram, previously applied reliably to human face recognition, is utilized as the feature vector of each VOP (video object plane). Instead of a fully decompressed video sequence, partially decoded data, namely the DC sequence of the video object, are extracted from the video stream. Combined with active search and a temporal pruning algorithm, fast and robust video search can be realized. The proposed search algorithm has been evaluated on a total of 15 hours of video containing TV programs such as dramas, talk shows, and news, searching for 200 given MPEG-4 video clips, each 15 seconds long. Experimental results show that the proposed algorithm can detect a similar video clip in merely 80 ms, with an Equal Error Rate (EER) of 2% in the drama and news categories, making it more accurate and robust than conventional fast video search algorithms.

  17. Freedom, equality, race.

    PubMed

    Ferguson, Jeffrey B

    2011-01-01

    This essay explores some of the reasons for the continuing power of racial categorization in our era, and thus offers some friendly amendments to the more optimistic renderings of the term post-racial. Focusing mainly on the relationship between black and white Americans, it argues that the widespread embrace of the universal values of freedom and equality, which most regard as antidotes to racial exclusion, actually reinforces it. The internal logic of these categories requires the construction of the "other." In America, where freedom and equality still stand at the contested center of collective identity, a history of racial oppression informs the very meaning of these terms. Thus the irony: much of the effort exerted to transcend race tends to fuel continuing division. PMID:21469393

  18. Battery equalization active methods

    NASA Astrophysics Data System (ADS)

    Gallardo-Lozano, Javier; Romero-Cadaval, Enrique; Milanes-Montero, M. Isabel; Guerrero-Martinez, Miguel A.

    2014-01-01

    Many different battery technologies are available for applications that need energy storage. New research is focusing on lithium-based batteries, since they are becoming the most viable option for portable energy storage. As most applications need series battery strings to meet voltage requirements, battery imbalance is an important issue to take into account, since it causes the individual battery voltages to drift apart over time, leading to premature cell degradation, safety hazards, and capacity reduction. A large number of battery equalization methods can be found; they present different advantages and disadvantages and suit different applications. This paper presents a summary, comparison, and evaluation of the different active battery equalization methods, providing a comparison table that helps in selecting the suitable equalization method for a given application. When the same weight is applied to each comparison parameter, the switched capacitor and double-tiered switched capacitor methods have the highest ratio. Cell-bypass methods are cheap, and cell-to-cell methods are efficient. Cell-to-pack, pack-to-cell, and cell-to-pack-to-cell methods present higher cost, size, and control complexity, but relatively low voltage and current stress in high-power applications.

  19. Do you need to compare two histograms not only by eye?

    NASA Astrophysics Data System (ADS)

    Cardiel, N.

    2015-05-01

    Although the use of histograms implies loss of information, since the actual data are replaced by the central values of the considered intervals, this graphical representation is commonly employed in scientific communication, particularly in Astrophysics. Sometimes this kind of comparison is unavoidable when one needs to compare new results with already published data available only in histogram format. Unfortunately, it is not infrequent to find in the literature examples of histogram comparisons where the similarity between the histograms is not statistically quantified but simply justified or discarded ``by eye''. In this poster several methods to quantify the similarity between two histograms are discussed. The availability of statistical packages, such as R (R Core Team 2014, R: A Language and Environment for Statistical Computing, R Foundation for Statistical Computing, Vienna, Austria. URL http://www.R-project.org/), notably simplifies the understanding of the different approaches through the use of numerical simulations.
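As a concrete example of quantifying, rather than eyeballing, histogram similarity, a two-sample chi-square statistic can be computed directly from the binned counts. The numpy sketch below uses the standard form that accommodates histograms with different total counts; the poster's own methods may differ:

```python
import numpy as np

def chi2_two_histograms(h1, h2):
    """Two-sample chi-square statistic for two binned-count histograms.
    The sqrt scale factors allow histograms with different totals; bins
    empty in both histograms are skipped. Returns (statistic, dof)."""
    h1 = np.asarray(h1, dtype=float)
    h2 = np.asarray(h2, dtype=float)
    mask = (h1 + h2) > 0
    n1, n2 = h1.sum(), h2.sum()
    stat = np.sum(
        (np.sqrt(n2 / n1) * h1[mask] - np.sqrt(n1 / n2) * h2[mask]) ** 2
        / (h1[mask] + h2[mask])
    )
    dof = int(mask.sum()) - 1
    return stat, dof
```

Comparing the statistic against a chi-square distribution with `dof` degrees of freedom yields a p-value for the hypothesis that both histograms were drawn from the same underlying distribution.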

  20. Time-cumulated visible and infrared radiance histograms used as descriptors of surface and cloud variations

    NASA Technical Reports Server (NTRS)

    Seze, Genevieve; Rossow, William B.

    1991-01-01

    The spatial and temporal stability of the distributions of satellite-measured visible and infrared radiances, caused by variations in clouds and surfaces, are investigated using bidimensional and monodimensional histograms and time-composite images. Similar analysis of the histograms of the original and time-composite images provides separation of the contributions of the space and time variations to the total variations. The variability of both the surfaces and clouds is found to be larger at scales much larger than the minimum resolved by satellite imagery. This study shows that the shapes of these histograms are distinctive characteristics of the different climate regimes and that particular attributes of these histograms can be related to several general, though not universal, properties of clouds and surface variations at regional and synoptic scales. There are also significant exceptions to these relationships in particular climate regimes. The characteristics of these radiance histograms provide a stable well defined descriptor of the cloud and surface properties.

  1. Equality in Education: An Equality of Condition Perspective

    ERIC Educational Resources Information Center

    Lynch, Kathleen; Baker, John

    2005-01-01

    Transforming schools into truly egalitarian institutions requires a holistic and integrated approach. Using a robust conception of "equality of condition", we examine key dimensions of equality that are central to both the purposes and processes of education: equality in educational and related resources; equality of respect and recognition;…

  2. 43 CFR 2201.6 - Value equalization; cash equalization waiver.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 43 Public Lands: Interior 2 2012-10-01 2012-10-01 false Value equalization; cash equalization... PROCEDURES Exchanges-Specific Requirements § 2201.6 Value equalization; cash equalization waiver. (a) To... as compensation for costs under § 2201.1-3 of this part may not exceed 25 percent of the value of...

  3. 43 CFR 2201.6 - Value equalization; cash equalization waiver.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 43 Public Lands: Interior 2 2013-10-01 2013-10-01 false Value equalization; cash equalization... PROCEDURES Exchanges-Specific Requirements § 2201.6 Value equalization; cash equalization waiver. (a) To... as compensation for costs under § 2201.1-3 of this part may not exceed 25 percent of the value of...

  4. 36 CFR 254.12 - Value equalization; cash equalization waiver.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 36 Parks, Forests, and Public Property 2 2011-07-01 2011-07-01 false Value equalization; cash... AGRICULTURE LANDOWNERSHIP ADJUSTMENTS Land Exchanges § 254.12 Value equalization; cash equalization waiver. (a) To equalize the agreed upon values of the Federal and non-Federal lands involved in an...

  5. Landmark Detection in Orbital Images Using Salience Histograms

    NASA Technical Reports Server (NTRS)

    Wagstaff, Kiri L.; Panetta, Julian; Schorghofer, Norbert; Greeley, Ronald; Pendleton-Hoffer, Mary; Bunte, Melissa

    2010-01-01

    NASA's planetary missions have collected, and continue to collect, massive volumes of orbital imagery. The volume is such that it is difficult to manually review all of the data and determine its significance. As a result, images are indexed and searchable by location and date but generally not by their content. A new automated method analyzes images and identifies "landmarks," or visually salient features such as gullies, craters, dust devil tracks, and the like. This technique uses a statistical measure of salience derived from information theory, so it is not associated with any specific landmark type. It identifies regions that are unusual or that stand out from their surroundings, so the resulting landmarks are context-sensitive areas that can be used to recognize the same area when it is encountered again. A machine learning classifier is used to identify the type of each discovered landmark. Using a specified window size, an intensity histogram is computed for each such window within the larger image (sliding the window across the image). Next, a salience map is computed that specifies, for each pixel, the salience of the window centered at that pixel. The salience map is thresholded to identify landmark contours (polygons) using the upper quartile of salience values. Descriptive attributes are extracted for each landmark polygon: size, perimeter, mean intensity, standard deviation of intensity, and shape features derived from an ellipse fit.
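A minimal sketch of the sliding-window histogram and salience-map steps. The article does not spell out its exact information-theoretic statistic, so Shannon entropy of the local intensity histogram is assumed here as the salience measure; the upper-quartile threshold follows the description above:

```python
import numpy as np

def salience_map(img, win=5, bins=16):
    """Per-pixel salience as the Shannon entropy of the gray-level
    histogram inside a win x win window centred on the pixel.
    Border pixels (where the window does not fit) are left at zero."""
    h, w = img.shape
    r = win // 2
    q = np.minimum((img.astype(float) / 256.0 * bins).astype(int), bins - 1)
    sal = np.zeros((h, w))
    for y in range(r, h - r):
        for x in range(r, w - r):
            patch = q[y - r:y + r + 1, x - r:x + r + 1]
            p = np.bincount(patch.ravel(), minlength=bins) / float(win * win)
            nz = p[p > 0]
            sal[y, x] = -np.sum(nz * np.log2(nz))
    return sal

def landmark_mask(sal):
    """Threshold the salience map at its upper quartile, as in the article."""
    return sal >= np.percentile(sal, 75)
```

Uniform regions get zero salience, while textured or unusual regions stand out; contouring the mask then yields the landmark polygons.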

  6. Using color histograms and SPA-LDA to classify bacteria.

    PubMed

    de Almeida, Valber Elias; da Costa, Gean Bezerra; de Sousa Fernandes, David Douglas; Gonçalves Dias Diniz, Paulo Henrique; Brandão, Deysiane; de Medeiros, Ana Claudia Dantas; Véras, Germano

    2014-09-01

    In this work, a new approach is proposed to verify the differentiating characteristics of five bacteria (Escherichia coli, Enterococcus faecalis, Streptococcus salivarius, Streptococcus oralis, and Staphylococcus aureus) by using digital images obtained with a simple webcam together with variable selection by the Successive Projections Algorithm associated with Linear Discriminant Analysis (SPA-LDA). Color histograms in the red-green-blue (RGB), hue-saturation-value (HSV), and grayscale channels, and their combinations, were used as input data and statistically evaluated using different multivariate classifiers (Soft Independent Modeling by Class Analogy (SIMCA), Principal Component Analysis-Linear Discriminant Analysis (PCA-LDA), Partial Least Squares Discriminant Analysis (PLS-DA), and Successive Projections Algorithm-Linear Discriminant Analysis (SPA-LDA)). The bacterial strains were cultivated in a nutritive blood agar base layer for 24 h following the Brazilian Pharmacopoeia, maintaining the cell growth conditions and the nature of the nutrient solutions the same for all strains. The best classification result was obtained using RGB and SPA-LDA, which reached 94% and 100% classification accuracy in the training and test sets, respectively. This result is extremely positive from the viewpoint of routine clinical analyses, because it avoids bacterial identification based on phenotypic identification of the causative organism using Gram staining, culture, and biochemical proofs. Therefore, the proposed method presents inherent advantages, promoting a simpler, faster, and lower-cost alternative for bacterial identification. PMID:25023972
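The RGB color-histogram input vector can be sketched as follows; the bin count and normalization are illustrative assumptions, and the SPA-LDA variable selection itself is not reproduced here:

```python
import numpy as np

def rgb_histogram_features(img, bins=32):
    """Concatenated, normalized per-channel histograms of an 8-bit RGB
    image (H x W x 3 array), used as the classifier input vector."""
    feats = []
    for c in range(3):  # R, G, B channels
        h, _ = np.histogram(img[..., c], bins=bins, range=(0, 256))
        feats.append(h / h.sum())
    return np.concatenate(feats)
```

Each webcam image of a culture plate is thus reduced to a fixed-length vector (here 96 values) from which SPA-LDA would select the most discriminating bins.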

  7. Wireless Micro-Ball endoscopic image enhancement using histogram information.

    PubMed

    Attar, Abdolrahman; Xie, Xiang; Zhang, Chun; Wang, Zhihua; Yue, Shigang

    2014-01-01

    Wireless endoscopy is an innovative method that has been widely used for gastrointestinal tract examination over the past decade. The Wireless Micro-Ball endoscopy system, with multiple image sensors, is the newest proposed method and can produce a full-view image of the gastrointestinal tract. However, the quality of the images from this new wireless endoscopy system is still not satisfactory: it is hard for doctors and specialists to examine and interpret the captured images, and the image features are not distinct enough for further processing. To enhance these low-contrast endoscopic images, a new enhancement method based on the features and color distribution of endoscopic images is proposed in this work. The enhancement is performed in three main steps: color space transformation, edge-preserving mask formation, and histogram information correction. The luminance component of the CIE Lab, YCbCr, and HSV color spaces is enhanced, and the two other components are then added to form an enhanced color image. The experimental results clearly show the robustness of the method. PMID:25570705

  8. Genes and equality.

    PubMed

    Farrelly, C

    2004-12-01

    The way people think about equality as a value will influence how they think genetic interventions should be regulated. In this paper the author uses the taxonomy of equality put forth by Derek Parfit and applies this to the issue of genetic interventions. It is argued that telic egalitarianism is untenable and that deontic egalitarianism collapses into prioritarianism. The priority view maintains that it is morally more important to benefit the people who are worse off. Once this precision has been given to the concerns egalitarians have, a number of diverse issues must be considered before determining what the just regulation of genetic interventions would be. Consideration must be given to the current situation of the least advantaged, the fiscal realities behind genetic interventions, the budget constraints on other social programmes egalitarians believe should receive scarce public funds, and the interconnected nature of genetic information. These considerations might lead egalitarians to abandon what they take to be the obvious policy recommendations for them to endorse regarding the regulation of gene therapies and enhancements. PMID:15574450

  9. Using the Bootstrap Method for a Statistical Significance Test of Differences between Summary Histograms

    NASA Technical Reports Server (NTRS)

    Xu, Kuan-Man

    2006-01-01

    A new method is proposed to compare statistical differences between summary histograms, which are the histograms summed over a large ensemble of individual histograms. It consists of choosing a distance statistic for measuring the difference between summary histograms and using a bootstrap procedure to calculate the statistical significance level. Bootstrapping is an approach to statistical inference that makes few assumptions about the underlying probability distribution that describes the data. Three distance statistics are compared in this study. They are the Euclidean distance, the Jeffries-Matusita distance and the Kuiper distance. The data used in testing the bootstrap method are satellite measurements of cloud systems called cloud objects. Each cloud object is defined as a contiguous region/patch composed of individual footprints or fields of view. A histogram of measured values over footprints is generated for each parameter of each cloud object and then summary histograms are accumulated over all individual histograms in a given cloud-object size category. The results of statistical hypothesis tests using all three distances as test statistics are generally similar, indicating the validity of the proposed method. The Euclidean distance is determined to be most suitable after comparing the statistical tests of several parameters with distinct probability distributions among three cloud-object size categories. Impacts on the statistical significance levels resulting from differences in the total lengths of satellite footprint data between two size categories are also discussed.
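The procedure above can be sketched with the Euclidean distance as the test statistic. The resampling details below (pooled resampling with replacement, number of replicates) are illustrative assumptions, not the paper's exact protocol:

```python
import numpy as np

def bootstrap_histogram_test(hists_a, hists_b, n_boot=2000, seed=0):
    """Bootstrap significance test for the difference between two summary
    histograms. Rows of hists_a / hists_b are individual histograms; a
    summary histogram is their (normalized) sum, and the Euclidean
    distance between summaries is the test statistic."""
    rng = np.random.default_rng(seed)

    def summary(rows):
        s = rows.sum(axis=0).astype(float)
        return s / s.sum()

    observed = np.linalg.norm(summary(hists_a) - summary(hists_b))
    pooled = np.vstack([hists_a, hists_b])
    na, n = len(hists_a), len(pooled)
    exceed = 0
    for _ in range(n_boot):
        # Resample individual histograms from the pooled set with replacement.
        idx = rng.integers(0, n, size=n)
        d = np.linalg.norm(summary(pooled[idx[:na]]) - summary(pooled[idx[na:]]))
        exceed += d >= observed
    return observed, exceed / n_boot  # test statistic and bootstrap p-value
```

A small p-value indicates the two summary histograms are unlikely to come from the same underlying population of individual histograms.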

  10. Preference for luminance histogram regularities in natural scenes.

    PubMed

    Graham, Daniel; Schwarz, Bianca; Chatterjee, Anjan; Leder, Helmut

    2016-03-01

    Natural scene luminance distributions typically have positive skew, and for single objects, there is evidence that higher skew is a correlate (but not a guarantee) of glossiness. Skewness is also relevant to aesthetics: preference for glossy single objects (with high skew) has been shown even in infants, and skewness is a good predictor of fruit freshness. Given that primate vision appears to efficiently encode natural scene luminance variation, and given evidence that natural scene regularities may be a prerequisite for aesthetic perception in the spatial domain, here we ask whether humans in general prefer natural scenes with more positively skewed luminance distributions. If humans generally prefer images with the higher-order regularities typical of natural scenes and/or shiny objects, we would expect this to be the case. By manipulating luminance distribution skewness (holding mean and variance constant) for individual natural images, we show that in fact preference varies inversely with increasing positive skewness. This finding holds for: artistic landscape images and calibrated natural scenes; scenes with and without glossy surfaces; landscape scenes and close-up objects; and noise images with natural luminance histograms. Across conditions, humans prefer images with skew near zero over higher skew images, and they prefer skew lower than that of the unmodified scenes. These results suggest that humans prefer images with luminances that are distributed relatively evenly about the mean luminance, i.e., images with similar amounts of light and dark. We propose that our results reflect an efficient processing advantage of low-skew images over high-skew images, following evidence from prior brain imaging results. PMID:25872178

  11. Compressed histogram attribute profiles for the classification of VHR remote sensing images

    NASA Astrophysics Data System (ADS)

    Battiti, Romano; Demir, Begüm; Bruzzone, Lorenzo

    2015-10-01

    This paper presents a novel compressed histogram attribute profile (CHAP) for the classification of very high resolution remote sensing images. The CHAP characterizes the marginal local distribution of attribute filter responses to model the texture information of each sample with a small number of image features. This is achieved with a three-step algorithm. The first step provides a complete characterization of the spatial properties of objects in a scene. To this end, the attribute profile (AP) is initially built by the sequential application of attribute filters to the considered image. Then, to capture the complete spatial characteristics of the structures in the scene, a local histogram is calculated for each sample of each image in the AP. The local histograms of the same pixel location can contain redundant information since: i) adjacent histogram bins can provide similar information; and ii) attributes obtained with similar attribute filter threshold values lead to redundant features. In the second step, to expose these redundancies, the local histograms of the same pixel locations in the AP are organized into a 2D matrix representation, where columns are associated with the local histograms and rows represent a specific bin across all histograms of the considered sequence of filtered attributes in the profile. This representation characterizes the texture information of each sample through a 2D texture descriptor. In the final step, a novel compression approach based on a uniform 2D quantization strategy is applied to remove the redundancy of the 2D texture descriptors. Finally, the CHAP is classified by a Support Vector Machine classifier with a histogram intersection kernel, which is very effective for high-dimensional histogram-based feature representations. Experimental results confirm the effectiveness of the proposed CHAP in terms of computational complexity, storage requirements and classification accuracy when compared to the

  12. CUDA implementation of histogram stretching function for improving X-ray image.

    PubMed

    Lee, Yong H; Kim, Kwan W; Kim, Soon S

    2013-01-01

    This paper presents a method to improve the contrast of digital X-ray images using a CUDA program on a GPU. The histogram is commonly used to obtain the statistical distribution of contrast in image processing. To increase the visibility of the image in real time, we use a histogram stretching function. Implementing this function on a GPU is difficult because the CUDA program must handle the complex process of transferring the source data and the processed results between GPU memory and the host system. We show that the histogram stretching function runs quickly on the GPU using the CUDA program. PMID:23920761
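For reference, a CPU version of histogram (contrast) stretching is a per-pixel linear map, which is exactly the kind of operation a CUDA kernel parallelizes. The percentile clipping below is an illustrative choice, not necessarily the paper's exact mapping:

```python
import numpy as np

def histogram_stretch(img, lo_pct=1, hi_pct=99):
    """Linear histogram stretching: map the [lo_pct, hi_pct] percentile
    range of the input onto the full 8-bit range, clipping outliers."""
    lo, hi = np.percentile(img, [lo_pct, hi_pct])
    out = (img.astype(float) - lo) / max(hi - lo, 1e-9) * 255.0
    return np.clip(out, 0, 255).astype(np.uint8)
```

Since every output pixel depends only on its own input value and two global constants, the GPU version assigns one thread per pixel; the main cost is the host-device transfer the abstract mentions.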

  13. Webster and women's equality.

    PubMed

    Johnsen, D; Wilder, M J

    1989-01-01

    The National Abortion Rights Action League (NARAL) and the Women's Legal Defense Fund (WLDF) co-authored an "amicus curiae" brief in "Webster." The brief was written for 77 organizations that believe in the equality of women. The brief argued that constitutional protection of a woman's right to choose is guaranteed by the right to privacy, and that if abortions were illegal, women would not be able to take part in society equally with men; liberty would be taken from women. If the state interferes with abortion, the principle of bodily integrity is violated. In "Winston v. Lee," the Supreme Court found that the state could not compel a criminal defendant to undergo an invasive surgical procedure to retrieve a bullet the state needed for prosecution. One in four women have a cesarean section, which requires a larger incision in the abdomen and carries many risks. Bearing and raising children often limits women's employment opportunities; if the Supreme Court denied women the right to decide when to bear children, women would not have the right to plan their futures. If the Supreme Court were to agree that the "interest in potential life outweighs" a woman's right to procreate autonomously, states could declare all abortions illegal, investigate abortions to see whether they were induced on purpose, and prosecute women who induced them. Contraceptive devices could be declared illegal, and laws could be used to force women to submit to cesarean sections and other fetal surgery. Pre-viability abortion restrictions should be rejected because they rest on old-fashioned notions of women's role in society and reinforce stereotypes. Missouri's law stresses aiding "potential," rather than actual, life. PMID:2603859

  14. 36 CFR 254.12 - Value equalization; cash equalization waiver.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 36 Parks, Forests, and Public Property 2 2012-07-01 2012-07-01 false Value equalization; cash... AGRICULTURE LANDOWNERSHIP ADJUSTMENTS Land Exchanges § 254.12 Value equalization; cash equalization waiver. (a... to as compensation for costs under § 254.7 of this subpart may not exceed 25 percent of the value...

  15. 36 CFR 254.12 - Value equalization; cash equalization waiver.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 36 Parks, Forests, and Public Property 2 2013-07-01 2013-07-01 false Value equalization; cash... AGRICULTURE LANDOWNERSHIP ADJUSTMENTS Land Exchanges § 254.12 Value equalization; cash equalization waiver. (a... to as compensation for costs under § 254.7 of this subpart may not exceed 25 percent of the value...

  16. 36 CFR 254.12 - Value equalization; cash equalization waiver.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 36 Parks, Forests, and Public Property 2 2010-07-01 2010-07-01 false Value equalization; cash... AGRICULTURE LANDOWNERSHIP ADJUSTMENTS Land Exchanges § 254.12 Value equalization; cash equalization waiver. (a... to as compensation for costs under § 254.7 of this subpart may not exceed 25 percent of the value...

  17. Histogram analysis of ADC in brain tumor patients

    NASA Astrophysics Data System (ADS)

    Banerjee, Debrup; Wang, Jihong; Li, Jiang

    2011-03-01

    At various stages of progression, most brain tumors are not homogeneous. In this presentation, we retrospectively studied the distribution of ADC values inside the tumor volume during the course of tumor treatment and progression for a selected group of patients who underwent an anti-VEGF trial. Complete MRI studies were obtained for this group, including pre-treatment and multiple follow-up post-treatment imaging studies. In each MRI study, multiple scan series were obtained as a standard protocol, including T1, T2, T1 post-contrast, FLAIR, and DTI-derived images (ADC, FA, etc.) for each visit. All scan series (T1, T2, FLAIR, post-contrast T1) were registered to the corresponding DTI scan at the patient's first visit. Conventionally, hyper-intensity regions on T1 post-contrast images are believed to represent the core tumor region, while regions highlighted by FLAIR may overestimate the tumor size. We therefore annotated tumor regions on the T1 post-contrast scans and extracted the ADC values of the pixels inside these regions. We fit a mixture-of-Gaussians (MG) model to the extracted pixels using the Expectation-Maximization (EM) algorithm, which produced a set of parameters (means, variances, and mixture coefficients) for the MG model. This procedure was performed for each visit, resulting in a series of MG parameters. We studied whether the parameters fitted for ADC can be used as indicators of tumor progression. Additionally, we studied the ADC characteristics in the peri-tumoral region as identified by hyper-intensity on FLAIR scans. The results show that ADC histogram analysis of the tumor region supports the two-compartment model, in which the low-ADC subregion corresponds to densely packed cancer cells while the higher-ADC region corresponds to a mixture of viable and necrotic cells with superimposed edema. Careful studies of the composition and relative volume of the two compartments in tumor
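The mixture-of-Gaussians EM fit described above can be sketched as a minimal 1D EM loop (the study presumably used standard tooling; the percentile-based initialization here is my own simplification):

```python
import numpy as np

def fit_gmm_1d(x, k=2, iters=200):
    """Minimal 1D Gaussian-mixture fit via EM, returning the means,
    variances and mixture coefficients of the k components."""
    mu = np.percentile(x, np.linspace(25, 75, k))  # deterministic init
    var = np.full(k, x.var())
    w = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibility of each component for each sample.
        dens = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from the soft assignments.
        nk = resp.sum(axis=0)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
        w = nk / len(x)
    return mu, var, w
```

For the two-compartment model, k=2 yields one low-ADC and one high-ADC component per visit, and the trajectory of the fitted means and weights over visits is what would be tracked as a progression indicator.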

  18. Efficient Human Action and Gait Analysis Using Multiresolution Motion Energy Histogram

    NASA Astrophysics Data System (ADS)

    Yu, Chih-Chang; Cheng, Hsu-Yung; Cheng, Chien-Hung; Fan, Kuo-Chin

    2010-12-01

    The Average Motion Energy (AME) image is a good way to describe human motions. However, it faces a computational efficiency problem as the number of database templates increases. In this paper, we propose a histogram-based approach to improve computational efficiency, converting the human action/gait recognition problem into a histogram matching problem. To speed up the recognition process, we adopt a multiresolution structure on the Motion Energy Histogram (MEH). To utilize the multiresolution structure more efficiently, we propose an automated uneven partitioning method based on the quadtree decomposition of the MEH. The computation time is then proportional only to the number of partitioned histogram bins, which is much smaller than in the AME method. Two applications, action recognition and gait classification, are conducted in the experiments to demonstrate the feasibility and validity of the proposed approach.

  19. Matching of flow-cytometry histograms using information theory in feature space.

    PubMed Central

    Zeng, Qing; Wand, Matthew; Young, Alan J.; Rawn, James; Milford, Edgar L.; Mentzer, Steven J.; Greenes, Robert A.

    2002-01-01

    Flow cytometry is a widely available technique for analyzing cell-surface protein expression. Data obtained from flow cytometry is frequently used to produce fluorescence intensity histograms. Comparison of histograms can be useful in the identification of unknown molecules and in the analysis of protein expression. In this study, we examined the combination of a new smoothing technique called SiZer with information theory to measure the difference between cytometry histograms. SiZer provides cross-bandwidth smoothing and allowed analysis in feature space. The new methods were tested on a panel of monoclonal antibodies raised against proteins expressed on peripheral blood lymphocytes and compared with previous methods. The findings suggest that comparing information content of histograms in feature space is effective and efficient for identifying antibodies with similar cell-surface binding patterns. PMID:12463961

  20. Genetic Diversity and Human Equality.

    ERIC Educational Resources Information Center

    Dobzhansky, Theodosius

    The idea of equality often, if not frequently, bogs down in confusion and apparent contradictions; equality is confused with identity, and diversity with inequality. It would seem that the easiest way to discredit the idea of equality is to show that people are innately, genetically, and, therefore, irremediably diverse and unlike. The snare is,…

  1. Comparison and evaluation of joint histogram estimation methods for mutual information based image registration

    NASA Astrophysics Data System (ADS)

    Liang, Yongfang; Chen, Hua-mei

    2005-04-01

    Joint histogram is the only quantity required to calculate the mutual information (MI) between two images. For MI based image registration, joint histograms are often estimated through linear interpolation or partial volume interpolation (PVI). It has been pointed out that both methods may result in a phenomenon known as interpolation induced artifacts. In this paper, we implemented a wide range of interpolation/approximation kernels for joint histogram estimation. Some kernels are nonnegative. In this case, these kernels are applied in two ways as the linear kernel is applied in linear interpolation and PVI. In addition, we implemented two other joint histogram estimation methods devised to overcome the interpolation artifact problem. They are nearest neighbor interpolation with jittered sampling with/without histogram blurring and data resampling. We used the clinical data obtained from Vanderbilt University for all of the experiments. The objective of this study is to perform a comprehensive comparison and evaluation of different joint histogram estimation methods for MI based image registration in terms of artifacts reduction and registration accuracy.
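For reference, once a joint histogram has been estimated, mutual information follows directly from its marginals. This sketch accumulates the joint histogram by plain nearest-neighbour binning via numpy.histogram2d; the linear-interpolation and PVI estimators compared in the paper differ only in how the joint histogram is accumulated, not in this final step:

```python
import numpy as np

def mutual_information(img_a, img_b, bins=32):
    """MI (in nats) of two aligned images, computed from their joint
    gray-level histogram."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    p = joint / joint.sum()
    px = p.sum(axis=1, keepdims=True)  # marginal of img_a
    py = p.sum(axis=0, keepdims=True)  # marginal of img_b
    nz = p > 0
    return float(np.sum(p[nz] * np.log(p[nz] / (px @ py)[nz])))
```

Registration then searches over transformations of one image for the pose maximizing this value; interpolation artifacts appear as spurious local extrema of MI at grid-aligned poses.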

  2. De-Striping for Tdiccd Remote Sensing Image Based on Statistical Features of Histogram

    NASA Astrophysics Data System (ADS)

    Gao, Hui-ting; Liu, Wei; He, Hong-yan; Zhang, Bing-xian; Jiang, Cheng

    2016-06-01

To address the striping noise caused by the non-uniform response of remote sensing TDI CCDs, a novel de-striping method based on statistical features of the image histogram is put forward. By analysing the distribution of the histograms, the histogram centroid is selected as an eigenvalue representing the uniformity of ground objects. The histogram centroids of the whole image and of each column of pixels are calculated first, and the differences between them are regarded as rough correction coefficients. Then, to avoid the sensitivity caused by a single parameter, and considering the strong continuity and correlation of ground objects between two adjacent columns, the correlation coefficient of the histograms is introduced to reflect the similarity between them; a fine correction coefficient is obtained by searching around the rough correction coefficient. Additionally, in view of the influence of bright clouds on the histogram, automatic cloud detection based on multiple features, including grey level, texture, fractal dimension and edges, is used to pre-process the image. Two 0-level panchromatic images from the SJ-9A satellite with obvious stripe noise are processed by the proposed method to evaluate its performance. The results show that the visual quality of the images is improved because the stripe noise is entirely removed. We quantitatively analyse the result by calculating the non-uniformity, which reaches about 1% and is better than the histogram matching method.
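The rough-correction step can be pictured with a much-simplified sketch: treat each column's mean grey level as a stand-in for its histogram centroid and shift it to the whole-image value (the fine search and cloud masking of the paper are omitted; the toy image is invented):

```python
def destripe_rough(image):
    """Rough stripe correction: shift each column so its mean grey level
    (a stand-in for the histogram centroid) matches the whole-image mean."""
    rows, cols = len(image), len(image[0])
    global_mean = sum(sum(r) for r in image) / (rows * cols)
    col_means = [sum(image[r][c] for r in range(rows)) / rows for c in range(cols)]
    offsets = [global_mean - m for m in col_means]   # rough correction coefficients
    return [[image[r][c] + offsets[c] for c in range(cols)] for r in range(rows)]

# column 1 reads 20 levels too bright, column 0 too dark (simulated stripes)
striped = [[10, 50, 30], [12, 52, 32], [8, 48, 28]]
clean = destripe_rough(striped)
```

After correction every column shares the image-wide mean, which is exactly the uniformity the centroid eigenvalue is meant to capture.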

  3. Region of Interest Detection Based on Histogram Segmentation for Satellite Image

    NASA Astrophysics Data System (ADS)

    Kiadtikornthaweeyot, Warinthorn; Tatnall, Adrian R. L.

    2016-06-01

High resolution satellite imaging is considered an outstanding candidate for extracting the Earth's surface information. Feature extraction from an image is difficult because appropriate image segmentation techniques must be found and different methods combined to detect the Region of Interest (ROI) most effectively. This paper proposes techniques to classify objects in the satellite image by using image processing methods on high-resolution satellite images. The systems to identify the ROI focus on forest, urban and agriculture areas. The proposed system is based on histograms of the image to classify objects using thresholding. The thresholding is performed by considering the behaviour of the histogram mapping to a particular region in the satellite image. The proposed model is based on histogram segmentation and morphology techniques. There are five main steps supporting each other: histogram classification, histogram segmentation, morphological dilation, morphological filling of image areas and holes, and ROI management. The methods to detect the ROI of the satellite images based on histogram classification have been studied, implemented and tested. The algorithm is able to detect the areas of forest, urban and agriculture separately. The image segmentation methods can detect the ROI and reduce the size of the original image by discarding the unnecessary parts.
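The paper's thresholding rule is tied to its specific histogram behaviour; as a generic illustration of picking a threshold from a grey-level histogram, Otsu's method (a stand-in, not the authors' rule) maximizes between-class variance:

```python
def otsu_threshold(hist):
    """Pick the threshold maximizing between-class variance from a grey-level
    histogram (Otsu's method; shown here as a stand-in histogram rule)."""
    total = sum(hist)
    sum_all = sum(i * h for i, h in enumerate(hist))
    w0 = sum0 = 0
    best_t, best_var = 0, -1.0
    for t in range(len(hist)):
        w0 += hist[t]                    # pixels at or below level t
        sum0 += t * hist[t]
        if w0 == 0:
            continue
        w1 = total - w0
        if w1 == 0:
            break
        m0, m1 = sum0 / w0, (sum_all - sum0) / w1
        var_between = w0 * w1 * (m0 - m1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# bimodal histogram: dark objects near level 2, bright background near level 7
hist = [0, 5, 30, 10, 0, 0, 8, 40, 12, 0]
t = otsu_threshold(hist)
```

The resulting threshold separates the two modes, after which morphological dilation and hole filling can clean up the binary ROI mask.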

  4. Information-Adaptive Image Encoding and Restoration

    NASA Technical Reports Server (NTRS)

    Park, Stephen K.; Rahman, Zia-ur

    1998-01-01

The multiscale retinex with color restoration (MSRCR) has shown itself to be a very versatile automatic image enhancement algorithm that simultaneously provides dynamic range compression, color constancy, and color rendition. A number of algorithms exist that provide one or more of these features, but not all. In this paper we compare the performance of the MSRCR with techniques that are widely used for image enhancement. Specifically, we compare the MSRCR with color adjustment methods such as gamma correction and gain/offset application, histogram modification techniques such as histogram equalization and manual histogram adjustment, and other more powerful techniques such as homomorphic filtering and 'burning and dodging'. The comparison is carried out by testing the suite of image enhancement methods on a set of diverse images. We find that though some of these techniques work well for some of these images, only the MSRCR performs universally well on the test set.
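Of the baselines named above, histogram equalization is the simplest to state precisely: remap grey levels through the image's cumulative distribution. A minimal global (non-adaptive) sketch on a flat pixel list (the example image is invented):

```python
def equalize(pixels, levels=256):
    """Global histogram equalization: remap grey levels through the image's
    cumulative distribution so the output spans the full dynamic range."""
    n = len(pixels)
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    cdf, acc = [], 0
    for h in hist:
        acc += h
        cdf.append(acc)
    cdf_min = min(c for c in cdf if c > 0)   # first occupied level's count
    if n == cdf_min:                         # single-level image: nothing to spread
        return list(pixels)
    lut = [round(max(c - cdf_min, 0) / (n - cdf_min) * (levels - 1)) for c in cdf]
    return [lut[p] for p in pixels]

# a low-contrast image crowded into levels 100-103 spreads over the full range
flat = equalize([100, 100, 101, 101, 102, 102, 103, 103])
```

Adaptive variants such as CLAHE apply the same mapping per local tile with a clip limit, which is what distinguishes them from this global form.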

  5. Reframing Inclusive Education: Educational Equality as Capability Equality

    ERIC Educational Resources Information Center

    Terzi, Lorella

    2014-01-01

    In this paper, I argue that rethinking questions of inclusive education in the light of the value of educational equality--specifically conceived as capability equality, or genuine opportunities to achieve educational functionings--adds some important insights to the current debate on inclusive education. First, it provides a cohesive value…

  6. Histogram-based classification with Gaussian mixture modeling for GBM tumor treatment response using ADC map

    NASA Astrophysics Data System (ADS)

    Huo, Jing; Kim, Hyun J.; Pope, Whitney B.; Okada, Kazunori; Alger, Jeffery R.; Wang, Yang; Goldin, Jonathan G.; Brown, Matthew S.

    2009-02-01

    This study applied a Gaussian Mixture Model (GMM) to apparent diffusion coefficient (ADC) histograms to evaluate glioblastoma multiforme (GBM) tumor treatment response using diffusion weighted (DW) MR images. ADC mapping, calculated from DW images, has been shown to reveal changes in the tumor's microenvironment preceding morphologic tumor changes. In this study, we investigated the effectiveness of features that represent changes from pre- and post-treatment tumor ADC histograms to detect treatment response. The main contribution of this work is to model the ADC histogram as the composition of two components, fitted by GMM with expectation maximization (EM) algorithm. For both pre- and post-treatment scans taken 5-7 weeks apart, we obtained the tumor ADC histogram, calculated the two-component features, as well as the other standard histogram-based features, and applied supervised learning for classification. We evaluated our approach with data from 85 patients with GBM under chemotherapy, in which 33 responded and 52 did not respond based on tumor size reduction. We compared AdaBoost and random forests classification algorithms, using ten-fold cross validation, resulting in a best accuracy of 69.41%.
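The core modelling step above, fitting a two-component mixture to a 1-D histogram with EM, can be sketched in a few lines (an illustrative toy, not the authors' implementation; the `sample` data are invented):

```python
from math import exp, pi, sqrt

def fit_gmm2(data, iters=100):
    """Fit a two-component 1-D Gaussian mixture by expectation maximization,
    as used to model the two modes of an ADC histogram (sketch only)."""
    mu = [min(data), max(data)]          # crude initialization at the extremes
    var = [1.0, 1.0]
    w = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        resp = []
        for x in data:
            d = [w[k] / sqrt(2 * pi * var[k]) * exp(-(x - mu[k]) ** 2 / (2 * var[k]))
                 for k in range(2)]
            s = sum(d)
            resp.append([dk / s for dk in d])
        # M-step: re-estimate weights, means and variances
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = max(sum(r[k] * (x - mu[k]) ** 2
                             for r, x in zip(resp, data)) / nk, 1e-6)
    return w, mu, var

# toy bimodal "ADC" sample with modes near 1.0 and 5.0
sample = [0.8, 0.9, 1.0, 1.1, 1.2, 4.8, 4.9, 5.0, 5.1, 5.2]
weights, means, variances = fit_gmm2(sample)
```

The fitted weights, means and variances of the two components are then the histogram features fed to the classifier, alongside standard statistics.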

  7. High-fidelity DNA histograms in neoplastic progression in Barrett's esophagus.

    PubMed

    Yu, Chenggong; Zhang, Xiaoqi; Huang, Qin; Klein, Michael; Goyal, Raj K

    2007-05-01

This study describes the high-fidelity DNA histograms in different stages of neoplastic progression to Barrett's adenocarcinoma (BAC). High-fidelity DNA histograms were obtained with image cytometry on sections, and were classified based on DNA index values of the peaks into diploid, mild aneuploid, moderate aneuploid and severe aneuploid. Heterogeneity index (HI) representing cells with different DNA content and the 5N exceeding cell fraction were determined. One hundred and eighty-seven cases, including 34 normal gastrointestinal mucosa (control), 66 Barrett's-specialized intestinal metaplasia (SIM), 22 low-grade dysplasia (LGD), 22 high-grade dysplasia (HGD) and 43 BAC were investigated. Controls showed sharp diploid peaks with HI values less than 13, and no 5N exceeding nuclei. SIM showed a spectrum of histograms including diploid, mild aneuploid and moderate aneuploid histograms. The frequency and severity of aneuploidy increased with worsening histological grades of dysplasia. All BAC cases were aneuploid, with moderate or severe aneuploidy. Markedly elevated HI values (>20) and 5N exceeding fractions (>5%) were found in 5%, 32%, 50% and 88% of cases with SIM, LGD, HGD and BAC, respectively. The high-fidelity DNA histograms suggest that (1) Barrett's SIM may already be dysplastic in nature, and all BAC may be markedly aneuploid; and (2) elevated cellular DNA heterogeneity and 5N fractions may be markers of progressive chromosomal changes and 'unstable aneuploidy' that identifies progressive lesions. PMID:17310216

  8. Value of MR histogram analyses for prediction of microvascular invasion of hepatocellular carcinoma

    PubMed Central

    Huang, Ya-Qin; Liang, He-Yue; Yang, Zhao-Xia; Ding, Ying; Zeng, Meng-Su; Rao, Sheng-Xiang

    2016-01-01

Abstract The objective is to explore the value of preoperative magnetic resonance (MR) histogram analyses in predicting microvascular invasion (MVI) of hepatocellular carcinoma (HCC). Fifty-one patients with histologically confirmed HCC who underwent diffusion-weighted and contrast-enhanced MR imaging were included. Histogram analyses were performed and mean, variance, skewness, kurtosis, and 1st, 10th, 50th, 90th, and 99th percentiles were derived. Quantitative histogram parameters were compared between HCCs with and without MVI. Receiver operating characteristics (ROC) analyses were generated to compare the diagnostic performance of tumor size, histogram analyses of apparent diffusion coefficient (ADC) maps, and MR enhancement. The mean, 1st, 10th, and 50th percentiles of ADC maps, and the mean, variance, 1st, 10th, 50th, 90th, and 99th percentiles of the portal venous phase (PVP) images were significantly different between the groups with and without MVI (P <0.05), with areas under the ROC curves (AUCs) of 0.66 to 0.74 for ADC and 0.76 to 0.88 for PVP. The largest AUC of PVP (1st percentile) showed significantly higher accuracy compared with that of arterial phase (AP) or tumor size (P <0.001). MR histogram analyses, in particular the 1st percentile of PVP images, held promise for prediction of MVI of HCC. PMID:27368028

  9. On-line equalization for lithium-ion battery packs based on charging cell voltages: Part 2. Fuzzy logic equalization

    NASA Astrophysics Data System (ADS)

    Zheng, Yuejiu; Ouyang, Minggao; Lu, Languang; Li, Jianqiu; Han, Xuebing; Xu, Liangfei

    2014-02-01

    In the first part of this work, we propose dissipative cell equalization (DCE) algorithm based on remaining charging capacity estimation (RCCE) and establish a pack model with 8 cells in series. The results show that RCCE-DCE algorithm is suitable for on-line equalization in electric vehicles (EVs) and no over-equalization happens. However, 1% pack capacity difference from the DCE theoretical pack capacity is observed with RCCE-DCE algorithm. Therefore, as the second part of the series, we propose fuzzy logic (FL) DCE algorithm based on charging cell voltage curves (CCVCs). Cell capacities and SOCs are fuzzily identified in FL-DCE algorithm by comparing cell voltages at the beginning and end of charging. Adaptive FL-DCE is further improved to prevent over-equalization and maintain the equalization capability. The simulation results show that pack capacity difference from the DCE theoretical pack capacity with the adaptive FL-DCE is smaller than that with RCCE-DCE algorithm, and the duration of the infant stage is also shorter. The proposed adaptive FL-DCE is suitable for on-line equalization in EVs and well prevents over-equalization.

  10. Dual-mode type algorithms for blind equalization

    NASA Astrophysics Data System (ADS)

    Weerackody, Vijitha; Kassam, Saleem A.

    1994-01-01

    Adaptive channel equalization accomplished without resorting to a training sequence is known as blind equalization. The Godard algorithm and the generalized Sato algorithm are two widely referenced algorithms for blind equalization of a QAM system. These algorithms exhibit very slow convergence rates when compared to algorithms employed in conventional data-aided equalization schemes. In order to speed up the convergence process, these algorithms may be switched over to a decision-directed equalization scheme once the error level is reasonably low. We present a scheme which is capable of operating in two modes: blind equalization mode and a mode similar to the decision-directed equalization mode. In this proposed scheme, the dominant mode of operation changes from the blind equalization mode at higher error levels to the mode similar to the decision-directed equalization mode at lower error levels. Manual switch-over to the decision-directed mode from the blind equalization mode, or vice-versa, is not necessary since transitions between the two modes take place smoothly and automatically.

  11. Infrared face recognition based on LBP histogram and KW feature selection

    NASA Astrophysics Data System (ADS)

    Xie, Zhihua

    2014-07-01

The feature represented by the conventional local binary pattern (LBP) histogram still has room for performance improvement. This paper focuses on the dimension reduction of LBP micro-patterns and proposes an improved infrared face recognition method based on LBP histogram representation. To extract locally robust features from infrared face images, LBP is chosen to obtain the composition of micro-patterns in sub-blocks. Based on statistical test theory, a Kruskal-Wallis (KW) feature selection method is proposed to obtain the LBP patterns that are suitable for infrared face recognition. The experimental results show that the combination of LBP and KW feature selection improves the performance of infrared face recognition; the proposed method outperforms the traditional methods based on the LBP histogram, discrete cosine transform (DCT) or principal component analysis (PCA).
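The basic 3x3 LBP histogram underlying such methods is compact enough to sketch directly (grid images and neighbour ordering here are illustrative assumptions):

```python
def lbp_histogram(img):
    """Histogram of 3x3 local binary patterns: each interior pixel is coded
    by thresholding its 8 neighbours against the centre (256 micro-patterns)."""
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]   # clockwise from top-left
    hist = [0] * 256
    for r in range(1, len(img) - 1):
        for c in range(1, len(img[0]) - 1):
            code = 0
            for bit, (dr, dc) in enumerate(offsets):
                if img[r + dr][c + dc] >= img[r][c]:
                    code |= 1 << bit
            hist[code] += 1
    return hist

flat_patch = [[7] * 4 for _ in range(4)]   # uniform region: every neighbour >= centre
h = lbp_histogram(flat_patch)
```

Feature selection such as KW then keeps only the histogram bins (micro-patterns) whose distributions actually separate the classes, reducing the 256-bin dimensionality.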

  12. Perceived quality of wood images influenced by the skewness of image histogram

    NASA Astrophysics Data System (ADS)

    Katsura, Shigehito; Mizokami, Yoko; Yaguchi, Hirohisa

    2015-08-01

    The shape of image luminance histograms is related to material perception. We investigated how the luminance histogram contributed to improvements in the perceived quality of wood images by examining various natural wood and adhesive vinyl sheets with printed wood grain. In the first experiment, we visually evaluated the perceived quality of wood samples. In addition, we measured the colorimetric parameters of the wood samples and calculated statistics of image luminance. The relationship between visual evaluation scores and image statistics suggested that skewness and kurtosis affected the perceived quality of wood. In the second experiment, we evaluated the perceived quality of wood images with altered luminance skewness and kurtosis using a paired comparison method. Our result suggests that wood images are more realistic if the skewness of the luminance histogram is slightly negative.
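The skewness statistic at the centre of this finding is the standardized third moment of the luminance distribution; a minimal sketch (the pixel lists are invented):

```python
def skewness(pixels):
    """Sample skewness of a luminance distribution: negative values indicate
    a tail of dark pixels below the bright mass, as linked here to realism."""
    n = len(pixels)
    mean = sum(pixels) / n
    m2 = sum((p - mean) ** 2 for p in pixels) / n   # second central moment
    m3 = sum((p - mean) ** 3 for p in pixels) / n   # third central moment
    return m3 / m2 ** 1.5
```

A perfectly symmetric histogram gives zero; the paper's observation is that wood looks more realistic when this value is slightly below zero.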

  13. [Image Feature Extraction and Discriminant Analysis of Xinjiang Uygur Medicine Based on Color Histogram].

    PubMed

    Hamit, Murat; Yun, Weikang; Yan, Chuanbo; Kutluk, Abdugheni; Fang, Yang; Alip, Elzat

    2015-06-01

Image feature extraction is an important part of image processing and an important field of research and application of image processing technology. Uygur medicine is one of the Chinese traditional medicines and is receiving growing attention from researchers, but large amounts of Uygur medicine data have not been fully utilized. In this study, we extracted the image color histogram features of herbal and zooid medicines of Xinjiang Uygur. First, we performed preprocessing, including image color enhancement, size normalization and color space transformation. Then we extracted the color histogram features and analyzed them with statistical methods. Finally, we evaluated the classification ability of the features by Bayes discriminant analysis. Experimental results showed that high accuracy for Uygur medicine image classification was obtained by using the color histogram feature. This study should be of help for content-based medical image retrieval of Xinjiang Uygur medicine. PMID:26485983

  14. Fluorescent image classification by major color histograms and a neural network

    NASA Astrophysics Data System (ADS)

    Soriano, M.; Garcia, L.; Saloma, Caesar A.

    2001-02-01

Efficient image classification of microscopic fluorescent spheres is demonstrated with a supervised backpropagation neural network (NN) that uses as inputs the major color histogram representation of the fluorescent image to be classified. Two techniques are tested for the major color search: (1) cluster mean (CM) and (2) Kohonen's self-organizing feature map (SOFM). The method is shown to have higher recognition rates than Swain and Ballard's Color Indexing by histogram intersection. Classification with SOFM-generated histograms as inputs to the classifier NN achieved the best recognition rate (90%) for cases of normal, scaled, defocused, photobleached, and combined images of AMCA (7-Amino-4-Methylcoumarin-3-Acetic Acid) and FITC (Fluorescein Isothiocyanate)-stained microspheres.

  15. Equality and Education -- Part 1

    ERIC Educational Resources Information Center

    Porter, John

    1975-01-01

    Discusses equality in education within the framework of the ideas of John Rawls, asserting that even though in the real world it is not easy to implement his version of equality and justice without endangering his prior principle of liberty, he provides a philosophical foundation for the reconsideration of the meritocratic principle. (Author/JM)

  16. Ensuring equal opportunity sprinkler irrigation

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Equal opportunity for plants to sprinkler irrigation water must be carefully considered by crop producers, irrigation consultants, and the industry that supplies the irrigation equipment. Equal opportunity can be negated by improper marketing, design, and installation, as well as through improper f...

  17. Equal Educational Opportunity Resource Handbook.

    ERIC Educational Resources Information Center

    Oregon State Dept. of Education, Salem.

    Since theory and practice of equal educational opportunity is an issue which is currently confronting decisionmakers at all levels of American education, this handbook presents key federal regulations and Oregon statutes, and administrative rules which provide for equality in employment and in the delivery of educational services. Sources of…

  18. Gender Equality in Teacher Organisations.

    ERIC Educational Resources Information Center

    Lund, Torill Scharning

    1995-01-01

    Most senior trade union posts are held by men, even in organizations where most members are women. The paper examines how the Norwegian Union of Teachers has advanced in this area, noting the status of gender equality in Norway, Norway's work toward gender equality, and the country's focus on educational change. (SM)

  19. Democracy, Equal Citizenship, and Education

    ERIC Educational Resources Information Center

    Callan, Eamonn

    2016-01-01

    Two appealing principles of educational distribution--equality and sufficiency--are comparatively assessed. The initial point of comparison is the distribution of civic educational goods. One reason to favor equality in educational distribution rather than sufficiency is the elimination of undeserved positional advantage in access to labor…

  20. Governing Equality: Mathematics for All?

    ERIC Educational Resources Information Center

    Diaz, Jennifer D.

    2013-01-01

    With the notion of governmentality, this article considers how the equal sign (=) in the U.S. math curriculum organizes knowledge of equality and inscribes cultural rules for thinking, acting, and seeing in the world. Situating the discussion within contemporary math reforms aimed at teaching mathematics for all, I draw attention to how the…

  1. Luck, Choice, and Educational Equality

    ERIC Educational Resources Information Center

    Calvert, John

    2015-01-01

    Harry Brighouse discusses two conceptions of educational equality. The first is a type of equality of opportunity, heavily influenced by the work of John Rawls, which he calls the meritocratic conception. According to this conception, an individual's educational prospects should not be influenced by factors such as their social class background.…

  2. Enhanced optical coherence tomography imaging using a histogram-based denoising algorithm

    NASA Astrophysics Data System (ADS)

    Kim, Keo-Sik; Park, Hyoung-Jun; Kang, Hyun Seo

    2015-11-01

    A histogram-based denoising algorithm was developed to effectively reduce ghost artifact noise and enhance the quality of an optical coherence tomography (OCT) imaging system used to guide surgical instruments. The noise signal is iteratively detected by comparing the histogram of the ensemble average of all A-scans, and the ghost artifacts included in the noisy signal are removed separately from the raw signals using the polynomial curve fitting method. The devised algorithm was simulated with various noisy OCT images, and >87% of the ghost artifact noise was removed despite different locations. Our results show the feasibility of selectively and effectively removing ghost artifact noise.

  3. Equalization of data transmission cable

    NASA Technical Reports Server (NTRS)

    Zobrist, G. W.

    1975-01-01

The paper describes an equalization approach utilizing a simple RLC network which can obtain a maximum slope of -12 dB/octave for reshaping the frequency characteristics of a data transmission cable, so that data may be generated and detected at the receiver. An experimental procedure for determining equalizer design specifications using distortion analysis is presented. It was found that for lengths of 16 PEV-L cable of up to 5 miles and data transmission rates of up to 1 Mb/s, the equalization scheme proposed here is sufficient for generation of the data with acceptable error rates.

  4. Multiple point least squares equalization in a room

    NASA Technical Reports Server (NTRS)

    Elliott, S. J.; Nelson, P. A.

    1988-01-01

Equalization filters designed to minimize the mean square error between a delayed version of the original electrical signal and the equalized response at a point in a room have previously been investigated. In general, such a strategy degrades the response at positions in a room away from the equalization point. A method is presented for designing an equalization filter by adjusting the filter coefficients to minimize the sum of the squares of the errors between the equalized responses at multiple points in the room and delayed versions of the original electrical signal. Such an equalization filter can give a more uniform frequency response over a greater volume of the enclosure than can the single point equalizer above. Computer simulation results are presented of equalizing the frequency responses from a loudspeaker to various typical ear positions, in a room with dimensions and acoustic damping typical of a car interior, using the two approaches outlined above. Adaptive filter algorithms, which can automatically adjust the coefficients of a digital equalization filter to achieve this minimization, will also be discussed.
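The multiple-point criterion above is an ordinary least-squares problem: stack one convolution system per measurement point and solve for a single FIR filter. A simplified, noiseless sketch (the room impulse responses `g1`/`g2`, tap count and delay are invented):

```python
import numpy as np

def multipoint_equalizer(room_irs, n_taps, delay):
    """Least-squares FIR equalizer: one filter h minimizing the summed squared
    error between each room response convolved with h and a delayed unit pulse."""
    blocks, targets = [], []
    for g in room_irs:                       # impulse response to one ear position
        n_out = len(g) + n_taps - 1
        C = np.zeros((n_out, n_taps))        # convolution matrix: C @ h == g * h
        for j in range(n_taps):
            C[j:j + len(g), j] = g
        d = np.zeros(n_out)
        d[delay] = 1.0                       # delayed version of the original signal
        blocks.append(C)
        targets.append(d)
    A = np.vstack(blocks)                    # stack all points into one LS system
    b = np.concatenate(targets)
    h, *_ = np.linalg.lstsq(A, b, rcond=None)
    return h

# two hypothetical ear-position responses in a small "room"
g1 = [1.0, 0.4, 0.1]
g2 = [1.0, 0.3, 0.2]
h = multipoint_equalizer([g1, g2], n_taps=16, delay=2)
```

Because one filter serves both points, each equalized response is only approximately a delayed impulse; that compromise is exactly the trade the abstract describes against single-point equalization.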

  5. A Dynamic Tap Allocation for Concurrent CMA-DD Equalizers

    NASA Astrophysics Data System (ADS)

    Trindade, Diego von B. M.; Halmenschlager, Vitor; Ortolan, Leonardo; De Castro, Maria C. F.; De Castro, Fernando C. C.; Ourique, Fabrício

    2010-12-01

    This paper proposes a dynamic tap allocation for the concurrent CMA-DD equalizer as a low complexity solution for the blind channel deconvolution problem. The number of taps is a crucial factor which affects the performance and the complexity of most adaptive equalizers. Generally an equalizer requires a large number of taps in order to cope with long delays in the channel multipath profile. Simulations show that the proposed new blind equalizer is able to solve the blind channel deconvolution problem with a specified and reduced number of active taps. As a result, it minimizes the output excess mean square error due to inactive taps during and after the equalizer convergence and the hardware complexity as well.

  6. Electronegativity Equalization and Partial Charge

    ERIC Educational Resources Information Center

    Sanderson, R. T.

    1974-01-01

This article elaborates the relationship between covalent radius, homonuclear bond energy, and electronegativity, and sets the background for bond energy calculation by discussing the nature of heteronuclear covalent bonding on the basis of electronegativity equalization and partial charge. (DT)

  7. Equal Education and the Law

    ERIC Educational Resources Information Center

    Shanks, Hershel

    1970-01-01

    A number of court cases are cited which trace the development of various definitions and interpretations of the equal protection clause of the Fourteenth Amendment to the Constitution as would be applicable to inadequate" schools. (DM)

  8. Equal Pay for Comparable Work.

    ERIC Educational Resources Information Center

    Rothman, Nancy Lloyd; Rothman, Daniel A.

    1980-01-01

    Examines the legal battleground upon which one struggle for the equality of women is being fought. Updates a civil rights decision of crucial importance to nursing--Lemons v City and County of Denver. (JOW)

  9. Electronegativity Equalization with Pauling Units.

    ERIC Educational Resources Information Center

    Bratsch, Steven G.

    1984-01-01

    Discusses electronegativity equalization using Pauling units. Although Pauling has qualitatively defined electronegativity as the power of an atom in a molecule to attract electrons to itself, Pauling electronegativities are treated in this paper as prebonded, isolated-atom quantities. (JN)

  10. DIF Testing with an Empirical-Histogram Approximation of the Latent Density for Each Group

    ERIC Educational Resources Information Center

    Woods, Carol M.

    2011-01-01

    This research introduces, illustrates, and tests a variation of IRT-LR-DIF, called EH-DIF-2, in which the latent density for each group is estimated simultaneously with the item parameters as an empirical histogram (EH). IRT-LR-DIF is used to evaluate the degree to which items have different measurement properties for one group of people versus…

  11. Comparison Between Cooccurrence Matrices, Local Histograms And Curvilinear Integration For Texture Characterization

    NASA Astrophysics Data System (ADS)

    Ronsin, J.; Barba, D.; Raboisson, S.

    1986-04-01

We present an algorithm for texture characterization based upon curvilinear integration of the grey tone signal along some predefined directions. In the context of image segmentation, we compare the performance of this very simple technique with two others: texture features from second-order cooccurrence probabilities, and texture features from local one-dimensional histograms. Good classification performance is obtained on quite different pictures.

  12. Reducing variability in the output of pattern classifiers using histogram shaping

    SciTech Connect

    Gupta, Shalini; Kan, Chih-Wen; Markey, Mia K.

    2010-04-15

    Purpose: The authors present a novel technique based on histogram shaping to reduce the variability in the output and (sensitivity, specificity) pairs of pattern classifiers with identical ROC curves, but differently distributed outputs. Methods: The authors identify different sources of variability in the output of linear pattern classifiers with identical ROC curves, which also result in classifiers with differently distributed outputs. They theoretically develop a novel technique based on the matching of the histograms of these differently distributed pattern classifier outputs to reduce the variability in their (sensitivity, specificity) pairs at fixed decision thresholds, and to reduce the variability in their actual output values. They empirically demonstrate the efficacy of the proposed technique by means of analyses on the simulated data and real world mammography data. Results: For the simulated data, with three different known sources of variability, and for the real world mammography data with unknown sources of variability, the proposed classifier output calibration technique significantly reduced the variability in the classifiers' (sensitivity, specificity) pairs at fixed decision thresholds. Furthermore, for classifiers with monotonically or approximately monotonically related output variables, the histogram shaping technique also significantly reduced the variability in their actual output values. Conclusions: Classifier output calibration based on histogram shaping can be successfully employed to reduce the variability in the output values and (sensitivity, specificity) pairs of pattern classifiers with identical ROC curves, but differently distributed outputs.
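Histogram shaping of this kind amounts to a rank-based quantile mapping: each classifier's output is replaced by the value at the same rank in a reference sample, so both outputs end up identically distributed. A minimal sketch for continuous scores (the score arrays are invented, and ties/short samples are not handled):

```python
def histogram_match(values, reference):
    """Map each value to the reference sample's quantile of the same rank,
    so differently distributed classifier outputs share one histogram."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ref_sorted = sorted(reference)
    matched = [0.0] * len(values)
    for rank, i in enumerate(order):
        # pick the reference quantile at the same relative rank
        j = round(rank / (len(values) - 1) * (len(reference) - 1))
        matched[i] = ref_sorted[j]
    return matched

scores_a = [0.1, 0.9, 0.5, 0.3]      # one classifier's outputs
scores_b = [2.0, 8.0, 4.0, 6.0]      # same ROC, differently distributed
aligned = histogram_match(scores_a, scores_b)
```

The mapping is monotone, so it leaves the ROC curve unchanged while making fixed decision thresholds mean the same thing for both classifiers.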

  13. Estimating the body portion of CT volumes by matching histograms of visual words

    NASA Astrophysics Data System (ADS)

    Feulner, Johannes; Zhou, S. Kevin; Seifert, Sascha; Cavallaro, Alexander; Hornegger, Joachim; Comaniciu, Dorin

    2009-02-01

Being able to automatically determine which portion of the human body is shown by a CT volume image offers various possibilities like automatic labeling of images or initializing subsequent image analysis algorithms. This paper presents a method that takes a CT volume as input and outputs the vertical body coordinates of its top and bottom slice in a normalized coordinate system whose origin and unit length are determined by anatomical landmarks. Each slice of a volume is described by a histogram of visual words: Feature vectors consisting of an intensity histogram and a SURF descriptor are first computed on a regular grid and then classified into the closest visual words to form a histogram. The vocabulary of visual words is a quantization of the feature space by offline clustering a large number of feature vectors from prototype volumes into visual words (or cluster centers) via the K-Means algorithm. For a set of prototype volumes whose body coordinates are known the slice descriptions are computed in advance. The body coordinates of a test volume are computed by a 1D rigid registration of the test volume with the prototype volumes in the axial direction. The similarity of two slices is measured by comparing their histograms of visual words. Cross validation on a dataset of 44 volumes proved the robustness of the results. Even for test volumes of ca. 20 cm height, the average error was 15.8 mm.

  14. A contrast correction method for dental images based on histogram registration

    PubMed Central

    Economopoulos, TL; Asvestas, PA; Matsopoulos, GK; Gröndahl, K; Gröndahl, H-G

    2010-01-01

    Contrast correction is often required in digital subtraction radiography when comparing medical data acquired over different time periods owing to dissimilarities in the acquisition process. This paper focuses on dental radiographs and introduces a novel approach for correcting the contrast in dental image pairs. The proposed method modifies the subject images by applying typical registration techniques on their histograms. The proposed histogram registration method reshapes the histograms of the two subject images in such a way that these images are matched in terms of their contrast deviation. The method was extensively tested over 4 sets of dental images, consisting of 72 registered dental image pairs with unknown contrast differences as well as 20 dental pairs with known contrast differences. The proposed method was directly compared against the well-known histogram-based contrast correction method. The two methods were qualitatively and quantitatively evaluated for all 92 available dental image pairs. The two methods were compared in terms of the contrast root mean square difference between the reference image and the corrected image in each case. The obtained results were also verified statistically using appropriate t-tests in each set. The proposed method exhibited superior performance compared with the well-established method, in terms of the contrast root mean square difference between the reference and the corrected images. After suitable statistical analysis, it was deduced that the performance advantage of the proposed approach was statistically significant. PMID:20587655

  15. Pattern-histogram-based temporal change detection using personal chest radiographs

    NASA Astrophysics Data System (ADS)

    Ugurlu, Yucel; Obi, Takashi; Hasegawa, Akira; Yamaguchi, Masahiro; Ohyama, Nagaaki

    1999-05-01

    An accurate and reliable detection of temporal changes from a pair of images is of considerable interest in medical science. Traditional registration and subtraction techniques can be applied to extract temporal differences when the object is rigid or corresponding points are obvious. However, in radiological imaging, the loss of depth information, the elasticity of the object, the absence of clearly defined landmarks, and three-dimensional positioning differences constrain the performance of conventional registration techniques. In this paper, we propose a new method to detect interval changes accurately without using an image registration technique. The method is based on the construction of a so-called pattern histogram and a comparison procedure. The pattern histogram is a graphic representation of the frequency counts of all allowable patterns in the multi-dimensional pattern vector space. The K-means algorithm is employed to partition the pattern vector space successively. Any difference between the pattern histograms implies that different patterns are involved in the scenes. In our experiment, a pair of chest radiographs of pneumoconiosis is employed and the changing histogram bins are visualized on both images. We found that the method can be used as an alternative way of temporal change detection, particularly when precise image registration is not available.

  16. Large-Scale Merging of Histograms using Distributed In-Memory Computing

    NASA Astrophysics Data System (ADS)

    Blomer, Jakob; Ganis, Gerardo

    2015-12-01

    Most high-energy physics analysis jobs are embarrassingly parallel except for the final merging of the output objects, which are typically histograms. Currently, the merging of output histograms scales badly. The running time for distributed merging depends not only on the overall number of bins but also on the number of partial histogram output files. That means that, while the time to analyze data decreases linearly with the number of worker nodes, the time to merge the histograms in fact increases with the number of worker nodes. On the grid, merging jobs that take a few hours are not unusual. In order to improve the situation, we present a distributed and decentralized merging algorithm whose running time is independent of the number of worker nodes. We exploit the full bisection bandwidth of local networks and we keep all intermediate results in memory. We present benchmarks from an implementation using the parallel ROOT facility (PROOF) and RAMCloud, a distributed key-value store that keeps all data in DRAM.
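The core idea, merging partial histograms pairwise in a reduction tree so that the merge depth grows only logarithmically with the number of workers, can be sketched as follows. This is a sequential stand-in for what a PROOF/RAMCloud deployment would execute in parallel; the function name is illustrative.

```python
import numpy as np

def tree_merge(partials):
    """Merge partial histograms pairwise in log2(n) rounds, the way a
    decentralized reduction would, instead of funnelling every partial
    output file through a single merger."""
    layer = [np.asarray(p, dtype=np.int64) for p in partials]
    while len(layer) > 1:
        nxt = []
        for i in range(0, len(layer) - 1, 2):
            nxt.append(layer[i] + layer[i + 1])   # one pairwise merge
        if len(layer) % 2:                        # odd partial carries over
            nxt.append(layer[-1])
        layer = nxt
    return layer[0]
```

With n workers, each round halves the number of partial results, so if the rounds run concurrently the wall-clock merge time tracks the tree depth log2(n) rather than n.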

  17. Utility of histogram analysis of ADC maps for differentiating orbital tumors

    PubMed Central

    Xu, Xiao-Quan; Hu, Hao; Su, Guo-Yi; Liu, Hu; Hong, Xun-Ning; Shi, Hai-Bin; Wu, Fei-Yun

    2016-01-01

    PURPOSE We aimed to evaluate the role of histogram analysis of apparent diffusion coefficient (ADC) maps for differentiating benign and malignant orbital tumors. METHODS Fifty-two patients with orbital tumors were enrolled from March 2013 to November 2014. Pretreatment diffusion-weighted imaging was performed on a 3T magnetic resonance scanner with b factors of 0 and 800 s/mm2, and the corresponding ADC maps were generated. Whole-tumor regions of interest were drawn on all slices of the ADC maps to obtain histogram parameters, including ADCmean, ADCmedian, standard deviation (SD), skewness, kurtosis, quartile, ADC10, ADC25, ADC75, and ADC90. Histogram parameter differences between benign and malignant orbital tumors were compared. The diagnostic value of each significant parameter in predicting malignant tumors was established. RESULTS Age, ADCmean, ADCmedian, quartile, kurtosis, ADC10, ADC25, ADC75, and ADC90 parameters were significantly different between benign and malignant orbital tumor groups, while gender, location, SD, and skewness were not significantly different. The best diagnostic performance in predicting malignant orbital tumors was achieved at the threshold of ADC10=0.990 (AUC, 0.997; sensitivity, 96.2%; specificity, 100%). CONCLUSION Histogram analysis of ADC maps holds promise for differentiating benign and malignant orbital tumors. ADC10 has the potential to be the most significant parameter for predicting malignant orbital tumors. PMID:26829400
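The whole-tumor histogram parameters listed above can be computed from the pooled ROI voxel values roughly as follows. A numpy-only sketch: the skewness/kurtosis conventions here are the ordinary moment-based ones, which may differ from the conventions of the scanner or analysis software used in the study.

```python
import numpy as np

def adc_histogram_params(roi_values):
    """Whole-lesion histogram parameters of the kind used above:
    mean, median, SD, skewness, excess kurtosis, and percentiles."""
    x = np.asarray(roi_values, dtype=float).ravel()
    mu, sd = x.mean(), x.std()
    z = (x - mu) / sd
    return {
        "mean": mu,
        "median": np.median(x),
        "sd": sd,
        "skewness": (z ** 3).mean(),
        "kurtosis": (z ** 4).mean() - 3.0,   # excess kurtosis
        **{f"p{q}": np.percentile(x, q) for q in (10, 25, 75, 90)},
    }
```

ADC10 in the abstract corresponds to the 10th percentile (`p10` here), the parameter that gave the best diagnostic performance.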

  18. Human detection by quadratic classification on subspace of extended histogram of gradients.

    PubMed

    Satpathy, Amit; Jiang, Xudong; Eng, How-Lung

    2014-01-01

    This paper proposes a quadratic classification approach on the subspace of Extended Histogram of Gradients (ExHoG) for human detection. By investigating the limitations of Histogram of Gradients (HG) and Histogram of Oriented Gradients (HOG), ExHoG is proposed as a new feature for human detection. ExHoG alleviates the problem of discrimination between a dark object against a bright background and vice versa inherent in HG. It also resolves an issue of HOG whereby gradients of opposite directions in the same cell are mapped into the same histogram bin. We reduce the dimensionality of ExHoG using Asymmetric Principal Component Analysis (APCA) for improved quadratic classification. APCA also addresses the asymmetry issue in training sets of human detection where there are much fewer human samples than non-human samples. Our proposed approach is tested on three established benchmarking data sets--INRIA, Caltech, and Daimler--using a modified Minimum Mahalanobis distance classifier. Results indicate that the proposed approach outperforms current state-of-the-art human detection methods. PMID:23708804

  19. Principal component analysis of the CT density histogram to generate parametric response maps of COPD

    NASA Astrophysics Data System (ADS)

    Zha, N.; Capaldi, D. P. I.; Pike, D.; McCormack, D. G.; Cunningham, I. A.; Parraga, G.

    2015-03-01

    Pulmonary x-ray computed tomography (CT) may be used to characterize emphysema and airways disease in patients with chronic obstructive pulmonary disease (COPD). One analysis approach, parametric response mapping (PRM), utilizes registered inspiratory and expiratory CT image volumes and CT-density-histogram thresholds, but there is no consensus regarding the threshold values used, or their clinical meaning. Principal component analysis (PCA) of the CT density histogram can be exploited to quantify emphysema using data-driven CT-density-histogram thresholds. Thus, the objective of this proof-of-concept demonstration was to develop a PRM approach using PCA-derived thresholds in COPD patients and ex-smokers without airflow limitation. Methods: Fifteen COPD ex-smokers and 5 normal ex-smokers were evaluated. Thoracic CT images were acquired at full inspiration and full expiration and these images were non-rigidly co-registered. PCA was performed on the CT density histograms, from which the components with eigenvalues greater than one were summed. Since the values of the principal component curve correlate directly with the variability in the sample, the maximum and minimum points on the curve were used as threshold values for the PCA-adjusted PRM technique. Results: A significant correlation was determined between conventional and PCA-adjusted PRM with 3He MRI apparent diffusion coefficient (p<0.001), with CT RA950 (p<0.0001), as well as with 3He MRI ventilation defect percent, a measurement of both small airways disease (p=0.049 and p=0.06, respectively) and emphysema (p=0.02). Conclusions: PRM generated using PCA thresholds of the CT density histogram showed significant correlations with CT and 3He MRI measurements of emphysema, but not airways disease.

  20. [Fractal dimension and histogram method: algorithm and some preliminary results of noise-like time series analysis].

    PubMed

    Pancheliuga, V A; Pancheliuga, M S

    2013-01-01

    In the present work a methodological background for the histogram method of time series analysis is developed. The connection between the shapes of smoothed histograms constructed from short segments of time series of fluctuations and the fractal dimension of the segments is studied. It is shown that the fractal dimension possesses all the main properties of the histogram method. Based on this, a further development of the fractal dimension determination algorithm is proposed. This algorithm allows a more precise determination of the fractal dimension by using the "all possible combination" method. The application of the method to noise-like time series analysis leads to results which could previously be obtained only by means of the histogram method based on human expert comparisons of histogram shapes. PMID:23755565
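As a concrete illustration of estimating a fractal dimension from short noise-like segments, here is Higuchi's estimator, one standard method for time series; the paper's "all possible combination" refinement is not reproduced, so this is a generic sketch rather than the authors' algorithm.

```python
import numpy as np

def higuchi_fd(x, kmax=8):
    """Higuchi fractal dimension of a 1D time series: the slope of
    log(curve length L(k)) versus log(1/k) over coarse-graining scales k."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    ks, lengths = [], []
    for k in range(1, kmax + 1):
        lk = []
        for m in range(k):                       # k interleaved sub-series
            idx = np.arange(m, n, k)
            if len(idx) < 2:
                continue
            diff = np.abs(np.diff(x[idx])).sum()
            norm = (n - 1) / ((len(idx) - 1) * k)
            lk.append(diff * norm / k)           # normalized curve length
        ks.append(k)
        lengths.append(np.mean(lk))
    slope = np.polyfit(np.log(1.0 / np.array(ks)), np.log(lengths), 1)[0]
    return slope
```

A straight line has dimension 1, while Gaussian white noise comes out close to 2, which is the kind of contrast the histogram-shape comparisons in the paper are sensitive to.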

  1. Histogram and gray level co-occurrence matrix on gray-scale ultrasound images for diagnosing lymphocytic thyroiditis.

    PubMed

    Shin, Young Gyung; Yoo, Jaeheung; Kwon, Hyeong Ju; Hong, Jung Hwa; Lee, Hye Sun; Yoon, Jung Hyun; Kim, Eun-Kyung; Moon, Hee Jung; Han, Kyunghwa; Kwak, Jin Young

    2016-08-01

    The objective of the study was to evaluate whether texture analysis using histogram and gray level co-occurrence matrix (GLCM) parameters can help clinicians diagnose lymphocytic thyroiditis (LT) and differentiate LT according to pathologic grade. The background thyroid pathology of 441 patients was classified into no evidence of LT, chronic LT (CLT), and Hashimoto's thyroiditis (HT). Histogram and GLCM parameters were extracted from the regions of interest on ultrasound. The diagnostic performances of the parameters for diagnosing and differentiating LT were calculated. Of the histogram and GLCM parameters, the histogram mean had the highest Az (0.63) and VUS (0.303). As the degree of LT increased, the mean decreased while the standard deviation and entropy increased. The histogram mean from gray-scale ultrasound showed the best diagnostic performance as a single parameter for differentiating LT according to pathologic grade as well as for diagnosing LT. PMID:27336835
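A minimal sketch of extracting the four GLCM features named in records like this one (contrast, correlation, energy, homogeneity) from a quantized gray-scale ROI. This uses a single pixel offset and no symmetrization; real studies typically average several offsets, and the quantization level here is an illustrative choice.

```python
import numpy as np

def glcm_features(img, levels=8, dx=1, dy=0):
    """Gray-level co-occurrence matrix for one offset (dx, dy) and the
    four Haralick features: contrast, correlation, energy, homogeneity."""
    img = np.asarray(img)
    g = np.zeros((levels, levels), dtype=float)
    h, w = img.shape
    for y in range(h - dy):
        for x in range(w - dx):
            g[img[y, x], img[y + dy, x + dx]] += 1   # count co-occurring pairs
    g /= g.sum()
    i, j = np.indices(g.shape)
    mi, mj = (i * g).sum(), (j * g).sum()
    si = np.sqrt(((i - mi) ** 2 * g).sum())
    sj = np.sqrt(((j - mj) ** 2 * g).sum())
    contrast = ((i - j) ** 2 * g).sum()
    energy = (g ** 2).sum()
    homogeneity = (g / (1.0 + np.abs(i - j))).sum()
    correlation = (((i - mi) * (j - mj) * g).sum() / (si * sj)) if si * sj > 0 else 1.0
    return {"contrast": contrast, "correlation": correlation,
            "energy": energy, "homogeneity": homogeneity}
```

A perfectly uniform ROI gives contrast 0 and energy 1; increasing texture heterogeneity raises contrast and lowers energy, which is why these features can track tissue changes.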

  2. Adapting to the Revolution of Equal Opportunity for the Handicapped.

    ERIC Educational Resources Information Center

    Bailey, Cornelia W.

    1979-01-01

    Federal regulations regarding the handicapped pose problems for recipients of federal aid, but higher education's reactions have been more positive than negative. The principal problems seem to be compliance costs, the need for interpretation of the regulations, and difficulties in the areas of admissions and academic requirements. (Author/JMD)

  3. Equal Employment + Equal Pay = Multiple Problems for Colleges and Universities

    ERIC Educational Resources Information Center

    Steinbach, Sheldon Elliot; Reback, Joyce E.

    1974-01-01

    Issues involved in government regulation of university employment practices are discussed: confidentiality of records, pregnancy as a disability, alleged discrimination in benefits, tests and other employment criteria, seniority and layoff, reverse discrimination, use of statistics for determination of discrimination, and the Equal Pay Act. (JT)

  4. A Perspective on Diversity, Equality and Equity in Swedish Schools

    ERIC Educational Resources Information Center

    Johansson, Olof; Davis, Anna; Geijer, Luule

    2007-01-01

    This study presents policy and theory as they apply to diversity, equality and equity in Swedish social and educational policy. All education in Sweden should, according to the curriculum (Lpo 94, 1994, p. 5), be of equivalent value, irrespective of where in the country it is provided, and education should be adapted to each pupil's circumstances…

  5. Genomics and equal opportunity ethics.

    PubMed

    Cappelen, A W; Norheim, O F; Tungodden, B

    2008-05-01

    Genomics provides information on genetic susceptibility to diseases and new possibilities for interventions which can fundamentally alter the design of fair health policies. The aim of this paper is to explore implications of genomics from the perspective of equal opportunity ethics. The ideal of equal opportunity requires that individuals are held responsible for some, but not all, factors that affect their health. Informational problems, however, often make it difficult to implement the ideal of equal opportunity in the context of healthcare. In this paper, examples are considered of how new genetic information may affect the way individual responsibility for choice is assigned. It is also argued that genomics may result in relocation of the responsibility cut by providing both new information and new technology. Finally, how genomics may affect healthcare policies and the market for health insurance is discussed. PMID:18448717

  6. Higher Education and Equal Protection.

    ERIC Educational Resources Information Center

    Finnigan, John J.

    1979-01-01

    The effect of the Bakke case, in which the courts first encountered the question of legality of reverse discrimination, is explored; its constitutional significance is examined. It is concluded that the virtue of the decision is in its support of affirmative action and its equal protection implications. (MSE)

  7. Primer of Equal Employment Opportunity.

    ERIC Educational Resources Information Center

    Anderson, Howard J.

    This booklet presents laws and court cases concerning discrimination in hiring. It begins with a presentation of the laws and orders regulating equal employment opportunity and the remedies available. It lists those employees and employers to whom the laws apply and exemptions. Sections deal with discrimination on the basis of race, sex, sexual…

  8. Extending Understanding of Equal Protection.

    ERIC Educational Resources Information Center

    Dreyfuss, Elisabeth T.

    1988-01-01

    Presents four strategies for teaching secondary students about equal protection clause of the U.S. Constitution's Fourteenth Amendment. To be taught by the classroom teacher or a visiting lawyer, these strategies use such methods as a panel discussion and examination of Fourteenth Amendment court cases to accomplish their goals. (GEA)

  9. The Road to Racial Equality

    ERIC Educational Resources Information Center

    Tatum, Beverly Daniel

    2004-01-01

    In this article, the author describes how she was born in 1954, just four months after the Brown v. Board of Education Supreme Court decision outlawed the "separate but equal" doctrine of school segregation. She discusses how that fact has shaped her life immeasurably. Beginning with entering the world in Tallahassee, Fla., where her father taught…

  10. Equalizing Educational Opportunity Through Funding.

    ERIC Educational Resources Information Center

    McGary, Carroll R.

    This speech incorporates a major policy statement regarding school subsidies in Maine. The author discusses equality of educational opportunity; and he comments on the property tax, wealth-connected inequities, and class action suits. The speech focuses on a discussion of the Maine subsidy law and its effects; as well as the Maine property, sales,…

  11. The Forces of Information Equality.

    ERIC Educational Resources Information Center

    Davidow, William H.

    1996-01-01

    It is argued that college trustees and campus executives must understand the tools of information technology and be open to the changes created by equalized access to information, even if they threaten traditions and symbols. This means addressing new issues of organizational functions, changes in the order of teaching/learning and life events,…

  12. Religious Freedom vs. Sex Equality

    ERIC Educational Resources Information Center

    Song, Sarah

    2006-01-01

    This essay examines Susan Moller Okin's writing on conflicts between religious freedom and sex equality, and her criticism of "political liberal" approaches to these conflicts, which I take to be a part of her lifelong critique of the public-private distinction. I argue that, while Okin ultimately accepted a version of the distinction, she was…

  13. Equalization among Florida School Districts.

    ERIC Educational Resources Information Center

    Alexander, Kern; Shiver, Lee

    1983-01-01

    This statistical analysis of funding equalization from 1970 to 1981 evaluates the distributional equity achieved by Florida's school finance plan and examines the relationship between selected per pupil revenue measures and variables thought to influence school district spending, concluding that greater equity has not been attained. (MJL)

  14. An Adaptive Image Enhancement Technique by Combining Cuckoo Search and Particle Swarm Optimization Algorithm

    PubMed Central

    Ye, Zhiwei; Wang, Mingwei; Hu, Zhengbing; Liu, Wei

    2015-01-01

    Image enhancement is an important procedure in image processing and analysis. This paper presents a new technique that uses a modified quality measure and a blend of cuckoo search and particle swarm optimization (CS-PSO) to enhance low-contrast images adaptively. Contrast enhancement is obtained by a global transformation of the input intensities; the method employs the incomplete Beta function as the transformation function and a novel criterion for measuring image quality that considers three factors: threshold, entropy value, and the gray-level probability density of the image. The enhancement process is a nonlinear optimization problem with several constraints. CS-PSO is utilized to maximize the objective fitness criterion in order to enhance the contrast and detail in an image by adapting the parameters of a novel extension to a local enhancement technique. The performance of the proposed method has been compared with existing techniques such as linear contrast stretching, histogram equalization, and evolutionary-computing-based image enhancement methods like the backtracking search algorithm, differential search algorithm, genetic algorithm, and particle swarm optimization, in terms of processing time and image quality. Experimental results demonstrate that the proposed method is robust and adaptive and exhibits better performance than the other methods considered. PMID:25784928

  15. An adaptive image enhancement technique by combining cuckoo search and particle swarm optimization algorithm.

    PubMed

    Ye, Zhiwei; Wang, Mingwei; Hu, Zhengbing; Liu, Wei

    2015-01-01

    Image enhancement is an important procedure in image processing and analysis. This paper presents a new technique that uses a modified quality measure and a blend of cuckoo search and particle swarm optimization (CS-PSO) to enhance low-contrast images adaptively. Contrast enhancement is obtained by a global transformation of the input intensities; the method employs the incomplete Beta function as the transformation function and a novel criterion for measuring image quality that considers three factors: threshold, entropy value, and the gray-level probability density of the image. The enhancement process is a nonlinear optimization problem with several constraints. CS-PSO is utilized to maximize the objective fitness criterion in order to enhance the contrast and detail in an image by adapting the parameters of a novel extension to a local enhancement technique. The performance of the proposed method has been compared with existing techniques such as linear contrast stretching, histogram equalization, and evolutionary-computing-based image enhancement methods like the backtracking search algorithm, differential search algorithm, genetic algorithm, and particle swarm optimization, in terms of processing time and image quality. Experimental results demonstrate that the proposed method is robust and adaptive and exhibits better performance than the other methods considered. PMID:25784928
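The incomplete-Beta transformation named in the two records above can be sketched as a numerically built intensity lookup. A numpy-only sketch: `a` and `b` are the shape parameters the CS-PSO search would tune, and the grid size is an arbitrary discretization choice.

```python
import numpy as np

def beta_transform(img, a, b, n=1024):
    """Global contrast transform through the regularized incomplete
    Beta function, built numerically as the normalized cumulative sum
    of the Beta(a, b) density on a grid."""
    t = np.linspace(0.0, 1.0, n)
    pdf = t ** (a - 1) * (1.0 - t) ** (b - 1)
    pdf[~np.isfinite(pdf)] = 0.0        # guard the open endpoints for a<1 or b<1
    cdf = np.cumsum(pdf)
    cdf /= cdf[-1]
    u = (img - img.min()) / (img.max() - img.min())   # normalize intensities to [0,1]
    return np.interp(u, t, cdf)
```

With a = b = 2 the mapping is S-shaped (mid-tones stretched, extremes compressed); skewing a and b shifts the stretch toward dark or bright regions, which is the degree of freedom the optimizer exploits.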

  16. Optical performance monitoring technique using software-based synchronous amplitude histograms.

    PubMed

    Choi, H G; Chang, J H; Kim, Hoon; Chung, Y C

    2014-10-01

    We propose and demonstrate a simple technique to monitor both the optical signal-to-noise ratio (OSNR) and chromatic dispersion (CD) by using the software-based synchronous amplitude histogram (SAH) analysis. We exploit the software-based synchronization technique to construct SAHs from the asynchronously sampled intensities of the signal. The use of SAHs facilitates the accurate extraction of the monitoring parameters at the center of the symbol. Thus, unlike in the case of using the technique based on the asynchronous amplitude histogram (AAH), this technique is not affected by the transient characteristics of the modulated signals. The performance of the proposed monitoring technique is evaluated experimentally by using 10-Gbaud quadrature phase-shift keying (QPSK) and quadrature amplitude modulation (QAM) signals over wide ranges of OSNR and CD. We also evaluate the robustness of the proposed technique to the signal's transient characteristics. PMID:25321978

  17. High Capacity Reversible Watermarking for Audio by Histogram Shifting and Predicted Error Expansion

    PubMed Central

    Wang, Fei; Chen, Zuo

    2014-01-01

    Being reversible, the watermarking information embedded in audio signals can be extracted while the original audio data achieves lossless recovery. Currently, the few reversible audio watermarking algorithms are confronted with the following problems: relatively low SNR (signal-to-noise ratio) of the embedded audio; a large amount of auxiliary embedded location information; and the absence of accurate capacity control capability. In this paper, we present a novel reversible audio watermarking scheme based on improved prediction error expansion and histogram shifting. First, we use a differential evolution algorithm to optimize the prediction coefficients and then apply prediction error expansion to output the stego data. Second, in order to reduce the length of the location map bits, we introduce a histogram shifting scheme. Meanwhile, our scheme can compute the prediction error modification threshold corresponding to a given embedding capacity. Experiments show that this algorithm improves the SNR of embedded audio signals and the embedding capacity, drastically reduces the length of the location map bits, and enhances capacity control capability. PMID:25097883
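The histogram-shifting half of such schemes can be shown standalone on integer samples: pick the histogram's peak bin as the carrier, shift the bins between the peak and an empty bin by one to open a slot, and embed each bit as peak vs. peak+1. This is the classical peak/zero-bin construction; the paper pairs it with prediction-error expansion, which is omitted here, and the function names are illustrative.

```python
import numpy as np

def hs_embed(signal, bits):
    """Reversible histogram-shifting embed on non-negative integer samples."""
    x = np.asarray(signal, dtype=np.int64).copy()
    hist = np.bincount(x, minlength=x.max() + 2)
    peak = hist.argmax()
    zero = peak + 1 + hist[peak + 1:].argmin()      # an empty bin right of the peak
    assert hist[zero] == 0, "need an empty bin for lossless recovery"
    x[(x > peak) & (x < zero)] += 1                 # shift to free up bin peak+1
    carriers = np.flatnonzero(x == peak)
    assert len(bits) <= len(carriers), "payload exceeds peak-bin capacity"
    x[carriers[:len(bits)]] += np.asarray(bits)     # bit 1 -> peak+1, bit 0 -> peak
    return x, peak, zero, len(bits)

def hs_extract(x, peak, zero, nbits):
    """Extract the payload and losslessly restore the original samples."""
    carriers = np.flatnonzero((x == peak) | (x == peak + 1))
    bits = (x[carriers[:nbits]] == peak + 1).astype(int)
    y = x.copy()
    y[carriers[:nbits]] = peak                      # undo the embedding
    y[(y > peak) & (y <= zero)] -= 1                # undo the shift
    return bits, y
```

The embedding capacity equals the peak-bin count, and every sample moves by at most one level, which is why histogram shifting keeps distortion low while remaining exactly invertible.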

  18. Liver fibrosis grading using multiresolution histogram information in real-time elastography

    NASA Astrophysics Data System (ADS)

    Albouy-Kissi, A.; Sarry, L.; Massoulier, S.; Bonny, C.; Randl, K.; Abergel, A.

    2010-03-01

    Despite many limitations, liver biopsy remains the gold standard method for grading and staging liver fibrosis. Several modalities have been developed for a non-invasive assessment of liver diseases. Real-time elastography may constitute a true alternative to liver biopsy by providing an image of the tissular elasticity distribution correlated to the fibrosis grade. In this paper, we investigate a new approach to the assessment of liver fibrosis through the classification of fibrosis morphometry. The multiresolution histogram, based on a combination of intensity and texture features, has been tested as the feature space, and the ability of such multiresolution histograms to discriminate fibrosis grade has been demonstrated. The results have been obtained on seventeen patients who underwent real-time elastography and a FibroScan examination.

  19. Bias or equality? Unconscious thought equally integrates temporally scattered information.

    PubMed

    Li, Jiansheng; Gao, Qiyang; Zhou, Jifan; Li, Xinyu; Zhang, Meng; Shen, Mowei

    2014-04-01

    In previous experiments on unconscious thought, information was presented to participants in one continuous session; however, in daily life, information is delivered in a temporally partitioned way. We examined whether unconscious thought could equally integrate temporally scattered information when making overall evaluations. When presenting participants with information in two temporally partitioned sessions, participants' overall evaluation was based on neither the information in the first session (Experiment 1) nor that in the second session (Experiment 2); instead, information from both sessions was equally integrated to reach a final judgment. Conscious thought, however, overemphasized information in the second session. Experiments 3 and 4 further ruled out possible influencing factors, including differences in the distributions of positive/negative attributes in the first and second sessions and on-line judgment. These findings suggest that unconscious thought can integrate information from a wider range of periods during an evaluation, while conscious thought cannot. PMID:24583456

  20. Feasibility of histogram analysis of susceptibility-weighted MRI for staging of liver fibrosis

    PubMed Central

    Yang, Zhao-Xia; Liang, He-Yue; Hu, Xin-Xing; Huang, Ya-Qin; Ding, Ying; Yang, Shan; Zeng, Meng-Su; Rao, Sheng-Xiang

    2016-01-01

    PURPOSE We aimed to evaluate whether histogram analysis of susceptibility-weighted imaging (SWI) could quantify liver fibrosis grade in patients with chronic liver disease (CLD). METHODS Fifty-three patients with CLD who underwent multi-echo SWI (TEs of 2.5, 5, and 10 ms) were included. Histogram analysis of SWI images were performed and mean, variance, skewness, kurtosis, and the 1st, 10th, 50th, 90th, and 99th percentiles were derived. Quantitative histogram parameters were compared. For significant parameters, further receiver operating characteristic (ROC) analyses were performed to evaluate the potential diagnostic performance for differentiating liver fibrosis stages. RESULTS The number of patients in each pathologic fibrosis grade was 7, 3, 5, 5, and 33 for F0, F1, F2, F3, and F4, respectively. The results of variance (TE: 10 ms), 90th percentile (TE: 10 ms), and 99th percentile (TE: 10 and 5 ms) in F0–F3 group were significantly lower than in F4 group, with areas under the ROC curves (AUCs) of 0.84 for variance and 0.70–0.73 for the 90th and 99th percentiles, respectively. The results of variance (TE: 10 and 5 ms), 99th percentile (TE: 10 ms), and skewness (TE: 2.5 and 5 ms) in F0–F2 group were smaller than those of F3/F4 group, with AUCs of 0.88 and 0.69 for variance (TE: 10 and 5 ms, respectively), 0.68 for 99th percentile (TE: 10 ms), and 0.73 and 0.68 for skewness (TE: 2.5 and 5 ms, respectively). CONCLUSION Magnetic resonance histogram analysis of SWI, particularly the variance, is promising for predicting advanced liver fibrosis and cirrhosis. PMID:27113421

  1. Digital image classification with the help of artificial neural network by simple histogram

    PubMed Central

    Dey, Pranab; Banerjee, Nirmalya; Kaur, Rajwant

    2016-01-01

    Background: Visual image classification is a great challenge to the cytopathologist in routine day-to-day work. Artificial neural network (ANN) may be helpful in this matter. Aims and Objectives: In this study, we have tried to classify digital images of malignant and benign cells in effusion cytology smear with the help of simple histogram data and ANN. Materials and Methods: A total of 404 digital images consisting of 168 benign cells and 236 malignant cells were selected for this study. The simple histogram data was extracted from these digital images and an ANN was constructed with the help of Neurointelligence software [Alyuda Neurointelligence 2.2 (577), Cupertino, California, USA]. The network architecture was 6-3-1. The images were classified as training set (281), validation set (63), and test set (60). The on-line backpropagation training algorithm was used for this study. Result: A total of 10,000 iterations were done to train the ANN system with the speed of 609.81/s. After the adequate training of this ANN model, the system was able to identify all 34 malignant cell images and 24 out of 26 benign cells. Conclusion: The ANN model can be used for the identification of the individual malignant cells with the help of simple histogram data. This study will be helpful in the future to identify malignant cells in unknown situations. PMID:27279679

  2. Statistical Analysis of Photopyroelectric Signals using Histogram and Kernel Density Estimation for differentiation of Maize Seeds

    NASA Astrophysics Data System (ADS)

    Rojas-Lima, J. E.; Domínguez-Pacheco, A.; Hernández-Aguilar, C.; Cruz-Orea, A.

    2016-09-01

    Considering the necessity of photothermal alternative approaches for characterizing nonhomogeneous materials like maize seeds, the objective of this research work was to analyze statistically the amplitude variations of photopyroelectric signals, by means of nonparametric techniques such as the histogram and the kernel density estimator, and the probability density function of the amplitude variations of two genotypes of maize seeds with different pigmentations and structural components: crystalline and floury. To determine if the probability density function had a known parametric form, the histogram was determined; it did not present a known parametric form, so the kernel density estimator using the Gaussian kernel, with an efficiency of 95% in density estimation, was used to obtain the probability density function. The results obtained indicated that maize seeds could be differentiated in terms of the statistical values for floury and crystalline seeds such as the mean (93.11, 159.21), variance (1.64 × 10³, 1.48 × 10³), and standard deviation (40.54, 38.47) obtained from the amplitude variations of photopyroelectric signals in the case of the histogram approach. For the case of the kernel density estimator, seeds can be differentiated in terms of the kernel bandwidth or smoothing constant h of 9.85 and 6.09 for floury and crystalline seeds, respectively.
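The two nonparametric estimates being compared, a histogram and a Gaussian-kernel density estimate with smoothing constant h, can be sketched as follows. Silverman's rule stands in here for whatever bandwidth selection the authors used; the h values of 9.85 and 6.09 quoted above come from their data, not from this sketch.

```python
import numpy as np

def gaussian_kde(samples, grid, h=None):
    """Gaussian-kernel density estimate on a grid; the bandwidth h
    defaults to Silverman's normal-reference rule."""
    x = np.asarray(samples, dtype=float)
    if h is None:
        h = 1.06 * x.std() * len(x) ** (-1 / 5)    # Silverman's rule
    z = (grid[:, None] - x[None, :]) / h
    dens = np.exp(-0.5 * z ** 2).sum(axis=1) / (len(x) * h * np.sqrt(2 * np.pi))
    return dens, h
```

Unlike a histogram, the KDE has no bin-edge artifacts, and the single parameter h plays the role the bin width plays for histograms, which is why the two seed genotypes can be separated by their fitted h values.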

  3. A comparison of histogram distance metrics for content-based image retrieval

    NASA Astrophysics Data System (ADS)

    Zhang, Qianwen; Canosa, Roxanne L.

    2014-03-01

    The type of histogram distance metric selected for a CBIR query varies greatly and will affect the accuracy of the retrieval results. This paper compares the retrieval results of a variety of commonly used CBIR distance metrics: the Euclidean distance, the Manhattan distance, the vector cosine angle distance, the histogram intersection distance, the χ² distance, the Jensen-Shannon divergence, and the Earth Mover's distance. A training set of ground-truth labeled images is used to build a classifier for the CBIR system, where the images were obtained from three commonly used benchmarking datasets: the WANG dataset (http://savvash.blogspot.com/2008/12/benchmark-databases-for-cbir.html), the Corel Subset dataset (http://vision.stanford.edu/resources_links.html), and the CalTech dataset (http://www.vision.caltech.edu/htmlfiles/). To implement the CBIR system, we use the Tamura texture features of coarseness, contrast, and directionality. We create texture histograms of the training set and the query images, and then measure the difference between a randomly selected query and the corresponding retrieved image using a k-nearest-neighbors approach. Precision and recall are used to evaluate the retrieval performance of the system, given a particular distance metric. Then, given the same query image, the distance metric is changed and the performance of the system is evaluated once again.
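The metrics compared in this study can be written compactly for normalized 1D histograms. One note on the sketch: the 1D Earth Mover's distance reduces to the L1 distance between CDFs; the general multi-dimensional EMD would need a transport solver, which is not shown here.

```python
import numpy as np

def histogram_distances(h1, h2, eps=1e-12):
    """The distance metrics compared above, for two 1D histograms
    (normalized internally).  Intersection is reported as a distance."""
    p = np.asarray(h1, dtype=float)
    q = np.asarray(h2, dtype=float)
    p, q = p / p.sum(), q / q.sum()
    m = 0.5 * (p + q)
    kl = lambda a, b: np.sum(a * np.log((a + eps) / (b + eps)))  # smoothed KL
    return {
        "euclidean": np.linalg.norm(p - q),
        "manhattan": np.abs(p - q).sum(),
        "cosine_angle": 1.0 - p @ q / (np.linalg.norm(p) * np.linalg.norm(q)),
        "intersection": 1.0 - np.minimum(p, q).sum(),
        "chi_square": 0.5 * np.sum((p - q) ** 2 / (p + q + eps)),
        "jensen_shannon": 0.5 * kl(p, m) + 0.5 * kl(q, m),
        "emd_1d": np.abs(np.cumsum(p) - np.cumsum(q)).sum(),
    }
```

A useful contrast: for two non-overlapping histograms, the bin-to-bin metrics (Manhattan, intersection, χ²) saturate at the same value regardless of how far apart the mass sits, while the EMD keeps growing with the shift, which is exactly the kind of behavioral difference that changes retrieval rankings.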

  4. LOR-OSEM: statistical PET reconstruction from raw line-of-response histograms

    PubMed Central

    Kadrmas, Dan J

    2010-01-01

    Iterative statistical reconstruction methods are becoming the standard in positron emission tomography (PET). Conventional maximum-likelihood expectation-maximization (MLEM) and ordered-subsets (OSEM) algorithms act on data which has been pre-processed into corrected, evenly-spaced histograms; however, such pre-processing corrupts the Poisson statistics. Recent advances have incorporated attenuation, scatter, and randoms compensation into the iterative reconstruction. The objective of this work was to incorporate the remaining preprocessing steps, including arc correction, to reconstruct directly from raw unevenly-spaced line-of-response (LOR) histograms. This exactly preserves Poisson statistics and full spatial information in a manner closely related to listmode ML, making full use of the ML statistical model. The LOR-OSEM algorithm was implemented using a rotation-based projector which maps directly to the unevenly-spaced LOR grid. Simulation and phantom experiments were performed to characterize resolution, contrast, and noise properties for 2D PET. LOR-OSEM provided a beneficial noise-resolution tradeoff, outperforming AW-OSEM by about the same margin that AW-OSEM outperformed pre-corrected OSEM. The relationship between LOR-ML and listmode ML algorithms was explored, and implementation differences are discussed. LOR-OSEM is a viable alternative to AW-OSEM for histogram-based reconstruction with improved spatial resolution and noise properties. PMID:15566171

  5. Flat-histogram Monte Carlo in the Classical Antiferromagnetic Ising Model

    NASA Astrophysics Data System (ADS)

    Brown, G.; Rikvold, P. A.; Nicholson, D. M.; Odbadrakh, Kh.; Yin, J.-Q.; Eisenbach, M.; Miyashita, S.

    2014-03-01

    Flat-histogram Monte Carlo methods, such as Wang-Landau and multicanonical sampling, are extremely useful in numerical studies of frustrated magnetic systems. Numerical tools such as windowing and discrete histograms introduce discontinuities along the continuous energy variable, which in turn introduce artifacts into the calculated density of states. We demonstrate these effects and introduce practical solutions, including "guard regions" with biased walks for windowing and analytic representations for histograms. The classical Ising antiferromagnet supplemented by a mean-field interaction is considered. In zero field, the allowed energies are discrete and the artifacts can be avoided in small systems by not binning. For large systems, or cases where non-zero fields are used to break the degeneracy between local energy minima, the energy becomes continuous and these artifacts must be taken into account. Work performed at ORNL, managed by UT-Battelle for the US DOE; sponsored by Div of Mat Sci & Eng, Office of BES; used resources of Oak Ridge Leadership Computing Facility at ORNL, supported by Office of Science Contract DE-AC05-00OR22725.
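    The flat-histogram (Wang-Landau) idea can be illustrated on a toy model with discrete energies, where binning artifacts do not arise: take the "energy" to be the number of up spins among n independent spins, whose exact density of states is the binomial coefficient. The chunk size, flatness criterion, and final modification factor below are illustrative parameter choices:

```python
import numpy as np

def wang_landau(n_spins=8, f_final=1e-4, flatness=0.8, seed=0):
    """Wang-Landau estimate of ln g(E) for a toy model whose energy E is
    the number of up spins, so exactly g(E) = binomial(n_spins, E)."""
    rng = np.random.default_rng(seed)
    spins = rng.integers(0, 2, n_spins)
    E = int(spins.sum())
    ln_g = np.zeros(n_spins + 1)   # running estimate of ln g(E)
    H = np.zeros(n_spins + 1)      # visit histogram, checked for flatness
    ln_f = 1.0                     # modification factor, halved at each stage
    while ln_f > f_final:
        for _ in range(5000):
            i = rng.integers(n_spins)
            E_new = E + (1 - 2 * spins[i])        # a flip changes the count by +-1
            dlg = ln_g[E] - ln_g[E_new]
            if dlg >= 0 or rng.random() < np.exp(dlg):
                spins[i] ^= 1
                E = E_new
            ln_g[E] += ln_f                        # always update the current state
            H[E] += 1
        if H.min() > flatness * H.mean():          # flat-histogram check
            H[:] = 0
            ln_f /= 2.0
    return ln_g - ln_g[0]                          # normalize so ln g(0) = 0
```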

  6. Office of Equal Opportunity Programs

    NASA Technical Reports Server (NTRS)

    Chin, Jennifer L.

    2004-01-01

    The NASA Glenn Office of Equal Opportunity Programs works to provide quality service for all programs and/or to assist the Center in becoming a model workplace. During the summer of 2004, I worked with Deborah Cotleur along with other staff members to create and modify customer satisfaction surveys. This office aims to assist in developing a model workplace by providing functions as a change agent to the center by serving as an advisor to management to ensure equity throughout the Center. In addition, the office serves as a mediator for the Center in addressing issues and concerns. Lastly, the office provides assistance to employees to enable attainment of personal and organizational goals. The Office of Equal Opportunities is a staff office which reports and provides advice to the Center Director and Executive Leadership, implements laws, regulations, and presidential executive orders, and provides center wide leadership and assistance to NASA GRC employees. Some of the major responsibilities of the office include working with the discrimination complaints program, special emphasis programs (advisory groups), management support, monitoring and evaluation, contract compliance, and community outreach. During my internship in this office, my main objective was to create four customer satisfaction surveys based on EO retreats, EO observances, EO advisory boards, and EO mediation/counseling. I created these surveys after conducting research on past events and surveys as well as similar survey research created and conducted by other NASA centers, program for EO Advisory group members, leadership training sessions for supervisors, preventing sexual harassment training sessions, and observance events. I also conducted research on the style and format from feedback surveys from the Marshall Equal Opportunity website, the Goddard website, and the main NASA website. Using the material from the Office of Equal Opportunity Programs at Glenn Research Center along with my

  7. Measuring kinetics of complex single ion channel data using mean-variance histograms.

    PubMed Central

    Patlak, J B

    1993-01-01

    The measurement of single ion channel kinetics is difficult when those channels exhibit subconductance events. When the kinetics are fast, and when the current magnitudes are small, as is the case for Na+, Ca2+, and some K+ channels, these difficulties can lead to serious errors in the estimation of channel kinetics. I present here a method, based on the construction and analysis of mean-variance histograms, that can overcome these problems. A mean-variance histogram is constructed by calculating the mean current and the current variance within a brief "window" (a set of N consecutive data samples) superimposed on the digitized raw channel data. Systematic movement of this window over the data produces large numbers of mean-variance pairs which can be assembled into a two-dimensional histogram. Defined current levels (open, closed, or sublevel) appear in such plots as low variance regions. The total number of events in such low variance regions is estimated by curve fitting and plotted as a function of window width. This function decreases with the same time constants as the original dwell time probability distribution for each of the regions. The method can therefore be used: 1) to present a qualitative summary of the single channel data from which the signal-to-noise ratio, open channel noise, steadiness of the baseline, and number of conductance levels can be quickly determined; 2) to quantify the dwell time distribution in each of the levels exhibited. In this paper I present the analysis of a Na+ channel recording that had a number of complexities. The signal-to-noise ratio was only about 8 for the main open state, open channel noise, and fast flickers to other states were present, as were a substantial number of subconductance states. "Standard" half-amplitude threshold analysis of these data produce open and closed time histograms that were well fitted by the sum of two exponentials, but with apparently erroneous time constants, whereas the mean
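    The construction of mean-variance pairs from a window slid over the digitized trace can be sketched as follows; the window size and bin count are illustrative parameters:

```python
import numpy as np

def mean_variance_pairs(trace, window):
    """Slide a window of `window` consecutive samples over a current
    trace and return the (mean, variance) pair at every position."""
    trace = np.asarray(trace, dtype=float)
    kernel = np.ones(window) / window
    mean = np.convolve(trace, kernel, mode="valid")          # windowed mean
    mean_sq = np.convolve(trace ** 2, kernel, mode="valid")  # windowed mean of squares
    var = mean_sq - mean ** 2                                # Var = E[x^2] - E[x]^2
    return mean, var

def mean_variance_histogram(trace, window, bins=50):
    """Assemble the pairs into the 2D histogram described in the text;
    defined current levels appear as low-variance clusters."""
    m, v = mean_variance_pairs(trace, window)
    return np.histogram2d(m, v, bins=bins)
```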

  8. The equal right to drink.

    PubMed

    Schmidt, Laura A

    2014-11-01

    The starting place for this essay is Knupfer and Room's insight that more restrictive norms around drinking and intoxication tend to be selectively applied to the economically dependent segments of society, such as women. However, since these authors wrote in 1964, women in the US and many other societies around the globe have experienced rising economic independence. The essay considers how the moral categories of acceptable drinking and drunkenness may have shifted alongside women's rising economic independence, and looks at evidence on the potential consequences for women's health and wellbeing. I argue that, as women have gained economic independence, changes in drinking norms have produced two different kinds of negative unintended consequences for women at high and low extremes of economic spectrum. As liberated women of the middle and upper classes have become more economically equal to men, they have enjoyed the right to drink with less restraint. For them, alongside the equal right to drink has come greater equality in exposure to alcohol-attributable harms, abuse and dependence. I further suggest that, as societies become more liberated, the economic dependency of low-income women is brought into greater question. Under such conditions, women in poverty-particularly those economically dependent on the state, such as welfare mothers-have become subject to more restrictive norms around drinking and intoxication, and more punitive social controls. PMID:25303360

  9. An adaptive algorithm for low contrast infrared image enhancement

    NASA Astrophysics Data System (ADS)

    Liu, Sheng-dong; Peng, Cheng-yuan; Wang, Ming-jia; Wu, Zhi-guo; Liu, Jia-qi

    2013-08-01

    An adaptive enhancement algorithm for low-contrast infrared images is proposed in this paper to address the problem that conventional enhancement algorithms cannot effectively identify the region of interest when the image dynamic range is large. Starting from the characteristics of human visual perception, the algorithm combines global adaptive enhancement with local feature boosting, so that the contrast of the image is raised and its texture is rendered more distinctly. First, the global dynamic range is adjusted: a correspondence is established between the dynamic range of the original image and the display grayscale, raising the gray levels of bright objects while lowering those of dark targets, which improves the overall image contrast. Second, a filtering operation applied to each pixel and its neighborhood extracts local texture information and adjusts the brightness of the current pixel to enhance the local contrast of the image. This overcomes the tendency of traditional edge-detection algorithms to blur outlines and preserves the distinctness of texture detail during enhancement. Finally, the globally adjusted image and the locally adjusted image are normalized and combined to ensure a smooth transition of image details. Extensive experiments compare the proposed algorithm with other conventional enhancement algorithms, using two groups of blurred IR images. The results show that histogram equalization boosts overall contrast but leaves details unclear, the Retinex algorithm makes details distinguishable, and the proposed self-adaptive algorithm yields clear detail with contrast markedly improved over Retinex.
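    The two-stage scheme described in this abstract (a global range mapping followed by a local detail boost, then a clipped combination) might be roughly sketched as below. The gain, window size, and box-filter choice are invented for illustration; this is not the authors' algorithm:

```python
import numpy as np

def enhance_ir(img, detail_gain=2.0, win=7):
    """Toy global + local IR enhancement: stretch the full dynamic range
    to 8-bit display grayscale, then amplify the difference between each
    pixel and its local neighborhood mean to boost local contrast."""
    img = np.asarray(img, dtype=float)
    # global step: map the image's dynamic range onto the display grayscale
    lo, hi = img.min(), img.max()
    base = (img - lo) / (hi - lo + 1e-12) * 255.0
    # local step: neighborhood mean via a separable box filter
    k = np.ones(win) / win
    local_mean = np.apply_along_axis(lambda r: np.convolve(r, k, "same"), 1, base)
    local_mean = np.apply_along_axis(lambda c: np.convolve(c, k, "same"), 0, local_mean)
    detail = base - local_mean            # local texture information
    return np.clip(base + detail_gain * detail, 0.0, 255.0)
```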

  10. Postimplantation Analysis Enables Improvement of Dose-Volume Histograms and Reduction of Toxicity for Permanent Seed Implantation

    SciTech Connect

    Wust, Peter Postrach, Johanna; Kahmann, Frank; Henkel, Thomas; Graf, Reinhold; Cho, Chie Hee; Budach, Volker; Boehmer, Dirk

    2008-05-01

    Purpose: To demonstrate how postimplantation analysis is useful for improving permanent seed implantation and reducing toxicity. Patients and Methods: We evaluated 197 questionnaires completed by patients after permanent seed implantation (monotherapy between 1999 and 2003). For 70% of these patients, a computed tomography was available to perform postimplantation analysis. The index doses and volumes of the dose-volume histograms (DVHs) were determined and categorized with respect to the date of implantation. Differences in symptom scores relative to pretherapeutic status were analyzed with regard to follow-up times and DVH descriptors. Acute and subacute toxicities in a control group of 117 patients from an earlier study (June 1999 to September 2001) by Wust et al. (2004) were compared with a matched subgroup from this study equaling 110 patients treated between October 2001 and August 2003. Results: Improved performance, identifying a characteristic time dependency of DVH parameters (after implantation) and toxicity scores, was demonstrated. Although coverage (volume covered by 100% of the prescription dose of the prostate) increased slightly, high-dose regions decreased with the growing experience of the users. Improvement in the DVH and a reduction of toxicities were found in the patient group implanted in the later period. A decline in symptoms with follow-up time counteracts this gain of experience and must be considered. Urinary and sexual discomfort was enhanced by dose heterogeneities (e.g., dose covering 10% of the prostate volume, volume covered by 200% of prescription dose). In contrast, rectal toxicities correlated with exposed rectal volumes, especially the rectal volume covered by 100% of the prescription dose. Conclusion: The typical side effects occurring after permanent seed implantation can be reduced by improving the dose distributions. 
An improvement in dose distributions and a reduction of toxicities were identified with elapsed time between

  11. Equal is as equal does: challenging Vatican views on women.

    PubMed

    1995-01-01

    The authors of this piece are women from the Roman Catholic tradition who are critical of the Vatican position on women's rights. The Report of the Holy See in Preparation for the Fourth World Conference on Women reveals a religious fundamentalism that misuses tradition and anthropology to limit women's roles and rights. The Vatican is itself a self-proclaimed state that offers women neither opportunities nor protections within its own organization, and there is no evidence of women's participation in the preparation of its report. The Vatican document constructs a vision of women and men in which men are normative persons, whose dignity is conferred by their humanity, and women are the variant other, defined by and granted dignity by their reproductive and mothering functions. The Vatican document is anti-feminist. It criticizes the "radical feminists" of the 1960s for trying to deny sexual differences, and accuses today's Western feminists of ignoring the needs of women in developing countries while pursuing selfish and hedonistic goals. It makes no recognition of the work of feminists to improve the lives of women worldwide. The Vatican document claims to support women's equality, but it qualifies each statement of equality with a presumption of difference. The document defines women as vulnerable without naming men as responsible for the oppression and violence to which women are vulnerable. It ridicules as feminist cant the well-documented fact that the home is the setting of most violence against women. The Vatican decries the suffering families undergo as a result of compulsory birth control and abortion policies, while it would deny families sex education, contraceptives, and safe abortion, thereby making pregnancy compulsory. 
A new vision of social justice is needed, one that: 1) rests on a radical equality, in which both women and men are expected to contribute to work, education, culture, morality, and reproduction; 2) accepts a "discipleship of equals

  12. The Business of Equal Opportunity.

    ERIC Educational Resources Information Center

    Dickson, Reginald D.

    1992-01-01

    The author describes his journey from poor African-American youth in the rural South to successful businessman. He discusses the Inroads program, an internship for African-American and Hispanic youth and advises giving up victimhood and adapting to the mainstream of capitalism. (SK)

  13. Enhancing tumor apparent diffusion coefficient histogram skewness stratifies the postoperative survival in recurrent glioblastoma multiforme patients undergoing salvage surgery.

    PubMed

    Zolal, Amir; Juratli, Tareq A; Linn, Jennifer; Podlesek, Dino; Sitoci Ficici, Kerim Hakan; Kitzler, Hagen H; Schackert, Gabriele; Sobottka, Stephan B; Rieger, Bernhard; Krex, Dietmar

    2016-05-01

    Objective To determine the value of apparent diffusion coefficient (ADC) histogram parameters for the prediction of individual survival in patients undergoing surgery for recurrent glioblastoma (GBM) in a retrospective cohort study. Methods Thirty-one patients who underwent surgery for first recurrence of a known GBM between 2008 and 2012 were included. The following parameters were collected: age, sex, enhancing tumor size, mean ADC, median ADC, ADC skewness, ADC kurtosis and fifth percentile of the ADC histogram, initial progression free survival (PFS), extent of second resection and further adjuvant treatment. The association of these parameters with survival and PFS after second surgery was analyzed using log-rank test and Cox regression. Results Using log-rank test, ADC histogram skewness of the enhancing tumor was significantly associated with both survival (p = 0.001) and PFS after second surgery (p = 0.005). Further parameters associated with prolonged survival after second surgery were: gross total resection at second surgery (p = 0.026), tumor size (p = 0.040) and third surgery (p = 0.003). In the multivariate Cox analysis, ADC histogram skewness was shown to be an independent prognostic factor for survival after second surgery. Conclusion ADC histogram skewness of the enhancing lesion, enhancing lesion size, third surgery, as well as gross total resection have been shown to be associated with survival following the second surgery. ADC histogram skewness was an independent prognostic factor for survival in the multivariate analysis. PMID:26830088
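    The histogram skewness used as a prognostic factor here is the standard third standardized moment of the ADC value distribution, which can be computed directly:

```python
import numpy as np

def adc_skewness(adc_values):
    """Third standardized moment of the voxel-value distribution:
    positive when the histogram has a longer right tail."""
    x = np.asarray(adc_values, dtype=float)
    mu, sigma = x.mean(), x.std()
    return np.mean((x - mu) ** 3) / sigma ** 3
```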

  14. Midwives, gender equality and feminism.

    PubMed

    Walsh, Denis

    2016-03-01

    Gender inequality and the harmful effects of patriarchy are sustaining the wide spread oppression of women across the world and this is also having an impact on maternity services with unacceptable rates of maternal mortality, the continued under investment in the midwifery profession and the limiting of women's place of birth options. However alongside these effects, the current zeitgeist is affirming an alignment of feminism and gender equality such that both have a high profile in public discourse. This presents a once in a generation opportunity for midwives to self-declare as feminists and commit to righting the wrongs of this most pernicious form of discrimination. PMID:27044191

  15. Shape from equal thickness contours

    SciTech Connect

    Cong, G.; Parvin, B.

    1998-05-10

    A unique imaging modality based on Equal Thickness Contours (ETC) has introduced a new opportunity for 3D shape reconstruction from multiple views. We present a computational framework for representing each view of an object in terms of its object thickness, and then integrating these representations into a 3D surface by algebraic reconstruction. The object thickness is inferred by grouping curve segments that correspond to points of second derivative maxima. At each step of the process, we use some form of regularization to ensure closeness to the original features, as well as neighborhood continuity. We apply our approach to images of a sub-micron crystal structure obtained through a holographic process.

  16. Educational Equality: Luck Egalitarian, Pluralist and Complex

    ERIC Educational Resources Information Center

    Calvert, John

    2014-01-01

    The basic principle of educational equality is that each child should receive an equally good education. This sounds appealing, but is rather vague and needs substantial working out. Also, educational equality faces all the objections to equality per se, plus others specific to its subject matter. Together these have eroded confidence in the…

  17. Gender equality and women empowerment.

    PubMed

    Dargan, R

    1996-01-01

    This article lists 11 suggestions for empowering women that the government of India should take, if it has a sincere commitment to gender equality and women's empowerment grounded in social change and not just rhetoric: 1) education should be made compulsory for all female children and places held on a 50/50 basis in all technical institutions; 2) a uniform civil code should be adopted for all citizens regardless of cast, creed, and religion; 3) women should have an equal right to own property and receive inheritance; 4) the National Women's Commission should be enlarged, representative of diversity, and effective in making policy decisions related to welfare, education, recruitment, and promotion; 5) a State Women's Commission should be established with affiliates at the block, district, and division levels; 6) the National and State Women's Commission should be established as a Statutory Body with binding decisions mandating government action; 7) the National and State Women's Commissions should have transparent functions, be regulatory, and offer workshops and seminars for women; 8) state governments should not interfere in the functions of National and State Women's Commissions; 9) women should fill 50% of all Center and State government service posts and concessions should be made on minimum academic qualifications and completed years of service, until all positions are filled; 10) 50% of the seats of Parliament should be reserved for women in both the State Legislature, Council of Ministry Boards, Corporations, Committees, and Commissions; and 11) the Constitution should provide for women judges in courts of law. PMID:12179426

  18. Partial-volume Bayesian classification of material mixtures in MR volume data using voxel histograms.

    PubMed

    Laidlaw, D H; Fleischer, K W; Barr, A H

    1998-02-01

    We present a new algorithm for identifying the distribution of different material types in volumetric datasets such as those produced with magnetic resonance imaging (MRI) or computed tomography (CT). Because we allow for mixtures of materials and treat voxels as regions, our technique reduces errors that other classification techniques can create along boundaries between materials and is particularly useful for creating accurate geometric models and renderings from volume data. It also has the potential to make volume measurements more accurately and classifies noisy, low-resolution data well. There are two unusual aspects to our approach. First, we assume that, due to partial-volume effects, or blurring, voxels can contain more than one material, e.g., both muscle and fat; we compute the relative proportion of each material in the voxels. Second, we incorporate information from neighboring voxels into the classification process by reconstructing a continuous function, rho(x), from the samples and then looking at the distribution of values that rho(x) takes on within the region of a voxel. This distribution of values is represented by a histogram taken over the region of the voxel; the mixture of materials that those values measure is identified within the voxel using a probabilistic Bayesian approach that matches the histogram by finding the mixture of materials within each voxel most likely to have created the histogram. The size of regions that we classify is chosen to match the spacing of the samples because the spacing is intrinsically related to the minimum feature size that the reconstructed continuous function can represent. PMID:9617909
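    As a much-simplified illustration of the partial-volume idea (ignoring the voxel-histogram fit and the Bayesian matching step), the linear mixing model alone recovers a material fraction from a voxel's mean value given the two pure-material means; the function and its parameters are hypothetical:

```python
import numpy as np

def mixture_fraction(voxel_samples, mu_a, mu_b):
    """Crude partial-volume estimate: if a voxel's samples mix two pure
    materials with mean values mu_a and mu_b, the mean sample value
    gives the linear mixing fraction of material a, clipped to [0, 1]."""
    m = np.mean(voxel_samples)
    frac_a = (mu_b - m) / (mu_b - mu_a)
    return float(np.clip(frac_a, 0.0, 1.0))
```

    The paper's method goes further by matching the whole within-voxel histogram, which disambiguates mixtures that share the same mean.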

  19. Optimized swimmer tracking system by a dynamic fusion of correlation and color histogram techniques

    NASA Astrophysics Data System (ADS)

    Benarab, D.; Napoléon, T.; Alfalou, A.; Verney, A.; Hellard, P.

    2015-12-01

    To design a robust swimmer tracking system, we took into account two well-known tracking techniques: the nonlinear joint transform correlation (NL-JTC) and the color histogram. The two techniques perform comparably well, yet they both have substantial limitations. Interestingly, they also seem to show some complementarity. The correlation technique yields accurate detection but is sensitive to rotation, scale and contour deformation, whereas the color histogram technique is robust for rotation and contour deformation but shows low accuracy and is highly sensitive to luminosity and confusing background colors. These observations suggested the possibility of a dynamic fusion of the correlation plane and the color scores map. Before this fusion, two steps are required. First is the extraction of a sub-plane of correlation that describes the similarity between the reference and target images. This sub-plane has the same size as the color scores map but they have different interval values. Thus, the second step is required which is the normalization of the planes in the same interval so they can be fused. In order to determine the benefits of this fusion technique, first, we tested it on a synthetic image containing different forms with different colors. We thus were able to optimize the correlation plane and color histogram techniques before applying our fusion technique to real videos of swimmers in international competitions. Last, a comparative study of the dynamic fusion technique and the two classical techniques was carried out to demonstrate the efficacy of the proposed technique. The criteria of comparison were the tracking percentage, the peak to correlation energy (PCE), which evaluated the sharpness of the peak (accuracy), and the local standard deviation (Local-STD), which assessed the noise in the planes (robustness).
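    The normalize-then-fuse step described here can be sketched as follows. The equal weighting `alpha` is a hypothetical choice, not the paper's tuned dynamic fusion rule:

```python
import numpy as np

def normalize01(plane):
    """Rescale a plane into the common interval [0, 1] so that the
    correlation plane and color scores map become comparable."""
    p = np.asarray(plane, dtype=float)
    lo, hi = p.min(), p.max()
    return (p - lo) / (hi - lo + 1e-12)

def fuse_planes(corr_plane, color_scores, alpha=0.5):
    """Blend the two normalized maps; the peak of the fused map gives
    the tracked position. alpha weights the correlation term."""
    fused = alpha * normalize01(corr_plane) + (1 - alpha) * normalize01(color_scores)
    return np.unravel_index(np.argmax(fused), fused.shape), fused
```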

  20. Numerical study of QCD phase diagram at high temperature and density by a histogram method

    NASA Astrophysics Data System (ADS)

    Ejiri, Shinji; Aoki, Sinya; Hatsuda, Tetsuo; Kanaya, Kazuyuki; Nakagawa, Yoshiyuki; Ohno, Hiroshi; Saito, Hana; Umeda, Takashi

    2012-12-01

    We study the QCD phase structure at high temperature and density adopting a histogram method. Because the quark determinant is complex at finite density, the Monte-Carlo method cannot be applied directly. We use a reweighting method and try to solve the problems which arise in the reweighting method, i.e. the sign problem and the overlap problem. We discuss the chemical potential dependence of the probability distribution function in the heavy quark mass region and examine the applicability of the approach in the light quark region.

  1. Phase-unwrapping algorithm for images with high noise content based on a local histogram

    NASA Astrophysics Data System (ADS)

    Meneses, Jaime; Gharbi, Tijani; Humbert, Philippe

    2005-03-01

    We present a robust algorithm of phase unwrapping that was designed for use on phase images with high noise content. We proceed with the algorithm by first identifying regions with continuous phase values placed between fringe boundaries in an image and then phase shifting the regions with respect to one another by multiples of 2π to unwrap the phase. Image pixels are segmented between interfringe and fringe boundary areas by use of a local histogram of the wrapped phase. The algorithm has been used successfully to unwrap phase images generated in a three-dimensional shape measurement for noninvasive quantification of human skin structure in dermatology, cosmetology, and plastic surgery.
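    Setting aside the paper's region-based, histogram-driven method, the basic 1D unwrapping operation it generalizes can be sketched directly (this is what `numpy.unwrap` does: add the multiple of 2π that minimizes each neighbor-to-neighbor jump):

```python
import numpy as np

def unwrap_1d(phase):
    """Minimal 1D phase unwrap: shift each sample by the multiple of
    2*pi that keeps the jump from its predecessor inside (-pi, pi]."""
    out = np.array(phase, dtype=float)
    for i in range(1, out.size):
        d = out[i] - out[i - 1]
        out[i] -= 2 * np.pi * np.round(d / (2 * np.pi))
    return out
```

    Noise is what breaks this simple scheme: a single corrupted jump propagates to all later samples, which is why the paper unwraps whole interfringe regions at once instead of pixel chains.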

  2. Segmentation of neuronal-cell images from stained fields and monomodal histograms.

    PubMed

    Pham, Tuan D; Crane, Denis I

    2005-01-01

    Information from images taken of cells being grown in culture with oxidative agents allows life science researchers to compare changes in neurons from the Zellweger mice to those from normal mice. Image segmentation is the major and first step for the study of these different types of processes in cells. In this paper we develop an innovative strategy for the segmentation of neuronal-cell images which are subjected to stains and whose histograms are monomodal. Such nontrival images make it a challenging task for many existing image segmentation methods. We show that the proposed method is an effective and simple procedure for the subsequent quantitative analysis of neuronal images. PMID:17281705

  3. Quantitative characterization of metastatic disease in the spine. Part II. Histogram-based analyses

    SciTech Connect

    Whyne, Cari; Hardisty, Michael; Wu, Florence; Skrinskas, Tomas; Clemons, Mark; Gordon, Lyle; Basran, Parminder S.

    2007-08-15

    Radiological imaging is essential to the appropriate management of patients with bone metastasis; however, there have been no widely accepted guidelines as to the optimal method for quantifying the potential impact of skeletal lesions or to evaluate response to treatment. The current inability to rapidly quantify the response of bone metastases excludes patients with cancer and bone disease from participating in clinical trials of many new treatments as these studies frequently require patients with so-called measurable disease. Computed tomography (CT) can provide excellent skeletal detail with a sensitivity for the diagnosis of bone metastases. The purpose of this study was to establish an objective method to quantitatively characterize disease in the bony spine using CT-based segmentations. It was hypothesized that histogram analysis of CT vertebral density distributions would enable standardized segmentation of tumor tissue and consequently allow quantification of disease in the metastatic spine. Thirty two healthy vertebral CT scans were first studied to establish a baseline characterization. The histograms of the trabecular centrums were found to be Gaussian distributions (average root-mean-square difference=30 voxel counts), as expected for a uniform material. Intrapatient vertebral level similarity was also observed as the means were not significantly different (p>0.8). Thus, a patient-specific healthy vertebral body histogram is able to characterize healthy trabecular bone throughout that individual's thoracolumbar spine. Eleven metastatically involved vertebrae were analyzed to determine the characteristics of the lytic and blastic bone voxels relative to the healthy bone. Lytic and blastic tumors were segmented as connected areas with voxel intensities between specified thresholds. 
The tested thresholds were μ - 1.0σ, μ - 1.5σ, and μ - 2.0σ for lytic and μ + 2.0σ, μ + 3.0σ, and μ + 3.5σ for blastic tissue where
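    The μ ± kσ thresholding against the patient-specific healthy-bone Gaussian can be sketched as below; the default multipliers are taken from the tested set, and the segmentation here is a plain voxel-wise threshold rather than the paper's connected-region analysis:

```python
import numpy as np

def segment_mets(healthy_values, vertebra, k_lytic=1.5, k_blastic=3.0):
    """Label voxels as lytic or blastic relative to the patient-specific
    healthy trabecular bone distribution, assumed Gaussian (mu, sigma)."""
    mu = np.mean(healthy_values)
    sigma = np.std(healthy_values)
    vertebra = np.asarray(vertebra, dtype=float)
    lytic = vertebra < mu - k_lytic * sigma      # abnormally low density
    blastic = vertebra > mu + k_blastic * sigma  # abnormally high density
    return lytic, blastic
```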

  4. Prediction of brain tumor progression using multiple histogram matched MRI scans

    NASA Astrophysics Data System (ADS)

    Banerjee, Debrup; Tran, Loc; Li, Jiang; Shen, Yuzhong; McKenzie, Frederic; Wang, Jihong

    2011-03-01

    In a recent study [1], we investigated the feasibility of predicting brain tumor progression based on multiple MRI series, and we tested our methods on seven patients' MRI images scanned at three consecutive visits A, B and C. Experimental results showed that it is feasible to predict tumor progression from visit A to visit C using a model trained by the information from visit A to visit B. However, the trained model failed when we tried to predict tumor progression from visit B to visit C, though this is clinically more important. A closer look at the MRI scans revealed that histograms of MRI scans such as T1, T2, and FLAIR taken at different times have slight shifts or different shapes. This is because those MRI scans are qualitative rather than quantitative, so MRI scans taken at different times or by different scanners might have slightly different scales or different homogeneities in the scanning region. In this paper, we propose a method to overcome this difficulty. The overall goal of this study is to assess brain tumor progression by exploring seven patients' complete MRI records scanned during their visits in the past two years. There are ten MRI series in each visit, including FLAIR, T1-weighted, post-contrast T1-weighted, T2-weighted and five DTI-derived MRI volumes: ADC, FA, Max, Min and Middle Eigen Values. After registering all series to the corresponding DTI scan at the first visit, we applied a histogram matching algorithm to non-DTI MRI scans to match their histograms to those of the corresponding MRI scans at the first visit. DTI-derived series are quantitative and do not require the histogram matching procedure. A machine learning algorithm was then trained using the data containing information from visit A to visit B, and the trained model was used to predict tumor progression from visit B to visit C. An average of 72% pixel-wise accuracy was achieved for tumor progression prediction from visit B to visit C.
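    The histogram matching step can be sketched as quantile mapping between the source and reference intensity CDFs; this is a generic implementation, not necessarily the authors' exact algorithm:

```python
import numpy as np

def match_histogram(source, reference):
    """Remap source intensities so the source histogram matches the
    reference histogram, via quantile (CDF-to-CDF) mapping."""
    source = np.asarray(source)
    reference = np.asarray(reference)
    s_vals, s_idx, s_counts = np.unique(
        source.ravel(), return_inverse=True, return_counts=True)
    r_vals, r_counts = np.unique(reference.ravel(), return_counts=True)
    s_cdf = np.cumsum(s_counts) / source.size
    r_cdf = np.cumsum(r_counts) / reference.size
    # for each source quantile, look up the reference intensity at that quantile
    matched_vals = np.interp(s_cdf, r_cdf, r_vals)
    return matched_vals[s_idx].reshape(source.shape)
```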

  5. Early detection of Alzheimer's disease using histograms in a dissimilarity-based classification framework

    NASA Astrophysics Data System (ADS)

    Luchtenberg, Anne; Simões, Rita; van Cappellen van Walsum, Anne-Marie; Slump, Cornelis H.

    2014-03-01

    Classification methods have been proposed to detect early-stage Alzheimer's disease using Magnetic Resonance images. In particular, dissimilarity-based classification has been applied using a deformation-based distance measure. However, such approach is not only computationally expensive but it also considers large-scale alterations in the brain only. In this work, we propose the use of image histogram distance measures, determined both globally and locally, to detect very mild to mild Alzheimer's disease. Using an ensemble of local patches over the entire brain, we obtain an accuracy of 84% (sensitivity 80% and specificity 88%).

  6. Improved dose-volume histogram estimates for radiopharmaceutical therapy by optimizing quantitative SPECT reconstruction parameters

    NASA Astrophysics Data System (ADS)

    Cheng, Lishui; Hobbs, Robert F.; Segars, Paul W.; Sgouros, George; Frey, Eric C.

    2013-06-01

    In radiopharmaceutical therapy, an understanding of the dose distribution in normal and target tissues is important for optimizing treatment. Three-dimensional (3D) dosimetry takes into account patient anatomy and the nonuniform uptake of radiopharmaceuticals in tissues. Dose-volume histograms (DVHs) provide a useful summary representation of the 3D dose distribution and have been widely used for external beam treatment planning. Reliable 3D dosimetry requires an accurate 3D radioactivity distribution as the input. However, activity distribution estimates from SPECT are corrupted by noise and partial volume effects (PVEs). In this work, we systematically investigated OS-EM based quantitative SPECT (QSPECT) image reconstruction in terms of its effect on DVHs estimates. A modified 3D NURBS-based Cardiac-Torso (NCAT) phantom that incorporated a non-uniform kidney model and clinically realistic organ activities and biokinetics was used. Projections were generated using a Monte Carlo (MC) simulation; noise effects were studied using 50 noise realizations with clinical count levels. Activity images were reconstructed using QSPECT with compensation for attenuation, scatter and collimator-detector response (CDR). Dose rate distributions were estimated by convolution of the activity image with a voxel S kernel. Cumulative DVHs were calculated from the phantom and QSPECT images and compared both qualitatively and quantitatively. We found that noise, PVEs, and ringing artifacts due to CDR compensation all degraded histogram estimates. Low-pass filtering and early termination of the iterative process were needed to reduce the effects of noise and ringing artifacts on DVHs, but resulted in increased degradations due to PVEs. Large objects with few features, such as the liver, had more accurate histogram estimates and required fewer iterations and more smoothing for optimal results. Smaller objects with fine details, such as the kidneys, required more iterations and less
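    A cumulative dose-volume histogram of the kind compared above can be computed directly from a dose array. A minimal sketch of the generic definition (not the authors' code):

```python
import numpy as np

def cumulative_dvh(dose, volume_per_voxel=1.0, n_bins=100):
    """Cumulative dose-volume histogram from a dose array.

    Returns dose levels and the volume fraction receiving at least
    each dose level (the standard cumulative-DVH definition).
    """
    d = dose.ravel()
    edges = np.linspace(0.0, d.max(), n_bins + 1)
    hist, _ = np.histogram(d, bins=edges)
    # Volume receiving >= each dose level: reverse cumulative sum
    vol_at_least = np.cumsum(hist[::-1])[::-1] * volume_per_voxel
    frac = vol_at_least / (d.size * volume_per_voxel)
    return edges[:-1], frac
```

    Noise and partial volume effects in the reconstructed activity image propagate into `dose` and hence distort this curve, which is exactly the degradation the study quantifies.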

  7. Retinal vessel enhancement based on multi-scale top-hat transformation and histogram fitting stretching

    NASA Astrophysics Data System (ADS)

    Liao, Miao; Zhao, Yu-qian; Wang, Xiao-hong; Dai, Pei-shan

    2014-06-01

    Retinal vessels play an important role in the diagnostic procedure of retinopathy. A new retinal vessel enhancement method is proposed in this paper. Firstly, the optimal bright and dim image features of an original retinal image are extracted by a multi-scale top-hat transformation. Then, the retinal image is enhanced preliminarily by adding the optimal bright image features and removing the optimal dim image features. Finally, the preliminarily enhanced image is further processed by linear stretching with histogram Gaussian curve fitting. Experimental results on the DRIVE and STARE databases show that the proposed method improves the contrast and enhances the details of the retinal vessels effectively.
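    The multi-scale top-hat step can be sketched with plain grayscale morphology (pure-NumPy square-window erosion/dilation, odd window sizes assumed; the final histogram-Gaussian-fit stretching stage is omitted):

```python
import numpy as np

def _erode(img, k):
    # Grayscale erosion with a k x k square window: local minimum
    p = k // 2
    padded = np.pad(img, p, mode='edge')
    out = np.full_like(img, np.inf, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out = np.minimum(out, padded[dy:dy + img.shape[0], dx:dx + img.shape[1]])
    return out

def _dilate(img, k):
    # Grayscale dilation with a k x k square window: local maximum
    p = k // 2
    padded = np.pad(img, p, mode='edge')
    out = np.full_like(img, -np.inf, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out = np.maximum(out, padded[dy:dy + img.shape[0], dx:dx + img.shape[1]])
    return out

def multiscale_tophat_enhance(img, scales=(3, 5, 7)):
    """Add the strongest white-top-hat (bright) features and subtract
    the strongest black-top-hat (dim) features over several scales."""
    img = img.astype(float)
    # White top-hat = img - opening: bright detail at each scale
    bright = np.max([img - _dilate(_erode(img, k), k) for k in scales], axis=0)
    # Black top-hat = closing - img: dim detail at each scale
    dim = np.max([_erode(_dilate(img, k), k) - img for k in scales], axis=0)
    return img + bright - dim
```

    On a featureless region both top-hats are zero, so only structures narrower than the largest window are amplified.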

  8. Improved dose-volume histogram estimates for radiopharmaceutical therapy by optimizing quantitative SPECT reconstruction parameters.

    PubMed

    Cheng, Lishui; Hobbs, Robert F; Segars, Paul W; Sgouros, George; Frey, Eric C

    2013-06-01

    In radiopharmaceutical therapy, an understanding of the dose distribution in normal and target tissues is important for optimizing treatment. Three-dimensional (3D) dosimetry takes into account patient anatomy and the nonuniform uptake of radiopharmaceuticals in tissues. Dose-volume histograms (DVHs) provide a useful summary representation of the 3D dose distribution and have been widely used for external beam treatment planning. Reliable 3D dosimetry requires an accurate 3D radioactivity distribution as the input. However, activity distribution estimates from SPECT are corrupted by noise and partial volume effects (PVEs). In this work, we systematically investigated OS-EM based quantitative SPECT (QSPECT) image reconstruction in terms of its effect on DVHs estimates. A modified 3D NURBS-based Cardiac-Torso (NCAT) phantom that incorporated a non-uniform kidney model and clinically realistic organ activities and biokinetics was used. Projections were generated using a Monte Carlo (MC) simulation; noise effects were studied using 50 noise realizations with clinical count levels. Activity images were reconstructed using QSPECT with compensation for attenuation, scatter and collimator-detector response (CDR). Dose rate distributions were estimated by convolution of the activity image with a voxel S kernel. Cumulative DVHs were calculated from the phantom and QSPECT images and compared both qualitatively and quantitatively. We found that noise, PVEs, and ringing artifacts due to CDR compensation all degraded histogram estimates. Low-pass filtering and early termination of the iterative process were needed to reduce the effects of noise and ringing artifacts on DVHs, but resulted in increased degradations due to PVEs. Large objects with few features, such as the liver, had more accurate histogram estimates and required fewer iterations and more smoothing for optimal results. Smaller objects with fine details, such as the kidneys, required more iterations and less

  9. Verification of dose volume histograms in stereotactic radiosurgery and radiotherapy using polymer gel and MRI

    NASA Astrophysics Data System (ADS)

    Šemnická, Jitka; Novotný, Josef, Jr.; Spěváček, Václav; Garčic, Jirí; Steiner, Martin; Judas, Libor

    2006-12-01

    In this work we focus on dose volume histogram (DVH) measurement in stereotactic radiosurgery (SR) performed with the Leksell gamma knife (ELEKTA Instrument AB, Stockholm, Sweden) and stereotactic radiotherapy (SRT) performed with a 6 MV Varian Clinac 2100 C/D linear accelerator (Varian Medical Systems, Palo Alto, USA) in conjunction with the BrainLAB stereotactic system (BrainLAB, Germany), using modified BANG gel and magnetic resonance imaging (MRI). The aim of the experiments was to investigate a method for acquiring entire dose volume information from the irradiated gel dosimeter and calculating DVHs.

  10. Locally weighted histogram analysis and stochastic solution for large-scale multi-state free energy estimation

    NASA Astrophysics Data System (ADS)

    Tan, Zhiqiang; Xia, Junchao; Zhang, Bin W.; Levy, Ronald M.

    2016-01-01

    The weighted histogram analysis method (WHAM), including its binless extension, has been developed independently in several different contexts and is widely used in chemistry, physics, and statistics for computing free energies and expectations from multiple ensembles. However, this method, while statistically efficient, is computationally costly or even infeasible when a large number of distributions, hundreds or more, are studied. We develop a locally weighted histogram analysis method (local WHAM) from the perspective of simulations of simulations (SOS), using generalized serial tempering (GST) to resample simulated data from multiple ensembles. The local WHAM equations based on one jump attempt per GST cycle can be solved by optimization algorithms orders of magnitude faster than standard implementations of global WHAM, while yielding free energy estimates of similar accuracy. Moreover, we propose an adaptive SOS procedure for solving the local WHAM equations stochastically when multiple jump attempts are performed per GST cycle. Such a stochastic procedure can lead to more accurate estimates of equilibrium distributions than local WHAM with one jump attempt per cycle. The proposed methods are broadly applicable when the original data to be "WHAMMED" are obtained properly by any sampling algorithm, including serial tempering and parallel tempering (replica exchange). To illustrate the methods, we estimated absolute binding free energies and binding energy distributions using the binding energy distribution analysis method from one- and two-dimensional replica exchange molecular dynamics simulations for the beta-cyclodextrin-heptanoate host-guest system. In addition to the computational advantage of handling large datasets, our two-dimensional WHAM analysis also demonstrates that accurate results similar to those from well-converged data can be obtained from simulations for which sampling is limited and not fully equilibrated.
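    For reference, the standard global WHAM self-consistent iteration that local WHAM accelerates can be sketched in histogram form (textbook equations in units of kT; not the authors' implementation):

```python
import numpy as np

def wham(counts, bias, n_iter=500):
    """Global WHAM self-consistent iteration (histogram form).

    counts: (K, M) histogram counts from K biased runs over M bins.
    bias:   (K, M) bias energy u_k(x), in units of kT, at each bin.
    Returns unbiased probabilities p (M,) and free energies f (K,).
    """
    K, M = counts.shape
    N = counts.sum(axis=1)        # samples per run
    f = np.zeros(K)               # free energy (-ln Z_k) estimates
    num = counts.sum(axis=0)      # total counts per bin
    p = num / num.sum()
    for _ in range(n_iter):
        # p(x) = sum_k n_k(x) / sum_k N_k exp(f_k - u_k(x))
        denom = (N[:, None] * np.exp(f[:, None] - bias)).sum(axis=0)
        p = num / denom
        p /= p.sum()
        # f_k = -ln sum_x p(x) exp(-u_k(x))
        f = -np.log((p[None, :] * np.exp(-bias)).sum(axis=1))
    return p, f
```

    The cost of the `denom` sum grows with the number of ensembles K, which is the scaling bottleneck the local WHAM reformulation targets.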

  11. All equal-area map projections are created equal, but some are more equal than others

    USGS Publications Warehouse

    Usery, E.L.; Seong, J.C.

    2001-01-01

    High-resolution regional and global raster databases are currently being generated for a variety of environmental and scientific modeling applications. The projection of these data from geographic coordinates to a plane coordinate system is subject to significant areal error. Sources of error include users selecting an inappropriate projection or incorrect parameters for a given projection, algorithmic errors in commercial geographic information system (GIS) software, and errors resulting from the projection of data in the raster format. To assess the latter type of errors, the accuracy of raster projection was analyzed by two methods. First, a set of 12 one-degree by one-degree quadrilaterals placed at various latitudes was projected at several raster resolutions and compared to the projection of a vector representation of the same quadrilaterals. Second, several different raster resolutions of land cover data for Asia were projected and the total areas of 21 land cover categories were tabulated and compared. While equal-area projections are designed to specifically preserve area, the comparison of the results of the one-degree by one-degree quadrilaterals with the common equal area projections (e.g., the Mollweide) indicates a considerable variance in the one-degree area after projection. Similarly, the empirical comparison of land cover areas for Asia among various projections shows that total areas of land cover vary with projection type, raster resolution, and latitude. No single projection is best for all resolutions and all latitudes. While any of the equal-area projections tested are reasonably accurate for most applications with resolutions of eight-kilometer pixels or smaller, significant variances in accuracies appear at larger pixel sizes.

  12. A co-designed equalization, modulation, and coding scheme

    NASA Technical Reports Server (NTRS)

    Peile, Robert E.

    1992-01-01

    The commercial impact and technical success of Trellis Coded Modulation seem to illustrate that, if Shannon capacity is to be approached, the modulation and coding of an analogue signal ought to be viewed as an integrated process. More recent work has focused on going beyond the gains obtained for the Additive White Gaussian Noise channel and has tried to combine the coding/modulation with adaptive equalization. The motive is to gain similar advances on less perfect or idealized channels.

  13. Klystron equalization for RF feedback

    SciTech Connect

    Corredoura, P.

    1993-01-01

    The next generation of colliding beam storage rings support higher luminosities by significantly increasing the number of bunches and decreasing the spacing between respective bunches. The heavy beam loading requires large RF cavity detuning which drives several lower coupled bunch modes very strongly. One technique which has proven to be very successful in reducing the coupled bunch mode driving impedance is RF feedback around the klystron-cavity combination. The gain and bandwidth of the feedback loop is limited by the group delay around the feedback loop. Existing klystrons on the world market have not been optimized for this application and contribute a large portion of the total loop group delay. This paper describes a technique to reduce klystron group delay by adding an equalizing filter to the klystron RF drive. Such a filter was built and tested on a 500 kW klystron as part of the ongoing PEP-II R&D effort here at SLAC.

  14. Klystron equalization for RF feedback

    SciTech Connect

    Corredoura, P.

    1993-01-01

    The next generation of colliding beam storage rings support higher luminosities by significantly increasing the number of bunches and decreasing the spacing between respective bunches. The heavy beam loading requires large RF cavity detuning which drives several lower coupled bunch modes very strongly. One technique which has proven to be very successful in reducing the coupled bunch mode driving impedance is RF feedback around the klystron-cavity combination. The gain and bandwidth of the feedback loop is limited by the group delay around the feedback loop. Existing klystrons on the world market have not been optimized for this application and contribute a large portion of the total loop group delay. This paper describes a technique to reduce klystron group delay by adding an equalizing filter to the klystron RF drive. Such a filter was built and tested on a 500 kW klystron as part of the ongoing PEP-II R&D effort here at SLAC.

  15. Brightness-equalized quantum dots

    NASA Astrophysics Data System (ADS)

    Lim, Sung Jun; Zahid, Mohammad U.; Le, Phuong; Ma, Liang; Entenberg, David; Harney, Allison S.; Condeelis, John; Smith, Andrew M.

    2015-10-01

    As molecular labels for cells and tissues, fluorescent probes have shaped our understanding of biological structures and processes. However, their capacity for quantitative analysis is limited because photon emission rates from multicolour fluorophores are dissimilar, unstable and often unpredictable, which obscures correlations between measured fluorescence and molecular concentration. Here we introduce a new class of light-emitting quantum dots with tunable and equalized fluorescence brightness across a broad range of colours. The key feature is independent tunability of emission wavelength, extinction coefficient and quantum yield through distinct structural domains in the nanocrystal. Precise tuning eliminates a 100-fold red-to-green brightness mismatch of size-tuned quantum dots at the ensemble and single-particle levels, which substantially improves quantitative imaging accuracy in biological tissue. We anticipate that these materials engineering principles will vastly expand the optical engineering landscape of fluorescent probes, facilitate quantitative multicolour imaging in living tissue and improve colour tuning in light-emitting devices.

  16. Brightness-equalized quantum dots

    PubMed Central

    Lim, Sung Jun; Zahid, Mohammad U.; Le, Phuong; Ma, Liang; Entenberg, David; Harney, Allison S.; Condeelis, John; Smith, Andrew M.

    2015-01-01

    As molecular labels for cells and tissues, fluorescent probes have shaped our understanding of biological structures and processes. However, their capacity for quantitative analysis is limited because photon emission rates from multicolour fluorophores are dissimilar, unstable and often unpredictable, which obscures correlations between measured fluorescence and molecular concentration. Here we introduce a new class of light-emitting quantum dots with tunable and equalized fluorescence brightness across a broad range of colours. The key feature is independent tunability of emission wavelength, extinction coefficient and quantum yield through distinct structural domains in the nanocrystal. Precise tuning eliminates a 100-fold red-to-green brightness mismatch of size-tuned quantum dots at the ensemble and single-particle levels, which substantially improves quantitative imaging accuracy in biological tissue. We anticipate that these materials engineering principles will vastly expand the optical engineering landscape of fluorescent probes, facilitate quantitative multicolour imaging in living tissue and improve colour tuning in light-emitting devices. PMID:26437175

  17. Extended equal area criterion revisited

    SciTech Connect

    Xue, X.; Wehenkel, L.; Belhomme, R.; Rousseaux, P.; Pavella, M. ); Euxibie, E.; Heilbronn, B.; Lesigne, J.F. )

    1992-08-01

    This paper reports on a case study conducted on the EHV French power system in order to revisit the extended equal area criterion and test its suitability as a fast transient stability indicator. The assumptions underlying the method are reexamined, causes liable to invalidate them are identified, and indices are devised to automatically circumvent them. The selection of candidate critical machines is also reconsidered and an augmented criterion is proposed. The various improvements are developed and tested on about 1000 stability scenarios, covering the entire 400-kV system; the severity of the scenarios, resulting from the combination of weakened pre- and post-fault configurations, subjects the method to particularly stringent conditions. The obtained results show that the devised tools significantly reinforce its robustness and reliability.

  18. The across frequency independence of equalization of interaural time delay in the equalization-cancellation model of binaural unmasking

    NASA Astrophysics Data System (ADS)

    Akeroyd, Michael A.

    2004-08-01

    The equalization stage in the equalization-cancellation model of binaural unmasking compensates for the interaural time delay (ITD) of a masking noise by introducing an opposite, internal delay [N. I. Durlach, in Foundations of Modern Auditory Theory, Vol. II, edited by J. V. Tobias (Academic, New York, 1972)]. Culling and Summerfield [J. Acoust. Soc. Am. 98, 785-797 (1995)] developed a multi-channel version of this model in which equalization was "free" to use the optimal delay in each channel. Two experiments were conducted to test if equalization was indeed free or if it was "restricted" to the same delay in all channels. One experiment measured binaural detection thresholds, using an adaptive procedure, for 1-, 5-, or 17-component tones against a broadband masking noise, in three binaural configurations (N0S180, N180S0, and N90S270). The thresholds for the 1-component stimuli were used to normalize the levels of each of the 5- and 17-component stimuli so that they were equally detectable. If equalization was restricted, then, for the 5- and 17-component stimuli, the N90S270 and N180S0 configurations would yield greater thresholds than the N0S180 configuration. No such difference was found. A subsequent experiment measured binaural detection thresholds, via psychometric functions, for a 2-component complex tone in the same three binaural configurations. Again, no differential effect of configuration was observed. An analytic model of the detection of a complex tone showed that the results were more consistent with free equalization than restricted equalization, although the size of the differences was found to depend on the shape of the psychometric function for detection.

  19. Automatic histogram-based segmentation of white matter hyperintensities using 3D FLAIR images

    NASA Astrophysics Data System (ADS)

    Simões, Rita; Slump, Cornelis; Moenninghoff, Christoph; Wanke, Isabel; Dlugaj, Martha; Weimar, Christian

    2012-03-01

    White matter hyperintensities are known to play a role in the cognitive decline experienced by patients suffering from neurological diseases. Therefore, accurately detecting and monitoring these lesions is of importance. Automatic methods for segmenting white matter lesions typically use multimodal MRI data. Furthermore, many methods use a training set to perform a classification task or to determine necessary parameters. In this work, we describe and evaluate an unsupervised segmentation method that is based solely on the histogram of FLAIR images. It approximates the histogram by a mixture of three Gaussians in order to find an appropriate threshold for white matter hyperintensities. We use a context-sensitive Expectation-Maximization method to determine the Gaussian mixture parameters. The segmentation is subsequently corrected for false positives using the knowledge of the location of typical FLAIR artifacts. A preliminary validation with the ground truth on 6 patients revealed a Similarity Index of 0.73 +/- 0.10, indicating that the method is comparable to others in the literature which require multimodal MRI and/or a preliminary training step.
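    The histogram model above is a three-component Gaussian mixture fitted by Expectation-Maximization. A plain (non-context-sensitive) EM sketch for 1D data, with illustrative names; the paper's variant adds spatial context:

```python
import numpy as np

def fit_gmm_1d(x, k=3, n_iter=100):
    """Plain EM for a k-component 1D Gaussian mixture.

    Initializes means at spread-out quantiles; returns mixture
    weights, means, and variances. A WMH threshold would then be
    derived from the brightest component.
    """
    mu = np.quantile(x, np.linspace(0.1, 0.9, k))
    var = np.full(k, x.var() / k)
    w = np.full(k, 1.0 / k)
    for _ in range(n_iter):
        # E-step: responsibilities r[n, j] of component j for sample n
        pdf = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        r = w * pdf
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, variances
        nk = r.sum(axis=0)
        w = nk / x.size
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var
```

    With three components (background/CSF, normal tissue, hyperintensities), the crossing between the two brightest Gaussians gives a natural threshold candidate.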

  20. Decoding brain cancer dynamics: a quantitative histogram-based approach using temporal MRI

    NASA Astrophysics Data System (ADS)

    Zhou, Mu; Hall, Lawrence O.; Goldgof, Dmitry B.; Russo, Robin; Gillies, Robert J.; Gatenby, Robert A.

    2015-03-01

    Brain tumor heterogeneity remains a challenge for probing brain cancer evolutionary dynamics. In light of evolution, it is a priority to inspect the cancer system from a time-domain perspective since it explicitly tracks the dynamics of cancer variations. In this paper, we study the problem of exploring brain tumor heterogeneity from temporal clinical magnetic resonance imaging (MRI) data. Our goal is to discover evidence-based knowledge from such temporal imaging data, where multiple clinical MRI scans from Glioblastoma multiforme (GBM) patients are generated during therapy. In particular, we propose a quantitative histogram-based approach that builds a prediction model to measure the difference in histograms obtained from pre- and post-treatment. The study could significantly assist radiologists by providing a metric to identify distinctive patterns within each tumor, which is crucial for the goal of providing patient-specific treatments. We examine the proposed approach for a practical application - clinical survival group prediction. Experimental results show that our approach achieved 90.91% accuracy.

  1. Free energies from dynamic weighted histogram analysis using unbiased Markov state model.

    PubMed

    Rosta, Edina; Hummer, Gerhard

    2015-01-13

    The weighted histogram analysis method (WHAM) is widely used to obtain accurate free energies from biased molecular simulations. However, WHAM free energies can exhibit significant errors if some of the biasing windows are not fully equilibrated. To account for the lack of full equilibration, we develop the dynamic histogram analysis method (DHAM). DHAM uses a global Markov state model to obtain the free energy along the reaction coordinate. A maximum likelihood estimate of the Markov transition matrix is constructed by joint unbiasing of the transition counts from multiple umbrella-sampling simulations along discretized reaction coordinates. The free energy profile is the stationary distribution of the resulting Markov matrix. For this matrix, we derive an explicit approximation that does not require the usual iterative solution of WHAM. We apply DHAM to model systems, a chemical reaction in water treated using quantum-mechanics/molecular-mechanics (QM/MM) simulations, and the Na(+) ion passage through the membrane-embedded ion channel GLIC. We find that DHAM gives accurate free energies even in cases where WHAM fails. In addition, DHAM provides kinetic information, which we here use to assess the extent of convergence in each of the simulation windows. DHAM may also prove useful in the construction of Markov state models from biased simulations in phase-space regions with otherwise low population. PMID:26574225
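    The final DHAM step, a free-energy profile from the stationary distribution of a Markov transition matrix, can be sketched as follows (the joint unbiasing of transition counts across umbrella windows is omitted):

```python
import numpy as np

def free_energy_from_counts(transition_counts, kT=1.0):
    """Free-energy profile F = -kT ln(pi) from a transition-count matrix.

    Row-normalizes counts into a transition matrix and takes its
    stationary distribution (left eigenvector of eigenvalue 1).
    """
    T = transition_counts / transition_counts.sum(axis=1, keepdims=True)
    vals, vecs = np.linalg.eig(T.T)
    pi = np.real(vecs[:, np.argmax(np.real(vals))])
    pi = np.abs(pi) / np.abs(pi).sum()
    return -kT * np.log(pi)
```

    Because the estimate comes from a Markov model rather than histogram reweighting alone, relaxation within each window also yields the kinetic information mentioned in the abstract.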

  2. Two non-parametric methods for derivation of constraints from radiotherapy dose-histogram data

    NASA Astrophysics Data System (ADS)

    Ebert, M. A.; Gulliford, S. L.; Buettner, F.; Foo, K.; Haworth, A.; Kennedy, A.; Joseph, D. J.; Denham, J. W.

    2014-07-01

    Dose constraints based on histograms provide a convenient and widely-used method for informing and guiding radiotherapy treatment planning. Methods of derivation of such constraints are often poorly described. Two non-parametric methods for derivation of constraints are described and investigated in the context of determination of dose-specific cut-points—values of the free parameter (e.g., percentage volume of the irradiated organ) which best reflect resulting changes in complication incidence. A method based on receiver operating characteristic (ROC) analysis and one based on a maximally-selected standardized rank sum are described and compared using rectal toxicity data from a prostate radiotherapy trial. Multiple test corrections are applied using a free step-down resampling algorithm, which accounts for the large number of tests undertaken to search for optimal cut-points and the inherent correlation between dose-histogram points. Both methods provide consistent significant cut-point values, with the rank sum method displaying some sensitivity to the underlying data. The ROC method is simple to implement and can utilize a complication atlas, though an advantage of the rank sum method is the ability to incorporate all complication grades without the need for grade dichotomization.
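    An ROC-based cut-point search of the kind described can be sketched with Youden's J statistic, a common ROC criterion (the resampling-based multiple-test correction in the paper is omitted):

```python
import numpy as np

def best_cutpoint(values, outcome):
    """Threshold on a dose-histogram parameter maximizing Youden's J.

    values:  per-patient dose-histogram parameter (e.g. percent volume).
    outcome: 0/1 complication indicator per patient.
    Returns the best threshold and its J = sensitivity + specificity - 1.
    """
    best_j, best_t = -1.0, None
    pos = outcome.astype(bool)
    for t in np.unique(values):
        pred = values >= t
        sens = pred[pos].mean() if pos.any() else 0.0
        spec = (~pred[~pos]).mean() if (~pos).any() else 0.0
        j = sens + spec - 1.0
        if j > best_j:
            best_j, best_t = j, t
    return best_t, best_j
```

    Scanning this over every dose level of the histogram is what generates the many correlated tests that the free step-down resampling correction must then account for.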

  3. Validation of Vehicle Candidate Areas in Aerial Images Using Color Co-Occurrence Histograms

    NASA Astrophysics Data System (ADS)

    Leister, W.; Tuermer, S.; Reinartz, P.; Hoffmann, K. H.; Stilla, U.

    2013-10-01

    Traffic monitoring plays an important role in transportation management. In addition, airborne acquisition enables a flexible and realtime mapping for special traffic situations e.g. mass events and disasters. Also the automatic extraction of vehicles from aerial imagery is a common application. However, many approaches focus on the target object only. As an extension to previously developed car detection techniques, a validation scheme is presented. The focus is on exploiting the background of the vehicle candidates as well as their color properties in the HSV color space. Therefore, texture of the vehicle background is described by color co-occurrence histograms. From all resulting histograms a likelihood function is calculated giving a quantity value to indicate whether the vehicle candidate is correctly classified. Only a few robust parameters have to be determined. Finally, the strategy is tested with a dataset of dense urban areas from the inner city of Munich, Germany. First results show that certain regions which are often responsible for false positive detections, such as vegetation or road markings, can be excluded successfully.
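    A color co-occurrence histogram over quantized color indices can be sketched as follows (illustrative, nonnegative offsets assumed; the paper computes these in HSV space over vehicle-background regions):

```python
import numpy as np

def color_cooccurrence_histogram(img_q, n_colors, offset=(0, 1)):
    """Count how often quantized color i co-occurs with color j at
    a fixed pixel offset, yielding an (n_colors, n_colors) histogram."""
    dy, dx = offset  # nonnegative offsets assumed in this sketch
    h, w = img_q.shape
    a = img_q[:h - dy, :w - dx]   # reference pixels
    b = img_q[dy:, dx:]           # offset neighbors
    cch = np.zeros((n_colors, n_colors), dtype=np.int64)
    np.add.at(cch, (a.ravel(), b.ravel()), 1)
    return cch
```

    Normalized co-occurrence histograms from candidate backgrounds can then be compared against learned templates to reject vegetation or road markings, as the abstract describes.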

  4. 3D target tracking in infrared imagery by SIFT-based distance histograms

    NASA Astrophysics Data System (ADS)

    Yan, Ruicheng; Cao, Zhiguo

    2011-11-01

    The SIFT tracking algorithm is an excellent point-based tracking algorithm, which has high tracking performance and accuracy due to its robustness against rotation, scale change and occlusion. However, when tracking a large 3D target in complicated real scenarios in a forward-looking infrared (FLIR) image sequence taken from an airborne moving platform, the tracked point located on a vertical surface usually shifts away from the correct position. In this paper, we propose a novel algorithm for 3D target tracking in FLIR image sequences. Our approach uses SIFT keypoints detected in consecutive frames for point correspondence. The candidate position of the tracked point is first estimated by computing the affine transformation using local corresponding SIFT keypoints. Then the correct position is located via an optimization method. Euclidean distances between a candidate point and nearby SIFT keypoints are calculated and formed into a SIFT-based distance histogram. The distance histogram defines a cost of associating each candidate point with the correct tracked point, using a constraint based on the topology of each candidate point with its surrounding SIFT keypoints. Minimization of the cost is formulated as a combinatorial optimization problem. Experiments demonstrate that the proposed algorithm efficiently improves the tracking performance and accuracy.

  5. Addendum to brachytherapy dose-volume histogram commissioning with multiple planning systems.

    PubMed

    Gossman, Michael S

    2016-01-01

    The process for validating dose-volume histogram data in brachytherapy software is presented as a supplement to a previously published article. Included is the DVH accuracy evaluation of the Best NOMOS treatment planning system called "Best TPS VolumePlan." As done previously in other software, a rectangular cuboid was contoured in the treatment planning system. A single radioactive 125I source was positioned coplanar and concentric with one end. Calculations were performed to estimate dose deposition in partial volumes of the cuboid structure, using the brachytherapy dosimetry formalism defined in AAPM Task Group 43. Hand-calculated, dose-volume results were compared to TPS-generated, point-source-approximated dose-volume histogram data to establish acceptance. The required QA for commissioning was satisfied for the DVH as conducted previously for other software, using the criterion that the DVH %VolTPS "actual variance" calculations should differ by no more than 5% at any specific radial distance with respect to %VolTG-43, and the "average variance" DVH %VolTPS calculations should differ by no more than 2% over all radial distances with respect to %VolTG-43. The average disagreement observed between hand calculations and treatment planning system DVH was less than 0.5% on average for this treatment planning system and less than 1.1% maximally for 1 ≤ r ≤ 5 cm. PMID:27167288

  6. The use of force histograms for affine-invariant relative position description.

    PubMed

    Matsakis, Pascal; Keller, James M; Sjahputera, Ozy; Marjamaa, Jonathon

    2004-01-01

    Affine invariant descriptors have been widely used for recognition of objects regardless of their position, size, and orientation in space. Examples of color, texture, and shape descriptors abound in the literature. However, many tasks in computer vision require looking not only at single objects or regions in images but also at their spatial relationships. In an earlier work, we showed that the relative position of two objects can be quantitatively described by a histogram of forces. Here, we study how affine transformations affect this descriptor. The position of an object with respect to another changes when the objects are affine transformed. We analyze the link between 1) the applied affinity, 2) the relative position before transformation (described through a force histogram), and 3) the relative position after transformation. We show that any two of these elements allow the third one to be recovered. Moreover, it is possible to determine whether (or how well) two relative positions are actually related through an affine transformation. If they are not, the affinity that best approximates the unknown transformation can be retrieved, and the quality of the approximation assessed. PMID:15382682

  7. Rapid dynamic radial MRI via reference image enforced histogram constrained reconstruction

    NASA Astrophysics Data System (ADS)

    Gaass, Thomas; Bauman, Grzegorz; Potdevin, Guillaume; Noël, Peter B.; Haase, Axel

    2014-03-01

    Exploiting spatio-temporal redundancies in sub-Nyquist-sampled dynamic MRI to suppress undersampling artifacts has proven highly successful. However, temporally averaged and blurred structures in image-space composite data pose the risk of introducing false information into the reconstruction. Within this work we assess the possibility of employing the composite image histogram as a measure of undersampling artifacts and as a basis for their suppression. The proposed algorithm utilizes a histogram, computed from a composite image within a dynamically acquired interleaved radial MRI measurement, as a reference to compensate for the impact of undersampling in temporally resolved data without incorporating temporal averaging. In addition, an image-space regularization utilizing a single-frame low-resolution reconstruction is implemented to enforce overall contrast fidelity. The performance of the approach was evaluated on a simulated radial dynamic MRI acquisition and on two functional in vivo radial cardiac acquisitions. Results demonstrate that the algorithm maintained contrast properties, details and temporal resolution in the images, while effectively suppressing undersampling artifacts.

  8. Rapid dynamic radial MRI via reference image enforced histogram constrained reconstruction.

    PubMed

    Gaass, Thomas; Bauman, Grzegorz; Potdevin, Guillaume; Noël, Peter B; Haase, Axel

    2014-03-01

    Exploiting spatio-temporal redundancies in sub-Nyquist-sampled dynamic MRI to suppress undersampling artifacts has proved highly successful. However, temporally averaged and blurred structures in image-space composite data pose the risk of introducing false information into the reconstruction. Within this work we assess the possibility of employing the composite image histogram as a measure of undersampling artifacts and as the basis for their suppression. The proposed algorithm utilizes a histogram, computed from a composite image within a dynamically acquired interleaved radial MRI measurement, as a reference to compensate for the impact of undersampling in temporally resolved data without the incorporation of temporal averaging. In addition, an image-space regularization utilizing a single-frame low-resolution reconstruction is implemented to enforce overall contrast fidelity. The performance of the approach was evaluated on a simulated radial dynamic MRI acquisition and on two functional in vivo radial cardiac acquisitions. Results demonstrate that the algorithm maintained contrast properties, details and temporal resolution in the images, while effectively suppressing undersampling artifacts. PMID:24486719

  9. Detection of Basal Cell Carcinoma Using Color and Histogram Measures of Semitranslucent Areas

    PubMed Central

    Stoecker, William V.; Gupta, Kapil; Shrestha, Bijaya; Wronkiewiecz, Mark; Chowdhury, Raeed; Stanley, R. Joe; Xu, Jin; Moss, Randy H.; Celebi, M. Emre; Rabinovitz, Harold S.; Oliviero, Margaret; Malters, Joseph M.; Kolm, Isabel

    2009-01-01

    Background Semitranslucency, defined as a smooth, jelly-like area with varied, near-skin-tone color, can indicate a diagnosis of basal cell carcinoma (BCC) with high specificity. This study sought to analyze potential areas of semitranslucency with histogram-derived texture and color measures to discriminate BCC from non-semitranslucent areas in non-BCC skin lesions. Methods For 210 dermoscopy images, the areas of semitranslucency in 42 BCCs and comparable areas of smoothness and color in 168 non-BCCs were selected manually. Six color measures and six texture measures were applied to the semitranslucent areas of the BCC and the comparable areas in the non-BCC images. Results Receiver operating characteristic (ROC) curve analysis showed that the texture measures alone provided greater separation of BCC from non-BCC than the color measures alone. Statistical analysis showed that the four most important measures of semitranslucency are three histogram measures: contrast, smoothness, and entropy, and one color measure: blue chromaticity. Smoothness is the single most important measure. The combined 12 measures achieved a diagnostic accuracy of 95.05% based on area under the ROC curve. Conclusion Texture and color analysis measures, especially smoothness, may afford automatic detection of basal cell carcinoma images with semitranslucency. PMID:19624424
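
    The histogram measures named above (contrast, smoothness, entropy) are standard first-order texture statistics. A minimal sketch, assuming the textbook definitions over a normalised grey-level histogram (the study's exact normalisations may differ):

```python
import numpy as np

def histogram_texture_measures(patch, levels=256):
    """First-order texture statistics from a grey-level histogram.

    Standard definitions (Gonzalez & Woods); the study's exact
    normalisations may differ.
    """
    hist, _ = np.histogram(patch, bins=levels, range=(0, levels))
    p = hist / hist.sum()                      # normalised histogram
    z = np.arange(levels)
    mean = (z * p).sum()
    var = ((z - mean) ** 2 * p).sum()          # "contrast" (variance)
    smoothness = 1.0 - 1.0 / (1.0 + var / (levels - 1) ** 2)  # in [0, 1)
    nz = p[p > 0]
    entropy = -(nz * np.log2(nz)).sum()
    return {"contrast": var, "smoothness": smoothness, "entropy": entropy}

flat = np.full((32, 32), 128)                  # uniform patch: no texture
noisy = np.random.default_rng(1).integers(0, 256, (32, 32))
m_flat = histogram_texture_measures(flat)
m_noisy = histogram_texture_measures(noisy)
```

    A perfectly uniform patch yields zero contrast, smoothness and entropy, while a noisy patch scores high on all three.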

  10. RelMon: A General Approach to QA, Validation and Physics Analysis through Comparison of large Sets of Histograms

    NASA Astrophysics Data System (ADS)

    Piparo, Danilo

    2012-12-01

    The estimation of the compatibility of large amounts of histogram pairs is a recurrent problem in high energy physics. The issue is common to several different areas, from software quality monitoring to data certification, preservation and analysis. Given two sets of histograms, it is very important to be able to scrutinize the outcome of several goodness of fit tests, obtain a clear answer about the overall compatibility, easily spot the single anomalies and directly access the concerned histogram pairs. This procedure must be automated in order to reduce the human workload, thereby improving the identification of differences, a task usually carried out by a trained human mind. Some solutions to this problem have been proposed, but they are experiment specific. RelMon depends only on ROOT and offers several goodness of fit tests (e.g. chi-squared or Kolmogorov-Smirnov). It produces highly readable web reports, in which aggregations of the comparison rankings are available as well as all the plots of the single histogram overlays. The comparison procedure is fully automatic and scales smoothly towards ensembles of millions of histograms. Examples of RelMon utilisation within the regular workflows of the CMS collaboration and the advantages therewith obtained are described. Its interplay with the data quality monitoring infrastructure is illustrated, as well as its role in the QA of the event reconstruction code, its integration in the CMS software release cycle process, CMS user data analysis and dataset validation.
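
    The kind of per-pair compatibility scoring that RelMon automates can be sketched in NumPy/SciPy. RelMon itself relies on ROOT's implementations, so the formulas below (a ROOT-style two-histogram chi-squared and a KS distance between binned CDFs) are illustrative only:

```python
import numpy as np
from scipy import stats

def compare_histograms(h1, h2):
    """Two compatibility scores for a pair of binned histograms.

    Illustrative stand-ins for the goodness-of-fit tests RelMon runs;
    RelMon uses ROOT's own implementations.
    """
    h1 = np.asarray(h1, dtype=float)
    h2 = np.asarray(h2, dtype=float)
    n1, n2 = h1.sum(), h2.sum()
    # Chi-squared for two histograms with (possibly) different totals.
    mask = (h1 + h2) > 0
    chi2 = ((np.sqrt(n2 / n1) * h1[mask] - np.sqrt(n1 / n2) * h2[mask]) ** 2
            / (h1[mask] + h2[mask])).sum()
    p_chi2 = stats.chi2.sf(chi2, mask.sum() - 1)
    # Kolmogorov-Smirnov distance between the two binned CDFs.
    ks = np.abs(np.cumsum(h1) / n1 - np.cumsum(h2) / n2).max()
    return p_chi2, ks

rng = np.random.default_rng(2)
a, _ = np.histogram(rng.normal(0, 1, 10000), bins=50, range=(-4, 4))
b, _ = np.histogram(rng.normal(0, 1, 10000), bins=50, range=(-4, 4))
c, _ = np.histogram(rng.normal(0.5, 1, 10000), bins=50, range=(-4, 4))
p_same, ks_same = compare_histograms(a, b)   # same parent distribution
p_diff, ks_diff = compare_histograms(a, c)   # shifted parent distribution
```

    Compatible pairs yield a small KS distance and a large chi-squared p-value; a shifted distribution is flagged by both scores.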

  11. Promoting Racial Equality in the Nursing Curriculum.

    ERIC Educational Resources Information Center

    Foolchand, M. K.

    1995-01-01

    Equality in nursing education and the profession can be promoted in the following ways: a working policy on racism and equal opportunities; curriculum content that explores stereotypes, values, attitudes, and prejudices; and multicultural health research, education, and promotion. (SK)

  12. The neural bases for valuing social equality.

    PubMed

    Aoki, Ryuta; Yomogida, Yukihito; Matsumoto, Kenji

    2015-01-01

    The neural basis of how humans value and pursue social equality has become a major topic in social neuroscience research. Although recent studies have identified a set of brain regions and possible mechanisms that are involved in the neural processing of equality of outcome between individuals, how the human brain processes equality of opportunity remains unknown. In this review article, first we describe the importance of the distinction between equality of outcome and equality of opportunity, which has been emphasized in philosophy and economics. Next, we discuss possible approaches for empirical characterization of human valuation of equality of opportunity vs. equality of outcome. Understanding how these two concepts are distinct and interact with each other may provide a better explanation of complex human behaviors concerning fairness and social equality. PMID:25452125

  13. Impact of the radiotherapy technique on the correlation between dose-volume histograms of the bladder wall defined on MRI imaging and dose-volume/surface histograms in prostate cancer patients

    NASA Astrophysics Data System (ADS)

    Maggio, Angelo; Carillo, Viviana; Cozzarini, Cesare; Perna, Lucia; Rancati, Tiziana; Valdagni, Riccardo; Gabriele, Pietro; Fiorino, Claudio

    2013-04-01

    The aim of this study was to evaluate the correlation between the ‘true’ absolute and relative dose-volume histograms (DVHs) of the bladder wall, dose-wall histogram (DWH) defined on MRI imaging and other surrogates of bladder dosimetry in prostate cancer patients, planned both with 3D-conformal and intensity-modulated radiation therapy (IMRT) techniques. For 17 prostate cancer patients, previously treated with radical intent, CT and MRI scans were acquired and matched. The contours of bladder walls were drawn by using MRI images. External bladder surfaces were then used to generate artificial bladder walls by performing automatic contractions of 5, 7 and 10 mm. For each patient a 3D conformal radiotherapy (3DCRT) and an IMRT treatment plan was generated with a prescription dose of 77.4 Gy (1.8 Gy/fr) and DVH of the whole bladder of the artificial walls (DVH-5/10) and dose-surface histograms (DSHs) were calculated and compared against the DWH in absolute and relative value, for both treatment planning techniques. A specific software (VODCA v. 4.4.0, MSS Inc.) was used for calculating the dose-volume/surface histogram. Correlation was quantified for selected dose-volume/surface parameters by the Spearman correlation coefficient. The agreement between %DWH and DVH5, DVH7 and DVH10 was found to be very good (maximum average deviations below 2%, SD < 5%): DVH5 showed the best agreement. The correlation was slightly better for absolute (R = 0.80-0.94) compared to relative (R = 0.66-0.92) histograms. The DSH was also found to be highly correlated with the DWH, although slightly higher deviations were generally found. The DVH was not a good surrogate of the DWH (R < 0.7 for most of parameters). When comparing the two treatment techniques, more pronounced differences between relative histograms were seen for IMRT with respect to 3DCRT (p < 0.0001).

  14. 34 CFR 108.6 - Equal access.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 34 Education 1 2010-07-01 2010-07-01 false Equal access. 108.6 Section 108.6 Education Regulations of the Offices of the Department of Education OFFICE FOR CIVIL RIGHTS, DEPARTMENT OF EDUCATION EQUAL ACCESS TO PUBLIC SCHOOL FACILITIES FOR THE BOY SCOUTS OF AMERICA AND OTHER DESIGNATED YOUTH GROUPS § 108.6 Equal access. (a) General....

  15. 76 FR 59237 - Equal Credit Opportunity

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-26

    ... 7100 AD 78 Equal Credit Opportunity AGENCY: Board of Governors of the Federal Reserve System. ACTION: Final rule. SUMMARY: The Board is publishing a final rule amending Regulation B (Equal Credit Opportunity). Section 704B of the Equal Credit Opportunity Act (ECOA), as added by Section 1071 of the...

  16. 45 CFR 98.43 - Equal access.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 1 2010-10-01 2010-10-01 false Equal access. 98.43 Section 98.43 Public Welfare... Operations (Child Care Services)-Lead Agency and Provider Requirements § 98.43 Equal access. (a) The Lead... sufficient to ensure equal access, for eligible families in the area served by the Lead Agency, to child...

  17. 45 CFR 98.43 - Equal access.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 1 2011-10-01 2011-10-01 false Equal access. 98.43 Section 98.43 Public Welfare... Operations (Child Care Services)-Lead Agency and Provider Requirements § 98.43 Equal access. (a) The Lead... sufficient to ensure equal access, for eligible families in the area served by the Lead Agency, to child...

  18. Does 0.999... Really Equal 1?

    ERIC Educational Resources Information Center

    Norton, Anderson; Baldwin, Michael

    2012-01-01

    This article confronts the issue of why secondary and post-secondary students resist accepting the equality of 0.999... and 1, even after they have seen and understood logical arguments for the equality. In some sense, we might say that the equality holds by definition of 0.999..., but this definition depends upon accepting properties of the real…
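
    The equality itself is a one-line geometric series computation once 0.999... is defined as the limit of its partial sums:

```latex
0.999\ldots \;=\; \sum_{n=1}^{\infty} \frac{9}{10^{n}}
            \;=\; \frac{9/10}{1 - 1/10} \;=\; 1,
```

    using the geometric series sum \(\sum_{n=1}^{\infty} a r^{n-1} = a/(1-r)\) for \(|r| < 1\), here with \(a = 9/10\) and \(r = 1/10\).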

  19. Computer-aided diagnosis method for MRI-guided prostate biopsy within the peripheral zone using grey level histograms

    NASA Astrophysics Data System (ADS)

    Rampun, Andrik; Malcolm, Paul; Zwiggelaar, Reyer

    2015-02-01

    This paper describes a computer-aided diagnosis method for targeted prostate biopsies within the peripheral zone in T2-weighted MRI. We subdivided the peripheral zone into four regions, compared each sub-region's grey level histogram with malignant and normal histogram models, and used specific metrics to estimate the presence of abnormality. The initial evaluation was based on 200 MRI slices taken from 40 different patients, and we achieved an 87% correct classification rate, with 89% sensitivity and 86% specificity. The main contribution of this paper is a novel computer-aided diagnosis approach based on grey level histogram analysis of sub-regions. From a clinical point of view, the developed method could assist clinicians in performing targeted biopsies, which outperform the random biopsies currently in use.
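
    Comparing a sub-region's grey level histogram against malignant and normal models can be sketched with histogram intersection as the similarity metric; the metric, the models and the decision rule below are illustrative assumptions, not the paper's:

```python
import numpy as np

def classify_region(region, malignant_model, normal_model, bins=64):
    """Label a sub-region by comparing its grey-level histogram to two models.

    Uses histogram intersection as the similarity metric; the paper's own
    metrics and models may differ.
    """
    h, _ = np.histogram(region, bins=bins, range=(0, 256))
    h = h / h.sum()
    sim_mal = np.minimum(h, malignant_model).sum()   # histogram intersection
    sim_nor = np.minimum(h, normal_model).sum()
    return "malignant" if sim_mal > sim_nor else "normal"

rng = np.random.default_rng(3)

def model(mu):
    # Hypothetical intensity model: Gaussian grey levels around mu.
    s, _ = np.histogram(rng.normal(mu, 20, 50000), bins=64, range=(0, 256))
    return s / s.sum()

mal_model, nor_model = model(80), model(160)     # darker vs brighter tissue
dark_region = rng.normal(85, 20, (16, 16))
label = classify_region(dark_region, mal_model, nor_model)
```

    The region whose histogram overlaps the malignant model most is flagged for targeted biopsy.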

  20. Gandhigram: fostering equality through development.

    PubMed

    Devi, R K

    1991-12-01

    A noticeable trend towards 1-child families reveals the success of Gandhigram, an integrated rural development program in Tamil Nadu, India. Founded in 1947 by T.S. Soundram, Gandhigram has adhered to Gandhian principles of truth, nonviolence, castelessness, and equality between the sexes. The program has combined health and family planning with social welfare, education, and economic development. From the outset, Gandhigram has sought community participation, including the involvement of women. Girls have been encouraged to attend school up to at least the 10th level, and employment opportunities for women have been increased. Women's increased economic independence and level of education have influenced their decision to delay marriage by about 2 1/2 years, to choose their own partners, and to decide on the number of children they want. And increasingly, women are opting to limit family size to 2 -- and sometimes 1 -- child. Women are choosing to undergo tubectomies at a younger age, partly because of the availability of recanalization surgery, which has allowed mothers who have lost a child to conceive again. Unlike typical government family planning programs, which usually provide only contraception to meet the objective of a small family norm, Gandhigram also offers infertility services. Not all of Gandhigram's efforts have resulted in success. For example, a plan to develop a health insurance program did not succeed. However, Gandhigram's 44 years of experience have revealed the necessary elements for success. These elements include community participation, the participation of women through educational and employment programs, and easy access to services. PMID:12317116

  1. Hybrid time-frequency domain equalization for LED nonlinearity mitigation in OFDM-based VLC systems.

    PubMed

    Li, Jianfeng; Huang, Zhitong; Liu, Xiaoshuang; Ji, Yuefeng

    2015-01-12

    A novel hybrid time-frequency domain equalization scheme is proposed and experimentally demonstrated to mitigate the white light emitting diode (LED) nonlinearity in visible light communication (VLC) systems based on orthogonal frequency division multiplexing (OFDM). We handle the linear and nonlinear distortion separately in a nonlinear OFDM system. The linear part is equalized in the frequency domain, and the nonlinear part is compensated by an adaptive nonlinear time domain equalizer (N-TDE). The experimental results show that, with only a small number of parameters, the nonlinear equalizer can efficiently mitigate the LED nonlinearity. With the N-TDE, the modulation index (MI) and BER performance can be significantly enhanced. PMID:25835706

  2. Histogram flow mapping with optical coherence tomography for in vivo skin angiography of hereditary hemorrhagic telangiectasia

    PubMed Central

    Cheng, Kyle H. Y.; Mariampillai, Adrian; Lee, Kenneth K. C.; Vuong, Barry; Luk, Timothy W. H.; Ramjist, Joel; Curtis, Anne; Jakubovic, Henry; Kertes, Peter; Letarte, Michelle; Faughnan, Marie E.; HHT Investigator Group, Brain Vascular Malformation Consortium; Yang, Victor X. D.

    2014-01-01

    Speckle statistics of flowing scatterers have been well documented in the literature. Speckle variance optical coherence tomography maps microvascular networks by exploiting the large temporal variance of intensity caused mainly by random backscattering of light from the translational motion of red blood cells. A method to map the microvasculature malformations of skin based on the time-domain histograms of individual pixels is presented, with results obtained from both normal skin and skin containing vascular malformations. Results demonstrated that this method can potentially map deeper blood vessels and enhance the visualization of microvasculature in low-signal regions, while being resistant to motion (e.g., patient tremor or internal reflex movements). The overall results are manifested as more uniform en face projection maps of microvessels. Potential applications include clinical imaging of skin vascular abnormalities and wide-field skin angiography for the study of complex vascular networks. PMID:25140883

  3. The characterization of radioaerosol deposition in the healthy lung by histogram distribution analysis

    SciTech Connect

    Garrard, C.S.; Gerrity, T.R.; Schreiner, J.F.; Yeates, D.B.

    1981-12-01

    Thirteen healthy nonsmoking volunteers inhaled an 8.1 micrometers (MMAD) radioaerosol on two occasions. Aerosol deposition pattern within the right lung, as recorded by a gamma camera, was expressed as the 3rd and 4th moments of the distribution histogram (skew and kurtosis) of radioactivity during the first ten minutes after aerosol inhalation. Deposition pattern was also expressed as the percentage of deposited activity retained within the lung at 24 hr (24 hr % retention) and found to be significantly correlated with measures of skew (P less than 0.001). Tests of pulmonary function (FEV1, FVC, and MMFR) were significantly correlated with skew. Correlations were also demonstrated for these pulmonary function tests with 24 hr % retention but at lower levels of significance. Results indicate that changes in measures of forced expiratory airflow in healthy human volunteers influence deposition pattern and that the skew of the distribution of inhaled radioactivity may provide an acceptable index of deposition pattern.
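
    The 3rd and 4th moments used above can be computed directly from a binned activity distribution. A sketch, treating recorded counts over spatial bins as the distribution (bin indices stand in for position):

```python
import numpy as np

def skew_kurtosis(counts):
    """Skew (3rd moment) and kurtosis (4th moment) of an activity histogram.

    counts[i] is the recorded activity in spatial bin i; moments are taken
    over the bin index, weighted by activity. A sketch of the paper's
    distribution-histogram analysis, not its exact implementation.
    """
    counts = np.asarray(counts, dtype=float)
    x = np.arange(counts.size)
    p = counts / counts.sum()
    mu = (x * p).sum()
    sigma = np.sqrt(((x - mu) ** 2 * p).sum())
    skew = (((x - mu) / sigma) ** 3 * p).sum()
    kurt = (((x - mu) / sigma) ** 4 * p).sum()
    return skew, kurt

symmetric = np.array([1, 4, 6, 4, 1], dtype=float)   # binomial-like deposition
central = np.array([1, 2, 10, 2, 1], dtype=float)    # heavy central deposition
s_sym, k_sym = skew_kurtosis(symmetric)
s_cen, k_cen = skew_kurtosis(central)
```

    A symmetric deposition pattern has zero skew; a strongly central (peaked) pattern shows up as higher kurtosis.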

  4. Scale and Orientation-Based Background Weighted Histogram for Human Tracking

    NASA Astrophysics Data System (ADS)

    Laaroussi, Khadija; Saaidi, Abderrahim; Masrar, Mohamed; Satori, Khalid

    2016-09-01

    The Mean Shift procedure is a popular object tracking algorithm since it is fast, easy to implement and performs well in a range of conditions. However, the classic Mean Shift tracking algorithm fixes the size and orientation of the tracking window, which limits performance when the target's orientation and scale change. In this paper, we present a new human tracking algorithm based on the Mean Shift technique in order to estimate the position, scale and orientation changes of the target. This work combines moment features of the weight image with background information to design a robust tracking algorithm entitled Scale and Orientation-based Background Weighted Histogram (SOBWH). The experimental results show that the proposed SOBWH approach presents a good compromise between tracking precision and computation time, and validate its robustness, especially to large background variation, scale and orientation changes, and similar-background scenes.
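
    SOBWH builds on the background-weighted histogram idea of Comaniciu et al., in which target-model bins that are prominent in the surrounding background are down-weighted. A sketch of that underlying step only, not of the SOBWH algorithm itself:

```python
import numpy as np

def background_weighted_histogram(target_hist, background_hist, eps=1e-12):
    """Down-weight target-model bins that are prominent in the background.

    The classic background-weighted histogram (Comaniciu et al.), which
    SOBWH builds on; a sketch, not the SOBWH algorithm.
    """
    o = np.asarray(background_hist, dtype=float)
    o = o / (o.sum() + eps)
    o_min = o[o > 0].min()                  # smallest non-zero background bin
    v = np.minimum(o_min / (o + eps), 1.0)  # per-bin coefficients in (0, 1]
    q = np.asarray(target_hist, dtype=float) * v
    return q / (q.sum() + eps)

target = np.array([10.0, 30.0, 40.0, 20.0])    # raw target-model histogram
background = np.array([5.0, 0.0, 40.0, 5.0])   # surrounding-background histogram
q = background_weighted_histogram(target, background)
```

    Bin 2 dominates both the raw target model and the background, so the weighting suppresses it and the model peak moves to a bin that actually discriminates target from background.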

  5. ``Binless Wang-Landau sampling'' - a multicanonical Monte Carlo algorithm without histograms

    NASA Astrophysics Data System (ADS)

    Li, Ying Wai; Eisenbach, Markus

    Inspired by the very successful Wang-Landau (WL) sampling, we developed a multicanonical Monte Carlo algorithm to obtain the density of states (DOS) for physical systems with continuous state variables. Unlike the original WL scheme, where the DOS is obtained as a numerical array of finite resolution, our algorithm assumes an analytical form for the DOS using a well-chosen basis set, with coefficients determined iteratively, similar to the WL approach. To avoid undesirable artificial errors caused by the discretization of state variables, we dispense with the histogram that keeps track of the number of visits to energy levels and instead store the visited states directly for the fitting of the coefficients. This new algorithm has the advantage of producing an analytical expression for the DOS, while the original WL sampling can be readily recovered. This research was supported by the Office of Science of the Department of Energy under Contract DE-AC05-00OR22725.

  6. Biological dose volume histograms during conformal hypofractionated accelerated radiotherapy for prostate cancer

    SciTech Connect

    Koukourakis, Michael I.; Abatzoglou, Ioannis; Touloupidis, Stavros; Manavis, Ioannis

    2007-01-15

    Radiobiological data suggest that prostate cancer has a low {alpha}/{beta} ratio. Large radiotherapy fractions may, therefore, prove more efficacious than standard radiotherapy, while radiotherapy acceleration should further improve control rates. This study describes the radiobiology of a conformal hypofractionated accelerated radiotherapy scheme for the treatment of high risk prostate cancer. Anteroposterior fields to the pelvis deliver a daily dose of 2.7 Gy, while lateral fields confined to the prostate and seminal vesicles deliver an additional daily dose of 0.7 Gy. Radiotherapy is accomplished within 19 days (15 fractions). Dose volume histograms, calculated for tissue specific {alpha}/{beta} ratios and time factors, predict a high biological dose to the prostate and seminal vesicles (77-93 Gy). The biological dose to normal pelvic tissues is maintained at standard levels. Radiobiological dosimetry suggests that, using hypofractionated and accelerated radiotherapy, high biological radiation dose can be given to the prostate without overdosing normal tissues.
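
    The "biological dose" figures follow from the linear-quadratic biologically effective dose, which for n fractions of size d is

```latex
\mathrm{BED} = n\,d\left(1 + \frac{d}{\alpha/\beta}\right),
\qquad
\mathrm{EQD}_{2} = \frac{\mathrm{BED}}{1 + 2/(\alpha/\beta)} .
```

    A low α/β, as suggested for prostate cancer, makes the 3.4 Gy daily prostate fraction (2.7 + 0.7 Gy) biologically much more effective than the same physical dose delivered in 2 Gy fractions. The paper's quoted 77-93 Gy additionally folds in tissue-specific α/β values and time factors, which the bare formula above omits.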

  7. A 124 Mpixels/s VLSI design for histogram-based joint bilateral filtering.

    PubMed

    Tseng, Yu-Cheng; Hsu, Po-Hsiung; Chang, Tian-Sheuan

    2011-11-01

    This paper presents an efficient and scalable design for histogram-based bilateral filtering (BF) and joint BF (JBF) by memory reduction methods and architecture design techniques to solve the problems of high memory cost, high computational complexity, high bandwidth, and large range table. The presented memory reduction methods exploit the progressive computing characteristics to reduce the memory cost to 0.003%-0.020%, as compared with the original approach. Furthermore, the architecture design techniques adopt range domain parallelism and take advantage of the computing order and the numerical properties to solve the complexity, bandwidth, and range-table problems. The example design with a 90-nm complementary metal-oxide-semiconductor process can deliver the throughput to 124 Mpixels/s with 356-K gate counts and 23-KB on-chip memory. PMID:21659030

  8. Accelerating the weighted histogram analysis method by direct inversion in the iterative subspace

    PubMed Central

    Zhang, Cheng; Lai, Chun-Liang; Pettitt, B. Montgomery

    2016-01-01

    The weighted histogram analysis method (WHAM) for free energy calculations is a valuable tool to produce free energy differences with the minimal errors. Given multiple simulations, WHAM obtains from the distribution overlaps the optimal statistical estimator of the density of states, from which the free energy differences can be computed. The WHAM equations are often solved by an iterative procedure. In this work, we use a well-known linear algebra algorithm which allows for more rapid convergence to the solution. We find that the computational complexity of the iterative solution to WHAM and the closely-related multiple Bennett acceptance ratio (MBAR) method can be improved by using the method of direct inversion in the iterative subspace. We give examples from a lattice model, a simple liquid and an aqueous protein solution. PMID:27453632

  9. Improving the imaging of calcifications in CT by histogram-based selective deblurring

    NASA Astrophysics Data System (ADS)

    Rollano-Hijarrubia, Empar; van der Meer, Frits; van der Lugt, Aad; Weinans, Harrie; Vrooman, Henry; Vossepoel, Albert; Stokking, Rik

    2005-04-01

    Imaging of small high-density structures, such as calcifications, with computed tomography (CT) is limited by the spatial resolution of the system. Blur causes small calcifications to be imaged with lower contrast and overestimated volume, thereby hampering the analysis of vessels. The aim of this work is to reduce the blur of calcifications by applying three-dimensional (3D) deconvolution. Unfortunately, the high-frequency amplification of the deconvolution produces edge-related ring artifacts and enhances noise and original artifacts, which degrades the imaging of low-density structures. A method, referred to as Histogram-based Selective Deblurring (HiSD), was implemented to avoid these negative effects. HiSD uses the histogram information to generate a restored image in which the low-intensity voxel information of the observed image is combined with the high-intensity voxel information of the deconvolved image. To evaluate HiSD we scanned four in-vitro atherosclerotic plaques of carotid arteries with a multislice spiral CT and with a microfocus CT (μCT), used as reference. Restored images were generated from the observed images, and qualitatively and quantitatively compared with their corresponding μCT images. Transverse views and maximum-intensity projections of restored images show the decrease of blur of the calcifications in 3D. Measurements of the areas of 27 calcifications and total volumes of calcification of 4 plaques show that the overestimation of calcification was smaller for restored images (mean-error: 90% for area; 92% for volume) than for observed images (143%; 213%, respectively). The qualitative and quantitative analyses show that the imaging of calcifications in CT can be improved considerably by applying HiSD.
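
    The selective combination at the heart of HiSD can be sketched as an intensity-gated merge: low-intensity voxels keep their observed values, while high-intensity voxels (calcifications) take the deconvolved ones. The threshold would come from the histogram analysis; here it is simply passed in, and the whole sketch is a simplification of the published method:

```python
import numpy as np

def selective_deblur(observed, deconvolved, threshold):
    """Combine observed and deconvolved volumes by intensity.

    Keeps the observed value for low-intensity voxels (avoiding ring
    artifacts and amplified noise) and the deconvolved value for
    high-intensity voxels (calcifications). A simplified sketch of the
    HiSD idea; the published method derives the gating from the histogram.
    """
    observed = np.asarray(observed, dtype=float)
    deconvolved = np.asarray(deconvolved, dtype=float)
    return np.where(deconvolved >= threshold, deconvolved, observed)

obs = np.array([[100.0, 120.0], [300.0, 900.0]])   # blurred CT values (HU-like)
dec = np.array([[80.0, 140.0], [280.0, 1300.0]])   # deconvolved: sharper calcification
out = selective_deblur(obs, dec, threshold=700.0)
```

    Only the bright calcification voxel takes the deconvolved (deblurred) value; soft-tissue voxels keep their original, artifact-free intensities.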

  10. Fast Analysis of Molecular Dynamics Trajectories with Graphics Processing Units—Radial Distribution Function Histogramming

    PubMed Central

    Stone, John E.; Kohlmeyer, Axel

    2011-01-01

    The calculation of radial distribution functions (RDFs) from molecular dynamics trajectory data is a common and computationally expensive analysis task. The rate limiting step in the calculation of the RDF is building a histogram of the distance between atom pairs in each trajectory frame. Here we present an implementation of this histogramming scheme for multiple graphics processing units (GPUs). The algorithm features a tiling scheme to maximize the reuse of data at the fastest levels of the GPU’s memory hierarchy and dynamic load balancing to allow high performance on heterogeneous configurations of GPUs. Several versions of the RDF algorithm are presented, utilizing the specific hardware features found on different generations of GPUs. We take advantage of larger shared memory and atomic memory operations available on state-of-the-art GPUs to accelerate the code significantly. The use of atomic memory operations allows the fast, limited-capacity on-chip memory to be used much more efficiently, resulting in a fivefold increase in performance compared to the version of the algorithm without atomic operations. The ultimate version of the algorithm running in parallel on four NVIDIA GeForce GTX 480 (Fermi) GPUs was found to be 92 times faster than a multithreaded implementation running on an Intel Xeon 5550 CPU. On this multi-GPU hardware, the RDF between two selections of 1,000,000 atoms each can be calculated in 26.9 seconds per frame. The multi-GPU RDF algorithms described here are implemented in VMD, a widely used and freely available software package for molecular dynamics visualization and analysis. PMID:21547007
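
    The rate-limiting histogramming step is easy to state in plain NumPy; the version below is a CPU reference for what the GPU kernels accelerate, with no periodic boundaries and no normalisation by shell volume:

```python
import numpy as np

def rdf_histogram(pos_a, pos_b, r_max, n_bins):
    """Histogram of pairwise distances: the rate-limiting RDF step.

    A plain NumPy reference version of what the GPU kernels accelerate
    (no periodic boundaries, no shell-volume normalisation).
    """
    # All pairwise distance vectors: (Na, Nb, 3) -> distances (Na, Nb).
    d = np.linalg.norm(pos_a[:, None, :] - pos_b[None, :, :], axis=-1)
    hist, edges = np.histogram(d, bins=n_bins, range=(0.0, r_max))
    return hist, edges

rng = np.random.default_rng(4)
a = rng.uniform(0, 10, (200, 3))   # selection A: 200 atom positions
b = rng.uniform(0, 10, (300, 3))   # selection B: 300 atom positions
hist, edges = rdf_histogram(a, b, r_max=5.0, n_bins=50)
```

    The GPU implementations tile this all-pairs loop through shared memory and accumulate bin counts with atomic operations; the quadratic pair count is exactly why the histogramming step dominates the runtime.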

  11. Convergence and error estimation in free energy calculations using the weighted histogram analysis method

    PubMed Central

    Zhu, Fangqiang; Hummer, Gerhard

    2012-01-01

    The weighted histogram analysis method (WHAM) has become the standard technique for the analysis of umbrella sampling simulations. In this paper, we address the challenges (1) of obtaining fast and accurate solutions of the coupled nonlinear WHAM equations, (2) of quantifying the statistical errors of the resulting free energies, (3) of diagnosing possible systematic errors, and (4) of optimal allocation of the computational resources. Traditionally, the WHAM equations are solved by a fixed-point direct iteration method, despite poor convergence and possible numerical inaccuracies in the solutions. Here we instead solve the mathematically equivalent problem of maximizing a target likelihood function, by using superlinear numerical optimization algorithms with a significantly faster convergence rate. To estimate the statistical errors in one-dimensional free energy profiles obtained from WHAM, we note that for densely spaced umbrella windows with harmonic biasing potentials, the WHAM free energy profile can be approximated by a coarse-grained free energy obtained by integrating the mean restraining forces. The statistical errors of the coarse-grained free energies can be estimated straightforwardly and then used for the WHAM results. A generalization to multidimensional WHAM is described. We also propose two simple statistical criteria to test the consistency between the histograms of adjacent umbrella windows, which help identify inadequate sampling and hysteresis in the degrees of freedom orthogonal to the reaction coordinate. Together, the estimates of the statistical errors and the diagnostics of inconsistencies in the potentials of mean force provide a basis for the efficient allocation of computational resources in free energy simulations. PMID:22109354
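
    The traditional fixed-point direct iteration that the paper improves upon can be written compactly. A sketch for 1D umbrella sampling with harmonic biases on a deliberately flat landscape, so the recovered profile should come out flat:

```python
import numpy as np

def wham(histograms, n_samples, bias, kT=1.0, tol=1e-7, max_iter=10000):
    """Fixed-point (direct-iteration) solution of the WHAM equations.

    histograms: (K, M) counts per window and bin; bias: (K, M) biasing
    potential of window k at bin centre m. Returns the unbiased free
    energy profile over the bins. This is the traditional iteration the
    paper improves upon, shown for illustration only.
    """
    K, M = histograms.shape
    f = np.zeros(K)                       # window free energies
    c = np.exp(-bias / kT)                # (K, M) bias Boltzmann factors
    total = histograms.sum(axis=0)        # (M,) pooled counts
    for _ in range(max_iter):
        denom = (n_samples[:, None] * np.exp(f[:, None] / kT) * c).sum(axis=0)
        p = total / denom                 # unnormalised unbiased probabilities
        f_new = -kT * np.log((c * p[None, :]).sum(axis=1))
        f_new -= f_new[0]                 # fix the arbitrary additive constant
        if np.abs(f_new - f).max() < tol:
            f = f_new
            break
        f = f_new
    p = p / p.sum()
    free = -kT * np.log(p)
    return free - free.min()

# Umbrella windows on a flat landscape: biased samples are exact Gaussians.
rng = np.random.default_rng(5)
centers = np.linspace(-2, 2, 5)
k_spring, kT = 4.0, 1.0
edges = np.linspace(-3, 3, 61)
mids = 0.5 * (edges[:-1] + edges[1:])
hists = np.array([np.histogram(rng.normal(c, np.sqrt(kT / k_spring), 20000),
                               bins=edges)[0] for c in centers])
bias = 0.5 * k_spring * (mids[None, :] - centers[:, None]) ** 2
profile = wham(hists, hists.sum(axis=1).astype(float), bias, kT=kT)
```

    Because the underlying landscape is flat, the recovered profile should be flat to within statistical noise; the slow convergence of exactly this iteration is what motivates the likelihood-maximisation approach of the paper.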

  12. Histogram Analysis of Gadoxetic Acid-Enhanced MRI for Quantitative Hepatic Fibrosis Measurement

    PubMed Central

    Kim, Honsoul; Park, Seong Ho; Kim, Eun Kyung; Kim, Myeong-Jin; Park, Young Nyun; Park, Hae-Jeong; Choi, Jin-Young

    2014-01-01

    Purpose The diagnosis and monitoring of liver fibrosis is an important clinical issue; however, this is usually achieved by invasive methods such as biopsy. We aimed to determine whether histogram analysis of hepatobiliary phase images of gadoxetic acid-enhanced magnetic resonance imaging (MRI) can provide non-invasive quantitative measurement of liver fibrosis. Methods This retrospective study was approved by the institutional ethics committee, and a waiver of informed consent was obtained. Hepatobiliary phase images of preoperative gadoxetic acid-enhanced MRI studies of 105 patients (69 males, 36 females; age 56.1±12.2) with pathologically documented liver fibrosis grades were analyzed. Fibrosis staging was F0/F1/F2/F3/F4 (METAVIR system) for 11/20/13/15/46 patients, respectively. Four regions-of-interest (ROI, each about 2 cm2) were placed on predetermined locations of representative images. The measured signal intensity of pixels in each ROI was used to calculate corrected coefficient of variation (cCV), skewness, and kurtosis. An average value of each parameter was calculated for comparison. Statistical analysis was performed by ANOVA, receiver operating characteristic (ROC) curve analysis, and linear regression. Results The cCV showed statistically significant differences among pathological fibrosis grades (P<0.001) whereas skewness and kurtosis did not. Univariable linear regression analysis suggested cCV to be a meaningful parameter in predicting the fibrosis grade (P<0.001, β = 0.40 and standard error  = 0.06). For discriminating F0-3 from F4, the area under ROC score was 0.857, standard deviation 0.036, 95% confidence interval 0.785–0.928. Conclusion Histogram analysis of hepatobiliary phase images of gadoxetic acid-enhanced MRI can provide non-invasive quantitative measurements of hepatic fibrosis. PMID:25460180

  13. Seismic remote sensing image segmentation based on spectral histogram and dynamic region merging

    NASA Astrophysics Data System (ADS)

    Wang, Peng; Sun, Genyun; Wang, Zhenjie

    2015-12-01

    Image segmentation is the foundation of seismic information extraction from high-resolution remote sensing images, but the complexity of seismic images poses great challenges to segmentation. Compared with traditional pixel-level approaches, region-level approaches prevail in dealing with this complexity. This paper addresses the seismic image segmentation problem in a region-merging style. Starting from many over-segmented regions, segmentation is performed by iteratively merging neighboring regions. In the proposed algorithm, the merging criterion and the merging order are the two essential issues. An effective merging criterion largely depends on the region feature and the neighbor homogeneity measure. The spectral histogram of a region represents its global feature and enhances the discriminability of neighboring regions, so we use it to build the merging criterion. Under a given merging criterion, better performance is obtained if the most similar regions are always merged first, which can be cast as a least-cost problem. Rather than predefining an order queue, we solve the order problem with a dynamic scheme. The proposed approach contains three main parts. First, starting from the over-segmented regions, spectral histograms are constructed to represent each region. Then, the merging criterion is built from a homogeneity measure that combines distance and shape. Finally, neighboring regions are dynamically merged following dynamic programming (DP) and a breadth-first strategy. Experiments are conducted on earthquake images, including collapsed buildings and secondary seismic geological disasters. The experimental results show that the proposed method segments seismic images more correctly.
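
    The spectral histogram feature and the resulting merging cost can be sketched with a minimal two-filter bank (intensity plus the two gradient components) and an L1 distance; the paper's actual filter bank and homogeneity measure differ:

```python
import numpy as np

def spectral_histogram(image, mask, bins=16):
    """Concatenated filter-response histograms describing one region.

    The filter bank here is just intensity plus the two gradient
    components; a minimal sketch of the spectral-histogram feature,
    not the paper's full bank.
    """
    gy, gx = np.gradient(image.astype(float))
    feats = []
    for band, lo, hi in ((image, 0, 256), (gx, -64, 64), (gy, -64, 64)):
        h, _ = np.histogram(band[mask], bins=bins, range=(lo, hi))
        feats.append(h / max(h.sum(), 1))
    return np.concatenate(feats)

def merging_cost(image, mask_a, mask_b):
    """L1 distance between spectral histograms: low cost => merge first."""
    return np.abs(spectral_histogram(image, mask_a)
                  - spectral_histogram(image, mask_b)).sum()

rng = np.random.default_rng(6)
img = np.zeros((32, 32))
img[:, :16] = rng.normal(60, 5, (32, 16))    # homogeneous left half
img[:, 16:] = rng.normal(180, 5, (32, 16))   # homogeneous right half
left_a = np.zeros((32, 32), bool); left_a[:16, :16] = True
left_b = np.zeros((32, 32), bool); left_b[16:, :16] = True
right = np.zeros((32, 32), bool); right[:, 16:] = True
cost_same = merging_cost(img, left_a, left_b)  # two halves of one true region
cost_diff = merging_cost(img, left_a, right)   # across the real boundary
```

    Over-segmented fragments of the same true region get a low cost and merge first; pairs straddling a genuine boundary get a high cost and survive, which is the behavior the merging criterion must guarantee.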

  14. Fast Analysis of Molecular Dynamics Trajectories with Graphics Processing Units-Radial Distribution Function Histogramming.

    PubMed

    Levine, Benjamin G; Stone, John E; Kohlmeyer, Axel

    2011-05-01

    The calculation of radial distribution functions (RDFs) from molecular dynamics trajectory data is a common and computationally expensive analysis task. The rate limiting step in the calculation of the RDF is building a histogram of the distance between atom pairs in each trajectory frame. Here we present an implementation of this histogramming scheme for multiple graphics processing units (GPUs). The algorithm features a tiling scheme to maximize the reuse of data at the fastest levels of the GPU's memory hierarchy and dynamic load balancing to allow high performance on heterogeneous configurations of GPUs. Several versions of the RDF algorithm are presented, utilizing the specific hardware features found on different generations of GPUs. We take advantage of larger shared memory and atomic memory operations available on state-of-the-art GPUs to accelerate the code significantly. The use of atomic memory operations allows the fast, limited-capacity on-chip memory to be used much more efficiently, resulting in a fivefold increase in performance compared to the version of the algorithm without atomic operations. The ultimate version of the algorithm running in parallel on four NVIDIA GeForce GTX 480 (Fermi) GPUs was found to be 92 times faster than a multithreaded implementation running on an Intel Xeon 5550 CPU. On this multi-GPU hardware, the RDF between two selections of 1,000,000 atoms each can be calculated in 26.9 seconds per frame. The multi-GPU RDF algorithms described here are implemented in VMD, a widely used and freely available software package for molecular dynamics visualization and analysis. PMID:21547007
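    The rate-limiting pair-distance histogramming step can be written compactly on the CPU with NumPy (a sketch for orientation only; the paper's GPU kernels tile the same double loop through shared memory, and the chunk size here is an arbitrary stand-in for that tiling):

```python
import numpy as np

def rdf_histogram(coords_a, coords_b, r_max, n_bins):
    """Histogram of pairwise distances between two atom selections.

    A plain NumPy version of the rate-limiting step of an RDF
    calculation, processing selection A in chunks to bound memory use.
    """
    edges = np.linspace(0.0, r_max, n_bins + 1)
    hist = np.zeros(n_bins, dtype=np.int64)
    chunk = 1024
    for i in range(0, len(coords_a), chunk):
        # All distances from this chunk of A to every atom in B.
        d = np.linalg.norm(coords_a[i:i + chunk, None, :] - coords_b[None, :, :], axis=-1)
        h, _ = np.histogram(d, bins=edges)
        hist += h
    return hist, edges
```

    Normalizing this histogram by the ideal-gas pair count per shell then gives g(r); the GPU versions in the paper accelerate exactly the distance-and-bin loop above.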

  15. Fast analysis of molecular dynamics trajectories with graphics processing units-Radial distribution function histogramming

    SciTech Connect

    Levine, Benjamin G.; Stone, John E.; Kohlmeyer, Axel

    2011-05-01

    The calculation of radial distribution functions (RDFs) from molecular dynamics trajectory data is a common and computationally expensive analysis task. The rate limiting step in the calculation of the RDF is building a histogram of the distance between atom pairs in each trajectory frame. Here we present an implementation of this histogramming scheme for multiple graphics processing units (GPUs). The algorithm features a tiling scheme to maximize the reuse of data at the fastest levels of the GPU's memory hierarchy and dynamic load balancing to allow high performance on heterogeneous configurations of GPUs. Several versions of the RDF algorithm are presented, utilizing the specific hardware features found on different generations of GPUs. We take advantage of larger shared memory and atomic memory operations available on state-of-the-art GPUs to accelerate the code significantly. The use of atomic memory operations allows the fast, limited-capacity on-chip memory to be used much more efficiently, resulting in a fivefold increase in performance compared to the version of the algorithm without atomic operations. The ultimate version of the algorithm running in parallel on four NVIDIA GeForce GTX 480 (Fermi) GPUs was found to be 92 times faster than a multithreaded implementation running on an Intel Xeon 5550 CPU. On this multi-GPU hardware, the RDF between two selections of 1,000,000 atoms each can be calculated in 26.9 s per frame. The multi-GPU RDF algorithms described here are implemented in VMD, a widely used and freely available software package for molecular dynamics visualization and analysis.

  16. Performance analysis of a dual-tree algorithm for computing spatial distance histograms.

    PubMed

    Chen, Shaoping; Tu, Yi-Cheng; Xia, Yuni

    2011-08-01

    Many scientific and engineering fields produce large volumes of spatiotemporal data. The storage, retrieval, and analysis of such data impose great challenges on database system design. Analysis of scientific spatiotemporal data often involves computing functions of all point-to-point interactions. One such analytic, the Spatial Distance Histogram (SDH), is of vital importance to scientific discovery. Recently, algorithms for efficient SDH processing in large-scale scientific databases have been proposed. These algorithms adopt a recursive tree-traversing strategy to process point-to-point distances in the visited tree nodes in batches, and thus require less time than the brute-force approach, in which all pairwise distances must be computed. Despite the promising experimental results, the complexity of such algorithms has not been thoroughly studied. In this paper, we present an analysis of such algorithms based on a geometric modeling approach. The main technique is to transform the analysis of point counts into the problem of quantifying the area of regions where pairwise distances can be processed in batches by the algorithm. From the analysis, we conclude that the number of pairwise distances left to be processed decreases exponentially as more levels of the tree are visited. This leads to a proof of time complexity lower than the quadratic time needed by a brute-force algorithm, and builds the foundation for a constant-time approximate algorithm. Our model is also general in that it works for a wide range of point spatial distributions, histogram types, and space-partitioning options in building the tree. PMID:21804753
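    The geometric idea behind processing distances "in batches" can be illustrated with the resolvability test for a pair of tree cells (a 2D sketch with our own naming; the paper's analysis quantifies how many point pairs fail this test at each tree level):

```python
import math

def min_max_cell_distance(cell_a, cell_b):
    """Min and max possible distance between points in two axis-aligned
    rectangles, each given as (xmin, ymin, xmax, ymax)."""
    ax0, ay0, ax1, ay1 = cell_a
    bx0, by0, bx1, by1 = cell_b
    dx_min = max(bx0 - ax1, ax0 - bx1, 0.0)
    dy_min = max(by0 - ay1, ay0 - by1, 0.0)
    dx_max = max(ax1 - bx0, bx1 - ax0)
    dy_max = max(ay1 - by0, by1 - ay0)
    return math.hypot(dx_min, dy_min), math.hypot(dx_max, dy_max)

def resolvable(cell_a, cell_b, bucket_width):
    """A cell pair is 'resolvable' when every inter-cell distance falls
    into the same histogram bucket, so all its point pairs can be
    counted at once without computing individual distances."""
    d_min, d_max = min_max_cell_distance(cell_a, cell_b)
    return int(d_min // bucket_width) == int(d_max // bucket_width)
```

    Unresolvable pairs are recursed into at the next tree level; the exponential decay of their number is what the paper's geometric model establishes.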

  17. Development of a Quality Assurance Procedure for Dose Volume Histogram Analysis

    NASA Astrophysics Data System (ADS)

    Davenport, David A.

    The role of the dose-volume histogram (DVH) is rapidly expanding in radiation oncology treatment planning. DVHs are already relied upon to differentiate between two similar plans and to evaluate organ-at-risk dosage. Their role will become even more important as progress continues toward implementing biologically based treatment planning systems. It is therefore imperative that the accuracy of DVHs be evaluated and reappraised after any major software or hardware upgrade affecting a treatment planning system (TPS). The purpose of this work is to create and implement a comprehensive quality assurance procedure evaluating dose-volume histograms to ensure their accuracy while satisfying American College of Radiology guidelines. Virtual phantoms of known volumes were created in the Pinnacle TPS and exposed to different beam arrangements. Variables including grid size and slice thickness were varied and their effects analyzed. The resulting DVHs were evaluated by comparison to the commissioned percent depth dose values using a custom Excel spreadsheet. After determining the uncertainty of the DVH based on these variables, multiple second-check calculations were performed using the MIM Maestro and Matlab software packages. The uncertainties of the DVHs were shown to be less than +/- 3%. The average uncertainty was shown to be less than +/- 1%. The second-check procedures resulted in mean percent differences of less than 1%, which confirms the accuracy of DVH calculation in Pinnacle and the effectiveness of the quality assurance template. The importance of knowing the limits of accuracy of the DVHs, which are routinely used to assess the quality of clinical treatment plans, cannot be overestimated. The developed comprehensive QA procedure evaluating the accuracy of the DVH statistical analysis will become a part of our clinical arsenal for periodic tests of the treatment planning system. It will also be performed at the time of commissioning and after any major software
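    For orientation, the cumulative DVH that such a QA procedure checks can be computed directly from a structure's voxel doses (a generic sketch, not the Pinnacle, MIM Maestro, or Matlab implementation used in the work):

```python
import numpy as np

def cumulative_dvh(dose_values, bin_width=0.1):
    """Cumulative dose-volume histogram: for each dose level D, the
    fraction of the structure's volume receiving at least D.

    dose_values: 1D array of voxel doses (Gy) inside one structure,
    assuming uniform voxel volume.
    """
    dose_values = np.asarray(dose_values, dtype=float)
    edges = np.arange(0.0, dose_values.max() + 2 * bin_width, bin_width)
    counts, _ = np.histogram(dose_values, bins=edges)
    # Reverse cumulative sum: volume receiving >= each lower bin edge.
    volume_at_least = counts[::-1].cumsum()[::-1] / dose_values.size
    return edges[:-1], volume_at_least
```

    A QA check of the kind described would compare these values, for a phantom of known volume and dose distribution, against the expected analytic result.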

  18. Size distribution of linear and helical polymers in actin solution analyzed by photon counting histogram.

    PubMed

    Terada, Naofumi; Shimozawa, Togo; Ishiwata, Shin'ichi; Funatsu, Takashi

    2007-03-15

    Actin is a ubiquitous protein that is a major component of the cytoskeleton, playing an important role in muscle contraction and cell motility. At steady state, actin monomers and filaments (F-actin) coexist, and actin subunits continuously attach and detach at the filament ends. However, the size distribution of actin oligomers in F-actin solution has never been clarified. In this study, we investigated the size distribution of actin oligomers using photon-counting histograms. For this purpose, actin was labeled with a fluorescent dye, and the emitted photons were detected by confocal optics (the detection volume was of femtoliter (fL) order). Photon-counting histograms were analyzed to obtain the number distribution of actin oligomers in the detection area from their brightness, assuming that the brightness of an oligomer was proportional to the number of protomers. We found that the major populations at physiological ionic strength were 1-5mers. For data analysis, we successfully applied the theory of linear and helical aggregations of macromolecules. The model postulates three states of actin, i.e., monomers, linear polymers, and helical polymers. Here we obtained three parameters: the equilibrium constants for polymerization of linear polymers, K_l = (5.2 ± 1.1) × 10^6 M^-1, and helical polymers, K_h = (1.6 ± 0.5) × 10^7 M^-1; and the ratio of helical to linear trimers, γ = (3.6 ± 2.3) × 10^-2. The excess free energy of transforming a linear trimer to a helical trimer, which is assumed to be a nucleus for helical polymers, was calculated to be 2.0 kcal/mol. These analyses demonstrate that the oligomeric phase at steady state is predominantly composed of linear 1-5mers, and the transition from linear to helical polymers occurs on the level of 5-7mers. PMID:17172301
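    The reported excess free energy follows from the helical-to-linear trimer ratio via ΔG = -RT ln γ (a check of the abstract's arithmetic; room temperature T = 298 K is our assumption, as the abstract does not state the temperature used):

```python
import math

# Excess free energy of converting a linear trimer to a helical trimer,
# Delta G = -RT ln(gamma), using the ratio reported in the abstract.
R = 1.987e-3     # gas constant, kcal/(mol*K)
T = 298.0        # assumed room temperature, K (not stated in the abstract)
gamma = 3.6e-2   # helical-to-linear trimer ratio from the abstract

delta_g = -R * T * math.log(gamma)   # kcal/mol, ~2.0 as reported
```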

  19. A New Blind Equalization Method Based on Negentropy Minimization for Constant Modulus Signals

    NASA Astrophysics Data System (ADS)

    Choi, Sooyong; Chung, Jong-Moon; Jeong, Wun-Cheol

    A new blind adaptive equalization method for constant modulus signals based on minimizing the approximate negentropy of the estimation error for a finite-length equalizer is presented. We consider the approximate negentropy, computed using nonpolynomial expansions of the estimation error, as a new performance criterion to improve on a linear equalizer using the conventional constant modulus algorithm (CMA). Negentropy includes higher-order statistical information, and its minimization provides improved convergence, performance, and accuracy compared to traditional methods such as the CMA in terms of the bit error rate (BER). The proposed equalizer also shows faster convergence than the CMA equalizer and is more robust to nonlinear distortion.
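    The conventional CMA baseline that the negentropy criterion is compared against updates a FIR equalizer with the stochastic gradient of the constant-modulus cost (a textbook sketch; the tap count, step size, and center-spike initialization below are illustrative choices, not values from the paper):

```python
import numpy as np

def cma_equalize(x, n_taps=11, mu=1e-3, radius=1.0):
    """Constant modulus algorithm (CMA): adapt FIR taps w so that the
    output modulus |y|^2 is driven toward the constant `radius`.

    x: received (distorted) complex baseband samples.
    Returns the equalized output y and the final taps.
    """
    w = np.zeros(n_taps, dtype=complex)
    w[n_taps // 2] = 1.0                 # center-spike initialization
    y = np.zeros(len(x), dtype=complex)
    for n in range(n_taps, len(x)):
        u = x[n - n_taps:n][::-1]        # tap-delay line, most recent first
        y[n] = np.dot(w, u)
        e = (np.abs(y[n]) ** 2 - radius) * y[n]   # CM cost gradient term
        w -= mu * e * np.conj(u)
    return y, w
```

    Because the update uses only the output modulus, no training sequence is needed, which is what makes the equalization "blind".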

  20. TaBoo SeArch Algorithm with a Modified Inverse Histogram for Reproducing Biologically Relevant Rare Events of Proteins.

    PubMed

    Harada, Ryuhei; Takano, Yu; Shigeta, Yasuteru

    2016-05-10

    The TaBoo SeArch (TBSA) algorithm [Harada et al., J. Comput. Chem. 2015, 36, 763-772 and Harada et al., Chem. Phys. Lett. 2015, 630, 68-75] was recently proposed as an enhanced conformational sampling method for reproducing biologically relevant rare events of a given protein. In TBSA, an inverse histogram of the original distribution, mapped onto a set of reaction coordinates, is constructed from trajectories obtained by multiple short-time molecular dynamics (MD) simulations. Rarely occurring states of a given protein are statistically selected as new initial states based on the inverse histogram, and resampling is performed by restarting the MD simulations from the new initial states to promote the conformational transition. In this process, the definition of the inverse histogram, which characterizes the rarely occurring states, is crucial for the efficiency of TBSA. In this study, we propose a simple modification of the inverse histogram to further accelerate the convergence of TBSA. As demonstrations of the modified TBSA, we applied it to (a) hydrogen bonding rearrangements of Met-enkephalin, (b) large-amplitude domain motions of Glutamine-Binding Protein, and (c) folding processes of the B domain of Staphylococcus aureus Protein A. All demonstrations numerically proved that the modified TBSA reproduced these biologically relevant rare events with nanosecond-order simulation times, although a set of microsecond-order, canonical MD simulations failed to reproduce the rare events, indicating the high efficiency of the modified TBSA. PMID:27070761
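    The selection step, resampling seeds in inverse proportion to how often a state has been visited, can be sketched as follows (our simplified reading of the original inverse histogram on a single reaction coordinate; the paper's modification changes how rarity is scored, which this toy version does not capture):

```python
import numpy as np

def select_rare_states(rc_values, n_bins=20, n_select=10, rng=None):
    """Pick resampling seeds biased toward rarely visited states.

    rc_values: one reaction-coordinate value per snapshot.
    Each snapshot is weighted by the inverse of its bin's population,
    so snapshots in sparsely visited bins are preferentially chosen.
    """
    rng = rng or np.random.default_rng()
    rc_values = np.asarray(rc_values, dtype=float)
    counts, edges = np.histogram(rc_values, bins=n_bins)
    bin_idx = np.clip(np.digitize(rc_values, edges[1:-1]), 0, n_bins - 1)
    weights = 1.0 / counts[bin_idx]          # inverse histogram weight
    weights /= weights.sum()
    return rng.choice(len(rc_values), size=n_select, replace=False, p=weights)
```

    Restarting short MD runs from the selected snapshots then biases sampling toward the tails of the distribution without altering the force field.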

  1. Students' Misconceptions in Interpreting Center and Variability of Data Represented via Histograms and Stem-and-Leaf Plots

    ERIC Educational Resources Information Center

    Cooper, Linda L.; Shore, Felice S.

    2008-01-01

    This paper identifies and discusses misconceptions that students have in making judgments of center and variability when data are presented graphically. An assessment addressing interpreting center and variability in histograms and stem-and-leaf plots was administered to, and follow-up interviews were conducted with, undergraduates enrolled in…

  2. Estimation of the content of fat and parenchyma in breast tissue using MRI T1 histograms and phantoms.

    PubMed

    Boston, Raymond C; Schnall, Mitchell D; Englander, Sarah A; Landis, J Richard; Moate, Peter J

    2005-05-01

    Mammographic breast density has been correlated with breast cancer risk. Estimation of the volumetric composition of breast tissue using three-dimensional MRI has been proposed, but accuracy depends upon the estimation methods employed. The use of segmentation based on T1 relaxation rates allows quantitative estimates of fat and parenchyma volume, but is limited by partial volume effects. An investigation employing phantom breast tissue composed of various combinations of chicken breast (to represent parenchyma) and cooking fats was carried out to elucidate the factors that influence MRI T1 histograms. Using the phantoms, T1 histograms and their known fat and parenchyma composition, a logistic distribution function was derived to describe the apportioning of the T1 histogram to fat and parenchyma. This function and T1 histograms were then used to predict the fat and parenchyma content of breasts from 14 women. Using this method, the composition of the breast tissue in the study population was as follows: fat 69.9 ± 22.9% and parenchyma 30.1 ± 22.9%. PMID:15919606
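    The apportioning function can be illustrated with a logistic weight over T1 (a toy sketch; the midpoint and slope values below are hypothetical placeholders for the parameters the authors fit from the phantom data):

```python
import math

def fat_fraction(t1, t1_mid, slope):
    """Logistic weight assigning a T1 histogram bin to fat (vs. parenchyma).

    t1_mid and slope are hypothetical calibration values standing in for
    the phantom-fitted parameters; short-T1 bins are weighted toward fat.
    """
    return 1.0 / (1.0 + math.exp((t1 - t1_mid) / slope))

def breast_composition(t1_bins, counts, t1_mid=500.0, slope=100.0):
    """Apportion a T1 histogram into fat and parenchyma volume fractions."""
    fat = sum(c * fat_fraction(t, t1_mid, slope) for t, c in zip(t1_bins, counts))
    total = sum(counts)
    return fat / total, 1.0 - fat / total
```

    The soft logistic split, rather than a hard T1 cutoff, is what lets partial-volume voxels contribute to both compartments.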

  3. Studying the time histogram of a terrestrial electron beam detected from the opposite hemisphere of its associated TGF

    NASA Astrophysics Data System (ADS)

    Sarria, D.; Blelly, P.-L.; Briggs, M. S.; Forme, F.

    2016-05-01

    Terrestrial gamma-ray flashes are bursts of X/gamma photons, correlated to thunderstorms. By interacting with the atmosphere, the photons produce a substantial number of electrons and positrons. Some of these reach a sufficiently high altitude that their interactions with the atmosphere become negligible, and they are then guided by geomagnetic field lines, forming a Terrestrial Electron Beam. On 9 December 2009, the Gamma-Ray Burst Monitor (GBM) instrument on board the Fermi Space Telescope made a particularly interesting measurement of such an event. To study this type of event in detail, we perform Monte-Carlo simulations and focus on the resulting time histograms. In agreement with previous work, we show that the histogram measured by Fermi GBM is reproducible from a simulation. We then show that the time histogram resulting from this simulation is only weakly dependent on the production altitude, duration, beaming angle, and spectral shape of the associated terrestrial gamma-ray flash. Finally, we show that the time histogram can be decomposed into three populations of leptons, coming from the opposite hemisphere, and mirroring back to the satellite with or without interacting with the atmosphere, and that these populations can be clearly distinguished by their pitch angles.

  4. Dynamic Contrast-enhanced MR Imaging in Renal Cell Carcinoma: Reproducibility of Histogram Analysis on Pharmacokinetic Parameters

    PubMed Central

    Wang, Hai-yi; Su, Zi-hua; Xu, Xiao; Sun, Zhi-peng; Duan, Fei-xue; Song, Yuan-yuan; Li, Lu; Wang, Ying-wei; Ma, Xin; Guo, Ai-tao; Ma, Lin; Ye, Hui-yi

    2016-01-01

    Pharmacokinetic parameters derived from dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) have been increasingly used to evaluate the permeability of tumor vessels. Histogram metrics are a recognized promising method of quantitative MR imaging that has been recently introduced in analysis of DCE-MRI pharmacokinetic parameters in oncology due to tumor heterogeneity. In this study, 21 patients with renal cell carcinoma (RCC) underwent paired DCE-MRI studies on a 3.0 T MR system. Extended Tofts model and population-based arterial input function were used to calculate kinetic parameters of RCC tumors. Mean value and histogram metrics (Mode, Skewness and Kurtosis) of each pharmacokinetic parameter were generated automatically using ImageJ software. Intra- and inter-observer reproducibility and scan-rescan reproducibility were evaluated using intra-class correlation coefficients (ICCs) and coefficient of variation (CoV). Our results demonstrated that the histogram method (Mode, Skewness and Kurtosis) was not superior to the conventional Mean value method in reproducibility evaluation on DCE-MRI pharmacokinetic parameters (K^trans and Ve) in renal cell carcinoma, especially for Skewness and Kurtosis, which showed lower intra-observer, inter-observer and scan-rescan reproducibility than Mean value. Our findings suggest that additional studies are necessary before wide incorporation of histogram metrics in quantitative analysis of DCE-MRI pharmacokinetic parameters. PMID:27380733
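    The summary statistics compared in the study, and a within-subject CoV for the scan-rescan pairs, can be computed as follows (a generic sketch; the paper used ImageJ for the metrics, and its exact CoV convention may differ):

```python
import numpy as np

def histogram_metrics(values, n_bins=50):
    """Mean, mode, skewness and kurtosis of a parameter map, the four
    per-tumor summary statistics compared in the reproducibility study."""
    values = np.asarray(values, dtype=float)
    counts, edges = np.histogram(values, bins=n_bins)
    mode = 0.5 * (edges[np.argmax(counts)] + edges[np.argmax(counts) + 1])
    m = values.mean()
    s = values.std()
    skew = np.mean(((values - m) / s) ** 3)
    kurt = np.mean(((values - m) / s) ** 4) - 3.0   # excess kurtosis
    return {"mean": m, "mode": mode, "skewness": skew, "kurtosis": kurt}

def coefficient_of_variation(scan, rescan):
    """Within-subject CoV (%) between paired scan/re-scan measurements."""
    scan, rescan = np.asarray(scan, float), np.asarray(rescan, float)
    diffs = scan - rescan
    means = 0.5 * (scan + rescan)
    within_sd = np.sqrt(np.mean(diffs ** 2) / 2.0)
    return 100.0 * within_sd / means.mean()
```

    Because skewness and kurtosis are third- and fourth-moment statistics, they amplify noise in the tails of the parameter map, which is consistent with the lower reproducibility the study reports for them.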

  5. Multilevel image thresholding based on 2D histogram and maximum Tsallis entropy--a differential evolution approach.

    PubMed

    Sarkar, Soham; Das, Swagatam

    2013-12-01

    Multilevel thresholding amounts to segmenting a gray-level image into several distinct regions. This paper presents a 2D histogram based multilevel thresholding approach to improve the separation between objects. Recent studies indicate that the results obtained with 2D histogram oriented approaches are superior to those obtained with 1D histogram based techniques in the context of bi-level thresholding. Here, a method to incorporate 2D histogram related information for generalized multilevel thresholding is proposed using the maximum Tsallis entropy. Differential evolution (DE), a simple yet efficient evolutionary algorithm of current interest, is employed to improve the computational efficiency of the proposed method. The performance of DE is investigated extensively through comparison with other well-known nature-inspired global optimization techniques such as genetic algorithm, particle swarm optimization, artificial bee colony, and simulated annealing. In addition, the outcome of the proposed method is evaluated using a well-known benchmark, the Berkeley segmentation data set (BSDS300), with 300 distinct images. PMID:23955760
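    The Tsallis criterion can be illustrated in its simplest setting, bi-level thresholding of a 1D histogram (a reduction of the paper's 2D multilevel formulation; exhaustive search stands in here for the differential evolution optimizer):

```python
import numpy as np

def tsallis_threshold(hist, q=0.8):
    """Bi-level Tsallis-entropy threshold on a 1D gray-level histogram.

    Picks the threshold t maximizing the pseudo-additive combination
    S_q(A) + S_q(B) + (1-q) * S_q(A) * S_q(B) of the object and
    background Tsallis entropies.
    """
    p = np.asarray(hist, float)
    p = p / p.sum()
    best_t, best_val = 1, -np.inf
    for t in range(1, len(p)):
        pa, pb = p[:t].sum(), p[t:].sum()
        if pa == 0 or pb == 0:
            continue
        sa = (1.0 - np.sum((p[:t] / pa) ** q)) / (q - 1.0)
        sb = (1.0 - np.sum((p[t:] / pb) ** q)) / (q - 1.0)
        val = sa + sb + (1.0 - q) * sa * sb
        if val > best_val:
            best_t, best_val = t, val
    return best_t
```

    The multilevel 2D version searches over several thresholds in a joint gray-level/neighborhood histogram, which is why the paper resorts to DE instead of this exhaustive scan.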

  6. Dynamic Contrast-enhanced MR Imaging in Renal Cell Carcinoma: Reproducibility of Histogram Analysis on Pharmacokinetic Parameters.

    PubMed

    Wang, Hai-Yi; Su, Zi-Hua; Xu, Xiao; Sun, Zhi-Peng; Duan, Fei-Xue; Song, Yuan-Yuan; Li, Lu; Wang, Ying-Wei; Ma, Xin; Guo, Ai-Tao; Ma, Lin; Ye, Hui-Yi

    2016-01-01

    Pharmacokinetic parameters derived from dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) have been increasingly used to evaluate the permeability of tumor vessels. Histogram metrics are a recognized promising method of quantitative MR imaging that has been recently introduced in analysis of DCE-MRI pharmacokinetic parameters in oncology due to tumor heterogeneity. In this study, 21 patients with renal cell carcinoma (RCC) underwent paired DCE-MRI studies on a 3.0 T MR system. Extended Tofts model and population-based arterial input function were used to calculate kinetic parameters of RCC tumors. Mean value and histogram metrics (Mode, Skewness and Kurtosis) of each pharmacokinetic parameter were generated automatically using ImageJ software. Intra- and inter-observer reproducibility and scan-rescan reproducibility were evaluated using intra-class correlation coefficients (ICCs) and coefficient of variation (CoV). Our results demonstrated that the histogram method (Mode, Skewness and Kurtosis) was not superior to the conventional Mean value method in reproducibility evaluation on DCE-MRI pharmacokinetic parameters (K^trans and Ve) in renal cell carcinoma, especially for Skewness and Kurtosis, which showed lower intra-observer, inter-observer and scan-rescan reproducibility than Mean value. Our findings suggest that additional studies are necessary before wide incorporation of histogram metrics in quantitative analysis of DCE-MRI pharmacokinetic parameters. PMID:27380733

  7. Histogram analysis reveals a better delineation of tumor volume from background in 18F-FET PET compared to CBV maps in a hybrid PET-MR study in gliomas

    NASA Astrophysics Data System (ADS)

    Filss, Christian P.; Stoffels, Gabriele; Galldiks, Norbert; Sabel, Michael; Wittsack, Hans J.; Coenen, Heinz H.; Shah, Nadim J.; Herzog, Hans; Langen, Karl-Josef

    2014-01-01

    Anatomical imaging with magnetic resonance imaging (MRI) is currently the method of first choice for the diagnostic investigation of glial tumors. However, different MR sequences may over- or underestimate tumor size, and thus it may not be possible to delineate tumor from adjacent brain. To compensate for this limitation, additional MR sequences such as perfusion-weighted MRI (PWI) with regional cerebral blood volume (rCBV), or positron emission tomography (PET) with amino acids, are used to gain further information. Recent studies suggest that both of these imaging modalities provide similar diagnostic information. For comparison, tumor-to-brain ratios (TBR) with mean and maximum values are frequently used, but results from different studies often cannot be checked against each other. Furthermore, the maximum TBR in rCBV in particular is at risk of being falsified by artifacts (e.g. blood vessels). These limitations are reduced by the use of histograms, since all information in the VOIs is equally displayed. In this study we measured and compared the intersection of tumor and reference tissue histograms in 18F-FET PET and rCBV maps in glioma patients.
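    The intersection of two normalized histograms, the quantity compared between modalities here, reduces to a bin-wise minimum (a standard definition that we assume matches the study's usage):

```python
import numpy as np

def histogram_intersection(h_tumor, h_reference):
    """Overlap of two normalized histograms (1 = identical, 0 = disjoint).

    A smaller intersection between the tumor and reference-tissue
    histograms means better delineation of tumor from background.
    """
    h1 = np.asarray(h_tumor, float)
    h2 = np.asarray(h_reference, float)
    h1 = h1 / h1.sum()
    h2 = h2 / h2.sum()
    return np.minimum(h1, h2).sum()
```

    Unlike a maximum TBR, this overlap uses every voxel in both VOIs, so a single bright blood-vessel artifact shifts the result only marginally.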

  8. Automated geomorphometric classification of landforms in Transdanubian Region (Pannonian Basin) based on local slope histograms

    NASA Astrophysics Data System (ADS)

    Székely, Balázs; Koma, Zsófia; Csorba, Kristóf; Ferenc Morovics, József

    2014-05-01

    The Transdanubian Region is a typically hilly, geologically manifold area of the Pannonian Basin. It is composed primarily of Permo-Mesozoic carbonates and siliciclastic sediments; however, Pannonian sedimentary units and young volcanic forms are also characteristic, such as those in the Bakony-Balaton Highland Volcanic Field. The geological diversity is reflected in the geomorphological setting: besides the classic eroding volcanic edifices and carbonate plateaus, medium-relief, gently hilly, slowly eroding landforms are also frequent in the geomorphic mosaic of the area. Geomorphometric techniques are suitable for analysing and separating the various geomorphic units mosaicked and, in some cases, affected by (sub-)recent tectonic geomorphic processes. In our project we applied automated classification of local slope angle histograms derived from a 10-meter nominal resolution digital terrain model (DTM). Slope angle histograms within a rectangular moving window of various sizes were calculated in numerous experiments. The histograms then served as multichannel input for a k-means classification to achieve a geologically and geomorphologically sound categorization of the area. The experiments show good results in separating the very basic landforms; defined landscape boundaries can be reconstructed with high accuracy for larger window sizes (e.g. 5 km) and a low number of categories. If the window size is smaller and the number of classes is higher, the tectonic geomorphic features are more prominently recognized, though often at the price of clear separation boundaries: in these cases the horizontal change in the composition of the various clusters matches the boundaries of the geological units. Volcanic forms are typically also put into some definite classes, although the flat plateaus of some volcanic edifices fall into another category also recognized in the experiments. In summary we can conclude that the area is suitable for such analyses, many
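    The feature-extraction and clustering pipeline can be sketched as follows (a minimal illustration with non-overlapping windows and a toy k-means; the study used a moving window over a 10 m DTM and varied the window size):

```python
import numpy as np

def slope_histograms(slope_deg, win, n_bins=9, max_slope=45.0):
    """Per-window slope-angle histograms from a slope raster (degrees).

    Each non-overlapping win x win window yields one normalized
    histogram, used as a multichannel feature vector.
    """
    rows, cols = slope_deg.shape
    feats = []
    for i in range(0, rows - win + 1, win):
        for j in range(0, cols - win + 1, win):
            block = slope_deg[i:i + win, j:j + win].ravel()
            h, _ = np.histogram(block, bins=n_bins, range=(0.0, max_slope))
            feats.append(h / h.sum())
    return np.array(feats)

def kmeans(features, k, n_iter=50, rng=None):
    """Tiny k-means for clustering the histogram features."""
    rng = rng or np.random.default_rng(0)
    centers = features[rng.choice(len(features), k, replace=False)]
    for _ in range(n_iter):
        # Assign each feature to its nearest center, then recenter.
        labels = np.argmin(((features[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        for c in range(k):
            if np.any(labels == c):
                centers[c] = features[labels == c].mean(axis=0)
    return labels
```

    Mapping the cluster labels back to the window positions gives the landform map; window size then controls the trade-off between boundary sharpness and class detail described in the abstract.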

  9. Normal-reciprocal error models for quantitative ERT in permafrost environments: bin analysis versus histogram analysis

    NASA Astrophysics Data System (ADS)

    Verleysdonk, Sarah; Flores-Orozco, Adrian; Krautblatter, Michael; Kemna, Andreas

    2010-05-01

    Electrical resistivity tomography (ERT) has been used for the monitoring of permafrost-affected rock walls for some years now. To further enhance the interpretation of ERT measurements, a deeper insight into error sources and the influence of error model parameters on the imaging results is necessary. Here, we present the effect of different statistical schemes for the determination of error parameters from the discrepancies between normal and reciprocal measurements - bin analysis and histogram analysis - using a smoothness-constrained inversion code (CRTomo) with an incorporated appropriate error model. The study site is located in galleries adjacent to the Zugspitze North Face (2800 m a.s.l.) at the border between Austria and Germany. A 20 m × 40 m rock permafrost body and its surroundings have been monitored along permanently installed transects - with electrode spacings of 1.5 m and 4.6 m - from 2007 to 2009. For data acquisition, a conventional Wenner survey was conducted, as this array has proven to be the most robust in frozen rock walls. Normal and reciprocal data were collected directly one after another to ensure identical conditions. The ERT inversion results depend strongly on the chosen parameters of the employed error model, i.e., the absolute resistance error and the relative resistance error. These parameters were derived (1) for large normal/reciprocal data sets by means of bin analyses and (2) for small normal/reciprocal data sets by means of histogram analyses. Error parameters were calculated independently for each data set of a monthly monitoring sequence to avoid the creation of artefacts (over-fitting of the data) or unnecessary loss of contrast (under-fitting of the data) in the images. The inversion results are assessed with respect to (1) raw data quality as described by the error model parameters, (2) validation via available (rock) temperature data and (3) the interpretation of the images from a geophysical as well as a
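    The bin-analysis scheme can be sketched as follows (our generic reading: equal-count bins over resistance magnitude, with the spread of the normal/reciprocal misfit regressed as e = a + b·|R|; the actual CRTomo error model and binning details may differ):

```python
import numpy as np

def error_model_params(r_normal, r_reciprocal, n_bins=10):
    """Fit e = a + b*|R| to normal/reciprocal discrepancies by bin analysis.

    Measurements are grouped into equal-count bins by mean resistance;
    the standard deviation of the misfit in each bin is regressed
    against the bin's mean resistance, giving the absolute (a) and
    relative (b) resistance error parameters.
    """
    r_mean = 0.5 * (np.abs(r_normal) + np.abs(r_reciprocal))
    misfit = np.abs(r_normal) - np.abs(r_reciprocal)
    order = np.argsort(r_mean)
    bins = np.array_split(order, n_bins)          # equal-count bins
    xs = np.array([r_mean[b].mean() for b in bins])
    ys = np.array([misfit[b].std() for b in bins])
    b_rel, a_abs = np.polyfit(xs, ys, 1)          # slope, intercept
    return a_abs, b_rel
```

    With few normal/reciprocal pairs the per-bin standard deviations become unreliable, which is where the histogram analysis of the full misfit distribution takes over in the study.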

  10. Analysis of RapidArc optimization strategies using objective function values and dose-volume histograms.

    PubMed

    Oliver, Michael; Gagne, Isabelle; Popescu, Carmen; Ansbacher, Will; Beckham, Wayne A

    2010-01-01

    RapidArc is a novel treatment planning and delivery system that has recently been made available for clinical use. Included within the Eclipse treatment planning system are a number of different optimization strategies that can be employed to improve the quality of the final treatment plan. The purpose of this study is to systematically assess three categories of strategies for four phantoms, and then apply proven strategies to clinical head and neck cases. Four phantoms were created within Eclipse with varying shapes and locations for the planning target volumes and organs at risk. A baseline optimization consisting of a single 359.8-degree arc with the collimator at 45 degrees was applied to all phantoms. Three categories of strategies were assessed and compared to the baseline strategy: changing the initialization parameters, increasing the total number of control points, and increasing the total optimization time. Optimization log files were extracted from the treatment planning system along with final dose-volume histograms for plan assessment. Treatment plans were also generated for four head and neck patients to determine whether the results for phantom plans can be extended to clinical plans. The strategies that resulted in a significant difference from baseline were: changing the maximum leaf speed prior to optimization (p < 0.05), increasing the total number of segments by adding an arc (p < 0.05), and increasing the total optimization time by either continuing the optimization (p < 0.01) or adding time by pausing the optimization (p < 0.01). The reductions in objective function values correlated with improvements in the dose-volume histogram (DVH). The addition of arcs and the pausing strategy were applied to head and neck cancer cases, which demonstrated similar benefits with respect to the final objective function value and DVH. Analysis of the optimization log files is a useful way to intercompare treatment plans that

  11. 78 FR 53231 - Women's Equality Day, 2013

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-28

    ... hundred and thirty-eighth. (Presidential Sig.) [FR Doc. 2013-21188 Filed 8-27-13; 11:15 am] Billing code... march toward gender equality. We have fought for equal pay, prohibited gender discrimination in America... strategy to close any gender pay gap within the Federal workforce. To build on this work, I will...

  12. 49 CFR 236.792 - Reservoir, equalizing.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 4 2013-10-01 2013-10-01 false Reservoir, equalizing. 236.792 Section 236.792 Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD ADMINISTRATION... Reservoir, equalizing. An air reservoir connected with and adding volume to the top portion of...

  13. 49 CFR 236.792 - Reservoir, equalizing.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 4 2012-10-01 2012-10-01 false Reservoir, equalizing. 236.792 Section 236.792 Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD ADMINISTRATION... Reservoir, equalizing. An air reservoir connected with and adding volume to the top portion of...

  14. 49 CFR 236.792 - Reservoir, equalizing.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 4 2014-10-01 2014-10-01 false Reservoir, equalizing. 236.792 Section 236.792 Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD ADMINISTRATION... Reservoir, equalizing. An air reservoir connected with and adding volume to the top portion of...

  15. 49 CFR 236.792 - Reservoir, equalizing.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 4 2010-10-01 2010-10-01 false Reservoir, equalizing. 236.792 Section 236.792 Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD ADMINISTRATION... Reservoir, equalizing. An air reservoir connected with and adding volume to the top portion of...

  16. 49 CFR 236.792 - Reservoir, equalizing.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 4 2011-10-01 2011-10-01 false Reservoir, equalizing. 236.792 Section 236.792 Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD ADMINISTRATION... Reservoir, equalizing. An air reservoir connected with and adding volume to the top portion of...

  17. School Law: A Question of Equality.

    ERIC Educational Resources Information Center

    Dowling-Sendor, Benjamin

    2003-01-01

    This article discusses the Equal Access Act (EAA) as it pertains to high-school student clubs. It raises basic questions about the EAA: What does "equal" mean? What level of access is required? Does the First Amendment's free-speech clause offer broader protection to student clubs than the EAA? (WFA)

  18. Vocational Education and Equality of Opportunity.

    ERIC Educational Resources Information Center

    Horowitz, Benjamin; Feinberg, Walter

    1990-01-01

    Examines the concepts of equality of opportunity and equality of educational opportunity and their relationship to vocational education. Traces the history of vocational education. Delineates the distinction between training and education as enumerated in Aristotelian philosophy. Discusses the role vocational education can play in the educative…

  19. Equal Plate Charges on Series Capacitors?

    ERIC Educational Resources Information Center

    Illman, B. L.; Carlson, G. T.

    1994-01-01

    Provides a line of reasoning in support of the contention that the equal charge proposition is at best an approximation. Shows how the assumption of equal plate charge on capacitors in series contradicts the conservative nature of the electric field. (ZWH)

  20. Brown and the Politics of Equality.

    ERIC Educational Resources Information Center

    Brown, Frank

    1994-01-01

Assesses the progress of equality since Brown v Topeka Board of Education and argues that there still has not been a full implementation of that Supreme Court decree. School integration is shown to be declining. It is recommended that the Court merge the equality standards of Plessy v Ferguson with those of Brown to provide quality education. (GR)

  1. Computationally efficient multidimensional analysis of complex flow cytometry data using second order polynomial histograms.

    PubMed

    Zaunders, John; Jing, Junmei; Leipold, Michael; Maecker, Holden; Kelleher, Anthony D; Koch, Inge

    2016-01-01

    Many methods have been described for automated clustering analysis of complex flow cytometry data, but so far the goal to efficiently estimate multivariate densities and their modes for a moderate number of dimensions and potentially millions of data points has not been attained. We have devised a novel approach to describing modes using second order polynomial histogram estimators (SOPHE). The method divides the data into multivariate bins and determines the shape of the data in each bin based on second order polynomials, which is an efficient computation. These calculations yield local maxima and allow joining of adjacent bins to identify clusters. The use of second order polynomials also optimally uses wide bins, such that in most cases each parameter (dimension) need only be divided into 4-8 bins, again reducing computational load. We have validated this method using defined mixtures of up to 17 fluorescent beads in 16 dimensions, correctly identifying all populations in data files of 100,000 beads in <10 s, on a standard laptop. The method also correctly clustered granulocytes, lymphocytes, including standard T, B, and NK cell subsets, and monocytes in 9-color stained peripheral blood, within seconds. SOPHE successfully clustered up to 36 subsets of memory CD4 T cells using differentiation and trafficking markers, in 14-color flow analysis, and up to 65 subpopulations of PBMC in 33-dimensional CyTOF data, showing its usefulness in discovery research. SOPHE has the potential to greatly increase efficiency of analysing complex mixtures of cells in higher dimensions. PMID:26097104
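The bin-and-merge idea at the heart of SOPHE can be illustrated with a much cruder, counts-only sketch (a zeroth-order surrogate for the second-order polynomial fits; `histogram_modes`, `bins_per_dim`, and `min_count` are illustrative names, not the authors' code):

```python
import numpy as np

def histogram_modes(data, bins_per_dim=6, min_count=5):
    """Crude sketch of SOPHE-style mode finding: bin multivariate data on a
    coarse grid and report bins that are local count maxima.  Uses plain
    counts (a zeroth-order surrogate for the second-order polynomial fit)."""
    data = np.asarray(data, dtype=float)
    counts, edges = np.histogramdd(data, bins=bins_per_dim)
    modes = []
    for idx in np.ndindex(counts.shape):
        c = counts[idx]
        if c < min_count:
            continue
        # a mode bin has no axis-aligned neighbour with a strictly higher count
        is_max = True
        for d in range(len(idx)):
            for step in (-1, 1):
                j = list(idx)
                j[d] += step
                if 0 <= j[d] < counts.shape[d] and counts[tuple(j)] > c:
                    is_max = False
        if is_max:
            centre = [0.5 * (edges[d][idx[d]] + edges[d][idx[d] + 1])
                      for d in range(len(idx))]
            modes.append(centre)
    return modes
```

Note how coarse the grid can be: with 4-8 bins per dimension (as the abstract suggests), two well-separated clusters already surface as two mode bins.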

  2. A generic shape/texture descriptor over multiscale edge field: 2-D walking ant histogram.

    PubMed

    Kiranyaz, Serkan; Ferreira, Miguel; Gabbouj, Moncef

    2008-03-01

    A novel shape descriptor, which can be extracted from the major object edges automatically and used for the multimedia content-based retrieval in multimedia databases, is presented. By adopting a multiscale approach over the edge field where the scale represents the amount of simplification, the most relevant edge segments, referred to as subsegments, which eventually represent the major object boundaries, are extracted from a scale-map. Similar to the process of a walking ant with a limited line of sight over the boundary of a particular object, we traverse through each subsegment and describe a certain line of sight, whether it is a continuous branch or a corner, using individual 2-D histograms. Furthermore, the proposed method can also be tuned to be an efficient texture descriptor, which achieves a superior performance especially for directional textures. Finally, integrating the whole process as feature extraction module into MUVIS framework allows us to test the mutual performance of the proposed shape descriptor in the context of multimedia indexing and retrieval. PMID:18270126

  3. Application of Histogram Analysis in Radiation Therapy (HART) in Intensity Modulation Radiation Therapy (IMRT) Treatments

    NASA Astrophysics Data System (ADS)

    Pyakuryal, Anil

    2009-03-01

A carcinoma is a malignant cancer that emerges from epithelial cells in structures throughout the body. It invades the critical organs and can metastasize or spread to lymph nodes. IMRT is an advanced mode of radiation therapy treatment for cancer. It delivers more conformal doses to malignant tumors while sparing the critical organs by modulating the intensity of the radiation beam. An automated software package, HART (S. Jang et al., 2008, Med Phys 35, p.2812), was used for efficient analysis of dose volume histograms (DVH) for multiple targets and critical organs in four IMRT treatment plans for each patient. IMRT data for ten head and neck cancer patients were exported as AAPM/RTOG format files from a commercial treatment planning system at Northwestern Memorial Hospital (NMH). HART-extracted DVH statistics were used to evaluate plan indices and to analyze dose tolerance of critical structures at the prescription dose (PD) for each patient. Mean plan indices (n=10) were found to be in good agreement with published results for Linac-based plans. The least irradiated volume at tolerance dose (TD50) was observed for the brainstem and the highest volume for the larynx in SIB treatment techniques. Thus HART, an open source platform, has extensive clinical implications in IMRT treatments.

  4. Quality control of dose volume histogram computation characteristics of 3D treatment planning systems

    NASA Astrophysics Data System (ADS)

    Panitsa, E.; Rosenwald, J. C.; Kappas, C.

    1998-10-01

Detailed quality control (QC) protocols are a necessity for modern radiotherapy departments. The established QC protocols for treatment planning systems (TPS) do not include recommendations on the advanced features of three-dimensional (3D) treatment planning, such as dose volume histograms (DVH). In this study, a test protocol for DVH characteristics was developed. The protocol assesses the consistency of the DVH computation with the dose distribution calculated by the same TPS, by comparing DVH parameters with values obtained from the isodose distributions. The computation parameters (such as the dimension of the computation grid) that are applied to the TPS during the tests are not fixed but set by the user as if the test represented a typical clinical case. Six commercial TPS were examined with this protocol within the framework of the EC project Dynarad (Biomed I). The results of the intercomparison confirm the consistency of the DVH results with the isodose values for most of the examined TPS. However, special attention should be paid in adverse conditions such as high dose gradient regions, where larger errors arise, especially when an insufficient number of dose calculation points is used for the DVH computation.
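The DVH parameters such a protocol compares all derive from the basic cumulative-DVH definition: for each dose level, the fraction of the structure's volume receiving at least that dose. A minimal sketch, assuming equal voxel volumes and ignoring grid-resolution effects:

```python
import numpy as np

def cumulative_dvh(dose, dose_bins):
    """Cumulative dose-volume histogram: for each dose level d in `dose_bins`,
    the fraction of the structure's volume receiving at least d.  `dose` holds
    the dose of each voxel in the structure (equal voxel volumes assumed)."""
    dose = np.asarray(dose, dtype=float)
    return np.array([(dose >= d).mean() for d in dose_bins])
```

By construction the curve starts at 1.0 (the whole volume receives at least zero dose) and decreases monotonically; deviations between this curve and isodose-derived values are exactly what the test protocol probes.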

  5. Shot-Noise Limited Single-Molecule FRET Histograms: Comparison between Theory and Experiments†

    PubMed Central

    Nir, Eyal; Michalet, Xavier; Hamadani, Kambiz M.; Laurence, Ted A.; Neuhauser, Daniel; Kovchegov, Yevgeniy; Weiss, Shimon

    2011-01-01

    We describe a simple approach and present a straightforward numerical algorithm to compute the best fit shot-noise limited proximity ratio histogram (PRH) in single-molecule fluorescence resonant energy transfer diffusion experiments. The key ingredient is the use of the experimental burst size distribution, as obtained after burst search through the photon data streams. We show how the use of an alternated laser excitation scheme and a correspondingly optimized burst search algorithm eliminates several potential artifacts affecting the calculation of the best fit shot-noise limited PRH. This algorithm is tested extensively on simulations and simple experimental systems. We find that dsDNA data exhibit a wider PRH than expected from shot noise only and hypothetically account for it by assuming a small Gaussian distribution of distances with an average standard deviation of 1.6 Å. Finally, we briefly mention the results of a future publication and illustrate them with a simple two-state model system (DNA hairpin), for which the kinetic transition rates between the open and closed conformations are extracted. PMID:17078646
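The shot-noise limited PRH can be reproduced by a toy Monte Carlo: given the experimental burst size distribution, the acceptor count of a burst of n photons with true proximity ratio E is binomially distributed. A sketch under those simplifying assumptions (no background, no correction factors; not the paper's full algorithm):

```python
import numpy as np

rng = np.random.default_rng(0)

def shot_noise_prh(burst_sizes, true_E, n_bins=25):
    """Monte Carlo shot-noise-limited proximity-ratio histogram: each burst
    of n photons yields an acceptor count A ~ Binomial(n, E); PR = A / n."""
    burst_sizes = np.asarray(burst_sizes)
    acceptors = rng.binomial(burst_sizes, true_E)
    pr = acceptors / burst_sizes
    hist, edges = np.histogram(pr, bins=n_bins, range=(0.0, 1.0))
    return pr, hist, edges

# example: bursts with a broad size distribution, single FRET population
sizes = rng.integers(20, 200, size=5000)
pr, hist, edges = shot_noise_prh(sizes, true_E=0.5)
```

The histogram width produced this way is the shot-noise floor; any excess width in real data (as the abstract reports for dsDNA) must come from a genuine distance distribution.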

  6. 3D/2D image registration using weighted histogram of gradient directions

    NASA Astrophysics Data System (ADS)

    Ghafurian, Soheil; Hacihaliloglu, Ilker; Metaxas, Dimitris N.; Tan, Virak; Li, Kang

    2015-03-01

Three-dimensional (3D) to two-dimensional (2D) image registration is crucial in many medical applications such as image-guided evaluation of musculoskeletal disorders. One of the key problems is to estimate the 3D CT-reconstructed bone model positions (translation and rotation) which maximize the similarity between the digitally reconstructed radiographs (DRRs) and the 2D fluoroscopic images using a registration method. This problem is computationally intensive due to a large search space and the complicated DRR generation process. Also, finding a similarity measure which converges to the global optimum instead of local optima adds to the challenge. To circumvent these issues, most existing registration methods need a manual initialization, which requires user interaction and is prone to human error. In this paper, we introduce a novel feature-based registration method using the weighted histogram of gradient directions of images. This method simplifies the computation by searching the parameter space (rotation and translation) sequentially rather than simultaneously. In our numeric simulation experiments, the proposed registration algorithm was able to achieve sub-millimeter and sub-degree accuracies. Moreover, our method is robust to the initial guess. It can tolerate up to ±90° rotation offset from the global optimal solution, which minimizes the need for human interaction to initialize the algorithm.
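The core feature, a histogram of gradient directions with each pixel weighted by its gradient magnitude, can be sketched in a simplified global form (the function name and bin count here are illustrative; the paper's descriptor details may differ):

```python
import numpy as np

def weighted_gradient_direction_histogram(image, n_bins=36):
    """Histogram of gradient directions, each pixel weighted by its gradient
    magnitude (a simplified, global version of the paper's feature)."""
    gy, gx = np.gradient(image.astype(float))   # row- and column-derivatives
    mag = np.hypot(gx, gy)
    ang = np.arctan2(gy, gx)                    # direction in (-pi, pi]
    hist, _ = np.histogram(ang, bins=n_bins, range=(-np.pi, np.pi),
                           weights=mag)
    s = hist.sum()
    return hist / s if s > 0 else hist
```

Because the weighting suppresses flat regions, rotating the image cyclically shifts this histogram, which is what makes it usable for searching rotation separately from translation.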

  7. Computing Spatial Distance Histograms for Large Scientific Datasets On-the-Fly

    PubMed Central

    Kumar, Anand; Grupcev, Vladimir; Yuan, Yongke; Huang, Jin; Shen, Gang

    2014-01-01

This paper focuses on an important query in scientific simulation data analysis: the Spatial Distance Histogram (SDH). The computation time of an SDH query using the brute force method is quadratic. Often, such queries are executed continuously over certain time periods, increasing the computation time. We propose a highly efficient approximate algorithm to compute SDH over consecutive time periods with provable error bounds. The key idea of our algorithm is to derive the statistical distribution of distances from the spatial and temporal characteristics of particles. Upon organizing the data into a Quad-tree based structure, the spatiotemporal characteristics of particles in each node of the tree are acquired to determine the particles’ spatial distribution as well as their temporal locality in consecutive time periods. We report our efforts in implementing and optimizing the above algorithm in Graphics Processing Units (GPUs) as a means to further improve the efficiency. The accuracy and efficiency of the proposed algorithm are backed by mathematical analysis and results of extensive experiments using data generated from real simulation studies. PMID:25264418
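The quadratic brute-force baseline the paper improves on is simple to state: histogram all N(N-1)/2 pairwise distances into fixed-width buckets. A sketch:

```python
import numpy as np

def sdh_bruteforce(points, bucket_width, n_buckets):
    """Exact Spatial Distance Histogram: count every pairwise distance into
    buckets of fixed width.  O(N^2) pairs -- the baseline the paper's
    approximate, tree-based algorithm avoids."""
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    # all pairwise distances; keep the upper triangle so each pair counts once
    diff = pts[:, None, :] - pts[None, :, :]
    dist = np.sqrt((diff ** 2).sum(-1))
    iu = np.triu_indices(n, k=1)
    hist, _ = np.histogram(dist[iu], bins=n_buckets,
                           range=(0.0, bucket_width * n_buckets))
    return hist
```

For millions of particles this O(N^2) approach is infeasible, which is why the paper derives bucket counts from per-node spatial distributions of a Quad-tree instead.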

  8. Nonlinear histogram binning for quantitative analysis of lung tissue fibrosis in high-resolution CT data

    NASA Astrophysics Data System (ADS)

    Zavaletta, Vanessa A.; Bartholmai, Brian J.; Robb, Richard A.

    2007-03-01

Diffuse lung diseases, such as idiopathic pulmonary fibrosis (IPF), can be characterized and quantified by analysis of volumetric high resolution CT scans of the lungs. These data sets typically have dimensions of 512 x 512 x 400. It is too subjective and labor intensive for a radiologist to analyze each slice and quantify regional abnormalities manually. Thus, computer aided techniques are necessary, particularly texture analysis techniques which classify various lung tissue types. Second and higher order statistics, which capture the spatial variation of the intensity values, are good discriminatory features for various textures. The intensity values in lung CT scans lie in the range [-1024, 1024]. Calculation of second order statistics on this range is too computationally intensive, so the data are typically binned into 16 or 32 gray levels. There are more effective ways of binning the gray-level range to improve classification. An optimal and very efficient way to nonlinearly bin the histogram is to use a dynamic programming algorithm. The objective of this paper is to show that nonlinear binning using dynamic programming is computationally efficient and improves the discriminatory power of the second and higher order statistics for more accurate quantification of diffuse lung disease.
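One concrete way to bin a gray-level histogram nonlinearly by dynamic programming is to choose contiguous bin boundaries that minimize the total count-weighted within-bin variance (in the spirit of Jenks natural breaks; the paper's exact objective may differ). A sketch:

```python
import numpy as np

def dp_histogram_binning(counts, values, n_bins):
    """Optimal contiguous binning of a gray-level histogram by dynamic
    programming: choose n_bins contiguous ranges minimising the total
    count-weighted within-bin variance."""
    c = np.asarray(counts, dtype=float)
    v = np.asarray(values, dtype=float)
    L = len(c)
    # prefix sums give O(1) weighted-variance queries
    W = np.concatenate([[0.0], np.cumsum(c)])
    S = np.concatenate([[0.0], np.cumsum(c * v)])
    Q = np.concatenate([[0.0], np.cumsum(c * v * v)])

    def cost(i, j):                        # within-bin variance for levels i..j
        w = W[j + 1] - W[i]
        if w == 0.0:
            return 0.0
        s = S[j + 1] - S[i]
        return (Q[j + 1] - Q[i]) - s * s / w

    INF = float("inf")
    dp = np.full((n_bins + 1, L), INF)
    back = np.zeros((n_bins + 1, L), dtype=int)
    for j in range(L):
        dp[1][j] = cost(0, j)
    for m in range(2, n_bins + 1):
        for j in range(m - 1, L):
            for i in range(m - 1, j + 1):  # bin m covers levels i..j
                cand = dp[m - 1][i - 1] + cost(i, j)
                if cand < dp[m][j]:
                    dp[m][j], back[m][j] = cand, i
    # recover the start index of each bin
    bounds, j = [], L - 1
    for m in range(n_bins, 1, -1):
        i = back[m][j]
        bounds.append(i)
        j = i - 1
    bounds.append(0)
    return dp[n_bins][L - 1], sorted(bounds)
```

The DP runs in O(k L^2) for k bins over L gray levels, cheap compared with recomputing co-occurrence statistics, and the resulting boundaries adapt to where the histogram mass actually sits.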

  9. Facial expression recognition and histograms of oriented gradients: a comprehensive study.

    PubMed

    Carcagnì, Pierluigi; Del Coco, Marco; Leo, Marco; Distante, Cosimo

    2015-01-01

Automatic facial expression recognition (FER) is a topic of growing interest, mainly due to the rapid spread of assistive technology applications, such as human-robot interaction, where robust emotional awareness is a key point to best accomplish the assistive task. This paper proposes a comprehensive study on the application of the histogram of oriented gradients (HOG) descriptor to the FER problem, highlighting how this powerful technique can be effectively exploited for this purpose. In particular, this paper shows that a proper setting of the HOG parameters can make this descriptor one of the most suitable for characterizing facial expression peculiarities. A large experimental session, which can be divided into three different phases, was carried out exploiting a consolidated algorithmic pipeline. The first experimental phase was aimed at proving the suitability of the HOG descriptor to characterize facial expression traits and, to do this, a successful comparison with the most commonly used FER frameworks was carried out. In the second experimental phase, different publicly available facial datasets were used to test the system on images acquired in different conditions (e.g. image resolution, lighting conditions, etc.). As a final phase, a test on continuous data streams was carried out on-line in order to validate the system in real-world operating conditions that simulated a real-time human-machine interaction. PMID:26543779

  10. Visualization of boundaries in CT volumetric data sets using dynamic M-|∇f| histogram.

    PubMed

    Li, Lu; Peng, Hu; Chen, Xun; Cheng, Juan; Gao, Dayong

    2016-01-01

    Direct volume rendering is widely used for three-dimensional medical data visualization such as computed tomography and magnetic resonance imaging. Distinct visualization of boundaries is able to provide valuable and insightful information in many medical applications. However, it is conventionally challenging to detect boundaries reliably due to limitations of the transfer function design. Meanwhile, the interactive strategy is complicated for new users or even experts. In this paper, we build a generalized boundary model contaminated by noise and prove boundary middle value (M) has a good statistical property. Based on the model we propose a user-friendly strategy for the boundary extraction and transfer function design, using M, boundary height (Δh), and gradient magnitude (|∇f|). In fact, it is a dynamic iterative process. First, potential boundaries are sorted orderly from high to low according to the value of their height. Then, users iteratively extract the boundary with the highest value of Δh in a newly defined domain, where different boundaries are transformed to disjoint vertical bars using M-|∇f| histogram. In this case, the chance of misclassification among different boundaries decreases. PMID:26649763

  11. Lung Cancer Prediction Using Neural Network Ensemble with Histogram of Oriented Gradient Genomic Features

    PubMed Central

    Adetiba, Emmanuel; Olugbara, Oludayo O.

    2015-01-01

    This paper reports an experimental comparison of artificial neural network (ANN) and support vector machine (SVM) ensembles and their “nonensemble” variants for lung cancer prediction. These machine learning classifiers were trained to predict lung cancer using samples of patient nucleotides with mutations in the epidermal growth factor receptor, Kirsten rat sarcoma viral oncogene, and tumor suppressor p53 genomes collected as biomarkers from the IGDB.NSCLC corpus. The Voss DNA encoding was used to map the nucleotide sequences of mutated and normal genomes to obtain the equivalent numerical genomic sequences for training the selected classifiers. The histogram of oriented gradient (HOG) and local binary pattern (LBP) state-of-the-art feature extraction schemes were applied to extract representative genomic features from the encoded sequences of nucleotides. The ANN ensemble and HOG best fit the training dataset of this study with an accuracy of 95.90% and mean square error of 0.0159. The result of the ANN ensemble and HOG genomic features is promising for automated screening and early detection of lung cancer. This will hopefully assist pathologists in administering targeted molecular therapy and offering counsel to early stage lung cancer patients and persons in at risk populations. PMID:25802891

  12. Nonequilibrium equalities in absolutely irreversible processes

    NASA Astrophysics Data System (ADS)

    Murashita, Yuto; Funo, Ken; Ueda, Masahito

    2015-03-01

Nonequilibrium equalities have attracted considerable attention in the context of statistical mechanics and information thermodynamics. Integral nonequilibrium equalities reveal an ensemble property of the entropy production σ as ⟨e^(−σ)⟩ = 1. Although nonequilibrium equalities apply to rather general nonequilibrium situations, they break down in absolutely irreversible processes, where the forward-path probability vanishes and the entropy production diverges. We identify the mathematical origin of this inapplicability as the singularity of the probability measure. As a result, we generalize the conventional integral nonequilibrium equalities to absolutely irreversible processes as ⟨e^(−σ)⟩ = 1 − λ_S, where λ_S is the probability of the singular part defined based on Lebesgue's decomposition theorem. The acquired equality contains two physical quantities related to irreversibility: σ, characterizing ordinary irreversibility, and λ_S, describing absolute irreversibility. An inequality derived from the obtained equality demonstrates that absolute irreversibility leads to a fundamental lower bound on the entropy production. We demonstrate the validity of the obtained equality for a simple model.
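The integral equality is easy to verify numerically in the ordinary (non-singular) case. For Gaussian entropy production with variance equal to twice the mean, ⟨e^(−σ)⟩ = 1 holds exactly since ⟨e^(−σ)⟩ = e^(−m + s²/2) = e^(−m + m) = 1. A Monte Carlo check (an illustrative model, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(42)

# Gaussian entropy production with variance = 2 * mean satisfies the
# integral fluctuation theorem <exp(-sigma)> = 1 exactly.
m = 1.0
sigma = rng.normal(loc=m, scale=np.sqrt(2.0 * m), size=1_000_000)
ift = np.exp(-sigma).mean()
print(ift)   # close to 1
```

In an absolutely irreversible process the singular weight λ_S would be subtracted from the right-hand side, so a sample average like this would instead converge to 1 − λ_S.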

  13. Equality Hypocrisy, Inconsistency, and Prejudice: The Unequal Application of the Universal Human Right to Equality

    PubMed Central

    2015-01-01

    In Western culture, there appears to be widespread endorsement of Article 1 of the Universal Declaration of Human Rights (which stresses equality and freedom). But do people really apply their equality values equally, or are their principles and application systematically discrepant, resulting in equality hypocrisy? The present study, conducted with a representative national sample of adults in the United Kingdom (N = 2,895), provides the first societal test of whether people apply their value of “equality for all” similarly across multiple types of status minority (women, disabled people, people aged over 70, Blacks, Muslims, and gay people). Drawing on theories of intergroup relations and stereotyping we examined, relation to each of these groups, respondents’ judgments of how important it is to satisfy their particular wishes, whether there should be greater or reduced equality of employment opportunities, and feelings of social distance. The data revealed a clear gap between general equality values and responses to these specific measures. Respondents prioritized equality more for “paternalized” groups (targets of benevolent prejudice: women, disabled, over 70) than others (Black people, Muslims, and homosexual people), demonstrating significant inconsistency. Respondents who valued equality more, or who expressed higher internal or external motivation to control prejudice, showed greater consistency in applying equality. However, even respondents who valued equality highly showed significant divergence in their responses to paternalized versus nonpaternalized groups, revealing a degree of hypocrisy. Implications for strategies to promote equality and challenge prejudice are discussed. PMID:25914516

  14. New spatial diversity equalizer based on PLL

    NASA Astrophysics Data System (ADS)

    Rao, Wei

    2011-10-01

A new Spatial Diversity Equalizer (SDE) based on a phase-locked loop (PLL) is proposed to overcome inter-symbol interference (ISI) and phase rotations simultaneously in digital communication systems. The proposed SDE combines an equal-gain combining technique, driven by the well-known constant modulus algorithm (CMA) for blind equalization, with a PLL. Compared with a conventional SDE, the proposed SDE has not only a faster convergence rate and a lower residual error but also the ability to recover carrier phase rotation. The efficiency of the method is demonstrated by computer simulation.
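The CMA core of such an equalizer is compact: it penalizes deviations of the output modulus from a constant, needing no training sequence. A minimal real-valued sketch (BPSK, a toy two-tap channel, illustrative step size; no diversity combining or PLL):

```python
import numpy as np

rng = np.random.default_rng(7)

def cma_equalizer(received, n_taps=11, mu=5e-3, R2=1.0):
    """Blind equalization with the Constant Modulus Algorithm (CMA):
    e = y * (|y|^2 - R2),  w <- w - mu * e * x,  for a real-valued signal."""
    w = np.zeros(n_taps)
    w[n_taps // 2] = 1.0                  # centre-spike initialisation
    out = np.zeros(len(received))
    for n in range(n_taps, len(received)):
        x = received[n - n_taps:n][::-1]  # regressor, most recent sample first
        y = w @ x
        out[n] = y
        e = y * (y * y - R2)              # gradient of the constant-modulus cost
        w -= mu * e * x
    return out, w

# BPSK through a mildly dispersive channel
symbols = rng.choice([-1.0, 1.0], size=6000)
received = np.convolve(symbols, [1.0, 0.4], mode="full")[:len(symbols)]
out, w = cma_equalizer(received)
```

Because the cost depends only on |y|, CMA cannot fix an overall phase (or sign) rotation, which is exactly the gap the PLL in the proposed SDE is meant to close.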

  15. Contributorships Are Not 'Weighable' to be Equal.

    PubMed

    Moustafa, Khaled

    2016-05-01

A new trend of designating some authors as 'first co-authors' is noticeable in scientific publications: a statement highlighting that two or more authors 'contributed equally' to a reported work. However, the requirements of scientific rigor, honesty, and accuracy in academic standards make such statements invalid; thus, they should be avoided. A potential solution is to specify the role of each co-author, from study conception to communication of results, and let readers judge the importance of each contribution by themselves. Alternatively, authors should demonstrate how they contributed 'equally' when they are defined as 'equal contributors'. PMID:27025412

  16. The Bakke Opinions and Equal Protection Doctrine.

    ERIC Educational Resources Information Center

    Karst, Kenneth L.; Horowitz, Harold W.

    1979-01-01

    Constitutional issues addressed in the Supreme Court's decision are reviewed. The opinions rendered by Justice Powell are viewed as reflections of the weakness of recent equal protection theory, and as signs of future doctrine. (GC)

  17. 7 CFR 622.6 - Equal opportunity.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... of Agriculture (7 CFR Part 15), which provide that no person in the United States shall, on the... AGRICULTURE WATER RESOURCES WATERSHED PROJECTS General § 622.6 Equal opportunity. The Pub. L. 83-566 and...

  18. 7 CFR 622.6 - Equal opportunity.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... of Agriculture (7 CFR Part 15), which provide that no person in the United States shall, on the... AGRICULTURE WATER RESOURCES WATERSHED PROJECTS General § 622.6 Equal opportunity. The Pub. L. 83-566 and...

  19. Turbo Equalization Using Partial Gaussian Approximation

    NASA Astrophysics Data System (ADS)

    Zhang, Chuanzong; Wang, Zhongyong; Manchon, Carles Navarro; Sun, Peng; Guo, Qinghua; Fleury, Bernard Henri

    2016-09-01

    This paper deals with turbo-equalization for coded data transmission over intersymbol interference (ISI) channels. We propose a message-passing algorithm that uses the expectation-propagation rule to convert messages passed from the demodulator-decoder to the equalizer and computes messages returned by the equalizer by using a partial Gaussian approximation (PGA). Results from Monte Carlo simulations show that this approach leads to a significant performance improvement compared to state-of-the-art turbo-equalizers and allows for trading performance with complexity. We exploit the specific structure of the ISI channel model to significantly reduce the complexity of the PGA compared to that considered in the initial paper proposing the method.

  20. 7 CFR 622.6 - Equal opportunity.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... of Agriculture (7 CFR Part 15), which provide that no person in the United States shall, on the... AGRICULTURE WATER RESOURCES WATERSHED PROJECTS General § 622.6 Equal opportunity. The Pub. L. 83-566 and...

  1. NI-20. ADC HISTOGRAM ANALYSIS FOLLOWING RADIOTHERAPY PREDICTS RESPONSE TO ADJUVANT TEMOZOLOMIDE IN NEWLY DIAGNOSED GBM

    PubMed Central

    Ellingson, Benjamin; Chang, Warren; Harris, Robert; Mody, Reema; Lai, Albert; Nghiemphu, Phioanh; Cloughesy, Timothy; Pope, Whitney

    2014-01-01

INTRODUCTION: The current standard of care for newly diagnosed GBM consists of concurrent radiotherapy and temozolomide (TMZ) plus adjuvant TMZ. We hypothesize that there is a subset of patients who will have a significant benefit from this adjuvant therapy. Therefore, the purpose of the current study was to identify a diffusion imaging phenotype for patients with newly diagnosed GBM who will benefit from adjuvant TMZ following concurrent radiotherapy and TMZ. METHODS: A total of 120 patients with 1) histologically confirmed glioblastoma, 2) treatment with concurrent radiotherapy and TMZ followed by adjuvant TMZ, and 3) high-quality diffusion MR data were included in the current study. Diffusion and standard structural MRI were performed approximately 10 weeks after the start of radiotherapy and concurrent TMZ. ADC histogram analysis was performed by fitting a double Gaussian mixed model to ADC data extracted from the contrast-enhancing tumor. ADCL was defined as the mean ADC of the lower Gaussian distribution. We hypothesize that patients with a high ADCL have a lower tumor burden and thus a favorable response to adjuvant TMZ in terms of TTP and OS. RESULTS: Results demonstrate that patients with an ADCL lower than 1 μm²/ms have a significantly shorter PFS than patients with a higher ADCL (log-rank, P < 0.0001); the high-ADCL group showed almost twice the median PFS (297 days vs. 156 days). Additionally, patients with a high ADCL had a significantly longer OS (log-rank, P = 0.0049). Patients with a high ADCL had a median OS of 648 days, while patients with a low ADCL had a median OS of only 407 days from the start of adjuvant TMZ. CONCLUSION: Newly diagnosed GBM patients with elevated tumor diffusivity after completion of radiotherapy and concurrent TMZ have a favorable prognosis.
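The double-Gaussian decomposition behind ADCL can be sketched with a minimal 1-D EM fit (illustrative code, not the study's analysis pipeline; `fit_two_gaussians` and its initialisation are assumptions):

```python
import numpy as np

def fit_two_gaussians(x, n_iter=200):
    """Minimal 1-D EM fit of a double Gaussian mixture; returns the mean of
    the lower component (the ADC_L statistic described in the abstract)."""
    x = np.asarray(x, dtype=float)
    mu = np.percentile(x, [25, 75])       # crude initialisation of the means
    sd = np.array([x.std(), x.std()])
    pi = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: responsibility of each component for each sample
        pdf = (pi / (sd * np.sqrt(2 * np.pi)) *
               np.exp(-0.5 * ((x[:, None] - mu) / sd) ** 2))
        r = pdf / pdf.sum(axis=1, keepdims=True)
        # M-step: update weights, means, and standard deviations
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        sd = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    return mu.min()                       # ADC_L: mean of the lower Gaussian
```

Applied to the ADC values of the contrast-enhancing ROI, the lower component's mean is the single number the survival analysis thresholds at 1 μm²/ms.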

  2. Dose-Volume Histogram Analysis of the Safety of Proton Beam Therapy for Unresectable Hepatocellular Carcinoma

    SciTech Connect

    Kawashima, Mitsuhiko; Kohno, Ryosuke; Nakachi, Kohei; Nishio, Teiji; Mitsunaga, Shuichi; Ikeda, Masafumi; Konishi, Masaru; Takahashi, Shinichiro; Gotohda, Naoto; Arahira, Satoko; Zenda, Sadamoto; Ogino, Takashi; Kinoshita, Taira

    2011-04-01

Purpose: To evaluate the safety and efficacy of radiotherapy using proton beams (PRT) for unresectable hepatocellular carcinoma. Methods and Materials: Sixty consecutive patients who underwent PRT between May 1999 and July 2007 were analyzed. There were 42 males and 18 females, with a median age of 70 years (48-92 years). All but 1 patient had a single lesion with a median diameter of 45 mm (20-100 mm). The total PRT dose/fractionation was 76 cobalt Gray equivalent (CGE)/20 fractions in 46 patients, 65 CGE/26 fractions in 11 patients, and 60 CGE/10 fractions in 3 patients. The risk of developing proton-induced hepatic insufficiency (PHI) was estimated using dose-volume histograms and the indocyanine-green retention rate at 15 minutes (ICG R15). Results: None of the 20 patients with an ICG R15 of less than 20% developed PHI, whereas 6 of 8 patients with ICG R15 values of 50% or higher developed PHI. Among the 32 patients whose ICG R15 ranged from 20% to 49.9%, PHI was observed only in patients who had received 30 CGE (V30) to more than 25% of the noncancerous parts of the liver (n = 5). Local progression-free and overall survival rates at 3 years were 90% (95% confidence interval [CI], 80-99%) and 56% (95% CI, 43-69%), respectively. Gastrointestinal toxicity of Grade ≥2 was observed in 3 patients. Conclusions: ICG R15 and V30 are recommended as useful predictors of the risk of developing PHI, which should be incorporated into multidisciplinary treatment plans for patients with this disease.
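The V30 criterion is a plain dose-volume computation. A sketch, with the risk rule distilled from the abstract's findings (illustrative only, not a validated clinical model):

```python
import numpy as np

def v30_percent(liver_dose_cge):
    """V30: percentage of the noncancerous liver volume receiving >= 30 CGE
    (equal voxel volumes assumed)."""
    d = np.asarray(liver_dose_cge, dtype=float)
    return 100.0 * (d >= 30.0).mean()

def phi_risk(icg_r15_percent, v30):
    """Rule of thumb distilled from the abstract: ICG R15 < 20% -> low risk;
    ICG R15 >= 50% -> high risk; in between, V30 > 25% flags high risk."""
    if icg_r15_percent < 20.0:
        return "low"
    if icg_r15_percent >= 50.0:
        return "high"
    return "high" if v30 > 25.0 else "low"
```

Both inputs come directly from routine planning data (the DVH) and a standard liver-function test, which is what makes the predictor practical to incorporate into treatment planning.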

  3. Assessment of Autonomic Function by Phase Rectification of RR-Interval Histogram Analysis in Chagas Disease

    PubMed Central

    Nasari-Junior, Olivassé; Benchimol-Barbosa, Paulo Roberto; Pedrosa, Roberto Coury; Nadal, Jurandir

    2015-01-01

    Background In chronic Chagas disease (ChD), impairment of cardiac autonomic function bears prognostic implications. Phase‑rectification of RR-interval series isolates the sympathetic, acceleration phase (AC) and parasympathetic, deceleration phase (DC) influences on cardiac autonomic modulation. Objective This study investigated heart rate variability (HRV) as a function of RR-interval to assess autonomic function in healthy and ChD subjects. Methods Control (n = 20) and ChD (n = 20) groups were studied. All underwent 60-min head-up tilt table test under ECG recording. Histogram of RR-interval series was calculated, with 100 ms class, ranging from 600–1100 ms. In each class, mean RR-intervals (MNN) and root-mean-squared difference (RMSNN) of consecutive normal RR-intervals that suited a particular class were calculated. Average of all RMSNN values in each class was analyzed as function of MNN, in the whole series (RMSNNT), and in AC (RMSNNAC) and DC (RMSNNDC) phases. Slopes of linear regression lines were compared between groups using Student t-test. Correlation coefficients were tested before comparisons. RMSNN was log-transformed. (α < 0.05). Results Correlation coefficient was significant in all regressions (p < 0.05). In the control group, RMSNNT, RMSNNAC, and RMSNNDC significantly increased linearly with MNN (p < 0.05). In ChD, only RMSNNAC showed significant increase as a function of MNN, whereas RMSNNT and RMSNNDC did not. Conclusion HRV increases in proportion with the RR-interval in healthy subjects. This behavior is lost in ChD, particularly in the DC phase, indicating cardiac vagal incompetence. PMID:26131700
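The per-class RMSNN-versus-MNN analysis can be sketched as follows (simplified: class membership is assigned by the first interval of each successive pair, and no ectopy filtering or AC/DC phase separation is done):

```python
import numpy as np

def rmsnn_by_class(rr_ms, lo=600, hi=1100, width=100):
    """Per-class HRV: bin RR-intervals into fixed-width classes and, in each
    class, compute the mean interval (MNN) and the RMS of successive
    differences whose first interval falls in that class (simplified RMSNN)."""
    rr = np.asarray(rr_ms, dtype=float)
    d = np.diff(rr)          # successive differences of normal RR-intervals
    first = rr[:-1]          # class membership decided by the first interval
    out = []
    for start in range(lo, hi, width):
        mask = (first >= start) & (first < start + width)
        if mask.sum() < 2:
            continue
        mnn = first[mask].mean()
        rmsnn = np.sqrt((d[mask] ** 2).mean())
        out.append((mnn, rmsnn))
    return out
```

Regressing log(RMSNN) on MNN over these class-wise pairs gives the slope the study compares between groups; the full method additionally splits the differences into acceleration (d < 0) and deceleration (d > 0) phases.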

  4. ADC histograms predict response to anti-angiogenic therapy in patients with recurrent high-grade glioma

    PubMed Central

    2011-01-01

Introduction The purpose of this study is to evaluate apparent diffusion coefficient (ADC) maps to distinguish anti-vascular and anti-tumor effects in the course of anti-angiogenic treatment of recurrent high-grade gliomas (rHGG), as compared to standard magnetic resonance imaging (MRI). Methods This retrospective study analyzed ADC maps from diffusion-weighted MRI in 14 rHGG patients during bevacizumab/irinotecan (B/I) therapy. Applying image segmentation, the volumes of contrast-enhanced lesions in T1 sequences and of hyperintense T2 lesions (hT2) were calculated. hT2 lesions were defined as regions of interest (ROI) and registered to the corresponding ADC maps (hT2-ADC). Histograms were calculated from the hT2-ADC ROIs. Thereafter, histogram asymmetry, termed “skewness”, was calculated and compared to progression-free survival (PFS) as defined by the Response Assessment in Neuro-Oncology (RANO) Working Group criteria. Results At the 8–12 weeks follow-up, seven (50%) patients showed a partial response, three (21.4%) patients were stable, and four (28.6%) patients progressed according to RANO criteria. hT2-ADC histograms demonstrated statistically significant changes in skewness in relation to PFS at 6 months. Patients with increasing skewness (n=11) following B/I therapy had significantly shorter PFS than did patients with decreasing or stable skewness values (n=3; median percentage change in skewness 54% versus −3%, p=0.04). Conclusion In rHGG patients, the change in ADC histogram skewness may be predictive of treatment response early in the course of anti-angiogenic therapy and more sensitive than treatment assessment based solely on RANO criteria. PMID:21125399
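The skewness tracked in this study is the standard third standardized moment of the ROI's ADC values. A sketch:

```python
import numpy as np

def histogram_skewness(values):
    """Sample skewness of an ROI's ADC values: E[(x - mu)^3] / sigma^3.
    Positive skew means a longer right tail of the ADC histogram."""
    x = np.asarray(values, dtype=float)
    mu, sd = x.mean(), x.std()
    return ((x - mu) ** 3).mean() / sd ** 3
```

Computing this per follow-up scan and comparing the sign of its change is all the biomarker requires, which is part of its appeal over full response-criteria assessment.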

  5. 29 CFR 1620.14 - Testing equality of jobs.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 29 Labor 4 2012-07-01 2012-07-01 false Testing equality of jobs. 1620.14 Section 1620.14 Labor... Testing equality of jobs. (a) In general. What constitutes equal skill, equal effort, or equal..., or responsibility required for the performance of jobs will not render the equal pay...

  6. 29 CFR 1620.14 - Testing equality of jobs.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 4 2010-07-01 2010-07-01 false Testing equality of jobs. 1620.14 Section 1620.14 Labor... Testing equality of jobs. (a) In general. What constitutes equal skill, equal effort, or equal..., or responsibility required for the performance of jobs will not render the equal pay...

  7. 29 CFR 1620.14 - Testing equality of jobs.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 4 2011-07-01 2011-07-01 false Testing equality of jobs. 1620.14 Section 1620.14 Labor... Testing equality of jobs. (a) In general. What constitutes equal skill, equal effort, or equal..., or responsibility required for the performance of jobs will not render the equal pay...

  8. 29 CFR 1620.14 - Testing equality of jobs.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 29 Labor 4 2014-07-01 2014-07-01 false Testing equality of jobs. 1620.14 Section 1620.14 Labor... Testing equality of jobs. (a) In general. What constitutes equal skill, equal effort, or equal..., or responsibility required for the performance of jobs will not render the equal pay...

  9. 29 CFR 1620.14 - Testing equality of jobs.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 29 Labor 4 2013-07-01 2013-07-01 false Testing equality of jobs. 1620.14 Section 1620.14 Labor... Testing equality of jobs. (a) In general. What constitutes equal skill, equal effort, or equal..., or responsibility required for the performance of jobs will not render the equal pay...

  10. Visual vs Fully Automatic Histogram-Based Assessment of Idiopathic Pulmonary Fibrosis (IPF) Progression Using Sequential Multidetector Computed Tomography (MDCT)

    PubMed Central

    Colombi, Davide; Dinkel, Julien; Weinheimer, Oliver; Obermayer, Berenike; Buzan, Teodora; Nabers, Diana; Bauer, Claudia; Oltmanns, Ute; Palmowski, Karin; Herth, Felix; Kauczor, Hans Ulrich; Sverzellati, Nicola

    2015-01-01

    Objectives To describe changes over time in extent of idiopathic pulmonary fibrosis (IPF) at multidetector computed tomography (MDCT) assessed by semi-quantitative visual scores (VSs) and fully automatic histogram-based quantitative evaluation and to test the relationship between these two methods of quantification. Methods Forty IPF patients (median age: 70 y, interquartile: 62-75 years; M:F, 33:7) who underwent 2 MDCT at different time points with a median interval of 13 months (interquartile: 10-17 months) were retrospectively evaluated. The in-house software YACTA automatically quantified the lung density histogram (10th-90th percentile, in 5-percentile steps). Longitudinal changes in VSs and in the percentiles of the attenuation histogram were obtained in 20 untreated patients and 20 patients treated with pirfenidone. Pearson correlation analysis was used to test the relationship between VSs and selected percentiles. Results In follow-up MDCT, visual overall extent of parenchymal abnormalities (OE) increased in median by 5 %/year (interquartile: 0 %/y; +11 %/y). A substantial difference was found between treated and untreated patients in HU changes of the 40th and of the 80th percentiles of the density histogram. Correlation analysis between VSs and selected percentiles showed higher correlation between the changes (Δ) in OE and Δ 40th percentile (r=0.69; p<0.001) as compared to Δ 80th percentile (r=0.58; p<0.001); closer correlation was found between Δ ground-glass extent and Δ 40th percentile (r=0.66, p<0.001) as compared to Δ 80th percentile (r=0.47, p=0.002), while the Δ reticulations correlated better with the Δ 80th percentile (r=0.56, p<0.001) in comparison to Δ 40th percentile (r=0.43, p=0.003). Conclusions There is a relevant and fully automatically measurable difference at MDCT in VSs and in histogram analysis at one year follow-up of IPF patients, whether treated or untreated: Δ 40th percentile might reflect the change in overall extent of lung
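    The histogram readout described above (10th to 90th percentile of lung attenuation in 5-percentile steps, with the change in the 40th and 80th percentiles tracked between scans) reduces to a percentile table per examination. A minimal sketch with hypothetical names, not the YACTA implementation:

```python
import numpy as np

def density_percentiles(hu_values, lo=10, hi=90, step=5):
    """Percentiles of the lung attenuation histogram (values in HU),
    10th-90th in 5-percentile steps by default."""
    return {p: float(np.percentile(hu_values, p))
            for p in range(lo, hi + 1, step)}
```

    Longitudinal change is then, e.g., `follow_up[40] - baseline[40]` for the Δ 40th percentile.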

  11. Adaptive Management

    EPA Science Inventory

    Adaptive management is an approach to natural resource management that emphasizes learning through management where knowledge is incomplete, and when, despite inherent uncertainty, managers and policymakers must act. Unlike a traditional trial and error approach, adaptive managem...

  12. Nanothermodynamics of large iron clusters by means of a flat histogram Monte Carlo method

    SciTech Connect

    Basire, M.; Soudan, J.-M.; Angelié, C.

    2014-09-14

    The thermodynamics of iron clusters of various sizes, from 76 to 2452 atoms, typical of the catalyst particles used for carbon nanotube growth, has been explored by a flat histogram Monte Carlo (MC) algorithm (called the σ-mapping), developed by Soudan et al. [J. Chem. Phys. 135, 144109 (2011), Paper I]. This method provides the classical density of states, g_p(E_p), in the configurational space, in terms of the potential energy of the system, with good and well controlled convergence properties, particularly in the melting phase transition zone which is of interest in this work. To describe the system, an iron potential has been implemented, called “corrected EAM” (cEAM), which approximates the MEAM potential of Lee et al. [Phys. Rev. B 64, 184102 (2001)] with an accuracy better than 3 meV/at, and a five times larger computational speed. The main simplification concerns the angular dependence of the potential, with a small impact on accuracy, while the screening coefficients S_ij are exactly computed with a fast algorithm. With this potential, ergodic explorations of the clusters can be performed efficiently in a reasonable computing time, at least in the upper half of the solid zone and above. Problems of ergodicity exist in the lower half of the solid zone but routes to overcome them are discussed. The solid-liquid (melting) phase transition temperature T_m is plotted in terms of the cluster atom number N_at. The standard N_at^(−1/3) linear dependence (Pawlow law) is observed for N_at > 300, allowing an extrapolation up to the bulk metal at 1940 ± 50 K. For N_at < 150, a strong divergence is observed compared to the Pawlow law. The melting transition, which begins at the surface, is characterized by a Lindemann-Berry index and an atomic density analysis. Several new features are obtained for the thermodynamics of cEAM clusters, compared to the Rydberg pair potential clusters studied in Paper I.
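    The σ-mapping algorithm itself is specific to Paper I, but the flat-histogram idea it belongs to can be illustrated with the standard Wang-Landau scheme on a toy system: an 8-spin periodic Ising chain, whose exact density of states is known (g(-8)=2, g(-4)=56, g(0)=140, g(4)=56, g(8)=2). Everything below is an illustrative stand-in, not the authors' method or potential:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 8                                  # periodic Ising chain, E = -sum s_i s_{i+1}
levels = {-8: 0, -4: 1, 0: 2, 4: 3, 8: 4}

lng = np.zeros(5)                      # running estimate of ln g(E)
hist = np.zeros(5)                     # visit histogram, driven toward flatness
f = 1.0                                # ln-g modification factor

s = rng.choice([-1, 1], size=N)
e = -int(np.sum(s * np.roll(s, 1)))
while f > 1e-4:
    for _ in range(10000):
        i = int(rng.integers(N))
        # energy change of flipping spin i (periodic neighbours)
        de = 2 * s[i] * (s[i - 1] + s[(i + 1) % N])
        e_new = e + de
        # accept with min(1, g(E)/g(E_new)): rarely visited energies are favoured
        if np.log(rng.random()) < lng[levels[e]] - lng[levels[e_new]]:
            s[i] = -s[i]
            e = e_new
        lng[levels[e]] += f
        hist[levels[e]] += 1
    if hist.min() > 0.8 * hist.mean():  # histogram flat enough: refine f
        f /= 2.0
        hist[:] = 0.0
```

    At convergence, `lng[2] - lng[0]` should approach ln(140/2) = ln 70 ≈ 4.25, i.e. the relative density of states is recovered without ever fixing a temperature.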

  13. Military Curricula for Vocational & Technical Education. Equal Opportunity and Treatment Classroom Course 17-9.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. National Center for Research in Vocational Education.

    These instructor lesson plans and teaching guides and student study guides for a secondary-postsecondary-level course for equal opportunity and treatment personnel are one of a number of military-developed curriculum packages selected for adaptation to vocational instruction and curriculum development in a civilian setting. Purpose stated for the…

  14. Criteria for equality in two entropic inequalities

    SciTech Connect

    Shirokov, M. E.

    2014-07-31

    We obtain a simple criterion for local equality between the constrained Holevo capacity and the quantum mutual information of a quantum channel. This shows that the set of all states for which this equality holds is determined by the kernel of the channel (as a linear map). Applications to Bosonic Gaussian channels are considered. It is shown that for a Gaussian channel having no completely depolarizing components the above characteristics may coincide only at non-Gaussian mixed states and a criterion for the existence of such states is given. All the obtained results may be reformulated as conditions for equality between the constrained Holevo capacity of a quantum channel and the input von Neumann entropy. Bibliography: 20 titles. (paper)

  15. Overpaying morbidity adjusters in risk equalization models.

    PubMed

    van Kleef, R C; van Vliet, R C J A; van de Ven, W P M M

    2016-09-01

    Most competitive social health insurance markets include risk equalization to compensate insurers for predictable variation in healthcare expenses. Empirical literature shows that even the most sophisticated risk equalization models-with advanced morbidity adjusters-substantially undercompensate insurers for selected groups of high-risk individuals. In the presence of premium regulation, these undercompensations confront consumers and insurers with incentives for risk selection. An important reason for the undercompensations is that not all information with predictive value regarding healthcare expenses is appropriate for use as a morbidity adjuster. To reduce incentives for selection regarding specific groups we propose overpaying morbidity adjusters that are already included in the risk equalization model. This paper illustrates the idea of overpaying by merging data on morbidity adjusters and healthcare expenses with health survey information, and derives three preconditions for meaningful application. Given these preconditions, we think overpaying may be particularly useful for pharmacy-based cost groups. PMID:26420555

  16. All Are Equal, but Some Are More Equal than Others: Managerialism and Gender Equality in Higher Education in Comparative Perspective

    ERIC Educational Resources Information Center

    Teelken, Christine; Deem, Rosemary

    2013-01-01

    The main purpose of this paper is to investigate what impact new regimes of management and governance, including new managerialism, have had on perceptions of gender equality at universities in three Western European countries. While in accordance with national laws and EU directives, contemporary current management approaches in universities…

  17. Image quality-based adaptive illumination normalisation for face recognition

    NASA Astrophysics Data System (ADS)

    Sellahewa, Harin; Jassim, Sabah A.

    2009-05-01

    Automatic face recognition is a challenging task due to intra-class variations. Changes in lighting conditions during enrolment and identification stages contribute significantly to these intra-class variations. A common approach to address the effects of such varying conditions is to pre-process the biometric samples in order to normalise intra-class variations. Histogram equalisation is a widely used illumination normalisation technique in face recognition. However, a recent study has shown that applying histogram equalisation on well-lit face images could lead to a decrease in recognition accuracy. This paper presents a dynamic approach to illumination normalisation, based on face image quality. The quality of a given face image is measured in terms of its luminance distortion by comparing this image against a known reference face image. Histogram equalisation is applied to a probe image if its luminance distortion is higher than a predefined threshold. We tested the proposed adaptive illumination normalisation method on the widely used Extended Yale Face Database B. Identification results demonstrate that our adaptive normalisation produces better identification accuracy compared to the conventional approach where every image is normalised, irrespective of the lighting conditions under which they were acquired.
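    The decision rule described, equalise only when the probe's luminance statistics diverge from a well-lit reference, can be sketched as follows. The luminance term here is the mean-comparison factor from the universal image-quality index; the threshold value and function names are illustrative assumptions, not the paper's exact quality measure:

```python
import numpy as np

def hist_equalize(img):
    """Global histogram equalisation for an 8-bit grayscale image."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = np.cumsum(hist).astype(float)
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())
    return (cdf * 255).astype(np.uint8)[img]   # map via the CDF lookup table

def adaptive_normalise(img, ref, threshold=0.95):
    """Equalise only when luminance similarity to the reference is low,
    i.e. the probe appears badly lit; well-lit probes pass unchanged."""
    mu_x, mu_y = img.mean(), ref.mean()
    luminance = 2 * mu_x * mu_y / (mu_x ** 2 + mu_y ** 2)
    return hist_equalize(img) if luminance < threshold else img
```

    A well-lit probe (mean close to the reference's) is returned untouched, avoiding the accuracy loss the paper reports for equalising well-lit images.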

  18. 34 CFR 108.6 - Equal access.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... limited to, school-related means of communication, such as bulletin board notices and literature... ACCESS TO PUBLIC SCHOOL FACILITIES FOR THE BOY SCOUTS OF AMERICA AND OTHER DESIGNATED YOUTH GROUPS § 108... equal access to school premises or facilities to conduct meetings. (2) Benefits and services. Any...

  19. The Equal Access Act: Recent Court Decisions.

    ERIC Educational Resources Information Center

    Bjorklun, Eugene C.

    1989-01-01

    Examines court decisions which led to the passage of the Equal Access Act of 1984. Although the act was designed to clarify the issue over the legality of permitting religious clubs to meet on school property, it may have created more confusion. Concludes that the Supreme Court may have to decide the issue. (SLM)

  20. The Path to Equal Rights in Michigan

    ERIC Educational Resources Information Center

    Gratz, Jennifer

    2007-01-01

    The litigant in a historic reverse-discrimination case against the University of Michigan, and subsequently the leader of a Michigan ballot initiative that carried the day against long odds, recounts how her simple call for equal treatment under the law persuaded the people of her state that color-conscious preferences are wrong.

  1. The Internet and Equality of Educational Opportunity.

    ERIC Educational Resources Information Center

    Schofield, Janet Ward; Davidson, Ann Locke

    One benefit often expected to flow from Internet use in schools is an increase in equality of educational opportunity as all kinds of schools gain access to the same extraordinary set of resources. Yet, prior research suggests that patterns of technology access often mirror existing inequalities rather than mitigate them. This paper discusses the…

  2. An American Perspective on Equal Educational Opportunities

    ERIC Educational Resources Information Center

    Russo, Charles; Perkins, Brian

    2004-01-01

    The United States Supreme Court ushered in a new era in American history on May 17, 1954 in its monumental ruling in "Brown v Board of Education," Topeka, Kansas. "Brown" is not only the Court's most significant decision on race and equal educational opportunities, but also ranks among the most important cases it has ever decided. In "Brown" a…

  3. Equal Opportunity and Racial Differences in IQ.

    ERIC Educational Resources Information Center

    Fagan, Joseph F.; Holland, Cynthia R.

    2002-01-01

    Administered an intelligence test to blacks and whites in 2 studies involving 254 community college students and 2 more studies involving 115 community college students. Results show that differences in knowledge between blacks and whites for items on an intelligence test, the meanings of words, can be eliminated when equal opportunities for…

  4. Power Equalization through Organization Development Training.

    ERIC Educational Resources Information Center

    Bartunek, Jean M.; Keys, Christopher B.

    The effects of a three-year Organization Development (OD) intervention on power equalization were examined in seven experimental and seven control schools. The principals and teachers from experimental schools participated in OD workshops, in a project-coordinating council for planning and policy, and in school goal-setting activities. The power…

  5. 75 FR 53559 - Women's Equality Day, 2010

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-31

    ... hundred and thirty-fifth. (Presidential Sig.) [FR Doc. 2010-21904 Filed 8-30-10; 11:15 am] Billing code... education and economic opportunity, face gender-based violence, and cannot participate fully and equally in... to all Americans, regardless of gender, race, ethnicity, sexual orientation, disability,...

  6. Race Equality Scheme 2005-2008

    ERIC Educational Resources Information Center

    Her Majesty's Inspectorate of Education, 2005

    2005-01-01

    Her Majesty's Inspectorate of Education (HMIE) is strongly committed to promoting race equality in the way that HMIE staff go about performing their role within Scottish education. Scottish society reflects cultural, ethnic, religious and linguistic diversity and Scottish education should be accessible to all. No-one should be disadvantaged or…

  7. When Equal Masses Don't Balance

    ERIC Educational Resources Information Center

    Newburgh, Ronald; Peidle, Joseph; Rueckner, Wolfgang

    2004-01-01

    We treat a modified Atwood's machine in which equal masses do not balance because of being in an accelerated frame of reference. Analysis of the problem illuminates the meaning of inertial forces, d'Alembert's principle, the use of free-body diagrams and the selection of appropriate systems for the diagrams. In spite of the range of these…

  8. Great Constitutional Ideas: Justice, Equality, and Property.

    ERIC Educational Resources Information Center

    Starr, Isidore

    1987-01-01

    Examines the ideas of justice, equality, and property as they are represented in the Declaration of Independence, the U.S. Constitution and the Bill of Rights. Discusses how these ideas affect the way public schools operate and the lessons educators teach or don't teach about our society. Includes ideas for classroom activities. (JDH)

  9. 77 FR 52583 - Women's Equality Day, 2012

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-29

    ... Independence of the United States of America the two hundred and thirty-seventh. (Presidential Sig.) [FR Doc... August 29, 2012 Part IV The President Proclamation 8848--Women's Equality Day, 2012... / Presidential Documents... Title 3--The President Proclamation 8848 of August 24, 2012...

  10. 7 CFR 622.6 - Equal opportunity.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... of Agriculture (7 CFR Part 15), which provide that no person in the United States shall, on the... 7 Agriculture 6 2010-01-01 2010-01-01 false Equal opportunity. 622.6 Section 622.6 Agriculture Regulations of the Department of Agriculture (Continued) NATURAL RESOURCES CONSERVATION SERVICE, DEPARTMENT...

  11. Three Utilities for the Equal Sign

    ERIC Educational Resources Information Center

    Jones, Ian; Pratt, Dave

    2005-01-01

    We compare the activity of young children using a microworld and a JavaScript relational calculator with the literature on children using traditional calculators. We describe how the children constructed different meanings for the equal sign in each setting. It appears that the nature of the meaning constructed is highly dependent on specificities…

  12. Disability in the UK: Measuring Equality

    ERIC Educational Resources Information Center

    Purdam, Kingsley; Afkhami, Reza; Olsen, Wendy; Thornton, Patricia

    2008-01-01

    In this article we identify the key survey data for examining the issue of equality in the lives of disabled people in the UK. Such data is essential for assessing change in quality of life over time and for the evaluation of the impact of policy initiatives. For each data source we consider definitions, data collection, issue coverage, sample…

  13. Position Paper: NOx Measurement

    ERIC Educational Resources Information Center

    Hauser, Thomas R.; Shy, Carl M.

    1972-01-01

    Doubts about the accuracy of measured concentrations of nitrogen dioxide (NO2) in ambient air have led the Environmental Protection Agency to reassess both the analytical technique and the extent to which nitrogen oxides (NOx) control will need to satisfy federal laws. (BL)

  14. Gender Equality in Academia: A Critical Reflection

    ERIC Educational Resources Information Center

    Winchester, Hilary P. M.; Browning, Lynette

    2015-01-01

    Gender equality in academia has been monitored in Australia for the past three decades so it is timely to reflect on what progress has been made, what works, and what challenges remain. When data were first published on the gender composition of staff in Australian universities in the mid-1980s women comprised 20 per cent of academic staff and…

  15. Equalizing Multi-School Curriculum by Technology.

    ERIC Educational Resources Information Center

    Etowah County Board of Education, Gadsden, AL.

    A three year project aimed at providing equal educational opportunity for all students in the seven high schools of Etowah County, Alabama by implementing a county-wide curriculum using a flexible, rotating schedule, audio-graphic network, instructional television, a learning center, and individualized instruction. The report rates the project as…

  16. Equity, Equal Opportunities, Gender and Organization Performance.

    ERIC Educational Resources Information Center

    Standing, Hilary; Baume, Elaine

    The issues of equity, equal opportunities, gender, and organization performance in the health care sector worldwide was examined. Information was gathered from the available literature and from individuals in 17 countries. The analysis highlighted the facts that employment equity debates and policies refer largely to high-income countries and…

  17. Gender Equality Policies and Higher Education Careers

    ERIC Educational Resources Information Center

    Berggren, Caroline

    2011-01-01

    Gender equality policies regulate the Swedish labour market, including higher education. This study analyses and discusses the career development of postgraduate students in the light of labour market influences. The principle of gender separation is used to understand these effects. Swedish register data encompassing information on 585…

  18. Gender Equality in Education: Definitions and Measurements

    ERIC Educational Resources Information Center

    Subrahmanian, R.

    2005-01-01

    International consensus on education priorities accords an important place to achieving gender justice in the educational sphere. Both the Dakar 'Education for All' goals and the Millennium Development goals emphasise two goals, in this regard. These two goals are distinguished as gender parity goals [achieving equal participation of girls and…

  19. Sourcebook of Equal Educational Opportunity. Second Edition.

    ERIC Educational Resources Information Center

    1977

    This reference book offers current information about equal opportunity in education through the elimination of racial, cultural, sexist, and linguistic barriers facing minority groups. The volume consists of seven parts, plus subject and geographical indexes. The first section includes a general demographic overview of the U.S., with statistics on…

  20. Women and Employment. Policies for Equal Opportunities.

    ERIC Educational Resources Information Center

    Organisation for Economic Cooperation and Development, Paris (France).

    This document contains the proceedings of a high-level conference on the Employment of Women, attended by labor ministers and other high officials of countries belonging to the Organisation for Economic Cooperation and Development (OECD). Delegates to the conference adopted a 14-point declaration pledging themselves to achieve equality of…

  1. What Is Equality of Opportunity in Education?

    ERIC Educational Resources Information Center

    Lazenby, Hugh

    2016-01-01

    There is widespread disagreement about what equality of opportunity in education requires. For some it is that each child is legally permitted to go to school. For others it is that each child receives the same educational resources. Further interpretations abound. This fact presents a problem: when politicians or academics claim they are in…

  2. The Circle and Sphere as Great Equalizers.

    ERIC Educational Resources Information Center

    Schwartzman, Steven

    1991-01-01

    From the equality of the ratios of the surface areas and volumes of a sphere and its circumscribed cylinder, the exploration of theorems relating the ratios of surface areas and volumes of a sphere and other circumscribed solids in three dimensions, and analogous questions relating two-dimensional concepts of perimeter and area is recounted. (MDH)

  3. Equal Justice Under Law. Instructor's Guide.

    ERIC Educational Resources Information Center

    Starr, Isidore; And Others

    This document is a teachers' guide to accompany the book "Equal Justice Under Law: The Supreme Court in American Life." Because the book contains a tremendous amount of detail, the guide does not attempt to explicate everything in the text. Instead the guide attempts to provide more detail on one or more of the issues covered in different sections…

  4. Racial Equality. To Protect These Rights Series.

    ERIC Educational Resources Information Center

    McDonald, Laughlin

    A historical review of racial discrimination against Negroes is the scope of this volume, part of a series of six volumes which explore the basic American rights. These include due process of law, freedom of speech and religious freedom. This volume traces the development of racial equality in the legal system, explores the controversies and…

  5. Social Responsibility in Librarianship: Essays on Equality.

    ERIC Educational Resources Information Center

    MacCann, Donnarae, Ed.

    In a culturally complex world, librarians can best work toward the equalization of library services if they understand their institutions in the light of cultural history. The six essays in this book highlight problems that affect unempowered populations, and address a variety of cultural problems and biases--problems that contribute to the…

  6. Equality in the Workplace. An Equal Opportunities Handbook for Trainers. Human Resource Management in Action Series.

    ERIC Educational Resources Information Center

    Collins, Helen

    This workbook, which is intended as a practical guide for human resource managers, trainers, and others concerned with developing and implementing equal opportunities training programs in British workplaces, examines issues in and methods for equal opportunities training. The introduction gives an overview of current training trends and issues.…

  7. Neural network with dynamically adaptable neurons

    NASA Technical Reports Server (NTRS)

    Tawel, Raoul (Inventor)

    1994-01-01

    This invention is an adaptive neuron for use in neural network processors. The adaptive neuron participates in the supervised learning phase of operation on a co-equal basis with the synapse matrix elements by adaptively changing its gain in a similar manner to the change of weights in the synapse IO elements. In this manner, training time is decreased by as much as three orders of magnitude.

  8. Angiogenic response of locally advanced breast cancer to neoadjuvant chemotherapy evaluated with parametric histogram from dynamic contrast-enhanced MRI

    NASA Astrophysics Data System (ADS)

    Chang, Yeun-Chung; Huang, Chiun-Sheng; Liu, Yi-Jui; Chen, Jyh-Horng; Lu, Yen-Shen; Tseng, Wen-Yih I.

    2004-08-01

    The aim of this study was to evaluate angiogenic compositions and tumour response in the course of neoadjuvant chemotherapy in patients with locally advanced breast cancer (LABC) using dynamic contrast-enhanced (DCE) MRI. Thirteen patients with LABC underwent serial DCE MRI during the course of chemotherapy. DCE MRI was quantified using a two-compartment model on a pixel-by-pixel basis. Analysis of parametric histograms of amplitude, exchange rate kout and peak enhancement over the whole tumour was performed. The distribution patterns of histograms were correlated with the tumour response. Initial kurtosis and standard deviation of amplitude before chemotherapy correlated with tumour response, r = 0.63 and r = 0.61, respectively. Comparing the initial values with the values after the first course of chemotherapy, tumour response was associated with a decrease in standard deviation of amplitude (r = 0.79), and an increase in kurtosis and a decrease in standard deviation of kout (r = 0.57 and 0.57, respectively). Comparing the initial values with the values after completing the chemotherapy, tumours with better response were associated with an increase in kurtosis (r = 0.62), a decrease in mean (r = 0.84) and standard deviation (r = 0.77) of amplitude, and a decrease in mean of peak enhancement (r = 0.71). Our results suggested that tumours with better response tended to alter their internal compositions from heterogeneous to homogeneous distributions and a decrease in peak enhancement after chemotherapy. Serial analyses of parametric histograms of DCE MRI-derived angiogenic parameters are potentially useful to monitor the response of angiogenic compositions of a tumour throughout the course of chemotherapy, and might predict tumour response early in the course.
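    The histogram descriptors tracked in this study (mean, standard deviation, and kurtosis of a pixelwise parameter map such as amplitude or kout) are plain moment statistics; a minimal sketch with an illustrative function name:

```python
import numpy as np

def histogram_summary(param_map):
    """Mean, standard deviation, and kurtosis (fourth standardized
    moment, no excess correction) of a pixelwise parameter map.
    Rising kurtosis + falling SD = a more homogeneous tumour."""
    v = np.asarray(param_map, dtype=float).ravel()
    m, s = v.mean(), v.std()
    kurtosis = float(np.mean(((v - m) / s) ** 4))
    return m, s, kurtosis
```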

  9. Adaptive SPECT

    PubMed Central

    Barrett, Harrison H.; Furenlid, Lars R.; Freed, Melanie; Hesterman, Jacob Y.; Kupinski, Matthew A.; Clarkson, Eric; Whitaker, Meredith K.

    2008-01-01

    Adaptive imaging systems alter their data-acquisition configuration or protocol in response to the image information received. An adaptive pinhole single-photon emission computed tomography (SPECT) system might acquire an initial scout image to obtain preliminary information about the radiotracer distribution and then adjust the configuration or sizes of the pinholes, the magnifications, or the projection angles in order to improve performance. This paper briefly describes two small-animal SPECT systems that allow this flexibility and then presents a framework for evaluating adaptive systems in general, and adaptive SPECT systems in particular. The evaluation is in terms of the performance of linear observers on detection or estimation tasks. Expressions are derived for the ideal linear (Hotelling) observer and the ideal linear (Wiener) estimator with adaptive imaging. Detailed expressions for the performance figures of merit are given, and possible adaptation rules are discussed. PMID:18541485
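    The Hotelling-observer figure of merit used to rank candidate configurations reduces to a quadratic form in the class-mean difference and the data covariance; a minimal sketch (names are illustrative, and the covariance is assumed known and invertible):

```python
import numpy as np

def hotelling_snr(delta_s, K):
    """Hotelling-observer detectability for a signal-present vs
    signal-absent task: SNR^2 = delta_s^T K^{-1} delta_s, where
    delta_s is the mean data difference and K the data covariance."""
    w = np.linalg.solve(K, delta_s)   # Hotelling template K^{-1} delta_s
    return float(delta_s @ w) ** 0.5
```

    Adapting the system then amounts to evaluating this SNR under each candidate pinhole/magnification configuration and choosing the largest.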

  10. Density equalizing map projections: A new algorithm

    SciTech Connect

    Merrill, D.W.; Selvin, S.; Mohr, M.S.

    1992-02-01

    In the study of geographic disease clusters, an alternative to traditional methods based on rates is to analyze case locations on a transformed map in which population density is everywhere equal. Although the analyst's task is thereby simplified, the specification of the density equalizing map projection (DEMP) itself is not simple and continues to be the subject of considerable research. Here a new DEMP algorithm is described, which avoids some of the difficulties of earlier approaches. The new algorithm (a) avoids illegal overlapping of transformed polygons; (b) finds the unique solution that minimizes map distortion; (c) provides constant magnification over each map polygon; (d) defines a continuous transformation over the entire map domain; (e) defines an inverse transformation; (f) can accept optional constraints such as fixed boundaries; and (g) can use commercially supported minimization software. Work is continuing to improve computing efficiency and improve the algorithm.

  12. Microscopic justification of the equal filling approximation

    SciTech Connect

    Perez-Martin, Sara; Robledo, L. M.

    2008-07-15

    The equal filling approximation, a procedure widely used in mean-field calculations to treat the dynamics of odd nuclei in a time-reversal invariant way, is justified as the consequence of a variational principle over an average energy functional. The ideas of statistical quantum mechanics are employed in the justification. As an illustration of the method, the ground and lowest-lying states of some octupole deformed radium isotopes are computed.

  13. Spatio-Temporal Equalizer for a Receiving-Antenna Feed Array

    NASA Technical Reports Server (NTRS)

    Mukai, Ryan; Lee, Dennis; Vilnrotter, Victor

    2010-01-01

    A spatio-temporal equalizer has been conceived as an improved means of suppressing multipath effects in the reception of aeronautical telemetry signals, and may be adaptable to radar and aeronautical communication applications as well. This equalizer would be an integral part of a system that would also include a seven-element planar array of receiving feed horns centered at the focal point of a paraboloidal antenna that would be nominally aimed at or near the aircraft that is the source of the signal one seeks to receive (see Figure 1). The spatio-temporal equalizer would consist mostly of a bank of seven adaptive finite-impulse-response (FIR) filters, one for each element in the array, whose outputs would be summed (see Figure 2). The combination of the spatial diversity of the feed-horn array and the temporal diversity of the filter bank would afford better multipath-suppression performance than is achievable by means of temporal equalization alone. The seven-element feed array would supplant the single feed horn used in a conventional paraboloidal ground telemetry-receiving antenna. The radio-frequency telemetry signals received by the seven elements of the array would be digitized, converted to complex baseband form, and sent to the FIR filter bank, which would adapt itself in real time to enable reception of telemetry at a low bit error rate, even in the presence of multipath of the type found at many flight test ranges.
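
    The adaptive FIR bank described above can be sketched with a scalar least-mean-squares (LMS) update. This is only an illustrative skeleton under assumed parameters; the actual system operates on complex baseband samples and would use a more robust adaptation rule.

```python
def lms_filter_bank(inputs, desired, n_taps=4, mu=0.01):
    """Bank of adaptive FIR filters, one per array element; the summed
    output is driven toward the desired signal by an LMS update."""
    n_ch = len(inputs)
    n = len(desired)
    w = [[0.0] * n_taps for _ in range(n_ch)]   # filter taps per channel
    out = []
    for t in range(n):
        # summed output of all channel filters at time t
        y = sum(w[c][k] * inputs[c][t - k]
                for c in range(n_ch) for k in range(n_taps) if t - k >= 0)
        e = desired[t] - y                       # instantaneous error
        for c in range(n_ch):
            for k in range(n_taps):
                if t - k >= 0:
                    w[c][k] += mu * e * inputs[c][t - k]
        out.append(y)
    return out, w
```

    In the real receiver the "desired" reference would come from a training sequence or decision feedback; here it is supplied explicitly for illustration.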

  14. The equal effectiveness of different defensive strategies

    PubMed Central

    Zhang, Shuang; Zhang, Yuxin; Ma, Keming

    2015-01-01

    Plants have evolved a variety of defensive strategies to resist herbivory, but at the interspecific level, the relative effectiveness of these strategies has been poorly evaluated. In this study, we compared the level of herbivory between species that depend on ants as indirect defenders and species that rely primarily on their own direct defenses. Using a dataset of 871 species and 1,405 data points, we found that in general, ant-associated species had levels of herbivory equal to those of species that are unattractive to ants; the pattern was unaffected by plant life form, climate and phylogenetic relationships between species. Interestingly, species that offer both food and nesting spaces for ants suffered significantly lower herbivory compared to species that offer either food or nesting spaces only or no reward for ants. A negative relationship between herbivory and latitude was detected, but the pattern can be changed by ants. These findings suggest that, at the interspecific level, the effectiveness of different defensive strategies may be equal. Considering the effects of herbivory on plant performance and fitness, the equal effectiveness of different defensive strategies may play an important role in the coexistence of various species at the community scale. PMID:26267426

  15. Equalization method for Medipix3RX

    NASA Astrophysics Data System (ADS)

    Rinkel, Jean; Magalhães, Debora; Wagner, Franz; Frojdh, Erik; Ballabriga Sune, Rafael

    2015-11-01

    This paper describes a new method of threshold equalization for X-ray detectors based on the Medipix3RX ASIC, using electrical pulses to calibrate and correct for the threshold dispersion between pixels. The method involves a coarse threshold tuning, based on two 8-bit global DACs, which sets the range of variation of the threshold values, and a fine tuning, based on two 5-bit adjustment DACs per pixel. Whereas our fine-tuning approach is based on a state-of-the-art methodology, our coarse tuning relies on an original theoretical model. This model takes into account the noise level of the ASIC, which varies with temperature and received radiation dose. The experimental results using a 300 μm Si sensor and the Kα fluorescence of Zn show a global energy resolution improvement of 14% compared to previous equalization methods. We compared these results with the best achievable global energy resolution given by the resolution of individual pixels and concluded that the remaining 14% difference was due to the discretization error limited by the number of equalization bits.
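
    The per-pixel fine tuning can be illustrated with a toy calculation: given each pixel's measured noise centre (in threshold-DAC units), pick the signed trim-DAC code that best shifts it toward the common mean. This is a hypothetical sketch of the idea, not the published procedure.

```python
def equalize_thresholds(noise_centers, dac_step, n_bits=5):
    """Fine-tuning sketch: for each pixel, choose a signed n_bits trim-DAC
    code that shifts its noise centre toward the mean over all pixels,
    clamped to the representable code range."""
    target = sum(noise_centers) / len(noise_centers)
    half = 2 ** (n_bits - 1)
    codes = []
    for c in noise_centers:
        code = round((target - c) / dac_step)      # ideal integer shift
        codes.append(max(-half, min(half - 1, code)))
    return codes
```

    The residual dispersion after such a correction is set by the trim-DAC step, which is exactly the discretization error the abstract identifies as the remaining limit.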

  16. Modeling the dark current histogram induced by gold contamination in complementary-metal-oxide-semiconductor image sensors

    NASA Astrophysics Data System (ADS)

    Domengie, F.; Morin, P.; Bauza, D.

    2015-07-01

    We propose a model for dark current induced by metallic contamination in a CMOS image sensor. Based on Shockley-Read-Hall kinetics, the expression of dark current proposed accounts for the electric field enhanced emission factor due to the Poole-Frenkel barrier lowering and phonon-assisted tunneling mechanisms. To that aim, we considered the distribution of the electric field magnitude and metal atoms in the depth of the pixel. Poisson statistics were used to estimate the random distribution of metal atoms in each pixel for a given contamination dose. Then, we performed a Monte-Carlo-based simulation for each pixel to set the number of metal atoms the pixel contained and the enhancement factor each atom underwent, and obtained a histogram of the number of pixels versus dark current for the full sensor. Excellent agreement with the dark current histogram measured on an ion-implanted gold-contaminated imager has been achieved, in particular, for the description of the distribution tails due to the pixel regions in which the contaminant atoms undergo a large electric field. The agreement remains very good when increasing the temperature by 15 °C. We demonstrated that the amplification of the dark current generated for the typical electric fields encountered in the CMOS image sensors, which depends on the nature of the metal contaminant, may become very large at high electric field. The electron and hole emissions and the resulting enhancement factor are described as a function of the trap characteristics, electric field, and temperature.
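
    The sampling scheme described (a Poisson-distributed atom count per pixel, a random enhancement factor per atom, then a full-sensor histogram) can be sketched as follows. The gamma-distributed enhancement factor here is an assumed stand-in for the field-dependent Poole-Frenkel / phonon-assisted-tunneling factor of the paper.

```python
import math
import random

def dark_current_histogram(n_pixels, mean_atoms, base_current,
                           gamma_shape=2.0, seed=1):
    """Monte-Carlo sketch: each pixel gets a Poisson number of contaminant
    atoms; each atom contributes the base generation current times a
    random enhancement factor (gamma-distributed here for illustration)."""
    rng = random.Random(seed)

    def poisson(lam):
        # Knuth's method, adequate for small means
        limit, k, p = math.exp(-lam), 0, 1.0
        while p > limit:
            k += 1
            p *= rng.random()
        return k - 1

    currents = []
    for _ in range(n_pixels):
        n_atoms = poisson(mean_atoms)
        i = sum(base_current * (1.0 + rng.gammavariate(gamma_shape, 1.0))
                for _ in range(n_atoms))
        currents.append(i)
    return currents
```

    Binning `currents` reproduces the qualitative shape discussed above: a peak from pixels with few atoms and a long tail from pixels whose atoms undergo large enhancement.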

  17. Elucidating the effects of adsorbent flexibility on fluid adsorption using simple models and flat-histogram sampling methods

    SciTech Connect

    Shen, Vincent K. Siderius, Daniel W.

    2014-06-28

    Using flat-histogram Monte Carlo methods, we investigate the adsorptive behavior of the square-well fluid in two simple slit-pore-like models intended to capture fundamental characteristics of flexible adsorbent materials. Both models require as input thermodynamic information about the flexible adsorbent material itself. An important component of this work involves formulating the flexible pore models in the appropriate thermodynamic (statistical mechanical) ensembles, namely, the osmotic ensemble and a variant of the grand-canonical ensemble. Two-dimensional probability distributions, which are calculated using flat-histogram methods, provide the information necessary to determine adsorption thermodynamics. For example, we are able to determine precisely adsorption isotherms, (equilibrium) phase transition conditions, limits of stability, and free energies for a number of different flexible adsorbent materials, distinguishable as different inputs into the models. While the models used in this work are relatively simple from a geometric perspective, they yield non-trivial adsorptive behavior, including adsorption-desorption hysteresis solely due to material flexibility and so-called “breathing” of the adsorbent. The observed effects can in turn be tied to the inherent properties of the bare adsorbent. Some of the effects are expected on physical grounds while others arise from a subtle balance of thermodynamic and mechanical driving forces. In addition, the computational strategy presented here can be easily applied to more complex models for flexible adsorbents.
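
    Flat-histogram sampling itself can be illustrated on a toy system. The Wang-Landau sketch below (a standard flat-histogram method, not the authors' specific implementation) estimates the density of states g(E) of a chain of independent two-state spins, where E counts the up spins, so g(E) should approach the binomial coefficient C(N, E).

```python
import math
import random

def wang_landau(n_spins=10, flatness=0.75, f_final=1e-4, seed=2):
    """Wang-Landau flat-histogram sketch: estimate log g(E) for
    E = number of up spins in a chain of n_spins independent spins."""
    rng = random.Random(seed)
    spins = [0] * n_spins
    E = 0
    lng = [0.0] * (n_spins + 1)    # running estimate of log g(E)
    hist = [0] * (n_spins + 1)
    lnf = 1.0                       # modification factor, halved when flat
    while lnf > f_final:
        for _ in range(2000):
            i = rng.randrange(n_spins)
            E_new = E + (1 if spins[i] == 0 else -1)
            # accept with probability min(1, g(E) / g(E_new))
            if math.log(rng.random() + 1e-300) < lng[E] - lng[E_new]:
                spins[i] ^= 1
                E = E_new
            lng[E] += lnf
            hist[E] += 1
        if min(hist) > flatness * sum(hist) / len(hist):
            hist = [0] * (n_spins + 1)
            lnf /= 2.0
    return lng
```

    The same machinery, run over two dimensions (e.g. particle number and pore width), yields the two-dimensional probability distributions the abstract refers to.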

  18. Local structure of equality constrained NLP problems

    SciTech Connect

    Mari, J.

    1994-12-31

    We show that locally around a feasible point, the behavior of an equality constrained nonlinear program is described by the gradient and the Hessian of the Lagrangian on the tangent subspace. In particular this holds true for reduced gradient approaches. Applying the same ideas to the control of nonlinear ODEs, one can devise first- and second-order methods that can be applied also to stiff problems. We finally describe an application of these ideas to the optimization of the production of human growth factor by fed-batch fermentation.
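
    The tangent-subspace idea can be made concrete for linear equality constraints: a reduced-gradient step moves along the gradient projected onto the null space of the constraint, so every iterate stays feasible. A minimal single-constraint sketch (not the authors' method, just the underlying mechanism):

```python
def reduced_gradient_descent(grad_f, a, b, x0, step=0.1, iters=200):
    """Sketch of reduced-gradient descent for one linear equality
    constraint a.x = b: the gradient is projected onto the tangent
    subspace (the null space of a), preserving feasibility."""
    assert abs(sum(ai * xi for ai, xi in zip(a, x0)) - b) < 1e-9, "x0 infeasible"
    aa = sum(ai * ai for ai in a)
    x = list(x0)
    for _ in range(iters):
        g = grad_f(x)
        coef = sum(ai * gi for ai, gi in zip(a, g)) / aa
        d = [gi - coef * ai for ai, gi in zip(a, g)]  # projected gradient
        x = [xi - step * di for xi, di in zip(x, d)]
    return x
```

    For example, minimizing x² + y² subject to x + y = 1 from the feasible start (1, 0) converges to (0.5, 0.5) while never leaving the constraint.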

  19. 47 CFR 36.191 - Equal access equipment.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 2 2011-10-01 2011-10-01 false Equal access equipment. 36.191 Section 36.191... AND RESERVES FOR TELECOMMUNICATIONS COMPANIES 1 Telecommunications Property Equal Access Equipment § 36.191 Equal access equipment. (a) Equal access investment includes only initial...

  20. 47 CFR 36.421 - Equal access expenses.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 2 2011-10-01 2011-10-01 false Equal access expenses. 36.421 Section 36.421... AND RESERVES FOR TELECOMMUNICATIONS COMPANIES 1 Operating Expenses and Taxes Equal Access Expenses § 36.421 Equal access expenses. (a) Equal access expenses include only initial incremental...

  1. 29 CFR 1614.202 - Equal Pay Act.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION FEDERAL SECTOR EQUAL EMPLOYMENT OPPORTUNITY Provisions Applicable to Particular Complaints § 1614.202 Equal Pay Act. (a) In its enforcement of the Equal Pay Act, the Commission has the authority to investigate an agency's employment practices...

  2. 29 CFR 1614.408 - Civil action: Equal Pay Act.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 4 2010-07-01 2010-07-01 false Civil action: Equal Pay Act. 1614.408 Section 1614.408 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION FEDERAL SECTOR EQUAL EMPLOYMENT OPPORTUNITY Appeals and Civil Actions § 1614.408 Civil action: Equal Pay Act....

  3. 48 CFR 52.211-6 - Brand name or equal.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 2 2013-10-01 2013-10-01 false Brand name or equal. 52....211-6 Brand name or equal. As prescribed in 11.107(a), insert the following provision: Brand Name or Equal (AUG 1999) (a) If an item in this solicitation is identified as “brand name or equal,”...

  4. 48 CFR 52.211-6 - Brand name or equal.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 2 2014-10-01 2014-10-01 false Brand name or equal. 52....211-6 Brand name or equal. As prescribed in 11.107(a), insert the following provision: Brand Name or Equal (AUG 1999) (a) If an item in this solicitation is identified as “brand name or equal,”...

  5. 48 CFR 52.211-6 - Brand name or equal.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 2 2012-10-01 2012-10-01 false Brand name or equal. 52....211-6 Brand name or equal. As prescribed in 11.107(a), insert the following provision: Brand Name or Equal (AUG 1999) (a) If an item in this solicitation is identified as “brand name or equal,”...

  6. 34 CFR 21.1 - Equal Access to Justice Act.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 34 Education 1 2010-07-01 2010-07-01 false Equal Access to Justice Act. 21.1 Section 21.1 Education Office of the Secretary, Department of Education EQUAL ACCESS TO JUSTICE General § 21.1 Equal Access to Justice Act. (a) The Equal Access to Justice Act (the Act) provides for the award of fees...

  7. 48 CFR 52.211-6 - Brand name or equal.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 2 2011-10-01 2011-10-01 false Brand name or equal. 52....211-6 Brand name or equal. As prescribed in 11.107(a), insert the following provision: Brand Name or Equal (AUG 1999) (a) If an item in this solicitation is identified as “brand name or equal,”...

  8. 48 CFR 52.211-6 - Brand name or equal.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 2 2010-10-01 2010-10-01 false Brand name or equal. 52....211-6 Brand name or equal. As prescribed in 11.107(a), insert the following provision: Brand Name or Equal (AUG 1999) (a) If an item in this solicitation is identified as “brand name or equal,”...

  9. Equal area rule methods for ternary systems

    SciTech Connect

    Shyu, G.S.; Hanif, N.S.M.; Alvarado, J.F.J.; Hall, K.R.; Eubank, P.T.

    1995-12-01

    The phase equilibrium behavior of fluid mixtures is an important design consideration for both chemical processes and oil production. Eubank and Hall have recently shown that the equal area rule (EAR) applies to the composition derivative of the Gibbs energy of a binary system at fixed pressure and temperature regardless of derivative continuity. A sufficient condition for equilibria, EAR is faster and simpler than either the familiar tangent-line method or the area method of Eubank et al. Here, the authors show that EAR can be extended to ternary systems exhibiting one, two, or three phases at equilibrium. A single directional vector is searched in composition space; at equilibrium, this vector is the familiar tie line. A sensitive criterion for equilibrium under EAR is equality of orthogonal derivatives such as (∂g/∂x1) at fixed x2, P, and T at the end points (α and β), where g ≡ Δ_mG/RT. Repeated use of the binary algorithm published in the first reference allows rapid, simple solution of ternary problems, even with hand-held calculators for cases where the background model is simple (e.g., activity coefficient models) and the derivative is continuous.
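
    For a binary system, the EAR condition reads g(x_b) - g(x_a) = m (x_b - x_a), with dg/dx equal to the tie-line slope m at both end points. The sketch below checks this on a symmetric regular-solution model (interaction parameter χ = 3, chosen for illustration only), where symmetry forces m = 0, so the phase compositions are the outer roots of dg/dx = 0.

```python
import math

CHI = 3.0  # assumed interaction parameter of a model regular solution

def g(x):
    """Dimensionless mixing Gibbs energy of a symmetric binary model."""
    return x * math.log(x) + (1 - x) * math.log(1 - x) + CHI * x * (1 - x)

def dg(x):
    """Composition derivative of g."""
    return math.log(x / (1 - x)) + CHI * (1 - 2 * x)

def bisect(f, a, b, iters=100):
    """Simple bisection root finder on a sign-changing bracket."""
    for _ in range(iters):
        mid = 0.5 * (a + b)
        if f(a) * f(mid) <= 0:
            b = mid
        else:
            a = mid
    return 0.5 * (a + b)

# dg has a local maximum near x = 0.21 and, by symmetry, a local
# minimum near x = 0.79; the equilibrium compositions lie outside.
x_a = bisect(dg, 1e-6, 0.21)
x_b = bisect(dg, 0.79, 1 - 1e-6)
m = 0.0  # tie-line slope for this symmetric model

# Equal area rule: the area between dg and the tie-line slope vanishes,
# equivalently g(x_b) - g(x_a) = m * (x_b - x_a).
ear_residual = g(x_b) - g(x_a) - m * (x_b - x_a)
```

    In the ternary extension described above, the same one-dimensional construction is applied repeatedly along a candidate tie-line direction in composition space.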

  10. Is Primatology an equal-opportunity discipline?

    PubMed

    Addessi, Elsa; Borgi, Marta; Palagi, Elisabetta

    2012-01-01

    The proportion of women occupying academic positions in biological sciences has increased in the past few decades, but women are still under-represented in senior academic ranks compared to their male colleagues. Primatology has been often singled out as a model of "equal-opportunity" discipline because of the common perception that women are more represented in Primatology than in similar fields. But is this indeed true? Here we show that, although in the past 15 years the proportion of female primatologists increased from the 38% of the early 1990s to the 57% of 2008, Primatology is far from being an "equal-opportunity" discipline, and suffers the phenomenon of "glass ceiling" as all the other scientific disciplines examined so far. In fact, even if Primatology does attract more female students than males, at the full professor level male members significantly outnumber females. Moreover, regardless of position, IPS male members publish significantly more than their female colleagues. Furthermore, when analyzing gender difference in scientific productivity in relation to the name order in the publications, it emerged that the scientific achievements of female primatologists (in terms of number and type of publications) do not always match their professional achievements (in terms of academic position). However, the gender difference in the IPS members' number of publications does not correspond to a similar difference in their scientific impact (as measured by their H index), which may indicate that female primatologists' fewer articles are of higher impact than those of their male colleagues. PMID:22272353

  11. Comp Plan: A computer program to generate dose and radiobiological metrics from dose-volume histogram files

    SciTech Connect

    Holloway, Lois Charlotte; Miller, Julie-Anne; Kumar, Shivani; Whelan, Brendan M.; Vinod, Shalini K.

    2012-10-01

    Treatment planning studies often require the calculation of a large number of dose and radiobiological metrics. To streamline these calculations, a computer program called Comp Plan was developed using MATLAB. Comp Plan calculates common metrics, including equivalent uniform dose, tumor control probability, and normal tissue complication probability from dose-volume histogram data. The dose and radiobiological metrics can be calculated for the original data or for an adjusted fraction size using the linear quadratic model. A homogeneous boost dose can be added to a given structure if desired. The final output is written to an Excel file in a format convenient for further statistical analysis. Comp Plan was verified by independent calculations. A lung treatment planning study comparing 45 plans for 7 structures using up to 6 metrics for each structure was successfully analyzed within approximately 5 minutes with Comp Plan. The code is freely available from the authors on request.
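
    Two of the metrics mentioned have compact closed forms that are easy to sketch: the generalized equivalent uniform dose computed from a differential DVH, and the linear-quadratic fraction-size adjustment (shown here in its common EQD2 form). Function names are ours for illustration, not Comp Plan's API.

```python
def gEUD(dose_bins, volumes, a):
    """Generalized equivalent uniform dose from a differential DVH:
    gEUD = (sum_i v_i * d_i**a) ** (1/a), v_i = fractional volume
    receiving dose d_i."""
    total = float(sum(volumes))
    return sum((v / total) * d ** a for d, v in zip(dose_bins, volumes)) ** (1.0 / a)

def eqd2(total_dose, dose_per_fraction, alpha_beta):
    """Equieffective dose in 2 Gy fractions via the linear-quadratic
    model: EQD2 = D * (d + alpha/beta) / (2 + alpha/beta)."""
    return total_dose * (dose_per_fraction + alpha_beta) / (2.0 + alpha_beta)
```

    With a = 1 the gEUD reduces to the mean dose; large positive a emphasizes hot spots (serial organs) and large negative a emphasizes cold spots (targets).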

  12. A fast underwater optical image segmentation algorithm based on a histogram weighted fuzzy c-means improved by PSO

    NASA Astrophysics Data System (ADS)

    Wang, Shilong; Xu, Yuru; Pang, Yongjie

    2011-03-01

    The S/N of an underwater image is low and its edges are fuzzy. If traditional methods are used to process it directly, the result is not satisfying. Though the traditional fuzzy C-means algorithm can sometimes divide the image into object and background, its time-consuming computation is often an obstacle. The mission of the vision system of an autonomous underwater vehicle (AUV) is to rapidly and exactly deal with the information about the object in a complex environment so that the AUV can use the result to execute its next task. Therefore, by using the statistical characteristics of the gray-image histogram, a fast and effective fuzzy C-means underwater image segmentation algorithm was presented. By modifying the fuzzy membership with the weighted histogram, the algorithm not only cuts down on a large amount of data processing and storage compared with the traditional algorithm, thereby speeding up segmentation, but also improves the quality of underwater image segmentation. Finally, particle swarm optimization (PSO) described by the sine function was introduced into the above algorithm to make up for the shortcoming that the FCM algorithm cannot reach the global optimal solution. Thus, on the one hand, it considers the global impact and achieves the local optimal solution, and on the other hand, it further greatly increases the computing speed. Experimental results indicate that the novel algorithm achieves better segmentation quality and reduces the processing time of each image, enhancing efficiency and satisfying the requirements of a highly effective, real-time AUV vision system.
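
    The core trick, running fuzzy C-means over the gray-level histogram so that each distinct gray value is processed once with its pixel count as a weight, can be sketched as follows. This is a plain weighted FCM without the paper's PSO refinement.

```python
def histogram_fcm(hist, n_clusters=2, m=2.0, iters=100, tol=1e-6):
    """Fuzzy C-means on a gray-level histogram: each occupied gray level
    g enters the center update once, weighted by its count hist[g], which
    is what makes this much faster than per-pixel FCM."""
    levels = [g for g, h in enumerate(hist) if h > 0]
    lo, hi = min(levels), max(levels)
    centers = [lo + (hi - lo) * (k + 0.5) / n_clusters for k in range(n_clusters)]
    for _ in range(iters):
        num = [0.0] * n_clusters
        den = [0.0] * n_clusters
        for g in levels:
            dist = [abs(g - c) + 1e-12 for c in centers]
            inv = [x ** (-2.0 / (m - 1.0)) for x in dist]
            s = sum(inv)
            for k in range(n_clusters):
                u = inv[k] / s               # fuzzy membership of level g
                w = hist[g] * u ** m         # histogram-weighted contribution
                num[k] += w * g
                den[k] += w
        new_centers = [num[k] / den[k] for k in range(n_clusters)]
        shift = max(abs(a - b) for a, b in zip(new_centers, centers))
        centers = new_centers
        if shift < tol:
            break
    return sorted(centers)
```

    With 2 clusters the two converged centers define the object/background split; thresholding each pixel by its nearer center yields the segmentation.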

  13. Modeling the dark current histogram induced by gold contamination in complementary-metal-oxide-semiconductor image sensors

    SciTech Connect

    Domengie, F. Morin, P.; Bauza, D.

    2015-07-14

    We propose a model for dark current induced by metallic contamination in a CMOS image sensor. Based on Shockley-Read-Hall kinetics, the expression of dark current proposed accounts for the electric field enhanced emission factor due to the Poole-Frenkel barrier lowering and phonon-assisted tunneling mechanisms. To that aim, we considered the distribution of the electric field magnitude and metal atoms in the depth of the pixel. Poisson statistics were used to estimate the random distribution of metal atoms in each pixel for a given contamination dose. Then, we performed a Monte-Carlo-based simulation for each pixel to set the number of metal atoms the pixel contained and the enhancement factor each atom underwent, and obtained a histogram of the number of pixels versus dark current for the full sensor. Excellent agreement with the dark current histogram measured on an ion-implanted gold-contaminated imager has been achieved, in particular, for the description of the distribution tails due to the pixel regions in which the contaminant atoms undergo a large electric field. The agreement remains very good when increasing the temperature by 15 °C. We demonstrated that the amplification of the dark current generated for the typical electric fields encountered in the CMOS image sensors, which depends on the nature of the metal contaminant, may become very large at high electric field. The electron and hole emissions and the resulting enhancement factor are described as a function of the trap characteristics, electric field, and temperature.

  14. Public hospital care: equal for all or equal for some? Evidence from the Philippines.

    PubMed

    James, Chris D; Peabody, John; Hanson, Kara; Solon, Orville

    2015-03-01

    In low- and middle-income countries, government budgets are rarely sufficient to cover a public hospital's operating costs. Shortfalls are typically financed through a combination of health insurance contributions and user charges. The mixed nature of this financing arrangement potentially creates financial incentives to treat patients with equal health need unequally. Using data from the Philippines, the authors analyzed whether doctors respond to such incentives. After controlling for a patient's condition, they found that patients using insurance, paying more for hospital accommodation, and being treated in externally monitored hospitals were likely to receive more care. This highlights the worrying possibility that public hospital patients with equal health needs are not always equally treated. PMID:23420059

  15. rp Process and Masses of N ≈ Z ≈ 34 Nuclides

    SciTech Connect

    Savory, J.; Schury, P.; Bachelet, C.; Block, M.; Bollen, G.; Facina, M.; Folden, C. M. III; Guenaut, C.; Kwan, E.; Kwiatkowski, A. A.; Morrissey, D. J.; Pang, G. K.; Prinke, A.; Ringle, R.; Schatz, H.; Schwarz, S.; Sumithrarachchi, C. S.

    2009-04-03

    High-precision Penning-trap mass measurements of the N ≈ Z ≈ 34 nuclides ⁶⁸Se, ⁷⁰Se, ⁷⁰ᵐBr, and ⁷¹Br were performed, reaching experimental uncertainties of 0.5-15 keV. The new and improved mass data together with theoretical Coulomb displacement energies were used as input for rp process network calculations. An increase in the effective lifetime of the waiting-point nucleus ⁶⁸Se was found, and more precise information was obtained on the luminosity during a type I x-ray burst along with the final elemental abundances after the burst.

  16. Jarzynski equality for non-Hamiltonian dynamics

    NASA Astrophysics Data System (ADS)

    Mandal, Dibyendu; Deweese, Michael R.

    Recent years have witnessed major advances in our understanding of nonequilibrium processes. The Jarzynski equality, for example, provides a link between equilibrium free energy differences and finite-time, nonequilibrium dynamics. We propose a generalization of this relation to non-Hamiltonian dynamics, relevant for active matter systems, continuous feedback, and computer simulation. Surprisingly, this relation allows us to calculate the free energy difference between the desired initial and final states using arbitrary dynamics. As a practical matter, this dissociation between the dynamics and the initial and final states promises to facilitate a range of techniques for free energy estimation in a single, universal expression. This material is based upon work supported in part by the U.S. Army Research Laboratory and the U.S. Army Research Office under Contract Number W911NF-13-1-0390.
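
    The equality itself is easy to verify numerically. For Gaussian work samples W ~ N(μ, σ²) at kT = 1 it predicts ΔF = μ - σ²/2; the estimator below uses a log-sum-exp shift to keep the exponential average numerically stable.

```python
import math
import random

def jarzynski_free_energy(works, kT=1.0):
    """Estimate the free energy difference from nonequilibrium work
    samples via the Jarzynski equality exp(-dF/kT) = <exp(-W/kT)>,
    using a log-sum-exp shift for numerical stability."""
    n = len(works)
    w_min = min(works)
    s = sum(math.exp(-(w - w_min) / kT) for w in works)
    return w_min - kT * math.log(s / n)
```

    Note that the estimate always lies at or below the mean work, consistent with the second law; rare low-work trajectories dominate the exponential average, which is why many samples are needed in practice.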

  17. Pressure-equalizing PV assembly and method

    DOEpatents

    Dinwoodie, Thomas L.

    2004-10-26

    Each PV assembly of an array of PV assemblies comprises a base, a PV module and a support assembly securing the PV module to a position overlying the upper surface of the base. Vents are formed through the base. A pressure equalization path extends from the outer surface of the PV module, past the PV module, to and through at least one of the vents, and to the lower surface of the base to help reduce wind uplift forces on the PV assembly. The PV assemblies may be interengaged, such as by interengaging the bases of adjacent PV assemblies. The base may include a main portion and a cover and the bases of adjacent PV assemblies may be interengaged by securing the covers of adjacent bases together.

  18. Social Epigenetics and Equality of Opportunity

    PubMed Central

    Loi, Michele; Del Savio, Lorenzo; Stupka, Elia

    2013-01-01

    Recent epidemiological reports of associations between socioeconomic status and epigenetic markers that predict vulnerability to diseases are bringing to light substantial biological effects of social inequalities. Here, we start the discussion of the moral consequences of these findings. We firstly highlight their explanatory importance in the context of the research program on the Developmental Origins of Health and Disease (DOHaD) and the social determinants of health. In the second section, we review some theories of the moral status of health inequalities. Rather than a complete outline of the debate, we single out those theories that rest on the principle of equality of opportunity and analyze the consequences of DOHaD and epigenetics for these particular conceptions of justice. We argue that DOHaD and epigenetics reshape the conceptual distinction between natural and acquired traits on which these theories rely and might provide important policy tools to tackle unjust distributions of health. PMID:23864907

  19. Acoustical numerology and lucky equal temperaments

    NASA Astrophysics Data System (ADS)

    Hall, Donald E.

    1988-04-01

    Equally tempered musical scales with N steps per octave are known to work especially well in approximating justly tuned intervals for such values as N=12, 19, 31, and 53. A quantitative measure of the closeness of such fits is suggested, in terms of the probabilities of coming as close to randomly chosen intervals as to the justly tuned targets. When two or more harmonic intervals are considered simultaneously, this involves a Monte Carlo evaluation of the probabilities. The results can be used to gauge how much advantage the special values of N mentioned above have over others. This article presents the rationale and method of computation, together with illustrative results in a few of the most interesting cases. References are provided to help relate these results to earlier works by music theorists.
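
    The closeness-of-fit idea can be sketched directly: for an N-step equal temperament, the error of an interval with just ratio r is the distance from N·log2(r) to the nearest integer, converted to cents. (The probabilistic Monte Carlo comparison described in the article is a separate step, not shown here; the target-interval set below is our illustrative choice.)

```python
import math

JUST_INTERVALS = [3 / 2, 5 / 4, 6 / 5]  # perfect fifth, major third, minor third

def max_error_cents(n):
    """Worst deviation, in cents, of the n-step equal temperament from
    the justly tuned target intervals above (1 octave = 1200 cents)."""
    worst = 0.0
    for r in JUST_INTERVALS:
        steps = n * math.log2(r)
        err = abs(steps - round(steps)) * 1200.0 / n
        worst = max(worst, err)
    return worst
```

    Scanning n = 2..60 with this measure makes the "lucky" values stand out: 12, 19, 31, and 53 show comparatively small worst-case errors relative to their neighbors.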

  20. Pressure equalizing photovoltaic assembly and method

    DOEpatents

    Dinwoodie, Thomas L.

    2003-05-27

    Each PV assembly of an array of PV assemblies comprises a base, a PV module and a support assembly securing the PV module to a position overlying the upper surface of the base. Vents are formed through the base. A pressure equalization path extends from the outer surface of the PV module, past the peripheral edge of the PV module, to and through at least one of the vents, and to the lower surface of the base to help reduce wind uplift forces on the PV assembly. The PV assemblies may be interengaged, such as by interengaging the bases of adjacent PV assemblies. The base may include a main portion and a cover and the bases of adjacent PV assemblies may be interengaged by securing the covers of adjacent bases together.

  1. Is Primatology an Equal-Opportunity Discipline?

    PubMed Central

    Borgi, Marta

    2012-01-01

    The proportion of women occupying academic positions in biological sciences has increased in the past few decades, but women are still under-represented in senior academic ranks compared to their male colleagues. Primatology has been often singled out as a model of “equal-opportunity” discipline because of the common perception that women are more represented in Primatology than in similar fields. But is this indeed true? Here we show that, although in the past 15 years the proportion of female primatologists increased from the 38% of the early 1990s to the 57% of 2008, Primatology is far from being an “equal-opportunity” discipline, and suffers the phenomenon of “glass ceiling” as all the other scientific disciplines examined so far. In fact, even if Primatology does attract more female students than males, at the full professor level male members significantly outnumber females. Moreover, regardless of position, IPS male members publish significantly more than their female colleagues. Furthermore, when analyzing gender difference in scientific productivity in relation to the name order in the publications, it emerged that the scientific achievements of female primatologists (in terms of number and type of publications) do not always match their professional achievements (in terms of academic position). However, the gender difference in the IPS members' number of publications does not correspond to a similar difference in their scientific impact (as measured by their H index), which may indicate that female primatologists' fewer articles are of higher impact than those of their male colleagues. PMID:22272353

  2. Downhole component with a pressure equalization passageway

    DOEpatents

    Hall, David R.; Pixton, David S.; Dahlgren, Scott; Reynolds, Jay T.; Breihan, James W.; Briscoe, Michael A.

    2006-08-22

    The present invention includes a downhole component adapted for transmitting downhole data. The downhole component includes a threaded end. The threaded end furthermore includes an interior region, an exterior region, and a mating surface wherein a cavity is formed. A data transmission element is disposed in the cavity and displaces a volume of the cavity. At least one passageway is formed in the threaded region between the interior and exterior regions. The passageway is in fluid communication with both the interior and exterior regions and thereby relieves pressure build-up of thread lubricant upon tool joint make-up.

  3. Equalization of loudspeaker response using balanced model truncation.

    PubMed

    Li, Xiansheng; Cai, Zhibo; Zheng, Chengshi; Li, Xiaodong

    2015-04-01

    Traditional loudspeaker equalization algorithms cannot decide the order of an equalizer before the whole equalization procedure has been completed, so designers have to try many times before they determine a proper order for the equalization filter. A method that overcomes this drawback is presented for loudspeaker equalization using balanced model truncation. The order of the equalizer can be decided easily using this algorithm, and the error between the model and the loudspeaker can also be readily controlled. Examples are presented and the performance of the proposed method is discussed with comparative experiments. PMID:25920872
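
    Balanced model truncation itself requires state-space machinery, but the end goal, an equalizer that inverts the loudspeaker response at a chosen filter order, can be illustrated with a simple least-squares FIR inverse. This is our simplified stand-in for illustration, not the paper's balanced-truncation method.

```python
def design_inverse_fir(h, n_taps, delay):
    """Least-squares FIR equalizer sketch: choose taps w so that the
    convolution (h * w) approximates an impulse delayed by `delay`.
    Solves the normal equations by Gaussian elimination."""
    n_out = len(h) + n_taps - 1
    # convolution matrix H: column j is h shifted down by j samples
    H = [[h[i - j] if 0 <= i - j < len(h) else 0.0 for j in range(n_taps)]
         for i in range(n_out)]
    d = [1.0 if i == delay else 0.0 for i in range(n_out)]
    # normal equations A w = b with A = H^T H, b = H^T d
    A = [[sum(H[i][r] * H[i][c] for i in range(n_out)) for c in range(n_taps)]
         for r in range(n_taps)]
    b = [sum(H[i][r] * d[i] for i in range(n_out)) for r in range(n_taps)]
    # Gaussian elimination with partial pivoting
    for col in range(n_taps):
        piv = max(range(col, n_taps), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n_taps):
            f = A[r][col] / A[col][col]
            for c in range(col, n_taps):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    w = [0.0] * n_taps
    for r in range(n_taps - 1, -1, -1):
        s = sum(A[r][c] * w[c] for c in range(r + 1, n_taps))
        w[r] = (b[r] - s) / A[r][r]
    return w
```

    The appeal of balanced truncation over this kind of direct design is precisely the order question: the Hankel singular values of the loudspeaker model indicate, before any fitting, how low the equalizer order can go for a given error.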

  4. The Equal Right to Inequality: Equality and Utility in on- and off-Campus Subject Delivery

    ERIC Educational Resources Information Center

    Luck, Morgan

    2009-01-01

    The principle of equality states that it is bad for some people to be worse off than others. In the context of distance education, this principle is violated on those occasions where on-campus students have access, not only to all the resources available to distance education students, but also to face-to-face tutorials. This is because the…

  5. Equal Opportunity Handbook: A Resource on Equal Opportunities for Education and Employment in Oregon Public Schools.

    ERIC Educational Resources Information Center

    Oregon State Dept. of Education, Salem.

    This handbook is an information source for Oregon public school districts developing policies to ensure equal opportunities in education, employment, and the provision of educational services required by Federal and state laws, regulations, and policies. Not addressed are issues and services for the handicapped or programs for migrants, the…

  6. 101 Short Problems from EQUALS = 101 Problemas Cortos del programma EQUALS.

    ERIC Educational Resources Information Center

    Stenmark, Jean Kerr, Ed.

    EQUALS is a teacher advisory program that helps elementary and secondary educators acquire methods and materials to attract minority and female students to mathematics. The program supports a problem-solving approach to mathematics, including having students working in groups, using active assessment methods, and incorporating a broad mathematics…

  7. Teachers Negotiating Discourses of Gender (In) Equality: The Case of Equal Opportunities Reform in Andalusia

    ERIC Educational Resources Information Center

    Cubero, Mercedes; Santamaría, Andrés; Rebollo, Mª Ángeles; Cubero, Rosario; García, Rafael; Vega, Luisa

    2015-01-01

    This article is focused on the analysis of the narratives produced by a group of teachers, experts in coeducation, while they were discussing their everyday activities. They are responsible for the implementation of a Plan for Gender Equality in public secondary schools in Andalusia (Spain). This study is based on contributions about doing gender…

  8. Adaptive Development

    NASA Technical Reports Server (NTRS)

    2005-01-01

    The goal of this research is to develop and demonstrate innovative adaptive seal technologies that can lead to dramatic improvements in engine performance, life, range, and emissions, and enhance operability for next generation gas turbine engines. This work is concentrated on the development of self-adaptive clearance control systems for gas turbine engines. Researchers have targeted the high-pressure turbine (HPT) blade tip seal location for the following reasons: Current active clearance control (ACC) systems (e.g., thermal case-cooling schemes) cannot respond to blade tip clearance changes due to mechanical, thermal, and aerodynamic loads. As such, they are prone to wear due to the required tight running clearances during operation. Blade tip seal wear (increased clearances) reduces engine efficiency, performance, and service life. Adaptive sealing technology research has inherent impact on all envisioned 21st century propulsion systems (e.g., distributed vectored, hybrid, and electric drive propulsion concepts).

  9. Decision feedback equalization for CDMA in indoor wireless communications

    NASA Astrophysics Data System (ADS)

    Abdulrahman, Majeed; Sheikh, Asrar U. H.; Falconer, David D.

    1994-05-01

    Commercial interest in Code Division Multiple Access (CDMA) systems has risen dramatically in the last few years. CDMA yields a potential increase in capacity over other access schemes, because it provides protection against interference, multipath, fading, and jamming. Recently, several interference cancellation schemes for CDMA have been proposed, but they require information about all interfering active users or some channel parameters. In this paper, we present an adaptive fractionally spaced decision feedback equalizer (DFE) for a CDMA system in an indoor wireless Rayleigh fading environment. This system only uses information about the desired user's spreading code and a training sequence. An analysis of the optimum performance of the DFE receiver shows the advantages of this system over others in terms of capacity improvements. A simulation of this system is also presented to study the convergence properties and implementation considerations of the DFE receiver. Effects on performance due to sudden birth and death of users in the CDMA system and the bit error rate performance of the DFE receiver are also presented.
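
    The training-sequence-driven adaptation the abstract describes can be illustrated with a minimal symbol-spaced LMS decision-feedback equalizer (a generic sketch, not the paper's fractionally spaced receiver; tap counts and step size are illustrative):

```python
import numpy as np

def lms_dfe(rx, train, n_ff=11, n_fb=4, mu=0.01):
    """Train a decision-feedback equalizer with LMS on a known sequence.
    rx: received samples; train: known +/-1 training symbols."""
    ff = np.zeros(n_ff)          # feed-forward taps
    fb = np.zeros(n_fb)          # feedback taps (cancel post-cursor ISI)
    buf = np.zeros(n_ff)         # delay line of received samples
    past = np.zeros(n_fb)        # previously decided symbols
    sq_err = np.empty(len(train))
    for k, d in enumerate(train):
        buf = np.roll(buf, 1); buf[0] = rx[k]
        y = ff @ buf - fb @ past       # equalizer output
        e = d - y                      # error against training symbol
        ff += mu * e * buf             # LMS stochastic-gradient updates
        fb -= mu * e * past
        past = np.roll(past, 1); past[0] = d
        sq_err[k] = e * e
    return ff, fb, sq_err

# Toy run: BPSK through a short ISI channel, no noise
rng = np.random.default_rng(0)
sym = rng.choice([-1.0, 1.0], size=4000)
rx = np.convolve(sym, [1.0, 0.5, 0.2])[:len(sym)]
ff, fb, sq_err = lms_dfe(rx, sym)
```

    The feedback filter subtracts ISI from already-decided symbols, which is what lets a DFE outperform a purely linear equalizer on channels with strong post-cursor energy.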

  10. Evaluation of breast cancer using intravoxel incoherent motion (IVIM) histogram analysis: comparison with malignant status, histological subtype, and molecular prognostic factors

    PubMed Central

    Cho, Gene Young; Moy, Linda; Kim, Sungheon G.; Baete, Steven H.; Moccaldi, Melanie; Babb, James S.; Sodickson, Daniel K.; Sigmund, Eric E.

    2016-01-01

    Purpose To examine heterogeneous breast cancer through intravoxel incoherent motion (IVIM) histogram analysis. Materials and methods This HIPAA-compliant, IRB-approved retrospective study included 62 patients (age 48.44±11.14 years, 50 malignant lesions and 12 benign) who underwent contrast-enhanced 3 T breast MRI and diffusion-weighted imaging. Apparent diffusion coefficient (ADC) and IVIM biomarkers of tissue diffusivity (Dt), perfusion fraction (fp), and pseudo-diffusivity (Dp) were calculated using voxel-based analysis for the whole lesion volume. Histogram analysis was performed to quantify tumour heterogeneity. Comparisons were made using Mann–Whitney tests between benign/malignant status, histological subtype, and molecular prognostic factor status while Spearman’s rank correlation was used to characterize the association between imaging biomarkers and prognostic factor expression. Results The average values of the ADC and IVIM biomarkers, Dt and fp, showed significant differences between benign and malignant lesions. Additional significant differences were found in the histogram parameters among tumour subtypes and molecular prognostic factor status. IVIM histogram metrics, particularly fp and Dp, showed significant correlation with hormonal factor expression. Conclusion Advanced diffusion imaging biomarkers show relationships with molecular prognostic factors and breast cancer malignancy. This analysis reveals novel diagnostic metrics that may explain some of the observed variability in treatment response among breast cancer patients. PMID:26615557
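
    The voxel-based analysis in the abstract amounts to fitting a bi-exponential IVIM model per voxel and then summarizing the parameter maps with histogram statistics. A minimal sketch follows; the parameterization, b-values, and bounds are illustrative assumptions, not the study's protocol:

```python
import numpy as np
from scipy.optimize import curve_fit

def ivim(b, s0, fp, dt, dp):
    """Bi-exponential IVIM signal model (one common parameterization):
    perfusion fraction fp, tissue diffusivity Dt, pseudo-diffusivity Dp."""
    return s0 * (fp * np.exp(-b * dp) + (1 - fp) * np.exp(-b * dt))

def fit_ivim_voxels(bvals, signals):
    """Fit fp, Dt, Dp for every voxel's signal-vs-b decay curve."""
    out = []
    for s in signals:
        p0 = [s[0], 0.1, 1.0e-3, 1.0e-2]                  # illustrative guesses
        bounds = ([0, 0, 1e-5, 1e-3], [2 * s[0], 0.6, 3e-3, 1e-1])
        popt, _ = curve_fit(ivim, bvals, s, p0=p0, bounds=bounds)
        out.append(popt[1:])                              # keep fp, Dt, Dp
    return np.array(out)

def histogram_metrics(values):
    """Whole-lesion heterogeneity metrics from a parameter histogram."""
    v = np.asarray(values, dtype=float)
    return {"mean": v.mean(), "median": np.median(v),
            "iqr": np.percentile(v, 75) - np.percentile(v, 25),
            "skew": float(((v - v.mean()) ** 3).mean() / v.std() ** 3)}
```

    Histogram metrics such as the interquartile range and skewness are what let the analysis quantify intra-tumour heterogeneity rather than a single mean value per lesion.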

  11. Structure-Property Relationships in Atomic-Scale Junctions: Histograms and Beyond.

    PubMed

    Hybertsen, Mark S; Venkataraman, Latha

    2016-03-15

    Over the past 10 years, there has been tremendous progress in the measurement, modeling and understanding of structure-function relationships in single molecule junctions. Numerous research groups have addressed significant scientific questions, directed both to conductance phenomena at the single molecule level and to the fundamental chemistry that controls junction functionality. Many different functionalities have been demonstrated, including single-molecule diodes, optically and mechanically activated switches, and, significantly, physical phenomena with no classical analogues, such as those based on quantum interference effects. Experimental techniques for reliable and reproducible single molecule junction formation and characterization have led to this progress. In particular, the scanning tunneling microscope based break-junction (STM-BJ) technique has enabled rapid, sequential measurement of large numbers of nanoscale junctions allowing a statistical analysis to readily distinguish reproducible characteristics. Harnessing fundamental link chemistry has provided the necessary chemical control over junction formation, enabling measurements that revealed clear relationships between molecular structure and conductance characteristics. Such link groups (amines, methylsulfides, pyridines, etc.) maintain a stable lone pair configuration that selectively bonds to specific, undercoordinated transition metal atoms available following rupture of a metal point contact in the STM-BJ experiments. This basic chemical principle rationalizes the observation of highly reproducible conductance signatures. Subsequently, the method has been extended to probe a variety of physical phenomena ranging from basic I-V characteristics to more complex properties such as thermopower and electrochemical response. By adapting the technique to a conducting cantilever atomic force microscope (AFM-BJ), simultaneous measurement of the mechanical characteristics of nanoscale junctions as they

  12. 47 CFR 90.168 - Equal employment opportunities.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... SERVICES PRIVATE LAND MOBILE RADIO SERVICES Applications and Authorizations Special Rules Governing Facilities Used to Provide Commercial Mobile Radio Services § 90.168 Equal employment opportunities. Commercial Mobile Radio Services licensees shall afford equal opportunity in employment to all...

  13. 47 CFR 90.168 - Equal employment opportunities.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... SERVICES PRIVATE LAND MOBILE RADIO SERVICES Applications and Authorizations Special Rules Governing Facilities Used to Provide Commercial Mobile Radio Services § 90.168 Equal employment opportunities. Commercial Mobile Radio Services licensees shall afford equal opportunity in employment to all...

  14. 47 CFR 90.168 - Equal employment opportunities.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... SERVICES PRIVATE LAND MOBILE RADIO SERVICES Applications and Authorizations Special Rules Governing Facilities Used to Provide Commercial Mobile Radio Services § 90.168 Equal employment opportunities. Commercial Mobile Radio Services licensees shall afford equal opportunity in employment to all...

  15. Adapting Animals.

    ERIC Educational Resources Information Center

    Wedman, John; Wedman, Judy

    1985-01-01

    The "Animals" program found on the Apple II and IIe system master disk can be adapted for use in the mathematics classroom. Instructions for making the necessary changes and suggestions for using it in lessons related to geometric shapes are provided. (JN)

  16. Adaptive Thresholds

    SciTech Connect

    Bremer, P. -T.

    2014-08-26

    ADAPT is a topological analysis code that computes local thresholds, in particular relevance-based thresholds, for features defined in scalar fields. The initial target application is vortex detection, but the software is more generally applicable to all threshold-based feature definitions.

  17. Adaptive homeostasis.

    PubMed

    Davies, Kelvin J A

    2016-06-01

    Homeostasis is a central pillar of modern Physiology. The term homeostasis was invented by Walter Bradford Cannon in an attempt to extend and codify the principle of 'milieu intérieur,' or a constant interior bodily environment, that had previously been postulated by Claude Bernard. Clearly, 'milieu intérieur' and homeostasis have served us well for over a century. Nevertheless, research on signal transduction systems that regulate gene expression, or that cause biochemical alterations to existing enzymes, in response to external and internal stimuli, makes it clear that biological systems are continuously making short-term adaptations both to set-points, and to the range of 'normal' capacity. These transient adaptations typically occur in response to relatively mild changes in conditions, to programs of exercise training, or to sub-toxic, non-damaging levels of chemical agents; thus, the terms hormesis, heterostasis, and allostasis are not accurate descriptors. Therefore, an operational adjustment to our understanding of homeostasis suggests that the modified term, Adaptive Homeostasis, may be useful especially in studies of stress, toxicology, disease, and aging. Adaptive Homeostasis may be defined as follows: 'The transient expansion or contraction of the homeostatic range in response to exposure to sub-toxic, non-damaging, signaling molecules or events, or the removal or cessation of such molecules or events.' PMID:27112802

  18. Subsea choke and riser pressure equalization system

    SciTech Connect

    Shanks, F.

    1980-07-01

    A description is given of a subsea choke and riser pressure equalization system comprising, in combination: a section of riser pipe; a choke conduit line mechanically supported by said riser pipe; a variable choke valve having an inlet port and a discharge port; a gate valve having an inlet port connected in fluid communication with the choke line and an output port connected in fluid communication with the inlet port of the variable choke for permitting diversion of fluid from the choke line to the variable choke; and, means connecting the discharge port of the variable choke in fluid communication with the interior of the riser section, said means including a discharge conduit extending from said choke valve in inclined relation with respect to the axis of said riser, whereby fluid diverted from said choke conduit is discharged into said riser in counterflow relation with respect to the upward flow of drilling fluid through said riser.

  19. Categorical facilitation with equally discriminable colors.

    PubMed

    Witzel, Christoph; Gegenfurtner, Karl R

    2015-01-01

    This study investigates the impact of language on color perception. By categorical facilitation, we refer to an aspect of categorical perception, in which the linguistic distinction between categories affects color discrimination beyond the low-level, sensory sensitivity to color differences. According to this idea, discrimination performance for colors that cross a category border should be better than for colors that belong to the same category when controlling for low-level sensitivity. We controlled for sensitivity by using colors that were equally discriminable according to empirically measured discrimination thresholds. To test for categorical facilitation, we measured response times and error rates in a speeded discrimination task for suprathreshold stimuli. Robust categorical facilitation occurred for five out of six categories with a group of inexperienced observers, namely for pink, orange, yellow, green, and purple. Categorical facilitation was robust against individual variations of categories or the laterality of target presentation. However, contradictory effects occurred in the blue category, most probably reflecting the difficulty to control effects of sensory mechanisms at the green-blue boundary. Moreover, a group of observers who were highly familiar with the discrimination task did not show consistent categorical facilitation in the other five categories. This trained group had much faster response times than the inexperienced group without any speed-accuracy trade-off. Additional analyses suggest that categorical facilitation occurs when observers pay attention to the categorical distinction but not when they respond automatically based on sensory feed-forward information. PMID:26129860

  20. Battery Charge Equalizer with Transformer Array

    NASA Technical Reports Server (NTRS)

    Davies, Francis

    2013-01-01

    High-power batteries generally consist of a series connection of many cells or cell banks. In order to maintain high performance over battery life, it is desirable to keep the state of charge of all the cell banks equal. A method provides individual charging for battery cells in a large, high-voltage battery array with a minimum number of transformers while maintaining reasonable efficiency. This is designed to augment a simple high-current charger that supplies the main charge energy. The innovation will form part of a larger battery charge system. It consists of a transformer array connected to the battery array through rectification and filtering circuits. The transformer array is connected to a drive circuit and a timing and control circuit that allow individual battery cells or cell banks to be charged. The timing circuit and control circuit connect to a charge controller that uses battery instrumentation to determine which battery bank to charge. It is important to note that the innovation can charge an individual cell bank at the same time that the main battery charger is charging the high-voltage battery. The fact that the battery cell banks are at a non-zero voltage, and that they are all at similar voltages, can be used to allow charging of individual cell banks. A set of transformers can be connected with secondary windings in series to make weighted sums of the voltages on the primaries.

  1. Nonlinear Algorithms for Channel Equalization and Map Symbol Detection.

    NASA Astrophysics Data System (ADS)

    Giridhar, K.

    The transfer of information through a communication medium invariably results in various kinds of distortion to the transmitted signal. In this dissertation, a feed-forward neural network-based equalizer, and a family of maximum a posteriori (MAP) symbol detectors are proposed for signal recovery in the presence of intersymbol interference (ISI) and additive white Gaussian noise. The proposed neural network-based equalizer employs a novel bit-mapping strategy to handle multilevel data signals in an equivalent bipolar representation. It uses a training procedure to learn the channel characteristics, and at the end of training, the multilevel symbols are recovered from the corresponding inverse bit-mapping. When the channel characteristics are unknown and no training sequences are available, blind estimation of the channel (or its inverse) and simultaneous data recovery is required. Convergence properties of several existing Bussgang-type blind equalization algorithms are studied through computer simulations, and a unique gain independent approach is used to obtain a fair comparison of their rates of convergence. Although simple to implement, the slow convergence of these Bussgang-type blind equalizers make them unsuitable for many high data-rate applications. Rapidly converging blind algorithms based on the principle of MAP symbol-by-symbol detection are proposed, which adaptively estimate the channel impulse response (CIR) and simultaneously decode the received data sequence. Assuming a linear and Gaussian measurement model, the near-optimal blind MAP symbol detector (MAPSD) consists of a parallel bank of conditional Kalman channel estimators, where the conditioning is done on each possible data subsequence that can convolve with the CIR. This algorithm is also extended to the recovery of convolutionally encoded waveforms in the presence of ISI. Since the complexity of the MAPSD algorithm increases exponentially with the length of the assumed CIR, a suboptimal
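
    The Bussgang family of blind equalizers studied in the dissertation can be illustrated with its simplest member, the constant modulus algorithm (CMA), which needs no training sequence. This is a generic real-valued sketch under illustrative settings, not the dissertation's code:

```python
import numpy as np

def cma_equalize(rx, n_taps=11, mu=1e-3, r2=1.0):
    """Constant Modulus Algorithm, a standard Bussgang-type blind equalizer.
    r2 is the constant-modulus target E|s|^4 / E|s|^2 (1.0 for BPSK)."""
    w = np.zeros(n_taps); w[n_taps // 2] = 1.0   # center-spike initialization
    buf = np.zeros(n_taps)
    cost = np.empty(len(rx))
    for k, x in enumerate(rx):
        buf = np.roll(buf, 1); buf[0] = x
        y = w @ buf
        w -= mu * y * (y * y - r2) * buf   # stochastic gradient of (y^2 - r2)^2
        cost[k] = (y * y - r2) ** 2
    return w, cost

# Toy run: blind equalization of BPSK over a mild ISI channel
rng = np.random.default_rng(1)
sym = rng.choice([-1.0, 1.0], size=30000)
rx = np.convolve(sym, [1.0, 0.3])[:len(sym)]
w, cost = cma_equalize(rx)
```

    The slowly decaying `cost` trace is exactly the convergence behavior the dissertation identifies as the weakness of Bussgang methods, motivating the faster MAP symbol-by-symbol detectors it proposes instead.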

  2. 76 FR 36885 - Regulation B; Equal Credit Opportunity

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-23

    ...; ] FEDERAL RESERVE SYSTEM 12 CFR Part 202 RIN No. 7100-AD-78 Regulation B; Equal Credit Opportunity AGENCY.... SUMMARY: The Board is publishing for public comment a proposed rule amending Regulation B (Equal Credit Opportunity). Section 704B of the Equal Credit Opportunity Act (ECOA), as added by Section 1071 of the...

  3. 12 CFR 528.5 - Equal Housing Lender Poster.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 5 2010-01-01 2010-01-01 false Equal Housing Lender Poster. 528.5 Section 528... REQUIREMENTS § 528.5 Equal Housing Lender Poster. (a) Each savings association shall post and maintain one or more Equal Housing Lender Posters, the text of which is prescribed in paragraph (b) of this section,...

  4. 12 CFR 626.6025 - Equal housing lender poster.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 7 2012-01-01 2012-01-01 false Equal housing lender poster. 626.6025 Section... § 626.6025 Equal housing lender poster. (a) Each Farm Credit institution that makes loans for the... by a dwelling shall post and maintain an Equal Housing Lender Poster in the lobby of each of...

  5. 12 CFR 128.5 - Equal Housing Lender Poster.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 1 2014-01-01 2014-01-01 false Equal Housing Lender Poster. 128.5 Section 128... REQUIREMENTS § 128.5 Equal Housing Lender Poster. (a) Each savings association shall post and maintain one or more Equal Housing Lender Posters, the text of which is prescribed in paragraph (b) of this section,...

  6. 12 CFR 390.146 - Equal Housing Lender Poster.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 5 2014-01-01 2014-01-01 false Equal Housing Lender Poster. 390.146 Section....146 Equal Housing Lender Poster. (a) Each State savings association shall post and maintain one or more Equal Housing Lender Posters, the text of which is prescribed in paragraph (b) of this section,...

  7. 12 CFR 390.146 - Equal Housing Lender Poster.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 5 2012-01-01 2012-01-01 false Equal Housing Lender Poster. 390.146 Section....146 Equal Housing Lender Poster. (a) Each State savings association shall post and maintain one or more Equal Housing Lender Posters, the text of which is prescribed in paragraph (b) of this section,...

  8. 12 CFR 528.5 - Equal Housing Lender Poster.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 6 2014-01-01 2012-01-01 true Equal Housing Lender Poster. 528.5 Section 528.5... REQUIREMENTS § 528.5 Equal Housing Lender Poster. (a) Each savings association shall post and maintain one or more Equal Housing Lender Posters, the text of which is prescribed in paragraph (b) of this section,...

  9. 12 CFR 390.146 - Equal Housing Lender Poster.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 5 2013-01-01 2013-01-01 false Equal Housing Lender Poster. 390.146 Section....146 Equal Housing Lender Poster. (a) Each State savings association shall post and maintain one or more Equal Housing Lender Posters, the text of which is prescribed in paragraph (b) of this section,...

  10. 12 CFR 528.5 - Equal Housing Lender Poster.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 6 2012-01-01 2012-01-01 false Equal Housing Lender Poster. 528.5 Section 528... REQUIREMENTS § 528.5 Equal Housing Lender Poster. (a) Each savings association shall post and maintain one or more Equal Housing Lender Posters, the text of which is prescribed in paragraph (b) of this section,...

  11. 12 CFR 626.6025 - Equal housing lender poster.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 7 2014-01-01 2014-01-01 false Equal housing lender poster. 626.6025 Section... § 626.6025 Equal housing lender poster. (a) Each Farm Credit institution that makes loans for the... by a dwelling shall post and maintain an Equal Housing Lender Poster in the lobby of each of...

  12. 12 CFR 268.407 - Civil action: Equal Pay Act.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 3 2011-01-01 2011-01-01 false Civil action: Equal Pay Act. 268.407 Section 268.407 Banks and Banking FEDERAL RESERVE SYSTEM (CONTINUED) BOARD OF GOVERNORS OF THE FEDERAL RESERVE SYSTEM RULES REGARDING EQUAL OPPORTUNITY Appeals to the Equal Employment Opportunity Commission §...

  13. 12 CFR 268.407 - Civil action: Equal Pay Act.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 4 2013-01-01 2013-01-01 false Civil action: Equal Pay Act. 268.407 Section 268.407 Banks and Banking FEDERAL RESERVE SYSTEM (CONTINUED) BOARD OF GOVERNORS OF THE FEDERAL RESERVE SYSTEM (CONTINUED) RULES REGARDING EQUAL OPPORTUNITY Appeals to the Equal Employment...

  14. 45 CFR 1616.6 - Equal employment opportunity.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 4 2010-10-01 2010-10-01 false Equal employment opportunity. 1616.6 Section 1616... ATTORNEY HIRING § 1616.6 Equal employment opportunity. A recipient shall adopt employment qualifications... employment, and shall take affirmative action to insure equal employment opportunity....

  15. 45 CFR 2543.81 - Equal employment opportunity.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 4 2010-10-01 2010-10-01 false Equal employment opportunity. 2543.81 Section 2543...-PROFIT ORGANIZATIONS Statutory Compliance § 2543.81 Equal employment opportunity. All contracts shall contain a provision requiring compliance with E.O. 11246, “Equal Employment Opportunity,” as amended by...

  16. 47 CFR 73.881 - Equal employment opportunities.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 4 2010-10-01 2010-10-01 false Equal employment opportunities. 73.881 Section... BROADCAST SERVICES Low Power FM Broadcast Stations (LPFM) § 73.881 Equal employment opportunities. General EEO policy. Equal employment opportunity shall be afforded by all LPFM licensees and permittees to...

  17. 36 CFR 254.11 - Exchanges at approximately equal value.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... equal value. 254.11 Section 254.11 Parks, Forests, and Public Property FOREST SERVICE, DEPARTMENT OF AGRICULTURE LANDOWNERSHIP ADJUSTMENTS Land Exchanges § 254.11 Exchanges at approximately equal value. (a) The authorized officer may exchange lands which are of approximately equal value upon a determination that:...

  18. 36 CFR 254.11 - Exchanges at approximately equal value.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... equal value. 254.11 Section 254.11 Parks, Forests, and Public Property FOREST SERVICE, DEPARTMENT OF AGRICULTURE LANDOWNERSHIP ADJUSTMENTS Land Exchanges § 254.11 Exchanges at approximately equal value. (a) The authorized officer may exchange lands which are of approximately equal value upon a determination that:...

  19. 36 CFR 254.11 - Exchanges at approximately equal value.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... equal value. 254.11 Section 254.11 Parks, Forests, and Public Property FOREST SERVICE, DEPARTMENT OF AGRICULTURE LANDOWNERSHIP ADJUSTMENTS Land Exchanges § 254.11 Exchanges at approximately equal value. (a) The authorized officer may exchange lands which are of approximately equal value upon a determination that:...

  20. 43 CFR 2201.5 - Exchanges at approximately equal value.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 43 Public Lands: Interior 2 2012-10-01 2012-10-01 false Exchanges at approximately equal value... PROCEDURES Exchanges-Specific Requirements § 2201.5 Exchanges at approximately equal value. (a) The authorized officer may exchange lands that are of approximately equal value when it is determined that:...

  1. 43 CFR 2201.5 - Exchanges at approximately equal value.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 43 Public Lands: Interior 2 2013-10-01 2013-10-01 false Exchanges at approximately equal value... PROCEDURES Exchanges-Specific Requirements § 2201.5 Exchanges at approximately equal value. (a) The authorized officer may exchange lands that are of approximately equal value when it is determined that:...

  2. 43 CFR 2201.5 - Exchanges at approximately equal value.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 43 Public Lands: Interior 2 2011-10-01 2011-10-01 false Exchanges at approximately equal value... PROCEDURES Exchanges-Specific Requirements § 2201.5 Exchanges at approximately equal value. (a) The authorized officer may exchange lands that are of approximately equal value when it is determined that:...

  3. 36 CFR 254.11 - Exchanges at approximately equal value.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... equal value. 254.11 Section 254.11 Parks, Forests, and Public Property FOREST SERVICE, DEPARTMENT OF AGRICULTURE LANDOWNERSHIP ADJUSTMENTS Land Exchanges § 254.11 Exchanges at approximately equal value. (a) The authorized officer may exchange lands which are of approximately equal value upon a determination that:...

  4. 5 CFR 720.101 - Federal Equal Opportunity Recruitment Program.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Federal Equal Opportunity Recruitment... Federal Equal Opportunity Recruitment Program. This section incorporates the statutory requirements for establishing and conducting an equal opportunity recruitment program consistent with law within the...

  5. 48 CFR 852.211-73 - Brand name or equal.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 5 2013-10-01 2013-10-01 false Brand name or equal. 852... Brand name or equal. As prescribed in 811.104-71, insert the following clause: Brand Name or Equal (JAN 2008) (Note: As used in this clause, the term “brand name” includes identification of products by...

  6. 48 CFR 1852.210-70 - Brand name or equal.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 6 2012-10-01 2012-10-01 false Brand name or equal. 1852... 1852.210-70 Brand name or equal. As prescribed in 1810.011-70(a), insert the following provision: Brand Name or Equal (DEC 1988) (a) As used in this provision, “brand name” means identification of...

  7. 48 CFR 452.211-70 - Brand Name or Equal.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 4 2012-10-01 2012-10-01 false Brand Name or Equal. 452... FORMS SOLICITATION PROVISIONS AND CONTRACT CLAUSES Texts of Provisions and Clauses 452.211-70 Brand Name or Equal. As prescribed in 411.171, insert the following provision: Brand Name or Equal (NOV...

  8. 48 CFR 1852.210-70 - Brand name or equal.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 6 2013-10-01 2013-10-01 false Brand name or equal. 1852... 1852.210-70 Brand name or equal. As prescribed in 1810.011-70(a), insert the following provision: Brand Name or Equal (DEC 1988) (a) As used in this provision, “brand name” means identification of...

  9. 48 CFR 452.211-70 - Brand Name or Equal.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 4 2013-10-01 2013-10-01 false Brand Name or Equal. 452... FORMS SOLICITATION PROVISIONS AND CONTRACT CLAUSES Texts of Provisions and Clauses 452.211-70 Brand Name or Equal. As prescribed in 411.171, insert the following provision: Brand Name or Equal (NOV...

  10. 48 CFR 452.211-70 - Brand Name or Equal.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 4 2014-10-01 2014-10-01 false Brand Name or Equal. 452... FORMS SOLICITATION PROVISIONS AND CONTRACT CLAUSES Texts of Provisions and Clauses 452.211-70 Brand Name or Equal. As prescribed in 411.171, insert the following provision: Brand Name or Equal (NOV...

  11. 48 CFR 852.211-73 - Brand name or equal.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 5 2012-10-01 2012-10-01 false Brand name or equal. 852... Brand name or equal. As prescribed in 811.104-71, insert the following clause: Brand Name or Equal (JAN 2008) (Note: As used in this clause, the term “brand name” includes identification of products by...

  12. 48 CFR 852.211-73 - Brand name or equal.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Brand name or equal. 852... Brand name or equal. As prescribed in 811.104-71, insert the following clause: Brand Name or Equal (JAN 2008) (Note: As used in this clause, the term “brand name” includes identification of products by...

  13. 48 CFR 1852.210-70 - Brand name or equal.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 6 2014-10-01 2014-10-01 false Brand name or equal. 1852... 1852.210-70 Brand name or equal. As prescribed in 1810.011-70(a), insert the following provision: Brand Name or Equal (DEC 1988) (a) As used in this provision, “brand name” means identification of...

  14. Pushing Economies (and Students) outside the Factor Price Equalization Zone

    ERIC Educational Resources Information Center

    Oslington, Paul; Towers, Isaac

    2009-01-01

    Despite overwhelming empirical evidence of the failure of factor price equalization, most teaching of international trade theory (even at the graduate level) assumes that economies are incompletely specialized and that factor price equalization holds. The behavior of trading economies in the absence of factor price equalization is not well…

  15. Reflections on Mainstreaming Gender Equality in Adult Basic Education Programmes

    ERIC Educational Resources Information Center

    Lind, Agneta

    2006-01-01

    This article is about mainstreaming gender equality in adult basic learning and education (ABLE). Gender equality is defined as equal rights of both women and men to influence, participate in and benefit from a programme. It is argued that specific gender analyses of emerging patterns of gender relations is helpful in formulating gender equality…

  16. The Struggle for Gender Equality: How Men Respond.

    ERIC Educational Resources Information Center

    Kimmel, Michael S.

    1993-01-01

    Men's responses to women's demands for educational equality during three peak periods of activism for women's equality (mid-nineteenth century, turn of the century, and contemporary era) are examined for the insight they offer into the different issues and shifting arguments for and against gender equality in this and other arenas. (MSE)

  17. 29 CFR 530.414 - Equal Access to Justice Act.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 3 2010-07-01 2010-07-01 false Equal Access to Justice Act. 530.414 Section 530.414 Labor... OF HOMEWORKERS IN CERTAIN INDUSTRIES Administrative Procedures § 530.414 Equal Access to Justice Act. Proceedings under this part are not subject to the provisions of the Equal Access to Justice Act. In...

  18. 48 CFR 852.211-73 - Brand name or equal.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Brand name or equal. 852... Brand name or equal. As prescribed in 811.104-71, insert the following clause: Brand Name or Equal (JAN 2008) (Note: As used in this clause, the term “brand name” includes identification of products by...

  19. 48 CFR 452.211-70 - Brand Name or Equal.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 4 2011-10-01 2011-10-01 false Brand Name or Equal. 452... FORMS SOLICITATION PROVISIONS AND CONTRACT CLAUSES Texts of Provisions and Clauses 452.211-70 Brand Name or Equal. As prescribed in 411.171, insert the following provision: Brand Name or Equal (NOV...

  20. 48 CFR 1852.210-70 - Brand name or equal.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Brand name or equal. 1852... 1852.210-70 Brand name or equal. As prescribed in 1810.011-70(a), insert the following provision: Brand Name or Equal (DEC 1988) (a) As used in this provision, “brand name” means identification of...

  1. 48 CFR 852.211-73 - Brand name or equal.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Brand name or equal. 852... Brand name or equal. As prescribed in 811.104-71, insert the following clause: Brand Name or Equal (JAN 2008) (Note: As used in this clause, the term “brand name” includes identification of products by...

  2. 48 CFR 1852.210-70 - Brand name or equal.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 6 2011-10-01 2011-10-01 false Brand name or equal. 1852... 1852.210-70 Brand name or equal. As prescribed in 1810.011-70(a), insert the following provision: Brand Name or Equal (DEC 1988) (a) As used in this provision, “brand name” means identification of...

  3. 48 CFR 452.211-70 - Brand Name or Equal.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false Brand Name or Equal. 452... FORMS SOLICITATION PROVISIONS AND CONTRACT CLAUSES Texts of Provisions and Clauses 452.211-70 Brand Name or Equal. As prescribed in 411.171, insert the following provision: Brand Name or Equal (NOV...

  4. 77 FR 39117 - Equal Access to Justice Act Implementation Rule

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-29

    ... PROTECTION 12 CFR Part 1071 RIN 3170-AA27 Equal Access to Justice Act Implementation Rule AGENCY: Bureau of... Equal Access to Justice Act (EAJA or the Act) requires agencies ] that conduct adversary adjudications..., Credit, Credit unions, Equal access to justice, Law enforcement, National banks, Savings...

  5. 75 FR 17622 - Equal Access to Justice Act Implementation

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-07

    ... Federal Housing Enterprise Oversight 12 CFR Part 1705 RIN 2590-AA29 Equal Access to Justice Act... form, the OFHEO ``Implementation of the Equal Access to ] Justice Act'' regulation at 12 CFR part 1705... Subjects in 12 CFR Parts 1203 and 1705 Administrative practice and procedure, Equal access to...

  6. Halving It All: How Equally Shared Parenting Works.

    ERIC Educational Resources Information Center

    Deutsch, Francine M.

    Noting that details of everyday life contribute to parental equality or inequality, this qualitative study focused on how couples transformed parental roles to create truly equal families. Participating in the study were 88 couples in 4 categories, based on division of parental responsibilities: equal sharers, 60-40 couples, 75-25 couples, and…

  7. BEDVH--A method for evaluating biologically effective dose volume histograms: Application to eye plaque brachytherapy implants

    SciTech Connect

    Gagne, Nolan L.; Leonard, Kara L.; Huber, Kathryn E.; Mignano, John E.; Duker, Jay S.; Laver, Nora V.; Rivard, Mark J.

    2012-02-15

Purpose: A method is introduced to examine the influence of implant duration T, radionuclide, and radiobiological parameters on the biologically effective dose (BED) throughout the entire volume of regions of interest for episcleral brachytherapy using available radionuclides. This method is employed to evaluate a particular eye plaque brachytherapy implant in a radiobiological context. Methods: A reference eye geometry and 16 mm COMS eye plaque loaded with 103Pd, 125I, or 131Cs sources were examined with dose distributions accounting for plaque heterogeneities. For a standardized 7 day implant, doses to 90% of the tumor volume (tumor D90) and 10% of the organ-at-risk volumes (OAR D10) were calculated. The BED equation from Dale and Jones and published α/β and μ parameters were incorporated with dose volume histograms (DVHs) for various T values such as T = 7 days (i.e., tumor 7BED10 and OAR 7BED10). By calculating BED throughout the volumes, biologically effective dose volume histograms (BEDVHs) were developed for tumor and OARs. The influence of T, radionuclide choice, and radiobiological parameters on tumor BEDVH and OAR BEDVH was examined. The nominal dose was scaled for shorter implants to achieve biological equivalence. Results: Tumor D90 values were 102, 112, and 110 Gy for 103Pd, 125I, and 131Cs, respectively. Corresponding tumor 7BED10 values were 124, 140, and 138 Gy, respectively. As T decreased from 7 to 0.01 days, the isobiologically effective prescription dose decreased by a factor of three. As expected, tumor 7BEDVH did not significantly change as a function of radionuclide half-life but varied by 10% due to radionuclide dose distribution. Variations in reported radiobiological parameters caused tumor 7BED10 to deviate by up to 46%. Over the range of OAR…

  8. Employment equality legislation, 3 March 1988.

    PubMed

    1988-01-01

    On 1 April 1988, new employment equality legislation came into effect in Israel. The new legislation outlaws discrimination at work on the grounds of sex, marital status, and parenthood with respect to recruitment, terms of employment, promotion, vocational training, retraining, dismissal, and severance pay. Under the legislation, 1) employers may not cause prejudice to workers who allege discrimination, help others to do so, or decline sexual advances by a direct or indirect supervisor; 2) the burden of proof in discrimination claims against an employer is on the employer if the worker can show that requirements set by the employer have been met; 3) company managers and co-owners in a partnership are personally liable for violations on the part of the employer if the firm has over six workers unless they prove that the offense was committed without their knowledge or that they had taken all appropriate measures to prevent it; and 4) no special rights given to women by law, collective agreement, or other work contract are to be considered discrimination. The legislation establishes a Public Council to advise the Minister of Labour and Social Affairs on implementing and publicizing the legislation. It also allows a father to receive the following work benefits that were previously restricted to mothers: 1) leave of absence to care for a sick child and 2) statutory leave and statutory entitlement to severance pay for resigning to care for a newborn or adopted baby if the father is the sole guardian or if the mother renounces her right because she is working. PMID:12289294

  9. Connector adapter

    NASA Technical Reports Server (NTRS)

    Hacker, Scott C. (Inventor); Dean, Richard J. (Inventor); Burge, Scott W. (Inventor); Dartez, Toby W. (Inventor)

    2007-01-01

    An adapter for installing a connector to a terminal post, wherein the connector is attached to a cable, is presented. In an embodiment, the adapter is comprised of an elongated collet member having a longitudinal axis comprised of a first collet member end, a second collet member end, an outer collet member surface, and an inner collet member surface. The inner collet member surface at the first collet member end is used to engage the connector. The outer collet member surface at the first collet member end is tapered for a predetermined first length at a predetermined taper angle. The collet includes a longitudinal slot that extends along the longitudinal axis initiating at the first collet member end for a predetermined second length. The first collet member end is formed of a predetermined number of sections segregated by a predetermined number of channels and the longitudinal slot.

  10. Facilitation between extensor carpi radialis and pronator teres in humans: a study using a post-stimulus time histogram method.

    PubMed

    Nakano, Haruki; Miyasaka, Takuji; Ogino, Toshihiko; Naito, Akira

    2014-12-01

Group I muscle afferents modulate the excitability of motor neurons through excitatory and inhibitory spinal reflexes. Spinal reflex relationships between various muscle pairs are well described in experimental animals but not in the human upper limb, which exhibits fine control of movement. In the present study, spinal reflexes between the extensor carpi radialis (ECR) and pronator teres (PT) muscles were examined in healthy human subjects using a post-stimulus time histogram method. Electrical stimulation of low-threshold afferents of ECR nerves increased the motor neuron excitability in 31 of 76 PT motor units (MUs) in all eight subjects tested, while stimulation of low-threshold afferents of PT nerves increased the motor neuron excitability in 36 of 102 ECR MUs in all 10 subjects. The estimated central synaptic delay was almost equivalent to that of homonymous facilitation. Mechanical stimulation (MS) of ECR facilitated 16 of 30 PT MUs in all five subjects tested, while MS of PT facilitated 17 of 30 ECR MUs in all six subjects. These results suggest an excitatory reflex (facilitation) between PT and ECR. Group I afferents should mediate the facilitation through a monosynaptic path. PMID:25026240

  11. Highly sensitive image-derived indices of water-stressed plants using hyperspectral imaging in SWIR and histogram analysis

    PubMed Central

    Kim, David M.; Zhang, Hairong; Zhou, Haiying; Du, Tommy; Wu, Qian; Mockler, Todd C.; Berezin, Mikhail Y.

    2015-01-01

    The optical signature of leaves is an important monitoring and predictive parameter for a variety of biotic and abiotic stresses, including drought. Such signatures derived from spectroscopic measurements provide vegetation indices – a quantitative method for assessing plant health. However, the commonly used metrics suffer from low sensitivity. Relatively small changes in water content in moderately stressed plants demand high-contrast imaging to distinguish affected plants. We present a new approach in deriving sensitive indices using hyperspectral imaging in a short-wave infrared range from 800 nm to 1600 nm. Our method, based on high spectral resolution (1.56 nm) instrumentation and image processing algorithms (quantitative histogram analysis), enables us to distinguish a moderate water stress equivalent of 20% relative water content (RWC). The identified image-derived indices 15XX nm/14XX nm (i.e. 1529 nm/1416 nm) were superior to common vegetation indices, such as WBI, MSI, and NDWI, with significantly better sensitivity, enabling early diagnostics of plant health. PMID:26531782

  12. Usefulness of histogram analysis of spatial frequency components for exploring the similarity and bilateral asymmetry in mammograms

    NASA Astrophysics Data System (ADS)

    Shiotsuki, Kenshi; Matsunobu, Yusuke; Yabuuchi, Hidetake; Morishita, Junji

    2015-03-01

The right and left mammograms of a patient are assumed to be bilaterally symmetric during image reading. Asymmetry between bilateral mammograms is a reliable indicator of possible breast abnormalities. The purpose of this study was to examine the potential usefulness of a new method based on spatial frequency components for exploring similarity and abnormality between the right and left mammograms. A total of 98 normal and 119 abnormal cases with calcifications were used for this study. Each case included two mediolateral oblique views. The spatial frequency components were determined from the symmetric regions in the right and left mammograms by Fourier transform. The degrees of conformity between the two spatial frequency components in the right and left mammograms were calculated for the same and different patients. The degrees of conformity were also examined for cases with and without calcifications for the same patient, to determine whether the proposed method could indicate the presence of calcifications. The average degrees of conformity and the standard deviations for the same and different patients were 0.911 +/- 0.0165 and 0.857 +/- 0.0328, respectively. The degrees of conformity calculated from abnormal cases (0.836 +/- 0.0906) were statistically lower than those measured from normal cases (0.911 +/- 0.0165). Our results indicated that histogram analysis of spatial frequency components could be useful as a similarity measure between bilateral mammograms for the same patient and as a sign of abnormality in a mammogram.
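The abstract does not give the exact conformity formula, so the sketch below uses a normalized correlation of 2-D Fourier magnitude spectra as a plausible stand-in: identical frequency content scores 1.0, and unrelated content scores lower. All names and the random test images are illustrative assumptions:

```python
import numpy as np

def spectrum(region):
    """Magnitude of the 2-D Fourier transform of an image region."""
    return np.abs(np.fft.fft2(region))

def conformity(a, b):
    """Normalized correlation of two magnitude spectra (1.0 = identical)."""
    sa, sb = spectrum(a).ravel(), spectrum(b).ravel()
    return float(sa @ sb / (np.linalg.norm(sa) * np.linalg.norm(sb)))

rng = np.random.default_rng(0)
right = rng.random((32, 32))
left_same = right.copy()           # same patient: matching texture
left_other = rng.random((32, 32))  # a different patient

same = conformity(right, left_same)    # close to 1.0
other = conformity(right, left_other)  # lower
```

Because the measure compares magnitude spectra rather than raw pixels, it is insensitive to the left/right mirroring between the two views, which is consistent with the bilateral comparison described in the study.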

  13. Highly sensitive image-derived indices of water-stressed plants using hyperspectral imaging in SWIR and histogram analysis.

    PubMed

    Kim, David M; Zhang, Hairong; Zhou, Haiying; Du, Tommy; Wu, Qian; Mockler, Todd C; Berezin, Mikhail Y

    2015-01-01

    The optical signature of leaves is an important monitoring and predictive parameter for a variety of biotic and abiotic stresses, including drought. Such signatures derived from spectroscopic measurements provide vegetation indices - a quantitative method for assessing plant health. However, the commonly used metrics suffer from low sensitivity. Relatively small changes in water content in moderately stressed plants demand high-contrast imaging to distinguish affected plants. We present a new approach in deriving sensitive indices using hyperspectral imaging in a short-wave infrared range from 800 nm to 1600 nm. Our method, based on high spectral resolution (1.56 nm) instrumentation and image processing algorithms (quantitative histogram analysis), enables us to distinguish a moderate water stress equivalent of 20% relative water content (RWC). The identified image-derived indices 15XX nm/14XX nm (i.e. 1529 nm/1416 nm) were superior to common vegetation indices, such as WBI, MSI, and NDWI, with significantly better sensitivity, enabling early diagnostics of plant health. PMID:26531782

  14. Vision-based drone flight control and crowd or riot analysis with efficient color histogram based tracking

    NASA Astrophysics Data System (ADS)

    Müller, Thomas; Müller, Markus

    2011-05-01

Object tracking is a direct or indirect key issue in many different military applications, such as visual surveillance, automatic visual closed-loop control of UAVs (unmanned aerial vehicles) and PTZ cameras, and crowd evaluation for detecting or analysing an emerging riot. High robustness is the most important property of the underlying tracker, but it becomes significantly harder to maintain as the tracker's permitted calculation time shrinks. In the UAV application introduced in this paper the tracker has to be extraordinarily quick. To optimize calculation time and robustness jointly as far as possible, a highly efficient tracking procedure is presented for the above application fields; it relies on well-known color histograms but uses them in a novel manner. The procedure is based on the calculation of a color weighting vector representing the significance of object colors, a kind of color fingerprint of the object. Several examples from the above military applications demonstrate the practical relevance and performance of the presented tracking approach.
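A color weighting vector of the kind described can be sketched as a ratio histogram: colors frequent in the object but rare in the surrounding background get high weights, and back-projecting those weights onto a frame scores candidate object pixels. This is a minimal single-channel illustration under assumed names and bin counts, not the paper's actual procedure:

```python
import numpy as np

def color_weight_vector(obj_pixels, bg_pixels, bins=8):
    """Weight per color bin: how significant a color is for the object
    relative to the surrounding background (a simple ratio histogram)."""
    h_obj, _ = np.histogram(obj_pixels, bins=bins, range=(0, 256))
    h_bg, _ = np.histogram(bg_pixels, bins=bins, range=(0, 256))
    h_obj = h_obj / max(h_obj.sum(), 1)
    h_bg = h_bg / max(h_bg.sum(), 1)
    return h_obj / (h_bg + 1e-6)

def back_project(image, weights, bins=8):
    """Score each pixel by the weight of its color bin."""
    idx = np.clip((image.astype(int) * bins) // 256, 0, bins - 1)
    return weights[idx]

# Toy example: object pixels are bright (~200), background dark (~50)
obj = np.full(100, 200)
bg = np.full(100, 50)
w = color_weight_vector(obj, bg)

frame = np.array([[50, 200],
                  [50, 50]])
scores = back_project(frame, w)  # high where object-like colors appear
```

Because the weight vector is tiny (one value per bin) and back-projection is a single table lookup per pixel, per-frame cost stays low, which matches the real-time constraint emphasized in the abstract.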

  15. Highly sensitive image-derived indices of water-stressed plants using hyperspectral imaging in SWIR and histogram analysis

    NASA Astrophysics Data System (ADS)

    Kim, David M.; Zhang, Hairong; Zhou, Haiying; Du, Tommy; Wu, Qian; Mockler, Todd C.; Berezin, Mikhail Y.

    2015-11-01

    The optical signature of leaves is an important monitoring and predictive parameter for a variety of biotic and abiotic stresses, including drought. Such signatures derived from spectroscopic measurements provide vegetation indices - a quantitative method for assessing plant health. However, the commonly used metrics suffer from low sensitivity. Relatively small changes in water content in moderately stressed plants demand high-contrast imaging to distinguish affected plants. We present a new approach in deriving sensitive indices using hyperspectral imaging in a short-wave infrared range from 800 nm to 1600 nm. Our method, based on high spectral resolution (1.56 nm) instrumentation and image processing algorithms (quantitative histogram analysis), enables us to distinguish a moderate water stress equivalent of 20% relative water content (RWC). The identified image-derived indices 15XX nm/14XX nm (i.e. 1529 nm/1416 nm) were superior to common vegetation indices, such as WBI, MSI, and NDWI, with significantly better sensitivity, enabling early diagnostics of plant health.

  16. O(1) time algorithms for computing histogram and Hough transform on a cross-bridge reconfigurable array of processors

    SciTech Connect

    Kao, T.; Horng, S.; Wang, Y.

    1995-04-01

Instead of the base-2 number system, we use a base-m number system to represent the numbers used in the proposed algorithms. Such a strategy can be used to design an O(T)-time, T = (log_m N) + 1, prefix-sum algorithm for an N-bit binary sequence on a cross-bridge reconfigurable array of processors using N processors, where the data bus is m bits wide. This basic operation can then be used to compute the histogram of an n x n image with G gray levels in constant time using G x n x n processors, and the Hough transform of an image with N edge pixels and an n x n parameter space in constant time using n x n x N processors, respectively. This result is better than the previously known results in the literature. The execution time of the proposed algorithms is also tunable via the bus bandwidth. 43 refs.
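For reference, the quantity the parallel algorithm computes in constant time is just the gray-level histogram of the image: the count of pixels at each of the G levels. A sequential sketch (the parallel O(1) version distributes these increments across G x n x n processors and combines them with the prefix-sum primitive; this loop is only the specification of the result):

```python
import numpy as np

def image_histogram(img, gray_levels):
    """Count of pixels at each of `gray_levels` gray levels -- the result
    the cross-bridge reconfigurable array computes in O(1) time."""
    hist = np.zeros(gray_levels, dtype=int)
    for v in img.ravel():
        hist[v] += 1
    return hist

img = np.array([[0, 1, 1],
                [2, 1, 0],
                [3, 3, 3]])
h = image_histogram(img, 4)
# h counts pixels per gray level: [2, 3, 1, 3]
```

The sequential version is O(n^2) in the number of pixels; the paper's contribution is removing that dependence by trading processors for time.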

  17. Adaptive sampler

    DOEpatents

    Watson, Bobby L.; Aeby, Ian

    1982-01-01

    An adaptive data compression device for compressing data having variable frequency content, including a plurality of digital filters for analyzing the content of the data over a plurality of frequency regions, a memory, and a control logic circuit for generating a variable rate memory clock corresponding to the analyzed frequency content of the data in the frequency region and for clocking the data into the memory in response to the variable rate memory clock.

  18. Adaptive sampler

    DOEpatents

    Watson, B.L.; Aeby, I.

    1980-08-26

An adaptive data compression device for compressing data having variable frequency content is described. The device includes a plurality of digital filters for analyzing the content of the data over a plurality of frequency regions, a memory, and a control logic circuit for generating a variable rate memory clock corresponding to the analyzed frequency content of the data in the frequency region and for clocking the data into the memory in response to the variable rate memory clock.

  19. Superchannel transmission system based on multi-channel equalization.

    PubMed

    Zeng, Tao

    2013-06-17

We propose a new method for superchannel transmission based on the newly proposed multi-channel equalization technique. This method allows us to realize tight channel spacing (equal to the baud rate) without using frequency-locked lasers or complex spectral shaping techniques at the transmitter. The inter-channel interference originating from the tight channel spacing is removed at the receiver by joint equalization of multiple adjacent channels. When the channel spacing is equal to the baud rate, our simulation results show that, with a conventional oversampling ratio (2 samples per symbol), realistic laser frequency offset, and realistic laser linewidth, the proposed multi-channel-equalization-based method can achieve better performance than the traditional method of spectral shaping plus single-channel equalization, albeit at the expense of a moderate increase in DSP complexity. The paper also gives a simple method to process the data after conventional chromatic dispersion compensation, which enables subsequent multi-channel equalization for long-haul transmissions. PMID:23787667

  20. Adaptive antennas

    NASA Astrophysics Data System (ADS)

    Barton, P.

    1987-04-01

The basic principles of adaptive antennas are outlined in terms of the Wiener-Hopf expression for maximizing signal-to-noise ratio in an arbitrary noise environment; the analogy with generalized matched-filter theory provides a useful aid to understanding. For many applications there is insufficient information to achieve this optimum solution, so non-optimum constrained null-steering algorithms are also described, together with a summary of methods for preventing wanted signals from being nulled by the adaptive system. The three generic approaches to adaptive weight control are discussed: correlation steepest descent, weight perturbation, and direct solutions based on sample matrix inversion. The tradeoffs between hardware complexity and performance, in terms of null depth and convergence rate, are outlined. The sidelobe canceller technique is described. Performance variation with jammer power and angular distribution is summarized and the key performance limitations are identified. The configuration and performance characteristics of both multiple-beam and phase-scan array antennas are covered, with a brief discussion of performance factors.
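The Wiener-Hopf solution referred to above gives the optimum weight vector w = R^-1 s (up to a scale factor), where R is the noise-plus-interference covariance and s the steering vector toward the wanted signal; in sample matrix inversion, R is estimated from array snapshots. A small numerical sketch under assumed array parameters (element count, jammer angle, and powers are illustrative only):

```python
import numpy as np

def optimal_weights(R, s):
    """Wiener-Hopf weights w = R^-1 s (up to scale), maximizing output
    signal-to-noise ratio for steering vector s and noise covariance R."""
    return np.linalg.solve(R, s)

# 4-element array: unit white noise plus a strong jammer from one direction
n = 4
angle = 0.3  # electrical angle of the jammer (radians per element)
jam = np.exp(1j * angle * np.arange(n))
R = np.eye(n) + 100.0 * np.outer(jam, jam.conj())

s = np.ones(n, dtype=complex)  # broadside look direction
w = optimal_weights(R, s)

jam_gain = abs(w.conj() @ jam)  # response toward the jammer: deeply nulled
sig_gain = abs(w.conj() @ s)    # response toward the wanted signal
```

The weights automatically place a null on the jammer while preserving gain in the look direction, which is exactly the behavior the steepest-descent and perturbation methods converge toward without forming R^-1 explicitly.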