Science.gov

Sample records for adaptive histogram equalization

  1. Osteoarthritis classification using self organizing map based on Gabor kernel and contrast-limited adaptive histogram equalization.

    PubMed

    Anifah, Lilik; Purnama, I Ketut Eddy; Hariadi, Mochamad; Purnomo, Mauridhi Hery

    2013-01-01

    Localization is the first step in osteoarthritis (OA) classification. Manual classification, however, is time-consuming, tedious, and expensive. The proposed system is designed as a decision support system for medical doctors to classify the severity of knee OA. A method is proposed here to localize the joint space area and then classify OA into KL-Grade 0, KL-Grade 1, KL-Grade 2, KL-Grade 3, or KL-Grade 4 in four steps: preprocessing, segmentation, feature extraction, and classification. In this system, right and left knee detection was performed using Contrast-Limited Adaptive Histogram Equalization (CLAHE) and template matching. The Gabor kernel, row sum graph, and moment methods were used to localize the joint space area of the knee. CLAHE is used in the preprocessing step, i.e., to normalize the varied intensities. The segmentation process was conducted using the Gabor kernel, template matching, row sum graph, and gray-level center-of-mass method. GLCM features (contrast, correlation, energy, and homogeneity) were employed as training data. Overall, 50 samples were used for training and 258 for testing. Experimental results showed the best performance using a Gabor kernel with parameters α=8, θ=0, Ψ=[0 π/2], γ=0.8, N=4 and, for the classification process, 5000 iterations, a momentum value of 0.5, and α0=0.6. The run gave classification accuracy rates of 93.8% for KL-Grade 0, 70% for KL-Grade 1, 4% for KL-Grade 2, 10% for KL-Grade 3, and 88.9% for KL-Grade 4.

  2. Osteoarthritis Classification Using Self Organizing Map Based on Gabor Kernel and Contrast-Limited Adaptive Histogram Equalization

    PubMed Central

    Anifah, Lilik; Purnama, I Ketut Eddy; Hariadi, Mochamad; Purnomo, Mauridhi Hery

    2013-01-01

    Localization is the first step in osteoarthritis (OA) classification. Manual classification, however, is time-consuming, tedious, and expensive. The proposed system is designed as a decision support system for medical doctors to classify the severity of knee OA. A method is proposed here to localize the joint space area and then classify OA into KL-Grade 0, KL-Grade 1, KL-Grade 2, KL-Grade 3, or KL-Grade 4 in four steps: preprocessing, segmentation, feature extraction, and classification. In this system, right and left knee detection was performed using Contrast-Limited Adaptive Histogram Equalization (CLAHE) and template matching. The Gabor kernel, row sum graph, and moment methods were used to localize the joint space area of the knee. CLAHE is used in the preprocessing step, i.e., to normalize the varied intensities. The segmentation process was conducted using the Gabor kernel, template matching, row sum graph, and gray-level center-of-mass method. GLCM features (contrast, correlation, energy, and homogeneity) were employed as training data. Overall, 50 samples were used for training and 258 for testing. Experimental results showed the best performance using a Gabor kernel with parameters α=8, θ=0, Ψ=[0 π/2], γ=0.8, N=4 and, for the classification process, 5000 iterations, a momentum value of 0.5, and α0=0.6. The run gave classification accuracy rates of 93.8% for KL-Grade 0, 70% for KL-Grade 1, 4% for KL-Grade 2, 10% for KL-Grade 3, and 88.9% for KL-Grade 4. PMID:23525188

  3. Adaptive equalization

    NASA Astrophysics Data System (ADS)

    Qureshi, S. U. H.

    1985-09-01

    Theoretical work which has been effective in improving data transmission by telephone and radio links using adaptive equalization (AE) techniques is reviewed. AE has been applied to reducing the temporal dispersion effects, such as intersymbol interference, caused by the channel accessed. Attention is given to the Nyquist telegraph transmission theory, least mean square error adaptive filtering and the theory and structure of linear receive and transmit filters for reducing error. Optimum nonlinear receiver structures are discussed in terms of optimality criteria as a function of error probability. A suboptimum receiver structure is explored in the form of a decision-feedback equalizer. Consideration is also given to quadrature amplitude modulation and transversal equalization for receivers.

  4. Contrast enhancement via texture region based histogram equalization

    NASA Astrophysics Data System (ADS)

    Singh, Kuldeep; Vishwakarma, Dinesh K.; Singh Walia, Gurjit; Kapoor, Rajiv

    2016-08-01

    This paper presents two novel contrast enhancement approaches using texture-region-based histogram equalization (HE). In HE-based contrast enhancement methods, the enhanced image often contains undesirable artefacts because an excessive number of pixels in non-textured areas heavily bias the histogram. The novel idea presented in this paper is to suppress the impact of pixels in non-textured areas and to exploit texture features when computing the histogram for HE. The first algorithm, Dominant Orientation-based Texture Histogram Equalization (DOTHE), constructs the histogram of the image using only those image patches that have a dominant orientation. DOTHE categorizes image patches into smooth, dominant-orientation, or non-dominant-orientation patches using the image variance and singular value decomposition, and uses only dominant-orientation patches in the HE process. The second method, Edge-based Texture Histogram Equalization, detects significant edges in the image and constructs the histogram using the grey levels present in the neighbourhood of those edges. The cumulative distribution function of the histogram formed from texture features is mapped onto the entire dynamic range of the input image to produce the contrast-enhanced image. Subjective as well as objective performance assessments of the proposed methods are conducted and compared with other existing HE methods. The assessment, in terms of visual quality, contrast improvement index, entropy, and measure of enhancement, reveals that the proposed methods outperform the existing HE methods.
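    The edge-based variant described above can be sketched compactly: build the histogram only from gray levels at significant edges, then stretch its cumulative distribution over the full dynamic range. The 4x4 test image, finite-difference gradient, and edge threshold below are illustrative assumptions, not the paper's exact choices.

```python
def edge_based_equalize(img, levels=8, edge_thresh=2):
    """Equalize a small grayscale image using only edge-pixel gray levels."""
    h, w = len(img), len(img[0])
    hist = [0] * levels
    for y in range(h):
        for x in range(w):
            # Crude horizontal/vertical gradient magnitudes at (y, x).
            gx = abs(img[y][x] - img[y][x - 1]) if x else 0
            gy = abs(img[y][x] - img[y - 1][x]) if y else 0
            if max(gx, gy) >= edge_thresh:   # pixel lies on an edge
                hist[img[y][x]] += 1
    # CDF of the edge-only histogram, mapped to the full range.
    total = sum(hist) or 1
    run, lut = 0, []
    for hv in hist:
        run += hv
        lut.append(round((levels - 1) * run / total))
    return [[lut[p] for p in row] for row in img]

img = [[0, 0, 4, 4]] * 4          # a vertical step edge down the middle
print(edge_based_equalize(img))
```

Because flat regions contribute nothing to the histogram, the step between the two plateaus is stretched across the whole range.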

  5. Histogram Equalization with Variable Enhancement Degree for Preserving Mean Brightness

    NASA Astrophysics Data System (ADS)

    Kawakami, Takashi; Murahira, Kota; Taguchi, Akira

    Histogram equalization (HE) is one of the common methods used for improving contrast in digital images. However, this technique causes a fluctuation of mean brightness, which leads to flicker in video signals. In order to preserve the mean brightness, the dynamic histogram equalization (DHE) has been proposed. In this letter, we propose a novel DHE called the DHE with variable enhancement degree (DHEwVED). This method can change from DHE to HE by tuning a single parameter. We also show the effectiveness of the proposed method.
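    The single-parameter idea can be sketched by blending the identity mapping with the full HE mapping through one weight alpha (0 = no change, 1 = plain HE). This linear blend is an illustrative assumption standing in for the letter's exact DHEwVED formulation.

```python
def he_with_degree(pixels, levels=8, alpha=0.5):
    """Blend identity and histogram-equalization mappings via alpha."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    run, lut = 0, []
    for h in hist:
        run += h
        lut.append((levels - 1) * run / len(pixels))  # plain HE mapping
    # One parameter slides the output between the input and full HE.
    return [round((1 - alpha) * p + alpha * lut[p]) for p in pixels]
```

At alpha=0 the image is untouched (so mean brightness is exactly preserved); raising alpha trades brightness preservation for contrast.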

  6. Contrast Enhancement Algorithm Based on Gap Adjustment for Histogram Equalization

    PubMed Central

    Chiu, Chung-Cheng; Ting, Chih-Chung

    2016-01-01

    Image enhancement methods have been widely used to improve the visual effects of images. Owing to its simplicity and effectiveness, histogram equalization (HE) is one of the methods used for enhancing image contrast. However, HE may result in over-enhancement and feature-loss problems that lead to an unnatural look and loss of details in the processed images. Researchers have proposed various HE-based methods to solve the over-enhancement problem; however, they have largely ignored the feature-loss problem. Therefore, a contrast enhancement algorithm based on gap adjustment for histogram equalization (CegaHE) is proposed. It builds on the visual contrast enhancement algorithm based on histogram equalization (VCEA), which generates visually pleasing enhanced images, and improves on VCEA's enhancement effects. CegaHE adjusts the gaps between two gray values based on an adjustment equation that takes the properties of human visual perception into consideration, to solve the over-enhancement problem. In addition, it alleviates the feature-loss problem and further enhances textures in the dark regions of the images to improve the quality of the processed images for human visual perception. Experimental results demonstrate that CegaHE is a reliable method for contrast enhancement and that it significantly outperforms VCEA and other methods. PMID:27338412

  7. Texture enhanced histogram equalization using TV-L¹ image decomposition.

    PubMed

    Ghita, Ovidiu; Ilea, Dana E; Whelan, Paul F

    2013-08-01

    Histogram transformation defines a class of image processing operations that are widely applied in the implementation of data normalization algorithms. We present a new variational approach to image enhancement, constructed to alleviate the intensity saturation effects introduced by standard contrast enhancement (CE) methods based on histogram equalization. We first apply total variation (TV) minimization with an L¹ fidelity term to decompose the input image into cartoon and texture components. Contrary to previous works that rely solely on the distribution of the intensity information, the texture information is also employed here to emphasize the contribution of local textural features in the CE process. This is achieved by implementing a nonlinear histogram-warping CE strategy that maximizes the information content in the transformed image. Our experimental study addresses the CE of a wide variety of image data, and comparative evaluations illustrate that our method produces better results than conventional CE strategies.

  8. Infrared image gray adaptive adjusting enhancement algorithm based on gray redundancy histogram-dealing technique

    NASA Astrophysics Data System (ADS)

    Hao, Zi-long; Liu, Yong; Chen, Ruo-wang

    2016-11-01

    To improve on the histogram equalization algorithms used for image enhancement in digital image processing, an infrared image gray adaptive adjusting enhancement algorithm based on a gray-redundancy histogram-dealing technique is proposed. The algorithm first determines the overall gray value of the image, raises or lowers that overall gray value by adding appropriate gray points, and then uses the gray-level-redundancy HE method to compress the gray scale of the image. The algorithm can enhance image detail information. Through MATLAB simulation, this paper compares the algorithm with the histogram equalization method and with the algorithm based on the gray-redundancy histogram-dealing technique, and verifies the effectiveness of the proposed algorithm.

  9. Infrared image enhancement based on atmospheric scattering model and histogram equalization

    NASA Astrophysics Data System (ADS)

    Li, Yi; Zhang, Yunfeng; Geng, Aihui; Cao, Lihua; Chen, Juan

    2016-09-01

    Infrared images are fuzzy due to the special imaging technology of infrared sensors. In order to achieve contrast enhancement and recover clear edge details from a fuzzy infrared image, we propose an efficient enhancement method based on an atmospheric scattering model and histogram equalization. The algorithm optimizes and improves the visual-image haze-removal method, adapting it to the characteristics of fuzzy infrared images. First, an average filtering operation is applied to obtain a coarse estimate of the transmission rate. Then a defuzzed image is obtained through a self-adaptive transmission rate calculated from the statistics of the original infrared image. Finally, to deal with the low lighting of the defuzzed image, we propose a sectional plateau histogram equalization method capable of background suppression. Experimental results show that the performance and efficiency of the proposed algorithm are satisfactory compared to four other algorithms, in both subjective observation and objective quantitative evaluation. In addition, the proposed algorithm is competent to enhance infrared images for different applications under different circumstances.
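    The scattering-model step above follows the standard haze formation model I = J·t + A(1 - t), inverted as J = (I - A)/t + A once the transmission t is estimated. The sketch below works on a 1-D signal for brevity; the mean-filter window, atmospheric light A, and the transmission formula t = 1 - w·mean(I)/A are illustrative assumptions, not the paper's exact estimator.

```python
def dehaze_1d(signal, A=255.0, w=0.95, win=3):
    """Invert the scattering model with a mean-filtered transmission estimate."""
    n, out = len(signal), []
    for i in range(n):
        lo, hi = max(0, i - win // 2), min(n, i + win // 2 + 1)
        mean = sum(signal[lo:hi]) / (hi - lo)   # coarse local estimate
        t = max(0.1, 1.0 - w * mean / A)        # self-adaptive transmission
        out.append((signal[i] - A) / t + A)     # recovered radiance J
    return out
```

Dividing by a transmission below 1 stretches the signal away from the atmospheric light, which is why the recovered output then needs the low-light correction the paper addresses with plateau histogram equalization.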

  10. Adaptive edge histogram descriptor for landmine detection using GPR

    NASA Astrophysics Data System (ADS)

    Frigui, Hichem; Fadeev, Aleksey; Karem, Andrew; Gader, Paul

    2009-05-01

    The Edge Histogram Detector (EHD) is a landmine detection algorithm for sensor data generated by ground penetrating radar (GPR). It uses edge histograms for feature extraction and a possibilistic K-Nearest Neighbors (K-NN) rule for confidence assignment. To reduce the computational complexity of the EHD and improve its generalization, the K-NN classifier uses a few prototypes that capture the variations of the signatures within each class. Each of these prototypes is assigned a label in the class of mines and a label in the class of clutter to capture its degree of sharing between these classes. The EHD has been tested extensively. It has demonstrated excellent performance on large real-world data sets, and has been implemented in real-time versions in hand-held and vehicle-mounted GPR. In this paper, we propose two modifications to the EHD to improve its performance and adaptability. First, instead of using a fixed threshold to decide whether the edge at a certain location is strong enough, we use an adaptive threshold that is learned from the background surrounding the target. This modification makes the EHD more adaptive to different terrains and to mines buried at different depths. Second, we introduce an additional training component that tunes the prototype features and labels to different environments. Results on large and diverse GPR data collections show that the proposed adaptive EHD outperforms the baseline EHD. We also show that the edge threshold can vary significantly according to the edge type, alarm depth, and soil conditions.
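    The first modification can be sketched simply: instead of a fixed cutoff, derive the edge threshold from the statistics of the background surrounding the target. The mean-plus-k-standard-deviations rule and the value of k are illustrative assumptions, not the paper's learned rule.

```python
import statistics

def adaptive_edge_threshold(background, k=2.0):
    """Edge strength must exceed the local background mean by k std devs."""
    return statistics.mean(background) + k * statistics.pstdev(background)

def strong_edges(edges, background, k=2.0):
    """Keep only edge responses above the background-derived threshold."""
    thr = adaptive_edge_threshold(background, k)
    return [e for e in edges if e > thr]
```

A noisy background raises the threshold automatically, which is how such a rule adapts across terrains and burial depths.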

  11. Fast autodidactic adaptive equalization algorithms

    NASA Astrophysics Data System (ADS)

    Hilal, Katia

    Autodidactic (blind) equalization by adaptive filtering is addressed in a mobile radio communication context. A general method, using an adaptive stochastic-gradient Bussgang-type algorithm, is given to derive two low-computation-cost algorithms: one equivalent to the initial algorithm and the other having improved convergence properties thanks to a block criterion minimization. Two starting algorithms are reworked: the Godard algorithm and the decision-controlled algorithm. Using a normalization procedure and block normalization, their performances are improved and their common points are identified. These common points are used to propose an algorithm retaining the advantages of the two initial algorithms: it inherits the robustness of the Godard algorithm and the precision and phase correction of the decision-controlled algorithm. The work is completed by a study of the stable states of Bussgang-type algorithms and of the stability of the initial and normalized Godard algorithms. Simulations of these algorithms, carried out in a mobile radio communications context under severe propagation-channel conditions, gave a 75% reduction in the number of samples required for processing relative to the initial algorithms. The improvement in residual error was much smaller. These performances come close to making autodidactic equalization usable in mobile radio systems.

  12. From image processing to computational neuroscience: a neural model based on histogram equalization

    PubMed Central

    Bertalmío, Marcelo

    2014-01-01

    There are many ways in which the human visual system works to reduce the inherent redundancy of the visual information in natural scenes, coding it in an efficient way. The non-linear response curves of photoreceptors and the spatial organization of the receptive fields of visual neurons both work toward this goal of efficient coding. A related, very important aspect is that of the existence of post-retinal mechanisms for contrast enhancement that compensate for the blurring produced in early stages of the visual process. And alongside mechanisms for coding and wiring efficiency, there is neural activity in the human visual cortex that correlates with the perceptual phenomenon of lightness induction. In this paper we propose a neural model that is derived from an image processing technique for histogram equalization, and that is able to deal with all the aspects just mentioned: this new model is able to predict lightness induction phenomena, and improves the efficiency of the representation by flattening both the histogram and the power spectrum of the image signal. PMID:25100983

  13. Adaptive gamma correction based on cumulative histogram for enhancing near-infrared images

    NASA Astrophysics Data System (ADS)

    Huang, Zhenghua; Zhang, Tianxu; Li, Qian; Fang, Hao

    2016-11-01

    Histogram-based methods have proven their ability in image enhancement. To improve low contrast while preserving details and high brightness in near-infrared images, a novel method called adaptive gamma correction based on cumulative histogram (AGCCH) is studied in this paper. This image enhancement method improves the contrast of local pixels through adaptive gamma correction (AGC), formed by incorporating a cumulative histogram or cumulative sub-histogram into the weighting distribution. Experimental results demonstrate, both qualitatively and quantitatively, that the proposed AGCCH method performs well in brightness preservation, contrast enhancement, and detail preservation, and that it is superior to previous state-of-the-art methods.
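    For orientation, gamma correction weighted by a cumulative histogram can be sketched as below: each gray level l is remapped with an exponent that shrinks as its cumulative probability grows, lifting frequent dark levels. The exponent 1 - cdf(l) follows the well-known AGCWD formulation and is an illustrative assumption, not necessarily the paper's exact weighting.

```python
def adaptive_gamma(pixels, levels=256):
    """Remap gray levels with a per-level gamma of 1 - cdf(level)."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    run, n, lut = 0, len(pixels), []
    for l in range(levels):
        run += hist[l]
        cdf = run / n
        # Frequent levels get exponents near 0, i.e. a strong lift.
        lut.append(round((levels - 1) * (l / (levels - 1)) ** (1.0 - cdf)))
    return [lut[p] for p in pixels]
```

The extremes of the range map to themselves, while mid-range levels are pushed up in proportion to how much of the histogram mass lies below them.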

  14. High-performance thresholding with adaptive equalization

    NASA Astrophysics Data System (ADS)

    Lam, Ka Po

    1998-09-01

    The ability to simplify an image whilst retaining such crucial information as shapes and geometric structures is of great importance for real-time image analysis applications. Here the technique of binary thresholding, which reduces image complexity, has generally been regarded as one of the most valuable methods, primarily owing to its ease of design and analysis. This paper studies the state of developments in the field and describes a radically different approach to adaptive thresholding. The latter employs the analytical technique of histogram normalization to facilitate an optimal `contrast level' for the image under consideration. A suitable criterion is also developed to determine the applicability of the adaptive processing procedure. In terms of performance and computational complexity, the proposed algorithm compares favorably with five established image thresholding methods selected for this study. Experimental results show that the new algorithm outperforms these methods on a number of important error measures, including a consistently low visual classification error. The simplicity of the algorithm's design also lends itself to efficient parallel implementations.

  15. Adaptive sigmoid function bihistogram equalization for image contrast enhancement

    NASA Astrophysics Data System (ADS)

    Arriaga-Garcia, Edgar F.; Sanchez-Yanez, Raul E.; Ruiz-Pinales, Jose; Garcia-Hernandez, Ma. de Guadalupe

    2015-09-01

    Contrast enhancement plays a key role in a wide range of applications including consumer electronic applications, such as video surveillance, digital cameras, and televisions. The main goal of contrast enhancement is to increase the quality of images. However, most state-of-the-art methods induce different types of distortion such as intensity shift, wash-out, noise, intensity burn-out, and intensity saturation. In addition, in consumer electronics, simple and fast methods are required in order to be implemented in real time. A bihistogram equalization method based on adaptive sigmoid functions is proposed. It consists of splitting the image histogram into two parts that are equalized independently by using adaptive sigmoid functions. In order to preserve the mean brightness of the input image, the parameter of the sigmoid functions is chosen to minimize the absolute mean brightness metric. Experiments on the Berkeley database have shown that the proposed method improves the quality of images and preserves their mean brightness. An application to improve the colorfulness of images is also presented.
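    The bihistogram step described above can be sketched as follows: split the histogram at the mean gray level and equalize each half over its own sub-range, which tends to preserve mean brightness. Plain per-half HE stands in for the paper's adaptive sigmoid mapping, so this is a BBHE-like sketch rather than the exact method.

```python
def bihistogram_equalize(pixels, levels=8):
    """Split at the mean and equalize each half onto its own sub-range."""
    mean = sum(pixels) / len(pixels)
    low = sorted(set(p for p in pixels if p <= mean))
    high = sorted(set(p for p in pixels if p > mean))

    def sub_lut(vals, lo, hi, data):
        # Equalize the gray levels in `vals` onto the range [lo, hi].
        counts = [sum(1 for d in data if d == v) for v in vals]
        total, run, lut = sum(counts), 0, {}
        for v, c in zip(vals, counts):
            run += c
            lut[v] = round(lo + (hi - lo) * run / total)
        return lut

    lut = sub_lut(low, 0, int(mean), pixels)
    lut.update(sub_lut(high, int(mean) + 1, levels - 1, pixels))
    return [lut[p] for p in pixels]
```

Because dark pixels can never be mapped above the split point (and vice versa), the output mean stays close to the input mean, unlike global HE.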

  16. Differentially Private Histogram Publication For Dynamic Datasets: An Adaptive Sampling Approach.

    PubMed

    Li, Haoran; Jiang, Xiaoqian; Xiong, Li; Liu, Jinfei

    2015-10-01

    Differential privacy has recently become a de facto standard for private statistical data release. Many algorithms have been proposed to generate differentially private histograms or synthetic data. However, most of them focus on "one-time" release of a static dataset and do not adequately address the increasing need to release series of dynamic datasets in real time. A straightforward application of existing histogram methods to each snapshot of such dynamic datasets will incur high accumulated error due to the composability of differential privacy and the correlations or overlapping users between snapshots. In this paper, we address the problem of releasing series of dynamic datasets in real time with differential privacy, using a novel adaptive distance-based sampling approach. Our first method, DSFT, uses a fixed distance threshold and releases a differentially private histogram only when the current snapshot is sufficiently different from the previous one, i.e., with a distance greater than a predefined threshold. Our second method, DSAT, further improves DSFT and uses a dynamic threshold adaptively adjusted by a feedback control mechanism to capture the data dynamics. Extensive experiments on real and synthetic datasets demonstrate that our approach achieves better utility than baseline methods and existing state-of-the-art methods.
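    A toy sketch of the DSFT strategy: release a Laplace-noised histogram only when the new snapshot differs from the last released one by more than a fixed distance. The L1 distance, epsilon, and threshold values are illustrative assumptions; a real deployment must also budget epsilon across the sequence of releases.

```python
import math
import random

def laplace_noise(scale):
    # Inverse-CDF sampling of a Laplace(0, scale) variate.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def release_if_changed(snapshot, last_released, epsilon=1.0, threshold=5):
    """Return a noisy histogram if L1 distance exceeds threshold, else None."""
    dist = sum(abs(a - b) for a, b in zip(snapshot, last_released))
    if dist <= threshold:
        return None                      # skip: reuse the previous release
    return [c + laplace_noise(1.0 / epsilon) for c in snapshot]
```

Skipping small-distance snapshots is what limits the accumulated error: unchanged data consumes no additional privacy budget.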

  17. Differentially Private Histogram Publication For Dynamic Datasets: An Adaptive Sampling Approach

    PubMed Central

    Li, Haoran; Jiang, Xiaoqian; Xiong, Li; Liu, Jinfei

    2016-01-01

    Differential privacy has recently become a de facto standard for private statistical data release. Many algorithms have been proposed to generate differentially private histograms or synthetic data. However, most of them focus on "one-time" release of a static dataset and do not adequately address the increasing need to release series of dynamic datasets in real time. A straightforward application of existing histogram methods to each snapshot of such dynamic datasets will incur high accumulated error due to the composability of differential privacy and the correlations or overlapping users between snapshots. In this paper, we address the problem of releasing series of dynamic datasets in real time with differential privacy, using a novel adaptive distance-based sampling approach. Our first method, DSFT, uses a fixed distance threshold and releases a differentially private histogram only when the current snapshot is sufficiently different from the previous one, i.e., with a distance greater than a predefined threshold. Our second method, DSAT, further improves DSFT and uses a dynamic threshold adaptively adjusted by a feedback control mechanism to capture the data dynamics. Extensive experiments on real and synthetic datasets demonstrate that our approach achieves better utility than baseline methods and existing state-of-the-art methods. PMID:26973795

  18. Adaptive local backlight dimming algorithm based on local histogram and image characteristics

    NASA Astrophysics Data System (ADS)

    Nadernejad, Ehsan; Burini, Nino; Korhonen, Jari; Forchhammer, Søren; Mantel, Claire

    2013-02-01

    Liquid crystal displays (LCDs) with light-emitting diode (LED) backlights are a very popular display technology, used for instance in television sets, monitors, and mobile phones. This paper presents a new backlight dimming algorithm that exploits characteristics of the target image, such as the local histograms and the average pixel intensity of each backlight segment, to reduce the power consumption of the backlight and enhance image quality. The local histogram of the pixels within each backlight segment is calculated and, based on it, an adaptive quantile value is extracted. The segments are classified into three classes based on average luminance and, depending on the image luminance class, the extracted local-histogram information determines the corresponding backlight value. The proposed method has been applied to two modeled screens: one with a high-resolution direct-lit backlight, and one with 16 edge-lit backlight segments arranged in two columns and eight rows. We have compared the proposed algorithm against several known backlight dimming algorithms through simulations; the results show that the proposed algorithm provides a better trade-off between power consumption and image-quality preservation than the other state-of-the-art feature-based backlight algorithms.
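    The per-segment quantile step can be sketched as below: a segment's backlight level is taken as a quantile of its local pixel histogram, with the quantile chosen by the segment's average-luminance class. The three class boundaries and the quantile values are illustrative assumptions, not the paper's tuned parameters.

```python
def backlight_level(segment_pixels):
    """Pick a backlight level as a luminance-class-dependent quantile."""
    avg = sum(segment_pixels) / len(segment_pixels)
    # Darker segments use a lower quantile -> more aggressive dimming.
    q = 0.6 if avg < 85 else 0.8 if avg < 170 else 0.95
    ordered = sorted(segment_pixels)
    return ordered[min(len(ordered) - 1, int(q * len(ordered)))]
```

Choosing a quantile below the maximum saves power at the cost of clipping the brightest pixels in the segment, which is exactly the trade-off the simulations measure.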

  19. Wavelength-adaptive dehazing using histogram merging-based classification for UAV images.

    PubMed

    Yoon, Inhye; Jeong, Seokhwa; Jeong, Jaeheon; Seo, Doochun; Paik, Joonki

    2015-03-19

    Since incoming light to an unmanned aerial vehicle (UAV) platform can be scattered by haze and dust in the atmosphere, the acquired image loses the original color and brightness of the subject. Enhancement of hazy images is an important task in improving the visibility of various UAV images. This paper presents a spatially-adaptive dehazing algorithm that merges color histograms with consideration of the wavelength-dependent atmospheric turbidity. Based on the wavelength-adaptive hazy image acquisition model, the proposed dehazing algorithm consists of three steps: (i) image segmentation based on geometric classes; (ii) generation of the context-adaptive transmission map; and (iii) intensity transformation for enhancing a hazy UAV image. The major contribution of the research is a novel hazy UAV image degradation model by considering the wavelength of light sources. In addition, the proposed transmission map provides a theoretical basis to differentiate visually important regions from others based on the turbidity and merged classification results.

  20. Investigation on improved infrared image detail enhancement algorithm based on adaptive histogram statistical stretching and gradient filtering

    NASA Astrophysics Data System (ADS)

    Zeng, Bangze; Zhu, Youpan; Li, Zemin; Hu, Dechao; Luo, Lin; Zhao, Deli; Huang, Juan

    2014-11-01

    Due to the low contrast, heavy noise, and unclear visual effect of infrared images, targets are very difficult to observe and identify. This paper presents an improved infrared image detail enhancement algorithm based on adaptive histogram statistical stretching and gradient filtering (AHSS-GF). Based on the fact that the human eye is very sensitive to edges and lines, the authors propose extracting the details and textures using gradient filtering. A new histogram is acquired by summing the original histogram over a fixed window. Using the minimum value as the cut-off point, histogram statistical stretching is carried out. After proper weights are given to the details and the background, the detail-enhanced result is acquired. The results indicate that image contrast can be improved and that details and textures can be enhanced effectively as well.

  21. Adaptive Bacteria Colony Picking in Unstructured Environments Using Intensity Histogram and Unascertained LS-SVM Classifier

    PubMed Central

    Zhang, Kun; Fei, Minrui; Li, Xin

    2014-01-01

    Feature analysis is an important task that can significantly affect the performance of automatic bacteria colony picking. Unstructured environments also complicate automatic colony screening. This paper presents a novel approach to adaptive colony segmentation in unstructured environments that treats the detected peaks of intensity histograms as a morphological feature of images. To avoid spurious peaks, an entropy-based mean-shift filter is introduced to smooth images as a preprocessing step. The relevance and importance of these features are determined in an improved support vector machine classifier using unascertained least-squares estimation. Experimental results show that the proposed unascertained least-squares support vector machine (ULSSVM) has better recognition accuracy than other state-of-the-art techniques, and its training takes less time than most of the traditional approaches considered in this paper. PMID:24955423

  22. Adaptive Kalman filtering for histogram-based appearance learning in infrared imagery.

    PubMed

    Venkataraman, Vijay; Fan, Guoliang; Havlicek, Joseph P; Fan, Xin; Zhai, Yan; Yeary, Mark B

    2012-11-01

    Targets of interest in video acquired from imaging infrared sensors often exhibit profound appearance variations due to a variety of factors, including complex target maneuvers, ego-motion of the sensor platform, background clutter, etc., making it difficult to maintain a reliable detection process and track lock over extended time periods. Two key issues in overcoming this problem are how to represent the target and how to learn its appearance online. In this paper, we adopt a recent appearance model that estimates the pixel intensity histograms as well as the distribution of local standard deviations in both the foreground and background regions for robust target representation. Appearance learning is then cast as an adaptive Kalman filtering problem where the process and measurement noise variances are both unknown. We formulate this problem using both covariance matching and, for the first time in a visual tracking application, the recent autocovariance least-squares (ALS) method. Although convergence of the ALS algorithm is guaranteed only for the case of globally wide sense stationary process and measurement noises, we demonstrate for the first time that the technique can often be applied with great effectiveness under the much weaker assumption of piecewise stationarity. The performance advantages of the ALS method relative to the classical covariance matching are illustrated by means of simulated stationary and nonstationary systems. Against real data, our results show that the ALS-based algorithm outperforms the covariance matching as well as the traditional histogram similarity-based methods, achieving sub-pixel tracking accuracy against the well-known AMCOM closure sequences and the recent SENSIAC automatic target recognition dataset.

  23. Medical image classification using spatial adjacent histogram based on adaptive local binary patterns.

    PubMed

    Liu, Dong; Wang, Shengsheng; Huang, Dezhi; Deng, Gang; Zeng, Fantao; Chen, Huiling

    2016-05-01

    Medical image recognition is an important task in both computer vision and computational biology. In the field of medical image classification, representing an image with a local binary patterns (LBP) descriptor has become popular. However, most existing LBP-based methods encode the binary patterns in a fixed neighborhood radius and ignore the spatial relationships among local patterns. Ignoring these spatial relationships causes poor performance when capturing discriminative features for complex samples, such as medical images obtained by microscope. To address this problem, in this paper we propose a novel method that improves local binary patterns by assigning an adaptive neighborhood radius to each pixel. Based on these adaptive local binary patterns, we further propose a spatial adjacent histogram strategy to encode the micro-structures for image representation. An extensive set of evaluations is performed on four medical datasets, which shows that the proposed method significantly improves standard LBP and compares favorably with several other prevailing approaches.
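    For orientation, the standard fixed-radius LBP code that the paper generalizes is sketched below: each pixel is encoded by thresholding its eight neighbors against the center. The adaptive-radius selection and the spatial adjacent histogram are the paper's contributions and are not reproduced here.

```python
def lbp_code(img, y, x):
    """Standard 8-neighbor LBP code for the pixel at (y, x)."""
    c = img[y][x]
    # Clockwise from top-left; a bit is set when neighbor >= center.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = 0
    for bit, (dy, dx) in enumerate(offsets):
        if img[y + dy][x + dx] >= c:
            code |= 1 << bit
    return code
```

A histogram of these codes over an image region is the usual LBP descriptor; the paper's variant additionally lets the neighborhood radius vary per pixel and records spatial adjacency between codes.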

  4. Coding and decoding with adapting neurons: a population approach to the peri-stimulus time histogram.

    PubMed

    Naud, Richard; Gerstner, Wulfram

    2012-01-01

    The response of a neuron to a time-dependent stimulus, as measured in a Peri-Stimulus-Time-Histogram (PSTH), exhibits an intricate temporal structure that reflects potential temporal coding principles. Here we analyze the encoding and decoding of PSTHs for spiking neurons with arbitrary refractoriness and adaptation. As a modeling framework, we use the spike response model, also known as the generalized linear neuron model. Because of refractoriness, the effect of the most recent spike on the spiking probability a few milliseconds later is very strong. The influence of the last spike needs therefore to be described with high precision, while the rest of the neuronal spiking history merely introduces an average self-inhibition or adaptation that depends on the expected number of past spikes but not on the exact spike timings. Based on these insights, we derive a 'quasi-renewal equation' which is shown to yield an excellent description of the firing rate of adapting neurons. We explore the domain of validity of the quasi-renewal equation and compare it with other rate equations for populations of spiking neurons. The problem of decoding the stimulus from the population response (or PSTH) is addressed analogously. We find that for small levels of activity and weak adaptation, a simple accumulator of the past activity is sufficient to decode the original input, but when refractory effects become large decoding becomes a non-linear function of the past activity. The results presented here can be applied to the mean-field analysis of coupled neuron networks, but also to arbitrary point processes with negative self-interaction.

  5. Adapting histogram for automatic noise data removal in building interior point cloud data

    NASA Astrophysics Data System (ADS)

    Shukor, S. A. Abdul; Rushforth, E. J.

    2015-05-01

    3D point cloud data is now preferred by researchers for generating 3D models, which are used in a variety of applications including 3D building interior models. The rise of Building Information Modeling (BIM) for Architectural, Engineering, and Construction (AEC) applications has recently given 3D interior modelling more attention. To generate a 3D model representing a building interior, a laser scanner is used to collect the point cloud data. However, this data often comes with noise, due to several factors including the surrounding objects, lighting, and the specifications of the laser scanner. This paper highlights the use of histograms to remove the noise data. Histograms, familiar from statistics and probability, are regularly used in a number of applications like image processing, where a histogram can represent the total number of pixels in an image at each intensity level. Here, histograms represent the number of points recorded at range-distance intervals in various projections. Because unwanted noise data has a sparser cloud density than the required data and is usually situated at a notable distance from it, noise data produces lower frequencies in the histogram. By defining an acceptable range using the average frequency, points below this range can be removed. This research has shown that these histograms can automatically remove unwanted data from 3D point cloud data representing building interiors. This feature will aid data preprocessing in producing an ideal 3D model from the point cloud data.
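
The acceptance rule described above, dropping points whose histogram bins fall below the average bin frequency, might be sketched as follows; the bin count and the choice of per-axis projections are assumptions for illustration.

```python
import numpy as np

def remove_sparse_points(points, bins=50):
    """Drop points that fall in low-frequency bins of the per-axis
    histograms: sparse, distant noise lands in bins whose counts are
    below the average bin frequency, the paper's acceptance rule."""
    keep = np.ones(len(points), dtype=bool)
    for axis in range(points.shape[1]):
        counts, edges = np.histogram(points[:, axis], bins=bins)
        # bin index of each point along this axis
        idx = np.clip(np.digitize(points[:, axis], edges) - 1, 0, bins - 1)
        keep &= counts[idx] >= counts.mean()
    return points[keep]
```

A dense cluster survives because its bins hold far more points than average, while a handful of distant outliers occupy under-populated bins and are discarded.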

  6. Blind adaptive equalization of polarization-switched QPSK modulation.

    PubMed

    Millar, David S; Savory, Seb J

    2011-04-25

    Coherent detection in combination with digital signal processing has recently enabled significant progress in the capacity of optical communications systems. This improvement has enabled detection of optimum constellations for optical signals in four dimensions. In this paper, we propose and investigate an algorithm for the blind adaptive equalization of one such modulation format: polarization-switched quaternary phase shift keying (PS-QPSK). The proposed algorithm, which includes both blind initialization and adaptation of the equalizer, is found to be insensitive to the input polarization state and demonstrates highly robust convergence in the presence of PDL, DGD and polarization rotation.
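
The paper's PS-QPSK algorithm is not reproduced here, but blind equalizers of this family are commonly built on Godard's constant-modulus algorithm (CMA). A minimal single-channel sketch, with illustrative step size and tap count:

```python
import numpy as np

def cma_equalizer(x, taps=11, mu=1e-3, r2=1.0):
    """Godard p=2 constant-modulus blind equalizer: adapts FIR taps to
    drive |y|^2 toward the target modulus r2, with no training sequence
    and center-spike initialization."""
    w = np.zeros(taps, dtype=complex)
    w[taps // 2] = 1.0
    y = np.zeros(len(x), dtype=complex)
    for n in range(taps, len(x)):
        u = x[n - taps:n][::-1]               # regressor, newest first
        y[n] = w @ u
        e = y[n] * (np.abs(y[n]) ** 2 - r2)   # CMA error term
        w -= mu * e * np.conj(u)
    return w, y
```

A polarization-multiplexed receiver extends this to a 2x2 butterfly of such filters; PS-QPSK additionally requires handling the per-symbol polarization switching, which is the contribution of the cited algorithm.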

  7. Low complexity adaptive equalizers for underwater acoustic communications

    NASA Astrophysics Data System (ADS)

    Soflaei, Masoumeh; Azmi, Paeiz

    2014-08-01

    Interference caused by scattering from the surface and reflection from the bottom is one of the most important obstacles to reliable communication in shallow-water channels. One of the best suggested ways to address this problem is to use adaptive equalizers, whose performance depends strongly on the convergence rate and misadjustment error of the underlying adaptive algorithms. In this paper, the affine projection algorithm (APA), selective regressor APA (SR-APA), the family of selective partial update (SPU) algorithms, the family of set-membership (SM) algorithms, and the selective partial update selective regressor APA (SPU-SR-APA) are compared with conventional algorithms such as least mean square (LMS) in underwater acoustic communications. We apply experimental data from the Strait of Hormuz to demonstrate the efficiency of the proposed methods over a shallow-water channel. We observe that the steady-state mean square error (MSE) of the SR-APA, SPU-APA, SPU-normalized least mean square (SPU-NLMS), SPU-SR-APA, SM-APA and SM-NLMS algorithms decreases in comparison with the LMS algorithm. These algorithms also have better convergence rates than LMS-type algorithms.
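
The LMS baseline that the APA variants are compared against fits in a few lines; normalization (NLMS) divides the step by the regressor energy, buying a convergence rate independent of input power for a small extra cost. The channel and parameters below are illustrative, not taken from the paper.

```python
import numpy as np

def lms_equalizer(x, d, taps=8, mu=0.01, normalized=False):
    """Adaptive FIR equalizer trained with LMS (or NLMS when
    normalized=True); returns the final weights and the squared-error
    learning curve."""
    w = np.zeros(taps)
    err = []
    for n in range(taps - 1, len(x)):
        u = x[n - taps + 1:n + 1][::-1]   # regressor, newest sample first
        e = d[n] - w @ u                  # a priori error
        step = mu / (u @ u + 1e-8) if normalized else mu
        w += step * e * u
        err.append(e * e)
    return w, np.array(err)
```

APA generalizes the NLMS update from one regressor to a block of recent regressors, which is where its faster convergence on correlated inputs comes from.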

  8. A successive overrelaxation iterative technique for an adaptive equalizer

    NASA Technical Reports Server (NTRS)

    Kosovych, O. S.

    1973-01-01

    An adaptive strategy for the equalization of pulse-amplitude-modulated signals in the presence of intersymbol interference and additive noise is reported. The successive overrelaxation iterative technique is used as the algorithm for the iterative adjustment of the equalizer coefficients during a training period for the minimization of the mean square error. With 2-cyclic and nonnegative Jacobi matrices, substantial improvement is demonstrated in the rate of convergence over the commonly used gradient techniques. The Jacobi theorems are also extended to nonpositive Jacobi matrices. Numerical examples strongly indicate that the improvements obtained for the special cases are possible for general channel characteristics. The technique is shown analytically to decrease the mean square error at each iteration over a large range of parameter values for light or moderate intersymbol interference, and over small intervals for general channels. Convergence of the relaxation algorithm in a noisy environment is proven analytically, and the coefficient variance is shown to be bounded.
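
The successive overrelaxation (SOR) iteration at the heart of the technique, shown here for a generic linear system such as the normal equations that arise in MSE minimization (ω = 1 recovers Gauss-Seidel; this is a generic sketch, not the report's equalizer-specific formulation):

```python
import numpy as np

def sor_solve(A, b, omega=1.25, iters=100):
    """Successive overrelaxation for A x = b (A with nonzero diagonal).
    omega = 1 reduces to Gauss-Seidel; 1 < omega < 2 can accelerate
    convergence for suitable matrices."""
    x = np.zeros_like(b, dtype=float)
    n = len(b)
    for _ in range(iters):
        for i in range(n):
            # sum over all j != i of A[i, j] * x[j], using updated values
            sigma = A[i] @ x - A[i, i] * x[i]
            x[i] = (1 - omega) * x[i] + omega * (b[i] - sigma) / A[i, i]
    return x
```

For well-conditioned, diagonally dominant matrices the iteration converges geometrically, which is the source of the rate advantage over plain gradient descent claimed in the abstract.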

  9. Adaptive block-wise alphabet reduction scheme for lossless compression of images with sparse and locally sparse histograms

    NASA Astrophysics Data System (ADS)

    Masmoudi, Atef; Zouari, Sonia; Ghribi, Abdelaziz

    2015-11-01

    We propose a new adaptive block-wise lossless image compression algorithm, which is based on the so-called alphabet reduction scheme combined with adaptive arithmetic coding (AC). This new encoding algorithm is particularly efficient for lossless compression of images with sparse and locally sparse histograms. AC is a very efficient technique for lossless data compression and produces a rate that is close to the entropy; however, a compression performance loss occurs when encoding images or blocks whose number of active symbols is small compared with the number of symbols in the nominal alphabet, namely an amplification of the zero-frequency problem. Most methods add one to the frequency count of each symbol of the nominal alphabet, which distorts the statistical model and therefore reduces the efficiency of the AC. The aim of this work is to overcome this drawback by assigning to each image block the smallest possible set containing all the symbols that actually occur, called the active symbols, as an alternative to using the nominal alphabet with conventional arithmetic encoders. We show experimentally that the proposed method outperforms several lossless image compression encoders and standards, including the conventional arithmetic encoders, JPEG2000, and JPEG-LS.
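
The zero-frequency amplification can be made concrete: with add-one smoothing, the estimated code length of a block with few active symbols is much larger under the nominal 256-symbol alphabet than under the reduced active alphabet. A toy comparison (not the paper's encoder):

```python
import numpy as np

def codelength_bits(block, alphabet_size):
    """Ideal code length (bits) of a block under add-one (Laplace)
    smoothed symbol probabilities over the given alphabet size."""
    counts = np.bincount(block, minlength=alphabet_size).astype(float)
    probs = (counts + 1.0) / (counts.sum() + alphabet_size)
    mask = counts > 0
    return float(-(counts[mask] * np.log2(probs[mask])).sum())

block = np.array([3, 3, 7, 7, 7, 3, 3, 7] * 64)     # two active symbols
nominal = codelength_bits(block, 256)               # full 8-bit alphabet
remapped = np.searchsorted(np.unique(block), block) # 3 -> 0, 7 -> 1
active = codelength_bits(remapped, len(np.unique(block)))
```

Here the 512-symbol block costs roughly 1 bit/symbol under the active alphabet but over 1.5 bits/symbol under the nominal one, because smoothing mass is wasted on 254 symbols that never occur; a practical scheme must of course also signal the active set per block.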

  10. An improved human visual system based reversible data hiding method using adaptive histogram modification

    NASA Astrophysics Data System (ADS)

    Hong, Wien; Chen, Tung-Shou; Wu, Mei-Chen

    2013-03-01

    Jung et al. (IEEE Signal Processing Letters, 18(2), 95, 2011) proposed a reversible data hiding method considering the human visual system (HVS). They employed the mean of visited neighboring pixels to predict the current pixel value, and estimated the just noticeable difference (JND) of the current pixel. Message bits are then embedded by adjusting the embedding level according to the calculated JND. Jung et al.'s method achieved excellent image quality. However, the embedding algorithm they used may result in over modification of pixel values and a large location map, which may deteriorate the image quality and decrease the pure payload. The proposed method exploits the nearest neighboring pixels to predict the visited pixel value and to estimate the corresponding JND. The cover pixels are preprocessed adaptively to reduce the size of the location map. We also employ an embedding level selection mechanism to prevent near-saturated pixels from being over modified. Experimental results show that the image quality of the proposed method is higher than that of Jung et al.'s method, and the payload can also be increased due to the reduction of the location map.

  11. On the similarity of 239Pu α-activity histograms when the angular velocities of the Earth diurnal rotation, orbital movement and rotation of collimators are equalized

    NASA Astrophysics Data System (ADS)

    Shnoll, S. E.; Rubinstein, I. A.; Shapovalov, S. N.; Tolokonnikova, A. A.; Shlektaryov, V. A.; Kolombet, V. A.; Kondrashova, M. N.

    2016-01-01

    It was shown earlier that the persistent "scatter" of results of measurements of any nature is determined by the diurnal and orbital movement of the Earth. The movement is accompanied by "macroscopic fluctuations" (MF)—regular, periodic changes in the shape of histograms, spectra of fluctuation amplitudes of the measured parameters. There are two near-daily periods ("sidereal", 1436 min; and "solar", 1440 min) and three yearly ones ("calendar", 365 average solar days; "tropical", 365 days 5 h and 48 min; and "sidereal", 365 days 6 h and 9 min). This periodicity was explained by the objects whose parameters are measured passing through the same spatial-temporal heterogeneities as the Earth rotates and shifts along its orbit.

  12. Digital timing recovery combined with adaptive equalization for optical coherent receivers

    NASA Astrophysics Data System (ADS)

    Zhou, Xian; Chen, Xue; Zhou, Weiqing; Fan, Yangyang; Zhu, Hai; Li, Zhiyu

    2009-11-01

    We propose a novel equalization and timing recovery scheme for polarization multiplexing (POLMUX) coherent receivers that adds an adaptive butterfly-structured equalizer inside an all-digital timing recovery loop. It resolves a mutual dependency: the digital equalizer requires a timing-recovered (synchronous) signal, while the Gardner timing-error detection algorithm requires an equalized signal because of its small tolerance of dispersion. This joint module can complete synchronization, equalization and polarization de-multiplexing simultaneously without any extra computational cost. Finally, we demonstrate the good performance of the new scheme in a 112-Gbit/s POLMUX-NRZ-DQPSK digital optical coherent receiver.

  13. Blind equalization using stop-and-go adaptation rules

    NASA Astrophysics Data System (ADS)

    Hatzinakos, Dimitrios

    1992-06-01

    Stop-and-go adaptation rules that are utilized to improve the blind convergence characteristics of the conventional and sign decision-directed algorithms are proposed and examined. They are based on the Sato- and Godard-type errors, which are utilized in many blind deconvolution applications. The convergence rates achieved by the algorithms with quadrature amplitude modulated signal constellations and nonminimum phase communication channels are compared. Based on a new criterion, the optimal values of the Sato and Godard error parameters are redefined. The optimality of the new parameter values is confirmed by means of computer simulations.

  14. Dose-Volume Histogram Parameters and Late Side Effects in Magnetic Resonance Image-Guided Adaptive Cervical Cancer Brachytherapy

    SciTech Connect

    Georg, Petra; Lang, Stefan; Dimopoulos, Johannes C.A.; Doerr, Wolfgang; Sturdza, Alina E.; Berger, Daniel; Georg, Dietmar; Kirisits, Christian; Poetter, Richard

    2011-02-01

    Purpose: To evaluate the predictive value of dose-volume histogram (DVH) parameters for late side effects of the rectum, sigmoid colon, and bladder in image-guided brachytherapy for cervix cancer patients. Methods and Materials: A total of 141 patients received external-beam radiotherapy and image-guided brachytherapy with or without chemotherapy. The DVH parameters for the most exposed 2, 1, and 0.1 cm³ (D2cc, D1cc, and D0.1cc) of the rectum, sigmoid, and bladder, as well as International Commission on Radiation Units and Measurements point doses (DICRU), were computed. Total doses were converted to equivalent doses in 2 Gy fractions by applying the linear-quadratic model (α/β = 3 Gy). Late side effects were prospectively assessed using the Late Effects in Normal Tissues-Subjective, Objective, Management and Analytic score. The following patient groups were defined: Group 1: no side effects (Grade 0); Group 2: side effects (Grade 1-4); Group 3: minor side effects (Grade 0-1); and Group 4: major side effects (Grade 2-4). Results: The median follow-up was 51 months. The overall 5-year actuarial side effect rates were 12% for rectum, 3% for sigmoid, and 23% for bladder. The mean total D2cc was 65 ± 12 Gy for rectum, 62 ± 12 Gy for sigmoid, and 95 ± 22 Gy for bladder. For rectum, statistically significant differences were observed between Groups 1 and 2 in all DVH parameters and DICRU. Between Groups 3 and 4, no difference was observed for D0.1cc. For sigmoid, significant differences were observed for D2cc and D1cc, but not for D0.1cc, in all groups. For bladder, significant differences were observed for all DVH parameters only when comparing Groups 3 and 4. No differences were observed for DICRU. Conclusions: The parameters D2cc and D1cc have a good predictive value for rectal toxicity. For sigmoid, no prediction could be postulated because of limited data. In bladder, DVH
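
The dose conversion mentioned in the abstract, equivalent dose in 2 Gy fractions under the linear-quadratic model, is EQD2 = D·(d + α/β)/(2 + α/β) for a total dose D delivered in fractions of size d. A one-line helper (names are ours, the formula is standard):

```python
def eqd2(total_dose, dose_per_fraction, alpha_beta=3.0):
    """Equivalent dose in 2 Gy fractions (linear-quadratic model):
    EQD2 = D * (d + alpha/beta) / (2 + alpha/beta).
    alpha_beta = 3 Gy is the late-responding-tissue value the study uses."""
    return total_dose * (dose_per_fraction + alpha_beta) / (2.0 + alpha_beta)
```

For example, 30 Gy given in 6 Gy fractions with α/β = 3 Gy is biologically equivalent to 54 Gy in 2 Gy fractions, while any schedule already delivered at 2 Gy/fraction maps to itself.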

  15. A Brief Investigation of Adaptive Decision Feedback Equalization for Digital HF Links Employing PSK (Phase Shift Keying) Modulation.

    DTIC Science & Technology

    2014-09-26

    A brief investigation of adaptive decision feedback equalization for digital HF links employing PSK modulation. B.E. Sawyer, Mission Research Corporation, P.O. Drawer 719. Keywords: PSK receiver mitigation, adaptive equalization, decision feedback equalization, HF.

  16. Color Histogram Diffusion for Image Enhancement

    NASA Technical Reports Server (NTRS)

    Kim, Taemin

    2011-01-01

    Various color histogram equalization (CHE) methods have been proposed to extend grayscale histogram equalization (GHE) to color images. In this paper a new method called histogram diffusion, which extends GHE to arbitrary dimensions, is proposed. Ranges in a histogram are specified as overlapping bars of uniform height and variable width, with widths proportional to their frequencies; this diagram is called the vistogram. As an alternative to GHE, the squared error of the vistogram from the uniform distribution is minimized. Each bar in the vistogram is approximated by a Gaussian function, and the Gaussian particles in the vistogram diffuse as a nonlinear autonomous system of ordinary differential equations. CHE results on color images show that the approach is effective.
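
For reference, the grayscale histogram equalization (GHE) baseline that histogram diffusion generalizes maps each gray level through the normalized cumulative distribution:

```python
import numpy as np

def histogram_equalize(img):
    """Classical grayscale histogram equalization for uint8 images:
    build a lookup table from the normalized cumulative histogram and
    remap every pixel through it."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum().astype(float)
    cdf_min = cdf[cdf > 0].min()              # first occupied level
    scale = 255.0 / max(cdf[-1] - cdf_min, 1.0)
    lut = np.clip(np.round((cdf - cdf_min) * scale), 0, 255).astype(np.uint8)
    return lut[img]
```

Applied naively per channel to color images, this mapping distorts hues, which is the problem CHE methods such as the vistogram diffusion above are designed to avoid.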

  17. A NOISE ADAPTIVE FUZZY EQUALIZATION METHOD FOR PROCESSING SOLAR EXTREME ULTRAVIOLET IMAGES

    SciTech Connect

    Druckmueller, M.

    2013-08-15

    A new image enhancement tool ideally suited for the visualization of fine structures in extreme ultraviolet images of the corona is presented in this paper. The Noise Adaptive Fuzzy Equalization method is particularly suited for the exceptionally high dynamic range images from the Atmospheric Imaging Assembly instrument on the Solar Dynamics Observatory. This method produces artifact-free images and gives significantly better results than methods based on convolution or Fourier transform which are often used for that purpose.

  18. Adaptively combined FIR and functional link artificial neural network equalizer for nonlinear communication channel.

    PubMed

    Zhao, Haiquan; Zhang, Jiashu

    2009-04-01

    This paper proposes a novel, computationally efficient adaptive nonlinear equalizer based on a combination of a finite impulse response (FIR) filter and a functional link artificial neural network (CFFLANN) to compensate linear and nonlinear distortions in nonlinear communication channels. This convex nonlinear combination improves convergence speed while retaining a low steady-state error. In addition, since the CFFLANN does not need the hidden layers that exist in conventional neural-network-based equalizers, it exhibits a simpler structure than traditional neural networks (NNs) and requires less computation during the training mode. Moreover, an appropriate adaptation algorithm for the proposed equalizer is derived from the modified least mean square (MLMS). Simulation results clearly show that the proposed equalizer using the MLMS algorithm can effectively eliminate linear and nonlinear distortions of various intensities and provides better anti-jamming performance. Furthermore, comparisons of the mean squared error (MSE), the bit error rate (BER), and the effect of the eigenvalue ratio (EVR) of the input correlation matrix are presented.

  19. Adaptive reconfigurable V-BLAST type equalizer for cognitive MIMO-OFDM radios

    NASA Astrophysics Data System (ADS)

    Ozden, Mehmet Tahir

    2015-12-01

    An adaptive channel shortening equalizer design for multiple input multiple output-orthogonal frequency division multiplexing (MIMO-OFDM) radio receivers is considered in this paper. The proposed receiver has desirable features for cognitive and software defined radio implementations. It consists of two sections: a MIMO decision feedback equalizer (MIMO-DFE) and adaptive multiple Viterbi detection. In the MIMO-DFE section, a complete modified Gram-Schmidt orthogonalization of the multichannel input data is accomplished using sequential processing multichannel Givens lattice stages, so that a Vertical Bell Laboratories Layered Space Time (V-BLAST) type MIMO-DFE is realized at the front-end section of the channel shortening equalizer. Matrix operations, a major bottleneck for receiver operations, are accordingly avoided, and only scalar operations are used. A highly modular and regular radio receiver architecture is achieved, with a structure suitable for digital signal processing (DSP) chip and field programmable gate array (FPGA) implementations, which are important for software defined radio realizations. The MIMO-DFE section of the proposed receiver can also be reconfigured for spectrum sensing and positioning functions, which are important tasks for cognitive radio applications. In the adaptive multiple Viterbi detection section, a systolic array implementation for each channel is performed so that a receiver architecture with high computational concurrency is attained. The total computational complexity is given in terms of the equalizer and desired response filter lengths, the alphabet size, and the number of antennas. The performance of the proposed receiver is presented for the two-channel case by means of mean squared error (MSE) and probability of error evaluations, conducted for time-invariant and time-variant channel conditions, orthogonal and nonorthogonal transmissions, and two different modulation schemes.

  20. Comparison of adverse effects of proton and X-ray chemoradiotherapy for esophageal cancer using an adaptive dose-volume histogram analysis.

    PubMed

    Makishima, Hirokazu; Ishikawa, Hitoshi; Terunuma, Toshiyuki; Hashimoto, Takayuki; Yamanashi, Koichi; Sekiguchi, Takao; Mizumoto, Masashi; Okumura, Toshiyuki; Sakae, Takeji; Sakurai, Hideyuki

    2015-05-01

    Cardiopulmonary late toxicity is of concern in concurrent chemoradiotherapy (CCRT) for esophageal cancer. The aim of this study was to examine the benefit of proton beam therapy (PBT) using clinical data and adaptive dose-volume histogram (DVH) analysis. The subjects were 44 patients with esophageal cancer who underwent definitive CCRT using X-rays (n = 19) or protons (n = 25). Experimental recalculation using protons was performed for the patient actually treated with X-rays, and vice versa. Target coverage and dose constraints of normal tissues were conserved. Lung V5-V20, mean lung dose (MLD), and heart V30-V50 were compared for risk organ doses between experimental plans and actual treatment plans. Potential toxicity was estimated using protons in patients actually treated with X-rays, and vice versa. Pulmonary events of Grade ≥2 occurred in 8/44 cases (18%), and cardiac events were seen in 11 cases (25%). Risk organ doses in patients with events of Grade ≥2 were significantly higher than for those with events of Grade ≤1. Risk organ doses were lower in proton plans compared with X-ray plans. All patients suffering toxicity who were treated with X-rays (n = 13) had reduced predicted doses in lung and heart using protons, while doses in all patients treated with protons (n = 24) with toxicity of Grade ≤1 had worsened predicted toxicity with X-rays. Analysis of normal tissue complication probability showed a potential reduction in toxicity by using proton beams. Irradiation dose, volume and adverse effects on the heart and lung can be reduced using protons. Thus, PBT is a promising treatment modality for the management of esophageal cancer.

  1. Structure Size Enhanced Histogram

    NASA Astrophysics Data System (ADS)

    Wesarg, Stefan; Kirschner, Matthias

    Direct volume visualization requires the definition of transfer functions (TFs) for the assignment of opacity and color. Multi-dimensional TFs are based on at least two image properties, and are specified by means of 2D histograms. In this work we propose a new type of a 2D histogram which combines gray value with information about the size of the structures. This structure size enhanced (SSE) histogram is an intuitive approach for representing anatomical features. Clinicians — the users we are focusing on — are much more familiar with selecting features by their size than by their gradient magnitude value. As a proof of concept, we employ the SSE histogram for the definition of two-dimensional TFs for the visualization of 3D MRI and CT image data.

  2. Concurrent adaptation of reactive saccades and hand pointing movements to equal and to opposite changes of target direction.

    PubMed

    Grigorova, Valentina; Bock, Otmar; Borisova, Steliana

    2013-04-01

    Eye as well as hand movements can adapt to double-step target displacements, but it is still controversial whether both motor systems use common or distinct adaptive mechanisms. Here, we posit that analyses of the concurrent adaptation of both motor systems to equal versus different double-steps may provide more conclusive evidence than previous work about the transfer of adaptation from one motor system to the other. Forty subjects adapted to double-steps which called for a change of response direction. The same (group S) or the opposite change (group O) was required for eyes and hand. Group ON equaled O, except that no visual feedback of the hand was provided. Groups E and H served as controls for eyes-only and hand-only adaptation, respectively. We found no differences between groups or motor systems when comparing S, E and H. Adaptation was faster in O than in S, E and H, and faster still in ON. However, the magnitude of eye adaptation was much smaller in O and ON than in S, E and H. We conclude that concurrent adaptation of eye and hand directions to opposite double-steps attenuates recalibration which, at least for the hand, is largely replaced by workaround strategies. The mechanisms for eye and hand adaptation therefore seem to be coupled, in a way that hinders divergent recalibration of both motor systems. The possible neuronal substrate for our findings is discussed.

  3. Adaptive gain, equalization, and wavelength stabilization techniques for silicon photonic microring resonator-based optical receivers

    NASA Astrophysics Data System (ADS)

    Palermo, Samuel; Chiang, Patrick; Yu, Kunzhi; Bai, Rui; Li, Cheng; Chen, Chin-Hui; Fiorentino, Marco; Beausoleil, Ray; Li, Hao; Shafik, Ayman; Titriku, Alex

    2016-03-01

    Interconnect architectures based on high-Q silicon photonic microring resonator devices offer a promising solution to the dramatic increase in datacenter I/O bandwidth demands, owing to their ability to realize wavelength-division multiplexing (WDM) in a compact and energy-efficient manner. However, challenges exist in realizing efficient receivers for these systems due to varying per-channel link budgets, sensitivity requirements, and ring resonance wavelength shifts. This paper reports on adaptive optical receiver design techniques which address these issues and have been demonstrated in two hybrid-integrated prototypes based on microring drop filters and waveguide photodetectors implemented in a 130nm SOI process and high-speed optical front-ends designed in 65nm CMOS. A 10Gb/s power-scalable architecture employs supply voltage scaling of a three-inverter-stage transimpedance amplifier (TIA) that is adapted with an eye-monitor control loop to yield the necessary sensitivity for a given channel. As reduction of TIA input-referred noise is more critical at higher data rates, a 25Gb/s design utilizes a large input-stage feedback resistor TIA cascaded with a continuous-time linear equalizer (CTLE) that compensates for the increased input pole. When tested with a waveguide Ge PD with 0.45A/W responsivity, this topology achieves 25Gb/s operation with -8.2dBm sensitivity at a BER of 10^-12. To address the sensitivity of microring drop filters to fabrication tolerances and thermal variations, efficient wavelength-stabilization control loops are necessary. A peak-power-based monitoring loop, which locks the drop filter to the input wavelength while remaining compatible with the high-speed TIA offset-correction feedback loop, is implemented with a 0.7nm tuning range at 43μW/GHz efficiency.

  4. X-ray dose reduction by adaptive source equalization and electronic region-of-interest control

    NASA Astrophysics Data System (ADS)

    Burion, Steve; Sandman, Anne; Bechtel, Kate; Solomon, Edward; Funk, Tobias

    2011-03-01

    Radiation dose is particularly a concern in pediatric cardiac fluoroscopy procedures, which account for 7% of all cardiac procedures performed. The Scanning-Beam Digital X-ray (SBDX) fluoroscopy system has already demonstrated reduced dose in adult patients owing to its high-DQE photon-counting detector, reduced detected scatter, and the elimination of the anti-scatter grid. Here we show that the unique flexible illumination platform of the SBDX system will enable further dose area product reduction, which we are currently developing for pediatric patients, but which will ultimately benefit all patients. The SBDX system has a small-area detector array and a large-area X-ray source with up to 9,000 individually-controlled X-ray focal spots. Each focal spot illuminates a small fraction of the full field of view. To acquire a frame, each focal spot is activated for a fixed number of 1-microsecond periods. Dose reduction is made possible by reducing the number of activations of some of the X-ray focal spots during each frame time. This can be done dynamically to reduce the exposure in areas of low patient attenuation, such as the lung field. This spatially-adaptive illumination also reduces the dynamic range in the full image, which is visually pleasing. Dose can also be reduced by the user selecting a region of interest (ROI) where full image quality is to be maintained. Outside the ROI, the number of activations of each X-ray focal spot is reduced and the image gain is correspondingly increased to maintain consistent image brightness. Dose reduction is dependent on the size of the ROI and the desired image quality outside the ROI. We have developed simulation software that is based on real data and can simulate the performance of the equalization and ROI filtration. This software represents a first step toward real-time implementation of these dose-reduction methods. Our simulations have shown that dose area product reductions of 40% are possible using equalization

  5. Investigating Student Understanding of Histograms

    ERIC Educational Resources Information Center

    Kaplan, Jennifer J.; Gabrosek, John G.; Curtiss, Phyllis; Malone, Chris

    2014-01-01

    Histograms are adept at revealing the distribution of data values, especially the shape of the distribution and any outlier values. They are included in introductory statistics texts, research methods texts, and in the popular press, yet students often have difficulty interpreting the information conveyed by a histogram. This research identifies…

  6. An IIR adaptive electronic equalizer for polarization multiplexed fiber optic communication systems

    NASA Astrophysics Data System (ADS)

    Zeng, Xiang-Ye; Liu, Jian-Fei; Zhao, Qi-Da

    2011-09-01

    An electronic digital equalizer for polarization-multiplexed coherent fiber optic communication systems is designed to compensate the polarization mode dispersion (PMD) and residual chromatic dispersion (CD) of the transmission channel. The proposed equalizer is realized with a fractionally spaced infinite impulse response (IIR) butterfly structure with 21 feedforward taps and 2 feedback taps. Compared with a finite impulse response (FIR) structure, this structure reduces the hardware implementation complexity under the same conditions. To keep track of the random variation of channel characteristics, the filter weights are updated by the least mean square (LMS) algorithm. Simulation results show that the proposed equalizer can compensate residual CD of 1600 ps/nm and differential group delay (DGD) of 90 ps simultaneously, and can also increase the PMD and residual CD tolerance of the whole communication system.

  7. Urban Heat Island Adaptation Strategies are not created equal: Assessment of Impacts and Tradeoffs

    NASA Astrophysics Data System (ADS)

    Georgescu, Matei

    2014-05-01

    Sustainable urban expansion requires an extension of contemporary approaches that focus nearly exclusively on the reduction of greenhouse gas emissions. Researchers have proposed biophysical approaches to urban heat island mitigation (e.g., deployment of cool or green roofs), but little is known about how these technologies vary with place and season, or what their impacts are beyond near-surface temperature. Using a suite of continuous, multi-year and multi-member continental-scale numerical simulations for the United States, we examine hydroclimatic impacts for a variety of U.S. urban expansion (to the year 2100) and urban adaptation futures and compare them to contemporary urban extent. Adaptation approaches include widespread adoption of cool roofs, green roofs, and a hypothetical hybrid approach integrating properties of both (i.e., reflective green roofs). Widespread adoption of adaptation strategies exhibits hydroclimatic impacts that are regionally and seasonally dependent. For some regions and seasons, urban-induced warming of 3°C can be completely offset by the adaptation approaches examined. For other regions and seasons, widespread adoption of some adaptation strategies can significantly reduce precipitation. Finally, implications of large-scale urbanization for seasonal energy demand are examined.

  8. Innate and adaptive antifungal immune responses: partners on an equal footing.

    PubMed

    Hamad, Mawieh

    2012-05-01

    Adaptive immunity has long been regarded as the major player in protection against most fungal infections. Mounting evidence suggests, however, that innate and adaptive responses intricately collaborate to produce effective antifungal protection. Dendritic cells (DCs) play an important role in initiating and orchestrating antifungal immunity; neutrophils, macrophages and other phagocytes also participate in recognising and eliminating fungal pathogens. Adaptive immunity provides a wide range of effector and regulatory responses against fungal infections. Th1 responses protect against most forms of mycoses, but they are associated with significant inflammation and limited pathogen persistence. By contrast, Th2 responses enhance persistence of, and tolerance to, fungal infections, thus permitting the generation of long-lasting immunological memory. Although the role of Th17 cytokines in fungal immunity is not fully understood, they can enhance proinflammatory or anti-inflammatory responses or play a regulatory role, all depending on the pathogen, the site and phase of infection, and host immune status. T regulatory cells balance the activities of the various Th cell subsets, permitting inflammation and protection on the one hand and allowing for tolerance and memory on the other. Here, recent developments in fungal immunity research are reviewed as a means of tracing the emergence of a refined paradigm in which innate and adaptive responses are viewed in the same light.

  9. Equalizing resolution in smoothed-particle hydrodynamics calculations using self-adaptive sinc kernels

    NASA Astrophysics Data System (ADS)

    García-Senz, Domingo; Cabezón, Rubén M.; Escartín, José A.; Ebinger, Kevin

    2014-10-01

    Context. The smoothed-particle hydrodynamics (SPH) technique is a numerical method for solving gas-dynamical problems. It has been applied to simulate the evolution of a wide variety of astrophysical systems. The method has second-order accuracy, with a resolution that is usually much higher in the compressed regions than in the diluted zones of the fluid. Aims: We propose and check a method to balance and equalize the resolution of SPH between high- and low-density regions. This method relies on the versatility of a family of interpolators called sinc kernels, which allows the interpolation quality to be increased by varying only a single parameter (the exponent of the sinc function). Methods: The proposed method was checked and validated through a number of numerical tests, from standard one-dimensional Riemann problems in shock tubes, to multidimensional simulations of explosions, hydrodynamic instabilities, and the collapse of a Sun-like polytrope. Results: The analysis of the hydrodynamical simulations suggests that the scheme devised to equalize the accuracy improves the treatment of the post-shock regions and, in general, of the rarefied zones of fluids, while causing no harm to the growth of hydrodynamic instabilities. The method is robust and easy to implement with a low computational overhead. It conserves mass, energy, and momentum and reduces to the standard SPH scheme in regions of the fluid that have smooth density gradients.

  10. The constant beat: cardiomyocytes adapt their forces by equal contraction upon environmental stiffening.

    PubMed

    Hersch, Nils; Wolters, Benjamin; Dreissen, Georg; Springer, Ronald; Kirchgeßner, Norbert; Merkel, Rudolf; Hoffmann, Bernd

    2013-03-15

    Cardiomyocytes are responsible for maintaining permanent blood flow by coordinated heart contractions. This vital function is accomplished over a long period of time with almost the same performance, although heart properties, such as elasticity, change drastically upon aging or as a result of diseases like myocardial infarction. In this paper we have analyzed late rat embryonic heart muscle cells' morphology, sarcomere/costamere formation and force generation patterns on substrates of various elasticities ranging from ∼1 to 500 kPa, which covers physiological and pathological heart stiffnesses. Furthermore, adhesion behaviour, as well as single myofibril/sarcomere contraction patterns, was characterized with high spatial resolution in the range of physiological stiffnesses (15 kPa to 90 kPa). Here, sarcomere units generate an almost stable contraction of ∼4%. On stiffened substrates the contraction amplitude remains stable, which in turn leads to increased force levels, allowing cells to adapt almost instantaneously to changing environmental stiffness. Furthermore, our data strongly indicate specific adhesion to flat substrates via both costameric and focal adhesions. The general appearance of the contractile and adhesion apparatus remains almost unaffected by substrate stiffness.

  11. Theory and Application of DNA Histogram Analysis.

    ERIC Educational Resources Information Center

    Bagwell, Charles Bruce

    The underlying principles and assumptions associated with DNA histograms are discussed along with the characteristics of fluorescent probes. Information theory is described and used to calculate the information content of a DNA histogram. Two major types of DNA histogram analyses are proposed: parametric and nonparametric analysis. Three levels…

  12. Quantitative histogram analysis of images

    NASA Astrophysics Data System (ADS)

    Holub, Oliver; Ferreira, Sérgio T.

    2006-11-01

    A routine for histogram analysis of images has been written in the object-oriented, graphical development environment LabVIEW. The program converts an RGB bitmap image into an intensity-linear greyscale image according to selectable conversion coefficients. This greyscale image is subsequently analysed by plots of the intensity histogram and probability distribution of brightness, and by calculation of various parameters, including average brightness, standard deviation, variance, minimal and maximal brightness, mode, skewness and kurtosis of the histogram and the median of the probability distribution. The program allows interactive selection of specific regions of interest (ROI) in the image and definition of lower and upper threshold levels (e.g., to permit the removal of a constant background signal). The results of the analysis of multiple images can be conveniently saved and exported for plotting in other programs, which allows fast analysis of relatively large sets of image data. The program file accompanies this manuscript together with a detailed description of two application examples: the analysis of fluorescence microscopy images, specifically of tau-immunofluorescence in primary cultures of rat cortical and hippocampal neurons, and the quantification of protein bands by Western blot. The possibilities and limitations of this kind of analysis are discussed.

    Program summary
    Title of program: HAWGC
    Catalogue identifier: ADXG_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXG_v1_0
    Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
    Computers: Mobile Intel Pentium III, AMD Duron
    Installations: No installation necessary; executable file together with necessary files for LabVIEW Run-time engine
    Operating systems or monitors under which the program has been tested: Windows ME/2000/XP
    Programming language used: LabVIEW 7.0
    Memory required to execute with typical data: ~16 MB for starting and ~160 MB used for
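The histogram parameters listed above (average brightness, standard deviation, variance, mode, skewness, kurtosis, and the median of the probability distribution) can be computed from an image histogram roughly as follows; this is a numpy sketch for illustration, not the LabVIEW program itself:

```python
import numpy as np

def histogram_stats(image, bins=256, value_range=(0, 256)):
    """Basic brightness statistics of a greyscale image, computed via its histogram."""
    counts, edges = np.histogram(image, bins=bins, range=value_range)
    centers = (edges[:-1] + edges[1:]) / 2
    p = counts / counts.sum()                     # probability distribution of brightness
    mean = np.sum(p * centers)
    var = np.sum(p * (centers - mean) ** 2)
    std = np.sqrt(var)
    skew = np.sum(p * (centers - mean) ** 3) / std ** 3
    kurt = np.sum(p * (centers - mean) ** 4) / var ** 2 - 3  # excess kurtosis
    mode = centers[np.argmax(counts)]
    median = centers[np.searchsorted(np.cumsum(p), 0.5)]
    return {"mean": mean, "std": std, "variance": var, "mode": mode,
            "median": median, "skewness": skew, "kurtosis": kurt}
```

Thresholding and ROI selection would simply restrict which pixels enter `np.histogram`.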

  13. Interpreting Histograms. As Easy as It Seems?

    ERIC Educational Resources Information Center

    Lem, Stephanie; Onghena, Patrick; Verschaffel, Lieven; Van Dooren, Wim

    2014-01-01

    Histograms are widely used, but recent studies have shown that they are not as easy to interpret as it might seem. In this article, we report on three studies on the interpretation of histograms in which we investigated (1) whether the misinterpretation by university students can be considered to be the result of heuristic reasoning, (2)…

  14. Spline smoothing of histograms by linear programming

    NASA Technical Reports Server (NTRS)

    Bennett, J. O.

    1972-01-01

    An algorithm is presented for obtaining an approximating function to the frequency distribution from a sample of size n. To obtain the approximating function, a histogram is first made from the data. Next, Euclidean-space approximations to the graph of the histogram, using central B-splines as basis elements, are obtained by linear programming. The approximating function has area one and is nonnegative.

  15. The Capacity of Color Histogram Indexing

    DTIC Science & Technology

    1993-01-01

    AD-A279 031. The Capacity of Color Histogram Indexing. Markus Stricker, Communications Technology Laboratory, Swiss Federal Institute of Technology (ETH), CH-8092 Zürich, Switzerland; Michael Swain, Department of Computer Science, The University of Chicago. Color histograms are used as the feature vectors to be stored in the index, a promising way of quickly indexing into a large image ([Swain and Ballard

  16. Local histograms and image occlusion models

    PubMed Central

    Massar, Melody L.; Bhagavatula, Ramamurthy; Fickus, Matthew; Kovačević, Jelena

    2012-01-01

    The local histogram transform of an image is a data cube that consists of the histograms of the pixel values that lie within a fixed neighborhood of any given pixel location. Such transforms are useful in image processing applications such as classification and segmentation, especially when dealing with textures that can be distinguished by the distributions of their pixel intensities and colors. We, in particular, use them to identify and delineate biological tissues found in histology images obtained via digital microscopy. In this paper, we introduce a mathematical formalism that rigorously justifies the use of local histograms for such purposes. We begin by discussing how local histograms can be computed as systems of convolutions. We then introduce probabilistic image models that can emulate textures one routinely encounters in histology images. These models are rooted in the concept of image occlusion. A simple model may, for example, generate textures by randomly speckling opaque blobs of one color on top of blobs of another. Under certain conditions, we show that, on average, the local histograms of such model-generated-textures are convex combinations of more basic distributions. We further provide several methods for creating models that meet these conditions; the textures generated by some of these models resemble those found in histology images. Taken together, these results suggest that histology textures can be analyzed by decomposing their local histograms into more basic components. We conclude with a proof-of-concept segmentation-and-classification algorithm based on these ideas, supported by numerical experimentation. PMID:23543920
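The paper's observation that local histograms can be computed as systems of convolutions can be sketched as one box filter per bin: the indicator image of each bin, averaged over the neighbourhood window. The numpy sketch below uses an integral image for the box filter; the binning scheme, square neighbourhood, and edge handling are illustrative assumptions:

```python
import numpy as np

def box_mean(a, r):
    """Mean of `a` over a (2r+1) x (2r+1) box at each pixel, via an integral image."""
    p = np.pad(a, r, mode="edge")                          # replicate edges
    s = np.pad(p, ((1, 0), (1, 0))).cumsum(0).cumsum(1)    # integral image
    k = 2 * r + 1
    h, w = a.shape
    return (s[k:k+h, k:k+w] - s[:h, k:k+w] - s[k:k+h, :w] + s[:h, :w]) / k ** 2

def local_histograms(image, n_bins, radius=2):
    """Local histogram transform of an image with values in [0, 1]:
    cube[i, j] is the histogram of the neighbourhood of pixel (i, j)."""
    bins = np.minimum((image * n_bins).astype(int), n_bins - 1)
    # one convolution (box filter) per bin, applied to that bin's indicator image
    return np.stack([box_mean((bins == b).astype(float), radius)
                     for b in range(n_bins)], axis=-1)
```

Because the per-bin indicator images partition the image, each pixel's local histogram sums to one by construction.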

  17. Maximizing the entropy of histogram bar heights to explore neural activity: a simulation study on auditory and tactile fibers.

    PubMed

    Güçlü, Burak

    2005-01-01

    Neurophysiologists often use histograms to explore patterns of activity in neural spike trains. The bin size selected to construct a histogram is crucial: too large bin widths result in coarse histograms, too small bin widths expand unimportant detail. Peri-stimulus time (PST) histograms of simulated nerve fibers were studied in the current article. This class of histograms gives information about neural activity in the temporal domain and is a density estimate for the spike rate. Scott's rule, based on modern statistical theory, suggests that the optimal bin size is inversely proportional to the cube root of sample size. However, this estimate requires a priori knowledge about the density function. Moreover, there are no good algorithms for adaptive-mesh histograms, which have variable bin sizes to minimize estimation errors. Therefore, an unconventional technique is proposed here to help experimenters in practice. This novel method maximizes the entropy of histogram-bar heights to find the unique bin size, which generates the highest disorder in a histogram (i.e., the most complex histogram), and is useful as a starting point for neural data mining. Although the proposed method is ad hoc from a density-estimation point of view, it is simple, efficient and more helpful in the experimental setting where no prior statistical information on neural activity is available. The results of simulations based on the entropy method are also discussed in relation to Ellaway's cumulative-sum technique, which can detect subtle changes in neural activity in certain conditions.
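The entropy-maximization idea can be sketched as a search over candidate bin counts, scoring each histogram by the entropy of its bar heights. Note one assumption of this sketch: the entropy is normalized by log k so that scores are comparable across bin counts (otherwise entropy trivially grows with k); the paper's exact recipe may differ:

```python
import numpy as np

def max_entropy_bins(spike_times, candidates=range(2, 101)):
    """Return the bin count whose PST histogram maximizes the normalized
    entropy of the bar heights (most 'disordered' histogram)."""
    best_k, best_h = None, -np.inf
    for k in candidates:
        counts, _ = np.histogram(spike_times, bins=k)
        p = counts[counts > 0] / counts.sum()     # bar heights as probabilities
        h = -np.sum(p * np.log(p)) / np.log(k)    # entropy, normalized to [0, 1]
        if h > best_h:
            best_k, best_h = k, h
    return best_k
```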

  18. 4.5-Gb/s RGB-LED based WDM visible light communication system employing CAP modulation and RLS based adaptive equalization.

    PubMed

    Wang, Yiguang; Huang, Xingxing; Tao, Li; Shi, Jianyang; Chi, Nan

    2015-05-18

    Inter-symbol interference (ISI) is one of the key problems that seriously limit the transmission data rate in high-speed VLC systems. To eliminate ISI and further improve system performance, a series of equalization schemes has been widely investigated. As an adaptive algorithm commonly used in wireless communication, RLS is also suitable for visible light communication due to its quick convergence and better performance. In this paper, for the first time we experimentally demonstrate a high-speed RGB-LED based WDM VLC system employing carrier-less amplitude and phase (CAP) modulation and recursive least squares (RLS) based adaptive equalization. An aggregate data rate of 4.5 Gb/s is successfully achieved over 1.5-m indoor free-space transmission with the bit error rate (BER) below the 7% forward error correction (FEC) limit of 3.8×10⁻³. To the best of our knowledge, this is the highest data rate ever achieved in RGB-LED based VLC systems.
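The RLS update used for adaptive equalization can be sketched as follows. This is a minimal real-valued, training-directed version for clarity; a CAP receiver would operate on the two-dimensional CAP signal, and the function and parameter names are illustrative:

```python
import numpy as np

def rls_equalize(x, d, n_taps=7, lam=0.99, delta=0.01):
    """Recursive least squares FIR equalizer trained on symbols d."""
    w = np.zeros(n_taps)
    P = np.eye(n_taps) / delta                # inverse input-correlation estimate
    y = np.zeros(len(x))
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]     # tap-delay line, newest first
        y[n] = w @ u
        e = d[n] - y[n]                       # a priori error
        k = P @ u / (lam + u @ P @ u)         # gain vector
        w = w + k * e
        P = (P - np.outer(k, u @ P)) / lam    # Riccati update of P
    return w, y
```

The per-sample matrix update is what buys RLS its fast convergence relative to LMS, at the cost of O(n_taps²) work per symbol.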

  19. Automatic threshold selection using histogram quantization

    NASA Astrophysics Data System (ADS)

    Wang, Yue; Adali, Tulay; Lo, Shih-Chung B.

    1997-04-01

    An automatic threshold selection method is proposed for biomedical image analysis based on a histogram coding scheme. The threshold values can be determined based on the well-known Lloyd-Max scalar quantization rule, which is optimal in the sense of achieving minimum mean-square-error distortion. An iterative self-organizing learning rule is derived to determine the threshold levels. The rule does not require any prior information about the histogram, hence is fully automatic. Experimental results show that this new approach is easy to implement yet is highly efficient, robust with respect to noise, and yields reliable estimates of the threshold levels.
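The Lloyd-Max rule the method builds on alternates two steps: place each threshold midway between adjacent representation levels, then move each level to the histogram centroid of its interval. For two levels this reduces to the classic isodata threshold. A plain batch-iteration sketch (the paper's self-organizing learning-rule formulation is not reproduced here):

```python
import numpy as np

def lloyd_max_thresholds(data, n_levels=2, n_iter=100):
    """Iterative Lloyd-Max threshold selection from a 256-bin histogram."""
    counts, edges = np.histogram(np.ravel(data), bins=256)
    centers = (edges[:-1] + edges[1:]) / 2
    # initialize representation levels uniformly over the data range
    levels = np.linspace(centers[0], centers[-1], n_levels + 2)[1:-1]
    for _ in range(n_iter):
        t = (levels[:-1] + levels[1:]) / 2       # thresholds midway between levels
        idx = np.searchsorted(t, centers)        # assign each bin to a level
        for j in range(n_levels):
            m = idx == j
            if counts[m].sum() > 0:
                levels[j] = np.average(centers[m], weights=counts[m])  # centroid
    return (levels[:-1] + levels[1:]) / 2, levels
```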

  20. The Development of Cluster and Histogram Methods

    NASA Astrophysics Data System (ADS)

    Swendsen, Robert H.

    2003-11-01

    This talk will review the history of both cluster and histogram methods for Monte Carlo simulations. Cluster methods are based on the famous exact mapping by Fortuin and Kasteleyn from general Potts models onto a percolation representation. I will discuss the Swendsen-Wang algorithm, as well as its improvement and extension to more general spin models by Wolff. The Replica Monte Carlo method further extended cluster simulations to deal with frustrated systems. The history of histograms is quite extensive, and can only be summarized briefly in this talk. It goes back at least to work by Salsburg et al. in 1959. Since then, it has been forgotten and rediscovered several times. The modern use of the method has exploited its ability to efficiently determine the location and height of peaks in various quantities, which is of prime importance in the analysis of critical phenomena. The extensions of this approach to the multiple histogram method and multicanonical ensembles have allowed information to be obtained over a broad range of parameters. Histogram simulations and analyses have become standard techniques in Monte Carlo simulations.

  1. Equal Access.

    ERIC Educational Resources Information Center

    De Patta, Joe

    2003-01-01

    Presents an interview with Stephen McCarthy, co-partner and president of Equal Access ADA Consulting Architects of San Diego, California, about designing schools to naturally integrate compliance with the Americans with Disabilities Act (ADA). (EV)

  2. Histogramming of the Charged Particle Measurements with MSL/RAD - Comparison of Histogram Data with Simulations

    NASA Astrophysics Data System (ADS)

    Ehresmann, B.; Zeitlin, C.; Hassler, D. M.; Wimmer-Schweingruber, R. F.; Boettcher, S.; Koehler, J.; Martin, C.; Brinza, D.; Rafkin, S. C.

    2012-12-01

    The Radiation Assessment Detector (RAD) on-board the Mars Science Laboratory (MSL) is designed to measure a broad range of energetic particle radiation. A significant part of this radiation consists of charged particles, which mainly stem from cosmic background radiation, Solar particle events, and secondaries created by the interaction of these particles with the Martian atmosphere and soil. To measure charged particles RAD is equipped with a set of detectors: a particle telescope consisting of three silicon Solid-State Detectors (SSDs), a CsI scintillator and a plastic scintillator, as well as a further plastic scintillator used as anti-coincidence. RAD uses an elaborate post-processing logic to analyze if a measured event qualifies as a charged particle, as well as to distinguish between particles stopping in any one of the detectors and particles penetrating the whole detector stack. RAD then arranges these qualifying events in an appropriate stopping or penetrating charged particle histogram, reducing the data volume necessary to maintain crucial information about the measured particle. For ground-based data analysis it is of prime importance to derive information, such as particle species or energy, from the data in the downloaded histograms. Here, we will present how the chosen binning of these histograms enables us to derive this information. Pre-flight, we used the Monte-Carlo code GEANT4 to simulate the expected particle radiation and its interactions with a full model of the RAD sensor head. By mirroring the on-board processing logic, we derived statistics of which particle species and energies populate any one bin in the set of charged particle histograms. Finally, we will compare the resulting histogram data from RAD cruise and surface observations with simulations. RAD is supported by NASA (HEOMD) under JPL subcontract #1273039 to SwRI, and by DLR in Germany under contract to Christian-Albrechts-Universitaet zu Kiel (CAU).

  3. Comparison of Histograms for Use in Cloud Observation and Modeling

    NASA Technical Reports Server (NTRS)

    Green, Lisa; Xu, Kuan-Man

    2005-01-01

    Cloud observation and cloud modeling data can be presented in histograms for each characteristic to be measured. Combining information from single-cloud histograms yields a summary histogram. Summary histograms can be compared to each other to reach conclusions about the behavior of an ensemble of clouds in different places at different times or about the accuracy of a particular cloud model. As in any scientific comparison, it is necessary to decide whether any apparent differences are statistically significant. The usual methods of deciding statistical significance when comparing histograms do not apply in this case because they assume independent data. Thus, a new method is necessary. The proposed method uses the Euclidean distance metric and bootstrapping to calculate the significance level.
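The Euclidean-distance-plus-bootstrap test can be sketched as follows. The pooling and resampling scheme shown is one plausible reading of the abstract, not necessarily the authors' exact procedure, and all names are illustrative:

```python
import numpy as np

def histogram_distance_pvalue(group_a, group_b, bins, n_boot=2000, seed=0):
    """Bootstrap significance of the Euclidean distance between two summary
    histograms, each averaged over per-cloud (non-independent) histograms."""
    rng = np.random.default_rng(seed)

    def summary(group):  # combine single-cloud histograms into a summary histogram
        return np.mean([np.histogram(s, bins=bins, density=True)[0] for s in group],
                       axis=0)

    observed = np.linalg.norm(summary(group_a) - summary(group_b))
    pooled = list(group_a) + list(group_b)
    n_a = len(group_a)
    dists = []
    for _ in range(n_boot):
        # resample clouds (not individual values) under the null of no difference
        resampled = [pooled[i] for i in rng.integers(0, len(pooled), len(pooled))]
        dists.append(np.linalg.norm(summary(resampled[:n_a]) - summary(resampled[n_a:])))
    return observed, np.mean(np.array(dists) >= observed)   # bootstrap p-value
```

Resampling whole clouds rather than individual observations is what sidesteps the independence assumption of standard histogram-comparison tests.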

  4. An automated blood vessel segmentation algorithm using histogram equalization and automatic threshold selection.

    PubMed

    Saleh, Marwan D; Eswaran, C; Mueen, Ahmed

    2011-08-01

    This paper focuses on the detection of retinal blood vessels, which play a vital role in reducing proliferative diabetic retinopathy and in preventing the loss of visual capability. The proposed algorithm, which takes advantage of powerful preprocessing techniques such as contrast enhancement and thresholding, offers an automated segmentation procedure for retinal blood vessels. To evaluate the performance of the new algorithm, experiments are conducted on 40 images collected from the DRIVE database. The results show that the proposed algorithm performs better than other known algorithms in terms of accuracy. Furthermore, the proposed algorithm, being simple and easy to implement, is well suited for fast processing applications.

  5. AHIMSA - Ad hoc histogram information measure sensing algorithm for feature selection in the context of histogram inspired clustering techniques

    NASA Technical Reports Server (NTRS)

    Dasarathy, B. V.

    1976-01-01

    An algorithm is proposed for dimensionality reduction in the context of clustering techniques based on histogram analysis. The approach is based on an evaluation of the hills and valleys in the unidimensional histograms along the different features and provides an economical means of assessing the significance of the features in a nonparametric unsupervised data environment. The method has relevance to remote sensing applications.
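The hill-and-valley evaluation of unidimensional histograms can be sketched with a simple local-extremum scan; the significance score the paper derives from these extrema is not reproduced here:

```python
import numpy as np

def hills_and_valleys(counts):
    """Indices of local maxima (hills) and minima (valleys) in a 1-D histogram."""
    c = np.asarray(counts, dtype=float)
    hills = [i for i in range(1, len(c) - 1) if c[i] > c[i-1] and c[i] >= c[i+1]]
    valleys = [i for i in range(1, len(c) - 1) if c[i] < c[i-1] and c[i] <= c[i+1]]
    return hills, valleys
```

A feature whose histogram shows several well-separated hills divided by deep valleys is a good candidate for unsupervised clustering; a single flat hill suggests the feature carries little cluster structure.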

  6. Automatic Contrast Enhancement of Brain MR Images Using Hierarchical Correlation Histogram Analysis.

    PubMed

    Chen, Chiao-Min; Chen, Chih-Cheng; Wu, Ming-Chi; Horng, Gwoboa; Wu, Hsien-Chu; Hsueh, Shih-Hua; Ho, His-Yun

    Parkinson's disease is a progressive neurodegenerative disorder that has a higher probability of occurrence in middle-aged and older adults than in the young. With the use of a computer-aided diagnosis (CAD) system, abnormal cell regions can be identified, and this identification can help medical personnel to evaluate the chance of disease. This study proposes a hierarchical correlation histogram analysis based on the grayscale distribution of pixel intensities; by constructing a correlation histogram, it improves adaptive contrast enhancement for specific objects. The proposed method produces significant results during contrast-enhancement preprocessing and facilitates subsequent CAD processes, thereby reducing recognition time and improving accuracy. Experimental results based on two quantitative image-quality measures, PSNR and average gradient, show that the proposed method is superior to existing methods. Furthermore, the edge information pertaining to specific cells can effectively increase the accuracy of the results.

  7. Brightness preserving image enhancement based on a gradient and intensity histogram

    NASA Astrophysics Data System (ADS)

    Sun, Zebin; Feng, Wenquan; Zhao, Qi; Huang, Lidong

    2015-09-01

    We present a straightforward brightness-preserving image enhancement technique. The proposed method is based on an original gradient and intensity histogram (GIH) which contains both the gradient and intensity information of the image. This property enables GIH to avoid the high peaks of the traditional intensity histogram and thus alleviates over-enhancement in our enhancement method, i.e., gradient and intensity histogram equalization (GIHE). GIHE can also enhance the gradient strength of an image, which improves subjective quality, since the human visual system is more sensitive to gradients than to the absolute intensity of an image. Considering that brightness preservation and dynamic range compression are highly demanded in consumer electronics, we manipulate the intensity of the enhanced image by amplifying the small intensities and attenuating the large intensities, respectively, using a brightness preserving function (BPF). The BPF is straightforward and universal and can be used in other image enhancement techniques. We demonstrate that the proposed method can effectively improve subjective quality as well as preserve the brightness of the input image.

  8. 2000 fps multi-object tracking based on color histogram

    NASA Astrophysics Data System (ADS)

    Gu, Qingyi; Takaki, Takeshi; Ishii, Idaku

    2012-06-01

    In this study, we develop a real-time, color histogram-based tracking system for multiple color-patterned objects in a 512×512 image at 2000 fps. Our system can simultaneously extract the positions, areas, orientation angles, and color histograms of multiple objects in an image using the hardware implementation of a multi-object, color histogram extraction circuit module on a high-speed vision platform. It can both label multiple objects in an image consisting of connected components and calculate their moment features and 16-bin hue-based color histograms using cell-based labeling. We demonstrate the performance of our system by showing several experimental results: (1) tracking of multiple color-patterned objects on a plate rotating at 16 rps, and (2) tracking of human hand movement with two color-patterned drinking bottles.

  9. Approximate Splitting for Ensembles of Trees using Histograms

    SciTech Connect

    Kamath, C; Cantu-Paz, E; Littau, D

    2001-09-28

    Recent work in classification indicates that significant improvements in accuracy can be obtained by growing an ensemble of classifiers and having them vote for the most popular class. Implicit in many of these techniques is the concept of randomization that generates different classifiers. In this paper, we focus on ensembles of decision trees that are created using a randomized procedure based on histograms. Techniques such as histograms, which discretize continuous variables, have long been used in classification to convert the data into a form suitable for processing and to reduce the compute time. Our approach combines the ideas behind discretization through histograms and randomization in ensembles to create decision trees by randomly selecting a split point in an interval around the best bin boundary in the histogram. Experimental results with public-domain data show that ensembles generated using this approach are competitive in accuracy and superior in computational cost to other ensemble techniques such as boosting and bagging.
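The randomized split rule, picking a split point at random inside an interval around the best histogram bin boundary, can be sketched as follows. Gini impurity is assumed as the splitting criterion, and the interval width of one bin on each side of the best boundary is an illustrative choice; neither detail is specified by the abstract:

```python
import numpy as np

def randomized_split(feature, labels, n_bins=32, rng=None):
    """Histogram-based split: find the best bin boundary by Gini gain,
    then draw the split point at random from the bins adjacent to it."""
    rng = rng or np.random.default_rng()
    counts, edges = np.histogram(feature, bins=n_bins)

    def gini(y):
        if len(y) == 0:
            return 0.0
        p = np.bincount(y) / len(y)
        return 1.0 - np.sum(p ** 2)

    parent = gini(labels)
    best_gain, best_i = -1.0, 1
    for i in range(1, n_bins):                       # candidate boundaries
        left = labels[feature < edges[i]]
        right = labels[feature >= edges[i]]
        gain = parent - (len(left) * gini(left) + len(right) * gini(right)) / len(labels)
        if gain > best_gain:
            best_gain, best_i = gain, i
    # random split point within one bin on either side of the best boundary
    return rng.uniform(edges[best_i - 1], edges[best_i + 1])
```

Different random draws around the same boundary yield slightly different trees, which is exactly the source of diversity the ensemble votes over.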

  10. Frequency distribution histograms for the rapid analysis of data

    NASA Technical Reports Server (NTRS)

    Burke, P. V.; Bullen, B. L.; Poff, K. L.

    1988-01-01

    The mean and standard error are good representations for the response of a population to an experimental parameter and are frequently used for this purpose. Frequency distribution histograms show, in addition, responses of individuals in the population. Both the statistics and a visual display of the distribution of the responses can be obtained easily using a microcomputer and available programs. The type of distribution shown by the histogram may suggest different mechanisms to be tested.

  11. Dose-volume histogram prediction using density estimation.

    PubMed

    Skarpman Munter, Johanna; Sjölund, Jens

    2015-09-07

    Knowledge of what dose-volume histograms can be expected for a previously unseen patient could increase consistency and quality in radiotherapy treatment planning. We propose a machine learning method that uses previous treatment plans to predict such dose-volume histograms. The key to the approach is the framing of dose-volume histograms in a probabilistic setting. The training consists of estimating, from the patients in the training set, the joint probability distribution of some predictive features and the dose. The joint distribution immediately provides an estimate of the conditional probability of the dose given the values of the predictive features. The prediction consists of estimating, from the new patient, the distribution of the predictive features and marginalizing the conditional probability from the training over this. Integrating the resulting probability distribution for the dose yields an estimate of the dose-volume histogram. To illustrate how the proposed method relates to previously proposed methods, we use the signed distance to the target boundary as a single predictive feature. As a proof of concept, we predicted dose-volume histograms for the brainstems of 22 acoustic schwannoma patients treated with stereotactic radiosurgery, and for the lungs of 9 lung cancer patients treated with stereotactic body radiation therapy. Comparing with two previous attempts at dose-volume histogram prediction, we find that, given the same input data, the predictions are similar. In summary, we propose a method for dose-volume histogram prediction that exploits the intrinsic probabilistic properties of dose-volume histograms. We argue that the proposed method makes up for some deficiencies in previously proposed methods, thereby potentially increasing ease of use, flexibility and the ability to perform well with small amounts of training data.
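For reference, the (cumulative) dose-volume histogram itself is just the fraction of structure volume receiving at least each dose level; a minimal sketch, leaving the paper's density-estimation machinery aside:

```python
import numpy as np

def dose_volume_histogram(dose, dose_grid):
    """Cumulative DVH: fraction of voxels receiving at least each dose in dose_grid."""
    dose = np.asarray(dose).ravel()
    return np.array([(dose >= d).mean() for d in dose_grid])
```

In the paper's framing, integrating the predicted probability distribution of dose over the structure produces exactly this kind of curve.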

  12. An adaptive enhancement algorithm for infrared video based on modified k-means clustering

    NASA Astrophysics Data System (ADS)

    Zhang, Linze; Wang, Jingqi; Wu, Wen

    2016-09-01

    In this paper, we propose a video enhancement algorithm to improve the output video of infrared cameras. The video obtained by an infrared camera is sometimes very dark, since there is no clear target. In this case, the infrared video is divided into frame images by frame extraction, so that image enhancement can be carried out on each frame. The first frame image is divided into k sub-images by K-means clustering according to the gray intervals they occupy, and each sub-image is histogram-equalized according to the amount of information it contains; a method is used to handle the problem that final cluster centers can lie close to each other in some cases. For the other frame images, the initial cluster centers are determined by the final cluster centers of the previous frame, and the histogram equalization of each sub-image is carried out after image segmentation based on K-means clustering. The histogram equalization spreads the gray values of each sub-image over the whole gray range, with the gray-level range of each sub-image determined by the ratio of its pixels to those of the frame image. Experimental results show that this algorithm can improve the contrast of infrared video in dim scenes where the night target is not obvious, and adaptively reduces, within a certain range, the negative effect of overexposed pixels.
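The per-frame pipeline, K-means on grey levels, then histogram equalization of each cluster over a gray range proportional to its pixel share, with the previous frame's final centers seeding the next frame, can be sketched as below. All function names are illustrative, and the rank-based equalization is one simple way to realize the per-cluster step:

```python
import numpy as np

def kmeans_1d(values, k, n_iter=20, init=None):
    """1-D k-means on grey levels; `init` lets a previous frame's centers seed this one."""
    centers = (np.linspace(values.min(), values.max(), k) if init is None
               else np.array(init, dtype=float))
    for _ in range(n_iter):
        labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = values[labels == j].mean()
    return labels, centers

def clustered_equalize(image, k=3, init=None):
    """Split grey levels into k clusters, then equalize each cluster into its
    own slice of [0, 255]; slice width is proportional to the cluster's pixel share."""
    flat = image.ravel().astype(float)
    labels, centers = kmeans_1d(flat, k, init=init)
    out = np.empty_like(flat)
    lo = 0.0
    for j in np.argsort(centers):                  # darkest cluster first
        m = labels == j
        span = 255.0 * m.mean()                    # grey range ∝ pixel count
        r = flat[m]
        ranks = np.argsort(np.argsort(r))          # rank-based equalization
        out[m] = lo + span * ranks / max(len(r) - 1, 1)
        lo += span
    return out.reshape(image.shape), centers       # centers seed the next frame
```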

  13. Performance analysis of low-complexity adaptive frequency-domain equalization and MIMO signal processing for compensation of differential mode group delay in mode-division multiplexing communication systems using few-mode fibers

    NASA Astrophysics Data System (ADS)

    Weng, Yi; He, Xuan; Pan, Zhongqi

    2016-02-01

    Mode-division multiplexing (MDM) transmission systems utilizing few-mode fibers (FMF) have been intensively explored to sustain continuous traffic growth. The key challenges of MDM systems are inter-modal crosstalk due to random mode coupling (RMC) and the large accumulated differential mode group delay (DMGD), which hinders mode-demultiplexer implementation. Adaptive multi-input multi-output (MIMO) frequency-domain equalization (FDE) can dynamically compensate DMGD using digital signal processing (DSP) algorithms. The frequency-domain least-mean-squares (FD-LMS) algorithm has been universally adopted for high-speed MDM communications, mainly for its relatively low computational complexity. However, a longer training sequence must be appended for FD-LMS to achieve faster convergence, which incurs prohibitively high system overhead and reduces overall throughput. In this paper, we propose a fast-convergent single-stage adaptive frequency-domain recursive least-squares (FD-RLS) algorithm with reduced complexity for DMGD compensation at MDM coherent receivers. A performance and complexity comparison of FD-RLS with the signal-PSD-dependent FD-LMS method and the conventional FD-LMS approach is performed in a 3000 km six-mode transmission system with 65 ps/km DMGD. We explore the convergence speed of the three adaptive algorithms in terms of the normalized mean-square error (NMSE) per fast Fourier transform (FFT) block at 14-30 dB OSNR. The fast convergence of FD-RLS is obtained at the expense of a slightly increased number of taps for the MIMO equalizers, and it can partially save the overhead of the training sequence. Furthermore, we demonstrate that adaptive FD-RLS can also be used for chromatic dispersion (CD) compensation without increasing the filter tap length, thus prominently reducing the DSP implementation complexity of MDM systems.
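The per-block frequency-domain adaptive update underlying FD-LMS can be sketched as follows. The normalized variant below (per-bin step size divided by the input power) is shown for stability, and two things a real coherent receiver needs are deliberately omitted: the overlap-save block structure and the gradient constraint that makes the per-bin filter equivalent to a linear (rather than circular) convolution:

```python
import numpy as np

def fd_nlms_step(w, x_block, d_block, mu=0.5, eps=1e-6):
    """One block of normalized frequency-domain LMS under a
    circular-convolution channel model. w holds one complex weight per FFT bin."""
    X = np.fft.fft(x_block)
    E = np.fft.fft(d_block) - X * w                      # per-bin error
    return w + mu * np.conj(X) * E / (np.abs(X) ** 2 + eps)
```

An FD-RLS variant replaces the normalized gradient step with a per-bin recursive least-squares update, trading extra per-bin state for faster convergence, which is the trade-off the paper quantifies.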

  14. Adaptation.

    PubMed

    Broom, Donald M

    2006-01-01

    The term adaptation is used in biology in three different ways. It may refer to changes which occur at the cell and organ level, or at the individual level, or at the level of gene action and evolutionary processes. Adaptation by cells, especially nerve cells, helps in communication within the body, the distinguishing of stimuli, the avoidance of overload and the conservation of energy. The time course and complexity of these mechanisms vary. Adaptive characters of organisms, including adaptive behaviours, increase fitness, so this adaptation is evolutionary. The major part of this paper concerns adaptation by individuals and its relationship to welfare. In complex animals, feed-forward control is widely used. Individuals predict problems and adapt by acting before the environmental effect is substantial. Much of adaptation involves brain control, and animals have a set of needs, located in the brain and acting largely via motivational mechanisms, to regulate life. Needs may be for resources but are also for actions and stimuli which are part of the mechanism which has evolved to obtain the resources. Hence pigs do not just need food but need to be able to carry out actions like rooting in earth or manipulating materials which are part of foraging behaviour. The welfare of an individual is its state as regards its attempts to cope with its environment. This state includes various adaptive mechanisms, including feelings and those which cope with disease. The part of welfare which is concerned with coping with pathology is health. Disease, which implies some significant effect of pathology, always results in poor welfare. Welfare varies over a range from very good, when adaptation is effective and there are feelings of pleasure or contentment, to very poor. A key point concerning the concept of individual adaptation in relation to welfare is that welfare may be good or poor while adaptation is occurring. Some adaptation is very easy and energetically cheap and

  15. Gradient histogram estimation and preservation for texture enhanced image denoising.

    PubMed

    Zuo, Wangmeng; Zhang, Lei; Song, Chunwei; Zhang, David; Gao, Huijun

    2014-06-01

    Natural image statistics plays an important role in image denoising, and various natural image priors, including gradient-based, sparse representation-based, and nonlocal self-similarity-based ones, have been widely studied and exploited for noise removal. In spite of the great success of many denoising algorithms, they tend to smooth the fine scale image textures when removing noise, degrading the image visual quality. To address this problem, in this paper, we propose a texture enhanced image denoising method by enforcing the gradient histogram of the denoised image to be close to a reference gradient histogram of the original image. Given the reference gradient histogram, a novel gradient histogram preservation (GHP) algorithm is developed to enhance the texture structures while removing noise. Two region-based variants of GHP are proposed for the denoising of images consisting of regions with different textures. An algorithm is also developed to effectively estimate the reference gradient histogram from the noisy observation of the unknown image. Our experimental results demonstrate that the proposed GHP algorithm can well preserve the texture appearance in the denoised images, making them look more natural.
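    The core idea of enforcing a reference gradient histogram can be illustrated by a plain histogram-specification step on gradient magnitudes (a simplified stand-in: the actual GHP algorithm embeds this matching inside an iterative sparse-coding denoiser):

```python
import numpy as np

def match_gradient_histogram(grad, ref_grad):
    """Monotone remapping of gradient magnitudes so their distribution
    matches a reference distribution (rank-based histogram specification)."""
    shape = grad.shape
    g = grad.ravel()
    order = np.argsort(g)                      # ranks of current gradients
    target = np.sort(ref_grad.ravel())         # reference values, sorted
    # map the i-th smallest gradient to the i-th smallest reference value
    idx = np.linspace(0, target.size - 1, g.size).round().astype(int)
    out = np.empty_like(g)
    out[order] = target[idx]
    return out.reshape(shape)
```

    The remapping is monotone, so gradient ordering (and hence edge polarity) is preserved while the histogram is forced onto the reference.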

  16. Navigating a mobile robot by a traversability field histogram.

    PubMed

    Ye, Cang

    2007-04-01

    This paper presents an autonomous terrain navigation system for a mobile robot. The system employs a two-dimensional laser range finder (LRF) for terrain mapping. A so-called "traversability field histogram" (TFH) method is proposed to guide the robot. The TFH method first transforms a local terrain map surrounding the robot's momentary position into a traversability map by extracting the slope and roughness of each terrain patch through least-squares plane fitting. It then computes a so-called "polar traversability index" (PTI) that represents the overall difficulty of traveling along the corresponding direction. The PTIs are represented in the form of a histogram. Based on this histogram, the velocity and steering commands of the robot are determined. The concepts of a virtual valley and an exit condition are proposed and used to direct the robot such that it can reach the target with a finite-length path. The algorithm is verified by simulation and experimental results.
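    The slope-and-roughness extraction by least-squares plane fitting that underlies the traversability map might look like the following sketch (the grid spacing parameter and the use of residual standard deviation as the roughness measure are assumptions):

```python
import numpy as np

def patch_slope_roughness(z, cell=1.0):
    """Slope (degrees) and roughness of a square terrain patch via
    least-squares fitting of the plane z = a*x + b*y + c."""
    n = z.shape[0]
    y, x = np.mgrid[0:n, 0:n] * cell
    A = np.c_[x.ravel(), y.ravel(), np.ones(n * n)]
    coef, *_ = np.linalg.lstsq(A, z.ravel(), rcond=None)
    a, b, _ = coef
    slope = np.degrees(np.arctan(np.hypot(a, b)))  # tilt of fitted plane
    residual = z.ravel() - A @ coef
    roughness = residual.std()                     # scatter about the plane
    return slope, roughness
```

    Directions whose patches combine steep slope and high roughness would then map to high polar traversability indices.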

  17. Face recognition with histograms of fractional differential gradients

    NASA Astrophysics Data System (ADS)

    Yu, Lei; Ma, Yan; Cao, Qi

    2014-05-01

    It has been shown that fractional differentiation can enhance edge information and nonlinearly preserve textural detail in an image. This paper investigates its ability for face recognition and presents a local descriptor called histograms of fractional differential gradients (HFDG) to extract facial visual features. HFDG encodes a face image into gradient patterns using multiorientation fractional differential masks, from which histograms of gradient directions are computed as the face representation. Experimental results on Yale, face recognition technology (FERET), Carnegie Mellon University pose, illumination, and expression (CMU PIE), and A. Martinez and R. Benavente (AR) databases validate the feasibility of the proposed method and show that HFDG outperforms local binary patterns (LBP), histograms of oriented gradients (HOG), enhanced local directional patterns (ELDP), and Gabor feature-based methods.

  18. Click-iT assay with improved DNA distribution histograms.

    PubMed

    Hamelik, Ronald M; Krishan, Awtar

    2009-10-01

    The Click-iT Assay developed and commercialized by Invitrogen is based on incorporation of a new 5-bromo-2'-deoxyuridine analog, 5-ethynyl-2'-deoxyuridine (EdU), into newly synthesized DNA and its recognition by azide dyes via a copper-mediated "click" reaction. This relatively convenient and useful procedure depends on fixation of cells with paraformaldehyde and staining of the DNA with 7-aminoactinomycin-D (7-AAD). Both of these procedures result in DNA histograms with broad coefficients of variation (CVs). In this report, we have shown that after EdU incorporation, nuclei isolated by lysis can be incubated with the Click-iT Assay and stained with propidium iodide for generation of DNA histograms with low CVs. This modified procedure results in better DNA histograms by replacing 7-AAD with propidium iodide and also saves processing time by eliminating the fixation and permeabilization steps.

  19. Java multi-histogram volume rendering framework for medical images

    NASA Astrophysics Data System (ADS)

    Senseney, Justin; Bokinsky, Alexandra; Cheng, Ruida; McCreedy, Evan; McAuliffe, Matthew J.

    2013-03-01

    This work extends the multi-histogram volume rendering framework proposed by Kniss et al. [1] to provide rendering results based on the impression of overlaid triangles on a graph of image intensity versus gradient magnitude. The developed method of volume rendering allows greater emphasis on boundary visualization while avoiding issues common in medical image acquisition. For example, partial-voluming effects in computed tomography and intensity inhomogeneity of similar tissue types in magnetic resonance imaging introduce pixel values that will not reflect differing tissue types when a standard transfer function is applied to an intensity histogram. This new framework improves upon the Kniss multi-histogram framework by using Java, the GPU, and MIPAV, an open-source medical image processing application, to allow multi-histogram techniques to be widely disseminated; the earlier OpenGL view-aligned texture rendering approach suffered from performance setbacks, inaccessibility, and usability problems. Rendering results can now be interactively compared with other rendering frameworks, surfaces can be extracted for use in other programs, and file formats that are widely used in the field of biomedical imaging can be visualized using this multi-histogram approach. OpenCL and GLSL are used to produce this new multi-histogram approach, leveraging texture memory on the graphics processing unit of desktops to provide a new interactive method for visualizing biomedical images. Performance results for this method are generated and qualitative rendering results are compared. The resulting framework provides the opportunity for further applications in medical imaging, both in volume rendering and in generic image processing.
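    The domain on which such multi-histogram transfer-function widgets are drawn is the joint histogram of voxel intensity versus gradient magnitude; a minimal sketch of computing it for a volume (the bin count is an assumption):

```python
import numpy as np

def intensity_gradient_histogram(vol, bins=64):
    """Joint 2-D histogram of voxel intensity vs. gradient magnitude,
    the space in which boundary-emphasizing transfer functions are drawn."""
    gz, gy, gx = np.gradient(vol.astype(float))
    gmag = np.sqrt(gx ** 2 + gy ** 2 + gz ** 2)
    hist, iedges, gedges = np.histogram2d(vol.ravel(), gmag.ravel(), bins=bins)
    return hist, iedges, gedges
```

    Material interiors cluster at low gradient magnitude, while boundaries between tissues form arches at high gradient magnitude, which is where the overlaid triangles are placed.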

  20. Some considerations in the analysis of clinical DNA histogram data

    SciTech Connect

    Jett, J.

    1990-01-01

    In this brief paper, we will examine the theoretical basis of several cell cycle distribution analysis techniques that are frequently used for the analysis of clinical DNA histograms. The class of analysis technique that will be discussed is that which assumes a model that describes the DNA distribution and uses some sort of fitting procedure to adjust the parameters in the model so that the model agrees with the data as well as possible. Several of the techniques described are included in commercially available DNA histogram analysis packages. 16 refs., 7 figs.

  1. Adapt

    NASA Astrophysics Data System (ADS)

    Bargatze, L. F.

    2015-12-01

    Active Data Archive Product Tracking (ADAPT) is a collection of software routines that permits one to generate XML metadata files to describe and register data products in support of the NASA Heliophysics Virtual Observatory VxO effort. ADAPT is also a philosophy. The ADAPT concept is to use any and all available metadata associated with scientific data to produce XML metadata descriptions in a consistent, uniform, and organized fashion to provide blanket access to the full complement of data stored on a targeted data server. In this poster, we present an application of ADAPT to describe all of the data products that are stored by using the Common Data File (CDF) format served out by the CDAWEB and SPDF data servers hosted at the NASA Goddard Space Flight Center. These data servers are the primary repositories for NASA Heliophysics data. For this purpose, the ADAPT routines have been used to generate data resource descriptions by using an XML schema named Space Physics Archive, Search, and Extract (SPASE). SPASE is the designated standard for documenting Heliophysics data products, as adopted by the Heliophysics Data and Model Consortium. The set of SPASE XML resource descriptions produced by ADAPT includes high-level descriptions of numerical data products, display data products, or catalogs and also includes low-level "Granule" descriptions. A SPASE Granule is effectively a universal access metadata resource; a Granule associates an individual data file (e.g. a CDF file) with a "parent" high-level data resource description, assigns a resource identifier to the file, and lists the corresponding access URL(s). The CDAWEB and SPDF file systems were queried to provide the input required by the ADAPT software to create an initial set of SPASE metadata resource descriptions. Then, the CDAWEB and SPDF data repositories were queried subsequently on a nightly basis and the CDF file lists were checked for any changes such as the occurrence of new, modified, or deleted

  2. Forget about equality.

    PubMed

    Powers, Madison

    1996-06-01

    Justice is widely thought to consist in equality. For many theorists, the central question has been: Equality of what? The author argues that the ideal of equality distorts practical reasoning and has deeply counterintuitive implications. Moreover, an alternative view of distributive justice can give a better account of what egalitarians should care about than can any of the competing ideals of equality.

  3. A Novel Method for the Evaluation of Uncertainty in Dose-Volume Histogram Computation

    SciTech Connect

    Henriquez, Francisco Cutanda; Castrillon, Silvia Vargas

    2008-03-15

    Purpose: Dose-volume histograms (DVHs) are a useful tool in state-of-the-art radiotherapy treatment planning, and it is essential to recognize their limitations. Even after a specific dose-calculation model is optimized, dose distributions computed by using treatment-planning systems are affected by several sources of uncertainty, such as algorithm limitations, measurement uncertainty in the data used to model the beam, and residual differences between measured and computed dose. This report presents a novel method to take them into account. Methods and Materials: To take into account the effect of associated uncertainties, a probabilistic approach using a new kind of histogram, a dose-expected volume histogram, is introduced. The expected value of the volume in the region of interest receiving an absorbed dose equal to or greater than a certain value is found by using the probability distribution of the dose at each point. A rectangular probability distribution is assumed for this point dose, and a formulation that accounts for uncertainties associated with point dose is presented for practical computations. Results: This method is applied to a set of DVHs for different regions of interest, including 6 brain patients, 8 lung patients, 8 pelvis patients, and 6 prostate patients planned for intensity-modulated radiation therapy. Conclusions: Results show a greater effect on planning target volume coverage than on organs at risk. In cases of steep DVH gradients, such as planning target volumes, this new method shows the largest differences from the corresponding DVH; thus, the effect of the uncertainty is larger.
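    With a rectangular (uniform) distribution of half-width delta assumed for each point dose, the expected volume receiving at least a given dose reduces to averaging per-point probabilities; a small sketch of this dose-expected volume histogram idea (function and parameter names are illustrative, not the paper's formulation):

```python
import numpy as np

def expected_volume_histogram(doses, delta, dose_axis):
    """Expected fractional volume receiving >= each dose level, when each
    computed point dose D_i is uniform on [d_i - delta, d_i + delta]:
    EVH(d) = mean_i P(D_i >= d), with P given by the rectangular CDF."""
    doses = np.asarray(doses, dtype=float)
    d = np.asarray(dose_axis, dtype=float)[:, None]   # dose levels as column
    p = np.clip((doses[None, :] + delta - d) / (2.0 * delta), 0.0, 1.0)
    return p.mean(axis=1)
```

    As delta goes to zero the curve converges to the ordinary cumulative DVH; steep DVH regions (e.g. planning target volumes) are smoothed the most, matching the paper's observation.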

  4. Histogram of Oriented Gradient Based Gist Feature for Building Recognition.

    PubMed

    Li, Bin; Cheng, Kaili; Yu, Zhezhou

    2016-01-01

    We proposed a new method of gist feature extraction for building recognition and named the feature extracted by this method as the histogram of oriented gradient based gist (HOG-gist). The proposed method individually computes the normalized histograms of multiorientation gradients for the same image with four different scales. The traditional approach uses the Gabor filters with four angles and four different scales to extract orientation gist feature vectors from an image. Our method, in contrast, uses the normalized histogram of oriented gradient as orientation gist feature vectors of the same image. These HOG-based orientation gist vectors, combined with intensity and color gist feature vectors, are the proposed HOG-gist vectors. In general, the HOG-gist contains four multiorientation histograms (four orientation gist feature vectors), and its texture description ability is stronger than that of the traditional gist using Gabor filters with four angles. Experimental results using Sheffield Buildings Database verify the feasibility and effectiveness of the proposed HOG-gist.
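    One of the four orientation gist vectors, i.e. a normalized histogram of oriented gradients at a single scale, can be sketched as follows (the bin count and the unsigned-orientation convention are assumptions):

```python
import numpy as np

def orientation_gist(img, n_bins=8):
    """Magnitude-weighted, normalized histogram of gradient orientations
    for one scale; HOG-gist concatenates such vectors over four scales
    together with intensity and color gist features."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.mod(np.arctan2(gy, gx), np.pi)     # orientations in [0, pi)
    hist, _ = np.histogram(ang, bins=n_bins, range=(0, np.pi), weights=mag)
    s = hist.sum()
    return hist / s if s > 0 else hist
```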

  5. DRDC Starfish Acoustic Sentinel and Phase Gradient Histogram Tracking

    DTIC Science & Technology

    2015-04-01

    Exponential filters are used for the acoustic sentinels, with the frequency-domain algorithm using parallel filters in each frequency bin, together with a Phase Gradient bearing estimation algorithm with Histogram Tracking. Significance for defence and security: the work supports the Force ASW project. (Report contents include: frequency-domain acoustic sentinel; Phase Gradient bearing estimation algorithm.)

  6. Histogram of Oriented Gradient Based Gist Feature for Building Recognition

    PubMed Central

    Cheng, Kaili; Yu, Zhezhou

    2016-01-01

    We proposed a new method of gist feature extraction for building recognition and named the feature extracted by this method as the histogram of oriented gradient based gist (HOG-gist). The proposed method individually computes the normalized histograms of multiorientation gradients for the same image with four different scales. The traditional approach uses the Gabor filters with four angles and four different scales to extract orientation gist feature vectors from an image. Our method, in contrast, uses the normalized histogram of oriented gradient as orientation gist feature vectors of the same image. These HOG-based orientation gist vectors, combined with intensity and color gist feature vectors, are the proposed HOG-gist vectors. In general, the HOG-gist contains four multiorientation histograms (four orientation gist feature vectors), and its texture description ability is stronger than that of the traditional gist using Gabor filters with four angles. Experimental results using Sheffield Buildings Database verify the feasibility and effectiveness of the proposed HOG-gist. PMID:27872639

  7. Science EQUALS Success.

    ERIC Educational Resources Information Center

    Cobb, Kitty B., Ed.; Conwell, Catherine R., Ed.

    The purpose of the EQUALS programs is to increase the interest and awareness that females and minorities have concerning mathematics and science related careers. This book, produced by an EQUALS program in North Carolina, contains 35 hands-on, discovery science activities that center around four EQUALS processes--problem solving, cooperative…

  8. Connecting the Equals Sign

    ERIC Educational Resources Information Center

    Jones, Ian; Pratt, Dave

    2006-01-01

    Children tend to view the equals sign as an operator symbol bereft of the rich relational properties of equality statements. It has been argued by some that this restricted view of the equals sign is due to cultural or cognitive factors. We suggest a significant factor is that rich relational meanings lack relevance within the context of…

  9. Automatic segmentation of ground-glass opacity nodule on chest CT images by histogram modeling and local contrast

    NASA Astrophysics Data System (ADS)

    Jung, Julip; Hong, Helen; Goo, Jin Mo

    2012-03-01

    We propose an automatic segmentation of ground-glass opacity (GGO) nodules on chest CT images using histogram modeling and local contrast. First, an optimal volume circumscribing a nodule is calculated from a click inside the GGO nodule. To remove noise while preserving the nodule boundary, anisotropic diffusion filtering is applied to the optimal volume. Second, to decide an appropriate threshold value for the GGO nodule, histogram modeling is performed by Gaussian mixture modeling (GMM) with three components: lung parenchyma, nodule, and chest wall or vessels. Third, the attached chest wall and vessels are separated from the GGO nodule by maximum-curvature-point linking and morphological erosion with an adaptive circular mask. Fourth, the initial boundary of the GGO nodule is refined using local contrast information. Experimental results show that attached neighboring structures are well separated from GGO nodules while missed GGO regions are recovered. The proposed segmentation method can be used to measure the growth rate of a nodule and the proportion of the solid portion inside a nodule.
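    The histogram-modeling step can be illustrated with a tiny EM fit of a 1-D Gaussian mixture to intensity samples (a generic sketch, not the authors' implementation; the quantile initialization is an assumption):

```python
import numpy as np

def gmm1d_em(x, k=3, iters=100):
    """EM fit of a k-component 1-D Gaussian mixture; for GGO segmentation
    the three components model parenchyma, nodule, and chest wall/vessels."""
    x = np.asarray(x, dtype=float)
    mu = np.quantile(x, (np.arange(k) + 0.5) / k)   # spread initial means
    var = np.full(k, x.var() / k + 1e-6)
    w = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibility of each component for each sample
        dens = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        r = w * dens
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances
        nk = r.sum(axis=0)
        w = nk / x.size
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-9
    return w, mu, var
```

    A nodule threshold can then be taken between the fitted parenchyma and nodule means.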

  10. Some Equalities Are More Equal Than Others: Quality Equality Emerges Later Than Numerical Equality.

    PubMed

    Sheskin, Mark; Nadal, Amber; Croom, Adam; Mayer, Tanya; Nissel, Jenny; Bloom, Paul

    2016-09-01

    By age 6, children typically share an equal number of resources between themselves and others. However, fairness involves not merely that each person receive an equal number of resources ("numerical equality") but also that each person receive equal quality resources ("quality equality"). In Study 1, children (N = 87, 3-10 years) typically split four resources "two each" by age 6, but typically monopolized the better two resources until age 10. In Study 2, a new group of 6- to 8-year-olds (N = 32) allocated resources to third parties according to quality equality, indicating that children in this age group understand that fairness requires both types of equality.

  11. Real Time Motion Detection Based on the Spatio-Temporal Median Filter using GPU Integral Histograms

    DTIC Science & Technology

    2012-12-01

    The integral histogram is extensible to higher dimensions and different bin structures; the integral histogram at position (x, y) in the image holds the histogram for the pixels above and to the left of (x, y). The CUDA implementation writes transposed matrix tiles to global memory (indexed via blockIdx and threadIdx with a BLOCK_DIM tile size), using conflict-free shared memory and guaranteeing that global reads and writes are coalesced; the GPU integral histogram implementation benefits from this access pattern.
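    The integral histogram structure that the report's GPU kernels compute can be sketched on the CPU as follows (a Python stand-in for the CUDA implementation; `img` holds precomputed bin indices):

```python
import numpy as np

def integral_histogram(img, n_bins):
    """ih[y, x, b]: count of pixels with bin index b in img[:y, :x]."""
    h, w = img.shape
    onehot = np.zeros((h, w, n_bins))
    onehot[np.arange(h)[:, None], np.arange(w)[None, :], img] = 1.0
    ih = np.zeros((h + 1, w + 1, n_bins))
    ih[1:, 1:] = onehot.cumsum(axis=0).cumsum(axis=1)
    return ih

def window_histogram(ih, y0, x0, y1, x1):
    """Histogram of img[y0:y1, x0:x1] from four lookups (inclusion-exclusion)."""
    return ih[y1, x1] - ih[y0, x1] - ih[y1, x0] + ih[y0, x0]
```

    Once the integral histogram is built, the histogram of any window costs O(bins) regardless of window size, which is what makes the sliding spatio-temporal median filter real-time.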

  12. A 2D histogram representation of images for pooling

    NASA Astrophysics Data System (ADS)

    Yu, Xinnan; Zhang, Yu-Jin

    2011-03-01

    Designing a suitable image representation is one of the most fundamental issues in computer vision. There are three steps in the popular bag-of-words image representation: feature extraction, coding and pooling. In the final step, current methods reduce an M × K encoded feature matrix to a K-dimensional vector (histogram), where M is the number of features and K is the size of the codebook: information is lost dramatically here. In this paper, a novel pooling method, based on a 2D histogram representation, is proposed to retain more information from the encoded image features. This pooling method can be easily incorporated into state-of-the-art computer vision system frameworks. Experiments show that our approach improves on current pooling methods, and can achieve satisfactory performance in image classification and image reranking even when using a small codebook and a costless linear SVM.
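    The contrast with standard pooling can be sketched: instead of collapsing the M × K code matrix to K sums, keep a per-codeword histogram of coefficient values (the bin count and value range below are assumptions):

```python
import numpy as np

def histogram_pooling(codes, n_bins=4, lo=0.0, hi=1.0):
    """Pool an M x K code matrix into a K x n_bins matrix: one coefficient
    histogram per codeword, normalized by the number of features M."""
    M, K = codes.shape
    pooled = np.stack([
        np.histogram(codes[:, k], bins=n_bins, range=(lo, hi))[0]
        for k in range(K)
    ])
    return pooled / M
```

    Flattening the K × B result yields a vector that is B times longer than a sum-pooled histogram but retains the distribution of coding coefficients per codeword.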

  13. Interpolated histogram method for area optimised median computation

    NASA Astrophysics Data System (ADS)

    Buch, Kaushal D.; Darji, Anand D.

    2013-04-01

    The article describes an area efficient algorithm for real-time approximate median computation on VLSI platforms. The improvement in performance and area optimisation are achieved through linear interpolation within a reduced number of histogram bins. In order to reduce the hardware utilisation further, an approximation technique for interpolation is also proposed. This approach extends the utility of the histogram method to data sets having a large dynamic range. The performance of the proposed algorithm in terms of mean squared error (MSE) and resource utilisation is provided and compared to that of the existing algorithms. This comparison indicates that more than 60% optimisation in resources is achieved with marginal compromise in the accuracy of the median. The proposed algorithm finds applications in the areas of image processing, time series analysis and median absolute deviation (MAD) computation.
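    The interpolated-histogram median itself is straightforward to sketch in software (the article targets VLSI platforms; the bin count here is an assumption):

```python
def interpolated_median(data, n_bins=16, lo=None, hi=None):
    """Approximate median from a coarse histogram, linearly interpolating
    inside the bin that contains the 50% cumulative count."""
    lo = min(data) if lo is None else lo
    hi = max(data) if hi is None else hi
    width = (hi - lo) / n_bins or 1.0
    counts = [0] * n_bins
    for v in data:
        i = min(int((v - lo) / width), n_bins - 1)
        counts[i] += 1
    half = len(data) / 2.0
    cum = 0
    for i, c in enumerate(counts):
        if cum + c >= half:
            # linear interpolation within the median bin
            return lo + (i + (half - cum) / c) * width
        cum += c
```

    The interpolation step is what lets a small number of bins cover a large dynamic range with acceptable mean squared error.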

  14. Click-iT proliferation assay with improved DNA histograms.

    PubMed

    Krishan, Awtar; Hamelik, Ronald M

    2010-04-01

    The Click-iT EdU cell proliferation assay (Invitrogen) for detection of replicating cells is based on incorporation of EdU into newly synthesized DNA and its recognition by azide dyes via a copper mediated "click" reaction. In the protocol provided by Invitrogen, cells are fixed with paraformaldehyde and stained with 7-aminoactinomycin D (7-AAD) for DNA content analysis. Both of these procedures result in DNA histograms with a broad coefficient of variation. We have modified this protocol and show that after EdU incorporation, nuclei isolated by hypotonic lysis of cells can be directly labeled using the Click-iT Alexa Fluor 488 Assay kit and stained with propidium iodide. This modified procedure using isolated nuclei and propidium iodide staining results in DNA histograms with better resolution (lower coefficient of variation of the G(1) peak) and shorter processing time by eliminating the fixation and permeabilization steps.

  15. Equality of Opportunity and Equality of Outcome

    ERIC Educational Resources Information Center

    Kodelja, Zdenko

    2016-01-01

    The report on the findings of extensive empirical research on equality of educational opportunities carried out in the United States on a very large sample of public schools by Coleman and his colleagues has had a major impact on education policy and has given rise to a large amount of research and various interpretations. However, as some…

  16. Retrospective Reconstructions of Active Bone Marrow Dose-Volume Histograms

    SciTech Connect

    Veres, Cristina; Allodji, Rodrigue S.; Llanas, Damien; Vu Bezin, Jérémi; Chavaudra, Jean; Mège, Jean Pierre; Lefkopoulos, Dimitri; Quiniou, Eric; Deutsh, Eric; Vathaire, Florent de; Diallo, Ibrahima

    2014-12-01

    Purpose: To present a method for calculating dose-volume histograms (DVHs) to the active bone marrow (ABM) of patients who had undergone radiation therapy (RT) and subsequently developed leukemia. Methods and Materials: The study focuses on 15 patients treated between 1961 and 1996. Whole-body RT planning computed tomographic (CT) data were not available. We therefore generated representative whole-body CTs similar to patient anatomy. In addition, we developed a method enabling us to obtain information on the density distribution of ABM all over the skeleton. Dose could then be calculated in a series of points distributed all over the skeleton in such a way that their local density reflected age-specific data for ABM distribution. Dose to particular regions and dose-volume histograms of the entire ABM were estimated for all patients. Results: Depending on patient age, the total number of dose calculation points generated ranged from 1,190,970 to 4,108,524. The average dose to ABM ranged from 0.3 to 16.4 Gy. Dose-volume histogram analysis showed that the median doses (D_50%) ranged from 0.06 to 12.8 Gy. We also evaluated the inhomogeneity of individual patient ABM dose distribution according to clinical situation. It was evident that the coefficient of variation of the dose for the whole ABM ranged from 1.0 to 5.7, which means that the standard deviation could be more than 5 times higher than the mean. Conclusions: For patients with available long-term follow-up data, our method provides reconstruction of dose-volume data comparable to detailed dose calculations, which have become standard in modern CT-based 3-dimensional RT planning. Our strategy of using dose-volume histograms offers new perspectives for retrospective epidemiological studies.

  17. Finding significantly connected voxels based on histograms of connection strengths

    NASA Astrophysics Data System (ADS)

    Kasenburg, Niklas; Pedersen, Morten Vester; Darkner, Sune

    2016-03-01

    We explore a new approach for structural connectivity based segmentations of subcortical brain regions. Connectivity based segmentations are usually based on fibre connections from a seed region to predefined target regions. We present a method for finding significantly connected voxels based on the distribution of connection strengths. Paths from seed voxels to all voxels in a target region are obtained from a shortest-path tractography. For each seed voxel we approximate the distribution with a histogram of path scores. We hypothesise that the majority of estimated connections are false-positives and that their connection strength is distributed differently from true-positive connections. Therefore, an empirical null-distribution is defined for each target region as the average normalized histogram over all voxels in the seed region. Single histograms are then tested against the corresponding null-distribution and significance is determined using the false discovery rate (FDR). Segmentations are based on significantly connected voxels and their FDR. In this work we focus on the thalamus and the target regions were chosen by dividing the cortex into a prefrontal/temporal zone, motor zone, somatosensory zone and a parieto-occipital zone. The obtained segmentations consistently show a sparse number of significantly connected voxels that are located near the surface of the anterior thalamus over a population of 38 subjects.
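    The false discovery rate control used to declare voxels significantly connected is the standard Benjamini-Hochberg step-up procedure, which can be sketched as:

```python
def benjamini_hochberg(pvals, q=0.05):
    """Benjamini-Hochberg step-up FDR procedure: returns a boolean list
    marking which p-values are significant at FDR level q."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k_max = -1
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= q * rank / m:   # step-up criterion p_(k) <= q*k/m
            k_max = rank
    keep = [False] * m
    for rank, i in enumerate(order, start=1):
        if rank <= k_max:
            keep[i] = True
    return keep
```

    Here each seed voxel's histogram-vs-null test contributes one p-value, and the segmentation keeps the voxels that survive the FDR threshold.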

  18. Slope histogram distribution-based parametrisation of Martian geomorphic features

    NASA Astrophysics Data System (ADS)

    Balint, Zita; Székely, Balázs; Kovács, Gábor

    2014-05-01

    The application of geomorphometric methods to large Martian digital topographic datasets paves the way to analysing Martian areomorphic processes in more detail. One such method is the analysis of local slope distributions. To this end, a visualization program was developed that calculates local slope histograms and compares them based on a Kolmogorov distance criterion. As input data we used digital terrain models (DTMs) derived from High Resolution Stereo Camera (HRSC) images of various Martian regions. The Kolmogorov-criterion-based discrimination produces classes of slope histograms that are displayed in different colours representing the various classes, yielding an image map. Our goal is to create a local-slope-histogram-based classification for large Martian areas in order to obtain information about the general morphological characteristics of each region. This is a contribution of the TMIS.ascrea project, financed by the Austrian Research Promotion Agency (FFG). The present research is partly realized in the frames of the TÁMOP 4.2.4.A/2-11-1-2012-0001 high-priority "National Excellence Program - Elaborating and Operating an Inland Student and Researcher Personal Support System" convergence program project's scholarship support, using Hungarian state and European Union funds and co-finances from the European Social Fund.
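    The Kolmogorov distance criterion used to compare slope histograms is the maximum absolute difference of the normalized cumulative histograms; a minimal sketch for two histograms with identical binning:

```python
def kolmogorov_distance(h1, h2):
    """Max absolute difference of the normalized cumulative sums of two
    histograms over the same bins (0 = identical shape, 1 = disjoint)."""
    n1, n2 = sum(h1), sum(h2)
    c1 = c2 = d = 0.0
    for a, b in zip(h1, h2):
        c1 += a / n1
        c2 += b / n2
        d = max(d, abs(c1 - c2))
    return d
```

    Grouping local slope histograms whose pairwise distance falls below a chosen threshold yields the classes that are then rendered in distinct colours.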

  19. 41 CFR 60-250.5 - Equal opportunity clause.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... assignments, job classifications, organizational structures, position descriptions, lines of progression, and...) Adaption of language. Such necessary changes in language may be made to the equal opportunity clause...

  20. Content based Image Retrieval based on Different Global and Local Color Histogram Methods: A Survey

    NASA Astrophysics Data System (ADS)

    Suhasini, Pallikonda Sarah; Sri Rama Krishna, K.; Murali Krishna, I. V.

    2016-06-01

    Different global and local color histogram methods for content based image retrieval (CBIR) are investigated in this paper. Color histogram is a widely used descriptor for CBIR. Conventional method of extracting color histogram is global, which misses the spatial content, is less invariant to deformation and viewpoint changes, and results in a very large three dimensional histogram corresponding to the color space used. To address the above deficiencies, different global and local histogram methods are proposed in recent research. Different ways of extracting local histograms to have spatial correspondence, invariant colour histogram to add deformation and viewpoint invariance and fuzzy linking method to reduce the size of the histogram are found in recent papers. The color space and the distance metric used are vital in obtaining color histogram. In this paper the performance of CBIR based on different global and local color histograms in three different color spaces, namely, RGB, HSV, L*a*b* and also with three distance measures Euclidean, Quadratic and Histogram intersection are surveyed, to choose appropriate method for future research.
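    Two of the surveyed distance measures are easy to sketch for normalized histograms with identical binning (histogram intersection is a similarity, so identical histograms score 1.0):

```python
def histogram_intersection(h1, h2):
    """Similarity in [0, 1] for normalized histograms; 1.0 = identical."""
    return sum(min(a, b) for a, b in zip(h1, h2))

def euclidean_distance(h1, h2):
    """Plain Euclidean distance between histogram vectors."""
    return sum((a - b) ** 2 for a, b in zip(h1, h2)) ** 0.5
```

    The quadratic-form distance additionally weights cross-bin similarities with a bin-similarity matrix, which is what makes it more tolerant of small color shifts than the two measures above.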

  1. Content based Image Retrieval based on Different Global and Local Color Histogram Methods: A Survey

    NASA Astrophysics Data System (ADS)

    Suhasini, Pallikonda Sarah; Sri Rama Krishna, K.; Murali Krishna, I. V.

    2017-02-01

    Different global and local color histogram methods for content based image retrieval (CBIR) are investigated in this paper. Color histogram is a widely used descriptor for CBIR. Conventional method of extracting color histogram is global, which misses the spatial content, is less invariant to deformation and viewpoint changes, and results in a very large three dimensional histogram corresponding to the color space used. To address the above deficiencies, different global and local histogram methods are proposed in recent research. Different ways of extracting local histograms to have spatial correspondence, invariant colour histogram to add deformation and viewpoint invariance and fuzzy linking method to reduce the size of the histogram are found in recent papers. The color space and the distance metric used are vital in obtaining color histogram. In this paper the performance of CBIR based on different global and local color histograms in three different color spaces, namely, RGB, HSV, L*a*b* and also with three distance measures Euclidean, Quadratic and Histogram intersection are surveyed, to choose appropriate method for future research.

  2. Integer Equal Flows

    SciTech Connect

    Meyers, C A; Schulz, A S

    2009-01-07

    The integer equal flow problem is an NP-hard network flow problem in which all arcs in given sets R_1, ..., R_ℓ must carry equal flow. We show this problem is effectively inapproximable, even if the cardinality of each set R_k is two. When ℓ is fixed, it is solvable in polynomial time.

  3. Equality and Economy

    ERIC Educational Resources Information Center

    Brink, Chris

    2012-01-01

    The two big events in higher education during 2010 were the implementation of the Equality Act, and the introduction of a new dispensation on fees and funding. The former is intended to promote equality, the latter is premised on the need for economy. In this article, the author focuses on the effect of the latter on the former. He considers this…

  4. EQUAL PAY FACTS.

    ERIC Educational Resources Information Center

    Women's Bureau (DOL), Washington, DC.

    Equal pay means payment of the "rate of the job" without regard to sex. Equal pay laws were enacted in 29 states from 1919 to 1965. Four additional states have fair employment practices laws. Support for such legislation has come from women's and civic organizations, the AFL-CIO, and the President's and state commissions on the status of women. The…

  5. Early Understanding of Equality

    ERIC Educational Resources Information Center

    Leavy, Aisling; Hourigan, Mairéad; McMahon, Áine

    2013-01-01

    Quite a bit of the arithmetic in elementary school contains elements of algebraic reasoning. After researching and testing a number of instructional strategies with Irish third graders, these authors found effective methods for cultivating a relational concept of equality in third-grade students. Understanding equality is fundamental to algebraic…

  6. Equal Justice Under Law.

    ERIC Educational Resources Information Center

    Johnson, Earl, Jr., Ed.

    1994-01-01

    This special theme issue of "Update on Law-Related Education""tells about the past, present, and future of equal legal representation for all in our society." It is dedicated to the history and heroes of legal aid for the poor and the need to further that cause if the United States hopes to achieve equal justice for all. In his…

  7. Classification of CT-brain slices based on local histograms

    NASA Astrophysics Data System (ADS)

    Avrunin, Oleg G.; Tymkovych, Maksym Y.; Pavlov, Sergii V.; Timchik, Sergii V.; Kisała, Piotr; Orakbaev, Yerbol

    2015-12-01

    Neurosurgical intervention is a very complicated process. Modern operating procedures are based on data such as CT, MRI, etc. Automated analysis of these data is an important task for researchers. Some modern methods of brain-slice segmentation use additional data to process these images, and classification can be used to obtain this information. To classify CT images of the brain, we suggest using local histograms and features extracted from them. The paper shows the process of feature extraction and classification of CT slices of the brain. The feature extraction process is specialized for axial cross-sections of the brain. The work can be applied to medical neurosurgical systems.
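    The local-histogram idea (tile the slice, histogram each tile, concatenate the results into a feature vector) can be sketched as follows; this assumes a simple square-grid tiling, not the paper's axial-specific scheme:

```python
def local_histograms(img, block=2, bins=4, vmax=256):
    """Tile a 2-D grayscale image into block x block regions and concatenate
    each tile's intensity histogram into a single feature vector."""
    h, w = len(img), len(img[0])
    feats = []
    for by in range(0, h, block):
        for bx in range(0, w, block):
            hist = [0] * bins
            for y in range(by, min(by + block, h)):
                for x in range(bx, min(bx + block, w)):
                    hist[img[y][x] * bins // vmax] += 1
            feats.extend(hist)
    return feats

# toy 4x4 "slice": 4 tiles x 4 bins -> a 16-dimensional feature vector
img = [[0, 64, 128, 192],
       [0, 64, 128, 192],
       [0, 64, 128, 192],
       [0, 64, 128, 192]]
fv = local_histograms(img)
```

    Unlike a single global histogram, the concatenated vector preserves which intensities occur in which part of the slice.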

  8. Efficient local statistical analysis via integral histograms with discrete wavelet transform.

    PubMed

    Lee, Teng-Yok; Shen, Han-Wei

    2013-12-01

    Histograms computed from local regions are commonly used in many visualization applications, and allowing the user to query histograms interactively in regions of arbitrary locations and sizes plays an important role in feature identification and tracking. Computing histograms in regions with arbitrary location and size, nevertheless, can be time consuming for large data sets since it involves expensive I/O and scans of data elements. To achieve both performance- and storage-efficient queries of local histograms, we present a new algorithm called WaveletSAT, which utilizes integral histograms, an extension of summed area tables (SAT), and the discrete wavelet transform (DWT). Similar to a SAT, an integral histogram is the histogram computed from the area between each grid point and the grid origin, which can be pre-computed to support fast queries. Nevertheless, because one histogram contains multiple bins, it would be very expensive to store one integral histogram at each grid point. To reduce the storage cost for large integral histograms, WaveletSAT treats the integral histograms of all grid points as multiple SATs, each of which can be converted into a sparse representation via DWT, allowing the reconstruction of axis-aligned region histograms of arbitrary sizes from a limited number of wavelet coefficients. In addition, we present an efficient wavelet transform algorithm for SATs that can operate on each grid point separately in logarithmic time complexity, which can be extended to a parallel GPU-based implementation. With theoretical and empirical demonstration, we show that WaveletSAT can achieve fast preprocessing and smaller storage overhead than the conventional integral histogram approach with close query performance.
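    The integral-histogram query that WaveletSAT builds on can be illustrated directly. A minimal sketch (one summed-area table per bin, without the wavelet compression that is the paper's contribution):

```python
def integral_histogram(img, bins, vmax=256):
    """One summed-area table per histogram bin: sat[b][y][x] counts pixels
    of bin b in the rectangle from the origin to (y-1, x-1) inclusive."""
    h, w = len(img), len(img[0])
    sat = [[[0] * (w + 1) for _ in range(h + 1)] for _ in range(bins)]
    for b in range(bins):
        for y in range(h):
            for x in range(w):
                hit = 1 if img[y][x] * bins // vmax == b else 0
                sat[b][y + 1][x + 1] = (hit + sat[b][y][x + 1]
                                        + sat[b][y + 1][x] - sat[b][y][x])
    return sat

def region_histogram(sat, y0, x0, y1, x1):
    """Histogram of the rectangle [y0, y1) x [x0, x1) in O(bins) time,
    via the usual four-corner summed-area-table lookup."""
    return [s[y1][x1] - s[y0][x1] - s[y1][x0] + s[y0][x0] for s in sat]

img = [[0, 255], [255, 0]]
sat = integral_histogram(img, bins=2)
full = region_histogram(sat, 0, 0, 2, 2)
```

    The storage problem the paper addresses is visible here: the tables cost O(bins) per grid point, which is what the DWT sparsification reduces.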

  9. Lean histogram of oriented gradients features for effective eye detection

    NASA Astrophysics Data System (ADS)

    Sharma, Riti; Savakis, Andreas

    2015-11-01

    Reliable object detection is very important in computer vision and robotics applications. The histogram of oriented gradients (HOG) is established as one of the most popular hand-crafted features, which, along with support vector machine (SVM) classification, provides excellent performance for object recognition. We investigate dimensionality reduction on HOG features in combination with SVM classifiers to obtain efficient feature representation and improved classification performance. In addition to lean HOG features, we explore descriptors resulting from dimensionality reduction on histograms of binary descriptors. We consider three dimensionality reduction techniques: standard principal component analysis; random projections, a computationally efficient linear mapping that is data independent; and locality preserving projections (LPP), which learns the manifold structure of the data. Our methods focus on the application of eye detection and were tested on an eye database created using the BioID and FERET face databases. Our results indicate that manifold learning is beneficial to classification utilizing HOG features. To demonstrate the broader usefulness of lean HOG features for object class recognition, we evaluated our system's classification performance on the CalTech-101 dataset with favorable outcomes.
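    Random projections, the data-independent mapping mentioned above, can be sketched in a few lines; the 36-D input and 8-D output are arbitrary toy sizes, not the paper's settings:

```python
import random

def random_projection(features, out_dim, seed=0):
    """Data-independent lean descriptor: project the feature vector onto
    out_dim random Gaussian directions (scaled by 1/sqrt(out_dim) so that
    squared norms are roughly preserved on average)."""
    rng = random.Random(seed)
    scale = 1.0 / out_dim ** 0.5
    proj = [[rng.gauss(0.0, 1.0) * scale for _ in features]
            for _ in range(out_dim)]
    return [sum(w * f for w, f in zip(row, features)) for row in proj]

hog = [0.1] * 36          # stand-in for one 36-D HOG block descriptor
lean = random_projection(hog, out_dim=8)
```

    Because the projection matrix is drawn independently of the data, it needs no training pass, which is the efficiency argument made in the abstract.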

  10. The Transition Matrix in Flat-histogram Sampling

    NASA Astrophysics Data System (ADS)

    Brown, Gregory; Eisenbach, M.; Li, Y. W.; Stocks, G. M.; Nicholson, D. M.; Odbadrakh, Kh.; Rikvold, P. A.

    2015-03-01

    Calculating the thermodynamic density of states (DOS) via flat-histogram sampling is a powerful numerical method for characterizing the temperature-dependent properties of materials. Since the calculated DOS is refined directly from the statistics of the sampling, methods of accelerating the sampling, e.g. through windowing and slow forcing, skew the resulting DOS. Calculating the infinite-temperature transition matrix during the flat-histogram sampling decouples the sampling from estimating the DOS, and allows the techniques of Transition Matrix Monte Carlo to be applied. This enables the calculation of the properties for very large system sizes and thus finite-size scaling analysis of the specific heat, magnetic susceptibility, and cumulant crossings at critical points. We discuss these developments in the context of models for magnetocaloric and spin-crossover materials. This work was performed at the Oak Ridge National Laboratory, which is managed by UT-Battelle for the U.S. Department of Energy. It was sponsored by the U.S. Department of Energy, Office of Basic Energy Sciences, Office of Advanced Scientific Computing Research, and the Oak Ridge Leadership Computing Facility. PAR is supported by the National Science Foundation.
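    The flat-histogram idea can be illustrated with a Wang-Landau-style sketch on a toy model: n independent spins whose energy is the number of up spins, so the exact density of states is the binomial coefficient C(n, E). The flatness criterion and update schedule below are illustrative choices, not those of the study:

```python
import math
import random

def wang_landau(n=8, flat=0.8, f_final=1e-3, seed=1):
    """Flat-histogram estimate of ln g(E) for a toy system of n independent
    spins with energy E = number of up spins (exact g(E) = C(n, E))."""
    rng = random.Random(seed)
    spins = [0] * n
    E = 0
    lng = [0.0] * (n + 1)   # running estimate of ln g(E)
    f = 1.0                 # modification factor, halved when hist is flat
    while f > f_final:
        hist = [0] * (n + 1)
        for _ in range(20000):
            i = rng.randrange(n)
            dE = 1 - 2 * spins[i]   # flipping spin i changes E by +/- 1
            # accept with probability min(1, g(E) / g(E'))
            if rng.random() < math.exp(min(0.0, lng[E] - lng[E + dE])):
                spins[i] ^= 1
                E += dE
            lng[E] += f             # refine DOS at the current energy
            hist[E] += 1
        if min(hist) > flat * sum(hist) / len(hist):
            f /= 2
    return lng

lng = wang_landau()
```

    The transition-matrix approach described in the abstract decouples this refinement from the sampling; the sketch shows only the plain flat-histogram scheme it improves on.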

  11. The retina dose-area histogram: a metric for quantitatively comparing rival eye plaque treatment options

    PubMed Central

    2013-01-01

    Purpose: Episcleral plaques have a history of over half a century in the delivery of radiation therapy to intraocular tumors such as choroidal melanoma. Although the tumor control rate is high, vision-impairing complications subsequent to treatment remain an issue; notable late complications are radiation retinopathy and maculopathy. The obvious way to reduce the risk of radiation damage to the retina is to conform the prescribed isodose surface to the tumor base and to reduce the dose delivered to the surrounding healthy retina, especially the macula. Using a fusion of fundus photography, ultrasound and CT images, tumor size, shape and location within the eye can be accurately simulated as part of the radiation planning process. In this work an adaptation of the dose-volume histogram (DVH), the retina dose-area histogram (RDAH), is introduced as a metric to help compare rival plaque designs and conformal treatment planning options with the goal of reducing radiation retinopathy.

    Material and methods: The RDAH is calculated by transforming a digitized fundus-photo collage of the tumor into a rasterized polar map of the retinal surface known as a retinal diagram (RD). The perimeter of the tumor base is digitized on the RD and its area computed. Area and radiation dose are calculated for every pixel in the RD.

    Results: The areal resolution of the RDAH is a function of the pixel resolution of the raster image used to display the RD and the number of polygon edges used to digitize the perimeter of the tumor base. A practical demonstration is presented.

    Conclusions: The RDAH provides a quantitative metric by which episcleral plaque treatment plan options may be evaluated and compared in order to confirm adequate dosimetric coverage of the tumor and margin, and to help minimize dose to the macula and retina. PMID:23634152
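    The cumulative histogram underlying the RDAH is simple to state: for each dose level, the total area receiving at least that dose. A toy sketch with hypothetical per-pixel doses and areas, not clinical data:

```python
def cumulative_area_histogram(doses, areas, dose_levels):
    """For each dose level D, the total area receiving a dose of at least D
    (the dose-volume-histogram idea with retinal area in place of volume)."""
    return [sum(a for d, a in zip(doses, areas) if d >= D)
            for D in dose_levels]

# hypothetical per-pixel doses (Gy) and pixel areas (mm^2) from a retinal diagram
doses = [10.0, 20.0, 30.0, 40.0]
areas = [1.0, 1.0, 1.0, 1.0]
dah = cumulative_area_histogram(doses, areas, dose_levels=[0, 15, 25, 35])
```

    Summing per-pixel areas rather than counting pixels matters here because, on a polar retinal diagram, pixels do not all map to equal physical areas.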

  12. A novel hybrid motion detection algorithm based on 2D histogram

    NASA Astrophysics Data System (ADS)

    Su, Xiaomeng; Wang, Haiying

    2015-03-01

    This article proposes a novel hybrid motion detection algorithm based on a 2-D (two-dimensional) spatio-temporal state histogram. The new algorithm combines the idea of image change detection based on a 2-D histogram with spatio-temporal entropy image segmentation. It quantifies the continuity of pixel state in the time and space domains with a TDF (Time Domain Filter) and an SDF (Space Domain Filter), respectively. Both output channels from the TDF and SDF are then accumulated into a 2-D histogram, in which a curve division method helps to separate the foreground state points from the background ones more accurately. Innovatively, the new algorithm converts the video sequence to its histogram sequence, transforming differences in pixel values in the video sequence into differences in pixel position in the 2-D histogram. Experimental results on different types of scenes with added Gaussian noise show that the proposed technique has a strong ability to detect moving objects.

  13. The photon counting histogram in fluorescence fluctuation spectroscopy.

    PubMed Central

    Chen, Y; Müller, J D; So, P T; Gratton, E

    1999-01-01

    Fluorescence correlation spectroscopy (FCS) is generally used to obtain information about the number of fluorescent particles in a small volume and the diffusion coefficient from the autocorrelation function of the fluorescence signal. Here we demonstrate that photon counting histogram (PCH) analysis constitutes a novel tool for extracting quantities from fluorescence fluctuation data, i.e., the measured photon counts per molecule and the average number of molecules within the observation volume. The photon counting histogram of fluorescence fluctuation experiments, in which few molecules are present in the excitation volume, exhibits a super-Poissonian behavior. The additional broadening of the PCH compared to a Poisson distribution is due to fluorescence intensity fluctuations. For diffusing particles these intensity fluctuations are caused by an inhomogeneous excitation profile and the fluctuations in the number of particles in the observation volume. The quantitative relationship between the detected photon counts and the fluorescence intensity reaching the detector is given by Mandel's formula. Based on this equation and considering the fluorescence intensity distribution in the two-photon excitation volume, a theoretical expression for the PCH as a function of the number of molecules in the excitation volume is derived. For a single molecular species two parameters are sufficient to characterize the histogram completely, namely the average number of molecules within the observation volume and the detected photon counts per molecule per sampling time epsilon. The PCH for multiple molecular species, on the other hand, is generated by successively convoluting the photon counting distribution of each species with the others. The influence of the excitation profile upon the photon counting statistics for two relevant point spread functions (PSFs), the three-dimensional Gaussian PSF conventionally employed in confocal detection and the square of the Gaussian
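    The convolution step for multiple species can be sketched directly: the PCH of a mixture of independent species is the convolution of the single-species photon-count distributions. The sketch below uses plain truncated Poisson distributions as stand-ins for the actual super-Poissonian single-species PCHs derived in the paper:

```python
import math

def poisson(lam, kmax=20):
    """Truncated Poisson photon-count distribution (stand-in for a PCH)."""
    return [math.exp(-lam) * lam ** k / math.factorial(k) for k in range(kmax)]

def convolve(p, q):
    """Count distribution of two independent species: the PCH of the
    mixture is the convolution of the single-species distributions."""
    out = [0.0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b
    return out

pch = convolve(poisson(1.0), poisson(2.0))
mean = sum(k * p for k, p in enumerate(pch))
```

    For real data the single-species distributions are broader than Poisson, and that excess width is exactly the signal the PCH analysis exploits.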

  14. Transhumanism and moral equality.

    PubMed

    Wilson, James

    2007-10-01

    Conservative thinkers such as Francis Fukuyama have produced a battery of objections to the transhumanist project of fundamentally enhancing human capacities. This article examines one of these objections, namely that by allowing some to greatly extend their capacities, we will undermine the fundamental moral equality of human beings. I argue that this objection is groundless: once we understand the basis for human equality, it is clear that anyone who now has sufficient capacities to count as a person from the moral point of view will continue to count as one even if others are fundamentally enhanced; and it is mistaken to think that a creature which had even far greater capacities than an unenhanced human being should count as more than an equal from the moral point of view.

  15. Using histograms to introduce randomization in the generation of ensembles of decision trees

    DOEpatents

    Kamath, Chandrika; Cantu-Paz, Erick; Littau, David

    2005-02-22

    A system for decision tree ensembles that includes a module to read the data, a module to create a histogram, a module to evaluate a potential split according to some criterion using the histogram, a module to select a split point randomly in an interval around the best split, a module to split the data, and a module to combine multiple decision trees in ensembles. The decision tree method includes the steps of reading the data; creating a histogram; evaluating a potential split according to some criterion using the histogram, selecting a split point randomly in an interval around the best split, splitting the data, and combining multiple decision trees in ensembles.
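    The procedure can be sketched as: histogram the feature, score candidate splits at bin boundaries, then draw the final split point at random inside the interval around the best boundary. A minimal sketch using a Gini criterion on binary labels (the patent leaves the split criterion open):

```python
import random

def gini(labels):
    """Gini impurity of a list of 0/1 labels."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2.0 * p * (1.0 - p)

def histogram_split(values, labels, bins=10, seed=0):
    """Evaluate candidate splits only at histogram bin boundaries, then pick
    the actual split point at random inside the winning interval."""
    rng = random.Random(seed)
    lo, hi = min(values), max(values)
    width = (hi - lo) / bins
    best_t, best_score = None, None
    for b in range(1, bins):
        t = lo + b * width
        left = [l for v, l in zip(values, labels) if v < t]
        right = [l for v, l in zip(values, labels) if v >= t]
        score = len(left) * gini(left) + len(right) * gini(right)
        if best_score is None or score < best_score:
            best_t, best_score = t, score
    # the randomization that decorrelates trees within the ensemble
    return rng.uniform(best_t - width / 2, best_t + width / 2)

values = [0.1, 0.2, 0.3, 0.4, 2.1, 2.2, 2.3, 2.4]
labels = [0, 0, 0, 0, 1, 1, 1, 1]
split = histogram_split(values, labels)
```

    Because each tree draws a different random point near the best boundary, trees trained on the same data still differ, which is what makes the ensemble useful.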

  16. Infrared image segmentation method based on spatial coherence histogram and maximum entropy

    NASA Astrophysics Data System (ADS)

    Liu, Songtao; Shen, Tongsheng; Dai, Yao

    2014-11-01

    In order to segment the target well and suppress background noises effectively, an infrared image segmentation method based on spatial coherence histogram and maximum entropy is proposed. First, spatial coherence histogram is presented by weighting the importance of the different position of these pixels with the same gray-level, which is obtained by computing their local density. Then, after enhancing the image by spatial coherence histogram, 1D maximum entropy method is used to segment the image. The novel method can not only get better segmentation results, but also have a faster computation time than traditional 2D histogram-based segmentation methods.
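    The 1D maximum entropy step is the classic Kapur threshold: choose the gray level that maximizes the sum of the entropies of the sub-histograms below and above it. A sketch on a toy 16-bin histogram (a plain 1D histogram here, not the spatial coherence histogram the paper builds first):

```python
import math

def max_entropy_threshold(hist):
    """Kapur's 1-D maximum-entropy threshold: choose t to maximize the sum
    of the entropies of the sub-histograms below and above t."""
    total = sum(hist)
    p = [h / total for h in hist]
    best_t, best_h = 0, -1.0
    for t in range(1, len(hist)):
        w0 = sum(p[:t])          # background probability mass
        w1 = 1.0 - w0            # foreground probability mass
        if w0 <= 0.0 or w1 <= 0.0:
            continue
        h0 = -sum(q / w0 * math.log(q / w0) for q in p[:t] if q > 0)
        h1 = -sum(q / w1 * math.log(q / w1) for q in p[t:] if q > 0)
        if h0 + h1 > best_h:
            best_t, best_h = t, h0 + h1
    return best_t

# bimodal toy histogram: dark background near bin 2, bright target near bin 12
hist = [0, 5, 20, 5, 0, 0, 0, 0, 0, 0, 0, 5, 20, 5, 0, 0]
t = max_entropy_threshold(hist)
```

    On this bimodal example the maximizing threshold falls in the valley between the two modes, which is the behavior the segmentation relies on.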

  17. Neuronal Adaptive Mechanisms Underlying Intelligent Information Processing

    DTIC Science & Technology

    1982-05-01

    Computer Program: The program consists of three functional units: stimulus presentation and data collection, histogram generation and display, and behavioral ... sequence for ten-second trials of adaptation, conditioning, extinction, or delayed HS paradigms. Timing of stimuli can be generated ... are generated from the data and displayed, four each, on Mime 100 and VT105 video terminals. The histograms are averages of three trials and are ...

  18. Fast and fully automatic phalanx segmentation using a grayscale-histogram morphology algorithm

    NASA Astrophysics Data System (ADS)

    Hsieh, Chi-Wen; Liu, Tzu-Chiang; Jong, Tai-Lang; Chen, Chih-Yen; Tiu, Chui-Mei; Chan, Din-Yuen

    2011-08-01

    Bone age assessment is a common radiological examination used in pediatrics to diagnose the discrepancy between the skeletal and chronological age of a child; therefore, it is beneficial to develop a computer-based bone age assessment to help junior pediatricians estimate bone age easily. Unfortunately, the phalanx on radiograms is not easily separated from the background and soft tissue. Therefore, we proposed a new method, called the grayscale-histogram morphology algorithm, to segment the phalanges fast and precisely. The algorithm includes three parts: a tri-stage sieve algorithm used to eliminate the background of hand radiograms, a centroid-edge dual scanning algorithm to frame the phalanx region, and finally a segmentation algorithm based on disk traverse-subtraction filter to segment the phalanx. Moreover, two more segmentation methods: adaptive two-mean and adaptive two-mean clustering were performed, and their results were compared with the segmentation algorithm based on disk traverse-subtraction filter using five indices comprising misclassification error, relative foreground area error, modified Hausdorff distances, edge mismatch, and region nonuniformity. In addition, the CPU time of the three segmentation methods was discussed. The result showed that our method had a better performance than the other two methods. Furthermore, satisfactory segmentation results were obtained with a low standard error.

  19. The Equal Pay Boondoggle

    ERIC Educational Resources Information Center

    Lester, Richard A.

    1975-01-01

    Problems of extending the Equal Pay Act to university faculty are examined in light of the complicated market forces and merit systems affecting faculty appointments and salaries. Solutions to the problem are suggested including guidelines for the Wage and Hour Division of the Department of Labor to use in identifying sex discrimination. (JT)

  20. Motivation and Equality.

    ERIC Educational Resources Information Center

    Nicholls, John G.; Burton, John T.

    1982-01-01

    Argues that if teachers maintain task involvement in all children, they will achieve justifiable form of educational equality. Discusses social and personal factors which influence task involvement, including value framework of school (i.e., purpose school is seen to serve), organizational strategies adopted to facilitate learning, and specific…

  1. Equality Versus Inequality.

    ERIC Educational Resources Information Center

    Dahl, Robert A.

    1996-01-01

    Argues that political equality and democracy are attainable only through the distribution of access to political resources and the willingness to use them. Discusses the broad philosophical and sociological components that contribute to a system marked by advantage and inequalities, as well as opportunities for opposition and resistance. (MJP)

  2. EQUALS Investigations: Remote Rulers.

    ERIC Educational Resources Information Center

    Mayfield, Karen; Whitlow, Robert

    EQUALS is a teacher education program that helps elementary and secondary educators acquire methods and materials to attract minority and female students to mathematics. It supports a problem-solving approach to mathematics which has students working in groups, uses active assessment methods, and incorporates a broad mathematics curriculum…

  3. EQUALITY OF EDUCATIONAL OPPORTUNITY.

    ERIC Educational Resources Information Center

    COLEMAN, JAMES S.; AND OTHERS

    The product of an extensive survey requested by the Civil Rights Act of 1964, this report documents the availability of equal educational opportunities in the public schools for minority group Negroes, Puerto Ricans, Mexican-Americans, Oriental-Americans, and American Indians, as compared with opportunities for majority group whites. Comparative…

  4. The Equal Access Act.

    ERIC Educational Resources Information Center

    Catron, J. Gregory

    1987-01-01

    Reviews past history of access of religious activities in public schools in relation to the establishment clause of the First Amendment and sets forth the prerequisites in the Equal Access Act of 1984 for creating a well-defined forum for student-initiated free speech including religious groups in public high schools. (MD)

  5. Granting Each Equal Access.

    ERIC Educational Resources Information Center

    Walling, Linda Lucas

    1992-01-01

    Summarizes federal legislation regarding equal access for students with disabilities and discusses environmental barriers to accessibility in the library media center. Solutions to these design problems are suggested in the following areas: material formats and space requirements; the physical setting, including furniture, floor coverings,…

  6. Equality and Academic Subjects

    ERIC Educational Resources Information Center

    Hardarson, Atli

    2013-01-01

    A recent national curriculum guide for upper secondary schools in my home country, Iceland, requires secondary schools to work towards equality and five other overarching aims. This requirement raises questions about to what extent secondary schools have to change their curricula in order to approach these aims or work towards them in an adequate…

  7. The Equal Rights Amendment

    ERIC Educational Resources Information Center

    Kelly, Eileen

    1978-01-01

    A brief discussion of the Equal Rights Amendment (ERA) including questions often asked of social studies teachers in the classroom. A map of the United States shows which states have passed the ERA to date. Also includes a bibliography of 43 resources about ERA. (BC)

  8. Equal Educational Opportunity?

    ERIC Educational Resources Information Center

    Morris, Lorenzo

    1980-01-01

    Holds that the "Bakke" decision simply reaffirmed an insufficient commitment to equal opportunities for Blacks in higher education. Reviews several studies, including research conducted at the Institute for the Study of Educational Policy (ISEP) that has focused on the social and economic context of educational discrimination. (GC)

  9. Fight For Equality

    ERIC Educational Resources Information Center

    Mink, Patsy T.

    1973-01-01

    In this presentation to the annual conventions of the NAWDAC and the ACPA (Cleveland 1973) the author, a Congresswoman from Hawaii, deplores the practice of some counselors of directing women students into traditional women's courses. She urges college counselors and personnel workers to join in the struggle to achieve equal educational and…

  10. Equal Opportunity in Employment

    ERIC Educational Resources Information Center

    Bullock, Paul

    This book focuses on discrimination in employment, defined as the denial of equal opportunity in the labor market to qualified persons on the basis of race, color, religion, national origin, age, sex, or any other factor not related to their individual qualifications for work. The average nonwhite college graduate can expect to earn less during…

  11. A cost-effective line-based light-balancing technique using adaptive processing.

    PubMed

    Hsia, Shih-Chang; Chen, Ming-Huei; Chen, Yu-Min

    2006-09-01

    The camera imaging system has been widely used; however, the displayed image often has an unequal light distribution. This paper presents novel light-balancing techniques to compensate for uneven illumination based on adaptive signal processing. For text images, we first estimate the background level and then process each pixel with a nonuniform gain. This algorithm can balance the light distribution while keeping high contrast in the image. For graphic images, adaptive section control using a piecewise nonlinear gain is proposed to equalize the histogram. Simulations show that the performance of light balance is better than that of other methods. Moreover, we employ line-based processing to efficiently reduce the memory requirement and the computational cost, making the method applicable in real-time systems.
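    A rough sketch of the text-image branch (estimate a local background level along one scan line, then apply a per-pixel nonuniform gain); the moving-maximum background estimate and the target level used here are illustrative assumptions, not the paper's exact rule:

```python
def balance_line(line, target=200, window=2):
    """Scale each pixel by target/background, where the background level is
    a moving maximum over a small window around the pixel (text pixels are
    darker than their background, so the local maximum approximates it)."""
    n = len(line)
    out = []
    for i in range(n):
        bg = max(line[max(0, i - window):min(n, i + window + 1)])
        gain = target / bg if bg > 0 else 1.0
        out.append(min(255, round(line[i] * gain)))
    return out

# one scan line: dim left half, bright right half, darker "text" pixels
line = [100, 100, 40, 100, 100, 200, 200, 80, 200, 200]
flat = balance_line(line)
```

    Working one line at a time is what keeps the memory footprint small, which is the real-time argument the abstract makes.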

  12. Time-cumulated visible and infrared histograms used as descriptor of cloud cover

    NASA Technical Reports Server (NTRS)

    Seze, G.; Rossow, W.

    1987-01-01

    To study the statistical behavior of clouds for different climate regimes, the spatial and temporal stability of VIS-IR bidimensional histograms is tested. Also, the effect of data sampling and averaging on the histogram shapes is considered; in particular the sampling strategy used by the International Satellite Cloud Climatology Project is tested.

  13. Battery equalization active methods

    NASA Astrophysics Data System (ADS)

    Gallardo-Lozano, Javier; Romero-Cadaval, Enrique; Milanes-Montero, M. Isabel; Guerrero-Martinez, Miguel A.

    2014-01-01

    Many different battery technologies are available for applications that need energy storage. New research is focused on lithium-based batteries, since they are becoming the most viable option for portable energy storage applications. As most applications need series battery strings to meet voltage requirements, battery imbalance is an important matter to take into account, since it causes the individual battery voltages to drift apart over time, leading to premature cell degradation, safety hazards, and capacity reduction. A large number of battery equalization methods can be found, which present different advantages and disadvantages and are suitable for different applications. The present paper presents a summary, comparison and evaluation of the different active battery equalization methods, providing a table that compares them, which is helpful for selecting the suitable equalization method for a given application. Applying the same weight to the different parameters of comparison, the switched capacitor and double-tiered switched capacitor methods have the highest ratio. Cell bypass methods are cheap and cell-to-cell ones are efficient. Cell-to-pack, pack-to-cell and cell-to-pack-to-cell methods present higher cost, size, and control complexity, but relatively low voltage and current stress in high-power applications.

  14. Do you need to compare two histograms not only by eye?

    NASA Astrophysics Data System (ADS)

    Cardiel, N.

    2015-05-01

    Although the use of histograms implies a loss of information, because the actual data are replaced by the central values of the considered intervals, this graphical representation is commonly employed in scientific communication, particularly in astrophysics. Sometimes this kind of comparison is unavoidable when one needs to compare new results with already published data available only in histogram format. Unfortunately, it is not infrequent to find in the literature examples of histogram comparisons where the similarity between the histograms is not statistically quantified but simply justified or discarded "by eye". In this poster several methods to quantify the similarity between two histograms are discussed. The availability of statistical packages, such as R (R Core Team 2014, R: A Language and Environment for Statistical Computing, R Foundation for Statistical Computing, Vienna, Austria. URL http://www.R-project.org/), notably simplifies the understanding of the different approaches through the use of numerical simulations.
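    One standard quantitative alternative to eyeballing is the two-sample chi-square statistic for binned data, which also handles histograms with unequal total counts; this is a generic sketch, not one of the poster's specific methods:

```python
def chi2_two_histograms(h1, h2):
    """Two-sample chi-square statistic for binned data, in the form that
    allows unequal total counts (rescaling each histogram by the other's
    total). Compare the statistic against a chi-square table at dof."""
    n1, n2 = sum(h1), sum(h2)
    k1, k2 = (n2 / n1) ** 0.5, (n1 / n2) ** 0.5
    stat, used = 0.0, 0
    for a, b in zip(h1, h2):
        if a + b == 0:
            continue                       # bin empty in both: no information
        stat += (k1 * a - k2 * b) ** 2 / (a + b)
        used += 1
    return stat, used - 1                  # statistic, degrees of freedom

h1 = [10, 20, 30, 20, 10]
h2 = [12, 18, 28, 22, 10]
stat, dof = chi2_two_histograms(h1, h2)
```

    Here the statistic is far below the 5% critical value for 4 degrees of freedom (about 9.49), so these two histograms are statistically compatible, not merely similar "by eye".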

  15. Time-cumulated visible and infrared radiance histograms used as descriptors of surface and cloud variations

    NASA Technical Reports Server (NTRS)

    Seze, Genevieve; Rossow, William B.

    1991-01-01

    The spatial and temporal stability of the distributions of satellite-measured visible and infrared radiances, caused by variations in clouds and surfaces, are investigated using bidimensional and monodimensional histograms and time-composite images. Similar analysis of the histograms of the original and time-composite images provides separation of the contributions of the space and time variations to the total variations. The variability of both the surfaces and clouds is found to be larger at scales much larger than the minimum resolved by satellite imagery. This study shows that the shapes of these histograms are distinctive characteristics of the different climate regimes and that particular attributes of these histograms can be related to several general, though not universal, properties of clouds and surface variations at regional and synoptic scales. There are also significant exceptions to these relationships in particular climate regimes. The characteristics of these radiance histograms provide a stable well defined descriptor of the cloud and surface properties.

  16. Equality in Education: An Equality of Condition Perspective

    ERIC Educational Resources Information Center

    Lynch, Kathleen; Baker, John

    2005-01-01

    Transforming schools into truly egalitarian institutions requires a holistic and integrated approach. Using a robust conception of "equality of condition", we examine key dimensions of equality that are central to both the purposes and processes of education: equality in educational and related resources; equality of respect and recognition;…

  17. Multifractal diffusion entropy analysis: Optimal bin width of probability histograms

    NASA Astrophysics Data System (ADS)

    Jizba, Petr; Korbel, Jan

    2014-11-01

    In the framework of Multifractal Diffusion Entropy Analysis we propose a method for choosing an optimal bin-width in histograms generated from underlying probability distributions of interest. The method presented uses techniques of Rényi’s entropy and the mean squared error analysis to discuss the conditions under which the error in the multifractal spectrum estimation is minimal. We illustrate the utility of our approach by focusing on a scaling behavior of financial time series. In particular, we analyze the S&P500 stock index as sampled at a daily rate in the time period 1950-2013. In order to demonstrate a strength of the method proposed we compare the multifractal δ-spectrum for various bin-widths and show the robustness of the method, especially for large values of q. For such values, other methods in use, e.g., those based on moment estimation, tend to fail for heavy-tailed data or data with long correlations. Connection between the δ-spectrum and Rényi’s q parameter is also discussed and elucidated on a simple example of multiscale time series.

  18. Developing a dose-volume histogram computation program for brachytherapy.

    PubMed

    Panitsa, E; Rosenwald, J C; Kappas, C

    1998-08-01

    A dose-volume histogram (DVH) computation program was developed for brachytherapy treatment planning in an attempt to benefit from the DVH's ability to present graphically information on 3D dose distributions. The program is incorporated into a planning system that utilizes a pair of orthogonal radiographs to localize the radiation sources. DVHs are calculated for the volume of tissue enclosed by an isodose surface (e.g. half the value of the reference isodose). The calculation algorithm is based on a non-uniform random sampling that gives a denser point distribution at the centre of the implants. Our program was tested and proved to be fast enough for clinical use and sufficiently accurate (i.e. computation time of 20 s and less than 2% relative error for one point source, for 100,000 calculation points). The accuracy improves when a larger calculation point number is used, but the computation time also increases proportionally. The DVH is presented in the form of a simple graph or table, or as Anderson's 'natural' DVH graph. The cumulative DVH tables can be used to extract a series of indexes characterizing the homogeneity and the dose levels of the distribution in the treatment volume and the surrounding tissues. If a reference plan is available, the DVH results can be assessed relative to the reference plan's DVH.
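    The Monte Carlo idea (sample points in the volume, count the fraction at or above each dose level) can be sketched as follows; unlike the paper's nonuniform sampling, which is denser near the implant centers, this toy version samples uniformly and uses an idealized 1/r² point source:

```python
import random

def dvh(dose_at, region, levels=(1, 2, 4, 8), n_samples=100_000, seed=0):
    """Monte Carlo dose-volume histogram: the fraction of a rectangular
    region receiving at least each dose level, by uniform random sampling."""
    rng = random.Random(seed)
    doses = [dose_at(tuple(rng.uniform(lo, hi) for lo, hi in region))
             for _ in range(n_samples)]
    return {d: sum(x >= d for x in doses) / n_samples for d in levels}

def point_source(p):
    """Idealized point source at the origin: dose falls off as 1/r^2."""
    r2 = sum(c * c for c in p)
    return 1.0 / r2 if r2 > 0 else float("inf")

hist = dvh(point_source, region=[(-1.0, 1.0)] * 3)
```

    As in the paper, accuracy improves with the number of sample points while the computation time grows proportionally; for this point source, the fraction at dose 1 should approach the inscribed unit sphere's share of the cube, about 0.524.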

  19. Landmark Detection in Orbital Images Using Salience Histograms

    NASA Technical Reports Server (NTRS)

    Wagstaff, Kiri L.; Panetta, Julian; Schorghofer, Norbert; Greeley, Ronald; Pendleton Hoffer, Mary; Bunte, Melissa

    2010-01-01

    NASA's planetary missions have collected, and continue to collect, massive volumes of orbital imagery. The volume is such that it is difficult to manually review all of the data and determine its significance. As a result, images are indexed and searchable by location and date but generally not by their content. A new automated method analyzes images and identifies "landmarks," or visually salient features such as gullies, craters, dust devil tracks, and the like. This technique uses a statistical measure of salience derived from information theory, so it is not associated with any specific landmark type. It identifies regions that are unusual or that stand out from their surroundings, so the resulting landmarks are context-sensitive areas that can be used to recognize the same area when it is encountered again. A machine learning classifier is used to identify the type of each discovered landmark. Using a specified window size, an intensity histogram is computed for each such window within the larger image (sliding the window across the image). Next, a salience map is computed that specifies, for each pixel, the salience of the window centered at that pixel. The salience map is thresholded to identify landmark contours (polygons) using the upper quartile of salience values. Descriptive attributes are extracted for each landmark polygon: size, perimeter, mean intensity, standard deviation of intensity, and shape features derived from an ellipse fit.
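
    One plausible instantiation of the information-theoretic salience described above (an assumption for illustration; the paper's exact measure is not spelled out here) scores each window by the KL divergence of its intensity histogram from the global image histogram, then keeps the upper quartile:

```python
import numpy as np

def window_salience(image, win=8, bins=16):
    """Per-window salience: KL divergence of the window histogram vs. the global one."""
    global_h, edges = np.histogram(image, bins=bins, range=(0, 256))
    global_p = (global_h + 1.0) / (global_h.sum() + bins)  # Laplace smoothing
    rows, cols = image.shape
    sal = np.zeros((rows - win + 1, cols - win + 1))
    for i in range(sal.shape[0]):
        for j in range(sal.shape[1]):
            h, _ = np.histogram(image[i:i + win, j:j + win], bins=edges)
            p = (h + 1.0) / (h.sum() + bins)
            sal[i, j] = np.sum(p * np.log(p / global_p))
    return sal

rng = np.random.default_rng(2)
img = rng.integers(0, 120, size=(48, 48)).astype(float)
img[20:30, 20:30] += 120              # a bright anomalous patch should stand out
sal = window_salience(img)
mask = sal >= np.quantile(sal, 0.75)  # upper-quartile threshold -> landmark regions
```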

  20. Why the Equal Rights Amendment?

    ERIC Educational Resources Information Center

    Denmark, Florence L.

    The Equal Rights Amendment proposes to ensure constitutional protection against all legislative sex discrimination. "Separate but Equal" standards, be they legal, social or psychological, are inevitably incompatible with equal protection under the law and act as a barrier to each individual's freedom for self-determination. Equal rights,…

  1. Using the Bootstrap Method for a Statistical Significance Test of Differences between Summary Histograms

    NASA Technical Reports Server (NTRS)

    Xu, Kuan-Man

    2006-01-01

    A new method is proposed to compare statistical differences between summary histograms, which are the histograms summed over a large ensemble of individual histograms. It consists of choosing a distance statistic for measuring the difference between summary histograms and using a bootstrap procedure to calculate the statistical significance level. Bootstrapping is an approach to statistical inference that makes few assumptions about the underlying probability distribution that describes the data. Three distance statistics are compared in this study. They are the Euclidean distance, the Jeffries-Matusita distance and the Kuiper distance. The data used in testing the bootstrap method are satellite measurements of cloud systems called cloud objects. Each cloud object is defined as a contiguous region/patch composed of individual footprints or fields of view. A histogram of measured values over footprints is generated for each parameter of each cloud object and then summary histograms are accumulated over all individual histograms in a given cloud-object size category. The results of statistical hypothesis tests using all three distances as test statistics are generally similar, indicating the validity of the proposed method. The Euclidean distance is determined to be most suitable after comparing the statistical tests of several parameters with distinct probability distributions among three cloud-object size categories. Impacts on the statistical significance levels resulting from differences in the total lengths of satellite footprint data between two size categories are also discussed.
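
    The procedure can be sketched end to end with the Euclidean distance (synthetic samples stand in for the cloud-object footprint data):

```python
import numpy as np

def hist_of(sample, edges):
    h, _ = np.histogram(sample, bins=edges)
    return h / h.sum()

def bootstrap_pvalue(a, b, edges, n_boot=2000, seed=0):
    """P-value for H0: a and b are drawn from the same distribution,
    using the Euclidean distance between normalized histograms."""
    rng = np.random.default_rng(seed)
    observed = np.linalg.norm(hist_of(a, edges) - hist_of(b, edges))
    pooled = np.concatenate([a, b])
    exceed = 0
    for _ in range(n_boot):
        ra = rng.choice(pooled, size=len(a), replace=True)
        rb = rng.choice(pooled, size=len(b), replace=True)
        if np.linalg.norm(hist_of(ra, edges) - hist_of(rb, edges)) >= observed:
            exceed += 1
    return exceed / n_boot

rng = np.random.default_rng(3)
edges = np.linspace(-4, 4, 21)
same_p = bootstrap_pvalue(rng.normal(size=500), rng.normal(size=500), edges)
diff_p = bootstrap_pvalue(rng.normal(size=500), rng.normal(0.8, 1.0, size=500), edges)
```

    Swapping in the Jeffries-Matusita or Kuiper distance only changes the distance function; the resampling logic stays the same.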

  2. Upper Extremity Length Equalization

    PubMed Central

    DeCoster, Thomas A.; Ritterbusch, John; Crawford, Mark

    1992-01-01

    Significant upper extremity length inequality is uncommon but can cause major functional problems. The ability to position and use the hand may be impaired by shortness of any of the long bones of the upper extremity. In many respects upper and lower extremity length problems are similar. They most commonly occur after injury to a growing bone and the treatment modalities utilized in the lower extremity may be applied to the upper extremity. These treatment options include epiphysiodesis, shortening osteotomy, angulatory correction osteotomy and lengthening. This report reviews the literature relative to upper extremity length inequality and equalization and presents an algorithm for evaluation and planning appropriate treatment for patients with this condition. This algorithm is illustrated by two clinical cases of posttraumatic shortness of the radius which were effectively treated.

  3. Kalman filtering approach to blind equalization

    NASA Astrophysics Data System (ADS)

    Kutlu, Mehmet

    1993-12-01

    Digital communication systems suffer from the channel distortion problem which introduces errors due to intersymbol interference. The solution to this problem is provided by equalizers which use a training sequence to adapt to the channel. However in many cases in which a training sequence is unfeasible, the channel must be adapted blindly. Most of the blind equalization algorithms known so far have problems of convergence to local minima. Our intention is to offer an alternative approach by using extended Kalman filtering and hidden Markov models. They seem to yield more efficient algorithms which take the statistics of the transmitted sequence into consideration. The theoretical development of these new algorithms is discussed in this thesis. Also these algorithms have been simulated under different conditions. The results of simulations and comparisons with existing systems are provided. The models for simulations are presented as MATLAB codes.

  4. Spectrum of Changes in RBC Indices and Histograms in Blood from Subjects with Cold Antibodies

    PubMed Central

    Kannan, Aarthi

    2016-01-01

    Cold antibodies are mostly immunoglobulin M, which interact with red cell antigens at lower temperatures (<37°C). The analysis of samples from subjects with cold antibodies in automated haematology analysers may show abnormal Red Blood Corpuscle (RBC) indices and changes in the histogram. High Mean Corpuscular Haemoglobin (MCH) and Mean Corpuscular Haemoglobin Concentration (MCHC), along with a plateau effect beyond 110fl at the Upper Discriminator (RU) end of the RBC histogram, are good indicators of the presence of cold antibodies in plasma. Cold antibodies in plasma must be considered while reporting the peripheral smear in the presence of a plateau effect beyond 110fl at the RU end of the RBC histogram. PMID:28050381

  5. CUDA implementation of histogram stretching function for improving X-ray image.

    PubMed

    Lee, Yong H; Kim, Kwan W; Kim, Soon S

    2013-01-01

    This paper presents a method to improve the contrast of digital X-ray images using a CUDA program on a GPU. The histogram is commonly used to obtain the statistical distribution of contrast in image processing. To increase the visibility of the image in real time, we use a histogram stretching function. Implementing this function on a GPU is complicated because the CUDA program must handle the transfer of the source data and the processed results between GPU memory and the host system. We show that, despite this overhead, the histogram stretching function can be executed quickly on the GPU by the CUDA program.
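
    The stretching step itself is straightforward; a CPU reference version in NumPy is sketched below (the paper's CUDA kernel is not reproduced, and the 1%/99% percentile clipping is an illustrative choice):

```python
import numpy as np

def histogram_stretch(image, low_pct=1.0, high_pct=99.0, out_max=255):
    """Linearly map the [low_pct, high_pct] percentile range onto [0, out_max]."""
    lo, hi = np.percentile(image, [low_pct, high_pct])
    if hi <= lo:  # essentially flat image: nothing to stretch
        return np.zeros_like(image, dtype=np.uint8)
    scaled = (image.astype(float) - lo) / (hi - lo)
    return (np.clip(scaled, 0.0, 1.0) * out_max).astype(np.uint8)

rng = np.random.default_rng(4)
xray = rng.integers(90, 140, size=(64, 64))  # low-contrast synthetic image
enhanced = histogram_stretch(xray)           # now spans the full 0-255 range
```

    On a GPU the per-pixel mapping parallelizes trivially; the histogram/percentile computation is the part that needs a parallel reduction.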

  6. An alternative to gamma histograms for ROI-based quantitative dose comparisons.

    PubMed

    Dvorak, P

    2009-06-21

    An alternative to gamma (γ) histograms for ROI-based quantitative comparisons of dose distributions using the γ concept is proposed. The method provides minimum values of dose difference and distance-to-agreement such that a pre-set fraction of the region of interest passes the γ test. Compared to standard γ histograms, the method provides more information in terms of pass rate per γ calculation. This is achieved at negligible additional calculation cost and without loss of accuracy. The presented method is proposed as a useful and complementary alternative to standard γ histograms, increasing both the quantity and quality of information for use in acceptance or rejection decisions.

  7. Equality Matters: The Critical Implications of Precisely Defining Equality

    ERIC Educational Resources Information Center

    Faulkner, Valerie; Walkowiak, Temple; Cain, Chris; Lee, Carrie

    2016-01-01

    Equality is such an important concept for children to develop. In this article it is argued that a precise definition is needed to ensure that students are provided with a consistent "picture" of what it is that equality really means.

  8. Blind equalization with criterion with memory nonlinearity

    NASA Astrophysics Data System (ADS)

    Chen, Yuanjie; Nikias, Chrysostomos L.; Proakis, John G.

    1992-06-01

    Blind equalization methods usually combat the linear distortion caused by a nonideal channel via a transversal filter, without resorting to the a priori known training sequences. We introduce a new criterion with memory nonlinearity (CRIMNO) for the blind equalization problem. The basic idea of this criterion is to augment the Godard [or constant modulus algorithm (CMA)] cost function with additional terms that penalize the autocorrelations of the equalizer outputs. Several variations of the CRIMNO algorithms are derived, with the variations dependent on (1) whether the empirical averages or the single point estimates are used to approximate the expectations, (2) whether the recent or the delayed equalizer coefficients are used, and (3) whether the weights applied to the autocorrelation terms are fixed or are allowed to adapt. Simulation experiments show that the CRIMNO algorithm, and especially its adaptive weight version, exhibits faster convergence speed than the Godard (or CMA) algorithm. Extensions of the CRIMNO criterion to accommodate the case of correlated inputs to the channel are also presented.
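
    For reference, the Godard (CMA, p = 2) cost that CRIMNO augments is J = E[(|y|^2 - R2)^2]; a minimal stochastic-gradient CMA equalizer (not the CRIMNO variant itself; the channel, step size, and tap count are illustrative) can be sketched as:

```python
import numpy as np

rng = np.random.default_rng(5)

# QPSK symbols through a mild complex FIR channel (illustrative).
n = 20_000
symbols = (rng.choice([-1, 1], n) + 1j * rng.choice([-1, 1], n)) / np.sqrt(2)
channel = np.array([1.0, 0.25 + 0.1j, -0.1j])
received = np.convolve(symbols, channel)[:n]

# CMA update: w <- w - mu * (|y|^2 - R2) * y * conj(x), R2 = E|s|^4 / E|s|^2.
taps = 11
w = np.zeros(taps, dtype=complex)
w[taps // 2] = 1.0            # centre-spike initialization
R2 = 1.0                      # unit-modulus QPSK
mu = 1e-3
sq_err = []
for k in range(taps, n):
    x = received[k - taps:k][::-1]     # equalizer input vector
    y = np.dot(w, x)                   # equalizer output
    e = np.abs(y) ** 2 - R2            # modulus error
    w = w - mu * e * y * np.conj(x)
    sq_err.append(e ** 2)

early = np.mean(sq_err[:2000])   # dispersion before adaptation takes hold
late = np.mean(sq_err[-2000:])   # dispersion after convergence
```

    CRIMNO adds penalty terms on the autocorrelations of the equalizer output to this cost; the update skeleton is unchanged.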

  9. Tests: The Foundation for Equality.

    ERIC Educational Resources Information Center

    Wooten, Kenneth L.

    1982-01-01

    Testing and tests are shown to have played a positive and dramatic role in increased opportunities through identification of educational talent. It is argued that tests, far from denying equality, are necessary conditions for equality and quality. (Author/CM)

  10. Recursive histogram modification: establishing equivalency between reversible data hiding and lossless data compression.

    PubMed

    Zhang, Weiming; Hu, Xiaocheng; Li, Xiaolong; Yu, Nenghai

    2013-07-01

    State-of-the-art schemes for reversible data hiding (RDH) usually consist of two steps: first construct a host sequence with a sharp histogram via prediction errors, and then embed messages by modifying the histogram with methods, such as difference expansion and histogram shift. In this paper, we focus on the second stage, and propose a histogram modification method for RDH, which embeds the message by recursively utilizing the decompression and compression processes of an entropy coder. We prove that, for independent identically distributed (i.i.d.) gray-scale host signals, the proposed method asymptotically approaches the rate-distortion bound of RDH as long as perfect compression can be realized, i.e., the entropy coder can approach entropy. Therefore, this method establishes the equivalency between reversible data hiding and lossless data compression. Experiments show that this coding method can be used to improve the performance of previous RDH schemes and the improvements are more significant for larger images.

  11. Face verification system for Android mobile devices using histogram based features

    NASA Astrophysics Data System (ADS)

    Sato, Sho; Kobayashi, Kazuhiro; Chen, Qiu

    2016-07-01

    This paper proposes a face verification system that runs on Android mobile devices. In this system, a facial image is first captured by a built-in camera on the Android device, and face detection is then performed using Haar-like features and the AdaBoost learning algorithm. The proposed system verifies the detected face using histogram-based features, which are generated as a binary Vector Quantization (VQ) histogram using DCT coefficients in low frequency domains, as well as an Improved Local Binary Pattern (Improved LBP) histogram in the spatial domain. Verification results with different types of histogram-based features are first obtained separately and then combined by weighted averaging. We evaluate our proposed algorithm by using the publicly available ORL database and facial images captured by an Android tablet.

  12. Efficient Human Action and Gait Analysis Using Multiresolution Motion Energy Histogram

    NASA Astrophysics Data System (ADS)

    Yu, Chih-Chang; Cheng, Hsu-Yung; Cheng, Chien-Hung; Fan, Kuo-Chin

    2010-12-01

    The Average Motion Energy (AME) image is a good way to describe human motions. However, its computational cost grows with the number of database templates. In this paper, we propose a histogram-based approach to improve computation efficiency: we convert the human action/gait recognition problem into a histogram matching problem. In order to speed up the recognition process, we adopt a multiresolution structure on the Motion Energy Histogram (MEH). To utilize the multiresolution structure more efficiently, we propose an automated uneven partitioning method which is achieved by utilizing the quadtree decomposition results of the MEH. In that case, the computation time depends only on the number of partitioned histogram bins, which is much smaller than in the AME method. Two applications, action recognition and gait classification, are conducted in the experiments to demonstrate the feasibility and validity of the proposed approach.

  13. Genetic Diversity and Human Equality.

    ERIC Educational Resources Information Center

    Dobzhansky, Theodosius

    The idea of equality often, if not frequently, bogs down in confusion and apparent contradictions; equality is confused with identity, and diversity with inequality. It would seem that the easiest way to discredit the idea of equality is to show that people are innately, genetically, and, therefore, irremediably diverse and unlike. The snare is,…

  14. Equality, Adequacy, and Educational Policy

    ERIC Educational Resources Information Center

    Satz, Debra

    2008-01-01

    In this article I argue that the distinction between an adequate education and an equal education has been overdrawn. In my view, a certain type of equality--civic equality--is internal to the idea of educational adequacy. An education system that completely separates the children of the poor and minorities from those of the wealthy and middle…

  15. De-Striping for Tdiccd Remote Sensing Image Based on Statistical Features of Histogram

    NASA Astrophysics Data System (ADS)

    Gao, Hui-ting; Liu, Wei; He, Hong-yan; Zhang, Bing-xian; Jiang, Cheng

    2016-06-01

    To address the striping noise caused by the non-uniform response of remote sensing TDI CCDs, a novel de-striping method based on statistical features of the image histogram is put forward. By analysing the distribution of histograms, the histogram centroid is selected as an eigenvalue representing the uniformity of ground objects. The histogram centroids of the whole image and of each pixel are calculated first, and the differences between them are regarded as rough correction coefficients. Then, to avoid the sensitivity caused by a single parameter, and considering the strong continuity and correlation of ground objects between two adjacent pixels, the correlation coefficient of the histograms is introduced to reflect the similarity between them; a fine correction coefficient is obtained by searching around the rough correction coefficient. Additionally, in view of the influence of bright clouds on the histogram, automatic cloud detection based on multiple features, including grey level, texture, fractal dimension and edges, is used to pre-process the image. Two level-0 panchromatic images from the SJ-9A satellite with obvious stripe noise were processed by the proposed method to evaluate its performance. Results show that the visual quality of the images is improved because the stripe noise is entirely removed; we quantitatively analyse the result by calculating the non-uniformity, which reaches about 1% and is better than the histogram matching method.

  16. Information-Adaptive Image Encoding and Restoration

    NASA Technical Reports Server (NTRS)

    Park, Stephen K.; Rahman, Zia-ur

    1998-01-01

    The multiscale retinex with color restoration (MSRCR) has shown itself to be a very versatile automatic image enhancement algorithm that simultaneously provides dynamic range compression, color constancy, and color rendition. A number of algorithms exist that provide one or more of these features, but not all. In this paper we compare the performance of the MSRCR with techniques that are widely used for image enhancement. Specifically, we compare the MSRCR with color adjustment methods such as gamma correction and gain/offset application, histogram modification techniques such as histogram equalization and manual histogram adjustment, and other more powerful techniques such as homomorphic filtering and 'burning and dodging'. The comparison is carried out by testing the suite of image enhancement methods on a set of diverse images. We find that though some of these techniques work well for some of these images, only the MSRCR performs universally well on the test set.

  17. Equal Pay for Equal Work; Women in Special Libraries.

    ERIC Educational Resources Information Center

    Special Libraries Association, New York, NY.

    The Special Libraries Association (SLA) provides information on achieving equal pay for equal work for women librarians in special libraries. A 1973 SLA study is cited to show pay differences between men and women. Then relevant legislation and executive orders are listed for the United States, along with similar legislation for Canada. Attention…

  18. Reframing Inclusive Education: Educational Equality as Capability Equality

    ERIC Educational Resources Information Center

    Terzi, Lorella

    2014-01-01

    In this paper, I argue that rethinking questions of inclusive education in the light of the value of educational equality--specifically conceived as capability equality, or genuine opportunities to achieve educational functionings--adds some important insights to the current debate on inclusive education. First, it provides a cohesive value…

  19. Medical Need, Equality, and Uncertainty.

    PubMed

    Horne, L Chad

    2016-10-01

    Many hold that distributing healthcare according to medical need is a requirement of equality. Most egalitarians believe, however, that people ought to be equal on the whole, by some overall measure of well-being or life-prospects; it would be a massive coincidence if distributing healthcare according to medical need turned out to be an effective way of promoting equality overall. I argue that distributing healthcare according to medical need is important for reducing individuals' uncertainty surrounding their future medical needs. In other words, distributing healthcare according to medical need is a natural feature of healthcare insurance; it is about indemnity, not equality.

  20. LETTER TO THE EDITOR: Comments on 'Reconsidering the definition of a dose volume histogram'—dose mass histogram (DMH) versus dose volume histogram (DVH) for predicting radiation-induced pneumonitis

    NASA Astrophysics Data System (ADS)

    Mavroidis, Panayiotis; Plataniotis, Georgios A.; Adamus-Górka, Magdalena; Lind, Bengt K.

    2006-12-01

    In a recently published paper (Nioutsikou et al 2005 Phys. Med. Biol. 50 L17) the authors showed that the use of the dose-mass histogram (DMH) concept is a more accurate descriptor of the dose delivered to lung than the traditionally used dose-volume histogram (DVH) concept. Furthermore, they state that if a functional imaging modality could also be registered to the anatomical imaging modality providing a functional weighting across the organ (functional mass) then the more general and realistic concept of the dose-functioning mass histogram (D[F]MH) could be an even more appropriate descriptor. The comments of the present letter to the editor are in line with the basic arguments of that work since their general conclusions appear to be supported by the comparison of the DMH and DVH concepts using radiobiological measures. In this study, it is examined whether the dose-mass histogram (DMH) concept deviates significantly from the widely used dose-volume histogram (DVH) concept regarding the expected lung complications and whether there are clinical indications supporting these results. The problem was investigated theoretically by applying two hypothetical dose distributions (Gaussian and semi-Gaussian shaped) to two lungs of uniform and varying densities. The influence of the deviation between DVHs and DMHs on the treatment outcome was estimated by using the relative seriality and LKB models with the Gagliardi et al (2000 Int. J. Radiat. Oncol. Biol. Phys. 46 373) and Seppenwoolde et al (2003 Int. J. Radiat. Oncol. Biol. Phys. 55 724) parameter sets for radiation pneumonitis, respectively. Furthermore, the biological equivalent of their difference was estimated by the biologically effective uniform dose (\bar{\bar{D}}) and equivalent uniform dose (EUD) concepts, respectively. It is shown that the relation between the DVHs and DMHs varies depending on the underlying cell density distribution and the applied dose distribution.
However, the range of their deviation in

  1. Histogram Curve Matching Approaches for Object-based Image Classification of Land Cover and Land Use

    PubMed Central

    Toure, Sory I.; Stow, Douglas A.; Weeks, John R.; Kumar, Sunil

    2013-01-01

    The classification of image-objects is usually done using parametric statistical measures of central tendency and/or dispersion (e.g., mean or standard deviation). The objectives of this study were to analyze digital number histograms of image objects and evaluate classification measures exploiting characteristic signatures of such histograms. Two histogram-matching classifiers were evaluated and compared to the standard nearest-neighbor-to-mean classifier. An ADS40 airborne multispectral image of San Diego, California was used for assessing the utility of curve matching classifiers in a geographic object-based image analysis (GEOBIA) approach. The classifications were performed with data sets having 0.5 m, 2.5 m, and 5 m spatial resolutions. Results show that histograms are reliable features for characterizing classes. Also, both histogram-matching classifiers consistently performed better than the one based on the standard nearest-neighbor-to-mean rule. The highest classification accuracies were produced with images having 2.5 m spatial resolution. PMID:24403648
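
    The contrast between mean-based and whole-histogram classification can be sketched as follows (the spectral classes and their distributions are hypothetical):

```python
import numpy as np

def normalized_hist(values, edges):
    h, _ = np.histogram(values, bins=edges)
    return h / h.sum()

def classify(obj_values, templates, edges):
    """Assign the class whose template histogram is nearest (Euclidean) to the object's."""
    h = normalized_hist(obj_values, edges)
    return min(templates, key=lambda name: np.linalg.norm(h - templates[name]))

rng = np.random.default_rng(6)
edges = np.linspace(0, 255, 33)
# Hypothetical classes: dark unimodal "water" vs. bright bimodal "urban".
templates = {
    "water": normalized_hist(rng.normal(60, 10, 5000), edges),
    "urban": normalized_hist(np.concatenate([rng.normal(120, 15, 2500),
                                             rng.normal(200, 15, 2500)]), edges),
}
label = classify(rng.normal(60, 10, 800), templates, edges)
label_u = classify(np.concatenate([rng.normal(120, 15, 400),
                                   rng.normal(200, 15, 400)]), templates, edges)
```

    A mean-only rule could not separate a bimodal class from a unimodal one with the same mean; the full histogram signature can.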

  2. Histogram-based classification with Gaussian mixture modeling for GBM tumor treatment response using ADC map

    NASA Astrophysics Data System (ADS)

    Huo, Jing; Kim, Hyun J.; Pope, Whitney B.; Okada, Kazunori; Alger, Jeffery R.; Wang, Yang; Goldin, Jonathan G.; Brown, Matthew S.

    2009-02-01

    This study applied a Gaussian Mixture Model (GMM) to apparent diffusion coefficient (ADC) histograms to evaluate glioblastoma multiforme (GBM) tumor treatment response using diffusion weighted (DW) MR images. ADC mapping, calculated from DW images, has been shown to reveal changes in the tumor's microenvironment preceding morphologic tumor changes. In this study, we investigated the effectiveness of features that represent changes from pre- and post-treatment tumor ADC histograms to detect treatment response. The main contribution of this work is to model the ADC histogram as the composition of two components, fitted by GMM with expectation maximization (EM) algorithm. For both pre- and post-treatment scans taken 5-7 weeks apart, we obtained the tumor ADC histogram, calculated the two-component features, as well as the other standard histogram-based features, and applied supervised learning for classification. We evaluated our approach with data from 85 patients with GBM under chemotherapy, in which 33 responded and 52 did not respond based on tumor size reduction. We compared AdaBoost and random forests classification algorithms, using ten-fold cross validation, resulting in a best accuracy of 69.41%.
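
    The two-component fit at the heart of this feature can be sketched with a plain EM loop on 1-D data (synthetic ADC-like values; the study's classifiers and patient data are not reproduced):

```python
import numpy as np

def fit_gmm2(x, n_iter=200):
    """EM for a two-component 1-D Gaussian mixture; returns (weights, means, stds)."""
    x = np.sort(np.asarray(x, dtype=float))
    mu = np.array([x[: len(x) // 2].mean(), x[len(x) // 2:].mean()])
    sd = np.array([x.std(), x.std()])
    w = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: responsibility of each component for each point.
        pdf = (w / (sd * np.sqrt(2 * np.pi))) * np.exp(-0.5 * ((x[:, None] - mu) / sd) ** 2)
        r = pdf / pdf.sum(axis=1, keepdims=True)
        # M-step: re-estimate mixing weights, means, and spreads.
        nk = r.sum(axis=0)
        w = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        sd = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    return w, mu, sd

rng = np.random.default_rng(7)
adc = np.concatenate([rng.normal(1.1, 0.15, 3000), rng.normal(2.0, 0.25, 1000)])
weights, means, stds = fit_gmm2(adc)  # recovers the two underlying components
```

    The fitted means, spreads, and mixing weights of the two components then serve as histogram features alongside the standard ones.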

  3. Value of MR histogram analyses for prediction of microvascular invasion of hepatocellular carcinoma

    PubMed Central

    Huang, Ya-Qin; Liang, He-Yue; Yang, Zhao-Xia; Ding, Ying; Zeng, Meng-Su; Rao, Sheng-Xiang

    2016-01-01

    Abstract The objective is to explore the value of preoperative magnetic resonance (MR) histogram analyses in predicting microvascular invasion (MVI) of hepatocellular carcinoma (HCC). Fifty-one patients with histologically confirmed HCC who underwent diffusion-weighted and contrast-enhanced MR imaging were included. Histogram analyses were performed and the mean, variance, skewness, kurtosis, and 1st, 10th, 50th, 90th, and 99th percentiles were derived. Quantitative histogram parameters were compared between HCCs with and without MVI. Receiver operating characteristic (ROC) analyses were generated to compare the diagnostic performance of tumor size, histogram analyses of apparent diffusion coefficient (ADC) maps, and MR enhancement. The mean and 1st, 10th, and 50th percentiles of the ADC maps, and the mean, variance, and 1st, 10th, 50th, 90th, and 99th percentiles of the portal venous phase (PVP) images were significantly different between the groups with and without MVI (P <0.05), with areas under the ROC curves (AUCs) of 0.66 to 0.74 for ADC and 0.76 to 0.88 for PVP. The largest AUC of PVP (1st percentile) showed significantly higher accuracy compared with that of the arterial phase (AP) or tumor size (P <0.001). MR histogram analyses—in particular the 1st percentile of the PVP images—held promise for prediction of MVI of HCC. PMID:27368028
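
    The standard histogram features named above are direct to compute; a sketch on synthetic values (hypothetical data, not the study's ADC maps):

```python
import numpy as np

def histogram_features(values):
    """Mean, variance, skewness, excess kurtosis, and selected percentiles."""
    v = np.asarray(values, dtype=float)
    mean, std = v.mean(), v.std()
    z = (v - mean) / std
    feats = {"mean": mean, "variance": v.var(),
             "skewness": np.mean(z ** 3), "kurtosis": np.mean(z ** 4) - 3.0}
    for q, p in zip([1, 10, 50, 90, 99], np.percentile(v, [1, 10, 50, 90, 99])):
        feats[f"p{q}"] = p
    return feats

rng = np.random.default_rng(8)
feats = histogram_features(rng.normal(1.2, 0.2, 10_000))  # ADC-like values
```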

  4. Dual-mode type algorithms for blind equalization

    NASA Astrophysics Data System (ADS)

    Weerackody, Vijitha; Kassam, Saleem A.

    1994-01-01

    Adaptive channel equalization accomplished without resorting to a training sequence is known as blind equalization. The Godard algorithm and the generalized Sato algorithm are two widely referenced algorithms for blind equalization of a QAM system. These algorithms exhibit very slow convergence rates when compared to algorithms employed in conventional data-aided equalization schemes. In order to speed up the convergence process, these algorithms may be switched over to a decision-directed equalization scheme once the error level is reasonably low. We present a scheme which is capable of operating in two modes: blind equalization mode and a mode similar to the decision-directed equalization mode. In this proposed scheme, the dominant mode of operation changes from the blind equalization mode at higher error levels to the mode similar to the decision-directed equalization mode at lower error levels. Manual switch-over to the decision-directed mode from the blind equalization mode, or vice-versa, is not necessary since transitions between the two modes take place smoothly and automatically.

  5. Governing Equality: Mathematics for All?

    ERIC Educational Resources Information Center

    Diaz, Jennifer D.

    2013-01-01

    With the notion of governmentality, this article considers how the equal sign (=) in the U.S. math curriculum organizes knowledge of equality and inscribes cultural rules for thinking, acting, and seeing in the world. Situating the discussion within contemporary math reforms aimed at teaching mathematics for all, I draw attention to how the…

  6. Equal Pay for Comparable Work.

    ERIC Educational Resources Information Center

    Von Frank, Jane

    1980-01-01

    Argues that sex discrimination has depressed salaries for jobs filled primarily by women. Shows that under the Equal Pay Act and Title VII, workers in traditionally female occupations can establish equal pay claims. Suggests approaches for developing legal and enforcement standards to deal with discriminatory compensation in traditionally female…

  7. Luck, Choice, and Educational Equality

    ERIC Educational Resources Information Center

    Calvert, John

    2015-01-01

    Harry Brighouse discusses two conceptions of educational equality. The first is a type of equality of opportunity, heavily influenced by the work of John Rawls, which he calls the meritocratic conception. According to this conception, an individual's educational prospects should not be influenced by factors such as their social class background.…

  8. Equality and Education -- Part 1

    ERIC Educational Resources Information Center

    Porter, John

    1975-01-01

    Discusses equality in education within the framework of the ideas of John Rawls, asserting that even though in the real world it is not easy to implement his version of equality and justice without endangering his prior principle of liberty, he provides a philosophical foundation for the reconsideration of the meritocratic principle. (Author/JM)

  9. Democracy, Equal Citizenship, and Education

    ERIC Educational Resources Information Center

    Callan, Eamonn

    2016-01-01

    Two appealing principles of educational distribution--equality and sufficiency--are comparatively assessed. The initial point of comparison is the distribution of civic educational goods. One reason to favor equality in educational distribution rather than sufficiency is the elimination of undeserved positional advantage in access to labor…

  10. An energy-based model for the image edge-histogram specification problem.

    PubMed

    Mignotte, Max

    2012-01-01

    In this correspondence, we present an original energy-based model that achieves the edge-histogram specification of a real input image and thus extends the exact specification method of the image luminance (or gray level) distribution recently proposed by Coltuc et al. Our edge-histogram specification approach is stated as an optimization problem in which each edge of a real input image will tend iteratively toward some specified gradient magnitude values given by a target edge distribution (or a normalized edge histogram possibly estimated from a target image). To this end, a hybrid optimization scheme combining a global and deterministic conjugate-gradient-based procedure and a local stochastic search using the Metropolis criterion is proposed herein to find a reliable solution to our energy-based model. Experimental results are presented, and several applications follow from this procedure.

  11. Method for quality control of laboratory tests using histograms of daily patient data.

    PubMed

    Okada, M

    1990-01-01

    A method for controlling the quality of laboratory tests is proposed. Histograms of patients' daily results which fall within the reference ranges of healthy individuals are used for estimating the accuracy and precision of measurements. For accuracy control, three methods are evaluated: computing an average of patients' results; determining the location of the peak of the histogram; and approximating the histogram by an Erlang distribution and determining the peak of the distribution. For precision control, standard deviations are calculated from patient data. We applied these methods to serum aspartate aminotransferase (AST or SGOT) and total cholesterol of patients in a general hospital. Averages, peaks of the approximated Erlang distribution, and standard deviations were found to be useful for daily quality control in laboratories of large hospitals.
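
    The patient-mean accuracy check can be sketched as follows (the reference range, analyte, and bias size are illustrative):

```python
import numpy as np

def daily_qc(results, ref_low, ref_high):
    """Mean and SD of the day's patient results inside the reference range.
    A systematic shift of the mean across days flags an accuracy problem."""
    r = np.asarray(results, dtype=float)
    inside = r[(r >= ref_low) & (r <= ref_high)]
    return inside.mean(), inside.std()

rng = np.random.default_rng(9)
ref_low, ref_high = 10.0, 40.0        # illustrative AST reference range, U/L
day1 = rng.normal(24, 6, 300)         # in-control day
day2 = rng.normal(24, 6, 300) + 5.0   # same patient mix with a +5 U/L bias
m1, s1 = daily_qc(day1, ref_low, ref_high)
m2, s2 = daily_qc(day2, ref_low, ref_high)
shift = m2 - m1                       # large shift -> investigate calibration
```

    The histogram-peak and Erlang-fit variants replace the mean with a mode estimate, which is less sensitive to abnormal results in the tails.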

  12. Infrared face recognition based on LBP histogram and KW feature selection

    NASA Astrophysics Data System (ADS)

    Xie, Zhihua

    2014-07-01

    The local binary pattern (LBP) histogram is a conventional feature representation that still has room for performance improvement. This paper focuses on dimension reduction of LBP micro-patterns and proposes an improved infrared face recognition method based on the LBP histogram representation. To extract robust local features from infrared face images, LBP is applied to obtain the micro-pattern composition of sub-blocks. Based on statistical test theory, a Kruskal-Wallis (KW) feature selection method is proposed to retain the LBP patterns best suited to infrared face recognition. The experimental results show that combining LBP with KW feature selection improves the performance of infrared face recognition; the proposed method outperforms traditional methods based on the LBP histogram, the discrete cosine transform (DCT), or principal component analysis (PCA).
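A minimal sketch of the two stages, assuming a basic 8-neighbour LBP and SciPy's `kruskal` for the KW statistic (the sub-block layout and the paper's exact selection rule are omitted):

```python
import numpy as np
from scipy.stats import kruskal

def lbp_histogram(img):
    """Basic 8-neighbour LBP codes and their normalised 256-bin histogram."""
    c = img[1:-1, 1:-1]
    code = np.zeros_like(c, dtype=np.uint8)
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    for bit, (dy, dx) in enumerate(shifts):
        nb = img[1 + dy:img.shape[0] - 1 + dy, 1 + dx:img.shape[1] - 1 + dx]
        code |= (nb >= c).astype(np.uint8) << bit
    hist = np.bincount(code.ravel(), minlength=256)
    return hist / hist.sum()

def kw_select(histograms, labels, keep=32):
    """Keep the LBP bins whose values best separate the classes
    according to the Kruskal-Wallis H statistic."""
    H, y = np.asarray(histograms), np.asarray(labels)
    stats = []
    for b in range(H.shape[1]):
        groups = [H[y == g, b] for g in np.unique(y)]
        try:
            stats.append(kruskal(*groups).statistic)
        except ValueError:                 # all values identical in this bin
            stats.append(0.0)
    return np.argsort(stats)[::-1][:keep]
```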

  13. Spline Histogram Method for Reconstruction of Probability Density Functions of Clusters of Galaxies

    NASA Astrophysics Data System (ADS)

    Docenko, Dmitrijs; Berzins, Karlis

    We describe the spline histogram algorithm, which is useful for visualizing a probability density function when setting up a statistical hypothesis for a test. The spline histogram is constructed from discrete data measurements using tensioned cubic spline interpolation of the cumulative distribution function, which is then differentiated and smoothed using the Savitzky-Golay filter. The optimal width of the filter is determined by minimization of the Integrated Square Error function. The current distribution of the TCSplin algorithm, written in f77 with IDL and Gnuplot visualization scripts, is available from www.virac.lv/en/soft.html.
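A rough sketch of the pipeline, with ordinary (untensioned) cubic splines in place of tensioned ones and a fixed Savitzky-Golay window rather than ISE minimization; knot count and window length are illustrative:

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import savgol_filter

def spline_histogram(data, n_knots=40, n_grid=400, window=51, polyorder=3):
    """Density estimate from the empirical CDF: spline interpolation at a
    subset of order statistics, differentiation, then S-G smoothing."""
    x = np.sort(np.asarray(data, float))
    n = len(x)
    idx = np.linspace(0, n - 1, n_knots).astype(int)
    knots_x, seen = np.unique(x[idx], return_index=True)   # strictly increasing
    knots_F = ((idx + 0.5) / n)[seen]                      # empirical CDF values
    spl = CubicSpline(knots_x, knots_F)
    grid = np.linspace(knots_x[0], knots_x[-1], n_grid)
    pdf = savgol_filter(spl(grid, 1), window, polyorder)   # d/dx of the CDF
    return grid, np.clip(pdf, 0.0, None)
```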

  14. Flat histogram diagrammatic Monte Carlo method: calculation of the Green's function in imaginary time.

    PubMed

    Diamantis, Nikolaos G; Manousakis, Efstratios

    2013-10-01

    The diagrammatic Monte Carlo (DiagMC) method is a numerical technique which samples the entire diagrammatic series of the Green's function in quantum many-body systems. In this work, we incorporate the flat histogram principle in the diagrammatic Monte Carlo method, and we term the improved version the "flat histogram diagrammatic Monte Carlo" method. We demonstrate the superiority of this method over the standard DiagMC in extracting the long-imaginary-time behavior of the Green's function, without incorporating any a priori knowledge about this function, by applying the technique to the polaron problem.
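The flat histogram principle can be illustrated on a toy problem, Wang-Landau-style sampling of the density of states of coin flips; this is not the DiagMC algorithm itself, and all parameters here are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

def wang_landau_binomial(n=10, f_final=1e-4, flat=0.8, sweep=2000, max_rounds=50):
    """Flat-histogram estimate of log g(k), the number of ways to get k heads
    out of n coins (exact answer: binomial coefficients)."""
    state = rng.integers(0, 2, n)
    log_g = np.zeros(n + 1)
    hist = np.zeros(n + 1)
    log_f, k = 1.0, int(state.sum())
    while log_f > f_final:
        for _ in range(max_rounds):
            for _ in range(sweep):
                i = rng.integers(n)
                k_new = k + (1 - 2 * int(state[i]))          # flip coin i
                # accept with min(1, g(k)/g(k_new)): biases walk toward rare k
                if np.log(rng.random()) < log_g[k] - log_g[k_new]:
                    state[i] ^= 1
                    k = k_new
                log_g[k] += log_f
                hist[k] += 1
            if hist.min() > flat * hist.mean():              # histogram flat?
                break
        log_f /= 2.0                                         # refine factor
        hist[:] = 0.0
    return log_g - log_g[0]                                  # normalise g(0)=1

log_g = wang_landau_binomial()
```

The same flattening idea, applied to the expansion-order/imaginary-time histogram of sampled diagrams, is what lets the method reach the rare long-time tail of the Green's function.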

  16. [Regular changes in histogram forms in physical measurements and mathematical modeling].

    PubMed

    Zenchenko, T A; Fedorov, M V; Zenchenko, K I; Konradov, A A; Shnol', S E

    2001-01-01

    A study of macroscopic fluctuations for objects separated by large distances confirmed the conclusion drawn earlier that, if the objects being measured are in different time zones, the increase in the probability of occurrence of histograms of similar form corresponds to the difference in the local time at the points of measurement. It was also found that pseudo-random number sequences produced by mathematical generators can yield sequences of histograms very similar to those in real physical series. This suggests the presence of previously unknown regularities, both physical and mathematical, in sequences traditionally considered absolutely random.

  17. Evaluating CMA Equalization of SOQPSK-TG for Aeronautical Telemetry

    DTIC Science & Technology

    2015-03-01

    Evaluating CMA Equalization of SOQPSK-TG Data for Aeronautical Telemetry, March 2015. DISTRIBUTION STATEMENT A: Approved for public release; distribution unlimited. Test Resource Management Center; contract number W900KK-13-C-0026. This standard is defined and used for aeronautical telemetry. Based on the iNET packet structure, the adaptive block processing CMA equalizer can be
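A generic constant-modulus-algorithm (CMA) tap update, sketched here for a toy QPSK signal rather than SOQPSK-TG, and as a sample-by-sample loop rather than the report's block processing; all parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

def cma_equalize(x, n_taps=11, mu=1e-3, r2=1.0):
    """Blind CMA equalizer: adapt FIR taps so the output magnitude
    approaches the constant modulus r2, with no training sequence."""
    w = np.zeros(n_taps, complex)
    w[n_taps // 2] = 1.0                       # centre-spike initialisation
    y = np.empty(len(x) - n_taps, complex)
    for n in range(len(y)):
        xn = x[n:n + n_taps][::-1]             # regressor, most recent first
        y[n] = w @ xn
        e = y[n] * (np.abs(y[n])**2 - r2)      # CM-cost gradient term
        w -= mu * e * np.conj(xn)
    return y, w

# toy QPSK symbols through a mild two-tap channel
sym = (rng.integers(0, 2, 4000) * 2 - 1 + 1j * (rng.integers(0, 2, 4000) * 2 - 1)) / np.sqrt(2)
rx = np.convolve(sym, np.array([1.0, 0.3 + 0.2j]))[:len(sym)]
y, w = cma_equalize(rx)
```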

  18. Loudspeaker equalization for auditory research.

    PubMed

    MacDonald, Justin A; Tran, Phuong K

    2007-02-01

    The equalization of loudspeaker frequency response is necessary to conduct many types of well-controlled auditory experiments. This article introduces a program that includes functions to measure a loudspeaker's frequency response, design equalization filters, and apply the filters to a set of stimuli to be used in an auditory experiment. The filters can compensate for both magnitude and phase distortions introduced by the loudspeaker. A MATLAB script is included in the Appendix to illustrate the details of the equalization algorithm used in the program.
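One common way to design such a magnitude-and-phase equalization filter is a regularized frequency-domain inverse of the measured impulse response. This is an assumption-laden simplification, not the article's MATLAB program:

```python
import numpy as np

def inverse_filter(ir, n_fft=1024, beta=1e-3):
    """Regularised inverse of a measured impulse response, compensating
    both magnitude and phase; beta avoids boosting deep spectral nulls."""
    H = np.fft.rfft(ir, n_fft)
    G = np.conj(H) / (np.abs(H)**2 + beta)     # regularised 1/H
    g = np.fft.irfft(G, n_fft)
    return np.roll(g, n_fft // 2)              # modelling delay -> causal FIR
```

Filtering the experiment's stimuli with `g` before playback then yields an approximately flat loudspeaker-plus-filter response, up to the chosen delay.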

  19. Equalization of data transmission cable

    NASA Technical Reports Server (NTRS)

    Zobrist, G. W.

    1975-01-01

    The paper describes an equalization approach utilizing a simple RLC network which can provide a maximum slope of -12 dB/octave for reshaping the frequency characteristics of a data transmission cable, so that data may be generated and detected at the receiver. An experimental procedure for determining equalizer design specifications using distortion analysis is presented. It was found that for lengths of 16 PEV-L cable of up to 5 miles and data transmission rates of up to 1 Mb/s, the equalization scheme proposed here is sufficient for generation of the data with acceptable error rates.

  20. Multiple point least squares equalization in a room

    NASA Technical Reports Server (NTRS)

    Elliott, S. J.; Nelson, P. A.

    1988-01-01

    Equalization filters designed to minimize the mean square error between a delayed version of the original electrical signal and the equalized response at a point in a room have previously been investigated. In general, such a strategy degrades the response at positions in a room away from the equalization point. A method is presented for designing an equalization filter by adjusting the filter coefficients to minimize the sum of the squares of the errors between the equalized responses at multiple points in the room and delayed versions of the original electrical signal. Such an equalization filter can give a more uniform frequency response over a greater volume of the enclosure than can the single point equalizer above. Computer simulation results are presented of equalizing the frequency responses from a loudspeaker to various typical ear positions, in a room with dimensions and acoustic damping typical of a car interior, using the two approaches outlined above. Adaptive filter algorithms, which can automatically adjust the coefficients of a digital equalization filter to achieve this minimization, will also be discussed.
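The multiple-point least-squares criterion can be sketched with a convolution-matrix formulation, one filter minimizing the summed squared error over all measured responses; names and sizes here are illustrative:

```python
import numpy as np

def multipoint_equalizer(irs, n_taps=64, delay=32):
    """Single LS-optimal FIR filter h minimising the summed squared error
    between each equalised room response (ir * h) and a delayed impulse."""
    rows, targets = [], []
    for ir in irs:
        n_out = len(ir) + n_taps - 1
        C = np.zeros((n_out, n_taps))
        for k in range(n_taps):                # convolution matrix: C @ h = ir * h
            C[k:k + len(ir), k] = ir
        d = np.zeros(n_out)
        d[delay] = 1.0                         # target: pure delay
        rows.append(C)
        targets.append(d)
    A, b = np.vstack(rows), np.concatenate(targets)
    h, *_ = np.linalg.lstsq(A, b, rcond=None)
    return h
```

Stacking the per-point systems and solving once is exactly what makes the result a compromise filter: it is optimal for the sum of the errors, not for any single listening position.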

  1. Equal Education and the Law

    ERIC Educational Resources Information Center

    Shanks, Hershel

    1970-01-01

    A number of court cases are cited which trace the development of various definitions and interpretations of the equal protection clause of the Fourteenth Amendment to the Constitution as would be applicable to "inadequate" schools. (DM)

  2. Electronegativity Equalization and Partial Charge

    ERIC Educational Resources Information Center

    Sanderson, R. T.

    1974-01-01

    This article elaborates on the relationship between covalent radius, homonuclear bond energy, and electronegativity, and sets the background for bond energy calculation by discussing the nature of heteronuclear covalent bonding on the basis of electronegativity equalization and partial charge. (DT)

  3. Electronegativity Equalization with Pauling Units.

    ERIC Educational Resources Information Center

    Bratsch, Steven G.

    1984-01-01

    Discusses electronegativity equalization using Pauling units. Although Pauling has qualitatively defined electronegativity as the power of an atom in a molecule to attract electrons to itself, Pauling electronegativities are treated in this paper as prebonded, isolated-atom quantities. (JN)

  4. Equal Pay for Comparable Work.

    ERIC Educational Resources Information Center

    Rothman, Nancy Lloyd; Rothman, Daniel A.

    1980-01-01

    Examines the legal battleground upon which one struggle for the equality of women is being fought. Updates a civil rights decision of crucial importance to nursing--Lemons v City and County of Denver. (JOW)

  5. Incentives, health promotion and equality.

    PubMed

    Voigt, Kristin

    2012-07-01

    The use of incentives to encourage individuals to adopt 'healthier' behaviours is an increasingly popular instrument in health policy. Much of the literature has been critical of 'negative' incentives, often due to concerns about equality; 'positive' incentives, however, have largely been welcomed as an instrument for the improvement of population health and possibly the reduction of health inequalities. The aim of this paper is to provide a more systematic assessment of the use of incentives from the perspective of equality. The paper begins with an overview of existing and proposed incentive schemes. I then suggest that the distinction between 'positive' and 'negative' incentives - or 'carrots' and 'sticks' - is of limited use in distinguishing those incentive schemes that raise concerns of equality from those that do not. The paper assesses incentive schemes with respect to two important considerations of equality: equality of access and equality of outcomes. While our assessment of incentive schemes will, ultimately, depend on various empirical facts, the paper aims to advance the debate by identifying some of the empirical questions we need to ask. The paper concludes by considering a number of trade-offs and caveats relevant to the assessment of incentive schemes.

  6. Large-Scale Merging of Histograms using Distributed In-Memory Computing

    NASA Astrophysics Data System (ADS)

    Blomer, Jakob; Ganis, Gerardo

    2015-12-01

    Most high-energy physics analysis jobs are embarrassingly parallel except for the final merging of the output objects, which are typically histograms. Currently, the merging of output histograms scales badly. The running time for distributed merging depends not only on the overall number of bins but also on the number of partial histogram output files. That means that while the time to analyze data decreases linearly with the number of worker nodes, the time to merge the histograms in fact increases with the number of worker nodes. On the grid, merging jobs that take a few hours are not unusual. In order to improve the situation, we present a distributed and decentralized merging algorithm whose running time is independent of the number of worker nodes. We exploit the full bisection bandwidth of local networks and we keep all intermediate results in memory. We present benchmarks from an implementation using the parallel ROOT facility (PROOF) and RAMCloud, a distributed key-value store that keeps all data in DRAM.
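The decentralized idea can be illustrated with a pairwise (tree) reduction of equal-binned histograms, which takes log2(n) rounds instead of a sequential n-way merge; this sketch ignores networking, PROOF, and RAMCloud entirely:

```python
import numpy as np

def tree_merge(hists):
    """Pairwise (tree) reduction of equal-binned histograms. Each round
    halves the number of partial histograms, so with one worker per pair
    the wall-clock rounds grow as log2(n) rather than n."""
    hists = [np.asarray(h) for h in hists]
    while len(hists) > 1:
        nxt = [hists[i] + hists[i + 1] for i in range(0, len(hists) - 1, 2)]
        if len(hists) % 2:                     # odd one out joins next round
            nxt.append(hists[-1])
        hists = nxt
    return hists[0]
```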

  7. A Concise Guide to Feature Histograms with Applications to LIDAR-Based Spacecraft Relative Navigation

    NASA Astrophysics Data System (ADS)

    Rhodes, Andrew P.; Christian, John A.; Evans, Thomas

    2017-01-01

    With the availability and popularity of 3D sensors, it is advantageous to re-examine the use of point cloud descriptors for the purpose of pose estimation and spacecraft relative navigation. One popular descriptor is the oriented unique repeatable clustered viewpoint feature histogram (OUR-CVFH), which is most often utilized in personal and industrial robotics to simultaneously recognize and navigate relative to an object. Recent research into using the OUR-CVFH descriptor for spacecraft navigation has produced favorable results. Since OUR-CVFH is the most recent innovation in a large family of feature histogram point cloud descriptors, discussions of parameter settings and insights into its functionality are spread among various publications and online resources. This paper organizes the history of feature histogram point cloud descriptors for a straightforward explanation of their evolution. This article compiles all the requisite information needed to implement OUR-CVFH into one location, as well as providing useful suggestions on how to tune the generation parameters. This work is beneficial for anyone interested in using this histogram descriptor for object recognition or navigation, be it personal robotics or spacecraft navigation.

  8. CT texture analysis using the filtration-histogram method: what do the measurements mean?

    PubMed Central

    Ganeshan, Balaji; Hayball, Michael P.

    2013-01-01

    Abstract Analysis of texture within tumours on computed tomography (CT) is emerging as a potentially useful tool in assessing prognosis and treatment response for patients with cancer. This article illustrates the image and histological features that correlate with CT texture parameters obtained from tumours using the filtration-histogram approach, which comprises image filtration to highlight image features of a specified size followed by histogram analysis for quantification. Computer modelling can be used to generate texture parameters for a range of simple hypothetical images with specified image features. The model results are useful in explaining relationships between image features and texture parameters. The main image features that can be related to texture parameters are the number of objects highlighted by the filter, the brightness and/or contrast of highlighted objects relative to background attenuation, and the variability of brightness/contrast of highlighted objects. These relationships are also demonstrable by texture analysis of clinical CT images. The results of computer modelling may facilitate the interpretation of the reported associations between CT texture and histopathology in human tumours. The histogram parameters derived during the filtration-histogram method of CT texture analysis have specific relationships with a range of image features. Knowledge of these relationships can assist the understanding of results obtained from clinical CT texture analysis studies in oncology. PMID:24061266
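The filtration-histogram approach can be sketched as a Laplacian-of-Gaussian filtration at a chosen feature scale followed by summary statistics of the filtered-intensity histogram; the parameter choices and feature set here are illustrative:

```python
import numpy as np
from scipy.ndimage import gaussian_laplace

def filtration_histogram_features(img, sigma=2.0):
    """Filtration-histogram texture analysis: LoG filtration highlights
    image features of scale ~sigma, then the histogram of the filtered
    values is quantified by its moments."""
    v = gaussian_laplace(np.asarray(img, float), sigma).ravel()
    sd = v.std()
    z = (v - v.mean()) / sd
    return {
        "mean": v.mean(),          # overall brightness of highlighted objects
        "sd": sd,                  # variability of highlighted-object contrast
        "skewness": np.mean(z**3),
        "kurtosis": np.mean(z**4) - 3.0,
    }
```

Repeating the extraction over several `sigma` values gives the fine-to-coarse texture profile used in the clinical studies.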

  9. Effect of molecular organization on the image histograms of polarization SHG microscopy.

    PubMed

    Psilodimitrakopoulos, Sotiris; Amat-Roldan, Ivan; Loza-Alvarez, Pablo; Artigas, David

    2012-10-01

    Based on its polarization dependency, polarization-sensitive second harmonic generation (PSHG) microscopy has been proven capable of structurally characterizing molecular architectures in different biological samples. By exploiting this polarization dependency of the SHG signal in every pixel of the image, average quantitative structural information can be retrieved in the form of PSHG image histograms. In the present study we experimentally show how the PSHG image histograms can be affected by the organization of the SHG active molecules. Our experimental scenario is based on two inherent properties of starch granules. Firstly, we take advantage of the radial organization of amylopectin molecules (the SHG source in starch) to attribute shifts of the image histograms to the existence of molecules tilted off the plane. Secondly, we use the property of starch to organize upon hydration to demonstrate that the degree of structural order at the molecular level affects the width of the PSHG image histograms: the narrower the histogram, the more organized the molecules in the sample, resulting in a reliable method to measure order. The implication of this finding is crucial to the interpretation of PSHG images used, for example, in tissue diagnostics.

  10. Histogram of Gabor phase patterns (HGPP): a novel object representation approach for face recognition.

    PubMed

    Zhang, Baochang; Shan, Shiguang; Chen, Xilin; Gao, Wen

    2007-01-01

    A novel object descriptor, histogram of Gabor phase pattern (HGPP), is proposed for robust face recognition. In HGPP, the quadrant-bit codes are first extracted from faces based on the Gabor transformation. Global Gabor phase pattern (GGPP) and local Gabor phase pattern (LGPP) are then proposed to encode the phase variations. GGPP captures the variations derived from the orientation changing of Gabor wavelet at a given scale (frequency), while LGPP encodes the local neighborhood variations by using a novel local XOR pattern (LXP) operator. They are both divided into the nonoverlapping rectangular regions, from which spatial histograms are extracted and concatenated into an extended histogram feature to represent the original image. Finally, the recognition is performed by using the nearest-neighbor classifier with histogram intersection as the similarity measurement. The features of HGPP lie in two aspects: 1) HGPP can describe the general face images robustly without the training procedure; 2) HGPP encodes the Gabor phase information, while most previous face recognition methods exploit the Gabor magnitude information. In addition, Fisher separation criterion is further used to improve the performance of HGPP by weighing the subregions of the image according to their discriminative powers. The proposed methods are successfully applied to face recognition, and the experiment results on the large-scale FERET and CAS-PEAL databases show that the proposed algorithms significantly outperform other well-known systems in terms of recognition rate.
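The final matching step above, nearest-neighbour classification with histogram intersection as the similarity measure, is simple to sketch (the Gabor-phase encoding itself is omitted):

```python
import numpy as np

def histogram_intersection(h1, h2):
    """Overlap of two normalised histograms: sum of bin-wise minima."""
    return np.minimum(h1, h2).sum()

def nearest_neighbor(query, gallery):
    """Index of the gallery histogram most similar to the query."""
    return int(np.argmax([histogram_intersection(query, g) for g in gallery]))
```

In HGPP the `query` and `gallery` vectors would be the concatenated, Fisher-weighted spatial histograms of the GGPP and LGPP codes.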

  11. DIF Testing with an Empirical-Histogram Approximation of the Latent Density for Each Group

    ERIC Educational Resources Information Center

    Woods, Carol M.

    2011-01-01

    This research introduces, illustrates, and tests a variation of IRT-LR-DIF, called EH-DIF-2, in which the latent density for each group is estimated simultaneously with the item parameters as an empirical histogram (EH). IRT-LR-DIF is used to evaluate the degree to which items have different measurement properties for one group of people versus…

  12. Histogram of oriented phase (HOP): a new descriptor based on phase congruency

    NASA Astrophysics Data System (ADS)

    Ragb, Hussin K.; Asari, Vijayan K.

    2016-05-01

    In this paper we present a low-level image descriptor called the Histogram of Oriented Phase, based on the phase congruency concept and principal component analysis (PCA). Since the phase of a signal conveys more information about signal structure than the magnitude, the proposed descriptor can identify and localize image features more precisely than gradient-based techniques, especially in regions affected by illumination changes. The proposed features are formed by extracting the phase congruency information for each pixel in the image with respect to its neighborhood. Histograms of the phase congruency values of local regions in the image are computed with respect to orientation. These histograms are concatenated to construct the Histogram of Oriented Phase (HOP) features. The dimensionality of the HOP features is reduced using the PCA algorithm to form the HOP-PCA descriptor. Because phase congruency is a dimensionless quantity, the HOP-PCA descriptor is more robust to image scale variations as well as contrast and illumination changes. Several experiments were performed using the INRIA and DaimlerChrysler datasets to evaluate the performance of the HOP-PCA descriptor. The experimental results show that the proposed descriptor has better detection performance and lower error rates than a set of state-of-the-art feature extraction methodologies.

  13. Direct-space methods in phase extension and phase refinement. IV. The double-histogram method.

    PubMed

    Refaat, L S; Tate, C; Woolfson, M M

    1996-03-01

    In the conventional histogram-matching technique for phase extension and refinement for proteins a simple one-to-one transformation is made in the protein region to modify calculated density so that it will have some target histogram in addition to solvent flattening. This work describes an investigation where the density modification takes into account not only the current calculated density at a grid point but also some characteristic of the environment of the grid point within some distance R. This characteristic can be one of the local maximum density, the local minimum density or the local variance of density. The grid points are divided into ten groups, each containing the same number of grid points, for ten different ranges of value of the local characteristic. The ten groups are modified to give different histograms, each corresponding to that obtained under the same circumstances from a structure similar to the one under investigation. This process is referred to as the double-histogram matching method. Other processes which have been investigated are the weighting of structure factors when calculating maps with estimated phases and also the use of a factor to dampen the change of density and so control the refinement process. Two protein structures were used in numerical trials, RNApl [Bezborodova, Ermekbaeva, Shlyapnikov, Polyakov & Bezborodov (1988). Biokhimiya, 53, 965-973] and 2-Zn insulin [Baker, Blundell, Cutfield, Cutfield, Dodson, Dodson, Hodgkin, Hubbard, Isaacs, Reynolds, Sakabe, Sakabe & Vijayan (1988). Philos. Trans. R. Soc. London Ser. B, 319, 456-469]. Comparison of the proposed procedures with the normal histogram-matching technique without structure-factor weighting or damping gives mean phase errors reduced by up to 10 degrees with map correlation coefficients improved by as much as 0.14.
Compared to the normal histogram used with weighting of structure factors and damping, the improvement due to the use of the double-histogram method is
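The single-histogram matching step that the double-histogram method refines can be sketched as a rank-to-quantile mapping; this is an illustrative simplification, not the paper's procedure:

```python
import numpy as np

def histogram_match(values, target):
    """One-to-one density modification: map each value through its
    empirical rank onto the target distribution's quantiles. The
    double-histogram method applies such a map separately per
    local-environment class of grid points."""
    ranks = np.argsort(np.argsort(values))           # rank of each value
    quantiles = (ranks + 0.5) / len(values)
    return np.quantile(np.asarray(target), quantiles)
```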

  14. A CMOS VLSI IC for real-time opto-electronic two-dimensional histogram generation

    NASA Astrophysics Data System (ADS)

    Richstein, James K.

    1993-12-01

    Histogram generation, a standard image processing operation, is a record of the intensity distribution in the image. Histogram generation has straightforward implementations on digital computers using high level languages. A prototype of an optical-electronic histogram generator was designed and tested for 1-D objects using wire-wrapped MSI TTL components. The system has been shown to be fairly modular in design. The aspects of the extension to two dimensions and the VLSI implementation of this design are discussed. In this paper, we report a VLSI design to be used in a two-dimensional real-time histogram generation scheme. The overall system design is such that the electronic signal obtained from the optically scanned two-dimensional semi-opaque image is processed and displayed within a period of one cycle of the scanning process. Specifically, in the VLSI implementation of the two-dimensional histogram generator, modifications were made to the original design. For the two-dimensional application, the output controller was analyzed as a finite state machine. The process used to describe the required timing signals and translate them to a VLSI finite state machine using computer-aided design tools is discussed. In addition, the circuitry for sampling, binning, and display was combined with the timing circuitry on one IC. In the original design, the pulse width of the electronically sampled photodetector is limited with an analog one-shot. The high sampling rates associated with the extension to two dimensions require a significant reduction in the original 1-D prototype's sample pulse width of approximately 75 ns. The alternate design using VLSI logic gates will provide one-shot pulse widths of approximately 3 ns.

  15. Principal component analysis of the CT density histogram to generate parametric response maps of COPD

    NASA Astrophysics Data System (ADS)

    Zha, N.; Capaldi, D. P. I.; Pike, D.; McCormack, D. G.; Cunningham, I. A.; Parraga, G.

    2015-03-01

    Pulmonary x-ray computed tomography (CT) may be used to characterize emphysema and airways disease in patients with chronic obstructive pulmonary disease (COPD). One analysis approach, parametric response mapping (PRM), utilizes registered inspiratory and expiratory CT image volumes and CT-density-histogram thresholds, but there is no consensus regarding the threshold values used, or their clinical meaning. Principal-component-analysis (PCA) of the CT density histogram can be exploited to quantify emphysema using data-driven CT-density-histogram thresholds. Thus, the objective of this proof-of-concept demonstration was to develop a PRM approach using PCA-derived thresholds in COPD patients and ex-smokers without airflow limitation. Methods: Fifteen COPD ex-smokers and 5 normal ex-smokers were evaluated. Thoracic CT images were acquired at full inspiration and full expiration and these images were non-rigidly co-registered. PCA was performed for the CT density histograms, from which the components with the highest eigenvalues greater than one were summed. Since the values of the principal component curve correlate directly with the variability in the sample, the maximum and minimum points on the curve were used as threshold values for the PCA-adjusted PRM technique. Results: A significant correlation was determined between conventional and PCA-adjusted PRM with 3He MRI apparent diffusion coefficient (p<0.001), with CT RA950 (p<0.0001), as well as with 3He MRI ventilation defect percent, a measurement of both small airways disease (p=0.049 and p=0.06, respectively) and emphysema (p=0.02). Conclusions: PRM generated using PCA thresholds of the CT density histogram showed significant correlations with CT and 3He MRI measurements of emphysema, but not airways disease.
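The threshold-selection idea can be sketched as follows; the eigenvalue-greater-than-one rule and the use of the summed component curve's extrema follow the abstract, while everything else (names, shapes) is an illustrative assumption:

```python
import numpy as np

def pca_thresholds(histograms, bin_centers):
    """Data-driven density thresholds: sum the principal components of the
    CT density histograms whose eigenvalues exceed one, then take the bin
    values at the summed curve's maximum and minimum as PRM thresholds."""
    H = np.asarray(histograms, float)
    cov = np.cov(H - H.mean(axis=0), rowvar=False)
    eigval, eigvec = np.linalg.eigh(cov)
    curve = eigvec[:, eigval > 1.0].sum(axis=1)     # summed principal components
    return bin_centers[np.argmax(curve)], bin_centers[np.argmin(curve)]
```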

  16. Histogram and gray level co-occurrence matrix on gray-scale ultrasound images for diagnosing lymphocytic thyroiditis.

    PubMed

    Shin, Young Gyung; Yoo, Jaeheung; Kwon, Hyeong Ju; Hong, Jung Hwa; Lee, Hye Sun; Yoon, Jung Hyun; Kim, Eun-Kyung; Moon, Hee Jung; Han, Kyunghwa; Kwak, Jin Young

    2016-08-01

    The objective of the study was to evaluate whether texture analysis using histogram and gray level co-occurrence matrix (GLCM) parameters can help clinicians diagnose lymphocytic thyroiditis (LT) and differentiate LT according to pathologic grade. The background thyroid pathology of 441 patients was classified into no evidence of LT, chronic LT (CLT), and Hashimoto's thyroiditis (HT). Histogram and GLCM parameters were extracted from the regions of interest on ultrasound. The diagnostic performances of the parameters for diagnosing and differentiating LT were calculated. Of the histogram and GLCM parameters, the mean on histogram had the highest Az (0.63) and VUS (0.303). As the degrees of LT increased, the mean decreased and the standard deviation and entropy increased. The mean on histogram from gray-scale ultrasound showed the best diagnostic performance as a single parameter in differentiating LT according to pathologic grade as well as in diagnosing LT.
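A sketch of the histogram and GLCM parameters for a single pixel offset; the quantization depth and the horizontal offset are illustrative choices, and the intensity image is assumed scaled to [0, 1]:

```python
import numpy as np

def glcm_features(img, levels=8):
    """Histogram mean plus GLCM texture measures (horizontal one-pixel
    offset) of the kind used to characterise ultrasound texture."""
    q = np.clip((np.asarray(img, float) * levels).astype(int), 0, levels - 1)
    left, right = q[:, :-1].ravel(), q[:, 1:].ravel()
    P = np.zeros((levels, levels))
    np.add.at(P, (left, right), 1)          # co-occurrence counts
    P /= P.sum()
    i, j = np.indices(P.shape)
    return {
        "hist_mean": float(np.mean(img)),
        "contrast": (P * (i - j)**2).sum(),
        "energy": (P**2).sum(),
        "homogeneity": (P / (1 + np.abs(i - j))).sum(),
        "entropy": -(P[P > 0] * np.log2(P[P > 0])).sum(),
    }
```

On a perfectly uniform region the contrast and entropy vanish and the energy is maximal; increasingly heterogeneous parenchyma (as in higher grades of LT) raises the standard deviation and entropy, consistent with the trend reported above.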

  17. [Fractal dimension and histogram method: algorithm and some preliminary results of noise-like time series analysis].

    PubMed

    Pancheliuga, V A; Pancheliuga, M S

    2013-01-01

    In the present work a methodological background for the histogram method of time series analysis is developed. The connection between the shapes of smoothed histograms constructed from short segments of time series of fluctuations and the fractal dimension of the segments is studied. It is shown that the fractal dimension possesses all the main properties of the histogram method. Based on it, a further development of the fractal dimension determination algorithm is proposed. This algorithm allows a more precise determination of the fractal dimension by using the "all possible combination" method. The application of the method to noise-like time series analysis leads to results which could previously be obtained only by means of the histogram method based on human expert comparisons of histogram shapes.

  18. Equal Employment + Equal Pay = Multiple Problems for Colleges and Universities

    ERIC Educational Resources Information Center

    Steinbach, Sheldon Elliot; Reback, Joyce E.

    1974-01-01

    Issues involved in government regulation of university employment practices are discussed: confidentiality of records, pregnancy as a disability, alleged discrimination in benefits, tests and other employment criteria, seniority and layoff, reverse discrimination, use of statistics for determination of discrimination, and the Equal Pay Act. (JT)

  19. Equality and selection for existence.

    PubMed Central

    Persson, I

    1999-01-01

    It is argued that the policy of excluding from further life some human gametes and pre-embryos as "unfit" for existence is not at odds with a defensible idea of human equality. Such an idea must be compatible with the obvious fact that the "functional" value of humans differs, that their "use" to themselves and others differs. A defensible idea of human equality is instead grounded in the fact that as this functional difference is genetically determined, it is nothing which makes humans deserve or be worthy of being better or worse off. Rather, nobody is worth a better life than anyone else. This idea of equality is, however, not applicable to gametes and pre-embryos, since they are not human beings, but something out of which human beings develop. PMID:10226918

  20. 41 CFR 60-741.5 - Equal opportunity clause.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., organizational structures, position descriptions, lines of progression, and seniority lists; v. Leaves of absence... opportunity clause in each of its subcontracts subject to this part. (c) Adaption of language. Such necessary changes in language may be made to the equal opportunity clause as shall be appropriate to...

  1. Equal Pay: The Emerging Terrain.

    ERIC Educational Resources Information Center

    Weeks, Kent M.

    1985-01-01

    Colleges and universities can employ several statutory defenses to alleged pay disparities and demonstrate that there are legitimate reasons for pay differentials. Several preventive strategies in response to the emerging legal terrain of equal pay litigation are suggested. (Author/MLW)

  2. STEM Equality and Diversity Toolkit

    ERIC Educational Resources Information Center

    Collins, Jill

    2011-01-01

    In 2008, the Centre for Science Education at Sheffield Hallam University teamed up with VT Enterprise (now Babcock International) in their submission of a successful bid to deliver the national STEM (Science, Technology, Engineering and Maths) Subject Choice and Careers Project. An integral part of the bid was the promotion of equality and…

  3. Extending Understanding of Equal Protection.

    ERIC Educational Resources Information Center

    Dreyfuss, Elisabeth T.

    1988-01-01

    Presents four strategies for teaching secondary students about equal protection clause of the U.S. Constitution's Fourteenth Amendment. To be taught by the classroom teacher or a visiting lawyer, these strategies use such methods as a panel discussion and examination of Fourteenth Amendment court cases to accomplish their goals. (GEA)

  4. Equal Opportunity Through Instructional Design.

    ERIC Educational Resources Information Center

    Arrighi, Margarite A.

    1985-01-01

    The assumption is that sex-integrated classes are inherently equal by the very fact that boys and girls are in the same class. In fact, educational inequity has increased primarily because of instructional design which perpetuates differences among individuals. Good teaching must accommodate individual differences. (MT)

  5. Equal, but not the same.

    PubMed

    Fleming, Gabriel

    2006-10-19

    In April 2007, the gender equality duty will make it obligatory for all health providers to actively demonstrate equity in service provision. Good practice tends to exist in small projects with little evidence of national progress towards gender equity. The DoH says trusts should already be working towards…

  6. Religious Freedom vs. Sex Equality

    ERIC Educational Resources Information Center

    Song, Sarah

    2006-01-01

    This essay examines Susan Moller Okin's writing on conflicts between religious freedom and sex equality, and her criticism of "political liberal" approaches to these conflicts, which I take to be a part of her lifelong critique of the public-private distinction. I argue that, while Okin ultimately accepted a version of the distinction, she was…

  7. EQUALITY OF EDUCATIONAL OPPORTUNITY, RECONSIDERED.

    ERIC Educational Resources Information Center

    COLEMAN, JAMES S.

    THIS STUDY POSES THE QUESTION OF HOW TO MEASURE THE DEGREE OF INEQUALITY OF EDUCATIONAL OPPORTUNITY FOR SUBGROUPS IN SOCIETY. IT EXAMINES AND REJECTS THE DOMINANT IDEA THAT EQUAL EDUCATIONAL OPPORTUNITY IS PROVIDED BY A COMMUNITY THROUGH THE PROVISION OF FACILITIES WITH FREE AND OPEN ACCESS FOR ALL, SUBSTITUTING THE IDEA THAT IT IS THE INTENSITY…

  8. Promote Equality in the Classroom.

    ERIC Educational Resources Information Center

    Brown, Sharon; And Others

    1996-01-01

    Presents suggestions to help physical educators treat all students equally and avoid unconsciously making inequitable gender-based statements and practicing other gender discrimination. Suggestions include encouraging girls to talk more, praising girls' performance and boys' appearance, using gender-neutral language, not stereotyping either sex,…

  9. Detecting entanglement with Jarzynski's equality

    SciTech Connect

    Hide, Jenny; Vedral, Vlatko

    2010-06-15

    We present a method for detecting the entanglement of a state using nonequilibrium processes. A comparison of relative entropies allows us to construct an entanglement witness. The relative entropy can further be related to the quantum Jarzynski equality, allowing nonequilibrium work to be used in entanglement detection. To exemplify our results, we consider two different spin chains.

  10. ADC Histograms from Routine DWI for Longitudinal Studies in Cerebral Small Vessel Disease: A Field Study in CADASIL

    PubMed Central

    Gunda, Bence; Porcher, Raphael; Duering, Marco; Guichard, Jean-Pierre; Mawet, Jerome; Jouvent, Eric; Dichgans, Martin; Chabriat, Hugues

    2014-01-01

    Diffusion tensor imaging (DTI) histogram metrics are correlated with clinical parameters in cerebral small vessel diseases (cSVD). Whether ADC histogram parameters derived from simple diffusion weighted imaging (DWI) can provide relevant markers for long term studies of cSVD remains unknown. CADASIL patients were evaluated by DWI and DTI in a large cohort study over a 6-year period. ADC histogram parameters were compared to those derived from mean diffusivity (MD) histograms in 280 patients using intra-class correlation and Bland-Altman plots. Impact of image corrections applied to ADC maps was assessed and a mixed effect model was used for analyzing the effects of scanner upgrades. The results showed that ADC histogram parameters are strongly correlated to MD histogram parameters and that image corrections have only limited influence on these results. Unexpectedly, scanner upgrades were found to have major effects on diffusion measures with DWI or DTI that can be even larger than those related to patients’ characteristics. These data support that ADC histograms from daily used DWI can provide relevant parameters for assessing cSVD, but the variability related to scanner upgrades as regularly performed in clinical centers should be determined precisely for longitudinal and multicentric studies using diffusion MRI in cSVD. PMID:24819368

  11. Accelerating atomic-level protein simulations by flat-histogram techniques

    NASA Astrophysics Data System (ADS)

    Jónsson, Sigurður Æ.; Mohanty, Sandipan; Irbäck, Anders

    2011-09-01

    Flat-histogram techniques provide a powerful approach to the simulation of first-order-like phase transitions and are potentially very useful for protein studies. Here, we test this approach by implicit solvent all-atom Monte Carlo (MC) simulations of peptide aggregation, for a 7-residue fragment (GIIFNEQ) of the Cu/Zn superoxide dismutase 1 protein (SOD1). In simulations with 8 chains, we observe two distinct aggregated/non-aggregated phases. At the midpoint temperature, these phases coexist, separated by a free-energy barrier of height 2.7 kBT. We show that this system can be successfully studied by carefully implemented flat-histogram techniques. The frequency of barrier crossing, which is low in conventional canonical simulations, can be increased by turning to a two-step procedure based on the Wang-Landau and multicanonical algorithms.
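    The Wang-Landau stage of the two-step procedure described above can be illustrated on a toy system. This is a minimal sketch under an assumed model (not the paper's all-atom implicit-solvent setup): N independent two-state spins with energy E equal to the number of "up" spins, so the exact density of states is the binomial coefficient C(N, E).

```python
import math
import random

def wang_landau(n_spins=8, f_final=1e-3, flatness=0.8, seed=1):
    """Wang-Landau estimate of ln g(E) for a toy model of n_spins
    independent binary spins with E = number of up spins.
    The exact answer is ln g(E) = ln C(n_spins, E)."""
    rng = random.Random(seed)
    spins = [rng.choice([0, 1]) for _ in range(n_spins)]
    e = sum(spins)
    ln_g = [0.0] * (n_spins + 1)   # running estimate of log density of states
    f = 1.0                        # modification factor, halved when histogram is flat
    while f > f_final:
        hist = [0] * (n_spins + 1)
        for _ in range(20000):
            i = rng.randrange(n_spins)
            e_new = e + (1 - 2 * spins[i])          # a flip changes E by +/-1
            # accept with probability min(1, g(e)/g(e_new)) -> flat histogram in E
            if math.log(rng.random() + 1e-300) < ln_g[e] - ln_g[e_new]:
                spins[i] ^= 1
                e = e_new
            ln_g[e] += f
            hist[e] += 1
        if min(hist) > flatness * sum(hist) / len(hist):
            f /= 2.0               # histogram flat enough: refine the updates
    base = ln_g[0]
    return [x - base for x in ln_g]  # normalize so ln g(0) = 0
```

    With this normalization, ln g(4) should approach ln C(8, 4) = ln 70 ≈ 4.25; the same flat-histogram walk is what lets the peptide system cross its aggregation free-energy barrier far more often than a canonical simulation would.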

  12. Recovery of the histogram of hourly ozone distribution from weekly average concentrations.

    PubMed

    Olcese, Luis E; Toselli, Beatriz M

    2006-05-01

    A simple method is presented for estimating hourly distribution of air pollutants, based on data collected by passive sensors on a weekly or bi-weekly basis with no need for previous measurements at a site. In order for this method to be applied to locations where no hourly records are available, reference data from other sites are required to generate calibration histograms. The proposed procedure allows one to obtain the histogram of hourly ozone values during a given week with an error of about 30%, which is good considering the simplicity of this approach. This method can be a valuable tool for sites that lack previous hourly records of pollutant ambient concentrations, where it can be used to verify compliance with regulations or to estimate the AOT40 index with an acceptable degree of exactitude.

  13. Kernel Learning of Histogram of Local Gabor Phase Patterns for Face Recognition

    NASA Astrophysics Data System (ADS)

    Zhang, Baochang; Wang, Zongli; Zhong, Bineng

    2008-12-01

    This paper proposes a new face recognition method, named kernel learning of histogram of local Gabor phase pattern (K-HLGPP), which is based on Daugman's method for iris recognition and the local XOR pattern (LXP) operator. Unlike traditional Gabor usage, which exploits the magnitude part in face recognition, we encode the Gabor phase information for face classification by the quadrant bit coding (QBC) method. Two schemes are proposed for face recognition. One is based on the nearest-neighbor classifier with chi-square as the similarity measurement, and the other performs kernel discriminant analysis for HLGPP (K-HLGPP) using histogram intersection and Gaussian-weighted chi-square kernels. The comparative experiments show that K-HLGPP achieves a higher recognition rate than other well-known face recognition systems on the large-scale standard FERET, FERET200, and CAS-PEAL-R1 databases.

  14. Implementation of a Cascaded Histogram of Oriented Gradient (HOG)-Based Pedestrian Detector

    DTIC Science & Technology

    2013-09-01

    Implementation of a Cascaded Histogram of Oriented Gradient (HOG)-Based Pedestrian Detector, by Christopher Reale, Prudhvi Gurram, Shuowen Hu, and Alex Chan. Sensors and Electron Devices Directorate, ARL.

  15. A Low-Cost BIST Based on Histogram Testing for Analog to Digital Converters

    NASA Astrophysics Data System (ADS)

    Kim, Kicheol; Kim, Youbean; Kim, Incheol; Son, Hyeonuk; Kang, Sungho

    In this letter a histogram-based BIST (Built-In Self-Test) approach for deriving the main characteristic parameters of an ADC (Analog to Digital Converter) such as offset, gain and non-linearities is proposed. The BIST uses a ramp signal as an input signal and two counters as a response analyzer to calculate the derived static parameters. Experimental results show that the proposed method reduces the hardware overhead and testing time while detecting any static faults in an ADC.
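    The static parameters are derived with the standard code-histogram method: under a slow full-scale ramp, every code of an ideal ADC is hit equally often, so deviations of the bin counts from the ideal count give the differential non-linearity (DNL). A minimal software sketch of that computation (the paper's BIST realizes it in hardware with two counters):

```python
def dnl_from_ramp_histogram(codes, n_bits):
    """Estimate per-code DNL (in LSB) from the output codes of an ADC
    driven by a slow full-scale ramp. The two end bins absorb the
    over/under-range portions of the ramp, so they are excluded."""
    n_codes = 2 ** n_bits
    hist = [0] * n_codes
    for c in codes:
        hist[c] += 1
    inner = hist[1:-1]                  # codes 1 .. 2^n_bits - 2
    ideal = sum(inner) / len(inner)     # expected count per code for an ideal ADC
    return [h / ideal - 1.0 for h in inner]
```

    An ideal converter yields DNL ≈ 0 everywhere; a code hit twice as often as expected has DNL = +1 LSB, and a missing code has DNL = -1 LSB.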

  16. Evaluation of the S phase distribution of flow cytometric DNA histograms by autoradiography and computer algorithms.

    PubMed

    Sheck, L E; Muirhead, K A; Horan, P K

    1980-09-01

    Cell sorting and tritiated thymidine autoradiography were used to define the distribution of S phase cells in flow cytometric DNA histograms obtained from exponential mouse lymphoma cells (L5178Y). The numbers of labeled S phase cells, autoradiographically determined from cells sorted at 2-channel intervals in the G1/early S and late S/G2M regions of the histogram, were compared with the numbers of computed S phase cells in comparable 2-channel intervals as predicted by several computer algorithms used to extract cell cycle phase distributions from DNA histograms. Polynomial and multirectangle algorithms gave computed estimates of total %S in close agreement with the tritiated thymidine labeling index for the cell population, while multi-Gaussian algorithms underestimated %S. Interval autoradiographic and algorithm studies confirmed these results in that no significant differences were found between the autoradiographic S phase distribution and S phase distributions calculated by the polynomial and multirectangle models. However, S phase cells were significantly underestimated in G1/early S by a constrained multi-Gaussian model and in both G1/early S and late S/G2 by an unconstrained multi-Gaussian model. For the particular cell line (L5178Y), staining protocol (mithramycin following ethanol fixation) and instrumentation (Coulter TPS-2 cell sorter) used in this study, close agreement between computed %S and tritiated thymidine labeling index was found to be a reliable indicator of an algorithm's success in resolving S phase cells in the G1/S and S/G2 transition regions of the DNA histograms.

  17. A comparison of histogram distance metrics for content-based image retrieval

    NASA Astrophysics Data System (ADS)

    Zhang, Qianwen; Canosa, Roxanne L.

    2014-03-01

    The type of histogram distance metric selected for a CBIR query varies greatly and will affect the accuracy of the retrieval results. This paper compares the retrieval results of a variety of commonly used CBIR distance metrics: the Euclidean distance, the Manhattan distance, the vector cosine angle distance, histogram intersection distance, χ² distance, Jensen-Shannon divergence, and the Earth Mover's distance. A training set of ground-truth labeled images is used to build a classifier for the CBIR system, where the images were obtained from three commonly used benchmarking datasets: the WANG dataset (http://savvash.blogspot.com/2008/12/benchmark-databases-for-cbir.html), the Corel Subset dataset (http://vision.stanford.edu/resources_links.html), and the CalTech dataset (http://www.vision.caltech.edu/htmlfiles/). To implement the CBIR system, we use the Tamura texture features of coarseness, contrast, and directionality. We create texture histograms of the training set and the query images, and then measure the difference between a randomly selected query and the corresponding retrieved image using a k-nearest-neighbors approach. Precision and recall are used to evaluate the retrieval performance of the system, given a particular distance metric. Then, given the same query image, the distance metric is changed and performance of the system is evaluated once again.
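    Several of the compared metrics are one-liners over normalized histograms. A sketch using the usual textbook definitions (which may differ from the paper's in constant factors):

```python
import math

def chi_square(p, q):
    """Symmetric chi-square distance between two normalized histograms."""
    return sum((a - b) ** 2 / (a + b) for a, b in zip(p, q) if a + b > 0)

def intersection_distance(p, q):
    """1 minus the histogram intersection similarity (0 for identical histograms)."""
    return 1.0 - sum(min(a, b) for a, b in zip(p, q))

def jensen_shannon(p, q):
    """Jensen-Shannon divergence: symmetrized, smoothed KL divergence."""
    def kl(x, y):
        return sum(a * math.log(a / b) for a, b in zip(x, y) if a > 0)
    m = [(a + b) / 2.0 for a, b in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)
```

    All three return 0 for identical histograms, which is what makes them usable interchangeably as the distance in a k-nearest-neighbors retrieval loop.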

  18. Efficient descriptor of histogram of salient edge orientation map for finger vein recognition.

    PubMed

    Lu, Yu; Yoon, Sook; Xie, Shan Juan; Yang, Jucheng; Wang, Zhihui; Park, Dong Sun

    2014-07-10

    Finger vein images are rich in orientation and edge features. Inspired by the edge histogram descriptor proposed in MPEG-7, this paper presents an efficient orientation-based local descriptor, named histogram of salient edge orientation map (HSEOM). HSEOM is based on the fact that human vision is sensitive to edge features for image perception. For a given image, HSEOM first finds oriented edge maps according to predefined orientations using a well-known edge operator and obtains a salient edge orientation map by choosing an orientation with the maximum edge magnitude for each pixel. Then, subhistograms of the salient edge orientation map are generated from the nonoverlapping submaps and concatenated to build the final HSEOM. In the experiment of this paper, eight oriented edge maps were used to generate a salient edge orientation map for HSEOM construction. Experimental results on our available finger vein image database, MMCBNU_6000, show that the performance of HSEOM outperforms that of state-of-the-art orientation-based methods (e.g., Gabor filter, histogram of oriented gradients, and local directional code). Furthermore, the proposed HSEOM has advantages of low feature dimensionality and fast implementation for a real-time finger vein recognition system.
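    The HSEOM construction follows directly from the description: per pixel, keep the orientation whose edge map has the largest magnitude, then histogram those orientation indices over non-overlapping submaps and concatenate. The sketch below takes precomputed oriented edge maps as input (the paper builds eight of them with a standard edge operator); the grid size is an illustrative assumption.

```python
def hseom(edge_maps, grid=(2, 2)):
    """Histogram of salient edge orientation map.
    edge_maps: list of K same-sized 2D magnitude maps, one per orientation.
    Returns the concatenated per-submap orientation histograms."""
    k = len(edge_maps)
    h, w = len(edge_maps[0]), len(edge_maps[0][0])
    # salient orientation map: index of the map with max magnitude at each pixel
    sal = [[max(range(k), key=lambda o: edge_maps[o][y][x]) for x in range(w)]
           for y in range(h)]
    gy, gx = grid
    feat = []
    for by in range(gy):              # non-overlapping submaps
        for bx in range(gx):
            hist = [0] * k
            for y in range(by * h // gy, (by + 1) * h // gy):
                for x in range(bx * w // gx, (bx + 1) * w // gx):
                    hist[sal[y][x]] += 1
            feat.extend(hist)
    return feat
```

    The feature dimensionality is just (submaps × orientations), which is the low-dimensionality advantage the abstract highlights.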

  19. Statistical Analysis of Photopyroelectric Signals using Histogram and Kernel Density Estimation for differentiation of Maize Seeds

    NASA Astrophysics Data System (ADS)

    Rojas-Lima, J. E.; Domínguez-Pacheco, A.; Hernández-Aguilar, C.; Cruz-Orea, A.

    2016-09-01

    Considering the necessity of photothermal alternative approaches for characterizing nonhomogeneous materials like maize seeds, the objective of this research work was to analyze statistically the amplitude variations of photopyroelectric signals, by means of nonparametric techniques such as the histogram and the kernel density estimator, and the probability density function of the amplitude variations of two genotypes of maize seeds with different pigmentations and structural components: crystalline and floury. To determine if the probability density function had a known parametric form, the histogram was determined, which did not present a known parametric form, so the kernel density estimator using the Gaussian kernel, with an efficiency of 95 % in density estimation, was used to obtain the probability density function. The results obtained indicated that maize seeds could be differentiated in terms of the statistical values for floury and crystalline seeds such as the mean (93.11, 159.21), variance (1.64 × 10³, 1.48 × 10³), and standard deviation (40.54, 38.47) obtained from the amplitude variations of photopyroelectric signals in the case of the histogram approach. For the case of the kernel density estimator, seeds can be differentiated in terms of the kernel bandwidth or smoothing constant h of 9.85 and 6.09 for floury and crystalline seeds, respectively.
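    The Gaussian kernel density estimator used above has a simple closed form, with the bandwidth h playing the role of the smoothing constant reported for the two seed types. A minimal sketch:

```python
import math

def gaussian_kde(samples, h, x):
    """Gaussian kernel density estimate at point x with bandwidth h:
    f(x) = (1 / (n h sqrt(2 pi))) * sum_i exp(-0.5 ((x - x_i) / h)^2)."""
    norm = len(samples) * h * math.sqrt(2.0 * math.pi)
    return sum(math.exp(-0.5 * ((x - s) / h) ** 2) for s in samples) / norm
```

    Evaluating this on a grid of signal amplitudes gives the smooth density curve that replaces the histogram when no parametric form fits.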

  20. Digital image classification with the help of artificial neural network by simple histogram

    PubMed Central

    Dey, Pranab; Banerjee, Nirmalya; Kaur, Rajwant

    2016-01-01

    Background: Visual image classification is a great challenge to the cytopathologist in routine day-to-day work. Artificial neural network (ANN) may be helpful in this matter. Aims and Objectives: In this study, we have tried to classify digital images of malignant and benign cells in effusion cytology smear with the help of simple histogram data and ANN. Materials and Methods: A total of 404 digital images consisting of 168 benign cells and 236 malignant cells were selected for this study. The simple histogram data was extracted from these digital images and an ANN was constructed with the help of Neurointelligence software [Alyuda Neurointelligence 2.2 (577), Cupertino, California, USA]. The network architecture was 6-3-1. The images were classified as training set (281), validation set (63), and test set (60). The on-line backpropagation training algorithm was used for this study. Result: A total of 10,000 iterations were done to train the ANN system with the speed of 609.81/s. After the adequate training of this ANN model, the system was able to identify all 34 malignant cell images and 24 out of 26 benign cells. Conclusion: The ANN model can be used for the identification of the individual malignant cells with the help of simple histogram data. This study will be helpful in the future to identify malignant cells in unknown situations. PMID:27279679
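    The 6-3-1 architecture implies six input features per cell image, but the abstract does not spell out which histogram values were used. The sketch below shows one plausible reduction of a grayscale image to six normalized histogram bin frequencies; the binning is an assumption for illustration, not the authors' feature set.

```python
def histogram_features(pixels, bins=6, max_val=255):
    """Reduce a grayscale image (flat list of 0..max_val intensities)
    to `bins` normalized histogram frequencies, e.g. as ANN inputs."""
    hist = [0] * bins
    for p in pixels:
        # map intensity to an equal-width bin; clamp guards p == max_val
        hist[min(p * bins // (max_val + 1), bins - 1)] += 1
    total = len(pixels)
    return [h / total for h in hist]
```

    Each image then contributes one 6-vector, matching the six input neurons of the network described above.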

  1. Equal opportunity in the workplace.

    PubMed

    Allen, A

    1992-04-01

    The Equal Employment Opportunity Commission (EEOC) was created by the Civil Rights Act of 1964. The commission encourages voluntary compliance with equal employment opportunity practices, and has authority to investigate complaints alleging discrimination in hiring, firing, wage rates, testing, training, apprenticeship, and other conditions of employment. In October 1991, during the Senate Judiciary Committee hearings, the confirmation of Judge Clarence Thomas for a seat on the United States Supreme Court was placed in jeopardy by a charge of sexual harassment while Thomas was head of the EEOC. This article focuses on aspects of sexual harassment in the workplace, the role of the EEOC, and offers some suggestions for keeping the work environment free of abusive behavior.

  2. An adaptive image enhancement technique by combining cuckoo search and particle swarm optimization algorithm.

    PubMed

    Ye, Zhiwei; Wang, Mingwei; Hu, Zhengbing; Liu, Wei

    2015-01-01

    Image enhancement is an important procedure of image processing and analysis. This paper presents a new technique that uses a modified quality measure and a blend of cuckoo search and particle swarm optimization (CS-PSO) to adaptively enhance low-contrast images. Contrast enhancement is obtained by a global transformation of the input intensities; it employs the incomplete Beta function as the transformation function and a novel criterion for measuring image quality that considers three factors: threshold, entropy value, and gray-level probability density of the image. The enhancement process is a nonlinear optimization problem with several constraints. CS-PSO is utilized to maximize the objective fitness criterion in order to enhance the contrast and detail in an image by adapting the parameters of a novel extension to a local enhancement technique. The performance of the proposed method has been compared with other existing techniques such as linear contrast stretching, histogram equalization, and evolutionary-computing-based image enhancement methods like the backtracking search algorithm, differential search algorithm, genetic algorithm, and particle swarm optimization in terms of processing time and image quality. Experimental results demonstrate that the proposed method is robust and adaptive and exhibits better performance than the other methods considered in the paper.
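    The incomplete Beta transformation maps a normalized intensity u ∈ [0, 1] through the regularized incomplete beta function I_u(α, β); the shape parameters (α, β) are what the optimizer tunes. A sketch using plain numerical integration, valid for α, β ≥ 1 (a production implementation would call scipy.special.betainc instead):

```python
def incomplete_beta(u, a, b, steps=2000):
    """Regularized incomplete beta function I_u(a, b) for a, b >= 1,
    via trapezoidal integration of t^(a-1) * (1-t)^(b-1)."""
    def f(t):
        return t ** (a - 1) * (1.0 - t) ** (b - 1)
    def integral(hi):
        h = hi / steps
        s = 0.5 * (f(0.0) + f(hi))
        for i in range(1, steps):
            s += f(i * h)
        return s * h
    total = integral(1.0)
    return integral(u) / total

def enhance(pixels, a, b, max_val=255):
    """Map intensities through the beta transform; (a, b) would come
    from the CS-PSO search in the paper's setting."""
    return [round(incomplete_beta(p / max_val, a, b) * max_val) for p in pixels]
```

    With α = β the transform is an S-curve that stretches mid-range contrast while leaving black and white fixed, which is the global half of the enhancement described above.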

  3. Complex adaptation-based LDR image rendering for 3D image reconstruction

    NASA Astrophysics Data System (ADS)

    Lee, Sung-Hak; Kwon, Hyuk-Ju; Sohng, Kyu-Ik

    2014-07-01

    A low-dynamic tone-compression technique is developed for realistic image rendering that can make three-dimensional (3D) images similar to realistic scenes by overcoming brightness dimming in the 3D display mode. The 3D surround provides varying conditions for image quality, illuminant adaptation, contrast, gamma, color, sharpness, and so on. In general, gain/offset adjustment, gamma compensation, and histogram equalization have performed well in contrast compression; however, as a result of signal saturation and clipping effects, image details are removed and information is lost on bright and dark areas. Thus, an enhanced image mapping technique is proposed based on space-varying image compression. The performance of contrast compression is enhanced with complex adaptation in a 3D viewing surround combining global and local adaptation. Evaluating local image rendering in view of tone and color expression, noise reduction, and edge compensation confirms that the proposed 3D image-mapping model can compensate for the loss of image quality in the 3D mode.

  4. Office of Equal Opportunity Programs

    NASA Technical Reports Server (NTRS)

    Chin, Jennifer L.

    2004-01-01

    The NASA Glenn Office of Equal Opportunity Programs works to provide quality service for all programs and/or to assist the Center in becoming a model workplace. During the summer of 2004, I worked with Deborah Cotleur along with other staff members to create and modify customer satisfaction surveys. This office aims to assist in developing a model workplace by providing functions as a change agent to the center by serving as an advisor to management to ensure equity throughout the Center. In addition, the office serves as a mediator for the Center in addressing issues and concerns. Lastly, the office provides assistance to employees to enable attainment of personal and organizational goals. The Office of Equal Opportunities is a staff office which reports and provides advice to the Center Director and Executive Leadership, implements laws, regulations, and presidential executive orders, and provides center wide leadership and assistance to NASA GRC employees. Some of the major responsibilities of the office include working with the discrimination complaints program, special emphasis programs (advisory groups), management support, monitoring and evaluation, contract compliance, and community outreach. During my internship in this office, my main objective was to create four customer satisfaction surveys based on EO retreats, EO observances, EO advisory boards, and EO mediation/counseling. I created these surveys after conducting research on past events and surveys as well as similar survey research created and conducted by other NASA centers, program for EO Advisory group members, leadership training sessions for supervisors, preventing sexual harassment training sessions, and observance events. I also conducted research on the style and format from feedback surveys from the Marshall Equal Opportunity website, the Goddard website, and the main NASA website. Using the material from the Office of Equal Opportunity Programs at Glenn Research Center along with my

  5. The equal right to drink.

    PubMed

    Schmidt, Laura A

    2014-11-01

    The starting place for this essay is Knupfer and Room's insight that more restrictive norms around drinking and intoxication tend to be selectively applied to the economically dependent segments of society, such as women. However, since these authors wrote in 1964, women in the US and many other societies around the globe have experienced rising economic independence. The essay considers how the moral categories of acceptable drinking and drunkenness may have shifted alongside women's rising economic independence, and looks at evidence on the potential consequences for women's health and wellbeing. I argue that, as women have gained economic independence, changes in drinking norms have produced two different kinds of negative unintended consequences for women at the high and low extremes of the economic spectrum. As liberated women of the middle and upper classes have become more economically equal to men, they have enjoyed the right to drink with less restraint. For them, alongside the equal right to drink has come greater equality in exposure to alcohol-attributable harms, abuse and dependence. I further suggest that, as societies become more liberated, the economic dependency of low-income women is brought into greater question. Under such conditions, women in poverty, particularly those economically dependent on the state, such as welfare mothers, have become subject to more restrictive norms around drinking and intoxication, and more punitive social controls.

  6. Equal is as equal does: challenging Vatican views on women.

    PubMed

    1995-01-01

    The authors of this piece are women from the Roman Catholic tradition who are critical of the Vatican position on women's rights. The Report of the Holy See in Preparation for the Fourth World Conference on Women reveals a religious fundamentalism that misuses tradition and anthropology to limit women's roles and rights. The Vatican is itself a self-proclaimed state that offers women neither opportunities nor protections within its own organization, and there is no evidence of women's participation in the preparation of its report. The Vatican document constructs a vision of women and men in which men are normative persons, whose dignity is conferred by their humanity, and women are the variant other, defined by and granted dignity by their reproductive and mothering functions. The Vatican document is anti-feminist. It criticizes the "radical feminists" of the 1960s for trying to deny sexual differences, and accuses today's Western feminists of ignoring the needs of women in developing countries while pursuing selfish and hedonistic goals. It makes no recognition of the work of feminists to improve the lives of women worldwide. The Vatican document claims to support women's equality, but it qualifies each statement of equality with a presumption of difference. The document defines women as vulnerable without naming men as responsible for the oppression and violence to which women are vulnerable. It ridicules as feminist cant the well-documented fact that the home is the setting of most violence against women. The Vatican decries the suffering families undergo as a result of compulsory birth control and abortion policies, while it would deny families sex education, contraceptives, and safe abortion, thereby making pregnancy compulsory. A new vision of social justice is needed, one that: 1) rests on a radical equality, in which both women and men are expected to contribute to work, education, culture, morality, and reproduction; 2) accepts a "discipleship of equals…

  7. Implementing Equal Access Computer Labs.

    ERIC Educational Resources Information Center

    Clinton, Janeen; And Others

    This paper discusses the philosophy followed in Palm Beach County to adapt computer literacy curriculum, hardware, and software to meet the needs of all children. The Department of Exceptional Student Education and the Department of Instructional Computing Services cooperated in planning strategies and coordinating efforts to implement equal…

  8. The Business of Equal Opportunity.

    ERIC Educational Resources Information Center

    Dickson, Reginald D.

    1992-01-01

    The author describes his journey from poor African-American youth in the rural South to successful businessman. He discusses the Inroads program, an internship for African-American and Hispanic youth and advises giving up victimhood and adapting to the mainstream of capitalism. (SK)

  9. Shape from equal thickness contours

    SciTech Connect

    Cong, G.; Parvin, B.

    1998-05-10

    A unique imaging modality based on Equal Thickness Contours (ETC) has introduced a new opportunity for 3D shape reconstruction from multiple views. We present a computational framework for representing each view of an object in terms of its object thickness, and then integrating these representations into a 3D surface by algebraic reconstruction. The object thickness is inferred by grouping curve segments that correspond to points of second derivative maxima. At each step of the process, we use some form of regularization to ensure closeness to the original features, as well as neighborhood continuity. We apply our approach to images of a sub-micron crystal structure obtained through a holographic process.

  10. Midwives, gender equality and feminism.

    PubMed

    Walsh, Denis

    2016-03-01

    Gender inequality and the harmful effects of patriarchy are sustaining the wide spread oppression of women across the world and this is also having an impact on maternity services with unacceptable rates of maternal mortality, the continued under investment in the midwifery profession and the limiting of women's place of birth options. However alongside these effects, the current zeitgeist is affirming an alignment of feminism and gender equality such that both have a high profile in public discourse. This presents a once in a generation opportunity for midwives to self-declare as feminists and commit to righting the wrongs of this most pernicious form of discrimination.

  11. Equal Pay for Work of Comparable Value.

    ERIC Educational Resources Information Center

    Mutari, Ellen; And Others

    1982-01-01

    Discusses occupational segregation and other barriers to equal job opportunities for women and examines two approaches toward correcting pay inequities: equal pay for equal work and equal pay for work of comparable value. Legal cases, job evaluation studies, and other steps toward comparable worth are described. A 64-item reference list is…

  12. Educational Equality: Luck Egalitarian, Pluralist and Complex

    ERIC Educational Resources Information Center

    Calvert, John

    2014-01-01

    The basic principle of educational equality is that each child should receive an equally good education. This sounds appealing, but is rather vague and needs substantial working out. Also, educational equality faces all the objections to equality per se, plus others specific to its subject matter. Together these have eroded confidence in the…

  13. Enhancing tumor apparent diffusion coefficient histogram skewness stratifies the postoperative survival in recurrent glioblastoma multiforme patients undergoing salvage surgery.

    PubMed

    Zolal, Amir; Juratli, Tareq A; Linn, Jennifer; Podlesek, Dino; Sitoci Ficici, Kerim Hakan; Kitzler, Hagen H; Schackert, Gabriele; Sobottka, Stephan B; Rieger, Bernhard; Krex, Dietmar

    2016-05-01

    Objective To determine the value of apparent diffusion coefficient (ADC) histogram parameters for the prediction of individual survival in patients undergoing surgery for recurrent glioblastoma (GBM) in a retrospective cohort study. Methods Thirty-one patients who underwent surgery for first recurrence of a known GBM between 2008 and 2012 were included. The following parameters were collected: age, sex, enhancing tumor size, mean ADC, median ADC, ADC skewness, ADC kurtosis and fifth percentile of the ADC histogram, initial progression free survival (PFS), extent of second resection and further adjuvant treatment. The association of these parameters with survival and PFS after second surgery was analyzed using log-rank test and Cox regression. Results Using log-rank test, ADC histogram skewness of the enhancing tumor was significantly associated with both survival (p = 0.001) and PFS after second surgery (p = 0.005). Further parameters associated with prolonged survival after second surgery were: gross total resection at second surgery (p = 0.026), tumor size (p = 0.040) and third surgery (p = 0.003). In the multivariate Cox analysis, ADC histogram skewness was shown to be an independent prognostic factor for survival after second surgery. Conclusion ADC histogram skewness of the enhancing lesion, enhancing lesion size, third surgery, as well as gross total resection have been shown to be associated with survival following the second surgery. ADC histogram skewness was an independent prognostic factor for survival in the multivariate analysis.

  14. Approximate Minimum Bit Error Rate Equalization for Fading Channels

    NASA Astrophysics Data System (ADS)

    Kovacs, Lorant; Levendovszky, Janos; Olah, Andras; Treplan, Gergely

    2010-12-01

    A novel channel equalizer algorithm is introduced for wireless communication systems to combat channel distortions resulting from multipath propagation. The novel algorithm is based on minimizing the bit error rate (BER) using a fast approximation of its gradient with respect to the equalizer coefficients. This approximation is obtained by estimating the exponential summation in the gradient with only some carefully chosen dominant terms. The paper derives an algorithm to calculate these dominant terms in real-time. Summing only these dominant terms provides a highly accurate approximation of the true gradient. Combined with a fast adaptive channel state estimator, the new equalization algorithm yields better performance than the traditional zero forcing (ZF) or minimum mean square error (MMSE) equalizers. The performance of the new method is tested by simulations performed on standard wireless channels. From the performance analysis one can infer that the new equalizer is capable of efficient channel equalization and maintaining a relatively low bit error probability in the case of channels corrupted by frequency selectivity. Hence, the new algorithm can contribute to ensuring QoS communication over highly distorted channels.
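    The abstract contrasts the proposed BER-gradient equalizer with classical ZF and MMSE equalizers without spelling out any of them. As a point of reference only, here is a minimal least-mean-squares (LMS) adaptive linear equalizer trained against known symbols; the two-tap channel, tap count, step size and all names are illustrative assumptions, not the paper's setup.

    ```python
    import random

    def lms_equalize(received, training, n_taps=5, mu=0.05):
        """Adapt FIR equalizer taps so the filtered received signal tracks known training symbols."""
        w = [0.0] * n_taps
        for n in range(len(training)):
            # regression vector: the n_taps most recent received samples (zero-padded at the start)
            x = [received[n - k] if n - k >= 0 else 0.0 for k in range(n_taps)]
            y = sum(wi * xi for wi, xi in zip(w, x))        # equalizer output
            e = training[n] - y                             # error against the known symbol
            w = [wi + mu * e * xi for wi, xi in zip(w, x)]  # stochastic-gradient step
        return w

    # toy two-tap ISI channel h = [1.0, 0.4], BPSK symbols, no noise
    random.seed(0)
    symbols = [random.choice([-1.0, 1.0]) for _ in range(2000)]
    received = [symbols[n] + (0.4 * symbols[n - 1] if n > 0 else 0.0)
                for n in range(len(symbols))]
    w = lms_equalize(received, symbols)

    # with the adapted taps, hard decisions should recover the transmitted symbols
    errors = 0
    for n in range(100, len(symbols)):
        x = [received[n - k] for k in range(5)]
        y = sum(wi * xi for wi, xi in zip(w, x))
        errors += (1.0 if y > 0 else -1.0) != symbols[n]
    ```

    The converged taps approximate the channel inverse (roughly 1, -0.4, 0.16, ...); an MMSE design would additionally regularize against noise, and the paper's algorithm instead descends an approximated BER gradient.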

  15. Resolving heterogeneity on the single molecular level with the photon-counting histogram.

    PubMed Central

    Müller, J D; Chen, Y; Gratton, E

    2000-01-01

    The diffusion of fluorescent particles through a small, illuminated observation volume gives rise to intensity fluctuations caused by particle number fluctuations in the open observation volume and the inhomogeneous excitation-beam profile. The intensity distribution of these fluorescence fluctuations is experimentally captured by the photon-counting histogram (PCH). We recently introduced the theory of the PCH for diffusing particles (Chen et al., Biophys. J., 77:553-567), where we showed that we can uniquely describe the distribution of photon counts with only two parameters for each species: the molecular brightness of the particle and the average number of particles within the observation volume. The PCH is sensitive to the molecular brightness and thus offers the possibility to separate a mixture of fluorescent species into its constituents, based on a difference in their molecular brightness alone. This analysis is complementary to the autocorrelation function, traditionally used in fluorescence fluctuation spectroscopy, which separates a mixture of species by a difference in their diffusion coefficient. The PCH of each individual species is convoluted successively to yield the PCH of the mixture. Successful resolution of the histogram into its components is largely a matter of the signal statistics. Here, we discuss the case of two species in detail and show that a concentration for each species exists, where the signal statistics is optimal. We also discuss the influence of the absolute molecular brightness and the brightness contrast between two species on the resolvability of two species. A binary dye mixture serves as a model system to demonstrate that the molecular brightness and the concentration of each species can be resolved experimentally from a single or from several histograms. We extend our study to biomolecules, where we label proteins with a fluorescent dye and show that a brightness ratio of two can be resolved. The ability to resolve a
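    The abstract states that the PCH of a mixture is obtained by successively convolving the PCHs of the individual species. A minimal sketch, using truncated Poisson count distributions as stand-ins for the single-species PCHs (the true PCH additionally accounts for the inhomogeneous excitation-beam profile, which is omitted here; function names and brightness values are illustrative):

    ```python
    import math

    def poisson_pmf(mean, n_max):
        """Photon-count distribution stand-in for one species (truncated Poisson)."""
        return [math.exp(-mean) * mean ** k / math.factorial(k) for k in range(n_max + 1)]

    def convolve_pch(p1, p2):
        """PCH of a two-species mixture: discrete convolution of the species PCHs."""
        out = [0.0] * (len(p1) + len(p2) - 1)
        for i, a in enumerate(p1):
            for j, b in enumerate(p2):
                out[i + j] += a * b
        return out

    dim = poisson_pmf(1.0, 30)      # dim species: 1 count per sampling time on average
    bright = poisson_pmf(4.0, 30)   # a 4x brighter species
    mixture = convolve_pch(dim, bright)
    ```

    For this toy, the convolution of Poisson(1) and Poisson(4) is simply Poisson(5); with real beam-profile-corrected PCHs the mixture deviates from Poisson statistics, which is what makes the brightness contrast resolvable.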

  16. Optimized swimmer tracking system by a dynamic fusion of correlation and color histogram techniques

    NASA Astrophysics Data System (ADS)

    Benarab, D.; Napoléon, T.; Alfalou, A.; Verney, A.; Hellard, P.

    2015-12-01

    To design a robust swimmer tracking system, we took into account two well-known tracking techniques: the nonlinear joint transform correlation (NL-JTC) and the color histogram. The two techniques perform comparably well, yet they both have substantial limitations. Interestingly, they also seem to show some complementarity. The correlation technique yields accurate detection but is sensitive to rotation, scale and contour deformation, whereas the color histogram technique is robust for rotation and contour deformation but shows low accuracy and is highly sensitive to luminosity and confusing background colors. These observations suggested the possibility of a dynamic fusion of the correlation plane and the color scores map. Before this fusion, two steps are required. First is the extraction of a sub-plane of correlation that describes the similarity between the reference and target images. This sub-plane has the same size as the color scores map but they have different interval values. Thus, the second step is required which is the normalization of the planes in the same interval so they can be fused. In order to determine the benefits of this fusion technique, first, we tested it on a synthetic image containing different forms with different colors. We thus were able to optimize the correlation plane and color histogram techniques before applying our fusion technique to real videos of swimmers in international competitions. Last, a comparative study of the dynamic fusion technique and the two classical techniques was carried out to demonstrate the efficacy of the proposed technique. The criteria of comparison were the tracking percentage, the peak to correlation energy (PCE), which evaluated the sharpness of the peak (accuracy), and the local standard deviation (Local-STD), which assessed the noise in the planes (robustness).
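    The two pre-fusion steps described above (extracting planes of equal size, then normalizing them to a common interval) can be sketched as follows. The element-wise averaging used as the fusion rule is an assumption, since the abstract does not specify it; all names and the toy values are illustrative.

    ```python
    def min_max_normalize(plane):
        """Rescale a 2D score plane to the common interval [0, 1]."""
        lo = min(min(row) for row in plane)
        hi = max(max(row) for row in plane)
        span = (hi - lo) or 1.0  # guard against a constant plane
        return [[(v - lo) / span for v in row] for row in plane]

    def fuse(plane_a, plane_b):
        """Fuse two same-size score planes after normalization (assumed: element-wise mean)."""
        a, b = min_max_normalize(plane_a), min_max_normalize(plane_b)
        return [[(x + y) / 2.0 for x, y in zip(ra, rb)] for ra, rb in zip(a, b)]

    correlation_plane = [[0.1, 0.9], [0.3, 0.2]]   # toy correlation sub-plane
    color_score_map   = [[10.0, 50.0], [90.0, 20.0]]  # toy color scores on another scale
    fused = fuse(correlation_plane, color_score_map)
    ```

    The swimmer position would then be taken at the maximum of the fused plane, combining the correlation peak's accuracy with the color map's robustness.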

  17. The radiological diagnosis of fenestral otosclerosis: the utility of histogram analysis using multidetector row CT.

    PubMed

    Yamashita, Koji; Yoshiura, Takashi; Hiwatashi, Akio; Togao, Osamu; Kikuchi, Kazufumi; Inoguchi, Takashi; Kumazawa, Seiji; Honda, Hiroshi

    2014-12-01

    Bone density measurements using high-resolution CT have been reported to be useful to diagnose fenestral otosclerosis. However, a small region of interest (ROI) chosen by a less-experienced radiologist may result in false-negative findings. Semi-automatic analysis such as CT histogram analysis may offer improved assessment. The aim of this study was to evaluate the utility of CT histogram analysis in diagnosing fenestral otosclerosis. Temporal bone CT of consecutive patients with otosclerosis and normal controls was retrospectively analyzed. The control group consisted of the normal-hearing contralateral ears of patients with otitis media, cholesteatoma, trauma, facial nerve palsy, or tinnitus. All CT images were obtained using a 64-detector-row CT scanner with 0.5-mm collimation. A ROI encompassing 10 × 10 pixels was placed in the bony labyrinth located anterior to the oval window. The mean CT value, variance and entropy were compared between otosclerosis patients and normal controls using Student's t-test. The number of pixels below the mean minus SD in the controls (%Lowcont) and in all subjects (%Lowtotal) was also compared. In addition, the area under the receiver operating characteristic curve (AUC) value for the discrimination between otosclerosis patients and normal controls was calculated. Fifty-one temporal bones of 38 patients with otosclerosis and 30 temporal bones of 30 control subjects were included. The mean CT value was significantly lower in otosclerosis cases than in normal controls (p < 0.01). In addition, variance, entropy, %Lowcont and %Lowtotal were significantly higher in otosclerosis cases than in normal controls (p < 0.01, respectively). The AUC values for the mean CT value, %Lowcont and %Lowtotal were 0.751, 0.760 and 0.765, respectively. In conclusion, our results demonstrated that histogram analysis of CT images may be of clinical value in diagnosing otosclerosis.
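    The histogram statistics the study computes over the ROI (mean CT value, variance, entropy, and the fraction of pixels below mean minus SD of a control distribution) can be sketched as below. The exact threshold convention for %Low is inferred from the abstract, and all names and toy values are illustrative.

    ```python
    import math
    from collections import Counter

    def roi_stats(pixels, control_mean, control_sd):
        """Histogram statistics over an ROI's pixel intensities."""
        n = len(pixels)
        mean = sum(pixels) / n
        var = sum((p - mean) ** 2 for p in pixels) / n
        hist = Counter(pixels)
        # Shannon entropy of the intensity histogram, in bits
        entropy = -sum((c / n) * math.log2(c / n) for c in hist.values())
        # percentage of pixels below (control mean - control SD), as assumed from the abstract
        pct_low = 100.0 * sum(p < control_mean - control_sd for p in pixels) / n
        return mean, var, entropy, pct_low

    # toy 10 x 10 ROI: half "demineralized" voxels at 1000 HU, half normal at 2000 HU
    mean, var, entropy, pct_low = roi_stats([1000] * 50 + [2000] * 50,
                                            control_mean=2000.0, control_sd=300.0)
    ```

    On real data, a lower mean with higher variance, entropy and %Low relative to controls is the pattern the study associates with otosclerotic bone.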

  18. Verification of dose volume histograms in stereotactic radiosurgery and radiotherapy using polymer gel and MRI

    NASA Astrophysics Data System (ADS)

    Šemnická, Jitka; Novotný, Josef, Jr.; Spěváček, Václav; Garčic, Jirí; Steiner, Martin; Judas, Libor

    2006-12-01

    In this work we focus on dose volume histograms (DVHs) measurement in stereotactic radiosurgery (SR) performed with the Leksell gamma knife (ELEKTA Instrument AB, Stockholm, Sweden) and stereotactic radiotherapy (SRT) performed with linear accelerator 6 MV Varian Clinac 2100 C/D (Varian Medical Systems, Palo Alto, USA) in conjunction with BrainLAB stereotactic system (BrainLAB, Germany) using modified BANG gel and magnetic resonance imaging (MRI). The aim of the experiments was to investigate a method for acquiring entire dose volume information from irradiated gel dosimeter and calculate DVHs.
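    A cumulative DVH of the kind computed here from the gel dosimeter reports, for each dose level D, the fraction of the volume receiving at least D. A minimal sketch over a list of voxel doses; the bin width and toy dose values are illustrative assumptions.

    ```python
    def cumulative_dvh(doses, bin_width=1.0):
        """Return (dose_levels, volume_fractions) for a list of voxel doses."""
        n = len(doses)
        d_max = max(doses)
        levels, fractions = [], []
        d = 0.0
        while d <= d_max:
            levels.append(d)
            # fraction of voxels receiving at least dose d
            fractions.append(sum(x >= d for x in doses) / n)
            d += bin_width
        return levels, fractions

    voxel_doses = [2.0, 4.0, 4.0, 6.0]   # toy voxel doses in Gy
    levels, fractions = cumulative_dvh(voxel_doses, bin_width=2.0)
    ```

    Comparing such measured curves against the planning system's DVHs is the verification step the study performs with gel and MRI.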

  19. Phase-unwrapping algorithm for images with high noise content based on a local histogram.

    PubMed

    Meneses, Jaime; Gharbi, Tijani; Humbert, Philippe

    2005-03-01

    We present a robust algorithm of phase unwrapping that was designed for use on phase images with high noise content. We proceed with the algorithm by first identifying regions with continuous phase values placed between fringe boundaries in an image and then phase shifting the regions with respect to one another by multiples of 2π to unwrap the phase. Image pixels are segmented between interfringe and fringe boundary areas by use of a local histogram of a wrapped phase. The algorithm has been used successfully to unwrap phase images generated in a three-dimensional shape measurement for noninvasive quantification of human skin structure in dermatology, cosmetology, and plastic surgery.
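    The region-shifting idea can be illustrated in 1D: boundaries where the wrapped phase jumps by more than π mark region edges, and each region is shifted by a multiple of 2π to join its predecessor. The paper's contribution is the local-histogram segmentation of 2D images, which this toy omits; all names are illustrative.

    ```python
    import math

    def unwrap_1d(wrapped):
        """Shift contiguous regions by multiples of 2*pi so the phase is continuous."""
        out = [wrapped[0]]
        offset = 0.0
        for prev, cur in zip(wrapped, wrapped[1:]):
            jump = cur - prev
            if jump > math.pi:        # wrapped downwards: shift this region by -2*pi
                offset -= 2 * math.pi
            elif jump < -math.pi:     # wrapped upwards: shift this region by +2*pi
                offset += 2 * math.pi
            out.append(cur + offset)
        return out

    # toy signal: steadily increasing phase, wrapped into (-pi, pi]
    true_phase = [0.1 * i for i in range(100)]
    wrapped = [math.atan2(math.sin(p), math.cos(p)) for p in true_phase]
    unwrapped = unwrap_1d(wrapped)
    ```

    On noisy 2D images this naive jump detector fails, which is exactly why the paper segments interfringe regions with a local histogram before shifting.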

  20. Early detection of Alzheimer's disease using histograms in a dissimilarity-based classification framework

    NASA Astrophysics Data System (ADS)

    Luchtenberg, Anne; Simões, Rita; van Cappellen van Walsum, Anne-Marie; Slump, Cornelis H.

    2014-03-01

    Classification methods have been proposed to detect early-stage Alzheimer's disease using Magnetic Resonance images. In particular, dissimilarity-based classification has been applied using a deformation-based distance measure. However, such approach is not only computationally expensive but it also considers large-scale alterations in the brain only. In this work, we propose the use of image histogram distance measures, determined both globally and locally, to detect very mild to mild Alzheimer's disease. Using an ensemble of local patches over the entire brain, we obtain an accuracy of 84% (sensitivity 80% and specificity 88%).

  1. Quantitative characterization of metastatic disease in the spine. Part II. Histogram-based analyses

    SciTech Connect

    Whyne, Cari; Hardisty, Michael; Wu, Florence; Skrinskas, Tomas; Clemons, Mark; Gordon, Lyle; Basran, Parminder S.

    2007-08-15

    Radiological imaging is essential to the appropriate management of patients with bone metastasis; however, there have been no widely accepted guidelines as to the optimal method for quantifying the potential impact of skeletal lesions or to evaluate response to treatment. The current inability to rapidly quantify the response of bone metastases excludes patients with cancer and bone disease from participating in clinical trials of many new treatments as these studies frequently require patients with so-called measurable disease. Computed tomography (CT) can provide excellent skeletal detail with high sensitivity for the diagnosis of bone metastases. The purpose of this study was to establish an objective method to quantitatively characterize disease in the bony spine using CT-based segmentations. It was hypothesized that histogram analysis of CT vertebral density distributions would enable standardized segmentation of tumor tissue and consequently allow quantification of disease in the metastatic spine. Thirty-two healthy vertebral CT scans were first studied to establish a baseline characterization. The histograms of the trabecular centrums were found to be Gaussian distributions (average root-mean-square difference=30 voxel counts), as expected for a uniform material. Intrapatient vertebral level similarity was also observed as the means were not significantly different (p>0.8). Thus, a patient-specific healthy vertebral body histogram is able to characterize healthy trabecular bone throughout that individual's thoracolumbar spine. Eleven metastatically involved vertebrae were analyzed to determine the characteristics of the lytic and blastic bone voxels relative to the healthy bone. Lytic and blastic tumors were segmented as connected areas with voxel intensities between specified thresholds. 
The tested thresholds were μ − 1.0σ, μ − 1.5σ, and μ − 2.0σ for lytic and μ + 2.0σ, μ + 3.0σ, and μ + 3.5σ for blastic tissue, where
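    The μ ± kσ thresholding described above, with μ and σ taken from the patient's healthy vertebral histogram, can be sketched as follows. The connected-component step and the upper lytic bound are simplified away, and the toy HU values and names are illustrative assumptions.

    ```python
    def classify_voxels(voxels, mu, sigma, k_lytic=1.5, k_blastic=3.0):
        """Label voxels lytic or blastic by thresholds on the healthy-bone histogram."""
        lytic = [v for v in voxels if v < mu - k_lytic * sigma]      # abnormally low density
        blastic = [v for v in voxels if v > mu + k_blastic * sigma]  # abnormally high density
        return lytic, blastic

    # toy trabecular intensities; healthy bone assumed at mu = 200 HU, sigma = 50 HU
    voxels = [200, 210, 100, 60, 400, 500]
    lytic, blastic = classify_voxels(voxels, mu=200.0, sigma=50.0)
    ```

    The study evaluates several k values per tissue class (1.0/1.5/2.0 for lytic, 2.0/3.0/3.5 for blastic); the defaults here pick one of each for illustration.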

  2. Improved dose-volume histogram estimates for radiopharmaceutical therapy by optimizing quantitative SPECT reconstruction parameters

    NASA Astrophysics Data System (ADS)

    Cheng, Lishui; Hobbs, Robert F.; Segars, Paul W.; Sgouros, George; Frey, Eric C.

    2013-06-01

    In radiopharmaceutical therapy, an understanding of the dose distribution in normal and target tissues is important for optimizing treatment. Three-dimensional (3D) dosimetry takes into account patient anatomy and the nonuniform uptake of radiopharmaceuticals in tissues. Dose-volume histograms (DVHs) provide a useful summary representation of the 3D dose distribution and have been widely used for external beam treatment planning. Reliable 3D dosimetry requires an accurate 3D radioactivity distribution as the input. However, activity distribution estimates from SPECT are corrupted by noise and partial volume effects (PVEs). In this work, we systematically investigated OS-EM based quantitative SPECT (QSPECT) image reconstruction in terms of its effect on DVHs estimates. A modified 3D NURBS-based Cardiac-Torso (NCAT) phantom that incorporated a non-uniform kidney model and clinically realistic organ activities and biokinetics was used. Projections were generated using a Monte Carlo (MC) simulation; noise effects were studied using 50 noise realizations with clinical count levels. Activity images were reconstructed using QSPECT with compensation for attenuation, scatter and collimator-detector response (CDR). Dose rate distributions were estimated by convolution of the activity image with a voxel S kernel. Cumulative DVHs were calculated from the phantom and QSPECT images and compared both qualitatively and quantitatively. We found that noise, PVEs, and ringing artifacts due to CDR compensation all degraded histogram estimates. Low-pass filtering and early termination of the iterative process were needed to reduce the effects of noise and ringing artifacts on DVHs, but resulted in increased degradations due to PVEs. Large objects with few features, such as the liver, had more accurate histogram estimates and required fewer iterations and more smoothing for optimal results. Smaller objects with fine details, such as the kidneys, required more iterations and less

  3. Phase-unwrapping algorithm for images with high noise content based on a local histogram

    NASA Astrophysics Data System (ADS)

    Meneses, Jaime; Gharbi, Tijani; Humbert, Philippe

    2005-03-01

    We present a robust algorithm of phase unwrapping that was designed for use on phase images with high noise content. We proceed with the algorithm by first identifying regions with continuous phase values placed between fringe boundaries in an image and then phase shifting the regions with respect to one another by multiples of 2π to unwrap the phase. Image pixels are segmented between interfringe and fringe boundary areas by use of a local histogram of a wrapped phase. The algorithm has been used successfully to unwrap phase images generated in a three-dimensional shape measurement for noninvasive quantification of human skin structure in dermatology, cosmetology, and plastic surgery.

  4. Locally weighted histogram analysis and stochastic solution for large-scale multi-state free energy estimation

    PubMed Central

    Tan, Zhiqiang; Xia, Junchao; Zhang, Bin W.; Levy, Ronald M.

    2016-01-01

    The weighted histogram analysis method (WHAM) including its binless extension has been developed independently in several different contexts, and widely used in chemistry, physics, and statistics, for computing free energies and expectations from multiple ensembles. However, this method, while statistically efficient, is computationally costly or even infeasible when a large number, hundreds or more, of distributions are studied. We develop a locally WHAM (local WHAM) from the perspective of simulations of simulations (SOS), using generalized serial tempering (GST) to resample simulated data from multiple ensembles. The local WHAM equations based on one jump attempt per GST cycle can be solved by optimization algorithms orders of magnitude faster than standard implementations of global WHAM, but yield similarly accurate estimates of free energies to global WHAM estimates. Moreover, we propose an adaptive SOS procedure for solving local WHAM equations stochastically when multiple jump attempts are performed per GST cycle. Such a stochastic procedure can lead to more accurate estimates of equilibrium distributions than local WHAM with one jump attempt per cycle. The proposed methods are broadly applicable when the original data to be “WHAMMED” are obtained properly by any sampling algorithm including serial tempering and parallel tempering (replica exchange). To illustrate the methods, we estimated absolute binding free energies and binding energy distributions using the binding energy distribution analysis method from one and two dimensional replica exchange molecular dynamics simulations for the beta-cyclodextrin-heptanoate host-guest system. In addition to the computational advantage of handling large datasets, our two dimensional WHAM analysis also demonstrates that accurate results similar to those from well-converged data can be obtained from simulations for which sampling is limited and not fully equilibrated. PMID:26801020
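    The global WHAM baseline that the paper accelerates is a self-consistent iteration between unbiased bin probabilities and per-ensemble free energies. A minimal sketch over binned data; the two-ensemble toy dataset, the bias of ln 2 on one bin, and all names are illustrative assumptions.

    ```python
    import math

    def wham(hist, n_samples, bias, n_iter=2000):
        """Global WHAM over binned data.

        hist[k][b]   -- counts observed in bin b by ensemble k
        n_samples[k] -- total samples drawn by ensemble k
        bias[k][b]   -- reduced bias energy of bin b in ensemble k
        Returns per-ensemble free energies f[k], gauge-fixed so f[0] = 0.
        """
        n_ens, n_bins = len(hist), len(hist[0])
        f = [0.0] * n_ens
        total = [sum(hist[k][b] for k in range(n_ens)) for b in range(n_bins)]
        for _ in range(n_iter):
            # unbiased bin probabilities implied by the current free energies
            p = [total[b] / sum(n_samples[k] * math.exp(f[k] - bias[k][b])
                                for k in range(n_ens))
                 for b in range(n_bins)]
            # self-consistent update of the free energies
            f = [-math.log(sum(p[b] * math.exp(-bias[k][b]) for b in range(n_bins)))
                 for k in range(n_ens)]
            f = [fk - f[0] for fk in f]  # fix the gauge
        return f

    # toy data: two ensembles over two bins; ensemble 1 biases bin 1 by ln 2
    hist = [[500, 500], [600, 300]]
    n_samples = [1000, 900]
    bias = [[0.0, 0.0], [0.0, math.log(2.0)]]
    f = wham(hist, n_samples, bias)
    ```

    For this toy data the self-consistent fixed point is f[1] = ln(4/3). Iterating these coupled equations over hundreds of ensembles is the cost that motivates the paper's local WHAM and stochastic SOS solvers.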

  5. Locally weighted histogram analysis and stochastic solution for large-scale multi-state free energy estimation

    NASA Astrophysics Data System (ADS)

    Tan, Zhiqiang; Xia, Junchao; Zhang, Bin W.; Levy, Ronald M.

    2016-01-01

    The weighted histogram analysis method (WHAM) including its binless extension has been developed independently in several different contexts, and widely used in chemistry, physics, and statistics, for computing free energies and expectations from multiple ensembles. However, this method, while statistically efficient, is computationally costly or even infeasible when a large number, hundreds or more, of distributions are studied. We develop a locally WHAM (local WHAM) from the perspective of simulations of simulations (SOS), using generalized serial tempering (GST) to resample simulated data from multiple ensembles. The local WHAM equations based on one jump attempt per GST cycle can be solved by optimization algorithms orders of magnitude faster than standard implementations of global WHAM, but yield similarly accurate estimates of free energies to global WHAM estimates. Moreover, we propose an adaptive SOS procedure for solving local WHAM equations stochastically when multiple jump attempts are performed per GST cycle. Such a stochastic procedure can lead to more accurate estimates of equilibrium distributions than local WHAM with one jump attempt per cycle. The proposed methods are broadly applicable when the original data to be "WHAMMED" are obtained properly by any sampling algorithm including serial tempering and parallel tempering (replica exchange). To illustrate the methods, we estimated absolute binding free energies and binding energy distributions using the binding energy distribution analysis method from one and two dimensional replica exchange molecular dynamics simulations for the beta-cyclodextrin-heptanoate host-guest system. In addition to the computational advantage of handling large datasets, our two dimensional WHAM analysis also demonstrates that accurate results similar to those from well-converged data can be obtained from simulations for which sampling is limited and not fully equilibrated.

  6. Brightness-equalized quantum dots

    PubMed Central

    Lim, Sung Jun; Zahid, Mohammad U.; Le, Phuong; Ma, Liang; Entenberg, David; Harney, Allison S.; Condeelis, John; Smith, Andrew M.

    2015-01-01

    As molecular labels for cells and tissues, fluorescent probes have shaped our understanding of biological structures and processes. However, their capacity for quantitative analysis is limited because photon emission rates from multicolour fluorophores are dissimilar, unstable and often unpredictable, which obscures correlations between measured fluorescence and molecular concentration. Here we introduce a new class of light-emitting quantum dots with tunable and equalized fluorescence brightness across a broad range of colours. The key feature is independent tunability of emission wavelength, extinction coefficient and quantum yield through distinct structural domains in the nanocrystal. Precise tuning eliminates a 100-fold red-to-green brightness mismatch of size-tuned quantum dots at the ensemble and single-particle levels, which substantially improves quantitative imaging accuracy in biological tissue. We anticipate that these materials engineering principles will vastly expand the optical engineering landscape of fluorescent probes, facilitate quantitative multicolour imaging in living tissue and improve colour tuning in light-emitting devices. PMID:26437175

  7. Klystron equalization for RF feedback

    SciTech Connect

    Corredoura, P.

    1993-01-01

    The next generation of colliding beam storage rings support higher luminosities by significantly increasing the number of bunches and decreasing the spacing between respective bunches. The heavy beam loading requires large RF cavity detuning which drives several lower coupled bunch modes very strongly. One technique which has proven to be very successful in reducing the coupled bunch mode driving impedance is RF feedback around the klystron-cavity combination. The gain and bandwidth of the feedback loop is limited by the group delay around the feedback loop. Existing klystrons on the world market have not been optimized for this application and contribute a large portion of the total loop group delay. This paper describes a technique to reduce klystron group delay by adding an equalizing filter to the klystron RF drive. Such a filter was built and tested on a 500 kW klystron as part of the ongoing PEP-II R&D effort here at SLAC.

  9. Brightness-equalized quantum dots.

    PubMed

    Lim, Sung Jun; Zahid, Mohammad U; Le, Phuong; Ma, Liang; Entenberg, David; Harney, Allison S; Condeelis, John; Smith, Andrew M

    2015-10-05

    As molecular labels for cells and tissues, fluorescent probes have shaped our understanding of biological structures and processes. However, their capacity for quantitative analysis is limited because photon emission rates from multicolour fluorophores are dissimilar, unstable and often unpredictable, which obscures correlations between measured fluorescence and molecular concentration. Here we introduce a new class of light-emitting quantum dots with tunable and equalized fluorescence brightness across a broad range of colours. The key feature is independent tunability of emission wavelength, extinction coefficient and quantum yield through distinct structural domains in the nanocrystal. Precise tuning eliminates a 100-fold red-to-green brightness mismatch of size-tuned quantum dots at the ensemble and single-particle levels, which substantially improves quantitative imaging accuracy in biological tissue. We anticipate that these materials engineering principles will vastly expand the optical engineering landscape of fluorescent probes, facilitate quantitative multicolour imaging in living tissue and improve colour tuning in light-emitting devices.

  10. Blind Equalization and Fading Channel Signal Recovery of OFDM Modulation

    DTIC Science & Technology

    2011-03-01

    …subchannel is affected by flat-fading, which can be easily equalized. A guard time interval between symbols prevents inter-symbol interference… provided in this chapter. The principle behind OFDM is the idea that a wideband frequency channel can be divided into subchannels with narrower… investigated a DFT-based adaptive method for channel estimation. The next chapter discusses an algorithm for the estimation of data in null subchannels…

  11. The across frequency independence of equalization of interaural time delay in the equalization-cancellation model of binaural unmasking

    NASA Astrophysics Data System (ADS)

    Akeroyd, Michael A.

    2004-08-01

    The equalization stage in the equalization-cancellation model of binaural unmasking compensates for the interaural time delay (ITD) of a masking noise by introducing an opposite, internal delay [N. I. Durlach, in Foundations of Modern Auditory Theory, Vol. II., edited by J. V. Tobias (Academic, New York, 1972)]. Culling and Summerfield [J. Acoust. Soc. Am. 98, 785-797 (1995)] developed a multi-channel version of this model in which equalization was "free" to use the optimal delay in each channel. Two experiments were conducted to test if equalization was indeed free or if it was "restricted" to the same delay in all channels. One experiment measured binaural detection thresholds, using an adaptive procedure, for 1-, 5-, or 17-component tones against a broadband masking noise, in three binaural configurations (N0S180, N180S0, and N90S270). The thresholds for the 1-component stimuli were used to normalize the levels of each of the 5- and 17-component stimuli so that they were equally detectable. If equalization was restricted, then, for the 5- and 17-component stimuli, the N90S270 and N180S0 configurations would yield a greater threshold than the N0S180 configurations. No such difference was found. A subsequent experiment measured binaural detection thresholds, via psychometric functions, for a 2-component complex tone in the same three binaural configurations. Again, no differential effect of configuration was observed. An analytic model of the detection of a complex tone showed that the results were more consistent with free equalization than restricted equalization, although the size of the differences was found to depend on the shape of the psychometric function for detection.

  12. Predicting low-temperature free energy landscapes with flat-histogram Monte Carlo methods

    NASA Astrophysics Data System (ADS)

    Mahynski, Nathan A.; Blanco, Marco A.; Errington, Jeffrey R.; Shen, Vincent K.

    2017-02-01

    We present a method for predicting the free energy landscape of fluids at low temperatures from flat-histogram grand canonical Monte Carlo simulations performed at higher ones. We illustrate our approach for both pure and multicomponent systems using two different sampling methods as a demonstration. This allows us to predict the thermodynamic behavior of systems which undergo both first order and continuous phase transitions upon cooling using simulations performed only at higher temperatures. After surveying a variety of different systems, we identify a range of temperature differences over which the extrapolation of high temperature simulations tends to quantitatively predict the thermodynamic properties of fluids at lower ones. Beyond this range, extrapolation still provides a reasonably well-informed estimate of the free energy landscape; this prediction then requires less computational effort to refine with an additional simulation at the desired temperature than reconstruction of the surface without any initial estimate. In either case, this method significantly increases the computational efficiency of these flat-histogram methods when investigating thermodynamic properties of fluids over a wide range of temperatures. For example, we demonstrate how a binary fluid phase diagram may be quantitatively predicted for many temperatures using only information obtained from a single supercritical state.
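    The underlying reweighting identity can be sketched simply: a distribution sampled at inverse temperature β is extrapolated to β′ by multiplying each macrostate by exp(−(β′ − β)E) and renormalising. The three-level energy histogram is a toy assumption; the paper extrapolates flat-histogram grand-canonical data, which involves considerably more bookkeeping, and all names here are illustrative.

    ```python
    import math

    def reweight(energies, ln_p, beta_old, beta_new):
        """Return ln P(E; beta_new) from ln P(E; beta_old), normalised."""
        ln_w = [lp - (beta_new - beta_old) * e for e, lp in zip(energies, ln_p)]
        # log-sum-exp for a numerically stable normalisation
        ln_z = max(ln_w)
        ln_z += math.log(sum(math.exp(v - ln_z) for v in ln_w))
        return [v - ln_z for v in ln_w]

    # toy histogram, flat at the sampled (higher) temperature
    energies = [0.0, 1.0, 2.0]
    ln_p_high = [math.log(1.0 / 3.0)] * 3
    ln_p_low = reweight(energies, ln_p_high, beta_old=1.0, beta_new=2.0)
    ```

    Cooling shifts weight toward the low-energy macrostates, as expected; the paper's point is that such extrapolations from higher-temperature flat-histogram runs give usable low-temperature free energy landscapes, or at least good starting estimates for refinement.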

  13. A non-Gaussian analysis scheme using rank histograms for ensemble data assimilation

    NASA Astrophysics Data System (ADS)

    Metref, S.; Cosme, E.; Snyder, C.; Brasseur, P.

    2014-08-01

    One challenge of geophysical data assimilation is to address the issue of non-Gaussianities in the distributions of the physical variables ensuing, in many cases, from nonlinear dynamical models. Non-Gaussian ensemble analysis methods fall into two categories, those remapping the ensemble particles by approximating the best linear unbiased estimate, for example, the ensemble Kalman filter (EnKF), and those resampling the particles by directly applying Bayes' rule, like particle filters. In this article, it is suggested that the most common remapping methods can only handle weakly non-Gaussian distributions, while the others suffer from sampling issues. In between those two categories, a new remapping method directly applying Bayes' rule, the multivariate rank histogram filter (MRHF), is introduced as an extension of the rank histogram filter (RHF) first introduced by Anderson (2010). Its performance is evaluated and compared with several data assimilation methods, on different levels of non-Gaussianity with the Lorenz 63 model. The method's behavior is then illustrated on a simple density estimation problem using ensemble simulations from a coupled physical-biogeochemical model of the North Atlantic ocean. The MRHF performs well with low-dimensional systems in strongly non-Gaussian regimes.

  14. User Aligned Histogram Stacks for Visualization of Abdominal Organs via MRI

    NASA Astrophysics Data System (ADS)

    Özdemir, M.; Akay, O.; Güzeliş, C.; Dicle, O.; Selver, M. A.

    2016-08-01

    Multi-dimensional transfer functions (MDTF) are occasionally designed as two-step approaches. At the first step, the constructed domain is modelled coarsely using global volume statistics and an initial transfer function (TF) is designed. Then, a finer classification is performed using local information to refine the TF design. In this study, both a new TF domain and a novel two-step MDTF strategy are proposed for visualization of abdominal organs. The proposed domain is generated by aligning the histograms of the slices, which are reconstructed based on user aligned majority axis/regions through an interactive Multi-Planar Reconstruction graphical user interface. It is shown that these user aligned histogram stacks (UAHS) exploit more a priori information by providing tissue specific inter-slice spatial domain knowledge. For initial TF design, UAHS are approximated using a multi-scale hierarchical Gaussian mixture model, which is designed to work in quasi real time. Then, a finer classification step is carried out for refinement of the initial result. Applications to several MRI data sets acquired with various sequences demonstrate improved visualization of abdomen.

  15. Radial polar histogram: obstacle avoidance and path planning for robotic cognition and motion control

    NASA Astrophysics Data System (ADS)

    Wang, Po-Jen; Keyawa, Nicholas R.; Euler, Craig

    2012-01-01

    In order to achieve highly accurate motion control and path planning for a mobile robot, an obstacle avoidance algorithm that provided a desired instantaneous turning radius and velocity was generated. This type of obstacle avoidance algorithm, which has been implemented in California State University Northridge's Intelligent Ground Vehicle (IGV), is known as Radial Polar Histogram (RPH). The RPH algorithm utilizes raw data in the form of a polar histogram that is read from a Laser Range Finder (LRF) and a camera. A desired open block is determined from the raw data utilizing a navigational heading and an elliptical approximation. The leftmost and rightmost radii are determined from the calculated edges of the open block and provide the range of possible radial paths the IGV can travel through. In addition, the calculated obstacle edge positions allow the IGV to recognize complex obstacle arrangements and to slow down accordingly. A radial path optimization function calculates the best radial path between the leftmost and rightmost radii and is sent to motion control for speed determination. Overall, the RPH algorithm allows the IGV to autonomously travel at average speeds of 3 mph while avoiding all obstacles, with a processing time of approximately 10 ms.
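
    The open-block search the abstract describes can be sketched as a scan over LRF beams: beams whose range exceeds a clearance threshold form candidate open sectors, and the widest sector bounds the feasible radial paths. The threshold value and function names below are illustrative assumptions, not taken from the paper.

    ```python
    import numpy as np

    def open_sectors(ranges, threshold=2.0):
        """Return (start, end) beam-index spans of consecutive LRF beams
        whose range exceeds the clearance threshold (candidate open blocks)."""
        free = np.asarray(ranges) > threshold
        spans, start = [], None
        for i, f in enumerate(free):
            if f and start is None:
                start = i
            elif not f and start is not None:
                spans.append((start, i - 1))
                start = None
        if start is not None:
            spans.append((start, len(free) - 1))
        return spans

    # Beams 0-2 blocked, 3-6 open, 7 blocked, 8-9 open (ranges in meters)
    spans = open_sectors([1.0, 1.5, 1.2, 5.0, 6.0, 5.5, 4.0, 1.0, 3.0, 3.5])
    widest = max(spans, key=lambda s: s[1] - s[0])   # pick the widest open block
    ```

    The leftmost and rightmost radii of the chosen span would then be handed to the radial path optimization step.
    
    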

  16. Quantitative comparison of 3D and 2.5D gamma analysis: introducing gamma angle histograms

    NASA Astrophysics Data System (ADS)

    Sa'd, M. Al; Graham, J.; Liney, G. P.; Moore, C. J.

    2013-04-01

    Comparison of dose distributions using the 3D gamma method is anticipated to provide better indicators for the quality assurance process than the 2.5D (stacked 2D slice-by-slice) gamma calculation, especially for advanced radiotherapy technologies. This study compares the accuracy of the 3D and 2.5D gamma calculation methods. 3D and 2.5D gamma calculations were carried out on four reference/evaluation 3D dose sample pairs. A number of analysis methods were used, including average gamma and gamma volume histograms. We introduce the concept of gamma-angle histograms. Noise sensitivity tests were also performed using two different noise models. The advantage of the 3D gamma method was evident in a higher proportion of points passing the tolerance criteria of 3% dose difference and 3 mm distance-to-agreement (DTA), considerably lower average gamma values, a lower influence of the DTA criterion, and a higher noise tolerance. The 3D gamma approach is more reliable than the 2.5D approach in terms of providing comprehensive quantitative results, which are needed in quality assurance procedures for advanced radiotherapy methods.
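
    For readers unfamiliar with the gamma metric itself, a minimal 1D version of the standard 3%/3 mm global gamma computation is sketched below (the paper's calculation is 3D/2.5D; reducing it to 1D only illustrates the per-point minimization over combined dose difference and distance-to-agreement).

    ```python
    import numpy as np

    def gamma_1d(ref, evl, x, dose_tol=0.03, dta_mm=3.0):
        """Per-point 1D gamma: minimum over evaluated points of the combined
        dose-difference / distance-to-agreement metric (global 3%/3 mm)."""
        ref, evl, x = map(np.asarray, (ref, evl, x))
        dmax = ref.max()                       # global normalization
        g = np.empty_like(ref, dtype=float)
        for i in range(len(ref)):
            dd = (evl - ref[i]) / (dose_tol * dmax)
            dx = (x - x[i]) / dta_mm
            g[i] = np.sqrt(dd**2 + dx**2).min()
        return g

    x = np.arange(0.0, 10.0, 1.0)              # positions in mm
    ref = np.ones_like(x)                      # flat unit-dose profile
    g = gamma_1d(ref, ref * 1.01, x)           # 1% uniform dose offset
    pass_rate = (g <= 1.0).mean()              # fraction passing gamma <= 1
    ```

    A uniform 1% offset sits well inside the 3% dose tolerance, so every point passes.
    
    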

  17. Orientation Histogram-Based Center-Surround Interaction: An Integration Approach for Contour Detection.

    PubMed

    Zhao, Rongchang; Wu, Min; Liu, Xiyao; Zou, Beiji; Li, Fangfang

    2017-01-01

    Contour is a critical feature for image description and object recognition in many computer vision tasks. However, detection of object contour remains a challenging problem because of disturbances from texture edges. This letter proposes a scheme to handle texture edges by implementing contour integration. The proposed scheme integrates structural segments into contours while inhibiting texture edges with the help of the orientation histogram-based center-surround interaction model. In the model, local edges within surroundings exert a modulatory effect on central contour cues based on the co-occurrence statistics of local edges described by the divergence of orientation histograms in the local region. We evaluate the proposed scheme on two well-known challenging boundary detection data sets (RuG and BSDS500). The experiments demonstrate that our scheme achieves a high F-measure of up to 0.74. Results show that our scheme integrates accurate contours while eliminating most texture edges, offering a novel approach to long-range feature analysis.

  18. Decoding brain cancer dynamics: a quantitative histogram-based approach using temporal MRI

    NASA Astrophysics Data System (ADS)

    Zhou, Mu; Hall, Lawrence O.; Goldgof, Dmitry B.; Russo, Robin; Gillies, Robert J.; Gatenby, Robert A.

    2015-03-01

    Brain tumor heterogeneity remains a challenge for probing brain cancer evolutionary dynamics. In light of evolution, it is a priority to inspect the cancer system from a time-domain perspective since it explicitly tracks the dynamics of cancer variations. In this paper, we study the problem of exploring brain tumor heterogeneity from temporal clinical magnetic resonance imaging (MRI) data. Our goal is to discover evidence-based knowledge from such temporal imaging data, where multiple clinical MRI scans from Glioblastoma multiforme (GBM) patients are generated during therapy. In particular, we propose a quantitative histogram-based approach that builds a prediction model to measure the difference in histograms obtained from pre- and post-treatment. The study could significantly assist radiologists by providing a metric to identify distinctive patterns within each tumor, which is crucial for the goal of providing patient-specific treatments. We examine the proposed approach for a practical application - clinical survival group prediction. Experimental results show that our approach achieved 90.91% accuracy.

  19. Detection of Basal Cell Carcinoma Using Color and Histogram Measures of Semitranslucent Areas

    PubMed Central

    Stoecker, William V.; Gupta, Kapil; Shrestha, Bijaya; Wronkiewiecz, Mark; Chowdhury, Raeed; Stanley, R. Joe; Xu, Jin; Moss, Randy H.; Celebi, M. Emre; Rabinovitz, Harold S.; Oliviero, Margaret; Malters, Joseph M.; Kolm, Isabel

    2009-01-01

    Background Semitranslucency, defined as a smooth, jelly-like area with varied, near-skin-tone color, can indicate a diagnosis of basal cell carcinoma (BCC) with high specificity. This study sought to analyze potential areas of semitranslucency with histogram-derived texture and color measures to discriminate BCC from non-semitranslucent areas in non-BCC skin lesions. Methods For 210 dermoscopy images, the areas of semitranslucency in 42 BCCs and comparable areas of smoothness and color in 168 non-BCCs were selected manually. Six color measures and six texture measures were applied to the semitranslucent areas of the BCC and the comparable areas in the non-BCC images. Results Receiver operating characteristic (ROC) curve analysis showed that the texture measures alone provided greater separation of BCC from non-BCC than the color measures alone. Statistical analysis showed that the four most important measures of semitranslucency are three histogram measures: contrast, smoothness, and entropy, and one color measure: blue chromaticity. Smoothness is the single most important measure. The combined 12 measures achieved a diagnostic accuracy of 95.05% based on area under the ROC curve. Conclusion Texture and color analysis measures, especially smoothness, may afford automatic detection of basal cell carcinoma images with semitranslucency. PMID:19624424
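
    The histogram measures named in the abstract (contrast, smoothness, entropy) are standard first-order statistical texture descriptors computed from a gray-level histogram. The sketch below uses the Gonzalez & Woods formulations (contrast as standard deviation, smoothness R = 1 - 1/(1 + sigma^2) on [0,1]-scaled intensities, entropy in bits); whether the paper used exactly these normalizations is an assumption.

    ```python
    import numpy as np

    def histogram_texture(gray, bins=256):
        """First-order texture descriptors from a gray-level histogram:
        contrast (std. dev.), smoothness, and entropy in bits."""
        p, _ = np.histogram(gray, bins=bins, range=(0, 256))
        p = p / p.sum()
        levels = np.arange(bins) / (bins - 1)      # normalize levels to [0, 1]
        mean = (levels * p).sum()
        var = ((levels - mean) ** 2 * p).sum()
        contrast = np.sqrt(var)
        smoothness = 1 - 1 / (1 + var)             # 0 for constant regions
        nz = p[p > 0]
        entropy = -(nz * np.log2(nz)).sum()
        return contrast, smoothness, entropy

    flat = np.full((8, 8), 128)                    # a perfectly smooth patch
    c, s, e = histogram_texture(flat)
    ```

    A constant patch scores zero on all three measures, matching the intuition that semitranslucent BCC areas are distinguished by their smoothness.
    
    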

  20. Beyond histograms: Efficiently estimating radial distribution functions via spectral Monte Carlo

    NASA Astrophysics Data System (ADS)

    Patrone, Paul N.; Rosch, Thomas W.

    2017-03-01

    Despite more than 40 years of research in condensed-matter physics, state-of-the-art approaches for simulating the radial distribution function (RDF) g(r) still rely on binning pair-separations into a histogram. Such methods suffer from undesirable properties, including subjectivity, high uncertainty, and slow rates of convergence. Moreover, such problems go undetected by the metrics often used to assess RDFs. To address these issues, we propose (I) a spectral Monte Carlo (SMC) quadrature method that yields g(r) as an analytical series expansion and (II) a Sobolev norm that assesses the quality of RDFs by quantifying their fluctuations. Using the latter, we show that, relative to histogram-based approaches, SMC reduces by orders of magnitude both the noise in g(r) and the number of pair separations needed for acceptable convergence. Moreover, SMC reduces subjectivity and yields simple, differentiable formulas for the RDF, which are useful for tasks such as coarse-grained force-field calibration via iterative Boltzmann inversion.

  1. The application of age distribution theory in the analysis of cytofluorimetric DNA histogram data.

    PubMed

    Watson, J V

    1977-03-01

    Age distribution theory has been employed in a model to analyse a variety of histograms of the DNA content of single cells in samples from experimental tumours growing in tissue culture. The method has produced satisfactory correspondence with the experimental data in which there was a wide variation in the proportions of cells in the intermitotic phases, and generally good agreement between the 3H-thymidine labelling index and the computed proportion in S phase. The model has the capacity to analyse data from populations which contain a proportion of non-cycling cells. However, it is concluded that reliable results for the growth fraction and also for the relative durations of the intermitotic phase times cannot be obtained for the data reported here from the DNA histograms alone. To obtain reliable estimates of the growth fraction the relative durations of the phase times must be known, and conversely, reliable estimates of the relative phase durations can only be obtained if the growth fraction is known.

  2. Validation of Vehicle Candidate Areas in Aerial Images Using Color Co-Occurrence Histograms

    NASA Astrophysics Data System (ADS)

    Leister, W.; Tuermer, S.; Reinartz, P.; Hoffmann, K. H.; Stilla, U.

    2013-10-01

    Traffic monitoring plays an important role in transportation management. In addition, airborne acquisition enables flexible and realtime mapping for special traffic situations, e.g. mass events and disasters. The automatic extraction of vehicles from aerial imagery is also a common application. However, many approaches focus on the target object only. As an extension to previously developed car detection techniques, a validation scheme is presented. The focus is on exploiting the background of the vehicle candidates as well as their color properties in the HSV color space. Therefore, texture of the vehicle background is described by color co-occurrence histograms. From all resulting histograms a likelihood function is calculated, giving a quantitative value that indicates whether the vehicle candidate is correctly classified. Only a few robust parameters have to be determined. Finally, the strategy is tested with a dataset of dense urban areas from the inner city of Munich, Germany. First results show that certain regions which are often responsible for false positive detections, such as vegetation or road markings, can be excluded successfully.
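
    A color co-occurrence histogram counts pairs of quantized colors observed at a fixed pixel offset, capturing both color content and local spatial texture. A minimal sketch for a horizontal one-pixel offset follows; the 4-levels-per-channel quantization is an illustrative choice, not the paper's parameterization (which works in HSV space).

    ```python
    import numpy as np

    def color_cooccurrence(img, bins=4):
        """Co-occurrence histogram H[c1, c2] of quantized color pairs at a
        horizontal offset of one pixel. img: (H, W, 3) uint8 array."""
        q = (img.astype(int) * bins) // 256                # quantize each channel
        idx = (q[..., 0] * bins + q[..., 1]) * bins + q[..., 2]  # single color index
        a, b = idx[:, :-1].ravel(), idx[:, 1:].ravel()     # neighboring color pairs
        n = bins ** 3
        H = np.zeros((n, n), int)
        np.add.at(H, (a, b), 1)                            # accumulate pair counts
        return H

    uniform = np.full((2, 3, 3), 200, dtype=np.uint8)      # one-color image
    H = color_cooccurrence(uniform)
    ```

    For a uniform image all pairs fall into a single diagonal cell, so the histogram degenerates to one peak; textured backgrounds spread mass off the diagonal.
    
    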

  3. Two non-parametric methods for derivation of constraints from radiotherapy dose-histogram data

    NASA Astrophysics Data System (ADS)

    Ebert, M. A.; Gulliford, S. L.; Buettner, F.; Foo, K.; Haworth, A.; Kennedy, A.; Joseph, D. J.; Denham, J. W.

    2014-07-01

    Dose constraints based on histograms provide a convenient and widely-used method for informing and guiding radiotherapy treatment planning. Methods of derivation of such constraints are often poorly described. Two non-parametric methods for derivation of constraints are described and investigated in the context of determination of dose-specific cut-points—values of the free parameter (e.g., percentage volume of the irradiated organ) which best reflect resulting changes in complication incidence. A method based on receiver operating characteristic (ROC) analysis and one based on a maximally-selected standardized rank sum are described and compared using rectal toxicity data from a prostate radiotherapy trial. Multiple test corrections are applied using a free step-down resampling algorithm, which accounts for the large number of tests undertaken to search for optimal cut-points and the inherent correlation between dose-histogram points. Both methods provide consistent significant cut-point values, with the rank sum method displaying some sensitivity to the underlying data. The ROC method is simple to implement and can utilize a complication atlas, though an advantage of the rank sum method is the ability to incorporate all complication grades without the need for grade dichotomization.
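
    As a concrete picture of the ROC-based cut-point search, the sketch below scans every observed dose-histogram value as a candidate cut-point and keeps the one maximizing Youden's J = sensitivity + specificity - 1 (one common way to read an "optimal" point off the ROC curve; the paper's exact criterion and its multiple-test correction are not reproduced here).

    ```python
    import numpy as np

    def best_cutpoint(values, complication):
        """Exhaustive ROC cut-point search: classify 'at risk' when the
        histogram parameter >= c, and maximize Youden's J over all c."""
        values = np.asarray(values, float)
        y = np.asarray(complication, bool)
        best = (-np.inf, None)
        for c in np.unique(values):
            pred = values >= c
            sens = (pred & y).sum() / y.sum()
            spec = (~pred & ~y).sum() / (~y).sum()
            j = sens + spec - 1
            if j > best[0]:
                best = (j, c)
        return best  # (J, cut-point)

    # toy data: complications cluster above ~60 (e.g. percentage volume)
    j, cut = best_cutpoint([40, 45, 50, 62, 65, 70], [0, 0, 0, 1, 1, 1])
    ```

    In real dose-histogram data the search is repeated at every dose level, which is why the multiple-test correction described in the abstract is needed.
    
    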

  4. Equality and Empowerment for Decent Work.

    ERIC Educational Resources Information Center

    Hepple, Bob

    2001-01-01

    Substantive equality encompasses equality of results, opportunity, and human dignity. To implement it requires an incremental approach ranging from voluntary participation to penalties for noncompliance, active participation of all stakeholders, and empowerment of disadvantaged groups. (SK)

  5. The neural bases for valuing social equality.

    PubMed

    Aoki, Ryuta; Yomogida, Yukihito; Matsumoto, Kenji

    2015-01-01

    The neural basis of how humans value and pursue social equality has become a major topic in social neuroscience research. Although recent studies have identified a set of brain regions and possible mechanisms that are involved in the neural processing of equality of outcome between individuals, how the human brain processes equality of opportunity remains unknown. In this review article, first we describe the importance of the distinction between equality of outcome and equality of opportunity, which has been emphasized in philosophy and economics. Next, we discuss possible approaches for empirical characterization of human valuation of equality of opportunity vs. equality of outcome. Understanding how these two concepts are distinct and interact with each other may provide a better explanation of complex human behaviors concerning fairness and social equality.

  6. Histogram analysis of pharmacokinetic parameters by bootstrap resampling from one-point sampling data in animal experiments.

    PubMed

    Takemoto, Seiji; Yamaoka, Kiyoshi; Nishikawa, Makiya; Takakura, Yoshinobu

    2006-12-01

    A bootstrap method is proposed for assessing statistical histograms of pharmacokinetic parameters (AUC, MRT, CL and V(ss)) from one-point sampling data in animal experiments. A computer program, MOMENT(BS), written in Visual Basic on Microsoft Excel, was developed for the bootstrap calculation and the construction of histograms. MOMENT(BS) was applied to one-point sampling data of the blood concentration of three physiologically active proteins ((111)In labeled Hsp70, Suc(20)-BSA and Suc(40)-BSA) administered in different doses to mice. The histograms of AUC, MRT, CL and V(ss) were close to a normal (Gaussian) distribution for a bootstrap resampling number of 200 or more, judging by the skewness and kurtosis of the histograms. A good agreement of means and SD was obtained between the bootstrap and Bailer's approaches. The hypothesis test based on the normal distribution clearly demonstrated that the disposition of (111)In-Hsp70 and Suc(20)-BSA was almost independent of dose, whereas that of (111)In-Suc(40)-BSA was definitely dose-dependent. In conclusion, the bootstrap method was found to be an efficient method for assessing the histogram of pharmacokinetic parameters of blood or tissue disposition data by one-point sampling.
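
    The idea of bootstrapping from one-point sampling can be sketched as follows: since each animal contributes a single (time, concentration) observation, a pseudo-profile is assembled by drawing one animal per time point, and the parameter (here AUC by linear trapezoid) is recomputed for each resample to build its histogram. The data and seed below are invented for illustration; MOMENT(BS) itself is an Excel/VB program.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def bootstrap_auc(times, conc_by_time, n_boot=200):
        """Bootstrap histogram of AUC from destructive one-point sampling:
        one animal's concentration is drawn at every time point per resample."""
        aucs = np.empty(n_boot)
        for b in range(n_boot):
            prof = [float(rng.choice(c)) for c in conc_by_time]
            aucs[b] = sum((prof[i] + prof[i + 1]) / 2 * (times[i + 1] - times[i])
                          for i in range(len(times) - 1))
        return aucs

    times = [0.5, 1.0, 2.0, 4.0]                              # h
    conc = [[10, 12, 11], [8, 9, 7], [4, 5, 4], [1, 2, 1]]    # ng/mL, 3 mice per time
    aucs = bootstrap_auc(times, conc)                         # 200 resampled AUCs
    ```

    Skewness and kurtosis of `aucs` can then be checked for approximate normality, as the abstract describes.
    
    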

  7. RelMon: A General Approach to QA, Validation and Physics Analysis through Comparison of large Sets of Histograms

    NASA Astrophysics Data System (ADS)

    Piparo, Danilo

    2012-12-01

    The estimation of the compatibility of large amounts of histogram pairs is a recurrent problem in high energy physics. The issue is common to several different areas, from software quality monitoring to data certification, preservation and analysis. Given two sets of histograms, it is very important to be able to scrutinize the outcome of several goodness of fit tests, obtain a clear answer about the overall compatibility, easily spot the single anomalies and directly access the concerned histogram pairs. This procedure must be automated in order to reduce the human workload, thereby improving the process of identification of differences which is usually carried out by a trained human mind. Some solutions to this problem have been proposed, but they are experiment specific. RelMon depends only on ROOT and offers several goodness of fit tests (e.g. chi-squared or Kolmogorov-Smirnov). It produces highly readable web reports, in which aggregations of the comparison rankings are available as well as all the plots of the single histogram overlays. The comparison procedure is fully automatic and scales smoothly towards ensembles of millions of histograms. Examples of RelMon use within the regular workflows of the CMS collaboration, and the advantages thereby obtained, are described, as is its interplay with the data quality monitoring infrastructure, its role in the QA of the event reconstruction code, its integration in the CMS software release cycle process, CMS user data analysis and dataset validation.
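
    The core operation of ranking many histogram pairs by a goodness-of-fit score can be sketched in a few lines. RelMon wraps ROOT's tests; the reduced chi-squared below is a stand-in with the same role, and the function names are illustrative.

    ```python
    import numpy as np

    def chi2_compat(h1, h2):
        """Reduced chi-squared between two histograms with identical binning;
        bins empty in both are skipped. ~0 means compatible."""
        h1, h2 = np.asarray(h1, float), np.asarray(h2, float)
        mask = (h1 + h2) > 0
        chi2 = ((h1[mask] - h2[mask]) ** 2 / (h1[mask] + h2[mask])).sum()
        return chi2 / mask.sum()

    def rank_pairs(pairs):
        """Sort (name, h1, h2) triples from least to most compatible, so
        the anomalies surface at the top of the report."""
        return sorted(pairs, key=lambda p: chi2_compat(p[1], p[2]), reverse=True)

    same = ("pt", [10, 20, 30], [10, 20, 30])
    diff = ("eta", [10, 20, 30], [30, 20, 10])
    worst_first = rank_pairs([same, diff])
    ```

    Aggregating such scores over millions of pairs, plus rendering the overlays, is what the automated report generation adds on top.
    
    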

  8. Competition and Cooperation, Equality and Elites.

    ERIC Educational Resources Information Center

    Flew, Antony

    1983-01-01

    Cooperation and competition are often seen as polar opposites; yet they are not necessarily in opposition. Similarly, equality and elitism are not opposed if the goal of equality is seen as equality of opportunity rather than equality of outcome and the elite are the product of fair competition. (IS)

  9. Sex, Money and the Equal Pay Act

    ERIC Educational Resources Information Center

    Feldman, Edwin B.

    1973-01-01

    Institutions that justify a wage differential between male and female custodians on the basis that women typically do the lighter work, and men the heavier, can find themselves in trouble. The Equal Pay Act of 1963 requires that men and women get the same pay for equal work -- and, to the Labor Department, all custodial work is substantially equal.…

  10. 77 FR 52583 - Women's Equality Day, 2012

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-29

    ... August 29, 2012 Part IV The President Proclamation 8848--Women's Equality Day, 2012 ... Title 3--The President ... Proclamation 8848 of August 24, 2012 Women's Equality Day, 2012 By the President of the United States of America A Proclamation On Women's Equality...

  11. Putting Educational Equality in Its Place

    ERIC Educational Resources Information Center

    Brighouse, Harry; Swift, Adam

    2008-01-01

    Educational equality is one important value of justice in education, but it is only one. This article makes a case for a meritocratic principle of educational equality and shows that certain arguments against that principle do not justify rejecting it. It would be wrong to, for the sake of educational equality, undermine the value of the family or…

  12. Does 0.999... Really Equal 1?

    ERIC Educational Resources Information Center

    Norton, Anderson; Baldwin, Michael

    2012-01-01

    This article confronts the issue of why secondary and post-secondary students resist accepting the equality of 0.999... and 1, even after they have seen and understood logical arguments for the equality. In some sense, we might say that the equality holds by definition of 0.999..., but this definition depends upon accepting properties of the real…

  13. 76 FR 41590 - Equal Credit Opportunity

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-15

    ... CFR Part 202 RIN 7100-AD67 Equal Credit Opportunity AGENCY: Board of Governors of the Federal Reserve System (Board). ACTION: Final rule. SUMMARY: Section 701 of the Equal Credit Opportunity Act (ECOA..., contact (202) 263-4869. SUPPLEMENTARY INFORMATION: I. Background The Equal Credit Opportunity Act...

  14. Impact of the radiotherapy technique on the correlation between dose-volume histograms of the bladder wall defined on MRI imaging and dose-volume/surface histograms in prostate cancer patients

    NASA Astrophysics Data System (ADS)

    Maggio, Angelo; Carillo, Viviana; Cozzarini, Cesare; Perna, Lucia; Rancati, Tiziana; Valdagni, Riccardo; Gabriele, Pietro; Fiorino, Claudio

    2013-04-01

    The aim of this study was to evaluate the correlation between the ‘true’ absolute and relative dose-volume histograms (DVHs) of the bladder wall, dose-wall histogram (DWH) defined on MRI imaging and other surrogates of bladder dosimetry in prostate cancer patients, planned both with 3D-conformal and intensity-modulated radiation therapy (IMRT) techniques. For 17 prostate cancer patients, previously treated with radical intent, CT and MRI scans were acquired and matched. The contours of bladder walls were drawn by using MRI images. External bladder surfaces were then used to generate artificial bladder walls by performing automatic contractions of 5, 7 and 10 mm. For each patient 3D conformal radiotherapy (3DCRT) and IMRT treatment plans were generated with a prescription dose of 77.4 Gy (1.8 Gy/fr), and DVHs of the whole bladder and of the artificial walls (DVH5, DVH7, DVH10) and dose-surface histograms (DSHs) were calculated and compared against the DWH in absolute and relative values, for both treatment planning techniques. A specific software (VODCA v. 4.4.0, MSS Inc.) was used for calculating the dose-volume/surface histograms. Correlation was quantified for selected dose-volume/surface parameters by the Spearman correlation coefficient. The agreement between %DWH and DVH5, DVH7 and DVH10 was found to be very good (maximum average deviations below 2%, SD < 5%): DVH5 showed the best agreement. The correlation was slightly better for absolute (R = 0.80-0.94) compared to relative (R = 0.66-0.92) histograms. The DSH was also found to be highly correlated with the DWH, although slightly higher deviations were generally found. The DVH was not a good surrogate of the DWH (R < 0.7 for most of the parameters). When comparing the two treatment techniques, more pronounced differences between relative histograms were seen for IMRT with respect to 3DCRT (p < 0.0001).

  15. Ship detection and extraction using visual saliency and histogram of oriented gradient

    NASA Astrophysics Data System (ADS)

    Xu, Fang; Liu, Jing-hong

    2016-11-01

    A novel unsupervised ship detection and extraction method is proposed. A combination model based on visual saliency is constructed for searching the ship target regions and suppressing the false alarms. The salient target regions are extracted and marked through segmentation. Radon transform is applied to confirm the suspected ship targets with symmetry profiles. Then, a new descriptor, improved histogram of oriented gradient (HOG), is introduced to discriminate the real ships. The experimental results on real optical remote sensing images demonstrate that plenty of ships can be extracted and located successfully, and the number of ships can be accurately acquired. Furthermore, the proposed method is superior to the compared methods in terms of both accuracy rate and false alarm rate.
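
    The HOG building block that the "improved HOG" descriptor extends is a magnitude-weighted orientation histogram per cell. A minimal sketch for one cell follows (unsigned gradients, 9 bins over [0, 180); block normalization and the paper's improvements are omitted).

    ```python
    import numpy as np

    def hog_cell(patch, n_bins=9):
        """Orientation histogram of one HOG cell: image gradients via
        np.gradient, magnitude-weighted votes into n_bins unsigned-angle bins,
        then L2 normalization."""
        gy, gx = np.gradient(patch.astype(float))
        mag = np.hypot(gx, gy)
        ang = np.rad2deg(np.arctan2(gy, gx)) % 180.0        # unsigned orientation
        hist = np.zeros(n_bins)
        b = (ang / (180.0 / n_bins)).astype(int) % n_bins   # hard bin assignment
        np.add.at(hist, b.ravel(), mag.ravel())
        return hist / (np.linalg.norm(hist) + 1e-12)

    # A vertical intensity edge -> horizontal gradient -> 0-degree bin
    patch = np.repeat([[0, 0, 100, 100]], 4, axis=0)
    h = hog_cell(patch)
    ```

    Production HOG additionally interpolates votes between neighboring bins and normalizes over blocks of cells; the sketch keeps only the histogram core.
    
    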

  16. Accelerating the weighted histogram analysis method by direct inversion in the iterative subspace

    PubMed Central

    Zhang, Cheng; Lai, Chun-Liang; Pettitt, B. Montgomery

    2016-01-01

    The weighted histogram analysis method (WHAM) for free energy calculations is a valuable tool to produce free energy differences with minimal error. Given multiple simulations, WHAM obtains from the distribution overlaps the optimal statistical estimator of the density of states, from which the free energy differences can be computed. The WHAM equations are often solved by an iterative procedure. In this work, we use a well-known linear algebra algorithm which allows for more rapid convergence to the solution. We find that the computational complexity of the iterative solution to WHAM and the closely related multistate Bennett acceptance ratio (MBAR) method can be improved by using the method of direct inversion in the iterative subspace. We give examples from a lattice model, a simple liquid and an aqueous protein solution. PMID:27453632

  17. A hardware-oriented histogram of oriented gradients algorithm and its VLSI implementation

    NASA Astrophysics Data System (ADS)

    Zhang, Xiangyu; An, Fengwei; Nakashima, Ikki; Luo, Aiwen; Chen, Lei; Ishii, Idaku; Jürgen Mattausch, Hans

    2017-04-01

    A challenging and important issue for object recognition is feature extraction on embedded systems. We report a hardware implementation of the histogram of oriented gradients (HOG) algorithm for real-time object recognition, which is known to provide high efficiency and accuracy. The developed hardware-oriented algorithm exploits the cell-based scan strategy which enables image-sensor synchronization and extraction-speed acceleration. Furthermore, buffers for image frames or integral images are avoided. An image-size scalable hardware architecture with an effective bin-decoder and a parallelized voting element (PVE) is developed and used to verify the hardware-oriented HOG implementation with the application of human detection. The fabricated test chip in 180 nm CMOS technology achieves fast processing speed and large flexibility for different image resolutions with substantially reduced hardware cost and energy consumption.

  18. The characterization of radioaerosol deposition in the healthy lung by histogram distribution analysis

    SciTech Connect

    Garrard, C.S.; Gerrity, T.R.; Schreiner, J.F.; Yeates, D.B.

    1981-12-01

    Thirteen healthy nonsmoking volunteers inhaled an 8.1-micrometer (MMAD) radioaerosol on two occasions. Aerosol deposition pattern within the right lung, as recorded by a gamma camera, was expressed as the 3rd and 4th moments of the distribution histogram (skew and kurtosis) of radioactivity during the first ten minutes after aerosol inhalation. Deposition pattern was also expressed as the percentage of deposited activity retained within the lung at 24 hr (24 hr % retention) and found to be significantly correlated with measures of skew (P less than 0.001). Tests of pulmonary function (FEV1, FVC, and MMFR) were significantly correlated with skew. Correlations were also demonstrated for these pulmonary function tests with 24 hr % retention but at lower levels of significance. Results indicate that changes in measures of forced expiratory airflow in healthy human volunteers influence deposition pattern and that the skew of the distribution of inhaled radioactivity may provide an acceptable index of deposition pattern.
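
    The 3rd and 4th standardized moments used here are computed from the histogram as probability-weighted central moments. A small sketch, with counts indexed by activity level as an illustrative data layout:

    ```python
    import numpy as np

    def skew_kurtosis(counts):
        """Skewness (3rd) and kurtosis (4th standardized moment) of a
        distribution histogram; counts[i] is the frequency of level i."""
        counts = np.asarray(counts, float)
        p = counts / counts.sum()
        x = np.arange(len(counts))
        mu = (x * p).sum()
        m2 = ((x - mu) ** 2 * p).sum()
        m3 = ((x - mu) ** 3 * p).sum()
        m4 = ((x - mu) ** 4 * p).sum()
        return m3 / m2 ** 1.5, m4 / m2 ** 2

    sk, ku = skew_kurtosis([1, 2, 4, 2, 1])   # a symmetric toy histogram
    ```

    A symmetric histogram has zero skew; a central deposition pattern skewed toward a few hot pixels would give a large positive skew, the quantity correlated with 24 hr retention in the study.
    
    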

  19. A Novel Histogram Region Merging Based Multithreshold Segmentation Algorithm for MR Brain Images

    PubMed Central

    Shen, Xuanjing; Feng, Yuncong

    2017-01-01

    Multithreshold segmentation algorithms are time-consuming, and the time complexity increases exponentially with the number of thresholds. In order to reduce the time complexity, a novel multithreshold segmentation algorithm is proposed in this paper. First, all gray levels are used as thresholds, so the histogram of the original image is divided into 256 small regions, and each region corresponds to one gray level. Then, two adjacent regions are merged in each iteration by a newly designed scheme, and a threshold is removed each time. To improve the accuracy of the merger operation, variance and probability are used as the merging energy. No matter how many thresholds there are, the time complexity of the algorithm is stable at O(L). Finally, the experiment is conducted on many MR brain images to verify the performance of the proposed algorithm. Experiment results show that our method can reduce the running time effectively and obtain segmentation results with high accuracy.
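
    The merge-and-remove-a-threshold scheme can be sketched greedily: start with one region per non-empty gray level and repeatedly merge the adjacent pair whose union has the smallest probability-weighted variance, until k regions (i.e. k-1 thresholds) remain. The paper's exact merging energy differs; this only illustrates the mechanism.

    ```python
    import numpy as np

    def merge_to_k(hist, k):
        """Greedy histogram region merging down to k regions. hist[g] is the
        pixel count at gray level g; returns lists of merged gray levels."""
        hist = np.asarray(hist, float)
        regions = [[g] for g in range(len(hist)) if hist[g] > 0]

        def cost(r):
            levels = np.array(r, float)
            p = hist[r]
            mu = (levels * p).sum() / p.sum()
            return (p * (levels - mu) ** 2).sum()   # mass-weighted variance

        while len(regions) > k:
            costs = [cost(regions[i] + regions[i + 1])
                     for i in range(len(regions) - 1)]
            i = int(np.argmin(costs))
            regions[i:i + 2] = [regions[i] + regions[i + 1]]
        return regions
    ```

    For a bimodal toy histogram the two natural clusters are recovered, and the boundary between the final regions gives the single remaining threshold.
    
    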

  20. RF coil optimization: evaluation of B1 field homogeneity using field histograms and finite element calculations.

    PubMed

    Li, S; Yang, Q X; Smith, M B

    1994-01-01

    Two-dimensional (2D) finite element analysis has been used to solve the full set of Maxwell's equations for the 2D magnetic field of radiofrequency (RF) coils. The field histogram method has been applied to evaluate and optimize the magnetic field homogeneity of some commonly used RF coils: the saddle coil, the slotted tube resonator, the multiple elements coil and the birdcage resonator, as well as the radial plate coil. Each coil model represents a cross-section of an infinitely long cylinder. The optimum configuration of each of these five RF coils is suggested. It was found that field homogeneity is more strongly dependent on the coil's window angle than on any other parameter. Additionally, eddy currents in the coil's conductive elements distort the current and magnetic field distribution. The frequency dependence of this eddy current distortion is analyzed and discussed.

  1. Development of a Quality Assurance Procedure for Dose Volume Histogram Analysis

    NASA Astrophysics Data System (ADS)

    Davenport, David A.

    The role of the dose-volume histogram (DVH) is rapidly expanding in radiation oncology treatment planning. DVHs are already relied upon to differentiate between two similar plans and evaluate organ-at-risk dosage. Their role will become even more important as progress continues towards implementing biologically based treatment planning systems. Therefore it is imperative that the accuracy of DVHs is evaluated and reappraised after any major software or hardware upgrades affecting a treatment planning system (TPS). The purpose of this work is to create and implement a comprehensive quality assurance procedure evaluating dose volume histograms to ensure their accuracy while satisfying American College of Radiology guidelines. Virtual phantoms of known volumes were created in Pinnacle TPS and exposed to different beam arrangements. Variables including grid size and slice thickness were varied and their effects were analyzed. The resulting DVHs were evaluated by comparison to the commissioned percent depth dose values using a custom Excel spreadsheet. After determining the uncertainty of the DVH based on these variables, multiple second check calculations were performed using MIM Maestro and Matlab software packages. The uncertainties of the DVHs were shown to be less than +/- 3%. The average uncertainty was shown to be less than +/- 1%. The second check procedures resulted in mean percent differences less than 1%, which confirms the accuracy of DVH calculation in Pinnacle and the effectiveness of the quality assurance template. The importance of knowing the limits of accuracy of the DVHs, which are routinely used to assess the quality of clinical treatment plans, cannot be overstated. The developed comprehensive QA procedure evaluating the accuracy of the DVH statistical analysis will become a part of our clinical arsenal for periodic tests of the treatment planning system. It will also be performed at the time of commissioning and after any major software
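
    For reference, the quantity being QA'd here, a cumulative DVH, is simply the percentage of structure volume receiving at least each dose level. A minimal sketch over equal-size voxels (an assumption; TPS implementations weight by voxel volume):

    ```python
    import numpy as np

    def cumulative_dvh(dose_voxels, dose_levels):
        """Cumulative DVH: percent of structure volume receiving >= each
        dose level, assuming equal-size voxels."""
        d = np.asarray(dose_voxels, float)
        return np.array([(d >= lev).mean() * 100.0 for lev in dose_levels])

    doses = [10, 20, 30, 40]                       # Gy, four equal voxels
    dvh = cumulative_dvh(doses, [0, 15, 25, 35, 45])
    ```

    Checking such a curve against known phantom volumes and commissioned percent depth dose values is exactly the kind of independent second check the procedure formalizes.
    
    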

  2. Fast analysis of molecular dynamics trajectories with graphics processing units-Radial distribution function histogramming

    SciTech Connect

    Levine, Benjamin G.; Stone, John E.; Kohlmeyer, Axel

    2011-05-01

    The calculation of radial distribution functions (RDFs) from molecular dynamics trajectory data is a common and computationally expensive analysis task. The rate limiting step in the calculation of the RDF is building a histogram of the distance between atom pairs in each trajectory frame. Here we present an implementation of this histogramming scheme for multiple graphics processing units (GPUs). The algorithm features a tiling scheme to maximize the reuse of data at the fastest levels of the GPU's memory hierarchy and dynamic load balancing to allow high performance on heterogeneous configurations of GPUs. Several versions of the RDF algorithm are presented, utilizing the specific hardware features found on different generations of GPUs. We take advantage of larger shared memory and atomic memory operations available on state-of-the-art GPUs to accelerate the code significantly. The use of atomic memory operations allows the fast, limited-capacity on-chip memory to be used much more efficiently, resulting in a fivefold increase in performance compared to the version of the algorithm without atomic operations. The ultimate version of the algorithm running in parallel on four NVIDIA GeForce GTX 480 (Fermi) GPUs was found to be 92 times faster than a multithreaded implementation running on an Intel Xeon 5550 CPU. On this multi-GPU hardware, the RDF between two selections of 1,000,000 atoms each can be calculated in 26.9 s per frame. The multi-GPU RDF algorithms described here are implemented in VMD, a widely used and freely available software package for molecular dynamics visualization and analysis.
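
    The rate-limiting histogramming step the abstract describes is, in serial CPU form, just binning minimum-image pair distances. The sketch below shows that core (cubic box assumed); the GPU version tiles this loop across thread blocks and accumulates with atomic adds, and the normalization of counts to g(r) by shell volume and density is left out.

    ```python
    import numpy as np

    def rdf_histogram(coords, box, r_max, n_bins):
        """Histogram of all pair separations under the minimum-image
        convention in a cubic box of side `box`."""
        coords = np.asarray(coords, float)
        hist = np.zeros(n_bins, int)
        dr = r_max / n_bins
        for i in range(len(coords) - 1):
            d = coords[i + 1:] - coords[i]
            d -= box * np.round(d / box)          # minimum-image convention
            r = np.linalg.norm(d, axis=1)
            b = (r[r < r_max] / dr).astype(int)
            np.add.at(hist, b, 1)
        return hist

    coords = [[0, 0, 0], [1, 0, 0], [0, 1, 0]]
    hist = rdf_histogram(coords, box=10.0, r_max=2.0, n_bins=4)
    ```

    With three atoms there are three pairs (separations 1, 1 and sqrt(2)), all of which land in the third bin of width 0.5.
    
    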

  3. Convergence and error estimation in free energy calculations using the weighted histogram analysis method.

    PubMed

    Zhu, Fangqiang; Hummer, Gerhard

    2012-02-05

    The weighted histogram analysis method (WHAM) has become the standard technique for the analysis of umbrella sampling simulations. In this article, we address the challenges (1) of obtaining fast and accurate solutions of the coupled nonlinear WHAM equations, (2) of quantifying the statistical errors of the resulting free energies, (3) of diagnosing possible systematic errors, and (4) of optimally allocating the computational resources. Traditionally, the WHAM equations are solved by a fixed-point direct iteration method, despite poor convergence and possible numerical inaccuracies in the solutions. Here, we instead solve the mathematically equivalent problem of maximizing a target likelihood function, by using superlinear numerical optimization algorithms with a significantly faster convergence rate. To estimate the statistical errors in one-dimensional free energy profiles obtained from WHAM, we note that for densely spaced umbrella windows with harmonic biasing potentials, the WHAM free energy profile can be approximated by a coarse-grained free energy obtained by integrating the mean restraining forces. The statistical errors of the coarse-grained free energies can be estimated straightforwardly and then used for the WHAM results. A generalization to multidimensional WHAM is described. We also propose two simple statistical criteria to test the consistency between the histograms of adjacent umbrella windows, which help identify inadequate sampling and hysteresis in the degrees of freedom orthogonal to the reaction coordinate. Together, the estimates of the statistical errors and the diagnostics of inconsistencies in the potentials of mean force provide a basis for the efficient allocation of computational resources in free energy simulations.
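    The traditional fixed-point iteration that the authors propose to replace can be written compactly for a discretized one-dimensional profile. The sketch below is illustrative only (array layout and convergence test are my choices, not the paper's implementation):

```python
import numpy as np

def wham(hist, bias, n_samples, beta=1.0, tol=1e-10, max_iter=100000):
    """Direct (fixed-point) iteration of the discretized WHAM equations.

    hist[i, j]  : counts of umbrella window i in bin j
    bias[i, j]  : biasing potential U_i evaluated at bin centre j
    n_samples[i]: total samples drawn in window i
    Returns the unbiased bin probabilities p[j].
    """
    hist = np.asarray(hist, float)
    bias = np.asarray(bias, float)
    n_samples = np.asarray(n_samples, float)
    c = np.exp(-beta * bias)            # Boltzmann factors of the biases
    total = hist.sum(axis=0)            # pooled counts per bin
    f = np.ones(len(n_samples))         # window normalization constants
    for _ in range(max_iter):
        # p_j = sum_i h_ij / sum_i N_i f_i c_ij
        p = total / np.einsum('i,i,ij->j', n_samples, f, c)
        p /= p.sum()
        f_new = 1.0 / (c @ p)           # f_i = 1 / sum_j c_ij p_j
        converged = np.max(np.abs(np.log(f_new / f))) < tol
        f = f_new
        if converged:
            break
    return p
```

    The slow convergence of exactly this loop for many strongly overlapping windows is what motivates the article's switch to superlinear likelihood maximization.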

  4. Seismic remote sensing image segmentation based on spectral histogram and dynamic region merging

    NASA Astrophysics Data System (ADS)

    Wang, Peng; Sun, Genyun; Wang, Zhenjie

    2015-12-01

    Image segmentation is the foundation of seismic information extraction from high-resolution remote sensing images. However, the complexity of seismic images poses great challenges for segmentation. Compared with traditional pixel-level approaches, region-level approaches have proven more effective at handling this complexity. This paper addresses the seismic image segmentation problem in a region-merging style. Starting from many over-segmented regions, segmentation proceeds by iteratively merging neighboring regions. In the proposed algorithm, the merging criterion and the merging order are the two essential issues to be considered. An effective merging criterion largely depends on the region feature and the neighbor homogeneity measure. A region's spectral histogram represents its global feature and enhances the discriminability of neighboring regions, so we use it in the merging criterion. Under a given merging criterion, better performance is obtained if the most similar regions are always merged first, which can be cast as a least-cost problem. Rather than predefining an order queue, we solve the ordering problem with a dynamic scheme. The proposed approach contains three main parts. First, starting from the over-segmented regions, spectral histograms are constructed to represent each region. Then, a homogeneity measure that combines distance and shape is used as the merging criterion. Finally, neighboring regions are dynamically merged following dynamic programming (DP) theory and a breadth-first strategy. Experiments are conducted using earthquake images, including collapsed buildings and seismic secondary geological disasters. The experimental results show that the proposed method segments seismic images more accurately.
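    The core of such a region-merging scheme — representing each region by a normalized spectral histogram and always merging the most similar adjacent pair first — can be sketched as follows. Here a plain L1 histogram distance stands in for the paper's combined distance-and-shape homogeneity measure, so this is an illustration of the idea rather than the published method:

```python
import numpy as np

def spectral_histogram(pixels, n_bins=16, value_range=(0, 256)):
    """Normalized intensity histogram used as a region's global feature."""
    h, _ = np.histogram(pixels, bins=n_bins, range=value_range)
    return h / max(h.sum(), 1)

def most_similar_pair(region_hists, adjacency):
    """Pick the neighbouring pair with minimal histogram L1 distance,
    i.e. the 'merge the cheapest pair first' step of a dynamic
    region-merging loop."""
    best, best_d = None, np.inf
    for a, b in adjacency:
        d = np.abs(region_hists[a] - region_hists[b]).sum()
        if d < best_d:
            best, best_d = (a, b), d
    return best, best_d
```

    In a full implementation the loop would merge the returned pair, recompute the merged region's histogram and adjacency, and repeat until no pair falls below a homogeneity threshold.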

  5. Fast Analysis of Molecular Dynamics Trajectories with Graphics Processing Units-Radial Distribution Function Histogramming.

    PubMed

    Levine, Benjamin G; Stone, John E; Kohlmeyer, Axel

    2011-05-01

    The calculation of radial distribution functions (RDFs) from molecular dynamics trajectory data is a common and computationally expensive analysis task. The rate limiting step in the calculation of the RDF is building a histogram of the distance between atom pairs in each trajectory frame. Here we present an implementation of this histogramming scheme for multiple graphics processing units (GPUs). The algorithm features a tiling scheme to maximize the reuse of data at the fastest levels of the GPU's memory hierarchy and dynamic load balancing to allow high performance on heterogeneous configurations of GPUs. Several versions of the RDF algorithm are presented, utilizing the specific hardware features found on different generations of GPUs. We take advantage of larger shared memory and atomic memory operations available on state-of-the-art GPUs to accelerate the code significantly. The use of atomic memory operations allows the fast, limited-capacity on-chip memory to be used much more efficiently, resulting in a fivefold increase in performance compared to the version of the algorithm without atomic operations. The ultimate version of the algorithm running in parallel on four NVIDIA GeForce GTX 480 (Fermi) GPUs was found to be 92 times faster than a multithreaded implementation running on an Intel Xeon 5550 CPU. On this multi-GPU hardware, the RDF between two selections of 1,000,000 atoms each can be calculated in 26.9 seconds per frame. The multi-GPU RDF algorithms described here are implemented in VMD, a widely used and freely available software package for molecular dynamics visualization and analysis.

  6. Adaptive threshold selection for background removal in fringe projection profilometry

    NASA Astrophysics Data System (ADS)

    Zhang, Wei; Li, Weishi; Yan, Jianwen; Yu, Liandong; Pan, Chengliang

    2017-03-01

    In fringe projection profilometry, background and shadow are inevitable in the image of an object, and must be identified and removed. In existing methods, it is nontrivial to determine a proper threshold to segment the background and shadow regions, especially when the gray-level histogram of the image is close to unimodal, and an improper threshold generally results in misclassification of the object and the background/shadow. In this paper, an adaptive threshold method is proposed to tackle the problem. Different from existing automatic methods, the modulation-level histogram, instead of the gray-level histogram, of the image is employed to determine the threshold. Furthermore, a new weighting factor is proposed to improve Otsu's method for segmenting images with near-unimodal histograms; the weighting factor significantly intensifies the modulation difference between object pixels and background/shadow pixels, and is itself adaptive to the image. The proposed method outperforms existing methods in accuracy, efficiency, or automation. Experimental results are given to demonstrate the feasibility and effectiveness of the proposed method.
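    For reference, the baseline Otsu criterion that the paper's weighting factor modifies selects the threshold maximizing the between-class variance. A minimal sketch on a 1-D sample (e.g. per-pixel modulation values) might look like this — the paper's adaptive weighting of this criterion is not reproduced here:

```python
import numpy as np

def otsu_threshold(values, n_bins=256):
    """Plain Otsu threshold on a 1-D sample (e.g. fringe modulation values).

    Maximizes the between-class variance
    sigma_b^2(k) = (mu_T * w0(k) - mu(k))^2 / (w0(k) * (1 - w0(k)))
    over all candidate cut positions k.
    """
    hist, edges = np.histogram(values, bins=n_bins)
    p = hist / hist.sum()
    centres = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(p)                   # class-0 probability at each cut
    mu = np.cumsum(p * centres)         # cumulative mean
    mu_t = mu[-1]
    with np.errstate(divide='ignore', invalid='ignore'):
        sigma_b = (mu_t * w0 - mu) ** 2 / (w0 * (1.0 - w0))
    sigma_b = np.nan_to_num(sigma_b)    # empty classes contribute nothing
    return centres[np.argmax(sigma_b)]
```

    On a near-unimodal histogram this unweighted criterion tends to place the threshold poorly, which is exactly the failure mode the paper's weighting factor addresses.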

  7. Combat Exclusion: An Equal Protection Analysis

    DTIC Science & Technology

    1997-04-01

    uniforms, they served without the benefits of rank, officer status, equal pay, or veteran’s benefits. The Army also contracted women to serve in the Signal...accomplish the elimination of hearings on the merits, is to make the very kind of arbitrary legislative choice forbidden by the Equal Protection...COMBAT EXCLUSION: AN EQUAL PROTECTION ANALYSIS. A Thesis Presented to The Judge Advocate General’s School, United States Army. The

  8. 25 GB Read-Only Disk System using the Two-Dimensional Equalizer

    NASA Astrophysics Data System (ADS)

    Tomita, Yoshimi; Nishiwaki, Hiroshi; Miyanabe, Shogo; Kuribayashi, Hiroki; Yamamoto, Kaoru; Yokogawa, Fumihiko

    2001-03-01

    In order to achieve a higher-density disk system, we applied a two-dimensional equalizer along with a limit equalizer to an optical disk drive system that has an objective lens with a numerical aperture of 0.85 and a thin transparent cover layer of 0.1 mm thickness. Consequently, we realized a 25 GB read-only disk system with sufficient margins against disk tilt and defocus. The two-dimensional equalizer is composed of a cross-talk cancel system and a tangential adaptive equalizer, and could prevent deterioration due to inter-symbol interference and cross-talk from adjacent tracks. The limit equalizer could prevent deterioration due to disk noise. By measuring the jitter with the limit equalizer, which can expand the system margin almost as much as a Viterbi decoder, we could evaluate the disk quality for standardization and verification.

  9. TaBoo SeArch Algorithm with a Modified Inverse Histogram for Reproducing Biologically Relevant Rare Events of Proteins.

    PubMed

    Harada, Ryuhei; Takano, Yu; Shigeta, Yasuteru

    2016-05-10

    The TaBoo SeArch (TBSA) algorithm [Harada et al., J. Comput. Chem. 2015, 36, 763-772; Harada et al., Chem. Phys. Lett. 2015, 630, 68-75] was recently proposed as an enhanced conformational sampling method for reproducing biologically relevant rare events of a given protein. In TBSA, an inverse histogram of the original distribution, mapped onto a set of reaction coordinates, is constructed from trajectories obtained by multiple short-time molecular dynamics (MD) simulations. Rarely occurring states of a given protein are statistically selected as new initial states based on the inverse histogram, and resampling is performed by restarting the MD simulations from the new initial states to promote the conformational transition. In this process, the definition of the inverse histogram, which characterizes the rarely occurring states, is crucial for the efficiency of TBSA. In this study, we propose a simple modification of the inverse histogram to further accelerate the convergence of TBSA. As demonstrations of the modified TBSA, we applied it to (a) hydrogen bonding rearrangements of Met-enkephalin, (b) large-amplitude domain motions of Glutamine-Binding Protein, and (c) folding processes of the B domain of Staphylococcus aureus Protein A. All demonstrations numerically proved that the modified TBSA reproduced these biologically relevant rare events with nanosecond-order simulation times, although a set of microsecond-order, canonical MD simulations failed to reproduce the rare events, indicating the high efficiency of the modified TBSA.
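    The selection step that the inverse histogram drives can be illustrated with a toy resampler: each snapshot's selection weight is the inverse of its histogram bin's population, so sparsely visited states are favoured as restart points. This is a generic sketch of the idea, not the paper's (modified) weighting:

```python
import numpy as np

def select_rare_states(values, n_bins=20, n_select=5, rng=None):
    """Pick snapshot indices preferentially from sparsely populated bins.

    values: 1-D reaction-coordinate values of the sampled snapshots.
    Weight of each snapshot = 1 / (population of its histogram bin),
    i.e. an inverse-histogram selection probability.
    """
    rng = np.random.default_rng(rng)
    values = np.asarray(values, float)
    hist, edges = np.histogram(values, bins=n_bins)
    bin_idx = np.clip(np.digitize(values, edges) - 1, 0, n_bins - 1)
    weights = 1.0 / hist[bin_idx]       # inverse-histogram weight per snapshot
    weights /= weights.sum()
    return rng.choice(len(values), size=n_select, replace=False, p=weights)
```

    Restarting short MD runs from the returned indices, then rebuilding the histogram, gives the TBSA-style resampling cycle described in the abstract.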

  10. Studying the time histogram of a terrestrial electron beam detected from the opposite hemisphere of its associated TGF

    NASA Astrophysics Data System (ADS)

    Sarria, D.; Blelly, P.-L.; Briggs, M. S.; Forme, F.

    2016-05-01

    Terrestrial gamma-ray flashes are bursts of X/gamma photons, correlated to thunderstorms. By interacting with the atmosphere, the photons produce a substantial number of electrons and positrons. Some of these reach a sufficiently high altitude that their interactions with the atmosphere become negligible, and they are then guided by geomagnetic field lines, forming a Terrestrial Electron Beam. On 9 December 2009, the Gamma-Ray Burst Monitor (GBM) instrument on board the Fermi Space Telescope made a particularly interesting measurement of such an event. To study this type of event in detail, we perform Monte-Carlo simulations and focus on the resulting time histograms. In agreement with previous work, we show that the histogram measured by Fermi GBM is reproducible from a simulation. We then show that the time histogram resulting from this simulation is only weakly dependent on the production altitude, duration, beaming angle, and spectral shape of the associated terrestrial gamma-ray flash. Finally, we show that the time histogram can be decomposed into three populations of leptons, coming from the opposite hemisphere, and mirroring back to the satellite with or without interacting with the atmosphere, and that these populations can be clearly distinguished by their pitch angles.

  11. Dynamic Contrast-enhanced MR Imaging in Renal Cell Carcinoma: Reproducibility of Histogram Analysis on Pharmacokinetic Parameters

    PubMed Central

    Wang, Hai-yi; Su, Zi-hua; Xu, Xiao; Sun, Zhi-peng; Duan, Fei-xue; Song, Yuan-yuan; Li, Lu; Wang, Ying-wei; Ma, Xin; Guo, Ai-tao; Ma, Lin; Ye, Hui-yi

    2016-01-01

    Pharmacokinetic parameters derived from dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) have been increasingly used to evaluate the permeability of tumor vessels. Histogram metrics are a promising quantitative MR imaging method that has recently been introduced into the analysis of DCE-MRI pharmacokinetic parameters in oncology to account for tumor heterogeneity. In this study, 21 patients with renal cell carcinoma (RCC) underwent paired DCE-MRI studies on a 3.0 T MR system. The extended Tofts model and a population-based arterial input function were used to calculate kinetic parameters of RCC tumors. The mean value and histogram metrics (mode, skewness and kurtosis) of each pharmacokinetic parameter were generated automatically using ImageJ software. Intra- and inter-observer reproducibility and scan-rescan reproducibility were evaluated using intra-class correlation coefficients (ICCs) and the coefficient of variation (CoV). Our results demonstrated that the histogram metrics (mode, skewness and kurtosis) were not superior to the conventional mean value in the reproducibility evaluation of DCE-MRI pharmacokinetic parameters (Ktrans & Ve) in renal cell carcinoma; skewness and kurtosis in particular showed lower intra-observer, inter-observer and scan-rescan reproducibility than the mean value. Our findings suggest that additional studies are necessary before wide incorporation of histogram metrics into the quantitative analysis of DCE-MRI pharmacokinetic parameters. PMID:27380733
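    The histogram metrics compared in the study can be computed directly from a voxel-wise parameter map. The sketch below uses plain numpy; the 64-bin mode estimate and the moment-ratio definitions are my assumptions (the study generated its metrics with ImageJ):

```python
import numpy as np

def histogram_metrics(param_map, n_bins=64):
    """Mean, mode, skewness, and kurtosis of a voxel-wise pharmacokinetic
    parameter map (e.g. a Ktrans or Ve map). Mode is estimated as the
    centre of the fullest histogram bin."""
    v = np.asarray(param_map, float).ravel()
    hist, edges = np.histogram(v, bins=n_bins)
    mode = 0.5 * (edges[:-1] + edges[1:])[np.argmax(hist)]
    d = v - v.mean()
    m2, m3, m4 = (d**2).mean(), (d**3).mean(), (d**4).mean()
    return {
        'mean': v.mean(),
        'mode': mode,
        'skewness': m3 / m2**1.5,
        'kurtosis': m4 / m2**2 - 3.0,   # excess kurtosis: 0 for a Gaussian
    }
```

    Skewness and kurtosis depend on third and fourth moments, which are sensitive to outlying voxels — one plausible reason the study found them less reproducible than the mean.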

  12. Isobio software: biological dose distribution and biological dose volume histogram from physical dose conversion using linear-quadratic-linear model

    PubMed Central

    Jaikuna, Tanwiwat; Khadsiri, Phatchareewan; Chawapun, Nisa; Saekho, Suwit

    2017-01-01

    Purpose: To develop an in-house software program able to calculate and generate the biological dose distribution and biological dose-volume histogram by physical dose conversion using the linear-quadratic-linear (LQL) model. Material and methods: The Isobio software was developed using MATLAB version 2014b to calculate and generate the biological dose distribution and biological dose-volume histograms. The physical dose from each voxel of a treatment plan was extracted through the Computational Environment for Radiotherapy Research (CERR), and the accuracy was verified by the difference between the dose-volume histogram from CERR and that from the treatment planning system. An equivalent dose in 2 Gy fractions (EQD2) was calculated using the biologically effective dose (BED) based on the LQL model. The software calculation and a manual calculation were compared for EQD2 verification with paired t-test statistical analysis using IBM SPSS Statistics version 22 (64-bit). Results: Two- and three-dimensional biological dose distributions and biological dose-volume histograms were displayed correctly by the Isobio software. Physical doses differed between CERR and the treatment planning system (TPS) in Oncentra by 3.33% in the high-risk clinical target volume (HR-CTV) determined by D90%, by 0.56% in the bladder and 1.74% in the rectum determined by D2cc, and by less than 1% in Pinnacle. The difference in EQD2 between the software calculation and the manual calculation was not significant (0.00%, at p-values of 0.820, 0.095, and 0.593 for external beam radiation therapy (EBRT) and 0.240, 0.320, and 0.849 for brachytherapy (BT) in the HR-CTV, bladder, and rectum, respectively). Conclusions: The Isobio software is a feasible tool for generating the biological dose distribution and biological dose-volume histogram for treatment plan evaluation in both EBRT and BT. PMID:28344603
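    The EQD2 conversion at the heart of such software is compact enough to state inline. The sketch below gives the standard linear-quadratic form only; the Isobio software additionally applies the LQL correction for large doses per fraction, which is omitted here:

```python
def eqd2(total_dose, dose_per_fraction, alpha_beta):
    """Equivalent dose in 2 Gy fractions from the linear-quadratic model.

    BED  = D * (1 + d / (alpha/beta))
    EQD2 = BED / (1 + 2 / (alpha/beta))
    where D is total dose, d dose per fraction, alpha_beta the tissue's
    alpha/beta ratio in Gy.
    """
    bed = total_dose * (1.0 + dose_per_fraction / alpha_beta)
    return bed / (1.0 + 2.0 / alpha_beta)
```

    As a sanity check, a plan already delivered in 2 Gy fractions maps onto itself: eqd2(60, 2, ab) equals 60 Gy for any alpha/beta.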

  13. Absolute and relative dose surface and dose volume histograms of the bladder: which one is the most representative for the actual treatment?

    NASA Astrophysics Data System (ADS)

    Hoogeman, Mischa S.; Peeters, Stephanie T. H.; de Bois, Josien; Lebesque, Joos V.

    2005-08-01

    The purpose of this study was to quantify to what extent relative and absolute bladder dose-volume and dose-surface histograms of the planning CT scan were representative for the actual treatment. We used data of 17 patients, who each received 11 repeat CT scans and a planning CT scan. The repeat CT scans were matched on the planning CT scan by the bony anatomy. Clinical treatment plans were used to evaluate the impact of bladder filling changes on the four histogram types. The impact was quantified by calculating for this patient group the correlation coefficient between the planning histogram and the treatment histogram. We found that the absolute dose-surface histogram was the most representative one for the actual treatment.

  14. 78 FR 53231 - Women's Equality Day, 2013

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-28

    ... of organizing, agitating, and demonstrating, our country achieved a major victory for women's rights... August 28, 2013 Part IV The President Proclamation 9003--Women's Equality Day, 2013 Proclamation 9004... Title 3--The President Proclamation 9003 of August 23, 2013 Women's Equality...

  15. Equality in Higher Education in Northern Ireland

    ERIC Educational Resources Information Center

    Osborne, R.D.

    2005-01-01

    The higher education sector in Northern Ireland has been fully involved in the public policies designed to enhance equality. Starting with measures designed to secure greater employment between Catholics and Protestants, known as fair employment, the policies are now designed to promote equality of opportunity across nine designated groups…

  16. School Law: A Question of Equality.

    ERIC Educational Resources Information Center

    Dowling-Sendor, Benjamin

    2003-01-01

    This article discusses the Equal Access Act (EAA) as it pertains to high-school student clubs. It raises basic questions about the EAA: What does "equal" mean? What level of access is required? Does the First Amendment's free-speech clause offer broader protection to student clubs than the EAA? (WFA)

  17. Vocational Education and Equality of Opportunity.

    ERIC Educational Resources Information Center

    Horowitz, Benjamin; Feinberg, Walter

    1990-01-01

    Examines the concepts of equality of opportunity and equality of educational opportunity and their relationship to vocational education. Traces the history of vocational education. Delineates the distinction between training and education as enumerated in Aristotelian philosophy. Discusses the role vocational education can play in the educative…

  18. Equal Rights for Men and Women

    ERIC Educational Resources Information Center

    Geo-Karis, Adeline J.

    1975-01-01

    A member of the Illinois House of Representatives expresses her reasons for believing that most of the goals of Women's Rights advocates are not only reasonable and desirable, but necessary for the survival of the family, and that the Equal Rights Amendment (ERA) means equal opportunity and recognition for all. (Author/AJ)

  19. 49 CFR 236.792 - Reservoir, equalizing.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 4 2010-10-01 2010-10-01 false Reservoir, equalizing. 236.792 Section 236.792 Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD ADMINISTRATION... Reservoir, equalizing. An air reservoir connected with and adding volume to the top portion of...

  20. 49 CFR 236.792 - Reservoir, equalizing.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 4 2012-10-01 2012-10-01 false Reservoir, equalizing. 236.792 Section 236.792 Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD ADMINISTRATION... Reservoir, equalizing. An air reservoir connected with and adding volume to the top portion of...

  1. 49 CFR 236.792 - Reservoir, equalizing.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 4 2014-10-01 2014-10-01 false Reservoir, equalizing. 236.792 Section 236.792 Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD ADMINISTRATION... Reservoir, equalizing. An air reservoir connected with and adding volume to the top portion of...

  2. 49 CFR 236.792 - Reservoir, equalizing.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 4 2013-10-01 2013-10-01 false Reservoir, equalizing. 236.792 Section 236.792 Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD ADMINISTRATION... Reservoir, equalizing. An air reservoir connected with and adding volume to the top portion of...

  3. 49 CFR 236.792 - Reservoir, equalizing.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 4 2011-10-01 2011-10-01 false Reservoir, equalizing. 236.792 Section 236.792 Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD ADMINISTRATION... Reservoir, equalizing. An air reservoir connected with and adding volume to the top portion of...

  4. Equal Plate Charges on Series Capacitors?

    ERIC Educational Resources Information Center

    Illman, B. L.; Carlson, G. T.

    1994-01-01

    Provides a line of reasoning in support of the contention that the equal charge proposition is at best an approximation. Shows how the assumption of equal plate charge on capacitors in series contradicts the conservative nature of the electric field. (ZWH)

  5. Automated geomorphometric classification of landforms in Transdanubian Region (Pannonian Basin) based on local slope histograms

    NASA Astrophysics Data System (ADS)

    Székely, Balázs; Koma, Zsófia; Csorba, Kristóf; Ferenc Morovics, József

    2014-05-01

    The Transdanubian Region is a typically hilly, geologically manifold area of the Pannonian Basin. It is composed primarily of Permo-Mesozoic carbonates and siliciclastic sediments; however, Pannonian sedimentary units and young volcanic forms are also characteristic, such as those in the Bakony-Balaton Highland Volcanic Field. The geological diversity is reflected in the geomorphological setting: besides the classic eroding volcanic edifices, carbonate plateaus and medium-relief, gently hilly, slowly eroding landforms are also frequent in the geomorphic mosaic of the area. Geomorphometric techniques are suitable for analysing and separating the various geomorphic units mosaicked and, in some cases, affected by (sub-)recent tectonic geomorphic processes. In our project we applied automated classification of local slope-angle histograms derived from a 10-meter nominal resolution digital terrain model (DTM). Slope-angle histograms within a rectangular moving window of various sizes have been calculated in numerous experiments. The histograms then served as a multichannel input for a k-means classification to achieve a geologically and geomorphologically sound categorization of the area. The experiments show good results in separating the very basic landforms; defined landscape boundaries can be reconstructed with high accuracy in the case of larger window sizes (e.g. 5 km) and a low number of categories. If the window size is smaller and the number of classes is higher, the tectonic geomorphic features are more prominently recognized, although often at the price of clear separation boundaries: in these cases the horizontal change in the composition of the various clusters matches the boundaries of the geological units. Volcanic forms are typically also put into definite classes, although the flat plateaus of some volcanic edifices fall into another category also recognized in the experiments. In summary we can conclude that the area is suitable for such analyses, many
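    The feature-extraction step described above — turning local slope-angle histograms into multichannel vectors for k-means — can be sketched as follows. Non-overlapping windows and an 18-bin (5-degree) binning are illustrative simplifications; the study used rectangular moving windows of several sizes:

```python
import numpy as np

def slope_histograms(slope, window, n_bins=18, max_slope=90.0):
    """Local slope-angle histograms over non-overlapping windows of a
    DTM-derived slope grid. Each normalized histogram is one multichannel
    feature vector for a subsequent k-means classification."""
    rows, cols = slope.shape
    feats = []
    for r in range(0, rows - window + 1, window):
        for c in range(0, cols - window + 1, window):
            h, _ = np.histogram(slope[r:r + window, c:c + window],
                                bins=n_bins, range=(0.0, max_slope))
            feats.append(h / h.sum())
    return np.array(feats)
```

    The resulting feature matrix (one row per window) can be fed to any standard k-means implementation; the cluster labels, mapped back to window positions, give the landform categorization.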

  6. 29 CFR 1620.14 - Testing equality of jobs.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 29 Labor 4 2012-07-01 2012-07-01 false Testing equality of jobs. 1620.14 Section 1620.14 Labor... Testing equality of jobs. (a) In general. What constitutes equal skill, equal effort, or equal... equality of job content in general. In determining whether employees are performing equal work within...

  7. 29 CFR 1620.14 - Testing equality of jobs.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 29 Labor 4 2014-07-01 2014-07-01 false Testing equality of jobs. 1620.14 Section 1620.14 Labor... Testing equality of jobs. (a) In general. What constitutes equal skill, equal effort, or equal... equality of job content in general. In determining whether employees are performing equal work within...

  8. 29 CFR 1620.14 - Testing equality of jobs.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 29 Labor 4 2013-07-01 2013-07-01 false Testing equality of jobs. 1620.14 Section 1620.14 Labor... Testing equality of jobs. (a) In general. What constitutes equal skill, equal effort, or equal... equality of job content in general. In determining whether employees are performing equal work within...

  9. 29 CFR 1620.14 - Testing equality of jobs.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 4 2010-07-01 2010-07-01 false Testing equality of jobs. 1620.14 Section 1620.14 Labor... Testing equality of jobs. (a) In general. What constitutes equal skill, equal effort, or equal... equality of job content in general. In determining whether employees are performing equal work within...

  10. 29 CFR 1620.14 - Testing equality of jobs.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 4 2011-07-01 2011-07-01 false Testing equality of jobs. 1620.14 Section 1620.14 Labor... Testing equality of jobs. (a) In general. What constitutes equal skill, equal effort, or equal... equality of job content in general. In determining whether employees are performing equal work within...

  11. Directional Histogram Ratio at Random Probes: A Local Thresholding Criterion for Capillary Images

    PubMed Central

    Lu, Na; Silva, Jharon; Gu, Yu; Gerber, Scott; Wu, Hulin; Gelbard, Harris; Dewhurst, Stephen; Miao, Hongyu

    2013-01-01

    With the development of micron-scale imaging techniques, capillaries can be conveniently visualized using methods such as two-photon and whole-mount microscopy. However, the presence of background staining, leaky vessels and the diffusion of small fluorescent molecules can lead to significant complexity in image analysis and loss of information necessary to accurately quantify vascular metrics. One solution to this problem is the development of accurate thresholding algorithms that reliably distinguish blood vessels from surrounding tissue. Although various thresholding algorithms have been proposed, our results suggest that without appropriate pre- or post-processing, the existing approaches may fail to obtain satisfactory results for capillary images that include areas of contamination. In this study, we propose a novel local thresholding algorithm, called directional histogram ratio at random probes (DHR-RP). This method explicitly considers the geometric features of tube-like objects in conducting image binarization, and performs reliably in distinguishing small vessels from either clean or contaminated background. Experimental and simulation studies suggest that our DHR-RP algorithm is superior to existing thresholding methods. PMID:23525856

  12. Classification of amyloid status using machine learning with histograms of oriented 3D gradients.

    PubMed

    Cattell, Liam; Platsch, Günther; Pfeiffer, Richie; Declerck, Jérôme; Schnabel, Julia A; Hutton, Chloe

    2016-01-01

    Brain amyloid burden may be quantitatively assessed from positron emission tomography imaging using standardised uptake value ratios. Using these ratios as an adjunct to visual image assessment has been shown to improve inter-reader reliability; however, the amyloid positivity threshold is dependent on the tracer and the specific image regions used to calculate the uptake ratio. To address this problem, we propose a machine learning approach to amyloid status classification, which is independent of tracer and does not require a specific set of regions of interest. Our method extracts feature vectors from amyloid images, which are based on histograms of oriented three-dimensional gradients. We optimised our method on 133 (18)F-florbetapir brain volumes and applied it to a separate test set of 131 volumes. Using the same parameter settings, we then applied our method to 209 (11)C-PiB images and 128 (18)F-florbetaben images. We compared our method to classification results achieved using two other methods: standardised uptake value ratios and a machine learning method based on voxel intensities. Our method resulted in the largest mean distances between the subjects and the classification boundary, suggesting that it is less likely to make low-confidence classification decisions. Moreover, our method obtained the highest classification accuracy for all three tracers, and consistently achieved above 96% accuracy.

  13. Assessing the hydrologic alteration of the Yangtze River using the histogram matching approach

    NASA Astrophysics Data System (ADS)

    Huang, F.; Zhang, N.; Guo, L. D.; Xia, Z. Q.

    2016-08-01

    Hydrologic changes of the Yangtze River, an important river with abundant water resources in China, were investigated using the Histogram Matching Approach. Daily streamflow data spanning 1955 to 2013 were collected from the Yichang and Datong stations, which monitor the hydrologic processes of the upper and lower reaches of the Yangtze River, respectively. The Gezhouba Dam, the first dam constructed on the main stream of the Yangtze River, started operations in 1981, and this year was used to differentiate the pre-dam (1955-1980) and post-dam (1981-2013) hydrologic regimes. The hydrologic regime was quantified by the Indicators of Hydrologic Alteration. The overall alteration degree of the upper Yangtze River was 31%, and the alteration degrees of individual hydrologic indicators ranged from 10% to 81%: 1, 5 and 26 hydrologic indicators were altered at high, moderate and low degrees, respectively. The overall alteration degree of the lower Yangtze River was 30%, and the alteration degrees of individual hydrologic indicators ranged from 8% to 49%. No high alteration degree was detected at the Datong station; ten hydrologic indicators were altered at moderate degrees and 22 at low degrees. Significant increases could be observed for the low-flow-related indicators, including the monthly flows from January to March, the annual minimum 1, 3, 7, 30 and 90-day flows, and the base flow index.

  14. Visualization of boundaries in CT volumetric data sets using dynamic M-|∇f| histogram.

    PubMed

    Li, Lu; Peng, Hu; Chen, Xun; Cheng, Juan; Gao, Dayong

    2016-01-01

    Direct volume rendering is widely used for three-dimensional medical data visualization such as computed tomography and magnetic resonance imaging. Distinct visualization of boundaries is able to provide valuable and insightful information in many medical applications. However, it is conventionally challenging to detect boundaries reliably due to limitations of the transfer function design. Meanwhile, the interactive strategy is complicated for new users or even experts. In this paper, we build a generalized boundary model contaminated by noise and prove boundary middle value (M) has a good statistical property. Based on the model we propose a user-friendly strategy for the boundary extraction and transfer function design, using M, boundary height (Δh), and gradient magnitude (|∇f|). In fact, it is a dynamic iterative process. First, potential boundaries are sorted orderly from high to low according to the value of their height. Then, users iteratively extract the boundary with the highest value of Δh in a newly defined domain, where different boundaries are transformed to disjoint vertical bars using M-|∇f| histogram. In this case, the chance of misclassification among different boundaries decreases.

  15. Histogram of Oriented Principal Components for Cross-View Action Recognition.

    PubMed

    Rahmani, Hossein; Mahmood, Arif; Huynh, Du; Mian, Ajmal

    2016-12-01

    Existing techniques for 3D action recognition are sensitive to viewpoint variations because they extract features from depth images which are viewpoint dependent. In contrast, we directly process pointclouds for cross-view action recognition from unknown and unseen views. We propose the histogram of oriented principal components (HOPC) descriptor that is robust to noise, viewpoint, scale and action speed variations. At a 3D point, HOPC is computed by projecting the three scaled eigenvectors of the pointcloud within its local spatio-temporal support volume onto the vertices of a regular dodecahedron. HOPC is also used for the detection of spatio-temporal keypoints (STK) in 3D pointcloud sequences so that view-invariant STK descriptors (or Local HOPC descriptors) at these key locations only are used for action recognition. We also propose a global descriptor computed from the normalized spatio-temporal distribution of STKs in 4-D, which we refer to as STK-D. We have evaluated the performance of our proposed descriptors against nine existing techniques on two cross-view and three single-view human action recognition datasets. The experimental results show that our techniques provide significant improvement over state-of-the-art methods.
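
    The projection step can be sketched under stated simplifications: eigenvalue-scaled eigenvectors of the local point covariance are projected onto the 20 vertex directions of a regular dodecahedron, with negative projections clipped (a common orientation-binning convention; the paper's actual quantization and spatio-temporal support handling are more involved). Function names are illustrative.

```python
import numpy as np

def dodecahedron_vertices():
    """The 20 vertices of a regular dodecahedron, unit-normalized."""
    phi = (1 + np.sqrt(5)) / 2
    v = [(x, y, z) for x in (-1, 1) for y in (-1, 1) for z in (-1, 1)]
    a, b = 1 / phi, phi
    for s1 in (-1, 1):
        for s2 in (-1, 1):
            v += [(0, s1 * a, s2 * b), (s1 * a, s2 * b, 0), (s1 * b, 0, s2 * a)]
    v = np.array(v, float)
    return v / np.linalg.norm(v, axis=1, keepdims=True)

def hopc(points):
    """Histogram of Oriented Principal Components at one point (sketch).

    points: (n, 3) neighbours in the local support volume.  Each
    eigenvalue-scaled eigenvector of the covariance is projected onto
    the 20 dodecahedron directions; negative projections are clipped."""
    C = np.cov(points.T)
    w, V = np.linalg.eigh(C)              # eigenvalues ascending
    U = dodecahedron_vertices()           # (20, 3)
    desc = []
    for i in (2, 1, 0):                   # largest to smallest component
        proj = U @ (w[i] * V[:, i])
        desc.append(np.clip(proj, 0, None))
    d = np.concatenate(desc)              # length 60 = 3 x 20
    return d / (np.linalg.norm(d) + 1e-12)

pts = np.random.default_rng(1).normal(size=(200, 3)) * [3.0, 2.0, 1.0]
print(hopc(pts).shape)  # (60,)
```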

  16. Computing Spatial Distance Histograms for Large Scientific Datasets On-the-Fly

    PubMed Central

    Kumar, Anand; Grupcev, Vladimir; Yuan, Yongke; Huang, Jin; Shen, Gang

    2014-01-01

    This paper focuses on an important query in scientific simulation data analysis: the Spatial Distance Histogram (SDH). The computation time of an SDH query using the brute-force method is quadratic. Often, such queries are executed continuously over certain time periods, increasing the computation time. We propose a highly efficient approximate algorithm to compute SDHs over consecutive time periods with provable error bounds. The key idea of our algorithm is to derive the statistical distribution of distances from the spatial and temporal characteristics of particles. Upon organizing the data into a Quad-tree based structure, the spatiotemporal characteristics of particles in each node of the tree are acquired to determine the particles’ spatial distribution as well as their temporal locality in consecutive time periods. We report our efforts in implementing and optimizing the above algorithm on Graphics Processing Units (GPUs) as a means to further improve efficiency. The accuracy and efficiency of the proposed algorithm are backed by mathematical analysis and results of extensive experiments using data generated from real simulation studies. PMID:25264418
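
    The quadratic-time baseline that the paper improves upon is straightforward to state; a brute-force SDH sketch (function name invented for illustration):

```python
import numpy as np

def sdh_brute_force(coords, bucket_width, n_buckets):
    """Spatial Distance Histogram by brute force: all O(n^2) pairwise
    distances are computed and binned by bucket_width."""
    n = len(coords)
    hist = np.zeros(n_buckets, dtype=np.int64)
    for i in range(n - 1):
        d = np.linalg.norm(coords[i + 1:] - coords[i], axis=1)
        idx = np.minimum((d / bucket_width).astype(int), n_buckets - 1)
        np.add.at(hist, idx, 1)
    return hist

rng = np.random.default_rng(42)
pts = rng.uniform(0, 10, size=(1000, 3))     # particles in a 10^3 box
h = sdh_brute_force(pts, bucket_width=1.0, n_buckets=18)
print(h.sum())  # 1000*999/2 = 499500 pairs
```

Every pair is visited once, which is exactly the cost the tree-based approximation avoids.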

  17. Shot-Noise Limited Single-Molecule FRET Histograms: Comparison between Theory and Experiments†

    PubMed Central

    Nir, Eyal; Michalet, Xavier; Hamadani, Kambiz M.; Laurence, Ted A.; Neuhauser, Daniel; Kovchegov, Yevgeniy; Weiss, Shimon

    2011-01-01

    We describe a simple approach and present a straightforward numerical algorithm to compute the best fit shot-noise limited proximity ratio histogram (PRH) in single-molecule fluorescence resonant energy transfer diffusion experiments. The key ingredient is the use of the experimental burst size distribution, as obtained after burst search through the photon data streams. We show how the use of an alternated laser excitation scheme and a correspondingly optimized burst search algorithm eliminates several potential artifacts affecting the calculation of the best fit shot-noise limited PRH. This algorithm is tested extensively on simulations and simple experimental systems. We find that dsDNA data exhibit a wider PRH than expected from shot noise only and hypothetically account for it by assuming a small Gaussian distribution of distances with an average standard deviation of 1.6 Å. Finally, we briefly mention the results of a future publication and illustrate them with a simple two-state model system (DNA hairpin), for which the kinetic transition rates between the open and closed conformations are extracted. PMID:17078646
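
    The shot-noise-limited PRH itself is easy to sketch: for a burst of n photons with mean proximity ratio E, the acceptor count under shot noise alone is Binomial(n, E), and the PRH is the mixture of these binomials over the experimental burst-size distribution — the key ingredient identified in the abstract. The function name and synthetic burst sizes below are invented for illustration.

```python
import numpy as np
from math import comb

def shot_noise_prh(burst_sizes, mean_E, n_bins=20):
    """Shot-noise-limited proximity ratio histogram (sketch).

    For each burst of n photons the acceptor count a ~ Binomial(n, mean_E),
    giving proximity ratio P = a/n; the histogram is the mixture of these
    binomials over the experimental burst-size distribution."""
    hist = np.zeros(n_bins)
    for n in burst_sizes:
        for a in range(n + 1):
            p = comb(n, a) * mean_E**a * (1 - mean_E)**(n - a)
            b = min(int(a / n * n_bins), n_bins - 1)
            hist[b] += p
    return hist / hist.sum()

sizes = np.random.default_rng(0).integers(20, 80, size=500)
prh = shot_noise_prh(sizes, mean_E=0.5)
print(prh.argmax())   # peak near the bins covering P = 0.5
```

A measured PRH wider than this prediction (as for the dsDNA data above) then points to real distance heterogeneity beyond shot noise.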

  18. Thermodynamics and structure of macromolecules from flat-histogram Monte Carlo simulations.

    PubMed

    Janke, Wolfhard; Paul, Wolfgang

    2016-01-21

    Over the last decade flat-histogram Monte Carlo simulations, especially multicanonical and Wang-Landau simulations, have emerged as a strong tool to study the statistical mechanics of polymer chains. These investigations have focused on coarse-grained models of polymers on the lattice and in the continuum. Phase diagrams of chains in bulk as well as chains attached to surfaces were studied, for homopolymers as well as for protein-like models. The aggregation behavior of these models in solution has also been investigated. We present here the theoretical background for these simulations, explain the algorithms used, discuss their performance, and give an overview of the systems studied with these methods in the literature, limiting ourselves to coarse-grained model systems. Implementations of these algorithms on parallel computers are also briefly described. In parallel with the development of these simulation methods, the power of a microcanonical analysis of such simulations has been recognized, and we present the current state of the art in applying microcanonical analysis to phase transitions in nanoscopic polymer systems.
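
    The flat-histogram loop can be illustrated on a toy model with an exactly known density of states. The sketch below runs Wang-Landau sampling — not any specific polymer model from the review — on N independent two-state spins, where the energy is the number of up spins and g(E) = C(N, E):

```python
import numpy as np
from math import comb, log

def wang_landau_coins(N=10, f_final=1e-4, flat=0.8, seed=1):
    """Wang-Landau sketch on a toy model with exactly known density of
    states: E = number of up spins among N independent spins, g(E) = C(N, E).
    A random walk in E-space accumulates ln g on the fly; whenever the visit
    histogram H is flat, the modification factor ln f is halved -- the loop
    structure shared by flat-histogram simulations."""
    rng = np.random.default_rng(seed)
    spins = rng.integers(0, 2, N)
    E = int(spins.sum())
    lng = np.zeros(N + 1)        # running estimate of ln g(E)
    lnf = 1.0                    # modification factor ln f
    while lnf > f_final:
        H = np.zeros(N + 1)
        for _ in range(5000):
            k = rng.integers(N)
            E_new = E + (1 - 2 * int(spins[k]))        # effect of flipping spin k
            if rng.random() < np.exp(lng[E] - lng[E_new]):
                spins[k] ^= 1
                E = E_new
            lng[E] += lnf
            H[E] += 1
        if H.min() > flat * H.mean():                  # flatness criterion
            lnf /= 2.0
    return lng - lng[0]          # normalize so that g(0) = 1

lng = wang_landau_coins()
exact = [log(comb(10, k)) for k in range(11)]
print(np.max(np.abs(lng - exact)) < 0.5)  # True: close to exact ln C(10, k)
```

Once ln g(E) is known, canonical averages at any temperature — and the microcanonical analysis mentioned above — follow by reweighting.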

  19. Application of Histogram Analysis in Radiation Therapy (HART) in Intensity Modulation Radiation Therapy (IMRT) Treatments

    NASA Astrophysics Data System (ADS)

    Pyakuryal, Anil

    2009-03-01

    A carcinoma is a malignant cancer that emerges from epithelial cells in structures throughout the body. It invades critical organs and can metastasize or spread to lymph nodes. IMRT is an advanced mode of radiation therapy treatment for cancer: it delivers more conformal doses to malignant tumors, sparing the critical organs, by modulating the intensity of the radiation beam. An automated software package, HART (S. Jang et al., 2008, Med. Phys. 35, p. 2812), was used for efficient analysis of dose-volume histograms (DVH) for multiple targets and critical organs in four IMRT treatment plans for each patient. IMRT data for ten head and neck cancer patients were exported as AAPM/RTOG format files from a commercial treatment planning system at Northwestern Memorial Hospital (NMH). HART-extracted DVH statistics were used to evaluate plan indices and to analyze dose tolerance of critical structures at the prescription dose (PD) for each patient. Mean plan indices (n = 10) were found to be in good agreement with published results for linac-based plans. The least irradiated volume at tolerance dose (TD50) was observed for the brainstem and the highest volume for the larynx in SIB treatment techniques. Thus HART, an open-source platform, has extensive clinical implications in IMRT treatments.
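
    The core quantity such tools extract, the cumulative DVH, is simple to compute; a minimal sketch with an invented synthetic dose distribution:

```python
import numpy as np

def cumulative_dvh(dose, n_bins=100):
    """Cumulative dose-volume histogram: DVH(d) = % of the structure's
    volume receiving at least dose d."""
    dose = np.asarray(dose, float)
    edges = np.linspace(0, dose.max(), n_bins + 1)
    frac = np.array([(dose >= d).mean() * 100 for d in edges])
    return edges, frac

# Toy target: most voxels near a 70 Gy prescription, small cold-spot tail.
rng = np.random.default_rng(3)
dose = np.clip(rng.normal(70, 2, 10000), 0, None)
d, v = cumulative_dvh(dose)
print(f"V(66 Gy) = {v[np.searchsorted(d, 66)]:.1f}%")
```

Plan indices (coverage, homogeneity, organ tolerance at PD) are then read off this curve for each structure.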

  20. Lung Cancer Prediction Using Neural Network Ensemble with Histogram of Oriented Gradient Genomic Features

    PubMed Central

    Adetiba, Emmanuel; Olugbara, Oludayo O.

    2015-01-01

    This paper reports an experimental comparison of artificial neural network (ANN) and support vector machine (SVM) ensembles and their “nonensemble” variants for lung cancer prediction. These machine learning classifiers were trained to predict lung cancer using samples of patient nucleotides with mutations in the epidermal growth factor receptor, Kirsten rat sarcoma viral oncogene, and tumor suppressor p53 genomes collected as biomarkers from the IGDB.NSCLC corpus. The Voss DNA encoding was used to map the nucleotide sequences of mutated and normal genomes to obtain the equivalent numerical genomic sequences for training the selected classifiers. The histogram of oriented gradient (HOG) and local binary pattern (LBP) state-of-the-art feature extraction schemes were applied to extract representative genomic features from the encoded sequences of nucleotides. The ANN ensemble and HOG best fit the training dataset of this study with an accuracy of 95.90% and mean square error of 0.0159. The result of the ANN ensemble and HOG genomic features is promising for automated screening and early detection of lung cancer. This will hopefully assist pathologists in administering targeted molecular therapy and offering counsel to early stage lung cancer patients and persons in at-risk populations. PMID:25802891
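
    The Voss encoding step is simple to illustrate: each nucleotide sequence becomes four binary indicator rows, one per base, turning the symbolic genome into a numerical signal that feature extractors such as HOG or LBP can operate on. A minimal sketch:

```python
import numpy as np

def voss_encode(seq):
    """Voss encoding: map a nucleotide string to four binary indicator
    sequences (rows ordered A, C, G, T)."""
    seq = seq.upper()
    return np.array([[1 if c == b else 0 for c in seq] for b in "ACGT"])

x = voss_encode("ATGGCA")
print(x)
# [[1 0 0 0 0 1]
#  [0 0 0 0 1 0]
#  [0 0 1 1 0 0]
#  [0 1 0 0 0 0]]
```

Exactly one row is 1 at each position, so column sums are always 1.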

  1. Research of automatic counting paper money technology based on two-dimensional histogram θ-division

    NASA Astrophysics Data System (ADS)

    Liu, Yongze; Meng, Qingshen; Song, Xuejun; Li, Aiting

    2011-12-01

    At present, money counters are the most common technology for counting banknotes in the financial sector. This paper presents a new method for automatically counting paper money based on image processing. First, an image of the paper money is acquired by a CCD camera. Analysis of the image features shows that the edge of each note is enhanced in Cr-space. We then use the north-west Sobel operator for filtering and the north Sobel operator for edge detection. Although the processed image highlights the edge of each note better, the edge is rough and its variance is high, so it is difficult to threshold the image to obtain a linked single-pixel edge. After evaluating different segmentation algorithms for extracting the note edges, we found the two-dimensional histogram θ-division algorithm suitable for our purpose. The experimental results are satisfactory: the detection rate reached 100% for RMB in a controlled environment. However, several problems remain to be solved before other kinds of paper money, such as the US dollar, can be detected.

  2. Multicomponent adsorption in mesoporous flexible materials with flat-histogram Monte Carlo methods

    NASA Astrophysics Data System (ADS)

    Mahynski, Nathan A.; Shen, Vincent K.

    2016-11-01

    We demonstrate an extensible flat-histogram Monte Carlo simulation methodology for studying the adsorption of multicomponent fluids in flexible porous solids. This methodology allows us to easily obtain the complete free energy landscape for the confined fluid-solid system in equilibrium with a bulk fluid of any arbitrary composition. We use this approach to study the adsorption of a prototypical coarse-grained binary fluid in "Hookean" solids, where the free energy of the solid may be described as a simple spring. However, our approach is fully extensible to solids with arbitrarily complex free energy profiles. We demonstrate that by tuning the fluid-solid interaction ranges, the inhomogeneous fluid structure inside the pore can give rise to enhanced selective capture of a larger species through cooperative adsorption with a smaller one. The maximum enhancement in selectivity is observed at low to intermediate pressures and is especially pronounced when the larger species is very dilute in the bulk. This suggests a mechanism by which the selective capture of a minor component from a bulk fluid may be enhanced.

  3. Computationally efficient multidimensional analysis of complex flow cytometry data using second order polynomial histograms.

    PubMed

    Zaunders, John; Jing, Junmei; Leipold, Michael; Maecker, Holden; Kelleher, Anthony D; Koch, Inge

    2016-01-01

    Many methods have been described for automated clustering analysis of complex flow cytometry data, but so far the goal to efficiently estimate multivariate densities and their modes for a moderate number of dimensions and potentially millions of data points has not been attained. We have devised a novel approach to describing modes using second order polynomial histogram estimators (SOPHE). The method divides the data into multivariate bins and determines the shape of the data in each bin based on second order polynomials, which is an efficient computation. These calculations yield local maxima and allow joining of adjacent bins to identify clusters. The use of second order polynomials also optimally uses wide bins, such that in most cases each parameter (dimension) need only be divided into 4-8 bins, again reducing computational load. We have validated this method using defined mixtures of up to 17 fluorescent beads in 16 dimensions, correctly identifying all populations in data files of 100,000 beads in <10 s, on a standard laptop. The method also correctly clustered granulocytes, lymphocytes, including standard T, B, and NK cell subsets, and monocytes in 9-color stained peripheral blood, within seconds. SOPHE successfully clustered up to 36 subsets of memory CD4 T cells using differentiation and trafficking markers, in 14-color flow analysis, and up to 65 subpopulations of PBMC in 33-dimensional CyTOF data, showing its usefulness in discovery research. SOPHE has the potential to greatly increase efficiency of analysing complex mixtures of cells in higher dimensions.

  4. Histogram Analysis of CT Perfusion of Hepatocellular Carcinoma for Predicting Response to Transarterial Radioembolization: Value of Tumor Heterogeneity Assessment

    SciTech Connect

    Reiner, Caecilia S.; Gordic, Sonja; Puippe, Gilbert; Morsbach, Fabian; Wurnig, Moritz; Schaefer, Niklaus; Veit-Haibach, Patrick; Pfammatter, Thomas; Alkadhi, Hatem

    2016-03-15

    Purpose: To evaluate, in patients with hepatocellular carcinoma (HCC), whether assessment of tumor heterogeneity by histogram analysis of computed tomography (CT) perfusion helps predict response to transarterial radioembolization (TARE). Materials and Methods: Sixteen patients (15 male; mean age 65 years; age range 47–80 years) with HCC underwent CT liver perfusion for treatment planning prior to TARE with Yttrium-90 microspheres. Arterial perfusion (AP) derived from CT perfusion was measured in the entire tumor volume, and heterogeneity was analyzed voxel-wise by histogram analysis. Response to TARE was evaluated on follow-up imaging (median follow-up, 129 days) based on the modified Response Evaluation Criteria in Solid Tumors (mRECIST). Results of histogram analysis and mean AP values of the tumor were compared between responders and non-responders. Receiver operating characteristics were calculated to determine the parameters’ ability to discriminate responders from non-responders. Results: According to mRECIST, 8 patients (50%) were responders and 8 (50%) non-responders. Comparing responders and non-responders, the 50th and 75th percentiles of AP derived from histogram analysis were significantly different (AP 43.8/54.3 vs. 27.6/34.3 mL min⁻¹ 100 mL⁻¹; p < 0.05), while the mean AP of HCCs (43.5 vs. 27.9 mL min⁻¹ 100 mL⁻¹; p > 0.05) was not. Further heterogeneity parameters from histogram analysis (skewness, coefficient of variation, and 25th percentile) did not differ between responders and non-responders (p > 0.05). If the cut-off for the 75th percentile was set to an AP of 37.5 mL min⁻¹ 100 mL⁻¹, therapy response could be predicted with a sensitivity of 88% (7/8) and specificity of 75% (6/8). Conclusion: Voxel-wise histogram analysis of pretreatment CT perfusion, indicating tumor heterogeneity of HCC, improves the pretreatment prediction of response to TARE.
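
    The reported decision rule is simple to state in code. The sketch below uses the abstract's 75th-percentile cutoff of 37.5 mL min⁻¹ 100 mL⁻¹; the function name and synthetic voxel data are invented for illustration.

```python
import numpy as np

def predict_response(ap_voxels, cutoff=37.5, percentile=75):
    """Predicted responder if the chosen percentile of the tumour's
    voxel-wise arterial-perfusion (AP) histogram exceeds the cutoff."""
    return np.percentile(ap_voxels, percentile) > cutoff

rng = np.random.default_rng(7)
responder = rng.normal(44, 12, 5000)       # heterogeneous, high-AP tumour
nonresponder = rng.normal(28, 8, 5000)
print(predict_response(responder), predict_response(nonresponder))
```

Using a high percentile rather than the mean is what makes the rule sensitive to heterogeneity: a hot sub-volume shifts the 75th percentile even when the mean AP does not separate the groups.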

  5. Equality Assurance: Self-Assessment for Equal Opportunities in Further Education.

    ERIC Educational Resources Information Center

    Dadzie, Stella, Comp.

    This manual is intended as a tool kit for further education (FE) colleges to use to develop their own approaches to equal opportunities policy development and implementation. The following topics are discussed in the eight sections: the manual's development; the case for equality; things an equal opportunities policy should cover; strategic and…

  6. Equality Hypocrisy, Inconsistency, and Prejudice: The Unequal Application of the Universal Human Right to Equality.

    PubMed

    Abrams, Dominic; Houston, Diane M; Van de Vyver, Julie; Vasiljevic, Milica

    2015-02-01

    In Western culture, there appears to be widespread endorsement of Article 1 of the Universal Declaration of Human Rights (which stresses equality and freedom). But do people really apply their equality values equally, or are their principles and application systematically discrepant, resulting in equality hypocrisy? The present study, conducted with a representative national sample of adults in the United Kingdom (N = 2,895), provides the first societal test of whether people apply their value of "equality for all" similarly across multiple types of status minority (women, disabled people, people aged over 70, Blacks, Muslims, and gay people). Drawing on theories of intergroup relations and stereotyping, we examined, in relation to each of these groups, respondents' judgments of how important it is to satisfy their particular wishes, whether there should be greater or reduced equality of employment opportunities, and feelings of social distance. The data revealed a clear gap between general equality values and responses to these specific measures. Respondents prioritized equality more for "paternalized" groups (targets of benevolent prejudice: women, disabled, over 70) than others (Black people, Muslims, and homosexual people), demonstrating significant inconsistency. Respondents who valued equality more, or who expressed higher internal or external motivation to control prejudice, showed greater consistency in applying equality. However, even respondents who valued equality highly showed significant divergence in their responses to paternalized versus nonpaternalized groups, revealing a degree of hypocrisy. Implications for strategies to promote equality and challenge prejudice are discussed.

  7. Kalman Filtering Approach to Blind Equalization

    DTIC Science & Technology

    1993-12-01

    [The indexed abstract is unavailable; the record consists of OCR fragments from the report's front matter. Recoverable details: a Naval Postgraduate School thesis (Monterey, California, December 1993) by Mehmet Kutlu, "Kalman Filtering Approach to Blind Equalization", including a section on the admissibility of blind equalization by Kalman filtering.]

  8. Jarzynski equality for quantum stochastic maps.

    PubMed

    Rastegin, Alexey E; Życzkowski, Karol

    2014-01-01

    Jarzynski equality and related fluctuation theorems can be formulated for various setups. Such an equality was recently derived for nonunitary quantum evolutions described by unital quantum operations, i.e., for completely positive, trace-preserving maps, which preserve the maximally mixed state. We analyze here a more general case of arbitrary quantum operations on finite systems and derive the corresponding form of the Jarzynski equality. It contains a correction term due to nonunitality of the quantum map. Bounds for the relative size of this correction term are established and they are applied for exemplary systems subjected to quantum channels acting on a finite-dimensional Hilbert space.
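
    For orientation, the classical identity these quantum results generalize is the Jarzynski equality, which relates the average of the exponentiated nonequilibrium work W to the equilibrium free-energy difference ΔF at inverse temperature β:

```latex
\left\langle e^{-\beta W} \right\rangle = e^{-\beta \Delta F}
```

    For unital quantum channels the equality retains this form; the abstract's point is that for a general (nonunital) quantum operation a correction term appears, whose relative size the paper bounds.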

  9. The EEOC's New Equal Pay Act Guidelines.

    ERIC Educational Resources Information Center

    Greenlaw, Paul S.; Kohl, John P.

    1982-01-01

    Analyzes the new guidelines for enforcement of the Equal Pay Act and their implications for personnel management. Argues that there are key problem areas in the new regulations arising from considerable ambiguity and uncertainty about their interpretation. (SK)

  10. Teaching the Economics of Equal Opportunities.

    ERIC Educational Resources Information Center

    Ownby, Arnola C.; Rhea, Jeanine N.

    1990-01-01

    Focuses on equal opportunities--in education and pay, and freedom from gender bias--for individuals and business organizations. Suggests that business educators can expand the implications to include ethnic-based inequalities as well. (JOW)

  11. Equal Remuneration Convention (ILO No. 100).

    PubMed

    1989-01-01

    The government of Uruguay ratified this UN International Labor Organization convention on equal remuneration on November 16, 1989, and the Government of Zimbabwe ratified this Convention on December 14, 1989.

  12. The Bakke Opinions and Equal Protection Doctrine.

    ERIC Educational Resources Information Center

    Karst, Kenneth L.; Horowitz, Harold W.

    1979-01-01

    Constitutional issues addressed in the Supreme Court's decision are reviewed. The opinions rendered by Justice Powell are viewed as reflections of the weakness of recent equal protection theory, and as signs of future doctrine. (GC)

  13. Equalization of nonlinear transmission impairments by maximum-likelihood-sequence estimation in digital coherent receivers.

    PubMed

    Khairuzzaman, Md; Zhang, Chao; Igarashi, Koji; Katoh, Kazuhiro; Kikuchi, Kazuro

    2010-03-01

    We describe a successful introduction of maximum-likelihood-sequence estimation (MLSE) into digital coherent receivers together with finite-impulse response (FIR) filters in order to equalize both linear and nonlinear fiber impairments. The MLSE equalizer based on the Viterbi algorithm is implemented in the offline digital signal processing (DSP) core. We transmit 20-Gbit/s quadrature phase-shift keying (QPSK) signals through a 200-km-long standard single-mode fiber. The bit-error rate performance shows that the MLSE equalizer outperforms the conventional adaptive FIR filter, especially when nonlinear impairments are predominant.
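
    The MLSE idea can be sketched with a textbook Viterbi sequence detector. The example below is a generic BPSK/known-FIR-channel illustration, not the paper's 20-Gbit/s QPSK DSP chain; all names are invented for illustration.

```python
import numpy as np
from itertools import product

def mlse_viterbi(y, h):
    """Maximum-likelihood sequence estimation over a known FIR channel
    (BPSK sketch).  Trellis states are the last L-1 symbols; the branch
    metric is the squared error against the noiseless channel output."""
    L = len(h)
    states = list(product([-1, 1], repeat=L - 1))
    cost = {s: 0.0 for s in states}
    paths = {s: [] for s in states}
    for yk in y:
        new_cost, new_paths = {}, {}
        for s in states:                      # s = (x[k-L+1], ..., x[k-1])
            for x in (-1, 1):
                full = s + (x,)               # oldest ... newest symbol
                yhat = sum(h[i] * full[L - 1 - i] for i in range(L))
                c = cost[s] + (yk - yhat) ** 2
                ns = s[1:] + (x,)             # next state: shift in x
                if ns not in new_cost or c < new_cost[ns]:
                    new_cost[ns], new_paths[ns] = c, paths[s] + [x]
        cost, paths = new_cost, new_paths
    return paths[min(cost, key=cost.get)]     # survivor with lowest metric

h = [1.0, 0.5]                                # 2-tap ISI channel
rng = np.random.default_rng(0)
x = rng.choice([-1, 1], 50)
y = np.convolve(x, h)[:50] + rng.normal(0, 0.1, 50)
x_hat = mlse_viterbi(y, h)
print(np.mean(np.array(x_hat) == x))  # typically 1.0 at this SNR
```

Because the detector searches over symbol sequences rather than inverting the channel, it tolerates metrics from nonlinear channel models as well, which is what lets MLSE outperform a linear FIR equalizer when nonlinear impairments dominate.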

  14. Fast Equalization for Large Lithium Ion Batteries

    DTIC Science & Technology

    2008-09-01

    [The indexed abstract is unavailable; the record consists of OCR fragments from the report body and reference list. Recoverable details: the cell charger CH1 is a Cosel ZUS251205 DC-DC converter rated 5 VDC/4 ADC; cited works include a 1995 IEEE Applied Power Electronics paper on design considerations for charge equalization of an electric vehicle battery system and a 2006 quasi-resonant zero-current-switching bidirectional converter for battery equalization applications.]

  15. [Gender equality activity in the Bioimaging Society].

    PubMed

    Suzaki, Etsuko

    2013-09-01

    Gender equality activity in the Bioimaging Society was initiated in 2005, when the society joined the Japan Inter-Society Liaison Association Committee for Promoting Equal Participation of Men and Women in Science and Engineering (EPMEWSE). The Gender Equality Committee of the Bioimaging Society is acting on this issue by following the policy of the EPMEWSE, and has also been planning and conducting lectures at annual meetings of the society to gain the understanding, consent, and cooperation of the society's members and to raise their awareness of gender equality. Women's participation in the society has been promoted through the activities of the Gender Equality Committee, and the number of women officers in the society has increased from two out of 40 members in 2005 to five out of 44 in 2013. The activities of the Gender Equality Committee of the Japanese Association of Anatomists (JAA) have just started. There are more than 400 women belonging to the JAA. When these women members join together and collaborate, women's participation in the JAA will increase.

  16. Dose-Volume Histogram Analysis of the Safety of Proton Beam Therapy for Unresectable Hepatocellular Carcinoma

    SciTech Connect

    Kawashima, Mitsuhiko; Kohno, Ryosuke; Nakachi, Kohei; Nishio, Teiji; Mitsunaga, Shuichi; Ikeda, Masafumi; Konishi, Masaru; Takahashi, Shinichiro; Gotohda, Naoto; Arahira, Satoko; Zenda, Sadamoto; Ogino, Takashi; Kinoshita, Taira

    2011-04-01

    Purpose: To evaluate the safety and efficacy of radiotherapy using proton beams (PRT) for unresectable hepatocellular carcinoma. Methods and Materials: Sixty consecutive patients who underwent PRT between May 1999 and July 2007 were analyzed. There were 42 males and 18 females, with a median age of 70 years (48-92 years). All but 1 patient had a single lesion with a median diameter of 45 mm (20-100 mm). Total PRT dose/fractionation was 76 cobalt Gray equivalent (CGE)/20 fractions in 46 patients, 65 CGE/26 fractions in 11 patients, and 60 CGE/10 fractions in 3 patients. The risk of developing proton-induced hepatic insufficiency (PHI) was estimated using dose-volume histograms and the indocyanine-green retention rate at 15 minutes (ICG R15). Results: None of the 20 patients with an ICG R15 of less than 20% developed PHI, whereas 6 of 8 patients with ICG R15 values of 50% or higher developed PHI. Among 32 patients whose ICG R15 ranged from 20% to 49.9%, PHI was observed only in patients in whom 30 CGE was delivered to more than 25% of the noncancerous part of the liver (V30 > 25%; n = 5). Local progression-free and overall survival rates at 3 years were 90% (95% confidence interval [CI], 80-99%) and 56% (95% CI, 43-69%), respectively. Gastrointestinal toxicity of Grade ≥2 was observed in 3 patients. Conclusions: ICG R15 and V30 are recommended as useful predictors of the risk of developing PHI, which should be incorporated into multidisciplinary treatment plans for patients with this disease.

  17. Assessment of Autonomic Function by Phase Rectification of RR-Interval Histogram Analysis in Chagas Disease

    PubMed Central

    Nasari-Junior, Olivassé; Benchimol-Barbosa, Paulo Roberto; Pedrosa, Roberto Coury; Nadal, Jurandir

    2015-01-01

    Background: In chronic Chagas disease (ChD), impairment of cardiac autonomic function bears prognostic implications. Phase-rectification of an RR-interval series isolates the sympathetic, acceleration-phase (AC) and parasympathetic, deceleration-phase (DC) influences on cardiac autonomic modulation. Objective: This study investigated heart rate variability (HRV) as a function of RR-interval to assess autonomic function in healthy and ChD subjects. Methods: Control (n = 20) and ChD (n = 20) groups were studied. All underwent a 60-min head-up tilt table test under ECG recording. The histogram of the RR-interval series was calculated, with 100 ms classes ranging from 600–1100 ms. In each class, the mean RR-interval (MNN) and the root-mean-squared difference (RMSNN) of consecutive normal RR-intervals belonging to that class were calculated. The average of all RMSNN values in each class was analyzed as a function of MNN, in the whole series (RMSNNT) and in the AC (RMSNNAC) and DC (RMSNNDC) phases. Slopes of the linear regression lines were compared between groups using Student's t-test. Correlation coefficients were tested before comparisons. RMSNN was log-transformed (α < 0.05). Results: The correlation coefficient was significant in all regressions (p < 0.05). In the control group, RMSNNT, RMSNNAC, and RMSNNDC increased linearly and significantly with MNN (p < 0.05). In ChD, only RMSNNAC showed a significant increase as a function of MNN, whereas RMSNNT and RMSNNDC did not. Conclusion: HRV increases in proportion with the RR-interval in healthy subjects. This behavior is lost in ChD, particularly in the DC phase, indicating cardiac vagal incompetence. PMID:26131700
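
    The class-wise RMSNN-versus-MNN computation can be sketched as follows. This is a simplification: the AC/DC phase separation is omitted, and the synthetic RR series is invented to mimic the healthy-control behaviour where beat-to-beat variability grows with RR interval.

```python
import numpy as np

def rmsnn_by_class(rr, lo=600, hi=1100, width=100):
    """RMSNN (root-mean-squared successive difference, ms) of an RR series
    per 100-ms RR class; each consecutive-interval difference is assigned
    to the class of the first interval of the pair.  Returns columns
    (MNN, RMSNN) for every populated class."""
    rr = np.asarray(rr, float)
    diffs = np.diff(rr)
    out = []
    for c0 in range(lo, hi, width):
        m = (rr[:-1] >= c0) & (rr[:-1] < c0 + width)
        if m.sum() > 1:
            out.append((rr[:-1][m].mean(), np.sqrt(np.mean(diffs[m] ** 2))))
    return np.array(out)

# Synthetic series: slow baseline drift plus jitter whose size grows with RR.
rng = np.random.default_rng(5)
rr_base = 850 + 200 * np.sin(np.linspace(0, 40 * np.pi, 5000))
rr = rr_base + rng.normal(0, (rr_base - 550) / 20)
mnn_rmsnn = rmsnn_by_class(rr)
slope = np.polyfit(mnn_rmsnn[:, 0], np.log(mnn_rmsnn[:, 1]), 1)[0]
print(slope > 0)  # True: log RMSNN increases with MNN, the healthy pattern
```

In the study's framing, a flat (non-significant) slope of log RMSNN versus MNN, especially in the deceleration phase, is the marker of vagal incompetence.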

  18. Adaboost face detector based on Joint Integral Histogram and Genetic Algorithms for feature extraction process.

    PubMed

    Jammoussi, Ameni Yangui; Ghribi, Sameh Fakhfakh; Masmoudi, Dorra Sellami

    2014-01-01

    Recently, many classes of objects have become efficiently detectable by machine learning techniques. In practice, boosting techniques are among the most widely used machine learning methods, mainly due to the low false-positive rate of the cascade structure, which can be trained for different object classes. Boosting is especially associated with face detection, the most popular sub-problem within object detection. The challenges of an Adaboost-based face detector include selecting the most relevant features, which serve as weak classifiers, from a large feature set. In many scenarios, however, selecting features solely by lowering classification error leads to high computational complexity and excessive memory use. In this work, we propose a new method to train an effective detector by discarding redundant weak classifiers while achieving the pre-determined learning objective. To achieve this, on the one hand, we modify AdaBoost training so that the feature selection process is no longer based on the weak learner's training error; we do this by incorporating a Genetic Algorithm (GA) into the training process. On the other hand, we make use of the Joint Integral Histogram in order to extract more powerful features. Experiments on human faces show that our proposed method requires a smaller number of weak classifiers than the conventional learning algorithm, resulting in higher learning and faster classification rates. Our method thus significantly outperforms state-of-the-art cascade methods in terms of detection rate and false-positive rate, and especially in reducing the number of weak classifiers per stage.

  19. Clinical dose-volume histogram analysis in predicting radiation pneumonitis in Hodgkin's lymphoma

    SciTech Connect

    Koh, Eng-Siew; Sun, Alexander, E-mail: alex.sun@rmp.uhn.on.ca; Tu Huan Tran; Tsang, Richard; Pintilie, Melania; Hodgson, David C.; Wells, Woodrow; Heaton, Robert; Gospodarowicz, Mary K.

    2006-09-01

    Purpose: To quantify the incidence of radiation pneumonitis (RP) in a modern Hodgkin's lymphoma (HL) cohort, and to identify any clinically relevant parameters that may influence the risk of RP. Methods and Materials: Between January 2003 and February 2005, 64 consecutive HL patients aged 18 years or older receiving radical mediastinal radiation therapy (RT) were retrospectively reviewed. Symptomatic cases of radiation pneumonitis were identified. Dose-volume histogram parameters, including V13, V20, V30, and mean lung dose (MLD), were quantified. Results: At a median follow-up of 2.1 years, the actuarial survival for all patients was 91% at 3 years. There were 2 (2/64) cases of Radiation Therapy Oncology Group (RTOG) Grade 2 RP (incidence 3.1%). Both index cases, with corresponding V20 values of 47.0% and 40.7%, were located in the upper quartile (2/16 cases) defined by a V20 value of ≥36%, an incidence of 12.5% (p = 0.03). Similarly for total MLD, both index cases, with values of 17.6 Gy and 16.4 Gy, respectively, were located in the upper quartile defined by MLD ≥14.2 Gy, an incidence of 11.8% (2/17 cases, p = 0.02). Conclusions: Despite relatively high V20 values in this study of HL patients, the incidence of RP was only 3%, lower than in the lung cancer literature. We suggest the following clinically relevant parameters be considered in treatment plan assessment: a V20 greater than 36% and an MLD greater than 14 Gy, over and above which the risk of RTOG Grade 2 or greater RP would be considered clinically significant.

  20. Multimodal registration of SD-OCT volumes and fundus photographs using histograms of oriented gradients

    PubMed Central

    Miri, Mohammad Saleh; Abràmoff, Michael D.; Kwon, Young H.; Garvin, Mona K.

    2016-01-01

    With the availability of different retinal imaging modalities such as fundus photography and spectral-domain optical coherence tomography (SD-OCT), a robust and accurate registration scheme that enables utilization of this complementary information is beneficial. The few existing fundus-OCT registration approaches contain a vessel segmentation step, as the retinal blood vessels are the most dominant structures that are in common between the pair of images. However, errors in the vessel segmentation from either modality may cause corresponding errors in the registration. In this paper, we propose a feature-based registration method for registering fundus photographs and SD-OCT projection images that benefits from vasculature structural information without requiring blood vessel segmentation. In particular, after a preprocessing step, a set of control points (CPs) are identified by looking for the corners in the images. Next, each CP is represented by a feature vector which encodes the local structural information via computing the histograms of oriented gradients (HOG) from the neighborhood of each CP. The best matching CPs are identified by calculating the distance of their corresponding feature vectors. After removing the incorrect matches, the best affine transform that registers fundus photographs to SD-OCT projection images is computed using the random sample consensus (RANSAC) method. The proposed method was tested on 44 pairs of fundus and SD-OCT projection images of glaucoma patients, and the results showed that the proposed method successfully registers the multimodal images and produced a registration error of 25.34 ± 12.34 μm (0.84 ± 0.41 pixels). PMID:28018740
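
    The HOG building block can be sketched in a few lines. Below is a minimal single-patch descriptor (unsigned orientations, one cell, L2 normalization), not the paper's full control-point-neighborhood pipeline; all names are illustrative.

```python
import numpy as np

def hog_patch(patch, n_bins=9):
    """Histogram of oriented gradients over one image patch (minimal
    sketch): unsigned gradient orientations in [0, pi), magnitude-weighted
    votes, L2-normalized.  Matching then compares these vectors by
    Euclidean distance."""
    gy, gx = np.gradient(patch.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.mod(np.arctan2(gy, gx), np.pi)             # fold to [0, pi)
    bins = np.minimum((ang / np.pi * n_bins).astype(int), n_bins - 1)
    h = np.bincount(bins.ravel(), weights=mag.ravel(), minlength=n_bins)
    return h / (np.linalg.norm(h) + 1e-12)

# A vertical intensity ramp: all gradients point along one orientation,
# so a single bin dominates the histogram.
patch = np.tile(np.arange(16.0), (16, 1)).T
h = hog_patch(patch)
print(h.argmax())
```

Because the descriptor is built from gradient orientations around each CP, it captures the local vessel geometry without ever segmenting the vessels.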

  1. Three-Dimensional Object Recognition and Registration for Robotic Grasping Systems Using a Modified Viewpoint Feature Histogram.

    PubMed

    Chen, Chin-Sheng; Chen, Po-Chun; Hsu, Chih-Ming

    2016-11-23

    This paper presents a novel 3D feature descriptor for object recognition and six-degree-of-freedom pose identification for mobile manipulation and grasping applications. Firstly, a Microsoft Kinect sensor is used to capture 3D point cloud data. A viewpoint feature histogram (VFH) descriptor for the 3D point cloud data then encodes the geometry and viewpoint, so an object can be simultaneously recognized and registered in a stable pose, and the information stored in a database. The VFH is robust to a large degree of surface noise and missing depth information, so it is reliable for stereo data. However, pose estimation fails when the object is placed symmetrically to the viewpoint. To overcome this problem, this study proposes a modified viewpoint feature histogram (MVFH) descriptor that consists of two parts: a surface shape component comprising an extended fast point feature histogram, and an extended viewpoint direction component. The MVFH descriptor characterizes an object's pose and enhances the system's ability to identify objects with mirrored poses. Finally, once the object has been recognized, its pose roughly estimated by the MVFH descriptor, and a match registered in the database, the pose is refined using the iterative closest point (ICP) algorithm. The estimation results demonstrate that the MVFH feature descriptor allows more accurate pose estimation. The experiments also show that the proposed method can be applied in vision-guided robotic grasping systems.
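The viewpoint direction component that VFH-style descriptors add can be illustrated as a histogram of angles between surface normals and the centroid-to-viewpoint direction. A simplified sketch (hypothetical `viewpoint_component`, not the paper's MVFH implementation, which also includes the surface-shape component):

```python
import numpy as np

def viewpoint_component(normals, centroid, viewpoint, bins=8):
    """Histogram of angles between each surface normal and the direction
    from the cloud centroid to the sensor viewpoint, normalized to sum to 1."""
    v = viewpoint - centroid
    v = v / np.linalg.norm(v)
    normals = normals / np.linalg.norm(normals, axis=1, keepdims=True)
    cosang = np.clip(normals @ v, -1.0, 1.0)
    hist, _ = np.histogram(np.arccos(cosang), bins=bins, range=(0, np.pi))
    return hist / hist.sum()

# Toy cloud: every normal faces the viewpoint, so all mass lands in bin 0.
normals = np.tile([0.0, 0.0, 1.0], (100, 1))
h = viewpoint_component(normals, centroid=np.zeros(3),
                        viewpoint=np.array([0.0, 0.0, 5.0]))
print(h[0])  # 1.0
```

Because the histogram is taken relative to the viewpoint direction, mirrored poses produce different angle distributions, which is what lets the descriptor disambiguate them.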

  2. Three-Dimensional Object Recognition and Registration for Robotic Grasping Systems Using a Modified Viewpoint Feature Histogram

    PubMed Central

    Chen, Chin-Sheng; Chen, Po-Chun; Hsu, Chih-Ming

    2016-01-01

    This paper presents a novel 3D feature descriptor for object recognition and six-degree-of-freedom pose identification for mobile manipulation and grasping applications. Firstly, a Microsoft Kinect sensor is used to capture 3D point cloud data. A viewpoint feature histogram (VFH) descriptor for the 3D point cloud data then encodes the geometry and viewpoint, so an object can be simultaneously recognized and registered in a stable pose, and the information stored in a database. The VFH is robust to a large degree of surface noise and missing depth information, so it is reliable for stereo data. However, pose estimation fails when the object is placed symmetrically to the viewpoint. To overcome this problem, this study proposes a modified viewpoint feature histogram (MVFH) descriptor that consists of two parts: a surface shape component comprising an extended fast point feature histogram, and an extended viewpoint direction component. The MVFH descriptor characterizes an object's pose and enhances the system's ability to identify objects with mirrored poses. Finally, once the object has been recognized, its pose roughly estimated by the MVFH descriptor, and a match registered in the database, the pose is refined using the iterative closest point (ICP) algorithm. The estimation results demonstrate that the MVFH feature descriptor allows more accurate pose estimation. The experiments also show that the proposed method can be applied in vision-guided robotic grasping systems. PMID:27886080

  3. Combining a modified vector field histogram algorithm and real-time image processing for unknown environment navigation

    NASA Astrophysics Data System (ADS)

    Nepal, Kumud; Fine, Adam; Imam, Nabil; Pietrocola, David; Robertson, Neil; Ahlgren, David J.

    2009-01-01

    Q is an unmanned ground vehicle designed to compete in the Autonomous and Navigation Challenges of the AUVSI Intelligent Ground Vehicle Competition (IGVC). Built on a base platform of a modified Permobil Trax off-road wheelchair frame, and running off a Dell Inspiron D820 laptop with an Intel T7400 Core 2 Duo processor, Q gathers information from a SICK laser range finder (LRF), video cameras, differential GPS, and a digital compass to localize its behavior and map out its navigational path. This behavior is handled by intelligent closed-loop speed control and robust sensor data processing algorithms. In the Autonomous Challenge, data taken from two IEEE 1394 cameras and the LRF are integrated and plotted on a custom-defined occupancy grid and converted into a histogram, which is analyzed for openings between obstacles. The image processing algorithm consists of a series of steps involving plane extraction, normalization of the image histogram for effective dynamic thresholding, texture and morphological analysis, and particle filtering to allow optimum operation under varying ambient conditions. In the Navigation Challenge, a modified Vector Field Histogram (VFH) algorithm is combined with an auto-regressive path planning model for obstacle avoidance and better localization. Q also features Joint Architecture for Unmanned Systems (JAUS) Level 3 compliance. All algorithms are developed and implemented using National Instruments (NI) hardware and LabVIEW software. The paper will focus on explaining the various algorithms that make up Q's intelligence and the different ways and modes of their implementation.
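The Vector Field Histogram idea referenced above, converting an occupancy grid into a polar obstacle-density histogram and searching it for openings, can be sketched as follows (hypothetical `polar_histogram` and `open_sectors`; the distance weighting and threshold are illustrative, not Q's actual parameters):

```python
import numpy as np

def polar_histogram(occupancy, robot_rc, sectors=36):
    """VFH-style polar obstacle-density histogram around the robot cell.
    Cell values are obstacle certainties in [0, 1]; nearer cells weigh more."""
    rows, cols = np.nonzero(occupancy > 0)
    r0, c0 = robot_rc
    angles = np.arctan2(rows - r0, cols - c0) % (2 * np.pi)
    dists = np.hypot(rows - r0, cols - c0)
    weights = occupancy[rows, cols] / np.maximum(dists, 1.0)
    hist, _ = np.histogram(angles, bins=sectors, range=(0, 2 * np.pi),
                           weights=weights)
    return hist

def open_sectors(hist, threshold=0.05):
    """Candidate openings: sectors whose obstacle density is below threshold."""
    return np.flatnonzero(hist < threshold)

grid = np.zeros((21, 21))
grid[10, 15:] = 1.0                      # wall due "east" of the robot
h = polar_histogram(grid, (10, 10))
print(0 in open_sectors(h))              # False: the east sector is blocked
```

A steering controller would then pick the open sector closest to the goal heading, which is where Q's auto-regressive path planning model comes in.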

  4. Criteria for equality in two entropic inequalities

    SciTech Connect

    Shirokov, M. E.

    2014-07-31

    We obtain a simple criterion for local equality between the constrained Holevo capacity and the quantum mutual information of a quantum channel. This shows that the set of all states for which this equality holds is determined by the kernel of the channel (as a linear map). Applications to Bosonic Gaussian channels are considered. It is shown that for a Gaussian channel having no completely depolarizing components the above characteristics may coincide only at non-Gaussian mixed states and a criterion for the existence of such states is given. All the obtained results may be reformulated as conditions for equality between the constrained Holevo capacity of a quantum channel and the input von Neumann entropy. Bibliography: 20 titles. (paper)

  5. All Are Equal, but Some Are More Equal than Others: Managerialism and Gender Equality in Higher Education in Comparative Perspective

    ERIC Educational Resources Information Center

    Teelken, Christine; Deem, Rosemary

    2013-01-01

    The main purpose of this paper is to investigate what impact new regimes of management and governance, including new managerialism, have had on perceptions of gender equality at universities in three Western European countries. While in accordance with national laws and EU directives, contemporary management approaches in universities…

  6. [Forms of histograms constructed from measurements of alpha-decay of 228Ra in Lindau (Germany) and neutron fluxes in Moscow change synchronously according to the local time].

    PubMed

    Zenchenko, K I; Zenchenko, T A; Kuzhevskiĭ, B M; Vilken, B; Axford, Y; Shnol', S E

    2001-01-01

    In joint experiments performed at the Max Planck Institute of Aeronomy (Germany) and the Institute of Theoretical and Experimental Biophysics in Pushchino, the main manifestations of the phenomenon of macroscopic fluctuations were confirmed. An increased probability of similarity in synchronous histograms was shown in independent measurements performed by two installations in one laboratory and in two laboratories separated by a distance of 2000 km. In the latter case, the similarity of histograms is most probable at the same local time.

  7. Existing Data Format for Two-Parameter Beta-Gamma Histograms for Radioxenon

    SciTech Connect

    TW Bowyer; TR Heimbigner; JI McIntyre; AD McKinnon; PL Reeder; E Wittinger

    1999-03-23

    There is a need to establish a commonly acceptable format for storing beta-gated coincidence data for stations in the International Monitoring System (IMS) for the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The current aerosol RMS-type data format is not applicable to radioxenon in that it contains implicit assumptions specific to conventional gamma-ray spectrometry. Some assumptions in the current RMS format are not acceptable for the beta-gated spectra expected from the U.S. Department of Energy PNNL Automated Radioxenon Sampler-Analyzer (ARSA) and other similar systems in use or under development in various countries. The RMS data format is not generally applicable for radioxenon measurements in the CTBT for one or more of the following main reasons: 1) The RMS format does not currently support 2-dimensional data. That is, the RMS data format is set up for a simple 1-dimensional gamma-ray energy histogram. Current data available from the ARSA system, and planned for other radioxenon monitors, include spectral information from gamma rays and betas/conversion electrons. It is worth noting that the beta/conversion electron energy information will be used to separate the contributions from the different radioxenons. 2) The RMS data format assumes that the conversion between counts and activity can be calculated based (in part) on a simple calibration curve (detector efficiency curve) that depends only on the energy of the gamma ray. In the case of beta-gated gamma-ray spectra and for 2-dimensional spectra, there are generally two detector calibration curves that must be convoluted, the lower energy cutoff for the betas must be considered, and the energy acceptance window must be taken into account to convert counts into activity. 3) The RMS format has header information that contains aerosol-specific information that allows the activity (Bq) calculated to be converted into a concentration (Bq/SCM). This calculation is performed by dividing the

  8. A Substituting Meaning for the Equals Sign in Arithmetic Notating Tasks

    ERIC Educational Resources Information Center

    Jones, Ian; Pratt, Dave

    2012-01-01

    Three studies explore arithmetic tasks that support both substitutive and basic relational meanings for the equals sign. The duality of meanings enabled children to engage meaningfully and purposefully with the structural properties of arithmetic statements in novel ways. Some, but not all, children were successful at the adapted task and were…

  9. Histogram analysis of apparent diffusion coefficient for monitoring early response in patients with advanced cervical cancers undergoing concurrent chemo-radiotherapy.

    PubMed

    Meng, Jie; Zhu, Lijing; Zhu, Li; Ge, Yun; He, Jian; Zhou, Zhengyang; Yang, Xiaofeng

    2017-01-01

    Background: Apparent diffusion coefficient (ADC) histogram analysis has been widely used in determining tumor prognosis. Purpose: To investigate the dynamic changes of ADC histogram parameters during concurrent chemo-radiotherapy (CCRT) in patients with advanced cervical cancers. Material and Methods: This prospective study enrolled 32 patients with advanced cervical cancers undergoing CCRT who received diffusion-weighted (DW) magnetic resonance imaging (MRI) before CCRT, at the end of the second and fourth week during CCRT, and one month after CCRT completion. The ADC histogram for the entire tumor volume was generated, and a series of histogram parameters was obtained. Dynamic changes of those parameters in cervical cancers were investigated as early biomarkers for treatment response. Results: All histogram parameters except AUClow showed significant changes during CCRT (all P < 0.05). There were three variable trends involving different parameters. The mode, 5th, 10th, and 25th percentiles showed similar early increase rates (33.33%, 33.99%, 34.12%, and 30.49%, respectively) at the end of the second week of CCRT. The pre-CCRT 5th and 25th percentiles of the complete response (CR) group were significantly lower than those of the partial response (PR) group. Conclusion: A series of ADC histogram parameters of cervical cancers changed significantly at the early stage of CCRT, indicating their potential in monitoring early tumor response to therapy.
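The whole-tumor histogram parameters reported above (mode and low percentiles) are straightforward to compute from the voxelwise ADC values. A minimal sketch with a hypothetical `adc_histogram_params` and synthetic data standing in for a real ADC map:

```python
import numpy as np

def adc_histogram_params(adc_values, percentiles=(5, 10, 25)):
    """Whole-tumor ADC histogram parameters: mean, mode (from a fixed-bin
    histogram), and selected low percentiles."""
    adc = np.asarray(adc_values, dtype=float)
    counts, edges = np.histogram(adc, bins=50)
    peak = int(np.argmax(counts))
    params = {"mean": float(adc.mean()),
              "mode": float(0.5 * (edges[peak] + edges[peak + 1]))}
    for p in percentiles:
        params[f"p{p}"] = float(np.percentile(adc, p))
    return params

rng = np.random.default_rng(1)
tumor_adc = rng.normal(1.2e-3, 2e-4, size=5000)   # synthetic ADC values (mm^2/s)
params = adc_histogram_params(tumor_adc)
print(params["p5"] < params["p10"] < params["p25"] < params["mean"])  # True
```

Early response would then be tracked as the percent change of each parameter between the pre-CCRT scan and the second-week scan.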

  10. Great Constitutional Ideas: Justice, Equality, and Property.

    ERIC Educational Resources Information Center

    Starr, Isidore

    1987-01-01

    Examines the ideas of justice, equality, and property as they are represented in the Declaration of Independence, the U.S. Constitution and the Bill of Rights. Discusses how these ideas affect the way public schools operate and the lessons educators teach or don't teach about our society. Includes ideas for classroom activities. (JDH)

  11. The Need for the Equal Rights Amendment

    ERIC Educational Resources Information Center

    Ginsburg, Ruth Bader

    1974-01-01

    The Equal Rights Amendment would dedicate the nation to a new view of the rights and responsibilities of men and women. It rejects sharp legislative lines between the sexes as constitutionally tolerable. The need for the Amendment is discussed in this article. (Author/RM)

  12. 76 FR 53807 - Women's Equality Day, 2011

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-29

    ... August 29, 2011 Part IV The President Proclamation 8699--Women's Equality Day, 2011 ... Presidential Documents ... Title 3--The President ... Proclamation 8699 of August 25, 2011 Women's... the United States Constitution tore down the last formal barrier to women's enfranchisement in...

  13. Equality by Default: Women in Dual Roles

    ERIC Educational Resources Information Center

    Campbell, Ena

    1978-01-01

    The struggle for an Equal Rights Amendment (ERA) to the American Constitution is one of the most controversial issues of this era. Discusses the changing role of women amidst a fast-changing society, the styles of those opposing the women's revolution, the debate over women as persons, women in dual roles, and the implications of ERA for the world…

  14. Gender Equality Policies and Higher Education Careers

    ERIC Educational Resources Information Center

    Berggren, Caroline

    2011-01-01

    Gender equality policies regulate the Swedish labour market, including higher education. This study analyses and discusses the career development of postgraduate students in the light of labour market influences. The principle of gender separation is used to understand these effects. Swedish register data encompassing information on 585…

  15. An American Perspective on Equal Educational Opportunities

    ERIC Educational Resources Information Center

    Russo, Charles; Perkins, Brian

    2004-01-01

    The United States Supreme Court ushered in a new era in American history on May 17, 1954 in its monumental ruling in "Brown v Board of Education," Topeka, Kansas. "Brown" is not only the Court's most significant decision on race and equal educational opportunities, but also ranks among the most important cases it has ever decided. In "Brown" a…

  16. EQUALS Investigations: Telling Someone Where To Go.

    ERIC Educational Resources Information Center

    Mayfield, Karen; Whitlow, Robert

    EQUALS is a teacher education program that helps elementary and secondary educators acquire methods and materials to attract minority and female students to mathematics. It supports a problem-solving approach to mathematics which has students working in groups, uses active assessment methods, and incorporates a broad mathematics curriculum…

  17. Three Utilities for the Equal Sign

    ERIC Educational Resources Information Center

    Jones, Ian; Pratt, Dave

    2005-01-01

    We compare the activity of young children using a microworld and a JavaScript relational calculator with the literature on children using traditional calculators. We describe how the children constructed different meanings for the equal sign in each setting. It appears that the nature of the meaning constructed is highly dependent on specificities…

  18. 45 CFR 98.43 - Equal access.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION CHILD CARE AND DEVELOPMENT FUND Program Operations (Child Care Services)-Lead Agency and Provider Requirements § 98.43 Equal access. (a) The Lead Agency shall certify that the payment rates for the provision of child care services under this part...

  19. 45 CFR 98.43 - Equal access.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Department of Health and Human Services GENERAL ADMINISTRATION CHILD CARE AND DEVELOPMENT FUND Program Operations (Child Care Services)-Lead Agency and Provider Requirements § 98.43 Equal access. (a) The Lead Agency shall certify that the payment rates for the provision of child care services under this part...

  20. 45 CFR 98.43 - Equal access.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION CHILD CARE AND DEVELOPMENT FUND Program Operations (Child Care Services)-Lead Agency and Provider Requirements § 98.43 Equal access. (a) The Lead Agency shall certify that the payment rates for the provision of child care services under this part...

  1. 45 CFR 98.43 - Equal access.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION CHILD CARE AND DEVELOPMENT FUND Program Operations (Child Care Services)-Lead Agency and Provider Requirements § 98.43 Equal access. (a) The Lead Agency shall certify that the payment rates for the provision of child care services under this part...

  2. 45 CFR 98.43 - Equal access.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION CHILD CARE AND DEVELOPMENT FUND Program Operations (Child Care Services)-Lead Agency and Provider Requirements § 98.43 Equal access. (a) The Lead Agency shall certify that the payment rates for the provision of child care services under this part...

  3. Educational Technology: A Presupposition of Equality?

    ERIC Educational Resources Information Center

    Orlando, Joanne

    2014-01-01

    The work of philosopher Jacques Rancière is used conceptually and methodologically to frame an exploration of the driving interests in educational technology policy and the sanctioning of particular discursive constructions of pedagogy that result. In line with Rancière's thinking, the starting point for this analysis is that of equality--that…

  4. Position Paper: NOx Measurement

    ERIC Educational Resources Information Center

    Hauser, Thomas R.; Shy, Carl M.

    1972-01-01

    Doubts about the accuracy of measured concentrations of nitrogen dioxide (NO2) in ambient air have led the Environmental Protection Agency to reassess both the analytical technique and the extent to which nitrogen oxides (NOx) control will need to satisfy federal laws. (BL)

  5. Disability in the UK: Measuring Equality

    ERIC Educational Resources Information Center

    Purdam, Kingsley; Afkhami, Reza; Olsen, Wendy; Thornton, Patricia

    2008-01-01

    In this article we identify the key survey data for examining the issue of equality in the lives of disabled people in the UK. Such data is essential for assessing change in quality of life over time and for the evaluation of the impact of policy initiatives. For each data source we consider definitions, data collection, issue coverage, sample…

  6. Some Are More Equal Than Others.

    ERIC Educational Resources Information Center

    Ohanian, Susan

    1998-01-01

    Contends that not all books are created equal--"Anna Karenina," for example, is worth more than Nancy Drew mysteries. Relates, in a personal narrative, that when this opinion was manifested in a newspaper column, hundreds of letters took issue with the idea. Reiterates that the literate teacher finds ways to convince students that…

  7. Sex Equality in Physical Education and Athletics.

    ERIC Educational Resources Information Center

    Burkett, Lucille M.

    Title IX legislated, among other things, equal educational opportunities for boys and girls in physical education. Although there are many practices which discriminate against girls' sports, and it is important to correct these, Title IX really calls for a fundamental change in all physical education programs to give each individual child the best…

  8. Gender Equality in Academia: A Critical Reflection

    ERIC Educational Resources Information Center

    Winchester, Hilary P. M.; Browning, Lynette

    2015-01-01

    Gender equality in academia has been monitored in Australia for the past three decades so it is timely to reflect on what progress has been made, what works, and what challenges remain. When data were first published on the gender composition of staff in Australian universities in the mid-1980s women comprised 20 per cent of academic staff and…

  9. Comparing Several Robust Tests of Stochastic Equality.

    ERIC Educational Resources Information Center

    Vargha, Andras; Delaney, Harold D.

    In this paper, six statistical tests of stochastic equality are compared with respect to Type I error and power through a Monte Carlo simulation. In the simulation, the skewness and kurtosis levels and the extent of variance heterogeneity of the two parent distributions were varied across a wide range. The sample sizes applied were either small or…

  10. The New Family Equality: Myth or Reality?

    ERIC Educational Resources Information Center

    Vanek, Joann

    The paper analyzes the work roles of husbands and wives in the 1970s and suggests policies to implement sex equality in the workplace and at home. Data reviewed in the paper support the structural-cultural view that work behavior both inside and outside the home is shaped by deeply embedded cultural and structural forces. In 1975, 41% of families…

  11. What Is Equality of Opportunity in Education?

    ERIC Educational Resources Information Center

    Lazenby, Hugh

    2016-01-01

    There is widespread disagreement about what equality of opportunity in education requires. For some it is that each child is legally permitted to go to school. For others it is that each child receives the same educational resources. Further interpretations abound. This fact presents a problem: when politicians or academics claim they are in…

  12. Racial Equality. To Protect These Rights Series.

    ERIC Educational Resources Information Center

    McDonald, Laughlin

    A historical review of racial discrimination against Negroes is the scope of this volume, part of a series of six volumes which explore the basic American rights. These include due process of law, freedom of speech and religious freedom. This volume traces the development of racial equality in the legal system, explores the controversies and…

  13. When Equal Masses Don't Balance

    ERIC Educational Resources Information Center

    Newburgh, Ronald; Peidle, Joseph; Rueckner, Wolfgang

    2004-01-01

    We treat a modified Atwood's machine in which equal masses do not balance because of being in an accelerated frame of reference. Analysis of the problem illuminates the meaning of inertial forces, d'Alembert's principle, the use of free-body diagrams and the selection of appropriate systems for the diagrams. In spite of the range of these…

  14. Equal Opportunity and Racial Differences in IQ.

    ERIC Educational Resources Information Center

    Fagan, Joseph F.; Holland, Cynthia R.

    2002-01-01

    Administered an intelligence test to blacks and whites in 2 studies involving 254 community college students and 2 more studies involving 115 community college students. Results show that differences in knowledge between blacks and whites for items on an intelligence test, the meanings of words, can be eliminated when equal opportunities for…

  15. Social Responsibility in Librarianship: Essays on Equality.

    ERIC Educational Resources Information Center

    MacCann, Donnarae, Ed.

    In a culturally complex world, librarians can best work toward the equalization of library services if they understand their institutions in the light of cultural history. The six essays in this book highlight problems that affect unempowered populations, and address a variety of cultural problems and biases--problems that contribute to the…

  16. EQUALS Investigations: Flea-Sized Surgeons.

    ERIC Educational Resources Information Center

    Mayfield, Karen; Whitlow, Robert

    EQUALS is a teacher education program that helps elementary and secondary educators acquire methods and materials to attract minority and female students to mathematics. It supports a problem-solving approach to mathematics which has students working in groups, uses active assessment methods, and incorporates a broad mathematics curriculum…

  17. Race Equality Scheme 2005-2008

    ERIC Educational Resources Information Center

    Her Majesty's Inspectorate of Education, 2005

    2005-01-01

    Her Majesty's Inspectorate of Education (HMIE) is strongly committed to promoting race equality in the way that HMIE staff go about performing their role within Scottish education. Scottish society reflects cultural, ethnic, religious and linguistic diversity and Scottish education should be accessible to all. No-one should be disadvantaged or…

  18. Evaluating Faculty Performance Under the Equal Pay for Equal Work Doctrine

    ERIC Educational Resources Information Center

    Buzan, Bert Carl; Hunt, Thomas Lynn

    1976-01-01

    Faculty promotion and salary policies at the University of Texas at Austin are analyzed to determine whether male and female faculty members are rewarded equally for equal academic qualifications and performances. This regression analysis tends to support the discrimination hypothesis with respect to both promotion and salary policies. (Author/LBH)

  19. Troubling Gender Equality: Revisiting Gender Equality Work in the Famous Nordic Model Countries

    ERIC Educational Resources Information Center

    Edström, Charlotta; Brunila, Kristiina

    2016-01-01

    This article concerns gender equality work, that is, those educational and workplace activities that involve the promotion of gender equality. It is based on research conducted in Sweden and Finland, and focuses on the period during which the public sector has become more market-oriented and project-based all over the Nordic countries. The…

  20. Equality in the Workplace. An Equal Opportunities Handbook for Trainers. Human Resource Management in Action Series.

    ERIC Educational Resources Information Center

    Collins, Helen

    This workbook, which is intended as a practical guide for human resource managers, trainers, and others concerned with developing and implementing equal opportunities training programs in British workplaces, examines issues in and methods for equal opportunities training. The introduction gives an overview of current training trends and issues.…

  1. Equal Opportunities and European Educational/Vocational Policy [and] Equal Opportunities and the European Social Fund.

    ERIC Educational Resources Information Center

    Brine, Jacky

    This document contains a symposium paper and a conference paper. "Equal Opportunities and European Educational and Vocational Policy" explores the symposium theme of concepts of difference as it relates directly to the European discourse of equal opportunities and its influence on European educational and vocational policy. It outlines…

  2. Equality of Opportunity in Higher Education - The Impact of Contract Compliance and the Equal Rights Amendment.

    ERIC Educational Resources Information Center

    Roberts, Sylvia

    In discussing the impact of contract compliance and the Equal Rights Amendment on equality of opportunity in higher education, the author focuses primarily on women employed as faculty members and staff at universities and colleges. The basic and fundamental fact is that women have been treated differently, and that is to say, less well, than men.…

  3. Review of pre-FFT equalization techniques and their application to 4G

    NASA Astrophysics Data System (ADS)

    Armour, Simon; Doufexi, Angela; Nix, Andrew; Beach, Mark; McGeehan, J.

    2001-11-01

    In this paper a review of the Pre-FFT Equalization technique is presented, with a particular focus on 4G applications. The essential concepts and motivations for the use of this technique are first presented. Subsequently, previous research on the topic, both by the authors and others, is reviewed. In particular, methods for implementing the Pre-FFT Equalizer itself and for adapting it are reviewed in detail. The issue of noise amplification and the use of Channel State Information in the COFDM system to mitigate this phenomenon are also discussed. Application of a Pre-FFT Equalizer to a possible COFDM-based 4G standard is then discussed, and software simulations are used to demonstrate the benefits that a Pre-FFT Equalizer can achieve in a 4G system.
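One common formulation of a pre-FFT equalizer is a time-domain FIR filter designed so that the cascade of channel and filter approximates a delayed impulse, shortening the effective channel before the receiver FFT. A least-squares sketch (illustrative only; `ls_pre_fft_equalizer` and the channel taps are assumptions, not taken from the paper):

```python
import numpy as np

def ls_pre_fft_equalizer(h, n_taps=16, delay=0):
    """Least-squares time-domain (pre-FFT) equalizer: FIR taps w such that
    conv(h, w) approximates a unit impulse at `delay`."""
    m = n_taps + len(h) - 1
    H = np.zeros((m, n_taps))
    for i in range(n_taps):
        H[i:i + len(h), i] = h            # columns = shifted copies of h
    d = np.zeros(m)
    d[delay] = 1.0                        # desired: delayed impulse
    w, *_ = np.linalg.lstsq(H, d, rcond=None)
    return w

h = np.array([1.0, 0.6, 0.3])             # hypothetical multipath channel
w = ls_pre_fft_equalizer(h)
eff = np.convolve(h, w)                   # effective channel after the equalizer
print(abs(eff[0]))                        # close to 1; the tail is suppressed
```

In an OFDM receiver this shortened effective channel lets a modest cyclic prefix absorb the residual dispersion; the noise-amplification concern discussed in the paper arises because the same filter also shapes the received noise.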

  4. A double-DD blind equalizer for PolMux QAM optical coherent systems

    NASA Astrophysics Data System (ADS)

    Zhou, Zhili; Zhan, Yiju; Ruan, Xiukai; Cai, Qibo; Cui, Guihua; Zhu, Guijun

    2017-01-01

    In a polarization-multiplexed (PolMux) coherent system, adaptive blind equalization is efficient for demultiplexing and for mitigating inter-symbol interference (ISI). We propose a double decision-directed (double-DD) blind algorithm that combines the output of the soft decision-directed (SDD) algorithm with that of the equalizer in variable step size decision-directed (VSS-DD) mode. In our scheme, the two equalizers switch softly when the mean-square error (MSE) is sufficiently low, achieving a fast convergence rate and a small steady-state MSE. Through simulations of PolMux-16QAM/64QAM coherent systems, we show that the double-DD algorithm outperforms existing blind equalizers.
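The decision-directed principle underlying both the SDD and VSS-DD branches can be illustrated with a minimal single-polarization hard-decision DD-LMS equalizer. This is an illustrative sketch, not the proposed double-DD algorithm; the channel, constellation normalization, and step size are assumptions:

```python
import numpy as np

def dd_lms_equalize(rx, constellation, n_taps=7, mu=0.01):
    """Hard-decision DD-LMS: the error signal is the gap between the
    equalizer output and the nearest constellation point, so no training
    sequence is needed once the eye is open."""
    w = np.zeros(n_taps, dtype=complex)
    w[n_taps // 2] = 1.0                                  # center-spike init
    out = []
    for k in range(n_taps, len(rx)):
        x = rx[k - n_taps:k][::-1]                        # newest sample first
        y = w @ x
        d = constellation[np.argmin(np.abs(constellation - y))]
        w += mu * (d - y) * np.conj(x)                    # LMS step toward d
        out.append(y)
    return np.array(out), w

rng = np.random.default_rng(2)
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
syms = qpsk[rng.integers(0, 4, 4000)]
rx = syms + 0.25 * np.roll(syms, 1)                       # mild ISI channel
y, w = dd_lms_equalize(rx, qpsk)
tail = y[-500:]
dec = qpsk[np.argmin(np.abs(tail[:, None] - qpsk[None, :]), axis=1)]
print(np.mean(np.abs(tail - dec) ** 2))                   # small residual MSE
```

The double-DD scheme in the abstract replaces the hard decision with a soft (SDD) branch during acquisition and switches to a decision-directed mode with a variable step size once the MSE is low enough.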

  5. Feasibility of CBCT-based proton dose calculation using a histogram-matching algorithm in proton beam therapy.

    PubMed

    Arai, Kazuhiro; Kadoya, Noriyuki; Kato, Takahiro; Endo, Hiromitsu; Komori, Shinya; Abe, Yoshitomo; Nakamura, Tatsuya; Wada, Hitoshi; Kikuchi, Yasuhiro; Takai, Yoshihiro; Jingu, Keiichi

    2017-01-01

    The aim of this study was to confirm On-Board Imager cone-beam computed tomography (CBCT) with a histogram-matching algorithm as a useful method for proton dose calculation. We studied one head and neck phantom, one pelvic phantom, and ten patients with head and neck cancer treated using intensity-modulated radiation therapy (IMRT) and proton beam therapy. We modified the Hounsfield unit (HU) values of CBCT and generated two modified CBCTs using the histogram-matching algorithm: one with rigid registration (mCBCT-RR) and one with deformable image registration (mCBCT-DIR). Rigid and deformable image registration were applied to match the CBCT to the planning CT. To evaluate the accuracy of the proton dose calculation, we compared dose differences in the dosimetric parameters (D2% and D98%) for the clinical target volume (CTV) and planning target volume (PTV). We also evaluated the accuracy of the dosimetric parameters (Dmean and D2%) for some organs at risk, compared the proton ranges (PR) between planning CT (reference) and CBCT or mCBCTs, and computed the gamma passing rates of CBCT and mCBCTs. For patients, the average dose and PR differences of the mCBCTs were smaller than those of CBCT. Additionally, the average gamma passing rates of the mCBCTs were larger than those of CBCT (e.g., 94.1±3.5% for mCBCT-DIR vs. 87.8±7.4% for CBCT). We evaluated the accuracy of the proton dose calculation in CBCT and mCBCTs for two phantoms and ten patients. Our results showed that HU modification using the histogram-matching algorithm could improve the accuracy of the proton dose calculation.
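Generic CDF-based histogram matching, of which the HU-modification step above is an application, can be sketched as follows (illustrative `histogram_match` with toy HU values; not the authors' implementation, which operates on registered volumes):

```python
import numpy as np

def histogram_match(source, reference):
    """CDF-based histogram matching: remap source values so their histogram
    follows the reference histogram (a stand-in for remapping CBCT HU
    values toward the planning CT)."""
    src = np.asarray(source, dtype=float)
    ref = np.asarray(reference, dtype=float).ravel()
    s_vals, s_counts = np.unique(src, return_counts=True)
    r_vals, r_counts = np.unique(ref, return_counts=True)
    s_cdf = np.cumsum(s_counts) / src.size
    r_cdf = np.cumsum(r_counts) / ref.size
    mapped = np.interp(s_cdf, r_cdf, r_vals)   # invert the reference CDF
    return np.interp(src.ravel(), s_vals, mapped).reshape(src.shape)

# Toy grids: CBCT-like values biased relative to a planning-CT-like reference.
cbct = np.array([[-1050.0, -1050.0, -20.0], [-20.0, 40.0, 40.0]])
ct = np.array([[-1000.0, -1000.0, 0.0], [0.0, 60.0, 60.0]])
print(histogram_match(cbct, ct))  # remapped toward the reference values
```

Because proton range depends directly on the HU-to-stopping-power conversion, pulling the CBCT histogram toward the planning CT's is what improves the range and dose agreement reported above.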

  6. Visual Adaptation

    PubMed Central

    Webster, Michael A.

    2015-01-01

    Sensory systems continuously mold themselves to the widely varying contexts in which they must operate. Studies of these adaptations have played a long and central role in vision science. In part this is because the specific adaptations remain a powerful tool for dissecting vision, by exposing the mechanisms that are adapting. That is, “if it adapts, it's there.” Many insights about vision have come from using adaptation in this way, as a method. A second important trend has been the realization that the processes of adaptation are themselves essential to how vision works, and thus are likely to operate at all levels. That is, “if it's there, it adapts.” This has focused interest on the mechanisms of adaptation as the target rather than the probe. Together both approaches have led to an emerging insight of adaptation as a fundamental and ubiquitous coding strategy impacting all aspects of how we see. PMID:26858985

  7. Microscopic justification of the equal filling approximation

    SciTech Connect

    Perez-Martin, Sara; Robledo, L. M.

    2008-07-15

    The equal filling approximation, a procedure widely used in mean-field calculations to treat the dynamics of odd nuclei in a time-reversal invariant way, is justified as the consequence of a variational principle over an average energy functional. The ideas of statistical quantum mechanics are employed in the justification. As an illustration of the method, the ground and lowest-lying states of some octupole deformed radium isotopes are computed.

  8. Temporal control mechanism in equaled interval tapping.

    PubMed

    Yamada, M

    1996-05-01

    Subjects at an intermediate level of musical performance performed equaled interval tapping at several tempos. The temporal fluctuation of the tapping was observed and analysed. For all tempos and all subjects, the power spectrum of the fluctuation showed a critical phenomenon at around the frequency corresponding to a period of 20 taps: the slope of the spectrum was flat or had a positive value in the high-frequency region above the critical frequency, but increased as the frequency decreased in the low-frequency region below it. Moreover, auto-regressive models and Akaike's information criterion were introduced to determine the critical tap number. The order of the best auto-regressive model for the temporal fluctuation data was distributed around 20 taps. These results show that a memory capacity of 20 taps governs the control of equaled interval tapping. To interpret the critical phenomenon of 20 taps in terms of the capacity of short-term memory, the so-called magic number seven, a simple chunking assumption was introduced: subjects might have unconsciously chunked every three taps during the tapping. If the chunking assumption were true, then when subjects consciously chunk every seven taps, the memory capacity should shift to about 50 taps. To test this assumption, subjects performed a three-beat rhythm tapping and a seven-beat rhythm tapping with equaled intervals. As a result, the memory capacity for these accented tappings was also estimated as 20 taps. This suggests that the critical phenomenon cannot be explained by the chunking assumption and the magic number seven; rather, it suggests that there exists a memory capacity of 20 taps that is used for equaled interval tapping.
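
    The order-selection step (fit auto-regressive models of increasing order and pick the one minimizing Akaike's information criterion) can be sketched as below. The least-squares AR fit and the AIC form used here are generic textbook choices; the authors' exact estimator may differ.

```python
import numpy as np

def best_ar_order(x, max_order=12):
    """Fit AR(p) models by least squares for p = 1..max_order and return
    the order that minimizes AIC = n*log(sigma^2) + 2p."""
    x = np.asarray(x, float)
    x = x - x.mean()
    n = len(x)
    best_aic, best_p = np.inf, 0
    for p in range(1, max_order + 1):
        # design matrix of lagged values: column k holds x[t-k]
        X = np.column_stack([x[p - k:n - k] for k in range(1, p + 1)])
        y = x[p:]
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ coef
        sigma2 = resid @ resid / len(y)
        aic = len(y) * np.log(sigma2) + 2 * p
        if aic < best_aic:
            best_aic, best_p = aic, p
    return best_p
```

    Applied to tapping-interval fluctuations, the selected order plays the role of the "critical tap number" discussed in the abstract.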

  9. When equal masses don't balance

    NASA Astrophysics Data System (ADS)

    Newburgh, Ronald; Peidle, Joseph; Rueckner, Wolfgang

    2004-05-01

    We treat a modified Atwood's machine in which equal masses do not balance because of being in an accelerated frame of reference. Analysis of the problem illuminates the meaning of inertial forces, d'Alembert's principle, the use of free-body diagrams and the selection of appropriate systems for the diagrams. In spite of the range of these applications the analysis does not require calculus, so the ideas are accessible even to first-year students.

  10. Non-small cell lung cancer: Whole-lesion histogram analysis of the apparent diffusion coefficient for assessment of tumor grade, lymphovascular invasion and pleural invasion

    PubMed Central

    Tsuchiya, Naoko; Doai, Mariko; Usuda, Katsuo; Uramoto, Hidetaka

    2017-01-01

    Purpose Investigating the diagnostic accuracy of histogram analyses of apparent diffusion coefficient (ADC) values for determining non-small cell lung cancer (NSCLC) tumor grades, lymphovascular invasion, and pleural invasion. Materials and methods We studied 60 surgically diagnosed NSCLC patients. Diffusion-weighted imaging (DWI) was performed in the axial plane using a navigator-triggered single-shot, echo-planar imaging sequence with prospective acquisition correction. The ADC maps were generated, and we placed a volume-of-interest on the tumor to construct the whole-lesion histogram. Using the histogram, we calculated the mean, 5th, 10th, 25th, 50th, 75th, 90th, and 95th percentiles of ADC, skewness, and kurtosis. Histogram parameters were correlated with tumor grade, lymphovascular invasion, and pleural invasion. We performed a receiver operating characteristics (ROC) analysis to assess the diagnostic performance of histogram parameters for distinguishing different pathologic features. Results The ADC mean, 10th, 25th, 50th, 75th, 90th, and 95th percentiles showed significant differences among the tumor grades. The ADC mean, 25th, 50th, 75th, 90th, and 95th percentiles were significant histogram parameters between high- and low-grade tumors. The ROC analysis between high- and low-grade tumors showed that the 95th percentile ADC achieved the highest area under curve (AUC) at 0.74. Lymphovascular invasion was associated with the ADC mean, 50th, 75th, 90th, and 95th percentiles, skewness, and kurtosis. Kurtosis achieved the highest AUC at 0.809. Pleural invasion was only associated with skewness, with the AUC of 0.648. Conclusions ADC histogram analyses on the basis of the entire tumor volume are able to stratify NSCLCs' tumor grade, lymphovascular invasion and pleural invasion. PMID:28207858
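
    The whole-lesion histogram parameters reported above (mean, percentiles, skewness, kurtosis) can be computed directly from the voxel values in the volume of interest. A minimal sketch follows; reporting kurtosis as excess kurtosis is an assumption here, since the abstract does not state the convention.

```python
import numpy as np

def adc_histogram_features(adc_voxels):
    """Whole-lesion histogram features: mean, selected percentiles,
    skewness, and excess kurtosis over all voxels in the volume of
    interest."""
    v = np.asarray(adc_voxels, float).ravel()
    qs = [5, 10, 25, 50, 75, 90, 95]
    pct = np.percentile(v, qs)
    m, s = v.mean(), v.std()
    z = (v - m) / s                      # standardized values
    return {
        "mean": m,
        **{f"p{q}": p for q, p in zip(qs, pct)},
        "skewness": np.mean(z ** 3),
        "kurtosis": np.mean(z ** 4) - 3.0,   # excess kurtosis
    }
```

    Each feature can then be correlated with tumor grade or invasion status, as in the study's ROC analysis.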

  11. Rethinking equal access: agency, quality, and norms.

    PubMed

    Ruger, J P

    2007-01-01

    In 2005 the Global Health Council convened healthcare providers, community organizers, policymakers and researchers at Health Systems: Putting Pieces Together to discuss health from a systems perspective. Its report and others have established healthcare access and quality as two of the most important issues in health policy today. Still, there is little agreement about what equal access and quality mean for health system development. At the philosophical level, few have sought to understand why differences in healthcare quality are morally so troubling. While there has been considerable work in medical ethics on equal access, these efforts have neglected health agency (individuals' ability to work toward health goals they value) and health norms, both of which influence individuals' ability to be healthy. This paper argues for rethinking equal access in terms of an alternative ethical aim: to ensure the social conditions in which all individuals have the capability to be healthy. This perspective requires that we examine injustices not just by the level of healthcare resources, but by the: (1) quality of those resources and their capacity to enable effective health functioning; (2) extent to which society supports health agency so that individuals can convert healthcare resources into health functioning; and (3) nature of health norms, which affect individuals' efforts to achieve functioning.

  12. The equal effectiveness of different defensive strategies

    PubMed Central

    Zhang, Shuang; Zhang, Yuxin; Ma, Keming

    2015-01-01

    Plants have evolved a variety of defensive strategies to resist herbivory, but at the interspecific level, the relative effectiveness of these strategies has been poorly evaluated. In this study, we compared the level of herbivory between species that depend on ants as indirect defenders and species that rely primarily on their own direct defenses. Using a dataset of 871 species and 1,405 data points, we found that in general, ant-associated species had levels of herbivory equal to those of species that are unattractive to ants; the pattern was unaffected by plant life form, climate and phylogenetic relationships between species. Interestingly, species that offer both food and nesting spaces for ants suffered significantly lower herbivory compared to species that offer either food or nesting spaces only or no reward for ants. A negative relationship between herbivory and latitude was detected, but the pattern can be changed by ants. These findings suggest that, at the interspecific level, the effectiveness of different defensive strategies may be equal. Considering the effects of herbivory on plant performance and fitness, the equal effectiveness of different defensive strategies may play an important role in the coexistence of various species at the community scale. PMID:26267426

  13. Chest conduction properties and ECG equalization.

    PubMed

    Delle Cave, G; Fabricatore, G; Nolfe, G; Petrosino, M; Pizzuti, G P

    2000-01-01

    In the common practice of detecting and recording biomedical signals, it is often implicitly assumed that propagation through the whole circuit (human body, electrodes, recording devices) is frequency- and voltage-independent. As a consequence, clinicians are not aware that recorded signals do not correspond faithfully to the original electrical activity of the organs under investigation. We have studied the transmission of electrical signals in the human body at various voltages and frequencies to understand whether, and to what extent, the most widely used stimulating and recording techniques in medicine are affected by global body conduction properties. Our results show that, in order to obtain a more faithful detection of the electrical activity produced or evoked by human organs (e.g. ECG, electromyography, etc.), it is convenient to 'equalize' the recorded signals. To this purpose, two equalization techniques are proposed, based, respectively, on simple hardware filtering during acquisition or on FFT post-processing of the acquired signals. As an application, we have studied the transmission of electrical signals in the human chest and have compared equalized high-frequency ECG signals with the raw (original) recordings.

  14. Object localization using adaptive feature selection

    NASA Astrophysics Data System (ADS)

    Hwang, S. Youngkyoo; Kim, Jungbae; Lee, Seongdeok

    2009-01-01

    'Fast and robust' are the most beautiful keywords in computer vision. Unfortunately, they are in a trade-off relationship. We present a method to have one's cake and eat it using adaptive feature selection. Our chief insight is that by comparing reference patterns to query patterns, the method smartly selects the more important and useful features for finding the target. The probabilities of pixels in the query belonging to the target are calculated from the importance of the features. Our framework has three distinct advantages: (1) it saves computational cost dramatically compared to the conventional approach, making it possible to find the location of an object in real time; (2) it can smartly select robust features of a reference pattern while adapting to a query pattern; (3) it has high flexibility with respect to features: it does not matter which feature is used, and many color-space, texture, motion and other features fit perfectly, provided the features meet the histogram criteria.

  15. Elucidating the effects of adsorbent flexibility on fluid adsorption using simple models and flat-histogram sampling methods

    SciTech Connect

    Shen, Vincent K. Siderius, Daniel W.

    2014-06-28

    Using flat-histogram Monte Carlo methods, we investigate the adsorptive behavior of the square-well fluid in two simple slit-pore-like models intended to capture fundamental characteristics of flexible adsorbent materials. Both models require as input thermodynamic information about the flexible adsorbent material itself. An important component of this work involves formulating the flexible pore models in the appropriate thermodynamic (statistical mechanical) ensembles, namely, the osmotic ensemble and a variant of the grand-canonical ensemble. Two-dimensional probability distributions, which are calculated using flat-histogram methods, provide the information necessary to determine adsorption thermodynamics. For example, we are able to determine precisely adsorption isotherms, (equilibrium) phase transition conditions, limits of stability, and free energies for a number of different flexible adsorbent materials, distinguishable as different inputs into the models. While the models used in this work are relatively simple from a geometric perspective, they yield non-trivial adsorptive behavior, including adsorption-desorption hysteresis solely due to material flexibility and so-called “breathing” of the adsorbent. The observed effects can in turn be tied to the inherent properties of the bare adsorbent. Some of the effects are expected on physical grounds while others arise from a subtle balance of thermodynamic and mechanical driving forces. In addition, the computational strategy presented here can be easily applied to more complex models for flexible adsorbents.
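
    The flat-histogram idea can be illustrated on a toy system whose density of states is known exactly. The Wang-Landau-style sketch below estimates g(k) for n independent two-state units; it is only a cartoon of the class of flat-histogram methods the paper uses, not its pore models or ensembles, and all parameters are illustrative.

```python
import math
import numpy as np

def wang_landau_dos(n=8, batch=1000, flat=0.8, lnf_final=1e-3, seed=0):
    """Flat-histogram (Wang-Landau) estimate of the density of states g(k)
    for n independent two-state units, with macrostate k = number of up
    units. The exact answer is the binomial coefficient C(n, k)."""
    rng = np.random.default_rng(seed)
    state = rng.integers(0, 2, n)
    k = int(state.sum())
    lng = np.zeros(n + 1)            # running estimate of ln g(k)
    hist = np.zeros(n + 1)
    lnf = 1.0                        # modification factor
    while lnf > lnf_final:
        for _ in range(batch):
            i = rng.integers(n)
            k_new = k + (1 - 2 * state[i])      # effect of flipping unit i
            # accept with probability min(1, g(k)/g(k_new))
            if np.log(rng.random()) < lng[k] - lng[k_new]:
                state[i] ^= 1
                k = k_new
            lng[k] += lnf                       # penalize the visited state
            hist[k] += 1
        if hist.min() > flat * hist.mean():     # histogram flat enough?
            hist[:] = 0
            lnf /= 2.0                          # refine and continue
    return lng - lng[0]                         # normalize: ln g(0) = 0
```

    The same machinery, run over two-dimensional macrostates, yields the probability distributions from which the paper extracts isotherms, phase transitions, and free energies.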

  16. Quantifying kinetics from time series of single-molecule Förster resonance energy transfer efficiency histograms.

    PubMed

    Benke, Stephan; Nettels, Daniel; Hofmann, Hagen; Schuler, Benjamin

    2017-03-17

    Single-molecule fluorescence spectroscopy is a powerful approach for probing biomolecular structure and dynamics, including protein folding. For the investigation of nonequilibrium kinetics, Förster resonance energy transfer combined with confocal multiparameter detection has proven particularly versatile, owing to the large number of observables and the broad range of accessible timescales, especially in combination with rapid microfluidic mixing. However, a comprehensive kinetic analysis of the resulting time series of transfer efficiency histograms and complementary observables can be challenging owing to the complexity of the data. Here we present and compare three different methods for the analysis of such kinetic data: singular value decomposition, multivariate curve resolution with alternating least square fitting, and model-based peak fitting, where an explicit model of both the transfer efficiency histogram of each species and the kinetic mechanism of the process is employed. While each of these methods has its merits for specific applications, we conclude that model-based peak fitting is most suitable for a quantitative analysis and comparison of kinetic mechanisms.
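
    Of the three methods, singular value decomposition is the easiest to sketch: stack the time series of histograms into a matrix, and the number of significant singular values estimates the number of spectrally distinct species. The two-peak data below are entirely synthetic (illustrative peak positions, widths, and kinetics).

```python
import numpy as np

# Synthetic time series of transfer-efficiency histograms for two
# interconverting species with Gaussian-shaped efficiency peaks and
# first-order A -> B kinetics (all values illustrative).
bins = np.linspace(0.005, 0.995, 100)           # histogram bin centers

def peak(mu):
    return np.exp(-0.5 * ((bins - mu) / 0.05) ** 2)

times = np.linspace(0.0, 5.0, 40)
frac_a = np.exp(-times)                          # population of species A
H = np.outer(frac_a, peak(0.3)) + np.outer(1.0 - frac_a, peak(0.8))

# The numerical rank of H (significant singular values) counts the species.
s = np.linalg.svd(H, compute_uv=False)
n_species = int(np.sum(s > 1e-8 * s[0]))
```

    SVD alone does not return the species histograms or rate constants, which is why the abstract's model-based peak fitting is preferred for quantitative mechanism comparison.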

  17. Experimental assessment of an automatic breast density classification algorithm based on principal component analysis applied to histogram data

    NASA Astrophysics Data System (ADS)

    Angulo, Antonio; Ferrer, Jose; Pinto, Joseph; Lavarello, Roberto; Guerrero, Jorge; Castaneda, Benjamín

    2015-01-01

    Breast parenchymal density is considered a strong indicator of cancer risk. However, measures of breast density are often qualitative and require the subjective judgment of radiologists. This work proposes a supervised algorithm to automatically assign a BI-RADS breast density score to a digital mammogram. The algorithm applies principal component analysis to the histograms of a training dataset of digital mammograms to create four different spaces, one for each BI-RADS category. Scoring is achieved by projecting the histogram of the image to be classified onto the four spaces and assigning it to the closest class. In order to validate the algorithm, a training set of 86 images and a separate testing database of 964 images were built. All mammograms were acquired in the craniocaudal view from female patients without any visible pathology. Eight experienced radiologists categorized the mammograms according to a BI-RADS score, and the mode of their evaluations was taken as the ground truth. Results show better agreement between the algorithm and the ground truth for the training set (kappa=0.74) than for the test set (kappa=0.44), which suggests the method may be used for BI-RADS classification but that better training is required.
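
    The classification rule (one PCA space per category, assign the closest class) can be sketched as choosing the class whose subspace reconstructs the histogram with the smallest residual. The data and parameter choices below are synthetic and illustrative, not the paper's mammographic setup.

```python
import numpy as np

def fit_class_spaces(hists, labels, n_components=3):
    """Build one PCA subspace (mean + leading principal axes) per class
    from the training histograms."""
    spaces = {}
    for c in sorted(set(labels)):
        X = np.array([h for h, l in zip(hists, labels) if l == c], float)
        mean = X.mean(axis=0)
        # principal axes via SVD of the centered training matrix
        _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
        spaces[c] = (mean, Vt[:n_components])
    return spaces

def classify(hist, spaces):
    """Assign the class whose subspace reconstructs the histogram with the
    smallest residual, i.e. the 'closest' space."""
    def residual(mean, V):
        d = hist - mean
        return np.linalg.norm(d - V.T @ (V @ d))  # component off the subspace
    return min(spaces, key=lambda c: residual(*spaces[c]))
```

    With four training sets instead of two, the same code reproduces the four-category BI-RADS scoring scheme described above.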

  18. Quantifying kinetics from time series of single-molecule Förster resonance energy transfer efficiency histograms

    NASA Astrophysics Data System (ADS)

    Benke, Stephan; Nettels, Daniel; Hofmann, Hagen; Schuler, Benjamin

    2017-03-01

    Single-molecule fluorescence spectroscopy is a powerful approach for probing biomolecular structure and dynamics, including protein folding. For the investigation of nonequilibrium kinetics, Förster resonance energy transfer combined with confocal multiparameter detection has proven particularly versatile, owing to the large number of observables and the broad range of accessible timescales, especially in combination with rapid microfluidic mixing. However, a comprehensive kinetic analysis of the resulting time series of transfer efficiency histograms and complementary observables can be challenging owing to the complexity of the data. Here we present and compare three different methods for the analysis of such kinetic data: singular value decomposition, multivariate curve resolution with alternating least square fitting, and model-based peak fitting, where an explicit model of both the transfer efficiency histogram of each species and the kinetic mechanism of the process is employed. While each of these methods has its merits for specific applications, we conclude that model-based peak fitting is most suitable for a quantitative analysis and comparison of kinetic mechanisms.

  19. A rotation-invariant pattern recognition using spectral fringe-adjusted joint transform correlator and histogram representation

    NASA Astrophysics Data System (ADS)

    Sidike, Paheding; Aspiras, Theus; Asari, Vijayan K.; Alam, Mohammad S.

    2014-04-01

    A new rotation-invariant pattern recognition technique, based on the spectral fringe-adjusted joint transform correlator (SFJTC) and histogram representation, is proposed. Synthetic discriminant function (SDF) based joint transform correlation (JTC) techniques have shown attractive performance in rotation-invariant pattern recognition applications. However, when targets are present in a complex scene, SDF-based JTC techniques may produce false detections due to inaccurate estimation of the rotation angle of the object. Therefore, we herein propose an efficient rotation-invariant JTC scheme which does not require a priori rotation training of the reference image. In the proposed technique, a Vectorized Gaussian Ringlet Intensity Distribution (VGRID) descriptor is also proposed to obtain rotation-invariant features from the reference image. In this step, we divide the reference image into multiple Gaussian ringlets, extract the histogram distribution of each ringlet, and concatenate them into a vector as a target signature. Similarly, an unknown input scene is also represented by the VGRID, which produces a multidimensional input image. Finally, the concept of the SFJTC is incorporated and utilized for target detection in the input scene. The classical SFJTC was proposed for detecting very small objects involving only a few pixels in hyperspectral imagery. In our proposed algorithm, however, the SFJTC is applied to a two-dimensional image without limitation on the size of objects and, most importantly, it achieves rotation-invariant target discriminability. Simulation results verify that the proposed scheme performs satisfactorily in detecting targets in the input scene irrespective of rotation of the object.
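
    The ringlet idea can be sketched with hard concentric rings instead of Gaussian-weighted ones (a simplifying assumption relative to the paper's VGRID): rotating the image only permutes pixels within each ring, so per-ring histograms are rotation-invariant by construction.

```python
import numpy as np

def ringlet_histogram(img, n_rings=4, n_bins=8):
    """Concatenate per-ring intensity histograms over concentric rings
    around the image center -- a simplified stand-in for a Gaussian-ringlet
    descriptor (hard ring boundaries here instead of Gaussian weights)."""
    h, w = img.shape
    y, x = np.indices((h, w))
    r = np.hypot(y - (h - 1) / 2, x - (w - 1) / 2)   # radius of each pixel
    edges = np.linspace(0, r.max() + 1e-9, n_rings + 1)
    feats = []
    for i in range(n_rings):
        ring = img[(r >= edges[i]) & (r < edges[i + 1])]
        hist, _ = np.histogram(ring, bins=n_bins, range=(0, 1))
        feats.append(hist / max(ring.size, 1))       # normalize per ring
    return np.concatenate(feats)
```

    The concatenated vector plays the role of the target signature that is then fed to the correlator stage.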

  20. Spatio-Temporal Equalizer for a Receiving-Antenna Feed Array

    NASA Technical Reports Server (NTRS)

    Mukai, Ryan; Lee, Dennis; Vilnrotter, Victor

    2010-01-01

    A spatio-temporal equalizer has been conceived as an improved means of suppressing multipath effects in the reception of aeronautical telemetry signals, and may be adaptable to radar and aeronautical communication applications as well. This equalizer would be an integral part of a system that would also include a seven-element planar array of receiving feed horns centered at the focal point of a paraboloidal antenna nominally aimed at or near the aircraft that is the source of the signal one seeks to receive (see Figure 1). The spatio-temporal equalizer would consist mostly of a bank of seven adaptive finite-impulse-response (FIR) filters, one for each element in the array, whose outputs would be summed (see Figure 2). The combination of the spatial diversity of the feed-horn array and the temporal diversity of the filter bank would afford better multipath-suppression performance than is achievable by means of temporal equalization alone. The seven-element feed array would supplant the single feed horn used in a conventional paraboloidal ground telemetry-receiving antenna. The radio-frequency telemetry signals received by the seven elements of the array would be digitized, converted to complex baseband form, and sent to the FIR filter bank, which would adapt itself in real time to enable reception of telemetry at a low bit error rate, even in the presence of multipath of the type found at many flight test ranges.
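
    The filter-bank-plus-sum structure can be sketched as a stacked LMS adaptation over all elements' tap delay lines. The training-sequence-driven update below is an illustrative assumption; the real system adapts in real time to the received telemetry rather than to a known reference.

```python
import numpy as np

def spatio_temporal_lms(rx, train, taps=5, mu=0.03):
    """Jointly adapt one FIR filter per array element and sum the filter
    outputs, mirroring the bank-of-filters structure described above.
    `rx` has shape (n_elements, n_samples); `train` is a known reference
    signal (an illustrative assumption)."""
    n_el, n = rx.shape
    W = np.zeros((n_el, taps))
    errs = np.empty(n - taps + 1)
    for t in range(taps - 1, n):
        X = rx[:, t - taps + 1:t + 1][:, ::-1]   # per-element tap delay lines
        y = np.sum(W * X)                        # summed filter-bank output
        e = train[t] - y
        W += mu * e * X                          # stacked LMS update
        errs[t - taps + 1] = e * e
    return W, errs
```

    Because the update spans both array elements (space) and delay taps (time), the adapted weights can cancel multipath that no purely temporal equalizer on a single feed could remove.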

  1. Adaptive Management

    EPA Science Inventory

    Adaptive management is an approach to natural resource management that emphasizes learning through management where knowledge is incomplete, and when, despite inherent uncertainty, managers and policymakers must act. Unlike a traditional trial and error approach, adaptive managem...

  2. Why Does Public Transport Not Arrive on Time? The Pervasiveness of Equal Headway Instability

    PubMed Central

    Gershenson, Carlos; Pineda, Luis A.

    2009-01-01

    Background The equal headway instability phenomenon is pervasive in public transport systems. This instability is characterized by an aggregation of vehicles that causes inefficient service. While equal headway instability is common, it has not been studied independently of a particular scenario. However, the phenomenon is apparent in many transport systems and can be modeled and rectified in abstraction. Methodology We present a multi-agent simulation where a default method with no restrictions always leads to unstable headways. We discuss two methods that attempt to achieve equal headways, called minimum and maximum. Since one parameter of the methods depends on the passenger density, adaptive versions—where the relevant parameter is adjusted automatically—are also put forward. Our results show that the adaptive maximum method improves significantly over the default method. The model and simulation give insights of the interplay between transport design and passenger behavior. Finally, we provide technological and social suggestions for engineers and passengers to help achieve equal headways and thus reduce delays. Conclusions The equal headway instability phenomenon can be avoided with the suggested technological and social measures. PMID:19862321

  3. Decomposite channel estimation and equalization for GMSK-based system with transmit diversity

    NASA Astrophysics Data System (ADS)

    Yao, Timothy S.; Gudena, Chandragupta

    2004-08-01

    In this paper, multi-channel estimation schemes for a GMSK-based system with transmit diversity (space-time coding) are presented. For such a system, the channel information (impulse response) is critical for both space-time decoding and equalization at the receiver. Three non-blind estimation schemes, which decompose the channel in the process, are proposed for the GMSK receiver to obtain the impulse response of each of the multipath channels (i.e. transmit antennas): oversampling deconvolution, minimum mean-square error, and joint adaptive and correlation estimation. Since the received signal is the sum of the emitted GMSK signals, interference cancellation is employed to facilitate the estimation process. Three cancellation algorithms, including direct cancellation, mean-square cancellation, and iterative cancellation, combined with each channel estimation method, are investigated and compared. The estimated channel information is fed to the receiver, consisting of a space-time decoder and an equalizer, to decode the symbols of interest. Two receiver architectures are investigated in this paper: in the first design, the space-time decoder is followed by the equalizer; the second is arranged in the reverse order (equalizer followed by space-time decoder). In each of the two receiver architectures, the channel estimation needs additional modification, as does the equalizer. The equalizer in the design is a maximum likelihood sequence estimator (MLSE) based on the Viterbi algorithm. To prove the concept and algorithms, both simulation and hardware implementation were performed. The experimental results show that all the channel estimation algorithms can produce acceptable impulse responses for space-time decoding and equalization, with the joint adaptive estimation with iterative cancellation being superior to the others. It is also shown that the diversity gain of this transmit diversity system is as good as that of a system with the same degree of receive diversity.
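
    A generic non-blind baseline for the channel-estimation step is ordinary least squares against a known training sequence. This is a sketch of the general approach only, not the paper's oversampling-deconvolution, MMSE, or joint adaptive-and-correlation estimators.

```python
import numpy as np

def ls_channel_estimate(tx, rx, n_taps=3):
    """Least-squares estimate of an n_taps channel impulse response h from
    a known training sequence tx and received samples rx, using the model
    rx[t] = sum_k h[k] * tx[t - k]."""
    n = len(tx)
    # convolution (design) matrix: column k is tx delayed by k samples
    A = np.column_stack(
        [np.concatenate([np.zeros(k), tx[:n - k]]) for k in range(n_taps)]
    )
    h, *_ = np.linalg.lstsq(A, np.asarray(rx[:n], float), rcond=None)
    return h
```

    In the transmit-diversity setting, one such estimate per antenna (after interference cancellation) supplies the impulse responses needed by the space-time decoder and the MLSE equalizer.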

  4. Equal-Curvature X-Ray Telescopes

    NASA Technical Reports Server (NTRS)

    Saha, Timo T.; Zhang, William

    2002-01-01

    We introduce a new type of x-ray telescope design: an Equal-Curvature telescope. We simply add a second-order axial sag to the base grazing-incidence cone-cone telescope. The radius of curvature of the sag terms is the same on the primary surface and on the secondary surface. The design is optimized so that the on-axis image spot at the focal plane is minimized. The on-axis RMS (root mean square) spot diameter of the two studied telescopes is less than 0.2 arc-seconds. The off-axis performance is comparable to that of equivalent Wolter type 1 telescopes.

  5. Neural network with dynamically adaptable neurons

    NASA Technical Reports Server (NTRS)

    Tawel, Raoul (Inventor)

    1994-01-01

    This invention is an adaptive neuron for use in neural network processors. The adaptive neuron participates in the supervised learning phase of operation on a co-equal basis with the synapse matrix elements by adaptively changing its gain, in a manner similar to the change of weights in the synapse matrix elements. In this manner, training time is decreased by as much as three orders of magnitude.

  6. 29 CFR 1620.25 - Equalization of rates.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 4 2010-07-01 2010-07-01 false Equalization of rates. 1620.25 Section 1620.25 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION THE EQUAL PAY ACT § 1620.25 Equalization of rates. Under the express terms of the EPA, when a prohibited sex-based wage differential...

  7. 29 CFR 1614.408 - Civil action: Equal Pay Act.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 4 2010-07-01 2010-07-01 false Civil action: Equal Pay Act. 1614.408 Section 1614.408... EQUAL EMPLOYMENT OPPORTUNITY Appeals and Civil Actions § 1614.408 Civil action: Equal Pay Act. A..., three years of the date of the alleged violation of the Equal Pay Act regardless of whether he or...

  8. 12 CFR 268.101 - General policy for equal opportunity.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Discrimination in Employment Act (ADEA) (29 U.S.C. 621 et seq.), the Equal Pay Act (29 U.S.C. 206(d)), or the... 12 Banks and Banking 3 2010-01-01 2010-01-01 false General policy for equal opportunity. 268.101... RESERVE SYSTEM RULES REGARDING EQUAL OPPORTUNITY Board Program To Promote Equal Opportunity §...

  9. 12 CFR 268.202 - Equal Pay Act.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 3 2010-01-01 2010-01-01 false Equal Pay Act. 268.202 Section 268.202 Banks... REGARDING EQUAL OPPORTUNITY Provisions Applicable to Particular Complaints § 268.202 Equal Pay Act. Complaints alleging violations of the Equal Pay Act shall be processed under this part....

  10. Some Thoughts on the Equal Pay Act and Coaching Salaries.

    ERIC Educational Resources Information Center

    Boring, Phyllis

    This paper discusses the Equal Pay Act, Title VII of the Civil Rights Act of 1964, as it applies to women athletic coaches and physical education teachers. The following points are considered: (1) application of the Equal Pay Act; (2) advantage of voluntary compliance with the Equal Pay Act; (3) factors used to measure "equal work"; (4)…

  11. Equality of Education and Citizenship: Challenges of European Integration

    ERIC Educational Resources Information Center

    Follesdal, Andreas

    2008-01-01

    What kind of equality among Europeans does equal citizenship require, especially regarding education? In particular, is there good reason to insist of equality of education among Europeans--and if so, equality of what? To what extent should the same knowledge base and citizenship norms be taught across state borders and religious and other…

  12. 47 CFR 36.191 - Equal access equipment.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 2 2011-10-01 2011-10-01 false Equal access equipment. 36.191 Section 36.191... AND RESERVES FOR TELECOMMUNICATIONS COMPANIES 1 Telecommunications Property Equal Access Equipment § 36.191 Equal access equipment. (a) Equal access investment includes only initial...

  13. 77 FR 43498 - Federal Sector Equal Employment Opportunity

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-25

    ... From the Federal Register Online via the Government Publishing Office EQUAL EMPLOYMENT OPPORTUNITY COMMISSION 29 CFR Part 1614 RIN Number 3046-AA73 Federal Sector Equal Employment Opportunity AGENCY: Equal Employment Opportunity Commission. ACTION: Final rule. SUMMARY: The Equal Employment Opportunity...

  14. 34 CFR 21.1 - Equal Access to Justice Act.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 34 Education 1 2010-07-01 2010-07-01 false Equal Access to Justice Act. 21.1 Section 21.1 Education Office of the Secretary, Department of Education EQUAL ACCESS TO JUSTICE General § 21.1 Equal Access to Justice Act. (a) The Equal Access to Justice Act (the Act) provides for the award of fees...

  15. Spearman's g and the Problem of Educational Equality.

    ERIC Educational Resources Information Center

    Jensen, Arthur R.

    1991-01-01

    Criticizes approach to equal education that seeks equality of outcome as well as equality of opportunity. Discusses Spearman's theory of g that attempts to explain individual differences in intelligence. Contrasts efforts at genuinely reducing equality of outcome, including Aptitude X Treatment Interaction, Mastery Learning, and Thinking Skills…

  16. 29 CFR 30.3 - Equal opportunity standards.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 29 Labor 1 2013-07-01 2013-07-01 false Equal opportunity standards. 30.3 Section 30.3 Labor Office of the Secretary of Labor EQUAL EMPLOYMENT OPPORTUNITY IN APPRENTICESHIP AND TRAINING § 30.3 Equal... the program sponsor; and (3) Take affirmative action to provide equal opportunity in...

  17. 29 CFR 30.3 - Equal opportunity standards.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 29 Labor 1 2014-07-01 2013-07-01 true Equal opportunity standards. 30.3 Section 30.3 Labor Office of the Secretary of Labor EQUAL EMPLOYMENT OPPORTUNITY IN APPRENTICESHIP AND TRAINING § 30.3 Equal... the program sponsor; and (3) Take affirmative action to provide equal opportunity in...

  18. 29 CFR 30.3 - Equal opportunity standards.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 29 Labor 1 2012-07-01 2012-07-01 false Equal opportunity standards. 30.3 Section 30.3 Labor Office of the Secretary of Labor EQUAL EMPLOYMENT OPPORTUNITY IN APPRENTICESHIP AND TRAINING § 30.3 Equal... the program sponsor; and (3) Take affirmative action to provide equal opportunity in...

  19. 29 CFR 30.3 - Equal opportunity standards.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 1 2011-07-01 2011-07-01 false Equal opportunity standards. 30.3 Section 30.3 Labor Office of the Secretary of Labor EQUAL EMPLOYMENT OPPORTUNITY IN APPRENTICESHIP AND TRAINING § 30.3 Equal... the program sponsor; and (3) Take affirmative action to provide equal opportunity in...

  20. Why Should We Demand Equality of Educational Opportunity?

    ERIC Educational Resources Information Center

    Meyer, Kirsten

    2016-01-01

    This paper reveals how equal educational opportunities, equal job opportunities and equality of opportunity for welfare are related to each other, and how they are related to other demands for justice. There are several important objections to the emphasis on equal educational opportunities. Nevertheless, this paper shows that demanding equal…

  1. Is Primatology an equal-opportunity discipline?

    PubMed

    Addessi, Elsa; Borgi, Marta; Palagi, Elisabetta

    2012-01-01

    The proportion of women occupying academic positions in biological sciences has increased in the past few decades, but women are still under-represented in senior academic ranks compared to their male colleagues. Primatology has been often singled out as a model of "equal-opportunity" discipline because of the common perception that women are more represented in Primatology than in similar fields. But is this indeed true? Here we show that, although in the past 15 years the proportion of female primatologists increased from the 38% of the early 1990s to the 57% of 2008, Primatology is far from being an "equal-opportunity" discipline, and suffers the phenomenon of "glass ceiling" as all the other scientific disciplines examined so far. In fact, even if Primatology does attract more female students than males, at the full professor level male members significantly outnumber females. Moreover, regardless of position, IPS male members publish significantly more than their female colleagues. Furthermore, when analyzing gender difference in scientific productivity in relation to the name order in the publications, it emerged that the scientific achievements of female primatologists (in terms of number and type of publications) do not always match their professional achievements (in terms of academic position). However, the gender difference in the IPS members' number of publications does not correspond to a similar difference in their scientific impact (as measured by their H index), which may indicate that female primatologists' fewer articles are of higher impact than those of their male colleagues.

  2. Evolution of equal division among unequal partners.

    PubMed

    Debove, Stéphane; Baumard, Nicolas; André, Jean-Baptiste

    2015-02-01

    One of the hallmarks of human fairness is its insensitivity to power: although strong individuals are often in a position to coerce weak individuals, fairness requires them to share the benefits of cooperation equally. The existence of such egalitarianism is poorly explained by current evolutionary models. We present a model based on cooperation and partner choice that can account for the emergence of a psychological disposition toward fairness, whatever the balance of power between the cooperative partners. We model the evolution of the division of a benefit in an interaction similar to an ultimatum game, in a population made up of individuals of variable strength. The model shows that strong individuals will not receive any advantage from their strength, instead having to share the benefits of cooperation equally with weak individuals at the evolutionary equilibrium, a result that is robust to variations in population size and the proportion of weak individuals. We discuss how this model suggests an explanation for why egalitarian behaviors toward everyone, including the weak, should be more likely to evolve in humans than in any other species.

  3. Equal area rule methods for ternary systems

    SciTech Connect

    Shyu, G.S.; Hanif, N.S.M.; Alvarado, J.F.J.; Hall, K.R.; Eubank, P.T.

    1995-12-01

    The phase equilibrium behavior of fluid mixtures is an important design consideration for both chemical processes and oil production. Eubank and Hall have recently shown that the equal area rule (EAR) applies to the composition derivative of the Gibbs energy of a binary system at fixed pressure and temperature regardless of derivative continuity. A sufficient condition for equilibria, EAR is faster and simpler than either the familiar tangent-line method or the area method of Eubank et al. Here, the authors show that EAR can be extended to ternary systems exhibiting one, two, or three phases at equilibrium. A single directional vector is searched in composition space; at equilibrium, this vector is the familiar tie line. A sensitive criterion for equilibrium under EAR is equality of orthogonal derivatives such as (∂g/∂x₁) at constant x₂, P, and T at the end points α and β, where g ≡ Δ_mG/RT. Repeated use of the binary algorithm published in the first reference allows rapid, simple solution of ternary problems, even with hand-held calculators for cases where the background model is simple (e.g., activity coefficient models) and the derivative is continuous.
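
    The binary equal area rule that the ternary method builds on can be illustrated numerically. The sketch below is not the authors' algorithm; it assumes a symmetric regular-solution model g(x) = x ln x + (1−x) ln(1−x) + χx(1−x), for which the tie-line slope is zero by symmetry, and verifies the EAR condition that the area under dg/dx between the end points equals the slope times the composition difference:

```python
import numpy as np

def g(x, chi):
    """Dimensionless Gibbs energy of mixing, regular-solution model."""
    return x * np.log(x) + (1 - x) * np.log(1 - x) + chi * x * (1 - x)

def dg(x, chi):
    """Composition derivative of g."""
    return np.log(x / (1 - x)) + chi * (1 - 2 * x)

def bisect(f, lo, hi, tol=1e-12):
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

chi = 3.0  # interaction strong enough for phase splitting (chi > 2)

# Symmetric model: the tie-line slope is 0, so EAR reduces to dg(x) = 0
x_alpha = bisect(lambda x: dg(x, chi), 1e-6, 0.2)
x_beta = 1.0 - x_alpha

# Equal-area check: integral of dg between end points (= g difference)
# must equal slope * (x_beta - x_alpha); here both sides are ~0
lhs = g(x_beta, chi) - g(x_alpha, chi)
rhs = dg(x_alpha, chi) * (x_beta - x_alpha)
print(round(x_alpha, 4), abs(lhs - rhs) < 1e-8)   # 0.0707 True
```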

  4. A computerized approach for estimating pulmonary nodule growth rates in three-dimensional thoracic CT images based on CT density histogram

    NASA Astrophysics Data System (ADS)

    Kawata, Yoshiki; Niki, Noboru; Ohmatsu, Hironobu; Kusumoto, Masahiko; Kakinuma, Ryutaro; Mori, Kiyoshi; Yamada, Kozo; Nishiyama, Hiroyuki; Eguchi, Kenji; Kaneko, Masahiro; Moriyama, Noriyuki

    2005-04-01

    In research and development of computer-aided differential diagnosis, there is now widespread interest in using nodule doubling time to measure the volumetric changes of pulmonary nodules. Assessing nodule status requires measuring not only volume changes but also nodule density variations. This paper proposes a computerized approach to measure density variation inside small pulmonary nodules using CT images. The approach consists of five steps: (1) nodule segmentation, (2) computation of the CT density histogram, (3) nodule classification based on the CT density histogram and size, (4) computation of doubling time based on the CT density histogram, and (5) classification between benign and malignant. Our approach was applied to follow-up scans of lung nodules. Preliminary experimental results demonstrated that our approach is potentially highly useful for assessing nodule evolution using high-resolution CT images.
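
    Step (2) above, the CT density histogram of a segmented nodule, can be sketched in a few lines. The bin range and width here are illustrative assumptions, not the authors' settings:

```python
import numpy as np

def ct_density_histogram(volume_hu, mask, bin_edges=None):
    """Histogram of CT densities (Hounsfield units) inside a segmented nodule.

    volume_hu : 3-D array of HU values
    mask      : boolean array of the same shape marking nodule voxels
    """
    if bin_edges is None:
        bin_edges = np.arange(-1000, 201, 25)  # -1000..200 HU, 25-HU bins
    counts, edges = np.histogram(volume_hu[mask], bins=bin_edges)
    # Normalize to a fraction of nodule volume so nodules of different
    # sizes can be compared across follow-up examinations.
    frac = counts / counts.sum()
    return frac, edges

# Toy example: a synthetic solid 'nodule' in a lung-density background
rng = np.random.default_rng(0)
vol = rng.normal(-800, 50, size=(32, 32, 32))      # lung background, HU
mask = np.zeros(vol.shape, dtype=bool)
mask[12:20, 12:20, 12:20] = True
vol[mask] = rng.normal(40, 30, size=mask.sum())    # soft-tissue nodule voxels
frac, edges = ct_density_histogram(vol, mask)
print(bool(np.isclose(frac.sum(), 1.0)))           # True
```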

  5. Comp Plan: A computer program to generate dose and radiobiological metrics from dose-volume histogram files

    SciTech Connect

    Holloway, Lois Charlotte; Miller, Julie-Anne; Kumar, Shivani; Whelan, Brendan M.; Vinod, Shalini K.

    2012-10-01

    Treatment planning studies often require the calculation of a large number of dose and radiobiological metrics. To streamline these calculations, a computer program called Comp Plan was developed using MATLAB. Comp Plan calculates common metrics, including equivalent uniform dose, tumor control probability, and normal tissue complication probability from dose-volume histogram data. The dose and radiobiological metrics can be calculated for the original data or for an adjusted fraction size using the linear quadratic model. A homogeneous boost dose can be added to a given structure if desired. The final output is written to an Excel file in a format convenient for further statistical analysis. Comp Plan was verified by independent calculations. A lung treatment planning study comparing 45 plans for 7 structures using up to 6 metrics for each structure was successfully analyzed within approximately 5 minutes with Comp Plan. The code is freely available from the authors on request.
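
    Comp Plan itself is MATLAB code and is not reproduced here, but one metric it computes, the generalized equivalent uniform dose, has a standard closed form over differential DVH bins: EUD = (Σᵢ vᵢ Dᵢᵃ)^(1/a). A minimal sketch, with an illustrative value of the volume-effect parameter a:

```python
import numpy as np

def gEUD(dose_bins, frac_volumes, a):
    """Generalized equivalent uniform dose from a differential DVH.

    dose_bins    : dose at each DVH bin (Gy)
    frac_volumes : fractional volume in each bin (should sum to 1)
    a            : tissue-specific volume-effect parameter
                   (a = 1 gives the mean dose; large a weights hot spots)
    """
    d = np.asarray(dose_bins, float)
    v = np.asarray(frac_volumes, float)
    v = v / v.sum()                       # guard against unnormalized input
    return (v * d**a).sum() ** (1.0 / a)

# Sanity check: the EUD of a uniform 60 Gy dose is 60 Gy for any a
print(round(gEUD([60.0, 60.0], [0.5, 0.5], a=8), 6))   # 60.0
```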

  6. A novel chromatic dispersion monitoring technique for 16/64-QAM system based on asynchronous amplitude histogram

    NASA Astrophysics Data System (ADS)

    Yan, Li-juan; Zhu, Bo; Liu, Guo-qing; Hu, Fang-ren

    2013-05-01

    A novel chromatic dispersion (CD) monitoring technique based on the asynchronous amplitude histogram (AAH) is proposed in this paper for higher-order modulation formats. In the monitoring scheme, the received signal is sampled asynchronously without demodulation, so clock recovery and high-speed sampling units are unnecessary, resulting in low cost and high reliability. Simulations of the CD monitoring technique for non-return-to-zero/return-to-zero (NRZ/RZ) 16- and 64-quadrature amplitude modulation (QAM) systems with different optical signal-to-noise ratios (OSNRs) and duty cycles are investigated, and the tolerance of the scheme is also discussed. Simulation results show that the presented CD monitoring technique has high sensitivity and can be applied to monitor the residual CD of a transmission link in next-generation optical networks.

  7. Adaptive SPECT

    PubMed Central

    Barrett, Harrison H.; Furenlid, Lars R.; Freed, Melanie; Hesterman, Jacob Y.; Kupinski, Matthew A.; Clarkson, Eric; Whitaker, Meredith K.

    2008-01-01

    Adaptive imaging systems alter their data-acquisition configuration or protocol in response to the image information received. An adaptive pinhole single-photon emission computed tomography (SPECT) system might acquire an initial scout image to obtain preliminary information about the radiotracer distribution and then adjust the configuration or sizes of the pinholes, the magnifications, or the projection angles in order to improve performance. This paper briefly describes two small-animal SPECT systems that allow this flexibility and then presents a framework for evaluating adaptive systems in general, and adaptive SPECT systems in particular. The evaluation is in terms of the performance of linear observers on detection or estimation tasks. Expressions are derived for the ideal linear (Hotelling) observer and the ideal linear (Wiener) estimator with adaptive imaging. Detailed expressions for the performance figures of merit are given, and possible adaptation rules are discussed. PMID:18541485
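
    The ideal linear (Hotelling) observer mentioned above has a standard sample-based form: template w = S⁻¹Δḡ and detectability SNR² = ΔḡᵀS⁻¹Δḡ, where Δḡ is the mean signal difference and S the average class covariance. The generic numeric sketch below is not the paper's derivation (which treats adaptive acquisition); it only illustrates the figure of merit:

```python
import numpy as np

def hotelling_snr(signal_present, signal_absent):
    """Hotelling-observer detectability from two sets of sample images.

    Each argument is an (n_samples, n_pixels) array.
    Returns the template w and the detectability SNR.
    """
    g1 = np.asarray(signal_present, float)
    g0 = np.asarray(signal_absent, float)
    delta = g1.mean(axis=0) - g0.mean(axis=0)            # mean signal difference
    # Average intraclass covariance, regularized for invertibility
    S = 0.5 * (np.cov(g1, rowvar=False) + np.cov(g0, rowvar=False))
    S += 1e-6 * np.eye(S.shape[0])
    w = np.linalg.solve(S, delta)                        # template w = S^-1 delta
    snr2 = float(delta @ w)                              # SNR^2 = delta' S^-1 delta
    return w, np.sqrt(snr2)

rng = np.random.default_rng(1)
absent = rng.normal(0.0, 1.0, size=(500, 16))            # background-only samples
present = absent + 0.5                                   # weak uniform signal added
w, snr = hotelling_snr(present, absent)
print(snr > 0)                                           # True
```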

  8. Modeling the dark current histogram induced by gold contamination in complementary-metal-oxide-semiconductor image sensors

    SciTech Connect

    Domengie, F.; Morin, P.; Bauza, D.

    2015-07-14

    We propose a model for dark current induced by metallic contamination in a CMOS image sensor. Based on Shockley-Read-Hall kinetics, the proposed dark-current expression accounts for the electric-field-enhanced emission factor due to Poole-Frenkel barrier lowering and phonon-assisted tunneling mechanisms. To that end, we considered the distribution of the electric field magnitude and of the metal atoms through the depth of the pixel. Poisson statistics were used to estimate the random distribution of metal atoms in each pixel for a given contamination dose. We then performed a Monte-Carlo-based simulation for each pixel to set the number of metal atoms the pixel contained and the enhancement factor each atom underwent, and obtained a histogram of the number of pixels versus dark current for the full sensor. Excellent agreement with the dark current histogram measured on an ion-implanted gold-contaminated imager has been achieved, in particular for the description of the distribution tails due to the pixel regions in which the contaminant atoms undergo a large electric field. The agreement remains very good when the temperature is increased by 15 °C. We demonstrated that the amplification of the dark current generated at the typical electric fields encountered in CMOS image sensors, which depends on the nature of the metal contaminant, may become very large at high electric field. The electron and hole emissions and the resulting enhancement factor are described as functions of the trap characteristics, electric field, and temperature.
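
    The Monte-Carlo structure described, a Poisson-distributed atom count per pixel plus a random field-dependent enhancement factor per atom, can be sketched as follows. The lognormal enhancement factor and all numeric values are illustrative stand-ins, not the paper's Shockley-Read-Hall expressions:

```python
import numpy as np

rng = np.random.default_rng(42)

N_PIXELS = 50_000
MEAN_ATOMS = 0.5     # illustrative mean contaminant atoms per pixel
I_BASE = 1.0         # illustrative dark current per atom, arbitrary units

# Step 1: Poisson-distributed number of metal atoms in each pixel
atoms = rng.poisson(MEAN_ATOMS, size=N_PIXELS)

# Step 2: each atom gets a random enhancement factor; a heavy-tailed
# lognormal stands in for the Poole-Frenkel / phonon-assisted tunneling
# terms, and its tail produces the high-current tail of the histogram.
dark = np.zeros(N_PIXELS)
for i in np.nonzero(atoms)[0]:
    enhancement = rng.lognormal(mean=0.0, sigma=1.0, size=atoms[i])
    dark[i] = I_BASE * enhancement.sum()

# Step 3: histogram of pixel count versus dark current for the sensor
counts, edges = np.histogram(dark, bins=100)
print(counts.sum() == N_PIXELS)   # True
```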

  9. Pressure equalizing photovoltaic assembly and method

    DOEpatents

    Dinwoodie, Thomas L.

    2003-05-27

    Each PV assembly of an array of PV assemblies comprises a base, a PV module and a support assembly securing the PV module to a position overlying the upper surface of the base. Vents are formed through the base. A pressure equalization path extends from the outer surface of the PV module, past the peripheral edge of the PV module, to and through at least one of the vents, and to the lower surface of the base to help reduce wind uplift forces on the PV assembly. The PV assemblies may be interengaged, such as by interengaging the bases of adjacent PV assemblies. The base may include a main portion and a cover and the bases of adjacent PV assemblies may be interengaged by securing the covers of adjacent bases together.

  10. Pressure-equalizing PV assembly and method

    DOEpatents

    Dinwoodie, Thomas L.

    2004-10-26

    Each PV assembly of an array of PV assemblies comprises a base, a PV module and a support assembly securing the PV module to a position overlying the upper surface of the base. Vents are formed through the base. A pressure equalization path extends from the outer surface of the PV module, past the PV module, to and through at least one of the vents, and to the lower surface of the base to help reduce wind uplift forces on the PV assembly. The PV assemblies may be interengaged, such as by interengaging the bases of adjacent PV assemblies. The base may include a main portion and a cover and the bases of adjacent PV assemblies may be interengaged by securing the covers of adjacent bases together.

  11. Newsmaker interview: Geraldine Ferraro. Striving for equality.

    PubMed

    1995-01-01

    Geraldine Ferraro is a long-standing advocate of women's rights who currently serves as the US ambassador to the UN's Human Rights Commission. She was elected to the US Congress in 1978, after which she served three terms as a representative from the state of New York. Ms. Ferraro led efforts to pass the Equal Rights Amendment, sponsored the Women's Economic Equity Act, and ran on the Democratic national ticket in 1984 as the first woman vice-presidential candidate. This paper presents the text of her interview with ZPG about women's well-being, current political trends, and population growth. Ms. Ferraro became interested in population issues in the wake of the 1973 Supreme Court decision on Roe vs. Wade. She explains how a 1989 trip to the Philippines made her more aware of the impact of rapid population growth and overpopulation. Ms. Ferraro believes that overpopulation is due to poor health care and high infant and child mortality, the oppression of and discrimination against women, and inadequate social services. Educating both men and women and ensuring the provision of family planning methods are needed to start dealing with such problems. Ms. Ferraro responds to questions on how the health and well-being of women are linked with the need to stabilize population growth, why women in the US have not attained equality with men, the movement of the US away from being a leader on population issues both domestically and internationally, how local communities across the US can influence the direction of population-related policies, and what she hopes will be accomplished at the Fourth World Conference on Women in Beijing in September 1995.

  12. Is Primatology an Equal-Opportunity Discipline?

    PubMed Central

    Borgi, Marta

    2012-01-01

    The proportion of women occupying academic positions in biological sciences has increased in the past few decades, but women are still under-represented in senior academic ranks compared to their male colleagues. Primatology has been often singled out as a model of “equal-opportunity” discipline because of the common perception that women are more represented in Primatology than in similar fields. But is this indeed true? Here we show that, although in the past 15 years the proportion of female primatologists increased from the 38% of the early 1990s to the 57% of 2008, Primatology is far from being an “equal-opportunity” discipline, and suffers the phenomenon of “glass ceiling” as all the other scientific disciplines examined so far. In fact, even if Primatology does attract more female students than males, at the full professor level male members significantly outnumber females. Moreover, regardless of position, IPS male members publish significantly more than their female colleagues. Furthermore, when analyzing gender difference in scientific productivity in relation to the name order in the publications, it emerged that the scientific achievements of female primatologists (in terms of number and type of publications) do not always match their professional achievements (in terms of academic position). However, the gender difference in the IPS members' number of publications does not correspond to a similar difference in their scientific impact (as measured by their H index), which may indicate that female primatologists' fewer articles are of higher impact than those of their male colleagues. PMID:22272353

  13. Iterative Frequency Domain Decision Feedback Equalization and Decoding for Underwater Acoustic Communications

    NASA Astrophysics Data System (ADS)

    Zhao, Liang; Ge, Jian-Hua

    2012-12-01

    Single-carrier (SC) transmission with frequency-domain equalization (FDE) is today recognized as an attractive alternative to orthogonal frequency-division multiplexing (OFDM) for communication applications with inter-symbol interference (ISI) caused by multipath propagation, especially in shallow-water channels. In this paper, we investigate an iterative receiver based on a minimum mean square error (MMSE) decision feedback equalizer (DFE) with symbol-rate and fractional-rate sampling in the frequency domain (FD) and a serially concatenated trellis coded modulation (SCTCM) decoder. Based on sound speed profiles (SSP) measured in a lake and the finite-element ray tracing (Bellhop) method, a shallow-water channel is constructed to evaluate the performance of the proposed iterative receiver. Performance results show that the proposed iterative receiver can significantly improve performance and achieve better data transmission than FD linear and adaptive decision feedback equalizers, especially when fractional-rate sampling is adopted.
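
    Stripped of the decision feedback, iteration, and SCTCM decoding, the core MMSE frequency-domain equalization step of such a receiver reduces to one complex weight per FFT bin, W_k = H_k* / (|H_k|² + 1/SNR). A minimal sketch on a toy cyclic-prefixed BPSK block (the channel and SNR values are illustrative, not from the paper):

```python
import numpy as np

def mmse_fde(rx_block, channel_ir, snr_linear):
    """One-shot MMSE frequency-domain equalization of a cyclic-prefixed block."""
    N = len(rx_block)
    H = np.fft.fft(channel_ir, N)                 # channel frequency response
    Y = np.fft.fft(rx_block)
    W = np.conj(H) / (np.abs(H)**2 + 1.0 / snr_linear)   # per-bin MMSE weight
    return np.fft.ifft(W * Y)

rng = np.random.default_rng(7)
N = 64
symbols = rng.choice([-1.0, 1.0], size=N)         # BPSK block
h = np.array([1.0, 0.5, 0.25])                    # toy multipath channel
# Circular convolution models the cyclic prefix removing inter-block ISI
rx = np.fft.ifft(np.fft.fft(h, N) * np.fft.fft(symbols))
rx += 0.05 * rng.normal(size=N)                   # additive noise
est = np.sign(mmse_fde(rx, h, snr_linear=400.0).real)
print((est == symbols).mean())                    # 1.0 at this SNR
```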

  14. Downhole component with a pressure equalization passageway

    DOEpatents

    Hall, David R.; Pixton, David S.; Dahlgren, Scott; Reynolds, Jay T.; Breihan, James W.; Briscoe, Michael A.

    2006-08-22

    The present invention includes a downhole component adapted for transmitting downhole data. The downhole component includes a threaded end on a downhole component. The threaded end furthermore includes an interior region, and exterior region, and a mating surface wherein a cavity is formed. A data transmission element is disposed in the cavity and displaces a volume of the cavity. At least one passageway is formed in the threaded region between interior and exterior regions. The passageway is in fluid communication with both the interior and exterior regions and thereby relieves pressure build up of thread lubricant upon tool joint make up.

  15. Equalization of loudspeaker response using balanced model truncation.

    PubMed

    Li, Xiansheng; Cai, Zhibo; Zheng, Chengshi; Li, Xiaodong

    2015-04-01

    Traditional loudspeaker equalization algorithms cannot decide the order of the equalizer before the whole equalization procedure has been completed, so designers must iterate many times before determining a proper order for the equalization filter. A method that overcomes this drawback is presented for loudspeaker equalization using balanced model truncation. With this algorithm, the order of the equalizer can be decided easily, and the error between the model and the loudspeaker can be readily controlled. Examples are presented and the performance of the proposed method is discussed with comparative experiments.

  16. 101 Short Problems from EQUALS = 101 Problemas Cortos del programma EQUALS.

    ERIC Educational Resources Information Center

    Stenmark, Jean Kerr, Ed.

    EQUALS is a teacher advisory program that helps elementary and secondary educators acquire methods and materials to attract minority and female students to mathematics. The program supports a problem-solving approach to mathematics, including having students working in groups, using active assessment methods, and incorporating a broad mathematics…

  17. The Equal Rights Amendment: Guaranteeing Equal Rights for Women Under the Constitution. Clearinghouse Publication 68.

    ERIC Educational Resources Information Center

    Commission on Civil Rights, Washington, DC.

    This report examines the effects that the ratification of the Equal Rights Amendment will have on laws concerning women. The amendment's impacts on divorced, married, and employed women, on women in the military and in school, and on women dependent on pensions, insurance, and social security are all analyzed. A discussion of the Constitutional…

  18. Regulatory Fit and Equal Opportunity/Diversity: Implications for the Defense Equal Opportunity Management Institute (DEOMI)

    DTIC Science & Technology

    2013-01-01

    required for duty. Until recently, these military BFOQs prohibited admitted homosexuals from serving and prohibited women from serving in combat positions... choice is the vigilant goal pursuit strategy that is not oriented toward differences. Operational Recommendations for DEOMI Regulatory focus/fit...equally difficult. From the egalitarian perspective, seeking diversity is the morally correct choice (Syed & Kramer, 2009); diversity is not a

  19. Climate adaptation

    NASA Astrophysics Data System (ADS)

    Kinzig, Ann P.

    2015-03-01

    This paper is intended as a brief introduction to climate adaptation in a conference devoted otherwise to the physics of sustainable energy. Whereas mitigation involves measures to reduce the probability of a potential event, such as climate change, adaptation refers to actions that lessen the impact of climate change. Mitigation and adaptation differ in other ways as well. Adaptation does not necessarily have to be implemented immediately to be effective; it only needs to be in place before the threat arrives. Also, adaptation does not necessarily require global, coordinated action; many effective adaptation actions can be local. Some urban communities, because of land-use change and the urban heat-island effect, currently face changes similar to some expected under climate change, such as changes in water availability, heat-related morbidity, or changes in disease patterns. Concern over those impacts might motivate the implementation of measures that would also help in climate adaptation, despite skepticism among some policy makers about anthropogenic global warming. Studies of ancient civilizations in the southwestern US lend some insight into factors that may or may not be important to successful adaptation.

  20. Image enhancement for radiography inspection

    NASA Astrophysics Data System (ADS)

    Wang, Xin; Wong, Brian Stephen; Guan, Tui Chen

    2005-04-01

    The X-ray radiographic testing method is often used as a non-destructive testing (NDT) method for detecting defects. In many cases, NDT is applied to aircraft components, welds, and similar parts, so the backgrounds are more complex than a plain piece of steel. Radiographic images are low-contrast, dark, and noisy, which makes it difficult to detect defects directly; image enhancement is therefore a significant part of an automated radiography inspection system. Histogram equalization and the median filter are the most frequently used techniques for enhancing radiographic images. In this paper, adaptive histogram equalization and contrast-limited adaptive histogram equalization are compared with plain histogram equalization, and adaptive wavelet thresholding is compared with the median filter. The comparative analysis shows that contrast-limited adaptive histogram equalization and adaptive wavelet thresholding enhance the perception of defects best.
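
    Of the techniques compared above, plain histogram equalization is the simplest to state: gray levels are remapped through the normalized cumulative histogram, spreading intensities over the full range. A minimal NumPy sketch follows; CLAHE additionally equalizes local tiles with a clip limit (e.g. scikit-image's `equalize_adapthist`) and is not shown:

```python
import numpy as np

def hist_equalize(img, levels=256):
    """Global histogram equalization of a grayscale integer image.

    Maps each gray level through the normalized cumulative histogram
    (CDF). CLAHE differs by equalizing local tiles and clipping the
    histogram to limit noise amplification.
    """
    img = np.asarray(img)
    hist = np.bincount(img.ravel(), minlength=levels)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0].min()
    # Classic CDF remapping to the full 0..levels-1 range
    lut = np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * (levels - 1))
    lut = np.clip(lut, 0, levels - 1).astype(img.dtype)
    return lut[img]

# Low-contrast test image: values squeezed into the 100..120 band
rng = np.random.default_rng(3)
dark = rng.integers(100, 121, size=(64, 64), dtype=np.uint8)
eq = hist_equalize(dark)
print(int(eq.min()), int(eq.max()))   # 0 255
```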

  1. Evaluation of breast cancer using intravoxel incoherent motion (IVIM) histogram analysis: comparison with malignant status, histological subtype, and molecular prognostic factors

    PubMed Central

    Cho, Gene Young; Moy, Linda; Kim, Sungheon G.; Baete, Steven H.; Moccaldi, Melanie; Babb, James S.; Sodickson, Daniel K.; Sigmund, Eric E.

    2016-01-01

    Purpose: To examine heterogeneous breast cancer through intravoxel incoherent motion (IVIM) histogram analysis. Materials and methods: This HIPAA-compliant, IRB-approved retrospective study included 62 patients (age 48.44±11.14 years; 50 malignant lesions and 12 benign) who underwent contrast-enhanced 3 T breast MRI and diffusion-weighted imaging. Apparent diffusion coefficient (ADC) and IVIM biomarkers of tissue diffusivity (Dt), perfusion fraction (fp), and pseudo-diffusivity (Dp) were calculated using voxel-based analysis for the whole lesion volume. Histogram analysis was performed to quantify tumour heterogeneity. Comparisons were made using Mann–Whitney tests between benign/malignant status, histological subtype, and molecular prognostic factor status, while Spearman's rank correlation was used to characterize the association between imaging biomarkers and prognostic factor expression. Results: The average values of the ADC and IVIM biomarkers, Dt and fp, showed significant differences between benign and malignant lesions. Additional significant differences were found in the histogram parameters among tumour subtypes and molecular prognostic factor status. IVIM histogram metrics, particularly fp and Dp, showed significant correlation with hormonal factor expression. Conclusion: Advanced diffusion imaging biomarkers show relationships with molecular prognostic factors and breast cancer malignancy. This analysis reveals novel diagnostic metrics that may explain some of the observed variability in treatment response among breast cancer patients. PMID:26615557
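
    The IVIM parameters reported above come from the standard biexponential signal model S(b)/S₀ = fp·exp(−b·Dp) + (1−fp)·exp(−b·Dt). The sketch below uses a common segmented fit as an illustration; it is not the authors' voxel-wise pipeline, and the b-values and tissue parameters are illustrative:

```python
import numpy as np

# IVIM signal model: S(b)/S0 = fp*exp(-b*Dp) + (1 - fp)*exp(-b*Dt)
def ivim_signal(b, fp, Dp, Dt):
    return fp * np.exp(-b * Dp) + (1 - fp) * np.exp(-b * Dt)

def segmented_ivim_fit(b, s, b_thresh=200.0):
    """Segmented IVIM fit (a common simplification).

    Above b_thresh the fast perfusion term has largely decayed, so
    log(S) versus b is nearly linear with slope -Dt; extrapolating the
    intercept back to b = 0 gives (1 - fp).
    """
    hi = b >= b_thresh
    slope, intercept = np.polyfit(b[hi], np.log(s[hi]), 1)
    Dt = -slope
    fp = 1.0 - np.exp(intercept)
    return fp, Dt

b = np.array([0, 30, 70, 100, 200, 400, 600, 800.0])   # s/mm^2
true_fp, true_Dp, true_Dt = 0.10, 10e-3, 1.0e-3         # Dp, Dt in mm^2/s
s = ivim_signal(b, true_fp, true_Dp, true_Dt)
fp, Dt = segmented_ivim_fit(b, s)
print(round(fp, 3), round(Dt * 1e3, 3))   # close to 0.1 and 1.0
```

The small residual bias comes from the perfusion term not having fully decayed at b = 200 s/mm², which is why full nonlinear fits are often preferred.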

  2. 76 FR 39233 - Federal Acquisition Regulation; Equal Opportunity for Veterans

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-05

    ... RIN 9000-AL67 Federal Acquisition Regulation; Equal Opportunity for Veterans AGENCIES: Department of... rule amending the Federal Acquisition Regulation (FAR) to implement Department of Labor (DOL) regulations on equal opportunity provisions for various categories of military veterans. The interim...

  3. 47 CFR 22.321 - Equal employment opportunities.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 22.321 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES PUBLIC MOBILE SERVICES Operational and Technical Requirements Operational Requirements § 22.321 Equal employment opportunities. Public Mobile Services licensees shall afford equal opportunity in employment to all...

  4. Equalizer: a scalable parallel rendering framework.

    PubMed

    Eilemann, Stefan; Makhinya, Maxim; Pajarola, Renato

    2009-01-01

    Continuing improvements in CPU and GPU performance, as well as increasing multi-core processor and cluster-based parallelism, demand flexible and scalable parallel rendering solutions that can exploit multipipe hardware-accelerated graphics. In fact, to achieve interactive visualization, scalable rendering systems are essential to cope with the rapid growth of data sets. However, parallel rendering systems are non-trivial to develop, and often only application-specific implementations have been proposed. The task of developing a scalable parallel rendering framework is even more difficult if it should be generic enough to support various types of data and visualization applications and, at the same time, work efficiently on a cluster with distributed graphics cards. In this paper we introduce a novel system called Equalizer, a toolkit for scalable parallel rendering based on OpenGL which provides an application programming interface (API) to develop scalable graphics applications for a wide range of systems, ranging from large distributed visualization clusters and multi-processor multipipe graphics systems to single-processor single-pipe desktop machines. We describe the system architecture and the basic API, discuss its advantages over previous approaches, and present example configurations, usage scenarios, and scalability results.

  5. Organelle size equalization by a constitutive process.

    PubMed

    Ludington, William B; Shi, Linda Z; Zhu, Qingyuan; Berns, Michael W; Marshall, Wallace F

    2012-11-20

    How cells control organelle size is an elusive problem. Two predominant models for size control can be distinguished: (1) induced control, where organelle genesis, maintenance, and disassembly are three separate programs that are activated in response to size change, and (2) constitutive control, where stable size results from the balance between continuous organelle assembly and disassembly. The problem has been studied in Chlamydomonas reinhardtii because the flagella are easy to measure, their size changes only in the length dimension, and the genetics are comparable to yeast. Length dynamics in Chlamydomonas flagella are quite robust: they maintain a length of about 12 μm and recover from amputation in about 90 min with a growth rate that decreases smoothly to zero as the length approaches 12 μm. Despite a wealth of experimental studies, existing data are consistent with both induced and constitutive control models for flagella. Here we developed novel microfluidic trapping and laser microsurgery techniques in Chlamydomonas to distinguish between length control models by measuring the two flagella on a single cell as they equilibrate after amputation of a single flagellum. The results suggest that cells equalize flagellar length by constitutive control.
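
    The constitutive ("balance-point") picture can be made concrete with a deliberately simple model, not taken from the paper: assembly rate falls with length (≈A/L, as intraflagellar transport is shared over a longer flagellum) while disassembly d is constant, so any two flagella relax to the same steady length L* = A/d regardless of starting lengths:

```python
def simulate_flagella(L0_a, L0_b, A=6.0, d=0.5, dt=0.01, T=300.0):
    """Balance-point length model: dL/dt = A/L - d for each flagellum.

    Assembly (A/L) decreases with length; disassembly (d) is constant.
    Steady state: L* = A/d, independent of initial lengths.
    Units are illustrative (um, min).
    """
    La, Lb = L0_a, L0_b
    for _ in range(int(T / dt)):        # explicit Euler integration
        La += (A / La - d) * dt
        Lb += (A / Lb - d) * dt
    return La, Lb

# 'Amputate' one flagellum: both still converge to L* = A/d = 12 um
La, Lb = simulate_flagella(L0_a=12.0, L0_b=0.5)
print(round(La, 1), round(Lb, 1))       # 12.0 12.0
```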

  6. Battery Charge Equalizer with Transformer Array

    NASA Technical Reports Server (NTRS)

    Davies, Francis

    2013-01-01

    High-power batteries generally consist of a series connection of many cells or cell banks. In order to maintain high performance over battery life, it is desirable to keep the state of charge of all the cell banks equal. A method provides individual charging for battery cells in a large, high-voltage battery array with a minimum number of transformers while maintaining reasonable efficiency. It is designed to augment a simple high-current charger that supplies the main charge energy. The innovation will form part of a larger battery charge system. It consists of a transformer array connected to the battery array through rectification and filtering circuits. The transformer array is connected to a drive circuit and a timing and control circuit that allow individual battery cells or cell banks to be charged. The timing and control circuits connect to a charge controller that uses battery instrumentation to determine which battery bank to charge. It is important to note that the innovation can charge an individual cell bank at the same time that the main battery charger is charging the high-voltage battery. The fact that the battery cell banks are at a non-zero voltage, and that they are all at similar voltages, can be used to allow charging of individual cell banks. A set of transformers can be connected with secondary windings in series to make weighted sums of the voltages on the primaries.

  7. Soft-output bidirectional decision feedback equalization technique for TDMA cellular radio

    NASA Astrophysics Data System (ADS)

    Liu, Yow-Jong; Wallace, Mark; Ketchum, W.

    1993-09-01

    Issues encountered in the design of reliable narrow-band TDMA digital cellular mobile communication systems are considered. In particular, the problem of compensating for the harsh multipath fading environment in systems whose transmission bandwidth is commensurate with the coherence bandwidth of the fading channel is considered. A new TDMA channel characterization parameter, the slot-normalized fade rate, is introduced and a novel adaptive bidirectional equalization technique, which is able to estimate the location of a deep fade within a time slot, is proposed. The simulation results show that the carrier-to-noise ratio requirement is only 15.5 dB when this equalization technique is used. This is achieved without diversity, and with low complexity. This is also 6.5 dB lower than called for in the IS-54 specification, which requires 3% BER at E_s/N_0 equal to 22 dB and vehicle speed equal to 60 mph under certain channel conditions. An equivalent equalized land/mobile radio channel model and the analytical solution for the optimal bit likelihood calculation for π/4-shift QDPSK modulation are also derived under certain channel conditions. The results are used as soft decisions for the convolutional decoder. The likelihood calculation requires an estimate of the instantaneous noise variance. A good estimate may be derived from the equalizer error signal, but care must be taken to avoid use of the error signal when the equalizer is not tracking the channel. This approach gives a 1% decoded BER with 2 dB less power than that required for hard decisions.
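The abstract's recipe of estimating the instantaneous noise variance from the equalizer error signal and using it to scale soft decisions can be sketched as follows. The one-pole smoother and the Gaussian log-likelihood-ratio form are illustrative stand-ins, not the paper's exact derivation for π/4-shift QDPSK:

```python
def estimate_noise_variance(errors, alpha=0.9):
    """Exponentially smoothed power of the equalizer error signal, used
    as a running estimate of the instantaneous noise variance
    (an illustrative stand-in; the paper derives its own estimate)."""
    var, history = 0.0, []
    for e in errors:
        var = alpha * var + (1.0 - alpha) * abs(e) ** 2
        history.append(var)
    return history

def bit_llr(y, s0, s1, noise_var):
    """Gaussian log-likelihood ratio for a bit given the equalized
    sample y and the two hypothesized symbol values; the 1/variance
    scaling is what makes the decision 'soft'."""
    return (abs(y - s0) ** 2 - abs(y - s1) ** 2) / max(noise_var, 1e-12)

errs = [0.1, -0.2, 0.15, 0.05] * 50            # illustrative error samples
sigma2 = estimate_noise_variance(errs)[-1]
llr = bit_llr(1.0 + 0.0j, -1.0, 1.0, sigma2)   # large positive: favors bit 1
```

Dividing by the variance estimate de-weights bits received during deep fades, which is what lets the convolutional decoder gain over hard decisions; as the abstract warns, the error signal must be ignored whenever the equalizer loses track of the channel.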

  8. 12 CFR 268.407 - Civil action: Equal Pay Act.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 3 2010-01-01 2010-01-01 false Civil action: Equal Pay Act. 268.407 Section... Civil action: Equal Pay Act. A complainant is authorized under section 16(b) of the Fair Labor Standards..., if the violation is willful, three years of the date of the alleged violation of the Equal Pay...

  9. 29 CFR 1614.202 - Equal Pay Act.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 4 2010-07-01 2010-07-01 false Equal Pay Act. 1614.202 Section 1614.202 Labor Regulations... OPPORTUNITY Provisions Applicable to Particular Complaints § 1614.202 Equal Pay Act. (a) In its enforcement of the Equal Pay Act, the Commission has the authority to investigate an agency's employment practices...

  10. 29 CFR 1620.25 - Equalization of rates.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 29 Labor 4 2012-07-01 2012-07-01 false Equalization of rates. 1620.25 Section 1620.25 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION THE EQUAL PAY ACT § 1620.25... hiring or transferring employees to perform the previously lower-paid job at the lower rate....

  11. 76 FR 21221 - National Equal Pay Day, 2011

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-15

    ... economic equality for all, regardless of gender. When the Equal Pay Act was signed into law in 1963, women... exist for working women, who still earn less on average than working men. Each year, National Equal Pay... harder to close the gaps that still exist. At a time when families across this country are struggling...

  12. 48 CFR 852.211-73 - Brand name or equal.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Brand name or equal. 852... Brand name or equal. As prescribed in 811.104-71, insert the following clause: Brand Name or Equal (JAN 2008) (Note: As used in this clause, the term “brand name” includes identification of products by...

  13. 48 CFR 1852.210-70 - Brand name or equal.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Brand name or equal. 1852... 1852.210-70 Brand name or equal. As prescribed in 1810.011-70(a), insert the following provision: Brand Name or Equal (DEC 1988) (a) As used in this provision, “brand name” means identification of...

  14. 48 CFR 452.211-70 - Brand Name or Equal.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false Brand Name or Equal. 452... FORMS SOLICITATION PROVISIONS AND CONTRACT CLAUSES Texts of Provisions and Clauses 452.211-70 Brand Name or Equal. As prescribed in 411.171, insert the following provision: Brand Name or Equal (NOV...

  15. 12 CFR 528.5 - Equal Housing Lender Poster.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 6 2013-01-01 2012-01-01 true Equal Housing Lender Poster. 528.5 Section 528.5... REQUIREMENTS § 528.5 Equal Housing Lender Poster. (a) Each savings association shall post and maintain one or more Equal Housing Lender Posters, the text of which is prescribed in paragraph (b) of this section,...

  16. 12 CFR 390.146 - Equal Housing Lender Poster.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 5 2013-01-01 2013-01-01 false Equal Housing Lender Poster. 390.146 Section....146 Equal Housing Lender Poster. (a) Each State savings association shall post and maintain one or more Equal Housing Lender Posters, the text of which is prescribed in paragraph (b) of this section,...

  17. 12 CFR 528.5 - Equal Housing Lender Poster.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 6 2014-01-01 2012-01-01 true Equal Housing Lender Poster. 528.5 Section 528.5... REQUIREMENTS § 528.5 Equal Housing Lender Poster. (a) Each savings association shall post and maintain one or more Equal Housing Lender Posters, the text of which is prescribed in paragraph (b) of this section,...

  18. 12 CFR 528.5 - Equal Housing Lender Poster.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 5 2011-01-01 2011-01-01 false Equal Housing Lender Poster. 528.5 Section 528... REQUIREMENTS § 528.5 Equal Housing Lender Poster. (a) Each savings association shall post and maintain one or more Equal Housing Lender Posters, the text of which is prescribed in paragraph (b) of this section,...

  19. 12 CFR 626.6025 - Equal housing lender poster.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 7 2014-01-01 2014-01-01 false Equal housing lender poster. 626.6025 Section... § 626.6025 Equal housing lender poster. (a) Each Farm Credit institution that makes loans for the... by a dwelling shall post and maintain an Equal Housing Lender Poster in the lobby of each of...

  20. 12 CFR 626.6025 - Equal housing lender poster.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 6 2011-01-01 2011-01-01 false Equal housing lender poster. 626.6025 Section... § 626.6025 Equal housing lender poster. (a) Each Farm Credit institution that makes loans for the... by a dwelling shall post and maintain an Equal Housing Lender Poster in the lobby of each of...

  1. 12 CFR 626.6025 - Equal housing lender poster.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 7 2013-01-01 2013-01-01 false Equal housing lender poster. 626.6025 Section... § 626.6025 Equal housing lender poster. (a) Each Farm Credit institution that makes loans for the... by a dwelling shall post and maintain an Equal Housing Lender Poster in the lobby of each of...

  2. Reflections on Mainstreaming Gender Equality in Adult Basic Education Programmes

    ERIC Educational Resources Information Center

    Lind, Agneta

    2006-01-01

    This article is about mainstreaming gender equality in adult basic learning and education (ABLE). Gender equality is defined as the equal right of both women and men to influence, participate in and benefit from a programme. It is argued that specific gender analyses of emerging patterns of gender relations are helpful in formulating gender equality…

  3. 28 CFR 42.304 - Written equal employment opportunity program.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 28 Judicial Administration 1 2012-07-01 2012-07-01 false Written equal employment opportunity program. 42.304 Section 42.304 Judicial Administration DEPARTMENT OF JUSTICE NONDISCRIMINATION; EQUAL EMPLOYMENT OPPORTUNITY; POLICIES AND PROCEDURES Equal Employment Opportunity Program Guidelines §...

  4. 28 CFR 42.304 - Written equal employment opportunity program.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 28 Judicial Administration 1 2010-07-01 2010-07-01 false Written equal employment opportunity program. 42.304 Section 42.304 Judicial Administration DEPARTMENT OF JUSTICE NONDISCRIMINATION; EQUAL EMPLOYMENT OPPORTUNITY; POLICIES AND PROCEDURES Equal Employment Opportunity Program Guidelines §...

  5. 28 CFR 42.304 - Written equal employment opportunity program.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 28 Judicial Administration 1 2014-07-01 2014-07-01 false Written equal employment opportunity program. 42.304 Section 42.304 Judicial Administration DEPARTMENT OF JUSTICE NONDISCRIMINATION; EQUAL EMPLOYMENT OPPORTUNITY; POLICIES AND PROCEDURES Equal Employment Opportunity Program Guidelines §...

  6. 28 CFR 42.304 - Written equal employment opportunity program.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 28 Judicial Administration 1 2011-07-01 2011-07-01 false Written equal employment opportunity program. 42.304 Section 42.304 Judicial Administration DEPARTMENT OF JUSTICE NONDISCRIMINATION; EQUAL EMPLOYMENT OPPORTUNITY; POLICIES AND PROCEDURES Equal Employment Opportunity Program Guidelines §...

  7. 48 CFR 19.202-3 - Equal low bids.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Equal low bids. 19.202-3 Section 19.202-3 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SOCIOECONOMIC PROGRAMS SMALL BUSINESS PROGRAMS Policies 19.202-3 Equal low bids. In the event of equal low bids (see...

  8. 48 CFR 14.408-6 - Equal low bids.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Equal low bids. 14.408-6... AND CONTRACT TYPES SEALED BIDDING Opening of Bids and Award of Contract 14.408-6 Equal low bids. (a) Contracts shall be awarded in the following order of priority when two or more low bids are equal in...

  9. 29 CFR 530.414 - Equal Access to Justice Act.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 3 2010-07-01 2010-07-01 false Equal Access to Justice Act. 530.414 Section 530.414 Labor... OF HOMEWORKERS IN CERTAIN INDUSTRIES Administrative Procedures § 530.414 Equal Access to Justice Act. Proceedings under this part are not subject to the provisions of the Equal Access to Justice Act. In...

  10. 12 CFR 626.6025 - Equal housing lender poster.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...) The Equal Housing Lender Poster shall be at least 11 inches by 14 inches in size, and shall bear the... 12 Banks and Banking 6 2010-01-01 2010-01-01 false Equal housing lender poster. 626.6025 Section... § 626.6025 Equal housing lender poster. (a) Each Farm Credit institution that makes loans for...

  11. The Fate of Equality in a Technological Civilization

    ERIC Educational Resources Information Center

    Stivers, Richard

    2008-01-01

    The meaning of equality has been radically altered since the Enlightenment. In the 18th century, equality acquired political and economic meanings specifically in the contexts of democracy and capitalism. Today, the context in which equality is understood and practiced is technology as our most immediate and compelling environment. Moreover, the…

  12. 47 CFR 22.321 - Equal employment opportunities.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... employment rights and their right to notify the Equal Employment Opportunity Commission (EEOC), the Federal... 47 Telecommunication 2 2014-10-01 2014-10-01 false Equal employment opportunities. 22.321 Section... MOBILE SERVICES Operational and Technical Requirements Operational Requirements § 22.321 Equal...

  13. 12 CFR 626.6025 - Equal housing lender poster.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... by the Equal Credit Opportunity Act Amendments of 1976) IT IS ILLEGAL TO DISCRIMINATE IN ANY CREDIT... 12 Banks and Banking 7 2012-01-01 2012-01-01 false Equal housing lender poster. 626.6025 Section... § 626.6025 Equal housing lender poster. (a) Each Farm Credit institution that makes loans for...

  14. 47 CFR 22.321 - Equal employment opportunities.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... employment rights and their right to notify the Equal Employment Opportunity Commission (EEOC), the Federal... 47 Telecommunication 2 2012-10-01 2012-10-01 false Equal employment opportunities. 22.321 Section... MOBILE SERVICES Operational and Technical Requirements Operational Requirements § 22.321 Equal...

  15. Pushing Economies (and Students) outside the Factor Price Equalization Zone

    ERIC Educational Resources Information Center

    Oslington, Paul; Towers, Isaac

    2009-01-01

    Despite overwhelming empirical evidence of the failure of factor price equalization, most teaching of international trade theory (even at the graduate level) assumes that economies are incompletely specialized and that factor price equalization holds. The behavior of trading economies in the absence of factor price equalization is not well…

  16. Structure-Property Relationships in Atomic-Scale Junctions: Histograms and Beyond.

    PubMed

    Hybertsen, Mark S; Venkataraman, Latha

    2016-03-15

    Over the past 10 years, there has been tremendous progress in the measurement, modeling and understanding of structure-function relationships in single molecule junctions. Numerous research groups have addressed significant scientific questions, directed both to conductance phenomena at the single molecule level and to the fundamental chemistry that controls junction functionality. Many different functionalities have been demonstrated, including single-molecule diodes, optically and mechanically activated switches, and, significantly, physical phenomena with no classical analogues, such as those based on quantum interference effects. Experimental techniques for reliable and reproducible single molecule junction formation and characterization have led to this progress. In particular, the scanning tunneling microscope based break-junction (STM-BJ) technique has enabled rapid, sequential measurement of large numbers of nanoscale junctions allowing a statistical analysis to readily distinguish reproducible characteristics. Harnessing fundamental link chemistry has provided the necessary chemical control over junction formation, enabling measurements that revealed clear relationships between molecular structure and conductance characteristics. Such link groups (amines, methylsulfides, pyridines, etc.) maintain a stable lone pair configuration that selectively bonds to specific, undercoordinated transition metal atoms available following rupture of a metal point contact in the STM-BJ experiments. This basic chemical principle rationalizes the observation of highly reproducible conductance signatures. Subsequently, the method has been extended to probe a variety of physical phenomena ranging from basic I-V characteristics to more complex properties such as thermopower and electrochemical response.
By adapting the technique to a conducting cantilever atomic force microscope (AFM-BJ), simultaneous measurement of the mechanical characteristics of nanoscale junctions as they

  17. Employment equality legislation, 3 March 1988.

    PubMed

    1988-01-01

    On 1 April 1988, new employment equality legislation came into effect in Israel. The new legislation outlaws discrimination at work on the grounds of sex, marital status, and parenthood with respect to recruitment, terms of employment, promotion, vocational training, retraining, dismissal, and severance pay. Under the legislation, 1) employers may not cause prejudice to workers who allege discrimination, help others to do so, or decline sexual advances by a direct or indirect supervisor; 2) the burden of proof in discrimination claims against an employer is on the employer if the worker can show that requirements set by the employer have been met; 3) company managers and co-owners in a partnership are personally liable for violations on the part of the employer if the firm has over six workers unless they prove that the offense was committed without their knowledge or that they had taken all appropriate measures to prevent it; and 4) no special rights given to women by law, collective agreement, or other work contract are to be considered discrimination. The legislation establishes a Public Council to advise the Minister of Labour and Social Affairs on implementing and publicizing the legislation. It also allows a father to receive the following work benefits that were previously restricted to mothers: 1) leave of absence to care for a sick child and 2) statutory leave and statutory entitlement to severance pay for resigning to care for a newborn or adopted baby if the father is the sole guardian or if the mother renounces her right because she is working.

  18. Data-Driven Approach to Generating Achievable Dose-Volume Histogram Objectives in Intensity-Modulated Radiotherapy Planning

    SciTech Connect

    Wu Binbin; Ricchetti, Francesco; Sanguineti, Giuseppe; Kazhdan, Michael; Simari, Patricio; Jacques, Robert; Taylor, Russell; McNutt, Todd

    2011-03-15

    Purpose: To propose a method of intensity-modulated radiotherapy (IMRT) planning that generates achievable dose-volume histogram (DVH) objectives using a database containing geometric and dosimetric information of previous patients. Methods and Materials: The overlap volume histogram (OVH) is used to compare the spatial relationships between the organs at risk and targets of a new patient with those of previous patients in a database. From the OVH analysis, the DVH objectives of the new patient were generated from the database and used as the initial planning goals. In a retrospective OVH-assisted planning demonstration, 15 patients were randomly selected from a database containing clinical plans (CPs) of 91 previous head-and-neck patients treated by a three-level IMRT-simultaneous integrated boost technique. OVH-assisted plans (OPs) were planned in a leave-one-out manner by a planner who had no knowledge of the CPs; thus, the DVH objectives of an OP were generated from a subdatabase containing the information of the other 90 patients. Those DVH objectives were then used as the initial planning goals in IMRT optimization. Planning efficiency was evaluated by the number of clicks of the 'Start Optimization' button in the course of planning. Although the Pinnacle3 treatment planning system allows planners to interactively adjust the DVH parameters during optimization, planners in our institution have never used this function in planning. Results: The average numbers of clicks required to complete a CP and an OP were 27.6 and 1.9, respectively (p < .00001); three OPs were finished within a single click. Ten more patients' cord + 4 mm reached the sparing goal D0.1cc < 44 Gy (p < .0001), where D0.1cc represents the dose corresponding to 0.1 cc. For planning target volume uniformity, conformity, and other organ-at-risk sparing, the OPs were at least comparable with the CPs. Additionally, the averages of D0.1cc to the cord + 4 mm decreased by 6.9 Gy (p < .0001

  19. BEDVH--A method for evaluating biologically effective dose volume histograms: Application to eye plaque brachytherapy implants

    SciTech Connect

    Gagne, Nolan L.; Leonard, Kara L.; Huber, Kathryn E.; Mignano, John E.; Duker, Jay S.; Laver, Nora V.; Rivard, Mark J.

    2012-02-15

    Purpose: A method is introduced to examine the influence of implant duration T, radionuclide, and radiobiological parameters on the biologically effective dose (BED) throughout the entire volume of regions of interest for episcleral brachytherapy using available radionuclides. This method is employed to evaluate a particular eye plaque brachytherapy implant in a radiobiological context. Methods: A reference eye geometry and a 16 mm COMS eye plaque loaded with 103Pd, 125I, or 131Cs sources were examined with dose distributions accounting for plaque heterogeneities. For a standardized 7-day implant, doses to 90% of the tumor volume (tumor D90) and 10% of the organ-at-risk volumes (OAR D10) were calculated. The BED equation from Dale and Jones and published α/β and μ parameters were incorporated with dose-volume histograms (DVHs) for various T values such as T = 7 days (i.e., tumor 7BED10 and OAR 7BED10). By calculating BED throughout the volumes, biologically effective dose-volume histograms (BEDVHs) were developed for tumor and OARs. The influence of T, radionuclide choice, and radiobiological parameters on tumor BEDVH and OAR BEDVH was examined. The nominal dose was scaled for shorter implants to achieve biological equivalence. Results: Tumor D90 values were 102, 112, and 110 Gy for 103Pd, 125I, and 131Cs, respectively. Corresponding tumor 7BED10 values were 124, 140, and 138 Gy, respectively. As T decreased from 7 to 0.01 days, the isobiologically effective prescription dose decreased by a factor of three. As expected, tumor 7BEDVH did not significantly change as a function of radionuclide half-life but varied by 10% due to radionuclide dose distribution. Variations in reported radiobiological parameters caused tumor 7BED10 to deviate by up to 46%. Over the range of OAR
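For reference, the Dale and Jones BED mentioned above takes, for continuous irradiation at a constant dose rate R over time T, the standard form quoted from the general radiobiology literature (not reproduced from this abstract; μ is the sublethal-damage repair rate constant):

```latex
\mathrm{BED} \;=\; R\,T\left[\,1 \;+\; \frac{2R}{\mu\,(\alpha/\beta)}
  \left(1 - \frac{1 - e^{-\mu T}}{\mu T}\right)\right]
```

For a decaying brachytherapy source, the constant rate R is replaced by a time-dependent R(t) = R0 e^(-λt), which is what introduces the radionuclide half-life dependence the abstract examines.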

  20. SU-E-T-430: Feasibility Study On Using Overlap Volume Histogram to Predict the Dose Difference by Respiratory Motion

    SciTech Connect

    Shin, D; Kang, S; Kim, D; Kim, T; Kim, K; Cho, M; Suh, T

    2015-06-15

    Purpose: The difference between three-dimensional (3D) dose and 4D dose, which accounts for respiratory motion, can vary according to the geometrical relationship between the planning target volume (PTV) and an organ at risk (OAR). The purpose of this study is to investigate the dose difference between 3D and 4D dose using the overlap volume histogram (OVH), an indicator that quantifies the geometrical relationship between a PTV and an OAR. Methods: Five liver cancer patients who had previously been treated with stereotactic body radiotherapy (SBRT) were investigated. Four-dimensional computed tomography (4DCT) images were acquired for all patients. ITV-based treatment planning was performed. The 3D dose was calculated on the end-exhale phase image as a reference phase image. 4D dose accumulation was implemented from all phase images using a dose-warping technique based on a deformable image registration (DIR) algorithm (Horn and Schunck optical flow) in DIRART. In this study the OVH was used to quantify the geometrical relationship between a PTV and an OAR. The OVH between the PTV and a selected OAR was generated for each patient case and compared across all cases. The dose difference between 3D and 4D dose for the normal organ was calculated and compared for all cases according to the OVH. Results: The difference between 3D and 4D dose for the OAR was analyzed using dose-volume histograms (DVHs). On the basis of a specific point corresponding to 10% of the OAR volume overlapped with the expanded PTV, the mean dose difference was 34.56% in the minimum-OVH-distance case and 13.36% in the maximum-OVH-distance case. As the OVH distance increased, the mean dose difference between 4D and 3D dose decreased. Conclusion: A consistent trend in the dose difference was verified according to the OVH. The OVH seems to be an indicator with the potential to predict the dose difference between 4D and 3D dose. This work was supported by the Radiation Technology R&D program (No. 
2013M2A2A7043498) and the Mid-career Researcher Program (2014R1A2A1A10050270) through
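The overlap volume histogram (OVH) relied on in the two planning abstracts above reduces to a simple computation: for each expansion distance r, expand the PTV by r and record the fraction of the OAR volume it covers. A minimal voxel-grid sketch (the function name and the brute-force nearest-distance computation are illustrative assumptions, fine for tiny grids):

```python
import numpy as np

def overlap_volume_histogram(ptv_mask, oar_mask, expansions):
    """Fraction of OAR voxels within distance r of the PTV, for each
    expansion distance r (in voxel units)."""
    ptv_pts = np.argwhere(ptv_mask)
    oar_pts = np.argwhere(oar_mask)
    # Distance from every OAR voxel to its nearest PTV voxel (brute force).
    d = np.min(np.linalg.norm(
        oar_pts[:, None, :] - ptv_pts[None, :, :], axis=-1), axis=1)
    return [float(np.mean(d <= r)) for r in expansions]

# Toy geometry on a 1 x 8 grid: PTV in columns 0-1, OAR in columns 3-6.
grid = np.zeros((1, 8), dtype=bool)
ptv = grid.copy(); ptv[0, :2] = True
oar = grid.copy(); oar[0, 3:7] = True
ovh = overlap_volume_histogram(ptv, oar, expansions=[0, 2, 3, 5])
# -> [0.0, 0.25, 0.5, 1.0]: coverage rises monotonically with expansion
```

The resulting curve is monotone in r; reading off, for example, the distance at which 10% of the OAR is covered gives the kind of OVH point used above to compare patient geometries.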

  1. Toothbrush Adaptations.

    ERIC Educational Resources Information Center

    Exceptional Parent, 1987

    1987-01-01

    Suggestions are presented for helping disabled individuals learn to use or adapt toothbrushes for proper dental care. A directory lists dental health instructional materials available from various organizations. (CB)

  2. Equal Pay for Equal Qualifications? A Model for Determining Race or Sex Discrimination in Salaries. AIR Forum Paper 1978.

    ERIC Educational Resources Information Center

    Muffo, John; And Others

    Equal pay for equal work by persons of equal qualifications is the concept behind laws against race and sex discrimination in salaries in the United States. However, determining the existence and extent of discrimination is not a simple matter. A four-step procedure is recommended that attempts to uncover the existence of discrimination and begins…

  3. Prism adaptation by mental practice.

    PubMed

    Michel, Carine; Gaveau, Jérémie; Pozzo, Thierry; Papaxanthis, Charalambos

    2013-09-01

    The prediction of our actions and their interaction with the external environment is critical for sensorimotor adaptation. For instance, during prism exposure, which laterally deviates our visual field, we progressively correct movement errors by combining sensory feedback with forward model sensory predictions. However, very often we project our actions to the external environment without physically interacting with it (e.g., mental actions). An intriguing question is whether adaptation will occur if we imagine, instead of executing, an arm movement while wearing prisms. Here, we investigated prism adaptation during mental actions. In the first experiment, participants (n = 54) performed arm pointing movements before and after exposure to the optical device. They were equally divided into six groups according to prism exposure: Prisms-Active, Prisms-Imagery, Prisms-Stationary, Prisms-Stationary-Attention, No Conflict-Prisms-Imagery, No Prisms-Imagery. Adaptation, measured by the difference in pointing errors between pre-test and post-test, occurred only in the Prisms-Active and Prisms-Imagery conditions. The second experiment confirmed the results of the first experiment and further showed that sensorimotor adaptation was mainly due to proprioceptive realignment in both the Prisms-Active (n = 10) and Prisms-Imagery (n = 10) groups. In both experiments adaptation was greater following actual than imagined pointing movements. The present results are the first demonstration of prism adaptation by mental practice under prism exposure and they are discussed in terms of internal forward models and sensorimotor plasticity.

  4. Highly sensitive image-derived indices of water-stressed plants using hyperspectral imaging in SWIR and histogram analysis

    NASA Astrophysics Data System (ADS)

    Kim, David M.; Zhang, Hairong; Zhou, Haiying; Du, Tommy; Wu, Qian; Mockler, Todd C.; Berezin, Mikhail Y.

    2015-11-01

    The optical signature of leaves is an important monitoring and predictive parameter for a variety of biotic and abiotic stresses, including drought. Such signatures derived from spectroscopic measurements provide vegetation indices - a quantitative method for assessing plant health. However, the commonly used metrics suffer from low sensitivity. Relatively small changes in water content in moderately stressed plants demand high-contrast imaging to distinguish affected plants. We present a new approach in deriving sensitive indices using hyperspectral imaging in a short-wave infrared range from 800 nm to 1600 nm. Our method, based on high spectral resolution (1.56 nm) instrumentation and image processing algorithms (quantitative histogram analysis), enables us to distinguish a moderate water stress equivalent of 20% relative water content (RWC). The identified image-derived indices 15XX nm/14XX nm (i.e. 1529 nm/1416 nm) were superior to common vegetation indices, such as WBI, MSI, and NDWI, with significantly better sensitivity, enabling early diagnostics of plant health.
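The image-derived index described above amounts to a per-pixel ratio of two narrow SWIR bands followed by histogram analysis of the ratio image. A minimal sketch, with only the 1529 nm / 1416 nm band pair taken from the abstract; the cube layout and reflectance values are illustrative:

```python
import numpy as np

def ratio_index(cube, wavelengths, num_nm=1529.0, den_nm=1416.0):
    """Per-pixel ratio of the two hyperspectral bands nearest the
    requested wavelengths, e.g. the 1529 nm / 1416 nm index."""
    wavelengths = np.asarray(wavelengths)
    i = int(np.argmin(np.abs(wavelengths - num_nm)))
    j = int(np.argmin(np.abs(wavelengths - den_nm)))
    return cube[..., i] / np.clip(cube[..., j], 1e-9, None)

# Toy cube: 2 pixels x 1 column x 2 bands at 1416 nm and 1529 nm.
wl = [1416.0, 1529.0]
cube = np.array([[[0.40, 0.20]],    # ratio 0.5 (stressed-looking pixel)
                 [[0.40, 0.40]]])   # ratio 1.0
idx = ratio_index(cube, wl)
hist, edges = np.histogram(idx, bins=4, range=(0.0, 2.0))
```

The histogram of the ratio image, rather than a single mean value, is what the authors analyze quantitatively; shifts in its shape flag moderately stressed plants that a coarse scalar index would miss.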

  5. Approach to Privacy-Preserve Data in Two-Tiered Wireless Sensor Network Based on Linear System and Histogram

    NASA Astrophysics Data System (ADS)

    Dang, Van H.; Wohlgemuth, Sven; Yoshiura, Hiroshi; Nguyen, Thuc D.; Echizen, Isao

    Wireless sensor network (WSN) has been one of the key technologies for the future, with broad applications from the military to everyday life [1,2,3,4,5]. There are two kinds of WSN models: models with sensors for sensing data and a sink for receiving and processing queries from users, and models with special additional nodes capable of storing large amounts of data from sensors and processing queries from the sink. Among the latter type, a two-tiered model [6,7] has been widely adopted because of its storage and energy-saving benefits for weak sensors, as proved by the advent of commercial storage-node products such as Stargate [8] and RISE. However, by concentrating storage in certain nodes, this model becomes more vulnerable to attack. Our novel technique, called zip-histogram, contributes to solving the problems of previous studies [6,7] by protecting the stored data's confidentiality and integrity (including data from the sensors and queries from the sink) against attackers who might target storage nodes in two-tiered WSNs.

  6. Highly sensitive image-derived indices of water-stressed plants using hyperspectral imaging in SWIR and histogram analysis.

    PubMed

    Kim, David M; Zhang, Hairong; Zhou, Haiying; Du, Tommy; Wu, Qian; Mockler, Todd C; Berezin, Mikhail Y

    2015-11-04

    The optical signature of leaves is an important monitoring and predictive parameter for a variety of biotic and abiotic stresses, including drought. Such signatures derived from spectroscopic measurements provide vegetation indices - a quantitative method for assessing plant health. However, the commonly used metrics suffer from low sensitivity. Relatively small changes in water content in moderately stressed plants demand high-contrast imaging to distinguish affected plants. We present a new approach in deriving sensitive indices using hyperspectral imaging in a short-wave infrared range from 800 nm to 1600 nm. Our method, based on high spectral resolution (1.56 nm) instrumentation and image processing algorithms (quantitative histogram analysis), enables us to distinguish a moderate water stress equivalent of 20% relative water content (RWC). The identified image-derived indices 15XX nm/14XX nm (i.e. 1529 nm/1416 nm) were superior to common vegetation indices, such as WBI, MSI, and NDWI, with significantly better sensitivity, enabling early diagnostics of plant health.

  7. Safety surrogate histograms (SSH): A novel real-time safety assessment of dilemma zone related conflicts at signalized intersections.

    PubMed

    Ghanipoor Machiani, Sahar; Abbas, Montasir

    2016-11-01

    Drivers' indecisiveness in dilemma zones (DZ) could result in crash-prone situations at signalized intersections. The DZ is the area ahead of an intersection in which drivers encounter a dilemma regarding whether to stop or proceed through the intersection when the signal turns yellow. An improper decision to stop by the leading driver, combined with the following driver deciding to go, can result in a rear-end collision, unless the following driver recognizes a collision is imminent and adjusts his or her behavior at or shortly after the onset of yellow. Considering the significance of DZ-related crashes, a comprehensive safety measure is needed to characterize the level of safety at signalized intersections. In this study, a novel safety surrogate measure was developed utilizing real-time radar field data. This new measure, called the safety surrogate histogram (SSH), captures the degree and frequency of DZ-related conflicts at each intersection approach. The SSH includes detailed information regarding the possibility of crashes, because it is calculated based on vehicle conflicts. An example illustrating the application of the new methodology at two study sites in Virginia is presented and discussed, and a comparison is provided between the SSH and other DZ-related safety surrogate measures mentioned in the literature. The results of the study reveal the efficacy of the SSH as a complement to existing surrogate measures.
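The SSH idea, counting DZ-related conflicts by severity for one intersection approach, can be sketched with plain binning. The severity metric and bin edges below are illustrative assumptions; the paper derives its conflicts from radar-tracked vehicle trajectories:

```python
def safety_surrogate_histogram(conflict_severities, bin_edges):
    """Count conflicts per severity bin for one approach: a minimal
    stand-in for the SSH (the severity metric, e.g. a required-deceleration
    or time-to-collision deficit, and the bins are assumptions)."""
    counts = [0] * (len(bin_edges) - 1)
    for s in conflict_severities:
        for k in range(len(bin_edges) - 1):
            if bin_edges[k] <= s < bin_edges[k + 1]:
                counts[k] += 1
                break
    return counts

# Conflict severities observed over one day (illustrative units):
ssh = safety_surrogate_histogram([0.2, 0.7, 1.4, 1.6, 2.9],
                                 bin_edges=[0, 1, 2, 3])
# -> [2, 2, 1]: two mild, two moderate, one severe conflict
```

Keeping the full histogram, rather than a single count or mean, preserves both the frequency and the degree of conflicts, which is the stated advantage of the SSH over scalar surrogate measures.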

  8. O(1) time algorithms for computing histogram and Hough transform on a cross-bridge reconfigurable array of processors

    SciTech Connect

    Kao, T.; Horng, S.; Wang, Y.

    1995-04-01

    Instead of using the base-2 number system, we use a base-m number system to represent the numbers used in the proposed algorithms. Such a strategy can be used to design an O(T) time, T = (log_m N) + 1, prefix sum algorithm for an N-bit binary sequence on a cross-bridge reconfigurable array of processors using N processors, where the data bus is m bits wide. This basic operation can then be used to compute the histogram of an n x n image with G gray levels in constant time using G x n x n processors, and to compute the Hough transform of an image with N edge pixels and an n x n parameter space in constant time using n x n x N processors, respectively. This result is better than the previously known results proposed in the literature. Also, the execution time of the proposed algorithms is tunable by the bus bandwidth. 43 refs.
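
    As a sequential reference for what the array computes in constant time, the gray-level histogram itself is straightforward (variable names are ours):

```python
def image_histogram(image, gray_levels):
    # Sequential reference for the histogram that the cross-bridge
    # reconfigurable array computes in O(1) time with G x n x n processors:
    # count how many pixels take each gray-level value 0..G-1.
    hist = [0] * gray_levels
    for row in image:
        for v in row:
            hist[v] += 1
    return hist
```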

  9. Highly sensitive image-derived indices of water-stressed plants using hyperspectral imaging in SWIR and histogram analysis

    PubMed Central

    Kim, David M.; Zhang, Hairong; Zhou, Haiying; Du, Tommy; Wu, Qian; Mockler, Todd C.; Berezin, Mikhail Y.

    2015-01-01

    The optical signature of leaves is an important monitoring and predictive parameter for a variety of biotic and abiotic stresses, including drought. Such signatures derived from spectroscopic measurements provide vegetation indices – a quantitative method for assessing plant health. However, the commonly used metrics suffer from low sensitivity. Relatively small changes in water content in moderately stressed plants demand high-contrast imaging to distinguish affected plants. We present a new approach in deriving sensitive indices using hyperspectral imaging in a short-wave infrared range from 800 nm to 1600 nm. Our method, based on high spectral resolution (1.56 nm) instrumentation and image processing algorithms (quantitative histogram analysis), enables us to distinguish a moderate water stress equivalent of 20% relative water content (RWC). The identified image-derived indices 15XX nm/14XX nm (i.e. 1529 nm/1416 nm) were superior to common vegetation indices, such as WBI, MSI, and NDWI, with significantly better sensitivity, enabling early diagnostics of plant health. PMID:26531782
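
    The core of the reported index is a per-pixel band ratio whose values are then histogrammed. A minimal sketch (the epsilon guard and names are ours; the 1529 nm/1416 nm band pair is the example given in the abstract):

```python
def band_ratio_index(band_a, band_b, eps=1e-12):
    # Per-pixel ratio image, e.g. reflectance at 1529 nm over 1416 nm.
    # The resulting index values can then be histogrammed to separate
    # water-stressed from unstressed canopy pixels.
    return [[a / (b + eps) for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(band_a, band_b)]
```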

  10. Comparative study of old and new versions of treatment planning system using dose volume histogram indices of clinical plans

    PubMed Central

    Krishna, Gangarapu Sri; Srinivas, Vuppu; Ayyangar, K. M.; Reddy, Palreddy Yadagiri

    2016-01-01

    Recently, Eclipse treatment planning system (TPS) version 8.8 was upgraded to the latest version 13.6. It is customary that the vendor gives training on how to upgrade the existing software to the new version. However, the customer is given few details about internal changes in the new software version. According to the manufacturer, accuracy of point dose calculations and irregular treatment planning is better in the new version (13.6) compared to the old version (8.8). Furthermore, the new version uses voxel-based calculations while the earlier version used point dose calculations. Major differences in intensity-modulated radiation therapy (IMRT) plans were observed between the two versions after re-optimization and re-calculation. However, only minor differences were observed for IMRT cases after re-calculation alone. It is recommended that TPS quality assurance be performed after any major software upgrade. This can be done by performing dose calculation comparisons in the TPS. To assess the difference between the versions, 25 clinical cases from the old version were compared keeping all the patient data intact, including the monitor units, and comparing the differences in dose calculations using dose volume histogram (DVH) analysis. Along with DVH analysis, uniformity index, conformity index, homogeneity index, and dose spillage index were also compared for both versions. The results of the comparative study are presented in this paper. PMID:27651566
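
    One of the compared metrics, the homogeneity index, can be computed directly from DVH data. The sketch below uses the common ICRU-83-style definition HI = (D2% - D98%)/D50%, which may differ from the exact formula used in the paper:

```python
def dose_at_volume(voxel_doses, pct):
    # D_pct%: the minimum dose received by the hottest pct% of voxels,
    # read off the cumulative dose volume histogram.
    s = sorted(voxel_doses, reverse=True)
    k = max(1, round(len(s) * pct / 100.0))
    return s[k - 1]

def homogeneity_index(voxel_doses):
    # HI = (D2% - D98%) / D50%; 0 indicates a perfectly uniform dose.
    return ((dose_at_volume(voxel_doses, 2) - dose_at_volume(voxel_doses, 98))
            / dose_at_volume(voxel_doses, 50))
```

    Running both TPS versions' dose grids through the same index functions makes the version-to-version comparison independent of either version's internal reporting.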

  11. Analytic treatment of the compound action potential: Estimating the summed post-stimulus time histogram and unit response

    NASA Astrophysics Data System (ADS)

    Chertoff, Mark E.

    2004-11-01

    The convolution of an equation representing a summed post-stimulus time histogram computed across auditory nerve fibers [P(t)] with an equation representing a single-unit wave form [U(t)] resulted in an analytic expression for the compound action potential (CAP). The solution was fit to CAPs recorded to low and high frequency stimuli at various signal levels. The correlation between the CAP and the analytic expression was generally greater than 0.90. At high levels the width of P(t) was broader for low frequency stimuli than for high frequency signals, but delays were comparable. This indicates that at high signal levels there is an overlap in the population of auditory nerve fibers contributing to the CAP for both low and high frequency stimuli, but low frequencies include contributions from more apical regions. At low signal levels the width of P(t) decreased for most frequencies and delays increased. The frequency of oscillation of U(t) was largest for high frequency stimuli and decreased for low frequency stimuli. The decay of U(t) was largest at 8 kHz and smallest at 1 kHz. These results indicate that the hair cell or neural mechanisms involved in the generation of action potentials may differ along the cochlear partition.
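
    The model at the heart of the paper, CAP(t) = (P * U)(t), is a discrete convolution once P and U are sampled. A minimal sketch (names and the bin-width scaling are ours):

```python
def compound_action_potential(p, u, dt=1.0):
    # CAP as the convolution of the summed post-stimulus time histogram
    # P(t) with the single-unit waveform U(t), scaled by the bin width dt.
    cap = [0.0] * (len(p) + len(u) - 1)
    for i, pi in enumerate(p):
        for j, uj in enumerate(u):
            cap[i + j] += pi * uj * dt
    return cap
```

    Fitting parametric forms of P(t) and U(t) to a recorded CAP then amounts to adjusting their parameters until this convolution matches the recording.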

  12. Cortical Magnification Plus Cortical Plasticity Equals Vision?

    PubMed Central

    Born, Richard T.; Trott, Alexander; Hartmann, Till

    2014-01-01

    Most approaches to visual prostheses have focused on the retina, and for good reasons. The earlier that one introduces signals into the visual system, the more one can take advantage of its prodigious computational abilities. For methods that make use of microelectrodes to introduce electrical signals, however, the limited density and volume-occupying nature of the electrodes place severe limits on the image resolution that can be provided to the brain. In this regard, non-retinal areas in general, and the primary visual cortex in particular, possess one large advantage: “magnification factor” (MF)—a value that represents the distance across a sheet of neurons that represents a given angle of the visual field. In the foveal representation of primate primary visual cortex, the MF is enormous—on the order of 15–20 mm/deg in monkeys and humans, whereas on the retina, the MF is limited by the optical design of the eye to around 0.3 mm/deg. This means that, for an electrode array of a given density, a much higher-resolution image can be introduced into V1 than onto the retina (or any other visual structure). In addition to this tremendous advantage in resolution, visual cortex is plastic at many different levels ranging from a very local ability to learn to better detect electrical stimulation to higher levels of learning that permit human observers to adapt to radical changes to their visual inputs. We argue that the combination of the large magnification factor and the impressive ability of the cerebral cortex to learn to recognize arbitrary patterns might outweigh the disadvantages of bypassing earlier processing stages and makes V1 a viable option for the restoration of vision. PMID:25449335
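
    The resolution argument is simple arithmetic: the linear density of stimulation sites per degree of visual field is the magnification factor divided by the electrode pitch. The 0.4 mm pitch below is a hypothetical array, not a figure from the paper:

```python
def sites_per_degree(mf_mm_per_deg, pitch_mm):
    # Linear density of stimulation sites across one degree of visual
    # field for a given cortical/retinal magnification factor (MF).
    return mf_mm_per_deg / pitch_mm

# Hypothetical array with 0.4 mm electrode pitch:
v1_foveal = sites_per_degree(15.0, 0.4)  # V1 fovea, MF ~ 15 mm/deg
retinal = sites_per_degree(0.3, 0.4)     # retina, MF ~ 0.3 mm/deg
```

    With these assumed numbers, the same array yields tens of sites per degree in foveal V1 but less than one site per degree on the retina, which is the abstract's central point.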

  13. Equally parsimonious pathways through an RNA sequence space are not equally likely

    NASA Technical Reports Server (NTRS)

    Lee, Y. H.; DSouza, L. M.; Fox, G. E.

    1997-01-01

    An experimental system for determining the potential ability of sequences resembling 5S ribosomal RNA (rRNA) to perform as functional 5S rRNAs in vivo in the Escherichia coli cellular environment was devised previously. Presumably, the only 5S rRNA sequences that would have been fixed by ancestral populations are ones that were functionally valid, and hence the actual historical paths taken through RNA sequence space during 5S rRNA evolution would have most likely utilized valid sequences. Herein, we examine the potential validity of all sequence intermediates along alternative equally parsimonious trajectories through RNA sequence space which connect two pairs of sequences that had previously been shown to behave as valid 5S rRNAs in E. coli. The first trajectory requires a total of four changes. The 14 sequence intermediates provide 24 apparently equally parsimonious paths by which the transition could occur. The second trajectory involves three changes, six intermediate sequences, and six potentially equally parsimonious paths. In total, only eight of the 20 sequence intermediates were found to be clearly invalid. As a consequence of the position of these invalid intermediates in the sequence space, seven of the 30 possible paths consisted of exclusively valid sequences. In several cases, the apparent validity/invalidity of the intermediate sequences could not be anticipated on the basis of current knowledge of the 5S rRNA structure. This suggests that the interdependencies in RNA sequence space may be more complex than currently appreciated. If ancestral sequences predicted by parsimony are to be regarded as actual historical sequences, then the present results would suggest that they should also satisfy a validity requirement and that, in at least limited cases, this conjecture can be tested experimentally.
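
    The path counts in the abstract follow from enumerating orderings of the required changes: k substitutions give k! equally parsimonious paths and 2^k - 2 intermediate sequences. A sketch of that enumeration (change labels are ours):

```python
from itertools import permutations

def parsimonious_paths(changes):
    # Each ordering of the required changes is one equally parsimonious
    # path through sequence space; each state along a path is the set of
    # changes applied so far.
    paths = []
    for order in permutations(changes):
        paths.append([frozenset(order[:i]) for i in range(len(order) + 1)])
    return paths
```

    Four changes yield 24 paths through 14 distinct intermediates; three changes yield 6 paths through 6 intermediates, matching the two trajectories studied.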

  14. Equity, Equal Shares or Equal Final Outcomes? Group Goal Guides Allocations of Public Goods

    PubMed Central

    Kazemi, Ali; Eek, Daniel; Gärling, Tommy

    2017-01-01

    In an experiment we investigate preferences for allocation of a public good among group members who contributed unequally in providing the public good. Inducing the group goal of productivity resulted in preferences for equitable allocations, whereas inducing the group goals of harmony and social concern resulted in preferences for equal final outcomes. The study makes a contribution by simultaneously treating provision and allocation of a public good, thus viewing these as related processes. Another contribution is that a new paradigm is introduced that bears closer resemblance to real life public good dilemmas than previous research paradigms do. PMID:28179890
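
    The three allocation rules contrasted in the experiment can be formalized simply. The "equal final outcomes" rule below assumes outcomes are shares net of contributions, which is one common reading rather than the paper's exact operationalization:

```python
def allocate(total, contributions, rule):
    n = len(contributions)
    if rule == "equity":        # shares proportional to contributions
        c = sum(contributions)
        return [total * x / c for x in contributions]
    if rule == "equal_shares":  # identical shares regardless of input
        return [total / n] * n
    if rule == "equal_final":   # equalize each member's share minus contribution
        surplus = (total - sum(contributions)) / n
        return [x + surplus for x in contributions]
    raise ValueError(rule)
```

    For a public good of 80 units provided by contributions of 10 and 30, the three rules give (20, 60), (40, 40), and (30, 50) respectively; only the last equalizes net outcomes.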

  15. Equalization and detection for digital communication over nonlinear bandlimited satellite communication channels. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Gutierrez, Alberto, Jr.

    1995-01-01

    This dissertation evaluates receiver-based methods for mitigating the effects of nonlinear bandlimited signal distortion present in high data rate satellite channels. The effects of the nonlinear bandlimited distortion are illustrated for digitally modulated signals. A lucid development of the low-pass Volterra discrete time model for a nonlinear communication channel is presented. In addition, finite-state machine models are explicitly developed for a nonlinear bandlimited satellite channel. A nonlinear fixed equalizer based on Volterra series has previously been studied for compensation of noiseless signal distortion due to a nonlinear satellite channel. This dissertation studies adaptive Volterra equalizers on a downlink-limited nonlinear bandlimited satellite channel. We employ as figures of merit performance in the mean-square-error and probability-of-error senses. In addition, a receiver consisting of a fractionally-spaced equalizer (FSE) followed by a Volterra equalizer (FSE-Volterra) is found to give improvement beyond that gained by the Volterra equalizer. Significant probability of error performance improvement is found for multilevel modulation schemes. Also, it is found that probability of error improvement is more significant for modulation schemes, constant-amplitude and multilevel, which require higher signal-to-noise ratios (i.e., higher modulation orders) for reliable operation. The maximum likelihood sequence detection (MLSD) receiver for a nonlinear satellite channel, a bank of matched filters followed by a Viterbi detector, serves as a probability of error lower bound for the Volterra and FSE-Volterra equalizers. However, this receiver has not been evaluated for a specific satellite channel. In this work, an MLSD receiver is evaluated for a specific downlink-limited satellite channel. Because of the bank of matched filters, the MLSD receiver may be high in complexity. Consequently, the probability of error performance of a more practical
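
    The adaptive Volterra idea can be sketched with a real-valued, second-order equalizer trained by LMS. This is a pedagogical toy (real signals, symbol-spaced taps, our variable names and step size), not the dissertation's fractionally spaced complex receiver:

```python
def lms_volterra_equalizer(x, d, memory=3, mu=0.05):
    """Second-order Volterra equalizer adapted by LMS.

    x: received (distorted) samples; d: known training symbols.
    Returns the final weights and the per-sample squared error.
    """
    def features(buf):
        # Linear taps plus all pairwise (quadratic) products of the taps.
        return list(buf) + [buf[i] * buf[j]
                            for i in range(len(buf))
                            for j in range(i, len(buf))]

    w = [0.0] * (memory + memory * (memory + 1) // 2)
    buf = [0.0] * memory
    sq_err = []
    for xi, di in zip(x, d):
        buf = [xi] + buf[:-1]  # shift register of received samples
        phi = features(buf)
        y = sum(wk * pk for wk, pk in zip(w, phi))
        e = di - y             # training error against the known symbol
        w = [wk + mu * e * pk for wk, pk in zip(w, phi)]
        sq_err.append(e * e)
    return w, sq_err
```

    Trained on symbols passed through a memoryless quadratic distortion, the squared error falls as the quadratic taps learn to cancel the nonlinearity, which is the mechanism the dissertation exploits on the satellite channel.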

  16. Analysis of and Techniques for Adaptive Equalization for Underwater Acoustic Communication

    DTIC Science & Technology

    2011-09-01

    [Acronym-glossary residue omitted.] ...rate was 2400 samples per second and the data packet was 60000 4-QAM modulated symbols. The results again confirm that the proposed method outperforms... Underwater acoustic communication is quickly becoming a necessity for applications in ocean science, defense, and homeland security. Acoustics remains the only practical means

  17. Smart Acoustic Network Using Combined Fsk-Psk, Adaptive Beamforming and Equalization

    DTIC Science & Technology

    2003-09-30

    [Figure 4 residue: High-Speed Data Transmission from a Morpheus UUV to the HPAL.] RESULTS: High-speed acoustic communication using FAU-HPAL: 1) Using this multiple-stage method, bit

  18. Gender equality and violent behavior: how neighborhood gender equality influences the gender gap in violence.

    PubMed

    Lei, Man-Kit; Simons, Ronald L; Simons, Leslie Gordon; Edmond, Mary Bond

    2014-01-01

    Using a sample of 703 African American adolescents from the Family and Community Health Study (FACHS) along with census data from the year 2000, we examine the association between neighborhood-level gender equality and violence. We find that boys' and girls' violent behavior is unevenly distributed across neighborhood contexts. In particular, gender differences in violent behavior are less pronounced in gender-equalitarian neighborhoods compared to those characterized by gender inequality. We also find that the gender gap narrows in gender-equalitarian neighborhoods because boys' rates of violence decrease whereas girls' rates remain relatively low across neighborhoods. This is in stark contrast to the pessimistic predictions of theorists who argue that the narrowing of the gender gap in equalitarian settings is the result of an increase in girls' violence. In addition, the relationship between neighborhood gender equality and violence is mediated by a specific articulation of masculinity characterized by toughness. Our results provide evidence for the use of gender-specific neighborhood prevention programs.

  19. Adaptive management

    USGS Publications Warehouse

    Allen, Craig R.; Garmestani, Ahjond S.

    2015-01-01

    Adaptive management is an approach to natural resource management that emphasizes learning through management where knowledge is incomplete, and when, despite inherent uncertainty, managers and policymakers must act. Unlike a traditional trial and error approach, adaptive management has explicit structure, including a careful elucidation of goals, identification of alternative management objectives and hypotheses of causation, and procedures for the collection of data followed by evaluation and reiteration. The process is iterative, and serves to reduce uncertainty, build knowledge and improve management over time in a goal-oriented and structured process.

  20. Method for solvent extraction with near-equal density solutions

    DOEpatents

    Birdwell, Joseph F.; Randolph, John D.; Singh, S. Paul

    2001-01-01

    Disclosed is a modified centrifugal contactor for separating solutions of near equal density. The modified contactor has a pressure differential establishing means that allows the application of a pressure differential across fluid in the rotor of the contactor. The pressure differential is such that it causes the boundary between solutions of near-equal density to shift, thereby facilitating separation of the phases. Also disclosed is a method of separating solutions of near-equal density.