Science.gov

Sample records for adaptive histogram equalization

  1. Osteoarthritis Classification Using Self Organizing Map Based on Gabor Kernel and Contrast-Limited Adaptive Histogram Equalization

    PubMed Central

    Anifah, Lilik; Purnama, I Ketut Eddy; Hariadi, Mochamad; Purnomo, Mauridhi Hery

    2013-01-01

    Localization is the first step in osteoarthritis (OA) classification. Manual classification, however, is time-consuming, tedious, and expensive. The proposed system is designed as a decision support system to help medical doctors classify the severity of knee OA. A method is proposed here to localize the joint space area and classify OA into KL-Grade 0, KL-Grade 1, KL-Grade 2, KL-Grade 3, and KL-Grade 4 in four steps: preprocessing, segmentation, feature extraction, and classification. In this system, right and left knee detection was performed using Contrast-Limited Adaptive Histogram Equalization (CLAHE) and template matching. The Gabor kernel, row sum graph, and moment methods were used to localize the joint space area of the knee. CLAHE is used in the preprocessing step, i.e., to normalize the varied intensities. The segmentation process was conducted using the Gabor kernel, template matching, row sum graph, and gray-level center of mass method. GLCM features (contrast, correlation, energy, and homogeneity) were employed as training data. Overall, 50 samples were used for training and 258 for testing. Experimental results showed the best performance using a Gabor kernel with parameters α=8, θ=0, Ψ=[0 π/2], γ=0.8, N=4, with 5000 iterations, momentum 0.5, and α0=0.6 for the classification process. The run gave classification accuracy rates of 93.8% for KL-Grade 0, 70% for KL-Grade 1, 4% for KL-Grade 2, 10% for KL-Grade 3, and 88.9% for KL-Grade 4. PMID:23525188
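
    The CLAHE normalization step described above can be approximated in a few lines. The sketch below is a simplified, single-tile variant (contrast-limited histogram equalization without the per-tile adaptivity): the histogram is clipped at a fraction of the pixel count and the excess is redistributed before building the mapping. The `clip_limit` value is illustrative, not taken from the paper.

```python
import numpy as np

def clipped_histogram_equalization(img, clip_limit=0.01):
    """Contrast-limited (global) histogram equalization.

    A simplified, single-tile cousin of CLAHE: the histogram is clipped
    at `clip_limit` (fraction of total pixels) and the clipped excess is
    redistributed uniformly across all bins before the CDF mapping is
    built, which limits over-amplification of dominant gray levels.
    """
    hist = np.bincount(img.ravel(), minlength=256)
    limit = max(1, int(clip_limit * img.size))
    excess = np.maximum(hist - limit, 0).sum()
    hist = np.minimum(hist, limit) + excess // 256   # clip + redistribute
    cdf = np.cumsum(hist).astype(np.float64)
    cdf /= cdf[-1]
    lut = np.round(255 * cdf).astype(np.uint8)       # mapping table
    return lut[img]
```

    Full CLAHE applies this per tile and bilinearly interpolates the mappings; libraries such as OpenCV provide that variant directly.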

  2. An innovative technique for contrast enhancement of computed tomography images using normalized gamma-corrected contrast-limited adaptive histogram equalization

    NASA Astrophysics Data System (ADS)

    Al-Ameen, Zohair; Sulong, Ghazali; Rehman, Amjad; Al-Dhelaan, Abdullah; Saba, Tanzila; Al-Rodhaan, Mznah

    2015-12-01

    Image contrast is an essential visual feature that determines whether an image is of good quality. In computed tomography (CT), captured images tend to be low contrast, which is a prevalent artifact that reduces image quality and hampers the extraction of useful information. A common way to address this artifact is with histogram-based techniques. However, although these techniques may improve the contrast for various grayscale imaging applications, the results are mostly unacceptable for CT images because of various flaws: noise amplification, excess brightness, and imperfect contrast. Therefore, an improved version of contrast-limited adaptive histogram equalization (CLAHE) is introduced in this article to provide good brightness with decent contrast for CT images. The modification adds an initial phase of a normalized gamma correction function that adjusts the gamma of the processed image, avoiding the excess brightness and imperfect contrast of basic CLAHE. The newly developed technique is tested on synthetic and real degraded low-contrast CT images, where it produces markedly better-quality results. Moreover, a low-complexity technique for contrast enhancement is proposed, and its performance is compared against various histogram-based enhancement techniques using three advanced image quality assessment metrics: the Universal Image Quality Index (UIQI), the Structural Similarity Index (SSIM), and the Feature Similarity Index (FSIM). The proposed technique provides acceptable results with no visible artifacts and outperforms all the compared techniques.
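
    A gamma pre-adjustment phase of the kind described can be sketched as below. The mean-based normalization heuristic (choosing gamma so the image mean maps toward mid-gray) is an assumption for illustration; the paper's exact normalization function may differ.

```python
import numpy as np

def normalized_gamma_correction(img, gamma=None):
    """Normalized gamma correction as a contrast pre-adjustment step.

    If `gamma` is not given, it is estimated from the image mean so that
    dark images are brightened and bright images are darkened: gamma is
    chosen so that the mean intensity maps toward 0.5 after correction.
    """
    x = img.astype(np.float64) / 255.0
    if gamma is None:
        mean = x.mean()
        gamma = np.log(0.5) / np.log(mean + 1e-12)  # mean**gamma == 0.5
    y = np.power(x, gamma)
    return np.round(255 * y).astype(np.uint8)
```

    In the paper's pipeline this step precedes CLAHE, so the equalization starts from an image whose brightness is already balanced.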

  3. Contrast enhancement via texture region based histogram equalization

    NASA Astrophysics Data System (ADS)

    Singh, Kuldeep; Vishwakarma, Dinesh K.; Singh Walia, Gurjit; Kapoor, Rajiv

    2016-08-01

    This paper presents two novel contrast enhancement approaches using texture-region-based histogram equalization (HE). In HE-based contrast enhancement methods, the enhanced image often contains undesirable artefacts because an excessive number of pixels in non-textured areas heavily bias the histogram. The novel idea presented in this paper is to suppress the impact of pixels in non-textured areas and to exploit texture features when computing the histogram for HE. The first algorithm, named Dominant Orientation-based Texture Histogram Equalization (DOTHE), constructs the histogram of the image using only those image patches having a dominant orientation. DOTHE categorizes image patches into smooth, dominant, or non-dominant orientation patches using the image variance and the singular value decomposition algorithm, and utilizes only dominant orientation patches in the HE process. The second method, termed Edge-based Texture Histogram Equalization, detects significant edges in the image and constructs the histogram from the grey levels in the neighbourhood of those edges. The cumulative distribution function of the histogram formed from texture features is mapped onto the entire dynamic range of the input image to produce the contrast-enhanced image. Subjective as well as objective performance assessments of the proposed methods are conducted and compared with other existing HE methods. The assessment in terms of visual quality, contrast improvement index, entropy, and measure of enhancement reveals that the proposed methods outperform the existing HE methods.
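
    The edge-based variant can be sketched as follows: a simple gradient magnitude (finite differences, an assumption standing in for the paper's edge detector) selects "texture" pixels, and only their gray levels drive the equalization mapping, so large flat regions no longer bias the CDF.

```python
import numpy as np

def edge_based_histogram_equalization(img, edge_thresh=20):
    """Equalize using a histogram built only from pixels near edges.

    Gradient magnitude selects texture pixels; their histogram drives
    the mapping applied to the whole image, which is the core idea of
    the edge-based texture HE method (the paper's edge detector and
    neighbourhood definition may differ).
    """
    g = img.astype(np.float64)
    gy, gx = np.gradient(g)
    mag = np.hypot(gx, gy)
    texture = img[mag > edge_thresh]
    if texture.size == 0:                 # degenerate case: no edges
        texture = img.ravel()
    hist = np.bincount(texture.astype(np.uint8).ravel(), minlength=256)
    cdf = np.cumsum(hist).astype(np.float64)
    cdf /= cdf[-1]
    lut = np.round(255 * cdf).astype(np.uint8)
    return lut[img]
```

    On an image that is mostly flat with one step edge, the mapping is determined by the edge neighbourhood alone rather than by the dominant flat region.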

  4. Contrast Enhancement Algorithm Based on Gap Adjustment for Histogram Equalization

    PubMed Central

    Chiu, Chung-Cheng; Ting, Chih-Chung

    2016-01-01

    Image enhancement methods have been widely used to improve the visual effects of images. Owing to its simplicity and effectiveness, histogram equalization (HE) is one of the most common methods for enhancing image contrast. However, HE can cause over-enhancement and feature loss, which lead to an unnatural look and loss of detail in the processed images. Researchers have proposed various HE-based methods to solve the over-enhancement problem; however, they have largely ignored the feature loss problem. Therefore, a contrast enhancement algorithm based on gap adjustment for histogram equalization (CegaHE) is proposed. It builds on a visual contrast enhancement algorithm based on histogram equalization (VCEA), which generates visually pleasing enhanced images, and improves on VCEA's enhancement effects. CegaHE adjusts the gaps between two gray values based on an adjustment equation that takes the properties of human visual perception into consideration, to solve the over-enhancement problem. In addition, it alleviates the feature loss problem and further enhances the textures in the dark regions of the images to improve quality for human visual perception. Experimental results demonstrate that CegaHE is a reliable method for contrast enhancement and that it significantly outperforms VCEA and other methods. PMID:27338412
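
    The gap-adjustment idea can be illustrated with a crude cap on the mapping gaps. Standard HE can push two adjacent input levels far apart in the output (over-enhancement); the sketch below clips each gap in the HE mapping to `max_gap`. The hard cap is an illustrative stand-in for CegaHE's perception-based adjustment equation, and no rescaling back to the full range is done here.

```python
import numpy as np

def gap_limited_equalization(img, max_gap=8):
    """HE with a cap on the gap between consecutive output gray levels.

    Build the ordinary HE mapping, measure the per-level gaps it
    introduces, clip each gap to `max_gap`, and rebuild the mapping by
    cumulative summation. The output may not span the full range.
    """
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = np.cumsum(hist).astype(np.float64) / img.size
    he_lut = 255.0 * cdf                                 # plain HE mapping
    gaps = np.diff(np.concatenate(([0.0], he_lut)))      # per-level gaps
    gaps = np.minimum(gaps, max_gap)                     # limit over-enhancement
    lut = np.minimum(np.round(np.cumsum(gaps)), 255).astype(np.uint8)
    return lut[img]
```

    For a two-level image, plain HE would separate the levels by ~127 gray values; the capped mapping keeps them `max_gap` apart.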

  5. The L_infinity constrained global optimal histogram equalization technique for real time imaging

    NASA Astrophysics Data System (ADS)

    Ren, Qiongwei; Niu, Yi; Liu, Lin; Jiao, Yang; Shi, Guangming

    2015-08-01

    Although current imaging sensors can achieve 12-bit or higher precision, current display devices and the commonly used digital image formats are still only 8-bit. This mismatch causes a significant waste of sensor precision and a loss of information when storing and displaying images. For better use of the precision budget, tone mapping operators have to be used to map the high-precision data into low-precision digital images adaptively. In this paper, the classic histogram equalization tone mapping operator is reexamined from an optimization perspective. We point out that the traditional histogram equalization technique and its variants are fundamentally limited by local-optimum problems. To overcome this drawback, we remodel the histogram equalization tone mapping task based on graph theory, which achieves globally optimal solutions. Another advantage of the graph-based modeling is that tone continuity is also modeled as a vital constraint in our approach, which suppresses the annoying boundary artifacts of traditional approaches. In addition, we propose a novel dynamic programming technique to solve the histogram equalization problem in real time. Experimental results show that the proposed tone-preserving globally optimal histogram equalization technique outperforms traditional approaches by exhibiting more subtle details in the foreground while preserving the smoothness of the background.
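
    The baseline operator the paper reexamines, HE-based tone mapping from 12-bit sensor data to an 8-bit display range, is straightforward to sketch (the paper's graph-based, tone-continuous global optimization is not reproduced here):

```python
import numpy as np

def tone_map_12bit_to_8bit(img12):
    """Histogram-equalization tone mapping from 12-bit to 8-bit.

    Build the CDF over the 4096-level input and map it onto the
    256-level output range, allocating output levels to densely
    populated input regions.
    """
    hist = np.bincount(img12.ravel(), minlength=4096)
    cdf = np.cumsum(hist).astype(np.float64)
    cdf /= cdf[-1]
    lut = np.round(255 * cdf).astype(np.uint8)
    return lut[img12]
```

    The mapping is monotone by construction, but, as the paper argues, greedy per-bin allocation of the 256 output levels is not globally optimal and can break tone continuity at bin boundaries.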

  6. Visual Contrast Enhancement Algorithm Based on Histogram Equalization

    PubMed Central

    Ting, Chih-Chung; Wu, Bing-Fei; Chung, Meng-Liang; Chiu, Chung-Cheng; Wu, Ya-Ching

    2015-01-01

    Image enhancement techniques primarily improve the contrast of an image to lend it a better appearance. One of the popular enhancement methods is histogram equalization (HE) because of its simplicity and effectiveness. However, it is rarely applied to consumer electronics products because it can cause excessive contrast enhancement and feature loss problems. These problems make the images processed by HE look unnatural and introduce unwanted artifacts in them. In this study, a visual contrast enhancement algorithm (VCEA) based on HE is proposed. VCEA considers the requirements of the human visual perception in order to address the drawbacks of HE. It effectively solves the excessive contrast enhancement problem by adjusting the spaces between two adjacent gray values of the HE histogram. In addition, VCEA reduces the effects of the feature loss problem by using the obtained spaces. Furthermore, VCEA enhances the detailed textures of an image to generate an enhanced image with better visual quality. Experimental results show that images obtained by applying VCEA have higher contrast and are more suited to human visual perception than those processed by HE and other HE-based methods. PMID:26184219

  7. Infrared image enhancement based on atmospheric scattering model and histogram equalization

    NASA Astrophysics Data System (ADS)

    Li, Yi; Zhang, Yunfeng; Geng, Aihui; Cao, Lihua; Chen, Juan

    2016-09-01

    Infrared images are fuzzy due to the special imaging technology of infrared sensors. To achieve contrast enhancement and obtain clear edge details from a fuzzy infrared image, we propose an efficient enhancement method based on an atmospheric scattering model and histogram equalization. The algorithm optimizes and improves the visual-image haze removal method to suit the characteristics of fuzzy infrared images. First, an average filtering operation is applied to obtain a coarse estimate of the transmission rate. We then obtain the fuzz-free image through a self-adaptive transmission rate calculated from the statistics of the original infrared image. Finally, to deal with the low lighting of the fuzz-free image, we propose a sectional plateau histogram equalization method that is capable of background suppression. Experimental results show that the performance and efficiency of the proposed algorithm are satisfactory compared with four other algorithms, in both subjective observation and objective quantitative evaluation. In addition, the proposed algorithm is capable of enhancing infrared images for different applications under different circumstances.
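
    Plateau histogram equalization, the final step named above, is a common infrared enhancement primitive and can be sketched as follows. The median-of-nonzero-bins plateau default is one common heuristic, not the paper's sectional variant.

```python
import numpy as np

def plateau_histogram_equalization(img, plateau=None):
    """Plateau histogram equalization for background suppression.

    The histogram is clipped at a plateau value before the CDF is
    built, so a dominant background peak (typical in infrared imagery)
    no longer consumes most of the output dynamic range.
    """
    hist = np.bincount(img.ravel(), minlength=256).astype(np.float64)
    if plateau is None:
        nz = hist[hist > 0]
        plateau = np.median(nz)          # heuristic plateau choice
    hist = np.minimum(hist, plateau)     # suppress the background peak
    cdf = np.cumsum(hist)
    cdf /= cdf[-1]
    lut = np.round(255 * cdf).astype(np.uint8)
    return lut[img]
```

    With plain HE, a 98%-background image would map the background near white; the plateau keeps it dark while spreading the sparse target levels across the range.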

  8. Image enhancement circuit using nonlinear processing curve and constrained histogram range equalization

    NASA Astrophysics Data System (ADS)

    Cvetkovic, Sascha D.; de With, Peter H. N.

    2004-01-01

    For real-time imaging in surveillance applications, image fidelity is of primary importance to ensure customer confidence. The obtained image fidelity results from, among other factors, dynamic range expansion and video signal enhancement. The dynamic range of the signal needs adaptation because the sensor signal has a much larger range than a standard CRT display. The signal enhancement should accommodate the widely varying light and scene conditions and user scenarios of the equipment. This paper proposes a new system that combines dynamic range and enhancement processing, offering strongly improved picture quality for surveillance applications. The key to our solution is Non-Linear Processing (NLP) with a so-called Constrained Histogram Range Equalization (CHRE). The NLP transforms the digitized high-dynamic-range luminance sensor signal such that details in the low-luminance parts are enhanced while avoiding detail losses in the high-luminance areas. The CHRE technique enhances the visibility of global contrast in the camera signal without significant information loss in the statistically less relevant areas. Evaluations of this proposal have shown clear improvements in perceptual image quality. An additional advantage is that the new scheme is adaptable and allows the concatenation of further enhancement techniques without sacrificing the obtained picture quality improvement.

  9. Adaptive removal of show-through artifacts by histogram analysis

    NASA Astrophysics Data System (ADS)

    Hong, Jin-Kyung; Kang, Ki-Min; Kim, Sang-Ho

    2010-01-01

    When scanning a document that is printed on both sides, the image on the reverse can show through with high luminance. We propose an adaptive method of removing show-through artifacts based on histogram analysis. Instead of attempting to measure the physical parameters of the paper and the scanning system, or making multiple scans, we analyze the color distribution to remove unwanted artifacts, using an image of the front of the document alone. First, we accumulate histogram information to find the lightness distribution of pixels in the scanned image. Using this data, we set thresholds on both luminance and chrominance to determine candidate regions of show-through. Finally, we classify these regions into foreground and background of the image on the front of the paper, and show-through from the back. The background and show-through regions become candidates for erasure, and they are adaptively updated as the process proceeds. This approach preserves the chrominance of the image on the front of the paper without introducing artifacts. It does not make the whole image brighter, which is what happens when a fixed threshold is used to remove show-through.
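
    A grayscale-only sketch of the histogram-driven idea: estimate the paper white as the dominant bright histogram peak, then lift light-gray show-through candidates to that white while leaving darker foreground ink untouched. The fixed `band` width and grayscale restriction are simplifications; the paper also uses chrominance thresholds and adaptive region classification.

```python
import numpy as np

def suppress_show_through(gray, band=40):
    """Histogram-driven show-through suppression (grayscale sketch).

    Paper white is estimated as the strongest histogram peak in the
    bright half of the range; pixels within `band` levels below it
    (light show-through candidates) are lifted to the paper white.
    """
    hist = np.bincount(gray.ravel(), minlength=256)
    white = int(np.argmax(hist[128:]) + 128)     # dominant bright level
    out = gray.copy()
    mask = (gray >= white - band) & (gray < white)
    out[mask] = white                            # erase show-through
    return out
```

    Because the white point is estimated per image rather than fixed, the whole image is not brightened, matching the behavior the abstract contrasts with fixed thresholding.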

  10. Improving the convergence rate in affine registration of PET and SPECT brain images using histogram equalization.

    PubMed

    Salas-Gonzalez, D; Górriz, J M; Ramírez, J; Padilla, P; Illán, I A

    2013-01-01

    A procedure to improve the convergence rate of affine registration methods for medical brain images is presented for cases when the images differ greatly from the template. The methodology is based on histogram matching of the source images with respect to the reference brain template before proceeding with the affine registration. The preprocessed source brain images are spatially normalized to a template using a general affine model with 12 parameters. A sum of squared differences between the source images and the template is used as the objective function, and a Gauss-Newton optimization algorithm is used to find the minimum of the cost function. Using histogram equalization as a preprocessing step improves the convergence rate of the affine registration algorithm, as we show in this work using SPECT and PET brain images.
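
    The histogram-matching preprocessing step can be sketched with the standard CDF-mapping construction (source CDF composed with the inverse template CDF); this is a generic implementation, not the authors' code:

```python
import numpy as np

def match_histograms(source, template):
    """Match the gray-level histogram of `source` to `template`.

    Each source intensity is mapped through the source CDF and then the
    inverse of the template CDF, so the two images share a similar
    intensity distribution before the affine registration is run.
    """
    s_vals, s_counts = np.unique(source.ravel(), return_counts=True)
    t_vals, t_counts = np.unique(template.ravel(), return_counts=True)
    s_cdf = np.cumsum(s_counts) / source.size
    t_cdf = np.cumsum(t_counts) / template.size
    mapped = np.interp(s_cdf, t_cdf, t_vals)      # inverse template CDF
    idx = np.searchsorted(s_vals, source.ravel())
    return mapped[idx].reshape(source.shape)
```

    After this step the intensity distributions agree, so the sum-of-squared-differences cost is better conditioned for the Gauss-Newton fit of the 12-parameter affine model.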

  11. Efficient visibility-driven medical image visualisation via adaptive binned visibility histogram.

    PubMed

    Jung, Younhyun; Kim, Jinman; Kumar, Ashnil; Feng, David Dagan; Fulham, Michael

    2016-07-01

    'Visibility' is a fundamental optical property that represents the proportion of voxels in a volume that is observable to users during interactive volume rendering. Manipulating this 'visibility' improves the volume rendering process, for instance by ensuring the visibility of regions of interest (ROIs) or by guiding the identification of an optimal rendering viewpoint. The construction of visibility histograms (VHs), which represent the distribution of visibility across all voxels in the rendered volume, enables users to explore the volume with real-time feedback about occlusion patterns among spatially related structures during volume rendering manipulations. Volume-rendered medical images have been a primary beneficiary of VHs, given the need to ensure that specific ROIs are visible relative to surrounding structures, e.g. the visualisation of tumours that may otherwise be occluded by neighbouring structures. VH construction and its subsequent manipulations, however, are computationally expensive due to the histogram binning of the visibilities. This limits the real-time application of VHs to medical images that have large intensity ranges and volume dimensions and require a large number of histogram bins. In this study, we introduce an efficient adaptive binned visibility histogram (AB-VH) in which a smaller number of histogram bins is used to represent the visibility distribution of the full VH. We adaptively bin medical images by using a cluster analysis algorithm that groups the voxels according to their intensity similarities into a smaller subset of bins while preserving the distribution of the intensity range of the original images. We increase efficiency by exploiting the parallel computation and multiple render targets (MRT) extension of modern graphical processing units (GPUs), which enables efficient computation of the histogram. We show the application of our method to single-modality computed tomography (CT), magnetic resonance
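
    The adaptive binning step can be sketched with a plain 1-D k-means over voxel intensities (the paper runs its clustering on the GPU with MRT; any clustering that preserves the intensity distribution serves the same purpose). The quantile-based initialization is an assumption for illustration.

```python
import numpy as np

def adaptive_bins(intensities, n_bins=8, n_iter=20):
    """Cluster intensities into a small set of adaptive histogram bins.

    Voxels with similar intensities share a bin, so far fewer bins
    cover the full dynamic range than a uniform binning would need.
    Returns the bin centers and the per-voxel bin labels.
    """
    x = intensities.astype(np.float64).ravel()
    # initialize centers at evenly spaced quantiles of the data
    centers = np.quantile(x, np.linspace(0, 1, n_bins))
    for _ in range(n_iter):
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        for k in range(n_bins):
            if np.any(labels == k):
                centers[k] = x[labels == k].mean()
    return centers, labels
```

    A volume whose intensities cluster around a few tissue classes then needs only one bin per class rather than one bin per intensity level.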

  12. A fracture enhancement method based on the histogram equalization of eigenstructure-based coherence

    NASA Astrophysics Data System (ADS)

    Dou, Xi-Ying; Han, Li-Guo; Wang, En-Li; Dong, Xue-Hua; Yang, Qing; Yan, Gao-Han

    2014-06-01

    Eigenstructure-based coherence attributes are efficient and mature techniques for large-scale fracture detection. However, in horizontally bedded and continuous strata, buried fractures in high grayscale value zones are difficult to detect. Furthermore, middle- and small-scale fractures in fractured zones, where migration image energies are usually not concentrated perfectly, are also hard to detect because of the fuzzy, clouded shadows owing to low grayscale values. A new fracture enhancement method combined with histogram equalization is proposed to solve these problems. With this method, the contrast between discontinuities and background in coherence images is increased, linear structures are highlighted by stepwise adjustment of the threshold of the coherence image, and fractures are detected at different scales. Application of the method shows that it can also improve fracture recognition and detection accuracy.

  13. Adaptive sigmoid function bihistogram equalization for image contrast enhancement

    NASA Astrophysics Data System (ADS)

    Arriaga-Garcia, Edgar F.; Sanchez-Yanez, Raul E.; Ruiz-Pinales, Jose; Garcia-Hernandez, Ma. de Guadalupe

    2015-09-01

    Contrast enhancement plays a key role in a wide range of applications including consumer electronic applications, such as video surveillance, digital cameras, and televisions. The main goal of contrast enhancement is to increase the quality of images. However, most state-of-the-art methods induce different types of distortion such as intensity shift, wash-out, noise, intensity burn-out, and intensity saturation. In addition, in consumer electronics, simple and fast methods are required in order to be implemented in real time. A bihistogram equalization method based on adaptive sigmoid functions is proposed. It consists of splitting the image histogram into two parts that are equalized independently by using adaptive sigmoid functions. In order to preserve the mean brightness of the input image, the parameter of the sigmoid functions is chosen to minimize the absolute mean brightness metric. Experiments on the Berkeley database have shown that the proposed method improves the quality of images and preserves their mean brightness. An application to improve the colorfulness of images is also presented.
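
    The split-and-equalize skeleton underlying bi-histogram methods can be sketched as below (BBHE-style, splitting at the mean). The paper's contribution, replacing the per-half CDF mapping with adaptive sigmoid transfer functions whose parameter minimizes an absolute mean brightness metric, is omitted; this shows only the bi-histogram structure.

```python
import numpy as np

def bi_histogram_equalization(img):
    """Bi-histogram equalization (BBHE-style skeleton).

    The histogram is split at the input mean and each half is equalized
    into its own sub-range ([0, m] and [m+1, 255]), which tends to keep
    the output brightness closer to the input than plain HE does.
    """
    m = int(img.mean())
    hist = np.bincount(img.ravel(), minlength=256).astype(np.float64)
    lo, hi = hist[:m + 1], hist[m + 1:]
    lut = np.zeros(256, dtype=np.uint8)
    if lo.sum() > 0:
        lut[:m + 1] = np.round(m * np.cumsum(lo) / lo.sum())
    if hi.sum() > 0:
        lut[m + 1:] = m + 1 + np.round((254 - m) * np.cumsum(hi) / hi.sum())
    return lut[img]
```

    Because each half stays within its own sub-range, dark pixels cannot be mapped above the mean and bright pixels cannot fall below it, which is the brightness-preservation mechanism the sigmoid variant refines.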

  14. Differentially Private Histogram Publication For Dynamic Datasets: An Adaptive Sampling Approach

    PubMed Central

    Li, Haoran; Jiang, Xiaoqian; Xiong, Li; Liu, Jinfei

    2016-01-01

    Differential privacy has recently become a de facto standard for private statistical data release. Many algorithms have been proposed to generate differentially private histograms or synthetic data. However, most of them focus on “one-time” release of a static dataset and do not adequately address the increasing need of releasing series of dynamic datasets in real time. A straightforward application of existing histogram methods on each snapshot of such dynamic datasets will incur high accumulated error due to the composability of differential privacy and correlations or overlapping users between the snapshots. In this paper, we address the problem of releasing series of dynamic datasets in real time with differential privacy, using a novel adaptive distance-based sampling approach. Our first method, DSFT, uses a fixed distance threshold and releases a differentially private histogram only when the current snapshot is sufficiently different from the previous one, i.e., with a distance greater than a predefined threshold. Our second method, DSAT, further improves DSFT and uses a dynamic threshold adaptively adjusted by a feedback control mechanism to capture the data dynamics. Extensive experiments on real and synthetic datasets demonstrate that our approach achieves better utility than baseline methods and existing state-of-the-art methods. PMID:26973795
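
    The DSFT release rule can be sketched as below. The L1 distance, Laplace scale, and threshold values are illustrative; the real method also accounts for the privacy cost of the distance test itself, which this sketch simplifies away.

```python
import numpy as np

def dsft_release(snapshots, epsilon=1.0, threshold=50.0, rng=None):
    """Distance-based sampling with a fixed threshold (DSFT sketch).

    For each snapshot, a fresh Laplace-noised histogram is published
    only when the L1 distance to the last sampled snapshot exceeds
    `threshold`; otherwise the previous release is republished, which
    consumes no additional privacy budget.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    releases, last_data, last_release = [], None, None
    for h in snapshots:
        h = np.asarray(h, dtype=np.float64)
        if last_data is None or np.abs(h - last_data).sum() > threshold:
            # sufficiently different: spend budget on a new release
            last_release = h + rng.laplace(scale=1.0 / epsilon, size=h.shape)
            last_data = h
        releases.append(last_release)
    return releases
```

    Skipping near-duplicate snapshots is what avoids the accumulated error of noising every snapshot independently.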

  15. Adaptive local backlight dimming algorithm based on local histogram and image characteristics

    NASA Astrophysics Data System (ADS)

    Nadernejad, Ehsan; Burini, Nino; Korhonen, Jari; Forchhammer, Søren; Mantel, Claire

    2013-02-01

    Liquid crystal displays (LCDs) with light-emitting diode (LED) backlights are a very popular display technology, used for instance in television sets, monitors, and mobile phones. This paper presents a new backlight dimming algorithm that exploits characteristics of the target image, such as the local histograms and the average pixel intensity of each backlight segment, to reduce the power consumption of the backlight and enhance image quality. The local histogram of the pixels within each backlight segment is calculated and, based on it, an adaptive quantile value is extracted. A classification into three classes based on the average luminance value is performed and, depending on the image luminance class, the extracted information from the local histogram determines the corresponding backlight value. The proposed method has been applied to two modeled screens: one with a high-resolution direct-lit backlight, and one with 16 edge-lit backlight segments arranged in two columns and eight rows. We have compared the proposed algorithm against several known backlight dimming algorithms through simulations, and the results show that it provides a better trade-off between power consumption and image quality preservation than the other state-of-the-art feature-based backlight algorithms.

  16. Wavelength-adaptive dehazing using histogram merging-based classification for UAV images.

    PubMed

    Yoon, Inhye; Jeong, Seokhwa; Jeong, Jaeheon; Seo, Doochun; Paik, Joonki

    2015-03-19

    Since incoming light to an unmanned aerial vehicle (UAV) platform can be scattered by haze and dust in the atmosphere, the acquired image loses the original color and brightness of the subject. Enhancement of hazy images is an important task in improving the visibility of various UAV images. This paper presents a spatially-adaptive dehazing algorithm that merges color histograms with consideration of the wavelength-dependent atmospheric turbidity. Based on the wavelength-adaptive hazy image acquisition model, the proposed dehazing algorithm consists of three steps: (i) image segmentation based on geometric classes; (ii) generation of the context-adaptive transmission map; and (iii) intensity transformation for enhancing a hazy UAV image. The major contribution of the research is a novel hazy UAV image degradation model by considering the wavelength of light sources. In addition, the proposed transmission map provides a theoretical basis to differentiate visually important regions from others based on the turbidity and merged classification results.

  17. Wavelength-Adaptive Dehazing Using Histogram Merging-Based Classification for UAV Images

    PubMed Central

    Yoon, Inhye; Jeong, Seokhwa; Jeong, Jaeheon; Seo, Doochun; Paik, Joonki

    2015-01-01

    Since incoming light to an unmanned aerial vehicle (UAV) platform can be scattered by haze and dust in the atmosphere, the acquired image loses the original color and brightness of the subject. Enhancement of hazy images is an important task in improving the visibility of various UAV images. This paper presents a spatially-adaptive dehazing algorithm that merges color histograms with consideration of the wavelength-dependent atmospheric turbidity. Based on the wavelength-adaptive hazy image acquisition model, the proposed dehazing algorithm consists of three steps: (i) image segmentation based on geometric classes; (ii) generation of the context-adaptive transmission map; and (iii) intensity transformation for enhancing a hazy UAV image. The major contribution of the research is a novel hazy UAV image degradation model by considering the wavelength of light sources. In addition, the proposed transmission map provides a theoretical basis to differentiate visually important regions from others based on the turbidity and merged classification results. PMID:25808767

  19. Hand contour detection in wearable camera video using an adaptive histogram region of interest

    PubMed Central

    2013-01-01

    Background Monitoring hand function at home is needed to better evaluate the effectiveness of rehabilitation interventions. Our objective is to develop wearable computer vision systems for hand function monitoring. The specific aim of this study is to develop an algorithm that can identify hand contours in video from a wearable camera that records the user’s point of view, without the need for markers. Methods The two-step image processing approach for each frame consists of: (1) Detecting a hand in the image, and choosing one seed point that lies within the hand. This step is based on a priori models of skin colour. (2) Identifying the contour of the region containing the seed point. This is accomplished by adaptively determining, for each frame, the region within a colour histogram that corresponds to hand colours, and backprojecting the image using the reduced histogram. Results In four test videos relevant to activities of daily living, the hand detector classification accuracy was 88.3%. The contour detection results were compared to manually traced contours in 97 test frames, and the median F-score was 0.86. Conclusion This algorithm will form the basis for a wearable computer-vision system that can monitor and log the interactions of the hand with its environment. PMID:24354542
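
    The adaptive-histogram backprojection step (step 2 of the Methods) can be sketched on a hue channel. The window size, bin count, and 0-180 hue range are illustrative assumptions; the study adaptively reduces the histogram to hand-like colours per frame.

```python
import numpy as np

def histogram_backprojection(hue, seed, half=2, n_bins=32):
    """Backproject a seed-region hue histogram over the whole frame.

    A small window around the seed point supplies a histogram of
    "hand-like" hues; every pixel is then scored by its bin's value,
    giving a likelihood map whose thresholded blob yields the hand
    contour.
    """
    r, c = seed
    patch = hue[max(0, r - half):r + half + 1,
                max(0, c - half):c + half + 1]
    hist, edges = np.histogram(patch, bins=n_bins, range=(0, 180),
                               density=True)
    bins = np.clip(np.digitize(hue, edges) - 1, 0, n_bins - 1)
    score = hist[bins]                       # per-pixel hand likelihood
    return score / score.max() if score.max() > 0 else score
```

    Because the histogram is rebuilt from the seed region every frame, the method adapts to lighting changes without a fixed skin-colour model.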

  20. Medical image classification using spatial adjacent histogram based on adaptive local binary patterns.

    PubMed

    Liu, Dong; Wang, Shengsheng; Huang, Dezhi; Deng, Gang; Zeng, Fantao; Chen, Huiling

    2016-05-01

    Medical image recognition is an important task in both computer vision and computational biology. In the field of medical image classification, representing an image based on local binary patterns (LBP) descriptor has become popular. However, most existing LBP-based methods encode the binary patterns in a fixed neighborhood radius and ignore the spatial relationships among local patterns. The ignoring of the spatial relationships in the LBP will cause a poor performance in the process of capturing discriminative features for complex samples, such as medical images obtained by microscope. To address this problem, in this paper we propose a novel method to improve local binary patterns by assigning an adaptive neighborhood radius for each pixel. Based on these adaptive local binary patterns, we further propose a spatial adjacent histogram strategy to encode the micro-structures for image representation. An extensive set of evaluations are performed on four medical datasets which show that the proposed method significantly improves standard LBP and compares favorably with several other prevailing approaches. PMID:27058283
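
    The fixed-radius baseline that the paper generalizes can be sketched as below: each pixel's 8-bit LBP code encodes which neighbours are at least as bright as the centre, and the image is represented by the histogram of codes. The paper's contributions (an adaptive radius per pixel and the spatially adjacent histogram) are omitted for brevity.

```python
import numpy as np

def lbp_histogram(img):
    """Basic 8-neighbour local binary patterns plus their histogram.

    For each interior pixel, set bit k when the k-th neighbour is
    brighter than or equal to the centre; the 256-bin histogram of
    these codes is the texture descriptor.
    """
    g = img.astype(np.int32)
    c = g[1:-1, 1:-1]                       # interior centres
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(c)
    for bit, (dr, dc) in enumerate(offsets):
        nb = g[1 + dr:g.shape[0] - 1 + dr, 1 + dc:g.shape[1] - 1 + dc]
        codes |= (nb >= c).astype(np.int32) << bit
    hist = np.bincount(codes.ravel(), minlength=256)
    return codes, hist
```

    An adaptive-radius variant would choose the neighbour distance per pixel (e.g. from local variance) before thresholding, which is the modification the paper proposes.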

  2. Adapting histogram for automatic noise data removal in building interior point cloud data

    NASA Astrophysics Data System (ADS)

    Shukor, S. A. Abdul; Rushforth, E. J.

    2015-05-01

    3D point cloud data is now preferred by researchers for generating 3D models, which are used in a variety of applications including 3D building interior models. The rise of Building Information Modeling (BIM) for Architectural, Engineering, and Construction (AEC) applications has recently brought more attention to 3D interior modelling. To generate a 3D model of a building interior, a laser scanner is used to collect the point cloud data. However, this data often comes with noise, caused by factors including surrounding objects, lighting, and the specifications of the laser scanner. This paper highlights the use of histograms to remove the noise data. Histograms, a staple of statistics and probability, are used routinely in applications like image processing, where a histogram can represent the number of pixels in an image at each intensity level. Here, histograms represent the number of points recorded within range-distance intervals in various projections. Because unwanted noise has a sparser cloud density than the required data and is usually situated at a notable distance from it, noise occupies the lower frequencies of the histogram. By defining an acceptable range using the average frequency, points below this range can be removed. This research has shown that such histograms can automatically remove unwanted data from 3D point clouds representing building interiors, aiding the data preprocessing stage in producing an ideal 3D model from the point cloud data.
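
    The average-frequency thresholding described above can be sketched in a few lines; the bin count and the single projection axis are illustrative assumptions, not parameters from the paper:

```python
import numpy as np

def remove_sparse_points(points, axis=0, n_bins=50):
    """Drop points falling in histogram bins whose frequency is below
    the average bin frequency along one projection axis.

    `points` is an (N, 3) array; sparse, distant clusters (noise) land
    in low-frequency bins and are removed.
    """
    coords = points[:, axis]
    counts, edges = np.histogram(coords, bins=n_bins)
    threshold = counts.mean()
    # Map each point to its bin, keep points in dense-enough bins
    bin_idx = np.clip(np.digitize(coords, edges) - 1, 0, n_bins - 1)
    keep = counts[bin_idx] >= threshold
    return points[keep]

# Dense "wall" of points plus a distant sparse noise cluster
rng = np.random.default_rng(0)
wall = rng.uniform(0.0, 1.0, size=(1000, 3))
noise = rng.uniform(9.0, 10.0, size=(5, 3))
cloud = np.vstack([wall, noise])
cleaned = remove_sparse_points(cloud, axis=0)
print(len(cleaned))  # → 1000: the 5 noise points are removed
```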

  3. Blind adaptive equalization of polarization-switched QPSK modulation.

    PubMed

    Millar, David S; Savory, Seb J

    2011-04-25

    Coherent detection in combination with digital signal processing has recently enabled significant progress in the capacity of optical communications systems. This improvement has enabled detection of optimum constellations for optical signals in four dimensions. In this paper, we propose and investigate an algorithm for the blind adaptive equalization of one such modulation format: polarization-switched quaternary phase shift keying (PS-QPSK). The proposed algorithm, which includes both blind initialization and adaptation of the equalizer, is found to be insensitive to the input polarization state and demonstrates highly robust convergence in the presence of PDL, DGD and polarization rotation.

  4. EZ-ROSE: a computer program for equal-area circular histograms and statistical analysis of two-dimensional vectorial data

    NASA Astrophysics Data System (ADS)

    Baas, Jaco H.

    2000-03-01

    EZ-ROSE 1.0 is a computer program for the statistical analysis of populations of two-dimensional vectorial data and their presentation in equal-area rose diagrams. The program is implemented as a Microsoft® Excel workbook containing worksheets for the input of directional (circular) or lineational (semi-circular) data and their automatic processing, which includes the calculation of a frequency distribution for a selected class width, statistical analysis, and the construction of a rose diagram in CorelDraw™. The statistical analysis involves tests of uniformity for the vectorial population distribution, such as the nonparametric Kuiper and Watson tests and the parametric Rayleigh test. The statistics calculated include the vector mean, its magnitude (length) and strength (data concentration); the Batschelet circular standard deviation as an alternative measure of vectorial concentration; and a confidence sector for the vector mean. The statistics together with the frequency data are used to prepare a Corel Script™ file that contains all the necessary instructions to draw automatically an equal-area circular frequency histogram (rose diagram) in CorelDraw™. The advantages of EZ-ROSE, compared to other software for circular statistics, are: (1) the ability to use an equal-area scale in rose diagrams; (2) the wide range of tools for a comprehensive statistical analysis; (3) the ease of use, as Microsoft® Excel and CorelDraw™ are widely known to users of Microsoft® Windows; and (4) the high degree of flexibility due to the application of Microsoft® Excel and CorelDraw™, which offer a whole range of tools for possible addition of other statistical methods and changes of the rose-diagram layout.
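
    EZ-ROSE itself is an Excel/CorelDraw workbook; as a rough illustration of the statistics it computes, here is a Python sketch of the vector mean, mean resultant length, and Rayleigh uniformity test, using Zar's large-sample p-value approximation (the approximation choice is ours, not the program's):

```python
import numpy as np

def circular_stats(angles_deg):
    """Vector mean direction, mean resultant length R, and Rayleigh-test
    p-value for a sample of directional data given in degrees.

    A small p-value rejects uniformity, i.e. the directions cluster
    around a preferred orientation.
    """
    a = np.radians(np.asarray(angles_deg, dtype=float))
    n = a.size
    C, S = np.cos(a).sum(), np.sin(a).sum()
    Rn = np.hypot(C, S)                     # resultant length
    R = Rn / n                              # mean resultant length in [0, 1]
    mean_dir = np.degrees(np.arctan2(S, C)) % 360.0
    # Zar's large-sample approximation of the Rayleigh p-value
    p = np.exp(np.sqrt(1 + 4 * n + 4 * (n**2 - Rn**2)) - (1 + 2 * n))
    return mean_dir, R, p

# Strongly concentrated paleocurrent-style sample around 90 degrees
mean_dir, R, p = circular_stats([80, 85, 90, 95, 100, 88, 92])
print(round(mean_dir, 1), p < 0.01)  # → 90.0 True
```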

  5. Low complexity adaptive equalizers for underwater acoustic communications

    NASA Astrophysics Data System (ADS)

    Soflaei, Masoumeh; Azmi, Paeiz

    2014-08-01

    Interference caused by scattering from the surface and reflection from the bottom is one of the most important obstacles to reliable communication in shallow-water channels. One of the best suggested remedies is the use of adaptive equalizers, whose performance depends strongly on the convergence rate and misadjustment error of the underlying adaptive algorithms. In this paper, the affine projection algorithm (APA), the selective regressor APA (SR-APA), the family of selective partial update (SPU) algorithms, the family of set-membership (SM) algorithms, and the selective partial update selective regressor APA (SPU-SR-APA) are compared with conventional algorithms such as least mean squares (LMS) in underwater acoustic communications. We apply experimental data from the Strait of Hormuz to demonstrate the efficiency of the proposed methods over a shallow-water channel. We observe that the steady-state mean square error (MSE) of the SR-APA, SPU-APA, SPU-normalized least mean square (SPU-NLMS), SPU-SR-APA, SM-APA and SM-NLMS algorithms decreases in comparison with the LMS algorithm. These algorithms also converge faster than LMS-type algorithms.
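
    The LMS baseline that the abstract compares against can be sketched as follows; the toy two-tap channel, BPSK training symbols, and step size are illustrative assumptions:

```python
import numpy as np

def lms_equalizer(received, desired, n_taps=8, mu=0.01):
    """Train an FIR equalizer with the LMS algorithm.

    `received` is the channel-distorted signal, `desired` the known
    training symbols; returns tap weights and per-step squared error.
    """
    w = np.zeros(n_taps)
    err = np.zeros(len(received) - n_taps)
    for n in range(len(err)):
        x = received[n:n + n_taps][::-1]   # most recent sample first
        y = w @ x                          # equalizer output
        e = desired[n + n_taps - 1] - y    # error against training symbol
        w += 2 * mu * e * x                # stochastic-gradient update
        err[n] = e ** 2
    return w, err

# BPSK symbols through a simple two-tap ISI channel with noise
rng = np.random.default_rng(1)
symbols = rng.choice([-1.0, 1.0], size=5000)
channel = np.array([1.0, 0.4])             # direct path + one echo
received = np.convolve(symbols, channel)[:len(symbols)]
received += 0.01 * rng.standard_normal(len(symbols))
w, err = lms_equalizer(received, symbols)
print(err[:50].mean() > err[-50:].mean())  # → True: MSE decays as taps converge
```

    The APA/SPU/SM variants in the paper differ mainly in how many past regressors and which tap subsets are used in this update.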

  6. A successive overrelaxation iterative technique for an adaptive equalizer

    NASA Technical Reports Server (NTRS)

    Kosovych, O. S.

    1973-01-01

    An adaptive strategy for the equalization of pulse-amplitude-modulated signals in the presence of intersymbol interference and additive noise is reported. The successive overrelaxation iterative technique is used as the algorithm for iteratively adjusting the equalizer coefficients during a training period so as to minimize the mean square error. With 2-cyclic and nonnegative Jacobi matrices, substantial improvement over the commonly used gradient techniques is demonstrated in the rate of convergence. The Jacobi theorems are also extended to nonpositive Jacobi matrices. Numerical examples strongly indicate that the improvements obtained for these special cases carry over to general channel characteristics. The technique is analytically shown to decrease the mean square error at each iteration over a large range of parameter values for light or moderate intersymbol interference, and over small intervals for general channels. Convergence of the relaxation algorithm is proven analytically in a noisy environment, and the coefficient variance is shown to be bounded.
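
    The abstract applies successive overrelaxation to equalizer-coefficient adjustment; the bare SOR iteration, shown here on a generic linear system rather than the paper's equalizer formulation, is:

```python
import numpy as np

def sor_solve(A, b, omega=1.25, iters=100):
    """Successive overrelaxation (SOR) for A x = b.

    omega = 1 reduces to Gauss-Seidel; 1 < omega < 2 can accelerate
    convergence for suitable (e.g. 2-cyclic) matrices.
    """
    n = len(b)
    x = np.zeros(n)
    for _ in range(iters):
        for i in range(n):
            # Row residual excluding the diagonal term
            sigma = A[i] @ x - A[i, i] * x[i]
            # Relaxed Gauss-Seidel update
            x[i] = (1 - omega) * x[i] + omega * (b[i] - sigma) / A[i, i]
    return x

# Symmetric positive-definite toy system (e.g. a channel autocorrelation)
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 4.0, 1.0],
              [0.0, 1.0, 4.0]])
b = np.array([1.0, 2.0, 3.0])
x = sor_solve(A, b)
print(np.allclose(A @ x, b))  # → True
```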

  7. Adaptive block-wise alphabet reduction scheme for lossless compression of images with sparse and locally sparse histograms

    NASA Astrophysics Data System (ADS)

    Masmoudi, Atef; Zouari, Sonia; Ghribi, Abdelaziz

    2015-11-01

    We propose a new adaptive block-wise lossless image compression algorithm based on the so-called alphabet reduction scheme combined with adaptive arithmetic coding (AC). The new encoding algorithm is particularly efficient for the lossless compression of images with sparse and locally sparse histograms. AC is a very efficient technique for lossless data compression, producing a rate close to the entropy; however, a performance loss occurs when encoding images or blocks whose number of active symbols is small compared with the nominal alphabet, which amplifies the zero-frequency problem. Most methods add one to the frequency count of each symbol of the nominal alphabet, which distorts the statistical model and thereby reduces the efficiency of the AC. The aim of this work is to overcome this drawback by assigning to each image block the smallest possible set containing all of its existing symbols, called the active symbols, as an alternative to using the nominal alphabet in conventional arithmetic encoders. We show experimentally that the proposed method outperforms several lossless image compression encoders and standards, including conventional arithmetic encoders, JPEG2000, and JPEG-LS.
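
    A minimal sketch of the active-symbol idea (block partitioning and per-block alphabet extraction only; the arithmetic coder itself, and the paper's encoding of the reduced alphabets, are omitted):

```python
import numpy as np

def block_active_alphabets(image, block=8):
    """Per-block active-symbol sets for an 8-bit image.

    A block with few active symbols can be coded with a reduced
    alphabet, avoiding the zero-frequency penalty of initializing
    all 256 nominal-symbol counts to 1.
    """
    h, w = image.shape
    alphabets = {}
    for r in range(0, h, block):
        for c in range(0, w, block):
            tile = image[r:r + block, c:c + block]
            alphabets[(r, c)] = np.unique(tile)  # sorted active symbols
    return alphabets

# Sparse-histogram toy image: each block uses only a few gray levels
img = np.zeros((16, 16), dtype=np.uint8)
img[:8, :8] = 10
img[8:, 8:] = 200
img[0, 0] = 12
alphabets = block_active_alphabets(img, block=8)
print(len(alphabets[(0, 0)]))  # → 2 active symbols instead of 256
```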

  8. An improved human visual system based reversible data hiding method using adaptive histogram modification

    NASA Astrophysics Data System (ADS)

    Hong, Wien; Chen, Tung-Shou; Wu, Mei-Chen

    2013-03-01

    Jung et al. (IEEE Signal Processing Letters, 18(2), 95, 2011) proposed a reversible data hiding method that considers the human visual system (HVS). They employed the mean of visited neighboring pixels to predict the current pixel value and estimated the just noticeable difference (JND) of the current pixel. Message bits are then embedded by adjusting the embedding level according to the calculated JND. Jung et al.'s method achieves excellent image quality. However, their embedding algorithm may over-modify pixel values and produce a large location map, which can deteriorate the image quality and decrease the pure payload. The proposed method exploits the nearest neighboring pixels to predict the visited pixel value and to estimate the corresponding JND. The cover pixels are preprocessed adaptively to reduce the size of the location map. We also employ an embedding-level selection mechanism to prevent near-saturated pixels from being over-modified. Experimental results show that the image quality of the proposed method is higher than that of Jung et al.'s method, and that the payload can be increased thanks to the reduced location map.

  9. Light wave transmission through free space using atmospheric laser links with adaptive equalization

    NASA Astrophysics Data System (ADS)

    Hussein, Gamal A.; Mohamed, Abd El-Naser A.; Oraby, Osama A.; Hassan, Emad S.; Eldokany, Ibrahim M.; El-Rabaie, El-Sayed M.; Dessouky, Moawad I.; Alshebeili, Saleh A.; El-Samie, Fathi E. Abd

    2013-07-01

    The utilization of adaptive equalization is suggested in the design of atmospheric laser link transceiver architectures for television and broadcast signal interconnect between an external event site and the master control room. At the transmitter side of the proposed transceiver, an array of atmospheric laser sources, digital signal processing, and optical radiators are used to send light waves through free space. At the receiver side, an adaptive finite impulse response least mean square (LMS) equalizer with activity detection guidance (ADG) and tap decoupling (TD) is used to mitigate the effect of channel impairments. The performance of the suggested adaptive equalizer is compared with that of a conventional adaptive equalizer based only on the standard LMS algorithm. The simulation results reveal that the adaptive LMS equalizer with ADG and TD is a promising solution to the intersymbol interference problem in optical wireless communication systems.

  10. Learning Rate Updating Methods Applied to Adaptive Fuzzy Equalizers for Broadband Power Line Communications

    NASA Astrophysics Data System (ADS)

    Ribeiro, Moisés V.

    2004-12-01

    This paper introduces adaptive fuzzy equalizers with variable step size for broadband power line (PL) communications. Based on delta-bar-delta and local Lipschitz estimation updating rules, in both feedforward and decision feedback configurations, we propose singleton and nonsingleton fuzzy equalizers with variable step size to cope with the intersymbol interference (ISI) of PL channels and the severity of the impulse noise generated by appliances and nonlinear loads connected to low-voltage power grids. The computed results show that the convergence rates of the proposed equalizers are higher than those attained by the traditional adaptive fuzzy equalizers introduced by J. M. Mendel and his students. Additionally, the BER curves reveal that the proposed techniques are efficient at mitigating the above-mentioned impairments.

  11. Dose-Volume Histogram Parameters and Late Side Effects in Magnetic Resonance Image-Guided Adaptive Cervical Cancer Brachytherapy

    SciTech Connect

    Georg, Petra; Lang, Stefan; Dimopoulos, Johannes C.A.; Doerr, Wolfgang; Sturdza, Alina E.; Berger, Daniel; Georg, Dietmar; Kirisits, Christian; Poetter, Richard

    2011-02-01

    Purpose: To evaluate the predictive value of dose-volume histogram (DVH) parameters for late side effects of the rectum, sigmoid colon, and bladder in image-guided brachytherapy for cervix cancer patients. Methods and Materials: A total of 141 patients received external-beam radiotherapy and image-guided brachytherapy with or without chemotherapy. The DVH parameters for the most exposed 2, 1, and 0.1 cm³ (D2cc, D1cc, and D0.1cc) of the rectum, sigmoid, and bladder, as well as International Commission on Radiation Units and Measurements point doses (DICRU), were computed. Total doses were converted to equivalent doses in 2 Gy by applying the linear-quadratic model (α/β = 3 Gy). Late side effects were prospectively assessed using the Late Effects in Normal Tissues-Subjective, Objective, Management and Analytic score. The following patient groups were defined: Group 1: no side effects (Grade 0); Group 2: side effects (Grade 1-4); Group 3: minor side effects (Grade 0-1); and Group 4: major side effects (Grade 2-4). Results: The median follow-up was 51 months. The overall 5-year actuarial side effect rates were 12% for rectum, 3% for sigmoid, and 23% for bladder. The mean total D2cc was 65 ± 12 Gy for rectum, 62 ± 12 Gy for sigmoid, and 95 ± 22 Gy for bladder. For rectum, statistically significant differences were observed between Groups 1 and 2 in all DVH parameters and DICRU. Between Groups 3 and 4, no difference was observed for D0.1cc. For sigmoid, significant differences were observed for D2cc and D1cc, but not for D0.1cc, in all groups. For bladder, significant differences were observed for all DVH parameters only when comparing Groups 3 and 4. No differences were observed for DICRU. Conclusions: The parameters D2cc and D1cc have a good predictive value for rectal toxicity. For sigmoid, no prediction could be postulated because of limited data. In bladder, DVH

  12. Color Histogram Diffusion for Image Enhancement

    NASA Technical Reports Server (NTRS)

    Kim, Taemin

    2011-01-01

    Various color histogram equalization (CHE) methods have been proposed to extend grayscale histogram equalization (GHE) to color images. In this paper a new method called histogram diffusion, which extends GHE to arbitrary dimensions, is proposed. Ranges in a histogram are specified as overlapping bars of uniform height and variable width, with widths proportional to their frequencies; this diagram is called the vistogram. As an alternative approach to GHE, the squared error between the vistogram and the uniform distribution is minimized. Each bar in the vistogram is approximated by a Gaussian function, and the Gaussian particles in the vistogram diffuse as a nonlinear autonomous system of ordinary differential equations. CHE results on color images showed that the approach is effective.
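
    For reference, the GHE baseline that histogram diffusion generalizes can be implemented in a few lines via the cumulative distribution function; the test image below is an illustrative low-contrast ramp:

```python
import numpy as np

def equalize_histogram(image):
    """Grayscale histogram equalization (GHE) for an 8-bit image.

    Builds a lookup table from the normalized CDF so that occupied
    gray levels spread over the full [0, 255] range.
    """
    hist = np.bincount(image.ravel(), minlength=256)
    cdf = np.cumsum(hist)
    cdf_min = cdf[cdf > 0][0]               # CDF at first occupied level
    lut = np.round(255.0 * (cdf - cdf_min) / (cdf[-1] - cdf_min))
    lut = np.clip(lut, 0, 255).astype(np.uint8)
    return lut[image]

# Low-contrast ramp confined to [100, 150] stretches to the full range
img = np.tile(np.linspace(100, 150, 64).astype(np.uint8), (64, 1))
out = equalize_histogram(img)
print(out.min(), out.max())  # → 0 255
```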

  13. Adaptive Pre-FFT Equalizer with High-Precision Channel Estimator for ISI Channels

    NASA Astrophysics Data System (ADS)

    Yoshida, Makoto

    We present an attractive approach to OFDM transmission using an adaptive pre-FFT equalizer, which can select an ICI-reduction mode according to channel conditions, and a degenerated-inverse-matrix-based channel estimator (DIME), which uses a cyclic sinc-function matrix uniquely determined by the transmitted subcarriers. In addition to simulation results, the proposed system with an adaptive pre-FFT equalizer and DIME has been laboratory-tested using a software defined radio (SDR)-based test bed. The simulation and experimental results demonstrate that the system can, at rates above 100 Mbps, provide a bit error rate below 10⁻³ on a fast multipath fading channel with a moving velocity of more than 200 km/h and a delay spread of 1.9 µs (maximum delay path of 7.3 µs) in the 5-GHz band.

  14. A NOISE ADAPTIVE FUZZY EQUALIZATION METHOD FOR PROCESSING SOLAR EXTREME ULTRAVIOLET IMAGES

    SciTech Connect

    Druckmueller, M.

    2013-08-15

    A new image enhancement tool ideally suited for the visualization of fine structures in extreme ultraviolet images of the corona is presented in this paper. The Noise Adaptive Fuzzy Equalization method is particularly suited for the exceptionally high dynamic range images from the Atmospheric Imaging Assembly instrument on the Solar Dynamics Observatory. This method produces artifact-free images and gives significantly better results than methods based on convolution or Fourier transform which are often used for that purpose.

  15. Comparison of adverse effects of proton and X-ray chemoradiotherapy for esophageal cancer using an adaptive dose–volume histogram analysis

    PubMed Central

    Makishima, Hirokazu; Ishikawa, Hitoshi; Terunuma, Toshiyuki; Hashimoto, Takayuki; Yamanashi, Koichi; Sekiguchi, Takao; Mizumoto, Masashi; Okumura, Toshiyuki; Sakae, Takeji; Sakurai, Hideyuki

    2015-01-01

    Cardiopulmonary late toxicity is of concern in concurrent chemoradiotherapy (CCRT) for esophageal cancer. The aim of this study was to examine the benefit of proton beam therapy (PBT) using clinical data and adaptive dose–volume histogram (DVH) analysis. The subjects were 44 patients with esophageal cancer who underwent definitive CCRT using X-rays (n = 19) or protons (n = 25). Experimental recalculation using protons was performed for the patient actually treated with X-rays, and vice versa. Target coverage and dose constraints of normal tissues were conserved. Lung V5–V20, mean lung dose (MLD), and heart V30–V50 were compared for risk organ doses between experimental plans and actual treatment plans. Potential toxicity was estimated using protons in patients actually treated with X-rays, and vice versa. Pulmonary events of Grade ≥2 occurred in 8/44 cases (18%), and cardiac events were seen in 11 cases (25%). Risk organ doses in patients with events of Grade ≥2 were significantly higher than for those with events of Grade ≤1. Risk organ doses were lower in proton plans compared with X-ray plans. All patients suffering toxicity who were treated with X-rays (n = 13) had reduced predicted doses in lung and heart using protons, while doses in all patients treated with protons (n = 24) with toxicity of Grade ≤1 had worsened predicted toxicity with X-rays. Analysis of normal tissue complication probability showed a potential reduction in toxicity by using proton beams. Irradiation dose, volume and adverse effects on the heart and lung can be reduced using protons. Thus, PBT is a promising treatment modality for the management of esophageal cancer. PMID:25755255

  17. Network histograms and universality of blockmodel approximation

    PubMed Central

    Olhede, Sofia C.; Wolfe, Patrick J.

    2014-01-01

    In this paper we introduce the network histogram, a statistical summary of network interactions to be used as a tool for exploratory data analysis. A network histogram is obtained by fitting a stochastic blockmodel to a single observation of a network dataset. Blocks of edges play the role of histogram bins and community sizes that of histogram bandwidths or bin sizes. Just as standard histograms allow for varying bandwidths, different blockmodel estimates can all be considered valid representations of an underlying probability model, subject to bandwidth constraints. Here we provide methods for automatic bandwidth selection, by which the network histogram approximates the generating mechanism that gives rise to exchangeable random graphs. This makes the blockmodel a universal network representation for unlabeled graphs. With this insight, we discuss the interpretation of network communities in light of the fact that many different community assignments can all give an equally valid representation of such a network. To demonstrate the fidelity-versus-interpretability tradeoff inherent in considering different numbers and sizes of communities, we analyze two publicly available networks—political weblogs and student friendships—and discuss how to interpret the network histogram when additional information related to node and edge labeling is present. PMID:25275010

  18. Adaptive reconfigurable V-BLAST type equalizer for cognitive MIMO-OFDM radios

    NASA Astrophysics Data System (ADS)

    Ozden, Mehmet Tahir

    2015-12-01

    An adaptive channel shortening equalizer design for multiple input multiple output-orthogonal frequency division multiplexing (MIMO-OFDM) radio receivers is considered in this paper. The proposed receiver has desirable features for cognitive and software defined radio implementations. It consists of two sections: a MIMO decision feedback equalizer (MIMO-DFE) and adaptive multiple Viterbi detection. In the MIMO-DFE section, a complete modified Gram-Schmidt orthogonalization of the multichannel input data is accomplished using sequential-processing multichannel Givens lattice stages, so that a Vertical Bell Laboratories Layered Space Time (V-BLAST) type MIMO-DFE is realized at the front end of the channel shortening equalizer. Matrix operations, a major bottleneck for receiver operations, are thereby avoided, and only scalar operations are used. The result is a highly modular and regular radio receiver architecture well suited to digital signal processing (DSP) chip and field programmable gate array (FPGA) implementations, which are important for software defined radio realizations. The MIMO-DFE section of the proposed receiver can also be reconfigured for spectrum sensing and positioning, important tasks for cognitive radio applications. In the adaptive multiple Viterbi detection section, a systolic array implementation for each channel yields a receiver architecture with high computational concurrency. The total computational complexity is given in terms of the equalizer and desired-response filter lengths, the alphabet size, and the number of antennas. The performance of the proposed receiver is presented for the two-channel case by means of mean squared error (MSE) and probability-of-error evaluations, conducted for time-invariant and time-variant channel conditions, orthogonal and nonorthogonal transmissions, and two different modulation schemes.

  19. Structure Size Enhanced Histogram

    NASA Astrophysics Data System (ADS)

    Wesarg, Stefan; Kirschner, Matthias

    Direct volume visualization requires the definition of transfer functions (TFs) for the assignment of opacity and color. Multi-dimensional TFs are based on at least two image properties, and are specified by means of 2D histograms. In this work we propose a new type of a 2D histogram which combines gray value with information about the size of the structures. This structure size enhanced (SSE) histogram is an intuitive approach for representing anatomical features. Clinicians — the users we are focusing on — are much more familiar with selecting features by their size than by their gradient magnitude value. As a proof of concept, we employ the SSE histogram for the definition of two-dimensional TFs for the visualization of 3D MRI and CT image data.

  20. Simulation of a QPSK high data rate receiver - Modeling of tracking loops and adaptive equalizer

    NASA Astrophysics Data System (ADS)

    Schiavoni, Maryanne T.; Ray, Robert T.

    The simulation effort supporting the design of the high-data-rate receiver for the second tracking and data relay satellite system ground terminal is detailed. The receiver accepts quadrature-phase-shift-keying (QPSK) signals at data rates from 6 to 300 Mb/s. The modeling of the Costas tracking loop that tracks the phase of the carrier signal and the clock recovery loop that aligns the data timing, both of which include hardlimiting functions that add a degree of nonlinearity to the designs, is discussed. The simulation of the equalizer, which applies a least-mean-squares adaptive algorithm to remove the intersymbol interference and crosstalk from the channel in order to improve the end-to-end link performance, is also addressed. Simulation results are provided.

  1. A low power CMOS 3.3 Gbps continuous-time adaptive equalizer for serial link

    NASA Astrophysics Data System (ADS)

    Hao, Ju; Yumei, Zhou; Jianzhong, Zhao

    2011-09-01

    This paper describes the use of a high-speed continuous-time analog adaptive equalizer as the front end of a receiver for a high-speed serial interface compliant with serial communication specifications such as USB2.0, PCI-E2.0 and Rapid IO. The low- and high-frequency loops are merged to reduce the effect of delay between the two paths; in addition, the infinite input impedance facilitates cascading stages to improve the high-frequency boosting gain. The implemented circuit architecture accommodates data rates from 1 to 3.3 Gbps over FR4-PCB traces of different lengths, which introduce as much as 25 dB of loss. Replica control circuits are included to provide a convenient way to regulate the common-mode voltage for fully differential operation, and AC coupling is adopted to suppress the common-mode input from the preceding stage. A prototype chip was fabricated in 0.18-μm 1P6M mixed-signal CMOS technology. The circuit area is 0.6 × 0.57 mm², and the analog equalizer operates at up to 3.3 Gbps over an FR4-PCB trace with 25 dB loss. The overall power dissipation is approximately 23.4 mW.

  2. Reducing interferences in wireless communication systems by mobile agents with recurrent neural networks-based adaptive channel equalization

    NASA Astrophysics Data System (ADS)

    Beritelli, Francesco; Capizzi, Giacomo; Lo Sciuto, Grazia; Napoli, Christian; Tramontana, Emiliano; Woźniak, Marcin

    2015-09-01

    Channel equalization in communication systems is conventionally solved with adaptive filtering algorithms. Today, Mobile Agents (MAs) with Recurrent Neural Networks (RNNs) can also be adopted for effective interference reduction in modern wireless communication systems (WCSs). In this paper, MAs with RNNs are proposed as a novel computing approach for reducing interference in WCSs by performing adaptive channel equalization; the resulting method is called MAs-RNNs. We implement this new paradigm for interference reduction. Simulation results and evaluations demonstrate the effectiveness of the approach and show that better transmission performance in wireless communication networks can be achieved using the MAs-RNNs-based adaptive filtering algorithm.

  3. Adaptive gain, equalization, and wavelength stabilization techniques for silicon photonic microring resonator-based optical receivers

    NASA Astrophysics Data System (ADS)

    Palermo, Samuel; Chiang, Patrick; Yu, Kunzhi; Bai, Rui; Li, Cheng; Chen, Chin-Hui; Fiorentino, Marco; Beausoleil, Ray; Li, Hao; Shafik, Ayman; Titriku, Alex

    2016-03-01

    Interconnect architectures based on high-Q silicon photonic microring resonator devices offer a promising solution to the dramatic increase in datacenter I/O bandwidth demands, owing to their ability to realize wavelength-division multiplexing (WDM) in a compact and energy-efficient manner. However, challenges exist in realizing efficient receivers for these systems due to varying per-channel link budgets, sensitivity requirements, and ring resonance wavelength shifts. This paper reports on adaptive optical receiver design techniques that address these issues; they have been demonstrated in two hybrid-integrated prototypes based on microring drop filters and waveguide photodetectors implemented in a 130 nm SOI process, with high-speed optical front ends designed in 65 nm CMOS. A 10 Gb/s power-scalable architecture employs supply voltage scaling of a three-inverter-stage transimpedance amplifier (TIA) that is adapted with an eye-monitor control loop to yield the necessary sensitivity for a given channel. As reduction of TIA input-referred noise is more critical at higher data rates, a 25 Gb/s design utilizes a TIA with a large input-stage feedback resistor, cascaded with a continuous-time linear equalizer (CTLE) that compensates for the increased input pole. When tested with a waveguide Ge photodetector with 0.45 A/W responsivity, this topology achieves 25 Gb/s operation with -8.2 dBm sensitivity at a BER of 10⁻¹². To address the sensitivity of microring drop filters to fabrication tolerances and thermal variations, efficient wavelength-stabilization control loops are necessary. A peak-power-based monitoring loop that locks the drop filter to the input wavelength, while remaining compatible with the high-speed TIA offset-correction feedback loop, is implemented with a 0.7 nm tuning range at 43 μW/GHz efficiency.

  4. Testing the equality of students' performance using Alexander-Govern test with adaptive trimmed means

    NASA Astrophysics Data System (ADS)

    Abdullah, Suhaida; Yahaya, Sharipah Soaad Syed; Yusof, Zahayu Md

    2014-06-01

    Analyzing the equality of independent groups has to be done with caution. Classical approaches such as the t-test for two groups and analysis of variance (ANOVA) for more than two groups are the usual choices of researchers. However, these methods can be undermined by the presence of nonnormality, variance heterogeneity, or both: ANOVA in particular rests on the assumptions of normality and homogeneity of variance, which real-life data often fail to satisfy. The Alexander-Govern test with adaptive trimmed means (AG_atm) is one approach that can be chosen as an alternative to the classical tests when their assumptions are violated. In this paper, the performance of AG_atm is compared with the original AG test and ANOVA using simulated and real-life data. The simulation study shows that AG_atm performs better than both the original AG test and the classical test. For the real-life data, students' performance in a decision analysis course, measured by final examination score, was chosen; exploratory data analysis found this dataset to be nonnormal.

  5. Asymmetry Compensation by Nonlinear Adaptive Partial Response Equalizer for 31.3 GB Blu-ray Disk ROM

    NASA Astrophysics Data System (ADS)

    Kajiwara, Yoshiyuki; Higashino, Satoru; Yamagami, Tamotsu

    2005-05-01

    We investigated a nonlinear adaptive partial response equalizer for the asymmetry compensation of a 31.3 GB higher-linear-density Blu-ray disc read-only memory (ROM) with 16% asymmetry. A second-order adaptive Volterra filter approximately equalizes a nonlinear signal into a linear one. Guided by computer simulations, we reduced its computational complexity so that the digital circuit could be designed within optimal hardware resources. We then implemented the adaptive Volterra filter on an FPGA evaluation board for bit error rate measurements. Finally, we determined that an adaptive Volterra filter can obtain improved bit error rates by signal linearization ahead of a conventional Viterbi detector for PR(1221).
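    A second-order adaptive Volterra filter of the kind described can be sketched with sample-by-sample LMS updates of a linear and a quadratic kernel. This is a generic illustration; the tap count, step size, and the synthetic memoryless distortion used to exercise it are our own choices, not the paper's design:

```python
import numpy as np

def volterra2_lms(x, d, N=3, mu=0.05):
    """Second-order adaptive Volterra equalizer trained sample-by-sample
    with LMS: output = linear term h1.u + quadratic term u.H2.u."""
    h1 = np.zeros(N)                        # linear kernel taps
    H2 = np.zeros((N, N))                   # quadratic kernel taps
    y = np.zeros(len(x))
    for n in range(N - 1, len(x)):
        u = x[n - N + 1:n + 1][::-1]        # most recent sample first
        y[n] = h1 @ u + u @ H2 @ u          # Volterra series output
        e = d[n] - y[n]
        h1 += mu * e * u                    # LMS update, linear kernel
        H2 += mu * e * np.outer(u, u)       # LMS update, quadratic kernel
    return y
```

    Driven with a distorted signal `x` and the undistorted signal `d` as the training target, the quadratic kernel learns to cancel the nonlinearity, which is the linearization role it plays ahead of the Viterbi detector.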

  6. The Histogram-Area Connection

    ERIC Educational Resources Information Center

    Gratzer, William; Carpenter, James E.

    2008-01-01

    This article demonstrates an alternative approach to the construction of histograms--one based on the notion of using area to represent relative density in intervals of unequal length. The resulting histograms illustrate the connection between the area of the rectangles associated with particular outcomes and the relative frequency (probability)…
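    The area principle described above is easy to make concrete: with bins of unequal width, bar height must be relative frequency divided by bin width, so that bar area, not height, carries the probability. A short numpy illustration with made-up data:

```python
import numpy as np

data = np.array([1, 2, 2, 3, 5, 7, 8, 9, 9, 10], dtype=float)
edges = np.array([0.0, 4.0, 6.0, 10.0])      # bins of unequal width
counts, _ = np.histogram(data, bins=edges)
rel_freq = counts / counts.sum()             # relative frequency per bin
density = rel_freq / np.diff(edges)          # bar height = rel. freq. / width
areas = density * np.diff(edges)             # bar area recovers rel. freq.
```

    The areas sum to 1, so the rectangle over each interval visually represents the probability of an outcome falling in it, regardless of the interval's width.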

  7. Fast tracking using edge histograms

    NASA Astrophysics Data System (ADS)

    Rokita, Przemyslaw

    1997-04-01

    This paper proposes a new algorithm for tracking objects and object boundaries. The algorithm was developed and applied in a system for compositing computer-generated images and real-world video sequences, but it can be applied in general to all tracking systems where accuracy and high processing speed are required. It is based on the analysis of histograms obtained by summing the pixels of edge-segmented images along chosen axes. Edge segmentation is done by spatial convolution using a gradient operator. The advantage of this approach is that it can be performed in real time using hardware convolution filters available on the market. After edge extraction and histogram computation, the respective positions of the maxima in the edge-intensity histograms of the current and previous frames are compared and matched. The information obtained in this way about the displacement of the histogram maxima can be directly converted into information about changes of the target boundary positions along the chosen axes.
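    A toy numpy version of the projection-histogram tracking idea above: gradient edges are summed along each axis, and the peak positions are matched between frames. `np.gradient` stands in here for the hardware convolution filter the paper relies on:

```python
import numpy as np

def edge_projections(img):
    """Edge-intensity histograms formed by summing gradient-magnitude
    (edge) pixels along each image axis."""
    gy, gx = np.gradient(img.astype(float))
    edges = np.hypot(gx, gy)
    return edges.sum(axis=0), edges.sum(axis=1)   # per-column, per-row sums

def peak_displacement(prev_img, cur_img):
    """Shift of the dominant edge-histogram peak between two frames."""
    pc, pr = edge_projections(prev_img)
    cc, cr = edge_projections(cur_img)
    return cc.argmax() - pc.argmax(), cr.argmax() - pr.argmax()
```

    Matching only histogram peaks (rather than whole images) is what makes this class of tracker fast; a production system would match several maxima, not just the single dominant one.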

  8. An Adaptive Niching Genetic Algorithm using a niche size equalization mechanism

    NASA Astrophysics Data System (ADS)

    Nagata, Yuichi

    Niching GAs have been widely investigated as a way to apply genetic algorithms (GAs) to multimodal function optimization problems. In this paper, we suggest a new niching GA that attempts to form niches, each consisting of an equal number of individuals. The proposed GA can also be applied to combinatorial optimization problems by defining a distance metric in the search space. We apply the proposed GA to the job-shop scheduling problem (JSP) and demonstrate that the proposed niching method enhances the ability to maintain niches and improves the performance of GAs.

  9. Equalizing resolution in smoothed-particle hydrodynamics calculations using self-adaptive sinc kernels

    NASA Astrophysics Data System (ADS)

    García-Senz, Domingo; Cabezón, Rubén M.; Escartín, José A.; Ebinger, Kevin

    2014-10-01

    Context. The smoothed-particle hydrodynamics (SPH) technique is a numerical method for solving gas-dynamical problems. It has been applied to simulate the evolution of a wide variety of astrophysical systems. The method has second-order accuracy, with a resolution that is usually much higher in the compressed regions than in the diluted zones of the fluid. Aims: We propose and check a method to balance and equalize the resolution of SPH between high- and low-density regions. This method relies on the versatility of a family of interpolators called sinc kernels, which allows increasing the interpolation quality by varying only a single parameter (the exponent of the sinc function). Methods: The proposed method was checked and validated through a number of numerical tests, from standard one-dimensional Riemann problems in shock tubes, to multidimensional simulations of explosions, hydrodynamic instabilities, and the collapse of a Sun-like polytrope. Results: The analysis of the hydrodynamical simulations suggests that the scheme devised to equalize the accuracy improves the treatment of the post-shock regions and, in general, of the rarefied zones of fluids while causing no harm to the growth of hydrodynamic instabilities. The method is robust and easy to implement with a low computational overhead. It conserves mass, energy, and momentum and reduces to the standard SPH scheme in regions of the fluid that have smooth density gradients.
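    The single-parameter kernel family mentioned above can be illustrated with a small helper. This is an unnormalised sketch (the paper's kernels carry dimension-dependent normalisation constants, omitted here, and the exact functional form may differ in detail):

```python
import numpy as np

def sinc_kernel(q, n):
    """Unnormalised sinc-family kernel W(q) = sinc(q/2)**n with compact
    support on [0, 2); the exponent n is the single quality parameter
    that the equalization scheme varies from particle to particle."""
    q = np.asarray(q, dtype=float)
    # np.sinc(x) = sin(pi x)/(pi x), so np.sinc(q/2) peaks at q = 0
    return np.where(q < 2.0, np.sinc(q / 2.0) ** n, 0.0)
```

    Raising `n` sharpens the kernel around the particle, which is how a single scalar parameter can trade interpolation quality against smoothing without changing the kernel's support.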

  10. Convergence study of various non-quadratic adaptive algorithms in the equalization of impulsive DS-CDMA channel

    NASA Astrophysics Data System (ADS)

    Jimaa, Shihab A.; Jadah, Mohamed E.

    2005-10-01

    This paper investigates the performance of using various non-quadratic adaptive algorithms in the adaptation of a non-linear receiver, coupled with a second-order phase tracking subsystem, for an asynchronous DS-CDMA communication system impaired by a double-spread multipath channel and Gaussian-mixture impulsive noise. These algorithms are the lower-order algorithm (where the power of the cost function is lower than 2), the least-mean mixed-norm algorithm (where a mixed-norm cost function is introduced that combines the LMS and LMF functions), and the least mean square-fourth switching algorithm (which switches between LMS and LMF depending on the value of the error). The non-linear receiver comprises a feed-forward filter (FFF), a feedback filter (FBF), and an equalizer/second-order phase-locked loop (PLL). The investigations study the effect of the proposed algorithms on the performance of the non-linear receiver in terms of mean-square error (MSE) and bit-error rate (BER). Computer simulation results indicate that the proposed least-mean mixed-norm receiver algorithm gives the fastest convergence rate and similar BER performance in comparison with the NLMS adaptive receiver. Furthermore, extensive computer simulation tests have been carried out to determine the optimum values of the step size, the power of the cost function, and the adaptation parameter of the proposed algorithms. Results show that the optimum values of the step size for the lower-order, least-mean square-fourth, least-mean mixed-norm, and NLMS algorithms are 5x10^-4, 10^-6, 5x10^-4, and 0.01, respectively. The optimum value of the power of the lower-order algorithm is 1.9 and the optimum value of the adaptation parameter of the least-mean mixed-norm algorithm is 0.9.
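    A minimal sketch of the least-mean mixed-norm idea described above: the tap update blends the LMS error term with the LMF error-cubed term through a mixing parameter. Parameter names (`mu`, `lam`) and values are illustrative, not the optimum values reported in the abstract:

```python
import numpy as np

def lmmn_filter(x, d, N=8, mu=0.01, lam=0.9):
    """Least-mean mixed-norm adaptive filter: the tap update blends the
    LMS error term e with the LMF term e**3 via the mixing parameter lam
    (lam = 1 gives pure LMS, lam = 0 pure LMF)."""
    w = np.zeros(N)
    sq_err = np.zeros(len(x))
    for n in range(N - 1, len(x)):
        u = x[n - N + 1:n + 1][::-1]          # regressor, newest first
        e = d[n] - w @ u
        w += mu * (lam * e + (1.0 - lam) * e ** 3) * u
        sq_err[n] = e * e
    return w, sq_err
```

    The cubic term weights large errors more heavily while the linear term keeps the update well-behaved near convergence, which is the motivation for mixing the two norms.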

  11. CHICOM: A code of tests for comparing unweighted and weighted histograms and two weighted histograms

    NASA Astrophysics Data System (ADS)

    Gagunashvili, N. D.

    2012-01-01

    A Fortran-77 program for calculating test statistics to compare a weighted histogram with an unweighted histogram, and two histograms with weighted entries, is presented. The code calculates the test statistics both for histograms with normalized event weights and for histograms with unnormalized event weights.

  12. Studies on effects of feedback delay on the convergence performance of adaptive time-domain equalizers for fiber dispersive channels

    NASA Astrophysics Data System (ADS)

    Guo, Qun; Xu, Bo; Qiu, Kun

    2016-04-01

    The adaptive time-domain equalizer (TDE) is an important module of digital optical coherent receivers. From an implementation perspective, we analyze and compare in detail the effects of error-signal feedback delay on the convergence performance of a TDE using either the least-mean square (LMS) or the constant modulus algorithm (CMA). For this purpose, a simplified theoretical model is proposed, based on which iterative equations for the mean value and the variance of the tap coefficients are derived, with and without error-signal feedback delay, for both LMS- and CMA-based methods for the first time. The analytical results show that a decreased step size has to be used for the TDE to converge, and that slower convergence cannot be avoided as the feedback delay increases. Compared with the data-aided LMS-based method, the CMA-based method has a slower convergence speed and larger variation after convergence. Similar results are confirmed using numerical simulations for fiber dispersive channels. As the step size increases, a feedback delay of 20 clock cycles might cause the TDE to diverge. Compared with the CMA-based method, the LMS-based method has a higher tolerance of feedback delay and allows a larger step size for a faster convergence speed.
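    The effect studied above can be reproduced in a toy model: an LMS tap-update loop in which the error enters the update only after a configurable number of clock cycles. This is our own simplified illustration, not the authors' analytical model; `delay` buffers error/regressor pairs to mimic pipeline latency:

```python
import numpy as np

def lms_with_delay(x, d, N=5, mu=0.01, delay=0):
    """LMS tap adaptation where the error applied at time n was computed
    `delay` samples earlier, mimicking hardware feedback latency."""
    w = np.zeros(N)
    pending = []              # (error, regressor) pairs still in the pipeline
    sq_err = []
    for n in range(N - 1, len(x)):
        u = x[n - N + 1:n + 1][::-1]
        e = d[n] - w @ u
        pending.append((e, u))
        if len(pending) > delay:          # delayed feedback becomes available
            e_old, u_old = pending.pop(0)
            w += mu * e_old * u_old
        sq_err.append(e * e)
    return np.array(sq_err)
```

    With the same step size, the delayed loop still converges but more slowly, and pushing `mu` up eventually destabilizes the delayed loop first, which is the qualitative behaviour the paper quantifies.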

  13. Quantitative histogram analysis of images

    NASA Astrophysics Data System (ADS)

    Holub, Oliver; Ferreira, Sérgio T.

    2006-11-01

    A routine for histogram analysis of images has been written in the object-oriented, graphical development environment LabVIEW. The program converts an RGB bitmap image into an intensity-linear greyscale image according to selectable conversion coefficients. This greyscale image is subsequently analysed by plots of the intensity histogram and probability distribution of brightness, and by calculation of various parameters, including average brightness, standard deviation, variance, minimal and maximal brightness, mode, skewness and kurtosis of the histogram and the median of the probability distribution. The program allows interactive selection of specific regions of interest (ROI) in the image and definition of lower and upper threshold levels (e.g., to permit the removal of a constant background signal). The results of the analysis of multiple images can be conveniently saved and exported for plotting in other programs, which allows fast analysis of relatively large sets of image data. The program file accompanies this manuscript together with a detailed description of two application examples: the analysis of fluorescence microscopy images, specifically of tau-immunofluorescence in primary cultures of rat cortical and hippocampal neurons, and the quantification of protein bands by Western blot. The possibilities and limitations of this kind of analysis are discussed.
    Program summary
    Title of program: HAWGC
    Catalogue identifier: ADXG_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXG_v1_0
    Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
    Computers: Mobile Intel Pentium III, AMD Duron
    Installations: No installation necessary (executable file together with necessary files for LabVIEW Run-time engine)
    Operating systems or monitors under which the program has been tested: WindowsME/2000/XP
    Programming language used: LabVIEW 7.0
    Memory required to execute with typical data: ~16MB for starting and ~160MB used for
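    The conversion-and-statistics pipeline described above is straightforward to sketch in numpy; the coefficient triple shown is the common Rec. 601 luma weighting, standing in for the program's selectable coefficients:

```python
import numpy as np

def grey_stats(rgb, coeffs=(0.299, 0.587, 0.114)):
    """Convert an RGB array to greyscale with selectable coefficients and
    return basic histogram statistics of the brightness values."""
    grey = rgb.astype(float) @ np.asarray(coeffs, dtype=float)
    v = grey.ravel()
    mean, std = v.mean(), v.std()
    # central-moment skewness and excess kurtosis of the brightness values
    skew = ((v - mean) ** 3).mean() / std ** 3 if std else 0.0
    kurt = ((v - mean) ** 4).mean() / std ** 4 - 3.0 if std else 0.0
    return {"mean": mean, "std": std, "min": v.min(), "max": v.max(),
            "skewness": skew, "kurtosis": kurt}
```

    ROI selection and thresholding amount to masking `v` before computing the statistics, which is the same operation the LabVIEW program exposes interactively.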

  14. Interpreting Histograms. As Easy as It Seems?

    ERIC Educational Resources Information Center

    Lem, Stephanie; Onghena, Patrick; Verschaffel, Lieven; Van Dooren, Wim

    2014-01-01

    Histograms are widely used, but recent studies have shown that they are not as easy to interpret as it might seem. In this article, we report on three studies on the interpretation of histograms in which we investigated (1) whether the misinterpretation by university students can be considered to be the result of heuristic reasoning, (2)…

  15. The Online Histogram Presenter for the ATLAS experiment: A modular system for histogram visualization

    NASA Astrophysics Data System (ADS)

    Dotti, Andrea; Adragna, Paolo; Vitillo, Roberto A.

    2010-04-01

    The Online Histogram Presenter (OHP) is the ATLAS tool to display histograms produced by the online monitoring system. In spite of the name, the Online Histogram Presenter is much more than just a histogram display. To cope with the large amount of data, the application has been designed to minimise the network traffic; sophisticated caching, hashing and filtering algorithms reduce memory and CPU usage. The system uses Qt and ROOT for histogram visualisation and manipulation. In addition, histogram visualisation can be extensively customised through configuration files. Finally, its very modular architecture features a lightweight plug-in system, allowing extensions to accommodate specific user needs. After an architectural overview of the application, the paper presents in detail the solutions adopted to increase performance and describes the plug-in system.

  16. Ear segmentation using histogram based K-means clustering and Hough transformation under CVL dataset

    NASA Astrophysics Data System (ADS)

    Liu, Heng; Liu, Dekai

    2009-10-01

    Using the CVL dataset, we present an ear segmentation approach based on adaptive histogram-based K-means clustering and fast Hough transformation. This work first analyzes the characteristics of ear images in the CVL face dataset. Based on this analysis, we use an adaptive histogram-based K-means clustering method to threshold the ear images and roughly segment the ear parts. After ear contour extraction, with boundary determination through vertical projection, the Hough transformation is utilized to locate the ear contour accurately. Experimental results and comparisons with other segmentation methods show that our approach is effective.

  17. Circle detection using scan lines and histograms

    NASA Astrophysics Data System (ADS)

    Chen, Ming; Zhang, Feng; Du, Zhenhong; Liu, Renyi

    2013-11-01

    Circle detection is significant in image processing and pattern recognition. We present a new algorithm for detecting circles, which is based on the global geometric symmetry of circles. First, the horizontal and vertical midpoint histograms of the edge image are obtained by using scan lines. Then, we apply the peak-finding algorithm to the midpoint histograms to look for the center of the circle. The normalized radius histogram is finally used to verify the existence of the circle and extract its radius. Synthetic images with different levels of pepper noise and real images containing several circles were used to test the performance. Experimental results demonstrate that the proposed algorithm has the advantage of computational efficiency as compared with the randomized Hough transform and some other algorithms.
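    The centre-finding step described above can be sketched with midpoint histograms: on each scan line, the midpoint of the outermost edge pixels is accumulated, and the histogram peaks give the centre coordinates. A numpy illustration (the radius-verification step is omitted):

```python
import numpy as np

def center_from_midpoints(edge):
    """Estimate a circle centre from a binary edge map: on every scan line
    the midpoint of the leftmost/rightmost edge pixels is accumulated into
    a histogram whose peak lies at the centre column; likewise for rows."""
    h, w = edge.shape
    col_hist = np.zeros(w, dtype=int)
    row_hist = np.zeros(h, dtype=int)
    for r in range(h):                      # horizontal scan lines
        c = np.flatnonzero(edge[r])
        if c.size >= 2:
            col_hist[(c[0] + c[-1]) // 2] += 1
    for c in range(w):                      # vertical scan lines
        r = np.flatnonzero(edge[:, c])
        if r.size >= 2:
            row_hist[(r[0] + r[-1]) // 2] += 1
    return col_hist.argmax(), row_hist.argmax()
```

    Because every chord of a circle has its midpoint on the perpendicular diameter, the midpoint histograms peak sharply at the centre, which is what makes this approach cheap compared with a full Hough accumulator.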

  18. Simple dynamics for broad histogram method

    NASA Astrophysics Data System (ADS)

    de Oliveira, Paulo Murilo Castro

    2002-08-01

    The purpose of this text is: (1) to clarify the foundations of the broad histogram method, stressing the conceptual differences between it and reweighting procedures in general; (2) to propose a very simple microcanonical dynamic rule, yet to be tested by theoretical grounds, which could provide a good improvement to numerical simulations.

  19. Histogram approaches for lossy compression of digital holograms of three-dimensional objects.

    PubMed

    Shortt, Alison E; Naughton, Thomas J; Javidi, Bahram

    2007-06-01

    We present a novel nonuniform quantization compression technique, histogram quantization, for digital holograms of 3-D real-world objects. We exploit a priori knowledge of the distribution of the values in our data. We compare this technique to another histogram-based approach: a modified version of Max's algorithm that has been adapted in a straightforward manner to complex-valued 2-D signals. We conclude the compression procedure by applying lossless techniques to our quantized data. We demonstrate improvements over previous results obtained by applying uniform and nonuniform quantization techniques to the hologram data.

  20. Accelerated weight histogram method for exploring free energy landscapes

    NASA Astrophysics Data System (ADS)

    Lindahl, V.; Lidmar, J.; Hess, B.

    2014-07-01

    Calculating free energies is an important and notoriously difficult task for molecular simulations. The rapid increase in computational power has made it possible to probe increasingly complex systems, yet extracting accurate free energies from these simulations remains a major challenge. Fully exploring the free energy landscape of, say, a biological macromolecule typically requires sampling large conformational changes and slow transitions. Often, the only feasible way to study such a system is to simulate it using an enhanced sampling method. The accelerated weight histogram (AWH) method is a new, efficient extended ensemble sampling technique which adaptively biases the simulation to promote exploration of the free energy landscape. The AWH method uses a probability weight histogram which allows for efficient free energy updates and results in an easy discretization procedure. A major advantage of the method is its general formulation, making it a powerful platform for developing further extensions and analyzing its relation to already existing methods. Here, we demonstrate its efficiency and general applicability by calculating the potential of mean force along a reaction coordinate for both a single dimension and multiple dimensions. We make use of a non-uniform, free energy dependent target distribution in reaction coordinate space so that computational efforts are not wasted on physically irrelevant regions. We present numerical results for molecular dynamics simulations of lithium acetate in solution and chignolin, a 10-residue long peptide that folds into a β-hairpin. We further present practical guidelines for setting up and running an AWH simulation.

  2. Equal Access.

    ERIC Educational Resources Information Center

    De Patta, Joe

    2003-01-01

    Presents an interview with Stephen McCarthy, co-partner and president of Equal Access ADA Consulting Architects of San Diego, California, about designing schools to naturally integrate compliance with the Americans with Disabilities Act (ADA). (EV)

  3. Improving tumour heterogeneity MRI assessment with histograms

    PubMed Central

    Just, N

    2014-01-01

    By definition, tumours are heterogeneous. They are defined by marked differences in cells, microenvironmental factors (oxygenation levels, pH, VEGF, VPF and TGF-α), metabolism, vasculature, structure and function, which in turn translate into heterogeneous drug delivery and therapeutic outcome. Ways to quantitatively estimate tumour heterogeneity can improve drug discovery, treatment planning and therapeutic responses. It is therefore of paramount importance to have reliable and reproducible biomarkers of the heterogeneity of cancerous lesions. During the past decade, the number of studies using histogram approaches has increased drastically across various magnetic resonance imaging (MRI) techniques (DCE-MRI, DWI, SWI etc.), although information on tumour heterogeneity remains poorly exploited. This fact can be attributed to poor knowledge of the available metrics and of their specific meaning, as well as to the lack of literature references to standardised histogram methods against which surrogate markers of heterogeneity can be compared. This review highlights the current knowledge and the critical advances needed to investigate and quantify tumour heterogeneity. The key role of imaging techniques, and in particular of MRI, for an accurate investigation of tumour heterogeneity is reviewed, with a particular emphasis on histogram approaches and derived methods. PMID:25268373

  4. Quantification and classification of neuronal responses in kernel-smoothed peristimulus time histograms.

    PubMed

    Hill, Michael R H; Fried, Itzhak; Koch, Christof

    2015-02-15

    Peristimulus time histograms are a widespread form of visualizing neuronal responses. Kernel convolution methods transform these histograms into a smooth, continuous probability density function, providing an improved estimate of a neuron's actual response envelope. Here we develop a classifier, called the h-coefficient, to determine whether time-locked fluctuations in the firing rate of a neuron should be classified as a response or as random noise. Unlike previous approaches, the h-coefficient takes advantage of the more precise response envelope estimation provided by the kernel convolution method. The h-coefficient quantizes the smoothed response envelope and calculates the probability of a response of a given shape occurring by chance. We tested the efficacy of the h-coefficient in a large data set of Monte Carlo simulated smoothed peristimulus time histograms with varying response amplitudes, response durations, trial numbers, and baseline firing rates. Across all these conditions, the h-coefficient significantly outperformed more classical classifiers, with a mean false alarm rate of 0.004 and a mean hit rate of 0.494. We also tested the h-coefficient's performance in a set of neuronal responses recorded in humans. The algorithm behind the h-coefficient provides various opportunities for further adaptation and the flexibility to target specific parameters in a given data set. Our findings confirm that the h-coefficient can provide a conservative and powerful tool for the analysis of peristimulus time histograms with great potential for future development. PMID:25475352
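    The kernel-convolution step the study builds on can be sketched directly: each spike is replaced by a Gaussian kernel, and the sum gives a continuous firing-rate estimate. A minimal illustration (the h-coefficient itself is not reproduced here; the kernel width `sigma` is an arbitrary choice):

```python
import numpy as np

def smoothed_psth(spike_times, t_grid, sigma=0.05):
    """Kernel-smoothed peristimulus time histogram: each spike is replaced
    by a unit-area Gaussian of width sigma, and the kernels are summed to
    give a continuous estimate of the response envelope."""
    rate = np.zeros_like(t_grid, dtype=float)
    for s in spike_times:
        rate += np.exp(-0.5 * ((t_grid - s) / sigma) ** 2)
    rate /= sigma * np.sqrt(2.0 * np.pi)      # each kernel integrates to 1
    return rate
```

    Because each kernel integrates to one, the area under the curve equals the spike count, so the envelope can be read as an instantaneous firing rate once divided by the number of trials.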

  5. Empirical Histograms in Item Response Theory with Ordinal Data

    ERIC Educational Resources Information Center

    Woods, Carol M.

    2007-01-01

    The purpose of this research is to describe, test, and illustrate a new implementation of the empirical histogram (EH) method for ordinal items. The EH method involves the estimation of item response model parameters simultaneously with the approximation of the distribution of the random latent variable (theta) as a histogram. Software for the EH…

  6. An automated blood vessel segmentation algorithm using histogram equalization and automatic threshold selection.

    PubMed

    Saleh, Marwan D; Eswaran, C; Mueen, Ahmed

    2011-08-01

    This paper focuses on the detection of retinal blood vessels, which play a vital role in reducing proliferative diabetic retinopathy and preventing the loss of visual capability. The proposed algorithm, which takes advantage of powerful preprocessing techniques such as contrast enhancement and thresholding, offers an automated segmentation procedure for retinal blood vessels. To evaluate its performance, experiments were conducted on 40 images collected from the DRIVE database. The results show that the proposed algorithm performs better than other known algorithms in terms of accuracy. Furthermore, being simple and easy to implement, the proposed algorithm is best suited for fast processing applications.
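    The two preprocessing ingredients named in the title, histogram equalization and automatic threshold selection, can each be sketched in a few lines of numpy. Otsu's between-class-variance criterion is used below as a representative automatic threshold; the paper's exact selection rule may differ:

```python
import numpy as np

def equalize(img):
    """Global histogram equalization for an 8-bit image: map each grey
    level through the image's own cumulative distribution function."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum() / img.size
    return (255 * cdf[img]).astype(np.uint8)

def otsu_threshold(img):
    """Automatic threshold maximising the between-class variance."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    w = p.cumsum()                      # class-0 probability up to level k
    m = (p * np.arange(256)).cumsum()   # cumulative mean up to level k
    mt = m[-1]                          # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        var_b = (mt * w - m) ** 2 / (w * (1.0 - w))
    return int(np.nanargmax(var_b))
```

    Equalization stretches low-contrast vessel detail before thresholding, which is why the two steps appear together in this kind of pipeline.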

  7. Reversible watermarking based on invariant image classification and dynamical error histogram shifting.

    PubMed

    Pan, W; Coatrieux, G; Cuppens, N; Cuppens, F; Roux, Ch

    2011-01-01

    In this article, we present a novel reversible watermarking scheme. Its originality lies in identifying parts of the image that can be watermarked additively with the better adapted of two lossless modulations: Pixel Histogram Shifting (PHS) or Dynamical Error Histogram Shifting (DEHS). This classification process makes use of a reference image derived from the image itself, a prediction of it, which has the property of being invariant to the watermark addition. In that way, the watermark embedder and the reader remain synchronized through this reference image. DEHS is also an original contribution of this work: it shifts prediction errors between the image and its reference, dynamically taking account of the local specificities of the image. Conducted experiments, on different medical image test sets issued from different modalities and on some natural images, show that our method can insert more data with lower distortion than the most recent and efficient methods in the literature.
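    The basic histogram-shifting mechanism underlying both PHS and DEHS can be illustrated on a pixel array: shift all values above the peak bin up by one to free the neighbouring bin, then absorb one payload bit per peak pixel. This toy version ignores overflow handling and the classification/prediction machinery of the actual scheme:

```python
import numpy as np

def hs_embed(pixels, bits, peak):
    """Histogram shifting: values above `peak` move up by one to empty the
    bin peak+1, then each peak-valued pixel encodes one payload bit."""
    out = pixels.astype(int).copy()
    out[out > peak] += 1                      # open an empty bin at peak + 1
    idx = np.flatnonzero(pixels == peak)      # capacity = count of peak bin
    out[idx[:len(bits)]] += np.asarray(bits, dtype=int)
    return out

def hs_extract(marked, peak, n_bits):
    """Recover the payload bits and losslessly restore the cover pixels."""
    bits = (marked[np.isin(marked, (peak, peak + 1))][:n_bits] == peak + 1)
    restored = marked.copy()
    restored[restored > peak] -= 1            # undo the shift
    return bits.astype(int).tolist(), restored
```

    Reversibility holds because the shift is a bijection on the grey levels above the peak, so the reader can undo it exactly after reading the bits.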

  8. Size histograms of gold nanoparticles measured by gravitational sedimentation.

    PubMed

    Alexander, Colleen M; Goodisman, Jerry

    2014-03-15

    Sedimentation curves of gold nanoparticles in water were obtained by measuring the optical density of a suspension over time. The results are not subject to sampling errors and refer to the particles in situ. Curves obtained simultaneously at several wavelengths were analyzed together to derive the size histogram of the sedimenting particles. The bins in the histogram were 5 nm wide and centered at diameters 60, 65, …, 110 nm. To get the histogram, we weighted previously calculated solutions to the Mason-Weaver sedimentation-diffusion equation for various particle diameters with absorption/scattering coefficients and size (diameter) abundances {c(j)}, and found the {c(j)} which gave the best fit to all the theoretical sedimentation curves. The effects of changing the number of bins and the wavelengths used were studied. Going to smaller bins would mean determining more parameters and would require more wavelengths. The histograms derived from sedimentation agreed quite well in general with the histogram derived from TEM. Differences found for the smallest particle diameters are partly due to statistical fluctuations (TEM found only 1-2 particles out of 10^3 with these diameters). More important is that the TEM histogram indicates 12% of the particles have diameters of 75±2.5 nm, and the sedimentation histogram shows none. We show that this reflects the difference between the particles in situ, which possess a low-density shell about 1 nm thick, and the bare particles on the TEM stage. Correcting for this makes agreement between the two histograms excellent. Comparing sedimentation-derived with TEM-derived histograms thus shows differences between the particles in situ and on the TEM stage.
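    The fitting step described above, weighting theoretical sedimentation curves to match the measured optical density, reduces to a linear least-squares problem. The sketch below uses random stand-in basis curves rather than real Mason-Weaver solutions, and plain least squares where the actual analysis would constrain the abundances to be non-negative and fit several wavelengths jointly:

```python
import numpy as np

# Columns of B play the role of precomputed Mason-Weaver sedimentation
# curves, one per diameter bin; od is the optical density measured over
# time. The abundances c that best reproduce od form the size histogram.
rng = np.random.default_rng(1)
n_times, n_bins = 200, 11
B = rng.uniform(0.1, 1.0, (n_times, n_bins))    # stand-in basis curves
c_true = rng.uniform(0.0, 1.0, n_bins)          # true bin abundances
od = B @ c_true                                 # noise-free observation
c_fit, *_ = np.linalg.lstsq(B, od, rcond=None)  # recover the histogram
```

    With noisy multi-wavelength data, the same structure is simply stacked: one block of rows per wavelength, each with its own absorption/scattering coefficients, sharing the single abundance vector.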

  9. Approximate Splitting for Ensembles of Trees using Histograms

    SciTech Connect

    Kamath, C; Cantu-Paz, E; Littau, D

    2001-09-28

    Recent work in classification indicates that significant improvements in accuracy can be obtained by growing an ensemble of classifiers and having them vote for the most popular class. Implicit in many of these techniques is the concept of randomization that generates different classifiers. In this paper, we focus on ensembles of decision trees that are created using a randomized procedure based on histograms. Techniques such as histograms, which discretize continuous variables, have long been used in classification to convert the data into a form suitable for processing and to reduce the compute time. The approach combines the ideas behind discretization through histograms and randomization in ensembles to create decision trees by randomly selecting a split point in an interval around the best bin boundary in the histogram. The experimental results with public domain data show that ensembles generated using this approach are competitive in accuracy and superior in computational cost to other ensemble techniques such as boosting and bagging.
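    The split-selection idea above (score histogram bin boundaries, then randomize within an interval around the best one) can be sketched as follows; the Gini-impurity scoring and parameter names are illustrative choices, not the paper's exact procedure:

```python
import numpy as np

def histogram_split(feature, labels, n_bins=16, rng=None):
    """Choose a decision-tree split: score every histogram bin boundary by
    weighted Gini impurity, then draw the actual split point at random
    from the bin-width interval around the best boundary."""
    if rng is None:
        rng = np.random.default_rng()
    edges = np.histogram_bin_edges(feature, bins=n_bins)
    classes = np.unique(labels)
    best_b, best_gini = None, np.inf
    for b in edges[1:-1]:                       # interior boundaries only
        gini = 0.0
        for side in (labels[feature <= b], labels[feature > b]):
            if len(side):
                p = np.array([np.mean(side == c) for c in classes])
                gini += len(side) / len(labels) * (1.0 - (p ** 2).sum())
        if gini < best_gini:
            best_gini, best_b = gini, b
    half = (edges[1] - edges[0]) / 2.0
    return rng.uniform(best_b - half, best_b + half)    # randomized split
```

    Calling this with different random streams yields slightly different trees from the same data, which is the source of diversity the ensemble exploits while keeping the histogram's computational savings.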

  10. Frequency distribution histograms for the rapid analysis of data

    NASA Technical Reports Server (NTRS)

    Burke, P. V.; Bullen, B. L.; Poff, K. L.

    1988-01-01

    The mean and standard error are good representations for the response of a population to an experimental parameter and are frequently used for this purpose. Frequency distribution histograms show, in addition, responses of individuals in the population. Both the statistics and a visual display of the distribution of the responses can be obtained easily using a microcomputer and available programs. The type of distribution shown by the histogram may suggest different mechanisms to be tested.

  11. Bin recycling strategy for improving the histogram precision on GPU

    NASA Astrophysics Data System (ADS)

    Cárdenas-Montes, Miguel; Rodríguez-Vázquez, Juan José; Vega-Rodríguez, Miguel A.

    2016-07-01

    A histogram is an easily comprehensible way to present data and analyses. In the current scientific context, with access to large volumes of data, the processing time for building histograms has increased dramatically. For this reason, parallel construction is necessary to alleviate the impact of processing time on analysis activities. In this scenario, GPU computing is becoming widely used to reduce the processing time of histogram construction to affordable levels. Alongside this reduction of processing time, implementations are stressed on bin-count accuracy: accuracy issues due to the particularities of an implementation are not usually taken into consideration when building histograms with very large data sets. In this work, a bin recycling strategy for an accuracy-aware implementation of histogram construction on GPU is presented. To evaluate the approach, the strategy was applied to the computation of the three-point angular correlation function, a relevant function in cosmology for the study of the large-scale structure of the Universe. As a consequence of the study, a high-accuracy implementation for histogram construction on GPU is proposed.
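    One concrete bin-count accuracy hazard of the kind discussed above is floating-point saturation of counters; the sketch below demonstrates it and shows a generic two-level (partial-histogram) accumulation that avoids it. This illustrates the general accuracy problem, not the paper's specific bin recycling strategy:

```python
import numpy as np

# A float32 bin counter silently saturates: past 2**24 an increment of 1
# no longer changes the stored value, so bins of huge histograms go wrong.
c = np.float32(2 ** 24)
saturated = (c + np.float32(1.0) == c)        # True: the increment is lost

def chunked_histogram(data, bins, chunk=10 ** 5):
    """Accumulate bounded-size partial histograms, then reduce them in
    64-bit integers so no bin count can be lost."""
    total = np.zeros(len(bins) - 1, dtype=np.int64)
    for i in range(0, len(data), chunk):
        part, _ = np.histogram(data[i:i + chunk], bins=bins)
        total += part                          # exact 64-bit reduction
    return total
```

    On a GPU, the "chunks" map naturally onto per-block shared-memory histograms that are reduced in a final pass; bounding each partial count is what keeps fast low-precision accumulation safe.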

  12. Adaptive VFH

    NASA Astrophysics Data System (ADS)

    Odriozola, Iñigo; Lazkano, Elena; Sierra, Basi

    2011-10-01

    This paper investigates the improvement of the Vector Field Histogram (VFH) local planning algorithm for mobile robot systems. The Adaptive Vector Field Histogram (AVFH) algorithm has been developed to improve the effectiveness of the traditional VFH path-planning algorithm by overcoming the side effects of using static parameters. The new algorithm permits the adaptation of the planning parameters to the different types of areas in an environment. Genetic Algorithms are used to fit the best VFH parameters to each sector type and, afterwards, every section of the map is labelled with the sector type that best represents it. The Player/Stage simulation platform was chosen for running all sorts of tests and proving the new algorithm's adequacy. Even though there is still much work to be carried out, the developed algorithm showed good navigation properties and turned out to be smoother and more effective than the traditional VFH algorithm.
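    A compact sketch of the underlying VFH machinery that the AVFH work adapts: obstacle cells vote into angular sectors to form a polar density histogram, and low-density sectors become candidate headings. The sector count and the `a`, `b` decay constants are exactly the kind of static parameters AVFH tunes per area type; the values here are illustrative:

```python
import numpy as np

def polar_histogram(obstacles, robot=(0.0, 0.0), n_sectors=72, a=1.0, b=0.25):
    """VFH-style polar obstacle density: every obstacle cell votes into its
    angular sector with a magnitude that decays with distance."""
    hist = np.zeros(n_sectors)
    for x, y in obstacles:
        dx, dy = x - robot[0], y - robot[1]
        sector = int(np.degrees(np.arctan2(dy, dx)) % 360.0) * n_sectors // 360
        hist[sector] += max(a - b * np.hypot(dx, dy), 0.0)
    return hist

def candidate_sectors(hist, threshold=0.5):
    """Sectors whose obstacle density is low enough to steer through."""
    return np.flatnonzero(hist < threshold)
```

    The final heading is then chosen among the candidate sectors, typically the one closest to the goal direction; adapting `threshold` and the decay constants to the local environment is where the genetic tuning comes in.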

  13. The life cycle of the amoeboid alga Synchroma grande (Synchromophyceae, Heterokontophyta)--highly adapted yet equally equipped for rapid diversification in benthic habitats.

    PubMed

    Koch, C; Brumme, B; Schmidt, M; Flieger, K; Schnetter, R; Wilhelm, C

    2011-09-01

    Synchroma grande (Synchromophyceae, Heterokontophyta) is a marine amoeboid alga, which was isolated from a benthic habitat. This species has sessile cell stages (amoeboid cells with lorica and cysts) and non-sessile cell stages (migrating and floating amoebae) during its life cycle. The different cell types and their transitions within the life cycle are described, as are their putative functions. Cell proliferation was observed only in cells attached to the substrate but not in free-floating or migrating cells. We also characterised the phagotrophy of the meroplasmodium in comparison to other amoeboid algae and the formation of the lorica. The functional adaptations of S. grande during its life cycle were compared to the cell stages of other amoeboid algae of the red and green chloroplast lineages. S. grande was found to be highly adapted to the benthic habitat. One sexual and two asexual reproductive strategies (haplo-diploid life cycle) support the ability of this species to achieve rapid diversification and high adaptivity in its natural habitat.

  14. Performance analysis of low-complexity adaptive frequency-domain equalization and MIMO signal processing for compensation of differential mode group delay in mode-division multiplexing communication systems using few-mode fibers

    NASA Astrophysics Data System (ADS)

    Weng, Yi; He, Xuan; Pan, Zhongqi

    2016-02-01

    Mode-division multiplexing (MDM) transmission systems utilizing few-mode fibers (FMF) have been intensively explored to sustain continuous traffic growth. The key challenges of MDM systems are inter-modal crosstalk due to random mode coupling (RMC) and largely accumulated differential mode group delay (DMGD), which hinders mode-demultiplexer implementation. Adaptive multi-input multi-output (MIMO) frequency-domain equalization (FDE) can dynamically compensate DMGD using digital signal processing (DSP) algorithms. The frequency-domain least-mean-squares (FD-LMS) algorithm has been universally adopted for high-speed MDM communications, mainly for its relatively low computational complexity. However, a longer training sequence must be appended for FD-LMS to achieve faster convergence, which incurs prohibitively high system overhead and reduces overall throughput. In this paper, we propose a fast-convergent single-stage adaptive frequency-domain recursive least-squares (FD-RLS) algorithm with reduced complexity for DMGD compensation at MDM coherent receivers. A performance and complexity comparison of FD-RLS with the signal-PSD-dependent FD-LMS method and the conventional FD-LMS approach is performed in a 3000 km six-mode transmission system with 65 ps/km DMGD. We explore the convergence speed of the three adaptive algorithms, including the normalized mean-square error (NMSE) per fast Fourier transform (FFT) block at 14-30 dB OSNR. The fast convergence of FD-RLS is achieved at the expense of a slightly increased number of taps for the MIMO equalizers, and it can partially save the training-sequence overhead. Furthermore, we demonstrate that adaptive FD-RLS can also be used for chromatic dispersion (CD) compensation without increasing the filter tap length, thus prominently reducing the DSP implementation complexity for MDM systems.
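
    The structure that makes FD-LMS cheap is one complex tap per FFT bin, updated per block. A toy single-channel sketch is below (the paper's setting is MIMO over six modes, and its FD-RLS replaces the fixed scalar step with a recursive inverse-correlation update); the channel, block length, and step size are all synthetic.

```python
import numpy as np

def fd_lms_train(x_blocks, d_blocks, mu=0.05):
    """Toy frequency-domain LMS: one complex tap per FFT bin, updated per
    block as W <- W + mu * conj(X) * E, where E = D - W*X. This is the
    low-complexity per-bin structure FD-LMS exploits."""
    n = len(x_blocks[0])
    W = np.zeros(n, dtype=complex)
    for x, d in zip(x_blocks, d_blocks):
        X, D = np.fft.fft(x), np.fft.fft(d)
        E = D - W * X
        W = W + mu * np.conj(X) * E
    return W

rng = np.random.default_rng(1)
true_h = np.array([1.0, 0.5, 0.25, 0.0])   # hypothetical channel response
H = np.fft.fft(true_h)
blocks = [rng.standard_normal(4) for _ in range(500)]
# desired block = channel applied in the frequency domain (circular conv.)
desired = [np.fft.ifft(H * np.fft.fft(x)).real for x in blocks]
W = fd_lms_train(blocks, desired, mu=0.05)
```

    In this noiseless toy the per-bin taps converge to the channel's frequency response; the paper's point is that FD-RLS reaches this state in far fewer blocks, at some cost in per-block complexity.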

  15. Endoscopic ultrasound elastography strain histograms in the evaluation of patients with pancreatic masses

    PubMed Central

    Opačić, Dalibor; Rustemović, Nadan; Kalauz, Mirjana; Markoš, Pave; Ostojić, Zvonimir; Majerović, Matea; Ledinsky, Iva; Višnjić, Ana; Krznarić, Juraj; Opačić, Milorad

    2015-01-01

    AIM: To investigate the accuracy of the strain histogram endoscopic ultrasound (EUS)-based method for the diagnostic differentiation of patients with pancreatic masses. METHODS: In a prospective single center study, 149 patients were analyzed, 105 with pancreatic masses and 44 controls. Elastography images were recorded using commercially available ultrasound equipment in combination with EUS linear probes. Strain histograms (SHs) were calculated by machine integrated software in regions of interest and mean values of the strain histograms were expressed as Mode 1 (over the mass) and Mode 2 (over an adjacent part of pancreatic tissue, representing the reference area). The ratio between Mode 2 and Mode 1 was calculated later, representing a new variable, the strain histogram ratio. After the final diagnosis was established, two groups of patients were formed: a pancreatic cancer group with positive cytology achieved by fine needle aspiration puncture or histology after surgery (58 patients), and a mass-forming pancreatitis group with negative cytology and follow-up after 3 and 6 mo (47 patients). All statistical analyses were conducted in SPSS 14.0 (SPSS Inc., Chicago, IL, United States). RESULTS: Results were obtained with software for strain histograms with reversed hue scale (0 represents the hardest tissue structure and 255 the softest). Based on the receiver operating characteristics (ROC) curve coordinates, the cut-off point for Mode 1 was set at the value of 86. Values under the cut-off point indicated the presence of pancreatic malignancy. Mode 1 reached 100% sensitivity and 45% specificity with overall accuracy of 66% (95%CI: 61%-66%) in detection of pancreatic malignant tumors among the patients with pancreatic masses. The positive and negative predictive values were 54% and 100%, respectively. The cut-off for the new calculated variable, the SH ratio, was set at the value 1.153 based on the ROC curve coordinates. Values equal or above the cut-off value
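
    The quantities the study works with, Mode 1, Mode 2, and their ratio, are simple statistics of the elastography histograms, so the decision rule can be sketched directly. The regions of interest below are synthetic arrays standing in for machine-exported strain histograms; the cut-off values (86 and 1.153) are the ones reported in the abstract.

```python
import numpy as np

def strain_modes(mass_roi, reference_roi):
    """Mean strain-histogram values over two regions of interest on the
    reversed 0-255 hue scale used in the study (0 = hardest tissue,
    255 = softest), plus the Mode2/Mode1 'SH ratio'."""
    mode1 = float(np.mean(mass_roi))        # over the mass
    mode2 = float(np.mean(reference_roi))   # over adjacent pancreatic tissue
    return mode1, mode2, mode2 / mode1

def classify(mode1, cutoff=86):
    """Per the abstract, Mode 1 below the cut-off flags malignancy."""
    return "suspicious for malignancy" if mode1 < cutoff else "likely benign"

rng = np.random.default_rng(2)
hard_mass = rng.integers(20, 60, size=(32, 32))    # stiff -> low hue values
soft_ref = rng.integers(120, 200, size=(32, 32))
m1, m2, ratio = strain_modes(hard_mass, soft_ref)
```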

  16. Adapt

    NASA Astrophysics Data System (ADS)

    Bargatze, L. F.

    2015-12-01

    Active Data Archive Product Tracking (ADAPT) is a collection of software routines that permits one to generate XML metadata files to describe and register data products in support of the NASA Heliophysics Virtual Observatory VxO effort. ADAPT is also a philosophy. The ADAPT concept is to use any and all available metadata associated with scientific data to produce XML metadata descriptions in a consistent, uniform, and organized fashion to provide blanket access to the full complement of data stored on a targeted data server. In this poster, we present an application of ADAPT to describe all of the data products that are stored by using the Common Data File (CDF) format served out by the CDAWEB and SPDF data servers hosted at the NASA Goddard Space Flight Center. These data servers are the primary repositories for NASA Heliophysics data. For this purpose, the ADAPT routines have been used to generate data resource descriptions by using an XML schema named Space Physics Archive, Search, and Extract (SPASE). SPASE is the designated standard for documenting Heliophysics data products, as adopted by the Heliophysics Data and Model Consortium. The set of SPASE XML resource descriptions produced by ADAPT includes high-level descriptions of numerical data products, display data products, or catalogs and also includes low-level "Granule" descriptions. A SPASE Granule is effectively a universal access metadata resource; a Granule associates an individual data file (e.g. a CDF file) with a "parent" high-level data resource description, assigns a resource identifier to the file, and lists the corresponding access URL(s). The CDAWEB and SPDF file systems were queried to provide the input required by the ADAPT software to create an initial set of SPASE metadata resource descriptions. Then, the CDAWEB and SPDF data repositories were subsequently queried on a nightly basis and the CDF file lists were checked for any changes such as the occurrence of new, modified, or deleted

  17. Histogram bin width selection for time-dependent Poisson processes

    NASA Astrophysics Data System (ADS)

    Koyama, Shinsuke; Shinomoto, Shigeru

    2004-07-01

    In constructing a time histogram of the event sequences derived from a nonstationary point process, we wish to determine the bin width such that the mean squared error of the histogram from the underlying rate of occurrence is minimized. We find that the optimal bin widths obtained for a doubly stochastic Poisson process and a sinusoidally regulated Poisson process exhibit different scaling relations with respect to the number of sequences, time scale and amplitude of rate modulation, but both diverge under similar parametric conditions. This implies that under these conditions, no determination of the time-dependent rate can be made. We also apply the kernel method to these point processes, and find that the optimal kernels do not exhibit any critical phenomena, unlike the time histogram method.
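
    The abstract analyses how the MISE-optimal bin width scales but does not restate the selection rule. A closely related, practical rule from the same group (Shimazaki and Shinomoto) picks the width minimizing the cost C(w) = (2*mean - var) / w^2 over the bin counts; the sketch below uses that rule on synthetic event times, so treat the specific rule and data as illustrative rather than the paper's own derivation.

```python
import numpy as np

def optimal_bin_width(events, widths, t_max):
    """Choose the time-histogram bin width minimizing the MISE-based cost
    C(w) = (2*mean - var)/w**2, where mean and var are taken over the bin
    counts of the histogram built with width w."""
    best_w, best_cost = None, np.inf
    for w in widths:
        edges = np.arange(0.0, t_max + w, w)
        counts, _ = np.histogram(events, bins=edges)
        cost = (2 * counts.mean() - counts.var()) / w ** 2
        if cost < best_cost:
            best_w, best_cost = w, cost
    return best_w

rng = np.random.default_rng(3)
# nonstationary toy data on [0, 10): dense early, sparse late
events = np.concatenate([rng.uniform(0, 2, 800), rng.uniform(2, 10, 200)])
w_opt = optimal_bin_width(events, widths=[0.1, 0.2, 0.5, 1.0, 2.0], t_max=10.0)
```

    The paper's divergence result corresponds to regimes (few sequences, weak rate modulation) where no finite width minimizes this kind of cost.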

  18. Face recognition with histograms of fractional differential gradients

    NASA Astrophysics Data System (ADS)

    Yu, Lei; Ma, Yan; Cao, Qi

    2014-05-01

    It has been shown that fractional differentiation can enhance edge information and nonlinearly preserve textural detail in an image. This paper investigates its ability for face recognition and presents a local descriptor called histograms of fractional differential gradients (HFDG) to extract facial visual features. HFDG encodes a face image into gradient patterns using multiorientation fractional differential masks, from which histograms of gradient directions are computed as the face representation. Experimental results on Yale, face recognition technology (FERET), Carnegie Mellon University pose, illumination, and expression (CMU PIE), and A. Martinez and R. Benavente (AR) databases validate the feasibility of the proposed method and show that HFDG outperforms local binary patterns (LBP), histograms of oriented gradients (HOG), enhanced local directional patterns (ELDP), and Gabor feature-based methods.
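
    The pooling step HFDG shares with HOG-style descriptors, a magnitude-weighted histogram of gradient directions, can be sketched with plain gradients. The fractional-differential masks that distinguish HFDG are omitted here, so this shows only the common histogramming stage on a synthetic image.

```python
import numpy as np

def gradient_direction_histogram(img, n_bins=8):
    """Magnitude-weighted histogram of gradient directions over an image,
    L1-normalised: the descriptor-pooling step of HOG/HFDG-style methods
    (HFDG would first apply multi-orientation fractional-differential
    masks instead of plain differences)."""
    gy, gx = np.gradient(img.astype(float))
    angles = np.arctan2(gy, gx) % (2 * np.pi)
    mags = np.hypot(gx, gy)
    hist, _ = np.histogram(angles, bins=n_bins, range=(0, 2 * np.pi),
                           weights=mags)
    s = hist.sum()
    return hist / s if s > 0 else hist

img = np.tile(np.arange(16.0), (16, 1))   # horizontal ramp: gx > 0, gy = 0
h = gradient_direction_histogram(img)
```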

  19. Evaluating Climate Models with MISR Joint Histograms of Cloud Properties

    NASA Astrophysics Data System (ADS)

    Ackerman, T. P.; Marchand, R.; Hillman, B. R.

    2009-12-01

    Following the approach pioneered by ISCCP, joint histograms of cloud optical depth and cloud-top height (pressure) are being produced by MISR and MODIS for the evaluation of climate models. There are significant differences among the histograms due to differences in sensors and retrieval algorithms. These differences provide insight into the properties of the observed cloud fields. MISR retrievals of stereo cloud height, in particular, provide a unique perspective on the distribution of cloud heights: MISR, with its stereo imaging, is more effective at identifying low clouds and retrieving their height, while MODIS is a more reliable detector of high clouds. In analogy to the ISCCP simulator, cloud fields generated in global climate models can be processed through a MISR simulator, which we have developed, to produce joint histograms of model clouds. Comparing observed joint histograms with simulated joint histograms allows us to determine where the model reproduces clouds well and where it does not. We have applied this technique to results from the Multiscale Modeling Framework (MMF; also called the "superparameterization" model) and are currently applying it to the NCAR Community Atmosphere Model and the GFDL AM2 model. The MMF computes cloud properties using an embedded 2D cloud-resolving model (CRM) in each grid square of the large-scale climate model. We have run versions of the MMF with CRM horizontal resolutions of 4 km and 1 km and with 26 and 52 vertical levels in order to explore the effect of resolution on model clouds. Comparison with MISR joint histograms shows that the model run with 52 levels at 1 km resolution provides an improved simulation, but low cloud amounts are still considerably lower than observed. We discuss possible solutions to this problem. Evaluations of the CAM and AM2 models are in progress and will be presented.
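
    The object being compared, a joint histogram of optical depth and cloud-top height, is just a normalised 2D histogram; a simulator produces the same object from model fields so the two are comparable cell by cell. The sketch below uses synthetic retrievals; the tau bin edges echo the ISCCP convention, while the height bins are illustrative.

```python
import numpy as np

def joint_histogram(optical_depth, cloud_top_km, tau_edges, height_edges):
    """ISCCP/MISR-style joint histogram: fraction of cloudy scenes falling
    in each (optical depth, cloud-top height) cell."""
    counts, _, _ = np.histogram2d(optical_depth, cloud_top_km,
                                  bins=[tau_edges, height_edges])
    return counts / counts.sum()

rng = np.random.default_rng(4)
tau = rng.lognormal(1.0, 0.8, 5000)      # synthetic optical depths
cth = rng.uniform(0.5, 16.0, 5000)       # synthetic cloud-top heights, km
jh = joint_histogram(tau, cth,
                     tau_edges=[0, 1.3, 3.6, 9.4, 23, 60, 1000],
                     height_edges=[0, 2, 4, 7, 10, 13, 17])
```

    Model evaluation then amounts to differencing (or scoring) the observed and simulated matrices, which localises errors to cloud regimes such as "thin low cloud" rather than a single mean bias.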

  20. Remote logo detection using angle-distance histograms

    NASA Astrophysics Data System (ADS)

    Youn, Sungwook; Ok, Jiheon; Baek, Sangwook; Woo, Seongyoun; Lee, Chulhee

    2016-05-01

    Among all the various computer vision applications, automatic logo recognition has drawn great interest from industry as well as various academic institutions. In this paper, we propose an angle-distance map, which we used to develop a robust logo detection algorithm. The proposed angle-distance histogram is invariant against scale and rotation. The proposed method first used shape information and color characteristics to find the candidate regions and then applied the angle-distance histogram. Experiments show that the proposed method detected logos of various sizes and orientations.
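
    The abstract does not spell out the histogram's normalisation, but the claimed scale and rotation invariance suggests a construction like the following: angles and distances of shape points about their centroid, with distances divided by their maximum. This is an illustrative variant, not necessarily the authors' exact formulation.

```python
import numpy as np

def angle_distance_histogram(points, n_angle=18, n_dist=8):
    """2D histogram of (angle, distance) of shape points about their
    centroid. Dividing distances by the maximum makes the map scale
    invariant; rotating the shape only shifts the angle axis cyclically."""
    pts = np.asarray(points, dtype=float)
    c = pts.mean(axis=0)
    d = np.linalg.norm(pts - c, axis=1)
    ang = np.arctan2(pts[:, 1] - c[1], pts[:, 0] - c[0]) % (2 * np.pi)
    h, _, _ = np.histogram2d(ang, d / d.max(),
                             bins=[n_angle, n_dist],
                             range=[[0, 2 * np.pi], [0, 1]])
    return h / h.sum()

square = [(1, 0), (0, 1), (-1, 0), (0, -1)]
h1 = angle_distance_histogram(square)
scaled = [(5, 0), (0, 5), (-5, 0), (0, -5)]   # same shape, 5x larger
h2 = angle_distance_histogram(scaled)
```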

  1. A Novel Method for the Evaluation of Uncertainty in Dose-Volume Histogram Computation

    SciTech Connect

    Henriquez, Francisco Cutanda; Castrillon, Silvia Vargas

    2008-03-15

    Purpose: Dose-volume histograms (DVHs) are a useful tool in state-of-the-art radiotherapy treatment planning, and it is essential to recognize their limitations. Even after a specific dose-calculation model is optimized, dose distributions computed by using treatment-planning systems are affected by several sources of uncertainty, such as algorithm limitations, measurement uncertainty in the data used to model the beam, and residual differences between measured and computed dose. This report presents a novel method to take them into account. Methods and Materials: To take into account the effect of associated uncertainties, a probabilistic approach using a new kind of histogram, a dose-expected volume histogram, is introduced. The expected value of the volume in the region of interest receiving an absorbed dose equal to or greater than a certain value is found by using the probability distribution of the dose at each point. A rectangular probability distribution is assumed for this point dose, and a formulation that accounts for uncertainties associated with point dose is presented for practical computations. Results: This method is applied to a set of DVHs for different regions of interest in 6 brain, 8 lung, 8 pelvis, and 6 prostate patients planned for intensity-modulated radiation therapy. Conclusions: Results show a greater effect on planning target volume coverage than on organs at risk. In cases of steep DVH gradients, such as planning target volumes, this new method shows the largest differences with the corresponding DVH; thus, the effect of the uncertainty is larger.
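
    The paper's stated assumption, a rectangular (uniform) probability distribution around each computed point dose, makes the expected-volume computation explicit: for each dose threshold d, average P(D_i >= d) over points. A minimal sketch under that assumption, with toy point doses:

```python
import numpy as np

def expected_volume_histogram(doses, half_width, dose_axis):
    """Dose-expected-volume histogram: each point dose is modelled as a
    rectangular distribution of the given half-width around its computed
    value, and the expected fractional volume receiving >= d is the mean
    over points of P(D_i >= d)."""
    doses = np.asarray(doses, dtype=float)
    lo, hi = doses - half_width, doses + half_width
    evh = []
    for d in dose_axis:
        # P(D >= d) for a uniform distribution on [lo, hi]
        p = np.clip((hi - d) / (hi - lo), 0.0, 1.0)
        evh.append(p.mean())
    return np.array(evh)

doses = np.array([10.0, 20.0, 30.0])      # toy point doses, Gy
axis = np.array([0.0, 15.0, 25.0, 40.0])
evh = expected_volume_histogram(doses, half_width=2.0, dose_axis=axis)
```

    With half_width -> 0 this reduces to the ordinary cumulative DVH; the difference between the two is largest where the DVH gradient is steep, matching the paper's conclusion about target volumes.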

  2. Histogram-Based Calibration Method for Pipeline ADCs

    PubMed Central

    Son, Hyeonuk; Jang, Jaewon; Kim, Heetae; Kang, Sungho

    2015-01-01

    Measurement and calibration of an analog-to-digital converter (ADC) using a histogram-based method requires a large volume of data and a long test duration, especially for a high resolution ADC. A fast and accurate calibration method for pipelined ADCs is proposed in this research. The proposed calibration method composes histograms through the outputs of each stage and calculates error sources. The digitized outputs of a stage are influenced directly by the operation of the prior stage, so the results of the histogram provide the information of errors in the prior stage. The composed histograms reduce the required samples and thus calibration time being implemented by simple modules. For 14-bit resolution pipelined ADC, the measured maximum integral non-linearity (INL) is improved from 6.78 to 0.52 LSB, and the spurious-free dynamic range (SFDR) and signal-to-noise-and-distortion ratio (SNDR) are improved from 67.0 to 106.2dB and from 65.6 to 84.8dB, respectively. PMID:26070196
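
    For context, the standard end-to-end histogram (code density) test that the paper accelerates works as follows: with a full-scale ramp input every code should be hit equally often, so deviations of the code histogram from its mean give DNL and their running sum gives INL, both in LSB. This sketch shows only that baseline test on an ideal toy ADC, not the paper's per-stage composition.

```python
import numpy as np

def code_density_test(codes, n_bits):
    """Histogram-based linearity test for a ramp input: DNL_i is the
    relative deviation of code i's count from the mean count, and INL is
    the cumulative sum of DNL (both in LSB)."""
    hist = np.bincount(codes, minlength=2 ** n_bits).astype(float)
    dnl = hist / hist.mean() - 1.0
    inl = np.cumsum(dnl)
    return dnl, inl

# ideal 4-bit ADC fed a uniform ramp: every code equally likely
codes = np.repeat(np.arange(16), 100)
dnl, inl = code_density_test(codes, n_bits=4)
```

    The sample count this test needs grows steeply with resolution, which is exactly the cost the stage-wise histograms in the paper are designed to reduce.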

  5. Forget about equality.

    PubMed

    Powers, Madison

    1996-06-01

    Justice is widely thought to consist in equality. For many theorists, the central question has been: Equality of what? The author argues that the ideal of equality distorts practical reasoning and has deeply counterintuitive implications. Moreover, an alternative view of distributive justice can give a better account of what egalitarians should care about than can any of the competing ideals of equality.

  6. Theoretical considerations on the validity of the Stewart-Hamilton principle in measuring cycle-averaged flows via histogram of indicator in the pulsating compartment.

    PubMed

    Eterović, D; Dujić, Z

    1994-02-01

    It has been heuristically shown that the Stewart-Hamilton principle, adapted to external counting observables of the system indicator histogram, A(t), its cycle-averaged equilibrium count rate, A(equ), and the indicator volume of distribution in the body, V(body), is F/V(body) = A(equ) / ∫0^∞ A(t) dt, where F is the cycle-averaged cardiac output. Since the method lacks theoretical plausibility, it remained unclear whether it is an approximation and what conditions warrant its usability. This paper presents an exact derivation of the above equation. To fulfill it, generalizations of the stationary theory of indicator kinetics were set up that allowed for the conditions of pulsatile flows and volumes and the dependence of the distribution of transit times of indicator on the phase of the cardiac cycle. The assumptions utilized were that the tracer enters the compartment well mixed and convectively carried by the blood in concentrations that do not vary in the single cycle to a material extent. The method yields the cardiac output, even when the flow to a compartment is only a part of it, provided that the fraction of indicator that traversed the system equals the fraction of cardiac output that perfuses the compartment. It was shown that, when applied to a regurgitant ventricle, the method obtains the forward flow and that separate application of the method to each of the ventricles provides the theoretical basis for evaluation of the central-circulatory shunts. PMID:8177163
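
    Rearranged, the relation gives F = V(body) * A(equ) / ∫ A(t) dt, which is straightforward to evaluate numerically from a sampled count-rate curve. The curve, equilibrium rate, and volume below are entirely synthetic, chosen only to show the arithmetic.

```python
import numpy as np

def cardiac_output(A, t, A_equ, V_body):
    """Stewart-Hamilton principle in its external-counting form:
    F = V_body * A_equ / integral_0^inf A(t) dt,
    with the integral approximated by the trapezoidal rule over the
    sampled indicator histogram A(t)."""
    integral = np.sum((A[1:] + A[:-1]) / 2.0 * np.diff(t))
    return V_body * A_equ / integral

t = np.linspace(0.0, 60.0, 601)            # s
A = 1000.0 * np.exp(-t / 10.0)             # toy first-pass curve, counts/s
A_equ = 100.0                              # toy equilibrium count rate
F = cardiac_output(A, t, A_equ, V_body=5.0)   # flow in litres/s
```

    For this toy curve the integral is approximately 10000*(1 - e^-6) ≈ 9975 counts, giving F ≈ 0.05 L/s, i.e. about 3 L/min.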

  7. Fingerprint image segmentation based on multi-features histogram analysis

    NASA Astrophysics Data System (ADS)

    Wang, Peng; Zhang, Youguang

    2007-11-01

    An effective fingerprint image segmentation based on multi-feature histogram analysis is presented. We extract a new feature, together with three other features, to segment fingerprints. Two of these four features are each the reciprocal of one of the other two, so the features fall into two groups. The histograms of these two features are calculated respectively to determine which feature group is used to segment the target fingerprint. The features can also divide fingerprints into two classes of high and low quality. Experimental results show that our algorithm classifies foreground and background effectively with lower computational cost, and it can also reduce the number of pseudo-minutiae detected and improve the performance of AFIS.

  8. Retrospective Reconstructions of Active Bone Marrow Dose-Volume Histograms

    SciTech Connect

    Veres, Cristina; Allodji, Rodrigue S.; Llanas, Damien; Vu Bezin, Jérémi; Chavaudra, Jean; Mège, Jean Pierre; Lefkopoulos, Dimitri; Quiniou, Eric; Deutsh, Eric; Vathaire, Florent de; Diallo, Ibrahima

    2014-12-01

    Purpose: To present a method for calculating dose-volume histograms (DVHs) to the active bone marrow (ABM) of patients who had undergone radiation therapy (RT) and subsequently developed leukemia. Methods and Materials: The study focuses on 15 patients treated between 1961 and 1996. Whole-body RT planning computed tomographic (CT) data were not available. We therefore generated representative whole-body CTs similar to patient anatomy. In addition, we developed a method enabling us to obtain information on the density distribution of ABM all over the skeleton. Dose could then be calculated in a series of points distributed all over the skeleton in such a way that their local density reflected age-specific data for ABM distribution. Dose to particular regions and dose-volume histograms of the entire ABM were estimated for all patients. Results: Depending on patient age, the total number of dose calculation points generated ranged from 1,190,970 to 4,108,524. The average dose to ABM ranged from 0.3 to 16.4 Gy. Dose-volume histogram analysis showed that the median doses (D_50%) ranged from 0.06 to 12.8 Gy. We also evaluated the inhomogeneity of individual patient ABM dose distribution according to clinical situation. It was evident that the coefficient of variation of the dose for the whole ABM ranged from 1.0 to 5.7, which means that the standard deviation could be more than 5 times higher than the mean. Conclusions: For patients with available long-term follow-up data, our method provides reconstruction of dose-volume data comparable to detailed dose calculations, which have become standard in modern CT-based 3-dimensional RT planning. Our strategy of using dose-volume histograms offers new perspectives to retrospective epidemiological studies.
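
    Once dose points are scattered with local density proportional to ABM density, the cumulative DVH reduces to a fraction-of-points computation: the "volume" at threshold d is the fraction of points receiving at least d. A minimal sketch with toy point doses:

```python
import numpy as np

def cumulative_dvh(point_doses, dose_axis):
    """Cumulative DVH from dose-calculation points whose spatial density
    already encodes the tissue (here ABM) distribution: the fractional
    volume at threshold d is simply the fraction of points with dose >= d."""
    doses = np.asarray(point_doses, dtype=float)
    return np.array([(doses >= d).mean() for d in dose_axis])

# toy point doses: most of the marrow far from the field, a little in-field
doses = np.concatenate([np.full(700, 0.5),
                        np.full(200, 5.0),
                        np.full(100, 15.0)])
axis = np.array([0.0, 1.0, 10.0, 20.0])
dvh = cumulative_dvh(doses, axis)
```

    Summary quantities in the paper follow directly: D_50% is the dose where this curve crosses 0.5, and the coefficient of variation is the standard deviation of the point doses over their mean.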

  9. Conformational thermodynamics of biomolecular complexes: The histogram-based method

    NASA Astrophysics Data System (ADS)

    Das, Amit; Sikdar, Samapan; Ghosh, Mahua; Chakrabarti, J.

    2015-09-01

    Conformational changes in biomacromolecules govern majority of biological processes. Complete characterization of conformational contributions to thermodynamics of complexation of biomacromolecules has been challenging. Although, advances in NMR relaxation experiments and several computational studies have revealed important aspects of conformational entropy changes, efficient and large-scale estimations still remain an intriguing facet. Recent histogram-based method (HBM) offers a simple yet rigorous route to estimate both conformational entropy and free energy changes from same set of histograms in an efficient manner. The HBM utilizes the power of histograms which can be generated as accurately as desired from an arbitrarily large sample space from atomistic simulation trajectories. Here we discuss some recent applications of the HBM, using dihedral angles of amino acid residues as conformational variables, which provide good measure of conformational thermodynamics of several protein-peptide complexes, obtained from NMR, metal-ion binding to an important metalloprotein, interfacial changes in protein-protein complex and insight to protein function, coupled with conformational changes. We conclude the paper with a few future directions worth pursuing.
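
    The HBM's core step, turning histograms over conformational variables (dihedral angles) into entropies, can be sketched for a single torsion with S = -k_B Σ p_i ln p_i; the conformational entropy change on binding is then a difference of such terms summed over dihedrals. The bin width and the synthetic "rigid" and "floppy" angle samples below are illustrative, with k_B set to 1.

```python
import numpy as np

def dihedral_entropy(angles_deg, bin_width=10.0):
    """Entropy (in units of k_B) of one torsional degree of freedom from
    a histogram over [-180, 180): S = -sum_i p_i ln p_i."""
    edges = np.arange(-180.0, 180.0 + bin_width, bin_width)
    counts, _ = np.histogram(angles_deg, bins=edges)
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(5)
rigid = rng.normal(60.0, 2.0, 20000)        # dihedral locked near 60 deg
floppy = rng.uniform(-180.0, 180.0, 20000)  # freely rotating dihedral
dS = dihedral_entropy(rigid) - dihedral_entropy(floppy)
```

    As the abstract notes, the appeal of the approach is that such histograms can be made as accurate as desired simply by drawing more frames from the simulation trajectory.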

  10. Slope histogram distribution-based parametrisation of Martian geomorphic features

    NASA Astrophysics Data System (ADS)

    Balint, Zita; Székely, Balázs; Kovács, Gábor

    2014-05-01

    The application of geomorphometric methods on the large Martian digital topographic datasets paves the way to analyse the Martian areomorphic processes in more detail. One of the numerous methods is the analysis is to analyse local slope distributions. To this implementation a visualization program code was developed that allows to calculate the local slope histograms and to compare them based on Kolmogorov distance criterion. As input data we used the digital elevation models (DTMs) derived from HRSC high-resolution stereo camera image from various Martian regions. The Kolmogorov-criterion based discrimination produces classes of slope histograms that displayed using coloration obtaining an image map. In this image map the distribution can be visualized by their different colours representing the various classes. Our goal is to create a local slope histogram based classification for large Martian areas in order to obtain information about general morphological characteristics of the region. This is a contribution of the TMIS.ascrea project, financed by the Austrian Research Promotion Agency (FFG). The present research is partly realized in the frames of TÁMOP 4.2.4.A/2-11-1-2012-0001 high priority "National Excellence Program - Elaborating and Operating an Inland Student and Researcher Personal Support System convergence program" project's scholarship support, using Hungarian state and European Union funds and cofinances from the European Social Fund.

  11. Finding significantly connected voxels based on histograms of connection strengths

    NASA Astrophysics Data System (ADS)

    Kasenburg, Niklas; Pedersen, Morten Vester; Darkner, Sune

    2016-03-01

    We explore a new approach for structural connectivity based segmentations of subcortical brain regions. Connectivity based segmentations are usually based on fibre connections from a seed region to predefined target regions. We present a method for finding significantly connected voxels based on the distribution of connection strengths. Paths from seed voxels to all voxels in a target region are obtained from a shortest-path tractography. For each seed voxel we approximate the distribution with a histogram of path scores. We hypothesise that the majority of estimated connections are false-positives and that their connection strength is distributed differently from true-positive connections. Therefore, an empirical null-distribution is defined for each target region as the average normalized histogram over all voxels in the seed region. Single histograms are then tested against the corresponding null-distribution and significance is determined using the false discovery rate (FDR). Segmentations are based on significantly connected voxels and their FDR. In this work we focus on the thalamus and the target regions were chosen by dividing the cortex into a prefrontal/temporal zone, motor zone, somatosensory zone and a parieto-occipital zone. The obtained segmentations consistently show a sparse number of significantly connected voxels that are located near the surface of the anterior thalamus over a population of 38 subjects.
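
    The significance step the paper relies on, controlling the false discovery rate over many per-voxel tests, is the Benjamini-Hochberg procedure. The sketch below shows only that step on given p-values (in the paper each p-value would come from testing a seed voxel's score histogram against the region's empirical null); the p-values here are made up.

```python
import numpy as np

def benjamini_hochberg(p_values, q=0.05):
    """Benjamini-Hochberg step-up procedure: returns a boolean mask of
    discoveries at FDR level q. Rejects the k smallest p-values, where k
    is the largest i with p_(i) <= q * i / m."""
    p = np.asarray(p_values, dtype=float)
    m = p.size
    order = np.argsort(p)
    below = p[order] <= q * np.arange(1, m + 1) / m
    mask = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.flatnonzero(below))
        mask[order[:k + 1]] = True
    return mask

p_vals = np.array([0.001, 0.008, 0.039, 0.041, 0.20, 0.74, 0.9])
sig = benjamini_hochberg(p_vals, q=0.05)
```

    The segmentation then keeps exactly the voxels whose mask entry is True, which is why the paper's thalamic parcellations are sparse: only connections clearly stronger than the null survive.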

  12. Local Histograms for Classifying H&E Stained Tissues.

    PubMed

    Massar, M L; Bhagavatula, R; Fickus, M; Kovačević, J

    2010-01-01

    We introduce a rigorous mathematical theory for the analysis of local histograms, and consider the appropriateness of their use in the automated classification of textures commonly encountered in images of H&E stained tissues. We first discuss some of the many image features that pathologists indicate they use when classifying tissues, focusing on simple, locally-defined features that essentially involve pixel counting: the number of cells in a region of given size, the size of the nuclei within these cells, and the distribution of color within both. We then introduce a probabilistic, occlusion-based model for textures that exhibit these features, in particular demonstrating how certain tissue-similar textures can be built up from simpler ones. After considering the basic notions and properties of local histogram transforms, we then formally demonstrate that such transforms are natural tools for analyzing the textures produced by our model. In particular, we discuss how local histogram transforms can be used to produce numerical features that, when fed into mainstream classification schemes, mimic the baser aspects of a pathologist's thought process. PMID:24839388
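
    The local histogram transform itself is simple to state: at each pixel, the normalised histogram of the surrounding window becomes that pixel's feature vector. A direct (unoptimised) sketch on a synthetic two-texture image:

```python
import numpy as np

def local_histograms(img, radius, n_bins, lo=0, hi=256):
    """Local histogram transform: for each pixel, the L1-normalised
    histogram of the (2*radius+1)^2 window around it. The per-pixel
    histograms serve as features for downstream classifiers."""
    h, w = img.shape
    out = np.zeros((h, w, n_bins))
    padded = np.pad(img, radius, mode='edge')
    for i in range(h):
        for j in range(w):
            win = padded[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            hist, _ = np.histogram(win, bins=n_bins, range=(lo, hi))
            out[i, j] = hist / hist.sum()
    return out

img = np.zeros((8, 8), dtype=np.uint8)
img[:, 4:] = 200                  # two 'textures': dark left, bright right
lh = local_histograms(img, radius=1, n_bins=2)
```

    Pixels deep inside a texture get a pure histogram, while pixels near the boundary get a mixture, which is the occlusion behaviour the paper's texture model formalises.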

  13. Implementing a 3D histogram version of the Energy-Test in ROOT

    NASA Astrophysics Data System (ADS)

    Cohen, E. O.; Reid, I. D.; Piasetzky, E.

    2016-08-01

    Comparing simulation and data histograms is of interest in nuclear and particle physics experiments; however, the leading three-dimensional histogram comparison tool available in ROOT, the 3D Kolmogorov-Smirnov test, exhibits shortcomings. Throughout the following, we present and discuss the implementation of an alternative comparison test for three-dimensional histograms, based on the Energy-Test by Aslan and Zech.
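
    The underlying Aslan-Zech/Székely energy statistic between two point samples is E = 2·mean|x−y| − mean|x−x′| − mean|y−y′| over Euclidean distances; the histogram version the paper implements applies the same formula to bin centres weighted by bin contents. A point-sample sketch on synthetic 3D data:

```python
import numpy as np

def energy_statistic(x, y):
    """Energy statistic between two samples of d-dimensional points:
    E = 2*mean|x - y| - mean|x - x'| - mean|y - y'|. Values near zero
    indicate compatible samples; larger values indicate a difference."""
    x, y = np.atleast_2d(x), np.atleast_2d(y)

    def mean_dist(a, b):
        diff = a[:, None, :] - b[None, :, :]
        return np.mean(np.sqrt((diff ** 2).sum(axis=2)))

    return 2 * mean_dist(x, y) - mean_dist(x, x) - mean_dist(y, y)

rng = np.random.default_rng(6)
same_a = rng.standard_normal((200, 3))
same_b = rng.standard_normal((200, 3))     # same distribution as same_a
shifted = rng.standard_normal((200, 3)) + 2.0
e_same = energy_statistic(same_a, same_b)
e_diff = energy_statistic(same_a, shifted)
```

    Unlike the 3D Kolmogorov-Smirnov test, this statistic has no preferred binning axis ordering, which is one of the shortcomings the Energy-Test implementation addresses.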

  14. Science EQUALS Success.

    ERIC Educational Resources Information Center

    Cobb, Kitty B., Ed.; Conwell, Catherine R., Ed.

    The purpose of the EQUALS programs is to increase the interest and awareness that females and minorities have concerning mathematics and science related careers. This book, produced by an EQUALS program in North Carolina, contains 35 hands-on, discovery science activities that center around four EQUALS processes--problem solving, cooperative…

  15. Direct evaluation of multicomponent phase equilibria using flat-histogram methods.

    PubMed

    Errington, Jeffrey R; Shen, Vincent K

    2005-10-22

    We present a method for directly locating density-driven phase transitions in multicomponent systems. Phase coexistence conditions are determined through manipulation of a total density probability distribution evaluated over a density range that includes both coexisting phases. Saturation quantities are determined through appropriate averaging of density-dependent mean values of a given property of interest. We discuss how to implement the method in both the grand-canonical and isothermal-isobaric semigrand ensembles. Calculations can be conducted using any of the recently introduced flat-histogram techniques. Here, we combine the general algorithm with a transition-matrix approach to produce an efficient self-adaptive technique for determining multicomponent phase equilibrium properties. To assess the performance of the new method, we generate phase diagrams for a number of binary and ternary Lennard-Jones mixtures.
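
    The paper's transition-matrix variant is beyond a short sketch, but the plainest flat-histogram scheme it belongs to, Wang-Landau sampling, can be shown on a toy system whose density of states is known exactly: the number of heads among n coins. All parameters here (flatness criterion, final modification factor) are illustrative.

```python
import numpy as np

def wang_landau_coins(n=10, ln_f_final=1e-3, flat=0.8, seed=8):
    """Wang-Landau flat-histogram estimate of ln g(E) for E = number of
    heads among n coins. ln g is raised by ln_f wherever the walker sits
    until the visit histogram is flat; ln_f is then halved and the stage
    repeats. Returns ln g normalised so that ln g(0) = 0 (exact answer:
    ln C(n, E))."""
    rng = np.random.default_rng(seed)
    state = [0] * n
    e = 0
    ln_g = [0.0] * (n + 1)
    ln_f = 1.0
    while ln_f > ln_f_final:
        hist = [0] * (n + 1)
        steps = 0
        while True:
            i = int(rng.integers(n))
            de = 1 - 2 * state[i]          # flipping coin i changes E by +-1
            # accept with min(1, g(E)/g(E+de)) to flatten the visit histogram
            if np.log(rng.random()) < ln_g[e] - ln_g[e + de]:
                state[i] = 1 - state[i]
                e += de
            ln_g[e] += ln_f
            hist[e] += 1
            steps += 1
            if steps % 1000 == 0 and min(hist) > flat * sum(hist) / len(hist):
                break
        ln_f /= 2.0
    g0 = ln_g[0]
    return np.array([v - g0 for v in ln_g])

ln_g = wang_landau_coins()
```

    Once ln g (or, in the paper, the total density probability distribution) is known over the full range, both coexisting phases are covered in one calculation, which is what makes direct phase-equilibrium evaluation possible.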

  16. Content based Image Retrieval based on Different Global and Local Color Histogram Methods: A Survey

    NASA Astrophysics Data System (ADS)

    Suhasini, Pallikonda Sarah; Sri Rama Krishna, K.; Murali Krishna, I. V.

    2016-06-01

    Different global and local color histogram methods for content-based image retrieval (CBIR) are investigated in this paper. The color histogram is a widely used descriptor for CBIR. The conventional method of extracting a color histogram is global: it misses spatial content, is less invariant to deformation and viewpoint changes, and results in a very large three-dimensional histogram corresponding to the color space used. To address these deficiencies, different global and local histogram methods have been proposed in recent research. Recent papers describe different ways of extracting local histograms to capture spatial correspondence, invariant color histograms to add deformation and viewpoint invariance, and fuzzy linking to reduce the size of the histogram. The color space and the distance metric used are vital in obtaining a color histogram. In this paper, the performance of CBIR based on different global and local color histograms is surveyed in three color spaces, namely RGB, HSV, and L*a*b*, and with three distance measures, Euclidean, quadratic, and histogram intersection, in order to choose an appropriate method for future research.
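The global-versus-local contrast in the abstract can be sketched in a few lines of NumPy. This is a generic illustration, not the paper's specific methods: a quantised global RGB histogram, a local variant that concatenates per-cell histograms to retain coarse spatial layout, and the histogram-intersection similarity mentioned as one of the distance measures. The grid size and bin count are arbitrary choices for the example.

```python
import numpy as np

def global_color_histogram(img, bins=8):
    """Global RGB histogram: img is HxWx3 uint8; returns a normalised
    (bins**3,) vector, one bin per quantised (R, G, B) cell."""
    q = (img.astype(np.uint32) * bins) // 256          # quantise each channel
    idx = (q[..., 0] * bins + q[..., 1]) * bins + q[..., 2]
    h = np.bincount(idx.ravel(), minlength=bins ** 3).astype(float)
    return h / h.sum()

def local_color_histograms(img, grid=(2, 2), bins=8):
    """Concatenate per-cell histograms to retain coarse spatial layout,
    which the global histogram discards."""
    H, W = img.shape[:2]
    hs = []
    for i in range(grid[0]):
        for j in range(grid[1]):
            cell = img[i * H // grid[0]:(i + 1) * H // grid[0],
                       j * W // grid[1]:(j + 1) * W // grid[1]]
            hs.append(global_color_histogram(cell, bins))
    return np.concatenate(hs)

def histogram_intersection(h1, h2):
    """Similarity in [0, 1]; 1 means identical normalised histograms."""
    return np.minimum(h1, h2).sum()
```

Ranking database images by `histogram_intersection` against a query histogram is the basic retrieval loop the surveyed methods refine.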

  17. Classification of CT-brain slices based on local histograms

    NASA Astrophysics Data System (ADS)

    Avrunin, Oleg G.; Tymkovych, Maksym Y.; Pavlov, Sergii V.; Timchik, Sergii V.; Kisała, Piotr; Orakbaev, Yerbol

    2015-12-01

    Neurosurgical intervention is a very complicated process. Modern operating procedures are based on data such as CT, MRI, etc. Automated analysis of these data is an important task for researchers. Some modern methods of brain-slice segmentation use additional data to process these images, and classification can be used to obtain this information. To classify CT images of the brain, we suggest using local histograms and features extracted from them. The paper shows the process of feature extraction and classification of CT slices of the brain. The feature extraction process is specialized for axial cross-sections of the brain. The work can be applied to medical neurosurgical systems.

  18. EQUAL PAY FACTS.

    ERIC Educational Resources Information Center

    Women's Bureau (DOL), Washington, DC.

    Equal pay means payment of the "rate of the job" without regard to sex. Equal pay laws were enacted in 29 states from 1919 to 1965. Four additional states have fair employment practices laws. Support for such legislation has come from women's and civic organizations, the AFL-CIO, and the President's and state commissions on the status of women. The…

  19. Equality and Economy

    ERIC Educational Resources Information Center

    Brink, Chris

    2012-01-01

    The two big events in higher education during 2010 were the implementation of the Equality Act, and the introduction of a new dispensation on fees and funding. The former is intended to promote equality, the latter is premised on the need for economy. In this article, the author focuses on the effect of the latter on the former. He considers this…

  20. Equality, Innovation and Diversity.

    ERIC Educational Resources Information Center

    Smith, Janet

    1999-01-01

    Offers some ideas concerning promotion of gender equality and diversity within European Union-funded programs and activities. Reviews efforts since the 1970s to foster equal access in European schools and universities, examines some principles of innovation and entrepreneurship, and considers stages in diversity policy development. (DB)

  1. Early Understanding of Equality

    ERIC Educational Resources Information Center

    Leavy, Aisling; Hourigan, Mairéad; McMahon, Áine

    2013-01-01

    Quite a bit of the arithmetic in elementary school contains elements of algebraic reasoning. After researching and testing a number of instructional strategies with Irish third graders, these authors found effective methods for cultivating a relational concept of equality in third-grade students. Understanding equality is fundamental to algebraic…

  2. Integer Equal Flows

    SciTech Connect

    Meyers, C A; Schulz, A S

    2009-01-07

    The integer equal flow problem is an NP-hard network flow problem, in which all arcs in given sets R_1, …, R_ℓ must carry equal flow. We show this problem is effectively inapproximable, even if the cardinality of each set R_k is two. When ℓ is fixed, it is solvable in polynomial time.

  3. Searching for Equality.

    ERIC Educational Resources Information Center

    Giese, James; Miller, Barbara

    1987-01-01

    Offers an activity designed to illustrate the historical context of current legal questions relating to equal rights. The activity shows how the definition of equality has been expanded as a result of the continuous hard work of individuals and civil rights groups. (JDH)

  4. A preliminary evaluation of histogram-based binarization algorithms

    SciTech Connect

    Kanai, Junichi; Grover, K.

    1995-04-01

    To date, most Optical Character Recognition (OCR) systems process binary document images, and the quality of the input image strongly affects their performance. Since a binarization process is inherently lossy, different algorithms typically produce different binary images from the same gray scale image. The objective of this research is to study the effects of global binarization algorithms on the performance of OCR systems. Several binarization methods were examined: the best fixed threshold value for the data set, the ideal histogram method, and Otsu's algorithm. Four contemporary OCR systems and 50 hard copy pages containing 91,649 characters were used in the experiments. These pages were digitized at 300 dpi and 8 bits/pixel, and 36 different threshold values (ranging from 59 to 199 in increments of 4) were used. The resulting 1,800 binary images were processed by all four OCR systems. All systems made approximately 40% more errors from images generated by Otsu's method than from those of the ideal histogram method. Two of the systems made approximately the same number of errors from images generated by the best fixed threshold value and Otsu's method.
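Since Otsu's algorithm is the best-known of the global, histogram-based binarization methods compared above, a short sketch of it may be useful. This is the standard textbook formulation (maximising between-class variance over the 256-bin grayscale histogram), not the exact code used in the study.

```python
import numpy as np

def otsu_threshold(gray):
    """Otsu's global threshold from the 256-bin histogram of an
    8-bit image: pick the level maximising between-class variance."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    omega = np.cumsum(p)                   # class-0 (background) probability
    mu = np.cumsum(p * np.arange(256))     # class-0 first moment
    mu_t = mu[-1]                          # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b2 = (mu_t * omega - mu) ** 2 / (omega * (1 - omega))
    sigma_b2 = np.nan_to_num(sigma_b2)     # empty classes score zero
    return int(np.argmax(sigma_b2))
```

Binarization is then simply `binary = gray > otsu_threshold(gray)`, the lossy step whose effect on OCR accuracy the study measures.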

  5. Lean histogram of oriented gradients features for effective eye detection

    NASA Astrophysics Data System (ADS)

    Sharma, Riti; Savakis, Andreas

    2015-11-01

    Reliable object detection is very important in computer vision and robotics applications. The histogram of oriented gradients (HOG) is established as one of the most popular hand-crafted features, which along with support vector machine (SVM) classification provides excellent performance for object recognition. We investigate dimensionality reduction on HOG features in combination with SVM classifiers to obtain efficient feature representation and improved classification performance. In addition to lean HOG features, we explore descriptors resulting from dimensionality reduction on histograms of binary descriptors. We consider three dimensionality reduction techniques: standard principal component analysis; random projections, a computationally efficient linear mapping that is data independent; and locality preserving projections (LPP), which learns the manifold structure of the data. Our methods focus on the application of eye detection and were tested on an eye database created using the BioID and FERET face databases. Our results indicate that manifold learning is beneficial to classification utilizing HOG features. To demonstrate the broader usefulness of lean HOG features for object class recognition, we evaluated our system's classification performance on the CalTech-101 dataset with favorable outcomes.
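The "lean HOG" pipeline can be sketched in miniature: compute per-cell orientation histograms, then shrink the concatenated descriptor with a data-independent random projection (the second of the three reduction techniques named above). This is a simplified illustration, assuming no block normalisation and a plain Gaussian projection matrix; the paper's full descriptor and its PCA/LPP variants are more involved.

```python
import numpy as np

def hog_features(gray, cell=8, nbins=9):
    """Minimal HOG sketch: per-cell histograms of gradient orientation,
    magnitude-weighted, without the usual block normalisation."""
    gray = gray.astype(float)
    gx = np.zeros_like(gray)
    gy = np.zeros_like(gray)
    gx[:, 1:-1] = gray[:, 2:] - gray[:, :-2]       # central differences
    gy[1:-1, :] = gray[2:, :] - gray[:-2, :]
    mag = np.hypot(gx, gy)
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180.0   # unsigned orientation
    H, W = gray.shape
    feats = []
    for i in range(0, H - cell + 1, cell):
        for j in range(0, W - cell + 1, cell):
            a = ang[i:i + cell, j:j + cell].ravel()
            m = mag[i:i + cell, j:j + cell].ravel()
            h, _ = np.histogram(a, bins=nbins, range=(0, 180), weights=m)
            feats.append(h)
    return np.concatenate(feats)

def random_projection(X, k, seed=0):
    """Data-independent linear mapping to k dimensions via a random
    Gaussian matrix; approximately distance-preserving by the
    Johnson-Lindenstrauss lemma."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    P = rng.normal(0.0, 1.0 / np.sqrt(k), size=(d, k))
    return X @ P
```

The projected features would then be fed to an SVM, as in the paper's eye-detection experiments.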

  6. Bin Ratio-Based Histogram Distances and Their Application to Image Classification.

    PubMed

    Hu, Weiming; Xie, Nianhua; Hu, Ruiguang; Ling, Haibin; Chen, Qiang; Yan, Shuicheng; Maybank, Stephen

    2014-12-01

    Large variations in image background may cause partial matching and normalization problems for histogram-based representations, i.e., the histograms of the same category may have bins which are significantly different, and normalization may produce large changes in the differences between corresponding bins. In this paper, we deal with this problem by using the ratios between bin values of histograms, rather than the differences between bin values that are used in traditional histogram distances. We propose a bin ratio-based histogram distance (BRD), which is an intra-cross-bin distance, in contrast with previous bin-to-bin and cross-bin distances. The BRD is robust to partial matching and histogram normalization, and captures correlations between bins with only a linear computational complexity. We combine the BRD with the ℓ1 histogram distance and the χ² histogram distance to generate the ℓ1 BRD and the χ² BRD, respectively. These combinations exploit and benefit from the robustness of the BRD under partial matching and the robustness of the ℓ1 and χ² distances to small noise. We propose a method for assessing the robustness of histogram distances to partial matching. The BRDs and logistic regression-based histogram fusion are applied to image classification. The experimental results on synthetic data sets show the robustness of the BRDs to partial matching, and the experiments on seven benchmark data sets demonstrate promising results of the BRDs for image classification. PMID:26353143
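To illustrate why ratios help, here is a toy intra-cross-bin distance built on pairwise bin ratios. It is a simplified stand-in for the idea, not the paper's exact BRD formula: comparing the matrices of log bin ratios makes the measure invariant to a global rescaling of either histogram, which is exactly the normalization sensitivity the abstract describes. The function name and the `eps` regulariser are inventions of this sketch.

```python
import numpy as np

def ratio_distance(h, g, eps=1e-12):
    """Toy intra-cross-bin distance: compare the matrices of pairwise
    bin ratios h_i/h_j and g_i/g_j (in log space) instead of bin
    differences.  Invariant to global rescaling of either histogram.
    A simplified stand-in for the BRD, not the paper's formula."""
    Rh = (h[:, None] + eps) / (h[None, :] + eps)
    Rg = (g[:, None] + eps) / (g[None, :] + eps)
    return np.abs(np.log(Rh) - np.log(Rg)).mean()
```

In particular `ratio_distance(h, c * h)` is (numerically) zero for any positive scale c, whereas an ℓ1 bin-difference distance grows with c.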

  7. Efficient local statistical analysis via integral histograms with discrete wavelet transform.

    PubMed

    Lee, Teng-Yok; Shen, Han-Wei

    2013-12-01

    Histograms computed from local regions are commonly used in many visualization applications, and allowing the user to query histograms interactively in regions of arbitrary locations and sizes plays an important role in feature identification and tracking. Computing histograms in regions with arbitrary location and size, nevertheless, can be time consuming for large data sets since it involves expensive I/O and scans of data elements. To achieve both performance- and storage-efficient query of local histograms, we present a new algorithm called WaveletSAT, which utilizes integral histograms, an extension of the summed area tables (SAT), and the discrete wavelet transform (DWT). Similar to a SAT, an integral histogram is the histogram computed from the area between each grid point and the grid origin, which can be pre-computed to support fast queries. Nevertheless, because one histogram contains multiple bins, it would be very expensive to store one integral histogram at each grid point. To reduce the storage cost for large integral histograms, WaveletSAT treats the integral histograms of all grid points as multiple SATs, each of which can be converted into a sparse representation via DWT, allowing the reconstruction of axis-aligned region histograms of arbitrary sizes from a limited number of wavelet coefficients. In addition, we present an efficient wavelet transform algorithm for SATs that can operate on each grid point separately in logarithmic time complexity, which can be extended to a parallel GPU-based implementation. With theoretical and empirical demonstration, we show that WaveletSAT can achieve fast preprocessing and smaller storage overhead than the conventional integral histogram approach with close query performance. PMID:24051836
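The integral-histogram building block that WaveletSAT compresses can be shown in a few lines. This sketch covers only the plain (uncompressed) structure the abstract takes as its starting point, for a 2D scalar field: one summed-area table per bin, so any axis-aligned region's histogram costs four lookups per bin instead of a scan. The wavelet compression itself is not reproduced here.

```python
import numpy as np

def build_integral_histogram(data, nbins, vrange):
    """Integral histogram (one summed-area table per bin) for a 2D
    scalar field: sat[i, j, b] counts data[:i, :j] falling in bin b."""
    H, W = data.shape
    lo, hi = vrange
    bins = np.clip(((data - lo) / (hi - lo) * nbins).astype(int),
                   0, nbins - 1)
    ind = np.zeros((H, W, nbins))                     # per-pixel indicator
    ind[np.arange(H)[:, None], np.arange(W)[None, :], bins] = 1
    sat = ind.cumsum(axis=0).cumsum(axis=1)
    return np.pad(sat, ((1, 0), (1, 0), (0, 0)))      # zero row/col at origin

def region_histogram(sat, r0, r1, c0, c1):
    """Histogram of data[r0:r1, c0:c1] in O(nbins): 4 SAT lookups."""
    return sat[r1, c1] - sat[r0, c1] - sat[r1, c0] + sat[r0, c0]
```

Storing a full `nbins`-vector at every grid point is exactly the overhead that motivates the sparse DWT representation in the paper.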

  8. Equalization in redundant channels

    NASA Technical Reports Server (NTRS)

    Tulpule, Bhalchandra R. (Inventor); Collins, Robert E. (Inventor); Cominelli, Donald F. (Inventor); O'Neill, Richard D. (Inventor)

    1988-01-01

    A miscomparison between a channel's configuration data base and a voted system configuration data base in a redundant channel system having identically operating, frame synchronous channels triggers autoequalization of the channel's historical signal data bases in a hierarchical, chronological manner with that of a correctly operating channel. After equalization, symmetrization of the channel's configuration data base with that of the system permits upgrading of the previously degraded channel to full redundancy. An externally provided equalization command, e.g., manually actuated, can also trigger equalization.

  9. Density Equalizing Map Projections

    1995-07-01

    A geographic map is mathematically transformed so that the subareas of the map are proportional to a given quantity such as population. In other words, population density is equalized over the entire map. The transformed map can be used as a display tool, or it can be statistically analyzed. For example, cases of disease plotted on the transformed map should be uniformly distributed at random, if disease rates are everywhere equal. Geographic clusters of disease can be readily identified, and their statistical significance determined, on a density equalized map.

  10. Overcoming the slowing down of flat-histogram Monte Carlo simulations: cluster updates and optimized broad-histogram ensembles.

    PubMed

    Wu, Yong; Körner, Mathias; Colonna-Romano, Louis; Trebst, Simon; Gould, Harvey; Machta, Jonathan; Troyer, Matthias

    2005-10-01

    We study the performance of Monte Carlo simulations that sample a broad histogram in energy by determining the mean first-passage time to span the entire energy space of d-dimensional ferromagnetic Ising/Potts models. We first show that flat-histogram Monte Carlo methods with single-spin flip updates such as the Wang-Landau algorithm or the multicanonical method perform suboptimally in comparison to an unbiased Markovian random walk in energy space. For the d = 1, 2, 3 Ising model, the mean first-passage time τ scales with the number of spins N = L^d as τ ∝ N^2 L^z. The exponent z is found to decrease as the dimensionality d is increased. In the mean-field limit of infinite dimensions we find that z vanishes up to logarithmic corrections. We then demonstrate how the slowdown characterized by z > 0 for finite d can be overcome by two complementary approaches--cluster dynamics in connection with Wang-Landau sampling and the recently developed ensemble optimization technique. Both approaches are found to improve the random walk in energy space so that τ ∝ N^2 up to logarithmic corrections for the d = 1, 2 Ising model.
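For readers unfamiliar with the flat-histogram methods being benchmarked, here is a minimal Wang-Landau sketch for the periodic 1D Ising chain. It shows the single-spin-flip walk accepted with probability min(1, g(E)/g(E')), the flat-visit-histogram check, and the halving of the modification factor; it illustrates the baseline algorithm whose slowdown the paper analyses, not the paper's optimized ensembles. The sweep sizes, flatness criterion, and stopping factor are arbitrary choices for this sketch.

```python
import numpy as np

def wang_landau_ising1d(N=8, lnf_final=1e-3, flat=0.8, seed=0):
    """Wang-Landau estimate of ln g(E) for the periodic 1D Ising
    chain with N spins (N even).  Energy levels: -N, -N+4, ..., N."""
    rng = np.random.default_rng(seed)
    spins = rng.choice([-1, 1], size=N)
    E = -int(np.sum(spins * np.roll(spins, 1)))
    nlev = N // 2 + 1
    idx = lambda e: (e + N) // 4          # map energy to level index
    lng = np.zeros(nlev)                  # running estimate of ln g(E)
    lnf = 1.0                             # modification factor
    while lnf > lnf_final:
        hist = np.zeros(nlev)
        for _chunk in range(10):          # bounded flatness checks
            for _ in range(5000):
                i = rng.integers(N)
                dE = 2 * spins[i] * (spins[i - 1] + spins[(i + 1) % N])
                # accept with probability min(1, g(E)/g(E + dE))
                if np.log(rng.random()) < lng[idx(E)] - lng[idx(E + dE)]:
                    spins[i] = -spins[i]
                    E += dE
                lng[idx(E)] += lnf        # update DOS at current energy
                hist[idx(E)] += 1         # and the visit histogram
            if hist.min() > flat * hist.mean():
                break                     # histogram flat enough
        lnf /= 2.0                        # refine the modification factor
    return lng - lng[0] + np.log(2.0)     # normalise so g(-N) = 2
```

For N = 8 the exact density of states is g = (2, 56, 140, 56, 2), which the estimate should reproduce up to the statistical accuracy set by the final modification factor.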

  11. Happiness as Educational Equality

    ERIC Educational Resources Information Center

    McCord, Arline Sakuma

    1974-01-01

    A review of six monographs in the American Sociological Association Rose Monograph Series that examine attitudes toward self as they relate to the pervasiveness of American values concerning happiness and equality of opportunity. (EH)

  12. Implementing a 3D histogram version of the Energy-Test in ROOT

    NASA Astrophysics Data System (ADS)

    Cohen, E. O.; Reid, I. D.; Piasetzky, E.

    2016-08-01

    Comparing simulation and data histograms is of interest in nuclear and particle physics experiments; however, the leading three-dimensional histogram comparison tool available in ROOT, the 3D Kolmogorov-Smirnov test, exhibits shortcomings. Throughout the following, we present and discuss the implementation of an alternative comparison test for three-dimensional histograms, based on the Energy-Test by Aslan and Zech. The software package can be found at http://www-nuclear.tau.ac.il/ecohen/.

  13. Using histograms to introduce randomization in the generation of ensembles of decision trees

    DOEpatents

    Kamath, Chandrika; Cantu-Paz, Erick; Littau, David

    2005-02-22

    A system for decision tree ensembles that includes a module to read the data, a module to create a histogram, a module to evaluate a potential split according to some criterion using the histogram, a module to select a split point randomly in an interval around the best split, a module to split the data, and a module to combine multiple decision trees in ensembles. The decision tree method includes the steps of reading the data; creating a histogram; evaluating a potential split according to some criterion using the histogram, selecting a split point randomly in an interval around the best split, splitting the data, and combining multiple decision trees in ensembles.
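The randomization step described in the patent abstract can be sketched as follows. This is an illustrative reconstruction, not the patented code: candidate splits are scored at histogram bin boundaries (Gini impurity here is this sketch's choice of criterion), and the returned split point is drawn uniformly from the interval around the best boundary, so that trees built on the same data differ.

```python
import numpy as np

def randomized_histogram_split(x, y, nbins=32, seed=0):
    """Score candidate splits at histogram bin boundaries of feature x
    (integer class labels y), then draw the actual split point
    uniformly from the bin-width interval around the best boundary."""
    rng = np.random.default_rng(seed)
    edges = np.histogram_bin_edges(x, bins=nbins)

    def gini(labels):
        if len(labels) == 0:
            return 0.0
        p = np.bincount(labels) / len(labels)
        return 1.0 - np.sum(p ** 2)

    best, best_score = None, np.inf
    for t in edges[1:-1]:                        # candidate = bin boundary
        left, right = y[x <= t], y[x > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
        if score < best_score:
            best, best_score = t, score

    # randomize within the histogram interval around the best boundary
    width = edges[1] - edges[0]
    return rng.uniform(best - width / 2, best + width / 2)
```

Repeating this with different seeds at every node yields the ensemble diversity the patent is after, without re-sorting the data as exact split search would.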

  14. Custom IC/Embedded IP design for histogram in video processing application

    NASA Astrophysics Data System (ADS)

    Pandey, Manoj; Chaturvedi, Richa; Rai, S. K.

    2016-03-01

    The histogram is an integral part of video processing applications. Whether the design method is ASIC or embedded, histogram computation is an important functional block. This paper proposes a custom integrated circuit (IC) as an ASIC and an embedded IP to compute the colored histogram function. Histogram computation has two features: color and spatial. The color feature is calculated using find_bin, and the spatial feature is calculated using a kernel function. The design is verified using the Cadence NCSIM tool and synthesized using RTL Compiler. Finally, the embedded IP is interfaced with a kernel-based mean-shift algorithm for tracking a moving object and implemented on a Xilinx Spartan-6 LX150T FPGA.

  15. Fast and fully automatic phalanx segmentation using a grayscale-histogram morphology algorithm

    NASA Astrophysics Data System (ADS)

    Hsieh, Chi-Wen; Liu, Tzu-Chiang; Jong, Tai-Lang; Chen, Chih-Yen; Tiu, Chui-Mei; Chan, Din-Yuen

    2011-08-01

    Bone age assessment is a common radiological examination used in pediatrics to diagnose discrepancies between the skeletal and chronological age of a child; therefore, it is beneficial to develop a computer-based bone age assessment to help junior pediatricians estimate bone age easily. Unfortunately, the phalanx on radiograms is not easily separated from the background and soft tissue. Therefore, we proposed a new method, called the grayscale-histogram morphology algorithm, to segment the phalanges quickly and precisely. The algorithm includes three parts: a tri-stage sieve algorithm used to eliminate the background of hand radiograms, a centroid-edge dual scanning algorithm to frame the phalanx region, and finally a segmentation algorithm based on a disk traverse-subtraction filter to segment the phalanx. Moreover, two further segmentation methods, adaptive two-mean and adaptive two-mean clustering, were performed, and their results were compared with those of the disk traverse-subtraction segmentation using five indices: misclassification error, relative foreground area error, modified Hausdorff distance, edge mismatch, and region nonuniformity. In addition, the CPU time of the three segmentation methods was discussed. The results showed that our method performed better than the other two methods. Furthermore, satisfactory segmentation results were obtained with a low standard error.

  16. Technology--The Equalizer.

    ERIC Educational Resources Information Center

    Sloane, Eydie

    1989-01-01

    This article describes a number of computer-based learning tools for disabled students. Adaptive input devices, assisted technologies, software, and hardware and software resources are discussed. (IAH)

  17. Transhumanism and moral equality.

    PubMed

    Wilson, James

    2007-10-01

    Conservative thinkers such as Francis Fukuyama have produced a battery of objections to the transhumanist project of fundamentally enhancing human capacities. This article examines one of these objections, namely that by allowing some to greatly extend their capacities, we will undermine the fundamental moral equality of human beings. I argue that this objection is groundless: once we understand the basis for human equality, it is clear that anyone who now has sufficient capacities to count as a person from the moral point of view will continue to count as one even if others are fundamentally enhanced; and it is mistaken to think that a creature which had even far greater capacities than an unenhanced human being should count as more than an equal from the moral point of view.

  18. CHIWEI: A code of goodness of fit tests for weighted and unweighted histograms

    NASA Astrophysics Data System (ADS)

    Gagunashvili, N. D.

    2012-02-01

    A Fortran-77 program for goodness of fit tests for histograms with weighted entries as well as with unweighted entries is presented. The code calculates test statistics both for histograms with normalized event weights and for histograms with unnormalized event weights.

  19. Equality and Academic Subjects

    ERIC Educational Resources Information Center

    Hardarson, Atli

    2013-01-01

    A recent national curriculum guide for upper secondary schools in my home country, Iceland, requires secondary schools to work towards equality and five other overarching aims. This requirement raises questions about to what extent secondary schools have to change their curricula in order to approach these aims or work towards them in an adequate…

  20. Equal Opportunity in Employment

    ERIC Educational Resources Information Center

    Bullock, Paul

    This book focuses on discrimination in employment, defined as the denial of equal opportunity in the labor market to qualified persons on the basis of race, color, religion, national origin, age, sex, or any other factor not related to their individual qualifications for work. The average nonwhite college graduate can expect to earn less during…

  1. Education and Economic Equality

    ERIC Educational Resources Information Center

    Thurow, Lester C.

    1972-01-01

    Argues that the present reliance on education as the ultimate policy for curing all problems, economic and social, is unwarranted at best and in all probability, ineffective. Suggests that any time a consensus emerges on the need for more equality, it can be at least partly achieved by making a frontal attack on wage differentials. (RJ)

  2. EQUALS Investigations: Remote Rulers.

    ERIC Educational Resources Information Center

    Mayfield, Karen; Whitlow, Robert

    EQUALS is a teacher education program that helps elementary and secondary educators acquire methods and materials to attract minority and female students to mathematics. It supports a problem-solving approach to mathematics which has students working in groups, uses active assessment methods, and incorporates a broad mathematics curriculum…

  3. Equality of Fitness Centers

    ERIC Educational Resources Information Center

    Swoyer, Jesse O.

    2008-01-01

    The author, who has been a personal trainer for the past ten years, recently realized that all fitness centers are not equal. In February, he was able to participate in the grand opening of the Center for Independent Living of Central PA (CILCP), a fitness center that is designed to accommodate persons with disabilities living in the Central…

  4. Granting Each Equal Access.

    ERIC Educational Resources Information Center

    Walling, Linda Lucas

    1992-01-01

    Summarizes federal legislation regarding equal access for students with disabilities and discusses environmental barriers to accessibility in the library media center. Solutions to these design problems are suggested in the following areas: material formats and space requirements; the physical setting, including furniture, floor coverings,…

  5. Equality of Educational Opportunity.

    ERIC Educational Resources Information Center

    Cross, K. Patricia

    A consideration of the use of the phrase "equality of educational opportunity" and of the educational models used to attempt its implementation suggest the following recommendations. If education is to devise learning models that will maximize individual potential and aid in matching human abilities to the work required by societies, then (1) we…

  6. Defining Equality in Education

    ERIC Educational Resources Information Center

    Benson, Ronald E.

    1977-01-01

    Defines equality of education in three areas: 1) by the degree of integration of school systems; 2) by a comparison of material resources and assets in education; and 3) by the effects of schooling as measured by the mean scores of groups on standardized tests. Available from: College of Education, 107 Quadrangle, Iowa State University, Ames, Iowa…

  7. Equality Versus Inequality.

    ERIC Educational Resources Information Center

    Dahl, Robert A.

    1996-01-01

    Argues that political equality and democracy are attainable only through the distribution of access to political resources and the willingness to use them. Discusses the broad philosophical and sociological components that contribute to a system marked by advantage and inequalities, as well as opportunities for opposition and resistance. (MJP)

  8. Do you need to compare two histograms not only by eye?

    NASA Astrophysics Data System (ADS)

    Cardiel, N.

    2015-05-01

    Although the use of histograms implies loss of information due to the fact that the actual data are replaced by the central values of the considered intervals, this graphical representation is commonly employed in scientific communication, particularly in Astrophysics. Sometimes this kind of comparison is unavoidable when one needs to compare new results with already published data only available in histogram format. Unfortunately, it is not infrequent to find in the literature examples of histogram comparisons where the similarity between the histograms is not statistically quantified but simply justified or discarded ``by eye''. In this poster several methods to quantify the similarity between two histograms are discussed. The availability of statistical packages, such as R (R Core Team 2014, R: A Language and Environment for Statistical Computing, R Foundation for Statistical Computing, Vienna, Austria. URL http://www.R-project.org/), notably simplify the understanding of the different approaches through the use of numerical simulations.
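One simple quantitative alternative to an "by eye" comparison is the standard two-histogram chi-square statistic with a Monte Carlo p-value, in the spirit of the numerical simulations the abstract mentions. This sketch is illustrative (function names and the pooled-multinomial resampling scheme are this example's choices), not the poster's specific methods.

```python
import numpy as np

def two_sample_chi2(h1, h2):
    """Chi-square statistic between two histograms of raw counts
    (bins where both histograms are empty are skipped)."""
    n1, n2 = h1.sum(), h2.sum()
    mask = (h1 + h2) > 0
    t1, t2 = h1[mask], h2[mask]
    return np.sum((np.sqrt(n2 / n1) * t1 - np.sqrt(n1 / n2) * t2) ** 2
                  / (t1 + t2))

def chi2_pvalue_mc(h1, h2, nsim=400, seed=0):
    """Monte Carlo p-value: redraw both histograms from the pooled bin
    frequencies and count how often the simulated statistic exceeds
    the observed one."""
    rng = np.random.default_rng(seed)
    obs = two_sample_chi2(h1, h2)
    pooled = (h1 + h2) / (h1 + h2).sum()
    n1, n2 = int(h1.sum()), int(h2.sum())
    hits = 0
    for _ in range(nsim):
        s1 = rng.multinomial(n1, pooled)
        s2 = rng.multinomial(n2, pooled)
        if two_sample_chi2(s1, s2) >= obs:
            hits += 1
    return hits / nsim
```

A small p-value means the two histograms are unlikely to be draws from a common distribution, turning the visual impression into a number.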

  9. Time-cumulated visible and infrared radiance histograms used as descriptors of surface and cloud variations

    NASA Technical Reports Server (NTRS)

    Seze, Genevieve; Rossow, William B.

    1991-01-01

    The spatial and temporal stability of the distributions of satellite-measured visible and infrared radiances, caused by variations in clouds and surfaces, are investigated using bidimensional and monodimensional histograms and time-composite images. Similar analysis of the histograms of the original and time-composite images provides separation of the contributions of the space and time variations to the total variations. The variability of both the surfaces and clouds is found to be larger at scales much larger than the minimum resolved by satellite imagery. This study shows that the shapes of these histograms are distinctive characteristics of the different climate regimes and that particular attributes of these histograms can be related to several general, though not universal, properties of clouds and surface variations at regional and synoptic scales. There are also significant exceptions to these relationships in particular climate regimes. The characteristics of these radiance histograms provide a stable well defined descriptor of the cloud and surface properties.

  10. Approximate Algorithms for Computing Spatial Distance Histograms with Accuracy Guarantees

    PubMed Central

    Grupcev, Vladimir; Yuan, Yongke; Tu, Yi-Cheng; Huang, Jin; Chen, Shaoping; Pandit, Sagar; Weng, Michael

    2014-01-01

    Particle simulation has become an important research tool in many scientific and engineering fields. Data generated by such simulations impose great challenges to database storage and query processing. One of the queries against particle simulation data, the spatial distance histogram (SDH) query, is the building block of many high-level analytics, and requires quadratic time to compute using a straightforward algorithm. Previous work has developed efficient algorithms that compute exact SDHs. While beating the naive solution, such algorithms are still not practical in processing SDH queries against large-scale simulation data. In this paper, we take a different path to tackle this problem by focusing on approximate algorithms with provable error bounds. We first present a solution derived from the aforementioned exact SDH algorithm, and this solution has running time that is unrelated to the system size N. We also develop a mathematical model to analyze the mechanism that leads to errors in the basic approximate algorithm. Our model provides insights on how the algorithm can be improved to achieve higher accuracy and efficiency. Such insights give rise to a new approximate algorithm with improved time/accuracy tradeoff. Experimental results confirm our analysis. PMID:24693210
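For orientation, here is the naive exact SDH computation whose quadratic cost motivates the approximate algorithms above. This baseline is implied by the abstract ("quadratic time to compute using a straightforward algorithm"); the function name and parameters are this sketch's own.

```python
import numpy as np

def spatial_distance_histogram(points, bucket_width, rmax):
    """Naive exact SDH: histogram all N*(N-1)/2 pairwise distances,
    O(N^2) in the number of particles.  The paper's algorithms
    approximate this result with provable error bounds."""
    nbins = int(np.ceil(rmax / bucket_width))
    hist = np.zeros(nbins, dtype=np.int64)
    n = len(points)
    for i in range(n):
        # distances from point i to all later points (each pair once)
        d = np.linalg.norm(points[i + 1:] - points[i], axis=1)
        idx = np.minimum((d / bucket_width).astype(int), nbins - 1)
        np.add.at(hist, idx, 1)
    return hist
```

Even with the vectorised inner loop, doubling N quadruples the work, which is why exact algorithms become impractical for large simulation frames.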

  11. Effects of voxelization on dose volume histogram accuracy

    NASA Astrophysics Data System (ADS)

    Sunderland, Kyle; Pinter, Csaba; Lasso, Andras; Fichtinger, Gabor

    2016-03-01

    PURPOSE: In radiotherapy treatment planning systems, structures of interest such as targets and organs at risk are stored as 2D contours on evenly spaced planes. In order to be used in various algorithms, contours must be converted into binary labelmap volumes using voxelization. The voxelization process results in lost information, which has little effect on the volume of large structures, but has significant impact on small structures, which contain few voxels. Volume differences for segmented structures affect metrics such as dose volume histograms (DVH), which are used for treatment planning. Our goal is to evaluate the impact of voxelization on segmented structures, as well as how factors like voxel size affect metrics such as the DVH. METHODS: We create a series of implicit functions, which represent simulated structures. These structures are sampled at varying resolutions and compared to labelmaps with high sub-millimeter resolutions. We generate DVHs and evaluate voxelization error for the same structures at different resolutions by calculating the agreement acceptance percentage between the DVHs. RESULTS: We implemented tools for analysis as modules in the SlicerRT toolkit based on the 3D Slicer platform. We found large DVH variations from the baseline for small structures or for structures located in regions with a high dose gradient, potentially leading to the creation of suboptimal treatment plans. CONCLUSION: This work demonstrates that labelmap and dose volume voxel size is an important factor in DVH accuracy, which must be accounted for in order to ensure the development of accurate treatment plans.
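A minimal sketch of the cumulative DVH computation may clarify why voxelization matters: the histogram is taken only over the voxels selected by the binary labelmap, so for small structures every voxel gained or lost in voxelization visibly shifts the curve. This is a generic formulation, not the SlicerRT implementation; the bin width and names are assumptions of the sketch.

```python
import numpy as np

def cumulative_dvh(dose, mask, bin_width=0.5):
    """Cumulative dose-volume histogram for the voxels selected by a
    binary labelmap: DVH(d) = percent of structure volume receiving
    at least dose d.  Returns (dose_bin_left_edges, volume_percent)."""
    d = dose[mask > 0]                      # doses inside the structure
    edges = np.arange(0.0, d.max() + 2 * bin_width, bin_width)
    counts, _ = np.histogram(d, bins=edges)
    # reverse cumulative sum: volume receiving >= each left bin edge
    vol = counts[::-1].cumsum()[::-1] / d.size * 100.0
    return edges[:-1], vol
```

Recomputing `mask` at a coarser voxel size changes `d.size` and the selected doses, which is precisely the DVH disagreement the study quantifies.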

  12. Approximate Algorithms for Computing Spatial Distance Histograms with Accuracy Guarantees.

    PubMed

    Grupcev, Vladimir; Yuan, Yongke; Tu, Yi-Cheng; Huang, Jin; Chen, Shaoping; Pandit, Sagar; Weng, Michael

    2012-09-01

    Particle simulation has become an important research tool in many scientific and engineering fields. Data generated by such simulations impose great challenges to database storage and query processing. One of the queries against particle simulation data, the spatial distance histogram (SDH) query, is the building block of many high-level analytics, and requires quadratic time to compute using a straightforward algorithm. Previous work has developed efficient algorithms that compute exact SDHs. While beating the naive solution, such algorithms are still not practical in processing SDH queries against large-scale simulation data. In this paper, we take a different path to tackle this problem by focusing on approximate algorithms with provable error bounds. We first present a solution derived from the aforementioned exact SDH algorithm, and this solution has running time that is unrelated to the system size N. We also develop a mathematical model to analyze the mechanism that leads to errors in the basic approximate algorithm. Our model provides insights on how the algorithm can be improved to achieve higher accuracy and efficiency. Such insights give rise to a new approximate algorithm with improved time/accuracy tradeoff. Experimental results confirm our analysis.

  14. Using color histograms and SPA-LDA to classify bacteria.

    PubMed

    de Almeida, Valber Elias; da Costa, Gean Bezerra; de Sousa Fernandes, David Douglas; Gonçalves Dias Diniz, Paulo Henrique; Brandão, Deysiane; de Medeiros, Ana Claudia Dantas; Véras, Germano

    2014-09-01

    In this work, a new approach is proposed to verify the differentiating characteristics of five bacteria (Escherichia coli, Enterococcus faecalis, Streptococcus salivarius, Streptococcus oralis, and Staphylococcus aureus) by using digital images obtained with a simple webcam and variable selection by the Successive Projections Algorithm associated with Linear Discriminant Analysis (SPA-LDA). Color histograms in the red-green-blue (RGB), hue-saturation-value (HSV), and grayscale channels and their combinations were used as input data and statistically evaluated by using different multivariate classifiers (Soft Independent Modeling by Class Analogy (SIMCA), Principal Component Analysis-Linear Discriminant Analysis (PCA-LDA), Partial Least Squares Discriminant Analysis (PLS-DA) and Successive Projections Algorithm-Linear Discriminant Analysis (SPA-LDA)). The bacterial strains were cultivated in a nutritive blood agar base layer for 24 h following the Brazilian Pharmacopoeia, maintaining the status of cell growth and the nature of the nutrient solutions under the same conditions. The best classification result was obtained by using RGB and SPA-LDA, which reached 94% and 100% classification accuracy in the training and test sets, respectively. This result is extremely positive from the viewpoint of routine clinical analyses, because it avoids bacterial identification based on phenotypic identification of the causative organism using Gram staining, culture, and biochemical tests. Therefore, the proposed method presents inherent advantages, promoting a simpler, faster, and lower-cost alternative for bacterial identification. PMID:25023972
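    The histogram feature-extraction step can be illustrated as follows. This sketch builds concatenated per-channel RGB histograms and, purely for illustration, substitutes a toy nearest-centroid rule for the SPA-LDA classifier used in the paper; the pixel lists and class names are made up.

```python
def rgb_histogram(pixels, bins=4):
    """Concatenated, normalized per-channel histograms of (R, G, B) pixels."""
    feat = []
    for ch in range(3):
        hist = [0] * bins
        for px in pixels:
            hist[min(px[ch] * bins // 256, bins - 1)] += 1
        feat.extend(h / len(pixels) for h in hist)
    return feat

def nearest_centroid(feat, centroids):
    """Toy stand-in for SPA-LDA: return the label of the closest centroid."""
    def manhattan(a, b):
        return sum(abs(x - y) for x, y in zip(a, b))
    return min(centroids, key=lambda label: manhattan(feat, centroids[label]))

dark = [(10, 10, 10)] * 4            # made-up training pixels
bright = [(250, 250, 250)] * 4
cents = {"dark": rgb_histogram(dark), "bright": rgb_histogram(bright)}
print(nearest_centroid(rgb_histogram([(20, 5, 30)] * 4), cents))   # dark
```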

  15. Freedom, equality, race.

    PubMed

    Ferguson, Jeffrey B

    2011-01-01

    This essay explores some of the reasons for the continuing power of racial categorization in our era, and thus offers some friendly amendments to the more optimistic renderings of the term post-racial. Focusing mainly on the relationship between black and white Americans, it argues that the widespread embrace of the universal values of freedom and equality, which most regard as antidotes to racial exclusion, actually reinforces it. The internal logic of these categories requires the construction of the "other." In America, where freedom and equality still stand at the contested center of collective identity, a history of racial oppression informs the very meaning of these terms. Thus the irony: much of the effort exerted to transcend race tends to fuel continuing division. PMID:21469393

  16. Equality in Education: An Equality of Condition Perspective

    ERIC Educational Resources Information Center

    Lynch, Kathleen; Baker, John

    2005-01-01

    Transforming schools into truly egalitarian institutions requires a holistic and integrated approach. Using a robust conception of "equality of condition", we examine key dimensions of equality that are central to both the purposes and processes of education: equality in educational and related resources; equality of respect and recognition;…

  17. Using the Bootstrap Method for a Statistical Significance Test of Differences between Summary Histograms

    NASA Technical Reports Server (NTRS)

    Xu, Kuan-Man

    2006-01-01

    A new method is proposed to compare statistical differences between summary histograms, which are the histograms summed over a large ensemble of individual histograms. It consists of choosing a distance statistic for measuring the difference between summary histograms and using a bootstrap procedure to calculate the statistical significance level. Bootstrapping is an approach to statistical inference that makes few assumptions about the underlying probability distribution that describes the data. Three distance statistics are compared in this study. They are the Euclidean distance, the Jeffries-Matusita distance and the Kuiper distance. The data used in testing the bootstrap method are satellite measurements of cloud systems called cloud objects. Each cloud object is defined as a contiguous region/patch composed of individual footprints or fields of view. A histogram of measured values over footprints is generated for each parameter of each cloud object and then summary histograms are accumulated over all individual histograms in a given cloud-object size category. The results of statistical hypothesis tests using all three distances as test statistics are generally similar, indicating the validity of the proposed method. The Euclidean distance is determined to be most suitable after comparing the statistical tests of several parameters with distinct probability distributions among three cloud-object size categories. Impacts on the statistical significance levels resulting from differences in the total lengths of satellite footprint data between two size categories are also discussed.
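    The bootstrap scheme can be sketched as follows, using the Euclidean distance the study found most suitable. The pooling and resampling details here are simplified assumptions, not the paper's exact protocol: individual histograms are pooled, two groups are redrawn with replacement, and the p-value is the fraction of resampled distances at least as large as the observed one.

```python
import math, random

def euclid(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def summary(hists):
    """Summary histogram: bin-wise sum over individual histograms."""
    return [sum(col) for col in zip(*hists)]

def bootstrap_p(group_a, group_b, n_boot=500, seed=0):
    rng = random.Random(seed)
    observed = euclid(summary(group_a), summary(group_b))
    pooled = group_a + group_b
    hits = 0
    for _ in range(n_boot):
        ra = [rng.choice(pooled) for _ in group_a]
        rb = [rng.choice(pooled) for _ in group_b]
        if euclid(summary(ra), summary(rb)) >= observed:
            hits += 1
    return hits / n_boot

p = bootstrap_p([[10, 0]] * 5, [[0, 10]] * 5)
print(p)   # small: the groups differ far more than resampling explains
```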

  18. Compressed histogram attribute profiles for the classification of VHR remote sensing images

    NASA Astrophysics Data System (ADS)

    Battiti, Romano; Demir, Begüm; Bruzzone, Lorenzo

    2015-10-01

    This paper presents a novel compressed histogram attribute profile (CHAP) for the classification of very high resolution remote sensing images. The CHAP characterizes the marginal local distribution of attribute filter responses to model the texture information of each sample with a small number of image features. This is achieved with a three-step algorithm. The first step is devoted to providing a complete characterization of the spatial properties of objects in a scene. To this end, the attribute profile (AP) is initially built by the sequential application of attribute filters to the considered image. Then, to capture the complete spatial characteristics of the structures in the scene, a local histogram is calculated for each sample of each image in the AP. The local histograms of the same pixel location can contain redundant information since: i) adjacent histogram bins can provide similar information; and ii) attributes obtained with similar attribute filter threshold values lead to redundant features. In the second step, to expose these redundancies, the local histograms of the same pixel locations in the AP are organized into a 2D matrix representation, where columns are associated with the local histograms and rows represent a specific bin in all histograms of the considered sequence of filtered attributes in the profile. This representation characterizes the texture information of each sample through a 2D texture descriptor. In the final step, a novel compression approach based on a uniform 2D quantization strategy is applied to remove the redundancy of the 2D texture descriptors. Finally, the CHAP is classified by a Support Vector Machine classifier with a histogram intersection kernel, which is very effective for high-dimensional histogram-based feature representations. Experimental results confirm the effectiveness of the proposed CHAP in terms of computational complexity, storage requirements and classification accuracy when compared to the

  19. Real-time rotation estimation using histograms of oriented gradients.

    PubMed

    Bratanič, Blaž; Pernuš, Franjo; Likar, Boštjan; Tomaževič, Dejan

    2014-01-01

    This paper focuses on real-time rotation estimation for model-based automated visual inspection. In the case of model-based inspection, spatial alignment is essential to distinguish visual defects from normal appearance variations. Defects are detected by comparing the inspected object with its spatially aligned ideal reference model. Rotation estimation is crucial for the inspection of rotationally symmetric objects where mechanical manipulation is unable to ensure the correct object rotation. We propose a novel method for in-plane rotation estimation. Rotation is estimated with an ensemble of nearest-neighbor estimators. Each estimator contains a spatially local representation of an object in a feature space for all rotation angles and is constructed with a semi-supervised self-training approach from a set of unlabeled training images. An individual representation in a feature space is obtained by calculating the Histograms of Oriented Gradients (HOG) over a spatially local region. Each estimator votes separately for the estimated angle; all votes are weighted and accumulated. The final estimation is the angle with the most votes. The method was evaluated on several datasets of pharmaceutical tablets varying in size, shape, and color. The results show that the proposed method is superior in robustness with comparable speed and accuracy to previously proposed methods for rotation estimation of pharmaceutical tablets. Furthermore, all evaluations were performed with the same set of parameters, which implies that the method requires minimal human intervention. Despite the evaluation focused on pharmaceutical tablets, we consider the method useful for any application that requires robust real-time in-plane rotation estimation.
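    The HOG building block of the method can be sketched for a single grayscale patch (central differences, unsigned orientations); the cell/block structure, the nearest-neighbor ensemble, and the voting scheme are omitted, and the toy image below is an assumption.

```python
import math

def orientation_histogram(img, bins=9):
    """Magnitude-weighted histogram of unsigned gradient orientations."""
    h = [0.0] * bins
    rows, cols = len(img), len(img[0])
    for y in range(1, rows - 1):
        for x in range(1, cols - 1):
            gx = img[y][x + 1] - img[y][x - 1]
            gy = img[y + 1][x] - img[y - 1][x]
            mag = math.hypot(gx, gy)
            ang = math.atan2(gy, gx) % math.pi       # unsigned, in [0, pi)
            h[min(int(ang / math.pi * bins), bins - 1)] += mag
    return h

h = orientation_histogram([[0, 0, 255, 255]] * 4)    # vertical edge
print(h.index(max(h)))   # 0: all gradient energy is horizontal
```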

  20. Histogram-driven cupping correction (HDCC) in CT

    NASA Astrophysics Data System (ADS)

    Kyriakou, Y.; Meyer, M.; Lapp, R.; Kalender, W. A.

    2010-04-01

    Typical cupping correction methods are pre-processing methods which require either pre-calibration measurements or simulations of standard objects to approximate and correct for beam hardening and scatter. Some of them require knowledge of the spectra, detector characteristics, etc. The aim of this work was to develop a practical histogram-driven cupping correction (HDCC) method to post-process the reconstructed images. We use a polynomial representation of the raw data generated by forward projection of the reconstructed images; forward and backprojection are performed on graphics processing units (GPU). The coefficients of the polynomial are optimized using a simplex minimization of the joint entropy of the CT image and its gradient. The algorithm was evaluated using simulations and measurements of homogeneous and inhomogeneous phantoms. For the measurements, a C-arm flat-detector CT (FD-CT) system with a 30×40 cm2 detector, a kilovoltage on-board imager (radiation therapy simulator) and a micro-CT system were used. The algorithm reduced cupping artifacts both in simulations and measurements using a fourth-order polynomial and was in good agreement with the reference. The minimization algorithm required fewer than 70 iterations to adjust the coefficients, performing only a linear combination of basis images and thus executing without time-consuming operations. HDCC reduced cupping artifacts without the need for pre-calibration or other scan information, enabling a retrospective improvement of CT image homogeneity. The method can also be combined with other cupping correction algorithms or used in a calibration manner.

  1. Preference for luminance histogram regularities in natural scenes.

    PubMed

    Graham, Daniel; Schwarz, Bianca; Chatterjee, Anjan; Leder, Helmut

    2016-03-01

    Natural scene luminance distributions typically have positive skew, and for single objects, there is evidence that higher skew is a correlate (but not a guarantee) of glossiness. Skewness is also relevant to aesthetics: preference for glossy single objects (with high skew) has been shown even in infants, and skewness is a good predictor of fruit freshness. Given that primate vision appears to efficiently encode natural scene luminance variation, and given evidence that natural scene regularities may be a prerequisite for aesthetic perception in the spatial domain, here we ask whether humans in general prefer natural scenes with more positively skewed luminance distributions. If humans generally prefer images with the higher-order regularities typical of natural scenes and/or shiny objects, we would expect this to be the case. By manipulating luminance distribution skewness (holding mean and variance constant) for individual natural images, we show that in fact preference varies inversely with increasing positive skewness. This finding holds for: artistic landscape images and calibrated natural scenes; scenes with and without glossy surfaces; landscape scenes and close-up objects; and noise images with natural luminance histograms. Across conditions, humans prefer images with skew near zero over higher skew images, and they prefer skew lower than that of the unmodified scenes. These results suggest that humans prefer images with luminances that are distributed relatively evenly about the mean luminance, i.e., images with similar amounts of light and dark. We propose that our results reflect an efficient processing advantage of low-skew images over high-skew images, following evidence from prior brain imaging results.
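    The statistic being manipulated here, luminance skewness, is the third standardized moment of the luminance values. A minimal sketch in plain Python (population form, for brevity):

```python
def skewness(values):
    """Third standardized moment of a sample (population form)."""
    n = len(values)
    mean = sum(values) / n
    sd = (sum((v - mean) ** 2 for v in values) / n) ** 0.5
    return sum(((v - mean) / sd) ** 3 for v in values) / n

# A few bright outliers on a dark background give positive skew,
# as in typical natural-scene luminance distributions.
print(skewness([10] * 90 + [200] * 10) > 0)   # True
```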

  2. Histogram analysis of ADC in brain tumor patients

    NASA Astrophysics Data System (ADS)

    Banerjee, Debrup; Wang, Jihong; Li, Jiang

    2011-03-01

    At various stages of progression, most brain tumors are not homogeneous. In this presentation, we retrospectively studied the distribution of ADC values inside the tumor volume during the course of tumor treatment and progression for a selective group of patients who underwent an anti-VEGF trial. Complete MRI studies were obtained for this selected group of patients, including pre-treatment and multiple follow-up, post-treatment imaging studies. In each MRI imaging study, multiple scan series were obtained as a standard protocol, including T1, T2, T1 post-contrast, FLAIR and DTI-derived images (ADC, FA, etc.) for each visit. All scan series (T1, T2, FLAIR, post-contrast T1) were registered to the corresponding DTI scan at the patient's first visit. Conventionally, hyper-intensity regions on T1 post-contrast images are believed to represent the core tumor region, while regions highlighted by FLAIR may overestimate tumor size. Thus we annotated tumor regions on the T1 post-contrast scans, and ADC intensity values were extracted for pixels inside the tumor regions as defined on the T1 post-contrast scans. We fit a mixture-of-Gaussians (MG) model to the extracted pixels using the Expectation-Maximization (EM) algorithm, which produced a set of parameters (means, variances and mixture coefficients) for the MG model. This procedure was performed for each visit, resulting in a series of MG parameters. We studied the parameters fitted for ADC to see if they can be used as indicators of tumor progression. Additionally, we studied the ADC characteristics in the peri-tumoral region as identified by hyper-intensity on FLAIR scans. The results show that ADC histogram analysis of the tumor region supports the two-compartment model, which suggests that the low-ADC subregion corresponds to densely packed cancer cells while the higher-ADC region corresponds to a mixture of viable and necrotic cells with superimposed edema. Careful studies of the composition and relative volume of the two compartments in tumor
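    The EM fit of a two-component Gaussian mixture to scalar ADC values can be sketched as follows. The quartile-based initialization, fixed iteration count, and variance floor are simplifying assumptions, not the study's protocol.

```python
import math

def em_gmm2(x, iters=50):
    """EM for a two-component 1D Gaussian mixture; returns (weights, means, variances)."""
    xs = sorted(x)
    mu = [xs[len(xs) // 4], xs[3 * len(xs) // 4]]   # crude init from quartiles
    var = [1.0, 1.0]
    w = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each value.
        resp = []
        for v in x:
            p = [w[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(v - mu[k]) ** 2 / (2 * var[k])) for k in range(2)]
            s = sum(p)
            resp.append([pk / s for pk in p])
        # M-step: re-estimate weights, means, variances.
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(x)
            mu[k] = sum(r[k] * v for r, v in zip(resp, x)) / nk
            var[k] = max(sum(r[k] * (v - mu[k]) ** 2
                             for r, v in zip(resp, x)) / nk, 1e-6)
    return w, mu, var

w, mu, var = em_gmm2([1.0, 1.1, 0.9, 1.0, 3.0, 3.1, 2.9, 3.0])
print(round(mu[0], 1), round(mu[1], 1))   # 1.0 3.0
```

    On well-separated toy data like this, the two fitted means land on the two clusters, mirroring the low-ADC/high-ADC compartments described in the abstract.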

  3. Region of Interest Detection Based on Histogram Segmentation for Satellite Image

    NASA Astrophysics Data System (ADS)

    Kiadtikornthaweeyot, Warinthorn; Tatnall, Adrian R. L.

    2016-06-01

    High-resolution satellite imaging is considered an outstanding candidate for extracting the Earth's surface information. Feature extraction from an image is difficult because appropriate image segmentation techniques must be found and different methods combined to detect the Region of Interest (ROI) most effectively. This paper proposes techniques to classify objects in satellite images by using image processing methods on high-resolution satellite images. The systems to identify the ROI focus on forest, urban and agricultural areas. The proposed system is based on histograms of the image to classify objects using thresholding. The thresholding is performed by considering the behaviour of the histogram, mapping it to a particular region in the satellite image. The proposed model is based on histogram segmentation and morphology techniques. There are five main steps supporting each other: histogram classification, histogram segmentation, morphological dilation, morphological filling of image areas and holes, and ROI management. The methods to detect the ROI of satellite images based on histogram classification have been studied, implemented and tested. The algorithm is able to detect forest, urban and agricultural areas separately. The image segmentation methods can detect the ROI and reduce the size of the original image by discarding the unnecessary parts.
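    The histogram-thresholding idea can be illustrated with Otsu's criterion, which picks the gray level that maximizes between-class variance. This is a single two-class split over a toy histogram, a simplification of the paper's multi-region ROI logic.

```python
def otsu_threshold(hist):
    """Return the gray level t maximizing between-class variance (pixels <= t form one class)."""
    total = sum(hist)
    total_mean = sum(i * h for i, h in enumerate(hist)) / total
    best_t, best_var = 0, -1.0
    w0 = cum = 0.0
    for t, h in enumerate(hist[:-1]):
        w0 += h                      # weight of the low class
        cum += t * h                 # running first moment of the low class
        w1 = total - w0
        if w0 == 0 or w1 == 0:
            continue
        m0, m1 = cum / w0, (total_mean * total - cum) / w1
        between = w0 * w1 * (m0 - m1) ** 2
        if between > best_var:
            best_var, best_t = between, t
    return best_t

# Bimodal toy histogram over gray levels 0..6: split falls after level 1.
print(otsu_threshold([5, 5, 0, 0, 0, 5, 5]))   # 1
```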

  4. De-Striping for Tdiccd Remote Sensing Image Based on Statistical Features of Histogram

    NASA Astrophysics Data System (ADS)

    Gao, Hui-ting; Liu, Wei; He, Hong-yan; Zhang, Bing-xian; Jiang, Cheng

    2016-06-01

    To address the striping noise caused by the non-uniform response of remote sensing TDI CCDs, a novel de-striping method based on the statistical features of the image histogram is put forward. By analysing the distribution of the histograms, the centroid of the histogram is selected as an eigenvalue representing the uniformity of ground objects. The histogram centroids of the whole image and of each pixel are calculated first, and the differences between them are regarded as rough correction coefficients. Then, to avoid the sensitivity caused by a single parameter, and considering the strong continuity and correlation of ground objects between two adjacent pixels, the correlation coefficient of the histograms is introduced to reflect the similarity between them; the fine correction coefficient is obtained by searching around the rough correction coefficient. Additionally, in view of the influence of bright clouds on the histogram, automatic cloud detection based on multiple features, including grey level, texture, fractal dimension and edges, is used to pre-process the image. Two level-0 panchromatic images from the SJ-9A satellite with obvious stripe noise were processed by the proposed method to evaluate its performance. Results show that the visual quality of the images is improved because the stripe noise is entirely removed; we quantitatively analyse the result by calculating the non-uniformity, which reaches about 1% and is better than the histogram matching method.

  5. Genetic Diversity and Human Equality.

    ERIC Educational Resources Information Center

    Dobzhansky, Theodosius

    The idea of equality often, if not frequently, bogs down in confusion and apparent contradictions; equality is confused with identity, and diversity with inequality. It would seem that the easiest way to discredit the idea of equality is to show that people are innately, genetically, and, therefore, irremediably diverse and unlike. The snare is,…

  6. Information-Adaptive Image Encoding and Restoration

    NASA Technical Reports Server (NTRS)

    Park, Stephen K.; Rahman, Zia-ur

    1998-01-01

    The multiscale retinex with color restoration (MSRCR) has shown itself to be a very versatile automatic image enhancement algorithm that simultaneously provides dynamic range compression, color constancy, and color rendition. A number of algorithms exist that provide one or more of these features, but not all. In this paper we compare the performance of the MSRCR with techniques that are widely used for image enhancement. Specifically, we compare the MSRCR with color adjustment methods such as gamma correction and gain/offset application, histogram modification techniques such as histogram equalization and manual histogram adjustment, and other more powerful techniques such as homomorphic filtering and 'burning and dodging'. The comparison is carried out by testing the suite of image enhancement methods on a set of diverse images. We find that though some of these techniques work well for some of these images, only the MSRCR performs universally well on the test set.
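    One of the baselines compared here, global histogram equalization, can be sketched in a few lines for 8-bit data: each gray level is mapped through the normalized cumulative histogram. The flat pixel list is a toy input; real implementations operate on 2D images.

```python
def equalize(pixels, levels=256):
    """Global histogram equalization of a flat list of 8-bit gray values."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    cdf, acc = [], 0
    for h in hist:
        acc += h
        cdf.append(acc)
    cdf_min = next(c for c in cdf if c > 0)      # first occupied level
    n = len(pixels)
    lut = [round((c - cdf_min) / max(n - cdf_min, 1) * (levels - 1))
           for c in cdf]
    return [lut[p] for p in pixels]

print(equalize([50, 50, 100, 100, 150, 150, 200, 200]))
# [0, 0, 85, 85, 170, 170, 255, 255]: the four levels spread over the full range
```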

  7. Value of MR histogram analyses for prediction of microvascular invasion of hepatocellular carcinoma

    PubMed Central

    Huang, Ya-Qin; Liang, He-Yue; Yang, Zhao-Xia; Ding, Ying; Zeng, Meng-Su; Rao, Sheng-Xiang

    2016-01-01

    Abstract The objective is to explore the value of preoperative magnetic resonance (MR) histogram analyses in predicting microvascular invasion (MVI) of hepatocellular carcinoma (HCC). Fifty-one patients with histologically confirmed HCC who underwent diffusion-weighted and contrast-enhanced MR imaging were included. Histogram analyses were performed and the mean, variance, skewness, kurtosis, and 1st, 10th, 50th, 90th, and 99th percentiles were derived. Quantitative histogram parameters were compared between HCCs with and without MVI. Receiver operating characteristic (ROC) analyses were generated to compare the diagnostic performance of tumor size, histogram analyses of apparent diffusion coefficient (ADC) maps, and MR enhancement. The mean and the 1st, 10th, and 50th percentiles of the ADC maps, and the mean, variance, and 1st, 10th, 50th, 90th, and 99th percentiles of the portal venous phase (PVP) images were significantly different between the groups with and without MVI (P <0.05), with areas under the ROC curves (AUCs) of 0.66 to 0.74 for ADC and 0.76 to 0.88 for PVP. The largest AUC of PVP (1st percentile) showed significantly higher accuracy compared with that of the arterial phase (AP) or tumor size (P <0.001). MR histogram analyses, in particular the 1st percentile of the PVP images, hold promise for the prediction of MVI in HCC. PMID:27368028

  8. Perceived quality of wood images influenced by the skewness of image histogram

    NASA Astrophysics Data System (ADS)

    Katsura, Shigehito; Mizokami, Yoko; Yaguchi, Hirohisa

    2015-08-01

    The shape of image luminance histograms is related to material perception. We investigated how the luminance histogram contributed to improvements in the perceived quality of wood images by examining various natural wood and adhesive vinyl sheets with printed wood grain. In the first experiment, we visually evaluated the perceived quality of wood samples. In addition, we measured the colorimetric parameters of the wood samples and calculated statistics of image luminance. The relationship between visual evaluation scores and image statistics suggested that skewness and kurtosis affected the perceived quality of wood. In the second experiment, we evaluated the perceived quality of wood images with altered luminance skewness and kurtosis using a paired comparison method. Our result suggests that wood images are more realistic if the skewness of the luminance histogram is slightly negative.

  9. Medical Need, Equality, and Uncertainty.

    PubMed

    Horne, L Chad

    2016-10-01

    Many hold that distributing healthcare according to medical need is a requirement of equality. Most egalitarians believe, however, that people ought to be equal on the whole, by some overall measure of well-being or life-prospects; it would be a massive coincidence if distributing healthcare according to medical need turned out to be an effective way of promoting equality overall. I argue that distributing healthcare according to medical need is important for reducing individuals' uncertainty surrounding their future medical needs. In other words, distributing healthcare according to medical need is a natural feature of healthcare insurance; it is about indemnity, not equality. PMID:27196999

  11. Similarity Estimation Between DNA Sequences Based on Local Pattern Histograms of Binary Images.

    PubMed

    Kobori, Yusei; Mizuta, Satoshi

    2016-04-01

    Graphical representation of DNA sequences is one of the most popular techniques for alignment-free sequence comparison. Here, we propose a new method for the feature extraction of DNA sequences represented by binary images, by estimating the similarity between DNA sequences using the frequency histograms of local bitmap patterns of images. Our method shows linear time complexity for the length of DNA sequences, which is practical even when long sequences, such as whole genome sequences, are compared. We tested five distance measures for the estimation of sequence similarities, and found that the histogram intersection and Manhattan distance are the most appropriate ones for phylogenetic analyses.
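    The ingredients of the method can be sketched directly: a 16-bin histogram of 2x2 local bitmap patterns over a binary image, compared with the two distances the authors found most appropriate (histogram intersection and Manhattan). The tiny image is a made-up example.

```python
def pattern_histogram(img):
    """Normalized 16-bin histogram of 2x2 bit patterns in a binary image."""
    hist = [0] * 16
    for y in range(len(img) - 1):
        for x in range(len(img[0]) - 1):
            code = (img[y][x] << 3 | img[y][x + 1] << 2
                    | img[y + 1][x] << 1 | img[y + 1][x + 1])
            hist[code] += 1
    total = sum(hist)
    return [h / total for h in hist]

def intersection(a, b):
    return sum(min(x, y) for x, y in zip(a, b))   # 1.0 = identical

def manhattan(a, b):
    return sum(abs(x - y) for x, y in zip(a, b))  # 0.0 = identical

h = pattern_histogram([[0, 1], [1, 0]])
print(h[6])   # 1.0: the single 2x2 window has pattern 0110 = 6
```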

  12. Luck, Choice, and Educational Equality

    ERIC Educational Resources Information Center

    Calvert, John

    2015-01-01

    Harry Brighouse discusses two conceptions of educational equality. The first is a type of equality of opportunity, heavily influenced by the work of John Rawls, which he calls the meritocratic conception. According to this conception, an individual's educational prospects should not be influenced by factors such as their social class background.…

  13. Democracy, Equal Citizenship, and Education

    ERIC Educational Resources Information Center

    Callan, Eamonn

    2016-01-01

    Two appealing principles of educational distribution--equality and sufficiency--are comparatively assessed. The initial point of comparison is the distribution of civic educational goods. One reason to favor equality in educational distribution rather than sufficiency is the elimination of undeserved positional advantage in access to labor…

  14. Governing Equality: Mathematics for All?

    ERIC Educational Resources Information Center

    Diaz, Jennifer D.

    2013-01-01

    With the notion of governmentality, this article considers how the equal sign (=) in the U.S. math curriculum organizes knowledge of equality and inscribes cultural rules for thinking, acting, and seeing in the world. Situating the discussion within contemporary math reforms aimed at teaching mathematics for all, I draw attention to how the…

  15. Large-Scale Merging of Histograms using Distributed In-Memory Computing

    NASA Astrophysics Data System (ADS)

    Blomer, Jakob; Ganis, Gerardo

    2015-12-01

    Most high-energy physics analysis jobs are embarrassingly parallel except for the final merging of the output objects, which are typically histograms. Currently, the merging of output histograms scales badly. The running time for distributed merging depends not only on the overall number of bins but also on the number of partial histogram output files. That means that while the time to analyze data decreases linearly with the number of worker nodes, the time to merge the histograms in fact increases with the number of worker nodes. On the grid, merging jobs that take a few hours are not unusual. In order to improve the situation, we present a distributed and decentralized merging algorithm whose running time is independent of the number of worker nodes. We exploit the full bisection bandwidth of local networks and we keep all intermediate results in memory. We present benchmarks from an implementation using the parallel ROOT facility (PROOF) and RAMCloud, a distributed key-value store that keeps all data in DRAM.
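    The core reduction is bin-wise addition of partial histograms; merging them pairwise keeps the reduction depth logarithmic in the number of workers rather than linear. A minimal single-process sketch (the in-memory/RAMCloud and PROOF machinery is omitted):

```python
def merge_two(a, b):
    """Bin-wise sum of two equal-length histograms."""
    return [x + y for x, y in zip(a, b)]

def tree_merge(partials):
    """Pairwise tree reduction over a list of partial histograms."""
    layer = list(partials)
    while len(layer) > 1:
        nxt = [merge_two(layer[i], layer[i + 1])
               for i in range(0, len(layer) - 1, 2)]
        if len(layer) % 2:           # carry an odd leftover up a level
            nxt.append(layer[-1])
        layer = nxt
    return layer[0]

print(tree_merge([[1, 0], [0, 2], [3, 1]]))   # [4, 3]
```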

  16. DIF Testing with an Empirical-Histogram Approximation of the Latent Density for Each Group

    ERIC Educational Resources Information Center

    Woods, Carol M.

    2011-01-01

    This research introduces, illustrates, and tests a variation of IRT-LR-DIF, called EH-DIF-2, in which the latent density for each group is estimated simultaneously with the item parameters as an empirical histogram (EH). IRT-LR-DIF is used to evaluate the degree to which items have different measurement properties for one group of people versus…

  17. Pattern-histogram-based temporal change detection using personal chest radiographs

    NASA Astrophysics Data System (ADS)

    Ugurlu, Yucel; Obi, Takashi; Hasegawa, Akira; Yamaguchi, Masahiro; Ohyama, Nagaaki

    1999-05-01

    Accurate and reliable detection of temporal changes from a pair of images is of considerable interest in medical science. Traditional registration and subtraction techniques can be applied to extract temporal differences when the object is rigid or corresponding points are obvious. However, in radiological imaging, the loss of depth information, the elasticity of the object, the absence of clearly defined landmarks and three-dimensional positioning differences constrain the performance of conventional registration techniques. In this paper, we propose a new method to detect interval changes accurately without using an image registration technique. The method is based on the construction of a so-called pattern histogram and a comparison procedure. The pattern histogram is a graphic representation of the frequency counts of all allowable patterns in the multi-dimensional pattern vector space. The K-means algorithm is employed to partition the pattern vector space successively. Any difference in the pattern histograms implies that different patterns are involved in the scenes. In our experiment, a pair of chest radiographs of pneumoconiosis is employed and the changed histogram bins are visualized on both images. We found that the method can be used as an alternative way of detecting temporal change, particularly when precise image registration is not available.

  18. Histogram of Gabor phase patterns (HGPP): a novel object representation approach for face recognition.

    PubMed

    Zhang, Baochang; Shan, Shiguang; Chen, Xilin; Gao, Wen

    2007-01-01

    A novel object descriptor, histogram of Gabor phase pattern (HGPP), is proposed for robust face recognition. In HGPP, the quadrant-bit codes are first extracted from faces based on the Gabor transformation. Global Gabor phase pattern (GGPP) and local Gabor phase pattern (LGPP) are then proposed to encode the phase variations. GGPP captures the variations derived from the orientation changing of Gabor wavelet at a given scale (frequency), while LGPP encodes the local neighborhood variations by using a novel local XOR pattern (LXP) operator. They are both divided into the nonoverlapping rectangular regions, from which spatial histograms are extracted and concatenated into an extended histogram feature to represent the original image. Finally, the recognition is performed by using the nearest-neighbor classifier with histogram intersection as the similarity measurement. The features of HGPP lie in two aspects: 1) HGPP can describe the general face images robustly without the training procedure; 2) HGPP encodes the Gabor phase information, while most previous face recognition methods exploit the Gabor magnitude information. In addition, Fisher separation criterion is further used to improve the performance of HGPP by weighing the subregions of the image according to their discriminative powers. The proposed methods are successfully applied to face recognition, and the experiment results on the large-scale FERET and CAS-PEAL databases show that the proposed algorithms significantly outperform other well-known systems in terms of recognition rate.
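
The final matching step above, nearest-neighbor classification with histogram intersection as the similarity measure, can be sketched as follows (the gallery histograms and labels are invented for illustration; real HGPP features are long concatenations of spatial histograms):

```python
# A minimal sketch of the HGPP matching stage only: nearest-neighbor
# classification using histogram intersection as the similarity measure.
def intersection(h1, h2):
    """Histogram intersection: sum of bin-wise minima (higher = more similar)."""
    return sum(min(a, b) for a, b in zip(h1, h2))

def nearest_neighbor(query, gallery):
    """Return the gallery label whose histogram is most similar to the query."""
    return max(gallery, key=lambda label: intersection(query, gallery[label]))

gallery = {
    "person_A": [0.7, 0.2, 0.1, 0.0],   # hypothetical normalized histograms
    "person_B": [0.1, 0.1, 0.4, 0.4],
}
query = [0.6, 0.3, 0.1, 0.0]            # probe face histogram
print(nearest_neighbor(query, gallery))  # person_A
```

The Fisher-criterion weighting mentioned in the abstract would simply scale each sub-histogram's contribution to the intersection by a per-region discriminative weight.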

  19. Flat-histogram methods in quantum Monte Carlo simulations: Application to the t-J model

    NASA Astrophysics Data System (ADS)

    Diamantis, Nikolaos G.; Manousakis, Efstratios

    2016-09-01

We show that flat-histogram techniques can be usefully applied to the sampling of quantum Monte Carlo simulations in order to improve the statistical quality of the results at long imaginary time or low excitation energy. Typical imaginary-time correlation functions calculated in quantum Monte Carlo are subject to exponentially growing errors as the range of imaginary time grows, and this smears the information on the low-energy excitations. We show that we can extract the low-energy physics by modifying the Monte Carlo sampling technique to one in which configurations that contribute to making the histogram of certain quantities flat are promoted. We apply the diagrammatic Monte Carlo (diag-MC) method to the motion of a single hole in the t-J model and show that the implementation of flat-histogram techniques allows us to calculate the Green's function over a wide range of imaginary time. In addition, we show that applying the flat-histogram technique alleviates the “sign” problem associated with the simulation of the single-hole Green's function at long imaginary time.

  20. Effect of molecular organization on the image histograms of polarization SHG microscopy

    PubMed Central

    Psilodimitrakopoulos, Sotiris; Amat-Roldan, Ivan; Loza-Alvarez, Pablo; Artigas, David

    2012-01-01

Owing to its polarization dependency, second harmonic generation (PSHG) microscopy has been proven capable of structurally characterizing molecular architectures in different biological samples. By exploiting this polarization dependency of the SHG signal in every pixel of the image, average quantitative structural information can be retrieved in the form of PSHG image histograms. In the present study we experimentally show how the PSHG image histograms can be affected by the organization of the SHG-active molecules. Our experimental scenario is based on two inherent properties of starch granules. First, we take advantage of the radial organization of amylopectin molecules (the SHG source in starch) to attribute shifts of the image histograms to the existence of molecules tilted out of the plane. Second, we use the property of starch to organize upon hydration to demonstrate that the degree of structural order at the molecular level affects the width of the PSHG image histograms: the narrower the histogram, the more organized the molecules in the sample, resulting in a reliable method to measure order. The implication of this finding is crucial to the interpretation of PSHG images used, for example, in tissue diagnostics. PMID:23082306

  1. Human detection by quadratic classification on subspace of extended histogram of gradients.

    PubMed

    Satpathy, Amit; Jiang, Xudong; Eng, How-Lung

    2014-01-01

This paper proposes a quadratic classification approach on the subspace of Extended Histogram of Gradients (ExHoG) for human detection. By investigating the limitations of Histogram of Gradients (HG) and Histogram of Oriented Gradients (HOG), ExHoG is proposed as a new feature for human detection. ExHoG alleviates the problem of discrimination between a dark object against a bright background and vice versa inherent in HG. It also resolves an issue of HOG whereby gradients of opposite directions in the same cell are mapped into the same histogram bin. We reduce the dimensionality of ExHoG using Asymmetric Principal Component Analysis (APCA) for improved quadratic classification. APCA also addresses the asymmetry issue in training sets of human detection, where there are far fewer human samples than non-human samples. Our proposed approach is tested on three established benchmarking data sets--INRIA, Caltech, and Daimler--using a modified Minimum Mahalanobis distance classifier. Results indicate that the proposed approach outperforms current state-of-the-art human detection methods. PMID:23708804
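
One plausible reading of the ExHoG construction, sketched from the abstract alone (an assumption, not the paper's exact definition): from a signed K-bin Histogram of Gradients, keep both the sum and the absolute difference of each pair of opposite bins, so contrast reversal is normalized away while opposite gradient directions stay distinguishable:

```python
# A hypothetical sketch of the ExHoG idea described above: the sums
# reproduce unsigned HOG bins; the absolute differences restore the
# opposite-direction information that HOG folds away.
def exhog(hg):
    K = len(hg)                  # signed-orientation histogram, K even
    half = K // 2
    sums = [hg[k] + hg[k + half] for k in range(half)]
    diffs = [abs(hg[k] - hg[k + half]) for k in range(half)]
    return sums + diffs

hg_cell = [3, 0, 0, 0, 3, 0, 0, 0]      # balanced opposite gradients
hg_one_way = [6, 0, 0, 0, 0, 0, 0, 0]   # all gradients one direction
hg_reversed = [0, 0, 0, 0, 6, 0, 0, 0]  # contrast-reversed version
a, b, c = exhog(hg_cell), exhog(hg_one_way), exhog(hg_reversed)
print(a[:4] == b[:4], a == b, b == c)   # True False True
```

The first check shows the HOG part (sums) cannot separate the two cells, the second that ExHoG can, and the third that a dark-on-bright vs. bright-on-dark reversal yields identical features.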

  2. Three-dimensional histogram visualization in different color spaces and applications

    NASA Astrophysics Data System (ADS)

    Marcu, Gabriel G.; Abe, Satoshi

    1995-10-01

A visualization procedure for the 3D histogram of color images is presented. The procedure assumes that the histogram is available as a table that associates each pixel color with the number of its occurrences in the image. The procedure runs for the RGB, YMC, HSV, HSL, L*a*b, and L*u*v color spaces and is easily extendable to other color spaces if the analytical form of the color transformations is available. Each histogram value is represented in the color space as a colored ball, in a position corresponding to the place of the color in the space. A simple drawing procedure is used instead of more complicated 3D rendering techniques. The 3D histogram visualization offers a clear and intuitive representation of the color distribution of the image. The procedure is applied to derive a clusterization technique for color classification and visualize its results, to display comparatively the gamut of different color devices, and to detect the misalignment of the RGB planes of a color image. Diagrams illustrating the visualization procedure are presented for each application.
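
A minimal sketch of the histogram table the procedure takes as input, a mapping from each color to its occurrence count; the ball-radius remark is one plausible nonlinear normalization, not the paper's formula:

```python
# Build the color -> count table that the 3D histogram viewer assumes:
# one colored ball per entry, placed at the color's position in the space.
from collections import Counter

def color_histogram(pixels):
    """Count occurrences of each (r, g, b) pixel color."""
    return Counter(pixels)

pixels = [(255, 0, 0), (255, 0, 0), (0, 255, 0), (0, 0, 255)]
hist = color_histogram(pixels)
print(hist[(255, 0, 0)])  # 2
# Ball radius could encode the count, e.g. proportional to count ** (1 / 3),
# so that ball volume rather than radius tracks frequency (one plausible
# nonlinear normalization).
```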

  3. 3D histogram visualization in different color spaces with application in color clustering classification

    NASA Astrophysics Data System (ADS)

    Marcu, Gabriel G.; Abe, Satoshi

    1995-04-01

The paper presents a dynamic visualization procedure for the 3D histogram of color images. The procedure runs for the device-dependent RGB, YMC, HSV, and HSL color spaces and for the device-independent Lab and Luv color spaces, and is easily extendable to other color spaces if the analytical form of the color transformations is available. Each histogram value is represented in the color space as a colored ball, in a position corresponding to the place of the color in the color space. The paper presents the procedures for nonlinear ball normalization, ordering of drawing, space edges drawing, and translation, scaling, and rotation of the histogram. The 3D histogram visualization procedure can be used in the different applications described in the second part of the paper. It makes it possible to obtain a clear representation of the range of colors of an image, to derive and compare the efficiency of different clusterization procedures for color classification, to display comparatively the gamut of different color devices, to select the color space for an optimal mapping procedure of the out-of-gamut colors that minimizes the hue error, and to detect misalignment of the RGB planes in a sequential process.

  4. Histograms of the unitary evoked potential of the mouse diaphragm show multiple peaks.

    PubMed

    Kriebel, M E; Llados, F; Matteson, D R

    1982-01-01

1. Two classes of miniature end-plate potentials (m.e.p.p.s) were recorded from diaphragm neuromuscular junctions. Amplitude histograms of both classes had multiple peaks that were integral multiples of the smallest peak (s-m.e.p.p.s). The smaller m.e.p.p.s formed the first three or four peaks of histograms and the number of m.e.p.p.s (skew-m.e.p.p.s) in each peak decreased, forming an over-all skewed distribution. The larger m.e.p.p.s (bell-m.e.p.p.s) formed a more-or-less bell-shaped distribution. The distribution of m.e.p.p.s varied from mainly skew- to mainly bell-m.e.p.p.s. In young adult mice the number of subunits composing the classical m.e.p.p.s varied between ten and fifteen at room temperature; at higher temperatures the range was from three to ten subunits. 2. End-plate potentials (e.p.p.s) were reduced with cobalt ions (ca. 4 mM) until most nerve impulses failed to release transmitter. The amplitudes of 'unitary evoked potentials' were of the bell-m.e.p.p. class and histograms show integral multiple peaks that correspond to the peaks in histograms of the bell-m.e.p.p.s. 3. The peaks in both m.e.p.p. and unitary e.p.p. histograms remained in the same position throughout the recording period and became more distinct as the sample size increased. 4. The variance of the s-m.e.p.p. was estimated from the noise and measurement error and the variance of all peaks in the histograms. Most variance of the first peak (s-m.e.p.p.) was due to noise and measurement error. 5. The integral peaks in the m.e.p.p. and 'unitary evoked potential' histograms are predicted with a probability density model based on the estimated variance of the s-m.e.p.p. and the assumption that larger potentials are composed of subunits the size of s-m.e.p.p.s. The data and model support the hypothesis that m.e.p.p.s and unitary potentials are composed of subunits.

  5. Loudspeaker equalization for auditory research.

    PubMed

    MacDonald, Justin A; Tran, Phuong K

    2007-02-01

    The equalization of loudspeaker frequency response is necessary to conduct many types of well-controlled auditory experiments. This article introduces a program that includes functions to measure a loudspeaker's frequency response, design equalization filters, and apply the filters to a set of stimuli to be used in an auditory experiment. The filters can compensate for both magnitude and phase distortions introduced by the loudspeaker. A MATLAB script is included in the Appendix to illustrate the details of the equalization algorithm used in the program.
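
A minimal sketch of one way to build such an equalization filter (not the article's MATLAB program): invert the measured response, magnitude and phase, with a small regularization term, applied here to a toy impulse response:

```python
import numpy as np

def inverse_filter(impulse_response, n_taps=64, eps=1e-3):
    """Regularized frequency-domain inverse: H_inv = conj(H) / (|H|^2 + eps),
    compensating both magnitude and phase distortion."""
    H = np.fft.rfft(impulse_response, n_taps)
    H_inv = np.conj(H) / (np.abs(H) ** 2 + eps)
    return np.fft.irfft(H_inv, n_taps)

# Toy loudspeaker: attenuates by 0.5 and delays the signal by one sample.
h = np.zeros(8)
h[1] = 0.5
g = inverse_filter(h)
equalized = np.convolve(h, g)          # overall system should be a pure delay
peak = np.argmax(np.abs(equalized))
print(peak, round(equalized[peak], 2))  # 64 1.0 (pure delay restored)
```

The circular inverse realizes the "advance" needed to undo the delay as a long delay itself, so the equalized chain is a unit pulse delayed by the filter length, which is the usual delayed-target formulation for loudspeaker equalization.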

  6. Principal component analysis of the CT density histogram to generate parametric response maps of COPD

    NASA Astrophysics Data System (ADS)

    Zha, N.; Capaldi, D. P. I.; Pike, D.; McCormack, D. G.; Cunningham, I. A.; Parraga, G.

    2015-03-01

Pulmonary x-ray computed tomography (CT) may be used to characterize emphysema and airways disease in patients with chronic obstructive pulmonary disease (COPD). One analysis approach, parametric response mapping (PRM), utilizes registered inspiratory and expiratory CT image volumes and CT-density-histogram thresholds, but there is no consensus regarding the threshold values used, or their clinical meaning. Principal component analysis (PCA) of the CT density histogram can be exploited to quantify emphysema using data-driven CT-density-histogram thresholds. Thus, the objective of this proof-of-concept demonstration was to develop a PRM approach using PCA-derived thresholds in COPD patients and ex-smokers without airflow limitation. Methods: Fifteen COPD ex-smokers and 5 normal ex-smokers were evaluated. Thoracic CT images were acquired at full inspiration and full expiration, and these images were non-rigidly co-registered. PCA was performed on the CT density histograms, and the components with eigenvalues greater than one were summed. Since the values of the principal component curve correlate directly with the variability in the sample, the maximum and minimum points on the curve were used as threshold values for the PCA-adjusted PRM technique. Results: A significant correlation was determined between conventional and PCA-adjusted PRM and 3He MRI apparent diffusion coefficient (p<0.001), CT RA950 (p<0.0001), and 3He MRI ventilation defect percent, a measurement of both small airways disease (p=0.049 and p=0.06, respectively) and emphysema (p=0.02). Conclusions: PRM generated using PCA thresholds of the CT density histogram showed significant correlations with CT and 3He MRI measurements of emphysema, but not airways disease.
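
The PCA-threshold idea can be sketched on synthetic histograms as follows; the Gaussian density shapes, HU range, and subject count are invented for illustration, not the study's data:

```python
import numpy as np

# Sketch: the first principal component of a set of CT density histograms
# shows where between-subject variability concentrates; the HU bins where
# that curve peaks and dips serve as data-driven PRM thresholds.
def pca_thresholds(histograms, bin_centers):
    X = np.asarray(histograms, dtype=float)
    X = X - X.mean(axis=0)                  # mean-center before PCA
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    pc1 = Vt[0]
    if pc1[np.abs(pc1).argmax()] < 0:       # fix the arbitrary SVD sign
        pc1 = -pc1
    return bin_centers[pc1.argmax()], bin_centers[pc1.argmin()]

rng = np.random.default_rng(0)
bins = np.arange(-1000, -500, 10)           # toy HU bin centers
base = np.exp(-((bins + 700) ** 2) / 2e4)   # shared parenchyma density shape
subjects = [base + 0.3 * w * np.exp(-((bins + 950) ** 2) / 5e3)
            for w in rng.random(20)]        # varying emphysema-like mode
hi, lo = pca_thresholds(subjects, bins)
print(hi)  # -950: the bin where between-subject variability concentrates
```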

  7. Multiple point least squares equalization in a room

    NASA Technical Reports Server (NTRS)

    Elliott, S. J.; Nelson, P. A.

    1988-01-01

Equalization filters designed to minimize the mean square error between a delayed version of the original electrical signal and the equalized response at a point in a room have previously been investigated. In general, such a strategy degrades the response at positions in a room away from the equalization point. A method is presented for designing an equalization filter by adjusting the filter coefficients to minimize the sum of the squares of the errors between the equalized responses at multiple points in the room and delayed versions of the original electrical signal. Such an equalization filter can give a more uniform frequency response over a greater volume of the enclosure than can the single-point equalizer above. Computer simulation results are presented for equalizing the frequency responses from a loudspeaker to various typical ear positions, in a room with dimensions and acoustic damping typical of a car interior, using the two approaches outlined above. Adaptive filter algorithms, which can automatically adjust the coefficients of a digital equalization filter to achieve this minimization, are also discussed.
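
The multiple-point least-squares design can be sketched as follows, assuming toy room responses; the coefficients are found here by a direct least-squares solve over the stacked per-point error terms rather than by the adaptive algorithms mentioned above:

```python
import numpy as np

def multipoint_equalizer(room_responses, n_taps, delay):
    """Least-squares FIR equalizer over several listening points: minimize
    the sum of squared errors between each equalized response and a pure
    delay of the source signal."""
    rows, targets = [], []
    for h in room_responses:
        L = len(h) + n_taps - 1
        C = np.zeros((L, n_taps))
        for k in range(n_taps):            # convolution matrix of h
            C[k:k + len(h), k] = h
        d = np.zeros(L)
        d[delay] = 1.0                     # delayed unit-pulse target
        rows.append(C)
        targets.append(d)
    A = np.vstack(rows)
    b = np.concatenate(targets)
    w, *_ = np.linalg.lstsq(A, b, rcond=None)
    return w

# Two slightly different ear-position responses of the same loudspeaker.
h1 = np.array([1.0, 0.4, 0.1])
h2 = np.array([1.0, 0.5, 0.05])
w = multipoint_equalizer([h1, h2], n_taps=16, delay=4)
e1, e2 = np.convolve(h1, w), np.convolve(h2, w)
print(int(np.argmax(np.abs(e1))), int(np.argmax(np.abs(e2))))  # 4 4
```

A single filter is a compromise between the points, which is exactly why the multi-point criterion gives a more uniform, if not perfect, response across the listening volume.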

  8. [Fractal dimension and histogram method: algorithm and some preliminary results of noise-like time series analysis].

    PubMed

    Pancheliuga, V A; Pancheliuga, M S

    2013-01-01

In the present work a methodological background for the histogram method of time series analysis is developed. The connection between the shapes of smoothed histograms constructed from short segments of time series of fluctuations and the fractal dimension of the segments is studied. It is shown that the fractal dimension possesses all the main properties of the histogram method. Based on this, a further development of the fractal dimension determination algorithm is proposed. This algorithm allows a more precise determination of the fractal dimension by using the "all possible combination" method. The application of the method to noise-like time series analysis leads to results which could previously be obtained only by means of the histogram method based on human expert comparisons of histogram shapes. PMID:23755565

  9. Equal Education and the Law

    ERIC Educational Resources Information Center

    Shanks, Hershel

    1970-01-01

A number of court cases are cited which trace the development of various definitions and interpretations of the equal protection clause of the Fourteenth Amendment to the Constitution as would be applicable to "inadequate" schools. (DM)

  10. Electronegativity Equalization with Pauling Units.

    ERIC Educational Resources Information Center

    Bratsch, Steven G.

    1984-01-01

    Discusses electronegativity equalization using Pauling units. Although Pauling has qualitatively defined electronegativity as the power of an atom in a molecule to attract electrons to itself, Pauling electronegativities are treated in this paper as prebonded, isolated-atom quantities. (JN)

  11. Electronegativity Equalization and Partial Charge

    ERIC Educational Resources Information Center

    Sanderson, R. T.

    1974-01-01

This article elaborates the relationship between covalent radius, homonuclear bond energy, and electronegativity, and sets the background for bond energy calculation by discussing the nature of heteronuclear covalent bonding on the basis of electronegativity equalization and partial charge. (DT)

  12. Equal Pay for Comparable Work.

    ERIC Educational Resources Information Center

    Rothman, Nancy Lloyd; Rothman, Daniel A.

    1980-01-01

    Examines the legal battleground upon which one struggle for the equality of women is being fought. Updates a civil rights decision of crucial importance to nursing--Lemons v City and County of Denver. (JOW)

  13. Equal Employment + Equal Pay = Multiple Problems for Colleges and Universities

    ERIC Educational Resources Information Center

    Steinbach, Sheldon Elliot; Reback, Joyce E.

    1974-01-01

    Issues involved in government regulation of university employment practices are discussed: confidentiality of records, pregnancy as a disability, alleged discrimination in benefits, tests and other employment criteria, seniority and layoff, reverse discrimination, use of statistics for determination of discrimination, and the Equal Pay Act. (JT)

  14. High capacity reversible watermarking for audio by histogram shifting and predicted error expansion.

    PubMed

    Wang, Fei; Xie, Zhaoxin; Chen, Zuo

    2014-01-01

Being reversible, the watermarking information embedded in audio signals can be extracted while the original audio data achieve lossless recovery. Currently, the few reversible audio watermarking algorithms are confronted with the following problems: a relatively low signal-to-noise ratio (SNR) of the embedded audio, a large amount of auxiliary embedded location information, and the absence of accurate capacity control. In this paper, we present a novel reversible audio watermarking scheme based on improved prediction error expansion and histogram shifting. First, we use a differential evolution algorithm to optimize the prediction coefficients and then apply prediction error expansion to output the stego data. Second, in order to reduce the length of the location map, we introduce a histogram shifting scheme. Meanwhile, the prediction error modification threshold for a given embedding capacity can be computed by the proposed scheme. Experiments show that this algorithm improves the SNR of the embedded audio signals and the embedding capacity, drastically reduces the location map length, and enhances capacity control.
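
The histogram shifting idea alone (without the prediction error expansion stage) can be sketched on integer samples as follows; the peak/zero-bin bookkeeping is a textbook version, not the paper's exact scheme:

```python
# Minimal reversible histogram-shifting sketch: shift the bins between the
# peak and a zero bin up by one to make room, then embed one bit per
# peak-valued sample (bit 1 -> peak+1, bit 0 -> peak). Fully invertible.
from collections import Counter

def embed(samples, bits):
    hist = Counter(samples)
    peak = max(hist, key=hist.get)
    zero = peak + 1
    while hist.get(zero):                 # nearest empty bin above the peak
        zero += 1
    out, it = [], iter(bits)
    for s in samples:
        if peak < s < zero:
            out.append(s + 1)             # shift to make room next to peak
        elif s == peak:
            out.append(s + next(it, 0))   # embed one bit (capacity = peak count)
        else:
            out.append(s)
    return out, (peak, zero)

def extract(marked, key):
    peak, zero = key
    bits, restored = [], []
    for s in marked:
        if s == peak:
            bits.append(0); restored.append(peak)
        elif s == peak + 1:
            bits.append(1); restored.append(peak)
        elif peak + 1 < s <= zero:
            restored.append(s - 1)        # undo the shift
        else:
            restored.append(s)
    return bits, restored

audio = [3, 5, 5, 5, 6, 7, 5, 4]
marked, key = embed(audio, [1, 0, 1, 1])
bits, restored = extract(marked, key)
print(bits, restored == audio)  # [1, 0, 1, 1] True
```

Prediction error expansion would apply the same shifting logic to the histogram of prediction errors instead of raw samples, which is where the capacity and SNR gains come from.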

  15. High Capacity Reversible Watermarking for Audio by Histogram Shifting and Predicted Error Expansion

    PubMed Central

    Wang, Fei; Chen, Zuo

    2014-01-01

Being reversible, the watermarking information embedded in audio signals can be extracted while the original audio data achieve lossless recovery. Currently, the few reversible audio watermarking algorithms are confronted with the following problems: a relatively low signal-to-noise ratio (SNR) of the embedded audio, a large amount of auxiliary embedded location information, and the absence of accurate capacity control. In this paper, we present a novel reversible audio watermarking scheme based on improved prediction error expansion and histogram shifting. First, we use a differential evolution algorithm to optimize the prediction coefficients and then apply prediction error expansion to output the stego data. Second, in order to reduce the length of the location map, we introduce a histogram shifting scheme. Meanwhile, the prediction error modification threshold for a given embedding capacity can be computed by the proposed scheme. Experiments show that this algorithm improves the SNR of the embedded audio signals and the embedding capacity, drastically reduces the location map length, and enhances capacity control. PMID:25097883

  16. Liver fibrosis grading using multiresolution histogram information in real-time elastography

    NASA Astrophysics Data System (ADS)

    Albouy-Kissi, A.; Sarry, L.; Massoulier, S.; Bonny, C.; Randl, K.; Abergel, A.

    2010-03-01

Despite many limitations, liver biopsy remains the gold standard method for grading and staging liver fibrosis. Several modalities have been developed for a non-invasive assessment of liver diseases. Real-time elastography may constitute a true alternative to liver biopsy by providing an image of the tissue elasticity distribution correlated with the fibrosis grade. In this paper, we investigate a new approach to the assessment of liver fibrosis by the classification of fibrosis morphometry. The multiresolution histogram, based on a combination of intensity and texture features, has been tested as the feature space, and the ability of such multiresolution histograms to discriminate fibrosis grade has been demonstrated. The results have been tested on seventeen patients who underwent real-time elastography and FibroScan examinations.

  17. Sample training based wildfire segmentation by 2D histogram θ-division with minimum error.

    PubMed

    Zhao, Jianhui; Dong, Erqian; Sun, Mingui; Jia, Wenyan; Zhang, Dengyi; Yuan, Zhiyong

    2013-01-01

A novel wildfire segmentation algorithm is proposed with the help of sample-training-based 2D histogram θ-division and minimum error. θ-division methods based on the minimum error principle and the 2D color histogram were presented recently, but the application of prior knowledge to them has not been explored. For the specific problem of wildfire segmentation, we collect sample images with manually labeled fire pixels. We then define the probability function of division error to evaluate θ-division segmentations, and the optimal angle θ is determined by sample training. Performances in different color channels are compared, and the most suitable channel is selected. To further improve the accuracy, a combination approach is presented that uses both θ-division and other segmentation methods such as GMM. Our approach is tested on real images, and the experiments demonstrate its efficiency for wildfire segmentation.
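
A much-simplified sketch of the sample-training step: brute-force the angle θ and threshold of a dividing line in a 2-D feature space so as to minimize the division error on labeled samples. The features, grid, and labels below are invented, and the paper's probability function of division error is more elaborate than this direct error count:

```python
import math

# Project 2-D pixel features onto direction theta and search the
# (theta, threshold) pair with the fewest misclassified training samples.
def train_theta(samples):
    """samples: list of ((f1, f2), label) with label 1 = fire."""
    best = None
    for deg in range(0, 180, 5):
        t = math.radians(deg)
        proj = [(f1 * math.cos(t) + f2 * math.sin(t), y)
                for (f1, f2), y in samples]
        for thr, _ in proj:                  # candidate thresholds
            err = sum((p >= thr) != bool(y) for p, y in proj)
            err = min(err, len(proj) - err)  # either side may be "fire"
            if best is None or err < best[0]:
                best = (err, deg, thr)
    return best

fire = [((220, 60), 1), ((240, 80), 1), ((200, 50), 1)]
other = [((60, 120), 0), ((80, 100), 0), ((90, 140), 0)]
err, theta, thr = train_theta(fire + other)
print(err)  # 0: the trained theta-division separates the samples
```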

  18. Accelerating atomic-level protein simulations by flat-histogram techniques

    NASA Astrophysics Data System (ADS)

Jónsson, Sigurður Æ.; Mohanty, Sandipan; Irbäck, Anders

    2011-09-01

    Flat-histogram techniques provide a powerful approach to the simulation of first-order-like phase transitions and are potentially very useful for protein studies. Here, we test this approach by implicit solvent all-atom Monte Carlo (MC) simulations of peptide aggregation, for a 7-residue fragment (GIIFNEQ) of the Cu/Zn superoxide dismutase 1 protein (SOD1). In simulations with 8 chains, we observe two distinct aggregated/non-aggregated phases. At the midpoint temperature, these phases coexist, separated by a free-energy barrier of height 2.7 kBT. We show that this system can be successfully studied by carefully implemented flat-histogram techniques. The frequency of barrier crossing, which is low in conventional canonical simulations, can be increased by turning to a two-step procedure based on the Wang-Landau and multicanonical algorithms.
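
The flat-histogram machinery can be illustrated on a toy system far simpler than the all-atom peptide model: a Wang-Landau run estimating the density of states of independent spins, where the exact answer is a binomial coefficient. The sweep counts and refinement schedule below are arbitrary sketch choices:

```python
import math
import random

def wang_landau(n_spins=10, ln_f_final=1e-4, steps_per_stage=20000, seed=1):
    """Wang-Landau estimate of ln g(E) for n independent spins,
    with E = number of up spins (exact: g(E) = C(n, E))."""
    rng = random.Random(seed)
    spins = [0] * n_spins
    E = 0
    ln_g = [0.0] * (n_spins + 1)
    ln_f = 1.0
    while ln_f > ln_f_final:
        # a fixed step budget per stage stands in for the usual flatness check
        for _ in range(steps_per_stage):
            i = rng.randrange(n_spins)
            E_new = E + (1 if spins[i] == 0 else -1)
            # accept with min(1, g(E) / g(E_new)): drives the E-histogram flat
            if math.log(rng.random()) < ln_g[E] - ln_g[E_new]:
                spins[i] ^= 1
                E = E_new
            ln_g[E] += ln_f
        ln_f /= 2                          # refine the modification factor
    return ln_g

ln_g = wang_landau()
est = ln_g[5] - ln_g[0]                    # should approach ln C(10, 5)
print(round(math.log(math.comb(10, 5)), 2))  # 5.53, the exact target
```

The promotion of rarely visited states is exactly what lets the simulation cross free-energy barriers (or, in the abstract's setting, reach long imaginary times) far more often than canonical sampling would.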

  19. Adapting to the Revolution of Equal Opportunity for the Handicapped.

    ERIC Educational Resources Information Center

    Bailey, Cornelia W.

    1979-01-01

    Federal regulations regarding the handicapped pose problems for recipients of federal aid, but higher education's reactions have been more positive than negative. The principle problems seem to be compliance costs, the need for interpretation of the regulations, and difficulties in the areas of admissions and academic requirements. (Author/JMD)

  20. A Perspective on Diversity, Equality and Equity in Swedish Schools

    ERIC Educational Resources Information Center

    Johansson, Olof; Davis, Anna; Geijer, Luule

    2007-01-01

    This study presents policy and theory as they apply to diversity, equality and equity in Swedish social and educational policy. All education in Sweden should, according to the curriculum (Lpo 94, 1994, p. 5) be of equivalent value, irrespective of where in the country it is provided and education should be adapted to each pupil's circumstances…

  1. 41 CFR 60-741.5 - Equal opportunity clause.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., demotion, transfer, layoff, termination, right of return from layoff and rehiring; iii. Rates of pay or any... opportunity clause in each of its subcontracts subject to this part. (c) Adaption of language. Such necessary changes in language may be made to the equal opportunity clause as shall be appropriate to...

  2. Temporal evolution for the phase histogram of ECG during human ventricular fibrillation

    NASA Astrophysics Data System (ADS)

    Wu, Ming-Chya; Struzik, Zbigniew R.; Watanabe, Eiichi; Yamamoto, Yoshiharu; Hu, Chin-Kun

    2007-07-01

    A novel approach to momentary/instantaneous morphological assessment of phase histograms, extending phase statistics analysis, is used to investigate electrocardiograms during ventricular fibrillation (VF) in humans. By using empirical mode decomposition (EMD) and the Hilbert transform, we calculate the instantaneous phase of intrinsic mode functions (IMFs) in Holter data from 16 individuals, and construct the corresponding momentary phase histograms, enabling us to inspect the evolution of the waveform of the time series. A measure defined as the difference between the integrals of the probability distribution density of phase in different regions is then used to characterize the morphology of the momentary histograms and their temporal evolution. We find that the measure of morphology difference allows near perfect classification of the VF data into survivor and non-survivor groups. The technique offers a new possibility to improve the effectiveness of intervention in defibrillation treatment and limit the negative side effects of unnecessary interventions. The approach can be implemented in real time and should provide a useful method for early evaluation of (fatal) VF.
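
The phase-histogram step alone can be sketched with an FFT-based analytic signal; EMD is omitted here, and a clean sine stands in for an IMF, so the region boundaries and bin count are illustrative assumptions:

```python
import numpy as np

def instantaneous_phase(x):
    """Analytic-signal phase via the FFT-based Hilbert transform trick."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1
    h[1:(n + 1) // 2] = 2      # double positive frequencies
    if n % 2 == 0:
        h[n // 2] = 1
    return np.angle(np.fft.ifft(X * h))

t = np.linspace(0, 1, 1000, endpoint=False)
x = np.sin(2 * np.pi * 5 * t)              # clean oscillation as a stand-in IMF
phase = instantaneous_phase(x)
counts, edges = np.histogram(phase, bins=16, range=(-np.pi, np.pi))
p = counts / len(x)                        # momentary phase histogram
# Morphology measure: probability mass in one phase region minus another.
measure = p[4:12].sum() - (p[:4].sum() + p[12:].sum())
print(abs(measure) < 0.1)  # True: near 0 for this uniform-phase oscillation
```

For a regular oscillation the phase histogram is nearly uniform; during VF the histogram morphology deviates, which is what the region-difference measure quantifies over time.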

  3. Glioma grading using apparent diffusion coefficient map: application of histogram analysis based on automatic segmentation.

    PubMed

    Lee, Jeongwon; Choi, Seung Hong; Kim, Ji-Hoon; Sohn, Chul-Ho; Lee, Sooyeul; Jeong, Jaeseung

    2014-09-01

The accurate diagnosis of glioma subtypes is critical for appropriate treatment, but conventional histopathologic diagnosis often exhibits significant intra-observer variability and sampling error. The aim of this study was to investigate whether histogram analysis using an automatically segmented region of interest (ROI), excluding cystic or necrotic portions, could improve the differentiation between low-grade and high-grade gliomas. Thirty-two patients (nine low-grade and 23 high-grade gliomas) were included in this retrospective investigation. The outer boundaries of the entire tumors were manually drawn in each section of the contrast-enhanced T1-weighted MR images. We excluded cystic or necrotic portions from the entire tumor volume. The histogram analyses were performed within the ROI on normalized apparent diffusion coefficient (ADC) maps. To evaluate the contribution of the proposed method to glioma grading, we compared the area under the receiver operating characteristic (ROC) curves. We found that an ROI excluding cystic or necrotic portions was more useful for glioma grading than was an entire tumor ROI. In the case of the fifth percentile values of the normalized ADC histogram, the area under the ROC curve for the tumor ROIs excluding cystic or necrotic portions was significantly higher than that for the entire tumor ROIs (p < 0.005). The automatic segmentation of a cystic or necrotic area probably improves the ability to differentiate between high- and low-grade gliomas on an ADC map. PMID:25042540
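
The ROI-restricted percentile feature can be sketched as follows; the ADC values, mask, and nearest-rank percentile rule are illustrative assumptions, not the study's data or software:

```python
# Sketch of the histogram feature above: the 5th percentile of normalized
# ADC values inside a segmented ROI mask, with cystic/necrotic voxels
# excluded by the mask.
def percentile(values, p):
    """Nearest-rank percentile of a list of values."""
    s = sorted(values)
    k = max(0, min(len(s) - 1, round(p / 100 * len(s)) - 1))
    return s[k]

adc_map = [[0.8, 1.2, 3.0],
           [0.9, 1.1, 2.9],
           [1.0, 1.3, 3.1]]   # last column: necrotic, spuriously high ADC
roi =     [[1, 1, 0],
           [1, 1, 0],
           [1, 1, 0]]         # automatic segmentation excludes that column
voxels = [v for row_v, row_m in zip(adc_map, roi)
          for v, m in zip(row_v, row_m) if m]
print(percentile(voxels, 5))  # 0.8
```

Including the necrotic column would drag the upper histogram tail upward without changing the low percentiles much, which is why masking matters most for the higher-order histogram features.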

  4. A comparison of histogram distance metrics for content-based image retrieval

    NASA Astrophysics Data System (ADS)

    Zhang, Qianwen; Canosa, Roxanne L.

    2014-03-01

The type of histogram distance metric selected for a CBIR query varies greatly and will affect the accuracy of the retrieval results. This paper compares the retrieval results of a variety of commonly used CBIR distance metrics: the Euclidean distance, the Manhattan distance, the vector cosine angle distance, histogram intersection distance, χ2 distance, Jensen-Shannon divergence, and the Earth Mover's distance. A training set of ground-truth labeled images is used to build a classifier for the CBIR system, where the images were obtained from three commonly used benchmarking datasets: the WANG dataset (http://savvash.blogspot.com/2008/12/benchmark-databases-for-cbir.html), the Corel Subset dataset (http://vision.stanford.edu/resources_links.html), and the CalTech dataset (http://www.vision.caltech.edu/htmlfiles/). To implement the CBIR system, we use the Tamura texture features of coarseness, contrast, and directionality. We create texture histograms of the training set and the query images, and then measure the difference between a randomly selected query and the corresponding retrieved image using a k-nearest-neighbors approach. Precision and recall are used to evaluate the retrieval performance of the system, given a particular distance metric. Then, given the same query image, the distance metric is changed and the performance of the system is evaluated once again.
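
Several of the compared bin-wise metrics can be sketched directly (the Earth Mover's distance, which needs an optimal-transport solver, is omitted):

```python
import math

# Bin-wise histogram metrics commonly compared in CBIR experiments.
def euclidean(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def manhattan(p, q):
    return sum(abs(a - b) for a, b in zip(p, q))

def intersection(p, q):              # a similarity, not a distance
    return sum(min(a, b) for a, b in zip(p, q))

def chi_square(p, q):
    return sum((a - b) ** 2 / (a + b) for a, b in zip(p, q) if a + b > 0)

def jensen_shannon(p, q):
    def kl(x, y):                    # KL divergence in bits
        return sum(a * math.log2(a / b) for a, b in zip(x, y) if a > 0)
    m = [(a + b) / 2 for a, b in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p = [0.5, 0.5, 0.0]
q = [0.0, 0.5, 0.5]
print(round(manhattan(p, q), 2), round(intersection(p, q), 2))  # 1.0 0.5
```

Unlike these bin-wise measures, the Earth Mover's distance also accounts for how far mass must move between bins, which is why it behaves differently on shifted histograms.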

  5. Statistical Analysis of Photopyroelectric Signals using Histogram and Kernel Density Estimation for differentiation of Maize Seeds

    NASA Astrophysics Data System (ADS)

    Rojas-Lima, J. E.; Domínguez-Pacheco, A.; Hernández-Aguilar, C.; Cruz-Orea, A.

    2016-09-01

Considering the need for alternative photothermal approaches for characterizing nonhomogeneous materials like maize seeds, the objective of this research work was to statistically analyze the amplitude variations of photopyroelectric signals, by means of nonparametric techniques such as the histogram and the kernel density estimator, and to obtain the probability density function of the amplitude variations of two genotypes of maize seeds with different pigmentations and structural components: crystalline and floury. To determine whether the probability density function had a known parametric form, the histogram was computed; since it did not present a known parametric form, the kernel density estimator using the Gaussian kernel, with an efficiency of 95 % in density estimation, was used to obtain the probability density function. The results obtained indicated that maize seeds could be differentiated in terms of the statistical values for floury and crystalline seeds, such as the mean (93.11, 159.21), variance (1.64 × 10^3, 1.48 × 10^3), and standard deviation (40.54, 38.47), obtained from the amplitude variations of photopyroelectric signals in the case of the histogram approach. For the case of the kernel density estimator, seeds can be differentiated in terms of the kernel bandwidth (smoothing constant) h of 9.85 and 6.09 for floury and crystalline seeds, respectively.
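
A minimal Gaussian kernel density estimator in the spirit of the analysis above; the amplitude values and bandwidth below are invented for illustration, not the measured photopyroelectric data:

```python
import math

# Gaussian KDE: the density estimate is an average of Gaussian bumps of
# bandwidth h centered at the observed sample amplitudes.
def gaussian_kde(samples, h):
    """Return a callable density function built from the samples."""
    n = len(samples)
    def density(x):
        return sum(math.exp(-((x - s) / h) ** 2 / 2) for s in samples) \
               / (n * h * math.sqrt(2 * math.pi))
    return density

amplitudes = [90, 92, 93, 95, 96, 94, 91, 93]   # toy signal amplitudes
f = gaussian_kde(amplitudes, h=2.0)
print(f(93) > f(80))  # True: the density peaks near the sample cluster
```

The bandwidth h plays the role of the smoothing constant reported above: a small h resolves finer structure in the amplitude distribution, a large h smooths it out.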

  6. Plastic Flowlike Deformation and Its Relation to Aperiodic Peaks in Conductance Histograms of Molybdenum Nanocontacts

    NASA Astrophysics Data System (ADS)

    Yamada, Kohei; Kizuka, Tokushi

    2016-10-01

We observed the tensile deformation of molybdenum (Mo) nanocontacts (NCs) and simultaneously measured their conductance by in situ transmission electron microscopy. During deformation, the contact width decreased from several nanometers to a single-atom size. Mo NCs were thinned via a plastic flowlike deformation process. The process differs from the slip on lattice planes, which is frequently observed in NCs made of noble metals. We plotted histograms of the time-conductance traces measured during the tensile deformation of Mo NCs. In the conductance histograms, we observed peaks at 1.8G0 (G0 = 2e^2/h, where e is the electron charge and h is Planck's constant), 3.6G0, and 4.4G0. When the minimum conductance (1.8G0) was measured, the minimum cross-sectional widths of the NCs were 3-7 atoms. These NCs exhibited relaxed structures that formed irregularly after the plastic flowlike deformation occurred in the final stage of the tensile process. We inferred that the aperiodic peaks observed in the conductance histograms originated from irregular variations in the contact areas and atomic configurations of the NCs during the plastic flowlike deformation. Moreover, the conductance value of the single-atom contacts was less than 0.1G0.

  7. Digital image classification with the help of artificial neural network by simple histogram

    PubMed Central

    Dey, Pranab; Banerjee, Nirmalya; Kaur, Rajwant

    2016-01-01

    Background: Visual image classification is a great challenge to the cytopathologist in routine day-to-day work. An artificial neural network (ANN) may be helpful in this matter. Aims and Objectives: In this study, we have tried to classify digital images of malignant and benign cells in effusion cytology smears with the help of simple histogram data and an ANN. Materials and Methods: A total of 404 digital images, consisting of 168 benign cells and 236 malignant cells, were selected for this study. Simple histogram data were extracted from these digital images, and an ANN was constructed with the help of Neurointelligence software [Alyuda Neurointelligence 2.2 (577), Cupertino, California, USA]. The network architecture was 6-3-1. The images were divided into a training set (281), a validation set (63), and a test set (60). The on-line backpropagation training algorithm was used for this study. Result: A total of 10,000 iterations were performed to train the ANN system, at a speed of 609.81 iterations/s. After adequate training of this ANN model, the system was able to identify all 34 malignant cell images and 24 out of 26 benign cells. Conclusion: The ANN model can be used for the identification of individual malignant cells with the help of simple histogram data. This study will be helpful in the future to identify malignant cells in unknown situations. PMID:27279679
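A minimal sketch of the pipeline this abstract describes: 6-bin histogram features feeding a tiny 6-3-1 network trained by on-line backpropagation. The commercial Neurointelligence software is not reproduced; the synthetic images, learning rate, and epoch count below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

def histogram_features(img, bins=6):
    """Normalized 6-bin intensity histogram, used as the feature vector."""
    counts, _ = np.histogram(img, bins=bins, range=(0, 256))
    return counts / counts.sum()

# Synthetic stand-ins for cell images: "benign" darker, "malignant" brighter
benign = [rng.normal(80, 25, (32, 32)).clip(0, 255) for _ in range(50)]
malignant = [rng.normal(170, 25, (32, 32)).clip(0, 255) for _ in range(50)]
X = np.array([histogram_features(i) for i in benign + malignant])
y = np.array([0] * 50 + [1] * 50, dtype=float)

# Minimal 6-3-1 network trained with on-line (per-sample) backpropagation
W1 = rng.normal(0, 0.5, (6, 3)); b1 = np.zeros(3)
W2 = rng.normal(0, 0.5, (3, 1)); b2 = np.zeros(1)
sigmoid = lambda z: 1 / (1 + np.exp(-z))

for epoch in range(500):
    for i in rng.permutation(len(X)):               # one sample at a time
        h = sigmoid(X[i] @ W1 + b1)
        out = sigmoid(h @ W2 + b2)[0]
        d_out = (out - y[i]) * out * (1 - out)      # squared-error gradient
        d_h = d_out * W2[:, 0] * h * (1 - h)
        W2 -= 0.5 * np.outer(h, [d_out]); b2 -= 0.5 * d_out
        W1 -= 0.5 * np.outer(X[i], d_h);  b1 -= 0.5 * d_h

pred = (sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)[:, 0] > 0.5).astype(int)
print("training accuracy:", (pred == y).mean())
```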

  8. LOR-OSEM: statistical PET reconstruction from raw line-of-response histograms

    NASA Astrophysics Data System (ADS)

    Kadrmas, Dan J.

    2004-10-01

    Iterative statistical reconstruction methods are becoming the standard in positron emission tomography (PET). Conventional maximum-likelihood expectation-maximization (MLEM) and ordered-subsets (OSEM) algorithms act on data which have been pre-processed into corrected, evenly-spaced histograms; however, such pre-processing corrupts the Poisson statistics. Recent advances have incorporated attenuation, scatter and randoms compensation into the iterative reconstruction. The objective of this work was to incorporate the remaining pre-processing steps, including arc correction, to reconstruct directly from raw unevenly-spaced line-of-response (LOR) histograms. This exactly preserves Poisson statistics and full spatial information in a manner closely related to listmode ML, making full use of the ML statistical model. The LOR-OSEM algorithm was implemented using a rotation-based projector which maps directly to the unevenly-spaced LOR grid. Simulation and phantom experiments were performed to characterize resolution, contrast and noise properties for 2D PET. LOR-OSEM provided a beneficial noise-resolution tradeoff, outperforming AW-OSEM by about the same margin that AW-OSEM outperformed pre-corrected OSEM. The relationship between LOR-ML and listmode ML algorithms was explored, and implementation differences are discussed. LOR-OSEM is a viable alternative to AW-OSEM for histogram-based reconstruction with improved spatial resolution and noise properties.
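The ordered-subsets EM update underlying OSEM-type algorithms can be sketched on a toy 1-D problem. The system matrix, "image", and subset count below are illustrative assumptions, not the LOR geometry of the paper.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy system: 40 LORs viewing a 20-pixel 1-D "image" through a random
# nonnegative system matrix A (A[i, j] = sensitivity of LOR i to pixel j)
A = rng.uniform(0, 1, (40, 20))
x_true = np.zeros(20); x_true[5:9] = 4.0; x_true[14] = 8.0
counts = rng.poisson(A @ x_true)                  # Poisson LOR histogram

def osem(A, counts, n_subsets=4, n_iter=10):
    """Ordered-subsets EM: one multiplicative update per subset."""
    x = np.ones(A.shape[1])
    subsets = np.array_split(np.arange(A.shape[0]), n_subsets)
    for _ in range(n_iter):
        for s in subsets:
            proj = A[s] @ x                       # forward-project estimate
            ratio = counts[s] / np.maximum(proj, 1e-12)
            x *= (A[s].T @ ratio) / np.maximum(A[s].sum(axis=0), 1e-12)
    return x

x_hat = osem(A, counts)
print(np.round(x_hat, 2))
```

Because the update acts directly on raw counts, the Poisson statistics of the data are preserved, which is the point the abstract makes about skipping pre-correction.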

  9. LOR-OSEM: statistical PET reconstruction from raw line-of-response histograms.

    PubMed

    Kadrmas, Dan J

    2004-10-21

    Iterative statistical reconstruction methods are becoming the standard in positron emission tomography (PET). Conventional maximum-likelihood expectation-maximization (MLEM) and ordered-subsets (OSEM) algorithms act on data which have been pre-processed into corrected, evenly-spaced histograms; however, such pre-processing corrupts the Poisson statistics. Recent advances have incorporated attenuation, scatter and randoms compensation into the iterative reconstruction. The objective of this work was to incorporate the remaining pre-processing steps, including arc correction, to reconstruct directly from raw unevenly-spaced line-of-response (LOR) histograms. This exactly preserves Poisson statistics and full spatial information in a manner closely related to listmode ML, making full use of the ML statistical model. The LOR-OSEM algorithm was implemented using a rotation-based projector which maps directly to the unevenly-spaced LOR grid. Simulation and phantom experiments were performed to characterize resolution, contrast and noise properties for 2D PET. LOR-OSEM provided a beneficial noise-resolution tradeoff, outperforming AW-OSEM by about the same margin that AW-OSEM outperformed pre-corrected OSEM. The relationship between LOR-ML and listmode ML algorithms was explored, and implementation differences are discussed. LOR-OSEM is a viable alternative to AW-OSEM for histogram-based reconstruction with improved spatial resolution and noise properties.

  10. Equal Time for Creationism? No.

    ERIC Educational Resources Information Center

    Skoog, Gerald

    1983-01-01

    Legal decisions and other arguments support the argument that the exclusion of creationism from school curricula is not the result of censorship or bias. Equal-time legislation for creationism has the potential to entangle the state and religion and to make the task of teachers, textbook authors, and publishers nearly impossible. (PP)

  11. Higher Education and Equal Protection.

    ERIC Educational Resources Information Center

    Finnigan, John J.

    1979-01-01

    The effect of the Bakke case, in which the courts first encountered the question of legality of reverse discrimination, is explored; its constitutional significance is examined. It is concluded that the virtue of the decision is in its support of affirmative action and its equal protection implications. (MSE)

  12. STEM Equality and Diversity Toolkit

    ERIC Educational Resources Information Center

    Collins, Jill

    2011-01-01

    In 2008, the Centre for Science Education at Sheffield Hallam University teamed up with VT Enterprise (now Babcock International) in their submission of a successful bid to deliver the national STEM (Science, Technology, Engineering and Maths) Subject Choice and Careers Project. An integral part of the bid was the promotion of equality and…

  13. The Road to Racial Equality

    ERIC Educational Resources Information Center

    Tatum, Beverly Daniel

    2004-01-01

    In this article, the author describes how she was born in 1954, just four months after the Brown v. Board of Education Supreme Court decision outlawed the "separate but equal" doctrine of school segregation. She discusses how that fact has shaped her life immeasurably. Beginning with entering the world in Tallahassee, Fla., where her father taught…

  14. Primer of Equal Employment Opportunity.

    ERIC Educational Resources Information Center

    Anderson, Howard J.

    This booklet presents laws and court cases concerning discrimination in hiring. It begins with a presentation of the laws and orders regulating equal employment opportunity and the remedies available. It lists those employees and employers to whom the laws apply and exemptions. Sections deal with discrimination on the basis of race, sex, sexual…

  15. Extending Understanding of Equal Protection.

    ERIC Educational Resources Information Center

    Dreyfuss, Elisabeth T.

    1988-01-01

    Presents four strategies for teaching secondary students about equal protection clause of the U.S. Constitution's Fourteenth Amendment. To be taught by the classroom teacher or a visiting lawyer, these strategies use such methods as a panel discussion and examination of Fourteenth Amendment court cases to accomplish their goals. (GEA)

  16. Promote Equality in the Classroom.

    ERIC Educational Resources Information Center

    Brown, Sharon; And Others

    1996-01-01

    Presents suggestions to help physical educators treat all students equally and avoid unconsciously making inequitable gender-based statements and practicing other gender discrimination. Suggestions include encouraging girls to talk more, praising girls' performance and boys' appearance, using gender-neutral language, not stereotyping either sex,…

  17. Equalization among Florida School Districts.

    ERIC Educational Resources Information Center

    Alexander, Kern; Shiver, Lee

    1983-01-01

    This statistical analysis of funding equalization from 1970 to 1981 evaluates the distributional equity achieved by Florida's school finance plan and examines the relationship between selected per pupil revenue measures and variables thought to influence school district spending, concluding that greater equity has not been attained. (MJL)

  18. An adaptive image enhancement technique by combining cuckoo search and particle swarm optimization algorithm.

    PubMed

    Ye, Zhiwei; Wang, Mingwei; Hu, Zhengbing; Liu, Wei

    2015-01-01

    Image enhancement is an important procedure in image processing and analysis. This paper presents a new technique that uses a modified quality measure and a blend of cuckoo search and particle swarm optimization (CS-PSO) to enhance low-contrast images adaptively. Contrast enhancement is obtained by a global transformation of the input intensities: the incomplete Beta function serves as the transformation function, and a novel criterion measures image quality based on three factors, namely the threshold, the entropy value, and the gray-level probability density of the image. The enhancement process is a nonlinear optimization problem with several constraints. CS-PSO is utilized to maximize the objective fitness criterion in order to enhance the contrast and detail of an image by adapting the parameters of a novel extension to a local enhancement technique. The performance of the proposed method has been compared, in terms of processing time and image quality, with existing techniques such as linear contrast stretching, histogram equalization, and evolutionary image enhancement methods based on the backtracking search algorithm, differential search algorithm, genetic algorithm, and particle swarm optimization. Experimental results demonstrate that the proposed method is robust and adaptive and exhibits better performance than the other methods considered in the paper. PMID:25784928
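The incomplete-Beta global transformation mentioned above can be sketched as follows. The shape parameters alpha and beta, which CS-PSO would tune in the paper, are fixed by hand here, and the input image is a synthetic placeholder.

```python
import numpy as np

def incomplete_beta_transform(u, alpha, beta, n=2000):
    """Normalized incomplete Beta function, evaluated numerically and used
    as a gray-level transformation on intensities u in [0, 1]."""
    t = np.linspace(1e-6, 1 - 1e-6, n)
    w = t ** (alpha - 1) * (1 - t) ** (beta - 1)   # Beta-density kernel
    cdf = np.cumsum(w); cdf /= cdf[-1]             # normalize to [0, 1]
    return np.interp(u, t, cdf)

# Low-contrast image: intensities squeezed into a narrow band (placeholder data)
rng = np.random.default_rng(3)
img = rng.uniform(0.35, 0.60, (64, 64))

# Normalize to [0, 1], then apply the transform; alpha, beta would be the
# parameters an optimizer such as CS-PSO searches over
u = (img - img.min()) / (img.max() - img.min())
enhanced = incomplete_beta_transform(u, alpha=2.0, beta=2.0)
print("std before/after:", u.std().round(3), enhanced.std().round(3))
```

With alpha = beta = 2 the transform is an S-curve that spreads mid-gray values toward the extremes, raising global contrast.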

  19. An adaptive image enhancement technique by combining cuckoo search and particle swarm optimization algorithm.

    PubMed

    Ye, Zhiwei; Wang, Mingwei; Hu, Zhengbing; Liu, Wei

    2015-01-01

    Image enhancement is an important procedure in image processing and analysis. This paper presents a new technique that uses a modified quality measure and a blend of cuckoo search and particle swarm optimization (CS-PSO) to enhance low-contrast images adaptively. Contrast enhancement is obtained by a global transformation of the input intensities: the incomplete Beta function serves as the transformation function, and a novel criterion measures image quality based on three factors, namely the threshold, the entropy value, and the gray-level probability density of the image. The enhancement process is a nonlinear optimization problem with several constraints. CS-PSO is utilized to maximize the objective fitness criterion in order to enhance the contrast and detail of an image by adapting the parameters of a novel extension to a local enhancement technique. The performance of the proposed method has been compared, in terms of processing time and image quality, with existing techniques such as linear contrast stretching, histogram equalization, and evolutionary image enhancement methods based on the backtracking search algorithm, differential search algorithm, genetic algorithm, and particle swarm optimization. Experimental results demonstrate that the proposed method is robust and adaptive and exhibits better performance than the other methods considered in the paper.

  20. Complex adaptation-based LDR image rendering for 3D image reconstruction

    NASA Astrophysics Data System (ADS)

    Lee, Sung-Hak; Kwon, Hyuk-Ju; Sohng, Kyu-Ik

    2014-07-01

    A low-dynamic-range tone-compression technique is developed for realistic image rendering that can make three-dimensional (3D) images resemble realistic scenes by overcoming brightness dimming in the 3D display mode. The 3D surround provides varying conditions for image quality: illuminant adaptation, contrast, gamma, color, sharpness, and so on. In general, gain/offset adjustment, gamma compensation, and histogram equalization perform well in contrast compression; however, as a result of signal saturation and clipping effects, image details are removed and information is lost in bright and dark areas. Thus, an enhanced image-mapping technique is proposed based on space-varying image compression. The performance of contrast compression is enhanced with complex adaptation in a 3D viewing surround, combining global and local adaptation. Evaluation of local image rendering in terms of tone and color expression, noise reduction, and edge compensation confirms that the proposed 3D image-mapping model can compensate for the loss of image quality in the 3D mode.
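A simple version of contrast compression combining global and local adaptation can be sketched as follows. The log-based global curve, window size, and blending weight are illustrative assumptions, not the authors' model.

```python
import numpy as np

def tone_compress(lum, key=0.5, blend=0.7, win=7):
    """Global log compression blended with a local-mean adaptation term."""
    lum = np.asarray(lum, dtype=float)
    # Global adaptation: compress the dynamic range with a log curve
    global_map = np.log1p(key * lum) / np.log1p(key * lum.max())

    # Local adaptation: mean luminance in a win x win neighborhood
    pad = win // 2
    p = np.pad(global_map, pad, mode="edge")
    local = np.empty_like(global_map)
    for y in range(global_map.shape[0]):
        for x in range(global_map.shape[1]):
            local[y, x] = p[y:y + win, x:x + win].mean()

    # Blend global and local terms so bright and dark regions keep detail
    # instead of saturating or clipping
    return np.clip(blend * global_map + (1 - blend) * (global_map - local + 0.5), 0, 1)

rng = np.random.default_rng(13)
hdr = rng.gamma(2.0, 2.0, (48, 48)) * 100.0   # synthetic wide-range luminance
ldr = tone_compress(hdr)
print(ldr.min().round(3), ldr.max().round(3))
```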

  1. An Adaptive Image Enhancement Technique by Combining Cuckoo Search and Particle Swarm Optimization Algorithm

    PubMed Central

    Ye, Zhiwei; Wang, Mingwei; Hu, Zhengbing; Liu, Wei

    2015-01-01

    Image enhancement is an important procedure of image processing and analysis. This paper presents a new technique using a modified measure and blending of cuckoo search and particle swarm optimization (CS-PSO) for low contrast images to enhance image adaptively. In this way, contrast enhancement is obtained by global transformation of the input intensities; it employs incomplete Beta function as the transformation function and a novel criterion for measuring image quality considering three factors which are threshold, entropy value, and gray-level probability density of the image. The enhancement process is a nonlinear optimization problem with several constraints. CS-PSO is utilized to maximize the objective fitness criterion in order to enhance the contrast and detail in an image by adapting the parameters of a novel extension to a local enhancement technique. The performance of the proposed method has been compared with other existing techniques such as linear contrast stretching, histogram equalization, and evolutionary computing based image enhancement methods like backtracking search algorithm, differential search algorithm, genetic algorithm, and particle swarm optimization in terms of processing time and image quality. Experimental results demonstrate that the proposed method is robust and adaptive and exhibits the better performance than other methods involved in the paper. PMID:25784928

  2. Postimplantation Analysis Enables Improvement of Dose-Volume Histograms and Reduction of Toxicity for Permanent Seed Implantation

    SciTech Connect

    Wust, Peter Postrach, Johanna; Kahmann, Frank; Henkel, Thomas; Graf, Reinhold; Cho, Chie Hee; Budach, Volker; Boehmer, Dirk

    2008-05-01

    Purpose: To demonstrate how postimplantation analysis is useful for improving permanent seed implantation and reducing toxicity. Patients and Methods: We evaluated 197 questionnaires completed by patients after permanent seed implantation (monotherapy between 1999 and 2003). For 70% of these patients, a computed tomography scan was available to perform postimplantation analysis. The index doses and volumes of the dose-volume histograms (DVHs) were determined and categorized with respect to the date of implantation. Differences in symptom scores relative to pretherapeutic status were analyzed with regard to follow-up times and DVH descriptors. Acute and subacute toxicities in a control group of 117 patients from an earlier study (June 1999 to September 2001) by Wust et al. (2004) were compared with a matched subgroup of 110 patients from this study treated between October 2001 and August 2003. Results: Improved performance was demonstrated, identifying a characteristic time dependency of DVH parameters (after implantation) and toxicity scores. Although coverage (the prostate volume covered by 100% of the prescription dose) increased slightly, high-dose regions decreased with the growing experience of the users. Improvement in the DVH and a reduction of toxicities were found in the patient group implanted in the later period. A decline in symptoms with follow-up time counteracts this gain of experience and must be considered. Urinary and sexual discomfort was enhanced by dose heterogeneities (e.g., the dose covering 10% of the prostate volume, or the volume covered by 200% of the prescription dose). In contrast, rectal toxicities correlated with exposed rectal volumes, especially the rectal volume covered by 100% of the prescription dose. Conclusion: The typical side effects occurring after permanent seed implantation can be reduced by improving the dose distributions. An improvement in dose distributions and a reduction of toxicities were identified with elapsed time between
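The DVH descriptors this abstract relies on (the dose covering 10% of the volume, and the volumes covered by 100% and 200% of the prescription dose) can be sketched as follows; the voxel doses and 145 Gy prescription are hypothetical placeholders, not the study's data.

```python
import numpy as np

def dvh_metrics(dose, prescription):
    """Cumulative DVH descriptors for one structure's voxel doses."""
    dose = np.sort(np.asarray(dose, dtype=float))[::-1]   # descending order
    n = len(dose)
    d10 = dose[int(0.10 * n)]                 # min dose to the hottest 10% of volume
    v100 = (dose >= prescription).mean()      # fraction covered by 100% of dose
    v200 = (dose >= 2 * prescription).mean()  # fraction covered by 200% of dose
    return d10, v100, v200

# Hypothetical voxel doses (Gy) for a prostate contour; 145 Gy prescription
rng = np.random.default_rng(5)
dose = rng.gamma(shape=16.0, scale=11.0, size=5000)   # skewed dose distribution
d10, v100, v200 = dvh_metrics(dose, prescription=145.0)
print(f"D10 = {d10:.1f} Gy, V100 = {v100:.0%}, V200 = {v200:.0%}")
```

High D10 or V200 flags dose heterogeneity (linked above to urinary and sexual discomfort), while V100 measures coverage.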

  3. Adaptive sharpening of photos

    NASA Astrophysics Data System (ADS)

    Safonov, Ilia V.; Rychagov, Michael N.; Kang, KiMin; Kim, Sang Ho

    2008-01-01

    Sharpness is an important attribute that contributes to the overall impression of printed photo quality. Often it is impossible to estimate sharpness prior to printing, and it can be a complex task for a consumer to obtain accurate sharpening results by editing a photo on a computer. A novel method of adaptive sharpening aimed at photo printers is proposed. Our approach includes three key techniques: sharpness level estimation, local tone mapping, and boosting of local contrast. Non-reference automatic sharpness estimation is based on analysis of the variation of edge histograms: edges are produced by high-pass filters with various kernel sizes, an array of integrals of the logarithm of the edge histograms characterizes photo sharpness, and machine learning is applied to choose optimal parameters for a given printing size and resolution. Local tone mapping with ordering is applied to decrease the edge transition slope length without noticeable artifacts and with some noise suppression. An unsharp mask via a bilateral filter is applied to boost local contrast; this stage does not produce the strong halo artifact typical of the traditional unsharp mask filter. The quality of the proposed approach is evaluated by surveying observers' opinions. According to the replies obtained, the proposed method enhances the majority of photos.
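The final stage, an unsharp mask built on a bilateral filter, can be sketched as follows. The brute-force filter, parameter values, and test image are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def bilateral_filter(img, sigma_s=2.0, sigma_r=0.1, radius=3):
    """Brute-force bilateral filter: spatial and range Gaussian weights."""
    h, w = img.shape
    pad = np.pad(img, radius, mode="edge")
    out = np.empty_like(img)
    yy, xx = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(yy**2 + xx**2) / (2 * sigma_s**2))
    for y in range(h):
        for x in range(w):
            patch = pad[y:y + 2 * radius + 1, x:x + 2 * radius + 1]
            range_w = np.exp(-((patch - img[y, x]) ** 2) / (2 * sigma_r**2))
            wgt = spatial * range_w
            out[y, x] = (wgt * patch).sum() / wgt.sum()
    return out

# Unsharp mask built on the bilateral filter: the filtered image preserves
# strong edges, so the detail layer (img - base) carries texture but little
# of the edge overshoot that causes halos with a Gaussian base
rng = np.random.default_rng(9)
img = np.clip(np.tile(np.linspace(0.2, 0.8, 32), (32, 1))
              + rng.normal(0, 0.02, (32, 32)), 0, 1)
base = bilateral_filter(img)
sharpened = np.clip(img + 1.5 * (img - base), 0, 1)
print(sharpened.shape)
```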

  4. Racial equity or racial equality.

    PubMed

    Daymont, T N

    1980-11-01

    This study examines the relationship between racial equity in labor market processes and racial equality in future labor market rewards. In particular, a regression standardization procedure is used to project the degree of racial inequality in earnings that would exist among men at various future points in time based on three different sets of assumptions about attainment processes in labor market and educational institutions. The most important results suggest that even if racial discrimination were eliminated immediately in labor market and educational institutions, it would take almost 50 years for the black-white earnings ratio to reach .95. This incompatibility between equity and equality needs to be considered more explicitly both by those who advocate a color-blind labor market and those who advocate preferential treatment for blacks.

  5. Equal opportunity in the workplace.

    PubMed

    Allen, A

    1992-04-01

    The Equal Employment Opportunity Commission (EEOC) was created by the Civil Rights Act of 1964. The commission encourages voluntary compliance with equal employment opportunity practices, and has authority to investigate complaints alleging discrimination in hiring, firing, wage rates, testing, training, apprenticeship, and other conditions of employment. In October 1991, during the Senate Judiciary Committee hearings, the confirmation of Judge Clarence Thomas for a seat on the United States Supreme Court was placed in jeopardy by a charge of sexual harassment while Thomas was head of the EEOC. This article focuses on aspects of sexual harassment in the workplace, the role of the EEOC, and offers some suggestions for keeping the work environment free of abusive behavior.

  6. Office of Equal Opportunity Programs

    NASA Technical Reports Server (NTRS)

    Chin, Jennifer L.

    2004-01-01

    The NASA Glenn Office of Equal Opportunity Programs works to provide quality service for all programs and to assist the Center in becoming a model workplace. During the summer of 2004, I worked with Deborah Cotleur along with other staff members to create and modify customer satisfaction surveys. This office aims to assist in developing a model workplace by functioning as a change agent for the Center and serving as an advisor to management to ensure equity throughout the Center. In addition, the office serves as a mediator for the Center in addressing issues and concerns. Lastly, the office provides assistance to employees to enable attainment of personal and organizational goals. The Office of Equal Opportunity Programs is a staff office which reports and provides advice to the Center Director and Executive Leadership, implements laws, regulations, and presidential executive orders, and provides Center-wide leadership and assistance to NASA GRC employees. Some of the major responsibilities of the office include working with the discrimination complaints program, special emphasis programs (advisory groups), management support, monitoring and evaluation, contract compliance, and community outreach. During my internship in this office, my main objective was to create four customer satisfaction surveys covering EO retreats, EO observances, EO advisory boards, and EO mediation/counseling. I created these surveys after researching past events and surveys, as well as similar surveys created and conducted by other NASA centers for EO advisory group members, leadership training sessions for supervisors, preventing-sexual-harassment training sessions, and observance events. I also researched the style and format of feedback surveys from the Marshall Equal Opportunity website, the Goddard website, and the main NASA website. Using the material from the Office of Equal Opportunity Programs at Glenn Research Center along with my

  7. The equal right to drink.

    PubMed

    Schmidt, Laura A

    2014-11-01

    The starting place for this essay is Knupfer and Room's insight that more restrictive norms around drinking and intoxication tend to be selectively applied to the economically dependent segments of society, such as women. However, since these authors wrote in 1964, women in the US and many other societies around the globe have experienced rising economic independence. The essay considers how the moral categories of acceptable drinking and drunkenness may have shifted alongside women's rising economic independence, and looks at evidence on the potential consequences for women's health and wellbeing. I argue that, as women have gained economic independence, changes in drinking norms have produced two different kinds of negative unintended consequences for women at the high and low extremes of the economic spectrum. As liberated women of the middle and upper classes have become more economically equal to men, they have enjoyed the right to drink with less restraint. For them, alongside the equal right to drink has come greater equality in exposure to alcohol-attributable harms, abuse and dependence. I further suggest that, as societies become more liberated, the economic dependency of low-income women is brought into greater question. Under such conditions, women in poverty, particularly those economically dependent on the state, such as welfare mothers, have become subject to more restrictive norms around drinking and intoxication, and more punitive social controls.

  8. The equal right to drink.

    PubMed

    Schmidt, Laura A

    2014-11-01

    The starting place for this essay is Knupfer and Room's insight that more restrictive norms around drinking and intoxication tend to be selectively applied to the economically dependent segments of society, such as women. However, since these authors wrote in 1964, women in the US and many other societies around the globe have experienced rising economic independence. The essay considers how the moral categories of acceptable drinking and drunkenness may have shifted alongside women's rising economic independence, and looks at evidence on the potential consequences for women's health and wellbeing. I argue that, as women have gained economic independence, changes in drinking norms have produced two different kinds of negative unintended consequences for women at the high and low extremes of the economic spectrum. As liberated women of the middle and upper classes have become more economically equal to men, they have enjoyed the right to drink with less restraint. For them, alongside the equal right to drink has come greater equality in exposure to alcohol-attributable harms, abuse and dependence. I further suggest that, as societies become more liberated, the economic dependency of low-income women is brought into greater question. Under such conditions, women in poverty, particularly those economically dependent on the state, such as welfare mothers, have become subject to more restrictive norms around drinking and intoxication, and more punitive social controls. PMID:25303360

  9. An adaptive algorithm for low contrast infrared image enhancement

    NASA Astrophysics Data System (ADS)

    Liu, Sheng-dong; Peng, Cheng-yuan; Wang, Ming-jia; Wu, Zhi-guo; Liu, Jia-qi

    2013-08-01

    An adaptive infrared image enhancement algorithm for low-contrast images is proposed in this paper, to deal with the problem that conventional image enhancement algorithms cannot effectively identify regions of interest when the dynamic range of the image is large. The algorithm starts from the characteristics of human visual perception and takes account of both global adaptive image enhancement and local feature boosting, so that not only is the contrast of the image raised, but the texture of the picture also becomes more distinct. Firstly, the global dynamic range of the image is adjusted: a correspondence is established between the dynamic range of the original image and the display gray scale, and the gray level of bright objects is raised while that of dark targets is reduced, improving the overall image contrast. Secondly, a filtering algorithm is applied to the current pixel and its neighborhood to extract image texture information and adjust the brightness of the current pixel, in order to enhance the local contrast of the image. The algorithm overcomes the defect of traditional edge-detection algorithms that outlines easily become vague, and ensures the distinctness of texture detail in the enhanced image. Lastly, the globally adjusted and locally adjusted images are normalized to ensure a smooth transition of image details. Many experiments were carried out to compare the proposed algorithm with other conventional image enhancement algorithms, using two groups of blurred IR images. The experiments show that the contrast of the picture is boosted by histogram equalization but the details are not clear, whereas the details can be distinguished after the Retinex algorithm. The image processed by the self-adaptive enhancement algorithm proposed in this paper becomes clear in its details, and the image contrast is markedly improved compared with the Retinex algorithm.
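The two-stage scheme described above, a global dynamic-range adjustment followed by a local contrast boost and a normalized blend, can be sketched as follows; the window size, gains, and synthetic IR frame are illustrative assumptions.

```python
import numpy as np

def enhance_ir(img, alpha=0.6, k=1.5, win=5):
    """Global dynamic-range adjustment blended with a local contrast boost."""
    # Global stage: robust percentile stretch of the raw dynamic range to [0, 1]
    lo, hi = np.percentile(img, (1, 99))
    g = np.clip((img - lo) / (hi - lo), 0, 1)

    # Local stage: boost each pixel's deviation from its neighborhood mean
    pad = win // 2
    p = np.pad(g, pad, mode="edge")
    local_mean = np.empty_like(g)
    for y in range(g.shape[0]):
        for x in range(g.shape[1]):
            local_mean[y, x] = p[y:y + win, x:x + win].mean()
    local = np.clip(g + k * (g - local_mean), 0, 1)

    # Normalized blend of the two adjusted images for a smooth transition
    return alpha * g + (1 - alpha) * local

# Synthetic IR frame: narrow raw dynamic range plus a few hot pixels
rng = np.random.default_rng(11)
raw = rng.normal(3000, 40, (64, 64)) + 200 * (rng.random((64, 64)) > 0.99)
out = enhance_ir(raw)
print(out.min().round(3), out.max().round(3))
```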

  10. Equal is as equal does: challenging Vatican views on women.

    PubMed

    1995-01-01

    The authors of this piece are women from the Roman Catholic tradition who are critical of the Vatican position on women's rights. The Report of the Holy See in Preparation for the Fourth World Conference on Women reveals a religious fundamentalism that misuses tradition and anthropology to limit women's roles and rights. The Vatican is itself a self-proclaimed state that offers women neither opportunities nor protections within its own organization, and there is no evidence of women's participation in the preparation of its report. The Vatican document constructs a vision of women and men in which men are normative persons, whose dignity is conferred by their humanity, and women are the variant other, defined by and granted dignity by their reproductive and mothering functions. The Vatican document is anti-feminist. It criticizes the "radical feminists" of the 1960s for trying to deny sexual differences, and accuses today's Western feminists of ignoring the needs of women in developing countries while pursuing selfish and hedonistic goals. It makes no recognition of the work of feminists to improve the lives of women worldwide. The Vatican document claims to support women's equality, but it qualifies each statement of equality with a presumption of difference. The document defines women as vulnerable without naming men as responsible for the oppression and violence to which women are vulnerable. It ridicules as feminist cant the well-documented fact that the home is the setting of most violence against women. The Vatican decries the suffering families undergo as a result of compulsory birth control and abortion policies, while it would deny families sex education, contraceptives, and safe abortion, thereby making pregnancy compulsory. 
A new vision of social justice is needed, one that: 1) rests on a radical equality, in which both women and men are expected to contribute to work, education, culture, morality, and reproduction; 2) accepts a "discipleship of equals

  11. Equal is as equal does: challenging Vatican views on women.

    PubMed

    1995-01-01

    The authors of this piece are women from the Roman Catholic tradition who are critical of the Vatican position on women's rights. The Report of the Holy See in Preparation for the Fourth World Conference on Women reveals a religious fundamentalism that misuses tradition and anthropology to limit women's roles and rights. The Vatican is itself a self-proclaimed state that offers women neither opportunities nor protections within its own organization, and there is no evidence of women's participation in the preparation of its report. The Vatican document constructs a vision of women and men in which men are normative persons, whose dignity is conferred by their humanity, and women are the variant other, defined by and granted dignity by their reproductive and mothering functions. The Vatican document is anti-feminist. It criticizes the "radical feminists" of the 1960s for trying to deny sexual differences, and accuses today's Western feminists of ignoring the needs of women in developing countries while pursuing selfish and hedonistic goals. It makes no recognition of the work of feminists to improve the lives of women worldwide. The Vatican document claims to support women's equality, but it qualifies each statement of equality with a presumption of difference. The document defines women as vulnerable without naming men as responsible for the oppression and violence to which women are vulnerable. It ridicules as feminist cant the well-documented fact that the home is the setting of most violence against women. The Vatican decries the suffering families undergo as a result of compulsory birth control and abortion policies, while it would deny families sex education, contraceptives, and safe abortion, thereby making pregnancy compulsory. 
A new vision of social justice is needed, one that: 1) rests on a radical equality, in which both women and men are expected to contribute to work, education, culture, morality, and reproduction; 2) accepts a "discipleship of equals

  12. Optimized swimmer tracking system by a dynamic fusion of correlation and color histogram techniques

    NASA Astrophysics Data System (ADS)

    Benarab, D.; Napoléon, T.; Alfalou, A.; Verney, A.; Hellard, P.

    2015-12-01

To design a robust swimmer tracking system, we took into account two well-known tracking techniques: the nonlinear joint transform correlation (NL-JTC) and the color histogram. The two techniques perform comparably well, yet they both have substantial limitations. Interestingly, they also seem to show some complementarity. The correlation technique yields accurate detection but is sensitive to rotation, scale and contour deformation, whereas the color histogram technique is robust to rotation and contour deformation but shows low accuracy and is highly sensitive to luminosity and confusing background colors. These observations suggested the possibility of a dynamic fusion of the correlation plane and the color scores map. Before this fusion, two steps are required. First is the extraction of a sub-plane of correlation that describes the similarity between the reference and target images. This sub-plane has the same size as the color scores map, but the two have different interval values. Thus, a second step is required: normalization of both planes to the same interval so that they can be fused. To determine the benefits of this fusion technique, we first tested it on a synthetic image containing different forms with different colors. We thus were able to optimize the correlation plane and color histogram techniques before applying our fusion technique to real videos of swimmers in international competitions. Finally, a comparative study of the dynamic fusion technique and the two classical techniques was carried out to demonstrate the efficacy of the proposed technique. The criteria of comparison were the tracking percentage, the peak-to-correlation energy (PCE), which evaluated the sharpness of the peak (accuracy), and the local standard deviation (Local-STD), which assessed the noise in the planes (robustness).
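    The normalize-then-fuse step described in this abstract can be sketched as follows. The min-max scaling, the equal fusion weights, and the toy planes are illustrative assumptions, not the authors' exact formulation.

```python
# Hedged sketch of the normalize-then-fuse step described above. The
# min-max scaling, equal weights, and toy planes are assumptions, not
# the authors' exact formulation.
def minmax_normalize(plane):
    """Rescale a 2-D plane into the common interval [0, 1]."""
    flat = [v for row in plane for v in row]
    lo, hi = min(flat), max(flat)
    span = (hi - lo) or 1.0
    return [[(v - lo) / span for v in row] for row in plane]

def fuse(corr_plane, color_map, w_corr=0.5, w_color=0.5):
    """Element-wise weighted fusion of the two normalized planes."""
    a, b = minmax_normalize(corr_plane), minmax_normalize(color_map)
    return [[w_corr * x + w_color * y for x, y in zip(ra, rb)]
            for ra, rb in zip(a, b)]

def peak_location(plane):
    """Position of the fused plane's maximum -- the tracked target."""
    return max(((v, (i, j)) for i, row in enumerate(plane)
                for j, v in enumerate(row)))[1]

corr = [[0.1, 0.9], [0.2, 0.3]]      # sharp correlation peak at (0, 1)
colr = [[10.0, 80.0], [20.0, 30.0]]  # color scores on a different scale
print(peak_location(fuse(corr, colr)))  # → (0, 1)
```

    Because both planes are mapped to [0, 1] before fusion, neither cue dominates merely because of its native scale.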

  13. The Business of Equal Opportunity.

    ERIC Educational Resources Information Center

    Dickson, Reginald D.

    1992-01-01

    The author describes his journey from poor African-American youth in the rural South to successful businessman. He discusses the Inroads program, an internship for African-American and Hispanic youth and advises giving up victimhood and adapting to the mainstream of capitalism. (SK)

  14. Parameters of proteome evolution from histograms of amino-acid sequence identities of paralogous proteins

    PubMed Central

    Axelsen, Jacob Bock; Yan, Koon-Kiu; Maslov, Sergei

    2007-01-01

Background The evolution of the full repertoire of proteins encoded in a given genome is mostly driven by gene duplications, deletions, and sequence modifications of existing proteins. Indirect information about relative rates and other intrinsic parameters of these three basic processes is contained in the proteome-wide distribution of sequence identities of pairs of paralogous proteins. Results We introduce a simple mathematical framework based on a stochastic birth-and-death model that allows one to extract some of this information and apply it to the set of all pairs of paralogous proteins in H. pylori, E. coli, S. cerevisiae, C. elegans, D. melanogaster, and H. sapiens. It was found that the histogram of sequence identities p generated by an all-to-all alignment of all protein sequences encoded in a genome is well fitted with a power-law form ~ p^(-γ), with the value of the exponent γ around 4 for the majority of organisms used in this study. This implies that the intra-protein variability of substitution rates is best described by the Gamma distribution with exponent α ≈ 0.33. Different features of the shape of such histograms allow us to quantify the ratio between the genome-wide average deletion/duplication rates and the amino-acid substitution rate. Conclusion We separately measure the short-term ("raw") duplication and deletion rates r*_dup and r*_del, which include gene copies that will be removed soon after the duplication event, and their dramatically reduced long-term counterparts r_dup and r_del. The high deletion rate among recently duplicated proteins is consistent with a scenario in which they have not had enough time to significantly change their functional roles and thus are to a large degree disposable. Systematic trends of each of the four duplication/deletion rates with the total number of genes in the genome were analyzed. All but the deletion rate of recent duplicates r*_del were shown to systematically increase with N_genes. Abnormally flat shapes

  15. Quantitative characterization of metastatic disease in the spine. Part II. Histogram-based analyses

    SciTech Connect

    Whyne, Cari; Hardisty, Michael; Wu, Florence; Skrinskas, Tomas; Clemons, Mark; Gordon, Lyle; Basran, Parminder S.

    2007-08-15

Radiological imaging is essential to the appropriate management of patients with bone metastasis; however, there have been no widely accepted guidelines as to the optimal method for quantifying the potential impact of skeletal lesions or to evaluate response to treatment. The current inability to rapidly quantify the response of bone metastases excludes patients with cancer and bone disease from participating in clinical trials of many new treatments as these studies frequently require patients with so-called measurable disease. Computed tomography (CT) can provide excellent skeletal detail with a sensitivity for the diagnosis of bone metastases. The purpose of this study was to establish an objective method to quantitatively characterize disease in the bony spine using CT-based segmentations. It was hypothesized that histogram analysis of CT vertebral density distributions would enable standardized segmentation of tumor tissue and consequently allow quantification of disease in the metastatic spine. Thirty-two healthy vertebral CT scans were first studied to establish a baseline characterization. The histograms of the trabecular centrums were found to be Gaussian distributions (average root-mean-square difference=30 voxel counts), as expected for a uniform material. Intrapatient vertebral level similarity was also observed as the means were not significantly different (p>0.8). Thus, a patient-specific healthy vertebral body histogram is able to characterize healthy trabecular bone throughout that individual's thoracolumbar spine. Eleven metastatically involved vertebrae were analyzed to determine the characteristics of the lytic and blastic bone voxels relative to the healthy bone. Lytic and blastic tumors were segmented as connected areas with voxel intensities between specified thresholds. The tested thresholds were μ−1.0σ, μ−1.5σ, and μ−2.0σ for lytic and μ+2.0σ, μ+3.0σ, and μ+3.5σ for blastic tissue, where
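    The μ ± kσ voxel labeling described above can be sketched as a simple threshold test against the healthy-bone Gaussian fit. The fit values and the particular k thresholds below are illustrative, not the study's calibrated choices.

```python
# Hedged sketch of mu +/- k*sigma voxel labeling; the healthy-bone fit
# values and the k thresholds chosen here are illustrative only.
def classify_voxel(value, mu, sigma, k_lytic=2.0, k_blastic=3.0):
    """Label a voxel against the healthy Gaussian fit (mu, sigma)."""
    if value < mu - k_lytic * sigma:
        return "lytic"       # abnormally low density
    if value > mu + k_blastic * sigma:
        return "blastic"     # abnormally high density
    return "healthy"

mu, sigma = 200.0, 40.0   # hypothetical healthy trabecular fit
print(classify_voxel(100.0, mu, sigma))   # → lytic
print(classify_voxel(360.0, mu, sigma))   # → blastic
```

    In the study, connected components of such labeled voxels form the segmented lesions.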

  16. Phase-unwrapping algorithm for images with high noise content based on a local histogram

    NASA Astrophysics Data System (ADS)

    Meneses, Jaime; Gharbi, Tijani; Humbert, Philippe

    2005-03-01

We present a robust algorithm of phase unwrapping that was designed for use on phase images with high noise content. We proceed with the algorithm by first identifying regions with continuous phase values placed between fringe boundaries in an image and then phase shifting the regions with respect to one another by multiples of 2π to unwrap the phase. Image pixels are segmented between interfringe and fringe boundary areas by use of a local histogram of a wrapped phase. The algorithm has been used successfully to unwrap phase images generated in a three-dimensional shape measurement for noninvasive quantification of human skin structure in dermatology, cosmetology, and plastic surgery.
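    The method above unwraps whole inter-fringe regions at once; as a hedged, minimal illustration of the underlying idea of shifting neighboring pieces by multiples of 2π, here is the classic one-dimensional Itoh unwrapper, not the authors' region-based algorithm.

```python
import math

# Classic 1-D Itoh unwrapping: shown only as a simple analogue of the
# region-based 2-D method described in the abstract.
def unwrap_1d(wrapped):
    """Remove artificial 2*pi jumps between consecutive phase samples."""
    out = [wrapped[0]]
    for w in wrapped[1:]:
        d = w - out[-1]
        # Shift the increment into (-pi, pi] by the nearest multiple of 2*pi.
        d -= 2 * math.pi * round(d / (2 * math.pi))
        out.append(out[-1] + d)
    return out
```

    The 2-D problem is harder precisely because noise can make such local jump decisions inconsistent along different paths, which is what the local-histogram segmentation addresses.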

  17. Prediction of brain tumor progression using multiple histogram matched MRI scans

    NASA Astrophysics Data System (ADS)

    Banerjee, Debrup; Tran, Loc; Li, Jiang; Shen, Yuzhong; McKenzie, Frederic; Wang, Jihong

    2011-03-01

In a recent study [1], we investigated the feasibility of predicting brain tumor progression based on multiple MRI series and we tested our methods on seven patients' MRI images scanned at three consecutive visits A, B and C. Experimental results showed that it is feasible to predict tumor progression from visit A to visit C using a model trained by the information from visit A to visit B. However, the trained model failed when we tried to predict tumor progression from visit B to visit C, though it is clinically more important. A closer look at the MRI scans revealed that histograms of series such as T1, T2, and FLAIR taken at different times have slight shifts or different shapes. This is because those MRI scans are qualitative instead of quantitative, so MRI scans taken at different times or by different scanners might have slightly different scales or different homogeneities in the scanning region. In this paper, we propose a method to overcome this difficulty. The overall goal of this study is to assess brain tumor progression by exploring seven patients' complete MRI records scanned during their visits in the past two years. There are ten MRI series in each visit, including FLAIR, T1-weighted, post-contrast T1-weighted, T2-weighted and five DTI derived MRI volumes: ADC, FA, Max, Min and Middle Eigen Values. After registering all series to the corresponding DTI scan at the first visit, we applied a histogram matching algorithm to non-DTI MRI scans to match their histograms to those of the corresponding MRI scans at the first visit. DTI derived series are quantitative and do not require the histogram matching procedure. A machine learning algorithm was then trained using the data containing information from visit A to visit B, and the trained model was used to predict tumor progression from visit B to visit C. An average of 72% pixel-wise accuracy was achieved for tumor progression prediction from visit B to visit C.
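    The abstract does not spell out its exact histogram matching algorithm, so the following is the textbook CDF-based formulation: map each source gray level to the reference level whose cumulative distribution value is closest.

```python
# Hedged sketch of CDF-based histogram matching (textbook formulation,
# not necessarily the paper's exact algorithm).
def histogram_match(source, reference, levels=256):
    """Remap source intensities so their CDF matches the reference CDF."""
    def cdf(vals):
        hist = [0] * levels
        for v in vals:
            hist[v] += 1
        total, run, out = len(vals), 0, []
        for h in hist:
            run += h
            out.append(run / total)
        return out

    cs, cr = cdf(source), cdf(reference)
    # For each source level, pick the reference level with the closest CDF.
    lut = [min(range(levels), key=lambda r: abs(cr[r] - cs[g]))
           for g in range(levels)]
    return [lut[v] for v in source]

# Toy 2-bit example: the matched image adopts the reference's histogram shape.
print(histogram_match([0, 0, 1, 2], [0, 1, 2, 3], levels=4))  # → [1, 1, 2, 3]
```

    Applied per series, such a mapping removes the inter-visit intensity shifts that broke the trained model.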

  18. Improved dose-volume histogram estimates for radiopharmaceutical therapy by optimizing quantitative SPECT reconstruction parameters

    NASA Astrophysics Data System (ADS)

    Cheng, Lishui; Hobbs, Robert F.; Segars, Paul W.; Sgouros, George; Frey, Eric C.

    2013-06-01

    In radiopharmaceutical therapy, an understanding of the dose distribution in normal and target tissues is important for optimizing treatment. Three-dimensional (3D) dosimetry takes into account patient anatomy and the nonuniform uptake of radiopharmaceuticals in tissues. Dose-volume histograms (DVHs) provide a useful summary representation of the 3D dose distribution and have been widely used for external beam treatment planning. Reliable 3D dosimetry requires an accurate 3D radioactivity distribution as the input. However, activity distribution estimates from SPECT are corrupted by noise and partial volume effects (PVEs). In this work, we systematically investigated OS-EM based quantitative SPECT (QSPECT) image reconstruction in terms of its effect on DVHs estimates. A modified 3D NURBS-based Cardiac-Torso (NCAT) phantom that incorporated a non-uniform kidney model and clinically realistic organ activities and biokinetics was used. Projections were generated using a Monte Carlo (MC) simulation; noise effects were studied using 50 noise realizations with clinical count levels. Activity images were reconstructed using QSPECT with compensation for attenuation, scatter and collimator-detector response (CDR). Dose rate distributions were estimated by convolution of the activity image with a voxel S kernel. Cumulative DVHs were calculated from the phantom and QSPECT images and compared both qualitatively and quantitatively. We found that noise, PVEs, and ringing artifacts due to CDR compensation all degraded histogram estimates. Low-pass filtering and early termination of the iterative process were needed to reduce the effects of noise and ringing artifacts on DVHs, but resulted in increased degradations due to PVEs. Large objects with few features, such as the liver, had more accurate histogram estimates and required fewer iterations and more smoothing for optimal results. Smaller objects with fine details, such as the kidneys, required more iterations and less
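    The cumulative DVH summary used throughout this abstract reduces to a simple computation over the voxel doses; the voxel values and bin edges below are hypothetical.

```python
# Minimal sketch of a cumulative dose-volume histogram: the fraction of
# a structure's voxels receiving at least each dose level. The voxel
# doses and bin edges are hypothetical.
def cumulative_dvh(doses, bin_edges):
    n = len(doses)
    return [sum(1 for d in doses if d >= edge) / n for edge in bin_edges]

voxel_doses = [10.0, 20.0, 20.0, 40.0]
print(cumulative_dvh(voxel_doses, [0.0, 15.0, 30.0]))  # → [1.0, 0.75, 0.25]
```

    Noise and partial volume effects corrupt the voxel doses entering this sum, which is why the reconstruction parameters studied here matter for the resulting curve.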

  19. Phase-unwrapping algorithm for images with high noise content based on a local histogram.

    PubMed

    Meneses, Jaime; Gharbi, Tijani; Humbert, Philippe

    2005-03-01

    We present a robust algorithm of phase unwrapping that was designed for use on phase images with high noise content. We proceed with the algorithm by first identifying regions with continuous phase values placed between fringe boundaries in an image and then phase shifting the regions with respect to one another by multiples of 2pi to unwrap the phase. Image pixels are segmented between interfringe and fringe boundary areas by use of a local histogram of a wrapped phase. The algorithm has been used successfully to unwrap phase images generated in a three-dimensional shape measurement for noninvasive quantification of human skin structure in dermatology, cosmetology, and plastic surgery.

  20. Principal Component Analysis-Based Pattern Analysis of Dose-Volume Histograms and Influence on Rectal Toxicity

    SciTech Connect

    Soehn, Matthias Alber, Markus; Yan Di

    2007-09-01

Purpose: The variability of dose-volume histogram (DVH) shapes in a patient population can be quantified using principal component analysis (PCA). We applied this to rectal DVHs of prostate cancer patients and investigated the correlation of the PCA parameters with late bleeding. Methods and Materials: PCA was applied to the rectal wall DVHs of 262 patients, who had been treated with a four-field box, conformal adaptive radiotherapy technique. The correlated changes in the DVH pattern were revealed as 'eigenmodes,' which were ordered by their importance to represent data set variability. Each DVH is uniquely characterized by its principal components (PCs). The correlation of the first three PCs and chronic rectal bleeding of Grade 2 or greater was investigated with uni- and multivariate logistic regression analyses. Results: Rectal wall DVHs in four-field conformal RT can primarily be represented by the first two or three PCs, which describe ≈94% or 96% of the DVH shape variability, respectively. The first eigenmode models the total irradiated rectal volume; thus, PC1 correlates to the mean dose. Mode 2 describes the interpatient differences of the relative rectal volume in the two- or four-field overlap region. Mode 3 reveals correlations of volumes with intermediate doses (≈40-45 Gy) and volumes with doses >70 Gy; thus, PC3 is associated with the maximal dose. According to univariate logistic regression analysis, only PC2 correlated significantly with toxicity. However, multivariate logistic regression analysis with the first two or three PCs revealed an increased probability of bleeding for DVHs with more than one large PC. Conclusions: PCA can reveal the correlation structure of DVHs for a patient population as imposed by the treatment technique and provide information about its relationship to toxicity. It proves useful for augmenting normal tissue complication probability modeling approaches.
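    The PCA decomposition described above can be sketched on synthetic stand-in data: each row is one patient's cumulative DVH sampled at fixed dose bins, and an SVD of the centered matrix yields the eigenmodes and per-patient scores. The data here are simulated, not the study's.

```python
import numpy as np

# Hedged sketch of PCA on a population of DVH curves; synthetic data.
rng = np.random.default_rng(0)
base = np.linspace(1.0, 0.0, 20)                    # generic falling DVH shape
dvhs = base + 0.05 * rng.standard_normal((262, 20)) # 262 patients, 20 dose bins

centered = dvhs - dvhs.mean(axis=0)
# SVD of the centered matrix: rows of vt are the eigenmodes, and the
# squared singular values give each mode's share of the variability.
u, s, vt = np.linalg.svd(centered, full_matrices=False)
explained = s**2 / np.sum(s**2)
scores = centered @ vt.T                            # per-patient PCs

print(explained[:3].sum())  # variability captured by the first three modes
```

    Each patient's DVH is then summarized by a handful of scores, which is what makes the subsequent logistic regression on PC1-PC3 tractable.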

  1. Locally weighted histogram analysis and stochastic solution for large-scale multi-state free energy estimation

    NASA Astrophysics Data System (ADS)

    Tan, Zhiqiang; Xia, Junchao; Zhang, Bin W.; Levy, Ronald M.

    2016-01-01

The weighted histogram analysis method (WHAM), including its binless extension, has been developed independently in several different contexts, and is widely used in chemistry, physics, and statistics for computing free energies and expectations from multiple ensembles. However, this method, while statistically efficient, is computationally costly or even infeasible when a large number, hundreds or more, of distributions are studied. We develop a locally weighted histogram analysis method (local WHAM) from the perspective of simulations of simulations (SOS), using generalized serial tempering (GST) to resample simulated data from multiple ensembles. The local WHAM equations based on one jump attempt per GST cycle can be solved by optimization algorithms orders of magnitude faster than standard implementations of global WHAM, but yield estimates of free energies similarly accurate to global WHAM estimates. Moreover, we propose an adaptive SOS procedure for solving local WHAM equations stochastically when multiple jump attempts are performed per GST cycle. Such a stochastic procedure can lead to more accurate estimates of equilibrium distributions than local WHAM with one jump attempt per cycle. The proposed methods are broadly applicable when the original data to be "WHAMMED" are obtained properly by any sampling algorithm including serial tempering and parallel tempering (replica exchange). To illustrate the methods, we estimated absolute binding free energies and binding energy distributions using the binding energy distribution analysis method from one and two dimensional replica exchange molecular dynamics simulations for the beta-cyclodextrin-heptanoate host-guest system. In addition to the computational advantage of handling large datasets, our two dimensional WHAM analysis also demonstrates that accurate results similar to those from well-converged data can be obtained from simulations for which sampling is limited and not fully equilibrated.

  2. Midwives, gender equality and feminism.

    PubMed

    Walsh, Denis

    2016-03-01

Gender inequality and the harmful effects of patriarchy are sustaining the widespread oppression of women across the world, and this is also having an impact on maternity services with unacceptable rates of maternal mortality, the continued underinvestment in the midwifery profession and the limiting of women's place of birth options. However, alongside these effects, the current zeitgeist is affirming an alignment of feminism and gender equality such that both have a high profile in public discourse. This presents a once in a generation opportunity for midwives to self-declare as feminists and commit to righting the wrongs of this most pernicious form of discrimination. PMID:27044191

  3. Midwives, gender equality and feminism.

    PubMed

    Walsh, Denis

    2016-03-01

Gender inequality and the harmful effects of patriarchy are sustaining the widespread oppression of women across the world, and this is also having an impact on maternity services with unacceptable rates of maternal mortality, the continued underinvestment in the midwifery profession and the limiting of women's place of birth options. However, alongside these effects, the current zeitgeist is affirming an alignment of feminism and gender equality such that both have a high profile in public discourse. This presents a once in a generation opportunity for midwives to self-declare as feminists and commit to righting the wrongs of this most pernicious form of discrimination.

  4. Educational Equality: Luck Egalitarian, Pluralist and Complex

    ERIC Educational Resources Information Center

    Calvert, John

    2014-01-01

    The basic principle of educational equality is that each child should receive an equally good education. This sounds appealing, but is rather vague and needs substantial working out. Also, educational equality faces all the objections to equality per se, plus others specific to its subject matter. Together these have eroded confidence in the…

  5. Gender equality and human rights.

    PubMed

    1998-01-01

This editorial introduces an issue of INSTRAW News that commemorates the 50th anniversary of the UN's Universal Declaration of Human Rights. This introduction notes that the lead article in the journal expresses optimism about potential progress towards achieving gender equity and human rights because 1) industrialized countries are undergoing a "powershift" to an information society that will offer more and better jobs for women and give women greater access to the power of information, 2) women's earnings have increased worldwide, 3) more and more women are organizing on their own behalf, and 4) a public discourse is being created that promotes the mainstreaming of women's rights and their equality. In addition, several of the UN's international treaties promote gender equality and women's human rights. Foremost among these are 1) the Convention on the Elimination of All Forms of Discrimination Against Women; 2) the Vienna Declaration and Programme of Action, adopted by the 1993 World Conference on Human Rights; 3) the Declaration on the Elimination of Violence Against Women; and 4) the Platform for Action of the Fourth World Conference on Women. Counteracting these positive steps is a trend towards defining identity and rights on the basis of community membership only, which ignores the fact that cultures, traditions, and religions are not gender neutral. Given the challenges ahead, the partnership model of society created by women when they have political power is more likely to result in sustainable solutions than the dominator model that men have forwarded for the past 6000 years.

  6. Anterior chamber angle classification using multiscale histograms of oriented gradients for glaucoma subtype identification.

    PubMed

    Xu, Yanwu; Liu, Jiang; Tan, Ngan Meng; Lee, Beng Hai; Wong, Damon Wing Kee; Baskaran, Mani; Perera, Shamira A; Aung, Tin

    2012-01-01

Glaucoma subtype can be identified according to the configuration of the anterior chamber angle (ACA). In this paper, we present an ACA classification approach based on histograms of oriented gradients at multiple scales. In digital optical coherence tomography (OCT) photographs, our method automatically localizes the ACA, and extracts histograms of oriented gradients (HOG) features from this region to classify the angle as an open angle (OA) or an angle-closure (AC). This proposed method has three major features that differ from existing methods. First, the ACA localization from OCT images is fully automated and efficient for different ACA configurations. Second, the ACA is directly classified as OA/AC by using multiscale HOG visual features only, which is different from previous ACA assessment approaches that rely on clinical features. Third, it demonstrates that visual features with higher dimensions outperform low-dimensional clinical features in terms of angle-closure classification accuracy. Testing was performed on a large clinical dataset comprising 2048 images. The proposed method achieves a 0.835±0.068 AUC value and 75.8% ± 6.4% balanced accuracy at 85% specificity, which outperforms existing ACA classification approaches based on clinical features.
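    As a hedged single-cell sketch of the HOG features named above: gradient magnitudes are accumulated into unsigned-orientation bins. Real HOG adds cell grids, block normalization, and (as in this paper) multiple scales; none of that is shown here.

```python
import math

# Hedged single-cell HOG sketch: accumulate gradient magnitudes into
# unsigned-orientation (0-180 degree) bins.
def hog_cell(patch, n_bins=9):
    """Orientation histogram for one cell of gray values."""
    h, w = len(patch), len(patch[0])
    hist = [0.0] * n_bins
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = patch[y][x + 1] - patch[y][x - 1]   # central differences
            gy = patch[y + 1][x] - patch[y - 1][x]
            mag = math.hypot(gx, gy)
            ang = math.degrees(math.atan2(gy, gx)) % 180.0
            hist[int(ang / 180.0 * n_bins) % n_bins] += mag
    return hist

# A vertical edge puts all gradient energy in the 0-degree bin.
print(hog_cell([[0, 0, 10, 10]] * 4))
```

    Concatenating such histograms over a grid of cells at several scales yields the high-dimensional visual feature vector the classifier uses.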

  7. Stereo vision-based vehicle detection using a road feature and disparity histogram

    NASA Astrophysics Data System (ADS)

    Lee, Chung-Hee; Lim, Young-Chul; Kwon, Soon; Lee, Jong-Hun

    2011-02-01

This paper presents a stereo vision-based vehicle detection approach on the road using a road feature and disparity histogram. It is not easy to detect only vehicles robustly on the road in various traffic situations, for example, a nonflat road or a multiple-obstacle situation. This paper focuses on the improvement of vehicle detection performance in various real traffic situations. The approach consists of three steps, namely obstacle localization, obstacle segmentation, and vehicle verification. First, we extract a road feature from v-disparity maps binarized using the most frequent values in each row and column, and adopt the extracted road feature as an obstacle criterion in column detection. However, many obstacles still coexist in each localized obstacle area. Thus, we divide the localized obstacle area into multiple obstacles using a disparity histogram and remerge the divided obstacles using four criteria parameters, namely the obstacle size, distance, and angle between the divided obstacles, and the difference of disparity values. Finally, we verify the vehicles using a depth map and gray image to improve the performance. We verify the performance of our proposed method by conducting experiments in various real traffic situations. The average recall rate of vehicle detection is 95.5%.
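    The v-disparity map at the heart of this approach is itself a row-wise histogram: count how often each disparity occurs in each image row, then take the per-row mode as a road-profile candidate. The tiny disparity map below is illustrative only.

```python
# Hedged sketch of the v-disparity idea: one disparity histogram per
# image row; on a flat road the per-row modes trace the road profile.
def v_disparity(disp_map, max_disp=8):
    """One disparity histogram per image row."""
    return [[row.count(d) for d in range(max_disp)] for row in disp_map]

def row_modes(vmap):
    """Most frequent disparity in each row (road-profile candidate)."""
    return [max(range(len(h)), key=h.__getitem__) for h in vmap]

disp = [[5, 5, 4, 5],    # far rows: small disparity
        [6, 6, 6, 6],
        [7, 7, 7, 5]]    # near rows: large disparity
print(row_modes(v_disparity(disp)))  # → [5, 6, 7]
```

    Pixels whose disparity departs from the row's road value then stand out as obstacle candidates, which is the criterion the paper refines.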

  8. Rapid dynamic radial MRI via reference image enforced histogram constrained reconstruction

    NASA Astrophysics Data System (ADS)

    Gaass, Thomas; Bauman, Grzegorz; Potdevin, Guillaume; Noël, Peter B.; Haase, Axel

    2014-03-01

Exploiting spatio-temporal redundancies in sub-Nyquist sampled dynamic MRI for the suppression of undersampling artifacts has been shown to be highly successful. However, temporally averaged and blurred structures in image space composite data pose the risk of false information in the reconstruction. Within this work we assess the possibility of employing the composite image histogram as a measure of undersampling artifacts and as basis of their suppression. The proposed algorithm utilizes a histogram, computed from a composite image within a dynamically acquired interleaved radial MRI measurement, as reference to compensate for the impact of undersampling in temporally resolved data without the incorporation of temporal averaging. In addition, an image space regularization utilizing a single frame low-resolution reconstruction is implemented to enforce overall contrast fidelity. The performance of the approach was evaluated on a simulated radial dynamic MRI acquisition and on two functional in vivo radial cardiac acquisitions. Results demonstrate that the algorithm maintained contrast properties, details and temporal resolution in the images, while effectively suppressing undersampling artifacts.

  9. 3D target tracking in infrared imagery by SIFT-based distance histograms

    NASA Astrophysics Data System (ADS)

    Yan, Ruicheng; Cao, Zhiguo

    2011-11-01

The SIFT tracking algorithm is an excellent point-based tracking algorithm with high performance and accuracy, owing to its robustness against rotation, scale change and occlusion. However, when tracking a huge 3D target in complicated real scenarios in a forward-looking infrared (FLIR) image sequence taken from an airborne moving platform, the tracked point located on the vertical surface usually shifts away from the correct position. In this paper, we propose a novel algorithm for 3D target tracking in FLIR image sequences. Our approach uses SIFT keypoints detected in consecutive frames for point correspondence. The candidate position of the tracked point is first estimated by computing the affine transformation using local corresponding SIFT keypoints. Then the correct position is located via an optimization method. Euclidean distances between a candidate point and nearby SIFT keypoints are calculated and formed into a SIFT-based distance histogram. The distance histogram defines a cost of associating each candidate point with the correct tracked point, using a constraint based on the topology of each candidate point and its surrounding SIFT keypoints. Minimization of the cost is formulated as a combinatorial optimization problem. Experiments demonstrate that the proposed algorithm efficiently improves the tracking performance and accuracy.
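    The distance-histogram construction described above can be sketched as follows: bin the Euclidean distances from a candidate point to its surrounding keypoints, then score candidates against the reference histogram. The chi-square-style cost here is an assumed stand-in; the paper's exact cost function is not given in the abstract.

```python
import math

# Hedged sketch of a SIFT-based distance histogram and an assumed
# chi-square-style association cost between two such histograms.
def distance_histogram(point, keypoints, n_bins=8, max_dist=100.0):
    hist = [0] * n_bins
    for kp in keypoints:
        d = math.dist(point, kp)
        if d < max_dist:
            hist[int(d / max_dist * n_bins)] += 1
    return hist

def association_cost(h_ref, h_cand):
    """Chi-square-style dissimilarity between two distance histograms."""
    return sum((a - b) ** 2 / (a + b) for a, b in zip(h_ref, h_cand) if a + b)

keypoints = [(0.0, 0.0), (30.0, 0.0), (60.0, 0.0)]  # hypothetical SIFT points
h_ref = distance_histogram((0.0, 0.0), keypoints)
print(association_cost(h_ref, h_ref))  # identical layouts cost 0
```

    Minimizing this cost over candidate positions selects the candidate whose keypoint topology best matches the reference, which is the association principle the paper formalizes.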

  10. User Aligned Histogram Stacks for Visualization of Abdominal Organs via MRI

    NASA Astrophysics Data System (ADS)

    Özdemir, M.; Akay, O.; Güzeliş, C.; Dicle, O.; Selver, M. A.

    2016-08-01

    Multi-dimensional transfer functions (MDTF) are occasionally designed as two-step approaches. At the first step, the constructed domain is modelled coarsely using global volume statistics and an initial transfer function (TF) is designed. Then, a finer classification is performed using local information to refine the TF design. In this study, both a new TF domain and a novel two-step MDTF strategy are proposed for visualization of abdominal organs. The proposed domain is generated by aligning the histograms of the slices, which are reconstructed based on user aligned majority axis/regions through an interactive Multi-Planar Reconstruction graphical user interface. It is shown that these user aligned histogram stacks (UAHS) exploit more a priori information by providing tissue specific inter-slice spatial domain knowledge. For initial TF design, UAHS are approximated using a multi-scale hierarchical Gaussian mixture model, which is designed to work in quasi real time. Then, a finer classification step is carried out for refinement of the initial result. Applications to several MRI data sets acquired with various sequences demonstrate improved visualization of abdomen.

  11. An accurate skull stripping method based on simplex meshes and histogram analysis for magnetic resonance images.

    PubMed

    Galdames, Francisco J; Jaillet, Fabrice; Perez, Claudio A

    2012-01-01

    Skull stripping methods are designed to eliminate the non-brain tissue in magnetic resonance (MR) brain images. Removal of non-brain tissues is a fundamental step in enabling the processing of brain MR images. The aim of this study is to develop an automatic accurate skull stripping method based on deformable models and histogram analysis. A rough-segmentation step is used to find the optimal starting point for the deformation and is based on thresholds and morphological operators. Thresholds are computed using comparisons with an atlas, and modeling by Gaussians. The deformable model is based on a simplex mesh and its deformation is controlled by the image local gray levels and the information obtained on the gray level modeling of the rough-segmentation. Our Simplex Mesh and Histogram Analysis Skull Stripping (SMHASS) method was tested on the following international databases commonly used in scientific articles: BrainWeb, Internet Brain Segmentation Repository (IBSR), and Segmentation Validation Engine (SVE). A comparison was performed against three of the best skull stripping methods previously published: Brain Extraction Tool (BET), Brain Surface Extractor (BSE), and Hybrid Watershed Algorithm (HWA). Performance was measured using the Jaccard index (J) and Dice coefficient (κ). Our method showed the best performance and differences were statistically significant (p<0.05): J=0.904 and κ=0.950 on BrainWeb; J=0.905 and κ=0.950 on IBSR; J=0.946 and κ=0.972 on SVE.

  12. Addendum to brachytherapy dose-volume histogram commissioning with multiple planning systems.

    PubMed

    Gossman, Michael S

    2016-01-01

The process for validating dose-volume histogram data in brachytherapy software is presented as a supplement to a previously published article. Included is the DVH accuracy evaluation of the Best NOMOS treatment planning system called "Best TPS VolumePlan." As done previously in other software, a rectangular cuboid was contoured in the treatment planning system. A single radioactive 125I source was positioned coplanar and concentric with one end. Calculations were performed to estimate dose deposition in partial volumes of the cuboid structure, using the brachytherapy dosimetry formalism defined in AAPM Task Group 43. Hand-calculated, dose-volume results were compared to TPS-generated, point-source-approximated dose-volume histogram data to establish acceptance. The required QA for commissioning was satisfied for the DVH as conducted previously for other software, using the criterion that the DVH %Vol_TPS "actual variance" calculations should differ by no more than 5% at any specific radial distance with respect to %Vol_TG-43, and the "average variance" DVH %Vol_TPS calculations should differ by no more than 2% over all radial distances with respect to %Vol_TG-43. The average disagreement observed between hand calculations and treatment planning system DVH was less than 0.5% on average for this treatment planning system and less than 1.1% maximally for 1 ≤ r ≤ 5 cm. PMID:27167288

  13. Detection of Basal Cell Carcinoma Using Color and Histogram Measures of Semitranslucent Areas

    PubMed Central

    Stoecker, William V.; Gupta, Kapil; Shrestha, Bijaya; Wronkiewiecz, Mark; Chowdhury, Raeed; Stanley, R. Joe; Xu, Jin; Moss, Randy H.; Celebi, M. Emre; Rabinovitz, Harold S.; Oliviero, Margaret; Malters, Joseph M.; Kolm, Isabel

    2009-01-01

    Background Semitranslucency, defined as a smooth, jelly-like area with varied, near-skin-tone color, can indicate a diagnosis of basal cell carcinoma (BCC) with high specificity. This study sought to analyze potential areas of semitranslucency with histogram-derived texture and color measures to discriminate BCC from non-semitranslucent areas in non-BCC skin lesions. Methods For 210 dermoscopy images, the areas of semitranslucency in 42 BCCs and comparable areas of smoothness and color in 168 non-BCCs were selected manually. Six color measures and six texture measures were applied to the semitranslucent areas of the BCC and the comparable areas in the non-BCC images. Results Receiver operating characteristic (ROC) curve analysis showed that the texture measures alone provided greater separation of BCC from non-BCC than the color measures alone. Statistical analysis showed that the four most important measures of semitranslucency are three histogram measures: contrast, smoothness, and entropy, and one color measure: blue chromaticity. Smoothness is the single most important measure. The combined 12 measures achieved a diagnostic accuracy of 95.05% based on area under the ROC curve. Conclusion Texture and color analysis measures, especially smoothness, may afford automatic detection of basal cell carcinoma images with semitranslucency. PMID:19624424
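
    The histogram measures named above (contrast, smoothness, entropy) and blue chromaticity have standard textbook definitions; the sketch below assumes those definitions, which may differ in detail from the authors' implementation.

```python
import math

# Statistical texture measures computed from an intensity histogram, plus
# blue chromaticity; assumed standard definitions, not the paper's exact ones.
def texture_measures(pixels):
    n = len(pixels)
    hist = {}
    for z in pixels:
        hist[z] = hist.get(z, 0) + 1
    p = {z: c / n for z, c in hist.items()}       # normalized histogram
    mean = sum(z * pz for z, pz in p.items())
    var = sum((z - mean) ** 2 * pz for z, pz in p.items())
    contrast = math.sqrt(var)                     # standard deviation
    smoothness = 1 - 1 / (1 + var / 255.0 ** 2)   # R, intensities in [0, 255]
    entropy = -sum(pz * math.log2(pz) for pz in p.values())
    return contrast, smoothness, entropy

def blue_chromaticity(r, g, b):
    return b / (r + g + b)
```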

  14. Radial polar histogram: obstacle avoidance and path planning for robotic cognition and motion control

    NASA Astrophysics Data System (ADS)

    Wang, Po-Jen; Keyawa, Nicholas R.; Euler, Craig

    2012-01-01

    In order to achieve highly accurate motion control and path planning for a mobile robot, an obstacle avoidance algorithm that provides a desired instantaneous turning radius and velocity was generated. This type of obstacle avoidance algorithm, which has been implemented in California State University Northridge's Intelligent Ground Vehicle (IGV), is known as the Radial Polar Histogram (RPH). The RPH algorithm utilizes raw data in the form of a polar histogram that is read from a Laser Range Finder (LRF) and a camera. A desired open block is determined from the raw data using a navigational heading and an elliptical approximation. The leftmost and rightmost radii are determined from the calculated edges of the open block and provide the range of possible radial paths the IGV can travel through. In addition, the calculated obstacle edge positions allow the IGV to recognize complex obstacle arrangements and to slow down accordingly. A radial path optimization function calculates the best radial path between the leftmost and rightmost radii, which is sent to motion control for speed determination. Overall, the RPH algorithm allows the IGV to travel autonomously at average speeds of 3 mph while avoiding all obstacles, with a processing time of approximately 10 ms.
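
    The gap-finding step described above (determining an open block of sectors from a polar histogram) can be illustrated with a toy sketch; the function name and per-sector occupancy representation are hypothetical, and the real RPH additionally applies a navigational heading and an elliptical approximation.

```python
# Toy sketch of gap-finding in a polar-histogram obstacle avoider: given
# per-sector occupancy flags (True = blocked) from a range scan, return the
# bounding sector indices of the widest contiguous open block.
def widest_open_block(blocked):
    best = (0, -1, -1)  # (width, start, end)
    start = None
    for i, b in enumerate(blocked + [True]):   # sentinel closes a trailing gap
        if not b and start is None:
            start = i
        elif b and start is not None:
            width = i - start
            if width > best[0]:
                best = (width, start, i - 1)
            start = None
    return best[1], best[2]

print(widest_open_block([True, False, False, False, True, False, True]))  # (1, 3)
```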

  15. Two non-parametric methods for derivation of constraints from radiotherapy dose-histogram data

    NASA Astrophysics Data System (ADS)

    Ebert, M. A.; Gulliford, S. L.; Buettner, F.; Foo, K.; Haworth, A.; Kennedy, A.; Joseph, D. J.; Denham, J. W.

    2014-07-01

    Dose constraints based on histograms provide a convenient and widely-used method for informing and guiding radiotherapy treatment planning. Methods of derivation of such constraints are often poorly described. Two non-parametric methods for derivation of constraints are described and investigated in the context of determination of dose-specific cut-points—values of the free parameter (e.g., percentage volume of the irradiated organ) which best reflect resulting changes in complication incidence. A method based on receiver operating characteristic (ROC) analysis and one based on a maximally-selected standardized rank sum are described and compared using rectal toxicity data from a prostate radiotherapy trial. Multiple test corrections are applied using a free step-down resampling algorithm, which accounts for the large number of tests undertaken to search for optimal cut-points and the inherent correlation between dose-histogram points. Both methods provide consistent significant cut-point values, with the rank sum method displaying some sensitivity to the underlying data. The ROC method is simple to implement and can utilize a complication atlas, though an advantage of the rank sum method is the ability to incorporate all complication grades without the need for grade dichotomization.
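
    A minimal sketch of the ROC-based cut-point search, assuming the common choice of maximising Youden's index J = sensitivity + specificity - 1 over candidate thresholds; the paper's exact optimality criterion and the multiple-test correction are omitted.

```python
# For each candidate threshold on a DVH parameter, compute sensitivity and
# specificity against complication incidence and keep the threshold with the
# largest Youden index. Illustrative sketch, not the authors' implementation.
def roc_cut_point(values, events):
    best_t, best_j = None, -1.0
    for t in sorted(set(values)):
        tp = sum(1 for v, e in zip(values, events) if v >= t and e)
        fn = sum(1 for v, e in zip(values, events) if v < t and e)
        tn = sum(1 for v, e in zip(values, events) if v < t and not e)
        fp = sum(1 for v, e in zip(values, events) if v >= t and not e)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1.0
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j
```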

  16. Decoding brain cancer dynamics: a quantitative histogram-based approach using temporal MRI

    NASA Astrophysics Data System (ADS)

    Zhou, Mu; Hall, Lawrence O.; Goldgof, Dmitry B.; Russo, Robin; Gillies, Robert J.; Gatenby, Robert A.

    2015-03-01

    Brain tumor heterogeneity remains a challenge for probing brain cancer evolutionary dynamics. In light of evolution, it is a priority to inspect the cancer system from a time-domain perspective since it explicitly tracks the dynamics of cancer variations. In this paper, we study the problem of exploring brain tumor heterogeneity from temporal clinical magnetic resonance imaging (MRI) data. Our goal is to discover evidence-based knowledge from such temporal imaging data, where multiple clinical MRI scans from Glioblastoma multiforme (GBM) patients are generated during therapy. In particular, we propose a quantitative histogram-based approach that builds a prediction model to measure the difference in histograms obtained from pre- and post-treatment. The study could significantly assist radiologists by providing a metric to identify distinctive patterns within each tumor, which is crucial for the goal of providing patient-specific treatments. We examine the proposed approach for a practical application - clinical survival group prediction. Experimental results show that our approach achieved 90.91% accuracy.
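
    A bin-wise histogram difference of the kind such an approach builds on can be sketched as follows; the binning, normalisation, and L1 summary here are illustrative assumptions, not the paper's prediction model.

```python
# Bin the voxel intensities of pre- and post-treatment scans into normalized
# histograms and summarise the change by the L1 distance between them.
def histogram_l1_change(pre, post, bins=8, lo=0.0, hi=256.0):
    def hist(vals):
        h = [0] * bins
        w = (hi - lo) / bins
        for v in vals:
            h[min(int((v - lo) / w), bins - 1)] += 1
        n = float(len(vals))
        return [c / n for c in h]
    return sum(abs(a - b) for a, b in zip(hist(pre), hist(post)))
```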

  17. Contrast enhancement based on layered difference representation of 2D histograms.

    PubMed

    Lee, Chulwoo; Lee, Chul; Kim, Chang-Su

    2013-12-01

    A novel contrast enhancement algorithm based on the layered difference representation of 2D histograms is proposed in this paper. We attempt to enhance image contrast by amplifying the gray-level differences between adjacent pixels. To this end, we obtain the 2D histogram h(k, k+l) from an input image, which counts the pairs of adjacent pixels with gray-levels k and k+l, and represent the gray-level differences in a tree-like layered structure. Then, we formulate a constrained optimization problem based on the observation that the gray-level differences, occurring more frequently in the input image, should be more emphasized in the output image. We first solve the optimization problem to derive the transformation function at each layer. We then combine the transformation functions at all layers into the unified transformation function, which is used to map input gray-levels to output gray-levels. Experimental results demonstrate that the proposed algorithm enhances images efficiently in terms of both objective quality and subjective quality.
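
    The first step, building the 2D histogram h(k, k+l) of adjacent-pixel gray-level pairs, can be sketched as follows (horizontal neighbors only; the layered representation and the optimization are omitted):

```python
# Count pairs of horizontally adjacent pixels with gray-levels (k, k+l).
def histogram_2d(image):
    h = {}
    for row in image:
        for a, b in zip(row, row[1:]):
            h[(a, b)] = h.get((a, b), 0) + 1
    return h

img = [[0, 1, 1],
       [2, 2, 3]]
print(histogram_2d(img))  # {(0, 1): 1, (1, 1): 1, (2, 2): 1, (2, 3): 1}
```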

  18. All equal-area map projections are created equal, but some are more equal than others

    USGS Publications Warehouse

    Usery, E.L.; Seong, J.C.

    2001-01-01

    High-resolution regional and global raster databases are currently being generated for a variety of environmental and scientific modeling applications. The projection of these data from geographic coordinates to a plane coordinate system is subject to significant areal error. Sources of error include users selecting an inappropriate projection or incorrect parameters for a given projection, algorithmic errors in commercial geographic information system (GIS) software, and errors resulting from the projection of data in the raster format. To assess the latter type of errors, the accuracy of raster projection was analyzed by two methods. First, a set of 12 one-degree by one-degree quadrilaterals placed at various latitudes was projected at several raster resolutions and compared to the projection of a vector representation of the same quadrilaterals. Second, several different raster resolutions of land cover data for Asia were projected and the total areas of 21 land cover categories were tabulated and compared. While equal-area projections are designed to specifically preserve area, the comparison of the results of the one-degree by one-degree quadrilaterals with the common equal area projections (e.g., the Mollweide) indicates a considerable variance in the one-degree area after projection. Similarly, the empirical comparison of land cover areas for Asia among various projections shows that total areas of land cover vary with projection type, raster resolution, and latitude. No single projection is best for all resolutions and all latitudes. While any of the equal-area projections tested are reasonably accurate for most applications with resolutions of eight-kilometer pixels or smaller, significant variances in accuracies appear at larger pixel sizes.

  19. Modified projection algorithms for solving the split equality problems.

    PubMed

    Dong, Qiao-Li; He, Songnian

    2014-01-01

    The split equality problem (SEP) has extraordinary utility and broad applicability in many areas of applied mathematics. Recently, Byrne and Moudafi (2013) proposed a CQ algorithm for solving it. In this paper, we propose a modification for the CQ algorithm, which computes the stepsize adaptively and performs an additional projection step onto two half-spaces in each iteration. We further propose a relaxation scheme for the self-adaptive projection algorithm by using projections onto half-spaces instead of those onto the original convex sets, which is much more practical. Weak convergence results for both algorithms are analyzed.
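
    A one-dimensional sketch of a CQ-type iteration for the SEP (find x in C and y in Q with Ax = By) is shown below, using interval sets and a fixed stepsize; the paper's contribution, the adaptive stepsize and the half-space relaxation, is omitted here.

```python
# Illustrative CQ-type iteration for the split equality problem in 1D.
def project(v, lo, hi):
    """Projection onto the interval [lo, hi]."""
    return min(max(v, lo), hi)

def cq_split_equality(a, b, C, Q, gamma=0.4, iters=200):
    x, y = C[0], Q[1]
    for _ in range(iters):
        r = a * x - b * y                 # residual Ax - By
        x = project(x - gamma * a * r, *C)
        y = project(y + gamma * b * r, *Q)
    return x, y

x, y = cq_split_equality(1.0, 1.0, (0.0, 5.0), (2.0, 3.0))
print(abs(x - y) < 1e-6)  # prints True: the iterates meet in the intersection
```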

  20. Brightness-equalized quantum dots

    PubMed Central

    Lim, Sung Jun; Zahid, Mohammad U.; Le, Phuong; Ma, Liang; Entenberg, David; Harney, Allison S.; Condeelis, John; Smith, Andrew M.

    2015-01-01

    As molecular labels for cells and tissues, fluorescent probes have shaped our understanding of biological structures and processes. However, their capacity for quantitative analysis is limited because photon emission rates from multicolour fluorophores are dissimilar, unstable and often unpredictable, which obscures correlations between measured fluorescence and molecular concentration. Here we introduce a new class of light-emitting quantum dots with tunable and equalized fluorescence brightness across a broad range of colours. The key feature is independent tunability of emission wavelength, extinction coefficient and quantum yield through distinct structural domains in the nanocrystal. Precise tuning eliminates a 100-fold red-to-green brightness mismatch of size-tuned quantum dots at the ensemble and single-particle levels, which substantially improves quantitative imaging accuracy in biological tissue. We anticipate that these materials engineering principles will vastly expand the optical engineering landscape of fluorescent probes, facilitate quantitative multicolour imaging in living tissue and improve colour tuning in light-emitting devices. PMID:26437175

  1. Impact of the radiotherapy technique on the correlation between dose-volume histograms of the bladder wall defined on MRI imaging and dose-volume/surface histograms in prostate cancer patients

    NASA Astrophysics Data System (ADS)

    Maggio, Angelo; Carillo, Viviana; Cozzarini, Cesare; Perna, Lucia; Rancati, Tiziana; Valdagni, Riccardo; Gabriele, Pietro; Fiorino, Claudio

    2013-04-01

    The aim of this study was to evaluate the correlation between the ‘true’ absolute and relative dose-volume histograms (DVHs) of the bladder wall, i.e. the dose-wall histogram (DWH) defined on MRI imaging, and other surrogates of bladder dosimetry in prostate cancer patients, planned both with 3D-conformal and intensity-modulated radiation therapy (IMRT) techniques. For 17 prostate cancer patients, previously treated with radical intent, CT and MRI scans were acquired and matched. The contours of the bladder walls were drawn using MRI images. External bladder surfaces were then used to generate artificial bladder walls by performing automatic contractions of 5, 7 and 10 mm. For each patient, 3D conformal radiotherapy (3DCRT) and IMRT treatment plans were generated with a prescription dose of 77.4 Gy (1.8 Gy/fr), and the DVHs of the whole bladder and of the artificial walls (DVH-5/10), together with dose-surface histograms (DSHs), were calculated and compared against the DWH in absolute and relative terms for both treatment planning techniques. Dedicated software (VODCA v. 4.4.0, MSS Inc.) was used to calculate the dose-volume/surface histograms. Correlation was quantified for selected dose-volume/surface parameters by the Spearman correlation coefficient. The agreement between %DWH and DVH5, DVH7 and DVH10 was found to be very good (maximum average deviations below 2%, SD < 5%): DVH5 showed the best agreement. The correlation was slightly better for absolute (R = 0.80-0.94) compared to relative (R = 0.66-0.92) histograms. The DSH was also found to be highly correlated with the DWH, although slightly higher deviations were generally found. The DVH was not a good surrogate of the DWH (R < 0.7 for most parameters). When comparing the two treatment techniques, more pronounced differences between relative histograms were seen for IMRT with respect to 3DCRT (p < 0.0001).

  2. The across frequency independence of equalization of interaural time delay in the equalization-cancellation model of binaural unmasking

    NASA Astrophysics Data System (ADS)

    Akeroyd, Michael A.

    2004-08-01

    The equalization stage in the equalization-cancellation model of binaural unmasking compensates for the interaural time delay (ITD) of a masking noise by introducing an opposite, internal delay [N. I. Durlach, in Foundations of Modern Auditory Theory, Vol. II., edited by J. V. Tobias (Academic, New York, 1972)]. Culling and Summerfield [J. Acoust. Soc. Am. 98, 785-797 (1995)] developed a multi-channel version of this model in which equalization was "free" to use the optimal delay in each channel. Two experiments were conducted to test if equalization was indeed free or if it was "restricted" to the same delay in all channels. One experiment measured binaural detection thresholds, using an adaptive procedure, for 1-, 5-, or 17-component tones against a broadband masking noise, in three binaural configurations (N0S180, N180S0, and N90S270). The thresholds for the 1-component stimuli were used to normalize the levels of each of the 5- and 17-component stimuli so that they were equally detectable. If equalization was restricted, then, for the 5- and 17-component stimuli, the N90S270 and N180S0 configurations would yield a greater threshold than the N0S180 configurations. No such difference was found. A subsequent experiment measured binaural detection thresholds, via psychometric functions, for a 2-component complex tone in the same three binaural configurations. Again, no differential effect of configuration was observed. An analytic model of the detection of a complex tone showed that the results were more consistent with free equalization than restricted equalization, although the size of the differences was found to depend on the shape of the psychometric function for detection.

  3. Flow cytometric titration of retroviral expression vectors: comparison of methods for analysis of immunofluorescence histograms derived from cells expressing low antigen levels.

    PubMed

    Sladek, T L; Jacobberger, J W

    1993-01-01

    Few quantitative studies addressing immunofluorescence histogram analysis have been published. One study by Overton (Cytometry 9:619-626, 1988) has shown threshold and histogram subtraction methods to be accurate for analysis of well-separated immunofluorescence distributions of positive and negative cells. An evaluation of methods to analyze immunofluorescence histograms when positive and negative immunofluorescence distributions overlap has not, to our knowledge, been reported. In this paper, data obtained from flow cytometry of immunofluorescently stained cells infected with recombinant retroviruses that produce a range of simian virus 40 large T antigen levels were analyzed by threshold, histogram subtraction, and distribution modeling methods. This analysis showed that as the separation between the immunofluorescence distributions of positive and negative cell populations decreases, the best method for histogram analysis is modeling, followed, in order, by histogram subtraction and threshold analysis.
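
    Of the three methods compared, Overton-style histogram subtraction is the easiest to sketch: subtract the control (negative) histogram channel by channel and count the positive residue. The function below is an illustrative assumption of that scheme for equal total cell counts, not the paper's implementation.

```python
# Channel-by-channel subtraction of a control histogram from a test histogram;
# the summed positive residue estimates the percentage of positive cells.
def percent_positive_subtraction(test_hist, control_hist):
    n = sum(test_hist)
    excess = sum(max(t - c, 0) for t, c in zip(test_hist, control_hist))
    return 100.0 * excess / n
```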

  4. Promoting Racial Equality in the Nursing Curriculum.

    ERIC Educational Resources Information Center

    Foolchand, M. K.

    1995-01-01

    Equality in nursing education and the profession can be promoted in the following ways: a working policy on racism and equal opportunities; curriculum content that explores stereotypes, values, attitudes, and prejudices; and multicultural health research, education, and promotion. (SK)

  5. The characterization of radioaerosol deposition in the healthy lung by histogram distribution analysis

    SciTech Connect

    Garrard, C.S.; Gerrity, T.R.; Schreiner, J.F.; Yeates, D.B.

    1981-12-01

    Thirteen healthy nonsmoking volunteers inhaled an 8.1 micrometers (MMAD) radioaerosol on two occasions. Aerosol deposition pattern within the right lung, as recorded by a gamma camera, was expressed as the 3rd and 4th moments of the distribution histogram (skew and kurtosis) of radioactivity during the first ten minutes after aerosol inhalation. Deposition pattern was also expressed as the percentage of deposited activity retained within the lung at 24 hr (24 hr % retention) and found to be significantly correlated with measures of skew (P less than 0.001). Tests of pulmonary function (FEV1, FVC, and MMFR) were significantly correlated with skew. Correlations were also demonstrated for these pulmonary function tests with 24 hr % retention but at lower levels of significance. Results indicate that changes in measures of forced expiratory airflow in healthy human volunteers influence deposition pattern and that the skew of the distribution of inhaled radioactivity may provide an acceptable index of deposition pattern.

  6. Communication: Iteration-free, weighted histogram analysis method in terms of intensive variables

    PubMed Central

    Kim, Jaegil; Keyes, Thomas; Straub, John E.

    2011-01-01

    We present an iteration-free weighted histogram method in terms of intensive variables that directly determines the inverse statistical temperature, βS = ∂S/∂E, with S the microcanonical entropy. The method eliminates iterative evaluations of the partition functions intrinsic to the conventional approach and leads to a dramatic acceleration of the posterior analysis of combining statistically independent simulations with no loss in accuracy. The synergistic combination of the method with generalized ensemble weights provides insights into the nature of the underlying phase transitions via signatures in βS characteristic of finite size systems. The versatility and accuracy of the method is illustrated for the Ising and Potts models. PMID:21842919

  7. Accelerating the weighted histogram analysis method by direct inversion in the iterative subspace

    PubMed Central

    Zhang, Cheng; Lai, Chun-Liang; Pettitt, B. Montgomery

    2016-01-01

    The weighted histogram analysis method (WHAM) for free energy calculations is a valuable tool to produce free energy differences with minimal error. Given multiple simulations, WHAM obtains from the distribution overlaps the optimal statistical estimator of the density of states, from which the free energy differences can be computed. The WHAM equations are often solved by an iterative procedure. In this work, we use a well-known linear algebra algorithm that allows for more rapid convergence to the solution. We find that the computational complexity of the iterative solution to WHAM and the closely related multiple Bennett acceptance ratio (MBAR) method can be improved by using the method of direct inversion in the iterative subspace. We give examples from a lattice model, a simple liquid and an aqueous protein solution. PMID:27453632

  8. Scale and Orientation-Based Background Weighted Histogram for Human Tracking

    NASA Astrophysics Data System (ADS)

    Laaroussi, Khadija; Saaidi, Abderrahim; Masrar, Mohamed; Satori, Khalid

    2016-09-01

    The Mean Shift procedure is a popular object tracking algorithm since it is fast, easy to implement and performs well in a range of conditions. However, the classic Mean Shift tracking algorithm fixes the size and orientation of the tracking window, which limits performance when the target's orientation and scale change. In this paper, we present a new human tracking algorithm based on the Mean Shift technique that estimates the position, scale and orientation changes of the target. This work combines moment features of the weight image with background information to design a robust tracking algorithm entitled Scale and Orientation-based Background Weighted Histogram (SOBWH). The experimental results show that the proposed SOBWH approach offers a good compromise between tracking precision and computation time, and they validate its robustness, especially under large background variation, scale and orientation changes, and similar background scenes.
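
    A common ingredient of such trackers is the background-weighted histogram, which down-weights target-model bins that are also prominent in the background. The sketch below shows the standard weighting (square root of the minimum nonzero background bin over each bin); SOBWH's scale and orientation estimation from weight-image moments is not reproduced here.

```python
import math

# Standard background-weighted histogram coefficients: bins that are strongly
# represented in the background receive weights below 1.
def background_weights(background_hist):
    nonzero = [v for v in background_hist if v > 0]
    o_min = min(nonzero)
    return [math.sqrt(o_min / v) if v > 0 else 1.0 for v in background_hist]
```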

  9. Use of morphology index histograms to quantify populations of the fungal pathogen Paracoccidioides brasiliensis.

    PubMed

    San-Blas, G; Padrón, R; Alamo, L; San-Blas, F

    1997-01-01

    To quantify the dimorphic process in wild and mutant strains of Paracoccidioides brasiliensis, we defined a morphology index (Mi) in terms of the maximum cell length (l), maximum cell diameter (d), and septal diameter (s), according to the equation Mi = 2.13 + 1.13 log10(ls/d²), whose intercept and slope were such that Mi was around 1 for yeast (spherical) cells or 4 for hyphal (elongated) cells. This discriminatory power was used to quantify morphological population mixtures through Mi histograms. During the temperature-induced dimorphic transition (either way), the mean Mi varied linearly with time, suggesting a continuity in the process. Also, in wild strains and mutants thereof we found an inverse relationship between the mean Mi and content of both cell wall chitin and 1,3-alpha-glucan.
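
    The morphology index above is a direct formula and can be computed as follows; the example values are illustrative, not data from the paper.

```python
import math

# Morphology index Mi = 2.13 + 1.13 * log10(l*s / d**2), with l the maximum
# cell length, d the maximum cell diameter and s the septal diameter
# (all in the same units).
def morphology_index(l, d, s):
    return 2.13 + 1.13 * math.log10(l * s / d ** 2)

# A sphere-like cell with a narrow septum scores near 1 (yeast-like),
# a long thin cell scores near 4 (hypha-like). Illustrative values:
print(round(morphology_index(5.0, 5.0, 0.5), 2))  # 1.0
```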

  10. Performance analysis of a dual-tree algorithm for computing spatial distance histograms

    PubMed Central

    Chen, Shaoping; Tu, Yi-Cheng; Xia, Yuni

    2011-01-01

    Many scientific and engineering fields produce large volumes of spatiotemporal data. The storage, retrieval, and analysis of such data impose great challenges on database systems design. Analysis of scientific spatiotemporal data often involves computing functions of all point-to-point interactions. One such analytics, the Spatial Distance Histogram (SDH), is of vital importance to scientific discovery. Recently, algorithms for efficient SDH processing in large-scale scientific databases have been proposed. These algorithms adopt a recursive tree-traversing strategy to process point-to-point distances in the visited tree nodes in batches, and thus require less time when compared to the brute-force approach where all pairwise distances have to be computed. Despite the promising experimental results, the complexity of such algorithms has not been thoroughly studied. In this paper, we present an analysis of such algorithms based on a geometric modeling approach. The main technique is to transform the analysis of point counts into a problem of quantifying the area of regions where pairwise distances can be processed in batches by the algorithm. From the analysis, we conclude that the number of pairwise distances that are left to be processed decreases exponentially with more levels of the tree visited. This leads to the proof of a time complexity lower than the quadratic time needed for a brute-force algorithm and builds the foundation for a constant-time approximate algorithm. Our model is also general in that it works for a wide range of point spatial distributions, histogram types, and space-partitioning options in building the tree. PMID:21804753
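
    The brute-force baseline against which such algorithms are compared is easy to state: O(n²) over all point pairs, binning each distance into fixed-width buckets. A minimal sketch:

```python
import math

# Brute-force Spatial Distance Histogram: all pairwise distances, binned into
# buckets of fixed width. The dual-tree algorithm's gain is resolving many
# pairwise distances per visited node pair instead of one at a time.
def sdh_brute_force(points, bucket_width, num_buckets):
    hist = [0] * num_buckets
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            d = math.dist(points[i], points[j])
            b = int(d / bucket_width)
            if b < num_buckets:
                hist[b] += 1
    return hist

pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
print(sdh_brute_force(pts, 1.0, 2))  # [0, 3]
```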

  11. Histogram Analysis of Gadoxetic Acid-Enhanced MRI for Quantitative Hepatic Fibrosis Measurement

    PubMed Central

    Kim, Honsoul; Park, Seong Ho; Kim, Eun Kyung; Kim, Myeong-Jin; Park, Young Nyun; Park, Hae-Jeong; Choi, Jin-Young

    2014-01-01

    Purpose The diagnosis and monitoring of liver fibrosis is an important clinical issue; however, this is usually achieved by invasive methods such as biopsy. We aimed to determine whether histogram analysis of hepatobiliary phase images of gadoxetic acid-enhanced magnetic resonance imaging (MRI) can provide non-invasive quantitative measurement of liver fibrosis. Methods This retrospective study was approved by the institutional ethics committee, and a waiver of informed consent was obtained. Hepatobiliary phase images of preoperative gadoxetic acid-enhanced MRI studies of 105 patients (69 males, 36 females; age 56.1±12.2) with pathologically documented liver fibrosis grades were analyzed. Fibrosis staging was F0/F1/F2/F3/F4 (METAVIR system) for 11/20/13/15/46 patients, respectively. Four regions-of-interest (ROI, each about 2 cm2) were placed on predetermined locations of representative images. The measured signal intensity of pixels in each ROI was used to calculate corrected coefficient of variation (cCV), skewness, and kurtosis. An average value of each parameter was calculated for comparison. Statistical analysis was performed by ANOVA, receiver operating characteristic (ROC) curve analysis, and linear regression. Results The cCV showed statistically significant differences among pathological fibrosis grades (P<0.001) whereas skewness and kurtosis did not. Univariable linear regression analysis suggested cCV to be a meaningful parameter in predicting the fibrosis grade (P<0.001, β = 0.40 and standard error  = 0.06). For discriminating F0-3 from F4, the area under ROC score was 0.857, standard deviation 0.036, 95% confidence interval 0.785–0.928. Conclusion Histogram analysis of hepatobiliary phase images of gadoxetic acid-enhanced MRI can provide non-invasive quantitative measurements of hepatic fibrosis. PMID:25460180
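
    The per-ROI statistics named above can be sketched as below. Note that the abstract does not define the correction in "corrected coefficient of variation"; the small-sample correction factor used here is an assumption for illustration only.

```python
import math

# Histogram statistics of ROI signal intensities: an (assumed) corrected
# coefficient of variation, skewness, and kurtosis. Constant ROIs (sd = 0)
# are not handled.
def roi_histogram_stats(pixels):
    n = len(pixels)
    mean = sum(pixels) / n
    var = sum((x - mean) ** 2 for x in pixels) / n
    sd = math.sqrt(var)
    ccv = (1 + 1 / (4 * n)) * sd / mean          # assumed correction factor
    skew = sum((x - mean) ** 3 for x in pixels) / (n * sd ** 3)
    kurt = sum((x - mean) ** 4 for x in pixels) / (n * sd ** 4)
    return ccv, skew, kurt
```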

  12. Development of a Quality Assurance Procedure for Dose Volume Histogram Analysis

    NASA Astrophysics Data System (ADS)

    Davenport, David A.

    The role of the dose-volume histogram (DVH) is rapidly expanding in radiation oncology treatment planning. DVHs are already relied upon to differentiate between two similar plans and evaluate organ-at-risk dosage. Their role will become even more important as progress continues towards implementing biologically based treatment planning systems. Therefore it is imperative that the accuracy of DVHs is evaluated and reappraised after any major software or hardware upgrades affecting a treatment planning system (TPS). The purpose of this work is to create and implement a comprehensive quality assurance procedure evaluating dose volume histograms to ensure their accuracy while satisfying American College of Radiology guidelines. Virtual phantoms of known volumes were created in Pinnacle TPS and exposed to different beam arrangements. Variables including grid size and slice thickness were varied and their effects were analyzed. The resulting DVHs were evaluated by comparison to the commissioned percent depth dose values using a custom Excel spreadsheet. After determining the uncertainty of the DVH based on these variables, multiple second check calculations were performed using MIM Maestro and Matlab software packages. The uncertainties of the DVHs were shown to be less than +/- 3%. The average uncertainty was shown to be less than +/- 1%. The second check procedures resulted in mean percent differences less than 1%, which confirms the accuracy of DVH calculation in Pinnacle and the effectiveness of the quality assurance template. The importance of knowing the limits of accuracy of the DVHs, which are routinely used to assess the quality of clinical treatment plans, cannot be overestimated. The developed comprehensive QA procedure evaluating the accuracy of the DVH statistical analysis will become a part of our clinical arsenal for periodic tests of the treatment planning system. It will also be performed at the time of commissioning and after any major software

  13. Performance analysis of a dual-tree algorithm for computing spatial distance histograms.

    PubMed

    Chen, Shaoping; Tu, Yi-Cheng; Xia, Yuni

    2011-08-01

    Many scientific and engineering fields produce large volumes of spatiotemporal data. The storage, retrieval, and analysis of such data impose great challenges on database systems design. Analysis of scientific spatiotemporal data often involves computing functions of all point-to-point interactions. One such analytics, the Spatial Distance Histogram (SDH), is of vital importance to scientific discovery. Recently, algorithms for efficient SDH processing in large-scale scientific databases have been proposed. These algorithms adopt a recursive tree-traversing strategy to process point-to-point distances in the visited tree nodes in batches, and thus require less time when compared to the brute-force approach where all pairwise distances have to be computed. Despite the promising experimental results, the complexity of such algorithms has not been thoroughly studied. In this paper, we present an analysis of such algorithms based on a geometric modeling approach. The main technique is to transform the analysis of point counts into a problem of quantifying the area of regions where pairwise distances can be processed in batches by the algorithm. From the analysis, we conclude that the number of pairwise distances that are left to be processed decreases exponentially with more levels of the tree visited. This leads to the proof of a time complexity lower than the quadratic time needed for a brute-force algorithm and builds the foundation for a constant-time approximate algorithm. Our model is also general in that it works for a wide range of point spatial distributions, histogram types, and space-partitioning options in building the tree.

  14. Seismic remote sensing image segmentation based on spectral histogram and dynamic region merging

    NASA Astrophysics Data System (ADS)

    Wang, Peng; Sun, Genyun; Wang, Zhenjie

    2015-12-01

    Image segmentation is the foundation of seismic information extraction from high-resolution remote sensing images, but the complexity of seismic images poses great challenges to segmentation. Compared with traditional pixel-level approaches, region-level approaches are better suited to dealing with this complexity. This paper addresses the seismic image segmentation problem in a region-merging style. Starting from many over-segmented regions, segmentation proceeds by iteratively merging neighboring regions. In the proposed algorithm, the merging criterion and the merging order are the two essential issues. An effective merging criterion largely depends on the region feature and the neighbor homogeneity measure. A region's spectral histogram represents its global feature and enhances the discriminability of neighboring regions, so we use it in the merging criterion. Under a given merging criterion, better performance is obtained if the most similar regions are always merged first, which can be cast as a least-cost problem. Rather than predefining an order queue, we solve the order problem with a dynamic scheme. The proposed approach contains three main parts. First, starting from the over-segmented regions, spectral histograms are constructed to represent each region. Second, the merging criterion is built from a homogeneity measure that combines distance and shape. Finally, neighboring regions are merged dynamically following dynamic programming (DP) and a breadth-first strategy. Experiments were conducted on earthquake images, including collapsed buildings and secondary seismic geological disasters. The experimental results show that the proposed method segments seismic images more accurately.

  15. Size distribution of linear and helical polymers in actin solution analyzed by photon counting histogram.

    PubMed

    Terada, Naofumi; Shimozawa, Togo; Ishiwata, Shin'ichi; Funatsu, Takashi

    2007-03-15

    Actin is a ubiquitous protein that is a major component of the cytoskeleton, playing an important role in muscle contraction and cell motility. At steady state, actin monomers and filaments (F-actin) coexist, and actin subunits continuously attach and detach at the filament ends. However, the size distribution of actin oligomers in F-actin solution has never been clarified. In this study, we investigated the size distribution of actin oligomers using photon-counting histograms. For this purpose, actin was labeled with a fluorescent dye, and the emitted photons were detected by confocal optics (the detection volume was of femtoliter (fL) order). Photon-counting histograms were analyzed to obtain the number distribution of actin oligomers in the detection area from their brightness, assuming that the brightness of an oligomer was proportional to the number of protomers. We found that the major populations at physiological ionic strength were 1-5mers. For data analysis, we successfully applied the theory of linear and helical aggregations of macromolecules. The model postulates three states of actin, i.e., monomers, linear polymers, and helical polymers. Here we obtained three parameters: the equilibrium constants for polymerization of linear polymers, K_l = (5.2 ± 1.1) × 10^6 M^-1, and helical polymers, K_h = (1.6 ± 0.5) × 10^7 M^-1; and the ratio of helical to linear trimers, γ = (3.6 ± 2.3) × 10^-2. The excess free energy of transforming a linear trimer to a helical trimer, which is assumed to be a nucleus for helical polymers, was calculated to be 2.0 kcal/mol. These analyses demonstrate that the oligomeric phase at steady state is predominantly composed of linear 1-5mers, and the transition from linear to helical polymers occurs at the level of 5-7mers. PMID:17172301
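    The reported 2.0 kcal/mol follows directly from the trimer ratio γ via ΔG = -RT ln γ. A quick consistency check, assuming T ≈ 298 K (the abstract does not state the temperature):

```python
import math

R = 1.987e-3   # gas constant, kcal/(mol*K)
T = 298.0      # assumed absolute temperature, K (not stated in the abstract)
gamma = 3.6e-2 # ratio of helical to linear trimers, from the abstract

# A ratio gamma of helical to linear trimers at equilibrium implies an
# excess free energy of delta_G = -R*T*ln(gamma) for the transition.
delta_G = -R * T * math.log(gamma)
print(round(delta_G, 1))  # 2.0 (kcal/mol), matching the reported value
```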

  16. Fast analysis of molecular dynamics trajectories with graphics processing units—Radial distribution function histogramming

    NASA Astrophysics Data System (ADS)

    Levine, Benjamin G.; Stone, John E.; Kohlmeyer, Axel

    2011-05-01

    The calculation of radial distribution functions (RDFs) from molecular dynamics trajectory data is a common and computationally expensive analysis task. The rate limiting step in the calculation of the RDF is building a histogram of the distance between atom pairs in each trajectory frame. Here we present an implementation of this histogramming scheme for multiple graphics processing units (GPUs). The algorithm features a tiling scheme to maximize the reuse of data at the fastest levels of the GPU's memory hierarchy and dynamic load balancing to allow high performance on heterogeneous configurations of GPUs. Several versions of the RDF algorithm are presented, utilizing the specific hardware features found on different generations of GPUs. We take advantage of larger shared memory and atomic memory operations available on state-of-the-art GPUs to accelerate the code significantly. The use of atomic memory operations allows the fast, limited-capacity on-chip memory to be used much more efficiently, resulting in a fivefold increase in performance compared to the version of the algorithm without atomic operations. The ultimate version of the algorithm running in parallel on four NVIDIA GeForce GTX 480 (Fermi) GPUs was found to be 92 times faster than a multithreaded implementation running on an Intel Xeon 5550 CPU. On this multi-GPU hardware, the RDF between two selections of 1,000,000 atoms each can be calculated in 26.9 s per frame. The multi-GPU RDF algorithms described here are implemented in VMD, a widely used and freely available software package for molecular dynamics visualization and analysis.
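    The rate-limiting histogramming step can be illustrated with a serial brute-force reference; the paper's contribution is the GPU-tiled, atomic-operation version, so this sketch only shows what is being computed, not how it is accelerated.

```python
import math

def pair_distance_histogram(coords_a, coords_b, r_max, n_bins):
    """Brute-force pair-distance histogram, the rate-limiting step of an
    RDF calculation (a serial CPU reference, not the GPU-tiled version)."""
    hist = [0] * n_bins
    bin_width = r_max / n_bins
    for (xa, ya, za) in coords_a:
        for (xb, yb, zb) in coords_b:
            r = math.sqrt((xa - xb) ** 2 + (ya - yb) ** 2 + (za - zb) ** 2)
            if r < r_max:
                hist[int(r / bin_width)] += 1
    return hist

# One atom at the origin against two atoms at distances 1.0 and 2.0:
h = pair_distance_histogram([(0.0, 0.0, 0.0)],
                            [(1.0, 0.0, 0.0), (0.0, 2.0, 0.0)],
                            r_max=4.0, n_bins=4)
print(h)  # [0, 1, 1, 0]
```

The normalized RDF is then obtained by dividing each bin count by the ideal-gas expectation for that shell; on a GPU, each thread block processes a tile of atom pairs and accumulates into shared-memory histograms with atomic adds.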

  17. Gandhigram: fostering equality through development.

    PubMed

    Devi, R K

    1991-12-01

    A noticeable trend towards 1-child families reveals the success of Gandhigram, an integrated rural development program in Tamil Nadu, India. Founded in 1947 by T.S. Soundram, Gandhigram has adhered to Gandhian principles of truth, nonviolence, castlessness, and equality between the sexes. The program has combined health and family planning with social welfare, education, and economic development. From the outset, Gandhigram has sought community participation, including the involvement of women. Girls have been encouraged to attend school up to at least the 10th level, and employment opportunities for women have been increased. Women's increased economic independence and level of education have influenced their decision to delay marriage by about 2 1/2 years, to choose their own partners, and to decide on the number of children they want. And increasingly, women are opting to limit family size to 2 -- and sometimes 1 -- child. Women are choosing to undergo tubectomies at a younger age, partly because of the availability of recanalization surgery, which has allowed mothers who have lost a child to conceive again. Unlike typical government family planning programs, which usually provide only contraception to meet the objective of a small family norm, Gandhigram also offers infertility services. Not all of Gandhigram's efforts have resulted in success. For example, a plan to develop a health insurance program did not succeed. However, Gandhigram's 44 years of experience have revealed the necessary elements for success. These elements include community participation, the participation of women through educational and employment programs, and easy access to services. PMID:12317116

  18. Dynamic Contrast-enhanced MR Imaging in Renal Cell Carcinoma: Reproducibility of Histogram Analysis on Pharmacokinetic Parameters.

    PubMed

    Wang, Hai-Yi; Su, Zi-Hua; Xu, Xiao; Sun, Zhi-Peng; Duan, Fei-Xue; Song, Yuan-Yuan; Li, Lu; Wang, Ying-Wei; Ma, Xin; Guo, Ai-Tao; Ma, Lin; Ye, Hui-Yi

    2016-01-01

    Pharmacokinetic parameters derived from dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) have been increasingly used to evaluate the permeability of tumor vessels. Histogram metrics are a promising quantitative MR imaging method that has recently been introduced into the analysis of DCE-MRI pharmacokinetic parameters in oncology because of tumor heterogeneity. In this study, 21 patients with renal cell carcinoma (RCC) underwent paired DCE-MRI studies on a 3.0 T MR system. The extended Tofts model and a population-based arterial input function were used to calculate kinetic parameters of RCC tumors. The Mean value and histogram metrics (Mode, Skewness and Kurtosis) of each pharmacokinetic parameter were generated automatically using ImageJ software. Intra- and inter-observer reproducibility and scan-rescan reproducibility were evaluated using intra-class correlation coefficients (ICCs) and the coefficient of variation (CoV). Our results demonstrated that the histogram method (Mode, Skewness and Kurtosis) was not superior to the conventional Mean value method in evaluating the reproducibility of DCE-MRI pharmacokinetic parameters (K^trans and V_e) in renal cell carcinoma, especially for Skewness and Kurtosis, which showed lower intra-observer, inter-observer and scan-rescan reproducibility than the Mean value. Our findings suggest that additional studies are necessary before histogram metrics are widely incorporated into quantitative analysis of DCE-MRI pharmacokinetic parameters. PMID:27380733

  19. Dynamic Contrast-enhanced MR Imaging in Renal Cell Carcinoma: Reproducibility of Histogram Analysis on Pharmacokinetic Parameters

    PubMed Central

    Wang, Hai-yi; Su, Zi-hua; Xu, Xiao; Sun, Zhi-peng; Duan, Fei-xue; Song, Yuan-yuan; Li, Lu; Wang, Ying-wei; Ma, Xin; Guo, Ai-tao; Ma, Lin; Ye, Hui-yi

    2016-01-01

    Pharmacokinetic parameters derived from dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) have been increasingly used to evaluate the permeability of tumor vessels. Histogram metrics are a promising quantitative MR imaging method that has recently been introduced into the analysis of DCE-MRI pharmacokinetic parameters in oncology because of tumor heterogeneity. In this study, 21 patients with renal cell carcinoma (RCC) underwent paired DCE-MRI studies on a 3.0 T MR system. The extended Tofts model and a population-based arterial input function were used to calculate kinetic parameters of RCC tumors. The Mean value and histogram metrics (Mode, Skewness and Kurtosis) of each pharmacokinetic parameter were generated automatically using ImageJ software. Intra- and inter-observer reproducibility and scan-rescan reproducibility were evaluated using intra-class correlation coefficients (ICCs) and the coefficient of variation (CoV). Our results demonstrated that the histogram method (Mode, Skewness and Kurtosis) was not superior to the conventional Mean value method in evaluating the reproducibility of DCE-MRI pharmacokinetic parameters (K^trans and V_e) in renal cell carcinoma, especially for Skewness and Kurtosis, which showed lower intra-observer, inter-observer and scan-rescan reproducibility than the Mean value. Our findings suggest that additional studies are necessary before histogram metrics are widely incorporated into quantitative analysis of DCE-MRI pharmacokinetic parameters. PMID:27380733

  20. Students' Misconceptions in Interpreting Center and Variability of Data Represented via Histograms and Stem-and-Leaf Plots

    ERIC Educational Resources Information Center

    Cooper, Linda L.; Shore, Felice S.

    2008-01-01

    This paper identifies and discusses misconceptions that students have in making judgments of center and variability when data are presented graphically. An assessment addressing interpreting center and variability in histograms and stem-and-leaf plots was administered to, and follow-up interviews were conducted with, undergraduates enrolled in…

  1. Studying the time histogram of a terrestrial electron beam detected from the opposite hemisphere of its associated TGF

    NASA Astrophysics Data System (ADS)

    Sarria, D.; Blelly, P.-L.; Briggs, M. S.; Forme, F.

    2016-05-01

    Terrestrial gamma-ray flashes are bursts of X/gamma photons, correlated to thunderstorms. By interacting with the atmosphere, the photons produce a substantial number of electrons and positrons. Some of these reach a sufficiently high altitude that their interactions with the atmosphere become negligible, and they are then guided by geomagnetic field lines, forming a Terrestrial Electron Beam. On 9 December 2009, the Gamma-Ray Burst Monitor (GBM) instrument on board the Fermi Space Telescope made a particularly interesting measurement of such an event. To study this type of event in detail, we perform Monte-Carlo simulations and focus on the resulting time histograms. In agreement with previous work, we show that the histogram measured by Fermi GBM is reproducible from a simulation. We then show that the time histogram resulting from this simulation is only weakly dependent on the production altitude, duration, beaming angle, and spectral shape of the associated terrestrial gamma-ray flash. Finally, we show that the time histogram can be decomposed into three populations of leptons, coming from the opposite hemisphere, and mirroring back to the satellite with or without interacting with the atmosphere, and that these populations can be clearly distinguished by their pitch angles.

  2. Automated geomorphometric classification of landforms in Transdanubian Region (Pannonian Basin) based on local slope histograms

    NASA Astrophysics Data System (ADS)

    Székely, Balázs; Koma, Zsófia; Csorba, Kristóf; Ferenc Morovics, József

    2014-05-01

    The Transdanubian Region is a typically hilly, geologically manifold area of the Pannonian Basin. It is composed primarily of Permo-Mesozoic carbonates and siliciclastic sediments; however, Pannonian sedimentary units and young volcanic forms are also characteristic, such as those in the Bakony-Balaton Highland Volcanic Field. The geological diversity is reflected in the geomorphological setting: besides the classic eroding volcanic edifices and carbonate plateaus, medium-relief, gently hilly, slowly eroding landforms are also frequent in the geomorphic mosaic of the area. Geomorphometric techniques are suitable to analyse and separate the various geomorphic units mosaicked together and, in some cases, affected by (sub-)recent tectonic geomorphic processes. In our project we applied automated classification of local slope angle histograms derived from a 10-meter nominal resolution digital terrain model (DTM). Slope angle histograms within rectangular moving windows of various sizes were calculated in numerous experiments. The histograms then served as a multichannel input for a k-means classification to achieve a geologically-geomorphologically sound categorization of the area. The experiments show good results in separating the basic landforms; defined landscape boundaries can be reconstructed with high accuracy for larger window sizes (e.g. 5 km) and low numbers of categories. If the window size is smaller and the number of classes is higher, the tectonic geomorphic features are recognized more prominently, though often at the price of clear separation boundaries: in these cases the horizontal change in the composition of the various clusters matches the boundaries of the geological units. Volcanic forms are typically also assigned to distinct classes, although the flat plateaus of some volcanic edifices fall into another category recognized in the experiments. In summary we can conclude that the area is suitable for such analyses, many
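    The per-window feature described above, a histogram of local slope angles, can be sketched roughly as follows. The central-difference slope estimate and the bin count are assumptions for illustration, not the study's exact settings.

```python
import math

def slope_histogram(dem, cell_size, n_bins=9, max_deg=45.0):
    """Histogram of local slope angles for one moving-window DEM patch,
    using central differences (a simplified sketch of the per-window
    feature vector that would feed the k-means classifier)."""
    rows, cols = len(dem), len(dem[0])
    hist = [0] * n_bins
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            dzdx = (dem[i][j + 1] - dem[i][j - 1]) / (2 * cell_size)
            dzdy = (dem[i + 1][j] - dem[i - 1][j]) / (2 * cell_size)
            slope = math.degrees(math.atan(math.hypot(dzdx, dzdy)))
            hist[min(int(slope / max_deg * n_bins), n_bins - 1)] += 1
    return hist

# A perfectly flat 4x4 patch puts all interior cells in the lowest-slope bin:
flat = [[0.0] * 4 for _ in range(4)]
print(slope_histogram(flat, cell_size=10.0))  # [4, 0, 0, 0, 0, 0, 0, 0, 0]
```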

  3. Normal-reciprocal error models for quantitative ERT in permafrost environments: bin analysis versus histogram analysis

    NASA Astrophysics Data System (ADS)

    Verleysdonk, Sarah; Flores-Orozco, Adrian; Krautblatter, Michael; Kemna, Andreas

    2010-05-01

    Electrical resistivity tomography (ERT) has been used for the monitoring of permafrost-affected rock walls for some years now. To further enhance the interpretation of ERT measurements a deeper insight into error sources and the influence of error model parameters on the imaging results is necessary. Here, we present the effect of different statistical schemes for the determination of error parameters from the discrepancies between normal and reciprocal measurements - bin analysis and histogram analysis - using a smoothness-constrained inversion code (CRTomo) with an incorporated appropriate error model. The study site is located in galleries adjacent to the Zugspitze North Face (2800 m a.s.l.) at the border between Austria and Germany. A 20 m * 40 m rock permafrost body and its surroundings have been monitored along permanently installed transects - with electrode spacings of 1.5 m and 4.6 m - from 2007 to 2009. For data acquisition, a conventional Wenner survey was conducted as this array has proven to be the most robust array in frozen rock walls. Normal and reciprocal data were collected directly one after another to ensure identical conditions. The ERT inversion results depend strongly on the chosen parameters of the employed error model, i.e., the absolute resistance error and the relative resistance error. These parameters were derived (1) for large normal/reciprocal data sets by means of bin analyses and (2) for small normal/reciprocal data sets by means of histogram analyses. Error parameters were calculated independently for each data set of a monthly monitoring sequence to avoid the creation of artefacts (over-fitting of the data) or unnecessary loss of contrast (under-fitting of the data) in the images. The inversion results are assessed with respect to (1) raw data quality as described by the error model parameters, (2) validation via available (rock) temperature data and (3) the interpretation of the images from a geophysical as well as a

  4. Nonlinear histogram binning for quantitative analysis of lung tissue fibrosis in high-resolution CT data

    NASA Astrophysics Data System (ADS)

    Zavaletta, Vanessa A.; Bartholmai, Brian J.; Robb, Richard A.

    2007-03-01

    Diffuse lung diseases, such as idiopathic pulmonary fibrosis (IPF), can be characterized and quantified by analysis of volumetric high resolution CT scans of the lungs. These data sets typically have dimensions of 512 x 512 x 400. It is too subjective and labor intensive for a radiologist to analyze each slice and quantify regional abnormalities manually. Thus, computer aided techniques are necessary, particularly texture analysis techniques which classify various lung tissue types. Second and higher order statistics, which relate the spatial variation of the intensity values, are good discriminatory features for various textures. The intensity values in lung CT scans range over [-1024, 1024]. Calculating second order statistics on this full range is too computationally intensive, so the data are typically binned into 16 or 32 gray levels. There are more effective ways of binning the gray level range to improve classification. An optimal and very efficient way to nonlinearly bin the histogram is to use a dynamic programming algorithm. The objective of this paper is to show that nonlinear binning using dynamic programming is computationally efficient and improves the discriminatory power of the second and higher order statistics for more accurate quantification of diffuse lung disease.
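    A minimal sketch of the dynamic-programming idea: partition a 1-D value range into k contiguous bins by minimizing total within-bin squared error. The cost function is an assumption for illustration; the paper's exact objective may differ.

```python
def optimal_bins(values, k):
    """Dynamic-programming partition of sorted 1-D values into k contiguous
    bins minimizing total within-bin squared error (a nonlinear-binning
    sketch; the paper's exact cost function may differ)."""
    xs = sorted(values)
    n = len(xs)
    # Prefix sums give O(1) within-bin sum-of-squared-errors queries.
    p1 = [0.0] * (n + 1)
    p2 = [0.0] * (n + 1)
    for i, x in enumerate(xs):
        p1[i + 1] = p1[i] + x
        p2[i + 1] = p2[i] + x * x
    def sse(i, j):  # cost of placing xs[i:j] in one bin
        s = p1[j] - p1[i]
        return (p2[j] - p2[i]) - s * s / (j - i)
    INF = float("inf")
    cost = [[INF] * (k + 1) for _ in range(n + 1)]
    cut = [[0] * (k + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for j in range(1, n + 1):
        for b in range(1, k + 1):
            for i in range(b - 1, j):
                c = cost[i][b - 1] + sse(i, j)
                if c < cost[j][b]:
                    cost[j][b], cut[j][b] = c, i
    # Walk the cut table backwards to recover the bin boundaries.
    bounds, j = [], n
    for b in range(k, 0, -1):
        i = cut[j][b]
        bounds.append((xs[i], xs[j - 1]))
        j = i
    return bounds[::-1]

# Two well-separated intensity clusters are recovered as two bins:
print(optimal_bins([1, 2, 3, 100, 101, 102], 2))  # [(1, 3), (100, 102)]
```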

  5. A generic shape/texture descriptor over multiscale edge field: 2-D walking ant histogram.

    PubMed

    Kiranyaz, Serkan; Ferreira, Miguel; Gabbouj, Moncef

    2008-03-01

    A novel shape descriptor, which can be extracted from the major object edges automatically and used for the multimedia content-based retrieval in multimedia databases, is presented. By adopting a multiscale approach over the edge field where the scale represents the amount of simplification, the most relevant edge segments, referred to as subsegments, which eventually represent the major object boundaries, are extracted from a scale-map. Similar to the process of a walking ant with a limited line of sight over the boundary of a particular object, we traverse through each subsegment and describe a certain line of sight, whether it is a continuous branch or a corner, using individual 2-D histograms. Furthermore, the proposed method can also be tuned to be an efficient texture descriptor, which achieves a superior performance especially for directional textures. Finally, integrating the whole process as feature extraction module into MUVIS framework allows us to test the mutual performance of the proposed shape descriptor in the context of multimedia indexing and retrieval. PMID:18270126

  6. 3D/2D image registration using weighted histogram of gradient directions

    NASA Astrophysics Data System (ADS)

    Ghafurian, Soheil; Hacihaliloglu, Ilker; Metaxas, Dimitris N.; Tan, Virak; Li, Kang

    2015-03-01

    Three dimensional (3D) to two dimensional (2D) image registration is crucial in many medical applications such as image-guided evaluation of musculoskeletal disorders. One of the key problems is to estimate the 3D CT-reconstructed bone model positions (translation and rotation) which maximize the similarity between the digitally reconstructed radiographs (DRRs) and the 2D fluoroscopic images using a registration method. This problem is computationally intensive due to a large search space and the complicated DRR generation process. Also, finding a similarity measure which converges to the global optimum instead of local optima adds to the challenge. To circumvent these issues, most existing registration methods need a manual initialization, which requires user interaction and is prone to human error. In this paper, we introduce a novel feature-based registration method using the weighted histogram of gradient directions of images. This method simplifies the computation by searching the parameter space (rotation and translation) sequentially rather than simultaneously. In our numeric simulation experiments, the proposed registration algorithm was able to achieve sub-millimeter and sub-degree accuracies. Moreover, our method is robust to the initial guess: it can tolerate up to ±90° rotation offset from the global optimal solution, which minimizes the need for human interaction to initialize the algorithm.
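    The core descriptor, a histogram of gradient directions weighted by gradient magnitude, can be sketched as follows. The paper's exact weighting and binning are not specified here, so treat this as an illustration.

```python
import math

def gradient_direction_histogram(img, n_bins=8):
    """Histogram of image gradient directions weighted by gradient
    magnitude (a minimal sketch of the registration feature; the
    paper's exact weighting scheme may differ)."""
    hist = [0.0] * n_bins
    rows, cols = len(img), len(img[0])
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            gx = img[i][j + 1] - img[i][j - 1]
            gy = img[i + 1][j] - img[i - 1][j]
            mag = math.hypot(gx, gy)
            if mag == 0:
                continue
            theta = math.atan2(gy, gx) % (2 * math.pi)
            hist[int(theta / (2 * math.pi) * n_bins) % n_bins] += mag
    return hist

# A vertical step edge concentrates all weight in the horizontal-gradient bin:
img = [[0, 0, 4, 4]] * 4
h = gradient_direction_histogram(img)
print(h.index(max(h)))  # 0
```

Comparing such histograms between a DRR and a fluoroscopic image lets rotation be estimated separately from translation, which is what makes the sequential parameter search possible.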

  7. Lung cancer prediction using neural network ensemble with histogram of oriented gradient genomic features.

    PubMed

    Adetiba, Emmanuel; Olugbara, Oludayo O

    2015-01-01

    This paper reports an experimental comparison of artificial neural network (ANN) and support vector machine (SVM) ensembles and their "nonensemble" variants for lung cancer prediction. These machine learning classifiers were trained to predict lung cancer using samples of patient nucleotides with mutations in the epidermal growth factor receptor, Kirsten rat sarcoma viral oncogene, and tumor suppressor p53 genomes collected as biomarkers from the IGDB.NSCLC corpus. The Voss DNA encoding was used to map the nucleotide sequences of mutated and normal genomes to obtain the equivalent numerical genomic sequences for training the selected classifiers. The histogram of oriented gradient (HOG) and local binary pattern (LBP) state-of-the-art feature extraction schemes were applied to extract representative genomic features from the encoded sequences of nucleotides. The ANN ensemble and HOG best fit the training dataset of this study with an accuracy of 95.90% and mean square error of 0.0159. The result of the ANN ensemble and HOG genomic features is promising for automated screening and early detection of lung cancer. This will hopefully assist pathologists in administering targeted molecular therapy and offering counsel to early stage lung cancer patients and persons in at risk populations. PMID:25802891
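    The Voss encoding step mentioned above maps a nucleotide sequence to four binary indicator sequences, one per base; a minimal sketch:

```python
def voss_encode(seq):
    """Voss representation: four binary indicator sequences, one per base,
    giving the numerical genomic sequence the classifiers are trained on."""
    return {base: [1 if ch == base else 0 for ch in seq] for base in "ACGT"}

enc = voss_encode("GATC")
print(enc["A"])  # [0, 1, 0, 0]
```

Feature extractors such as HOG and LBP can then be applied to these numerical sequences in place of image intensities.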

  8. Assessing the hydrologic alteration of the Yangtze River using the histogram matching approach

    NASA Astrophysics Data System (ADS)

    Huang, F.; Zhang, N.; Guo, L. D.; Xia, Z. Q.

    2016-08-01

    Hydrologic changes of the Yangtze River, an important river with abundant water resources in China, were investigated using the Histogram Matching Approach. Daily streamflow data spanning the time interval from 1955 to 2013 was collected from Yichang and Datong stations, which monitor the hydrologic processes of the upper and lower reach of the Yangtze River, respectively. The Gezhouba Dam, the first dam constructed at the main stream of the Yangtze River, started operations in 1981. 1981 was used to differentiate the pre-dam (1955-1980) and post-dam (1981-2013) hydrologic regimes. The hydrologic regime was quantified by the Indicators of Hydrologic Alteration. The overall alteration degree of the upper Yangtze River was 31% and the alteration degree of every hydrologic indicator ranged from 10% to 81%. Only 1, 5 and 26 hydrologic indicators were altered at high, moderate and low degrees, respectively. The overall alteration degree of the lower Yangtze River was 30%, and the alteration degree of every hydrologic indicator ranged from 8% to 49%. No high alteration degree was detected at the Datong station. Ten hydrologic indicators were altered at moderate degrees and 22 hydrologic indicators were altered at low degrees. Significant increases could be observed for the low-flow relevant indicators, including the monthly flow from January-March, the annual minimum 1, 3, 7, 30 and 90-day flows, and the base flow index.

  9. Computing Spatial Distance Histograms for Large Scientific Datasets On-the-Fly

    PubMed Central

    Kumar, Anand; Grupcev, Vladimir; Yuan, Yongke; Huang, Jin; Shen, Gang

    2014-01-01

    This paper focuses on an important query in scientific simulation data analysis: the Spatial Distance Histogram (SDH). The computation time of an SDH query using the brute-force method is quadratic. Often, such queries are executed continuously over certain time periods, increasing the computation time. We propose a highly efficient approximate algorithm to compute SDHs over consecutive time periods with provable error bounds. The key idea of our algorithm is to derive the statistical distribution of distances from the spatial and temporal characteristics of particles. Upon organizing the data into a Quad-tree based structure, the spatiotemporal characteristics of particles in each node of the tree are acquired to determine the particles' spatial distribution as well as their temporal locality in consecutive time periods. We report our efforts in implementing and optimizing the above algorithm on Graphics Processing Units (GPUs) as a means to further improve the efficiency. The accuracy and efficiency of the proposed algorithm are backed by mathematical analysis and results of extensive experiments using data generated from real simulation studies. PMID:25264418

  10. Shot-Noise Limited Single-Molecule FRET Histograms: Comparison between Theory and Experiments†

    PubMed Central

    Nir, Eyal; Michalet, Xavier; Hamadani, Kambiz M.; Laurence, Ted A.; Neuhauser, Daniel; Kovchegov, Yevgeniy; Weiss, Shimon

    2011-01-01

    We describe a simple approach and present a straightforward numerical algorithm to compute the best fit shot-noise limited proximity ratio histogram (PRH) in single-molecule fluorescence resonant energy transfer diffusion experiments. The key ingredient is the use of the experimental burst size distribution, as obtained after burst search through the photon data streams. We show how the use of an alternated laser excitation scheme and a correspondingly optimized burst search algorithm eliminates several potential artifacts affecting the calculation of the best fit shot-noise limited PRH. This algorithm is tested extensively on simulations and simple experimental systems. We find that dsDNA data exhibit a wider PRH than expected from shot noise only and hypothetically account for it by assuming a small Gaussian distribution of distances with an average standard deviation of 1.6 Å. Finally, we briefly mention the results of a future publication and illustrate them with a simple two-state model system (DNA hairpin), for which the kinetic transition rates between the open and closed conformations are extracted. PMID:17078646
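    The shot-noise limited PRH can be computed from the burst size distribution by treating each burst's acceptor count as binomially distributed; a minimal sketch (the bin count and burst sizes below are illustrative, not the paper's data):

```python
from math import comb

def shot_noise_prh(burst_sizes, E, n_bins=10):
    """Shot-noise limited proximity-ratio histogram: for a burst of n
    photons with true proximity ratio E, the acceptor count is
    Binomial(n, E); summing over the experimental burst-size
    distribution gives the expected PRH."""
    hist = [0.0] * n_bins
    for n in burst_sizes:
        for a in range(n + 1):  # possible acceptor photon counts
            p = comb(n, a) * E ** a * (1 - E) ** (n - a)
            pr = a / n  # proximity ratio of this burst
            hist[min(int(pr * n_bins), n_bins - 1)] += p
    return hist

h = shot_noise_prh([20, 30, 40], E=0.5)
print(h.index(max(h)))  # 5, i.e. the histogram peaks at the 0.5 bin
```

A measured PRH wider than this prediction, as reported for dsDNA, indicates heterogeneity beyond shot noise, e.g. a distribution of distances.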

  11. Application of Histogram Analysis in Radiation Therapy (HART) in Intensity Modulation Radiation Therapy (IMRT) Treatments

    NASA Astrophysics Data System (ADS)

    Pyakuryal, Anil

    2009-03-01

    A carcinoma is a malignant cancer that emerges from epithelial cells in structures throughout the body. It invades critical organs and can metastasize or spread to lymph nodes. IMRT is an advanced mode of radiation therapy treatment for cancer; it delivers more conformal doses to malignant tumors while sparing critical organs by modulating the intensity of the radiation beam. The automated software HART (S. Jang et al., 2008, Med Phys 35, p.2812) was used for efficient analysis of dose volume histograms (DVH) for multiple targets and critical organs in four IMRT treatment plans for each patient. IMRT data for ten head and neck cancer patients were exported as AAPM/RTOG format files from a commercial treatment planning system at Northwestern Memorial Hospital (NMH). HART-extracted DVH statistics were used to evaluate plan indices and to analyze dose tolerance of critical structures at the prescription dose (PD) for each patient. Mean plan indices (n=10) were found to be in good agreement with published results for Linac-based plans. The least irradiated volume at tolerance dose (TD50) was observed for the brainstem and the highest volume for the larynx in SIB treatment techniques. Thus HART, an open source platform, has extensive clinical implications in IMRT treatments.
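    A cumulative dose-volume histogram of the kind HART analyzes reports, for each dose level, the fraction of a structure's volume receiving at least that dose; a minimal sketch (the voxel doses and bin count below are illustrative):

```python
def cumulative_dvh(doses, n_bins=5, d_max=None):
    """Cumulative dose-volume histogram: the fraction of a structure's
    volume (voxels) receiving at least each dose level."""
    d_max = d_max if d_max is not None else max(doses)
    levels = [d_max * k / n_bins for k in range(n_bins + 1)]
    total = len(doses)
    return [(d, sum(1 for x in doses if x >= d) / total) for d in levels]

# Uniform 60 Gy to half the voxels, 30 Gy to the rest:
for dose, vol in cumulative_dvh([60.0] * 4 + [30.0] * 4, n_bins=4, d_max=60.0):
    print(dose, vol)
```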

  12. People re-identification in camera networks based on probabilistic color histograms

    NASA Astrophysics Data System (ADS)

    D'Angelo, Angela; Dugelay, Jean-Luc

    2011-01-01

    People tracking has to face many issues in video surveillance scenarios. One of the most challenging aspects is re-identifying people across different cameras. Humans, indeed, change appearance according to pose, clothes and illumination conditions, and thus defining features that are able to robustly describe people moving in a camera network is not a trivial task. While color is widely exploited in the distinction and recognition of objects, most of the color descriptors proposed so far are not robust in complex applications such as video surveillance scenarios. A new color-based feature is introduced in this paper to describe the color appearance of the subjects. For each target a probabilistic color histogram (PCH) is built by using a fuzzy K-Nearest Neighbors (KNN) classifier trained on an ad-hoc dataset and is used to match two corresponding appearances of the same person in different cameras of the network. The experimental results show that the defined descriptor is effective at discriminating and re-identifying people across two different video cameras regardless of the viewpoint change between the two views and outperforms state-of-the-art appearance-based techniques.

  13. Lung Cancer Prediction Using Neural Network Ensemble with Histogram of Oriented Gradient Genomic Features

    PubMed Central

    Adetiba, Emmanuel; Olugbara, Oludayo O.

    2015-01-01

    This paper reports an experimental comparison of artificial neural network (ANN) and support vector machine (SVM) ensembles and their “nonensemble” variants for lung cancer prediction. These machine learning classifiers were trained to predict lung cancer using samples of patient nucleotides with mutations in the epidermal growth factor receptor, Kirsten rat sarcoma viral oncogene, and tumor suppressor p53 genomes collected as biomarkers from the IGDB.NSCLC corpus. The Voss DNA encoding was used to map the nucleotide sequences of mutated and normal genomes to obtain the equivalent numerical genomic sequences for training the selected classifiers. The histogram of oriented gradient (HOG) and local binary pattern (LBP) state-of-the-art feature extraction schemes were applied to extract representative genomic features from the encoded sequences of nucleotides. The ANN ensemble and HOG best fit the training dataset of this study with an accuracy of 95.90% and mean square error of 0.0159. The result of the ANN ensemble and HOG genomic features is promising for automated screening and early detection of lung cancer. This will hopefully assist pathologists in administering targeted molecular therapy and offering counsel to early stage lung cancer patients and persons in at risk populations. PMID:25802891

  14. Gliomas: Application of Cumulative Histogram Analysis of Normalized Cerebral Blood Volume on 3 T MRI to Tumor Grading

    PubMed Central

    Kim, Hyungjin; Choi, Seung Hong; Kim, Ji-Hoon; Ryoo, Inseon; Kim, Soo Chin; Yeom, Jeong A.; Shin, Hwaseon; Jung, Seung Chai; Lee, A. Leum; Yun, Tae Jin; Park, Chul-Kee; Sohn, Chul-Ho; Park, Sung-Hye

    2013-01-01

    Background: Glioma grading assumes significant importance in that low- and high-grade gliomas display different prognoses and are treated with dissimilar therapeutic strategies. The objective of our study was to retrospectively assess the usefulness of a cumulative normalized cerebral blood volume (nCBV) histogram for glioma grading based on 3 T MRI. Methods: From February 2010 to April 2012, 63 patients with astrocytic tumors underwent 3 T MRI with dynamic susceptibility contrast perfusion-weighted imaging. Regions of interest containing the entire tumor volume were drawn on every section of the co-registered relative CBV (rCBV) maps and T2-weighted images. The percentile values from the cumulative nCBV histograms and the other histogram parameters were correlated with tumor grades. Cochran’s Q test and the McNemar test were used to compare the diagnostic accuracies of the histogram parameters after the receiver operating characteristic curve analysis. Using the parameter offering the highest diagnostic accuracy, a validation process was performed with an independent test set of nine patients. Results: The 99th percentile of the cumulative nCBV histogram (nCBV C99), mean and peak height differed significantly between low- and high-grade gliomas (P < 0.001, P = 0.014 and P < 0.001, respectively) and between grade III and IV gliomas (P < 0.001, P = 0.001 and P < 0.001, respectively). The diagnostic accuracy of nCBV C99 was significantly higher than that of the mean nCBV (P = 0.016) in distinguishing high- from low-grade gliomas and was comparable to that of the peak height (P = 1.000). Validation using the two cutoff values of nCBV C99 achieved a diagnostic accuracy of 66.7% (6/9) for the separation of all three glioma grades. Conclusion: Cumulative histogram analysis of nCBV using 3 T MRI can be a useful method for preoperative glioma grading. The nCBV C99 value is helpful in distinguishing high- from low-grade gliomas and grade IV from III gliomas. PMID:23704910
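    The nCBV C99 is the 99th percentile read off the cumulative histogram of the voxelwise nCBV values; a minimal sketch using a nearest-rank estimator (an assumption, since the study's exact estimator is not stated here):

```python
def histogram_percentile(values, q):
    """q-th percentile of a parameter map, as read off its cumulative
    histogram; nearest-rank definition (an assumed estimator)."""
    xs = sorted(values)
    k = max(0, min(len(xs) - 1, int(round(q / 100.0 * len(xs))) - 1))
    return xs[k]

values = list(range(1, 101))  # toy voxelwise nCBV values 1..100
print(histogram_percentile(values, 99))  # 99
```

Unlike the voxelwise maximum, a high percentile such as C99 is insensitive to a few outlier voxels, which is why it can outperform the mean for grading.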

  15. Equal Cell Size and Nonorthogonality in ANCOVA.

    ERIC Educational Resources Information Center

    Llabre, Maria M.; Ware, William B.

    1980-01-01

    Computer programs for analysis of covariance use classical experimental, regression, or hierarchical methods of least squares. In a 3 X 3 factorial experiment with equal cell frequencies, three solutions yielded different sums of squares for main effects although correlation between variables was negligible and cell frequencies were equal.…

  16. Continuing Controversy in Equal Employment Law.

    ERIC Educational Resources Information Center

    Vogt, Carl W.; Robles, Martin J.

    1980-01-01

    Some current controversies in the growing body of equal employment law are examined: wage comparability in the context of sex discrimination; procedural sections of the laws and related litigation; class actions by the Equal Employment Opportunities Commission; the scope of litigation; and the criteria for conferring standing on the plaintiff.…

  17. Equal Opportunity in the Air Force

    ERIC Educational Resources Information Center

    Gatling, Wade S.

    1976-01-01

    Notes that the Affirmative Action Plan of the Air Force is built on the premise that equal opportunity and treatment is everyone's responsibility. Unit Commanders, on-the-job training monitors, recreation centers directors, and others are involved in specific facets of the Plan to insure equality. (Author/AM)

  18. Equal Plate Charges on Series Capacitors?

    ERIC Educational Resources Information Center

    Illman, B. L.; Carlson, G. T.

    1994-01-01

    Provides a line of reasoning in support of the contention that the equal charge proposition is at best an approximation. Shows how the assumption of equal plate charge on capacitors in series contradicts the conservative nature of the electric field. (ZWH)

  19. Vocational Education and Equality of Opportunity.

    ERIC Educational Resources Information Center

    Horowitz, Benjamin; Feinberg, Walter

    1990-01-01

    Examines the concepts of equality of opportunity and equality of educational opportunity and their relationship to vocational education. Traces the history of vocational education. Delineates the distinction between training and education as enumerated in Aristotelian philosophy. Discusses the role vocational education can play in the educative…

  20. 49 CFR 236.792 - Reservoir, equalizing.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 4 2011-10-01 2011-10-01 false Reservoir, equalizing. 236.792 Section 236.792 Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD ADMINISTRATION... Reservoir, equalizing. An air reservoir connected with and adding volume to the top portion of...

  1. 49 CFR 236.792 - Reservoir, equalizing.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 4 2012-10-01 2012-10-01 false Reservoir, equalizing. 236.792 Section 236.792 Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD ADMINISTRATION... Reservoir, equalizing. An air reservoir connected with and adding volume to the top portion of...

  2. 49 CFR 236.792 - Reservoir, equalizing.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 4 2014-10-01 2014-10-01 false Reservoir, equalizing. 236.792 Section 236.792 Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD ADMINISTRATION... Reservoir, equalizing. An air reservoir connected with and adding volume to the top portion of...

  3. 49 CFR 236.792 - Reservoir, equalizing.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 4 2013-10-01 2013-10-01 false Reservoir, equalizing. 236.792 Section 236.792 Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD ADMINISTRATION... Reservoir, equalizing. An air reservoir connected with and adding volume to the top portion of...

  4. Universal and adapted vocabularies for generic visual categorization.

    PubMed

    Perronnin, Florent

    2008-07-01

    Generic Visual Categorization (GVC) is the pattern classification problem which consists in assigning labels to an image based on its semantic content. This is a challenging task as one has to deal with inherent object/scene variations as well as changes in viewpoint, lighting and occlusion. Several state-of-the-art GVC systems use a vocabulary of visual terms to characterize images with a histogram of visual word counts. We propose a novel practical approach to GVC based on a universal vocabulary, which describes the content of all the considered classes of images, and class vocabularies obtained through the adaptation of the universal vocabulary using class-specific data. The main novelty is that an image is characterized by a set of histograms - one per class - where each histogram describes whether the image content is best modeled by the universal vocabulary or the corresponding class vocabulary. This framework is applied to two types of local image features: low-level descriptors such as the popular SIFT and high-level histograms of word co-occurrences in a spatial neighborhood. It is shown experimentally on two challenging datasets (an in-house database of 19 categories and the PASCAL VOC 2006 dataset) that the proposed approach exhibits state-of-the-art performance at a modest computational cost.
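    The vocabulary-based characterization described above can be sketched with a toy bag-of-visual-words routine: each local descriptor is assigned to its nearest visual word and the image is summarized by the normalized word-count histogram. This is a minimal illustration of the histogram step only, not the paper's universal/adapted-vocabulary method; the descriptors, vocabulary, and `bovw_histogram` helper are invented for the example.

```python
import math
import random

def bovw_histogram(descriptors, vocabulary):
    # Assign each descriptor to its nearest visual word (Euclidean distance),
    # count assignments, and normalize to a probability histogram.
    counts = [0] * len(vocabulary)
    for d in descriptors:
        best = min(range(len(vocabulary)),
                   key=lambda k: math.dist(d, vocabulary[k]))
        counts[best] += 1
    total = sum(counts)
    return [c / total for c in counts]

# Toy data: 2-D "descriptors" clustered near the second of three visual words.
random.seed(0)
vocab = [(0.0, 0.0), (5.0, 5.0), (10.0, 0.0)]
desc = [(5.0 + random.gauss(0, 0.1), 5.0 + random.gauss(0, 0.1))
        for _ in range(50)]
h = bovw_histogram(desc, vocab)
```

    In the paper's framework one such histogram would be computed per class vocabulary and the set of histograms used as the image signature.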

  5. Image quality and automatic color equalization

    NASA Astrophysics Data System (ADS)

    Chambah, M.; Rizzi, A.; Saint Jean, C.

    2007-01-01

    In the professional movie field, image quality is mainly judged visually. In fact, experts and technicians judge and determine the quality of the film images during the calibration (post production) process. As a consequence, the quality of a restored movie is also estimated subjectively by experts [26,27]. On the other hand, objective quality metrics do not necessarily correlate well with perceived quality [28]. Moreover, some measures assume that there exists a reference in the form of an "original" to compare to, which prevents their use in the digital restoration field, where often there is no reference to compare to. That is why subjective evaluation is the most used and most efficient approach up to now. But subjective assessment is expensive, time consuming and does not respond, hence, to the economic requirements of the field [29,25]. Thus, reliable automatic methods for visual quality assessment are needed in the field of digital film restoration. Ideally, a quality assessment system would perceive and measure image or video impairments just like a human being. The ACE method, for Automatic Color Equalization [1,2], is an algorithm for unsupervised enhancement of digital images. Like our vision system, ACE is able to adapt to widely varying lighting conditions, and to extract visual information from the environment efficaciously. We present in this paper the use of ACE as the basis of a reference-free image quality metric. ACE output is an estimate of our visual perception of a scene. The assumption, tested in other papers [3,4], is that ACE, by enhancing images toward the way our vision system would perceive them, increases their overall perceived quality. The basic idea proposed in this paper is that ACE output can differ from the input more or less according to the visual quality of the input image. In other words, an image appears good if it is near to the visual appearance we (estimate to) have of it. Conversely, bad-quality images will need "more filtering".

  6. Variance of a potential of mean force obtained using the weighted histogram analysis method.

    PubMed

    Cukier, Robert I

    2013-11-27

    A potential of mean force (PMF) that provides the free energy of a thermally driven system along some chosen reaction coordinate (RC) is a useful descriptor of systems characterized by complex, high dimensional potential energy surfaces. Umbrella sampling window simulations use potential energy restraints to provide more uniform sampling along a RC so that potential energy barriers that would otherwise make equilibrium sampling computationally difficult can be overcome. Combining the results from the different biased window trajectories can be accomplished using the Weighted Histogram Analysis Method (WHAM). Here, we provide an analysis of the variance of a PMF along the reaction coordinate. We assume that the potential restraints used for each window lead to Gaussian distributions for the window reaction coordinate densities and that the data sampling in each window is from an equilibrium ensemble sampled so that successive points are statistically independent. Also, we assume that neighbor window densities overlap, as required in WHAM, and that further-than-neighbor window density overlap is negligible. Then, an analytic expression for the variance of the PMF along the reaction coordinate at a desired level of spatial resolution can be generated. The variance separates into a sum over all windows with two kinds of contributions: One from the variance of the biased window density normalized by the total biased window density and the other from the variance of the local (for each window's coordinate range) PMF. Based on the desired spatial resolution of the PMF, the former variance can be minimized relative to that from the latter. The method is applied to a model system that has features of a complex energy landscape evocative of a protein with two conformational states separated by a free energy barrier along a collective reaction coordinate. The variance can be constructed from data that is already available from the WHAM PMF construction.
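    Under the abstract's assumptions (Gaussian biased window densities with overlapping neighbors), combining umbrella windows via WHAM can be sketched on a toy system whose true PMF is flat, so the recovered PMF should come out flat up to sampling noise. The window centers, spring constant, and bin layout below are invented for illustration; this reproduces only the standard WHAM iteration, not the paper's variance analysis.

```python
import math
import random

random.seed(1)
beta, k = 1.0, 2.0                        # inverse temperature, bias spring constant
centers = [-1.0, 0.0, 1.0]                # umbrella window centers
nbins, lo, hi = 40, -2.0, 2.0
width = (hi - lo) / nbins
x_mid = [lo + (i + 0.5) * width for i in range(nbins)]

# Flat true PMF: each biased window density is Gaussian with
# sigma = 1/sqrt(beta*k) -- the Gaussian-window assumption of the abstract.
sigma = 1.0 / math.sqrt(beta * k)
hists = []
for c in centers:
    h = [0] * nbins
    for _ in range(50000):
        b = math.floor((random.gauss(c, sigma) - lo) / width)
        if 0 <= b < nbins:
            h[b] += 1
    hists.append(h)
Ns = [sum(h) for h in hists]              # in-domain sample count per window

bias = [[0.5 * k * (x - c) ** 2 for x in x_mid] for c in centers]
f = [0.0] * len(centers)                  # window free energies (WHAM fixed point)
for _ in range(200):                      # self-consistent WHAM iteration
    P = [sum(h[b] for h in hists) /
         sum(Ns[i] * math.exp(-beta * (bias[i][b] - f[i]))
             for i in range(len(centers)))
         for b in range(nbins)]
    Z = sum(P)
    P = [p / Z for p in P]
    f = [-math.log(sum(P[b] * math.exp(-beta * bias[i][b])
                       for b in range(nbins))) / beta
         for i in range(len(centers))]
    f = [fi - f[0] for fi in f]           # remove the arbitrary additive gauge

# Unbiased PMF; check flatness away from the domain edges.
pmf = [-math.log(p) / beta for p in P]
core = [pmf[b] for b in range(nbins) if abs(x_mid[b]) < 1.2]
```

    The per-window contributions to the PMF variance analyzed in the paper are built from exactly these biased histograms and the converged `f` values.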

  7. Technical Note: Dose-volume histogram analysis in radiotherapy using the Gaussian error function

    SciTech Connect

    Chow, James C. L.; Markel, Daniel; Jiang, Runqing

    2008-04-15

    A mathematical model based on the Gaussian error and complementary error functions was proposed to describe the cumulative dose-volume histogram (cDVH) for a region of interest in a radiotherapy plan. Parameters in the model (a, b, c) are related to different characteristics of the shape of a cDVH curve such as the maximum relative volume, slope and position of a curve drop-off, respectively. A prostate phantom model containing a prostate, the seminal vesicle, bladder and rectum with cylindrical organ geometries was used to demonstrate the effect of interfraction prostate motion on the cDVH based on this error function model. The prostate phantom model was planned using a five-beam intensity-modulated radiotherapy (IMRT) technique and a four-field box (4FB) technique, with the clinical target volume (CTV) shifted in different directions from the center. In the case of the CTV moving out of the planning target volume (PTV), that is, the margin between the CTV and PTV is underestimated, parameter c (related to position of curve drop-off) in the 4FB plan and parameters b (related to the slope of curve) and c in the IMRT plan vary significantly with CTV displacement. This shows that variation of the cDVH is present in the 4FB plan and such variation is more serious in the IMRT plan. These variations of cDVHs for 4FB and IMRT are due to the different dose gradients at the CTV edges in the anterior and posterior directions for the 4FB and IMRT plans. It is believed that a mathematical representation of the dose-volume relationship provides another viewpoint from which to illustrate problems with radiotherapy delivery such as internal organ motion that affect the dose distribution in a treatment plan.
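    A cDVH of the kind described can be written with the complementary error function, where a sets the maximum relative volume, b the steepness of the drop-off, and c its dose position. The abstract does not spell out the exact functional form, so the parameterization below is one plausible reading, shown only to make the roles of (a, b, c) concrete.

```python
import math

def cdvh(dose, a, b, c):
    # Relative volume receiving at least `dose`:
    # a ~ maximum relative volume (plateau at low dose),
    # b ~ steepness of the drop-off, c ~ dose at the drop-off midpoint.
    # One plausible parameterization; not the paper's exact formula.
    return (a / 2.0) * math.erfc(b * (dose - c))

# Shape checks for a = 1, b = 0.5, c = 60 (arbitrary illustrative values):
v_low  = cdvh(0.0,   a=1.0, b=0.5, c=60.0)   # far below the drop-off -> ~a
v_mid  = cdvh(60.0,  a=1.0, b=0.5, c=60.0)   # at the drop-off -> a/2
v_high = cdvh(120.0, a=1.0, b=0.5, c=60.0)   # far above the drop-off -> ~0
```

    Fitting (a, b, c) to planned cDVH curves then lets displacement effects be tracked as parameter shifts, as the abstract describes for the 4FB and IMRT plans.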

  8. Dose-Volume Histogram Analysis of the Safety of Proton Beam Therapy for Unresectable Hepatocellular Carcinoma

    SciTech Connect

    Kawashima, Mitsuhiko; Kohno, Ryosuke; Nakachi, Kohei; Nishio, Teiji; Mitsunaga, Shuichi; Ikeda, Masafumi; Konishi, Masaru; Takahashi, Shinichiro; Gotohda, Naoto; Arahira, Satoko; Zenda, Sadamoto; Ogino, Takashi; Kinoshita, Taira

    2011-04-01

    Purpose: To evaluate the safety and efficacy of radiotherapy using proton beam (PRT) for unresectable hepatocellular carcinoma. Methods and Materials: Sixty consecutive patients who underwent PRT between May 1999 and July 2007 were analyzed. There were 42 males and 18 females, with a median age of 70 years (48-92 years). All but 1 patient had a single lesion with a median diameter of 45 mm (20-100 mm). Total PRT dose/fractionation was 76-cobalt Gray equivalent (CGE)/20 fractions in 46 patients, 65 CGE/26 fractions in 11 patients, and 60 CGE/10 fractions in 3 patients. The risk of developing proton-induced hepatic insufficiency (PHI) was estimated using dose-volume histograms and an indocyanine-green retention rate at 15 minutes (ICG R15). Results: None of the 20 patients with ICG R15 of less than 20% developed PHI, whereas 6 of 8 patients with ICG R15 values of 50% or higher developed PHI. Among 32 patients whose ICG R15 ranged from 20% to 49.9%, PHI was observed only in patients who had received 30 CGE (V30) to more than 25% of the noncancerous parts of the liver (n = 5). Local progression-free and overall survival rates at 3 years were 90% (95% confidence interval [CI], 80-99%) and 56% (95% CI, 43-69%), respectively. A gastrointestinal toxicity of Grade ≥2 was observed in 3 patients. Conclusions: ICG R15 and V30 are recommended as useful predictors for the risk of developing PHI, which should be incorporated into multidisciplinary treatment plans for patients with this disease.
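    The reported cutoffs read as a simple decision rule: ICG R15 below 20% indicates low PHI risk, 50% or higher indicates high risk, and in the intermediate range the V30 value decides. The sketch below is only an illustrative reading of the abstract's numbers, not a validated clinical tool; the function name is invented.

```python
def phi_risk(icg_r15, v30):
    # PHI risk screen per the abstract's cutoffs (illustrative only):
    # icg_r15 -- indocyanine-green retention at 15 min, percent;
    # v30 -- percent of noncancerous liver receiving 30 CGE.
    if icg_r15 < 20.0:
        return "low"
    if icg_r15 >= 50.0:
        return "high"
    return "high" if v30 > 25.0 else "low"

a = phi_risk(15.0, 40.0)   # low ICG R15 -> low risk regardless of V30
b = phi_risk(35.0, 30.0)   # intermediate ICG R15, V30 > 25% -> high risk
c = phi_risk(35.0, 10.0)   # intermediate ICG R15, small V30 -> low risk
```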

  9. Nanothermodynamics of large iron clusters by means of a flat histogram Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Basire, M.; Soudan, J.-M.; Angelié, C.

    2014-09-01

    The thermodynamics of iron clusters of various sizes, from 76 to 2452 atoms, typical of the catalyst particles used for carbon nanotubes growth, has been explored by a flat histogram Monte Carlo (MC) algorithm (called the σ-mapping), developed by Soudan et al. [J. Chem. Phys. 135, 144109 (2011), Paper I]. This method provides the classical density of states, gp(Ep) in the configurational space, in terms of the potential energy of the system, with good and well controlled convergence properties, particularly in the melting phase transition zone which is of interest in this work. To describe the system, an iron potential has been implemented, called "corrected EAM" (cEAM), which approximates the MEAM potential of Lee et al. [Phys. Rev. B 64, 184102 (2001)] with an accuracy better than 3 meV/at, and a five times larger computational speed. The main simplification concerns the angular dependence of the potential, with a small impact on accuracy, while the screening coefficients Sij are exactly computed with a fast algorithm. With this potential, ergodic explorations of the clusters can be performed efficiently in a reasonable computing time, at least in the upper half of the solid zone and above. Problems of ergodicity exist in the lower half of the solid zone but routes to overcome them are discussed. The solid-liquid (melting) phase transition temperature Tm is plotted in terms of the cluster atom number Nat. The standard N_{at}^{-1/3} linear dependence (Pawlow law) is observed for Nat >300, allowing an extrapolation up to the bulk metal at 1940 ±50 K. For Nat <150, a strong divergence is observed compared to the Pawlow law. The melting transition, which begins at the surface, is stated by a Lindemann-Berry index and an atomic density analysis. Several new features are obtained for the thermodynamics of cEAM clusters, compared to the Rydberg pair potential clusters studied in Paper I.
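    The σ-mapping algorithm itself is detailed in Paper I; as a generic illustration of the flat-histogram idea it builds on, here is a Wang-Landau-style sketch that estimates the density of states of a toy system of N independent two-level units (exact g(E) = C(N, E)). The system, step counts, and fixed schedule for shrinking the modification factor are all invented simplifications.

```python
import math
import random

random.seed(2)
N = 10                                   # toy system: N independent two-level units
state = [0] * N
E = 0                                    # "energy" = number of excited units
lng = [0.0] * (N + 1)                    # running estimate of ln g(E)
lnf = 1.0                                # modification factor, shrunk in stages

while lnf > 1e-4:
    for _ in range(20000):
        i = random.randrange(N)
        E_new = E + (1 - 2 * state[i])   # energy change from flipping unit i
        # Flat-histogram acceptance: rarely visited energies are favoured.
        if random.random() < math.exp(min(0.0, lng[E] - lng[E_new])):
            state[i] = 1 - state[i]
            E = E_new
        lng[E] += lnf
    lnf /= 2.0                           # simplified schedule (no flatness check)

# Compare with the exact density of states, fixing the constant at E = 0.
est = [math.exp(l - lng[0]) for l in lng]
exact = [math.comb(N, e) for e in range(N + 1)]
```

    Once g(E) is known, thermodynamic quantities such as the melting-zone heat capacity follow by reweighting, which is what makes flat-histogram sampling attractive for cluster thermodynamics.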

  10. Assessment of Autonomic Function by Phase Rectification of RR-Interval Histogram Analysis in Chagas Disease

    PubMed Central

    Nasari-Junior, Olivassé; Benchimol-Barbosa, Paulo Roberto; Pedrosa, Roberto Coury; Nadal, Jurandir

    2015-01-01

    Background In chronic Chagas disease (ChD), impairment of cardiac autonomic function bears prognostic implications. Phase‑rectification of RR-interval series isolates the sympathetic, acceleration phase (AC) and parasympathetic, deceleration phase (DC) influences on cardiac autonomic modulation. Objective This study investigated heart rate variability (HRV) as a function of RR-interval to assess autonomic function in healthy and ChD subjects. Methods Control (n = 20) and ChD (n = 20) groups were studied. All underwent 60-min head-up tilt table test under ECG recording. Histogram of RR-interval series was calculated, with 100 ms class, ranging from 600–1100 ms. In each class, mean RR-intervals (MNN) and root-mean-squared difference (RMSNN) of consecutive normal RR-intervals that suited a particular class were calculated. Average of all RMSNN values in each class was analyzed as function of MNN, in the whole series (RMSNNT), and in AC (RMSNNAC) and DC (RMSNNDC) phases. Slopes of linear regression lines were compared between groups using Student t-test. Correlation coefficients were tested before comparisons. RMSNN was log-transformed. (α < 0.05). Results Correlation coefficient was significant in all regressions (p < 0.05). In the control group, RMSNNT, RMSNNAC, and RMSNNDC significantly increased linearly with MNN (p < 0.05). In ChD, only RMSNNAC showed significant increase as a function of MNN, whereas RMSNNT and RMSNNDC did not. Conclusion HRV increases in proportion with the RR-interval in healthy subjects. This behavior is lost in ChD, particularly in the DC phase, indicating cardiac vagal incompetence. PMID:26131700
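    The per-class RMSNN computation described above can be sketched as follows: successive RR differences are bucketed by the 100-ms histogram class of the interval, then split into acceleration (AC, RR shortening) and deceleration (DC, RR lengthening) phases before taking the root-mean-square. The helper name and the toy RR series are invented for the example; this is not the study's processing pipeline.

```python
import math
import random

def rmsnn_by_class(rr, lo=600, hi=1100, width=100):
    # Bucket successive RR differences by the 100-ms class of the current
    # interval; "T" = whole series, "AC" = d < 0, "DC" = d > 0.
    nclass = (hi - lo) // width
    buckets = {ph: [[] for _ in range(nclass)] for ph in ("T", "AC", "DC")}
    for prev, cur in zip(rr, rr[1:]):
        b = int((cur - lo) // width)
        if not (0 <= b < nclass):
            continue
        d = cur - prev
        buckets["T"][b].append(d)
        if d < 0:
            buckets["AC"][b].append(d)
        elif d > 0:
            buckets["DC"][b].append(d)
    def rms(v):
        return math.sqrt(sum(x * x for x in v) / len(v)) if v else None
    return {ph: [rms(v) for v in cols] for ph, cols in buckets.items()}

# Toy series: RR intervals around 800 ms with ~20 ms variability.
random.seed(3)
rr = [800 + random.gauss(0, 20) for _ in range(2000)]
out = rmsnn_by_class(rr)
```

    Plotting each class's RMSNN against its mean RR (MNN) and fitting a regression line gives the slopes compared between groups in the study.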

  11. Automatic detection of atrial fibrillation using the coefficient of variation and density histograms of RR and deltaRR intervals.

    PubMed

    Tateno, K; Glass, L

    2001-11-01

    The paper describes a method for the automatic detection of atrial fibrillation, an abnormal heart rhythm, based on the sequence of intervals between heartbeats. The RR interval is the interbeat interval, and deltaRR is the difference between two successive RR intervals. Standard density histograms of the RR and deltaRR intervals were prepared as templates for atrial fibrillation detection. As the coefficients of variation of the RR and deltaRR intervals were approximately constant during atrial fibrillation, the coefficients of variation in the test data could be compared with the standard coefficients of variation (CV test). Further, the similarities between the density histograms of the test data and the standard density histograms were estimated using the Kolmogorov-Smirnov test. The CV test based on the RR intervals showed a sensitivity of 86.6% and a specificity of 84.3%. The CV test based on the deltaRR intervals showed that the sensitivity and the specificity are both approximately 84%. The Kolmogorov-Smirnov test based on the RR intervals did not improve on the result of the CV test. In contrast, the Kolmogorov-Smirnov test based on the deltaRR intervals showed a sensitivity of 94.4% and a specificity of 97.2%.
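    The CV test amounts to comparing a segment's coefficient of variation against a reference value observed during atrial fibrillation. A minimal sketch follows; the reference CV and tolerance are illustrative placeholders, not the thresholds used in the paper, and the toy RR segments are invented.

```python
import math

def coefficient_of_variation(x):
    # Standard deviation divided by the mean (population form).
    mean = sum(x) / len(x)
    return math.sqrt(sum((v - mean) ** 2 for v in x) / len(x)) / mean

def cv_test(rr, cv_ref, tol):
    # Flag the segment when its RR-interval CV lies near the reference CV
    # seen during AF. cv_ref and tol are illustrative placeholders.
    return abs(coefficient_of_variation(rr) - cv_ref) < tol

# Toy segments: a regular rhythm versus a highly irregular (AF-like) one.
rr_sr = [795, 805, 800, 798, 802] * 20
rr_af = [600, 1000, 700, 950, 650, 1050, 720, 880] * 20
sr_flag = cv_test(rr_sr, cv_ref=0.2, tol=0.1)   # regular -> not flagged
af_flag = cv_test(rr_af, cv_ref=0.2, tol=0.1)   # irregular -> flagged
```

    The paper's stronger deltaRR-based detector applies the same idea to successive differences and compares full density histograms with the Kolmogorov-Smirnov statistic.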

  12. A computational tool for the efficient analysis of dose-volume histograms for radiation therapy treatment plans

    PubMed Central

    Pyakuryal, Anil; Myint, W. Kenji; Gopalakrishnan, Mahesh; Jang, Sunyoung; Logemann, Jerilyn A.; Mittal, Bharat B.

    2010-01-01

    A Histogram Analysis in Radiation Therapy (HART) program was primarily developed to increase the efficiency and accuracy of dose–volume histogram (DVH) analysis of large quantities of patient data in radiation therapy research. The program was written in MATLAB to analyze patient plans exported from the treatment planning system (Pinnacle3) in the American Association of Physicists in Medicine/Radiation Therapy Oncology Group (AAPM/RTOG) format. HART-computed DVH data was validated against manually extracted data from the planning system for five head and neck cancer patients treated with the intensity-modulated radiation therapy (IMRT) technique. HART calculated over 4000 parameters from the differential DVH (dDVH) curves for each patient in approximately 10–15 minutes. Manual extraction of this amount of data required 5 to 6 hours. The normalized root mean square deviation (NRMSD) for the HART–extracted DVH outcomes was less than 1%, or within 0.5% distance-to-agreement (DTA). This tool is supported with various user-friendly options and graphical displays. Additional features include optimal polynomial modeling of DVH curves for organs, treatment plan indices (TPI) evaluation, plan-specific outcome analysis (POA), and spatial DVH (zDVH) and dose surface histogram (DSH) analyses. HART is freely available to the radiation oncology community. PMID:20160690
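    The validation metric quoted above, NRMSD, can be computed as the root-mean-square deviation between the manually extracted and program-computed curves, normalized by the reference range. The abstract does not give the exact normalization HART uses, so the definition and toy curves below are a generic sketch.

```python
import math

def nrmsd_percent(reference, computed):
    # RMS deviation between two sampled DVH curves, as a percentage of the
    # reference curve's range (generic definition; HART's exact
    # normalization is not specified in the abstract).
    rms = math.sqrt(sum((r - c) ** 2 for r, c in zip(reference, computed))
                    / len(reference))
    return 100.0 * rms / (max(reference) - min(reference))

manual = [100.0, 95.0, 80.0, 40.0, 5.0]   # toy cumulative DVH (% volume)
hart   = [100.0, 94.8, 80.2, 39.9, 5.1]   # toy program-computed values
err = nrmsd_percent(manual, hart)
```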

  13. Equality Assurance: Self-Assessment for Equal Opportunities in Further Education.

    ERIC Educational Resources Information Center

    Dadzie, Stella, Comp.

    This manual is intended as a tool kit for further education (FE) colleges to use to develop their own approaches to equal opportunities policy development and implementation. The following topics are discussed in the eight sections: the manual's development; the case for equality; things an equal opportunities policy should cover; strategic and…

  14. Equality Hypocrisy, Inconsistency, and Prejudice: The Unequal Application of the Universal Human Right to Equality

    PubMed Central

    2015-01-01

    In Western culture, there appears to be widespread endorsement of Article 1 of the Universal Declaration of Human Rights (which stresses equality and freedom). But do people really apply their equality values equally, or are their principles and application systematically discrepant, resulting in equality hypocrisy? The present study, conducted with a representative national sample of adults in the United Kingdom (N = 2,895), provides the first societal test of whether people apply their value of “equality for all” similarly across multiple types of status minority (women, disabled people, people aged over 70, Blacks, Muslims, and gay people). Drawing on theories of intergroup relations and stereotyping, we examined, in relation to each of these groups, respondents’ judgments of how important it is to satisfy their particular wishes, whether there should be greater or reduced equality of employment opportunities, and feelings of social distance. The data revealed a clear gap between general equality values and responses to these specific measures. Respondents prioritized equality more for “paternalized” groups (targets of benevolent prejudice: women, disabled, over 70) than others (Black people, Muslims, and homosexual people), demonstrating significant inconsistency. Respondents who valued equality more, or who expressed higher internal or external motivation to control prejudice, showed greater consistency in applying equality. However, even respondents who valued equality highly showed significant divergence in their responses to paternalized versus nonpaternalized groups, revealing a degree of hypocrisy. Implications for strategies to promote equality and challenge prejudice are discussed. PMID:25914516

  15. Jarzynski equality for quantum stochastic maps.

    PubMed

    Rastegin, Alexey E; Życzkowski, Karol

    2014-01-01

    Jarzynski equality and related fluctuation theorems can be formulated for various setups. Such an equality was recently derived for nonunitary quantum evolutions described by unital quantum operations, i.e., for completely positive, trace-preserving maps, which preserve the maximally mixed state. We analyze here a more general case of arbitrary quantum operations on finite systems and derive the corresponding form of the Jarzynski equality. It contains a correction term due to nonunitality of the quantum map. Bounds for the relative size of this correction term are established and they are applied for exemplary systems subjected to quantum channels acting on a finite-dimensional Hilbert space.
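    For context, the classical equality that the abstract generalizes relates the fluctuating work W done in a nonequilibrium process to the equilibrium free-energy difference ΔF. In the unital quantum setting it keeps this form; the non-unital case analyzed above acquires an additional correction term whose explicit form is given in the paper and not reproduced here:

```latex
\left\langle e^{-\beta W} \right\rangle = e^{-\beta \Delta F},
\qquad \text{valid for unital maps, } \Phi(\mathbb{1}) = \mathbb{1}.
```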

  16. Jarzynski equality for quantum stochastic maps

    NASA Astrophysics Data System (ADS)

    Rastegin, Alexey E.; Życzkowski, Karol

    2014-01-01

    Jarzynski equality and related fluctuation theorems can be formulated for various setups. Such an equality was recently derived for nonunitary quantum evolutions described by unital quantum operations, i.e., for completely positive, trace-preserving maps, which preserve the maximally mixed state. We analyze here a more general case of arbitrary quantum operations on finite systems and derive the corresponding form of the Jarzynski equality. It contains a correction term due to nonunitality of the quantum map. Bounds for the relative size of this correction term are established and they are applied for exemplary systems subjected to quantum channels acting on a finite-dimensional Hilbert space.

  17. Equal Remuneration Convention (ILO No. 100).

    PubMed

    1989-01-01

    The government of Uruguay ratified this UN International Labor Organization convention on equal remuneration on November 16, 1989, and the government of Zimbabwe ratified this convention on December 14, 1989.

  18. 7 CFR 622.6 - Equal opportunity.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... of Agriculture (7 CFR Part 15), which provide that no person in the United States shall, on the... AGRICULTURE WATER RESOURCES WATERSHED PROJECTS General § 622.6 Equal opportunity. The Pub. L. 83-566 and...

  19. 7 CFR 622.6 - Equal opportunity.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... of Agriculture (7 CFR Part 15), which provide that no person in the United States shall, on the... AGRICULTURE WATER RESOURCES WATERSHED PROJECTS General § 622.6 Equal opportunity. The Pub. L. 83-566 and...

  20. 7 CFR 622.6 - Equal opportunity.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... of Agriculture (7 CFR Part 15), which provide that no person in the United States shall, on the... AGRICULTURE WATER RESOURCES WATERSHED PROJECTS General § 622.6 Equal opportunity. The Pub. L. 83-566 and...

  1. 7 CFR 622.6 - Equal opportunity.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... of Agriculture (7 CFR Part 15), which provide that no person in the United States shall, on the... AGRICULTURE WATER RESOURCES WATERSHED PROJECTS General § 622.6 Equal opportunity. The Pub. L. 83-566 and...

  2. Publicly Supported, Universally Available Education and Equality

    ERIC Educational Resources Information Center

    Boulding, Kenneth E.

    1976-01-01

    It was once thought that publicly supported, universally available education would produce socioeconomic equality. Family inheritance--financial, cultural, social--explains much of its failure to do so. (Author)

  3. 49 CFR 236.792 - Reservoir, equalizing.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... equalizing piston chamber of the automatic brake valve, to provide uniform service reductions in brake pipe pressure regardless of the length of the train. Cross Reference: Rocker, see § 236.755....

  4. The Bakke Opinions and Equal Protection Doctrine.

    ERIC Educational Resources Information Center

    Karst, Kenneth L.; Horowitz, Harold W.

    1979-01-01

    Constitutional issues addressed in the Supreme Court's decision are reviewed. The opinions rendered by Justice Powell are viewed as reflections of the weakness of recent equal protection theory, and as signs of future doctrine. (GC)

  5. The Equality Act 2010 and mental health.

    PubMed

    Lockwood, Graeme; Henderson, Claire; Thornicroft, Graham

    2012-03-01

    One aim of the Equality Act 2010 is to protect people with disabilities and prevent disability discrimination. We review the key provisions of the Act relevant to disability discrimination with respect to mental illness.

  6. Summary of the second equal opportunity conference

    SciTech Connect

    Not Available

    1984-02-01

    Reports and recommendations are included for: Hispanic Employment Program; Federal Women's Program; Equal Employment Opportunity Program; complaint processing workshop; Affirmative Actions Program Workshop; EEO programs in the Office of Personnel Management; and an overall evaluation of the program. (PSB)

  7. Turbo Equalization Using Partial Gaussian Approximation

    NASA Astrophysics Data System (ADS)

    Zhang, Chuanzong; Wang, Zhongyong; Manchon, Carles Navarro; Sun, Peng; Guo, Qinghua; Fleury, Bernard Henri

    2016-09-01

    This paper deals with turbo-equalization for coded data transmission over intersymbol interference (ISI) channels. We propose a message-passing algorithm that uses the expectation-propagation rule to convert messages passed from the demodulator-decoder to the equalizer and computes messages returned by the equalizer by using a partial Gaussian approximation (PGA). Results from Monte Carlo simulations show that this approach leads to a significant performance improvement compared to state-of-the-art turbo-equalizers and allows for trading performance with complexity. We exploit the specific structure of the ISI channel model to significantly reduce the complexity of the PGA compared to that considered in the initial paper proposing the method.

  8. 29 CFR 1620.14 - Testing equality of jobs.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 4 2010-07-01 2010-07-01 false Testing equality of jobs. 1620.14 Section 1620.14 Labor... Testing equality of jobs. (a) In general. What constitutes equal skill, equal effort, or equal..., or responsibility required for the performance of jobs will not render the equal pay...

  9. 29 CFR 1620.14 - Testing equality of jobs.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 4 2011-07-01 2011-07-01 false Testing equality of jobs. 1620.14 Section 1620.14 Labor... Testing equality of jobs. (a) In general. What constitutes equal skill, equal effort, or equal..., or responsibility required for the performance of jobs will not render the equal pay...

  10. 29 CFR 1620.14 - Testing equality of jobs.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 29 Labor 4 2012-07-01 2012-07-01 false Testing equality of jobs. 1620.14 Section 1620.14 Labor... Testing equality of jobs. (a) In general. What constitutes equal skill, equal effort, or equal..., or responsibility required for the performance of jobs will not render the equal pay...

  11. 29 CFR 1620.14 - Testing equality of jobs.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 29 Labor 4 2014-07-01 2014-07-01 false Testing equality of jobs. 1620.14 Section 1620.14 Labor... Testing equality of jobs. (a) In general. What constitutes equal skill, equal effort, or equal..., or responsibility required for the performance of jobs will not render the equal pay...

  12. 29 CFR 1620.14 - Testing equality of jobs.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 29 Labor 4 2013-07-01 2013-07-01 false Testing equality of jobs. 1620.14 Section 1620.14 Labor... Testing equality of jobs. (a) In general. What constitutes equal skill, equal effort, or equal..., or responsibility required for the performance of jobs will not render the equal pay...

  13. Nanothermodynamics of large iron clusters by means of a flat histogram Monte Carlo method.

    PubMed

    Basire, M; Soudan, J-M; Angelié, C

    2014-09-14

    The thermodynamics of iron clusters of various sizes, from 76 to 2452 atoms, typical of the catalyst particles used for carbon nanotubes growth, has been explored by a flat histogram Monte Carlo (MC) algorithm (called the σ-mapping), developed by Soudan et al. [J. Chem. Phys. 135, 144109 (2011), Paper I]. This method provides the classical density of states, gp(Ep) in the configurational space, in terms of the potential energy of the system, with good and well controlled convergence properties, particularly in the melting phase transition zone which is of interest in this work. To describe the system, an iron potential has been implemented, called "corrected EAM" (cEAM), which approximates the MEAM potential of Lee et al. [Phys. Rev. B 64, 184102 (2001)] with an accuracy better than 3 meV/at, and a five times larger computational speed. The main simplification concerns the angular dependence of the potential, with a small impact on accuracy, while the screening coefficients S(ij) are exactly computed with a fast algorithm. With this potential, ergodic explorations of the clusters can be performed efficiently in a reasonable computing time, at least in the upper half of the solid zone and above. Problems of ergodicity exist in the lower half of the solid zone but routes to overcome them are discussed. The solid-liquid (melting) phase transition temperature T(m) is plotted in terms of the cluster atom number N(at). The standard N(at)(-1/3) linear dependence (Pawlow law) is observed for N(at) >300, allowing an extrapolation up to the bulk metal at 1940 ±50 K. For N(at) <150, a strong divergence is observed compared to the Pawlow law. The melting transition, which begins at the surface, is stated by a Lindemann-Berry index and an atomic density analysis. Several new features are obtained for the thermodynamics of cEAM clusters, compared to the Rydberg pair potential clusters studied in Paper I.

  14. Nanothermodynamics of large iron clusters by means of a flat histogram Monte Carlo method

    SciTech Connect

    Basire, M.; Soudan, J.-M.; Angelié, C.

    2014-09-14

    The thermodynamics of iron clusters of various sizes, from 76 to 2452 atoms, typical of the catalyst particles used for carbon nanotube growth, has been explored by a flat histogram Monte Carlo (MC) algorithm (called the σ-mapping), developed by Soudan et al. [J. Chem. Phys. 135, 144109 (2011), Paper I]. This method provides the classical density of states, g_p(E_p), in the configurational space, in terms of the potential energy of the system, with good and well-controlled convergence properties, particularly in the melting phase transition zone, which is of interest in this work. To describe the system, an iron potential has been implemented, called "corrected EAM" (cEAM), which approximates the MEAM potential of Lee et al. [Phys. Rev. B 64, 184102 (2001)] with an accuracy better than 3 meV/at and a five times greater computational speed. The main simplification concerns the angular dependence of the potential, with a small impact on accuracy, while the screening coefficients S_ij are computed exactly with a fast algorithm. With this potential, ergodic explorations of the clusters can be performed efficiently in a reasonable computing time, at least in the upper half of the solid zone and above. Problems of ergodicity exist in the lower half of the solid zone, but routes to overcome them are discussed. The solid-liquid (melting) phase transition temperature T_m is plotted in terms of the cluster atom number N_at. The standard N_at^(-1/3) linear dependence (Pawlow law) is observed for N_at > 300, allowing an extrapolation up to the bulk metal at 1940 ± 50 K. For N_at < 150, a strong divergence from the Pawlow law is observed. The melting transition, which begins at the surface, is characterized by a Lindemann-Berry index and an atomic density analysis. Several new features are obtained for the thermodynamics of cEAM clusters compared with the Rydberg pair potential clusters studied in Paper I.

  15. Adaptive cancellation techniques

    NASA Astrophysics Data System (ADS)

    1983-11-01

    An adaptive signal canceller has been evaluated for the enhancement of pulse signal reception during the transmission of a high-power ECM jamming signal. The canceller design is based on the use of DRFM (Digital RF Memory) technology as part of an adaptive multiple-tapped delay line. The study includes an analysis of the relationship between tap spacing and waveform bandwidth; a survey of related documents in the areas of sidelobe cancellers, transversal equalizers, and adaptive filters; and a derivation of control equations and corresponding control processes. The simulation of the overall processes included geometric analysis of the multibeam transmitting antenna, multiple reflection sources and the receiving antenna; waveforms, tap spacings and bandwidths; and alternate control algorithms. Conclusions are provided regarding practical system control algorithms, design characteristics and limitations.

  16. Criteria for equality in two entropic inequalities

    SciTech Connect

    Shirokov, M. E.

    2014-07-31

    We obtain a simple criterion for local equality between the constrained Holevo capacity and the quantum mutual information of a quantum channel. This shows that the set of all states for which this equality holds is determined by the kernel of the channel (as a linear map). Applications to Bosonic Gaussian channels are considered. It is shown that for a Gaussian channel having no completely depolarizing components the above characteristics may coincide only at non-Gaussian mixed states and a criterion for the existence of such states is given. All the obtained results may be reformulated as conditions for equality between the constrained Holevo capacity of a quantum channel and the input von Neumann entropy. Bibliography: 20 titles. (paper)

  17. All Are Equal, but Some Are More Equal than Others: Managerialism and Gender Equality in Higher Education in Comparative Perspective

    ERIC Educational Resources Information Center

    Teelken, Christine; Deem, Rosemary

    2013-01-01

    The main purpose of this paper is to investigate what impact new regimes of management and governance, including new managerialism, have had on perceptions of gender equality at universities in three Western European countries. While in accordance with national laws and EU directives, contemporary management approaches in universities…

  18. Angiogenic response of locally advanced breast cancer to neoadjuvant chemotherapy evaluated with parametric histogram from dynamic contrast-enhanced MRI

    NASA Astrophysics Data System (ADS)

    Chang, Yeun-Chung; Huang, Chiun-Sheng; Liu, Yi-Jui; Chen, Jyh-Horng; Lu, Yen-Shen; Tseng, Wen-Yih I.

    2004-08-01

    The aim of this study was to evaluate angiogenic compositions and tumour response in the course of neoadjuvant chemotherapy in patients with locally advanced breast cancer (LABC) using dynamic contrast-enhanced (DCE) MRI. Thirteen patients with LABC underwent serial DCE MRI during the course of chemotherapy. DCE MRI was quantified using a two-compartment model on a pixel-by-pixel basis. Analysis of parametric histograms of amplitude, exchange rate k_out and peak enhancement over the whole tumour was performed. The distribution patterns of the histograms were correlated with the tumour response. Initial kurtosis and standard deviation of amplitude before chemotherapy correlated with tumour response, r = 0.63 and r = 0.61, respectively. Comparing the initial values with the values after the first course of chemotherapy, tumour response was associated with a decrease in the standard deviation of amplitude (r = 0.79), and an increase in kurtosis and a decrease in standard deviation of k_out (r = 0.57 and 0.57, respectively). Comparing the initial values with the values after completing the chemotherapy, tumours with better response were associated with an increase in kurtosis (r = 0.62), a decrease in mean (r = 0.84) and standard deviation (r = 0.77) of amplitude, and a decrease in mean peak enhancement (r = 0.71). Our results suggested that tumours with better response tended to alter their internal compositions from heterogeneous to homogeneous distributions, with a decrease in peak enhancement after chemotherapy. Serial analyses of parametric histograms of DCE MRI-derived angiogenic parameters are potentially useful to monitor the response of the angiogenic compositions of a tumour throughout the course of chemotherapy, and might predict tumour response early in the course.
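
    The histogram-shape statistics tracked in this study (mean, standard deviation, kurtosis of a per-pixel parameter map) are straightforward to compute. A minimal sketch, assuming scipy is available and using synthetic maps in place of real DCE MRI parameters; whether the Pearson or excess kurtosis convention was used in the study is an assumption here:

```python
import numpy as np
from scipy.stats import kurtosis

def histogram_stats(param_map):
    """Summary statistics of a per-pixel parameter map (e.g. amplitude
    or k_out), as used to track response across chemotherapy courses."""
    v = np.asarray(param_map, dtype=float).ravel()
    v = v[np.isfinite(v)]                       # drop unfitted pixels
    return {
        "mean": v.mean(),
        "std": v.std(ddof=1),
        "kurtosis": kurtosis(v, fisher=False),  # Pearson convention: normal = 3
    }

# Illustrative only: a heterogeneous amplitude map narrowing after therapy
rng = np.random.default_rng(0)
before = histogram_stats(rng.normal(1.0, 0.5, 4096))
after = histogram_stats(rng.normal(0.6, 0.2, 4096))
print(after["std"] < before["std"])  # True: responders show a falling spread
```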

  19. Image quality-based adaptive illumination normalisation for face recognition

    NASA Astrophysics Data System (ADS)

    Sellahewa, Harin; Jassim, Sabah A.

    2009-05-01

    Automatic face recognition is a challenging task due to intra-class variations. Changes in lighting conditions during the enrolment and identification stages contribute significantly to these intra-class variations. A common approach to address the effects of such varying conditions is to pre-process the biometric samples in order to normalise intra-class variations. Histogram equalisation is a widely used illumination normalisation technique in face recognition. However, a recent study has shown that applying histogram equalisation to well-lit face images can decrease recognition accuracy. This paper presents a dynamic approach to illumination normalisation, based on face image quality. The quality of a given face image is measured in terms of its luminance distortion by comparing the image against a known reference face image. Histogram equalisation is applied to a probe image only if its luminance distortion is higher than a predefined threshold. We tested the proposed adaptive illumination normalisation method on the widely used Extended Yale Face Database B. Identification results demonstrate that our adaptive normalisation produces better identification accuracy than the conventional approach, in which every image is normalised irrespective of the lighting conditions under which it was acquired.
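
    The decision rule described here (equalise only when luminance distortion crosses a threshold) can be sketched with numpy. The distortion measure below is the luminance component of the Wang-Bovik quality index, which is one plausible choice and an assumption, as are the threshold value and the synthetic images:

```python
import numpy as np

def luminance_distortion(img, ref):
    """Luminance component of the Wang-Bovik quality index (1 = identical means)."""
    mx, my = float(img.mean()), float(ref.mean())
    return 2.0 * mx * my / (mx * mx + my * my)

def equalize(img):
    """Plain histogram equalisation of an 8-bit image via its CDF."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    lut = np.round(255.0 * (cdf - cdf.min()) / (cdf.max() - cdf.min())).astype(np.uint8)
    return lut[img]

def adaptive_normalise(img, ref, threshold=0.9):
    """Equalise only when the probe is poorly lit relative to the reference."""
    if luminance_distortion(img, ref) < threshold:
        return equalize(img)
    return img  # well-lit images are left untouched

rng = np.random.default_rng(1)
ref = rng.integers(80, 180, (64, 64), dtype=np.uint8)  # hypothetical well-lit reference
dark = (ref * 0.25).astype(np.uint8)                   # under-exposed probe
out = adaptive_normalise(dark, ref)
print(out.std() > dark.std())  # True: contrast restored for the dark probe
```

    A well-lit probe scores near 1.0 and bypasses equalisation, which is exactly the behaviour the paper argues prevents the accuracy loss on well-lit images.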

  20. A Substituting Meaning for the Equals Sign in Arithmetic Notating Tasks

    ERIC Educational Resources Information Center

    Jones, Ian; Pratt, Dave

    2012-01-01

    Three studies explore arithmetic tasks that support both substitutive and basic relational meanings for the equals sign. The duality of meanings enabled children to engage meaningfully and purposefully with the structural properties of arithmetic statements in novel ways. Some, but not all, children were successful at the adapted task and were…

  1. Method to improve the performance of the optical modulation format identification system based on asynchronous amplitude histogram

    NASA Astrophysics Data System (ADS)

    Cui, Sheng; He, Sheng; Shang, Jin; Ke, Changjian; Fu, Songnian; Liu, Deming

    2015-06-01

    A method to improve the performance of the asynchronous amplitude histogram (AAH)-based optical modulation format identification (MFI) system is proposed. It is demonstrated that, with additional static dispersion compensation modules (SDCMs), polarization- and non-polarization-multiplexed (PM/NPM) signals can be distinguished simply from the AAH peak position difference, while the stringent chromatic dispersion limit imposed on the MFI method can be extended to desired values by selectively enabling the SDCMs to minimize the width-to-area ratio (WAR) of the AAH. Numerical simulations and experiments are carried out to demonstrate the effectiveness of this method.

  2. Bennett's acceptance ratio and histogram analysis methods enhanced by umbrella sampling along a reaction coordinate in configurational space

    NASA Astrophysics Data System (ADS)

    Kim, Ilsoo; Allen, Toby W.

    2012-04-01

    Free energy perturbation, a method for computing the free energy difference between two states, is often combined with non-Boltzmann biased sampling techniques in order to accelerate the convergence of free energy calculations. Here we present a new extension of the Bennett acceptance ratio (BAR) method by combining it with umbrella sampling (US) along a reaction coordinate in configurational space. In this approach, which we call Bennett acceptance ratio with umbrella sampling (BAR-US), the conditional histogram of energy difference (a mapping of the 3N-dimensional configurational space via a reaction coordinate onto 1D energy difference space) is weighted for marginalization with the associated population density along a reaction coordinate computed by US. This procedure produces marginal histograms of energy difference, from forward and backward simulations, with higher overlap in energy difference space, rendering free energy difference estimations using BAR statistically more reliable. In addition to BAR-US, two histogram analysis methods, termed Bennett overlapping histograms with US (BOH-US) and Bennett-Hummer (linear) least square with US (BHLS-US), are employed as consistency and convergence checks for free energy difference estimation by BAR-US. The proposed methods (BAR-US, BOH-US, and BHLS-US) are applied to a 1-dimensional asymmetric model potential, as has been used previously to test free energy calculations from non-equilibrium processes. We then consider the more stringent test of a 1-dimensional strongly (but linearly) shifted harmonic oscillator, which exhibits no overlap between two states when sampled using unbiased Brownian dynamics. We find that the efficiency of the proposed methods is enhanced over the original Bennett's methods (BAR, BOH, and BHLS) through fast uniform sampling of energy difference space via US in configurational space. We apply the proposed methods to the calculation of the electrostatic contribution to the absolute
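
    Bennett's acceptance ratio itself reduces to a one-dimensional self-consistent equation in the free energy difference. A minimal sketch of plain BAR (without the umbrella-sampling weighting this paper introduces), tested on Gaussian work distributions that satisfy the Crooks relation with a known answer; all numerical values are illustrative:

```python
import numpy as np
from scipy.optimize import brentq

def fermi(x):
    return 1.0 / (1.0 + np.exp(x))

def bar_free_energy(w_f, w_r):
    """Solve Bennett's self-consistent equation for the free energy
    difference (in kT units) from forward and reverse work samples."""
    w_f, w_r = np.asarray(w_f, dtype=float), np.asarray(w_r, dtype=float)
    m = np.log(len(w_f) / len(w_r))
    def imbalance(df):
        # Monotonic in df, so a bracketing root-finder is safe
        return fermi(m + w_f - df).sum() - fermi(-m + w_r + df).sum()
    return brentq(imbalance, -100.0, 100.0)

# Gaussian test case: forward work ~ N(dF + s^2/2, s^2) and reverse work
# ~ N(-dF + s^2/2, s^2) satisfy the Crooks relation with known dF
rng = np.random.default_rng(2)
dF, s = 3.0, 1.0
w_f = rng.normal(dF + s**2 / 2, s, 20000)
w_r = rng.normal(-dF + s**2 / 2, s, 20000)
print(bar_free_energy(w_f, w_r))  # close to 3.0
```

    The paper's BAR-US variant additionally reweights the conditional energy-difference histograms by the umbrella-sampling populations along the reaction coordinate before applying this estimator.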

  3. Adaptive Management

    EPA Science Inventory

    Adaptive management is an approach to natural resource management that emphasizes learning through management where knowledge is incomplete, and when, despite inherent uncertainty, managers and policymakers must act. Unlike a traditional trial and error approach, adaptive managem...

  4. Equal Opportunity and Racial Differences in IQ.

    ERIC Educational Resources Information Center

    Fagan, Joseph F.; Holland, Cynthia R.

    2002-01-01

    Administered an intelligence test to blacks and whites in 2 studies involving 254 community college students and 2 more studies involving 115 community college students. Results show that differences in knowledge between blacks and whites for items on an intelligence test, the meanings of words, can be eliminated when equal opportunities for…

  5. Gender Equality Policies and Higher Education Careers

    ERIC Educational Resources Information Center

    Berggren, Caroline

    2011-01-01

    Gender equality policies regulate the Swedish labour market, including higher education. This study analyses and discusses the career development of postgraduate students in the light of labour market influences. The principle of gender separation is used to understand these effects. Swedish register data encompassing information on 585…

  6. Gender Equality in Academia: A Critical Reflection

    ERIC Educational Resources Information Center

    Winchester, Hilary P. M.; Browning, Lynette

    2015-01-01

    Gender equality in academia has been monitored in Australia for the past three decades so it is timely to reflect on what progress has been made, what works, and what challenges remain. When data were first published on the gender composition of staff in Australian universities in the mid-1980s women comprised 20 per cent of academic staff and…

  7. 75 FR 53559 - Women's Equality Day, 2010

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-31

    ... ladies'' in the ``new code of laws'' of our fledgling country. It has taken the collective efforts of... Constitution. Standing on the shoulders of these trailblazers, we pay tribute to the brave women who dot the... remains committed to advancing women's equality in all areas of our society and around the world. I...

  8. Racial Equality. To Protect These Rights Series.

    ERIC Educational Resources Information Center

    McDonald, Laughlin

    A historical review of racial discrimination against Negroes is the scope of this volume, part of a series of six volumes which explore the basic American rights. These include due process of law, freedom of speech and religious freedom. This volume traces the development of racial equality in the legal system, explores the controversies and…

  9. 45 CFR 98.43 - Equal access.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION CHILD CARE AND DEVELOPMENT FUND Program Operations (Child Care Services)-Lead Agency and Provider Requirements § 98.43 Equal access. (a) The Lead Agency shall certify that the payment rates for the provision of child care services under this part...

  10. 45 CFR 98.43 - Equal access.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION CHILD CARE AND DEVELOPMENT FUND Program Operations (Child Care Services)-Lead Agency and Provider Requirements § 98.43 Equal access. (a) The Lead Agency shall certify that the payment rates for the provision of child care services under this part...

  11. 45 CFR 98.43 - Equal access.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Department of Health and Human Services GENERAL ADMINISTRATION CHILD CARE AND DEVELOPMENT FUND Program Operations (Child Care Services)-Lead Agency and Provider Requirements § 98.43 Equal access. (a) The Lead Agency shall certify that the payment rates for the provision of child care services under this part...

  12. 45 CFR 98.43 - Equal access.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION CHILD CARE AND DEVELOPMENT FUND Program Operations (Child Care Services)-Lead Agency and Provider Requirements § 98.43 Equal access. (a) The Lead Agency shall certify that the payment rates for the provision of child care services under this part...

  13. 45 CFR 98.43 - Equal access.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION CHILD CARE AND DEVELOPMENT FUND Program Operations (Child Care Services)-Lead Agency and Provider Requirements § 98.43 Equal access. (a) The Lead Agency shall certify that the payment rates for the provision of child care services under this part...

  14. Equality by Default: Women in Dual Roles

    ERIC Educational Resources Information Center

    Campbell, Ena

    1978-01-01

    The struggle for an Equal Rights Amendment (ERA) to the American Constitution is one of the most controversial issues of this era. Discusses the changing role of women amidst a fast-changing society, the styles of those opposing the women's revolution, the debate over women as persons, women in dual roles, and the implications of ERA for the world…

  15. Hawkins' Swan Song: Equalize Funding Now.

    ERIC Educational Resources Information Center

    Penning, Nick

    1990-01-01

    Representative Augustus Hawkins, outgoing chairman of the U.S. House Committee on Education and Labor, stunned state governments and the education community in January by introducing a bill requiring each state to certify that its public education funds are or will be equalized by 1996, contingent on federal funding redistribution penalties. (MLH)

  16. Disability in the UK: Measuring Equality

    ERIC Educational Resources Information Center

    Purdam, Kingsley; Afkhami, Reza; Olsen, Wendy; Thornton, Patricia

    2008-01-01

    In this article we identify the key survey data for examining the issue of equality in the lives of disabled people in the UK. Such data is essential for assessing change in quality of life over time and for the evaluation of the impact of policy initiatives. For each data source we consider definitions, data collection, issue coverage, sample…

  17. Public Schools and Equal Educational Opportunity

    ERIC Educational Resources Information Center

    Green, Robert L.

    1974-01-01

    Discusses several steps necessary to obtaining equal educational opportunities. These include placing better teaching staffs in urban schools, providing extensive field work for student teachers interested in teaching minority children, reevaluating the current curriculum for occurrences of racism and devising extracurricular activities for…

  18. The Path to Equal Rights in Michigan

    ERIC Educational Resources Information Center

    Gratz, Jennifer

    2007-01-01

    The litigant in a historic reverse-discrimination case against the University of Michigan, and subsequently the leader of a Michigan ballot initiative that carried the day against long odds, recounts how her simple call for equal treatment under the law persuaded the people of her state that color-conscious preferences are wrong.

  19. Equal Educational Opportunity for Puerto Ricans.

    ERIC Educational Resources Information Center

    O'Connor, Mary

    Puerto Ricans as a group are more disadvantaged economically, politically, and socially than any other ethnic minority. This marginalization is partly due to the educational system's discriminatory practices which deprive the vast majority of Puerto Rican children of equal educational opportunities. The educational problems of Puerto Ricans stem…

  20. Equity, Equal Opportunities, Gender and Organization Performance.

    ERIC Educational Resources Information Center

    Standing, Hilary; Baume, Elaine

    The issues of equity, equal opportunities, gender, and organization performance in the health care sector worldwide was examined. Information was gathered from the available literature and from individuals in 17 countries. The analysis highlighted the facts that employment equity debates and policies refer largely to high-income countries and…

  1. Transversal filter for parabolic phase equalization

    NASA Technical Reports Server (NTRS)

    Kelly, Larry R. (Inventor); Waugh, Geoffrey S. (Inventor)

    1993-01-01

    An equalizer (10) for removing parabolic phase distortion from an analog signal (3), utilizing a pair of series connected transversal filters. The parabolic phase distortion is cancelled by generating an inverse parabolic approximation using a sinusoidal phase control filter (18). The signal (3) is then passed through an amplitude control filter (21) to remove magnitude ripple components.

  2. When Equal Masses Don't Balance

    ERIC Educational Resources Information Center

    Newburgh, Ronald; Peidle, Joseph; Rueckner, Wolfgang

    2004-01-01

    We treat a modified Atwood's machine in which equal masses do not balance because of being in an accelerated frame of reference. Analysis of the problem illuminates the meaning of inertial forces, d'Alembert's principle, the use of free-body diagrams and the selection of appropriate systems for the diagrams. In spite of the range of these…

  3. Equalizing Multi-School Curriculum by Technology.

    ERIC Educational Resources Information Center

    Etowah County Board of Education, Gadsden, AL.

    A three year project aimed at providing equal educational opportunity for all students in the seven high schools of Etowah County, Alabama by implementing a county-wide curriculum using a flexible, rotating schedule, audio-graphic network, instructional television, a learning center, and individualized instruction. The report rates the project as…

  4. An American Perspective on Equal Educational Opportunities

    ERIC Educational Resources Information Center

    Russo, Charles; Perkins, Brian

    2004-01-01

    The United States Supreme Court ushered in a new era in American history on May 17, 1954 in its monumental ruling in "Brown v Board of Education," Topeka, Kansas. "Brown" is not only the Court's most significant decision on race and equal educational opportunities, but also ranks among the most important cases it has ever decided. In "Brown" a…

  5. The Circle and Sphere as Great Equalizers.

    ERIC Educational Resources Information Center

    Schwartzman, Steven

    1991-01-01

    From the equality of the ratios of the surface areas and volumes of a sphere and its circumscribed cylinder, the exploration of theorems relating the ratios of surface areas and volumes of a sphere and other circumscribed solids in three dimensions, and analogous questions relating two-dimensional concepts of perimeter and area is recounted. (MDH)
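
    The three-dimensional ratios alluded to here are Archimedes' classic result: both the surface area and the volume of a sphere are 2/3 those of its circumscribed cylinder, with π/4 playing the analogous role for the circle in its circumscribed square. A quick numerical check:

```python
import math

r = 1.0  # unit radius; the ratios are scale-invariant

# Sphere vs circumscribed cylinder (height 2r): both ratios equal 2/3
sphere_area, sphere_vol = 4 * math.pi * r**2, (4 / 3) * math.pi * r**3
cyl_area = 2 * math.pi * r * (2 * r) + 2 * math.pi * r**2  # side + two caps
cyl_vol = math.pi * r**2 * (2 * r)
print(sphere_area / cyl_area, sphere_vol / cyl_vol)  # both 2/3

# 2D analogue: circle vs circumscribed square, both ratios equal pi/4
circ_perim, circ_area = 2 * math.pi * r, math.pi * r**2
sq_perim, sq_area = 8 * r, 4 * r**2
print(circ_perim / sq_perim, circ_area / sq_area)  # both pi/4
```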

  6. How the Equality Act affects you.

    PubMed

    Irwin, Wendy

    This article examines what the Equality Act 2010 covers and how it affects nurses and healthcare organisations. It outlines the main groups the act sets out to protect and the protection it offers, and describes how this law differs from previous legislation.

  7. Some Are More Equal Than Others.

    ERIC Educational Resources Information Center

    Ohanian, Susan

    1998-01-01

    Contends that not all books are created equal--"Anna Karenina," for example, is worth more than Nancy Drew mysteries. Relates, in a personal narrative, that when this opinion was manifested in a newspaper column, hundreds of letters took issue with the idea. Reiterates that the literate teacher finds ways to convince students that literacy is of…

  8. An Equal Employment Opportunity Sensitivity Workshop

    ERIC Educational Resources Information Center

    Patten, Thomas H., Jr.; Dorey, Lester E.

    1972-01-01

    The equal employment opportunity sensitivity workshop seems to be a useful training device for getting an organization started on developing black and white change agents. A report on the establishment of such a workshop at the U.S. Army Tank Automotive Command (TACOM). Includes charts of design, characteristics, analysis of results, program…

  9. What Is Equality of Opportunity in Education?

    ERIC Educational Resources Information Center

    Lazenby, Hugh

    2016-01-01

    There is widespread disagreement about what equality of opportunity in education requires. For some it is that each child is legally permitted to go to school. For others it is that each child receives the same educational resources. Further interpretations abound. This fact presents a problem: when politicians or academics claim they are in…

  10. Three Utilities for the Equal Sign

    ERIC Educational Resources Information Center

    Jones, Ian; Pratt, Dave

    2005-01-01

    We compare the activity of young children using a microworld and a JavaScript relational calculator with the literature on children using traditional calculators. We describe how the children constructed different meanings for the equal sign in each setting. It appears that the nature of the meaning constructed is highly dependent on specificities…

  11. Gender Equality in Education: Definitions and Measurements

    ERIC Educational Resources Information Center

    Subrahmanian, R.

    2005-01-01

    International consensus on education priorities accords an important place to achieving gender justice in the educational sphere. Both the Dakar 'Education for All' goals and the Millennium Development goals emphasise two goals, in this regard. These two goals are distinguished as gender parity goals [achieving equal participation of girls and…

  12. The Equal Access Act: Recent Court Decisions.

    ERIC Educational Resources Information Center

    Bjorklun, Eugene C.

    1989-01-01

    Examines court decisions which led to the passage of the Equal Access Act of 1984. Although the act was designed to clarify the issue over the legality of permitting religious clubs to meet on school property, it may have created more confusion. Concludes that the Supreme Court may have to decide the issue. (SLM)

  13. Position Paper: NOx Measurement

    ERIC Educational Resources Information Center

    Hauser, Thomas R.; Shy, Carl M.

    1972-01-01

    Doubts about the accuracy of measured concentrations of nitrogen dioxide (NO2) in ambient air have led the Environmental Protection Agency to reassess both the analytical technique and the extent to which nitrogen oxides (NOx) control will need to satisfy federal laws. (BL)

  14. Precise Phase Comparator for Nearly Equal Frequencies

    NASA Technical Reports Server (NTRS)

    Reinhardt, V. S.; Adams, W. A.

    1982-01-01

    New circuit precisely compares phases of two RF signals nearly equal in frequency, such as two hydrogen-maser frequency standards. Measuring circuit minimizes interactions between two sources. Also stabilized against thermal effects and against noise that could produce erroneous readings. Heat sinking, buffer amplifiers, and low-noise zero-crossing detector make picosecond precision possible.

  15. Evaluating Faculty Performance Under the Equal Pay for Equal Work Doctrine

    ERIC Educational Resources Information Center

    Buzan, Bert Carl; Hunt, Thomas Lynn

    1976-01-01

    Faculty promotion and salary policies at the University of Texas at Austin are analyzed to determine whether male and female faculty members are rewarded equally for equal academic qualifications and performances. This regression analysis tends to support the discrimination hypothesis with respect to both promotion and salary policies. (Author/LBH)

  16. Experimental assessment of an automatic breast density classification algorithm based on principal component analysis applied to histogram data

    NASA Astrophysics Data System (ADS)

    Angulo, Antonio; Ferrer, Jose; Pinto, Joseph; Lavarello, Roberto; Guerrero, Jorge; Castaneda, Benjamín

    2015-01-01

    Breast parenchymal density is considered a strong indicator of cancer risk. However, measures of breast density are often qualitative and require the subjective judgment of radiologists. This work proposes a supervised algorithm to automatically assign a BI-RADS breast density score to a digital mammogram. The algorithm applies principal component analysis to the histograms of a training dataset of digital mammograms to create four different spaces, one for each BI-RADS category. Scoring is achieved by projecting the histogram of the image to be classified onto the four spaces and assigning it to the closest class. In order to validate the algorithm, a training set of 86 images and a separate testing database of 964 images were built. All mammograms were acquired in the craniocaudal view from female patients without any visible pathology. Eight experienced radiologists categorized the mammograms according to a BI-RADS score, and the mode of their evaluations was considered as ground truth. Results show better agreement between the algorithm and ground truth for the training set (kappa=0.74) than for the test set (kappa=0.44), which suggests the method may be used for BI-RADS classification but better training is required.
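
    The projection-and-nearest-class scheme described can be sketched with plain numpy: fit a small PCA subspace per BI-RADS class from training histograms, then assign a new histogram to the class whose subspace reconstructs it with the lowest error. The toy two-class data below is illustrative, not mammographic:

```python
import numpy as np

def fit_class_space(histograms, n_components=3):
    """Mean and leading principal components of one class's histograms."""
    h = np.asarray(histograms, dtype=float)
    mean = h.mean(axis=0)
    _, _, vt = np.linalg.svd(h - mean, full_matrices=False)
    return mean, vt[:n_components]

def reconstruction_error(hist, space):
    """Distance from a histogram to a class's PCA subspace."""
    mean, comps = space
    centred = hist - mean
    return np.linalg.norm(centred - comps.T @ (comps @ centred))

def classify(hist, spaces):
    """Assign the class whose subspace reconstructs the histogram best."""
    return int(np.argmin([reconstruction_error(hist, s) for s in spaces]))

# Toy two-class data: normalized Gaussian-bump "histograms" (hypothetical)
rng = np.random.default_rng(3)
bins = np.arange(64)

def fake_hist(center):
    h = np.exp(-0.5 * ((bins - center) / 6.0) ** 2) + rng.normal(0, 0.01, 64)
    return h / h.sum()

spaces = [fit_class_space([fake_hist(18) for _ in range(20)]),   # class 0
          fit_class_space([fake_hist(45) for _ in range(20)])]   # class 1
print(classify(fake_hist(18), spaces), classify(fake_hist(45), spaces))  # 0 1
```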

  17. Comparison of different algorithms for evaluation of respiratory sinus arrhythmia: cross-correlation function, histogram analysis and regression analysis.

    PubMed

    Schmitz, J M; Claus, D; Neundörfer, B; Handwerker, H O

    1995-01-01

    Three algorithms for the assessment of respiratory sinus arrhythmia (RSA) have been evaluated: cross-correlation function, histogram analysis and regression plot. The algorithms were tested experimentally in a group of 11 subjects. A cross-correlation function with a high time resolution (1 ms) was used for investigation of the time lag between instantaneous heart rate and respiration (CTL). This time lag was not affected by the breathing rate in a range of 8 to 29 breaths per minute. A mathematical model of CTL compared with experimental results indicates that respiratory sinus arrhythmia is probably modulated directly by the respiratory network in the brainstem rather than by a baroreflex, in the range of breathing rates investigated. Histogram analysis reflects the impact of inspiration and expiration on respiratory sinus arrhythmia. For this purpose heart rate changes were separated into two distributions (inspiration-expiration). The resulting value (U-VAL) of the Mann-Whitney U-test reflects the impact of respiration on heart rate variability. Regression analysis of heart rate versus respiration shows that the heart rate increase is more closely coupled to inspiration than the heart rate decrease is to expiration. Both CTL and U-VAL are thought to be useful parameters for clinical investigation of RSA.
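
    The cross-correlation time lag (CTL) between instantaneous heart rate and respiration can be estimated as the argmax of their cross-correlation. A minimal sketch on synthetic, evenly resampled signals; the 0.8 s delay and 10 Hz rate are illustrative assumptions, not values from the study:

```python
import numpy as np

def ctl_lag(heart_rate, respiration, fs):
    """Lag (s) of the peak of the cross-correlation between instantaneous
    heart rate and respiration; positive = heart rate lags respiration."""
    hr = heart_rate - heart_rate.mean()
    rs = respiration - respiration.mean()
    xc = np.correlate(hr, rs, mode="full")
    lag = np.argmax(xc) - (len(rs) - 1)
    return lag / fs

fs = 10.0                             # 10 Hz resampled series (illustrative)
t = np.arange(0, 60, 1 / fs)
resp = np.sin(2 * np.pi * 0.25 * t)   # 15 breaths per minute
hr = 70 + 5 * np.sin(2 * np.pi * 0.25 * (t - 0.8))  # RSA delayed by 0.8 s
print(ctl_lag(hr, resp, fs))  # ~0.8
```

    The study's 1 ms resolution would simply correspond to a much higher resampling rate fs.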

  18. Modeling the dark current histogram induced by gold contamination in complementary-metal-oxide-semiconductor image sensors

    NASA Astrophysics Data System (ADS)

    Domengie, F.; Morin, P.; Bauza, D.

    2015-07-01

    We propose a model for dark current induced by metallic contamination in a CMOS image sensor. Based on Shockley-Read-Hall kinetics, the expression of dark current proposed accounts for the electric field enhanced emission factor due to the Poole-Frenkel barrier lowering and phonon-assisted tunneling mechanisms. To that aim, we considered the distribution of the electric field magnitude and metal atoms in the depth of the pixel. Poisson statistics were used to estimate the random distribution of metal atoms in each pixel for a given contamination dose. Then, we performed a Monte-Carlo-based simulation for each pixel to set the number of metal atoms the pixel contained and the enhancement factor each atom underwent, and obtained a histogram of the number of pixels versus dark current for the full sensor. Excellent agreement with the dark current histogram measured on an ion-implanted gold-contaminated imager has been achieved, in particular, for the description of the distribution tails due to the pixel regions in which the contaminant atoms undergo a large electric field. The agreement remains very good when increasing the temperature by 15 °C. We demonstrated that the amplification of the dark current generated for the typical electric fields encountered in the CMOS image sensors, which depends on the nature of the metal contaminant, may become very large at high electric field. The electron and hole emissions and the resulting enhancement factor are described as a function of the trap characteristics, electric field, and temperature.
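
    The Poisson-plus-Monte-Carlo construction of the dark current histogram can be sketched as follows; the enhancement-factor distribution below is a made-up stand-in for the paper's field-dependent Poole-Frenkel and phonon-assisted tunneling factors, and all rates are illustrative:

```python
import numpy as np

def dark_current_map(n_pixels, mean_atoms, base_rate, enhancement_sampler, rng):
    """Per-pixel dark current: a Poisson-distributed number of contaminant
    atoms per pixel, each contributing base_rate times a field-dependent
    enhancement factor drawn at the atom's random position."""
    counts = rng.poisson(mean_atoms, n_pixels)
    dark = np.zeros(n_pixels)
    for i, n in enumerate(counts):
        if n:
            dark[i] = base_rate * enhancement_sampler(n, rng).sum()
    return dark

def sample_enhancement(n, rng):
    """Hypothetical stand-in: most atoms sit in low-field regions (factor
    near 1); a small tail sees strong field-enhanced emission."""
    return np.where(rng.random(n) < 0.95,
                    rng.uniform(1.0, 2.0, n),
                    rng.uniform(10.0, 100.0, n))

rng = np.random.default_rng(4)
dark = dark_current_map(100_000, 0.2, 1.0, sample_enhancement, rng)
hist, edges = np.histogram(dark[dark > 0], bins=50)
print((dark == 0).mean())  # ~exp(-0.2): pixels containing no contaminant atom
```

    The long tail of the resulting histogram comes from the rare high-enhancement draws, mirroring the distribution tails the paper attributes to atoms in high-field pixel regions.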

  19. Elucidating the effects of adsorbent flexibility on fluid adsorption using simple models and flat-histogram sampling methods

    SciTech Connect

    Shen, Vincent K. Siderius, Daniel W.

    2014-06-28

    Using flat-histogram Monte Carlo methods, we investigate the adsorptive behavior of the square-well fluid in two simple slit-pore-like models intended to capture fundamental characteristics of flexible adsorbent materials. Both models require as input thermodynamic information about the flexible adsorbent material itself. An important component of this work involves formulating the flexible pore models in the appropriate thermodynamic (statistical mechanical) ensembles, namely, the osmotic ensemble and a variant of the grand-canonical ensemble. Two-dimensional probability distributions, which are calculated using flat-histogram methods, provide the information necessary to determine adsorption thermodynamics. For example, we are able to determine precisely adsorption isotherms, (equilibrium) phase transition conditions, limits of stability, and free energies for a number of different flexible adsorbent materials, distinguishable as different inputs into the models. While the models used in this work are relatively simple from a geometric perspective, they yield non-trivial adsorptive behavior, including adsorption-desorption hysteresis solely due to material flexibility and so-called “breathing” of the adsorbent. The observed effects can in turn be tied to the inherent properties of the bare adsorbent. Some of the effects are expected on physical grounds while others arise from a subtle balance of thermodynamic and mechanical driving forces. In addition, the computational strategy presented here can be easily applied to more complex models for flexible adsorbents.

  20. Fireplace adapters

    SciTech Connect

    Hunt, R.L.

    1983-12-27

    An adapter is disclosed for use with a fireplace. The stove pipe of a stove standing in a room to be heated may be connected to the flue of the chimney so that products of combustion from the stove may be safely exhausted through the flue and outwardly of the chimney. The adapter may be easily installed within the fireplace by removing the damper plate and fitting the adapter to the damper frame. Each of a pair of bolts has a portion which hooks over a portion of the damper frame and a threaded end depending from the hook portion and extending through a hole in the adapter. Nuts are threaded on the bolts and are adapted to force the adapter into a tight fit with the damper frame.

  1. Neural network with dynamically adaptable neurons

    NASA Technical Reports Server (NTRS)

    Tawel, Raoul (Inventor)

    1994-01-01

    This invention is an adaptive neuron for use in neural network processors. The adaptive neuron participates in the supervised learning phase of operation on a co-equal basis with the synapse matrix elements by adaptively changing its gain in a similar manner to the change of weights in the synapse matrix elements. In this manner, training time is decreased by as much as three orders of magnitude.

  2. New equal-area map projections for noncircular regions

    USGS Publications Warehouse

    Snyder, John P.

    1988-01-01

    A series of new equal-area map projections has been devised. Called Oblated Equal-Area, its lines of constant distortion follow approximately oval or rectangular paths instead of the circles of the Lambert Azimuthal Equal-Area projection or the straight lines of the Cylindrical Equal-Area projection. The projection series permits design of equal-area maps of oblong regions with less overall distortion of shape and scale than equal-area maps on other projections.

  3. Density equalizing map projections: A new algorithm

    SciTech Connect

    Merrill, D.W.; Selvin, S.; Mohr, M.S.

    1992-02-01

    In the study of geographic disease clusters, an alternative to traditional methods based on rates is to analyze case locations on a transformed map in which population density is everywhere equal. Although the analyst's task is thereby simplified, the specification of the density equalizing map projection (DEMP) itself is not simple and continues to be the subject of considerable research. Here a new DEMP algorithm is described, which avoids some of the difficulties of earlier approaches. The new algorithm (a) avoids illegal overlapping of transformed polygons; (b) finds the unique solution that minimizes map distortion; (c) provides constant magnification over each map polygon; (d) defines a continuous transformation over the entire map domain; (e) defines an inverse transformation; (f) can accept optional constraints such as fixed boundaries; and (g) can use commercially supported minimization software. Work is continuing to improve computing efficiency and improve the algorithm.

  5. Microscopic justification of the equal filling approximation

    SciTech Connect

    Perez-Martin, Sara; Robledo, L. M.

    2008-07-15

    The equal filling approximation, a procedure widely used in mean-field calculations to treat the dynamics of odd nuclei in a time-reversal invariant way, is justified as the consequence of a variational principle over an average energy functional. The ideas of statistical quantum mechanics are employed in the justification. As an illustration of the method, the ground and lowest-lying states of some octupole deformed radium isotopes are computed.

  6. When equal masses don't balance

    NASA Astrophysics Data System (ADS)

    Newburgh, Ronald; Peidle, Joseph; Rueckner, Wolfgang

    2004-05-01

    We treat a modified Atwood's machine in which equal masses do not balance because of being in an accelerated frame of reference. Analysis of the problem illuminates the meaning of inertial forces, d'Alembert's principle, the use of free-body diagrams and the selection of appropriate systems for the diagrams. In spite of the range of these applications the analysis does not require calculus, so the ideas are accessible even to first-year students.

  7. The equal effectiveness of different defensive strategies

    PubMed Central

    Zhang, Shuang; Zhang, Yuxin; Ma, Keming

    2015-01-01

    Plants have evolved a variety of defensive strategies to resist herbivory, but at the interspecific level, the relative effectiveness of these strategies has been poorly evaluated. In this study, we compared the level of herbivory between species that depend on ants as indirect defenders and species that rely primarily on their own direct defenses. Using a dataset of 871 species and 1,405 data points, we found that in general, ant-associated species had levels of herbivory equal to those of species that are unattractive to ants; the pattern was unaffected by plant life form, climate and phylogenetic relationships between species. Interestingly, species that offer both food and nesting spaces for ants suffered significantly lower herbivory compared to species that offer either food or nesting spaces only or no reward for ants. A negative relationship between herbivory and latitude was detected, but the pattern can be changed by ants. These findings suggest that, at the interspecific level, the effectiveness of different defensive strategies may be equal. Considering the effects of herbivory on plant performance and fitness, the equal effectiveness of different defensive strategies may play an important role in the coexistence of various species at the community scale. PMID:26267426

  8. Spatio-Temporal Equalizer for a Receiving-Antenna Feed Array

    NASA Technical Reports Server (NTRS)

    Mukai, Ryan; Lee, Dennis; Vilnrotter, Victor

    2010-01-01

    A spatio-temporal equalizer has been conceived as an improved means of suppressing multipath effects in the reception of aeronautical telemetry signals, and may be adaptable to radar and aeronautical communication applications as well. This equalizer would be an integral part of a system that would also include a seven-element planar array of receiving feed horns centered at the focal point of a paraboloidal antenna that would be nominally aimed at or near the aircraft that would be the source of the signal that one seeks to receive (see Figure 1). This spatio-temporal equalizer would consist mostly of a bank of seven adaptive finite-impulse-response (FIR) filters - one for each element in the array - and the outputs of the filters would be summed (see Figure 2). The combination of the spatial diversity of the feedhorn array and the temporal diversity of the filter bank would afford better multipath-suppression performance than is achievable by means of temporal equalization alone. The seven-element feed array would supplant the single feed horn used in a conventional paraboloidal ground telemetry-receiving antenna. The radio-frequency telemetry signals received by the seven elements of the array would be digitized, converted to complex baseband form, and sent to the FIR filter bank, which would adapt itself in real time to enable reception of telemetry at a low bit error rate, even in the presence of multipath of the type found at many flight test ranges.
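
    The filter-bank-and-sum structure can be sketched as below. Tap adaptation (e.g. LMS against a known training sequence) is omitted, and the array shapes and filter lengths are assumptions for illustration.

```python
import numpy as np

def spatio_temporal_equalize(x, weights):
    """Combine a 7-element feed array: each element's complex-baseband
    stream passes through its own FIR filter and the outputs are summed.
    x: (7, N) samples; weights: (7, L) FIR taps per element."""
    n_elems, n_samp = x.shape
    y = np.zeros(n_samp, dtype=complex)
    for k in range(n_elems):
        y += np.convolve(x[k], weights[k])[:n_samp]   # per-element FIR
    return y

# Sanity check with pass-through taps: summing 7 identical unit streams.
x = np.ones((7, 10))
w = np.zeros((7, 4))
w[:, 0] = 1.0                    # identity filter on every element
y = spatio_temporal_equalize(x, w)
print(y.real)                    # all 7.0
```

    In a real receiver the taps would be updated jointly, so the bank can trade off spatial and temporal degrees of freedom against multipath.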

  9. Adaptive SPECT

    PubMed Central

    Barrett, Harrison H.; Furenlid, Lars R.; Freed, Melanie; Hesterman, Jacob Y.; Kupinski, Matthew A.; Clarkson, Eric; Whitaker, Meredith K.

    2008-01-01

    Adaptive imaging systems alter their data-acquisition configuration or protocol in response to the image information received. An adaptive pinhole single-photon emission computed tomography (SPECT) system might acquire an initial scout image to obtain preliminary information about the radiotracer distribution and then adjust the configuration or sizes of the pinholes, the magnifications, or the projection angles in order to improve performance. This paper briefly describes two small-animal SPECT systems that allow this flexibility and then presents a framework for evaluating adaptive systems in general, and adaptive SPECT systems in particular. The evaluation is in terms of the performance of linear observers on detection or estimation tasks. Expressions are derived for the ideal linear (Hotelling) observer and the ideal linear (Wiener) estimator with adaptive imaging. Detailed expressions for the performance figures of merit are given, and possible adaptation rules are discussed. PMID:18541485
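
    For readers unfamiliar with the figure of merit, a toy sketch of the Hotelling observer on two-pixel "images" follows; the signal profile and noise covariance are invented for the demo and are not taken from the paper.

```python
import numpy as np

# Toy Hotelling observer: template w = S^-1 dg and detectability
# SNR^2 = dg^T S^-1 dg, with dg the mean signal difference and S the
# average covariance under the two hypotheses. All statistics invented.
rng = np.random.default_rng(3)
n = 10000
signal = np.array([1.0, 0.5])                    # assumed signal profile
cov = [[2.0, 0.5], [0.5, 1.0]]                   # assumed noise covariance
absent = rng.multivariate_normal([10.0, 10.0], cov, n)   # signal-absent
present = absent + signal                        # signal-present

dg = present.mean(axis=0) - absent.mean(axis=0)  # mean difference image
S = 0.5 * (np.cov(absent.T) + np.cov(present.T))
w = np.linalg.solve(S, dg)                       # Hotelling template
snr2 = dg @ w                                    # detectability SNR^2
print(snr2)
```

    An adaptive system would choose the configuration (pinholes, magnification, angles) that maximizes this detectability on the task of interest.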

  10. Adaptive Computing.

    ERIC Educational Resources Information Center

    Harrell, William

    1999-01-01

    Provides information on various adaptive technology resources available to people with disabilities. (Contains 19 references, an annotated list of 129 websites, and 12 additional print resources.) (JOW)

  11. Contour adaptation.

    PubMed

    Anstis, Stuart

    2013-01-01

    It is known that adaptation to a disk that flickers between black and white at 3-8 Hz on a gray surround renders invisible a congruent gray test disk viewed afterwards. This is contrast adaptation. We now report that adapting simply to the flickering circular outline of the disk can have the same effect. We call this "contour adaptation." This adaptation does not transfer interocularly, and apparently applies only to luminance, not color. One can adapt selectively to only some of the contours in a display, making only these contours temporarily invisible. For instance, a plaid comprises a vertical grating superimposed on a horizontal grating. If one first adapts to appropriate flickering vertical lines, the vertical components of the plaid disappear and it looks like a horizontal grating. Also, we simulated a Cornsweet (1970) edge, and we selectively adapted out the subjective and objective contours of a Kanizsa (1976) subjective square. By temporarily removing edges, contour adaptation offers a new technique to study the role of visual edges, and it demonstrates how brightness information is concentrated in edges and propagates from them as it fills in surfaces.

  12. [Research on K-means clustering segmentation method for MRI brain image based on selecting multi-peaks in gray histogram].

    PubMed

    Chen, Zhaoxue; Yu, Haizhong; Chen, Hao

    2013-12-01

    To solve the problem of traditional K-means clustering in which initial clustering centers are selected randomly, we proposed a new K-means segmentation algorithm based on robustly selecting 'peaks' standing for White Matter, Gray Matter and Cerebrospinal Fluid in the multi-peaks gray histogram of an MRI brain image. The new algorithm takes the gray values of the selected histogram 'peaks' as the initial K-means clustering centers and can segment the MRI brain image into the three tissue classes more effectively, accurately and reliably. Extensive experiments have shown that the proposed algorithm can overcome many shortcomings of the traditional K-means clustering method, such as low efficiency, poor accuracy, poor robustness and long running time. The histogram 'peak' selecting idea of the proposed segmentation method is of more universal availability.
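
    A minimal sketch of the peak-seeded K-means idea, in which a simple local-maximum scan on a smoothed 256-bin histogram stands in for the paper's more robust peak selection; the synthetic trimodal image and all parameters are assumptions for the demo.

```python
import numpy as np

def histogram_peak_centers(img, k=3, min_sep=20, smooth=5):
    """Pick the k strongest, well-separated local maxima of the gray
    histogram as initial cluster centers."""
    hist, _ = np.histogram(img.ravel(), bins=256, range=(0, 256))
    hist = np.convolve(hist, np.ones(smooth) / smooth, mode="same")
    maxima = [g for g in range(1, 255)
              if hist[g] > hist[g - 1] and hist[g] >= hist[g + 1]]
    maxima.sort(key=lambda g: hist[g], reverse=True)   # strongest first
    chosen = []
    for g in maxima:
        if all(abs(g - c) > min_sep for c in chosen):  # suppress neighbors
            chosen.append(g)
        if len(chosen) == k:
            break
    return np.sort(np.array(chosen, dtype=float))

def kmeans_1d(img, centers, iters=20):
    """Plain 1-D K-means on pixel intensities from the given centers."""
    data = img.ravel().astype(float)
    for _ in range(iters):
        labels = np.argmin(np.abs(data[:, None] - centers[None, :]), axis=1)
        for j in range(len(centers)):
            if np.any(labels == j):
                centers[j] = data[labels == j].mean()
    return centers, labels.reshape(img.shape)

# Synthetic trimodal 'brain-like' image with modes near 40, 120 and 200.
rng = np.random.default_rng(1)
img = np.clip(np.concatenate([rng.normal(40, 5, 4000),
                              rng.normal(120, 5, 4000),
                              rng.normal(200, 5, 4000)]),
              0, 255).reshape(120, 100)
init = histogram_peak_centers(img)
centers, labels = kmeans_1d(img, init)
print(centers)        # close to [40, 120, 200]
```

    Because the seeds already sit near the WM/GM/CSF modes, the iteration converges to the same solution on every run, unlike random seeding.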

  13. Vision: two plus four equals six.

    PubMed

    Loew, Ellis R

    2014-08-18

    Using two UV-sensitive visual pigments and the UV-filtering properties of four mycosporine-like amino acids, mantis shrimp create six spectrally distinct UV receptors. This is yet another example of the unique ways in which mantis shrimp have adapted to extract information from their visual world. PMID:25137589

  14. Comp Plan: A computer program to generate dose and radiobiological metrics from dose-volume histogram files

    SciTech Connect

    Holloway, Lois Charlotte; Miller, Julie-Anne; Kumar, Shivani; Whelan, Brendan M.; Vinod, Shalini K.

    2012-10-01

    Treatment planning studies often require the calculation of a large number of dose and radiobiological metrics. To streamline these calculations, a computer program called Comp Plan was developed using MATLAB. Comp Plan calculates common metrics, including equivalent uniform dose, tumor control probability, and normal tissue complication probability from dose-volume histogram data. The dose and radiobiological metrics can be calculated for the original data or for an adjusted fraction size using the linear quadratic model. A homogeneous boost dose can be added to a given structure if desired. The final output is written to an Excel file in a format convenient for further statistical analysis. Comp Plan was verified by independent calculations. A lung treatment planning study comparing 45 plans for 7 structures using up to 6 metrics for each structure was successfully analyzed within approximately 5 minutes with Comp Plan. The code is freely available from the authors on request.
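
    One of the metrics Comp Plan reports, the generalized equivalent uniform dose (gEUD), is computed from a differential DVH as gEUD = (sum_i v_i d_i^a)^(1/a), with v_i the fractional volume at dose d_i and a a tissue-specific parameter. Comp Plan itself is MATLAB; the Python fragment below and its numbers are purely illustrative, not the authors' code.

```python
import numpy as np

def geud(dose_bins, vol_fracs, a):
    """Generalized EUD from differential DVH data: (sum v_i d_i^a)^(1/a)."""
    vol_fracs = np.asarray(vol_fracs, float)
    vol_fracs = vol_fracs / vol_fracs.sum()            # normalize volumes
    dose = np.asarray(dose_bins, float)
    return float(np.sum(vol_fracs * dose ** a) ** (1.0 / a))

# Uniform 60 Gy dose -> gEUD is 60 Gy for any 'a'.
print(geud([60.0], [1.0], a=-10))
# Half the volume at 50 Gy, half at 70 Gy, tumor-like negative 'a':
print(geud([50.0, 70.0], [0.5, 0.5], a=-10))   # pulled toward the cold spot
```

    Negative values of a weight cold spots heavily (tumors), while large positive values weight hot spots (serial organs at risk).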

  16. Climate adaptation

    NASA Astrophysics Data System (ADS)

    Kinzig, Ann P.

    2015-03-01

    This paper is intended as a brief introduction to climate adaptation in a conference devoted otherwise to the physics of sustainable energy. Whereas mitigation involves measures to reduce the probability of a potential event, such as climate change, adaptation refers to actions that lessen the impact of climate change. Mitigation and adaptation differ in other ways as well. Adaptation does not necessarily have to be implemented immediately to be effective; it only needs to be in place before the threat arrives. Also, adaptation does not necessarily require global, coordinated action; many effective adaptation actions can be local. Some urban communities, because of land-use change and the urban heat-island effect, currently face changes similar to some expected under climate change, such as changes in water availability, heat-related morbidity, or changes in disease patterns. Concern over those impacts might motivate the implementation of measures that would also help in climate adaptation, despite skepticism among some policy makers about anthropogenic global warming. Studies of ancient civilizations in the southwestern US lend some insight into factors that may or may not be important to successful adaptation.

  17. Local structure of equality constrained NLP problems

    SciTech Connect

    Mari, J.

    1994-12-31

    We show that locally around a feasible point, the behavior of an equality constrained nonlinear program is described by the gradient and the Hessian of the Lagrangian on the tangent subspace. In particular this holds true for reduced gradient approaches. Applying the same ideas to the control of nonlinear ODEs, one can devise first and second order methods that can also be applied to stiff problems. We finally describe an application of these ideas to the optimization of the production of human growth factor by fed-batch fermentation.

  18. Equal-Curvature X-Ray Telescopes

    NASA Technical Reports Server (NTRS)

    Saha, Timo T.; Zhang, William

    2002-01-01

    We introduce a new type of x-ray telescope design: an Equal-Curvature telescope. We simply add a second order axial sag to the base grazing incidence cone-cone telescope. The radius of curvature of the sag terms is the same on the primary surface and on the secondary surface. The design is optimized so that the on-axis image spot at the focal plane is minimized. The on-axis RMS (root mean square) spot diameter of the two telescopes studied is less than 0.2 arc-seconds. The off-axis performance is comparable to equivalent Wolter type 1 telescopes.

  19. Mergers of equals: understanding the dynamics.

    PubMed

    Rovner, J

    1997-06-01

    The current wave of mergers in the healthcare field is marked by demands from staff that their organizations not lose autonomy--and one way leaders are getting agreements signed is to assert that each merging group will be treated as equal in power. But such mergers, says consultant Stephen L. Ummel, can lead to "inertia, indecisiveness and confusion". Leaders of three healthcare systems--Partners in Boston, BJC in St. Louis and Lifespan in Rhode Island--cite four major lessons that others planning mergers should take to heart.

  20. 34 CFR 21.1 - Equal Access to Justice Act.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 34 Education 1 2013-07-01 2013-07-01 false Equal Access to Justice Act. 21.1 Section 21.1 Education Office of the Secretary, Department of Education EQUAL ACCESS TO JUSTICE General § 21.1 Equal Access to Justice Act. (a) The Equal Access to Justice Act (the Act) provides for the award of fees...

  1. 34 CFR 21.1 - Equal Access to Justice Act.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 34 Education 1 2014-07-01 2014-07-01 false Equal Access to Justice Act. 21.1 Section 21.1 Education Office of the Secretary, Department of Education EQUAL ACCESS TO JUSTICE General § 21.1 Equal Access to Justice Act. (a) The Equal Access to Justice Act (the Act) provides for the award of fees...

  2. 34 CFR 21.1 - Equal Access to Justice Act.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 34 Education 1 2012-07-01 2012-07-01 false Equal Access to Justice Act. 21.1 Section 21.1 Education Office of the Secretary, Department of Education EQUAL ACCESS TO JUSTICE General § 21.1 Equal Access to Justice Act. (a) The Equal Access to Justice Act (the Act) provides for the award of fees...

  3. 34 CFR 21.1 - Equal Access to Justice Act.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 34 Education 1 2010-07-01 2010-07-01 false Equal Access to Justice Act. 21.1 Section 21.1 Education Office of the Secretary, Department of Education EQUAL ACCESS TO JUSTICE General § 21.1 Equal Access to Justice Act. (a) The Equal Access to Justice Act (the Act) provides for the award of fees...

  4. 34 CFR 21.1 - Equal Access to Justice Act.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 34 Education 1 2011-07-01 2011-07-01 false Equal Access to Justice Act. 21.1 Section 21.1 Education Office of the Secretary, Department of Education EQUAL ACCESS TO JUSTICE General § 21.1 Equal Access to Justice Act. (a) The Equal Access to Justice Act (the Act) provides for the award of fees...

  5. 48 CFR 52.211-6 - Brand name or equal.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 2 2011-10-01 2011-10-01 false Brand name or equal. 52....211-6 Brand name or equal. As prescribed in 11.107(a), insert the following provision: Brand Name or Equal (AUG 1999) (a) If an item in this solicitation is identified as “brand name or equal,”...

  6. 48 CFR 52.211-6 - Brand name or equal.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 2 2010-10-01 2010-10-01 false Brand name or equal. 52....211-6 Brand name or equal. As prescribed in 11.107(a), insert the following provision: Brand Name or Equal (AUG 1999) (a) If an item in this solicitation is identified as “brand name or equal,”...

  7. Spearman's g and the Problem of Educational Equality.

    ERIC Educational Resources Information Center

    Jensen, Arthur R.

    1991-01-01

    Criticizes approach to equal education that seeks equality of outcome as well as equality of opportunity. Discusses Spearman's theory of g that attempts to explain individual differences in intelligence. Contrasts efforts at genuinely reducing inequality of outcome, including Aptitude X Treatment Interaction, Mastery Learning, and Thinking Skills…

  8. 47 CFR 36.191 - Equal access equipment.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 2 2011-10-01 2011-10-01 false Equal access equipment. 36.191 Section 36.191... AND RESERVES FOR TELECOMMUNICATIONS COMPANIES 1 Telecommunications Property Equal Access Equipment § 36.191 Equal access equipment. (a) Equal access investment includes only initial...

  9. Effect of TRMM boost on oceanic rain rate estimates based on microwave brightness temperature histograms

    NASA Astrophysics Data System (ADS)

    Chiu, L. S.; Shin, D.; Chokngamwong, R.

    2006-05-01

    A statistical emission-based passive microwave retrieval algorithm developed by Wilheit, Chang and Chiu (1991) has been used to estimate space/time oceanic rainfall. The algorithm has been applied to Special Sensor Microwave Imager (SSM/I) data taken on board the Defense Meteorological Satellite Program (DMSP) satellites to provide monthly oceanic rainfall over 2.5°×2.5° and 5°×5° latitude-longitude boxes by the Global Precipitation Climatology Project-Polar Satellite Precipitation Data Center (GPCP-PSPDC, URL: http://gpcp-pspdc.gmu.edu) as part of NASA's contribution to the GPCP. The algorithm has been adapted to the Tropical Rainfall Measuring Mission (TRMM) Microwave Imager (TMI) data to produce a TRMM Level 3 standard product (3A11) over 5°×5° latitude/longitude boxes. To extend the TRMM mission, the TRMM satellite was boosted to a higher altitude thus changing the rain rate-brightness temperature relation and other rain rate parameters in the estimation procedure. Comparison with the SSM/I product showed a statistically significant difference between the pre- and post-boost TMI monthly rain rates. Our results showed that the difference can be reconciled in terms of the changes in earth's incidence angle of TMI, the freezing level height, and the beam-filling correction factor. After the incorporation of these changes for the post-boost data, there is no significant difference between the pre- and post-boost 3A11 data.

  10. Is Primatology an equal-opportunity discipline?

    PubMed

    Addessi, Elsa; Borgi, Marta; Palagi, Elisabetta

    2012-01-01

    The proportion of women occupying academic positions in biological sciences has increased in the past few decades, but women are still under-represented in senior academic ranks compared to their male colleagues. Primatology has been often singled out as a model of "equal-opportunity" discipline because of the common perception that women are more represented in Primatology than in similar fields. But is this indeed true? Here we show that, although in the past 15 years the proportion of female primatologists increased from the 38% of the early 1990s to the 57% of 2008, Primatology is far from being an "equal-opportunity" discipline, and suffers from the "glass ceiling" phenomenon like all the other scientific disciplines examined so far. In fact, even if Primatology does attract more female students than males, at the full professor level male members significantly outnumber females. Moreover, regardless of position, IPS male members publish significantly more than their female colleagues. Furthermore, when analyzing gender difference in scientific productivity in relation to the name order in the publications, it emerged that the scientific achievements of female primatologists (in terms of number and type of publications) do not always match their professional achievements (in terms of academic position). However, the gender difference in the IPS members' number of publications does not correspond to a similar difference in their scientific impact (as measured by their H index), which may indicate that female primatologists' fewer articles are of higher impact than those of their male colleagues.

  11. Equal area rule methods for ternary systems

    SciTech Connect

    Shyu, G.S.; Hanif, N.S.M.; Alvarado, J.F.J.; Hall, K.R.; Eubank, P.T.

    1995-12-01

    The phase equilibrium behavior of fluid mixtures is an important design consideration for both chemical processes and oil production. Eubank and Hall have recently shown the equal area rule (EAR) applies to the composition derivative of the Gibbs energy of a binary system at fixed pressure and temperature regardless of derivative continuity. A sufficient condition for equilibria, EAR is faster and simpler than either the familiar tangent-line method or the area method of Eubank et al. Here, the authors show that EAR can be extended to ternary systems exhibiting one, two, or three phases at equilibrium. A single directional vector is searched in composition space; at equilibrium, this vector is the familiar tie line. A sensitive criterion for equilibrium under EAR is equality of orthogonal derivatives such as $(\partial g/\partial x_1)_{x_2,P,T}$ at the end points ($\alpha$ and $\beta$), where $g \equiv \Delta_m G/RT$. Repeated use of the binary algorithm published in the first reference allows rapid, simple solution of ternary problems, even with hand-held calculations for cases where the background model is simple (e.g., activity coefficient models) and the derivative continuous.

  12. Evolution of equal division among unequal partners.

    PubMed

    Debove, Stéphane; Baumard, Nicolas; André, Jean-Baptiste

    2015-02-01

    One of the hallmarks of human fairness is its insensitivity to power: although strong individuals are often in a position to coerce weak individuals, fairness requires them to share the benefits of cooperation equally. The existence of such egalitarianism is poorly explained by current evolutionary models. We present a model based on cooperation and partner choice that can account for the emergence of a psychological disposition toward fairness, whatever the balance of power between the cooperative partners. We model the evolution of the division of a benefit in an interaction similar to an ultimatum game, in a population made up of individuals of variable strength. The model shows that strong individuals will not receive any advantage from their strength, instead having to share the benefits of cooperation equally with weak individuals at the evolutionary equilibrium, a result that is robust to variations in population size and the proportion of weak individuals. We discuss how this model suggests an explanation for why egalitarian behaviors toward everyone, including the weak, should be more likely to evolve in humans than in any other species.

  13. A primer on equalization, decoding and non-iterative joint equalization and decoding

    NASA Astrophysics Data System (ADS)

    Myburgh, Hermanus C.; Olivier, Jan C.

    2013-12-01

    In this article, a general model for non-iterative joint equalization and decoding is systematically derived for use in systems transmitting convolutionally encoded BPSK-modulated information through a multipath channel, with and without interleaving. Optimal equalization and decoding are discussed first, by presenting the maximum likelihood sequence estimation and maximum a posteriori probability algorithms and relating them to equalization in single-carrier channels with memory, and to the decoding of convolutional codes. The non-iterative joint equalizer/decoder (NI-JED) is then derived for the case where no interleaver is used, as well as for the case when block interleavers of varying depths are used, and complexity analyses are performed in each case. Simulation results are performed to compare the performance of the NI-JED to that of a conventional turbo equalizer (CTE), and it is shown that the NI-JED outperforms the CTE, although at much higher computational cost. This article serves to explain the state-of-the-art to students and professionals in the field of wireless communication systems, presenting these fundamental topics clearly and concisely.

  14. Public hospital care: equal for all or equal for some? Evidence from the Philippines.

    PubMed

    James, Chris D; Peabody, John; Hanson, Kara; Solon, Orville

    2015-03-01

    In low- and middle-income countries, government budgets are rarely sufficient to cover a public hospital's operating costs. Shortfalls are typically financed through a combination of health insurance contributions and user charges. The mixed nature of this financing arrangement potentially creates financial incentives to treat patients with equal health need unequally. Using data from the Philippines, the authors analyzed whether doctors respond to such incentives. After controlling for a patient's condition, they found that patients using insurance, paying more for hospital accommodation, and being treated in externally monitored hospitals were likely to receive more care. This highlights the worrying possibility that public hospital patients with equal health needs are not always equally treated.

  15. Iterative Frequency Domain Decision Feedback Equalization and Decoding for Underwater Acoustic Communications

    NASA Astrophysics Data System (ADS)

    Zhao, Liang; Ge, Jian-Hua

    2012-12-01

    Single-carrier (SC) transmission with frequency-domain equalization (FDE) is today recognized as an attractive alternative to orthogonal frequency-division multiplexing (OFDM) for communication applications affected by the inter-symbol interference (ISI) caused by multipath propagation, especially in shallow-water channels. In this paper, we investigate an iterative receiver based on a minimum mean square error (MMSE) decision feedback equalizer (DFE) with symbol-rate and fractional-rate sampling in the frequency domain (FD) and a serially concatenated trellis coded modulation (SCTCM) decoder. Based on sound speed profiles (SSP) measured in the lake and the finite-element ray tracing (Bellhop) method, the shallow-water channel is constructed to evaluate the performance of the proposed iterative receiver. Performance results show that the proposed iterative receiver can significantly improve performance and achieve better data transmission than FD linear and adaptive decision feedback equalizers, especially when fractional-rate sampling is adopted.
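
    As a minimal concrete reference, the core MMSE frequency-domain equalization step for a single-carrier block can be sketched as follows. This is a plain linear FDE on a toy channel, not the paper's iterative DFE/SCTCM receiver; the channel taps and SNR value are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
h = np.array([1.0, 0.5, 0.25])                # toy multipath impulse response
N = 64
x = rng.choice([-1.0, 1.0], N)                # one BPSK block
H = np.fft.fft(h, N)                          # channel frequency response
y = np.fft.ifft(np.fft.fft(x) * H)            # circular-convolution channel (noiseless here)
snr = 100.0                                   # assumed linear SNR for the MMSE taps
W = np.conj(H) / (np.abs(H) ** 2 + 1 / snr)   # per-bin MMSE equalizer coefficients
x_hat = np.real(np.fft.ifft(np.fft.fft(y) * W))
bits = np.sign(x_hat)                         # hard decisions
```

    An iterative DFE receiver would feed decisions (or decoder soft output) back to cancel residual ISI before re-equalizing; the one-shot linear step above is its inner building block.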

  16. rp Process and Masses of N ≈ Z ≈ 34 Nuclides

    SciTech Connect

    Savory, J.; Schury, P.; Bachelet, C.; Block, M.; Bollen, G.; Facina, M.; Folden, C. M. III; Guenaut, C.; Kwan, E.; Kwiatkowski, A. A.; Morrissey, D. J.; Pang, G. K.; Prinke, A.; Ringle, R.; Schatz, H.; Schwarz, S.; Sumithrarachchi, C. S.

    2009-04-03

    High-precision Penning-trap mass measurements of the N ≈ Z ≈ 34 nuclides ⁶⁸Se, ⁷⁰Se, ⁷⁰ᵐBr, and ⁷¹Br were performed, reaching experimental uncertainties of 0.5-15 keV. The new and improved mass data, together with theoretical Coulomb displacement energies, were used as input for rp process network calculations. An increase in the effective lifetime of the waiting point nucleus ⁶⁸Se was found, and more precise information was obtained on the luminosity during a type I x-ray burst along with the final elemental abundances after the burst.

  17. Evaluation of breast cancer using intravoxel incoherent motion (IVIM) histogram analysis: comparison with malignant status, histological subtype, and molecular prognostic factors

    PubMed Central

    Cho, Gene Young; Moy, Linda; Kim, Sungheon G.; Baete, Steven H.; Moccaldi, Melanie; Babb, James S.; Sodickson, Daniel K.; Sigmund, Eric E.

    2016-01-01

    Purpose To examine heterogeneous breast cancer through intravoxel incoherent motion (IVIM) histogram analysis. Materials and methods This HIPAA-compliant, IRB-approved retrospective study included 62 patients (age 48.44±11.14 years; 50 malignant lesions, 12 benign) who underwent contrast-enhanced 3 T breast MRI and diffusion-weighted imaging. Apparent diffusion coefficient (ADC) and IVIM biomarkers of tissue diffusivity (Dt), perfusion fraction (fp), and pseudo-diffusivity (Dp) were calculated using voxel-based analysis for the whole lesion volume. Histogram analysis was performed to quantify tumour heterogeneity. Comparisons were made using Mann–Whitney tests between benign/malignant status, histological subtype, and molecular prognostic factor status, while Spearman’s rank correlation was used to characterize the association between imaging biomarkers and prognostic factor expression. Results The average values of the ADC and IVIM biomarkers, Dt and fp, showed significant differences between benign and malignant lesions. Additional significant differences were found in the histogram parameters among tumour subtypes and molecular prognostic factor status. IVIM histogram metrics, particularly fp and Dp, showed significant correlation with hormonal factor expression. Conclusion Advanced diffusion imaging biomarkers show relationships with molecular prognostic factors and breast cancer malignancy. This analysis reveals novel diagnostic metrics that may explain some of the observed variability in treatment response among breast cancer patients. PMID:26615557
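
    The statistical comparisons named above follow a standard pattern; a synthetic sketch (fabricated numbers, not the study's data; `fp` stands in for the perfusion fraction biomarker) of how such group comparisons and correlations are typically computed:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
fp_benign = rng.normal(0.05, 0.01, 12)    # perfusion fraction, benign group
fp_malig = rng.normal(0.09, 0.02, 50)     # perfusion fraction, malignant group
# non-parametric group comparison, as in the study
u, p_group = stats.mannwhitneyu(fp_benign, fp_malig, alternative="two-sided")
# correlate the malignant-group biomarker with a synthetic factor expression
receptor_expr = fp_malig * 10 + rng.normal(0, 0.05, 50)
rho, p_corr = stats.spearmanr(fp_malig, receptor_expr)
```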

  18. Histogram analysis as a method for determining the line tension of a three-phase contact region by Monte Carlo simulations.

    PubMed

    Djikaev, Yuri

    2005-11-01

    A method is proposed for determining the line tension, which is the main physical characteristic of a three-phase contact region, by Monte Carlo (MC) simulations. The key idea of the proposed method is that if a three-phase equilibrium involves a three-phase contact region, the probability distribution of states of a system as a function of two order parameters depends not only on the surface tension, but also on the line tension. This probability distribution can be obtained as a normalized histogram by appropriate MC simulations, so one can use the combination of histogram analysis and finite-size scaling to study the properties of a three-phase contact region. Every histogram, and the results extracted therefrom, will depend on the size of the simulated system. Carrying out MC simulations for a series of system sizes and extrapolating the results, obtained from the corresponding series of histograms, to infinite size, one can determine the line tension of the three-phase contact region and the interfacial tensions of all three interfaces (and hence the contact angles) in an infinite system. To illustrate the proposed method, it is applied to a three-dimensional ternary fluid mixture, in which molecular pairs of like species do not interact whereas those of unlike species interact as hard spheres. The simulated results are in agreement with expectations.
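
    The finite-size extrapolation at the heart of the method can be illustrated with synthetic data: if a free-energy-like quantity decomposes into volume, surface, and line contributions, the line-tension coefficient is recovered by fitting across a series of system sizes. The coefficients below are invented for illustration, not taken from the hard-sphere mixture of the paper.

```python
import numpy as np

L = np.array([8, 12, 16, 24, 32], dtype=float)   # simulated system sizes
tau_true, f_s, f_v = 0.7, 2.0, 0.5               # synthetic coefficients
F = f_v * L**3 + f_s * L**2 + tau_true * L       # volume + surface + line terms
coeffs = np.polyfit(L, F, 3)                     # fit; returns [L^3, L^2, L^1, const]
tau_est = coeffs[2]                              # line-tension-like coefficient
```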

  19. Pressure equalizing photovoltaic assembly and method

    DOEpatents

    Dinwoodie, Thomas L.

    2003-05-27

    Each PV assembly of an array of PV assemblies comprises a base, a PV module and a support assembly securing the PV module to a position overlying the upper surface of the base. Vents are formed through the base. A pressure equalization path extends from the outer surface of the PV module, past the peripheral edge of the PV module, to and through at least one of the vents, and to the lower surface of the base to help reduce wind uplift forces on the PV assembly. The PV assemblies may be interengaged, such as by interengaging the bases of adjacent PV assemblies. The base may include a main portion and a cover and the bases of adjacent PV assemblies may be interengaged by securing the covers of adjacent bases together.

  20. Pressure-equalizing PV assembly and method

    DOEpatents

    Dinwoodie, Thomas L.

    2004-10-26

    Each PV assembly of an array of PV assemblies comprises a base, a PV module and a support assembly securing the PV module to a position overlying the upper surface of the base. Vents are formed through the base. A pressure equalization path extends from the outer surface of the PV module, past the PV module, to and through at least one of the vents, and to the lower surface of the base to help reduce wind uplift forces on the PV assembly. The PV assemblies may be interengaged, such as by interengaging the bases of adjacent PV assemblies. The base may include a main portion and a cover and the bases of adjacent PV assemblies may be interengaged by securing the covers of adjacent bases together.

  1. Acoustical numerology and lucky equal temperaments

    NASA Astrophysics Data System (ADS)

    Hall, Donald E.

    1988-04-01

    Equally tempered musical scales with N steps per octave are known to work especially well in approximating justly tuned intervals for such values as N=12, 19, 31, and 53. A quantitative measure of the closeness of such fits is suggested, in terms of the probabilities of coming as close to randomly chosen intervals as to the justly tuned targets. When two or more harmonic intervals are considered simultaneously, this involves a Monte Carlo evaluation of the probabilities. The results can be used to gauge how much advantage the special values of N mentioned above have over others. This article presents the rationale and method of computation, together with illustrative results in a few of the most interesting cases. References are provided to help relate these results to earlier works by music theorists.
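
    The special role of N = 12, 19, 31, and 53 can be checked directly by measuring how far the nearest N-step approximation lands from a few justly tuned targets. This is a simple worst-case-error measure, not the probabilistic, Monte-Carlo goodness-of-fit measure the article proposes.

```python
import math

def max_error_cents(n, ratios=(3/2, 5/4, 6/5)):
    """Worst-case error, in cents, of the best n-step-per-octave
    approximation to each justly tuned interval ratio."""
    errs = []
    for r in ratios:
        steps = round(n * math.log2(r))               # nearest scale step
        errs.append(abs(1200 * (steps / n - math.log2(r))))
    return max(errs)

errors = {n: max_error_cents(n) for n in (12, 19, 31, 53)}
# the celebrated 53-division approximates these intervals far better than 12
```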

  2. Social Epigenetics and Equality of Opportunity

    PubMed Central

    Loi, Michele; Del Savio, Lorenzo; Stupka, Elia

    2013-01-01

    Recent epidemiological reports of associations between socioeconomic status and epigenetic markers that predict vulnerability to diseases are bringing to light substantial biological effects of social inequalities. Here, we start the discussion of the moral consequences of these findings. We firstly highlight their explanatory importance in the context of the research program on the Developmental Origins of Health and Disease (DOHaD) and the social determinants of health. In the second section, we review some theories of the moral status of health inequalities. Rather than a complete outline of the debate, we single out those theories that rest on the principle of equality of opportunity and analyze the consequences of DOHaD and epigenetics for these particular conceptions of justice. We argue that DOHaD and epigenetics reshape the conceptual distinction between natural and acquired traits on which these theories rely and might provide important policy tools to tackle unjust distributions of health. PMID:23864907

  3. Toothbrush Adaptations.

    ERIC Educational Resources Information Center

    Exceptional Parent, 1987

    1987-01-01

    Suggestions are presented for helping disabled individuals learn to use or adapt toothbrushes for proper dental care. A directory lists dental health instructional materials available from various organizations. (CB)

  4. Is Primatology an Equal-Opportunity Discipline?

    PubMed Central

    Borgi, Marta

    2012-01-01

    The proportion of women occupying academic positions in biological sciences has increased in the past few decades, but women are still under-represented in senior academic ranks compared to their male colleagues. Primatology has often been singled out as a model “equal-opportunity” discipline because of the common perception that women are better represented in Primatology than in similar fields. But is this indeed true? Here we show that, although in the past 15 years the proportion of female primatologists increased from 38% in the early 1990s to 57% in 2008, Primatology is far from being an “equal-opportunity” discipline, and suffers from the “glass ceiling” phenomenon, as do all the other scientific disciplines examined so far. In fact, even though Primatology does attract more female students than male, at the full professor level male members significantly outnumber females. Moreover, regardless of position, IPS male members publish significantly more than their female colleagues. Furthermore, when analyzing gender differences in scientific productivity in relation to name order in publications, it emerged that the scientific achievements of female primatologists (in terms of number and type of publications) do not always match their professional achievements (in terms of academic position). However, the gender difference in the number of publications by IPS members does not correspond to a similar difference in their scientific impact (as measured by their H index), which may indicate that female primatologists' fewer articles are of higher impact than those of their male colleagues. PMID:22272353

  5. Structure-Property Relationships in Atomic-Scale Junctions: Histograms and Beyond.

    PubMed

    Hybertsen, Mark S; Venkataraman, Latha

    2016-03-15

    Over the past 10 years, there has been tremendous progress in the measurement, modeling and understanding of structure-function relationships in single molecule junctions. Numerous research groups have addressed significant scientific questions, directed both to conductance phenomena at the single molecule level and to the fundamental chemistry that controls junction functionality. Many different functionalities have been demonstrated, including single-molecule diodes, optically and mechanically activated switches, and, significantly, physical phenomena with no classical analogues, such as those based on quantum interference effects. Experimental techniques for reliable and reproducible single molecule junction formation and characterization have led to this progress. In particular, the scanning tunneling microscope based break-junction (STM-BJ) technique has enabled rapid, sequential measurement of large numbers of nanoscale junctions allowing a statistical analysis to readily distinguish reproducible characteristics. Harnessing fundamental link chemistry has provided the necessary chemical control over junction formation, enabling measurements that revealed clear relationships between molecular structure and conductance characteristics. Such link groups (amines, methylsulfides, pyridines, etc.) maintain a stable lone pair configuration that selectively bonds to specific, undercoordinated transition metal atoms available following rupture of a metal point contact in the STM-BJ experiments. This basic chemical principle rationalizes the observation of highly reproducible conductance signatures. Subsequently, the method has been extended to probe a variety of physical phenomena ranging from basic I-V characteristics to more complex properties such as thermopower and electrochemical response.
By adapting the technique to a conducting cantilever atomic force microscope (AFM-BJ), simultaneous measurement of the mechanical characteristics of nanoscale junctions as they

  7. Downhole component with a pressure equalization passageway

    DOEpatents

    Hall, David R.; Pixton, David S.; Dahlgren, Scott; Reynolds, Jay T.; Breihan, James W.; Briscoe, Michael A.

    2006-08-22

    The present invention includes a downhole component adapted for transmitting downhole data. The downhole component includes a threaded end on a downhole component. The threaded end further includes an interior region, an exterior region, and a mating surface in which a cavity is formed. A data transmission element is disposed in the cavity and displaces a volume of the cavity. At least one passageway is formed in the threaded region between the interior and exterior regions. The passageway is in fluid communication with both the interior and exterior regions and thereby relieves pressure buildup of thread lubricant upon tool joint makeup.

  8. Equalization of loudspeaker response using balanced model truncation.

    PubMed

    Li, Xiansheng; Cai, Zhibo; Zheng, Chengshi; Li, Xiaodong

    2015-04-01

    Traditional loudspeaker equalization algorithms cannot determine the order of an equalizer before the whole equalization procedure has been completed, so designers have to make many attempts before settling on a proper order for the equalization filter. A method that overcomes this drawback is presented for loudspeaker equalization using balanced model truncation. The order of the equalizer can be easily decided using this algorithm, and the error between the model and the loudspeaker can also be readily controlled. Examples are presented, and the performance of the proposed method is discussed with comparative experiments.
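
    A sketch of the underlying square-root balanced truncation algorithm on a toy state-space model (the loudspeaker model itself is not reproduced here; the matrices below are illustrative). The Hankel singular values the algorithm exposes are what make the order choice explicit before any equalizer is built.

```python
import numpy as np
from scipy.linalg import cholesky, solve_continuous_lyapunov, svd

def balred(A, B, C, r):
    """Square-root balanced truncation of a stable state-space model."""
    P = solve_continuous_lyapunov(A, -B @ B.T)     # controllability Gramian
    Q = solve_continuous_lyapunov(A.T, -C.T @ C)   # observability Gramian
    Lc = cholesky(P, lower=True)
    Lo = cholesky(Q, lower=True)
    U, s, Vt = svd(Lo.T @ Lc)                      # s = Hankel singular values
    T = Lc @ Vt.T @ np.diag(s ** -0.5)             # balancing transform
    Ti = np.diag(s ** -0.5) @ U.T @ Lo.T
    return (Ti @ A @ T)[:r, :r], (Ti @ B)[:r], (C @ T)[:, :r], s

A = np.diag([-1.0, -2.0, -50.0, -60.0])            # toy 4th-order stable system
B = np.array([[1.0], [1.0], [0.01], [0.01]])
C = np.array([[1.0, 1.0, 0.01, 0.01]])
Ar, Br, Cr, hsv = balred(A, B, C, 2)               # keep the 2 dominant states
dc_full = (-C @ np.linalg.solve(A, B)).item()
dc_red = (-Cr @ np.linalg.solve(Ar, Br)).item()    # nearly identical DC gain
```

    Truncating after the states whose Hankel singular values are negligible bounds the model error a priori, which is exactly the "order decided in advance" property the abstract highlights.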

  9. The Equal Rights Amendment: Guaranteeing Equal Rights for Women Under the Constitution. Clearinghouse Publication 68.

    ERIC Educational Resources Information Center

    Commission on Civil Rights, Washington, DC.

    This report examines the effects that the ratification of the Equal Rights Amendment will have on laws concerning women. The amendment's impacts on divorced, married, and employed women, on women in the military and in school, and on women dependent on pensions, insurance, and social security are all analyzed. A discussion of the Constitutional…

  10. Teachers Negotiating Discourses of Gender (In) Equality: The Case of Equal Opportunities Reform in Andalusia

    ERIC Educational Resources Information Center

    Cubero, Mercedes; Santamaría, Andrés; Rebollo, Mª Ángeles; Cubero, Rosario; García, Rafael; Vega, Luisa

    2015-01-01

    This article is focused on the analysis of the narratives produced by a group of teachers, experts in coeducation, while they were discussing their everyday activities. They are responsible for the implementation of a Plan for Gender Equality in public secondary schools in Andalusia (Spain). This study is based on contributions about doing gender…

  11. School Finance, Equivalent Educational Expenditure, and the Income Distribution: Equal Dollars or Equal Chances for Success?

    ERIC Educational Resources Information Center

    Wilson, Kathryn; Lambright, Kristina; Smeeding, Timothy M.

    2006-01-01

    This article breaks new ground in the debate on school finance and equality of per pupil school expenditures. We are able to merge school district data with the individual and family data of the Panel Study of Income Dynamics (PSID). This allows us to examine both student and school district characteristics and to assess several measures of…

  12. Equality, Adequacy, and Stakes Fairness: Retrieving the Equal Opportunities in Education Approach

    ERIC Educational Resources Information Center

    Jacobs, Lesley A.

    2010-01-01

    Two approaches to making judgments about moral urgency in educational policy have prevailed in American law and public policy. One approach holds that educational policy should aspire to realizing equal opportunities in education for all. The other approach holds that educational policy should aspire to realizing adequate opportunities in…

  13. Scale adaptive compressive tracking.

    PubMed

    Zhao, Pengpeng; Cui, Shaohui; Gao, Min; Fang, Dan

    2016-01-01

    Recently, the compressive tracking (CT) method (Zhang et al. in Proceedings of European conference on computer vision, pp 864-877, 2012) has attracted much attention due to its high efficiency, but it cannot deal well with scale-changing objects because of its constant tracking box. To address this issue, in this paper we propose a scale adaptive CT approach, which adaptively adjusts the scale of the tracking box to the size variation of the objects. Our method improves CT in three aspects. Firstly, the scale of the tracking box is adaptively adjusted according to the size of the objects. Secondly, in the CT method, all compressive features are assumed to be independent and to contribute equally to the classifier; in fact, different compressive features have different confidence coefficients. In our proposed method, the confidence coefficients of the features are computed and used to weight their contributions to the classifier. Finally, in the CT method, the learning parameter λ is constant, which results in large tracking drift under object occlusion or large appearance variation. In our proposed method, a variable learning parameter λ is adopted, which can be adjusted according to the object appearance variation rate. Extensive experiments on the CVPR2013 tracking benchmark demonstrate the superior performance of the proposed method compared to state-of-the-art tracking algorithms. PMID:27386298
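
    The classifier update described above can be sketched as follows. The Gaussian-parameter update is the standard compressive-tracking running-average form; the policy mapping an appearance-change rate to the learning parameter λ is an assumed illustration, since the abstract does not give the exact formula.

```python
import math

def update_gaussian(mu, sigma, mu_new, sigma_new, lam):
    """Standard compressive-tracking running update of a feature's
    Gaussian parameters with learning parameter lam."""
    mu_out = lam * mu + (1 - lam) * mu_new
    sigma_out = math.sqrt(lam * sigma**2 + (1 - lam) * sigma_new**2
                          + lam * (1 - lam) * (mu - mu_new)**2)
    return mu_out, sigma_out

def adaptive_lam(change_rate, lam_min=0.5, lam_max=0.95):
    """Assumed policy: the faster the appearance changes (e.g. during
    occlusion), the larger lam, so the model holds on to its history."""
    return lam_min + (lam_max - lam_min) * min(change_rate, 1.0)
```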

  14. Prism adaptation by mental practice.

    PubMed

    Michel, Carine; Gaveau, Jérémie; Pozzo, Thierry; Papaxanthis, Charalambos

    2013-09-01

    The prediction of our actions and their interaction with the external environment is critical for sensorimotor adaptation. For instance, during prism exposure, which deviates laterally our visual field, we progressively correct movement errors by combining sensory feedback with forward model sensory predictions. However, very often we project our actions to the external environment without physically interacting with it (e.g., mental actions). An intriguing question is whether adaptation will occur if we imagine, instead of executing, an arm movement while wearing prisms. Here, we investigated prism adaptation during mental actions. In the first experiment, participants (n = 54) performed arm pointing movements before and after exposure to the optical device. They were equally divided into six groups according to prism exposure: Prisms-Active, Prisms-Imagery, Prisms-Stationary, Prisms-Stationary-Attention, No Conflict-Prisms-Imagery, No Prisms-Imagery. Adaptation, measured by the difference in pointing errors between pre-test and post-test, occurred only in Prisms-Active and Prisms-Imagery conditions. The second experiment confirmed the results of the first experiment and further showed that sensorimotor adaptation was mainly due to proprioceptive realignment in both Prisms-Active (n = 10) and Prisms-Imagery (n = 10) groups. In both experiments adaptation was greater following actual than imagined pointing movements. The present results are the first demonstration of prism adaptation by mental practice under prism exposure and they are discussed in terms of internal forward models and sensorimotor plasticity.

  15. BEDVH--A method for evaluating biologically effective dose volume histograms: Application to eye plaque brachytherapy implants

    SciTech Connect

    Gagne, Nolan L.; Leonard, Kara L.; Huber, Kathryn E.; Mignano, John E.; Duker, Jay S.; Laver, Nora V.; Rivard, Mark J.

    2012-02-15

    Purpose: A method is introduced to examine the influence of implant duration T, radionuclide, and radiobiological parameters on the biologically effective dose (BED) throughout the entire volume of regions of interest for episcleral brachytherapy using available radionuclides. This method is employed to evaluate a particular eye plaque brachytherapy implant in a radiobiological context. Methods: A reference eye geometry and a 16 mm COMS eye plaque loaded with ¹⁰³Pd, ¹²⁵I, or ¹³¹Cs sources were examined with dose distributions accounting for plaque heterogeneities. For a standardized 7 day implant, doses to 90% of the tumor volume (tumor D90) and to 10% of the organ-at-risk volumes (OAR D10) were calculated. The BED equation from Dale and Jones and published α/β and μ parameters were incorporated with dose volume histograms (DVHs) for various values of T, such as T = 7 days (i.e., tumor 7BED10 and OAR 7BED10). By calculating BED throughout the volumes, biologically effective dose volume histograms (BEDVHs) were developed for the tumor and OARs. The influence of T, radionuclide choice, and radiobiological parameters on tumor BEDVH and OAR BEDVH was examined. The nominal dose was scaled for shorter implants to achieve biological equivalence. Results: Tumor D90 values were 102, 112, and 110 Gy for ¹⁰³Pd, ¹²⁵I, and ¹³¹Cs, respectively. Corresponding tumor 7BED10 values were 124, 140, and 138 Gy, respectively. As T decreased from 7 to 0.01 days, the isobiologically effective prescription dose decreased by a factor of three. As expected, tumor 7BEDVH did not significantly change as a function of radionuclide half-life but varied by 10% due to radionuclide dose distribution. Variations in reported radiobiological parameters caused tumor 7BED10 to deviate by up to 46%. Over the range of OAR
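
    The BEDVH construction reduces to mapping each voxel's physical dose to a BED and accumulating a cumulative histogram. A minimal sketch with a user-supplied relative-effectiveness function follows; the full Dale and Jones RE term, which accounts for source decay and sublethal-damage repair via μ, is omitted, and the doses and α/β value below are synthetic.

```python
import numpy as np

def bedvh(voxel_dose_gy, relative_effectiveness, bins=100):
    """Cumulative biologically effective dose volume histogram:
    fraction of volume receiving at least each BED level."""
    bed = voxel_dose_gy * relative_effectiveness(voxel_dose_gy)
    edges = np.linspace(0.0, bed.max(), bins + 1)
    frac = np.array([(bed >= e).mean() for e in edges])
    return edges, frac

# synthetic voxel doses and a placeholder RE term (illustration only)
alpha_beta = 10.0                                   # hypothetical alpha/beta, Gy
doses = np.random.default_rng(0).gamma(5.0, 20.0, 10_000)
edges, frac = bedvh(doses, lambda d: 1.0 + d / alpha_beta)
```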

  16. Differential Trends Toward Equality Between Whites and Nonwhites

    ERIC Educational Resources Information Center

    Palmore, Erdman; Whittington, Frank J.

    1970-01-01

    By using an equality index to measure the amount of overlap between percentage distributions of whites and nonwhites, it is shown that nonwhites have substantially progressed toward equality in income, education, occupation, employment, and housing. (JM)
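
    One common way to define such an equality index is the overlap of two percentage distributions, i.e., the sum of bin-wise minima, which runs from 0 (disjoint) to 100 (identical). This exact formula is an assumption for illustration; the study's index may differ in detail.

```python
def equality_index(p, q):
    """Overlap of two percentage distributions (each summing to 100)."""
    assert abs(sum(p) - 100) < 1e-9 and abs(sum(q) - 100) < 1e-9
    return sum(min(a, b) for a, b in zip(p, q))

full = equality_index([25, 25, 25, 25], [25, 25, 25, 25])     # identical -> 100
partial = equality_index([40, 30, 20, 10], [10, 20, 30, 40])  # -> 60
```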

  17. Battery Charge Equalizer with Transformer Array

    NASA Technical Reports Server (NTRS)

    Davies, Francis

    2013-01-01

    High-power batteries generally consist of a series connection of many cells or cell banks. In order to maintain high performance over battery life, it is desirable to keep the state of charge of all the cell banks equal. A method provides individual charging for battery cells in a large, high-voltage battery array with a minimum number of transformers while maintaining reasonable efficiency. This is designed to augment a simple high-current charger that supplies the main charge energy. The innovation will form part of a larger battery charge system. It consists of a transformer array connected to the battery array through rectification and filtering circuits. The transformer array is connected to a drive circuit and a timing and control circuit that allow individual battery cells or cell banks to be charged. The timing circuit and control circuit connect to a charge controller that uses battery instrumentation to determine which battery bank to charge. It is important to note that the innovation can charge an individual cell bank at the same time that the main battery charger is charging the high-voltage battery. The fact that the battery cell banks are at a non-zero voltage, and that they are all at similar voltages, can be used to allow charging of individual cell banks. A set of transformers can be connected with secondary windings in series to make weighted sums of the voltages on the primaries.

  18. Categorical facilitation with equally discriminable colors.

    PubMed

    Witzel, Christoph; Gegenfurtner, Karl R

    2015-01-01

    This study investigates the impact of language on color perception. By categorical facilitation, we refer to an aspect of categorical perception, in which the linguistic distinction between categories affects color discrimination beyond the low-level, sensory sensitivity to color differences. According to this idea, discrimination performance for colors that cross a category border should be better than for colors that belong to the same category when controlling for low-level sensitivity. We controlled for sensitivity by using colors that were equally discriminable according to empirically measured discrimination thresholds. To test for categorical facilitation, we measured response times and error rates in a speeded discrimination task for suprathreshold stimuli. Robust categorical facilitation occurred for five out of six categories with a group of inexperienced observers, namely for pink, orange, yellow, green, and purple. Categorical facilitation was robust against individual variations of categories or the laterality of target presentation. However, contradictory effects occurred in the blue category, most probably reflecting the difficulty of controlling effects of sensory mechanisms at the green-blue boundary. Moreover, a group of observers who were highly familiar with the discrimination task did not show consistent categorical facilitation in the other five categories. This trained group had much faster response times than the inexperienced group without any speed-accuracy trade-off. Additional analyses suggest that categorical facilitation occurs when observers pay attention to the categorical distinction but not when they respond automatically based on sensory feed-forward information. PMID:26129860

  19. Adaptive Development

    NASA Technical Reports Server (NTRS)

    2005-01-01

    The goal of this research is to develop and demonstrate innovative adaptive seal technologies that can lead to dramatic improvements in engine performance, life, range, and emissions, and enhance operability for next generation gas turbine engines. This work is concentrated on the development of self-adaptive clearance control systems for gas turbine engines. Researchers have targeted the high-pressure turbine (HPT) blade tip seal location for the following reasons: Current active clearance control (ACC) systems (e.g., thermal case-cooling schemes) cannot respond to blade tip clearance changes due to mechanical, thermal, and aerodynamic loads. As such they are prone to wear due to the required tight running clearances during operation. Blade tip seal wear (increased clearances) reduces engine efficiency, performance, and service life. Adaptive sealing technology research has inherent impact on all envisioned 21st century propulsion systems (e.g. distributed vectored, hybrid and electric drive propulsion concepts).

  20. Analytic treatment of the compound action potential: Estimating the summed post-stimulus time histogram and unit response

    NASA Astrophysics Data System (ADS)

    Chertoff, Mark E.

    2004-11-01

    The convolution of an equation representing a summed post-stimulus time histogram computed across auditory nerve fibers [P(t)] with an equation representing a single-unit waveform [U(t)] resulted in an analytic expression for the compound action potential (CAP). The solution was fit to CAPs recorded for low and high frequency stimuli at various signal levels. The correlation between the CAP and the analytic expression was generally greater than 0.90. At high levels the width of P(t) was broader for low frequency stimuli than for high frequency signals, but delays were comparable. This indicates that at high signal levels there is an overlap in the population of auditory nerve fibers contributing to the CAP for both low and high frequency stimuli, but low frequencies include contributions from more apical regions. At low signal levels the width of P(t) decreased for most frequencies and delays increased. The frequency of oscillation of U(t) was largest for high frequency stimuli and decreased for low frequency stimuli. The decay of U(t) was largest at 8 kHz and smallest at 1 kHz. These results indicate that the hair cell or neural mechanisms involved in the generation of action potentials may differ along the cochlear partition.
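
    The model can be sketched numerically: choose illustrative shapes for P(t) (a Gaussian latency distribution) and U(t) (a damped sinusoid), and the CAP follows by discrete convolution. All parameter values are invented for illustration, not the fitted values from the article.

```python
import numpy as np

fs = 100_000                          # sample rate in Hz (assumed)
t = np.arange(0, 0.01, 1 / fs)        # 10 ms analysis window
# P(t): summed post-stimulus time histogram, here a Gaussian latency spread
P = np.exp(-0.5 * ((t - 0.003) / 0.0005) ** 2)
# U(t): single-unit waveform, here a decaying 1 kHz sinusoid
U = np.exp(-t / 0.001) * np.sin(2 * np.pi * 1000 * t)
# CAP(t) = (P * U)(t), truncated to the analysis window
cap = np.convolve(P, U)[: t.size] / fs
```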

  1. Improved DNA content histograms from formalin-fixed, paraffin-embedded liver tissue by proteinase K digestion.

    PubMed

    Albro, J; Bauer, K D; Hitchcock, C L; Wittwer, C T

    1993-01-01

    An improved method for the enzymatic digestion of formalin-fixed, paraffin-embedded liver tissue for DNA content analysis by flow cytometry is presented. Forty samples of histologically normal liver were alternately digested by the traditional pepsin method or a new method utilizing proteinase K and heat. Sixteen (40%) of the pepsin-digested samples had apparent DNA aneuploid peaks by flow cytometry. False DNA aneuploid peaks were not present in any of the histograms obtained after proteinase K digestion. Microscopy showed that the pepsin-digested samples had residual cytoplasmic remnants which contained fluorescent material. Samples digested with proteinase K had few cytoplasmic remnants. The average G0/G1 coefficient of variation after proteinase K treatment was lower (41%) and the fluorescent intensity higher (128%) than the pepsin-treated samples. The apparent mean S-phase (a combination of S-phase cells and underlying debris) after proteinase K digestion was 35% of the pepsin-treated samples. Primary and secondary tumors of the liver that were DNA aneuploid after pepsin treatment were also DNA aneuploid after proteinase K treatment. A modified digestion protocol utilizing proteinase K and heat can provide superior results for DNA content analysis of formalin-fixed, paraffin-embedded liver tissue.

  2. Vision-based drone flight control and crowd or riot analysis with efficient color histogram based tracking

    NASA Astrophysics Data System (ADS)

    Müller, Thomas; Müller, Markus

    2011-05-01

    Object tracking is a direct or indirect key issue in many different military applications, such as visual surveillance, automatic visual closed-loop control of UAVs (unmanned aerial vehicles) and PTZ cameras, and crowd evaluation for detecting or analysing an emerging riot. High robustness is the most important property of the underlying tracker, but robustness typically degrades as the available calculation time shrinks. In the UAV application introduced in this paper the tracker has to be extraordinarily quick. To jointly optimize calculation time and robustness as far as possible, a highly efficient tracking procedure is presented for the above-mentioned application fields which relies on well-known color histograms but uses them in a novel manner. The procedure is based on the calculation of a color weighting vector representing the significance of the object's colors, a kind of color fingerprint of the object. Several examples from the above-mentioned military applications demonstrate the practical relevance and performance of the presented tracking approach.
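
    One plausible realization of such a color weighting vector (the abstract does not give the exact formula) weights each histogram bin by how much more frequent that color is in the object than in the background. A minimal grayscale sketch on synthetic patches:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 8-bit grayscale "frames"; real use would be RGB patches.
object_patch = rng.integers(100, 160, size=(40, 40))      # object region
background_patch = rng.integers(0, 256, size=(120, 120))  # surrounding area

bins = 16
obj_hist, _ = np.histogram(object_patch, bins=bins, range=(0, 256), density=True)
bg_hist, _ = np.histogram(background_patch, bins=bins, range=(0, 256), density=True)

# Color weighting vector: high where a color is frequent in the object
# but rare in the background -- a color "fingerprint" of the object.
weights = obj_hist / (bg_hist + 1e-6)
weights /= weights.max()  # normalize to [0, 1]

def score(patch):
    """Weighted-histogram similarity of a candidate region to the object."""
    h, _ = np.histogram(patch, bins=bins, range=(0, 256), density=True)
    return float(np.dot(h, weights))
```

A candidate window can then be scored with `score(...)`; windows containing the object's dominant colors score higher than background windows.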

  3. Usefulness of histogram analysis of spatial frequency components for exploring the similarity and bilateral asymmetry in mammograms

    NASA Astrophysics Data System (ADS)

    Shiotsuki, Kenshi; Matsunobu, Yusuke; Yabuuchi, Hidetake; Morishita, Junji

    2015-03-01

    The right and left mammograms of a patient are assumed to be bilaterally symmetric during image reading. The detection of asymmetry in bilateral mammograms is a reliable indicator of possible breast abnormalities. The purpose of this study was to examine the potential usefulness of a new method based on spatial frequency components for exploring similarity and abnormality between the right and left mammograms. A total of 98 normal and 119 abnormal cases with calcifications were used for this study. Each case included two mediolateral oblique views. The spatial frequency components were determined from symmetric regions in the right and left mammograms by Fourier transform. The degrees of conformity between the two spatial frequency components in the right and left mammograms were calculated for the same and for different patients. The degrees of conformity were also examined for cases with and without calcifications for the same patient, to show whether the proposed method could indicate the existence of calcifications. The average degrees of conformity and the standard deviations for the same and different patients were 0.911 +/- 0.0165 and 0.857 +/- 0.0328, respectively. The degrees of conformity calculated from abnormal cases (0.836 +/- 0.0906) were statistically lower than those measured from normal cases (0.911 +/- 0.0165). Our results indicated that histogram analysis of spatial frequency components could be useful as a similarity measure between bilateral mammograms of the same patient and as an indicator of abnormal signs in a mammogram.
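
    The abstract does not define the conformity measure precisely; a Pearson correlation between the magnitude spectra of corresponding regions is one plausible sketch, shown here on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical symmetric regions: the left-breast region (after horizontal
# mirroring so the regions correspond) modeled as the right region plus noise.
right = rng.random((64, 64))
left_mirrored = right + 0.05 * rng.random((64, 64))
other_patient = rng.random((64, 64))

def spectrum(region):
    """Flattened magnitude spectrum (spatial frequency components), DC removed."""
    return np.abs(np.fft.fft2(region - region.mean())).ravel()

def conformity(a, b):
    """Pearson correlation of two magnitude spectra: one plausible realization
    of the 'degree of conformity' (the paper's exact formula is not given)."""
    return float(np.corrcoef(spectrum(a), spectrum(b))[0, 1])

same = conformity(right, left_mirrored)
diff = conformity(right, other_patient)
```

As in the study, corresponding regions of the same patient should score higher than regions from different patients.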

  4. Prostate position variability and dose-volume histograms in radiotherapy for prostate cancer with full and empty bladder

    SciTech Connect

    Pinkawa, Michael . E-mail: mpinkawa@ukaachen.de; Asadpour, Branka; Gagel, Bernd; Piroth, Marc D.; Holy, Richard; Eble, Michael J.

    2006-03-01

    Purpose: To evaluate prostate position variability and dose-volume histograms in prostate radiotherapy with full bladder (FB) and empty bladder (EB). Methods and Materials: Thirty patients underwent planning computed tomography scans in a supine position with FB and EB before and after 4 and 8 weeks of radiation therapy. The scans were matched by alignment of pelvic bones. Displacements of the prostate/seminal vesicle organ borders and center of mass were determined. Treatment plans (FB vs. EB) were compared. Results: Compared with the primary scan, FB volume varied more than EB volume (standard deviation, 106 cm³ vs. 47 cm³), but the prostate/seminal vesicle center of mass position variability was the same (>3 mm deviation in right-left, anterior-posterior, and superior-inferior directions in 0, 41%, and 33%, respectively, with FB vs. 0, 44%, and 33% with EB). The bladder volume treated with 90% of the prescription dose was significantly larger with EB (39% ± 14% vs. 22% ± 10%; p < 0.01). Bowel loops received ≥90% of prescription dose in 37% (3% with FB; p < 0.01). Conclusion: Despite the larger variability of bladder filling, prostate position stability was the same with FB compared with EB. An increased amount of bladder volume in the high-dose region and a higher dose to bowel loops result from treatment plans with EB.

  5. Comparative study of old and new versions of treatment planning system using dose volume histogram indices of clinical plans.

    PubMed

    Krishna, Gangarapu Sri; Srinivas, Vuppu; Ayyangar, K M; Reddy, Palreddy Yadagiri

    2016-01-01

    Recently, Eclipse treatment planning system (TPS) version 8.8 was upgraded to the latest version 13.6. It is customary that the vendor gives training on how to upgrade the existing software to the new version; however, the customer is given few details about internal changes in the new software version. According to the manufacturer, the accuracy of point dose calculations and irregular treatment planning is better in the new version (13.6) than in the old version (8.8). Furthermore, the new version uses voxel-based calculations, whereas the earlier version used point dose calculations. Major differences in intensity-modulated radiation therapy (IMRT) plans were observed between the two versions after re-optimization and re-calculation, but only minor differences were observed for IMRT cases after re-calculation alone. It is recommended that TPS quality assurance be performed after any major software upgrade; this can be done by comparing dose calculations in the TPS. To assess the difference between the versions, 25 clinical cases from the old version were compared, keeping all the patient data intact including the monitor units, and the differences in dose calculations were evaluated using dose volume histogram (DVH) analysis. Along with DVH analysis, uniformity index, conformity index, homogeneity index, and dose spillage index were also compared for both versions. The results of the comparative study are presented in this paper. PMID:27651566
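
    DVH indices of this kind are typically computed from dose percentiles. The sketch below uses one common ICRU-83-style homogeneity index, HI = (D2 - D98) / D50, on hypothetical voxel doses; the paper's exact index definitions are not given in the abstract:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical voxel doses (Gy) inside a target volume.
target_dose = rng.normal(60.0, 1.5, size=5000)

def d_percent(dose, p):
    """Dp: minimum dose received by the hottest p% of the volume,
    read off the cumulative dose-volume histogram."""
    return float(np.percentile(dose, 100 - p))

d2, d50, d98 = (d_percent(target_dose, p) for p in (2, 50, 98))

# Homogeneity index: smaller values mean a more uniform target dose.
hi = (d2 - d98) / d50
```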

  6. Characterization of stroke lesions using a histogram-based data analysis including diffusion- and perfusion-weighted imaging

    NASA Astrophysics Data System (ADS)

    Grzesik, Alexander; Bernarding, Johannes; Braun, Juergen; Koennecke, Hans-Christian; Wolf, Karl J.; Tolxdorff, Thomas

    2000-04-01

    Diffusion- and perfusion-weighted magnetic resonance imaging (DWI, PWI) allows the diagnosis of ischemic brain injury at a time when ischemic lesions may not yet be detectable in computer tomography or T2-weighted (T2w) MRI. However, regions with pathologic apparent diffusion coefficients (ADC) do not necessarily match with regions of prolonged mean transit times (MTT) or pathologic relative cerebral blood flow (rCBF). Mismatching parts are thought to correlate with tissues that can be saved by appropriate treatment. Ten patients with cerebral ischemia underwent standard T1w and T2w imaging as well as single-shot echo planar imaging (EPI) DWI, and PWI. Multidimensional histograms were constructed from T2w images, DWI, ADC, rCBF, and MTT maps. After segmenting different tissues, signal changes of ischemic tissues relative to unaffected parenchyma were calculated. Combining different information allowed the segmentation of lesions and unaffected tissues. Acute infarcts exhibited decreased ADC values as well as hypo- and hyperperfused areas. Correlating ADC, T2w, and rCBF with clinical symptoms allowed the estimation of age and perfusion state of the lesions. Combining DWI, PWI, and standard imaging overcomes strongly fluctuating parameters such as ADC values. A multidimensional parameter-set characterizes unaffected and pathologic tissues which may help in the evaluation of new therapeutic strategies.
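
    A multidimensional histogram over parameter maps, as described, can be sketched with `numpy.histogram2d` on hypothetical ADC and rCBF values; the distributions and thresholds below are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 10_000  # voxels of unaffected parenchyma

# Hypothetical per-voxel parameter maps (arbitrary units): unaffected tissue
# plus a smaller ischemic lesion with reduced ADC and rCBF.
adc = np.concatenate([rng.normal(0.9, 0.1, n), rng.normal(0.5, 0.1, n // 10)])
rcbf = np.concatenate([rng.normal(1.0, 0.1, n), rng.normal(0.4, 0.1, n // 10)])

# Two-dimensional histogram over (ADC, rCBF): tissue classes form clusters.
hist, adc_edges, rcbf_edges = np.histogram2d(adc, rcbf, bins=32)

# Segment "lesion-like" voxels by thresholding both parameters (illustrative).
lesion_mask = (adc < 0.7) & (rcbf < 0.7)
```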

  7. Safety surrogate histograms (SSH): A novel real-time safety assessment of dilemma zone related conflicts at signalized intersections.

    PubMed

    Ghanipoor Machiani, Sahar; Abbas, Montasir

    2016-11-01

    Drivers' indecisiveness in dilemma zones (DZ) could result in crash-prone situations at signalized intersections. The DZ is the area ahead of an intersection in which drivers encounter a dilemma regarding whether to stop or proceed through the intersection when the signal turns yellow. An improper decision to stop by the leading driver, combined with the following driver deciding to go, can result in a rear-end collision unless the following driver recognizes that a collision is imminent and adjusts his or her behavior at or shortly after the onset of yellow. Considering the significance of DZ-related crashes, a comprehensive safety measure is needed to characterize the level of safety at signalized intersections. In this study, a novel safety surrogate measure was developed utilizing real-time radar field data. This new measure, called the safety surrogate histogram (SSH), captures the degree and frequency of DZ-related conflicts at each intersection approach. The SSH includes detailed information regarding the possibility of crashes, because it is calculated from vehicle conflicts. An example illustrating the application of the new methodology at two study sites in Virginia is presented and discussed, and a comparison is provided between SSH and other DZ-related safety surrogate measures mentioned in the literature. The results of the study reveal the efficacy of the SSH as a complement to existing surrogate measures.
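
    A hedged sketch of an SSH-style histogram, using time-to-collision (TTC) of leader/follower pairs at yellow onset as the conflict surrogate; the paper's exact SSH construction and radar fields are not given in the abstract:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical leader/follower speeds (m/s) and gaps (m) at yellow onset.
v_lead = rng.uniform(8, 18, 200)
v_follow = v_lead + rng.uniform(-2, 4, 200)  # some followers are faster
gap = rng.uniform(5, 40, 200)

# TTC is finite only when the follower is closing on the leader.
closing = v_follow - v_lead
ttc = np.where(closing > 0, gap / np.maximum(closing, 1e-9), np.inf)

# SSH-style histogram: frequency of conflicts by severity (low TTC = severe).
bins = [0, 1, 2, 3, 4, 5]  # seconds
ssh, _ = np.histogram(ttc[np.isfinite(ttc)], bins=bins)
```

Each approach's SSH then summarizes both how often conflicts occur (bar heights) and how severe they are (which bins they fall in).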

  8. Highly sensitive image-derived indices of water-stressed plants using hyperspectral imaging in SWIR and histogram analysis

    PubMed Central

    Kim, David M.; Zhang, Hairong; Zhou, Haiying; Du, Tommy; Wu, Qian; Mockler, Todd C.; Berezin, Mikhail Y.

    2015-01-01

    The optical signature of leaves is an important monitoring and predictive parameter for a variety of biotic and abiotic stresses, including drought. Such signatures derived from spectroscopic measurements provide vegetation indices – a quantitative method for assessing plant health. However, the commonly used metrics suffer from low sensitivity. Relatively small changes in water content in moderately stressed plants demand high-contrast imaging to distinguish affected plants. We present a new approach in deriving sensitive indices using hyperspectral imaging in a short-wave infrared range from 800 nm to 1600 nm. Our method, based on high spectral resolution (1.56 nm) instrumentation and image processing algorithms (quantitative histogram analysis), enables us to distinguish a moderate water stress equivalent of 20% relative water content (RWC). The identified image-derived indices 15XX nm/14XX nm (i.e. 1529 nm/1416 nm) were superior to common vegetation indices, such as WBI, MSI, and NDWI, with significantly better sensitivity, enabling early diagnostics of plant health. PMID:26531782
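
    The image-derived index is a per-pixel ratio of two SWIR bands followed by histogram analysis. A minimal sketch on a synthetic hyperspectral cube; the `band` helper and the random data are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical hyperspectral cube: 50 x 50 pixels over 800-1600 nm,
# sampled at the stated 1.56 nm spectral resolution.
wavelengths = np.arange(800, 1601, 1.56)
cube = rng.random((50, 50, wavelengths.size))

def band(cube, wavelengths, target_nm):
    """Image plane at the wavelength closest to target_nm."""
    idx = int(np.argmin(np.abs(wavelengths - target_nm)))
    return cube[:, :, idx]

# Image-derived index: per-pixel ratio of the 1529 nm and 1416 nm bands,
# followed by a histogram for quantitative comparison across plants.
ratio = band(cube, wavelengths, 1529) / (band(cube, wavelengths, 1416) + 1e-9)
hist, edges = np.histogram(ratio, bins=50)
```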

  9. Novel brachytherapy treatment planning system utilizing dose rate dependent average cell survival, CT-simulator, and dose-volume histogram

    SciTech Connect

    Mayer, R.; Fong, W.; Frankel, T.

    1995-12-31

    This report describes a new brachytherapy planning program that provides an evaluation of a given low or high dose rate treatment taking into account spatial dose heterogeneity and cell response to radiation. This brachytherapy scheme uses the images from a CT-Simulator (AcQSim, Picker International, Cleveland, Ohio) to simultaneously localize the seed positions and to axially scan the patient. This procedure helps to ensure accurate registration of the putative seed positions with the patient tissues and organs. The seed positions are determined by back-projecting positions of seeds or dummy seeds from the CT-Simulator setup scout images. Physicians delineate the tissues of interest on the axial slices. Dose is computed after assigning activity (low dose rate) or dwell times (high dose rate) to the Ir-192 or I-125 seeds. The planar isodose distribution is superimposed onto axial cuts of the tissues and onto coronal or sagittal views of the tissues following image reconstruction. Areal or volumetric calculations of the dose distribution within a given tissue are computed from the tissue outlines. The treatment plan computes (1) volume differential and cumulative dose histograms of the dose delivered to individual tissues, (2) the average, standard deviation, and coefficient of skewness of the dose distribution delivered to the individual tissues, and (3) the average survival probability for a given radiation treatment.
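
    The cumulative dose-volume histogram and the distribution moments described above can be sketched on hypothetical voxel doses:

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical voxel doses (Gy) within one delineated tissue.
dose = rng.normal(30.0, 8.0, size=20_000).clip(min=0)

def cumulative_dvh(dose, bin_width=1.0):
    """Cumulative DVH: fraction of the volume receiving at least each dose level."""
    levels = np.arange(0, dose.max() + bin_width, bin_width)
    volume_fraction = np.array([(dose >= d).mean() for d in levels])
    return levels, volume_fraction

levels, vf = cumulative_dvh(dose)

# Moments of the dose distribution, as in the described plan evaluation.
mean_dose = dose.mean()
std_dose = dose.std()
skewness = ((dose - mean_dose) ** 3).mean() / std_dose ** 3
```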

  10. O(1) time algorithms for computing histogram and Hough transform on a cross-bridge reconfigurable array of processors

    SciTech Connect

    Kao, T.; Horng, S.; Wang, Y.

    1995-04-01

    Instead of using the base-2 number system, we use a base-m number system to represent the numbers used in the proposed algorithms. Such a strategy can be used to design an O(T) time, T = (log_m N) + 1, prefix sum algorithm for an N-bit binary sequence on a cross-bridge reconfigurable array of processors using N processors, where the data bus is m bits wide. This basic operation can then be used to compute the histogram of an n x n image with G gray levels in constant time using G x n x n processors, and the Hough transform of an image with N edge pixels and an n x n parameter space in constant time using n x n x N processors, respectively. This result is better than the previously known results proposed in the literature. Moreover, the execution time of the proposed algorithms is tunable by the bus bandwidth. 43 refs.
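
    The constant-time parallel machinery cannot be reproduced serially, but its two ingredients can be illustrated: the histogram the algorithm computes (sequential reference) and the base-m digit decomposition behind the prefix-sum step, where counts up to N fit in (log_m N) + 1 digits:

```python
import numpy as np

rng = np.random.default_rng(7)

G = 16                                  # gray levels
img = rng.integers(0, G, size=(8, 8))   # toy n x n image

# Reference result of the O(1) parallel histogram computation.
hist = np.bincount(img.ravel(), minlength=G)

def to_base_m(x, m):
    """Digits of x in base m, most significant first."""
    digits = []
    while True:
        x, r = divmod(x, m)
        digits.append(r)
        if x == 0:
            return digits[::-1]
```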

  13. Pushing Economies (and Students) outside the Factor Price Equalization Zone

    ERIC Educational Resources Information Center

    Oslington, Paul; Towers, Isaac

    2009-01-01

    Despite overwhelming empirical evidence of the failure of factor price equalization, most teaching of international trade theory (even at the graduate level) assumes that economies are incompletely specialized and that factor price equalization holds. The behavior of trading economies in the absence of factor price equalization is not well…

  14. The Shadow of Equality: Some Notes on Inequality

    ERIC Educational Resources Information Center

    Griffin, Eleanor

    1973-01-01

    This article presents a critique of Christopher Jencks' concept of equality as detailed in "Inequality: A Reassessment of the Effect of Family and School in America." The article claims that Jencks' concept of equality in education, as equality of relative incomes, has serious educational consequences. (JA)

  15. 77 FR 39117 - Equal Access to Justice Act Implementation Rule

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-29

    ... PROTECTION 12 CFR Part 1071 RIN 3170-AA27 Equal Access to Justice Act Implementation Rule AGENCY: Bureau of... Equal Access to Justice Act (EAJA or the Act) requires agencies ] that conduct adversary adjudications..., Credit, Credit unions, Equal access to justice, Law enforcement, National banks, Savings...

  16. 48 CFR 52.222-26 - Equal Opportunity.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 2 2010-10-01 2010-10-01 false Equal Opportunity. 52.222... Equal Opportunity. As prescribed in 22.810(e), insert the following clause: Equal Opportunity (MAR 2007... particular religion to perform work connected with the carrying on of the Contractor's activities (41 CFR...

  17. 41 CFR 60-250.5 - Equal opportunity clause.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 41 Public Contracts and Property Management 1 2013-07-01 2013-07-01 false Equal opportunity clause. 60-250.5 Section 60-250.5 Public Contracts and Property Management Other Provisions Relating to... PROTECTED VETERANS Preliminary Matters, Equal Opportunity Clause § 60-250.5 Equal opportunity clause....

  18. Special Issue (26): Teacher Education and Equal Rights.

    ERIC Educational Resources Information Center

    Osler, Audrey, Ed.; Davies, Lynn, Ed.

    1994-01-01

    "Focussing on Equal Rights in Teacher Education (TE)" (Davies); "Place of Women in TE" (Maguire, Weiner); "UN Convention on the Rights of the Child" (Osler); "Parents and Entitlement" (Holden et al.); "International Political Development and Democratic TE" (Harber); "Equal Rights in the Classroom?" (Klein); "Equality of Educational Opportunity in…

  19. 5 CFR 720.101 - Federal Equal Opportunity Recruitment Program.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 2 2011-01-01 2011-01-01 false Federal Equal Opportunity Recruitment... Federal Equal Opportunity Recruitment Program. This section incorporates the statutory requirements for establishing and conducting an equal opportunity recruitment program consistent with law within the...

  20. 29 CFR 530.414 - Equal Access to Justice Act.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 29 Labor 3 2014-07-01 2014-07-01 false Equal Access to Justice Act. 530.414 Section 530.414 Labor... OF HOMEWORKERS IN CERTAIN INDUSTRIES Administrative Procedures § 530.414 Equal Access to Justice Act. Proceedings under this part are not subject to the provisions of the Equal Access to Justice Act. In...

  1. 29 CFR 530.414 - Equal Access to Justice Act.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 29 Labor 3 2012-07-01 2012-07-01 false Equal Access to Justice Act. 530.414 Section 530.414 Labor... OF HOMEWORKERS IN CERTAIN INDUSTRIES Administrative Procedures § 530.414 Equal Access to Justice Act. Proceedings under this part are not subject to the provisions of the Equal Access to Justice Act. In...

  2. 75 FR 65214 - Equal Access to Justice Act Implementation

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-22

    ... Justice Act. The proposed regulation was published in the Federal Register at 75 FR 17622 (April 7, 2010... Oversight 12 CFR Part 1705 RIN 2590-AA29 Equal Access to Justice Act Implementation AGENCY: Federal Housing... 1302 and 1312 of HERA. B. Equal Access to Justice Act The Equal Access to Justice Act, 5 U.S.C....

  3. 29 CFR 530.414 - Equal Access to Justice Act.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 3 2011-07-01 2011-07-01 false Equal Access to Justice Act. 530.414 Section 530.414 Labor... OF HOMEWORKERS IN CERTAIN INDUSTRIES Administrative Procedures § 530.414 Equal Access to Justice Act. Proceedings under this part are not subject to the provisions of the Equal Access to Justice Act. In...

  4. 29 CFR 530.414 - Equal Access to Justice Act.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 29 Labor 3 2013-07-01 2013-07-01 false Equal Access to Justice Act. 530.414 Section 530.414 Labor... OF HOMEWORKERS IN CERTAIN INDUSTRIES Administrative Procedures § 530.414 Equal Access to Justice Act. Proceedings under this part are not subject to the provisions of the Equal Access to Justice Act. In...

  5. 29 CFR 530.414 - Equal Access to Justice Act.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 3 2010-07-01 2010-07-01 false Equal Access to Justice Act. 530.414 Section 530.414 Labor... OF HOMEWORKERS IN CERTAIN INDUSTRIES Administrative Procedures § 530.414 Equal Access to Justice Act. Proceedings under this part are not subject to the provisions of the Equal Access to Justice Act. In...

  6. Halving It All: How Equally Shared Parenting Works.

    ERIC Educational Resources Information Center

    Deutsch, Francine M.

    Noting that details of everyday life contribute to parental equality or inequality, this qualitative study focused on how couples transformed parental roles to create truly equal families. Participating in the study were 88 couples in 4 categories, based on division of parental responsibilities: equal sharers, 60-40 couples, 75-25 couples, and…

  7. 24 CFR 582.330 - Nondiscrimination and equal opportunity requirements.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 24 Housing and Urban Development 3 2011-04-01 2010-04-01 true Nondiscrimination and equal... nondiscrimination and equal opportunity requirements set forth in 24 CFR part 5, recipients serving a designated...) and implementing regulations at 41 CFR chapter 60-741. (2) The nondiscrimination and equal...

  8. 48 CFR 852.211-73 - Brand name or equal.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Brand name or equal. 852... Brand name or equal. As prescribed in 811.104-71, insert the following clause: Brand Name or Equal (JAN 2008) (Note: As used in this clause, the term “brand name” includes identification of products by...

  9. 48 CFR 452.211-70 - Brand Name or Equal.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false Brand Name or Equal. 452... FORMS SOLICITATION PROVISIONS AND CONTRACT CLAUSES Texts of Provisions and Clauses 452.211-70 Brand Name or Equal. As prescribed in 411.171, insert the following provision: Brand Name or Equal (NOV...

  10. 48 CFR 852.211-73 - Brand name or equal.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Brand name or equal. 852... Brand name or equal. As prescribed in 811.104-71, insert the following clause: Brand Name or Equal (JAN 2008) (Note: As used in this clause, the term “brand name” includes identification of products by...

  11. 48 CFR 1852.210-70 - Brand name or equal.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 6 2011-10-01 2011-10-01 false Brand name or equal. 1852... 1852.210-70 Brand name or equal. As prescribed in 1810.011-70(a), insert the following provision: Brand Name or Equal (DEC 1988) (a) As used in this provision, “brand name” means identification of...

  12. On Selecting Tests for Equality of Two Normal Mean Vectors

    ERIC Educational Resources Information Center

    Krishnamoorthy, K.; Xia, Yanping

    2006-01-01

    The conventional approach for testing the equality of two normal mean vectors is to test first the equality of covariance matrices, and if the equality assumption is tenable, then use the two-sample Hotelling T[superscript 2] test. Otherwise one can use one of the approximate tests for the multivariate Behrens-Fisher problem. In this article, we…
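
    The two-sample Hotelling T² statistic mentioned above can be sketched directly (equal covariances assumed, as in the conventional approach; the F-scaled value is compared against F(p, n1 + n2 - p - 1)):

```python
import numpy as np

rng = np.random.default_rng(8)

# Hypothetical samples from p-variate normals with a common covariance.
p, n1, n2 = 3, 40, 45
x = rng.multivariate_normal(np.zeros(p), np.eye(p), size=n1)
y = rng.multivariate_normal(np.zeros(p), np.eye(p), size=n2)

def hotelling_t2(x, y):
    """Two-sample Hotelling T^2 statistic and its F-scaled version
    (compare f against the F(p, n1 + n2 - p - 1) distribution)."""
    n1, n2, p = len(x), len(y), x.shape[1]
    diff = x.mean(axis=0) - y.mean(axis=0)
    # Pooled covariance under the equal-covariance assumption.
    s_pooled = ((n1 - 1) * np.cov(x, rowvar=False)
                + (n2 - 1) * np.cov(y, rowvar=False)) / (n1 + n2 - 2)
    t2 = (n1 * n2 / (n1 + n2)) * float(diff @ np.linalg.solve(s_pooled, diff))
    f = t2 * (n1 + n2 - p - 1) / (p * (n1 + n2 - 2))
    return t2, f

t2, f = hotelling_t2(x, y)
```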

  13. Adaptive management

    USGS Publications Warehouse

    Allen, Craig R.; Garmestani, Ahjond S.

    2015-01-01

    Adaptive management is an approach to natural resource management that emphasizes learning through management where knowledge is incomplete, and when, despite inherent uncertainty, managers and policymakers must act. Unlike a traditional trial and error approach, adaptive management has explicit structure, including a careful elucidation of goals, identification of alternative management objectives and hypotheses of causation, and procedures for the collection of data followed by evaluation and reiteration. The process is iterative, and serves to reduce uncertainty, build knowledge and improve management over time in a goal-oriented and structured process.

  14. Employment equality legislation, 3 March 1988.

    PubMed

    1988-01-01

    On 1 April 1988, new employment equality legislation came into effect in Israel. The new legislation outlaws discrimination at work on the grounds of sex, marital status, and parenthood with respect to recruitment, terms of employment, promotion, vocational training, retraining, dismissal, and severance pay. Under the legislation, 1) employers may not cause prejudice to workers who allege discrimination, help others to do so, or decline sexual advances by a direct or indirect supervisor; 2) the burden of proof in discrimination claims against an employer is on the employer if the worker can show that requirements set by the employer have been met; 3) company managers and co-owners in a partnership are personally liable for violations on the part of the employer if the firm has over six workers unless they prove that the offense was committed without their knowledge or that they had taken all appropriate measures to prevent it; and 4) no special rights given to women by law, collective agreement, or other work contract are to be considered discrimination. The legislation establishes a Public Council to advise the Minister of Labour and Social Affairs on implementing and publicizing the legislation. It also allows a father to receive the following work benefits that were previously restricted to mothers: 1) leave of absence to care for a sick child and 2) statutory leave and statutory entitlement to severance pay for resigning to care for a newborn or adopted baby if the father is the sole guardian or if the mother renounces her right because she is working.

  16. Adaptive Thresholds

    SciTech Connect

    Bremer, P. -T.

    2014-08-26

    ADAPT is a topological analysis code that computes local thresholds, in particular relevance-based thresholds for features defined in scalar fields. The initial target application is vortex detection, but the software is more generally applicable to any threshold-based feature definition.

  17. Equality and fertility in the kibbutz.

    PubMed

    Danziger, L; Neuman, S

    1993-01-01

    The kibbutz is a uniquely socially organized entity that consists of members having equal consumption possibilities, which are not tied to production. There are no wages and child rearing is performed outside the family unit. A theoretical model is presented which explains the differences between fertility in the kibbutz and in the city, and which is tested with data from the 1983 Israeli Census of Population and Housing. The model includes family consumption, the number of children, the time a parent works, years of finished schooling, duration of marriage, wages, and the time a parent cares for own children. The sample population only considers first marrieds with both spouses present, where the wife is at least 35 years old: 77,455 urban families and 2532 kibbutz families. The results of the ordinary least squares analysis indicate that socioeconomic variables explain fertility in the city much better. Regressions which include oriental origin show that women of oriental origin have 1.663 more children in the city and .343 more children in the kibbutz than Western women. Only 10% of the kibbutzim (plural of kibbutz) had an oriental background, while 40% had an oriental background in the city in 1983. Other control variables were immigration status and duration of marriage. The effect of mother's education shows the number of children decreasing with education until reaching a minimum of 13.9 years, or 14.5 years for mother's education and 10.5 years for father's education in the city. Parents' education is insignificant in the kibbutz. Father's ethnicity affects fertility in the city, but mother's ethnicity affects fertility in the kibbutz. When the mother's predicted wage is added to her education, the results show a positive effect in both the city and the kibbutz, with the larger effect in the kibbutz; the reason is that education is more valuable in securing work in the city. Regressions with fathers' ages and education indicate that fathers' predicted wage is positive in both

  18. Combined spatial diversity and time equalization for broadband multiple channel underwater acoustic communications

    NASA Astrophysics Data System (ADS)

    Skoro Kaskarovska, Violeta

    High data rate acoustic communications become feasible with the use of communication systems that operate at high frequency. The high frequency acoustic transmission in shallow water endures severe distortion as a result of the extensive intersymbol interference and Doppler shift, caused by the time variable multipath nature of the channel. In this research a Single Input Multiple Output (SIMO) acoustic communication system is developed to improve the reliability of the high data rate communications at short range in the shallow water acoustic channel. The proposed SIMO communication system operates at very high frequency and combines spatial diversity and decision feedback equalizer in a multilevel adaptive configuration. The first configuration performs selective combining on the equalized signals from multiple receivers and generates quality feedback parameter for the next level of combining. The second configuration implements a form of turbo equalization to evaluate the individual receivers using the feedback parameters as decision symbols. The improved signals from individual receivers are used in the next iteration of selective combining. Multiple iterations are used to achieve optimal estimate of the received signal. The multilevel adaptive configuration is evaluated on experimental and simulated data using SIMO system with three, four and five receivers. The simulation channel model developed for this research is based on experimental channel and Rician fading channel model. The performance of the channel is evaluated in terms of Bit Error Rate (BER) and Signal-to-Noise-and-Interference Ratio (SNIR). Using experimental data with non-zero BER, multilevel adaptive spatial diversity can achieve BER of 0 % and SNIR gain of 3 dB. The simulation results show that the average BER and SNIR after multilevel combining improve dramatically compared to the single receiver, even in case of extremely high BER of individual received signals. The results demonstrate the
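A minimal sketch of the selective-combining idea described above, assuming BPSK symbols and a decision-directed error-power quality metric; both are illustrative choices, not the dissertation's multilevel algorithm:

```python
import numpy as np

# Selection combining over equalized receiver outputs (illustrative sketch):
# per data block, keep the branch with the smallest decision-directed error
# power for BPSK symbols.
def select_branch(branches):
    """branches: (R, N) array of equalized outputs from R receivers."""
    err = [np.mean((np.sign(b.real) - b.real) ** 2) for b in branches]
    best = int(np.argmin(err))
    return branches[best], best

# Toy example: branch 0 is clean BPSK, branch 1 is heavily perturbed.
rng = np.random.default_rng(0)
s = rng.choice([-1.0, 1.0], size=1000).astype(complex)
branches = np.stack([s, s + 0.8 * (rng.standard_normal(1000) + 0j)])
chosen, best = select_branch(branches)  # picks the clean branch (index 0)
```

In the paper's scheme the quality feedback parameter drives further iterations of combining; here a single selection pass is shown.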

  19. Optimization of the fractionated irradiation scheme considering physical doses to tumor and organ at risk based on dose–volume histograms

    SciTech Connect

    Sugano, Yasutaka; Mizuta, Masahiro; Takao, Seishin; Shirato, Hiroki; Sutherland, Kenneth L.; Date, Hiroyuki

    2015-11-15

    Purpose: Radiotherapy of solid tumors has been performed with various fractionation regimens such as multi- and hypofractionations. However, the ability to optimize the fractionation regimen considering the physical dose distribution remains insufficient. This study aims to optimize the fractionation regimen: the authors propose a graphical method for selecting the optimal number of fractions (n) and dose per fraction (d) based on dose–volume histograms for the tumor and normal tissues of organs around the tumor. Methods: Modified linear-quadratic models were employed to estimate the radiation effects on the tumor and an organ at risk (OAR), where the repopulation of the tumor cells and the linearity of the dose-response curve in the high dose range of the surviving fraction were considered. The minimization problem for the damage effect on the OAR was solved by a graphical method under the constraint that the radiation effect on the tumor is fixed. Here, the damage effect on the OAR was estimated based on the dose–volume histogram. Results: It was found that optimization of the fractionation scheme incorporating the dose–volume histogram is possible by employing appropriate cell-survival models. The graphical method considering the repopulation of tumor cells and a rectilinear response in the high dose range enables the derivation of the optimal number of fractions and dose per fraction. For example, in the treatment of prostate cancer, the optimal fractionation was suggested to lie in the range of 8–32 fractions with a daily dose of 2.2–6.3 Gy. Conclusions: It is possible to optimize the number of fractions and dose per fraction based on the physical dose distribution (i.e., the dose–volume histogram) by the graphical method considering the effects on the tumor and OARs around the tumor. This method may provide a new guideline for optimizing the fractionation regimen in physics-guided fractionation.
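For orientation, the basic (unmodified) linear-quadratic model underlying such fractionation comparisons can be sketched as follows; the α/β value and the two schedules are illustrative, and the authors' modified model with repopulation terms is not reproduced here:

```python
# Biologically effective dose under the basic LQ model (no repopulation term):
# BED = n * d * (1 + d / (alpha/beta)). A textbook sketch, not the authors'
# modified model.
def bed(n, d, alpha_beta):
    return n * d * (1.0 + d / alpha_beta)

# Two hypothetical prostate schedules compared at alpha/beta = 1.5 Gy:
conventional = bed(39, 2.0, alpha_beta=1.5)   # 78 Gy in 39 x 2.0 Gy -> BED = 182 Gy
hypofraction = bed(20, 3.0, alpha_beta=1.5)   # 60 Gy in 20 x 3.0 Gy -> BED = 180 Gy
```

Comparing schedules at equal tumor BED while minimizing the OAR effect is the essence of the optimization the abstract describes.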

  20. Outcomes of visual acuity in carbon ion radiotherapy: Analysis of dose-volume histograms and prognostic factors

    SciTech Connect

    Hasegawa, Azusa . E-mail: azusa@nirs.go.jp; Mizoe, Jun-etsu; Mizota, Atsushi; Tsujii, Hirohiko

    2006-02-01

    Purpose: To analyze the tolerance dose for retention of visual acuity in patients with head-and-neck tumors treated with carbon ion radiotherapy. Methods and Materials: From June 1994 to March 2000, 163 patients with tumors in the head and neck or skull base region were treated with carbon ion radiotherapy. Analysis was performed on 54 optic nerves (ONs) corresponding to 30 patients whose ONs had been included in the irradiated volume. These patients showed no evidence of visual impairment due to other factors and had a follow-up period of >4 years. All patients had been informed of the possibility of visual impairment before treatment. We evaluated the dose-complication probability and the prognostic factors for the retention of visual acuity in carbon ion radiotherapy, using dose-volume histograms and multivariate analysis. Results: The median age of 30 patients (14 men, 16 women) was 57.2 years. Median prescribed total dose was 56.0 gray equivalents (GyE) at 3.0-4.0 GyE per fraction per day (range, 48-64 GyE; 16-18 fractions; 4-6 weeks). Of 54 ONs that were analyzed, 35 had been irradiated with <57 GyE (maximum dose [D{sub max}]) resulting in no visual loss. Conversely, 11 of the 19 ONs (58%) irradiated with >57 GyE (D{sub max}) suffered a decrease of visual acuity. In all of these cases, the ONs had been involved in the tumor before carbon ion radiotherapy. In the multivariate analysis, a dose of 20% of the volume of the ON (D{sub 2}) was significantly associated with visual loss. Conclusions: The occurrence of visual loss seems to be correlated with a delivery of >60 GyE to 20% of the volume of the ON.
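Cumulative dose-volume-histogram metrics of the kind used here (V_d and D_v) can be computed from a voxelized dose grid as sketched below; the equal-voxel-volume assumption and all numbers are illustrative:

```python
import numpy as np

# Cumulative DVH metrics from per-voxel doses (equal voxel volumes assumed;
# the grid below is a toy stand-in for a treatment-planning dose matrix).
def dvh_metrics(doses, d_level, v_pct):
    """Return (V_d, D_v): V_d = % of volume receiving >= d_level,
    D_v = minimum dose received by the hottest v_pct % of the volume."""
    doses = np.asarray(doses, dtype=float)
    v_d = 100.0 * np.mean(doses >= d_level)
    d_v = np.percentile(doses, 100.0 - v_pct)  # dose exceeded by v_pct% of voxels
    return v_d, d_v

rng = np.random.default_rng(0)
doses = rng.uniform(0.0, 70.0, size=10_000)   # toy dose grid in GyE
v57, d20 = dvh_metrics(doses, d_level=57.0, v_pct=20.0)
```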

  1. Risk factors for neovascular glaucoma after carbon ion radiotherapy of choroidal melanoma using dose-volume histogram analysis

    SciTech Connect

    Hirasawa, Naoki . E-mail: naoki_h@nirs.go.jp; Tsuji, Hiroshi; Ishikawa, Hitoshi; Koyama-Ito, Hiroko; Kamada, Tadashi; Mizoe, Jun-Etsu; Ito, Yoshiyuki; Naganawa, Shinji; Ohnishi, Yoshitaka; Tsujii, Hirohiko

    2007-02-01

    Purpose: To determine the risk factors for neovascular glaucoma (NVG) after carbon ion radiotherapy (C-ion RT) of choroidal melanoma. Methods and Materials: A total of 55 patients with choroidal melanoma were treated between 2001 and 2005 with C-ion RT based on computed tomography treatment planning. All patients had a tumor of large size or one located close to the optic disk. Univariate and multivariate analyses were performed to identify the risk factors of NVG for the following parameters; gender, age, dose-volumes of the iris-ciliary body and the wall of eyeball, and irradiation of the optic disk (ODI). Results: Neovascular glaucoma occurred in 23 patients and the 3-year cumulative NVG rate was 42.6 {+-} 6.8% (standard error), but enucleation from NVG was performed in only three eyes. Multivariate analysis revealed that the significant risk factors for NVG were V50{sub IC} (volume irradiated {>=}50 GyE to iris-ciliary body) (p = 0.002) and ODI (p = 0.036). The 3-year NVG rate for patients with V50{sub IC} {>=}0.127 mL and those with V50{sub IC} <0.127 mL were 71.4 {+-} 8.5% and 11.5 {+-} 6.3%, respectively. The corresponding rate for the patients with and without ODI were 62.9 {+-} 10.4% and 28.4 {+-} 8.0%, respectively. Conclusion: Dose-volume histogram analysis with computed tomography indicated that V50{sub IC} and ODI were independent risk factors for NVG. An irradiation system that can reduce the dose to both the anterior segment and the optic disk might be worth adopting to investigate whether or not incidence of NVG can be decreased with it.

  2. Dose-Volume Histogram Parameters and Clinical Factors Associated With Pleural Effusion After Chemoradiotherapy in Esophageal Cancer Patients

    SciTech Connect

    Shirai, Katsuyuki; Tamaki, Yoshio; Kitamoto, Yoshizumi; Murata, Kazutoshi; Satoh, Yumi; Higuchi, Keiko; Nonaka, Tetsuo; Ishikawa, Hitoshi; Katoh, Hiroyuki; Takahashi, Takeo; Nakano, Takashi

    2011-07-15

    Purpose: To investigate the dose-volume histogram parameters and clinical factors as predictors of pleural effusion in esophageal cancer patients treated with concurrent chemoradiotherapy (CRT). Methods and Materials: Forty-three esophageal cancer patients treated with definitive CRT from January 2001 to March 2007 were reviewed retrospectively on the basis of the following criteria: pathologically confirmed esophageal cancer, available computed tomography scan for treatment planning, 6-month follow-up after CRT, and radiation dose {>=}50 Gy. Exclusion criteria were lung metastasis, malignant pleural effusion, and surgery. Mean heart dose, mean total lung dose, and percentages of heart or total lung volume receiving {>=}10-60 Gy (Heart-V{sub 10} to V{sub 60} and Lung-V{sub 10} to V{sub 60}, respectively) were analyzed in relation to pleural effusion. Results: The median follow-up time was 26.9 months (range, 6.7-70.2) after CRT. Of the 43 patients, 15 (35%) developed pleural effusion. By univariate analysis, mean heart dose, Heart-V{sub 10} to V{sub 60}, and Lung-V{sub 50} to V{sub 60} were significantly associated with pleural effusion. Poor performance status, primary tumor of the distal esophagus, and age {>=}65 years were significantly related to pleural effusion. Multivariate analysis identified Heart-V{sub 50} as the strongest predictive factor for pleural effusion (p = 0.01). Patients with Heart-V{sub 50} <20%, 20%{<=} Heart-V{sub 50} <40%, and Heart-V{sub 50} {>=}40% had pleural effusion rates of 6%, 44%, and 64%, respectively (p < 0.01). Conclusion: Heart-V{sub 50} is a useful parameter for assessing the risk of pleural effusion and should be reduced to avoid pleural effusion.

  3. Quantifying the Impact of Immediate Reconstruction in Postmastectomy Radiation: A Large, Dose-Volume Histogram-Based Analysis

    SciTech Connect

    Ohri, Nisha; Cordeiro, Peter G.; Keam, Jennifer; Ballangrud, Ase; Shi Weiji; Zhang Zhigang; Nerbun, Claire T.; Woch, Katherine M.; Stein, Nicholas F.; Zhou Ying; McCormick, Beryl; Powell, Simon N.; Ho, Alice Y.

    2012-10-01

    Purpose: To assess the impact of immediate breast reconstruction on postmastectomy radiation (PMRT) using dose-volume histogram (DVH) data. Methods and Materials: Two hundred forty-seven women underwent PMRT at our center, 196 with implant reconstruction and 51 without reconstruction. Patients with reconstruction were treated with tangential photons, and patients without reconstruction were treated with en-face electron fields and customized bolus. Twenty percent of patients received internal mammary node (IMN) treatment. The DVH data were compared between groups. Ipsilateral lung parameters included V20 (% volume receiving 20 Gy), V40 (% volume receiving 40 Gy), mean dose, and maximum dose. Heart parameters included V25 (% volume receiving 25 Gy), mean dose, and maximum dose. IMN coverage was assessed when applicable. Chest wall coverage was assessed in patients with reconstruction. Propensity-matched analysis adjusted for potential confounders of laterality and IMN treatment. Results: Reconstruction was associated with lower lung V20, mean dose, and maximum dose compared with no reconstruction (all P<.0001). These associations persisted on propensity-matched analysis (all P<.0001). Heart doses were similar between groups (P=NS). Ninety percent of patients with reconstruction had excellent chest wall coverage (D95 >98%). IMN coverage was superior in patients with reconstruction (D95 >92.0 vs 75.7%, P<.001). IMN treatment significantly increased lung and heart parameters in patients with reconstruction (all P<.05) but minimally affected those without reconstruction (all P>.05). Among IMN-treated patients, only lower lung V20 in those without reconstruction persisted (P=.022), and mean and maximum heart doses were higher than in patients without reconstruction (P=.006, P=.015, respectively). Conclusions: Implant reconstruction does not compromise the technical quality of PMRT when the IMNs are untreated. Treatment technique, not reconstruction, is the primary

  4. Cortical Magnification Plus Cortical Plasticity Equals Vision?

    PubMed Central

    Born, Richard T.; Trott, Alexander; Hartmann, Till

    2014-01-01

    Most approaches to visual prostheses have focused on the retina, and for good reasons. The earlier that one introduces signals into the visual system, the more one can take advantage of its prodigious computational abilities. For methods that make use of microelectrodes to introduce electrical signals, however, the limited density and volume-occupying nature of the electrodes place severe limits on the image resolution that can be provided to the brain. In this regard, non-retinal areas in general, and the primary visual cortex in particular, possess one large advantage: “magnification factor” (MF)—a value that represents the distance across a sheet of neurons that represents a given angle of the visual field. In the foveal representation of primate primary visual cortex, the MF is enormous—on the order of 15–20 mm/deg in monkeys and humans, whereas on the retina, the MF is limited by the optical design of the eye to around 0.3 mm/deg. This means that, for an electrode array of a given density, a much higher-resolution image can be introduced into V1 than onto the retina (or any other visual structure). In addition to this tremendous advantage in resolution, visual cortex is plastic at many different levels ranging from a very local ability to learn to better detect electrical stimulation to higher levels of learning that permit human observers to adapt to radical changes to their visual inputs. We argue that the combination of the large magnification factor and the impressive ability of the cerebral cortex to learn to recognize arbitrary patterns, might outweigh the disadvantages of bypassing earlier processing stages and makes V1 a viable option for the restoration of vision. PMID:25449335

  5. Connector adapter

    NASA Technical Reports Server (NTRS)

    Hacker, Scott C. (Inventor); Dean, Richard J. (Inventor); Burge, Scott W. (Inventor); Dartez, Toby W. (Inventor)

    2007-01-01

    An adapter for installing a connector to a terminal post, wherein the connector is attached to a cable, is presented. In an embodiment, the adapter is comprised of an elongated collet member having a longitudinal axis comprised of a first collet member end, a second collet member end, an outer collet member surface, and an inner collet member surface. The inner collet member surface at the first collet member end is used to engage the connector. The outer collet member surface at the first collet member end is tapered for a predetermined first length at a predetermined taper angle. The collet includes a longitudinal slot that extends along the longitudinal axis initiating at the first collet member end for a predetermined second length. The first collet member end is formed of a predetermined number of sections segregated by a predetermined number of channels and the longitudinal slot.

  6. Adaptive sampler

    DOEpatents

    Watson, B.L.; Aeby, I.

    1980-08-26

    An adaptive data compression device is described for compressing data having variable frequency content. The device includes a plurality of digital filters for analyzing the content of the data over a plurality of frequency regions, a memory, and a control logic circuit for generating a variable rate memory clock corresponding to the analyzed frequency content of the data in the frequency region and for clocking the data into the memory in response to the variable rate memory clock.

  7. Adaptive sampler

    DOEpatents

    Watson, Bobby L.; Aeby, Ian

    1982-01-01

    An adaptive data compression device for compressing data having variable frequency content, including a plurality of digital filters for analyzing the content of the data over a plurality of frequency regions, a memory, and a control logic circuit for generating a variable rate memory clock corresponding to the analyzed frequency content of the data in the frequency region and for clocking the data into the memory in response to the variable rate memory clock.
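A toy digital analogue of the patented idea, analyzing a block's spectral content and clocking it into memory at a reduced rate when the content is low-frequency, might look like this (thresholds, block size, and names are illustrative):

```python
import numpy as np

# Variable-rate storage driven by spectral analysis (toy analogue of the
# patent's digital filters plus variable-rate memory clock; all numbers
# are illustrative).
def adaptive_decimate(block, fs, f_split=1000.0, slow_step=4):
    """Store a block at 1/slow_step rate when >95% of its spectral energy
    lies below f_split Hz; otherwise store it at full rate."""
    spectrum = np.abs(np.fft.rfft(block)) ** 2
    freqs = np.fft.rfftfreq(len(block), d=1.0 / fs)
    low_fraction = spectrum[freqs < f_split].sum() / (spectrum.sum() + 1e-12)
    step = slow_step if low_fraction > 0.95 else 1  # the "variable memory clock"
    return block[::step]

fs = 8000.0
t = np.arange(1024) / fs
slow = np.sin(2 * np.pi * 62.5 * t)    # bin-centered low tone -> decimated by 4
fast = np.sin(2 * np.pi * 3000.0 * t)  # high tone -> stored at full rate
kept_slow = adaptive_decimate(slow, fs)
kept_fast = adaptive_decimate(fast, fs)
```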

  8. Equally parsimonious pathways through an RNA sequence space are not equally likely

    NASA Technical Reports Server (NTRS)

    Lee, Y. H.; DSouza, L. M.; Fox, G. E.

    1997-01-01

    An experimental system for determining the potential ability of sequences resembling 5S ribosomal RNA (rRNA) to perform as functional 5S rRNAs in vivo in the Escherichia coli cellular environment was devised previously. Presumably, the only 5S rRNA sequences that would have been fixed by ancestral populations are ones that were functionally valid, and hence the actual historical paths taken through RNA sequence space during 5S rRNA evolution would have most likely utilized valid sequences. Herein, we examine the potential validity of all sequence intermediates along alternative equally parsimonious trajectories through RNA sequence space which connect two pairs of sequences that had previously been shown to behave as valid 5S rRNAs in E. coli. The first trajectory requires a total of four changes. The 14 sequence intermediates provide 24 apparently equally parsimonious paths by which the transition could occur. The second trajectory involves three changes, six intermediate sequences, and six potentially equally parsimonious paths. In total, only eight of the 20 sequence intermediates were found to be clearly invalid. As a consequence of the position of these invalid intermediates in the sequence space, seven of the 30 possible paths consisted of exclusively valid sequences. In several cases, the apparent validity/invalidity of the intermediate sequences could not be anticipated on the basis of current knowledge of the 5S rRNA structure. This suggests that the interdependencies in RNA sequence space may be more complex than currently appreciated. If ancestral sequences predicted by parsimony are to be regarded as actual historical sequences, then the present results would suggest that they should also satisfy a validity requirement and that, in at least limited cases, this conjecture can be tested experimentally.
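The path counts quoted above (24 orderings of four changes, six of three) are permutations of the substitution order; a small sketch that also filters out paths passing through invalid intermediates:

```python
from itertools import permutations

# Count orderings of point mutations between two valid sequences, excluding
# paths that pass through invalid intermediates. An intermediate is encoded
# as the frozenset of changes acquired so far (start and endpoint excluded).
def valid_paths(n_changes, invalid):
    count = 0
    for order in permutations(range(n_changes)):
        acquired = set()
        ok = True
        for step in order[:-1]:        # the final step lands on the valid endpoint
            acquired.add(step)
            if frozenset(acquired) in invalid:
                ok = False
                break
        if ok:
            count += 1
    return count

# Four changes -> 4! = 24 equally parsimonious orderings through 2**4 - 2 = 14
# intermediates; three changes -> 3! = 6 orderings through 6 intermediates.
all_paths_4 = valid_paths(4, invalid=set())         # 24
all_paths_3 = valid_paths(3, invalid=set())         # 6
blocked = valid_paths(4, invalid={frozenset({0})})  # paths starting with change 0 are lost
```

This mirrors the paper's observation: a few invalid intermediates can eliminate many of the nominally equally parsimonious paths.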

  9. Multi-site study of diffusion metric variability: characterizing the effects of site, vendor, field strength, and echo time using the histogram distance

    NASA Astrophysics Data System (ADS)

    Helmer, K. G.; Chou, M.-C.; Preciado, R. I.; Gimi, B.; Rollins, N. K.; Song, A.; Turner, J.; Mori, S.

    2016-03-01

    MRI-based multi-site trials now routinely include some form of diffusion-weighted imaging (DWI) in their protocol. These studies can include data originating from scanners built by different vendors, each with their own set of unique protocol restrictions, including restrictions on the number of available gradient directions, whether an externally generated list of gradient directions can be used, and restrictions on the echo time (TE). One challenge of multi-site studies is to create a common imaging protocol that will result in a reliable and accurate set of diffusion metrics. The present study describes the effect of site, scanner vendor, field strength, and TE on two common metrics: the first moment of the diffusion tensor field (mean diffusivity, MD), and the fractional anisotropy (FA). We have shown in earlier work that ROI metrics and the mean of MD and FA histograms are not sufficiently sensitive for use in site characterization. Here we use the distance between whole brain histograms of FA and MD to investigate within- and between-site effects. We concluded that the variability of DTI metrics due to site, vendor, field strength, and echo time could influence the results in multi-center trials and that the histogram distance is a sensitive metric for each of these variables.
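A whole-brain histogram distance of the kind used for site characterization can be sketched as below; the L1 (Manhattan) distance and the toy FA distributions are illustrative, since the abstract does not specify the exact metric:

```python
import numpy as np

# L1 (Manhattan) distance between normalized whole-brain histograms; ranges
# from 0 (identical) to 1 (disjoint). The metric choice and the toy FA
# distributions below are illustrative.
def histogram_distance(x, y, bins=64, value_range=(0.0, 1.0)):
    hx, _ = np.histogram(x, bins=bins, range=value_range)
    hy, _ = np.histogram(y, bins=bins, range=value_range)
    px = hx / hx.sum()
    py = hy / hy.sum()
    return 0.5 * np.abs(px - py).sum()

rng = np.random.default_rng(1)
fa_site_a = rng.beta(2.0, 5.0, size=50_000)   # toy whole-brain FA samples, site A
fa_site_b = rng.beta(2.2, 5.0, size=50_000)   # slightly shifted distribution, site B
d_ab = histogram_distance(fa_site_a, fa_site_b)
```

Unlike the histogram mean, such a distance responds to shape changes (skew, width) that different vendors or echo times can introduce.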

  10. Adaptive antennas

    NASA Astrophysics Data System (ADS)

    Barton, P.

    1987-04-01

    The basic principles of adaptive antennas are outlined in terms of the Wiener-Hopf expression for maximizing signal to noise ratio in an arbitrary noise environment; the analogy with generalized matched filter theory provides a useful aid to understanding. For many applications, there is insufficient information to achieve the above solution and thus non-optimum constrained null steering algorithms are also described, together with a summary of methods for preventing wanted signals being nulled by the adaptive system. The three generic approaches to adaptive weight control are discussed; correlation steepest descent, weight perturbation and direct solutions based on sample matrix inversion. The tradeoffs between hardware complexity and performance in terms of null depth and convergence rate are outlined. The sidelobe canceller technique is described. Performance variation with jammer power and angular distribution is summarized and the key performance limitations identified. The configuration and performance characteristics of both multiple beam and phase scan array antennas are covered, with a brief discussion of performance factors.
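The direct-solution approach (sample matrix inversion of the Wiener-Hopf equation) can be sketched for a uniform linear array as follows; the scenario, array size, and MVDR-style normalization are illustrative:

```python
import numpy as np

# Sample matrix inversion (SMI): solve the Wiener-Hopf equation w = R^{-1} s
# against the sampled interference-plus-noise covariance, with MVDR-style
# normalization toward the look direction. Scenario values are illustrative.
def smi_weights(snapshots, steering):
    """snapshots: (N, K) signal-free array snapshots; steering: (N,) look vector."""
    K = snapshots.shape[1]
    R = snapshots @ snapshots.conj().T / K   # sample covariance
    w = np.linalg.solve(R, steering)
    return w / (steering.conj() @ w)         # unit response toward the look direction

# 8-element half-wavelength ULA, one strong jammer at 20 deg, look direction 0 deg.
N, K = 8, 512
rng = np.random.default_rng(2)
n = np.arange(N)
steer = lambda deg: np.exp(1j * np.pi * n * np.sin(np.deg2rad(deg)))
jam = np.outer(steer(20.0), 10.0 * (rng.standard_normal(K) + 1j * rng.standard_normal(K)))
noise = (rng.standard_normal((N, K)) + 1j * rng.standard_normal((N, K))) / np.sqrt(2)
w = smi_weights(jam + noise, steer(0.0))  # an adaptive null forms at 20 deg
```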

  11. Equalization and detection for digital communication over nonlinear bandlimited satellite communication channels. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Gutierrez, Alberto, Jr.

    1995-01-01

    This dissertation evaluates receiver-based methods for mitigating the effects due to nonlinear bandlimited signal distortion present in high data rate satellite channels. The effects of the nonlinear bandlimited distortion is illustrated for digitally modulated signals. A lucid development of the low-pass Volterra discrete time model for a nonlinear communication channel is presented. In addition, finite-state machine models are explicitly developed for a nonlinear bandlimited satellite channel. A nonlinear fixed equalizer based on Volterra series has previously been studied for compensation of noiseless signal distortion due to a nonlinear satellite channel. This dissertation studies adaptive Volterra equalizers on a downlink-limited nonlinear bandlimited satellite channel. We employ as figures of merit performance in the mean-square-error and probability-of-error senses. In addition, a receiver consisting of a fractionally-spaced equalizer (FSE) followed by a Volterra equalizer (FSE-Volterra) is found to give improvement beyond that gained by the Volterra equalizer. Significant probability of error performance improvement is found for multilevel modulation schemes. Also, it is found that probability of error improvement is more significant for modulation schemes, constant amplitude and multilevel, which require higher signal to noise ratios (i.e., higher modulation orders) for reliable operation. The maximum likelihood sequence detection (MLSD) receiver for a nonlinear satellite channel, a bank of matched filters followed by a Viterbi detector, serves as a probability of error lower bound for the Volterra and FSE-Volterra equalizers. However, this receiver has not been evaluated for a specific satellite channel. In this work, an MLSD receiver is evaluated for a specific downlink-limited satellite channel. Because of the bank of matched filters, the MLSD receiver may be high in complexity. Consequently, the probability of error performance of a more practical
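The low-pass discrete-time Volterra model mentioned above can be illustrated by identifying a toy nonlinear channel with a second-order Volterra filter trained by LMS; the channel coefficients and symbol alphabet are hypothetical, and this is a model-class sketch, not the dissertation's equalizer:

```python
import numpy as np

# Identify a toy nonlinear channel with a 2nd-order Volterra filter trained by
# LMS. The channel (linear ISI plus a quadratic cross-term) is hypothetical;
# the point is that it lies exactly in the Volterra model class.
def volterra_features(seg):
    """Linear taps plus all pairwise (upper-triangular) products."""
    M = len(seg)
    return np.concatenate([seg, np.outer(seg, seg)[np.triu_indices(M)]])

rng = np.random.default_rng(3)
s = rng.choice([-3.0, -1.0, 1.0, 3.0], size=20_000)    # 4-PAM symbols
r = np.zeros_like(s)
r[1:] = s[1:] + 0.4 * s[:-1] + 0.05 * s[1:] * s[:-1]   # nonlinear channel output

M = 2
w = np.zeros(M + M * (M + 1) // 2)  # taps: [s[n], s[n-1], s[n]^2, s[n]s[n-1], s[n-1]^2]
mu = 0.001
for k in range(1, len(s)):
    phi = volterra_features(np.array([s[k], s[k - 1]]))
    e = r[k] - w @ phi              # instantaneous error
    w += mu * e * phi               # LMS update
# w converges to the true kernel [1.0, 0.4, 0.0, 0.05, 0.0]
```

The same feature expansion, run on the channel output with training symbols as the target, is what an adaptive Volterra equalizer learns in reverse.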

  12. Gender Equality and Violent Behavior: How Neighborhood Gender Equality Influences the Gender Gap in Violence

    PubMed Central

    Lei, Man-Kit; Simons, Ronald L.; Simons, Leslie Gordon; Edmond, Mary Bond

    2014-01-01

    Using a sample of 703 African American adolescents from the Family and Community Health Study (FACHS) along with census data from the year 2000, we examine the association between neighborhood-level gender equality and violence. We find that boys’ and girls’ violent behavior is unevenly distributed across neighborhood contexts. In particular, gender differences in violent behavior are less pronounced in gender-equalitarian neighborhoods compared to those characterized by gender inequality. We also find that the gender gap narrows in gender-equalitarian neighborhoods because boys’ rates of violence decrease whereas girls’ rates remain relatively low across neighborhoods. This is in stark contrast to the pessimistic predictions of theorists who argue that the narrowing of the gender gap in equalitarian settings is the result of an increase in girls’ violence. In addition, the relationship between neighborhood gender equality and violence is mediated by a specific articulation of masculinity characterized by toughness. Our results provide evidence for the use of gender-specific neighborhood prevention programs. PMID:24672996

  13. Cone pigment polymorphism in New World monkeys: are all pigments created equal?

    PubMed

    Rowe, Mickey P; Jacobs, Gerald H

    2004-01-01

    Most platyrrhine monkeys have a triallelic M/L opsin gene polymorphism that underlies significant individual variations in color vision. A survey of the frequencies of these polymorphic genes suggests that the three alleles occur with equal frequency among squirrel monkeys (subfamily Cebinae), but are not equally frequent in a number of species from the subfamily Callitrichinae. This departure from equal frequency in the Callitrichids should slightly increase the ratio of dichromats to trichromats in the population and significantly alter the relative representation of the three possible dichromatic and trichromatic phenotypes. A particular feature of the inequality is that it leads to a relative increase in the number of trichromats whose M/L pigments have the largest possible spectral separation. To assess whether these trichromatic phenotypes are equally well equipped to make relevant visual discriminations, psychophysical experiments were run on human observers. A technique involving the functional substitution of photopigments was used to simulate the discrimination between fruits among a background of leaves. The goal of the simulation was to reproduce in the cones of human observers excitations equivalent to those produced in monkey cones as the animals view fruit. Three different viewing conditions were examined involving variations in the relative luminances of fruit and leaves and the spectrum of the illuminant. In all cases, performance was best for simulated trichromacies including M/L pigments with the largest spectral separation. Thus, the inequality of opsin gene frequency in Callitrichid monkeys may reflect adaptive pressures. PMID:15518191

  14. Method for solvent extraction with near-equal density solutions

    DOEpatents

    Birdwell, Joseph F.; Randolph, John D.; Singh, S. Paul

    2001-01-01

    Disclosed is a modified centrifugal contactor for separating solutions of near equal density. The modified contactor has a pressure differential establishing means that allows the application of a pressure differential across fluid in the rotor of the contactor. The pressure differential is such that it causes the boundary between solutions of near-equal density to shift, thereby facilitating separation of the phases. Also disclosed is a method of separating solutions of near-equal density.

  15. Pilot study in the treatment of endometrial carcinoma with 3D image-based high-dose-rate brachytherapy using modified Heyman packing: Clinical experience and dose-volume histogram analysis

    SciTech Connect

    Weitmann, Hajo Dirk . E-mail: dirk.weitmann@akhwien.at; Poetter, Richard; Waldhaeusl, Claudia; Nechvile, Elisabeth; Kirisits, Christian; Knocke, Tomas Hendrik

    2005-06-01

    Purpose: The aim of this study was to evaluate dose distribution within uterus (clinical target volume [CTV]) and tumor (gross tumor volume [GTV]) and the resulting clinical outcome based on systematic three-dimensional treatment planning with dose-volume adaptation. Dose-volume assessment and adaptation in organs at risk and its impact on side effects were investigated in parallel. Methods and Materials: Sixteen patients with either locally confined endometrial carcinoma (n = 15) or adenocarcinoma of uterus and ovaries after bilateral salpingo-oophorectomy (n = 1) were included. Heyman packing was performed with mean 11 Norman-Simon applicators (3-18). Three-dimensional treatment planning based on computed tomography (n = 29) or magnetic resonance imaging (n = 18) was done in all patients with contouring of CTV, GTV, and organs at risk. Dose-volume adaptation was achieved by dwell location and time variation (intensity modulation). Twelve patients treated with curative intent received five to seven fractions of high-dose-rate brachytherapy (7 Gy per fraction) corresponding to a total dose of 60 Gy (2 Gy per fraction and {alpha}/{beta} of 10 Gy) to the CTV. Four patients had additional external beam radiotherapy (range, 10-40 Gy). One patient had salvage brachytherapy and 3 patients were treated with palliative intent. A dose-volume histogram analysis was performed in all patients. On average, 68% of the CTV and 92% of the GTV were encompassed by the 60 Gy reference volume. Median minimum dose to 90% of CTV and GTV (D90) was 35.3 Gy and 74 Gy, respectively. Results: All patients treated with curative intent had complete remission (12/12). After a median follow-up of 47 months, 5 patients are alive without tumor. Seven patients died without tumor from intercurrent disease after median 22 months. The patient with salvage treatment had a second local recurrence after 27 months and died of endometrial carcinoma after 57 months. In patients treated with palliative

  16. Channel Equalization for Single Carrier MIMO Underwater Acoustic Communications

    NASA Astrophysics Data System (ADS)

    Tao, Jun; Zheng, Yahong Rosa; Xiao, Chengshan; Yang, T. C.; Yang, Wen-Bin

    2010-12-01

    Multiple-input multiple-output (MIMO) underwater acoustic (UWA) channels introduce both space-time interference (STI) and time-varying phase distortion for transmitted signals. In such cases, the equalized symbols produced by a conventional equalizer aiming at STI cancellation suffer phase rotation and thus cannot be reliably detected. In this paper, we propose a new equalization scheme for high-data-rate single carrier MIMO UWA channels. Unlike existing methods employing joint equalization and symbolwise phase tracking, the proposed scheme decouples the interference cancellation (IC) operation and the phase compensation operation, leading to a generalized equalizer structure combining an IC equalizer with a phase compensator. The decoupling of the two functionalities leads to robust signal detection, which is most desirable in practical UWA applications. A MIMO linear equalizer (LE) is adopted to remove space-time interference, and a groupwise phase estimation and correction method is used to compensate the phase rotation. In addition, layered space-time processing is adopted to enhance the equalization performance. The proposed equalization scheme proved very robust when tested with extensive experimental data collected at Kauai, Hawaii, in September 2005, and Saint Margaret's Bay, Nova Scotia, Canada, in May 2006.
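    The decoupled structure described above (equalize first, then apply one phase correction per group of symbols) can be sketched as follows. This is an illustrative single-stream sketch, not the authors' implementation: the function `groupwise_phase_correct`, its pilot-based phase estimator, and the group length of 64 symbols are assumptions for the example.

    ```python
    import numpy as np

    def groupwise_phase_correct(eq_syms, pilots, pilot_idx, group_len=64):
        """Groupwise phase compensation after linear equalization (sketch).

        eq_syms   : equalizer output symbols (complex), still phase-rotated
        pilots    : known pilot symbols
        pilot_idx : positions of the pilots within eq_syms
        group_len : symbols per group, over which the phase rotation is
                    assumed approximately constant
        """
        out = np.array(eq_syms, dtype=complex)
        n_groups = int(np.ceil(len(out) / group_len))
        for g in range(n_groups):
            lo, hi = g * group_len, min((g + 1) * group_len, len(out))
            # Pilots falling inside this group.
            in_grp = [(i, p) for i, p in zip(pilot_idx, pilots) if lo <= i < hi]
            if not in_grp:
                continue  # no pilots in this group: leave it uncorrected
            idx = np.array([i for i, _ in in_grp])
            ref = np.array([p for _, p in in_grp])
            # One phase estimate per group: angle of the pilot-matched
            # correlation, then derotate every symbol in the group.
            theta = np.angle(np.sum(out[idx] * np.conj(ref)))
            out[lo:hi] *= np.exp(-1j * theta)
        return out
    ```

    Estimating one phase per group, instead of tracking it symbol by symbol, is what makes the compensator robust to noisy individual decisions, which is the motivation the abstract gives for decoupling.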

  17. WE-A-17A-12: The Influence of Eye Plaque Design On Dose Distributions and Dose-Volume Histograms

    SciTech Connect

    Aryal, P; Molloy, JA; Rivard, MJ

    2014-06-15

    Purpose: To investigate the effect of slot design of the model EP917 plaque on dose distributions and dose-volume histograms (DVHs). Methods: The dimensions and orientation of the slots in EP917 plaques were measured. In the MCNP5 radiation simulation geometry, dose distributions on orthogonal planes and DVHs for a tumor and sclera were generated for comparisons. 27 slot designs and 13 plaques were evaluated and compared with the published literature and the Plaque Simulator clinical treatment planning system. Results: The dosimetric effect of the gold backing composition and mass density was < 3%. Slot depth, width, and length changed the central axis (CAX) dose distributions by < 1% per 0.1 mm in design variation. Seed shifts in the slot towards the eye and shifts of the ¹²⁵I-coated Ag rod within the capsule had the greatest impact on CAX dose distribution, increasing by 14%, 9%, 4%, and 2.5% at 1, 2, 5, and 10 mm, respectively, from the inner sclera. Along the CAX, dose from the full plaque geometry using the measured slot design was 3.4% ± 2.3% higher than the manufacturer-provided geometry. D₁₀ for the simulated tumor, inner sclera, and outer sclera for the measured plaque was also higher, by 9%, 10%, and 20%, respectively. In comparison to the measured plaque design, a theoretical plaque having narrow and deep slots delivered 30%, 37%, and 62% lower D₁₀ doses to the tumor, inner sclera, and outer sclera, respectively. CAX doses at −1, 0, 1, and 2 mm were also lower by a factor of 2.6, 1.4, 1.23, and 1.13, respectively. Conclusion: The study identified substantial sensitivity of the EP917 plaque dose distributions to slot design. However, it did not identify substantial dosimetric variations based on radionuclide choice (¹²⁵I, ¹⁰³Pd, or ¹³¹Cs). COMS plaques provided lower scleral doses with similar tumor dose coverage.

  18. Application of equalization notch to improve synthetic aperture radar coherent data products

    NASA Astrophysics Data System (ADS)

    Musgrove, Cameron; West, James C.

    2015-05-01

    Interference and interference mitigation techniques degrade synthetic aperture radar (SAR) coherent data products. Radars utilizing stretch processing present a unique challenge for many mitigation techniques because the interference signal itself is modified through stretch processing from its original signal characteristics. Many sources of interference, including constant tones, are only present within the fast-time sample data for a limited number of samples, depending on the radar and interference bandwidth. Adaptive filtering algorithms that estimate and remove the interference signal by assuming stationary interference characteristics can be ineffective. An effective mitigation method, called notching, forces the value of the data samples containing interference to zero. However, as the number of data samples set to zero increases, image distortion and loss of resolution degrade both the image product and any second-order image products. Techniques to repair image distortions [1] are effective for point-like targets. However, these techniques are not designed to model and repair distortions in SAR image terrain. Good terrain coherence is important for SAR second-order image products because terrain occupies the majority of many scenes. For the case of coherent change detection, it is the terrain coherence itself that determines the quality of the change detection image. This paper proposes a unique equalization technique that improves coherence over existing notching techniques. First, the proposed algorithm limits mitigation to only the samples containing interference, unlike adaptive filtering algorithms, so the remaining samples are not modified. Additionally, the mitigation adapts to changing interference power such that the resulting correction equalizes the power across the data samples. The result is reduced distortion and improved coherence for the terrain. SAR data demonstrates improved coherence from the proposed equalization
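    The contrast between conventional notching and a power-equalizing correction can be illustrated on fast-time samples whose interference-bearing positions are already flagged. This is a toy sketch under my own assumptions, not the paper's algorithm: here "equalizing" is interpreted as rescaling each flagged sample to the RMS magnitude of the clean samples while keeping its phase.

    ```python
    import numpy as np

    def notch(samples, interf_mask):
        """Conventional notch: zero every sample flagged as interference."""
        out = np.array(samples, dtype=complex)
        out[interf_mask] = 0.0
        return out

    def equalize_notch(samples, interf_mask):
        """Toy power-equalization alternative (illustrative only): instead
        of zeroing, scale each flagged sample so its magnitude matches the
        RMS magnitude of the clean samples, preserving its phase. Clean
        samples are left untouched, as in the proposed scheme."""
        out = np.array(samples, dtype=complex)
        clean_rms = np.sqrt(np.mean(np.abs(out[~interf_mask]) ** 2))
        flagged = out[interf_mask]
        nonzero = np.abs(flagged) > 0
        flagged[nonzero] = flagged[nonzero] / np.abs(flagged[nonzero]) * clean_rms
        out[interf_mask] = flagged
        return out
    ```

    Zeroed samples act like missing data and blur the impulse response; keeping the flagged samples at clean-sample power avoids the abrupt amplitude discontinuities that degrade terrain coherence, which is the intuition behind equalizing rather than notching.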

  19. Equal Protection in Special Admissions Programs: Forward from Bakke.

    ERIC Educational Resources Information Center

    Stone, Julius

    1979-01-01

    Bakke's equal protection holding is analyzed and an assessment is offered of what the decisions mean for academic special admissions programs. Discussion focuses on how race may be used as a factor in admissions decisions consistently with the equal protection clause of the Federal Constitution. (Author/MSE)

  20. 77 FR 23595 - National Equal Pay Day, 2012

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-20

    ... Force has helped women recover millions in lost wages, built collaborative training programs that... the heart of an America built to last. Equal pay will strengthen our families, grow our economy, and... efforts to achieve equal pay. ] IN WITNESS WHEREOF, I have hereunto set my hand this seventeenth day...

  1. Towards an Analytical Framework for Accountability Regarding Equal Educational Opportunities

    ERIC Educational Resources Information Center

    Beckman, Johan; Prinsloo, Justus

    2004-01-01

    This article examines the issue of accountability for equal educational opportunities against the background of the constitutional provision that elevates accountability above a management tool to a national goal and value. It asserts that the government is indeed accountable for equal educational opportunities. However, government reporting on…

  2. Equality and Fiscal Equity in School Finance Reform.

    ERIC Educational Resources Information Center

    Hayes, Kathy J.; And Others

    1993-01-01

    Examines distributional implications of a recent Supreme Court of Texas decision mandating a fiscally neutral school finance system. Comparing 1988-89 system's distribution with that of two forms of district power equalization, this article finds that the new alternatives would generate a more equal distribution of expenditures per unit of tax…

  3. 43 CFR 34.6 - Equal opportunity clause.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 43 Public Lands: Interior 1 2010-10-01 2010-10-01 false Equal opportunity clause. 34.6 Section 34.6 Public Lands: Interior Office of the Secretary of the Interior REQUIREMENTS FOR EQUAL OPPORTUNITY DURING CONSTRUCTION AND OPERATION OF THE ALASKA NATURAL GAS TRANSPORTATION SYSTEM § 34.6...

  4. 29 CFR 2.32 - Equal participation of religious organizations.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Social Service Providers and Beneficiaries § 2.32 Equal participation of religious organizations. (a... 29 Labor 1 2011-07-01 2011-07-01 false Equal participation of religious organizations. 2.32... or participate in DOL programs for which they are otherwise eligible. DOL, DOL social...

  5. 47 CFR 101.311 - Equal employment opportunities.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 5 2012-10-01 2012-10-01 false Equal employment opportunities. 101.311 Section 101.311 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) SAFETY AND SPECIAL RADIO SERVICES FIXED MICROWAVE SERVICES Miscellaneous Common Carrier Provisions § 101.311 Equal...

  6. 47 CFR 101.311 - Equal employment opportunities.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 5 2010-10-01 2010-10-01 false Equal employment opportunities. 101.311 Section 101.311 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) SAFETY AND SPECIAL RADIO SERVICES FIXED MICROWAVE SERVICES Miscellaneous Common Carrier Provisions § 101.311 Equal...

  7. 47 CFR 101.311 - Equal employment opportunities.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 5 2014-10-01 2014-10-01 false Equal employment opportunities. 101.311 Section 101.311 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) SAFETY AND SPECIAL RADIO SERVICES FIXED MICROWAVE SERVICES Miscellaneous Common Carrier Provisions § 101.311 Equal...

  8. 47 CFR 101.311 - Equal employment opportunities.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 5 2013-10-01 2013-10-01 false Equal employment opportunities. 101.311 Section 101.311 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) SAFETY AND SPECIAL RADIO SERVICES FIXED MICROWAVE SERVICES Miscellaneous Common Carrier Provisions § 101.311 Equal...

  9. 47 CFR 101.311 - Equal employment opportunities.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 5 2011-10-01 2011-10-01 false Equal employment opportunities. 101.311 Section 101.311 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) SAFETY AND SPECIAL RADIO SERVICES FIXED MICROWAVE SERVICES Miscellaneous Common Carrier Provisions § 101.311 Equal...

  10. Equal Education Opportunity for All the Visually Handicapped.

    ERIC Educational Resources Information Center

    Association for Education of the Visually Handicapped, Philadelphia, PA.

    The booklet contains 12 selected papers on equal educational opportunity for visually handicapped (VH) children that were presented at the 52nd biennial conference (June, 1974) of the Association for Education of the Visually Handicapped. Included are presentations on the following: the history and concept of equal educational opportunity (B.…

  11. Teaching the Substitutive Conception of the Equals Sign

    ERIC Educational Resources Information Center

    Jones, Ian; Inglis, Matthew; Gilmore, Camilla; Evans, Rhys

    2013-01-01

    A cumulative body of research has shown that children typically shift from an "operational" to a "relational" conception of the equals sign as they move through schooling. Jones (2008) argued that a truly relational conception of the equals sign comprises a "substitutive" component and a "sameness"…

  12. District Power Equalizing: Cure-All or Prescription?

    ERIC Educational Resources Information Center

    Phelps, James L.; Addonizio, Michael F.

    1981-01-01

    Discusses the conflicting goals of school finance reform--educational equity, equality, and excellence--and their effects on the district power equalizing (DPE) method of achieving school equity. Describes the experiences with DPE in California, Ohio, Illinois, and Michigan. (RW)

  13. Swedish Schools and Gender Equality in the 1970s

    ERIC Educational Resources Information Center

    Hedlin, Maria

    2013-01-01

    In Sweden, as in many countries before Sweden, boys' academic achievements are getting considerable attention as the big gender issue. The Swedish gender equality policy that was put on the agenda in the 1970s is now associated with extreme discussions. This study aims to explore how gender equality was discussed in the 1970s, in connection with…

  14. Conceptualising Gender Equality in Research on Education Quality

    ERIC Educational Resources Information Center

    Aikman, Sheila; Halai, Anjum; Rubagiza, Jolly

    2011-01-01

    This article sets out to re-conceptualise gender equality in education quality. Four approaches to conceptualising gender equitable education quality are identified in the literature: human capital theory with a focus on parity and sameness for all; a human rights and power perspective, within which gender equality is viewed as transforming unjust…

  15. 41 CFR 60-1.4 - Equal opportunity clause.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 41 Public Contracts and Property Management 1 2011-07-01 2009-07-01 true Equal opportunity clause. 60-1.4 Section 60-1.4 Public Contracts and Property Management Other Provisions Relating to Public... 1-OBLIGATIONS OF CONTRACTORS AND SUBCONTRACTORS Preliminary Matters; Equal Opportunity...

  16. 41 CFR 60-1.4 - Equal opportunity clause.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 41 Public Contracts and Property Management 1 2013-07-01 2013-07-01 false Equal opportunity clause. 60-1.4 Section 60-1.4 Public Contracts and Property Management Other Provisions Relating to Public... 1-OBLIGATIONS OF CONTRACTORS AND SUBCONTRACTORS Preliminary Matters; Equal Opportunity...

  17. Planning Human Resource Development through Equal Opportunities. A Handbook.

    ERIC Educational Resources Information Center

    Warwick, Jill

    This handbook is intended for managers who wish to develop human resources in their organizations, particularly where women are currently underrepresented. It provides a positive model for the successful equal opportunities manager and a checklist of activities that will lead to the successful implementation of equal opportunities. The handbook…

  18. 41 CFR 60-250.5 - Equal opportunity clause.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... REGARDING SPECIAL DISABLED VETERANS, VETERANS OF THE VIETNAM ERA, RECENTLY SEPARATED VETERANS, AND OTHER PROTECTED VETERANS Preliminary Matters, Equal Opportunity Clause § 60-250.5 Equal opportunity clause. (a... Disabled Veterans, Veterans of the Vietnam Era, Recently Separated Veterans, and Other Protected...

  19. 41 CFR 60-300.5 - Equal opportunity clause.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... REGARDING DISABLED VETERANS, RECENTLY SEPARATED VETERANS, OTHER PROTECTED VETERANS, AND ARMED FORCES SERVICE MEDAL VETERANS Preliminary Matters, Equal Opportunity Clause § 60-300.5 Equal opportunity clause. (a... VETERANS, RECENTLY SEPARATED VETERANS, OTHER PROTECTED VETERANS, AND ARMED FORCES SERVICE MEDAL VETERANS...

  20. 12 CFR 626.6025 - Equal housing lender poster.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 626.6025 Banks and Banking FARM CREDIT ADMINISTRATION FARM CREDIT SYSTEM NONDISCRIMINATION IN LENDING § 626.6025 Equal housing lender poster. (a) Each Farm Credit institution that makes loans for the... Secretary for Fair Housing and Equal Opportunity, Department of Housing and Urban Development,...