Frequency distribution histograms for the rapid analysis of data
NASA Technical Reports Server (NTRS)
Burke, P. V.; Bullen, B. L.; Poff, K. L.
1988-01-01
The mean and standard error are good representations for the response of a population to an experimental parameter and are frequently used for this purpose. Frequency distribution histograms show, in addition, responses of individuals in the population. Both the statistics and a visual display of the distribution of the responses can be obtained easily using a microcomputer and available programs. The type of distribution shown by the histogram may suggest different mechanisms to be tested.
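The statistics described above are simple to compute; the following is a minimal Python sketch (toy data, not the paper's microcomputer programs) of the mean, standard error, and a frequency-distribution histogram:

```python
# Sketch: mean, standard error of the mean, and a frequency-distribution
# histogram for a small sample. Data values are hypothetical.
from math import sqrt
from collections import Counter

def mean_and_se(xs):
    n = len(xs)
    m = sum(xs) / n
    var = sum((x - m) ** 2 for x in xs) / (n - 1)  # sample variance
    return m, sqrt(var / n)                        # standard error of the mean

def frequency_histogram(xs, bin_width):
    # Count how many observations fall into each bin of the given width.
    return Counter(int(x // bin_width) * bin_width for x in xs)

data = [4.1, 4.9, 5.2, 5.8, 6.1, 6.3, 7.0, 7.4]
m, se = mean_and_se(data)
hist = frequency_histogram(data, bin_width=1.0)
```

The histogram keeps the individual-level structure that the mean and standard error summarize away.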
Using Computer Graphics in Statistics.
ERIC Educational Resources Information Center
Kerley, Lyndell M.
1990-01-01
Described is software which allows a student to use simulation to produce analytical output as well as graphical results. The results include a frequency histogram of a selected population distribution, a frequency histogram of the distribution of the sample means, and a test of the normality of the distribution of the sample means. (KR)
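The classroom simulation described can be sketched in a few lines; this is a hedged illustration (a skewed toy population, not the DISCUSS software itself) of how the sample means cluster around the population mean:

```python
# Sketch: draw repeated samples from a skewed population and histogram the
# sample means, illustrating the central limit theorem the exercise targets.
import random
from collections import Counter

random.seed(1)
population = [random.expovariate(1.0) for _ in range(10_000)]  # skewed, mean ~1

def sample_means(pop, n, reps):
    return [sum(random.sample(pop, n)) / n for _ in range(reps)]

means = sample_means(population, n=30, reps=500)
hist_of_means = Counter(round(m, 1) for m in means)
grand_mean = sum(means) / len(means)
```

The histogram of `means` is far more symmetric than the population histogram, which is the point of the exercise.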
Spline smoothing of histograms by linear programming
NASA Technical Reports Server (NTRS)
Bennett, J. O.
1972-01-01
An algorithm is presented for obtaining an approximating function to the frequency distribution from a sample of size n. To obtain the approximating function, a histogram is first made from the data. Next, Euclidean space approximations to the graph of the histogram, using central B-splines as basis elements, are obtained by linear programming. The approximating function is nonnegative and has unit area.
NASA Astrophysics Data System (ADS)
Galich, Nikolay E.
2008-07-01
This communication describes the treatment of immunology data. New nonlinear methods for the statistical analysis of immunofluorescence in peripheral blood neutrophils have been developed. We used the respiratory burst reaction of DNA fluorescence in neutrophil cell nuclei, arising from oxidative activity. Histograms of the photon count statistics of radiant neutrophil populations in flow cytometry experiments are considered, and the frequency distributions of fluorescence flashes as functions of fluorescence intensity are analyzed. Statistical peculiarities of the histogram set for pregnant women allow all histograms to be divided into three classes. The classification is based on three different types of smoothed, long-range-scale-averaged immunofluorescence distributions, together with their bifurcations and wavelet spectra. Heterogeneity features of the long-range-scale immunofluorescence distributions and of the wavelet spectra divide the histograms into three groups: the first belongs to healthy donors, and the other two to donors with autoimmune and inflammatory diseases, some of which are not diagnosed by standard biochemical methods. Medical standards and the statistical data of the immunofluorescence histograms used to identify health and illness are interconnected. Peculiarities of immunofluorescence in pregnant women are classified, and criteria for health or illness are connected with statistical features of the immunofluorescence histograms. The fluorescence of neutrophil populations is a clear, sensitive indicator of health status.
NASA Astrophysics Data System (ADS)
Galich, Nikolay E.; Filatov, Michael V.
2008-07-01
This communication describes immunology experiments and the treatment of the experimental data. New nonlinear methods for the statistical analysis of immunofluorescence in peripheral blood neutrophils have been developed. We used the respiratory burst reaction of DNA fluorescence in neutrophil cell nuclei, arising from oxidative activity. Histograms of the photon count statistics of radiant neutrophil populations in flow cytometry experiments are considered, and the frequency distributions of fluorescence flashes as functions of fluorescence intensity are analyzed. Statistical peculiarities of the histogram sets for healthy and unhealthy donors allow all histograms to be divided into three classes. The classification is based on three different types of smoothed, long-range-scale-averaged immunofluorescence distributions and their bifurcations. Heterogeneity features of the long-range-scale immunofluorescence distributions divide the histograms into three groups: the first belongs to healthy donors, and the other two to donors with autoimmune and inflammatory diseases, some of which are not diagnosed by standard biochemical methods. Medical standards and the statistical data of the immunofluorescence histograms used to identify health and illness are interconnected. The possibilities and alterations of immunofluorescence statistics in the registration, diagnostics and monitoring of different diseases under various medical treatments have been demonstrated. Criteria for health or illness are connected with statistical features of the immunofluorescence histograms. The fluorescence of neutrophil populations is a clear, sensitive indicator of health status.
Color Histogram Diffusion for Image Enhancement
NASA Technical Reports Server (NTRS)
Kim, Taemin
2011-01-01
Various color histogram equalization (CHE) methods have been proposed to extend grayscale histogram equalization (GHE) to color images. In this paper, a new method called histogram diffusion, which extends the GHE method to arbitrary dimensions, is proposed. Ranges in a histogram are specified as overlapping bars of uniform height and variable width, the widths being proportional to their frequencies; this diagram is called the vistogram. As an alternative approach to GHE, the squared error of the vistogram from the uniform distribution is minimized. Each bar in the vistogram is approximated by a Gaussian function, and the Gaussian particles in the vistogram diffuse as a nonlinear autonomous system of ordinary differential equations. CHE results on color images showed that the approach is effective.
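For reference, the GHE baseline that the paper generalizes can be sketched compactly; this is a minimal illustration with a toy pixel list, not the diffusion-based color method itself:

```python
# Sketch of grayscale histogram equalization (GHE): map each gray level
# through the normalized cumulative histogram, stretching to the full range.
from collections import Counter

def equalize(pixels, levels=256):
    hist = Counter(pixels)
    cdf, total, n = {}, 0, len(pixels)
    for level in sorted(hist):          # cumulative distribution per level
        total += hist[level]
        cdf[level] = total / n
    return [round(cdf[p] * (levels - 1)) for p in pixels]

flat = equalize([10, 50, 50, 200, 200])  # toy 5-pixel "image"
```

Applying this channel-wise to color images is exactly what the vistogram-diffusion method is designed to improve on.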
NASA Astrophysics Data System (ADS)
Csillik, O.; Evans, I. S.; Drăguţ, L.
2015-03-01
Automated procedures are developed to alleviate long tails in frequency distributions of morphometric variables. They minimize the skewness of slope gradient frequency distributions, and modify the kurtosis of profile and plan curvature distributions toward that of the Gaussian (normal) model. Box-Cox (for slope) and arctangent (for curvature) transformations are tested on nine digital elevation models (DEMs) of varying origin and resolution, and different landscapes, and shown to be effective. Resulting histograms are illustrated and show considerable improvements over those for previously recommended slope transformations (sine, square root of sine, and logarithm of tangent). Unlike previous approaches, the proposed method evaluates the frequency distribution of slope gradient values in a given area and applies the most appropriate transform if required. Sensitivity of the arctangent transformation is tested, showing that Gaussian-kurtosis transformations are acceptable also in terms of histogram shape. Cube root transformations of curvatures produced bimodal histograms. The transforms are applicable to morphometric variables and many others with skewed or long-tailed distributions. By avoiding long tails and outliers, they permit parametric statistics such as correlation, regression and principal component analyses to be applied, with greater confidence that requirements for linearity, additivity and even scatter of residuals (constancy of error variance) are likely to be met. It is suggested that such transformations should be routinely applied in all parametric analyses of long-tailed variables. Our Box-Cox and curvature automated transformations are based on a Python script, implemented as an easy-to-use script tool in ArcGIS.
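The skewness-minimizing transformation can be illustrated with a simple grid search; the following is a hedged sketch (pure Python on toy slope values, not the authors' ArcGIS script tool):

```python
# Sketch: pick the Box-Cox lambda that minimizes the skewness of a
# positive-valued sample, by grid search over lambda in [-2, 2].
from math import log, sqrt

def skewness(xs):
    n = len(xs)
    m = sum(xs) / n
    s = sqrt(sum((x - m) ** 2 for x in xs) / n)
    return sum(((x - m) / s) ** 3 for x in xs) / n

def boxcox(xs, lam):
    # Box-Cox transform; lambda = 0 is the log transform.
    return [log(x) if lam == 0 else (x ** lam - 1) / lam for x in xs]

def best_lambda(xs, grid=None):
    grid = grid or [i / 10 for i in range(-20, 21)]
    return min(grid, key=lambda lam: abs(skewness(boxcox(xs, lam))))

slopes = [0.5, 1.0, 1.5, 2.0, 3.0, 5.0, 9.0, 17.0]  # long-tailed toy sample
lam = best_lambda(slopes)
```

Since lambda = 1 leaves skewness unchanged, the search can only reduce (or match) the original skewness, which mirrors the paper's "apply the most appropriate transform if required" logic.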
Univariate and Bivariate Loglinear Models for Discrete Test Score Distributions.
ERIC Educational Resources Information Center
Holland, Paul W.; Thayer, Dorothy T.
2000-01-01
Applied the theory of exponential families of distributions to the problem of fitting the univariate histograms and discrete bivariate frequency distributions that often arise in the analysis of test scores. Considers efficient computation of the maximum likelihood estimates of the parameters using Newton's Method and computationally efficient…
Redshift data and statistical inference
NASA Technical Reports Server (NTRS)
Newman, William I.; Haynes, Martha P.; Terzian, Yervant
1994-01-01
Frequency histograms and the 'power spectrum analysis' (PSA) method, the latter developed by Yu & Peebles (1969), have been widely employed as techniques for establishing the existence of periodicities. We provide a formal analysis of these two classes of methods, including controlled numerical experiments, to better understand their proper use and application. In particular, we note that typical published applications of frequency histograms commonly employ far greater numbers of class intervals or bins than statistical theory advises, sometimes giving rise to the appearance of spurious patterns. The PSA method generates a sequence of random numbers from observational data which, it is claimed, is exponentially distributed with unit mean and variance, essentially independent of the distribution of the original data. We show that the derived random process is nonstationary and produces a small but systematic bias in the usual estimate of the mean and variance. Although the derived variable may be reasonably described by an exponential distribution, the tail of the distribution is far removed from that of an exponential, thereby rendering statistical inference and confidence testing based on the tail of the distribution completely unreliable. Finally, we examine a number of astronomical examples wherein these methods have been used, giving rise to widespread acceptance of statistically unconfirmed conclusions.
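The binning caution above can be made concrete with standard rules of thumb; this sketch (toy data, not the paper's analysis) shows two common recommendations for the number of class intervals:

```python
# Sketch: Sturges and Freedman-Diaconis bin-count rules, which typically
# recommend far fewer bins than ad hoc periodicity searches use.
from math import ceil, log2

def sturges_bins(n):
    return ceil(log2(n)) + 1

def freedman_diaconis_bins(xs):
    ys = sorted(xs)
    n = len(ys)
    q1, q3 = ys[n // 4], ys[(3 * n) // 4]      # crude quartiles
    width = 2 * (q3 - q1) / n ** (1 / 3)       # FD bin width
    return max(1, ceil((ys[-1] - ys[0]) / width))

redshifts = [i / 200 for i in range(200)]      # 200 toy values in [0, 1)
k_sturges = sturges_bins(len(redshifts))
k_fd = freedman_diaconis_bins(redshifts)
```

For 200 values both rules suggest under ten bins; histograms drawn with dozens of bins on samples this size invite the spurious patterns the authors describe.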
Measurement of Device Parameters Using Image Recovery Techniques in Large-Scale IC Devices
NASA Technical Reports Server (NTRS)
Scheick, Leif; Edmonds, Larry
2004-01-01
Devices that respond to radiation on a cell level will produce histograms showing the relative frequency of cell damage as a function of damage. The measured distribution is the convolution of distributions from radiation responses, measurement noise, and manufacturing parameters. A method of extracting device characteristics and parameters from measured distributions via mathematical and image subtraction techniques is described.
Approximate sample sizes required to estimate length distributions
Miranda, L.E.
2007-01-01
The sample sizes required to estimate fish length were determined by bootstrapping from reference length distributions. Depending on population characteristics and species-specific maximum lengths, 1-cm length-frequency histograms required 375-1,200 fish to estimate within 10% with 80% confidence, 2.5-cm histograms required 150-425 fish, proportional stock density required 75-140 fish, and mean length required 75-160 fish. In general, smaller species, smaller populations, populations with higher mortality, and simpler length statistics required fewer samples. Indices that require low sample sizes may be suitable for monitoring population status, and when large changes in length are evident, additional sampling effort may be allocated to more precisely define length status with more informative estimators. ?? Copyright by the American Fisheries Society 2007.
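The bootstrap logic behind these sample-size estimates can be sketched as follows; this is a hedged toy version (simulated lengths, not the paper's reference distributions):

```python
# Sketch: bootstrap from a reference length distribution and estimate the
# fraction of resampled mean lengths falling within 10% of the true mean.
import random

random.seed(7)
reference = [random.gauss(30, 8) for _ in range(5000)]   # toy lengths, cm
target = sum(reference) / len(reference)

def within_10pct(n, reps=200):
    hits = 0
    for _ in range(reps):
        sample = [random.choice(reference) for _ in range(n)]
        est = sum(sample) / n
        hits += abs(est - target) / target <= 0.10       # bool counts as 0/1
    return hits / reps

coverage_small = within_10pct(10)
coverage_large = within_10pct(150)
```

Increasing n until the coverage reaches the desired confidence (80% in the paper) yields the required sample size for a given estimator.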
Statistical distribution of wind speeds and directions globally observed by NSCAT
NASA Astrophysics Data System (ADS)
Ebuchi, Naoto
1999-05-01
In order to validate wind vectors derived from the NASA scatterometer (NSCAT), statistical distributions of wind speeds and directions over the global oceans are investigated by comparison with European Centre for Medium-Range Weather Forecasts (ECMWF) wind data. Histograms of wind speeds and directions are calculated from the preliminary and reprocessed NSCAT data products for a period of 8 weeks. For the wind speeds of the preliminary data products, an excess of low winds is identified through comparison with ECMWF winds: a hump on the low-speed side of the peak in the wind speed histogram is discernible, and its shape varies with incidence angle. Incompleteness of the prelaunch geophysical model function, SASS 2, tentatively used to retrieve the wind vectors of the preliminary data products, is considered to cause this skew of the wind speed distribution. By contrast, the wind speed histograms of the reprocessed data products show consistent features over the whole range of incidence angles. The frequency distribution of wind directions relative to the spacecraft flight direction is calculated to assess the self-consistency of the wind directions. It is found that wind vectors of the preliminary data products exhibit a systematic directional preference relative to the antenna beams; this artificial directivity is also considered to be caused by imperfections in the geophysical model function. The directional distributions of the reprocessed wind vectors show less directivity and consistent features, except for very low wind cases.
Statistical Analysis of Spectral Properties and Prosodic Parameters of Emotional Speech
NASA Astrophysics Data System (ADS)
Přibil, J.; Přibilová, A.
2009-01-01
The paper addresses the reflection of microintonation and spectral properties in male and female acted emotional speech. The microintonation component of speech melody is analyzed with regard to its spectral and statistical parameters. According to psychological research on emotional speech, different emotions are accompanied by different spectral noise. We control its amount by spectral flatness, according to which high-frequency noise is mixed into voiced frames during cepstral speech synthesis. Our experiments are aimed at statistical analysis of cepstral coefficient values and ranges of spectral flatness in three emotions (joy, sadness, anger) and a neutral state for comparison. Calculated histograms of the spectral flatness distribution are visually compared and modelled by a Gamma probability distribution. Histograms of the cepstral coefficient distribution are evaluated and compared using skewness and kurtosis. The statistical results show good agreement between male and female voices for all emotional states, portrayed by several Czech and Slovak professional actors.
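Fitting a Gamma model to a flatness histogram can be done several ways; the following is a sketch assuming simple moment matching (toy flatness values, not the authors' fitting procedure):

```python
# Sketch: fit Gamma(shape k, scale theta) to a sample by matching moments,
# using k = mean^2 / var and theta = var / mean.
def gamma_moment_fit(xs):
    n = len(xs)
    m = sum(xs) / n
    var = sum((x - m) ** 2 for x in xs) / n
    return m * m / var, var / m     # shape k, scale theta

flatness = [0.1, 0.15, 0.2, 0.2, 0.25, 0.3, 0.4, 0.6]  # hypothetical values
k, theta = gamma_moment_fit(flatness)
```

By construction the fitted model reproduces the sample mean (k * theta), so the fit can be visually overlaid on the histogram for comparison.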
Statistical analysis of landing contact conditions for three lifting body research vehicles
NASA Technical Reports Server (NTRS)
Larson, R. R.
1972-01-01
The landing contact conditions for the HL-10, M2-F2/F3, and the X-24A lifting body vehicles are analyzed statistically for 81 landings. The landing contact parameters analyzed are true airspeed, peak normal acceleration at the center of gravity, roll angle, and roll velocity. Ground measurement parameters analyzed are lateral and longitudinal distance from intended touchdown, lateral distance from touchdown to full stop, and rollout distance. The results are presented in the form of histograms for frequency distributions and cumulative frequency distribution probability curves with a Pearson Type 3 curve fit for extrapolation purposes.
NASA Astrophysics Data System (ADS)
Fu, Z.; Qin, Q.; Wu, C.; Chang, Y.; Luo, B.
2017-09-01
Due to differences in imaging principles, matching between visible and thermal infrared images still presents challenges and difficulties. Inspired by the complementary spatial and frequency information of geometric structural features, a robust descriptor is proposed for matching visible and thermal infrared images. We first divide the region around an interest point into two spatial regions: a histogram of oriented magnitudes, corresponding to 2-D structural shape information, describes the larger region, and an edge orientation histogram describes the spatial distribution of the smaller region. The two vectors are then normalized and combined into a higher-dimensional feature vector. Finally, the proposed descriptor is obtained by applying principal component analysis (PCA) to reduce the dimension of the combined feature vector, making the descriptor more robust. Experimental results show significant improvements in the number of correct matches and clear advantages from complementing spatial and frequency structural information.
Histograms and Frequency Density.
ERIC Educational Resources Information Center
Micromath, 2003
2003-01-01
Introduces exercises on histograms and frequency density. Guides pupils to Discovering Important Statistical Concepts Using Spreadsheets (DISCUSS), created at the University of Coventry. Includes curriculum points, teaching tips, activities, and internet address (http://www.coventry.ac.uk/discuss/). (KHR)
Basic statistics with Microsoft Excel: a review.
Divisi, Duilio; Di Leonardo, Gabriella; Zaccagna, Gino; Crisci, Roberto
2017-06-01
The scientific world is enriched daily with new knowledge, owing to new technologies and continuous discoveries. Mathematical functions underlie statistical concepts, particularly the mean, median and mode, along with frequency and the frequency distributions associated with histograms and graphical representations, and they drive the computations carried out by spreadsheet operations. The aim of this study is to highlight the mathematical basis of the statistical models that govern spreadsheet operation in Microsoft Excel.
Histogram equalization with Bayesian estimation for noise robust speech recognition.
Suh, Youngjoo; Kim, Hoirin
2018-02-01
The histogram equalization approach is an efficient feature normalization technique for noise-robust automatic speech recognition. However, it suffers from performance degradation when some fundamental conditions are not satisfied in the test environment. To remedy these limitations of the original histogram equalization methods, a class-based histogram equalization approach has been proposed. Although this approach showed substantial performance improvement under noise environments, it still suffers from performance degradation due to overfitting when test data are insufficient. To address this issue, the proposed histogram equalization technique employs Bayesian estimation in estimating the test cumulative distribution function. A previous study on the Aurora-4 task reported that the proposed approach provided substantial performance gains in speech recognition systems based on acoustic modeling with the Gaussian mixture model-hidden Markov model. In this work, the proposed approach is examined in speech recognition systems with the deep neural network-hidden Markov model (DNN-HMM), the current mainstream speech recognition approach, where it also shows meaningful performance improvement over the conventional maximum likelihood estimation-based method. The fusion of the proposed features with the mel-frequency cepstral coefficients provides additional performance gains in DNN-HMM systems, which otherwise suffer from performance degradation in the clean test condition.
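The core normalization step (without the Bayesian refinement) can be sketched as an empirical CDF mapping; this is a minimal illustration on toy feature values, not the paper's method:

```python
# Sketch: histogram equalization of features. Map each test value through
# its empirical CDF, then through the inverse CDF of a reference
# (training) distribution, aligning the two feature distributions.
def empirical_cdf(xs):
    ys = sorted(xs)
    n = len(ys)
    return lambda x: sum(y <= x for y in ys) / n

def inverse_cdf(reference):
    ys = sorted(reference)
    n = len(ys)
    return lambda p: ys[min(n - 1, int(p * n))]

def equalize(test, reference):
    F = empirical_cdf(test)         # test CDF (the part estimated
    Ginv = inverse_cdf(reference)   # Bayesianly in the paper)
    return [Ginv(F(x)) for x in test]

ref = [0.0, 1.0, 2.0, 3.0]
mapped = equalize([10.0, 20.0, 30.0, 40.0], ref)
```

The paper's contribution addresses exactly the weakness visible here: with few test samples, the empirical CDF `F` is a poor estimate, which Bayesian estimation regularizes.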
NASA Astrophysics Data System (ADS)
Keilbach, D.; Drews, C.; Berger, L.; Marsch, E.; Wimmer-Schweingruber, R. F.
2017-12-01
Using a test particle approach, we have investigated how an oxygen pickup ion torus velocity distribution is modified by continuous and intermittent Alfvénic waves on timescales where the gyro trajectory of each particle can be traced. We therefore exposed the test particles to monochromatic waves that extended through the whole simulation in time and space. The general behavior of the pitch angle distribution is found to be stationary and a nonlinear function of the wave frequency, the wave amplitude, and the initial angle between the wave elongation and the field-perpendicular particle velocity vector. The figure shows the time-averaged pitch angle distributions as a function of the Doppler-shifted wave frequency (where the Doppler shift was calculated with respect to the particles' initial velocity) for three different wave amplitudes (labeled in each panel). The background field is chosen to be 5 nT, and the 500 test particles were initially distributed on a torus with 120° pitch angle at a solar wind velocity of 450 km/s. Each y-slice of the histogram (normalized to its respective maximum) represents an individual run of the simulation. The frequency-dependent behavior of the test particles is found to be classifiable into the regimes of very low/high frequencies and frequencies close to first-order resonance. We have found that only in the latter regime do the particles interact strongly with the wave: the time-averaged histograms show a branch structure, identified as a trace of particles co-moving with the wave phase. The magnitude of the pitch angle change of these particles, as well as the frequency margin over which the branch structure is found, increases with the wave amplitude. We have also investigated the interaction with monochromatic intermittent waves. Exposed to such waves, a torus distribution is scattered in pitch angle space, and the pitch angle distribution broadens systematically over time, similar to pitch angle diffusion. The framework of our simulations is a first step toward understanding wave-particle interactions at the most basic level and is readily expandable to, e.g., the inclusion of multiple wave frequencies, intermittent wave activity, gradients in the background magnetic field, or collisions with solar wind particles.
Investigating Student Understanding of Histograms
ERIC Educational Resources Information Center
Kaplan, Jennifer J.; Gabrosek, John G.; Curtiss, Phyllis; Malone, Chris
2014-01-01
Histograms are adept at revealing the distribution of data values, especially the shape of the distribution and any outlier values. They are included in introductory statistics texts, research methods texts, and in the popular press, yet students often have difficulty interpreting the information conveyed by a histogram. This research identifies…
ERIC Educational Resources Information Center
CASE, C. MARSTON
This paper is concerned with the graphic presentation and analysis of grouped observations. It presents a method and supporting theory for the construction of an area-conserving, minimal-length frequency polygon corresponding to a given histogram. Traditionally, the concept of a frequency polygon corresponding to a given histogram has referred to that…
A dynamic re-partitioning strategy based on the distribution of key in Spark
NASA Astrophysics Data System (ADS)
Zhang, Tianyu; Lian, Xin
2018-05-01
Spark is a memory-based distributed data processing framework with the ability to process massive data, and it has become a focus in Big Data. However, the performance of a Spark shuffle depends on the distribution of the data: the naive hash partition function of Spark cannot guarantee load balancing when the data are skewed, and the job time is dominated by the node with the most data to process. To handle this problem, dynamic sampling is used. During task execution, a histogram counts the key frequency distribution on each node, from which the global key frequency distribution is generated. After analyzing the distribution of keys, load balance across data partitions is achieved. Results show that the dynamic re-partitioning function outperforms the default hash partition, the fine partition and the balanced-schedule strategy, reducing task execution time and improving the efficiency of the whole cluster.
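The key-frequency-driven assignment can be sketched outside Spark; this is a hedged illustration in plain Python (a greedy balancer over a sampled key histogram, not the paper's Spark implementation):

```python
# Sketch: assign keys to partitions greedily by sampled frequency, heaviest
# keys first, so skewed keys do not pile onto one node as naive hash
# partitioning would.
from collections import Counter

def balanced_partitions(keys, num_partitions):
    freq = Counter(keys)                   # sampled key-frequency histogram
    loads = [0] * num_partitions
    assignment = {}
    for key, count in freq.most_common():  # heaviest keys first
        target = loads.index(min(loads))   # least-loaded partition so far
        assignment[key] = target
        loads[target] += count
    return assignment, loads

keys = ["a"] * 8 + ["b"] * 5 + ["c"] * 4 + ["d"] * 3   # skewed toy workload
assignment, loads = balanced_partitions(keys, 2)
```

A naive hash could put "a" and "b" on the same partition (13 vs 7); the frequency-aware assignment keeps the loads close.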
ERIC Educational Resources Information Center
Gratzer, William; Carpenter, James E.
2008-01-01
This article demonstrates an alternative approach to the construction of histograms--one based on the notion of using area to represent relative density in intervals of unequal length. The resulting histograms illustrate the connection between the area of the rectangles associated with particular outcomes and the relative frequency (probability)…
Yin, T C; Kuwada, S
1983-10-01
We used the binaural beat stimulus to study the interaural phase sensitivity of inferior colliculus (IC) neurons in the cat. The binaural beat, produced by delivering tones of slightly different frequencies to the two ears, generates continuous and graded changes in interaural phase. Over 90% of the cells that exhibit a sensitivity to changes in the interaural delay also show a sensitivity to interaural phase disparities with the binaural beat. Cells respond with a burst of impulses with each complete cycle of the beat frequency. The period histogram obtained by binning the poststimulus time histogram on the beat frequency gives a measure of the interaural phase sensitivity of the cell. In general, there is good correspondence in the shapes of the period histograms generated from binaural beats and the interaural phase curves derived from interaural delays and in the mean interaural phase angle calculated from them. The magnitude of the beat frequency determines the rate of change of interaural phase and the sign determines the direction of phase change. While most cells respond in a phase-locked manner up to beat frequencies of 10 Hz, there are some cells that will phase-lock up to 80 Hz. Beat frequency and mean interaural phase angle are linearly related for most cells. Most cells respond equally in the two directions of phase change and with different rates of change, at least up to 10 Hz. However, some IC cells exhibit marked sensitivity to the speed of phase change, either responding more vigorously at low beat frequencies or at high beat frequencies. In addition, other cells demonstrate a clear directional sensitivity. The cells that show sensitivity to the direction and speed of phase changes would be expected to demonstrate a sensitivity to moving sound sources in the free field.
Changes in the mean interaural phase of the binaural beat period histograms are used to determine the effects of changes in average and interaural intensity on the phase sensitivity of the cells. The effects of both forms of intensity variation are continuously distributed. The binaural beat offers a number of advantages for studying the interaural phase sensitivity of binaural cells. The dynamic characteristics of the interaural phase can be varied so that the speed and direction of phase change are under direct control. The data can be obtained in a much more efficient manner, as the binaural beat is about 10 times faster in terms of data collection than the interaural delay.
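The period histogram described above is simply a phase-folded spike count; the following is a sketch with toy spike times (not the cat IC recordings):

```python
# Sketch of a period histogram: fold spike times on the beat period and
# bin the resulting phases in [0, 1).
def period_histogram(spike_times, beat_freq_hz, n_bins=8):
    period = 1.0 / beat_freq_hz
    counts = [0] * n_bins
    for t in spike_times:
        phase = (t % period) / period       # phase within the beat cycle
        counts[int(phase * n_bins)] += 1
    return counts

# Toy spikes locked near phase ~0.25 of a 2 Hz beat (period 0.5 s).
spikes = [0.12, 0.13, 0.62, 0.63, 1.12, 1.14]
counts = period_histogram(spikes, beat_freq_hz=2.0)
```

A concentration of counts in a few adjacent bins, as here, indicates phase locking to the beat; the circular mean of the binned phases gives the mean interaural phase angle.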
[The value of spectral frequency analysis by Doppler examination (author's transl)].
Boccalon, H; Reggi, M; Lozes, A; Canal, C; Jausseran, J M; Courbier, R; Puel, P; Enjalbert, A
1981-01-01
Arterial stenoses of moderate extent may involve modifications of the blood flow. Arterial shading is not always examined at the best angle of incidence to assess the extent of the stenosis. Spectral frequency analysis by Doppler examination is a good means of evaluating the effect of moderate arterial lesions. The present study was carried out with a Doppler device whose acoustic spectrum is displayed as a histogram of 16 frequency bands. The values were recorded on the two femoral arteries. A study was also made of 49 normal subjects so as to establish a normal envelope histogram, taking into account the following parameters: maximum peak (800 Hz), low cut-off frequency (420 Hz), and high cut-off frequency (2,600 Hz); a first peak was found to be present in 81% of the subjects (at 375 Hz) and a second peak in 75% of the subjects (2,020 Hz). Thirteen patients with iliac lesions of differing extent were included in the study; the details of these lesions were established in all cases by aortography. None of the recorded frequency histograms were located within the normal envelope. Two cases of moderate iliac stenosis (less than 50% of the diameter) were noted that interfered with the histogram even though the femoral velocity signal was normal.
Massoudieh, Arash; Visser, Ate; Sharifi, Soroosh; ...
2013-10-15
Because groundwaters of different ages mix in aquifers, groundwater age is more appropriately represented by a distribution than by a scalar number. To infer a groundwater age distribution from environmental tracers, a mathematical form is often assumed for the shape of the distribution, and the parameters of that mathematical distribution are estimated using deterministic or stochastic inverse methods. We found that prescribing the mathematical form limits the exploration of the age distribution to the shapes that the selected distribution can describe. In this paper, the use of freeform histograms as groundwater age distributions is evaluated. A Bayesian Markov chain Monte Carlo approach is used to estimate the fraction of groundwater in each histogram bin. This method was able to capture the shape of a hypothetical gamma distribution from the concentrations of four age tracers. The number of bins that can be considered in this approach is limited by the number of tracers available. The histogram method was also tested on tracer data sets from Holten (The Netherlands; 3H, 3He, 85Kr, 39Ar) and the La Selva Biological Station (Costa Rica; SF6, CFCs, 3H, 4He and 14C), and compared to a number of mathematical forms. According to standard Bayesian measures of model goodness, the best mathematical distribution performs better than the histogram distributions in terms of the ability to capture the observed tracer data relative to their complexity. Among the histogram distributions, the four-bin histogram performs better in most of the cases. The Monte Carlo simulations showed strong correlations in the posterior estimates of bin contributions, indicating that these bins cannot be well constrained using the available age tracers.
The fact that the mathematical forms overall perform better than the freeform histogram does not undermine the benefit of the freeform approach, especially for cases where a larger amount of observed data is available and where the real groundwater age distribution is more complex than can be represented by simple mathematical forms.
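The forward model behind the freeform-histogram idea can be sketched in one line of arithmetic; this is a hedged illustration with hypothetical numbers, not the paper's Bayesian MCMC machinery:

```python
# Sketch: with the histogram bin fractions as unknowns, a predicted tracer
# concentration is the fraction-weighted average of the tracer's input
# concentration per age bin; the inverse problem estimates the fractions.
def predicted_concentration(bin_fractions, tracer_input_by_bin):
    assert abs(sum(bin_fractions) - 1.0) < 1e-9   # fractions must sum to 1
    return sum(f * c for f, c in zip(bin_fractions, tracer_input_by_bin))

# Four age bins; a decaying tracer concentration per bin (hypothetical).
fractions = [0.4, 0.3, 0.2, 0.1]
tracer = [1.0, 0.5, 0.25, 0.125]
c_pred = predicted_concentration(fractions, tracer)
```

With one such equation per tracer, the number of resolvable bins is bounded by the number of tracers, which is the limitation the abstract notes.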
DOE Office of Scientific and Technical Information (OSTI.GOV)
Massoudieh, Arash; Visser, Ate; Sharifi, Soroosh
The mixing of groundwaters with different ages in aquifers, groundwater age is more appropriately represented by a distribution rather than a scalar number. To infer a groundwater age distribution from environmental tracers, a mathematical form is often assumed for the shape of the distribution and the parameters of the mathematical distribution are estimated using deterministic or stochastic inverse methods. We found that the prescription of the mathematical form limits the exploration of the age distribution to the shapes that can be described by the selected distribution. In this paper, the use of freeform histograms as groundwater age distributions is evaluated.more » A Bayesian Markov Chain Monte Carlo approach is used to estimate the fraction of groundwater in each histogram bin. This method was able to capture the shape of a hypothetical gamma distribution from the concentrations of four age tracers. The number of bins that can be considered in this approach is limited based on the number of tracers available. The histogram method was also tested on tracer data sets from Holten (The Netherlands; 3H, 3He, 85Kr, 39Ar) and the La Selva Biological Station (Costa-Rica; SF 6, CFCs, 3H, 4He and 14C), and compared to a number of mathematical forms. According to standard Bayesian measures of model goodness, the best mathematical distribution performs better than the histogram distributions in terms of the ability to capture the observed tracer data relative to their complexity. Among the histogram distributions, the four bin histogram performs better in most of the cases. The Monte Carlo simulations showed strong correlations in the posterior estimates of bin contributions, indicating that these bins cannot be well constrained using the available age tracers. 
The fact that mathematical forms overall perform better than the freeform histogram does not undermine the benefit of the freeform approach, especially for cases where a larger amount of observed data is available and the real groundwater age distribution is more complex than simple mathematical forms can represent.
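The bin-fraction inference described in this record can be sketched with a toy Metropolis sampler. Everything below is illustrative: the tracer-response matrix, the noise level, and the "true" fractions are invented for the demonstration, and a plain random-walk proposal renormalised onto the simplex stands in for whatever MCMC scheme the authors actually used.

```python
import random, math

# Hypothetical tracer-response matrix (all values invented for illustration):
# A[t][b] is the concentration tracer t would show for water in age bin b.
A = [[1.0, 0.6, 0.2, 0.05],
     [0.1, 0.5, 0.9, 0.30],
     [0.0, 0.2, 0.6, 1.00]]
true_f = [0.5, 0.3, 0.15, 0.05]      # "true" bin fractions for the demo
obs = [sum(a*f for a, f in zip(row, true_f)) for row in A]

def loglike(f, sigma=0.02):
    """Gaussian misfit between modelled and observed tracer concentrations."""
    pred = [sum(a*x for a, x in zip(row, f)) for row in A]
    return -sum((p - o)**2 for p, o in zip(pred, obs)) / (2*sigma**2)

def propose(f, step=0.05):
    """Random-walk proposal, renormalised back onto the simplex."""
    w = [max(1e-6, x + random.gauss(0, step)) for x in f]
    s = sum(w)
    return [x/s for x in w]

random.seed(1)
f, samples = [0.25]*4, []
for it in range(20000):
    g = propose(f)
    if random.random() < math.exp(min(0.0, loglike(g) - loglike(f))):
        f = g                         # Metropolis accept
    if it >= 5000:                    # keep post-burn-in samples
        samples.append(f)
post = [sum(s[i] for s in samples)/len(samples) for i in range(4)]
```

With only three tracers and four bins the posterior is under-determined, which mirrors the bin-correlation effect the abstract reports: the sampler constrains the bin fractions only up to strong mutual trade-offs.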
Construction and Evaluation of Histograms in Teacher Training
ERIC Educational Resources Information Center
Bruno, A.; Espinel, M. C.
2009-01-01
This article details the results of a written test designed to reveal how education majors construct and evaluate histograms and frequency polygons. Included is a description of the mistakes made by the students which shows how they tend to confuse histograms with bar diagrams, incorrectly assign data along the Cartesian axes and experience…
Dose-volume histogram prediction using density estimation.
Skarpman Munter, Johanna; Sjölund, Jens
2015-09-07
Knowledge of what dose-volume histograms can be expected for a previously unseen patient could increase consistency and quality in radiotherapy treatment planning. We propose a machine learning method that uses previous treatment plans to predict such dose-volume histograms. The key to the approach is the framing of dose-volume histograms in a probabilistic setting. The training consists of estimating, from the patients in the training set, the joint probability distribution of some predictive features and the dose. The joint distribution immediately provides an estimate of the conditional probability of the dose given the values of the predictive features. The prediction consists of estimating, from the new patient, the distribution of the predictive features and marginalizing the conditional probability from the training over this. Integrating the resulting probability distribution for the dose yields an estimate of the dose-volume histogram. To illustrate how the proposed method relates to previously proposed methods, we use the signed distance to the target boundary as a single predictive feature. As a proof-of-concept, we predicted dose-volume histograms for the brainstems of 22 acoustic schwannoma patients treated with stereotactic radiosurgery, and for the lungs of 9 lung cancer patients treated with stereotactic body radiation therapy. Comparing with two previous attempts at dose-volume histogram prediction we find that, given the same input data, the predictions are similar. In summary, we propose a method for dose-volume histogram prediction that exploits the intrinsic probabilistic properties of dose-volume histograms. We argue that the proposed method makes up for some deficiencies in previously proposed methods, thereby potentially increasing ease of use, flexibility and ability to perform well with small amounts of training data.
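The prediction pipeline in this abstract (estimate a joint distribution on training data, condition on the predictive feature, marginalize over the new patient's feature distribution, integrate to a DVH) can be sketched with plain histograms. The dose fall-off model and all numbers below are invented for illustration; the signed distance to the target boundary serves as the single predictive feature, as in the paper's proof-of-concept.

```python
import random
random.seed(0)

# Toy training voxels: (signed distance to target boundary, dose).
# The dose fall-off with distance is an assumed, purely illustrative model.
train = []
for _ in range(5000):
    d = random.uniform(0.0, 5.0)
    train.append((d, max(0.0, 60.0 - 8.0*d + random.gauss(0, 3))))

DIST_BINS, DOSE_BINS = 10, 30
DMAX, DOSEMAX = 5.0, 70.0

# Joint histogram of (distance bin, dose bin) over the training set ...
joint = [[0]*DOSE_BINS for _ in range(DIST_BINS)]
for dist, dose in train:
    i = min(int(dist/DMAX*DIST_BINS), DIST_BINS - 1)
    j = min(int(dose/DOSEMAX*DOSE_BINS), DOSE_BINS - 1)
    joint[i][j] += 1

# ... normalised row-wise into the conditional P(dose bin | distance bin).
cond = []
for row in joint:
    s = sum(row) or 1
    cond.append([c/s for c in row])

# New patient: only its distribution of the predictive feature is needed.
p_dist = [0.0]*DIST_BINS
new_dists = [random.uniform(0.0, 5.0) for _ in range(2000)]
for d in new_dists:
    p_dist[min(int(d/DMAX*DIST_BINS), DIST_BINS - 1)] += 1/len(new_dists)

# Marginalise the conditional over the feature distribution, then
# accumulate from the top to obtain the cumulative DVH.
p_dose = [sum(p_dist[i]*cond[i][j] for i in range(DIST_BINS))
          for j in range(DOSE_BINS)]
dvh = [sum(p_dose[j:]) for j in range(DOSE_BINS)]   # volume fraction >= bin j
```

The resulting `dvh` starts at 1 (all volume receives at least zero dose) and decreases monotonically, exactly the shape a cumulative dose-volume histogram must have.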
Analysis of dose heterogeneity using a subvolume-DVH
NASA Astrophysics Data System (ADS)
Said, M.; Nilsson, P.; Ceberg, C.
2017-11-01
The dose-volume histogram (DVH) is universally used in radiation therapy for its highly efficient way of summarizing three-dimensional dose distributions. An apparent limitation that is inherent to standard histograms is the loss of spatial information, e.g. it is no longer possible to tell where low- and high-dose regions are, and whether they are connected or disjoint. Two methods for overcoming the spatial fragmentation of low- and high-dose regions are presented, both based on the gray-level size zone matrix, which is a two-dimensional histogram describing the frequencies of connected regions of similar intensities. The first approach is a quantitative metric which can be likened to a homogeneity index. The large cold spot metric (LCS) is here defined to emphasize large contiguous regions receiving too low a dose; emphasis is put on both size, and deviation from the prescribed dose. In contrast, the subvolume-DVH (sDVH) is an extension to the standard DVH and allows for a qualitative evaluation of the degree of dose heterogeneity. The information retained from the two-dimensional histogram is overlaid on top of the DVH and the two are presented simultaneously. Both methods gauge the underlying heterogeneity in ways that the DVH alone cannot, and both have their own merits—the sDVH being more intuitive and the LCS being quantitative.
Microbubble cloud characterization by nonlinear frequency mixing.
Cavaro, M; Payan, C; Moysan, J; Baqué, F
2011-05-01
Within the framework of the Generation IV forum, France decided to develop sodium-cooled fast nuclear reactors. The French Safety Authority requires monitoring of the argon gas entrained in the sodium, which entails estimating the void fraction and a histogram describing the bubble population. In this context, the present letter studies the possibility of accurately determining this histogram with acoustic methods. A nonlinear, two-frequency mixing technique has been implemented, and a specific optical device has been developed in order to validate the experimental results. The acoustically reconstructed histograms are in excellent agreement with those obtained using optical methods.
NURE aerial gamma ray and magnetic detail survey of portions of northeast Washington. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1981-11-01
The Northeast Washington Survey was performed under the United States Department of Energy's National Uranium Resource Evaluation (NURE) Program, which is designed to provide radioelement distribution information to assist in assessing the uraniferous material potential of the United States. The radiometric and ancillary data were digitally recorded and processed. The results are presented in the form of stacked profiles, contour maps, flight path maps, statistical tables and frequency distribution histograms. These graphical outputs are presented at a scale of 1:62,500 and are contained in the individual Volume 2 reports.
NASA Astrophysics Data System (ADS)
Kvinnsland, Yngve; Muren, Ludvig Paul; Dahl, Olav
2004-08-01
Calculations of normal tissue complication probability (NTCP) values for the rectum are difficult because it is a hollow, non-rigid organ. Finding the true cumulative dose distribution over a number of treatment fractions requires a CT scan before each treatment fraction. This is labour intensive, and several surrogate distributions have therefore been suggested, such as dose wall histograms, dose surface histograms and histograms for the solid rectum, with and without margins. In this study, a Monte Carlo method is used to investigate the relationships between the cumulative dose distributions based on all treatment fractions and the above-mentioned histograms that are based on one CT scan only, in terms of equivalent uniform dose. Furthermore, the effect of a specific choice of histogram on estimates of the volume parameter of the probit NTCP model was investigated. It was found that the solid rectum and the rectum wall histograms (without margins) gave equivalent uniform doses with an expected value close to the values calculated from the cumulative dose distributions in the rectum wall. With the number of patients available in this study, the standard deviations of the estimates of the volume parameter were large; it was not possible to decide which volume gave the best estimates of the volume parameter, although there were distinct differences in the mean values obtained.
Robust Audio Watermarking by Using Low-Frequency Histogram
NASA Astrophysics Data System (ADS)
Xiang, Shijun
Continuing earlier work in which the problem of time-scale modification (TSM) was addressed [1] by modifying the shape of the audio time-domain histogram, here we consider the additional ingredient of resisting additive noise-like operations, such as Gaussian noise, lossy compression and low-pass filtering. In other words, we study the problem of making the watermark robust against both TSM and additive noise. To this end, in this paper we extract the histogram from a Gaussian-filtered low-frequency component of the audio for watermarking. The watermark is inserted by shaping the histogram: two consecutive bins are used as a group to hide one bit by reassigning their populations. The watermarked signals are perceptually similar to the originals. Compared with the previous time-domain watermarking scheme [1], the proposed method is more robust against additive noise, MP3 compression, low-pass filtering, etc.
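The bin-pair embedding idea (use two consecutive histogram bins as a group and encode one bit in their population ratio) can be sketched as follows. The threshold, bin edges, and the crude "nudge a sample just across the bin boundary" modification are illustrative stand-ins for the paper's actual embedding rule.

```python
import random
random.seed(2)

T = 1.5   # population-ratio threshold between the two bins of a group

def embed_bit(samples, lo, mid, hi, bit):
    """Embed one bit in the pair of consecutive bins [lo, mid) and [mid, hi)
    by moving samples across the bin boundary until the population ratio
    encodes the bit (moved values are nudged just past `mid`)."""
    out = list(samples)
    a = [i for i, x in enumerate(out) if lo <= x < mid]
    b = [i for i, x in enumerate(out) if mid <= x < hi]
    while True:
        if bit == 1 and len(a) >= T*len(b):
            break
        if bit == 0 and len(b) >= T*len(a):
            break
        if bit == 1:            # move one sample from the second bin down
            i = b.pop(); out[i] = mid - 1e-6; a.append(i)
        else:                   # move one sample from the first bin up
            i = a.pop(); out[i] = mid + 1e-6; b.append(i)
    return out

def extract_bit(samples, lo, mid, hi):
    na = sum(1 for x in samples if lo <= x < mid)
    nb = sum(1 for x in samples if mid <= x < hi)
    return 1 if na >= nb else 0

signal = [random.gauss(0.0, 0.3) for _ in range(4000)]
marked = embed_bit(signal, 0.0, 0.1, 0.2, 1)
```

Because extraction only compares bin populations, the mark survives any distortion that roughly preserves the amplitude histogram, which is the property the paper exploits against TSM and mild additive noise.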
Comparison of LOPES measurements with CoREAS and REAS 3.11 simulations
NASA Astrophysics Data System (ADS)
Ludwig, M.; Apel, W. D.; Arteaga-Velázquez, J. C.; Bähren, L.; Bekk, K.; Bertaina, M.; Biermann, P. L.; Blümer, J.; Bozdog, H.; Brancus, I. M.; Chiavassa, A.; Daumiller, K.; de Souza, V.; Di Pierro, F.; Doll, P.; Engel, R.; Falcke, H.; Fuchs, B.; Fuhrmann, D.; Gemmeke, H.; Grupen, C.; Haug, M.; Haungs, A.; Heck, D.; Hörandel, J. R.; Horneffer, A.; Huber, D.; Huege, T.; Isar, P. G.; Kampert, K.-H.; Kang, D.; Krömer, O.; Kuijpers, J.; Link, K.; Łuczak, P.; Mathes, H. J.; Melissas, M.; Morello, C.; Oehlschläger, J.; Palmieri, N.; Pierog, T.; Rautenberg, J.; Rebel, H.; Roth, M.; Rühle, C.; Saftoiu, A.; Schieler, H.; Schmidt, A.; Schröder, F. G.; Sima, O.; Toma, G.; Trinchero, G. C.; Weindl, A.; Wochele, J.; Zabierowski, J.; Zensus, J. A.
2013-05-01
In the previous years, LOPES emerged as a very successful experiment measuring the radio emission from air showers in the MHz frequency range. In parallel, the theoretical description of radio emission was developed further and REAS became a widely used simulation Monte Carlo code. REAS 3 as well as CoREAS are based on the endpoint formalism, i.e. they calculate the emission of the air-shower without assuming specific emission mechanisms. While REAS 3 is based on histograms derived from CORSIKA simulations, CoREAS is directly implemented into CORSIKA without loss of information due to histogramming of the particle distributions. In contrast to the earlier versions of REAS, the newest version REAS 3.11 and CoREAS take into account a realistic atmospheric refractive index. To improve the understanding of the emission processes and judge the quality of the simulations, we compare their predictions with high-quality events measured by LOPES. We present results concerning the lateral distribution measured with 30 east-west aligned LOPES antennas. Only the simulation codes including the refractive index (REAS 3.11 and CoREAS) are able to reproduce the slope of measured lateral distributions, but REAS 3.0 predicts too steep lateral distributions, and does not predict rising lateral distributions as seen in a few LOPES events. Moreover, REAS 3.11 predicts an absolute amplitude compatible with the LOPES measurements.
Vertical Structures of Anvil Clouds of Tropical Mesoscale Convective Systems Observed by CloudSat
NASA Technical Reports Server (NTRS)
Hence, Deanna A.; Houze, Robert A.
2011-01-01
A global study of the vertical structures of the clouds of tropical mesoscale convective systems (MCSs) has been carried out with data from the CloudSat Cloud Profiling Radar. Tropical MCSs are found to be dominated by cloud-top heights greater than 10 km. Secondary cloud layers sometimes occur in MCSs, but outside their primary raining cores. The secondary layers have tops at 6-8 and 1-3 km. High-topped clouds extend outward from raining cores of MCSs to form anvil clouds. Closest to the raining cores, the anvils tend to have broader distributions of reflectivity at all levels, with the modal values at higher reflectivity in their lower levels. Portions of anvil clouds far away from the raining core are thin and have narrow frequency distributions of reflectivity at all levels with overall weaker values. This difference likely reflects ice particle fallout and therefore cloud age. Reflectivity histograms of MCS anvil clouds vary little across the tropics, except that (i) in continental MCS anvils, broader distributions of reflectivity occur at the uppermost levels in the portions closest to active raining areas; (ii) the frequency of occurrence of stronger reflectivity in the upper part of anvils decreases faster with increasing distance in continental MCSs; and (iii) narrower-peaked ridges are prominent in reflectivity histograms of thick anvil clouds close to the raining areas of connected MCSs (superclusters). These global results are consistent with observations at ground sites and aircraft data. They present a comprehensive test dataset for models aiming to simulate process-based upper-level cloud structure around the tropics.
NASA Astrophysics Data System (ADS)
Licznar, Paweł; Rupp, David; Adamowski, Witold
2013-04-01
In the fall of 2008, the Municipal Water Supply and Sewerage Company (MWSSC) in Warsaw began operating the first large precipitation monitoring network dedicated to urban hydrology in Poland. The process of establishing the network, as well as the preliminary phase of its operation, raised a number of questions concerning optimal gauge location and density and revealed the urgent need for new data processing techniques. When considering full-field precipitation as input to hydrodynamic models of stormwater and combined sewage systems, standard processing techniques developed previously for single gauges, which concentrate mainly on the analysis of maximum rainfall rates and the development of intensity-duration-frequency (IDF) curves, were found inadequate. We used a multifractal rainfall modeling framework based on microcanonical multiplicative random cascades to analyze the properties of Warsaw precipitation. We calculated breakdown coefficients (BDC) for the hierarchy of timescales from λ=1 (5 min) up to λ=128 (1280 min) for all 25 gauges in the network. At small timescales, the histograms of BDCs were strongly deformed by the recording precision of rainfall amounts. A randomization procedure statistically removed the artifacts due to precision errors in the original series. At large timescales, BDC values were sparse due to the relatively short period of observation (2008-2011). An algorithm with a moving window was proposed to increase the number of BDC values at large timescales and to smooth their histograms. The resulting empirical BDC histograms were modeled by a theoretical "2N-B" distribution, which combines two separate normal (N) distributions and one beta (B) distribution. A clear evolution of the BDC histograms, from a 2N-B distribution at small timescales to an N-B distribution at intermediate timescales and finally to a single beta distribution at large timescales, was observed for all gauges.
Cluster analysis revealed close patterns of BDC distributions among almost all gauges and timescales, with the exception of two gauges located at the city limits (one of them at Okęcie airport). We evaluated the performance of the microcanonical cascades at disaggregating 1280-min (quasi-daily) precipitation totals into 5-min rainfall data for selected gauges. Synthetic time series were analyzed with respect to their intermittency and the variability of rainfall intensities and compared to the observational series. We showed that microcanonical cascade models could be used in practice for generating synthetic rainfall time series suitable as input to urban hydrology models in Warsaw.
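Breakdown coefficients of the kind analyzed above can be computed with a few lines of code: aggregate the series up a dyadic timescale hierarchy and record the child/parent weight for every wet parent interval. The toy rainfall series below is synthetic and purely illustrative.

```python
import random
random.seed(3)

def aggregate(series):
    """One step up the dyadic hierarchy: sum consecutive pairs of intervals."""
    return [series[i] + series[i+1] for i in range(0, len(series) - 1, 2)]

def breakdown_coefficients(series):
    """BDC weights w = child/parent for every wet (nonzero) parent interval;
    the right-hand child carries the complementary weight 1 - w."""
    parent = aggregate(series)
    return [series[2*k]/p for k, p in enumerate(parent) if p > 0]

# Synthetic, purely illustrative 5-min rainfall series with intermittency
# (many dry intervals), mimicking the structure of gauge data.
rain = [random.expovariate(1.0) if random.random() < 0.3 else 0.0
        for _ in range(2**10)]

bdc = {}
level = rain
for lam in (1, 2, 4):       # successive levels of the timescale hierarchy
    bdc[lam] = breakdown_coefficients(level)
    level = aggregate(level)
```

Histogramming `bdc[lam]` for each level reproduces the kind of scale-dependent BDC distributions the study fits with its 2N-B model; by construction every weight lies in [0, 1].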
Clinical Utility of Blood Cell Histogram Interpretation
Bhagya, S.; Majeed, Abdul
2017-01-01
An automated haematology analyser provides blood cell histograms by plotting the sizes of the different blood cells on the X-axis and their relative number on the Y-axis. Histogram interpretation needs careful analysis of the Red Blood Cell (RBC), White Blood Cell (WBC) and platelet distribution curves. Histogram analysis is an often neglected part of the automated haemogram which, if interpreted well, has significant potential to provide diagnostically relevant information even before higher-level investigations are ordered. PMID:29207767
Face verification system for Android mobile devices using histogram based features
NASA Astrophysics Data System (ADS)
Sato, Sho; Kobayashi, Kazuhiro; Chen, Qiu
2016-07-01
This paper proposes a face verification system that runs on Android mobile devices. In this system, a facial image is first captured by the device's built-in camera, and face detection is then performed using Haar-like features and the AdaBoost learning algorithm. The proposed system verifies the detected face using histogram-based features: a binary Vector Quantization (VQ) histogram computed from DCT coefficients in the low-frequency domain, as well as an Improved Local Binary Pattern (Improved LBP) histogram in the spatial domain. Verification results with the different types of histogram-based features are first obtained separately and then combined by weighted averaging. We evaluate the proposed algorithm using the publicly available ORL database and facial images captured by an Android tablet.
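A plain 3x3 LBP histogram, the simplest member of the family this system builds on (the paper itself uses an Improved LBP variant), can be sketched as:

```python
def lbp_histogram(img):
    """256-bin histogram of basic 3x3 local binary patterns over a grayscale
    image given as a list of equal-length rows of integer intensities."""
    offs = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
            (1, 1), (1, 0), (1, -1), (0, -1)]
    h = [0]*256
    for y in range(1, len(img) - 1):
        for x in range(1, len(img[0]) - 1):
            c = img[y][x]
            code = 0
            for k, (dy, dx) in enumerate(offs):
                if img[y + dy][x + dx] >= c:   # neighbour >= centre sets bit k
                    code |= 1 << k
            h[code] += 1
    return h

# Tiny synthetic 16x16 test image (values are arbitrary).
img = [[(x*7 + y*13) % 256 for x in range(16)] for y in range(16)]
h = lbp_histogram(img)
```

The histogram `h` has one count per interior pixel (14 x 14 = 196 for a 16x16 image); verification systems like the one above compare such histograms between a probe and an enrolled face.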
Empirical Histograms in Item Response Theory with Ordinal Data
ERIC Educational Resources Information Center
Woods, Carol M.
2007-01-01
The purpose of this research is to describe, test, and illustrate a new implementation of the empirical histogram (EH) method for ordinal items. The EH method involves the estimation of item response model parameters simultaneously with the approximation of the distribution of the random latent variable (theta) as a histogram. Software for the EH…
Yang, Su
2005-02-01
A new descriptor for symbol recognition is proposed. 1) A histogram is constructed for every pixel to figure out the distribution of the constraints among the other pixels. 2) All the histograms are statistically integrated to form a feature vector with fixed dimension. The robustness and invariance were experimentally confirmed.
NASA Astrophysics Data System (ADS)
Rimskaya-Korsavkova, L. K.
2017-07-01
To find the possible reasons for the midlevel elevation of the Weber fraction in intensity discrimination of a tone burst, a comparison was performed for the complementary distributions of spike activity of an ensemble of auditory nerve fibers: the distribution of the time instants at which spikes occur, the distribution of interspike intervals, and the autocorrelation function. These distributions were characterized by a poststimulus histogram, an interspike interval histogram, and an autocorrelation histogram, all obtained from the response of an ensemble of model auditory nerve fibers to a complex of an auditory noise burst and a useful tone burst. Two configurations were used: in the first, the peak amplitude of the tone burst was varied and the noise amplitude was fixed; in the other, the tone burst amplitude was fixed and the noise amplitude was varied. The noise could precede or follow the tone burst. The carrier frequency of the noise and the tone burst was 4 kHz and corresponded to the characteristic frequencies of the model nerve fibers. The profiles of all the mentioned histograms had two maxima. The values and positions of the maxima in the poststimulus histogram corresponded to the amplitudes and mutual time position of the noise and the tone burst. The maximum that occurred in response to the tone burst could be a basis for the formation of the loudness of the latter (explicit loudness). However, the positions of the maxima in the other two histograms did not depend on the positions of the tone burst and noise in the combinations. The first maximum fell in short intervals, uniting intervals corresponding to the noise and tone burst durations. The second maximum fell in intervals corresponding to the tone burst delay with respect to the noise, and its value was proportional to whichever of the noise and tone burst amplitudes was smaller in the complex.
An increase in the tone burst or noise amplitude caused nonlinear variations in the two maxima and in the ratio between them. The size of the first maximum in the interspike interval distribution could be the basis for the formation of the loudness of the masked tone burst (implicit loudness), and the size of the second maximum, for the formation of the intensity of the periodicity pitch of the complex. The auditory effect of the midlevel enhancement of tone burst loudness could be the result of variations in the implicit tone burst loudness caused by variations in tone-burst or noise intensity. The reason for the enhancement of the Weber fraction could be competitive interaction between such subjective qualities as explicit and implicit tone-burst loudness and the intensity of the periodicity pitch of the complex.
NASA Technical Reports Server (NTRS)
Xu, Kuan-Man
2006-01-01
A new method is proposed to compare statistical differences between summary histograms, which are the histograms summed over a large ensemble of individual histograms. It consists of choosing a distance statistic for measuring the difference between summary histograms and using a bootstrap procedure to calculate the statistical significance level. Bootstrapping is an approach to statistical inference that makes few assumptions about the underlying probability distribution that describes the data. Three distance statistics are compared in this study. They are the Euclidean distance, the Jeffries-Matusita distance and the Kuiper distance. The data used in testing the bootstrap method are satellite measurements of cloud systems called cloud objects. Each cloud object is defined as a contiguous region/patch composed of individual footprints or fields of view. A histogram of measured values over footprints is generated for each parameter of each cloud object and then summary histograms are accumulated over all individual histograms in a given cloud-object size category. The results of statistical hypothesis tests using all three distances as test statistics are generally similar, indicating the validity of the proposed method. The Euclidean distance is determined to be most suitable after comparing the statistical tests of several parameters with distinct probability distributions among three cloud-object size categories. Impacts on the statistical significance levels resulting from differences in the total lengths of satellite footprint data between two size categories are also discussed.
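The bootstrap procedure described above can be sketched as follows: compute a distance between the two summary histograms, then resample individual histograms from the pooled ensemble to build the null distribution of that distance. The Euclidean distance and the toy "cloud object" histograms below are illustrative.

```python
import random, math
random.seed(4)

def summary(hists):
    """Sum an ensemble of per-object histograms, normalised to frequencies."""
    s = [sum(col) for col in zip(*hists)]
    t = sum(s)
    return [x/t for x in s]

def euclid(p, q):
    return math.sqrt(sum((a - b)**2 for a, b in zip(p, q)))

def bootstrap_p(group1, group2, dist=euclid, n_boot=2000):
    """Significance of the distance between two summary histograms, by
    resampling object histograms from the pooled ensemble (null hypothesis:
    both groups come from the same population)."""
    observed = dist(summary(group1), summary(group2))
    pool, n1 = group1 + group2, len(group1)
    count = 0
    for _ in range(n_boot):
        resample = [random.choice(pool) for _ in pool]
        if dist(summary(resample[:n1]), summary(resample[n1:])) >= observed:
            count += 1
    return observed, count / n_boot

# Toy "cloud object" histograms: counts over 8 reflectivity bins per object,
# with the second group systematically shifted toward high reflectivity.
group_a = [[random.randint(0, 9) for _ in range(8)] for _ in range(30)]
group_b = [[random.randint(0, 9) + (3 if b > 4 else 0) for b in range(8)]
           for _ in range(30)]
d_obs, p_val = bootstrap_p(group_a, group_b)
```

Swapping `euclid` for a Jeffries-Matusita or Kuiper distance only changes the `dist` argument; the bootstrap machinery, which makes no assumption about the underlying distribution, stays the same.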
Velocity distributions among colliding asteroids
NASA Technical Reports Server (NTRS)
Bottke, William F., Jr.; Nolan, Michael C.; Greenberg, Richard; Kolvoord, Robert A.
1994-01-01
The probability distribution for impact velocities between two given asteroids is wide, non-Gaussian, and often contains spikes according to our new method of analysis in which each possible orbital geometry for collision is weighted according to its probability. An average value would give a good representation only if the distribution were smooth and narrow. Therefore, the complete velocity distribution we obtain for various asteroid populations differs significantly from published histograms of average velocities. For all pairs among the 682 asteroids in the main-belt with D greater than 50 km, we find that our computed velocity distribution is much wider than previously computed histograms of average velocities. In this case, the most probable impact velocity is approximately 4.4 km/sec, compared with the mean impact velocity of 5.3 km/sec. For cases of a single asteroid (e.g., Gaspra or Ida) relative to an impacting population, the distribution we find yields lower velocities than previously reported by others. The width of these velocity distributions implies that mean impact velocities must be used with caution when calculating asteroid collisional lifetimes or crater-size distributions. Since the most probable impact velocities are lower than the mean, disruption events may occur less frequently than previously estimated. However, this disruption rate may be balanced somewhat by an apparent increase in the frequency of high-velocity impacts between asteroids. These results have implications for issues such as asteroidal disruption rates, the amount/type of impact ejecta available for meteoritical delivery to the Earth, and the geology and evolution of specific asteroids like Gaspra.
Zhang, Yujuan; Chen, Jun; Liu, Song; Shi, Hua; Guan, Wenxian; Ji, Changfeng; Guo, Tingting; Zheng, Huanhuan; Guan, Yue; Ge, Yun; He, Jian; Zhou, Zhengyang; Yang, Xiaofeng; Liu, Tian
2017-02-01
To investigate the efficacy of histogram analysis of the entire tumor volume in apparent diffusion coefficient (ADC) maps for differentiating between histological grades in gastric cancer. Seventy-eight patients with gastric cancer were enrolled in a retrospective 3.0T magnetic resonance imaging (MRI) study. ADC maps were obtained at two different b values (0 and 1000 sec/mm²) for each patient. Tumors were delineated on each slice of the ADC maps, and a histogram for the entire tumor volume was subsequently generated. A series of histogram parameters (e.g., skew and kurtosis) were calculated and correlated with the histological grade of the surgical specimen. The diagnostic performance of each parameter for distinguishing poorly from moderately well-differentiated gastric cancers was assessed by using the area under the receiver operating characteristic curve (AUC). There were significant differences in the 5th, 10th, 25th, and 50th percentiles, skew, and kurtosis between poorly and well-differentiated gastric cancers (P < 0.05). There were correlations between the degrees of differentiation and histogram parameters, including the 10th percentile, skew, kurtosis, and max frequency; the correlation coefficients were 0.273, -0.361, -0.339, and -0.370, respectively. Among all the histogram parameters, the max frequency had the largest AUC value, which was 0.675. Histogram analysis of ADC maps on the basis of the entire tumor volume can be useful in differentiating between histological grades of gastric cancer. Level of Evidence: 4. J. Magn. Reson. Imaging 2017;45:440-449. © 2016 International Society for Magnetic Resonance in Medicine.
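The whole-volume histogram parameters used in this study (percentiles, skew, kurtosis) are straightforward to compute from the voxel values. A sketch using moment-based definitions of skew and excess kurtosis (the study's exact definitions are not specified here) follows; the ADC values are synthetic.

```python
import random
import statistics as st
random.seed(5)

def histogram_features(values):
    """Whole-volume histogram parameters: low percentiles plus moment-based
    skew and excess kurtosis (definitions assumed for illustration)."""
    v = sorted(values)
    n = len(v)
    def pct(p):
        return v[min(int(p/100*n), n - 1)]
    mu, sd = st.fmean(v), st.pstdev(v)
    skew = sum((x - mu)**3 for x in v) / (n*sd**3)
    kurt = sum((x - mu)**4 for x in v) / (n*sd**4) - 3.0
    return {'p5': pct(5), 'p10': pct(10), 'p25': pct(25), 'p50': pct(50),
            'skew': skew, 'kurtosis': kurt}

# Synthetic ADC voxels (units of 10^-3 mm^2/s), right-skewed on purpose to
# mimic a heterogeneous tumour volume.
adc = [random.gauss(1.1, 0.2) + abs(random.gauss(0, 0.3)) for _ in range(4000)]
features = histogram_features(adc)
```

Such per-tumour feature dictionaries are what get correlated with histological grade or fed into an ROC analysis, as in the study above.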
Suárez, H; Musé, P; Suárez, A; Arocena, M
2001-01-01
In order to assess the influence of visual stimulation on the triggering of imbalance and falls in the elderly population, the postural responses of 18 elderly patients with central vestibular disorders and clinical evidence of instability and falls were studied while they received different types of visual stimuli. The stimulation conditions were: (i) no specific stimuli; (ii) smooth pursuit with pure sinusoids of 0.2 Hz as foveal stimulation; and (iii) optokinetic stimulation (OK) as retinal stimulation. Using an AMTI AccuSway platform, the 95% confidence ellipse (CE) and sway velocity (SV) were evaluated with a wavelet scalogram in order to assess the relationship between time and frequency in postural control. Velocity histograms were also constructed in order to observe the distribution of velocity values during the recording. Non-homogeneous postural behavior after visual stimulation was found among this population. In five of the patients, the OK stimulation generated: (i) significantly higher average values of CE (> 3.4+/-0.69 cm2); (ii) a significant increase in the average values of SV (> 3.89+/-1.15 cm/s) and a velocity histogram with a homogeneous distribution between 0 and 18 cm/s; and (iii) a scalogram with sway frequencies of up to 4 Hz distributed in both the X and Y directions (forwards-backwards and lateral) during visual stimulation, with arbitrary units of energy density > 5. These three qualitative and quantitative aspects could be "markers" of visual dependence in the triggering of the mechanism of loss of equilibrium, and hence falls, in some elderly patients, and should be considered in order to prevent falls and to assist in the rehabilitation of these patients.
Jaikuna, Tanwiwat; Khadsiri, Phatchareewan; Chawapun, Nisa; Saekho, Suwit; Tharavichitkul, Ekkasit
2017-02-01
To develop an in-house software program that is able to calculate and generate the biological dose distribution and biological dose volume histogram by physical dose conversion using the linear-quadratic-linear (LQL) model. The Isobio software was developed using MATLAB version 2014b to calculate and generate the biological dose distribution and biological dose volume histograms. The physical dose from each voxel in the treatment plan was extracted through the Computational Environment for Radiotherapy Research (CERR), and the accuracy was verified by comparing the dose volume histogram from CERR with that from the treatment planning system. The equivalent dose in 2 Gy fractions (EQD2) was calculated from the biological effective dose (BED) based on the LQL model. The software calculation and a manual calculation were compared for EQD2 verification, with paired t-test statistical analysis using IBM SPSS Statistics version 22 (64-bit). Two- and three-dimensional biological dose distributions and biological dose volume histograms were displayed correctly by the Isobio software. Differences in physical dose were found between CERR and the treatment planning system (TPS) in Oncentra, with 3.33% in the high-risk clinical target volume (HR-CTV) determined by D90%, 0.56% in the bladder and 1.74% in the rectum determined by D2cc, and less than 1% in Pinnacle. The difference in EQD2 between the software calculation and the manual calculation was not significant (p = 0.820, 0.095, and 0.593 for external beam radiation therapy (EBRT) and 0.240, 0.320, and 0.849 for brachytherapy (BT) in HR-CTV, bladder, and rectum, respectively). The Isobio software is a feasible tool to generate the biological dose distribution and biological dose volume histogram for treatment plan evaluation in both EBRT and BT.
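The physical-to-biological dose conversion at the heart of Isobio can be illustrated with the standard LQ-based EQD2 formula (the paper uses an LQL variant that additionally modifies the high-dose behavior; this sketch shows only the plain LQ form):

```python
def eqd2(total_dose, dose_per_fraction, alpha_beta):
    """Equivalent dose in 2 Gy fractions from the standard LQ model:
        BED  = D * (1 + d / (alpha/beta))
        EQD2 = BED / (1 + 2 / (alpha/beta))
    where D is the total dose and d the dose per fraction (all in Gy)."""
    bed = total_dose * (1 + dose_per_fraction/alpha_beta)
    return bed / (1 + 2.0/alpha_beta)

# Example: a 4 x 7 Gy brachytherapy boost, tumour alpha/beta = 10 Gy.
boost = eqd2(28.0, 7.0, 10.0)      # about 39.7 Gy in 2 Gy equivalents
```

Applying such a conversion voxel by voxel, then re-histogramming, is exactly how a physical dose distribution becomes a biological dose volume histogram; note that a schedule delivered in 2 Gy fractions maps to itself.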
Mutants in Arabidopsis thaliana with altered shoot gravitropism
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bullen, B.L.; Poff, K.L.
1987-04-01
A procedure has been developed and used to screen 40,000 M2 seedlings of Arabidopsis thaliana for strains with altered shoot gravitropism. Several strains have been identified for which shoot gravitropism is considerably more random than that of their wild-type parent (based on frequency distribution histograms of the gravitropic response to a 1 g stimulus). One such strain exhibits normal hypocotyl phototropism and normal root gravitropism. Thus, the gravitropism pathway in the shoot contains at least one mutable element which is not required for root gravitropism.
Felfer, Peter; Cairney, Julie
2018-06-01
Analysing the distribution of selected chemical elements with respect to interfaces is one of the most common tasks in data mining in atom probe tomography. This can be represented by 1D concentration profiles, 2D concentration maps or proximity histograms, which represent the concentration, density etc. of selected species as a function of the distance from a reference surface/interface. These are some of the most useful tools for the analysis of solute distributions in atom probe data. In this paper, we present extensions to the proximity histogram in the form of 'local' proximity histograms, calculated for selected parts of a surface, and pseudo-2D concentration maps, which are 2D concentration maps calculated on non-flat surfaces. This way, local concentration changes at interfaces and other structures can be assessed more effectively. Copyright © 2018 Elsevier B.V. All rights reserved.
High frequency measurements of shot noise suppression in atomic-scale metal contacts
NASA Astrophysics Data System (ADS)
Wheeler, Patrick J.; Evans, Kenneth; Russom, Jeffrey; King, Nicholas; Natelson, Douglas
2009-03-01
Shot noise provides a means of assessing the number and transmission coefficients of transmitting channels in atomic- and molecular-scale junctions. Previous experiments at low temperatures in metal and semiconductor point contacts have demonstrated the expected suppression of shot noise when junction conductance is near an integer multiple of the conductance quantum, G0≡2e^2/h. Using high frequency techniques, we demonstrate the high speed acquisition of such data at room temperature in mechanical break junctions. In clean Au contacts conductance histograms with clear peaks at G0, 2G0, and 3G0 are acquired within hours, and histograms of simultaneous measurements of the shot noise show clear suppression at those conductance values. We describe the dependence of the noise on bias voltage and analyze the noise vs. conductance histograms in terms of a model that averages over transmission coefficients.
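The suppression of shot noise near conductance quanta can be illustrated with the standard Fano-factor expression for independent channels, F = sum(t_n * (1 - t_n)) / sum(t_n). This is a textbook relation, not code from the experiment described above.

```python
def fano_factor(taus):
    """Shot-noise suppression (Fano) factor for a junction with channel
    transmission coefficients taus: F = sum(t*(1-t)) / sum(t).

    Fully open channels (t = 1) contribute conductance but no partition
    noise, so F vanishes when the conductance is an integer multiple of
    G0 carried entirely by fully transmitting channels.
    """
    return sum(t * (1.0 - t) for t in taus) / sum(taus)
```

A clean Au contact at G = 1 G0 with a single fully open channel gives `fano_factor([1.0])` = 0.0, while a single half-open channel gives `fano_factor([0.5])` = 0.5.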
Parameterization of the Age-Dependent Whole Brain Apparent Diffusion Coefficient Histogram
Batra, Marion; Nägele, Thomas
2015-01-01
Purpose. The distribution of apparent diffusion coefficient (ADC) values in the brain can be used to characterize age effects and pathological changes of the brain tissue. The aim of this study was the parameterization of the whole brain ADC histogram by an advanced model with influence of age considered. Methods. Whole brain ADC histograms were calculated for all data and for seven age groups between 10 and 80 years. Modeling of the histograms was performed for two parts of the histogram separately: the brain tissue part was modeled by two Gaussian curves, while the remaining part was fitted by the sum of a Gaussian curve, a biexponential decay, and a straight line. Results. A consistent fitting of the histograms of all age groups was possible with the proposed model. Conclusions. This study confirms the strong dependence of the whole brain ADC histograms on the age of the examined subjects. The proposed model can be used to characterize changes of the whole brain ADC histogram in certain diseases under consideration of age effects. PMID:26609526
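As a sketch of the brain-tissue part of the proposed model, the following fits two Gaussian curves to a synthetic, noiseless histogram. The full model in the paper additionally includes a Gaussian curve, a biexponential decay, and a straight line for the non-tissue part; the numerical values here are illustrative assumptions only (ADC in units of 1e-3 mm²/s).

```python
import numpy as np
from scipy.optimize import curve_fit

def two_gaussians(x, a1, mu1, s1, a2, mu2, s2):
    """Sum of two Gaussian curves, modeling the tissue part of the
    whole-brain ADC histogram."""
    g = lambda a, mu, s: a * np.exp(-0.5 * ((x - mu) / s) ** 2)
    return g(a1, mu1, s1) + g(a2, mu2, s2)

# Synthetic tissue histogram with two overlapping tissue peaks.
x = np.linspace(0.4, 1.6, 200)
y = two_gaussians(x, 1.0, 0.75, 0.08, 0.5, 1.05, 0.12)

# Fit with a rough initial guess; curve_fit refines the six parameters.
p0 = [1.0, 0.7, 0.1, 0.5, 1.1, 0.1]
popt, _ = curve_fit(two_gaussians, x, y, p0=p0)
```

With a sensible initial guess the fit recovers both peak positions; on real histograms the initial guess would come from the age-group average.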
Automatic classification of sleep stages based on the time-frequency image of EEG signals.
Bajaj, Varun; Pachori, Ram Bilas
2013-12-01
In this paper, a new method for automatic sleep stage classification based on time-frequency image (TFI) of electroencephalogram (EEG) signals is proposed. Automatic classification of sleep stages is an important part for diagnosis and treatment of sleep disorders. The smoothed pseudo Wigner-Ville distribution (SPWVD) based time-frequency representation (TFR) of EEG signal has been used to obtain the time-frequency image (TFI). The segmentation of TFI has been performed based on the frequency-bands of the rhythms of EEG signals. The features derived from the histogram of segmented TFI have been used as an input feature set to multiclass least squares support vector machines (MC-LS-SVM) together with the radial basis function (RBF), Mexican hat wavelet, and Morlet wavelet kernel functions for automatic classification of sleep stages from EEG signals. The experimental results are presented to show the effectiveness of the proposed method for classification of sleep stages from EEG signals. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Dong, Zhicheng; Bao, Zhengyu; Wu, Guoai; Fu, Yangrong; Yang, Yi
2010-11-01
The content and spatial distribution of lead in the aquatic systems of two Chinese tropical cities in Hainan province (Haikou and Sanya) show an unequal distribution of lead between the urban and the suburban areas. The lead content is significantly higher (72.3 mg/kg) in the urban area than in the suburbs (15.0 mg/kg) of Haikou, but roughly equal in Sanya (41.6 and 43.9 mg/kg). The frequency distribution histograms suggest that the lead in Haikou and in Sanya derives from different natural and/or anthropogenic sources. The isotopic compositions indicate that urban sediment lead in Haikou originates mainly from anthropogenic sources (automobile exhaust, atmospheric deposition, etc.), which contribute much more than the natural sources, while natural lead (basalt and sea sands) is still dominant in the suburban areas of Haikou. In Sanya, the primary source is natural (soils and sea sands).
EMG circuit design and AR analysis of EMG signs.
Hardalaç, Firat; Canal, Rahmi
2004-12-01
In this study, an electromyogram (EMG) circuit was designed and tested on 27 people. Autoregressive (AR) analysis of EMG signals recorded on the ulnar nerve region of the right hand in resting position was performed. The AR method is used for frequency analysis of signals, especially for calculating the spectra of stationary signals, whose frequency responses appear as sharp peaks and valleys. In this study, as the result of AR analysis of the EMG signals in the frequency-time domain, frequency spectrum curves (histogram curves) were obtained. When the images belonging to these histograms were evaluated, the fibrillation potential widths of the muscle fibers of the ulnar nerve region of the subjects were examined. According to the degeneration degrees of the motor nerves, nine people had myopathy, nine had neuropathy, and nine were normal.
Rodrigues, Nils; Weiskopf, Daniel
2018-01-01
Conventional dot plots use a constant dot size and are typically applied to show the frequency distribution of small data sets. Unfortunately, they are not designed for a high dynamic range of frequencies. We address this problem by introducing nonlinear dot plots. Adopting the idea of nonlinear scaling from logarithmic bar charts, our plots allow for dots of varying size so that columns with a large number of samples are reduced in height. For the construction of these diagrams, we introduce an efficient two-way sweep algorithm that leads to a dense and symmetrical layout. We compensate aliasing artifacts at high dot densities by a specifically designed low-pass filtering method. Examples of nonlinear dot plots are compared to conventional dot plots as well as linear and logarithmic histograms. Finally, we include feedback from an expert review.
NASA Astrophysics Data System (ADS)
Mildrexler, D. J.; Zhao, M.; Running, S. W.
2014-12-01
Land Surface Temperature (LST) is a good indicator of the surface energy balance because it is determined by interactions and energy fluxes between the atmosphere and the ground. The variability of land surface properties and vegetation densities across the Earth's surface changes these interactions and gives LST a unique biogeographic influence. Natural and human-induced disturbances modify the surface characteristics and alter the expression of LST. This results in a heterogeneous and dynamic thermal environment. Measurements that merge these factors into a single global metric, while maintaining the important biophysical and biogeographical factors of the land surface's thermal environment are needed to better understand integrated temperature changes in the Earth system. Using satellite-based LST we have developed a new global metric that focuses on one critical component of LST that occurs when the relationship between vegetation density and surface temperature is strongly coupled: annual maximum LST (LSTmax). A 10 year evaluation of LSTmax histograms that include every 1-km pixel across the Earth's surface reveals that this integrative measurement is strongly influenced by the biogeographic patterns of the Earth's ecosystems, providing a unique comparative view of the planet every year that can be likened to the Earth's thermal maximum fingerprint. The biogeographical component is controlled by the frequency and distribution of vegetation types across the Earth's land surface and displays a trimodal distribution. The three modes are driven by ice covered polar regions, forests, and hot desert/shrubland environments. In ice covered areas the histograms show that the heat of fusion results in a convergence of surface temperatures around the melting point. 
The histograms also show low interannual variability, reflecting two important global land surface dynamics: (1) only a small fraction of the Earth's surface is disturbed in any given year, and (2) when considered at the global scale, the positive and negative climate forcings resulting from the aggregate effects of the loss of vegetation to disturbances and the regrowth from natural succession are roughly in balance. Changes in any component of the histogram can be tracked and would indicate a major change in the Earth system.
Chen, Ting Y; Zhang, Die; Dragomir, Andrei; Akay, Yasemin; Akay, Metin
2011-05-01
We investigated the influence of nicotine exposure and prefrontal cortex (PFC) transections on the firing activities of ventral tegmental area (VTA) dopamine (DA) neurons using a time-frequency method based on the continuous wavelet transform (CWT). Extracellular single-unit neural activity was recorded from DA neurons in the VTA of rats. One group had their PFC inputs to the VTA intact, while the other group had the inputs to the VTA bilaterally transected immediately caudal to the PFC. We hypothesized that systemic nicotine exposure would significantly change the energy distribution in the recorded neural activity. Additionally, we investigated whether the loss of inputs to the VTA caused by the PFC transection resulted in the cancellation of nicotine's effect on the neurons' firing patterns. The time-frequency representations of VTA DA neuron firing activity were estimated from the reconstructed firing rate histogram. The energy contents were estimated from three frequency bands, which are known to encompass the significant modes of operation of DA neurons. Our results show that systemic nicotine exposure disrupts the energy distribution in PFC-intact rats. In particular, there is a significant increase in the energy content of the 1-1.5 Hz frequency band. This corresponds to an observed increase in the firing rate of VTA DA neurons following nicotine exposure. Additionally, our results from PFC-transected rats show that there is no change in the energy distribution of the recordings after systemic nicotine exposure. These results indicate that the PFC plays an important role in affecting the activities of VTA DA neurons and that the CWT is a useful method for monitoring changes in neural activity patterns in both the time and frequency domains.
NASA Technical Reports Server (NTRS)
Seze, Genevieve; Rossow, William B.
1991-01-01
The spatial and temporal stability of the distributions of satellite-measured visible and infrared radiances, caused by variations in clouds and surfaces, is investigated using two-dimensional and one-dimensional histograms and time-composite images. Similar analysis of the histograms of the original and time-composite images provides separation of the contributions of the spatial and temporal variations to the total variation. The variability of both the surfaces and clouds is found to be larger at scales much larger than the minimum resolved by satellite imagery. This study shows that the shapes of these histograms are distinctive characteristics of the different climate regimes and that particular attributes of these histograms can be related to several general, though not universal, properties of cloud and surface variations at regional and synoptic scales. There are also significant exceptions to these relationships in particular climate regimes. The characteristics of these radiance histograms provide a stable, well-defined descriptor of cloud and surface properties.
NASA Astrophysics Data System (ADS)
Wan, Minjie; Gu, Guohua; Qian, Weixian; Ren, Kan; Chen, Qian; Maldague, Xavier
2018-06-01
Infrared image enhancement plays a significant role in intelligent urban surveillance systems for smart city applications. Unlike existing methods that only exaggerate the global contrast, we propose a particle swarm optimization-based local entropy weighted histogram equalization which involves the enhancement of both local details and fore- and background contrast. First of all, a novel local entropy weighted histogram depicting the distribution of detail information is calculated based on a modified hyperbolic tangent function. Then, the histogram is divided into two parts via a threshold maximizing the inter-class variance in order to improve the contrasts of foreground and background, respectively. To avoid over-enhancement and noise amplification, double plateau thresholds of the presented histogram are formulated by means of a particle swarm optimization algorithm. Lastly, each sub-image is equalized independently according to the constrained sub-local entropy weighted histogram. Comparative experiments implemented on real infrared images prove that our algorithm outperforms other state-of-the-art methods in terms of both visual and quantized evaluations.
Northern peatland initiation lagged abrupt increases in deglacial atmospheric CH4.
Reyes, Alberto V; Cooke, Colin A
2011-03-22
Peatlands are a key component of the global carbon cycle. Chronologies of peatland initiation are typically based on compiled basal peat radiocarbon (14C) dates and frequency histograms of binned calibrated age ranges. However, such compilations are problematic because poor quality 14C dates are commonly included and because frequency histograms of binned age ranges introduce chronological artefacts that bias the record of peatland initiation. Using a published compilation of 274 basal 14C dates from Alaska as a case study, we show that nearly half the 14C dates are inappropriate for reconstructing peatland initiation, and that the temporal structure of peatland initiation is sensitive to sampling biases and treatment of calibrated 14C dates. We present revised chronologies of peatland initiation for Alaska and the circumpolar Arctic based on summed probability distributions of calibrated 14C dates. These revised chronologies reveal that northern peatland initiation lagged abrupt increases in atmospheric CH4 concentration at the start of the Bølling-Allerød interstadial (Termination 1A) and the end of the Younger Dryas chronozone (Termination 1B), suggesting that northern peatlands were not the primary drivers of the rapid increases in atmospheric CH4. Our results demonstrate that subtle methodological changes in the synthesis of basal 14C ages lead to substantially different interpretations of temporal trends in peatland initiation, with direct implications for the role of peatlands in the global carbon cycle.
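The summed-probability approach favored above, in contrast to binned frequency histograms, can be sketched as follows. For illustration each date's calibrated density is approximated by a normal curve; a real analysis would first calibrate each 14C date against a calibration curve such as IntCal and sum the resulting (non-Gaussian) densities. All numbers in the usage note are made up.

```python
import numpy as np

def summed_probability(ages, errors, grid):
    """Sum of per-date probability densities on a calendar-age grid.

    Each date (mean age, 1-sigma error) is approximated by a normal
    density; summing densities avoids the binning artefacts that
    frequency histograms of calibrated age ranges introduce.
    """
    total = np.zeros_like(grid, dtype=float)
    for mu, sigma in zip(ages, errors):
        total += (np.exp(-0.5 * ((grid - mu) / sigma) ** 2)
                  / (sigma * np.sqrt(2.0 * np.pi)))
    return total
```

Because each per-date density integrates to one, the curve integrates to the number of dates, e.g. `summed_probability([11500, 12900, 14600], [100, 150, 120], grid)` integrates to 3 on a grid covering 8000-16000 cal yr BP.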
Book review: A new view on the species abundance distribution
DeAngelis, Donald L.
2018-01-01
The sampled relative abundances of species of a taxonomic group, whether birds, trees, or moths, in a natural community at a particular place vary in a way that suggests a consistent underlying pattern, referred to as the species abundance distribution (SAD). Preston [1] conjectured that the numbers of species, plotted as a histogram of logarithmic abundance classes called octaves, seemed to fit a lognormal distribution; that is, the histograms look like normal distributions, although truncated on the left-hand, or low-species-abundance, end. Although other specific curves for the SAD have been proposed in the literature, Preston’s lognormal distribution is widely cited in textbooks and has stimulated attempts at explanation. An important aspect of Preston’s lognormal distribution is the ‘veil line’, a vertical line drawn exactly at the point of the left-hand truncation in the distribution, to the left of which would be species missing from the sample. Dewdney rejects the lognormal conjecture. Instead, starting with the long-recognized fact that the number of species sampled from a community, when plotted as histograms against population abundance, resembles an inverted J, he presents a mathematical description of an alternative that he calls the ‘J distribution’, a hyperbolic density function truncated at both ends. When multiplied by species richness, R, it becomes the SAD of the sample.
Automated Weather Observing System (AWOS) Demonstration Program.
1984-09-01
The demonstration comprised an initial "burn-in" ("debugging") period and a 10-month "useful life" period; the burn-in period was used to establish the Data Acquisition System... Histograms provide a graphical means of showing how well the probability distribution of residuals approaches a normal or Gaussian distribution. Authors: Paul J. O'Brien et al. Report No. DOT/FAA/CT-84/20.
Tan, Shan; Zhang, Hao; Zhang, Yongxue; Chen, Wengen; D’Souza, Warren D.; Lu, Wei
2013-01-01
Purpose: A family of fluorine-18 (18F)-fluorodeoxyglucose (18F-FDG) positron-emission tomography (PET) features based on histogram distances is proposed for predicting pathologic tumor response to neoadjuvant chemoradiotherapy (CRT). These features describe the longitudinal change of FDG uptake distribution within a tumor. Methods: Twenty patients with esophageal cancer treated with CRT plus surgery were included in this study. All patients underwent PET/CT scans before (pre-) and after (post-) CRT. The two scans were first rigidly registered, and the original tumor sites were then manually delineated on the pre-PET/CT by an experienced nuclear medicine physician. Two histograms representing the FDG uptake distribution were extracted from the pre- and the registered post-PET images, respectively, both within the delineated tumor. Distances between the two histograms quantify longitudinal changes in FDG uptake distribution resulting from CRT, and thus are potential predictors of tumor response. A total of 19 histogram distances were examined and compared to both traditional PET response measures and Haralick texture features. Receiver operating characteristic analyses and Mann-Whitney U test were performed to assess their predictive ability. Results: Among all tested histogram distances, seven bin-to-bin and seven crossbin distances outperformed traditional PET response measures using maximum standardized uptake value (AUC = 0.70) or total lesion glycolysis (AUC = 0.80). The seven bin-to-bin distances were: L2 distance (AUC = 0.84), χ2 distance (AUC = 0.83), intersection distance (AUC = 0.82), cosine distance (AUC = 0.83), squared Euclidean distance (AUC = 0.83), L1 distance (AUC = 0.82), and Jeffrey distance (AUC = 0.82). 
The seven crossbin distances were: quadratic-chi distance (AUC = 0.89), earth mover distance (AUC = 0.86), fast earth mover distance (AUC = 0.86), diffusion distance (AUC = 0.88), Kolmogorov-Smirnov distance (AUC = 0.88), quadratic form distance (AUC = 0.87), and match distance (AUC = 0.84). These crossbin histogram distance features showed slightly higher prediction accuracy than texture features on post-PET images. Conclusions: The results suggest that longitudinal patterns in 18F-FDG uptake characterized using histogram distances provide useful information for predicting the pathologic response of esophageal cancer to CRT. PMID:24089897
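Several of the bin-to-bin distances listed above have simple closed forms; a minimal sketch follows. The definitions use common conventions, which may differ in normalization constants from the paper's implementation, and the crossbin distances (earth mover's, quadratic-chi, etc.) additionally require a ground distance between bins and are not shown.

```python
import numpy as np

def bin_to_bin_distances(p, q):
    """A few bin-to-bin distances between two histograms with the same
    binning, computed after normalizing each to unit mass."""
    p = np.asarray(p, dtype=float) / np.sum(p)
    q = np.asarray(q, dtype=float) / np.sum(q)
    eps = 1e-12  # guards empty-bin division in the chi-square term
    return {
        "L1": np.sum(np.abs(p - q)),
        "L2": np.sqrt(np.sum((p - q) ** 2)),
        "chi2": 0.5 * np.sum((p - q) ** 2 / (p + q + eps)),
        "intersection": 1.0 - np.sum(np.minimum(p, q)),
        "cosine": 1.0 - np.dot(p, q) / (np.linalg.norm(p) * np.linalg.norm(q)),
    }
```

Identical pre- and post-CRT uptake histograms give zero for every distance; completely disjoint histograms give the maximum (e.g. L1 = 2, intersection = 1).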
Slope histogram distribution-based parametrisation of Martian geomorphic features
NASA Astrophysics Data System (ADS)
Balint, Zita; Székely, Balázs; Kovács, Gábor
2014-05-01
The application of geomorphometric methods to the large Martian digital topographic datasets paves the way to analysing Martian areomorphic processes in more detail. One such method is the analysis of local slope distributions. For this implementation a visualization program code was developed that calculates the local slope histograms and compares them based on a Kolmogorov distance criterion. As input data we used digital terrain models (DTMs) derived from HRSC (High Resolution Stereo Camera) images of various Martian regions. The Kolmogorov-criterion-based discrimination produces classes of slope histograms that are displayed using coloration, yielding an image map in which the distributions can be visualized by different colours representing the various classes. Our goal is to create a local slope histogram based classification for large Martian areas in order to obtain information about the general morphological characteristics of a region. This is a contribution of the TMIS.ascrea project, financed by the Austrian Research Promotion Agency (FFG). The present research is partly realized in the frames of the TÁMOP 4.2.4.A/2-11-1-2012-0001 high priority "National Excellence Program - Elaborating and Operating an Inland Student and Researcher Personal Support System" convergence program project's scholarship support, using Hungarian state and European Union funds and cofinances from the European Social Fund.
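The Kolmogorov distance criterion used to discriminate slope histograms can be sketched as the sup-norm distance between the cumulative distributions of two normalized histograms. This is the standard definition; the paper's exact implementation details are not given.

```python
import numpy as np

def kolmogorov_distance(hist_a, hist_b):
    """Kolmogorov (sup-norm) distance between two histograms that share
    the same binning: max |CDF_a - CDF_b| after normalizing each
    histogram to unit mass."""
    a = np.cumsum(hist_a) / np.sum(hist_a)
    b = np.cumsum(hist_b) / np.sum(hist_b)
    return float(np.max(np.abs(a - b)))
```

Identical slope histograms give a distance of 0, while histograms concentrated in non-overlapping slope ranges give the maximum distance of 1; thresholding this value yields the histogram classes used for the colour-coded image map.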
Application of Markov Models for Analysis of Development of Psychological Characteristics
ERIC Educational Resources Information Center
Kuravsky, Lev S.; Malykh, Sergey B.
2004-01-01
A technique to study combined influence of environmental and genetic factors on the base of changes in phenotype distributions is presented. Histograms are exploited as base analyzed characteristics. A continuous time, discrete state Markov process with piece-wise constant interstate transition rates is associated with evolution of each histogram.…
Human body and head characteristics as a communication medium for Body Area Network.
Kifle, Yonatan; Hun-Seok Kim; Yoo, Jerald
2015-01-01
An in-depth investigation of Body Channel Communication (BCC) under the environment set according to the IEEE 802.15.6 Body Area Network (BAN) standard is conducted to observe and characterize the human body as a communication medium. A thorough measurement of the human head as part of the human channel is also carried out. The forehead, head-to-limb, and ear-to-ear channels are characterized. The channel gain of the human head follows the same bandpass profile as the human torso and limbs, with the maximum channel gain occurring at 35 MHz. The human body channel gain distribution histogram at given frequencies, while all the other parameters are held constant, exhibits a maximum variation of 2.2 dB in the channel gain at the center frequency of the bandpass channel gain profile.
Research on the Characteristics of Alzheimer's Disease Using EEG
NASA Astrophysics Data System (ADS)
Ueda, Taishi; Musha, Toshimitsu; Yagi, Tohru
In this paper, we propose a new method for diagnosing Alzheimer's disease (AD) on the basis of electroencephalograms (EEG). The method, termed the Power Variance Function (PVF) method, indicates the variance of the power at each frequency. Using the proposed method, the power of the EEG at each frequency was calculated using the wavelet transform, and the corresponding variances were defined as the PVF. After the PVF histogram of 55 healthy people was approximated by a Generalized Extreme Value (GEV) distribution, we evaluated the PVF of 22 patients with AD and 25 patients with mild cognitive impairment (MCI). As a result, the values for all AD and MCI subjects were abnormal. In particular, the PVF in the θ band for MCI patients was abnormally high, and the PVF in the α band for AD patients was low.
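Approximating a histogram of PVF values by a GEV distribution, as described for the healthy reference group, can be sketched with SciPy. The parameter values below are synthetic illustrations, not values from the study; an abnormality score for a patient could then be obtained from the tail probability `genextreme.sf(x, c_hat, loc_hat, scale_hat)` under the fitted reference distribution.

```python
from scipy.stats import genextreme

# Synthetic stand-in for the healthy-group PVF sample (assumed
# GEV-distributed with illustrative parameters).
sample = genextreme.rvs(c=-0.1, loc=1.0, scale=0.3, size=5000,
                        random_state=0)

# Maximum-likelihood fit of the GEV shape, location, and scale.
c_hat, loc_hat, scale_hat = genextreme.fit(sample)
```

With a few thousand samples the fit recovers the generating parameters closely, which is what makes the fitted curve usable as a normative reference.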
NASA Technical Reports Server (NTRS)
Eigen, D. J.; Fromm, F. R.; Northouse, R. A.
1974-01-01
A new clustering algorithm is presented that is based on dimensional information. The algorithm includes an inherent feature selection criterion, which is discussed. Further, a heuristic method for choosing the proper number of intervals for a frequency distribution histogram, a feature necessary for the algorithm, is presented. The algorithm, although usable as a stand-alone clustering technique, is then utilized as a global approximator. Local clustering techniques and configuration of a global-local scheme are discussed, and finally the complete global-local and feature selector configuration is shown in application to a real-time adaptive classification scheme for the analysis of remote sensed multispectral scanner data.
Uyghur face recognition method combining 2DDCT with POEM
NASA Astrophysics Data System (ADS)
Yi, Lihamu; Ya, Ermaimaiti
2017-11-01
In this paper, in light of the reduced recognition rate and poor robustness of Uyghur face recognition under illumination changes and partial occlusion, a Uyghur face recognition method combining the Two-Dimensional Discrete Cosine Transform (2DDCT) with Patterns of Oriented Edge Magnitudes (POEM) was proposed. Firstly, the Uyghur face images were divided into 8×8 block matrices, and the blocked images were converted into the frequency domain using the 2DDCT; secondly, the images were compressed to exclude the non-sensitive medium-frequency and non-high-frequency parts, reducing the feature dimensions needed for the face images and hence the amount of computation; thirdly, the corresponding POEM histograms were obtained by calculating the POEM feature quantities; fourthly, the POEM histograms were concatenated as the texture histogram of each center feature point to obtain the texture features of the face feature points; finally, classification of the training samples was carried out using a deep learning algorithm. The simulation results showed that the proposed algorithm improved the recognition rate on the self-built Uyghur face database, greatly improved the computing speed, and had strong robustness.
Histogram based analysis of lung perfusion of children after congenital diaphragmatic hernia repair.
Kassner, Nora; Weis, Meike; Zahn, Katrin; Schaible, Thomas; Schoenberg, Stefan O; Schad, Lothar R; Zöllner, Frank G
2018-05-01
To investigate a histogram-based approach to characterize the distribution of perfusion in the whole left and right lung by descriptive statistics, and to show how histograms can be used to visually explore perfusion defects in two-year-old children after Congenital Diaphragmatic Hernia (CDH) repair. 28 children (age 24.2 ± 1.7 months; all left-sided hernia; 9 after extracorporeal membrane oxygenation therapy) underwent quantitative DCE-MRI of the lung. Segmentations of the left and right lung were manually drawn to mask the calculated pulmonary blood flow maps and then to derive histograms for each lung side. Individual and group-wise analysis of the histograms of the left and right lung was performed. The ipsilateral and contralateral lung show significant differences in shape and in the descriptive statistics derived from the histogram (Wilcoxon signed-rank test, p < 0.05) at both the group-wise and individual level. Subgroup analysis (patients with vs without ECMO therapy) showed no significant differences using histogram-derived parameters. Histogram analysis can be a valuable tool to characterize and visualize whole-lung perfusion of children after CDH repair. It allows for several possibilities to analyze the data: describing the perfusion differences between the right and left lung, but also exploring and visualizing localized perfusion patterns in the 3D lung volume. Subgroup analysis will be possible given sufficient sample sizes. Copyright © 2017 Elsevier Inc. All rights reserved.
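The per-lung histogram statistics and the group-wise Wilcoxon comparison can be sketched as below. The choice of summary measure (the median), the bin settings, and all numbers are illustrative assumptions, not the paper's actual values.

```python
import numpy as np
from scipy.stats import wilcoxon

def lung_histogram_stats(pbf_values, bins=50, value_range=(0, 1000)):
    """Histogram and descriptive statistics of the pulmonary blood flow
    (PBF) values inside one lung mask."""
    hist, edges = np.histogram(pbf_values, bins=bins, range=value_range)
    return {"mean": float(np.mean(pbf_values)),
            "median": float(np.median(pbf_values)),
            "hist": hist, "edges": edges}

# Group-wise comparison: paired Wilcoxon signed-rank test on one
# per-subject summary (here the median PBF) of left vs right lung.
left_medians = np.array([120, 135, 110, 150, 125, 140, 115, 130, 145, 128])
right_medians = np.array([180, 190, 160, 210, 175, 195, 170, 185, 205, 182])
stat, p = wilcoxon(left_medians, right_medians)
```

With a consistent shift between the sides across all subjects, as in this fabricated example, the paired test rejects equality at p < 0.05.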
RF environment survey of Space Shuttle related EEE frequency bands
NASA Technical Reports Server (NTRS)
Simpson, J.; Prigel, B.; Postelle, J.
1977-01-01
Radio frequency assignments within the continental United States in frequency bands between 121 MHz and 65 GHz were surveyed and analyzed in order to determine current utilization of anticipated frequency bands for the shuttle-borne electromagnetic environment experiment. Data from both government and nongovernment files were used. Results are presented in both narrative form and in histograms which show the total number of unclassified assignments versus frequency and total assigned power versus frequency.
Regionally adaptive histogram equalization of the chest.
Sherrier, R H; Johnson, G A
1987-01-01
Advances in the area of digital chest radiography have resulted in the acquisition of high-quality images of the human chest. With these advances, there arises a genuine need for image processing algorithms specific to the chest, in order to fully exploit this digital technology. We have implemented the well-known technique of histogram equalization, noting the problems encountered when it is adapted to chest images. These problems have been successfully solved with our regionally adaptive histogram equalization method. With this technique histograms are calculated locally and then modified according to both the mean pixel value of that region as well as certain characteristics of the cumulative distribution function. This process, which has allowed certain regions of the chest radiograph to be enhanced differentially, may also have broader implications for other image processing tasks.
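The local part of the technique can be sketched as plain per-region histogram equalization. Note that this omits the paper's key modification, in which each regional mapping is further adjusted by the region's mean pixel value and characteristics of its cumulative distribution function, and it assumes an 8-bit image whose dimensions divide evenly into the tile grid.

```python
import numpy as np

def regional_hist_eq(img, tiles=(4, 4)):
    """Histogram equalization applied independently to rectangular
    regions of an 8-bit image (no inter-tile blending)."""
    out = np.empty_like(img)
    th = img.shape[0] // tiles[0]
    tw = img.shape[1] // tiles[1]
    for i in range(tiles[0]):
        for j in range(tiles[1]):
            r = img[i*th:(i+1)*th, j*tw:(j+1)*tw]
            hist = np.bincount(r.ravel(), minlength=256)
            cdf = np.cumsum(hist).astype(float)
            # Map each gray level through the region's own scaled CDF.
            lut = np.round(255.0 * cdf / cdf[-1]).astype(np.uint8)
            out[i*th:(i+1)*th, j*tw:(j+1)*tw] = lut[r]
    return out
```

Because each region is stretched by its own cumulative distribution, a low-contrast lung field and a dense mediastinum receive different mappings, which is the motivation for the regional approach.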
Analysis of Cellular DNA Content by Flow Cytometry.
Darzynkiewicz, Zbigniew; Huang, Xuan; Zhao, Hong
2017-10-02
Cellular DNA content can be measured by flow cytometry with the aim of: (1) revealing cell distribution within the major phases of the cell cycle, (2) estimating the frequency of apoptotic cells with fractional DNA content, and/or (3) disclosing DNA ploidy of the measured cell population. In this unit, simple and universally applicable methods for staining fixed cells are presented, as are methods that utilize detergents and/or proteolytic treatment to permeabilize cells and make DNA accessible to fluorochrome. Additionally, supravital cell staining with Hoechst 33342, which is primarily used for sorting live cells based on DNA-content differences for their subsequent culturing, is described. Also presented are methods for staining cell nuclei isolated from paraffin-embedded tissues. Available algorithms are listed for deconvolution of DNA-content-frequency histograms to estimate the percentage of cells in the major phases of the cell cycle and the frequency of apoptotic cells with fractional DNA content. © 2017 by John Wiley & Sons, Inc.
Chapple, W D
1997-09-01
Reflex activation of the ventral superficial muscles (VSM) in the abdomen of the hermit crab, Pagurus pollicaris, was studied using sinusoidal and stochastic longitudinal vibration of the muscle while recording the length and force of the muscle and the spike times of three exciter motoneurons. In the absence of vibration, the interspike interval histograms of the two larger motoneurons were bimodal; cutting sensory nerves containing most of the mechanoreceptor input removed the short interval peak in the histogram, indicating that the receptors are important in maintaining tonic firing. Vibration of the muscle evoked a reflex increase in motoneuron frequency that habituated after an initial peak but remained above control levels for the duration of stimulation. Motoneuron frequency increased with root mean square (rms) stimulus amplitude. Average stiffness during stimulation was about two times the stiffness of passive muscle. The reflex did not alter muscle dynamics. Estimated transfer functions were calculated from the fast Fourier transform of length and force signals. Coherence was >0.9 for the frequency range of 3-35 Hz. Stiffness magnitude gradually increased over this range in both reflex-activated and passive muscle; phase was between 10 and 20 degrees. Reflex stiffness decreased with increasing stimulus amplitudes, but at larger amplitudes, this decrease was much less pronounced; in this range stiffness was regulated by the reflex. The sinusoidal frequency at which reflex bursts were elicited was approximately 6 Hz, consistent with previous measurements using ramp stretch. During reflex excitation, there was an increase in amplitude of the short interval peak in the interspike interval histogram; this was reduced when the majority of afferent pathways was removed.
A phase histogram of motoneuron firing during sinusoidal vibration had a peak at approximately 110 ms, also suggesting that an important component of the reflex is via direct projections from the mechanoreceptors. These results are consistent with the hypothesis that a robust feedforward regulation of abdominal stiffness during continuous disturbances is achieved by mechanoreceptors signalling the absolute value of changing forces; habituation of the reflex, its high threshold for low-frequency disturbances, and the activation kinetics of the muscle further modify reflex dynamics.
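The bimodality described above is read directly off an interspike-interval histogram, which is straightforward to compute from recorded spike times (the bin width and range here are arbitrary choices):

```python
import numpy as np

def isi_histogram(spike_times, bin_ms=5.0, max_ms=200.0):
    """Interspike-interval histogram from spike times (ms); a bimodal
    ISI histogram is what reveals the two firing regimes."""
    isi = np.diff(np.sort(np.asarray(spike_times, float)))
    edges = np.arange(0.0, max_ms + bin_ms, bin_ms)
    counts, _ = np.histogram(isi, bins=edges)
    return counts, edges
```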
Complexity of possibly gapped histogram and analysis of histogram.
Fushing, Hsieh; Roy, Tania
2018-02-01
We demonstrate that gaps and distributional patterns embedded within real-valued measurements are inseparable biological and mechanistic information contents of the system. Such patterns are discovered through data-driven possibly gapped histogram, which further leads to the geometry-based analysis of histogram (ANOHT). Constructing a possibly gapped histogram is a complex problem of statistical mechanics due to the ensemble of candidate histograms being captured by a two-layer Ising model. This construction is also a distinctive problem of Information Theory from the perspective of data compression via uniformity. By defining a Hamiltonian (or energy) as a sum of total coding lengths of boundaries and total decoding errors within bins, this issue of computing the minimum energy macroscopic states is surprisingly resolved by applying the hierarchical clustering algorithm. Thus, a possibly gapped histogram corresponds to a macro-state. And then the first phase of ANOHT is developed for simultaneous comparison of multiple treatments, while the second phase of ANOHT is developed based on classical empirical process theory for a tree-geometry that can check the authenticity of branches of the treatment tree. The well-known Iris data are used to illustrate our technical developments. Also, a large baseball pitching dataset and a heavily right-censored divorce data are analysed to showcase the existential gaps and utilities of ANOHT.
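A minimal sketch of what a "possibly gapped" histogram separates: split sorted measurements wherever the spacing is anomalously large. This gap heuristic is only illustrative; it replaces the paper's energy-minimising hierarchical clustering and two-layer Ising formulation with a single threshold, and the `gap_factor` is an arbitrary assumption.

```python
import numpy as np

def gapped_bins(x, gap_factor=3.0):
    """Split sorted data wherever a spacing exceeds gap_factor times the
    median spacing; returns (lo, hi) extents of the resulting blocks."""
    xs = np.sort(np.asarray(x, float))
    d = np.diff(xs)
    cut = gap_factor * np.median(d)
    edges = np.where(d > cut)[0]
    blocks = np.split(xs, edges + 1)
    return [(b[0], b[-1]) for b in blocks]
```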
Efficient visibility-driven medical image visualisation via adaptive binned visibility histogram.
Jung, Younhyun; Kim, Jinman; Kumar, Ashnil; Feng, David Dagan; Fulham, Michael
2016-07-01
'Visibility' is a fundamental optical property that represents the proportion of the voxels in a volume that is observable by users during interactive volume rendering. The manipulation of this 'visibility' improves volume rendering processes, for instance by ensuring the visibility of regions of interest (ROIs) or by guiding the identification of an optimal rendering view-point. The construction of visibility histograms (VHs), which represent the distribution of the visibility of all voxels in the rendered volume, enables users to explore the volume with real-time feedback about occlusion patterns among spatially related structures during volume rendering manipulations. Volume rendered medical images have been a primary beneficiary of VHs, given the need to ensure that specific ROIs are visible relative to the surrounding structures, e.g. the visualisation of tumours that may otherwise be occluded by neighbouring structures. VH construction and its subsequent manipulations, however, are computationally expensive due to the histogram binning of the visibilities. This limits the real-time application of VHs to medical images that have large intensity ranges and volume dimensions and require a large number of histogram bins. In this study, we introduce an efficient adaptive binned visibility histogram (AB-VH) in which a smaller number of histogram bins is used to represent the visibility distribution of the full VH. We adaptively bin medical images by using a cluster analysis algorithm that groups the voxels according to their intensity similarities into a smaller subset of bins while preserving the distribution of the intensity range of the original images. We increase efficiency by exploiting the parallel computation and multiple render targets (MRT) extension of modern graphical processing units (GPUs), and this enables efficient computation of the histogram.
We show the application of our method to single-modality computed tomography (CT), magnetic resonance (MR) imaging and multi-modality positron emission tomography-CT (PET-CT). In our experiments, the AB-VH markedly improved the computational efficiency of VH construction and thus improved the subsequent VH-driven volume manipulations. This efficiency was achieved without major visual degradation of the VH or large numerical differences between the AB-VH and its full-bin counterpart. We applied several variants of the K-means clustering algorithm with varying K (the number of clusters) and found that higher values of K resulted in better performance at a lower computational gain. The AB-VH also performed better than the conventional method of down-sampling the histogram bins (equal binning) for volume rendering visualisation. Copyright © 2016 Elsevier Ltd. All rights reserved.
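The adaptive binning step can be sketched as one-dimensional k-means on the intensities, with bin edges placed midway between cluster centres. This is a CPU illustration under assumed parameters, not the GPU/MRT implementation described above.

```python
import numpy as np

def adaptive_bins(intensities, k=8, iters=20, seed=0):
    """1-D k-means on intensities; edges midway between sorted cluster
    centres become adaptive histogram bins (sketch of the AB-VH idea)."""
    x = np.asarray(intensities, float)
    rng = np.random.default_rng(seed)
    centers = np.sort(rng.choice(x, k, replace=False))
    for _ in range(iters):
        lab = np.abs(x[:, None] - centers[None, :]).argmin(axis=1)
        for j in range(k):
            if np.any(lab == j):
                centers[j] = x[lab == j].mean()
        centers.sort()
    mid = (centers[:-1] + centers[1:]) / 2   # edges between centres
    return np.concatenate([[x.min()], mid, [x.max()]])
```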
NASA Astrophysics Data System (ADS)
Cheng, Lishui; Hobbs, Robert F.; Segars, Paul W.; Sgouros, George; Frey, Eric C.
2013-06-01
In radiopharmaceutical therapy, an understanding of the dose distribution in normal and target tissues is important for optimizing treatment. Three-dimensional (3D) dosimetry takes into account patient anatomy and the nonuniform uptake of radiopharmaceuticals in tissues. Dose-volume histograms (DVHs) provide a useful summary representation of the 3D dose distribution and have been widely used for external beam treatment planning. Reliable 3D dosimetry requires an accurate 3D radioactivity distribution as the input. However, activity distribution estimates from SPECT are corrupted by noise and partial volume effects (PVEs). In this work, we systematically investigated OS-EM based quantitative SPECT (QSPECT) image reconstruction in terms of its effect on DVHs estimates. A modified 3D NURBS-based Cardiac-Torso (NCAT) phantom that incorporated a non-uniform kidney model and clinically realistic organ activities and biokinetics was used. Projections were generated using a Monte Carlo (MC) simulation; noise effects were studied using 50 noise realizations with clinical count levels. Activity images were reconstructed using QSPECT with compensation for attenuation, scatter and collimator-detector response (CDR). Dose rate distributions were estimated by convolution of the activity image with a voxel S kernel. Cumulative DVHs were calculated from the phantom and QSPECT images and compared both qualitatively and quantitatively. We found that noise, PVEs, and ringing artifacts due to CDR compensation all degraded histogram estimates. Low-pass filtering and early termination of the iterative process were needed to reduce the effects of noise and ringing artifacts on DVHs, but resulted in increased degradations due to PVEs. Large objects with few features, such as the liver, had more accurate histogram estimates and required fewer iterations and more smoothing for optimal results. 
Smaller objects with fine details, such as the kidneys, required more iterations and less smoothing at early time points post-radiopharmaceutical administration but more smoothing and fewer iterations at later time points when the total organ activity was lower. The results of this study demonstrate the importance of using optimal reconstruction and regularization parameters. Optimal results were obtained with different parameters at each time point, but using a single set of parameters for all time points produced near-optimal dose-volume histograms.
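The cumulative dose-volume histograms compared in this study reduce, per dose level, to the fraction of voxels receiving at least that dose:

```python
import numpy as np

def cumulative_dvh(dose, dose_levels):
    """Cumulative DVH: fraction of the volume receiving at least each
    dose level (dose as a voxel array, levels in the same units)."""
    dose = np.asarray(dose, float).ravel()
    return np.array([(dose >= d).mean() for d in dose_levels])
```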
Northern peatland initiation lagged abrupt increases in deglacial atmospheric CH4
Reyes, Alberto V.; Cooke, Colin A.
2011-01-01
Peatlands are a key component of the global carbon cycle. Chronologies of peatland initiation are typically based on compiled basal peat radiocarbon (14C) dates and frequency histograms of binned calibrated age ranges. However, such compilations are problematic because poor quality 14C dates are commonly included and because frequency histograms of binned age ranges introduce chronological artefacts that bias the record of peatland initiation. Using a published compilation of 274 basal 14C dates from Alaska as a case study, we show that nearly half the 14C dates are inappropriate for reconstructing peatland initiation, and that the temporal structure of peatland initiation is sensitive to sampling biases and treatment of calibrated 14C dates. We present revised chronologies of peatland initiation for Alaska and the circumpolar Arctic based on summed probability distributions of calibrated 14C dates. These revised chronologies reveal that northern peatland initiation lagged abrupt increases in atmospheric CH4 concentration at the start of the Bølling–Allerød interstadial (Termination 1A) and the end of the Younger Dryas chronozone (Termination 1B), suggesting that northern peatlands were not the primary drivers of the rapid increases in atmospheric CH4. Our results demonstrate that subtle methodological changes in the synthesis of basal 14C ages lead to substantially different interpretations of temporal trends in peatland initiation, with direct implications for the role of peatlands in the global carbon cycle. PMID:21368146
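The contrast between binned frequency histograms and summed probability distributions can be sketched as follows. Real work sums calibrated (non-Gaussian) age densities; the Gaussian densities here are a simplifying assumption.

```python
import numpy as np

def summed_probability(ages, errors, grid):
    """Sum one normalised Gaussian density per date instead of binning
    midpoints into a frequency histogram, avoiding binning artefacts."""
    grid = np.asarray(grid, float)
    total = np.zeros_like(grid)
    for mu, sd in zip(ages, errors):
        total += np.exp(-0.5 * ((grid - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
    return total / len(ages)   # area ~1 when grid spans all densities
```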
Methods for Determining Particle Size Distributions from Nuclear Detonations.
1987-03-01
[Only table-of-contents fragments remain of this report: sections on sample preparation, PCS set parameters, analysis by vendors, Brookhaven analyses of samples R-3 and R-8 using the method of cumulants and the histogram method, and TEM particle measurements.]
Modulation Based on Probability Density Functions
NASA Technical Reports Server (NTRS)
Williams, Glenn L.
2009-01-01
A proposed method of modulating a sinusoidal carrier signal to convey digital information involves the use of histograms representing probability density functions (PDFs) that characterize samples of the signal waveform. The method is based partly on the observation that when a waveform is sampled (whether by analog or digital means) over a time interval at least as long as one half cycle of the waveform, the samples can be sorted by frequency of occurrence, thereby constructing a histogram representing a PDF of the waveform during that time interval.
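The observation is easy to reproduce: histogramming the samples of one full cycle of a sinusoid yields its characteristic arcsine-shaped PDF, with most samples piled near the extremes. The sample count and bin count below are arbitrary.

```python
import numpy as np

def waveform_pdf(samples, bins=32):
    """Histogram of waveform samples, normalised to estimate the PDF."""
    return np.histogram(samples, bins=bins, density=True)

# One full cycle of a sinusoid: samples pile up near the extremes,
# giving the arcsine-shaped PDF that characterises the waveform.
t = np.linspace(0.0, 2.0 * np.pi, 10000, endpoint=False)
pdf, edges = waveform_pdf(np.sin(t))
```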
Pattern-histogram-based temporal change detection using personal chest radiographs
NASA Astrophysics Data System (ADS)
Ugurlu, Yucel; Obi, Takashi; Hasegawa, Akira; Yamaguchi, Masahiro; Ohyama, Nagaaki
1999-05-01
An accurate and reliable detection of temporal changes from a pair of images is of considerable interest in medical science. Traditional registration and subtraction techniques can be applied to extract temporal differences when the object is rigid or corresponding points are obvious. However, in radiological imaging, loss of depth information, the elasticity of the object, the absence of clearly defined landmarks and three-dimensional positioning differences constrain the performance of conventional registration techniques. In this paper, we propose a new method to detect interval changes accurately without using an image registration technique. The method is based on the construction of a so-called pattern histogram and a comparison procedure. The pattern histogram is a graphic representation of the frequency counts of all allowable patterns in the multi-dimensional pattern vector space. The K-means algorithm is employed to partition the pattern vector space successively. Any differences in the pattern histograms imply that different patterns are involved in the scenes. In our experiment, a pair of chest radiographs of pneumoconiosis is employed and the changing histogram bins are visualized on both of the images. We found that the method can be used as an alternative way of temporal change detection, particularly when precise image registration is not available.
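The pattern-histogram idea can be sketched by counting quantised local patches and comparing the two counts. Coarse uniform quantisation stands in here for the successive K-means partitioning described above, and the patch size and level count are arbitrary assumptions.

```python
import numpy as np
from collections import Counter

def pattern_histogram(img, size=2, levels=4):
    """Count quantised size-x-size patch patterns: a registration-free
    fingerprint of the textures present in an image."""
    a = np.asarray(img, float)
    q = (a * levels / (a.max() + 1e-9)).astype(int)
    h, w = q.shape
    patches = (tuple(q[i:i+size, j:j+size].ravel())
               for i in range(h - size + 1) for j in range(w - size + 1))
    return Counter(patches)

def changed_patterns(h1, h2):
    """Patterns whose counts differ between two images (interval change)."""
    return {p for p in set(h1) | set(h2) if h1[p] != h2[p]}
```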
Probing size-dependent electrokinetics of hematite aggregates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kedra-Królik, Karolina; Rosso, Kevin M.; Zarzycki, Piotr
Aqueous particle suspensions of many kinds are stabilized by the electrostatic potential developed at their surfaces from reaction with water and ions. An important and less well understood aspect of this stabilization is the dependence of the electrostatic surface potential on particle size. Surface electrostatics are typically probed by measuring particle electrophoretic mobilities and quantified as the electrokinetic (ζ) potential, using commercially available Zeta Potential Analyzers (ZPA). Even though ZPAs provide frequency spectra (histograms) of electrophoretic mobility and hydrodynamic diameter, typically only the maximal-intensity values are reported, despite the information in the remainder of the spectra. Here we propose a mapping procedure that inter-correlates these histograms to extract additional insight, in this case to probe particle size-dependent electrokinetics. Our method is illustrated for a suspension of prototypical iron(III) oxide (hematite, α-Fe2O3). We found that the electrophoretic mobility and ζ-potential are a linear function of the aggregate size. By analyzing the distribution of surface site types as a function of aggregate size, we show that site coordination increases with increasing aggregate diameter. This observation explains why the acidity of the iron oxide particles decreases with increasing particle size.
Dai, Misako; Sato, Aya; Maeba, Hiroko; Iuchi, Terumi; Matsumoto, Masaru; Okuwa, Mayumi; Nakatani, Toshio; Sanada, Hiromi; Sugama, Junko
2016-03-01
Acute dermatolymphangioadenitis (ADLA) is a risk factor for increased edema and worsening severity. Reducing ADLA frequency is an important objective of lymphedema management because ADLA episodes are strongly associated with poor quality of life. Lymphedema changes dermal and subcutaneous structure, favoring ADLA; ADLA recurrence may be caused by structural change of the dermis. However, the structure of the skin following ADLA episodes has not been studied in depth. The aim of this study was to examine changes in the skin after episodes of ADLA in breast cancer-related lymphedema (BCRL) using histogram analysis of ultrasonography findings. This was a case-control study with matching for the duration of lymphedema. We compared 10 limbs (5 BCRL patients, Cases) with a history of ADLA and 14 limbs (7 BCRL patients, Controls) without. Ultrasonography was performed using a 20-MHz probe, and measurements were made at a site 10 cm proximal to the ulnar styloid process. We compared the "skewness" of the dermis images obtained from the histogram analysis. This study was approved by the Ethics Committee of Kanazawa University. Skewness was significantly different between the affected and unaffected limbs (p = 0.02). Cases showed a positive value (median 0.74, range -0.18 to 1.26), whereas Controls showed a negative value (median -0.21, range -0.45 to 0.31). Episodes of ADLA changed the distribution of echogenicity on imaging, which indicates a change in the collagen fibers in the dermis. These findings might contribute to improving the management of lymphedema and prevention of recurrent ADLA.
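The skewness statistic used here is the standard third standardised moment of the echogenicity values; positive values indicate a tail toward high echogenicity:

```python
import numpy as np

def skewness(x):
    """Sample skewness (third standardised moment) of intensity values;
    assumes non-constant input."""
    x = np.asarray(x, float)
    m = x.mean()
    s = x.std()
    return ((x - m) ** 3).mean() / s ** 3
```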
A novel method for the evaluation of uncertainty in dose-volume histogram computation.
Henríquez, Francisco Cutanda; Castrillón, Silvia Vargas
2008-03-15
Dose-volume histograms (DVHs) are a useful tool in state-of-the-art radiotherapy treatment planning, and it is essential to recognize their limitations. Even after a specific dose-calculation model is optimized, dose distributions computed by using treatment-planning systems are affected by several sources of uncertainty, such as algorithm limitations, measurement uncertainty in the data used to model the beam, and residual differences between measured and computed dose. This report presents a novel method to take them into account. To take into account the effect of associated uncertainties, a probabilistic approach using a new kind of histogram, a dose-expected volume histogram, is introduced. The expected value of the volume in the region of interest receiving an absorbed dose equal to or greater than a certain value is found by using the probability distribution of the dose at each point. A rectangular probability distribution is assumed for this point dose, and a formulation that accounts for uncertainties associated with point dose is presented for practical computations. This method is applied to a set of DVHs for different regions of interest, including 6 brain patients, 8 lung patients, 8 pelvis patients, and 6 prostate patients planned for intensity-modulated radiation therapy. Results show a greater effect on planning target volume coverage than in organs at risk. In cases of steep DVH gradients, such as planning target volumes, this new method shows the largest differences with the corresponding DVH; thus, the effect of the uncertainty is larger.
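Under the rectangular (uniform) point-dose distribution described above, the probability that a voxel receives at least a given dose has a simple closed form. The sketch below illustrates the idea with a single uncertainty half-width `u` shared by all voxels, which is a simplifying assumption.

```python
import numpy as np

def expected_volume_histogram(dose, u, dose_levels):
    """Dose-expected volume histogram: each voxel dose d is uniform on
    [d-u, d+u]; average P(dose >= D) over voxels for each level D."""
    dose = np.asarray(dose, float).ravel()
    evh = []
    for D in dose_levels:
        p = np.clip((dose + u - D) / (2 * u), 0.0, 1.0)
        evh.append(p.mean())
    return np.array(evh)
```

In the limit u → 0 this reproduces the ordinary cumulative DVH; steep DVH gradients are where many voxels have probabilities strictly between 0 and 1, which is consistent with the largest differences being seen for planning target volumes.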
DOE Office of Scientific and Technical Information (OSTI.GOV)
Takashima, Kengo; Yamamoto, Takahiro, E-mail: takahiro@rs.tus.ac.jp; Department of Liberal Arts
Conductance fluctuation of edge-disordered graphene nanoribbons (ED-GNRs) is examined using the non-equilibrium Green's function technique combined with the extended Hückel approximation. The mean free path λ and the localization length ξ of the ED-GNRs are determined to classify the quantum transport regimes. In the diffusive regime, where the length L_c of the ED-GNRs is much longer than λ and much shorter than ξ, the conductance histogram is given by a Gaussian distribution function with universal conductance fluctuation. In the localization regime, where L_c ≫ ξ, the histogram is no longer the universal Gaussian distribution but a log-normal distribution that characterizes Anderson localization.
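The regime classification above amounts to comparing the ribbon length with the two characteristic lengths; a trivial sketch (the sharp thresholds are assumptions, since the abstract's conditions are asymptotic):

```python
def transport_regime(L_c, mfp, xi):
    """Classify quantum transport by length scales: localized when
    L_c >> xi (log-normal conductance histogram), diffusive when
    mfp << L_c << xi (Gaussian histogram with universal fluctuation)."""
    if L_c > xi:
        return "localized: log-normal conductance histogram"
    if mfp < L_c:
        return "diffusive: Gaussian histogram with UCF"
    return "ballistic/quasi-ballistic"
```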
2010-07-02
[Only figure-legend fragments remain of this report: panels comparing pancreatic infiltrating lymphocytes from 4-month-old NOD females and males (n = 8 mice/group); an assay measuring IL-2 secretion in the culture medium of DN splenic cell cultures stimulated under Th1 and Th2 conditions; and analysis-of-variance tests of the significance (p ≤ 0.005) of individual differences in the frequency of DNCD3 thymocytes and splenocytes from female and male NOD littermates.]
Wang, Liang; Yuan, Jin; Jiang, Hong; Yan, Wentao; Cintrón-Colón, Hector R; Perez, Victor L; DeBuc, Delia C; Feuer, William J; Wang, Jianhua
2016-03-01
This study determined (1) how many vessels (i.e., the vessel sampling) are needed to reliably characterize the bulbar conjunctival microvasculature and (2) if characteristic information can be obtained from the distribution histogram of the blood flow velocity and vessel diameter. A functional slit-lamp biomicroscope was used to image hundreds of venules per subject. The bulbar conjunctiva in five healthy human subjects was imaged at six different locations in the temporal bulbar conjunctiva. The histograms of the diameter and velocity were plotted to examine whether the distribution was normal. Standard errors were calculated from the standard deviation and vessel sample size. The ratio of the standard error of the mean over the population mean was used to determine the sample size cutoff. The velocity was plotted as a function of the vessel diameter to display the distribution of the diameter and velocity. The results showed that the sampling size was approximately 15 vessels, which generated a standard error equivalent to 15% of the population mean from the total vessel population. The distributions of the diameter and velocity were not only unimodal, but also somewhat positively skewed and not normal. The blood flow velocity was related to the vessel diameter (r=0.23, P<0.05). This was the first study to determine the sampling size of the vessels and the distribution histogram of the blood flow velocity and vessel diameter, which may lead to a better understanding of the human microvascular system of the bulbar conjunctiva.
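The sample-size cutoff described above follows from SEM = SD/sqrt(n): requiring SEM/mean <= r gives n >= (SD/(r*mean))^2. A sketch:

```python
import numpy as np

def samples_needed(values, ratio=0.15):
    """Smallest n whose standard error of the mean is within `ratio`
    of the mean, using the sample SD as the variability estimate."""
    v = np.asarray(values, float)
    sd, mean = v.std(ddof=1), v.mean()
    return max(int(np.ceil((sd / (ratio * mean)) ** 2)), 1)
```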
Modeling late rectal toxicities based on a parameterized representation of the 3D dose distribution
NASA Astrophysics Data System (ADS)
Buettner, Florian; Gulliford, Sarah L.; Webb, Steve; Partridge, Mike
2011-04-01
Many models exist for predicting toxicities based on dose-volume histograms (DVHs) or dose-surface histograms (DSHs). This approach has several drawbacks: firstly, the reduction of the dose distribution to a histogram results in the loss of spatial information, and secondly, the bins of the histograms are highly correlated with each other. Furthermore, some of the complex nonlinear models proposed in the past lack a direct physical interpretation and the ability to predict probabilities rather than binary outcomes. We propose a parameterized representation of the 3D distribution of the dose to the rectal wall which explicitly includes geometrical information in the form of the eccentricity of the dose distribution as well as its lateral and longitudinal extent. We use a nonlinear kernel-based probabilistic model to predict late rectal toxicity based on the parameterized dose distribution and assessed its predictive power using data from the MRC RT01 trial (ISCTRN 47772397). The endpoints under consideration were rectal bleeding, loose stools, and a global toxicity score. We extract simple rules identifying 3D dose patterns associated with a particularly low risk of complication. Normal tissue complication probability (NTCP) models based on parameterized representations of geometrical and volumetric measures resulted in areas under the curve (AUCs) of 0.66, 0.63 and 0.67 for predicting rectal bleeding, loose stools and global toxicity, respectively. In comparison, NTCP models based on standard DVHs performed worse and resulted in AUCs of 0.59 for all three endpoints. In conclusion, we have presented low-dimensional, interpretable and nonlinear NTCP models based on the parameterized representation of the dose to the rectal wall. These models had a higher predictive power than models based on standard DVHs, and their low dimensionality allowed for the identification of 3D dose patterns related to a low risk of complication.
ARM Radar Contoured Frequency by Altitude Diagram (CFAD) Data Products
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Yuying
2017-03-10
To compare with ARM cloud radar simulator outputs, observational reflectivity-height joint histograms, i.e., CFADs, are constructed from the operational ARM Active Remote Sensing of CLouds (ARSCL) Value-Added Product.
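A CFAD is a joint reflectivity-height histogram normalised within each altitude bin; a minimal sketch (the bin edges below are arbitrary assumptions):

```python
import numpy as np

def cfad(reflectivity, height, z_bins, h_bins):
    """Contoured frequency by altitude diagram: joint reflectivity-height
    histogram, normalised so each altitude row sums to one (or zero)."""
    H, _, _ = np.histogram2d(height, reflectivity, bins=[h_bins, z_bins])
    row_sums = H.sum(axis=1, keepdims=True)
    return np.divide(H, row_sums, out=np.zeros_like(H), where=row_sums > 0)
```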
Fluorescence lifetime as a new parameter in analytical cytology measurements
NASA Astrophysics Data System (ADS)
Steinkamp, John A.; Deka, Chiranjit; Lehnert, Bruce E.; Crissman, Harry A.
1996-05-01
A phase-sensitive flow cytometer has been developed to quantify fluorescence decay lifetimes on fluorochrome-labeled cells/particles. This instrument combines flow cytometry (FCM) and frequency-domain fluorescence spectroscopy measurement principles to provide unique capabilities for making phase-resolved lifetime measurements, while preserving conventional FCM capabilities. Cells are analyzed as they intersect a high-frequency, intensity-modulated (sine wave) laser excitation beam. Fluorescence signals are processed by conventional and phase-sensitive signal detection electronics and displayed as frequency distribution histograms. In this study we describe results of fluorescence intensity and lifetime measurements on fluorescently labeled particles, cells, and chromosomes. Examples of measurements on intrinsic cellular autofluorescence, cells labeled with immunofluorescence markers for cell- surface antigens, mitochondria stains, and on cellular DNA and protein binding fluorochromes will be presented to illustrate unique differences in measured lifetimes and changes caused by fluorescence quenching. This innovative technology will be used to probe fluorochrome/molecular interactions in the microenvironment of cells/chromosomes as a new parameter and thus expand the researchers' understanding of biochemical processes and structural features at the cellular and molecular level.
SPA- STATISTICAL PACKAGE FOR TIME AND FREQUENCY DOMAIN ANALYSIS
NASA Technical Reports Server (NTRS)
Brownlow, J. D.
1994-01-01
The need for statistical analysis often arises when data is in the form of a time series. This type of data is usually a collection of numerical observations made at specified time intervals. Two kinds of analysis may be performed on the data. First, the time series may be treated as a set of independent observations using a time domain analysis to derive the usual statistical properties including the mean, variance, and distribution form. Secondly, the order and time intervals of the observations may be used in a frequency domain analysis to examine the time series for periodicities. In almost all practical applications, the collected data is actually a mixture of the desired signal and a noise signal which is collected over a finite time period with a finite precision. Therefore, any statistical calculations and analyses are actually estimates. The Spectrum Analysis (SPA) program was developed to perform a wide range of statistical estimation functions. SPA can provide the data analyst with a rigorous tool for performing time and frequency domain studies. In a time domain statistical analysis the SPA program will compute the mean, variance, standard deviation, mean square, and root mean square. It also lists the data maximum, data minimum, and the number of observations included in the sample. In addition, a histogram of the time domain data is generated, a normal curve is fit to the histogram, and a goodness-of-fit test is performed. These time domain calculations may be performed on both raw and filtered data. For a frequency domain statistical analysis the SPA program computes the power spectrum, cross spectrum, coherence, phase angle, amplitude ratio, and transfer function. The estimates of the frequency domain parameters may be smoothed with the use of Hann-Tukey, Hamming, Bartlett, or moving average windows. Various digital filters are available to isolate data frequency components.
Frequency components with periods longer than the data collection interval are removed by least-squares detrending. As many as ten channels of data may be analyzed at one time. Both tabular and plotted output may be generated by the SPA program. This program is written in FORTRAN IV and has been implemented on a CDC 6000 series computer with a central memory requirement of approximately 142K (octal) of 60 bit words. This core requirement can be reduced by segmentation of the program. The SPA program was developed in 1978.
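The time-domain quantities SPA reports are straightforward to reproduce. Below is a minimal Python sketch (not the original FORTRAN IV SPA code) of the same summary statistics:

```python
import numpy as np

def time_domain_stats(x):
    """Summary statistics in the spirit of SPA's time-domain analysis."""
    x = np.asarray(x, dtype=float)
    return {
        "n": x.size,
        "mean": x.mean(),
        "variance": x.var(ddof=1),       # sample variance
        "std": x.std(ddof=1),            # sample standard deviation
        "mean_square": np.mean(x ** 2),
        "rms": np.sqrt(np.mean(x ** 2)), # root mean square
        "min": x.min(),
        "max": x.max(),
    }

stats = time_domain_stats([1.0, 2.0, 3.0, 4.0])
```

A histogram with a fitted normal curve and a goodness-of-fit test, as SPA produces, would follow from these same moments.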
Detecting cell death with optical coherence tomography and envelope statistics
NASA Astrophysics Data System (ADS)
Farhat, Golnaz; Yang, Victor X. D.; Czarnota, Gregory J.; Kolios, Michael C.
2011-02-01
Currently no standard clinical or preclinical noninvasive method exists to monitor cell death based on morphological changes at the cellular level. In our past work we have demonstrated that quantitative high frequency ultrasound imaging can detect cell death in vitro and in vivo. In this study we apply quantitative methods previously used with high frequency ultrasound to optical coherence tomography (OCT) to detect cell death. The ultimate goal of this work is to use these methods for optically-based clinical and preclinical cancer treatment monitoring. Optical coherence tomography data were acquired from acute myeloid leukemia cells undergoing three modes of cell death. Significant increases in integrated backscatter were observed for cells undergoing apoptosis and mitotic arrest, while necrotic cells induced a decrease. These changes appear to be linked to structural changes observed in histology obtained from the cell samples. Signal envelope statistics were analyzed from fittings of the generalized gamma distribution to histograms of envelope intensities. The parameters from this distribution demonstrated sensitivities to morphological changes in the cell samples. These results indicate that OCT integrated backscatter and first order envelope statistics can be used to detect and potentially differentiate between modes of cell death in vitro.
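The study fits a generalized gamma distribution to histograms of envelope intensities. As a simpler illustration of first-order envelope statistics, the sketch below simulates fully developed speckle, whose Rayleigh envelope is a special case of the generalized gamma, and builds the corresponding histogram:

```python
import numpy as np

rng = np.random.default_rng(0)
# Fully developed speckle: the envelope of a circular complex Gaussian field
# follows a Rayleigh distribution (a special case of the generalized gamma)
field = rng.normal(size=100_000) + 1j * rng.normal(size=100_000)
envelope = np.abs(field)

# Histogram of envelope intensities, normalized as a density
counts, edges = np.histogram(envelope, bins=64, density=True)
snr = envelope.mean() / envelope.std()  # ~1.91 for Rayleigh envelope statistics
```

Departures of the measured envelope statistics from this Rayleigh baseline are what the fitted distribution parameters quantify.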
Cauley, K A; Hu, Y; Och, J; Yorks, P J; Fielden, S W
2018-04-01
The majority of brain growth and development occur in the first 2 years of life. This study investigated these changes by analysis of the brain radiodensity histogram of head CT scans from the clinical population, 0-2 years of age. One hundred twenty consecutive head CTs with normal findings meeting the inclusion criteria from children from birth to 2 years were retrospectively identified from 3 different CT scan platforms. Histogram analysis was performed on brain-extracted images, and histogram mean, mode, full width at half maximum, skewness, kurtosis, and SD were correlated with subject age. The effects of scan platform were investigated. Normative curves were fitted by polynomial regression analysis. Average total brain volume was 360 cm³ at birth, 948 cm³ at 1 year, and 1072 cm³ at 2 years. Total brain tissue density showed an 11% increase in mean density at 1 year and 19% at 2 years. Brain radiodensity histogram skewness was positive at birth, declining logarithmically in the first 200 days of life. The histogram kurtosis also decreased in the first 200 days to approach a normal distribution. Direct segmentation of CT images showed that changes in brain radiodensity histogram skewness correlated with, and can be explained by, a relative increase in gray matter volume and an increase in gray and white matter tissue density that occurs during this period of brain maturation. Normative metrics of the brain radiodensity histogram derived from routine clinical head CT images can be used to develop a model of normal brain development. © 2018 by American Journal of Neuroradiology.
Locally connected neural network with improved feature vector
NASA Technical Reports Server (NTRS)
Thomas, Tyson (Inventor)
2004-01-01
A pattern recognizer that uses neuromorphs with a fixed amount of energy distributed among the elements. The distribution of this energy forms a histogram, which is used as a feature vector.
Quantitative histogram analysis of images
NASA Astrophysics Data System (ADS)
Holub, Oliver; Ferreira, Sérgio T.
2006-11-01
A routine for histogram analysis of images has been written in the object-oriented, graphical development environment LabVIEW. The program converts an RGB bitmap image into an intensity-linear greyscale image according to selectable conversion coefficients. This greyscale image is subsequently analysed by plots of the intensity histogram and probability distribution of brightness, and by calculation of various parameters, including average brightness, standard deviation, variance, minimal and maximal brightness, mode, skewness and kurtosis of the histogram and the median of the probability distribution. The program allows interactive selection of specific regions of interest (ROI) in the image and definition of lower and upper threshold levels (e.g., to permit the removal of a constant background signal). The results of the analysis of multiple images can be conveniently saved and exported for plotting in other programs, which allows fast analysis of relatively large sets of image data. The program file accompanies this manuscript together with a detailed description of two application examples: The analysis of fluorescence microscopy images, specifically of tau-immunofluorescence in primary cultures of rat cortical and hippocampal neurons, and the quantification of protein bands by Western blot. The possibilities and limitations of this kind of analysis are discussed.
Program summary
Title of program: HAWGC
Catalogue identifier: ADXG_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXG_v1_0
Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
Computers: Mobile Intel Pentium III, AMD Duron
Installations: No installation necessary; executable file together with necessary files for LabVIEW Run-time engine
Operating systems under which the program has been tested: Windows ME/2000/XP
Programming language used: LabVIEW 7.0
Memory required to execute with typical data: ~16 MB for starting and ~160 MB used for loading of an image
No. of bits in a word: 32
No. of processors used: 1
Has the code been vectorized or parallelized?: No
No. of lines in distributed program, including test data, etc.: 138 946
No. of bytes in distributed program, including test data, etc.: 15 166 675
Distribution format: tar.gz
Nature of physical problem: Quantification of image data (e.g., for discrimination of molecular species in gels or fluorescent molecular probes in cell cultures) requires proprietary or complex software packages, which might not include the relevant statistical parameters or make the analysis of multiple images a tedious procedure for the general user.
Method of solution: Tool for conversion of an RGB bitmap image into a luminance-linear image and extraction of the luminance histogram, probability distribution, and statistical parameters (average brightness, standard deviation, variance, minimal and maximal brightness, mode, skewness and kurtosis of the histogram and median of the probability distribution), with possible selection of a region of interest (ROI) and lower and upper threshold levels.
Restrictions on the complexity of the problem: Does not incorporate application-specific functions (e.g., morphometric analysis)
Typical running time: Seconds (depending on image size and processor speed)
Unusual features of the program: None
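The RGB-to-greyscale conversion step the program performs can be sketched in a few lines. The coefficients in HAWGC are user-selectable; the BT.601 luma weights below are only an illustrative choice:

```python
import numpy as np

# Illustrative conversion coefficients (user-selectable in the actual program):
# ITU-R BT.601 luma weights, which sum to 1.0
COEFFS = np.array([0.299, 0.587, 0.114])

def rgb_to_grey(rgb):
    """Convert an RGB array of shape (..., 3) to an intensity-linear greyscale image."""
    return np.asarray(rgb, dtype=float) @ COEFFS

grey = rgb_to_grey([[[255, 255, 255], [0, 0, 0]]])  # one white pixel, one black pixel
```

The resulting greyscale array is what the histogram and probability-distribution analyses operate on.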
Beginning without a Conclusion.
ERIC Educational Resources Information Center
Frazier, Richard
1988-01-01
Describes a series of activities without conclusions to introduce scientific reasoning in a ninth grade physical science course. Uses popcorn popping to get students to think about the concepts of graphing, histograms, frequency, probability, and scientific methodology. (CW)
Location of Rotator Cuff Tear Initiation: A Magnetic Resonance Imaging Study of 191 Shoulders.
Jeong, Jeung Yeol; Min, Seul Ki; Park, Keun Min; Park, Yong Bok; Han, Kwang Joon; Yoo, Jae Chul
2018-03-01
Degenerative rotator cuff tears (RCTs) are generally thought to originate at the anterior margin of the supraspinatus tendon. However, a recent ultrasonography study suggested that they might originate more posteriorly than originally thought, perhaps even from the isolated infraspinatus (ISP) tendon, and propagate toward the anterior supraspinatus. Hypothesis/Purpose: It was hypothesized that this finding could be reproduced with magnetic resonance imaging (MRI). The purpose was to determine the most common location of degenerative RCTs by using 3-dimensional multiplanar MRI reconstruction. It was assumed that the location of the partial-thickness tears would identify the area of the initiation of full-thickness tears. Cross-sectional study; Level of evidence, 3. A retrospective analysis was conducted including 245 patients who had RCTs (nearly full- or partial-thickness tears) at the outpatient department between January 2011 and December 2013. RCTs were measured on 3-dimensional multiplanar reconstruction MRI with OsiriX software. The width and distance from the biceps tendon to the anterior margin of the tear were measured on T2-weighted sagittal images. In a spreadsheet, columns of consecutive numbers represented the size of each tear (anteroposterior width) and their locations with respect to the biceps brachii tendon. Data were pooled to graphically represent the width and location of all tears. Frequency histograms of the columns were made to visualize the distribution of tears. The tears were divided into 2 groups based on width (group A, <10 mm; group B, <20 and ≥10 mm) and analyzed for any differences in location related to size. The mean width of all RCTs was 11.9 ± 4.1 mm, and the mean length was 11.1 ± 5.0 mm. Histograms showed the most common location of origin to be 9 to 10 mm posterior to the biceps tendon. 
The histograms of groups A and B showed similar tear location distributions, indicating that the region approximately 10 mm posterior to the biceps tendon is the most common site of tear initiation. These results demonstrate that degenerative RCTs most commonly originate from approximately 9 to 10 mm posterior to the biceps tendon.
Teh, V; Sim, K S; Wong, E K
2016-11-01
According to statistics from the World Health Organization (WHO), stroke is one of the major causes of death globally. Computed tomography (CT) is one of the main medical imaging systems used for diagnosis of ischemic stroke. CT scans provide brain images in Digital Imaging and Communications in Medicine (DICOM) format. The presentation of CT brain images relies mainly on the window setting (window center and window width), which converts an image from DICOM format into a normal grayscale format. Nevertheless, the ordinary window parameters cannot deliver proper contrast on CT brain images for ischemic stroke detection. In this paper, a newly proposed method, gamma correction extreme-level eliminating with weighting distribution (GCELEWD), is implemented to improve the contrast of CT brain images. GCELEWD is capable of highlighting the hypodense region for diagnosis of ischemic stroke. The performance of this new technique is compared with four existing contrast enhancement techniques: brightness preserving bi-histogram equalization (BBHE), dualistic sub-image histogram equalization (DSIHE), extreme-level eliminating histogram equalization (ELEHE), and adaptive gamma correction with weighting distribution (AGCWD). GCELEWD shows better visualization for ischemic stroke detection and higher scores on image quality assessment (IQA) modules. SCANNING 38:842-856, 2016. © 2016 Wiley Periodicals, Inc.
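For context, the AGCWD baseline mentioned above derives a per-intensity gamma from a weighted histogram distribution. The following is a hedged NumPy sketch of that idea, not the authors' GCELEWD implementation:

```python
import numpy as np

def agcwd(image, alpha=0.5):
    """Sketch of adaptive gamma correction with weighting distribution (AGCWD)."""
    img = np.asarray(image, dtype=np.uint8)
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    pdf = hist / hist.sum()
    lo, hi = pdf.min(), pdf.max()
    pdf_w = hi * ((pdf - lo) / (hi - lo)) ** alpha  # weighted histogram distribution
    cdf_w = np.cumsum(pdf_w) / pdf_w.sum()
    gamma = 1.0 - cdf_w                             # adaptive per-level gamma
    levels = np.arange(256) / 255.0
    lut = np.round(255.0 * levels ** gamma).astype(np.uint8)  # look-up table
    return lut[img]

# Dark synthetic image (levels 0-63); the adaptive gamma brightens it
demo = agcwd(np.tile(np.arange(64, dtype=np.uint8), (4, 1)))
```

Because the per-level gamma stays at or below 1, the transform never darkens a pixel and stretches a dark image toward the full grayscale range.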
SU-D-201-02: Prediction of Delivered Dose Based On a Joint Histogram of CT and FDG PET Images
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, M; Choi, Y; Cho, A
2015-06-15
Purpose: To investigate whether pre-treatment images can be used in predicting microsphere distribution in tumors. When intra-arterial radioembolization using Y90 microspheres was performed, the microspheres were often delivered non-uniformly within the tumor, which could lead to an inefficient therapy. Therefore, it is important to estimate the distribution of microspheres. Methods: Early arterial phase CT and FDG PET images were acquired for patients with primary liver cancer prior to radioembolization (RE) using Y90 microspheres. Tumor volume was delineated on CT images and fused with FDG PET images. From each voxel (3.9×3.9×3.3 mm³) in the tumor, the Hounsfield unit (HU) from the CT and SUV values from the FDG PET were harvested. We binned both HU and SUV into 11 bins and then calculated a normalized joint histogram in an 11×11 array. Patients also underwent post-treatment Y90 PET imaging. Radiation dose for the tumor was estimated using convolution of the Y90 distribution with a dose-point kernel. We also calculated the fraction of the tumor volume that received a radiation dose greater than 100 Gy. Results: Averaged over 40 patients, 55% of tumor volume received a dose greater than 100 Gy (range: 1.1-100%). The width of the joint histogram was narrower for patients with a high dose. For patients with a low dose, the width was wider and a larger fraction of tumor volume had low HU. Conclusion: We have shown that the pattern of the joint histogram of HU and SUV depends on delivered dose. The patterns can predict the efficacy of uniform intra-arterial delivery of Y90 microspheres.
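The 11×11 normalized joint histogram described in the Methods can be built with `numpy.histogram2d`. The HU and SUV values below are simulated placeholders, not patient data:

```python
import numpy as np

rng = np.random.default_rng(2)
hu = rng.normal(60.0, 15.0, 1_000)    # simulated per-voxel Hounsfield units
suv = rng.lognormal(1.0, 0.4, 1_000)  # simulated per-voxel FDG SUVs

# 11x11 joint histogram of HU vs SUV, normalized so the entries sum to 1
joint, hu_edges, suv_edges = np.histogram2d(hu, suv, bins=11)
joint = joint / joint.sum()
```

The "width" of this 2D distribution is then what the study relates to delivered dose.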
Dankers, Frank; Wijsman, Robin; Troost, Esther G C; Monshouwer, René; Bussink, Johan; Hoffmann, Aswin L
2017-05-07
In our previous work, a multivariable normal-tissue complication probability (NTCP) model for acute esophageal toxicity (AET) Grade ⩾2 after highly conformal (chemo-)radiotherapy for non-small cell lung cancer (NSCLC) was developed using multivariable logistic regression analysis incorporating clinical parameters and mean esophageal dose (MED). Since the esophagus is a tubular organ, spatial information of the esophageal wall dose distribution may be important in predicting AET. We investigated whether the incorporation of esophageal wall dose-surface data with spatial information improves the predictive power of our established NTCP model. For 149 NSCLC patients treated with highly conformal radiation therapy esophageal wall dose-surface histograms (DSHs) and polar dose-surface maps (DSMs) were generated. DSMs were used to generate new DSHs and dose-length-histograms that incorporate spatial information of the dose-surface distribution. From these histograms dose parameters were derived and univariate logistic regression analysis showed that they correlated significantly with AET. Following our previous work, new multivariable NTCP models were developed using the most significant dose histogram parameters based on univariate analysis (19 in total). However, the 19 new models incorporating esophageal wall dose-surface data with spatial information did not show improved predictive performance (area under the curve, AUC range 0.79-0.84) over the established multivariable NTCP model based on conventional dose-volume data (AUC = 0.84). For prediction of AET, based on the proposed multivariable statistical approach, spatial information of the esophageal wall dose distribution is of no added value and it is sufficient to only consider MED as a predictive dosimetric parameter.
Methods in quantitative image analysis.
Oberholzer, M; Ostreicher, M; Christen, H; Brühlmann, M
1996-05-01
The main steps of image analysis are image capturing, image storage (compression), correction of imaging defects (e.g. non-uniform illumination, electronic noise, glare effect), image enhancement, segmentation of objects in the image and image measurements. Digitisation is performed by a camera. The most modern types include a frame-grabber, which converts the analog signal into digital (numerical) information. The numerical information consists of the grey values describing the brightness of every point within the image, named a pixel. The information is stored in bits. Eight bits are summarised in one byte. Therefore, grey values can range from 0 to 255 (2^8 = 256 levels). The human eye seems to be quite content with a display of 6-bit images (corresponding to 64 different grey values). In a digitised image, the pixel grey values can vary within regions that are uniform in the original scene: the image is noisy. The noise is mainly manifested in the background of the image. For an optimal discrimination between different objects or features in an image, uniformity of illumination in the whole image is required. These defects can be minimised by shading correction [subtraction of a background (white) image from the original image, pixel per pixel, or division of the original image by the background image]. The brightness of an image represented by its grey values can be analysed for every single pixel or for a group of pixels. The most frequently used pixel-based image descriptors are optical density, integrated optical density, the histogram of the grey values, mean grey value and entropy. The distribution of the grey values existing within an image is one of the most important characteristics of the image. However, the histogram gives no information about the texture of the image. The simplest way to improve the contrast of an image is to expand the brightness scale by spreading the histogram out to the full available range.
Rules for transforming the grey value histogram of an existing image (input image) into a new grey value histogram (output image) are most quickly handled by a look-up table (LUT). The histogram of an image can be influenced by gain, offset and gamma of the camera. Gain defines the voltage range, offset defines the reference voltage and gamma the slope of the regression line between the light intensity and the voltage of the camera. A very important descriptor of neighbourhood relations in an image is the co-occurrence matrix. The distance between the pixels (original pixel and its neighbouring pixel) can influence the various parameters calculated from the co-occurrence matrix. The main goals of image enhancement are elimination of surface roughness in an image (smoothing), correction of defects (e.g. noise), extraction of edges, identification of points, strengthening texture elements and improving contrast. In enhancement, two types of operations can be distinguished: pixel-based (point operations) and neighbourhood-based (matrix operations). The most important pixel-based operations are linear stretching of grey values, application of pre-stored LUTs and histogram equalisation. The neighbourhood-based operations work with so-called filters. These are organising elements with an original or initial point in their centre. Filters can be used to accentuate or to suppress specific structures within the image. Filters can work either in the spatial or in the frequency domain. The method used for analysing alterations of grey value intensities in the frequency domain is the Hartley transform. Filter operations in the spatial domain can be based on averaging or ranking the grey values occurring in the organising element. The most important filters, which are usually applied, are the Gaussian filter and the Laplace filter (both averaging filters), and the median filter, the top hat filter and the range operator (all ranking filters). 
Segmentation of objects is traditionally based on threshold grey values.
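The look-up-table (LUT) grey-value transforms and histogram stretching described above reduce to a small amount of code. The linear contrast stretch below is an illustrative sketch, not the authors' software:

```python
import numpy as np

def stretch_lut(image):
    """Linear contrast stretch implemented as a 256-entry look-up table (LUT)."""
    img = np.asarray(image, dtype=np.uint8)
    lo, hi = int(img.min()), int(img.max())
    lut = np.zeros(256, dtype=np.uint8)
    if hi > lo:
        # Map the occupied range [lo, hi] linearly onto [0, 255]
        ramp = (np.arange(lo, hi + 1) - lo) * 255.0 / (hi - lo)
        lut[lo:hi + 1] = np.round(ramp).astype(np.uint8)
        lut[hi + 1:] = 255
    return lut[img]  # apply the LUT to every pixel at once

out = stretch_lut([[50, 100], [150, 200]])
```

Any other grey-value transform (e.g. gamma correction or histogram equalisation) fits the same pattern: precompute a 256-entry table, then index into it.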
Kafemann, R.; Thiel, R.; Finn, J.E.; Neukamm, R.
1998-01-01
Abundance and biomass data for juveniles and adults, length frequency histograms and the electron microprobe analysis (EPMA) of otoliths were used to indicate density, migration and reproduction of common bream Abramis brama in the Kiel Canal drainage, Germany. The reproduction of common bream was primarily restricted to two types of spawning habitats: one in the Haaler Au, a freshwater tributary, and another in a shallow, oligohaline portion of the main Canal. Both spawning habitats were morphologically characterized as shallow with submerged vegetation. From April to June, concentrations of spawners were observed, whereas age-0 common bream dominated from August through December. The distribution of age-0 common bream was primarily restricted to fresh and oligohaline waters. Outside the spawning season, the distribution of common bream was less obvious. Adult fish were more widely distributed within the Canal, indicating a tolerance for higher salinities. During the spawning season common bream seem to show an exceptional mobility between spawning and feeding habitats, which are denoted by different salinities.
Machine assisted histogram classification
NASA Astrophysics Data System (ADS)
Benyó, B.; Gaspar, C.; Somogyi, P.
2010-04-01
LHCb is one of the four major experiments at the Large Hadron Collider (LHC). Monitoring the quality of the acquired data is important, because it allows the verification of the detector performance. Anomalies, such as missing values or unexpected distributions, can be indicators of a malfunctioning detector, resulting in poor data quality. Spotting faulty or ageing components can be done either visually using instruments such as the LHCb Histogram Presenter, or with the help of automated tools. In order to assist detector experts in handling the vast monitoring information resulting from the sheer size of the detector, we propose a graph-based clustering tool combined with a machine learning algorithm and demonstrate its use by processing histograms representing 2D hitmap events. We prove the concept by detecting ion feedback events in the LHCb experiment's RICH subdetector.
Student's Conceptions in Statistical Graph's Interpretation
ERIC Educational Resources Information Center
Kukliansky, Ida
2016-01-01
Histograms, box plots and cumulative distribution graphs are popular graphic representations for statistical distributions. The main research question that this study focuses on is how college students deal with interpretation of these statistical graphs when translating graphical representations into analytical concepts in descriptive statistics.…
NASA Technical Reports Server (NTRS)
Tedesco, M.; Kim, E. J.; Gasiewski, A.; Stankov, B.
2005-01-01
Brightness temperature maps at 18.7 and 37 GHz collected at the Fraser and North Park Meso-Scale Areas (MSAs) during the Cold Land Processes Experiment by the NOAA Polarimetric Scanning Radiometer (PSR) airborne sensor are analyzed. The Fraser site is mostly covered by forest with a typical snowpack depth of 1 m, while North Park has no forest cover and is characterized by patches of shallow snow. We examine histograms of the brightness temperatures at 500 m resolution for both the Fraser and North Park areas. The histograms can be modelled by a log-normal distribution in the case of the Fraser MSA and by a bi-modal distribution in the case of the North Park MSA. Histograms of the brightness temperatures at coarser resolutions are also plotted to study the effects of sensor resolution on the shape of the distribution and on the average brightness temperatures and standard deviations. Finally, the brightness temperatures obtained by re-sampling (aggregating) the data at 25 km resolution are compared with the brightness temperatures collected by the Advanced Microwave Scanning Radiometer (AMSR-E) and Special Sensor Microwave/Imager (SSM/I) satellite radiometers. The results show that in both areas, for sensor footprints larger than 5000 m, the brightness temperatures show a flat distribution and the memory of the initial distribution is lost. The brightness temperatures measured by the satellite radiometers are in good agreement with the values obtained by averaging the airborne data, although some discrepancies occur.
Kawase, Takatsugu; Kunieda, Etsuo; Deloar, Hossain M; Tsunoo, Takanori; Seki, Satoshi; Oku, Yohei; Saitoh, Hidetoshi; Saito, Kimiaki; Ogawa, Eileen N; Ishizaka, Akitoshi; Kameyama, Kaori; Kubo, Atsushi
2009-10-01
To validate the feasibility of developing a radiotherapy unit with kilovoltage X-rays through actual irradiation of live rabbit lungs, and to explore the practical issues anticipated in future clinical application to humans through Monte Carlo dose simulation. A converging stereotactic irradiation unit was developed, consisting of a modified diagnostic computed tomography (CT) scanner. A tiny cylindrical volume in 13 normal rabbit lungs was individually irradiated with single fractional absorbed doses of 15, 30, 45, and 60 Gy. Observational CT scanning of the whole lung was performed every 2 weeks for 30 weeks after irradiation. After 30 weeks, histopathologic specimens of the lungs were examined. Dose distribution was simulated using the Monte Carlo method, and dose-volume histograms were calculated according to the data. A trial estimation of the effect of respiratory movement on dose distribution was made. A localized hypodense change and subsequent reticular opacity around the planning target volume (PTV) were observed in CT images of rabbit lungs. Dose-volume histograms of the PTVs and organs at risk showed a focused dose distribution to the target and sufficient dose lowering in the organs at risk. Our estimate of the dose distribution, taking respiratory movement into account, revealed dose reduction in the PTV. A converging stereotactic irradiation unit using kilovoltage X-rays was able to generate a focused radiobiologic reaction in rabbit lungs. Dose-volume histogram analysis and estimated sagittal dose distribution, considering respiratory movement, clarified the characteristics of the irradiation received from this type of unit.
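The dose-volume histograms computed here are cumulative curves of volume fraction versus dose. A minimal sketch with made-up voxel doses, not the Monte Carlo data from the study:

```python
import numpy as np

def cumulative_dvh(dose, bins=100):
    """Cumulative DVH: fraction of volume receiving at least each dose level."""
    dose = np.asarray(dose, dtype=float).ravel()
    levels = np.linspace(0.0, dose.max(), bins)
    volume_fraction = np.array([(dose >= d).mean() for d in levels])
    return levels, volume_fraction

# Four made-up voxel doses (Gy), purely for illustration
levels, vf = cumulative_dvh(np.array([10.0, 20.0, 30.0, 40.0]))
```

A sharply focused dose distribution shows up as a DVH that stays near 1 inside the target and falls steeply, exactly the behavior reported for the PTVs above.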
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van Dyke, Erick S.; Jonnasson, Brian C.; Carmichael, Richard W.
2001-07-01
Rotary screw traps, located at four sites in the Grande Ronde River basin, were used to characterize aspects of early life history exhibited by juvenile Oncorhynchus mykiss during migration years 1995-99. The Lostine, Catherine Creek and upper Grande Ronde traps captured fish as they migrated out of spawning areas into valley rearing habitats. The Grande Ronde Valley trap captured fish as they left valley habitats downstream of Catherine Creek and upper Grande Ronde River rearing habitats. Dispersal downstream of spawning areas was most evident in fall and spring, but movement occurred during all seasons that the traps were fished. Seaward migration occurred primarily in spring, when O. mykiss smolts left overwintering areas located in both spawning-area and valley habitats. Migration patterns exhibited by O. mykiss suggest that Grande Ronde Valley habitats are used for overwintering and should be considered critical rearing habitat. We were unable to positively differentiate anadromous and resident forms of O. mykiss in the Grande Ronde River basin because both forms occur in our study area. The Grande Ronde Valley trap provided the best information on steelhead production in the basin because it fished below valley habitats where O. mykiss overwinter. Length frequency histograms of O. mykiss captured below upper spawning and rearing habitats showed a bimodal distribution regardless of the season of capture. Scale analyses suggested that each mode represents a different brood year. Length frequency histograms of O. mykiss captured in the Grande Ronde Valley trap were not bimodal, and primarily represented a size range consistent with other researchers' accounts of anadromous smolts.
Sadasivan, Chander; Brownstein, Jeremy; Patel, Bhumika; Dholakia, Ronak; Santore, Joseph; Al-Mufti, Fawaz; Puig, Enrique; Rakian, Audrey; Fernandez-Prada, Kenneth D; Elhammady, Mohamed S; Farhat, Hamad; Fiorella, David J; Woo, Henry H; Aziz-Sultan, Mohammad A; Lieber, Baruch B
2013-03-01
Endovascular coiling of cerebral aneurysms remains limited by coil compaction and associated recanalization. Recent coil designs which effect higher packing densities may be far from optimal because hemodynamic forces causing compaction are not well understood, since detailed data regarding the location and distribution of coil masses are unavailable. We present an in vitro methodology to characterize coil masses deployed within aneurysms by quantifying intra-aneurysmal void spaces. Eight identical aneurysms were packed with coils by both balloon- and stent-assist techniques. The samples were embedded, sequentially sectioned and imaged. Empty spaces between the coils were numerically filled with circles (2D) in the planar images and with spheres (3D) in the three-dimensional composite images. The 2D and 3D void size histograms were analyzed for local variations and by fitting theoretical probability distribution functions. Balloon-assist packing densities (31±2%) were lower (p = 0.04) than the stent-assist group (40±7%). The maximum and average 2D and 3D void sizes were higher (p = 0.03 to 0.05) in the balloon-assist group as compared to the stent-assist group. None of the void size histograms were normally distributed; theoretical probability distribution fits suggest that the histograms are most probably exponentially distributed, with decay constants of 6-10 mm. Significant (p ≤ 0.001 to p = 0.03) spatial trends were noted in the void sizes, but correlation coefficients were generally low (|r| ≤ 0.35). The methodology we present can provide valuable input data for numerical calculations of hemodynamic forces impinging on intra-aneurysmal coil masses and be used to compare and optimize coil configurations as well as coiling techniques.
Park, Kyeong-Yeon; Jin, In-Ki
2015-09-01
The purpose of this study was to identify differences between the dynamic ranges (DRs) of male and female speakers using Korean standard sentence material. Consideration was especially given to effects within the predefined segmentalized frequency-bands. We used Korean standard sentence lists for adults as stimuli. Each sentence was normalized to a root-mean-square of 65 dB sound pressure level. The sentences were then modified to ensure there were no pauses, and the modified sentences were passed through a filter bank in order to perform the frequency analysis. Finally, the DR was quantified using a histogram that showed the cumulative envelope distribution levels of the speech in each frequency band. In DRs that were averaged across all frequency bands, there were no significant differences between the male and the female speakers. However, when considering effects within the predefined frequency bands, there were significant differences in several frequency bands between the DRs of male speech and those of female speech. This study shows that the DR of speech for the male speaker differed from the female speaker in nine frequency bands among 21 frequency bands. These observed differences suggest that a standardized DR of male speech in the band-audibility function of the speech intelligibility index may differ from that of female speech derived in the same way. Further studies are required to derive standardized DRs for Korean speakers.
Wu, Rongli; Watanabe, Yoshiyuki; Arisawa, Atsuko; Takahashi, Hiroto; Tanaka, Hisashi; Fujimoto, Yasunori; Watabe, Tadashi; Isohashi, Kayako; Hatazawa, Jun; Tomiyama, Noriyuki
2017-10-01
This study aimed to compare tumor volume definitions based on conventional magnetic resonance (MR) and 11C-methionine positron emission tomography (MET/PET) images for the differentiation of pre-operative glioma grade by whole-tumor histogram analysis of normalized cerebral blood volume (nCBV) maps. Thirty-four patients with histopathologically proven primary brain low-grade gliomas (n = 15) and high-grade gliomas (n = 19) underwent pre-operative or pre-biopsy MET/PET, fluid-attenuated inversion recovery imaging, dynamic susceptibility contrast perfusion-weighted MR imaging, and contrast-enhanced T1-weighted imaging at 3.0 T. The histogram distribution derived from the nCBV maps was obtained by co-registering the whole tumor volume delineated on conventional MR or MET/PET images, and eight histogram parameters were assessed. The mean nCBV value had the highest AUC (0.906) based on MET/PET images. Diagnostic accuracy improved significantly when the tumor volume was measured from MET/PET images rather than conventional MR images for the mean, 50th percentile, and 75th percentile nCBV values (p = 0.0246, 0.0223, and 0.0150, respectively). Whole-tumor histogram analysis of the nCBV map provides more valuable histogram parameters and increases diagnostic accuracy in the differentiation of pre-operative cerebral gliomas when the tumor volume is derived from MET/PET images.
Li, Anqin; Xing, Wei; Li, Haojie; Hu, Yao; Hu, Daoyu; Li, Zhen; Kamel, Ihab R
2018-05-29
The purpose of this article is to evaluate the utility of volumetric histogram analysis of apparent diffusion coefficient (ADC) derived from reduced-FOV DWI for small (≤ 4 cm) solid renal mass subtypes at 3-T MRI. This retrospective study included 38 clear cell renal cell carcinomas (RCCs), 16 papillary RCCs, 18 chromophobe RCCs, 13 minimal fat angiomyolipomas (AMLs), and seven oncocytomas evaluated with preoperative MRI. Volumetric ADC maps were generated using all slices of the reduced-FOV DW images to obtain histogram parameters, including mean, median, 10th percentile, 25th percentile, 75th percentile, 90th percentile, and SD ADC values, as well as skewness, kurtosis, and entropy. Comparisons of these parameters were made by one-way ANOVA, t test, and ROC curves analysis. ADC histogram parameters differentiated eight of 10 pairs of renal tumors. Three subtype pairs (clear cell RCC vs papillary RCC, clear cell RCC vs chromophobe RCC, and clear cell RCC vs minimal fat AML) were differentiated by mean ADC. However, five other subtype pairs (clear cell RCC vs oncocytoma, papillary RCC vs minimal fat AML, papillary RCC vs oncocytoma, chromophobe RCC vs minimal fat AML, and chromophobe RCC vs oncocytoma) were differentiated by histogram distribution parameters exclusively (all p < 0.05). Mean ADC, median ADC, 75th and 90th percentile ADC, SD ADC, and entropy of malignant tumors were significantly higher than those of benign tumors (all p < 0.05). Combination of mean ADC with histogram parameters yielded the highest AUC (0.851; sensitivity, 80.0%; specificity, 86.1%). Quantitative volumetric ADC histogram analysis may help differentiate various subtypes of small solid renal tumors, including benign and malignant lesions.
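The histogram parameters named in this and the neighboring abstracts (mean, median, percentiles, SD, skewness, kurtosis, entropy) can all be computed from pooled voxel values along the following lines. This is a generic sketch using synthetic data, not the study's software; the 128-bin entropy histogram is an illustrative choice:

```python
import numpy as np
from scipy import stats

def adc_histogram_params(adc_values, n_bins=128):
    """Whole-volume histogram parameters of the kind used in ADC studies.

    adc_values: 1-D array of ADC values pooled from all tumor voxels.
    Entropy is computed from a normalized n_bins-bin histogram.
    """
    v = np.asarray(adc_values, dtype=float)
    p = np.percentile(v, [10, 25, 50, 75, 90])
    hist, _ = np.histogram(v, bins=n_bins)
    probs = hist / hist.sum()
    probs = probs[probs > 0]                       # drop empty bins for log
    return {
        "mean": v.mean(), "median": p[2], "sd": v.std(ddof=1),
        "p10": p[0], "p25": p[1], "p75": p[3], "p90": p[4],
        "skewness": stats.skew(v), "kurtosis": stats.kurtosis(v),
        "entropy": float(-(probs * np.log2(probs)).sum()),
    }

# Synthetic "ADC map" voxels, roughly centered at 1.5 (arbitrary units)
rng = np.random.default_rng(1)
params = adc_histogram_params(rng.normal(1.5, 0.3, 5000))
```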
Choi, Moon Hyung; Oh, Soon Nam; Rha, Sung Eun; Choi, Joon-Il; Lee, Sung Hak; Jang, Hong Seok; Kim, Jun-Gi; Grimm, Robert; Son, Yohan
2016-07-01
To investigate the usefulness of apparent diffusion coefficient (ADC) values derived from histogram analysis of the whole rectal cancer as a quantitative parameter to evaluate pathologic complete response (pCR) on preoperative magnetic resonance imaging (MRI). We enrolled a total of 86 consecutive patients who had undergone surgery for rectal cancer after neoadjuvant chemoradiotherapy (CRT) at our institution between July 2012 and November 2014. Two radiologists who were blinded to the final pathological results reviewed post-CRT MRI to evaluate tumor stage. Quantitative image analysis was performed using T2-weighted and diffusion-weighted images independently by two radiologists using dedicated software that performed histogram analysis to assess the distribution of ADC in the whole tumor. After surgery, 16 patients were confirmed to have achieved pCR (18.6%). All parameters from pre- and post-CRT ADC histogram showed good or excellent agreement between two readers. The minimum, 10th, 25th, 50th, and 75th percentile and mean ADC from post-CRT ADC histogram were significantly higher in the pCR group than in the non-pCR group for both readers. The 25th percentile value from ADC histogram in post-CRT MRI had the best diagnostic performance for detecting pCR, with an area under the receiver operating characteristic curve of 0.796. Low percentile values derived from the ADC histogram analysis of rectal cancer on MRI after CRT showed a significant difference between pCR and non-pCR groups, demonstrating the utility of the ADC value as a quantitative and objective marker to evaluate complete pathologic response to preoperative CRT in rectal cancer. J. Magn. Reson. Imaging 2016;44:212-220. © 2015 Wiley Periodicals, Inc.
A wavelet-based statistical analysis of FMRI data: I. motivation and data distribution modeling.
Dinov, Ivo D; Boscardin, John W; Mega, Michael S; Sowell, Elizabeth L; Toga, Arthur W
2005-01-01
We propose a new method for statistical analysis of functional magnetic resonance imaging (fMRI) data. The discrete wavelet transformation is employed as a tool for efficient and robust signal representation. We use structural magnetic resonance imaging (MRI) and fMRI to empirically estimate the distribution of the wavelet coefficients of the data both across individuals and spatial locations. An anatomical subvolume probabilistic atlas is used to tessellate the structural and functional signals into smaller regions each of which is processed separately. A frequency-adaptive wavelet shrinkage scheme is employed to obtain essentially optimal estimations of the signals in the wavelet space. The empirical distributions of the signals on all the regions are computed in a compressed wavelet space. These are modeled by heavy-tail distributions because their histograms exhibit slower tail decay than the Gaussian. We discovered that the Cauchy, Bessel K Forms, and Pareto distributions provide the most accurate asymptotic models for the distribution of the wavelet coefficients of the data. Finally, we propose a new model for statistical analysis of functional MRI data using this atlas-based wavelet space representation. In the second part of our investigation, we will apply this technique to analyze a large fMRI dataset involving repeated presentation of sensory-motor response stimuli in young, elderly, and demented subjects.
The ISI distribution of the stochastic Hodgkin-Huxley neuron.
Rowat, Peter F; Greenwood, Priscilla E
2014-01-01
The simulation of ion-channel noise has an important role in computational neuroscience. In recent years several approximate methods of carrying out this simulation have been published, based on stochastic differential equations, and all giving slightly different results. The obvious, and essential, question is: which method is the most accurate and which is most computationally efficient? Here we make a contribution to the answer. We compare interspike interval histograms from simulated data using four different approximate stochastic differential equation (SDE) models of the stochastic Hodgkin-Huxley neuron, as well as the exact Markov chain model simulated by the Gillespie algorithm. One of the recent SDE models is the same as the Kurtz approximation first published in 1978. All the models considered give similar ISI histograms over a wide range of deterministic and stochastic input. Three features of these histograms are an initial peak, followed by one or more bumps, and then an exponential tail. We explore how these features depend on deterministic input and on level of channel noise, and explain the results using the stochastic dynamics of the model. We conclude with a rough ranking of the four SDE models with respect to the similarity of their ISI histograms to the histogram of the exact Markov chain model.
A public study of the lifetime distribution of soap films
NASA Astrophysics Data System (ADS)
Tobin, S. T.; Meagher, A. J.; Bulfin, B.; Möbius, M.; Hutzler, S.
2011-08-01
We present data for the lifetime distribution of soap films made from commercial dish-washing solution and contained in sealed cylinders. Data for over 2500 films were gathered during a 2-month exhibition on the science and art of bubbles and foams in Dublin's Science Gallery. Visitors to the gallery were invited to create 10-20 parallel soap films in acrylic tubes which were sealed with cork stoppers. Individual film bursts occurred at random and were uncorrelated. The total number of remaining films in the tubes was recorded every day. Visitors could monitor the status of their soap film tube and the daily updated histogram of the lifetime of all films. The histogram of the bubble lifetimes is well described by a Weibull distribution, which indicates that the failure rate is not constant and increases over time. Unsealed cylinders show drastically reduced film lifetimes. This experiment illustrates the difference between the unpredictability of the lifetime of individual films and the existence of a well-defined lifetime distribution for the ensemble.
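A Weibull fit of the kind described can be sketched with SciPy; the lifetimes below are synthetic, not the exhibition data. A fitted shape parameter greater than 1 corresponds to the increasing failure rate reported for the sealed films:

```python
import numpy as np
from scipy import stats

# Fit a two-parameter Weibull (location fixed at 0) to film lifetimes.
# Shape k > 1 implies an increasing failure rate (films "age"); k = 1
# would reduce to the exponential case with a constant failure rate.
rng = np.random.default_rng(2)
lifetimes = stats.weibull_min.rvs(c=2.0, scale=10.0, size=3000,
                                  random_state=rng)   # synthetic data

shape, loc, scale = stats.weibull_min.fit(lifetimes, floc=0)
increasing_failure_rate = shape > 1.0
```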
Castillo-Barnes, Diego; Peis, Ignacio; Martínez-Murcia, Francisco J.; Segovia, Fermín; Illán, Ignacio A.; Górriz, Juan M.; Ramírez, Javier; Salas-Gonzalez, Diego
2017-01-01
A wide range of segmentation approaches assumes that intensity histograms extracted from magnetic resonance images (MRI) have a distribution for each brain tissue that can be modeled by a Gaussian distribution or a mixture of Gaussians. Nevertheless, the intensity histograms of white matter and gray matter are not symmetric and exhibit heavy tails. In this work, we present a hidden Markov random field model with expectation maximization (EM-HMRF) in which the components are modeled using the α-stable distribution. The proposed model is a generalization of the widely used EM-HMRF algorithm with Gaussian distributions. We test the α-stable EM-HMRF model on synthetic data and brain MRI data. The proposed methodology presents two main advantages: firstly, it is more robust to outliers; secondly, we obtain results similar to those of the Gaussian model when the Gaussian assumption holds. This approach is able to model the spatial dependence between neighboring voxels in tomographic brain MRI. PMID:29209194
Distribution of a suite of elements including arsenic and mercury in Alabama coal
Goldhaber, Martin B.; Bigelow, R.C.; Hatch, J.R.; Pashin, J.C.
2000-01-01
Arsenic and other elements are unusually abundant in Alabama coal. This conclusion is based on chemical analyses of coal in the U.S. Geological Survey's National Coal Resources Data System (NCRDS; Bragg and others, 1994). According to NCRDS data, the average concentration of arsenic in Alabama coal (72 ppm) is three times the average for all U.S. coal (24 ppm). Of the U.S. coal analyses for arsenic that are at least 3 standard deviations above the mean, approximately 90% are from the coal fields of Alabama. Figure 1 contrasts the abundance of arsenic in coal of the Warrior field of Alabama (histogram C) with that of coal of the Powder River Basin, Wyoming (histogram A), and the Eastern Interior Province, including the Illinois Basin and nearby areas (histogram B). The Warrior field is by far the largest in Alabama. On the histogram, the large 'tail' of very high values (> 200 ppm) in the Warrior coal contrasts with the other two regions, which have very few analyses greater than 200 ppm.
Wildey, R.L.
1988-01-01
A method is derived for determining the dependence of radar backscatter on incidence angle that is applicable to the region corresponding to a particular radar image. The method is based on enforcing mathematical consistency between the frequency distribution of the image's pixel signals (histogram of DN values with suitable normalizations) and a one-dimensional frequency distribution of slope component, as might be obtained from a radar or laser altimetry profile in or near the area imaged. In order to achieve a unique solution, the auxiliary assumption is made that the two-dimensional frequency distribution of slope is isotropic. The backscatter is not derived in absolute units. The method is developed in such a way as to separate the reflectance function from the pixel-signal transfer characteristic. However, these two sources of variation are distinguishable only on the basis of a weak dependence on the azimuthal component of slope; therefore such an approach can be expected to be ill-conditioned unless the revision of the transfer characteristic is limited to the determination of an additive instrumental background level. The altimetry profile does not have to be registered in the image, and the statistical nature of the approach minimizes pixel noise effects and the effects of a disparity between the resolutions of the image and the altimetry profile, except in the wings of the distribution where low-number statistics preclude accuracy anyway. The problem of dealing with unknown slope components perpendicular to the profiling traverse, which besets the one-to-one comparison between individual slope components and pixel-signal values, disappears in the present approach. In order to test the resulting algorithm, an artificial radar image was generated from the digitized topographic map of the Lake Champlain West quadrangle in the Adirondack Mountains, U.S.A., using an arbitrarily selected reflectance function. 
From the same map, a one-dimensional frequency distribution of slope component was extracted. The algorithm recaptured the original reflectance function to the degree that, for the central 90% of the data, the discrepancy translates to a RMS slope error of 0.1°. For the central 99% of the data, the maximum error translates to 1°; at the absolute extremes of the data the error grows to 6°. © 1988 Kluwer Academic Publishers.
[Clinical evaluation of heavy-particle radiotherapy using dose volume histogram (DVH)].
Terahara, A; Nakano, T; Tsujii, H
1998-01-01
Radiotherapy with heavy particles such as protons and heavy charged particles is a promising modality for the treatment of localized malignant tumors because of its good dose distribution. A dose calculation and radiotherapy planning system, essential for this kind of treatment, has been developed in recent years. It can compute the dose volume histogram (DVH), which contains dose-volume information for the target volume and other volumes of interest. Recently, the DVH has been commonly used to evaluate and compare dose distributions in radiotherapy with both photons and heavy particles, and it shows that a superior dose distribution is obtained in heavy-particle radiotherapy. The DVH is also utilized for the evaluation of dose distribution in relation to clinical outcomes. Models such as normal tissue complication probability (NTCP) and tumor control probability (TCP), which can be calculated from the DVH, have been proposed by several authors and are applied both to evaluate dose distributions themselves and to relate them to clinical results. The DVH is now a useful and important tool, but further studies are needed before the DVH and these models can be used routinely for the clinical evaluation of heavy-particle radiotherapy.
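The cumulative DVH itself is straightforward to compute from a dose grid. A minimal sketch, assuming equal voxel volumes (the dose values below are hypothetical):

```python
import numpy as np

def cumulative_dvh(dose_voxels, dose_bins):
    """Cumulative dose-volume histogram.

    Returns, for each dose level d in dose_bins, the fraction of the
    structure's volume receiving at least d (equal voxel volumes assumed).
    """
    dose = np.asarray(dose_voxels, dtype=float).ravel()
    return np.array([(dose >= d).mean() for d in dose_bins])

# Hypothetical dose samples for a target volume (Gy)
dose = np.array([60.1, 59.8, 61.0, 58.5, 60.5, 59.9])
levels = np.array([0.0, 59.0, 60.0, 62.0])
dvh = cumulative_dvh(dose, levels)   # fraction of volume >= each level
```

NTCP/TCP models mentioned above take such a curve (or the differential histogram derived from it) as their input.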
Characteristics of random inlet pressure fluctuations during flights of F-111A airplane
NASA Technical Reports Server (NTRS)
Costakis, W. G.
1977-01-01
Compressor face dynamic total pressures from four F-111 flights were analyzed. Statistics of the nonstationary data were investigated by analyzing the data in a quasi-stationary manner. Changes in the character of the dynamic signal are investigated as functions of flight conditions, time in flight, and location at the compressor face. The results, which are presented in the form of rms values, histograms, and power spectrum plots, show that the shape of the power spectra remains relatively flat while the histograms have an approximate normal distribution.
Digital image improvement by adding noise: an example by a professional photographer
NASA Astrophysics Data System (ADS)
Kurihara, Takehito; Manabe, Yoshitsugu; Aoki, Naokazu; Kobayashi, Hiroyuki
2008-01-01
To overcome shortcomings of digital image, or to reproduce grain of traditional silver halide photographs, some photographers add noise (grain) to digital image. In an effort to find a factor of preferable noise, we analyzed how a professional photographer introduces noise into B&W digital images and found two noticeable characteristics: 1) there is more noise in mid-tones, gradually decreasing in highlights and shadows toward the ends of tonal range, and 2) histograms in highlights are skewed toward shadows and vice versa, while almost symmetrical in mid-tones. Next, we examined whether the professional's noise could be reproduced. The symmetrical histograms were approximated by Gaussian distribution and skewed ones by chi-square distribution. The images on which the noise was reproduced were judged by the professional himself to be satisfactory enough. As the professional said he added the noise so that "it looked like the grain of B&W gelatin silver photographs," we compared the two kinds of noise and found they have in common: 1) more noise in mid-tones but almost none in brightest highlights and deepest shadows, and 2) asymmetrical histograms in highlights and shadows. We think these common characteristics might be one condition for "good" noise.
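The mid-tone-weighted noise profile described above can be imitated with a simple parabolic noise-strength curve. This is an illustrative sketch, not the professional's actual procedure; it keeps the noise Gaussian throughout and omits the skewed, chi-square-like behavior near the tonal extremes:

```python
import numpy as np

def add_tone_dependent_noise(img, sigma_max=12.0, seed=0):
    """Add grain whose strength peaks in mid-tones and vanishes at the
    ends of the tonal range.

    img: float array with values in [0, 255]. The noise std follows a
    parabolic profile sigma(t) = sigma_max * 4*t*(1-t) with t = img/255.
    """
    rng = np.random.default_rng(seed)
    t = img / 255.0
    sigma = sigma_max * 4.0 * t * (1.0 - t)   # 0 at ends, max at mid-tone
    noisy = img + rng.normal(0.0, 1.0, img.shape) * sigma
    return np.clip(noisy, 0.0, 255.0)

mid = add_tone_dependent_noise(np.full((200, 200), 128.0))   # mid-tone patch
dark = add_tone_dependent_noise(np.full((200, 200), 10.0))   # deep shadow
```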
Histogram contrast analysis and the visual segregation of IID textures.
Chubb, C; Econopouly, J; Landy, M S
1994-09-01
A new psychophysical methodology is introduced, histogram contrast analysis, that allows one to measure stimulus transformations, f, used by the visual system to draw distinctions between different image regions. The method involves the discrimination of images constructed by selecting texture micropatterns randomly and independently (across locations) on the basis of a given micropattern histogram. Different components of f are measured by use of different component functions to modulate the micropattern histogram until the resulting textures are discriminable. When no discrimination threshold can be obtained for a given modulating component function, a second titration technique may be used to measure the contribution of that component to f. The method includes several strong tests of its own assumptions. An example is given of the method applied to visual textures composed of small, uniform squares with randomly chosen gray levels. In particular, for a fixed mean gray level μ and a fixed gray-level variance σ², histogram contrast analysis is used to establish that the class S of all textures composed of small squares with jointly independent, identically distributed gray levels with mean μ and variance σ² is perceptually elementary in the following sense: there exists a single, real-valued function f_S of gray level such that two textures I and J in S are discriminable only if the average value of f_S applied to the gray levels in I is significantly different from the average value of f_S applied to the gray levels in J. Finally, histogram contrast analysis is used to obtain a seventh-order polynomial approximation of f_S.
External Representations for Data Distributions: In Search of Cognitive Fit
ERIC Educational Resources Information Center
Lem, Stephanie; Onghena, Patrick; Verschaffel, Lieven; Van Dooren, Wim
2013-01-01
Data distributions can be represented using different external representations, such as histograms and boxplots. Although the role of external representations has been extensively studied in mathematics, this is less the case in statistics. This study helps to fill this gap by systematically varying the representation that accompanies a task…
Efficient Levenberg-Marquardt minimization of the maximum likelihood estimator for Poisson deviates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Laurence, T; Chromy, B
2009-11-10
Histograms of counted events are Poisson distributed, but are typically fitted without justification using nonlinear least squares fitting. The more appropriate maximum likelihood estimator (MLE) for Poisson distributed data is seldom used. We extend the Levenberg-Marquardt algorithm, commonly used for nonlinear least squares minimization, for use with the MLE for Poisson distributed data. In so doing, we remove any excuse for not using this more appropriate MLE. We demonstrate the use of the algorithm and the superior performance of the MLE using simulations and experiments in the context of fluorescence lifetime imaging. Scientists commonly form histograms of counted events from their data, and extract parameters by fitting to a specified model. Assuming that the probability of occurrence for each bin is small, event counts in the histogram bins will be distributed according to the Poisson distribution. We develop here an efficient algorithm for fitting event counting histograms using the maximum likelihood estimator (MLE) for Poisson distributed data, rather than the nonlinear least squares measure. This algorithm is a simple extension of the common Levenberg-Marquardt (L-M) algorithm, and is simple to implement, quick, and robust. Fitting using a least squares measure is most common, but it is the maximum likelihood estimator only for Gaussian-distributed data. Nonlinear least squares methods may be applied to event counting histograms in cases where the number of events is very large, so that the Poisson distribution is well approximated by a Gaussian. However, this criterion is not easy to satisfy in practice, since it requires a large number of events. It has been well known for years that least squares procedures lead to biased results when applied to Poisson-distributed data; a recent paper provides extensive characterization of these biases in exponential fitting.
The more appropriate measure based on the maximum likelihood estimator (MLE) for the Poisson distribution is also well known, but has not become generally used. This is primarily because, in contrast to nonlinear least squares fitting, there has been no quick, robust, and general fitting method. In the field of fluorescence lifetime spectroscopy and imaging, there have been some efforts to use this estimator through minimization routines such as Nelder-Mead optimization, exhaustive line searches, and Gauss-Newton minimization. Minimization based on specific one- or multi-exponential models has been used to obtain quick results, but this procedure does not allow the incorporation of the instrument response and is not generally applicable to models found in other fields. Methods for using the MLE for Poisson-distributed data have been published by the wider spectroscopic community, including iterative minimization schemes based on Gauss-Newton minimization. The slow acceptance of these procedures for fitting event counting histograms may also be explained by the ubiquity of the fast Levenberg-Marquardt (L-M) fitting procedure for fitting nonlinear models using least squares (simple searches return roughly 10,000 references, not counting those who use it without knowing it). The benefits of L-M include a seamless transition between Gauss-Newton minimization and downward gradient minimization through the use of a regularization parameter. This transition is desirable because Gauss-Newton methods converge quickly, but only within a limited domain of convergence, whereas downward gradient methods have a much wider domain of convergence but converge extremely slowly near the minimum. L-M has the advantages of both procedures: relative insensitivity to initial parameters and rapid convergence. Scientists, when wanting an answer quickly, will fit data using L-M, get an answer, and move on.
Only those who are aware of the bias issues will bother to fit using the more appropriate MLE for Poisson deviates. However, since there is a simple, analytical formula for the appropriate MLE measure for Poisson deviates, it is inexcusable that least squares estimators are used almost exclusively when fitting event counting histograms. Ways have been found to use successive nonlinear least squares fits to obtain similarly unbiased results, but this procedure is justified only by simulation, must be re-tested when conditions change significantly, and requires two successive fits. There is a great need for a fitting routine for the Poisson MLE with convergence domains and rates comparable to nonlinear least squares L-M fitting. We show in this report that a simple way to achieve that goal is to use the L-M fitting procedure to minimize not the least squares measure, but the MLE for Poisson deviates.
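The core trick, reusing a Levenberg-Marquardt least-squares driver to minimize the Poisson MLE objective, can be sketched by defining residuals whose squares sum to the Poisson deviance (twice the negative log-likelihood up to a constant). This is an illustrative reimplementation of the idea using SciPy's L-M driver, not the authors' code; the exponential-decay model and simulated counts are hypothetical:

```python
import numpy as np
from scipy.optimize import least_squares

def poisson_deviance_residuals(params, t, counts, model):
    """Residuals whose sum of squares equals the Poisson deviance, so an
    L-M least-squares driver minimizes the Poisson MLE objective."""
    m = np.maximum(model(params, t), 1e-12)   # guard against m <= 0
    n = counts
    with np.errstate(divide="ignore", invalid="ignore"):
        term = np.where(n > 0, n * np.log(n / m), 0.0)
    dev = 2.0 * (m - n + term)                # per-bin Poisson deviance
    return np.sqrt(np.maximum(dev, 0.0))

def exp_decay(params, t):
    amp, tau = params
    return amp * np.exp(-t / tau)

# Simulated photon-counting histogram (hypothetical lifetime experiment)
rng = np.random.default_rng(3)
t = np.linspace(0.0, 10.0, 100)
counts = rng.poisson(exp_decay((200.0, 2.0), t)).astype(float)

fit = least_squares(poisson_deviance_residuals, x0=(150.0, 1.5),
                    args=(t, counts, exp_decay), method="lm")
amp_hat, tau_hat = fit.x
```

Minimizing the deviance this way is statistically equivalent to maximizing the Poisson likelihood, while retaining L-M's rapid convergence and insensitivity to starting values.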
Wind power as an electrical energy source in Illinois
NASA Astrophysics Data System (ADS)
Wendland, W. M.
1982-03-01
A preliminary estimate of the total wind power available in Illinois was made using available historical data, and projections of cost savings due to wind-generated electricity were attempted. Wind data at 10 m height were considered from nine different sites in the state, with nominally three years of data per site. Wind-speed frequency histograms were developed for day and night periods, using a power-law function to extrapolate the 10 m readings to 20 m. Wind speeds over the whole state were found to exceed 8 mph, the cut-in speed for most wind turbines, 40-63% of the time. A maximum of 75% run-time was determined for daylight hours in April-May. A reference 1.8 kW wind-powered generator was used in annual demand projections for a reference single-family home, using the frequency histograms. The small generator was projected to meet 25-53% of the annual load and, under various cost assumptions, exhibited payback periods of 14-27 yr.
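The power-law height extrapolation mentioned above has a standard form; here is a minimal sketch. The 1/7 exponent is a common open-terrain assumption, not necessarily the exponent the study used:

```python
# Power-law wind-profile extrapolation: v(h) = v_ref * (h / h_ref) ** alpha
def extrapolate_wind_speed(v_ref, h_ref=10.0, h=20.0, alpha=1.0 / 7.0):
    """Extrapolate a wind speed measured at height h_ref to height h."""
    return v_ref * (h / h_ref) ** alpha

v20 = extrapolate_wind_speed(8.0)   # 8 mph at 10 m -> speed at 20 m
```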
Are There Different Populations of Flux Ropes in the Solar Wind?
NASA Astrophysics Data System (ADS)
Janvier, M.; Démoulin, P.; Dasso, S.
2014-07-01
Flux ropes are twisted magnetic structures that can be detected by in-situ measurements in the solar wind. However, different properties of detected flux ropes suggest different types of flux-rope populations. As such, are there different populations of flux ropes? The answer is positive and is the result of the analysis of four lists of flux ropes, including magnetic clouds (MCs), observed at 1 AU. The in-situ data for the four lists were fitted with the same cylindrical force-free field model, which provides an estimate of the local flux-rope parameters such as its radius and orientation. Since the flux-rope distributions have a broad dynamic range, we went beyond a simple histogram analysis by developing a partition technique that uniformly distributes the statistical fluctuations across the radius range. By doing so, we found that small flux ropes with radius R<0.1 AU have a steep power-law distribution in contrast to the larger flux ropes (identified as MCs), which have a Gaussian-like distribution. Next, from four CME catalogs, we estimated the expected flux-rope frequency per year at 1 AU. We found that the predicted numbers are similar to the frequencies of MCs observed in-situ. However, we also found that small flux ropes are at least ten times too abundant to correspond to CMEs, even to narrow ones. Investigating the different possible scenarios for the origin of these small flux ropes, we conclude that these twisted structures can be formed by blowout jets in the low corona or in coronal streamers.
Multi-stream LSTM-HMM decoding and histogram equalization for noise robust keyword spotting.
Wöllmer, Martin; Marchi, Erik; Squartini, Stefano; Schuller, Björn
2011-09-01
Highly spontaneous, conversational, and potentially emotional and noisy speech is known to be a challenge for today's automatic speech recognition (ASR) systems, which highlights the need for advanced algorithms that improve speech features and models. Histogram equalization is an efficient method to reduce the mismatch between clean and noisy conditions by normalizing all moments of the probability distribution of the feature vector components. In this article, we propose to combine histogram equalization and multi-condition training for robust keyword detection in noisy speech. To better cope with conversational speaking styles, we show how contextual information can be effectively exploited in a multi-stream ASR framework that dynamically models context-sensitive phoneme estimates generated by a long short-term memory neural network. The proposed techniques are evaluated on the SEMAINE database, a corpus containing emotionally colored conversations with a "Sensitive Artificial Listener" cognitive system.
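The basic histogram-equalization operation maps each feature value through its empirical CDF, so that any input distribution becomes approximately uniform; full HEQ front-ends then typically map on to a reference distribution such as a Gaussian. The following is a generic sketch of the empirical-CDF step, not the paper's implementation:

```python
import numpy as np

def histogram_equalize(features):
    """Map feature values through their empirical CDF.

    The output is approximately uniform on (0, 1) regardless of the
    input distribution; composing with an inverse reference CDF (e.g.
    scipy.stats.norm.ppf) would map the features to that reference.
    """
    x = np.asarray(features, dtype=float)
    ranks = np.argsort(np.argsort(x))          # rank of each value
    return (ranks + 0.5) / x.size              # mid-rank empirical CDF

rng = np.random.default_rng(4)
noisy_feature = rng.gamma(shape=2.0, scale=1.0, size=10000)  # skewed input
equalized = histogram_equalize(noisy_feature)
```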
A Monte-Carlo Model for the Formation of Radiation-induced Chromosomal Aberrations
NASA Technical Reports Server (NTRS)
Ponomarev, Artem L.; Cornforth, Michael N.; Loucas, Brad D.; Cucinotta, Francis A.
2009-01-01
Purpose: To simulate radiation-induced chromosome aberrations in mammalian cells (e.g., rings, translocations, and dicentrics) and to calculate their frequency distributions following exposure to DNA double strand breaks (DSBs) produced by high-LET ions. Methods: The interphase genome was assumed to be comprised of a collection of 2 kbp rigid-block monomers following the random-walk geometry. Additional details for the modeling of chromosomal structure, such as chromosomal domains and chromosomal loops, were included. A radial energy profile for heavy ion tracks was used to simulate the high-LET pattern of induced DSBs. The induced DSB pattern depended on the ion charge and kinetic energy, but always corresponded to the DSB yield of 25 DSBs/cell/Gy. The sum of all energy contributions from Poisson-distributed particle tracks was taken to account for all possible one-track and multi-track effects. The relevant output of the model was DNA fragments produced by DSBs. The DSBs, or breakpoints, were defined by (x, y, z, l) positions, where x, y, z were the Euclidian coordinates of a DSB, and where l was the relative position along the genome. Results: The code was used to carry out Monte Carlo simulations for DSB rejoinings at low doses. The resulting fragments were analyzed to estimate the frequencies of specific types of chromosomal aberrations. Histograms for relative frequencies of chromosomal aberrations and P.D.F.s (probability density functions) of a given aberration type were produced. The relative frequency of dicentrics to rings was compared to empirical data to calibrate rejoining probabilities. Of particular interest was the predicted distribution of ring sizes, irrespective of their frequencies relative to other aberrations. Simulated ring sizes were ≤ 4 kbp, which are far too small to be observed experimentally (i.e., by microscopy) but which, nevertheless, are conjectured to exist.
Other aberrations, for example, inversions, translocations, as well as multi-centrics were also recorded. Conclusion: High-LET DNA damage affects the frequencies of chromosomal aberrations. The ratio of rings to dicentrics is correct for the genomic size cut-offs corresponding to available experimental data. The present work predicts a relative abundance of small rings following irradiation by heavy ions.
On the Misinterpretation of Histograms and Box Plots
ERIC Educational Resources Information Center
Lem, Stephanie; Onghena, Patrick; Verschaffel, Lieven; Van Dooren, Wim
2013-01-01
Recent studies have shown that the interpretation of graphs is not always easy for students. In order to reason properly about distributions of data, however, one needs to be able to interpret graphical representations of these distributions correctly. In this study, we used Tversky's principles for the design of graphs to explain how 125…
Single-photon technique for the detection of periodic extraterrestrial laser pulses.
Leeb, W R; Poppe, A; Hammel, E; Alves, J; Brunner, M; Meingast, S
2013-06-01
To draw humankind's attention to its existence, an extraterrestrial civilization could well direct periodic laser pulses toward Earth. We developed a technique capable of detecting a quasi-periodic light signal with an average of less than one photon per pulse within a measurement time of a few tens of milliseconds in the presence of the radiation emitted by an exoplanet's host star. Each of the electronic events produced by one or more single-photon avalanche detectors is tagged with precise time-of-arrival information and stored. From this we compute a histogram displaying the frequency of event-time differences in classes with bin widths on the order of a nanosecond. The existence of periodic laser pulses manifests itself in histogram peaks regularly spaced at multiples of the (a priori unknown) pulse repetition frequency. With laser sources simulating both the pulse source and the background radiation, we tested a detection system in the laboratory at a wavelength of 850 nm. We present histograms obtained from various recorded data sequences with the number of photons per pulse, the background photons per pulse period, and the recording time as main parameters. We then simulated a periodic signal hypothetically generated on a planet orbiting a G2V-type star (distance to Earth 500 light-years) and show that the technique is capable of detecting the signal even if the received pulses carry as little as one photon on average on top of the star's background light.
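The event-time-difference histogram at the heart of this technique is easy to sketch. The simulation below uses invented parameters (a 100 ns pulse period, one photon per five pulses on average, uniform starlight background): events are tagged with arrival times, all pairwise time differences are histogrammed in 1 ns bins, and the strongest bin lands at a multiple of the unknown period.

```python
import numpy as np

rng = np.random.default_rng(1)

PERIOD_NS = 100.0        # repetition period, unknown to the observer (assumption)
RECORD_NS = 200_000.0    # length of the recorded data sequence
JITTER_NS = 0.2          # detector timing jitter

# Signal photons: on average one photon per five pulses, with timing jitter.
pulses = np.arange(0.0, RECORD_NS, PERIOD_NS)
hit = pulses[rng.random(pulses.size) < 0.2]
signal = hit + rng.normal(0.0, JITTER_NS, hit.size)

# Background photons from the host star: uniform arrival times.
background = rng.uniform(0.0, RECORD_NS, size=400)

events = np.sort(np.concatenate([signal, background]))

# Histogram of all pairwise event-time differences with ~1 ns bins.
i, j = np.triu_indices(events.size, k=1)
diffs = events[j] - events[i]
counts, edges = np.histogram(diffs, bins=np.arange(0.0, 1000.0, 1.0))

# Peaks regularly spaced at multiples of the period reveal the periodic signal.
top_bin = edges[np.argmax(counts)]
```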
Laser fluorescence fluctuation excesses in molecular immunology experiments
NASA Astrophysics Data System (ADS)
Galich, N. E.; Filatov, M. V.
2007-04-01
A novel approach to the statistical analysis of flow cytometry fluorescence data has been developed and applied to population analysis of blood neutrophils stained with hydroethidine during the respiratory burst reaction. The staining is based on the intracellular oxidation of hydroethidine to ethidium bromide, which intercalates into cell DNA. Fluorescence of the resultant product serves as a measure of the neutrophils' ability to generate superoxide radicals after induction of the respiratory burst reaction by phorbol myristate acetate (PMA). It was demonstrated that polymorphonuclear leukocytes of persons with inflammatory diseases showed a considerably changed response. The cytofluorometric histograms obtained carry unique information about the condition of the neutrophil population, which might allow determination of the type of pathological process connected with such inflammation. The novel approach to histogram analysis is based on the high-order moment dynamics of the distribution. The features of the fluctuation excesses of the distribution carry unique information about the disease under consideration.
Measuring kinetics of complex single ion channel data using mean-variance histograms.
Patlak, J B
1993-07-01
The measurement of single ion channel kinetics is difficult when those channels exhibit subconductance events. When the kinetics are fast, and when the current magnitudes are small, as is the case for Na+, Ca2+, and some K+ channels, these difficulties can lead to serious errors in the estimation of channel kinetics. I present here a method, based on the construction and analysis of mean-variance histograms, that can overcome these problems. A mean-variance histogram is constructed by calculating the mean current and the current variance within a brief "window" (a set of N consecutive data samples) superimposed on the digitized raw channel data. Systematic movement of this window over the data produces large numbers of mean-variance pairs which can be assembled into a two-dimensional histogram. Defined current levels (open, closed, or sublevel) appear in such plots as low variance regions. The total number of events in such low variance regions is estimated by curve fitting and plotted as a function of window width. This function decreases with the same time constants as the original dwell time probability distribution for each of the regions. The method can therefore be used: 1) to present a qualitative summary of the single channel data from which the signal-to-noise ratio, open channel noise, steadiness of the baseline, and number of conductance levels can be quickly determined; 2) to quantify the dwell time distribution in each of the levels exhibited. In this paper I present the analysis of a Na+ channel recording that had a number of complexities. The signal-to-noise ratio was only about 8 for the main open state; open channel noise and fast flickers to other states were present, as were a substantial number of subconductance states.
"Standard" half-amplitude threshold analysis of these data produced open and closed time histograms that were well fitted by the sum of two exponentials, but with apparently erroneous time constants, whereas the mean-variance histogram technique provided a more credible analysis of the open, closed, and subconductance times for the patch. I also show that the method produces accurate results on simulated data in a wide variety of conditions, whereas the half-amplitude method, when applied to complex simulated data, shows the same errors as were apparent in the real data. The utility and the limitations of this new method are discussed.
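The mean-variance histogram construction can be sketched numerically (illustrative parameters, not the original analysis): a two-level channel trace is simulated, a window of N consecutive samples is slid over the data to produce mean-variance pairs, and the low-variance region of the resulting 2-D histogram picks out the defined current levels.

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated two-level channel trace: closed (0 pA) / open (2 pA) dwells with
# exponential durations and Gaussian recording noise (hypothetical values).
levels, samples = (0.0, 2.0), []
for k in range(400):
    dwell = max(3, int(rng.exponential(50.0)))
    samples.extend([levels[k % 2]] * dwell)
trace = np.asarray(samples) + rng.normal(0.0, 0.25, size=len(samples))

# Mean-variance pairs from a window of N consecutive samples moved over the data.
N = 10
windows = np.lib.stride_tricks.sliding_window_view(trace, N)
means, variances = windows.mean(axis=1), windows.var(axis=1)

# Two-dimensional mean-variance histogram.
H, mean_edges, var_edges = np.histogram2d(means, variances, bins=(60, 60))
centers = 0.5 * (mean_edges[:-1] + mean_edges[1:])

# Counts in the lowest-variance columns cluster at the defined current levels.
low_var_counts = H[:, :5].sum(axis=1)
```

Windows that straddle a transition have a high variance and fall outside the low-variance region, which is why the level estimates are not biased by edge samples.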
Time Analyzer for Time Synchronization and Monitor of the Deep Space Network
NASA Technical Reports Server (NTRS)
Cole, Steven; Gonzalez, Jorge, Jr.; Calhoun, Malcolm; Tjoelker, Robert
2003-01-01
A software package has been developed to measure, monitor, and archive the performance of timing signals distributed in the NASA Deep Space Network. Timing signals are generated from a central master clock and distributed to over 100 users at distances up to 30 kilometers. The time offset due to internal distribution delays and time jitter with respect to the central master clock are critical for successful spacecraft navigation, radio science, and very long baseline interferometry (VLBI) applications. The instrument controller and operator interface software is written in LabView and runs on the Linux operating system. The software controls a commercial multiplexer to switch 120 separate timing signals to measure offset and jitter with a time-interval counter referenced to the master clock. The offset of each channel is displayed in histogram form, and "out of specification" alarms are sent to a central complex monitor and control system. At any time, the measurement cycle of 120 signals can be interrupted for diagnostic tests on an individual channel. The instrument also routinely monitors and archives the long-term stability of all frequency standards or any other 1-pps source compared against the master clock. All data is stored and made available for
Lin, Yuning; Li, Hui; Chen, Ziqian; Ni, Ping; Zhong, Qun; Huang, Huijuan; Sandrasegaran, Kumar
2015-05-01
The purpose of this study was to investigate the application of histogram analysis of apparent diffusion coefficient (ADC) in characterizing pathologic features of cervical cancer and benign cervical lesions. This prospective study was approved by the institutional review board, and written informed consent was obtained. Seventy-three patients with cervical cancer (33-69 years old; 35 patients with International Federation of Gynecology and Obstetrics stage IB cervical cancer) and 38 patients (38-61 years old) with normal cervix or cervical benign lesions (control group) were enrolled. All patients underwent 3-T diffusion-weighted imaging (DWI) with b values of 0 and 800 s/mm². ADC values of the entire tumor in the patient group and the whole cervix volume in the control group were assessed. Mean ADC, median ADC, 25th and 75th percentiles of ADC, skewness, and kurtosis were calculated. Histogram parameters were compared between different pathologic features, as well as between stage IB cervical cancer and control groups. Mean ADC, median ADC, and 25th percentile of ADC were significantly higher for adenocarcinoma (p = 0.021, 0.006, and 0.004, respectively), and skewness was significantly higher for squamous cell carcinoma (p = 0.011). Median ADC was statistically significantly higher for well or moderately differentiated tumors (p = 0.044), and skewness was statistically significantly higher for poorly differentiated tumors (p = 0.004). No statistically significant difference in ADC histogram parameters was observed between lymphovascular space invasion subgroups. All histogram parameters differed significantly between stage IB cervical cancer and control groups (p < 0.05). Distribution of ADCs characterized by histogram analysis may help to distinguish early-stage cervical cancer from normal cervix or cervical benign lesions and may be useful for evaluating the different pathologic features of cervical cancer.
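The first-order histogram metrics used in studies like this one (mean, median, percentiles, skewness, kurtosis) are straightforward to compute from the voxel values of a segmented volume. A minimal sketch with synthetic, right-skewed stand-in values (the lognormal sample is invented, not patient data):

```python
import numpy as np

def histogram_metrics(voxels):
    """First-order histogram metrics of a region of interest's voxel values."""
    v = np.asarray(voxels, dtype=float)
    mean, std = v.mean(), v.std()
    z = (v - mean) / std
    return {
        "mean": mean,
        "median": np.median(v),
        "p25": np.percentile(v, 25),
        "p75": np.percentile(v, 75),
        "skewness": np.mean(z ** 3),
        "kurtosis": np.mean(z ** 4) - 3.0,   # excess kurtosis
    }

rng = np.random.default_rng(0)
tumour = rng.lognormal(mean=0.0, sigma=0.35, size=5000)  # hypothetical ADC-like values
m = histogram_metrics(tumour)
```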
Wang, Feng; Wang, Yuxiang; Zhou, Yan; Liu, Congrong; Xie, Lizhi; Zhou, Zhenyu; Liang, Dong; Shen, Yang; Yao, Zhihang; Liu, Jianyu
2017-12-01
To evaluate the utility of histogram analysis of monoexponential, biexponential, and stretched-exponential models to a dualistic model of epithelial ovarian cancer (EOC). Fifty-two patients with histopathologically proven EOC underwent preoperative magnetic resonance imaging (MRI) (including diffusion-weighted imaging [DWI] with 11 b-values) using a 3.0T system and were divided into two groups: types I and II. Apparent diffusion coefficient (ADC), true diffusion coefficient (D), pseudodiffusion coefficient (D*), perfusion fraction (f), distributed diffusion coefficient (DDC), and intravoxel water diffusion heterogeneity (α) histograms were obtained based on solid components of the entire tumor. The following metrics of each histogram were compared between two types: 1) mean; 2) median; 3) 10th percentile and 90th percentile. Conventional MRI morphological features were also recorded. Significant morphological features for predicting EOC type were maximum diameter (P = 0.007), texture of lesion (P = 0.001), and peritoneal implants (P = 0.001). For ADC, D, f, DDC, and α, all metrics were significantly lower in type II than type I (P < 0.05). Mean, median, 10th, and 90th percentile of D* were not significantly different (P = 0.336, 0.154, 0.779, and 0.203, respectively). Most histogram metrics of ADC, D, and DDC had significantly higher area under the receiver operating characteristic curve values than those of f and α (P < 0.05). CONCLUSION: It is feasible to grade EOC by morphological features and three models with histogram analysis. ADC, D, and DDC have better performance than f and α; f and α may provide additional information. Level of Evidence: 4. Technical Efficacy: Stage 1. J. Magn. Reson. Imaging 2017;46:1797-1809. © 2017 International Society for Magnetic Resonance in Medicine.
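The stretched-exponential model referenced above, S(b) = S0·exp(-(b·DDC)^α), can be fitted by a simple log-log linearization when the data are clean: ln(-ln(S/S0)) = α·ln(b) + α·ln(DDC), a straight line in ln(b). A sketch with invented parameter values and noiseless synthetic signal (real voxel data would need a nonlinear least-squares fit):

```python
import numpy as np

# Synthetic diffusion signal from the stretched-exponential model (invented values).
b = np.array([50, 100, 200, 400, 600, 800, 1000], dtype=float)  # b-values, s/mm²
S0, DDC, alpha = 1000.0, 1.2e-3, 0.8                            # DDC in mm²/s
S = S0 * np.exp(-(b * DDC) ** alpha)

# Linearization: ln(-ln(S/S0)) = alpha*ln(b) + alpha*ln(DDC) -> straight-line fit.
y = np.log(-np.log(S / S0))
slope, intercept = np.polyfit(np.log(b), y, 1)
alpha_hat = slope                        # recovered heterogeneity index
DDC_hat = np.exp(intercept / slope)      # recovered distributed diffusion coefficient
```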
Zadpoor, Amir A
2015-03-01
Mechanical characterization of biological tissues and biomaterials at the nano-scale is often performed using nanoindentation experiments. The different constituents of the characterized materials will then appear in the histogram that shows the probability of measuring a certain range of mechanical properties. An objective technique is needed to separate the probability distributions that are mixed together in such a histogram. In this paper, finite mixture models (FMMs) are proposed as a tool capable of performing such types of analysis. Finite Gaussian mixture models assume that the measured probability distribution is a weighted combination of a finite number of Gaussian distributions with separate mean and standard deviation values. Dedicated optimization algorithms are available for fitting such a weighted mixture model to experimental data. Moreover, certain objective criteria are available to determine the optimum number of Gaussian distributions. In this paper, FMMs are used for interpreting the probability distribution functions representing the distributions of the elastic moduli of osteoarthritic human cartilage and co-polymeric microspheres. As for cartilage experiments, FMMs indicate that at least three mixture components are needed for describing the measured histogram. While the mechanical properties of the softer mixture components, often assumed to be associated with Glycosaminoglycans, were found to be more or less constant regardless of whether two or three mixture components were used, those of the second mixture component (i.e. collagen network) considerably changed depending on the number of mixture components. Regarding the co-polymeric microspheres, the optimum number of mixture components estimated by the FMM theory, i.e. 3, nicely matches the number of co-polymeric components used in the structure of the polymer. The computer programs used for the presented analyses are made freely available online for other researchers to use. 
Copyright © 2014 Elsevier B.V. All rights reserved.
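A compact illustration of the finite-Gaussian-mixture idea (a from-scratch EM sketch, not the authors' released programs): fit 1-D mixtures with k = 1…4 components to synthetic "moduli" data and pick the number of components with BIC, the kind of objective criterion mentioned above. The two-component sample is invented.

```python
import numpy as np

def fit_gmm_1d(x, k, n_iter=200, seed=0):
    """EM for a 1-D Gaussian mixture; returns (weights, means, stds, log-likelihood)."""
    rng = np.random.default_rng(seed)
    w = np.full(k, 1.0 / k)
    mu = np.sort(rng.choice(x, size=k, replace=False))
    sd = np.full(k, x.std())
    for _ in range(n_iter):
        pdf = np.exp(-0.5 * ((x[:, None] - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
        joint = w * pdf                                   # E-step: responsibilities
        resp = joint / joint.sum(axis=1, keepdims=True)
        nk = np.maximum(resp.sum(axis=0), 1e-12)          # M-step: update parameters
        w, mu = nk / x.size, (resp * x[:, None]).sum(axis=0) / nk
        sd = np.maximum(np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk), 1e-3)
    pdf = np.exp(-0.5 * ((x[:, None] - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
    loglik = np.log((w * pdf).sum(axis=1)).sum()
    return w, mu, sd, loglik

def bic(loglik, k, n):
    return (3 * k - 1) * np.log(n) - 2.0 * loglik         # 3k - 1 free parameters in 1-D

rng = np.random.default_rng(1)
# Hypothetical bimodal nanoindentation moduli: a soft and a stiff constituent (GPa).
x = np.concatenate([rng.normal(0.5, 0.1, 600), rng.normal(2.0, 0.4, 400)])
best_k = min(range(1, 5), key=lambda k: bic(fit_gmm_1d(x, k)[3], k, x.size))
```

BIC penalizes each extra component by (number of new parameters) × ln(n), so for well-separated constituents it recovers the generating component count.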
Research of image retrieval technology based on color feature
NASA Astrophysics Data System (ADS)
Fu, Yanjun; Jiang, Guangyu; Chen, Fengying
2009-10-01
Recently, with the development of communication and computer technology and the improvement of storage technology and digital imaging equipment, more image resources are available than ever before, so a way to locate the desired image quickly and accurately is needed. The early method was to index images with keywords in a database, but this approach becomes very difficult as the number of images to search grows. In order to overcome the limitations of the traditional searching method, content-based image retrieval technology arose, and it is now an active research subject. Color image retrieval is an important part of it, and color is the most important feature for color image retrieval. Three key questions on how to make use of the color characteristic are discussed in the paper: the representation of color, the extraction of color features, and the measurement of similarity based on color. On this basis, the extraction of the color histogram feature is discussed in particular. Considering the advantages and disadvantages of the overall histogram and the partition histogram, a new method based on the partition-overall histogram is proposed. Its basic idea is to divide the image space according to a certain strategy and then calculate the color histogram of each block as the color feature of that block. Users choose the blocks that contain important spatial information and confirm their weights. The system calculates the distance between the corresponding blocks that the user chose; the remaining blocks are merged into a partial overall histogram, and its distance is calculated as well. All the distances are then accumulated as the real distance between two images.
The partition-overall histogram combines the advantages of the two methods above: choosing blocks makes the feature contain more spatial information, which improves performance, while the distances between partition-overall histograms are invariant to rotation and translation. The HSV color space, which suits the visual characteristics of humans, is used to represent the color characteristics of the image. Taking advantage of human color perception, the color sectors are quantized with unequal intervals to obtain the feature vector. Finally, image similarity is matched with the histogram-intersection algorithm applied to the partition-overall histogram. Users can choose a demonstration image to express the visual query, and can also adjust the weights through relevance feedback to obtain the best search result. An image retrieval system based on these approaches is presented. The experimental results show that image retrieval based on the partition-overall histogram can keep the spatial distribution information while extracting the color feature efficiently, and it is superior to normal color histograms in precision. The query precision rate is more than 95%. In addition, the efficient block representation lowers the complexity of the images to be searched, and thus increases search efficiency. The image retrieval algorithm based on the partition-overall histogram proposed in the paper is efficient and effective.
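The core of the partition-overall scheme, per-block histograms compared by histogram intersection and combined with weights, can be sketched as follows. Grayscale intensity histograms stand in for the paper's HSV-quantized color histograms; the block grid, bin count, and test images are illustrative.

```python
import numpy as np

def block_histograms(img, grid=(4, 4), bins=16):
    """Per-block normalized intensity histograms (stand-in for HSV color histograms)."""
    H, W = img.shape
    bh, bw = H // grid[0], W // grid[1]
    hists = []
    for i in range(grid[0]):
        for j in range(grid[1]):
            block = img[i * bh:(i + 1) * bh, j * bw:(j + 1) * bw]
            h, _ = np.histogram(block, bins=bins, range=(0, 256))
            hists.append(h / h.sum())
    return np.array(hists)

def intersection_similarity(h1, h2, weights=None):
    """Weighted histogram-intersection similarity across blocks (1 = identical)."""
    per_block = np.minimum(h1, h2).sum(axis=1)
    w = np.full(len(per_block), 1.0 / len(per_block)) if weights is None else weights
    return float((w * per_block).sum())

rng = np.random.default_rng(0)
img = rng.integers(0, 256, (64, 64))
dark = img // 4                     # a globally darkened version of the same image
sim_self = intersection_similarity(block_histograms(img), block_histograms(img))
sim_dark = intersection_similarity(block_histograms(img), block_histograms(dark))
```

Passing a `weights` vector mimics the user emphasizing blocks that carry important spatial information.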
Perceptual Contrast Enhancement with Dynamic Range Adjustment
Zhang, Hong; Li, Yuecheng; Chen, Hao; Yuan, Ding; Sun, Mingui
2013-01-01
In recent years, although great efforts have been made to improve their performance, few histogram equalization (HE) methods take human visual perception (HVP) into account explicitly. The human visual system (HVS) is more sensitive to edges than to brightness. This paper makes intuitive use of this property and develops a perceptual contrast enhancement approach with dynamic range adjustment through histogram modification. The use of perceptual contrast connects the image enhancement problem with the HVS. To pre-condition the input image before the HE procedure is implemented, a perceptual contrast map (PCM) is constructed based on the modified Difference of Gaussian (DOG) algorithm. As a result, the contrast of the image is sharpened and high frequency noise is suppressed. A modified Clipped Histogram Equalization (CHE) is also developed which improves visual quality by automatically detecting the dynamic range of the image with improved perceptual contrast. Experimental results show that the new HE algorithm outperforms several state-of-the-art algorithms in improving perceptual contrast and enhancing details. In addition, the new algorithm is simple to implement, making it suitable for real-time applications. PMID:24339452
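The clipped histogram equalization step can be sketched in a few lines (a generic CHE, not the authors' modified version; the clip limit is an illustrative parameter): the histogram is clipped at a ceiling, the clipped mass is redistributed uniformly, and the cumulative distribution gives the gray-level remapping.

```python
import numpy as np

def clipped_hist_eq(img, clip_frac=0.02):
    """Histogram equalization with a clipped histogram to limit over-enhancement."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    clip = clip_frac * img.size                      # ceiling per gray level
    excess = np.maximum(hist - clip, 0.0).sum()
    hist = np.minimum(hist, clip) + excess / 256.0   # redistribute clipped mass
    cdf = np.cumsum(hist)
    cdf = (cdf - cdf[0]) / (cdf[-1] - cdf[0])        # normalize to [0, 1]
    lut = np.round(255.0 * cdf).astype(np.uint8)     # gray-level lookup table
    return lut[img]

rng = np.random.default_rng(0)
dark = rng.integers(0, 64, (128, 128)).astype(np.uint8)  # low-contrast input
out = clipped_hist_eq(dark)
```

Because the lookup table is monotonic, edge ordering is preserved while the dynamic range is stretched.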
Regime-Dependent Differences in Surface Freshwater Exchange Estimates Over the Ocean
NASA Astrophysics Data System (ADS)
Wong, Sun; Behrangi, Ali
2018-01-01
Differences in gridded precipitation (
Blind identification of image manipulation type using mixed statistical moments
NASA Astrophysics Data System (ADS)
Jeong, Bo Gyu; Moon, Yong Ho; Eom, Il Kyu
2015-01-01
We present a blind identification of image manipulation types such as blurring, scaling, sharpening, and histogram equalization. Motivated by the fact that image manipulations can change the frequency characteristics of an image, we introduce three types of feature vectors composed of statistical moments. The proposed statistical moments are generated from separated wavelet histograms, the characteristic functions of the wavelet variance, and the characteristic functions of the spatial image. Our method can solve the n-class classification problem. Through experimental simulations, we demonstrate that our proposed method can achieve high performance in manipulation type detection. The average rate of the correctly identified manipulation types is as high as 99.22%, using 10,800 test images and six manipulation types including the authentic image.
Using color histogram normalization for recovering chromatic illumination-changed images.
Pei, S C; Tseng, C L; Wu, C C
2001-11-01
We propose a novel image-recovery method using the covariance matrix of the red-green-blue (R-G-B) color histogram and tensor theories. The image-recovery method is called the color histogram normalization algorithm. It is known that the color histograms of an image taken under varied illuminations are related by a general affine transformation of the R-G-B coordinates when the illumination is changed. We propose a simplified affine model for application with illumination variation. This simplified affine model considers the effects of only three basic forms of distortion: translation, scaling, and rotation. According to this principle, we can estimate the affine transformation matrix necessary to recover images whose color distributions are varied as a result of illumination changes. We compare the normalized color histogram of the standard image with that of the tested image. By performing some operations of simple linear algebra, we can estimate the matrix of the affine transformation between two images under different illuminations. To demonstrate the performance of the proposed algorithm, we divide the experiments into two parts: computer-simulated images and real images corresponding to illumination changes. Simulation results show that the proposed algorithm is effective for both types of images. We also explain the noise-sensitive skew-rotation estimation that exists in the general affine model and demonstrate that the proposed simplified affine model without the use of skew rotation is better than the general affine model for such applications.
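The translation and scaling terms of the simplified affine model correspond to a per-channel standardization of the color distribution: after normalization, two images of the same scene under different illumination agree. A minimal numerical sketch (rotation is omitted, and the illumination coefficients below are invented for the demonstration):

```python
import numpy as np

def normalize_colors(img):
    """Normalize each R, G, B channel to zero mean and unit variance
    (the translation and scaling terms of the simplified affine model)."""
    flat = img.reshape(-1, 3).astype(float)
    return (flat - flat.mean(axis=0)) / flat.std(axis=0)

rng = np.random.default_rng(0)
original = rng.integers(0, 256, (32, 32, 3)).astype(float)
# Simulated illumination change: per-channel scaling plus translation.
shifted = original * np.array([1.3, 0.8, 1.1]) + np.array([20.0, -10.0, 5.0])

a = normalize_colors(original)
b = normalize_colors(shifted)
```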
NASA Astrophysics Data System (ADS)
Mahmud, Kashif; Mariethoz, Gregoire; Baker, Andy; Treble, Pauline C.
2018-02-01
Cave drip water response to surface meteorological conditions is complex due to the heterogeneity of water movement in the karst unsaturated zone. Previous studies have focused on the monitoring of fractured rock limestones that have little or no primary porosity. In this study, we aim to further understand infiltration water hydrology in the Tamala Limestone of SW Australia, which is Quaternary aeolianite with primary porosity. We build on our previous studies of the Golgotha Cave system and utilize the existing spatial survey of 29 automated cave drip loggers and a lidar-based flow classification scheme, conducted in the two main chambers of this cave. We find that a daily sampling frequency at our cave site optimizes the capture of drip variability with the least possible sampling artifacts. With the optimum sampling frequency, most of the drip sites show persistent autocorrelation for at least a month, typically much longer, indicating ample storage of water feeding all stalactites investigated. Drip discharge histograms are highly variable, showing sometimes multimodal distributions. Histogram skewness is shown to relate to the wetter-than-average 2013 hydrological year and modality is affected by seasonality. The hydrological classification scheme with respect to mean discharge and the flow variation can distinguish between groundwater flow types in limestones with primary porosity, and the technique could be used to characterize different karst flow paths when high-frequency automated drip logger data are available. We observe little difference in the coefficient of variation (COV) between flow classification types, probably reflecting the ample storage due to the dominance of primary porosity at this cave site. Moreover, we do not find any relationship between drip variability and discharge within similar flow type. 
Finally, a combination of multidimensional scaling (MDS) and clustering by k means is used to classify similar drip types based on time series analysis. This clustering reveals four unique drip regimes which agree with previous flow type classification for this site. It highlights a spatial homogeneity in drip types in one cave chamber, and spatial heterogeneity in the other, which is in agreement with our understanding of cave chamber morphology and lithology.
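The clustering step, summarizing each drip logger's series by a few features and grouping similar loggers, can be sketched with a tiny k-means (a generic Lloyd's algorithm, not the study's MDS + k-means pipeline; the two simulated regimes and their parameters are invented):

```python
import numpy as np

def kmeans(X, k, n_iter=50):
    """Plain k-means (Lloyd's algorithm) with deterministic farthest-point seeding."""
    centers = [X[0]]
    for _ in range(1, k):                        # farthest-point initialization
        d = np.min([((X - c) ** 2).sum(axis=1) for c in centers], axis=0)
        centers.append(X[np.argmax(d)])
    centers = np.array(centers)
    for _ in range(n_iter):
        labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(axis=2), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

rng = np.random.default_rng(1)
# Two invented drip regimes: slow/steady versus fast/variable discharge.
steady = rng.lognormal(np.log(2.0), 0.05, size=(15, 200))
flashy = rng.lognormal(np.log(10.0), 0.4, size=(15, 200))
series = np.vstack([steady, flashy])

# Per-logger features: mean discharge and coefficient of variation (COV).
feats = np.column_stack([series.mean(axis=1),
                         series.std(axis=1) / series.mean(axis=1)])
feats = (feats - feats.mean(axis=0)) / feats.std(axis=0)   # standardize features

labels, _ = kmeans(feats, k=2)
```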
How Can Histograms Be Useful for Introducing Continuous Probability Distributions?
ERIC Educational Resources Information Center
Derouet, Charlotte; Parzysz, Bernard
2016-01-01
The teaching of probability has changed a great deal since the end of the last century. The development of technologies is indeed part of this evolution. In France, continuous probability distributions began to be studied in 2002 by scientific 12th graders, but this subject was marginal and appeared only as an application of integral calculus.…
Statistical analysis of passenger-crowding in bus transport network of Harbin
NASA Astrophysics Data System (ADS)
Hu, Baoyu; Feng, Shumin; Li, Jinyang; Zhao, Hu
2018-01-01
Passenger flow data is indispensable but rare in the study of public transport networks. In this study, we focus on the passenger-crowding characteristics of the bus transport network of Harbin (BTN-H) based on passenger flow investigation. The three frequency histograms for all the uplinks and downlinks in Harbin are presented, including passengers on the bus at each section, crowding coefficients, and position parameters of crowded sections. The differences in crowding position are analyzed on each route. The distributions of degree and crowding degree (in directed space L) follow an exponential law. The new finding indicates that there are many stations with few crowded sections and a few stations with many crowded sections. The distributions of path length and crowded length (in directed space P) are presented based on the minimum transfer times, and it is found that they can be fitted by a composite Gaussian function and a Gaussian function, respectively. The stations and paths can be divided into three crowd levels. We conclude that BTN-H is crowded from a network-based perspective.
NASA Astrophysics Data System (ADS)
Obraztsov, S. M.; Konobeev, Yu. V.; Birzhevoy, G. A.; Rachkov, V. I.
2006-12-01
The dependence of the mechanical properties of ferritic/martensitic (F/M) steels on irradiation temperature is of interest because these steels are used as structural materials for fast reactors, fusion reactors, and accelerator-driven systems. Experimental data demonstrating temperature peaks in the physical and mechanical properties of neutron-irradiated pure iron, nickel, vanadium, and austenitic stainless steels are available in the literature. The lack of such information for F/M steels forces one to apply computational mathematical-statistical modeling methods. The bootstrap procedure is one such method, allowing the necessary statistical characteristics to be obtained using only a sample of limited size. In the present work this procedure is used to model the frequency distribution histograms of ultimate-strength temperature peaks in pure iron and the Russian F/M steels EP-450 and EP-823. Results of fitting sums of Lorentz or Gauss functions to the calculated distributions are presented. It is concluded that there are two temperature peaks (at 360 and 390 °C) of the ultimate strength in EP-450 steel and a single peak at 390 °C in EP-823.
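The bootstrap idea used here, resampling a limited-size sample with replacement to build the frequency distribution of a statistic, is easy to sketch. The strength values below are invented, and the statistic is a simple mean rather than the peak-fitting of the actual study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical small sample of ultimate-strength measurements (MPa).
sample = np.array([610., 595., 640., 615., 600., 655., 620., 605., 630., 612.])

B = 5000
boot_means = np.array([rng.choice(sample, size=sample.size, replace=True).mean()
                       for _ in range(B)])

# Frequency distribution histogram of the bootstrapped statistic.
counts, edges = np.histogram(boot_means, bins=30)
ci = np.percentile(boot_means, [2.5, 97.5])   # percentile confidence interval
```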
Optical and dielectric properties of NiFe2O4 nanoparticles under different synthesized temperature
NASA Astrophysics Data System (ADS)
Parishani, Marziye; Nadafan, Marzieh; Dehghani, Zahra; Malekfar, Rasoul; Khorrami, G. H. H.
In this research, NiFe2O4 nanoparticles were prepared via a simple sol-gel route using different sintering temperatures. The nanoparticles were characterized via X-ray diffraction (XRD) patterns, scanning electron microscopy (SEM), and FTIR spectra. The XRD patterns show that with increasing synthesis temperature the intensity and broadening of the peaks decrease, indicating better crystallization and growth of the nanoparticles. The size distribution in the histogram of the NiFe2O4 nanoparticles is 42, 96, and 315 nm at 750 °C, 850 °C, and 950 °C, respectively. The FTIR spectra were evaluated using the Kramers-Kronig method. The results confirmed the existence of definite relations between sintering temperature and the grain size of the nanoparticles. On raising the temperature from 750 °C to 950 °C, the grain size increased from 70 nm to 300 nm, and the optical constants of the nanoparticles were strongly related to the synthesis temperature as well: with increasing temperature, both the real and imaginary parts of the refractive index and the dielectric function decreased. The transversal (TO) and longitudinal (LO) phonon frequencies were also detected; the TO and LO frequencies shifted to lower (red) frequencies with increasing reaction temperature.
Examining the NZESM Cloud representation with Self Organizing Maps
NASA Astrophysics Data System (ADS)
Schuddeboom, Alex; McDonald, Adrian; Parsons, Simon; Morgenstern, Olaf; Harvey, Mike
2017-04-01
Several different cloud regimes are identified from MODIS satellite data and the representation of these regimes within the New Zealand Earth System Model (NZESM) is examined. For the development of our cloud classification we utilize a neural network algorithm known as self organizing maps (SOMs) on MODIS cloud top pressure - cloud optical thickness joint histograms. To evaluate the representation of the cloud within NZESM, the frequency and geographical distribution of the regimes is compared between the NZESM and satellite data. This approach has the advantage of not only identifying differences, but also potentially giving additional information about the discrepancy such as in which regions or phases of cloud the differences are most prominent. To allow for a more direct comparison between datasets, the COSP satellite simulation software is applied to NZESM output. COSP works by simulating the observational processes linked to a satellite, within the GCM, so that data can be generated in a way that shares the particular observational bias of specific satellites. By taking the COSP joint histograms and comparing them to our existing classifications we can easily search for discrepancies between the observational data and the simulations without having to be cautious of biases introduced by the satellite. Preliminary results, based on data for 2008, show a significant decrease in overall cloud fraction in the NZESM compared to the MODIS satellite data. To better understand the nature of this discrepancy, the cloud fraction related to different cloud heights and phases were also analysed.
Computerized image analysis: estimation of breast density on mammograms
NASA Astrophysics Data System (ADS)
Zhou, Chuan; Chan, Heang-Ping; Petrick, Nicholas; Sahiner, Berkman; Helvie, Mark A.; Roubidoux, Marilyn A.; Hadjiiski, Lubomir M.; Goodsitt, Mitchell M.
2000-06-01
An automated image analysis tool is being developed for estimation of mammographic breast density, which may be useful for risk estimation or for monitoring breast density change in a prevention or intervention program. A mammogram is digitized using a laser scanner and the resolution is reduced to a pixel size of 0.8 mm X 0.8 mm. Breast density analysis is performed in three stages. First, the breast region is segmented from the surrounding background by an automated breast boundary-tracking algorithm. Second, an adaptive dynamic range compression technique is applied to the breast image to reduce the range of the gray level distribution in the low frequency background and to enhance the differences in the characteristic features of the gray level histogram for breasts of different densities. Third, rule-based classification is used to classify the breast images into several classes according to the characteristic features of their gray level histogram. For each image, a gray level threshold is automatically determined to segment the dense tissue from the breast region. The area of segmented dense tissue as a percentage of the breast area is then estimated. In this preliminary study, we analyzed the interobserver variation of breast density estimation by two experienced radiologists using BI-RADS lexicon. The radiologists' visually estimated percent breast densities were compared with the computer's calculation. The results demonstrate the feasibility of estimating mammographic breast density using computer vision techniques and its potential to improve the accuracy and reproducibility in comparison with the subjective visual assessment by radiologists.
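The thresholding stage of such a pipeline, finding a gray-level cut in the histogram and reporting the dense-tissue fraction, can be condensed into a short sketch. The two pixel populations and the valley-seeking threshold below are illustrative stand-ins for the paper's rule-based classification:

```python
import numpy as np

def percent_dense(breast_pixels, threshold):
    """Percent of the breast region above a gray-level threshold (dense tissue)."""
    return 100.0 * (np.asarray(breast_pixels) > threshold).mean()

rng = np.random.default_rng(0)
# Hypothetical breast region: 70% fatty (low gray level), 30% dense tissue.
fatty = rng.normal(80.0, 10.0, 7000)
dense = rng.normal(160.0, 12.0, 3000)
pixels = np.concatenate([fatty, dense])

# A simple automatic threshold: the minimum of the smoothed gray-level
# histogram between the two modes.
counts, edges = np.histogram(pixels, bins=64)
smooth = np.convolve(counts, np.ones(5) / 5.0, mode="same")
lo, hi = np.searchsorted(edges, 100.0), np.searchsorted(edges, 140.0)
valley = edges[lo + np.argmin(smooth[lo:hi])]
pd = percent_dense(pixels, valley)
```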
Microcomputer-based system for registration of oxygen tension in peripheral muscle.
Odman, S; Bratt, H; Erlandsson, I; Sjögren, L
1986-01-01
For registration of oxygen tension fields in peripheral muscle, a microcomputer-based system was designed around the M6800 microprocessor. The system was designed to record the signals from a multiwire oxygen electrode, MDO, which measures oxygen on the surface of an organ. The system contained a patient-safety isolation unit built on optocouplers, and the upper frequency limit was 0.64 Hz. Collected data were corrected for drift and temperature changes during the measurement by using pre- and post-calibrations and a linear compensation technique. The measured drift of the electrodes was shown to be linear, and thus the drift could be compensated for. The system was tested in an experiment on a pig. To describe the distribution of oxygen statistically, the mean, standard deviation, skewness, and kurtosis were calculated. To detect changes or differences between histograms, a Kolmogorov-Smirnov test was used.
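The linear drift compensation from pre- and post-calibrations can be written down directly (a generic sketch with invented numbers, not the original M6800 implementation): any change in the reading at a known reference between the two calibrations is attributed to linear drift and subtracted out.

```python
import numpy as np

def drift_correct(signal, t, pre_cal, post_cal):
    """Remove linear electrode drift using pre- and post-measurement calibrations.

    pre_cal / post_cal: (time, reading offset at a known reference) pairs; the
    reference should be constant, so any change is attributed to linear drift.
    """
    (t0, r0), (t1, r1) = pre_cal, post_cal
    drift_rate = (r1 - r0) / (t1 - t0)        # drift per second
    return signal - drift_rate * (t - t0)

t = np.linspace(0.0, 600.0, 601)                        # 10-minute recording
true_po2 = 40.0 + 5.0 * np.sin(2 * np.pi * 0.01 * t)    # hypothetical pO2 signal
measured = true_po2 + 0.02 * t                          # linear drift, 0.02 units/s
corrected = drift_correct(measured, t, (0.0, 0.0), (600.0, 12.0))
```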
Cruz-Rodríguez, J A; González-Machorro, E; Villegas González, A A; Rodríguez Ramírez, M L; Mejía Lara, F
2016-04-07
It is broadly known that the conservation of biological diversity in agricultural ecosystems contributes to pest control. This process was studied in a prickly pear plantation (Opuntia megacantha and Opuntia ficus-indica) located in central Mexico. No insecticides have been used on this plantation since 2000, and local farmers believe that the presence of different species of insects limits the growth of the wild cochineal (Dactylopius opuntiae Cockerell), which is one of the main pests in this crop. From August 2012 to November 2013, we estimated the number of cochineal per stem in the plantation and determined its spatial distribution pattern. In order to identify signs of population regulation, we obtained histograms of the frequency distribution of the size of the clusters and determined whether the distribution follows a power function (power law). We identified the cochineal predators and determined the correlation in their abundances. The greatest abundance of cochineal occurred between summer and autumn, while the minimum value was recorded in spring. The frequency distribution of the cochineal clusters fit a power function closely, suggesting the presence of population regulation processes. Six species that prey on cochineal were identified. Laetilia coccidivora and Hyperaspis trifurcata were the most active, and their abundance was significantly correlated with the abundance of cochineal. We found that the probability of extinction of these insects in a cladode increases with its density, since the density and predator activity also increased. It is likely that, under these conditions, the cochineal have established an autonomous control. © The Authors 2016. Published by Oxford University Press on behalf of the Entomological Society of America.
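Checking whether a cluster-size histogram follows a power law is commonly done by linear regression on log-log axes; the slope is the power-law exponent. The abstract does not specify the authors' fitting procedure, so the following is a minimal generic sketch:

```python
import math

def fit_power_law(sizes, counts):
    """Least-squares fit of counts ~ c * sizes**alpha on log-log axes.
    Returns (alpha, log10_c); a straight log-log line (high R^2)
    indicates power-law behavior. Zero-count bins must be excluded."""
    xs = [math.log10(s) for s in sizes]
    ys = [math.log10(f) for f in counts]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    alpha = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return alpha, my - alpha * mx
```

For data generated exactly as counts = 100 * size**-2, the fit recovers alpha = -2 and log10_c = 2.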
NASA Technical Reports Server (NTRS)
Boardsen, Scott A.; Hospodarsky, George B.; Kletzing, Craig A.; Engebretson, Mark J.; Pfaff, Robert F.; Wygant, John R.; Kurth, William S.; Averkamp, Terrance F.; Bounds, Scott R.; Green, Jim L.;
2016-01-01
We present a statistical survey of the latitudinal structure of the fast magnetosonic wave mode detected by the Van Allen Probes spanning the time interval of 21 September 2012 to 1 August 2014. We show that statistically, the latitudinal occurrence of the wave frequency (f) normalized by the local proton cyclotron frequency (f(sub cP)) has a distinct funnel-shaped appearance in latitude about the magnetic equator similar to that found in case studies. By comparing the observed E/B ratios with the model E/B ratio, using the observed plasma density and background magnetic field magnitude as input to the model E/B ratio, we show that this mode is consistent with the extra-ordinary (whistler) mode at wave normal angles (theta(sub k)) near 90 deg. Performing polarization analysis on synthetic waveforms composed from a superposition of extra-ordinary mode plane waves with theta(sub k) randomly chosen between 87 and 90 deg, we show that the uncertainty in the derived wave normal is substantially broadened, with a tail extending down to theta(sub k) of 60 deg, suggesting that another approach is necessary to estimate the true distribution of theta(sub k). We find that the histograms of the synthetically derived ellipticities and theta(sub k) are consistent with the observations of ellipticities and theta(sub k) derived using polarization analysis. We make estimates of the median equatorial theta(sub k) by comparing observed and model ray tracing frequency-dependent probability occurrence with latitude and give preliminary frequency dependent estimates of the equatorial theta(sub k) distribution around noon and 4 R(sub E), with the median of approximately 4 to 7 deg from 90 deg at f/f(sub cP) = 2 and dropping to approximately 0.5 deg from 90 deg at f/f(sub cP) = 30. The occurrence of waves in this mode peaks around noon near the equator at all radial distances, and we find that the overall intensity of these waves increases with AE*, similar to findings of other studies.
Adaptive histogram equalization in digital radiography of destructive skeletal lesions.
Braunstein, E M; Capek, P; Buckwalter, K; Bland, P; Meyer, C R
1988-03-01
Adaptive histogram equalization, an image-processing technique that distributes pixel values of an image uniformly throughout the gray scale, was applied to 28 plain radiographs of bone lesions, after they had been digitized. The non-equalized and equalized digital images were compared by two skeletal radiologists with respect to lesion margins, internal matrix, soft-tissue mass, cortical breakthrough, and periosteal reaction. Receiver operating characteristic (ROC) curves were constructed on the basis of the responses. Equalized images were superior to nonequalized images in determination of cortical breakthrough and presence or absence of periosteal reaction. ROC analysis showed no significant difference in determination of margins, matrix, or soft-tissue masses.
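Global histogram equalization, the basis of the adaptive variant used in the study, maps each gray level through the normalized cumulative histogram; the adaptive version applies the same mapping within local image regions. A minimal sketch on a flat list of pixel values (illustrative, not the study's implementation):

```python
def equalize(pixels, levels=256):
    """Global histogram equalization: map each gray level through the
    normalized cumulative histogram so output values spread across the
    full gray scale."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    cdf, total = [], 0
    for h in hist:
        total += h
        cdf.append(total)
    n = len(pixels)
    # rescale the CDF to the output range 0..levels-1
    return [round((cdf[p] / n) * (levels - 1)) for p in pixels]
```

Library routines such as `skimage.exposure.equalize_adapthist` implement the contrast-limited adaptive form directly.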
Object-based change detection method using refined Markov random field
NASA Astrophysics Data System (ADS)
Peng, Daifeng; Zhang, Yongjun
2017-01-01
In order to fully consider the local spatial constraints between neighboring objects in object-based change detection (OBCD), an OBCD approach is presented by introducing a refined Markov random field (MRF). First, the images from the two periods are stacked and segmented to produce image objects. Second, object spectral and textural histogram features are extracted, and the G-statistic is used to measure the distance between different histogram distributions. Meanwhile, object heterogeneity is calculated by combining the spectral and textural histogram distances using adaptive weights. Third, an expectation-maximization algorithm is applied to determine the change category of each object, and the initial change map is then generated. Finally, a refined change map is produced by employing the proposed refined object-based MRF method. Three experiments were conducted and compared with some state-of-the-art unsupervised OBCD methods to evaluate the effectiveness of the proposed method. Experimental results demonstrate that the proposed method obtains the highest accuracy among the methods used in this paper, which confirms its validity and effectiveness in OBCD.
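A plausible reading of the G-statistic histogram distance is the standard two-sample G-test (log-likelihood ratio) applied to two count histograms; the authors' exact formulation may differ, so treat this as a hedged sketch:

```python
import math

def g_statistic(h1, h2):
    """Two-sample G-test between two count histograms: 0 for identical
    shapes, growing as the distributions diverge. Expected counts assume
    both histograms are drawn from a common pooled distribution."""
    n1, n2 = sum(h1), sum(h2)
    total = n1 + n2
    g = 0.0
    for a, b in zip(h1, h2):
        col = a + b
        for o, n in ((a, n1), (b, n2)):
            e = col * n / total  # expected count under the pooled model
            if o > 0:
                g += o * math.log(o / e)
    return 2.0 * g
```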
Exploring gravitational lensing model variations in the Frontier Fields galaxy clusters
NASA Astrophysics Data System (ADS)
Harris James, Nicholas John; Raney, Catie; Brennan, Sean; Keeton, Charles
2018-01-01
Multiple groups have been working on modeling the mass distributions of the six lensing galaxy clusters in the Hubble Space Telescope Frontier Fields data set. The magnification maps produced from these mass models will be important for the future study of the lensed background galaxies, but there exists significant variation in the different groups’ models and magnification maps. We explore the use of two-dimensional histograms as a tool for visualizing these magnification map variations. Using a number of simple, one- or two-halo singular isothermal sphere models, we explore the features that are produced in 2D histogram model comparisons when parameters such as halo mass, ellipticity, and location are allowed to vary. Our analysis demonstrates the potential of 2D histograms as a means of observing the full range of differences between the Frontier Fields groups’ models. This work has been supported by funding from National Science Foundation grants PHY-1560077 and AST-1211385, and from the Space Telescope Science Institute.
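The 2D-histogram comparison is generic: evaluate two magnification maps at the same pixels and bin the value pairs, so identical maps pile up on the diagonal and model differences appear as off-diagonal structure. A minimal sketch on flat lists of per-pixel values (names and binning are illustrative):

```python
def hist2d(map_a, map_b, edges):
    """Joint (2D) histogram of two maps sampled at the same pixels.
    grid[i][j] counts pixels whose map_a value falls in bin i and whose
    map_b value falls in bin j; values at the top edge are clamped."""
    nb = len(edges) - 1
    grid = [[0] * nb for _ in range(nb)]
    def bin_of(v):
        for i in range(nb):
            if edges[i] <= v < edges[i + 1]:
                return i
        return nb - 1  # clamp values at or above the last edge
    for a, b in zip(map_a, map_b):
        grid[bin_of(a)][bin_of(b)] += 1
    return grid
```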
NASA Astrophysics Data System (ADS)
Li, Shuo; Jin, Weiqi; Li, Li; Li, Yiyang
2018-05-01
Infrared thermal images can reflect the thermal-radiation distribution of a particular scene. However, the contrast of the infrared images is usually low. Hence, it is generally necessary to enhance the contrast of infrared images in advance to facilitate subsequent recognition and analysis. Based on the adaptive double plateaus histogram equalization, this paper presents an improved contrast enhancement algorithm for infrared thermal images. In the proposed algorithm, the normalized coefficient of variation of the histogram, which characterizes the level of contrast enhancement, is introduced as feedback information to adjust the upper and lower plateau thresholds. The experiments on actual infrared images show that compared to the three typical contrast-enhancement algorithms, the proposed algorithm has better scene adaptability and yields better contrast-enhancement results for infrared images with more dark areas or a higher dynamic range. Hence, it has high application value in contrast enhancement, dynamic range compression, and digital detail enhancement for infrared thermal images.
Color image enhancement based on particle swarm optimization with Gaussian mixture
NASA Astrophysics Data System (ADS)
Kattakkalil Subhashdas, Shibudas; Choi, Bong-Seok; Yoo, Ji-Hoon; Ha, Yeong-Ho
2015-01-01
This paper proposes a Gaussian mixture based image enhancement method which uses particle swarm optimization (PSO) to gain an edge over other contemporary methods. The proposed method uses a Gaussian mixture model to model the lightness histogram of the input image in CIEL*a*b* space. The intersection points of the Gaussian components in the model are used to partition the lightness histogram. The enhanced lightness image is generated by transforming the lightness value in each interval to an appropriate output interval according to a transformation function that depends on the PSO-optimized parameters: the weight and standard deviation of each Gaussian component and the cumulative distribution of the input histogram interval. In addition, chroma compensation is applied to the resulting image to reduce washout appearance. Experimental results show that the proposed method produces a better enhanced image compared to traditional methods. Moreover, the enhanced image is free from several side effects such as washout appearance, information loss, and gradation artifacts.
Gonzalez-Vazquez, J P; Anta, Juan A; Bisquert, Juan
2009-11-28
The random walk numerical simulation (RWNS) method is used to compute diffusion coefficients for hopping transport in a fully disordered medium at finite carrier concentrations. We use Miller-Abrahams jumping rates and an exponential distribution of energies to compute the hopping times in the random walk simulation. The computed diffusion coefficient shows an exponential dependence with respect to Fermi-level and Arrhenius behavior with respect to temperature. This result indicates that there is a well-defined transport level implicit to the system dynamics. To establish the origin of this transport level we construct histograms to monitor the energies of the most visited sites. In addition, we construct "corrected" histograms where backward moves are removed. Since these moves do not contribute to transport, these histograms provide a better estimation of the effective transport level energy. The analysis of this concept in connection with the Fermi-level dependence of the diffusion coefficient and the regime of interest for the functioning of dye-sensitised solar cells is thoroughly discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ebert, Martin A., E-mail: Martin.Ebert@health.wa.gov.au; School of Physics, University of Western Australia, Perth, Western Australia; Foo, Kerwyn
Purpose: To use a high-quality multicenter trial dataset to determine dose-volume effects for gastrointestinal (GI) toxicity following radiation therapy for prostate carcinoma. Influential dose-volume histogram regions were to be determined as functions of dose, anatomical location, toxicity, and clinical endpoint. Methods and Materials: Planning datasets for 754 participants in the TROG 03.04 RADAR trial were available, with Late Effects of Normal Tissues (LENT) Subjective, Objective, Management, and Analytic (SOMA) toxicity assessment to a median of 72 months. A rank sum method was used to define dose-volume cut-points as near-continuous functions of dose to 3 GI anatomical regions, together with a comprehensive assessment of significance. Univariate and multivariate ordinal regression was used to assess the importance of cut-points at each dose. Results: Dose ranges providing significant cut-points tended to be consistent with those showing significant univariate regression odds-ratios (representing the probability of a unitary increase in toxicity grade per percent relative volume). Ranges of significant cut-points for rectal bleeding validated previously published results. Separation of the lower GI anatomy into complete anorectum, rectum, and anal canal showed the impact of mid-low doses to the anal canal on urgency and tenesmus, completeness of evacuation and stool frequency, and mid-high doses to the anorectum on bleeding and stool frequency. Derived multivariate models emphasized the importance of the high-dose region of the anorectum and rectum for rectal bleeding; mid- to low-dose regions for diarrhea, urgency, and tenesmus; and low-to-mid doses to the anal canal for stool frequency, diarrhea, evacuation, and bleeding. Conclusions: Results confirm the anatomical dependence of specific GI toxicities. They provide an atlas summarizing dose-histogram effects and derived constraints as functions of anatomical region, dose, toxicity, and endpoint for informing future radiation therapy planning.
Abundance and size distribution dynamics of abyssal epibenthic megafauna in the northeast Pacific.
Ruhl, Henry A
2007-05-01
The importance of interannual variation in deep-sea abundances is now becoming recognized. There is, however, relatively little known about what processes dominate the observed fluctuations. The abundance and size distribution of the megabenthos have been examined here using a towed camera system at a deep-sea station in the northeast Pacific (Station M) from 1989 to 2004. This 16-year study included 52 roughly seasonal transects averaging 1.2 km in length with over 35600 photographic frames analyzed. Mobile epibenthic megafauna at 4100 m depth have exhibited interannual scale changes in abundance from one to three orders of magnitude. Increases in abundance have now been significantly linked to decreases in mean body size, suggesting that accruals in abundance probably result from the recruitment of young individuals. Examinations of size-frequency histograms indicate several possible recruitment events. Shifts in size-frequency distributions were also used to make basic estimations of individual growth rates from 1 to 6 mm/month, depending on the taxon. Regional intensification in reproduction followed by recruitment within the study area could explain the majority of observed accruals in abundance. Although some adult migration is certainly probable in accounting for local variation in abundances, the slow movements of benthic life stages restrict regional migrations for most taxa. Negative competitive interactions and survivorship may explain the precipitous declines of some taxa. This and other studies have shown that abundances from protozoans to large benthic invertebrates and fishes all have undergone significant fluctuations in abundance at Station M over periods of weeks to years.
A Geometric View of the Mean of a Set of Numbers
ERIC Educational Resources Information Center
Sarkar, Jyotirmoy; Rashid, Mamunur
2016-01-01
The sample mean is sometimes depicted as a fulcrum placed under the Dot plot. We provide an alternative geometric visualization of the sample mean using the empirical cumulative distribution function or the cumulative histogram data.
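The geometric fact behind this visualization can be verified numerically: for sorted data, the sample mean equals the minimum plus the area trapped between the empirical CDF and the horizontal line y = 1. A short sketch:

```python
def mean_from_ecdf(xs):
    """Recover the sample mean geometrically from the empirical CDF:
    minimum value plus the area between the ECDF and the line y = 1.
    The ECDF is the step function F(t) = i/n on [x_i, x_{i+1})."""
    xs = sorted(xs)
    n = len(xs)
    area = sum((xs[i] - xs[i - 1]) * (1 - i / n) for i in range(1, n))
    return xs[0] + area
```

For example, `mean_from_ecdf([1, 2, 3])` returns 2.0, the ordinary sample mean.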
2013-09-01
[Fragmentary report excerpt] Topics recoverable from the fragment: persistent layers of particulate matter (defined by turbidity or chlorophyll); data averaged along isopycnals (Fig. 1); a histogram showing the statistical differences in mixing (Fig. 3); and, under Significance, the attenuation of light and sound propagation in the upper ocean. (DISTRIBUTION STATEMENT A: Approved for public release.)
Measuring kinetics of complex single ion channel data using mean-variance histograms.
Patlak, J B
1993-01-01
The measurement of single ion channel kinetics is difficult when those channels exhibit subconductance events. When the kinetics are fast, and when the current magnitudes are small, as is the case for Na+, Ca2+, and some K+ channels, these difficulties can lead to serious errors in the estimation of channel kinetics. I present here a method, based on the construction and analysis of mean-variance histograms, that can overcome these problems. A mean-variance histogram is constructed by calculating the mean current and the current variance within a brief "window" (a set of N consecutive data samples) superimposed on the digitized raw channel data. Systematic movement of this window over the data produces large numbers of mean-variance pairs which can be assembled into a two-dimensional histogram. Defined current levels (open, closed, or sublevel) appear in such plots as low variance regions. The total number of events in such low variance regions is estimated by curve fitting and plotted as a function of window width. This function decreases with the same time constants as the original dwell time probability distribution for each of the regions. The method can therefore be used: 1) to present a qualitative summary of the single channel data from which the signal-to-noise ratio, open channel noise, steadiness of the baseline, and number of conductance levels can be quickly determined; 2) to quantify the dwell time distribution in each of the levels exhibited. In this paper I present the analysis of a Na+ channel recording that had a number of complexities. The signal-to-noise ratio was only about 8 for the main open state, open channel noise, and fast flickers to other states were present, as were a substantial number of subconductance states. 
"Standard" half-amplitude threshold analysis of these data produced open and closed time histograms that were well fitted by the sum of two exponentials, but with apparently erroneous time constants, whereas the mean-variance histogram technique provided a more credible analysis of the open, closed, and subconductance times for the patch. I also show that the method produces accurate results on simulated data in a wide variety of conditions, whereas the half-amplitude method, when applied to complex simulated data, shows the same errors as were apparent in the real data. The utility and the limitations of this new method are discussed.
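The core construction of a mean-variance histogram is the set of (mean, variance) pairs from a sliding window over the digitized trace; the pairs are then binned into a 2D histogram in which defined current levels appear as low-variance clusters. A generic sketch of the windowing step:

```python
def mean_variance_pairs(trace, window):
    """Sliding-window (mean, variance) pairs: the raw material of a
    mean-variance histogram. Points with near-zero variance correspond
    to dwells in a defined current level (open, closed, or sublevel)."""
    pairs = []
    for i in range(len(trace) - window + 1):
        w = trace[i:i + window]
        m = sum(w) / window
        v = sum((x - m) ** 2 for x in w) / window
        pairs.append((m, v))
    return pairs
```

A constant trace yields pairs with zero variance; windows straddling a level transition produce the high-variance points between clusters.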
NASA Astrophysics Data System (ADS)
Di Matteo, S.; Villante, U.
2017-05-01
The occurrence of waves at discrete frequencies in the solar wind (SW) parameters has been reported in the scientific literature with some controversial results, mostly concerning the existence (and stability) of favored sets of frequencies. On the other hand, the experimental results might be influenced by the analytical methods adopted for the spectral analysis. We focused attention on the fluctuations of the SW dynamic pressure (PSW) occurring in the leading edges of streams following interplanetary shocks and compared the results of the Welch method (WM) with those of the multitaper method (MTM). The results of a simulation analysis demonstrate that the identification of the wave occurrence and the frequency estimate might be strongly influenced by the signal characteristics and analytical methods, especially in the presence of multicomponent signals. In SW streams, PSW oscillations are routinely detected in the entire range f ≈ 1.2-5.0 mHz; nevertheless, the WM/MTM agreement in the identification and frequency estimate occurs in ≈50% of events and different sets of favored frequencies would be proposed for the same set of events by the WM and MTM analysis. The histogram of the frequency distribution of the events identified by both methods suggests more relevant percentages between f ≈ 1.7-1.9, f ≈ 2.7-3.4, and f ≈ 3.9-4.4 (with a most relevant peak at f ≈ 4.2 mHz). Extremely severe thresholds select a small number (14) of remarkable events, with a one-to-one correspondence between WM and MTM: interestingly, these events reveal a tendency for a favored occurrence in bins centered at f ≈ 2.9 and at f ≈ 4.2 mHz.
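The Welch method itself is generic: average the periodograms of windowed, overlapping segments, trading frequency resolution for a lower-variance spectral estimate. A minimal pure-Python sketch with Hann windows and 50% overlap (real analyses would use an FFT-based routine such as `scipy.signal.welch`):

```python
import cmath, math

def welch_psd(x, seg_len, fs=1.0):
    """Welch-style power spectral density: Hann-windowed, half-
    overlapping segments whose periodograms are averaged. Returns
    (frequencies, psd) for the one-sided spectrum."""
    hop = seg_len // 2
    win = [0.5 - 0.5 * math.cos(2 * math.pi * n / (seg_len - 1))
           for n in range(seg_len)]
    norm = sum(w * w for w in win)  # window power normalization
    psd = [0.0] * (seg_len // 2 + 1)
    nseg = 0
    for start in range(0, len(x) - seg_len + 1, hop):
        seg = [x[start + n] * win[n] for n in range(seg_len)]
        for k in range(len(psd)):  # direct DFT; fine for short segments
            X = sum(seg[n] * cmath.exp(-2j * math.pi * k * n / seg_len)
                    for n in range(seg_len))
            psd[k] += abs(X) ** 2 / (fs * norm)
        nseg += 1
    freqs = [k * fs / seg_len for k in range(len(psd))]
    return freqs, [p / nseg for p in psd]
```

A sinusoid at 0.25 cycles/sample produces a spectral peak at bin k = 4 of a 16-point segment, illustrating how discrete frequencies would be identified.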
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shusharina, N; Choi, N; Bortfeld, T
2016-06-15
Purpose: To determine whether the difference in the cumulative 18F-FDG uptake histogram of lung treated with either IMRT or PSPT is associated with radiation pneumonitis (RP) in patients with inoperable stage II and III NSCLC. Methods: We analyzed 24 patients from a prospective randomized trial comparing IMRT (n=12) with PSPT (n=12) for inoperable NSCLC. All patients underwent PET-CT imaging between 35 and 88 days post-therapy. The post-treatment PET-CT was aligned with the planning 4D CT to establish a voxel-to-voxel correspondence between post-treatment PET and planning dose images. 18F-FDG uptake as a function of radiation dose to normal lung was obtained for each patient. The distribution of the standardized uptake value (SUV) was analyzed using a volume histogram method. The quantitative image characteristics and DVH measures were correlated with clinical symptoms of pneumonitis. Results: Patients with RP were present in both groups: 5 in the IMRT group and 6 in the PSPT group. The analysis of cumulative SUV histograms showed significantly higher relative volumes of the normal lung having higher SUV uptake in the PSPT patients for both symptomatic and asymptomatic cases (VSUV=2: 10% for IMRT vs 16% for proton RT; VSUV=1: 10% for IMRT vs 23% for proton RT). In addition, the SUV histograms for symptomatic cases in PSPT patients exhibited a significantly longer tail at the highest SUV. The absolute volume of the lung receiving a dose >70 Gy was larger in the PSPT patients. Conclusion: The 18F-FDG uptake versus radiation dose response correlates with RP in both groups of patients by means of the linear regression slope. SUV is higher for the PSPT patients for both symptomatic and asymptomatic cases. The higher uptake in PSPT patients is explained by larger volumes of the lung receiving a high radiation dose.
Gottschlich, Carsten
2016-01-01
We present a new type of local image descriptor which yields binary patterns from small image patches. For the application to fingerprint liveness detection, we achieve rotation invariant image patches by taking the fingerprint segmentation and orientation field into account. We compute the discrete cosine transform (DCT) for these rotation invariant patches and attain binary patterns by comparing pairs of two DCT coefficients. These patterns are summarized into one or more histograms per image. Each histogram comprises the relative frequencies of pattern occurrences. Multiple histograms are concatenated and the resulting feature vector is used for image classification. We name this novel type of descriptor convolution comparison pattern (CCP). Experimental results show the usefulness of the proposed CCP descriptor for fingerprint liveness detection. CCP outperforms other local image descriptors such as LBP, LPQ and WLD on the LivDet 2013 benchmark. The CCP descriptor is a general type of local image descriptor which we expect to prove useful in areas beyond fingerprint liveness detection such as biological and medical image processing, texture recognition, face recognition and iris recognition, liveness detection for face and iris images, and machine vision for surface inspection and material classification.
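The core idea, a binary pattern from pairwise DCT-coefficient comparisons, can be sketched generically. The coefficient pairs below are an arbitrary illustrative choice, not the set used in the paper, and the direct DCT-II is written from its definition rather than a fast transform:

```python
import math

def dct2(patch):
    """DCT-II of a (flattened) patch, computed directly from the
    definition; real descriptors would use a fast 2D transform."""
    n = len(patch)
    return [sum(patch[j] * math.cos(math.pi * k * (2 * j + 1) / (2 * n))
                for j in range(n)) for k in range(n)]

def comparison_pattern(coeffs, pairs):
    """Binary pattern from pairwise coefficient comparisons: one bit
    per pair, set when the first coefficient exceeds the second."""
    bits = 0
    for i, j in pairs:
        bits = (bits << 1) | (1 if coeffs[i] > coeffs[j] else 0)
    return bits
```

Per image, the pattern values would then be tallied into a histogram of relative frequencies and concatenated into the classification feature vector.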
Search for Correlated Fluctuations in the Beta+ Decay of Na-22
NASA Astrophysics Data System (ADS)
Silverman, M. P.; Strange, W.
2008-10-01
Claims for a ``cosmogenic'' force that correlates otherwise independent stochastic events have been made for at least 10 years, based largely on visual inspection of time series of histograms whose shapes were interpreted as suggestive of recurrent patterns with semi-diurnal, diurnal, and monthly periods. Building on our earlier work to test randomness of different nuclear decay processes, we have searched for correlations in the time-series of coincident positron-electron annihilations deriving from beta+ decay of Na-22. Disintegrations were counted within a narrow time window over a period of 7 days, leading to a time series of more than 1 million events. Statistical tests were performed on the raw time series, its correlation function, and its Fourier transform to search for cyclic correlations indicative of quantum-mechanical violating deviations from Poisson statistics. The time series was then partitioned into a sequence of 167 ``bags'' each of 8192 events. A histogram was made of the events of each bag, where contiguous frequency classes differed by a single count. The chronological sequence of histograms was then tested for correlations within classes. In all cases the results of the tests were in accord with statistical control, giving no evidence of correlated fluctuations.
Unbiased estimators for spatial distribution functions of classical fluids
NASA Astrophysics Data System (ADS)
Adib, Artur B.; Jarzynski, Christopher
2005-01-01
We use a statistical-mechanical identity closely related to the familiar virial theorem, to derive unbiased estimators for spatial distribution functions of classical fluids. In particular, we obtain estimators for both the fluid density ρ(r) in the vicinity of a fixed solute and the pair correlation g(r) of a homogeneous classical fluid. We illustrate the utility of our estimators with numerical examples, which reveal advantages over traditional histogram-based methods of computing such distributions.
Reliability Estimation of Parameters of Helical Wind Turbine with Vertical Axis
Dumitrascu, Adela-Eliza; Lepadatescu, Badea; Dumitrascu, Dorin-Ion; Nedelcu, Anisor; Ciobanu, Doina Valentina
2015-01-01
Due to the prolonged use of wind turbines they must be characterized by high reliability. This can be achieved through a rigorous design, appropriate simulation and testing, and proper construction. The reliability prediction and analysis of these systems will lead to identifying the critical components, increasing the operating time, minimizing failure rate, and minimizing maintenance costs. To estimate the produced energy by the wind turbine, an evaluation approach based on the Monte Carlo simulation model is developed which enables us to estimate the probability of minimum and maximum parameters. In our simulation process we used triangular distributions. The analysis of simulation results has been focused on the interpretation of the relative frequency histograms and cumulative distribution curve (ogive diagram), which indicates the probability of obtaining the daily or annual energy output depending on wind speed. The experimental researches consist in estimation of the reliability and unreliability functions and hazard rate of the helical vertical axis wind turbine designed and patented to climatic conditions for Romanian regions. Also, the variation of power produced for different wind speeds, the Weibull distribution of wind probability, and the power generated were determined. The analysis of experimental results indicates that this type of wind turbine is efficient at low wind speed.
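The Monte Carlo procedure described above, sampling a triangular distribution and reading off the relative-frequency histogram and ogive, can be sketched with the standard library. The cubic wind-speed-to-power mapping below is an illustrative stand-in, not the turbine's actual power curve:

```python
import random

def energy_ogive(n_trials, v_low, v_mode, v_high, bins=10, seed=1):
    """Draw wind speeds from a triangular distribution, map them to a
    toy power value (v**3), and build the relative-frequency histogram
    plus its cumulative curve (ogive)."""
    rng = random.Random(seed)
    powers = [rng.triangular(v_low, v_high, v_mode) ** 3
              for _ in range(n_trials)]
    lo, hi = min(powers), max(powers)
    width = (hi - lo) / bins or 1.0
    hist = [0] * bins
    for p in powers:
        hist[min(int((p - lo) / width), bins - 1)] += 1
    rel = [h / n_trials for h in hist]          # relative frequencies
    ogive, running = [], 0.0
    for r in rel:                               # cumulative curve
        running += r
        ogive.append(running)
    return rel, ogive
```

The ogive gives, for each power bin, the simulated probability of not exceeding that output, which is how the abstract's probability statements are read off.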
Dual-frequency ultrasound for detecting and sizing bubbles.
Buckey, Jay C; Knaus, Darin A; Alvarenga, Donna L; Kenton, Marc A; Magari, Patrick J
2005-01-01
ISS construction and Mars exploration require extensive extravehicular activity (EVA), exposing crewmembers to increased decompression sickness risk. Improved bubble detection technologies could help increase EVA efficiency and safety. Creare Inc. has developed a bubble detection and sizing instrument using dual-frequency ultrasound. The device emits "pump" and "image" signals at two frequencies. The low-frequency pump signal causes an appropriately-sized bubble to resonate. When the image frequency hits a resonating bubble, mixing signals are returned at the sum and difference of the two frequencies. To test the feasibility of transcutaneous intravascular detection, intravascular bubbles in anesthetized swine were produced using agitated saline and decompression stress. Ultrasonic transducers on the chest provided the two frequencies. Mixing signals were detected transthoracically in the right atrium using both methods. A histogram of estimated bubble sizes could be constructed. Bubbles can be detected and sized transthoracically in the right atrium using dual-frequency ultrasound.
Evaluation of the effectiveness of color attributes for video indexing
NASA Astrophysics Data System (ADS)
Chupeau, Bertrand; Forest, Ronan
2001-10-01
Color features are reviewed and their effectiveness assessed in the application framework of key-frame clustering for abstracting unconstrained video. Existing color spaces and associated quantization schemes are first studied. Description of global color distribution by means of histograms is then detailed. In our work, 12 combinations of color space and quantization were selected, together with 12 histogram metrics. Their respective effectiveness with respect to picture similarity measurement was evaluated through a query-by-example scenario. For that purpose, a set of still-picture databases was built by extracting key frames from several video clips, including news, documentaries, sports and cartoons. Classical retrieval performance evaluation criteria were adapted to the specificity of our testing methodology.
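The abstract does not list the twelve histogram metrics evaluated, but two standard examples of the kind of metrics used for color-histogram similarity are histogram intersection and the city-block (L1) distance; a minimal sketch on normalized histograms:

```python
def intersection(h1, h2):
    """Histogram intersection: 1.0 for identical normalized histograms,
    0.0 for histograms with disjoint support."""
    return sum(min(a, b) for a, b in zip(h1, h2))

def l1_distance(h1, h2):
    """City-block (L1) distance, a common dissimilarity between
    normalized histograms; 0.0 for identical histograms."""
    return sum(abs(a - b) for a, b in zip(h1, h2))
```

In a query-by-example evaluation, each key frame's histogram is compared against the query's, and frames are ranked by the chosen metric.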
CALIPSO V1.00 L3 IceCloud Formal Release Announcement
Atmospheric Science Data Center
2018-06-13
... The Atmospheric Science Data Center (ASDC) at NASA Langley Research Center in collaboration with the CALIPSO mission team announces the ... distributions of ice cloud extinction coefficients and ice water content histograms on a uniform spatial grid. All parameters are ...
Experiment of Rain Retrieval over Land Using Surface Emissivity Map Derived from TRMM TMI and JRA25
NASA Astrophysics Data System (ADS)
Furuzawa, Fumie; Masunaga, Hirohiko; Nakamura, Kenji
2010-05-01
We are developing a dataset of global land surface emissivity calculated from TRMM TMI brightness temperature (TB) and atmospheric profile data from the Japanese 25-year Reanalysis Project (JRA-25) for regions identified as no-rain by TRMM PR, assuming zero cloud liquid water above the 0 °C level. For evaluation, characteristics of the global monthly emissivity maps are checked, for example, the dependency of emissivity on TMI frequency and local time, and its seasonal/annual variation. Moreover, these data are classified by JRA-25 land type or soil wetness and compared. The histogram of the polarization difference of emissivity is similar to that of TB and mostly reflects the variability of land type or soil wetness, while the histogram of vertical emissivity shows a small difference. Next, by interpolating this instantaneous dataset with Gaussian-function weighting, we derive the emissivity over neighboring rainy regions and assess the interpolated emissivity by running a radiative transfer model using PR rain profiles and comparing with observed TB. A preliminary rain retrieval from the emissivities for some frequencies and TBs is evaluated against the PR rain profile and TMI rain rate. Moreover, another method is tested to estimate surface temperature from two emissivities, based on their statistical relation for each land type. We will show the results for vertical and horizontal emissivities at each frequency.
Lichtenhan, Jeffery T.; Chertoff, Mark E.
2008-01-01
An analytic compound action potential (CAP), obtained by convolving functional representations of the post-stimulus time histogram summed across auditory nerve neurons [P(t)] and a single neuron action potential [U(t)], was fit to human CAPs. The analytic CAP fit to pre- and post-noise-induced temporary hearing threshold shift (TTS) estimated in vivo P(t) and U(t) and the number of neurons contributing to the CAPs (N). The width of P(t) decreased with increasing signal level and was wider at the lowest signal level following noise exposure. P(t) latency decreased with increasing signal level and was shorter at all signal levels following noise exposure. The damping and oscillatory frequency of U(t) increased with signal level. For subjects with large amounts of TTS, U(t) had greater damping than before noise exposure, particularly at low signal levels. Additionally, U(t) oscillation was lower in frequency at all click intensities following noise exposure. N increased with signal level and was smaller after noise exposure at the lowest signal level. Collectively, these findings indicate that neurons contributing to the CAP during TTS are fewer in number, shorter in latency, and poorer in synchrony than before noise exposure. Moreover, estimates of single neuron action potentials may decay more rapidly and have a lower oscillatory frequency during TTS. PMID:18397026
NASA Astrophysics Data System (ADS)
Lee, Richard; Chan, Elisa K.; Kosztyla, Robert; Liu, Mitchell; Moiseenko, Vitali
2012-12-01
The relationship between rectal dose distribution and the incidence of late rectal complications following external-beam radiotherapy has been previously studied using dose-volume histograms or dose-surface histograms. However, these do not account for the spatial dose distribution. This study proposes a metric based on both surface dose and distance that can predict the incidence of rectal bleeding in prostate cancer patients treated with radical radiotherapy. One hundred and forty-four patients treated with radical radiotherapy for prostate cancer were prospectively followed to record the incidence of grade ≥2 rectal bleeding. Radiotherapy plans were used to evaluate a dose-distance metric that accounts for the dose and its spatial distribution on the rectal surface, characterized by a logistic weighting function with slope a and inflection point d0. This was compared to the effective dose obtained from dose-surface histograms, characterized by the parameter n, which describes sensitivity to hot spots. The log-rank test was used to determine statistically significant (p < 0.05) cut-off values for the dose-distance metric and effective dose that predict the occurrence of rectal bleeding. For the dose-distance metric, only d0 = 25 and 30 mm combined with a > 5 led to statistically significant cut-offs. For the effective dose metric, only values of n in the range 0.07-0.35 led to statistically significant cut-offs. The proposed dose-distance metric is a predictor of rectal bleeding in prostate cancer patients treated with radiotherapy. Both the dose-distance metric and the effective dose metric indicate that the incidence of grade ≥2 rectal bleeding is sensitive to localized damage to the rectal surface.
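The abstract does not give the exact functional form of the logistic weighting, so the following is only a plausible sketch: each rectal-surface element's dose is weighted by a logistic function of its distance from a reference point, with a slope-related parameter a and inflection point d0 as named above. The dose and distance arrays are synthetic.

```python
import numpy as np

def dose_distance_metric(dose, distance, a=10.0, d0=25.0):
    """Weight each surface element's dose by a logistic function of its
    distance (mm) from a reference point, then form a weighted average.
    The exact functional form here is an assumption for illustration,
    not the metric from the paper."""
    w = 1.0 / (1.0 + np.exp(a * (distance - d0) / d0))
    return np.sum(w * dose) / np.sum(w)

# Synthetic surface: the same hot spot placed near vs. far from the
# reference point should score differently under the metric.
dist = np.linspace(0, 60, 100)                 # mm from reference point
hot_near = np.where(dist < 20, 70.0, 10.0)     # Gy, hot spot close in
hot_far = np.where(dist > 40, 70.0, 10.0)      # same hot spot far away
assert dose_distance_metric(hot_near, dist) > dose_distance_metric(hot_far, dist)
```

This spatial sensitivity is exactly what a plain dose-surface histogram discards: both synthetic cases above have identical DSHs.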
Content based Image Retrieval based on Different Global and Local Color Histogram Methods: A Survey
NASA Astrophysics Data System (ADS)
Suhasini, Pallikonda Sarah; Sri Rama Krishna, K.; Murali Krishna, I. V.
2017-02-01
Different global and local color histogram methods for content-based image retrieval (CBIR) are investigated in this paper. The color histogram is a widely used descriptor for CBIR. The conventional method of extracting a color histogram is global, which misses spatial content, is less invariant to deformation and viewpoint changes, and results in a very large three-dimensional histogram corresponding to the color space used. To address these deficiencies, different global and local histogram methods have been proposed in recent research: ways of extracting local histograms to provide spatial correspondence, invariant color histograms to add deformation and viewpoint invariance, and fuzzy linking to reduce the size of the histogram. The color space and the distance metric used are vital in obtaining a color histogram. In this paper, the performance of CBIR based on different global and local color histograms in three color spaces (RGB, HSV, and L*a*b*) and with three distance measures (Euclidean, quadratic, and histogram intersection) is surveyed, to choose an appropriate method for future research.
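As a minimal sketch of why local histograms capture spatial content that global ones miss (an illustration, not the survey's own code), the example below builds a block-wise descriptor by concatenating per-block histograms and shows that two layouts with identical global color content are indistinguishable under a global Euclidean comparison but separated by the local one:

```python
import numpy as np

def global_hist(img, bins=8):
    """Normalized gray/color-channel histogram of the whole image."""
    h, _ = np.histogram(img, bins=bins, range=(0, 1))
    return h / h.sum()

def local_hist(img, bins=8, grid=2):
    """Concatenate per-block histograms so spatial layout matters."""
    blocks = []
    for rows in np.array_split(img, grid, axis=0):
        for block in np.array_split(rows, grid, axis=1):
            blocks.append(global_hist(block, bins))
    return np.concatenate(blocks)

def euclidean(h1, h2):
    return np.linalg.norm(h1 - h2)

# Two images with identical global color content but swapped layout.
top_bright = np.vstack([np.full((4, 8), 0.9), np.full((4, 8), 0.1)])
bottom_bright = np.vstack([np.full((4, 8), 0.1), np.full((4, 8), 0.9)])

assert euclidean(global_hist(top_bright), global_hist(bottom_bright)) == 0
assert euclidean(local_hist(top_bright), local_hist(bottom_bright)) > 0
```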
NASA Astrophysics Data System (ADS)
Buettner, Florian; Gulliford, Sarah L.; Webb, Steve; Sydes, Matthew R.; Dearnaley, David P.; Partridge, Mike
2009-11-01
Many studies have been performed to assess correlations between measures derived from dose-volume histograms and late rectal toxicities for radiotherapy of prostate cancer. The purpose of this study was to quantify correlations between measures describing the shape and location of the dose distribution and different outcomes. The dose to the rectal wall was projected on a two-dimensional map. In order to characterize the dose distribution, its centre of mass, longitudinal and lateral extent, and eccentricity were calculated at different dose levels. Furthermore, the dose-surface histogram (DSH) was determined. Correlations between these measures and seven clinically relevant rectal-toxicity endpoints were quantified by maximally selected standardized Wilcoxon rank statistics. The analysis was performed using data from the RT01 prostate radiotherapy trial. For some endpoints, the shape of the dose distribution is more strongly correlated with the outcome than simple DSHs. Rectal bleeding was most strongly correlated with the lateral extent of the dose distribution. For loose stools, the strongest correlations were found for longitudinal extent; proctitis was most strongly correlated with DSH. For the other endpoints no statistically significant correlations could be found. The strengths of the correlations between the shape of the dose distribution and outcome differed considerably between the different endpoints. Due to these significant correlations, it is desirable to use shape-based tools in order to assess the quality of a dose distribution.
Carrier Modulation Via Waveform Probability Density Function
NASA Technical Reports Server (NTRS)
Williams, Glenn L.
2006-01-01
Beyond the classic modes of carrier modulation by varying amplitude (AM), phase (PM), or frequency (FM), we extend the modulation domain of an analog carrier signal to include a class of general modulations which are distinguished by their probability density function (pdf) histogram. Separate waveform states are easily created by varying the pdf of the transmitted waveform. Individual waveform states are assignable as proxies for digital ones or zeros. At the receiver, these states are easily detected by accumulating sampled waveform statistics and performing periodic pattern matching, correlation, or statistical filtering. No fundamental physical laws are broken in the detection process. We show how a typical modulation scheme would work in the digital domain and suggest how to build an analog version. We propose that clever variations of the modulating waveform (and thus the histogram) can provide simple steganographic encoding.
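A toy digital version of this pdf-keyed modulation can be sketched as follows: the two symbols are waveform segments with identical power but different amplitude pdfs (uniform versus Gaussian, an illustrative choice, not the report's scheme), and the receiver decides by matching the sampled amplitude histogram against the two reference pdf shapes.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 4000  # samples per symbol period

def send(bit):
    """Symbol 0: uniform-pdf waveform; symbol 1: Gaussian-pdf waveform.
    Both are scaled to unit power, so amplitude statistics alone carry
    the information."""
    x = rng.uniform(-1, 1, N) if bit == 0 else rng.normal(0, 1, N)
    return x / x.std()

def detect(x, bins=32):
    """Match the received amplitude histogram against the two reference
    pdf shapes and pick the closer one (least squared error)."""
    h, edges = np.histogram(x, bins=bins, range=(-4, 4), density=True)
    centers = (edges[:-1] + edges[1:]) / 2
    uniform_ref = np.where(np.abs(centers) < np.sqrt(3), 1 / (2 * np.sqrt(3)), 0)
    gauss_ref = np.exp(-centers**2 / 2) / np.sqrt(2 * np.pi)
    d0 = np.sum((h - uniform_ref) ** 2)
    d1 = np.sum((h - gauss_ref) ** 2)
    return 0 if d0 < d1 else 1

bits = [0, 1, 1, 0, 1]
decoded = [detect(send(b)) for b in bits]
assert decoded == bits
```

Since both symbols have the same mean and power, an ordinary AM/FM-style detector sees nothing, which is the property the abstract suggests exploiting for steganography.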
Naturalness preservation image contrast enhancement via histogram modification
NASA Astrophysics Data System (ADS)
Tian, Qi-Chong; Cohen, Laurent D.
2018-04-01
Contrast enhancement is a technique for improving image contrast to obtain better visual quality. Since many existing contrast enhancement algorithms tend to produce over-enhanced results, naturalness preservation needs to be considered in the framework of image contrast enhancement. This paper proposes a naturalness-preserving contrast enhancement method, which adopts histogram matching to improve the contrast and uses image quality assessment to automatically select the optimal target histogram. Both contrast improvement and naturalness preservation are considered in the target histogram, so the method avoids the over-enhancement problem. In the proposed method, the optimal target histogram is a weighted sum of the original histogram, a uniform histogram, and a Gaussian-shaped histogram. The structural metric and the statistical naturalness metric are then used to determine the weights of the corresponding histograms. Finally, the contrast-enhanced image is obtained by matching the optimal target histogram. Experiments demonstrate that the proposed method outperforms the compared histogram-based contrast enhancement algorithms.
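A minimal sketch of the matching step follows, with fixed placeholder weights standing in for the quality-metric-driven weight selection the paper describes:

```python
import numpy as np

def blended_target(hist_orig, w=(0.5, 0.3, 0.2), sigma=40.0):
    """Target histogram as a weighted sum of the original, a uniform,
    and a Gaussian-shaped histogram.  The weights and Gaussian width
    here are placeholders, not metric-selected values."""
    L = hist_orig.size
    uniform = np.full(L, 1.0 / L)
    levels = np.arange(L)
    gauss = np.exp(-((levels - L / 2) ** 2) / (2 * sigma**2))
    gauss /= gauss.sum()
    target = w[0] * hist_orig + w[1] * uniform + w[2] * gauss
    return target / target.sum()

def match_histogram(img_u8, target):
    """Map gray levels so the image's cumulative histogram follows the
    target's cumulative histogram."""
    hist, _ = np.histogram(img_u8, bins=256, range=(0, 256))
    cdf_src = np.cumsum(hist) / hist.sum()
    cdf_tgt = np.cumsum(target)
    lut = np.searchsorted(cdf_tgt, cdf_src).clip(0, 255).astype(np.uint8)
    return lut[img_u8]

rng = np.random.default_rng(1)
img = rng.integers(100, 140, size=(64, 64), dtype=np.uint8)  # low contrast
h0, _ = np.histogram(img, bins=256, range=(0, 256))
out = match_histogram(img, blended_target(h0 / h0.sum()))
assert out.max() - out.min() > img.max() - img.min()  # contrast increased
```

Keeping the original histogram as one blend component is what tempers the enhancement: with w = (1, 0, 0) the image is unchanged, while w = (0, 1, 0) reduces to classical histogram equalization.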
Fitting monthly Peninsula Malaysian rainfall using Tweedie distribution
NASA Astrophysics Data System (ADS)
Yunus, R. M.; Hasan, M. M.; Zubairi, Y. Z.
2017-09-01
In this study, the Tweedie distribution was used to fit monthly rainfall data from 24 monitoring stations of Peninsular Malaysia for the period from January 2008 to April 2015. The aim of the study is to determine whether distributions within the Tweedie family fit the monthly Malaysian rainfall data well. Within the Tweedie family, the gamma distribution is generally used for fitting rainfall totals; however, the Poisson-gamma distribution is more useful for describing two important features of the rainfall pattern, the occurrences (dry months) and the amounts (wet months). First, the appropriate distribution of the monthly rainfall was identified within the Tweedie family for each station. Then, the Tweedie generalised linear model (GLM) with no explanatory variable was used to model the monthly rainfall data. Graphical representation was used to assess model appropriateness. The QQ plots of quantile residuals show that the Tweedie models fit the monthly rainfall data better for the majority of stations on the west coast and inland than for those on the east coast of the Peninsula. This significant finding suggests that the best-fitting distribution depends on the geographical location of the monitoring station. In this paper, a simple model is developed for generating synthetic rainfall data for use in various areas, including agriculture and irrigation. We have shown that data simulated using the Tweedie distribution have a frequency histogram fairly similar to that of the actual data. Both the mean number of rainfall events and the mean amount of rain for a month were estimated simultaneously in the case where the Poisson-gamma distribution fits the data reasonably well. Thus, this work complements previous studies that fit the rainfall amount and the occurrence of rainfall events separately, each to a different distribution.
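The Poisson-gamma member of the Tweedie family, which the authors favor because it handles dry and wet months in one model, can be simulated directly as a compound process; the parameters below are illustrative, not values fitted to the Malaysian stations:

```python
import numpy as np

rng = np.random.default_rng(7)

def monthly_rainfall(mu_events, shape, scale, n_months):
    """Simulate monthly totals from a Poisson-gamma (Tweedie, 1 < p < 2)
    process: a Poisson number of rain events per month, each event
    contributing a gamma-distributed amount.  Zero events give an exact
    zero total, reproducing dry months."""
    events = rng.poisson(mu_events, size=n_months)
    return np.array([rng.gamma(shape, scale, k).sum() for k in events])

totals = monthly_rainfall(mu_events=4.0, shape=2.0, scale=30.0, n_months=5000)
assert (totals == 0).any()                # dry months occur with positive mass
expected_mean = 4.0 * 2.0 * 30.0          # E[N] * E[amount per event]
assert abs(totals.mean() - expected_mean) / expected_mean < 0.1
```

The point mass at exactly zero is what a plain gamma fit cannot represent, which is why the Poisson-gamma case is singled out above for stations with dry months.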
NASA Astrophysics Data System (ADS)
Pavel-Mititean, Luciana M.; Rowbottom, Carl G.; Hector, Charlotte L.; Partridge, Mike; Bortfeld, Thomas; Schlegel, Wolfgang
2004-06-01
A geometric model is presented which allows calculation of the dosimetric consequences of rectal motion in prostate radiotherapy. Variations in the position of the rectum are measured by repeat CT scanning during the courses of treatment of five patients. Dose distributions are calculated by applying the same conformal treatment plan to each imaged fraction and rectal dose-surface histograms produced. The 2D model allows isotropic expansion and contraction in the plane of each CT slice. By summing the dose to specific volume elements tracked by the model, composite dose distributions are produced that explicitly include measured inter-fraction motion for each patient. These are then used to estimate effective dose-surface histograms (DSHs) for the entire treatment. Results are presented showing the magnitudes of the measured target and rectal motion and showing the effects of this motion on the integral dose to the rectum. The possibility of using such information to calculate normal tissue complication probabilities (NTCP) is demonstrated and discussed.
NASA Astrophysics Data System (ADS)
Alimi, Isiaka; Shahpari, Ali; Ribeiro, Vítor; Sousa, Artur; Monteiro, Paulo; Teixeira, António
2017-05-01
In this paper, we present experimental results on channel characterization of a single-input single-output (SISO) free-space optical (FSO) communication link, based on channel measurements. The histograms of the FSO channel samples and the log-normal distribution fittings are presented along with the measured scintillation index. Furthermore, we extend our studies to diversity schemes and propose a closed-form expression for determining the ergodic channel capacity of multiple-input multiple-output (MIMO) FSO communication systems over atmospheric turbulence fading channels. The proposed empirical model is based on the SISO FSO channel characterization. Also, the scintillation effects on the system performance are analyzed, and results for different turbulence conditions are presented. Moreover, we observed that the histograms of the FSO channel samples collected from a 1548.51 nm link fit log-normal distributions well, and the proposed model for MIMO FSO channel capacity agrees with the simulation results in terms of normalized mean-square error (NMSE).
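The two quantities the measurements center on, the scintillation index and the log-normal fit, can be cross-checked on synthetic data: for log-normal irradiance with log-variance σ², the scintillation index should equal exp(σ²) − 1. The σ² used below is an illustrative weak-turbulence value, not the measured link parameter.

```python
import numpy as np

rng = np.random.default_rng(13)

# Synthetic received-irradiance samples under weak turbulence:
# log-normal with log-variance sigma2, normalized so E[I] = 1.
sigma2 = 0.1
I = rng.lognormal(mean=-sigma2 / 2, sigma=np.sqrt(sigma2), size=50000)

# Scintillation index: normalized irradiance variance.
si = I.var() / I.mean() ** 2
assert abs(si - (np.exp(sigma2) - 1)) < 0.01  # log-normal identity

# Log-normal fit recovered from the samples (moments of log I).
mu_hat, s2_hat = np.log(I).mean(), np.log(I).var()
assert abs(s2_hat - sigma2) < 0.01
```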
Genetic Engineering of Optical Properties of Biomaterials
NASA Astrophysics Data System (ADS)
Gourley, Paul; Naviaux, Robert; Yaffe, Michael
2008-03-01
Baker's yeast cells are easily cultured and can be manipulated genetically to produce large numbers of bioparticles (cells and mitochondria) with controllable size and optical properties. We have recently employed nanolaser spectroscopy to study the refractive index of individual cells and isolated mitochondria from two mutant strains. Results show that biomolecular changes induced by mutation can produce bioparticles with radical changes in refractive index. Wild-type mitochondria exhibit a distribution with a well-defined mean and small variance. In striking contrast, mitochondria from one mutant strain produced a histogram that is highly collapsed with a ten-fold decrease in the mean and standard deviation. In a second mutant strain we observed an opposite effect with the mean nearly unchanged but the variance increased nearly a thousand-fold. Both histograms could be self-consistently modeled with a single, log-normal distribution. The strains were further examined by 2-dimensional gel electrophoresis to measure changes in protein composition. All of these data show that genetic manipulation of cells represents a new approach to engineering optical properties of bioparticles.
NASA Astrophysics Data System (ADS)
Lancaster, S. T.; Frueh, W. T.
2011-12-01
A large number (N = 351) of radiocarbon dates of charcoal from valley-bottom sediments in headwater valleys of the southern Oregon Coast Range provides the basis for a new index of fire frequency during the past 17,000 years in this steep landscape covered by dense coniferous forest. Study areas were chosen for their relative lack of recent forest disturbance by harvest or fire, and sampling of stream banks and terrace risers was random, weighted by deposit volume and bank or riser area. This sampling methodology was designed to characterize sediment residence times within valley-bottom storage, and the overall shape of the calibrated age distribution is therefore assumed representative of the dependence of charcoal preservation probability on calibrated age. A proxy record of fire history in the study areas is obtained by fitting a gamma distribution to the weighted mean calibrated charcoal ages by the method of moments; calculating the relative difference between the fit and the normalized histogram, with 50-year bin-widths, of charcoal ages; and smoothing that relative difference with a gaussian distribution, the standard deviation of which is at least two bin-widths and inversely proportional to the value of the fit distribution at larger ages. The calibrated charcoal age mean and variance of 1900 yrs BP and 7.39 x 106 yr2, respectively, yield shape and scale parameters of the fit gamma distribution of 0.490 and 3880 yrs, respectively. This heavy-tailed distribution indicates that probabilities of charcoal evacuation are not simply proportional to relative volume of encasing sediment deposits but, rather, decrease with deposit age. The smoothed proxy record of relative fire frequency has a global maximum at 7700 BP and prominent local maxima at 600 BP and 5700 BP, in order of decreasing magnitude; a global minimum at 4500 BP and local minimum at 1800 BP roughly bracket a period of fluctuating but relatively low fire frequency during the period 5000-1500 BP. 
Although resolution in the late glacial to early Holocene is limited, the record shows a high relative fire frequency during the late glacial before dipping at 10,000-9000 BP. The 7700 BP maximum and 1800 BP minimum are consistent with another fire history from lake sediments northeast of our sites in the Oregon Coast Range. Other features appear to contradict that record but to support climate-change inferences based on other climate proxies.
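The method-of-moments gamma fit quoted above can be reproduced directly from the reported mean and variance: the shape parameter is mean²/variance and the scale is variance/mean.

```python
import numpy as np

def gamma_mom(mean, var):
    """Method-of-moments gamma fit: shape k = mean^2/var, scale = var/mean."""
    return mean**2 / var, var / mean

# Values reported above for the calibrated charcoal ages:
k, theta = gamma_mom(1900.0, 7.39e6)
assert abs(k - 0.490) < 0.005      # reported shape parameter
assert abs(theta - 3880) < 20      # reported scale parameter (yrs)
```

A shape parameter well below 1 is what makes the fitted distribution heavy-tailed at zero and long-tailed at large ages, supporting the inference that charcoal evacuation probability decreases with deposit age.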
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sugano, Yasutaka; Mizuta, Masahiro; Takao, Seishin
Purpose: Radiotherapy of solid tumors has been performed with various fractionation regimens such as multi- and hypofractionation. However, the ability to optimize the fractionation regimen considering the physical dose distribution remains insufficient. This study aims to optimize the fractionation regimen, for which the authors propose a graphical method for selecting the optimal number of fractions (n) and dose per fraction (d) based on dose–volume histograms for the tumor and normal tissues of organs around the tumor. Methods: Modified linear-quadratic models were employed to estimate the radiation effects on the tumor and an organ at risk (OAR), where the repopulation of the tumor cells and the linearity of the dose-response curve in the high dose range of the surviving fraction were considered. The minimization problem for the damage effect on the OAR was solved by a graphical method under the constraint that the radiation effect on the tumor is fixed. Here, the damage effect on the OAR was estimated based on the dose–volume histogram. Results: It was found that optimization of the fractionation scheme incorporating the dose–volume histogram is possible by employing appropriate cell-survival models. The graphical method, considering the repopulation of tumor cells and a rectilinear response in the high dose range, enables the derivation of the optimal number of fractions and dose per fraction. For example, in the treatment of prostate cancer, the optimal fractionation was suggested to lie in the range of 8–32 fractions with a daily dose of 2.2–6.3 Gy. Conclusions: It is possible to optimize the number of fractions and dose per fraction based on the physical dose distribution (i.e., dose–volume histogram) by the graphical method considering the effects on the tumor and OARs around the tumor. This method may provide a new guideline for optimizing the fractionation regimen for physics-guided fractionation.
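As a simplified illustration of this kind of optimization (using the standard linear-quadratic effect rather than the authors' modified model, and illustrative α/β values and constraint, all assumptions of this sketch), one can search over n, solve the quadratic for the d that holds the tumor effect fixed, and keep the n minimizing the OAR effect:

```python
import numpy as np

# Standard LQ effect E = n*d*(1 + d/(alpha/beta)); the paper's modified
# model additionally handles repopulation and high-dose linearity,
# omitted here for brevity.  All parameter values are illustrative.
ab_tumor, ab_oar = 1.5, 3.0      # alpha/beta in Gy
target_effect = 120.0            # fixed tumor effect (constraint)
oar_dose_fraction = 0.7          # OAR assumed to see 70% of tumor dose

best = None
for n in range(1, 41):
    # Solve n*d*(1 + d/ab) = E for d, a quadratic in d.
    d = (-1 + np.sqrt(1 + 4 * target_effect / (n * ab_tumor))) * ab_tumor / 2
    d_oar = oar_dose_fraction * d
    oar_effect = n * d_oar * (1 + d_oar / ab_oar)
    if best is None or oar_effect < best[2]:
        best = (n, d, oar_effect)

n_opt, d_opt, _ = best
# With these illustrative values the plain LQ search runs to the extreme
# of a single large fraction, because the assumed tumor alpha/beta is
# below the OAR's.
assert n_opt == 1 and d_opt > 0
```

This illustrates why the corrections matter: without the repopulation and high-dose-linearity terms, the optimum degenerates to an extreme regimen, whereas the authors' modified model confines it to the 8–32 fraction range quoted above.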
SU-F-I-45: An Automated Technique to Measure Image Contrast in Clinical CT Images
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanders, J; Abadi, E; Meng, B
Purpose: To develop and validate an automated technique for measuring image contrast in chest computed tomography (CT) exams. Methods: An automated computer algorithm was developed to measure the distribution of Hounsfield units (HUs) inside four major organs: the lungs, liver, aorta, and bones. These organs were first segmented or identified using computer vision and image processing techniques. Regions of interest (ROIs) were automatically placed inside the lungs, liver, and aorta, and histograms of the HUs inside the ROIs were constructed. The mean and standard deviation of each histogram were computed for each CT dataset. Comparison of the mean and standard deviation of the HUs in the different organs provides different contrast values. The ROI for the bones is simply the segmentation mask of the bones. Since the histogram for bones does not follow a Gaussian distribution, the 25th and 75th percentiles were computed instead of the mean. The sensitivity and accuracy of the algorithm were investigated by comparing the automated measurements with manual measurements. Fifteen contrast-enhanced and fifteen non-contrast-enhanced chest CT clinical datasets were examined in the validation procedure. Results: The algorithm successfully measured the histograms of the four organs in both contrast- and non-contrast-enhanced chest CT exams. The automated measurements were in agreement with manual measurements. The algorithm has sufficient sensitivity, as indicated by the near-unity slope of the automated-versus-manual measurement plots. Furthermore, the algorithm has sufficient accuracy, as indicated by high coefficient of determination (R2) values ranging from 0.879 to 0.998. Conclusion: Patient-specific image contrast can be measured from clinical datasets. The algorithm can be run on both contrast-enhanced and non-enhanced clinical datasets. The method can be applied to automatically assess the contrast characteristics of clinical chest CT images and quantify dependencies that may not be captured in phantom data.
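The per-organ histogram summary can be sketched as follows; the masks and HU values are synthetic stand-ins for segmented clinical data, and the gamma-shaped "bone" values are only a placeholder for a skewed histogram:

```python
import numpy as np

def organ_stats(hu, mask, gaussian_like=True):
    """Summarize the HU histogram inside an organ mask.  Lungs, liver,
    and aorta are summarized by mean/std; bone, whose HU histogram is
    not Gaussian, by the 25th and 75th percentiles."""
    vals = hu[mask]
    if gaussian_like:
        return vals.mean(), vals.std()
    return np.percentile(vals, 25), np.percentile(vals, 75)

rng = np.random.default_rng(3)
hu = rng.normal(40, 10, size=(64, 64))       # synthetic soft-tissue HUs
bone = rng.gamma(2.0, 200.0, size=(64, 64))  # skewed, bone-like HUs
mask = np.ones((64, 64), dtype=bool)         # stand-in segmentation mask

mean, std = organ_stats(hu, mask)
p25, p75 = organ_stats(bone, mask, gaussian_like=False)
assert 35 < mean < 45 and p25 < p75
```

Differences between these per-organ means (e.g. aorta minus liver) are what yield the patient-specific contrast values described above.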
Document image cleanup and binarization
NASA Astrophysics Data System (ADS)
Wu, Victor; Manmatha, Raghaven
1998-04-01
Image binarization is a difficult task for documents with text over textured or shaded backgrounds, poor contrast, and/or considerable noise. Current optical character recognition (OCR) and document analysis technology does not handle such documents well. We have developed a simple yet effective algorithm for document image clean-up and binarization. The algorithm consists of two basic steps. In the first step, the input image is smoothed using a low-pass filter. The smoothing operation enhances the text relative to any background texture, because background texture normally has higher spatial frequency than text; it also removes speckle noise. In the second step, the intensity histogram of the smoothed image is computed and a threshold selected automatically as follows. For black text, the first peak of the histogram corresponds to text, and thresholding the image at the valley between the first and second peaks of the histogram binarizes the image well. In order to reliably identify the valley, the histogram is smoothed by a low-pass filter before the threshold is computed. The algorithm has been applied to some 50 images from a wide variety of sources: digitized video frames, photos, newspapers, advertisements in magazines or sales flyers, personal checks, etc. These images contain 21,820 characters and 4,406 words; 91 percent of the characters and 86 percent of the words were successfully cleaned up and binarized. A commercial OCR engine was applied to the binarized text when it consisted of OCR-recognizable fonts. The recognition rate was 84 percent for the characters and 77 percent for the words.
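The histogram-valley step can be sketched on a synthetic two-mode image. The moving-average smoothing and the minimum peak separation are implementation choices of this sketch, not taken from the paper; for robustness on noisy data it picks the two dominant well-separated maxima rather than literally the first two peaks.

```python
import numpy as np

def valley_threshold(img_u8, smooth_width=9, min_sep=30):
    """Smooth the gray-level histogram with a moving-average (low-pass)
    filter, locate the two dominant well-separated peaks, and threshold
    at the valley between them."""
    hist, _ = np.histogram(img_u8, bins=256, range=(0, 256))
    kernel = np.ones(smooth_width) / smooth_width
    h = np.convolve(hist, kernel, mode="same")
    maxima = [i for i in range(1, 255) if h[i] >= h[i - 1] and h[i] > h[i + 1]]
    maxima.sort(key=lambda i: h[i], reverse=True)
    p1 = maxima[0]
    p2 = next(i for i in maxima[1:] if abs(i - p1) > min_sep)
    lo, hi = sorted((p1, p2))
    return lo + int(np.argmin(h[lo:hi + 1]))

rng = np.random.default_rng(5)
text = rng.normal(40, 8, 2000)     # dark text pixels
paper = rng.normal(180, 15, 8000)  # bright, noisy background
img = np.clip(np.concatenate([text, paper]), 0, 255).astype(np.uint8)
t = valley_threshold(img)
assert 60 < t < 160  # the valley falls between the two modes
```

Binarization is then simply `img > t`, with the low-pass image smoothing of the first step applied beforehand on real documents.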
Correlations between the frequencies of twin kHz QPOs and spins of neutron stars in LMXBs
NASA Astrophysics Data System (ADS)
Wang, De-Hua; Zhang, Cheng-Min; Qu, Jin-Lu; Yang, Yi-Yan
2018-02-01
We investigate the correlation between the frequencies of twin kilohertz quasi-periodic oscillations (kHz QPOs) and neutron star (NS) spins in low-mass X-ray binaries (LMXBs), based on the data sets of 12 sources with simultaneously detected twin kHz QPOs and NS spins, and find that the histogram of the ratio between the frequency difference of twin kHz QPOs (Δν ≡ ν2 - ν1) and NS spin νs shows a non-uniform distribution with a gap at Δν/νs ∼ 0.65. We try to classify the 12 sources into two categories according to this gap: (I) slow rotators with 〈νs〉 ∼ 311 Hz, XTE J1807.4-294, 4U 1915-05, IGR J17191-2821, 4U 1702-43, 4U 1728-34 and 4U 0614+09, which follow the relation Δν/νs > 0.65; (II) fast rotators with 〈νs〉 ∼ 546 Hz, SAX J1808.4-3658, KS 1731-260, Aql X-1, 4U 1636-53, SAX J1750.8-2900 and 4U 1608-52, which satisfy the relation Δν/νs < 0.65. However, linear fits of the Δν versus νs relations of groups (I) and (II) do not establish any definite correlation. We suggest that this phenomenon may arise from the fact that most measured kHz QPOs and spins satisfy the conditions 1.1 νs ≤ ν2 < 1300 Hz, with Δν decreasing with ν2. Apparently, the diversified distribution of Δν/νs refutes the simple beat-frequency model, and the statistical correlations between the twin kHz QPOs and NS spins may arise from the magnetosphere-disc boundary environments, e.g. the co-rotation radius and NS radius, that modulate the occurrences of X-ray signals. Furthermore, we also find that the distribution of the ratio ν2/ν1 clusters around 〈ν2/ν1〉 ∼ 3:2, which shows no obvious correlation with NS spins.
Adaptive local thresholding for robust nucleus segmentation utilizing shape priors
NASA Astrophysics Data System (ADS)
Wang, Xiuzhong; Srinivas, Chukka
2016-03-01
This paper describes a novel local thresholding method for foreground detection. First, a Canny edge detection method is used for initial edge detection. Then, tensor voting is applied on the initial edge pixels, using a nonsymmetric tensor field tailored to encode prior information about nucleus size, shape, and intensity spatial distribution. Tensor analysis is then performed to generate the saliency image and, based on that, the refined edge. Next, the image domain is divided into blocks. In each block, at least one foreground and one background pixel are sampled for each refined edge pixel. The saliency weighted foreground histogram and background histogram are then created. These two histograms are used to calculate a threshold by minimizing the background and foreground pixel classification error. The block-wise thresholds are then used to generate the threshold for each pixel via interpolation. Finally, the foreground is obtained by comparing the original image with the threshold image. The effective use of prior information, combined with robust techniques, results in far more reliable foreground detection, which leads to robust nucleus segmentation.
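A stripped-down version of the block-wise thresholding idea follows, with a trivial per-block threshold and nearest-neighbor upsampling standing in for the saliency-weighted classification-error minimization and interpolation steps described above (those substitutions, and the synthetic image, are assumptions of this sketch):

```python
import numpy as np

def blockwise_thresholds(img, grid=4):
    """Compute one threshold per block (here simply the midpoint of the
    block's min and max, a stand-in for the histogram-based
    classification-error minimization), then expand to a per-pixel
    threshold surface by nearest-neighbor upsampling."""
    H, W = img.shape
    bh, bw = H // grid, W // grid
    t = np.empty((grid, grid))
    for i in range(grid):
        for j in range(grid):
            block = img[i * bh:(i + 1) * bh, j * bw:(j + 1) * bw]
            t[i, j] = (block.min() + block.max()) / 2
    return np.kron(t, np.ones((bh, bw)))

rng = np.random.default_rng(9)
shading = np.linspace(50, 150, 64)[None, :] * np.ones((64, 1))  # uneven light
nuclei = (rng.random((64, 64)) < 0.1) * 80.0                    # bright spots
img = shading + nuclei
fg = img > blockwise_thresholds(img)
assert 0.05 < fg.mean() < 0.2  # roughly the planted foreground fraction
```

A single global threshold fails on this image because the background shading spans a wider range than the foreground-background gap; local thresholds adapt to it, which is the motivation for the block-wise scheme.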
Axelsen, Jacob Bock; Yan, Koon-Kiu; Maslov, Sergei
2007-01-01
Background The evolution of the full repertoire of proteins encoded in a given genome is mostly driven by gene duplications, deletions, and sequence modifications of existing proteins. Indirect information about relative rates and other intrinsic parameters of these three basic processes is contained in the proteome-wide distribution of sequence identities of pairs of paralogous proteins. Results We introduce a simple mathematical framework based on a stochastic birth-and-death model that allows one to extract some of this information and apply it to the set of all pairs of paralogous proteins in H. pylori, E. coli, S. cerevisiae, C. elegans, D. melanogaster, and H. sapiens. It was found that the histogram of sequence identities p generated by an all-to-all alignment of all protein sequences encoded in a genome is well fitted with a power-law form ~ p-γ with the value of the exponent γ around 4 for the majority of organisms used in this study. This implies that the intra-protein variability of substitution rates is best described by the Gamma-distribution with the exponent α ≈ 0.33. Different features of the shape of such histograms allow us to quantify the ratio between the genome-wide average deletion/duplication rates and the amino-acid substitution rate. Conclusion We separately measure the short-term ("raw") duplication and deletion rates rdup∗, rdel∗ which include gene copies that will be removed soon after the duplication event and their dramatically reduced long-term counterparts rdup, rdel. High deletion rate among recently duplicated proteins is consistent with a scenario in which they didn't have enough time to significantly change their functional roles and thus are to a large degree disposable. Systematic trends of each of the four duplication/deletion rates with the total number of genes in the genome were analyzed. All but the deletion rate of recent duplicates rdel∗ were shown to systematically increase with Ngenes. 
Abnormally flat shapes of sequence identity histograms observed for yeast and human are consistent with the lineages leading to these organisms having undergone one or more whole-genome duplications. This interpretation is corroborated by our analysis of the genome of Paramecium tetraurelia, where the p^(-4) profile of the histogram is gradually restored by the successive removal of paralogs generated in its four known whole-genome duplication events. PMID:18039386
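A log-log least-squares fit is one simple way to estimate such a power-law exponent from a histogram. The sketch below (plain Python with a synthetic histogram, not the authors' pipeline) recovers γ from bin counts that follow ~ p^(-γ):

```python
import math

def fit_power_law_exponent(bin_centers, counts):
    """Slope of log(count) vs log(p) by least squares; counts ~ p**-gamma."""
    pts = [(math.log(p), math.log(c)) for p, c in zip(bin_centers, counts) if c > 0]
    mx = sum(x for x, _ in pts) / len(pts)
    my = sum(y for _, y in pts) / len(pts)
    slope = (sum((x - mx) * (y - my) for x, y in pts)
             / sum((x - mx) ** 2 for x, _ in pts))
    return -slope

# Synthetic histogram drawn exactly from a gamma = 4 power law
centers = [0.1 * k for k in range(1, 10)]
counts = [p ** -4.0 for p in centers]
gamma = fit_power_law_exponent(centers, counts)  # ≈ 4.0
```

On real sequence-identity histograms, the fit would be restricted to the range of p where the power law actually holds.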
Dao Duc, Khanh; Parutto, Pierre; Chen, Xiaowei; Epsztein, Jérôme; Konnerth, Arthur; Holcman, David
2015-01-01
The dynamics of neuronal networks connected by synaptic dynamics can sustain long periods of depolarization lasting hundreds of milliseconds, such as the Up states recorded during sleep or anesthesia. Yet the underlying mechanism driving these periods remains unclear. We show here within a mean-field model that the residence time of the neuronal membrane potential in cortical Up states does not follow a Poissonian law, but instead presents several peaks. Furthermore, the present modeling approach allows extracting some information about the neuronal network connectivity from the time distribution histogram. Based on a synaptic-depression model, we find that these peaks, which can be observed in histograms of patch-clamp recordings, are not artifacts of electrophysiological measurements, but rather an inherent property of the network dynamics. Analysis of the equations reveals a stable focus located close to the unstable limit cycle, delimiting a region that defines the Up state. The model further shows that the peaks observed in the Up state time distribution are due to winding around the focus before escaping from the basin of attraction. Finally, we use in vivo recordings of intracellular membrane potential and recover from the peak distribution some information about the network connectivity. We conclude that it is possible to recover the network connectivity from the distribution of times that the neuronal membrane voltage spends in Up states.
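The residence times whose histogram is analyzed above can be extracted from a recording by simple thresholding; a minimal sketch (the toy trace and threshold values are hypothetical, not from the study):

```python
def up_state_durations(trace_mv, threshold_mv, dt_ms):
    """Durations (ms) of contiguous runs where the voltage is above threshold."""
    durations, run = [], 0
    for v in trace_mv:
        if v >= threshold_mv:
            run += 1
        elif run:
            durations.append(run * dt_ms)
            run = 0
    if run:
        durations.append(run * dt_ms)
    return durations

# Toy membrane-potential trace (mV) with two depolarized episodes
trace = [-70, -50, -50, -50, -70, -55, -52, -70]
durs = up_state_durations(trace, threshold_mv=-60, dt_ms=1.0)
# a histogram of `durs` over many episodes is the residence-time distribution
```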
Distributions-per-level: a means of testing level detectors and models of patch-clamp data.
Schröder, I; Huth, T; Suitchmezian, V; Jarosik, J; Schnell, S; Hansen, U P
2004-01-01
Level or jump detectors generate the reconstructed time series from a noisy record of patch-clamp current. The reconstructed time series is used to create dwell-time histograms for the kinetic analysis of the Markov model of the investigated ion channel. It is shown here that some additional lines in the software of such a detector can provide a powerful new means of patch-clamp analysis. For each current level that can be recognized by the detector, an array is declared. The new software assigns every data point of the original time series to the array that belongs to the actual state of the detector. From the data sets in these arrays distributions-per-level are generated. Simulated and experimental time series analyzed by Hinkley detectors are used to demonstrate the benefits of these distributions-per-level. First, they can serve as a test of the reliability of jump and level detectors. Second, they can reveal beta distributions as resulting from fast gating that would usually be hidden in the overall amplitude histogram. Probably the most valuable feature is that the malfunctions of the Hinkley detectors turn out to depend on the Markov model of the ion channel. Thus, the errors revealed by the distributions-per-level can be used to distinguish between different putative Markov models of the measured time series.
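The core bookkeeping described — one array per detector level, with each raw data point routed to the array of the detector's current state — can be sketched in a few lines (illustrative data, not the authors' software):

```python
def distributions_per_level(samples, detector_levels):
    """Route each raw sample into the array of the detector's current level."""
    per_level = {}
    for x, level in zip(samples, detector_levels):
        per_level.setdefault(level, []).append(x)
    return per_level

# Hypothetical noisy current trace and the level detector's reconstruction
current = [0.1, 0.0, 1.1, 0.9, 1.0, 0.2]
levels  = [0,   0,   1,   1,   1,   0]
arrays = distributions_per_level(current, levels)
# arrays[0] and arrays[1] can now each be histogrammed separately, e.g. to
# expose beta distributions hidden in the overall amplitude histogram
```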
Reliability of dose volume constraint inference from clinical data.
Lutz, C M; Møller, D S; Hoffmann, L; Knap, M M; Alber, M
2017-04-21
Dose volume histogram points (DVHPs) frequently serve as dose constraints in radiotherapy treatment planning. An experiment was designed to investigate the reliability of DVHP inference from clinical data for multiple cohort sizes and complication incidence rates. The experimental background was radiation pneumonitis in non-small cell lung cancer (NSCLC) and the DVHP inference method was based on logistic regression. From 102 NSCLC real-life dose distributions and a postulated DVHP model, an 'ideal' cohort was generated where the most predictive model was equal to the postulated model. A bootstrap and a Cohort Replication Monte Carlo (CoRepMC) approach were applied to create 1000 equally sized populations each. The cohorts were then analyzed to establish inference frequency distributions. This was applied to nine scenarios for cohort sizes of 102 (1), 500 (2) to 2000 (3) patients (by sampling with replacement) and three postulated DVHP models. The bootstrap was repeated for a 'non-ideal' cohort, where the most predictive model did not coincide with the postulated model. The bootstrap produced chaotic results for all models of cohort size 1 for both the ideal and non-ideal cohorts. For cohort sizes 2 and 3, the distributions for all populations were more concentrated around the postulated DVHP. For the CoRepMC, the inference frequency increased with cohort size and incidence rate. Correct inference rates >85% were only achieved by cohorts with more than 500 patients. Both bootstrap and CoRepMC indicate that inference of the correct or approximate DVHP for typical cohort sizes is highly uncertain. CoRepMC results were less spurious than bootstrap results, demonstrating the large influence that randomness in dose-response has on the statistical analysis.
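For context, a dose volume histogram point such as V20 is simply the fractional volume receiving at least a threshold dose; a minimal sketch (toy voxel doses, not the study's inference code):

```python
def v_dose(dose_voxels, threshold):
    """Fraction of voxels receiving at least `threshold` Gy (one DVH point)."""
    return sum(d >= threshold for d in dose_voxels) / len(dose_voxels)

def cumulative_dvh(dose_voxels, step=1.0):
    """(dose, volume-fraction) pairs tracing the cumulative DVH curve."""
    curve = []
    d = 0.0
    while d <= max(dose_voxels):
        curve.append((d, v_dose(dose_voxels, d)))
        d += step
    return curve

voxel_doses = [5.0, 12.0, 22.0, 30.0, 18.0, 25.0]  # Gy, hypothetical
v20 = v_dose(voxel_doses, 20.0)  # 3 of 6 voxels receive >= 20 Gy -> 0.5
```

A logistic dose-response model would then be fit against such DVH points across the patient cohort.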
Normative database of donor keratographic readings in an eye-bank setting.
Lewis, Jennifer R; Bogucki, Jennifer M; Mahmoud, Ashraf M; Lembach, Richard G; Roberts, Cynthia J
2010-04-01
To generate a normative donor topographic database from rasterstereography images of whole globes acquired in an eye-bank setting with minimal manipulation or handling. Eye-bank laboratory. In a retrospective study, rasterstereography topographic images that had been prospectively collected in duplicate of donor eyes received by the Central Ohio Lions Eye Bank between 1997 and 1999 were analyzed. Best-fit sphere (BFS) and simulated keratometry (K) values were extracted. These values were recalculated after application of custom software to correct any tilt of the mapped surfaces relative to the image plane. The mean value variances between right eyes and left eyes, between consecutive scans, and after untilting were analyzed by repeated-measures analysis of variance and t tests (P
Kromin, A A; Dvoenko, E E; Zenina, O Yu
2016-07-01
Reflection of the state of hunger in the impulse activity of nose wing muscles and upper esophageal sphincter muscles was studied in chronic experiments on rabbits subjected to 24-h food deprivation, in the absence of locomotion and during search behavior. In the absence of apparent behavioral activity, including sniffing, the alae nasi muscles of hungry rabbits constantly generated bursts of action potentials synchronous with breathing, while the upper esophageal sphincter muscles exhibited regular aperiodic low-amplitude impulse activity of tonic type. The latent form of food motivation was reflected in the temporal organization of impulse activity of the alae nasi muscles as a bimodal distribution of interpulse intervals, and in the temporal structure of impulse activity of the upper esophageal sphincter muscles as a monomodal distribution. The latent form of food motivation was also manifested in the temporal organization of the periods of the burst-like action potential rhythm generated by the alae nasi muscles as a monomodal distribution characterized by a high dispersion of respiratory cycle periods. In the absence of physical activity, hungry animals sporadically exhibited sniffing activity, manifested as a change from burst-like impulse activity of the alae nasi muscles to a single-burst activity type with a bimodal distribution of interpulse intervals and a monomodal distribution of the periods of the burst-like action potential rhythm, whose maximum was shifted toward lower values, causing an increased respiratory rate. At the same time, the monomodal temporal structure of impulse activity of the upper esophageal sphincter muscles did not change.
With increasing food motivation during search behavior, the temporal structure of the periods of the burst-like action potential rhythm generated by the alae nasi muscles became similar to that observed during sniffing not accompanied by locomotion, which is typical of an increased respiratory rhythm frequency. Increased hunger motivation was reflected in the temporal structure of impulse activity of the upper esophageal sphincter muscles as a shift toward lower values of the maximum of the monomodal distribution of interpulse intervals on the histogram, resulting in a higher impulse activity frequency. The simultaneous increase in the frequency of action potential bursts generated by the alae nasi muscles and of the regular impulse activity of the upper esophageal sphincter muscles is a reliable criterion of enhanced food motivation during search behavior in rabbits.
Gaing, Byron; Sigmund, Eric E; Huang, William C; Babb, James S; Parikh, Nainesh S; Stoffel, David; Chandarana, Hersh
2015-03-01
The aim of this study was to determine if voxel-based histogram analysis of intravoxel incoherent motion imaging (IVIM) parameters can differentiate various subtypes of renal tumors, including benign and malignant lesions. A total of 44 patients with renal tumors who underwent surgery and had histopathology available were included in this Health Insurance Portability and Accountability Act-compliant, institutional review board-approved, single-institution prospective study. In addition to the routine renal magnetic resonance imaging examination performed on a 1.5-T system, all patients were imaged with axial diffusion-weighted imaging using 8 b values (range, 0-800 s/mm²). A biexponential model was fitted to the diffusion signal data using a segmented algorithm to extract the IVIM parameters perfusion fraction (fp), tissue diffusivity (Dt), and pseudodiffusivity (Dp) for each voxel. Mean and histogram measures of heterogeneity (standard deviation, skewness, and kurtosis) of the IVIM parameters were correlated with pathology results of tumor subtype using unequal variance t tests to compare subtypes in terms of each measure. Correction for multiple comparisons was accomplished using the Tukey honestly significant difference procedure. A total of 44 renal tumors including 23 clear cell (ccRCC), 4 papillary (pRCC), 5 chromophobe, and 5 cystic renal cell carcinomas, as well as benign lesions, 4 oncocytomas (Onc) and 3 angiomyolipomas (AMLs), were included in our analysis. Mean IVIM parameters fp and Dt differentiated 8 of 15 pairs of renal tumors. Histogram analysis of IVIM parameters differentiated 9 of 15 subtype pairs. One subtype pair (ccRCC vs pRCC) was differentiated by mean analysis but not by histogram analysis. However, 2 other subtype pairs (AML vs Onc and ccRCC vs Onc) were differentiated by histogram distribution parameters exclusively. The standard deviation of Dt [σ(Dt)] differentiated ccRCC (0.362 ± 0.136 × 10⁻³ mm²/s) from AML (0.199 ± 0.043 × 10⁻³ mm²/s) (P = 0.002).
Kurtosis of fp separated Onc (2.767 ± 1.299) from AML (-0.325 ± 0.279; P = 0.001), ccRCC (0.612 ± 1.139; P = 0.042), and pRCC (0.308 ± 0.730; P = 0.025). Intravoxel incoherent motion imaging parameters with inclusion of histogram measures of heterogeneity can help differentiate malignant from benign lesions as well as various subtypes of renal cancers.
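The histogram measures of heterogeneity used here (standard deviation, skewness, kurtosis) are standardized central moments of the voxel-value distribution; a plain-Python sketch:

```python
import math

def heterogeneity_metrics(values):
    """Mean, SD, skewness and excess kurtosis of a voxel-value distribution."""
    n = len(values)
    mean = sum(values) / n
    m2 = sum((v - mean) ** 2 for v in values) / n
    m3 = sum((v - mean) ** 3 for v in values) / n
    m4 = sum((v - mean) ** 4 for v in values) / n
    sd = math.sqrt(m2)
    skew = m3 / sd ** 3
    kurt = m4 / m2 ** 2 - 3.0  # excess kurtosis: 0 for a normal distribution
    return mean, sd, skew, kurt

# Illustrative voxel values from a hypothetical lesion ROI
mean, sd, skew, kurt = heterogeneity_metrics([1.0, 2.0, 3.0, 4.0, 5.0])
# symmetric sample -> skewness is 0
```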
NASA Astrophysics Data System (ADS)
Huete, Alfredo R.; Didan, Kamel; van Leeuwen, Willem J. D.; Vermote, Eric F.
1999-12-01
Vegetation indices have emerged as important tools in the seasonal and inter-annual monitoring of the Earth's vegetation. They are radiometric measures of the amount and condition of vegetation. In this study, the Sea-viewing Wide Field-of-View sensor (SeaWiFS) is used to investigate coarse resolution monitoring of vegetation with multiple indices. A 30-day series of SeaWiFS data, corrected for molecular scattering and absorption, was composited to cloud-free, single channel reflectance images. The normalized difference vegetation index (NDVI) and an optimized index, the enhanced vegetation index (EVI), were computed over various 'continental' regions. The EVI had a normal distribution of values over the continental set of biomes while the NDVI was skewed toward higher values and saturated over forested regions. The NDVI resembled the skewed distributions found in the red band while the EVI resembled the normal distributions found in the NIR band. The EVI minimized smoke contamination over extensive portions of the tropics. As a result, major biome types with continental regions were discriminable in both the EVI imagery and histograms, whereas smoke and saturation considerably degraded the NDVI histogram structure preventing reliable discrimination of biome types.
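Both indices are simple band combinations; a sketch of the standard formulas (the EVI coefficients shown are the commonly published MODIS-style values, assumed here; reflectances are made up):

```python
def ndvi(nir, red):
    """Normalized difference vegetation index."""
    return (nir - red) / (nir + red)

def evi(nir, red, blue, G=2.5, C1=6.0, C2=7.5, L=1.0):
    """Enhanced vegetation index; blue band helps correct aerosol effects."""
    return G * (nir - red) / (nir + C1 * red - C2 * blue + L)

# Hypothetical surface reflectances for dense vegetation
v_ndvi = ndvi(nir=0.50, red=0.05)
v_evi = evi(nir=0.50, red=0.05, blue=0.03)
```

Computing either index per pixel and histogramming the result is how the biome-level distributions discussed above are built.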
Particle size analysis of some water/oil/water multiple emulsions.
Ursica, L; Tita, D; Palici, I; Tita, B; Vlaia, V
2005-04-29
Particle size analysis gives useful information about the structure and stability of multiple emulsions, which are important characteristics of these systems. It also enables observation of the growth of particles dispersed in multiple emulsions and, accordingly, the evolution of their size over time. The size of the multiple particles in seven water/oil/water (W/O/W) emulsions was determined by measuring the particle sizes observed during microscopic examination. To describe the particle size distribution, the values of two parameters that define particle size were calculated: the arithmetic mean diameter and the median diameter. The results of the particle size analysis of the seven W/O/W multiple emulsions studied are presented as histograms of the distribution density immediately, 1 month, and 3 months after the preparation of each emulsion, as well as through the mean and median particle diameters. A comparative study of the distribution histograms and of the mean and median diameters of the W/O/W multiple particles indicates that the prepared emulsions are fine and very fine dispersions that are stable, although the abovementioned diameters grow over the course of the study.
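The two descriptors used — arithmetic mean and median diameter — together with a distribution-density histogram can be computed directly from measured diameters; an illustrative sketch with made-up data:

```python
import statistics

def mean_and_median_diameter(diameters_um):
    return sum(diameters_um) / len(diameters_um), statistics.median(diameters_um)

def density_histogram(diameters_um, bin_edges):
    """Counts per bin divided by total count and bin width (a density)."""
    n = len(diameters_um)
    dens = []
    for lo, hi in zip(bin_edges, bin_edges[1:]):
        count = sum(lo <= d < hi for d in diameters_um)
        dens.append(count / (n * (hi - lo)))
    return dens

# Hypothetical measured droplet diameters, in micrometers
d = [1.2, 1.5, 2.0, 2.4, 3.1, 1.8]
mean_d, median_d = mean_and_median_diameter(d)
dens = density_histogram(d, [1.0, 2.0, 3.0, 4.0])
```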
Popelianskiĭ, Ia Iu; Bogdanov, E I; Khamidullina, V Z
1988-01-01
In 8 patients with radial neuropathy, the authors studied histograms of the distribution of motor unit potentials (PMU) by duration, as well as the number of intercrossings (T) and the mean amplitude of the interference EMG of the musculus brachioradialis. The findings included a decrease in the T value and the T/M ratio in the presence of an insignificant shift of the histograms and of the mean PMU duration. For the diagnosis of early neuropathies, a reduction in the average values of T and T/M during ungraded voluntary muscle tension is diagnostically more important than changes in the duration of individual PMU.
Schultz-Coulon, H J
1975-07-01
The applicability of a newly developed fundamental frequency analyzer to diagnosis in phoniatrics is reviewed. During routine voice examination, the analyzer allows a quick and accurate measurement of fundamental frequency and sound level of the speaking voice, and of vocal range and maximum phonation time. By computing fundamental frequency histograms, the median fundamental frequency and the total pitch range can be better determined and compared. Objective studies of certain technical faculties of the singing voice, which usually are estimated subjectively by the speech therapist, may now be done by means of this analyzer. Several examples demonstrate the differences between correct and incorrect phonation. These studies compare the pitch perturbations during the crescendo and decrescendo of a swell-tone, and show typical traces of staccato, trill and yodel. Conclusions of the study indicate that fundamental frequency analysis is a valuable supplemental method for objective voice examination.
Lunar soils grain size catalog
NASA Technical Reports Server (NTRS)
Graf, John C.
1993-01-01
This catalog compiles every available grain size distribution for Apollo surface soils, trench samples, cores, and Luna 24 soils. Original laboratory data are tabled, and cumulative weight distribution curves and histograms are plotted. Standard statistical parameters are calculated using the method of moments. Photos and location comments describe the sample environment and geological setting. This catalog can help researchers describe the geotechnical conditions and site variability of the lunar surface essential to the design of a lunar base.
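The method of moments computes grain-size statistics by weighting each size-class midpoint by its weight fraction; a sketch with hypothetical sieve data in phi units (not values from the catalog):

```python
import math

def moment_statistics(phi_midpoints, weight_percents):
    """Moment statistics of a grain-size distribution: mean, sorting (SD),
    and skewness, all in phi units."""
    total = sum(weight_percents)
    f = [w / total for w in weight_percents]
    mean = sum(fi * m for fi, m in zip(f, phi_midpoints))
    var = sum(fi * (m - mean) ** 2 for fi, m in zip(f, phi_midpoints))
    sd = math.sqrt(var)
    skew = sum(fi * (m - mean) ** 3 for fi, m in zip(f, phi_midpoints)) / sd ** 3
    return mean, sd, skew

# Hypothetical sieve data: phi class midpoints and weight % retained
mean_phi, sorting, skew = moment_statistics([1.5, 2.5, 3.5], [25.0, 50.0, 25.0])
```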
Feature and contrast enhancement of mammographic image based on multiscale analysis and morphology.
Wu, Shibin; Yu, Shaode; Yang, Yuhan; Xie, Yaoqin
2013-01-01
A new algorithm for feature and contrast enhancement of mammographic images is proposed in this paper. The approach is based on multiscale transforms and mathematical morphology. First, the Laplacian Gaussian pyramid operator is applied to decompose the mammogram into subband images at different scales. The detail (high-frequency) subimages are then equalized by contrast limited adaptive histogram equalization (CLAHE), while the low-pass subimages are processed by mathematical morphology. Finally, the feature- and contrast-enhanced image is reconstructed from the Laplacian Gaussian pyramid coefficients modified at one or more levels by CLAHE and mathematical morphology, respectively, and the result is processed by a global nonlinear operator. The experimental results show that the presented algorithm is effective for feature and contrast enhancement of mammograms. The performance of the proposed algorithm is measured by a contrast evaluation criterion for images, the signal-to-noise ratio (SNR), and the contrast improvement index (CII).
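As background, CLAHE builds on ordinary histogram equalization, which remaps gray levels through the normalized cumulative histogram; a sketch of the global (non-adaptive, non-clipped) version on a flat list of pixel values:

```python
def equalize_histogram(pixels, levels=256):
    """Global histogram equalization of integer gray levels.
    (CLAHE adds tiling and a clip limit on top of this same mapping.)"""
    n = len(pixels)
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    cdf, running = [], 0
    for h in hist:
        running += h
        cdf.append(running)
    cdf_min = next(c for c in cdf if c > 0)
    lut = [round((c - cdf_min) / (n - cdf_min) * (levels - 1)) if n > cdf_min else 0
           for c in cdf]
    return [lut[p] for p in pixels]

# A low-contrast toy "image": values are spread across the full 0-255 range
out = equalize_histogram([52, 52, 60, 60, 180, 200])
```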
LACIE performance predictor final operational capability program description, volume 2
NASA Technical Reports Server (NTRS)
1976-01-01
Given the swath table files, the segment set for one country and cloud cover data, the SAGE program determines how many times and under what conditions each segment is accessed by satellites. The program writes a record for each segment on a data file which contains the pertinent acquisition data. The weather data file can also be generated from a NASA supplied tape. The Segment Acquisition Selector Program (SACS) selects data from the segment reference file based upon data input manually and from a crop window file. It writes the extracted data to a data acquisition file and prints two summary reports. The POUT program reads from associated LACIE files and produces printed reports. The major types of reports that can be produced are: (1) Substrate Reference Data Reports, (2) Population Mean, Standard Deviation and Histogram Reports, (3) Histograms of Monte Carlo Statistics Reports, and (4) Frequency of Sample Segment Acquisitions Reports.
Hansen, John P
2003-01-01
Healthcare quality improvement professionals need to understand and use inferential statistics to interpret sample data from their organizations. In quality improvement and healthcare research studies all the data from a population often are not available, so investigators take samples and make inferences about the population by using inferential statistics. This three-part series will give readers an understanding of the concepts of inferential statistics as well as the specific tools for calculating confidence intervals for samples of data. This article, Part 1, presents basic information about data including a classification system that describes the four major types of variables: continuous quantitative variable, discrete quantitative variable, ordinal categorical variable (including the binomial variable), and nominal categorical variable. A histogram is a graph that displays the frequency distribution for a continuous variable. The article also demonstrates how to calculate the mean, median, standard deviation, and variance for a continuous variable.
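The descriptive statistics covered in Part 1 are one-liners with Python's statistics module, and the confidence-interval step previewed for the rest of the series can be sketched with a normal approximation (the sample values below are made up):

```python
import math
import statistics

sample = [4.2, 5.1, 3.8, 6.0, 5.5, 4.9, 5.2, 4.4]  # a continuous variable

mean = statistics.mean(sample)
median = statistics.median(sample)
sd = statistics.stdev(sample)          # sample SD, n - 1 denominator
variance = statistics.variance(sample)

# 95% normal-approximation confidence interval for the population mean
half_width = 1.96 * sd / math.sqrt(len(sample))
ci = (mean - half_width, mean + half_width)
```

For small samples a t-multiplier would replace the 1.96 used here.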
Mobile satellite services: A survey of business needs
NASA Astrophysics Data System (ADS)
Hainzer, Eric M.
Conceptualizing and understanding the international business traveler's communication requirements through a survey, and selecting a mobile satellite system that satisfies those requirements, are discussed. Chapter 5 presents an in-depth analysis of the respondents' answers to the survey questions, graphed as frequency distribution histograms. Chapter 6 concludes with a selection of the MSS manufacturer most likely to satisfy the communication requirements identified in the previous chapter. Following a general introduction in Chapter 1, the current climate of mobile satellite system (MSS) providers is discussed in Chapter 2. Chapter 3 assesses the implications of launch vehicles for the political, technical, and financial aspects of MSS manufacturers and users. Special attention is paid, where possible, to the relationship between the political environment and forefront technology. Chapter 4 describes the procedure used to create the survey and its research methodology. Graphs and charts are used, where appropriate, for clarity and readability.
Liu, Chunling; Wang, Kun; Li, Xiaodan; Zhang, Jine; Ding, Jie; Spuhler, Karl; Duong, Timothy; Liang, Changhong; Huang, Chuan
2018-06-01
Diffusion-weighted imaging (DWI) has been studied in breast imaging and can provide more information about diffusion, perfusion, and other physiological quantities of interest than standard pulse sequences. The stretched-exponential model has previously been shown to be more reliable than conventional DWI techniques, but different diagnostic sensitivities have been found from study to study. This work investigated the characteristics of whole-lesion histogram parameters derived from the stretched-exponential diffusion model for benign and malignant breast lesions, compared them with the conventional apparent diffusion coefficient (ADC), and further determined which histogram metrics can best be used to differentiate malignant from benign lesions. This was a prospective study. Seventy females were included. Multi-b-value DWI was performed on a 1.5T scanner. Histogram parameters of whole lesions for the distributed diffusion coefficient (DDC), heterogeneity index (α), and ADC were calculated by two radiologists and compared among benign lesions, ductal carcinoma in situ (DCIS), and invasive carcinoma confirmed by pathology. Nonparametric tests were performed for comparisons among invasive carcinoma, DCIS, and benign lesions. Comparisons of receiver operating characteristic (ROC) curves were performed to show the ability to discriminate malignant from benign lesions. The majority of histogram parameters (mean/min/max, skewness/kurtosis, 10th-90th percentile values) from DDC, α, and ADC were significantly different among invasive carcinoma, DCIS, and benign lesions. DDC10% (area under curve [AUC] = 0.931), ADC10% (AUC = 0.893), and αmean (AUC = 0.787) were found to be the best metrics for differentiating benign from malignant tumors among all histogram parameters derived from DDC, ADC, and α, respectively. The combination of DDC10% and αmean, using logistic regression, yielded the highest sensitivity (90.2%) and specificity (95.5%).
DDC10% and αmean derived from the stretched-exponential model provide more information and better diagnostic performance in differentiating malignancy from benign lesions than ADC parameters derived from a monoexponential model. Level of Evidence: 2. Technical Efficacy: Stage 2. J. Magn. Reson. Imaging 2018;47:1701-1710.
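The stretched-exponential model referenced here describes the diffusion signal as S(b) = S0·exp(−(b·DDC)^α); a sketch of the forward model (parameter values are illustrative, not the study's estimates):

```python
import math

def stretched_exp_signal(b, s0, ddc, alpha):
    """S(b) = S0 * exp(-(b * DDC)**alpha); alpha = 1 reduces to the
    monoexponential ADC model, alpha < 1 indicates intravoxel heterogeneity."""
    return s0 * math.exp(-((b * ddc) ** alpha))

b_values = [0, 50, 100, 200, 400, 600, 800]  # s/mm^2
# Illustrative parameters: DDC in mm^2/s, dimensionless alpha
signals = [stretched_exp_signal(b, 1.0, 1.5e-3, 0.8) for b in b_values]
```

Fitting this curve per voxel and histogramming DDC and α over the whole lesion yields the metrics compared above.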
Theoretical cratering rates on Ida, Mathilde, Eros and Gaspra
NASA Astrophysics Data System (ADS)
Jeffers, S. V.; Asher, D. J.; Bailey, M. E.
2002-11-01
We investigate the main influences on crater size distributions by deriving results for four example target objects: (951) Gaspra, (243) Ida, (253) Mathilde, and (433) Eros. The dynamical history of each of these asteroids is modelled using the MERCURY (Chambers 1999) numerical integrator. An efficient Öpik-type collision code enables the calculation of a velocity histogram and the probability of impact. Combining these with a crater scaling law and an impactor size distribution through a Monte Carlo method yields a crater size distribution. The resulting crater probability distributions are in good agreement with the observed crater distributions on these asteroids.
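The Monte Carlo step — drawing impactors from a size distribution and mapping each through a scaling law to a crater diameter — can be sketched as follows (the power-law slope, scaling constant, and exponent are placeholders, not the paper's values):

```python
import random

def crater_diameters(n_impacts, d_min=0.01, slope=2.5, k=20.0, mu=0.9, seed=1):
    """Draw impactor diameters from a cumulative power law N(>d) ~ d**-slope
    by inverse-CDF sampling, then map each through a toy scaling law
    D_crater = k * d**mu.  All constants here are illustrative."""
    rng = random.Random(seed)
    craters = []
    for _ in range(n_impacts):
        u = rng.random()  # uniform in [0, 1)
        d_imp = d_min * (1.0 - u) ** (-1.0 / slope)
        craters.append(k * d_imp ** mu)
    return craters

sizes = crater_diameters(10000)
# binning `sizes` gives the model crater size-frequency histogram
```

A real calculation would also weight each impact by the velocity histogram and impact probability from the collision code.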
Comparison Tools for Assessing the Microgravity Environment of Missions, Carriers and Conditions
NASA Technical Reports Server (NTRS)
DeLombard, Richard; McPherson, Kevin; Moskowitz, Milton; Hrovat, Ken
1997-01-01
The Principal Component Spectral Analysis (PCSA) and Quasi-steady Three-dimensional Histogram (QTH) techniques provide the means to describe the microgravity acceleration environment of an entire mission on a single plot. This allows a straightforward comparison of the microgravity environment between missions, carriers, and conditions. As shown in this report, the PCSA and QTH techniques bring both the range and the median of the microgravity environment onto a single page for an entire mission or another time period or condition of interest. These single pages may then be used to compare similar analyses of other missions, time periods, or conditions. The PCSA plot is based on the frequency distribution of the vibrational energy and is normally used for acceleration data sets containing frequencies above the lowest natural frequencies of the vehicle. The QTH plot is based on the direction and magnitude of the acceleration and is normally used for acceleration data sets with frequency content below 0.1 Hz. Various operating conditions are made evident by PCSA and QTH plots. Equipment operating either full or part time with sufficient magnitude to be considered a disturbance is readily apparent, as is equipment contributing to the background acceleration environment. A source's magnitude and/or frequency variability is also evident from its appearance on a PCSA plot. The PCSA and QTH techniques are valuable tools for extracting useful information from acceleration data taken over large spans of time. This report shows that these techniques provide a tool for comparison between different sets of microgravity acceleration data, for example different missions, different activities within a mission, and/or different attitudes within a mission. These techniques, among others, may be employed to derive useful information from acceleration data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Volkán-Kacsó, Sándor
2014-06-14
A theoretical method is proposed for the calculation of the photon counting probability distribution during a bin time. Two-state fluorescence and steady excitation are assumed. A key feature is a kinetic scheme that allows for an extensive class of stochastic waiting time distribution functions, including power laws, expanded as a sum of weighted decaying exponentials. The solution is analytic in certain conditions, and an exact and simple expression is found for the integral contribution of “bright” and “dark” states. As an application to power-law kinetics, theoretical results are compared with experimental intensity histograms from a number of blinking CdSe/ZnS quantum dots. The histograms are consistent with distributions of intensity states around a “bright” and a “dark” maximum. A gap of states is also revealed in the more-or-less flat inter-peak region. The slope and, to some extent, the flatness of the inter-peak feature are found to be sensitive to the power-law exponents. Possible models consistent with these findings are discussed, such as the combination of multiple charging and fluctuating non-radiative channels, or the multiple recombination center model. A fitting of the latter to experiment provides constraints on the interaction parameter between the recombination centers. Further extensions and applications of the photon counting theory are also discussed.
Yin, Ping; Xiong, Hua; Liu, Yi; Sah, Shambhu K; Zeng, Chun; Wang, Jingjie; Li, Yongmei; Hong, Nan
2018-01-01
To investigate the application value of dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) with the extended Tofts linear model for relapsing-remitting multiple sclerosis (RRMS) and its correlation with expanded disability status scale (EDSS) scores and disease duration. Thirty patients with multiple sclerosis (MS) underwent conventional magnetic resonance imaging (MRI) and DCE-MRI on a 3.0 Tesla MR scanner. An extended Tofts linear model was used to quantitatively measure MR imaging biomarkers. The histogram parameters and the correlations among imaging biomarkers, EDSS scores, and disease duration were also analyzed. The MR imaging biomarkers volume transfer constant (Ktrans), volume of the extravascular extracellular space per unit volume of tissue (Ve), fractional plasma volume (Vp), cerebral blood flow (CBF), and cerebral blood volume (CBV) of contrast-enhancing (CE) lesions were significantly higher (P < 0.05) than those of nonenhancing (NE) lesions and normal-appearing white matter (NAWM) regions. The skewness of the Ve value in CE lesions was closer to a normal distribution. There was no significant correlation among the biomarkers, the EDSS scores, and disease duration (P > 0.05). Our study demonstrates that DCE-MRI with the extended Tofts linear model can measure permeability and perfusion characteristics in MS lesions and in NAWM regions. The Ktrans, Ve, Vp, CBF, and CBV of CE lesions were significantly higher than those of NE lesions. The skewness of the Ve value in CE lesions was closer to a normal distribution, indicating that the histogram can be helpful in distinguishing the pathology of MS lesions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Levine, Benjamin G., E-mail: ben.levine@temple.ed; Stone, John E., E-mail: johns@ks.uiuc.ed; Kohlmeyer, Axel, E-mail: akohlmey@temple.ed
2011-05-01
The calculation of radial distribution functions (RDFs) from molecular dynamics trajectory data is a common and computationally expensive analysis task. The rate limiting step in the calculation of the RDF is building a histogram of the distance between atom pairs in each trajectory frame. Here we present an implementation of this histogramming scheme for multiple graphics processing units (GPUs). The algorithm features a tiling scheme to maximize the reuse of data at the fastest levels of the GPU's memory hierarchy and dynamic load balancing to allow high performance on heterogeneous configurations of GPUs. Several versions of the RDF algorithm are presented, utilizing the specific hardware features found on different generations of GPUs. We take advantage of larger shared memory and atomic memory operations available on state-of-the-art GPUs to accelerate the code significantly. The use of atomic memory operations allows the fast, limited-capacity on-chip memory to be used much more efficiently, resulting in a fivefold increase in performance compared to the version of the algorithm without atomic operations. The ultimate version of the algorithm running in parallel on four NVIDIA GeForce GTX 480 (Fermi) GPUs was found to be 92 times faster than a multithreaded implementation running on an Intel Xeon 5550 CPU. On this multi-GPU hardware, the RDF between two selections of 1,000,000 atoms each can be calculated in 26.9 s per frame. The multi-GPU RDF algorithms described here are implemented in VMD, a widely used and freely available software package for molecular dynamics visualization and analysis.
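The histogramming step the abstract identifies as rate-limiting can be written in a few lines of serial NumPy (an illustrative reference sketch, not the paper's GPU-tiled implementation; all names here are ours):

```python
import numpy as np

def rdf_histogram(coords_a, coords_b, r_max, n_bins):
    """Histogram pairwise distances between two atom selections.

    Serial reference for the rate-limiting step of an RDF calculation;
    the paper's GPU version tiles this computation over fast on-chip
    memory instead of materializing the full distance matrix.
    """
    # Pairwise difference tensor, shape (N, M, 3); O(N*M) memory,
    # fine for small systems but the reason large ones need GPUs.
    diff = coords_a[:, None, :] - coords_b[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    hist, edges = np.histogram(dist, bins=n_bins, range=(0.0, r_max))
    return hist, edges

rng = np.random.default_rng(0)
sel_a = rng.uniform(0.0, 10.0, size=(200, 3))
sel_b = rng.uniform(0.0, 10.0, size=(200, 3))
hist, edges = rdf_histogram(sel_a, sel_b, r_max=10.0, n_bins=50)
```

Normalizing such a histogram by shell volume and density yields g(r); only the raw distance binning is shown here.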
A Probability-Based Statistical Method to Extract Water Bodies from TM Images with Missing Information
NASA Astrophysics Data System (ADS)
Lian, Shizhong; Chen, Jiangping; Luo, Minghai
2016-06-01
Water information cannot be accurately extracted from TM images in which true values are lost to cloud cover and missing data stripes. Because water is continuously distributed under natural conditions, this paper proposes a new water body extraction method based on probability statistics to improve the accuracy of water information extraction from TM images with missing information. Cloud cover and missing data stripes were simulated as different types of disturbance, and water information was extracted from the simulated images using global histogram matching, local histogram matching, and the probability-based statistical method. Experiments show that a smaller Areal Error and a higher Boundary Recall can be obtained using this method compared with the conventional methods.
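Global histogram matching, one of the baseline methods compared above, can be sketched with a standard CDF-mapping implementation (an illustrative version with names of our choosing; the paper's probability-based method itself is not reproduced here):

```python
import numpy as np

def histogram_match(source, reference):
    """Match the grey-level histogram of `source` to that of `reference`.

    Classic CDF-based global histogram matching: each source quantile
    is mapped onto the reference grey level at the same quantile.
    """
    s_vals, s_idx, s_counts = np.unique(source.ravel(),
                                        return_inverse=True,
                                        return_counts=True)
    r_vals, r_counts = np.unique(reference.ravel(), return_counts=True)
    s_cdf = np.cumsum(s_counts) / source.size
    r_cdf = np.cumsum(r_counts) / reference.size
    # Interpolate source quantiles into the reference grey-level range.
    matched = np.interp(s_cdf, r_cdf, r_vals)
    return matched[s_idx].reshape(source.shape)

src = np.random.default_rng(1).integers(0, 100, size=(64, 64)).astype(float)
ref = np.random.default_rng(2).integers(50, 200, size=(64, 64)).astype(float)
out = histogram_match(src, ref)
```

The matched image takes values inside the reference grey-level range, so statistics estimated on intact regions can be transferred to gap-filled ones.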
Automated Counting of Particles To Quantify Cleanliness
NASA Technical Reports Server (NTRS)
Rhode, James
2005-01-01
A machine vision system, similar to systems used in microbiological laboratories to count cultured microbes, has been proposed for quantifying the cleanliness of nominally precisely cleaned hardware by counting residual contaminant particles. The system would include a microscope equipped with an electronic camera and circuitry to digitize the camera output, a personal computer programmed with machine-vision and interface software, and digital storage media. A filter pad, through which had been aspirated solvent from rinsing the hardware in question, would be placed on the microscope stage. A high-resolution image of the filter pad would be recorded. The computer would analyze the image and present a histogram of sizes of particles on the filter. On the basis of the histogram and a measure of the desired level of cleanliness, the hardware would be accepted or rejected. If the hardware were accepted, the image would be saved, along with other information, as a quality record. If the hardware were rejected, the histogram and ancillary information would be recorded for analysis of trends. The software would perceive particles that are too large or too numerous to meet a specified particle-distribution profile. Anomalous particles or fibrous material would be flagged for inspection.
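The proposed analysis step, thresholding the filter-pad image, counting particles, and histogramming their sizes, can be sketched as follows (a minimal illustration using SciPy connected-component labeling; the threshold and bin choices are assumptions, not the system's specification):

```python
import numpy as np
from scipy import ndimage

def particle_size_histogram(image, threshold, bins):
    """Threshold a filter-pad image, count connected particles,
    and histogram their sizes (areas in pixels)."""
    mask = image > threshold
    labels, n_particles = ndimage.label(mask)
    # Area of each labeled particle, in pixels.
    sizes = ndimage.sum(mask, labels, index=np.arange(1, n_particles + 1))
    hist, edges = np.histogram(sizes, bins=bins)
    return n_particles, hist, edges

# Synthetic "filter pad" with two residual particles.
img = np.zeros((32, 32))
img[2:5, 2:5] = 1.0       # 9-pixel particle
img[20:22, 20:24] = 1.0   # 8-pixel particle
n_particles, hist, edges = particle_size_histogram(img, threshold=0.5, bins=4)
```

An accept/reject decision would then compare `hist` against the specified particle-distribution profile.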
Using an image-extended relational database to support content-based image retrieval in a PACS.
Traina, Caetano; Traina, Agma J M; Araújo, Myrian R B; Bueno, Josiane M; Chino, Fabio J T; Razente, Humberto; Azevedo-Marques, Paulo M
2005-12-01
This paper presents a new Picture Archiving and Communication System (PACS), called cbPACS, which has content-based image retrieval capabilities. The cbPACS answers range and k-nearest-neighbor similarity queries, employing a relational database manager extended to support images. The images are compared through their features, which are extracted by an image-processing module and stored in the extended relational database. The database extensions were developed aiming at efficiently answering similarity queries by taking advantage of specialized indexing methods. The main concept supporting the extensions is the definition, inside the relational manager, of distance functions based on features extracted from the images. An extension to the SQL language enables the construction of an interpreter that intercepts the extended commands and translates them to standard SQL, allowing any relational database server to be used. At present, the system works on features based on the color distribution of the images, through normalized histograms as well as metric histograms. Metric histograms are invariant to scale, translation and rotation of images, and also to brightness transformations. The cbPACS is prepared to integrate new image features, based on texture and on the shape of the main objects in the image.
NASA Astrophysics Data System (ADS)
Galich, N. E.
A novel nonlinear statistical method of immunofluorescence data analysis is presented. Data on DNA fluorescence due to oxidative activity in the nuclei of peripheral blood neutrophils are analyzed. Histograms of photon counting statistics are generated by flow cytometry. The histograms represent the distribution of fluorescence flash frequency as a function of intensity for large populations (~10^4-10^5) of fluorescing cells. We show that these experiments capture 3D correlations of DNA oxidative activity for the full chromosome set in cells, with a spatial resolution of a few nanometers along the flow direction of the blood jet. Detailed analysis showed that large-scale correlations in the oxidative activity of DNA in cells are described as small-world networks (complex systems with logarithmic scaling), with a distinct small-world network for a given donor at a given time in all states of health. We observed changes in the fractal networks of DNA oxidative activity in neutrophils in vivo and during medical treatment, and used them for classification and diagnostics of pathologies across a wide spectrum of diseases. Our approach is based on analyzing changes in network topology (fractal dimension) as the scale of the network is varied. We produce a general estimate of a given donor's health status as a yes/no (healthy/sick) answer, depending on the sign (plus/minus) of the trend in fractal dimension as the network scale decreases. We noted an increased biodiversity of neutrophils and a stochastic (Brownian) character of intercellular correlations between different neutrophils in the blood of healthy donors. In the blood of sick people we observed deterministic cell-cell correlations of neutrophils and a decrease in their biodiversity.
NASA Astrophysics Data System (ADS)
Underwood, T. S. A.; Sung, W.; McFadden, C. H.; McMahon, S. J.; Hall, D. C.; McNamara, A. L.; Paganetti, H.; Sawakuchi, G. O.; Schuemann, J.
2017-04-01
Whilst Monte Carlo (MC) simulations of proton energy deposition have been well-validated at the macroscopic level, their microscopic validation remains lacking. Equally, no gold-standard yet exists for experimental metrology of individual proton tracks. In this work we compare the distributions of stochastic proton interactions simulated using the TOPAS-nBio MC platform against confocal microscope data for Al2O3:C,Mg fluorescent nuclear track detectors (FNTDs). We irradiated 8 × 4 × 0.5 mm³ FNTD chips inside a water phantom, positioned at seven positions along a pristine proton Bragg peak with a range in water of 12 cm. MC simulations were implemented in two stages: (1) using TOPAS to model the beam properties within a water phantom and (2) using TOPAS-nBio with Geant4-DNA physics to score particle interactions through a water surrogate of Al2O3:C,Mg. The measured median track integrated brightness (IB) was observed to be strongly correlated to both (i) voxelized track-averaged linear energy transfer (LET) and (ii) the frequency-mean microdosimetric lineal energy, ȳ_F, both simulated in pure water. Histograms of FNTD track IB were compared against TOPAS-nBio histograms of the number of terminal electrons per proton, scored in water with mass-density scaled to mimic Al2O3:C,Mg. Trends between exposure depths observed in TOPAS-nBio simulations were experimentally replicated in the study of FNTD track IB. Our results represent an important first step towards the experimental validation of MC simulations on the sub-cellular scale and suggest that FNTDs can enable experimental study of the microdosimetric properties of individual proton tracks.
Multiplicity of the 660-km discontinuity beneath the Izu-Bonin area
NASA Astrophysics Data System (ADS)
Zhou, Yuan-Ze; Yu, Xiang-Wei; Yang, Hui; Zang, Shao-Xian
2012-05-01
The relatively simple subducting slab geometry in the Izu-Bonin region provides a valuable opportunity to study the multiplicity of the 660-km discontinuity and the related response of the subducting slab on the discontinuity. Vertical short-period recordings of deep events with simple direct P phases beneath the Izu-Bonin region were retrieved from two seismic networks in the western USA and were used to study the structure of the 660-km discontinuity. After careful selection and pre-processing, 23 events from the networks, forming 32 pairs of event-network records, were processed. Related vespagrams were produced using the N-th root slant stack method for detecting weak down-going SdP phases that were inverted to the related conversion points. From depth histograms and the spatial distribution of the conversion points, there were three clear interfaces at depths of 670, 710 and 730 km. These interfaces were depressed approximately 20-30 km in the northern region. In the southern region, only two layers were identified in the depth histograms, and no obvious layered structure could be observed from the distribution of the conversion points.
Shot-Noise Limited Single-Molecule FRET Histograms: Comparison between Theory and Experiments
Nir, Eyal; Michalet, Xavier; Hamadani, Kambiz M.; Laurence, Ted A.; Neuhauser, Daniel; Kovchegov, Yevgeniy; Weiss, Shimon
2011-01-01
We describe a simple approach and present a straightforward numerical algorithm to compute the best fit shot-noise limited proximity ratio histogram (PRH) in single-molecule fluorescence resonant energy transfer diffusion experiments. The key ingredient is the use of the experimental burst size distribution, as obtained after burst search through the photon data streams. We show how the use of an alternated laser excitation scheme and a correspondingly optimized burst search algorithm eliminates several potential artifacts affecting the calculation of the best fit shot-noise limited PRH. This algorithm is tested extensively on simulations and simple experimental systems. We find that dsDNA data exhibit a wider PRH than expected from shot noise only and hypothetically account for it by assuming a small Gaussian distribution of distances with an average standard deviation of 1.6 Å. Finally, we briefly mention the results of a future publication and illustrate them with a simple two-state model system (DNA hairpin), for which the kinetic transition rates between the open and closed conformations are extracted. PMID:17078646
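The shot-noise limit itself is simple to reproduce numerically: for a fixed true FRET efficiency E, the acceptor count in an N-photon burst is binomial(N, E), so pushing a burst-size distribution through a binomial draw gives the expected PRH width (a Monte Carlo sketch under that standard assumption, not the paper's fitting algorithm; the burst sizes below are synthetic):

```python
import numpy as np

def shot_noise_prh(burst_sizes, mean_E, n_bins=20, rng=None):
    """Monte Carlo shot-noise limited proximity ratio histogram.

    For one true efficiency `mean_E`, each burst of N photons yields
    an acceptor count n_A ~ Binomial(N, E) and proximity ratio n_A/N.
    Using the experimental burst-size distribution reproduces the
    shot-noise-only histogram width.
    """
    rng = rng or np.random.default_rng(0)
    n_acceptor = rng.binomial(burst_sizes, mean_E)
    proximity_ratio = n_acceptor / burst_sizes
    hist, edges = np.histogram(proximity_ratio, bins=n_bins,
                               range=(0.0, 1.0))
    return hist, edges

# Illustrative burst-size distribution; real ones come from burst search.
burst_sizes = np.random.default_rng(1).integers(30, 150, size=5000)
hist, edges = shot_noise_prh(burst_sizes, mean_E=0.5)
```

Any excess width of a measured PRH over this simulated one (as for the dsDNA data above) then points to a genuine distance distribution.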
Waldenberg, Christian; Hebelka, Hanna; Brisby, Helena; Lagerstrand, Kerstin Magdalena
2018-05-01
Magnetic resonance imaging (MRI) is the best diagnostic imaging method for low back pain. However, the technique is currently not utilized to its full capacity, often failing to depict painful intervertebral discs (IVDs), potentially due to the rough degeneration classification system used clinically today. MR image histograms, which reflect the IVD heterogeneity, may offer sensitive imaging biomarkers for IVD degeneration classification. This study investigates the feasibility of using histogram analysis as a means of objective and continuous grading of IVD degeneration. Forty-nine IVDs in ten low back pain patients (six males, 25-69 years) were examined with MRI (T2-weighted images and T2-maps). Each IVD was semi-automatically segmented on three mid-sagittal slices. Histogram features of the IVD were extracted from the defined regions of interest and correlated to Pfirrmann grade. Both T2-weighted images and T2-maps displayed similar histogram features. Histograms of well-hydrated IVDs displayed two separate peaks, representing the annulus fibrosus and the nucleus pulposus. Degenerated IVDs displayed decreased peak separation, where the separation was shown to correlate strongly with Pfirrmann grade (P < 0.05). In addition, some degenerated IVDs within the same Pfirrmann grade displayed diametrically different histogram appearances. Histogram features correlated well with IVD degeneration, suggesting that IVD histogram analysis is a suitable tool for objective and continuous IVD degeneration classification. As histogram analysis revealed IVD heterogeneity, it may serve as a clinical tool for characterization of regional IVD degeneration effects. To elucidate the usefulness of histogram analysis in patient management, IVD histogram features between asymptomatic and symptomatic individuals need to be compared.
Lee, Ki Baek
2018-01-01
Objective To describe the quantitative image quality and histogram-based evaluation of an iterative reconstruction (IR) algorithm in chest computed tomography (CT) scans at low-to-ultralow CT radiation dose levels. Materials and Methods In an adult anthropomorphic phantom, chest CT scans were performed with 128-section dual-source CT at 70, 80, 100, 120, and 140 kVp, and at the reference (3.4 mGy in volume CT Dose Index [CTDIvol]), 30%-, 60%-, and 90%-reduced radiation dose levels (2.4, 1.4, and 0.3 mGy). The CT images were reconstructed using filtered back projection (FBP) algorithms and an IR algorithm with strengths 1, 3, and 5. Image noise, signal-to-noise ratio (SNR), and contrast-to-noise ratio (CNR) were statistically compared between different dose levels, tube voltages, and reconstruction algorithms. Moreover, histograms of subtraction images before and after standardization in the x- and y-axes were visually compared. Results Compared with FBP images, IR images with strengths 1, 3, and 5 demonstrated image noise reduction up to 49.1%, SNR increase up to 100.7%, and CNR increase up to 67.3%. Noteworthy image quality degradations on IR images, including a 184.9% increase in image noise, a 63.0% decrease in SNR, and a 51.3% decrease in CNR, were shown between the 60%- and 90%-reduced radiation dose levels (p < 0.0001). Subtraction histograms between FBP and IR images showed progressively increased dispersion with increased IR strength and increased dose reduction. After standardization, the histograms appeared deviated and ragged between FBP images and IR images with strength 3 or 5, but almost normally distributed between FBP images and IR images with strength 1. Conclusion The IR algorithm may be used to save radiation dose without substantial image quality degradation in chest CT scanning of the adult anthropomorphic phantom, down to approximately 1.4 mGy in CTDIvol (60% reduced dose). PMID:29354008
Xu, Kui; Boas, David A; Sakadžić, Sava; LaManna, Joseph C
2017-01-01
Key to the understanding of the principles of physiological and structural acclimatization to changes in the balance between energy supply (represented by substrate and oxygen delivery, and mitochondrial oxidative phosphorylation) and energy demand (initiated by neuronal activity) is to determine the controlling variables, how they are sensed, and the mechanisms initiated to maintain the balance. The mammalian brain depends completely on continuous delivery of oxygen to maintain its function. We hypothesized that tissue oxygen is the primary sensed variable. In this study two-photon phosphorescence lifetime microscopy (2PLM) was used to determine and define the tissue oxygen tension field within the cerebral cortex of mice to a cortical depth of between 200 and 250 μm under normoxia and acute hypoxia (FiO2 = 0.10). High-resolution images can provide quantitative distributions of oxygen and intercapillary oxygen gradients. The data are best appreciated by quantifying the distribution histogram that can then be used for analysis. For example, in the brain cortex of a mouse, at a depth of 200 μm, tissue oxygen tension was mapped and the distribution histogram was compared under normoxic and mildly hypoxic conditions. This powerful method can provide for the first time a description of the delivery and availability of brain oxygen in vivo.
Bin Ratio-Based Histogram Distances and Their Application to Image Classification.
Hu, Weiming; Xie, Nianhua; Hu, Ruiguang; Ling, Haibin; Chen, Qiang; Yan, Shuicheng; Maybank, Stephen
2014-12-01
Large variations in image background may cause partial matching and normalization problems for histogram-based representations, i.e., the histograms of the same category may have bins which are significantly different, and normalization may produce large changes in the differences between corresponding bins. In this paper, we deal with this problem by using the ratios between bin values of histograms, rather than bin values' differences which are used in the traditional histogram distances. We propose a bin ratio-based histogram distance (BRD), which is an intra-cross-bin distance, in contrast with previous bin-to-bin distances and cross-bin distances. The BRD is robust to partial matching and histogram normalization, and captures correlations between bins with only a linear computational complexity. We combine the BRD with the ℓ1 histogram distance and the χ² histogram distance to generate the ℓ1 BRD and the χ² BRD, respectively. These combinations exploit and benefit from the robustness of the BRD under partial matching and the robustness of the ℓ1 and χ² distances to small noise. We propose a method for assessing the robustness of histogram distances to partial matching. The BRDs and logistic regression-based histogram fusion are applied to image classification. The experimental results on synthetic data sets show the robustness of the BRDs to partial matching, and the experiments on seven benchmark data sets demonstrate promising results of the BRDs for image classification.
Optical Logarithmic Transformation of Speckle Images with Bacteriorhodopsin Films
NASA Technical Reports Server (NTRS)
Downie, John D.
1995-01-01
The application of logarithmic transformations to speckle images is sometimes desirable in converting the speckle noise distribution into an additive, constant-variance noise distribution. The optical transmission properties of some bacteriorhodopsin films are well suited to implement such a transformation optically in a parallel fashion. I present experimental results of the optical conversion of a speckle image into a transformed image with signal-independent noise statistics, using the real-time photochromic properties of bacteriorhodopsin. The original and transformed noise statistics are confirmed by histogram analysis.
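The effect the film implements optically is easy to verify numerically: for multiplicative speckle, a logarithm turns signal-dependent noise into noise of constant variance (a sketch assuming unit-mean exponential intensity statistics for fully developed speckle; the region intensities are illustrative):

```python
import numpy as np

def log_transform_speckle(image, eps=1e-12):
    """Log-transform a speckle image.

    Multiplicative speckle I = R * n becomes log I = log R + log n:
    additive noise whose variance no longer depends on the
    underlying reflectance R.
    """
    return np.log(image + eps)

rng = np.random.default_rng(0)
# Unit-mean exponential intensity statistics model fully developed speckle.
dim_region = 2.0 * rng.exponential(1.0, size=100_000)
bright_region = 50.0 * rng.exponential(1.0, size=100_000)
t_dim = log_transform_speckle(dim_region)
t_bright = log_transform_speckle(bright_region)
```

After the transform, both regions carry (up to sampling error) the same noise variance, π²/6 for exponential speckle, so a single constant-variance noise model applies across the whole image.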
Theory and Application of DNA Histogram Analysis.
ERIC Educational Resources Information Center
Bagwell, Charles Bruce
The underlying principles and assumptions associated with DNA histograms are discussed along with the characteristics of fluorescent probes. Information theory is described and used to calculate the information content of a DNA histogram. Two major types of DNA histogram analyses are proposed: parametric and nonparametric analysis. Three levels…
Interference Fringes Used to Determine Retinal Ganglion Cell Receptive Field Sizes.
1982-07-01
National Technical Information Service ... releasable to the general public, including foreign nationals. ... This technical report has been reviewed and is ... adjusted to satisfy the following criteria: (a) tachycardia and transient hypertension in response to strong noxious stimuli, and (b) stage I or ... tube voltage output for drifting interference fringes with a spatial frequency of 11.6 cycles/degree; B shows the histogram of the pulse-height
IRT-LR-DIF with Estimation of the Focal-Group Density as an Empirical Histogram
ERIC Educational Resources Information Center
Woods, Carol M.
2008-01-01
Item response theory-likelihood ratio-differential item functioning (IRT-LR-DIF) is used to evaluate the degree to which items on a test or questionnaire have different measurement properties for one group of people versus another, irrespective of group-mean differences on the construct. Usually, the latent distribution is presumed normal for both…
Macrophage Biochemistry, Activation and Function
1981-01-01
vacuolar apparatus become more abundant. Functional capabilities, including phagocytic activity, protein synthesis and surface receptors, also increase ... properties of cell components of other tissues has led to the following assignment of marker enzymes to specific macrophage components. This assessment is ... subfractions. The surface area of each histogram bar then gives the fractional amount of constituent present within each normalized fraction. Distribution
Aromaticity of benzene derivatives: an exploration of the Cambridge Structural Database.
Majerz, Irena; Dziembowska, Teresa
2018-04-01
The harmonic oscillator model of aromaticity (HOMA) index, one of the most popular aromaticity indices, has been analyzed for solid-state benzene rings in the Cambridge Structural Database (CSD). The histograms of HOMA for benzene, for benzene derivatives with one formyl, nitro, amino or hydroxy group, as well as the histograms for derivatives with two formyl, nitro, amino or hydroxy groups in the ortho, meta and para positions, were investigated. The majority of the substituted benzene derivatives in the CSD are characterized by a high value of HOMA, indicating fully aromatic character; however, the distribution of HOMA values from 1 down to about 0 indicates aromaticity decreasing to non-aromatic character. Among the benzene derivatives investigated, a significant decrease in aromaticity can be related to compounds with diamino and dinitro groups in the meta position.
Accelerating the weighted histogram analysis method by direct inversion in the iterative subspace.
Zhang, Cheng; Lai, Chun-Liang; Pettitt, B Montgomery
The weighted histogram analysis method (WHAM) for free energy calculations is a valuable tool to produce free energy differences with minimal errors. Given multiple simulations, WHAM obtains from the distribution overlaps the optimal statistical estimator of the density of states, from which the free energy differences can be computed. The WHAM equations are often solved by an iterative procedure. In this work, we use a well-known linear algebra algorithm which allows for more rapid convergence to the solution. We find that the computational complexity of the iterative solution to WHAM and the closely related multiple Bennett acceptance ratio (MBAR) method can be improved by using the method of direct inversion in the iterative subspace. We give examples from a lattice model, a simple liquid and an aqueous protein solution.
Histogram deconvolution - An aid to automated classifiers
NASA Technical Reports Server (NTRS)
Lorre, J. J.
1983-01-01
It is shown that N-dimensional histograms are convolved by the addition of noise in the picture domain. Three methods are described which provide the ability to deconvolve such noise-affected histograms. The purpose of the deconvolution is to provide automated classifiers with a higher quality N-dimensional histogram from which to obtain classification statistics.
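The abstract does not name the three methods; as one standard way to deconvolve a noise-blurred 1-D histogram, a Richardson-Lucy iteration (our illustrative stand-in, not necessarily one of the paper's methods) looks like this:

```python
import numpy as np

def richardson_lucy_1d(observed, kernel, n_iter=100):
    """Deconvolve a blurred, nonnegative 1-D histogram by
    Richardson-Lucy iteration against a known noise kernel."""
    estimate = np.full_like(observed, observed.mean(), dtype=float)
    kernel_flipped = kernel[::-1]
    for _ in range(n_iter):
        blurred_est = np.convolve(estimate, kernel, mode="same")
        ratio = observed / np.maximum(blurred_est, 1e-12)
        estimate = estimate * np.convolve(ratio, kernel_flipped, mode="same")
    return estimate

# Two-class histogram blurred by a Gaussian "noise" kernel.
truth = np.zeros(64)
truth[20], truth[40] = 100.0, 60.0
x = np.arange(-5, 6)
kernel = np.exp(-x ** 2 / 4.0)
kernel /= kernel.sum()
blurred = np.convolve(truth, kernel, mode="same")
estimate = richardson_lucy_1d(blurred, kernel)
```

The deconvolved histogram has sharper class peaks than the blurred input, which is exactly the property a downstream classifier benefits from.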
Introducing parallelism to histogramming functions for GEM systems
NASA Astrophysics Data System (ADS)
Krawczyk, Rafał D.; Czarski, Tomasz; Kolasinski, Piotr; Pozniak, Krzysztof T.; Linczuk, Maciej; Byszuk, Adrian; Chernyshova, Maryna; Juszczyk, Bartlomiej; Kasprowicz, Grzegorz; Wojenski, Andrzej; Zabolotny, Wojciech
2015-09-01
This article assesses the potential parallelization of histogramming algorithms in a GEM detector system. Histogramming and preprocessing algorithms in MATLAB were analyzed with regard to adding parallelism, and a preliminary implementation of parallel strip histogramming resulted in a speedup. An analysis of the algorithms' parallelizability is presented, together with an overview of potential hardware and software support for implementing the parallel algorithms.
Comparison of Histograms for Use in Cloud Observation and Modeling
NASA Technical Reports Server (NTRS)
Green, Lisa; Xu, Kuan-Man
2005-01-01
Cloud observation and cloud modeling data can be presented in histograms for each characteristic to be measured. Combining information from single-cloud histograms yields a summary histogram. Summary histograms can be compared to each other to reach conclusions about the behavior of an ensemble of clouds in different places at different times or about the accuracy of a particular cloud model. As in any scientific comparison, it is necessary to decide whether any apparent differences are statistically significant. The usual methods of deciding statistical significance when comparing histograms do not apply in this case because they assume independent data. Thus, a new method is necessary. The proposed method uses the Euclidean distance metric and bootstrapping to calculate the significance level.
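The proposed test statistic and resampling scheme can be sketched as follows (the paper proposes bootstrapping; this illustration uses the closely related permutation form of resampling, and all implementation details, ensemble sizes, and bin counts are assumed):

```python
import numpy as np

def resampling_histogram_test(hists_a, hists_b, n_resample=2000, rng=None):
    """Significance of the Euclidean distance between two summary
    histograms, assessed by resampling the member (single-cloud)
    histograms across the two ensembles."""
    rng = rng or np.random.default_rng(0)
    observed = np.linalg.norm(hists_a.mean(axis=0) - hists_b.mean(axis=0))
    pooled = np.vstack([hists_a, hists_b])
    n_a = len(hists_a)
    exceed = 0
    for _ in range(n_resample):
        idx = rng.permutation(len(pooled))
        d = np.linalg.norm(pooled[idx[:n_a]].mean(axis=0)
                           - pooled[idx[n_a:]].mean(axis=0))
        if d >= observed:
            exceed += 1
    return observed, exceed / n_resample

rng = np.random.default_rng(1)
ens_a = rng.poisson(10.0, size=(30, 8)).astype(float)  # 30 cloud histograms
ens_b = rng.poisson(20.0, size=(30, 8)).astype(float)  # shifted ensemble
dist, p_value = resampling_histogram_test(ens_a, ens_b)
```

Resampling whole clouds rather than individual observations is what sidesteps the independence assumption the usual tests require.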
Free energies from dynamic weighted histogram analysis using unbiased Markov state model.
Rosta, Edina; Hummer, Gerhard
2015-01-13
The weighted histogram analysis method (WHAM) is widely used to obtain accurate free energies from biased molecular simulations. However, WHAM free energies can exhibit significant errors if some of the biasing windows are not fully equilibrated. To account for the lack of full equilibration, we develop the dynamic histogram analysis method (DHAM). DHAM uses a global Markov state model to obtain the free energy along the reaction coordinate. A maximum likelihood estimate of the Markov transition matrix is constructed by joint unbiasing of the transition counts from multiple umbrella-sampling simulations along discretized reaction coordinates. The free energy profile is the stationary distribution of the resulting Markov matrix. For this matrix, we derive an explicit approximation that does not require the usual iterative solution of WHAM. We apply DHAM to model systems, a chemical reaction in water treated using quantum-mechanics/molecular-mechanics (QM/MM) simulations, and the Na(+) ion passage through the membrane-embedded ion channel GLIC. We find that DHAM gives accurate free energies even in cases where WHAM fails. In addition, DHAM provides kinetic information, which we here use to assess the extent of convergence in each of the simulation windows. DHAM may also prove useful in the construction of Markov state models from biased simulations in phase-space regions with otherwise low population.
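The final linear-algebra step of DHAM, recovering the free energy profile as the stationary distribution of the unbiased Markov matrix, can be sketched directly (illustrative only; the joint unbiasing of transition counts from the umbrella windows is not shown, and the toy matrix below is ours):

```python
import numpy as np

def stationary_distribution(T):
    """Stationary distribution of a row-stochastic Markov matrix:
    the left eigenvector of T with eigenvalue 1, normalized to sum
    to one."""
    vals, vecs = np.linalg.eig(T.T)
    i = np.argmin(np.abs(vals - 1.0))
    pi = np.real(vecs[:, i])
    return pi / pi.sum()

# Toy 3-state transition matrix standing in for the jointly unbiased
# Markov matrix DHAM estimates from umbrella-sampling counts.
T = np.array([[0.9, 0.1, 0.0],
              [0.1, 0.8, 0.1],
              [0.0, 0.2, 0.8]])
pi = stationary_distribution(T)
free_energy = -np.log(pi)  # in units of kT, up to an additive constant
```

Because the profile comes from the transition matrix rather than from histogram reweighting alone, the same matrix also yields the kinetic (relaxation) information mentioned above.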
Noise-induced hearing loss alters the temporal dynamics of auditory-nerve responses
Scheidt, Ryan E.; Kale, Sushrut; Heinz, Michael G.
2010-01-01
Auditory-nerve fibers demonstrate dynamic response properties in that they adapt to rapid changes in sound level, both at the onset and offset of a sound. These dynamic response properties affect temporal coding of stimulus modulations that are perceptually relevant for many sounds such as speech and music. Temporal dynamics have been well characterized in auditory-nerve fibers from normal-hearing animals, but little is known about the effects of sensorineural hearing loss on these dynamics. This study examined the effects of noise-induced hearing loss on the temporal dynamics in auditory-nerve fiber responses from anesthetized chinchillas. Post-stimulus time histograms were computed from responses to 50-ms tones presented at characteristic frequency and 30 dB above fiber threshold. Several response metrics related to temporal dynamics were computed from post-stimulus-time histograms and were compared between normal-hearing and noise-exposed animals. Results indicate that noise-exposed auditory-nerve fibers show significantly reduced response latency, increased onset response and percent adaptation, faster adaptation after onset, and slower recovery after offset. The decrease in response latency only occurred in noise-exposed fibers with significantly reduced frequency selectivity. These changes in temporal dynamics have important implications for temporal envelope coding in hearing-impaired ears, as well as for the design of dynamic compression algorithms for hearing aids. PMID:20696230
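The basic analysis object here, a post-stimulus time histogram, can be computed as follows (a generic sketch with assumed parameter values and names, not the study's exact onset/adaptation metrics):

```python
import numpy as np

def psth(spike_times, stim_onsets, window=0.05, bin_width=0.005):
    """Post-stimulus time histogram: spike counts in bins of time
    relative to each stimulus onset, summed over repetitions and
    converted to firing rate in spikes/s."""
    n_bins = int(round(window / bin_width))
    edges = np.arange(n_bins + 1) * bin_width
    counts = np.zeros(n_bins)
    for onset in stim_onsets:
        rel = spike_times - onset
        rel = rel[(rel >= 0.0) & (rel < window)]
        counts += np.histogram(rel, bins=edges)[0]
    rate = counts / (len(stim_onsets) * bin_width)
    return rate, edges

onsets = np.arange(0.0, 10.0, 0.5)                         # 20 repetitions
spikes = np.concatenate([onsets + 0.012, onsets + 0.013])  # onset-locked
rate, edges = psth(spikes, onsets)
```

Metrics such as response latency, onset response, and percent adaptation are then read off this rate profile.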
Statistical analysis of the 70 meter antenna surface distortions
NASA Technical Reports Server (NTRS)
Kiedron, K.; Chian, C. T.; Chuang, K. L.
1987-01-01
Statistical analysis of surface distortions of the 70 meter NASA/JPL antenna, located at Goldstone, was performed. The purpose of this analysis is to verify whether deviations due to gravity loading can be treated as quasi-random variables with normal distribution. Histograms of the RF pathlength error distribution for several antenna elevation positions were generated. The results indicate that the deviations from the ideal antenna surface are not normally distributed. The observed density distribution for all antenna elevation angles is taller and narrower than the normal density, which results in large positive values of kurtosis and a significant amount of skewness. The skewness of the distribution changes from positive to negative as the antenna elevation changes from zenith to horizon.
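The normality check described above rests on the third and fourth standardized moments. A minimal, generic computation (not the JPL analysis code) looks like this — a density "taller and narrower" than the Gaussian yields positive excess kurtosis:

```python
def skewness_and_excess_kurtosis(xs):
    # Standardized third and fourth central moments; both are ~0 for a
    # normal distribution, so large values flag non-Gaussian deviations.
    n = len(xs)
    mean = sum(xs) / n
    sd = (sum((x - mean) ** 2 for x in xs) / n) ** 0.5
    skew = sum(((x - mean) / sd) ** 3 for x in xs) / n
    kurt = sum(((x - mean) / sd) ** 4 for x in xs) / n - 3.0  # excess
    return skew, kurt
```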
Ability of near infrared spectroscopy to monitor air-dry density distribution and variation of wood
Brian K. Via; Chi-Leung So; Todd F. Shupe; Michael Stine; Leslie H. Groom
2005-01-01
Process control of wood density with near infrared spectroscopy (NIR) would be useful for pulp mills that need to maximize pulp yield without compromising paper strength properties. If models developed from the absorbance at wavelengths in the NIR region could provide density histograms, fiber supply personnel could monitor chip density variation as the chips enter the...
Differentially Private Synthesization of Multi-Dimensional Data using Copula Functions
Li, Haoran; Xiong, Li; Jiang, Xiaoqian
2014-01-01
Differential privacy has recently emerged in private statistical data release as one of the strongest privacy guarantees. Most of the existing techniques that generate differentially private histograms or synthetic data only work well for single dimensional or low-dimensional histograms. They become problematic for high dimensional and large domain data due to increased perturbation error and computation complexity. In this paper, we propose DPCopula, a differentially private data synthesization technique using Copula functions for multi-dimensional data. The core of our method is to compute a differentially private copula function from which we can sample synthetic data. Copula functions are used to describe the dependence between multivariate random vectors and allow us to build the multivariate joint distribution using one-dimensional marginal distributions. We present two methods for estimating the parameters of the copula functions with differential privacy: maximum likelihood estimation and Kendall’s τ estimation. We present formal proofs for the privacy guarantee as well as the convergence property of our methods. Extensive experiments using both real datasets and synthetic datasets demonstrate that DPCopula generates highly accurate synthetic multi-dimensional data with significantly better utility than state-of-the-art techniques. PMID:25405241
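The copula-sampling core that DPCopula builds on can be sketched as below — a plain (non-private) Gaussian copula: correlated normals are mapped to uniforms via the normal CDF and then through arbitrary one-dimensional inverse marginal CDFs. The function names and the two-dimensional restriction are illustrative; the paper's contribution, estimating the copula parameters under differential privacy, is not reproduced here:

```python
import random
import statistics

def sample_gaussian_copula(rho, n, inv_cdf_x, inv_cdf_y, seed=0):
    # Draw n pairs whose dependence follows a Gaussian copula with
    # correlation rho and whose marginals are set by the inverse CDFs.
    rng = random.Random(seed)
    nd = statistics.NormalDist()
    pairs = []
    for _ in range(n):
        z1 = rng.gauss(0.0, 1.0)
        z2 = rho * z1 + (1.0 - rho ** 2) ** 0.5 * rng.gauss(0.0, 1.0)
        u, v = nd.cdf(z1), nd.cdf(z2)   # correlated uniforms on (0, 1)
        pairs.append((inv_cdf_x(u), inv_cdf_y(v)))
    return pairs
```

Passing, say, `inv_cdf_x = lambda u: -math.log(1 - u)` gives an exponential marginal while the dependence structure stays Gaussian.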
A Simple Model of Cirrus Horizontal Inhomogeneity and Cloud Fraction
NASA Technical Reports Server (NTRS)
Smith, Samantha A.; DelGenio, Anthony D.
1998-01-01
A simple model of horizontal inhomogeneity and cloud fraction in cirrus clouds has been formulated on the basis that all internal horizontal inhomogeneity in the ice mixing ratio is due to variations in cloud depth, which are assumed to be Gaussian. The use of such a model was justified by the observed relationship between the normalized variability of the ice water mixing ratio (and extinction) and the normalized variability of cloud depth. Using radar cloud depth data as input, the model reproduced well the in-cloud ice water mixing ratio histograms obtained from horizontal runs during the FIRE2 cirrus campaign. For totally overcast cases the histograms were almost Gaussian but, for cloud fractions below 90%, changed to exponential distributions peaking at the lowest nonzero ice value as cloud fraction decreased. Cloud fractions predicted by the model were always within 28% of the observed values. The predicted average ice water mixing ratios were within 34% of the observed values. This model could be used in a GCM to produce the ice mixing ratio probability distribution function and to estimate cloud fraction. It requires as input only basic meteorological parameters, the depth of the saturated layer, and the standard deviation of cloud depth.
Chromosome number variation in two antipodean floras.
Peruzzi, Lorenzo; Dawson, Murray I; Bedini, Gianni
2011-01-01
We compared chromosome number (CN) variation in the nearly antipodean Italian and New Zealand floras to verify (i) whether patterns of variation reflect their similar latitudinal ranges or their different biogeographic/taxonomic contexts, (ii) if any differences are equally distributed across major taxa/lineages and (iii) if the frequency, number and taxonomic distribution of B-chromosomes differ between the two countries. We compared two datasets comprising 3426 (Italy) and 2525 (New Zealand) distinct cytotypes. We also compared a subset based on taxonomic orders and superimposed them onto a phylogeny of vascular plants. We used standard statistics, histograms, and either analysis of variance or Kruskal-Wallis tests to analyse the data. Mean CN of the vascular New Zealand flora is about twice that of Italy. For most orders, mean CN values for New Zealand are higher than those of the Italian flora and the differences are statistically significant. Further differences in CN variation among the orders and main clades that we studied, irrespective of geographical distinctions, are revealed. No correlation was found between chromosome and B-chromosome number. Mean CN of the whole New Zealand dataset is about twice that of the Italian flora. This suggests that extensive polyploidization played a major role in the evolution of the New Zealand vascular flora that is characterized by a rate of high endemism. Our results show that the hypothesis of a polyploid increase proportional to distance from the Equator cannot be applied to territories with the same latitudinal ranges but placed in different hemispheres. We suggest that bioclimatic gradients, rather than or in addition to latitudinal gradients, might account for a polyploidy increase. Our data also suggest that any adaptive role of B-chromosomes at geographic scale may be sought in their frequency rather than in their number.
The Kepler Light Curves of AGN: A Detailed Analysis
Smith, Krista Lynne; Mushotzky, Richard F.; Boyd, Patricia T.; ...
2018-04-25
Here, we present a comprehensive analysis of 21 light curves of Type 1 active galactic nuclei (AGN) from the Kepler spacecraft. First, we describe the necessity and development of a customized pipeline for treating Kepler data of stochastically variable sources like AGN. We then present the light curves, power spectral density functions (PSDs), and flux histograms. The light curves display an astonishing variety of behaviors, many of which would not be detected in ground-based studies, including switching between distinct flux levels. Six objects exhibit PSD flattening at characteristic timescales that roughly correlate with black hole mass. These timescales are consistent with orbital timescales or free-fall accretion timescales. We check for correlations of variability and high-frequency PSD slope with accretion rate, black hole mass, redshift, and luminosity. We find that bolometric luminosity is anticorrelated with both variability and steepness of the PSD slope. We do not find evidence of the linear rms–flux relationships or lognormal flux distributions found in X-ray AGN light curves, indicating that reprocessing is not a significant contributor to optical variability at the 0.1%–10% level.
NASA Astrophysics Data System (ADS)
Dong, Yang; He, Honghui; He, Chao; Ma, Hui
2016-10-01
Polarized light is sensitive to the microstructure of biological tissues and can be used to detect physiological changes. Meanwhile, spectral features of the scattered light also provide abundant microstructural information about tissues. In this paper, we acquire backscattering polarization Mueller matrix images of bovine skeletal muscle tissue over a 24-hour period and analyze their multispectral behavior using quantitative Mueller matrix parameters. During rigor mortis and proteolysis of the muscle samples, multispectral frequency distribution histograms (FDHs) of the Mueller matrix elements reveal rich qualitative structural information. In addition, we analyze the temporal variations of the sample using multispectral Mueller matrix transformation (MMT) parameters. The experimental results indicate that the different stages of rigor mortis and proteolysis in bovine skeletal muscle samples can be distinguished by these MMT parameters. The results presented in this work show that, combined with the multispectral technique, the FDHs and MMT parameters can characterize microstructural variation in skeletal muscle tissues. These techniques have the potential to serve as tools for quantitative assessment of meat quality in the food industry.
Decimetre Waves and Cerebellar Cortex Key Element - Purkinje Cells
NASA Astrophysics Data System (ADS)
Maharramov, Akif A.
2007-04-01
Acute experiments were carried out on decerebrated and anaesthetized adult cats exposed to decimetre-range microwaves (DRM; λ = 65 cm, exposure duration at least 10 min) delivered by a 2-cm-radius contact applicator placed on the temporal part of the head, with the cerebellum at the centre of the irradiation projection, using the portable physiotherapeutic apparatus "Romashka". Impulse activity of Purkinje cells (PC) was recorded extracellularly with glass microelectrodes. Statistical and computational analyses of PC activity were based on computer-generated histograms characterizing the distribution of interimpulse intervals (II) between the electrical discharges of a neuron by their duration. The results revealed the PC reaction to DRM as a succession of electrophysiological changes: first a decrease in the duration of the known "inhibitory pause", then an increase in the frequency of "simple spikes" followed by "complex spikes", and finally an increase in the duration of "big interimpulse intervals", a parameter introduced here for the first time. This sequence illustrates the graded ("evolutionary") nature of the interaction between electromagnetic fields and living objects.
Mori, Toshifumi; Hamers, Robert J; Pedersen, Joel A; Cui, Qiang
2014-07-17
Motivated by specific applications and the recent work of Gao and co-workers on integrated tempering sampling (ITS), we have developed a novel sampling approach referred to as integrated Hamiltonian sampling (IHS). IHS is straightforward to implement and complementary to existing methods for free energy simulation and enhanced configurational sampling. The method carries out sampling using an effective Hamiltonian constructed by integrating the Boltzmann distributions of a series of Hamiltonians. By judiciously selecting the weights of the different Hamiltonians, one achieves rapid transitions among the energy landscapes that underlie different Hamiltonians and therefore an efficient sampling of important regions of the conformational space. Along this line, IHS shares similar motivations as the enveloping distribution sampling (EDS) approach of van Gunsteren and co-workers, although the ways that distributions of different Hamiltonians are integrated are rather different in IHS and EDS. Specifically, we report efficient ways for determining the weights using a combination of histogram flattening and weighted histogram analysis approaches, which make it straightforward to include many end-state and intermediate Hamiltonians in IHS so as to enhance its flexibility. Using several relatively simple condensed phase examples, we illustrate the implementation and application of IHS as well as potential developments for the near future. The relation of IHS to several related sampling methods such as Hamiltonian replica exchange molecular dynamics and λ-dynamics is also briefly discussed.
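The core construction — an effective Hamiltonian whose Boltzmann factor is an integrated (weighted) sum of the component Hamiltonians' Boltzmann factors — can be sketched in toy form. The equal weights below are arbitrary; the paper's point is precisely that determining good weights (via histogram flattening and WHAM-type analysis) is what makes the method practical:

```python
import math

def effective_potential(x, potentials, weights, beta=1.0):
    # U_eff(x) = -(1/beta) * ln( sum_k w_k * exp(-beta * U_k(x)) ).
    # Every component well remains a low-energy region of U_eff, so
    # sampling on U_eff crosses barriers between wells more easily.
    z = sum(w * math.exp(-beta * u(x)) for u, w in zip(potentials, weights))
    return -math.log(z) / beta
```

With two shifted harmonic wells, the effective barrier between the minima is substantially lower than the bare barrier of either component potential.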
Acoustical study of classical Peking Opera singing.
Sundberg, Johan; Gu, Lide; Huang, Qiang; Huang, Ping
2012-03-01
Acoustic characteristics of classical opera singing differ considerably between the Western and the Chinese cultures. Singers in the classical Peking opera tradition specialize on one out of a limited number of standard roles. Audio and electroglottograph signals were recorded for four performers of the Old Man role and three performers of the Colorful Face role. Recordings were made of the singers' speech and when they sang recitatives and songs from their roles. Sound pressure level, fundamental frequency, and spectrum characteristics were analyzed. Histograms showing the distribution of fundamental frequency showed marked peaks for the songs, suggesting a scale-tone structure. Some of the intervals between these peaks were similar to those used in Western music. Vibrato rate was about 3.5 Hz, that is, considerably slower than in Western classical singing. Spectra of vibrato-free tones contained unbroken series of harmonic partials sometimes reaching up to 17,000 Hz. Long-term-average spectrum (LTAS) curves showed no trace of a singer's formant cluster. However, the Colorful Face role singers' LTAS showed a marked peak near 3300 Hz, somewhat similar to that found in Western pop music singers. The mean LTAS spectrum slope between 700 and 6000 Hz decreased by about 0.2 dB/octave per dB of equivalent sound level. Copyright © 2012 The Voice Foundation. Published by Mosby, Inc. All rights reserved.
Texton-based analysis of paintings
NASA Astrophysics Data System (ADS)
van der Maaten, Laurens J. P.; Postma, Eric O.
2010-08-01
The visual examination of paintings is traditionally performed by skilled art historians using their eyes. Recent advances in intelligent systems may support art historians in determining the authenticity or date of creation of paintings. In this paper, we propose a technique for the examination of brushstroke structure that views the wildly overlapping brushstrokes as texture. The analysis of the painting texture is performed with the help of a texton codebook, i.e., a codebook of small prototypical textural patches. The texton codebook can be learned from a collection of paintings. Our textural analysis technique represents paintings in terms of histograms that measure the frequency with which the textons in the codebook occur in the painting (so-called texton histograms). We present experiments that show the validity and effectiveness of our technique for textural analysis on a collection of digitized high-resolution reproductions of paintings by Van Gogh and his contemporaries. As texton histograms cannot easily be interpreted by art experts, the paper proposes two approaches to visualizing the results of the textural analysis. The first approach visualizes the similarities between the histogram representations of paintings by employing a recently proposed dimensionality reduction technique, called t-SNE. We show that t-SNE reveals a clear separation of paintings created by Van Gogh and those created by other painters. In addition, the period of creation is faithfully reflected in the t-SNE visualizations. The second approach visualizes the similarities and differences between paintings by highlighting regions in a painting in which the textural structure of the painting is unusual. We illustrate the validity of this approach by means of an experiment in which we highlight regions in a painting by Monet that are not very "Van Gogh-like". Taken together, we believe the tools developed in this study are well suited to assist art historians in their study of paintings.
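The texton-histogram representation described above can be sketched generically as follows (codebook learning, e.g. k-means over sampled patches, is omitted; names are illustrative):

```python
def texton_histogram(patches, codebook):
    # Assign each image patch to its nearest codebook texton (squared
    # Euclidean distance) and return the normalized frequency histogram.
    def d2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    hist = [0] * len(codebook)
    for p in patches:
        k = min(range(len(codebook)), key=lambda i: d2(p, codebook[i]))
        hist[k] += 1
    return [h / len(patches) for h in hist]
```

Paintings are then compared via distances between their normalized histograms, which is what feeds the t-SNE visualization.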
Long-range anticorrelations and non-Gaussian behavior of the heartbeat
NASA Technical Reports Server (NTRS)
Peng, C.-K.; Mietus, J.; Hausdorff, J. M.; Havlin, S.; Stanley, H. E.; Goldberger, A. L.
1993-01-01
We find that the successive increments in the cardiac beat-to-beat intervals of healthy subjects display scale-invariant, long-range anticorrelations (up to 10 exp 4 heart beats). Furthermore, we find that the histogram for the heartbeat intervals increments is well described by a Levy (1991) stable distribution. For a group of subjects with severe heart disease, we find that the distribution is unchanged, but the long-range correlations vanish. Therefore, the different scaling behavior in health and disease must relate to the underlying dynamics of the heartbeat.
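A toy diagnostic for the sign of the increment correlations is the lag autocorrelation of the increment series — negative at short lags for anticorrelated dynamics. This sketch does not reproduce the paper's scale-invariant analysis, which requires examining fluctuations across many time scales:

```python
def increment_autocorr(intervals, lag=1):
    # Autocorrelation of successive beat-to-beat interval increments;
    # a negative value indicates anticorrelation at the given lag.
    inc = [b - a for a, b in zip(intervals, intervals[1:])]
    n = len(inc)
    mean = sum(inc) / n
    var = sum((x - mean) ** 2 for x in inc) / n
    cov = sum((inc[i] - mean) * (inc[i + lag] - mean)
              for i in range(n - lag)) / (n - lag)
    return cov / var
```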
Xie, Xufen; Yan, Jiawei; Liang, Jinghong; Li, Jijun; Zhang, Meng; Mao, Bingwei
2013-10-01
We present quantum conductance measurements of germanium by means of an electrochemical scanning tunneling microscope (STM) break junction based on a jump-to-contact mechanism. Germanium nanowires between a platinum/iridium tip and different substrates were constructed to measure the quantum conductance. By applying appropriate potentials to the substrate and the tip, the process of heterogeneous contact and homogeneous breakage was realized. Typical conductance traces exhibit steps at 0.025 and 0.05 G0. The conductance histogram indicates that the conductance of germanium nanowires is located between 0.02 and 0.15 G0 in the low-conductance region and is free from the influence of substrate materials. However, the distribution of conductance plateaus is too discrete to display distinct peaks in the conductance histogram of the high-conductance region. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
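A conductance histogram of the kind analyzed above is the pooled distribution of conductance samples, in units of G0, over many break-junction traces. A generic sketch (values and binning are illustrative, matching the low-conductance window discussed in the abstract):

```python
def conductance_histogram(traces, bins=30, g_range=(0.0, 0.15)):
    # Pool conductance samples (in units of G0) from many traces into a
    # single histogram; peaks mark preferred junction configurations.
    lo, hi = g_range
    width = (hi - lo) / bins
    hist = [0] * bins
    for trace in traces:
        for g in trace:
            if lo <= g < hi:
                hist[int((g - lo) / width)] += 1
    return hist
```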
A new phase correction method in NMR imaging based on autocorrelation and histogram analysis.
Ahn, C B; Cho, Z H
1987-01-01
A new statistical approach to phase correction in NMR imaging is proposed. The proposed scheme consists of first- and zero-order phase corrections, each performed by inverse multiplication of the estimated phase error. The first-order error is estimated from the phase of the autocorrelation calculated from the complex-valued phase-distorted image, while the zero-order correction factor is extracted from the histogram of the phase distribution of the first-order-corrected image. Since all correction procedures are performed in the spatial domain after completion of data acquisition, no prior adjustments or additional measurements are required. The algorithm is applicable to most phase-sensitive NMR imaging techniques, including inversion recovery imaging, quadrature modulated imaging, spectroscopic imaging, and flow imaging. Experimental results with inversion recovery imaging as well as quadrature spectroscopic imaging are shown to demonstrate the usefulness of the algorithm.
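The two-stage scheme lends itself to a compact sketch in one dimension (the paper works on 2-D images; names and the simple mode-of-histogram estimator here are illustrative): the lag-1 autocorrelation of the complex signal yields the linear phase ramp, and a phase histogram of the ramp-corrected signal yields the constant offset.

```python
import cmath
import math

def estimate_phase_ramp(signal):
    # First-order error: the angle of the lag-1 autocorrelation of the
    # complex signal estimates the linear phase increment per sample.
    acf = sum(b * a.conjugate() for a, b in zip(signal, signal[1:]))
    return cmath.phase(acf)

def estimate_constant_phase(signal, bins=64):
    # Zero-order error: the peak of the histogram of per-sample phases.
    width = 2.0 * math.pi / bins
    hist = [0] * bins
    for z in signal:
        hist[int((cmath.phase(z) + math.pi) / width) % bins] += 1
    k = max(range(bins), key=hist.__getitem__)
    return -math.pi + (k + 0.5) * width  # centre of the winning bin
```

Correction then multiplies each sample by the inverse of the estimated phase, mirroring the "inverse multiplication" in the abstract.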
NASA Astrophysics Data System (ADS)
Ojima, Nobutoshi; Fujiwara, Izumi; Inoue, Yayoi; Tsumura, Norimichi; Nakaguchi, Toshiya; Iwata, Kayoko
2011-03-01
Uneven distribution of skin color is one of the biggest concerns about facial skin appearance. Recently, several techniques to analyze skin color have been introduced that separate skin color information into chromophore components, such as melanin and hemoglobin. However, there are few reports on quantitative analysis of unevenness of skin color that consider the type of chromophore, clusters of different sizes, and the concentration of each chromophore. We propose a new image analysis and simulation method based on chromophore analysis and spatial frequency analysis. This method is mainly composed of three techniques: independent component analysis (ICA) to extract hemoglobin and melanin chromophores from a single skin color image; an image pyramid technique that decomposes each chromophore into multi-resolution images, which can be used for identifying different sizes of clusters or spatial frequencies; and analysis of the histogram obtained from each multi-resolution image to extract unevenness parameters. As an application of the method, we also introduce an image processing technique to change the unevenness of the melanin component. As a result, the method showed high capability to analyze the unevenness of each skin chromophore: 1) vague unevenness on skin could be discriminated from noticeable pigmentation such as freckles or acne; 2) by analyzing the unevenness parameters obtained from each multi-resolution image for Japanese women, age-related changes were observed in the parameters of middle spatial frequency; 3) an image processing system modulating the parameters was proposed to change the unevenness of skin images along the axis of the obtained age-related change in real time.
Predicting the Valence of a Scene from Observers’ Eye Movements
R.-Tavakoli, Hamed; Atyabi, Adham; Rantanen, Antti; Laukka, Seppo J.; Nefti-Meziani, Samia; Heikkilä, Janne
2015-01-01
Multimedia analysis benefits from understanding the emotional content of a scene in a variety of tasks such as video genre classification and content-based image retrieval. Recently, there has been an increasing interest in applying human bio-signals, particularly eye movements, to recognize the emotional gist of a scene such as its valence. In order to determine the emotional category of images using eye movements, the existing methods often learn a classifier using several features that are extracted from eye movements. Although it has been shown that eye movement is potentially useful for recognition of scene valence, the contribution of each feature is not well-studied. To address the issue, we study the contribution of features extracted from eye movements in the classification of images into pleasant, neutral, and unpleasant categories. We assess ten features and their fusion. The features are histogram of saccade orientation, histogram of saccade slope, histogram of saccade length, histogram of saccade duration, histogram of saccade velocity, histogram of fixation duration, fixation histogram, top-ten salient coordinates, and saliency map. We utilize machine learning approach to analyze the performance of features by learning a support vector machine and exploiting various feature fusion schemes. The experiments reveal that ‘saliency map’, ‘fixation histogram’, ‘histogram of fixation duration’, and ‘histogram of saccade slope’ are the most contributing features. The selected features signify the influence of fixation information and angular behavior of eye movements in the recognition of the valence of images. PMID:26407322
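As an illustration of the kind of features listed above, a histogram of saccade orientations can be computed from a fixation sequence as follows (a generic sketch, not the authors' pipeline; the bin count is arbitrary):

```python
import math

def saccade_orientation_histogram(fixations, bins=8):
    # Each consecutive fixation pair defines a saccade; bin its direction
    # angle into one of `bins` sectors and normalize the counts.
    width = 2.0 * math.pi / bins
    hist = [0] * bins
    for (x1, y1), (x2, y2) in zip(fixations, fixations[1:]):
        ang = math.atan2(y2 - y1, x2 - x1)          # in (-pi, pi]
        hist[int((ang + math.pi) / width) % bins] += 1
    n = len(fixations) - 1
    return [h / n for h in hist]
```

Such per-trial histograms are then stacked into feature vectors for a classifier such as an SVM.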
Encoding of a spectrally-complex communication sound in the bullfrog's auditory nerve.
Schwartz, J J; Simmons, A M
1990-02-01
1. A population study of eighth nerve responses in the bullfrog, Rana catesbeiana, was undertaken to analyze how the eighth nerve codes the complex spectral and temporal structure of the species-specific advertisement call over a biologically-realistic range of intensities. Synthetic advertisement calls were generated by Fourier synthesis and presented to individual eighth nerve fibers of anesthetized bullfrogs. Fiber responses were analyzed by calculating rate responses based on post-stimulus-time (PST) histograms and temporal responses based on Fourier transforms of period histograms. 2. At stimulus intensities of 70 and 80 dB SPL, normalized rate responses provide a fairly good representation of the complex spectral structure of the stimulus, particularly in the low- and mid-frequency range. At higher intensities, rate responses saturate, and very little of the spectral structure of the complex stimulus can be seen in the profile of rate responses of the population. 3. Both AP and BP fibers phase-lock strongly to the fundamental (100 Hz) of the complex stimulus. These effects are relatively resistant to changes in stimulus intensity. Only a small number of fibers synchronize to the low-frequency spectral energy in the stimulus. The underlying spectral complexity of the stimulus is not accurately reflected in the timing of fiber firing, presumably because firing is 'captured' by the fundamental frequency. 4. Plots of average localized synchronized rate (ALSR), which combine both spectral and temporal information, show a similar, low-pass shape at all stimulus intensities. ALSR plots do not generally provide an accurate representation of the structure of the advertisement call. 5. The data suggest that anuran peripheral auditory fibers may be particularly sensitive to the amplitude envelope of sounds.
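The degree of phase-locking read off from period histograms is commonly quantified by the vector strength: each spike is treated as a unit vector at its stimulus phase and the vectors are averaged. A minimal sketch (equivalent, up to normalization, to taking the fundamental component of the Fourier transform of the period histogram):

```python
import math

def vector_strength(spike_times, freq):
    # 1.0 when every spike falls at the same stimulus phase; ~0 when
    # spike phases are uniformly distributed over the stimulus cycle.
    c = sum(math.cos(2.0 * math.pi * freq * t) for t in spike_times)
    s = sum(math.sin(2.0 * math.pi * freq * t) for t in spike_times)
    return math.hypot(c, s) / len(spike_times)
```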
Histogram analysis of T2*-based pharmacokinetic imaging in cerebral glioma grading.
Liu, Hua-Shan; Chiang, Shih-Wei; Chung, Hsiao-Wen; Tsai, Ping-Huei; Hsu, Fei-Ting; Cho, Nai-Yu; Wang, Chao-Ying; Chou, Ming-Chung; Chen, Cheng-Yu
2018-03-01
To investigate the feasibility of histogram analysis of the T2*-based permeability parameter volume transfer constant (Ktrans) for glioma grading and to explore the diagnostic performance of the histogram analysis of Ktrans and blood plasma volume (vp). We recruited 31 and 11 patients with high- and low-grade gliomas, respectively. The histogram parameters of Ktrans and vp, derived from first-pass pharmacokinetic modeling based on T2* dynamic susceptibility-weighted contrast-enhanced perfusion-weighted magnetic resonance imaging (T2* DSC-PW-MRI) of the entire tumor volume, were evaluated for differentiating glioma grades. Histogram parameters of Ktrans and vp showed significant differences between high- and low-grade gliomas and exhibited significant correlations with tumor grades. The mean Ktrans derived from T2* DSC-PW-MRI had the highest sensitivity and specificity for differentiating high-grade gliomas from low-grade gliomas compared with the other histogram parameters of Ktrans and vp. Histogram analysis of T2*-based pharmacokinetic imaging is useful for cerebral glioma grading. The histogram parameters of the entire-tumor Ktrans measurement can provide increased accuracy with additional information regarding microvascular permeability changes for identifying high-grade brain tumors. Copyright © 2017 Elsevier B.V. All rights reserved.
Active optics - The NTT and the future
NASA Astrophysics Data System (ADS)
Wilson, R. N.; Franza, F.; Giordano, P.; Noethe, L.; Tarenghi, M.
1988-09-01
An account is given of the essential design features and advantages of ESO's NTT optics, which constitute an active telescope in which the optical correction process (whose results are exhibited in histograms) can be performed at will, on-line, so that the intrinsic quality of the telescope can be fully realized. This technology allows the relaxation of low-spatial-frequency (long-wave) manufacturing tolerances and accomplishes automatic maintenance against errors due to misalignment of the optics. Linearity, convergence, and orthogonality laws are used by the optical correction algorithm.
1980-09-01
gates stuck-at-zero or stuck-at-one, one at a time. As indicated, there is little difference between the function most affected and the function...one at a time. These two histograms are not arranged in descending order. Note that there is little difference between the function most affected and...the function least affected for either case. Even though there is little difference in the frequency with which each instruction will be affected, the
Information granules in image histogram analysis.
Wieclawek, Wojciech
2018-04-01
A concept of granular computing employed in intensity-based image enhancement is discussed. First, a weighted granular computing idea is introduced. Then, the implementation of this concept in the image processing area is presented. Finally, multidimensional granular histogram analysis is introduced. The proposed approach is dedicated to digital images, especially medical images acquired by Computed Tomography (CT). Like histogram equalization, this method is based on image histogram analysis. Yet, unlike the histogram equalization technique, it works on a selected range of pixel intensities and is controlled by two parameters. Performance is tested on anonymized clinical CT series. Copyright © 2017 Elsevier Ltd. All rights reserved.
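The idea of equalizing only a selected intensity range under two control parameters can be sketched as follows. This is a minimal generic illustration, not the paper's granular-computing method; `lo` and `hi` stand in for the two control parameters:

```python
import numpy as np

def ranged_equalize(img, lo, hi, levels=256):
    """Equalize only intensities in [lo, hi]; pixels outside pass through.
    lo and hi play the role of the two control parameters."""
    img = np.asarray(img)
    mask = (img >= lo) & (img <= hi)
    vals = img[mask]
    hist, _ = np.histogram(vals, bins=levels, range=(lo, hi))
    cdf = np.cumsum(hist).astype(float)
    cdf /= cdf[-1]  # normalized CDF over the selected range only
    out = img.astype(float).copy()
    # Map selected pixels back onto [lo, hi] through the range CDF.
    idx = np.clip(((vals - lo) * levels / (hi - lo)).astype(int), 0, levels - 1)
    out[mask] = lo + cdf[idx] * (hi - lo)
    return out
```

Because pixels outside `[lo, hi]` are untouched, contrast is redistributed only where it is wanted, e.g. within the soft-tissue window of a CT image.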
Stochastic HKMDHE: A multi-objective contrast enhancement algorithm
NASA Astrophysics Data System (ADS)
Pratiher, Sawon; Mukhopadhyay, Sabyasachi; Maity, Srideep; Pradhan, Asima; Ghosh, Nirmalya; Panigrahi, Prasanta K.
2018-02-01
This contribution proposes a novel extension of the existing `Hyper Kurtosis based Modified Duo-Histogram Equalization' (HKMDHE) algorithm for multi-objective contrast enhancement of biomedical images. A modified objective function has been formulated by jointly optimizing the individual histogram equalization objectives. The adequacy of the proposed methodology with respect to image quality metrics such as brightness preservation, peak signal-to-noise ratio (PSNR), Structural Similarity Index (SSIM) and the universal image quality metric has been experimentally validated. A performance comparison of the proposed Stochastic HKMDHE with existing histogram equalization methodologies such as Global Histogram Equalization (GHE) and Contrast Limited Adaptive Histogram Equalization (CLAHE) is given for comparative evaluation.
Infrared image segmentation method based on spatial coherence histogram and maximum entropy
NASA Astrophysics Data System (ADS)
Liu, Songtao; Shen, Tongsheng; Dai, Yao
2014-11-01
In order to segment the target well and suppress background noise effectively, an infrared image segmentation method based on a spatial coherence histogram and maximum entropy is proposed. First, the spatial coherence histogram is constructed by weighting the importance of the different positions of pixels with the same gray level, obtained by computing their local density. Then, after enhancing the image with the spatial coherence histogram, the 1D maximum entropy method is used to segment the image. The method not only yields better segmentation results but also runs faster than traditional 2D histogram-based segmentation methods.
VizieR Online Data Catalog: CH4 and hot methane continuum hybrid line list (Yurchenko+, 2017)
NASA Astrophysics Data System (ADS)
Yurchenko, S. N.; Amundsen, D. S.; Tennyson, J.; Waldmann, I. P.
2017-07-01
The states file ch4_e50.dat contains a list of rovibrational states. Each state is labelled with: nine normal mode vibrational quantum numbers and the vibrational symmetry; three rotational quantum numbers including the total angular momentum J and rotational symmetry; the total symmetry quantum number Gamma and the running number in the same (J,Gamma) block. In addition there are nine local mode vibrational numbers and the largest coefficient used to assign the state in question. Each rovibrational state has a unique number, which is the number of the row in which it appears in the file. This number is the means by which the state is related to the second part of the data system, the transitions files. The total degeneracy is also given to facilitate the intensity calculations. Because of their size, the transitions are listed in 120 separate files, each containing all the transitions in a 100 cm-1 frequency range. These transition files t_*.dat contain the strong methane lines and consist of four columns: the reference number in the energy file of the upper state, that of the lower state, the Einstein A coefficient of the transition, and the transition wavenumber. These entries are ordered by increasing frequency. The name of the file includes the lowest frequency in the range; thus the t_00500.dat file contains all the transitions in the frequency range 500-600 cm-1. 19 histogram xYYYYK.dat files contain CH4 super-lines representing the continuum computed at the temperature T=YYYYK using R=1000000 (7090081 super-lines each), covering the wavenumber range from 10 to 12000 cm-1. The energy file, the transitions files and the histogram files are bzipped, and need to be extracted before use. The pressure broadening parameters used in the calculations are listed in broad.dat. A programme ExoCross to generate synthetic spectra from these line lists can be obtained at www.exomol.com. (4 data files).
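A transitions file of the kind described above can be read with a few lines of code. The sketch below assumes whitespace-separated columns (upper-state ID, lower-state ID, Einstein A coefficient, transition wavenumber in cm-1); the sample content is invented for illustration, not real line-list data:

```python
import io

# Stand-in for an extracted t_00500.dat file (illustrative values only).
sample = io.StringIO(
    "12 7 1.23e-02 500.104\n"
    "15 3 4.56e-03 500.377\n"
    "18 9 7.89e-01 599.912\n"
)

transitions = []
for line in sample:
    up, low, a_coef, nu = line.split()
    transitions.append((int(up), int(low), float(a_coef), float(nu)))

# Every line in t_00500.dat should fall in the 500-600 cm-1 window.
assert all(500.0 <= nu < 600.0 for *_, nu in transitions)
print(len(transitions))  # 3
```

The integer IDs in the first two columns index rows of the states file, which is how a transition is tied back to its upper- and lower-state quantum numbers.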
ERIC Educational Resources Information Center
Vandermeulen, H.; DeWreede, R. E.
1983-01-01
Presents a histogram drawing program which sorts real numbers in up to 30 categories. Entered data are sorted and saved in a text file which is then used to generate the histogram. Complete Applesoft program listings are included. (JN)
Local breast density assessment using reacquired mammographic images.
García, Eloy; Diaz, Oliver; Martí, Robert; Diez, Yago; Gubern-Mérida, Albert; Sentís, Melcior; Martí, Joan; Oliver, Arnau
2017-08-01
The aim of this paper is to evaluate the spatial glandular volumetric tissue distribution as well as the density measures provided by Volpara™ using a dataset composed of repeated pairs of mammograms, where each pair was acquired in a short time frame and in a slightly changed position of the breast. We conducted a retrospective analysis of 99 pairs of repeatedly acquired full-field digital mammograms from 99 different patients. The commercial software Volpara™ Density Maps (Volpara Solutions, Wellington, New Zealand) is used to estimate both the global and the local glandular tissue distribution in each image. The global measures provided by Volpara™, such as breast volume, volume of glandular tissue, and volumetric breast density, are compared between the two acquisitions. The evaluation of the local glandular information is performed using histogram similarity metrics, such as intersection and correlation, and local measures, such as statistics from the difference image and local gradient correlation measures. Global measures showed a high correlation (breast volume R=0.99, volume of glandular tissue R=0.94, and volumetric breast density R=0.96) regardless of the anode/filter material. Similarly, the histogram intersection and correlation metrics showed that, for each pair, the images share a high degree of information. Regarding the local distribution of glandular tissue, small changes in the angle of view do not yield significant differences in the glandular pattern, whilst changes in breast thickness between the two acquisitions affect the spatial parenchymal distribution. This study indicates that Volpara™ Density Maps is reliable in estimating the local glandular tissue distribution and can be used for its assessment and follow-up. Volpara™ Density Maps is robust to small variations of the acquisition angle and to the beam energy, although divergences arise due to different breast compression conditions. Copyright © 2017 Elsevier B.V. All rights reserved.
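The two histogram similarity metrics named above, intersection and correlation, have standard definitions and can be sketched in a few lines. The repeat-acquisition data below are synthetic stand-ins, not the study's mammograms:

```python
import numpy as np

def hist_intersection(h1, h2):
    """Intersection of two normalized histograms: 1.0 means identical."""
    h1, h2 = h1 / h1.sum(), h2 / h2.sum()
    return np.minimum(h1, h2).sum()

def hist_correlation(h1, h2):
    """Pearson correlation between corresponding bin counts."""
    return np.corrcoef(h1, h2)[0, 1]

# Hypothetical density values from two repeat acquisitions: the second is
# the first plus a small repositioning perturbation.
rng = np.random.default_rng(1)
a = rng.normal(100, 15, 10000)
b = a + rng.normal(0, 2, 10000)
ha, edges = np.histogram(a, bins=64, range=(40, 160))
hb, _ = np.histogram(b, bins=64, range=(40, 160))
print(hist_intersection(ha.astype(float), hb.astype(float)))
print(hist_correlation(ha, hb))
```

Both metrics stay close to 1 for the nearly identical pair, which is the behavior the abstract reports for repeated acquisitions.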
Bin recycling strategy for improving the histogram precision on GPU
NASA Astrophysics Data System (ADS)
Cárdenas-Montes, Miguel; Rodríguez-Vázquez, Juan José; Vega-Rodríguez, Miguel A.
2016-07-01
Histograms are an easily comprehensible way to present data and analyses. In the current scientific context, with access to large volumes of data, the processing time for building histograms has increased dramatically. For this reason, parallel construction is necessary to alleviate the impact of the processing time on analysis activities. In this scenario, GPU computing is becoming widely used to reduce the processing time of histogram construction to affordable levels. Along with the pressure on processing time, implementations are also stressed on bin-count accuracy. Accuracy issues arising from the particularities of the implementations are not usually taken into consideration when building histograms from very large data sets. In this work, a bin recycling strategy to create an accuracy-aware implementation for building histograms on GPU is presented. In order to evaluate the approach, this strategy was applied to the computation of the three-point angular correlation function, which is a relevant function in Cosmology for the study of the Large Scale Structure of the Universe. As a consequence of the study, a high-accuracy implementation for histogram construction on GPU is proposed.
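The core pattern of parallel histogram construction, private per-worker histograms merged by integer addition, can be illustrated on the CPU. This thread-based sketch stands in for a GPU kernel and is not the paper's bin-recycling strategy; the point is that 64-bit integer bins make the merge exact, avoiding the accuracy concerns the abstract raises:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def partial_hist(chunk, bins, lo, hi):
    # Each worker builds a private histogram with 64-bit integer bins,
    # so bin counts never lose precision during accumulation.
    h, _ = np.histogram(chunk, bins=bins, range=(lo, hi))
    return h.astype(np.int64)

def parallel_hist(data, bins=128, lo=0.0, hi=1.0, workers=4):
    chunks = np.array_split(data, workers)
    with ThreadPoolExecutor(workers) as ex:
        parts = ex.map(lambda c: partial_hist(c, bins, lo, hi), chunks)
    return sum(parts)  # exact merge: integer addition of partial bins

rng = np.random.default_rng(7)
data = rng.random(1_000_000)
h_par = parallel_hist(data)
h_ref, _ = np.histogram(data, bins=128, range=(0.0, 1.0))
print(np.array_equal(h_par, h_ref))  # True: identical to the serial build
```

On a GPU the same roles are played by per-block shared-memory histograms and atomic adds; accuracy problems typically enter only when floating-point or narrow-integer accumulators are used instead.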
A Bayesian Analysis of Scale-Invariant Processes
2012-01-01
Earth Grid (EASE-Grid). The NED raster elevation data of one arc-second resolution (30 m) over the continental US are derived from multiple satellites...empirical and ME distributions, yet ensuring computational efficiency. Instead of computing empirical histograms from large amounts of data, only some
Uncertainty in Damage Detection, Dynamic Propagation and Just-in-Time Networks
2015-08-03
estimated parameter uncertainty in dynamic data sets; high order compact finite difference schemes for Helmholtz equations with discontinuous wave numbers...delay differential equations with a Gamma distributed delay. We found that with the same population size the histogram plots for the solution to the...schemes for Helmholtz equations with discontinuous wave numbers across interfaces. • We carried out numerical sensitivity analysis with respect to
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heczko, S; McAuley, GA; Slater, JM
Purpose: To evaluate the impact of titanium and surgical stainless steel implants on the microscopic dose distribution in proton treatment plans. Methods: Geant4 Monte Carlo simulations were used to analyze the microdosimetric distribution of proton radiation in the vicinity of 3.1 mm thick CP Grade 4 titanium (Ti) or 316 stainless steel (SS316) plates in a water phantom. Additional simulations were performed using either water, or water with a density equivalent to the respective metals (Tiwater, SS316water) (to reflect common practice in treatment planning). Implants were placed at the COM of SOBPs of 157 MeV (range of ∼15 cm in water) protons with 30 or 60 mm modulation. Primary and secondary particle dose and fluence, frequency-weighted and dose-weighted average lineal energy, average radiation quality factor, dose equivalent, and energy deposition histograms in the plate vicinity were compared. Results: Preliminary results show that frequency-weighted (yf) and dose-weighted (yd) lineal energy was increased downstream of the Ti plate (yf = 3.1 keV/µm; yd = 5.5 keV/µm) and Tiwater (yf = 4.1 keV/µm; yd = 6.8 keV/µm) compared to that of water (i.e., the absence of a plate) (yf = 2.5 keV/µm; yd = 4.5 keV/µm). In addition, downstream proton dose deposition was also elevated due to the presence of the Ti plate or Tiwater. The additional dose deposited at higher lineal energy implies that tissues downstream of the plate will receive a higher dose equivalent. Detailed analyses of the Ti, Tiwater, SS316, and SS316water simulations will be presented. Conclusion: The presence of high-density materials introduces changes in the spatial distribution of radiation in the vicinity of an implant. Further work quantifying these effects could be incorporated into future treatment planning systems, resulting in more accurate treatment plans. This project was sponsored with funding from the Department of Defense (DOD # W81XWH-10-2-0192).
de Perrot, T; Lenoir, V; Domingo Ayllón, M; Dulguerov, N; Pusztaszeri, M; Becker, M
2017-11-01
Head and neck squamous cell carcinoma associated with human papillomavirus infection represents a distinct tumor entity. We hypothesized that diffusion phenotypes based on the histogram analysis of ADC values reflect distinct degrees of tumor heterogeneity in human papillomavirus-positive and human papillomavirus-negative head and neck squamous cell carcinomas. One hundred five consecutive patients (mean age, 64 years; range, 45-87 years) with primary oropharyngeal (n=52) and oral cavity (n=53) head and neck squamous cell carcinoma underwent MR imaging with anatomic and diffusion-weighted sequences (b=0, b=1000 s/mm^2, monoexponential ADC calculation). The collected tumor voxels from the contoured ROIs provided histograms from which position, dispersion, and form parameters were computed. Histogram data were correlated with histopathology, p16 immunohistochemistry, and polymerase chain reaction for human papillomavirus DNA. There were 21 human papillomavirus-positive and 84 human papillomavirus-negative head and neck squamous cell carcinomas. At histopathology, human papillomavirus-positive cancers were more often nonkeratinizing (13/21, 62%) than human papillomavirus-negative cancers (19/84, 23%; P = .001), and their mitotic index was higher (71% versus 49%; P = .005). ROI-based mean and median ADCs were significantly lower in human papillomavirus-positive (1014 ± 178 × 10^-6 mm^2/s and 970 ± 187 × 10^-6 mm^2/s, respectively) than in human papillomavirus-negative tumors (1184 ± 168 × 10^-6 mm^2/s and 1161 ± 175 × 10^-6 mm^2/s, respectively; P < .001), whereas excess kurtosis and skewness were significantly higher in human papillomavirus-positive (1.934 ± 1.386 and 0.923 ± 0.510, respectively) than in human papillomavirus-negative tumors (0.643 ± 0.982 and 0.399 ± 0.516, respectively; P < .001).
Human papillomavirus-negative head and neck squamous cell carcinoma had symmetric normally distributed ADC histograms, which corresponded histologically to heterogeneous tumors with variable cellularity, high stromal component, keratin pearls, and necrosis. Human papillomavirus-positive head and neck squamous cell carcinomas had leptokurtic skewed right histograms, which corresponded to homogeneous tumors with back-to-back densely packed cells, scant stromal component, and scattered comedonecrosis. Diffusion phenotypes of human papillomavirus-positive and human papillomavirus-negative head and neck squamous cell carcinomas show significant differences, which reflect their distinct degree of tumor heterogeneity. © 2017 by American Journal of Neuroradiology.
NASA Technical Reports Server (NTRS)
Dasarathy, B. V.
1976-01-01
An algorithm is proposed for dimensionality reduction in the context of clustering techniques based on histogram analysis. The approach is based on an evaluation of the hills and valleys in the unidimensional histograms along the different features and provides an economical means of assessing the significance of the features in a nonparametric unsupervised data environment. The method has relevance to remote sensing applications.
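The hills-and-valleys idea can be sketched as a count of local maxima and minima in each feature's unidimensional histogram; a feature whose histogram is multimodal (several hills) is more likely to separate clusters. This is a minimal generic sketch, not the original algorithm's scoring rule:

```python
import numpy as np

def hills_and_valleys(hist):
    """Count local maxima (hills) and minima (valleys) of a lightly
    smoothed unidimensional histogram."""
    h = np.convolve(hist, np.ones(3) / 3, mode="same")  # light smoothing
    hills = valleys = 0
    for i in range(1, len(h) - 1):
        if h[i] > h[i - 1] and h[i] > h[i + 1]:
            hills += 1
        elif h[i] < h[i - 1] and h[i] < h[i + 1]:
            valleys += 1
    return hills, valleys

# A feature with a bimodal histogram suggests cluster structure; a
# unimodal one does not.
rng = np.random.default_rng(3)
bimodal = np.concatenate([rng.normal(-3, 0.5, 2000), rng.normal(3, 0.5, 2000)])
unimodal = rng.normal(0, 1, 4000)
hb, _ = np.histogram(bimodal, bins=40, range=(-6, 6))
hu, _ = np.histogram(unimodal, bins=40, range=(-6, 6))
print(hills_and_valleys(hb)[0], hills_and_valleys(hu)[0])
```

Ranking features by hill count (and valley depth) gives the cheap, nonparametric significance assessment the abstract describes; low-scoring features are candidates for removal.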
Liang, He-Yue; Huang, Ya-Qin; Yang, Zhao-Xia; Ying-Ding; Zeng, Meng-Su; Rao, Sheng-Xiang
2016-07-01
To determine if magnetic resonance imaging (MRI) histogram analyses can help predict response to chemotherapy in patients with colorectal hepatic metastases, using the response evaluation criteria in solid tumours (RECIST 1.1) as the reference standard. Standard MRI including diffusion-weighted imaging (b=0, 500 s/mm(2)) was performed before chemotherapy in 53 patients with colorectal hepatic metastases. Histograms were computed for apparent diffusion coefficient (ADC) maps, arterial, and portal venous phase images; thereafter, mean, percentiles (1st, 10th, 50th, 90th, 99th), skewness, kurtosis, and variance were generated. Quantitative histogram parameters were compared between responders (partial and complete response, n=15) and non-responders (progressive and stable disease, n=38). Receiver operating characteristic (ROC) analyses were further performed for the significant parameters. The mean, 1st percentile, 10th percentile, 50th percentile, 90th percentile, and 99th percentile of the ADC maps were significantly lower in the responding group than in the non-responding group (p=0.000-0.002), with areas under the ROC curve (AUCs) of 0.76-0.82. The histogram parameters of the arterial and portal venous phases showed no significant difference (p>0.05) between the two groups. Histogram-derived parameters for ADC maps seem to be a promising tool for predicting response to chemotherapy in patients with colorectal hepatic metastases. • ADC histogram analyses can potentially predict chemotherapy response in colorectal liver metastases. • Lower histogram-derived parameters (mean, percentiles) for ADC tend to indicate good response. • MR enhancement histogram analyses are not reliable for predicting response.
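The histogram parameters listed above (mean, percentiles, variance, skewness, kurtosis) are standard descriptive statistics of the voxel distribution. A self-contained sketch, using invented ADC-like values rather than patient data:

```python
import numpy as np

def histogram_parameters(values):
    """Position, dispersion and form parameters of an ADC-style histogram."""
    v = np.asarray(values, float)
    mean, std = v.mean(), v.std()
    z = (v - mean) / std
    return {
        "mean": mean,
        "p1": np.percentile(v, 1), "p10": np.percentile(v, 10),
        "p50": np.percentile(v, 50), "p90": np.percentile(v, 90),
        "p99": np.percentile(v, 99),
        "variance": v.var(),
        "skewness": np.mean(z ** 3),
        "kurtosis": np.mean(z ** 4) - 3.0,  # excess kurtosis
    }

# Illustrative only: lower ADC values in the "responder" map.
rng = np.random.default_rng(5)
responder = rng.normal(1.0e-3, 1.5e-4, 20000)      # mm^2/s
non_responder = rng.normal(1.3e-3, 1.5e-4, 20000)
r, n = histogram_parameters(responder), histogram_parameters(non_responder)
print(r["mean"] < n["mean"], r["p50"] < n["p50"])
```

Each parameter would then be compared between groups and fed into a ROC analysis as in the study.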
2013-01-01
Background: The high variations of background luminance, low contrast and excessively enhanced contrast of hand bone radiographs often impede bone age assessment rating systems in evaluating the degree of development of the epiphyseal plates and ossification centers. Global Histogram Equalization (GHE) has been the most frequently adopted image contrast enhancement technique, but its performance is not satisfying. A brightness- and detail-preserving histogram equalization method with good contrast enhancement effect has been a goal of much recent research in histogram equalization. Nevertheless, producing a well-balanced histogram-equalized radiograph in terms of its brightness preservation, detail preservation and contrast enhancement remains a daunting task. Method: In this paper, we propose a novel framework of histogram equalization that takes several desirable properties into account, namely the Multipurpose Beta Optimized Bi-Histogram Equalization (MBOBHE). This method performs the histogram optimization separately in both sub-histograms after segmenting the histogram at an optimized separating point determined by a regularization function constituted of three components. The result is then assessed by qualitative and quantitative analyses evaluating the essential aspects of the histogram-equalized image, using a total of 160 hand radiographs acquired from an online hand bone database. Result: From the qualitative analysis, we found that basic bi-histogram equalizations are not capable of displaying the small features in an image due to incorrect selection of the separating point, which focuses on only a certain metric without considering contrast enhancement and detail preservation. From the quantitative analysis, we found that MBOBHE correlates well with human visual perception, and this improvement shortens the evaluation time taken by an inspector in assessing bone age.
Conclusions: The proposed MBOBHE outperforms other existing methods in terms of comprehensive histogram equalization performance. All features pertinent to bone age assessment are more prominent than with other methods, which shortens the evaluation time required in manual bone age assessment using the TW method, while accuracy remains unaffected or improves slightly relative to the unprocessed original image. The holistic properties of brightness preservation, detail preservation and contrast enhancement are simultaneously taken into consideration, and thus the visual effect is conducive to manual inspection. PMID:23565999
Verification of Dose Distribution in Carbon Ion Radiation Therapy for Stage I Lung Cancer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Irie, Daisuke; Saitoh, Jun-ichi, E-mail: junsaito@gunma-u.ac.jp; Shirai, Katsuyuki
Purpose: To evaluate robustness of dose distribution of carbon-ion radiation therapy (C-ion RT) in non-small cell lung cancer (NSCLC) and to identify factors affecting the dose distribution by simulated dose distribution. Methods and Materials: Eighty irradiation fields for delivery of C-ion RT were analyzed in 20 patients with stage I NSCLC. Computed tomography images were obtained twice before treatment initiation. Simulated dose distribution was reconstructed on computed tomography for confirmation under the same settings as actual treatment with respiratory gating and bony structure matching. Dose-volume histogram parameters, such as %D95 (percentage of D95 relative to the prescribed dose), were calculated. Patients with any field for which the %D95 of gross tumor volume (GTV) was below 90% were classified as unacceptable for treatment, and the optimal target margin for such cases was examined. Results: Five patients with a total of 8 fields (10% of total number of fields analyzed) were classified as unacceptable according to %D95 of GTV, although most patients showed no remarkable change in the dose-volume histogram parameters. Receiver operating characteristic curve analysis showed that tumor displacement and change in water-equivalent pathlength were significant predictive factors of unacceptable cases (P<.001 and P=.002, respectively). The main cause of degradation of the dose distribution was tumor displacement in 7 of the 8 unacceptable fields. A 6-mm planning target volume margin ensured a GTV %D95 of >90%, except in 1 extremely unacceptable field. Conclusions: According to this simulation analysis of C-ion RT for stage I NSCLC, a few fields were reported as unacceptable and required resetting of body position and reconfirmation. In addition, tumor displacement and change in water-equivalent pathlength (bone shift and/or chest wall thickness) were identified as factors influencing the robustness of dose distribution. 
Such uncertainties should be taken into account in planning.
Using histograms to introduce randomization in the generation of ensembles of decision trees
Kamath, Chandrika; Cantu-Paz, Erick; Littau, David
2005-02-22
A system for decision tree ensembles that includes a module to read the data, a module to create a histogram, a module to evaluate a potential split according to some criterion using the histogram, a module to select a split point randomly in an interval around the best split, a module to split the data, and a module to combine multiple decision trees in ensembles. The decision tree method includes the steps of reading the data; creating a histogram; evaluating a potential split according to some criterion using the histogram, selecting a split point randomly in an interval around the best split, splitting the data, and combining multiple decision trees in ensembles.
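The central trick, evaluating candidate splits on a histogram and then drawing the actual split point at random from an interval around the best one, can be sketched as follows. The Gini impurity used here is one plausible instance of the unspecified "some criterion," and the two-bin window is an invented choice:

```python
import numpy as np

def randomized_split(feature, labels, bins=32, window=2, rng=None):
    """Evaluate candidate splits on a histogram, then draw the actual
    split point uniformly from an interval around the best bin boundary."""
    rng = rng or np.random.default_rng()
    hist, edges = np.histogram(feature, bins=bins)

    def gini(y):
        if y.size == 0:
            return 0.0
        p = np.bincount(y) / y.size
        return 1.0 - np.sum(p ** 2)

    # Score each interior bin boundary by weighted child impurity.
    scores = []
    for e in edges[1:-1]:
        left, right = labels[feature <= e], labels[feature > e]
        w = (left.size * gini(left) + right.size * gini(right)) / labels.size
        scores.append(w)
    best = int(np.argmin(scores)) + 1  # index into edges
    lo = edges[max(best - window, 1)]
    hi = edges[min(best + window, bins - 1)]
    return rng.uniform(lo, hi)  # randomization that diversifies the ensemble

rng = np.random.default_rng(11)
x = np.concatenate([rng.normal(0, 1, 500), rng.normal(5, 1, 500)])
y = np.concatenate([np.zeros(500, int), np.ones(500, int)])
splits = [randomized_split(x, y, rng=rng) for _ in range(5)]
print(splits)  # all near the class boundary, but varied
```

Because each tree in the ensemble draws a different split point from the same interval, the trees decorrelate even when trained on identical data, which is the stated purpose of the randomization.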
Warner, Graham C.; Helmer, Karl G.
2018-01-01
As the sharing of data is mandated by funding agencies and journals, reuse of data has become more prevalent. It becomes imperative, therefore, to develop methods to characterize the similarity of data. While users can group data based on the acquisition parameters stored in the file headers, these give no indication of whether a file can be combined with other data without increasing the variance in the data set. Methods have been implemented that characterize the signal-to-noise ratio or identify signal drop-outs in the raw image files, but potential users of data often have access to calculated metric maps, and these are more difficult to characterize and compare. Here we describe a histogram-distance-based method applied to diffusion metric maps of fractional anisotropy and mean diffusivity that were generated using data extracted from a repository of clinically-acquired MRI data. We describe the generation of the data set, the pitfalls specific to diffusion MRI data, and the results of the histogram distance analysis. We find that, in general, data from GE scanners are less similar than are data from Siemens scanners. We also find that the distribution of distance metric values is not Gaussian at any selection of the acquisition parameters considered here (field strength, number of gradient directions, b-value, and vendor). PMID:29568257
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shen, Vincent K., E-mail: vincent.shen@nist.gov; Siderius, Daniel W.
2014-06-28
Using flat-histogram Monte Carlo methods, we investigate the adsorptive behavior of the square-well fluid in two simple slit-pore-like models intended to capture fundamental characteristics of flexible adsorbent materials. Both models require as input thermodynamic information about the flexible adsorbent material itself. An important component of this work involves formulating the flexible pore models in the appropriate thermodynamic (statistical mechanical) ensembles, namely, the osmotic ensemble and a variant of the grand-canonical ensemble. Two-dimensional probability distributions, which are calculated using flat-histogram methods, provide the information necessary to determine adsorption thermodynamics. For example, we are able to determine precisely adsorption isotherms, (equilibrium) phase transition conditions, limits of stability, and free energies for a number of different flexible adsorbent materials, distinguishable as different inputs into the models. While the models used in this work are relatively simple from a geometric perspective, they yield non-trivial adsorptive behavior, including adsorption-desorption hysteresis solely due to material flexibility and so-called “breathing” of the adsorbent. The observed effects can in turn be tied to the inherent properties of the bare adsorbent. Some of the effects are expected on physical grounds while others arise from a subtle balance of thermodynamic and mechanical driving forces. In addition, the computational strategy presented here can be easily applied to more complex models for flexible adsorbents.
NASA Astrophysics Data System (ADS)
Shen, Vincent K.; Siderius, Daniel W.
2014-06-01
Using flat-histogram Monte Carlo methods, we investigate the adsorptive behavior of the square-well fluid in two simple slit-pore-like models intended to capture fundamental characteristics of flexible adsorbent materials. Both models require as input thermodynamic information about the flexible adsorbent material itself. An important component of this work involves formulating the flexible pore models in the appropriate thermodynamic (statistical mechanical) ensembles, namely, the osmotic ensemble and a variant of the grand-canonical ensemble. Two-dimensional probability distributions, which are calculated using flat-histogram methods, provide the information necessary to determine adsorption thermodynamics. For example, we are able to determine precisely adsorption isotherms, (equilibrium) phase transition conditions, limits of stability, and free energies for a number of different flexible adsorbent materials, distinguishable as different inputs into the models. While the models used in this work are relatively simple from a geometric perspective, they yield non-trivial adsorptive behavior, including adsorption-desorption hysteresis solely due to material flexibility and so-called "breathing" of the adsorbent. The observed effects can in turn be tied to the inherent properties of the bare adsorbent. Some of the effects are expected on physical grounds while others arise from a subtle balance of thermodynamic and mechanical driving forces. In addition, the computational strategy presented here can be easily applied to more complex models for flexible adsorbents.
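The flat-histogram machinery behind these calculations can be illustrated with a toy Wang-Landau run. The system below, the density of states of the sum of two dice, is invented purely so the answer is known exactly (1,2,3,4,5,6,5,4,3,2,1); it is not the square-well pore model, but the loop structure (random walk, ln g update, flatness check, modification-factor reduction) is the same:

```python
import numpy as np

# Toy flat-histogram (Wang-Landau) estimate of the density of states g(E)
# for E = d1 + d2, the sum of two dice.
rng = np.random.default_rng(2)
ln_g = np.zeros(11)          # one entry per energy E = 2..12
state = np.array([1, 1])
ln_f = 1.0                   # modification factor, halved at each stage
while ln_f > 1e-4:
    hist = np.zeros(11)
    while True:
        # Propose re-rolling one die.
        new = state.copy()
        new[rng.integers(2)] = rng.integers(1, 7)
        e_old, e_new = state.sum() - 2, new.sum() - 2
        # Accept with probability min(1, g(E_old)/g(E_new)).
        if rng.random() < np.exp(ln_g[e_old] - ln_g[e_new]):
            state = new
        e = state.sum() - 2
        ln_g[e] += ln_f      # penalize the visited energy
        hist[e] += 1
        if hist.min() > 0.8 * hist.mean():  # flatness criterion
            break
    ln_f /= 2.0
g = np.exp(ln_g - ln_g[0])   # normalize so g(E=2) = 1
print(np.round(g, 1))
```

The converged `g` approaches the exact counts, and in the papers above the same walk, run over particle number and pore width, yields the two-dimensional probability distributions from which isotherms and phase behavior are extracted.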
Scaling images using their background ratio. An application in statistical comparisons of images.
Kalemis, A; Binnie, D; Bailey, D L; Flower, M A; Ott, R J
2003-06-07
Comparison of two medical images often requires image scaling as a pre-processing step. This is usually done with the scaling-to-the-mean or scaling-to-the-maximum techniques which, under certain circumstances, in quantitative applications may contribute a significant amount of bias. In this paper, we present a simple scaling method which assumes only that the most predominant values in the corresponding images belong to their background structure. The ratio of the two images to be compared is calculated and its frequency histogram is plotted. The scaling factor is given by the position of the peak in this histogram which belongs to the background structure. The method was tested against the traditional scaling-to-the-mean technique on simulated planar gamma-camera images which were compared using pixelwise statistical parametric tests. Both sensitivity and specificity for each condition were measured over a range of different contrasts and sizes of inhomogeneity for the two scaling techniques. The new method was found to preserve sensitivity in all cases while the traditional technique resulted in significant degradation of sensitivity in certain cases.
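The method reduces to three steps: form the ratio image, histogram it, and take the histogram's peak as the scaling factor. A minimal sketch on synthetic planar images (the images and the hot spot are invented for illustration):

```python
import numpy as np

def background_ratio_scale(img_a, img_b, bins=200):
    """Scaling factor between two images from the mode of their ratio
    histogram, assuming the most frequent ratio comes from background."""
    ratio = img_a / np.maximum(img_b, 1e-9)  # guard against zero pixels
    hist, edges = np.histogram(ratio, bins=bins, range=(0.1, 10.0))
    peak = np.argmax(hist)
    return 0.5 * (edges[peak] + edges[peak + 1])  # bin-center estimate

# Synthetic planar images: shared background, one hot spot in img_b.
rng = np.random.default_rng(4)
background = rng.poisson(100, (64, 64)).astype(float)
img_a = 1.5 * background            # img_a acquired at 1.5x the counts
img_b = background.copy()
img_b[20:30, 20:30] += 400          # inhomogeneity to be detected later
print(background_ratio_scale(img_a, img_b))
```

The recovered factor sits near the true 1.5 because the background dominates the ratio histogram, whereas scaling to the mean would be biased by the hot spot, which is the failure mode the abstract attributes to the traditional technique.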
Putranto, Dedy Septono Catur; Priambodo, Purnomo Sidi; Hartanto, Djoko; Du, Wei; Satoh, Hiroaki; Ono, Atsushi; Inokawa, Hiroshi
2014-09-08
Low-frequency noise and hole lifetime in silicon-on-insulator (SOI) metal-oxide-semiconductor field-effect transistors (MOSFETs) are analyzed, considering their use in photon detection based on single-hole counting. The noise becomes minimum at around the transition point between front- and back-channel operations when the substrate voltage is varied, and increases largely on both negative and positive sides of the substrate voltage showing peculiar Lorentzian (generation-recombination) noise spectra. Hole lifetime is evaluated by the analysis of drain current histogram at different substrate voltages. It is found that the peaks in the histogram corresponding to the larger number of stored holes become higher as the substrate bias becomes larger. This can be attributed to the prolonged lifetime caused by the higher electric field inside the body of SOI MOSFET. It can be concluded that, once the inversion channel is induced for detection of the photo-generated holes, the small absolute substrate bias is favorable for short lifetime and low noise, leading to high-speed operation.
Pedestrian detection in crowded scenes with the histogram of gradients principle
NASA Astrophysics Data System (ADS)
Sidla, O.; Rosner, M.; Lypetskyy, Y.
2006-10-01
This paper describes a close to real-time, scale-invariant implementation of a pedestrian detector system based on the Histogram of Oriented Gradients (HOG) principle. Salient HOG features are first selected from a manually created, very large database of samples with an evolutionary optimization procedure that directly trains a polynomial Support Vector Machine (SVM). Real-time operation is achieved by a cascaded 2-step classifier which first uses a very fast linear SVM (with the same features as the polynomial SVM) to reject most of the irrelevant detections and then computes the decision function with a polynomial SVM on the remaining set of candidate detections. Scale invariance is achieved by running the detector of constant size on scaled versions of the original input images and by clustering the results over all resolutions. The pedestrian detection system has been implemented in two versions: i) full-body detection, and ii) upper-body-only detection. The latter is especially suited for very busy and crowded scenarios. On a state-of-the-art PC it runs at 8-20 frames/s.
Quartz crystal microbalance as a sensing active element for rupture scanning within frequency band.
Dultsev, F N; Kolosovsky, E A
2011-02-14
A new method based on the use of a quartz crystal microbalance (QCM) as an active sensing element is developed, optimized, and tested in a model system to measure the rupture force and deduce the size distribution of nanoparticles. As suggested by model predictions, the QCM is shaped as a strip. The ratio of rupture signals at the second and third harmonics versus the geometric position of a body on the QCM surface is investigated theoretically. Recommendations concerning the use of the method for measuring nanoparticle size distributions are presented. It is shown experimentally, for an ensemble of test particles with a characteristic size of 20-30 nm, that the proposed method can determine the particle size distribution. From the position and value of the measured rupture signal, a histogram of the particle size distribution and the percentage of each size fraction were determined. The main merits of the bond-rupture method are its rapid response, simplicity, and ability to discriminate between specific and non-specific interactions. The method is highly sensitive with respect to mass (the sensitivity generally depends on the chemical nature of receptor and analyte and may reach 8×10⁻¹⁴ g mm⁻²) and is applicable to measuring rupture forces for either weak bonds, for example hydrogen bonds, or strong covalent bonds (10⁻¹¹-10⁻⁹ N). This procedure may become a good alternative to existing methods, such as AFM or optical methods for detecting biological objects, and find a broad range of applications both in laboratory research and in biosensing for various purposes. Possible applications include medicine, diagnostics, and environmental or agricultural monitoring. Copyright © 2010 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Chen, Huaiyu; Cao, Li
2017-06-01
In order to research multiple sound source localization with room reverberation and background noise, we analyze the shortcomings of traditional broadband MUSIC and ordinary auditory filtering based broadband MUSIC method, then a new broadband MUSIC algorithm with gammatone auditory filtering of frequency component selection control and detection of ascending segment of direct sound componence is proposed. The proposed algorithm controls frequency component within the interested frequency band in multichannel bandpass filter stage. Detecting the direct sound componence of the sound source for suppressing room reverberation interference is also proposed, whose merits are fast calculation and avoiding using more complex de-reverberation processing algorithm. Besides, the pseudo-spectrum of different frequency channels is weighted by their maximum amplitude for every speech frame. Through the simulation and real room reverberation environment experiments, the proposed method has good performance. Dynamic multiple sound source localization experimental results indicate that the average absolute error of azimuth estimated by the proposed algorithm is less and the histogram result has higher angle resolution.
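The narrowband MUSIC core on which the proposed broadband variant builds can be sketched as follows (a generic uniform-linear-array simulation; the paper's gammatone filterbank, frequency-component selection, and direct-sound detection stages are not modeled):

```python
import numpy as np

def music_spectrum(snapshots, n_sources, d_over_lambda=0.5, angles=None):
    """Narrowband MUSIC pseudo-spectrum for a uniform linear array.
    `snapshots` is (n_sensors, n_snapshots) complex data. Only the
    classical MUSIC core is shown here."""
    if angles is None:
        angles = np.linspace(-90.0, 90.0, 361)
    m = snapshots.shape[0]
    R = snapshots @ snapshots.conj().T / snapshots.shape[1]   # sample covariance
    w, v = np.linalg.eigh(R)                                  # ascending eigenvalues
    En = v[:, : m - n_sources]                                # noise subspace
    k = np.arange(m)
    spec = []
    for th in np.deg2rad(angles):
        a = np.exp(-2j * np.pi * d_over_lambda * k * np.sin(th))
        spec.append(1.0 / np.real(a.conj() @ En @ En.conj().T @ a))
    return angles, np.array(spec)

# simulate one source at +20 degrees on an 8-sensor half-wavelength ULA
rng = np.random.default_rng(0)
m, n_snap, theta = 8, 200, np.deg2rad(20.0)
a = np.exp(-2j * np.pi * 0.5 * np.arange(m) * np.sin(theta))
s = rng.standard_normal(n_snap) + 1j * rng.standard_normal(n_snap)
x = np.outer(a, s) + 0.05 * (rng.standard_normal((m, n_snap))
                             + 1j * rng.standard_normal((m, n_snap)))
angles, spec = music_spectrum(x, n_sources=1)
est = angles[np.argmax(spec)]
```

A broadband variant evaluates such a pseudo-spectrum per frequency channel and combines the channels, which is where the paper's amplitude weighting enters.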
FPGA based charge fast histogramming for GEM detector
NASA Astrophysics Data System (ADS)
Poźniak, Krzysztof T.; Byszuk, A.; Chernyshova, M.; Cieszewski, R.; Czarski, T.; Dominik, W.; Jakubowska, K.; Kasprowicz, G.; Rzadkiewicz, J.; Scholz, M.; Zabolotny, W.
2013-10-01
This article presents a fast charge-histogramming method for a position-sensitive X-ray GEM detector. Energy-resolved measurements are carried out simultaneously for 256 channels of the GEM detector. The whole histogramming process is performed in 21 FPGA chips (Xilinx Spartan-6 series), and the results are stored in external DDR3 memory. The structure of the electronic measurement equipment and the firmware functionality implemented in the FPGAs are described. Examples of test measurements are presented.
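The accumulation step of per-channel charge histogramming can be modeled in software (an illustrative sketch of what such firmware does per digitized event; the bin count, channel count, and full-scale value below are stand-ins, not the paper's firmware parameters):

```python
import numpy as np

# Illustrative model of per-channel charge histogramming.
N_CHANNELS, N_BINS = 256, 1024

def accumulate(hist, channel_ids, charges, full_scale=4096):
    """Add a batch of (channel, charge) samples into the histogram array,
    mimicking the in-FPGA increment of one memory bin per event."""
    bins = np.clip((charges * N_BINS) // full_scale, 0, N_BINS - 1)
    np.add.at(hist, (channel_ids, bins), 1)   # scatter-add, like BRAM updates
    return hist

hist = np.zeros((N_CHANNELS, N_BINS), dtype=np.uint32)
rng = np.random.default_rng(1)
ch = rng.integers(0, N_CHANNELS, size=10000)   # channel of each event
q = rng.integers(0, 4096, size=10000)          # digitized charge of each event
hist = accumulate(hist, ch, q)
```

In hardware each channel's histogram lives in on-chip memory and the increment happens at the event rate; the external DDR3 only receives the accumulated results.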
Local dynamic range compensation for scanning electron microscope imaging system.
Sim, K S; Huang, Y H
2015-01-01
This paper presents an extension of earlier work: the modified dynamic range histogram modification (MDRHM) technique for enhancing scanning electron microscope (SEM) imaging. In contrast to conventional histogram-modification compensators, the technique profiles the histogram of each tile of an image and stretches its dynamic range to the full 0-255 limit while retaining the histogram shape. The proposed technique yields better image compensation than conventional methods. © Wiley Periodicals, Inc.
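The per-tile dynamic range stretch can be sketched as follows (a simplified reading of the MDRHM idea, assuming a plain linear remap per tile; a linear map shifts and scales the histogram to the full range without changing its shape):

```python
import numpy as np

def stretch_tile(tile):
    """Linearly map one tile's gray values onto the full 0-255 range.
    An affine remap preserves the histogram's shape while extending its
    dynamic range -- a sketch, not the authors' exact algorithm."""
    lo, hi = float(tile.min()), float(tile.max())
    if hi == lo:                      # flat tile: nothing to stretch
        return np.zeros_like(tile, dtype=np.uint8)
    out = (tile.astype(float) - lo) * 255.0 / (hi - lo)
    return np.round(out).astype(np.uint8)

def local_stretch(img, tile=64):
    """Apply the stretch independently to each tile of the image."""
    out = np.empty_like(img, dtype=np.uint8)
    for r in range(0, img.shape[0], tile):
        for c in range(0, img.shape[1], tile):
            out[r:r+tile, c:c+tile] = stretch_tile(img[r:r+tile, c:c+tile])
    return out

# synthetic low-contrast image occupying only gray levels 60..156
img = (np.arange(128 * 128).reshape(128, 128) % 97 + 60).astype(np.uint8)
res = local_stretch(img)
```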
Nondestructive Detection of the Internal Quality of Apple Using X-Ray and Machine Vision
NASA Astrophysics Data System (ADS)
Yang, Fuzeng; Yang, Liangliang; Yang, Qing; Kang, Likui
The internal quality of apples cannot be assessed by eye during sorting, so defective fruit may reach the market. This paper describes an instrument that combines X-ray imaging and machine vision, with the following steps used to process the X-ray image and identify mould-core apples. First, a lifting wavelet transform was applied to obtain one low-frequency image and three high-frequency images. Second, the low-frequency image was enhanced by histogram equalization. The edge of each apple image was then detected with the Canny operator. Finally, a threshold on the diameter of the apple core was set to separate mould-core from normal apples. Experimental results show that the method can detect mould-core apples on-line in less than 0.03 seconds per apple, with an accuracy of 92%.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Domengie, F., E-mail: florian.domengie@st.com; Morin, P.; Bauza, D.
We propose a model for dark current induced by metallic contamination in a CMOS image sensor. Based on Shockley-Read-Hall kinetics, the expression of dark current proposed accounts for the electric-field-enhanced emission factor due to the Poole-Frenkel barrier lowering and phonon-assisted tunneling mechanisms. To that aim, we considered the distribution of the electric field magnitude and of metal atoms in the depth of the pixel. Poisson statistics were used to estimate the random distribution of metal atoms in each pixel for a given contamination dose. Then, we performed a Monte-Carlo-based simulation for each pixel to set the number of metal atoms the pixel contained and the enhancement factor each atom underwent, and obtained a histogram of the number of pixels versus dark current for the full sensor. Excellent agreement with the dark current histogram measured on an ion-implanted gold-contaminated imager has been achieved, in particular for the description of the distribution tails due to the pixel regions in which the contaminant atoms undergo a large electric field. The agreement remains very good when increasing the temperature by 15 °C. We demonstrated that the amplification of the dark current generated for the typical electric fields encountered in CMOS image sensors, which depends on the nature of the metal contaminant, may become very large at high electric field. The electron and hole emissions and the resulting enhancement factor are described as a function of the trap characteristics, electric field, and temperature.
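The Monte-Carlo step can be sketched as follows (a toy model with made-up parameters: Poisson-distributed atom counts per pixel and a lognormal stand-in for the field-dependent enhancement factor; the paper's physical model of Poole-Frenkel and tunneling enhancement is far more detailed):

```python
import numpy as np

# Toy Monte-Carlo sketch: draw a Poisson number of contaminant atoms per
# pixel, give each atom a random enhancement factor, and histogram the
# resulting per-pixel dark current. All numbers below are illustrative.
rng = np.random.default_rng(42)
n_pixels = 100_000
mean_atoms = 0.5       # hypothetical average atoms per pixel for a given dose
i_srh = 1.0            # hypothetical base SRH dark current per atom (a.u.)

n_atoms = rng.poisson(mean_atoms, n_pixels)
dark = np.zeros(n_pixels)
for n in np.unique(n_atoms[n_atoms > 0]):
    idx = np.where(n_atoms == n)[0]
    # lognormal stand-in for the field-dependent enhancement factor;
    # the long right tail mimics atoms sitting in high-field regions
    gain = rng.lognormal(mean=0.0, sigma=1.0, size=(idx.size, n))
    dark[idx] = i_srh * gain.sum(axis=1)

counts, edges = np.histogram(dark, bins=100)   # pixels vs dark current
```

The heavy tail of `counts` is the qualitative analogue of the measured distribution tails attributed to high-field pixel regions.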
Liu, Song; Zhang, Yujuan; Chen, Ling; Guan, Wenxian; Guan, Yue; Ge, Yun; He, Jian; Zhou, Zhengyang
2017-10-02
Whole-lesion apparent diffusion coefficient (ADC) histogram analysis has been introduced and proved effective in the assessment of multiple tumors. However, the application of whole-volume ADC histogram analysis in gastrointestinal tumors has just started and has never been reported for T and N staging of gastric cancers. Eighty patients with pathologically confirmed gastric carcinomas prospectively underwent diffusion-weighted (DW) magnetic resonance imaging before surgery. Whole-lesion ADC histogram analysis was performed by two radiologists independently. Differences in ADC histogram parameters among different T and N stages were compared with the independent-samples Kruskal-Wallis test. Receiver operating characteristic (ROC) analysis was performed to evaluate the performance of ADC histogram parameters in differentiating particular T or N stages of gastric cancers. There were significant differences in all the ADC histogram parameters for gastric cancers at different T (except ADCmin and ADCmax) and N (except ADCmax) stages. Most ADC histogram parameters differed significantly between T1 vs T3, T1 vs T4, T2 vs T4, N0 vs N1, and N0 vs N3, and some parameters (ADC5%, ADC10%, ADCmin) differed significantly between N0 vs N2 and N2 vs N3 (all P < 0.05). Most parameters except ADCmax performed well in differentiating T and N stages of gastric cancers. Especially for identifying patients with and without lymph node metastasis, ADC10% yielded the largest area under the ROC curve of 0.794 (95% confidence interval, 0.677-0.911). All the parameters except ADCmax showed excellent inter-observer agreement, with intra-class correlation coefficients higher than 0.800. Whole-volume ADC histogram parameters hold great potential for differentiating T and N stages of gastric cancers preoperatively.
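The histogram parameters named above can be computed from the pooled voxel values in a few lines (a generic sketch; the exact parameter set and software differ between studies):

```python
import numpy as np

def adc_histogram_parameters(adc_voxels):
    """Whole-lesion ADC histogram parameters of the kind used in such
    studies: percentiles, mean, min/max, skewness, kurtosis."""
    x = np.asarray(adc_voxels, dtype=float).ravel()
    mu, sd = x.mean(), x.std()
    return {
        "mean": mu, "min": x.min(), "max": x.max(),
        "ADC5%": np.percentile(x, 5), "ADC10%": np.percentile(x, 10),
        "ADC50%": np.percentile(x, 50), "ADC90%": np.percentile(x, 90),
        "skewness": ((x - mu) ** 3).mean() / sd ** 3,
        "kurtosis": ((x - mu) ** 4).mean() / sd ** 4,   # non-excess (normal = 3)
    }

# synthetic "lesion" of 5000 voxels with ADC ~ 1.1e-3 mm^2/s (illustrative)
rng = np.random.default_rng(7)
params = adc_histogram_parameters(rng.normal(1.1e-3, 2e-4, 5000))
```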
Gihr, Georg Alexander; Horvath-Rizea, Diana; Kohlhof-Meinecke, Patricia; Ganslandt, Oliver; Henkes, Hans; Richter, Cindy; Hoffmann, Karl-Titus; Surov, Alexey; Schob, Stefan
2018-06-14
Meningiomas are the most frequently diagnosed intracranial masses, oftentimes requiring surgery. Procedure-related morbidity in particular can be substantial, especially in elderly patients. Hence, reliable imaging modalities enabling pretherapeutic prediction of tumor grade, growth kinetics, realistic prognosis, and, as a consequence, the necessity of surgery are of great value. In this context, a promising diagnostic approach is advanced analysis of magnetic resonance imaging data. Our study therefore investigated whether histogram profiling of routinely acquired postcontrast T1-weighted images is capable of separating low-grade from high-grade lesions and whether histogram parameters reflect Ki-67 expression in meningiomas. Pretreatment T1-weighted postcontrast volumes of 44 meningioma patients were used for signal intensity histogram profiling. WHO grade, tumor volume, and Ki-67 expression were evaluated. Comparative and correlative statistics investigating the association between histogram profile parameters and neuropathology were performed. None of the investigated histogram parameters revealed significant differences between low-grade and high-grade meningiomas. However, significant correlations were identified between Ki-67 and the histogram parameters skewness and entropy, as well as between entropy and tumor volume. Contrary to previously reported findings, pretherapeutic postcontrast T1-weighted images can be used to predict growth kinetics in meningiomas if whole-tumor histogram analysis is employed. However, no differences between distinct WHO grades were identifiable in our cohort. As a consequence, histogram analysis of postcontrast T1-weighted images is a promising approach to obtaining quantitative in vivo biomarkers reflecting the proliferative potential of meningiomas. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
Jeong, Chang Bu; Kim, Kwang Gi; Kim, Tae Sung; Kim, Seok Ki
2011-06-01
Whole-body bone scan is one of the most frequent diagnostic procedures in nuclear medicine. In particular, it plays a significant role in important procedures such as the diagnosis of osseous metastasis and the evaluation of osseous tumor response to chemotherapy and radiation therapy. It can also be used to monitor for any recurrence of the tumor. However, quantifying subtle interval changes between successive whole-body bone scans is very time-consuming for radiologists because of many variations in intensity, geometry, and morphology. In this paper, we identify the most effective histogram-based method of image enhancement, which may assist radiologists in interpreting successive whole-body bone scans effectively. Forty-eight successive whole-body bone scans from 10 patients were obtained and evaluated using six histogram-based image enhancement methods: histogram equalization, brightness-preserving bi-histogram equalization, contrast-limited adaptive histogram equalization, end-in search, histogram matching, and exact histogram matching (EHM). The results of the different methods were compared using three similarity measures: peak signal-to-noise ratio, histogram intersection, and structural similarity. Image enhancement of successive bone scans using EHM showed the best results of the six methods for all similarity measures; EHM is thus the best histogram-based enhancement method for diagnosing successive whole-body bone scans. The method has the potential to greatly assist radiologists in quantifying interval changes more accurately and quickly by compensating for the variable nature of intensity information. Consequently, it can improve radiologists' diagnostic accuracy as well as reduce reading time for detecting interval changes.
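Exact histogram matching, the best-performing method in the study, can be sketched via rank ordering (a simplified EHM; published variants break rank ties with local filtering, which is omitted here), together with one of the similarity measures:

```python
import numpy as np

def exact_histogram_match(img, reference):
    """Make `img` adopt exactly the gray-level histogram of `reference`
    by rank ordering: the i-th darkest pixel of `img` receives the i-th
    smallest gray level of `reference`."""
    flat = img.ravel()
    order = np.argsort(flat, kind="stable")   # rank of every pixel
    target = np.sort(reference.ravel())       # reference gray levels
    out = np.empty_like(flat)
    out[order] = target
    return out.reshape(img.shape)

def psnr(a, b, peak=255.0):
    """Peak signal-to-noise ratio, one of the study's similarity measures."""
    mse = np.mean((a.astype(float) - b.astype(float)) ** 2)
    return np.inf if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

rng = np.random.default_rng(3)
img = rng.integers(0, 256, (64, 64)).astype(np.uint8)
ref = rng.integers(0, 256, (64, 64)).astype(np.uint8)
matched = exact_histogram_match(img, ref)
```

After matching, the two images' histograms are identical by construction, which is the property EHM exploits for compensating intensity variability between successive scans.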
Structure Size Enhanced Histogram
NASA Astrophysics Data System (ADS)
Wesarg, Stefan; Kirschner, Matthias
Direct volume visualization requires the definition of transfer functions (TFs) for the assignment of opacity and color. Multi-dimensional TFs are based on at least two image properties and are specified by means of 2D histograms. In this work we propose a new type of 2D histogram which combines gray value with information about the size of the structures. This structure size enhanced (SSE) histogram is an intuitive approach to representing anatomical features. Clinicians, the users we are focusing on, are much more familiar with selecting features by their size than by their gradient magnitude value. As a proof of concept, we employ the SSE histogram for the definition of two-dimensional TFs for the visualization of 3D MRI and CT image data.
Face recognition algorithm using extended vector quantization histogram features.
Yan, Yan; Lee, Feifei; Wu, Xueqian; Chen, Qiu
2018-01-01
In this paper, we propose a face recognition algorithm based on a combination of vector quantization (VQ) and Markov stationary features (MSF). The VQ algorithm has been shown to be an effective method for generating features; it extracts a codevector histogram as a facial feature representation for face recognition. Still, the VQ histogram features are unable to convey spatial structural information, which to some extent limits their usefulness in discrimination. To alleviate this limitation of VQ histograms, we utilize Markov stationary features (MSF) to extend the VQ histogram-based features so as to add spatial structural information. We demonstrate the effectiveness of our proposed algorithm by achieving recognition results superior to those of several state-of-the-art methods on publicly available face databases.
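The VQ histogram feature stage can be sketched as follows (a generic illustration: the codebook here is random rather than trained, and the MSF extension, which adds transition statistics between code indices, is not shown):

```python
import numpy as np

def vq_histogram(blocks, codebook):
    """Codevector histogram: quantize each image block to its nearest
    codebook vector and histogram the winning indices -- the VQ feature
    stage described in the abstract."""
    # squared Euclidean distance of every block to every codevector
    d2 = ((blocks[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
    idx = d2.argmin(axis=1)
    hist = np.bincount(idx, minlength=len(codebook)).astype(float)
    return hist / hist.sum()

rng = np.random.default_rng(5)
codebook = rng.standard_normal((16, 9))    # 16 codevectors for 3x3 blocks
# synthetic "face" blocks: codevectors plus a little noise
blocks = codebook[rng.integers(0, 16, 500)] + 0.01 * rng.standard_normal((500, 9))
h = vq_histogram(blocks, codebook)
```

As the abstract notes, this histogram discards where each code occurred; MSF restores some of that spatial structure by modeling transitions between neighboring code indices.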
Xu, Yan; Ru, Tong; Zhu, Lijing; Liu, Baorui; Wang, Huanhuan; Zhu, Li; He, Jian; Liu, Song; Zhou, Zhengyang; Yang, Xiaofeng
To monitor early response for locally advanced cervical cancers undergoing concurrent chemo-radiotherapy (CCRT) by ultrasonic histogram. B-mode ultrasound examinations were performed at 4 time points in thirty-four patients during CCRT. Six ultrasonic histogram parameters were used to assess the echogenicity, homogeneity and heterogeneity of tumors. I peak increased rapidly since the first week after therapy initiation, whereas W low , W high and A high changed significantly at the second week. The average ultrasonic histogram progressively moved toward the right and converted into more symmetrical shape. Ultrasonic histogram could be served as a potential marker to monitor early response during CCRT. Copyright © 2018 Elsevier Inc. All rights reserved.
Dosimetric variations due to interfraction organ deformation in cervical cancer brachytherapy.
Kobayashi, Kazuma; Murakami, Naoya; Wakita, Akihisa; Nakamura, Satoshi; Okamoto, Hiroyuki; Umezawa, Rei; Takahashi, Kana; Inaba, Koji; Igaki, Hiroshi; Ito, Yoshinori; Shigematsu, Naoyuki; Itami, Jun
2015-12-01
We quantitatively estimated dosimetric variations due to interfraction organ deformation in multi-fractionated high-dose-rate brachytherapy (HDRBT) for cervical cancer using a novel surface-based non-rigid deformable registration. As the number of consecutive HDRBT fractions increased, simple addition of dose-volume histogram parameters significantly overestimated the dose compared with distribution-based dose addition. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
Hypothesis Testing Using Spatially Dependent Heavy Tailed Multisensor Data
2014-12-01
...consistent with the null hypothesis of linearity and can be used to estimate the distribution of a test statistic that can discriminate between the null... Test for nonlinearity: the histogram is generated using the surrogate data; the statistic of the original time series is represented by the solid line.
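The surrogate-data test alluded to in the excerpt can be sketched with phase-randomized surrogates (a standard construction consistent with a linear null hypothesis; the report's own test statistic and surrogate recipe are not specified in the excerpt, so the statistic below is an illustrative stand-in):

```python
import numpy as np

def phase_randomized_surrogate(x, rng):
    """Surrogate series with the same power spectrum as x but randomized
    Fourier phases, i.e. data consistent with a linear Gaussian null."""
    X = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, len(X))
    phases[0] = 0.0                       # keep the DC bin real
    if len(x) % 2 == 0:
        phases[-1] = 0.0                  # Nyquist bin must stay real
    return np.fft.irfft(np.abs(X) * np.exp(1j * phases), n=len(x))

def skewness(x):
    """A simple, illustrative nonlinearity statistic."""
    z = (x - x.mean()) / x.std()
    return float((z ** 3).mean())

rng = np.random.default_rng(0)
t = np.linspace(0.0, 20.0 * np.pi, 1024, endpoint=False)
x = np.exp(np.sin(t)) + 0.1 * rng.standard_normal(1024)   # mildly nonlinear series
# histogram of the statistic over surrogates approximates the null distribution
null_stats = np.array([skewness(phase_randomized_surrogate(x, rng))
                       for _ in range(200)])
counts, edges = np.histogram(null_stats, bins=20)
s = phase_randomized_surrogate(x, rng)
```

Plotting `counts` with the original series' statistic overlaid as a vertical line reproduces the figure the excerpt describes.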
Combining Vector Quantization and Histogram Equalization.
ERIC Educational Resources Information Center
Cosman, Pamela C.; And Others
1992-01-01
Discussion of contrast enhancement techniques focuses on the use of histogram equalization with a data compression technique, i.e., tree-structured vector quantization. The enhancement technique of intensity windowing is described, and the use of enhancement techniques for medical images is explained, including adaptive histogram equalization.…
Histogram Curve Matching Approaches for Object-based Image Classification of Land Cover and Land Use
Toure, Sory I.; Stow, Douglas A.; Weeks, John R.; Kumar, Sunil
2013-01-01
The classification of image objects is usually done using parametric statistical measures of central tendency and/or dispersion (e.g., mean or standard deviation). The objectives of this study were to analyze digital-number histograms of image objects and to evaluate classification measures that exploit characteristic signatures of such histograms. Two histogram-matching classifiers were evaluated and compared with the standard nearest-neighbor-to-mean classifier. An ADS40 airborne multispectral image of San Diego, California was used to assess the utility of curve-matching classifiers in a geographic object-based image analysis (GEOBIA) approach. The classifications were performed with data sets having 0.5 m, 2.5 m, and 5 m spatial resolutions. Results show that histograms are reliable features for characterizing classes. Both histogram-matching classifiers also consistently performed better than the classifier based on the standard nearest-neighbor-to-mean rule. The highest classification accuracies were produced with images having 2.5 m spatial resolution. PMID:24403648
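Histogram curve matching can be sketched with the histogram intersection measure (an illustrative choice of matching measure; the class signatures below are hypothetical, not from the study's data):

```python
import numpy as np

def histogram_intersection(h1, h2):
    """Similarity of two normalized histograms (1.0 = identical)."""
    return float(np.minimum(h1, h2).sum())

def classify_by_histogram(obj_hist, class_hists):
    """Assign an image object to the class whose reference histogram has
    the largest intersection with the object's histogram -- the curve-
    matching alternative to nearest-mean rules."""
    scores = {c: histogram_intersection(obj_hist, h)
              for c, h in class_hists.items()}
    return max(scores, key=scores.get)

# hypothetical class signatures over 8 digital-number bins
classes = {
    "water":      np.array([.50, .30, .10, .05, .03, .02, 0., 0.]),
    "vegetation": np.array([.05, .10, .30, .35, .15, .05, 0., 0.]),
    "built-up":   np.array([.02, .03, .05, .10, .20, .30, .20, .10]),
}
obj = np.array([.04, .12, .28, .33, .16, .07, 0., 0.])   # vegetation-like object
label = classify_by_histogram(obj, classes)
```

Unlike a nearest-neighbor-to-mean rule, the whole histogram shape enters the decision, which is why multimodal or skewed object histograms are handled more faithfully.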
Shin, Young Gyung; Yoo, Jaeheung; Kwon, Hyeong Ju; Hong, Jung Hwa; Lee, Hye Sun; Yoon, Jung Hyun; Kim, Eun-Kyung; Moon, Hee Jung; Han, Kyunghwa; Kwak, Jin Young
2016-08-01
The objective of the study was to evaluate whether texture analysis using histogram and gray level co-occurrence matrix (GLCM) parameters can help clinicians diagnose lymphocytic thyroiditis (LT) and differentiate LT according to pathologic grade. The background thyroid pathology of 441 patients was classified into no evidence of LT, chronic LT (CLT), and Hashimoto's thyroiditis (HT). Histogram and GLCM parameters were extracted from the regions of interest on ultrasound. The diagnostic performances of the parameters for diagnosing and differentiating LT were calculated. Of the histogram and GLCM parameters, the mean on histogram had the highest Az (0.63) and VUS (0.303). As the degrees of LT increased, the mean decreased and the standard deviation and entropy increased. The mean on histogram from gray-scale ultrasound showed the best diagnostic performance as a single parameter in differentiating LT according to pathologic grade as well as in diagnosing LT. Copyright © 2016 Elsevier Ltd. All rights reserved.
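The GLCM features named above can be sketched for a single horizontal offset (a generic illustration of the feature family, not the study's exact quantization, offsets, or parameter list):

```python
import numpy as np

def glcm_features(img, levels=8):
    """Horizontal-offset gray-level co-occurrence matrix (GLCM) plus two
    of the usual derived statistics. Clinical texture packages add
    multiple offsets, matrix symmetrization, and many more statistics."""
    # quantize gray values into `levels` bins
    q = np.minimum((img.astype(float) / 256.0 * levels).astype(int), levels - 1)
    a, b = q[:, :-1].ravel(), q[:, 1:].ravel()    # horizontally adjacent pairs
    P = np.zeros((levels, levels))
    np.add.at(P, (a, b), 1)
    P /= P.sum()                                  # joint probability matrix
    i, j = np.indices(P.shape)
    contrast = float((P * (i - j) ** 2).sum())
    nz = P[P > 0]
    entropy = float(-(nz * np.log2(nz)).sum())
    return P, contrast, entropy

flat = np.full((32, 32), 100, dtype=np.uint8)     # homogeneous "tissue"
P_flat, c_flat, e_flat = glcm_features(flat)
rng = np.random.default_rng(0)
noisy = rng.integers(0, 256, (32, 32)).astype(np.uint8)
P_noisy, c_noisy, e_noisy = glcm_features(noisy)
```

Homogeneous regions concentrate the co-occurrence mass in one cell (zero contrast and entropy), while heterogeneous regions spread it out, which is the behavior the abstract's mean/entropy findings rest on.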
Guan, Yue; Shi, Hua; Chen, Ying; Liu, Song; Li, Weifeng; Jiang, Zhuoran; Wang, Huanhuan; He, Jian; Zhou, Zhengyang; Ge, Yun
2016-01-01
The aim of this study was to explore the application of whole-lesion histogram analysis of apparent diffusion coefficient (ADC) values in cervical cancer. A total of 54 women (mean age, 53 years) with cervical cancers prospectively underwent 3-T diffusion-weighted imaging with b values of 0 and 800 s/mm2. Whole-lesion histogram analysis of ADC values was performed. A paired-sample t test was used to compare differences in ADC histogram parameters between cervical cancers and normal cervical tissues. Receiver operating characteristic curves were constructed to identify the optimal threshold of each parameter. All histogram parameters in this study, including ADCmean, ADCmin, ADC10%-ADC90%, mode, skewness, and kurtosis, were significantly lower for cervical cancers than for normal cervical tissues (all P < 0.0001). ADC90% had the largest area under the receiver operating characteristic curve, 0.996. Whole-lesion histogram analysis of ADC maps is useful in the assessment of cervical cancer.
Holmes, D. F.; Lu, Y.; Purslow, P. P.; Kadler, K. E.; Bechet, D.; Wess, T. J.
2012-01-01
Scaling relationships have been formulated to investigate the influence of collagen fibril diameter (D) on age-related variations in the strain energy density of tendon. Transmission electron microscopy was used to quantify D in tail tendon from 1.7- to 35.3-mo-old (C57BL/6) male mice. Frequency histograms of D for all age groups were modeled as two normally distributed subpopulations with smaller (DD1) and larger (DD2) mean Ds, respectively. Both DD1 and DD2 increase from 1.6 to 4.0 mo but decrease thereafter. From tensile tests to rupture, two strain energy densities were calculated: 1) uE [from initial loading until the yield stress (σY)], which contributes primarily to tendon resilience, and 2) uF [from σY through the maximum stress (σU) until rupture], which relates primarily to resistance of the tendons to rupture. As measured by the normalized strain energy densities uE/σY and uF/σU, both the resilience and resistance to rupture increase with increasing age and peak at 23.0 and 4.0 mo, respectively, before decreasing thereafter. Multiple regression analysis reveals that increases in uE/σY (resilience energy) are associated with decreases in DD1 and increases in DD2, whereas uF/σU (rupture energy) is associated with increases in DD1 alone. These findings support a model where age-related variations in tendon resilience and resistance to rupture can be directed by subtle changes in the bimodal distribution of Ds. PMID:22837169
Meng, Jie; Zhu, Lijing; Zhu, Li; Wang, Huanhuan; Liu, Song; Yan, Jing; Liu, Baorui; Guan, Yue; Ge, Yun; He, Jian; Zhou, Zhengyang; Yang, Xiaofeng
2016-10-22
To explore the role of apparent diffusion coefficient (ADC) histogram shape-related parameters in early assessment of treatment response during the concurrent chemo-radiotherapy (CCRT) course of advanced cervical cancers. This prospective study was approved by the local ethics committee and informed consent was obtained from all patients. Thirty-two patients with advanced cervical squamous cell carcinomas underwent diffusion-weighted magnetic resonance imaging (b values: 0 and 800 s/mm2) before CCRT, at the end of the 2nd and 4th weeks during CCRT, and immediately after CCRT completion. Whole-lesion ADC histogram analysis generated several histogram shape-related parameters including skewness, kurtosis, s-sDav, width, and standard deviation, as well as first-order entropy and second-order entropies. The averaged ADC histograms of the 32 patients were generated to visually observe dynamic changes of histogram shape following CCRT. All parameters except width and standard deviation showed significant changes during CCRT (all P < 0.05), and their variation trends fell into four different patterns. Skewness and kurtosis both showed high early decline rates (43.10%, 48.29%) at the end of the 2nd week of CCRT. All entropies decreased significantly from two weeks after CCRT initiation onward. The shape of the averaged ADC histogram also changed markedly following CCRT. ADC histogram shape analysis holds potential for monitoring early tumor response in patients with advanced cervical cancers undergoing CCRT.
[Clinical application of MRI histogram in evaluation of muscle fatty infiltration].
Zheng, Y M; Du, J; Li, W Z; Wang, Z X; Zhang, W; Xiao, J X; Yuan, Y
2016-10-18
To describe a method based on analysis of the histogram of intensity values from magnetic resonance imaging (MRI) for quantifying the degree of fatty infiltration. The study included 25 patients with dystrophinopathy. All subjects underwent muscle MRI at thigh level. The histogram M values of 250 muscles, adjusted for subcutaneous fat and representing the degree of fatty infiltration, were compared with expert visual reading using the modified Mercuri scale. There was a significant positive correlation between the histogram M values and the scores of visual reading (r=0.854, P<0.001). The distinct pattern of muscle involvement detected by histogram M values in the patients with dystrophinopathy was similar to that of visual reading and to results in the literature. The histogram M values had stronger correlations with the clinical data than the scores of visual reading: with age (r=0.730, P<0.001; r=0.753, P<0.001) and with strength of the knee extensor (r=-0.468, P=0.024; r=-0.460, P=0.027), respectively. Meanwhile, histogram M value analysis had better repeatability than visual reading, with interclass correlation coefficients of 0.998 (95% CI: 0.997-0.998, P<0.001) and 0.958 (95% CI: 0.946-0.967, P<0.001), respectively. Histogram M value analysis of MRI, with the advantages of repeatability and objectivity, can be used to evaluate the degree of muscle fatty infiltration.
Dissimilarity representations in lung parenchyma classification
NASA Astrophysics Data System (ADS)
Sørensen, Lauge; de Bruijne, Marleen
2009-02-01
A good problem representation is important for a pattern recognition system to be successful. The traditional approach to statistical pattern recognition is feature representation. More specifically, objects are represented by a number of features in a feature vector space, and classifiers are built in this representation. This is also the general trend in lung parenchyma classification in computed tomography (CT) images, where the features often are measures on feature histograms. Instead, we propose to build normal-density-based classifiers in dissimilarity representations for lung parenchyma classification. This allows the classifiers to work on dissimilarities between objects, which might be a more natural way of representing lung parenchyma. In this context, dissimilarity is defined between CT regions of interest (ROIs). ROIs are represented by their CT attenuation histogram, and ROI dissimilarity is defined as a histogram dissimilarity measure between the attenuation histograms. In this setting, the full histograms are utilized according to the chosen histogram dissimilarity measure. We apply this idea to classification of different emphysema patterns as well as normal, healthy tissue. Two dissimilarity representation approaches as well as different histogram dissimilarity measures are considered. The approaches are evaluated on a set of 168 CT ROIs using normal-density-based classifiers, all showing good performance. Compared to using histogram dissimilarity directly as the distance in a k-nearest-neighbor classifier, which achieves a classification accuracy of 92.9%, the best dissimilarity-representation-based classifier is significantly better, with a classification accuracy of 97.0% (p = 0.046).
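The dissimilarity representation can be sketched as follows (with L1 distance as an illustrative histogram dissimilarity measure and synthetic "attenuation" histograms; the paper evaluates several measures and normal-density-based classifiers, which are not shown):

```python
import numpy as np

def l1_dissimilarity(h1, h2):
    """Histogram dissimilarity: L1 distance between normalized histograms
    (one illustrative choice among the measures the paper compares)."""
    return float(np.abs(h1 - h2).sum())

def dissimilarity_representation(histograms, prototypes):
    """Represent every ROI histogram by its vector of dissimilarities to
    a fixed prototype set; vector-space classifiers can then be trained
    on these dissimilarity vectors instead of on raw features."""
    return np.array([[l1_dissimilarity(h, p) for p in prototypes]
                     for h in histograms])

rng = np.random.default_rng(2)
def make_hists(center, n=20):
    # synthetic attenuation histograms: a Gaussian bump plus tiny noise
    h = np.exp(-0.5 * ((np.arange(64) - center) / 5.0) ** 2) + 0.01
    h = np.tile(h / h.sum(), (n, 1))
    return h + rng.uniform(0, 1e-3, h.shape)

class_a, class_b = make_hists(20), make_hists(40)   # two tissue "patterns"
prototypes = np.vstack([class_a[0], class_b[0]])
D = dissimilarity_representation(np.vstack([class_a, class_b]), prototypes)
```

Each row of `D` is now an ordinary feature vector (distances to the prototypes), so any density-based classifier can be built in this space.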
NASA Astrophysics Data System (ADS)
Manz, Christoph; Kobitski, Andrei Yu.; Samanta, Ayan; Jäschke, Andres; Nienhaus, G. Ulrich
2018-03-01
RNA (ribonucleic acid) molecules are highly flexible biopolymers fluctuating at physiological temperatures among many different conformations that are represented by minima in a hierarchical conformational free energy landscape. Here we have employed single-molecule FRET (smFRET) to explore the energy landscape of the B. subtilis yitJ SAM-I riboswitch (RS). In this small RNA molecule, specific binding of an S-adenosyl-L-methionine (SAM) ligand in the aptamer domain regulates gene expression by inducing structural changes in another domain, the expression platform, causing transcription termination by the RNA polymerase. We have measured smFRET histograms over wide ranges of Mg2+ concentration for three RS variants that were specifically labeled with fluorescent dyes on different sites. In the analysis, different conformations are associated with discrete Gaussian model distributions, which are typically fairly broad on the FRET efficiency scale and thus can be extremely challenging to unravel due to their mutual overlap. Our earlier work on two SAM-I RS variants revealed four major conformations. By introducing a global fitting procedure which models both the Mg2+ concentration dependencies of the fractional populations and the average FRET efficiencies of the individual FRET distributions according to Mg2+ binding isotherms, we were able to consistently describe the histogram data of both variants at all studied Mg2+ concentrations. With the third FRET-labeled variant, however, we found significant deviations when applying the four-state model to the data. This can arise because the different FRET labeling of the new variant allows two states to be distinguished that were previously not separable due to overlap. Indeed, the resulting five-state model presented here consistently describes the smFRET histograms of all three variants as well as their variations with Mg2+ concentration. 
We also performed a triangulation of the donor position for two of the constructs to explore how the expression platform is oriented with respect to the aptamer.
SAMBA: Sparse Approximation of Moment-Based Arbitrary Polynomial Chaos
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ahlfeld, R., E-mail: r.ahlfeld14@imperial.ac.uk; Belkouchi, B.; Montomoli, F.
2016-09-01
A new arbitrary Polynomial Chaos (aPC) method is presented for moderately high-dimensional problems characterised by limited input data availability. The proposed methodology improves the aPC algorithm and extends the method, which was previously introduced only as a tensor product expansion, to moderately high-dimensional stochastic problems. The fundamental idea of aPC is to use the statistical moments of the input random variables to develop the polynomial chaos expansion. This approach makes it possible to propagate continuous or discrete probability density functions, and also histograms (data sets), as long as their moments exist and are finite and the determinant of the moment matrix is strictly positive. For cases with limited data availability, this approach avoids bias and fitting errors caused by wrong assumptions. In this work, an alternative way to calculate the aPC is suggested, which provides the optimal polynomials, Gaussian quadrature collocation points and weights from the moments using only a handful of matrix operations on the Hankel matrix of moments. It can therefore be implemented without requiring prior knowledge about statistical data analysis or a detailed understanding of the mathematics of polynomial chaos expansions. The extension to more input variables suggested in this work is an anisotropic and adaptive version of Smolyak's algorithm that is based solely on the moments of the input probability distributions. It is referred to as SAMBA (PC), short for Sparse Approximation of Moment-Based Arbitrary Polynomial Chaos. It is illustrated that for moderately high-dimensional problems (up to 20 different input variables or histograms) SAMBA can significantly simplify the calculation of sparse Gaussian quadrature rules.
SAMBA's efficiency for multivariate functions with regard to data availability is further demonstrated by analysing higher-order convergence and accuracy for a set of nonlinear test functions with 2, 5 and 10 different input distributions or histograms.
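The core computational step described above — recovering Gaussian quadrature nodes and weights from the Hankel matrix of moments with a handful of matrix operations — can be sketched via the classical Cholesky/Golub-Welsch route. This is a minimal illustration under that standard construction, not the authors' SAMBA implementation; the function name and structure are our own.

```python
import numpy as np

def quadrature_from_moments(moments):
    """n-point Gauss quadrature from raw moments m_0..m_2n (Golub-Welsch).
    Assumes the Hankel moment matrix is positive definite."""
    m = np.asarray(moments, dtype=float)
    n = (len(m) - 1) // 2
    # Hankel matrix of moments, H[i, j] = m[i + j]
    H = np.array([[m[i + j] for j in range(n + 1)] for i in range(n + 1)])
    R = np.linalg.cholesky(H).T          # upper-triangular Cholesky factor
    d = np.diag(R)                       # diagonal entries
    e = np.diag(R, 1)                    # first superdiagonal
    # Three-term recurrence coefficients of the orthogonal polynomials
    alpha = np.empty(n)
    alpha[0] = e[0] / d[0]
    for k in range(1, n):
        alpha[k] = e[k] / d[k] - e[k - 1] / d[k - 1]
    beta = np.array([d[k + 1] / d[k] for k in range(n - 1)])
    # Eigen-decomposition of the Jacobi matrix gives nodes and weights
    J = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
    nodes, V = np.linalg.eigh(J)
    weights = m[0] * V[0] ** 2
    return nodes, weights
```

For example, the moments of the unit weight on [-1, 1] (2, 0, 2/3, 0, 2/5) reproduce the two-point Gauss-Legendre rule.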
MCNP Output Data Analysis with ROOT (MODAR)
NASA Astrophysics Data System (ADS)
Carasco, C.
2010-06-01
MCNP Output Data Analysis with ROOT (MODAR) is a tool based on CERN's ROOT software. MODAR has been designed to handle time-energy data issued by MCNP simulations of neutron inspection devices using the associated particle technique. MODAR exploits ROOT's Graphical User Interface and functionalities to visualize and process MCNP simulation results in a fast and user-friendly way. MODAR allows the user to take into account the detection system's time resolution (which is not possible with MCNP) as well as the detectors' energy response functions and counting statistics in a straightforward way. Program summary: Program title: MODAR Catalogue identifier: AEGA_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGA_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 155 373 No. of bytes in distributed program, including test data, etc.: 14 815 461 Distribution format: tar.gz Programming language: C++ Computer: Most Unix workstations and PCs Operating system: Most Unix systems, Linux and Windows, provided the ROOT package has been installed. Examples were tested under SUSE Linux and Windows XP. RAM: Depends on the size of the MCNP output file. The example presented in the article, which involves three two-dimensional 139×740-bin histograms, allocates about 60 MB. These figures apply when running under ROOT and include memory consumed by ROOT itself. Classification: 17.6 External routines: ROOT version 5.24.00 (http://root.cern.ch/drupal/) Nature of problem: The output of an MCNP simulation is an ASCII file. The data processing is usually performed by copying and pasting the relevant parts of the ASCII file into Microsoft Excel.
Such an approach is satisfactory when the quantity of data is small, but it is not efficient when the size of the simulated data is large, for example when time-energy correlations are studied in detail, as in problems involving the associated particle technique. In addition, since the finite time resolution of the simulated detector cannot be modeled with MCNP, systems in which time-energy correlation is crucial cannot be described in a satisfactory way. Finally, realistic particle energy deposit in detectors is calculated with MCNP in a two-step process involving type-5 then type-8 tallies. In the first step, the photon flux energy spectrum associated with a time region is selected and serves as a source energy distribution for the second step. Thus, several files must be manipulated before obtaining the result, which can be time consuming if one needs to study several time regions or different detector performances. In the same way, modeling the counting statistics obtained in a limited acquisition time requires several steps and can also be time consuming. Solution method: In order to overcome the previous limitations, the MODAR C++ code has been written to make use of CERN's ROOT data analysis software. MCNP output data are read from the MCNP output file with dedicated routines. Two-dimensional histograms are filled and can be handled efficiently within the ROOT framework. To keep the analysis tool user-friendly, all processing and data display can be done by means of the ROOT Graphical User Interface. Specific routines have been written to include the detectors' finite time resolution and energy response functions as well as counting statistics in a straightforward way. Additional comments: The possibility of adding tallies has also been incorporated in MODAR in order to describe systems in which the signals from several detectors can be summed. Moreover, MODAR can be adapted to handle other problems involving two-dimensional data.
Running time: The CPU time needed to smear a two-dimensional histogram depends on the size of the histogram. In the presented example, the time-energy smearing of one of the 139×740 two-dimensional histograms takes 3 minutes on a Dell computer equipped with an Intel Core 2 processor.
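The time-resolution smearing that MODAR adds on top of MCNP can be illustrated, in outline only, by convolving histogram counts with a normalized Gaussian kernel. This sketch is not MODAR's C++/ROOT code; the function name and interface are assumptions for illustration.

```python
import numpy as np

def smear_histogram(counts, bin_width, sigma):
    """Convolve histogram counts with a Gaussian kernel of standard
    deviation `sigma` (same units as `bin_width`) to mimic a detector's
    finite time (or energy) resolution."""
    half = int(np.ceil(4 * sigma / bin_width))          # +/- 4 sigma support
    x = np.arange(-half, half + 1) * bin_width
    kernel = np.exp(-0.5 * (x / sigma) ** 2)
    kernel /= kernel.sum()                              # preserve total counts
    return np.convolve(counts, kernel, mode="same")
```

Smearing a single sharp peak spreads it over neighboring bins while keeping the integral fixed, which is the behavior needed before comparing to measured time-energy spectra.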
Thresholding histogram equalization.
Chuang, K S; Chen, S; Hwang, I M
2001-12-01
The drawbacks of adaptive histogram equalization techniques are the loss of definition on the edges of the object and overenhancement of noise in the images. These drawbacks can be avoided if the noise is excluded in the equalization transformation function computation. A method has been developed to separate the histogram into zones, each with its own equalization transformation. This method can be used to suppress the nonanatomic noise and enhance only certain parts of the object. This method can be combined with other adaptive histogram equalization techniques. Preliminary results indicate that this method can produce images with superior contrast.
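A minimal sketch of the zone-wise idea described above, assuming zones are delimited by user-chosen gray-level thresholds and each zone is equalized onto its own range; the paper's exact transformation functions may differ, and all names here are our own.

```python
import numpy as np

def zoned_equalization(img, thresholds):
    """Split the gray-level range at `thresholds` and equalize each zone
    with its own CDF, so noise-dominated zones do not distort the
    transformation applied to the anatomy of interest."""
    out = np.zeros_like(img, dtype=float)
    edges = [img.min()] + sorted(thresholds) + [img.max() + 1]
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (img >= lo) & (img < hi)
        if not mask.any():
            continue
        vals = img[mask].astype(float)
        # Empirical CDF of this zone maps the zone onto itself
        ranks = np.searchsorted(np.sort(vals), vals, side="right") / vals.size
        out[mask] = lo + ranks * (hi - 1 - lo)
    return out
```

Because each zone maps onto itself, pixels never cross a zone boundary, which is what keeps background noise from being amplified into the object's gray-level range.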
Morikawa, Kei; Kurimoto, Noriaki; Inoue, Takeo; Mineshita, Masamichi; Miyazawa, Teruomi
2015-01-01
Endobronchial ultrasonography using a guide sheath (EBUS-GS) is an increasingly common bronchoscopic technique, but currently, no methods have been established to quantitatively evaluate EBUS images of peripheral pulmonary lesions. The purpose of this study was to evaluate whether histogram data collected from EBUS-GS images can contribute to the diagnosis of lung cancer. Histogram-based analyses focusing on the brightness of EBUS images were retrospectively conducted: 60 patients (38 with lung cancer; 22 with inflammatory diseases) with clear EBUS images were included. For each patient, a 400-pixel region of interest, typically located at a 3- to 5-mm radius from the probe, was selected from EBUS images recorded during bronchoscopy. Histogram height, width, height/width ratio, standard deviation, kurtosis and skewness were investigated as diagnostic indicators. Median histogram height, width, height/width ratio and standard deviation were significantly different between lung cancer and benign lesions (all p < 0.01). With a cutoff value for standard deviation of 10.5, lung cancer could be diagnosed with an accuracy of 81.7%. The other characteristics investigated were inferior to histogram standard deviation. Histogram standard deviation appears to be the most useful characteristic for diagnosing lung cancer using EBUS images. © 2015 S. Karger AG, Basel.
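The brightness-histogram indicators studied (standard deviation, skewness, kurtosis of the region of interest) can be computed as below. This is a generic sketch of those standard statistics, not the study's analysis software.

```python
import numpy as np

def roi_histogram_stats(roi):
    """Brightness statistics of a region of interest: mean, standard
    deviation, skewness, and (non-excess) kurtosis."""
    v = np.asarray(roi, dtype=float).ravel()
    mu = v.mean()
    sd = v.std()
    skew = ((v - mu) ** 3).mean() / sd ** 3
    kurt = ((v - mu) ** 4).mean() / sd ** 4   # normal distribution gives 3.0
    return {"mean": mu, "std": sd, "skewness": skew, "kurtosis": kurt}
```

With a cutoff on `std` (the study reports 10.5 as optimal), such statistics could feed directly into a simple classification rule.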
NASA Astrophysics Data System (ADS)
Qin, Lin; Fan, Shanhui; Zhou, Chuanqing
2017-04-01
To implement optical coherence tomography (OCT) angiography on a low-scanning-speed OCT system, we developed a joint phase and amplitude method to generate 3-D angiograms by analysing the frequency distribution of signals from non-moving and moving scatterers and separating the signals from tissue and blood flow with a dynamically set high-pass filter. This approach first compensates the sample motion between adjacent A-lines. Then, from the corrected phase information, a histogram method is used to determine the bulk non-moving tissue phase dynamically, which is regarded as the cut-off frequency of a high-pass filter, and the moving and non-moving scatterers are separated with this filter. The reconstructed image can visualize the flow of moving scatterers and enables volumetric flow mapping combined with the corrected phase information. Furthermore, retinal and choroidal blood vessels can be obtained simultaneously by separating each B-scan into retinal and choroidal parts using a simple segmentation algorithm along the RPE. After compensation of axial displacements between neighbouring images, the three-dimensional vasculature of ocular vessels has been visualized. Experiments were performed to demonstrate the effectiveness of the proposed method for 3-D vasculature imaging of the human retina and choroid. The results revealed depth-resolved vasculatures in the retina and choroid, suggesting that our approach can be used for noninvasive, three-dimensional angiography with a low-speed clinical OCT system, and that it has great potential for clinical application.
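The histogram step — estimating the bulk (non-moving tissue) phase to use as the high-pass cutoff — might be sketched as taking the mode of the A-line phase-difference histogram. This is an assumption-laden outline of that one step, not the authors' pipeline; names are our own.

```python
import numpy as np

def bulk_phase_cutoff(phase_diffs, bins=64):
    """Estimate the bulk non-moving tissue phase as the mode of the
    phase-difference histogram over [-pi, pi]; this value serves as the
    cut-off for separating moving from non-moving scatterers."""
    hist, edges = np.histogram(phase_diffs, bins=bins, range=(-np.pi, np.pi))
    i = int(np.argmax(hist))                 # most populated bin = bulk tissue
    return 0.5 * (edges[i] + edges[i + 1])   # bin center
```

Phase differences far from this mode would then be attributed to moving scatterers (flow) by the high-pass filter.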
Automatic detection method for mura defects on display film surface using modified Weber's law
NASA Astrophysics Data System (ADS)
Kim, Myung-Muk; Lee, Seung-Ho
2014-07-01
We propose a method that automatically detects mura defects on display film surfaces using a modified version of Weber's law. The proposed method detects mura defects regardless of their properties and shapes by identifying regions perceived by human vision as mura, using pixel brightness and the distribution ratio of mura in the image histogram. The proposed detection method comprises five stages. In the first stage, the display film surface image is acquired and a gray-level shift is performed. In the second and third stages, the image histogram is acquired and analyzed, respectively. In the fourth stage, the mura range is acquired. This is followed by postprocessing in the fifth stage. Evaluations of the proposed method conducted using 200 display film mura image samples indicate a maximum detection rate of approximately 95.5%. Further, the results of applying the Semu index for luminance mura in flat panel display (FPD) image quality inspection indicate that the proposed method is more reliable than a popular conventional method.
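As a loose illustration of a Weber-contrast criterion for perceptibility — not the paper's modified version; the threshold `k` and the global-median background level are assumptions made here for the sketch:

```python
import numpy as np

def weber_mura_mask(img, k=0.02):
    """Flag pixels whose Weber contrast |I - I_bg| / I_bg against a
    global background estimate exceeds threshold k, i.e. regions a human
    observer could perceive as mura."""
    bg = float(np.median(img))               # crude background luminance
    contrast = np.abs(img.astype(float) - bg) / max(bg, 1e-9)
    return contrast > k
```

The paper's method additionally uses the histogram's distribution ratio to set the mura range, rather than a fixed constant.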
Tackling action-based video abstraction of animated movies for video browsing
NASA Astrophysics Data System (ADS)
Ionescu, Bogdan; Ott, Laurent; Lambert, Patrick; Coquin, Didier; Pacureanu, Alexandra; Buzuloiu, Vasile
2010-07-01
We address the issue of producing automatic video abstracts in the context of the video indexing of animated movies. For a quick browse of a movie's visual content, we propose a storyboard-like summary, which follows the movie's events by retaining one key frame for each specific scene. To capture the shot's visual activity, we use histograms of cumulative interframe distances, and the key frames are selected according to the distribution of the histogram's modes. For a preview of the movie's exciting action parts, we propose a trailer-like video highlight, whose aim is to show only the most interesting parts of the movie. Our method is based on a relatively standard approach, i.e., highlighting action through the analysis of the movie's rhythm and visual activity information. To suit every type of movie content, including predominantly static movies or movies without exciting parts, the concept of action depends on the movie's average rhythm. The efficiency of our approach is confirmed through several end-user studies.
Generalized image contrast enhancement technique based on Heinemann contrast discrimination model
NASA Astrophysics Data System (ADS)
Liu, Hong; Nodine, Calvin F.
1994-03-01
This paper presents a generalized image contrast enhancement technique which equalizes perceived brightness based on the Heinemann contrast discrimination model. This is a modified algorithm which presents an improvement over the previous study by Mokrane in its mathematically proven existence of a unique solution and in its easily tunable parameterization. The model uses a log-log representation of contrast luminosity between targets and the surround in a fixed luminosity background setting. The algorithm consists of two nonlinear gray-scale mapping functions which have seven parameters, two of which are adjustable Heinemann constants. Another parameter is the background gray level. The remaining four parameters are nonlinear functions of gray scale distribution of the image, and can be uniquely determined once the previous three are given. Tests have been carried out to examine the effectiveness of the algorithm for increasing the overall contrast of images. It can be demonstrated that the generalized algorithm provides better contrast enhancement than histogram equalization. In fact, the histogram equalization technique is a special case of the proposed mapping.
Significance levels for studies with correlated test statistics.
Shi, Jianxin; Levinson, Douglas F; Whittemore, Alice S
2008-07-01
When testing large numbers of null hypotheses, one needs to assess the evidence against the global null hypothesis that none of the hypotheses is false. Such evidence typically is based on the test statistic of the largest magnitude, whose statistical significance is evaluated by permuting the sample units to simulate its null distribution. Efron (2007) has noted that correlation among the test statistics can induce substantial interstudy variation in the shapes of their histograms, which may cause misleading tail counts. Here, we show that permutation-based estimates of the overall significance level also can be misleading when the test statistics are correlated. We propose that such estimates be conditioned on a simple measure of the spread of the observed histogram, and we provide a method for obtaining conditional significance levels. We justify this conditioning using the conditionality principle described by Cox and Hinkley (1974). Application of the method to gene expression data illustrates the circumstances when conditional significance levels are needed.
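The baseline permutation estimate that the authors argue can mislead under correlation — the p-value of the maximum-magnitude test statistic — looks like this in outline. A generic sketch of that standard max-statistic procedure, not the paper's conditional method; names are our own.

```python
import numpy as np

def max_stat_pvalue(stats, perm_stats):
    """Permutation p-value for the global null based on the test statistic
    of largest magnitude. `stats` holds the observed statistics; each row
    of `perm_stats` holds the statistics from one permutation of the
    sample units."""
    observed = np.max(np.abs(stats))
    perm_max = np.max(np.abs(perm_stats), axis=1)   # one maximum per permutation
    # Add-one correction keeps the estimate away from exactly zero
    return (1 + np.sum(perm_max >= observed)) / (1 + len(perm_max))
```

The paper's proposal is to condition this estimate on a measure of the spread of the observed histogram of statistics, since correlation inflates inter-study variation in that spread.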
Xu, Xiao-Quan; Ma, Gao; Wang, Yan-Jun; Hu, Hao; Su, Guo-Yi; Shi, Hai-Bin; Wu, Fei-Yun
2017-07-18
To evaluate the correlation between histogram parameters derived from diffusion-kurtosis (DK) imaging and the clinical stage of nasopharyngeal carcinoma (NPC). High T-stage (T3/4) NPC showed significantly higher Kapp-mean (P = 0.018), Kapp-median (P = 0.029) and Kapp-90th (P = 0.003) than low T-stage (T1/2) NPC. High N-stage NPC (N2/3) showed significantly lower Dapp-mean (P = 0.002), Dapp-median (P = 0.002) and Dapp-10th (P < 0.001) than low N-stage NPC (N0/1). High AJCC-stage NPC (III/IV) showed significantly lower Dapp-10th (P = 0.038) than low AJCC-stage NPC (I/II). ROC analyses indicated that Kapp-90th was optimal for predicting high T-stage (AUC, 0.759; sensitivity, 0.842; specificity, 0.607), while Dapp-10th was best for predicting high N- and AJCC-stage (N-stage, AUC, 0.841; sensitivity, 0.875; specificity, 0.807; AJCC-stage, AUC, 0.671; sensitivity, 0.800; specificity, 0.588). DK imaging data of forty-seven consecutive NPC patients were retrospectively analyzed. Apparent diffusion for Gaussian distribution (Dapp) and apparent kurtosis coefficient (Kapp) were generated using diffusion-kurtosis model. Histogram parameters, including mean, median, 10th, 90th percentiles, skewness and kurtosis of Dapp and Kapp were calculated. Patients were divided into low and high T, N and clinical stage based on American Joint Committee on Cancer (AJCC) staging system. Differences of histogram parameters between low and high T, N and AJCC stages were compared using t test. Multiple receiver operating characteristic (ROC) curves were used to determine and compare the value of significant parameters in predicting high T, N and AJCC stage, respectively. DK imaging-derived parameters correlated well with clinical stage of NPC, therefore could serve as an adjunctive imaging technique for evaluating NPC.
Chung, Hoi Sung; Gopich, Irina V; McHale, Kevin; Cellmer, Troy; Louis, John M; Eaton, William A
2011-04-28
Recently developed statistical methods by Gopich and Szabo were used to extract folding and unfolding rate coefficients from single-molecule Förster resonance energy transfer (FRET) data for proteins with kinetics too fast to measure waiting time distributions. Two types of experiments and two different analyses were performed. In one experiment bursts of photons were collected from donor and acceptor fluorophores attached to a 73-residue protein, α(3)D, freely diffusing through the illuminated volume of a confocal microscope system. In the second, the protein was immobilized by linkage to a surface, and photons were collected until one of the fluorophores bleached. Folding and unfolding rate coefficients and mean FRET efficiencies for the folded and unfolded subpopulations were obtained from a photon by photon analysis of the trajectories using a maximum likelihood method. The ability of the method to describe the data in terms of a two-state model was checked by recoloring the photon trajectories with the extracted parameters and comparing the calculated FRET efficiency histograms with the measured histograms. The sum of the rate coefficients for the two-state model agreed to within 30% with the relaxation rate obtained from the decay of the donor-acceptor cross-correlation function, confirming the high accuracy of the method. Interestingly, apparently reliable rate coefficients could be extracted using the maximum likelihood method, even at low (<10%) population of the minor component where the cross-correlation function was too noisy to obtain any useful information. The rate coefficients and mean FRET efficiencies were also obtained in an approximate procedure by simply fitting the FRET efficiency histograms, calculated by binning the donor and acceptor photons, with a sum of three-Gaussian functions. 
The kinetics are exposed in these histograms by the growth of a FRET efficiency peak at values intermediate between the folded and unfolded peaks as the bin size increases, a phenomenon with similarities to NMR exchange broadening. When comparable populations of folded and unfolded molecules are present, this method yields rate coefficients in very good agreement with those obtained with the maximum likelihood method. As a first step toward characterizing transition paths, the Viterbi algorithm was used to locate the most probable transition points in the photon trajectories.
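The binning step that produces the FRET efficiency histograms described above can be sketched as follows; a generic illustration of E = nA/(nA + nD) per bin, not the authors' analysis code, with inputs assumed to be per-interval photon counts.

```python
import numpy as np

def fret_efficiency_histogram(donor, acceptor, bin_size):
    """Sum donor and acceptor photon counts over consecutive bins of
    `bin_size` intervals and return the per-bin FRET efficiencies
    E = nA / (nA + nD), the quantity histogrammed in the paper."""
    n = (len(donor) // bin_size) * bin_size          # drop incomplete tail bin
    d = np.asarray(donor[:n]).reshape(-1, bin_size).sum(axis=1)
    a = np.asarray(acceptor[:n]).reshape(-1, bin_size).sum(axis=1)
    total = d + a
    keep = total > 0                                 # skip empty bins
    return a[keep] / total[keep]
```

As the bin size grows past the folding relaxation time, bins increasingly average over folded and unfolded intervals, producing the intermediate-E peak the abstract likens to NMR exchange broadening.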
Kim, Ji Youn; Kim, Hai-Joong; Hahn, Meong Hi; Jeon, Hye Jin; Cho, Geum Joon; Hong, Sun Chul; Oh, Min Jeong
2013-09-01
Our aim was to determine whether the volumetric gray-scale histogram difference between the anterior and posterior cervix can indicate the extent of cervical consistency. We collected data on 95 patients who were appropriate for vaginal delivery at 36 to 37 weeks of gestational age from September 2010 to October 2011 in the Department of Obstetrics and Gynecology, Korea University Ansan Hospital. Patients were excluded if they had any of the following: Cesarean section, labor induction, or premature rupture of membranes. Thirty-four patients were finally enrolled. The patients underwent evaluation of the cervix by Bishop score, cervical length, cervical volume, and three-dimensional (3D) cervical volumetric gray-scale histogram. The interval in days from cervical evaluation to delivery was counted. We compared the 3D cervical volumetric gray-scale histogram, Bishop score, cervical length, and cervical volume with the interval in days from cervical evaluation to delivery. The gray-scale histogram difference between the anterior and posterior cervix was significantly correlated with days to delivery; its correlation coefficient (R) was 0.500 (P = 0.003). The cervical length was also significantly related to days to delivery; the correlation coefficient (R) and P-value were 0.421 and 0.013, respectively. However, anterior lip histogram, posterior lip histogram, total cervical volume, and Bishop score were not associated with days to delivery (P > 0.05). Both the gray-scale histogram difference between the anterior and posterior cervix and the cervical length correlated with days to delivery, and these measures may help predict cervical consistency.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1983-01-01
This volume contains geology of the Durango D detail area, radioactive mineral occurrences in Colorado, and geophysical data interpretation. Eight appendices provide: stacked profiles, geologic histograms, geochemical histograms, speed and altitude histograms, geologic statistical tables, geochemical statistical tables, magnetic and ancillary profiles, and test line data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1983-01-01
Geology of Durango C detail area, radioactive mineral occurrences in Colorado, and geophysical data interpretation are included in this report. Eight appendices provide: stacked profiles, geologic histograms, geochemical histograms, speed and altitude histograms, geologic statistical tables, magnetic and ancillary profiles, and test line data.
Random Matrix Theory and Elliptic Curves
2014-11-24
Distribution is unlimited. ELLIPTIC CURVES AND THEIR L-FUNCTIONS: … points on that curve. Counting rational points on curves is a field with a rich … deficiency of zeros near the origin of the histograms in Figure 1. While as d becomes large this discretization becomes smaller and has less and less effect … (on the order of 30), the regular oscillations seen at the origin become dominated by fluctuations of an arithmetic origin, influenced by zeros of the Riemann …
Differentially Constrained Motion Planning with State Lattice Motion Primitives
2012-02-01
… datapoint distribution in such histograms to a scalar may be used. One example is Kullback-Leibler divergence; an even simpler method is a sum of … the Coupled Layer Architecture for Robotic Autonomy (CLARAty) system at the Jet Propulsion Laboratory. This allowed us to test the application of … good fit to extend the tree or the graph towards a random sample. However, by virtue of the regular structure of the state samples, lattice …
Dong, Yang; He, Honghui; He, Chao; Zhou, Jialing; Zeng, Nan; Ma, Hui
2016-08-10
Silk fibers suffer from microstructural changes due to various external environmental conditions including daily washings. In this paper, we take the backscattering Mueller matrix images of silk samples for non-destructive and real-time quantitative characterization of the wavelength-scale microstructure and examination of the effects of washing by different detergents. The 2D images of the 16 Mueller matrix elements are reduced to the frequency distribution histograms (FDHs) whose central moments reveal the dominant structural features of the silk fibers. A group of new parameters are also proposed to characterize the wavelength-scale microstructural changes of the silk samples during the washing processes. Monte Carlo (MC) simulations are carried out to better understand how the Mueller matrix parameters are related to the wavelength-scale microstructure of silk fibers. The good agreement between experiments and simulations indicates that the Mueller matrix polarimetry and FDH based parameters can be used to quantitatively detect the wavelength-scale microstructural features of silk fibers. Mueller matrix polarimetry may be used as a powerful tool for non-destructive and in situ characterization of the wavelength-scale microstructures of silk based materials.
NASA Astrophysics Data System (ADS)
Dong, Yang; He, Honghui; He, Chao; Ma, Hui
2017-02-01
Mueller matrix polarimetry is a powerful tool for detecting microscopic structures and can therefore be used to monitor physiological changes of tissue samples. Meanwhile, spectral features of scattered light can also provide abundant microstructural information about tissues. In this paper, we take 2D multispectral backscattering Mueller matrix images of bovine skeletal muscle tissues and analyze their temporal variation using multispectral Mueller matrix parameters. The 2D images of the Mueller matrix elements are reduced to multispectral frequency distribution histograms (mFDHs) to reveal the dominant structural features of the muscle samples more clearly. For quantitative analysis, the multispectral Mueller matrix transformation (MMT) parameters are calculated to characterize the microstructural variations during the rigor mortis and proteolysis processes of the skeletal muscle tissue samples. The experimental results indicate that the multispectral MMT parameters can be used to distinguish different physiological stages of bovine skeletal muscle tissues over 24 hours, and that, combined with the multispectral technique, Mueller matrix polarimetry and FDH analysis can monitor the microstructural variation features of skeletal muscle samples. The techniques may be used for quick assessment and quantitative monitoring of meat quality in the food industry.
NASA Astrophysics Data System (ADS)
He, Honghui; Dong, Yang; Zhou, Jialing; Ma, Hui
2017-03-01
As one of the salient features of light, polarization contains abundant structural and optical information about media. Recently, as a comprehensive description of polarization properties, Mueller matrix polarimetry has been applied to various biomedical studies such as the detection of cancerous tissues. Previous works have found that the structural information encoded in 2D Mueller matrix images can be presented by transformed parameters with a more explicit relationship to certain microstructural features. In this paper, we present a statistical analysis method that transforms the 2D Mueller matrix images into frequency distribution histograms (FDHs) and their central moments to reveal the dominant structural features of samples quantitatively. The experimental results for porcine heart, intestine, stomach, and liver tissues demonstrate that the transformation parameters and central moments based on the statistical analysis of Mueller matrix elements have simple relationships to the dominant microstructural properties of biomedical samples, including the density and orientation of fibrous structures and the depolarization power, diattenuation and absorption abilities. This shows that the statistical analysis of 2D images of Mueller matrix elements may provide quantitative or semi-quantitative criteria for biomedical diagnosis.
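The reduction of a 2D element image to a frequency distribution histogram and its central moments might be sketched as below; a generic outline of that statistical step, not the authors' code.

```python
import numpy as np

def fdh_central_moments(element_image, bins=128):
    """Reduce one Mueller matrix element image to its frequency
    distribution histogram (FDH) and return the mean plus the 2nd-4th
    central moments computed from the histogram."""
    v = np.asarray(element_image, dtype=float).ravel()
    hist, edges = np.histogram(v, bins=bins, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    w = hist * np.diff(edges)                 # normalized bin probabilities
    mean = float(np.sum(w * centers))
    moments = [float(np.sum(w * (centers - mean) ** k)) for k in (2, 3, 4)]
    return mean, moments
```

The second moment (variance) tracks spread, while the third and fourth capture the asymmetry and peakedness features the FDH analysis relates to fibrous-structure density and orientation.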
NASA Astrophysics Data System (ADS)
Chen, Tao; Lai, Wei; Du, Nan-Shan
1994-03-01
Monthly investigations were made on a population of the Chinese freshwater crab, Sinopotamon yangtsekiense Bott, 1967, from April 1984 to March 1985. The data on 4413 specimens show that growth was affected mainly by temperature. During the April to November growth period, the crabs' major development occurred from June through October. One year was required for a fine white oocyte to develop into a mature egg. The reproduction period was June to October. Females bearing eggs were taken from June to August, and crabs with young were found from July to October. The females reproduced once a year but could reproduce for more than one year. The number of eggs carried by a female varied greatly according to the size of the crab, ranging from 30 to 100 eggs. New-born crabs become mature after 1 to 2 years. The sex ratio was approximately 1:1 in the overall population. However, the larger crabs were predominantly male. The age distribution of S. yangtsekiense was estimated from size-frequency histograms. There were more adult crabs (over 70%) from June to October and more immature crabs (over 50%) from November to May.
[Glossary of terms used by radiologists in image processing].
Rolland, Y; Collorec, R; Bruno, A; Ramée, A; Morcet, N; Haigron, P
1995-01-01
We give the definition of 166 words used in image processing. Adaptivity, aliasing, analog-digital converter, analysis, approximation, arc, artifact, artificial intelligence, attribute, autocorrelation, bandwidth, boundary, brightness, calibration, class, classification, classify, centre, cluster, coding, color, compression, contrast, connectivity, convolution, correlation, data base, decision, decomposition, deconvolution, deduction, descriptor, detection, digitization, dilation, discontinuity, discretization, discrimination, disparity, display, distance, distortion, distribution, dynamic, edge, energy, enhancement, entropy, erosion, estimation, event, extrapolation, feature, file, filter, filter floaters, fitting, Fourier transform, frequency, fusion, fuzzy, Gaussian, gradient, graph, gray level, group, growing, histogram, Hough transform, Hounsfield, image, impulse response, inertia, intensity, interpolation, interpretation, invariance, isotropy, iterative, JPEG, knowledge base, label, Laplacian, learning, least squares, likelihood, matching, Markov field, mask, mathematical morphology, merge (to), MIP, median, minimization, model, moiré, moment, MPEG, neural network, neuron, node, noise, norm, normal, operator, optical system, optimization, orthogonal, parametric, pattern recognition, periodicity, photometry, pixel, polygon, polynomial, prediction, pulsation, pyramidal, quantization, raster, reconstruction, recursive, region, rendering, representation space, resolution, restoration, robustness, ROC, thinning, transform, sampling, saturation, scene analysis, segmentation, separable function, sequential, smoothing, spline, split (to), shape, threshold, tree, signal, speckle, spectrum, stationarity, statistical, stochastic, structuring element, support, syntactic, synthesis, texture, truncation, variance, vision, voxel, windowing.
Action recognition via cumulative histogram of multiple features
NASA Astrophysics Data System (ADS)
Yan, Xunshi; Luo, Yupin
2011-01-01
Spatial-temporal interest points (STIPs) are popular in human action recognition. However, they suffer from the difficulty of determining the codebook size and lose much information when histograms are formed. In this paper, spatial-temporal interest regions (STIRs) are proposed, which are based on STIPs and are capable of marking the locations of the most "shining" human body parts. To represent human actions, the proposed approach takes advantage of multiple features, including STIRs, pyramid histograms of oriented gradients and pyramid histograms of oriented optical flows. To achieve this, a cumulative histogram is used to integrate dynamic information in sequences and to form feature vectors. Furthermore, the widely used nearest neighbor and AdaBoost methods are employed as classification algorithms. Experiments on the public datasets KTH, Weizmann and UCF Sports show that the proposed approach achieves effective and robust results.
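The cumulative-histogram step — accumulating per-frame feature histograms over a sequence into one normalized descriptor — can be sketched as follows. The bin count, value range, and the idea of pooling scalar features per frame are illustrative assumptions, not the paper's exact pipeline.

```python
import numpy as np

def cumulative_histogram_descriptor(frame_features, bins=8, value_range=(0.0, 1.0)):
    """Integrate per-frame feature histograms over a sequence.

    frame_features: list of 1-D arrays, one per frame (e.g. feature values
    pooled from hypothetical interest regions). Returns an L1-normalized
    cumulative histogram as the sequence descriptor.
    """
    acc = np.zeros(bins)
    for feats in frame_features:
        h, _ = np.histogram(feats, bins=bins, range=value_range)
        acc += h  # accumulate dynamic information across frames
    total = acc.sum()
    return acc / total if total > 0 else acc

# toy sequence: three frames of scalar features
rng = np.random.default_rng(0)
seq = [rng.random(50) for _ in range(3)]
desc = cumulative_histogram_descriptor(seq)
```

The resulting fixed-length vector can then be fed to a nearest-neighbor or AdaBoost classifier regardless of sequence length.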
Anvil Clouds of Tropical Mesoscale Convective Systems in Monsoon Regions
NASA Technical Reports Server (NTRS)
Cetrone, J.; Houze, R. A., Jr.
2009-01-01
The anvil clouds of tropical mesoscale convective systems (MCSs) in West Africa, the Maritime Continent and the Bay of Bengal have been examined with TRMM and CloudSat satellite data and ARM ground-based radar observations. The anvils spreading out from the precipitating cores of MCSs are subdivided into thick, medium and thin portions. The thick portions of anvils show distinct differences from one climatological regime to another. In their upper portions, the thick anvils of West African MCSs have a broad, flat histogram of reflectivity, and a maximum of reflectivity in their lower portions. The reflectivity histogram of the Bay of Bengal thick anvils has a sharply peaked distribution of reflectivity at all altitudes with modal values that increase monotonically downward. The reflectivity histogram of the Maritime Continent thick anvils is intermediate between that of the West African and Bay of Bengal anvils, consistent with the fact that this region comprises a mix of land and ocean influences. It is suggested that the difference between the statistics of the continental and oceanic anvils is related to some combination of two factors: (1) the West African anvils tend to be closely tied to the convective regions of MCSs while the oceanic anvils are more likely to be extending outward from large stratiform precipitation areas of MCSs, and (2) the West African MCSs result from greater buoyancy, so that the convective cells are more likely to produce graupel particles and detrain them into anvils.
Accelerated weight histogram method for exploring free energy landscapes
NASA Astrophysics Data System (ADS)
Lindahl, V.; Lidmar, J.; Hess, B.
2014-07-01
Calculating free energies is an important and notoriously difficult task for molecular simulations. The rapid increase in computational power has made it possible to probe increasingly complex systems, yet extracting accurate free energies from these simulations remains a major challenge. Fully exploring the free energy landscape of, say, a biological macromolecule typically requires sampling large conformational changes and slow transitions. Often, the only feasible way to study such a system is to simulate it using an enhanced sampling method. The accelerated weight histogram (AWH) method is a new, efficient extended ensemble sampling technique which adaptively biases the simulation to promote exploration of the free energy landscape. The AWH method uses a probability weight histogram which allows for efficient free energy updates and results in an easy discretization procedure. A major advantage of the method is its general formulation, making it a powerful platform for developing further extensions and analyzing its relation to already existing methods. Here, we demonstrate its efficiency and general applicability by calculating the potential of mean force along a reaction coordinate for both a single dimension and multiple dimensions. We make use of a non-uniform, free energy dependent target distribution in reaction coordinate space so that computational efforts are not wasted on physically irrelevant regions. We present numerical results for molecular dynamics simulations of lithium acetate in solution and chignolin, a 10-residue long peptide that folds into a β-hairpin. We further present practical guidelines for setting up and running an AWH simulation.
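The free energy landscape such methods target is tied to sampled histograms through F(x) = -kT ln P(x). The sketch below shows only that histogram-to-PMF relation on plain (unbiased) samples, not the AWH weight-histogram update scheme itself.

```python
import numpy as np

def pmf_from_samples(samples, bins=20, kT=1.0):
    """Estimate a potential of mean force F(x) = -kT ln P(x) from samples.

    This is only the histogram-to-free-energy relation that adaptive
    methods such as AWH build on, not the AWH update itself.
    """
    hist, edges = np.histogram(samples, bins=bins, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    mask = hist > 0                      # avoid log(0) in empty bins
    F = np.full(bins, np.inf)
    F[mask] = -kT * np.log(hist[mask])
    F[mask] -= F[mask].min()             # shift so the minimum is zero
    return centers, F

# samples from a unit Gaussian: the PMF is roughly quadratic with its minimum near 0
rng = np.random.default_rng(1)
x, F = pmf_from_samples(rng.normal(size=100_000), bins=30)
```

Adaptive biasing enters when the accumulated histogram is used to update a bias that flattens sampling over x, so that rarely visited (high-F) regions are visited more often.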
Robust and fast pedestrian detection method for far-infrared automotive driving assistance systems
NASA Astrophysics Data System (ADS)
Liu, Qiong; Zhuang, Jiajun; Ma, Jun
2013-09-01
Although considerable effort has been devoted to night-time pedestrian detection for automotive driving assistance systems in recent years, robust, real-time pedestrian detection is by no means a trivial task and remains an open problem owing to moving cameras, uncontrolled outdoor environments, the wide range of possible pedestrian appearances and the stringent performance criteria of automotive applications. This paper presents an alternative night-time pedestrian detection method using a monocular far-infrared (FIR) camera, which comprises two modules (regions-of-interest (ROIs) generation and pedestrian recognition) in a cascade fashion. Pixel-gradient oriented vertical projection is first proposed to estimate the vertical image stripes that might contain pedestrians, and then local thresholding image segmentation is adopted to generate ROIs more accurately within the estimated vertical stripes. A novel descriptor called PEWHOG (pyramid entropy-weighted histograms of oriented gradients) is proposed to represent FIR pedestrians in the recognition module. Specifically, PEWHOG captures both the local object shape, described by the entropy-weighted distribution of oriented gradient histograms, and its pyramid spatial layout. PEWHOG is then fed to a three-branch structured classifier using support vector machines (SVM) with a histogram intersection kernel (HIK). An off-line training procedure combining both bootstrapping and early-stopping strategies is introduced to generate a more robust classifier by exploiting hard negative samples iteratively. Finally, multi-frame validation is utilized to suppress transient false positives. Experimental results on FIR video sequences from various scenarios demonstrate that the presented method is effective and promising.
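The histogram intersection kernel used by such classifiers can be written down directly; the toy "PEWHOG-like" histograms below are hypothetical stand-ins for real descriptors. The resulting Gram matrix can be passed to any kernelized SVM (for example, scikit-learn's SVC accepts a kernel callable of this shape).

```python
import numpy as np

def hist_intersection_kernel(X, Y):
    """Gram matrix K[i, j] = sum_k min(X[i, k], Y[j, k]) -- the histogram
    intersection kernel (HIK) for histogram feature vectors."""
    X, Y = np.asarray(X), np.asarray(Y)
    return np.minimum(X[:, None, :], Y[None, :, :]).sum(axis=2)

def make_hists(peak, n=30, bins=16, rng=None):
    """Toy L1-normalized histograms peaked at one bin: hypothetical
    stand-ins for descriptors of one class."""
    h = rng.random((n, bins)) * 0.2
    h[:, peak] += 1.0
    return h / h.sum(axis=1, keepdims=True)

rng = np.random.default_rng(2)
pos, neg = make_hists(2, rng=rng), make_hists(10, rng=rng)   # "pedestrian" vs "background"
K_pos = hist_intersection_kernel(pos, pos)    # within-class similarities
K_cross = hist_intersection_kernel(pos, neg)  # cross-class similarities
```

For L1-normalized histograms the self-similarity K(x, x) is exactly 1, and within-class entries exceed cross-class entries, which is what gives the SVM its margin.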
Meng, Jie; Zhu, Lijing; Zhu, Li; Ge, Yun; He, Jian; Zhou, Zhengyang; Yang, Xiaofeng
2017-11-01
Background Apparent diffusion coefficient (ADC) histogram analysis has been widely used in determining tumor prognosis. Purpose To investigate the dynamic changes of ADC histogram parameters during concurrent chemo-radiotherapy (CCRT) in patients with advanced cervical cancers. Material and Methods This prospective study enrolled 32 patients with advanced cervical cancers undergoing CCRT who received diffusion-weighted (DW) magnetic resonance imaging (MRI) before CCRT, at the end of the second and fourth week during CCRT and one month after CCRT completion. The ADC histogram for the entire tumor volume was generated, and a series of histogram parameters was obtained. Dynamic changes of those parameters in cervical cancers were investigated as early biomarkers for treatment response. Results All histogram parameters except AUC low showed significant changes during CCRT (all P < 0.05). There were three variable trends involving different parameters. The mode, 5th, 10th, and 25th percentiles showed similar early increase rates (33.33%, 33.99%, 34.12%, and 30.49%, respectively) at the end of the second week of CCRT. The pre-CCRT 5th and 25th percentiles of the complete response (CR) group were significantly lower than those of the partial response (PR) group. Conclusion A series of ADC histogram parameters of cervical cancers changed significantly at the early stage of CCRT, indicating their potential in monitoring early tumor response to therapy.
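The early-change-rate idea — comparing a whole-lesion histogram percentile before treatment and at an early time point during therapy — can be sketched as below. The synthetic ADC values, bin count and percentile choice are illustrative assumptions, not the study's data.

```python
import numpy as np

def adc_histogram_params(adc_values, percentiles=(5, 10, 25, 50, 75, 90, 95)):
    """Whole-lesion ADC histogram parameters (illustrative units: 10^-3 mm^2/s)."""
    v = np.asarray(adc_values, dtype=float)
    params = {f"p{p}": np.percentile(v, p) for p in percentiles}
    params["mean"] = v.mean()
    # mode estimated as the center of the fullest histogram bin
    hist, edges = np.histogram(v, bins=50)
    i = int(np.argmax(hist))
    params["mode"] = 0.5 * (edges[i] + edges[i + 1])
    return params

def change_rate(before, after):
    """Percentage change of a parameter between two scan time points."""
    return 100.0 * (after - before) / before

rng = np.random.default_rng(3)
pre = rng.normal(0.9, 0.15, 5000)        # synthetic pre-treatment tumor ADC values
week2 = rng.normal(1.2, 0.18, 5000)      # synthetic values after two weeks of CCRT
rate_p25 = change_rate(adc_histogram_params(pre)["p25"],
                       adc_histogram_params(week2)["p25"])
```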
Schob, Stefan; Münch, Benno; Dieckow, Julia; Quäschling, Ulf; Hoffmann, Karl-Titus; Richter, Cindy; Garnov, Nikita; Frydrychowicz, Clara; Krause, Matthias; Meyer, Hans-Jonas; Surov, Alexey
2018-04-01
Diffusion-weighted imaging (DWI) quantifies the motion of hydrogen nuclei in biological tissues and has thereby been used to assess the underlying tissue microarchitecture. Histogram-profiling of DWI provides more detailed information on the diffusion characteristics of a lesion than the standardly calculated apparent diffusion coefficient (ADC) values: minimum, mean and maximum. Hence, the aim of our study was to investigate which parameters of histogram-profiling of DWI in primary central nervous system lymphoma (PCNSL) can be used to specifically predict features like cellular density, chromatin content and proliferative activity. Pre-treatment ADC maps of 21 PCNSL patients (8 female, 13 male, 28-89 years) from a 1.5T system were used for Matlab-based histogram-profiling. Results of histopathology (H&E staining) and immunohistochemistry (Ki-67 expression) were quantified. Correlations between histogram-profiling parameters and the neuropathologic examination were calculated using SPSS 23.0. The lower percentiles (p10 and p25) showed significant correlations with structural parameters of the neuropathologic examination (cellular density, chromatin content). The highest percentile, p90, correlated significantly with Ki-67 expression, resembling proliferative activity. Kurtosis of the ADC histogram correlated significantly with cellular density. Histogram-profiling of DWI in PCNSL provides a comprehensible set of parameters which reflect distinct tumor-architectural and tumor-biological features and, hence, are promising biomarkers for treatment response and prognosis. Copyright © 2018. Published by Elsevier Inc.
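A generic histogram-profiling routine covering the parameters discussed (low and high percentiles, skewness, kurtosis, entropy) might look like this; the bin count and the base-2 entropy are assumptions, not the authors' Matlab pipeline.

```python
import numpy as np

def histogram_profile(adc_values, bins=64):
    """Histogram-profiling parameters of a lesion's ADC values: low/high
    percentiles, skewness, kurtosis, and Shannon entropy of the bin
    probabilities (a generic sketch, not the authors' exact pipeline)."""
    v = np.asarray(adc_values, dtype=float)
    z = (v - v.mean()) / v.std()
    p = np.histogram(v, bins=bins)[0] / v.size
    p = p[p > 0]                               # drop empty bins before the log
    return {
        "p10": np.percentile(v, 10),
        "p25": np.percentile(v, 25),
        "p90": np.percentile(v, 90),
        "skewness": np.mean(z ** 3),
        "kurtosis": np.mean(z ** 4),           # a normal distribution gives ~3
        "entropy": -(p * np.log2(p)).sum(),    # bits
    }

rng = np.random.default_rng(4)
prof = histogram_profile(rng.normal(1.0, 0.2, 20000))   # synthetic lesion ADC values
```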
ADC histogram analysis of muscle lymphoma - Correlation with histopathology in a rare entity.
Meyer, Hans-Jonas; Pazaitis, Nikolaos; Surov, Alexey
2018-06-21
Diffusion-weighted imaging (DWI) is able to reflect histopathology architecture. A novel imaging approach, namely histogram analysis, is used to further characterize lesions on MRI. The purpose of this study is to correlate histogram parameters derived from apparent diffusion coefficient (ADC) maps with histopathology parameters in muscle lymphoma. Eight patients (mean age 64.8 years, range 45-72 years) with histopathologically confirmed muscle lymphoma were retrospectively identified. Cell count, total nucleic and average nucleic areas were estimated using ImageJ. Additionally, the Ki-67 index was calculated. DWI was obtained on a 1.5T scanner using b-values of 0 and 1000 s/mm2. Histogram analysis was performed as a whole-lesion measurement using a custom-made Matlab-based application. The correlation analysis revealed statistically significant correlations between cell count and ADCmean (ρ = -0.76, P = 0.03) as well as ADCp75 (ρ = -0.79, P = 0.02). Kurtosis and entropy correlated with average nucleic area (ρ = -0.81, P = 0.02 and ρ = 0.88, P = 0.007, respectively). None of the analyzed ADC parameters correlated with total nucleic area or with the Ki-67 index. This study identified significant correlations between cellularity and histogram parameters derived from ADC maps in muscle lymphoma. Thus, histogram analysis parameters reflect histopathology in muscle tumors. Advances in knowledge: Whole-lesion ADC histogram analysis is able to reflect histopathology parameters in muscle lymphomas.
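Correlations of this kind are rank correlations; a minimal Spearman-ρ implementation (assuming no ties) applied to hypothetical cellularity/ADC pairs:

```python
import numpy as np

def spearman_rho(a, b):
    """Spearman rank correlation (no ties assumed): the Pearson
    correlation of the two rank vectors."""
    ra = np.argsort(np.argsort(a)).astype(float)
    rb = np.argsort(np.argsort(b)).astype(float)
    ra -= ra.mean()
    rb -= rb.mean()
    return float((ra * rb).sum() / np.sqrt((ra ** 2).sum() * (rb ** 2).sum()))

# hypothetical per-patient values: higher cellularity restricts diffusion,
# so cell count and whole-lesion ADCmean should be inversely related
cell_count = np.array([1200, 950, 1800, 700, 1500, 1100, 1650, 800])
adc_mean   = np.array([0.72, 0.85, 0.60, 0.95, 0.66, 0.78, 0.63, 0.90])
rho = spearman_rho(cell_count, adc_mean)
```

In this constructed example the ranking is perfectly inverted, so ρ is exactly -1; real data, as in the study, give intermediate values such as -0.76.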
An Improved Algorithm to Generate a Wi-Fi Fingerprint Database for Indoor Positioning
Chen, Lina; Li, Binghao; Zhao, Kai; Rizos, Chris; Zheng, Zhengqi
2013-01-01
The major problem of Wi-Fi fingerprint-based positioning technology is the signal strength fingerprint database creation and maintenance. The significant temporal variation of received signal strength (RSS) is the main factor responsible for the positioning error. A probabilistic approach can be used, but the RSS distribution is required. The Gaussian distribution or an empirically-derived distribution (histogram) is typically used. However, these distributions are either not always correct or require a large amount of data for each reference point. Double peaks of the RSS distribution have been observed in experiments at some reference points. In this paper a new algorithm based on an improved double-peak Gaussian distribution is proposed. Kurtosis testing is used to decide if this new distribution, or the normal Gaussian distribution, should be applied. Test results show that the proposed algorithm can significantly improve the positioning accuracy, as well as reduce the workload of the off-line data training phase. PMID:23966197
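The kurtosis-based model choice can be sketched as follows: a clearly bimodal RSS sample is platykurtic (kurtosis well below the Gaussian value of 3), so a low kurtosis triggers the double-peak fit. The threshold value and the crude mean-split fit are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def choose_rss_model(rss, kurtosis_threshold=2.5):
    """Decide between a single-Gaussian and a double-peak model for RSS."""
    z = (rss - rss.mean()) / rss.std()
    kurt = np.mean(z ** 4)
    if kurt >= kurtosis_threshold:
        return ("gaussian", (rss.mean(), rss.std()))
    # crude double-peak fit: split samples at the overall mean and fit
    # one Gaussian per side
    lo, hi = rss[rss < rss.mean()], rss[rss >= rss.mean()]
    return ("double_peak", ((lo.mean(), lo.std()), (hi.mean(), hi.std())))

rng = np.random.default_rng(5)
unimodal = rng.normal(-60.0, 2.0, 4000)                      # dBm, single peak
bimodal = np.concatenate([rng.normal(-70, 1.5, 2000),        # two RSS peaks, as
                          rng.normal(-55, 1.5, 2000)])       # observed at some points
model_a, _ = choose_rss_model(unimodal)
model_b, params_b = choose_rss_model(bimodal)
```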
Time-cumulated visible and infrared histograms used as descriptor of cloud cover
NASA Technical Reports Server (NTRS)
Seze, G.; Rossow, W.
1987-01-01
To study the statistical behavior of clouds for different climate regimes, the spatial and temporal stability of VIS-IR bidimensional histograms is tested. Also, the effect of data sampling and averaging on the histogram shapes is considered; in particular the sampling strategy used by the International Satellite Cloud Climatology Project is tested.
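A bidimensional VIS-IR histogram of the kind being tested can be built directly with numpy; the reflectance and brightness-temperature ranges and the synthetic cloud/clear-sky scene are assumptions for illustration.

```python
import numpy as np

def vis_ir_histogram(vis_reflectance, ir_temperature, bins=(10, 12)):
    """Bidimensional VIS-IR histogram: joint frequencies of visible
    reflectance and infrared brightness temperature for one region."""
    hist, vis_edges, ir_edges = np.histogram2d(
        vis_reflectance, ir_temperature, bins=bins,
        range=[(0.0, 1.0), (190.0, 310.0)])   # reflectance; kelvin
    return hist / hist.sum(), vis_edges, ir_edges

# synthetic scene: bright cold pixels (cloud) plus dark warm pixels (clear sky)
rng = np.random.default_rng(6)
vis = np.concatenate([rng.uniform(0.6, 0.9, 500), rng.uniform(0.05, 0.2, 500)])
ir = np.concatenate([rng.uniform(200, 230, 500), rng.uniform(280, 300, 500)])
h, ve, ie = vis_ir_histogram(vis, ir)
```

Comparing such normalized joint histograms across regions, times, or sampling strategies is the stability test the abstract describes.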
Interpreting Histograms. As Easy as It Seems?
ERIC Educational Resources Information Center
Lem, Stephanie; Onghena, Patrick; Verschaffel, Lieven; Van Dooren, Wim
2014-01-01
Histograms are widely used, but recent studies have shown that they are not as easy to interpret as it might seem. In this article, we report on three studies on the interpretation of histograms in which we investigated (1) whether the misinterpretation by university students can be considered to be the result of heuristic reasoning, (2)…
Improving Real World Performance of Vision Aided Navigation in a Flight Environment
2016-09-15
Table-of-contents excerpt: Introduction; 4.2 Wide Area Search Extent; 4.3 Large-Scale Image Navigation Histogram Filter; 4.3.1 Location Model; 4.3.2 Measurement Model; 4.3.3 Histogram Filter; Iteration of Histogram Filter; 4.4 Implementation and Flight Test Campaign; 4.4.1 Software Implementation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1983-01-01
This volume contains the geology of the Durango A detail area, radioactive mineral occurrences in Colorado, and geophysical data interpretation. Eight appendices provide the following: stacked profiles, geologic histograms, geochemical histograms, speed and altitude histograms, geologic statistical tables, geochemical statistical tables, magnetic and ancillary profiles, and test line data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1983-01-01
The geology of the Durango B detail area, the radioactive mineral occurrences in Colorado and the geophysical data interpretation are included in this report. Seven appendices contain: stacked profiles, geologic histograms, geochemical histograms, speed and altitude histograms, geologic statistical tables, geochemical statistical tables, and test line data.
Students' Understanding of Bar Graphs and Histograms: Results from the LOCUS Assessments
ERIC Educational Resources Information Center
Whitaker, Douglas; Jacobbe, Tim
2017-01-01
Bar graphs and histograms are core statistical tools that are widely used in statistical practice and commonly taught in classrooms. Despite their importance and the instructional time devoted to them, many students demonstrate misunderstandings when asked to read and interpret bar graphs and histograms. Much of the research that has been…
NASA Astrophysics Data System (ADS)
Golonka, P.; Pierzchała, T.; Waş, Z.
2004-02-01
Theoretical predictions in high energy physics are routinely provided in the form of Monte Carlo generators. Comparisons of predictions from different programs and/or different initialization set-ups are often necessary. MC-TESTER can be used for such tests of decays of intermediate states (particles or resonances) in a semi-automated way. Our test consists of two steps. Different Monte Carlo programs are run; events with decays of a chosen particle are searched for, decay trees are analyzed and appropriate information is stored. Then, at the analysis step, a list of all found decay modes is defined and branching ratios are calculated for both runs. Histograms of all scalar Lorentz-invariant masses constructed from the decay products are plotted and compared for each decay mode found in both runs. For each plot a measure of the difference of the distributions is calculated and its maximal value over all histograms for each decay channel is printed in a summary table. As an example of MC-TESTER application, we include a test with the τ lepton decay Monte Carlo generators, TAUOLA and PYTHIA. The HEPEVT (or LUJETS) common block is used as the exclusive source of information on the generated events.
Program summary
Title of the program: MC-TESTER, version 1.1
Catalogue identifier: ADSM
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADSM
Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
Computer: PC, two Intel Xeon 2.0 GHz processors, 512 MB RAM
Operating system: Linux Red Hat 6.1, 7.2, and also 8.0
Programming language used: C++, FORTRAN77; gcc 2.96 or 2.95.2 (also 3.2) compiler suite with g++ and g77
Size of the package: 7.3 MB directory including example programs (2 MB compressed distribution archive), without ROOT libraries (additional 43 MB)
No. of bytes in distributed program, including test data, etc.: 2 024 425
Distribution format: tar gzip file
Additional disk space required: depends on the analyzed particle: 40 MB in the case of τ lepton decays (30 decay channels, 594 histograms, 82-page booklet)
Keywords: particle physics, decay simulation, Monte Carlo methods, invariant mass distributions, programs comparison
Nature of the physical problem: The decays of individual particles are well defined modules of a typical Monte Carlo program chain in high energy physics. A fast, semi-automatic way of comparing results from different programs is often desirable: for the development of new programs, to check correctness of installations, or for discussion of uncertainties.
Method of solution: A typical HEP Monte Carlo program stores the generated events in event records such as HEPEVT or PYJETS. MC-TESTER scans, event by event, the contents of the record and searches for the decays of the particle under study. The list of the found decay modes is successively incremented and histograms of all invariant masses which can be calculated from the momenta of the particle decay products are defined and filled. The outputs from the two runs of distinct programs can later be compared. A booklet of comparisons is created: for every decay channel, all histograms present in the two outputs are plotted and a parameter quantifying the shape difference is calculated. Its maximum over every decay channel is printed in the summary table.
Restrictions on the complexity of the problem: For a list of limitations see Section 6.
Typical running time: Varies substantially with the analyzed decay particle. On a PC/Linux with 2.0 GHz processors, MC-TESTER increases the run time of the τ-lepton Monte Carlo program TAUOLA by 4.0 seconds for every 100 000 analyzed events (generation itself takes 26 seconds). The analysis step takes 13 seconds; ? processing takes an additional 10 seconds. Generation step runs may be executed simultaneously on multi-processor machines.
Accessibility: web page: http://cern.ch/Piotr.Golonka/MC/MC-TESTER; e-mails: Piotr.Golonka@CERN.CH, T.Pierzchala@friend.phys.us.edu.pl, Zbigniew.Was@CERN.CH.
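The per-histogram "measure of the difference of the distributions" can be illustrated with a simple statistic: half the L1 distance between the normalized histograms (0 for identical shapes, 1 for disjoint ones). MC-TESTER's actual shape-difference-parameter definitions differ in detail.

```python
import numpy as np

def shape_difference(sample_a, sample_b, bins=40, value_range=None):
    """Half the L1 distance between the normalized histograms of two
    samples; a simple shape-difference measure in [0, 1]."""
    if value_range is None:
        value_range = (min(sample_a.min(), sample_b.min()),
                       max(sample_a.max(), sample_b.max()))
    ha = np.histogram(sample_a, bins=bins, range=value_range)[0]
    hb = np.histogram(sample_b, bins=bins, range=value_range)[0]
    pa, pb = ha / ha.sum(), hb / hb.sum()
    return 0.5 * np.abs(pa - pb).sum()

rng = np.random.default_rng(7)
gen1 = rng.normal(1.0, 0.1, 50_000)   # invariant-mass samples, generator 1
gen2 = rng.normal(1.0, 0.1, 50_000)   # same model, generator 2
gen3 = rng.normal(1.1, 0.1, 50_000)   # shifted spectrum
d_same = shape_difference(gen1, gen2)
d_diff = shape_difference(gen1, gen3)
```

Taking the maximum of such a measure over all invariant-mass histograms of a decay channel gives a single summary number per channel, as in the summary table described above.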
DOE Office of Scientific and Technical Information (OSTI.GOV)
Syn, C
The strength of the apple parts has been observed to decrease, especially for those installed by the new induction heating system, since the LEP campaign started. Fig. 1 shows the ultimate tensile strength (UTS), yield strength (YS), and elongation of the installed or installation-simulated apples on various systems. One can clearly see that the mean values of UTS and YS of the post-LEP parts decreased by about 8 ksi and 6 ksi, respectively, from those of the pre-LEP parts. The slight increase in elongation seen in Fig. 1 can be understood from the weak inverse relationship between strength and elongation in metals. Fig. 2 shows the weak correlation between the YS and elongation of the parts listed in Fig. 1. The strength data listed in Fig. 1 were re-plotted as histograms in Figs. 3 and 4. Figs. 3a and 4a show histograms of all UTS and YS data, Figs. 3b and 4b show histograms of pre-LEP data, and Figs. 3c and 4c of post-LEP data. Data on the statistical scatter of tensile strengths have rarely been published by material suppliers; instead, only the minimum "guaranteed" strength data are typically presented. An example of the strength distribution of aluminum 7075-T6 sheet material, shown in Fig. 5, indicates that the scatter width of both UTS and YS for a single sheet can be about 6 ksi, and the multi-lot scatter can be as large as 11 ksi, even though the sheets have been produced through a well-controlled manufacturing process. By approximating the histograms shown in Figs. 3 and 4 by a Gaussian or similar type of distribution curve, one can plausibly see the strength reductions in the later, more recent apples. The pre-LEP data in Figs. 3b and 4b show wider scatter than the post-LEP data in Figs. 3c and 4c and seem to follow a bimodal distribution of strength, indicating that the apples might have been made from two different lots of material: either from two different vendors, or from two different melts, perhaps of slightly different chemical composition, by a single vendor.
The post-LEP apples seem to have been from a single batch of material. The pre-LEP apples of the weak strength and the post-LEP apples with even weaker strength could have been made of the same batch of material, and the small strength differential might be due to the difference in the induction heating system. If the pre-LEP apples with the lower strength and the post-LEP apples are made from the same batch of material, their combined scatter of strength data would be wider, and can be understood as a result of the additional processing steps of stress relief and induction heating as discussed.
NASA Astrophysics Data System (ADS)
Viswanathan, G. M.; Buldyrev, S. V.; Garger, E. K.; Kashpur, V. A.; Lucena, L. S.; Shlyakhter, A.; Stanley, H. E.; Tschiersch, J.
2000-09-01
We analyze nonstationary 137Cs atmospheric activity concentration fluctuations measured near Chernobyl after the 1986 disaster and find three new results: (i) the histogram of fluctuations is well described by a log-normal distribution; (ii) there is a pronounced spectral component with period T=1yr, and (iii) the fluctuations are long-range correlated. These findings allow us to quantify two fundamental statistical properties of the data: the probability distribution and the correlation properties of the time series. We interpret our findings as evidence that the atmospheric radionuclide resuspension processes are tightly coupled to the surrounding ecosystems and to large time scale weather patterns.
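The annual spectral component can be recovered from a periodogram; the synthetic record below (log-normal fluctuations modulated by a one-year cycle) is a stand-in for the 137Cs measurements, not the actual data.

```python
import numpy as np

def dominant_period(series, dt_days=1.0):
    """Dominant period of a (mean-removed) series via the FFT periodogram."""
    x = series - series.mean()
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(x.size, d=dt_days)
    k = 1 + int(np.argmax(power[1:]))      # skip the zero-frequency bin
    return 1.0 / freqs[k]

# synthetic concentration record: log-normal fluctuations (normal in log space)
# modulated by an annual cycle, sampled daily for eight years
rng = np.random.default_rng(8)
t = np.arange(0, 365 * 8)
log_c = 0.5 * np.sin(2 * np.pi * t / 365.0) + rng.normal(0, 0.3, t.size)
concentration = np.exp(log_c)

period = dominant_period(np.log(concentration))
```

Working in log space both linearizes the log-normal fluctuations and makes the periodic component stand out in the spectrum.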
Meyer, Hans Jonas; Emmer, Alexander; Kornhuber, Malte; Surov, Alexey
2018-05-01
Diffusion-weighted imaging (DWI) has the potential to reflect histopathology architecture. A novel imaging approach, namely histogram analysis, is used to further characterize tissues on MRI. The aim of this study was to correlate histogram parameters derived from apparent diffusion coefficient (ADC) maps with serological parameters in myositis. 16 patients with autoimmune myositis were included in this retrospective study. DWI was obtained on a 1.5 T scanner using b-values of 0 and 1000 s/mm2. Histogram analysis was performed as a whole-muscle measurement using a custom-made Matlab-based application. The following ADC histogram parameters were estimated: ADCmean, ADCmax, ADCmin, ADCmedian, ADCmode, the percentiles ADCp10, ADCp25, ADCp75 and ADCp90, as well as the histogram parameters kurtosis, skewness and entropy. In all patients, the blood sample was acquired within 3 days of the MRI. The following serological parameters were estimated: alanine aminotransferase, aspartate aminotransferase, creatine kinase, lactate dehydrogenase, C-reactive protein (CRP) and myoglobin. All patients were screened for Jo1 autoantibodies. Kurtosis correlated inversely with CRP (ρ = -0.55, P = 0.03). Furthermore, ADCp10 and ADCp90 values tended to correlate with creatine kinase (ρ = -0.43, P = 0.11 and ρ = -0.42, P = 0.12, respectively). In addition, ADCmean, p10, p25, median, mode and entropy differed between Jo1-positive and Jo1-negative patients. ADC histogram parameters are sensitive for the detection of muscle alterations in myositis patients. Advances in knowledge: This study identified that kurtosis derived from ADC maps is associated with CRP in myositis patients. Furthermore, several ADC histogram parameters are statistically different between Jo1-positive and Jo1-negative patients.
Can histogram analysis of MR images predict aggressiveness in pancreatic neuroendocrine tumors?
De Robertis, Riccardo; Maris, Bogdan; Cardobi, Nicolò; Tinazzi Martini, Paolo; Gobbo, Stefano; Capelli, Paola; Ortolani, Silvia; Cingarlini, Sara; Paiella, Salvatore; Landoni, Luca; Butturini, Giovanni; Regi, Paolo; Scarpa, Aldo; Tortora, Giampaolo; D'Onofrio, Mirko
2018-06-01
To evaluate MRI derived whole-tumour histogram analysis parameters in predicting pancreatic neuroendocrine neoplasm (panNEN) grade and aggressiveness. Pre-operative MR of 42 consecutive patients with panNEN >1 cm were retrospectively analysed. T1-/T2-weighted images and ADC maps were analysed. Histogram-derived parameters were compared to histopathological features using the Mann-Whitney U test. Diagnostic accuracy was assessed by ROC-AUC analysis; sensitivity and specificity were assessed for each histogram parameter. ADC entropy was significantly higher in G2-3 tumours with ROC-AUC 0.757; sensitivity and specificity were 83.3 % (95 % CI: 61.2-94.5) and 61.1 % (95 % CI: 36.1-81.7). ADC kurtosis was higher in panNENs with vascular involvement, nodal and hepatic metastases (p= .008, .021 and .008; ROC-AUC= 0.820, 0.709 and 0.820); sensitivity and specificity were: 85.7/74.3 % (95 % CI: 42-99.2 /56.4-86.9), 36.8/96.5 % (95 % CI: 17.2-61.4 /76-99.8) and 100/62.8 % (95 % CI: 56.1-100/44.9-78.1). No significant differences between groups were found for other histogram-derived parameters (p >.05). Whole-tumour histogram analysis of ADC maps may be helpful in predicting tumour grade, vascular involvement, nodal and liver metastases in panNENs. ADC entropy and ADC kurtosis are the most accurate parameters for identification of panNENs with malignant behaviour. • Whole-tumour ADC histogram analysis can predict aggressiveness in pancreatic neuroendocrine neoplasms. • ADC entropy and kurtosis are higher in aggressive tumours. • ADC histogram analysis can quantify tumour diffusion heterogeneity. • Non-invasive quantification of tumour heterogeneity can provide adjunctive information for prognostication.
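ROC-AUC and sensitivity/specificity at a cut-off can be computed without any fitting via the Mann-Whitney relation; the ADC-entropy values and threshold below are hypothetical, not the study's data.

```python
import numpy as np

def roc_auc(scores_pos, scores_neg):
    """ROC-AUC via the Mann-Whitney relation: the probability that a
    randomly chosen positive case scores higher than a negative one
    (ties count one half)."""
    pos = np.asarray(scores_pos)[:, None]
    neg = np.asarray(scores_neg)[None, :]
    wins = (pos > neg).sum() + 0.5 * (pos == neg).sum()
    return wins / (pos.size * neg.size)

def sens_spec(scores_pos, scores_neg, threshold):
    """Sensitivity/specificity when 'score >= threshold' calls a case positive."""
    sens = np.mean(np.asarray(scores_pos) >= threshold)
    spec = np.mean(np.asarray(scores_neg) < threshold)
    return sens, spec

# hypothetical ADC-entropy values for G2-3 (positive) and G1 (negative) tumours
g23 = np.array([5.1, 4.8, 5.4, 4.9, 5.6, 4.6, 5.2])
g1 = np.array([4.3, 4.7, 4.5, 5.0, 4.2, 4.4])
auc = roc_auc(g23, g1)
sens, spec = sens_spec(g23, g1, threshold=4.75)
```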
Tsuchiya, Naoko; Doai, Mariko; Usuda, Katsuo; Uramoto, Hidetaka; Tonami, Hisao
2017-01-01
To investigate the diagnostic accuracy of histogram analyses of apparent diffusion coefficient (ADC) values for determining non-small cell lung cancer (NSCLC) tumor grade, lymphovascular invasion, and pleural invasion, we studied 60 surgically diagnosed NSCLC patients. Diffusion-weighted imaging (DWI) was performed in the axial plane using a navigator-triggered single-shot echo-planar imaging sequence with prospective acquisition correction. ADC maps were generated, and we placed a volume of interest on the tumor to construct the whole-lesion histogram. From the histogram, we calculated the mean; the 5th, 10th, 25th, 50th, 75th, 90th, and 95th percentiles of ADC; skewness; and kurtosis. Histogram parameters were correlated with tumor grade, lymphovascular invasion, and pleural invasion. We performed a receiver operating characteristic (ROC) analysis to assess the diagnostic performance of histogram parameters for distinguishing different pathologic features. The ADC mean and the 10th, 25th, 50th, 75th, 90th, and 95th percentiles showed significant differences among the tumor grades. The ADC mean and the 25th, 50th, 75th, 90th, and 95th percentiles were significant histogram parameters between high- and low-grade tumors. The ROC analysis between high- and low-grade tumors showed that the 95th percentile ADC achieved the highest area under the curve (AUC), at 0.74. Lymphovascular invasion was associated with the ADC mean, the 50th, 75th, 90th, and 95th percentiles, skewness, and kurtosis. Kurtosis achieved the highest AUC, at 0.809. Pleural invasion was only associated with skewness, with an AUC of 0.648. ADC histogram analyses on the basis of the entire tumor volume are able to stratify NSCLC tumor grade, lymphovascular invasion, and pleural invasion.
Improved automatic adjustment of density and contrast in FCR system using neural network
NASA Astrophysics Data System (ADS)
Takeo, Hideya; Nakajima, Nobuyoshi; Ishida, Masamitsu; Kato, Hisatoyo
1994-05-01
The FCR system automatically adjusts image density and contrast by analyzing the histogram of the image data within the radiation field. The advanced image recognition methods proposed in this paper improve this automatic adjustment performance by using neural network technology. Two methods are presented, both based on a three-layer neural network with back-propagation: in one, the image data are input directly to the input layer; in the other, the histogram data are input. The former is effective for imaging menus such as the shoulder joint, where the position that the region of interest occupies in the histogram shifts with differences in positioning; the latter is effective for menus such as the pediatric chest, where the histogram shape itself changes with positioning. We experimentally confirmed the validity of these methods, in terms of automatic adjustment performance, in comparison with conventional histogram analysis methods.
NASA Astrophysics Data System (ADS)
Zeng, Bangze; Zhu, Youpan; Li, Zemin; Hu, Dechao; Luo, Lin; Zhao, Deli; Huang, Juan
2014-11-01
Because infrared images have low contrast, high noise, and unclear visual effect, targets are difficult to observe and identify. This paper presents an improved infrared image detail enhancement algorithm based on adaptive histogram statistical stretching and gradient filtering (AHSS-GF). Because the human eye is very sensitive to edges and lines, the details and textures are extracted by gradient filtering. A new histogram is acquired by summing the original histogram over a fixed window, and histogram statistical stretching is carried out using the minimum value as the cut-off point. After proper weights are assigned to the details and the background, the detail-enhanced result is obtained. The results indicate that image contrast can be improved and that details and textures can be enhanced effectively.
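The AHSS-GF pipeline described above can be sketched roughly as follows. This is our interpretation of the abstract, not the authors' code; the window size and the detail/background weights are assumptions:

```python
import numpy as np

def ahss_gf(img, window=8, w_detail=1.5, w_bg=1.0):
    """Sketch of the AHSS-GF idea (parameter names and weights are our
    assumptions, not the authors' values)."""
    img = np.asarray(img, dtype=float)
    gy, gx = np.gradient(img)                 # gradient filtering: the eye is
    detail = np.hypot(gx, gy)                 # most sensitive to edges/lines
    hist, edges = np.histogram(img, bins=256)
    summed = np.convolve(hist, np.ones(window), mode="same")  # fixed-window sum
    floor = summed[summed > 0].min()          # minimum value as cut-off point
    cdf = np.cumsum(np.clip(summed, floor, None)).astype(float)
    cdf /= cdf[-1]
    idx = np.clip(np.digitize(img, edges[:-1]) - 1, 0, 255)
    background = 255.0 * cdf[idx]             # statistically stretched background
    return np.clip(w_bg * background + w_detail * detail, 0, 255).astype(np.uint8)

rng = np.random.default_rng(1)
frame = rng.uniform(100.0, 120.0, (64, 64))   # low-contrast "infrared" frame
enhanced = ahss_gf(frame)
```

On a narrow-range input the stretched background spans the full 8-bit range, so the output contrast (standard deviation) increases, matching the claimed effect.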
Sugajima, Y; Mitarai, G; Koeda, M; Moritani, T
1996-06-01
The effect of whole body water immersion on the recruitment order of hip flexor motor units was investigated in 11 male subjects. Intramuscular spike potentials were recorded, with fine bipolar wire electrodes, from the iliopsoas, the sartorius, the rectus femoris and the tensor fasciae latae during voluntary isometric contraction while the subjects were standing erect with the hip on the test side flexed to 60 degrees and the knee flexed to 120 degrees . Data were analysed by measuring the recruitment threshold in slow ramp contraction and by a computer-aided amplitude-frequency histogram of the spike potentials during short sustained contraction. The motor units were classified as low-amplitude units if they delivered spike potentials of less than 0.5 mV and high-amplitude units if the spike potentials exceeded 0.5 mV. In the ramp experiments, exposure to water immersion gave rise to a sudden increase in the recruitment thresholds of the low-amplitude units in all muscles, while in the recruitment thresholds of the high-amplitude units, the alterations differed among the muscles. The thresholds in the rectus femoris and tensor fasciae latae increased in the same direction as those of the low-amplitude units, while those in the iliopsoas and sartorius decreased in the opposite direction. The amplitude-frequency histograms clearly indicated that these different alterations occurred in all subjects, without exception. We concluded that unloading induced by water immersion changed the recruitment order of motor units during isometric contraction in the iliopsoas and sartorius, facilitating the recruitment of their larger motor units.
Trofimov, Alexei; Unkelbach, Jan; DeLaney, Thomas F; Bortfeld, Thomas
2012-01-01
Dose-volume histograms (DVH) are the most common tool used in the appraisal of the quality of a clinical treatment plan. However, when delivery uncertainties are present, the DVH may not always accurately describe the dose distribution actually delivered to the patient. We present a method, based on DVH formalism, to visualize the variability in the expected dosimetric outcome of a treatment plan. For a case of chordoma of the cervical spine, we compared 2 intensity modulated proton therapy plans. Treatment plan A was optimized based on dosimetric objectives alone (ie, desired target coverage, normal tissue tolerance). Plan B was created employing a published probabilistic optimization method that considered the uncertainties in patient setup and proton range in tissue. Dose distributions and DVH for both plans were calculated for the nominal delivery scenario, as well as for scenarios representing deviations from the nominal setup, and a systematic error in the estimate of range in tissue. The histograms from various scenarios were combined to create DVH bands to illustrate possible deviations from the nominal plan for the expected magnitude of setup and range errors. In the nominal scenario, the DVH from plan A showed superior dose coverage, higher dose homogeneity within the target, and improved sparing of the adjacent critical structure. However, when the dose distributions and DVH from plans A and B were recalculated for different error scenarios (eg, proton range underestimation by 3 mm), the plan quality, reflected by DVH, deteriorated significantly for plan A, while plan B was only minimally affected. In the DVH-band representation, plan A produced wider bands, reflecting its higher vulnerability to delivery errors, and uncertainty in the dosimetric outcome. The results illustrate that comparison of DVH for the nominal scenario alone does not provide any information about the relative sensitivity of dosimetric outcome to delivery uncertainties. 
Thus, such comparison may be misleading and may result in the selection of an inferior plan for delivery to a patient. A better-informed decision can be made if additional information about possible dosimetric variability is presented; for example, in the form of DVH bands. Copyright © 2012 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
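The DVH-band construction described above can be sketched in a few lines: compute a cumulative DVH per delivery scenario, then take the envelope across scenarios at each dose level (toy dose arrays and our own function names; not the authors' code):

```python
import numpy as np

def dvh_band(scenario_doses, bins=100):
    """Cumulative DVHs for several delivery scenarios, combined into a band:
    for each dose level, the envelope (min, max) of the volume fraction
    receiving at least that dose across all recalculated scenarios."""
    dmax = max(float(np.max(d)) for d in scenario_doses)
    levels = np.linspace(0.0, dmax, bins)
    vols = np.array([[(np.asarray(d) >= x).mean() for x in levels]
                     for d in scenario_doses])
    return levels, vols.min(axis=0), vols.max(axis=0)

# Toy target doses: a nominal plan vs. a range-error scenario that
# underdoses the structure by ~5% (purely illustrative numbers)
nominal = np.full(1000, 60.0)
shifted = np.full(1000, 57.0)
levels, lo, hi = dvh_band([nominal, shifted])
```

A wide gap between `lo` and `hi` at a dose level is exactly the "wide band" that flagged plan A's vulnerability to delivery errors.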
Improving the imaging of calcifications in CT by histogram-based selective deblurring
NASA Astrophysics Data System (ADS)
Rollano-Hijarrubia, Empar; van der Meer, Frits; van der Lugt, Aad; Weinans, Harrie; Vrooman, Henry; Vossepoel, Albert; Stokking, Rik
2005-04-01
Imaging of small high-density structures, such as calcifications, with computed tomography (CT) is limited by the spatial resolution of the system. Blur causes small calcifications to be imaged with lower contrast and overestimated volume, thereby hampering the analysis of vessels. The aim of this work is to reduce the blur of calcifications by applying three-dimensional (3D) deconvolution. Unfortunately, the high-frequency amplification of the deconvolution produces edge-related ring artifacts and enhances noise and original artifacts, which degrades the imaging of low-density structures. A method, referred to as Histogram-based Selective Deblurring (HiSD), was implemented to avoid these negative effects. HiSD uses the histogram information to generate a restored image in which the low-intensity voxel information of the observed image is combined with the high-intensity voxel information of the deconvolved image. To evaluate HiSD we scanned four in-vitro atherosclerotic plaques of carotid arteries with a multislice spiral CT and with a microfocus CT (μCT), used as reference. Restored images were generated from the observed images, and qualitatively and quantitatively compared with their corresponding μCT images. Transverse views and maximum-intensity projections of restored images show the decrease of blur of the calcifications in 3D. Measurements of the areas of 27 calcifications and total volumes of calcification of 4 plaques show that the overestimation of calcification was smaller for restored images (mean-error: 90% for area; 92% for volume) than for observed images (143%; 213%, respectively). The qualitative and quantitative analyses show that the imaging of calcifications in CT can be improved considerably by applying HiSD.
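The core HiSD combination step, keeping observed voxels at low intensity and deconvolved voxels at high intensity, can be sketched as below. The percentile-based default threshold is our stand-in; the paper derives its threshold from the histogram:

```python
import numpy as np

def hisd_restore(observed, deconvolved, threshold=None):
    """HiSD-style combination (a sketch): voxels below the intensity threshold
    keep the observed image, so ring artifacts and amplified noise from the
    deconvolution stay out of the low-density tissue; voxels at or above it
    take the deconvolved image, where the calcifications are deblurred."""
    if threshold is None:
        threshold = np.percentile(observed, 95)  # assumption, not the paper's rule
    return np.where(observed < threshold, observed, deconvolved)

observed = np.array([[10.0, 20.0], [200.0, 220.0]])   # toy intensity values
deconvolved = observed * 2.0                          # stand-in deblurred image
restored = hisd_restore(observed, deconvolved, threshold=100.0)
```

The soft-tissue voxels (10, 20) pass through unchanged while the calcification-like voxels take the sharpened values, which is the qualitative behaviour the evaluation measures.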
A Physical Model to Estimate Snowfall over Land using AMSU-B Observations
NASA Technical Reports Server (NTRS)
Kim, Min-Jeong; Weinman, J. A.; Olson, W. S.; Chang, D.-E.; Skofronick-Jackson, G.; Wang, J. R.
2008-01-01
In this study, we present an improved physical model to retrieve snowfall rate over land using brightness temperature observations from the National Oceanic and Atmospheric Administration's (NOAA) Advanced Microwave Sounder Unit-B (AMSU-B) at 89 GHz, 150 GHz, 183.3 +/- 1 GHz, 183.3 +/- 3 GHz, and 183.3 +/- 7 GHz. The retrieval model is applied to the New England blizzard of March 5, 2001 which deposited about 75 cm of snow over much of Vermont, New Hampshire, and northern New York. In this improved physical model, prior retrieval assumptions about snowflake shape, particle size distributions, environmental conditions, and optimization methodology have been updated. Here, single scattering parameters for snow particles are calculated with the Discrete-Dipole Approximation (DDA) method instead of assuming spherical shapes. Five different snow particle models (hexagonal columns, hexagonal plates, and three different kinds of aggregates) are considered. Snow particle size distributions are assumed to vary with air temperature and to follow aircraft measurements described by previous studies. Brightness temperatures at AMSU-B frequencies for the New England blizzard are calculated using these DDA calculated single scattering parameters and particle size distributions. The vertical profiles of pressure, temperature, relative humidity and hydrometeors are provided by MM5 model simulations. These profiles are treated as the a priori data base in the Bayesian retrieval algorithm. In algorithm applications to the blizzard data, calculated brightness temperatures associated with selected database profiles agree with AMSU-B observations to within about +/- 5 K at all five frequencies. Retrieved snowfall rates compare favorably with the near-concurrent National Weather Service (NWS) radar reflectivity measurements. 
The relationships between the NWS radar measured reflectivities Z(sub e) and retrieved snowfall rate R for a given snow particle model are derived by a histogram matching technique. All of these Z(sub e)-R relationships fall in the range of previously established Z(sub e)-R relationships for snowfall. This suggests that the current physical model developed in this study can reliably estimate the snowfall rate over land using the AMSU-B measured brightness temperatures.
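A Z(sub e)-R relationship "derived by a histogram matching technique" can be sketched as pairing reflectivity and retrieved rate at equal cumulative-probability levels and fitting the conventional power law. This is our illustration of the named technique, not the authors' code:

```python
import numpy as np

def ze_r_by_histogram_matching(ze_samples, r_samples, n_quantiles=50):
    """Pair reflectivity Ze and retrieved snowfall rate R at equal
    cumulative-probability levels (histogram/CDF matching), then fit the
    conventional power law Ze = a * R**b in log-log space."""
    q = np.linspace(0.05, 0.95, n_quantiles)
    ze_q = np.quantile(ze_samples, q)
    r_q = np.quantile(r_samples, q)
    b, log_a = np.polyfit(np.log(r_q), np.log(ze_q), 1)
    return float(np.exp(log_a)), float(b)

# Synthetic check: if Ze really follows 200 * R**1.6, matching recovers it
rng = np.random.default_rng(0)
r = rng.lognormal(mean=0.0, sigma=0.5, size=10_000)
a, b = ze_r_by_histogram_matching(200.0 * r ** 1.6, r)
```

Matching distributions rather than individual pixels avoids needing the radar and radiometer samples to be collocated, which is why the technique suits near-concurrent rather than simultaneous measurements.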
Kwon, M-R; Shin, J H; Hahn, S Y; Oh, Y L; Kwak, J Y; Lee, E; Lim, Y
2018-06-01
To evaluate the diagnostic value of histogram analysis using ultrasound (US) to differentiate between the subtypes of follicular variant of papillary thyroid carcinoma (FVPTC). The present study included 151 patients with surgically confirmed FVPTC diagnosed between January 2014 and May 2016. Their preoperative US features were reviewed retrospectively. Histogram parameters (mean, maximum, minimum, range, root mean square, skewness, kurtosis, energy, entropy, and correlation) were obtained for each nodule. The 152 nodules in 151 patients comprised 48 non-invasive follicular thyroid neoplasm with papillary-like nuclear features (NIFTPs; 31.6%), 60 invasive encapsulated FVPTCs (EFVPTCs; 39.5%), and 44 infiltrative FVPTCs (28.9%). The US features differed significantly between the subtypes of FVPTC. Discrimination was achieved between NIFTPs and infiltrative FVPTC, and between invasive EFVPTC and infiltrative FVPTC using histogram parameters; however, the parameters were not significantly different between NIFTP and invasive EFVPTC. It is feasible to use greyscale histogram analysis to differentiate between NIFTP and infiltrative FVPTC, but not between NIFTP and invasive EFVPTC. Histograms can be used as a supplementary tool to differentiate the subtypes of FVPTC. Copyright © 2017 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.
DSP+FPGA-based real-time histogram equalization system of infrared image
NASA Astrophysics Data System (ADS)
Gu, Dongsheng; Yang, Nansheng; Pi, Defu; Hua, Min; Shen, Xiaoyan; Zhang, Ruolan
2001-10-01
Histogram modification is a simple but effective method for enhancing an infrared image. Several methods exist for equalizing an infrared image's histogram, suited to the differing characteristics of infrared images: the traditional HE (histogram equalization) method, and the improved HP (histogram projection) and PE (plateau equalization) methods, among others. To realize all of these methods in a single system, the system must have a large memory and extremely fast processing. In our system, we introduce DSP + FPGA based real-time processing technology to achieve this: the FPGA realizes the part common to these methods, while the DSP handles the parts that differ between them. The method and its parameters can be selected via a keyboard or a computer. By this means, the system is powerful yet easy to operate and maintain. In this article, we give the block diagram of the system and the software flow chart of the methods, and conclude with an infrared image and its histogram before and after processing by the HE method.
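Of the three mappings named above, PE generalizes the other two: clipping the histogram at a plateau of 1 approaches HP (each occupied bin counted once), while an unbounded plateau reduces PE to plain HE. A sketch for an 8-bit image (our own implementation, not the system's firmware):

```python
import numpy as np

def plateau_equalization(img, plateau):
    """PE for an 8-bit image: clip the histogram at `plateau` before building
    the equalizing CDF, so a large uniform IR background cannot dominate the
    grey-level mapping (plateau=1 ~ histogram projection; a very large
    plateau = plain histogram equalization)."""
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    cdf = np.cumsum(np.minimum(hist, plateau)).astype(float)
    lut = np.round(255.0 * cdf / cdf[-1]).astype(np.uint8)
    return lut[img]

rng = np.random.default_rng(0)
ir = rng.integers(100, 121, size=(64, 64), dtype=np.uint8)  # low-contrast frame
enhanced = plateau_equalization(ir, plateau=50)
```

The look-up-table structure is what makes a shared FPGA implementation natural: HE, HP, and PE differ only in how the table is built, not in how it is applied per pixel.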
Song, Yong Sub; Choi, Seung Hong; Park, Chul-Kee; Yi, Kyung Sik; Lee, Woong Jae; Yun, Tae Jin; Kim, Tae Min; Lee, Se-Hoon; Kim, Ji-Hoon; Sohn, Chul-Ho; Park, Sung-Hye; Kim, Il Han; Jahng, Geon-Ho; Chang, Kee-Hyun
2013-01-01
The purpose of this study was to differentiate true progression from pseudoprogression of glioblastomas treated with concurrent chemoradiotherapy (CCRT) with temozolomide (TMZ) by using histogram analysis of apparent diffusion coefficient (ADC) and normalized cerebral blood volume (nCBV) maps. Twenty patients with histopathologically proven glioblastoma who had received CCRT with TMZ underwent perfusion-weighted imaging and diffusion-weighted imaging (b = 0, 1000 sec/mm(2)). The corresponding nCBV and ADC maps for the newly visible, entirely enhancing lesions were calculated after the completion of CCRT with TMZ. Two observers independently measured the histogram parameters of the nCBV and ADC maps. The histogram parameters between the true progression group (n = 10) and the pseudoprogression group (n = 10) were compared by use of an unpaired Student's t test and subsequent multivariable stepwise logistic regression analysis to determine the best predictors for the differential diagnosis between the two groups. Receiver operating characteristic analysis was employed to determine the best cutoff values for the histogram parameters that proved to be significant predictors for differentiating true progression from pseudoprogression. Intraclass correlation coefficient was used to determine the level of inter-observer reliability for the histogram parameters. The 5th percentile value (C5) of the cumulative ADC histograms was a significant predictor for the differential diagnosis between true progression and pseudoprogression (p = 0.044 for observer 1; p = 0.011 for observer 2). Optimal cutoff values of 892 × 10(-6) mm(2)/sec for observer 1 and 907 × 10(-6) mm(2)/sec for observer 2 could help differentiate between the two groups with a sensitivity of 90% and 80%, respectively, a specificity of 90% and 80%, respectively, and an area under the curve of 0.880 and 0.840, respectively. There was no other significant differentiating parameter on the nCBV histograms. 
Inter-observer reliability was excellent or good for all histogram parameters (intraclass correlation coefficient range: 0.70-0.99). The C5 of the cumulative ADC histogram can be a promising parameter for the differentiation of true progression from pseudoprogression of newly visible, entirely enhancing lesions after CCRT with TMZ for glioblastomas.
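The resulting decision rule is simple: compute C5 from the lesion's ADC values and compare against a cutoff. A sketch with a round stand-in cutoff between the two reported observer values; the direction (lower C5 suggesting true progression, consistent with higher cellularity in recurrent tumour) is our assumption, not a statement from the abstract:

```python
import numpy as np

def classify_by_c5(adc_values, cutoff=900e-6):
    """Compare C5, the 5th percentile of the lesion's ADC histogram, with a
    cutoff (stand-in near the reported 892 and 907 x 10^-6 mm^2/s values).
    ASSUMPTION: lower C5 -> 'true progression'; the abstract reports the
    cutoffs but not the direction explicitly."""
    c5 = float(np.percentile(np.asarray(adc_values, dtype=float), 5))
    return ("true progression" if c5 < cutoff else "pseudoprogression"), c5

rng = np.random.default_rng(0)
label_lo, c5_lo = classify_by_c5(rng.normal(900e-6, 50e-6, 2000))
label_hi, c5_hi = classify_by_c5(rng.normal(1300e-6, 50e-6, 2000))
```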
2017-02-01
In this technical note, a number of different measures, implemented as functions in both MATLAB and Python, are used to quantify the similarity/distance between two vector-based datasets. The measures described are widely used and may have an important role when computing the distance and similarity of large datasets and when considering high-throughput processes.
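The note does not list its measures here, but three of the most widely used vector similarity/distance functions can be written in a few lines of plain Python (our choice of examples, not necessarily the note's set):

```python
import math

def euclidean(u, v):
    """L2 distance between two equal-length vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def manhattan(u, v):
    """L1 (city-block) distance."""
    return sum(abs(a - b) for a, b in zip(u, v))

def cosine_similarity(u, v):
    """Cosine of the angle between the vectors; 1.0 means same direction."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v)))

d = euclidean([0, 0], [3, 4])
m = manhattan([0, 0], [3, 4])
c = cosine_similarity([1, 2, 3], [2, 4, 6])
```

Note that cosine similarity is scale-invariant (parallel vectors score 1.0 regardless of magnitude), whereas the two distances are not; that distinction matters for high-throughput comparisons of datasets with different normalizations.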
Histogram-based ionogram displays and their application to autoscaling
NASA Astrophysics Data System (ADS)
Lynn, Kenneth J. W.
2018-03-01
A simple method is described for displaying and autoscaling the basic ionogram parameters foF2 and h'F2, as well as some additional layer parameters, from digital ionograms. The technique employed is based on forming frequency and height histograms for each ionogram. This technique has now been applied specifically to ionograms produced by the IPS5D ionosonde developed and operated by the Australian Space Weather Service (SWS). The SWS ionograms are archived in a cleaned format and are readily available from the SWS internet site; however, the method is applicable to any ionosonde that produces ionograms in a digital format at a useful signal-to-noise level. The most novel feature of the technique for autoscaling is its simplicity and its avoidance of the mathematical imaging and line-fitting techniques often used. The program arose from the need to display many days of ionogram output so as to locate specific types of ionospheric event, such as ionospheric storms, travelling ionospheric disturbances and repetitive ionospheric height changes, for further investigation and measurement. Examples and applications of the method are given, including the removal of sporadic E and spread F.
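The histogram idea can be sketched as follows: bin the ionogram's echo points in frequency and in height, then read the layer parameters off the occupied bins. This is an assumed simplification of the method, with our own bin choices and threshold:

```python
import numpy as np

def autoscale_f2(freqs_mhz, heights_km, min_count=3):
    """Histogram-based autoscaling sketch: foF2 is taken as the upper edge of
    the highest frequency bin still containing at least `min_count` echoes,
    h'F2 as the lower edge of the lowest occupied height bin."""
    f_hist, f_edges = np.histogram(freqs_mhz, bins=np.arange(0.0, 20.25, 0.25))
    h_hist, h_edges = np.histogram(heights_km, bins=np.arange(80.0, 810.0, 10.0))
    f_occ = np.nonzero(f_hist >= min_count)[0]
    h_occ = np.nonzero(h_hist >= min_count)[0]
    foF2 = f_edges[f_occ[-1] + 1] if f_occ.size else np.nan
    hF2 = h_edges[h_occ[0]] if h_occ.size else np.nan
    return foF2, hF2

# Synthetic F-trace: echoes from 2-8 MHz, virtual heights from 250 km upward
rng = np.random.default_rng(0)
foF2, hF2 = autoscale_f2(rng.uniform(2.0, 8.0, 500), rng.uniform(250.0, 400.0, 500))
```

The `min_count` threshold plays the role of the signal-to-noise requirement: isolated noise echoes never fill a bin, so they cannot masquerade as the layer edge.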
NASA Astrophysics Data System (ADS)
Ji, Sungchul
A new mathematical formula referred to as the Planckian distribution equation (PDE) has been found to fit long-tailed histograms generated in various fields of studies, ranging from atomic physics to single-molecule enzymology, cell biology, brain neurobiology, glottometrics, econophysics, and to cosmology. PDE can be derived from a Gaussian-like equation (GLE) by non-linearly transforming its variable, x, while keeping the y coordinate constant. Assuming that GLE represents a random distribution (due to its symmetry), it is possible to define a binary logarithm of the ratio between the areas under the curves of PDE and GLE as a measure of the non-randomness (or order) underlying the biophysicochemical processes generating long-tailed histograms that fit PDE. This new function has been named the Planckian information, IP, which (i) may be a new measure of order that can be applied widely to both natural and human sciences and (ii) can serve as the opposite of the Boltzmann-Gibbs entropy, S, which is a measure of disorder. The possible rationales for the universality of PDE may include (i) the universality of the wave-particle duality embedded in PDE, (ii) the selection of subsets of random processes (thereby breaking the symmetry of GLE) as the basic mechanism of generating order, organization, and function, and (iii) the quantity-quality complementarity as the connection between PDE and Peircean semiotics.
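The Planckian information defined above, the binary logarithm of the ratio of the areas under the PDE and GLE curves, is straightforward to evaluate numerically. A sketch using trapezoidal quadrature and stand-in curves (not fitted to any data):

```python
import numpy as np

def planckian_information(pde, gle, x):
    """I_P = log2(A_PDE / A_GLE): the binary logarithm of the ratio of the
    areas under the fitted Planckian distribution (PDE) and the symmetric
    Gaussian-like equation (GLE), evaluated by trapezoidal quadrature over a
    common grid x."""
    return float(np.log2(np.trapz(pde(x), x) / np.trapz(gle(x), x)))

# Sanity check: doubling the area under the "PDE" relative to the "GLE"
# must give exactly one bit of Planckian information
x = np.linspace(-10.0, 10.0, 2001)
gle = lambda t: np.exp(-t ** 2)
ip = planckian_information(lambda t: 2.0 * gle(t), gle, x)
```

Because the GLE is taken to represent the random (symmetric) case, I_P = 0 means no departure from randomness, and larger values quantify the order in the long-tailed histogram, the intended opposite of Boltzmann-Gibbs entropy.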
Bassler, Niels; Kantemiris, Ioannis; Karaiskos, Pantelis; Engelke, Julia; Holzscheiter, Michael H; Petersen, Jørgen B
2010-04-01
Antiprotons have been suggested as a possibly superior modality for radiotherapy, due to the energy released when antiprotons annihilate, which enhances the Bragg peak and introduces a high-LET component to the dose. However, concerns are expressed about the inferior lateral dose distribution caused by the annihilation products. We use the Monte Carlo code FLUKA to generate depth-dose kernels for protons, antiprotons, and carbon ions. Using these we then build virtual treatment plans optimized according to ICRU recommendations for the different beam modalities, which then are recalculated with FLUKA. Dose-volume histograms generated from these plans can be used to compare the different irradiations. The enhancement in physical and possibly biological dose from annihilating antiprotons can significantly lower the dose in the entrance channel; but only at the expense of a diffuse low dose background from long-range secondary particles. Lateral dose distributions are improved using active beam delivery methods, instead of flat fields. Dose-volume histograms for different treatment scenarios show that antiprotons have the potential to reduce the volume of normal tissue receiving medium to high dose, however, in the low dose region antiprotons are inferior to both protons and carbon ions. This limits the potential usage to situations where dose to normal tissue must be reduced as much as possible. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.
Ogle, K.M.; Lee, R.W.
1994-01-01
Radon-222 activity was measured for 27 water samples from streams, an alluvial aquifer, bedrock aquifers, and a geothermal system, in and near the 510-square mile area of Owl Creek Basin, north- central Wyoming. Summary statistics of the radon- 222 activities are compiled. For 16 stream-water samples, the arithmetic mean radon-222 activity was 20 pCi/L (picocuries per liter), geometric mean activity was 7 pCi/L, harmonic mean activity was 2 pCi/L and median activity was 8 pCi/L. The standard deviation of the arithmetic mean is 29 pCi/L. The activities in the stream-water samples ranged from 0.4 to 97 pCi/L. The histogram of stream-water samples is left-skewed when compared to a normal distribution. For 11 ground-water samples, the arithmetic mean radon- 222 activity was 486 pCi/L, geometric mean activity was 280 pCi/L, harmonic mean activity was 130 pCi/L and median activity was 373 pCi/L. The standard deviation of the arithmetic mean is 500 pCi/L. The activity in the ground-water samples ranged from 25 to 1,704 pCi/L. The histogram of ground-water samples is left-skewed when compared to a normal distribution. (USGS)
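The three mean statistics reported above always satisfy arithmetic ≥ geometric ≥ harmonic for positive data, and their spread is itself a rough indicator of how skewed the activities are. A minimal sketch with toy values:

```python
import math

def summary_means(values):
    """Arithmetic, geometric, and harmonic means of positive activities
    (e.g. radon-222 in pCi/L)."""
    n = len(values)
    arith = sum(values) / n
    geom = math.exp(sum(math.log(v) for v in values) / n)
    harm = n / sum(1.0 / v for v in values)
    return arith, geom, harm

arith, geom, harm = summary_means([1.0, 4.0, 4.0])  # toy pCi/L values
```

For the ground-water samples above (arithmetic 486, geometric 280, harmonic 130 pCi/L), the wide separation of the three means reflects the heavy upper tail of the activity distribution.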
Chabli, A; Guitton, D; Fortin, S; Molotchnikoff, S
2000-03-01
The present study examined, in the superior colliculus (SC) of anaesthetised cats, the functional connectivity between superficial-layer neurones (SLNs) and tectoreticular neurones (TRNs: collicular output cells). TRNs were antidromically identified by electrical stimulation of the predorsal bundle. The auto- and cross-correlation histograms of visual responses of both types of neurones were recorded and analysed. A delayed, sharp peak in cross-correlograms allowed us to verify whether SLN and TRN cells were coupled; in addition, oscillatory activities were compared to verify if rhythmic responses of SLN sites were transmitted to TRN sites. We found that oscillatory activity was rarely observed in spontaneous activity of superficial (1/74) and TRN sites (1/48). Moving light bars induced oscillation in 31% (23/74) of the superficial-layer and in 23% (11/48) of the TRN sites. The strength of the rhythmic responses was determined by specific ranges of stimulus velocity in 83% (19/23) and 64% (7/11) of oscillating SLN and TRN sites, respectively. Frequencies of oscillations ranged between 5 and 125 Hz and were confined, for 53% of the cells, to the 5-20 Hz band. Thus, the band-width of frequencies of the stimulus-related oscillations in the superior colliculus was broader than the gamma range. Analysis of cross-correlation histograms revealed a significant predominant peak with a mean delay of 2.7+/-0.9 ms in 46% (17/37) of SLN-TRN pairs. Most correlated SLN-TRN pairs (88%: 15/17) had superimposed receptive fields, suggesting they were functionally interconnected. However, individual oscillatory frequencies of correlated and oscillatory SLN and TRN cells were never the same (0/8). Together, these results suggest that the neurones in collicular superficial layer contact TRNs and, consequently, support the idea that the superficial layers contribute to collicular outputs producing eye- and head-orienting movements.
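The cross-correlation histogram used to detect SLN-TRN coupling can be sketched as a histogram of all pairwise spike-time differences within a lag window (toy spike trains, our own function names):

```python
import numpy as np

def cross_correlogram(spikes_a, spikes_b, max_lag=0.02, bin_width=0.001):
    """Cross-correlation histogram of two spike trains (times in seconds):
    counts of all spike-time differences (B minus A) falling in lag bins.
    A sharp peak at a short positive delay, as in the SLN-TRN pairs above,
    suggests a functional connection from A to B."""
    a = np.asarray(spikes_a, dtype=float)
    b = np.asarray(spikes_b, dtype=float)
    diffs = (b[:, None] - a[None, :]).ravel()
    diffs = diffs[np.abs(diffs) <= max_lag]
    edges = np.arange(-max_lag, max_lag + bin_width / 2, bin_width)
    counts, _ = np.histogram(diffs, bins=edges)
    return counts, edges

# Toy trains: B fires 3 ms after every A spike
a = np.arange(0.0, 1.0, 0.05)
counts, edges = cross_correlogram(a, a + 0.003)
peak_lag = 0.5 * (edges[np.argmax(counts)] + edges[np.argmax(counts) + 1])
```

The recovered peak at ~3 ms is the analogue of the 2.7 +/- 0.9 ms delayed peak reported for the correlated SLN-TRN pairs.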
Inferred Eccentricity and Period Distributions of Kepler Eclipsing Binaries
NASA Astrophysics Data System (ADS)
Prsa, Andrej; Matijevic, G.
2014-01-01
Determining the underlying eccentricity and orbital period distributions from an observed sample of eclipsing binary stars is not a trivial task. Shen and Turner (2008) have shown that the commonly used maximum likelihood estimators are biased to larger eccentricities and they do not describe the underlying distribution correctly; orbital periods suffer from a similar bias. Hogg, Myers and Bovy (2010) proposed a hierarchical probabilistic method for inferring the true eccentricity distribution of exoplanet orbits that uses the likelihood functions for individual star eccentricities. The authors show that proper inference outperforms the simple histogramming of the best-fit eccentricity values. We apply this method to the complete sample of eclipsing binary stars observed by the Kepler mission (Prsa et al. 2011) to derive the unbiased underlying eccentricity and orbital period distributions. These distributions can be used for the studies of multiple star formation, dynamical evolution, and they can serve as a drop-in replacement to prior, ad-hoc distributions used in the exoplanet field for determining false positive occurrence rates.
Schob, Stefan; Meyer, Hans Jonas; Dieckow, Julia; Pervinder, Bhogal; Pazaitis, Nikolaos; Höhn, Anne Kathrin; Garnov, Nikita; Horvath-Rizea, Diana; Hoffmann, Karl-Titus; Surov, Alexey
2017-04-12
Pre-surgical diffusion weighted imaging (DWI) is increasingly important in the context of thyroid cancer for identification of the optimal treatment strategy. It has exemplarily been shown that DWI at 3T can distinguish undifferentiated from well-differentiated thyroid carcinoma, which has decisive implications for the magnitude of surgery. This study used DWI histogram analysis of whole tumor apparent diffusion coefficient (ADC) maps. The primary aim was to discriminate thyroid carcinomas which had already gained the capacity to metastasize lymphatically from those not yet being able to spread via the lymphatic system. The secondary aim was to reflect prognostically important tumor-biological features like cellularity and proliferative activity with ADC histogram analysis. Fifteen patients with follicular-cell derived thyroid cancer were enrolled. Lymph node status, extent of infiltration of surrounding tissue, and Ki-67 and p53 expression were assessed in these patients. DWI was obtained in a 3T system using b values of 0, 400, and 800 s/mm². Whole tumor ADC volumes were analyzed using a histogram-based approach. Several ADC parameters showed significant correlations with immunohistopathological parameters. Most importantly, ADC histogram skewness and ADC histogram kurtosis were able to differentiate between nodal negative and nodal positive thyroid carcinoma. Histogram analysis of whole ADC tumor volumes has the potential to provide valuable information on tumor biology in thyroid carcinoma. However, further studies are warranted.
Zolal, Amir; Juratli, Tareq A; Linn, Jennifer; Podlesek, Dino; Sitoci Ficici, Kerim Hakan; Kitzler, Hagen H; Schackert, Gabriele; Sobottka, Stephan B; Rieger, Bernhard; Krex, Dietmar
2016-05-01
Objective To determine the value of apparent diffusion coefficient (ADC) histogram parameters for the prediction of individual survival in patients undergoing surgery for recurrent glioblastoma (GBM) in a retrospective cohort study. Methods Thirty-one patients who underwent surgery for first recurrence of a known GBM between 2008 and 2012 were included. The following parameters were collected: age, sex, enhancing tumor size, mean ADC, median ADC, ADC skewness, ADC kurtosis and fifth percentile of the ADC histogram, initial progression free survival (PFS), extent of second resection and further adjuvant treatment. The association of these parameters with survival and PFS after second surgery was analyzed using log-rank test and Cox regression. Results Using log-rank test, ADC histogram skewness of the enhancing tumor was significantly associated with both survival (p = 0.001) and PFS after second surgery (p = 0.005). Further parameters associated with prolonged survival after second surgery were: gross total resection at second surgery (p = 0.026), tumor size (p = 0.040) and third surgery (p = 0.003). In the multivariate Cox analysis, ADC histogram skewness was shown to be an independent prognostic factor for survival after second surgery. Conclusion ADC histogram skewness of the enhancing lesion, enhancing lesion size, third surgery, as well as gross total resection have been shown to be associated with survival following the second surgery. ADC histogram skewness was an independent prognostic factor for survival in the multivariate analysis.
Sun, Xiaofei; Shi, Lin; Luo, Yishan; Yang, Wei; Li, Hongpeng; Liang, Peipeng; Li, Kuncheng; Mok, Vincent C T; Chu, Winnie C W; Wang, Defeng
2015-07-28
Intensity normalization is an important preprocessing step in brain magnetic resonance image (MRI) analysis. During MR image acquisition, different scanners or parameters may be used for scanning different subjects or the same subject at different times, which can result in large intensity variations. This intensity variation greatly undermines the performance of subsequent MRI processing and population analysis, such as image registration, segmentation, and tissue volume measurement. In this work, we propose a new histogram normalization method to reduce the intensity variation between MRIs obtained from different acquisitions. In our experiment, we scanned each subject twice on two different scanners using different imaging parameters. With noise estimation, the image with the lower noise level was determined and treated as the high-quality reference image. Then the histogram of the low-quality image was normalized to the histogram of the high-quality image. The normalization algorithm includes two main steps: (1) intensity scaling (IS), where, for the high-quality reference image, the intensities of the image are first rescaled to a range between the low intensity region (LIR) value and the high intensity region (HIR) value; and (2) histogram normalization (HN), where the histogram of the low-quality input image is stretched to match the histogram of the reference image, so that the intensity range in the normalized image also lies between LIR and HIR. We performed three sets of experiments to evaluate the proposed method, i.e., image registration, segmentation, and tissue volume measurement, and compared it with an existing intensity normalization method. The results validate that our histogram normalization framework achieves better results in all the experiments. It is also demonstrated that the brain template built with normalization preprocessing is of higher quality than the template built without normalization preprocessing.
We have proposed a histogram-based MRI intensity normalization method. The method can normalize scans which were acquired on different MRI units. We have validated that the method can greatly improve the image analysis performance. Furthermore, it is demonstrated that with the help of our normalization method, we can create a higher quality Chinese brain template.
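The two-step scheme described above (intensity scaling to a [LIR, HIR] range, then histogram normalization against the higher-quality reference) can be sketched in a few lines. This is a generic cumulative-distribution matching sketch under the stated assumptions; the function names are illustrative and this is not the authors' exact implementation:

```python
import numpy as np

def intensity_scale(img, lir, hir):
    """Step 1 (IS): linearly rescale image intensities to the [LIR, HIR] range."""
    lo, hi = img.min(), img.max()
    return (img - lo) / (hi - lo) * (hir - lir) + lir

def histogram_normalize(low_quality, reference):
    """Step 2 (HN): map the low-quality image's intensities onto the
    reference histogram via cumulative-distribution matching."""
    _, src_idx, src_counts = np.unique(
        low_quality.ravel(), return_inverse=True, return_counts=True)
    ref_vals, ref_counts = np.unique(reference.ravel(), return_counts=True)
    src_cdf = np.cumsum(src_counts) / low_quality.size
    ref_cdf = np.cumsum(ref_counts) / reference.size
    # For each source quantile, pick the reference intensity at the same quantile.
    mapped = np.interp(src_cdf, ref_cdf, ref_vals)
    return mapped[src_idx].reshape(low_quality.shape)
```

After matching, the normalized image's intensity range lies within that of the reference, as the HN step above requires.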
Partial Discharge Ultrasound Detection Using the Sagnac Interferometer System
Li, Xiaomin; Gao, Yan; Zhang, Hongjuan; Wang, Dong; Jin, Baoquan
2018-01-01
Partial discharge detection is crucial for electrical cable safety evaluation. The ultrasonic signals frequently generated in the partial discharge process contain important characteristic information. However, traditional ultrasonic transducers are easily subject to strong electromagnetic interference in environments with high voltages and strong magnetic fields. To overcome this problem, an optical fiber Sagnac interferometer system is proposed for partial discharge ultrasound detection. Optical fiber sensing and time-frequency analysis of the ultrasonic signals excited by the piezoelectric ultrasonic transducer is realized for the first time. Using a purpose-built 10 kV partial discharge simulator, the effective frequency band of the Sagnac interferometer system was shown to extend up to 175 kHz. Using the cumulative histogram method, the characteristic ultrasonic frequency band of the partial discharges was found to lie between 28.9 kHz and 57.6 kHz for this optical fiber partial discharge detection system. This new ultrasound sensor can be used for the intrinsically safe detection of partial discharges in explosive environments. PMID:29734682
NASA Astrophysics Data System (ADS)
Kusyk, Janusz; Eskicioglu, Ahmet M.
2005-10-01
Digital watermarking is considered to be a major technology for the protection of multimedia data. Some of the important applications are broadcast monitoring, copyright protection, and access control. In this paper, we present a semi-blind watermarking scheme for embedding a logo in color images using the DFT domain. After computing the DFT of the luminance layer of the cover image, the magnitudes of the DFT coefficients are compared and modified. A given watermark is embedded in three frequency bands: low, middle, and high. Our experiments show that the watermarks extracted from the lower frequencies have the best visual quality under low-pass filtering, added Gaussian noise, JPEG compression, resizing, rotation, and scaling, whereas the watermarks extracted from the higher frequencies have the best visual quality under cropping, intensity adjustment, histogram equalization, and gamma correction. Extractions from the fragmented and translated image are identical to extractions from the unattacked watermarked image. The collusion and rewatermarking attacks do not provide the attacker with useful tools.
The relationship of storm severity to directionally resolved radio emissions
NASA Technical Reports Server (NTRS)
Johnson, R. O.; Bushman, M. L.; Sherrill, W. M.
1980-01-01
Directionally resolved atmospheric radio frequency emission data were acquired from thunderstorms occurring in the central and southwestern United States. In addition, RF sferic tracking data were obtained from hurricanes and tropical depressions occurring in the Gulf of Mexico. The data were acquired using a crossed baseline phase interferometer operating at a frequency of 2.001 MHz. The received atmospherics were tested for phase linearity across the array, and azimuth/elevation angles of arrival were computed in real time. A histogram analysis of sferic burst count versus azimuth provided lines of bearing to centers of intense electrical activity. Analysis indicates a consistent capability of the phase linear direction finder to detect severe meteorological activity to distances of 2000 km from the receiving site. The technique evidences the ability to discriminate severe storms from nonsevere storms coexistent in large regional scale thunderstorm activity.
Image Enhancement via Subimage Histogram Equalization Based on Mean and Variance
2017-01-01
This paper puts forward a novel image enhancement method via Mean and Variance based Subimage Histogram Equalization (MVSIHE), which effectively increases the contrast of the input image while preserving brightness and details better than some other methods based on histogram equalization (HE). First, the histogram of the input image is divided into four segments based on the mean and variance of the luminance component, and the histogram bins of each segment are modified and equalized, respectively. Second, the result is obtained via the concatenation of the processed subhistograms. Last, the normalization method is deployed on intensity levels, and the integration of the processed image with the input image is performed. One hundred benchmark images from a public image database named CVG-UGR-Database are used for comparison with other state-of-the-art methods. The experimental results show that the algorithm can not only enhance image information effectively but also preserve the brightness and details of the original image well. PMID:29403529
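The segment-equalize-concatenate idea can be illustrated with a minimal sketch. The mean ± one standard deviation segment boundaries follow the description above, but the rank-based equalization and the omission of the final normalization/integration step are simplifications, not the MVSIHE algorithm itself:

```python
import numpy as np

def mvsihe_sketch(img):
    """Equalize each of four histogram segments bounded by (mean - std),
    mean, and (mean + std), then concatenate the results (simplified)."""
    img = img.astype(np.float64)
    m, s = img.mean(), img.std()
    bounds = [img.min(), m - s, m, m + s, img.max()]
    out = np.empty_like(img)
    for k in range(4):
        lo, hi = bounds[k], bounds[k + 1]
        last = (k == 3)  # the last segment includes its upper boundary
        mask = (img >= lo) & (img <= hi) if last else (img >= lo) & (img < hi)
        if not mask.any():
            continue
        seg = img[mask]
        # Rank-based equalization within the segment, mapped back to [lo, hi],
        # so each sub-histogram is flattened independently.
        ranks = seg.argsort().argsort().astype(np.float64)
        out[mask] = lo + ranks / max(len(seg) - 1, 1) * (hi - lo)
    return out
```

Because each segment is remapped onto its own sub-range, the overall brightness ordering of the four segments is preserved, which is the property MVSIHE exploits to avoid the brightness shift of global HE.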
Image contrast enhancement using adjacent-blocks-based modification for local histogram equalization
NASA Astrophysics Data System (ADS)
Wang, Yang; Pan, Zhibin
2017-11-01
Infrared images usually have some non-ideal characteristics such as weak target-to-background contrast and strong noise. Because of these characteristics, it is necessary to apply contrast enhancement algorithms to improve the visual quality of infrared images. The histogram equalization (HE) algorithm is a widely used contrast enhancement algorithm due to its effectiveness and simple implementation. A drawback of the HE algorithm, however, is that the local contrast of an image cannot be equally enhanced. Local histogram equalization algorithms have proved to be effective techniques for local image contrast enhancement. However, over-enhancement of noise and artifacts can easily be found in images enhanced by local histogram equalization. In this paper, a new contrast enhancement technique based on the local histogram equalization algorithm is proposed to overcome the drawbacks mentioned above. The input images are segmented into three kinds of overlapped sub-blocks using their gradients. To overcome the over-enhancement effect, the histograms of these sub-blocks are then modified by adjacent sub-blocks. We pay more attention to improving the contrast of detail information while the brightness of the flat regions in these sub-blocks is well preserved. It will be shown that the proposed algorithm outperforms other related algorithms by enhancing the local contrast without introducing over-enhancement effects or additional noise.
Value of MR histogram analyses for prediction of microvascular invasion of hepatocellular carcinoma.
Huang, Ya-Qin; Liang, He-Yue; Yang, Zhao-Xia; Ding, Ying; Zeng, Meng-Su; Rao, Sheng-Xiang
2016-06-01
The objective is to explore the value of preoperative magnetic resonance (MR) histogram analyses in predicting microvascular invasion (MVI) of hepatocellular carcinoma (HCC). Fifty-one patients with histologically confirmed HCC who underwent diffusion-weighted and contrast-enhanced MR imaging were included. Histogram analyses were performed and the mean, variance, skewness, kurtosis, and 1st, 10th, 50th, 90th, and 99th percentiles were derived. Quantitative histogram parameters were compared between HCCs with and without MVI. Receiver operating characteristic (ROC) analyses were generated to compare the diagnostic performance of tumor size, histogram analyses of apparent diffusion coefficient (ADC) maps, and MR enhancement. The mean and 1st, 10th, and 50th percentiles of the ADC maps, and the mean, variance, and 1st, 10th, 50th, 90th, and 99th percentiles of the portal venous phase (PVP) images were significantly different between the groups with and without MVI (P < 0.05), with areas under the ROC curves (AUCs) of 0.66 to 0.74 for ADC and 0.76 to 0.88 for PVP. The largest AUC of PVP (1st percentile) showed significantly higher accuracy compared with that of the arterial phase (AP) or tumor size (P < 0.001). MR histogram analyses, in particular the 1st percentile of the PVP images, hold promise for prediction of MVI in HCC.
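First-order histogram parameters of the kind reported in studies like this one (mean, variance, skewness, kurtosis, and percentiles of an ROI) can be computed directly from the voxel values. This NumPy-only sketch uses the conventional definitions of sample skewness and excess kurtosis and is not tied to any particular imaging toolkit:

```python
import numpy as np

def histogram_parameters(roi_values):
    """First-order histogram statistics of the voxel values in an ROI."""
    v = np.asarray(roi_values, dtype=np.float64).ravel()
    m, s = v.mean(), v.std()
    p1, p10, p50, p90, p99 = np.percentile(v, [1, 10, 50, 90, 99])
    return {
        "mean": m,
        "variance": v.var(),
        "skewness": ((v - m) ** 3).mean() / s ** 3,
        # Excess kurtosis: 0 for a normal distribution, negative for flat ones.
        "kurtosis": ((v - m) ** 4).mean() / s ** 4 - 3.0,
        "p1": p1, "p10": p10, "p50": p50, "p90": p90, "p99": p99,
    }
```

The same function covers both the ADC-map and enhancement-image analyses, since only the input voxel array changes.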
Lu, Shan Shan; Kim, Sang Joon; Kim, Namkug; Kim, Ho Sung; Choi, Choong Gon; Lim, Young Min
2015-04-01
This study intended to investigate the usefulness of histogram analysis of apparent diffusion coefficient (ADC) maps for discriminating primary CNS lymphomas (PCNSLs), especially atypical PCNSLs, from tumefactive demyelinating lesions (TDLs). Forty-seven patients with PCNSLs and 18 with TDLs were enrolled in our study. Hyperintense lesions seen on T2-weighted images were defined as ROIs after ADC maps were registered to the corresponding T2-weighted image. ADC histograms were calculated from the ROIs containing the entire lesion on every section and on a voxel-by-voxel basis. The ADC histogram parameters were compared among all PCNSLs and TDLs as well as between the subgroup of atypical PCNSLs and TDLs. ROC curves were constructed to evaluate the diagnostic performance of the histogram parameters and to determine the optimum thresholds. Differences between the PCNSLs and TDLs were found in the minimum ADC values (ADCmin) and in the 5th and 10th percentiles (ADC5% and ADC10%) of the cumulative ADC histograms. However, no statistical significance was found in the mean ADC value or in the ADC values concerning the mode, kurtosis, and skewness. The ADCmin, ADC5%, and ADC10% were also lower in atypical PCNSLs than in TDLs. ADCmin was the best indicator for discriminating atypical PCNSLs from TDLs, with a threshold of 556×10(-6) mm2/s (sensitivity, 81.3%; specificity, 88.9%). Histogram analysis of ADC maps may help to discriminate PCNSLs from TDLs and may be particularly useful in differentiating atypical PCNSLs from TDLs.
Tiano, L; Chessa, M G; Carrara, S; Tagliafierro, G; Delmonte Corrado, M U
1999-01-01
The chromatin structure dynamics of the Colpoda inflata macronucleus have been investigated in relation to its functional condition, concerning chromatin body extrusion regulating activity. Samples of 2- and 25-day-old resting cysts derived from a standard culture, and of 1-year-old resting cysts derived from a senescent culture, were examined by means of histogram analysis performed on acquired optical microscopy images. Three groups of histograms were detected in each sample. Histogram classification, clustering and matching were assessed in order to obtain the mean histogram of each group. Comparative analysis of the mean histogram showed a similarity in the grey level range of 25-day- and 1-year-old cysts, unlike the wider grey level range found in 2-day-old cysts. Moreover, the respective mean histograms of the three cyst samples appeared rather similar in shape. All this implies that macronuclear chromatin structural features of 1-year-old cysts are common to both cyst standard cultures. The evaluation of the acquired images and their respective histograms evidenced a dynamic state of the macronuclear chromatin, appearing differently condensed in relation to the chromatin body extrusion regulating activity of the macronucleus. The coexistence of a chromatin-decondensed macronucleus with a pycnotic extrusion body suggests that chromatin unable to decondense, thus inactive, is extruded. This finding, along with the presence of chromatin structural features common to standard and senescent cyst populations, supports the occurrence of 'rejuvenated' cell lines from 1-year-old encysted senescent cells, a phenomenon which could be a result of accomplished macronuclear renewal.
Performance analysis of a dual-tree algorithm for computing spatial distance histograms
Chen, Shaoping; Tu, Yi-Cheng; Xia, Yuni
2011-01-01
Many scientific and engineering fields produce large volumes of spatiotemporal data. The storage, retrieval, and analysis of such data impose great challenges on database systems design. Analysis of scientific spatiotemporal data often involves computing functions of all point-to-point interactions. One such analytic, the Spatial Distance Histogram (SDH), is of vital importance to scientific discovery. Recently, algorithms for efficient SDH processing in large-scale scientific databases have been proposed. These algorithms adopt a recursive tree-traversing strategy to process point-to-point distances in the visited tree nodes in batches, and thus require less time than the brute-force approach, in which all pairwise distances have to be computed. Despite the promising experimental results, the complexity of such algorithms has not been thoroughly studied. In this paper, we present an analysis of such algorithms based on a geometric modeling approach. The main technique is to transform the analysis of point counts into a problem of quantifying the area of regions where pairwise distances can be processed in batches by the algorithm. From the analysis, we conclude that the number of pairwise distances left to be processed decreases exponentially with the number of tree levels visited. This leads to the proof of a time complexity lower than the quadratic time needed for a brute-force algorithm and builds the foundation for a constant-time approximate algorithm. Our model is also general in that it works for a wide range of point spatial distributions, histogram types, and space-partitioning options in building the tree. PMID:21804753
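The brute-force baseline that these tree-based algorithms improve on is the quadratic all-pairs computation. A minimal O(n²) sketch for illustration (not the paper's dual-tree algorithm), with the usual SDH convention of equal-width distance buckets:

```python
import numpy as np

def spatial_distance_histogram(points, bucket_width, num_buckets):
    """Histogram of all pairwise Euclidean distances (brute force, O(n^2))."""
    pts = np.asarray(points, dtype=np.float64)
    n = len(pts)
    # Upper-triangular index pairs enumerate each unordered pair once.
    i, j = np.triu_indices(n, k=1)
    d = np.linalg.norm(pts[i] - pts[j], axis=1)
    edges = np.arange(num_buckets + 1) * bucket_width
    hist, _ = np.histogram(d, bins=edges)
    return hist
```

For n points this evaluates n(n-1)/2 distances, which is exactly the quadratic cost the recursive tree-traversing strategy avoids by resolving whole node pairs into a single bucket at once.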
Tsuchiya, Naoko; Doai, Mariko; Usuda, Katsuo; Uramoto, Hidetaka
2017-01-01
Purpose: To investigate the diagnostic accuracy of histogram analyses of apparent diffusion coefficient (ADC) values for determining non-small cell lung cancer (NSCLC) tumor grades, lymphovascular invasion, and pleural invasion. Materials and methods: We studied 60 surgically diagnosed NSCLC patients. Diffusion-weighted imaging (DWI) was performed in the axial plane using a navigator-triggered single-shot, echo-planar imaging sequence with prospective acquisition correction. The ADC maps were generated, and we placed a volume-of-interest on the tumor to construct the whole-lesion histogram. Using the histogram, we calculated the mean, 5th, 10th, 25th, 50th, 75th, 90th, and 95th percentiles of ADC, skewness, and kurtosis. Histogram parameters were correlated with tumor grade, lymphovascular invasion, and pleural invasion. We performed a receiver operating characteristic (ROC) analysis to assess the diagnostic performance of the histogram parameters for distinguishing different pathologic features. Results: The ADC mean and 10th, 25th, 50th, 75th, 90th, and 95th percentiles showed significant differences among the tumor grades. The ADC mean and 25th, 50th, 75th, 90th, and 95th percentiles were significant histogram parameters between high- and low-grade tumors. The ROC analysis between high- and low-grade tumors showed that the 95th percentile ADC achieved the highest area under the curve (AUC) at 0.74. Lymphovascular invasion was associated with the ADC mean, 50th, 75th, 90th, and 95th percentiles, skewness, and kurtosis. Kurtosis achieved the highest AUC at 0.809. Pleural invasion was only associated with skewness, with an AUC of 0.648. Conclusions: ADC histogram analyses on the basis of the entire tumor volume are able to stratify NSCLC tumor grade, lymphovascular invasion, and pleural invasion. PMID:28207858
Accelerometer Data Analysis and Presentation Techniques
NASA Technical Reports Server (NTRS)
Rogers, Melissa J. B.; Hrovat, Kenneth; McPherson, Kevin; Moskowitz, Milton E.; Reckart, Timothy
1997-01-01
The NASA Lewis Research Center's Principal Investigator Microgravity Services project analyzes Orbital Acceleration Research Experiment and Space Acceleration Measurement System data for principal investigators of microgravity experiments. Principal investigators need a thorough understanding of data analysis techniques so that they can request appropriate analyses to best interpret accelerometer data. Accelerometer data sampling and filtering is introduced along with the related topics of resolution and aliasing. Specific information about the Orbital Acceleration Research Experiment and Space Acceleration Measurement System data sampling and filtering is given. Time domain data analysis techniques are discussed and example environment interpretations are made using plots of acceleration versus time, interval average acceleration versus time, interval root-mean-square acceleration versus time, trimmean acceleration versus time, quasi-steady three dimensional histograms, and prediction of quasi-steady levels at different locations. An introduction to Fourier transform theory and windowing is provided along with specific analysis techniques and data interpretations. The frequency domain analyses discussed are power spectral density versus frequency, cumulative root-mean-square acceleration versus frequency, root-mean-square acceleration versus frequency, one-third octave band root-mean-square acceleration versus frequency, and power spectral density versus frequency versus time (spectrogram). Instructions for accessing NASA Lewis Research Center accelerometer data and related information using the internet are provided.
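As one concrete example of the time-domain techniques listed above, interval root-mean-square acceleration versus time reduces to a windowed RMS over consecutive non-overlapping intervals. This generic sketch assumes a uniformly sampled single-axis signal and is not drawn from the report itself:

```python
import numpy as np

def interval_rms(accel, fs, interval_s):
    """RMS acceleration over consecutive non-overlapping intervals.

    accel: 1-D acceleration samples; fs: sample rate (Hz);
    interval_s: interval length in seconds.
    """
    a = np.asarray(accel, dtype=np.float64)
    n = int(fs * interval_s)
    usable = (len(a) // n) * n          # drop the partial trailing interval
    blocks = a[:usable].reshape(-1, n)  # one row per interval
    return np.sqrt((blocks ** 2).mean(axis=1))
```

Interval average acceleration is the same computation with the square and square root removed, and the trimmean variant replaces the mean with a trimmed mean within each block.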
Chao, Calvin Yi-Ping; Tu, Honyih; Wu, Thomas Meng-Hsiu; Chou, Kuo-Yu; Yeh, Shang-Fu; Yin, Chin; Lee, Chih-Lin
2017-11-23
A study of the random telegraph noise (RTN) of a 1.1 μm pitch, 8.3 Mpixel CMOS image sensor (CIS) fabricated in a 45 nm backside-illumination (BSI) technology is presented in this paper. A noise decomposition scheme is used to pinpoint the noise source. The long tail of the random noise (RN) distribution is directly linked to the RTN from the pixel source follower (SF). The full 8.3 Mpixels are classified into four categories according to the observed RTN histogram peaks. A theoretical formula describing the RTN as a function of the time difference between the two phases of the correlated double sampling (CDS) is derived and validated by measured data. An on-chip time constant extraction method is developed and applied to the RTN analysis. The effects of readout circuit bandwidth on the settling ratios of the RTN histograms are investigated and successfully accounted for in a simulation using a RTN behavior model.
Action Recognition Using 3D Histograms of Texture and A Multi-Class Boosting Classifier.
Zhang, Baochang; Yang, Yun; Chen, Chen; Yang, Linlin; Han, Jungong; Shao, Ling
2017-10-01
Human action recognition is an important yet challenging task. This paper presents a low-cost descriptor called 3D histograms of texture (3DHoTs) to extract discriminant features from a sequence of depth maps. 3DHoTs are derived from projecting depth frames onto three orthogonal Cartesian planes, i.e., the frontal, side, and top planes, and thus compactly characterize the salient information of a specific action, on which texture features are calculated to represent the action. Besides this fast feature descriptor, a new multi-class boosting classifier (MBC) is also proposed to efficiently exploit different kinds of features in a unified framework for action classification. Compared with the existing boosting frameworks, we add a new multi-class constraint into the objective function, which helps to maintain a better margin distribution by maximizing the mean of margin, whereas still minimizing the variance of margin. Experiments on the MSRAction3D, MSRGesture3D, MSRActivity3D, and UTD-MHAD data sets demonstrate that the proposed system combining 3DHoTs and MBC is superior to the state of the art.
Contact-free palm-vein recognition based on local invariant features.
Kang, Wenxiong; Liu, Yang; Wu, Qiuxia; Yue, Xishun
2014-01-01
Contact-free palm-vein recognition is one of the most challenging and promising areas in hand biometrics. In view of the existing problems in contact-free palm-vein imaging, including projection transformation, uneven illumination and difficulty in extracting exact ROIs, this paper presents a novel recognition approach for contact-free palm-vein recognition that performs feature extraction and matching on all vein textures distributed over the palm surface, including finger veins and palm veins, to minimize the loss of feature information. First, a hierarchical enhancement algorithm, which combines a DOG filter and histogram equalization, is adopted to alleviate uneven illumination and to highlight vein textures. Second, RootSIFT, a more stable local invariant feature extraction method in comparison to SIFT, is adopted to overcome the projection transformation in contact-free mode. Subsequently, a novel hierarchical mismatching removal algorithm based on neighborhood searching and LBP histograms is adopted to improve the accuracy of feature matching. Finally, we rigorously evaluated the proposed approach using two different databases and obtained 0.996% and 3.112% Equal Error Rates (EERs), respectively, which demonstrate the effectiveness of the proposed approach.
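The first enhancement stage described above (a difference-of-Gaussians filter followed by histogram equalization) can be sketched as below. The σ values and gray-level count are illustrative assumptions, not the parameters used in the paper:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def dog_then_equalize(img, sigma_small=1.0, sigma_large=3.0, levels=256):
    """Difference-of-Gaussians band-pass, then global histogram equalization."""
    img = img.astype(np.float64)
    # DoG: subtracting a wider blur suppresses slowly varying illumination
    # while keeping vein-scale structure.
    dog = gaussian_filter(img, sigma_small) - gaussian_filter(img, sigma_large)
    # Quantize to `levels` gray levels and equalize via the cumulative histogram.
    q = np.floor((dog - dog.min()) / (np.ptp(dog) + 1e-12) * (levels - 1)).astype(int)
    hist = np.bincount(q.ravel(), minlength=levels)
    cdf = np.cumsum(hist) / q.size
    return cdf[q] * (levels - 1)
```

The DoG step addresses the uneven illumination the authors mention, and the equalization step stretches the remaining band-pass response so the vein textures stand out.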
NASA Astrophysics Data System (ADS)
Hori, Yasuaki; Yasuno, Yoshiaki; Sakai, Shingo; Matsumoto, Masayuki; Sugawara, Tomoko; Madjarova, Violeta; Yamanari, Masahiro; Makita, Shuichi; Yasui, Takeshi; Araki, Tsutomu; Itoh, Masahide; Yatagai, Toyohiko
2006-03-01
A set of fully automated algorithms specialized for analyzing a three-dimensional optical coherence tomography (OCT) volume of human skin is reported. The algorithm set first determines the skin surface of the OCT volume, and a depth-oriented algorithm provides the mean epidermal thickness, a distribution map of the epidermis, and a segmented volume of the epidermis. Subsequently, an en face shadowgram is produced by an algorithm to visualize the infundibula in the skin with high contrast. The population and occupation ratio of the infundibula are provided by a histogram-based thresholding algorithm and a distance mapping algorithm. En face OCT slices at constant depths from the sample surface are extracted, and the histogram-based thresholding algorithm is again applied to these slices, yielding a three-dimensional segmented volume of the infundibula. The dermal attenuation coefficient is also calculated from the OCT volume in order to evaluate the skin texture. The algorithm set was applied to swept-source OCT volumes of the skin of several volunteers, and the results show the high stability, portability, and reproducibility of the algorithms.
Multifractal diffusion entropy analysis: Optimal bin width of probability histograms
NASA Astrophysics Data System (ADS)
Jizba, Petr; Korbel, Jan
2014-11-01
In the framework of Multifractal Diffusion Entropy Analysis we propose a method for choosing an optimal bin-width in histograms generated from underlying probability distributions of interest. The method presented uses techniques of Rényi’s entropy and the mean squared error analysis to discuss the conditions under which the error in the multifractal spectrum estimation is minimal. We illustrate the utility of our approach by focusing on a scaling behavior of financial time series. In particular, we analyze the S&P500 stock index as sampled at a daily rate in the time period 1950-2013. In order to demonstrate a strength of the method proposed we compare the multifractal δ-spectrum for various bin-widths and show the robustness of the method, especially for large values of q. For such values, other methods in use, e.g., those based on moment estimation, tend to fail for heavy-tailed data or data with long correlations. Connection between the δ-spectrum and Rényi’s q parameter is also discussed and elucidated on a simple example of multiscale time series.
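The sensitivity of a histogram-based Rényi entropy estimate to the chosen bin width, the quantity the proposed method optimizes, can be seen with a short generic sketch (an illustration of the entropy estimator only, not the authors' MF-DEA procedure):

```python
import numpy as np

def renyi_entropy(data, q, bin_width):
    """Rényi entropy of order q estimated from a histogram with a given bin width."""
    data = np.asarray(data, dtype=np.float64)
    edges = np.arange(data.min(), data.max() + bin_width, bin_width)
    p = np.histogram(data, bins=edges)[0] / len(data)
    p = p[p > 0]                      # empty bins contribute nothing
    if q == 1.0:                      # Shannon limit of the Rényi entropy
        return -np.sum(p * np.log(p))
    return np.log(np.sum(p ** q)) / (1.0 - q)
```

Sweeping `bin_width` over a range and comparing the resulting entropy (and hence δ-spectrum) estimates is the kind of diagnostic the mean-squared-error analysis in the paper formalizes: too few bins biases the estimate, too many inflates its variance, especially for large q.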
Discoveries far from the lamppost with matrix elements and ranking
DOE Office of Scientific and Technical Information (OSTI.GOV)
Debnath, Dipsikha; Gainer, James S.; Matchev, Konstantin T.
2015-04-01
The prevalence of null results in searches for new physics at the LHC motivates the effort to make these searches as model-independent as possible. We describe procedures for adapting the Matrix Element Method for situations where the signal hypothesis is not known a priori. We also present general and intuitive approaches for performing analyses and presenting results, which involve the flattening of background distributions using likelihood information. The first flattening method involves ranking events by background matrix element, the second involves quantile binning with respect to likelihood (and other) variables, and the third method involves reweighting histograms by the inverse of the background distribution.
WE-A-17A-12: The Influence of Eye Plaque Design On Dose Distributions and Dose- Volume Histograms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aryal, P; Molloy, JA; Rivard, MJ
Purpose: To investigate the effect of slot design of the model EP917 plaque on dose distributions and dose-volume histograms (DVHs). Methods: The dimensions and orientation of the slots in EP917 plaques were measured. In the MCNP5 radiation simulation geometry, dose distributions on orthogonal planes and DVHs for a tumor and sclera were generated for comparisons. 27 slot designs and 13 plaques were evaluated and compared with the published literature and the Plaque Simulator clinical treatment planning system. Results: The dosimetric effect of the gold backing composition and mass density was < 3%. Slot depth, width, and length changed the central axis (CAX) dose distributions by < 1% per 0.1 mm in design variation. Seed shifts in the slot towards the eye and shifts of the 125I-coated Ag rod within the capsule had the greatest impact on the CAX dose distribution, increasing it by 14%, 9%, 4%, and 2.5% at 1, 2, 5, and 10 mm, respectively, from the inner sclera. Along the CAX, dose from the full plaque geometry using the measured slot design was 3.4% ± 2.3% higher than the manufacturer-provided geometry. D10 for the simulated tumor, inner sclera, and outer sclera for the measured plaque was also higher, by 9%, 10%, and 20%, respectively. In comparison to the measured plaque design, a theoretical plaque having narrow and deep slots delivered 30%, 37%, and 62% lower D10 doses to the tumor, inner sclera, and outer sclera, respectively. CAX doses at −1, 0, 1, and 2 mm were also lower by factors of 2.6, 1.4, 1.23, and 1.13, respectively. Conclusion: The study identified substantial sensitivity of the EP917 plaque dose distributions to slot design. However, it did not identify substantial dosimetric variations based on radionuclide choice (125I, 103Pd, or 131Cs). COMS plaques provided lower scleral doses with similar tumor dose coverage.
A comparison of two methods for expert elicitation in health technology assessments.
Grigore, Bogdan; Peters, Jaime; Hyde, Christopher; Stein, Ken
2016-07-26
When data needed to inform parameters in decision models are lacking, formal elicitation of expert judgement can be used to characterise parameter uncertainty. Although numerous methods for eliciting expert opinion as probability distributions exist, there is little research to suggest whether one method is more useful than another. This study had three objectives: (i) to obtain subjective probability distributions characterising parameter uncertainty in the context of a health technology assessment; (ii) to compare two elicitation methods by eliciting the same parameters in different ways; (iii) to collect the experts' subjective preferences for the different elicitation methods used. Twenty-seven clinical experts were invited to participate in an elicitation exercise to inform a published model-based cost-effectiveness analysis of alternative treatments for prostate cancer. Participants were individually asked to express their judgements as probability distributions using two different methods, the histogram and hybrid elicitation methods, presented in a random order. Individual distributions were mathematically aggregated across experts with and without weighting. The resulting combined distributions were used in the probabilistic analysis of the decision model, and mean incremental cost-effectiveness ratios (ICERs) and the expected values of perfect information (EVPI) were calculated for each method and compared with the original cost-effectiveness analysis. Scores on the ease of use of the two methods and the extent to which the probability distributions obtained from each method accurately reflected the experts' opinions were also recorded. Six experts completed the task. Mean ICERs from the probabilistic analysis ranged from £162,600 to £175,500 per quality-adjusted life year (QALY) depending on the elicitation and weighting methods used.
Compared to having no information, use of expert opinion decreased decision uncertainty: the EVPI at the £30,000 per QALY threshold decreased by 74-86% relative to the original cost-effectiveness analysis. Experts indicated that the histogram method was easier to use, but perceived the hybrid method as more accurate. Inclusion of expert elicitation can decrease decision uncertainty. Here, the choice of method did not affect the overall cost-effectiveness conclusions, but researchers intending to use expert elicitation should be aware of the impact different methods can have.
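The mathematical aggregation step, combining individually elicited distributions across experts with and without weighting, is commonly done by linear opinion pooling. A minimal sketch, assuming each expert's judgement has already been converted to a probability histogram over shared bins (the function name and example values are illustrative, not taken from the study):

```python
import numpy as np

def linear_pool(expert_probs, weights=None):
    """Combine per-expert probability histograms (rows) into one
    distribution by weighted linear averaging over shared bins."""
    p = np.asarray(expert_probs, dtype=float)
    if weights is None:
        weights = np.full(p.shape[0], 1.0 / p.shape[0])  # equal weights
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                     # normalise the weights
    pooled = w @ p                      # weighted average per bin
    return pooled / pooled.sum()        # renormalise to a distribution

# Two hypothetical experts placing probability mass on the same 4 bins:
experts = [[0.1, 0.4, 0.4, 0.1],
           [0.0, 0.2, 0.5, 0.3]]
pooled = linear_pool(experts)
```

Performance-based weighting would simply replace the equal weights with, for example, calibration scores from seed questions.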
Brennan, Gerard P; Hunter, Stephen J; Snow, Greg; Minick, Kate I
2017-12-01
The Centers for Medicare and Medicaid Services (CMS) require that physical therapists document patients' functional limitations, but the process is not standardized; a systematic approach to determining a patient's functional limitations and responsiveness to change is needed. The purpose of this study was to compare the responsiveness to change of patient-reported outcomes (PROs) under the 7-level severity/complexity modifier scale proposed by Medicare versus a derived scale implemented in Intermountain Healthcare's Rehabilitation Outcomes Management System (ROMS). This was a retrospective, observational cohort design. 165,183 PROs recorded prior to July 1, 2013, were compared to 46,334 records from July 1, 2013, to December 31, 2015. Histograms and ribbon plots illustrated the distribution and change of patients' scores. ROMS raw score ranges were calculated and compared to CMS severity/complexity levels based on score percentage. The distribution of the population was compared between the 2 methods, and sensitivity and specificity were compared for responsiveness to change based on the minimal clinically important difference (MCID). Histograms demonstrated that few patient scores placed in the CMS scale levels at the extremes, whereas the majority of scores placed in the 2 middle levels (CJ, CK); ROMS distributed scores more evenly across levels. Ribbon plots illustrated the advantage of ROMS's narrower score ranges. A greater chance for patients to change levels was observed with ROMS when an MCID was achieved, and ROMS's narrower scale levels resulted in greater sensitivity and good specificity. Geographic representation for the United States was limited, and without patients' global rating of change, a reference standard to gauge validation of improvement could not be provided. ROMS provides a standard approach to accurately identify functional limitation modifier levels and to detect improvement more accurately than a straight-across transposition using the CMS scale. © 2017 American Physical Therapy Association
NASA Astrophysics Data System (ADS)
Mackiewicz, Michal W.; Fisher, Mark; Jamieson, Crawford
2008-03-01
Wireless Capsule Endoscopy (WCE) is a colour imaging technology that enables detailed examination of the interior of the gastrointestinal tract. A typical WCE examination takes ~8 hours and captures ~40,000 useful images. After the examination, the images are viewed as a video sequence, which generally takes a clinician over an hour to analyse. The manufacturers of WCE systems provide certain automatic image analysis functions; for example, Given Imaging offers the Suspected Blood Indicator (SBI) in its Rapid Reader software, which is designed to report the locations in the video of areas of active bleeding. However, this tool has been reported to have insufficient specificity and sensitivity, so it does not free the specialist from reviewing the entire footage and has been suggested for use only as a fast screening tool. In this paper we propose a method of bleeding detection whose first stage uses Hue-Saturation-Intensity colour histograms to track moving background and bleeding colour distributions over time. This approach addresses the problem caused by drastic changes in blood colour distribution that occur when blood is altered by gastrointestinal fluids, and it allows detection of other red lesions, which, although usually "less red" than fresh bleeding, can still be detected when the difference between their colour distributions and the background is large enough. In the second stage of our method, we analyse all candidate blood frames by extracting colour (HSI) and texture (LBP) features from the suspicious image regions (obtained in the first stage) and their neighbourhoods, and classifying them with a Support Vector Classifier into Bleeding, Lesion and Normal classes. We show that our algorithm compares favourably with the SBI on a test set of 84 full-length videos.
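The first-stage colour representation can be sketched as a normalised hue-saturation histogram. This is an illustrative reconstruction only: it uses Python's HSV conversion as a stand-in for the paper's HSI space, and the pixel values and bin counts are arbitrary:

```python
import colorsys
import numpy as np

def hs_histogram(rgb_pixels, h_bins=8, s_bins=4):
    """Build a normalised hue-saturation histogram from an array of
    RGB pixels scaled to [0, 1] (HSV used here as a stand-in for HSI)."""
    hsv = np.array([colorsys.rgb_to_hsv(*px) for px in rgb_pixels])
    hist, _, _ = np.histogram2d(hsv[:, 0], hsv[:, 1],
                                bins=[h_bins, s_bins],
                                range=[[0, 1], [0, 1]])
    return hist / hist.sum()            # normalise to unit mass

# A tiny patch of reddish pixels: mass concentrates in the red hue bins
patch = np.array([[0.8, 0.1, 0.1], [0.9, 0.2, 0.1], [0.7, 0.1, 0.2]])
hist = hs_histogram(patch)
```

Tracking the background and bleeding distributions over time would then amount to updating such histograms frame by frame and comparing them, e.g. with a histogram distance.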
Kharche, Sanjay R.; So, Aaron; Salerno, Fabio; Lee, Ting-Yim; Ellis, Chris; Goldman, Daniel; McIntyre, Christopher W.
2018-01-01
Dialysis prolongs life but augments cardiovascular mortality. Imaging data suggest that dialysis increases myocardial blood flow (BF) heterogeneity, but its causes remain poorly understood. A biophysical model of the human coronary vasculature was used to explain the imaging observations and highlight causes of coronary BF heterogeneity. Post-dialysis CT images from patients under control, pharmacological stress (adenosine), therapy (cooled dialysate), and combined adenosine and cooled dialysate conditions were obtained. The data presented disparate phenotypes. To dissect vascular mechanisms, a 3D human vasculature model based on known experimental coronary morphometry and a space-filling algorithm was implemented. Steady-state simulations were performed to investigate the effects of altered aortic pressure and blood vessel diameters on myocardial BF heterogeneity. Imaging showed that stress and therapy potentially increased mean and total BF while reducing heterogeneity. BF histograms of one patient showed multi-modality. Using the model, it was found that total coronary BF increased as coronary perfusion pressure was increased. BF heterogeneity was differentially affected by blocking of large or small vessels, and was found to be inversely related to small blood vessel diameters. Simulation of large artery stenosis indicated that BF became more heterogeneous (increased relative dispersion) and gave multi-modal histograms. Large artery stenosis reduced both total transmural BF and transmural BF heterogeneity, generating large patches of very low BF downstream. Blocking of arteries at various orders showed that blocking larger arteries results in multi-modal BF histograms and large patches of low BF, whereas blocking smaller arteries results in augmented relative dispersion and fractal dimension. Transmural heterogeneity was also affected.
Finally, the effects of augmented aortic pressure in the presence of blood vessel blocking showed differential effects on BF heterogeneity as well as transmural BF. Improved aortic blood pressure may improve total BF. Stress and therapy may be effective if they dilate small vessels. Complex BF distributions (multi-modal BF histograms) may indicate existing large vessel stenosis. The intuitive BF heterogeneity metrics used here can be readily applied in clinical studies. Further development of the model and methods will permit personalized assessment of patient BF status. PMID:29867555
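Relative dispersion, the heterogeneity measure referred to above, is simply the coefficient of variation of the regional flow values. A minimal sketch with hypothetical flow values:

```python
import numpy as np

def relative_dispersion(flows):
    """Relative dispersion (coefficient of variation) of regional
    blood-flow values: standard deviation divided by the mean."""
    flows = np.asarray(flows, dtype=float)
    return flows.std() / flows.mean()

# Homogeneous flow gives low relative dispersion; a patch of low-flow
# regions downstream of a stenosis (hypothetical values, ml/min/g)
# raises it markedly.
healthy = [0.9, 1.0, 1.1, 1.0, 0.95, 1.05]
stenosed = [0.9, 1.0, 1.1, 0.2, 0.15, 0.25]
```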
Ellingson, B M; Sahebjam, S; Kim, H J; Pope, W B; Harris, R J; Woodworth, D C; Lai, A; Nghiemphu, P L; Mason, W P; Cloughesy, T F
2014-04-01
Pre-treatment ADC characteristics have been shown to predict response to bevacizumab in recurrent glioblastoma multiforme. However, no studies have examined whether ADC characteristics are specific to this particular treatment. The purpose of the current study was to determine whether ADC histogram analysis is a bevacizumab-specific or treatment-independent biomarker of treatment response in recurrent glioblastoma multiforme. Eighty-nine bevacizumab-treated and 43 chemotherapy-treated recurrent glioblastoma multiformes never exposed to bevacizumab were included in this study. In all patients, ADC values in contrast-enhancing ROIs from MR imaging examinations performed at the time of recurrence, immediately before commencement of treatment for recurrence, were extracted and the resulting histogram was fitted to a mixed model with a double Gaussian distribution. Mean ADC in the lower Gaussian curve was used as the primary biomarker of interest. The Cox proportional hazards model and log-rank tests were used for survival analysis. Cox multivariate regression analysis accounting for the interaction between bevacizumab- and non-bevacizumab-treated patients suggested that the ability of the lower Gaussian curve to predict survival is dependent on treatment (progression-free survival, P = .045; overall survival, P = .003). Patients with bevacizumab-treated recurrent glioblastoma multiforme with a pretreatment lower Gaussian curve > 1.2 μm²/ms had a significantly longer progression-free survival and overall survival compared with bevacizumab-treated patients with a lower Gaussian curve < 1.2 μm²/ms. No differences in progression-free survival or overall survival were observed in the chemotherapy-treated cohort. Bevacizumab-treated patients with a mean lower Gaussian curve > 1.2 μm²/ms had a significantly longer progression-free survival and overall survival compared with chemotherapy-treated patients.
The mean lower Gaussian curve from ADC histogram analysis is a predictive imaging biomarker for bevacizumab-treated, not chemotherapy-treated, recurrent glioblastoma multiforme. Patients with recurrent glioblastoma multiforme with a mean lower Gaussian curve > 1.2 μm²/ms have a survival advantage when treated with bevacizumab.
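The core measurement here, fitting an ADC histogram with a double Gaussian mixture and taking the mean of the lower curve, can be sketched with a small expectation-maximisation (EM) fit. The initialisation, iteration count, and synthetic ADC values below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def fit_two_gaussians(x, n_iter=200):
    """Fit a two-component 1-D Gaussian mixture by EM and return the
    parameters sorted so the lower-mean ('lower Gaussian') comes first."""
    x = np.asarray(x, dtype=float)
    # Crude initialisation: split the sample at its median
    mu = np.array([x[x <= np.median(x)].mean(), x[x > np.median(x)].mean()])
    sd = np.array([x.std(), x.std()]) + 1e-6
    w = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: responsibility of each component for each point
        pdf = (w / (sd * np.sqrt(2 * np.pi)) *
               np.exp(-0.5 * ((x[:, None] - mu) / sd) ** 2))
        resp = pdf / pdf.sum(axis=1, keepdims=True)
        # M-step: update weights, means, standard deviations
        nk = resp.sum(axis=0)
        w = nk / nk.sum()
        mu = (resp * x[:, None]).sum(axis=0) / nk
        sd = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk) + 1e-6
    order = np.argsort(mu)
    return mu[order], sd[order], w[order]

# Synthetic ADC sample (arbitrary units): a low-ADC and a high-ADC mode
rng = np.random.default_rng(0)
adc = np.concatenate([rng.normal(1.0, 0.1, 500), rng.normal(1.8, 0.15, 500)])
mu, sd, w = fit_two_gaussians(adc)   # mu[0] is the 'lower Gaussian' mean
```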
Contrasting the co-variability of daytime cloud and precipitation over tropical land and ocean
NASA Astrophysics Data System (ADS)
Jin, Daeho; Oreopoulos, Lazaros; Lee, Dongmin; Cho, Nayeong; Tan, Jackson
2018-03-01
The co-variability of cloud and precipitation in the extended tropics (35° N-35° S) is investigated using contemporaneous data sets for a 13-year period. The goal is to quantify potential relationships between cloud type fractions and precipitation events of particular strength. Particular attention is paid to whether the relationships exhibit different characteristics over tropical land and ocean. A primary analysis metric is the correlation coefficient between fractions of individual cloud types and frequencies within precipitation histogram bins that have been matched in time and space. The cloud type fractions are derived from Moderate Resolution Imaging Spectroradiometer (MODIS) joint histograms of cloud top pressure and cloud optical thickness in 1° grid cells, and the precipitation frequencies come from the Tropical Rainfall Measuring Mission (TRMM) Multi-satellite Precipitation Analysis (TMPA) data set aggregated to the same grid.
It is found that the strongest coupling (positive correlation) between clouds and precipitation occurs over ocean for cumulonimbus clouds and the heaviest rainfall. While the same cloud type and rainfall bin are also best correlated over land compared to other combinations, the correlation magnitude is weaker than over ocean. The difference is attributed to the greater size of convective systems over ocean. It is also found that, over both ocean and land, the anti-correlation of strong precipitation with weak (i.e., thin and/or low) cloud types is of greater absolute strength than the positive correlations between weak cloud types and weak precipitation. Cloud type co-occurrence relationships explain some of the cloud-precipitation anti-correlations. Weak correlations between weaker rainfall and clouds indicate poor predictability for precipitation when cloud types are known, and this is even more the case over land than over ocean.
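The primary analysis metric, a correlation coefficient between a cloud-type fraction and a matched precipitation-bin frequency, reduces to a sample correlation over co-located grid cells. A sketch on synthetic data (the variables and their relationship are invented for illustration only):

```python
import numpy as np

# Hypothetical matched 1-degree grid-cell samples: cumulonimbus cloud
# fraction and the frequency of the heaviest precipitation bin.
rng = np.random.default_rng(1)
cb_fraction = rng.uniform(0, 0.5, 200)
heavy_precip = 0.8 * cb_fraction + rng.normal(0, 0.05, 200)

# Correlation coefficient between the cloud-type fraction and the
# precipitation-bin frequency, the study's analysis metric:
r = np.corrcoef(cb_fraction, heavy_precip)[0, 1]
```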
Kafayat, A Fakoya; Martins, A Anetekhai; Shehu, L Akintola; Abdulwakil, O Sabal; Abass, Mikhail A
2015-03-01
The Gorean snapper, Lutjanus goreensis, is an important component of artisanal fisheries and trawl landings in the Gulf of Guinea. Despite its economic importance, there is a dearth of information on the size structure and life history strategies of the species. Therefore, the objectives of this study were to provide baseline data on the life stages, exploitation status and habitat use of the species in Nigeria. Monthly samples were obtained from artisanal and trawl catches in Five Cowrie Creek and Lagos coastal waters between December 2008 and December 2010. Length-frequency distributions of the fishes caught were analysed to provide preliminary information on mean and modal lengths at capture and life-history strategies based on habitat use and estuarine dependency for L. goreensis. A total of 822 specimens of L. goreensis were collected from Five Cowrie Creek while 377 specimens were collected from Lagos coastal waters. Total length varied from 7.90 to 34.90 cm for creek samples and from 21.90 to 56.10 cm for marine samples. Length-frequency histograms showed polymodal size distributions in creek and marine samples. Length-frequency distributions of L. goreensis showed a high abundance of juveniles (<20 cm) and sub-adults (20-35 cm), which accounted for 84.1% and 68.4% of the creek and marine samples examined, respectively. For the creek samples, fish in the modal length class of 13.00-13.99 cm were the most exploited, while in the marine samples, the length classes of 29.00-30.99 cm and 31.00-32.99 cm constituted the most frequently exploited fishes. The increase in total length from the creek (mean ± SD: 16.19 ± 3.73 cm) to the marine habitat samples (32.89 ± 6.14 cm) indicated an ontogenetic shift in habitat use. The occurrence of a predominantly juvenile population of L. goreensis in Five Cowrie Creek suggests estuarine dependency and is indicative of a temporary juvenile habitat or a migratory corridor.
In conclusion, data from the present study and previous studies demonstrate that juvenile L. goreensis displays estuarine dependency and habitat flexibility. This underscores the importance of preserving estuarine environments as essential fish habitats to prevent overfishing. The study also concludes that the species is vulnerable to recruitment overfishing in the marine environment, especially as a consequence of shrimping. Consequently, it advocates a ban on all fishing activities during peak spawning periods in breeding grounds and on shrimping grounds.
Wu, Chen-Jiang; Wang, Qing; Li, Hai; Wang, Xiao-Ning; Liu, Xi-Sheng; Shi, Hai-Bin; Zhang, Yu-Dong
2015-10-01
To investigate the diagnostic efficiency of DWI using entire-tumor histogram analysis for differentiating low-grade (LG) prostate cancer (PCa) from intermediate-to-high-grade (HG) PCa, in comparison with conventional ROI-based measurement. DW images (b of 0-1400 s/mm²) from 126 pathology-confirmed PCa (diameter >0.5 cm) in 110 patients were retrospectively collected and processed with a mono-exponential model. Tumor apparent diffusion coefficients (ADCs) were measured using histogram-based and ROI-based approaches, respectively. The ability of the ADCs from the two methods to differentiate LG-PCa (Gleason score, GS ≤ 6) from HG-PCa (GS > 6) was determined by ROC regression and compared by McNemar's test. There were 49 LG tumors and 77 HG tumors at pathologic examination. Histogram-based ADCs (mean, median, 10th and 90th percentiles) and ROI-based mean ADCs showed significant negative correlations with the ordinal GS of PCa (ρ = -0.225 to -0.406, p < 0.05). All of the above imaging indices differed significantly between LG-PCa and HG-PCa (all p values <0.01). The histogram 10th-percentile ADC had the highest Az (0.738), Youden index (0.415), and positive likelihood ratio (LR+, 2.45) for stratifying tumor GS, compared with the mean, median and 90th-percentile histogram ADCs and the ROI-based ADCs. Histogram mean, median, and 10th-percentile ADCs showed higher specificity (65.3%-74.1% vs. 44.9%, p < 0.01), but lower sensitivity (57.1%-71.3% vs. 84.4%, p < 0.05), than ROI-based ADCs in differentiating LG-PCa from HG-PCa. DWI-associated histogram analysis had higher specificity, Az, Youden index, and LR+ for differentiation of PCa Gleason grade than the ROI-based approach.
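The Youden-index comparison of candidate histogram metrics can be sketched as a cutoff scan. The convention that low ADC indicates high grade follows the abstract, while the cutoff search, values, and labels below are illustrative:

```python
import numpy as np

def youden_cutoff(values, labels):
    """Scan candidate cutoffs and return the maximal Youden index
    (sensitivity + specificity - 1) with its cutoff, assuming low
    values indicate the positive (high-grade) class."""
    best = (-2.0, None)
    for c in np.unique(values):
        pred = values <= c                       # low ADC -> positive call
        tp = np.sum(pred & (labels == 1))
        tn = np.sum(~pred & (labels == 0))
        sens = tp / np.sum(labels == 1)
        spec = tn / np.sum(labels == 0)
        best = max(best, (sens + spec - 1, c))
    return best

# Hypothetical 10th-percentile ADC per tumour (units of 10^-3 mm^2/s)
adc10 = np.array([0.6, 0.7, 0.65, 0.8, 1.1, 1.2, 1.05, 1.15])
grade = np.array([1, 1, 1, 1, 0, 0, 0, 0])       # 1 = high grade
j, cut = youden_cutoff(adc10, grade)
```

The same scan applied to each histogram metric (mean, median, 10th, 90th percentile) would reproduce the kind of metric-by-metric comparison the abstract reports.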
Choi, M H; Oh, S N; Park, G E; Yeo, D-M; Jung, S E
2018-05-10
To evaluate the interobserver and intermethod correlations of histogram metrics of dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) parameters acquired by multiple readers using the single-section and whole-tumor volume methods. Four DCE parameters (Ktrans, Kep, Ve, Vp) were evaluated in 45 patients (31 men and 14 women; mean age, 61±11 years [range, 29-83 years]) with locally advanced rectal cancer using pre-chemoradiotherapy (CRT) MRI. Ten histogram metrics were extracted using two methods of lesion selection performed by three radiologists: the whole-tumor volume method, covering the whole tumor on axial section-by-section images, and the single-section method, covering the entire area of the tumor on one axial image. The interobserver and intermethod correlations were evaluated using intraclass correlation coefficients (ICCs). The ICCs showed excellent interobserver and intermethod correlations for most of the histogram metrics of the DCE parameters. The ICCs among the three readers were > 0.7 (P<0.001) for all histogram metrics except the minimum and maximum. The intermethod correlations for most of the histogram metrics were excellent for each radiologist, regardless of differences in the radiologists' experience. The interobserver and intermethod correlations for most of the histogram metrics of the DCE parameters are excellent in rectal cancer. Therefore, the single-section method may be a potential alternative to the whole-tumor volume method using pre-CRT MRI, although the high agreement between the two methods cannot be extrapolated to post-CRT MRI. Copyright © 2018 Société française de radiologie. Published by Elsevier Masson SAS. All rights reserved.
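Interobserver agreement via the intraclass correlation coefficient can be sketched with a two-way random-effects, single-measure ICC(2,1) in the Shrout and Fleiss convention; whether the study used this exact ICC form is an assumption, and the reader scores below are invented:

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single
    measure (Shrout & Fleiss). Rows = subjects, columns = readers."""
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand = x.mean()
    row = x.mean(axis=1, keepdims=True)          # per-subject means
    col = x.mean(axis=0, keepdims=True)          # per-reader means
    msr = k * np.sum((row - grand) ** 2) / (n - 1)        # subjects MS
    msc = n * np.sum((col - grand) ** 2) / (k - 1)        # readers MS
    mse = np.sum((x - row - col + grand) ** 2) / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Three readers (columns) measuring the same histogram metric on five
# lesions (rows); near-identical readings should give an ICC close to 1.
scores = np.array([[0.21, 0.22, 0.20],
                   [0.35, 0.36, 0.35],
                   [0.50, 0.48, 0.51],
                   [0.18, 0.19, 0.18],
                   [0.42, 0.41, 0.43]])
icc = icc_2_1(scores)
```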
van Heeswijk, Miriam M; Lambregts, Doenja M J; Maas, Monique; Lahaye, Max J; Ayas, Z; Slenter, Jos M G M; Beets, Geerard L; Bakers, Frans C H; Beets-Tan, Regina G H
2017-06-01
The apparent diffusion coefficient (ADC) is a potential prognostic imaging marker in rectal cancer. Typically, mean ADC values are used, derived from precise manual whole-volume tumor delineations by experts. The aim was, first, to explore whether non-precise circular delineation combined with histogram analysis can be a less cumbersome alternative that yields similar ADC measurements and, second, to explore whether histogram analyses provide additional prognostic information. Thirty-seven patients who underwent a primary staging MRI including diffusion-weighted imaging (DWI; b0, 25, 50, 100, 500, 1000; 1.5 T) were included. Volumes-of-interest (VOIs) were drawn on b1000-DWI: (a) precise delineation, manually tracing tumor boundaries (2 expert readers), and (b) non-precise delineation, drawing circular VOIs with a wide margin around the tumor (2 non-experts). Mean ADC and histogram metrics (mean, min, max, median, SD, skewness, kurtosis, 5th-95th percentiles) were derived from the VOIs, and delineation time was recorded. Measurements were compared between the two methods and correlated with prognostic outcome parameters. Median delineation time was reduced from 47-165 s (precise) to 21-43 s (non-precise). The 45th percentile of the non-precise delineation showed the best correlation with the mean ADC from the precise delineation as the reference standard (ICC 0.71-0.75). None of the mean ADC or histogram parameters showed significant prognostic value; only the total tumor volume (VOI) was significantly larger in patients with positive clinical N stage and mesorectal fascia involvement. When performing non-precise tumor delineation, histogram analysis (specifically the 45th ADC percentile) may be used as an alternative to obtain ADC values similar to those from precise whole-tumor delineation. Histogram analyses are not beneficial for obtaining additional prognostic information.
Zhang, Yu-Dong; Wang, Qing; Wu, Chen-Jiang; Wang, Xiao-Ning; Zhang, Jing; Liu, Hui; Liu, Xi-Sheng; Shi, Hai-Bin
2015-04-01
To evaluate histogram analysis of intravoxel incoherent motion (IVIM) for discriminating the Gleason grade of prostate cancer (PCa). A total of 48 patients pathologically confirmed as having clinically significant PCa (size > 0.5 cm) underwent preoperative DW-MRI (b of 0-900 s/mm²). Data were post-processed with monoexponential and IVIM models for quantitation of apparent diffusion coefficients (ADCs), perfusion fraction f, diffusivity D and pseudo-diffusivity D*. Histogram analysis was performed by outlining entire-tumour regions of interest (ROIs) based on histological-radiological correlation. The ability of the imaging indices to differentiate low-grade (LG, Gleason score (GS) ≤ 6) from intermediate/high-grade (HG, GS > 6) PCa was analysed by ROC regression. Eleven patients had LG tumours (18 foci) and 37 patients had HG tumours (42 foci) on pathology examination. HG tumours had significantly lower ADCs and D in terms of mean, median, 10th and 75th percentiles, combined with higher histogram kurtosis and skewness for ADCs, D and f, than LG PCa (p < 0.05). Histogram D showed relatively higher correlations (ρ = 0.641-0.668 vs. ADCs: 0.544-0.574) with the ordinal GS of PCa, and its mean, median and 10th percentile performed better than the corresponding ADCs in distinguishing LG from HG PCa. It is feasible to stratify the pathological grade of PCa by IVIM with histogram metrics. D performed better in distinguishing LG from HG tumours than conventional ADCs. • GS had a relatively higher correlation with tumour D than with ADCs. • Differences in histogram D between the two tumour grades were statistically significant. • D yielded better individual features in demonstrating tumour grade than ADC. • D* and f failed to determine the tumour grade of PCa.
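IVIM quantitation separates perfusion from tissue diffusion in the bi-exponential signal model S(b)/S0 = f·e^(-b·D*) + (1-f)·e^(-b·D). A common segmented fit (not necessarily the authors' fitting procedure) estimates D from the high-b portion first, where the pseudo-diffusion term has decayed, then recovers f from the extrapolated intercept:

```python
import numpy as np

def ivim_segmented_fit(b, s, b_thresh=200):
    """Segmented IVIM fit: estimate tissue diffusivity D from the
    high-b signal (log-linear fit), then the perfusion fraction f
    from the extrapolated zero-b intercept."""
    b = np.asarray(b, dtype=float)
    s = np.asarray(s, dtype=float)
    hi = b >= b_thresh
    slope, intercept = np.polyfit(b[hi], np.log(s[hi]), 1)
    d = -slope                                    # tissue diffusivity
    f = 1.0 - np.exp(intercept) / s[b.argmin()]   # perfusion fraction
    return d, f

# Synthetic bi-exponential signal: f=0.15, D=1.0e-3, D*=20e-3 mm^2/s
b = np.array([0, 50, 100, 200, 400, 600, 900], dtype=float)
sig = 1000 * (0.15 * np.exp(-b * 20e-3) + 0.85 * np.exp(-b * 1.0e-3))
d_est, f_est = ivim_segmented_fit(b, sig)
```

Histogram metrics of D and f would then be computed voxel-wise over the entire-tumour ROI.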
Serial data acquisition for GEM-2D detector
NASA Astrophysics Data System (ADS)
Kolasinski, Piotr; Pozniak, Krzysztof T.; Czarski, Tomasz; Linczuk, Maciej; Byszuk, Adrian; Chernyshova, Maryna; Juszczyk, Bartlomiej; Kasprowicz, Grzegorz; Wojenski, Andrzej; Zabolotny, Wojciech; Zienkiewicz, Pawel; Mazon, Didier; Malard, Philippe; Herrmann, Albrecht; Vezinet, Didier
2014-11-01
This article discusses a fast data acquisition and histogramming method for the X-ray GEM detector. The whole histogramming process is performed by FPGA chips (Spartan-6 series from Xilinx). The results of the histogramming process are stored in internal FPGA memory and then sent to a PC, where the data are merged and processed in MATLAB. The structure of the firmware functionality implemented in the FPGAs is described, and examples of test measurements and results are presented.
Choi, Sang Hyun; Lee, Jeong Hyun; Choi, Young Jun; Park, Ji Eun; Sung, Yu Sub; Kim, Namkug; Baek, Jung Hwan
2017-01-01
This study aimed to explore the added value of histogram analysis of the ratio of initial to final 90-second time-signal intensity AUC (AUCR) for differentiating local tumor recurrence from contrast-enhancing scar on follow-up dynamic contrast-enhanced T1-weighted perfusion MRI of patients treated for head and neck squamous cell carcinoma (HNSCC). AUCR histogram parameters were assessed among tumor recurrence (n = 19) and contrast-enhancing scar (n = 27) at primary sites and compared using the t test. ROC analysis was used to determine the best differentiating parameters. The added value of AUCR histogram parameters was assessed when they were added to inconclusive conventional MRI results. Histogram analysis showed statistically significant differences in the 50th, 75th, and 90th percentiles of the AUCR values between the two groups (p < 0.05). The 90th percentile of the AUCR values (AUCR90) was the best predictor of local tumor recurrence (AUC, 0.77; 95% CI, 0.64-0.91) with an estimated cutoff of 1.02. AUCR90 increased sensitivity by 11.7% over that of conventional MRI alone when added to inconclusive results. Histogram analysis of AUCR can improve the diagnostic yield for local tumor recurrence during surveillance after treatment for HNSCC.
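The AUCR parameter is a ratio of areas under the time-signal intensity curve. A sketch with invented window placement and an invented curve, since the abstract does not spell out the exact integration limits:

```python
import numpy as np

def _auc(t, s):
    """Trapezoidal area under a sampled curve."""
    return np.sum(0.5 * (s[1:] + s[:-1]) * np.diff(t))

def aucr(t, s, window=90.0):
    """Ratio of the AUC over the first `window` seconds to the AUC over
    the last `window` seconds (window placement is illustrative)."""
    t = np.asarray(t, dtype=float)
    s = np.asarray(s, dtype=float)
    early = t <= t[0] + window
    late = t >= t[-1] - window
    return _auc(t[early], s[early]) / _auc(t[late], s[late])

# Hypothetical washout-type curve: strong early uptake, later decay,
# giving an AUCR above 1 (the kind of behaviour the 1.02 cutoff probes).
t = np.linspace(0, 300, 61)            # 5-minute acquisition, 5 s steps
s = 100 * np.exp(-t / 120) + 20        # invented signal, arbitrary units
r = aucr(t, s)
```

Histogram parameters (50th, 75th, 90th percentiles) would then be taken over the per-voxel AUCR values within the lesion ROI.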
Value of MR histogram analyses for prediction of microvascular invasion of hepatocellular carcinoma
Huang, Ya-Qin; Liang, He-Yue; Yang, Zhao-Xia; Ding, Ying; Zeng, Meng-Su; Rao, Sheng-Xiang
2016-01-01
Abstract The objective is to explore the value of preoperative magnetic resonance (MR) histogram analyses in predicting microvascular invasion (MVI) of hepatocellular carcinoma (HCC). Fifty-one patients with histologically confirmed HCC who underwent diffusion-weighted and contrast-enhanced MR imaging were included. Histogram analyses were performed and the mean, variance, skewness, kurtosis, and 1st, 10th, 50th, 90th, and 99th percentiles were derived. Quantitative histogram parameters were compared between HCCs with and without MVI. Receiver operating characteristic (ROC) analyses were generated to compare the diagnostic performance of tumor size, histogram analyses of apparent diffusion coefficient (ADC) maps, and MR enhancement. The mean, 1st, 10th, and 50th percentiles of the ADC maps, and the mean, variance, and 1st, 10th, 50th, 90th, and 99th percentiles of the portal venous phase (PVP) images were significantly different between the groups with and without MVI (P <0.05), with areas under the ROC curve (AUCs) of 0.66 to 0.74 for ADC and 0.76 to 0.88 for PVP. The largest AUC, for the 1st percentile of PVP, showed significantly higher accuracy than that of the arterial phase (AP) or tumor size (P <0.001). MR histogram analyses, in particular the 1st percentile of PVP images, held promise for prediction of MVI in HCC. PMID:27368028
Effect of respiratory and cardiac gating on the major diffusion-imaging metrics
Hamaguchi, Hiroyuki; Sugimori, Hiroyuki; Nakanishi, Mitsuhiro; Nakagawa, Shin; Fujiwara, Taro; Yoshida, Hirokazu; Takamori, Sayaka; Shirato, Hiroki
2016-01-01
The effect of respiratory gating on the major diffusion-imaging metrics, and that of cardiac gating on mean kurtosis (MK), are not known. To evaluate whether the major diffusion-imaging metrics (MK, fractional anisotropy (FA), and mean diffusivity (MD) of the brain) vary between gated and non-gated acquisitions, respiratory-gated, cardiac-gated, and non-gated diffusion imaging of the brain was performed in 10 healthy volunteers. MK, FA, and MD maps were constructed for all acquisitions, and histograms were derived from each map. The normalized peak height and location of the histograms were compared among the acquisitions by use of Friedman and post hoc Wilcoxon tests. The effect of the repetition time (TR) on the diffusion-imaging metrics was also tested, and we corrected for its variation among acquisitions where necessary. The results showed a shift in the peak location of the MK and MD histograms to the right with an increase in TR (p ≤ 0.01). The corrected peak location of the MK histograms, the normalized peak height of the FA histograms, and the normalized peak height and corrected peak location of the MD histograms varied significantly between the gated and non-gated acquisitions (p < 0.05). These results imply an influence of respiration and cardiac pulsation on the major diffusion-imaging metrics. The gating conditions must be kept identical if reproducible results are to be achieved. PMID:27073115
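The two histogram summaries compared here, normalized peak height and peak location, can be computed directly from a metric map; the bin settings and synthetic MD values below are arbitrary choices, not the study's:

```python
import numpy as np

def peak_metrics(values, bins=50, range_=(0, 3)):
    """Normalised peak height and peak location of a metric histogram,
    as used when comparing gated and non-gated acquisitions."""
    counts, edges = np.histogram(values, bins=bins, range=range_)
    frac = counts / counts.sum()                 # normalise by voxel count
    peak = np.argmax(frac)
    location = 0.5 * (edges[peak] + edges[peak + 1])   # bin centre
    return frac[peak], location

# Hypothetical brain MD values (um^2/ms) clustered near 0.8
rng = np.random.default_rng(2)
md = rng.normal(0.8, 0.1, 5000)
height, loc = peak_metrics(md)
```

A rightward peak shift with longer TR would appear as an increase in `loc` between acquisitions.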
Encoding frequency contrast in primate auditory cortex
Scott, Brian H.; Semple, Malcolm N.
2014-01-01
Changes in amplitude and frequency jointly determine much of the communicative significance of complex acoustic signals, including human speech. We have previously described responses of neurons in the core auditory cortex of awake rhesus macaques to sinusoidal amplitude modulation (SAM) signals. Here we report a complementary study of sinusoidal frequency modulation (SFM) in the same neurons. Responses to SFM were analogous to SAM responses in that changes in multiple parameters defining SFM stimuli (e.g., modulation frequency, modulation depth, carrier frequency) were robustly encoded in the temporal dynamics of the spike trains. For example, changes in the carrier frequency produced highly reproducible changes in shapes of the modulation period histogram, consistent with the notion that the instantaneous probability of discharge mirrors the moment-by-moment spectrum at low modulation rates. The upper limit for phase locking was similar across SAM and SFM within neurons, suggesting shared biophysical constraints on temporal processing. Using spike train classification methods, we found that neural thresholds for modulation depth discrimination are typically far lower than would be predicted from frequency tuning to static tones. This “dynamic hyperacuity” suggests a substantial central enhancement of the neural representation of frequency changes relative to the auditory periphery. Spike timing information was superior to average rate information when discriminating among SFM signals, and even when discriminating among static tones varying in frequency. This finding held even when differences in total spike count across stimuli were normalized, indicating both the primacy and generality of temporal response dynamics in cortical auditory processing. PMID:24598525
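A modulation period histogram folds spike times on the modulation cycle, so that the instantaneous discharge probability can be read as a function of stimulus phase. A sketch with synthetic phase-locked spikes (all parameter values are illustrative):

```python
import numpy as np

def period_histogram(spike_times, mod_freq, n_bins=32):
    """Fold spike times (s) on the modulation period and histogram the
    resulting phases: the modulation period histogram."""
    phase = (np.asarray(spike_times, dtype=float) * mod_freq) % 1.0
    counts, _ = np.histogram(phase, bins=n_bins, range=(0.0, 1.0))
    return counts

# 100 spikes phase-locked near phase 0.25 of a 10-Hz modulation,
# with small timing jitter
rng = np.random.default_rng(3)
spikes = np.arange(100) * 0.1 + 0.025 + rng.normal(0, 0.002, 100)
counts = period_histogram(spikes, 10.0)
```

Changes in carrier frequency that reshape this histogram, as the abstract describes, would show up as a shift or broadening of the phase peak.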
Snow grain size and shape distributions in northern Canada
NASA Astrophysics Data System (ADS)
Langlois, A.; Royer, A.; Montpetit, B.; Roy, A.
2016-12-01
Pioneer snow work in the 1970s and 1980s proposed new approaches to retrieve snow depth and water equivalent from space using passive microwave brightness temperatures. Numerous studies have since led to the realization that microwave approaches depend strongly on snow grain morphology (size and shape), which until recently was poorly parameterized, leading to strong biases in the retrieval calculations. The related uncertainties in space-based retrievals and the development of complex thermodynamic multilayer snow and emission models have motivated work on new approaches to quantifying snow grain metrics, given the lack of field measurements arising from the sampling constraints of such variables. This presentation focuses on the poorly known size distribution of snow grains. Our group developed a new approach to the 'traditional' measurement of snow grain metrics in which micro-photographs of snow grains are taken under angular directional LED lighting. The projected shadows are digitized so that a 3D reconstruction of the snow grains is possible. This device has been used in several field campaigns, and over the years a very large dataset was collected, which is presented in this paper. A total of 588 snow photographs were obtained from 107 snowpits during the European Space Agency (ESA) Cold Regions Hydrology high-resolution Observatory (CoReH2O) mission concept field campaign in Churchill, Manitoba, Canada (January - April 2010). Each of the 588 photographs was classified as depth hoar, rounded, facets, or precipitation particles. A total of 162,516 snow grains were digitized across the 588 photographs, averaging 263 grains/photo. Results include distribution histograms for 5 'size' metrics (projected area, perimeter, equivalent optical diameter, minimum axis and maximum axis) and 2 'shape' metrics (eccentricity, major/minor axis ratio).
Different cumulative histograms are found for the different grain types, and fits based on the kernel distribution function are proposed. Finally, a comparison with the Specific Surface Area (SSA) derived from reflectance values using the Infrared Integrating Sphere (IRIS) highlights different power-law fits for the 5 'size' metrics.
Yue, Jianting; Mauxion, Thibault; Reyes, Diane K.; Lodge, Martin A.; Hobbs, Robert F.; Rong, Xing; Dong, Yinfeng; Herman, Joseph M.; Wahl, Richard L.; Geschwind, Jean-François H.; Frey, Eric C.
2016-01-01
Purpose: Radioembolization with yttrium-90 microspheres may be optimized with patient-specific pretherapy treatment planning. Dose verification and validation of treatment planning methods require quantitative imaging of the post-therapy distribution of yttrium-90 (Y-90). Methods for quantitative imaging of Y-90 using both bremsstrahlung SPECT and PET have previously been described. The purpose of this study was to compare the two modalities quantitatively in humans. Methods: Calibration correction factors for both quantitative Y-90 bremsstrahlung SPECT and a non-time-of-flight PET system without compensation for prompt coincidences were developed by imaging three phantoms. The consistency of these calibration correction factors for the different phantoms was evaluated. Post-therapy images from both modalities were obtained from 15 patients with hepatocellular carcinoma who underwent hepatic radioembolization using Y-90 glass microspheres. Quantitative SPECT and PET images were rigidly registered and the total liver activities and activity distributions estimated for each modality were compared. The activity distributions were compared using profiles, voxel-by-voxel correlation and Bland–Altman analyses, and activity-volume histograms. Results: The mean ± standard deviation of difference in the total activity in the liver between the two modalities was 0% ± 9% (range −21%–18%). Voxel-by-voxel comparisons showed a good agreement in regions corresponding roughly to treated tumor and treated normal liver; the agreement was poorer in regions with low or no expected activity, where PET appeared to overestimate the activity. The correlation coefficients between intrahepatic voxel pairs for the two modalities ranged from 0.86 to 0.94. Cumulative activity volume histograms were in good agreement. 
Conclusions: These data indicate that, with appropriate reconstruction methods and measured calibration correction factors, either Y-90 SPECT/CT or Y-90 PET/CT can be used for quantitative post-therapy monitoring of Y-90 activity distribution following hepatic radioembolization. PMID:27782730
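The voxel-by-voxel agreement analysis described above can be illustrated with a minimal sketch; the activity values below are synthetic stand-ins for registered SPECT and PET voxel data, and `bland_altman` is a hypothetical helper, not the authors' code.

```python
import numpy as np

def bland_altman(a, b):
    """Bias and 95% limits of agreement between paired measurements."""
    diff = np.asarray(a, float) - np.asarray(b, float)
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    return bias, bias - half_width, bias + half_width

# Hypothetical registered SPECT and PET voxel activities over a liver mask
rng = np.random.default_rng(0)
spect = rng.uniform(0.5, 2.0, size=1000)
pet = spect + rng.normal(0.0, 0.05, size=1000)

bias, lo, hi = bland_altman(spect, pet)
r = np.corrcoef(spect, pet)[0, 1]
print(f"bias = {bias:.3f}, LoA = ({lo:.3f}, {hi:.3f}), r = {r:.2f}")
```

The correlation coefficient here is the same statistic as the per-patient intrahepatic voxel-pair correlations reported above (0.86-0.94).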
DOE Office of Scientific and Technical Information (OSTI.GOV)
Giantsoudi, D; MacDonald, S; Paganetti, H
2014-06-01
Purpose: To compare the linear energy transfer (LET) distributions between passive scattering and pencil beam scanning proton radiation therapy techniques for medulloblastoma patients and study the potential radiobiological implications. Methods: A group of medulloblastoma patients, previously treated with passive scattering (PS) proton craniospinal irradiation followed by a posterior fossa or involved-field boost, were selected from the patient database of our institution. Using the beam geometry and planning computed tomography (CT) image sets of the original treatment plans, pencil beam scanning (PBS) treatment plans were generated for the cranial treatment of each patient, with an average beam spot size of 8 mm (sigma in air at isocenter). 3-dimensional dose and LET distributions were calculated by Monte Carlo methods (TOPAS) both for the original passive scattering and the new pencil beam scanning treatment plans. LET volume histograms were calculated for the target and OARs and compared for the two delivery methods. Variable RBE-weighted dose distributions and volume histograms were also calculated using a variable dose- and LET-based model. Results: Better dose conformity was achieved with PBS planning compared to PS, leading to increased dose coverage for the boost target area and decreased average dose to the structures adjacent to it and to critical structures outside the whole-brain treatment field. LET values for the target were lower for PBS plans. Elevated LET values were noticed for OARs close to the boosted target areas, due to the end of range of proton beams falling inside these structures, resulting in a higher RBE-weighted dose for these structures compared to the clinical RBE value of 1.1. Conclusion: Transitioning from passive scattering to pencil beam scanning proton radiation treatment can be dosimetrically beneficial for medulloblastoma patients.
LET-guided treatment planning could contribute to better decision making for these cases, especially for critical structures in close proximity to the boosted target area.
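An LET-volume histogram of the kind compared above is, like a cumulative DVH, simply the fraction of a structure's volume at or above each LET threshold. A minimal sketch with hypothetical voxel LET values:

```python
import numpy as np

def let_volume_histogram(let_values, thresholds):
    """Cumulative LET-volume histogram: fraction of the structure's
    volume with LET >= each threshold (analogous to a cumulative DVH)."""
    let_values = np.asarray(let_values, float)
    return np.array([(let_values >= t).mean() for t in thresholds])

# Hypothetical voxel LET values (keV/um) in an OAR near the target edge:
# 80% of voxels at low LET, 20% elevated by end-of-range protons
let = np.concatenate([np.full(800, 2.0), np.full(200, 8.0)])
thresholds = np.array([0.0, 1.0, 3.0, 5.0, 10.0])
lvh = let_volume_histogram(let, thresholds)
print(lvh)  # [1.0, 1.0, 0.2, 0.2, 0.0]
```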
Latysheva, Anna; Eeg Emblem, Kyrre; Server, Andrés; Brandal, Petter; Meling, Torstein R; Pahnke, Jens; Hald, John K
2018-06-12
According to the new World Health Organization 2016 classification for tumors of the central nervous system, 1p/19q codeletion defines the genetic hallmark that differentiates oligodendrogliomas from diffuse astrocytomas. The aim of our study was to evaluate whether relative cerebral blood volume (rCBV) and apparent diffusion coefficient (ADC) histogram analysis can stratify survival in adult patients with genetically defined diffuse gliomas of World Health Organization grades II and III. Sixty-seven patients with untreated diffuse gliomas of World Health Organization grades II and III and known 1p/19q codeletion status were included retrospectively and analyzed using ADC and rCBV maps based on whole-tumor volume histograms. Overall survival and progression-free survival (PFS) were analyzed using Kaplan-Meier and Cox survival analyses adjusted for known survival predictors. Significantly longer PFS was associated with a homogeneous rCBV distribution (higher rCBVpeak; median, 37 vs 26 months; hazard ratio [HR], 3.2; P = 0.02) in patients with astrocytomas, and with a heterogeneous rCBV distribution (lower rCBVpeak; median, 46 vs 37 months; HR, 5.3; P < 0.001) and higher rCBVmean (median, 44 vs 39 months; HR, 7.9; P = 0.003) in patients with oligodendrogliomas. Apparent diffusion coefficient parameters (ADCpeak, ADCmean) did not stratify PFS or overall survival. Tumors with heterogeneous perfusion signatures and high average values were associated with longer PFS in patients with oligodendrogliomas. In contrast, a heterogeneous perfusion distribution was associated with poor outcome in patients with diffuse astrocytomas.
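The whole-tumor histogram metrics above (rCBVpeak as the normalized peak height, rCBVmean as the mean) can be sketched as follows; the values are synthetic stand-ins for voxel rCBV maps, and a narrower (more homogeneous) distribution yields the higher peak:

```python
import numpy as np

def histogram_peak_and_mean(values, bins=50):
    """Normalized-histogram peak height (lower = more heterogeneous),
    the value at that peak, and the mean over all tumor voxels."""
    hist, edges = np.histogram(values, bins=bins, density=True)
    i = hist.argmax()
    return float(hist[i]), float(0.5 * (edges[i] + edges[i + 1])), float(np.mean(values))

rng = np.random.default_rng(1)
homogeneous = rng.normal(2.0, 0.2, 5000)    # narrow rCBV distribution
heterogeneous = rng.normal(2.0, 1.0, 5000)  # broad rCBV distribution
peak_homog, _, _ = histogram_peak_and_mean(homogeneous)
peak_heterog, _, _ = histogram_peak_and_mean(heterogeneous)
print(peak_homog > peak_heterog)  # True: narrow distribution, higher peak
```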
NASA Astrophysics Data System (ADS)
Burri, Samuel; Powolny, François; Bruschini, Claudio E.; Michalet, Xavier; Regazzoni, Francesco; Charbon, Edoardo
2014-05-01
This paper presents our work on a 65k pixel single-photon avalanche diode (SPAD) based imaging sensor realized in a 0.35 μm standard CMOS process. At a resolution of 512 by 128 pixels, the sensor is read out in 6.4 μs to deliver over 150k monochrome frames per second. The individual pixel has a size of 24 μm² and contains the SPAD with a 12T quenching and gating circuit along with a memory element. The gating signals are distributed across the chip through a balanced tree to minimize the signal skew between pixels. The array of pixels is row-addressable, and data are sent out of the chip on 128 lines in parallel at a frequency of 80 MHz. The system is controlled by an FPGA which generates the gating and readout signals and can be used for arbitrary real-time computation on the frames from the sensor. The communication protocol between the camera and a conventional PC is USB2. The active area of the chip is 5% and can be significantly improved with the application of a micro-lens array. A micro-lens array, for use with collimated light, has been designed and its performance is reviewed in the paper. Among other high-speed applications, the gating circuitry, capable of generating illumination periods shorter than 5 ns, can be used for fluorescence lifetime imaging (FLIM). In order to measure the lifetime of fluorophores excited by a picosecond laser, the sensor's illumination period is synchronized with the excitation laser pulses. A histogram of the photon arrival times relative to the excitation is then constructed by counting the photons arriving during the sensitive time for several positions of the illumination window. The histogram for each pixel is afterwards transferred to a computer, where software routines extract the lifetime at each location with an accuracy better than 100 ps. We show results for fluorescence lifetime measurements using different fluorophores with lifetimes ranging from 150 ps to 5 ns.
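The lifetime-extraction step can be sketched as a mono-exponential fit to the arrival-time histogram; the log-linear fit below is an assumed analysis standing in for the system's actual software routines, and the photon times are simulated.

```python
import numpy as np

def fit_lifetime(arrival_times_ns, bins=50, window_ns=10.0):
    """Mono-exponential lifetime from a photon arrival-time histogram
    via a log-linear least-squares fit (an assumed analysis)."""
    hist, edges = np.histogram(arrival_times_ns, bins=bins, range=(0.0, window_ns))
    centers = 0.5 * (edges[:-1] + edges[1:])
    keep = hist > 0                      # avoid log(0) in empty bins
    slope, _ = np.polyfit(centers[keep], np.log(hist[keep]), 1)
    return -1.0 / slope                  # tau in ns

# Simulated photon arrivals from a fluorophore with a 2 ns lifetime
rng = np.random.default_rng(2)
tau = fit_lifetime(rng.exponential(2.0, size=200_000))
print(f"estimated lifetime = {tau:.2f} ns")
```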
Infrared face recognition based on LBP histogram and KW feature selection
NASA Astrophysics Data System (ADS)
Xie, Zhihua
2014-07-01
The conventional feature representation based on the local binary pattern (LBP) histogram still has room for performance improvement. This paper focuses on the dimension reduction of LBP micro-patterns and proposes an improved infrared face recognition method based on the LBP histogram representation. To extract local robust features from infrared face images, LBP is chosen to obtain the composition of micro-patterns in sub-blocks. Based on statistical test theory, a Kruskal-Wallis (KW) feature selection method is proposed to select the LBP patterns that are suitable for infrared face recognition. The experimental results show that the combination of LBP and KW feature selection improves the performance of infrared face recognition; the proposed method outperforms traditional methods based on the LBP histogram, the discrete cosine transform (DCT), or principal component analysis (PCA).
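A sketch of the pipeline under stated assumptions: basic 8-neighbour LBP codes (the paper's exact LBP variant and sub-block layout are not reproduced here), with Kruskal-Wallis H statistics ranking histogram bins by their discriminative power across classes.

```python
import numpy as np
from scipy.stats import kruskal

def lbp_histogram(img):
    """Normalized 256-bin histogram of basic 8-neighbour LBP codes."""
    img = img.astype(np.int16)
    c = img[1:-1, 1:-1]
    codes = np.zeros(c.shape, dtype=np.uint8)
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    h, w = img.shape
    for bit, (dy, dx) in enumerate(shifts):
        n = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        codes |= ((n >= c).astype(np.uint8) << bit)
    hist = np.bincount(codes.ravel(), minlength=256).astype(float)
    return hist / hist.sum()

def kw_select(hists_by_class, keep=16):
    """Rank the 256 LBP bins by the Kruskal-Wallis H statistic across
    classes and keep the most discriminative ones."""
    stats = np.zeros(256)
    for b in range(256):
        groups = [np.asarray(h)[:, b] for h in hists_by_class]
        try:
            stats[b] = kruskal(*groups).statistic
        except ValueError:  # all values identical in this bin
            stats[b] = 0.0
    return np.argsort(stats)[::-1][:keep]

# Two toy "classes" of images (noise vs ramp texture), 8 samples each
rng = np.random.default_rng(3)
noise_faces = [rng.integers(0, 256, (24, 24)) for _ in range(8)]
ramp = np.add.outer(np.arange(24), np.arange(24)) * 4
ramp_faces = [ramp + rng.integers(0, 8, (24, 24)) for _ in range(8)]
hists = [np.array([lbp_histogram(f) for f in cls]) for cls in (noise_faces, ramp_faces)]
selected = kw_select(hists)
print(selected[:4])  # indices of the most class-discriminative LBP bins
```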
Tamone, S.L.; Taggart, S. James; Andrews, A.G.; Mondragon, Jennifer; Nielsen, J.K.
2007-01-01
Whether male Tanner crabs, Chionoecetes bairdi, undergo a terminal molt associated with a change in claw allometry has long been debated. We measured molting hormone levels in captured male C. bairdi to assess the potential for molting. We plotted a frequency histogram of chela height to carapace width ratios and found a bimodal distribution of crabs with a ratio of approximately 0.18 separating the two modes. Male crabs with a ratio less than 0.18 were classified as "small-clawed" (SC) while crabs with a ratio greater than 0.18 were classified as "large-clawed" (LC). Circulating molting hormones between SC and LC crabs were compared. Significantly lower ecdysteroid levels were found in LC crabs, indicating that this morphotype had negligible potential for molting. Circulating ecdysteroids were measured in SC males of different shell conditions (soft, new, old, and very old) and no significant differences were found. This research suggests that the molt to LC morphology is a terminal molt. The results from this study have important implications for fisheries management because sub-legal LC males will not recruit into the fishery and removal of larger males may have long term effects on population size structure.
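The morphotype rule reduces to a threshold on the chela-height to carapace-width ratio at the antimode of the bimodal frequency histogram; a trivial sketch with hypothetical measurements (in mm):

```python
def classify_claw(chela_height_mm, carapace_width_mm, threshold=0.18):
    """Morphotype from the chela-height to carapace-width ratio, split
    at the ~0.18 antimode of the bimodal frequency histogram."""
    ratio = chela_height_mm / carapace_width_mm
    return "LC" if ratio > threshold else "SC"

print(classify_claw(24.0, 110.0))  # ratio 0.218 -> LC (large-clawed)
print(classify_claw(15.0, 100.0))  # ratio 0.150 -> SC (small-clawed)
```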
Air Traffic Sector Configuration Change Frequency
NASA Technical Reports Server (NTRS)
Chatterji, Gano Broto; Drew, Michael
2009-01-01
Several techniques for partitioning airspace have been developed in the literature. This paper examines whether a region of airspace created by such methods can be used with other days of traffic, and how many times a different partition is needed during the day. Both aspects are examined for the Fort Worth Center airspace sectors. A Mixed Integer Linear Programming method is used with actual air traffic data from ten high-volume, low-weather-delay days for creating sectors. Nine solutions were obtained for each two-hour period of the day by partitioning the center airspace into two through 18 sectors in steps of two sectors. Actual track data were played back with the generated partitions for creating histograms of the traffic counts. The best partition for each two-hour period was then identified based on the nine traffic-count distributions. The numbers of sectors in such partitions were analyzed to determine the number of times a different configuration is needed during the day. One to three partitions were selected for the 24-hour period, and traffic data from ten days were played back to test whether the traffic counts stayed below the threshold values associated with these partitions. Results show that these partitions are robust and can be used for longer durations than they were designed for.
NASA Astrophysics Data System (ADS)
Sanderson, Mark I.; Simmons, James A.
2005-11-01
Echolocating big brown bats (Eptesicus fuscus) emit trains of frequency-modulated (FM) biosonar signals whose duration, repetition rate, and sweep structure change systematically during interception of prey. When stimulated with a 2.5-s sequence of 54 FM pulse-echo pairs that mimic sounds received during search, approach, and terminal stages of pursuit, single neurons (N=116) in the bat's inferior colliculus (IC) register the occurrence of a pulse or echo with an average of <1 spike/sound. Individual IC neurons typically respond to only a segment of the search or approach stage of pursuit, with fewer neurons persisting to respond in the terminal stage. Composite peristimulus-time-histogram plots of responses assembled across the whole recorded population of IC neurons depict the delay of echoes and, hence, the existence and distance of the simulated biosonar target, entirely as on-response latencies distributed across time. Correlated changes in pulse duration, repetition rate, and pulse or echo amplitude do modulate the strength of responses (probability of the single spike actually occurring for each sound), but registration of the target itself remains confined exclusively to the latencies of single spikes across cells. Modeling of echo processing in FM biosonar should emphasize spike-time algorithms to explain the content of biosonar images.
Mousavi, Seyed Mortaza; Adamoğlu, Ahmet; Demiralp, Tamer; Shayesteh, Mahrokh G
2014-01-01
Awareness during general anesthesia, because of its serious psychological effects on patients and the legal problems it creates for anesthetists, has been an important challenge during past decades. Monitoring the depth of anesthesia is a fundamental solution to this problem. The induction of anesthesia alters the frequency and mean amplitude of the electroencephalogram (EEG) and its phase couplings. We analyzed EEG changes for phase coupling between the delta and alpha subbands using a new algorithm for measuring the depth of general anesthesia based on the complex wavelet transform (CWT) in patients anesthetized with Propofol. The entropy and histogram of the modulated signals were calculated, taking bispectral index (BIS) values as reference. Comparing the entropies corresponding to different BIS intervals using the Mann-Whitney U test showed that they had different continuous distributions. The results demonstrated that there is phase coupling between 3 and 4 Hz in the delta and 8-9 Hz in the alpha subbands, and that these changes are shown best at channel T7 of the EEG. Moreover, when BIS values increase, the entropy of the modulated signal also increases, and vice versa. In addition, measuring the phase coupling between the delta and alpha subbands of EEG signals through continuous CWT analysis reveals the depth of the anesthesia level. As a result, awareness during anesthesia can be prevented.
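The entropy feature can be sketched as the Shannon entropy of an amplitude histogram computed over a fixed range; the signals below are synthetic, and the common range is an assumption needed to make the two entropies comparable.

```python
import numpy as np

def histogram_entropy(signal, bins=64, value_range=(-4.0, 4.0)):
    """Shannon entropy (bits) of the signal's amplitude histogram over a
    fixed common range; broader, flatter distributions score higher."""
    hist, _ = np.histogram(signal, bins=bins, range=value_range)
    p = hist / hist.sum()
    p = p[p > 0]                       # drop empty bins before the log
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(4)
narrow = rng.normal(0.0, 0.1, 10_000)   # concentrated amplitudes
broad = rng.normal(0.0, 1.0, 10_000)    # spread-out amplitudes
print(histogram_entropy(narrow) < histogram_entropy(broad))  # True
```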
Establishing a link between vehicular PM sources and PM measurements in urban street canyons.
Eisner, Alfred D; Richmond-Bryant, Jennifer; Wiener, Russell W; Hahn, Intaek; Drake-Richman, Zora E; Ellenson, William D
2009-12-01
The Brooklyn Traffic Real-Time Ambient Pollutant Penetration and Environmental Dispersion (B-TRAPPED) study, conducted in Brooklyn, NY, USA, in 2005, was designed with multiple goals in mind, two of which were contaminant source characterization and street canyon transport and dispersion monitoring. In the portion of the study described here, synchronized wind velocity and azimuth as well as particulate matter (PM) concentrations at multiple locations along 33rd Street were used to determine the feasibility of using traffic emissions in a complex urban topography as a sole tracer for studying urban contaminant transport. We demonstrate in this paper that it is possible to link downwind concentrations of contaminants in an urban street canyon to the vehicular traffic cycle using Eigen-frequency analysis. In addition, multivariable circular histograms are used to establish directional frequency maxima for wind velocity and contaminant concentration.
Multispectral histogram normalization contrast enhancement
NASA Technical Reports Server (NTRS)
Soha, J. M.; Schwartz, A. A.
1979-01-01
A multispectral histogram normalization or decorrelation enhancement which achieves effective color composites by removing interband correlation is described. The enhancement procedure employs either linear or nonlinear transformations to equalize principal component variances. An additional rotation to any set of orthogonal coordinates is thus possible, while full histogram utilization is maintained by avoiding the reintroduction of correlation. For the three-dimensional case, the enhancement procedure may be implemented with a lookup table. An application of the enhancement to Landsat multispectral scanning imagery is presented.
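A minimal sketch of a decorrelation enhancement in the spirit described above: rotate the bands to principal components, equalize the component variances, and rotate back (the symmetric whitening form); the three-band image is synthetic, and this is an illustration of the idea, not the authors' implementation.

```python
import numpy as np

def decorrelation_stretch(bands):
    """Whiten the inter-band covariance with the symmetric (ZCA) form:
    rotate to principal components, equalize variances, rotate back."""
    x = bands.reshape(bands.shape[0], -1).astype(float)
    vals, vecs = np.linalg.eigh(np.cov(x))
    whiten = vecs @ np.diag(vals ** -0.5) @ vecs.T
    y = whiten @ (x - x.mean(axis=1, keepdims=True))
    return y.reshape(bands.shape)

# Three strongly correlated synthetic bands
rng = np.random.default_rng(5)
base = rng.normal(size=(64, 64))
bands = np.stack([base + 0.1 * rng.normal(size=(64, 64)) for _ in range(3)])
out = decorrelation_stretch(bands)
print(np.allclose(np.cov(out.reshape(3, -1)), np.eye(3)))  # decorrelated
```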
Remote logo detection using angle-distance histograms
NASA Astrophysics Data System (ADS)
Youn, Sungwook; Ok, Jiheon; Baek, Sangwook; Woo, Seongyoun; Lee, Chulhee
2016-05-01
Among the many computer vision applications, automatic logo recognition has drawn great interest from industry as well as academic institutions. In this paper, we propose an angle-distance map, which we used to develop a robust logo detection algorithm. The proposed angle-distance histogram is invariant to scale and rotation. The proposed method first uses shape information and color characteristics to find candidate regions and then applies the angle-distance histogram. Experiments show that the proposed method detected logos of various sizes and orientations.
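Scale and rotation invariance can be obtained by normalizing distances by the maximum radius and measuring angles relative to a reference direction; the construction below (reference = farthest contour point from the centroid) is a hypothetical stand-in for the authors' angle-distance map, demonstrated on a synthetic contour.

```python
import numpy as np

def angle_distance_histogram(pts, n_ang=8, n_dist=4):
    """2-D histogram over (angle, normalized distance) of contour points,
    with angles taken relative to the farthest point from the centroid."""
    pts = np.asarray(pts, float)
    rel = pts - pts.mean(axis=0)
    d = np.linalg.norm(rel, axis=1)
    ref = np.arctan2(rel[d.argmax(), 1], rel[d.argmax(), 0])
    ang = (np.arctan2(rel[:, 1], rel[:, 0]) - ref) % (2 * np.pi)
    hist, _, _ = np.histogram2d(ang, d / d.max(),
                                bins=[n_ang, n_dist],
                                range=[[0, 2 * np.pi], [0, 1]])
    return hist / hist.sum()

# An asymmetric synthetic contour, then a rotated and scaled copy
t = np.linspace(0, 2 * np.pi, 300, endpoint=False)
r = 1 + 0.4 * np.cos(t) + 0.2 * np.sin(2 * t)
shape = np.c_[r * np.cos(t), r * np.sin(t)]
a = np.deg2rad(40)
R = np.array([[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]])
h1 = angle_distance_histogram(shape)
h2 = angle_distance_histogram(3.0 * shape @ R.T)
print(np.allclose(h1, h2))  # descriptor unchanged by rotation + scaling
```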
NASA Astrophysics Data System (ADS)
Maggio, Angelo; Carillo, Viviana; Cozzarini, Cesare; Perna, Lucia; Rancati, Tiziana; Valdagni, Riccardo; Gabriele, Pietro; Fiorino, Claudio
2013-04-01
The aim of this study was to evaluate the correlation between the ‘true’ absolute and relative dose-volume histograms of the bladder wall (the dose-wall histogram, DWH) defined on MRI and other surrogates of bladder dosimetry in prostate cancer patients, planned both with 3D-conformal and intensity-modulated radiation therapy (IMRT) techniques. For 17 prostate cancer patients, previously treated with radical intent, CT and MRI scans were acquired and matched. The contours of the bladder walls were drawn using the MRI images. The external bladder surfaces were then used to generate artificial bladder walls by performing automatic contractions of 5, 7 and 10 mm. For each patient, a 3D conformal radiotherapy (3DCRT) and an IMRT treatment plan were generated with a prescription dose of 77.4 Gy (1.8 Gy/fr), and the dose-volume histograms (DVHs) of the whole bladder and of the artificial walls (DVH5, DVH7, DVH10), together with dose-surface histograms (DSHs), were calculated and compared against the DWH in absolute and relative value for both treatment planning techniques. A specific software (VODCA v. 4.4.0, MSS Inc.) was used for calculating the dose-volume/surface histograms. Correlation was quantified for selected dose-volume/surface parameters by the Spearman correlation coefficient. The agreement between the %DWH and DVH5, DVH7 and DVH10 was found to be very good (maximum average deviations below 2%, SD < 5%), with DVH5 showing the best agreement. The correlation was slightly better for absolute (R = 0.80-0.94) than for relative (R = 0.66-0.92) histograms. The DSH was also found to be highly correlated with the DWH, although slightly larger deviations were generally found. The DVH was not a good surrogate of the DWH (R < 0.7 for most parameters). When comparing the two treatment techniques, more pronounced differences between relative histograms were seen for IMRT than for 3DCRT (p < 0.0001).
Classification of calcium in intravascular OCT images for the purpose of intervention planning
NASA Astrophysics Data System (ADS)
Shalev, Ronny; Bezerra, Hiram G.; Ray, Soumya; Prabhu, David; Wilson, David L.
2016-03-01
The presence of extensive calcification is a primary concern when planning and implementing a vascular percutaneous intervention such as stenting. If the balloon does not expand, the interventionalist must blindly apply high balloon pressure, use an atherectomy device, or abort the procedure. As part of a project to determine the ability of intravascular optical coherence tomography (IVOCT) to aid intervention planning, we developed a method for automatic classification of calcium in coronary IVOCT images. We developed an approach where plaque texture is modeled by the joint probability distribution of a bank of filter responses, where the filter bank was chosen to reflect the qualitative characteristics of the calcium. This distribution is represented by the frequency histogram of filter-response cluster centers. The trained algorithm was evaluated on independent ex-vivo image data accurately labeled using registered 3D microscopic cryo-image data, which was used as ground truth. In this study, regions for extraction of sub-images (SIs) were selected by experts to include calcium, fibrous, or lipid tissues. We manually optimized algorithm parameters such as the choice of filter bank, the size of the dictionary, etc. Splitting samples into training and testing data, we achieved 5-fold cross-validation calcium classification with an F1 score of 93.7 ± 2.7%, with a recall of ≥89% and a precision of ≥97% in this scenario with admittedly selective data. The automated algorithm performed in close to real time (2.6 seconds per frame), suggesting possible on-line use. This promising preliminary study indicates that computational IVOCT might automatically identify calcium in IVOCT coronary artery images.
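The texture model above (a frequency histogram of filter-response cluster-center assignments, i.e., a texton histogram) can be sketched as follows; the 3x3 filter bank, the fixed dictionary, and the toy textures are all hypothetical, whereas in the paper the dictionary would come from clustering training filter responses.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def filter_responses(img, filters):
    """Valid-mode 2-D correlation of img with each filter in the bank."""
    k = filters.shape[-1]
    win = sliding_window_view(img, (k, k))
    return np.einsum('ijab,fab->fij', win, filters)

def texton_histogram(img, filters, centers):
    """Frequency histogram of nearest-cluster-center assignments of the
    per-pixel filter-response vectors."""
    resp = filter_responses(img, filters)
    vecs = resp.reshape(len(filters), -1).T
    d2 = ((vecs[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    hist = np.bincount(d2.argmin(axis=1), minlength=len(centers)).astype(float)
    return hist / hist.sum()

# Toy textures: near-constant vs vertical stripes of period 4
rng = np.random.default_rng(6)
smooth = 0.5 + 0.01 * rng.normal(size=(32, 32))
stripes = np.tile(((np.arange(32) // 2) % 2).astype(float), (32, 1))

mean_f = np.full((3, 3), 1 / 9)
dx = np.array([[-1, 0, 1]] * 3, float) / 6
bank = np.stack([mean_f, dx, dx.T])   # mean, horizontal edge, vertical edge
# Hypothetical dictionary of response-space cluster centers
centers = np.array([[0.5, 0.0, 0.0], [0.5, 0.5, 0.0],
                    [0.5, -0.5, 0.0], [0.5, 0.0, 0.5]])
h_smooth = texton_histogram(smooth, bank, centers)
h_stripes = texton_histogram(stripes, bank, centers)
print(h_smooth, h_stripes)  # the two textures occupy different bins
```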
Spatio-temporal modeling of the African swine fever epidemic in the Russian Federation, 2007-2012.
Korennoy, F I; Gulenkin, V M; Malone, J B; Mores, C N; Dudnikov, S A; Stevenson, M A
2014-10-01
In 2007 African swine fever (ASF) entered Georgia and in the same year the disease entered the Russian Federation. From 2007 to 2012 ASF spread throughout the southern region of the Russian Federation. At the same time several cases of ASF were detected in the central and northern regions of the Russian Federation, forming a northern cluster of outbreaks in 2011. This northern cluster is of concern because of its proximity to mainland Europe. The aim of this study was to use details of recorded ASF outbreaks and human and swine population details to estimate the spatial distribution of ASF risk in the southern region of the European part of the Russian Federation. Our model of ASF risk was comprised of two components. The first was an estimate of ASF suitability scores calculated using maximum entropy methods. The second was an estimate of ASF risk as a function of Euclidean distance from index cases. An exponential distribution fitted to a frequency histogram of the Euclidean distance between consecutive ASF cases had a mean value of 156 km, a distance greater than the surveillance zone radius of 100-150 km stated in the ASF control regulations for the Russian Federation. We show that the spatial and temporal risk of ASF expansion is related to the suitability of the area of potential expansion, which is in turn a function of socio-economic and geographic variables. We propose that the methodology presented in this paper provides a useful tool to optimize surveillance for ASF in affected areas.
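The exponential model of consecutive-case distances has a closed-form fit (the maximum-likelihood mean is the sample mean), from which the chance of a jump beyond the 100-150 km surveillance radius follows directly; the distances below are simulated with the reported 156 km mean, not the actual outbreak data.

```python
import numpy as np

# MLE fit of an exponential model to inter-outbreak distances and the
# implied probability that spread jumps beyond the surveillance radius
rng = np.random.default_rng(7)
distances_km = rng.exponential(156.0, size=2000)   # synthetic distances
mean_km = float(distances_km.mean())               # exponential MLE mean
p_beyond_150 = float(np.exp(-150.0 / mean_km))     # model P(jump > 150 km)
print(f"fitted mean = {mean_km:.0f} km, P(jump > 150 km) = {p_beyond_150:.2f}")
```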
NASA Astrophysics Data System (ADS)
Saraceno, Martin; Provost, Christine; Piola, Alberto R.
2005-11-01
The time-space distribution of chlorophyll a in the southwestern Atlantic (SWA) is examined using 6 years (1998-2003) of sea surface color images from the Sea-viewing Wide Field of View Sensor (SeaWiFS). The chlorophyll a (chl a) distribution is compared with sea surface temperature (SST) fronts retrieved from satellite imagery. Histogram analysis of the color, SST, and SST gradient data sets provides a simple procedure for pixel classification, from which eight biophysical regions in the SWA are identified, including three regions that are new relative to the work of Longhurst (1998): the Patagonian Shelf Break (PSB), the Brazil Current Overshoot, and the Zapiola Rise region. In the PSB region, coastal-trapped waves are suggested as a possible mechanism leading to the intraseasonal frequencies observed in SST and chl a. Mesoscale activity associated with the Brazil Current Front and, in particular, eddies drifting southward is probably responsible for the high chl a values observed throughout the Brazil Current Overshoot region. The Zapiola Rise is characterized by a local minimum in SST gradient magnitudes and shows maximum chl a values in February, 3 months later than the austral spring bloom of the surroundings. Significant interannual variability is present in the color imagery. In the PSB, springs and summers with high chl a concentrations seem to be associated with stronger local northerly wind speed, and possible mechanisms are discussed. Finally, the Brazil-Malvinas front is detected using both SST gradient and SeaWiFS images. The time-averaged position of the front at 54.2°W is estimated at 38.9°S, with an alongshore migration of about 300 km.
Possible Electromagnetic Effects on Abnormal Animal Behavior Before an Earthquake
Hayakawa, Masashi
2013-01-01
The statistical properties summarized by Rikitake (1998) for unusual animal behavior before an earthquake (EQ) are first presented using two parameters: the epicentral distance (D) of an anomaly and its precursor (or lead) time (T). Three plots are used to characterize the unusual animal behavior: (i) EQ magnitude (M) versus D, (ii) log T versus M, and (iii) the occurrence histogram of log T. These plots are compared with the corresponding plots for different seismo-electromagnetic effects (radio emissions in different frequency ranges, seismo-atmospheric and -ionospheric perturbations) extensively observed during the last 15–20 years. From these comparisons, it is likely that lower-frequency electromagnetic emissions (ULF (ultra-low-frequency, f ≤ 1 Hz) and ELF (extremely-low-frequency, f ≤ a few hundred Hz)) exhibit a temporal evolution very similar to that of abnormal animal behavior. It is also suggested that the field intensity multiplied by the persistence time (or duration) of the noise plays the primary role in abnormal animal behavior before an EQ. PMID:26487307
Automatic analysis of ciliary beat frequency using optical flow
NASA Astrophysics Data System (ADS)
Figl, Michael; Lechner, Manuel; Werther, Tobias; Horak, Fritz; Hummel, Johann; Birkfellner, Wolfgang
2012-02-01
Ciliary beat frequency (CBF) can be a useful parameter for the diagnosis of several diseases, e.g., primary ciliary dyskinesia (PCD). CBF computation is usually done by manual evaluation of high-speed video sequences, a tedious, observer-dependent, and not very accurate procedure. We used OpenCV's pyramidal implementation of the Lucas-Kanade algorithm for optical flow computation and applied it to selected objects to follow their movements. The objects were chosen by their contrast, applying the corner detection of Shi and Tomasi. Discrimination between background/noise and cilia by a frequency histogram allowed the CBF to be computed. Frequency analysis was done using the Fourier transform in MATLAB. The correct number of Fourier summands was found from the slope of an approximation curve. The method proved usable for distinguishing between healthy and diseased samples. However, difficulties remain in automatically identifying the cilia and in finding enough high-contrast cilia in the image. Furthermore, some of the higher-contrast cilia are lost (and sometimes re-found) by the method; an easy way to identify the correct sub-path of a point's trajectory has yet to be found for cases in which the slope method does not work.
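The per-object frequency estimate feeding the CBF histogram can be sketched as the dominant FFT peak of an intensity trace; the trace below is a synthetic 12 Hz oscillation, and the function is a simplified stand-in for the paper's MATLAB analysis.

```python
import numpy as np

def dominant_frequency(trace, fs):
    """Dominant frequency (Hz) of a trace via the FFT magnitude
    spectrum; the mean is removed so the DC bin cannot win."""
    trace = np.asarray(trace, float) - np.mean(trace)
    spec = np.abs(np.fft.rfft(trace))
    freqs = np.fft.rfftfreq(trace.size, d=1.0 / fs)
    return float(freqs[spec.argmax()])

# Synthetic cilium intensity trace: 12 Hz beat sampled at 500 fps for 2 s
fs = 500.0
t = np.arange(0, 2.0, 1.0 / fs)
rng = np.random.default_rng(8)
trace = np.sin(2 * np.pi * 12.0 * t) + 0.2 * rng.normal(size=t.size)
cbf = dominant_frequency(trace, fs)
print(f"CBF = {cbf:.1f} Hz")
```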
Illumination invariant feature point matching for high-resolution planetary remote sensing images
NASA Astrophysics Data System (ADS)
Wu, Bo; Zeng, Hai; Hu, Han
2018-03-01
Despite its success with regular close-range and remote-sensing images, the scale-invariant feature transform (SIFT) algorithm is essentially not invariant to illumination differences due to the use of gradients for feature description. In planetary remote sensing imagery, which normally lacks sufficient textural information, salient regions are generally triggered by the shadow effects of keypoints, reducing the matching performance of classical SIFT. Based on the observation of dual peaks in a histogram of the dominant orientations of SIFT keypoints, this paper proposes an illumination-invariant SIFT matching method for high-resolution planetary remote sensing images. First, as the peaks in the orientation histogram are generally aligned closely with the sub-solar azimuth angle at the time of image collection, an adaptive suppression Gaussian function is tuned to level the histogram and thereby alleviate the differences in illumination caused by a changing solar angle. Next, the suppression function is incorporated into the original SIFT procedure for obtaining feature descriptors, which are used for initial image matching. Finally, as the distribution of feature descriptors changes after anisotropic suppression, and the ratio check used for matching and outlier removal in classical SIFT may produce inferior results, this paper proposes an improved matching procedure based on cross-checking and template image matching. The experimental results for several high-resolution remote sensing images from both the Moon and Mars, with illumination differences of 20°-180°, reveal that the proposed method retrieves about 40%-60% more matches than the classical SIFT method. The proposed method is of significance for matching or co-registration of planetary remote sensing images for their synergistic use in various applications. It also has the potential to be useful for flyby and rover images by integrating with the affine invariant feature detectors.
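The suppression step can be sketched as multiplying the orientation histogram by an inverted Gaussian centred on the illumination azimuth; the bin layout, suppression strength, and width below are hypothetical choices, not the paper's tuned function.

```python
import numpy as np

def suppress_orientations(hist, mu_deg, sigma_deg=20.0, strength=0.7):
    """Damp an orientation histogram around azimuth mu_deg with an
    inverted Gaussian, levelling illumination-driven peaks."""
    bins = hist.size
    angles = np.arange(bins) * (360.0 / bins)
    d = np.minimum(np.abs(angles - mu_deg), 360.0 - np.abs(angles - mu_deg))
    factor = 1.0 - strength * np.exp(-0.5 * (d / sigma_deg) ** 2)
    return hist * factor

# Synthetic 36-bin dominant-orientation histogram with dual peaks
# at 90 and 270 degrees (aligned with a hypothetical sub-solar azimuth)
hist = np.full(36, 10.0)
hist[9] = hist[27] = 100.0
flat = suppress_orientations(hist, 90.0)
flat = suppress_orientations(flat, 270.0)
print(flat.max() / flat.min())  # peak-to-floor ratio reduced below 10x
```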
Chen, T; Li, Y; Lu, S-S; Zhang, Y-D; Wang, X-N; Luo, C-Y; Shi, H-B
2017-11-01
To evaluate the diagnostic performance of histogram analysis of diffusion kurtosis magnetic resonance imaging (DKI) and standard diffusion-weighted imaging (DWI) in discriminating tumour grades of endometrial carcinoma (EC). Seventy-three patients with EC were included in this study. The apparent diffusion coefficient (ADC) value from standard DWI, the apparent diffusion for Gaussian distribution (Dapp), and the apparent kurtosis coefficient (Kapp) from DKI were acquired using a 3 T magnetic resonance imaging (MRI) system. The measurement was based on an entire-tumour analysis. Histogram parameters (Dapp, Kapp, and ADC) were compared between high-grade (grade 3) and low-grade (grades 1 and 2) tumours. The diagnostic performance of imaging parameters for discriminating high- from low-grade tumours was analysed using receiver operating characteristic (ROC) curves. The areas under the ROC curve (AUC) of the 10th percentile of Dapp, the 90th percentile of Kapp, and the 10th percentile of ADC were higher than those of the other parameters in distinguishing high-grade from low-grade tumours (AUC=0.821, 0.891 and 0.801, respectively). The combination of the 10th percentile of Dapp and the 90th percentile of Kapp improved the AUC to 0.901, which was significantly higher than that of the 10th percentile of ADC (0.810, p=0.0314) in differentiating high- from low-grade EC. Entire-tumour volume histogram analysis of DKI and standard DWI was feasible for discriminating histological tumour grades of EC. DKI was relatively better than DWI in distinguishing high-grade from low-grade tumours in EC. Copyright © 2017. Published by Elsevier Ltd.
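Ranking a single histogram parameter by AUC, as done for the percentiles above, can be reproduced with the Mann-Whitney relation AUC = U/(n1·n2). A sketch with toy data (the values below are invented, not the study's):

```python
import numpy as np

def auc_mann_whitney(pos, neg):
    """AUC for a scalar marker: the probability that a random positive
    case scores higher than a random negative case (ties count half).
    Equivalent to the normalised Mann-Whitney U statistic."""
    pos, neg = np.asarray(pos, float), np.asarray(neg, float)
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

# toy example: a 90th-percentile-style parameter in high- vs low-grade tumours
high = [1.1, 1.3, 1.2, 1.4]
low = [0.9, 1.0, 1.15, 0.8]
auc = auc_mann_whitney(high, low)
```

Swapping the groups gives the complementary AUC, which is why direction (e.g. "higher Kapp predicts high grade") must be fixed before reading the curve.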
NASA Astrophysics Data System (ADS)
Cai, Wenli; Yoshida, Hiroyuki; Harris, Gordon J.
2007-03-01
Measurement of the volume of focal liver tumors, called liver tumor volumetry, is indispensable for assessing the growth of tumors and for monitoring the response of tumors to oncology treatments. Traditional edge models, such as the maximum gradient and zero-crossing methods, often fail to detect the accurate boundary of a fuzzy object such as a liver tumor. As a result, the computerized volumetry based on these edge models tends to differ from manual segmentation results performed by physicians. In this study, we developed a novel computerized volumetry method for fuzzy objects, called dynamic-thresholding level set (DT level set). An optimal threshold value computed from a histogram tends to shift, relative to the theoretical threshold value obtained from a normal distribution model, toward a smaller region in the histogram. We thus designed a mobile shell structure, called a propagating shell, which is a thick region encompassing the level set front. The optimal threshold calculated from the histogram of the shell drives the level set front toward the boundary of a liver tumor. When the volume ratio between the object and the background in the shell approaches one, the optimal threshold value best fits the theoretical threshold value and the shell stops propagating. Application of the DT level set to 26 hepatic CT cases with 63 biopsy-confirmed hepatocellular carcinomas (HCCs) and metastases showed that the computer measured volumes were highly correlated with those of tumors measured manually by physicians. Our preliminary results showed that DT level set was effective and accurate in estimating the volumes of liver tumors detected in hepatic CT images.
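The "optimal threshold computed from a histogram" can be illustrated with Otsu's between-class-variance criterion; whether the DT level set uses exactly this criterion is an assumption of the sketch.

```python
import numpy as np

def otsu_threshold(hist):
    """Optimal histogram threshold by Otsu's criterion (maximise the
    between-class variance).  One common meaning of an 'optimal
    threshold computed from a histogram'; the paper's exact criterion
    may differ."""
    hist = np.asarray(hist, float)
    total = hist.sum()
    bins = np.arange(len(hist))
    best_t, best_var = 0, -1.0
    for t in range(1, len(hist)):
        w0, w1 = hist[:t].sum(), hist[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (bins[:t] * hist[:t]).sum() / w0
        mu1 = (bins[t:] * hist[t:]).sum() / w1
        between = w0 * w1 * (mu0 - mu1) ** 2 / total**2
        if between > best_var:
            best_var, best_t = between, t
    return best_t

# bimodal toy histogram: background around bin 2, object around bin 7
hist = [1, 6, 9, 5, 1, 1, 4, 9, 6, 2]
t = otsu_threshold(hist)
```

In the propagating-shell scheme, such a threshold would be recomputed from the shell's histogram at each iteration rather than once for the whole image.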
ERIC Educational Resources Information Center
Leyden, Michael B.
1975-01-01
Describes various elementary school activities using a loaf of raisin bread to promote inquiry skills. Activities include estimating the number of raisins in the loaf by constructing histograms of the number of raisins in a slice. (MLH)
A tool for the estimation of the distribution of landslide area in R
NASA Astrophysics Data System (ADS)
Rossi, M.; Cardinali, M.; Fiorucci, F.; Marchesini, I.; Mondini, A. C.; Santangelo, M.; Ghosh, S.; Riguer, D. E. L.; Lahousse, T.; Chang, K. T.; Guzzetti, F.
2012-04-01
We have developed a tool in R (the free software environment for statistical computing, http://www.r-project.org/) to estimate the probability density and the frequency density of landslide area. The tool implements parametric and non-parametric approaches, including: (i) Histogram Density Estimation (HDE), (ii) Kernel Density Estimation (KDE), and (iii) Maximum Likelihood Estimation (MLE). The tool is available as a standard Open Geospatial Consortium (OGC) Web Processing Service (WPS), and is accessible through the web using different GIS software clients. We used the tool to compare Double Pareto and Inverse Gamma models for the probability density of landslide area in different geological, morphological and climatological settings, and to compare landslides shown in inventory maps prepared using different mapping techniques, including (i) field mapping, (ii) visual interpretation of monoscopic and stereoscopic aerial photographs, (iii) visual interpretation of monoscopic and stereoscopic VHR satellite images, and (iv) semi-automatic detection and mapping from VHR satellite images. Results show that both models are applicable in different geomorphological settings. In most cases the two models provided very similar results. Non-parametric estimation methods (i.e., HDE and KDE) provided reasonable results for all the tested landslide datasets. For some of the datasets, MLE failed to provide a result owing to convergence problems. The two tested models (Double Pareto and Inverse Gamma) produced very similar results for large and very large datasets (> 150 samples). Differences in the modeling results were observed for small datasets affected by systematic biases.
A distinct rollover was observed in all analyzed landslide datasets, except for a few datasets obtained from landslide inventories prepared through field mapping or by semi-automatic mapping from VHR satellite imagery. The tool can also be used to evaluate the probability density and the frequency density of landslide volume.
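The tool itself is written in R; as a language-neutral illustration, Histogram Density Estimation of landslide area reduces to counts in (here logarithmic) bins normalised by bin width and sample size. A Python sketch with synthetic data:

```python
import numpy as np

def histogram_density(areas, n_bins=20):
    """Histogram Density Estimation (HDE) of landslide area: counts in
    logarithmic bins, normalised by bin width and sample size so the
    estimate integrates to ~1.  A sketch of one method the R tool
    implements; the bin choices here are illustrative."""
    areas = np.asarray(areas, float)
    edges = np.logspace(np.log10(areas.min()),
                        np.log10(areas.max()), n_bins + 1)
    counts, _ = np.histogram(areas, bins=edges)
    density = counts / (np.diff(edges) * len(areas))
    return edges, density

rng = np.random.default_rng(0)
areas = rng.lognormal(mean=7.0, sigma=1.0, size=500)   # synthetic areas, m^2
edges, density = histogram_density(areas)
total = (density * np.diff(edges)).sum()               # ~1 by construction
```

Logarithmic bins are the usual choice for landslide areas because the distribution spans several orders of magnitude and the rollover sits far below the mean.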
NASA Astrophysics Data System (ADS)
Wan, Minjie; Gu, Guohua; Qian, Weixian; Ren, Kan; Chen, Qian
2018-06-01
Infrared (IR) small target enhancement plays a significant role in modern infrared search and track (IRST) systems and is a basic technique for target detection and tracking. In this paper, a coarse-to-fine grey level mapping method using an improved sigmoid transformation and a saliency histogram is designed to enhance IR small targets under different backgrounds. In the coarse enhancement stage, the intensity histogram is modified via an improved sigmoid function so as to narrow the typical intensity range of the background as much as possible. In the fine enhancement stage, a linear transformation is performed based on a saliency histogram constructed by averaging the cumulative saliency values provided by a saliency map. Compared with other typical methods, the presented method achieves better results in both visual quality and quantitative evaluation.
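The coarse-enhancement stage can be illustrated with a plain sigmoid grey-level mapping; the paper's "improved sigmoid" and its tuning are not reproduced here, and the background estimate below is a deliberate simplification.

```python
import numpy as np

def sigmoid_stretch(img, center=None, gain=0.05):
    """Sigmoid grey-level mapping for coarse IR enhancement: compresses
    the intensity range around the background level `center` and
    stretches values above it.  A generic sketch, not the paper's tuned
    'improved sigmoid' function; the mean as background estimate is a
    crude assumption."""
    img = np.asarray(img, float)
    if center is None:
        center = img.mean()          # crude background-level estimate
    out = 1.0 / (1.0 + np.exp(-gain * (img - center)))
    return (255 * (out - out.min()) / (out.max() - out.min())).astype(np.uint8)

img = np.full((32, 32), 100.0)
img[15:18, 15:18] = 180.0            # small bright target on flat background
enh = sigmoid_stretch(img)
```

After the mapping the flat background collapses to the bottom of the grey range while the small target occupies the top, which is the "narrow the background range" effect the abstract describes.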
Massar, Melody L; Bhagavatula, Ramamurthy; Ozolek, John A; Castro, Carlos A; Fickus, Matthew; Kovačević, Jelena
2011-10-19
We present the current state of our work on a mathematical framework for identification and delineation of histopathology images: local histograms and occlusion models. Local histograms are histograms computed over defined spatial neighborhoods whose purpose is to characterize an image locally. This unit of description is augmented by our occlusion models, which describe a methodology for image formation. In the context of this image formation model, the power of local histograms with respect to appropriate families of images is shown through several proved statements about expected performance. We conclude by presenting a preliminary study to demonstrate the power of the framework in the context of two histopathology image classification tasks that, while differing greatly in application, both originate from what is considered an appropriate class of images for this framework.
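A local histogram, in the sense used above, is simply the grey-level histogram of a neighbourhood around each pixel. A direct, unoptimised sketch (window shape and bin count are illustrative choices, not the paper's):

```python
import numpy as np

def local_histograms(img, radius, n_bins, vmax=256):
    """Local histograms: for each pixel, the grey-level histogram of the
    (2r+1) x (2r+1) neighbourhood around it (borders clipped).  A direct
    sketch of the descriptor's basic idea; real implementations use
    sliding-window updates instead of recomputing each patch."""
    h, w = img.shape
    out = np.zeros((h, w, n_bins), int)
    for i in range(h):
        for j in range(w):
            patch = img[max(0, i - radius):i + radius + 1,
                        max(0, j - radius):j + radius + 1]
            out[i, j], _ = np.histogram(patch, bins=n_bins, range=(0, vmax))
    return out

img = np.zeros((8, 8), int)
img[:, 4:] = 200                     # two-texture toy image
lh = local_histograms(img, radius=1, n_bins=4)
```

Pixels deep inside one texture get a one-bin histogram; pixels near the boundary get mixed histograms, which is exactly the local characterisation the framework exploits.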
Chen, Zhaoxue; Yu, Haizhong; Chen, Hao
2013-12-01
To solve the problem that traditional K-means clustering selects initial clustering centers randomly, we propose a new K-means segmentation algorithm based on robustly selecting 'peaks' standing for white matter, gray matter, and cerebrospinal fluid in the multi-peak gray-level histogram of an MRI brain image. The new algorithm takes the gray values of the selected histogram 'peaks' as the initial K-means clustering centers and can segment the MRI brain image into the three tissue classes more effectively, accurately, and stably. Extensive experiments have shown that the proposed algorithm overcomes many shortcomings of the traditional K-means method, such as low efficiency, poor accuracy, weak robustness, and long run time. The histogram 'peak' selection idea of the proposed segmentation method is of wider applicability.
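The peak-selection idea can be sketched as picking the tallest, mutually separated histogram bins as initial cluster centres. The separation heuristic below is an assumption, not the paper's exact robust-selection rule:

```python
import numpy as np

def histogram_peak_centers(img, n_peaks=3, bins=256, min_sep=10):
    """Pick the `n_peaks` tallest, well-separated peaks of the grey-level
    histogram as initial K-means centres (e.g. WM / GM / CSF in brain
    MRI).  `min_sep` enforces separation in bins; this is a simple
    heuristic standing in for the paper's robust peak selection."""
    hist, _ = np.histogram(img, bins=bins, range=(0, bins))
    order = np.argsort(hist)[::-1]           # bins by descending count
    peaks = []
    for b in order:
        if all(abs(b - p) >= min_sep for p in peaks):
            peaks.append(int(b))
        if len(peaks) == n_peaks:
            break
    return sorted(peaks)

rng = np.random.default_rng(1)
img = np.concatenate([rng.normal(40, 3, 1000),    # synthetic CSF-like mode
                      rng.normal(120, 3, 1000),   # GM-like mode
                      rng.normal(200, 3, 1000)]).clip(0, 255)  # WM-like mode
centers = histogram_peak_centers(img)
```

Seeding K-means with these centres makes the result deterministic and keeps each cluster anchored to one tissue mode, which is the stability gain the abstract claims.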
Neutron camera employing row and column summations
Clonts, Lloyd G.; Diawara, Yacouba; Donahue, Jr, Cornelius; Montcalm, Christopher A.; Riedel, Richard A.; Visscher, Theodore
2016-06-14
For each photomultiplier tube in an Anger camera, an R×S array of preamplifiers is provided to detect electrons generated within the photomultiplier tube. The outputs of the preamplifiers are digitized to measure the magnitude of the signals from each preamplifier. For each photomultiplier tube, a corresponding summation circuitry including R row summation circuits and S column summation circuits numerically adds the magnitudes of the signals from the preamplifiers for each row and for each column to generate histograms. For a P×Q array of photomultiplier tubes, P×Q summation circuitries generate P×Q row histograms including R entries and P×Q column histograms including S entries. The total set of histograms includes P×Q×(R+S) entries, which can be analyzed by a position calculation circuit to determine the locations of events (detection of a neutron).
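The row/column summation readout reduces a 2-D array of preamplifier signals to two short histograms; a centroid over those histograms (one simple choice, not necessarily the patent's position algorithm) then locates the event:

```python
import numpy as np

def row_col_histograms(signals):
    """Row and column summation histograms for an R x S preamplifier
    array: each entry is the summed signal magnitude along one row or
    one column, as in the patented readout.  Centroids of the two
    histograms give a simple event-position estimate (an illustrative
    choice of position calculation)."""
    signals = np.asarray(signals, float)
    row_hist = signals.sum(axis=1)           # R entries
    col_hist = signals.sum(axis=0)           # S entries
    y = (np.arange(len(row_hist)) * row_hist).sum() / row_hist.sum()
    x = (np.arange(len(col_hist)) * col_hist).sum() / col_hist.sum()
    return row_hist, col_hist, (y, x)

event = np.zeros((4, 4))
event[2, 1] = 8.0                           # charge concentrated at (2, 1)
row_hist, col_hist, pos = row_col_histograms(event)
```

The payoff of the scheme is data reduction: R+S summed values per tube instead of R×S raw channels, with position information preserved.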
Cho, Gene Young; Moy, Linda; Kim, Sungheon G; Baete, Steven H; Moccaldi, Melanie; Babb, James S; Sodickson, Daniel K; Sigmund, Eric E
2016-08-01
To examine heterogeneous breast cancer through intravoxel incoherent motion (IVIM) histogram analysis. This HIPAA-compliant, IRB-approved retrospective study included 62 patients (age 48.44 ± 11.14 years, 50 malignant lesions and 12 benign) who underwent contrast-enhanced 3 T breast MRI and diffusion-weighted imaging. Apparent diffusion coefficient (ADC) and IVIM biomarkers of tissue diffusivity (Dt), perfusion fraction (fp), and pseudo-diffusivity (Dp) were calculated using voxel-based analysis for the whole lesion volume. Histogram analysis was performed to quantify tumour heterogeneity. Comparisons were made using Mann-Whitney tests between benign/malignant status, histological subtype, and molecular prognostic factor status while Spearman's rank correlation was used to characterize the association between imaging biomarkers and prognostic factor expression. The average values of the ADC and IVIM biomarkers, Dt and fp, showed significant differences between benign and malignant lesions. Additional significant differences were found in the histogram parameters among tumour subtypes and molecular prognostic factor status. IVIM histogram metrics, particularly fp and Dp, showed significant correlation with hormonal factor expression. Advanced diffusion imaging biomarkers show relationships with molecular prognostic factors and breast cancer malignancy. This analysis reveals novel diagnostic metrics that may explain some of the observed variability in treatment response among breast cancer patients. • Novel IVIM biomarkers characterize heterogeneous breast cancer. • Histogram analysis enables quantification of tumour heterogeneity. • IVIM biomarkers show relationships with breast cancer malignancy and molecular prognostic factors.
Effect of respiratory and cardiac gating on the major diffusion-imaging metrics.
Hamaguchi, Hiroyuki; Tha, Khin Khin; Sugimori, Hiroyuki; Nakanishi, Mitsuhiro; Nakagawa, Shin; Fujiwara, Taro; Yoshida, Hirokazu; Takamori, Sayaka; Shirato, Hiroki
2016-08-01
The effect of respiratory gating on the major diffusion-imaging metrics, and that of cardiac gating on mean kurtosis (MK), are not known. To evaluate whether the major diffusion-imaging metrics of the brain (MK, fractional anisotropy (FA), and mean diffusivity (MD)) varied between gated and non-gated acquisitions, respiratory-gated, cardiac-gated, and non-gated diffusion imaging of the brain was performed in 10 healthy volunteers. MK, FA, and MD maps were constructed for all acquisitions, and their histograms were computed. The normalized peak height and the peak location of the histograms were compared among the acquisitions by use of Friedman and post hoc Wilcoxon tests. The effect of the repetition time (TR) on the diffusion-imaging metrics was also tested, and we corrected for its variation among acquisitions where necessary. The results showed a shift of the peak location of the MK and MD histograms to the right with an increase in TR (p ≤ 0.01). The corrected peak location of the MK histograms, the normalized peak height of the FA histograms, and the normalized peak height and corrected peak location of the MD histograms varied significantly between the gated and non-gated acquisitions (p < 0.05). These results imply an influence of respiration and cardiac pulsation on the major diffusion-imaging metrics. The gating conditions must be kept identical if reproducible results are to be achieved. © The Author(s) 2016.
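The two histogram metrics compared above, normalised peak height and peak location, can be computed as follows (bin count, value range, and normalisation by voxel count are common conventions, assumed here):

```python
import numpy as np

def histogram_peak_metrics(values, bins=100, value_range=(0.0, 3.0)):
    """Normalised peak height and peak location of a diffusion-metric
    histogram (e.g. MD in um^2/ms): the tallest bin's count divided by
    the voxel count, so acquisitions of different sizes compare, and the
    centre of that bin.  These conventions are one common choice, not
    necessarily the study's exact definitions."""
    hist, edges = np.histogram(values, bins=bins, range=value_range)
    peak = hist.argmax()
    peak_height = hist[peak] / len(values)
    peak_location = 0.5 * (edges[peak] + edges[peak + 1])
    return peak_height, peak_location

rng = np.random.default_rng(2)
md = rng.normal(0.9, 0.1, 5000).clip(0, 3)   # synthetic MD sample
h, loc = histogram_peak_metrics(md)
```

A rightward shift of `loc` between acquisitions is exactly the TR effect the study reports for the MK and MD histograms.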
A cost-effective line-based light-balancing technique using adaptive processing.
Hsia, Shih-Chang; Chen, Ming-Huei; Chen, Yu-Min
2006-09-01
Camera imaging systems are widely used; however, the displayed image often shows an unequal light distribution. This paper presents novel light-balancing techniques to compensate for uneven illumination based on adaptive signal processing. For text images, we first estimate the background level and then process each pixel with a nonuniform gain. This algorithm can balance the light distribution while keeping high contrast in the image. For graphic images, adaptive section control using a piecewise nonlinear gain is proposed to equalize the histogram. Simulations show that the light-balancing performance is better than that of other methods. Moreover, we employ line-based processing to efficiently reduce the memory requirement and the computational cost, making the approach applicable in real-time systems.
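The background-then-gain idea for text images can be sketched per scan line. The sliding-maximum background estimate and the target level below are assumptions of the sketch, not the paper's estimator:

```python
import numpy as np

def balance_text_line(line, window=15, target=230.0):
    """Line-based light balancing for text images: estimate the local
    background with a sliding maximum (text strokes are darker than the
    paper), then apply a per-pixel gain that lifts the background to a
    uniform `target` level.  A simple sketch of the background-gain
    idea; the window size and target are illustrative."""
    line = np.asarray(line, float)
    pad = window // 2
    padded = np.pad(line, pad, mode="edge")
    background = np.array([padded[i:i + window].max()
                           for i in range(len(line))])
    gain = target / np.maximum(background, 1.0)
    return np.clip(line * gain, 0, 255)

# uneven illumination: paper level falls from 220 to 160 across the line
line = np.linspace(220, 160, 100)
line[40:43] = 50.0                    # dark text strokes
balanced = balance_text_line(line)
```

After balancing, the paper background is nearly flat while the strokes stay dark, i.e. the illumination gradient is removed without flattening the text contrast.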
Free energy calculations, enhanced by a Gaussian ansatz, for the "chemical work" distribution.
Boulougouris, Georgios C
2014-05-15
The evaluation of the free energy is essential in molecular simulation because it is intimately related to the existence of multiphase equilibrium. Recently, it was demonstrated that it is possible to evaluate the Helmholtz free energy using a single statistical ensemble along an entire isotherm by accounting for the "chemical work" of transforming each molecule from an interacting one to an ideal gas. In this work, we show that it is possible to perform such a free energy perturbation over a liquid-vapor phase transition. Furthermore, we investigate the link between a general free energy perturbation scheme and the nonequilibrium theories of Crooks and Jarzynski. We find that for finite systems away from the thermodynamic limit the second law of thermodynamics will always be an inequality for isothermal free energy perturbations, always resulting in a dissipated work that may tend to zero only in the thermodynamic limit. The work, the heat, and the entropy produced during a thermodynamic free energy perturbation can be viewed in the context of the Crooks and Jarzynski formalism, revealing that for a given value of the ensemble average of the "irreversible" work, the minimum entropy production corresponds to a Gaussian distribution for the histogram of the work. We propose evaluating the free energy difference in any free energy perturbation based scheme from the average irreversible "chemical work" minus the dissipated work, which can be calculated from the variance of the distribution of the logarithm of the work histogram, within the Gaussian approximation. As a consequence, using the Gaussian ansatz for the distribution of the "chemical work," accurate estimates of the chemical potential and the free energy of the system can be obtained from much shorter simulations, avoiding the need to sample the computationally costly tails of the "chemical work."
For a more general free energy perturbation scheme for which the Gaussian ansatz may not be valid, the free energy calculation can be expressed in terms of the moment generating function of the "chemical work" distribution. Copyright © 2014 Wiley Periodicals, Inc.
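The Gaussian ansatz can be stated compactly. Assuming the work histogram is Gaussian with mean ⟨W⟩ and variance σ_W², the cumulant expansion of the Jarzynski average truncates exactly at second order (a sketch of the standard result, with β = 1/k_BT):

```latex
e^{-\beta \Delta F} = \left\langle e^{-\beta W} \right\rangle
\quad\Longrightarrow\quad
\Delta F = \langle W \rangle - \frac{\beta}{2}\,\sigma_W^{2},
\qquad
W_{\mathrm{diss}} = \langle W \rangle - \Delta F = \frac{\beta}{2}\,\sigma_W^{2} \;\ge\; 0 .
```

So the dissipated work is fixed by the variance of the work histogram and vanishes only when the distribution sharpens, consistent with the abstract's statement that the second law holds as a strict inequality for finite systems.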
NASA Astrophysics Data System (ADS)
Rhodes, Andrew P.; Christian, John A.; Evans, Thomas
2017-12-01
With the availability and popularity of 3D sensors, it is advantageous to re-examine the use of point cloud descriptors for the purpose of pose estimation and spacecraft relative navigation. One popular descriptor is the oriented unique repeatable clustered viewpoint feature histogram (
Hybrid Histogram Descriptor: A Fusion Feature Representation for Image Retrieval.
Feng, Qinghe; Hao, Qiaohong; Chen, Yuqi; Yi, Yugen; Wei, Ying; Dai, Jiangyan
2018-06-15
Visual sensors are becoming increasingly affordable and widespread, accelerating the growth of image data. Image retrieval has attracted increasing interest due to applications in space exploration, industry, and biomedicine. Nevertheless, designing an effective feature representation is acknowledged as a hard yet fundamental issue. This paper presents a fusion feature representation called the hybrid histogram descriptor (HHD) for image retrieval. The proposed descriptor comprises two histograms: a perceptually uniform histogram, extracted by exploiting the color and edge orientation information in perceptually uniform regions; and a motif co-occurrence histogram, acquired by calculating the probability of a pair of motif patterns. To evaluate the performance, we benchmarked the proposed descriptor on the RSSCN7, AID, Outex-00013, Outex-00014 and ETHZ-53 datasets. Experimental results suggest that the proposed descriptor is more effective and robust than ten recent fusion-based descriptors under the content-based image retrieval framework. The computational complexity was also analyzed to give an in-depth evaluation. Furthermore, compared with state-of-the-art convolutional neural network (CNN)-based descriptors, the proposed descriptor achieves comparable performance but does not require any training process.
Improved LSB matching steganography with histogram characters reserved
NASA Astrophysics Data System (ADS)
Chen, Zhihong; Liu, Wenyao
2008-03-01
This letter builds on research into the LSB (least significant bit, i.e., the last bit of a binary pixel value) matching steganographic method and the steganalytic methods that target histograms of cover images, and proposes a modification to LSB matching. In LSB matching, if the LSB of the next cover pixel matches the next bit of secret data, nothing is done; otherwise, one is added to or subtracted from the cover pixel value at random. In our improved method, a steganographic information table is defined that records the changes introduced by the embedded secret bits. Using the table, the choice between adding and subtracting one for the next pixel with the same value is made dynamically, so that the change to the cover image's histogram is minimized. The modified method therefore embeds the same payload as LSB matching but with improved steganographic security and less vulnerability to attacks. Experimental results show that the histograms maintain their attributes, such as peak values and alternative trends, to an acceptable degree, and that the new method outperforms LSB matching with respect to histogram distortion and resistance against existing steganalysis.
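For reference, baseline LSB matching (the scheme being improved) is only a few lines; the paper's table-driven choice of +1/-1 replaces the random draw below.

```python
import numpy as np

def lsb_match_embed(pixels, bits, rng):
    """Classical LSB matching: if a pixel's LSB already equals the
    secret bit, leave it; otherwise add or subtract 1 at random.  The
    letter's improvement replaces the random choice with a table-driven
    one that minimises histogram change; this sketch shows only the
    baseline it modifies."""
    out = np.array(pixels, int)
    for i, bit in enumerate(bits):
        if out[i] % 2 != bit:
            step = rng.choice([-1, 1])
            if out[i] == 0:              # keep values in [0, 255]
                step = 1
            elif out[i] == 255:
                step = -1
            out[i] += step
    return out

rng = np.random.default_rng(3)
cover = [100, 101, 102, 103]
stego = lsb_match_embed(cover, [1, 1, 0, 0], rng)
extracted = [p % 2 for p in stego]
```

Extraction is just reading the LSBs back; the stego image differs from the cover by at most ±1 per embedded bit, which is why only the histogram (not the pixel values) betrays the embedding.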
1980-05-01
different cells of the histogram are recognisably those of a Gaussian distribution. On the other hand, at Q = 2 there is no way to decide whether the...levels. Note further that the first row (Q=8) contains the same data as in Table 6, but the cells are arranged in a different order. Consequently the...values of the more coarsely merged cells will be different at lower levels of Q. Now since we are dealing with nominal data the order of the bins is
A database system to support image algorithm evaluation
NASA Technical Reports Server (NTRS)
Lien, Y. E.
1977-01-01
The design is given of an interactive image database system IMDB, which allows the user to create, retrieve, store, display, and manipulate images through the facility of a high-level, interactive image query (IQ) language. The query language IQ permits the user to define false color functions, pixel value transformations, overlay functions, zoom functions, and windows. The user manipulates the images through generic functions. The user can direct images to display devices for visual and qualitative analysis. Image histograms and pixel value distributions can also be computed to obtain a quantitative analysis of images.
The DataCube Server. Animate Agent Project Working Note 2, Version 1.0
1993-11-01
before this can be called a histogram of all the needed levels must be made and their one band images must be made. Note if a levels backprojection...will not be used then the level does not need to be histogrammed. Any points outside the active region in a levels backprojection will be undefined...
MCNP output data analysis with ROOT (MODAR)
NASA Astrophysics Data System (ADS)
Carasco, C.
2010-12-01
MCNP Output Data Analysis with ROOT (MODAR) is a tool based on CERN's ROOT software. MODAR has been designed to handle time-energy data produced by MCNP simulations of neutron inspection devices using the associated particle technique. MODAR exploits ROOT's Graphical User Interface and functionalities to visualize and process MCNP simulation results in a fast and user-friendly way. MODAR allows the user to take into account the detection system's time resolution (which is not possible with MCNP) as well as the detectors' energy response function and counting statistics in a straightforward way.
New version program summary
Program title: MODAR
Catalogue identifier: AEGA_v1_1
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGA_v1_1.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 150 927
No. of bytes in distributed program, including test data, etc.: 4 981 633
Distribution format: tar.gz
Programming language: C++
Computer: Most Unix workstations and PCs
Operating system: Most Unix systems, Linux and Windows, provided the ROOT package has been installed. Examples were tested under Suse Linux and Windows XP.
RAM: Depends on the size of the MCNP output file. The example presented in the article, which involves three two-dimensional 139×740-bin histograms, allocates about 60 MB. These figures apply when running under ROOT and include consumption by ROOT itself.
Classification: 17.6
Catalogue identifier of previous version: AEGA_v1_0
Journal reference of previous version: Comput. Phys. Comm. 181 (2010) 1161
External routines: ROOT version 5.24.00 (http://root.cern.ch/drupal/)
Does the new version supersede the previous version?: Yes
Nature of problem: The output of an MCNP simulation is an ASCII file.
The data processing is usually performed by copying and pasting the relevant parts of the ASCII file into Microsoft Excel. Such an approach is satisfactory when the quantity of data is small, but it is not efficient when the size of the simulated data is large, for example when time-energy correlations are studied in detail, as in problems involving the associated particle technique. In addition, since the finite time resolution of the simulated detector cannot be modeled with MCNP, systems in which time-energy correlation is crucial cannot be described in a satisfactory way. Finally, realistic particle energy deposit in detectors is calculated with MCNP in a two-step process involving type-5 and then type-8 tallies. In the first step, the photon flux energy spectrum associated with a time region is selected and serves as a source energy distribution for the second step. Thus, several files must be manipulated before getting the result, which can be time consuming if one needs to study several time regions or different detector performances. In the same way, modeling the counting statistics obtained in a limited acquisition time requires several steps and can also be time consuming. Solution method: In order to overcome the previous limitations, the MODAR C++ code has been written to make use of CERN's ROOT data analysis software. MCNP output data are read from the MCNP output file with dedicated routines. Two-dimensional histograms are filled and can be handled efficiently within the ROOT framework. To keep a user-friendly analysis tool, all processing and data display can be done by means of the ROOT Graphical User Interface. Specific routines have been written to include the detectors' finite time resolution and energy response function, as well as counting statistics, in a straightforward way. Reasons for new version: For applications involving the associated particle technique, a large number of gamma rays are produced by fast neutron interactions.
To study the energy spectra, it is useful to identify the gamma-ray energy peaks in a straightforward way. Therefore, the possibility of showing the gamma rays corresponding to specific reactions has been added to MODAR. Summary of revisions: A gamma-ray database can now be used to better identify, in the energy spectra, gamma-ray peaks together with their first and second escape peaks. Histograms can be scaled by the number of source particles to evaluate the number of counts expected without statistical uncertainties. Additional comments: The possibility of adding tallies has also been incorporated in MODAR in order to describe systems in which the signals from several detectors can be summed. Moreover, MODAR can be adapted to handle other problems involving two-dimensional data. Running time: The CPU time needed to smear a two-dimensional histogram depends on the size of the histogram. In the presented example, the time-energy smearing of one of the 139×740 two-dimensional histograms takes 3 minutes on a DELL computer equipped with an INTEL Core 2.
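The detector-resolution smearing that MODAR adds on top of MCNP amounts to folding each spectrum with a Gaussian response. MODAR operates on ROOT TH2 histograms; the numpy sketch below only illustrates the operation on one axis:

```python
import numpy as np

def smear_spectrum(hist, sigma_bins):
    """Fold a simulated energy spectrum with a Gaussian detector
    response of width `sigma_bins` (in bin units), as MODAR does to add
    the finite resolution MCNP cannot model.  Discrete convolution with
    a normalised Gaussian kernel; counts are conserved up to edge loss."""
    radius = int(4 * sigma_bins) + 1
    x = np.arange(-radius, radius + 1)
    kernel = np.exp(-x**2 / (2 * sigma_bins**2))
    kernel /= kernel.sum()
    return np.convolve(hist, kernel, mode="same")

spectrum = np.zeros(100)
spectrum[50] = 1000.0                 # ideal (delta-like) gamma peak
smeared = smear_spectrum(spectrum, sigma_bins=3.0)
```

The delta-like MCNP peak becomes a Gaussian of the detector's width while the integrated count is preserved, which is the behaviour needed before comparing simulation with measured spectra.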
Gihr, Georg Alexander; Horvath-Rizea, Diana; Garnov, Nikita; Kohlhof-Meinecke, Patricia; Ganslandt, Oliver; Henkes, Hans; Meyer, Hans Jonas; Hoffmann, Karl-Titus; Surov, Alexey; Schob, Stefan
2018-02-01
Presurgical grading, estimation of growth kinetics, and other prognostic factors are becoming increasingly important for selecting the best therapeutic approach for meningioma patients. Diffusion-weighted imaging (DWI) provides microstructural information and reflects tumor biology. A novel DWI approach, histogram profiling of apparent diffusion coefficient (ADC) volumes, provides more distinct information than conventional DWI. Therefore, our study investigated whether ADC histogram profiling distinguishes low-grade from high-grade lesions and reflects Ki-67 expression and progesterone receptor status. Pretreatment ADC volumes of 37 meningioma patients (28 low-grade, 9 high-grade) were used for histogram profiling. WHO grade, Ki-67 expression, and progesterone receptor status were evaluated. Comparative and correlative statistics investigating the association between histogram profiling and neuropathology were performed. The entire ADC profile (p10, p25, p75, p90, mean, median) was significantly lower in high-grade versus low-grade meningiomas. The lower percentiles, mean, and mode showed significant correlations with Ki-67 expression. Skewness and entropy of the ADC volumes were significantly associated with progesterone receptor status and Ki-67 expression. ROC analysis revealed entropy to be the most accurate parameter for distinguishing low-grade from high-grade meningiomas. ADC histogram profiling provides a distinct set of parameters which help differentiate low-grade from high-grade meningiomas. Also, histogram metrics correlate significantly with histological surrogates of the respective proliferative potential. More specifically, entropy proved to be the most promising imaging biomarker for presurgical grading. Both entropy and skewness were significantly associated with progesterone receptor status and Ki-67 expression and therefore should be investigated further as predictors of prognostically relevant tumor-biological features.
Since absolute ADC values vary between MRI scanners of different vendors and field strengths, their use is more limited in the presurgical setting.
Reiner, Caecilia S; Gordic, Sonja; Puippe, Gilbert; Morsbach, Fabian; Wurnig, Moritz; Schaefer, Niklaus; Veit-Haibach, Patrick; Pfammatter, Thomas; Alkadhi, Hatem
2016-03-01
To evaluate, in patients with hepatocellular carcinoma (HCC), whether assessment of tumor heterogeneity by histogram analysis of computed tomography (CT) perfusion helps predict response to transarterial radioembolization (TARE). Sixteen patients (15 male; mean age, 65 years; age range, 47-80 years) with HCC underwent CT liver perfusion for treatment planning prior to TARE with Yttrium-90 microspheres. Arterial perfusion (AP) derived from CT perfusion was measured in the entire tumor volume, and heterogeneity was analyzed voxel-wise by histogram analysis. Response to TARE was evaluated on follow-up imaging (median follow-up, 129 days) based on the modified Response Evaluation Criteria in Solid Tumors (mRECIST). Results of histogram analysis and mean AP values of the tumor were compared between responders and non-responders. Receiver operating characteristic curves were calculated to determine the parameters' ability to discriminate responders from non-responders. According to mRECIST, 8 patients (50%) were responders and 8 (50%) non-responders. Comparing responders and non-responders, the 50th and 75th percentiles of AP derived from histogram analysis were significantly different (AP 43.8/54.3 vs. 27.6/34.3 mL min(-1) 100 mL(-1); p < 0.05), while the mean AP of the HCCs (43.5 vs. 27.9 mL min(-1) 100 mL(-1); p > 0.05) was not. Further heterogeneity parameters from histogram analysis (skewness, coefficient of variation, and 25th percentile) did not differ between responders and non-responders (p > 0.05). If the cut-off for the 75th percentile was set to an AP of 37.5 mL min(-1) 100 mL(-1), therapy response could be predicted with a sensitivity of 88% (7/8) and a specificity of 75% (6/8). Voxel-wise histogram analysis of pretreatment CT perfusion, indicating tumor heterogeneity of HCC, improves the pretreatment prediction of response to TARE.
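Reading off sensitivity and specificity at a fixed cutoff, as done for the 75th-percentile AP above, is a one-liner per quantity. The patient values below are invented toy numbers chosen only to make the arithmetic visible, not the study's data:

```python
import numpy as np

def sens_spec_at_cutoff(responders, non_responders, cutoff):
    """Sensitivity and specificity of a 'high value predicts response'
    rule at a given cutoff (here: 75th-percentile arterial perfusion).
    Sensitivity = responders correctly flagged; specificity =
    non-responders correctly passed over."""
    responders = np.asarray(responders, float)
    non_responders = np.asarray(non_responders, float)
    sens = (responders >= cutoff).mean()
    spec = (non_responders < cutoff).mean()
    return sens, spec

# toy 75th-percentile AP values (mL/min/100 mL) for 8 + 8 patients
resp = [54, 48, 60, 41, 39, 52, 45, 36]
non_resp = [34, 30, 41, 28, 25, 38, 33, 36]
sens, spec = sens_spec_at_cutoff(resp, non_resp, 37.5)
```

Sliding the cutoff and recording (1 - specificity, sensitivity) pairs traces out the ROC curve the study uses to compare parameters.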
Umanodan, Tomokazu; Fukukura, Yoshihiko; Kumagae, Yuichi; Shindo, Toshikazu; Nakajo, Masatoyo; Takumi, Koji; Nakajo, Masanori; Hakamada, Hiroto; Umanodan, Aya; Yoshiura, Takashi
2017-04-01
To determine the diagnostic performance of apparent diffusion coefficient (ADC) histogram analysis in diffusion-weighted (DW) magnetic resonance imaging (MRI) for differentiating adrenal adenoma from pheochromocytoma. We retrospectively evaluated 52 adrenal tumors (39 adenomas and 13 pheochromocytomas) in 47 patients (21 men, 26 women; mean age, 59.3 years; range, 16-86 years) who underwent DW 3.0T MRI. Histogram parameters of ADC (b-values of 0 and 200 [ADC200], 0 and 400 [ADC400], and 0 and 800 s/mm2 [ADC800]) - mean, variance, coefficient of variation (CV), kurtosis, skewness, and entropy - were compared between adrenal adenomas and pheochromocytomas using the Mann-Whitney U-test. Receiver operating characteristic (ROC) curves for the histogram parameters were generated to differentiate adrenal adenomas from pheochromocytomas. Sensitivity and specificity were calculated by using a threshold criterion that would maximize the average of sensitivity and specificity. Variance and CV of ADC800 were significantly higher in pheochromocytomas than in adrenal adenomas (P < 0.001 and P = 0.001, respectively). With all b-value combinations, the entropy of ADC was significantly higher in pheochromocytomas than in adrenal adenomas (all P ≤ 0.001), and showed the highest area under the ROC curve among the ADC histogram parameters for diagnosing adrenal adenomas (ADC200, 0.82; ADC400, 0.87; and ADC800, 0.92), with a sensitivity of 84.6% and a specificity of 84.6% (cutoff, ≤2.82) with ADC200; a sensitivity of 89.7% and a specificity of 84.6% (cutoff, ≤2.77) with ADC400; and a sensitivity of 94.9% and a specificity of 92.3% (cutoff, ≤2.67) with ADC800. ADC histogram analysis of DW MRI can help differentiate adrenal adenoma from pheochromocytoma. Level of Evidence: 3. J. Magn. Reson. Imaging 2017;45:1195-1203. © 2016 International Society for Magnetic Resonance in Medicine.
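The histogram parameters used in studies like this one (percentiles, CV, skewness, kurtosis, entropy) are easy to compute from a voxel sample. The definitions below follow common usage, though papers differ in details such as bin count and entropy base:

```python
import numpy as np

def adc_histogram_profile(adc, bins=128):
    """Whole-lesion ADC histogram profiling: percentiles, mean,
    coefficient of variation, skewness, kurtosis, and Shannon entropy of
    the normalised histogram.  Definitions follow common usage; the bin
    count and log base are assumptions, not any one study's choices."""
    adc = np.asarray(adc, float)
    mu, sd = adc.mean(), adc.std()
    p = np.histogram(adc, bins=bins)[0] / len(adc)
    p = p[p > 0]                       # drop empty bins before the log
    return {
        "p10": np.percentile(adc, 10),
        "p90": np.percentile(adc, 90),
        "mean": mu,
        "cv": sd / mu,
        "skewness": ((adc - mu) ** 3).mean() / sd**3,
        "kurtosis": ((adc - mu) ** 4).mean() / sd**4,
        "entropy": -(p * np.log2(p)).sum(),
    }

rng = np.random.default_rng(4)
profile = adc_histogram_profile(rng.normal(1.1, 0.2, 10000))  # synthetic ADCs
```

A broader, more irregular ADC distribution raises variance, CV, and entropy together, which is consistent with the direction of the differences reported above for pheochromocytomas.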
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fiorino, Claudio; Fellin, Gianni; Rancati, Tiziana
2008-03-15
Purpose: To assess the predictors of late rectal toxicity in a prospectively investigated group of patients treated at 70-80 Gy for prostate cancer (1.8-2 Gy fractions) with three-dimensional conformal radiotherapy. Methods and Materials: A total of 1,132 patients were entered into the study between 2002 and 2004. Three types of rectal toxicity, evaluated by a self-administered questionnaire mainly based on the Subjective, Objective, Management, Analytic/Late Effects of Normal Tissue (SOMA/LENT) system, were considered: stool frequency/tenesmus/pain, fecal incontinence, and bleeding. The data from 506 patients with a follow-up of 24 months were analyzed. The correlation between a number of clinical and dosimetric parameters and Grade 2 or greater toxicity was investigated by univariate and multivariate (MVA) logistic analyses. Results: Of the 1,132 patients, 21, 15, and 30 developed stool frequency/tenesmus/pain, fecal incontinence, and bleeding, respectively. Stool frequency/tenesmus/pain correlated with previous abdominal/pelvic surgery (MVA, p = 0.05; odds ratio [OR], 3.3). With regard to incontinence, MVA showed the volume receiving ≥40 Gy (V40) (p = 0.035; OR, 1.037) and surgery (p = 0.02; OR, 4.4) to be the strongest predictors. V40 to V70 were highly predictive of bleeding; V70 showed the strongest impact on MVA (p = 0.03), together with surgery (p = 0.06; OR, 2.5), which was also the main predictor of Grade 3 bleeding (p = 0.02; OR, 4.2). Conclusions: The predictive value of the dose-volume histogram was confirmed for bleeding, consistent with previously suggested constraints (V50 <55%, V60 <40%, V70 <25%, and V75 <5%). A dose-volume histogram constraint for incontinence can be suggested (V40 <65-70%).
Previous abdominal/pelvic surgery correlated with all toxicity types; thus, a modified constraint for bleeding (V70 <15%) can be suggested for patients with a history of abdominal/pelvic surgery, although further validation in a larger population with longer follow-up is needed.
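The Vx quantities behind these constraints (percent of the rectal volume receiving at least x Gy) reduce to a simple computation on the dose-volume histogram. A minimal sketch, with hypothetical per-voxel doses standing in for a planning-system DVH:

```python
# Compute Vx from per-voxel doses and check the bleeding constraints cited above
# (V50 < 55%, V60 < 40%, V70 < 25%, V75 < 5%).
import numpy as np

def v_at_dose(voxel_doses_gy, threshold_gy):
    """Percent of the structure volume receiving >= threshold_gy (Vx)."""
    d = np.asarray(voxel_doses_gy, dtype=float)
    return 100.0 * np.mean(d >= threshold_gy)

def check_rectal_constraints(voxel_doses_gy, limits=None):
    """Return {x: True/False} for each constraint of the form Vx < limit."""
    if limits is None:
        limits = {50: 55.0, 60: 40.0, 70: 25.0, 75: 5.0}
    return {x: v_at_dose(voxel_doses_gy, x) < vmax for x, vmax in limits.items()}
```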
Hamit, Murat; Yun, Weikang; Yan, Chuanbo; Kutluk, Abdugheni; Fang, Yang; Alip, Elzat
2015-06-01
Image feature extraction is an important part of image processing and an active field of research and application of image processing technology. Uygur medicine is a branch of traditional Chinese medicine that is attracting growing research attention, yet large amounts of Uygur medicine data have not been fully utilized. In this study, we extracted image color histogram features of the herbal and zooid medicines of Xinjiang Uygur. First, we performed preprocessing, including image color enhancement, size normalization, and color space transformation. We then extracted color histogram features and analyzed them with statistical methods. Finally, we evaluated the classification ability of the features by Bayes discriminant analysis. Experimental results showed that high accuracy for Uygur medicine image classification can be obtained by using color histogram features. This study should help enable content-based medical image retrieval for Xinjiang Uygur medicine.
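The color histogram feature described here can be sketched as per-channel histograms of the preprocessed RGB image, normalized and concatenated into one vector. The 16-bin count and 8-bit channel range are assumptions, not the study's settings.

```python
# Per-channel color histogram feature for an RGB image (H x W x 3, uint8).
import numpy as np

def color_histogram_feature(rgb_image, bins=16):
    """Return a 3*bins feature vector of normalized per-channel histograms."""
    feats = []
    for c in range(3):
        counts, _ = np.histogram(rgb_image[..., c], bins=bins, range=(0, 256))
        feats.append(counts / counts.sum())   # normalize each channel to sum 1
    return np.concatenate(feats)
```

Vectors of this form could then feed a Bayes discriminant classifier as in the study.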
LSAH: a fast and efficient local surface feature for point cloud registration
NASA Astrophysics Data System (ADS)
Lu, Rongrong; Zhu, Feng; Wu, Qingxiao; Kong, Yanzi
2018-04-01
Point cloud registration is a fundamental task in high-level three-dimensional applications. Noise, uneven point density, and varying point cloud resolutions are the three main challenges for point cloud registration. In this paper, we design a robust and compact local surface descriptor called the Local Surface Angles Histogram (LSAH) and propose an effective coarse-to-fine algorithm for point cloud registration. The LSAH descriptor is formed by concatenating five normalized sub-histograms into one histogram, where each sub-histogram is created by accumulating a different type of angle from a local surface patch. The experimental results show that LSAH is more robust to uneven point density and varying point cloud resolutions than four state-of-the-art local descriptors in terms of feature matching. Moreover, we tested our LSAH-based coarse-to-fine algorithm for point cloud registration; the results demonstrate that the algorithm is both robust and efficient.
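The descriptor construction (five normalized angle sub-histograms concatenated) can be sketched as follows. How the five angle types are computed from a point's neighborhood is specific to the paper's geometry, so here the angle sets are taken as already-computed inputs; the bin count and angle range are assumptions.

```python
# Concatenate five normalized angle histograms into one LSAH-style descriptor.
import numpy as np

def lsah_descriptor(angle_sets, bins=10, angle_range=(0.0, np.pi)):
    """angle_sets: iterable of 5 arrays of angles from a local surface patch."""
    subs = []
    for angles in angle_sets:
        counts, _ = np.histogram(angles, bins=bins, range=angle_range)
        total = counts.sum()
        subs.append(counts / total if total else counts.astype(float))
    return np.concatenate(subs)   # a 5 * bins dimensional descriptor
```

Matching would then compare descriptors with, e.g., Euclidean distance.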
Method development estimating ambient mercury concentration from monitored mercury wet deposition
NASA Astrophysics Data System (ADS)
Chen, S. M.; Qiu, X.; Zhang, L.; Yang, F.; Blanchard, P.
2013-05-01
Speciated atmospheric mercury data have recently been monitored at multiple locations in North America, but the spatial coverage is far less than that of the long-established mercury wet deposition network. The present study describes a first attempt at linking ambient concentration with wet deposition using a Beta distribution fitted to a ratio estimate. The mean, median, mode, standard deviation, and skewness of the fitted Beta distribution were generated using data collected in 2009 at 11 monitoring stations. Comparison of the normalized histogram with the fitted density function shows that the fitted Beta distribution closely matches the empirical distribution of the ratio. The estimated ambient mercury concentration was further partitioned into reactive gaseous mercury and particulate-bound mercury using a linear regression model developed by Amos et al. (2012). The method presented here can be used to roughly estimate ambient mercury concentration at locations and/or times where such measurements are not available but wet deposition is monitored.
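Fitting a Beta distribution to a ratio confined to (0, 1) can be sketched with method-of-moments estimates; the study's exact fitting procedure may differ, and the assumption that the ratio is normalized to (0, 1) is mine.

```python
# Method-of-moments Beta fit: match the sample mean and variance, then
# summarize the fitted distribution (mean and, when defined, mode).
import numpy as np

def beta_moment_fit(ratios):
    """Return (alpha, beta) matching the sample mean and variance."""
    r = np.asarray(ratios, dtype=float)
    m, v = r.mean(), r.var(ddof=1)
    common = m * (1 - m) / v - 1          # valid when v < m * (1 - m)
    return m * common, (1 - m) * common

def beta_summary(alpha, beta):
    mean = alpha / (alpha + beta)
    mode = (alpha - 1) / (alpha + beta - 2) if alpha > 1 and beta > 1 else None
    return {"mean": mean, "mode": mode}
```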
NASA Astrophysics Data System (ADS)
Ivanova, Mariya A.; Klopov, Nicolay V.; Lebedev, Andrei D.; Noskin, Leonid A.; Noskin, Valentin A.; Pavlov, Michail Y.
1997-05-01
We discuss the use of the QELS method for screening population groups for verified pathologies. A regularization procedure was used for the mathematical analysis of the experimental data; this allows us to determine the particle size distribution histograms of blood plasma samples. For interpreting the histogram data, a special mathematical processing program, the 'semiotic classifier', was created. The main idea of the 'semiotic classifier' is that the formation of a pathological trace in the human organism depends not only on the nature of the specific disease but also on the interaction of the organism's sanogenetic mechanisms. We distinguish five pathological symptomatic complexes of organism status: allergic diseases, intoxications, catabolic shifts, autoimmune diseases, and degenerative-dystrophic processes. Using this 'semiotic classifier' in monitoring investigations makes it possible (1) to assign persons with a pronounced initial level of pathological processes to risk groups for special clinical investigation, (2) to establish the predisposition of a given individual toward definite pathologies at the preclinical stage, and (3) under conditions of pronounced clinical pathology, to study the dynamics of the pathological processes.
Is there a preference for linearity when viewing natural images?
NASA Astrophysics Data System (ADS)
Kane, David; Bertalmío, Marcelo
2015-01-01
The system gamma of the imaging pipeline, defined as the product of the encoding and decoding gammas, is typically greater than one and is stronger for images viewed with a dark background (e.g. cinema) than for those viewed in lighter conditions (e.g. office displays) [1-3]. However, for high dynamic range (HDR) images reproduced on a low dynamic range (LDR) monitor, subjects often prefer a system gamma of less than one [4], presumably reflecting the greater need for histogram equalization in HDR images. In this study we ask subjects to rate the perceived quality of images presented on an LDR monitor using various levels of system gamma. We reveal that the optimal system gamma is below one for images with an HDR and approaches or exceeds one for images with an LDR. Additionally, the highest quality scores occur for images where a system gamma of one is optimal, suggesting a preference for linearity (where possible). We find that subjective image quality scores can be predicted by computing the degree of histogram equalization of the lightness distribution. Accordingly, an optimal, image-dependent system gamma can be computed that maximizes perceived image quality.
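The predictor described here, the degree of histogram equalization of the lightness distribution, can be sketched as one minus the maximum deviation between the empirical lightness CDF and a uniform ramp. The bin count, the [0, 1] lightness range, and this particular deviation measure are assumptions; the paper's exact metric may differ.

```python
# Score how close a lightness histogram is to the flat (equalized) distribution:
# 1.0 means perfectly flat; smaller means less equalized.
import numpy as np

def equalization_score(lightness, bins=64):
    counts, _ = np.histogram(lightness, bins=bins, range=(0.0, 1.0))
    cdf = np.cumsum(counts) / counts.sum()
    uniform_cdf = np.arange(1, bins + 1) / bins
    return 1.0 - np.abs(cdf - uniform_cdf).max()
```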
Diagnosing Cloud Biases in the GFDL AM3 Model With Atmospheric Classification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Evans, Stuart; Marchand, Roger; Ackerman, Thomas
In this paper, we define a set of 21 atmospheric states, or recurring weather patterns, for a region surrounding the Atmospheric Radiation Measurement Program's Southern Great Plains site using an iterative clustering technique. The states are defined using dynamic and thermodynamic variables from reanalysis, tested for statistical significance with cloud radar data from the Southern Great Plains site, and are determined every 6 h for 14 years, creating a time series of atmospheric state. The states represent the various stages of the progression of synoptic systems through the region (e.g., warm fronts, warm sectors, cold fronts, cold northerly advection, and high-pressure anticyclones) with a subset of states representing summertime conditions with varying degrees of convective activity. We use the states to classify output from the NOAA/Geophysical Fluid Dynamics Laboratory AM3 model to test the model's simulation of the frequency of occurrence of the states and of the cloud occurrence during each state. The model roughly simulates the frequency of occurrence of the states but exhibits systematic cloud occurrence biases. Comparison of observed and model-simulated International Satellite Cloud Climatology Project histograms of cloud top pressure and optical thickness shows that the model lacks high thin cloud under all conditions, but biases in thick cloud are state-dependent. Frontal conditions in the model do not produce enough thick cloud, while fair-weather conditions produce too much. Finally, we find that increasing the horizontal resolution of the model improves the representation of thick clouds under all conditions but has little effect on high thin clouds. However, increasing resolution also changes the distribution of states, causing an increase in total cloud occurrence bias.
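The ISCCP comparison used here is a joint histogram of cloud-top pressure against optical thickness. A minimal sketch follows; the bin edges below are the conventional ISCCP category boundaries, stated here as an assumption rather than taken from this paper.

```python
# Joint (cloud-top pressure, optical thickness) histogram, ISCCP-style,
# normalized to the fraction of scenes in each category.
import numpy as np

def isccp_joint_histogram(ctp_hpa, tau,
                          ctp_edges=(50, 180, 310, 440, 560, 680, 800, 1000),
                          tau_edges=(0.0, 0.3, 1.3, 3.6, 9.4, 23.0, 60.0, 379.0)):
    """Return a 7x7 array of scene fractions per (CTP, tau) category."""
    h, _, _ = np.histogram2d(np.asarray(ctp_hpa), np.asarray(tau),
                             bins=[list(ctp_edges), list(tau_edges)])
    return h / len(ctp_hpa)
```

Model biases of the kind the paper reports would show up as differences between the observed and simulated versions of this array.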
NASA Astrophysics Data System (ADS)
Takamatsu, Atsuko
2006-11-01
Three-oscillator systems with plasmodia of the true slime mold Physarum polycephalum, an oscillatory amoeba-like unicellular organism, were experimentally constructed and their spatio-temporal patterns investigated. Three typical spatio-temporal patterns were found: rotation (R), partial in-phase (PI), and partial anti-phase with double frequency (PA). In pattern R, the phase differences between adjacent oscillators were almost 120°. In pattern PI, two oscillators were in-phase and the third showed anti-phase against them. In pattern PA, two oscillators showed anti-phase and the third showed frequency-doubled oscillation with small amplitude. Each pattern is in fact not perfectly stable but quasi-stable, and interestingly, the system shows spontaneous switching among the multiple quasi-stable patterns. Statistical analyses revealed a characteristic of the residence time of each pattern: the histograms have a Gamma-like distribution form but with a sharp peak and a tail on the long-period side. This suggests that the attractor of this system has a complex structure composed of at least three types of sub-attractors: a "Gamma attractor" (involving several Poisson processes), a "deterministic attractor" (the residence time is deterministic), and a "stable attractor" (each pattern is stable). When the coupling strength was small, only the Gamma attractor was observed, and switching among patterns R, PI, and PA almost always occurred via an asynchronous pattern named O. A conjecture is as follows: internal/external noise exposes the coexistence of patterns R, PI, and PA around bifurcation points, and this is observed as the Gamma attractor. As the coupling strength increases, the deterministic attractor appears, followed by the stable attractor, always accompanied by the Gamma attractor. The switching behavior could thus be caused by the regular presence of the Gamma attractor.
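Testing whether residence times are Gamma-like, as described above, can be sketched with a method-of-moments Gamma fit (shape k and scale theta recovered from the sample mean and variance); the study's own statistical analysis may have used a different estimator.

```python
# Method-of-moments Gamma fit to residence times:
# k * theta = mean, k * theta^2 = variance.
import numpy as np

def gamma_moment_fit(residence_times):
    """Return (shape k, scale theta) matching the sample mean and variance."""
    t = np.asarray(residence_times, dtype=float)
    mean, var = t.mean(), t.var(ddof=1)
    theta = var / mean
    return mean / theta, theta
```

A sharp peak or long tail relative to the fitted Gamma density would then show up as systematic deviations in the residence-time histogram.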
Spent Fuel Test-Climax: core logging for site investigation and instrumentation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilder, D.G.; Yow, J.L. Jr.; Thorpe, R.K.
1982-05-28
As an integral part of the Spent Fuel Test-Climax, 5150 ft (1570 m) of granite core was obtained. This core was diamond drilled in various sizes, mainly 38-mm and 76-mm diameters. The core was taken with single-tube core barrels and was unoriented. Techniques used to drill and log this core are discussed, as well as techniques to orient the core. Of the 5150 ft (1570 m) of core, more than 3645 ft (1111 m) was retained and logged in some detail. As a result of the core logging, geologic discontinuities were identified, and joint frequency and spacing were characterized. Discontinuities identified included several joint sets, shear zones, and faults. Correlations based on coring alone were generally found to be impossible, even for the more prominent features. The only feature properly correlated from the exploratory drilling was the fault system at the end of the facility, but it was not identified from the exploratory core as a fault. Identification of discontinuities was later helped by underground mapping that identified several different joint sets with different characteristics. Joint frequency varied from 0.3 to 1.1 joints per foot of core for open fractures and from 0.3 to 3.3 joints per foot for closed or healed fractures. Histograms of fracture spacing indicate that there is likely a random distribution of spacing superimposed upon uniformly spaced fractures. A low-angle joint set was found to have a persistent mean orientation. These joints were healed and had pervasive wall-rock alteration, which made identification of joints in this set possible. The recognition of a joint set with known attitude allowed orientation of much of the core, and this orientation technique was found to be effective. 10 references, 25 figures, 4 tables.
ERIC Educational Resources Information Center
Brookes, Bertram C.; Griffiths, Jose M.
1978-01-01
Frequency, rank, and frequency rank distributions are defined. Extensive discussion on several aspects of frequency rank distributions includes the Poisson process as a means of exploring the stability of ranks; the correlation of frequency rank distributions; and the transfer coefficient, a new measure in frequency rank distribution. (MBR)
Nemmi, Federico; Saint-Aubert, Laure; Adel, Djilali; Salabert, Anne-Sophie; Pariente, Jérémie; Barbeau, Emmanuel; Payoux, Pierre; Péran, Patrice
2014-01-01
Purpose: The AV-45 amyloid biomarker is known to show white-matter uptake in patients with Alzheimer's disease (AD) but also in the healthy population. This binding, thought to be non-specific and lipophilic in nature, has not yet been investigated. The aim of this study was to determine the differential pattern of AV-45 binding in white matter in healthy and pathological populations. Methods: We recruited 24 patients presenting with early-stage AD and 17 matched healthy subjects. We used an optimized PET-MRI registration method and an approach based on the intensity histogram using several indexes. We compared the results of the intensity histogram analyses with a more canonical approach based on the target-to-cerebellum Standard Uptake Value ratio (SUVr) in white and grey matter, using MANOVA and discriminant analyses. A cluster analysis on white and grey matter histograms was also performed. Results: White matter histogram analysis revealed significant differences between AD and healthy subjects that were not revealed by the SUVr analysis. However, white matter histograms were not decisive in discriminating the groups, and indexes based on grey matter alone showed better discriminative power than SUVr. The cluster analysis divided our sample into two clusters showing different uptakes in grey but also in white matter. Conclusion: These results demonstrate that AV-45 binding in white matter conveys subtle information not detectable with the SUVr approach. Although it is not better than standard SUVr for discriminating AD patients from healthy subjects, this information could reveal white matter modifications. PMID:24573658
INFERRING THE ECCENTRICITY DISTRIBUTION
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hogg, David W.; Bovy, Jo; Myers, Adam D., E-mail: david.hogg@nyu.ed
2010-12-20
Standard maximum-likelihood estimators for binary-star and exoplanet eccentricities are biased high, in the sense that the estimated eccentricity tends to be larger than the true eccentricity. As with most non-trivial observables, a simple histogram of estimated eccentricities is not a good estimate of the true eccentricity distribution. Here, we develop and test a hierarchical probabilistic method for performing the relevant meta-analysis, that is, inferring the true eccentricity distribution, taking as input the likelihood functions for the individual star eccentricities, or samplings of the posterior probability distributions for the eccentricities (under a given, uninformative prior). The method is a simple implementation of a hierarchical Bayesian model; it can also be seen as a kind of heteroscedastic deconvolution. It can be applied to any quantity measured with finite precision (other orbital parameters, or indeed any astronomical measurements of any kind, including magnitudes, distances, or photometric redshifts), so long as the measurements have been communicated as a likelihood function or a posterior sampling.
NASA Astrophysics Data System (ADS)
Jenkins, Colleen; Jordan, Jay; Carlson, Jeff
2007-02-01
This paper presents parameter estimation techniques useful for detecting background changes in a video sequence with extreme foreground activity. A specific application of interest is automated detection of the covert placement of threats (e.g., a briefcase bomb) inside crowded public facilities. We propose that a histogram of pixel intensity acquired from a fixed mounted camera over time for a series of images will be a mixture of two Gaussian functions: the foreground probability distribution function and background probability distribution function. We will use Pearson's Method of Moments to separate the two probability distribution functions. The background function can then be "remembered" and changes in the background can be detected. Subsequent comparisons of background estimates are used to detect changes. Changes are flagged to alert security forces to the presence and location of potential threats. Results are presented that indicate the significant potential for robust parameter estimation techniques as applied to video surveillance.
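The two-Gaussian decomposition of the pixel-intensity histogram described above can be illustrated as follows. The paper uses Pearson's Method of Moments; as a simpler stand-in, this sketch uses a basic two-component expectation-maximization (EM) fit, which recovers the same kind of foreground/background split. The initialization from data percentiles is my choice, made to keep the example deterministic.

```python
# Minimal 1-D two-component Gaussian mixture fit via EM (a stand-in for the
# paper's Pearson method-of-moments separation).
import numpy as np

def fit_two_gaussians(x, iters=200):
    x = np.asarray(x, dtype=float)
    # deterministic initialization from the lower/upper quartiles of the data
    mu = np.percentile(x, [25, 75]).astype(float)
    sigma = np.full(2, x.std() / 2)
    w = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibility of each component for each pixel intensity
        d = (x[:, None] - mu) / sigma
        pdf = w * np.exp(-0.5 * d * d) / (sigma * np.sqrt(2 * np.pi))
        r = pdf / pdf.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and standard deviations
        n = r.sum(axis=0)
        w = n / len(x)
        mu = (r * x[:, None]).sum(axis=0) / n
        sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / n)
    return w, mu, sigma
```

Once separated, the component with the more stable parameters over time would serve as the "remembered" background model.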
Wang, G J; Wang, Y; Ye, Y; Chen, F; Lu, Y T; Li, S L
2017-11-07
Objective: To investigate the features of apparent diffusion coefficient (ADC) histogram parameters based on entire-tumor-volume data in high-resolution diffusion-weighted imaging of nasopharyngeal carcinoma (NPC), and to evaluate their correlation with cancer stage. Methods: This retrospective study included 154 NPC patients [102 males and 52 females; mean age (48±11) years] who had undergone readout segmentation of long variable echo trains MRI scanning before radiation therapy. The area of tumor was delineated on each section of the axial ADC maps to generate an ADC histogram using ImageJ. The ADC histogram of the entire tumor, along with the histogram parameters tumor voxels, ADC(mean), ADC(25%), ADC(50%), ADC(75%), skewness, and kurtosis, was obtained by merging all sections with SPSS 22.0 software. Intra-observer repeatability was assessed using intra-class correlation coefficients (ICC). The patients were subdivided into two groups according to cancer volume: a small cancer group (<305 voxels, about 2 cm3) and a large cancer group (≥2 cm3). The correlation between ADC histogram parameters and cancer stage was evaluated with the Spearman test. Results: The ICCs for measuring the histogram parameters tumor voxels, ADC(mean), ADC(25%), ADC(50%), ADC(75%), skewness, and kurtosis were 0.938, 0.861, 0.885, 0.838, 0.836, 0.358, and 0.456, respectively. Tumor voxels were positively correlated with T stage (r = 0.368, P < 0.05), and differed significantly among patients with different T stages (K = 22.306, P < 0.05). In the small cancer group, ADC(mean), ADC(25%), and ADC(50%) differed significantly among patients with different T stages (K = 8.409, 8.187, and 8.699, all P < 0.05), and these three indices were positively correlated with T stage (r = 0.221, 0.209, and 0.235, all P < 0.05). Skewness and kurtosis differed significantly between the groups with different cancer volumes (t = -2.987, Z = -3.770, both P < 0.05).
Conclusion: Tumor volume and tissue uniformity of NPC are important factors affecting ADC and cancer stage; the ADC histogram parameters ADC(mean), ADC(25%), and ADC(50%) increase with T stage in NPCs smaller than 2 cm3.
Surface topography analysis and performance on post-CMP images (Conference Presentation)
NASA Astrophysics Data System (ADS)
Lee, Jusang; Bello, Abner F.; Kakita, Shinichiro; Pieniazek, Nicholas; Johnson, Timothy A.
2017-03-01
Surface topography on post-CMP processing can be measured with white-light interference microscopy to determine planarity. Results are used to avoid under- or over-polishing and to decrease dishing. The numerical output of the surface topography is the RMS (root mean square) of the height. Beyond RMS, the topography image is visually examined and not further quantified; subjective comparisons of the height maps are used to determine optimum CMP process conditions. While visual comparison of height maps can detect excursions, it does so only through manual inspection of the images. In this work we describe methods of quantifying post-CMP surface topography characteristics that are used in other technical fields such as geography and facial recognition. The topography image is divided into small surface patches of 7x7 pixels. Each surface patch is fitted to an analytic surface equation, in this case a third-order polynomial, from which the gradient, directional derivatives, and other characteristics are calculated. Based on these characteristics, the surface patch is labeled as peak, ridge, flat, saddle, ravine, pit, or hillside. The number of each label, and thus the associated histogram, is then used as a quantified characteristic of the surface topography and could be used as a parameter for SPC (statistical process control) charting. In addition, the gradient for each surface patch is calculated, so the average, maximum, and other characteristics of the gradient distribution can be used for SPC. Repeatability measurements indicate high confidence: individual label counts can vary by less than 2% relative standard deviation. When the histogram is considered, an associated chi-squared value can be defined with which to compare other measurements. The chi-squared value of the histogram is a very sensitive and quantifiable parameter for determining within-wafer and wafer-to-wafer topography non-uniformity.
For the gradient histogram distribution, the chi-squared value can again be calculated and used as yet another quantifiable parameter for SPC. In this work we measured the post-Cu-CMP topography of a die designed for 14nm technology. A region of interest (ROI) known to be indicative of the CMP processing was chosen for the topography analysis. The ROI, of size 1800 x 2500 pixels where each pixel represents 2um, was repeatably measured. We show the sensitivity based on these measurements and the comparison between center and edge die measurements. The topography measurements and surface patch analysis were applied to hundreds of images representing the periodic process qualification runs required to control and verify CMP performance and tool matching. The analysis is shown to be sensitive to process conditions that vary in polishing time, type of slurry, CMP tool manufacturer, and CMP pad lifetime. Keywords: CMP, topography, image processing, metrology, interference microscopy, surface processing
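The patch-labeling step can be sketched as below: fit a third-order polynomial surface to a 7x7 height patch by least squares, then classify from the gradient and Hessian at the patch center. The tolerances and the simplified sloped-patch rules (ridge/ravine/hillside from the curvature signs alone) are assumptions; the full topographic classification uses directional derivatives.

```python
# Fit z(x, y) = sum c_ij x^i y^j (i + j <= 3) to a 7x7 patch and label it.
import numpy as np

def classify_patch(z, grad_tol=1e-3, curv_tol=1e-6):
    """z: 7x7 height patch -> 'peak','pit','saddle','flat','ridge','ravine','hillside'."""
    n = z.shape[0]
    y, x = np.mgrid[0:n, 0:n] - (n - 1) / 2.0      # centered coordinates
    x, y, f = x.ravel(), y.ravel(), np.asarray(z, float).ravel()
    exps = [(i, j) for i in range(4) for j in range(4 - i)]
    A = np.stack([x**i * y**j for i, j in exps], axis=1)
    coef = dict(zip(exps, np.linalg.lstsq(A, f, rcond=None)[0]))
    gx, gy = coef[(1, 0)], coef[(0, 1)]            # gradient at the center
    H = np.array([[2 * coef[(2, 0)], coef[(1, 1)]],
                  [coef[(1, 1)], 2 * coef[(0, 2)]]])  # Hessian at the center
    e1, e2 = np.linalg.eigvalsh(H)                 # ascending eigenvalues
    if np.hypot(gx, gy) < grad_tol:                # stationary patch
        if e1 > curv_tol and e2 > curv_tol:
            return "pit"
        if e1 < -curv_tol and e2 < -curv_tol:
            return "peak"
        if e1 < -curv_tol and e2 > curv_tol:
            return "saddle"
        return "flat"
    if e1 < -curv_tol:                             # sloped patch
        return "ridge"
    if e2 > curv_tol:
        return "ravine"
    return "hillside"
```

Counting the labels over all patches yields the histogram used for the chi-squared SPC parameter.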
Escobedo, Fernando A
2014-03-07
In this work, a variant of the Gibbs-Duhem integration (GDI) method is proposed to trace phase coexistence lines; it combines some of the advantages of the original GDI methods, such as robustness in handling large system sizes, with the ability of histogram-based methods (but without using histograms) to estimate free energies and hence avoid the need for on-the-fly corrector schemes. This is done by fitting an appropriate polynomial function not to the coexistence curve itself (as in GDI schemes) but to the underlying free-energy function of each phase. The availability of a free-energy model allows post-processing of the simulated data to obtain improved estimates of the coexistence line. The proposed method is used to elucidate the phase behavior of two non-trivial hard-core mixtures: a binary blend of spheres and cubes and a system of size-polydisperse cubes. The relative size of the spheres and cubes in the first mixture is chosen such that the resulting eutectic pressure-composition phase diagram is nearly symmetric, in that the maximum solubility of cubes in the sphere-rich solid (∼20%) is comparable to the maximum solubility of spheres in the cube-rich solid. In the polydisperse cube system, the solid-liquid coexistence line is mapped out for an imposed Gaussian activity distribution, which produces near-Gaussian particle-size distributions in each phase. A terminal polydispersity of 11.3% is found, beyond which the cubic solid phase would not be stable, and near which significant size fractionation between the solid and isotropic phases is predicted.
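The post-processing idea, fitting each phase's free energy and then locating coexistence, can be sketched in one dimension: fit polynomials G(P) for two phases to simulated data and find the pressure at which they cross (equal free energy per particle at coexistence). This is an illustration of the fitting-and-intersection step only, not the paper's full multi-variable scheme.

```python
# Fit polynomial free-energy models for two phases and locate their crossing.
import numpy as np

def coexistence_pressure(p1, g1, p2, g2, deg=2):
    """Fit G(P) of each phase; return the crossing pressure, or None."""
    c1 = np.polyfit(p1, g1, deg)
    c2 = np.polyfit(p2, g2, deg)
    roots = np.roots(np.polysub(c1, c2))
    # keep real roots inside the jointly sampled pressure window
    lo, hi = max(min(p1), min(p2)), min(max(p1), max(p2))
    real = roots[np.isreal(roots)].real
    candidates = real[(real >= lo) & (real <= hi)]
    return candidates[0] if len(candidates) else None
```

Because the fit smooths simulation noise across all sampled state points, the crossing estimate improves as more data are folded into the fit, which is the advantage the paper exploits.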
Serial data acquisition for the X-ray plasma diagnostics with selected GEM detector structures
NASA Astrophysics Data System (ADS)
Czarski, T.; Chernyshova, M.; Pozniak, K. T.; Kasprowicz, G.; Zabolotny, W.; Kolasinski, P.; Krawczyk, R.; Wojenski, A.; Zienkiewicz, P.
2015-10-01
The measurement system based on a GEM (Gas Electron Multiplier) detector is being developed for X-ray diagnostics of magnetically confined tokamak plasmas. The paper focuses on the measurement subject and describes the fundamental data processing needed to obtain reliable characteristics (histograms) useful to physicists. The required data processing has two steps: (1) processing in the time domain, i.e., event selection for bunches of coinciding clusters; and (2) processing in the planar space domain, i.e., cluster identification for the given detector structure. It is thus the software part of the project, between the electronic hardware and the physics applications. The whole project is original and was developed by the authors. The previous version, based on a 1-D GEM detector, was applied to the high-resolution X-ray crystal spectrometer KX1 in the JET tokamak. The current version considers 2-D detector structures for the new data acquisition system. The fast and accurate mode of data acquisition, implemented in hardware in real time, can be applied to dynamic plasma diagnostics. Several detector structures with single-pixel sensors and multi-pixel (directional) sensors are considered for two-dimensional X-ray imaging. Final data processing is presented as histograms for selected ranges of position, time interval, and cluster charge values. Exemplary radiation source properties are measured via the basic cumulative characteristics: the cluster position distribution and the cluster charge value distribution corresponding to the energy spectra. A shorter version of this contribution is due to be published in PoS at: 1st EPS conference on Plasma Diagnostics
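The two processing steps (time-domain selection of coinciding clusters, then a charge histogram of the survivors) can be sketched as below. The event layout, the nearest-neighbour coincidence rule, and the bin settings are assumptions, not the system's actual algorithm.

```python
# Keep clusters whose nearest neighbour in time lies within a coincidence
# window, then histogram the charges of the selected clusters.
import numpy as np

def coincident_charge_histogram(times, charges, window, bins=32, charge_range=None):
    order = np.argsort(times)
    t, q = np.asarray(times, float)[order], np.asarray(charges, float)[order]
    dt_prev = np.r_[np.inf, np.diff(t)]          # gap to the previous cluster
    dt_next = np.r_[np.diff(t), np.inf]          # gap to the next cluster
    keep = np.minimum(dt_prev, dt_next) <= window
    counts, edges = np.histogram(q[keep], bins=bins, range=charge_range)
    return counts, edges
```

The resulting charge-value histogram is the cumulative characteristic that corresponds to the energy spectrum.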
Wang, Hai-yi; Su, Zi-hua; Xu, Xiao; Sun, Zhi-peng; Duan, Fei-xue; Song, Yuan-yuan; Li, Lu; Wang, Ying-wei; Ma, Xin; Guo, Ai-tao; Ma, Lin; Ye, Hui-yi
2016-01-01
Pharmacokinetic parameters derived from dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) have been increasingly used to evaluate the permeability of tumor vessels. Histogram metrics are a promising quantitative MR imaging method that has recently been introduced into the analysis of DCE-MRI pharmacokinetic parameters in oncology because of tumor heterogeneity. In this study, 21 patients with renal cell carcinoma (RCC) underwent paired DCE-MRI studies on a 3.0 T MR system. The extended Tofts model and a population-based arterial input function were used to calculate kinetic parameters of RCC tumors. The mean value and histogram metrics (mode, skewness, and kurtosis) of each pharmacokinetic parameter were generated automatically using ImageJ software. Intra- and inter-observer reproducibility and scan-rescan reproducibility were evaluated using intra-class correlation coefficients (ICCs) and coefficients of variation (CoV). Our results demonstrated that the histogram method (mode, skewness, and kurtosis) was not superior to the conventional mean value method in the reproducibility evaluation of DCE-MRI pharmacokinetic parameters (Ktrans and Ve) in renal cell carcinoma; skewness and kurtosis in particular showed lower intra-observer, inter-observer, and scan-rescan reproducibility than the mean value. Our findings suggest that additional studies are necessary before wide incorporation of histogram metrics into quantitative analysis of DCE-MRI pharmacokinetic parameters. PMID:27380733
Shirai, Katsuyuki; Kawashima, Motohiro; Saitoh, Jun-Ichi; Abe, Takanori; Fukata, Kyohei; Shigeta, Yuka; Irie, Daisuke; Shiba, Shintaro; Okano, Naoko; Ohno, Tatsuya; Nakano, Takashi
2017-01-01
The safety and efficacy of carbon-ion radiotherapy for advanced non-small cell lung cancer have not been established. We evaluated the clinical outcomes and dose-volume histogram parameters of carbon-ion radiotherapy compared with photon therapy in T2b-4N0M0 non-small cell lung cancer. Twenty-three patients were treated with carbon-ion radiotherapy between May 2011 and December 2015. Seven, 14, and 2 patients had T2b, T3, and T4, respectively. The median age was 78 (range, 53-91) years, with 22 male patients. There were 12 adenocarcinomas, 8 squamous cell carcinomas, 1 non-small cell lung carcinoma, and 2 clinically diagnosed lung cancers. Eleven patients were operable, and 12 patients were inoperable. Most patients (91%) were treated with carbon-ion radiotherapy of 60.0 Gy relative biological effectiveness (RBE) in 4 fractions or 64.0 Gy (RBE) in 16 fractions. Local control and overall survival rates were calculated. Dose-volume histogram parameters of normal lung and tumor coverages were compared between carbon-ion radiotherapy and photon therapies, including three-dimensional conformal radiotherapy (3DCRT) and intensity-modulated radiotherapy (IMRT). The median follow-up of surviving patients was 25 months. Three patients experienced local recurrence, and the 2-year local control rate was 81%. During follow-up, 5 patients died of lung cancer, and 1 died of intercurrent disease. The 2-year overall survival rate was 70%. Operable patients had a better overall survival rate compared with inoperable patients (100% vs. 43%; P = 0.04). There was no grade ≥2 radiation pneumonitis. In dose-volume histogram analysis, carbon-ion radiotherapy had a significantly lower dose to normal lung and greater tumor coverage compared with photon therapies. Carbon-ion radiotherapy was effectively and safely performed for T2b-4N0M0 non-small cell lung cancer, and the dose distribution was superior compared with those for photon therapies. 
A Japanese multi-institutional study is ongoing to prospectively evaluate these patients and establish the use of carbon-ion radiotherapy.
MCNP-based computational model for the Leksell gamma knife.
Trnka, Jiri; Novotny, Josef; Kluson, Jaroslav
2007-01-01
We have focused on the use of the MCNP code for the calculation of Gamma Knife radiation field parameters with a homogeneous polystyrene phantom. We have investigated several parameters of the Leksell Gamma Knife radiation field and compared the results with other studies based on the EGS4 and PENELOPE codes as well as with the Leksell Gamma Knife treatment planning system, Leksell GammaPlan (LGP). The current model describes all 201 radiation beams together and simulates all the sources at the same time. Within each beam, it considers the technical construction of the source, the source holder, the collimator system, the spherical phantom, and the surrounding material. We have calculated output factors for various sizes of scoring volumes, relative dose distributions along the basic planes including linear dose profiles, integral doses in various volumes, and differential dose-volume histograms. All the parameters have been calculated for each collimator size and for the isocentric configuration of the phantom. We have found the calculated output factors to be in agreement with other authors' work except in the case of the 4 mm collimator, where averaging over the scoring volume and statistical uncertainties strongly influence the calculated results. In general, all the results are dependent on the choice of the scoring volume. The calculated linear dose profiles and relative dose distributions also match independent studies and the Leksell GammaPlan, but care must be taken with the fluctuations within the plateau, which can influence the normalization, and with the accuracy in determining the isocenter position, which is important for comparing different dose profiles. The calculated differential dose-volume histograms and integral doses have been compared with data provided by the Leksell GammaPlan. The dose-volume histograms are in good agreement, as are the integral doses calculated in small calculation matrix volumes.
However, deviations in integral doses of up to 50% can be observed for large volumes such as the total skull volume. The differences in the treatment of scattered radiation between the MC method and the LGP may be important in this case. We have also studied the influence of differential direction sampling of primary photons and have found that, due to the anisotropic sampling, doses around the isocenter deviate from each other by up to 6%. With caution about the details of the calculation settings, it is possible to employ the MCNP Monte Carlo code for independent verification of the Leksell Gamma Knife radiation field properties.
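The differential and cumulative dose-volume histograms compared in this study can be illustrated with a short sketch: bin the voxel doses (differential DVH), then accumulate from the highest dose bin downward to obtain the fraction of the volume receiving at least each dose level (cumulative DVH). The dose array below is synthetic, not Gamma Knife data.

```python
import numpy as np

# Hypothetical voxel doses (Gy) standing in for a computed 3-D dose grid.
rng = np.random.default_rng(3)
doses = rng.uniform(0.0, 20.0, 10_000)

# Differential DVH: number of voxels per dose bin.
bin_edges = np.linspace(0.0, 20.0, 21)
diff_dvh, _ = np.histogram(doses, bins=bin_edges)

# Cumulative DVH: fraction of volume receiving at least the dose at each
# bin's lower edge (reverse cumulative sum of the differential DVH).
cum_dvh = np.cumsum(diff_dvh[::-1])[::-1] / doses.size
```

By construction the cumulative curve starts at 1.0 (the whole volume receives at least the minimum dose) and decreases monotonically.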
The number distribution of weak Explosive Events observed by SUMER/SoHO
NASA Astrophysics Data System (ADS)
Mendoza-Torres, J. E.
2016-11-01
Explosive Events (EEs) observed by SUMER on SoHO in the 1393.8 Å Si IV line are analyzed. We look for EEs in order to study their number distribution at low energies. Eight data sets taken in June 1996 in raster observations are used. In these observations, a field on the solar disk is scanned several times during a period considerably longer than the typical lifetime of an EE. To look for EEs, we first identified the maxima and locations of spectral line increases. The maxima that took place at inner locations of the rastered fields were considered possible EEs. From this sample, the cases where the spectral line underwent Doppler shifts within ±3″ of the location of the maximum were considered EEs. After a selection, the region within 5″ of the event was ignored for 5 min on either side of the EE in order to conclusively select a different maximum. Based on the analysis of the locations of EEs, it was seen that the more intense EEs tend to take place in certain regions, while in the intermediate regions the observed EEs are less intense. We therefore refer to them as Regions of Enhanced Emission (REE) and Quiet Regions (QR), respectively. The width of the REE regions, as seen in the North-South direction, is about 10-30″. In this work, a total of 487 EEs are analyzed, 266 at REE and 221 at QR. Histograms are also made of the maxima of the amplitude of the spectral line during EEs at both REE and QR. In the histogram for EEs at QR, the number grows as the flux decreases, with a slope of -1.8. For EEs at REE, the histogram has a maximum at about 1 W m-2 sr-1 Å-1, with a high-energy slope of about -1.6. Both slopes are below the value required to give an important input of energy for coronal heating, as analyzed in the case of microflares (Hudson, 1991). The averages of the maxima of EEs in each set for the REE and QR are computed.
The scatter plot of the average values indicates that there is a linear relation between them and the maximum amplitudes of EEs at REE are about two times larger than the amplitudes for EEs at QR.
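Slope estimates like the -1.8 and -1.6 quoted above are commonly obtained by fitting a straight line to the event-number histogram in log-log space, since a power-law number distribution N ∝ F^s appears as a line of slope s. The sketch below uses an ideal synthetic power law, not the SUMER event counts.

```python
import numpy as np

# Synthetic flux bins and an ideal power-law number distribution with a
# known exponent of -1.8 (arbitrary units, for illustration only).
flux = np.logspace(0.0, 1.0, 10)
counts = 100.0 * flux ** (-1.8)

# The power-law exponent is the slope of the log-log linear fit.
slope, intercept = np.polyfit(np.log10(flux), np.log10(counts), 1)
```

With real, noisy counts the fit would be weighted and restricted to the flux range where the histogram is complete (above the detection threshold).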
Charge fluctuations in nanoscale capacitors.
Limmer, David T; Merlet, Céline; Salanne, Mathieu; Chandler, David; Madden, Paul A; van Roij, René; Rotenberg, Benjamin
2013-09-06
The fluctuations of the charge on an electrode contain information on the microscopic correlations within the adjacent fluid and their effect on the electronic properties of the interface. We investigate these fluctuations using molecular dynamics simulations in a constant-potential ensemble with histogram reweighting techniques. This approach offers, in particular, an efficient, accurate, and physically insightful route to the differential capacitance that is broadly applicable. We demonstrate these methods with three different capacitors: pure water between platinum electrodes and a pure as well as a solvent-based organic electrolyte each between graphite electrodes. The total charge distributions with the pure solvent and solvent-based electrolytes are remarkably Gaussian, while in the pure ionic liquid the total charge distribution displays distinct non-Gaussian features, suggesting significant potential-driven changes in the organization of the interfacial fluid.
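Two ingredients of the approach described in this abstract can be sketched with synthetic numbers: the fluctuation route to the differential capacitance, C = β·Var(Q), and histogram reweighting of charge samples collected at one potential to a nearby potential, with weights proportional to exp(βQΔΦ). This is a toy numpy illustration under those stated relations, not the authors' simulation code; the Gaussian charge samples stand in for molecular dynamics output.

```python
import numpy as np

rng = np.random.default_rng(0)
beta = 1.0

# Synthetic electrode-charge samples at the sampling potential: Gaussian,
# mimicking the near-Gaussian total charge distributions reported above.
q = rng.normal(loc=0.0, scale=2.0, size=200_000)

# Differential capacitance from the charge-fluctuation relation C = beta * Var(Q).
c_diff = beta * q.var()

# Reweight the same samples to a slightly different potential (+dphi):
# weights proportional to exp(beta * Q * dphi).
dphi = 0.1
w = np.exp(beta * q * dphi)
q_mean_new = np.average(q, weights=w)   # mean charge at the shifted potential
```

For a Gaussian charge distribution the reweighted mean shifts by βσ²ΔΦ, so a single trajectory yields the charge-potential curve over a window of potentials without extra simulations.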
Computational analysis of kidney scintigrams
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vrincianu, D.; Puscasu, E.; Creanga, D.
The scintigraphic investigation of normal and pathological kidneys was carried out using a specialized gamma-camera device in a hospital nuclear medicine department. The technetium-99m isotope, a gamma emitter, coupled with vector molecules for kidney tissues, was introduced into the subject's body, its dynamics being recorded as a data source for kidney clearance capacity. Two representative data series were investigated, corresponding to healthy and pathological organs, respectively. The semi-quantitative tests applied for the comparison of the two distinct medical situations were: the shape of the probability distribution histogram, the power spectrum, the auto-correlation function, and the Lyapunov exponent. While the power spectrum led to similar results in both cases, significant differences were revealed by means of the probability distribution, the Lyapunov exponent, and the correlation time, recommending these numerical tests as possible complementary tools in clinical diagnosis.
"Sweetening" Technical Physics with Hershey's Kisses
NASA Astrophysics Data System (ADS)
Stone, Chuck
2003-04-01
This paper describes an activity in which students measure the mass of each candy in one full bag of Hershey's Kisses and then use a simple spreadsheet program to construct a histogram showing the number of candies as a function of mass. Student measurements indicate that a single bag of 80 Kisses yields enough data to produce a noticeable variation in the candy's mass distribution. The bimodal character of this distribution provides a useful discussion topic. This activity can be performed as a classroom project, a laboratory exercise, or an interactive lecture demonstration. In all these formats, students have the opportunity to collect, organize, process, and analyze real data. In addition to strengthening graphical analysis skills, this activity introduces students to the fundamentals of statistics, manufacturing processes in the industrial workplace, and process control techniques.
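The classroom exercise above amounts to binning 80 mass measurements into a histogram. A minimal sketch, with simulated masses in place of the students' real measurements (the two-peak sample below simply mimics the bimodal shape the paper reports; the mode locations are invented):

```python
import numpy as np

# Simulated candy masses in grams: two hypothetical modes of 40 candies each.
rng = np.random.default_rng(1)
masses = np.concatenate([
    rng.normal(4.4, 0.05, 40),   # hypothetical lighter mode
    rng.normal(4.6, 0.05, 40),   # hypothetical heavier mode
])

# Histogram: number of candies per mass bin, printed as a text bar chart.
counts, edges = np.histogram(masses, bins=12)
for lo, hi, n in zip(edges[:-1], edges[1:], counts):
    print(f"{lo:.2f}-{hi:.2f} g: {'#' * int(n)}")
```

In a spreadsheet the same result comes from a FREQUENCY-style bin count followed by a bar chart; the code form makes the binning step explicit.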
Web-based CERES Clouds QC Property Viewing Tool
NASA Astrophysics Data System (ADS)
Smith, R. A.; Chu, C.; Sun-Mack, S.; Chen, Y.; Heckert, E.; Minnis, P.
2014-12-01
This presentation will display the capabilities of a web-based CERES cloud property viewer. Terra data will be chosen for examples. It will demonstrate viewing of cloud properties in gridded global maps, histograms, time series displays, latitudinal zonal images, binned data charts, data frequency graphs, and ISCCP plots. Users can manipulate the images to narrow the map boundaries, adjust color bars and value ranges, compare datasets, view data values, and more. Other atmospheric studies groups will be encouraged to put their data into the underlying NetCDF data format and view their data with the tool. A laptop will hopefully be available to allow conference attendees to try navigating the tool.
Allodi, S; Reese, B E; Cavalcante, L A
1990-01-01
The spectra of fiber sizes at different depths of the optic tract of the opossum Didelphis marsupialis were examined by electron microscopy in order to test for correlations between the eventual location of axons and relevant developmental events. Frequency histograms showed 1) a predominant representation of medium-sized axons and the virtual exclusion of coarse fibers from the deepest portion of that pathway, and 2) a progressive increase in the proportion of thin axons from deep to superficial sites of the tract. These findings are discussed in terms of the view of the optic tract as a chronological map of axon arrival.
Rocketdyne automated dynamics data analysis and management system
NASA Technical Reports Server (NTRS)
Tarn, Robert B.
1988-01-01
An automated dynamics data analysis and management system implemented on a DEC VAX minicomputer cluster is described. Multichannel acquisition, Fast Fourier Transform analysis, and an online database have significantly improved the analysis of wideband transducer responses from Space Shuttle Main Engine testing. Leakage error correction to recover sinusoid amplitudes and correct for frequency slewing is described. The phase errors caused by FM recorder/playback head misalignment are automatically measured and used to correct the data. Data compression methods are described and compared. The system hardware is described. Applications using the database are introduced, including software for power spectral density, instantaneous time history, amplitude histogram, fatigue analysis, and rotordynamics expert system analysis.
NASA Astrophysics Data System (ADS)
Savran, W. H.; Louie, J. N.; Pullammanappallil, S.; Pancha, A.
2011-12-01
When deterministically modeling the propagation of seismic waves, shallow shear-wave velocity plays a crucial role in predicting shaking effects such as peak ground velocity (PGV). The Clark County Parcel Map provides us with a data set of geotechnical velocities in Las Vegas Valley at an unprecedented level of detail. Las Vegas Valley is a basin with geologic properties similar to those of some areas of Southern California. We analyze elementary spatial statistical properties of the Parcel Map, along with calculating its spatial variability. We then investigate these spatial statistics for the PGV results computed from two geotechnical models that incorporate the Parcel Map as parameters. A histogram of the Parcel Map 30-meter depth-averaged shear velocity (Vs30) values shows the data to approximately fit a bimodal normal distribution with μ1 = 400 m/s, σ1 = 76 m/s, μ2 = 790 m/s, σ2 = 149 m/s, and p = 0.49, where μ is the mean, σ is the standard deviation, and p is the probability mixing factor for the bimodal distribution. Based on plots of spatial power spectra, the Parcel Map appears to be fractal over the second and third decades, in kilometers. The spatial spectra possess the same fractal dimension in the N-S and E-W directions, indicating isotropic scale invariance. We configured finite-difference wave propagation models at 0.5 Hz with LLNL's E3D code, utilizing the Parcel Map as input parameters to compute a PGV data set from a scenario earthquake (Black Hills M6.5). The resulting PGV is fractal over the same spatial frequencies as the Vs30 data sets associated with their respective models. The fractal dimension is systematically lower in all of the PGV maps than in the Vs30 maps, showing that the PGV maps are richer in higher spatial frequencies. This is potentially caused by lens focusing effects on seismic waves due to spatial heterogeneity in site conditions.
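The bimodal normal mixture quoted above, p·N(μ1, σ1) + (1-p)·N(μ2, σ2), can be evaluated directly from the reported parameters, e.g. for plotting it over the Vs30 histogram. A sketch using only those published values (the fitting itself is not reproduced here):

```python
import numpy as np

# Mixture parameters reported for the Vs30 histogram fit.
mu1, s1 = 400.0, 76.0    # first mode (m/s)
mu2, s2 = 790.0, 149.0   # second mode (m/s)
p = 0.49                 # mixing probability of the first mode

def normal_pdf(x, mu, s):
    return np.exp(-0.5 * ((x - mu) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

def vs30_pdf(x):
    """Bimodal normal mixture density for Vs30 (1/(m/s))."""
    return p * normal_pdf(x, mu1, s1) + (1.0 - p) * normal_pdf(x, mu2, s2)

# Sanity check: the density should integrate to ~1 over a wide window.
x = np.linspace(100.0, 1400.0, 2601)
total = vs30_pdf(x).sum() * (x[1] - x[0])
```

The density has a valley between the two modes, which is the bimodal structure visible in the Parcel Map histogram.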
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wortel, Ruud C.; Incrocci, Luca; Pos, Floris J.
Purpose: Image-guided intensity modulated radiation therapy (IG-IMRT) allows significant dose reductions to organs at risk in prostate cancer patients. However, clinical data identifying the benefits of IG-IMRT in daily practice are scarce. The purpose of this study was to compare dose distributions to organs at risk and acute gastrointestinal (GI) and genitourinary (GU) toxicity levels of patients treated to 78 Gy with either IG-IMRT or 3D-CRT. Methods and Materials: Patients treated with 3D-CRT (n=215) and IG-IMRT (n=260) receiving 78 Gy in 39 fractions within 2 randomized trials were selected. Dose surface histograms of the anorectum, anal canal, and bladder were calculated. Identical toxicity questionnaires were distributed at baseline, prior to fractions 20 and 30, and at 90 days after treatment. Radiation Therapy Oncology Group (RTOG) grade ≥1, ≥2, and ≥3 endpoints were derived directly from the questionnaires. Univariate and multivariate binary logistic regression analyses were applied. Results: The median volumes receiving 5 to 75 Gy were significantly lower (all P<.001) with IG-IMRT for the anorectum, anal canal, and bladder. The mean dose to the anorectum was 34.4 Gy versus 47.3 Gy (P<.001), 23.6 Gy versus 44.6 Gy for the anal canal (P<.001), and 33.1 Gy versus 43.2 Gy for the bladder (P<.001). Significantly lower grade ≥2 toxicity was observed for proctitis, stool frequency ≥6/day, and urinary frequency ≥12/day. IG-IMRT resulted in significantly lower overall RTOG grade ≥2 GI toxicity (29% vs 49%, respectively; P=.002) and overall GU grade ≥2 toxicity (38% vs 48%, respectively; P=.009). Conclusions: A clinically meaningful reduction in dose to organs at risk and in acute toxicity levels was observed in IG-IMRT patients, as a result of improved technique and tighter margins. Therefore, reduced late toxicity levels can be expected as well; additional research is needed to quantify such reductions.
NASA Astrophysics Data System (ADS)
Le, Jia-Liang; Bažant, Zdeněk P.
2011-07-01
This paper extends the theoretical framework presented in the preceding Part I to the lifetime distribution of quasibrittle structures failing at the fracture of one representative volume element under constant-amplitude fatigue. The probability distribution of the critical stress amplitude is derived for a given number of cycles and a given minimum-to-maximum stress ratio. The physical mechanism underlying the Paris law for fatigue crack growth is explained under certain plausible assumptions about the damage accumulation in the cyclic fracture process zone at the tip of a subcritical crack. This law is then used to relate the probability distribution of critical stress amplitude to the probability distribution of fatigue lifetime. The theory naturally yields a power-law relation for the stress-life curve (S-N curve), which agrees with Basquin's law. Furthermore, the theory indicates that, for quasibrittle structures, the S-N curve must be size dependent. Finally, a physical explanation is provided for the experimentally observed systematic deviations of the lifetime histograms of various ceramics and bones from the Weibull distribution, and their close fits by the present theory are demonstrated.
Zhou, Nan; Guo, Tingting; Zheng, Huanhuan; Pan, Xia; Chu, Chen; Dou, Xin; Li, Ming; Liu, Song; Zhu, Lijing; Liu, Baorui; Chen, Weibo; He, Jian; Yan, Jing; Zhou, Zhengyang; Yang, Xiaofeng
2017-01-01
We investigated apparent diffusion coefficient (ADC) histogram analysis to evaluate radiation-induced parotid damage and predict xerostomia degrees in nasopharyngeal carcinoma (NPC) patients receiving radiotherapy. The imaging of bilateral parotid glands in NPC patients was conducted 2 weeks before radiotherapy (time point 1), one month after radiotherapy (time point 2), and four months after radiotherapy (time point 3). From time point 1 to 2, parotid volume, skewness, and kurtosis decreased (P < 0.001, P = 0.001, and P < 0.001, respectively), but all other ADC histogram parameters increased (all P < 0.001, except P = 0.006 for standard deviation [SD]). From time point 2 to 3, parotid volume continued to decrease (P = 0.022), and SD, 75th and 90th percentiles continued to increase (P = 0.024, 0.010, and 0.006, respectively). Early change rates of parotid ADCmean, ADCmin, kurtosis, and 25th, 50th, 75th, and 90th percentiles (from time point 1 to 2) correlated with the late parotid atrophy rate (from time point 1 to 3) (all P < 0.05). Multiple linear regression analysis revealed correlations among parotid volume, time point, and ADC histogram parameters. Early mean change rates for bilateral parotid SD and ADCmax could predict late xerostomia degrees at seven months after radiotherapy (three months after time point 3), with AUCs of 0.781 and 0.818 (P = 0.014 and 0.005, respectively). ADC histogram parameters were reproducible (intraclass correlation coefficient, 0.830-0.999). ADC histogram analysis could be used to evaluate radiation-induced parotid damage noninvasively and predict late xerostomia degrees of NPC patients treated with radiotherapy. PMID:29050274
Histogram Analysis of Diffusion Tensor Imaging Parameters in Pediatric Cerebellar Tumors.
Wagner, Matthias W; Narayan, Anand K; Bosemani, Thangamadhan; Huisman, Thierry A G M; Poretti, Andrea
2016-05-01
Apparent diffusion coefficient (ADC) values have been shown to assist in differentiating cerebellar pilocytic astrocytomas and medulloblastomas. Previous studies have applied only ADC measurements and calculated the mean/median values. Here we investigated the value of diffusion tensor imaging (DTI) histogram characteristics of the entire tumor for differentiation of cerebellar pilocytic astrocytomas and medulloblastomas. Presurgical DTI data were analyzed with a region of interest (ROI) approach to include the entire tumor. For each tumor, histogram-derived metrics including the 25th percentile, 75th percentile, and skewness were calculated for fractional anisotropy (FA) and mean (MD), axial (AD), and radial (RD) diffusivity. The histogram metrics were used as primary predictors of interest in a logistic regression model. Statistical significance levels were set at p < .01. The study population included 17 children with pilocytic astrocytoma and 16 with medulloblastoma (mean age, 9.21 ± 5.18 years and 7.66 ± 4.97 years, respectively). Compared to children with medulloblastoma, children with pilocytic astrocytoma showed higher MD (P = .003 and P = .008), AD (P = .004 and P = .007), and RD (P = .003 and P = .009) values for the 25th and 75th percentile. In addition, histogram skewness showed statistically significant differences for MD between low- and high-grade tumors (P = .008). The 25th percentile for MD yields the best results for the presurgical differentiation between pediatric cerebellar pilocytic astrocytomas and medulloblastomas. The analysis of other DTI metrics does not provide additional diagnostic value. Our study confirms the diagnostic value of the quantitative histogram analysis of DTI data in pediatric neuro-oncology. Copyright © 2015 by the American Society of Neuroimaging.
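The whole-tumor histogram metrics used above (percentiles and skewness of a diffusion map over an ROI) reduce to a few lines of array arithmetic once the voxel values are extracted. A sketch with synthetic voxel values standing in for real MD/FA maps; the gamma-distributed sample is only a stand-in to give a right-skewed histogram:

```python
import numpy as np

# Synthetic ROI voxel values mimicking a right-skewed diffusivity histogram
# (hypothetical units of 10-3 mm2/s; not patient data).
rng = np.random.default_rng(2)
roi_values = rng.gamma(shape=4.0, scale=0.3, size=5000)

# Percentile metrics of the whole-tumor histogram.
p25, p75 = np.percentile(roi_values, [25, 75])

def skewness(x):
    # Fisher-Pearson moment coefficient of skewness: m3 / m2^(3/2).
    d = x - x.mean()
    return (d ** 3).mean() / (d ** 2).mean() ** 1.5

metrics = {"p25": p25, "p75": p75, "skew": skewness(roi_values)}
```

In a study like this one, these metrics per tumor would then feed a logistic regression or ROC analysis to compare tumor types.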
Bao, Shixing; Watanabe, Yoshiyuki; Takahashi, Hiroto; Tanaka, Hisashi; Arisawa, Atsuko; Matsuo, Chisato; Wu, Rongli; Fujimoto, Yasunori; Tomiyama, Noriyuki
2018-05-31
This study aimed to determine whether whole-tumor histogram analysis of normalized cerebral blood volume (nCBV) and apparent diffusion coefficient (ADC) for contrast-enhancing lesions can be used to differentiate between glioblastoma (GBM) and primary central nervous system lymphoma (PCNSL). Twenty patients, 9 with PCNSL and 11 with GBM, none with hemorrhagic lesions, underwent MRI, including diffusion-weighted imaging and dynamic susceptibility contrast perfusion-weighted imaging, before surgery. Histogram analysis of nCBV and ADC from whole-tumor voxels in contrast-enhancing lesions was performed. An unpaired t-test was used to compare the mean values for each type of tumor. A multivariate logistic regression model (LRM) was used to classify GBM and PCNSL using the best parameters of ADC and nCBV. All nCBV histogram parameters of GBMs were larger than those of PCNSLs, but only the average nCBV was statistically significant after Bonferroni correction. Meanwhile, ADC histogram parameters were also larger in GBM than in PCNSL, but these differences were not statistically significant. According to receiver operating characteristic curve analysis, the nCBV average and the ADC 25th percentile demonstrated the largest areas under the curve, with values of 0.869 and 0.838, respectively. The LRM combining these two parameters differentiated between GBM and PCNSL with a higher area under the curve (Logit(P) = -21.12 + 10.00 × ADC 25th percentile (10-3 mm2/s) + 5.420 × nCBV mean; P < 0.001). Our results suggest that whole-tumor histogram analysis of nCBV and ADC combined can be a valuable objective diagnostic method for differentiating between GBM and PCNSL.
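The quoted logistic regression model can be applied directly: the logit is the linear combination with the published coefficients, and the probability follows from the logistic function. The feature values below are invented for illustration and are not from the study's patients.

```python
import numpy as np

def gbm_vs_pcnsl_probability(adc_25th, ncbv_mean):
    """Probability from the study's two-parameter logistic model:
    Logit(P) = -21.12 + 10.00 * ADC_25th (10-3 mm2/s) + 5.420 * nCBV_mean."""
    logit = -21.12 + 10.00 * adc_25th + 5.420 * ncbv_mean
    return 1.0 / (1.0 + np.exp(-logit))

# Hypothetical cases: low vs high ADC/nCBV values.
p_low = gbm_vs_pcnsl_probability(adc_25th=0.8, ncbv_mean=1.5)
p_high = gbm_vs_pcnsl_probability(adc_25th=1.5, ncbv_mean=3.0)
```

Both coefficients are positive, so higher ADC 25th percentile and higher mean nCBV both push the predicted probability toward the class with the larger parameter values (GBM in this cohort).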
Hempel, Johann-Martin; Schittenhelm, Jens; Brendle, Cornelia; Bender, Benjamin; Bier, Georg; Skardelly, Marco; Tabatabai, Ghazaleh; Castaneda Vega, Salvador; Ernemann, Ulrike; Klose, Uwe
2017-10-01
To assess the diagnostic performance of histogram analysis of diffusion kurtosis imaging (DKI) maps for in vivo assessment of the 2016 World Health Organization Classification of Tumors of the Central Nervous System (2016 CNS WHO) integrated glioma grades. Seventy-seven patients with histopathologically confirmed glioma who provided written informed consent were retrospectively assessed between 01/2014 and 03/2017 from a prospective trial approved by the local institutional review board. Ten histogram parameters of mean kurtosis (MK) and mean diffusivity (MD) metrics from DKI were independently assessed by two blinded physicians from a volume of interest around the entire solid tumor. One-way ANOVA was used to compare MK and MD histogram parameter values between 2016 CNS WHO-based tumor grades. Receiver operating characteristic analysis was performed on MK and MD histogram parameters for significant results. The 25th, 50th, 75th, and 90th percentiles of MK and the average MK showed significant differences between IDH1/2 wild-type gliomas, IDH1/2-mutated gliomas, and oligodendrogliomas with chromosome 1p/19q loss of heterozygosity and IDH1/2 mutation (p<0.001). The 50th, 75th, and 90th percentiles showed a slightly higher diagnostic performance (area under the curve (AUC) range: 0.868-0.991) than the average MK (AUC range: 0.855-0.988) in classifying glioma according to the integrated approach of the 2016 CNS WHO. Histogram analysis of DKI can stratify gliomas according to the integrated approach of the 2016 CNS WHO. The 50th (median), 75th, and 90th percentiles showed the highest diagnostic performance. However, the average MK is also robust and feasible in routine clinical practice. Copyright © 2017 Elsevier B.V. All rights reserved.